
如何抗擊隱形的歧視?


2016年11月29日

Six months ago, tech entrepreneur Rohan Gilkes tried to rent a cabin in Idaho over the July 4 weekend, using the website Airbnb. All seemed well, until the host told him her plans had changed: she needed to use the cabin herself. Then a friend of Rohan’s tried to book the same cabin on the same weekend, and his booking was immediately accepted. Rohan’s friend is white; Rohan is black.

半年前,科技創(chuàng)業(yè)者羅恩•吉爾克斯(Rohan Gilkes)嘗試通過Airbnb網(wǎng)站預(yù)訂愛達(dá)荷州的一間小屋,在美國獨(dú)立日長周末使用。一切似乎都很順利,直到房主告訴他,她的計(jì)劃有變:她自己需要使用那間小屋。之后,羅恩的一個朋友試著在同樣時間預(yù)訂那間小屋,他的預(yù)訂被迅速接受了。羅恩的朋友是白人;羅恩是黑人。

This is not a one-off. Late last year, three researchers from Harvard Business School — Benjamin Edelman, Michael Luca and Dan Svirsky — published a working paper with experimental evidence of discrimination. Using fake profiles to request accommodation, the researchers found that applicants with distinctively African-American names were 16 per cent less likely to have their bookings accepted. Edelman and Luca have also published evidence that black hosts receive lower incomes than whites while letting out very similar properties on Airbnb. The hashtag #AirbnbWhileBlack has started to circulate.

這并非個案。去年年末,哈佛商學(xué)院(Harvard Business School)的3名研究人員——本杰明•埃德爾曼(Benjamin Edelman)、邁克爾•盧卡(Michael Luca)和丹•斯維爾斯基(Dan Svirsky)——發(fā)布了一份工作論文,以實(shí)驗(yàn)證據(jù)證明了歧視的存在。研究人員使用虛假資料申請訂房,發(fā)現(xiàn)姓名明顯像非裔美國人的申請者,其預(yù)訂被接受的可能性要低16%。埃德爾曼和盧卡還發(fā)布了證據(jù),表明在Airbnb上出租非常相似的房源時,黑人房主的租金收入低于白人房主?!?#AirbnbWhileBlack(Airbnb上的黑人)”這一話題標(biāo)簽已開始流傳。
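The "16 per cent less likely" figure is a relative gap between two acceptance rates, not a difference in percentage points. A minimal sketch of that arithmetic, using invented counts purely for illustration (these are not the study's data):

```python
def relative_gap(rate_control, rate_treatment):
    """How much less likely the treatment group is to be accepted,
    expressed as a fraction of the control group's rate."""
    return (rate_control - rate_treatment) / rate_control

# Hypothetical counts for illustration only; not the paper's data.
white_rate = 250 / 500   # 50% of booking requests accepted
black_rate = 210 / 500   # 42% of booking requests accepted

gap = relative_gap(white_rate, black_rate)
print(f"{gap:.0%} less likely to be accepted")  # 16% less likely
```

Note that an 8-percentage-point drop (50% to 42%) is what reads as "16 per cent less likely" in relative terms.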

Can anything be done to prevent such discrimination? It’s not a straightforward problem. Airbnb condemns racial discrimination but, by making names and photographs such a prominent feature of its website, it makes discrimination, conscious or unconscious, very easy.

可以做些什么來防止這種歧視嗎?這不是一個簡單明了的問題。Airbnb譴責(zé)種族歧視,但Airbnb網(wǎng)站的一個突出特征就是顯示姓名和照片,這讓有意或者無意的歧視變得非常容易。

“It’s a cheap way to build trust,” says researcher Michael Luca. But, he adds, it “invites discrimination”.

“這是一種成本低廉的建立信任的方式,”研究員邁克爾•盧卡說。但他補(bǔ)充道,這“招來了歧視”。

Of course there’s plenty of discrimination to be found elsewhere. Other studies have used photographs of goods such as iPods and baseball cards being held in a person’s hand. On Craigslist and eBay, such goods sell for less if held in a black hand than a white one. An unpleasant finding — although in such cases it’s easy to use a photograph with no hand visible at all.

當(dāng)然,其他地方也能發(fā)現(xiàn)很多歧視現(xiàn)象。另一些研究使用了以手持方式拍攝的商品(如iPod或棒球卡)照片。在Craigslist和eBay上,黑人手持的商品賣價會比白人手持的低。這個發(fā)現(xiàn)令人不舒服——盡管在這類情況下,賣家完全可以使用不露出手的商品照片。

The Harvard Business School team have produced a browser plug-in called “Debias Yourself”. People who install the plug-in and then surf Airbnb will find that names and photographs have been hidden. It’s a nice idea, although one suspects that it will not be used by those who need it most. Airbnb could impose the system anyway but that is unlikely to prove tempting.

哈佛商學(xué)院的團(tuán)隊(duì)制作了一個叫做“Debias Yourself”的防偏見瀏覽器插件。安裝這個插件的人在瀏覽Airbnb的時候會發(fā)現(xiàn)姓名和照片被隱藏了。這是個好主意,不過我懷疑那些最需要這個功能的人不會使用它。Airbnb可以強(qiáng)行實(shí)施這個系統(tǒng),但這樣做不太可能有吸引力。
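The plug-in's core behaviour, hiding names and photographs before a human reads the page, amounts to masking identifying fields. A toy sketch of that idea in Python (the field names and the `debias` function are assumptions for illustration, not the actual extension's code):

```python
IDENTIFYING_FIELDS = {"name", "photo_url"}   # assumed field names

def debias(profile):
    """Return a copy of a profile with identifying fields masked,
    in the spirit of the 'Debias Yourself' plug-in, which hides
    names and photographs in the rendered page."""
    return {key: ("[hidden]" if key in IDENTIFYING_FIELDS else value)
            for key, value in profile.items()}

host = {"name": "J. Smith",
        "photo_url": "https://example.com/js.jpg",
        "rating": 4.9,
        "listing": "Cabin in Idaho"}
masked = debias(host)   # name and photo hidden; rating and listing kept
```

The point of the design is that decision-relevant information (ratings, the listing itself) survives while cues to race or gender do not.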

However, says Luca, there are more subtle ways in which the platform could discourage discrimination. For example, it could make profile portraits less prominent, delaying the appearance of a portrait until further along in the process of making a booking. And it could nudge hosts into using an “instant book” system that accelerates and depersonalises the booking process. (The company recently released a report describing efforts to deal with the problem.)

然而,盧卡表示,平臺還可以用一些更巧妙的方式來阻止歧視。比如,平臺可以讓資料中的個人照片不那么突出,等預(yù)訂進(jìn)行到較后的階段再顯示照片。平臺還可以引導(dǎo)房主使用“即時預(yù)訂”系統(tǒng),這種系統(tǒng)既能加快預(yù)訂過程,又能去除預(yù)訂過程中的個人因素。(該公司最近發(fā)布了一份報告,描述了為處理這一問題所做的努力。)

But if the Airbnb situation has shone a spotlight on unconscious (and conscious) bias, there are even more important manifestations elsewhere in the economy. A classic study by economists Marianne Bertrand and Sendhil Mullainathan used fake CVs to apply for jobs. Some CVs, which used distinctively African-American names, were significantly less likely to lead to an interview than identical applications with names that could be perceived as white.

如果說Airbnb的情況讓人們關(guān)注到無意識(和有意識)的偏見,那么在經(jīng)濟(jì)的其他領(lǐng)域,還有更重要的體現(xiàn)偏見的情形。經(jīng)濟(jì)學(xué)家瑪麗安娜•貝特朗(Marianne Bertrand)和森德希爾•穆萊納坦(Sendhil Mullainathan)所做的一項(xiàng)經(jīng)典研究使用假簡歷申請工作。使用明顯的非裔美國人姓名的簡歷,得到面試的幾率顯著低于內(nèi)容相同但姓名可能被認(rèn)為是白人的簡歷。

Perhaps the grimmest feature of the Bertrand/Mullainathan study was the discovery that well-qualified black applicants were treated no better than poorly qualified ones. As a young black student, then, one might ask: why bother studying when nobody will look past your skin colour? And so racism can create a self-reinforcing loop.

或許貝特朗和穆萊納坦研究中最令人沮喪的發(fā)現(xiàn)是:資歷出色的黑人申請者得到的待遇,并不比資歷平平者好。那么,一個年輕的黑人學(xué)生或許會問:如果沒人在乎你膚色以外的東西,為何還要費(fèi)力學(xué)習(xí)呢?因此,種族主義可能形成一個自我強(qiáng)化的循環(huán)。

What to do?

該怎么辦?

One approach, as with “Debias Yourself”, is to remove irrelevant information: if a person’s skin colour or gender is irrelevant, then why reveal it to recruiters? The basic idea behind “Debias Yourself” was proven in a study by economists Cecilia Rouse and Claudia Goldin. Using a careful statistical design, Rouse and Goldin showed that when leading professional orchestras began to audition musicians behind a screen, the recruitment of women surged.

有一種策略,就像“Debias Yourself”防偏見插件一樣,是去除無關(guān)信息:既然一個人的膚色或者性別不影響其錄用,那何必把這些信息透露給招聘人員呢?經(jīng)濟(jì)學(xué)家塞西莉亞•勞斯(Cecilia Rouse)和克勞迪婭•戈?duì)柖?Claudia Goldin)的一項(xiàng)研究證明了“Debias Yourself”所依據(jù)的基本理念是正確的。通過細(xì)心的統(tǒng)計(jì)設(shè)計(jì),勞斯和戈?duì)柖”砻?,?dāng)一流的專業(yè)管弦樂團(tuán)開始隔著屏風(fēng)面試音樂家時,女性被錄取的幾率激增。

Importantly, blind auditions weren’t introduced to fight discrimination against women — orchestras didn’t think such discrimination was a pressing concern. Instead, they were a way of preventing recruiters from favouring the pupils of influential teachers. Yet a process designed to fight nepotism and favouritism ended up fighting sexism too.

重要的是,在這里,盲試的引入并不是為了抗擊對女性的歧視——管弦樂團(tuán)并不認(rèn)為他們在性別歧視方面存在緊迫問題。事實(shí)上,盲試是為了防止招聘者偏袒具有影響力的教師的學(xué)生。然而,這種旨在打擊裙帶關(guān)系和徇私行為的程序最終也打擊了性別歧視。

. . .

A new start-up, “Applied”, is taking these insights into the broader job market. “Applied” is a spin-off from the UK Cabinet Office, the Behavioural Insights Team and Nesta, a charity that supports innovation; the idea is to use some simple technological fixes to combat a variety of biases.

新創(chuàng)立的企業(yè)Applied正把這些洞見應(yīng)用到更廣泛的就業(yè)市場中。Applied是從英國內(nèi)閣辦公室(Cabinet Office)、“行為研究小組”(Behavioural Insights Team)以及支持創(chuàng)新的慈善機(jī)構(gòu)英國國家科技藝術(shù)基金會(Nesta)分拆出來的公司,其理念是通過一些簡單的技術(shù)性修正來抗擊各種偏見。

A straightforward job application form is a breeding ground for discrimination and cognitive error. It starts with a name — giving clues to nationality, ethnicity and gender — and then presents a sequence of answers that are likely to be read as one big stew of facts. A single answer, good or bad, colours our perception of everything else, a tendency called the halo effect.

一份普通的工作申請表就是偏見和認(rèn)知錯誤的溫床。它把透露國籍、族裔和性別線索的姓名放在最開頭,接下來呈現(xiàn)的一系列答案很容易被當(dāng)成一鍋事實(shí)的大雜燴來讀。單單一個答案的好壞,就會影響我們對其余一切的看法,這種傾向叫做光環(huán)效應(yīng)(halo effect)。

A recruiter using “Applied” will see “chunked” and “anonymised” details — answers to the application questions from different applicants, presented in a randomised order and without indications of race or gender. Meanwhile, other recruiters will see the same answers, but shuffled differently. As a result, says Kate Glazebrook of “Applied”, various biases simply won’t have a chance to emerge.

一個使用Applied服務(wù)的招聘人員將會看到“區(qū)塊化”和“匿名化”的細(xì)節(jié)——將不同申請者對申請表問題的答案用隨機(jī)順序列出來,不體現(xiàn)種族或者性別。同時,其他招聘人員將看到同樣的答案,但以不同順序列出。Applied的凱特•格萊茲布魯克(Kate Glazebrook)表示,這樣一來,各種偏見根本沒有機(jī)會產(chǎn)生。
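The review flow Glazebrook describes, answers grouped by question, anonymised, and shuffled independently for each recruiter, can be sketched as a small data transformation. This is an illustrative sketch only; the field names and seeding scheme are assumptions, not Applied's actual implementation:

```python
import random

def chunked_anonymised_view(applications, question_ids, seed):
    """Build one recruiter's review set: answers are grouped
    ('chunked') by question, stripped of names, and shuffled
    deterministically per recruiter, so every recruiter sees the
    same answers but in a different order."""
    view = {}
    for q in question_ids:
        # Keep only an opaque id (for mapping scores back later)
        # and the answer text itself; drop identifying fields.
        chunk = [(app["id"], app["answers"][q]) for app in applications]
        # Seed per recruiter and per question: stable for one
        # recruiter, different across recruiters.
        random.Random(f"{seed}:{q}").shuffle(chunk)
        view[q] = chunk
    return view

applications = [
    {"id": "A1", "name": "Applicant One",
     "answers": {"q1": "answer one", "q2": "answer two"}},
    {"id": "A2", "name": "Applicant Two",
     "answers": {"q1": "another one", "q2": "another two"}},
]
view = chunked_anonymised_view(applications, ["q1", "q2"], seed="recruiter-7")
```

Because no entry carries a name, and one applicant's answers are never read as a single running narrative, a strong or weak answer to one question cannot colour the reading of the rest, which is exactly the halo effect the article describes.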

When the Behavioural Insights Team ran its last recruitment round, applicants were rated using the new process and a more traditional CV-based approach. The best of the shuffled, anonymised applications were more diverse, and much better predictors of a candidate who impressed on the assessment day. Too early to declare victory — but a promising start.

“行為研究小組”進(jìn)行最近一輪招聘時,同時用新流程和更傳統(tǒng)的基于簡歷的方法對申請人進(jìn)行了評分。經(jīng)打亂順序、匿名化處理后選出的最佳申請人背景更加多元,也能更好地預(yù)測哪些候選人會在評估日表現(xiàn)出色?,F(xiàn)在宣布勝利還為時過早——但這是一個充滿希望的開端。
 

