
演講MP3+雙語文稿:為什么我們和機(jī)器人有情感上的聯(lián)系

所屬教程:TED音頻


2022年05月21日

https://online2.tingclass.net/lesson/shi0529/10000/10387/tedyp124.mp3


【演講者及介紹】Kate Darling

凱特·達(dá)林(Kate Darling)——機(jī)器人倫理學(xué)家,研究人類與機(jī)器人之間的關(guān)系。

【演講主題】為什么我們和機(jī)器人有情感上的聯(lián)系

【中英文字幕】

Translation by psjmz mz. Reviewed by Xinran Bi.

00:13

There was a day, about 10 years ago, when I asked a friend to hold a baby dinosaur robot upside down. It was this toy called a Pleo that I had ordered, and I was really excited about it because I've always loved robots. And this one has really cool technical features. It had motors and touch sensors and it had an infrared camera. And one of the things it had was a tilt sensor, so it knew what direction it was facing. And when you held it upside down, it would start to cry. And I thought this was super cool, so I was showing it off to my friend, and I said, "Oh, hold it up by the tail. See what it does." So we're watching the theatrics of this robot struggle and cry out. And after a few seconds, it starts to bother me a little, and I said, "OK, that's enough now. Let's put him back down." And then I pet the robot to make it stop crying.

大概10年前的一天,我讓一個(gè)朋友把一個(gè)小恐龍機(jī)器人倒過來拿著。這個(gè)機(jī)器人是我訂購(gòu)的一款叫做Pleo的玩具,我對(duì)此非常興奮,因?yàn)槲乙恢倍己芟矚g機(jī)器人。這個(gè)機(jī)器人有很酷的技術(shù)特性。它有馬達(dá)和觸覺傳感器,還有一個(gè)紅外攝像頭。它還有一個(gè)部件是傾斜傳感器,所以它就會(huì)知道自己面對(duì)的是什么方向。當(dāng)你把它倒過來,它會(huì)開始哭泣。我覺得這點(diǎn)非常酷,所以我展示給我朋友看,我說:“噢,抓住尾巴把它倒提起來,看看它會(huì)怎樣?!庇谑俏覀兛粗@個(gè)機(jī)器人表演,掙扎和哭泣。幾秒鐘后,我開始感到有點(diǎn)不安,于是我說,“好了,差不多了,我們把它放回去吧?!比缓笪覔崦鴻C(jī)器人,讓它停止哭泣。

01:19

And that was kind of a weird experience for me. For one thing, I wasn't the most maternal person at the time. Although since then I've become a mother, nine months ago, and I've learned that babies also squirm when you hold them upside down.

這對(duì)我來說是一種奇怪的經(jīng)歷。首先,我當(dāng)時(shí)還不是個(gè)很有母性的人。不過從那以后,就在9個(gè)月前,我成為了一位母親,我還了解到,當(dāng)你把嬰兒倒過來抱時(shí),他們也會(huì)扭動(dòng)掙扎。

01:33

(Laughter)

(笑聲)

01:35

But my response to this robot was also interesting because I knew exactly how this machine worked, and yet I still felt compelled to be kind to it. And that observation sparked a curiosity that I've spent the past decade pursuing. Why did I comfort this robot? And one of the things I discovered was that my treatment of this machine was more than just an awkward moment in my living room, that in a world where we're increasingly integrating robots into our lives, an instinct like that might actually have consequences, because the first thing that I discovered is that it's not just me.

但我對(duì)這個(gè)機(jī)器人的反應(yīng)也非常有趣,因?yàn)槲掖_切地知道這個(gè)機(jī)器工作的原理,然而我仍然感到有必要對(duì)它仁慈些。這個(gè)觀察引起了好奇心,讓我花費(fèi)了長(zhǎng)達(dá)10年的時(shí)間去追尋。為什么我會(huì)去安慰這個(gè)機(jī)器人?我發(fā)現(xiàn)我對(duì)待這個(gè)機(jī)器人的方式不僅是我起居室里一個(gè)尷尬時(shí)刻,在這個(gè)世界里,我們正越來越多地將機(jī)器人融入到我們生活中,像這樣的本能可能會(huì)產(chǎn)生一些后果,因?yàn)槲野l(fā)現(xiàn)的第一件事情是,這并非只是發(fā)生在我身上的個(gè)例。

02:20

In 2007, the Washington Post reported that the United States military was testing this robot that defused land mines. And the way it worked was it was shaped like a stick insect and it would walk around a minefield on its legs, and every time it stepped on a mine, one of the legs would blow up, and it would continue on the other legs to blow up more mines. And the colonel who was in charge of this testing exercise ends up calling it off, because, he says, it's too inhumane to watch this damaged robot drag itself along the minefield. Now, what would cause a hardened military officer and someone like myself to have this response to robots?

2007年,華盛頓郵報(bào)報(bào)道稱,美國(guó)軍方正在測(cè)試一種用于排雷的機(jī)器人。它的形狀就像一只竹節(jié)蟲,用腿在雷區(qū)上行走,每次踩到地雷時(shí),它的一條腿就會(huì)被炸掉,然后它會(huì)繼續(xù)用其他腿去引爆更多的地雷。負(fù)責(zé)這次測(cè)試的上校后來取消了這個(gè)測(cè)試,因?yàn)樗f,看著這個(gè)殘損的機(jī)器人拖著身軀在雷區(qū)掙扎前行,實(shí)在太不人道了。那么,是什么讓一位久經(jīng)沙場(chǎng)的軍官和像我這樣的人對(duì)機(jī)器人產(chǎn)生這種反應(yīng)呢?

03:04

Well, of course, we're primed by science fiction and pop culture to really want to personify these things, but it goes a little bit deeper than that. It turns out that we're biologically hardwired to project intent and life onto any movement in our physical space that seems autonomous to us. So people will treat all sorts of robots like they're alive. These bomb-disposal units get names. They get medals of honor. They've had funerals for them with gun salutes. And research shows that we do this even with very simple household robots, like the Roomba vacuum cleaner.

不可否認(rèn),科幻小說和流行文化早已讓我們傾向于將這些東西擬人化,但原因比這更深一層。事實(shí)表明,我們天生就會(huì)把意圖和生命投射到物理空間中任何在我們看來能自主運(yùn)動(dòng)的東西上。所以人們會(huì)像對(duì)待活物一樣對(duì)待各種各樣的機(jī)器人。這些拆彈機(jī)器人有自己的名字。它們能獲得榮譽(yù)勛章。人們?yōu)樗鼈兣e行過鳴槍致敬的葬禮。研究還發(fā)現(xiàn),我們即便對(duì)非常簡(jiǎn)單的家用機(jī)器人也會(huì)這樣,比如Roomba吸塵器。

03:41

(Laughter)

(笑聲)

03:42

It's just a disc that roams around your floor to clean it, but just the fact it's moving around on its own will cause people to name the Roomba and feel bad for the Roomba when it gets stuck under the couch.

它只是一個(gè)在你家地板上四處移動(dòng)進(jìn)行清掃的圓盤,但僅僅因?yàn)樗軌蜃约阂苿?dòng),就會(huì)讓人們給Roomba取名字,當(dāng)它卡在沙發(fā)下面時(shí),還會(huì)替它感到難過。

03:53

(Laughter)

(笑聲)

03:55

And we can design robots specifically to evoke this response, using eyes and faces or movements that people automatically, subconsciously associate with states of mind. And there's an entire body of research called human-robot interaction that really shows how well this works. So for example, researchers at Stanford University found out that it makes people really uncomfortable when you ask them to touch a robot's private parts.

我們可以專門設(shè)計(jì)機(jī)器人來喚起這種反應(yīng),利用眼睛、面孔或動(dòng)作這些人們會(huì)自動(dòng)地、下意識(shí)地與心理狀態(tài)聯(lián)系起來的特征。有一整個(gè)叫做“人機(jī)交互”的研究領(lǐng)域表明,這種方法的效果的確非常好。比如,斯坦福大學(xué)的研究者發(fā)現(xiàn),當(dāng)你讓人們?nèi)ビ|摸機(jī)器人的私處時(shí),他們會(huì)感到很不舒服。

04:20

(Laughter)

(笑聲)

04:22

So from this, but from many other studies, we know, we know that people respond to the cues given to them by these lifelike machines, even if they know that they're not real.

從這個(gè)以及更多其他研究中,我們知道人們會(huì)對(duì)這些栩栩如生的機(jī)器給他們的線索做出反應(yīng),即使他們知道它們只是機(jī)器。

04:34

Now, we're headed towards a world where robots are everywhere. Robotic technology is moving out from behind factory walls. It's entering workplaces, households. And as these machines that can sense and make autonomous decisions and learn enter into these shared spaces, I think that maybe the best analogy we have for this is our relationship with animals. Thousands of years ago, we started to domesticate animals, and we trained them for work and weaponry and companionship. And throughout history, we've treated some animals like tools or like products, and other animals, we've treated with kindness and we've given a place in society as our companions. I think it's plausible we might start to integrate robots in similar ways.

我們正邁入一個(gè)機(jī)器人無處不在的世界。機(jī)器人技術(shù)正在走出工廠的圍墻,進(jìn)入工作場(chǎng)所和家庭。隨著這些能夠感知、自主決策和學(xué)習(xí)的機(jī)器進(jìn)入這些共享空間,我認(rèn)為最好的類比也許就是我們和動(dòng)物的關(guān)系。幾千年前,我們開始馴養(yǎng)動(dòng)物,我們訓(xùn)練它們?yōu)槲覀儎谧?、作?zhàn),也陪伴我們。在整個(gè)歷史進(jìn)程中,我們把有些動(dòng)物當(dāng)作工具或產(chǎn)品使用,對(duì)另一些動(dòng)物,我們則友善相待,在社會(huì)中給予它們同伴的位置。我認(rèn)為我們很可能會(huì)開始以類似的方式接納機(jī)器人。

05:22

And sure, animals are alive. Robots are not. And I can tell you, from working with roboticists, that we're pretty far away from developing robots that can feel anything. But we feel for them, and that matters, because if we're trying to integrate robots into these shared spaces, we need to understand that people will treat them differently than other devices, and that in some cases, for example, the case of a soldier who becomes emotionally attached to the robot that they work with, that can be anything from inefficient to dangerous. But in other cases, it can actually be useful to foster this emotional connection to robots. We're already seeing some great use cases, for example, robots working with autistic children to engage them in ways that we haven't seen previously, or robots working with teachers to engage kids in learning with new results. And it's not just for kids. Early studies show that robots can help doctors and patients in health care settings.

當(dāng)然,動(dòng)物是有生命的,機(jī)器人沒有。而由于和機(jī)器人專家們共事,我可以告訴各位,我們距離開發(fā)出能有任何感受的機(jī)器人還很遙遠(yuǎn)。但我們會(huì)同情它們,這一點(diǎn)很重要,因?yàn)槿绻覀儑L試把機(jī)器人融入這些共享空間,就需要懂得人們會(huì)把它們與其他設(shè)備區(qū)別對(duì)待,而且在有些場(chǎng)景下,比如士兵對(duì)與自己共事的機(jī)器人產(chǎn)生情感依戀的例子,這輕則導(dǎo)致低效,重則帶來危險(xiǎn)。但在其他場(chǎng)景下,培養(yǎng)與機(jī)器人的情感聯(lián)系可能非常有用。我們已經(jīng)看到了一些很好的應(yīng)用場(chǎng)景,比如機(jī)器人以我們前所未見的方式與自閉癥兒童互動(dòng),或者機(jī)器人與老師配合,在吸引孩子投入學(xué)習(xí)方面取得新的成果。而且這不只適用于兒童。早期研究發(fā)現(xiàn),機(jī)器人可以在醫(yī)療保健場(chǎng)景中幫助醫(yī)生和病人。

06:26

This is the PARO baby seal robot. It's used in nursing homes and with dementia patients. It's been around for a while. And I remember, years ago, being at a party and telling someone about this robot, and her response was, "Oh my gosh. That's horrible. I can't believe we're giving people robots instead of human care." And this is a really common response, and I think it's absolutely correct, because that would be terrible. But in this case, it's not what this robot replaces. What this robot replaces is animal therapy in contexts where we can't use real animals but we can use robots, because people will consistently treat them more like an animal than a device.

這是帕羅(PARO)嬰兒海豹機(jī)器人,被用于療養(yǎng)院和失智癥患者的照護(hù)。它已經(jīng)面世有一陣子了。我記得若干年前,在一次聚會(huì)上跟人講到這個(gè)機(jī)器人時(shí),她的反應(yīng)是:“哦,天哪。太可怕了。我無法相信我們給人們的是機(jī)器人護(hù)理,而不是人類護(hù)理?!边@是一個(gè)非常普遍的反應(yīng),而且我認(rèn)為它完全正確,因?yàn)槟菢訒?huì)很可怕。但在這個(gè)場(chǎng)景下,這個(gè)機(jī)器人替代的并不是人類護(hù)理。它替代的是動(dòng)物療法,用在那些無法使用真正的動(dòng)物、但可以使用機(jī)器人的場(chǎng)合,因?yàn)槿藗兛偸菚?huì)把它們當(dāng)成動(dòng)物而不是設(shè)備看待。

07:16

Acknowledging this emotional connection to robots can also help us anticipate challenges as these devices move into more intimate areas of people's lives. For example, is it OK if your child's teddy bear robot records private conversations? Is it OK if your sex robot has compelling in-app purchases?

承認(rèn)這種與機(jī)器人的情感聯(lián)系,也能幫助我們?cè)谶@些設(shè)備進(jìn)入人們生活中更私密的領(lǐng)域時(shí)預(yù)見到挑戰(zhàn)。比如,如果你孩子的泰迪熊機(jī)器人會(huì)錄下私人對(duì)話,這合適嗎?如果你的性愛機(jī)器人有誘人的應(yīng)用內(nèi)購(gòu)買,這合適嗎?

07:34

(Laughter)

(笑聲)

07:36

Because robots plus capitalism equals questions around consumer protection and privacy.

因?yàn)闄C(jī)器人加上資本主義,就等于一系列圍繞消費(fèi)者保護(hù)和隱私的問題。

07:43

And those aren't the only reasons that our behavior around these machines could matter. A few years after that first initial experience I had with this baby dinosaur robot, I did a workshop with my friend Hannes Gassert. And we took five of these baby dinosaur robots and we gave them to five teams of people. And we had them name them and play with them and interact with them for about an hour. And then we unveiled a hammer and a hatchet and we told them to torture and kill the robots.

這些還不是我們對(duì)待這些機(jī)器的行為之所以重要的唯一原因。在我與這只小恐龍機(jī)器人初次接觸的幾年后,我和朋友漢內(nèi)斯·加瑟特開展了一次研討會(huì)。我們拿了5個(gè)這樣的小恐龍機(jī)器人,把它們分給5隊(duì)人。我們讓他們給機(jī)器人取名字,和它們玩耍互動(dòng)大約一個(gè)小時(shí)。然后我們拿出了一把錘子和一把短柄斧,讓他們?nèi)フ勰ズ蜌⑺肋@些機(jī)器人。

08:13

(Laughter)

(笑聲)

08:17

And this turned out to be a little more dramatic than we expected it to be, because none of the participants would even so much as strike these baby dinosaur robots, so we had to improvise a little, and at some point, we said, "OK, you can save your team's robot if you destroy another team's robot."

這個(gè)結(jié)果比我們想的要更有戲劇性,因?yàn)樯踔翛]有一個(gè)參與者去攻擊這些小恐龍機(jī)器人。所以我們得臨時(shí)湊合一下,在某個(gè)時(shí)候,我們說,“好吧,你可以保住你們隊(duì)的機(jī)器人,但前提是把其它隊(duì)的機(jī)器人毀掉?!?

08:35

(Laughter)

(笑聲)

08:37

And even that didn't work. They couldn't do it. So finally, we said, "We're going to destroy all of the robots unless someone takes a hatchet to one of them." And this guy stood up, and he took the hatchet, and the whole room winced as he brought the hatchet down on the robot's neck, and there was this half-joking, half-serious moment of silence in the room for this fallen robot.

即便這樣也沒用,他們下不了手。所以最后,我們說:“我們要?dú)У羲械臋C(jī)器人,除非有人拿短柄斧砍向其中一個(gè)?!庇袀€(gè)人站了起來,拿起了斧頭,當(dāng)他把斧頭砍向機(jī)器人的脖子時(shí),整個(gè)房間的人都為之一顫,接著房間里出現(xiàn)了為這個(gè)倒下的機(jī)器人半開玩笑、半嚴(yán)肅的默哀時(shí)刻。

09:02

(Laughter)

(笑聲)

09:03

So that was a really interesting experience. Now, it wasn't a controlled study, obviously, but it did lead to some later research that I did at MIT with Palash Nandy and Cynthia Breazeal, where we had people come into the lab and smash these HEXBUGs that move around in a really lifelike way, like insects. So instead of choosing something cute that people are drawn to, we chose something more basic, and what we found was that high-empathy people would hesitate more to hit the HEXBUGs.

那真是一次有趣的體驗(yàn)。顯然,它不是一項(xiàng)對(duì)照研究,但它引出了我后來在麻省理工跟帕拉什·南迪和辛西婭·布雷西亞爾做的研究:我們讓來到實(shí)驗(yàn)室的人們砸碎這些像昆蟲一樣逼真地四處移動(dòng)的HEXBUG電子甲蟲。我們沒有選擇人們會(huì)被吸引的可愛東西,而是選擇了更基本的東西,結(jié)果我們發(fā)現(xiàn),高同理心的人在砸這些HEXBUG時(shí)會(huì)更加猶豫。

09:34

Now this is just a little study, but it's part of a larger body of research that is starting to indicate that there may be a connection between people's tendencies for empathy and their behavior around robots. But my question for the coming era of human-robot interaction is not: "Do we empathize with robots?" It's: "Can robots change people's empathy?" Is there reason to, for example, prevent your child from kicking a robotic dog, not just out of respect for property, but because the child might be more likely to kick a real dog?

這只是一項(xiàng)小研究,但它是一個(gè)更大的研究體系的一部分,這些研究開始表明,人們的同理心傾向與他們對(duì)待機(jī)器人的行為之間可能存在某種聯(lián)系。但對(duì)于即將到來的人機(jī)交互時(shí)代,我的問題不是:“我們會(huì)對(duì)機(jī)器人產(chǎn)生同理心嗎?”而是:“機(jī)器人能改變?nèi)说耐硇膯??”比如說,我們是否有理由阻止孩子踢一只機(jī)器狗,不只是出于對(duì)財(cái)產(chǎn)的尊重,而是因?yàn)檫@樣的孩子可能更容易去踢一只真狗?

10:11

And again, it's not just kids. This is the violent video games question, but it's on a completely new level because of this visceral physicality that we respond more intensely to than to images on a screen. When we behave violently towards robots, specifically robots that are designed to mimic life, is that a healthy outlet for violent behavior or is that training our cruelty muscles? We don't know ... But the answer to this question has the potential to impact human behavior, it has the potential to impact social norms, it has the potential to inspire rules around what we can and can't do with certain robots, similar to our animal cruelty laws. Because even if robots can't feel, our behavior towards them might matter for us. And regardless of whether we end up changing our rules, robots might be able to help us come to a new understanding of ourselves.

而且,這不只關(guān)乎兒童。這和暴力電子游戲的問題類似,但它上升到了一個(gè)全新的層面,因?yàn)檫@種發(fā)自本能的身體實(shí)感,讓我們的反應(yīng)比對(duì)屏幕上的圖像強(qiáng)烈得多。當(dāng)我們對(duì)機(jī)器人,特別是專門設(shè)計(jì)來模擬生命的機(jī)器人做出暴力行為時(shí),這是暴力行為的健康宣泄口,還是在鍛煉我們的殘忍“肌肉”?我們還不知道……但這個(gè)問題的答案有可能影響人類行為,有可能影響社會(huì)規(guī)范,有可能啟發(fā)我們圍繞對(duì)特定機(jī)器人能做什么和不能做什么制定規(guī)則,就類似于我們反虐待動(dòng)物的法律。因?yàn)榧幢銠C(jī)器人沒有感受,我們對(duì)待它們的方式也可能對(duì)我們自己有重要意義。而且不管我們最終是否會(huì)改變我們的規(guī)則,機(jī)器人也許都能幫助我們對(duì)自己形成全新的認(rèn)識(shí)。

11:15

Most of what I've learned over the past 10 years has not been about technology at all. It's been about human psychology and empathy and how we relate to others. Because when a child is kind to a Roomba, when a soldier tries to save a robot on the battlefield, or when a group of people refuses to harm a robotic baby dinosaur, those robots aren't just motors and gears and algorithms. They're reflections of our own humanity.

我在過去10年中學(xué)到的經(jīng)驗(yàn)大部分跟技術(shù)無關(guān),而是關(guān)于人類心理學(xué),同情心,以及我們?nèi)绾闻c他人相處。因?yàn)楫?dāng)一個(gè)兒童友好地對(duì)待Roomba時(shí),當(dāng)一個(gè)士兵試圖拯救戰(zhàn)場(chǎng)上的機(jī)器人時(shí),或者當(dāng)一組人拒絕傷害小恐龍機(jī)器人時(shí),這些機(jī)器人就不只是馬達(dá),齒輪和算法。它們映射出了我們的人性。

11:46

Thank you.

謝謝。

11:47

(Applause)

(掌聲)
