【每天一篇經(jīng)濟(jì)學(xué)人】Speak easy 容易說(shuō)(2023年第45期)

文章來(lái)源:《經(jīng)濟(jì)學(xué)人》Jul 22nd 2023 期 Culture 欄目 Speak easy 容易說(shuō)

[Paragraph 1]
Jennifer DeStefano answered a call from a number she did not recognise. “Mom, I messed up,” her daughter’s voice told her, sobbing. “These bad men have me.” A man proceeded to demand money, or he would drug her daughter and leave her in Mexico. But while she kept him on the phone, friends managed to reach her daughter, only to discover that she was, in fact, free and well on a skiing trip in Arizona. The voice used on the phone was a fake.
詹妮弗·德斯特凡諾接聽(tīng)了一個(gè)陌生電話。“媽媽,我搞砸了,”她女兒哭著說(shuō),“這些壞人抓了我?!?一名男子接著索要錢財(cái),拿不到錢他就會(huì)給她女兒下藥,然后把她丟在墨西哥。但當(dāng)她保持通話的同時(shí),朋友們?cè)O(shè)法聯(lián)系上了她的女兒,實(shí)際上她女兒人身安全且自由,正在亞利桑那州參加滑雪旅行。電話里的聲音是偽造的。
[Paragraph 2]
Ms DeStefano, still shaken, told this story to a US Senate subcommittee hearing on artificial intelligence in June. The dangers that voice-cloning technology poses are only now starting to be uttered aloud. In recent months, most of the attention paid to artificial intelligence (AI) has gone to so-called “l(fā)arge-language models” like ChatGPT, which churn out text. But voice cloning’s implications will also be profound.
德斯特凡諾仍心有余悸,她在今年6月份的美國(guó)參議院人工智能小組委員會(huì)聽(tīng)證會(huì)上講述了這個(gè)故事。人們才剛剛開(kāi)始公開(kāi)討論聲音克隆技術(shù)的危害。最近幾個(gè)月,人工智能的關(guān)注點(diǎn)主要集中在“大型語(yǔ)言模型”,如可生成文本的ChatGPT。然而,聲音克隆也將產(chǎn)生深遠(yuǎn)影響。
[Paragraph 3]
A brief sample of a voice can be used to train an AI model, which can then speak any given text sounding like that person. Apple is expected to include the feature for iPhones in its new operating system, iOS 17, due to be released in September. It is advertised as helping people who may be in danger of losing their voice, for example to a degenerative disease such as ALS.
可用一個(gè)簡(jiǎn)短的聲音樣本來(lái)訓(xùn)練人工智能模型,然后它就可以用樣本人的聲音說(shuō)出任何給定的文本。蘋(píng)果預(yù)計(jì)將在 9 月發(fā)布新的操作系統(tǒng) iOS 17,蘋(píng)果手機(jī)中將加入這項(xiàng)功能。據(jù)宣傳,該功能可以幫助那些可能面臨失聲危險(xiǎn)的人,例如漸凍癥患者。
[Paragraph 4]
For those eager to try voice cloning now, ElevenLabs, an AI startup, offers users the chance to create their own clones in minutes. The results are disturbingly accurate. When generating a playback, the system offers a slider that allows users to choose between variability and stability. Select more variability, and the audio will have a lifelike intonation, including pauses and stumbles like “er…” Choose “stability”, and it will come across more like a calm and dispassionate newsreader.
對(duì)于那些迫切想嘗試語(yǔ)音克隆的人來(lái)說(shuō),AI初創(chuàng)公司ElevenLabs提供了一個(gè)機(jī)會(huì),用戶在幾分鐘內(nèi)即可創(chuàng)建自己的克隆聲音。結(jié)果驚人地準(zhǔn)確。在生成音頻回放時(shí),系統(tǒng)提供了一個(gè)滑動(dòng)條,允許用戶在變化性和穩(wěn)定性之間進(jìn)行選擇。選擇“多變”,音頻就會(huì)有逼真的語(yǔ)調(diào),包括“呃……”這樣的停頓和結(jié)巴。選擇“穩(wěn)定”,聲音就會(huì)更像一個(gè)冷靜、平和的新聞播報(bào)員。
[Paragraph 5]
Taylor Jones, a linguist and consultant, took a careful look at the quality of ElevenLabs’s clone of his voice in a YouTube video. A lower-tech test, a “conversation” with his own mother, fooled the woman who raised him. (“Don’t you ever do that again,” she warned.)
泰勒·瓊斯是一位語(yǔ)言學(xué)家兼顧問(wèn),他在一段 YouTube 視頻中仔細(xì)評(píng)估了 ElevenLabs 克隆他聲音的質(zhì)量。在一項(xiàng)技術(shù)含量較低的測(cè)試中,他與自己的母親“對(duì)話”,成功糊弄了這位養(yǎng)育他的人。(“你可別再這樣做了,”她警告說(shuō)。)
[Paragraph 6]
For several years, customers have been able to identify themselves over the phone to their bank and other companies using their voice. This was a security upgrade, not a danger. Not even a gifted mimic could fool the detection system. But the advent of cloning will force adaptation, for example by including voice as only one of several identification factors (and thus undercutting the convenience), in order to prevent fraud.
多年來(lái),客戶一直可以通過(guò)電話里自己的聲音向他們的銀行和其他公司進(jìn)行身份驗(yàn)證。這是一種安全升級(jí)措施,而不是一種危險(xiǎn)。即使是天才的模仿者也無(wú)法騙過(guò)檢測(cè)系統(tǒng)。但是聲音克隆技術(shù)的出現(xiàn)將迫使銀行等機(jī)構(gòu)做出調(diào)整,例如聲音僅作為多種身份驗(yàn)證因素之一(因此便利性減弱了),以防詐騙。
[Paragraph 7]
Creative industries could face disruption too. Voice actors’ skills, trained over a lifetime, can be ripped off in a matter of seconds. The Telegraph, a British broadsheet, recently reported on actors who had mistakenly signed away rights to their voices, making it possible to clone them for nothing. New contracts will be needed in future. But some actors may, in fact, find cloning congenial. Val Kilmer, who has lost much of his voice to throat cancer, was delighted to have his voice restored for “Top Gun: Maverick”. Others may be spared heading to the studio for retakes. It is the middling professional, not the superstar, who is most threatened.
創(chuàng)意產(chǎn)業(yè)也可能會(huì)被顛覆。配音演員畢生磨練的技能,可能在幾秒鐘內(nèi)就被盜用。英國(guó)大報(bào)《每日電訊報(bào)》最近報(bào)道,一些演員誤簽了合同,放棄了自己聲音的權(quán)利,使得他人可以免費(fèi)克隆他們的聲音。未來(lái)將需要新的合同。但事實(shí)上,一些演員可能會(huì)覺(jué)得聲音克隆正合心意。瓦爾·基爾默因喉癌失去了大部分聲音,他很高興能在《壯志凌云:獨(dú)行俠》中恢復(fù)自己的聲音。其他演員則可以免去前往錄音室重錄的麻煩。最受威脅的是中等水平的從業(yè)者,而不是超級(jí)明星。
[Paragraph 8]
Another industry that will have to come to grips with the rise of clones is journalism. On-the-sly recordings have long been the stuff of blockbuster scoops. Now who will trust a story based on an audio clip?
新聞業(yè)也將不得不應(yīng)對(duì)聲音克隆的興起。秘密錄音長(zhǎng)期以來(lái)一直是轟動(dòng)一時(shí)的獨(dú)家新聞的素材。現(xiàn)在,誰(shuí)還會(huì)相信基于一段音頻片段的報(bào)道呢?
[Paragraph 9]
Slightly easier to manage might be the false positives: recordings purporting to be someone but which are fakes. Sophisticated forensic techniques could be of use here, proving a clip to be AI, say, in a courtroom. The opposite problem—the false negatives—will arise when public figures deny authentic recordings. Proving that a clip is genuine is hard, perhaps even impossible. Journalists will need to show how they obtained and stored audio files—unless, as so often, they have promised a source anonymity.
稍微容易處理的可能是假正類情況:聲稱是某人的錄音,但實(shí)際上是偽造的。精細(xì)的取證技術(shù)可以在此派上用場(chǎng),例如在法庭上證明一段音頻是由人工智能生成的。當(dāng)公眾人物否認(rèn)真實(shí)錄音時(shí),就會(huì)出現(xiàn)相反的問(wèn)題——假負(fù)類情況。證明一段音頻的真實(shí)性很難,甚至或許根本不可能。新聞?dòng)浾邔⑿枰故舅麄儷@取和存儲(chǔ)音頻文件的過(guò)程——除非他們像通常那樣,已承諾為消息來(lái)源保密身份。
[Paragraph 10]
During his first presidential run, Mr Trump did more than anyone to popularise the term “fake news”—and that was well before voice cloning, deepfake videos, artificially generated images and the like were widespread. Now, ever more people caught up in wrongdoing will be tempted by the defence, “It wasn’t me.” Many people will have even more reason to believe them.
在特朗普第一次競(jìng)選總統(tǒng)期間,他比任何人都更賣力地普及了“假新聞”一詞——而那時(shí)候聲音克隆、深度偽造視頻、人工生成的圖像等還遠(yuǎn)未普及。現(xiàn)在,越來(lái)越多被卷入不法行為的人將會(huì)忍不住用“那不是我”來(lái)辯解。許多人也將有更多理由相信他們。