Amazon’s Plan for Alexa to Replicate Anyone’s Voice Raises Scam Concerns
Amazon is developing new technology that will let its voice assistant Alexa imitate any person’s voice, living or dead, from less than a minute of recorded audio.
Amazon.com wants to give users the option to make Alexa sound exactly like their grandmother, or anyone else. Amazon hopes the project will make Alexa more commonplace in consumers’ lives, but public attention has already moved on to other debates about artificial intelligence.
A developer at Alphabet Inc.’s (GOOGL.O) Google recently made the hotly disputed claim that a company chatbot had become sentient. Separately, another Amazon official said on Tuesday that Alexa has 100 million users worldwide, in line with device-sales figures the company has disclosed since January 2019.
Rohit Prasad, senior vice president and head scientist at Amazon, demonstrated the feature at the company’s re:MARS conference in Las Vegas on Wednesday with a video of a child asking an Amazon device, “Alexa, can Grandma finish reading me The Wizard of Oz?”
Amazon (AMZN) showed on stage how the voice of the boy’s grandmother could be substituted for Alexa’s signature voice as the assistant read the story to him. Prasad said the updated system can gather enough voice data from less than a minute of audio to make this kind of personalization possible.
In the past, the process required a person to spend hours in a recording studio. Prasad did not say when the feature might be released, and Amazon declined to comment on a timeline.
According to Prasad, the idea grew out of Amazon’s exploration of new ways to give artificial intelligence more human attributes, something he said is especially relevant during a pandemic in which so many people have had to say goodbye to someone they cared deeply about.
During the demonstration, Prasad suggested the feature could be used to memorialize a deceased family member. “So many of us have lost someone we love,” he said, referring to the COVID-19 pandemic. While artificial intelligence cannot take away the anguish of loss, he added, it can help people’s memories live on.
Artificial companion-like communication has accordingly become a major area of focus for the company. Prasad said Amazon wants Alexa to have “generalizable intelligence,” the capacity to adapt to user contexts and learn new concepts with little outside assistance.
That goal, he said, “is not to be mistaken with” the all-knowing, all-capable artificial general intelligence, or AGI, being pursued by Alphabet’s DeepMind unit and by OpenAI, which Elon Musk co-founded.
Despite the presentation’s upbeat emotional tone, however, some in the technology community quickly criticized the new Alexa feature, regarding voice mimicry less as a way to build an emotional connection than as an ideal tool for deepfakes, criminal schemes, and other malicious purposes.
The Technology Is More Harmful Than It Seems
A representative for Amazon told Fortune that Prasad’s presentation was based on text-to-speech (TTS) research the company has been conducting using recent technological advances.
“We’ve learned to produce a high-quality voice with much less data than recording in a professional studio,” the representative said. The voice-mimicking feature is still in development, and the company has not disclosed when it plans to make it available to the general public.
According to Prasad, the new technology can create a high-quality voice from “less than a minute of recorded audio,” which is made feasible “by defining the challenge as a voice conversion task and not a speech generation path.”
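Prasad’s distinction between the two framings can be illustrated with a toy sketch. This is not Amazon’s system: the “frames,” the three-number timbre vectors, and the function names below are all invented for illustration. The point is only that speech generation trains a full model on the target voice (hours of audio), while voice conversion synthesizes speech in a generic voice and then re-maps its timbre using a short sample of the target speaker.

```python
# Toy illustration (NOT Amazon's system): voice conversion vs. speech generation.
# A "voice" here is an invented 3-number timbre vector: (pitch, brightness, pace).

def speaker_embedding(sample_frames):
    """Average a short list of toy audio frames into one timbre vector."""
    n = len(sample_frames)
    return tuple(sum(f[i] for f in sample_frames) / n for i in range(3))

def generic_tts(text):
    """Stand-in TTS: emit one frame per word, all in a fixed generic voice."""
    return [(1.0, 1.0, 1.0) for _ in text.split()]

def voice_convert(frames, target, generic=(1.0, 1.0, 1.0)):
    """Rescale each frame's timbre from the generic voice to the target's."""
    return [tuple(f[i] * target[i] / generic[i] for i in range(3))
            for f in frames]

# "Less than a minute" of the target speaker: just a couple of frames.
grandma_sample = [(0.8, 1.3, 0.9), (0.82, 1.28, 0.88)]
grandma = speaker_embedding(grandma_sample)

# Generate speech generically, then convert its timbre to the target voice.
frames = voice_convert(generic_tts("once upon a time"), grandma)
```

Because the conversion step only needs a timbre estimate rather than a trained per-speaker model, the target speaker contributes seconds of audio instead of hours, which is the economy Prasad described.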
Prasad said the technology could one day become common in consumers’ lives and could help build users’ trust in their Amazon devices. To create the feature, he said, the company had to figure out how to capture a “high-quality voice” from a short recording rather than hours in a studio.
Amazon did not elaborate on the feature, which is sure to raise significant privacy concerns and ethical questions about consent. “The companionship bond we have with Alexa is one of the things that astonished me the most. The human qualities of empathy and emotion are crucial for establishing trust in this companionship role,” Prasad said.
What Are the Fears Associated with This Technology?
Even though the new mimicry feature may be cutting-edge, some people, notably those at industry-related companies, worry that it could be used for malicious ends.
Natasha Crampton, Microsoft’s chief responsible A.I. officer, told Reuters that the company, which has also developed voice-imitation technology to assist people with speech impairments, limited which parts of its business could use the technology out of concern that it would be used to facilitate political deepfakes.
“This technology has enormous possibilities in education, accessibility, and entertainment, but it is also easy to envisage how it may be used to inappropriately impersonate speakers and deceive listeners,” Crampton wrote in a blog post.
The New Feature Is Fueling Anxieties Online
“Remember when we warned you that deepfakes would exacerbate the culture’s already-existing epistemological crisis and feelings of mistrust and alienation? Yeah that. That time a LOT,” tweeted Damien P. Williams (@wolven), who identifies himself in his Twitter bio as a researcher in algorithms, values, and bias.
- Author: Sophie Mellor – https://fortune.com/2022/06/23/amazons-plan-for-alexa-to-mimic-anyones-voice-raises-fears-it-will-be-used-for-deepfakes-and-scams/
- Author: Jeffrey Dastin – https://www.reuters.com/technology/amazon-has-plan-make-alexa-mimic-anyones-voice-2022-06-22/
- Author: Samantha Murphy Kelly – https://www.iqstock.news/n/amazon-alexa-mimic-deceased-loved-voices-4098647/