Amazon’s Plan to Let Alexa Mimic Anyone’s Voice Raises Fears of Scams
Amazon is developing technology that will let its voice assistant Alexa mimic any human voice, living or dead, from less than a minute of recorded audio.
Amazon.com wants to give users the option to make Alexa sound exactly like their grandmother, or anyone else. The company hopes the project will make Alexa a bigger part of consumers’ lives, but public attention has lately been focused on other developments in artificial intelligence.
An engineer at Alphabet Inc’s (GOOGL.O) Google recently made the hotly disputed claim that one of the company’s chatbots had become sentient. Separately, an Amazon official said on Tuesday that Alexa has 100 million users worldwide, according to device sales figures the company has disclosed since January 2019.
Rohit Prasad, Amazon senior vice president and head scientist, demonstrated the feature at the company’s Re:Mars conference in Las Vegas on Wednesday with a video in which a child asks: “Alexa, can Grandma finish reading me The Wizard of Oz?”
In the video shown on stage, Amazon (AMZN) demonstrated how Alexa’s signature voice could be replaced by the voice of the little boy’s grandmother as the assistant read him the story. Prasad said the updated system can gather enough voice data from less than a minute of audio to make this kind of personalization possible.
Previously, the process required a person to spend hours in a recording studio. Prasad gave no further details about a possible release date, and Amazon declined to comment on a timeline.
According to Prasad, the idea originated in Amazon’s exploration of new ways to incorporate human characteristics into artificial intelligence. He called the work particularly relevant in light of the ongoing pandemic, during which so many people have had to say goodbye to someone they cared deeply about.
During the demonstration, Prasad suggested the feature could be used to honor a deceased family member. “So many of us have lost someone we love” in the COVID-19 pandemic, he said, and although artificial intelligence cannot take away the anguish of that loss, it can ensure that people’s memories live on.
As a result, companion-like communication has become a major area of focus for the company. Amazon wants Alexa to have “generalizable intelligence,” Prasad said: the capacity to adapt to a user’s environment and learn new concepts with little outside assistance.
That objective, he said, “is not to be mistaken with” the all-knowing, all-capable artificial general intelligence, or AGI, being pursued by Alphabet’s DeepMind unit and by OpenAI, which Elon Musk co-founded.
Despite the presentation’s upbeat emotional tone, many in the technology community quickly criticized the new Alexa feature, seeing voice mimicry less as a way to establish an emotional connection than as an ideal instrument for deepfakes, criminal schemes, and other malicious purposes.
The Technology Is More Harmful Than It Seems
A representative for Amazon told Fortune that Prasad’s presentation was based on the company’s recent text-to-speech (TTS) research.
“We’ve learned to produce a high-quality voice with much less data than recording in a professional studio,” the representative said. The voice-mimicking feature is still in development, and the company has not said when it plans to make it available to the general public.
According to Prasad, the new technology can create a high-quality voice from “less than a minute of recorded audio,” which was made feasible “by defining the challenge as a voice conversion task and not a speech generation task.”
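Prasad’s distinction can be sketched abstractly: a speech-generation (TTS) system maps text to audio and traditionally needs hours of a target speaker’s recordings, while a voice-conversion system takes audio that already exists and swaps only the speaker’s timbre, so far less target-speaker data is required. The toy sketch below illustrates this framing only; the function bodies are placeholders and do not represent Amazon’s actual method:

```python
from dataclasses import dataclass

@dataclass
class Audio:
    samples: list   # placeholder for a raw waveform
    speaker: str    # speaker label, for illustration only

def speech_generation(text: str, speaker_model: str) -> Audio:
    """TTS path: text -> audio. Building a convincing speaker_model
    traditionally required hours of studio recordings."""
    return Audio(samples=[0.0] * len(text), speaker=speaker_model)

def voice_conversion(source: Audio, target_speaker: str) -> Audio:
    """Voice-conversion path: audio -> audio. The linguistic content
    comes from `source`; only the voice identity is swapped, which is
    why under a minute of the target speaker's audio can suffice."""
    return Audio(samples=source.samples, speaker=target_speaker)

# Alexa's stock voice reads the story; conversion then re-voices it.
stock = speech_generation("Once upon a time...", speaker_model="alexa_default")
grandma = voice_conversion(stock, target_speaker="grandma")
```

The key point the framing captures is that conversion reuses the content of already-generated speech, leaving only the comparatively data-cheap problem of matching a new voice.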
Prasad said the technology could one day become commonplace in consumers’ lives and could help build users’ trust in their Amazon devices.
Amazon did not elaborate further on the feature, which is sure to raise significant privacy concerns and ethical questions about consent. “The companionship bond we have with Alexa is one of the things that astonished me the most. The human qualities of empathy and emotion are crucial for establishing trust in this companionship role,” Prasad said.
What Are the Fears Associated with This Technology?
Cutting-edge as the new mimicking feature may be, some observers, notably people at other technology companies, worry that it could be put to malicious use.
Natasha Crampton, Microsoft’s chief responsible AI officer, told Reuters that the company, which has also developed voice-imitation technology to assist people with speech impairments, limits which areas of its business can use the technology out of concern that it could be used to facilitate political deepfakes.
“This technology has enormous possibilities in education, accessibility, and entertainment, but it is also easy to envisage how it may be used to inappropriately impersonate speakers and deceive listeners,” Crampton wrote in a blog post.
The New Feature Is Fueling Anxieties Online
“Remember when we warned you that deepfakes would exacerbate the culture’s already-existing epistemological crisis and feelings of mistrust and alienation? Yeah, that. That time a LOT,” wrote Twitter user @wolven, who identifies himself in his bio as Damien P. Williams, a researcher studying algorithms, values, and bias.
- Author: Sophie Mellor – https://fortune.com/2022/06/23/amazons-plan-for-alexa-to-mimic-anyones-voice-raises-fears-it-will-be-used-for-deepfakes-and-scams/
- Author: Jeffrey Dastin – https://www.reuters.com/technology/amazon-has-plan-make-alexa-mimic-anyones-voice-2022-06-22/
- Author: Samantha Murphy Kelly – https://www.iqstock.news/n/amazon-alexa-mimic-deceased-loved-voices-4098647/