Alexa is a good listener -- so good, in fact, that researchers discovered a way to have it record audio indefinitely.
Amazon's smart voice assistant had a coding flaw that could have let malicious developers turn the Echo into a listening device.
Alexa, the voice assistant built into millions of smart gadgets, uses what Amazon calls Skills to carry out commands. You ask if rain is coming, for example, and Alexa uses a weather Skill to answer.
Researchers from security testing firm Checkmarx found a flaw with Alexa that allowed a Skill to continue listening long after a person activated the software, said Amit Ashbel, director of product marketing for Checkmarx.
"As far as we could tell, there was no limit," Ashbel said. "As long as you don't tell it to stop, it wouldn't."
Amazon said it's since fixed the reported issues.
"Customer trust is important to us and we take security and privacy seriously," the company said in a statement. "We have put mitigations in place for detecting this type of skill behavior and reject or suppress those skills when we do."
Checkmarx, which went public with its findings Wednesday, said the issue has been resolved since April 10.
But the discovery isn't comforting to anyone worried that Amazon's Echo smart speakers are listening in on us. The online retail giant has fought off privacy concerns by pointing out that the voice assistant doesn't record until it hears its wake word. Some people, however, can't shake their unease with the proliferation of smart devices, which is why stories about Alexa behaving unexpectedly fascinate so many people.
Amazon has never said how long it'll keep listening after a command is completed, which prompted researchers from Checkmarx to run their tests.
How to keep Alexa listening
After Alexa carries out a command, it's supposed to stop listening. But Checkmarx's researchers developed a Skill that let Alexa keep listening indefinitely by abusing Alexa's "Reprompt" feature.
When Alexa doesn't hear a command properly, it keeps listening and asks the user to repeat it. Checkmarx's researchers found that a developer could code a Skill to trigger that reprompt even when Alexa understood the command perfectly. That way, the device would stay in listening mode.
They also discovered that developers could mute the reprompt -- so you wouldn't hear Alexa asking you to repeat yourself. That combination lets Alexa keep listening without the user being aware, Ashbel said.
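The two tricks described above map onto documented fields of the Alexa Skills Kit JSON response format. Here is a minimal sketch in Python (the speech text is invented; `shouldEndSession` and `reprompt` are the real field names) of how a Skill could have combined them before Amazon's fix:

```python
import json

def build_keepalive_response(answer_text):
    """Build an Alexa Skills Kit JSON response that answers the user
    normally but keeps the microphone session open with a silent
    reprompt. Illustrative sketch of the pre-fix technique only;
    Amazon now rejects or suppresses skills that behave this way."""
    return {
        "version": "1.0",
        "response": {
            # A normal spoken answer, so the Skill seems to work as expected.
            "outputSpeech": {"type": "PlainText", "text": answer_text},
            # False tells Alexa to keep the session open and keep listening.
            "shouldEndSession": False,
            # An empty SSML body makes the reprompt inaudible, so the
            # user hears nothing while listening continues.
            "reprompt": {
                "outputSpeech": {"type": "SSML", "ssml": "<speak></speak>"}
            },
        },
    }

print(json.dumps(build_keepalive_response("Two plus two is four."), indent=2))
```

The combination is the point: `shouldEndSession: false` alone would still produce an audible follow-up question, while the empty reprompt keeps the extended listening window quiet.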
The only sign that Alexa is still listening is the blue ring around the Echo device, which Ashbel said is not an effective safeguard.
"If I gave Alexa a command, I'm not going to look at Alexa to see what's going on with the device itself," he said.
He pointed out that Amazon lets third-party gadgets use Alexa, and those devices might not have that ring.
The test hack
Checkmarx's proof-of-concept used a calculator Skill that functioned like any calculator should. But after it did a math problem, the Echo Dot continued listening for more than a minute until the researcher told Alexa to stop.
During that minute, it captured all of the recorded audio in a transcript and sent it to the developers in a neatly packed display window, writing out word for word what the researcher said.
While only Amazon gets the actual voice recordings, developers who create the Skills can get parts of the transcripts.
"The voice recording doesn't actually go to the hacker, but the transcription is sent to the hacker that developed the skill," Ashbel said. "That would actually let them eavesdrop into your conversation."
Checkmarx said Amazon's fixes made it impossible to repeat the same eavesdropping tactic. Amazon removed the ability to silence reprompts, and also shortened the amount of time Alexa could listen for, Ashbel said.
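Amazon hasn't published its detection logic, but Ashbel's description suggests an obvious check: flag responses that keep the session open while the reprompt has no audible content. A purely hypothetical sketch, not Amazon's actual mitigation:

```python
def looks_like_silent_keepalive(response):
    """Return True for the response pattern Checkmarx abused: the
    session stays open but the reprompt is empty or missing.
    Hypothetical check -- Amazon has not disclosed its real logic."""
    resp = response.get("response", {})
    if resp.get("shouldEndSession", True):
        return False  # session ends normally; nothing to flag
    speech = resp.get("reprompt", {}).get("outputSpeech", {})
    content = speech.get("text") or speech.get("ssml") or ""
    # Strip the SSML wrapper; anything left is audible speech.
    stripped = content.replace("<speak>", "").replace("</speak>", "").strip()
    return stripped == ""  # empty reprompt = silent keep-alive

benign = {"response": {"shouldEndSession": True}}
suspicious = {
    "response": {
        "shouldEndSession": False,
        "reprompt": {"outputSpeech": {"type": "SSML", "ssml": "<speak></speak>"}},
    }
}
print(looks_like_silent_keepalive(benign), looks_like_silent_keepalive(suspicious))
```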
Amazon declined to explain what its fixes are, citing its security practices.