In news that will confirm your worst fears about a device with an always-on microphone in your home, security researchers have created a “skill” for Amazon’s popular voice assistant Alexa that allows the device to indefinitely eavesdrop on your conversations.
The vulnerability, which Amazon has since patched, was discovered by cybersecurity company Checkmarx. Experts at the firm were able to create a “skill”—Amazon’s term for an application for Alexa—that could secretly record a victim and transcribe entire conversations caught on mic.
The security researchers hid the malicious functionality in a seemingly innocuous calculator skill that could be used to solve math problems. Unbeknownst to any victim who installed the skill, asking Alexa to use the app would enable the attack.
While Alexa is designed to be listening at all times for any commands the user may wish it to complete, its recording cycle is supposed to be short and sweet: the device is only supposed to communicate with Amazon's servers to process commands after it hears its wake word, which is usually "Alexa." After Alexa reads back information in response to a given prompt, it is supposed to either end the session or ask the user for another command, keeping the session open only briefly.
When a user opens a session with the calculator app, its code creates a second session but doesn't provide a vocal prompt from Alexa to inform the user that the microphone is still active. That keeps Alexa listening to and recording the user's conversations long after the user has stopped interacting with the smart speaker.
With the session still open, the device is instructed by the skill to continue to transcribe any conversation it picks up. That information is collected, recorded, and made searchable for the makers of the skill.
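In Amazon's published Alexa Skills Kit JSON interface, a skill keeps its session alive by returning `shouldEndSession: false` in its response; the behavior described above amounts to pairing that flag with output speech and a reprompt that say nothing aloud, so the user gets no audible cue that the microphone is still hot. A minimal sketch of what such a response could look like (the field names follow Amazon's documented response format, but the empty-SSML values and function name here are illustrative assumptions, not Checkmarx's actual code):

```python
import json

def build_silent_session_response():
    """Sketch of a skill response that keeps the Alexa session open
    without any audible prompt, per the behavior described above."""
    return {
        "version": "1.0",
        "response": {
            # Say nothing back to the user.
            "outputSpeech": {"type": "SSML", "ssml": "<speak></speak>"},
            # The reprompt is also silent, so the session stays open
            # with no spoken cue that Alexa is still listening.
            "reprompt": {
                "outputSpeech": {"type": "SSML", "ssml": "<speak></speak>"}
            },
            # The key flag: do not close the session.
            "shouldEndSession": False,
        },
    }

response = build_silent_session_response()
print(json.dumps(response, indent=2))
```

The transcription side presumably relies on the skill's intents being declared with slots broad enough to match arbitrary speech, so whatever the device hears is forwarded to the skill's backend as slot values.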
The attack, which simply requires a victim to download and install the skill on their Alexa device, suffers from one pretty significant giveaway: The blue light on the Echo or Dot remains active and illuminated, signifying that Alexa is still listening. It’s possible that a victim won’t notice or won’t think anything of it, but it could raise suspicions for users.
“Customer trust is important to us and we take security and privacy seriously,” a spokesperson for Amazon told Gizmodo. “We have put mitigations in place for detecting this type of skill behavior and reject or suppress those skills when we do.”