Security researchers created a ‘skill’ that allows Alexa to spy on you

April 26, 2018

In news that will confirm your worst fears about a device with an always-on microphone in your home, security researchers have created a “skill” for Amazon’s popular voice assistant Alexa that allows the device to indefinitely eavesdrop on your conversations.

The vulnerability, which Amazon has since patched, was discovered by cybersecurity company Checkmarx. Experts at the firm were able to create a “skill”—Amazon’s term for an application for Alexa—that could secretly record a victim and transcribe entire conversations caught on mic.

The security researchers hid the malicious task in a seemingly innocuous calculator skill that could be used to solve math problems. Unbeknownst to any victim who installed the skill, asking Alexa to use the app would enable the attack.

While Alexa is designed to be listening at all times to pick up on any commands the user may wish it complete, the cycle for it to record is supposed to be short and sweet—it’s only supposed to communicate with Amazon servers to process commands after it hears its wake word, which is usually “Alexa.” After Alexa reads back information in response to a given prompt, it is supposed to either end the session or ask the user for another command, briefly keeping the session open.

When a user opens up a session with the calculator app, its code creates a second session but doesn’t provide a vocal prompt from Alexa to inform the user the microphone is still active. That keeps Alexa listening and recording the user’s conversations long after communication with the smart speaker has ceased.

With the session still open, the device is instructed by the skill to continue to transcribe any conversation it picks up. That information is collected, recorded, and made searchable for the makers of the skill.

The attack, which simply requires a victim to download and install the skill on their Alexa device, suffers from one pretty significant giveaway: The blue light on the Echo or Dot remains active and illuminated, signifying that Alexa is still listening. It’s possible that a victim won’t notice or won’t think anything of it, but it could raise suspicions for users.

“Customer trust is important to us and we take security and privacy seriously,” a spokesperson for Amazon told Gizmodo. “We have put mitigations in place for detecting this type of skill behavior and reject or suppress those skills when we do.”

Gizmodo has the full article

Sponsored Recommendations

ASK THE EXPERT: ServiceNow’s Erin Smithouser on what C-suite healthcare executives need to know about artificial intelligence

Generative artificial intelligence, also known as GenAI, learns from vast amounts of existing data and large language models to help healthcare organizations improve hospital ...

TEST: Ask the Expert: Is Your Patients' Understanding Putting You at Risk?

Effective health literacy in healthcare is essential for ensuring informed consent, reducing medical malpractice risks, and enhancing patient-provider communication. Unfortunately...

From Strategy to Action: The Power of Enterprise Value-Based Care

Ever wonder why your meticulously planned value-based care model hasn't moved beyond the concept stage? You're not alone! Transition from theory to practice with enterprise value...

State of the Market: Transforming Healthcare; Strategies for Building a Resilient and Adaptive Workforce

The U.S. healthcare system is facing critical challenges, including workforce shortages, high turnover, and regulatory pressures. This guide highlights the vital role of technology...