Amazon’s Alexa is one of the most popular digital assistants available. Most people who use it assume they are interacting only with the assistant itself, but in reality many third parties are involved behind the scenes. For the most part the experience seems harmless, yet researchers at North Carolina State University recently uncovered numerous vulnerabilities that increase the security risks of using the digital assistant.
The security risks are not in Alexa’s own code but rather in the thousands of skills, or programs, that the digital assistant interacts with. These skills range from music players to grocery-ordering apps. Most are built by third-party developers, yet they are connected to millions of homes all over the world. The researchers set out to establish the potential security risks that Alexa users can be exposed to through those skills.
“When people use Alexa to play games or seek information, they often think they’re interacting only with Amazon,” said Anupam Das, a co-author of the study.
More than 100 million Alexa devices have been sold so far, and more than 100,000 Alexa skills are available for download. The research team built an automated program to analyze the metadata of more than 90,000 of those skills. Their first major finding was that each skill displays a developer name that Amazon does not verify, so the listed developer could be anyone, including a malicious hacker.
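The researchers’ actual analysis pipeline is not reproduced here, but a minimal sketch can illustrate the kind of metadata checks involved. The sketch below assumes a hypothetical JSON dump of skill listings with made-up field names (`name`, `developer_name`, `privacy_policy_url`); it simply flags skills that list no privacy policy and shows how an unverified developer name could be claimed by multiple unrelated skills.

```python
# Hypothetical sketch of a skill-metadata audit; the input format and field
# names are assumptions for illustration, not the study's actual tooling.
import json
from collections import defaultdict


def audit_skills(path):
    """Flag skills with no privacy policy and group skills by displayed developer name."""
    with open(path) as f:
        skills = json.load(f)  # assumed: a list of skill-metadata dictionaries

    # Skills that do not link to any privacy policy.
    missing_policy = [s["name"] for s in skills if not s.get("privacy_policy_url")]

    # The displayed developer name is free text and unverified, so two unrelated
    # skills can claim the same (or a well-known) developer name.
    by_developer = defaultdict(list)
    for s in skills:
        by_developer[s.get("developer_name", "").strip().lower()].append(s["name"])

    return missing_policy, by_developer


if __name__ == "__main__":
    missing, by_dev = audit_skills("skills_metadata.json")
    print(f"{len(missing)} skills list no privacy policy")
    for dev, names in by_dev.items():
        if len(names) > 1:
            print(f"developer name '{dev}' is claimed by {len(names)} skills")
```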
How do the skills pose a security threat to users?
The discovery concerned the researchers because it exposes Alexa users to the risk of having their private data stolen and used for malicious purposes. Some skills request payment details such as credit card or banking information, or even email addresses, any of which could end up in the wrong hands.
Amazon has some security measures in place, but the analysis revealed that roughly 23.3 percent of the skills examined lack adequate privacy policies to protect users. The research team also recommended measures Amazon could implement to improve security, including validating developer identities and adding visual and audio cues that let users know a skill comes from a third party.