PINs and text messages can be inferred from smart speaker recordings, study shows
Attackers can extract PIN codes and text messages from audio that smart speakers record from up to 1.6 feet away. That’s according to a new study by researchers at the University of Cambridge, which showed that the sounds of taps on a smartphone’s virtual keyboard can be captured by microphones on nearby devices, like Alexa- and Google Assistant-powered speakers.
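To give a rough sense of how an acoustic side channel like this starts, the sketch below (in Python, assuming NumPy is available) locates the short, loud bursts that fingertip taps produce in a recording by comparing short-time energy against the background level. The function name `detect_taps`, its frame size, and its threshold are illustrative assumptions, not the Cambridge team’s actual pipeline, which goes further and classifies which key each tap likely corresponds to.

```python
import numpy as np

def detect_taps(audio, sample_rate, frame_ms=10, threshold_factor=4.0):
    """Return approximate onset times (in seconds) of short, loud bursts,
    such as fingertip taps picked up by a nearby microphone."""
    frame_len = int(sample_rate * frame_ms / 1000)
    n_frames = len(audio) // frame_len
    frames = audio[: n_frames * frame_len].reshape(n_frames, frame_len)

    # Short-time energy of each frame.
    energy = (frames.astype(np.float64) ** 2).mean(axis=1)

    # Flag frames whose energy stands out against the median background level.
    threshold = np.median(energy) * threshold_factor
    is_tap = energy > threshold

    # Keep only the first frame of each burst so one tap yields one onset.
    onsets = np.flatnonzero(is_tap & ~np.roll(is_tap, 1))
    return onsets * frame_ms / 1000.0

if __name__ == "__main__":
    # Synthetic demo: quiet background noise with three brief "tap" bursts.
    sr = 16000
    rng = np.random.default_rng(0)
    signal = rng.normal(0, 0.01, sr * 2)
    for t in (0.3, 0.9, 1.5):
        start = int(t * sr)
        signal[start : start + 200] += rng.normal(0, 0.5, 200)
    print(detect_taps(signal, sr))  # roughly [0.3, 0.9, 1.5]
```

In a real attack, each detected onset would then be fed to a classifier trained on tap acoustics; the sketch only illustrates the first step of isolating tap sounds from ambient audio.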
Amazon Echo, Google Home, and other smart speakers pack microphones that are always on in the sense that they continuously process audio to detect wake phrases like “OK Google” and “Alexa.” These wake-phrase detectors occasionally send audio data to remote servers: studies have found that up to a minute of audio can be uploaded without any wake phrase present, whether by accident or in the absence of privacy controls. Reporting has revealed that accidental activations have exposed contract workers to private conversations, and researchers say such activations could reveal sensitive information like PINs and text messages.