A group of students from Berkeley has demonstrated how malicious commands for Siri, Google Assistant, and Alexa can be hidden in recorded music or innocuous-sounding speech.

Simply playing the tracks over the radio, in a streaming music track, or in a podcast could allow attackers to take control of a smart home …


Ben Lovejoy
