Researchers use artificial intelligence and audio to create a terrifying Mac attack

Researchers have discovered a terrifying exploit that lets attackers learn what you type on your Mac based on the sound of your keystrokes.

Aug 9, 2023 - 14:02

A British research team at Durham University has identified an exploit that allows attackers to work out what you're typing on a MacBook Pro, based on the sound each keystroke makes.

Such attacks are not particularly new. The researchers found work dating back to the 1950s on the use of acoustics to recognize what people write. They also note that the first document detailing the use of such an attack surface was written for the US National Security Agency (NSA) in 1972, prompting speculation that such attacks may already exist.

"The state origin of (AS-CAs) raises speculation that such an attack may already be possible with today's devices, but remains a mystery," the researchers wrote.

And if the US and UK governments have investigated such matters, there is little doubt that other governments have done the same.

How the attack works

As Bleeping Computer reports, British security researchers have figured out how to recognize what you type with up to 95 percent accuracy. The attack, which uses a combination of audio recording and artificial intelligence, is not limited to Macs.

The attack is explained in more detail here, but it isn't entirely straightforward. The attacker must first calibrate against the sound of the target's keyboard to train the AI, learning the specific sound of each keystroke. That calibration can even be done over a Zoom call, so long as other meeting participants can hear your Mac's keyboard.

The study explains that because the attack algorithm matches each sound to a key, it can capture the text you type. "Researchers collected training data by pressing 36 keys 25 times each on a modern MacBook Pro and recording the sound produced by each press," the paper explains.
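To make the mechanism concrete, here is a minimal sketch of how an acoustic keystroke classifier of this general kind could be built. It is not the researchers' code, and the paper's own pipeline is more sophisticated; the sketch assumes a hypothetical folder of per-keystroke WAV clips (clips/<key>/<n>.wav, 36 keys pressed 25 times each) plus the librosa and PyTorch libraries, converts each clip to a log-mel spectrogram, and trains a tiny CNN to label it.

    # Illustrative sketch only, not the researchers' code: train a small CNN to
    # label keystroke sounds. Assumes a hypothetical dataset of one short WAV
    # clip per keystroke, laid out as clips/<key>/<n>.wav (36 keys x 25 presses).
    import glob
    import os

    import librosa
    import numpy as np
    import torch
    import torch.nn as nn

    KEYS = sorted(d for d in os.listdir("clips")
                  if os.path.isdir(os.path.join("clips", d)))
    KEY_TO_IDX = {k: i for i, k in enumerate(KEYS)}

    def clip_to_melspec(path, sr=44100, n_mels=64, frames=64):
        """Load one keystroke clip as a fixed-size log-mel spectrogram."""
        samples, _ = librosa.load(path, sr=sr)
        m = librosa.feature.melspectrogram(y=samples, sr=sr, n_mels=n_mels)
        m = librosa.power_to_db(m, ref=np.max)
        return librosa.util.fix_length(m, size=frames, axis=1).astype(np.float32)

    paths = sorted(glob.glob("clips/*/*.wav"))
    X = np.stack([clip_to_melspec(p) for p in paths])                  # (N, 64, 64)
    y = np.array([KEY_TO_IDX[p.split(os.sep)[1]] for p in paths], dtype=np.int64)

    # Tiny CNN classifier over the spectrograms: two conv blocks, then a linear head.
    model = nn.Sequential(
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(), nn.Linear(32 * 16 * 16, len(KEYS)),
    )
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    xb = torch.from_numpy(X).unsqueeze(1)    # add a channel dim: (N, 1, 64, 64)
    yb = torch.from_numpy(y)
    for epoch in range(20):                  # full-batch training; ~900 clips is tiny
        opt.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        opt.step()
        print(f"epoch {epoch}: loss {loss.item():.3f}")

Even a toy pipeline like this hints at why a clean recording of a known keyboard is so dangerous: with only a few hundred labelled presses, per-key sounds are distinctive enough for a classifier to learn.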

What does it mean?

Simply put, the nature of these attacks means that if someone can get close enough to your computer to gather that training data, or can find another way to listen to and recognize the sound of your keyboard as you type, they can use artificial intelligence to monitor what you do. They just need to know how to listen.

The listening microphone could be the one you leave active in Zoom, a compromised smartphone microphone, or any app with microphone access that you granted after skimming a privacy agreement. It could even be an old-fashioned bugging device planted nearby. Once the audio is captured, a deep learning algorithm can give attackers access to sensitive data, passwords, and more.

What's next?

In terms of exploits, this is also a good example of how AI can be used in new ways to undermine existing security protections. The problem will only grow as the cost of quantum computers falls, since these machines can process certain kinds of data far faster than today's computers. In theory, quantum computers could crack the encryption keys the Internet relies on within hours, making traditional encryption relatively easy to break.

The researchers suggest that future work could include using smart speakers to classify keystrokes (what I call Siri Sleuthing) or adding generative-AI-style language models to improve keystroke recognition. Acoustic attacks of this nature are also becoming much easier to carry out, because so many devices now have built-in microphones and AI research continues to advance. Apple even has a patent for Siri lip reading. A privacy-first approach is clearly what's needed, but the will to deliver it seems to be lacking in some important areas.

What can you do?

There are some mitigations that can help combat such attacks. Randomized passwords with mixed case and liberal use of the Shift key help, and touch typing also reduces recognition accuracy, presumably because touch typists keep a relatively steady cadence. The researchers recommend changing your typing style to confuse the algorithm. Other defenses include white noise, software-based keyboard-sound filters, or software that plays random keystroke sounds to throw off the algorithm.
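For the sound-masking idea, here is a minimal sketch, assuming the third-party numpy and sounddevice Python packages, of a script that plays quiet white noise near the keyboard while you type. How well this defeats a trained model, and whether fake keystroke sounds would work better, is untested; treat it as an illustration of the mitigation rather than a guaranteed defense.

    # Illustrative sketch of the masking idea: play quiet white noise near the
    # keyboard to muddy keystroke acoustics. Assumes the third-party numpy and
    # sounddevice packages; the level and its real effectiveness are untested.
    import numpy as np
    import sounddevice as sd

    SAMPLE_RATE = 44100
    LEVEL = 0.05  # keep the masking noise quiet; tune to taste

    def callback(outdata, frames, time, status):
        """Fill each output block with low-amplitude white noise."""
        outdata[:] = np.random.uniform(-LEVEL, LEVEL,
                                       size=(frames, 1)).astype(np.float32)

    with sd.OutputStream(samplerate=SAMPLE_RATE, channels=1, callback=callback):
        print("Playing masking noise; press Ctrl+C to stop.")
        try:
            sd.sleep(10 * 60 * 1000)  # run for 10 minutes, or interrupt earlier
        except KeyboardInterrupt:
            pass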

Switching between different keyboards on a Mac can also help, while biometric authentication, password managers, and passkeys can limit the information available to attackers in the first place.

It's also a good idea to regularly check which apps on all your devices have requested access to the microphone. You never know who's listening.
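On a Mac, the supported place to review this is System Settings > Privacy & Security > Microphone. For the technically curious, here is a rough sketch that reads the undocumented per-user TCC database to list which apps have requested microphone access; the database path, table, and column names are assumptions that vary across macOS versions, and the terminal running it needs Full Disk Access.

    # Illustrative sketch for macOS: list apps that have asked for microphone
    # access by reading the per-user TCC database. The path, the 'access' table
    # and the 'auth_value' column are assumptions that vary between macOS
    # versions, and the terminal running this needs Full Disk Access.
    # System Settings > Privacy & Security > Microphone is the supported way.
    import os
    import sqlite3

    TCC_DB = os.path.expanduser(
        "~/Library/Application Support/com.apple.TCC/TCC.db")

    with sqlite3.connect(TCC_DB) as db:
        rows = db.execute(
            "SELECT client, auth_value FROM access "
            "WHERE service = 'kTCCServiceMicrophone'").fetchall()

    for client, auth_value in rows:
        # auth_value semantics differ by macOS version; non-zero generally means granted
        print(f"{client}: {'granted' if auth_value else 'denied'}")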
