Apple has agreed to pay US$95 million ($153 million) to settle a proposed class action lawsuit claiming that its voice-activated Siri assistant violated user privacy.
The proposed settlement, filed in an Oakland, California court, would resolve a five-year-old lawsuit centred on allegations that, for more than a decade, Apple surreptitiously activated Siri to record conversations through iPhones and other devices equipped with the virtual assistant.
The alleged recordings occurred even when people hadn’t tried to activate the virtual assistant with the trigger phrase “Hey Siri”.
The lawsuit asserted that some of the recorded conversations were then shared with advertisers seeking to pitch products to consumers more likely to be interested in those goods and services.
The allegations about a snoopy Siri appeared to contradict Apple’s long-standing commitment to protecting its customers’ privacy.
CEO Tim Cook has often framed that commitment as a fight to preserve “a fundamental human right”.
Apple claims no wrongdoing
Apple isn’t acknowledging any wrongdoing in the settlement, which still must be approved by US district judge Jeffrey White. Lawyers in the case have proposed scheduling a 14 February court hearing in Oakland to review the terms.
If the settlement is approved, tens of millions of US-based consumers who owned iPhones and other Apple devices from 17 September 2014 through to the end of last year could file claims.
Each consumer could receive up to US$20 ($32) per Siri-equipped device covered by the settlement, although the payment could be reduced or increased, depending on the volume of claims.
Only 3 to 5 per cent of eligible consumers are expected to file claims, according to estimates in court documents.
Eligible consumers will be limited to seeking compensation on a maximum of five devices.
The settlement represents a sliver of the US$705 billion ($1.134 trillion) in profits that Apple has pocketed since September 2014.
How to protect your privacy
Whether built into standalone smart speakers, computers, tablets, or phones, voice assistants rely on listening for specific sound patterns, such as a “wake word” like “Alexa” or “OK Google”, to activate and record.
However, they can sometimes misinterpret sounds and begin recording unexpectedly.
These recordings are often transmitted to the manufacturer’s servers, so you should take measures to protect both your interactions with the voice assistant and your private conversations with others.
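To make that mechanism concrete, here is a minimal, purely illustrative Python sketch of the wake-word gating pattern such assistants rely on. Everything in it is an assumption for illustration: score_wake_word stands in for a vendor’s on-device keyword-spotting model, and the threshold value is invented; this is not Apple’s or any vendor’s actual code. The point it shows is the failure mode at the heart of the lawsuit: if ordinary background speech happens to score above the threshold, recording starts, and the captured audio may leave the device, without the user ever saying the wake word.

import collections

WAKE_THRESHOLD = 0.8  # invented confidence cut-off for illustration

def score_wake_word(audio_chunk):
    # Stands in for a real on-device keyword-spotting model
    # (hypothetical; stubbed so this sketch runs).
    return 0.0

def listen(mic_chunks):
    # Keep a short rolling buffer of audio from just before the trigger,
    # as assistants do, so the start of the request isn't clipped.
    buffer = collections.deque(maxlen=50)
    recording = False
    captured = []
    for chunk in mic_chunks:
        buffer.append(chunk)
        if not recording and score_wake_word(chunk) >= WAKE_THRESHOLD:
            # A misclassified chunk here is an accidental activation:
            # from this point on, audio is captured and could be uploaded.
            recording = True
            captured.extend(buffer)
        elif recording:
            captured.append(chunk)
    return captured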
Here are some practical steps you can take to protect your privacy:
Disable microphone access
Go to your phone’s settings, review which apps have access to your microphone, and revoke access for any that don’t need it.
If you’re worried about how often Siri is listening to and recording you without your consent, turn off Apple’s virtual helper by following these steps:
Navigate to Settings > Siri & Search.
Toggle off both Listen for ‘Hey Siri’ and Press Side Button for Siri.
Tap Turn Off Siri when a pop-up window appears.
See if recordings are being stored
Regardless of the type of voice assistant you use, you should be able to check whether your recordings are stored permanently by default. You should also be able to choose how long recordings are kept, or have them deleted automatically.
Check the privacy policy
Check the privacy policy for your voice assistant to understand how your audio recordings are handled and who can listen to them.
You may be able to change your settings to opt out of human review of recordings.