Apple issued a rare apology after a report revealed contractors regularly listened to private Siri recordings
The hardest words to say are "I'm sorry," but Apple (surprisingly) had no problem saying them after a whistleblower revealed that strangers were regularly listening in on your private conversations. Apple commendably went a step further and actually fixed the issue that makes it feel like your phone is eavesdropping on you.
The unnamed whistleblower told The Guardian that recordings of Siri interactions are reviewed by humans as part of a quality-control process called "grading." The purpose was to help Apple improve Siri, but it ended up feeling like one huge privacy violation.
It turns out, Apple's voice assistant could be triggered accidentally, even by muffled background noises or zippers. Once triggered, Siri made audio recordings, some of which included personal discussions about medical information, business deals, and even people having sex. The percentage of people yelling out, "Hey Siri!" while getting it on is probably very small.
Apple insisted that these recordings weren't linked to data that could identify you, but they were accompanied by user data showing location, contact details, and app data. So, yeah, they could actually identify you.
To make things worse, the recordings were listened to by third-party contractors, not Apple employees. "[T]here's a high turnover. It's not like people are being encouraged to have consideration for people's privacy, or even consider it. If there were someone with nefarious intentions, it wouldn't be hard to identify [people on the recordings]," the whistleblower told The Guardian.
Apple did the right thing and apologized for the practice. "We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process. We realize we haven't been fully living up to our high ideals, and for that we apologize," Apple said in a post.
Not only that, Apple changed its policy to address the concerns raised in The Guardian's report. Apple will no longer retain audio recordings of Siri interactions by default. If you want to share your audio with Apple so it can improve Siri, you have to specifically opt in. Apple will also stop using third-party contractors to listen to the recordings: quality control will be left to Apple employees, who will review computer-generated transcripts instead of audio. Any recordings resulting from accidental triggers will be deleted.
Technology has made our lives easier, but it's also ushered in a whole slew of privacy concerns. It's hard not to feel like your phone is your own personal telescreen from "1984," but worse because at least telescreens didn't have addictive Snapchat filters. Why should privacy be the trade-off just because we want the convenience of being able to say, "Hey Siri, what's the difference between a dolphin and a whale?" It's nice to have the peace of mind that we can make robots do our bidding without feeling like they're spying on us – at least when it comes to our iPhones.