Apple has reportedly been running a “grading” program in which contractors listened to Siri voice recordings globally. After several reports criticised the company, it suspended the project and promised to make the system more transparent, allowing users to opt out of grading. For those unaware, the Cupertino-based tech giant had hired contractors who were found to be listening to private conversations via Siri. In a new development, the company has fired 300 contractors in Cork, Ireland, who were involved in listening to more than 1,000 Siri conversations.
The grading process raised deep concerns over user privacy, as personal data and conversation details were being handled in ways users had not agreed to. As mentioned earlier, this is what led the company to suspend the grading project. According to a report by Engadget, many contractors throughout Europe may have been let go, and some are upset that Apple dismissed so many people with just one week’s notice. Even those who were concerned about the program’s ethical implications expressed their frustration.
The staff had been on paid leave for the past month. The Guardian reported, “The staff had been on paid leave since 2 August, the day Apple announced its decision to suspend the program, referred to as “grading”, as it conducted a thorough review”.
Apart from that, the tech giant has also made a few changes to its privacy policy and apologised for the Siri grading program. In a press release, it said:
As a result of our review, we realise we haven’t been fully living up to our high ideals, and for that we apologise. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes:
- First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
- Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
- Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.
Apple is not the only company to have employed human review for its voice assistant; Google, Amazon, Facebook and Microsoft have also been found listening to recordings from their digital assistants. Apple, however, has now suspended both the default retention of recordings and the subsequent grading process, and has promised not to retain or review users’ audio unless they opt in. This is a welcome move that could help the company win back user trust.