Apple Understands: Siri's "Eavesdropping" on iPhone and Co. Suspended for the Time Being


What happened? At the end of last week, a Guardian report made big waves on the web. Contract workers engaged by Apple reportedly act as ear witnesses to highly private conversations that the iOS voice assistant Siri records and passes on for analysis. According to the Guardian's sources, these recordings even include talk of drug deals and the sounds of sex. Apple does not do this deliberately - employees only stumble upon these highly sensitive recordings by accident. Such conversations are most likely picked up inadvertently by the Apple Watch and Apple's HomePod speaker. What makes it problematic: people and addresses can sometimes be identified from the content of a conversation alone.


And what does Apple say about it? For once, the iPhone maker is not staying silent but trying to appease. According to Apple, only one percent of users' Siri recordings are used to improve the voice assistant, and none of them are linked to Apple IDs or customer accounts. In practice, however, this commendable anonymization is nullified if the content of the conversation itself reveals too much.

Expectations of Apple: act now

Incidentally, Apple is not alone with this problem; such "listening" practices are also known from Google and Amazon. The small but important difference: those two are almost expected to handle data protection rather loosely - but Apple? Apple is held to a different, stricter standard. How else could the Californians live up to their own claims?

Two years ago, GIGA readers were still rather critical of voice assistants:


In my opinion, the real issue is less the analysis of Siri conversations as such than the constraint it imposes on the user. So far, customers can only escape the analysis of their conversations by not using Siri at all, or by installing a special iOS profile (see the instructions at 9to5Mac) - both approaches are ultimately poor compromises.


If we take Apple at its word, improvements have to follow. In other words: if Apple wants to analyze my Siri conversations, I want to be asked first. This could be done, for example, via an opt-in procedure after installing the next iOS update. Only if I expressly agree to the analysis may Apple listen in. Whatever happens on my phone should stay on my phone - thanks, Apple.

Note: The opinions expressed in this article represent the views of the author only and are not necessarily the position of the entire GIGA editorial team.

Sven Kaulfuss
