
Published on July 31st, 2019


Apple’s Siri is listening to you during your sexual encounters



Technology companies are engaging in increasingly invasive practices, compromising users’ privacy as never before. According to information security services specialists, Apple and its contractors have access to sensitive user information, such as medical details, drug purchases, and even recordings of users’ sexual encounters, through the “quality control” process for the company’s voice assistant, Siri.

A small portion of the interactions between Siri and users is sent to Apple contractors operating virtually worldwide. The job of these companies is to evaluate, according to a set of criteria, the responses that the voice assistant provides to the user. Unsurprisingly, Apple does not clearly disclose this practice to consumers.

The company only mentions that some data is collected to make constant improvements to Siri, although information security services specialists consider that the company commits a clear privacy breach by not explicitly mentioning that this work is done by humans listening to user recordings.

When questioned about this practice, the company stated: “A tiny portion of the requests and queries that Siri receives are subject to rigorous analysis to improve the user experience. These recordings are in no way associated with users’ IP addresses or physical locations. In addition, the analysis is performed in secure facilities in accordance with Apple’s data confidentiality measures.” It is estimated that about 1% of the daily requests received by the voice assistant are subject to this procedure.

An anonymous informant working at one of Apple’s contracting companies mentioned that users and organizations should remain aware of the intrusive capabilities of tools such as Siri: “Considering how often voice assistants are accidentally activated, these companies could find user recordings made in private circumstances,” the anonymous source said.

“Siri can be ‘awakened’ even without clearly hearing the activation phrase (‘Hey Siri’). Even an Apple Watch with Siri can start recording if it detects any movement or hears random words,” the informant added.

Although Siri is included in most Apple products, Apple’s main data sources are the Apple Watch, the iPhone, and the HomePod smart speaker. According to specialists in information security services, the Apple Watch is the device with the most unauthorized activations of the voice assistant.

Finally, the informant mentioned that, at Apple’s express request, Siri’s accidental activations must be reported as technical errors: “There is no special procedure for dealing with accidental voice assistant activations, so in the end, the company is not really striving to reduce these errors in the service.”

According to information security services specialists from the International Institute of Cyber Security (IICS), when a user asks Siri about access to their voice recordings, the assistant only replies, “I only listen when you’re talking to me.” That is absolutely false, as it has been shown that users’ voice recordings are also stored by Apple even when the voice assistant is activated by mistake.
