Household robots to speakers: Here are the privacy questions to keep in mind

By Avi Asher-Schapiro | @AASchapiro | Thomson Reuters Foundation
Wednesday, 29 September 2021 09:30 GMT

Prompts on how to use Amazon's Alexa personal assistant are seen in an Amazon ‘experience centre’ in Vallejo, California, U.S., May 8, 2018. Picture taken May 8, 2018. REUTERS/Elijah Nouvelage


Consumers buying gifts like smart speakers and wearable health monitors should keep in mind how, and when, the devices collect data, says digital rights expert Alexis Hancock

Sept 29 (Thomson Reuters Foundation) – Amazon.com Inc this week announced a household, canine-like robot called Astro and a deal with Walt Disney Co to embed its voice-controlled technology in resort hotels, striving to make its virtual aide Alexa a bigger part of consumers' lives.

From wearable health monitors to smart speakers and in-home security systems, consumer electronics that collect and share data are increasingly common and affordable.

The Thomson Reuters Foundation asked Alexis Hancock, a digital rights expert at the Electronic Frontier Foundation, what privacy risks consumers should keep in mind when considering these kinds of technologies.

What should consumers keep in mind when buying gadgets that collect personal data?

I would suggest thinking about the exact need you are trying to fulfill: how exactly is this going to make your life better? Sometimes it doesn't require an internet-connected device.

Not every tool needs to come with an app; you can get cool stuff without a privacy risk.

But once you connect things to the internet, you need to worry about all sorts of things: secure passwords, encryption, and of course the privacy practices of the companies and what they do with the data they're collecting.

We know that our data is being collected all the time; shouldn't we just let it go?

So, we try to combat a lot of this kind of nihilism at the Electronic Frontier Foundation; people may think: "They already collected so much data, I should just accept it."

What we say is that pushing back can be good and consumers can have an impact.

In 2020, for example, after consumer outcry over the undisclosed microphone built into Google's Nest home security system, the company changed course... and now there's a physical kill switch on the mic.

Nest is one of many home security camera systems for sale - what privacy risks, if any, should people consider?

Again, I would suggest people think: what problem are you trying to solve? Do you really need a company-affiliated camera and doorbell device in your home?

With a lot of these products they sell themselves as being about safety and security, but many of the business practices we've seen come out are concerning - especially the amount of data sharing with police or other entities.

So people should ask themselves: are you buying something that's going to keep you safe, or are you buying another layer of surveillance for your loved ones?

How does that thinking apply to in-home smart speakers and assistant devices?

If you are worried about companies sharing or capturing your data, I would research what privacy measures are built into the devices - do they allow you to cut off the mic, and can you disconnect from the internet?

If they don't function without the internet, is your home internet safe? Remember: the more devices you add in your home, the more routes there are for your data to be shared.

Also, something to consider for voice-activated devices: the artificial intelligence isn't always built for everyone, and they might not recognize certain kinds of accents or voice types.

And what about wearable devices like watches that monitor your steps or heartbeat?

There are valid reasons to buy these kinds of tools. I enjoy products that help me understand myself better, but I don't enjoy sending that data off to endpoints of the internet that I don't know about.

And if these devices use AI to make recommendations about my sleep, or tone of voice, they should be transparent about the models they are using - I want to know how they are arriving at these recommendations.

Related Links:

Who owns your data? It's complicated

Data of the dead: Virtual immortality exposes holes in privacy laws

Britons risk having data 'sold to highest bidder' after Brexit, whistleblower warns

This article was updated on Wednesday, 29 Sept 2021 to include news of Amazon's Astro robot.

(Reporting by Avi Asher-Schapiro @AASchapiro, Editing by Tom Finn and Zoe Tabary. Please credit the Thomson Reuters Foundation, the charitable arm of Thomson Reuters, that covers the lives of people around the world who struggle to live freely or fairly. Visit http://news.trust.org)
