How tech companies deceive you into giving up your data and privacy | Finn Lützow-Holm Myrstad
The Dangers of Internet-Connected Toys
In this talk, Finn Myrstad discusses the dangers of internet-connected toys and how they can compromise children's privacy. He introduces Cayla, a popular toy that uses speech recognition technology to answer children's questions and respond like a friend. However, the company behind Cayla was harvesting personal information while families were innocently chatting away in their homes.
Introduction to Cayla
- Finn Myrstad introduces Cayla as an internet-connected toy that uses speech recognition technology to answer children's questions.
- He highlights the innocence of having a favorite toy as a child.
- However, he warns that the power lies with the companies behind these toys, which harvest masses of personal information.
Investigating Internet-Connected Devices
- Finn Myrstad explains his job is to protect consumers' rights in his country.
- He notes that billions of devices, such as cars, energy meters, and even vacuum cleaners, were expected to come online by 2020.
- His team decided to investigate further: what was Cayla doing with all the interesting things she was learning?
Privacy Concerns with Cayla
- In order to play with Cayla, you need to download an app to access all her features.
- Parents must consent to terms being changed without notice.
- Recordings of the child, her friends and family can be used for targeted advertising.
- All this information can be shared with unnamed third parties.
Security Flaws in Internet Connected Toys
- Anyone with a smartphone can connect to Cayla within a certain distance.
- The company that made and programmed Cayla issued a series of statements claiming that one had to be an IT expert to breach her security.
- Finn Myrstad live hacks Cayla to show how easy it is to connect to her without entering a password or bypassing any other security measure.
Consequences and Solutions
- After his organization published its report, Cayla was banned in Germany and taken off the shelves by Amazon and Walmart; she is now peacefully resting at the German Spy Museum in Berlin.
- There are few rules to protect us and the ones we have are not being properly enforced.
- We need to get the security and privacy of these devices right before they enter the market.
The Illusion of Informed Consent
The speaker talks about how people agree to terms and conditions without actually reading them, leading to a power imbalance where companies can gather and use personal information on an unprecedented scale.
People Don't Read Terms and Conditions
- People tick the box saying they have read the terms and conditions, but have they really?
- It's unrealistic to expect consumers to read the lengthy terms of popular apps.
- The speaker and his colleagues printed out more than 900 pages of app terms and conditions, which took them over 31 hours to read aloud.
- Achieving informed consent is close to impossible.
Companies Should Provide More Understandable Terms
- Putting the burden of responsibility on consumers is unfair.
- Companies should provide fewer take-it-or-leave-it options and more understandable terms before asking for consent.
Investigating Dating Apps
The speaker discusses how dating apps are worth billions of dollars but also raise concerns about privacy violations.
Popular Dating Apps Raise Privacy Concerns
- The dating app industry is worth billions of dollars annually.
- Users share intimate details with their partners, but who else has access to that information?
Investigating Privacy Violations in Dating Apps
- The speaker downloaded a popular dating app himself to investigate privacy violations.
- A pre-ticked box gave the dating company access to all of his pictures on Facebook, including some that were quite personal.
- Reading the terms and conditions revealed that all content posted as part of the service automatically grants the company an irrevocable, perpetual license to use it for any purpose.
- This could lead to financial loss, embarrassment, and other negative consequences.
The Dangers of Data Misuse
In this section, the speaker discusses how companies can manipulate our emotions and discriminate against us by analyzing our data.
Subconscious Manipulation
- Companies can analyze your emotions based on your photos and chats.
- They use this information to target you with ads when you are at your most vulnerable.
- A fitness app could sell your data to a health insurance company, preventing you from getting coverage in the future.
Not All Uses of Data Are Malign
In this section, the speaker acknowledges that not all uses of data are bad and highlights some positive changes that have been made.
- Some uses of data are flawed or need more work.
- Dating companies changed their policies globally after a legal complaint was filed.
Consumers Can't Fix This Alone
In this section, the speaker explains why consumers cannot fix data misuse on their own.
- Organizations fighting for consumer rights cannot be everywhere.
- If we know that something innocent we said will come back to haunt us, we will stop speaking.
- If we know that we are being watched and monitored, we will change our behavior.
We Have Lost Control of Our Lives
In this section, the speaker emphasizes the importance of controlling who has access to our data and how it is used.
- If we can't control who has our data and how it is being used, we have lost control of our lives.
- The stories shared in this talk are not random examples but rather signs that things need to change.
How Can We Achieve Change?
In this section, the speaker suggests ways in which change can be achieved regarding data privacy and security.
- Companies need to prioritize privacy and security to build trust and loyalty with their users.
- Governments must create a safer internet by ensuring enforcement and up-to-date rules.
- Citizens can use their voice to remind the world that technology can only truly benefit society if it respects basic rights.
Conclusion
- The speaker closes the talk by thanking the audience for listening.