What is the Cayla toy? Describe Cayla and identify the privacy issues that existed with this toy and ultimately led to it being banned in most countries. How does Cayla differ from popular devices like Echo and Siri?

Myrstad says that even if people do not purchase and keep flawed devices like Cayla in their homes, they are still not safe. What does he mean by this?
Why does Myrstad say that we all may have been lured into a false sense of security in regard to the many apps on our phones? How many apps do you have on your phone and of these apps, how many have you carefully and closely read the terms for? Explain.
Describe the experiment that Myrstad and his colleagues embarked on regarding phone apps and the terms associated with them. Why did Myrstad and his colleagues do this experiment? What did they find out?
Myrstad claims that “achieving informed consent is close to impossible,” and he asks the question, “do you think it's fair to put the burden of responsibility on the consumer?” Myrstad says that he does not think this is fair—share your thoughts and opinion on the matter.
What privacy concerns or security concerns do you have based on what Myrstad described that he found about a popular dating app? Identify specific aspects of the terms and conditions that Myrstad notes that you find troubling and explain why. Would any of this keep you from joining a dating app in the future? Explain.
Identify and explain the commercial practices that Myrstad describes regarding financial loss, subconscious manipulation, and discrimination. Do any of these practices shock you or were you aware that these types of practices are happening, possibly to you? Explain.
Towards the end of his TED Talk, Myrstad says, “Well, companies need to realize that by prioritizing privacy and security, they can build trust and loyalty to their users.” Based on what you learned in this unit, what would you suggest to a company like the dating app Myrstad examined, or to other businesses, to improve their privacy and security and build trust and loyalty with their users?

The Cayla toy is an internet-connected talking doll that uses voice recognition to hold interactive conversations with children: it can answer questions, play games, and chat. Cayla's privacy problems stem from the fact that it is vulnerable to hacking. The doll pairs over Bluetooth without any authentication, so a stranger within range can connect to it remotely and use it as a listening device, potentially collecting sensitive information from children. These concerns about unauthorized surveillance of children led to Cayla being banned in a number of countries; Germany, for example, classified it as a concealed surveillance device.

Cayla differs from popular devices like Echo and Siri in a few ways. First, Cayla is designed specifically for children, whereas Echo and Siri are general-purpose virtual assistants. Second, Cayla relies on a local Bluetooth link to a nearby phone, and that link was left unsecured, while Echo and Siri send voice commands to the vendor's cloud over an authenticated connection. Even so, all of these devices raise privacy concerns, since they also collect and store user data.

When Myrstad says that even if people do not purchase and keep flawed devices like Cayla in their homes, they are still not safe, he means that the problem is bigger than any single product. Insecure devices in a friend's home, a classroom, or a public space can still pick up your conversations, and many other smart devices and apps have similar vulnerabilities. The risk therefore extends to the entire ecosystem of connected devices, whether or not you own one yourself.

Myrstad says we may have been lured into a false sense of security regarding apps on our phones because many people do not read or fully understand the terms and conditions associated with these apps. They often contain complex language and legal terms that make it difficult for users to know how their data is being collected, used, and shared. As for me, I have numerous apps on my phone, but I must admit that I haven't carefully read the terms and conditions for most of them. It's mainly because they are lengthy and filled with jargon, making it time-consuming and challenging to comprehend all the implications.

Myrstad and his colleagues at the Norwegian Consumer Council embarked on an experiment with the apps found on an average phone: they printed out the apps' terms and conditions and read them aloud, live, from start to finish. The reading took them more than 31 hours. They did this to show how unrealistic it is to expect consumers to actually read everything they agree to. Along the way, they found that many apps demand permissions and data that are not directly relevant to the app's functionality, leaving users vulnerable to privacy breaches.

Myrstad argues that achieving informed consent is close to impossible because the terms and conditions presented by companies are often lengthy, complex, and filled with legal jargon. These factors make it difficult for consumers to fully understand the implications of sharing their data. He believes the responsibility should not rest solely on the consumer; instead, companies should make their policies more transparent and user-friendly. I agree with him: no one can meaningfully consent to terms that would take hours to read and a law degree to understand.

Based on Myrstad's findings about a popular dating app, the privacy concerns arise from the excessive data collection and sharing practices stated in its terms and conditions. For instance, the app collects detailed information about users' sexual preferences and shares it with marketing partners. This kind of intrusive data collection can expose users to discrimination or misuse by third parties. Equally troubling is that the app reserves the right to change its terms without notice, so what a user agreed to today may not be what governs their data tomorrow. These concerns would make me wary of joining such an app, though others may weigh the trade-off differently.

Myrstad describes commercial practices such as financial loss, subconscious manipulation, and discrimination. Financial loss refers to hidden costs or subscription fees that users may unknowingly incur through apps. Subconscious manipulation involves techniques used to influence users' behavior without their awareness. Discrimination can occur when personal data is misused in ways that infringe people's rights or restrict opportunities for certain individuals or groups. These commercial practices are concerning as they exploit user data for commercial gain without the users' fully informed consent.

To improve privacy and security, companies should prioritize transparency and user control over their data. They should simplify and clearly communicate their terms and conditions, ensuring they are easily understandable to the average user. Companies should also minimize the collection of unnecessary data and give users the ability to provide explicit consent for data sharing. Implementing strong encryption measures and regularly auditing their systems for vulnerabilities is also crucial. By prioritizing privacy and security, companies can build trust and loyalty among their users.

Make it shorter but still understandable for a 9th grader

Cayla is a talking doll that connects to the internet, but it was banned in many countries because of privacy problems: hackers could pair with it over Bluetooth and use it to listen in on conversations. Cayla is different from devices like Echo and Siri because it is made for kids and uses an unsecured Bluetooth link that almost anyone nearby could connect to.

When Myrstad says we're not safe even if we don't have Cayla, he means that other devices and apps can still have privacy problems. Many apps on our phones collect data about us without us realizing it. Myrstad and his team did an experiment where they read the terms of the apps on an average phone out loud, which took them more than a full day, and they found that apps ask for access to data they don't really need.

Myrstad thinks it's unfair to put all the responsibility on us as consumers to understand complex terms and conditions. He believes companies should be more transparent about how they use our data.

The dating app Myrstad looked at had privacy concerns because it collected personal information, like sexual preferences, and shared it with others. The terms and conditions also gave the app permission to change its rules without telling us. These concerns might make people hesitant to join the app.

Some companies use practices like hidden costs, subtle manipulation, and discrimination to make money. These practices exploit our data and can harm us, which is not right. Some people may be shocked by these practices, while others may have suspected it was happening.

To build trust with users, companies should be transparent and give users control over their data. They should make their terms and conditions simpler and easier to understand. Companies should also limit the data they collect and get clear consent before sharing it. By prioritizing privacy and security, companies can earn trust and loyalty from users.