The technology industry is in the hot seat again. After facing intense scrutiny in the Senate Judiciary Committee earlier this week, social media and technology executives now face the wrath of Minnesota's top attorney.
A new report released Thursday by Minnesota Attorney General Keith Ellison's office showed that emerging technologies such as social media and artificial intelligence programs are harming the state's youth.
The report, commissioned by the state Legislature last year and led by Ellison, found that specific design choices companies made to increase engagement had negative impacts on consumers, especially children and young people. Ellison said in a statement that the report's findings and recommendations will help state policymakers better regulate technology companies and create safer digital spaces for children.
“Technology like social media is changing the way Minnesota children grow up, often in very negative ways,” Ellison said. “As it stands, technology companies with little oversight and a habit of putting profits over people have almost unrestricted access to our children via computers and smartphones.”
Harmful and unwanted experiences are rampant on social media, the report said, citing Meta's own internal research showing that more than a quarter of the company's users reported witnessing bullying or harassment in the past seven days. The unrestricted digital access technology companies have created has made dangerous encounters commonplace.
“As a minor, on more than one occasion I received sexually explicit photos from men who added my account,” one complainant wrote to Ellison's office. “I didn't even have to add them back to see the images they sent. Minors should never be exposed to this kind of behavior.”
Inappropriate and explicit content has been normalized for children: the average age of first exposure to pornography nationwide is 12, and 1 in 8 Instagram users under 16 report receiving unwanted sexual advances on the platform.
Young people recognize that social media harms their mental health, but design choices such as visible like counts, lax default privacy settings, and AI-driven algorithms keep driving up their usage while undermining their well-being, the report says.
Infinite scrolling not only contributes to sleep deprivation and lower academic performance among students, including roughly half of college students, but also fosters a culture that idolizes content and creators catering to distorted desires rather than reality.
“When she was 13 years old, she began cutting herself,” another complainant wrote in a letter to the Minnesota Attorney General's Office. “When I asked her why, she said she did it because girls on Instagram were saying how relieving it was to cut themselves.”
Governments around the world have been slow to wake up to technology's growing impact on young users. In the United States, regulations vary by state.
Florida and Texas have passed laws restricting content moderation on social media platforms, but legal challenges have blocked Florida's law and will send Texas's to the U.S. Supreme Court later this month. Additionally, Montana became the first state to ban TikTok last year, while states such as Utah, Delaware, and Connecticut have passed bills restricting children's access to social media and the platforms' ability to target them.
Nationally, regulatory efforts by Congress include bills that would limit platforms' access to children's personal data and require more transparency about social media platforms' targeting practices. President Joe Biden issued an executive order in October guiding the development and use of AI by federal agencies.
The report says efforts to date have yielded lessons that can guide future regulation. For example, Ellison's report noted that overly prescriptive laws can “rapidly become obsolete as technology advances,” while overly broad reporting requirements have had little impact.
“If we want Minnesota's youth to grow up with the dignity, safety, and respect they deserve, this unacceptable status quo must change,” Ellison said.
Ellison's recommendations for state policymakers include:
- Prohibit “dark patterns” within platform design, such as autoplay and aggressive notifications that amplify platform usage.
- Require transparency in product development that includes potentially harmful features.
- Create consumer-friendly device-based defaults.
- Track how platform-specific technologies impact users.
- Mandate interoperability and encourage consumer choice.
- Mandate usage restrictions and provide technology education in schools.
While the advisory aims to hold tech companies accountable, Ellison acknowledged that cyberbullying, hypersexualization, and predators will continue to roam the digital landscape, and that users will retain the ability to engage in risky behavior. But users need to be more aware of the consequences of their engagement, he said, and companies must give them “the same mitigating power as in the offline world.”
Going forward, as AI becomes more mainstream and more individuals gain the ability to target people online, stronger rules of engagement will become increasingly important to protect vulnerable consumers, Ellison said.
“We will continue to use every tool at our disposal to prevent ruthless corporations from preying on our children,” Ellison said. “We hope that other policymakers will use the content of this report to do the same.”