
#104 | Jason Miyares

By Heritage Podcast Network

Step into the digital dialogue with Jason Miyares on The Kevin Roberts Show, where the intersection of technology, ethics, and youth engagement takes the spotlight. In a conversation that unveils the unsettling tactics of Big Tech, Miyares, alongside host Kevin Roberts, delves into the controversial methods companies employ to captivate the younger generation, drawing chilling parallels with the notorious practices of Big Tobacco. As tech giants face criticism for breeding addiction in children, the episode explores the legal, ethical, and societal ramifications of such strategies.

The discourse extends beyond mere condemnation, pivoting to the pressing issue of a mental health crisis among teens linked to rampant social media consumption. As stories of parental helplessness surface, the show sheds light on bipartisan legal efforts to rein in major corporations, including Meta, and on state-level initiatives such as Virginia's proposal to ban TikTok for minors. From the role of parents in online vigilance to the need for stringent guardrails against the tech industry's influence, Miyares and Roberts navigate the complexities of artificial intelligence policy and its profound impact on the fabric of society.


This is a preview of the Shortform summary of the Feb 28, 2024 episode of The Kevin Roberts Show


1-Page Summary

Big Tech companies hooking youth on social media

Jason Miyares criticizes Big Tech companies for strategies that hook children on social media, likening these tactics to those once used by Big Tobacco. These practices raise legal and ethical concerns, including the violation of privacy laws and the instigation of a youth mental health crisis.

Targeting children to get them addicted at early age

Miyares compares Big Tech's strategies to Big Tobacco's historical marketing, citing how companies like Instagram target young people with ads on children's programs. He describes how these firms view youth as an 'untapped' demographic, using language that suggests a predatory approach he likens to "sprinkling digital cocaine."

Violating age restrictions and privacy laws

Meta, before 2019, allowed children under the age of 13 to create Instagram accounts without parental permission, breaking COPPA regulations. Bipartisan state attorneys general, including Miyares, have expressed concern about these legal breaches and the broader social and mental health issues that could arise from unchecked social media use among children.

Causing mental health crisis

Alongside an increase in smartphone and social media use, there's a rise in anxiety, depression, and suicidal thoughts among teenagers. Parents share regret about not recognizing the harmful effects of social media earlier, as they observe its significant negative impact on their children's mental health.

Bipartisan efforts against Meta/Facebook over consumer protection laws

A group of 41 state attorneys general has filed a lawsuit against Meta, accusing the company of knowingly allowing minors onto its services and targeting them, in violation of consumer protection laws. The lawsuit represents a bipartisan effort to address the issue.

Virginia efforts to ban TikTok for minors

Virginia Governor Youngkin aims to ban TikTok for minors due to risks of bullying, exposure to predators, and other threats, including national security concerns over the possibility of data requisition by the Chinese government.

Role of parents in monitoring kids' social media usage

Parents are struggling to protect their children on social media platforms. Miyares suggests keeping smartphones in a secure location and utilizing parental controls. He equates allowing unrestricted access to social media to negligence, as parents seek more robust ways to police their children’s online experiences.

Feeling overwhelmed

Parents feel daunted by social media, with some underestimating its potential for harm. They look for support and sometimes create accounts themselves to monitor their children's online behavior.

Need for guardrails and accountability

Miyares stresses the importance of implementing guardrails and holding Big Tech accountable. He views litigation as a tool to empower parents, push for strict online protection laws, and counter the tech industry's lobbying efforts.

Artificial intelligence policy considerations

Miyares touches on the implications of AI for social media content, emphasizing the differing algorithms deployed in China and the U.S., and the need for federal oversight to balance AI's potential benefits, like healthcare innovations, with its risks, including job displacement and the generation of harmful content.


Additional Materials

Clarifications

  • COPPA, the Children's Online Privacy Protection Act, is a United States law that protects the privacy of children under 13 online. It requires websites and online services to obtain parental consent before collecting personal information from children. Violating COPPA can result in significant penalties for companies that target children with their online services.
  • Bipartisan state attorneys general are legal officials from different political parties who work together on common issues or concerns. In this context, they collaborate across party lines to address legal matters related to Big Tech companies and social media use among children. Their joint efforts demonstrate a unified approach to tackling complex issues that transcend political divides.
  • Meta is the new name for the company previously known as Facebook, Inc. It encompasses various social media platforms like Facebook, Instagram, and WhatsApp. The company has faced criticism for its practices related to targeting minors and privacy violations. Recently, Meta has been in the spotlight due to legal actions and concerns over its impact on youth mental health.
  • Governor Youngkin aimed to ban TikTok for minors in Virginia due to concerns about bullying, exposure to predators, and potential national security risks related to data privacy. This action was part of efforts to protect young users from online threats and ensure their safety while using social media platforms. The proposed ban reflected growing worries about the negative impacts of unrestricted access to certain apps on minors' well-being and security. The decision highlighted the importance of addressing digital safety issues and regulating online platforms to safeguard children's online experiences.

Counterarguments

  • Big Tech companies argue that their platforms offer educational content and opportunities for social connection, which can be beneficial for youth development when used appropriately.
  • Some experts suggest that social media addiction is not solely the fault of Big Tech companies, but also a result of broader societal issues and individual choices.
  • There is debate over the effectiveness of age restrictions, with some arguing that they are difficult to enforce and that education about responsible use may be more effective.
  • The link between social media use and mental health issues is complex, and some researchers argue that there isn't a clear causal relationship, suggesting that other factors may also play a significant role.
  • The tech industry often claims that they are actively working to improve safety measures for minors and that they have introduced features like screen time management and content filters to help parents control their children's social media use.
  • Some argue that banning platforms like TikTok may not address the underlying issues and could lead to unintended consequences, such as driving youth to use less regulated or more secretive platforms.
  • There is a perspective that parental involvement and education are key to ensuring children's safety online, rather than solely relying on government intervention or tech company policies.
  • Concerning AI policy, some argue that innovation should not be stifled by overregulation and that the tech industry is capable of self-regulating to prevent the generation of harmful content while still advancing beneficial uses of AI.


Big Tech companies hooking youth on social media

Jason Miyares likens Big Tech companies to Big Tobacco, suggesting these firms deliberately target and hook children on social media to secure early-age users, which poses various legal and ethical challenges.

Targeting children to get them addicted at early age

Miyares draws a comparison between Big Tech and the infamous marketing techniques of Big Tobacco, which used characters like Joe Camel to appeal to young audiences. Instagram (Meta), for instance, ran targeted ads on children's programs like PBS Kids and internally referred to the youth market as an 'untapped' demographic, a practice Miyares likens to "sprinkling digital cocaine."

Violating age restrictions and privacy laws

Meta, prior to 2019, allowed children under 13 years old to create Instagram accounts without explicit parental permission, flouting COPPA (Children's Online Privacy Protection Act) regulations. Miyares, along with bipartisan state AGs, has raised concerns regarding this breach, as well as the broader social and mental health crises potentially spurred by the unregulated exposure of kids to social media.

Causing mental health crisis

Increased smartphone and social media usage has paralleled a rise in anxiety, depression, and suicidal ideation among teens. Parents have lamented the profound impact on their children and expressed regret over not understanding the adverse effects of social media sooner.

Bipartisan efforts against Meta/Facebook over consumer protection laws

In response to these issues, 41 state attorneys general have launched a lawsuit against Meta, alleging that the company knowingly permits minors on its platforms and targets them. The action is a bipartisan initiative grounded in consumer protection law violations.

Virginia efforts to ban TikTok for minors

Virginia's Governor Youngkin is pushing to ban TikTok for minors amidst concerns over bullying, exposure to predators, and other risks to adolescents' privacy and well-being. The possibility of TikTok data being requisitioned by the Chinese Communist Party raises further national security issues.

Role of parents in monitoring kids' social media usage

Parents feel swamped trying to safeguard their children on social media. Miyares advises keeping smartphones in the parents' bedroom and using parental controls, likening unrestricted access to leaving a child unguarded in a dangerous park. The litigation underway seeks to compel Big Tech to implement robust parental controls. Parents who find social media overwhelming are clamoring for help, with many setting up their own accounts to monitor their children's activities.

Feeling overwhelmed

Parents have voiced feeling overwhelmed by the sheer scale of challenges posed by ...

Big Tech companies hooking youth on social media

Additional Materials

Clarifications

  • COPPA, the Children's Online Privacy Protection Act, is a U.S. federal law that regulates the online collection of personal information from children under 13. It requires websites to obtain parental consent before gathering data from children and outlines specific privacy protections for minors online. The law aims to safeguard children's privacy and safety on the internet by setting guidelines for website operators regarding data collection practices. COPPA has been in effect since 2000 and has implications for websites and online services targeting children.
  • The Joe Camel marketing campaign was a controversial advertising strategy by R.J. Reynolds Tobacco Company in the late 1980s and early 1990s. It featured a cartoon character, Joe Camel, known for his cool and suave demeanor, which critics argued appealed to children and teenagers. The campaign faced significant backlash for allegedly targeting a younger demographic and glamorizing smoking. Ultimately, the Joe Camel campaign was discontinued amidst concerns about its impact on youth smoking rates.
  • Bipartisan state AGs are state Attorneys General from different political parties who come together to address common issues or concerns. In this context, they have united to take legal action against Big Tech companies like Meta for alleged violations related to targeting minors on social media. This collaboration signifies a joint effort across party lines to address consumer protection and privacy concerns in the digital realm.
  • Consumer protection law violations involve actions by companies that harm consumers through deceptive practices, unfair business methods, or breaches of regulations designed to safeguard consumers. In the context of Big Tech companies like Meta, these violations could include targeting minors with harmful content, disregarding privacy laws, or engaging in practices that exploit vulnerable consumer groups. State attorneys general and regulatory bodies may take legal action against companies found to be in violation of consumer protection laws to ensure fair and ethical treatment of consumers. Such violations can lead to legal consequences, fines, and requirements for companies to change their practices to comply with the law.
  • A private cause of action allows an individual to sue for damages or seek legal remedies against another party based on specific legal grounds. It gives the plaintiff the right to initiate a lawsuit by alleging facts that support their claim for relief in court. This legal concept enables individuals to seek redress for harm o ...

Counterarguments

  • Big Tech companies may argue that their platforms are designed to foster connectivity and creativity, not addiction, and that they provide valuable educational and social opportunities for young users.
  • Meta could contend that they have implemented measures to comply with COPPA and other regulations, and that any past issues have been addressed with stricter age verification processes.
  • Some researchers and industry experts might suggest that the correlation between social media usage and mental health issues is complex and not solely caused by social media; other environmental and psychological factors may also play significant roles.
  • It could be argued that lawsuits against Big Tech companies may not address the root causes of the issues and that collaboration between tech companies, parents, and regulators might be more effective.
  • Opponents of banning TikTok for minors might argue that such bans could infringe on freedom of expression and that education about safe online practices would be a better approach.
  • Some might suggest that while parents have a role in monitoring their children's social media usage, education and digital literacy programs for both parents and children are also crucial in promoting safe online behavior.
  • There may be a perspective that underestimating the risks of platforms like TikTok is not solely due to naivety but also due to a lack of clear inform ...
