(OSV News) – There is no shortage these days of proposed state and federal government legislation directed at both safety and liability in a digital environment that almost everyone – perhaps with the exception of Big Tech companies themselves – agrees is increasingly dangerous for kids.
Following California’s successful lead, legislation in Arkansas, Connecticut, Illinois, Maryland, Minnesota, New Jersey, New Mexico, New York, Oregon, Utah, Virginia, and West Virginia aims to impose fines on social media giants, require age-appropriate design codes, and set age restrictions for users opening accounts.
At the U.S. Capitol, federal legislation to reform Section 230 of the 1996 Communications Decency Act, the provision that broadly shields online services from liability for what their users post, has been reintroduced in the U.S. House and Senate. Discussion of bills such as the bipartisan Kids Online Safety Act, S. 3663, dominated Senate Judiciary Committee hearings in February, even as lawmakers weigh a TikTok ban over national security concerns.
“Many of these digital spaces our children have been dropped into were not designed with children in mind,” explained Christopher McKenna, a digital security expert, founder of Protect Young Eyes, and a keynote speaker at the 2022 National Catholic Education Association conference. So it isn’t surprising, McKenna told OSV News, that “when dropped into these algorithms, children make decisions that aren’t always helpful to them.”
For parents who might wonder if the threat is overstated, McKenna has this reminder: “You’re pitting a child against the world’s greatest product designers.”
After setting up fake social media accounts, researchers at the University of Texas at Austin found that accounts with feminine usernames received an average of 100 sexually explicit messages daily.
A 2022 Pew Research Center survey of U.S. teens, whose most frequented platforms include TikTok, Instagram, and YouTube, found that 46% report being online “almost constantly.”
Responsible digital parenting, McKenna stresses, requires adult involvement: “There should be a minimum age of access for social media, unless a parent or somebody has said you can get in. And there are many ways to do that.”
McKenna recounted the authentication required by an online game his son wanted to play. Once McKenna’s son entered his correct birthdate, a pop-up prompted him to hand the phone to an adult. Through a third-party vendor, the elder McKenna entered an email address and credit card number. A temporary micro-transaction of a penny was processed.
The game manufacturer “never got my credit card number; they never got anything,” said McKenna. “What they got was a message back from this third-party service provider saying this child – who just created an account – now has express consent to be inside of the app because a parent has verified they’re involved.”
McKenna emphasized that such technology is readily available. “It’s just simply we have all bought the marketing lie that social media doesn’t have to play by the same rules,” he said. However, McKenna added, “Without liability or better laws, (tech companies) won’t change their behavior.”
McKenna is also unquestionably pro-commerce and pro-tech. “I’m all for businesses making responsible money,” McKenna confirmed. “But not on the data of our children. Not on the behaviors and the clicks and the taps of our children.”
Ideally, said McKenna, “we should live in a society where the probability of egregious harm to our young people is low. That if you’re a good parent, it’s really low. And if you’re a bad parent – and don’t really care about their digital wellbeing – there are other constraints in place that prevent children from wandering into those places when we’re not watching.”
Jessica Heldman, a child rights professor at the University of San Diego and a member of its Children’s Advocacy Institute, agreed with McKenna’s risk assessment.
“The business model of social media is built around maximizing user engagement, as opposed to ensuring users are engaging in safe and healthy ways,” Heldman told OSV News. “They’re using addictive mechanisms to push children towards dangerous content.”
Heldman also said there should be both regulation and accountability – and, notwithstanding continuous technological advances, there are precedents for both.
“There’s a long history of regulation and liability when it comes to products used by consumers – particularly children – that cause harm. The focus of a lot of the legislation around the country is that there is research that indicates a clear connection between use of social media and increased mental health issues, suicidality, self-harm, child trafficking, and even sale of fentanyl among young people,” Heldman explained. “So when we can trace these harms to something being accessed by consumers, it’s fair to call it a ‘product’ that requires some safety limitations.”
The current social media legal situation is, Heldman said, reminiscent of earlier concerns about children and smoking. However, “once those baseline legal responsibilities were outlined,” she recalled, “it absolutely changed the way tobacco companies operated.”
Heldman said that there are many advocates and legislators ready to step forward if effecting change means “lighting a fire under these companies through legislation and litigation.”
“We don’t want to get youth off of social media,” said Amanda Raffoul, an instructor and researcher in Harvard Medical School’s Department of Pediatrics and a fellow in the Boston Children’s Hospital Division of Adolescent and Young Adult Medicine. “We just want to make sure that these platforms aren’t targeting vulnerable youth – and more often, that’s happening because of really hyper-specific algorithms that tend to push people towards more extreme content as time goes on.”
Higher social media use, Raffoul told OSV News, is associated with increased risk of poor body image, cyberbullying, anxiety, and depression. The impact is greater in girls than in boys, and some outcomes are intensified by features of the platform itself, such as image-based feeds or social reward systems.
The effects, Raffoul shared, “are a lot worse when youth are using social media at a very high rate in adolescence.”
However, parents can still take some positive proactive steps.
“The number one biggest thing is, your kid’s phone or device – whatever it might be that they’re accessing the internet on – it’s not theirs; it’s yours,” said Kristin Bird, a mother of three and executive director of Burning Hearts Disciples, a Wisconsin-based parish and diocesan consulting firm, who has written on the topic of social media use for the Catholic youth ministry Life Teen. “And so remember that – and be willing and able to put parameters around the use, just like you would anything else in your home that’s yours that your kids use.”
While Bird said that she’s a parent just like any other – and every kid is different – she also knows what has worked for her and her husband Tony, a public high school principal. She suggested that parents set clear social media use expectations and consequences ahead of time.
She added that a teen’s cell phone also should not be off-limits to parents. Bird recommended parents have passcodes to everything on those phones, not just for the accountability aspect, but “so if something happens, you can check in on it.”