WASHINGTON, DC – As new technologies such as artificial intelligence reshape the tech industry, Connecticut's federal delegation is testing the regulatory lines Congress drew during the internet boom of the 1990s.
Moving away from a largely hands-off approach to emerging computing technologies, Connecticut lawmakers are debating how best to use state and federal regulation to protect consumer data and mitigate the harms that social media platforms like TikTok and AI-generated content pose, especially to young people.
Rep. Jim Himes, the ranking Democrat on the House Intelligence Committee, told the CT Examiner that he believes federal and state regulation both have a role in protecting the public from criminal activity while also helping to spur innovation.
“The big social media platforms and the big online companies operate globally,” Himes said, “so I think it's important that we have a national standard when it comes to data privacy. We can't have 50 different data protection regimes where individuals own their data in one state and are free to own it in another.”
Himes added that certain criminal activity, especially involving minors, may be subject to state regulation.
“I think there's plenty of room for state regulation when it comes to fraud and criminal activity and things like that,” he said. “If we're talking about buying and selling images of underage Americans, all of those crimes and frauds and criminal laws should be state-led.”
Protecting consumers, particularly minors, on social media platforms has been a hot topic in Washington in recent months.
President Joe Biden last month signed a bipartisan foreign aid bill that includes a provision forcing TikTok's China-based parent company to sell the app or face a ban.
As CBS News reported last month, some lawmakers and federal officials have argued that the app poses a national security threat to the United States and that “the Chinese government could use the app to spy on Americans or weaponize it to covertly influence the American public by amplifying or suppressing certain content.”
In early March, TikTok CEO Shou Zi Chew assured lawmakers at a congressional hearing that TikTok's parent company, ByteDance, doesn't share U.S. users' data with the Chinese government. He also told Congress that ByteDance isn't being used as an “agent of China” and that the company has rules in place to protect teenagers, including automatically setting the accounts of users under the age of 16 to private.
Still, Connecticut lawmakers have raised concerns about privacy and young people's safety on apps like TikTok, and are urging their colleagues to enact some form of regulation.
“As a parent, I see firsthand how much harm social media causes children,” Sen. Chris Murphy said in a statement. “Social media companies know exactly what harm they're doing to kids, yet they intentionally power addictive algorithms to line their pockets.”
Sen. Richard Blumenthal last year introduced the Kids Online Safety Act, a bill that aims to protect minors from online harm by giving parents tools to monitor their children's activity on various platforms and giving minors safeguards that limit access to their personal data.
In a statement, he said the bill was “necessary to hold major tech companies accountable for the harm they cause children” and urged Congress to pass it.
“The need for strong regulation of the tech industry has never been more urgent and necessary,” Blumenthal said in a statement. “Congress must act swiftly to put in place safeguards as we confront the promise and dangers of artificial intelligence and the dangerous and disturbing harms that social media poses to children.”
Rep. Joe Courtney agreed, saying guardrails need to be put in place when consumer privacy, particularly that of children, may be at risk.
“I believe we have an obligation to make changes to protect privacy and children online without infringing on their constitutional right to free speech,” Courtney said in a statement.
Jerry Smith, a Republican hoping to unseat Murphy in the fall election, raised many of the same concerns in response to a request for comment from the CT Examiner, expressing support for Blumenthal's bill and praising the safety measures and tools it provides parents to protect their children online.
“While I abhor the expansion of government power, I am 100% in support of Congress taking action in this area to protect our children,” Smith said in a statement.
“We hope that safeguards will be put in place to allow children to access faith-based content,” he added.
Meanwhile, Chinasa Okolo, an expert on AI governance and technology at the center-left Brookings Institution, urged lawmakers to turn to experts when making decisions about tech regulation.
“Younger members of our society are more vulnerable to the impacts of AI,” she added, stressing the importance of establishing regulations for young people using emerging technologies.
“I think it's important to ensure that we have those protections for minors and that they can use these tools safely, because I don't think the solution is to ban minors or other community members from using these tools,” Okolo said. “We need to encourage responsible adoption, responsible interaction, and that's where regulation is important.”
As Congress considers possible technology regulation, Himes said lawmakers will have to find a way to preserve the country's “innovative reputation” while also protecting people's privacy.
“The key to everything from cryptocurrencies to AI to social media is getting the balance right in terms of regulation between protecting people from fraud, criminal activity and the erosion of their freedoms, while not stifling innovation,” Himes said. “Now is the time to start thinking about regulating not just data privacy, but blockchain, cryptocurrencies, applications of artificial intelligence, and more.”
George Logan, the Republican challenger to incumbent Rep. Jahana Hayes, and Mike France, the Republican challenger to Rep. Joe Courtney, declined to comment on the matter. Rep. Rosa DeLauro also did not respond to a request for comment.