WASHINGTON — Congress has a plethora of technology policy challenges to solve, from artificial intelligence systems to data privacy to children's online safety, but not much time left on the legislative calendar before the November election.
Despite the lack of a federal law, more than a dozen states have enacted data privacy laws, and more are in the process of doing so. Several states have similarly enacted or are considering legislation on artificial intelligence systems, all of which increases the pressure to develop national policy.
Starting Wednesday, fewer than 50 days will remain on the calendars of both chambers of Congress before the Nov. 5 election.
Linda Moore, CEO of TechNet, a group whose members include the heads of Amazon.com, Apple, Google LLC, Meta Platforms Inc., Microsoft Corp. and OpenAI, said Congress could still enact narrower AI legislation by November, but she called passage of a comprehensive bill by then unlikely.
“Congress is having a hard time coming together around the issues that we really need to make sure we have good, solid foundational AI policy,” Moore said in an interview, “and a national privacy law that's necessary for the tech industry to thrive.” She also pointed to changes to the immigration system.
“As for a comprehensive AI policy package, we do not expect it to be passed by the end of the year for a variety of reasons,” Moore said. “They have their hands very busy and they have a lot of things that need to get done by the end of the year, and we know they’re not done yet.”
Instead, Moore said, Congress could pass legislation limiting the use of AI deepfakes in elections, or fully fund the National AI Research Resource, a joint venture of the National Science Foundation, 10 federal agencies and about 25 nongovernmental organizations whose goal is to expand access to resources across the country to advance the development of AI systems.
Senate Majority Leader Charles E. Schumer, D-N.Y., held a series of closed-door briefings on AI systems for lawmakers in 2023 with the aim of building consensus on broad AI legislation, but he has not yet indicated when he will propose such measures.
In February, the House established an AI task force led by Reps. Jay Obernolte, R-Calif., and Ted Lieu, D-Calif., both of whom have said they do not expect to introduce comprehensive legislation broadly covering the technology. The task force is expected to release its report by the end of this year.
Expectations that Congress would pass a federal data privacy law rose earlier this month, when leaders of the Senate Commerce and House Energy and Commerce committees announced they had reached agreement on a bill. However, the proposal does not yet have broad support in either chamber.
The bill would give consumers rights over their data and the right to sue companies for privacy violations. Small businesses, as defined in the proposal, would be exempt from the bill's provisions.
The bill would pre-empt the patchwork of state laws to create a national standard, though state attorneys general, like the Federal Trade Commission, would have authority to enforce it. The House Energy and Commerce Committee is scheduled to hold a subcommittee hearing on the proposal this week; the Senate Commerce Committee has not yet scheduled one.
Congressional aides said Senate committee staff is working to drum up support.
Maryland's measures
As Congress debates how to move forward with federal privacy measures, one state has passed a bill with stricter provisions than most other states that have their own laws.
Earlier this month, the Maryland General Assembly passed a bill that would require companies to collect only the data necessary for transactions and limit the collection of sensitive data such as location information. If signed by Gov. Wes Moore, the law would go into effect in October 2025.
The data minimization requirements in the Maryland law go further than the laws enacted in 14 other states and are similar to those in the Cantwell-Rodgers proposal being considered by Congress.
Other states allow companies to collect data they deem necessary as long as they obtain user consent. Maryland, by contrast, eschewed the consent-based approach, said Keir Lamont, director for U.S. legislation at the Future of Privacy Forum, an advocacy group focused on data privacy.
Lamont said in an interview that the bill's effect is to “generally downplay the importance of consent and instead put default limits on the data that can be collected.” “This is a new approach to protecting privacy at the state level,” he said. “If other states continue to adopt similar language, it could impact the federal debate over privacy laws.”
State efforts on artificial intelligence systems and children's online safety could also advance federal legislation.
According to the Council of State Governments, a bipartisan group that promotes states' efforts, four states – California, Connecticut, Louisiana and Vermont – have enacted laws to protect individuals from unintended but foreseeable impacts of unsafe or ineffective AI systems. Other states require companies to disclose to users when AI systems play a role in hiring or other decision-making, the council said.
Congress is also discussing tightening rules to ensure the safety and privacy of children using online platforms, but those efforts have not yet led to legislation. Two bipartisan bills addressing children's online safety were approved by the Senate Commerce Committee last year, but have not yet reached a floor vote.
Lamont said states are once again stepping into the breach.
Earlier this month, the Maryland General Assembly passed a separate bill on so-called age-appropriate design, requiring technology companies to build platforms with privacy as the default setting and to ensure their services are appropriate for the age of their users.
The measure is similar to a 2022 California law that is being challenged in court by NetChoice, a technology industry trade group. A U.S. district court blocked parts of the law, and California has appealed the ruling.
Vermont is also considering age-appropriate design laws.
“States are becoming increasingly aggressive in legislating on topics that would otherwise be left to the federal government,” Lamont said.