A federal moratorium on state AI rules is approaching passage. Why does it matter?
Under the proposal before Congress, state and local governments would be restricted in how they can regulate artificial intelligence. AI leaders say the move would ensure the US can lead in innovation, but critics say it could mean less consumer protection for a rapidly growing set of technologies.
The proposal, as passed by the House of Representatives, says states or political subdivisions may not enforce any "law or regulation regulating an artificial intelligence model, an artificial intelligence system, or an automated decision system" for 10 years. In May, the House added it to the full budget bill, which also cuts services like Medicaid and SNAP. The Senate has since made some changes: the moratorium would apply only to states that accept funds as part of the $42.5 billion Broadband Equity, Access, and Deployment program.
AI developers and some lawmakers say federal action is needed to keep states from creating a patchwork of different rules and regulations that could slow the technology's growth. The rapid rise of generative AI since OpenAI's ChatGPT exploded onto the scene in late 2022 has pushed businesses to bring the technology into as many spaces as possible. The economic stakes are significant, as the US and China race to see which country's technology will dominate, but generative AI also poses privacy, transparency and other risks to consumers that lawmakers have sought to curb.
"(Congress) has not passed meaningful consumer protection laws for many years," said Ben Winters, director of AI and privacy at the Consumer Federation of America. "If the federal government doesn't act and no one else is allowed to act, it only benefits big tech companies."
Efforts to limit states’ ability to regulate artificial intelligence could mean less consumer protection for technologies that are increasingly permeating all aspects of American life. “There was a lot of debate at the state level, and I think it’s important to approach this issue at multiple levels,” said Anjana Susarla, a professor at Michigan State University who studies AI. “You can approach it at the national level. You can approach it at the state level too. I think you need both.”
Several states have already begun regulating AI
The proposed language would bar states from enforcing AI regulations, including those already on the books. The exceptions are rules and laws that make AI development easier and those that apply the same standards to non-AI models and systems that do comparable things. These kinds of regulations are already starting to pop up. The biggest focus so far is not in the US but in Europe, where the European Union has already implemented standards for AI. But states are beginning to get in on the action.
Colorado passed a set of consumer protections last year, set to take effect in 2026. California adopted more than a dozen AI-related laws last year. Other states have laws and regulations that often deal with specific issues, such as deepfakes, or that require AI developers to publish information about their training data. At the local level, some regulations also address potential employment discrimination when AI systems are used in hiring.
So far in 2025, state lawmakers have introduced at least 550 AI-related proposals, according to the National Conference of State Legislatures, said Arsen Kourinian, a partner at the law firm Mayer Brown. At a House committee hearing last month, Rep. Jay Obernolte, a Republican from California, signaled a desire to get ahead of more state-level regulation. "We have a limited amount of legislative runway to be able to get that problem solved before the states get too far ahead of it," he said.
While some states have laws on the books, not all of them have taken effect or seen enforcement. That limits the potential short-term impact of a moratorium, said Cobun Zweifel-Keegan, managing director in Washington for the International Association of Privacy Professionals. "There isn't really any enforcement yet."
A moratorium would likely deter state legislators and policymakers from developing and proposing new regulations, Zweifel-Keegan said. "The federal government would become the major and potentially only regulator around AI systems," he said.
What does a moratorium on state AI regulation mean?
AI developers have asked for any guardrails placed on their work to be consistent and streamlined.
"As an industry and as a country, we need one clear federal standard, whatever it is," Alexandr Wang, founder and CEO of the data company Scale AI, told lawmakers at an April hearing. "But we need clarity on one federal standard and preemption to prevent this outcome where you have 50 different standards."
During a Senate Commerce Committee hearing in May, OpenAI CEO Sam Altman told Sen. Ted Cruz, a Republican from Texas, that an EU-style regulatory system would be "disastrous" for the industry. Instead, Altman suggested that the industry develop its own standards.
Asked by Sen. Brian Schatz, a Hawaii Democrat, whether industry self-regulation is enough at this point, Altman said he thought some guardrails would be good, but warned that regulation could easily "go too far." (Disclosure: CNET's parent company, Ziff Davis, filed a lawsuit against OpenAI in April, alleging it infringed Ziff Davis' copyrights in training and operating its AI systems.)
However, not all AI companies are backing a moratorium. In a New York Times op-ed, Anthropic CEO Dario Amodei called it "far too blunt an instrument," saying the federal government should instead create transparency standards for AI companies. "Having this national transparency standard would help not only the public but also Congress understand how the technology is developing, so that lawmakers can decide whether further government action is needed," he wrote.
Concerns from both developers who create AI systems and "deployers" who use them in interactions with consumers often stem from fears that states will mandate significant work, such as impact assessments or transparency notices, before a product is released, Kourinian said. Consumer advocates say more regulation is needed, and that hampering states' ability to act could hurt users' privacy and safety.
Kourinian said a moratorium on specific state rules and laws could result in more consumer protection issues being dealt with in court or by state attorneys general. Existing laws around unfair and deceptive practices that are not specific to AI would still apply. "Time will tell how judges interpret those issues," he said.
Susarla said the pervasiveness of AI across industries means states might be able to regulate issues like privacy and transparency more broadly, without focusing on the technology itself. But a moratorium on AI regulation could lead to such policies getting tied up in lawsuits. "It has to be some kind of balance between 'we don't want to stop innovation' and recognizing that there can be real consequences," she said.
Much of the policy around the governance of AI systems does happen because of those so-called technology-agnostic rules and laws, Zweifel-Keegan said. "It's worth also remembering that there are a lot of existing laws, and there is a potential to make new laws, that don't trigger the moratorium but do apply to AI systems as long as they apply to other systems," he said.
The proposed 10-year moratorium on state AI laws is now in the hands of the US Senate, where the Committee on Commerce, Science and Transportation has already held hearings on artificial intelligence.
Will the AI moratorium pass?
The bill is currently in the hands of the US Senate, and debate over the moratorium has picked up as more people become aware of the proposal. The proposal cleared a key procedural hurdle when the Senate parliamentarian ruled that it complies with the so-called Byrd rule. Winters said tying the moratorium to states' acceptance of BEAD funding likely helped it get there.
Whether it passes in its current form is now more a question of politics than procedure, Winters said. Senators from both parties, including Republican Sens. Josh Hawley and Marsha Blackburn, have expressed concerns about tying states' hands.
"Even if it isn't stripped out procedurally, I think there are strong, unresolved questions about whether it will pass as written," Winters said.
Any bill approved by the Senate must also be accepted by the House of Representatives, where the budget bill passed by the narrowest of margins. Even some House members who voted for the bill say they don't like the moratorium. Rep. Marjorie Taylor Greene, a Georgia Republican and key ally of President Donald Trump, posted on X this week that she "resolutely opposes" the moratorium and will not vote for a bill that includes it.
At the state level, a letter signed by 40 state attorneys general, from both parties, called on Congress to reject the moratorium and instead create a broader regulatory system. "This bill does not propose any regulatory scheme to replace or supplement the laws enacted or currently under consideration by the states, leaving Americans entirely unprotected from the potential harms of AI," they wrote.