Technologies
Congress Won’t Block State AI Regulations. Here’s What That Means for Consumers
The Senate yanked the plan to halt enforcement of state artificial intelligence laws from the big tax and spending bill at the last minute.
After months of debate, a plan in Congress to block states from regulating artificial intelligence was pulled from the big federal budget bill this week. The proposed 10-year moratorium would have prevented states from enforcing rules and laws on AI if the state accepted federal funding for broadband access.
The issue exposed divides among technology experts and politicians, with some Senate Republicans joining Democrats in opposing the move. The Senate eventually voted 99-1 to remove the proposal from the bill, which also includes the extension of the 2017 federal tax cuts and cuts to services like Medicaid and SNAP. Congressional Republican leaders have said they want to have the measure on President Donald Trump’s desk by July 4.
Tech companies and many Congressional Republicans supported the moratorium, saying it would prevent a "patchwork" of rules and regulations across states and local governments that could hinder the development of AI — especially in the context of competition with China. Critics, including consumer advocates, said states should have a free hand to protect people from potential issues with the fast-growing technology.
"The Senate came together tonight to say that we can't just run over good state consumer protection laws," Sen. Maria Cantwell, a Washington Democrat, said in a statement. "States can fight robocalls, deepfakes and provide safe autonomous vehicle laws. This also allows us to work together nationally to provide a new federal framework on artificial intelligence that accelerates US leadership in AI while still protecting consumers."
Despite the moratorium being pulled from this bill, the debate over how the government can appropriately balance consumer protection and supporting technology innovation will likely continue. "There have been a lot of discussions at the state level, and I would think that it's important for us to approach this problem at multiple levels," said Anjana Susarla, a professor at Michigan State University who studies AI. "We could approach it at the national level. We can approach it at the state level, too. I think we need both."
Several states have already started regulating AI
The proposed moratorium would have barred states from enforcing any regulation, including those already on the books. The only exceptions were rules and laws that make things easier for AI development and those that apply the same standards to non-AI models and systems that do similar things. These kinds of regulations are already starting to pop up. The biggest focus is not in the US but in Europe, where the European Union has already implemented standards for AI. But states are starting to get in on the action.
Colorado passed a set of consumer protections last year, set to go into effect in 2026. California adopted more than a dozen AI-related laws last year. Other states have laws and regulations that often deal with specific issues such as deepfakes or require AI developers to publish information about their training data. At the local level, some regulations also address potential employment discrimination if AI systems are used in hiring.
"States are all over the map when it comes to what they want to regulate in AI," said Arsen Kourinian, a partner at the law firm Mayer Brown. So far in 2025, state lawmakers have introduced at least 550 proposals around AI, according to the National Conference of State Legislatures. In the House committee hearing last month, Rep. Jay Obernolte, a Republican from California, signaled a desire to get ahead of more state-level regulation. "We have a limited amount of legislative runway to be able to get that problem solved before the states get too far ahead," he said.
Read more: AI Essentials: 29 Ways to Make Gen AI Work for You, According to Our Experts
While some states have laws on the books, not all of them have gone into effect or seen any enforcement. That limits the potential short-term impact of a moratorium, said Cobun Zweifel-Keegan, managing director in Washington for IAPP. "There isn't really any enforcement yet."
A moratorium would likely deter state legislators and policymakers from developing and proposing new regulations, Zweifel-Keegan said. "The federal government would become the primary and potentially sole regulator around AI systems," he said.
What a moratorium on state AI regulation would mean
AI developers have asked for any guardrails placed on their work to be consistent and streamlined.
"We need, as an industry and as a country, one clear federal standard, whatever it may be," Alexandr Wang, founder and CEO of the data company Scale AI, told lawmakers during an April hearing. "But we need one, we need clarity as to one federal standard and have preemption to prevent this outcome where you have 50 different standards."
During a Senate Commerce Committee hearing in May, OpenAI CEO Sam Altman told Sen. Ted Cruz, a Republican from Texas, that an EU-style regulatory system "would be disastrous" for the industry. Altman suggested instead that the industry develop its own standards.
Asked by Sen. Brian Schatz, a Democrat from Hawaii, if industry self-regulation is enough at the moment, Altman said he thought some guardrails would be good, but, "It's easy for it to go too far. As I have learned more about how the world works, I am more afraid that it could go too far and have really bad consequences." (Disclosure: Ziff Davis, parent company of CNET, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)
Not all AI companies are backing a moratorium, however. In a New York Times op-ed, Anthropic CEO Dario Amodei called it "far too blunt an instrument," saying the federal government should create transparency standards for AI companies instead. "Having this national transparency standard would help not only the public but also Congress understand how the technology is developing, so that lawmakers can decide whether further government action is needed."
Concerns from companies, both the developers that create AI systems and the "deployers" who use them in interactions with consumers, often stem from fears that states will mandate significant work such as impact assessments or transparency notices before a product is released, Kourinian said. Consumer advocates have said more regulations are needed, and that hampering states' ability to act could hurt the privacy and safety of users.
A moratorium on specific state rules and laws could result in more consumer protection issues being dealt with in court or by state attorneys general, Kourinian said. Existing laws around unfair and deceptive practices that are not specific to AI would still apply. "Time will tell how judges will interpret those issues," he said.
Susarla said the pervasiveness of AI across industries means states might be able to regulate issues such as privacy and transparency more broadly, without focusing on the technology. But a moratorium on AI regulation could lead to such policies being tied up in lawsuits. "It has to be some kind of balance between 'we don't want to stop innovation,' but on the other hand, we also need to recognize that there can be real consequences," she said.
Much policy around the governance of AI systems does happen because of those so-called technology-agnostic rules and laws, Zweifel-Keegan said. "It's worth also remembering that there are a lot of existing laws and there is a potential to make new laws that don't trigger the moratorium but do apply to AI systems as long as they apply to other systems," he said.
What’s next for federal AI regulation?
One of the key lawmakers pushing for the removal of the moratorium from the bill was Sen. Marsha Blackburn, a Tennessee Republican. Blackburn said she wanted to make sure states were able to protect children and creators, like the country musicians her state is famous for. "Until Congress passes federally preemptive legislation like the Kids Online Safety Act and an online privacy framework, we can't block states from standing in the gap to protect vulnerable Americans from harm — including Tennessee creators and precious children," she said in a statement.
Groups that opposed the preemption of state laws said they hope the next move for Congress is to take steps toward actual regulation of AI, which could make state laws unnecessary. If tech companies "are going to seek federal preemption, they should seek federal preemption along with a federal law that provides rules of the road," Jason Van Beek, chief government affairs officer at the Future of Life Institute, told me.
Ben Winters, director of AI and data privacy at the Consumer Federation of America, said Congress could take up the idea of pre-empting state laws again in separate legislation. "Fundamentally, it's just a bad idea," he told me. "It doesn't really necessarily matter if it's done in the budget process."
Technologies
Judge Blocks Texas App Store Age-Check Law
A preliminary injunction found the Texas law, set to begin Jan. 1, is "more likely than not unconstitutional."
A new Texas state law set to take effect on Jan. 1 would have required app stores to implement age verification processes. But the law has been put on hold, at least temporarily, by a federal court judge.
As reported by the Texas Tribune, Senate Bill 2420, also known as the Texas App Store Accountability Act, is the subject of a preliminary injunction issued by US District Judge Robert Pitman.
Pitman said in his decision that the law as written is broad, vague and "more likely than not unconstitutional." However, he also wrote the court "recognizes the importance of ongoing efforts to better safeguard children when they are on their devices."
Don’t miss any of our unbiased tech content and lab-based reviews. Add CNET as a preferred Google source.
The law, signed by Governor Greg Abbott in May, requires app store operators — including Apple, Google, Nintendo, Steam and more — to build age verification processes for their storefronts and to allow downloads to minors only with parental consent. The injunction is a ruling in an October lawsuit filed by the Computer & Communications Industry Association.
CCIA senior vice president Stephanie Joyce said in a statement, "This Order stops the Texas App Store Accountability Act from taking effect in order to preserve the First Amendment rights of app stores, app developers, parents, and younger internet users. It also protects parents' inviolate right to use their own judgment in safeguarding their children online using the myriad tools our members provide."
Other individuals and the advocacy group Students Engaged in Advancing Texas also filed suits over the law, the Texas Tribune reported.
App Store Accountability Act
The bill's author, State Senator Angela Paxton, said the bill was meant to give parents "common sense tools to protect their kids and to survive court challenges by those who may have lesser priorities."
The language of Texas Senate Bill 2420 covers not only mobile app stores from Apple or Google, but any "website, software application, or other electronic service that distributes software applications from the owner or developer of a software application to the user of a mobile device."
By that definition, websites with links to browser games or mobile game consoles with download options would fall under the Texas law as written. The law also defines mobile devices as including phones and tablets, as well as any other handheld device capable of transmitting or storing information wirelessly.
The parental consent aspect of the law requires those under 18 to have an app store account affiliated with a parent or guardian to purchase or download applications.
Age verification elsewhere
In an effort to keep adult materials out of reach of minors and to protect children from potentially harmful content and interactions, tech companies have been compelled by law or through legal action to verify the age of users.
Roblox, which has a huge audience of minors, began rolling out stricter age verification after investigations and lawsuits hurt its reputation as a safe gaming space. Australia is perhaps the largest-scale example of a government restricting access to online content: in December, Australia began barring users under 16 from social media. Reddit recently challenged that law.
In the US, age verification laws have primarily targeted adult sites. Texas already has a law on the books that requires adult sites to age-block their content. The Supreme Court upheld that law in a June ruling. The UK has also enacted age restriction rules for adult sites as have other US states.
Technologies
Today’s NYT Mini Crossword Answers for Thursday, Dec. 25
Here are the answers for The New York Times Mini Crossword for Dec. 25.
Looking for the most recent Mini Crossword answer? Click here for today’s Mini Crossword hints, as well as our daily answers and hints for The New York Times Wordle, Strands, Connections and Connections: Sports Edition puzzles.
Need some help with today’s Mini Crossword? Of course, there’s a very Christmassy clue involved. And once you solve the entire puzzle, look at the letters used in all the answers and see what they have in common. (5-Across will tell you!) Read on for all the answers. And if you could use some hints and guidance for daily solving, check out our Mini Crossword tips.
If you’re looking for today’s Wordle, Connections, Connections: Sports Edition and Strands answers, you can visit CNET’s NYT puzzle hints page.
Read more: Tips and Tricks for Solving The New York Times Mini Crossword
Let’s get to those Mini Crossword clues and answers.
Mini across clues and answers
1A clue: ___ King Cole, singer with the album "The Magic of Christmas"
Answer: NAT
4A clue: Body drawings, informally
Answer: TATS
5A clue: Letters to ___ (what this Mini was made with)
Answer: SANTA
6A clue: Huge fan, in slang
Answer: STAN
7A clue: "Illmatic" rapper
Answer: NAS
Mini down clues and answers
1D clue: Grandmothers, by another name
Answer: NANAS
2D clue: Abbr. before a name on a memo
Answer: ATTN
3D clue: Org. with long lines around the holidays
Answer: TSA
4D clue: "See ya later!"
Answer: TATA
5D clue: Govt.-issued ID
Answer: SSN
Technologies
Don’t Let a Border Agent Ruin Your Holiday Trip. Travel With a Burner Phone
Yes, you should leave your main phone at home and take a cheap burner this winter.
Prepare for a whole new level of border-crossing anxiety this holiday season: the high probability of a phone search. New figures from US Customs and Border Protection show agents aren't just glancing at your lock screen anymore — they are aggressively ramping up device inspections, even for citizens coming home. We aren't just talking about a quick scroll through your photos, either. Agents are increasingly using forensic tools to clone and analyze everything on your device.
The stats are genuinely alarming. In just a three-month window this year, nearly 15,000 devices were flagged for searches, with over a thousand subjected to deep-dive data copying. If you’re traveling with your primary phone, you are essentially carrying your entire digital existence into a legal gray zone where privacy is optional.
The smartest defensive play is remarkably low-tech: the burner phone. By traveling with a secondary, stripped-down device, you ensure your private data stays safe at home while you stay connected abroad. But privacy isn't the only perk. Moving to a "dumb" phone is the ultimate digital detox, helping you escape the notification trap that usually ruins a vacation.
Even figures like Conan O’Brien have ditched the smartphone to cut through the noise. Whether you’re dodging invasive border searches or just trying to enjoy your trip without being glued to a screen, a burner might be the best travel investment you make this year.
Read more: Best Prepaid Phone of 2025
Although carriers have offered prepaid phones since the '90s, "burner phones" or "burners" became popular in the 2000s following the celebrated HBO series The Wire, where they helped characters avoid getting caught by the police. While often portrayed in that light, burners aren't only used by criminals; they're also used by anyone concerned with surveillance or privacy infringement.
What is a burner phone, and how does it work? Here’s everything you need to know about burners and how to get one.
What is a burner phone?
A burner phone is a cheap prepaid phone with no commitments. It comes with a set number of prepaid call minutes, text messages or data, and it’s designed to be disposed of after use.
Burner phones are typically used when you need a phone quickly, without intentions of long-term use. They're contract-free, and you can grab them off the counter. They're called burner phones because you can "burn" them (trash them) after use, and the phone can't be traced back to you, which makes them appealing to criminals. Of course, those committed to illicit activities often do more than just throw these phones in the trash; they may completely destroy the SIM cards and other components by smashing them with a hammer or melting them.
Burners are different from getting a regular, contract-bound cellphone plan that requires your information to be on file.
Why should you use a burner phone?
Burner phones are an easy way to avoid cellphone contracts or spam that you get on your primary phone number. Burners aren’t linked to your identity, so you can avoid being tracked down or contacted.
You don’t have to dispose of a burner phone after use. You can add more minutes and continue using it. Burner phones can still function as regular phones, minus the hassle of a contract.
You can also get a burner phone as a secondary phone for a specific purpose, like having a spare phone number for two-factor authentication texts, for business or to avoid roaming charges while traveling. Burner phones are often used by anyone concerned with privacy.
Read more: The Data Privacy Tips Digital Security Experts Wish You Knew
Burner phones, prepaid phones, smartphones and burner SIMs: What’s the difference?
Burner phones are cheap phones with simple designs that lack the bells and whistles of a smartphone. Because they're designed to be disposable, you only get the essentials, as seen in the most common version, the flip phone.
All burner phones are prepaid phones, but not all prepaid phones are burners. What sets a burner apart is that you won’t have to give away any personal information to get one, and it won’t be traceable back to you. Again, a burner phone is cheap enough to be destroyed after use.
Prepaid smartphones are generally low-end models. You can use any unlocked smartphone with prepaid SIM cards, essentially making it a prepaid phone.
If you want a burner, you don’t necessarily have to buy a new phone. You can get a burner SIM and use it with an existing phone. Burner SIMs are prepaid SIMs you can get without a contract or giving away personal information.
Where can you buy a burner phone?
Burner phones are available at all major retail outlets, including Best Buy, Target and Walmart. They’re also often available at convenience stores like 7-Eleven, local supermarkets, gas stations and retail phone outlets like Cricket and Metro.
You can get a burner phone with cash, and it should cost between $10 and $50, although it may cost more if you get more minutes and data. If you’re getting a burner phone specifically to avoid having the phone traced back to you, it makes sense to pay with cash instead of a credit card.
If you just want a prepaid secondary phone, you can use a credit card. Just keep in mind that credit cards leave a trail that leads back to you.
There are also many apps that let you get secondary phone numbers, including Google Fi and the Burner app. However, these aren't necessarily burners, because the providers typically have at least some of your personal information. Additionally, services like Google Voice require a phone number that's already in use before you can choose a number with the service.
If you’re just looking to get a solid prepaid phone without anonymity, check out our full guide for the best prepaid phone plans available. We also have a guide for the best cheap phone plans.
