Technologies
Gen AI Chatbots Are Starting to Remember You. Should You Let Them?
An AI model’s long memory can offer a better experience — or a worse one. Good thing you can turn it off.
Until recently, generative AI chatbots didn’t have the best memories: You’d tell one something and, when you came back later, you’d start again with a blank slate. Not anymore.
OpenAI started testing a stronger memory in ChatGPT last year and rolled out improvements this month. Grok, the flagship tool of Elon Musk’s xAI, also just got a better memory.
It took significant improvements in math and technology to get here, but the real-world benefits seem pretty simple: You can get more consistent, personalized results without having to repeat yourself.
“If it’s able to incorporate every chat I’ve had before, it does not need me to provide all that information the next time,” said Shashank Srivastava, assistant professor of computer science at the University of North Carolina at Chapel Hill.
Those longer memories can help solve some frustrations with chatbots, but they also pose new challenges. As when you talk to a person, what you said yesterday might influence your interactions today.
Here’s a look at how the bots came to have better memories and what it means for you.
Improving an AI model’s memory
For starters, it isn’t quite a “memory.” Mostly, these tools work by incorporating past conversations alongside your latest query. “In effect, it’s as simple as if you just took all your past conversations and combined them into one large prompt,” said Aditya Grover, assistant professor of computer science at UCLA.
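The concatenation Grover describes can be sketched in a few lines of Python. This is a simplified illustration, not any vendor’s actual implementation; real systems add summarization and truncation on top of it, and the sample history is made up:

```python
# Naive chatbot "memory": prepend all past exchanges to the new query,
# producing one large prompt the model reads from scratch each time.

def build_prompt(history, new_question):
    """Combine past (question, answer) pairs and the latest query."""
    lines = []
    for question, answer in history:
        lines.append(f"User: {question}")
        lines.append(f"Assistant: {answer}")
    lines.append(f"User: {new_question}")
    return "\n".join(lines)

# Hypothetical earlier conversation the model gets to "remember."
history = [("What's a good pizza place in San Francisco?",
            "Try a classic spot in North Beach.")]
prompt = build_prompt(history, "What about burgers nearby?")
print(prompt)
```

Because the earlier exchange rides along in the prompt, the model can infer context, such as the user’s city, without being told again.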
Those large prompts are now possible because the latest AI models have significantly larger “context windows” than their predecessors. The context window is, essentially, how much text a model can consider at once, measured in tokens. A token might be a word or part of a word (OpenAI’s rule of thumb is that one token is about three-quarters of a word).
Early large language models had context windows of 4,000 or 8,000 tokens — a few thousand words. A few years ago, if you asked ChatGPT something, it could consider roughly as much text as is in this recent CNET cover story on smart thermostats. Google’s Gemini 2.0 Flash now has a context window of a million tokens. That’s a bit longer than Leo Tolstoy’s epic novel War and Peace. Those improvements are driven by some technical advances in how LLMs work, creating faster ways to generate connections between words, Srivastava said.
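Using that three-quarters-of-a-word rule of thumb, you can roughly estimate whether a conversation still fits a given context window. A hedged sketch; real tokenizers give exact counts, and this approximation can be off for code, punctuation or non-English text:

```python
# Rough token estimate: if one token is ~3/4 of a word, then a text has
# roughly 4/3 tokens per word. Only an approximation of a real tokenizer.

def estimate_tokens(text):
    words = len(text.split())
    return round(words * 4 / 3)

def fits_in_window(text, window_tokens):
    return estimate_tokens(text) <= window_tokens

sample = "word " * 3000          # about 3,000 words of text
print(estimate_tokens(sample))   # roughly 4,000 tokens
print(fits_in_window(sample, 4_000))      # just fits an early 4K window
print(fits_in_window(sample, 1_000_000))  # trivial for a 1M-token window
```

This is why the jump from 4,000-token to million-token windows matters: the same estimate that maxed out an early model barely registers against a modern one.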
Other techniques can also boost a model’s memory and ability to answer a question. One is retrieval-augmented generation, in which the model can run a search or otherwise pull up documents as needed to answer a question, without always keeping all of that information in the context window. Instead of having a massive amount of information available at all times, it just needs to know how to find the right resource, like a researcher perusing a library’s card catalog.
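The retrieval step can be illustrated with a toy example. Real retrieval-augmented generation systems score documents with vector embeddings; this sketch substitutes simple keyword overlap, and the documents are invented for illustration:

```python
# Toy retrieval-augmented generation: rather than keeping every document
# in the context window, fetch only the most relevant one and add it to
# the prompt. Keyword overlap stands in for real embedding-based search.
import string

def retrieve(question, documents):
    """Return the document sharing the most words with the question."""
    def words(text):
        return {w.strip(string.punctuation) for w in text.lower().split()}
    q = words(question)
    return max(documents, key=lambda d: len(q & words(d)))

documents = [
    "The thermostat supports schedules and remote sensors.",
    "War and Peace is a novel by Leo Tolstoy.",
]
context = retrieve("Who wrote War and Peace?", documents)
prompt = f"Context: {context}\nQuestion: Who wrote War and Peace?"
```

The model only ever sees the one retrieved document, which is what keeps the context window from filling up with the whole library.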
Read more: AI Essentials: 27 Ways to Make Gen AI Work for You, According to Our Experts
Why context matters for a chatbot
The more an LLM knows about you from its past interactions with you, the better suited to your needs its answers will be. That’s the goal of having a chatbot that can remember your old conversations.
For example, if you ask an LLM with no memory of you what the weather is, it’ll probably follow up first by asking where you are. One that can remember past conversations, however, might know that you often ask it for advice about restaurants or other things in San Francisco, for example, and assume that’s your location. “It’s more user-friendly if the system knows more about you,” Grover said.
A chatbot with a longer memory can provide you with more specific answers. If you ask it to suggest a gift for a family member’s birthday and tell it some details about that family member, it won’t need as much context when you ask again next year. “That would mean smoother conversations because you don’t need to repeat yourself,” Srivastava said.
A long memory, however, can have its downsides.
You can (and maybe should) tell AI to forget
Having a chatbot recommend a gift poses a conundrum that’s all too common in human memories: You told your aunt you liked airplanes when you were 12 years old, and decades later you still get airplane-themed gifts from her. An LLM that remembers things about you could bias itself too much toward something you told it before.
“There’s definitely that possibility that you can lose your control and that this personalization could haunt you,” Srivastava said. “Instead of getting an unbiased, fresh perspective, its judgment might always be colored by previous interactions.”
LLMs typically allow you to tell them to forget certain things or to exclude some conversations from their memory.
You may also be discussing things you don’t want an AI model to remember. If you’re communicating private or sensitive information to an LLM (and you should think twice about doing so at all), you probably want to turn off the memory function for those interactions.
Read the guidance on the tool you’re using to be sure you know what it’s remembering, how to turn it on and off and how to delete items from its memory.
Grover said this is an area where gen AI developers should be transparent and offer clear commands in the user interface. “I think they need to be providing more controls that are visible to the user, when to turn it on, when to turn it off,” he said. “Give a sense of urgency for the user base so they don’t get locked into defaults that are hard to find.”
How to turn off gen AI memory features
Here’s how to manage memory features in some common gen AI tools.
ChatGPT
OpenAI has a couple of types of memory in its models. One is called “reference saved memories,” and it stores details that you specifically ask ChatGPT to save, like your name or dietary preferences. Another, “reference chat history,” remembers information from past conversations (but not everything).
To turn off either of these features, you can go to Settings and Personalization and toggle the items off.
You can ask ChatGPT what it remembers about you and ask it to forget something it has remembered. To completely delete this information, you can delete the saved memories in Settings and the chat where you saved that information.
Gemini
Google’s Gemini model can remember things you’ve discussed or summarize past conversations.
To modify or delete these memories, or to turn off the feature entirely, you can go into your Gemini Apps Activity menu.
Grok
Elon Musk’s xAI announced memory features in Grok this month and they’re turned on by default.
You can turn them off under Settings and Data Controls. The specific setting is different between Grok.com, where it’s “Personalize Grok with your conversation history,” and the Android and iOS apps, where it’s “Personalize with memories.”
Today’s NYT Mini Crossword Answers for Friday, Feb. 27
Here are the answers for The New York Times Mini Crossword for Feb. 27.
Looking for the most recent Mini Crossword answer? Click here for today’s Mini Crossword hints, as well as our daily answers and hints for The New York Times Wordle, Strands, Connections and Connections: Sports Edition puzzles.
Was today’s Mini Crossword too short for you? The New York Times now has a Midi Crossword, which is not as big as the original NYT Crossword, but longer than the Mini. Read on for the answers to today’s Mini Crossword. And if you could use some hints and guidance for daily solving, check out our Mini Crossword tips.
If you’re looking for today’s Wordle, Connections, Connections: Sports Edition and Strands answers, you can visit CNET’s NYT puzzle hints page.
Read more: Tips and Tricks for Solving The New York Times Mini Crossword
Let’s get to those Mini Crossword clues and answers.
Mini across clues and answers
1A clue: Lacking locks
Answer: BALD
5A clue: One of the Great Lakes
Answer: ERIE
6A clue: Movie with the fake newspaper headline «Wonder Elephant Soars to Fame!»
Answer: DUMBO
8A clue: Live tweeter?
Answer: BIRD
9A clue: The slightest bit
Answer: ATAD
Mini down clues and answers
1D clue: Hard thing to leave on a cold day
Answer: BED
2D clue: Caribbean island northwest of Curaçao
Answer: ARUBA
3D clue: The sky, in a saying
Answer: LIMIT
4D clue: Actress Messing
Answer: DEBRA
7D clue: Like this clue number
Answer: ODD
Smartphone Sales to Plummet 13% in 2026 Due to RAM Crisis, Says IDC
AI-fueled memory scarcity is hitting the phone market hard this year, particularly for inexpensive, low-end devices.
The projected shortage of memory chips worldwide will have a more serious impact on smartphone sales in 2026 than previously thought, according to new data from International Data Corporation (IDC). Whereas the company had estimated just in November a drop of between 0.9% and 5.2% (the latter being its “pessimistic scenario”), it now sees a 12.9% decline this year, based on its Worldwide Quarterly Mobile Phone Tracker.
“What we are witnessing is not a temporary squeeze, but a tsunami-like shock originating in the memory supply chain, with ripple effects spreading across the entire consumer electronics industry,” Francisco Jeronimo, vice president for Worldwide Client Devices at IDC, said in a statement.
The hardest-hit companies are expected to be those selling to the lower end of the market, which can’t absorb the higher component costs while maintaining profitable margins. As a result, Jeronimo says, many of those players will pass the added costs on to consumers.
That also applies to regional markets such as the Middle East and Africa, where mostly inexpensive smartphones are sold and which could see a steep 20.6% drop year over year.
By contrast, IDC expects Apple and Samsung to be better able to withstand the crisis. “As smaller and low-end-positioned Android vendors struggle with rising costs, Apple and Samsung could not only weather the storm but potentially expand market share as the competitive landscape tightens,” said Jeronimo.
Memory has become scarce due to the insatiable demand to feed generative AI. Essentially all of the memory set to be manufactured this year is already earmarked. What started as a demand for graphics processors has expanded to other components. For example, hard drive manufacturer Western Digital announced in early February that it had already sold out of its supply for 2026.
“We expect consolidation as smaller players exit, and low-end vendors face sharp shipment declines amid supply constraints and lower demand at higher price points,” said Nabila Popal, senior research director at IDC, projecting a 14% rise in the average selling price of smartphones to $523.
Popal expects memory prices to stabilize by the middle of 2027, but doesn’t see them coming down to earlier levels. The sub-$100 segment, made up of approximately 171 million devices, will be “permanently uneconomical,” she said. “In short, there is no return to business as usual for vendors and consumers.”