According to OpenAI, ChatGPT's memory is on by default, and users must actively turn it off. You can wipe the memory at any time, either in settings or simply by instructing the bot to wipe it. Once the memory settings are cleared, that information is no longer used to train OpenAI's AI models. It is unclear exactly how much of that personal data is used to train the AI while someone is chatting with the chatbot. And toggling off memory does not mean you have totally opted out of having your chats train OpenAI's model; that is a separate opt-out.
The company also claims that it does not store certain sensitive information in memory. If you tell ChatGPT your password (don't do this) or your Social Security number (or this), the app's memory is thankfully forgetful. Jang also says OpenAI is seeking feedback on whether other personally identifiable information, such as a user's ethnicity, is too sensitive for the company to capture automatically.
“We think there are many cases where that example is useful, but for now we have trained the model not to proactively remember that information,” Jang says.
It's easy to see how ChatGPT's memory feature could go awry: a user forgets they once asked the chatbot about a sensitive issue, an abortion clinic, or how to nonviolently deal with a mother-in-law, only to be reminded of it, or have someone else see it, in a future chat. How ChatGPT's memory handles health data is also something of an open question. “We are trying to steer ChatGPT away from remembering specific health details, but this is still a work in progress,” says OpenAI spokesperson Nico Felix. In this way ChatGPT is the same old song about the internet's permanence, in a very new era: Look at this great new memory feature, until it becomes a bug.
OpenAI is also not the first entity to tinker with memory in generative AI. Google has emphasized “multi-turn” technology in Gemini 1.0, its own LLM: users can interact with Gemini Pro through a single-turn prompt, one back-and-forth between the user and the chatbot, or hold a multi-turn, continuous conversation in which the bot “remembers” the context of previous messages.
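To make the distinction concrete, here is a minimal sketch of single-turn versus multi-turn use of Gemini Pro via Google's google-generativeai Python SDK; the model name and API surface have shifted over time, so treat the details as indicative rather than exact.

```python
# Sketch: single-turn vs. multi-turn interaction with Gemini Pro.
# Assumes the google-generativeai SDK and a valid API key.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder, not a real key
model = genai.GenerativeModel("gemini-pro")

# Single-turn: one prompt, one response, no retained context.
print(model.generate_content("Suggest a weeknight dinner.").text)

# Multi-turn: the chat object carries the history, so the second
# message is answered in the context of the first.
chat = model.start_chat(history=[])
chat.send_message("I'm vegetarian.")
print(chat.send_message("Suggest a weeknight dinner.").text)
```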
An AI framework company called LangChain has been developing a memory module that helps large language models recall previous interactions between end users and models. Giving LLMs long-term memory is “very powerful in creating unique LLM experiences. A chatbot can begin to tailor its responses to you as an individual based on what it knows about you,” says Harrison Chase, co-founder and CEO of LangChain. “The lack of long-term memory can also create a grating experience. No one wants to have to tell a restaurant-recommendation chatbot over and over that they are vegetarian.”
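As a rough illustration of the kind of memory module Chase describes, the sketch below uses LangChain's ConversationBufferMemory, which re-injects the running transcript into each prompt. Import paths have moved across LangChain versions, and this particular buffer is per-session; LangChain's longer-term stores work on the same principle.

```python
# Sketch: conversational memory with LangChain's classic API.
# Assumes OPENAI_API_KEY is set in the environment.
from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

llm = ChatOpenAI(temperature=0)

# ConversationBufferMemory keeps the running transcript and feeds it
# back into every prompt, so the model "remembers" earlier turns.
chain = ConversationChain(llm=llm, memory=ConversationBufferMemory())

chain.predict(input="I'm vegetarian, by the way.")
# A later turn sees the earlier statement without the user repeating it.
print(chain.predict(input="Recommend a restaurant for dinner."))
```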
This technology is sometimes called “context retention” or “persistent context” rather than “memory,” but the end goal is the same: making human-computer interaction feel so fluid and natural that users can easily forget what a chatbot might remember about them. It is also a potential boon for businesses deploying chatbots that want to maintain ongoing relationships with the customers on the other end.
“You can think of these as just a bunch of tokens that get added to the beginning of a conversation,” says OpenAI research scientist Liam Fedus. “The bot has some level of intelligence, and behind the scenes it looks through its memories and says, ‘These seem related; let me merge them.’ And that is then reflected in the token budget.”
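Fedus's description suggests a simple mechanism: memories are plain strings folded into the prompt before each request, and every character counts against the model's context window. The sketch below is a hypothetical illustration of that idea, not OpenAI's implementation; all names in it are made up.

```python
# Hypothetical sketch of memory-as-prepended-tokens; illustrative only.
memories = [
    "User is vegetarian.",
    "User lives in Berlin.",
]

def build_messages(user_input: str) -> list[dict]:
    # Fold the memory strings into the system prompt. Everything added
    # here consumes context-window space (the "token budget").
    memory_block = "\n".join(f"- {m}" for m in memories)
    return [
        {"role": "system",
         "content": f"Known facts about the user:\n{memory_block}"},
        {"role": "user", "content": user_input},
    ]

print(build_messages("Recommend a restaurant for dinner."))
```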
Fedus and Jang say ChatGPT's memory capacity is nowhere near that of the human brain. And yet, almost in the same breath, Fedus explains that ChatGPT's memory is limited to “a few thousand tokens,” if that.
Is this the hyper-vigilant virtual assistant that tech consumers have been promised for the past decade, or just another data-collection scheme that uses your tastes, preferences, and personal data to serve a tech company better than it serves you? “I think the assistants of the past just didn't have the intelligence,” Fedus said. “And now we're getting there.”
Will Knight contributed to this story.