Page Harrington asked ChatGPT to make a morning checklist for her 5-year-old child with ADHD who can't read. The generative AI shot out a color-coded routine.

Harrington also uses ChatGPT to make amusing oil paintings of her family, source trending songs for TikToks and even create interior designs for her home.

The Massachusetts 33-year-old is among a growing number of ChatGPT power users. Some estimates find ChatGPT now has between 800 million and 1 billion daily active users.

Harrington is such a prolific user that the chatbot told her she seeks answers or ideas about every 11 minutes. "I'm using it all day, every day," she told USA TODAY. "I am entrenched."

ChatGPT is so integral to her daily life that her chat history has become a detailed map of her inner self, Harrington said.

"It is hysterical to look at because it really shows what my brain looks like," she said. "I use ChatGPT to take an idea I have and make it better. It reveals the constant thought-spiraling I have on a daily basis."

While a detailed ChatGPT history can be hugely beneficial (the more users reveal, the more relevant the outputs become), it also raises privacy concerns. In fact, ChatGPT histories are so intensely personal that some people say they'd rather let a stranger read their texts than see their chatbot banter.

That prospect is now a possibility. A May court order has, at least temporarily, prevented ChatGPT's maker, OpenAI, from honoring user requests to delete the chat histories of personal accounts.

The move has created confusion over what exactly ChatGPT history is. Is it public? Could it be used against you?

"We are forgetting how much we are sharing with it," said Jessica Camilleri-Shelton, an early ChatGPT adopter and U.K.-based AI content creator. "A lot of people are in a daze about what this tool represents and the way they're interacting with it. The things they're sharing and the history that's being built on them is a deep and revealing picture of who they are."

Why your ChatGPT chats aren't being deleted right now

New York federal Judge Ona Wang ruled in May that OpenAI must hold on to some chats in a copyright infringement lawsuit.

The lawsuit filed by The New York Times in 2023 alleges OpenAI used its articles to train generative AI models. The newspaper and other plaintiffs say the ChatGPT user data could contain potential evidence to support that claim.

The order requires OpenAI to keep ChatGPT histories even if a user requests the chats be deleted or if state privacy laws require OpenAI to delete the data. The order does not apply to business accounts.

OpenAI appealed the preservation order but was unsuccessful. Last month, Wang upheld the order, rejecting a user's petition arguing that the chats should not be retained for privacy reasons.

"Every single chat from everybody in America is now frozen under protective order and cannot be deleted," said Jay Edelson, a Chicago-based plaintiff's attorney who sues AI companies on behalf of users. "Even if people think they have temporary chats, or are deleting chats, by virtue of a court order, that's not happening."

It's common for a judge to preserve records in litigation, said Ryan Calo, a University of Washington law professor.

According to Calo, the court order doesn't technically violate OpenAI's terms of service, which say the company must retain data to comply with legal orders. But it raises "a very important legal question about what the people that make AI owe to the people who own the copyright behind the training data."

The company said it will "resume our standard data retention practices" once the court permits.

In the meantime, user data applicable to the order will be "stored separately in a secure system" and is only accessible to OpenAI's legal and security team, according to OpenAI.

Calo says ChatGPT users should be vigilant about the sensitive information they share. Such a robust data stockpile could still be vulnerable to cyberattacks and requests from law enforcement.

"So, if you go talk to ChatGPT about your most sensitive stuff and then there's like a lawsuit or whatever, like, we could be required to produce that," OpenAI CEO Sam Altman acknowledged on the "This Past Weekend" podcast last month.

Bottom line? Calo's recommendation is to stay calm and carry on using ChatGPT, just more cautiously.

Treating ChatGPT like it's 'one of us'

From social media to search histories, people have navigated myriad privacy threats before. But ChatGPT makes us feel uniquely vulnerable because it is increasingly an extension of ourselves, according to Kate Devlin, professor of artificial intelligence and society at King's College London.

"We tend to treat these things as if they are one of us," Devlin said.

That's the case for Rue Halloway, a 20-year-old social media creator from New York City, who uses generative AI for guidance on interpersonal situations and on managing her ADHD. Halloway turned to ChatGPT when she found therapy was too expensive.

"I'm very neutral," Halloway said about the vulnerability of her ChatGPT history. While she wouldn't like it if her history ever became public, she wouldn't get too upset over it. After all, she already forks over a lot of personal data to other tech platforms, she said.

Some people are willing to trade their data for access to "an expert at your fingertips," Camilleri-Shelton said.

Not everyone is so sure. With no federal law protecting online information and just a patchwork of state privacy laws, many Americans are confused and concerned about how their online information is used, according to surveys by the Pew Research Center.

The discomforting reality is that ChatGPT consumes vast amounts of information all the time, privacy experts say. Even if you ask it to delete your information, the system has likely already digested and incorporated it, according to Calli Schroeder, who leads the AI and Human Rights Project at the Electronic Privacy Information Center.

"Because of the way these systems are built, you can't delete individual pieces of information once it's become part of a training data set," she said.

For this reason, some ChatGPT power users like Harrington have decided to avoid highly personal or emotional queries, sticking to less revealing requests, such as how to build a skin-care routine from the products on her bathroom counter.

"ChatGPT doesn't know if your boyfriend hates you," Harrington said. "But ChatGPT does know if you should use CeraVe oil cleanser before Cetaphil soap. ... I'm going with things that are more fact rather than opinion."

How to protect your ChatGPT history

Worried about disclosing too much to ChatGPT? Here are some tips:

  • If you need support, reach out to a friend or professional, not generative AI: "The whole purpose ... is to sound plausible, not true," Devlin said. Users should not take replies as fact about their health and well-being.
  • Private chats don't mean privacy: "Ultimately your information is being sent to a tech company," Devlin said. "This is not a confidential service. It wasn't designed to be that."
  • Gut check before you type: "Anything you plug in, that's no longer yours," Schroeder said.

This article originally appeared on USA TODAY: Should you delete your ChatGPT history? Why you might not have a choice.

Reporting by Nicole Fallert, USA TODAY