OpenAI will be holding onto all of your conversations with ChatGPT, even the ones you thought you deleted, and possibly sharing them with a lot of lawyers. That’s the upshot of an order from the federal judge overseeing a lawsuit brought against OpenAI by The New York Times over copyright infringement. Judge Ona Wang upheld her earlier order to preserve all ChatGPT conversations for evidence, rejecting a motion from ChatGPT user Aidan Hunt, one of several users asking her to rescind the order over privacy and other concerns.
Judge Wang told OpenAI to “indefinitely” preserve ChatGPT’s outputs after the Times argued that those logs would be a way to tell whether the chatbot has illegally reproduced its articles without paying the original publishers. But finding those examples means hanging onto every intimate, awkward, or just plain private communication anyone’s had with the chatbot. And though what users write isn’t part of the order, it wouldn’t be hard to work out who was conversing with ChatGPT about which personal topic based on what the AI wrote back. In fact, the more personal the discussion, the easier it would probably be to identify the user.
Hunt pointed out that he had no warning this might happen until he saw a report about the order in an online forum, and he is now concerned that his conversations with ChatGPT, including “highly sensitive personal and commercial information,” might be disseminated. He asked the judge to vacate the order or modify it to exclude especially private content, such as conversations conducted in private mode or those discussing medical or legal matters.
According to Hunt, the judge was overstepping her bounds with the order because “this case involves important, novel constitutional questions about the privacy rights incident to artificial intelligence usage – a rapidly developing area of law – and the ability of a magistrate [judge] to institute a nationwide mass surveillance program by means of a discovery order in a civil case.”
Judge Wang rejected his request because his privacy concerns aren’t related to the copyright issue at hand. She emphasized that the order is about preservation, not disclosure, and that it’s hardly unique or uncommon for courts to tell a private company to hold onto certain records for litigation. That’s technically correct, but, understandably, an everyday person using ChatGPT might not feel that way.
She also seemed to particularly dislike the mass surveillance accusation, quoting that section of Hunt’s petition and slamming it with the legal language equivalent of a diss track. Judge Wang added a “[sic]” to the quote from Hunt’s filing and a footnote pointing out that the petition “does not explain how a court’s document retention order that directs the preservation, segregation, and retention of certain privately held data by a private company for the limited purposes of litigation is, or could be, a ‘nationwide mass surveillance program.’ It is not. The judiciary is not a law enforcement agency.”
That ‘sic burn’ aside, there’s still a chance the order will be rescinded or modified after OpenAI goes to court this week to push back against it as part of the larger paperwork battle around the lawsuit.
Deleted but not gone
Hunt’s other concern is that, regardless of how this case goes, OpenAI now has the ability to retain chats that users believed were deleted and could use them in the future. It’s an open question whether OpenAI will prioritize user privacy over legal expedience. So far, the company has argued in favor of that privacy and has asked the court for oral arguments, which will take place this week, to challenge the retention order. It has said it wants to push back hard on behalf of its users. But in the meantime, your chat logs are in limbo.
Many may have felt that writing to ChatGPT is like talking to a friend who can keep a secret. Perhaps more will now understand that it still acts like a computer program, and that the equivalent of your browser history and Google search terms is still in there. At the very least, hopefully, there will be more transparency. Even when it’s the courts demanding that AI companies retain sensitive data, the companies should notify users; we shouldn’t find out by chance on a web forum.
And if OpenAI really wants to protect its users, it could start offering more granular controls: clear toggles for anonymous mode, stronger deletion guarantees, and alerts when conversations are being preserved for legal reasons. Until then, it might be wise to treat ChatGPT a bit less like a therapist and a bit more like a coworker who might be wearing a wire.