December 29, 2025

ChatGPT will produce erotica in December, OpenAI CEO Sam Altman announced Tuesday.

He wrote in a post on social media platform X that users will be able to ask for the content if they are verified to be adults.

“In December, as we roll out age-gating more fully and as part of our ‘treat adult users like adults’ principle, we will allow even more, like erotica for verified adults,” Altman said.

One X user, who identifies as an artificial intelligence content creator, questioned Altman’s announcement.

“Why do age-gates always have to lead to erotica? Like, I just want to be able to be treated like an adult and not a toddler, that doesn’t mean I want perv-mode activated,” the user, @catebligh, wrote.

Altman replied that “you won’t get it unless you ask for it.”

He said in his original post that OpenAI would be relaxing restrictions it had placed on ChatGPT. The company first had to make sure it was "being careful" with chatbot users' mental health before rolling back any limitations, Altman explained.

“We made ChatGPT pretty restrictive to make sure we were being careful with mental health issues. We realize this made it less useful/enjoyable to many users who had no mental health problems, but given the seriousness of the issue we wanted to get this right,” the executive wrote. “Now that we have been able to mitigate the serious mental health issues and have new tools, we are going to be able to safely relax the restrictions in most cases.”

Another X user who identifies as a senior software engineer said in response to Altman, “about time.” ChatGPT users “don’t want chaos, just authenticity,” @roubalsehgal said.

“Chatgpt used to feel like a person you could actually talk to, then it turned into a compliance bot. if it can be made fun again without losing the guardrails, that’s a huge win,” @roubalsehgal noted.

Altman said in response that OpenAI agrees.

“Almost all users can use ChatGPT however they’d like without negative effects; for a very small percentage of users in mentally fragile states there can be serious problems. 0.1% of a billion users is still a million people,” Altman continued. “We needed (and will continue to need) to learn how to protect those users, and then with enhanced tools for that, adults that are not at risk of serious harm (mental health breakdowns, suicide, etc) should have a great deal of freedom in how they use ChatGPT.”

Have questions, concerns or tips? Send them to Ray at [email protected].
