Advanced Voice Mode (AVM) runs on OpenAI's latest GPT-4o model and lets users talk to the chatbot without typing text prompts.
You can talk to ChatGPT as if you were talking to another person, pausing if necessary. According to OpenAI, the AVM feature “provides more natural, real-time conversations, lets you interrupt at any time, and senses and responds to your emotions.”
The long-awaited feature was initially announced at OpenAI's Spring Update event and released to beta testers in July, before rolling out to premium subscribers in late September.
“Free users will also get a sneak peek at Advanced Voice,” the company announced in a post on X in October, adding that it would keep free users and additional users in the EU updated on timing.
Despite its somewhat exclusive nature, the feature has already proven to be a hit with users. When AVM finally made its way to Plus subscribers at large, social media lit up with posts about all the cool things the feature could do, from simulated breaks during long recitations to a variety of regional voices and accents.
It has proven so popular, in fact, that Meta and Google were quick to follow suit with conversational voice features of their own.
This news comes just 24 hours after the company announced a new chat history search feature for the web app, with the reveal post on X noting that it “has started rolling out the ability to search your ChatGPT chat history on the web.”