Get ready to chat with AI like you're talking to a friend! OpenAI is launching Advanced Voice, a new feature for ChatGPT. This update will let some users speak to ChatGPT instead of typing.
However, don't get too excited yet: it will initially be available only to a lucky few ChatGPT Plus subscribers. A small test group will get access first before the rollout expands to everyone else.
What makes Advanced Voice so special? Unlike Siri or Google Assistant, which tend to give canned responses, ChatGPT's voice feature can talk more like an actual person. It can speak and understand many languages, and even convey emotion through its tone!
Another cool thing about Advanced Voice is how quickly it responds. It answers in roughly 320 milliseconds on average, about the same as a human's response time in conversation. That means no uncomfortable pauses when you want a smooth, natural dialogue with your AI.
Although OpenAI first announced this feature back in May, the company has been working hard to make sure it's safe and reliable before opening it up to users. It needs to be extra careful about rejecting content that isn't allowed and ensuring its systems can support many people using the feature at once.
We don't know exactly what criteria OpenAI will use to decide who gets first access to Advanced Voice, but we do know those users will come from subscribers paying $20 a month for ChatGPT Plus. The company aims to learn from this small test group and then gradually let more people use the feature.
This could change everything about how we interact with AI! It could make AI more approachable for non-techies, especially people who struggle with typing. And talking to AI may expand how we use this technology in daily life.
OpenAI plans to offer Advanced Voice to more users by the end of the year, though the exact date will depend on how well initial testing goes. Well, let's see how this amazing feature performs in real life!