ChatGPT can now imitate your voice. We are in great danger
ChatGPT’s ability to generate text that resembles what you might say in conversation has improved: it can now mimic voices as well. But be careful, because this new feature, promising as it looks, also highlights a real problem: AI can mimic your voice without your permission, and you already know what that can be used for.
GPT-4o, OpenAI’s most advanced model, allows the AI to generate different voices. During testing, however, it was discovered that under certain conditions the model could mimic the user’s own voice, even from short audio snippets.
This behavior stems from the way the language model generates speech. Given a sample, it can produce new audio in that voice, even when the imitation is unauthorized. OpenAI has implemented safeguards to prevent this, but the mere possibility brings a significant risk to the table.
Imagine receiving a call from a friend or family member, in their imitated voice, asking for confidential information or telling you they need a bank transfer. It all seems real, especially because you recognize their voice. But what if it’s not them?
This type of crime is expected to increase in the coming years, and it poses a significant risk to individuals because such fraud is hard to identify, and the techniques behind it will only improve over time.
Ultimately, a snippet of someone’s voice from a single conversation is enough for a criminal to stage conversations that sound very realistic but are entirely fake.
Of course, the fact that GPT-4o is already well suited for this kind of misuse is worrying. As for how to address a problem that will only get worse, there are several possible ways forward, such as stronger regulation or better safeguards from the companies behind these tools.
However, it seems that the “solution” lies right with you, considering that your ears may be all you have to determine whether or not a call is fake.