I was testing an AI chatbot service, and something really strange happened mid-conversation. The AI abruptly switched from its typical voice to one that sounded exactly like mine, even though I never authorized it to replicate my voice. The switch caught me completely off guard.
Has anyone else faced a similar issue with AI voice tech? I’m worried about how these systems might be duplicating and storing our voices without consent. It felt quite creepy, as if I were in a science fiction film where technology starts to imitate people.
What steps should I take now? Is it necessary to report this or reach out to the company? I’m anxious about privacy concerns and whether my voice data is being misused.
This seems more likely to be a technical issue than deliberate cloning. During trial versions, AI voice systems can behave unpredictably. I ran into something similar last year, when a beta service generated distorted audio that resembled my speech due to feedback errors.
It’s crucial to reach out to their support and describe your experience in detail. Make sure to ask about their voice sampling practices, as reputable companies will clarify and ensure any unintentional recordings are deleted. Additionally, review their privacy policy regarding voice data to understand your rights.
that's definitely sketchy behavior from the ai system. i'd screenshot everything and contact the company asap - don't let them brush it off as a "glitch". also check whether you agreed to voice training in the terms when signing up, companies sometimes sneak that stuff in there. might want to consider filing a complaint with data protection authorities too if they can't explain how this happened.