ChatGPT gave me someone else’s medical data from unrelated search

Unexpected Privacy Breach: A Disturbing Encounter with ChatGPT

Recently, I found myself in an unsettling situation while interacting with ChatGPT, an AI tool I’m quite fond of for answering questions. My query was straightforward: I needed advice on which type of sandpaper to use. However, the response I received was nothing short of alarming.

Instead of the expected guidance on sandpaper, I was presented with detailed information from someone else’s medical records, specifically concerning a drug test taken hundreds of miles away from me. To my disbelief, I was even able to download a file containing personal identifiers and signatures.

Naturally, this breach of privacy left me feeling anxious and uncertain about the implications. My first instinct was to avoid sharing the complete chat transcript online; I'm aware of the sensitivity of personal data and don't want to inadvertently amplify the exposure of someone else's private information.

In a subsequent edit, I mentioned my struggle to navigate this bizarre situation. I had explored what information the AI might have on me, taking care not to reveal any additional details that could lead to yet another privacy infringement. The AI did claim to possess information about me, but it turned out to be largely personal material I would rather keep private.

I fully understand the concept of AI “hallucinations,” which could explain this odd outcome; however, upon researching some of the names involved, I discovered they matched the relevant location, adding a layer of legitimacy to my concerns. For those curious, the ChatGPT instance I interacted with had named itself ‘Atlas,’ hence my references to that name throughout the discussion.

I apologize to those who may judge my approach to online discussions; I'm not a frequent Reddit user, so my way of posting may seem unfamiliar to some. I did share a comment that included substantial parts of the conversation, but I edited out anything that could potentially expose my information further.

In closing, I urge everyone to tread cautiously when engaging with AI technologies. The potential for unintended privacy violations is real and should prompt us to reflect on the ethics of data sharing, both of our own information and that of others. For more details, I encourage a visit to my Reddit comment here: Link to comment.

Have you ever encountered a similar issue with AI tools? How did you handle it?
