BBS:      TELESC.NET.BR
Subject:  ChatGPT used in South Korean murder probe
From:     Mike Powell
Date:     Sat, 21 Feb 2026 13:03:32 -0500
-----------------------------------------------------------
ChatGPT search trail becomes central evidence in South Korea double murder
probe

By Eric Hal Schwartz published 19 hours ago

The deaths of two men trigger a legal test for how generative AI conversations
can be used in prosecution

  * Two men were found dead in separate motels after drinking beverages a woman
    allegedly spiked with prescription drugs.
  * Seoul police say her repeated ChatGPT questions about lethal
    sedative-alcohol combinations show she knew the mixture could be deadly.
  * Investigators argue her chatbot search history proves intent, making it
    central to their upgraded murder charges.

South Korean police have upgraded charges against a 21-year-old woman to murder
after uncovering a disturbing series of queries she apparently typed into
ChatGPT before two men were found dead in separate motel rooms.

Investigators in Seoul say the suspect, identified only as Kim, repeatedly
asked the AI chatbot, in varying phrasings, what happens when sleeping pills
are mixed with alcohol, and at what point the combination becomes dangerous and
eventually lethal. Police now argue that those searches show she knew the risks
long before she served the drug-laced drinks that left two men dead and another
man unconscious.

Authorities had originally arrested Kim in February on the lesser offense of
inflicting bodily injury resulting in death, a charge that often applies when
someone causes fatal harm without intent to kill. That changed once digital
forensics teams combed through her phone. The combination of her earlier
statements and the precise phrasing of her ChatGPT questions convinced
investigators she was not simply reckless or unaware. That evidence now forms
the backbone of a revised case alleging deliberate, premeditated poisoning.

According to police accounts, the first suspected murder occurred on January 28,
when Kim checked into a motel with a man in his 20s and left two hours later.
Staff discovered his body the next day. On February 9, a nearly identical
sequence played out at a different motel with another man in his 20s. In both
cases, police say the victims consumed alcoholic drinks Kim had prepared, into
which investigators believe she had dissolved prescription sedatives.

Detectives uncovered an earlier, nonfatal attempt involving Kim's
then-partner, who later recovered. After he regained consciousness,
investigators say Kim began preparing stronger mixtures and significantly
increased drug dosages. The role of ChatGPT became central to the case once
phone records were decoded. The searches investigators highlighted were not
broad or vague. They were, according to authorities, specific, repeated, and
fixated on lethality.

Police say the search history shows she knew what could happen, recasting the
deaths from unintentional overdoses to planned, studied poisonings. Kim
reportedly told investigators that she mixed the sedatives into drinks but
claimed she did not expect the men to die. Police counter that her digital
behavior contradicts that story. They have also suggested that actions she took
after the two motel deaths further undermine her claims. According to
officials, she removed only the empty bottles used in the mixtures before
leaving the motel rooms, while taking no steps to call for help or alert
authorities. Detectives interpret that as an attempted cover-up rather than
panic or confusion.

ChatGPT poison guide

One of the most striking elements of the case, beyond the violence itself, is
the way generative AI fits into the investigative timeline. For years, police
have relied on browser histories, text logs, and social media messages to
establish intent. The presence of chatbot interactions adds a new category of
evidence. ChatGPT, unlike a traditional search engine, can deliver personalized
guidance in conversational form. When someone asks a question about harm, the
phrasing and follow-ups can reveal not only curiosity but persistence.

For everyday people who use AI casually, the case serves as a reminder that
digital footprints can take on lives of their own. As more people turn to
chatbots for everything from homework help to medical questions, law
enforcement agencies around the world are beginning to explore how these
conversations should be handled during investigations. Some countries already
treat logs from AI services no differently from browser data. Others are still
weighing privacy concerns and legal boundaries.

While the events themselves are tragic, they highlight a new reality.
Technology now sits in the background of many serious crimes. In this instance,
police believe the ChatGPT queries help paint a clear picture of intent. The
courts will eventually decide the extent to which those questions prove guilt.
For the public, the outcome may influence how people think about the privacy,
permanence, and potential consequences of interacting with AI.


https://www.techradar.com/ai-platforms-assistants/chatgpt-search-trail-becomes-central-evidence-in-south-korea-double-murder-probe

$$
--- SBBSecho 3.28-Linux
 * Origin: Capitol City Online (1:2320/105)

-----------------------------------------------------------