
Caitlin couldn't stop generating AI images of herself, until she had a psychotic break


At first, the AI seemed magical. She would think of an idea, write a few words and, seconds later, see herself in any situation she could imagine: floating on Jupiter, with a halo and angel wings. The magic wore off when she started hearing voices telling her she could fly.

Mental health professionals are beginning to warn about a new phenomenon some are calling "AI psychosis": situations in which people descend into delusional thinking, paranoia or hallucinations triggered by their interactions with intelligent systems.

In some cases, users begin to interpret chatbot responses as personally meaningful, conscious, or as private messages and signs meant specifically for them.

But with the rise of hyperrealistic AI-generated images and videos, there is now a much more potent psychological risk, researchers say, especially for users with pre-existing vulnerability to psychosis.

Two years ago, Caitlin Ner, director of a venture capital firm in North America, experienced this firsthand.

At the time, Caitlin was responsible for user experience at a consumer-facing AI image startup, spending up to nine hours a day writing prompts for systems still in early development in order to improve the company's models.

In a Newsweek article, Caitlin says she was previously diagnosed with bipolar disorder but was stable with medication and therapy.

At first, Caitlin says, the AI seemed like magic. She quickly began using prompts to create images of herself in different situations.

She would think of an idea, write a few words, and seconds later see herself in any situation she could imagine: floating on Jupiter; with a halo and angel wings; as a superstar in front of 70,000 people; as a zombie.

But within a few months, the spell turned into mania.

When Caitlin started working with the AI, the images were still unpredictable. They sometimes came out with distorted faces, extra limbs and unwanted nudity.

She spent endless hours sifting through the output to remove anomalies, and she believes the exposure to so many disturbing human forms began to distort her perception of her own body and overstimulate her brain in ways that genuinely harmed her mental health.

As the tools became more capable, the generated images tended toward idealized forms: fewer blemishes, smoother faces and thinner bodies. Repeatedly viewing these images "reprogrammed" her sense of what was normal. When she looked at her real reflection, Caitlin saw something that needed fixing.

Her recurring thought was, "If only I looked like my AI version..." She became obsessed with losing weight, having a better body and having perfect skin.

Her work hours grew longer and her sleep grew scarcer as she filled the time with an endless series of AI-generated images. The process itself was addictive: each new image delivered the satisfaction of a small dopamine hit. There was always another idea, another version to try, another picture to create.

Shortly thereafter, she suffered a breakdown that tipped into psychosis. She could no longer distinguish fact from fiction. She saw patterns that did not exist, signs in the outputs that seemed to be messages meant for her.

Staring at these images, Caitlin began to experience auditory hallucinations that sat somewhere between the AI and her own mind. Some voices were soothing; others mocked her or shouted at her. She responded as if they were real people in the room talking to her.

When she saw an AI-generated image of herself on a flying horse, she began to believe she could actually fly. The voices told her to fly off the balcony and assured her she would survive. That delusion nearly drove her to jump.

After several sleepless nights, she broke down physically and emotionally. The euphoria gave way to exhaustion, fear, depression and confusion.

It was one of the most frightening experiences of her life. The first step toward getting through the episode was reaching out to friends and family who knew about her mental illness. She quit the AI startup.

No longer being exposed to AI-generated images every day helped her stabilize, although she only realized her work had triggered the episode when she sought clinical follow-up and explained what had happened.

Recovery took time, therapy and integrative medicine. Since then, she has built a more balanced relationship with technology. She still uses AI, but within strict limits: no prompting late at night, for example, and no endless iterating.

She also came to see herself again as she really is. The mirror is no longer her enemy.

Today, she understands that what happened was not merely a coincidence of mental illness and technology. It was a form of digital addiction fueled by AI-generated imagery itself.

According to Caitlin, we are building tools that blur the line between imagination and reality. That is beautiful, but also dangerous, especially for people living with psychological vulnerabilities.

AI can be a source of inspiration and positive visualization, and it is here to stay. But Caitlin, who works in mental health, also believes the tech sector needs to take ethical concerns more seriously, because for people like her, and for the many who operate on the fringes of machine innovation, the line between inspiration and instability is thinner than we think.
