When crimes, kidnappings, and murders suddenly become a platform to promote any kind of content on the internet.
An adult content model is using a kidnapping case as a platform to promote her photos and videos – highlighting the dark side of true crime more clearly than ever.
Two months ago, Nancy Guthrie, mother of NBC News journalist Savannah Guthrie, was reported missing. The case has attracted worldwide attention, partly because of the many inconsistencies surrounding the disappearance and alleged abduction of the 84-year-old. As a result, it is of particular interest not only to the police – the internet, too, seems eager to conduct its own investigations. The problem: many of the content creators covering the fate of the presenter and her mother are exploiting it merely to maximize their view counts, with no regard for the victim, her family, or the ongoing investigation.
This is perhaps most evident in the case of Kiki X, known online as “Dark Starlette,” who not only presents herself as an AI medium but also exploits the missing-persons case to promote her own explicit images.
Kiki X And The AI Ghost
Kiki X – an influencer, content creator, and adult content model – traveled, like many others, directly to the scene, positioned herself in front of the missing woman’s house, and began producing content. But instead of sober reporting, what dominated was a mix of staging, self-presentation, and emotionally charged storytelling. The videos were not made merely to inform; they were deliberately designed to generate reach.
She used a so-called “spirit box” – essentially a modified radio receiver – and an AI app on her phone to attempt contact with the spirit world, claiming she could uncover the true background of the case this way. The app, an AI-supported speech module that pulls in environmental data such as ambient noise, lighting, and temperature to generate random words and phrases, is little more than a gimmick – yet Kiki portrayed it as the “tool of an AI medium.”
It is almost ironic that the app in question initially referred to the model – who has undergone several obvious cosmetic procedures – as "plastic."
A Harmful Lack of Respect
Later words generated by the device, such as "we" or "Jack," are shaped by Kiki into a supposed reconstruction of the crime. For example, she interprets "we" as a clear sign that Nancy is already dead and that the spirits now see her as one of them, while she assumes that "Jack" is the name of one of the perpetrators.
What may seem to some like just another intriguing facet of the true crime world can have serious negative consequences in many other respects. Imagine how those close to the victim – such as Nancy’s daughter Savannah, who firmly believes she will one day hold her mother alive in her arms – must feel when suddenly confronted online with unfounded claims about her mother’s death. Or consider witnesses who, influenced by Kiki’s videos, come to believe that someone named Jack was involved, potentially steering the police in an entirely wrong direction.
The lines between documentation and performance begin to blur. Crime scenes become backdrops, and real human tragedies are reduced to mere content.
When Kiki uploaded her videos about the case to an adult platform – placing them alongside images and clips in which she poses suggestively, and even using the area around the crime scene for a provocative photoshoot – criticism grew louder that her actions had never truly been about the victim.
A Problematic Business Model
Kiki’s behavior, however, is not an isolated case. True crime has long been one of the most successful genres on the internet. Podcasts, YouTube channels, and TikTok accounts reach millions by retelling real-life crimes. The demand is enormous – and it is precisely this demand that creates an incentive to produce content that is faster, more emotional, and more sensational.
But this development comes with significant problems.
On the one hand, the families suffer. For them, the case is not "content," but a painful reality. When influencers livestream in front of the family's home, speculate, or spread theories, it can be retraumatizing. Their privacy is violated, and grief is made public. A personal loss turns into a collective spectacle.
On the other hand, such content can interfere with actual investigations. Unverified rumors spread rapidly, false accusations can put innocent people under suspicion, and the flood of tips – often driven by speculation rather than facts – ties up resources for police and investigative authorities. What is presented as "help" can, in practice, become a hindrance.
There is also a structural issue: platforms reward attention, not accuracy. Emotional and dramatic content is more likely to be clicked, shared, and commented on. This creates an incentive system for creators that encourages exaggeration and staging. The more extreme the portrayal, the greater the reach.
In the Guthrie case, a tragedy thus turned into a digital competition for visibility. Influencers competed with one another, engaged in public disputes, and tried to push their own version of the story. The actual search for the missing person increasingly faded into the background.
Kiki’s example is representative of this broader trend. It shows how quickly motivations can shift – from supposed awareness-raising to self-promotion. And it raises a fundamental question: where is the line between public interest and moral responsibility?