That Ain’t It: When AI Turns Legacy Into Engagement Bait

Introduction

“You know, it gets to a point where enough is more than enough…”

And that’s exactly where we are.

A viral AI-generated video has been circulating across TikTok, Instagram, X, and Facebook, capturing millions of views and even more divided opinions. Created by nyei.ai, the video places fallen cultural icons, including Tupac Shakur, The Notorious B.I.G., Selena Quintanilla, Michael Jackson, Kobe Bryant, and Whitney Houston, inside a fictional award show setting, where they appear animated, present, and celebrated as if they never left.

On the surface, it is polished, cinematic, and emotionally engineered to feel like a tribute. The lighting, the staging, and the expressions are all designed to evoke nostalgia and admiration. 

For some, it lands that way. They see it as creative. They see it as honoring legacies through modern tools.

I don’t. (And before any of you AI zealots try to come for me, don’t. I can string some “sentence enhancers” together and send them your way that will make you rethink your life choices.)

Because once you sit with it for more than a few seconds, the feeling shifts. What initially looks like homage begins to feel like something else entirely. And that discomfort is not random. It is rooted in something deeper than personal preference or resistance to new technology.

It is about boundaries.

The Trending Topic

The video uses artificial intelligence to reconstruct the likenesses of deceased celebrities and place them in a fabricated environment: a fictional award show that never existed. In this imagined setting, these individuals are shown interacting, celebrating, and occupying space in a moment that feels real but is entirely manufactured.

This is not archival footage. It is not restoration of lost media. It is not a documentary interpretation grounded in truth. Instead, it is a fully synthetic re-creation of people and moments, designed to simulate presence and provoke emotion.

And that distinction matters.

Because when you recreate real people in fictional settings, you are no longer preserving history. You are rewriting it. Even if the intention is positive, the outcome shifts from documentation to fabrication. And fabrication, especially when applied to real individuals with real legacies, is never neutral.

Why It Feels Off

The discomfort surrounding this video is not rooted in a lack of understanding of AI. Nor is it a rejection of innovation. The technology is advanced, and the execution is impressive from a technical standpoint.

But that is not the issue.

The issue is that this kind of content removes agency from people who can no longer speak for themselves. 

None of these individuals (nor their respective estates) consented to being recreated, repositioned, or reimagined in this way. They did not approve the setting. They did not approve the narrative. They did not approve the use of their likeness to generate engagement in a digital environment.

And when you are dealing with figures like Tupac, Biggie, Selena, Whitney, and Kobe, you are not just dealing with celebrities. You are dealing with individuals whose lives and legacies are deeply tied to cultural identity, collective memory, and, in many cases, unresolved grief.

Reconstructing them in a fictional setting without context or consent does not feel like celebration. It feels like appropriation of memory.

Tribute vs. Exploitation

There is a meaningful difference between honoring someone and using them, and that line becomes even more important when the individuals in question are no longer here to protect their image.

A real tribute preserves truth. It respects the context of a person’s life and contributions. It centers their actual story, not an imagined version of it.

This video does the opposite. It reimagines reality, blurs the line between fact and fiction, and ultimately centers the creator’s vision over the individual’s lived experience.

That shift may seem subtle, but it changes everything.

Because in a content-driven economy where attention is currency, moments like this begin to look less like homage and more like strategy. Legacy becomes a tool. Emotion becomes a lever. And cultural icons become vehicles for engagement.

That is where the line is crossed.

The Bigger Issue: Culture as Raw Material

What we are witnessing is not just one isolated piece of content. It is a reflection of where technology is heading and how quickly it is outpacing ethical guardrails.

AI now allows creators to replicate faces, voices, and entire personas with increasing accuracy. It enables the creation of moments that feel real but never happened. And in doing so, it turns culture into something that can be remixed, repurposed, and redistributed at scale.

That creates a fundamental risk, especially when media literacy rates are low, attention spans are rapidly shrinking, and common sense is…well, that isn’t a vegetable grown in everyone’s garden to begin with, so I digress.

Because historically, many of the individuals represented in this video, particularly Black artists and entertainers, have already experienced limited control over their narratives, likeness, and economic value. The introduction of AI into this dynamic does not reset that history. It extends it.

Now, even in death, control over image and representation can be bypassed entirely.

That is not progress. That is a continuation of an existing pattern, amplified by technology.

Where Creators and Platforms Get It Wrong

The core issue here is not capability. It is judgment.

There is a growing assumption that if something is technically possible, it is also acceptable. That if something performs well, it must be valuable. And that if something evokes emotion, it must be meaningful.

None of those assumptions hold up under scrutiny.

Viral content is not always responsible content (ask Kendall Jenner and Pepsi). Emotional content is not always respectful content (ask Dove). And creative execution does not automatically justify creative direction (ask H&M).

This is where both creators and platforms have to take a step back and reassess.

Because without that pause, the line between innovation and exploitation will continue to blur.

What Should Happen Next

If the goal is to honor legacy, there are ways to do that without crossing ethical boundaries.

Creators and brands should prioritize authenticity by using real footage, real stories, and real context. If AI is used, it should be applied in ways that enhance understanding rather than fabricate new realities.

More importantly, there needs to be a stronger emphasis on consent and transparency.

That means:

  • Engaging estates or representatives before using likeness

  • Clearly labeling AI-generated content as synthetic

  • Avoiding fictional scenarios that place real individuals in imagined contexts

  • Ensuring that storytelling remains rooted in truth

At the platform level, AI tools must evolve to include safeguards.

This includes:

  • Likeness protection systems for public figures

  • Mandatory disclosure tags for generated media

  • Opt-out mechanisms for families and estates

  • Content moderation policies that flag unauthorized recreations

Right now, the barrier to misuse is low, and the incentives to push boundaries are high. That imbalance needs to be addressed.

Final Thoughts

AI will continue to evolve. That is not in question.

What is in question is whether our standards will evolve with it.

Because if we do not define clear boundaries now, we risk normalizing a world where legacy can be reconstructed, repackaged, and redistributed without accountability (see the rebranding of Hitler). A world where memory becomes content, and content becomes detached from the people it represents.

And when that happens, we lose something that cannot be recreated by any algorithm.

We lose respect.

And well, that ain’t it.
