A few friends of mine recently shared posts from an influencer named Mia Zelu — styled, poised, and apparently enjoying Wimbledon.
She looked like any other jet-setting model until headlines revealed the twist: Mia wasn’t real. She was an AI-generated persona created by a marketing firm, complete with a growing fan base and lifestyle brand.
From anime-styled idols to photorealistic fashionistas, the ranks of AI influencers are growing fast. Valued at $6.95 billion in 2024, the AI influencer market is projected to reach $45.82 billion by 2030. Around 150 virtual influencers now populate Instagram, with top figures like Lu do Magalu (7.2 million followers) and Lil Miquela (2.6 million) commanding massive engagement.
Some analysts believe AI influencers may soon outnumber human ones. And while human creators still lead in earnings and relatability, what matters more is the shift in perception: these synthetic personas, crafted from code, prompts, and marketing strategy, are no longer seen as novelties. They are being treated — followed, loved, and defended — not as code, but as people.
From Code to Connection
It may seem unsettling. But the idea of following fictional figures is hardly new. We’ve long constructed personas to reflect what we admire, desire, or fear. What’s different now is the source — and the scale.
People Are (Not Quite) People
In anthropology, personhood is more than biology. It’s a culturally recognized status that carries expectations, rights, and meaning. Personhood includes memory, agency, vulnerability. A persona, by contrast, is a mask — a projected role or social identity.
AI influencers perform personas with astonishing fluency — flawless skin, clever captions, curated charm. But they don’t suffer. They don’t remember. They don’t know what it feels like to long for something and to fail.
And yet millions follow them — not just for fashion tips, but for parasocial connection.
“If emotional resonance is all it takes to form attachment, do we still need personhood — or is persona enough?”
Ghosts Again (And Again)
We’ve been here before, in a way.
Millions revere spiritual figures who lived centuries ago — Jesus Christ, Buddha, the saints — whose biographies have been retold until they became moral archetypes. People tattoo Kurt Cobain’s face or quote John Lennon as if his words were scripture. Here at home, we turn Jose Rizal into a meme, reimagine Darna as a feminist icon, and debate whether Ninoy Aquino was a martyr or a myth. Sherlock Holmes has a museum in London and an active fan club, despite being entirely fictional. We cry over the deaths of Marvel characters and defend the legacy of Taylor Swift like kin.
In short: we’ve never needed someone to be alive—or real—to care about them deeply. What matters is narrative coherence. A life we can follow. A story that speaks to our own. It’s not the ontological reality of the figure—it’s the emotional economy they activate.
“We’ve never needed someone to be alive—or real—to care about them deeply.”
Behind the Wheel
So far, most AI influencers are skin-deep: pretty faces, plausible captions, zero complexity. But that’s about to change. To sustain interest and deepen attachment, future AI personas will be built with backstories—complete with childhood trauma, identity crises, redemption arcs, and unfinished dreams.
This isn’t speculative fiction. Creative studios are already hiring writers and worldbuilders to design entire biographies for AI figures: where they grew up, what they fear, whom they love. These aren’t just campaigns. They’re carefully scripted lives—designed for emotional yield.
And because AI models can now simulate language with emotional precision, these backstories can be delivered in ways that feel raw, vulnerable, even transcendent.
We’re not just following influencers anymore. We’re following characters. And the line between the two is dissolving.
Everything Counts (In Large Amounts)
There is comfort in saying, “This isn’t new.” And in many ways, it isn’t.
Humans have always formed emotional bonds with avatars, deities, ghosts, and celebrities. From the Epic of Gilgamesh to the fandoms of K-pop, we’ve built our social worlds around constructed figures.
What’s new is the automation of this process.
AI enables the rapid, scalable creation of emotionally plausible personas tailored to every niche: the quiet queer artist who “gets” you, the woke college student who posts TikTok spoken word in Taglish, the soulful priestess who offers morning mantras. These aren’t one-size-fits-all celebrities. They’re algorithmically customized mirror selves—engineered, immortal, and always online.
What’s also new is responsiveness. AI influencers can reply to your comments, remember what you like, generate content that feels personal. The simulation of intimacy is now participatory.
And finally, what’s chillingly new is the commercial logic. These personas exist not just to entertain or inspire—but to sell. Behind every synthetic self is a monetization pipeline: ads, merch, affiliations, product drops.
Only When I Lose Myself…
We are entering an era of synthetic fame. And with it comes an existential riddle: when everyone we admire is a simulation, what happens to the Self?
This is not a call to panic, but to reflect. The problem isn’t that AI personas are fake. The problem is that our attention—and emotion—is finite. If we are pouring our care into simulations, what are we neglecting in the real?
More importantly, how do we teach the next generation to tell the difference between a persona engineered for engagement and a person who has lived, lost, and grown?
AI influencers will keep rising. Some will be slop. Some will be art. But the question we must keep asking is not whether they are real—but whether what they awaken in us is.
“The human mind doesn’t care about code. It cares about coherence.”
Because in the end, the human mind craves story. And that story—whether told by a prophet, a poet, or a program—must still answer the oldest question we’ve ever asked:
Who am I? Sino ba talaga ako?