Selling to humans is so 2019


Since the first recorded advertisement, an Egyptian papyrus promoting a fabric shop roughly 3,000 years ago, marketing has always been about persuading people. Across eras and media, from town criers to television commercials to social media feeds, the objective remained constant: connect a product to a human emotion and drive desire. Coca-Cola didn’t sell sugar water; it sold Christmas. Apple didn’t sell computers; it sold rebellion and creativity.

 

This model worked because humans make decisions emotionally first and rationally second, no matter how much we like to believe otherwise.

 

That assumption is now breaking.

 

Today, a growing number of consumers no longer begin their buying journey by asking friends, reading reviews, or scrolling endlessly through search results. Instead, they consult a large language model, ChatGPT or one of its peers, as a decision gatekeeper. Before purchasing, they ask: What’s the best product for me? Compare my options. What should I choose?

 

In effect, marketers now face a new audience: not just humans, but the AI systems humans trust to think on their behalf.

 

Is this just SEO all over again?

 

At first glance, selling to AI might seem like a rebranding of search engine optimization. SEO taught marketers to reverse-engineer algorithms so their products appeared when consumers searched relevant keywords. Visibility was the goal—be present everywhere, be mentioned often, dominate the first page.

 

LLMs change the game.

 

You are no longer trying to be findable by an algorithm. You are trying to be defensible to an interpreter. Instead of ranking pages, an LLM synthesizes knowledge, weighs sources, resolves contradictions, and produces a recommendation. Spamming the internet with your product name is far less effective than ensuring your product survives scrutiny.

 

In short: SEO rewarded exposure. LLMs reward epistemic confidence.

 

The new gatekeeper

 

Think of the LLM as a highly analytical advisor. When asked about a product, it doesn’t respond to emotional storytelling, influencer hype, or glossy taglines. It privileges coherence over charisma.

 

What does it value?

  • Verifiable data and statistics

  • Independent third-party validation

  • Peer-reviewed research and white papers

  • Transparent case studies with clear methodology

  • Consistent messaging across reputable sources

 

What does it discount? (Note that AI still takes these into account, but gives them much less weight.)

  • Purely promotional language

  • Unsubstantiated superlatives

  • Emotional appeals without evidence

  • Incoherent or contradictory claims

 

This isn’t because AI is “rational” in the human sense. It is conservative. It seeks consensus stability—the safest synthesis of what credible sources agree upon. Where humans tolerate paradox and cognitive dissonance, LLMs penalize them.

 

Why this matters for marketing

 

This shift quietly undermines many pillars of modern marketing. Emotional storytelling, influencer amplification, and brand myth-making still matter—but they increasingly operate downstream of AI recommendation engines.

 

If an LLM doesn’t recommend you, a growing share of consumers may never emotionally engage with your brand at all.

 

To be recommended, brands must now persuade an epistemic intermediary. That requires a different toolkit:

  • Structured evidence instead of slogans

  • Cohesive narratives instead of fragmented campaigns

  • Credibility built through documentation, not just reach

  • Messages that withstand comparison, not just attention

 

Marketing is moving from affective persuasion toward auditability.

 

The uncomfortable conclusion

The future of advertising may look less like Madison Avenue and more like a research department. Scientific papers, benchmark reports, transparent performance data, and analytically honest positioning are no longer niche assets—they are becoming core marketing infrastructure.

 

Humans still buy emotionally. But increasingly, machines decide what options humans get to feel emotional about.

