I run a marketing agency. We use Claude, ChatGPT, Ahrefs, Semrush. Same tools as everyone else. Same access to the same APIs.
This is the honest part: our tools are not our advantage.
I'm obsessed with AI. I'm a non-native English speaker, and for years my ideas were better than my grammar. AI finally closed that gap. It handles the syntax so I can focus on the substance. I use it to remove the friction between my brain and the page, not the friction between the page and your brain. What follows is not an anti-AI take; it's about the parts of creativity that can't be automated.
The data shows how level that playing field has become. 54% of LinkedIn posts are now likely AI-written (Originality.ai). Roughly 15% of Reddit posts too, up 146% since 2021. Every competitor can generate keyword-optimized, structurally correct, grammatically polished content. In about twelve seconds.
So what's left?
The Inversion
For years, content creation was the bottleneck. Knowing what to write about was relatively easy. Actually producing the content required real investment: research, drafting, editing, iteration.
That equation has flipped.
Production is now trivial. The bottleneck has moved upstream to the input: what you know that isn't in the training data. What you've observed that hasn't been published. What you've learned from doing the work that can't be scraped from the internet.
The question used to be: can you produce enough content? Now it's: do you have anything worth producing?
The Authenticity Test
Not all content needs to pass this filter. Informational articles, how-to guides, reference pages: these serve a purpose. They're somewhere between Wikipedia and a blog post. They cover ground. They answer questions. They're fine.
But what about the content that gets you remembered? The content that gives you a voice? That shapes your character, or your company's character?
For that kind of content, we apply a simple test: could an LLM with access to Google produce this with a single prompt?
If yes, it might serve a coverage function, but it won't differentiate. Someone else will produce the same thing. Probably already has.
If no, because it requires data we collected, patterns we observed, failures we experienced, or specific scenarios we've lived through, then there's something there. Not because of how it's written, but because of what it contains.
Proof of Work
There is a second layer to the filter, one that addresses the laziness epidemic.
We are seeing a flood of "one-click content." Articles generated by a single prompt, reviewed for 30 seconds, and shipped. The problem isn't just that they are generic; it's that the reader can feel the lack of effort.
If I subconsciously detect that you spent 12 seconds creating this, why should I invest five minutes reading it?
In the age of AI, difficulty is a feature, not a bug.
We now look for "Proof of Work" signals. We want to see that the author didn't just ask an LLM to "write a blog post," but used AI as a force multiplier for a complex, difficult process. We want to see custom visualizations, interactive elements, or synthesis of multiple disparate sources. Things that require human orchestration.
The "Bookmark" Game
Before publishing, we play a quick game. We score the piece against these four questions, and if the answer to any of them is no, we don't ship. (For those who prefer it as a literal checklist, there's a rough code sketch after the questions.)
Does this contain a custom visualization, unique dataset, or interactive element that an LLM couldn't hallucinate?
Did you combine at least three distinct sources (e.g., a sales call, support ticket, market report) to reach this conclusion?
Was this physically difficult to write? If it flowed out effortlessly in one go, it's usually fluff.
Would a stranger save this for later? Not just "read and nod," but "save and reference."
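Since a lot of our clients are developer tools companies, here's the same filter as a minimal code sketch. The field names and the example draft are made up for illustration; the only thing it's meant to show is that every question is a hard gate, and a single "no" kills the piece.

```python
from dataclasses import dataclass

@dataclass
class BookmarkCheck:
    """Pre-publish 'Bookmark' checklist. All four answers must be True to ship."""
    has_unique_asset: bool        # custom visualization, dataset, or interactive element
    combines_three_sources: bool  # e.g. sales call + support ticket + market report
    was_hard_to_write: bool       # effortless one-pass drafts are usually fluff
    stranger_would_save_it: bool  # "save and reference," not just "read and nod"

    def ship(self) -> bool:
        # One "no" anywhere means the piece doesn't go out.
        return all((
            self.has_unique_asset,
            self.combines_three_sources,
            self.was_hard_to_write,
            self.stranger_would_save_it,
        ))

# Hypothetical draft: strong data, weak effort -> don't publish.
draft = BookmarkCheck(
    has_unique_asset=True,
    combines_three_sources=True,
    was_hard_to_write=False,
    stranger_would_save_it=True,
)
print(draft.ship())  # False
```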
This isn't about being anti-AI. It's about respect for the reader. If you want attention in a noisy world, you have to pay for it with effort.
What Actually Differentiates
We work with B2B software companies, mostly in cybersecurity and developer tools. These are the patterns we've seen actually create a content advantage:
Internal data nobody else has. Benchmarks from real deployments. Aggregated patterns from customer implementations. Performance metrics that require access to production systems.
Documented failures. What we tried that didn't work, and why. This is oddly hard for AI to generate convincingly because it requires having actually tried things.
Opinions with receipts. A position backed by specific experience. Not "we believe X" but "we ran Y for 18 months and X is what happened."
Access-dependent insights. Conversations with practitioners. Observations from inside client organizations. Context that requires being in the room.
The Zero-Volume Keyword Problem
Here's something we learned the hard way over the past year: everyone sees the same data in SEO tools. Same keyword volumes. Same difficulty scores. Same "opportunity" lists.
Which means everyone targets the same terms.
The actual gold is in conversations that never show up in Ahrefs or Semrush. Sales calls. Support tickets. Board meetings. These are where you hear the language prospects actually use, the specific problems they're trying to solve, the emerging industry terms that haven't hit search volume yet.
We've seen this pattern repeatedly: a term shows "0 volume" in every tool, but it's the exact phrase a CISO uses when they have budget and urgency. Those searches convert at 10x the rate of high-volume head terms. The tools can't see them because there aren't enough searches to register. But each search represents someone ready to buy.
The secret is that these three layers complement each other. High-volume content builds domain authority and captures top-of-funnel awareness. Authority content establishes expertise and earns backlinks. Zero-volume content built from internal insights converts the buyers who know exactly what they need.
Skip the volume layer and you have no foundation. Skip the insight layer and you're leaving money on the table. Most companies over-invest in volume and under-invest in the content that comes from actually talking to customers.
If your competitive advantage is "we write good content," you don't have a competitive advantage. Neither do we. The advantage comes from having something to write about that others don't have access to.
The Tools Question
Someone will ask: if AI tools aren't the moat, why use them?
Because they're infrastructure now. Like spreadsheets. Like email. Using them isn't an advantage, but not using them is a disadvantage. They handle the production part so we can focus on the part that actually matters: acquiring the novel input that makes content worth creating.
We're not against AI tools. We use them constantly. What we're against is the idea that using them well is a strategy. It's a baseline.
The Takeaway
The moat isn't the tool. It's what you feed it. And the only thing worth feeding it is knowledge you've earned by doing work that hasn't been scraped, indexed, and trained on yet.

Yuval Halevi
Helping SaaS companies and developer tools get cited in AI answers since before it was called "GEO." 10+ years in B2B SEO, 50+ cybersecurity and SaaS tools clients.