    Opinion · Growtika · March 14, 2026

    You are the last generation to remember when the internet was mostly human.

    Digg is the proof of concept. What killed it isn't a bug to be patched. It's the new economics of every open platform, running permanently.

    Yuval Halevi · Founder, Growtika · 9 min read
    0% synthetic

    For a few years in the mid-2000s, Digg was the front page of the internet.

    Founded in 2004 by Kevin Rose, it grew into one of Silicon Valley's most-watched startups. Users voted stories up or down, and the best rose to the top. By 2008, Digg was attracting over 236 million visitors a year, and Google entered advanced acquisition talks to buy it for a reported $200 million. The deal fell through. Reddit launched a year after Digg, in 2005, and for a while the two coexisted and competed directly.

    Even at its peak, Digg had trust problems. As early as 2006, users tried to game the voting system. Organized groups buried content they disagreed with. Others charged hundreds of dollars to push stories to the front page. Digg hired engineers to build diversity algorithms to fight it. It was a constant arms race, but a manageable one, because gaming the system still required humans spending real time.

    Then Digg redesigned in 2010. The community revolted. Its audience dropped by a quarter in a single month, and much of its user base migrated to Reddit. Alexis Ohanian, Reddit's co-founder, wrote an open letter at the time, arguing that the redesign had departed from what made Digg special: giving the power back to the people.

    The site never recovered. In 2012, the company that once nearly sold for $200 million was broken up and sold in parts. The brand went for $500,000. The domain changed hands again. The name became a footnote.

    Last year, Rose bought it back, together with Ohanian. They raised money, rebuilt from scratch, and opened a Groundbreakers early-access program that hit its 23,000-member cap within days. The demand was real. In January 2026, they launched a full public beta. The vision: a modern community platform with real moderation, verified users, and an open algorithm. A second chance.

    On March 13, 2026, eight weeks after launch, the CEO published a post on the homepage. Most of the team was gone. The app was gone. The post-mortem was brief:

    "Within hours, we got a taste of what we'd only heard rumors about. The internet is now populated, in meaningful part, by sophisticated AI agents and automated accounts. We knew bots were part of the landscape, but we didn't appreciate the scale, sophistication, or speed at which they'd find us."
    Justin Mezzell, CEO of Digg, March 13, 2026

    The bot problem was real. So was another problem the post-mortem buried in a single sentence: "positioning Digg as simply an alternative to incumbents wasn't imaginative enough." This essay is about the first problem, the one that doesn't get fixed by better moderation, and why it's going to keep happening to every platform that comes after Digg.

    Digg's failure is being reported as a moderation story: overwhelmed by bots, couldn't keep up, needed better vendors. Some outlets are framing it as a Dead Internet moment, the long-running theory that most online activity is already synthetic. That framing is closer to the truth, but it still misses the actual mechanism.

    What hit Digg wasn't chaos. It was the same incentive that's driven spam since Google made links valuable: flood a platform with inauthentic content, extract the authority, move on. For years that required real human effort: content farms, hired posters, organized link schemes. Costly, detectable, containable. What changed is the cost dropped to near-zero.

    Authority Arbitrage
    aw-THOR-ih-tee AR-bih-trazh
    noun  ·  platform economics
    Flooding a platform that has real authority with inauthentic content to extract traffic, backlinks, or leads, at near-zero cost per post.
    Why it happens
    New platforms inherit domain authority from previous versions or earn it fast through press. Automated systems detect this authority gap instantly. The cost of posting inauthentic content is near-zero. The return is real: traffic, links, leads. It's rational.
    What's new about it
    Automated account flooding and content farms have existed for 20 years. The mechanism isn't new. What's new: AI-generated content is indistinguishable from genuine posts, costs fractions of a cent, and scales without humans. The arbitrage window used to close as platforms matured. Now it opens within hours of launch.
     
    What changed
    Before AI
    High cost to post fake content
    Spam existed. A low-quality article cost a few dollars. A content farm post took minutes of human time. Detectable, expensive, containable.
    2026
    ~$0 cost to post fake content
    AI generates human-quality posts at fractions of a cent. Not literally zero, but close enough that the math that once made moderation viable no longer works.
    The incentive has always existed. What changed is how much it costs to act on it. A human writer charges $50-200 per article. A content farm worker in 2015 charged $1-5. An AI agent in 2026 costs fractions of a cent per post, writes at human quality, and never sleeps. No moderation budget scales to meet a cost that approaches zero. The only structural fix is making participation expensive again, specifically for the actor doing the flooding.
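    The arithmetic behind that claim can be sketched directly. This is a back-of-envelope illustration using the essay's own per-post figures; the flood size of 100,000 posts is an assumed number for illustration, not a reported one.

```python
# Back-of-envelope sketch of the cost asymmetry described above.
# All per-post figures come from the essay; the flood size is hypothetical.

def flood_cost(posts: int, cost_per_post: float) -> float:
    """Total cost for an attacker to publish `posts` pieces of content."""
    return posts * cost_per_post

POSTS = 100_000  # a modest flood against a newly launched platform

human_writer = flood_cost(POSTS, 50.00)   # $50 per article, low end of $50-200
content_farm = flood_cost(POSTS, 1.00)    # $1 per post, 2015-era farm rate
ai_agent     = flood_cost(POSTS, 0.002)   # fractions of a cent per post

print(f"Human writers: ${human_writer:,.0f}")   # $5,000,000
print(f"Content farm:  ${content_farm:,.0f}")   # $100,000
print(f"AI agents:     ${ai_agent:,.0f}")       # $200
```

    A moderation budget that comfortably absorbs a $5 million attack, or even a $100,000 one, has no answer to the same attack at $200.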
    01

    Same week. Two stories. Nobody connected them.

    Same week, March 2026 · Mar 10 → Mar 13
    March 10, 2026
    Meta acquires Moltbook: the social network built entirely for AI agents.

    "A registry where agents are verified and tethered to human owners." (Vishal Shah, VP of AI Products at Meta, as reported by Axios, March 10, 2026.) The identity infrastructure for AI agents, purchased and scaled by the largest social platform on earth.

    The same force, now with a verified identity layer, institutional backing, and a $115-135B capex budget for 2026.
    March 13, 2026
    Digg announces a major team layoff. Reason: inauthentic accounts overwhelmed the platform before any real community could form.

    Tens of thousands of accounts, many producing human-quality AI content indistinguishable from genuine posts. Industry-standard vendors. None of it was enough. When you can't trust the votes, comments, and links, the platform has nothing.

    Platform destroyed by agents with no verified identity.
    The unverified version destroys small platforms.  |  The verified version gets absorbed into the dominant one. Same behavior. Different accountability layer.

    Digg was killed by the unverified version of AI agents: anonymous, unaccountable, cheap to spin up, impossible to distinguish from genuine users. Moltbook built the verified version: every agent linked to a real human account, every action technically attributable to an owner. Meta bought it.

    Think about what that means. The "Wild West" killed Digg. Meta just bought the deed to the ranch.

    The identity layer for AI agents is going to be owned by whoever controls the registry. Right now that's Meta. Meta has done this before. Instagram was a small photo app. WhatsApp was a scrappy messenger. Meta bought both, scaled them to billions of users, and they became defining infrastructure. Moltbook is tiny today. The question is whether its agent registry becomes what WhatsApp became for messaging. When one company with 3.6 billion daily active users decides which agents are verified, the question changes. It's no longer "how do we stop inauthentic content?" It becomes: who gets to decide what authentic even means?

    This isn't humans versus bots. It's small platforms versus whoever ends up owning the identity infrastructure. Digg was an early casualty. The thing being fought over is much bigger than Digg.

    02

    Every platform that collapses sends its users to Reddit, Meta, or Discord.

    The consolidation mechanism (platform names are fictional except Digg):
    Community platform (alive) → Overrun by inauthentic accounts → Dead (users migrated) → Incumbent (growing)

    Every platform that collapses under inauthentic traffic sends its users somewhere. They go to Reddit, Meta, Discord, the platforms big enough to defend themselves. Authority arbitrage is a tax on being small. Every small platform that dies makes the big ones stronger. Nobody writes about that part.

    03

    Content is infinite. Engagement is infinite. Only one thing is scarce now.

    Resource            Before AI                            2026
    Content             Expensive: $50–$200 per article      ~$0: AI generates at fractions of a cent
    Engagement          Costly: required human time          ~$0: bot clicks, votes, likes
    Verified presence   Free: being human was the default    Scarce (THE FLIP): the only thing left that's real

    The existing internet business model rewards manufacturing engagement. The raw material of authority arbitrage is now free. The thing that became genuinely scarce: verified human presence. There is no price signal for it anywhere online. The first platform that figures out how to charge for verified human presence will be hard to kill.

    04

    Friction is the product.

    Which platforms can still resist, and which can't

    Already overrun: inauthentic engagement is indistinguishable from real. The signal is gone.
    Voting aggregators: a vote is a click. Automating it takes seconds. This is the Digg model. The economics no longer work.
    General social feeds: verified agent registries like Moltbook make inauthentic posts legally attributable, and still unstoppable at scale.

    Under pressure: the friction that once protected these platforms is eroding as AI improves.
    Code repositories: used to require working code. AI writes working code now. The gap is closing faster than GitHub can respond.
    Q&A platforms: Stack Overflow has seen significant traffic decline since LLMs arrived. Whether the community signal survives that pressure is an open question.

    Still defensible: natural friction that is genuinely difficult to automate. Not impossible, but costly enough to matter.
    Niche expert forums: being wrong in front of 200 domain experts has a real social cost. Hard to fake credibly at scale.
    Single-author publications: readers follow a person, not a topic. The author is the product. Identity is the moat.
    Live and real-time: podcasts, streams, in-person. Presence is still hard to fake convincingly in real time.

    The right question is not "how do we build trust?" That has no good answer when agents participate for free. The right question: what creates friction for agents that does not create friction for humans? That's a narrow design space. Most current platforms fail the test.

    05

    Survivors will look like clubs, not networks.

    The invite-only answer is obvious. Everyone lands on it. A one-time $5 fee killed 99% of MetaFilter's spam when the site reopened signups in 2004. It still works. What's less obvious is what happens at scale.
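    The reason a tiny fee works is expected value per throwaway account, not the dollar amount. A hedged sketch, with hypothetical numbers (counts in integer cents to keep the arithmetic exact):

```python
# Illustrative model of why a small one-time signup fee flips spam economics.
# All figures are hypothetical; the mechanism is expected value per account.

def account_roi_cents(fee_cents: int, posts_before_ban: int,
                      value_per_post_cents: int) -> int:
    """Expected profit, in cents, from one disposable account
    before moderation catches and bans it."""
    return posts_before_ban * value_per_post_cents - fee_cents

# Free signup: even near-worthless posts are profitable in bulk.
free = account_roi_cents(fee_cents=0, posts_before_ban=50,
                         value_per_post_cents=1)
print(free)   # 50  (a $0.50 profit per account, repeated millions of times)

# $5 fee: the same account is now a loss, so the flood stops being rational.
paid = account_roi_cents(fee_cents=500, posts_before_ban=50,
                         value_per_post_cents=1)
print(paid)   # -450  (a $4.50 loss per account)
```

    The fee doesn't need to deter a determined human; it only needs to make the marginal disposable account unprofitable.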

    The clubs survive. But Meta just bought the verified identity layer for AI agents. If one company owns the registry that says which agents are real and which aren't, then the question isn't "how do we stop fake content?" The question is who controls the definition of real.

    Small communities with real friction will be fine. The problem is the platforms that reach a billion people. Those are consolidating around whoever controls the identity infrastructure. Right now that's shaping up to be Meta. The open internet isn't just being overrun. It's being enclosed.

    Open Network
    Anyone can join  ·  3.2M accounts
    [Interactive graphic: a grid of 60 account avatars with generic handles (user_482, farm2, agent77, gpt9, bot22, and so on). Nothing in the names, avatars, or posts distinguishes the inauthentic accounts from the real ones, and the reader is challenged to guess how many of the 60 are inauthentic.]

    Automated accounts arrived before real users built anything worth protecting. The content looks real: AI-generated, human-quality. You can't tell the difference. Neither can the platform.

    Closed Club
    Invitation only  ·  843 members
    elena_v · Product @ Stripe · verified
    marcus_t · Founder, 2x · verified
    reza_k · ML Eng @ Anthropic · verified
    sara_m · VC Partner · verified
    jay_o · Security researcher · verified
    priya_d · CTO @ Series B · verified
    chen_l · Ex-Reddit community · verified
    nina_b · Journalist, The Atlantic · verified

    Every person here was vouched for by someone already inside. Fewer people. Every single one is real.

    THE INVERSION: Growth was always proof of value. When authority arbitrage is the default, growth is the vulnerability. A room with 8 verified humans produces more real signal than a stadium of 3 million anonymous accounts.

    This inverts twenty years of platform thinking. The platforms that survive the next five years won't be the ones that scaled fastest. They'll be the ones that made membership mean something, that made joining require something authority arbitrage can't trivially overcome: a real person staking their reputation that you belong.
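    What "staking a reputation" means mechanically is that every account records who vouched for it, so bad behavior traces back through a chain of sponsors. A minimal sketch, not any real platform's design; the member handles are borrowed from the club illustration above, and "spam_acct" is a hypothetical bad actor:

```python
# Toy vouch graph for an invite-only community: each member maps to the
# member who vouched for them (None marks a founding member).
vouched_by = {
    "elena_v": None,        # founding member
    "marcus_t": "elena_v",
    "reza_k": "marcus_t",
    "spam_acct": "reza_k",  # hypothetical account that turns out inauthentic
}

def accountability_chain(member: str) -> list[str]:
    """Walk the vouch graph from a member back to a founding member,
    returning everyone whose reputation is on the line."""
    chain = []
    while member is not None:
        chain.append(member)
        member = vouched_by[member]
    return chain

print(accountability_chain("spam_acct"))
# ['spam_acct', 'reza_k', 'marcus_t', 'elena_v']
```

    When a flood account is caught, the cost lands on real people up the chain, which is exactly the kind of friction that is cheap for humans and expensive for anonymous agents.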

    The internet you knew, where votes meant humans voted, where comments meant humans commented, is not at risk.
    It's already gone.
    Note on this piece

    I run an SEO agency, so I've watched these economics from both sides. This is an opinion essay based on publicly reported events. All quotes are attributed to their sources. We're human (believe it or not) — if we got something wrong or you have a correction, contact us and we'll fix it or add a note.