How a Viral App Exposed This AI Data Security Risk

Seventeen-year-old Emma thought she'd struck gold.

A new mobile app called Neon was paying cold, hard cash just to record her phone calls.

"Phone companies profit off your data. Now, you can too," boasted the company's website.

So Emma recorded everything. Calls with her boyfriend... her therapist appointment... a conversation with her grandmother about college applications. In one week alone, Emma pocketed more than $40.

Still, there was a trade-off... Neon collected users' voice data in order to sell it to artificial-intelligence ("AI") companies for training new AI models.

Within weeks, Neon had rocketed to No. 2 on Apple's App Store. TikTok exploded with teenagers flaunting their earnings.

Then, TechCrunch made a horrifying discovery: Neon's entire database was exposed on the Internet. Every recording. Every transcript. Every phone number.

Overnight, the app vanished. But millions of personal conversations were already out there.

Now, "Emma" isn't a real teenager. But stories like this were the reality for those who signed up for Neon.

And as we'll explain, this isn't just a story about one reckless app. It's the front line of AI's data security problem. And the investors who are first to spot the companies solving it stand to gain the most...

The Growing AI Data Gap

Neon wasn't unique. It was just sloppy. Every day, dozens of AI companies collect data the same way...

The entire AI industry runs on one brutal equation: Better data equals better AI. And human conversations – messy, emotional, and unguarded – are some of the best data of all.

But the wells are running dry. Major AI models have consumed vast amounts of text data – GPT-3 alone trained on roughly 45 terabytes, equivalent to approximately 90 million novels.

And the industry can't simply go out and find more: Elon Musk warned in January that AI companies have "exhausted basically the cumulative sum of human knowledge" for training purposes.

So AI developers are scrambling to fill the gap. Increasingly, they're turning to synthetic, AI-generated content rather than fresh human data.

Reddit, a gold mine of authentic human conversations, now charges millions of dollars for data access that used to be free. And The New York Times sued OpenAI for billions of dollars, claiming the company used its copyrighted articles to train its chatbot.

The consequences for privacy infractions are severe. Europe's AI Act threatens fines of up to 7% of global revenue. And California's own penalties kick in next year.

Four Ways to Invest in AI Dominance

The answer to the privacy trap must lie beyond human data harvesting.

While competitors hoard personal information like digital gold, four companies have discovered something better: how to profit by solving AI's privacy problem...

Cloudflare (NET) is becoming the Internet's privacy gatekeeper for the AI era. It runs one of the world's largest networks, blocking more than 227 billion cyberthreats daily. Cloudflare promises never to use customer content to train AI models... And since 2024, it has given its clients one-click power to block AI data scrapers like GPTBot. More than 1 million customer websites have already switched that block on.
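
For readers curious about the mechanics, the core of that one-click block is conceptually simple: refuse requests from crawlers that identify themselves as AI scrapers. Below is a minimal Python sketch of the idea (our own illustration, not Cloudflare's actual code, and the function names are ours), using crawler names that the AI companies themselves publish:

# Illustrative only -- not Cloudflare's implementation.
# Known AI-crawler User-Agent tokens published by their operators
# (OpenAI's GPTBot, Common Crawl's CCBot, Anthropic's ClaudeBot).
AI_CRAWLER_TOKENS = ("GPTBot", "CCBot", "ClaudeBot")

def is_ai_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent header matches a known AI scraper."""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in AI_CRAWLER_TOKENS)

def block_ai_crawlers(app):
    """WSGI middleware that returns 403 Forbidden to AI scrapers."""
    def middleware(environ, start_response):
        if is_ai_crawler(environ.get("HTTP_USER_AGENT", "")):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"AI crawlers are not permitted on this site.\n"]
        return app(environ, start_response)
    return middleware

The difference at Cloudflare's scale is that this kind of filtering happens at its network edge, for all of its customers at once, which is what makes a one-click toggle possible.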

Palo Alto Networks (PANW) is creating guardrails for mission-critical AI. Its Prisma AI Runtime Security platform is designed to sit between AI models and business operations, inspecting every prompt and response in real time. If an employee accidentally tries to feed in proprietary code – or if a model tries to output customer credit-card numbers – Prisma can block it before the damage occurs.
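
To make the "guardrail" idea concrete, here is a toy Python sketch (our own illustration, not the Prisma product itself, with made-up function names) of a filter that scans a model's response for strings that look like payment-card numbers and blocks the reply when one passes the standard Luhn checksum:

import re

# Illustrative only -- a toy version of the "inspect every response" idea,
# not Palo Alto Networks' Prisma AI Runtime Security.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Luhn checksum: filters out random digit strings that aren't card numbers."""
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def guard_response(model_output: str) -> str:
    """Block model output that appears to contain payment-card numbers."""
    for match in CARD_PATTERN.finditer(model_output):
        if luhn_valid(match.group()):
            return "[BLOCKED: response appeared to contain payment-card data]"
    return model_output

A production platform layers many such detectors, covering source code, credentials, and personal data, and applies them to inbound prompts as well as outbound responses.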

CrowdStrike (CRWD) turns the chaos of cyberattacks into intelligence. Its Falcon platform processes trillions of events every day, and every threat it prevents is fed back into its system, creating a loop where the AI grows sharper with each attack. CrowdStrike also monitors how employees use generative AI – from ChatGPT to local apps – and keeps sensitive data from leaking out.

Snowflake (SNOW) helped pioneer the "data clean room" – a secure vault where companies can analyze or train AI on shared datasets without exposing raw information. Nearly 30% of the Fortune 500 now use Snowflake's platform, with major brands like Disney, Netflix, Roku, and Samsung leveraging data clean rooms for privacy-preserving advertising optimization, fraud detection, and supply-chain collaboration.

These companies understand the value of AI that learns from patterns, threats, and synthetic scenarios – instead of private conversations.

AI Adoption Rides on This Moment

Folks thought they were getting paid by Neon for their conversations...

Instead, they were paying the real cost. And this story is about more than one company's privacy violation. It's the canary in the coal mine for an industry built on a foundation of surveillance.

We're witnessing AI's great reorganization. The companies that figure out how to build intelligence without violating privacy will inherit the digital world. The ones that can't move on from harvesting human data will face an endless cycle of scandals, lawsuits, and regulatory penalties.

This shift is a survival strategy. In five years, "we protect your privacy" will be the only business model that works.

Investors face the same choice... Bet on the companies clinging to old, poorly regulated models – or on the leaders turning AI into a trusted and secure technology.

Good investing,

Josh Baylin


Editor's note: According to our colleague Whitney Tilson, a massive shift is underway in America. It's a growing divide between those benefiting from AI and those getting left behind. But you don't have to sit on the sidelines. Whitney says one breakthrough could help you massively outperform stocks, bonds, gold, or real estate... all with the power of AI.

Further Reading

"In a world of automation, the edge belongs to the irreplaceable," Josh writes. As AI takes over some parts of education, the value of human-centered learning is growing. And that offers a major opportunity for companies that are doing what AI can't.

"Look for companies that are building things of value in the present," Dave Lashmet says. The value of a company isn't based on last quarter's sales... Instead, it's found in how it's investing in the future right now.
