The Greatest AI Scam in History
How Builder.ai Fooled the World
The promise of Artificial Intelligence is the siren song of our generation. Slap a “.ai” on your startup’s name, and you’ll have venture capitalists lining up at your door, ready to shower you with cash. It’s the dot-com bubble all over again, but with more algorithms and less dial-up. This hype, however, has a dark side, and no story illustrates this better than the spectacular collapse of Builder.ai, a company that promised to revolutionize app development but instead delivered a masterclass in deception.
The Rise and Fall of a Unicorn
Builder.ai, once valued at a staggering $1.5 billion, sold a simple yet powerful dream: building a custom app should be as easy as ordering a pizza. Its charismatic founder, Sachin Dev Duggal, charmed investors and the media with this vision, securing over $450 million in funding from giants like Microsoft, SoftBank, and the Qatar Investment Authority. The company’s AI, affectionately named “Natasha,” was supposedly able to take your app idea and, with the help of a vast library of pre-built features, assemble a functional application with minimal human intervention.
The problem? It was a lie.
While Builder.ai did employ a small team of engineers working on legitimate AI tools, the core of their operation was a far cry from the automated powerhouse they advertised. Instead of a revolutionary AI, “Natasha” was largely a front for over 700 human engineers in India and other countries, manually piecing together apps. This “AI washing” was just the tip of the iceberg.
The real scandal, the one that brought the whole house of cards down, was good old-fashioned financial fraud. An internal audit revealed that Builder.ai had inflated its 2024 revenue by a jaw-dropping 300%, claiming $220 million when the actual figure was closer to $50 million. This was achieved through a “round-tripping” scheme with another Indian company, VerSe Innovation, where they would invoice each other for non-existent services, creating the illusion of massive revenue growth.
The charade couldn’t last forever. In May 2025, after a creditor, Viola Credit, caught wind of the financial discrepancies and seized $37 million, the company’s operations ground to a halt. With only $5 million in restricted funds remaining, Builder.ai was forced to file for bankruptcy, leaving over 1,000 employees jobless and a trail of multi-million dollar losses for its star-studded list of investors.
The Deepfake Conference Call: A $25 Million Heist
While Builder.ai’s scam was a complex web of financial deceit and technological misrepresentation, other AI-powered scams are more direct in their approach. In a case that sounds like something out of a science fiction movie, a finance clerk in Hong Kong was duped into transferring over $25 million to criminals after participating in a video conference call with what he believed were the company’s senior executives. In reality, every single person on the call, apart from the clerk himself, was a deepfake. Scammers had used publicly available footage to create highly realistic digital puppets of the executives, complete with their voices.
This incident is a chilling example of how easily deepfake technology can be weaponized for financial gain. The ability to create convincing, real-time deepfakes is no longer the stuff of Hollywood; it’s a tool in the hands of sophisticated criminals.
AI-Driven Investment Scams: The Rise of the Bots
Another worrying trend is the proliferation of AI-driven investment scams. These schemes often target the cryptocurrency and stock trading markets, using AI to create a vast network of fake social media profiles, forums, and websites to disseminate convincing misinformation about lucrative investment opportunities.
These aren’t your typical poorly worded phishing emails. AI can generate highly personalized, grammatically flawless messages, and even power chatbots capable of holding convincing conversations with potential victims. Scammers also deploy deepfakes of well-known financial figures like Elon Musk to lend an air of legitimacy to their fraudulent schemes. The result is a highly effective and scalable way to lure in unsuspecting investors, with losses from these scams rising sharply.
The story of Builder.ai, and the rise of other sophisticated AI scams, serves as a stark reminder for those of us in the tech world. While the potential of AI is immense, so is the potential for its misuse. As developers, entrepreneurs, and investors, it’s crucial to approach the AI gold rush with a healthy dose of skepticism. The pressure to present a groundbreaking AI solution can be enormous, but as the Builder.ai saga proves, a foundation built on lies will inevitably crumble. The code we write and the companies we build must be grounded in reality, not just hype. After all, you can’t fake it ‘til you make it when the very technology you claim to have mastered is the one that can expose your fraud.