Aravind Srinivas: How On-Device AI Threatens Data Centers

Perplexity CEO Aravind Srinivas explains why your phone’s chip could be the biggest threat to $1 trillion in data center spending. Discover the future of private, local AI.

Perplexity CEO Aravind Srinivas

Imagine your phone or laptop becoming smarter without constantly relying on distant cloud servers. For years, artificial intelligence has depended on massive data centers to handle most of its processing. Perplexity CEO Aravind Srinivas asks a simple but big question: what if AI could run directly on the devices people use every day?

If that happens, AI could be faster, keep our information private, and feel more personal. It could also change how tech companies build and pay for AI.

Why Centralized Data Centers Still Power AI Today

Most AI systems today rely heavily on the cloud. When a user enters a prompt, the request travels to a data center where powerful machines process it and send the response back. This setup exists because modern AI models are large and need a lot of computing power.
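
To make that round trip concrete, here is a minimal Python sketch of a cloud request as seen from the user's device; the endpoint URL, API key, and response format are hypothetical placeholders rather than any particular provider's API.

```python
import requests  # third-party HTTP library

# Hypothetical cloud inference endpoint and credentials, for illustration only.
API_URL = "https://api.example.com/v1/answer"
API_KEY = "YOUR_API_KEY"

def ask_cloud(prompt: str) -> str:
    """Send the prompt over the network and wait for the data center's reply."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"prompt": prompt},
        timeout=30,
    )
    response.raise_for_status()
    # The response shape is assumed; real providers differ.
    return response.json()["answer"]

print(ask_cloud("Summarize today's top headlines."))
```

Every step in that flow, the network hop, the remote processing, and the trip back, is exactly what on-device AI would remove for routine requests.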

That’s why companies such as Google, Meta, Microsoft, OpenAI, and Perplexity continue investing billions in data center infrastructure. Industry forecasts suggest global spending on data centers could approach one trillion dollars by 2030, driven largely by AI demand.

Srinivas doesn’t dispute the importance of this model today. What he questions is how dominant it will remain in the future.

The Risk for Data Centers if AI Goes Local

In a recent podcast appearance, Srinivas described what he sees as a potential turning point for AI infrastructure. If AI models can be made compact and efficient enough to run locally on consumer hardware, the need for centralized inference could decline sharply.

In his view, the real threat to data centers arrives when AI can run directly on a device’s own chip.

This isn’t a claim about where AI stands today, but about where it could go. If everyday tasks no longer need to be routed through distant servers, the traditional role of data centers may gradually shrink for certain use cases.

What On-Device AI Could Enable

On-device AI handles requests directly on your phone or laptop instead of sending them to the cloud every time. Srinivas thinks this could make everyday tasks easier.

  • Faster Responses: Processing on your device cuts out internet delays, making things feel more immediate.
  • Stronger Privacy: With AI on the device, personal data remains local rather than being sent to servers.
  • More Personal Experiences: On-device AI can fit into the way you already work, making tasks like writing and organizing files easier while keeping them private.

Instead of acting only when prompted, AI could quietly support how people already use their devices.
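
As a rough illustration of what running a model locally looks like, the sketch below uses the open-source Hugging Face transformers library with a deliberately tiny model; the model choice and prompt are illustrative, and a PyTorch (or similar) backend is assumed to be installed.

```python
# Minimal sketch of on-device text generation: the model is downloaded once,
# then every prompt is processed locally with no network call.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

result = generator(
    "Draft a short reminder to back up my laptop:",
    max_new_tokens=40,
)
print(result[0]["generated_text"])  # the prompt never leaves the machine
```

The same pattern, with far more capable models tuned for phone and laptop chips, is what Srinivas is pointing toward.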

Why Cloud AI Isn’t Going Away

According to Srinivas, cloud systems still matter because large data centers remain essential for training models and handling the most demanding workloads. In the future he envisions, that heavy lifting stays in cloud data centers while daily tasks run on your devices, keeping AI fast, personal, and reliable.
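
One simple way to picture that split is a router that keeps lightweight requests on the device and escalates heavier ones to the cloud. The Python sketch below is purely illustrative: the word-count threshold and both handler functions are hypothetical stand-ins, not how Perplexity or any other company actually implements this.

```python
# Hypothetical hybrid routing: short, routine prompts stay on-device,
# long or demanding ones fall back to a cloud service.
LOCAL_WORD_LIMIT = 200  # assumed cutoff for what the local model handles well

def answer_locally(prompt: str) -> str:
    # Stand-in for a small on-device model (see the earlier local sketch).
    return f"[local] {prompt}"

def answer_in_cloud(prompt: str) -> str:
    # Stand-in for a hosted API call across the network.
    return f"[cloud] {prompt}"

def route(prompt: str) -> str:
    """Keep everyday requests local; send heavy work to the data center."""
    if len(prompt.split()) <= LOCAL_WORD_LIMIT:
        return answer_locally(prompt)
    return answer_in_cloud(prompt)

print(route("What's on my calendar tomorrow?"))
```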

What This Shift Could Mean for AI’s Future

If AI runs on your device, the impact goes beyond speed. Companies might rethink how they build and price their products, data could stay safer, and users may gain more control. Rather than one centralized system serving every user the same way, AI could adapt to each person’s device.

With chips improving and local computing power growing, this vision looks increasingly realistic, which is why tech observers are taking Srinivas’s argument seriously.

Final Thoughts

On-device AI could change how smart systems work. Instead of relying only on huge data centers, future AI might run partly or mostly on personal devices. Perplexity CEO Aravind Srinivas calls this an evolution, not something that will happen all at once. Anyone curious about the future of AI should watch this idea closely.