Inside Cursor: The future of AI coding with Co-founder Sualeh Asif

Host: Lukas Biewald · April 29, 2025 · Duration: 49:36

In this episode of Gradient Dissent, host Lukas Biewald talks with Sualeh Asif, the CPO and co-founder of Cursor, one of the fastest-growing and most loved AI-powered coding platforms. Sualeh shares the story behind Cursor’s creation, the technical and design decisions that set it apart, and how AI models are changing the way we build software. They dive deep into infrastructure challenges, the importance of speed and user experience, and how emerging trends in agents and reasoning models are reshaping the developer workflow.

Sualeh also discusses scaling AI inference to support hundreds of millions of requests per day, building trust through product quality, and his vision for how programming will evolve in the next few years.

⏳ Timestamps:

00:00 How Cursor got started and why it took off

04:50 Switching from Vim to VS Code and the rise of Copilot

08:10 Why Cursor won among competitors: product philosophy and execution

10:30 How user data and feedback loops drive Cursor’s improvements

12:20 Iterating on AI agents: what made Cursor hold back and wait

13:30 Competitive coding background: advantage or challenge?

16:30 Making coding fun again: latency, flow, and model choices

19:10 Building Cursor’s infrastructure: from GPUs to indexing billions of files

26:00 How Cursor prioritizes compute allocation for indexing

30:00 Running massive ML infrastructure: surprises and scaling lessons

34:50 Why Cursor chose DeepSeek models early

36:00 Where AI agents are heading next

40:07 Debugging and evaluating complex AI agents

42:00 How coding workflows will change over the next 2–3 years

46:20 Dream future projects: AI for reading codebases and papers


Lukas Biewald hosts Gradient Dissent: Conversations on AI, a series that moves beyond theoretical discussion to examine how artificial intelligence is actually built and deployed. Each episode features a direct, unscripted conversation with a leading practitioner: you'll hear from engineers and researchers at places like NVIDIA, Meta, Google, Lyft, and OpenAI. The focus is on the tangible challenges and breakthroughs they encounter, from initial research to the complex reality of putting models into production. This isn't about abstract futures; it's a grounded look at the decisions shaping the field right now.

Biewald, bringing his perspective from Weights & Biases, steers conversations toward the practical trade-offs and collaborative efforts that define modern AI work. For anyone in technology or business who wants to understand the mechanics behind the headlines, this podcast offers a rare, candid window into the process. You'll come away with a clearer sense of how ideas become functional systems and what it really takes to operate at the cutting edge.
Language: English · Episodes: 100

Gradient Dissent: Conversations on AI
Podcast Episodes
Scaling LLMs and Accelerating Adoption with Aidan Gomez at Cohere

Duration: 51:31
On this episode, we’re joined by Aidan Gomez, Co-Founder and CEO at Cohere. Cohere develops and releases a range of innovative AI-powered tools and solutions for a variety of NLP use cases. We discuss: What “attention” m…
Neural Network Pruning and Training with Jonathan Frankle at MosaicML

Duration: 1:02:00
Jonathan Frankle, Chief Scientist at MosaicML and Assistant Professor of Computer Science at Harvard University, joins us on this episode. With comprehensive infrastructure and software tools, MosaicML aims to help busin…
Shreya Shankar — Operationalizing Machine Learning

Duration: 54:38
Shreya Shankar is a computer scientist, PhD student in databases at UC Berkeley, and co-author of "Operationalizing Machine Learning: An Interview Study", an ethnographic interview study with 18 machine…
Jeremy Howard — The Simple but Profound Insight Behind Diffusion

Duration: 1:12:57
Jeremy Howard is a co-founder of fast.ai, the non-profit research group behind the popular massive open online course "Practical Deep Learning for Coders", and the open source deep learning library "fastai". Jeremy is als…
Jerome Pesenti — Large Language Models, PyTorch, and Meta

Duration: 52:35
Jerome Pesenti is the former VP of AI at Meta, a tech conglomerate that includes Facebook, WhatsApp, and Instagram, and one of the most exciting places where AI research is happening today. Jerome shares his thoughts on T…
D. Sculley — Technical Debt, Trade-offs, and Kaggle

Duration: 1:00:26
D. Sculley is CEO of Kaggle, the beloved and well-known data science and machine learning community. D. discusses his influential 2015 paper "Machine Learning: The High Interest Credit Card of Technical Debt" and what the…