Kevin K. Yang: Engineering Proteins with ML

Author: Daniel Bashir | September 28, 2023 | Duration: 1:00:00

In episode 92 of The Gradient Podcast, Daniel Bashir speaks to Kevin K. Yang.

Kevin is a senior researcher at Microsoft Research (MSR) who works on problems at the intersection of machine learning and biology, with an emphasis on protein engineering. He completed his PhD at Caltech with Frances Arnold on applying machine learning to protein engineering. Before joining MSR, he was a machine learning scientist at Generate Biomedicines, where he used machine learning to optimize proteins.

Have suggestions for future podcast guests (or other feedback)? Let us know here or reach us at editor@thegradient.pub

Subscribe to The Gradient Podcast: Apple Podcasts | Spotify | Pocket Casts | RSS

Follow The Gradient on Twitter

Outline:

* (00:00) Intro

* (02:40) Kevin’s background

* (06:00) Protein engineering early in Kevin’s career

* (12:10) From research to real-world proteins: the process

* (17:40) Generative models + pretraining for proteins

* (22:47) Folding diffusion for protein structure generation

* (30:45) Protein evolutionary dynamics and generative models of protein sequences

* (40:03) Analogies and disanalogies between protein modeling and language models

* (41:45) Analogies in representation learning

* (45:50) Convolutions vs. transformers and inductive biases

* (49:25) Pretraining tasks for protein structure

* (51:45) More on representation learning for protein structure

* (54:06) Kevin’s thoughts on interpretability in deep learning for protein engineering

* (56:50) Multimodality in protein engineering and future directions

* (59:14) Outro

Links:

* Kevin’s Twitter and homepage

* Research

* Generative models + pre-training for proteins and chemistry

* Broad intro to techniques in the space

* Protein structure generation via folding diffusion

* Protein sequence design with deep generative models (review)

* Evolutionary velocity with protein language models predicts evolutionary dynamics of diverse proteins

* Protein generation with evolutionary diffusion: sequence is all you need

* ML for protein engineering

* ML-guided directed evolution for protein engineering (review)

* Learned protein embeddings for ML

* Adaptive machine learning for protein engineering (review)

* Multimodal deep learning for protein engineering



Get full access to The Gradient at thegradientpub.substack.com/subscribe

Hosted by Daniel Bashir, The Gradient: Perspectives on AI moves beyond surface-level headlines to explore the intricate machinery and human ideas shaping artificial intelligence. Each episode is built on a foundation of deep research, leading to conversations that are both technically substantive and broadly accessible. You'll hear from researchers, engineers, and philosophers who are actively building and critiquing our technological future, discussing not just how AI systems work, but the larger implications of their integration into society. This isn't about speculative hype; it's a grounded examination of real progress, persistent challenges, and ethical considerations from those on the front lines. The discussions peel back layers on topics like model architecture, policy, and the fundamental science behind the algorithms becoming part of our daily lives. For anyone curious about the substance behind the buzz, whether you have a technical background or are simply keen to understand a defining technology of our age, this podcast offers a crucial and thoughtful resource. Tune in for a consistently detailed and nuanced take that treats artificial intelligence with the complexity it deserves.