Knowledge Distillation with Helen Byrne

NeurIPS Special

December 22, 2023 · Season 1, Episode 3 · Helen Byrne

NeurIPS is the world’s largest AI conference, where leading AI practitioners come together to share the latest research and debate the way forward for artificial intelligence. 

In this special episode, Helen examines some of the big themes of NeurIPS 2023 and talks to a range of attendees about their work, the big issues of the day, and what they’ve seen at NeurIPS that caught their attention. 

It’s fair to say that LLMs loomed large over this year’s conference, but there’s plenty more to discuss – from AI’s potential to combat climate change to new techniques for computational efficiency. 

Helen’s guests are: 

Sofia Liguori – Research Engineer at Google DeepMind, specialising in the application of AI to sustainability and climate change. 

Priya Donti – Assistant Professor in Electrical Engineering and Computer Science at MIT and Co-founder of Climate Change AI. Priya discusses the challenges associated with introducing leading-edge AI systems into highly complex real-world power generation and delivery systems. 

Irene Chen – Assistant Professor at UC Berkeley and UCSF’s Computational Precision Health program. Irene talks about her goal of delivering more equitable healthcare at a time when AI is set to disrupt the field. She also discusses the potential to make use of commercial LLMs in a way that protects sensitive user data. 

James Briggs – AI Engineer at Graphcore. James and colleagues were presenting their paper ‘Training and inference of large language models using 8-bit floating point’ at this year’s NeurIPS. James explains their work and the importance of using smaller numerical representations to unlock computational efficiency in AI. 

Abhinav (Abhi) Venigalla – Member of the technical staff at Databricks. The company provides a range of products to help organisations unlock the potential of enterprise-grade AI. Abhi talks about the increasing emphasis on inference tools and computational efficiency as AI moves out of the research lab and into commercial deployment. 
