
Pushing compute to the limits of physics

Duration: 1:23:32
 

Dr. Maxwell Ramstead grills Guillaume Verdon (AKA “Beff Jezos”), founder of the thermodynamic computing startup Extropic.

Guillaume shares his unique path: from dreaming about space travel as a kid, to becoming a physicist, to working on quantum computing at Google, to developing a radically new form of computing hardware for machine learning. He explains how he hit roadblocks with traditional physics and computing, which led him to found his company and build "thermodynamic computers": super-efficient chips designed to harness the natural chaos of electrons (think noise and heat) to power AI tasks, promising to speed up and lower the cost of modern probabilistic techniques like sampling. He is driven by the pursuit of computers that work more like your brain, which (by the way) runs on a banana and a glass of water!
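
To make the "harnessing noise" idea concrete, here is a minimal software analogue (a toy sketch, not Extropic's hardware or algorithms): unadjusted Langevin dynamics, where an energy gradient plus injected Gaussian noise yields samples from a Boltzmann-like distribution. The double-well energy and all parameters below are invented for illustration.

import numpy as np

def energy(x):
    # Toy double-well landscape; low-energy states are the probable ones.
    return (x**2 - 1.0)**2

def grad_energy(x):
    return 4.0 * x * (x**2 - 1.0)

def langevin_sample(n_steps=10000, step=0.01, temperature=1.0, seed=0):
    # Drift downhill on the energy, plus Gaussian noise, so samples
    # approximately follow p(x) ~ exp(-energy(x) / temperature).
    rng = np.random.default_rng(seed)
    x, samples = 0.0, []
    for _ in range(n_steps):
        x = x - step * grad_energy(x) / temperature + np.sqrt(2.0 * step) * rng.normal()
        samples.append(x)
    return np.array(samples)

samples = langevin_sample()
print("sample mean:", samples.mean(), "sample std:", samples.std())

In the thermodynamic-computing pitch, the physics of the chip plays the role of this loop: the noise comes for free, which is where the claimed speed and cost advantages for sampling originate.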

Guillaume talks about his alter ego, Beff Jezos, and the "Effective Accelerationism" (e/acc) movement that he initiated. Its objective is to speed up tech progress in order to “grow civilization” (as measured by energy use and innovation), rather than “slowing down out of fear”. Guillaume argues we need to embrace variance, exploration, and optimism to avoid getting stuck or outpaced by competitors like China. He and Maxwell discuss big ideas like merging humans with AI, decentralizing intelligence, and why boundless growth (with smart constraints) is “key to humanity's future”.

REFS:

1. John Archibald Wheeler - "It From Bit" Concept

00:04:45 - Foundational work proposing that physical reality emerges from information at the quantum level

Learn more: https://cqi.inf.usi.ch/qic/wheeler.pdf

2. AdS/CFT Correspondence (Holographic Principle)

00:05:15 - Theoretical physics duality connecting quantum gravity in Anti-de Sitter space with conformal field theory

https://en.wikipedia.org/wiki/Holographic_principle

3. Renormalization Group Theory

00:06:15 - Mathematical framework for analyzing physical systems across different length scales

https://www.damtp.cam.ac.uk/user/dbs26/AQFT/Wilsonchap.pdf

4. Maxwell's Demon and Information Theory

00:21:15 - Thought experiment linking information processing to thermodynamics and entropy

https://plato.stanford.edu/entries/information-entropy/

5. Landauer's Principle

00:29:45 - Fundamental limit establishing minimum energy required for information erasure

https://en.wikipedia.org/wiki/Landauer%27s_principle
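
As a back-of-the-envelope illustration (not from the episode), the Landauer bound at room temperature works out to roughly 3 zeptojoules per erased bit:

import math
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K
E_min = k_B * T * math.log(2)   # minimum energy to erase one bit
print(f"{E_min:.3e} J per bit")  # ~2.87e-21 J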

6. Free Energy Principle and Active Inference

01:03:00 - Mathematical framework for understanding self-organizing systems and perception-action loops

https://www.nature.com/articles/nrn2787
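
For reference (standard definition, not a transcript of the conversation), the variational free energy minimized in this framework can be written as

F[q] = \mathbb{E}_{q(s)}[\ln q(s) - \ln p(o,s)] = D_{KL}[q(s) \,\|\, p(s \mid o)] - \ln p(o)

so minimizing F over q both approximates the posterior over hidden states s and bounds the surprise -\ln p(o) of observations o.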

7. Max Tegmark - Information Bottleneck Principle

01:07:00 - Connections between information theory and renormalization in machine learning

https://arxiv.org/abs/1907.07331
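
For reference, the classic information bottleneck objective that this line of work builds on compresses an input X into a representation T while preserving information about a target Y:

\min_{p(t \mid x)} \; I(X;T) - \beta \, I(T;Y)

where \beta trades off compression against predictive relevance.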

8. Fisher's Fundamental Theorem of Natural Selection

01:11:45 - Mathematical relationship between genetic variance and evolutionary fitness

https://en.wikipedia.org/wiki/Fisher%27s_fundamental_theorem_of_natural_selection
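
In one common continuous-time statement (Malthusian fitness m), the theorem reads

\frac{d\bar{m}}{dt} = \mathrm{Var}_A(m)

i.e. the rate of increase in mean fitness due to selection equals the additive genetic variance in fitness, which is the formal counterpart of the episode's emphasis on variance and exploration.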

9. Tensor Networks in Quantum Systems

00:06:45 - Computational framework for simulating many-body quantum systems

https://arxiv.org/abs/1912.10049

10. Quantum Neural Networks

00:09:30 - Hybrid quantum-classical models for machine learning applications

https://en.wikipedia.org/wiki/Quantum_neural_network

11. Energy-Based Models (EBMs)

00:40:00 - Probabilistic framework for unsupervised learning based on energy functions

https://www.researchgate.net/publication/200744586_A_tutorial_on_energy-based_learning
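
For reference, an EBM defines a distribution through an energy function E_\theta via the Boltzmann form

p_\theta(x) = \frac{\exp(-E_\theta(x))}{Z_\theta}, \qquad Z_\theta = \int \exp(-E_\theta(x)) \, dx

so low-energy configurations are the probable ones, which is one reason fast physical samplers of the kind discussed above are attractive for training and inference in these models.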

12. Markov Chain Monte Carlo (MCMC)

00:20:00 - Sampling algorithm fundamental to modern AI and statistical physics

https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo

13. Metropolis-Hastings Algorithm

00:23:00 - Core sampling method for probability distributions

https://arxiv.org/abs/1504.01896
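
A minimal sketch of the random-walk variant (toy target and parameters invented for illustration; the Gaussian proposal is symmetric, so the Hastings correction cancels):

import numpy as np

def metropolis_hastings(log_prob, n_samples=5000, step=0.5, x0=0.0, seed=0):
    # Propose x' ~ N(x, step^2); accept with probability min(1, p(x')/p(x)).
    rng = np.random.default_rng(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        x_prop = x + step * rng.normal()
        if np.log(rng.uniform()) < log_prob(x_prop) - log_prob(x):
            x = x_prop
        samples.append(x)
    return np.array(samples)

# Toy target: standard normal, log p(x) = -x^2 / 2 up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x**2)
print("mean ~ 0:", samples.mean(), " std ~ 1:", samples.std())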

***SPONSOR MESSAGE***

Google Gemini 2.5 Flash is a state-of-the-art language model in the Gemini app. Sign up at https://gemini.google.com
