AI Agents are Dumb Robots, Calling LLMs

28:31
Content provided by The New Stack Podcast and The New Stack. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by The New Stack Podcast and The New Stack or their podcast platform partner. If you believe someone is using your copyrighted work without permission, you can follow the process described here: https://es.player.fm/legal.

AI agents are set to transform software development, but software itself isn’t going anywhere—despite the dramatic predictions. On this episode of The New Stack Makers, Mark Hinkle, CEO and Founder of Peripety Labs, discusses how AI agents relate to serverless technologies, infrastructure-as-code (IaC), and configuration management.

Hinkle envisions AI agents as “dumb robots” handling tasks like querying APIs and exchanging data, while the real intelligence remains in large language models (LLMs). These agents, likely implemented as serverless functions in Python or JavaScript, will automate software development processes dynamically. LLMs, leveraging vast amounts of open-source code, will enable AI agents to generate bespoke, task-specific tools on the fly—unlike traditional cloud tools from HashiCorp or configuration management tools like Chef and Puppet.
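To make the "dumb robot" idea concrete, here is a minimal sketch, assuming a hypothetical serverless handler in Python: the agent only fetches data from an API and forwards it to an LLM endpoint, and the reasoning happens entirely on the model side. The endpoint URLs, the payload shapes, and the handler signature are illustrative assumptions, not anything specified in the episode.

# Minimal sketch of an AI agent as a "dumb robot": a serverless-style
# handler that queries an API, hands the data to an LLM, and relays the
# model's answer. URLs and payload fields below are hypothetical.
import json
import urllib.request

API_URL = "https://api.example.com/build-status"   # hypothetical data source
LLM_URL = "https://llm.example.com/v1/complete"     # hypothetical LLM endpoint


def fetch_json(url: str, payload: dict | None = None) -> dict:
    """POST (or GET, when no payload) JSON and return the decoded response."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def handler(event: dict, context=None) -> dict:
    """Serverless entry point: no reasoning here, just plumbing."""
    # 1. Dumb step: pull raw data from an API.
    build_status = fetch_json(API_URL)

    # 2. Delegate the thinking to the LLM.
    answer = fetch_json(LLM_URL, {
        "prompt": "Given this CI build status, what should happen next? "
                  + json.dumps(build_status),
    })

    # 3. Relay the model's decision back to whatever invoked the function.
    return {"statusCode": 200, "body": json.dumps(answer)}

In this framing, swapping prebuilt Chef- or HashiCorp-style tooling for model-generated steps changes only what the LLM returns; the robot itself stays the same.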

As AI-generated tooling becomes more prevalent, managing and optimizing these agents will require strong observability and evaluation practices. According to Hinkle, this shift marks the future of software, where AI agents dynamically create, call, and manage tools for CI/CD, monitoring, and beyond. Check out the full episode for more insights.
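As a hypothetical illustration of that observability point, each tool call an agent makes could be wrapped in structured logging so its behavior can be replayed and evaluated later; the wrapper and field names here are assumptions, not an established schema.

# Hypothetical structured logging around an agent's tool calls, so each
# call can be inspected and evaluated after the fact.
import json
import logging
import time

logger = logging.getLogger("agent.observability")
logging.basicConfig(level=logging.INFO)


def observe_tool_call(tool_name: str, call, *args, **kwargs):
    """Run a tool call and emit a structured log record for later evaluation."""
    start = time.monotonic()
    try:
        result = call(*args, **kwargs)
        status = "ok"
        return result
    except Exception:
        status = "error"
        raise
    finally:
        logger.info(json.dumps({
            "tool": tool_name,
            "status": status,
            "latency_ms": round((time.monotonic() - start) * 1000, 1),
        }))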

Learn more from The New Stack about emerging trends in AI agents:

Lessons From Kubernetes and the Cloud Should Steer the AI Revolution

AI Agents: Why Workflows Are the LLM Use Case to Watch

Join our community of newsletter subscribers to stay on top of the news and at the top of your game.
