
Training Zamba: A Hybrid Model Master Class with Zyphra's Quentin Anthony

2:25:00
 
Content provided by Turpentine, Erik Torenberg, and Nathan Labenz. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Turpentine, Erik Torenberg, and Nathan Labenz or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://th.player.fm/legal

In this episode of The Cognitive Revolution, Nathan dives deep into the world of state space models with returning co-host Jason Meaux and special guest Quentin Anthony, Head of Model Training at Zyphra. Explore the cutting-edge Zamba2-7B model, which combines selective state space and attention mechanisms. Uncover practical insights on model training, architectural choices, and the challenges of scaling AI. From learning schedules to hybrid architectures, loss metrics to context length extension, this technical discussion covers it all. Don't miss this in-depth conversation on the future of personalized, on-device AI.

Check out more about Zyphra and Jason Meaux here:

Zyphra's website: https://www.zyphra.com

Zamba2-7B Blog: https://www.zyphra.com/post/zamba2-7b

Zamba2 GitHub: https://github.com/Zyphra/Zamba2

Tree attention: https://www.zyphra.com/post/tree-attention-topology-aware-decoding-for-long-context-attention-on-gpu-clusters

Jason Meaux's Twitter: https://x.com/KamaraiCode

Jason Meaux's website: https://www.statespace.info

Be notified early when Turpentine drops new publications: https://www.turpentine.co/exclusiveaccess

SPONSORS:

Weights & Biases RAG++: Advanced training for building production-ready RAG applications. Learn from experts to overcome LLM challenges, evaluate systematically, and integrate advanced features. Includes free Cohere credits. Visit https://wandb.me/cr to start the RAG++ course today.

Shopify: Shopify is the world's leading e-commerce platform, offering a market-leading checkout system and exclusive AI apps like Quikly. Nobody does selling better than Shopify. Get a $1 per month trial at https://shopify.com/cognitive

Notion: Notion offers powerful workflow and automation templates, perfect for streamlining processes and laying the groundwork for AI-driven automation. With Notion AI, you can search across thousands of documents from various platforms, generating highly relevant analysis and content tailored just for you - try it for free at https://notion.com/cognitiverevolution

LMNT: LMNT is a zero-sugar electrolyte drink mix that's redefining hydration and performance. Ideal for those who fast or anyone looking to optimize their electrolyte intake. Support the show and get a free sample pack with any purchase at https://drinklmnt.com/tcr

CHAPTERS:

(00:00:00) Teaser

(00:00:42) About the Show

(00:01:05) About the Episode

(00:03:09) Introducing Zyphra

(00:07:28) Personalization in AI

(00:12:48) State Space Models & Efficiency (Part 1)

(00:19:22) Sponsors: Weights & Biases RAG++ | Shopify

(00:21:26) State Space Models & Efficiency (Part 2)

(00:22:23) Dense Attention to Shared Attention

(00:29:41) Zyphra's Early Bet on Mamba (Part 1)

(00:33:18) Sponsors: Notion | LMNT

(00:36:00) Zyphra's Early Bet on Mamba (Part 2)

(00:37:22) Loss vs. Model Quality

(00:44:53) Emergence & Grokking

(00:50:06) Loss Landscapes & Convergence

(00:56:55) Sophia, Distillation & Secrets

(01:09:00) Competing with Big Tech

(01:23:50) The Future of Model Training

(01:30:02) Deep Dive into Zamba 1

(01:34:24) Zamba 2 and Mamba 2

(01:38:56) Context Extension & Memory

(01:44:04) Sequence Parallelism

(01:45:44) Zamba 2 Architecture

(01:53:57) Mamba Attention Hybrids

(02:00:00) Lock-in Effects

(02:05:32) Mamba Hybrids in Robotics

(02:07:07) Ease of Use & Compatibility

(02:12:10) Tree Attention vs. Ring Attention

(02:22:02) Zyphra's Vision & Goals

(02:23:57) Outro

SOCIAL LINKS:

Website: https://www.cognitiverevolution.ai

Twitter (Podcast): https://x.com/cogrev_podcast

Twitter (Nathan): https://x.com/labenz

LinkedIn: https://www.linkedin.com/in/nathanlabenz/
