Content provided by Turpentine, Erik Torenberg, and Nathan Labenz. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Turpentine, Erik Torenberg, and Nathan Labenz or their podcast platform partner. If you believe someone is using your copyrighted work without permission, you can follow the process outlined here: https://th.player.fm/legal

Popular Mechanistic Interpretability: Goodfire Lights the Way to AI Safety

1:55:33
Manage episode 434715096 series 3452589
Nathan explores the cutting-edge field of mechanistic interpretability with Dan Balsam and Tom McGrath, co-founders of Goodfire. In this episode of The Cognitive Revolution, we delve into the science of understanding AI models' inner workings, recent breakthroughs, and the potential impact on AI safety and control. Join us for an insightful discussion on sparse autoencoders, polysemanticity, and the future of interpretable AI.

Papers

Apply to join over 400 founders and execs in the Turpentine Network: https://hmplogxqz0y.typeform.com/to/JCkphVqj

SPONSORS:

Oracle Cloud Infrastructure (OCI) is a single platform for your infrastructure, database, application development, and AI needs. OCI has four to eight times the bandwidth of other clouds, offers one consistent price, and nobody does data better than Oracle. If you want to do more and spend less, take a free test drive of OCI at https://oracle.com/cognitive

The Brave Search API can be used to assemble a dataset to train your AI models and to help with retrieval augmentation at inference time, all while remaining affordable with developer-first pricing. Integrating the Brave Search API into your workflow translates to more ethical data sourcing and more human-representative datasets. Try the Brave Search API free for up to 2,000 queries per month at https://bit.ly/BraveTCR

Omneky is an omnichannel creative generation platform that lets you launch hundreds of thousands of ad iterations that actually work, customized across all platforms, with a click of a button. Omneky combines generative AI and real-time advertising data. Mention "Cog Rev" for 10% off: https://www.omneky.com/

Squad offers access to global engineering without the headache and at a fraction of the cost: head to https://choosesquad.com/ and mention "Turpentine" to skip the waitlist.

CHAPTERS:

(00:00:00) About the Show

(00:00:22) About the Episode

(00:03:52) Introduction and Background

(00:08:43) State of Interpretability Research

(00:12:06) Key Insights in Interpretability

(00:16:53) Polysemanticity and Model Compression (Part 1)

(00:17:00) Sponsors: Oracle | Brave

(00:19:04) Polysemanticity and Model Compression (Part 2)

(00:22:50) Sparse Autoencoders Explained

(00:27:19) Challenges in Interpretability Research (Part 1)

(00:30:54) Sponsors: Omneky | Squad

(00:32:41) Challenges in Interpretability Research (Part 2)

(00:33:51) Goodfire's Vision and Mission

(00:37:08) Interpretability and Scientific Models

(00:43:48) Architecture and Interpretability Techniques

(00:50:08) Quantization and Model Representation

(00:54:07) Future of Interpretability Research

(01:01:38) Skepticism and Challenges in Interpretability

(01:07:51) Alternative Architectures and Universality

(01:13:39) Goodfire's Business Model and Funding

(01:18:47) Building the Team and Future Plans

(01:31:03) Hiring and Getting Involved in Interpretability

(01:51:28) Closing Remarks

(01:51:38) Outro

172 episodes
