Content provided by SWI swissinfo.ch. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by SWI swissinfo.ch or their podcast platform partner. If you believe someone is using your copyrighted work without permission, you can follow the process outlined here: https://th.player.fm/legal

New wars, new weapons and the Geneva Conventions

Duration: 24:54
 
Episode 415425329 · Series 2789582

Send us a Text Message.

In the wars in Ukraine and in the Middle East, new, autonomous weapons are being used. Our Inside Geneva podcast asks whether we’re losing the race to control them – and the artificial intelligence systems that run them.

“Autonomous weapons systems raise significant moral, ethical, and legal problems challenging human control over the use of force and handing over life-and-death decision-making to machines,” says Sai Bourothu, specialist in automated decision research with the Campaign to Stop Killer Robots.

How can we be sure an autonomous weapon will do what we humans originally intended? Who’s in control?

Jean-Marc Rickli from the Geneva Centre for Security Policy adds: “AI and machine learning basically lead to a situation where the machine is able to learn. And so now, if you talk to specialists, to scientists, they will tell you that it's a black box, we don't understand, it's very difficult to backtrack.”

Our listeners asked: could an autonomous weapon show empathy? Could it differentiate between a fighter and a child? Last year, an experiment asked patients to rate chatbot doctors against human doctors.

“Medical chatbots ranked much better in the quality. But they also asked them to rank empathy. And on the empathy dimension they also ranked better. If that is the case, then you opened up a Pandora’s box that will be completely transformative for disinformation,” explains Rickli.

Are we going to lose our humanity because we think machines are not only more reliable, but also kinder?

“I think it's going to be an incredibly immense task to code something such as empathy. I think almost as close to the question of whether machines can love,” says Bourothu.

Join host Imogen Foulkes on the Inside Geneva podcast to learn more about this topic.

Please listen and subscribe to our science podcast -- the Swiss Connection.

Get in touch!

Thank you for listening! If you like what we do, please leave a review or subscribe to our newsletter.
For more stories on international Geneva, please visit www.swissinfo.ch/
Host: Imogen Foulkes
Production assistant: Claire-Marie Germain
Distribution: Sara Pasino
Marketing: Xin Zhang


Chapters

1. The Ethics of Autonomous Weapons (00:00:07)

2. The Rise of Empathetic Machines (00:15:49)

120 episodes
