Content provided by re:publica. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by re:publica or their podcast platform partner. If you believe someone is using your copyrighted work without permission, you can follow the process outlined here: https://th.player.fm/legal

An approach to adversarial research (Video-in)

Duration: 27:53
 

Episode 310831483, series 3074243
The use of data-driven algorithmic systems to run our lives has become commonplace. It is also becoming increasingly clear that they don’t work equally for everyone. Interrogating these systems is challenging because they are usually protected by terms and conditions that keep their code opaque and their data inaccessible to outsiders. So how do you fight injustice if you can’t see it? One approach is to find the stories of who these systems harm rather than focusing on how they work.
  • Surya Mattu

In today’s digital world, social, economic, and racial injustice lurks in the shadows of the unseen Facebook post, the hidden algorithm used to sort employment resumes, and the risk assessment tool used in criminal sentencing. These systems tend to be opaque and beyond scrutiny. Access is usually restricted to large companies and governing bodies whose interests are often misaligned with large parts of their customer base and citizenry. Much of the criticism of the technology industry tends to be hypothetical or speculative because it can be very difficult to measure the ways in which people are being harmed. The methods of personalization that have transformed how we use the internet have also obfuscated the disparate impact that takes place there. This makes it significantly harder for those interested in regulation to collect the evidence necessary to hold tech companies accountable. It is possible to collect some of this information by harnessing the network and communications infrastructure that makes up the internet. The data traveling through these systems tell compelling stories if you know how to look for them. They also often reflect systemic biases and prejudices prevalent in society.
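The approach described above — observing the data a service actually sends to different people rather than inspecting its inner workings — can be sketched as a simple audit: record the responses a platform returns to differently configured test profiles and diff them to surface disparate treatment. The payloads below are hypothetical stand-ins for captured traffic (in practice such responses would be recorded through an intercepting proxy); the profile names and values are illustrative only, not data from any real service.

```python
import difflib
import json

# Hypothetical responses a service might return to two test profiles.
# These are made-up stand-ins for traffic captured on the wire.
response_profile_a = json.dumps(
    {"ads": ["home loans", "luxury cars"], "price": 49.99},
    indent=2, sort_keys=True,
)
response_profile_b = json.dumps(
    {"ads": ["payday loans", "used cars"], "price": 54.99},
    indent=2, sort_keys=True,
)

def diff_responses(a: str, b: str) -> list[str]:
    """Return only the lines that differ between two captured responses."""
    return [
        line
        for line in difflib.unified_diff(a.splitlines(), b.splitlines(), lineterm="")
        # Keep added/removed content lines, drop the "+++"/"---" file headers.
        if line.startswith(("+", "-")) and not line.startswith(("+++", "---"))
    ]

# Any surviving lines are candidate evidence of personalized, disparate treatment.
disparities = diff_responses(response_profile_a, response_profile_b)
for line in disparities:
    print(line)
```

The point of the sketch is methodological: the auditor never needs the service's code, only the observable difference between what it serves to whom.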


33 episodes

