We have ethical responsibilities when coding. From data people willingly share, we can infer remarkably intimate things about them. But do we have a right to know what they didn't consent to reveal, even when they freely shared the data that leads us there? A major retailer's data-driven marketing accidentally revealed to a teen's family that she was pregnant. Eek.
What are our obligations to people who never expected to be so intimately known by software they didn't confide in? How do we guard against unintended outcomes? For instance, an activity tracker carelessly exposed users' sexual activity data to search engines. A social network's algorithm accidentally resurfaced painful memories for grieving families who had recently lost a child or other loved one. We design software for humans. Balancing human needs against business requirements can be tough, so it's crucial that we learn to build empathy into our process systematically.
In this talk, we'll examine specific examples of uncritical programming: cases where well-intentioned uses of insightful data produced painful results. You'll learn practices for examining how your code might harm individuals, and we'll look at how flipping that paradigm can produce outcomes that are better for everyone.