Date: Fri, 29 Jan 2016 15:00:00 +0000
<p>Algorithms are pervasive in our society and make thousands of automated decisions on our behalf every day. Digital discrimination is a very real threat, and it can easily occur accidentally (i.e. outside the intent of the system designers and programmers). Christian Sandvig joins us in this episode to talk about his work and the concept of auditing algorithms.</p> <p><a href="http://niftyc.org/">Christian Sandvig</a> (<a href="https://twitter.com/niftyc">@niftyc</a>) has a PhD in communication from Stanford and is currently an Associate Professor of Communication Studies and Information at the University of Michigan. His research studies the predictable and unpredictable effects that algorithms have on culture. His work on auditing algorithms has framed the conversation about how and why we might want oversight of the ways algorithms affect our lives. His writing appears in numerous publications including <a href="http://socialmediacollective.org/author/niftyc/">The Social Media Collective</a>, <a href="http://www.huffingtonpost.com/christian-sandvig/">The Huffington Post</a>, and <a href="http://www.wired.com/2015/05/facebook-not-fault-study/">Wired</a>.</p> <p>One of his papers we discussed in depth on this episode was <a href="http://www-personal.umich.edu/~csandvig/research/Auditing%20Algorithms%20--%20Sandvig%20--%20ICA%202014%20Data%20and%20Discrimination%20Preconference.pdf">Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms</a>, which is well worth a read.</p>