Socialist Feminism Reading Group 🌹

January 21, 2018 4:00 pm - 6:00 pm

at the Kogod Courtyard, National Portrait Gallery, Chinatown | 8th and F Streets, NW | Washington, DC

Event Page: https://www.meetup.com/DC-DSA/events/245224111/

Please read our Socialist Reading Groups: Participation Guide and join us as we explore and discuss topics within Socialist Feminism 🌹


Note: Occasionally the Kogod Courtyard is taken over for private events. If that happens for our meeting, we will move up to the former snack bar on the 3rd floor of the same building. There are tables and chairs there, and it is quiet, but no food is allowed, alas.

===========================================================

This month, our readings focus on Socialist Feminist Perspectives on Biases in Artificial Intelligence (AI) Algorithms and Machine Learning. As we read these articles, we can think about how bias in AI algorithms reflects and perpetuates the system of sexual hierarchy that ensures order and control via a sexual division of labor that benefits capitalism. Note, too, how some of these articles focus on the profit motive for fixing these biases. Click on the links in the titles below to access the documents:

• Who Trained Your A.I.?: Artificial intelligence systems are only as good as the data used to teach them. A lot of that data is old and biased—and quietly shaping our future, April Glaser, Slate, October 24, 2017 (on AI trained with the Enron email dataset)


• Biased Algorithms Are Everywhere, and No One Seems to Care: The big companies developing them show no interest in fixing the problem, Will Knight, MIT Technology Review, July 12, 2017 (https://www.technologyreview.com/s/608248/biased-algorithms-are-everywhere-and-no-one-seems-to-care/)


• Inside the surprisingly sexist world of artificial intelligence, Sarah Todd, Quartz, October 25, 2015


• AI robots are sexist and racist, experts warn, Henry Bodkin, The Telegraph, 24 August 2017


• Machines Taught by Photos Learn a Sexist View of Women, Tom Simonite, WIRED, August 21, 2017


• How to Keep Your AI From Turning Into a Racist Monster, Megan Garcia, WIRED, February 13, 2017


• Princeton researchers discover why AI become racist and sexist: Study of language bias has implications for AI as well as human cognition, Annalee Newitz, Ars Technica, April 18, 2017


• Algorithms Can Be Pretty Crude Toward Women, Cathy O’Neil, Bloomberg View, March 24, 2017  (Cathy O’Neil is a mathematician who has worked as a professor, hedge-fund analyst and data scientist. She founded ORCAA, an algorithmic auditing company, and is the author of “Weapons of Math Destruction.”)

• Facebook (Still) Letting Housing Advertisers Exclude Users by Race, Julia Angwin, Ariana Tobin and Madeleine Varner, ProPublica, November 21, 2017 (After ProPublica revealed last year that Facebook advertisers could target housing ads to whites only, the company announced it had built a system to spot and reject discriminatory ads. We retested and found major omissions.)
