Paige Tutt described the use of multicultural emojis as a new means for individuals to spread racist messages online:
The emoji are being used to make racist comments on social media and insert questions of race in texts and tweets where it may never have arisen before. Instead of correcting its mistake — excluding people of color from emoji — Apple has, in some ways, made the situation worse.
Technology can never be predictive, only descriptive. It reflects our human nature, all of our inherently marvelous and unfavorable qualities. As societies around the world increasingly rely on data to make decisions, it's important that we take a step back and ask critical questions about how that data is gathered, analyzed, and displayed. Dr. Alvaro Bedoya illustrates how unexamined biases exist in data:
This video is part of a longer presentation in which he and Dr. Latanya Sweeney discuss big data, inequality, and discrimination.
Signing off with my newly created bitmoji:
I recently came across two pieces that speak to this topic.
h/t to FemTechNet for the link to The Guardian's article about machine logic.
h/t to D’Arcy Norman for posting this video that talks more about how our biases are written into code.