Saturday, November 23, 2019

How facial analysis software that classifies gender misidentifies trans and non-binary people, is inaccurate with people of color, and could harm these groups (Rachel Metz/CNN)

Rachel Metz / CNN:
San Francisco (CNN): Artificial intelligence doesn't know what to make of Os Keyes. The 29-year-old graduate student is dark-haired …



from Techmeme https://ift.tt/2QJHdLt
