S.H.E. is helping take the bias out of search. Add S.H.E. to your browser to give women’s transformations the visibility they deserve.
This is an excellent-looking tool.
Well, any computer scientist or experienced programmer knows right away that being “made of math” does not demonstrate anything about the accuracy or utility of a program. Math is a lot more of a social construct than most people think. But we don’t need to spend years taking classes in algorithms to understand how and why the types of algorithms used in artificial intelligence systems today can be tremendously biased. Here, look at these four photos. What do they have in common?
The founders of Predictim want to be clear with me: Their product—an algorithm that scans the online footprint of a prospective babysitter to determine their “risk” levels for parents—is not racist. It is not biased.
Another example of a company insisting “we don’t have bias in our AI” when it seems patently obvious that they do. I wonder how one would prove, mathematically, that a system doesn’t have bias?
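For what it’s worth, you can’t “prove” the absence of bias just by pointing at the math; at best you can measure specific disparities and show they are small. Below is a minimal sketch, in Python with entirely made-up scores and group labels (nothing to do with Predictim’s actual model or data), of two common group-fairness checks: the selection rate and the false positive rate per group.

```python
# A minimal sketch of one way to *measure* bias in a scoring model, using two
# common group-fairness metrics. All data below is hypothetical and purely
# illustrative.

from dataclasses import dataclass
from typing import List


@dataclass
class Record:
    group: str      # protected attribute, e.g. a demographic group label
    score: float    # model's "risk" score; higher = flagged as riskier
    label: int      # ground truth: 1 = actually problematic, 0 = not


def flag_rate(records: List[Record], group: str, threshold: float) -> float:
    """Fraction of a group the model flags as high risk (selection rate)."""
    members = [r for r in records if r.group == group]
    flagged = [r for r in members if r.score >= threshold]
    return len(flagged) / len(members)


def false_positive_rate(records: List[Record], group: str, threshold: float) -> float:
    """Among people who are actually fine (label == 0), how often the model
    still flags them. Large gaps between groups indicate disparate harm."""
    innocent = [r for r in records if r.group == group and r.label == 0]
    wrongly_flagged = [r for r in innocent if r.score >= threshold]
    return len(wrongly_flagged) / len(innocent)


if __name__ == "__main__":
    # Hypothetical scores for two groups, A and B.
    data = [
        Record("A", 0.2, 0), Record("A", 0.4, 0), Record("A", 0.9, 1), Record("A", 0.3, 0),
        Record("B", 0.7, 0), Record("B", 0.8, 0), Record("B", 0.95, 1), Record("B", 0.6, 0),
    ]
    t = 0.5
    for g in ("A", "B"):
        print(f"group {g}: flag rate = {flag_rate(data, g, t):.2f}, "
              f"false positive rate = {false_positive_rate(data, g, t):.2f}")
    # "No bias" would have to mean gaps like these are near zero -- and even
    # then, only for the particular metrics you chose to check.
```

So a claim of “no bias” is really a claim about particular metrics on particular data, and different fairness metrics can’t all be satisfied at once in general. That’s exactly why “it’s just math” doesn’t settle anything.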