👓 I Tried Predictim's AI Scan for 'Risky' Babysitters on People I Trust | Gizmodo

Read I Tried Predictim's AI Scan for 'Risky' Babysitters on People I Trust (Gizmodo)
The founders of Predictim want to be clear with me: Their product, an algorithm that scans the online footprint of a prospective babysitter to determine their “risk” levels for parents, is not racist. It is not biased.

Another example of an app claiming “We don’t have bias in our AI” when it seems patently clear that it does. I wonder how one would prove (mathematically) that a model didn’t have bias?
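For what it's worth, one common way researchers try to quantify this is with group fairness metrics such as demographic parity and equalized odds. The sketch below is purely illustrative (it has nothing to do with how Predictim actually works), and all the data and function names are hypothetical; it just shows the kind of check one could run on a model's "risky / not risky" flags broken down by group.

```python
# A minimal, hypothetical sketch of two common fairness checks:
# demographic parity difference and equalized-odds gaps.
import numpy as np

def demographic_parity_diff(flagged, group):
    """Largest difference in 'flagged risky' rates between groups."""
    flagged, group = np.asarray(flagged, bool), np.asarray(group)
    rates = [flagged[group == g].mean() for g in np.unique(group)]
    return max(rates) - min(rates)

def equalized_odds_gaps(flagged, actual, group):
    """Gaps in true-positive and false-positive rates across groups."""
    flagged, actual, group = map(np.asarray, (flagged, actual, group))
    tprs, fprs = [], []
    for g in np.unique(group):
        m = group == g
        tprs.append(flagged[m & (actual == 1)].mean())
        fprs.append(flagged[m & (actual == 0)].mean())
    return max(tprs) - min(tprs), max(fprs) - min(fprs)

# Made-up data: 1 = flagged risky, 'actual' = ground-truth risk, two groups.
flagged = [1, 0, 1, 1, 0, 0, 1, 0]
actual  = [1, 0, 0, 1, 0, 1, 1, 0]
group   = ["A", "A", "A", "A", "B", "B", "B", "B"]

print(demographic_parity_diff(flagged, group))      # 0 would mean equal flag rates
print(equalized_odds_gaps(flagged, actual, group))  # (TPR gap, FPR gap)
```

Even then, these metrics can conflict with one another, so "proving" a lack of bias is less a single calculation than a choice of which definition of fairness you are willing to commit to.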
