Imagine if that kind of omnipresent AI were installed at dens of abuse like Fox News or the Weinstein Co., picking up on the culture and learning from its worst members.
Similarly, Boston University researchers trained AI on text from Google News.
They then asked the software to complete this sentence: “Man is to computer programmer as woman is to X.” The AI replied, “Homemaker.” The AI learned what’s already out there in the culture.
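The researchers' result comes from word embeddings, which represent words as vectors so that analogies can be solved with simple arithmetic ("man is to programmer as woman is to X" becomes programmer − man + woman, then finding the nearest word). Here is a minimal sketch of that arithmetic using tiny hand-made vectors; the words and two-dimensional vectors are illustrative inventions, not the actual Google News embeddings, which have hundreds of dimensions.

```python
import numpy as np

# Toy word vectors, hand-crafted for illustration only.
# Real embeddings (e.g., word2vec trained on Google News) are
# learned from text and typically have 300 dimensions.
vecs = {
    "man":        np.array([1.0, 0.0]),
    "woman":      np.array([0.0, 1.0]),
    "programmer": np.array([1.0, 1.0]),
    "homemaker":  np.array([0.0, 2.0]),
    "doctor":     np.array([1.0, 0.5]),
}

def analogy(a, b, c):
    """Solve 'a is to b as c is to X' via vector arithmetic."""
    target = vecs[b] - vecs[a] + vecs[c]

    def cos(u, v):  # cosine similarity between two vectors
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

    # Return the most similar word, excluding the query words.
    candidates = [w for w in vecs if w not in (a, b, c)]
    return max(candidates, key=lambda w: cos(vecs[w], target))

print(analogy("man", "programmer", "woman"))  # → homemaker
```

In these toy vectors the biased answer falls out of the arithmetic the same way it did in the study: the embedding space simply reflects the associations present in the training text.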
If technologists can manage to tune AI to detect and counter bias or abuse, it could have a positive impact on work culture.
“It’s a really important question,” says Eric Horvitz, who runs Microsoft Research.
It wouldn’t be possible to manually encode every nuance of driving a car into software.
But when you equip cars with sensors and cameras that can pull in everything going on inside and around the car as a human drives, the AI can automatically learn from the driver’s billions of subtle actions. The AI can be fed rules—like “Don’t go over the speed limit” or “Don’t beep the horn to scare cyclists”—that can counter some bad human habits. But good luck identifying and countering all of those habits.

All of that data can feed AI, which can learn how employees are acting and, for instance, identify groups that have low morale or spot a low-level employee who seems ripe for promotion. If troublesome white-male biases get into such AI, it could have repercussions for women or minorities, adding prejudice to hiring, promotion and salary decisions.

“When should we change reality to make our systems perform in an aspirational way?