A year ago, Google introduced a breakthrough artificial intelligence technology called BERT. BERT, which is now used in Google’s internet search engine, learns from digitized information such as Wikipedia entries, news articles, and old books. This has caught the attention of researchers who warn that BERT may pick up and mimic the biases embedded in the sources it learns from, potentially carrying decades-old biases into new technology.
BERT does not treat women and men equally; it is more likely, for example, to associate computer programming with men. As A.I. continues to expand into new domains and products, unexpected biases like this keep being discovered. Computer scientist Robert Munro fed 100 commonly used words into BERT and found that in 99 of 100 cases BERT associated the word with men. Words like “money,” “house,” and “action” were linked to men; the single word associated with women was “mom.” Historical gender bias could thus continue to shape today’s society through A.I. technology like BERT.
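To illustrate the kind of probe Munro describes, the minimal sketch below uses the open-source Hugging Face transformers library to ask a pretrained BERT model to fill in a masked pronoun next to a probe word, then compares the scores it assigns to “he” versus “she.” The sentence template, word list, and model name (bert-base-uncased) are illustrative assumptions, not Munro’s exact methodology.

```python
# A minimal sketch, not Munro's exact method: probe BERT's masked-word
# predictions to see whether it links a probe word with "he" or "she".
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Illustrative probe words drawn from the article.
probe_words = ["money", "house", "action", "mom"]

for word in probe_words:
    # BERT fills the [MASK] token; we restrict the candidates to the two
    # pronouns and compare the scores the model assigns to each.
    results = fill_mask(
        f"[MASK] is thinking about the {word}.",
        targets=["he", "she"],
    )
    scores = {r["token_str"]: r["score"] for r in results}
    leaning = "men" if scores.get("he", 0) > scores.get("she", 0) else "women"
    print(f"{word!r}: he={scores.get('he', 0):.4f} "
          f"she={scores.get('she', 0):.4f} -> leans {leaning}")
```

Running a probe like this across many words and templates gives a rough picture of the associations a model has absorbed from its training data, though real bias audits use far more careful designs.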
Read More: A.I. Systems Echo Biases They’re Fed, Putting Scientists on Guard
This topic of AI system bias is one we have reported on extensively at OODA Loop. For AI to deliver full value to enterprises, governments, and citizens, these and other issues (including security and explainability) will need to be addressed. For tips and techniques on mitigating the risks of AI solutions, see the following OODA Loop special reports: