Problems With Structural Bias With Big Data Automation Models


Automation is not a fad. It is the future of business models in almost every industry. Unfortunately, automation has introduced new risks that brands must prepare for.

One of the biggest concerns raised in recent months is the risk of institutional bias and unintentional discrimination: brands may make decisions based on demographic data drawn from limited sample sizes or flawed data sets.

Decisions built on such data can quietly entrench discrimination, so brands must understand the challenges of structural bias in their data.
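To see why limited sample sizes matter, consider a minimal sketch (a hypothetical simulation, not drawn from any real data set): two groups have an identical true success rate, but an estimate based on a handful of observations from the under-represented group can swing far from the truth, and any automated rule built on that estimate will treat the groups differently.

```python
import random

random.seed(0)

TRUE_RATE = 0.7  # hypothetical: both groups actually perform identically

def observed_rate(sample_size):
    """Estimate the success rate from a random sample of the given size."""
    successes = sum(random.random() < TRUE_RATE for _ in range(sample_size))
    return successes / sample_size

# A well-represented group vs. an under-sampled one
large_sample = observed_rate(10_000)  # should land near the true 0.70
small_sample = observed_rate(12)      # can swing widely on a dozen observations

print(f"large sample estimate: {large_sample:.2f}")
print(f"small sample estimate: {small_sample:.2f}")

# An automated decision rule trained on these estimates would treat the
# under-sampled group differently even though the true rates are equal.
```

The point of the sketch is that the bias is structural, not malicious: nothing in the code singles out a group, yet the small sample alone is enough to produce a skewed picture.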

What problems can biased data create?

Brands are discovering several ways that biased data can affect their business models. Here are some of the issues they must be prepared to address.

Allegations of employment discrimination based on flawed data

A couple of months ago, James Damore, a Google employee, stirred up considerable controversy after publishing his Google manifesto. He cherry-picked studies to present data that yielded an unfavourable view of female employees at Google. Wired’s Megan Molteni argues that this shows the problems poorly selected data can create in an increasingly diverse workforce:

“It wasn’t a screed or a rant, but, judging by his document, Damore clearly feels that some basic truths are getting ignored—silenced, even—by Google’s bosses. So in …

Read More on Datafloq