
Master Talks 3: Algorithms may become biased and need governance

Peng Chen, DIGITIMES, Taipei

Algorithms are changing our lives in many ways. However, Cathy O'Neil, founder of O'Neil Risk Consulting & Algorithmic Auditing, said that in some cases the mathematical models are embedded with opinions and can become "weapons" that cause harm.

The mathematician and author revealed the downsides of algorithms at "Building a Better World," a master series organized by Epoch Foundation and MIT Sloan School of Management.

O'Neil said artificial intelligence and big data have given the public the sense, through marketing, that they are neutral and trustworthy. But in her opinion, algorithms are "perfect mechanisms to hide behind sophisticated mathematics." Sometimes, she said, those in power hide behind the math and wield it for or against people.

"I would like to do the opposite. I would like to explain exactly what's happening so that everyone can understand it," O'Neil said.

"It's not a math test. Everyone can ask the right questions. The right kind of skeptical questions about the algorithms that have so much impact on our lives," she added.

The algorithms that O'Neil referred to predict success in some form. Building one requires two ingredients: a definition of success and historical data in which to look for patterns from the past.
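The recipe O'Neil describes can be sketched in a few lines of Python. This is a hypothetical illustration, not any real company's model: "success" is whatever label the builder chooses, and the model simply reproduces whatever patterns sit in the historical records it is given, including past bias.

```python
# Hypothetical hiring model: "success" is defined by whoever builds the
# algorithm, and the model learns patterns from historical data.

# Historical records: (years_experience, attended_elite_school, was_hired)
history = [
    (5, True, True),
    (2, True, True),
    (6, False, False),  # past bias: experienced but non-elite, not hired
    (1, False, False),
]

def train(records):
    """Learn a crude pattern: weight each feature by how strongly it
    co-occurred with the chosen definition of "success" (was_hired)."""
    weights = [0.0, 0.0]
    hired = [r for r in records if r[2]]
    for years, elite, _ in hired:
        weights[0] += years / len(hired)
        weights[1] += (1.0 if elite else 0.0) / len(hired)
    return weights

def predict(weights, years, elite):
    # Higher score = predicted "success" under the builder's definition.
    return weights[0] * years + weights[1] * (20.0 if elite else 0.0)

w = train(history)
# The model favors elite-school applicants because past hiring did,
# even against greater experience:
elite_favored = predict(w, 6, False) < predict(w, 2, True)
```

The point of the sketch is that nothing in the code checks whether the historical hiring decisions were fair; the pattern is inherited as-is.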

In the speech, titled "Weapons of Math Destruction," O'Neil discussed several cases in which an algorithm served the people who built and deployed it rather than those it affected. The story of Kyle, a Vanderbilt University student, was one of them.

Kyle took some time off to receive treatment for bipolar disorder at a hospital, where he took several mental health assessments. O'Neil said that when he later applied for a job, he was rejected because he failed a "personality test."

According to O'Neil, Kyle recognized that some questions in the test were the same as the ones he had answered in the mental health assessments. His father, a lawyer, realized the practice violated the Americans with Disabilities Act and filed lawsuits against every company that used that particular test.

Kyle's story is a typical example of what O'Neil calls "a weapon of math destruction." The mathematician said the idea involves three things: a high-impact "test" that affects many people; secrecy (Kyle was lucky to learn what the questions were); and destructiveness. In this case, the test destroyed job opportunities for the very people the law was meant to protect.

"It wasn't just destructive for one person. It was destructive for an entire class of people," O'Neil said.

The same issue has appeared in the American healthcare system. For example, health insurance company Optum used an algorithm to identify clients with complicated health problems. O'Neil said the company intended to help these clients avoid unnecessary procedures that generate extra costs, an act that would eventually benefit the insurer as well.

However, the algorithm was found to miss people with complicated health issues who had been undertreated, especially some African Americans. O'Neil said that because these people had not received treatment in the past, the algorithm did not conclude they needed help, and the insurance company therefore did not offer them further services.

O'Neil said the problem stemmed from the way Optum defined "success" when building the algorithm. The company used "cost," which is easier to measure, as a stand-in for "complicated health problems," and that proxy caused the pattern matching to break down.
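The proxy failure described above can be shown numerically. In this hypothetical example (the names, numbers, and functions are invented for illustration, not taken from Optum's system), two patients are equally ill, but one belongs to a group that historically received less care; ranking by past cost instead of actual need pushes the undertreated patient to the bottom of the list.

```python
# Hypothetical illustration of a proxy failure: ranking patients by
# historical cost (easy to measure) instead of actual health need.

patients = [
    # (name, true_severity, past_spending_in_dollars)
    ("A", 9, 40_000),  # seriously ill, historically well treated
    ("B", 9, 8_000),   # equally ill, but historically undertreated
    ("C", 3, 12_000),  # mildly ill, moderate spending
]

def rank_by_cost(pts):
    """The flawed model: past spending stands in for "complicated
    health problems," so low past spending reads as low need."""
    return sorted(pts, key=lambda p: p[2], reverse=True)

def rank_by_severity(pts):
    """What the model was meant to capture: actual health need."""
    return sorted(pts, key=lambda p: p[1], reverse=True)

by_cost = [p[0] for p in rank_by_cost(patients)]
by_need = [p[0] for p in rank_by_severity(patients)]
# Ranked by cost, the undertreated patient B falls below the mildly
# ill patient C, so the outreach program never reaches the person
# who needs it most.
```

The gap between `by_cost` and `by_need` is exactly the breakdown O'Neil describes: the proxy is faithful to spending history, not to illness.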

To avoid the problems caused by flawed algorithms, O'Neil suggested setting up an approval process for high-impact algorithms to ensure they are effective and safe. The process should also require businesses that profit from the algorithms to define success openly and to promise that the models will not harm particular groups, such as minorities or people with mental health issues.