AI needs regulation to avoid discrimination
Qihang Lin
Jun. 20, 2023 9:01 am, Updated: Jun. 21, 2023 9:02 am
Artificial intelligence is playing an ever-larger role in our lives, from approving mortgages and credit cards to deciding who gets a job interview to choosing which advertisements we see in our social media feeds.
But those AI algorithms can be unfair, unwittingly discriminating against people based on race, gender, health conditions, and a range of other factors reflected in the data they’re given. Because AI will be so critical in shaping the direction of our lives, it’s imperative that an independent third-party body be created to regulate AI systems, ensuring that decisions made by algorithms are fair and don’t discriminate against disadvantaged groups.
I’ve been researching discrimination in AI systems as part of a team funded by a National Science Foundation grant. We’ve found that without safeguards in place, algorithms can easily become discriminatory if the people designing and programming them are not careful. AI relies on machine learning, in which an algorithm analyzes enormous amounts of data to learn how to perform a task. As more data is added, the algorithm learns more about its task and adjusts how it does things, “learning” as it goes, much as a person does when encountering new information.
However, algorithms can learn discriminatory things based on the data. Job recruiting tools have unknowingly discriminated against women. Black people arrested for a crime have been unfairly flagged as a greater risk to reoffend than white people.
In health care, an algorithm concluded that Black people were less susceptible than white people to certain types of illnesses, based on data showing that Black people are tested for or hospitalized with those illnesses less often. But what the algorithm wasn’t told is that fewer Black people have access to health care, so even if they have the illness, they are less likely to see a doctor or get tested.
The AI model concluded that fewer Black people than white people had a disease because they were healthier, when the gap is actually a sign of reduced access to health care resources. A regulatory and certification body would be able to review that algorithm and make sure it is designed to account for Black people’s diminished access to health care before it is used to make decisions that affect lives.
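To make that failure mode concrete, here is a minimal sketch, using invented groups, rates, and data rather than anything from our research or a real health system, of how a model trained on who was tested or hospitalized, rather than on who is actually ill, ends up understating illness in the group with less access to care:

```python
# Illustrative sketch only: synthetic data in which both groups are equally
# likely to be ill, but one group is tested less often, so the recorded label
# understates its illness. All names and rates here are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000
group = rng.integers(0, 2, n)              # 0 = better access to care, 1 = worse access
truly_ill = rng.random(n) < 0.10           # true illness rate is the same for both groups
p_tested = np.where(group == 0, 0.9, 0.4)  # chance of being tested/hospitalized if ill
recorded_ill = truly_ill & (rng.random(n) < p_tested)  # the label the model actually sees

# Train on the recorded label, as a careless pipeline might.
X = group.reshape(-1, 1).astype(float)
model = LogisticRegression().fit(X, recorded_ill)

for g in (0, 1):
    predicted = model.predict_proba([[g]])[0, 1]
    true_rate = truly_ill[group == g].mean()
    print(f"group {g}: model says {predicted:.3f}, true illness rate {true_rate:.3f}")
```

In this toy example the two groups are equally likely to be ill, yet the model reports a much lower rate for the group that is tested less often; that is exactly the kind of gap an independent reviewer should be looking for.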
The regulatory organization could be a government agency that operates much the way the U.S. Department of Agriculture holds food producers to food safety standards. Or it could be an industry group, similar to the way ISO certification is awarded by an independent third party to businesses that meet certain management or operational standards. In fact, there already are ISO standards that apply to AI in areas such as security, but none of them pertain to discrimination.
The body — whether a government agency or industry group — would consult with AI researchers, legal experts, and domain experts to develop and standardize a procedure for assessing the fairness of a business’ AI systems. Laws and policies would be put in place that require AI systems to be certified as fair and unbiased before deployment.
This new body would have the authority to audit the data-driven decision-making systems used by businesses, governments, hospitals, and other organizations to ensure they are fair, transparent and trustworthy.
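What might one such audit check look like? As a purely hypothetical illustration (the function, the 10 percent threshold, and the toy data below are my own invention, not part of any existing standard), an auditor could compare how often a system's decisions come out favorably for each group:

```python
# Hypothetical audit sketch: a crude demographic-parity check that flags a
# system when favorable-decision rates differ too much across groups.
import numpy as np

def audit_selection_rates(decisions, groups, max_gap=0.1):
    """Return True if favorable-decision rates for all groups fall within max_gap."""
    rates = {g: decisions[groups == g].mean() for g in np.unique(groups)}
    gap = max(rates.values()) - min(rates.values())
    print("favorable-decision rate by group:", rates, "| gap:", round(gap, 3))
    return gap <= max_gap

# Toy audit data: 1 = favorable decision (e.g., loan approved), with a group label per case.
decisions = np.array([1, 0, 1, 1, 0, 0, 1, 0, 0, 0])
groups    = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
print("passes this check:", audit_selection_rates(decisions, groups))
```

A real audit would look at far more than a single gap in approval rates, but even a simple check like this surfaces disparities before a system is certified and deployed.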
Such a certification system would add credibility to AI products and reduce public concern that decisions made with algorithms are being done so unfairly. It would also help businesses by reducing the likelihood that their decisions are biased and in violation of state or federal discrimination laws.
Regulations and standards would also promote the applications of AI, and eventually increase both the efficiency and fairness of our society.
Qihang Lin is associate professor of business analytics at the University of Iowa Tippie College of Business.
Opinion content represents the viewpoint of the author or The Gazette editorial board. You can join the conversation by submitting a letter to the editor or guest column or by suggesting a topic for an editorial to editorial@thegazette.com
