How Algorithms Can Lower Minority Credit Ratings


In an episode of Black Mirror, the world operates on a rating system. Just as you and your Uber driver rate each other after a ride, each person in the episode’s fictional world is rated on a five-star scale after every interaction. Each individual’s average score affects their access to basic goods and services, such as housing and transportation.

In real life, too, people are tracked by scores and ratings that can grant or deny opportunities, but instead of being determined by other people, these scores are assembled by opaque algorithms. From credit scoring to hiring, automated systems make decisions based on a wealth of personal information, often without revealing what kinds of information go into the calculation. Some even take people’s friends, family, and acquaintances into account to make assumptions about their character, which some privacy experts say could lead to discrimination.

A German company called Kreditech, for example, asks loan seekers to share access to their social media accounts, which it combs through for details about their friends. Being connected to someone who has already paid off a loan with the company is “usually a good indicator,” the company’s chief financial officer told the Financial Times.

In India and Russia, FICO, the company behind FICO credit scores, is partnering with startups like Lenddo to gather information from applicants’ cellphones. Lenddo uses the locations reported by an applicant’s phone to determine whether they really live and work where they say they do, then analyzes the applicant’s network to determine whether they are in touch with “good borrowers or with people with a long track record of cheating lenders,” Bloomberg reports.
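Bloomberg’s description suggests a fairly simple consistency check. The sketch below is a hypothetical reconstruction in Python, not Lenddo’s actual method: it treats an application as consistent if enough of the phone’s reported GPS pings fall near the claimed home or work address. The function names, the 2 km radius, and the 70 percent threshold are all assumptions for illustration.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(p, q):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def address_consistent(claimed_points, pings, radius_km=2.0, min_fraction=0.7):
    """Flag whether enough phone pings land near a claimed home/work address.

    claimed_points: list of (lat, lon) for the addresses on the application.
    pings: list of (lat, lon) reported by the applicant's phone.
    """
    if not pings:
        return False  # no location data, nothing to verify
    near = sum(
        1 for ping in pings
        if any(haversine_km(ping, c) <= radius_km for c in claimed_points)
    )
    return near / len(pings) >= min_fraction

# Example: claimed home in central Mumbai; three pings nearby, one far away.
home = (19.0760, 72.8777)
pings = [(19.07, 72.88), (19.08, 72.87), (19.085, 72.885), (28.61, 77.21)]
print(address_consistent([home], pings))  # True: 3 of 4 pings are close
```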

Taking more types of information into account can help people who have no credit score, or who lack the usual indicators of creditworthiness, access loans and bank accounts that might otherwise be closed to them.

But the more complex and opaque these powerful algorithms become, the more ways people can be disqualified from jobs and loans, and the harder it is to know why. Moreover, systems that factor in the actions of people’s family and friends risk assigning guilt by association, denying someone opportunities because of who they know or are related to. They can limit a person’s chances of upward mobility based solely on the social group they happen to belong to.

A person living in a low-income community, for example, is likely to have friends and family with similar income levels. Someone in that extended network is more likely to have a poor repayment history than someone in the network of an upper-middle-class, white-collar professional. If a scoring algorithm took this fact into account, it could lock out the low-income person based solely on their social environment.
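To make the mechanism concrete, here is a deliberately simplified Python sketch, not any lender’s real model: two applicants with identical personal finances receive different scores purely because their contacts have different repayment histories. The feature names and weights are invented for illustration.

```python
def network_repayment_rate(contacts):
    """Fraction of an applicant's contacts who repaid their loans."""
    return sum(c["repaid"] for c in contacts) / len(contacts)

def credit_score(applicant, w_income=0.5, w_network=0.5):
    """Toy score: a weighted mix of personal income and the network feature.

    The weights are arbitrary; the point is only that any nonzero
    w_network lets the social environment move the score.
    """
    income_component = min(applicant["income"] / 100_000, 1.0)
    network_component = network_repayment_rate(applicant["contacts"])
    return 100 * (w_income * income_component + w_network * network_component)

# Two applicants: identical income, different social environments.
low_income_network = [{"repaid": True}, {"repaid": False}, {"repaid": False}]
affluent_network = [{"repaid": True}, {"repaid": True}, {"repaid": True}]

a = {"income": 40_000, "contacts": low_income_network}
b = {"income": 40_000, "contacts": affluent_network}

print(credit_score(a))  # ~36.7: penalized for the contacts' defaults
print(credit_score(b))  # 70.0: same finances, better-off network
```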

The Equal Credit Opportunity Act prohibits creditors in the United States from discriminating on the basis of race, color, religion, national origin, sex, marital status, or age, but taking a person’s network into account could allow creditors to sidestep these requirements.

A 2007 Federal Reserve report found that blacks and Hispanics had lower credit scores than whites and Asians, and that “residing in low-income or majority minority census tracts” is a predictor of low credit scores. Since people are likely to have friends and family who live nearby and are of the same race, using social networks to assess their creditworthiness could reintroduce factors that creditors are not allowed to take into account.
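The statistical worry is that an apparently neutral input can act as a proxy for a prohibited one. The hypothetical simulation below makes that concrete: neighborhood drives both the network feature and the (excluded) protected attribute, so the protected attribute can be largely recovered from the “neutral” feature alone. Every parameter here is invented.

```python
import random

random.seed(0)

# Simulate applicants in two segregated neighborhoods. Race is excluded
# from the model, but neighborhood shapes both race and the network feature.
applicants = []
for _ in range(10_000):
    tract = random.choice(["low_income", "high_income"])
    minority = random.random() < (0.8 if tract == "low_income" else 0.2)
    # Contacts' repayment rate also tracks the neighborhood.
    network_rate = random.gauss(0.6 if tract == "low_income" else 0.9, 0.1)
    applicants.append({"minority": minority, "network_rate": network_rate})

# How well does the "neutral" network feature alone recover the protected
# attribute? Classify anyone below the feature's midpoint as minority.
threshold = 0.75
correct = sum(
    (a["network_rate"] < threshold) == a["minority"] for a in applicants
)
print(f"protected attribute recovered {correct / len(applicants):.0%} of the time")
```

With these made-up numbers the feature recovers the protected attribute roughly three-quarters of the time, well above the 50 percent you would expect if it carried no information about race.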

In a 2014 essay published by New America’s Open Technology Institute, three privacy researchers – Danah Boyd, Karen Levy, and Alice Marwick – wrote about the potential for discrimination when algorithms examine people’s social connections:

The notion of a protected class remains a fundamental legal concept, but as individuals increasingly face technology-mediated discrimination based on their position within networks, it may be incomplete. In the most visible examples of network discrimination, it is easy to see inequalities by race and class, as these are often proxies for network position. As a result, we see outcomes that disproportionately affect people who are already marginalized.

Preventing algorithmic discrimination is a challenge. It is not easy to force companies to obey laws that would protect consumers from unfair credit practices, says Danielle Citron, a law professor at the University of Maryland. “We don’t have hard and fast rules. It’s the Wild West in some ways.”

The agencies responsible for enforcing the relevant laws – the Civil Rights Division of the Department of Justice, the Consumer Financial Protection Bureau, and the Federal Trade Commission – have a mixed record of prosecuting companies that violate them, Citron says. And once they are handed over to President-elect Donald Trump’s administration, they may be even less interested in pursuing offenders, preferring instead to “let the free market deal with it,” she anticipates.

A look abroad shows how far businesses – and even governments – are willing to take such ratings. Going far beyond the network-based scoring systems in Russia and India, China is currently testing a system that would give every Chinese citizen a rating extending well past the credit scores we are used to. According to The Wall Street Journal, the system takes into account the usual financial factors but mixes in “social inputs” such as compliance with the law (including traffic violations and paying transit fares), academic honesty, and volunteer activity. The system will likely also draw on online signals such as purchasing habits and interactions with others.
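The Journal’s description amounts to a weighted composite of financial and behavioral inputs. The sketch below is a purely hypothetical illustration of that structure in Python; the inputs, weights, and scale are invented and bear no relation to the actual Chinese system.

```python
# Hypothetical composite score mixing financial and "social" inputs.
# Every weight and input here is invented for illustration.
WEIGHTS = {
    "repayment_history": 0.40,   # the usual financial factor
    "law_compliance":    0.25,   # e.g. no traffic violations, fares paid
    "academic_honesty":  0.15,
    "volunteering":      0.10,
    "online_conduct":    0.10,   # purchases, interactions with others
}

def composite_score(inputs, weights=WEIGHTS):
    """Weighted sum of inputs, each normalized to [0, 1], scaled to 0-1000."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return round(1000 * sum(weights[k] * inputs[k] for k in weights))

citizen = {
    "repayment_history": 0.9,
    "law_compliance": 0.6,   # a couple of unpaid transit fares
    "academic_honesty": 1.0,
    "volunteering": 0.2,
    "online_conduct": 0.7,
}
print(composite_score(citizen))  # 750
```

Note how behavior that has nothing to do with repaying debts, such as an unpaid transit fare, can drag the single number down.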

The resulting score could end up determining whether people can access internet services, jobs, education, transportation, and various other goods and services, much like the universal ratings in Black Mirror.

US laws would prevent such a radical government system from being put in place. But private companies, which are subject to different regulations than the federal government, could come one step closer to such a reality – indeed, credit scores in their current form are already affecting people’s lives in crucial ways. “We’re not as far as we think we are,” says Citron.
