
Maybe The Apple And Goldman Sachs Credit Card Isn’t Gender Biased


The Apple-branded credit card is under scrutiny because women are receiving less credit than spouses who share their income and credit score. In launching the card, Apple partnered with Goldman Sachs, and Goldman is the issuing bank. Now, Goldman’s credit review process is being labeled sexist by Elizabeth Warren and several high-profile tech executives. Are the accusations fair?

Tech entrepreneur David Heinemeier Hansson first raised the issue when he tweeted that the Apple Card’s algorithms discriminated against his wife, giving him 20 times the credit limit it had given her. Apple cofounder Steve Wozniak weighed in, asserting that he can borrow ten times as much as his wife on their Apple Cards. Hansson said his wife has a better credit score than he does, and that the couple file joint tax returns. Similarly, Wozniak and his wife also file joint returns and share credit card and bank accounts.

The picture that’s been painted is of an AI algorithm that has been able to find a proxy for gender which the algorithm is using to determine credit limits. “My belief isn’t there was some nefarious person wanting to discriminate. But that doesn’t matter. How do you know there isn’t an issue with the machine-learning algo when no one can explain how this decision was made?” Hansson said in an interview with Bloomberg.

And a sexist algorithm is certainly a possibility. How would this work? The credit card application doesn’t specifically ask for an applicant’s gender. Instead, a sophisticated machine-learning algorithm could home in on a proxy for gender and then apply it to determine credit limits. For example, it could “learn” that those who have credit cards for a particular women’s clothing store are a bad risk—and almost all of them would be women. It could then provide lower credit limits to those who carry this card, and this would inadvertently result in women receiving lower credit limits than men.
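To make the proxy mechanism concrete, here is a minimal, hypothetical sketch in Python. It is not Goldman’s model or anything like it: the applicant data, the store-card correlation, and the credit-limit rule are all invented for illustration. The point is only that a decision rule can produce a gender gap without ever seeing gender as an input.

```python
# Hypothetical illustration of proxy discrimination. The "model" below is a
# single hand-written rule standing in for whatever a real system might learn;
# every name and number here is invented.
import random

random.seed(0)

def simulate_applicant():
    """Generate one synthetic applicant. Gender is recorded only so we can
    audit the outcome afterward -- the decision rule never sees it."""
    gender = random.choice(["F", "M"])
    # Assumed for illustration: the store card is far more common among women.
    has_store_card = random.random() < (0.7 if gender == "F" else 0.05)
    return gender, has_store_card

def credit_limit(has_store_card):
    """A toy 'learned' rule: the system has associated the store card with
    higher default rates, so it cuts the limit for cardholders."""
    return 5_000 if has_store_card else 20_000

applicants = [simulate_applicant() for _ in range(10_000)]

def avg_limit(gender):
    limits = [credit_limit(card) for g, card in applicants if g == gender]
    return sum(limits) / len(limits)

print(f"average limit, women: {avg_limit('F'):,.0f}")
print(f"average limit, men:   {avg_limit('M'):,.0f}")
# Women end up with a much lower average limit even though gender was never
# an input -- the store card acted as a proxy for it.
```

Auditing for this kind of effect requires exactly the step the sketch takes at the end: comparing outcomes across a protected attribute that the model itself never used.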

But Wired’s editor-in-chief, Nick Thompson, told CBS This Morning that he’s doubtful the algorithms are sexist. “One of the reasons when this story broke that I was a little skeptical is that probably the system is set up to minimize defaults, and a system that’s set up to minimize defaults is not going to give all men 20 times the credit limit of all women. Perhaps you would have a tiny difference based on historical data,” he said.

That’s not to say that bias in AI algorithms isn’t a problem. Thompson claims that undetected bias in algorithms is “one of the biggest problems of our time right now.” Yet, he believes the Goldman credit card story is “not the best example of this huge problem.”

In fact, Thompson thought the differences in credit limits were more likely related to the exceptionally high incomes of Wozniak and other tech executives. It’s not beyond reason to assume that the cofounder of Apple might appear to be a good credit risk. I’d certainly lend him money.

For its part, Goldman has said there is no algorithm making decisions based on unknown factors. Defending the bank’s assignment of credit limits, Goldman spokesperson Patrick Lenihan said there is no “black box”: “For credit decisions we make, we can identify which factors from an individual’s credit bureau-issued credit report or stated income contribute to the outcome. We welcome a discussion of this topic with policymakers and regulators.” In a tweet, Goldman also responded to the accusations, saying they “reviewed their credit process to guard against unintended biases and outcomes.”

I applied for the Apple credit card, and the application only required me to enter my name, address, Social Security number and birthdate. My husband also applied and did indeed receive a higher credit limit than I did (and, interestingly, he also had to provide a photo of his driver’s license in the application process, while I did not). Unable to turn down the discounts offered for opening new accounts, I have far more credit cards than my husband. That is my best guess as to why he received a higher credit limit.

Regarding credit history, research has revealed a number of gender differences. For example, women tend to have more credit cards open than men do. Women also tend to have higher installment loan balances, higher revolving credit utilization rates, and greater prevalence of delinquency and bankruptcy than otherwise comparable men. The causes for these gender differences can’t be blamed on Goldman Sachs. The big question is whether Goldman has an algorithm that has learned to constrain the credit limits of all women based on these averages.

I’m not defending Goldman Sachs; I just want to point out that we really don’t yet know whether their credit limit allocation is sexist—and we shouldn’t jump to conclusions. Fortunately, the New York State Department of Financial Services (NYSDFS) has launched an investigation into the alleged gender bias, so hopefully it will ultimately uncover whether gender bias played a role. This type of investigation is critically important, and if a sexist algorithm is to blame, we can all learn from Goldman’s mistakes. And if it’s not, we can learn from our own.
