
Can a credit card be sexist?

The Apple Card controversy illustrates how a history of bias in credit lending, coupled with discriminatory AI algorithms, hurts women

November 12, 2019 at 10:00 a.m. EST

The world of credit — the approval of card holders, the assigning of limits — can be opaque, even “baffling,” for some.

Take Karen Smith, a 47-year-old freelance writer and tech consultant living in the Chicago suburbs. She’s been an independent contractor since her second child, who’s now 16, was born. Her work schedule has historically fluctuated depending on the demands of her husband’s work.

“It’s somewhat of a typical story of two parents trying to balance out who needed to work more to maximize the family’s potential,” she says.

About seven years ago, Smith says, she was in a “lull period” in which her work income was “pretty low” for the year. She and her husband decided they should get another credit card. But when she signed up as the first applicant on the joint credit application, she was denied. Smith says she has a high credit rating and “an extensive history of positive credit,” with no apparent marks against her.

Smith and her husband appealed the rejection. In conversations with the company, which Smith preferred not to name, it appeared that a significant portion of the family debt — their house — was being assigned to her, but the shared family income was not, she says. After going back and forth a few times, Smith and her husband decided to drop the fight.

“Even with the white male patriarchy as represented by my spouse advocating on my behalf with credit companies, it didn’t matter,” Smith says. “It didn’t permit any sort of exception, any deeper look.”

Smith, among many others, recently shared her story on social media after a strongly worded Twitter thread by tech entrepreneur David Heinemeier Hansson went viral last week. In the thread, Hansson said that he and his wife, Jamie Heinemeier Hansson, had both applied for the Apple Card, which was released in August. He claimed he was given a credit limit 20 times higher than Jamie’s, despite the fact that they shared an income and she had a higher credit score than he did.


Among the other responses on Twitter was one from Steve Wozniak, the co-founder of Apple. He said the discrepancy between his and his wife’s credit limits for the Apple Card, which doesn’t currently allow joint accounts, was 10-fold. “We have no separate bank or credit card accounts or any separate assets,” he wrote on Twitter. “Hard to get to a human for a correction though. It’s big tech in 2019.”

Over the weekend, the New York Department of Financial Services, prompted by Hansson’s tweets, announced it would open an investigation into whether Goldman Sachs discriminates on the basis of sex in the way it sets its credit limits. “Any algorithm that intentionally or not results in discriminatory treatment of women or any other protected class violates New York law,” a spokeswoman for the agency said in a statement Saturday.

Apple deferred requests for comment to Goldman Sachs. Andrew Williams, a spokesperson for the bank, said in a statement that Apple Card applications are “evaluated independently.” The company evaluates an individual’s income and an individual’s creditworthiness, which “includes factors like personal credit scores, how much personal debt you have, and how that debt is managed.”

“We have not and will not make decisions based on factors like gender,” Williams said. He added that the company is looking to enable joint family accounts in the future.

A ‘ludicrously low credit limit’

When Apple first released the Apple Card — billed as “a new kind of credit card, made by Apple, not a bank” — the Hanssons were intrigued. The couple, who have been married for eight years and have three kids, are an “Apple family,” David says: They use iPhones and Macs. But they’re also “privacy conscious,” so they especially liked that Apple, in partnership with its issuing bank, Goldman Sachs, promised it would never sell their data to third parties.

David, a 40-year-old tech entrepreneur, applied for the card first. The application process was quick; using the Wallet app on his iPhone, he input information including his birthday, Social Security number and annual income. He says he “immediately got a reasonable credit limit” and started making purchases.

Jamie, 39, a longtime consultant who is not currently working, followed soon after. She submitted “essentially the same information” that her husband did, according to David. As a married couple living in California, a community property state — meaning that all assets acquired during the marriage are considered “community property” — they split everything 50/50. Jamie claimed the same income as David, and she even had a slightly higher credit score than he did. In a written statement for Fast Company, Jamie explains that “while I am now a mother of three children — a ‘homemaker’ is what I am forced to call myself on tax returns — I am still a millionaire who contributes greatly to my household and pays off credit in full each month.”

But when Jamie was approved for the card, it came back with a “ludicrously low credit limit,” one-twentieth of David’s, he says.

“The only way I could make sense of it — given the fact that we have the same financial situation — like, what’s different between me and my wife? Well, she’s a woman,” he says. Jamie wrote that when she inquired about the credit line, Apple Card representatives told her, “It’s just the algorithm.”

The story illustrates how potential biases in credit lending manifest. Women have long lacked credit parity with men, winning legal protection from credit discrimination only in the 1970s. And today, with the rise of AI algorithms determining everything from credit lending to hiring to advertising, women face another potential source of discrimination.

“These algorithms are trained on data that are a reflection of the world we live in or the world we lived in in the past,” says Meredith Whittaker, a research scientist at New York University and co-founder of the university’s AI Now Institute. “This data irreducibly imprints these histories of discrimination, these patterns of bias.”

That discrimination is “intersectional,” Whittaker says, and disproportionately hurts women of color.
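The mechanics Whittaker describes can be made concrete with a toy model. In the hypothetical Python sketch below, which uses entirely synthetic data and is not any lender’s actual system, a model is trained on past approval decisions that penalized women. Gender is never given to the model as an input, yet a correlated stand-in feature lets the old pattern reappear in its predictions.

```python
# A minimal sketch of the dynamic Whittaker describes: a model trained on
# historically biased decisions reproduces the bias even with gender removed,
# because a correlated feature acts as a proxy. All data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
gender = rng.integers(0, 2, size=n)      # 1 = woman (synthetic)
income = rng.normal(80, 20, size=n)      # household income, in $1,000s

# Historical approvals directly penalized women (the "imprinted" history).
approved = income - 15 * gender + rng.normal(0, 10, size=n) > 70

# A gender-correlated proxy, e.g., years of credit history in one's own name.
proxy = -gender + rng.normal(0, 0.5, size=n)

# Train on income and the proxy only; gender itself is never an input.
X = np.column_stack([income, proxy])
model = LogisticRegression().fit(X, approved)

for g, label in ((0, "men"), (1, "women")):
    rate = model.predict(X[gender == g]).mean()
    print(f"predicted approval rate, {label}: {rate:.2f}")
```

In runs of this sketch, the model approves the synthetic women at a visibly lower rate than the synthetic men despite never seeing the gender column, which is the sense in which, per Whittaker, training data “irreducibly imprints” past discrimination.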

Hansson acknowledges that he and Jamie are in a “very privileged” situation. “We didn’t need this credit card; we wanted it for privacy implications,” he says. “But there might very well be people who need this to happen, who need extra credit. For them, this is a material issue.”

A credit card of her own

Well into the 20th century, women struggled to get approved for credit cards. As the Smithsonian reports, any woman looking to open a card was subject to discriminatory questions — whether she was married, if she planned to have children. Many banks required single, divorced or widowed women to bring a man along with them to cosign for a card.

It wasn’t until 1974 that the Equal Credit Opportunity Act made it illegal for any creditor to take gender, race, religion or national origin into account. But discrepancies still exist today. An analysis from the Federal Reserve found that single women under 40 had lower credit scores than comparable single men, a gap reflecting single women’s “more intensive use of credit” — an outcome, the study’s author notes, that may stem from economic circumstances, employment and “men and women being potentially treated differently by the credit market and institutions.” Women’s lower credit, many observers note, is also tied to the gender pay gap.

Vrinda Gupta, a 28-year-old living in Berkeley, Calif., knows the credit card world well. She worked on Visa’s credit card team for years before she was confronted with a situation that led her to launch Sequin, a credit card marketed to women.

It all came to a head in 2016, when Gupta’s Visa team helped launch the Chase Sapphire Reserve card, she says. Other members of her team applied for the card and “loved it.” So she decided to apply, too. At that point, she was already a Chase card holder. But when she applied online for the Sapphire Reserve while at work, she was rejected.

“I just felt really helpless in that moment,” Gupta says. She says she didn’t understand initially why she was rejected, being “in the industry” and a “millennial with privilege, where I was making a pretty decent income.” Gupta’s own research at Visa had investigated why millennials were denied credit cards, and how to increase their financial literacy. She reached out to Chase to find out why she’d been denied after her co-workers suggested she may not have met the credit line requirements. After talking with an agent and explaining her situation, she was eventually approved, she says.

“As we told [Gupta] verbally and in writing at the time, this was purely a debt-to-income decision,” Patricia Wexler, a spokeswoman for Chase, said in a statement. “Given how much she said her income was, it would have been entirely irresponsible for anyone to extend her more credit than she already had, given she was at her limit. She asked if we could move around credit from one of her existing lines, and we agreed to do that and opened a Sapphire Reserve card for her — keeping her total credit limit flat.”
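Chase’s explanation turns on a simple ratio. The hypothetical sketch below, in which every figure and the threshold are invented for illustration, shows how a credit-to-income rule of the kind the statement describes can both trigger a denial and then be satisfied by reallocating an existing line, keeping total exposure flat.

```python
# Hypothetical illustration of a debt-to-income-style check like the one
# Chase's statement describes. All figures and the threshold are invented.
def credit_to_income(total_limits_usd: float, annual_income_usd: float) -> float:
    """Ratio of total extended credit lines to annual income."""
    return total_limits_usd / annual_income_usd

income = 60_000        # stated annual income (invented)
existing = 55_000      # credit already extended across existing cards
requested = 10_000     # limit requested for the new card
cap = 1.0              # hypothetical maximum ratio an issuer will allow

ratio = credit_to_income(existing + requested, income)
print(f"with new card: {ratio:.2f} -> {'decline' if ratio > cap else 'approve'}")

# The resolution Chase describes: move credit from an existing line onto
# the new card, so the total extended -- and the ratio -- stays flat.
ratio_flat = credit_to_income(existing, income)
print(f"after reallocating: {ratio_flat:.2f} -> "
      f"{'decline' if ratio_flat > cap else 'approve'}")
```

Under these invented numbers, approving a new $10,000 line would push the applicant past the cap, while shifting $10,000 from an existing card onto the new one leaves the ratio unchanged, matching the resolution Wexler describes.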

The algorithmic ‘black box’

The “black box” algorithms dictating credit outcomes are increasingly being scrutinized. Earlier this year, a widely publicized study found that facial recognition algorithms were significantly more likely to mix up black women’s faces than those of white women, or those of black or white men. That bias parallels the kind that many other AI technologies exhibit, says Whittaker, the researcher at NYU. “We’ve had now years of evidence showing that yes, these systems reproduce biases, and they reproduce them along the lines of racial and gender and other types of discrimination that we see present in our culture and social institutions,” she says.


Bias in AI is a result of several factors, according to Whittaker. A report from the AI Now Institute explores how a “diversity crisis” in the AI sector contributes to unintentional bias (studies have found that more than 80 percent of AI professors are men, for example). Data sets often disproportionately represent white and male subjects, too. But the real issue is more systemic, Whittaker says.

“We have to look at the incentive structures that are driving big tech,” she says. “They are discriminatory and they’re harming a handful of people, but they’re also making a handful of people extraordinarily rich. It’s an uncomfortable fact, but there are winners, even though there are a significant number of losers, to these patterns of discrimination.”

The first step in combating these patterns is acknowledging that bias does exist in algorithms — that they’re not simply “neutral and objective” — according to Whittaker. Other steps include increasing diversity within the AI sector and making the algorithms themselves more transparent.

Hansson is a big proponent of that last piece. “This isn’t just about one wife and her husband getting mad about one isolated instance,” he says. “Algorithms need to be transparent, they need to be accountable, they need to be correctable.”

Gupta’s experience at Visa led her to begin building the Sequin card, which she says tries to tackle credit challenges affecting women in a variety of ways, including by giving cash back on items that might fall under the “pink tax” — retail and beauty products, for example. The primary aim is to increase transparency throughout the credit card experience, from explaining what it takes to build credit to making credit card payments predictable, she says.

For her part, Smith, the freelancer, has taken matters into her own hands.

She eventually took a credit offer from her bank; based on her banking history with the bank, Smith surmises she was an attractive credit customer. “I’ve taken these very active steps, and I’ve passed this on to friends of mine who have been in similar situations, where they’re taking a career pause or caring for older relatives,” she says. “We have to always be on guard. We have to protect our own interests.”