A dashboard built into iOS on iPhones lets you manage your card. (Image credit: Apple)

Apple launched its own branded Mastercard nationwide in August. In the months since, the digital-first payment system has won some fans for its easy integration into the iPhone and Apple ecosystem, and it seemed to work about as well as any other credit card. Now, however, financial-services regulators want to know what's going on under the hood amid accusations that the software determining the card's terms has a sexist slant.

What happened?

Software developer and entrepreneur David Heinemeier Hansson took to Twitter late last week to complain about his wife Jamie Heinemeier Hansson's experience with the Apple Card.

"The @AppleCard is such a fucking sexist program," his lengthy thread began. "My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple's black box algorithm thinks I deserve 20x the credit limit she does. No appeals work."

"It gets even worse," he added, sharing a screenshot showing $0 owed on a limit of, apparently, $57.24. "Even when she pays off her ridiculously low limit in full, the card won't approve any spending until the next billing period. Women apparently aren't good credit risks even when they pay off the fucking balance in advance and in full."

Speaking with Apple customer service did no good, he added, with representatives repeatedly deflecting blame to the black box that makes the determinations. Customer service representatives were "very nice, courteous people representing an utterly broken and reprehensible system," Hansson said. "The first person was like 'I don't know why, but I swear we're not discriminating, IT'S JUST THE ALGORITHM.' I shit you not. 'IT'S JUST THE ALGORITHM!'"

Several other men on Twitter chimed in with replies outlining similar experiences. They said their wives, who on paper look like the better credit risks, received significantly less favorable terms on their Apple Cards than they did. One of the responses came from Apple co-founder Steve Wozniak, who tweeted that, although he and his wife have only joint bank accounts and assets, his Apple Card was given a limit 10 times higher than his wife's.

As Hansson's thread went viral and gained media attention, representatives of Apple VIP customer service stepped in. They bumped the credit limit on Jamie's card up to match David's and launched an internal investigation.

External investigation

Apple's VIP support team isn't the only party interested in figuring out whether the company's mysterious algorithm is behaving in discriminatory ways; regulators are now investigating, too.

Hansson's tweets drew the attention of Linda Lacewell, head of the New York Department of Financial Services. "Here in New York State, we support innovation," Lacewell wrote in a blog post Sunday, adding:

However, new technologies cannot leave certain consumers behind or entrench discrimination. We believe innovation can help solve many challenges, including making quality financial services more accessible and affordable. Yet, this can't be accomplished without maintaining public confidence. For innovation to deliver lasting and sustained value, the consumers who use new products or services must be able to trust they are being treated fairly.

All financial products and services offered in New York State are required not to discriminate against protected groups. Those products include the Apple Card, which is issued by New York-based Goldman Sachs.

Goldman Sachs issued a statement Sunday saying the discrepancies happened because credit decisions are made on an individual basis, not taking family factors into account.

"We look at an individual's income and an individual's creditworthiness, which includes factors like personal credit scores, how much debt you have, and how that debt has been managed," the company said. "Based on these factors, it is possible for two family members to receive significantly different credit decisions. In all cases, we have not and will not make decisions based on factors like gender."

CNBC reports that Goldman was "aware of the potential issue" before the card launched in August but chose to move forward anyway. The bank says it is still considering ways of launching shared accounts, including adding multiple cardholders to a single account or allowing for co-signers.

The statement (and the potential for joint accounts or co-signers) does not specifically address why several users reported their wives—in some cases literal millionaires—were given significantly lower Apple Card credit limits and higher interest rates despite being the higher-income earners in the family, having higher credit scores, or both.

Unintended consequences

It's unlikely in the extreme that someone at either Apple or Goldman Sachs sat down, twirled his mustache à la Snidely Whiplash, and said, "Ah ha! Let's treat women worse than men!" Doing so would be both morally and economically stupid, and nobody is accusing the companies of discriminating intentionally.

Decisions made by algorithm, though, have a way of reflecting good old-fashioned human biases, just with even less transparency. It happens in almost every field, and the examples keep piling up.

About a year ago, Amazon had to stop using an AI tool for hiring and recruiting purposes after it turned out not to be advancing female candidates. Essentially, the software looked at the company's current successful workforce, which skews male, and decided "male" must be a determinant of success.

In 2015, ProPublica discovered that Asian American families were likely to be charged significantly more for SAT test-prep services. The algorithm determining price wasn't built expressly to discriminate by race; instead, it used ZIP code—but it charged higher rates in neighborhoods that turned out to be predominantly Asian.
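The mechanism at work in that pricing case can be shown with a minimal, self-contained sketch. Everything here is invented for illustration (the customers, ZIP codes, and prices are hypothetical, not ProPublica's data): the pricing function never sees the protected attribute at all, yet a group-level disparity emerges because ZIP code correlates with group membership.

```python
# Hypothetical customers as (zip_code, group) pairs. The pricing model
# only ever receives the ZIP code, never the group label.
customers = [
    ("94000", "A"), ("94000", "A"), ("94000", "B"),
    ("95000", "B"), ("95000", "B"), ("95000", "A"),
]

# Hypothetical ZIP-based price table -- no protected attribute is an input.
zip_prices = {"94000": 100, "95000": 160}

def quote(zip_code: str) -> int:
    """Price depends only on ZIP code."""
    return zip_prices[zip_code]

def avg_price(group: str) -> float:
    """Average quoted price for one group's customers."""
    quotes = [quote(z) for z, g in customers if g == group]
    return sum(quotes) / len(quotes)

# Group A skews toward the cheaper ZIP, group B toward the pricier one,
# so average prices diverge even though the model is "blind" to group.
print(avg_price("A"))  # prints 120.0
print(avg_price("B"))  # prints 140.0
```

This is the proxy problem in a nutshell: removing a protected attribute from a model's inputs does not remove its influence when another feature encodes it.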

Algorithms with systemic biases are also pervasive in the criminal justice system.