What is “political” about this?
Sex, Apple and Goldman Sachs? I sense politics is not far.
> Oh I get it. But you replied to a post about what doesn’t affect FICO scores, not what affects credit limits or account interest rates.

You don’t get it. A FICO score is not the only thing that influences what credit limit and APR you'll get; there are far more advanced models for that. You may know what goes into a FICO score, but you don't know what goes into Goldman's credit models.
> But some of those in one way or another get factored into various aspects of lending decisions (credit cards falling under that category), as those involve more than just FICO.

Factors that don’t affect your FICO credit score:
- Being married for many years
- Gender
- Filing joint taxes
- Living in a community property state
- Assets, no matter their value ($1,000 or $100 million) and regardless of whether they are individually or jointly owned
- Income, whether individual, joint, from Social Security, or from a trust fund, and even if it is non-existent. You can be the CEO of a Fortune 10 company or collect cans and bottles for recycling; whether you make $100 a year or $100 million a year, it doesn't matter.
- Whether you rent a run-down shack on the wrong side of the tracks or live in a gated community in a $50 million mansion
Here is what I don’t get: why is it okay to raise car insurance premiums for males, but not okay to consider women more of a credit liability?
Not saying it is true, but lenders shield themselves from financial risk by limiting their exposure to financial ruin.
If it can be proven that females are more of a credit risk than males, why not offer lower credit limits?
Insurance is allowed to use gender as a consideration.
Curious, what’s the particular commentary in relation to the article?
And before you sneakily remove this totally relevant video again, please tell me how it breaks forum rules?
I don’t trust banks at all, but this seems like the only smart move from a PR perspective. It’s giving people the option to be re-checked so they can really see if it’s the company’s supposed bias or just them.
> Curious, what’s the particular commentary in relation to the article?

First, that this whole gender bias thing is a big non-story. As many people more educated than me have already said, there are many factors that go into determining credit limits, and just because two people share assets and bank accounts doesn't mean their credit histories are at all the same. Also, there have been reports of women getting a higher credit limit than their male partners, so this big story about gender bias with the Apple Card is just a big pile of nothing.
Well, with the additional details the video now has context. As far as what was actually said, it seems that it's not really down to simply rerunning the same things, given that the statement mentioned "Based on additional information that we may request, we will re-evaluate your credit line."
With all that said, the video shows that you can't change the outcomes of an algorithm simply by pressing "Run" again. Goldman Sachs offering to "reevaluate" Apple Card credit limits instantly reminded me of this scene from The Office where Michael asks the guy to "crunch those numbers again" to see if that will get his company out of financial trouble. Predictably, it doesn't. And just as predictably, the algorithm to determine credit limits won't give out different results just by "crunching the numbers" again. In fact, because of this whole outburst, Goldman Sachs may now change the algorithm to include some gender bias so that the females in question will get higher credit limits and the media will be calmed down for a time.
> Or that simply different people in fact had different financial aspects to them (despite being married and all that) and it's those differences, unrelated to gender or anything like that, that ultimately played a role.

I am not certain, but my guess is the bank was using a neural net to predict creditworthiness and ran into one of the known problems with them: the hidden units can form implicit representations of characteristics, like sex, without it being obvious. This is going to happen more and more until we figure out how to understand what these things are doing.
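The implicit-representation problem above doesn't even require a neural net; any model can pick it up through correlated "proxy" features. Here is a minimal sketch on made-up synthetic data (every number and feature name is invented for illustration): a tiny logistic model is trained *without* the protected attribute as an input, yet its scores still differ systematically by group, because one of its input features correlates with that attribute.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical synthetic data: a protected attribute (0/1) that is
# never shown to the model, and a "proxy" feature that correlates
# with it (e.g., some spending-pattern signal).
protected = rng.integers(0, 2, size=n)
proxy = protected + rng.normal(0, 0.5, size=n)
income = rng.normal(0, 1, size=n)

# Creditworthiness label driven by income and the proxy feature.
logits = 1.5 * income + 1.0 * proxy
labels = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

# Fit a tiny logistic model on (income, proxy) only -- the protected
# attribute itself is deliberately excluded from the inputs.
X = np.column_stack([income, proxy, np.ones(n)])
w = np.zeros(3)
for _ in range(500):  # plain gradient descent on the logistic loss
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - labels) / n

scores = 1 / (1 + np.exp(-(X @ w)))
print("mean score, group 0:", scores[protected == 0].mean())
print("mean score, group 1:", scores[protected == 1].mean())
# The two groups receive systematically different scores even though
# the protected attribute was never an input to the model.
```

Dropping the sensitive column, in other words, does not by itself make a model blind to it; the information leaks back in through whatever features correlate with it.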
...
> Except that they basically did.

The fact that the bank could not explain immediately what happened suggests to me that it is not as simple as different backgrounds resulting in different credit scores.
> Initially it was that they didn’t explain it; then, when that didn’t hold up, it became that the explanation can’t be trusted. Moving the goal posts and deflecting to something else only ends up undermining any actual points that might be there.

So you blindly trust corporate denials? I understand they do not intentionally use gender, but they have not proven that their AI does not represent it implicitly. Let them show the statistics for their credit limits for men and women.
> There was more to the original explanation, as mentioned in the original article about this, which is linked in the follow-up article associated with this discussion. (Pretty much all of this has also been fairly extensively covered in the discussion associated with the initial article.)

They merely claimed they do not explicitly use gender in assigning a credit limit. That is not the same thing as confirming that any AI they use does not implicitly represent gender, nor is it the same thing as releasing statistics about the credit limits they've given to men versus women. The question of trust is about how thoroughly they looked, if at all, for implicit bias about gender in their software.
> What I originally pointed out in my initial reply to one potential take on it is that there's also a fairly logical and plausible existing explanation for it all.

I read it and saw no evidence, just a claim. If that satisfies you, then fine. Banks have gotten into trouble before with AI systems implicitly discriminating, so I think they should provide technical documentation proving they don't discriminate. You should read the short article at this link if you do not think this is a problem.