
ChestertonsMeme

blocking the federal fist

0 followers   follows 0 users  
joined 2022 September 10 06:20:52 UTC


User ID: 1098


I'm not going to read an AI-generated post. But I did ask an AI to summarize it in a few sentences, so I get the gist. Maybe next time just post your thoughts so others don't have to do this extra round-trip through an AI.

These are my unfiltered thoughts on the object-level issue:

It's not Communism. It's opaque and centralized, but those traits are hardly unique to historical Communist systems.

The credit scoring system is the result of many conflicting interests that each place constraints on how businesses make decisions. Consider what would happen if a business used its own method for evaluating credit risk:

  1. They might accidentally use a forbidden input, such as race, or a proxy for one, such as zip code. This exposes the business to substantial legal risk. Figuring out the set of inputs that are both predictive and allowed takes a lot of specialized knowledge of the laws in the jurisdiction in which the business operates. This is expensive. It's cheaper to outsource this work and risk to specialized companies.
  2. They might make a mistake in predicting credit risk. To take your example, the fact that a customer has a history of on-time rent payments doesn't necessarily mean they're low enough risk for the product being offered. If it's a new rental agreement, maybe the customer's income has recently disappeared. If it's a credit card, maybe paying rent doesn't predict paying off credit cards. Using a specialized company for evaluating risk at least ensures that the weaknesses of the score are well-known and understood.
  3. If they try to make the process more transparent, they might make a mistake with privacy and PII. The opacity of the current system allows credit bureaus to launder private information into a less-private score that's still useful to businesses.
  4. Also if they try to make the process more transparent, they open themselves up to gaming.
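Point 1 is worth making concrete: even a lender that never sees a protected attribute can reproduce its effect by scoring on a correlated proxy. Here is a toy sketch with entirely synthetic, hypothetical data (the group labels, zip codes, and correlation strength are all invented for illustration):

```python
# Toy illustration (hypothetical data): a "neutral" rule that only looks at
# zip code can reproduce a disparity in a protected attribute it never sees,
# because the two are correlated.
import random

random.seed(0)

# Synthetic applicants: group membership is never given to the scorer,
# but zip code is strongly correlated with it (90% of group A live in
# one zip, 90% of group B in the other).
applicants = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    in_majority_zip = random.random() < 0.9
    if group == "A":
        zip_code = "90210" if in_majority_zip else "10001"
    else:
        zip_code = "10001" if in_majority_zip else "90210"
    applicants.append({"group": group, "zip": zip_code})

# A scoring rule that uses only the zip code, never the group.
def approve(applicant):
    return applicant["zip"] == "90210"

def approval_rate(group):
    members = [a for a in applicants if a["group"] == group]
    return sum(approve(a) for a in members) / len(members)

print(f"group A approval rate: {approval_rate('A'):.2f}")  # roughly 0.90
print(f"group B approval rate: {approval_rate('B'):.2f}")  # roughly 0.10
```

The rule is facially neutral, yet its outcomes diverge sharply by group, which is exactly the disparate-impact exposure a business takes on when it picks inputs without specialized legal and statistical review.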

The real question is, what is the alternative, and does it live within the constraints we've placed on how businesses make decisions?