Data compilers’ secret scores have consumers pegged — fairly or not

by E. Scott Reckard, Los Angeles Times

Consumers won access to their credit scores more than a decade ago, after advocates voiced concerns over errors and lending bias.

But most people remain in the dark about hundreds of other data-collection programs still being used to size up consumers and market to them. The secret scores and data are used by employers, utilities, banks, healthcare providers, debt collectors and a host of other enterprises.

Consumers have no way to review them or correct factual errors, say advocates of consumer access to the reports. They argue that the consumer protections applying to credit reports need to be extended to all consumer scores — particularly when they are used for identity checks, fraud prevention, medical histories and profitability predictions.

“What’s to keep the data storers from just making things up?” said University of Maryland law professor Frank Pasquale, whose book, “The Black Box Society: Technologies of Search, Reputation, and Finance,” will be published this fall by Harvard University Press.

“Or cooking the algorithm to increase their profitability, regardless of the underlying data?” Pasquale said. “How are we to be sure some malcontent isn’t just messing with people’s scores?”

A recent Federal Trade Commission hearing examined how the data collection programs can touch on highly personal matters: scores to predict whether individuals would take their medications; whether consumers are likely to pay a debt if contacted by phone or mail; the degree of a person’s influence over others on the Internet; whether a customer is pregnant, and if so, when the baby is due.

Some of the scores are used to screen for fraud or to determine whether to grant or deny consumers’ requests for goods and services. But defenders of the practices say most of the data crunching simply helps match consumers to goods and services they want.

The Direct Marketing Assn., a trade group for data brokers, has calculated that the industry generates $156 billion in annual revenue, 70% of it involving companies sharing their records on consumers and groups of individuals with other enterprises. The brokers’ computer programs, which crunch data from public records and private databases, can involve hundreds and even thousands of factors, experts say, acknowledging that the complexities can appear threatening.

“We realize we have to act responsibly to have consumers’ trust,” said Rachel Thomas, a Direct Marketing Assn. spokeswoman.

But consumers have little cause for worry, she said.

“In marketing analytics, the worst thing that can happen is a prediction is wrong and the consumer gets an offer for something they’re not interested in,” Thomas said. “We work hard to make sure the information is only used for marketing purposes.”

But privacy advocates say these self-imposed protections are inadequate. There’s no way to tell, they say, whether the marketing data are used to deny consumers a product or service based on, for instance, the neighborhood where they live.

“These scores offer predictions that can become consumers’ destiny, whether they are right or wrong,” said Pam Dixon, director of the San Diego nonprofit World Privacy Forum, which issued a report last week calling on Congress and federal regulators to lift the veil on the enormous but little-known data broker industry.

Dixon said the secret scores include:

  • Scores predicting households likely to pay debt;
  • A job security score that claims to predict future income and capacity to pay;
  • “Churn scores” seeking to predict when customers will move their business to another bank, cellphone provider or cable TV service;
  • An Affordable Care Act health risk score creating a relative measure of predicted healthcare costs for a particular enrollee.

“In effect, it is a proxy score for how sick a person is,” Dixon wrote in her report, “The Scoring of America,” co-written by Robert Gellman.

One category, fraud and identity scores, helps protect consumers but could also harm them if mistakes are made. They’re designed to indicate the likelihood that a person requesting goods or services is actually a scam artist masquerading as a good-faith consumer.

Many of the data collection companies will provide consumers with their personal information and allow them to correct errors, said Consumer Data Industry Assn. spokesman Norm Magnuson. More than 40 such companies are listed on a Consumer Financial Protection Bureau Web page, he said.

But the critics note that there are loopholes. There’s no requirement, for instance, that companies disclose scoring based on someone’s neighborhood. That raises the specter of redlining — the illegal practice of turning someone down for a loan or insurance merely because they live in an area deemed high risk.

The solution, the critics say, would be for the industry to open its books. Regulators and consumers should be able to see the data to judge whether scores are being used improperly as proxies for such things as race or sex, or are disclosing private information about consumers without their knowledge.

Pasquale, the University of Maryland professor, said evidence suggests that may already be the case. He recalled that one data analyst told him enthusiastically about matching credit scores with driving data collected by monitors on newer-model cars.

“He believed that if you could match driving habits to enough credit scores, you wouldn’t need the credit scores,” Pasquale said in an interview this week. “You could make the same predictions based on how someone drives.”