http://www.nytimes.com/2014/12/28/technology/the-scoreboards-where-you-cant-see-your-score-.html 2014-12-27 16:15:35

The Scoreboards Where You Can’t See Your Score

New books by technology experts examine consumer-ranking techniques in widespread use, but offer different advice on how people might protect themselves.

===

The characters in Gary Shteyngart’s novel inhabit a world where personal data is laid bare. Consider the protagonist, Lenny Abramov, age 39. A digital dossier about him accumulates his every health condition (high cholesterol, depression), liability (mortgage: $560,330), purchase (“bound, printed, nonstreaming media artifact”), tendency (“heterosexual, nonathletic, nonautomotive, nonreligious”) and probability (“life span estimated at 83”). And that profile is available for perusal by employers, friends and even strangers in bars.

It’s a fictional forecast of a data-deterministic culture in which computer algorithms constantly analyze consumers’ profiles, issuing individuals numeric rankings that may benefit or hinder them. Observing a street billboard that publicly broadcasts the score of each passer-by, the Abramov character says in the novel, “The old Chinese woman had a decent 1,400, but others, the young Latina mothers, even a profligate teenaged Hasid puffing down the street, were showing blinking red scores below 900, and I worried for them.”

In two nonfiction books, scheduled to be published in January, technology experts examine similar consumer-ranking techniques already in widespread use.
Unlike Lenny Abramov, however, most people in real life are not aware of the scores being assigned to them. “This will happen whether or not you want to participate, and these scores will be used by others to make major decisions about your life, such as whether to hire, insure, or even date you,” write Michael Fertik and David Thompson in their forthcoming book.

In his new book, the law professor Frank Pasquale takes a darker view. “Important corporate actors have unprecedented knowledge of the minutiae of our daily lives,” he writes.

Both books outline how consumer scoring works. Data brokers amass dossiers with thousands of details about individual consumers, like age, religion, ethnicity, profession, mortgage size, social networks, estimated income and health concerns such as impotence and irritable bowel syndrome. Then analytics engines can compare patterns in those variables against computer forecasting models. Algorithms are used to assign consumers scores — and to recommend offering, or withholding, particular products, services or fees — based on predictions about their behavior.

But while both books emphasize the notion that consumer reputations are vulnerable to such covert scoring apparatuses, the authors differ markedly in the steps they say ordinary people might take to protect themselves.

Befitting the founder of a firm that markets reputation management, Mr. Fertik contends that individuals have some power to influence commercial scoring systems. He presents nascent technologies, like online education courses that can score people on the specific practical skills or concepts they have mastered, as democratizing forces that could enable workers to better compete for jobs on merit.
His book suggests that readers curate, or hack, their digital reputations — for instance, by emphasizing certain keywords on their résumés to position them better for predictive scoring engines, or by posting positive reviews of restaurants or hotels online, in the hope that algorithms will flag them for future V.I.P. treatment.

“Employers’ algorithms will pick your résumé out of the pile of thousands just as instantaneously and robotically as they pass over others,” he and his co-author write. “Banks and lenders will automatically approve you for the better rates and offers. The more appealing dates on apps and sites like Tinder, Match and OkCupid will see your profile before they see any others.”

Think of this technique as reputation engine optimization. If an algorithm incorrectly pegs you as physically unfit, the book suggests, you can try to mitigate the wrong: buy a Fitbit fitness tracker and upload the exercise data to a public profile — or even “snap that Fitbit to your dog” and “you’ll quickly be the fittest person in your town.”

Professor Pasquale offers a more downbeat reading. Companies, he says, are using such a wide variety of numerical rating systems that it would be impossible for average people to significantly influence their scores.

“Corporations depend on automated judgments that may be wrong, biased or destructive,” Professor Pasquale writes. “Faulty data, invalid assumptions and defective models can’t be corrected when they are hidden.”

Moreover, trying to influence scoring systems could backfire. If a person attached a fitness device to a dog and tried to claim the resulting exercise log, he suggests, an algorithm might be able to tell the difference and issue that person a high score for propensity toward fraudulent activity.

“People shouldn’t think they can outwit corporations with hundreds of millions of dollars,” Professor Pasquale said in a phone interview.
Consumers would have more control, he argues, if Congress extended the right to see and correct credit reports to other kinds of rankings. “If credit scores can be regulated,” he says, “why not the scoring systems used by digital advertisers and employers?”