Models are opinions embedded in mathematics.
Algorithms control much of what we do on the internet: Google search, Netflix movie recommendations, our Facebook news feeds, job applications, and more. Algorithms are mathematical models used to solve a set of problems or to perform computational instructions. American mathematician and author Cathy O’Neil writes about how big-data algorithms deepen preexisting inequality in the world. In Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, she calls these algorithms Weapons of Math Destruction.
WMDs are mathematical models or algorithms that claim to quantify important traits (teacher quality, recidivism risk, creditworthiness) but have harmful outcomes and often reinforce inequality, keeping the poor poorer and the rich richer.
O’Neil defines algorithms as opinions embedded in code. These algorithms are being weaponized, and she argues that they are becoming more widespread, mysterious, and destructive. She cites examples of how WMDs are used in fields such as teacher assessment, predictive policing, insurance, the justice system, political microtargeting, and money lending, and of how their decisions can increase inequality, reinforce racism, and harm the poor.
According to O’Neil, these mathematical models share key features: they are opaque (black boxes), unregulated, and difficult to contest; they rely on questionable definitions of success; and they create pernicious feedback loops. The scalability of these algorithms amplifies any inherent biases, affecting ever-larger populations.
Ill-conceived mathematical models now micromanage the economy, from advertising to prisons. They’re opaque, unquestioned, and unaccountable, and they operate at a scale to sort, target, or “optimize” millions of people. By confusing their findings with on-the-ground reality, most of them create pernicious WMD feedback loops.
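The feedback loop is easiest to see in a toy simulation. The sketch below is my own illustration, not from the book: it uses predictive policing (one of the fields O’Neil discusses), with invented numbers and an assumed update rule in which the allocator over-weights “hot spots.” Two districts have identical true crime rates, but one starts with slightly more patrols, and the model keeps learning from its own recorded data.

```python
# Toy model of a pernicious feedback loop (illustrative assumptions only):
# recorded crime scales with patrol presence, and next round's patrols are
# allocated from the recorded data the model itself produced.

def simulate(true_crime, patrols, rounds=5):
    """Return patrol shares after `rounds` of the model retraining on itself."""
    for _ in range(rounds):
        # More patrols in a district -> more incidents observed there,
        # even though the underlying true crime rates are identical.
        recorded = [t * p for t, p in zip(true_crime, patrols)]
        # Assumption: the allocator over-weights "hot spots" (squares the
        # recorded counts), so small initial differences compound.
        weights = [r ** 2 for r in recorded]
        total = sum(weights)
        patrols = [100 * w / total for w in weights]
    return patrols

# Two districts with IDENTICAL true crime rates; district A merely starts
# with a few more patrols than district B.
final = simulate(true_crime=[1.0, 1.0], patrols=[55.0, 45.0])
print(final)  # patrol share drifts heavily toward district A
```

The model “confirms” its own initial bias each round: its recorded data reflect where it looked, not where crime actually is, which is exactly the confusion of findings with on-the-ground reality described above.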
Verdicts from WMDs land like dictates from the algorithmic gods. The model itself is a black box, its contents a fiercely guarded corporate secret. If the people being evaluated are kept in the dark, the thinking goes, they’ll be less likely to attempt to game the system. Instead, they’ll simply have to work hard, follow the rules, and pray that the model registers and appreciates their efforts. But if the details are hidden, it’s also harder to question the score or to protest against it.
You cannot appeal to a WMD. That’s part of their fearsome power. They do not listen. Nor do they bend. They’re deaf not only to charm, threats, and cajoling but also to logic—even when there is good reason to question the data that feeds their conclusions.
The three elements of a WMD: Opacity, Scale, and Damage.
Data scientists all too often lose sight of the folks on the receiving end of the transaction. They certainly understand that a data-crunching program is bound to misinterpret people a certain percentage of the time, putting them in the wrong groups and denying them a job or a chance at their dream house. But as a rule, the people running the WMDs don’t dwell on those errors.
Their feedback is money, which is also their incentive. Their systems are engineered to gobble up more data and fine-tune their analytics so that more money will pour in. Investors, of course, feast on these returns and shower WMD companies with more money.
A model, after all, is nothing more than an abstract representation of some process, be it a baseball game, an oil company’s supply chain, a foreign government’s actions, or a movie theater’s attendance. Whether it’s running in a computer program or in our head, the model takes what we know and uses it to predict responses in various situations. All of us carry thousands of models in our heads. They tell us what to expect, and they guide our decisions.
A model’s blind spots reflect the judgments and priorities of its creators. While the choices in Google Maps and avionics software appear cut and dried, others are far more problematic.
Racism, at the individual level, can be seen as a predictive model whirring away in billions of human minds around the world. It is built from faulty, incomplete, or generalized data. Whether it comes from experience or hearsay, the data indicates that certain types of people have behaved badly. That generates a binary prediction that all people of that race will behave that same way.
Consequently, racism is the most slovenly of predictive models. It is powered by haphazard data gathering and spurious correlations, reinforced by institutional inequities, and polluted by confirmation bias. In this way, oddly enough, racism operates like many of the WMDs.
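How little data it takes to produce such a “slovenly” model can be shown with a short calculation. This is my own hedged illustration, not O’Neil’s: assume the true rate of some bad behaviour is 10% in both groups, and each observer holds only ten anecdotes about a group. Even with no real difference, a meaningful fraction of observers will see a rate triple the truth, and confirmation bias then keeps that inflated estimate alive.

```python
# Sketch of haphazard data gathering: with n = 10 anecdotes and a true
# rate of 10%, how often does a sample suggest a rate of 30% or more?
# All numbers here are invented for illustration.
from math import comb

def prob_at_least(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n, true_rate = 10, 0.10
# Chance a single observer's anecdotes show triple the true rate:
misleading = prob_at_least(3, n, true_rate)
print(f"{misleading:.1%} of observers see a rate of 30% or more")
```

The same failure mode scales up when a WMD trains on similarly haphazard data: the sample size grows, but the spurious correlations baked into how the data was gathered do not wash out.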
We are ranked, categorized, and scored in hundreds of models, on the basis of our revealed preferences and patterns.
Propaganda Machine – Online Advertising
Anywhere you find the combination of great need and ignorance, you’ll likely see predatory ads. If people are anxious about their sex lives, predatory advertisers will promise them Viagra or Cialis, or even penis extensions. If they are short of money, offers will pour in for high-interest payday loans. If their computer is acting sludgy, it might be a virus inserted by a predatory advertiser, who will then offer to fix it.
When it comes to WMDs, predatory ads practically define the genre. They zero in on the most desperate among us at enormous scale.
In education, they promise what’s usually a false road to prosperity, while also calculating how to maximize the dollars they draw from each prospect. Their operations cause immense and nefarious feedback loops and leave their customers buried under mountains of debt. And the targets have little idea how they were scammed, because the campaigns are opaque. They just pop up on the computer, and later call on the phone. The victims rarely learn how they were chosen or how the recruiters came to know so much about them.
The poorest 40 percent of the US population is in desperate straits. Many industrial jobs have disappeared, either replaced by technology or shipped overseas. Unions have lost their punch. The top 20 percent of the population controls 89 percent of the wealth in the country, and the bottom 40 percent controls none of it. Their assets are negative: the average household in this enormous and struggling underclass has a net debt of $14,800, much of it in extortionate credit card accounts. What these people need is money. And the key to earning more money, they hear again and again, is education.
Along come the for-profit colleges with their highly refined WMDs to target and fleece the population most in need. They sell them the promise of an education and a tantalizing glimpse of upward mobility—while plunging them deeper into debt. They take advantage of the pressing need in poor households, along with their ignorance and their aspirations, then they exploit it. And they do this at great scale. This leads to hopelessness and despair, along with skepticism about the value of education more broadly, and it exacerbates our country’s vast wealth gap.
As insurance companies learn more about us, they’ll be able to pinpoint those who appear to be the riskiest customers and then either drive their rates to the stratosphere or, where legal, deny them coverage. This is a far cry from insurance’s original purpose, which is to help society balance its risk. In a targeted world, we no longer pay the average. Instead, we’re saddled with anticipated costs. Instead of smoothing out life’s bumps, insurance companies will demand payment for those bumps in advance. This undermines the point of insurance, and the hits will fall especially hard on those who can least afford them.
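The shift from “paying the average” to “paying anticipated costs” is simple arithmetic. The sketch below uses invented risk groups and claim costs purely to illustrate the contrast: under pooled pricing everyone pays the same premium, while under targeted pricing the riskiest customers absorb their full expected cost up front.

```python
# Illustrative comparison (invented numbers): pooled "average" pricing
# vs. individualized "anticipated cost" pricing.

expected_claims = {"low_risk": 300, "medium_risk": 900, "high_risk": 4800}
members = {"low_risk": 70, "medium_risk": 25, "high_risk": 5}  # per 100 people

# Pooled pricing: total expected claims spread evenly across everyone.
total_cost = sum(expected_claims[g] * members[g] for g in members)
pooled_premium = total_cost / sum(members.values())
print(f"pooled premium: everyone pays {pooled_premium:.0f}")

# Targeted pricing: each group pays its own anticipated cost.
for group in members:
    print(f"targeted premium for {group}: {expected_claims[group]}")
```

In this toy example the high-risk group's targeted premium is several times the pooled one: the risk-smoothing that was the original point of insurance disappears, and the cost lands on those least able to pay it.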
All the Best in your quest to get Better. Don’t Settle: Live with Passion.