Minority Report

Sometimes it feels like we’re living in an era in which information has finally become “free”—unlimited media access, twenty-four-hour wellness tracking, endless dating possibilities. But there’s nothing inherently progressive about Big Data. A new report shows that when Big Data creeps into our workplaces and our financial lives, it may simply create new ways of reinforcing old racial and economic injustices.

The report, “Civil Rights, Big Data, and Our Algorithmic Future,” by the think tank Robinson + Yu, notes that technological advances, the declining cost of data storage, and the intensified surveillance climate of post-9/11 America have spurred massive data collection. This accumulation of private information by corporations and government has created troubling new issues in the areas of labor rights, privacy, and ethics.

Consider the influence of Big Data on hiring practices. Hiring algorithms are often seen as an “objective,” meritocratic assessment, free of irrational prejudice or bias. But the report warns that because “[d]igital indicators of race, religion, or sexual preference can easily be observed or inferred online,” the mining of social media and Google-search data can reinforce systemic discrimination. The result may be to perpetuate an unjust status quo: a workforce that remains disproportionately white, upper-class, elite-educated, and culturally homogeneous. Sloppy résumé scans end up excluding people on superficial criteria, such as where they live, a metric bound to reflect existing housing discrimination. Big Data manipulation allows these subtle individual slights to be scaled to new orders of magnitude with monstrous efficiency.

Because these algorithms reflect existing social patterns, researcher David Robinson tells The Nation, “any time someone is the victim of old-fashioned human discrimination, that discrimination is likely to be reflected in some of the data points that these new algorithms measure. Culturally speaking, there is a real tendency to defer to decisions that come from computers—which means if we’re not careful, it is reasonable to expect that computers will sanitize biased inputs into neutral-seeming outputs.”
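To make the mechanism Robinson describes concrete, here is a minimal sketch in Python, using invented numbers that are not drawn from the report. It assumes a screening rule “trained” only on past hiring outcomes and zip codes; because the historical decisions were biased, the zip code becomes a proxy for that bias, and two equally qualified new applicants receive different scores even though race never appears as an input.

```python
# Hypothetical illustration: a "neutral" screening rule learned from biased
# historical decisions reproduces that bias. Zip code stands in for any proxy
# feature correlated with race via housing segregation; race itself is never
# a model input.

from collections import defaultdict

# Invented historical hiring records: (zip_code, was_hired).
# Past human decisions under-hired applicants from zip "10455".
history = [
    ("10021", True), ("10021", True), ("10021", False), ("10021", True),
    ("10455", False), ("10455", True), ("10455", False), ("10455", False),
]

# "Train" the simplest possible model: the historical hire rate per zip code.
hires = defaultdict(int)
totals = defaultdict(int)
for zip_code, was_hired in history:
    totals[zip_code] += 1
    hires[zip_code] += was_hired

score = {z: hires[z] / totals[z] for z in totals}

# Apply the learned rule to two equally qualified new applicants.
for zip_code in ("10021", "10455"):
    print(f"zip {zip_code}: predicted hire score = {score[zip_code]:.2f}")

# The applicant from the historically disfavored zip code gets a lower score
# despite identical qualifications: biased inputs, neutral-seeming output.
```

Real résumé-screening systems are far more elaborate, but the underlying dynamic is the same: whatever discrimination shaped the training data is laundered through the model and re-emerges as an apparently objective score.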
