Minority Report


Sometimes it feels like we’re living in an era in which information has finally become “free”—unlimited media access, twenty-four-hour wellness tracking, endless dating possibilities. But there’s nothing inherently progressive about Big Data. A new report shows that when Big Data creeps into our workplaces and our financial lives, it may simply create new ways of reinforcing old racial and economic injustices.

The report, “Civil Rights, Big Data, and Our Algorithmic Future,” by the think tank Robinson + Yu, notes that technological advances, the declining cost of data storage, and the intensified surveillance climate of post-9/11 America have spurred massive data collection. This accumulation of private information by corporations and government has created troubling new issues in the areas of labor rights, privacy, and ethics.

Consider the influence of Big Data on hiring practices. Hiring algorithms are often seen as an “objective,” meritocratic assessment, free of irrational prejudice or biases. But the report warns that because “[d]igital indicators of race, religion, or sexual preference can easily be observed or inferred online,” the mining of social media and Google-search data can reinforce systemic discrimination. The result may be a perpetuation of an unjust status quo: disproportionately white, upper-class, elite-educated, and culturally homogeneous. Sloppy résumé scans end up excluding people based on superficial criteria—where they live, for example, a metric bound to reflect already-existing housing discrimination. Big Data manipulation allows these subtle individual slights to be expanded to new orders of magnitude with monstrous efficiency.

Since the algorithm reflects social patterns, researcher David Robinson tells The Nation, “any time someone is the victim of old-fashioned human discrimination, that discrimination is likely to be reflected in some of the data points that these new algorithms measure. Culturally speaking, there is a real tendency to defer to decisions that come from computers—which means if we’re not careful, it is reasonable to expect that computers will sanitize biased inputs into neutral-seeming outputs.”

