In fields from advertising to medicine, big data is held up as the statistical omni-tool for tackling virtually any societal ill of the 21st century, a modern-day penicillin with the added ability to market products to consumers they didn’t know they wanted. But there is a dark side to the powerful force big data exerts on our increasingly connected and quantifiable lives, and its most dangerous quality lies in its banality.

The WMDs of the information age are “Weapons of Math Destruction,” writes Cathy O’Neil, a Harvard- and MIT-trained mathematician and the author of a book of the same name, which examines the algorithms behind predictive policing, teacher assessment, financial investment and other fields embracing big data analysis as a transformative tool.

Though math is widely regarded as the ultimate inarguable truth, logically incapable of lying, algorithms can lie, according to O’Neil, who saw firsthand the devastation wrought by weaponized math while working at the hedge fund D.E. Shaw at the onset of the housing market crash in 2007.

“I realized that this crisis that had happened — that had stopped the economies of the entire world — was essentially based on a mathematical lie,” O’Neil said Monday, referring to the AAA ratings that credit rating agencies Standard & Poor’s, Moody’s and Fitch gave to securities backed by subprime mortgages, one of the chief causes (though far from the only one) of the financial meltdown that followed.

“That was a weaponized mathematics, because the insiders knew back then that these mortgage-backed securities that were getting these AAA ratings didn’t deserve them,” O’Neil told a panel gathered to discuss the book. “It was supposed to show that they were safe and it scaled up the size of the mortgage-backed security market massively, and they were based on this idea that mathematicians were diligently doing that data work at these credit ratings agencies.”

“It was a lie, and it was exactly the opposite of what I wanted mathematics to be,” she continued, likening the math to a “shield with all this dirty stuff underneath, blinding people, because people trust math.”

More often than through malice or deceit, results, and more specifically the algorithms that produce them, fall prey to the biases of the people who create them, yielding skewed outputs that reinforce preconceived notions or hypotheses rather than challenge or inform them.

The algorithms behind Facebook’s News Feed and Trending Topics, designed to curate content based on users’ browsing history and likes, as well as those of their friends, are one example of algorithmic bias shaping the flow of information. Teacher assessments whose scores often correlate with a school’s resources, and poor ratings for medical facilities to which high-risk patients are purposely sent, are others.

“There’s no such thing as objectivity in algorithms. When I build my algorithm to cook dinner for my family I am embedding my values in that algorithm,” O’Neil said, weighing her desire for her children to eat vegetables against her son’s love of Nutella.

“Algorithms are not inherently fair,” she continued, “because the person who builds the model is in charge of defining success.”
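
Her dinner example can be made concrete with a toy sketch (purely illustrative; the meals and weights below are invented, not O’Neil’s): the same data and the same “pick the highest score” rule produce opposite dinners once the builder decides what success means.

```python
# Purely illustrative sketch (invented meals and weights, not O'Neil's code):
# two definitions of "success" rank the same dinner options differently,
# because the weights encode the modeler's values.
dinners = {
    "pasta_with_vegetables": {"vegetables": 3, "enjoyment": 1},
    "nutella_sandwich":      {"vegetables": 0, "enjoyment": 3},
}

def parent_success(d):
    # A parent's objective weights vegetables heavily.
    return 2.0 * d["vegetables"] + 0.5 * d["enjoyment"]

def child_success(d):
    # The child's objective cares only about enjoyment.
    return 2.0 * d["enjoyment"]

for name, scorer in [("parent", parent_success), ("child", child_success)]:
    best = max(dinners, key=lambda meal: scorer(dinners[meal]))
    print(f"{name} picks: {best}")
# Same data, same selection rule -- opposite dinners, because whoever builds
# the model chooses what counts as success.
```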

One example of how the model builder’s definition of success plays out is in retail, where chains like Walmart rely on algorithms that harness big data, including weather and foot traffic, to determine when and how many employees are scheduled on the floor, a method that gives employees little notice to plan around second jobs or classes while also ensuring they stay below the number of hours that would require retailers to pay benefits.

“When I was a kid, when you had not such a great job, the idea was to go to night school and get better educated and then get a better job,” O’Neil, a former professor at Barnard College, said. “It’s actually preventing that kind of American Dream mobility thing from happening.”
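
The scheduling logic described above can be sketched roughly as follows; the 30-hour benefits threshold, demand figures and workforce size are hypothetical, chosen only to show how an optimizer can cover predicted demand while keeping every worker just under the hours that would trigger benefits.

```python
# Hedged sketch of demand-driven scheduling; all numbers are hypothetical.
BENEFITS_THRESHOLD = 30                    # weekly hours that would trigger benefits
MAX_WEEKLY_HOURS = BENEFITS_THRESHOLD - 1  # the optimizer deliberately stays below

# Assume an upstream model has already turned weather and foot-traffic data
# into predicted staff-hours needed per day.
predicted_demand = {"Mon": 40, "Tue": 35, "Wed": 50, "Thu": 45,
                    "Fri": 70, "Sat": 90, "Sun": 60}

employees = [f"worker_{i}" for i in range(20)]
hours = {e: 0 for e in employees}

for day, needed in predicted_demand.items():
    remaining = needed
    for e in employees:
        if remaining <= 0:
            break
        # Assign up to an 8-hour shift, but never push anyone over the cap.
        shift = min(8, remaining, MAX_WEEKLY_HOURS - hours[e])
        hours[e] += shift
        remaining -= shift

print(hours)  # demand is covered, yet no one reaches benefits-eligible hours
```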

The bias extends to another flashpoint gaining media attention and shaping the discourse of the 2016 election cycle: law enforcement, where big data is increasingly relied on for predictive policing, which deploys officers to match computer predictions, built from records of past criminal activity, of where and when crimes will occur.

“We could think of it as predicting crime, but I would argue we’re just predicting police,” O’Neil said. “This is what police are going to do, and then we get extra points as modelers for having ‘accurate models’ because we did such a good job predicting the police.”
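
The feedback loop she describes can be seen in a toy simulation (the neighborhoods, crime rates and patrol counts below are invented): two neighborhoods have identical underlying crime, but because the records start out skewed and crime only enters the data where officers are sent to observe it, the model keeps “confirming” its own predictions.

```python
import random

# Toy simulation of the feedback loop described above; neighborhoods, rates
# and patrol counts are invented for illustration.
random.seed(0)

TRUE_CRIME_RATE = {"A": 0.3, "B": 0.3}   # both neighborhoods are identical
recorded_crimes = {"A": 10, "B": 2}      # but the historical records skew toward A
PATROLS_PER_WEEK = 10

for week in range(10):
    # The model sends every patrol to wherever past records show the most crime.
    target = max(recorded_crimes, key=recorded_crimes.get)
    # Crime only enters the records where police are present to observe it.
    for _ in range(PATROLS_PER_WEEK):
        if random.random() < TRUE_CRIME_RATE[target]:
            recorded_crimes[target] += 1

print(recorded_crimes)
# Neighborhood A's count keeps climbing while B's stays frozen, so the model
# looks "accurate" -- it has predicted its own patrols, not crime.
```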

Surveillance, another high-profile example since the Snowden leaks and the rise of ISIS-inspired terror attacks, shares the same dilemma, according to Rachel Levinson-Waldman, senior counsel to the Liberty and National Security Program at the Brennan Center for Justice.

Levinson-Waldman said national security counterterrorism surveillance fits O’Neil’s three criteria for a 21st-century WMD: it is large-scale, opaque and damaging.

“The more you have brown skin, the more you come from a Muslim background, the more that you fall within one of these buckets for which this information is being weaponized, the more damage there is likely to be,” she said, adding there’s a “higher willingness to accept false positives because it seems scary to miss the actual positive.”
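
Her point about false positives comes down to base rates, which a back-of-the-envelope calculation makes concrete (all numbers below are hypothetical, not from the panel): when the real threat is vanishingly rare, even a screen that is right 99 percent of the time flags overwhelmingly innocent people, and the damage lands on whichever groups the model targets.

```python
# Hypothetical base-rate arithmetic; none of these figures come from the panel.
population     = 1_000_000   # people swept up in a screening program
true_threats   = 10          # actual positives are vanishingly rare
sensitivity    = 0.99        # share of real threats the model catches
false_pos_rate = 0.01        # share of innocent people wrongly flagged

true_flags  = true_threats * sensitivity                     # ~9.9
false_flags = (population - true_threats) * false_pos_rate   # ~10,000

precision = true_flags / (true_flags + false_flags)
print(f"Flagged people who are actual threats: {precision:.2%}")  # about 0.10%
# Catching ~10 real threats means flagging ~10,000 innocent people, which is
# the trade-off behind a "higher willingness to accept false positives."
```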

“It’s trusted but it’s also feared,” O’Neil said of math. “It’s kind of like the perfect mechanism to keep people from asking questions.”
