IEEE Computer, January 2017

Software Sophistry and Political Sleight of Hand

Hal Berghel


These two practices aren’t unrelated; they betray two sides of the same fundamental character flaw.


Noted linguist and social philosopher Noam Chomsky wrote in his 1967 essay “The Responsibility of Intellectuals” (chomsky.info/19670223/) that the intellectual’s obligation is to tell the truth and expose lies—a lofty goal. Unfortunately, all too often the opposite is the case, not just with intellectuals but also in the general population, and most certainly with politicians. It’s especially problematic among people in positions of significant influence. In this column, I single out two practices, from two different domains, that seem unrelated on the surface but are in reality bound together by a common theme: the manipulation of public opinion through misinformation in order to get unsuspecting victims to do things that are contrary to their interests. One example comes from software development and the other from politics.

Edward Bernays and the Fine Art of Misinformation

One of the great defects in human character is the propensity to unreflectively follow others. William Trotter introduced the term herd mentality to describe this phenomenon back in 1916.1 Trotter was one of the first social scientists of his era to observe that much of group behavior is unreflective, imitative, irrational, and inconsistent with the group’s long-term interests. Some people exploit this weakness to manipulate and control the herd to their own advantage, an effort that, of course, goes largely unnoticed by the herds themselves.

Among those who sought to capitalize on the ignorance and innocence of crowds was Edward Bernays, the founder of modern public relations. It’s one of life’s important ironies that Bernays initially called his subtle manipulation of the public propaganda (indeed, that was the title of his most popular book2), only changing the label after Adolf Hitler and Joseph Goebbels tarnished the term. But propaganda should have been widely recognized for what it was, even before Hitler and Goebbels: social engineering. The phenomenon remains with us today under the rubric of faux news, talking heads, lying politicians, and unscrupulous advertisers, marketers, promoters, and fraudsters. In fact, it has become the theme song of neoliberals in modern business and politics. Trickle-up economics such as tax concessions, credits, deferrals, and abatements for businesses; government bailouts of failed corporations; public support of sports franchises; foreign investment credits; and so forth are all of minimal economic value to the public, yet they rarely generate significant public pushback because the herd has been successfully manipulated. Although this phenomenon is well documented by distinguished scholars and journalists,3–6 the public is largely unaware of it because it’s rarely discussed in mainstream media—it doesn’t bleed, so it doesn’t lead.

In all these cases, remember that the goal is to shape public perception of something in order to encourage the target audience to do something it normally wouldn’t. Aldous Huxley captured one dimension of the debilitating influence of propaganda with his observation that people have an almost infinite capacity for distraction from the most important issues affecting their own welfare. His memorable 1958 quote is worth repeating here (https://www.huxley.net/bnw-revisited/index.html):

In regard to propaganda the early advocates of universal literacy and a free press envisaged only two possibilities: the propaganda might be true, or the propaganda might be false. They did not foresee what in fact has happened, above all in our Western capitalist democracies—the development of a vast mass communications industry, concerned in the main neither with the true nor the false, but with the unreal, the more or less totally irrelevant. In a word, they failed to take into account man's almost infinite appetite for distractions.

And this isn't to mention “reality” TV, talk shows, and other forms of content-free drivel like the gorilla vaudeville we call modern political campaigns. But I digress.

The unifying theme between the two practices I discuss here is the use of misinformation to manipulate an unsuspecting public, consumer, stakeholder, or electorate. There’s a discernible parallel between its use in software and in politics.

Software’s Seven Deadly Sins

An intriguing way of organizing a discussion of software gone wrong is via the seven deadly (unforgivable) sins: pride, sloth, gluttony, anger, envy, lust, and greed. Chris Nodder used these seven dastardly deeds to structure his seminal book on the use (and misuse) of perception management and social engineering techniques in online persuasion.7 His concept also applies to software development (and to politics, business, and virtually every other dimension of human experience, for that matter). Nodder’s book is an important tool for designers because it crystallizes their thinking on the likely effects of their designs. But it is, in my view, far more valuable to consumers as a tool for refining their personal crap detectors. A careful reading prepares us for the continuous assault on our senses that goes by the names advertising and marketing. It’s a better investment than keeping Madonna’s “Material Girl” on a continuous loop on your iPod or substituting Machiavelli’s The Prince for scripture.

Nodder exposed the online infrastructure behind “controlling people’s behavior for financial gain.” As he described it, “the best examples of ‘evil design’ are the ones where you don’t even realize that people are being manipulated until it’s pointed out to you.” In this sense, effective advertising works like a good magic trick. As Nodder stated,

The idea behind evil design is that people enter willingly into the deal, even when the terms are exposed to them. Confidence tricksters are another group who control behavior for gain, but they take things a stage further than evil design by hiding the true outcome of the activity. … So evil design is that which creates purposefully designed interfaces that make users emotionally involved in doing something that benefits the designers more than them.

Let’s map these observations onto what I’ll call a “persuasion continuum.”

A Persuasion Continuum

The continuum runs from least deceptive and least harmful to most deceptive and most harmful:

  1. transparent, even if subtle, persuasion;
  2. translucent persuasion that will likely involve a lack of candor, some insincerities, inaccuracies, and white lies;
  3. opaque persuasion that might involve outright misrepresentations, distortions of the truth, and other elements that are candidates for Federal Trade Commission complaints; and
  4. deception and possibly outright fraud that could be candidates for prosecution.

My personal belief is that a moral hazard begins at stage 2 of this continuum. Nodder, on the other hand, advanced a different position in the form of the Golden Rule of Persuasion: “creators of a persuasive technology should never seek to persuade a person or persons to do something they themselves would not consent to be persuaded to do.” Nodder’s Golden Rule is orthogonal to our continuum.

I prefer Dieter Rams’s design criterion that “Good Design Is Honest—don’t attempt to manipulate the consumer with promises that can’t be kept” (www.archdaily.com/198583/dieter-rams-10-principles-of-good-design). Deception, misrepresentation, white lies, exaggerations, falsehoods, and so forth are necessarily more coercive than persuasive. Human–computer interface specialist Harry Brignull referred to them as “dark patterns” that are “not mistakes, they are carefully crafted with a solid understanding of human psychology, and they do not have the user’s interests in mind” (darkpatterns.org). My position is that Nodder was too charitable. That said, I recognize that the received view, caveat emptor, is more akin to Nodder’s than to Rams’s or Brignull’s. I encourage consideration of the principle that even stage 2 of our continuum can be justified only under a baroque interpretation of the term public good and is intrinsically indistinguishable from trickery. This contrast provides a serviceable overview for our purposes and one to which we’ll return below.
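
As a concrete illustration of the kind of dark pattern Brignull catalogs, consider “pre-selection,” in which the vendor-favorable choice is the default so that user inertia, not user intent, drives the outcome. The sketch below is hypothetical; the function names and the signup flow are my own invention, not anyone’s production code:

    from typing import Optional

    def signup(email: str, marketing_opt_in: bool = True) -> dict:
        """Dark variant: silence counts as consent, because the default
        favors the vendor. Nothing here is false; the deception is in
        the framing."""
        return {"email": email, "marketing": marketing_opt_in}

    def honest_signup(email: str, marketing_opt_in: Optional[bool] = None) -> dict:
        """Honest variant: no default, so the user must decide explicitly."""
        if marketing_opt_in is None:
            raise ValueError("an explicit consent decision is required")
        return {"email": email, "marketing": marketing_opt_in}

    # signup("user@example.com") quietly yields {"marketing": True},
    # a choice the user never actually made.

Note that the dark variant never lies outright; it simply arranges the default so that distraction, not deliberation, determines the outcome, which lands it squarely at stage 2 on the continuum above.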

For now, let’s look closely at a popular biomedical bone of contention. What do we make of big tobacco’s argument that, because a direct causative link between smoking and lung cancer and heart disease can’t be scientifically proven, the claim that smoking is harmful must be rejected as unscientific?8 Note that Nodder’s Golden Rule approach adopts a more tolerant attitude toward big tobacco’s protestations. Golden Rulers might argue that, because all science is statistical or probabilistic in nature, the encouragement to smoke falls within a Rumsfeldian fair-use exemption for anti-scientists: the absence of direct proof is proof of absence (of correlation). And if that isn’t the way science works, well then, so much the worse for science. Of course, scientists would take a more measured, precautionary view, since critical thresholds are the exception rather than the rule, as in dose–response relationships. A thesis that demands denial whenever causality is elusive is not only unscientific, it’s just silly. So is the converse thesis that belief trumps science. To wit, consider Bill O’Reilly’s remark, “tide goes in, tide goes out … you just can’t explain that” (www.youtube.com/watch?v=wb3AFMe2OQY; skip ahead to the 1:55 mark).
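
To make the statistical point concrete, here’s a toy simulation of how epidemiology establishes harm without any single “direct causal proof.” The data are synthetic and the risk figures are round numbers of my own choosing, not values from the tobacco literature; the point is only that a monotone dose–response gradient is itself evidence:

    import random

    random.seed(1)

    def gets_disease(packs_per_day: float) -> bool:
        # Assumed toy model: 1% baseline risk plus 8% per pack per day.
        return random.random() < 0.01 + 0.08 * packs_per_day

    n = 10_000  # cohort size per dose level
    for dose in (0.0, 0.5, 1.0, 2.0):
        incidence = sum(gets_disease(dose) for _ in range(n)) / n
        print(f"{dose:.1f} packs/day -> observed incidence {incidence:.3f}")

    # Incidence climbs steadily with dose. No individual case "proves"
    # causation, yet the gradient is exactly the statistical evidence
    # that the tobacco argument asks us to dismiss as unscientific.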

Another problem with the Golden Rule of Persuasion is that the term consent is ill defined. Were we to substitute “sufficiently informed consent adequate to make an enlightened decision,” we might be more accepting of it. But as it stands, it completely ignores the pervasive effect of ignorance on poor judgment. In the presence of information asymmetry, caveat emptor is a euphemism for a confidence game.

Dieselgate offers a dramatic example of deception-by-software. At this writing, Volkswagen has agreed to a $15 billion settlement in the US alone for its duplicity about vehicle emissions (www.autotrader.ca/newsfeatures/20161101/dieselgate-from-the-drivers-seat-part-2/). Loss of future sales, a depressed stock price, and fines in other countries will cost the company many billions of dollars more. However, the more interesting story isn’t about emissions at all, but about how VW managers came to think that an emissions-cheating program was a good idea in the first place. Although this issue has been largely ignored in the mainstream media, it reveals the far more serious problem of how corporate deception can become institutionalized. Having been in computing for quite a few decades at this point, I will gladly wager dollars-to-doughnuts that the deception wasn’t conceived by the programmers involved. Programmers typically focus on problem solving and the implementation of algorithms, not deviousness and stealth (three-letter government agencies and pure-play contractors excluded). The problem-solving paradigm and the challenge of efficient, accurate implementation just don’t seem to be a good fit with the trickster psyche. In my view, identifying the cause of this ethical breach is far more important than precisely identifying emission rates, because the former, unlike the latter, can’t be openly measured and debated. We have no way of knowing at this point how portable such software deceptions have become and where they’re most likely to appear next.
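
For readers who haven’t followed the technical details: published analyses report that the software inferred dynamometer testing from inputs such as steering-wheel angle and speed profile, then switched emission-control calibrations accordingly. The following is a minimal, entirely hypothetical sketch of that general scheme; every name in it is my invention, and it is emphatically not VW’s code:

    from dataclasses import dataclass

    @dataclass
    class VehicleState:
        speed_kmh: float
        steering_angle_deg: float

    def looks_like_dyno_test(s: VehicleState) -> bool:
        # On a test stand the drive wheels spin while the steering wheel
        # stays perfectly still: a signature no real road trip produces.
        return s.speed_kmh > 0 and abs(s.steering_angle_deg) < 0.5

    def select_emissions_calibration(s: VehicleState) -> str:
        # The cheat: full NOx controls only when the car believes it's
        # being tested; a dirtier, higher-performance map otherwise.
        if looks_like_dyno_test(s):
            return "low_nox_test_calibration"
        return "road_calibration"

The unsettling part is how ordinary such logic looks: a dozen lines a competent programmer could write in an afternoon, which is precisely why the institutional question of who ordered it matters more than the code itself.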

Political Sleight of Hand

Software sophistry shares DNA with one of the slipperier political maneuvers, deceptive policy design—the deliberate dissemination of misinformation to delude the public into thinking that a policy serves one purpose when it actually serves another. In addition to social engineering, deceptive policy design involves a healthy dose of perception management—it’s what actors, phishing scammers, fraudsters, and politicians do for a living.9 But whereas actors and criminals might dabble in perception management, politicians have perfected it into an art form.

The misuse of public policy for parochial ends has been well documented and intensively studied by academics and journalists alike.3,10 Policy corruption plagues all democratic governments because bad policy is intentionally disguised. However, there are subtle distinctions to be made. David Stockman, for one, has said that recent public policy “is what is ruining American capitalism.”6,11 He made a convincing case for the claim that policy wonks use policy changes to disguise their partisan advocacy as some form of statism. Of course, statism extends beyond economic policy matters; police statism, corporatism, neoliberalism, and crony capitalism all advocate some form of statist intervention, but they do so in different, yet overlapping, domains. For example, surveillance, state-subsidized markets, regulation (or the lack thereof), government bailouts, and so on are all statist activities, although they work toward different ends—a point that is commonly overlooked. The political effect is that visible signs of statism masquerade as necessary byproducts of free enterprise, national security, national defense, and social safety nets. This is by design.

Stockman recommended critically examining all policy proposals from politicians, for they aren’t likely to be in the public interest—certainly not in the long run—and often serve only to propagate the power of the controlling elite. Stockman’s thesis is provocative and credible in light of decades of feckless foreign, economic, national security, drug, healthcare, and other policies. A politician who advocates a “national policy” attracts more public support than one peddling an agenda that’s nothing more than a subcerebral, glandular opinion, but all too often the only difference is the label. Stockman admonished us for not holding policies up to at least a minimal standard: that they be coherent, consistent, and truthfully represented.

But even this minimal standard isn’t enough, because a policy might be deliberately deceptive. Political scientists Jacob Hacker and Paul Pierson documented several categories of policies that qualify,3 including policies whose features “are designed to hide what policies are really doing while deliberately restricting the scope for future democratic choice.” An example is the Medicare Prescription Drug, Improvement, and Modernization Act of 2003.

The received view is that this so-called Medicare Modernization Act was a political pacifier offered by the George W. Bush administration to deflect public and Democratic criticism by taking a positive action to help Medicare beneficiaries deal with increasing healthcare costs.10 Although the Act had serious fiscal deficiencies, such as prohibiting the federal government from using its bargaining power to reduce drug costs and requiring that all prescription coverage be handled by intermediate private corporations rather than by Medicare directly, the trickle-up welfare economics for corporate America aren’t in and of themselves sufficient to qualify the Act as policy corruption. That honor goes to the well-documented misrepresentation of the budget. The 2003 Bush administration budget had prescription drug coverage pegged at $400 billion over 10 years, even though the Medicare chief actuary reported that the 10-year budget would actually be at least $535 billion, and now we know it to be north of $600 billion. Because the legitimate cost projection of well over $500 billion was likely to inflame Congress, the head of Medicare ordered the actuary to hide the actual cost projections from Congress. Although several government agencies ultimately reported on the impropriety (if not outright illegality) of this conduct (www.gao.gov/decisions/appro/302911.pdf; oig.hhs.gov/publications/docs/press/2004/070704IGStatement.pdf), the bill had already passed. And this wasn’t the end of public deception in the name of progressive healthcare legislation. Seven years later, the Barack Obama administration’s Affordable Care Act piled on additional trickle-up giveaways and handouts to the very same healthcare and insurance industries that benefitted from Bush’s largess, once again by misrepresenting the effects of the legislation to the public. This led journalist Matt Taibbi to comment that “The epic struggle to pass health care reform was … a shameless betrayal of the public trust of historic proportions.”12

As Wall Street executives pointed out to Congress after the 2008 economic meltdown, making stupid decisions isn’t illegal, so we’ll pass over whether the Medicare Modernization Act was in the public interest in the first place. Boondoggles aren’t necessarily deceptive. But the intentional withholding of actuarial data that affects federal budget projections is a paradigm case of deceptive tactics used to push bad policy—it denied both the public and Congress a measured decision. Beyond that, the Bush administration hired actors to present bogus news reports and interviews with administration officials purporting to show that “all people with Medicare will be able to get coverage that will lower prescription drug spending.”13 Not only was failing to disclose that these were paid actors reading from a government-approved script a breach of the National Association of Broadcasters code of conduct, it also smacked of state-sponsored domestic propaganda. And even if the campaign wasn’t a direct violation of the 1948 Smith–Mundt Act, it still failed the Government Accountability Office’s sniff test: the GAO found that some presentations weren’t “strictly factual” (bureaucratese slang for “lies”).

A second example of policy deceit can be found in the Bush administration’s efforts to eliminate estate taxes for the political donor class. In a widely reported effort to attract public support, and with the help of Republican pollster Frank Luntz, the administration labeled the estate tax a “death tax” and added emotive fuel by appealing to family farmers who had supposedly lost their farms as a result. However, no such farmers could be identified, the reason being that the exemption for married couples at the time was $4.1 million for family farms and small businesses whose heirs continued to operate the business (several times the average value of most farms and businesses). In fact, the farm assets of estates over $10 million averaged only $1.85 million, still well under the limit.4 The proposed “death tax” legislation never had anything to do with farms and small businesses; that was deliberate misinformation put forward to advance legislation that accommodated the interests of the very wealthy. And despite these facts, the House of Representatives still passed the Death Tax Repeal Act of 2015, claiming that “the death tax can force a family to sell off parts of a business or farm, lay off workers, or shutter a business entirely” (policyandtaxationgroup.com/status-of-estate-tax-legislation/). Estate taxes only affect the political donor class. Full stop. But by framing the discussion around imaginary penalties to vulnerable family farms and small businesses, the sponsors of these bills attracted support from the very people who would be disadvantaged by their passage. Expect the “death tax” agenda to rear its ugly head once again under the newly elected government.

We’re well served to be mindful of Albert Einstein’s maxim that a foolish faith in authority is the worst enemy of truth (quotesgem.pro/status/125015). To paraphrase the poet George Herbert (with considerable liberty), the vision of a blind man standing alone is less impeded than one standing on the shoulders of a liar. You may quote me on this.

As far as software and technology deception are concerned, I agree with Brignull that computing professionals need to take public stands against these darker behaviors, lest those of us who aren’t part of the solution become part of the problem. One of the ethical discussions computing professionals should have is whether, or to what extent, deceptive programming should be tolerated.14 The problem is much worse in politics because of the power of entrenched special interests, but the underlying ethical issues are the same. Society has become more accepting of deception because our crap detectors either aren’t turned on or aren’t tuned properly. Much like a TV set of the 1940s and 1950s, a crap detector that isn’t continuously tweaked will only produce noise. In the case of politics, a review of the policies and legislation behind the Wall Street bailouts, tax credits, tax shelters, the Alternative Minimum Tax, the oil depletion allowance, deferred income, property tax abatements, and foreign tax credits, to name but a few, will reveal similar deceptions. The reason is simple: politicians have no enforceable ethical or fiduciary duty to citizens and taxpayers, and only an informed electorate is in a position to hold them accountable at the ballot box. To paraphrase philosopher and calculator inventor Blaise Pascal, the exercise of responsible political judgment requires a prepared mind.

Computing and high-tech industries hold themselves to far higher standards (he said without bias). We make plenty of mistakes, but outside of the undue influence of a few three-letter government agencies, our technology is largely free of deceit. Once again, the reason is simple: our industries are highly competitive, and for the most part, we’re held accountable. Competition keeps our crap detectors polished—at least with respect to high tech.

References

  1. W. Trotter, Instincts of the Herd in Peace and War, T.F. Unwin, 1916.
  2. E. Bernays, Propaganda, Routledge, 1928.
  3. J. Hacker and P. Pierson, Winner-Take-All Politics: How Washington Made the Rich Richer and Turned Its Back on the Middle Class, Simon and Schuster, 2011.
  4. D.C. Johnston, Perfectly Legal: The Covert Campaign to Rig Our Tax System to Benefit the Super Rich—and Cheat Everybody Else, Portfolio, 2005.
  5. T. Piketty, Capital in the Twenty-First Century, Belknap Press, 2014.
  6. D. Stockman, The Great Deformation: The Corruption of Capitalism in America, Public Affairs, 2013.
  7. C. Nodder, Evil by Design, Wiley, 2013.
  8. N. Oreskes and E. Conway, Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming, Bloomsbury Press, 2011.
  9. H. Berghel, “Breaking the Fourth Wall of Electronic Crime: Blame It on the Thespians,” Computer, vol. 45, no. 5, 2012, pp. 86–88.
  10. J. Hacker and P. Pierson, Off Center: The Republican Revolution and the Erosion of American Democracy, Yale Univ. Press, 2005.
  11. D. Stockman, Trumped: A Nation on the Brink of Ruin… And How to Bring It Back, Laissez Faire Books, 2016.
  12. M. Taibbi, Griftopia, Spiegel and Grau, 2011.
  13. D. Barstow and R. Stein, “The Message Machine: How the Government Makes News; Under Bush, a New Age of Prepackaged News,” The New York Times, 13 Mar. 2005; query.nytimes.com/gst/fullpage.html?res=9A03E5DD153CF930A25750C0A9639C8B63&pagewanted=3.
  14. K. Greene, “How Should We Program Computers to Deceive?,” Pacific Standard, 3 Sept. 2014; psmag.com/how-should-we-program-computers-to-deceive-eedbf653805a.