Nudged Citizen

(Continuation of “AAA Citizen”) The Chinese Social Credit System works in ways similar to (and inspired by) customer loyalty programs such as special discounts for supermarket client-card holders or members of frequent-flyer programs. As Netzpolitik.org points out, pilot schemes such as Alibaba’s Sesame Credit have their Western counterparts. Facebook, for example, filed a patent application in 2015 based on using platform-user data for the creation of credit scores:

When an individual applies for a loan, the lender examines the credit ratings of members of the individual’s social network who are connected to the individual through authorized nodes. If the average credit rating of these members is at least a minimum credit score, the lender continues to process the loan application. Otherwise, the loan application is rejected.[i]
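To make the mechanism concrete, the screening logic described in the patent excerpt can be rendered as a short sketch in Python. Everything here is illustrative: the function name, the sample data and the 600-point cutoff are assumptions, not details from the filing.

```python
# Illustrative sketch of the screening logic described in the patent excerpt.
# Names, data and the threshold are hypothetical.

MIN_AVERAGE_SCORE = 600  # assumed cutoff; the patent excerpt names no number

def screen_loan_application(connection_scores):
    """Continue processing only if the applicant's connections average
    at least the minimum credit score; otherwise reject outright."""
    if not connection_scores:
        return False
    average = sum(connection_scores.values()) / len(connection_scores)
    return average >= MIN_AVERAGE_SCORE

# A hypothetical applicant whose network drags the average below the cutoff:
connections = {"friend_a": 720, "friend_b": 510, "friend_c": 540}
print(screen_loan_application(connections))  # False -> application rejected
```

The applicant’s own record never enters the calculation; under this logic, the scores of the people one is connected to decide whether the application is even processed.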

Like the Chinese Social Credit projects, this model combines the carrot and the stick on the level of both individual and group behavior, creating personal incentives, directed group self-management (you will be careful who you ‘friend’, given the impact on your credit score, and monitor closely those you do friend) and class consolidation (if you come from a poor background, with a poor family and poor friends, it will not be some abstract social determinism but the literal valuation of your peers’ scores that keeps you from getting credit). Like the Chinese Social Credit System, then, Western capitalism has long combined data gathering with small perks handed out in order to nudge citizens and customers into specific kinds of behavior.

Though various conceptual approaches to nudging exist, they all define the term as the purposeful guiding of decision processes that are nonetheless supposed to remain “free”. Through nudging, it is claimed, people are enticed to take certain decisions, but not forced to. Cass Sunstein and Nobel Prize winner Richard Thaler, authors of the 2008 book Nudge that was instrumental in popularizing the concept, call this setting of nudging impulses “choice architecture”. Legend has it that one of the founding moments of nudging came when a manager at Amsterdam’s Schiphol Airport had a fly printed on a piece of plastic laid into the airport’s urinals, nudging men to relieve themselves into the urinal, rather than just roughly in its direction, by giving them something to aim at. As a result, cleanliness reportedly improved by 80%.[ii] A better-known example of choice architecture can be found in supermarkets placing products for children at their eye level (rather than that of their parents), or in cafeterias offering fruit early in the queue and sweets near the cash register in order to induce clients to choose the former over the latter.

But as in China, such nudging has not been confined to private enterprise in the West. Well-known “nudge units”[iii] have been established by the U.S., British and German governments, among others. While the British unit was successful enough to be partially privatized in 2014, public outcry accompanied the project in Germany, where the memory of two twentieth-century totalitarian regimes is still vivid and state nudging summons specters of prior experiments in social manipulation, brainwashing and totalitarian control. As opposed to public announcements urging people to drive more safely, eat better and smoke less, nudging is perceived as a way of messing with people’s lives and privacy in hidden ways, with the explicit goal that they not notice. State nudging in particular has been remarked upon for its post-modern implementation of power: instead of clear hierarchical authority and control through discipline and punishment, environments are designed so as to make their targets think they took their decision intuitively, thus inducing a form of identification with power that makes it all the easier to obey what dystopian vocabulary might describe as subliminal orders. But such critique also extends to less subliminal instances, such as this example given by The Economist:

Laws in some American states that have suppressed black people’s votes, such as those passed by North Carolina in 2013, look remarkably like nefarious nudges, from limiting the types of IDs that can be used for registration to banning out-of-precinct voting. All made voting less easy, attractive, social and timely—and disproportionately cut the number of black people voting.[iv]

Other well-known debates about the political dark arts concern nudging through social media, for example during the 2016 U.S. presidential election or the Brexit campaign (cue: Cambridge Analytica). But incidents drawn from the digital realm involve more than just the acts of fiendish foreign governments. Experiments such as Facebook’s 2014 attempt to manipulate its users’ emotions by tweaking their news feeds[v] show the dystopian potential arising from nudging’s stealth nature, regardless of the force (private or public) applying it.

There is, then, a crucial difference between primitive forms of public communication on the one hand, and social media and smart-world nudging on the other. Unlike public service announcements (“Don’t litter the forests”) or laws that make people organ donors by default (asking them to opt out rather than in), smart cybernetic modes of nudging are continuously fine-tuned through permanent flows of data extraction and target surveillance. And unlike advertisement posters, which are easy to recognize and process as such, cybernetic nudging (such as tampering with social media news feeds) is harder to assess and ignore because it cannot necessarily be told apart from the “authentic” reality of our smart worlds. How these dystopian differences between announcements or advertisement and nudging may best be understood is excellently illustrated by Charles Duhigg in an article he wrote on U.S. retailer Target for The New York Times.[vi]

The obvious goal of advertising is to entice people to pursue specific forms of consumption. But instead of just randomly dropping flyers over crowds, advertisers have long sought to target-design their offers in order to make them more efficient (something measured in return per dollar spent). Classic examples would be advertising razor blades during sports broadcasts (with their high share of young male viewers) or designer clothing in lifestyle magazines. Cybernetic nudging goes beyond that: it extracts and extrapolates data from targets before pitching product communication to them, so as to gain insight into these targets’ minds and the vulnerabilities it can exploit. Instead of addressing roughly defined group identities (“men”, “style aficionados”), cybernetic product communication is based on dynamic feedback concerning specific individuals (a young male sports fan in the process of getting divorced). Since research has found that consumption patterns are hard to change, companies seek to identify moments when individuals actually are vulnerable to external manipulation, so as to warrant the investment in advertisement or nudging:

… when some customers were going through a major life event, like graduating from college or getting a new job or moving to a new town, their shopping habits became flexible in ways that were both predictable and potential gold mines for retailers. The study found that when someone marries, he or she is more likely to start buying a new type of coffee. When a couple move into a new house, they’re more apt to purchase a different kind of cereal. When they divorce, there’s an increased chance they’ll start buying different brands of beer.

As opposed to largely random mass advertisement, easily shrugged off by its targets, cybernetic nudging is not only character-designed but also precisely timed, pitched when the defenses of habit or custom are down and even the critical mind is easier to sway. Obviously, such stealth attempts at manipulation require that their targets are not aware of them. Nudging therefore responds to multiple shortcomings of classic advertising. Not only is it targeted, timed and cheaper; it is also not easily identifiable as the “libertarian paternalism” that Nudge co-author Cass Sunstein describes it as, an identification that would create in many people a suspicion as to the truth content of the product communication. Of course the maker of a product or service will say theirs is the best product, so targets of classical advertisement will take that statement with a grain of salt. The nudged citizen, however, thinking that a certain choice arose from their own intuition rather than from choice architecture, might not be quite as careful. But in order to achieve this veneer of the intuitive, communication needs cues that connect the nudge to some private information, as it is this connection to something private that makes it seem like it was actually one’s own idea. As Duhigg reports in his piece, Target’s marketing department therefore asked its research team: “If we wanted to figure out if a customer is pregnant, even if she didn’t want us to know, can you do that?” [Italics added]

So, while many people identify digital target-advertising with its primitive forms, such as being shown ads on one website for a pair of shoes just considered on another, cybernetic nudging is much more than that. It’s about collecting private information about a target, weaponizing that information into a tailored nudge, measuring the effect of that nudge, adjusting the nudge and continuously reiterating it until the desired result is achieved (for example the creation of a new habit: eating healthier, shopping at Target). It is in this tailored design, feedback cycle and reiterative nudging geared towards habit formation that we find cybernetic nudging’s most dystopian potential. As targeted manipulation becomes more individualized and more invisible, we move from advertising in its narrow sense of simply making an offer known to a mass public to the direct but concealed manipulation of individual people, undermining potential appeals to personal responsibility or decision-making by leaving a person in the dark as to what is happening to them. As opposed to the largely transparent rules and data-gathering processes underlying the Chinese Social Credit System, Western nudging thus by default conceals not only its nature but also its sources of information (and the privacy breaches they imply).
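The cycle just described can be sketched schematically. The following Python toy model is an assumption-laden illustration, not any platform’s or retailer’s actual system: every function is a stand-in, and the target’s “response” is simulated with a random draw.

```python
import random

# Schematic sketch of the collect -> tailor -> measure -> adjust -> reiterate
# cycle described above. Every function is a stand-in, not a real API, and the
# target's response is purely simulated.

def collect_data(target):
    # surveillance step: extract private signals about the target
    return {"recent_purchases": target["history"][-5:]}

def tailor_nudge(profile, intensity):
    # weaponize the extracted data into a concrete, personalized cue
    return {"coupon_for": profile["recent_purchases"][-1], "discount": intensity}

def measure_response(nudge):
    # did the target act on the cue? (simulated here by a random draw)
    return random.random() < 0.1 + nudge["discount"]

def run_nudging_cycle(target, desired_habit_strength=3):
    habit_strength, intensity = 0, 0.05
    while habit_strength < desired_habit_strength:    # reiterate until the habit sticks
        nudge = tailor_nudge(collect_data(target), intensity)
        if measure_response(nudge):
            habit_strength += 1                       # a successful nudge reinforces the habit
        else:
            intensity = min(intensity + 0.05, 0.5)    # adjust the nudge and try again
    return habit_strength

target = {"history": ["soap", "cereal", "coffee", "diapers", "wipes"]}
print(run_nudging_cycle(target))   # terminates once the simulated habit has formed
```

The point of the sketch is the loop itself: data flows in, the nudge is adapted, and the process repeats until the desired behavior has been produced.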

Duhigg notes that “new parents are a retailer’s holy grail” because a prime moment for manipulating people’s habits “[…] is right around the birth of a child when parents are exhausted and overwhelmed and their shopping patterns and brand loyalties are up for grabs.” But instead of waiting for the birth of a child and its public announcement and then competing with other retailers in sending the young parents baby-related offers, Target wanted to be the first to know, so it could be the first to profit from the situation and nudge the expectant parents towards its stores without interference. In order to do this, Target drew on the information collected in its own stores through a Guest ID number attributed to each customer, as well as on data bought from external sources; in both instances without the customer’s knowledge:

Whenever possible, Target assigns each shopper a unique code — known internally as the Guest ID number — that keeps tabs on everything they buy. “If you use a credit card or a coupon, or fill out a survey, or mail in a refund, or call the customer help line, or open an e-mail we’ve sent you or visit our Web site, we’ll record it and link it to your Guest ID,” Pole said. “We want to know everything we can.”

Also linked to your Guest ID is demographic information like your age, whether you are married and have kids, which part of town you live in, how long it takes you to drive to the store, your estimated salary, whether you’ve moved recently, what credit cards you carry in your wallet and what Web sites you visit. Target can buy data about your ethnicity, job history, the magazines you read, if you’ve ever declared bankruptcy or got divorced, the year you bought (or lost) your house, where you went to college, what kinds of topics you talk about online, whether you prefer certain brands of coffee, paper towels, cereal or applesauce, your political leanings, reading habits, charitable giving and the number of cars you own. (In a statement, Target declined to identify what demographic information it collects or purchases.) [Italics added]
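At bottom, what the quoted passage describes is a single customer record keyed to one identifier. A minimal sketch of such a record might look as follows; the field and method names are hypothetical, since Target’s actual schema is of course not public.

```python
from dataclasses import dataclass, field

# Minimal sketch of the kind of record the passage above describes: everything
# keyed to one Guest ID. All names are hypothetical; the real schema is not public.

@dataclass
class GuestProfile:
    guest_id: str
    interactions: list = field(default_factory=list)   # purchases, coupons, e-mail opens, web visits
    demographics: dict = field(default_factory=dict)   # age, marital status, commute time, ...
    bought_data: dict = field(default_factory=dict)    # third-party data purchased from brokers

    def record(self, channel, detail):
        """Link any touchpoint back to this Guest ID."""
        self.interactions.append({"channel": channel, **detail})

profile = GuestProfile(guest_id="84-1123")
profile.record("credit_card", {"item": "unscented lotion", "price": 6.99})
profile.demographics["estimated_salary"] = 48000
```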

Duhigg excellently describes how this data is processed and combined with insights from research into habit formation to create tools that allow Target to identify the stage of a woman’s pregnancy and then use that knowledge to instill in the expectant mother the habit of shopping at Target. Based on the data Target had on their customers:

They know that if she [a hypothetical customer named Jenny] receives a coupon via e-mail, it will most likely cue her to buy online. They know that if she receives an ad in the mail on Friday, she frequently uses it on a weekend trip to the store. And they know that if they reward her with a printed receipt that entitles her to a free cup of Starbucks coffee, she’ll use it when she comes back again.

The forms and modes of nudging, here specifically with the goal of triggering habits, are thus tailor-made and improved as more data on “Jenny” is collected. This is a major difference between nudging and advertising. As mentioned a few times already, such nudging is also different in that it needs to be concealed. Instead of simply sending Jenny additional coupons or offers concerning baby products, Target mixes those coupons in with other offers. That was a lesson learnt from earlier efforts, when clients had understood from the baby-related ads suddenly appearing in their mailboxes that the company had gained information on them that they themselves had not given, and had reacted with anger. There was, in other words, a steering process improved through a learning process based on a data-driven feedback cycle: the very definition of cybernetics. What turned this from advertising into the choice architecture of nudging was the moment the advertising was ‘disappeared’ into pre-existing habit environments (what Duhigg refers to as piggybacking “on an existing habit”). In the case of Jenny, that habit environment was going to Target in order to buy cleaning products. In order to nudge pregnant women into buying baby products, Target thus mixed baby advertising with coupons for products they regularly bought at Target, as well as for products they would never buy, such as “a lawn mower”. The manipulative impulse resulting from intense research was concealed under a veil of randomness without which it had been found to be ineffective. Instead of feeling manipulated, women were induced to feel that using the coupons was their very own idea; a textbook example of a successful nudge.
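The concealment step, mixing the targeted baby offers in with habitual purchases and deliberate decoys such as the lawn mower, can be sketched as follows. The product lists and the booklet size are invented for illustration; this is a toy rendering of the idea, not Target’s method.

```python
import random

# Sketch of the concealment step described above: targeted baby offers are
# hidden among coupons for things the customer already buys and things she
# never would. Product lists are invented for illustration.

def assemble_coupon_book(targeted, habitual, decoys, size=8):
    """Mix the targeted offers into a booklet that looks randomly assembled."""
    filler = random.sample(habitual, 3) + random.sample(decoys, 3)
    booklet = targeted[:size - len(filler)] + filler
    random.shuffle(booklet)            # the ordering itself should betray no pattern
    return booklet

targeted = ["diapers", "unscented lotion", "crib sheets"]
habitual = ["all-purpose cleaner", "paper towels", "laundry detergent", "dish soap"]
decoys   = ["lawn mower", "wine glasses", "golf balls", "snow shovel"]

print(assemble_coupon_book(targeted, habitual, decoys))
```

The design choice is the whole point: the filler items carry no commercial intent of their own; they exist only so the targeted items do not look targeted.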

Social engineering through cybernetic systems, then, is not only not exclusive to China, but highly developed and pursued by both the states and the corporations of Western capitalism. However, it is generally not as transparent and anodyne as the examples developed here may suggest. In order to further assess the authoritarian nature of such social control based on big data, algorithms and concealed behavioralist cues, the next post will therefore continue following the red thread running from Chinese Social Credit through capitalist nudging with a look at the use of bail and parole algorithms in the United States. (continued in “Automated Justice / Predictive Policing”)

[i]https://netzpolitik.org/2015/dystopia-wird-wirklichkeit-was-ist-dran-an-chinas-social-credit-system/

[ii]https://www.tagesspiegel.de/themen/agenda/streit-ums-nudging-wie-der-staat-verbraucher-erzieht/11446920.html

[iii]https://www.washingtonpost.com/news/wonk/wp/2017/08/11/governments-are-trying-to-nudge-us-into-better-behavior-is-it-working/?noredirect=on&utm_term=.103fc9718872. This article also contains assessments of nudging’s effectiveness compared with other forms of shaping behavior.

[iv]https://www.economist.com/international/2017/05/18/policymakers-around-the-world-are-embracing-behavioural-science

[v]https://www.theguardian.com/technology/2014/jun/29/facebook-users-emotions-news-feeds

[vi]Charles Duhigg. “How Companies Learn Your Secrets.” The New York Times. Feb. 16, 2012. Web. https://www.nytimes.com/2012/02/19/magazine/shopping-habits.html?pagewanted=1&_r=1&hp Unless otherwise noted, all following quotes are drawn from this article.
