Should Online Ads Really Offer Binge Drinkers A Booze Discount?

Does the Internet have a duty to protect us from ourselves?

I began to wonder as I attempted to trick Facebook into believing I was single. Hoping to escape the plague of bridal ads that descended after I announced my engagement, I tried to confuse the big data overlords by switching my relationship status and joining every dating site that wasn't obviously a scam. I wanted privacy. Instead, I got ads for matchmaking sites side-by-side with the wedding registries already pitching me in my News Feed.

The nature of these ads -- and the ease with which I’d triggered them -- points to the amoral state of the online marketing apparatus: The Internet makes no moral assessment of what its ads aim to make us do. If I’m a bride, the Internet offers me gowns. If I’m a bride who wants to cheat, it delivers me mates. According to the rules of the Web, the good and righteous path is whichever makes us spend.

But should the Internet really be so blind, especially when it knows us so well? To be fair, the technical limitations of today’s targeting techniques mean the marketers probably failed to realize I was both engaged and on the prowl. “A fairly large amount of people are listed as both male and female by data brokers,” said Brian Kennish, founder of Disconnect, a maker of privacy software. And yet, given the expanding range of digital information that we produce and companies collect, there's every reason to think the data overlords will have more complete portraits of us soon.

This raises a new quandary: Will online advertisements adopt a moral code? As they get more insight into our peccadillos, weak spots, indulgences and addictions, should the Facebooks and Googles of the world limit marketers from pushing products that make us behave badly or cause harm? And who’d decide what “bad” looks like?

Generations of Mad Men have been urging us to suspend good judgment and grab the extra Big Mac. Only this time, it’s different. The intimate ads on the Internet are providing “evermore ways to capitalize on our vulnerabilities by being able to figure out more precisely who we are and what we would be vulnerable to at those contexts and moments,” said Evan Selinger, a professor at the Rochester Institute of Technology who studies the intersection of philosophy and tech. Compared with the omnipresent, personalized prompts that appear on our iPhones, yesteryear’s billboards, TV commercials and magazine pages “hit us at more of a generic consumer level.”

The ethical Internet of the future, having realized I was a confused bride trying to cheat, might have banished the OkCupid pitches and instead skewed toward ads for couples counseling. Alternatively, if the trend toward indulging all urges continues, advertisers could have pitched me dating sites pixel-by-pixel with promos for books on hiding affairs from a boyfriend. “Getting cold feet because your fiancé’s acting distant?” an ad in my Gmail inbox might read -- Google knowing full well the warmth of my spouse-to-be’s emails had tapered off lately. “Meet these 17 guys near you who like to talk about their feelings.”

As the quality of the personal data that brokers collect improves, advertisers will pitch us in ever more customized ways, raising thorny ethical questions about the behavior they aim to induce and giving them more sway over us than we might realize -- or want. It may become harder to tell what was our idea and what we did because a Fortune 500 company suggested it.

If getting clicks and credit cards is the sole mandate, it's easy to imagine ads that might prey on our frailties. Consider a person who regularly researches antidepressants and Googles tips on beating depression. One day, her searches switch to “how to tie a noose.” Will she see banner ads peddling psychiatric help? Or ads for rope? In 2010, Google came under fire for displaying ads for toxic chemicals alongside a suicide discussion forum. "Hydrogen Sulphide. Find medical & lab equipment. Feed your passion on eBay.co.uk!" read one of the promotions, which Google only removed after media coverage. Searching Google for tips on how to commit suicide now automatically serves up the number for a counseling hotline -- though it also delivers ads touting "Best Way To Kill Your Self" and "Painless Quick Ways To Die."

A report released last week by the Federal Trade Commission offered a glimpse at how data brokers are already slicing and dicing populations into niche categories like “Urban Scramble” (primarily low-income African-American and Latino city dwellers) or “Rural Everlasting” (single people over 66 with “low educational attainment and low net worths”).

Soon, with help from the data brokers that gather our particulars, marketers may be able to pinpoint people who appear to be compulsive gamblers so they can push ads for casinos. It’s coding to maximize clicks, no matter the moral or personal cost.

The alternative? Programming for the public good. Advertisers get to pick whom they’ll target, but it’s lawmakers and ad networks, like Facebook, Google, Twitter or Apple’s iAd, that set the rules of play. Government officials might step in with their own limits, and there’s ample precedent for curbing how brands can advertise (you won’t see Joe Camel in classrooms). The companies, with their do-no-evil mantras, also could lead the charge. Google could prohibit fast food chains from targeting people who want to diet, or ban Snickers from being pitched to diabetes sufferers. Already, Silicon Valley giants have bans on what marketers can show, no matter what the business would pay. Facebook, for example, only allows dating ads to reach members who've set their relationship status to "single" or left it undefined.

But relying on firms to self-police presumes we trust Google and Facebook to pick what’s good for us and rule on the most moral path. It also puts companies in the position of being our guardians, protecting us from advertising they deem unsavory, unhealthy, unethical or undesirable (a trickier task than it might seem: if a man married to a woman begins downloading dating apps for gay men, would it be right to show him more ads for gay singles sites, or to push marriage counseling?). Do we want Cook, Zuck, Page and Costolo deciding what’s in our best interest, or what's ethical? Selinger argued such chaperoning would be unwelcome on the Web: “Frankly, because of a certain cyber-libertarian view that we have, if websites became more morally intrusive or morally inquisitive … there would be a big backlash."

And yet these are companies that have spent years positioning themselves as forces for public good. "Facebook was not originally created to be a company. It was built to accomplish a social mission," Mark Zuckerberg declared when the social network filed for its initial public stock offering. Though they might be loath to turn down business or embrace a potentially controversial moral code, the alternative -- letting our every vulnerability be manipulated -- could be equally unpalatable.

Currently, Facebook and Google's extensive advertising guidelines focus on excluding ads that "provoke" or "offend" their members. But beyond banning ads for illegal substances, there's little explicit concern for our well-being. An even bigger backlash might arrive if people realize Facebook and Google are allowing advertisers to exploit our weaknesses and lead us down harmful paths.

The best way to avoid the moral-question morass may be a path that makes Silicon Valley's giants equally unhappy: Stop getting to know us. Quit looking for more ways to decipher our moods, stop going after our shopping patterns and drop the attempts to decode our psyches. Because if the data brokers and social networks keep gathering more, the questions around what advertising should or shouldn't be allowed to reach us will eventually demand answers. The big data barons might find that ignorance is bliss.
