Speaking of those settings, Google does have a “Privacy Checkup” tool that you can use to hide certain data from being tracked or gathered. It’s generally well designed, except for one major example, shown below. Play a game with yourself if you like: see if you can spot the problem before you read further.
This is a perfect example of what’s known as a dark pattern. A dark pattern is a design that tricks you into choosing the “right” option, where “right” means “what the company wants you to pick” rather than what you actually want. In this case, boxes are checked by default and you uncheck them to hide information. But if you uncheck the box labeled “Don’t feature my publicly shared Google+ photos as background images on Google products & services,” you’re actually giving Google permission to use your name and profile to advertise products. Google flipped the meaning of that checkbox, making it more likely that someone not reading carefully would click the wrong option.
But what’s really interesting to me is that the word “Don’t” is bolded. You bold something you want to draw attention to, which is pretty much the opposite of how a dark pattern works. Huge organizations are much less monolithic than they appear from the outside, and I suspect that what we see here is a tale of two opinions, played out in a single checkbox. By reversing what checking the option does, Google made it more likely that you would give it permission to use your personal likeness and data for advertising. By bolding the word “Don’t,” Google made it more likely that you’d realize what the box did and adjust the setting appropriately.
In any case, Google’s decision to stop anonymizing data should be treated as serious news, but there’s not much chance people will treat it that way. To date, people have been largely uninterested in the ramifications of giving corporations and governments 24/7 permission to monitor every aspect of their lives, even when that monitoring intrudes into private homes or risks chilling freedom of speech.