Facebook, Privacy & Reality

A Tech Editorial by Robbin Orbison

In recent days there has been much press about the so-called “data breach” at Facebook. The word “breach” is being used loosely; it refers to a situation in which data about the personal preferences of millions of Facebook users was used to influence voting. The coverage has focused mainly on the political aspect of the story, the financial impact on Facebook, and the overall question of privacy on the internet.

This blog is an attempt to break this story down to the facts about what happened and examine where the real issues lie.

First, the facts of the case. Dr. Aleksandr Kogan is a Cambridge University psychologist who engages in complex studies that can be most accessibly described as cognitive and behavioral research. He has contributed to research papers with titles like “Mania Symptoms Predict Biased Emotion Experience and Perception in Couples” and “An fMRI Study of Caring vs. Self-Focus During Induced Compassion and Pride.” I think you get the picture.

Dr. Kogan routinely collects data for his research, and in this case did so by creating a personality test app which he promoted on Facebook. Those who agreed to take the test agreed that information about themselves could be harvested and used for research purposes. In addition, a Facebook feature called the Graph API made it possible for Kogan to also collect data on the friends of those who participated in the test. At least, that is as much information as I have been able to find in dozens of articles. How exactly this app and these terms manifested themselves to the users is unclear.

In any event, 270,000 people participated in the personality test, and since the likes and other data of their friends were up for grabs, Kogan ended up with data on about 50 million people, most of whom had no idea it was happening.
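To see how 270,000 test-takers balloon into tens of millions of profiles, a back-of-the-envelope sketch helps. The average friend count below is an invented figure used purely for illustration; only the 270,000 and roughly-50-million numbers come from the reporting.

```python
# Back-of-the-envelope arithmetic for friend-of-participant data harvesting.
# The average friend count is a hypothetical assumption for illustration;
# it is not a reported figure.
participants = 270_000
avg_distinct_friends = 185  # assumed average number of unique friends exposed per participant

profiles_reached = participants * avg_distinct_friends
print(f"{profiles_reached:,} profiles")  # 49,950,000 profiles: on the order of 50 million
```

The point of the sketch is simply that a modest, plausible friend count multiplies a small pool of consenting participants into a dataset three orders of magnitude larger.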

Now, enter Cambridge Analytica. Or not exactly. They actually entered the picture much earlier in the story. They hired Kogan to conduct a massive research project, for which he created the now infamous personality test app and collected the data in question. Cambridge Analytica then went on to convince political campaigns, notably that of Donald Trump, that they could predict and influence voter behavior with this data. (Despite having the word Cambridge in common, there is no relationship between Cambridge Analytica and Cambridge University other than the apparent contractual one entered into by Kogan.)

Traditional advertising is designed to appeal to target markets using demographics, targeting segments of the population that are believed to have similar tendencies. For example, male baldness products would target men of a certain age, income, and geographic location. This is a commonly used system with which we are all familiar.

So, what is different about this situation? The data extracted by Dr. Kogan was used to categorize individuals not by groups they belonged to but by their very personalities, as could supposedly be determined by their likes and engagement on Facebook. So instead of trying to sell baldness cures to middle aged men from the suburbs, Cambridge Analytica would build a target audience based on a personality trait, say, vanity.
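The contrast can be made concrete with a toy sketch. Everything below is invented for illustration: the users, their likes, the "vanity" signal list, and the scoring threshold have nothing to do with Cambridge Analytica's actual model.

```python
# Toy contrast between demographic targeting and trait-based targeting.
# All data and scoring rules here are invented for illustration only.
users = [
    {"name": "A", "age": 52, "sex": "M", "likes": {"luxury watches", "hair products", "selfies"}},
    {"name": "B", "age": 49, "sex": "M", "likes": {"fishing", "woodworking"}},
    {"name": "C", "age": 31, "sex": "F", "likes": {"designer bags", "selfies", "beauty tips"}},
]

# Demographic targeting: bucket people by age and sex.
demo_audience = [u["name"] for u in users if u["sex"] == "M" and 40 <= u["age"] <= 60]

# Trait targeting: score a made-up "vanity" signal from likes, regardless of demographics.
VANITY_SIGNALS = {"luxury watches", "selfies", "designer bags", "beauty tips", "hair products"}

def vanity_score(user):
    # Count how many of the user's likes overlap the signal set.
    return len(user["likes"] & VANITY_SIGNALS)

trait_audience = [u["name"] for u in users if vanity_score(u) >= 2]

print(demo_audience)   # ['A', 'B']: both middle-aged men
print(trait_audience)  # ['A', 'C']: cuts across demographic groups
```

Note how the two audiences differ: the demographic filter groups A with B, while the trait filter groups A with C, someone a demographic campaign would never have reached.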

One interesting point in this story is that Kogan now maintains that this method of targeting voters by determining personality types gleaned through social media interaction is highly unreliable, and he suggests that Cambridge Analytica was in fact selling a product they may have known to be a “myth.” Intuitively, it would make sense that he’s right since many people on social media seek to project an image different from their true personality.

According to Facebook rules at the time, Kogan was completely within his rights to harvest and use the data. This is why the word “breach” does not properly apply. The information was not hacked or stolen but obtained through a process that was allowed and ostensibly promoted by Facebook. The Graph API was routinely used by app developers in the same manner for years until friend-data access was finally dismantled in 2015. Where Kogan stepped out of bounds was in selling the data to Cambridge Analytica. And Cambridge Analytica compounded the problem by refusing to comply when Facebook found out they had it and demanded that they delete it.

Those are the facts as we know them and yes, they are full of missing pieces. But it’s enough of a background to begin to explore the questions being raised about Facebook, privacy and responsibility.

To begin with, does anyone really think that what they post on Facebook WON’T be used for profit? Facebook’s entire business model is based on enticing people to share information about themselves and selling the benefits of that information to its advertisers. The data is packaged so that advertisers can’t see the data of any individual Facebook user, but the fact is every time you like something on Facebook your likes are being collected and used to create target audiences for advertisers.

So, what is wrong with that? Ultimately, this is about businesses trying to sell things by using what they know about potential customers to influence them to buy. And this has been going on since the dawn of advertising.

Let’s say it’s 1955. There is no internet. You frequent your local hardware store and chat with the owner from time to time. The owner makes notes of things about you – your birthday maybe, or how many kids you have, or preferences you have for certain brands. And when that owner gets in a new shipment of something he or she thinks you would like, you get a call or a note. The hardware store owner has collected information about you and is using it in hopes of selling you more stuff.

Over the years technology has made it easier and easier to collect, analyze and use data to reach customers, but the underlying concept has not changed. Facebook is just one of many new mediums that allow the process to take place on a larger scale. Some find the very concept of “big data” to be sinister and when abuses or perceived abuses occur, it is convenient to blame the medium.

Facebook makes no bones about how it makes money and is simply a more sophisticated platform for doing something not at all new. What makes it seem nefarious are the numbers. A hardware store owner remembering your favorite brand of tile grout seems quaint, in fact helpful and appreciated. But when technology collects that same tidbit of information about millions of people and sells it to anonymous third parties who make millions of dollars from it, it feels different.

Whether it really is different is one of many philosophical questions raised in a world where societal infrastructure and the human capacity to adapt to change profoundly lag the breakneck speed of technological advances. But that reality is beyond the scope of this discussion.

The questions at hand are 1) Who is the victim? 2) Who is to blame? and 3) What are the damages?

The current outrage about this poorly understood debacle is on behalf of the individuals whose data was misappropriated. With all due respect to those individuals and their understandable feeling of privacy violation, it’s difficult to see how they were specifically harmed, except in their capacity as part of the true collective victim – humanity. When our already imperfect system of choosing leaders becomes this susceptible to sophisticated manipulation against which the population is powerless to defend itself, the result may be called many things, but democracy is not one of them. We are all damaged by events that favor the few who have the means to manipulate the many who do not.

As for who is to blame, there are a lot of suspects to choose from. Let’s start with the current frontrunner, Facebook.

Does Facebook have a responsibility to ensure that personal information posted freely on the internet remains private? People post things on Facebook precisely because they want to share things about themselves and the more people “like” what they post the happier they are. It’s almost a ridiculous question, except that it’s not.

The fact that I like pistachio ice cream has no value to anyone. But if an advertiser is interested in reaching pistachio-ice-cream-lovers, and Facebook can facilitate that to the tune of millions of people, suddenly ice cream preferences are valuable information. And if information is power, then with great power comes great responsibility. That companies like Facebook are at worst denying responsibility, and at best having trouble figuring out exactly what that responsibility is, is not surprising and is to some extent understandable. This is new territory for everyone.

The next link in the blame chain is Kogan. He has declared himself to be the unfairly accused scapegoat. What did he do wrong? He exploited features that Facebook made available to researchers like him. And he apparently did little or no due diligence on his client, Cambridge Analytica, or the Facebook terms and conditions (he admits he relied on Cambridge Analytica’s assurances that everything was in compliance with Facebook policy). But he fails to emerge as the ultimate villain.

Something could be said of Cambridge Neuroscience – the Cambridge University department that Kogan worked for. Little if anything has been said about its internal policies, which appear to let employees enter into contracts without due diligence or authorization processes.

Then we have Cambridge Analytica, by all accounts a shady group pushing a product they may have known was bogus. But if they believed in their personality-type-driven targeting system (setting honey trapping and things of that nature aside), then what is really wrong with exploiting information to achieve better outcomes for their clients?

Which brings us to the clients. The political campaigns are the only players in the story obligated to hold themselves to higher standards. They are public servants, and the only ones of whom we can reasonably expect that they will err on the side of transparency, fairness, and the democratic process.

Are there any other suspects? Maybe decades of insufficient leadership in education, income equalization, and other areas that leave people vulnerable to manipulation by those who seek to deceive them.

Airplane crash investigators use the term “cascade effect” to refer to a series of events leading up to a crash. There is rarely one single cause.

As they used to say in the NFL, you make the call.

Finally, the damages? Global loss of confidence in world leadership, potential economic instability as the financial sector reacts to the impact of increasingly unchecked technology companies, and a growing sense of helplessness that individuals on both sides of the political spectrum cannot deny.

Ten years ago, when the world was reeling from an unprecedented financial crisis caused by greed and a level of complexity beyond the grasp of most people, the phrase “too big to fail” entered the common lexicon and we debated the logic of helping giants survive despite their destructive behavior. We appear to be entering the era of “too big to police,” where new monsters have been created but their presence is so woven into the fabric of existence and the problems they create are so lacking in clear solutions that we have no choice but to accept their failure of stewardship and allow them to proceed with unaffected apathy. And in fact, that apathy continues to settle upon all of us as more and more too-big-to-fill-in-the-blank issues take center stage in the global discourse.

I admit my first reaction was to question why this was different from the 1955 hardware store. That is a form of apathy born out of trying to deny the existence of yet another thing to be worried about. Others derive their apathy from cynical beliefs that we’re all pawns and everyone is out to get us so what’s the point. And yet others simply don’t involve themselves in pondering such questions and happily post their cat pictures on social media and answer fun questions about their favorite pizza toppings.

But here’s what it all adds up to, in my humble opinion. These are complex problems without clear solutions, and to address them we need intelligent, deep-thinking leaders in government, industry, academia, science, and even the moral compass that can be contributed by religion. And all of those institutions need some overhauling of their own. I don’t know how we get there, but I know where we have to start.

We all need to wake up to the fact that it is no longer 1955. Your preference for a brand of tile grout can now be weaponized against you, not personally but as a member of the global society.

Acceptance of that global societal membership by all of us is the last and best hope for humanity and for regaining control of a world increasingly spinning away from us. It doesn’t mean we have to like each other or agree with each other, only that we have to stop denying our shared predicament and realize that, as John Nash’s game theory suggests, the best outcomes result only when individuals consider the good of the whole along with their own.

The first instinct of those incensed in this case was to feel personally violated, ripped off, targeted. But we’re in this together. The impact is on all of us. The sooner we start making decisions with all of us in mind, the sooner we can begin to create an environment conducive to solving the vexing problems of our time instead of working ourselves into frenzies trying to single out one party or event in the cascade that led to the crash.

And in the meantime, if you don’t want information about you used for the commercial gain of others, don’t post it on social media.

Robbin Orbison is the owner and operator of CapeSpace Business Centers, a full service business/coworking facility in Hyannis, Massachusetts. She is also a member and serves on the Board of Directors of the Cape Cod Technology Council.
