A day or two ago I put up a blog post pointing to a number of resources relating to social media and privacy, primarily Facebook (and to a lesser extent Twitter).

One of the articles I mentioned there was Kevin Townsend's "Talking privacy and Facebook with Alexander Hanff". Hanff is Head of Ethical Networks at Privacy International, and his thoughts on the erosion of privacy and Facebook's "bait and switch" marketing (not to mention the reasons he no longer has a Facebook account) are well worth reading in their own right. Indeed, I think there are a lot of people in the security business who wouldn't be on Facebook at all were it not for its value as a communication channel. Or, some would say, as a PR medium.

However, I was particularly interested by Kevin's conclusion:

What if we’re wrong? What if Zuckerberg is right? What if people really don’t care about their privacy? What right then do we have to try to persuade them otherwise? The question is simple: at what point does ‘education’ become social engineering on a massive scale: an erudite few trying to change the lifestyle opinions of the unenlightened masses who simply don’t agree with us?

In the security industry, we're sometimes too ready to be prescriptive, seeing security and privacy concerns as paramount where others see them as a distraction. We've also become used to the mindset that computer users will always prefer convenience to security. Sometimes that's certainly true, though in the past 20 years I've worked with many people who were anxious to Do The Right Thing to maintain good security, even if they needed guidance on what the right thing actually was.

But outside the office (or in the office but not exactly working), perhaps fewer people are concerned about their own privacy and confidentiality. So do we have the right to insist that they should care? Well, education is social engineering in its more traditional, sociological sense: it's part of the process of socialization. Perhaps people are better socialized into thinking securely in a work context (where it's expected and sometimes mandatory) than in their personal lives (where security is more of a personal decision). It seems to me that we should at least try to give them enough information about the risks so that they can make informed decisions of their own.

In the meantime, here's a pointer (tip of the hat to Kurt Wismer) to a cartoon that may or may not do justice to Zuckerberg, but certainly succinctly expresses the misgivings many of us have...

David Harley CITP FBCS CISSP
Senior Research Fellow