It's Independence Day tomorrow, so here's a story about your freedom - one worth thinking about and acting on.
News came to light this week that a Facebook (Nasdaq: FB) data scientist named Adam Kramer conducted an experiment on 689,003 users of the social network site over a seven-day stretch in January 2012.
It's news for a few reasons.
Some people are ticked off that they may have been used in an experiment without their knowledge. Some of the subjects of the experiment may have been teenagers.
And some people, namely folks at the UK Information Commissioner's Office and the Data Protection Commissioner of Ireland, are looking into whether European Union privacy laws were broken.
Facebook wants us to believe it's all a big kerfuffle over nothing.
And today, I'm going to show you how we're letting that freedom - and the rights that come with it - slip through our fingers...
Another Right Bites the Dust (Thanks, FB!)
So what if some people got positively manipulated (as in uplifting) news feeds and some people got negatively manipulated (as in depressing) news feeds, just to see how they'd react and how it affected their social interactions with peers? We'll never know.
At the time of the experiment, Facebook's Terms of Service policy said user data could be used to improve Facebook's products. In May 2012, FB added the phrases "internal operations" and "research" to the policy.
That means Facebook changed the rules after the fact.
So what? Who reads all that blather anyway?
Kramer, the not-so-mad scientist who ran the study, said, "The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."
That's thoughtful. The 100 or so scientists FB employs are just thinking about the company's 1.3 billion users - and how to make their day.
But Pam Dixon of the World Privacy Forum had another take on it - one that is decidedly less... altruistic.
"This isn't A/B testing," said Dixon. (In marketing, two versions A and B are compared, which are identical except for one variation that might affect a user's behavior.) "They didn't just want to change users' behaviors, they wanted to change their moods."
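To see why Dixon draws that line, it helps to know what an ordinary A/B test looks like. Here's a minimal sketch using hypothetical click numbers (the counts, variant names, and threshold are illustrative, not from Facebook's study): show each variant to a group of users, count a behavioral outcome like clicks, and check whether the difference is bigger than chance.

```python
import math

# Hypothetical A/B test results: each variant is shown to a group of
# users, and we count how many clicked (a behavioral outcome).
shown_a, clicks_a = 5000, 400   # variant A: the current page
shown_b, clicks_b = 5000, 460   # variant B: identical except one change

rate_a = clicks_a / shown_a
rate_b = clicks_b / shown_b

# Two-proportion z-test: is the difference in click rates
# larger than random variation would explain?
pooled = (clicks_a + clicks_b) / (shown_a + shown_b)
se = math.sqrt(pooled * (1 - pooled) * (1 / shown_a + 1 / shown_b))
z = (rate_b - rate_a) / se

print(f"rate A = {rate_a:.3f}, rate B = {rate_b:.3f}, z = {z:.2f}")
# |z| > 1.96 means the difference is significant at the 5% level
```

That's the whole game in conventional A/B testing: measure what users *do*. Dixon's point is that the Facebook experiment went a step further, manipulating what users *felt*.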
According to The Wall Street Journal this morning, "Since its creation in 2007, Facebook's Data Science group has run more than 1,000 tests. One published study deconstructed how families communicate, another delved into the causes of loneliness. One test looked at how social behaviors spread through networks. In 2010, the group measured how 'political mobilization messages' sent to 61 million people caused people in social networks to vote in the 2010 congressional elections."
So, like I said: What's the big deal?
We put ourselves out there through social media, and even when we don't, we're being watched anyway.
In case you don't realize how you're being watched, think about what Big Data number crunchers collect when you TiVo a show, use your credit card for anything, transfer money over the web, view any Internet content, send an e-mail, or use the phone. There are eyes and ears all over everything wired and wireless - not to mention the cameras everywhere.
We're still free, right?
No, we are not.
Maybe we just don't care. Maybe this whole independence thing is past its prime.
Too many of us have let government and corporations like FB piss away our right to be free of their prying eyes and manipulation.
Maybe what we need is a Dependence Day.
More from Shah Gilani: Most people who are just "in the market" don't understand high-frequency trading and dark pools. And that's okay. But here's your chance to learn all about the history of dark pools, and their impact on the markets, from an expert who was there when they were spawned in the 1990s...