Why Facebook's (Nasdaq: FB) News Feed Tinkering Scandal Will Blow Over

Facebook Inc. (Nasdaq: FB) stock has been logging impressive gains, up nearly 24% year to date and an astonishing 172% in the past year. And the company's strategic acquisitions continue to bolster investor confidence.

But a recently published study could erode Facebook users' trust - and it raises concerns about privacy all over again.

Facebook tweaked the content mix in the news feeds of close to 690,000 (unsuspecting) users for one week in early 2012. Some members were flooded with more positive posts, while others were shown more negative posts.

Researchers from Cornell University, the University of California, San Francisco, and Facebook then analyzed more than 3 million posts containing over 122 million words. Using an algorithm, they characterized the language as either positive or negative.
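The article doesn't name the algorithm, but the approach it describes - tagging each post as positive or negative based on the words it contains - is essentially word-list sentiment scoring. Here is a minimal, purely illustrative sketch of that general technique; the word lists and function name below are hypothetical and not the researchers' actual code.

```python
# Illustrative word-list sentiment scoring (hypothetical word lists,
# not the study's actual methodology or code).

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "hate", "terrible", "lonely"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by counting matches."""
    words = text.lower().split()
    pos = sum(1 for w in words if w in POSITIVE_WORDS)
    neg = sum(1 for w in words if w in NEGATIVE_WORDS)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

# Example usage:
print(classify_post("So happy and excited about the weekend!"))  # positive
print(classify_post("Feeling sad and lonely today."))            # negative
```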

The study found that users shown more negative content were more apt to share negative posts, whereas users in the positive group showed a tendency to share more upbeat posts.

In short, Facebook was able to alter the emotional state of its users.

Here's why some believe the study, and Facebook, crossed a line - and what this development means for FB stock...

FB Study Was Not Illegal... but Was It Ethical?

The results of the study were published June 17 in the highly respected academic journal Proceedings of the National Academy of Sciences.

"Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness," wrote data scientist Adam D.I. Kramer of Facebook, Jamie E. Guillory of the University of California, and Jeffrey T. Hancock of Cornell University.

The mood changes were small. Yet, given the size, scale, and reach of Facebook, the findings have major implications, according to the researchers.

Major implications, indeed.

Some Facebook users and critics say Facebook crossed an ethical boundary.

"Facebook didn't do anything illegal, but they didn't do right by their customers," said Brian Blau, a technology analyst with research firm Gartner. "Doing psychological testing on people crosses the line."

Privacy activist Lauren Weinstein took to Twitter and wrote, "I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it's possible."

But here's why Facebook is defending the study - and what the potential impact will be on FB stock...

FB's terms of service give the company permission to conduct this kind of research. Yes, it's loosely buried there within the 14,000-word agreement.

"We carefully consider what research we do and have a strong internal review process," Facebook said in a statement. "There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."

Facebook and study leader Kramer defended the experiment.

"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product," Kramer wrote in a public post on his Facebook page. "We felt that is was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."

But he was a bit more apologetic than his employer.

He added, "In hindsight, the research benefits of the paper may not have justified all of this anxiety."

Scandalous - but Not Seriously Damaging to FB Stock

James Grimmelmann, professor of law at the University of Maryland, blogged: "This study is a scandal because it brought Facebook's troubling practices into a realm - academia - where we still have standards of treating people with dignity and serving the common good. The sunlight of academic practices throws into sharper relief Facebook's utter unconcern for its users and for society."

While the study and results are unquestionably scandalous, they aren't likely to be too damaging.

The study took place two years ago. Facebook is now a more mature and shrewd company. Additionally, what Facebook did isn't really that different from what any number of free social networking sites do behind the scenes.

"Facebook knows it can push users' limits, invade their privacy, use their information, and get away with it. Facebook has done so many things over the years that scared and freaked out people," Grimmelmann told Bloomberg.

"Even so, the anger won't have a long-lasting effect," Grimmelmann added. Some users may threaten to leave Facebook. Most, however, "want to be where their friends are," and there isn't a Facebook alternative offering more privacy, he said.

Grimmelmann also perceptively explained in his blog that the study itself isn't the problem. "The problem is our astonishingly low standards for Facebook and other digital manipulators."

Facebook stock shrugged off the backlash. Just before noon, FB shares were up 0.25% at $67.77.
