Did scrolling through your Facebook news feed leave you feeling a little bummed back in January 2012?

Chances are, you don't recall your emotional state after reading social media updates more than two years ago, but if you were a Facebook user, you might have been an unwitting participant in a recently published study.

A paper in the Proceedings of the National Academy of Sciences reveals that Facebook intentionally manipulated the news feeds of nearly 700,000 users to study "emotional contagion through social networks."

In short, the social network hid either good news or bad news from users' feeds to see how it affected their emotions.

Researchers tweaked the algorithm Facebook uses to scan users' posts so that it flagged those containing positive and negative words. They then showed one group of users feeds weighted toward neutral or positive posts from friends, and another group feeds weighted toward neutral or sad posts.
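To make that setup concrete, here is a minimal, purely illustrative sketch of word-list filtering of a feed. It is not Facebook's actual code or the study's real method; the word lists, function names and omission rate are invented for the example.

```python
# Illustrative sketch only: classify posts by whether they contain
# positive or negative words, then omit a fraction of one category
# from a user's feed. All names and values here are hypothetical.
import random

POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}   # made-up tiny lists
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def classify(post: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by simple word matching."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress="negative", omission_rate=0.5, seed=None):
    """Randomly drop a fraction of posts in the suppressed emotional category."""
    rng = random.Random(seed)
    return [p for p in posts
            if classify(p) != suppress or rng.random() > omission_rate]

feed = [
    "Feeling great about the weekend!",
    "Awful day, everything went wrong.",
    "Posted some vacation photos.",
]
print(filter_feed(feed, suppress="negative", seed=42))
```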

They found that tampering with news feeds to fill them with more negative updates didn't just make users more negative — it also made them more withdrawn.

They were more likely to post negative updates of their own, less likely to post at all, and less socially engaged overall.

On the flip side, people who viewed less negative content were more likely to post happier updates.

An interesting study, yes, but is manipulating hundreds of thousands of people's moods without their knowledge ethical?

Informed consent

Federal regulations state that studies involving human subjects require informed consent.

Researchers affiliated with Facebook, Cornell University and the University of California-San Francisco wrote in their paper that their research "was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research."

Facebook's data use policy does contain the following: "In addition to helping people see and find things that you do and share, we may use the information we receive about you ... for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

This is how the policy reads now, but Forbes reports that in January 2012, when the experiment was run, Facebook's data use policy made no mention of users' information being used for research.

Facebook added that language in May 2012, four months after the study, when it updated the policy to include the line referencing "research" quoted above.

Additionally, because the study applied no age filter, it could have included participants under the age of 18.

The debate on whether it's ethical to involve Facebook users in a study without their consent is even more complicated because researchers proved their manipulation of news feeds had a negative impact on users' emotional states.

"There are people who can't afford to be made to feel very much worse than they already do," NPR's Linda Holmes wrote. "There's every chance that this experiment … took a depressed person somewhere and made it harder for him or her to get up. There's every chance that somebody went for a quick dose of distraction because of a breakup or a job loss or a death or a simple setback and didn't get it, because it was denied to them on purpose, just to see what would happen."

On Sunday, even Adam Kramer, the Facebook data scientist who led the study, wrote, "In hindsight, the research benefits of the paper may not have justified all of this anxiety."

But will the outrage Facebook faces today result in users deactivating their accounts?

Probably not.

For years Facebook has been criticized for how it uses its 1.3 billion members' data, but many people consider it an indispensable communication tool, and reports show that the social media site's audience is still growing.
