Facebook has come under fire after a study published earlier this month revealed that the company had run a psychological experiment on nearly 700,000 of its users two years ago without their knowledge.

For one week in January 2012, Facebook tweaked its News Feed algorithm to adjust the amount of positive or negative language users were exposed to, then observed how they reacted.
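
In practice, this meant selectively withholding emotionally charged posts from users' feeds. Here is a minimal sketch of that kind of filtering, assuming a simple word-list sentiment check; the published paper reports using word-counting software (LIWC), but the word list, function names, and probability below are illustrative, not Facebook's actual code:

```python
import random

# Stand-in word list; the real study used the LIWC word-counting software.
POSITIVE_WORDS = {"happy", "love", "great", "wonderful"}

def is_positive(post_text: str) -> bool:
    """Crude word-match check, similar in spirit to the word-counting
    approach the published study describes."""
    return bool(set(post_text.lower().split()) & POSITIVE_WORDS)

def filter_feed(posts: list[str], omit_probability: float = 0.5) -> list[str]:
    """Return a feed with a fraction of positive posts randomly withheld.
    A hypothetical illustration of the manipulation, not Facebook's code."""
    return [
        post for post in posts
        if not (is_positive(post) and random.random() < omit_probability)
    ]

feed = ["I love this!", "Traffic was awful today.", "What a wonderful day."]
print(filter_feed(feed))  # negative/neutral posts always pass through
```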

The researchers, from Facebook, Cornell University, and the University of California, San Francisco, found that users shown fewer positive posts went on to write fewer positive posts and more negative ones themselves, and that the reverse held for users shown fewer negative posts.

This suggested that emotional states can spread between people without face-to-face interaction, a phenomenon the researchers call "emotional contagion."

Facebook says it did nothing wrong: when you joined Facebook and agreed to that lengthy policy disclosure you didn't read, you gave the company permission to carry out studies like this, via a brief, vague mention that your information might be used for "research."

But many people, including scientists, are questioning the ethics of the experiment, arguing that agreeing to the data use policy is not the same as giving informed consent.