Facebook’s Social Research Experiment

Facebook are back in the news again, this time for conducting research without the consent of their users. Although perhaps that is not strictly true: users may well have signed those rights away without realizing it.

All Facebook did was to “deprioritiz[e] a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). Nobody’s posts were “hidden,” they just didn’t show up on some loads of Feed. Those posts were always visible on friends’ timelines, and could have shown up on subsequent News Feed loads”. That is the explanation offered by the author of the report on the experiment. Read the full text here.

Simply speaking, they wanted to adjust the type of information a user was exposed to and see whether it affected their mood. If a user receives lots of positive news, what happens to them? What will they post about?
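To make the mechanics of the quoted description a little more concrete, here is a minimal, purely illustrative sketch in Python of what such a design might look like: roughly 1 in 2,500 users placed into a test group, and posts containing emotional words occasionally left out of a single News Feed load rather than hidden outright. The word list, function names and drop probability are hypothetical, not taken from the report.

```python
import hashlib
import random

# Purely illustrative sketch -- not Facebook's actual code. The word list, names
# and drop probability are hypothetical; only the 1-in-2500 sampling and the
# "deprioritized, not hidden" behaviour come from the quoted description.
EMOTIONAL_WORDS = {"happy", "great", "love", "sad", "angry", "terrible"}


def in_experiment(user_id: str, rate: int = 2500) -> bool:
    """Deterministically assign roughly 1 in `rate` users to the test group."""
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % rate == 0


def build_feed(user_id: str, candidate_posts: list[str], drop_prob: float = 0.1) -> list[str]:
    """Assemble one News Feed load, deprioritizing emotional posts for test users.

    Skipped posts are not deleted: they remain on friends' timelines and may
    simply appear on a later load of the feed.
    """
    if not in_experiment(user_id):
        return candidate_posts
    feed = []
    for post in candidate_posts:
        has_emotion = bool(set(post.lower().split()) & EMOTIONAL_WORDS)
        if has_emotion and random.random() < drop_prob:
            continue  # deprioritized on this load only
        feed.append(post)
    return feed
```

A researcher could then compare the emotional content of what the test users went on to post against a control group, which is essentially what the study reports doing.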

Some studies have suggested that heavy Facebook use tends to leave people feeling bad about themselves. The logic is simple: all my friends post about how great their lives are, the good side only, we might say. I, whose life has both ups and downs, am not exposed to their downs, so I feel inadequate.

This sounds reasonable. I am not a Facebook user, but the odd messages I get are rarely about arguing with partners, tax problems, getting locked out of the house, flat tyres, missed meetings or parking tickets. I presume Facebook users do not suffer from these issues; they always seem to be smiling.

So, to test the hypothesis, a little manipulation of the News Feed: more positive or more negative words, and then a look at how the resulting posts are affected. The theory above does not seem to hold water as a statistic, however, although bearing in mind the methodology (and who conducted it) I take the claims with a pinch of salt. More positive words tend to lead to more positive posts in response.

Hardly rocket science we might say.

I have a degree in sociology and an MA in Applied Social Research, and I work in the field. Conducting experiments of this type is not allowed in professional circles: it is considered unethical, there is no informed consent, rights are infringed upon, and the list goes on. What if somebody had done something serious during the experiment?

Of course “The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product”.

If readers are interested in looking at a few other fun experiments that might be considered ethically dubious, I can offer a couple. Check out the Stanley Milgram experiment, in which people administered (fake) electric shocks to other people who got the answers to their questions wrong. This was Yale University, not some fringe department of psychology. The researchers were investigating reactions to authority, and the results are very interesting, but you couldn’t do it today.

Or how about the so-called Monster Study. The Monster Study was a stuttering experiment on 22 orphan children in Davenport, Iowa, conducted in 1939 by Wendell Johnson at the University of Iowa. After placing the children in control and experimental groups, research assistant Mary Tudor gave positive speech therapy to half of the children, praising the fluency of their speech, and negative speech therapy to the other half, belittling the children for every speech imperfection and telling them they were stutterers. Many of the normal-speaking orphan children who received negative therapy in the experiment suffered negative psychological effects, and some retained speech problems for the rest of their lives. The University of Iowa publicly apologized for the Monster Study in 2001.

Terrible as these experiments may sound, they were conducted in the name of science, and their results may have proved useful. Facebook (along with 23andMe and other commercial entities) are behaving the way they are because they want to make more money; their interest lies solely there (even if they dress it up as a better user experience). And in the case of Facebook, they have access to 1.3 billion users and a mandate to do whatever they like with them.

One thought on “Facebook’s Social Research Experiment”

  1. Christopher Roberts

    Having studied Psychology at A Level, I did question how ethical the methods used were. Facebook can do an awful lot without your say so, because you agree to their terms when you use their site. There are those fighting against lengthy Ts & Cs but I doubt they will have that much luck. Lawyers just love their lengthy legal documents – after all, it keeps them in the job.
