After the study was published in the Proceedings of the National Academy of Sciences and the experiment became public, many people found it controversial. During the experiment, Facebook data scientists tweaked the News Feed algorithm for roughly 0.04 percent of Facebook users, or 689,003 people, influencing whether those users saw more negative or more positive posts in their feeds. Describing the impact, Adam D. I. Kramer, one of the study’s authors, said in a statement: “At the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it — the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.” Many people object to having their feelings manipulated, and that objection is legitimate; it points to an ethical dimension that should be considered in connection with such a study. On the other hand, when signing up for a Facebook account, users grant the network’s administrators permission to use their personal information for “internal operations,” including “data analysis” and “testing.”
Only after the study was conducted was the word “research” added to that policy. Too late? Possibly. But the argument that all users had given consent remains, and on that basis the study was conducted legally.