Facebook is under serious fire for experimenting on hundreds of thousands of users back in 2012. The company wanted to determine whether manipulating the content in users’ news feeds could alter their emotional state, or prompt them to post either positive or negative content of their own.
For one week, the scientists assigned to the experiment enabled an algorithm that automatically omitted content containing positive or negative emotions from the central news feeds of 689,003 users. The bottom line is that Facebook toyed with its users’ emotions and used them as guinea pigs. What many people already fear about Facebook’s technology is becoming a reality.
Its Data Science Team is responsible for turning information created by 800 million people into usable scientific research. The scientist who led the “experimental study,” Adam Kramer, said he had second thoughts, posting on his Facebook page: “In hindsight, the research benefits of the paper may not have justified all of this anxiety.” Many Facebook users don’t see this breach of trust as a first-time occurrence.
This experiment only highlights the delicate line between users’ privacy and the corporation’s ambitions. Companies like Facebook, Google Inc., and Twitter Inc. rely mostly on data-driven advertising dollars, so naturally they need to collect and store information. But not all of that information can be or is used for advertising, at least not yet. What Facebook does with the extra personal information is largely unknown to the public, which doesn’t exactly help with user trust. Facebook announced that the study was conducted anonymously, so that the researchers could not learn the names of the subjects.