Facebook’s study on emotional contagion may not have broken laws, but it has exposed the unfettered power of big data companies grounded in opaque user policies. For one week in 2012, researchers from Facebook, Cornell and the University of California skewed the emotional content of almost 700,000 news feeds to test how users would react. They found that people would write slightly more negative posts when exposed to negative feeds and vice versa. News of the study spread on the Internet on Monday, angering users who thought Facebook had treated them as “lab rats” and sparking European legal probes. Facebook executive Sheryl Sandberg eventually apologized for “poorly” communicating the study, but Facebook stood firm. “When someone signs up for Facebook, we’ve always asked permission to use their information,” the company said in a statement. “To suggest we conducted any corporate research without permission is complete fiction.”
Facebook is half right. Users do agree to terms and conditions when they join the social network, and in-house experiments, known as “A/B testing,” are routine: researchers observe how users react to small changes in format and content, such as a bigger icon or a different shade of blue, with the aim of improving the user experience on the site.