Facebook researchers recently apologized for intentionally manipulating what people saw in their newsfeeds as they conducted a study on user interaction. The researchers filtered the feeds to contain fewer positive emotional words. They wanted to see if that would lead users to post their own content with fewer positive words.
Interactive Computing Professor Amy Bruckman says the Facebook study raises hard questions. Bruckman wonders whether, if Facebook had run only the happy case (filtering people's newsfeeds to contain fewer negative words), people would still have concerns about the study and privacy. She says it's not that users lost control in this one instance – it's that they never had it in the first place. Facebook's newsfeed is curated by an algorithm that is not transparent.