In 2011, Facebook’s strategy for getting users to spend more time on the site was to offer games that were easy to get into and hard to leave; it worked. We all played Candy Crush, Texas Hold’em Poker, and FarmVille. Over the following decade, Facebook grew by acquiring more chat platforms, learning our thoughts and feelings until it had essentially mapped out our brains and hearts. Facebook can now gauge (and, based on the latest reports, influence) how the world feels at any given moment.

As people lost interest in games like Candy Crush, the company had to change strategy; around 2015, it did. Equipped with new tools for mapping our feelings, Facebook could see what is in people’s hearts and target their most primal instinct: what makes them tick (and click). It found that engagement rises around negative topics, claiming that we stay longer on a video clip or article about a negative character than a positive one. Facebook chose to focus on the negative and updated its algorithm to serve us a tailor-made reality that encourages idiocracy and segregation, favoring inflammatory comments that fan the flames, no matter the price, as long as we stay.

It is possible that, without knowing it, Facebook answered Einstein’s oldest question: whether the world is a friendly place. To the scientists who believe it isn’t, Einstein answered: “then we are simply victims to the random toss of the dice and our lives have no real purpose or meaning.” Facebook’s scientists may be right; it may be that the vast majority of the population finds greater fascination in the sight of a car accident than in how a car engine works, which is fine, if you don’t care about purpose or meaning.