Algorithms have operated behind the scenes of our best-loved social platforms since their earliest days, but their side effects have only recently come under scrutiny, in light of Donald Trump’s rapid ascent to president-elect and the UK’s surprising Brexit vote last summer.
These cultural moments, taking place on either side of the Atlantic, gained considerable momentum on social media, and in their wake there has been growing recognition of the role Facebook et al. played in bringing about their outcomes.
German Chancellor Angela Merkel recently weighed in on this debate, suggesting that Facebook and Google be made to disclose their secret algorithms. Users, she argued, are entitled to know how the information they see is selected for them, and it’s very likely this issue will come under the regulatory microscope.
Merkel and other critics of the algorithm trend are focusing on what in the tech world is called “filter bubbles” — personal online networks of content that continuously reinforce our individual biases and worldviews.
“Filter bubble” describes the byproduct of social algorithms: These algorithms attempt to align with our natural tendencies by selecting online content that is personally relevant to us, based on our previous clicks, shares and views. And they have become increasingly effective at that task.
The result is that our individual online experiences, from Google searches to Facebook feeds, are more filtered than ever. And content deemed too different from our audience profile is excluded — creating the “bubble effect” that prevents new points of view from challenging us. My view? Marketers should take a pin to that bubble.
Let me explain. This particular element of our online lives has fascinated us for a while, so we delved into the issue at our U.K.-based Unfolded Talks session, drawing on evolutionary science, psychology and various perspectives on well-being to understand the consequences of the filter bubble.
We concluded that social algorithms have not only magnified and automated our natural personal biases, but have also made these bubbles completely invisible to us. That process should be a real concern for businesses — particularly marketers.
Freshly filtered news
What, where and how we consume content has never been more important, which is why online platforms have led the way in tailoring that information to users’ needs, based on their past clicks and views.
For social networks, this means suggesting news and connections that align with each user’s typical online activity. For search engines, it means that each user will find different search results than the next person for, say, “Tunisia,” “climate change” or “best sports car.”
In the short term, this scenario can be incredibly useful in an information-rich age. We are bombarded with so much information that surely it makes sense to filter out what isn’t relevant, right?
Maybe not, because the problems become apparent when we take a long-term perspective. The events of this year are a sobering example: They show that a long-term side effect of these bubbles has been the creation of extreme “us-versus-them” worldviews, demonstrated during the recent campaigns leading up to the U.S. election and Britain’s EU referendum.
Parroting isn’t personal, so challenge your audience.
Personalization is the buzzword behind these algorithms, as those in the marketing industry well know. Most ad-funded networks, from news outlets to social media, trade on the depth of their audience insight, and it has become conventional wisdom that “relevance” means “personal,” and that “personal” is the goal advertisers should aim for.