Search, Social Networks & the Robots that Ruined Them

by royalarse13

Many social media experts have lamented the recent deterioration in the quality of search engines and social networks.  The criticism is wide-ranging, but one area is so pervasive, and so influential on the content we consume, that it needs to be explored in detail: the impact of algorithms.

Understanding how algorithmic filtering of search skews your information consumption is vital to understanding the Information Revolution.  Is it a good thing that two different people searching identical topics at the same time will get dramatically different results?  Eli Pariser discusses what he calls invisible algorithmic editing in his TED talk:

To paraphrase a few key points:

  • There is a shift in how information flows online, and it is invisible.
  • Facebook noticed that I was clicking more links from a certain type of friend.  Without consulting me about it, the contrary links were edited out: they disappeared.
  • The phenomenon is observed even when logged out: search integrates 57 different variables, many of which are independent of the search keywords/terms.
  • This moves us very quickly toward an Internet that shows us what it thinks we want to see, at the expense of what we need to see.
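To make the idea concrete, here is a toy sketch of that kind of silent pruning. This is my own illustration in Python; the field names, click threshold, and logic are invented for the example and are not any platform's real ranking code:

```python
# Toy model of "invisible algorithmic editing": items from authors you
# click often are kept; everything else quietly disappears. All names
# and the threshold are hypothetical.

def filter_feed(items, click_counts, threshold=3):
    """Keep only items whose author you have clicked at least
    `threshold` times; the rest are silently dropped."""
    return [item for item in items
            if click_counts.get(item["author"], 0) >= threshold]

feed = [
    {"author": "close_friend", "text": "link A"},
    {"author": "old_classmate", "text": "link B"},
]
clicks = {"close_friend": 12, "old_classmate": 1}

visible = filter_feed(feed, clicks)
# Only close_friend's item survives; old_classmate's link vanishes
# without the user ever being told an edit happened.
```

The point of the sketch is the second list: nothing in the output signals that anything was removed, which is exactly what makes the editing invisible.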

This kind of automatic, invisible pruning has real benefits for users: it saves time, it exploits the high likelihood that similar people like similar topics, and using activity as a measure of engagement can lead to better search results and more relevant content.  But at what cost?

These personalized algorithmic filters can focus too much on what we click on first, which skews results toward impulsive clicks and comments, and away from substance. Eli mentioned food as an analogy, which I’ll expand on:  we impulsively crave sweets and fatty foods, at the peril of our physical health, while ignoring the more nutritious substance of vegetables and fruits.  As with our health, changing the information we consume has broad consequences for our minds, culture and society, consequences that aren’t yet fully understood in the digital world.

My problem with this kind of invisible algorithmic editing is that those actively seeking information by searching keywords and topics should not have their results shaped by any additional personalized discretion or specificity, unless the platform expressly reveals it (and provides an easily accessible option to disable it).  Facebook and Google curate the information we consume behind the scenes, unbeknownst to most users.

All media (online, print, TV, etc.) is centrally controlled, and the message is carefully manicured to suit each outlet's unique interests. Perhaps it's naive to assume the internet would be any different, but Google and Facebook filtering our search results at best raises the question of whose interests such curation serves, and at worst violates the ‘don’t be evil’ credo.

What is Desired vs. What is Delivered?

Using Twitter as an example, selecting whom to follow should be the prime determinant of the information you see. Whom I click on, whom I reply to, or who retweets me should have zero impact on what I see. I want raw, chronological aggregation, not something a robot predicts I will like based on clicks and activity.
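The difference between the two orderings is easy to show side by side. This is a hypothetical comparison in Python; the field names are assumptions for illustration, not Twitter's actual API:

```python
# Raw chronological aggregation vs. engagement-based ranking.
# The tweets and their fields are invented for the example.

tweets = [
    {"text": "breaking news", "timestamp": 300, "engagement": 2},
    {"text": "viral joke",    "timestamp": 100, "engagement": 95},
    {"text": "niche essay",   "timestamp": 200, "engagement": 5},
]

# What I want: newest first, no robot in the loop.
chronological = sorted(tweets, key=lambda t: t["timestamp"], reverse=True)

# What the algorithm delivers: most-clicked first.
ranked = sorted(tweets, key=lambda t: t["engagement"], reverse=True)

print([t["text"] for t in chronological])
print([t["text"] for t in ranked])
```

Same inputs, two very different feeds: the chronological version surfaces the newest item first, while the ranked version pushes whatever drew the most impulsive clicks to the top.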

If the user has a narrow range of desired content, then it's reasonable to assume the algorithmic outputs help achieve that specific goal.  However, the ‘randomness’ of Facebook and Twitter is what makes each so useful as a tool for discovering new things (Facebook about people, Twitter about news and facts).  The participation of robots must therefore exclude many cool people with exciting information and lives, lost because their content is deemed less relevant.  Once an individual is off your grid, they risk never being reintegrated unless you actively seek them out.  Folks with hundreds or thousands of friends/followers risk losing these meaningful connections through no (active) fault of their own.
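That "off your grid forever" risk is a feedback loop, and a toy model makes the mechanism plain. This is my own sketch, not any platform's real scoring; the decay rate, boost, and cutoff are invented numbers:

```python
# Feedback loop sketch: an account that is filtered out gets no clicks,
# so its relevance score only decays, so it is filtered out again.
# All constants here are hypothetical.

def step(score, shown, clicked, decay=0.9, boost=1.0):
    """One ranking cycle: every score decays; only accounts that were
    both shown and clicked earn a boost back."""
    score *= decay
    if shown and clicked:
        score += boost
    return score

CUTOFF = 1.0
score = 0.5            # already below the visibility cutoff
for _ in range(5):
    shown = score >= CUTOFF   # never true, so never shown...
    score = step(score, shown, clicked=False)
# ...and with no chance to be clicked, the score only shrinks:
# the account stays off-grid unless you seek it out yourself.
```

Nothing malicious happens in any single step; the exclusion emerges from the loop itself, which is why the user never notices the moment a connection was lost.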

What a shame that so much great content is passively lost to invisible algorithmic editing.  A platform must manage these automated processes very carefully to ensure the overabundance of information online is assimilated in a fashion that pleases the user. This is the critical challenge of the Information Age.  Facebook, Google, Twitter, StockTwits, LinkedIn and many others are surely taking note and tweaking their algos in perpetuity.  I wish them luck, as the trend in social networking suggests that the curation of data and information, and not the quality of the users within the communities themselves, will be the decisive factor in success or failure.