On Klout’s home page, the company states, “We believe everyone has influence. Our Goal is to help you understand and leverage that influence.”
Influence as defined by Klout relates solely to your online life, which brings up questions as to what constitutes real influence and how influence is or should be defined.
The service scores your online influence on a scale of 1 to 100 and, as of late April, Justin Bieber had a perfect score of 100 with his 18,000,000 Twitter followers. According to the Wired magazine graphic below, he received over 30 tweets per second on his birthday from his obsessive and adoring fans. Klout measures True Reach, defined as the number of people you reach. It also measures Amplification, defined as how often people respond to or spread your content, whether a message, a status update, or a blog post. Lastly, it measures Network, quantified by how often other Kloutish, influential people respond to and share your messages.
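Klout keeps its actual formula proprietary, but the three components above suggest the general shape of such a score. Here is a minimal sketch, assuming hypothetical weights and a log-scaled reach term; the function name, the weights, and the 20,000,000 reach ceiling are my own illustrative choices, not Klout's.

```python
import math

def influence_score(true_reach, amplification, network, max_reach=20_000_000):
    """Combine the three components into a toy 1-100 influence score.

    true_reach:    number of people reached (e.g. follower count)
    amplification: fraction of messages that get responses or shares (0-1)
    network:       fraction of responders who are themselves influential (0-1)

    The weights (0.5 / 0.3 / 0.2) are hypothetical, chosen only to show
    how components might be blended into a single number.
    """
    # Log-scale reach so 18 million followers doesn't dwarf everything else.
    reach_part = math.log10(max(true_reach, 1)) / math.log10(max_reach)
    raw = 0.5 * min(reach_part, 1.0) + 0.3 * amplification + 0.2 * network
    return max(1, round(100 * raw))

# A mega-celebrity with huge reach and engagement lands near 100;
# a modest account with little amplification lands far lower.
celebrity = influence_score(18_000_000, amplification=0.95, network=0.9)
modest = influence_score(500, amplification=0.05, network=0.02)
```

The log scaling is the interesting design choice: without it, raw follower counts would swamp the engagement terms, and the score would just be a popularity contest rather than a blend of reach and responsiveness.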
This is fascinating stuff. But it’s also somewhat disconcerting, the way modern math and its algorithms are creeping into our daily lives.
Take the case of Sam Fiorella, a veteran marketing consultant profiled in the Wired article “What Your Klout Score Really Means” by Seth Stevenson. An interviewer asked Fiorella for his Klout score (his was 34) and, according to Fiorella, the conversation ended shortly thereafter. The company hired another candidate who had a score of 67.
In the context of my profession, teaching, is it too far-fetched to speculate that, in the future, administrators could plug in numbers such as discipline referrals written, time spent on the online grade book, e-mails written during school hours, number of visits to the men’s room, etc., to give teachers or teacher candidates some sort of score? Could I be hired or lose my job twenty years from now based on some lifeless formula?! I’m being a little silly, of course, but we could just be scratching the surface of how far Klout and other algorithm-driven services could affect our daily lives. Stevenson continues:
Klout is starting to infiltrate more and more of our everyday transactions. In February, the enterprise-software giant Salesforce.com introduced a service that lets companies monitor the Klout scores of customers who tweet compliments and complaints; those with the highest scores will presumably get swifter, friendlier attention from customer service reps. In March, luxury shopping site Gilt Groupe began offering discounts proportional to a customer’s Klout score.
I’m OK with the fact that I might not ever benefit from the perks mentioned above. And I do not mind that Amazon and Netflix use algorithms to make predictions about what books and movies I might enjoy. But where, if we have a choice at all, should the line be drawn? If a corporation’s main objective is to make money, there is little reason to think companies won’t continue to mine and collect data, plugging it into formulas, attempting to predict and influence our behavior. Scary stuff. In a 2010 article for The Globe and Mail, David Woodside writes:
The algorithmic expansion may be great in scale, but is it also great for us? Clay Shirky, who has written extensively about the impact of technology, says from his office at New York University that to a large degree authority is shifting away from “experts” to sources of information that no one is really in charge of. “The workings are hidden and, in some cases, not understandable.”
That means that the keepers of the algorithms have become figures of enormous importance. “In a society with an increasing amount of ‘algorithmic authority,’ ” Mr. Shirky argues, “understanding who manages the algorithm that society relies on, and how those algorithms are shaped and used, becomes a key question of governance – just as, in a previous age, understanding the limits of military or police power became a key part of democratic governments.”
The key issue, he says, is not whether algorithms have too much power, but whether society is able to keep them in check. “You would start worrying about a tool when it became powerful enough that the owners of that algorithm could start to shape outcomes to their own needs.”