Lanier: Who Owns the Future?

While Evgeny Morozov and Shoshana Zuboff already had a certain standing in Germany, Jaron Lanier was pretty much unknown. Yet he might just become the figurehead of the "we own our data" movement associated with people such as Max Schrems. What Lanier does in his book is introduce the so-called "Siren Servers" and show how we manage to ignore the terms and conditions of online services, and how these providers trick us into not caring.

“We want free online experiences so badly that we are happy to not be paid for information that comes from us now or ever. That sensibility also implies that the more dominant information becomes in our economy, the less most of us will be worth.”

For Lanier, the digital revolution has one significant result: although we cannot eat data and it is not a basic human need, it has taken on a central function. We overemphasize our own taste. We feel a massive urge to communicate in digital space, so taste has assumed the role of a basic human need. Lanier criticizes that we want our free online services so badly that we are glad not to pay for them at all; instead we freely hand over our own information. Information becomes a commercial factor between humans. The implication: the more valuable information becomes in that economic circuit, the less the individual's information is worth: "We love our treats but will eventually discover we are depleting our own value."

Online services essentially trick us into handing over our data for free. They promise to improve their services based on our needs. But that is a surreal promise, Lanier writes. We are promised an open world, transparent as a crystal ball, by the same "true believers" who "encrypt their servers even as they seek to gather the rest of the world's information and find the best way to leverage it".

Beyond that, most online services have huge exit costs that are built into the platform's DNA. Once a platform such as Facebook has reached critical mass, it becomes very difficult to move a conversation to another platform; it is a winner-takes-all market that rewards the most successful player. What started as a choice ends up as a burden.
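To make the winner-takes-all point concrete, here is a minimal Python sketch of my own (not from the book): if each new user joins the platform where "everyone already is", i.e. a platform's pull grows faster than its size, a small early lead snowballs into near-total dominance. The platform names, seed numbers and the exponent are invented for the illustration.

```python
import random

# A minimal sketch (my illustration, not Lanier's): winner-takes-all dynamics.
# Each new user joins a platform with probability proportional to a
# super-linear function of its current user base, i.e. the pull of a big
# platform grows faster than its size ("everyone I know is already there").
# Platform names and numbers are invented for the example.

def simulate(seed_users, new_users=10_000, exponent=1.5, rng_seed=1):
    rng = random.Random(rng_seed)
    users = dict(seed_users)
    for _ in range(new_users):
        weights = {name: count ** exponent for name, count in users.items()}
        total = sum(weights.values())
        pick = rng.uniform(0, total)
        cumulative = 0.0
        for name, weight in weights.items():
            cumulative += weight
            if pick <= cumulative:
                users[name] += 1
                break
    return users

if __name__ == "__main__":
    final = simulate({"platform_a": 12, "platform_b": 10, "platform_c": 10})
    for name, count in sorted(final.items(), key=lambda kv: -kv[1]):
        print(f"{name}: {count} users")
```

Which platform wins depends on chance and the random seed, but some platform almost always ends up with the overwhelming majority of new users, and that is the point.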

That is the moment when we are most vulnerable. All my data is locked in on the company's servers. If I want to switch services, I cannot transfer my data easily; it belongs to Facebook or Instagram. If I switch anyway, it takes hours of work to copy the data, and it still remains the property of the old platform. Most platforms with a creative bent ask their users to express themselves, but only by the rules of the platform itself. Exit costs and the soft coercion of lock-in lead to a new kind of digital immobility. Lanier finds it ironic that, psychologically, we delivered ourselves into this dependency.

And not only that: since data is becoming one of the major currencies of the 21st century, digital companies crave it. For his book, Lanier coins the term "Siren Servers", a nod to the sirens of Greek mythology. These servers are company-owned networks that are always on the lookout for relevant data. They are "characterized by narcissism, hyperamplified risk-aversion, and extreme information asymmetry". The decisions these servers make are based on the decisions of real humans, so once more risk is socialized. If something goes wrong, the users carry the collectivized cost: "This goes beyond the traditional idea of cost externalization, to automated, unexamined risk externalization."

Lanier warns that algorithms can become so successful that they change their environment rather than merely analyze it: "A successful Siren Server no longer acts only as player within a larger system. Instead it becomes a central planner."

How does this work? Humans tend to smooth situations over. They may simply act on mechanical recommendations; take Google Now or dating platforms as examples. If the technology in the background isn't perfect, humans improvise. Which means we can never tell whether an algorithm really works, because human behaviour fills the gap, an argument Joseph Weizenbaum made forty years ago. "People have repeatedly proven adaptable enough to lower standards in order to make software seem smart", says Lanier. In fact, we constantly follow little "nudges". We are always ready to submit to small changes, whereas an algorithm is a concrete building with zero ability to change. We forget, give in and get creative; a machine merely saves state.

In other words: society changes through risk and turbulence: "We will not find the highest peaks if we organize markets to radiate risk and become deterministic accumulators of power around a small number of dominant computer nodes. Too little is learned through that process." Or, put another way: "Ultralight, friction-free moments in life are wonderful, but not as figure, only as ground." Making mistakes means learning; if a computer program is built to run perfectly, it simply cannot learn anything.
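Lanier's "highest peaks" image maps neatly onto a textbook optimization picture. The following Python sketch is my own illustration, not code from the book: a purely deterministic hill climber settles on the nearest local peak, while a search that tolerates "mistakes" (random restarts) learns more about the landscape and tends to find higher ground. The landscape function and all parameters are invented for the example.

```python
import math
import random

# My illustration of the "highest peaks" metaphor, not code from the book.

def landscape(x):
    # A bumpy curve with several local peaks; the global peak is near x = 8.8.
    return math.sin(x) + 0.6 * math.sin(3 * x) + 0.1 * x

def hill_climb(start, step=0.01, bounds=(0.0, 10.0)):
    # Deterministic greedy ascent: move to the better neighbour until stuck.
    x = start
    while True:
        left, right = max(bounds[0], x - step), min(bounds[1], x + step)
        best = max((left, x, right), key=landscape)
        if best == x:
            return x, landscape(x)
        x = best

if __name__ == "__main__":
    # Deterministic: always the same start, always the same (local) peak.
    print("greedy from x=1.0:", hill_climb(1.0))

    # Exploratory: accept many "wrong" starting points, keep the best result.
    rng = random.Random(0)
    tries = [hill_climb(rng.uniform(0.0, 10.0)) for _ in range(20)]
    print("best of 20 random restarts:", max(tries, key=lambda t: t[1]))
```

The greedy run always ends on the same modest peak; the exploratory run wastes effort on bad starting points, but that wasted effort is exactly where the learning happens.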

This simply does not apply to computers. For an algorithm not to be thrown away, it needs to offer at least a little analytical skill, or at least be convincing enough that we adapt to its recommendations. But the main point remains the same: an algorithm stays an algorithm. It cannot leave its own bubble: "Just as networked services that choose music for you don't have real taste, a cloud-computing engine that effectively chooses your politicians doesn't have political wisdom."

And since this is a recurring theme in the current literature, Lanier also addresses the fallacy of risk management through data. The financial crisis can be read as part of this fallacy; information technology makes promises that we humans cannot keep: "Starting in the 1980s, but really blossoming in the 1990s, finance got networked, and schemes were for the first time able to exceed the predigital limitations of human deception." We are "chasing a dynamic, ever-better-but-never-perfect statistic result", which "is the very heart of modern cloud computing."
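A small Python sketch of my own (not from the book, all numbers invented) makes the "never-perfect" point tangible: a risk model fitted to historical returns reports a comfortingly tiny probability for a large one-day loss, while the fat-tailed process that actually generated the data produces such losses far more often.

```python
import random
import statistics

# A toy sketch of the "ever-better-but-never-perfect" point (my illustration,
# not code from the book): a risk model calibrated on historical data, here a
# fitted normal distribution, can look accurate on average yet badly
# underestimate rare extreme losses when reality has fatter tails.
# The return process, sample sizes and thresholds are invented for the example.

def fat_tailed_returns(n, rng):
    # Mostly small daily moves, with an occasional large shock mixed in.
    return [rng.gauss(0, 0.01) if rng.random() > 0.01 else rng.gauss(0, 0.08)
            for _ in range(n)]

if __name__ == "__main__":
    rng = random.Random(7)
    history = fat_tailed_returns(250 * 20, rng)  # roughly 20 years of daily data

    # "Calibrate" the model: fit a normal distribution to the history.
    model = statistics.NormalDist(statistics.mean(history),
                                  statistics.stdev(history))

    threshold = -0.05  # a 5% one-day loss
    print(f"model's estimate of a -5% day: {model.cdf(threshold):.5%}")

    # Compare with the frequency actually produced by the fat-tailed process.
    future = fat_tailed_returns(250 * 20, rng)
    observed = sum(r <= threshold for r in future) / len(future)
    print(f"observed frequency of -5% days: {observed:.5%}")
```

The fitted model is not useless; it is simply most confident exactly where it is most blind, which is the flavour of fallacy the review describes.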

This is, in fact, what Friedrich August von Hayek warned about decades ago. The great success of economics was to make us believe that we can make predictions; its tools exist to bridge that gap. Hayek stressed that he would always prefer an admitted lack of knowledge, even if it makes it harder to decide or to predict, to the fallacy of perfect knowledge, which is bound to be flawed. And that, again, contradicts the vision of those "solutionists" (to use Morozov's term) who want to make the world a better place through the power and control of prediction.

Summary: All in all, "Who Owns the Future?" is a rather negative and not very specific take on the protagonists of the digital economy. Many consider it too pessimistic and too philosophical. On the other hand, writers such as Nicholas Carr or Evgeny Morozov have their points, and they voice concerns at the early stages of new digital trends. There is the strange feeling of acknowledging unhealthy changes in the terms and conditions of platforms, combined with an unwillingness to resist and opt out. In Lanier's book this point is stressed most of all, and in fact he takes aim at the users. That is what makes the book good and worth reading.

Extra #1: Jaron Lanier offers a very good hint for all secret service agents and platform providers:

"Don't play favorites; don't have taste. You are to be a neutral facilitator, the connector, the hub, but never an agent who could be blamed for a decision. Reduce the number of decisions that can be pinned on you to an absolute minimum."

"Who Owns the Future?" by Jaron Lanier. Published 2013 by Simon & Schuster. $17.00.

http://books.simonandschuster.com/Who-Owns-the-Future/Jaron-Lanier/9781451654974 

 

 

 
