The Reading Space

Posts tagged Information

Why we must remember to delete – and forget – in the digital age

Human knowledge is based on memory. But does the digital age force us to remember too much? Viktor Mayer-Schönberger argues that we must delete and let go.

"In his book Delete: The Virtue of Forgetting in the Digital Age, Victor Mayer-Schönberger, professor of internet governance and regulation at the University of Oxford’s Internet Institute, writes: “Time is quite simply a very difficult dimension of human memory for humans to master.” (…)

"Since the early days of humankind," he writes, "we have tried to remember, to preserve our knowledge, to hold on to our memories and we have devised numerous devices and mechanisms to aid us. Yet through millennia, forgetting has remained just a bit easier and cheaper than remembering." (…)

The overabundance of cheap storage on hard disks means that it is no longer economical to even decide whether to remember or forget. “Forgetting – the three seconds it takes to choose – has become too expensive for people to use,” he writes. If Mayer-Schönberger’s stepdad had taken digital photographs, his stepson wouldn’t have had to bother thinking about which to delete. (…)

The dream of overcoming human memory’s fallibility was expressed by HG Wells when, in the 1930s, he wrote of a "world brain" through which “the whole human memory can be … made accessible to every individual”. Today, perhaps we have that world brain, and it is called Google. Mayer-Schönberger sounds an Orwellian note about this: "Quite literally, Google knows more about us than we can remember ourselves." (…)

That inability to forget, Mayer-Schönberger argues, limits one’s decision-making and one’s ability to form close links with people who remember less. “The effect may be stronger when caused by more comprehensive and easily accessible external digital memory. Too perfect a recall, even when it is benignly intended to aid our decision-making, may prompt us to become caught up in our memories, unable to leave our past behind.”

And not being able to leave our past behind makes humans, he argues, more unforgiving in the digital age than ever before. In 2006, Vancouver-based psychotherapist Andrew Feldmar was crossing the Canada-US border to pick up a friend from Seattle airport – something he’d done many times before. This time, though, the border guard searched online and found that in 2001 Feldmar had written in an academic journal that he had taken LSD in the 1960s. As a result, Feldmar was barred entry to the US. “This case shows that because of digital technology, society’s ability to forget has become suspended, replaced by perfect memory.”

In the late 18th century, Jeremy Bentham envisaged a prison called a panopticon in which guards could watch prisoners without them knowing whether they were being watched. In the 20th century, Michel Foucault argued that the model of the panopticon was used more abstractly to exercise control over society. In the 21st century, Mayer-Schönberger argues that the panopticon now extends across time and cyberspace, making us act as if we are watched even if we are not. He worries that this “perfect memory” will make us self-censor. “That’s becoming standard. In the US most colleges have a mandatory class on how to clean up your Facebook account.” (…)

“In my home country of Austria, the DNA database keeps samples of everybody who left traces at a crime scene. It even means there are two classes of people – suspects and non-suspects – and the class of suspects includes those who have been mugged or raped and have their DNA samples on the database.” (…)

Mayer-Schönberger writes in the new edition of Delete: "Digital memory, in reminding us of who she was more than 10 years ago, denied her the chance to evolve and change." This story, he argues, typifies how digital memory denies us the capacity to forgive.

Once lost, society’s ability to forget is difficult to reconstruct. Germany’s lawmakers tried prohibiting HR departments from Googling job applicants – thereby compelling institutional forgetting. “It was impossible to operationalise. They couldn’t stop HR department workers Googling at home, for instance.” (…)

"Nine out of 10 Americans want the right to force websites and advertising companies to delete all stored information about them. And for US digital natives [those born after the introduction of digital technology] the figure is 84%." (…)

What Facebook does to human identity. “In the analogue era, it was relatively simple to keep your lives separate. If my main leisure pursuits were being in the golf club and in an S&M circle, it was essential that no one at the former knew about the latter. Facebook, by not allowing you to have two accounts, problematises that separation. The response is that individuals employ strategies to hack the system – almost all my colleagues have two Facebook accounts, to keep different parts of their lives boxed in.” (…)

He suggests that users, when saving a document they have created, would have to select an expiration date in addition to the document’s name and location on their hard disk. “Expiration dates are about asking humans to reflect – if only for a few moments – about how long the information they want to store may remain valuable.”
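Mayer-Schönberger’s proposal is concrete enough to sketch in code. A minimal illustration follows – the field names and helper functions are hypothetical, not from the book: each saved document carries a user-chosen expiration date, and a periodic sweep forgets whatever has lapsed.

```python
import time

def save_with_expiry(store, name, content, days_valid):
    """Save a document together with a user-chosen expiration date."""
    store[name] = {
        "content": content,
        "expires_at": time.time() + days_valid * 86400,  # days -> seconds
    }

def sweep_expired(store, now=None):
    """Forget: remove every document whose expiration date has passed."""
    now = time.time() if now is None else now
    expired = [name for name, doc in store.items() if doc["expires_at"] <= now]
    for name in expired:
        del store[name]
    return expired

store = {}
save_with_expiry(store, "holiday-photo.jpg", "…", days_valid=30)
save_with_expiry(store, "tax-return.pdf", "…", days_valid=3650)

# Jump 31 days ahead: only the photo has lapsed and is forgotten.
gone = sweep_expired(store, now=time.time() + 31 * 86400)
```

The point of the design is exactly the one quoted above: the valuable part is not the deletion mechanism but the moment of reflection the date field forces on the user.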

This chimes with Harvard cyberlaw expert Jonathan Zittrain’s idea that we should have a right to declare reputation bankruptcy – ie to have certain aspects of one’s digital past erased from the digital memory. (…)

Mayer-Schönberger is now researching what he calls “institutions of remembering”. “We set up institutions of memory to help us remember important things – such as the Holocaust, for example. But with Google and Flickr and other sites offering seemingly comprehensive memory, we might be prompted to devalue these established institutions of memory. They risk being drowned out by stuff online. My fear is that the digital age, while benefiting us enormously, impoverishes us too.”

Stuart Jeffries, Why we must remember to delete – and forget – in the digital age, The Guardian, 30 June 2011

See also:

Why Privacy Matters Even if You Have ‘Nothing to Hide’
Viktor Mayer-Schönberger: Delete: The Virtue of Forgetting (video lecture)
☞ Jeffrey Rosen, The Web Means the End of Forgetting, NYT, July 21, 2010

Filed under Privacy Information Memory

Of Data Scientists, Big Data, the City and Dancers

In order to grasp and analyze rhythms, it is necessary to get outside them, but not completely: be it through illness or a technique. A certain exteriority enables the analytic intellect to function. However, to grasp a rhythm it is necessary to have been grasped by it; one must let oneself go, give oneself over, abandon oneself to its duration. Like in music and the learning of a language (in which one only really understands the meanings and connections when one comes to produce them, which is to say, to produce spoken rhythms).

“In order to grasp this fleeting object, which is not exactly an object, it is therefore necessary to situate oneself simultaneously inside and outside.”

- Henri Lefebvre, French sociologist, intellectual and philosopher, Rhythmanalysis: Space, Time and Everyday Life.

Lefebvre in his 1992 collection of essays talks about the rhythm of cities. To me this is the flow of the people, the morning coffee routine, the lunchtime decisions, the evening meandering, the beat of the bar on a Friday night, the sweat dripping off the ceiling of a tiny club, the sun coming up late on a Saturday night. Strangers exchanging stolen kisses under umbrellas, the race across the road in a gap in the traffic, the sudden surprising green park round the corner, the hidden entrance to the underground stations.

How people shape the city, the pulse as agents gather together to form a temporary autonomous zone before collapsing back to being shaped by the city. To be not just in the city, but of the city. (…)

Big Data

You can’t just turn your Data Scientist eye onto something and say “Oh we’ll throw this into MapReduce, it’ll be awesome”, you need to have been part of that data, to have lived it. (…)
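For readers who have never seen the tool being teased here, the “throw this into MapReduce” reflex looks roughly like this – the canonical word-count example, sketched in plain Python rather than on a real Hadoop cluster:

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield (word, 1)

def reduce_phase(pairs):
    """Reduce: sum the counts emitted for each word."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["the city hums", "the city sleeps"]
counts = reduce_phase(map_phase(docs))
# counts["the"] == 2, counts["city"] == 2
```

This mechanical step is the easy part Catt is dismissing; knowing which words, which streams, and which rhythms are worth counting is what he means by having lived in the data.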

To deal with big data you have to have been in it, not as a Scientist but as a Dancer. (…)

But that isn’t enough, many people are already immersed in the data, here our journalists know all this stuff inside out. Getting carried away by the rhythms is as easy as getting in and letting yourself go. According to Lefebvre you then have to get back out again. (…)

And that’s the trick…

“in order to grasp and analyze rhythms, it is necessary to get outside them, but not completely: be it through illness or a technique. A certain exteriority enables the analytic intellect to function.”

(…)

You can’t have someone who’s a “Data Scientist” just turn up and apply their tools, clusters and statistics. They haven’t been in-it enough. And you can’t have someone who’s within the company, who understands and feels the flow of data everyday, unless, unless they know how to separate themselves, to get outside. When people grow with a company, love the company, understand everything that company could be, getting outside it is a hard won skill. The “Scientist” needs to be able to remove themselves and apply clear analytical skill, but with the fundamental understanding of the subject.

So all those companies advertising for a Data Scientist, I think I have this to say…

  1. You want a Dancer not a Scientist.
  2. Good luck with that!

As for the future of (data driven) journalism…

“In order to grasp this fleeting object, which is not exactly an object, it is therefore necessary to situate oneself simultaneously inside and outside.”

The “fleeting object, which is not exactly an object” – that’s your story. The flow of data will gather together now and then forming a tangible shape for you to spot and grasp before it collapses back into the stream. You have to be in it to understand it, and outside to spot it. Just one or the other won’t do.

When the Data Scientist/Dancer has sufficiently honed their skill to identify useful shifts, patterns and rhythms in the data, they can then set up algorithms to spot these on their behalf.
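One hedged illustration of what such a pattern-spotting algorithm might look like in miniature – a rolling-average watcher that flags values jumping far above the recent rhythm. The window and threshold are arbitrary choices for the sketch, not anything from Catt’s post:

```python
def flag_shifts(values, window=5, threshold=2.0):
    """Flag indices where a value exceeds `threshold` times the average
    of the previous `window` values - a crude 'rhythm break' detector."""
    flags = []
    for i in range(window, len(values)):
        recent = values[i - window:i]
        avg = sum(recent) / window
        if avg and values[i] > threshold * avg:
            flags.append(i)
    return flags

# Hourly visits to a news story: steady, then a sudden spike.
traffic = [10, 11, 9, 10, 12, 11, 10, 55, 11, 10]
spikes = flag_shifts(traffic)
```

The dancer’s judgment is encoded in the parameters: only someone who knows the data’s normal pulse can say what counts as a break in it.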

To understand the usefulness of algorithms we should first fully understand Golems and Robots.

http://en.wikipedia.org/wiki/Golem
http://en.wikipedia.org/wiki/R.U.R._(Rossum%27s_Universal_Robots)

Reverend Dan Catt, who works at the Guardian and was previously a frontend engineer at Flickr, Of Data Scientists, Big Data, the City and Dancers, 2 June 2011

Filed under Information

Why Privacy Matters Even if You Have ‘Nothing to Hide’


The nothing-to-hide argument (…) is not of recent vintage. One of the characters in Henry James’s 1888 novel, The Reverberator, muses: “If these people had done bad things they ought to be ashamed of themselves and he couldn’t pity them, and if they hadn’t done them there was no need of making such a rumpus about other people knowing.” (…)

Likewise, in Friedrich Dürrenmatt’s novella “Traps,” which involves a seemingly innocent man put on trial by a group of retired lawyers in a mock-trial game, the man inquires what his crime shall be. “An altogether minor matter,” replies the prosecutor. “A crime can always be found.” (…)

“If you have nothing to hide, then that quite literally means you are willing to let me photograph you naked? And I get full rights to that photograph—so I can show it to your neighbors?” The Canadian privacy expert David Flaherty expresses a similar idea when he argues: "There is no sentient human being in the Western world who has little or no regard for his or her personal privacy; those who would attempt such claims cannot withstand even a few minutes’ questioning about intimate aspects of their lives without capitulating to the intrusiveness of certain subject matters." (…)

To evaluate the nothing-to-hide argument, we should begin by looking at how its adherents understand privacy. Nearly every law or policy involving privacy depends upon a particular understanding of what privacy is. The way problems are conceived has a tremendous impact on the legal and policy solutions used to solve them. As the philosopher John Dewey observed, “A problem well put is half-solved.” (…)

Privacy can be invaded by the disclosure of your deepest secrets. It might also be invaded if you’re watched by a peeping Tom, even if no secrets are ever revealed. With the disclosure of secrets, the harm is that your concealed information is spread to others. With the peeping Tom, the harm is that you’re being watched. You’d probably find that creepy regardless of whether the peeper finds out anything sensitive or discloses any information to others. There are many other forms of invasion of privacy, such as blackmail and the improper use of your personal data. Your privacy can also be invaded if the government compiles an extensive dossier about you.

Privacy, in other words, involves so many things that it is impossible to reduce them all to one simple idea. And we need not do so.

To describe the problems created by the collection and use of personal data, many commentators use a metaphor based on George Orwell’s Nineteen Eighty-Four. Orwell depicted a harrowing totalitarian society ruled by a government called Big Brother that watches its citizens obsessively and demands strict discipline. The Orwell metaphor, which focuses on the harms of surveillance (such as inhibition and social control), might be apt to describe government monitoring of citizens. But much of the data gathered in computer databases, such as one’s race, birth date, gender, address, or marital status, isn’t particularly sensitive. Many people don’t care about concealing the hotels they stay at, the cars they own, or the kind of beverages they drink. Frequently, though not always, people wouldn’t be inhibited or embarrassed if others knew this information.

Another metaphor better captures the problems: Franz Kafka’s The Trial. Kafka’s novel centers around a man who is arrested but not informed why. He desperately tries to find out what triggered his arrest and what’s in store for him. He finds out that a mysterious court system has a dossier on him and is investigating him, but he’s unable to learn much more. The Trial depicts a bureaucracy with inscrutable purposes that uses people’s information to make important decisions about them, yet denies the people the ability to participate in how their information is used.

The problems portrayed by the Kafkaesque metaphor are of a different sort than the problems caused by surveillance. They often do not result in inhibition. Instead they are problems of information processing—the storage, use, or analysis of data—rather than of information collection. They affect the power relationships between people and the institutions of the modern state. They not only frustrate the individual by creating a sense of helplessness and powerlessness, but also affect social structure by altering the kind of relationships people have with the institutions that make important decisions about their lives.

Legal and policy solutions focus too much on the problems under the Orwellian metaphor—those of surveillance—and aren’t adequately addressing the Kafkaesque problems—those of information processing. (…)

As the computer-security specialist Bruce Schneier aptly notes, the nothing-to-hide argument stems from a faulty “premise that privacy is about hiding a wrong.” Surveillance, for example, can inhibit such lawful activities as free speech, free association, and other First Amendment rights essential for democracy. (…)

The problems are not just Orwellian but Kafkaesque. Government information-gathering programs are problematic even if no information that people want to hide is uncovered. In The Trial, the problem is not inhibited behavior but rather a suffocating powerlessness and vulnerability created by the court system’s use of personal data and its denial to the protagonist of any knowledge of or participation in the process. The harms are bureaucratic ones—indifference, error, abuse, frustration, and lack of transparency and accountability.

One such harm, for example, which I call aggregation, emerges from the fusion of small bits of seemingly innocuous data. When combined, the information becomes much more telling. By joining pieces of information we might not take pains to guard, the government can glean information about us that we might indeed wish to conceal. For example, suppose you bought a book about cancer. This purchase isn’t very revealing on its own, for it indicates just an interest in the disease. Suppose you bought a wig. The purchase of a wig, by itself, could be for a number of reasons. But combine those two pieces of information, and now the inference can be made that you have cancer and are undergoing chemotherapy. That might be a fact you wouldn’t mind sharing, but you’d certainly want to have the choice.
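Solove’s aggregation harm is mechanical enough to sketch. Assuming a toy purchase log and an invented inference rule – neither is from the essay – two records that are innocuous on their own join into one telling conclusion:

```python
# Each record alone reveals little; the join is what discloses.
purchases = {
    "alice": {"book on cancer", "wig"},
    "bob": {"book on cancer", "novel"},
}

def infer_chemo(items):
    """Neither purchase by itself is revealing; together they suggest
    cancer treatment - the aggregation inference Solove describes."""
    return "book on cancer" in items and "wig" in items

suspected = [name for name, items in purchases.items() if infer_chemo(items)]
```

The harm is not in either data point but in the fusion, which is why limiting collection of “sensitive” data alone does not address it.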

Another potential problem with the government’s harvest of personal data is one I call exclusion. Exclusion occurs when people are prevented from having knowledge about how information about them is being used, and when they are barred from accessing and correcting errors in that data. Many government national-security measures involve maintaining a huge database of information that individuals cannot access. Indeed, because they involve national security, the very existence of these programs is often kept secret. This kind of information processing, which blocks subjects’ knowledge and involvement, is a kind of due-process problem. It is a structural problem, involving the way people are treated by government institutions and creating a power imbalance between people and the government. To what extent should government officials have such a significant power over citizens? This issue isn’t about what information people want to hide but about the power and the structure of government.

A related problem involves secondary use. Secondary use is the exploitation of data obtained for one purpose for an unrelated purpose without the subject’s consent. How long will personal data be stored? How will the information be used? What could it be used for in the future? The potential uses of any piece of personal information are vast. Without limits on or accountability for how that information is used, it is hard for people to assess the dangers of the data’s being in the government’s control.

Yet another problem with government gathering and use of personal data is distortion. Although personal information can reveal quite a lot about people’s personalities and activities, it often fails to reflect the whole person. It can paint a distorted picture, especially since records are reductive—they often capture information in a standardized format with many details omitted.

For example, suppose government officials learn that a person has bought a number of books on how to manufacture methamphetamine. That information makes them suspect that he’s building a meth lab. What is missing from the records is the full story: The person is writing a novel about a character who makes meth. When he bought the books, he didn’t consider how suspicious the purchase might appear to government officials, and his records didn’t reveal the reason for the purchases. Should he have to worry about government scrutiny of all his purchases and actions? Should he have to be concerned that he’ll wind up on a suspicious-persons list? Even if he isn’t doing anything wrong, he may want to keep his records away from government officials who might make faulty inferences from them. He might not want to have to worry about how everything he does will be perceived by officials nervously monitoring for criminal activity. He might not want to have a computer flag him as suspicious because he has an unusual pattern of behavior.

The nothing-to-hide argument focuses on just one or two particular kinds of privacy problems—the disclosure of personal information or surveillance—while ignoring the others. It assumes a particular view about what privacy entails, to the exclusion of other perspectives.

It is important to distinguish here between two ways of justifying a national-security program that demands access to personal information. The first way is not to recognize a problem. This is how the nothing-to-hide argument works—it denies even the existence of a problem. The second is to acknowledge the problems but contend that the benefits of the program outweigh the privacy sacrifice. The first justification influences the second, because the low value given to privacy is based upon a narrow view of the problem. And the key misunderstanding is that the nothing-to-hide argument views privacy in this troublingly particular, partial way.

Investigating the nothing-to-hide argument a little more deeply, we find that it looks for a singular and visceral kind of injury. Ironically, this underlying conception of injury is sometimes shared by those advocating for greater privacy protections. For example, the University of South Carolina law professor Ann Bartow argues that in order to have a real resonance, privacy problems must “negatively impact the lives of living, breathing human beings beyond simply provoking feelings of unease.” She says that privacy needs more “dead bodies,” and that privacy’s “lack of blood and death, or at least of broken bones and buckets of money, distances privacy harms from other [types of harm].”

Bartow’s objection is actually consistent with the nothing-to-hide argument. Those advancing the nothing-to-hide argument have in mind a particular kind of appalling privacy harm, one in which privacy is violated only when something deeply embarrassing or discrediting is revealed. Like Bartow, proponents of the nothing-to-hide argument demand a dead-bodies type of harm.

Bartow is certainly right that people respond much more strongly to blood and death than to more-abstract concerns. But if this is the standard to recognize a problem, then few privacy problems will be recognized. Privacy is not a horror movie, most privacy problems don’t result in dead bodies, and demanding evidence of palpable harms will be difficult in many cases.

Privacy is often threatened not by a single egregious act but by the slow accretion of a series of relatively minor acts. In this respect, privacy problems resemble certain environmental harms, which occur over time through a series of small acts by different actors. Although society is more likely to respond to a major oil spill, gradual pollution by a multitude of actors often creates worse problems.

Privacy is rarely lost in one fell swoop. It is usually eroded over time, little bits dissolving almost imperceptibly until we finally begin to notice how much is gone. When the government starts monitoring the phone numbers people call, many may shrug their shoulders and say, “Ah, it’s just numbers, that’s all.” Then the government might start monitoring some phone calls. “It’s just a few phone calls, nothing more.” The government might install more video cameras in public places. “So what? Some more cameras watching in a few more places. No big deal.” The increase in cameras might lead to a more elaborate network of video surveillance. Satellite surveillance might be added to help track people’s movements. The government might start analyzing people’s bank records. “It’s just my deposits and some of the bills I pay—no problem.” The government may then start combing through credit-card records, then expand to Internet-service providers’ records, health records, employment records, and more. Each step may seem incremental, but after a while, the government will be watching and knowing everything about us.

"My life’s an open book," people might say. "I’ve got nothing to hide." But now the government has large dossiers of everyone’s activities, interests, reading habits, finances, and health. What if the government leaks the information to the public? What if the government mistakenly determines that based on your pattern of activities, you’re likely to engage in a criminal act? What if it denies you the right to fly? What if the government thinks your financial transactions look odd—even if you’ve done nothing wrong—and freezes your accounts? What if the government doesn’t protect your information with adequate security, and an identity thief obtains it and uses it to defraud you? Even if you have nothing to hide, the government can cause you a lot of harm.

"But the government doesn’t want to hurt me," some might argue. In many cases, that’s true, but the government can also harm people inadvertently, due to errors or carelessness.

When the nothing-to-hide argument is unpacked, and its underlying assumptions examined and challenged, we can see how it shifts the debate to its terms, then draws power from its unfair advantage. The nothing-to-hide argument speaks to some problems but not to others. It represents a singular and narrow way of conceiving of privacy, and it wins by excluding consideration of the other problems often raised with government security measures. When engaged directly, the nothing-to-hide argument can ensnare, for it forces the debate to focus on its narrow understanding of privacy. But when confronted with the plurality of privacy problems implicated by government data collection and use beyond surveillance and disclosure, the nothing-to-hide argument, in the end, has nothing to say.

Daniel J. Solove, professor of law at the George Washington University Law School. This essay is excerpted from his book Nothing to Hide: The False Tradeoff Between Privacy and Security (Yale University Press, 2011), cited in Why Privacy Matters Even if You Have ‘Nothing to Hide’, The Chronicle Review, May 15, 2011

See also: Why we must remember to delete – and forget – in the digital age

Filed under Information Privacy Society

Massimo Pigliucci on ignorance and the need for critical thinking in our times

“Ignorance is the root of all evil, according to Plato, who also famously gave us a still-current definition of its opposite: knowledge. For Plato, knowledge is “justified true belief.” That definition is worthy of consideration as we reflect on the perils of ignorance in the twenty-first century.

Plato thought that three conditions must be met in order for us to “know” something: the notion in question must actually be true; we must believe it (because if we do not believe something that is true, we can hardly claim that we know it); and, most subtly, it must be justifiable – there must be reasons why we believe the notion to be true. (…)

The paradox of ignorance in our era: on the one hand, we are constantly bombarded by expert opinion, by all sorts of people – with or without Ph.D. after their name – who tell us exactly what to think (though rarely why we should think it). On the other hand, most of us are woefully inadequate to practice the venerable and vital art of baloney detection (or, more politely, critical thinking), which is so necessary in modern society.

You can think of the paradox in another way: we live in an era when knowledge – in the sense of information – is constantly available in real time through computers, smart phones, electronic tablets, and book readers. And yet we still lack the basic skills of reflecting on such information, of sifting through the dirt to find the worthy nuggets. We are ignorant masses awash in information.

Of course, it may be that humanity has always been short on critical thinking. That’s why we keep allowing ourselves to be talked into supporting unjust wars (not to mention actually dying in them), or voting for people whose main job seems to be to amass as much wealth for the rich as they can get away with. It is also why so many people are duped by exceedingly costly sugar pills sold to them by homeopathic “doctors,” and why we follow the advice of celebrities (rather than real doctors) about whether to vaccinate our kids.

But the need for critical thinking has never been as pressing as in the Internet era. At least in developed countries – but increasingly in underdeveloped ones as well – the problem is no longer one of access to information, but of the lack of ability to process and make sense of that information. (…)

Education has increasingly been transformed into a commodity system, in which the “customers” (formerly students) are kept happy with personalized curricula while being prepared for the job market (rather than being prepared to be responsible human beings and citizens).

This can and must change, but it requires a grassroots movement that uses blogs, online magazines and newspapers, book clubs and meet-up clubs, and anything else that might work to promote educational opportunities to develop critical-thinking skills. After all, we do know that it is our future.”

Massimo Pigliucci, Professor of Philosophy at the Graduate Center of the City University of New York, Ignorance Today, Project Syndicate, Apr 22, 2011.

Filed under Age of information Education Knowledge Information Paradoxes Rationalism Skepticism

Hans Rosling: What people need isn’t more data but a new mindset


Hans Rosling has maintained a fact-based worldview – an understanding of how global health trends act as a signifier for economic development based on hard data. Today, he argues, countries and corporations alike need to adopt that same data-driven understanding of the world if they are to make sense of the changes we are experiencing in this new century, and the opportunities and challenges that lie ahead.

"My basic idea is that the world has changed so much, what people need isn’t more data but a new mindset. They need a new storage system that can handle this new information. But what I have found over the years is that the CEOs of the biggest companies are actually those that already have the most fact-based worldview, more so than in media, academia or politics. Those CEOs that haven’t grasped the reality of the world have already failed in business. If they don’t understand what is happening in terms of potential new markets in the Middle East, Africa and so on, they are out. So the bigger and more international the organisation, the more fact-based the CEO’s worldview is likely to be. The problem is that they are slow in getting their organisation to follow. (…)

For instance, in terms of education levels, we no longer live in a world that is divided into the West and the rest; our world today stretches from Canada to Yemen with all the other countries somewhere in between. There’s a broad spectrum of levels and we have to realise that Asia, Brazil, Latin America and, to some extent, the Middle East are catching up with the countries we used to call the ‘West’.

But even when people act within a fact-based worldview, they are used to talking with sterile figures. They are used to standing on a podium, clicking through slide shows in PowerPoint rather than interacting with their presentation. The problem is that companies have a strict separation between their IT department, where datasets are produced, and the design department, so hardly any presenters are proficient in both. Yet this is what we need. Getting people used to talking with animated data is, to my mind, a literacy project. (…)

In the world today, it’s not money that drags people into modern times, it’s people that drag money into modern times. I can demonstrate human resources successes in Asia through health being improved, family size decreasing and then education levels increasing. That makes sense: when more children survive, parents accept that there is less need for multiple births, and they can afford to put their children through school. So Pfizer have moved their research and development of drugs to Asia, where there are brilliant young people who are amazing at developing drugs. It’s realising this kind of change that’s important. (…)

The problem isn’t that specialised companies lack the data they need, it’s that they don’t go and look for it, they don’t understand how to handle it. (…)

Western Europe and other high-income countries have to integrate themselves into the world in the same way big companies are doing. They have to look at the advantages, resources and markets that exist in different places around the world.

And some organisations aren’t willing to share their data, even though it would be a win-win situation for everybody and we would do much better in tackling the problems we need to tackle. Last April, the World Bank caved in and finally embraced an open data policy, but the OECD uses tax money to compile data and then sells it in a monopolistic way. The Chinese Statistical Bureau provides data more easily than the OECD. The richest countries in the world don’t have the vision to change.

I call this the ‘database hugging disorder’. To heal it, we have to instil a clear division of labour between those who provide the datasets – like the World Bank, the World Health Organisation or companies themselves – those who provide new technologies to access or process them, like Google or Microsoft, and those who ‘play’ with them and give data meaning. It’s like a great concert: you need a Mozart or a Chopin to write wonderful music, then you need the instruments and finally the musicians.

Meteorologists are one group that has a ready grasp of this idea. They receive a huge amount of data, which they process in a highly sophisticated way, translating it into stunning graphics – and there they are on prime-time TV presenting the weather while we all watch. This is exactly what we strive to emulate. We want our economic indicators, our social indicators and our environmental indicators to be communicated on prime-time television with the same level of efficiency. (…)

Play with data and give it meaning.”

Hans Rosling in a talk with Ulrike Reinhard, A Data State of Mind, Think Quarterly

Hans Rosling studied statistics and medicine at Uppsala University, Sweden. He earned a PhD, spent two decades studying in Africa and, as chairman of the Karolinska International Research and Training Committee, has collaborated with universities in Asia, Africa, the Middle East and Latin America.

See also: Hans Rosling shows the best stats you’ve ever seen, TED.com, Feb 2006

Filed under Information Knowledge Politics Statistics