Data journalism: innovation or a waste of time and energy?

Data journalism is one of the most innovative types of journalism today, based on the processing, interpretation and visualization of large amounts of information.

Journalists have worked with statistics and other data sets throughout the existence of the profession. According to Simon Rogers, one of the gurus of modern data journalism and founder of the Guardian's Datablog, all sports journalism, financial analytics, and even weather forecasts are based on the processing of such information.

The main difference between data journalism and conventional articles is that data is placed at the heart of the story. Infographics, interactive maps, timelines, pie charts around which the entire journalistic story is built are products of data journalism.

Officially, this direction took shape as an independent genre relatively recently – the term itself was proposed in 2010. But materials that fit the format can also be found in the archives; this early stage of data journalism is even called Victorian.

The beginning

The nurse and statistician Florence Nightingale fought to improve conditions in the British army. She did extensive work analyzing mortality in the troops and established, and then proved to the British Parliament, that in most cases soldiers died of preventable causes, such as unsanitary conditions.

Her report, published in 1858, was replete with graphs and charts, and data journalists confidently rank her among the pioneers of the genre.

The British Guardian insists that one of the first high-profile pieces of data journalism was published in its very first issue, in 1821.

This edition of the newspaper included a table obtained from an anonymous source. It contained a list of schools in Manchester and Salford, the number of students and the annual government spending on education.

By analyzing this data, the Guardian was able to identify large discrepancies with official statistics and establish more accurately how many children received free education and what proportion of students came from poor families.

A publication in the New York Tribune from 1849 is considered one of the first uses of infographics.

In this article, the newspaper visually depicted data on a cholera outbreak brought by ship from Europe. With the help of the graph, it made the sad and, as it turned out, justified prediction that the peak of the epidemic in New York was yet to come.

The work of the British physician John Snow, published in 1854, is an example of mapping, a technique widely used by data journalists today.

In a world still unaware of the existence of microbes, Snow documented and mapped cholera cases during the London outbreak. The visualization allowed him to suspect a connection between the epidemic and a water pump around which the sick lived. As a result, it was determined that contaminated sewage had entered that section of the city's water supply.

But the emergence of data journalism as a full-fledged direction became possible a century later – on the wave of the digital revolution and the spread of the ideas of information accessibility: open data, e-government, open-source tools for processing and visualization, as well as the emergence of big data.

New birth

Modern data journalism owes a lot to journalistic reporting with the use of computer technologies, or CAR (computer-assisted reporting).

This trend emerged in the early days of the computer era, when reporters began experimenting with computers to process information. In 1952, for example, CBS used a computer (a UNIVAC) to predict the outcome of the US presidential election, calling Eisenhower's landslide victory to within one percent.

Philip Meyer and Elliott Jaspin are considered the founding fathers of data journalism. Meyer coined the term "precision journalism" in his book of the same name.

In it, he advocated the use of scientific methods of data analysis in investigative journalism. Meyer himself used statistical analysis to investigate the causes of the 1967 Detroit riots for the Detroit Free Press. After analyzing the data, he was able to dispel several popular myths about the composition of the rioters and their motives.

In the 1980s, dozens of journalists in the US were already working in this genre. And in 1989, CAR received its first top recognition in the professional community: the Atlanta Journal-Constitution was awarded the Pulitzer Prize for a series of articles on racial inequality revealed by an analysis of banks' mortgage-lending policies.

That same year, Elliott Jaspin, another ardent proponent of CAR and a Pulitzer Prize winner, was instrumental in turning data journalism into an academic discipline: he founded the National Institute for Computer-Assisted Reporting (NICAR) in Missouri.

In the first decade of the 21st century, most of the titans of the Western media market had their own data blogs: the Guardian with its Datablog, the New York Times with The Upshot, the Los Angeles Times with its Data Desk, the Washington Post and many others. The largest associations of investigative journalists also actively joined in advancing data journalism, from Investigative Reporters and Editors to the Global Investigative Journalism Network.

Data journalism received another powerful boost in the new era from WikiLeaks, which gave a number of media outlets exclusive access to tens of thousands of raw classified files on the wars in Iraq and Afghanistan, and from Edward Snowden's leaked intelligence documents.


Spiegel, the Guardian, and the Associated Press, among others, worked through these huge archives. In 2012, the Data Journalism Awards were established to recognize excellence in the genre.

Data journalism today

Modern data journalism encompasses hundreds of pieces and tools published daily around the world. According to the prevailing view, it is not only the brainchild of the technological revolution that swept the world, but also the media sphere's reaction to ubiquitous event journalism, which lives on speed of delivery and sensationalism.

In contrast, data journalism is scrupulous and interested in detail; of the "five Ws," it focuses mainly on the question "Why?"

One of its main goals is to move the reader a little further in understanding not only an event but also its context, and to give isolated incidents and disparate facts a common basis.

Take, for example, last year's large-scale project by the German Berliner Morgenpost. In the article Es war nicht immer der Osten ("It wasn't always the East"), the newspaper traced the dynamics of vote distribution in elections in each of the German states since 1990.

The newspaper managed to demonstrate that the far right's entry into the German parliament, for the first time in half a century, was by no means as surprising as it seemed.

Another well-known data project, Gapminder, seeks to influence our perception of the state of affairs on the planet as a whole.

Its creators believe that we are fundamentally wrong about the world, and they try to prove it with data. To begin with, visitors are invited to take a quiz on basic global statistics, which, the authors warn, you will almost certainly fail.

Finding causal relationships and new perspectives is another key challenge that data journalism takes on. Bloomberg is known for its high-quality infographics, in particular the project "The most deadly profession in America."

According to the story's findings, garbage collectors in the United States face a higher risk of dying on the job than firefighters, and American taxi drivers are more likely to die violent deaths than police officers.

The ability to uncover hidden patterns through dry data analysis has made the genre particularly popular with investigative journalists. Steve Doig analyzed the damage done by Hurricane Andrew in the United States in 1992.

He combined two data streams: a map of the destruction and wind-speed records. This allowed him to identify areas where the consequences were severe partly because building-quality requirements there were less strict. Doig was awarded the Pulitzer Prize for the investigation.
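The core move here, overlaying two datasets keyed to the same areas and looking for mismatches, can be sketched in a few lines of Python. The zones, figures, and thresholds below are entirely invented for illustration; they merely stand in for the kind of destruction and wind-speed records such an analysis would join:

```python
def flag_anomalies(damage, wind, damage_min=0.5, wind_max=120):
    """Return areas with heavy damage despite only moderate winds:
    candidates for an explanation beyond the storm itself,
    such as lax building standards. All thresholds are illustrative."""
    return [area for area in damage
            if damage[area] > damage_min and wind[area] < wind_max]

# Hypothetical data: share of homes destroyed and peak wind speed (mph).
damage = {"Zone A": 0.62, "Zone B": 0.18, "Zone C": 0.55}
wind   = {"Zone A": 145,  "Zone B": 140,  "Zone C": 110}

print(flag_anomalies(damage, wind))  # -> ['Zone C']
```

The essential design choice is the shared key: once both streams are indexed by the same area identifier, anomalies fall out of a simple comparison.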

In 2015, Al Jazeera America analyzed the movements of a derailed train and found that this particular train had been taking a dangerous curve at high speed for months before the crash.

Last year, Canada’s Globe and Mail processed information from 870 police stations. As a result, it was able to establish that the police had declined to investigate one in five allegations of sexual assault, classifying them as “unfounded.”

Mission of data journalism

Another mission of data journalism is to ease access to fragmented, unreadable, or otherwise inaccessible information. Sometimes months of work by data teams that have processed thousands of rows of information or digitized hundreds of PDF documents end with a cleaned and verified spreadsheet “simply” posted online in a readable format for everyone to use.
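That "clean and publish" step can be sketched, with invented field names and records, as deduplicating and normalizing scraped rows before exporting them as a CSV anyone can reuse:

```python
import csv
import io

def clean(rows):
    """Normalize whitespace and case, drop duplicate records,
    and coerce numeric fields (unusable values become None)."""
    seen, out = set(), []
    for r in rows:
        name = " ".join(r["name"].split()).title()
        if name in seen:
            continue  # duplicate record
        seen.add(name)
        n = r["complaints"]
        out.append({"name": name,
                    "complaints": int(n) if n.isdigit() else None})
    return out

# Hypothetical rows as they might come out of digitized documents.
raw = [
    {"name": " Station 12 ", "complaints": "41"},
    {"name": "Station 12",   "complaints": "41"},   # duplicate
    {"name": "STATION 7",    "complaints": "n/a"},  # unusable value
]

# Publish the verified table in a readable format.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "complaints"])
writer.writeheader()
writer.writerows(clean(raw))
print(buf.getvalue())
```

In real projects the cleaning rules grow much longer, but the shape is the same: messy rows in, one consistent table out.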

This mission is especially important in a repressive environment where access to reliable quantitative information is closed or severely restricted.

La Nacion Data collects, digitizes and publishes a variety of information about Argentina, a country without a freedom of information law.

The Excesos Sin Castigo ("Excesses Without Punishment") project publishes data on the extractive and oil industries in Peru.

In such cases, several dense paragraphs of text can be replaced by a single summary infographic.

For example, the data blog Information is Beautiful charts the biggest online identity breaches.

Or the Guardian US project on LGBT rights in the United States, or a real-time map of the murders of women in Turkey.

Moreover, data journalism is not always dead serious. Many hours of work with numbers can also yield, say, a ranking of the silliest edit wars among Wikipedia editors.

What’s next for data journalism?

In 2017, Google News Lab published a study that painted a fairly encouraging picture of the present state of data journalism: 42 percent of journalists reported using data in their work, and 51 percent of newsrooms had a dedicated data journalism specialist.

But, according to many, data journalism is not developing as rapidly as expected: the playing field is still mostly Western media, and in some countries data journalism is absent altogether.

Moreover, after the 2016 US presidential election, some rushed to bury data journalism in the United States as well, because no reputable data source, from FiveThirtyEight to Politico, had predicted the outcome of the vote.

Data journalism, then, is a branch of journalism based on the collection, analysis and processing of data to create media materials.

The goal remains the same – to inform readers and report on important events or phenomena. The source, however, is not expert opinions or press releases but data. The data journalist's main task is to turn it into a clear story and an attractive visual product. Technology and modern visualization tools help create interactive maps, graphics, and even personalized elements.

The toolkit that data journalists use is quite versatile: data-manipulation skills are useful not only in the media but also in corporate media, client mailing lists, PR, and outward-facing reports – that is, wherever a large amount of data needs to be presented clearly and convincingly.
