Deep Throat Meets Data Mining

Journalism is changing. Watchdog journalism — the perusal of thousands of pages of official records in search of anomalies and other signs of abuse and corruption — is much harder to do than reporting on celebrity shenanigans or fashion trends. John Mecklin writes:

On a disaggregated Web, it seems, people and advertisers simply will not pay anything like the whole freight for investigative reporting. But Hamilton thinks advances in computing can alter the economic equation, supplementing and, in some cases, even substituting for the slow, expensive and eccentric humans required to produce in-depth journalism as we’ve known it.

Already, complex algorithms — programming often placed under the over-colorful umbrella of “artificial intelligence” — are used to gather content for Web sites like Google News, which serves up a wide selection of journalism online, without much intervention from actual journalists. Hamilton sees a not-too-distant future in which that process would be extended, with algorithms mining information from multiple sources and using it to write parts of articles or even entire personalized news stories.
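
The aggregation step is less exotic than the "artificial intelligence" label suggests. Here is a toy sketch in Python (with invented headlines, and a crude shared-keyword test standing in for whatever Google News actually does) of clustering related items from several feeds and templating a one-line story from each cluster:

```python
# Toy corpus standing in for items pulled from several news feeds;
# in practice these would come from an RSS client or a scraper.
ITEMS = [
    ("Wire A", "City council approves new transit budget"),
    ("Wire B", "Transit budget passes council vote after debate"),
    ("Paper C", "Mayor signs transit budget into law"),
    ("Wire A", "Local team wins championship"),
]

STOPWORDS = {"the", "a", "an", "new", "after", "into", "of"}

def keywords(headline):
    """Crude keyword extraction: lowercase the headline, drop stopwords."""
    return {w for w in headline.lower().split() if w not in STOPWORDS}

def cluster(items):
    """Group items whose headlines share at least two keywords
    with the first headline in an existing cluster."""
    clusters = []
    for source, headline in items:
        kws = keywords(headline)
        for c in clusters:
            if len(kws & keywords(c[0][1])) >= 2:
                c.append((source, headline))
                break
        else:
            clusters.append([(source, headline)])
    return clusters

def summarize(cluster_items):
    """Template a one-line 'story' from a cluster of related items."""
    sources = sorted({s for s, _ in cluster_items})
    lead = cluster_items[0][1]
    return f"{lead} (reported by {', '.join(sources)})"

for c in cluster(ITEMS):
    print(summarize(c))
```

Real systems use far richer language models and ranking signals, but the shape is the same: collect, group, and emit text with no reporter in the loop.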

From another section in the same story:

Investigative reporters have long used computers to sort and search databases in pursuit of their stories. Investigative Reporters and Editors and its National Institute for Computer-Assisted Reporting, for example, hold regular computer-assisted reporting training sessions around the country. And the country’s major journalism schools all deal in some way with computer-enhanced journalism. The emerging academic/professional field of computational journalism, however, might be thought of as a step beyond computer-assisted reporting, an attempt to combine the fields of information technology and journalism and thereby respond to the enormous changes in information availability and quality wrought by the digital revolution.
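
To make that database-sifting step concrete, here is a small Python illustration (with made-up contract records and a simple median test, an assumption of mine rather than any newsroom's actual pipeline) of flagging anomalies in official records of the kind watchdog reporters comb through:

```python
import statistics

# Made-up records standing in for a table of public contracts.
RECORDS = [
    {"vendor": "Acme Paving",   "category": "roads",    "amount": 48_000},
    {"vendor": "Smith Paving",  "category": "roads",    "amount": 52_000},
    {"vendor": "Jones Paving",  "category": "roads",    "amount": 310_000},
    {"vendor": "City Print Co", "category": "printing", "amount": 4_100},
    {"vendor": "QuickPrint",    "category": "printing", "amount": 3_900},
]

def flag_outliers(records, threshold=3.0):
    """Yield records whose amount exceeds `threshold` times the median
    for their category: a crude anomaly test a reporter might run
    before digging into the paper trail behind a suspicious payment."""
    by_category = {}
    for r in records:
        by_category.setdefault(r["category"], []).append(r["amount"])
    medians = {c: statistics.median(v) for c, v in by_category.items()}
    for r in records:
        if r["amount"] > threshold * medians[r["category"]]:
            yield r

for r in flag_outliers(RECORDS):
    print(f'{r["vendor"]}: ${r["amount"]:,} ({r["category"]})')
```

Run on the toy data, this prints the one contract priced several times above its peers; the reporting starts, not ends, where the script leaves off.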
