Readers should insist AI not lead to smaller newsrooms

Source: Pixabay

This is shaping up as the year the other shoe dropped on artificial intelligence: what began as a plotline of science fiction has matured into global terror that AI might replace masses of white-collar workers. It’s a big deal for many industries, but few cases should spark as much concern as journalism – an industry that is critical to society and has already been disrupted to the breaking point.

It may seem self-serving for a veteran journalist to claim journalism is so important, but I stand by it: with all due respect to social media influencers and independent bloggers – the other two primary “sources” of information – neither has the commitment to verification, standards and ethics that the quality media does, or the resources to pursue them.

Those things are essential to maintaining the public’s trust, which is already being hammered by the toxicity of modern politics – a toxicity that has tainted everyone in the ecosystem, including the media. If that trust deteriorates any further, and if the younger generations are not reclaimed for real journalism, then society’s ability to navigate free markets and democratic politics will plummet.

Recent days brought two developments that underscored the pace of change.

First, the Associated Press agreed to license its news archive to OpenAI (which will improve the bot’s understanding of world history and events) in a deal that also gave the agency, where I spent decades, access to OpenAI’s technology (which will increase the temptation to use it). AP had already been experimenting with automated article writing, especially for sports results.

And then it was reported that Google is in discussions with news publishers including the New York Times and Washington Post about building and selling artificial intelligence tools that could produce written journalism. According to the reports, the product – which the Times said bears the Star Trekky name of Genesis – was pitched as being able to collect information, write stories and more.

Everyone issued the requisite reassurances. But realistically, we face the prospect of bots writing stories and headlines. ChatGPT itself has boasted to me of AI’s ability to do this, and even to perform journalistic tasks like information-gathering (as well as summary generation, news monitoring and more). So I asked: Because you gather information online, could you not end up offering untruth as fact?

ChatGPT is programmed to be fair (if evasive), so it offered no disputation: “I rely on the information that is available on the internet (so) while I strive to provide accurate and reliable information, there is a possibility of encountering inaccuracies … Journalists and individuals should critically evaluate and verify the information they receive from multiple reliable sources before considering it as factual.”

For now, this need for human verification is understood to be a serious matter (if you doubt it, ask AI which novels you’re most famous for and you’ll probably see). But we risk forgetting this as the bots get way better. When I studied computer vision and AI in college and grad school in the late 1980s, the challenge was getting the thing to tell a circle from a square; now it can produce fake poetry and art. That’s a steep learning curve.

Consider the two basic advantages bot-journalists will always enjoy, even before machine learning expands their realm of competence:

  1. Access to memory: Each of us can summon up only a tiny fraction of the things we have heard and seen, the thoughts we have had and the information we have acquired. For this reason I tend to trust an instinct (say, to turn one way rather than the other at a fork in the road); I figure it might be suppressed memory. But AI simply has total recall – a concept so amazing they made it the title of an Arnold Schwarzenegger movie.
  2. Speed of calculation: Much of what we perceive as thought, even analysis, projection and inference, is essentially calculation. No human can match the calculation speed of even a rudimentary computer, let alone generative AI working from large language models with machine learning capabilities that optimize and self-develop.

The narrative that optimists – myself among them – are currently telling is that humans have several advantages as well.

The baseline is, as said, the notion that only humans can be trusted with verification – essentially, that trust itself is a monopoly for our species. But as information-gathering capability meets greater fact-checking capability, the problem with AI will be less visible. AI results will start to be accurate so much of the time that we may find ourselves on the slippery slope of granting the bots the autonomy to publish.

Secondly, there is the idea that only humans can be creative. Risking the mockery of the “fake news” cabal, I posit that much of legitimate journalism can be seen to require creativity: the way you present a feature-length human interest story; the way you approach an investigation; the way you lay out an argument in an analysis.

In the short term, few will dispute the notion that AI cannot be creative. But the seeds of coming change will become clear to anyone who asks AI for a Stones song in the style of the Doors, or a Robert Frost-style poem about love in springtime, or a plot outline about reporters who lose their jobs to AI. It can approximate creativity pretty well, even in imitation. There are neuroscientists who will argue that all art is imitation, and that all creativity is calculation. Perhaps you see the coming problem.

Setting aside how people will view AI in the future, the lay of the land for now appears as follows.

There is no question that AI can take data sets, or any indisputable facts, and transform them into stories rather close to ready-to-publish, if your standards do not require sparkle. As said, at AP and elsewhere this has already begun. AI will get better at contextualizing, using its total recall and blitzkrieg calculations. That will appear to duplicate many functions long performed by cantankerous journalists.

Moreover, AI can create a very efficient shortcut to writing more complex content, from a multifaceted news story to an analysis and even an opinion piece. Ask ChatGPT to give you a story criticizing a certain government in the style of a certain pundit, and the result will be bad at first, then gradually better as you add nuance to the prompt. No one currently wants op-eds written by bots, but as a foundation this can save all kinds of time. Pundits will start to use it, I can guarantee. Even this shortcut will be another slippery slope, and will raise ethics issues.

All this amounts to an indisputable efficiency, at multiple stages of production.

The optimistic view is that this will improve the product by freeing up the existing journalistic corps from automatable (and therefore theoretically less intellectually challenging) tasks in favor of more elevated things that only humans can do. And these certainly exist – from the highest-level planning of investigations and connect-the-dots public service journalism to obvious things like going into the field to interview human subjects less likely to open up to a stream of electrons.

Indeed, I’m pretty sure the latter function will never be replaced – but will it be forgone? I wish I knew for sure that the answer is no.

Whenever industries confront a once-in-a-generation world-changing efficiency, they face two possibilities. The first is the one described above: reallocation to improve the overall product. The second is downsizing, which would help distressed bottom lines but also risk long-term devastation of the news business. It would be short-termist in the worst way.

As is well known, the journalism industry has been battered by the digital disruption. Classified-ad income is gone; print sales are disappearing; much of online advertising, the expected panacea, has floated away to search, social and even e-commerce sites; and paywalls are a hard sell for much of the audience, rarely amounting to more than peanuts.

In dealing with all this, in some countries perhaps half the journalism workforce has already been culled since the turn of the century. The remaining hardy souls have to work harder than before (24/7 online deadlines), in more formats than before (video is now critical), with fewer financial resources than before (but of course with access to the same efficiency tools that threaten them).

This challenge has coincided with declining public trust in the mainstream media – which comes mostly from politics, as I’ve said, but also attaches to a broad sense that the media, in its desperation, is cutting corners and engaging in clickbait.

The first path – reallocation – offers a way to strengthen the product, improve the public service, regain credibility and attract new talent by offering more fulfilling careers to aspiring journalists. The second – downsizing – would quickly accelerate what is looking like a death spiral.

Let’s hope the market – meaning the public – chooses wisely. So good people, hear this plea: Patronize publications that are committed to real journalism, and punish those that will use AI as the excuse to fire more journalists with what the late Milan Kundera called “unbearable lightness.”
