Today Reuters announced that it is introducing AI-based copywriters into its newsrooms. Its rationale is the following:
…Machines write bad stories and journalists struggle with mounds of data. So that’s why Reuters is building a “cybernetic newsroom” – marrying the best of machine capability and human judgment to drive better journalism, rather than asking one to be a second-rate version of the other.
It’s a smart play. Before anyone goes out and fires their copywriters, though, you have to read a little more into the context of the AI’s use. It turns out that the program, named “Lynx Insight,” will begin by writing short snippets of financial data such as…
Johnson & Johnson has outperformed the S&P 500 Pharmaceuticals sector by 1.9 pct in the last month. In the same period it has fallen short of the broader S&P 500 by 3.8 pct.
Again, really smart. It’s taking tasks that are menial, repetitive, and simple and automating them. It’s piecing together data, forming sentences, and allowing human operators to evaluate and publish those statements as news.
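To make that concrete, here is a minimal sketch of what template-driven generation like this might look like. This is not Reuters’ actual system; the function name, signature, and numbers are all hypothetical, modeled on the example snippet above.

```python
# Hypothetical sketch of template-driven financial snippet generation.
# This is NOT Lynx Insight's actual logic; names and figures are illustrative.

def performance_snippet(company, sector, sector_delta, index, index_delta,
                        period="the last month"):
    """Turn raw relative-performance numbers into a one-line news snippet."""
    sector_verb = "outperformed" if sector_delta >= 0 else "underperformed"
    index_verb = "outpaced" if index_delta >= 0 else "fallen short of"
    return (
        f"{company} has {sector_verb} the {sector} sector by "
        f"{abs(sector_delta):.1f} pct in {period}. In the same period it has "
        f"{index_verb} the broader {index} by {abs(index_delta):.1f} pct."
    )

print(performance_snippet(
    "Johnson & Johnson", "S&P 500 Pharmaceuticals",
    sector_delta=1.9, index="S&P 500", index_delta=-3.8,
))
```

Note what the template does and doesn’t do: it fills slots with data and picks verbs by sign, but nothing in it can say *why* the numbers moved.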
What the AI cannot do is provide the “why.”
You see, Reuters is selecting financial services as its main area of activation not because it requires context but because it doesn’t. Financial reporting is supposed to be agnostic of opinion. The same can be said of most sports reporting, an area Lynx will eventually be rolled out to.
What I want to point out is that human copywriters are still needed to contextualize information. For instance, I would trust an AI like this to analyze digital performance data and create basic insights about how a program is performing vs. historical baselines. What I wouldn’t do is ask the AI to explain why. Hell, I’ve seen way too many human analysts read out data and get it flat wrong.
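That kind of baseline comparison is exactly the sort of thing a machine handles well. A minimal sketch, with invented metric names and thresholds, of how such an “insight” could be produced:

```python
# Hypothetical sketch: flag a metric's deviation from its historical baseline.
# Metric names, numbers, and the 10% threshold are invented for illustration.
import statistics

def baseline_insight(metric, history, current, threshold_pct=10.0):
    """Compare a current value to its historical mean and describe the gap.

    Produces the 'what' (size and direction of the change), never the 'why'.
    """
    baseline = statistics.mean(history)
    change_pct = (current - baseline) / baseline * 100
    if abs(change_pct) < threshold_pct:
        return f"{metric} is in line with its historical baseline."
    direction = "above" if change_pct > 0 else "below"
    return f"{metric} is {abs(change_pct):.1f} pct {direction} its historical baseline."

print(baseline_insight("Click-through rate", history=[2.0, 2.2, 1.8], current=2.6))
```

The output is a true statement about the data, but whether that 30 pct jump came from a creative refresh, a seasonal spike, or a tracking bug is exactly the interpretation a human still has to supply.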
You have to have a human who knows the qualitative landscape and can interpret the data in order to generate real content.
If you need any proof of this, just take a look at some of the more epic AI content generation fails from brands like Burger King and BMW.
I guess my point is this – there are way too many overeager marketers out there who will read only a headline like this and extrapolate that AI content generation is a great idea. What I’m saying is: recognize the limitations of the technology, observe the nuance of the human-written content that works, and go from there.