Recorded Future is a startup technology company that describes itself as a “temporal analytics engine.” It tries to uncover and analyze very faint signals, essentially in order to predict the future. It’s backed by Google Ventures and the data-loving VC firm IA Ventures.
Today, Recorded Future articulated its vision of the future of news. By news they don’t just mean what’s broadcast on TV at 5 and 11; they mean current events of interest to people seeking actionable information. The gist of the company’s argument is this: real-time web publishing, best exemplified by the news-breaking social network Twitter, is ultimately a race to the bottom. The gap between something happening and its entry into the days- or weeks-long cycle of news recycling will eventually drop from the 10 or 20 minutes it stands at today to zero. That’s a losing proposition for competitive news gatherers, the company says, and will be replaced by an endless competition to predict the news earlier and earlier, before it happens. It’s a compelling argument, I think, and well worth considering.
Is it really possible to predict the future based on a giant index of digital information? Google has said it aims in the future to serve up what you want before you even ask for it. And people have always said that those who don’t learn from the past are doomed to repeat it. (There’s some very interesting discussion on this topic going on over on my Google Plus account right now.)
I’m inclined to believe that, with enough data analyzed smartly enough, many events can be predicted accurately enough to be useful. Is that where the next arms race in analytics software will be fought? I wouldn’t be surprised. I am willing to bet that Google in particular, if it can pull it off, will in the not-so-distant future offer prediction or recommendation technologies based on the massive swaths of data it is ingesting and analyzing from the web and search, from web traffic, from spoken-word analysis, from sensors in self-driving cars, and from other signals. I was very surprised that the company shut down its Smart Meter platform last month, but perhaps it’s focusing on data collection in industries where it can own more of the technology stack than it can in energy.
Walmart is already fine-tuning how it stocks the shelves across its empire of stores based on what it learns from people’s posts to Twitter. So is this stuff for real? I think it’s only a question of how truly useful it ends up being.
Compare this with the real-time web of today. Twitter is famous for breaking news of earthquakes, political scandals and developments in many industries (especially technology).
From News to Pre-News
From Recorded Future:
“The early nature of such signals obviously makes them very attractive. At the same time, these are subtle signals, and it will take judgment, statistical rigor, or the like, to take advantage of effectively and confidently.
“Tricky issues also remain in identifying prescient signals. These range from the technical (efficiently and accurately organizing references to time in news) to the psychological (how we go about researching and analyzing information that may indicate a future event).
“In summary, the nature of news continues to change, and the game of analyzing it for actionable information is shifting from news to pre-news to early event detection – that’s where the future is and the value lies.”
Speed in detecting news after it happens is, arguably, no longer a competitive advantage. Companies like Recorded Future believe they know where and how to look before events happen – to discern clues about what will happen in the future.
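To make the idea of “early event detection” a bit more concrete, here is a toy sketch of one simple approach: flag a topic when its mention count spikes well above its recent average. This is purely illustrative – the names, parameters, and method are my own assumptions, not Recorded Future’s actual technology.

```python
from collections import deque

def burst_detector(window=5, threshold=3.0):
    """Return a function that flags counts far above the trailing mean.

    window: number of past intervals kept as the baseline.
    threshold: multiple of the baseline mean that counts as a burst.
    All names and parameters here are illustrative, not a vendor's API.
    """
    history = deque(maxlen=window)

    def observe(count):
        baseline = sum(history) / len(history) if history else 0.0
        # Only flag once we have some history; the max() guards tiny baselines.
        is_burst = bool(history) and count > threshold * max(baseline, 1.0)
        history.append(count)
        return is_burst

    return observe

# Hourly mention counts for a hypothetical topic: quiet hours, then a spike.
detect = burst_detector(window=4, threshold=3.0)
signals = [detect(c) for c in [2, 3, 2, 3, 30, 4]]
# The spike at 30 is flagged; the quiet hours are not.
```

Real systems would of course go far beyond counting – extracting time references, weighting sources, and modeling uncertainty – but even this crude filter shows why “knowing where to look” is itself a form of advantage.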
I would argue, though, that the same holds for after-the-fact real-time news discovery. There is still a competitive advantage in knowing where to watch for news updates, even if there’s no longer any advantage in merely consuming widely watched sources faster.
Pre-cognition service providers might argue that they can show you where to go in order to “skate where the puck will be,” but I’m not convinced there isn’t still plenty of advantage in strategically choosing where to watch for real-time events.
“It goes without saying,” Recorded Future says nonetheless, “that the ability to capture value (be it economic, strategic, tactical) is directly proportional to how early one can detect and execute.”
Of course that’s true – but we’ll see how well the predictors can execute their detection and thus give the rest of us opportunities to execute our responses. I’m not quite ready to write off real-time news as too slow, yet.
My take: there is no need to pit news prediction against news capture and use; each has its own value. But forecasting news is made possible through ecological research, not scientific research, as my article will argue. The key developments making ecological research feasible are, first, crowdsourcing and, second, large-scale data mining.