Are bots listening to your earnings calls?

Everything executives say on a call is being carefully scrutinized and scored by machines, says Tim Raiswell, finance practice leader at global research and advisory firm Gartner.

CFOs are used to having their numbers placed under the microscope during an earnings call. Now, thanks to advances in natural language processing (NLP) and machine learning, their choice of words and how they deliver them are also being increasingly scrutinized and scored.

This development can have increasingly serious consequences for a company's stock price. Understanding who, or more accurately what, is listening, and what it is listening for, is an essential new responsibility for CEOs, CFOs and heads of investor relations.

While CFOs are used to testing their message with analysts, increasingly they must ask themselves: “Are we sending the right message to the bots?”

More than 13,000 public companies produce more than 30,000 hours of earnings calls per year, and about 60% of each call is spent on the Q&A session. CEOs and CFOs are most often in the hot seat, as portfolio managers and investment analysts look for clues to a company’s future. As these investors look for an edge, they are turning to automated speech and text analysis for additional clues about performance.

The use of nonfinancial data such as this — referred to in investment analytics as “nontraditional data” — is a growing part of investor information services as firms look to establish the return on investment from rapidly evolving machine learning approaches.

What bots look for

NLP is the use of software to analyze human language. Sentiment analysis measures the positivity or negativity of a passage of text to surface the emotions behind the words. The general idea is that NLP and sentiment analysis can detect underlying, unspoken signals about company performance and potential that senior leaders never state directly.
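
To make this concrete, here is a minimal sentiment-scoring sketch using the open-source VADER lexicon that ships with Python's NLTK library. The remarks being scored are invented, and commercial earnings-call bots use far more sophisticated, finance-tuned models, but the basic step of turning language into a score looks much the same.

```python
# Minimal sentiment-scoring sketch using NLTK's VADER lexicon.
# Illustrative only; the remarks below are invented and real
# earnings-call analytics use domain-tuned models.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

remarks = [
    "We delivered record revenue and expect continued strong demand.",
    "Results were mixed, but we remain cautiously optimistic about margins.",
]

for text in remarks:
    scores = sia.polarity_scores(text)  # neg/neu/pos plus a compound score
    print(f"{scores['compound']:+.2f}  {text}")
```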

Key indicators of hidden sentiment include indirect answers to questions, exuberant word choices, an absence of filler words, and qualifying statements.

For example, Gartner NLP analysis of the earnings transcripts of consumer-facing companies finds talk of economic slowdown and macroeconomic headwinds spiking toward the end of 2018. This trend was identified not by a human analyst but by a process called topic modeling, in which a machine learning algorithm finds clusters of related subject matter in earnings documents and tracks how they change over time. The topic of recession was about nine times more likely to be raised by executives or analysts in end-of-year calls than in third-quarter updates.
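
A toy version of that topic-modeling step can be sketched with scikit-learn's latent Dirichlet allocation. The quarterly snippets below are invented and far too small to be meaningful; real models are fitted on thousands of full transcripts. The point is only the mechanics: fit topics, then compare each quarter's topic mix.

```python
# Toy topic-modeling sketch (scikit-learn LDA). Transcript snippets
# are invented; real analyses use full transcript corpora.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

transcripts = {
    "Q3": "strong demand new store openings digital growth loyalty members",
    "Q4": "macroeconomic headwinds slowdown consumer spending caution inventory",
}

vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(transcripts.values())

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(doc_term)  # per-quarter topic weights

terms = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[-4:][::-1]]
    print(f"Topic {topic_idx}: {', '.join(top_terms)}")

for quarter, weights in zip(transcripts, doc_topics):
    print(quarter, weights.round(2))  # shift in topic mix across quarters
```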

By staying informed of new trends and developments in NLP and sentiment analysis, CEOs, CFOs and investor relations leaders can avoid major misinterpretations arising from simple word choices and syntax.

The business of NLP

The world’s largest investment banks and innovative startups are deploying more and more bots that promise to deliver concrete share price signals based on sentiment or diction choices used by earnings call participants.

For example, one global bank recently launched a text analyzer that evaluates the “narrative coherence” of a company’s earnings statements and contrasts it with that of industry competitors. The software can parse a company’s earnings transcripts over time and analyze how often specific key business terms are used, revealing shifts in emphasis on future business drivers.
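
The bank’s methodology is not public, but one small ingredient of such an analyzer, tracking how often key business terms recur from quarter to quarter, can be sketched in a few lines. The term list and transcript excerpts here are hypothetical.

```python
# Sketch of tracking key-term frequency across quarterly transcripts.
# Term list and excerpts are hypothetical placeholders.
import re
from collections import Counter

KEY_TERMS = ["growth", "margin", "guidance", "headwinds"]

transcripts = {
    "Q1": "Growth accelerated and we are raising guidance on margin strength.",
    "Q2": "Growth continued, though headwinds pressured margin late in the quarter.",
}

for quarter, text in transcripts.items():
    counts = Counter(re.findall(r"[a-z']+", text.lower()))
    mentions = {term: counts[term] for term in KEY_TERMS}
    print(quarter, mentions)
```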

Nor is it just the big banks. Several startups are pioneering new approaches, such as analyzing speech patterns and tone to forecast how a company’s share price will move. Some of these offerings are already available via the Nasdaq analytics hub, a sign that NLP approaches are gaining traction among Wall Street analysts.

Word choices and syntax to avoid

Software and analysis experts have identified a series of markers that are more likely to indicate deception or hidden sentiment.

David Larcker, professor of accounting at Stanford University’s Graduate School of Business, pioneered research in this area and identified common signs that company representatives aren’t telling the truth:

  • Failing to directly answer an analyst’s question
  • Using words like “my team” and “we” more than “I” or “myself”
  • Using exuberant words like “amazing” or “awesome”
  • Leaning on contrasting or qualifying words such as “but” and “if”
  • Using qualifying phrases such as “as I said before” or “to the best of my knowledge”

Lastly, contrary to common belief, research shows dishonest executives are less likely to stammer or use filler phrases such as “um,” “uh” or “err.” This is likely due to heavy coaching and predeveloped talking points for uneasy topics.
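
Taken together, these markers lend themselves to simple rule-based screening before heavier models are applied. The sketch below uses simplified placeholder word lists, not the ones from Professor Larcker’s research or any commercial product, to flag potentially risky phrasing in a draft Q&A answer.

```python
# Illustrative rule-based screen for the linguistic markers above.
# Word lists are simplified placeholders, not research or vendor lists.
import re

EXUBERANT = {"amazing", "awesome", "fantastic", "incredible"}
QUALIFIER_PHRASES = {"as i said before", "to the best of my knowledge"}
CONTRAST_WORDS = {"but", "if"}

def flag_markers(answer: str) -> list[str]:
    """Return the marker categories found in a single draft answer."""
    text = answer.lower()
    words = set(re.findall(r"[a-z]+", text))
    flags = []
    if words & EXUBERANT:
        flags.append("exuberant language")
    if any(phrase in text for phrase in QUALIFIER_PHRASES):
        flags.append("qualifying phrase")
    if words & CONTRAST_WORDS:
        flags.append("contrasting or qualifying word")
    return flags

print(flag_markers("As I said before, demand has been amazing, but mix shifted."))
# -> ['exuberant language', 'qualifying phrase', 'contrasting or qualifying word']
```

A screen like this will not catch indirect answers or coached delivery, but it is a cheap first pass before rehearsing against a full NLP model.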

In addition to avoiding the tells above, those delivering earnings calls should regularly review their previous statements and watch for major divergences from them. If you have been repeatedly touting the benefits of a major growth initiative for the past three quarters and then suddenly downplay it, analysts may not immediately pick up on the shift, but the bots certainly will.

Earnings calls are one of the most powerful tools a company uses to communicate with the investment community. Anyone preparing for their next call should make sure they don’t ignore these emerging technologies.
