Apr 05, 2022

Mind your tone: new tools are monitoring executive sentiment through voice analysis

Around 40 percent of the information conveyed by spoken words comes through tone

Sentiment analysis has become a mainstream tool for investors. These days, when your earnings call transcript is published, AI models scan the text to identify which sections appear more positive or negative. The information can be piped straight into trading strategies or used as a basis for further investigation.
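For readers curious about the mechanics, the sketch below shows the kind of text-level scoring involved. It uses the open-source VADER lexicon from NLTK rather than any vendor's proprietary model, and the transcript snippets are invented purely for illustration.

```python
# Minimal sketch: scoring transcript sections as more positive or negative.
# Uses the open-source VADER lexicon from NLTK; commercial models are richer.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download
sia = SentimentIntensityAnalyzer()

sections = [
    "We delivered record revenue and expanded margins this quarter.",
    "Supply constraints weighed on shipments and we expect further delays.",
]

for text in sections:
    scores = sia.polarity_scores(text)  # keys: neg, neu, pos, compound
    print(f"{scores['compound']:+.2f}  {text}")
```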

At the cutting edge of this work is an attempt to glean insight directly from the audio of the call. As we all know, how you say something can be just as important as – if not more important than – what you actually say. Around 40 percent of the information conveyed by spoken words comes through tone, according to scientific studies.

One firm focused on this area is Helios Life Enterprises. The start-up provides sentiment analysis of the tone used by executives on earnings calls, looking at factors such as intonation, speed and volume. The results are then packaged up for buy-side clients, which are mainly quant firms at the moment, says Sean Austin, CEO and co-founder.
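As an illustration only – not Helios's actual pipeline – the sketch below pulls crude versions of those three signals (pitch, loudness and pace) from a recording using the open-source librosa library. The file name and thresholds are placeholders, and production systems add speaker separation and far richer modelling on top.

```python
# Minimal sketch: crude tone features (pitch, loudness, pace) from call audio.
import numpy as np
import librosa

y, sr = librosa.load("earnings_call.wav", sr=16000)  # hypothetical recording

# Intonation: fundamental frequency (F0) track; NaN where no voice is detected
f0, _, _ = librosa.pyin(y, fmin=librosa.note_to_hz("C2"),
                        fmax=librosa.note_to_hz("C7"), sr=sr)
pitch_mean = np.nanmean(f0)   # average pitch
pitch_var = np.nanstd(f0)     # how much the pitch moves around

# Volume: root-mean-square energy per frame
rms = librosa.feature.rms(y=y)[0]
loudness = rms.mean()

# Speed (rough proxy): share of the recording that is non-silent
speech = librosa.effects.split(y, top_db=30)
speech_ratio = sum(end - start for start, end in speech) / len(y)

print(f"pitch {pitch_mean:.0f} Hz (±{pitch_var:.0f}), "
      f"loudness {loudness:.4f}, speech ratio {speech_ratio:.2f}")
```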

Individuals listening in to an earnings call may be able to pick up on this kind of nuance, notes Austin, especially those trained to look out for it. Technology, however, enables investors to stay on top of this signal across the whole market, by reviewing the thousands of public company calls released each quarter.

Comparing words and tone

While tonal analysis can provide stand-alone information for investors, a common use-case is to compare the results with text-based sentiment. When executives use positive words, does their tone match up? Or is there a discrepancy, implying things are not going as well as they say?
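Conceptually, the comparison can be as simple as the sketch below. Both scores are hypothetical inputs on a -1 (negative) to +1 (positive) scale; producing reliable scores in the first place is where the real work lies.

```python
# Minimal sketch of the words-vs-tone comparison described above.
# Scores and topics are invented; the threshold is an arbitrary example.
def flag_discrepancy(text_score: float, tone_score: float,
                     threshold: float = 0.5) -> bool:
    """Flag a section where upbeat language is paired with a negative tone."""
    return text_score - tone_score > threshold

sections = {
    "chip supply outlook": (0.4, -0.3),   # positive words, negative tone
    "quarterly revenue":   (0.6,  0.5),   # words and tone agree
}

for topic, (text_s, tone_s) in sections.items():
    if flag_discrepancy(text_s, tone_s):
        print(f"Possible downplaying on: {topic}")
```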

The latter may have happened during the initial stages of the global microchip shortage, according to research carried out by Aiera, a financial event access and monitoring platform. In early 2021, IT executives used positive or neutral words to talk about chip supply issues, says the study. But their tone was markedly negative.

The findings suggest ‘the positive language used by the executives was incongruous with their true beliefs about the state of their business… supporting the idea that IT sector executives were downplaying these associated risks,’ write the authors.

Individual profiles

To help identify discrepancies in tone, Helios builds speech profiles for individual corporate leaders.

It even tries to distinguish between how people speak in different contexts – for example, on an earnings call as opposed to at an investment conference. Could such a detailed profile raise privacy issues?

‘As these are public events and corporate representatives, we don’t see it as an invasion of privacy but rather an enrichment of communication that can be done passively,’ says Austin.

‘The audio and tonal analysis platform does make personalized models to better understand individuals over time, which allows better analytics around the speech. We certainly have security and privacy laws in mind as we continue to advance what we’re doing for the world of financial services.’

The emergence of audio analysis gives companies another reason to consider machines, as well as humans, when crafting their message. Some issuers are already adapting their corporate reports to avoid falling foul of algorithms, according to one study.

Of course, public speakers – and those who train them – have always focused on delivery as much as content. As sentiment analysis extends to audio, however, matching your tone to your words will become even more important.

This article originally appeared in the Spring 2022 issue of IR Magazine.
