A multimodal sentiment classifier for financial decision making

Todd, Andrew and Bowden, James and Cummins, Mark and Su, Yang (2025) A multimodal sentiment classifier for financial decision making. International Review of Financial Analysis. ISSN 1057-5219 (https://doi.org/10.1016/j.irfa.2025.104322)

Text. Filename: Todd-etal-2025-A-multimodal-sentiment-classifier-for-financial-decision-making.pdf
Accepted Author Manuscript
Restricted to Repository staff only until 1 January 2099.


Abstract

This study pioneers a multimodal approach to financial sentiment analysis, integrating audio and textual data to enhance predictive accuracy. Motivated by the underutilisation of paralinguistic features and deep learning techniques in financial sentiment analysis, we introduce a novel deep learning-enabled multimodal classifier trained on corporate earnings calls from a subset of S&P 100 constituents. Our methodology incorporates FinBERT, a financial variant of Bidirectional Encoder Representations from Transformers (BERT), alongside paralinguistic features and a deep learning classifier. Comparative analysis against established sentiment analysis methods, including dictionary approaches and machine learning models, suggests that our multimodal classifier achieves improved out-of-sample accuracy. Specifically, the inclusion of paralinguistic characteristics improves sentiment detection accuracy. Our research provides nuanced insights into sentiment detection for different speakers (managers and analysts) during both the management discussion and Q&A sections of corporate earnings calls. Combined, our results suggest that multimodal sentiment classification can deepen our understanding of the interplay between sentiment and market characteristics.
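The fusion of textual and paralinguistic inputs described above can be illustrated with a minimal, hypothetical sketch. This is not the paper's architecture: the feature names, dimensions, and single logistic unit are placeholder assumptions standing in for a FinBERT sentence embedding, extracted audio descriptors, and the deep classifier head.

```python
import math
import random

random.seed(0)


def fuse_features(text_embedding, paralinguistic):
    """Early fusion: concatenate the text and audio feature vectors.

    A real pipeline would use a FinBERT [CLS] embedding (768-dim) and
    audio descriptors such as pitch, intensity, and speech rate; the
    short mock vectors below are purely illustrative.
    """
    return text_embedding + paralinguistic


def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))


def predict_sentiment(features, weights, bias):
    """A single logistic unit standing in for the deep classifier head."""
    z = sum(w * x for w, x in zip(weights, features)) + bias
    return sigmoid(z)  # probability of positive sentiment


# Mock inputs: a 4-dim stand-in for a text embedding and three
# hypothetical paralinguistic descriptors.
text_embedding = [0.12, -0.35, 0.08, 0.44]
paralinguistic = [0.9, 0.2, -0.1]

fused = fuse_features(text_embedding, paralinguistic)
weights = [random.uniform(-0.5, 0.5) for _ in fused]
score = predict_sentiment(fused, weights, bias=0.0)
print(len(fused), 0.0 < score < 1.0)
```

The design choice sketched here is early (feature-level) fusion; multimodal classifiers can equally use late fusion, where separate text and audio models are trained and their predictions combined.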

ORCID iDs

Todd, Andrew; Bowden, James (ORCID: https://orcid.org/0000-0002-0419-1882); Cummins, Mark (ORCID: https://orcid.org/0000-0002-3539-8843); Su, Yang