Communication is an interactive social act combining multimodal signals and actions, rooted in cognitive and social foundations with potentially deep evolutionary origins. Historically, research on animal communication has focused either on monoagent, unimodal sequence organizations or on signal onsets, excluding overlaps and silences; this risks missing relevant communicative information and limits interspecific comparisons. Researchers increasingly call for multimodal and interactive approaches, including the integration of actions alongside communicative signals. However, no tool has been developed to fully process such complex data. In this study, we propose a new processing framework equipped with an open-access tool (‘Multi-interaction’) that aims to simplify the study of multimodal communicative exchanges and cross-species comparisons. First, this approach accounts for intraindividual and interindividual overlaps occurring during any communicative sequence by transcribing overlapping units together rather than as separate elements. It also allows the flexible categorization of units, enabling researchers to classify species-specific elements into broader, species-general categories and to adapt analyses to any annotated data set, modality or granularity level. Second, the tool transcribes each annotated multimodal interaction into a unidimensional discrete sequence that preserves the types of units making up the interaction (including silences), their order, their emitter and their overlap. Third, the approach automatically extracts quantitative variables describing sequence properties, such as overall duration, number of units, diversity of units, proportion of interindividual or intraindividual overlap, and the proportion contributed by each individual during the interaction. It also allows researchers to automatically build co-occurrence matrices of units (i.e. unit associations) that can capture intraindividual and interindividual associations, either sequentially or in overlaps. To our knowledge, this framework is the first to propose a generic method for quantitatively processing multimodal and multiagent information, including overlaps and silences, enabling the scientific community to handle large data sets and cross-species comparisons.

• Comparative studies lack consistent methods for multimodal interaction analysis.
• We introduce a unified framework for studying multimodal interactions.
• We propose new methods for annotating, processing and analysing interaction data.
• Interaction units, emitters, overlaps and silences are preserved throughout.
• This approach enables species-general comparisons of interactions.
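The pipeline described above (discretizing annotated units into a unidimensional sequence that preserves emitters, overlaps and silences, then extracting quantitative descriptors and overlap co-occurrences) can be sketched in a few lines of Python. This is a minimal illustrative sketch, not the Multi-interaction tool's actual API: the `Unit` record, the `transcribe`, `describe` and `overlap_pairs` functions, the 0.1 s time step and the `emitter:category` token format are all assumptions made for the example.

```python
from collections import Counter
from dataclasses import dataclass
from itertools import combinations

@dataclass
class Unit:
    emitter: str    # individual producing the unit (hypothetical labels)
    category: str   # species-general category, e.g. 'vocal', 'gesture'
    start: float    # onset in seconds
    end: float      # offset in seconds

def transcribe(units, step=0.1):
    """Discretize an annotated interaction into a unidimensional sequence.

    Each time step becomes one token: '_' for silence, a single
    'emitter:category' label, or a '+'-joined set of labels when units
    overlap (intra- or interindividually)."""
    n_steps = round(max(u.end for u in units) / step)
    seq = []
    for i in range(n_steps):
        t = i * step
        active = sorted(f"{u.emitter}:{u.category}"
                        for u in units if u.start <= t < u.end)
        seq.append("+".join(active) if active else "_")
    return seq

def describe(seq):
    """Extract simple quantitative descriptors from the discrete sequence."""
    n = len(seq)
    return {
        "n_steps": n,
        "prop_overlap": sum("+" in tok for tok in seq) / n,
        "prop_silence": sum(tok == "_" for tok in seq) / n,
        "distinct_tokens": len(set(seq)),
    }

def overlap_pairs(seq):
    """Count which unit labels co-occur within the same time step."""
    pairs = Counter()
    for tok in seq:
        for a, b in combinations(tok.split("+"), 2):
            pairs[(a, b)] += 1
    return pairs

# Toy interaction: A's vocalization overlaps A's own gesture
# (intraindividual), which in turn overlaps B's vocalization
# (interindividual); 0.8-0.9 s is silence.
units = [
    Unit("A", "vocal", 0.0, 0.4),
    Unit("A", "gesture", 0.2, 0.6),
    Unit("B", "vocal", 0.5, 0.8),
    Unit("B", "gesture", 0.9, 1.0),
]
seq = transcribe(units)
stats = describe(seq)
pairs = overlap_pairs(seq)
```

On this toy interaction, the sequence contains tokens such as `A:gesture+A:vocal` (intraindividual overlap), `A:gesture+B:vocal` (interindividual overlap) and `_` (silence), and the descriptors recover the proportion of overlapped and silent time steps; richer descriptors (individual contributions, sequential co-occurrences) would follow the same pattern.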