Do Males and Females Use Separate System Architectures in Multimodal Information Processing?

Research output: Chapter in Book/Published conference output › Conference publication

Abstract

This paper investigates whether males and females differ in the way they process multimodal information and whether they use different system architectures when combining speech and hand gestures. Hoping that the results will facilitate the design of multimodal systems, we conducted a set of experiments. First, 18 male and female participants were videotaped while describing two chairs using speech and hand gestures, and their video protocols were analysed. Hand gestures and their corresponding lexical affiliates were annotated for subsequent analysis using qualitative methods. Our previous empirical studies showed that females use gestures more frequently than males, whereas males prefer verbal communication when describing objects. Regarding the temporal alignment of gestures and their corresponding lexical affiliates, males and females show similar integration patterns. Although gestures precede the related keywords in the majority of cases, the time interval between the onset of a gesture stroke and the onset of the related keyword is shorter in females than in males. In a follow-up experiment, we investigated the brain activities of 7 males and 7 females using a 16-channel EEG headset (Emotiv). Our findings show that the beta spectral moment is stronger in females and that the change of the spectral moment from the alpha to the beta band is also more distinctive in females, which is consistent with the shorter temporal alignment of speech and hand gestures observed in females. Together, the qualitative and quantitative studies indicate that there are gender differences in information processing and that males and females appear to use different system architectures to process multimodal information. This may have implications for the development of gender-adaptive systems to increase the efficiency and acceptance of technology.
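
Note on the EEG features mentioned above: the abstract refers to alpha- and beta-band spectral moments. As a rough illustration only (not the authors' actual analysis pipeline), the sketch below shows one common way such per-band features can be estimated from a single EEG channel using Welch's power spectral density; the 128 Hz sampling rate, band limits, and all function and variable names are assumptions introduced here.

```python
import numpy as np
from scipy.signal import welch

# Illustrative only: band limits, sampling rate, and names are assumptions,
# not taken from the paper's analysis pipeline.
BANDS = {"alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def band_features(eeg_channel, fs=128.0):
    """Estimate band power and first spectral moment (mean frequency)
    for each band of a single EEG channel via Welch's PSD."""
    freqs, psd = welch(eeg_channel, fs=fs, nperseg=int(2 * fs))
    features = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        power = np.trapz(psd[mask], freqs[mask])                       # band power
        moment = np.sum(freqs[mask] * psd[mask]) / np.sum(psd[mask])   # 1st spectral moment
        features[name] = {"power": power, "mean_freq": moment}
    return features

if __name__ == "__main__":
    # Synthetic data standing in for one headset channel (10 s at an assumed 128 Hz)
    rng = np.random.default_rng(0)
    signal = rng.standard_normal(10 * 128)
    print(band_features(signal))
```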

Original language: English
Title of host publication: Proceedings of the 2024 16th International Conference on Computer and Automation Engineering (ICCAE)
Publisher: IEEE
Pages: 432-436
Number of pages: 5
ISBN (Electronic): 9798350370058
DOIs
Publication status: Published - 1 Jul 2024

Publication series

Name: 2024 16th International Conference on Computer and Automation Engineering, ICCAE 2024

Bibliographical note

Publisher Copyright:
© 2024 IEEE.

Keywords

  • gender differences
  • gesture recognition
  • information processing
  • multimodal systems
  • speech recognition
