Emotion detection from male speech in computer games

Tarashankar Rudra*, Manolya Kavakli, David Tien

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution



This paper presents experimental results from our method of classifying male speech as neutral or angry in artificial pidgin utterances using a Support Vector Machine (SVM). The objective of the paper is to demonstrate that a new genre of languages, called Game Pidgin Language (GPL), can be used not only for real-time speech recognition but also to generate responses from the non-player character (NPC). The detected emotion, represented as a floating-point value, can be used to generate the corresponding emotive response from the NPC.
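The abstract describes a pipeline in which an SVM classifies an utterance as neutral or angry and the resulting floating-point emotion value selects an NPC response. A minimal sketch of that idea follows; the feature set, weights, bias, and sigmoid mapping here are illustrative assumptions, not the paper's actual trained model.

```python
import math

def svm_decision(features, weights, bias):
    """Linear SVM decision function: w . x + b.
    Positive margins lean toward 'anger', negative toward 'neutral'."""
    return sum(w * x for w, x in zip(weights, features)) + bias

def emotion_value(features, weights, bias):
    """Map the SVM margin to a floating-point emotion value in (0, 1):
    values near 1 indicate anger, values near 0 indicate neutral."""
    return 1.0 / (1.0 + math.exp(-svm_decision(features, weights, bias)))

def npc_response(emotion):
    """Select an NPC emotive response from the continuous emotion value."""
    return "angry retort" if emotion > 0.5 else "calm reply"

# Hypothetical weights/bias (in practice learned by SVM training on GPL
# utterances) and acoustic features (pitch mean in Hz, energy, rate).
weights, bias = [0.02, 1.5, -0.3], -4.0
features = [210.0, 0.8, 1.2]   # e.g. raised pitch and energy
e = emotion_value(features, weights, bias)
print(round(e, 3), npc_response(e))
```

The continuous value, rather than a hard label, is what lets the game engine grade the NPC's reaction by intensity instead of switching between only two fixed behaviours.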

Original language: English
Title of host publication: TENCON 2007 - 2007 IEEE Region 10 Conference
Place of publication: Piscataway, New Jersey, USA
Number of pages: 4
Publication status: Published - 2007
Event: IEEE Region 10 Conference, TENCON 2007 - Taipei, Taiwan, Province of China
Duration: 30 Oct 2007 - 2 Nov 2007



Bibliographical note

Copyright 2007 IEEE. Reprinted from Proceedings of IEEE Region 10 Conference TENCON 2007. This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of Macquarie University’s products or services. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to pubs-permissions@ieee.org. By choosing to view this document, you agree to all provisions of the copyright laws protecting it.

