RMIT University

Varying microphone patterns for meeting speech segmentation using spatial audio cases

journal contribution
posted on 2024-11-01, 06:15 authored by Eva Cheng, Ian Burnett, Christian Ritz
Meetings, common to many business environments, generally involve stationary participants. Thus, participant location information can be used to segment meeting speech recordings into each speaker's 'turn'. The authors' previous work proposed using spatial audio cues to represent speaker locations. This paper studies the validity of using spatial audio cues for meeting speech segmentation by investigating the effect of varying the microphone pattern on the spatial cues. Experiments conducted on recordings of a real acoustic environment indicate that the relationship between speaker location and spatial audio cues depends strongly on the microphone pattern.

History

Related Materials

  1. DOI - Is published in 10.1007/11922162_26
  2. ISSN - Is published in 0302-9743

Journal

Lecture Notes in Computer Science

Start page

221

End page

228

Total pages

8

Publisher

Springer

Place published

Germany

Language

English

Copyright

© Springer-Verlag Berlin Heidelberg 2006.

Former Identifier

2006014344

Esploro creation date

2020-06-22

Fedora creation date

2013-02-11
