
Publications


2013
Automated Tonal Balance Enhancement for Audio Mastering Applications [Conference]

Stylianos Ioannis Mimilakis, Konstantinos Drossos, Andreas Floros, and Dionisios Katerelos, “Automated Tonal Balance Enhancement for Audio Mastering Applications”, in proceedings of the 134th Audio Engineering Society Convention, May 4–7, Rome, Italy, 2013

Modern audio mastering procedures involve the selective enhancement or attenuation of specific frequency bands, the main aim being the tonal enhancement of the original, unmastered audio material. This process is largely based on the musical information and mode of the audio material, which can be retrieved either by listening to the original stimuli or from the corresponding musical key notes. The current work presents an adaptive, automated equalization system that performs this mastering procedure based on a novel method of fundamental frequency tracking. The overall system is evaluated with objective PEAQ analysis and subjective listening tests in real mastering conditions.
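As a rough illustration of the general idea only (not the system described in the paper), the Python sketch below estimates a track's dominant fundamental frequency with pYIN and gently boosts narrow bands around it and its first harmonics; the file name, filter settings, and gain values are assumptions.

```python
# Illustrative sketch (not the authors' implementation): track the dominant
# fundamental frequency and lightly emphasise bands harmonically related to it,
# loosely mimicking a tonal-balance equalization step.
import numpy as np
import librosa
from scipy.signal import iirpeak, lfilter

y, sr = librosa.load("unmastered_mix.wav", sr=None, mono=True)  # hypothetical file

# Frame-wise f0 tracking (pYIN); unvoiced frames come back as NaN.
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
)
tonic_hz = float(np.nanmedian(f0))            # crude estimate of the tonal centre
assert np.isfinite(tonic_hz), "no voiced frames found"

# Boost narrow bands at the tonic and its first two harmonics by mixing a
# band-passed copy back into the signal (a simple stand-in for a peaking EQ).
out = y.copy()
for harmonic in (1, 2, 3):
    centre = tonic_hz * harmonic
    if centre >= sr / 2:
        break
    b, a = iirpeak(centre, Q=4.0, fs=sr)      # narrow band-pass around the centre
    out += 0.15 * lfilter(b, a, y)            # gentle (~1 dB) emphasis of that band

out /= np.max(np.abs(out))                    # normalise to avoid clipping
```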

Gestural User Interface for Audio Multitrack Real-time Stereo Mixing [Conference]

Konstantinos Drossos, Konstantinos Koukoudis, and Andreas Floros, “Gestural User Interface for Audio Multitrack Real-time Stereo Mixing”, in proceedings of the 8th Conference on Interaction with Sound - Audio Mostly 2013, Sep. 18–20, Piteå, Sweden, 2013

Sound mixing is a well-established task applied (directly or indirectly) in many fields of music and sound production. For example, the conductors of classical music orchestras perform sound mixing by specifying the reproduction gain of specific groups of musical instruments or of the entire orchestra. Modern sound artists and performers also employ sound mixing when they compose music or improvise in real time. In this work a system is presented that incorporates a gestural interface for real-time multitrack sound mixing. The proposed gestural sound-mixing control scheme is implemented on an open-hardware microcontroller board using common sensor modules, and the gestures employed are kept as close as possible to those used by orchestra conductors. The overall performance of the system is also evaluated in terms of the achieved user experience through subjective tests.
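A minimal sketch of the kind of gesture-to-gain mapping such a system requires, assuming tilt-style sensor readings per track; the scaling, panning law, and example data are hypothetical and are not the paper's firmware.

```python
# Hypothetical sketch: map a raw tilt reading per track to a smoothed linear
# gain and mix the mono stems down to stereo.
import numpy as np

def tilt_to_gain(tilt_deg: float) -> float:
    """Map a hand/baton tilt of -45..+45 degrees to a gain in [0, 1]."""
    return float(np.clip((tilt_deg + 45.0) / 90.0, 0.0, 1.0))

def mix_stereo(tracks: np.ndarray, gains, pans) -> np.ndarray:
    """tracks: (n_tracks, n_samples) mono stems; gains and pans per track in [0, 1]."""
    left = np.zeros(tracks.shape[1])
    right = np.zeros(tracks.shape[1])
    for stem, g, p in zip(tracks, gains, pans):
        left += g * (1.0 - p) * stem       # simple linear (not constant-power) pan
        right += g * p * stem
    return np.stack([left, right])

# Example: two stems, conductor-style gestures raising track 0 and lowering track 1.
stems = np.random.randn(2, 48000) * 0.1
gains = [tilt_to_gain(+20.0), tilt_to_gain(-30.0)]
stereo = mix_stereo(stems, gains, pans=[0.3, 0.7])
```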

Investigating Auditory Human-Machine Interaction: Analysis and Classification of Sounds Commonly Used by Consumer Devices [Conference]

Konstantinos Drossos, Rigas Kotsakis, Panos Pappas, George Kalliris, and Andreas Floros, “Investigating Auditory Human-Machine Interaction: Analysis and Classification of Sounds Commonly Used by Consumer Devices”, in proceedings of the 134th Audio Engineering Society Convention, May 4–7, Rome, Italy, 2013

Many common consumer devices use a short sound indication to declare various modes of their functionality, such as the start and the end of their operation. This is likely to result in an intuitive auditory human-machine interaction that attributes semantic content to the sounds used. In this paper we investigate sound patterns mapped to "Start" and "End" of operation manifestations and explore whether the perception of such semantics is based on users' prior auditory training or on sound patterns that naturally convey the appropriate information. To this end, listening and machine learning tests were conducted. The obtained results indicate a strong relation between acoustic cues and semantics, along with no need for prior knowledge for the message to be conveyed.
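One plausible feature-plus-classifier pipeline for the machine learning part is sketched below (MFCC statistics with an SVM); it is not the authors' exact feature set or classifier, and the file lists are hypothetical.

```python
# Illustrative sketch: summarise each "Start"/"End" clip with MFCC statistics
# and estimate how separable the two classes are via cross-validated accuracy.
import numpy as np
import librosa
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def clip_features(path: str) -> np.ndarray:
    y, sr = librosa.load(path, sr=None, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    # Per-coefficient mean and standard deviation as a fixed-length clip summary.
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

start_clips = ["start_01.wav", "start_02.wav"]   # hypothetical "Start" sounds
end_clips = ["end_01.wav", "end_02.wav"]         # hypothetical "End" sounds

X = np.stack([clip_features(p) for p in start_clips + end_clips])
labels = np.array([0] * len(start_clips) + [1] * len(end_clips))

scores = cross_val_score(SVC(kernel="rbf"), X, labels, cv=2)
print("mean accuracy:", scores.mean())
```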

Sound Events and Emotions: Investigating the Relation of Rhythmic Characteristics and Arousal [Conference]

Konstantinos Drossos, Rigas Kotsakis, George Kalliris, and Andreas Floros, “Sound Events and Emotions: Investigating the Relation of Rhythmic Characteristics and Arousal”, in proceedings of the 4th IEEE International Conference on Information, Intelligence, Systems and Applications (IISA 2013), Jul. 10–12, Piraeus, Greece, 2013

A variety of recent studies in Audio Emotion Recognition (AER) report high performance and retrieval accuracy. However, in most works music is considered as the original sound content that conveys the identified emotions, and rhythm-related acoustic cues are found to be one of the fundamental musical means of conveying them. Although music is an important aspect of everyday life, humans are surrounded by numerous non-linguistic and non-musical sounds, generally defined as sound events (SEs). Despite the impact of SEs on humans, investigations of AER from SEs remain scarce; only a few recent studies address SEs and AER, presenting a semantic connection between an SE and the emotion triggered in the listener. In this work we analytically investigate the connection between rhythm-related characteristics of a wide range of common SEs with semantic content and the arousal of the listener. To this end, several feature evaluation and classification tasks are conducted using different ranking and classification algorithms. High accuracy results are obtained, demonstrating a significant relation between the rhythmic characteristics of SEs and the elicited arousal.
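A minimal sketch, assuming librosa and scikit-learn, of extracting a few rhythm-related descriptors from sound events and ranking them against a binary arousal label; the descriptors, files, and labels are placeholders rather than the paper's feature set or ranking algorithms.

```python
# Illustrative sketch: rhythm-related descriptors per sound event, ranked by
# importance with respect to a high/low arousal label.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def rhythm_features(path: str) -> np.ndarray:
    y, sr = librosa.load(path, sr=None, mono=True)
    onset_env = librosa.onset.onset_strength(y=y, sr=sr)
    tempo, _ = librosa.beat.beat_track(onset_envelope=onset_env, sr=sr)
    onsets = librosa.onset.onset_detect(onset_envelope=onset_env, sr=sr)
    onset_rate = len(onsets) / (len(y) / sr)            # onsets per second
    return np.array([float(tempo), onset_rate, onset_env.mean(), onset_env.std()])

paths = ["alarm.wav", "rain.wav"]                        # hypothetical sound events
labels = np.array([1, 0])                                # 1 = high arousal, 0 = low

X = np.stack([rhythm_features(p) for p in paths])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(dict(zip(["tempo", "onset_rate", "flux_mean", "flux_std"],
               clf.feature_importances_)))
```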

