The nature and perception of fluctuations in human musical rhythms (Hennig et al., 2011)
- Document ID
- 4689056106333214367
- Authors
- Hennig H
- Fleischmann R
- Fredebohm A
- Hagmayer Y
- Nagler J
- Witt A
- Theis F
- Geisel T
- Publication year
- 2011
- Publication venue
- PLoS ONE
Snippet
Although human musical performances represent one of the most valuable achievements of mankind, the best musicians perform imperfectly. Musical rhythms are not entirely accurate and thus inevitably deviate from the ideal beat pattern. Nevertheless, computer generated …
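The paper's central result (stated here from the published article, not from the truncated snippet above) is that these timing deviations are not uncorrelated jitter but show long-range, 1/f-type correlations, and that computer-generated rhythms "humanized" with such correlated deviations are preferred by listeners over ones humanized with white noise. The sketch below illustrates the two kinds of deviation; it is not the authors' code, and the function names, the spectral-synthesis approach, and the 10 ms deviation scale are assumptions made for illustration only.

```python
import numpy as np

def one_over_f_noise(n, alpha=1.0, rng=None):
    """Spectral synthesis of 1/f^alpha noise, normalized to unit variance.
    Illustrative only; not the method used in the paper."""
    rng = np.random.default_rng() if rng is None else rng
    freqs = np.fft.rfftfreq(n)                      # 0, 1/n, 2/n, ...
    amps = np.zeros_like(freqs)
    amps[1:] = freqs[1:] ** (-alpha / 2.0)          # power spectrum ~ 1/f^alpha
    phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
    noise = np.fft.irfft(amps * np.exp(1j * phases), n)
    return noise / noise.std()

def humanize(beat_times_s, sigma_s=0.010, correlated=True, alpha=1.0, rng=None):
    """Add timing deviations (in seconds) to nominal beat onsets.

    correlated=True adds long-range correlated (1/f-type) deviations,
    correlated=False adds uncorrelated Gaussian ("white") deviations.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(beat_times_s)
    dev = sigma_s * (one_over_f_noise(n, alpha, rng) if correlated
                     else rng.standard_normal(n))
    return np.asarray(beat_times_s) + dev

# 64 beats at 120 BPM (nominal inter-beat interval of 0.5 s)
nominal = np.arange(64) * 0.5
print(humanize(nominal, correlated=True)[:8])
```

Plotting the resulting inter-beat intervals for the two settings shows the qualitative difference: the correlated version drifts slowly around the grid, while the white-noise version scatters around it independently from beat to beat.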
Classifications
- G—PHYSICS
  - G10—MUSICAL INSTRUMENTS; ACOUSTICS
    - G10H—ELECTROPHONIC MUSICAL INSTRUMENTS
      - G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
        - G10H2210/031—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F19/00—Digital computing or data processing equipment or methods, specially adapted for specific applications
        - G06F19/30—Medical informatics, i.e. computer-based analysis or dissemination of patient or disease data
          - G06F19/34—Computer-assisted medical diagnosis or treatment, e.g. computerised prescription or delivery of medication or diets, computerised local control of medical devices, medical expert systems or telemedicine
- G—PHYSICS
  - G10—MUSICAL INSTRUMENTS; ACOUSTICS
    - G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
      - G10L17/00—Speaker identification or verification
        - G10L17/26—Recognition of special voice characteristics, e.g. for use in lie detectors; Recognition of animal voices
- A—HUMAN NECESSITIES
  - A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
    - A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
      - A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
        - A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
          - A61B5/7235—Details of waveform analysis
Similar Documents
Publication | Title
---|---
Hennig et al. | The nature and perception of fluctuations in human musical rhythms
Loehr et al. | Temporal coordination between performing musicians
Chang et al. | Body sway reflects leadership in joint music performance
Chemin et al. | Body movement selectively shapes the neural representation of musical rhythms
Frühauf et al. | Music on the timing grid: The influence of microtiming on the perceived groove quality of a simple drum pattern performance
Hennig | Synchronization in human musical rhythms and mutually interacting complex systems
Czepiel et al. | Synchrony in the periphery: inter-subject correlation of physiological responses during live music concerts
Fiveash et al. | You got rhythm, or more: The multidimensionality of rhythmic abilities
Ellamil et al. | One in the dance: musical correlates of group synchrony in a real-world club environment
Goldman et al. | Improvisation experience predicts how musicians categorize musical structures
Eerola et al. | Shared periodic performer movements coordinate interactions in duo improvisations
Henry et al. | What can we learn about beat perception by comparing brain signals and stimulus envelopes?
Shoda et al. | How live performance moves the human heart
Royal et al. | Activation in the right inferior parietal lobule reflects the representation of musical structure beyond simple pitch discrimination
Räsänen et al. | Fluctuations of hi-hat timing and dynamics in a virtuoso drum track of a popular music recording
Krishnan et al. | Beatboxers and guitarists engage sensorimotor regions selectively when listening to the instruments they can play
Jakubowski et al. | Probing imagined tempo for music: Effects of motor engagement and musical experience
Cohrdes et al. | “The sound of affect”: Age differences in perceiving valence and arousal in music and their relation to music characteristics and momentary mood
Weineck et al. | Neural synchronization is strongest to the spectral flux of slow music and depends on familiarity and beat salience
Kim et al. | Musical social entrainment
Høffding et al. | Into the hive-mind: Shared absorption and cardiac interrelations in expert and student string quartets
Lee et al. | Dance and music in “Gangnam Style”: How dance observation affects meter perception
Miranda | Plymouth brain-computer music interfacing project: from EEG audio mixers to composition informed by cognitive neuroscience
Dalla Bella et al. | Unravelling individual rhythmic abilities using machine learning
Hoesl et al. | Modelling perceived syncopation in popular music drum patterns: A preliminary study