Please use this identifier to cite or link to this item: http://20.193.157.4:9595/xmlui/handle/123456789/2538
Title: Development Of A Tool To Objectively Identify Normal Human Voice
Authors: Lathadevi, Hassan Thotappa
Keywords: Voice
Issue Date: Dec-2019
Publisher: BLDE(Deemed to be University)
Abstract: Acoustic analysis is used to assist differential diagnosis, documentation and evaluation of treatment for voice disorders. Clinical data have shown that jitter, shimmer, mean pitch and harmonic-to-noise ratio (HNR) are indices of voice pathology. A voice with some periodicity can now be analysed with a computerised acoustic analyser, a relatively new technique that can be widely used in clinical practice. Objectives: To create a database of normal voices, analyse and identify different parameters of these voices, and thereby identify benchmarks of normal voices. Materials and Methods: Voice samples of a sustained vowel /a/ were collected from 458 normal males and 542 normal females aged 18 to 28 years, then recorded and analysed using the freely downloadable software PRAAT. Jitter, shimmer, harmonic-to-noise ratio and mean pitch were derived, and the mean, SD and range of each parameter were calculated. Results: In males the mean values were pitch 137.05 Hz, jitter 0.011, shimmer 0.08 and HNR 20.48 dB. In females they were pitch 234.27 Hz, jitter 0.01, shimmer 0.08 and HNR 21.73 dB. Conclusion: Voices can be objectively analysed using acoustic parameters such as mean pitch, jitter, shimmer and harmonic-to-noise ratio. A large database yields more reliable normative parameters. Institutions should develop their own standard protocol for the selection of subjects, the recording of voices and their analysis.
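For illustration only, the four acoustic parameters named in the abstract can be derived from a sustained /a/ recording through PRAAT's scripting interface. The minimal sketch below assumes the praat-parselmouth Python bindings, a hypothetical file name sample_a.wav, and a pitch search range of 75-500 Hz; it is not the recording or analysis protocol used in the thesis.

# Minimal sketch: mean pitch, jitter, shimmer and HNR from a sustained /a/
# recording via praat-parselmouth. The file name and the 75-500 Hz pitch
# range are assumptions, not values taken from the thesis protocol.
import parselmouth
from parselmouth.praat import call

snd = parselmouth.Sound("sample_a.wav")

# Mean pitch (fundamental frequency) in Hz
pitch = snd.to_pitch()
mean_pitch_hz = call(pitch, "Get mean", 0, 0, "Hertz")

# Glottal pulse marks are needed before jitter and shimmer can be queried
point_process = call(snd, "To PointProcess (periodic, cc)", 75, 500)
jitter_local = call(point_process, "Get jitter (local)",
                    0, 0, 0.0001, 0.02, 1.3)
shimmer_local = call([snd, point_process], "Get shimmer (local)",
                     0, 0, 0.0001, 0.02, 1.3, 1.6)

# Harmonic-to-noise ratio in dB
harmonicity = call(snd, "To Harmonicity (cc)", 0.01, 75, 0.1, 1.0)
hnr_db = call(harmonicity, "Get mean", 0, 0)

print(f"Mean pitch: {mean_pitch_hz:.2f} Hz")
print(f"Jitter (local): {jitter_local:.4f}")
print(f"Shimmer (local): {shimmer_local:.4f}")
print(f"HNR: {hnr_db:.2f} dB")

The numeric arguments to the jitter and shimmer queries (0.0001, 0.02, 1.3, 1.6) are PRAAT's default period and amplitude-factor settings; a study database would fix these in its own protocol.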
URI: http://hdl.handle.net/123456789/2538
Appears in Collections: Department of ENT

Files in This Item:
File: PhD.Dr. LATHADEVI H T-ENT.pdf
Size: 20.05 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.