(Dis)agreement on Sight-Singing Assessment of Undergraduate Musicians
Principal author: | Bortz, Graziela [UNESP] |
---|---|
Publication date: | 2018 |
Other authors: | Germano, Nayana G. [UNESP]; Cogo-Moreira, Hugo |
Document type: | Article |
Language: | eng |
Source title: | Repositório Institucional da UNESP |
Full text: | http://dx.doi.org/10.3389/fpsyg.2018.00837 http://hdl.handle.net/11449/166168 |
Abstract: | Assessment criteria for sight-singing abilities are similar to those used to judge music performances across music school programs. However, little evidence of agreement among judges has been provided in the literature. Fifty of 152 participants were randomly selected and blindly assessed by three judges, who evaluated students based on given criteria. Participants were recorded while sight-singing 19 intervals and 10 tonal melodies. Inter-judge agreement on melodic sight-singing was tested on four items in a five-point Likert scale format: (1) intonation and pitch accuracy; (2) tonal sense and memory; (3) rhythmic precision, regularity of pulse and subdivisions; (4) fluency and musical direction. Intervals were scored on a three-point Likert scale. Agreement was assessed using weighted kappa. For melodic sight-singing across the ten tonal melodies, the average weighted kappas (kappa(w)) were: kappa 1(w) = 0.296, kappa 2(w) = 0.487, kappa 3(w) = 0.224, and kappa 4(w) = 0.244, ranging from fair to moderate. For intervals, the lowest agreement was kappa = 0.406 and the highest was kappa = 0.792 (average kappa = 0.637). These findings shed light on the validity and reliability of models that have been taken for granted in assessing music performance in auditions and contests, and illustrate the need to better discuss evaluation criteria. |
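The abstract's central statistic, Cohen's weighted kappa, can be sketched in a few lines of Python. Everything below is illustrative only: the `weighted_kappa` helper and the two judges' scores are invented for this sketch, and linear disagreement weights are assumed (this record does not state which weighting scheme the study used).

```python
def weighted_kappa(rater_a, rater_b, categories, weights="linear"):
    """Cohen's weighted kappa for two raters over ordinal categories.

    kappa_w = 1 - (sum w_ij * O_ij) / (sum w_ij * E_ij), where O is the
    observed joint rating distribution, E is the chance-expected one built
    from the marginals, and w_ij is the disagreement weight between
    categories i and j.
    """
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(rater_a)
    # Observed joint distribution of (rater_a, rater_b) scores.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater_a, rater_b):
        obs[idx[a]][idx[b]] += 1.0 / n
    # Marginal distribution of each rater.
    pa = [sum(row) for row in obs]
    pb = [sum(obs[i][j] for i in range(k)) for j in range(k)]

    # Disagreement weight: linear |i - j| / (k - 1), or its square.
    def w(i, j):
        d = abs(i - j) / (k - 1)
        return d if weights == "linear" else d * d

    observed = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    expected = sum(w(i, j) * pa[i] * pb[j] for i in range(k) for j in range(k))
    return 1.0 - observed / expected

# Hypothetical five-point Likert scores from two judges on ten melodies.
judge_1 = [3, 4, 2, 5, 3, 4, 1, 2, 3, 5]
judge_2 = [3, 3, 2, 4, 4, 4, 2, 2, 3, 5]
kappa = weighted_kappa(judge_1, judge_2, categories=[1, 2, 3, 4, 5])
```

With three judges, as in the study, agreement is typically reported pairwise or via an extension such as Fleiss' kappa; the sketch above covers only the two-rater case.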
id |
UNSP_36ed5674ca0f9703cf29302270bf14a6 |
oai_identifier_str |
oai:repositorio.unesp.br:11449/166168 |
network_acronym_str |
UNSP |
network_name_str |
Repositório Institucional da UNESP |
repository_id_str |
2946 |
spelling
(Dis)agreement on Sight-Singing Assessment of Undergraduate Musicians
Keywords: sight-singing assessment; inter-judge validity and reliability; music performance assessment; conservatoire training; music evaluation
Funding: Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP), grant 2014/03322-1
Affiliations: Univ Estadual Paulista, Dept Mus, Sao Paulo, Brazil; Univ Fed Sao Paulo, Dept Psychiat, Sao Paulo, Brazil
Institutions: Universidade Estadual Paulista (Unesp); Universidade Federal de São Paulo (UNIFESP)
Authors: Bortz, Graziela [UNESP]; Germano, Nayana G. [UNESP]; Cogo-Moreira, Hugo
Publisher: Frontiers Media Sa
Dates: deposited 2018-11-29T17:13:39Z; published 2018-05-29
Type: info:eu-repo/semantics/publishedVersion; info:eu-repo/semantics/article
Format: 9 p.; application/pdf
Citation: Frontiers In Psychology. Lausanne: Frontiers Media Sa, v. 9, 9 p., 2018. ISSN 1664-1078
Identifiers: http://dx.doi.org/10.3389/fpsyg.2018.00837; http://hdl.handle.net/11449/166168; DOI 10.3389/fpsyg.2018.00837; WOS:000433394300001; WOS000433394300001.pdf
Source: Web of Science; reponame: Repositório Institucional da UNESP; instname: Universidade Estadual Paulista (UNESP); instacron: UNESP
Language: eng
Relation: Frontiers In Psychology
Rights: info:eu-repo/semantics/openAccess
Last update: 2023-12-19T06:21:18Z (oai:repositorio.unesp.br:11449/166168, opendoar:2946)
dc.title.none.fl_str_mv |
(Dis)agreement on Sight-Singing Assessment of Undergraduate Musicians |
author |
Bortz, Graziela [UNESP] |
author_facet |
Bortz, Graziela [UNESP] Germano, Nayana G. [UNESP] Cogo-Moreira, Hugo |
author_role |
author |
author2 |
Germano, Nayana G. [UNESP] Cogo-Moreira, Hugo |
author2_role |
author author |
dc.contributor.none.fl_str_mv |
Universidade Estadual Paulista (Unesp) Universidade Federal de São Paulo (UNIFESP) |
dc.contributor.author.fl_str_mv |
Bortz, Graziela [UNESP] Germano, Nayana G. [UNESP] Cogo-Moreira, Hugo |
dc.subject.por.fl_str_mv |
sight-singing assessment inter-judge validity and reliability music performance assessment conservatoire training music evaluation |
topic |
sight-singing assessment inter-judge validity and reliability music performance assessment conservatoire training music evaluation |
publishDate |
2018 |
dc.date.none.fl_str_mv |
2018-11-29T17:13:39Z 2018-11-29T17:13:39Z 2018-05-29 |
dc.type.status.fl_str_mv |
info:eu-repo/semantics/publishedVersion |
dc.type.driver.fl_str_mv |
info:eu-repo/semantics/article |
format |
article |
status_str |
publishedVersion |
dc.identifier.uri.fl_str_mv |
http://dx.doi.org/10.3389/fpsyg.2018.00837 Frontiers In Psychology. Lausanne: Frontiers Media Sa, v. 9, 9 p., 2018. 1664-1078 http://hdl.handle.net/11449/166168 10.3389/fpsyg.2018.00837 WOS:000433394300001 WOS000433394300001.pdf |
url |
http://dx.doi.org/10.3389/fpsyg.2018.00837 http://hdl.handle.net/11449/166168 |
identifier_str_mv |
Frontiers In Psychology. Lausanne: Frontiers Media Sa, v. 9, 9 p., 2018. 1664-1078 10.3389/fpsyg.2018.00837 WOS:000433394300001 WOS000433394300001.pdf |
dc.language.iso.fl_str_mv |
eng |
language |
eng |
dc.relation.none.fl_str_mv |
Frontiers In Psychology |
dc.rights.driver.fl_str_mv |
info:eu-repo/semantics/openAccess |
eu_rights_str_mv |
openAccess |
dc.format.none.fl_str_mv |
9 application/pdf |
dc.publisher.none.fl_str_mv |
Frontiers Media Sa |
publisher.none.fl_str_mv |
Frontiers Media Sa |
dc.source.none.fl_str_mv |
Web of Science reponame:Repositório Institucional da UNESP instname:Universidade Estadual Paulista (UNESP) instacron:UNESP |
instname_str |
Universidade Estadual Paulista (UNESP) |
instacron_str |
UNESP |
institution |
UNESP |
reponame_str |
Repositório Institucional da UNESP |
collection |
Repositório Institucional da UNESP |
repository.name.fl_str_mv |
Repositório Institucional da UNESP - Universidade Estadual Paulista (UNESP) |
repository.mail.fl_str_mv |
|
_version_ |
1799965336109318144 |