Algorithmic Arias
Navigating the Melodic Maze of Music and Mind
DOI: https://doi.org/10.47611/jsrhs.v14i1.8910
Keywords: Affective Algorithmic Composition, Facial recognition, Music therapy, Personalized medicine, Real-time music generation, AAC, Emotional response to music, Mental health, Pain management, Substance abuse recovery, Music technology, Personalized music
Abstract
Music has a profound impact on emotional well-being, and music therapy has proven effective in a range of healthcare settings. Traditional music therapy, however, offers little real-time personalization. This research explores personalized music therapy that combines Affective Algorithmic Composition (AAC) with facial recognition technology, using AAC to generate music tailored to an individual's emotional state in real time. Existing research confirms the strong connection between music and the brain and the clinical effectiveness of music therapy, while facial recognition provides a cost-effective and versatile way to estimate a listener's emotional state. As the field of AAC grows, understanding user preferences and the factors that shape them is crucial for developing successful real-time personalized music experiences. To gain insight into these factors, a survey (n=160) collected data on demographics, music listening habits, and influences on musical preference across a range of ages. Although teenagers (14-15 years old) made up a large share of the sample, responses from all age groups contributed to a picture of music preferences and their potential connection to emotion across the lifespan. Future studies will focus on refining emotion detection algorithms, optimizing AAC for real-time music generation, and conducting clinical trials to evaluate the effectiveness of this approach.
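To make the kind of pipeline described above concrete, the sketch below shows one way a facial-expression-driven AAC loop could be structured: an emotion estimate is mapped to coarse musical parameters (tempo, mode, dynamics) that a generator consumes before each short phrase. This is a minimal illustration only; the function names (estimate_emotion, render_phrase), the emotion-to-parameter mapping, and the loop timing are assumptions for exposition, not the system evaluated in this paper.

```python
# Illustrative sketch of a real-time AAC loop driven by facial emotion estimates.
# estimate_emotion() and render_phrase() are hypothetical placeholders standing in
# for a facial-expression classifier and an algorithmic music generator.
import random
import time

# Assumed mapping from a detected emotion label to coarse musical parameters.
EMOTION_TO_PARAMS = {
    "happy":   {"tempo_bpm": 120, "mode": "major", "dynamics": "mf"},
    "sad":     {"tempo_bpm": 70,  "mode": "minor", "dynamics": "p"},
    "angry":   {"tempo_bpm": 140, "mode": "minor", "dynamics": "f"},
    "neutral": {"tempo_bpm": 96,  "mode": "major", "dynamics": "mp"},
}

def estimate_emotion() -> str:
    """Placeholder for a facial-expression classifier reading a camera frame."""
    return random.choice(list(EMOTION_TO_PARAMS))

def render_phrase(params: dict) -> None:
    """Placeholder for an AAC engine generating the next short musical phrase."""
    print(f"Generating phrase at {params['tempo_bpm']} BPM, "
          f"{params['mode']} mode, dynamics {params['dynamics']}")

def run_session(cycles: int = 4, phrase_seconds: float = 2.0) -> None:
    """Re-estimate emotion before each phrase so the music tracks the listener."""
    for _ in range(cycles):
        emotion = estimate_emotion()
        render_phrase(EMOTION_TO_PARAMS[emotion])
        time.sleep(phrase_seconds)  # in a real system, phrase playback runs here

if __name__ == "__main__":
    run_session()
```

Re-estimating emotion once per phrase rather than per video frame is one simple way to keep the generated music stable while still responding to changes in the listener's expression.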
References or Bibliography
Agres, K. R., Dash, A., & Chua, P. (2023, April). AffectMachine-Classical: A novel system for generating affective classical music. arXiv:2304.04915. https://arxiv.org/pdf/2304.04915.pdf
Athavle, M., Mudale, D., Shrivastav, U., & Gupta, M. (2021). Music recommendation based on face emotion recognition. Journal of Informatics Electrical and Electronics Engineering (JIEEE), 2(2), 1–11. https://doi.org/10.54060/jieee/002.02.018
Bharucha, J. J., & Stoeckig, K. (1986). Reaction time and musical expectancy: Priming of chords. Journal of Experimental Psychology: Human Perception and Performance, 12(4), 403–410. https://doi.org/10.1037/0096-1523.12.4.403
Bharucha, J. J. (1987). Music Cognition and Perceptual Facilitation: a Connectionist framework. Music Perception, 5(1), 1–30. https://doi.org/10.2307/40285384
Bian, W., Song, Y., Gu, N., Chan, T. Y., Lo, T., Li, T., Wong, K. C. K., Xue, W., & Trillo, R. A. (2023). MoMusic: a Motion-Driven Human-AI collaborative music composition and performing system. Proceedings of the AAAI Conference on Artificial Intelligence, 37(13), 16057–16062. https://doi.org/10.1609/aaai.v37i13.26907
Biasutti, M. (2015, May). Pedagogical applications of cognitive research on musical improvisation. Frontiers in Psychology. https://www.frontiersin.org/articles/10.3389/fpsyg.2015.00614/full
Caputo, A., Kletenik, D., & Steinberg, J. (2021). A study on the perception of Algorithmic Composition music. https://www.semanticscholar.org/paper/A-Study-on-the-Perception-of-Algorithmic-Music-Caputo-Kletenik/3a4288cdf06a774ae84e2df16ac3d8d96e13e0a5
Carraturo, G., Pando‐Naude, V., Costa, M., Vuust, P., Bonetti, L., & Brattico, E. (2023). The major-minor mode dichotomy in music perception: A systematic review on its behavioural, physiological, and clinical correlates. bioRxiv (Cold Spring Harbor Laboratory). https://doi.org/10.1101/2023.03.16.532764
Chlan, L. L., Heiderscheit, A., Skaar, D. J., & Neidecker, M. V. (2018). Economic evaluation of a patient-directed music intervention for ICU patients receiving mechanical ventilatory support. Critical Care Medicine, 46(9), 1430–1435. https://doi.org/10.1097/ccm.0000000000003199
Chung, J., & Vercoe, G. S. (2006, April). The affective remixer: Personalized music arranging. CHI '06 Extended Abstracts on Human Factors in Computing Systems, 393–398. https://dl.acm.org/doi/abs/10.1145/1125451.1125535
Edwards, E., St Hillaire-Clarke, C., Frankowski, D. W., Finkelstein, R., Cheever, T. R., Chen, W. G., Onken, L. S., Poremba, A., Riddle, R., Schloesser, D., Burgdorf, C. E., Wells, N., Fleming, R., & Collins, F. S. (2023). NIH Music-Based Intervention Toolkit: Music-Based Interventions for Brain Disorders of Aging. Neurology, 100(18), 868–878. https://doi.org/10.1212/wnl.0000000000206797
Fernández, J., & Vico, F. J. (2013). AI Methods in Algorithmic Composition: A Comprehensive survey. Journal of Artificial Intelligence Research, 48, 513–582. https://doi.org/10.1613/jair.3908
Franěk, M., Petružálek, J., & Šefara, D. (2022). Facial Expressions and Self-Reported Emotions when viewing Nature images. International Journal of Environmental Research and Public Health, 19(17), 10588. https://doi.org/10.3390/ijerph191710588
Frühholz, S., Trost, W., & Grandjean, D. (2014). The role of the medial temporal limbic system in processing emotions in voice and music. Progress in Neurobiology, 123, 1–17. https://doi.org/10.1016/j.pneurobio.2014.09.003
Fu, V. X., Oomens, P., Klimek, M., Verhofstad, M. H. J., & Jeekel, J. (2020, December). The effect of perioperative music on medication requirement and hospital length of stay: A meta-analysis. Annals of Surgery, 272(6). https://pubmed.ncbi.nlm.nih.gov/31356272/
Galińska, E. (2015). Music therapy in neurological rehabilitation settings. Psychiatria Polska, 49(4), 835–846. https://doi.org/10.12740/pp/25557
Ghetti, C., Chen, X., Fachner, J., & Gold, C. (2017). Music therapy for people with substance use disorders. The Cochrane Library. https://doi.org/10.1002/14651858.cd012576
Grimaud, A.M. & Eerola, T. (2020, April). EmoteControl: an interactive system for real-time control of emotional expression in music. Personal and Ubiquitous Computing 25:677–689. https://doi.org/10.1007/s00779-020-01390-7
Golden, T. L., Springs, S., Kimmel, H. J., Gupta, S., Tiedemann, A., Sandu, C. C., & Magsamen, S. (2021). The Use of music in the treatment and Management of Serious Mental Illness: A Global Scoping Review of the literature. Frontiers in Psychology, 12. https://doi.org/10.3389/fpsyg.2021.649840
Guerrier, G., Bernabei, F., Lehmann, M., Pellegrini, M., Giannaccare, G., & Rothschild, P.-R. (2021, September). Efficacy of preoperative music intervention on pain and anxiety in patients undergoing cataract surgery. Frontiers in Pharmacology. https://doi.org/10.3389/fphar.2021.748296
Haruvi, A., Kopito, R., Brande-Eilat, N., Kalev, S., Kay, E., & Furman, D. (2021, April). Differences in the effects on human focus of music playlists and personalized soundscapes, as measured by brain signals. bioRxiv. https://www.biorxiv.org/content/10.1101/2021.04.02.438269v1.full
He, J. (2022). Algorithm composition and emotion recognition based on machine learning. Computational Intelligence and Neuroscience, 2022, 1–10. https://doi.org/10.1155/2022/1092383
Hiller, L. A., & Isaacson, L. M. (1958, July). Musical Composition with a High-Speed Digital Computer. Journal of the Audio Engineering Society, 6(3), 154-160. https://www.aes.org/e-lib/browse.cfm?elib=231
Hohmann, L., Bradt, J., Stegemann, T., & Koelsch, S. (2017). Effects of music therapy and music-based interventions in the treatment of substance use disorders: A systematic review. PLOS ONE, 12(11), e0187363. https://doi.org/10.1371/journal.pone.0187363
Hou, Y. (2022, July). AI music therapist: A study on generating specific therapeutic music based on deep generative adversarial network approach. IEEE Xplore. https://ieeexplore.ieee.org/document/9832398
Huang, C.-F., & Lin, E.-J. (2013). An emotion-based method to perform algorithmic composition. Proceedings of the 3rd International Conference on Music & Emotion (ICME3), Jyväskylä, Finland, 11–15 June 2013. https://jyx.jyu.fi/handle/123456789/41590
James, W. (1890). The principles of psychology. Henry Holt and Company. http://dx.doi.org/10.1037/11059-000
Janssen, J., Van den Broek, E. L., & Westerink, J. H. D. M. (2011). Tune in to your emotions: a robust personalized affective music player. User Modeling and User-Adapted Interaction, 22(3), 255–279. https://doi.org/10.1007/s11257-011-9107-7
Jespersen, K. V., Otto, M., Kringelbach, M. L., Van Someren, E. J., & Vuust, P. (2019). A randomized controlled trial of bedtime music for insomnia disorder. Journal of Sleep Research, 28(4). https://doi.org/10.1111/jsr.12817
Juslin, P., Barradas, G., & Eerola, T. (2015). From Sound to Significance: Exploring the mechanisms underlying emotional reactions to music. American Journal of Psychology, 128(3), 281–304. https://doi.org/10.5406/amerjpsyc.128.3.0281
Kamath, P., Li, Z., Gupta, C., Jaidka, K., Nanayakkara, S., & Wyse, L. (2023, March). Evaluating descriptive quality of AI-generated audio using image-schemas. Proceedings of the 28th International Conference on Intelligent User Interfaces, 621–632. https://dl.acm.org/doi/10.1145/3581641.3584083
Kayser, D., Egermann, H., & Barraclough, N. E. (2021). Audience facial expressions detected by automated face analysis software reflect emotions in music. Behavior Research Methods, 54(3), 1493–1507. https://doi.org/10.3758/s13428-021-01678-3
Kellaris, J. J. (1992). The experience of time as a function of musical loudness and gender of listener. ACR. https://www.acrwebsite.org/volumes/7380
Klepzig, K., Stender, K., Lotze, M., & Hamm, A. O. (2022). Written in the face? Facial expressions during pleasant and unpleasant chills. Psychology of Music, 51(3), 952–970. https://doi.org/10.1177/03057356221122607
Kowald, D., Muellner, P., Zangerle, E., Bauer, C., Schedl, M., & Lex, E. (2021, February). Support the underground: Characteristics of beyond-mainstream music listeners. EPJ Data Science, 10, Article 14. https://epjdatascience.springeropen.com/articles/10.1140/epjds/s13688-021-00268-9
Liu, Y., Liu, G., Wei, D., Li, Q., Yuan, G., Wu, S., Wang, G., & Zhao, X. (2018). Effects of musical tempo on musicians’ and non-musicians’ emotional experience when listening to music. Frontiers in Psychology, 9. https://doi.org/10.3389/fpsyg.2018.02118
Lorek, M., Bąk, D., Kwiecień-Jaguś, K., & Mędrzycka-Dąbrowska, W. (2023). The effect of music as a non-pharmacological intervention on the physiological, psychological, and social response of patients in an intensive care unit. Healthcare, 11, 1687. https://doi.org/10.3390/healthcare11121687
Meier, M. (2014). Algorithmic composition of music in real-time with soft constraints. https://www.semanticscholar.org/paper/Algorithmic-composition-of-music-in-real-time-with-Meier/36aaea8cbdc9fd7862f89396e2e84f29e9247c71
Nightingale, F. (1946). Notes on nursing: What it is, and what it is not. Appleton-Century.
Nilsson, U. (2008). The Anxiety‐ and Pain‐Reducing Effects of Music Interventions: A Systematic Review. AORN Journal, 87(4), 780–807. https://doi.org/10.1016/j.aorn.2007.09.013
Nuanáin, C. Ó., & Sullivan, L. (2014, October). Real-time algorithmic composition with a tabletop musical interface: A first prototype and performance. AM '14: Proceedings of the 9th Audio Mostly: A Conference on Interaction With Sound, Article 9, 1–7. https://dl.acm.org/doi/10.1145/2636879.2636890
Ogg, M., Sears, D. R. W., Marin, M. M., & McAdams, S. (2017). Psychophysiological indices of music-evoked emotions in musicians. Music Perception: An Interdisciplinary Journal, 35(1), 38–59. https://www.jstor.org/stable/26417378
Panksepp, J. (2010). Affective neuroscience of the emotional BrainMind: evolutionary perspectives and implications for understanding depression. Dialogues in Clinical Neuroscience, 12(4), 533–545. https://doi.org/10.31887/dcns.2010.12.4/jpanksepp
Panksepp, J., & Bernatzky, G. (2002). Emotional sounds and the brain: the neuro-affective foundations of musical appreciation. Behavioural Processes, 60(2), 133–155. https://doi.org/10.1016/s0376-6357(02)00080-3
Porcaro, L., Gómez, E., & Castillo, C. (2022, January). Diversity in the music listening experience: Insights from focus group interviews. Conference on Human Information Interaction and Retrieval (CHIIR '22). https://arxiv.org/abs/2201.10249
Putkinen, V., Zhou, X., Gan, X., Yang, L., Becker, B., Sams, M., & Nummenmaa, L. (2024, January). Bodily maps of musical sensations across cultures. Proceedings of the National Academy of Sciences, 121(5). https://doi.org/10.1073/pnas.2308859121
Quinto, L., Thompson, W. F., & Taylor, A. (2013). The contributions of compositional structure and performance expression to the communication of emotion in music. Psychology of Music, 42(4), 503–524. https://doi.org/10.1177/0305735613482023
Rafikian, S. (2019, May). Machine Learning & Algorithmic Music Composition | CCTP-607: “Big Ideas”: AI to the Cloud. https://blogs.commons.georgetown.edu/cctp-607-spring2019/2019/05/06/machine-learning-algorithmic-music-composition/
Raglio, A., Baiardi, P., Vizzari, G., Imbriani, M., Castelli, M., Manzoni, S., Vico, F. J., & Manzoni, L. (2021). Algorithmic Music for Therapy: Effectiveness and Perspectives. Applied Sciences, 11(19), 8833. https://doi.org/10.3390/app11198833
Reybrouck, M., Podlipniak, P., & Welch, D. (2019). Editorial: The influence of loud music on physical and Mental health. Frontiers in Psychology, 10. https://doi.org/10.3389/fpsyg.2019.02149
Salimpoor, V. N., Zald, D. H., Zatorre, R. J., Dagher, A., & McIntosh, A. R. (2015). Predictions and the brain: how musical sounds become rewarding. Trends in Cognitive Sciences, 19(2), 86–91. https://doi.org/10.1016/j.tics.2014.12.001
Särkämö, T., Tervaniemi, M., Laitinen, S., Numminen, A., Kurki, M., Johnson, J. K., & Rantanen, P. (2013). Cognitive, emotional, and social benefits of regular musical activities in early dementia: randomized controlled study. The Gerontologist, 54(4), 634–650. https://doi.org/10.1093/geront/gnt100
Sutoo, D., & Akiyama, K. (2004). Music improves dopaminergic neurotransmission: demonstration based on the effect of music on blood pressure regulation. Brain Research, 1016(2), 255–262. https://doi.org/10.1016/j.brainres.2004.05.018
Thaut, M. H., McIntosh, K. W., McIntosh, G. C., & Hoemberg, V. (2001). Auditory rhythmicity enhances movement and speech motor control in patients with Parkinson's disease. Functional neurology, 16(2), 163–172. https://pubmed.ncbi.nlm.nih.gov/11495422/
Timmerman, H., Van Boekel, R., Van De Linde, L. S., Bronkhorst, E. M., Vissers, K., Van Der Wal, S. E. I., & Steegers, M. (2023). The effect of preferred music versus disliked music on pain thresholds in healthy volunteers. An observational study. PLOS ONE, 18(1), e0280036. https://doi.org/10.1371/journal.pone.0280036
Valevicius, D., Lopez, A., Diushekeeva, A., Lee, A., & Roy, M. (2023). Emotional responses to favorite and relaxing music predict music-induced hypoalgesia. Frontiers in Pain Research, 4. https://doi.org/10.3389/fpain.2023.1210572
White, J. M. (2001). Music as intervention: a notable endeavor to improve patient outcomes. The Nursing clinics of North America, 36(1), 83–92.
Wiafe, A., & Fränti, P. (2023, January). Affective algorithmic composition of music: A systematic review. Applied Computing and Intelligence, 3(1), 27–43. https://www.aimspress.com/article/doi/10.3934/aci.2023003
Williams, D., Kirke, A., Miranda, E., Daly, I., Hwang, F., Weaver, J., & Nasuto, S. (2017, May). Affective calibration of musical featuresets in an emotionally intelligent music composition system. ACM Transactions on Applied Perception, 14(3), Article 17, 1–13. https://dl.acm.org/doi/10.1145/3059005
Williams, D., Kirke, A., Miranda, E. R., Roesch, E. B., Daly, I., & Nasuto, S. J. (2014). Investigating affect in algorithmic composition systems. Psychology of Music, 43(6), 831–854. https://doi.org/10.1177/0305735614543282
Williams, D., Hodge, V. J., & Wu, C.-Y. (2020, November). On the use of AI for generation of functional music to improve mental health. Frontiers in Artificial Intelligence, 3, 497864. https://doi.org/10.3389/frai.2020.497864
Witvliet, C. V., & Vrana, S. R. (2007). Play it again Sam: Repeated exposure to emotionally evocative music polarises liking and smiling responses, and influences other affective reports, facial EMG, and heart rate. Cognition & Emotion, 21(1), 3–25. https://doi.org/10.1080/02699930601000672
Woods, K. J., Hewett, A., Spencer, A., Morillon, B., & Loui, P. (2019, July). Modulation in background music influences sustained attention. arXiv (Neurons and Cognition).
Woods, K. J. P., Sampaio, G., James, T., Przysinda, E., Hewett, A., Spencer, A.E., Morillon, B., & Loui, P. (2021, October). Stimulating music supports attention in listeners with attentional difficulties. https://doi.org/10.1101/2021.10.01.462777
Yuksel, B. F., Oleson, K. B., Chang, R., & Jacob, R. J. K. (2019). Detecting and adapting to users' cognitive and affective state to develop intelligent musical interfaces. In Springer Series on Cultural Computing (pp. 163–177). https://doi.org/10.1007/978-3-319-92069-6_11
Zaatar, M. T., Alhakim, K., Enayeh, M., & Tamer, R. (2024). The transformative power of music: Insights into neuroplasticity, health, and disease. Brain, Behavior, & Immunity - Health, 35, 100716. https://doi.org/10.1016/j.bbih.2023.100716
Zatorre, R. (2018, March). From perception to pleasure: How music changes the brain [Video]. TED Conferences, TEDxHECMontréal. https://www.ted.com/talks/dr_robert_zatorre_from_perception_to_pleasure_how_music_changes_the_brain
Copyright (c) 2025 Advik Rai; Janine Sharbaugh

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Copyright holder(s) granted JSR a perpetual, non-exclusive license to distribute & display this article.


