Kingsley Ash

Senior Lecturer

Kingsley Ash is a composer and performer of electronic and computer music whose work explores technology-enabled music performance and sound installations.

ORCID: 0000-0001-9362-8152

About

Kingsley has performed and presented work through a range of media in venues across London including the Royal Albert Hall and the Royal Festival Hall, as well as in Europe, the USA and Asia. His recent work has been in the field of sonification, particularly the exploration of environmental data through sound and interactivity.

Kingsley studied Astrophysics at Manchester University and Music Technology at York University and is currently working towards his PhD at Goldsmiths, London.

Academic positions

  • Senior Lecturer
    Leeds Metropolitan University, Film, Music & Performance, Leeds, United Kingdom | 2007 - present

Non-academic positions

  • Production Manager
    EMI Music, London | 2005 - 2006

Degrees

  • MSc Music Technology
    University of York, York, United Kingdom | 01 July 1998 - 01 September 1999

  • BSc Astrophysics
    University of Manchester, Manchester, United Kingdom | 1994 - 1998

Research interests

Current research interests include data sonification, particularly the investigation of environmental data and processes through sound and interactivity, as well as an ongoing collaboration with Dr Yvon Bonenfant (University of Winchester) that seeks to explore the unique qualities of the human voice through playful interactive artworks.

Kingsley is currently working towards his PhD at Goldsmiths, London under the supervision of Dr Mick Grierson. Further details of his work can be found online at http://www.kingsleyash.com.

Publications (39)

Other

Affected States: Analysis and Sonification of Twitter Music Trends: Winner.

Featured 2012
Other

“Anton’s Shorts” – Music

Featured 2007
Other

Livecell, for laptop ensemble and string quartet: Honorary mention.

Featured 2011
Other

“Theatre of Noise” - Co-composer in residence

Featured 2009
Other

“Macbeth” - Music and sound design

Featured 2011 Mooted Theatre Company, York
Other

“River Voices” – Outdoor water powered sound installation

Featured 2009 Clifton Castle
Other

“Henry VI” - Music and Sound Design

Featured 2006
Composition

Uluzuzulalia

Featured 2014

Uluzuzulalia is an interactive children's theatre piece funded by the Wellcome Trust and Arts Council England. It is currently touring the UK, with performances in London, Bath, Manchester, Winchester and Birmingham, and more to come in 2015.

Other

“Gillamour” - Sound installation

Featured 2008
Other

“Sunyata” - Interactive multi-user sound installation

Featured 2009
Other

“Corpse Way” - Sound installation

Featured 2007
Other

Livecell, for networked devices, string quartet and projection: Winner.

Featured 2010
Other

“Hedda Gabler” – Music

Featured 2012 Mooted Theatre Company, York
Performance

Livecell Performance

Featured 28 June 2012 Interface2012, Birmingham Conservatoire, UK
Authors: Ash K, Stavropoulos N

Performance of Livecell with live string quartet as part of Interface2012

Conference Contribution

Urbicolous Disport. An interactive, generative sound-toy installation in which participants capture, manipulate and play with the sounds of the city.

Featured September 2012 Live Interfaces: Performance, Art, Music, University of Leeds, UK
Authors: Ash K, Dolphin A
Performance

If and Only If

Featured 30 April 2013 Noisefloor Festival, Staffordshire University
Performance

GoLImp IV

Featured 1 September 2008 (re)Actor3, Human Computer Interaction Conference, Liverpool
Other

“Collide-oscope” – Sound Artist

Featured 2012
Performance

Livecell

Featured 19 June 2012 New Resonances Festival, Wilton’s Music Hall, London
Performance

Livecell Performance

Featured 26 October 2012 Korean Electroacoustic Music Society Annual Conference, Seoul, South Korea
Authors: Ash K, Stavropoulos N
Performance

GoLImp I

Featured 29 February 2008 Echochroma III, Leeds
Journal article

Interactive multimedia systems for engineering education in acoustics, synthesis and signal processing.

Featured June 2001 European Journal of Engineering Education, 26(2): 91-106, Taylor & Francis
Authors: Ash K, Hunt A, Howard D, Kirk R, Tyrell A; Editors: de Graaff E

This paper describes programmable multimedia systems, developed at the University of York, which are used extensively for teaching on a variety of music technology and mainstream engineering courses. Software and hardware systems are described for the physical modelling of acoustic spaces, and for constructing interactive synthesis and signal processing networks. Details are given on how these have been successfully integrated into higher education programmes at York.
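As a hypothetical illustration of the kind of physical-modelling exercise such teaching systems support (not code from the York systems themselves), a minimal Karplus-Strong plucked string can be built from a noise-filled delay line and an averaging filter:

```python
import random

def pluck(freq, sample_rate=44100, seconds=1.0):
    """Synthesise a plucked-string tone via a noise-filled delay line
    (Karplus-Strong). The delay-line length sets the pitch; the
    two-sample average acts as a damping low-pass filter."""
    n = int(sample_rate / freq)
    line = [random.uniform(-1.0, 1.0) for _ in range(n)]
    out = []
    for _ in range(int(sample_rate * seconds)):
        out.append(line[0])
        avg = 0.5 * (line[0] + line[1])   # averaging filter = damping
        line = line[1:] + [avg * 0.996]   # feed the damped sample back
    return out
```

The decay constant (0.996) and the one-second default are illustrative choices; a teaching system would typically expose these as interactive parameters.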

Performance

Livecell

Featured 31 July 2011 International Computer Music Conference, Huddersfield
Performance

GoLImp III

Featured 28 August 2008 International Computer Music Conference, Belfast
Performance

If and Only If

Featured 11 November 2013 Echochroma VIII, Leeds
Other

“Othello” – Music and Sound Design

Featured 2012 York Theatre Royal
Composition

If and Only If

Featured 2012

If and Only If is a live laptop performance that combines traditional laptop improvisation with elements of live coding. Data generated from the live manipulation of Markov chain probability weightings is used to generate and process sound materials in real time. These sound materials continue to evolve under the direction of the Markov chains, while the performer turns his attention to other layers of sound. The resulting piece is complex and detailed, with the temporal and spectral space to allow the noiselessness of the data to make its presence unheard.
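The idea of live-manipulated Markov chain weightings can be sketched as follows; the state names, weight values and re-weighting shown are purely illustrative assumptions, not the piece's actual materials:

```python
import random

# Hypothetical first-order Markov chain over sound-material layers.
# The transition weights are the performer's live-editable material.
weights = {
    "drone": {"drone": 0.6, "grain": 0.3, "noise": 0.1},
    "grain": {"drone": 0.2, "grain": 0.5, "noise": 0.3},
    "noise": {"drone": 0.5, "grain": 0.4, "noise": 0.1},
}

def next_state(current, weights):
    """Choose the next sound layer from the weighted transitions."""
    options = list(weights[current].keys())
    probs = list(weights[current].values())
    return random.choices(options, weights=probs, k=1)[0]

def generate(start, steps, weights):
    """Walk the chain to produce a sequence of sound-material states."""
    seq = [start]
    for _ in range(steps - 1):
        seq.append(next_state(seq[-1], weights))
    return seq

# Mid-performance, the performer might re-weight a transition:
weights["drone"]["noise"] = 0.4
weights["drone"]["drone"] = 0.3
```

Because the chain keeps running while the weights change, the sound materials continue to evolve without further intervention, which is what frees the performer to attend to other layers.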

Performance

Performance with live string quartet at the International Computer Music Conference (ICMC)

Featured 3 August 2011 Huddersfield
Authors: Ash K, Stavropoulos N, Morgan D, Sweeney N, Dowdall L, Athlaoich AN
Performance

Livecell

Featured 22 September 2011 Pixilerations Festival, Providence, USA
Performance

Livecell

Featured 14 December 2012 International Festival for Innovation in Music Production and Composition, Leeds College of Music
Composition

GoLImp III

Featured 2008

GoLImp (Game of Life Improvisation) is based around a system that allows real-time manipulation of a 2D grid of cellular automata ("automata bending" - Ariza, C., Computer Music Journal, 2007) to generate and process musical events in a performance setting. The performer is able to influence the process at all stages by generating seeds and populating cells, manipulating the rules under which they evolve and affecting the complex mapping that determines the nature of the musical events produced as a result. Depending on the current conditions and the rules specified, the cellular automata can either be unstable and produce sounds that are equally chaotic, or they can be highly symmetrical and regular. During the performance the cellular automata are projected, allowing the audience to experience the growth and evolution of the cells while listening to the sounds they create.

The performance consists of dynamically shaping the conditions and rules, thereby both shaping and responding to the generative output in order to produce an interesting and meaningful musical result. Conditions can be set that may allow the cell populations to grow, to be stable, or to die off, with corresponding changes to the sound output. However, the multi-parametric mapping ensures there is a complex correspondence between the cells and the sound output, so no individual cell necessarily corresponds to a particular sound or pitch; rather, the overall state of the cellular automata is reflected in the overall sound.

The musical output of the system is generated through the triggering and manipulation of samples including traditional instruments, field recordings and synthesised sounds, with different combinations of sounds used in different sections of the improvisation. Data from the cellular automata system is used to determine and change parameters including timing, pitch and amplitude as well as delay, filter, reverberation and granulation processes. The system also allows for real-time input of sounds, so a live input can be employed during the performance to add to the sonic palette to be processed by the cellular automata system.
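A minimal sketch of the kind of process the note describes: a 2D Game of Life grid whose birth/survival rules can be changed mid-run ("automata bending"), with the global cell state mapped to a sound parameter. The rule parameters and the density-to-amplitude idea are illustrative assumptions, not GoLImp's actual mapping:

```python
def step(grid, birth=frozenset({3}), survive=frozenset({2, 3})):
    """Advance the grid one generation under mutable birth/survival
    rules, with toroidal (wrap-around) edges."""
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            n = sum(grid[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
            if grid[r][c]:
                new[r][c] = 1 if n in survive else 0
            else:
                new[r][c] = 1 if n in birth else 0
    return new

def density(grid):
    """Global population density, e.g. mappable to overall amplitude."""
    alive = sum(sum(row) for row in grid)
    return alive / (len(grid) * len(grid[0]))
```

Passing different `birth`/`survive` sets mid-performance is the "bending": the same grid suddenly evolves under new physics, shifting between chaotic and stable sonic behaviour.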

Performance

If and Only If

Featured 26 November 2013 Leeds College of Music
Conference Proceeding (with ISSN)

Affected States: Analysis and Sonification of Twitter Music Trends

Featured June 2012 Proceedings of the 18th International Conference on Auditory Display, Atlanta, GA, USA. The International Community for Auditory Display
Authors: Ash K; Editors: Nees MA, Walker BN, Freeman J

This paper describes an approach to the sonification of real-time Twitter music trend data realized for the ICAD 2012 Sonification Competition: Listening to the World Listening. The paper discusses the techniques used to create the sonification and the motivations behind them, including details of the data analysis, mapping strategies, visual display and sonification output. The system analyses the Twitter Music Trends data feed, which aggregates music listening data from Twitter by artist, as well as the Echo Nest REST API, to determine the perceived emotional affect and prevailing descriptions of a selection of the latest trending artists. The resulting data is visualized and sonified in real time to facilitate analysis and generate an appealing visual and auditory display. Experience with the system suggests that it is successful in allowing users to determine perceived emotional affect and quality for a number of artists simultaneously, and could allow further investigation into the correlation between these factors. The system also generates appealing visual music that reaches beyond the practice of scientific investigation to a wider audience.
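The general shape of a parameter-mapping sonification like the one the abstract describes can be sketched as below. The value ranges, the valence/energy feature names, and the linear pitch/tempo mappings are illustrative assumptions, not the paper's actual method:

```python
def map_range(x, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale x from one range to another, clamped."""
    x = max(in_lo, min(in_hi, x))
    t = (x - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

def sonify(artist_affect):
    """Map each artist's (valence, energy) pair, both in 0..1,
    to a MIDI pitch and a tempo in BPM (hypothetical mapping)."""
    voices = []
    for artist, (valence, energy) in artist_affect.items():
        voices.append({
            "artist": artist,
            "pitch_midi": round(map_range(valence, 0, 1, 48, 84)),
            "tempo_bpm": round(map_range(energy, 0, 1, 60, 180)),
        })
    return voices
```

Running several artists' voices simultaneously is what lets a listener compare their perceived affect at a glance (or rather, at a listen), which is the comparative use the abstract reports.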

Composition

Livecell

Featured 2011

Livecell is a system for the interactive real-time composition and performance of music for live string quartet. Users interact with a touch screen to create and destroy cells in a continuously evolving artificial life simulation based on cellular automata. The state of these cells is continuously translated into a musical score, which is then transmitted over the network to the musicians' laptops to be performed by the string quartet live as it appears on their screens. Different areas of the interface correspond to the different instruments in the string quartet, and cells are able to grow and move between these areas, allowing the composition and the instrumentation to evolve both under the direction of the user and with the natural evolution of the cells.

Users are able to determine and change the rules used in the cell evolution calculations, as well as affect the form, rhythm and harmonic colour of the musical material produced. Through the system a single user can take on the roles of composer, conductor and improviser to determine the textures, harmonies, tempo and other musical parameters of the emerging composition, mediated by the technology and performed live by the string quartet. The musical output is complex and the result of a careful balance between the influence of the user and the calculations of the algorithm, resulting in a very engaging experience for the user, performers and listeners alike.

Selected presentations and performances:

  • Paper presentation/performance at ICMC 2011, Huddersfield
  • Paper presentation at the Music & Technologies Conference, Lithuania, 2011
  • Paper/demo at the Korean Electroacoustic Music Society Conference, Korea, 2012
  • Demo at the BBC Radio 3 Free Thinking Festival 2011, Newcastle
  • Demo at the EVA London 2012 Conference
  • Performances at Interface 2012, Birmingham; IFIMPaC 2012, Leeds; and the New Resonances Festival, London, 2012
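The region-to-instrument idea described in the note can be sketched as follows: the grid is split into four horizontal bands, one per string-quartet instrument, and live cells in each band become note events. The band layout, the pentatonic scale and the pitch mapping are illustrative assumptions, not Livecell's actual musification:

```python
# Hypothetical mapping from a cellular-automata grid to score events.
INSTRUMENTS = ["violin I", "violin II", "viola", "cello"]
SCALE = [0, 2, 4, 7, 9]  # pentatonic degrees, an arbitrary choice

def cells_to_events(grid, base_midi=36):
    """Turn live cells into (instrument, midi_pitch, column) events.
    Row position selects the instrument band; column position selects
    the pitch within a repeating scale."""
    rows = len(grid)
    band = rows // len(INSTRUMENTS)
    events = []
    for r, row in enumerate(grid):
        instrument = INSTRUMENTS[min(r // band, len(INSTRUMENTS) - 1)]
        for c, alive in enumerate(row):
            if alive:
                degree = SCALE[c % len(SCALE)]
                octave = c // len(SCALE)
                events.append((instrument, base_midi + 12 * octave + degree, c))
    return events
```

Because cells can migrate between bands as the simulation evolves, the instrumentation changes without the user explicitly reassigning material, which matches the note's description of the composition and instrumentation evolving together.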

Chapter

Livecell: Real-time Score Generation through Interactive Generative Composition

Featured 01 January 2013 Music and Technologies, ISBN 978-1-4438-4213-6, Cambridge Scholars Publishing
Conference Proceeding (with ISSN)

Livecell: Real-time score generation through interactive generative composition.

Featured July 2011 Proceedings of the International Computer Music Conference, Huddersfield, UK. The International Computer Music Association
Authors: Ash K, Stavropoulos N; Editors: Adkins M, Isaacs B

This paper discusses Livecell, an interactive generative composition and real-time scoring application for user and string quartet. The paper outlines the authors’ rationale, discusses the use of cellular automata in this context and provides an insight into the application’s structural design. The authors summarise their approach to CA musification and interface design as well as score generation and display. The paper concludes with observations on the musical output and discussion of future developments.

Conference Proceeding (with ISSN)

Livecell: Real-time score generation through interactive generative composition

Featured November 2011 1st International Conference on Music and Technologies, Kaunas, Lithuania
Journal article

Stochastic processes in the musification of cellular automata: A case study of the Livecell project

Featured 2012 Emille: The Journal of the Korean Electro-Acoustic Music Society

This paper discusses Livecell, a system for interactive generative composition through real-time score generation as well as the triggering and treatment of soundfiles. The paper outlines the authors’ rationale, discusses the use of cellular automata in this context and provides an insight into the application’s structural design. The authors summarise their approach to CA musification, the representation of data relations in a musical score, and interface design as well as score generation and display. The paper concludes with observations on the musical output and discussion of future developments.

Other

“Sound Gardens” – Sound art workshop

Featured 2011
Authors: Ash K, Archibald R

Current teaching

  • Experimental Music Systems
  • Live Performance Technologies
  • Interfaces and Interactivity
  • Collaborative Practice
  • Audio-visual Interfaces

Grants (1)

Grant

Your Vivacious Voice

Wellcome Trust