Co-creating the Senses: Sight, Sound, and AI in Motion

Activity: Participating in or organising an event › Participation in Festival/Exhibition

Description

Devised and performed on 20 May 2025 as part of "Future Screens NI", a discussion day on AI and the issues we encounter when co-creating with machines, especially in the creative industries.

A short performance piece combining AI, real-time sensory responses, and live improvised sounds and music.
A collaboration between Dr Daniel Brice (AI and Unreal Engine), Dr Eilís Phillips (Concept and Voice), and Prof Franziska Schroeder (Concept, Sound Design and Saxophone).
Queen's University Belfast: SARC (Centre for Interdisciplinary Research in Sound and Music) and Media Lab.

CREDITS:
Úna Monaghan - music excerpts from her work "Réalta" (https://www.unamonaghan.com)
Support for Sound - Craig Jackson and Chris Kelly
Technical Support - Robbie Coey

LLM - Mistral 7B Instruct v0.3
AI 3D Modelling - Meshy

Period: 20 May 2025
Event type: Exhibition
Location: Belfast