This system is developed in Unity for the Meta Quest 3 headset within a mixed reality environment. It is based on a swarm intelligence behavior called “self-assembly,” in which independent agents join together to form structures that solve complex tasks. In this application, the algorithm operates on cube-shaped agents that automatically connect through their faces when in close proximity and generate musical rhythms, harmonies, and melodies according to the structures being formed. I first implemented a fully autonomous version in a purely virtual scene, then adapted the system to mixed reality, allowing users to interact with the agents directly or to adjust swarm parameters through virtual interfaces.
The code for the autonomous version is available here: https://github.com/pedro-lucas-bravo/self_assembly_sync_music
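As an illustrative sketch only (the actual system is implemented in Unity/C#), the two core ideas described above, face-adjacency joining and mapping structure to rhythm, could look like the following. All names, the snap tolerance, and the Euclidean-style rhythm mapping are my assumptions for illustration, not taken from the project:

```python
SNAP_DIST = 1.0  # unit cubes: connected faces sit one unit apart

def are_face_adjacent(a, b, tol=0.1):
    """Two unit cubes connect through a face when they are one unit apart
    along exactly one axis and aligned (within tolerance) on the other two."""
    diffs = [abs(a[i] - b[i]) for i in range(3)]
    near_one = [abs(d - SNAP_DIST) <= tol for d in diffs]
    near_zero = [d <= tol for d in diffs]
    return sum(near_one) == 1 and sum(near_zero) == 2

def clusters(agents):
    """Group agent positions into connected structures via face adjacency (BFS)."""
    unvisited = set(range(len(agents)))
    out = []
    while unvisited:
        frontier = [unvisited.pop()]
        comp = set(frontier)
        while frontier:
            i = frontier.pop()
            for j in list(unvisited):
                if are_face_adjacent(agents[i], agents[j]):
                    unvisited.remove(j)
                    comp.add(j)
                    frontier.append(j)
        out.append(sorted(comp))
    return out

def rhythm_for(cluster, steps=8):
    """Map a structure's size to a pulse pattern: larger structures trigger
    more onsets, spread evenly over the bar (a Euclidean-style spacing)."""
    pulses = min(len(cluster), steps)
    return [1 if (i * pulses) % steps < pulses else 0 for i in range(steps)]
```

For example, three cubes at `(0,0,0)`, `(1,0,0)`, and `(3,0,0)` yield two structures (a two-cube chain and a singleton), each of which would drive its own rhythmic pattern.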
