
The emergence of ChatGPT and AI image generators has opened up new possibilities for creating text, images, and videos. Together with the rise of the metaverse during the COVID-19 era, these technologies led people to craft avatars, explore virtual spaces, and interact with others in entirely new ways. Advances in AI and VR have since extended the boundaries of the real world into these virtual realms.

So, how will AI and VR transform the way we experience music? Verses offers an answer with a groundbreaking concept: Interactive Music.

Interactive Music is a new form of music consumption proposed by Verses. It combines traditional audio-based music with visual elements, interactive motions, storytelling, and world-building, all powered by generative AI. The result is a multisensory, immersive musical experience like no other.

[Diagram: connecting sound with visuals, interactive motions, storytelling, and user experiences]

‘Connecting sound with other elements’ goes beyond simply listening to music—it means engaging with various sensory and interactive components to create a richer, more immersive musical experience.

Verses’ Interactive Music is built on this concept, seamlessly combining visuals, interactive motions, storytelling, and user experiences into one cohesive whole. Each of these four elements is intricately linked to the music, dynamically interacting and evolving in real time through generative AI technology.

1. Visuals

Interactive Music expands the experience of music from simply ‘listening’ to ‘seeing,’ offering users a more immersive way to engage with sound. This involves incorporating various forms of visual feedback that interact with the music. Examples include 3D objects that change in sync with the rhythm, metaverse environments that evolve with the flow of the music, and 3D virtual instruments that users can play through their movements.

These visual elements go beyond passive observation, creating opportunities for direct interaction with the music. This enables users to experience music in a more intuitive and engaging way. Dynamic visuals that shift with the rhythm or transform metaverse environments not only add vibrancy but also bring the music to life, offering a groundbreaking experience where the sound unfolds in real time right before the user’s eyes.
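To make this concrete, here is a minimal sketch of how a visual parameter might be driven by live audio using the standard Web Audio API. Verses has not published its implementation, so the pipeline below, and the `setOrbScale` renderer hook, are illustrative assumptions only.

```typescript
// Minimal sketch: drive a visual parameter from live audio amplitude.
// AnalyserNode and requestAnimationFrame are standard browser APIs;
// `setOrbScale` is a hypothetical hook into whatever renders the 3D scene.

const audioCtx = new AudioContext();
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 256;
const bins = new Uint8Array(analyser.frequencyBinCount);

async function start(trackUrl: string): Promise<void> {
  const response = await fetch(trackUrl);
  const buffer = await audioCtx.decodeAudioData(await response.arrayBuffer());
  const source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(analyser);
  analyser.connect(audioCtx.destination);
  source.start();
  render();
}

function render(): void {
  analyser.getByteFrequencyData(bins);          // 0-255 per frequency bin
  const energy = bins.reduce((sum, v) => sum + v, 0) / bins.length / 255;
  setOrbScale(1 + energy);                      // louder music -> bigger orb
  requestAnimationFrame(render);
}

// Hypothetical renderer hook; in practice this would update a
// three.js mesh, a shader uniform, or an avatar animation.
declare function setOrbScale(scale: number): void;
```

The same loop could just as easily drive color, lighting, or the geometry of an entire metaverse environment; the key idea is that the visuals read the audio signal every frame rather than playing back a pre-rendered video.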

2. Interactive Motions

Another key feature of Interactive Music is its ability to change in response to the user’s interactive motions. Actions like shaking a smartphone or touching the screen dynamically alter the flow of the music, generating new beats and allowing users to recreate the music in their own unique way.

For instance, in Verses’ interactive single ‘FIGHTMAN’ by the artist SUMIN, every screen tap or device shake transforms the rhythm and beats of the song in real time, letting users actively intervene in and reshape the flow of the music.

The music's instant response to even the smallest movements or touches fosters a deep sense of connection between the user and the sound. It maximizes the joy of musical interaction by making the experience highly personal and engaging. This real-time interplay between user motions and music not only enhances immersion but also elevates participation, unlocking a whole new dimension of music appreciation.
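As an illustration of how taps and shakes could be wired into a playing track, the sketch below uses the browser’s standard touch and DeviceMotion events together with a Web Audio low-pass filter. The shake threshold, the filter sweep, and the `playOneShot` sample trigger are hypothetical choices, not Verses’ actual FIGHTMAN implementation.

```typescript
// Minimal sketch: let taps and shakes reshape a playing track.

const ctx = new AudioContext();
const filter = ctx.createBiquadFilter();
filter.type = "lowpass";
filter.frequency.value = 18000;        // fully open by default
filter.connect(ctx.destination);
// ...the song's source node would be connected into `filter` here.

// A tap layers an extra percussive hit on top of the beat.
window.addEventListener("touchstart", () => {
  playOneShot("kick");                 // hypothetical sample trigger
});

// A shake sweeps the low-pass filter, briefly muffling then reopening the mix.
window.addEventListener("devicemotion", (e: DeviceMotionEvent) => {
  const a = e.accelerationIncludingGravity;
  if (!a) return;
  const magnitude = Math.hypot(a.x ?? 0, a.y ?? 0, a.z ?? 0);
  if (magnitude > 15) {                // rough shake threshold (m/s²)
    filter.frequency.setTargetAtTime(400, ctx.currentTime, 0.05);
    filter.frequency.setTargetAtTime(18000, ctx.currentTime + 0.3, 0.2);
  }
});

// Hypothetical one-shot sampler; in practice this would play a
// pre-decoded AudioBuffer through the same AudioContext.
declare function playOneShot(sampleName: string): void;
```

Because everything runs through the audio graph, each gesture changes the sound within a single audio callback, which is what makes the response feel instantaneous.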

3. Storytelling

The integration of music with metaverse-based storytelling is another remarkable aspect of Interactive Music. In this system, the user’s actions within the metaverse—performed through their avatar—trigger real-time reactions in the music. For instance, when an avatar interacts with objects or follows specific paths in the virtual world, new musical elements are added or existing ones are transformed.

This approach goes beyond passive music consumption, inviting users to actively engage with both the story and the music. It feels much like a video game, where users become creators of ‘musical events’ within the metaverse. This dynamic interaction makes the experience deeply personalized and more immersive, offering users a unique and ever-evolving musical journey.
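One plausible way to structure such event-driven scoring is a stem mixer, in which the song is split into looping layers and world events fade those layers in and out. The sketch below assumes this stem-based model and uses invented event names; it is a sketch of the general technique, not a published Verses design.

```typescript
// Minimal sketch: map avatar events in a virtual world to music stems.

type WorldEvent = "enter_forest" | "open_chest" | "follow_river_path";

const ctx = new AudioContext();

// Each stem is a synced loop (pads, percussion, melody...) with its own gain.
const stems = new Map<string, GainNode>();

function addStem(name: string, source: AudioBufferSourceNode): void {
  const gain = ctx.createGain();
  gain.gain.value = 0;                    // stems start silent
  source.connect(gain).connect(ctx.destination);
  source.loop = true;
  source.start();
  stems.set(name, gain);
}

// Story logic: world events fade musical layers in and out.
function onWorldEvent(event: WorldEvent): void {
  const fade = (stem: string, level: number) =>
    stems.get(stem)?.gain.setTargetAtTime(level, ctx.currentTime, 0.5);

  switch (event) {
    case "enter_forest":      fade("pads", 1);        break;
    case "open_chest":        fade("melody", 1);      break;
    case "follow_river_path": fade("percussion", 0.7); break;
  }
}
```

Keeping all stems running in sync and only automating their gains is a common trick in adaptive game audio: transitions stay beat-aligned no matter when the avatar triggers an event.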

4. User Experiences