Google’s Instrument Playground is an AI experiment that lets you describe an instrument and quickly start experimenting with its sound. It uses Google’s MusicLM model, which also powers MusicFX, to provide a virtual keyboard and assign a 20-second generated clip to each key, much like a synthesizer or some of the looping features in Apple’s GarageBand.
Tools like Suno, which can generate full songs with lyrics and vocals from a text prompt, and Rightsify Hydra II, which is trained on fully licenced music and even provides instrument stems, show how fast the field of AI music is expanding. Even Amazon (through an Alexa skill) and Adobe are developing AI music capabilities, ranging from basic song generation to fine-grained control over the output.
How Does Instrument Playground Work?
MusicLM is an AI model that, much like an AI image generator, is trained to turn text prompts into audio, in this case using samples of musical instruments.
While competitors like Suno, Meta’s MusicGen, and MusicFX can simulate a wide variety of instruments, Instrument Playground is unique in letting users manipulate individual instruments much like a synthesizer. It begins by generating a 20-second clip of the instrument you describe. Once the clip is created, you can play it from the on-screen keyboard (or the centre row of your laptop’s keys).
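Google has not published how Instrument Playground maps one generated clip onto a playable keyboard, but the synthesizer-like behaviour described above resembles a classic sampler: the same clip is replayed at different pitches for different keys. The sketch below illustrates that idea only; the function names, the home-row key layout, and the naive resampling are all assumptions, not Google’s implementation.

```python
# Hypothetical sampler sketch: one generated clip is mapped across
# keyboard keys, each key replaying the clip resampled one semitone apart.
# All names here (pitch_shift, build_keyboard) are illustrative.

SAMPLE_RATE = 44_100  # audio frames per second

def pitch_shift(clip, semitones):
    """Naively resample a clip so it plays back `semitones` higher.

    Raising the pitch shortens the clip, like speeding up a tape.
    """
    rate = 2 ** (semitones / 12)  # equal-temperament frequency ratio
    length = int(len(clip) / rate)
    return [clip[int(i * rate)] for i in range(length)]

def build_keyboard(clip, keys="asdfghjkl"):
    """Assign the clip to the home-row keys, one semitone per key."""
    return {key: pitch_shift(clip, i) for i, key in enumerate(keys)}

# Stand-in for the AI-generated 20-second clip (silence here).
clip = [0.0] * (20 * SAMPLE_RATE)
keyboard = build_keyboard(clip)
# Keys further along the row play shorter, higher-pitched versions.
```

The real tool almost certainly does something more sophisticated than this nearest-neighbour resampling, but the key-to-pitched-clip mapping is the core of the sampler idea.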
Instrument Playground was created by Simon Doury, Google’s Artist in Residence, and was first released at the year’s end. “A starting point for this experiment was exploring a playful interface based on Music LM that inspires creativity and discovery of instruments from around the world for everyone,” Doury stated. Because the instruments are drawn from around the world, playing with the tool, and especially combining two of them, is a blast.
Is Instrument Playground Effective?
Getting started with Instrument Playground can be a challenge because you are working with a generated clip. Since it synthesizes music rather than assigning notes to individual keys, it cannot be played the way a traditional piano can.
If you want to incorporate specific instruments into an AI-generated song, you can try pairing this tool with Google’s MusicFX project.
Within a few seconds of your request, it can generate a brief clip of, say, a spacey flute playing a short melody and give you a keyboard to manipulate the sound. On top of that, there is an advanced mode where you can use up to four instruments, loop parts of each sound, and build a whole track out of AI-generated samples.
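The advanced mode described above, looping short samples and layering up to four instruments, amounts to the basic loop-and-mix pattern found in most sequencers. The sketch below shows that pattern under stated assumptions: the plain sample-wise sum, the frame counts, and the four-instrument cap as a hard limit are all illustrative choices, not details Google has published.

```python
# Hedged sketch of a loop-and-mix step: repeat short clips to a common
# length, then sum them frame by frame into one mono track.

MAX_INSTRUMENTS = 4  # the advanced mode allows up to four instruments

def loop_to_length(sample, length):
    """Repeat a sample until it fills `length` frames."""
    return [sample[i % len(sample)] for i in range(length)]

def mix_track(samples, length):
    """Loop each sample to the same length and sum them into one track."""
    if len(samples) > MAX_INSTRUMENTS:
        raise ValueError("advanced mode allows at most four instruments")
    looped = [loop_to_length(s, length) for s in samples]
    return [sum(frames) for frames in zip(*looped)]

flute = [0.1, -0.1]          # stand-ins for generated instrument clips
drums = [0.5, 0.0, 0.0, 0.0]
track = mix_track([flute, drums], length=8)
# track holds 8 frames: the two loops repeated and summed frame by frame
```

A real mixer would also clip or normalize the summed signal to keep it in range; that step is omitted to keep the loop-and-sum idea visible.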
What’s New In It?
At first glance, Instrument Playground appears to be nothing more than an AI sandbox. However, the underlying AI music model is quite powerful and, as we saw with MusicFX DJ Mode, has the potential to usher in a new era of music production.
Imagine creating a new musical instrument that does not exist in the physical world and then programming it to play in a virtual one: a glass string piano, say, that can only be played in outer space.
Questions about the legitimacy of the data used by AI music tools, and worries about what anyone being able to compose songs could mean for the industry as a whole, are valid concerns. But the truth is that musicians will make the most of these tools when they use them to produce groundbreaking new music.