The Made by Google podcast recently released an episode where the hosts explained the engineering behind Google Pixel sound design. Google’s Pixel phones are considered some of the most well-designed Android smartphones on the market, a reputation that stems largely from the attention to detail the company puts into its operating system, from the visual design to the animations and the sounds for every action. Every little aspect of a Pixel phone has months of testing and refinement behind it.
How Google Pixel sound design is planned
Made by Google is a podcast that delves into the behind-the-scenes of how things work at Google. This includes talking to “Googlers” and asking the how and why of everything Google. Season 5, Episode 2 – Sounds Like Pixel Perfect – was a look into how sound effects are made for the Pixel phone lineup.
One aspect worth noting is that the team describes the sounds as “simple”, “human”, and “playful”. This, of course, complements the UI design of the Pixel phones, such as the sliding bubble animation that clings to the screen until it’s pulled away. It’s also not just the sound itself but the hardware in the phones: the designers at Google make the software and hardware work in perfect synergy, much like Apple’s devices.
“If it’s a ringtone, I want to make sure that that’s going to sound really good in my home, but it’s also going to sound really good on the street. And then we optimize the sound for the hardware. So we make any orchestration or EQ or whatever it is, any types of changes that need to be made to get the sound sounding as good as possible on the actual device.”
AI is a “collaborative partner”
It also seems the designers at Google have already begun implementing generative AI in their work. This isn’t surprising given how heavily the Pixel phones themselves rely on AI. The recently released Gems collection of sounds and ringtones was made with the help of AI, though that doesn’t mean it’s entirely AI-generated, of course.
For example, as detailed by 9to5Google, the “Amber Beats” sound started with a prompt to AI. “Behind the groovy bass line, there’s an atmospheric pad synth, which adds depth to the track. Bouncy drums are featured with a clear glass hit predominant sound. There are layers of synth sounds.”
The prompt itself showcases extensive knowledge of audio design, but that wasn’t all. From the 30-second clip that was output by the AI, a two-second excerpt was taken and altered. “I took that audio clip and began to rhythmically and melodically alter it, building an original idea that felt authentic to me.”
The entire episode is a very cool insight into how Google’s sound design team operates. As generative AI continues to improve, there might even come a time when employees are laid off and all of the above-mentioned work is handled by AI. If that time does come, though, the sound design might lose its “human” descriptor.