With the Pixel 2, Google introduced the impressive Now Playing feature, which uses on-device neural networks to continuously recognize what's playing in the background. Google is now improving Assistant's and the Google app's existing Sound Search feature with the same underlying technology.
Introduced recently, the new version of Sound Search is faster and provides more accurate results than before on any Android device. Available in the Google Search app as a widget and through Assistant when asking "Hey Google, what's playing?", Sound Search works server-side, with a snippet of audio sent to the cloud for analysis.
In comparison, Now Playing is entirely offline, with convolutional neural networks used to turn a few seconds of audio into a unique "fingerprint."
This fingerprint is then compared against an on-device database holding tens of thousands of songs, which is regularly updated to add newly released tracks and remove those that are no longer popular.
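A minimal sketch of that kind of fingerprint lookup, assuming the fingerprint is a fixed-length embedding vector and the database is a matrix of song embeddings (the function and variable names here are hypothetical, not Google's actual implementation):

```python
import numpy as np

def nearest_song(fingerprint, database, song_ids):
    # Normalize everything so dot products become cosine similarities.
    fp = fingerprint / np.linalg.norm(fingerprint)
    db = database / np.linalg.norm(database, axis=1, keepdims=True)
    scores = db @ fp                       # similarity to every song at once
    best = int(np.argmax(scores))
    return song_ids[best], float(scores[best])

# Toy database of three "songs", each represented by an 8-dim embedding.
rng = np.random.default_rng(0)
database = rng.normal(size=(3, 8))
song_ids = ["track_a", "track_b", "track_c"]

# A query fingerprint that is a slightly noisy copy of track_b's embedding,
# standing in for a new recording of the same song.
query = database[1] + 0.05 * rng.normal(size=8)
print(nearest_song(query, database, song_ids))
```

A real system matches against a compressed index of millions of embeddings rather than a dense matrix, but the core idea is the same: the most similar stored fingerprint wins.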
Bringing Now Playing's fingerprinting technology to Sound Search poses the challenge of having to recognize tens of millions of songs rather than tens of thousands. As such, the fingerprinting technology had to be scaled up, though fortunately running on a server removes on-device processing and storage constraints.
Google was able to improve the initial matching phase (where the algorithm finds good candidates) by leveraging more neural networks.
In the second matching phase, a detailed analysis of each candidate is performed to find the correct one. For Sound Search, Google increased the density of embeddings (the projected representation of a musical fingerprint) from one per second to one every 0.5 seconds to improve matching. Google also lowered the matching threshold for popular songs, so that more obscure tunes can be added to the database without slowing down recognition speed.
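The two-phase idea can be pictured roughly as follows: a cheap retrieval step pulls a handful of candidates, then a stricter check accepts a candidate only if its score clears a threshold that is lower for popular songs. This is a simplification under assumed names and parameters, not Google's production algorithm:

```python
import numpy as np

def two_phase_match(query, database, song_ids, popularity,
                    k=2, base_threshold=0.9):
    # Phase 1: cheap retrieval of the top-k candidates by cosine similarity.
    db = database / np.linalg.norm(database, axis=1, keepdims=True)
    q = query / np.linalg.norm(query)
    scores = db @ q
    candidates = np.argsort(scores)[-k:]

    # Phase 2: detailed check of each candidate against a per-song bar.
    # Popular songs (popularity near 1.0) get a lower bar and are accepted
    # more readily; obscure songs must match very closely.
    best = None
    for i in candidates:
        threshold = base_threshold - 0.1 * popularity[i]
        if scores[i] >= threshold and (best is None or scores[i] > scores[best]):
            best = int(i)
    return song_ids[best] if best is not None else None

rng = np.random.default_rng(1)
database = rng.normal(size=(3, 8))
song_ids = ["hit_single", "deep_cut", "b_side"]
popularity = np.array([1.0, 0.2, 0.0])

# Query built from the popular song's embedding plus a little noise.
query = database[0] + 0.05 * rng.normal(size=8)
print(two_phase_match(query, database, song_ids, popularity))
```

Scaling the threshold by popularity is what lets the database grow with obscure tracks without flooding the detailed phase with false matches for the songs people actually query.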
Moving forward, Google is working to improve matching in very quiet and noisy environments, while making the system faster.