Chipmaker Audience Inc (ADNC.O) is acquiring a Silicon Valley startup in hopes of helping build smartphones that can figure out what you need at any time of the day, or even how you’re sleeping at night.
Audience announced late on Tuesday it is paying $41 million for Sensor Platforms, which creates algorithms that help analyze data from sensors on smartphones and other mobile devices.
Combining Sensor Platforms’ technology with its own audio-processing expertise could give Audience a leg up in improving how smartphones and other gadgets interpret what their owners are doing and how best to assist them.
Samsung and other manufacturers are packing gyroscopes, cameras, microphones, barometers and other sensors into smartphones. But those sensors drain battery power, and app developers are trying to find more ways to use them.
Audience wants to design low-power chips that build on its audio expertise by analyzing data from several sensors at once. Chief Executive Peter Santos used sleep analysis as an example.
Sleep analysis apps on smartphones or smart wrist bands currently rely mostly on motion sensors to detect tossing and turning at night. But they could be much improved with a processor designed to combine and analyze data on movement, the sound of breathing and background noise such as a blaring television or noisy neighbors.
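For illustration only, here is a minimal sketch of the kind of sensor fusion Santos describes: blending a motion signal with simple acoustic features to score how restless a sleeper is. Every name, weight and threshold below is hypothetical and does not reflect Audience's or Sensor Platforms' actual algorithms.

# Illustrative sketch only: fuses motion and audio features per epoch to
# label sleep as "restful" or "restless". All weights and thresholds are
# hypothetical, not Audience's or Sensor Platforms' algorithms.
from dataclasses import dataclass
from statistics import pstdev


@dataclass
class Epoch:
    accel_magnitudes: list[float]  # accelerometer magnitude samples (g)
    audio_levels: list[float]      # microphone loudness samples (dBFS)


def restlessness_score(epoch: Epoch) -> float:
    """Blend motion variability with acoustic activity into one score."""
    motion = pstdev(epoch.accel_magnitudes)          # tossing and turning
    loudness = sum(epoch.audio_levels) / len(epoch.audio_levels)
    acoustic = max(0.0, (loudness + 50.0) / 50.0)    # normalize against a -50 dBFS floor
    return 0.7 * motion * 10.0 + 0.3 * acoustic      # hypothetical weighting


def label(epoch: Epoch, threshold: float = 0.5) -> str:
    return "restless" if restlessness_score(epoch) > threshold else "restful"


if __name__ == "__main__":
    quiet_night = Epoch(accel_magnitudes=[1.00, 1.01, 1.00, 0.99],
                        audio_levels=[-48.0, -47.5, -49.0, -48.5])
    tv_on = Epoch(accel_magnitudes=[0.95, 1.20, 1.05, 0.80],
                  audio_levels=[-20.0, -22.0, -19.0, -21.0])
    print(label(quiet_night))  # restful: little motion, quiet room
    print(label(tv_on))        # restless: movement plus background noise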
“The presence or lack of snoring, the pace and evenness of breathing. There’s a lot more information that’s available acoustically,” Santos told Reuters on Wednesday. “Having the intelligence and being able to make sense of the sound information is something we excel at.”
The Mountain View, California-based company lost Apple (AAPL.O) as its largest customer in 2012, and it now depends on smartphone leader Samsung Electronics (005930.KS) for most of its business. Its revenue last year was $161 million, making it a relatively small player in the global smartphone supply chain. Like other chipmakers, it is increasingly focusing on emerging wearable computing devices.
Audience’s interest in context-aware computing is not limited to the bedroom. Santos says smartphones and a growing crop of smart watches and intelligent clothing should do a better job of combining audio with other sensors to interpret and react to a range of situations and activities, like riding a bike or traveling on a train.
“We’re seeing more and more companies looking at the sensory area,” said Chardan Capital Markets analyst Jay Srivatsa. “The good part for Audience is that the sensory business is much like voice processing in that there are no standards. It all comes from algorithms you develop internally.”