For years, technologies like Apple’s Siri and Google’s Google Now have been flying blind, awoken by the spoken command of their users. If Apple acquires PrimeSense, perceptual computing may finally open its eyes.
Reports from Israel’s Calcalist indicate that Apple is close to snapping up PrimeSense for US$345 million or so – lower than what investors had hoped for, apparently. PrimeSense has not confirmed the talks, which reportedly could be concluded as early as this week.
Gamers know PrimeSense – the company was the eyes and ears behind the original Kinect, able to ‘see’ users and put them inside the game itself. The original Kinect also recognises spoken commands, which can be used to control the console itself (and, to a lesser extent, some games – although gamers haven’t warmed to in-game commands as readily).
One possible reason for the relatively low valuation is that both Microsoft and Intel are working on perceptual computing themselves. Rather than use a PrimeSense sensor, the Xbox One’s new Kinect incorporates Microsoft’s own, smaller sensor, which has been improved and more deeply integrated into the Xbox OS.
Intel has also worked with third parties, including Creative, to integrate depth-sensing cameras – like the kind PrimeSense helped develop – into ultrabooks by the end of 2014. Another Israeli developer, PointGrab, has also shown off technology that uses the webcam built into notebooks to recognise gestures.
For now, PrimeSense isn’t talking. “PrimeSense is the leading 3D technology in the market,” the company has said in a statement. “We are focused on building a prosperous company while bringing 3D sensing and Natural Interaction to the mass market in a variety of markets such as interactive living room and mobile devices. We do not comment on what any of our partners, customers or potential customers are doing and we do not relate to rumors or re-cycled rumors.”
The work by Microsoft, Intel, and PointGrab means that if the Apple-PrimeSense deal is completed, PC makers won’t necessarily have to wave gestures goodbye. But so far, it’s been the mobile platforms that have shown the most interest in perceptual computing.
The PrimeSense sensor inside the original Kinect does point the way toward technologies that could prove useful in phones, including facial recognition, a technology that Google launched with the Galaxy Nexus and Android 4.0. And unless users turn it off, the new Kinect camera is eternally vigilant, keeping its eyes and ears cocked for new commands.
Phones haven’t gone quite so far, held back primarily by concerns about battery life. But Apple has already begun leaning toward always-on sensing. Apple’s new M7 motion coprocessor inside the iPhone 5S aggregates information from the phone’s gyroscope, compass and accelerometer – and, as the MIT Technology Review notes, is forever collecting data. Apple is also reportedly installing sensors within its stores to track how shoppers move – and over which products they linger.
Meanwhile, Google designed the Moto X smartphone to constantly listen for spoken commands, also with a dedicated chip. For now, however, Apple’s Siri needs a bit of prompting to wake up.
That could change. Acquiring PrimeSense wouldn’t mean that tomorrow’s iPhone would ship with an always-on camera. But with a battery life that already exceeds eleven hours, phones like the iPhone 5S can spare a few milliwatts – especially if they’re smart enough to turn the camera off when it’s inside your pocket.
As of now, there’s no indication that an iWatch or an Apple version of Google Glass is in the works, although Apple has registered the trademark for iSight. Nevertheless, giving Apple eyes and ears into your world suggests some intriguing possibilities. For now, however, they’re all the subject of speculation – which is sometimes just as fun as an actual product.
By Mark Hachman, Macworld