One of the quieter pieces of news to make its debut this year was Intel’s RealSense depth-sensing camera tech. Though not yet a tangible product, Intel spent the year cutting deals with computer makers Acer, Asus, Dell, HP, Lenovo, Fujitsu, NEC, and Toshiba to incorporate the technology into their 2015 models, so the tech is certainly off to a solid start as an integrated part of the PC experience.
But let’s explain a bit more about exactly what it is before we go into why it may become an even bigger deal in the imaging world.
A Camera That Senses the Environment
RealSense actually began its life as a bulky attachment back in its beta stage but quickly shrunk into a significantly smaller device that, interestingly, can now be embedded into electronic equipment of almost any size. Intel’s RealSense imaging technology comes in three forms: a front-facing camera (which captures facial movements, tracks fingers and hands, and detects backgrounds and foregrounds), a rear-facing camera (that can scan and measure rooms and objects), and a snapshot camera (which can alter photo backgrounds after a photo has been taken).
While RealSense is making some noise in the PC world with its gesture control capability and what Intel refers to as “perceptual computing” – allowing computers to “see” depth the way the human eye does, using integrated 3D depth and 2D cameras – the tech is even more intriguing once you begin thinking about it outside of a computer.
Next Gen Imaging
We’ve chatted about light-field photography in the past, and for now it’s been confined to a few specialty cameras. Intel’s RealSense tech could essentially bring this trick to just about any mobile imaging device, or digital camera for that matter. For those that don’t know much about the rather trendy world of light-field photography, it allows the shooter to alter the way light affects a scene and also allows the focus of an image to be changed after capture.
RealSense actually captures an image in multiple layers with its specialized lens array (similar to Lytro), letting it pull off the depth-of-field alteration Lytro has mastered really nicely. It also allows for accurate distance measurements, both on an object’s surface and within photos, provided the subject of the image is within a few meters of the camera lens at the time of capture.
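To give a rough sense of how a depth camera arrives at those distance measurements, here is a minimal sketch of stereo triangulation, the geometric principle behind depth sensing from two offset viewpoints. The focal length, baseline, and disparity values below are made-up illustrative numbers, not RealSense specifications.

```python
# Illustrative sketch: turning pixel disparity into real-world distance.
# Nearer objects shift more between two offset viewpoints, so depth is
# inversely proportional to disparity. All numbers are example values.

def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Triangulate distance from the apparent pixel shift (disparity)
    of a point between two cameras separated by baseline_m."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: 700 px focal length, 7 cm baseline, 49 px measured disparity
distance = depth_from_disparity(700, 0.07, 49)
print(f"Estimated distance: {distance:.2f} m")  # → Estimated distance: 1.00 m
```

The same relationship explains the few-meter limit mentioned above: as distance grows, disparity shrinks toward zero, so small measurement errors translate into large depth errors far from the lens.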
This has some very cool applications, including 3D mapping and augmented reality-like interaction, not to mention some nifty editing tricks like filtering an image in segments.
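That segment-filtering trick can be sketched in a few lines: once every pixel carries a depth value, an edit can be applied only to pixels beyond (or within) a chosen distance. The tiny arrays and the desaturation effect below are stand-in assumptions for illustration.

```python
import numpy as np

# Illustrative sketch of depth-segmented editing: desaturate pixels
# whose per-pixel depth exceeds a threshold, leaving the foreground
# in full color. Image and depth values here are stand-in data.

def filter_by_depth(image, depth_map, threshold_m):
    """Convert pixels farther than threshold_m to grayscale."""
    gray = image.mean(axis=-1, keepdims=True)        # per-pixel luminance
    background = depth_map[..., None] > threshold_m  # boolean depth mask
    return np.where(background, gray, image)

image = np.array([[[200.0, 50.0, 50.0],    # foreground pixel (red-ish)
                   [50.0, 200.0, 50.0]]])  # background pixel (green-ish)
depth = np.array([[0.8, 3.0]])             # depths in meters
result = filter_by_depth(image, depth, threshold_m=2.0)
# foreground pixel keeps its color; background pixel becomes its gray average
```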
This says nothing about the tech’s core ability to sense objects in front of the camera lens – which is the big part of the PC equation regarding gesture control.
Intel’s Future Plans
At their developers conference held earlier this year, Intel highlighted a number of ways their RealSense technology is bringing cutting-edge computer/human interactions to the marketplace. Here are a few they recently cited on their website:
- Food Network with Intel RealSense 3D Camera: Intel and Food Network have collaborated to create a cooking application that uses Intel RealSense technology to eliminate the need to touch the device and instead use gesture control and voice commands on an Intel RealSense 3D camera enabled device for easy scrolling through the application and navigation of recipes in real time. This will be available in the Spring of 2015.
- All-in-one PC with 3D Display Concept: Intel showcased a futuristic “3D I/O” technology that integrates Intel RealSense technology with a specialized glass screen to enable an immersive, hologram-like experience without the need for additional eyewear. It is available to experience in Intel’s CES booth. Intel RealSense 3D cameras can also provide intuitive, sight-based capabilities for solving complex problems in different applications and devices.
- Intel RealSense technology and wearables: Looking into the future, Intel CEO Brian Krzanich also demonstrated a wearable technology research project that can help vision-impaired people navigate their environments more easily and safely. The wearable solution places sensors on clothing, equipped with Intel RealSense 3D cameras that sense the vicinity and trigger vibrations as a feedback mechanism, helping people to navigate their environment.
- Intel RealSense technology inside Robotics and Multi-copter drone: Intel demonstrated an iRobot® AVA® 500 video collaboration robot equipped with Intel RealSense cameras to support the platform’s autonomous navigation and obstacle avoidance technologies. In addition, Intel highlighted the growing possibilities for multi-copter drones, including the inspection of fields and power lines, delivering goods, and even monitoring endangered species.
Our guess is that 2016 could be the year Intel branches out of the computer world with RealSense, and it’s a fairly safe bet the smartphone world will be waiting with open arms.