Friday, October 21, 2016

Fraunhofer Promises Automotive SPAD-based LiDAR in 2018

The Fraunhofer Institute is working on a SPAD-based LiDAR for autonomous cars which, in theory, could prevent accidents like the Tesla crash:

“A camera’s accuracy depends very much on the lighting available. In this case, it failed. The radar system recognized the obstacle, but couldn’t locate it precisely and mistook the truck for a road sign,” says Werner Brockherde, head of the CMOS Image Sensors business unit at the Fraunhofer Institute for Microelectronic Circuits and Systems IMS in Duisburg.

The researchers have dubbed the new generation of sensors “Flash LiDAR.” They are composed of photodiodes developed at Fraunhofer IMS known as single photon avalanche diodes (SPADs). “Unlike standard LiDAR, which illuminates just one point, our system generates a rectangular measuring field,” Brockherde explains.

“The first systems with our sensors will go into production in 2018,” Brockherde says.

Fraunhofer LiDAR SPAD sensor

ON Semi Powers Light's Multi-Aperture Camera

BusinessWire: ON Semiconductor helps Silicon Valley start-up Light by providing image sensors for its L16 multi-aperture camera. Through close collaboration with Light, ON Semiconductor supplies specially customized sensor devices based on its 1/3.2-inch format AR1335 CMOS sensor product. With up to 10 sensor devices capturing image data simultaneously, Light's L16 camera is supposed to deliver an impressive 52MP resolution, plus over 5X optical zoom, without any degradation in image quality.

Meanwhile, Light publishes some info on calibration and alignment of its 16-sensor camera:

"A camera that has one optical path worries less about what is “true” or “real” because there is only one truth, one reality. This reality can be objectively tested and optimized, but it requires adjusting only one path.

A multi-aperture camera with sixteen optical paths (apertures + mirrors + sensors) contends with sixteen realities. In order to merge those realities to create one truth (final image), the camera needs to know precisely where each optical path is relative to the others.

In Light’s Palo Alto office, we’ve been using a specially-designed calibration box to “teach” each L16 prototype where all of its optical paths are relative to the others and relative to the world it will capture. This allows the sixteen paths to behave as one - maintaining the same consistency as a camera with only one optical path."

Light L16 calibration box
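The idea of knowing "where each optical path is relative to the others" can be illustrated with a toy sketch (not Light's actual pipeline; the poses and helper are hypothetical): calibration assigns each module a rigid pose relative to a reference path, so observations from every module can be mapped into one common frame before merging:

```python
# Toy multi-camera calibration model: each optical path gets a pose
# (rotation + translation) relative to a reference path.

def apply_pose(pose, point):
    """Apply a rigid transform (3x3 rotation as nested lists, translation)."""
    R, t = pose
    return [sum(R[i][j] * point[j] for j in range(3)) + t[i] for i in range(3)]

# Hypothetical calibrated poses: module 0 is the reference (identity);
# module 1 sits 5 mm to the right of it with no rotation.
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
poses = {0: (I, [0.0, 0.0, 0.0]),
         1: (I, [0.005, 0.0, 0.0])}

# A 3D point seen in module 1's frame, expressed in the reference frame:
p_ref = apply_pose(poses[1], [0.0, 0.0, 1.0])
print(p_ref)  # the same point, shifted by the 5 mm baseline
```

With sixteen such poses measured against a known target (the calibration box), the sixteen "realities" can be merged into one consistent image.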

Chipworks Estimates iPhone 7 Camera Cost at 9.5% Total BOM

Chipworks-TechInsights' iPhone 7 reverse engineering report estimates that camera and imaging functions cost about 9.5% of the BOM:

Thursday, October 20, 2016

All-New Tesla Autopilot Has 8 Cameras

Tesla announces that all its new cars will be equipped with its own Autopilot hardware design that will eventually provide fully autonomous driving:

"Eight surround cameras provide 360 degree visibility around the car at up to 250 meters of range. Twelve updated ultrasonic sensors complement this vision, allowing for detection of both hard and soft objects at nearly twice the distance of the prior system. A forward-facing radar with enhanced processing provides additional data about the world on a redundant wavelength, capable of seeing through heavy rain, fog, dust and even the car ahead.

To make sense of all of this data, a new onboard computer with more than 40 times the computing power of the previous generation runs the new Tesla-developed neural net for vision, sonar and radar processing software."

"Teslas with new hardware will temporarily lack certain features currently available on Teslas with first-generation Autopilot hardware, including some standard safety features such as automatic emergency braking, collision warning, lane holding and active cruise control. As these features are robustly validated we will enable them over the air, together with a rapidly expanding set of entirely new features. As always, our over-the-air software updates will keep customers at the forefront of technology and continue to make every Tesla, including those equipped with first-generation Autopilot and earlier cars, more capable over time."

Tuesday, October 18, 2016

ST ToF Sensor in iPhone 7

Chipworks discovered a ToF proximity sensor, apparently made by ST, next to the front camera in the iPhone 7:

"...when we looked at the selfie camera side and took out the sub-assembly, both the ambient light sensor and the LED/sensor module were different from those in the 6s model.

When we took them off and looked at the module, it looked very STMicroelectronics-ish to us. Looking at the die, it is not the same, but definitely similar in style and die numbering (S2L012AC) to the VL53L0/S3L012BA die with the two SPAD arrays, however this time the LED is bonded on top of the ToF die to give a very compact module.

Based on this we think it is safe to conclude that the proximity sensor is now a ToF sensor that can also act as an accurate rangefinder for the selfie camera. It was also in the 7 Plus, so a good design win for STMicroelectronics. So far nothing has been announced by either Apple or STMicroelectronics, but it is yet another one of the subtle improvements that we see in the evolution of mobile phones."

Monday, October 17, 2016

Jean-Luc Jaffard Joins Chronocam

Chronocam, a Paris-based developer of event-driven vision sensors, announces that Jean-Luc Jaffard has joined the company as VP Sensor. Jaffard brings more than 30 years’ experience in the chip industry, including a lengthy career developing the imaging business at STMicroelectronics.

Thanks to PD for the info!

Optotune Lens in Machine Vision

Optotune publishes a presentation on its tunable lens applications in machine vision. A few slides from the presentation are below:

Saturday, October 15, 2016

Sharp CMOS Sensors Lineup

Sharp's CMOS sensor lineup continues to shrink. Whereas a year and a half ago Sharp had seven CMOS products, the newly released catalog features just two:

EMVA 1288 Update Released

The EMVA 1288 Release 3.1 standard adds a new template for machine vision camera datasheets, among other improvements:

"The new release is now open for public review and discussion and will become the official release 3.1 on December 30, 2016, if no objections are filed.

The new release contains only a few refinements and additions, because release 3.0 proved already to be a robust and stable release. The major progress is the new data template sheet. This makes it easy to compare the main features of cameras with data summarized in a standardized way on a single page. The two other major additions are: total SNR curve including the spatial non-uniformities, and diagrams of horizontal and vertical profiles for illustration of the spatial non-uniformities. The document can be downloaded from EMVA’s website."

Thanks to TL for the info!
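The "total SNR including the spatial non-uniformities" mentioned above can be sketched with the EMVA 1288 linear camera model (a rough Python illustration; all parameter values here are invented, not from any real datasheet): the noise budget combines shot noise, dark and quantization noise, and the DSNU and PRNU non-uniformity terms:

```python
import math

# Sketch of an EMVA 1288-style total SNR curve: shot noise plus dark and
# quantization noise plus spatial non-uniformities (DSNU, PRNU).

def total_snr(mu_p, qe=0.6, sigma_d=2.5, K=0.3, dsnu=1.0, prnu=0.01):
    """mu_p: photons/pixel; qe: quantum efficiency; sigma_d: dark noise (e-);
    K: system gain (DN/e-); dsnu in e-; prnu as a fraction of signal."""
    mu_e = qe * mu_p                        # collected electrons
    sigma_q2 = (1.0 / 12.0) / K**2          # quantization noise, in e-^2
    noise2 = sigma_d**2 + sigma_q2 + mu_e + dsnu**2 + (prnu * mu_e)**2
    return mu_e / math.sqrt(noise2)

# SNR rises with signal but saturates near 1/PRNU (here 100) at high light,
# which is why the new curve that includes non-uniformities is informative.
for photons in (10, 100, 1000, 100000):
    print(photons, total_snr(photons))
```

At low light the curve is dominated by dark and quantization noise, in the mid-range by photon shot noise, and at high light by PRNU, so plotting the total curve on a datasheet reveals limits that the temporal-noise-only SNR hides.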

ST ToF Products

ST publishes a promotional video on its ToF sensors: