STMicroelectronics unveils its 3D image sensor, the VD55H1
STMicroelectronics has announced a new family of high-resolution time-of-flight sensors that bring advanced 3D depth imaging to smartphones and other devices.
The 3D family makes its debut with the VD55H1. This sensor maps three-dimensional surfaces by measuring distance to more than half a million points. Objects can be detected up to five meters from the sensor, and even further with patterned lighting. The VD55H1 addresses new use cases in the AR/VR market, including room mapping, gaming, and 3D avatars. In smartphones, the sensor improves the performance of camera system features, including bokeh effect, multi-camera selection and video segmentation. Face authentication security is also enhanced with higher resolution and more accurate 3D images to protect phone unlocking, mobile payment and any smart system involving secure transactions and access control. In robotics, the VD55H1 provides high-fidelity 3D scene mapping for all target distances to enable new, more powerful capabilities.
“The innovative VD55H1 3D depth sensor reinforces ST’s leadership in time-of-flight and complements our full suite of depth-sensing technologies,” said Eric Aussedat, executive vice president and general manager of ST’s Imaging subgroup. “The FlightSense™ portfolio now includes direct and indirect ToF products, from all-in-one single-point sensors to sophisticated high-resolution 3D imagers enabling future generations of intuitive, smart, and autonomous devices.”
Indirect time-of-flight (iToF) sensors, such as the VD55H1, calculate the distance to objects by measuring the phase shift between the reflected signal and the transmitted signal. This is a complementary technique to direct time-of-flight (dToF) sensors, which measure the time it takes for transmitted signals to reflect back to the sensor. ST’s extensive portfolio of advanced technologies enables the company to design direct and indirect high-resolution ToF sensors and offer optimized solutions tailored to application requirements.
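The phase-to-distance relation behind iToF can be sketched in a few lines of Python. This is an illustrative sketch of the general principle, not ST code; the function names are invented for this example:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_distance(phase_rad: float, f_mod_hz: float) -> float:
    """Distance recovered from the phase shift of a modulated signal.

    The round trip adds a phase delay phi = 2*pi*f_mod * (2*d/c),
    so d = c * phi / (4*pi*f_mod). Valid only while phi stays
    below 2*pi, i.e. within the unambiguous range.
    """
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz: float) -> float:
    """Maximum distance before the measured phase wraps past 2*pi."""
    return C / (2.0 * f_mod_hz)

# At 200 MHz modulation, a pi/2 phase shift corresponds to ~0.187 m,
# and the single-frequency unambiguous range is ~0.75 m.
print(itof_distance(math.pi / 2, 200e6))   # ~0.187 m
print(unambiguous_range(200e6))            # ~0.75 m
```

The short unambiguous range at high modulation frequencies is why multi-frequency operation and phase unwrapping, mentioned below, are needed to reach multi-meter distances.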
The VD55H1’s unique pixel architecture and manufacturing process, leveraging internal 40nm stacked wafer technology, ensure low power consumption, low noise, and optimized die area. The die packs 75% more pixels than existing VGA sensors into a smaller die size.
The VD55H1 sensor is now available to mainstream customers. Volume production maturity is planned for the second half of 2022. A reference design and complete software package are available to help accelerate sensor evaluation and project development.
Additional technical information
Featuring a 672 x 804 back-side-illuminated (BSI) pixel array for iToF depth sensing, the VD55H1 is the first sensor of its kind in the industry.
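The resolution figures quoted in this release can be cross-checked with simple arithmetic: a 672 x 804 array gives just over half a million depth points, roughly 75% more than a 640 x 480 VGA sensor.

```python
# Sanity-check the pixel counts quoted in the text (plain arithmetic).
vd55h1 = 672 * 804    # depth points -> "more than half a million"
vga = 640 * 480       # standard VGA resolution
print(vd55h1)                       # 540288
print(round(vd55h1 / vga - 1, 3))   # 0.759 -> roughly 75% more than VGA
```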
It has the unique ability to operate with a modulation frequency of 200 MHz with a demodulation contrast of over 85% at 940 nm. This reduces depth noise by a factor of two compared to existing sensors, which typically operate around 100 MHz. Additionally, multi-frequency operation, an advanced depth-unwrapping algorithm, a low pixel noise floor, and high pixel dynamic range ensure superior measurement accuracy over long distances. Depth accuracy is better than 1% of distance, with typical accuracy of 0.1% of distance.
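Multi-frequency operation resolves the phase-wrapping ambiguity that a single high modulation frequency leaves behind. The sketch below is a naive brute-force illustration of the idea, not ST's proprietary algorithm; the 200 MHz / 180 MHz frequency pair is chosen here purely for illustration:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def unwrap_two_freq(d1, f1, d2, f2, d_max=5.0):
    """Brute-force dual-frequency depth unwrapping (illustrative only).

    Each frequency yields a wrapped distance d_i = d mod R_i, where
    R_i = c / (2 * f_i) is that frequency's unambiguous range. Search
    over wrap counts for the candidate pair that agrees best; the
    combined range extends to c / (2 * gcd(f1, f2)).
    """
    r1 = C / (2 * f1)
    r2 = C / (2 * f2)
    best, best_err = None, float("inf")
    for n1 in range(int(d_max / r1) + 1):
        c1 = d1 + n1 * r1
        for n2 in range(int(d_max / r2) + 1):
            c2 = d2 + n2 * r2
            err = abs(c1 - c2)
            if err < best_err:
                best, best_err = (c1 + c2) / 2, err
    return best

# True distance 3.2 m, well beyond either single-frequency range
# (~0.75 m at 200 MHz, ~0.83 m at 180 MHz), recovered from the
# two wrapped measurements.
true_d = 3.2
d1 = true_d % (C / (2 * 200e6))
d2 = true_d % (C / (2 * 180e6))
print(round(unwrap_two_freq(d1, 200e6, d2, 180e6), 3))  # 3.2
```

A production pipeline would use a closed-form lookup rather than this quadratic search, and would weight the candidates by per-frequency noise, but the range-extension principle is the same.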
Other features include a short-sequence capture mode that supports frame rates up to 120 fps and improves robustness to motion blur. Additionally, advanced clock and phase management, including a Spread Spectrum Clock Generator (SSCG), provides multi-device interference mitigation and optimized electromagnetic compatibility.
Power consumption can be reduced to less than 100 mW in some streaming modes, to help extend the runtime of battery-operated devices.
A consumer-device form-factor reference design for the VD55H1, including the illumination system, has been created. A comprehensive supporting software driver and library, containing an advanced depth-reconstruction image signal processing pipeline compatible with Android embedded platforms, are also provided.