Sony launches the world’s first dual-stacked image sensor for phones

The race to develop stellar image sensors for phones has been going on for a long time. Brands like Samsung, Apple, OPPO, Vivo and Xiaomi have invested heavily in R&D to keep improving the CMOS image sensors used in modern devices. Take, for example, optical image stabilization, pixel-binning technology, or telephoto lenses that rival those on an entry-level DSLR. Now, Sony has taken a big step forward in image sensor technology with its announcement at the IEEE International Electron Devices Meeting (IEDM), revealing details of a CMOS image sensor featuring a 2-layer transistor pixel for next-level photography on phones.

The technology developed by Sony Semiconductor Solutions is the world's first stacked image sensor of its kind, promising a wider dynamic range and excellent noise reduction. Unlike conventional CMOS image sensors, it places the photodiodes and the pixel transistors on separate substrate layers.

In a conventional stacked sensor, a pixel chip made up of back-illuminated pixels sits above a logic chip, and within the pixel chip the photodiodes that convert light into electrical signals and the pixel transistors that control those signals sit side by side on the same layer. Sony's new architecture instead stacks the photodiodes and the pixel transistors on separate layers.

This architecture roughly doubles the saturation signal level, improving image quality and dynamic range. Because the two functions occupy separate layers, the amplifier transistors (AMP) can be made larger: the pixel transistors other than the transfer gates (TRG), including the reset transistors (RST) and select transistors (SEL), sit on a layer free of photodiodes. The result is better night photography and reduced noise in darker scenes.
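Why does doubling the saturation signal widen the dynamic range? A sensor's dynamic range is commonly expressed as the ratio of its full-well (saturation) capacity to its read-noise floor. The sketch below uses that standard formula with hypothetical pixel figures, not values published by Sony, to show that doubling the saturation signal at the same noise floor adds roughly 6 dB:

```python
import math

def dynamic_range_db(full_well_e: float, read_noise_e: float) -> float:
    """Dynamic range in dB: ratio of full-well (saturation) capacity
    to the read-noise floor, both measured in electrons."""
    return 20 * math.log10(full_well_e / read_noise_e)

# Hypothetical values for a small mobile pixel (illustrative only):
conventional = dynamic_range_db(6000, 2.0)
dual_layer = dynamic_range_db(12000, 2.0)  # doubled saturation signal

print(f"conventional: {conventional:.1f} dB")  # ~69.5 dB
print(f"dual-layer:   {dual_layer:.1f} dB")    # ~75.6 dB
print(f"gain:         {dual_layer - conventional:.1f} dB")  # ~6.0 dB
```

The gain is independent of the specific numbers chosen: doubling the numerator of the ratio always contributes 20·log₁₀(2) ≈ 6.02 dB.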

According to Sony, the expanded dynamic range, improved color reproduction and excellent noise reduction will also help eliminate the risk of overexposure and underexposure when capturing images. The Tokyo-based electronics giant plans to bring the 2-layer transistor pixel technology to mobile devices in the near future.

Michael C. Garrison