Introduction to the Stacked Event-Based Vision Sensor Jointly Developed by SONY and Prophesee 2020-03-11

SONY and Prophesee S.A. announced that they have jointly developed a stacked Event-based vision sensor with the industry’s smallest pixel size of 4.86 µm and industry-leading HDR performance of 124 dB or higher. The new vision sensor and its performance results were unveiled at the International Solid-State Circuits Conference (ISSCC) held in San Francisco, CA, U.S.A., on February 16, 2020.

The new Event-based vision sensor asynchronously detects changes in brightness at each pixel and outputs only the coordinate and time data for the pixels where a change was detected, realizing highly efficient, high-speed, low-latency data output. Despite its small size and low power consumption, the sensor delivers data output with high resolution, high speed and high time resolution, a feat made possible through the integration of SONY’s stacked CMOS image sensor technology, which uses a Cu-Cu (copper-to-copper) connection*1 to achieve a smaller pixel size and excellent low-light performance, and Prophesee’s Metavision® Event-based vision technology, which delivers fast pixel response, high temporal resolution and high-throughput data output. The new sensor is suitable for a wide range of machine vision applications, such as the inspection of fast-moving objects under varied environments and conditions.
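As a rough illustration of the Event-based principle only (not SONY’s or Prophesee’s actual pipeline), the Python sketch below compares each pixel’s log intensity against its previous value and emits (x, y, timestamp, polarity) tuples only where the change exceeds a contrast threshold. Real event pixels respond asynchronously in continuous time rather than on frames, and the function name and threshold value here are illustrative assumptions.

```python
import numpy as np

def detect_events(prev_frame, curr_frame, timestamp_us, threshold=0.15):
    """Emit (x, y, timestamp, polarity) tuples only for pixels whose change
    in log intensity exceeds a contrast threshold (threshold is an assumption)."""
    delta = np.log1p(curr_frame.astype(np.float64)) - np.log1p(prev_frame.astype(np.float64))
    ys, xs = np.nonzero(np.abs(delta) > threshold)
    polarity = np.sign(delta[ys, xs]).astype(np.int8)  # +1 brighter, -1 darker
    # Each event carries only coordinates, a timestamp and a polarity;
    # unchanged pixels produce no data at all.
    return [(int(x), int(y), int(timestamp_us), int(p)) for x, y, p in zip(xs, ys, polarity)]

# Two synthetic 720p frames in which a small patch brightens.
prev = np.full((720, 1280), 100, dtype=np.uint8)
curr = prev.copy()
curr[100:110, 200:210] = 180
events = detect_events(prev, curr, timestamp_us=1_000)
print(len(events), "events; first:", events[0])
```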

[Chip photographs: pixel chip (top) and logic chip (bottom)]

Key features:
1. The Stacked Event-based vision sensor delivers the industry’s smallest pixel size of 4.86 µm. The compact, high-resolution pixel chip (top) and logic chip (bottom) incorporate a signal processing circuit that detects changes in brightness using an asynchronous delta modulation technique, and each pixel of the two chips is connected by a Cu-Cu connection in a stacked configuration. In addition to the industry’s smallest 4.86 µm pixel size, the 1/2-inch-type sensor with 1280x720 high-definition resolution achieves high-density integration using a sophisticated 40 nm logic process.
 
2. A high aperture ratio*2 achieves industry-leading HDR performance of 124 dB or higher. By placing only the back-illuminated pixels and N-type MOS transistors on the pixel chip (top), the aperture ratio is raised to 77%, achieving industry-leading 124 dB (or higher) HDR performance. The high-sensitivity and low-noise technologies SONY has developed for its CMOS image sensors over the years also allow events to be detected in low-light conditions down to 40 mlx. (A back-of-the-envelope reading of the 124 dB figure is sketched after the comparison figure below.)
 
 
[Comparison figure: traditional image sensor vs. new image sensor]
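As a back-of-the-envelope reading of the 124 dB figure (assuming the usual convention that dynamic range in dB is 20·log10 of the illuminance ratio, which is my assumption rather than a number from the announcement), the ratio is roughly 1.6 million to one, so starting from the quoted 40 mlx low-light limit the span reaches on the order of 60 klx:

```python
# Assumed convention: dynamic range (dB) = 20 * log10(illuminance ratio).
dr_db = 124                 # HDR figure quoted in the article
ratio = 10 ** (dr_db / 20)  # about 1.58e6 : 1
low_lux = 0.040             # 40 mlx low-light detection limit quoted above
high_lux = low_lux * ratio  # upper end implied by that ratio
print(f"ratio ~ {ratio:.2e}:1, upper end ~ {high_lux / 1000:.0f} klx")
```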

3. High time resolution, high-rate event data output. In a sensor that uses a frame rate, data from all pixels is output at fixed intervals determined by that frame rate, whereas the Event-based sensor uses row selection arbiter circuits*3 to read out only the pixels where a change in brightness was detected. By adding time information with 1 µs precision to the address of each such pixel, event data can be read out with high time resolution. In addition, by efficiently compressing the event data, that is, the polarity of the brightness change and the time and x/y coordinate information of each event, a high event rate of 1.066 Geps*4 has been achieved. (A minimal sketch of such an event record follows the readout diagram below.)
 
 


[Schematic diagram of event data readout: ball trajectory, frame-based image sensor vs. Event-based image sensor]
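To make the description of the event stream concrete, the sketch below represents one event as x/y coordinates, a timestamp with 1 µs resolution and a polarity bit, and packs it into a fixed 64-bit word. The field widths and layout are purely illustrative assumptions; the sensor’s actual on-chip encoding is compressed and is not described in this article.

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int         # column, 0..1279 for the 1280x720 array
    y: int         # row, 0..719
    t_us: int      # timestamp with 1 us resolution
    polarity: int  # 1 = brighter, 0 = darker

def pack(ev: Event) -> int:
    """Pack an event into 64 bits: 32-bit time | 11-bit x | 10-bit y | 1-bit polarity.
    Illustrative layout only; the real sensor compresses its event output."""
    return (ev.t_us & 0xFFFFFFFF) << 22 | (ev.x & 0x7FF) << 11 | (ev.y & 0x3FF) << 1 | (ev.polarity & 0x1)

def unpack(word: int) -> Event:
    return Event(x=(word >> 11) & 0x7FF, y=(word >> 1) & 0x3FF,
                 t_us=(word >> 22) & 0xFFFFFFFF, polarity=word & 0x1)

ev = Event(x=640, y=360, t_us=1_000_000, polarity=1)
assert unpack(pack(ev)) == ev

# At the quoted 1.066 Geps peak rate, a naive 8 bytes per event would be
# roughly 8.5 GB/s, which is why the event data is compressed on chip.
print(f"uncompressed estimate: {1.066e9 * 8 / 1e9:.1f} GB/s")
```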

Main specifications:

* 1: A technology that provides electrical continuity through connected Cu (copper) pads when stacking the back-illuminated CMOS image sensor section (top chip) and the logic circuit section (bottom chip). Compared with through-silicon via (TSV) wiring, in which the connection is made by electrodes that penetrate the area around the pixel region, this method offers greater design freedom, improves productivity, and enables a smaller size and higher performance. SONY announced the technology at the International Electron Devices Meeting (IEDM) held in San Francisco in December 2016.
* 2: The ratio of the aperture (the area excluding light-shielding parts) to the light-incident surface of each pixel.
* 3: A circuit that determines the readout priority order in the Y-axis direction in response to requests from multiple pixels whose brightness has changed. (A toy illustration of such arbitration follows these notes.)
* 4: Number of events per second
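As a toy illustration of what the arbiter in note *3 does (the actual priority scheme is not described in this article, so the round-robin order below is purely an assumption), the following sketch grants one requesting row per cycle, scanning the Y axis from just after the last granted row:

```python
def arbitrate_rows(request_rows, last_granted=-1, num_rows=720):
    """Toy row arbiter: grant one requesting row per cycle, scanning the
    Y axis round-robin from just after the last granted row. The real
    on-chip priority scheme is not described in the article."""
    requests = set(request_rows)
    for offset in range(1, num_rows + 1):
        row = (last_granted + offset) % num_rows
        if row in requests:
            return row  # the changed pixels on this row are read out next
    return None         # no pending requests

# Rows 10, 400 and 550 have pixels whose brightness changed.
print(arbitrate_rows([550, 10, 400], last_granted=500))  # -> 550
```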
 