According to a report by MacRumors, Apple is working on a custom image sensor with a dynamic range that can match that of the human eye. This in-house image sensor could be used in future iPhone models.
Back in July, the company filed a patent titled “Image Sensor With Stacked Pixels Having High Dynamic Range And Low Noise”. The patent describes an image sensor made up of two layers. The top layer, the sensor die, houses components designed for capturing light. Meanwhile, the bottom layer is the logic die, which is responsible for processing.
An integral part of the design is the Lateral Overflow Integration Capacitor (LOFIC) system, which lets each pixel store a different amount of charge depending on scene brightness. This means the sensor can capture scenes with extreme lighting differences without compromising on detail.
In addition, each pixel gets its own built-in memory circuit. The circuit detects and cancels out heat-induced electronic noise, leading to less visual grain. This correction happens on the chip itself, before the image is saved. This architecture can apparently achieve up to 20 stops of dynamic range. For comparison, the human eye has a dynamic range of around 20 to 30 stops.
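For context on what those figures mean: a "stop" is a base-2 logarithmic unit, so each additional stop doubles the contrast ratio between the brightest and darkest light levels a sensor can distinguish in a single exposure. A minimal sketch of the conversion (the function names here are illustrative, not from the patent):

```python
import math

def stops_to_contrast_ratio(stops: float) -> float:
    """Convert dynamic range in stops to a linear contrast ratio.

    Each stop doubles the ratio, so the conversion is simply 2**stops.
    """
    return 2.0 ** stops

def contrast_ratio_to_stops(ratio: float) -> float:
    """Convert a linear contrast ratio back to stops."""
    return math.log2(ratio)

# 20 stops, the figure attributed to the patented sensor:
print(f"{stops_to_contrast_ratio(20):,.0f}:1")  # 1,048,576:1
```

In other words, a 20-stop sensor could record a scene whose brightest areas are roughly a million times more intense than its darkest, in one shot.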

It is worth noting that Apple has filed plenty of patents over the years, and those do not always translate into finished products. However, the report stated that the company has already developed the sensor and could be testing it in hardware. MacRumors attributes this claim to Weibo leaker Fixed Focus Digital.
Naturally, the veracity of this particular detail is questionable, so it is wise to exercise a healthy amount of scepticism. At the moment, Apple is relying on Sony for the sensors in its iPhone models, and a change seems unlikely in the near future.
(Source: MacRumors)