.. SPDX-License-Identifier: GPL-2.0

Writing camera sensor drivers
=============================

CSI-2 and parallel (BT.601 and BT.656) busses
---------------------------------------------

Please see :ref:`transmitter-receiver`.

Handling clocks
---------------

Camera sensors have an internal clock tree including a PLL and a number of
divisors. The clock tree is generally configured by the driver based on a few
input parameters that are specific to the hardware: the external clock frequency
and the link frequency. These two parameters are generally obtained from system
firmware. **No other frequencies should be used in any circumstances.**

The reason why the clock frequencies are so important is that the clock signals
come out of the SoC, and in many cases a specific frequency is designed to be
used in the system. Using another frequency may cause harmful effects
elsewhere. Therefore only the pre-determined frequencies are configurable by the
user.

ACPI
~~~~

Read the ``clock-frequency`` _DSD property to determine the external clock
frequency. The driver can rely on this frequency being used.
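
For instance, a probe-time sketch could look as follows; ``MYSENSOR_EXT_CLK_FREQ``
is a placeholder for the frequency (or frequencies) the driver actually
supports::

    u32 ext_clk_freq;
    int ret;

    /* Read the external clock frequency from the _DSD properties. */
    ret = device_property_read_u32(dev, "clock-frequency", &ext_clk_freq);
    if (ret < 0)
        return dev_err_probe(dev, ret,
                             "failed to read clock-frequency\n");

    /* Refuse frequencies the driver has not been designed for. */
    if (ext_clk_freq != MYSENSOR_EXT_CLK_FREQ)
        return dev_err_probe(dev, -EINVAL,
                             "unsupported clock-frequency %u\n",
                             ext_clk_freq);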

Devicetree
~~~~~~~~~~

The preferred way to achieve this is using ``assigned-clocks``,
``assigned-clock-parents`` and ``assigned-clock-rates`` properties. See the
`clock device tree bindings <https://github.com/devicetree-org/dt-schema/blob/main/dtschema/schemas/clock/clock.yaml>`_
for more information. The driver then gets the frequency using
``clk_get_rate()``.
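
On the driver side, a minimal sketch, assuming the external clock is the
sensor's only clock, could look like this::

    struct clk *ext_clk;
    unsigned long ext_clk_freq;

    ext_clk = devm_clk_get(dev, NULL);
    if (IS_ERR(ext_clk))
        return dev_err_probe(dev, PTR_ERR(ext_clk),
                             "failed to get external clock\n");

    /* The rate has been configured through assigned-clock-rates. */
    ext_clk_freq = clk_get_rate(ext_clk);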

This approach has the drawback that there's no guarantee that the frequency
hasn't been modified directly or indirectly by another driver, or that it is
even supported by the board's clock tree to begin with. Changes to the Common
Clock Framework API are required to ensure reliability.

Power management
----------------

Camera sensors are used in conjunction with other devices to form a camera
pipeline. They must obey the rules listed herein to ensure coherent power
management over the pipeline.

Besides controlling the sensor's functionality, camera sensor drivers are also
responsible for managing its power state. They shall use runtime PM to manage
power states. Runtime PM shall be enabled at probe time and disabled at remove
time. Drivers should enable runtime PM autosuspend.
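
A minimal sketch of this, assuming the device has been powered up earlier in
probe, could look like the following (the autosuspend delay is only an
example)::

    /* At the end of probe, with the device powered on: */
    pm_runtime_set_active(dev);
    pm_runtime_get_noresume(dev);
    pm_runtime_enable(dev);
    pm_runtime_set_autosuspend_delay(dev, 1000);
    pm_runtime_use_autosuspend(dev);
    pm_runtime_mark_last_busy(dev);
    pm_runtime_put_autosuspend(dev);

    /* In remove, once the device is no longer in use: */
    pm_runtime_disable(dev);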

The runtime PM handlers shall handle clocks, regulators, GPIOs, and other
system resources required to power the sensor up and down. For drivers that
don't use any of those resources (such as drivers that support ACPI systems
only), the runtime PM handlers may be left unimplemented.
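
For drivers that do manage such resources, the handlers could be sketched
roughly as follows; ``struct mysensor``, ``to_mysensor()`` and the resource
fields are illustrative only::

    static int mysensor_power_on(struct device *dev)
    {
        struct v4l2_subdev *sd = dev_get_drvdata(dev);
        struct mysensor *sensor = to_mysensor(sd);
        int ret;

        ret = regulator_bulk_enable(ARRAY_SIZE(sensor->supplies),
                                    sensor->supplies);
        if (ret)
            return ret;

        ret = clk_prepare_enable(sensor->ext_clk);
        if (ret) {
            regulator_bulk_disable(ARRAY_SIZE(sensor->supplies),
                                   sensor->supplies);
            return ret;
        }

        gpiod_set_value_cansleep(sensor->reset_gpio, 0);
        usleep_range(5000, 10000);  /* sensor specific start-up delay */

        return 0;
    }

    static int mysensor_power_off(struct device *dev)
    {
        struct v4l2_subdev *sd = dev_get_drvdata(dev);
        struct mysensor *sensor = to_mysensor(sd);

        gpiod_set_value_cansleep(sensor->reset_gpio, 1);
        clk_disable_unprepare(sensor->ext_clk);
        regulator_bulk_disable(ARRAY_SIZE(sensor->supplies),
                               sensor->supplies);

        return 0;
    }

    static const struct dev_pm_ops mysensor_pm_ops = {
        SET_RUNTIME_PM_OPS(mysensor_power_off, mysensor_power_on, NULL)
    };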

In general, the device shall be powered on at least when its registers are
being accessed and when it is streaming. Drivers should use
``pm_runtime_resume_and_get()`` when starting streaming and
``pm_runtime_put()`` or ``pm_runtime_put_autosuspend()`` when stopping
streaming. They may power the device up at probe time (for example to read
identification registers), but should not keep it powered unconditionally after
probe.
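
For example, drivers implementing the ``.enable_streams()`` and
``.disable_streams()`` pad operations could handle this roughly as follows;
``mysensor_start_streaming()`` and ``mysensor_stop_streaming()`` are
hypothetical helpers that program the sensor's registers::

    static int mysensor_enable_streams(struct v4l2_subdev *sd,
                                       struct v4l2_subdev_state *state,
                                       u32 pad, u64 streams_mask)
    {
        struct mysensor *sensor = to_mysensor(sd);
        int ret;

        ret = pm_runtime_resume_and_get(sensor->dev);
        if (ret < 0)
            return ret;

        ret = mysensor_start_streaming(sensor, state);
        if (ret) {
            pm_runtime_mark_last_busy(sensor->dev);
            pm_runtime_put_autosuspend(sensor->dev);
        }

        return ret;
    }

    static int mysensor_disable_streams(struct v4l2_subdev *sd,
                                        struct v4l2_subdev_state *state,
                                        u32 pad, u64 streams_mask)
    {
        struct mysensor *sensor = to_mysensor(sd);

        mysensor_stop_streaming(sensor);

        pm_runtime_mark_last_busy(sensor->dev);
        pm_runtime_put_autosuspend(sensor->dev);

        return 0;
    }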

At system suspend time, the whole camera pipeline must stop streaming, and
restart when the system is resumed. This requires coordination between the
camera sensor and the rest of the camera pipeline. Bridge drivers are
responsible for this coordination, and instruct camera sensors to stop and
restart streaming by calling the appropriate subdev operations
(``.s_stream()``, ``.enable_streams()`` or ``.disable_streams()``). Camera
sensor drivers shall therefore **not** keep track of the streaming state to
stop streaming in the PM suspend handler and restart it in the resume handler.
Drivers should in general not implement the system PM handlers.

Camera sensor drivers shall **not** implement the subdev ``.s_power()``
operation, as it is deprecated. While this operation is implemented in some
existing drivers as they predate the deprecation, new drivers shall use runtime
PM instead. If you feel you need to begin calling ``.s_power()`` from an ISP or
a bridge driver, instead add runtime PM support to the sensor driver you are
using and drop its ``.s_power()`` handler.

See examples of runtime PM handling in e.g. ``drivers/media/i2c/ov8856.c`` and
``drivers/media/i2c/ccs/ccs-core.c``. The two drivers work in both ACPI and DT
based systems.

Control framework
~~~~~~~~~~~~~~~~~

The ``v4l2_ctrl_handler_setup()`` function may not be used in the device's
runtime PM ``runtime_resume`` callback, as it has no way to figure out the
power state of the device. This is because the power state of the device is
only changed after the power state transition has taken place. The ``s_ctrl``
callback can be used to obtain the device's power state after the power state
transition:

.. c:function:: int pm_runtime_get_if_in_use(struct device *dev);

The function returns a non-zero value if it succeeded in getting the power
count or if runtime PM was disabled, in either of which cases the driver may
proceed to access the device.
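
A sketch of such an ``s_ctrl`` callback, with an illustrative
``mysensor_write()`` register access helper, could be::

    static int mysensor_s_ctrl(struct v4l2_ctrl *ctrl)
    {
        struct mysensor *sensor = container_of(ctrl->handler,
                                               struct mysensor, ctrls);
        int pm, ret;

        /*
         * A zero return means the device is powered down; the control
         * value will be applied when the device is next powered up.
         */
        pm = pm_runtime_get_if_in_use(sensor->dev);
        if (!pm)
            return 0;

        switch (ctrl->id) {
        case V4L2_CID_ANALOGUE_GAIN:
            ret = mysensor_write(sensor, MYSENSOR_REG_ANALOGUE_GAIN,
                                 ctrl->val);
            break;
        default:
            ret = -EINVAL;
            break;
        }

        /* Drop the usage count only if it was actually acquired. */
        if (pm > 0) {
            pm_runtime_mark_last_busy(sensor->dev);
            pm_runtime_put_autosuspend(sensor->dev);
        }

        return ret;
    }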

Frame size
----------

There are two distinct ways to configure the frame size produced by camera
sensors.

Freely configurable camera sensor drivers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Freely configurable camera sensor drivers expose the device's internal
processing pipeline as one or more sub-devices with different cropping and
scaling configurations. The output size of the device is the result of a series
of cropping and scaling operations from the device's pixel array's size.

An example of such a driver is the CCS driver (see ``drivers/media/i2c/ccs``).

Register list based drivers
~~~~~~~~~~~~~~~~~~~~~~~~~~~

Register list based drivers generally, instead of being able to configure the
device they control based on user requests, are limited to a number of preset
configurations that combine a number of different parameters that are
independent at the hardware level. The driver picks such a configuration based
on the format set on a source pad at the end of the device's internal pipeline.

Most sensor drivers are implemented this way, see e.g.
``drivers/media/i2c/imx319.c`` for an example.
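
One common approach, sketched below with an illustrative ``struct
mysensor_mode`` and made-up register lists, is to describe the presets in a
table and pick the nearest one in the ``.set_fmt()`` pad operation using
``v4l2_find_nearest_size()``::

    static const struct mysensor_mode supported_modes[] = {
        { .width = 3280, .height = 2464, .reg_list = &mode_3280x2464_regs },
        { .width = 1640, .height = 1232, .reg_list = &mode_1640x1232_regs },
    };

    /* In the .set_fmt() pad operation: */
    mode = v4l2_find_nearest_size(supported_modes,
                                  ARRAY_SIZE(supported_modes),
                                  width, height,
                                  fmt->format.width, fmt->format.height);
    fmt->format.width = mode->width;
    fmt->format.height = mode->height;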

Frame interval configuration
----------------------------

There are two different methods for obtaining possibilities for different frame
intervals as well as configuring the frame interval. Which one to implement
depends on the type of the device.

Raw camera sensors
~~~~~~~~~~~~~~~~~~

Instead of a high level parameter such as the frame interval, the frame
interval is a result of the configuration of a number of camera sensor
implementation specific parameters. Luckily, these parameters tend to be the
same for more or less all modern raw camera sensors.

The frame interval is calculated using the following equation::

	frame interval = (analogue crop width + horizontal blanking) *
			 (analogue crop height + vertical blanking) / pixel rate

The formula is bus independent and is applicable for raw timing parameters on a
large variety of devices beyond camera sensors. Devices that have no analogue
crop use the full source image size, i.e. the pixel array size.

Horizontal and vertical blanking are specified by ``V4L2_CID_HBLANK`` and
``V4L2_CID_VBLANK``, respectively. The unit of the ``V4L2_CID_HBLANK`` control
is pixels and the unit of the ``V4L2_CID_VBLANK`` is lines. The pixel rate in
the sensor's **pixel array** is specified by ``V4L2_CID_PIXEL_RATE`` in the same
sub-device. The unit of that control is pixels per second.
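
As a worked example with illustrative values (analogue crop 3280x2464,
horizontal blanking 204 pixels, vertical blanking 32 lines, pixel rate 182.4
Mp/s), the frame interval comes out at roughly 47.7 ms, i.e. approximately 21
frames per second::

    u32 width = 3280, hblank = 204;     /* pixels */
    u32 height = 2464, vblank = 32;     /* lines */
    u32 pixel_rate = 182400000;         /* pixels per second */
    u64 frame_interval_us;

    /* (3280 + 204) * (2464 + 32) / 182400000 s = ~47.7 ms */
    frame_interval_us = div_u64((u64)(width + hblank) * (height + vblank) *
                                USEC_PER_SEC, pixel_rate);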

Register list based drivers need to implement read-only sub-device nodes for
this purpose. Devices that are not register list based need these to configure
the device's internal processing pipeline.

The first entity in the linear pipeline is the pixel array. The pixel array may
be followed by other entities that are there to allow configuring binning,
skipping, scaling or digital crop; see :ref:`v4l2-subdev-selections`.

USB cameras etc. devices
~~~~~~~~~~~~~~~~~~~~~~~~

USB video class hardware, as well as many cameras offering a similar higher
level interface natively, generally use the concept of frame interval (or frame
rate) at the device level in firmware or hardware. This means that the lower
level controls implemented by raw cameras may not be used in the uAPI (or even
the kAPI) to control the frame interval on these devices.

Rotation, orientation and flipping
----------------------------------

Some systems have the camera sensor mounted upside down compared to its natural
mounting rotation. In such cases, drivers shall expose the information to
userspace with the :ref:`V4L2_CID_CAMERA_SENSOR_ROTATION
<v4l2-camera-sensor-rotation>` control.

Sensor drivers shall also report the sensor's mounting orientation with the
:ref:`V4L2_CID_CAMERA_SENSOR_ORIENTATION <v4l2-camera-sensor-orientation>`
control.

Use ``v4l2_fwnode_device_parse()`` to obtain rotation and orientation
information from system firmware and ``v4l2_ctrl_new_fwnode_properties()`` to
register the appropriate controls.
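
For instance, during control handler initialisation (``sensor->ctrls`` and
``mysensor_ctrl_ops`` are illustrative names)::

    struct v4l2_fwnode_device_properties props;
    int ret;

    /* Read rotation and orientation from system firmware. */
    ret = v4l2_fwnode_device_parse(dev, &props);
    if (ret)
        return ret;

    /* Register the read-only rotation and orientation controls. */
    ret = v4l2_ctrl_new_fwnode_properties(&sensor->ctrls, &mysensor_ctrl_ops,
                                          &props);
    if (ret)
        return ret;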

Sensor drivers that have any vertical or horizontal flips embedded in the
register programming sequences shall initialize the ``V4L2_CID_HFLIP`` and
``V4L2_CID_VFLIP`` controls with the values programmed by the register
sequences. The default values of these controls shall be 0 (disabled). In
particular, these controls shall not be inverted, independently of the sensor's
mounting rotation.
203