.. SPDX-License-Identifier: GPL-2.0

.. _media_using_camera_sensor_drivers:

Using camera sensor drivers
===========================

This section describes common practices for using the V4L2 sub-device interface
to control camera sensor drivers.

You may also find :ref:`media_writing_camera_sensor_drivers` useful.

Sensor internal pipeline configuration
--------------------------------------

Camera sensors have an internal processing pipeline that includes cropping and
binning functionality. Sensor drivers fall into two distinct classes, freely
configurable and register list-based drivers, depending on how the driver
configures this functionality.

Freely configurable camera sensor drivers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Freely configurable camera sensor drivers expose the device's internal
processing pipeline as one or more sub-devices with different cropping and
scaling configurations. The output size of the device is the result of a series
of cropping and scaling operations applied to the device's pixel array size.

An example of such a driver is the CCS driver.

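As an example of what this looks like from userspace, the following minimal
sketch reads the active format on a sub-device source pad at the end of the
pipeline; the resulting size reflects the cropping and scaling configured on
the preceding pads. The device node name and pad number are hypothetical, the
real topology is discovered through the media controller API, and error
handling is omitted::

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/ioctl.h>

    #include <linux/v4l2-subdev.h>

    int main(void)
    {
            /* Source pad of e.g. a binner or scaler sub-device. */
            struct v4l2_subdev_format fmt = {
                    .which = V4L2_SUBDEV_FORMAT_ACTIVE,
                    .pad = 1,
            };
            int fd = open("/dev/v4l-subdev1", O_RDWR);

            ioctl(fd, VIDIOC_SUBDEV_G_FMT, &fmt);
            printf("%ux%u, media bus code 0x%4.4x\n",
                   fmt.format.width, fmt.format.height, fmt.format.code);

            return 0;
    }
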
Register list-based drivers
~~~~~~~~~~~~~~~~~~~~~~~~~~~

Register list-based drivers, instead of being able to configure the device they
control based on user requests, are generally limited to a number of preset
configurations. Each preset combines a number of parameters that are
independent at the hardware level. The driver picks such a configuration based
on the format set on a source pad at the end of the device's internal pipeline.

Most sensor drivers are implemented this way.

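For instance, a register list-based driver typically adjusts a format set on
its source pad to the closest preset it supports. A minimal sketch, with a
hypothetical device node, pad number and media bus code, and with error
handling omitted::

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/ioctl.h>

    #include <linux/media-bus-format.h>
    #include <linux/v4l2-subdev.h>

    int main(void)
    {
            struct v4l2_subdev_format fmt = {
                    .which = V4L2_SUBDEV_FORMAT_ACTIVE,
                    .pad = 0,       /* the sensor's source pad */
                    .format = {
                            .width = 1920,
                            .height = 1080,
                            .code = MEDIA_BUS_FMT_SGRBG10_1X10,
                            .field = V4L2_FIELD_NONE,
                    },
            };
            int fd = open("/dev/v4l-subdev0", O_RDWR);

            /* The driver adjusts the format to a preset it supports. */
            ioctl(fd, VIDIOC_SUBDEV_S_FMT, &fmt);
            printf("got %ux%u, code 0x%4.4x\n",
                   fmt.format.width, fmt.format.height, fmt.format.code);

            return 0;
    }
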
Frame interval configuration
----------------------------

There are two different methods for enumerating the possible frame intervals as
well as for configuring the frame interval. Which one to implement depends on
the type of the device.

Raw camera sensors
~~~~~~~~~~~~~~~~~~

On raw camera sensors the frame interval is not a single high-level parameter;
instead it is the result of configuring a number of sensor implementation
specific parameters. Luckily, these parameters tend to be the same for more or
less all modern raw camera sensors.

The frame interval is calculated using the following equation::

    frame interval = (analogue crop width + horizontal blanking) *
                     (analogue crop height + vertical blanking) / pixel rate

The formula is bus independent and is applicable for raw timing parameters on a
large variety of devices beyond camera sensors. Devices that have no analogue
crop use the full source image size, i.e. the pixel array size.

Horizontal and vertical blanking are specified by the ``V4L2_CID_HBLANK`` and
``V4L2_CID_VBLANK`` controls, respectively. The unit of the ``V4L2_CID_HBLANK``
control is pixels and the unit of the ``V4L2_CID_VBLANK`` control is lines. The
pixel rate in the sensor's **pixel array** is specified by
``V4L2_CID_PIXEL_RATE`` in the same sub-device. The unit of that control is
pixels per second.

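Putting the above together, the frame interval can be computed in userspace
from the blanking and pixel rate controls and the analogue crop rectangle. A
minimal sketch, assuming a hypothetical ``/dev/v4l-subdev0`` pixel array
sub-device node on which the crop selection target of pad 0 corresponds to the
analogue crop, with error handling omitted::

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/ioctl.h>

    #include <linux/v4l2-subdev.h>
    #include <linux/videodev2.h>

    int main(void)
    {
            struct v4l2_ext_control ctrl[3] = {
                    { .id = V4L2_CID_HBLANK },
                    { .id = V4L2_CID_VBLANK },
                    { .id = V4L2_CID_PIXEL_RATE },
            };
            struct v4l2_ext_controls ctrls = {
                    .which = V4L2_CTRL_WHICH_CUR_VAL,
                    .count = 3,
                    .controls = ctrl,
            };
            struct v4l2_subdev_selection sel = {
                    .which = V4L2_SUBDEV_FORMAT_ACTIVE,
                    .pad = 0,
                    .target = V4L2_SEL_TGT_CROP,    /* analogue crop */
            };
            double interval;
            int fd = open("/dev/v4l-subdev0", O_RDWR);

            ioctl(fd, VIDIOC_G_EXT_CTRLS, &ctrls);
            ioctl(fd, VIDIOC_SUBDEV_G_SELECTION, &sel);

            /* (width + hblank) * (height + vblank) / pixel rate */
            interval = (double)(sel.r.width + ctrl[0].value) *
                       (sel.r.height + ctrl[1].value) / ctrl[2].value64;
            printf("frame interval %f s (%f fps)\n", interval, 1.0 / interval);

            return 0;
    }
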
Register list-based drivers need to implement read-only sub-device nodes for
this purpose. Devices that are not register list based need these nodes to
configure the device's internal processing pipeline.

The first entity in the linear pipeline is the pixel array. The pixel array may
be followed by other entities that allow configuring binning, skipping, scaling
or digital crop, see :ref:`VIDIOC_SUBDEV_G_SELECTION
<VIDIOC_SUBDEV_G_SELECTION>`.

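For instance, a digital crop rectangle could be configured on one of these
sub-devices as follows. The device node, pad number and rectangle are
hypothetical, which selection targets are supported on which pads is driver
specific, and error handling is omitted::

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/ioctl.h>

    #include <linux/v4l2-subdev.h>

    int main(void)
    {
            struct v4l2_subdev_selection sel = {
                    .which = V4L2_SUBDEV_FORMAT_ACTIVE,
                    .pad = 0,
                    .target = V4L2_SEL_TGT_CROP,
                    .r = {
                            .left = 0,
                            .top = 0,
                            .width = 1920,
                            .height = 1080,
                    },
            };
            int fd = open("/dev/v4l-subdev1", O_RDWR);

            /* The driver may adjust the rectangle to what it supports. */
            ioctl(fd, VIDIOC_SUBDEV_S_SELECTION, &sel);
            printf("crop (%d,%d)/%ux%u\n", sel.r.left, sel.r.top,
                   sel.r.width, sel.r.height);

            return 0;
    }
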
USB cameras and similar devices
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

USB video class hardware, as well as many cameras offering a similar higher
level interface natively, generally use the concept of frame interval (or frame
rate) at the device level, in firmware or hardware. This means that the lower
level controls implemented by raw cameras may not be used in the uAPI (or even
the kAPI) to control the frame interval on these devices.

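On such devices the frame interval is therefore configured directly, for
instance with the ``VIDIOC_S_PARM`` ioctl on the video device node. A minimal
sketch, assuming a hypothetical ``/dev/video0`` node and omitting error
handling::

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/ioctl.h>

    #include <linux/videodev2.h>

    int main(void)
    {
            struct v4l2_streamparm parm = {
                    .type = V4L2_BUF_TYPE_VIDEO_CAPTURE,
            };
            int fd = open("/dev/video0", O_RDWR);

            /* Request 1/30 s; the driver adjusts it to a supported value. */
            parm.parm.capture.timeperframe.numerator = 1;
            parm.parm.capture.timeperframe.denominator = 30;
            ioctl(fd, VIDIOC_S_PARM, &parm);
            printf("frame interval %u/%u s\n",
                   parm.parm.capture.timeperframe.numerator,
                   parm.parm.capture.timeperframe.denominator);

            return 0;
    }
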
Rotation, orientation and flipping
----------------------------------

Some systems have the camera sensor mounted upside down compared to its natural
mounting rotation. In such cases, drivers shall expose the information to
userspace with the :ref:`V4L2_CID_CAMERA_SENSOR_ROTATION
<v4l2-camera-sensor-rotation>` control.

Sensor drivers shall also report the sensor's mounting orientation with the
:ref:`V4L2_CID_CAMERA_SENSOR_ORIENTATION <v4l2-camera-sensor-orientation>`
control.

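Both controls are read-only and can be read from the sensor's sub-device node,
for example as follows; the device node name is hypothetical and error handling
is omitted::

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/ioctl.h>

    #include <linux/videodev2.h>

    int main(void)
    {
            struct v4l2_control rot = { .id = V4L2_CID_CAMERA_SENSOR_ROTATION };
            struct v4l2_control ori = { .id = V4L2_CID_CAMERA_SENSOR_ORIENTATION };
            int fd = open("/dev/v4l-subdev0", O_RDWR);

            ioctl(fd, VIDIOC_G_CTRL, &rot);
            ioctl(fd, VIDIOC_G_CTRL, &ori);
            printf("rotation %d degrees, orientation %d\n",
                   rot.value, ori.value);

            return 0;
    }
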
Sensor drivers that have any vertical or horizontal flips embedded in the
register programming sequences shall initialize the :ref:`V4L2_CID_HFLIP
<v4l2-cid-hflip>` and :ref:`V4L2_CID_VFLIP <v4l2-cid-vflip>` controls with the
values programmed by the register sequences. The default values of these
controls shall be 0 (disabled). In particular, these controls shall not be
inverted, independently of the sensor's mounting rotation.
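
A minimal sketch of enabling both flips, which together rotate the image by 180
degrees; the device node name is hypothetical and error handling is omitted::

    #include <fcntl.h>
    #include <sys/ioctl.h>

    #include <linux/videodev2.h>

    int main(void)
    {
            struct v4l2_control hflip = { .id = V4L2_CID_HFLIP, .value = 1 };
            struct v4l2_control vflip = { .id = V4L2_CID_VFLIP, .value = 1 };
            int fd = open("/dev/v4l-subdev0", O_RDWR);

            ioctl(fd, VIDIOC_S_CTRL, &hflip);
            ioctl(fd, VIDIOC_S_CTRL, &vflip);

            return 0;
    }

Note that on many raw camera sensors changing the flips also changes the Bayer
order of the media bus formats available on the sensor's source pad.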
107