xref: /linux/Documentation/process/generated-content.rst (revision 23b0f90ba871f096474e1c27c3d14f455189d2d9)
============================================
Kernel Guidelines for Tool-Generated Content
============================================

Purpose
=======

Kernel contributors have been using tooling to generate contributions
for a long time. These tools can increase the volume of contributions.
At the same time, reviewer and maintainer bandwidth is a scarce
resource. Understanding which portions of a contribution come from
humans versus tools helps conserve those resources and keep kernel
development healthy.

The goal here is to clarify community expectations around tools. This
lets everyone become more productive while also maintaining a high
degree of trust between submitters and reviewers.

Out of Scope
============

These guidelines do not apply to tools that make trivial tweaks to
preexisting content, nor to tooling that helps with menial tasks.
Some examples:

 - Spelling and grammar fixups, like rephrasing to the imperative voice
 - Typing aids like identifier completion, common boilerplate, or
   trivial pattern completion
 - Purely mechanical transformations like variable renaming
 - Reformatting, like running Lindent, ``clang-format`` or
   ``rustfmt``

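Out-of-scope use can be as simple as running a formatter over a file
before submitting; as a sketch (the file path here is hypothetical)::

  $ clang-format -i drivers/foo/bar.c
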
Even when your tool use is out of scope, consider whether telling the
reviewer about the tool you used would help them review your
contribution.

In Scope
========

These guidelines apply when a meaningful amount of content in a kernel
contribution was not written by a person in the Signed-off-by chain,
but was instead created by a tool.

Detecting a problem and testing the fix for it are also part of the
development process; if a tool was used to find a problem addressed by
a change, that should be noted in the changelog. This not only gives
credit where it is due, it also helps fellow developers find out about
these tools.

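For instance, a changelog might credit the tool that found the issue;
a sketch (the driver and details here are hypothetical) could read::

  foo: fix missing error check in foo_probe()

  Smatch flagged an unchecked return value in foo_probe(). Add the
  missing check. The fix itself was written and tested by hand.
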
Some examples:

 - Any tool-suggested fix such as ``checkpatch.pl --fix``
 - Coccinelle scripts
 - A chatbot generated a new function in your patch to sort list
   entries.
 - A .c file in the patch was originally generated by a coding
   assistant but cleaned up by hand.
 - The changelog was generated by handing the patch to a generative AI
   tool and asking it to write the changelog.
 - The changelog was translated from another language.
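As a sketch of the Coccinelle case above, a minimal semantic patch
that replaces a ``kmalloc()`` followed by ``memset()`` with
``kzalloc()`` might look like::

  @@
  expression ptr, size, flags;
  @@
  - ptr = kmalloc(size, flags);
  - memset(ptr, 0, size);
  + ptr = kzalloc(size, flags);

Including such a script, or a reference to it, in the changelog lets
reviewers re-run and audit the transformation.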
59
60If in doubt, choose transparency and assume these guidelines apply to
61your contribution.
62
Guidelines
==========

First, read the Developer's Certificate of Origin in
Documentation/process/submitting-patches.rst. Its rules are simple,
have been in place for a long time, and have covered many
tool-generated contributions. Ensure that you understand your entire
submission and are prepared to respond to review comments.

Second, when making a contribution, be transparent about the origin of
its content in cover letters and changelogs. You can be more
transparent by adding information like this:

 - What tools were used?
 - The input to the tools you used, like the Coccinelle source script.
 - If code was largely generated from a single prompt or a short set
   of prompts, include those prompts. For longer sessions, include a
   summary of the prompts and the nature of the resulting assistance.
 - Which portions of the content were affected by that tool?
 - How was the submission tested, and what tools were used to test
   the fix?

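A changelog disclosure along these lines might look like this (a
hypothetical sketch, not required wording)::

  The list-sorting helper in this patch was initially generated by a
  coding assistant from the prompt "write a stable insertion sort over
  a struct list_head list", then reworked by hand. The result was
  tested against the subsystem's existing selftests on x86_64.
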
As with all contributions, individual maintainers have discretion to
choose how they handle the contribution. For example, they might:

 - Treat it just like any other contribution.
 - Reject it outright.
 - Treat the contribution specially, for example by asking for extra
   testing, reviewing with extra scrutiny, or reviewing at a lower
   priority than human-generated content.
 - Ask for some other special steps, like having the contributor
   elaborate on how the tool or model was trained.
 - Ask the submitter to explain the contribution in more detail, so
   that the maintainer can be assured that the submitter fully
   understands how the code works.
 - Suggest a better prompt instead of suggesting specific code changes.

If tools permit you to generate a contribution automatically, expect
additional scrutiny in proportion to how much of it was generated.

As with the output of any tooling, the result may be incorrect or
inappropriate. You are expected to understand, and to be able to
defend, everything you submit. If you are unable to do so, do not
submit the resulting changes.

If you submit such changes anyway, maintainers are entitled to reject
your series without detailed review.
110