Patrick Steinhardt <ps@xxxxxx> writes:

> diff --git a/parse-options.h b/parse-options.h
> index 997ffbee805..8d5f9c95f9c 100644
> --- a/parse-options.h
> +++ b/parse-options.h
> @@ -92,6 +92,10 @@ typedef int parse_opt_subcommand_fn(int argc, const char **argv,
>   * `value`::
>   *   stores pointers to the values to be filled.
>   *
> + * `precision`::
> + *   precision of the integer pointed to by `value`. Should typically be its
> + *   `sizeof()`.

Is the fact that the integer can hold only up to 16-bit vs 32-bit values
really its "precision"?  If we called it "range" or something, "my --size
option runs up to 200,000, so what value should I set here?" would be a
natural question in the minds of readers of this sentence (an actual range
check might not be a bad thing to have, but that is totally outside the
theme of this topic).

In any case, please include the phrase "number of bytes" somewhere in the
description to make it clear what unit we are counting in.

Are there common use cases already in the codebase where this number is
*not* the sizeof() of the pointed-to integer?
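
For concreteness, I imagine a caller would end up writing something like
the rough sketch below (only the `precision` field and its "size in bytes
of *value" meaning are taken from the hunk above; the variable and option
names are made up for illustration):

	static int16_t max_size;

	static struct option options[] = {
		{
			.type = OPTION_INTEGER,
			.long_name = "size",
			.value = &max_size,
			/* number of bytes of *value: 2 here, so the
			 * parser knows the variable cannot hold more
			 * than a 16-bit value */
			.precision = sizeof(max_size),
			.help = "maximum size to accept",
		},
		OPT_END(),
	};

That is, the field answers "how wide is the variable we store into", not
"how large a value does the option accept", which is why the unit needs
to be spelled out in the documentation.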