Code comments and documentation are essential for software maintainability, facilitating code understanding, modification, and debugging. While generative artificial intelligence (AI) coding tools have reshaped development practices since 2022, their impact on commenting and documentation remains uncharted territory. This paper presents an empirical study examining documentation patterns in major open-source repositories before (2018-2021) and after (2022-2025) the rise of AI-assisted programming. We analyze commit histories from six repositories (pandas, scikit-learn, TensorFlow, Django, React, Node.js) across the Python and JavaScript ecosystems, measuring documentation commit ratios, commit message quality, and documentation scope. Our findings reveal an 8.3% decrease in documentation-focused commits and a 53.4% increase in commit message detail between eras. We observe that commit messages in the post-AI era contain 56.8% more words on average (34.8 vs 22.2 words), with statistical significance ($p < 0.0001$). Documentation commits also show an 84.9% reduction in scope (260.7 vs 1,721.1 lines changed), though this change is not statistically significant. These patterns suggest that AI tools may be influencing how developers approach code documentation, with potential implications for long-term maintainability. This work provides empirical evidence to guide best practices for documentation in AI-assisted development and highlights the need for tooling that encourages comprehensive commenting across code-generation methods.