DSpace Repository

A Comparative Analysis of Large Language Models for Code Documentation Generation


dc.contributor.author Kumar, Dhruv
dc.date.accessioned 2024-08-12T10:47:23Z
dc.date.available 2024-08-12T10:47:23Z
dc.date.issued 2023-12
dc.identifier.uri https://arxiv.org/abs/2312.10349
dc.identifier.uri http://dspace.bits-pilani.ac.in:8080/jspui/xmlui/handle/123456789/15212
dc.description.abstract This paper presents a comprehensive comparative analysis of Large Language Models (LLMs) for code documentation generation. Code documentation is an essential part of the software development process. The paper evaluates models such as GPT-3.5, GPT-4, Bard, Llama 2, and StarChat on parameters including Accuracy, Completeness, Relevance, Understandability, Readability, and Time Taken, across different levels of code documentation. Our evaluation employs a checklist-based system to minimize subjectivity, providing a more objective assessment. We find that, barring StarChat, all LLMs consistently outperform the original documentation. Notably, the closed-source models GPT-3.5, GPT-4, and Bard exhibit superior performance across various parameters compared to the open-source/source-available LLMs, namely Llama 2 and StarChat. In terms of generation time, GPT-4 took the longest, followed by Llama 2 and Bard, with GPT-3.5 and StarChat having comparable generation times. Additionally, file-level documentation performed considerably worse across all parameters (except time taken) than inline and function-level documentation. en_US
dc.language.iso en en_US
dc.subject Computer Science en_US
dc.subject Large Language Models (LLMs) en_US
dc.subject GPT-3.5 en_US
dc.title A Comparative Analysis of Large Language Models for Code Documentation Generation en_US
dc.type Preprint en_US


Files in this item


There are no files associated with this item.
