
An Extended Study of the Correlation of Cognitive Complexity-related Code Measures

L. Lavazza
2022-01-01

Abstract

Several measures have been proposed to represent various characteristics of code, such as size, complexity, cohesion, and coupling. These measures are deemed interesting because the internal characteristics they quantify (which are not interesting per se) are believed to be correlated with external software qualities (such as reliability and maintainability) that are definitely interesting for developers and users. Although many measures have already been proposed for software code, new ones are continuously being introduced. However, before starting to use a new measure, we would like to ascertain that it is actually useful and that it provides some improvement over well-established measures that have been in use for a long time and whose merits have been widely evaluated. In 2018, a new code measure, named “Cognitive Complexity”, was proposed. According to its proposers, this measure should correlate with code understandability much better than traditional code measures, such as McCabe complexity. However, hardly any empirical study has shown whether the “Cognitive Complexity” measure is better than other measures. In fact, it has not even been verified whether the new measure provides knowledge about code that differs from the knowledge provided by traditional measures. In this paper, we aim at evaluating experimentally to what extent the new measure is correlated with traditional measures. To this end, we measured the code of a set of open-source Java projects and derived models of “Cognitive Complexity” based on the traditional code measures yielded by a state-of-the-art code measurement tool. We found that fairly accurate models of “Cognitive Complexity” can be obtained using just a few traditional code measures. In this sense, the “Cognitive Complexity” measure does not appear to provide additional knowledge with respect to previously proposed measures.
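As a rough illustration of the kind of modeling the abstract describes (predicting “Cognitive Complexity” from traditional code measures), here is a minimal sketch. The input file name, the column names, and the use of ordinary least squares are assumptions made for illustration; they are not taken from the paper, which may use different measures, tools, and modeling techniques.

```python
# Minimal sketch: model Cognitive Complexity as a function of traditional
# code measures collected per Java method. File and column names are
# hypothetical placeholders, not the paper's actual dataset.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# One row per method, with measures produced by a static analysis tool.
data = pd.read_csv("method_measures.csv")  # hypothetical file

predictors = ["mccabe", "loc", "nesting_depth", "halstead_volume"]  # assumed columns
X = data[predictors]
y = data["cognitive_complexity"]

# Fit a simple linear model and report how much variance it explains.
model = LinearRegression().fit(X, y)
print("R^2:", r2_score(y, model.predict(X)))
print("Coefficients:", dict(zip(predictors, model.coef_)))
```

A high R² from such a model would suggest, as the abstract concludes, that the new measure carries little information beyond what the traditional measures already provide.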
https://www.iariajournals.org/software/soft_v15_n12_2022_paged.pdf
Cognitive complexity; software code measures; McCabe complexity; cyclomatic complexity; Halstead measures; static code measures
Files in this record:
  • JAS_2022.pdf — Main article; Published version (PDF); open access; Creative Commons license; 459.57 kB (Adobe PDF)


Use this identifier to cite or link to this document: https://hdl.handle.net/11383/2143992