Evaluating Software Complexity Based on Decision Coverage


It is becoming increasingly difficult to ignore the complexity of software products. Software metrics have been proposed to indicate the quality, size, complexity, and other attributes of software products. In this paper, software metrics related to complexity are developed and evaluated. A dataset of many open source projects is built to assess the value of the developed metrics. Comparisons and correlations are conducted among the different tested projects. A classification scheme is proposed to group software code into different levels of complexity. The results showed that measuring the complexity of software products based on decision coverage gives a significant indicator of the degree of complexity of those products. However, this indicator is not exclusive, as there are many other complexity indicators that can be measured in software products. In addition, we conducted a comparison among several available tools that collect software complexity metrics. Results from those different tools were not consistent; this comparison shows the need for a unified standard for measuring and collecting complexity attributes.

Keywords: Complexity, Software Metrics, Decision Coverage, Software Quality, Testing

1 Introduction

In recent years, software products have been getting more complex. Producing software with all of its functionalities while maintaining high quality is a serious challenge. Improving software testing and measurement can help find software bugs early and hence reduce their impact [1]. However, it is very difficult to test every aspect or attribute of the software, especially when the application is very large and has many branches. Several metrics have been developed to help developers and testers during the development process, in order to guarantee the correctness of tasks and improve the maintainability of the software [2], [3], [4], [5]. Cyclomatic complexity is one metric used to measure the complexity of a program by counting the number of linearly independent paths through the source code [6]. Cyclomatic complexity is computed using the Control Flow Graph (CFG). A CFG consists of nodes and directed edges: the nodes represent commands or decisions in the program, and an edge connects two nodes (i.e., commands) when the second command can be executed after the first.
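The CFG-based computation above can be sketched as follows. This is an illustrative example, not code from the paper's SWMetrics tool: it applies McCabe's formula M = E − N + 2P (E edges, N nodes, P connected components) to a small hand-built graph for a function with a single if/else decision.

```python
# Illustrative sketch (not from the paper): cyclomatic complexity
# computed from a control flow graph as M = E - N + 2P, where E is
# the number of edges, N the number of nodes, and P the number of
# connected components (1 for a single function).

def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's cyclomatic complexity: M = E - N + 2P."""
    return len(edges) - len(nodes) + 2 * components

# CFG for a function with one if/else decision:
#   entry -> decision -> then -> exit
#                     \-> else -> exit
nodes = ["entry", "decision", "then", "else", "exit"]
edges = [("entry", "decision"), ("decision", "then"),
         ("decision", "else"), ("then", "exit"), ("else", "exit")]

print(cyclomatic_complexity(edges, nodes))  # 5 - 5 + 2 = 2
```

A complexity of 2 matches the intuition that one decision adds one independent path beyond the straight-line path.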

In this paper, measuring complexity is based on decision coverage. Decision coverage is a metric that measures the possible branches that can be followed by a flow control structure [7]. A decision is a program point at which the control flow has two or more alternative branches [8]. Decision coverage is the percentage of decision outcomes that have been tested or visited by test cases relative to the overall number of decisions [7]. The decision coverage metric will be added to the existing metrics in the SWMetrics tool developed by one of the paper's authors [13]. SWMetrics computes many metrics, such as Lines of Code (LOC), Statement Lines of Code (SLOC), cyclomatic complexity, and math counts. The objective of decision coverage testing is to show that all the decisions within a component have been executed at least once. This serves as a software complexity indicator, since more decisions in a program mean more complexity. The remainder of this paper is structured as follows: Section 2 presents a background on software metrics. Section 3 discusses some of the metrics that have been proposed to measure features of software, especially complexity; some tools that can calculate software metrics are also discussed in that section. Section 4 presents the setup of our experiments. Section 5 describes the experimental results. Section 6 concludes and summarizes the work presented in this paper.
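The percentage definition above can be expressed as a minimal sketch. The function name and inputs are hypothetical, not the paper's implementation: each decision contributes two outcomes (true and false), and coverage is the fraction of those outcomes exercised by the test suite.

```python
# Hypothetical sketch of the decision coverage metric: the percentage
# of decision outcomes visited by test cases relative to all outcomes.

def decision_coverage(visited_outcomes, total_outcomes):
    """Percentage of decision outcomes covered by the test suite."""
    if total_outcomes == 0:
        return 100.0  # no decisions: trivially fully covered
    return 100.0 * visited_outcomes / total_outcomes

# A program with 4 two-way decisions has 8 outcomes (true/false each).
# If the tests exercised 6 of them:
print(decision_coverage(6, 8))  # 75.0
```

Full decision coverage (100%) corresponds to the stated objective that every decision outcome in a component has executed at least once.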

2 Background

Software metrics provide numerical data related to the development, operation, and maintenance of a software product, project, process, etc. …
