Do you remember that new project you introduced, documented, covered by unit tests, with a clean and fully decoupled architecture, the one you were so proud of? … And after some time, poof! The project is a mess: vulnerabilities, spaghetti code, tight coupling, no style consistency. … And there is more: another tight deadline is here! This is not a science-fiction scenario. I believe many of us have lived through such days as developers.
Figure 1. – My new cool project after a while is a mess 😭 (Source).
There is high pressure on developers to meet tight deadlines, while not compromising the quality of the software, which should be clean, readable, consistent, reusable, maintainable, testable, efficient, secure, etc. Even in smaller projects or teams, it is a struggle to properly sustain the code’s quality and architecture.
It is a difficult task, and that's why constructive, high-quality feedback from code peer reviews and manual testing provides software quality assurance. But is it enough? To improve the efficiency of code peer reviews, reduce the time they require, and automate testing processes, Static Code Analysis and Dynamic Code Analysis can be used.
In this article, we will learn about static code analysis, dynamic code analysis, how they can help us, their limitations, and how to choose the right tools depending on our needs. So, if we are ready… ikuzo (let’s go).
Static Code Analysis
Static Code Analysis (also known as Static Program Analysis, Source Code Analysis or Static Analysis) is the examination of the source code that is performed without running the program (just by “reading” the code) to identify:
Code Quality issues,
Vulnerabilities (security weaknesses),
Violations of coding standards, etc.
The main advantage of static code analysis is to detect and eliminate issues early in the software development process, resulting in lower fixing cost.
In most software development teams, this analysis is probably already performed manually through code peer reviews. The downsides of manual code reviews are that they require a lot of time (i.e. they are expensive) and may not always be effective and in-depth. For these reasons, several tools have been implemented to automate this process.
Static Code Analysis Tools
Static code analysis tools review the source code automatically, based on multiple coding rules. The best-known static code analysis tools for .NET developers may be the .NET Compiler Platform (Roslyn) Analyzers, which inspect code for style, quality, maintainability, design, and other issues.
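To make the idea concrete, here is a toy rule-based checker, sketched in Python since the concept is language-agnostic (real analyzers such as Roslyn apply hundreds of far more sophisticated rules). It parses source code into a syntax tree and flags every call to `eval()`, a common security smell, without ever executing the code:

```python
import ast

def find_eval_calls(source: str) -> list[int]:
    """A toy static-analysis rule: return the line numbers of eval() calls.

    The source is only parsed ("read"), never run.
    """
    tree = ast.parse(source)
    findings = []
    for node in ast.walk(tree):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == "eval"):
            findings.append(node.lineno)
    return findings

code = "x = eval(input())\ny = 2 + 2\n"
print(find_eval_calls(code))  # reports line 1, without executing the snippet
```

The same principle, walk the syntax tree and match patterns against rules, underlies style, quality, and security diagnostics in production analyzers.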
It is important to note that static code analysis tools have limitations. They cannot identify whether the business requirements, the developer's intent, and the agreed implementation logic have been fulfilled in the code. Thus, code peer reviews remain an important part of the development process (a program cannot replace a peer review). Also, static code analysis tools may:
Identify false positive issues (i.e. issues that don’t require any fix-action) or
Not identify some actual issues (false negatives).
Choosing a Static Code Analysis Tool
Each static code analysis tool has its own features, integrations and supports different programming languages. So, it’s important to choose a static code analysis tool based on your needs, for example:
Programming Language: Supports your programming language(s).
Integrations: Integrates with your Integrated Development Environment (IDE) and your Continuous Integration (CI) system.
Suppression Feature: Provides the ability to dismiss false positive issues. Ideally, these suppressions should live in a separate file, rather than cluttering the code with suppression attributes.
Rules Extendibility Feature: Provides the ability to add new rules that would suit your team’s and your business’s requirements.
Summary Metrics: Provides a summary of the metrics under investigation.
Collaborative/Reporting Features: Provides a way to share the project's metrics with developers and management.
Fast Analysis Results: The static code analysis will be executed many times, so it's important that it does not slow the developers down. If it does, they will avoid using it.
Dynamic Code Analysis
Dynamic code analysis (also known as Dynamic Testing or Dynamic Program Analysis) is the opposite of static code analysis. In dynamic code analysis, the examination of the code is performed while the code is running. The main idea is to interact with the running application by providing it with different inputs (test data) and examining the results.
The test data can include cases that examine different business scenarios, but also malicious inputs such as extreme values (very long strings, negative and very large positive numbers), unexpected inputs, SQL injection payloads, etc.
As we can understand, the efficiency of such an analysis depends on the quality and quantity of the input test data. The code coverage measure can be used to describe the degree to which the source code is executed by the selected input test data.
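The idea of feeding extreme and malicious inputs to running code can be sketched as follows. The `parse_quantity` function is hypothetical, invented here purely for illustration; the point is that each input is actually executed and the program must fail safely rather than crash or misbehave:

```python
def parse_quantity(raw: str) -> int:
    """Hypothetical input handler: accepts integers from 1 to 1000."""
    try:
        value = int(raw)
    except ValueError:
        raise ValueError("not a number")
    if not 1 <= value <= 1000:
        raise ValueError("out of range")
    return value

# Dynamic analysis in miniature: run the code against extreme and
# malicious inputs and observe how it behaves.
test_inputs = [
    "A" * 100_000,                  # extreme: very long string
    "-1",                           # extreme: negative number
    "999999999999999999999",        # extreme: very large positive number
    "1; DROP TABLE orders;--",      # malicious: SQL-injection-style payload
]
for raw in test_inputs:
    try:
        parse_quantity(raw)
        print(f"accepted: {raw[:20]!r}")
    except ValueError as error:
        print(f"rejected safely: {error}")
```

Here every hostile input is rejected with a controlled error, which is exactly the behavior such tests are meant to confirm.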
Dynamic code analysis can be used to identify critical cases, such as:
Runtime Vulnerabilities (e.g. security threats).
Program Reliability (e.g. program errors, memory leaks, race conditions, etc.).
Response Time (e.g. delays on specific requests or scenarios).
Consumed Resources (e.g. CPU usage, memory usage, number of third-party requests, etc.).
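A response-time check, for instance, can be as simple as timing the code while it runs. The sketch below uses a stand-in `handle_request` function and an arbitrary 50 ms budget, both of which are assumptions for illustration only:

```python
import time

def measure_response_time(func, *args) -> float:
    """Run func with args and return the elapsed wall-clock time in seconds."""
    start = time.perf_counter()
    func(*args)
    return time.perf_counter() - start

def handle_request(n: int) -> int:
    # Stand-in for a real request handler doing some work.
    return sum(i * i for i in range(n))

elapsed = measure_response_time(handle_request, 10_000)
print(f"handled in {elapsed * 1000:.2f} ms")
```

In a real setup, such timings would be collected over many runs and scenarios, and compared against agreed performance budgets.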
To perform such tests, significant computational resources are required. In addition, an isolated (testing) environment with all the necessary dependencies on third-party resources (e.g. databases, APIs, etc.) is required so production systems aren’t affected.
Performing Dynamic Code Analysis
Dynamic code analysis can be performed with both white-box and black-box testing. In white-box testing, we use knowledge of the internal structure of the code to design the test cases. For example, white-box dynamic code analysis can be performed with unit and integration tests.
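A white-box test, in other words, is designed by reading the code. In the sketch below (a hypothetical `apply_discount` pricing rule, invented for illustration), knowing that the implementation branches at a 100-unit threshold lets us deliberately target both branches and the boundary:

```python
def apply_discount(total: float) -> float:
    """Hypothetical pricing rule: 10% off for orders of 100 or more."""
    if total >= 100:
        return total * 0.9
    return total

# White-box test design: the cases below, at, and above 100 were chosen
# because we read the code and saw the branch at that threshold.
assert apply_discount(99) == 99        # below threshold: no discount
assert apply_discount(100) == 90.0     # boundary: discount applies
assert apply_discount(200) == 180.0    # above threshold: discount applies
print("all branch cases pass")
```

Choosing cases around known internal branches is what lets white-box tests reach high code coverage with few inputs.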
On the contrary, in black-box testing, we do not use knowledge of the internal structure of the code to examine its functionality. For example, black-box dynamic code analysis can be performed with integration tests and third-party utilities.
These third-party utilities can support the identification of several pre-defined cases (e.g. vulnerabilities) or they can “record” the performed actions as the program is being executed, to be re-executed easily afterwards. The selection of these utilities is based on the critical cases that we would like to identify.
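In contrast to the white-box approach, a black-box check derives its cases purely from the specification, never from the implementation. The sketch below assumes a hypothetical contract ("the function returns the same items in non-decreasing order") and hammers the code, treated as opaque, with random inputs:

```python
import random
from collections import Counter

def sort_numbers(items):
    """Treated as a black box: we know its contract, not its internals."""
    return sorted(items)

# Black-box property checks derived only from the specification.
for _ in range(100):
    data = [random.randint(-1000, 1000) for _ in range(50)]
    result = sort_numbers(data)
    # Property 1: output is in non-decreasing order.
    assert all(a <= b for a, b in zip(result, result[1:]))
    # Property 2: output contains exactly the same items as the input.
    assert Counter(result) == Counter(data)
print("specification holds for 100 random inputs")
```

This is the same principle behind fuzzing and property-based testing utilities, which generate far more inputs and far nastier ones than shown here.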
Summary
Developers are under high pressure to meet tight deadlines while not compromising the quality of the software. Software quality can be described by several attributes (e.g. Maintainability, Security, Efficiency, etc.), which require a high effort to accomplish.
There are manual processes (e.g. code peer reviews and manual testing) that can help maintain and improve software quality, but they are expensive (because they require a lot of time) and may not always be effective. Static code analysis tools and dynamic code analysis (both white-box and black-box) should be used alongside the existing manual processes to boost software quality.
Static Code Analysis tools provide diagnostics (about code quality, coding standards, etc.) early in the software development process, resulting in a lower fixing cost. However, they cannot identify whether the business requirements have been fulfilled in the code.
Dynamic Code Analysis (for example as unit tests, integration tests and third-party utilities) can identify vulnerabilities, memory leaks, race conditions, etc.
The selection of these tools and methods should be based on our needs (programming language, integrations, etc.) and goals (e.g. find threats quickly, keep low response times, improve memory usage, etc.).
To achieve the highest quality in our software, we have to use various tools and methods from both Static Code Analysis and Dynamic Code Analysis. The use of these tools will educate us about the rules that we should follow and their impact, thus our skills will keep improving.
In future articles, I will share my experiences of using some of these tools in .NET projects, so stay tuned!