Financial apps, streaming services, logistics, e-commerce, and IoT solutions have been gaining new ground as the advantages of the microservices architecture have become increasingly obvious. Its business benefits include easier app development and maintenance, increased productivity, and higher app execution speed. Moreover, the microservices architecture allows companies to combine different technologies and platforms in one system and to accelerate development through continuous integration and distributed teams, each of which can be fully responsible for a separate microservice.
However, all these advantages come with a number of difficulties that should be taken into account when developing a microservice application. Testing is one of the most common issues, turning into a rather complex, effort- and time-consuming project phase, so implementing automated testing becomes an important part of Continuous Integration & Continuous Delivery (CI/CD). This approach also encourages companies to integrate quality assurance experts into development teams early, increasing the business value of the solution, shortening time-to-market, and ensuring higher-quality code throughout the development cycle.
In this article, I would like to share my experience of deploying and writing automated tests for a microservice web application running in Docker in the Amazon AWS cloud. Auriga’s customer uses this application for planning and calculating the cost of jobs performed for their end clients. Prior to the introduction of automated testing, unit tests were already used on the project, but they often failed to detect some important bugs.
Project Architecture
The system consists of more than 10 services. Some of them are written in .NET Core, while others are written in Node.js. Each service runs in a Docker container in Amazon Elastic Container Service. Each service has its own Postgres database, although some microservices also work with Redis. There are no shared databases. If several services need the same data, changes to that data are propagated to each of them via SNS (Amazon Simple Notification Service) and SQS (Amazon Simple Queue Service), and each service stores its own copy in its dedicated database.
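This fan-out pattern can be sketched with the AWS CLI; the topic name, queue name, and ARNs below are placeholders rather than the project's real identifiers:

```bash
# Hypothetical names: a topic for data-change events and a queue
# through which one consumer service receives its own copy.
aws sns create-topic --name job-data-changed
aws sqs create-queue --queue-name pricing-service-job-data

# Subscribe the queue to the topic so every published change
# is fanned out to this service's queue as well.
aws sns subscribe \
  --topic-arn arn:aws:sns:us-east-1:123456789012:job-data-changed \
  --protocol sqs \
  --notification-endpoint arn:aws:sqs:us-east-1:123456789012:pricing-service-job-data
```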
Most of the services are not available directly from the Internet. Access goes through an API Gateway that verifies access rights; the gateway is itself a separate service that also needs to be tested.
In addition, the application uses SignalR as a real-time notification service. It is accessible directly from the Internet and handles OAuth itself, because building WebSocket support into the gateway turned out to be less practical than integrating OAuth into the notification service.
Reasons for Failure of Traditional Testing Approach
Prior to using Docker, the project’s team widely implemented unit tests using mock objects, but engineers regularly faced difficulties dealing with data, including the following:
- poor consistency of test data in the relational database
- poor interaction between tests and cloud services
- incorrect assumptions made when scripting mock objects.
During development, it became clear that unit tests did not let us find all problems in a timely manner. Moreover, the system uses a Postgres database, and we did not succeed in automatically launching Postgres and running migrations as part of the test setup.
After examining all possible workarounds, we decided to approach the problem from a different angle by doing the following:
- implementing integration tests and, thus, communicating with a real database rather than a mock object
- testing every microservice in a Docker container, since recreating the initial state in a unit test is highly error-prone
- using local substitutes for cloud services to keep tests reproducible and independent of the Internet.
Benefits of Testing Using Docker
Besides the obvious benefits of testing with Docker, such as automating the testing process (and thereby speeding up the project itself) and catching bugs that traditional testing had missed, our solution has a number of other advantages.
Firstly, the validity of tests is ensured by running them against the very same Docker images that go into production, thereby reducing the number of errors.
Secondly, the same script is run both by the developers themselves on their Windows desktops and by the GitLab CI server under Linux. Thus, the introduction of new tests does not require installing additional tools either on the developer’s computer or on the server side where the tests are run on commit.
Thirdly, the test environment is deployed and the tests run on the local server. This guarantees that the automated tests will always run and that we will not waste time digging through logs to find out why the process stopped. Using LocalStack eliminates requests to Amazon and, with them, the problem of issuing too many requests.
Moreover, Docker allows you to run tests without occupying the main test environment and without interfering with the work of developers.
Finally, after the testing cycle completes, the system must be returned to its original state. In Docker, this is done by stopping and deleting the containers, which takes only a couple of commands.
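As a minimal illustration (assuming the database keeps its data in a named volume), a full reset is a single command; `--volumes` drops those volumes so the next run starts from a clean state:

```bash
# Stop and remove all containers of the compose project;
# --volumes also removes named volumes (e.g., Postgres data).
docker-compose down --volumes --remove-orphans
```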
Setting Up Test Environment
The first task is to deploy the test environment. The steps required to start a microservice are as follows:
- Configure the service under test for the local environment; the environment variables specify the details for connecting to the database and AWS.
- Start Postgres and run the migration using Liquibase.
In a relational DBMS, you need to create the data schema (i.e., tables) before writing data to the database. When the application is updated, the tables must be brought in line with the new version without data loss; this is migration. Creating tables in an initially empty database is just a special case of migration. Migration can be built into the application itself; both .NET and Node.js have migration frameworks. In our case, for security reasons, the microservices are not allowed to change the data schema, so migration is performed by Liquibase.
- Start Amazon LocalStack, a local implementation of AWS services. There is a ready-made LocalStack image on Docker Hub.
- Run the script that creates the required entities in LocalStack. The shell scripts use the AWS CLI; a sketch covering these steps appears below.
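Here is a condensed sketch of the migration and LocalStack preparation steps; the host names assume the Docker Compose network described later, and all entity names, credentials, and file names are illustrative. Recent LocalStack versions expose every service on a single edge port (4566) and accept arbitrary credentials:

```bash
#!/bin/bash
# Apply schema migrations with Liquibase (run from its container);
# the URL, credentials, and changelog file are placeholders.
liquibase \
  --url=jdbc:postgresql://postgres:5432/app \
  --username=app --password=secret \
  --changeLogFile=changelog.xml \
  update

# Prepare LocalStack: any credentials work, and all requests
# go to its edge port instead of the real AWS endpoints.
export AWS_ACCESS_KEY_ID=test AWS_SECRET_ACCESS_KEY=test AWS_DEFAULT_REGION=us-east-1
ENDPOINT=http://localstack:4566

aws --endpoint-url="$ENDPOINT" sqs create-queue --queue-name job-updates
aws --endpoint-url="$ENDPOINT" sns create-topic --name job-data-changed
```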
To execute requests to the microservice, we use Postman, which allows us to create test scripts.
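Newman's official Docker image can run such a collection directly; the collection file, network name, and base URL variable below are assumptions for illustration:

```bash
# Run the exported Postman collection inside the compose network.
docker run --rm --network myapp_default \
  -v "$PWD/tests:/etc/newman" \
  postman/newman run api-tests.postman_collection.json \
  --env-var "baseUrl=http://service-under-test:5000"
```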
How Automated Tests Work
During the test, everything works in Docker: the service under test, Postgres, the migration tool, and Postman, or rather its console version – Newman. Docker solves the following problems:
- Independence of host configuration
- Installing dependencies: Docker downloads images from Docker Hub
- Resetting the system to its original state: just delete the containers.
Docker Compose connects the containers into a virtual network isolated from the Internet, in which containers find each other by their service names.
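A minimal compose file makes this concrete; the service names and images below are illustrative, not the project's actual configuration:

```bash
# Sketch of a docker-compose.yml: every service lands on one isolated
# network, where containers resolve each other by service name
# (e.g., postgres:5432 or localstack:4566).
cat > docker-compose.yml <<'EOF'
version: "3"
services:
  service-under-test:
    image: myapp/service:latest          # same image that goes to production
    environment:
      DB_HOST: postgres                  # resolved via the compose network
      AWS_ENDPOINT: http://localstack:4566
  postgres:
    image: postgres:12
    environment:
      POSTGRES_PASSWORD: secret
  localstack:
    image: localstack/localstack
EOF
```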
The test is controlled by a shell script. We use Git Bash to run the test on Windows, so one script is enough for both Windows and Linux. All developers on the project have Git and Docker installed, and installing Git on Windows installs Git Bash as well, so everyone already has it.
The script performs the following steps (a sketch of the full script appears after the list):
- Building Docker images: `docker-compose build`
- Starting the DB and LocalStack: `docker-compose up -d <container>`
- Running the database migration and LocalStack preparation: `docker-compose run <container>`
- Launching the service under test: `docker-compose up -d <service>`
- Launching the test (Newman)
- Stopping all containers: `docker-compose down`
- Posting results to Slack: we have a channel where each run is reported with a green checkmark or a red cross and a link to the log.
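A condensed sketch of such a script follows; the service names, the Slack webhook URL, and the use of GitLab's `CI_JOB_URL` variable are assumptions, and error handling is trimmed for brevity:

```bash
#!/bin/bash
set -e

docker-compose build                            # build Docker images
docker-compose up -d postgres localstack        # start DB and LocalStack
docker-compose run --rm migrations              # run Liquibase migration
docker-compose run --rm localstack-init         # create AWS entities
docker-compose up -d service-under-test         # launch the tested service

# Run the Postman collection via Newman; capture the verdict
# instead of aborting, so cleanup and reporting still happen.
if docker-compose run --rm newman; then
  RESULT=":white_check_mark: tests passed"
else
  RESULT=":x: tests failed"
fi

docker-compose down                             # stop and remove containers

# Post the verdict and a log link to the Slack channel.
curl -X POST -H 'Content-type: application/json' \
  --data "{\"text\": \"$RESULT $CI_JOB_URL\"}" \
  "$SLACK_WEBHOOK_URL"
```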
The following Docker images are used in these steps:
- The service under test uses the same image as production; the configuration for the test is supplied through environment variables.
- For Postgres, Redis, and LocalStack, ready-made images from Docker Hub are used. There are also ready-made images for Liquibase and Newman; based on them, we build our own images by adding our files.
- A pre-built AWS CLI image is used for LocalStack preparation; based on it, we build an image containing the preparation script (a sketch of such a derived image follows).
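For example, extending the stock Liquibase image with our changelog files might look like this; the file names and tags are illustrative:

```bash
# Bake the changelogs into a derived image so no volume mounts
# are needed (see the note on volumes below).
cat > Dockerfile.migrations <<'EOF'
FROM liquibase/liquibase
COPY changelog/ /liquibase/changelog/
EOF
docker build -f Dockerfile.migrations -t myapp/migrations .
```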
Using volumes would let us avoid building a Docker image just to add files to a container. However, volumes are not suitable for our environment, because GitLab CI tasks themselves run in containers: you can manage Docker from such a container, but volumes do not work there.
Problems We Solved
In the process of integration test automation, we encountered and solved the following problems:
- conflicts of parallel tasks in one Docker host
- conflicts of identifiers in the database during test iterations
- waiting for microservices to be ready
- merging and outputting logs to external systems
- testing outgoing HTTP requests
- testing web sockets (using SignalR)
- testing OAuth authentication and authorization.
To solve these problems, in some cases it was enough to write separate scripts or add steps to Postman; for testing web sockets, we wrote an additional special-purpose tool.
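As one example, the readiness problem can be handled by polling a health endpoint before starting Newman; the endpoint URL below is hypothetical and assumes the service exposes a health check:

```bash
# Poll a (hypothetical) health endpoint until the service answers,
# giving up after 60 attempts (about one minute).
wait_for_service() {
  local url=$1
  for _ in $(seq 1 60); do
    if curl -sf "$url" > /dev/null; then
      return 0
    fi
    sleep 1
  done
  echo "Service at $url did not become ready" >&2
  return 1
}

wait_for_service http://service-under-test:5000/health
```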
Conclusion
The decision to deploy automated testing using Docker resulted in a set of stable tests in which each service runs while interacting with a real database and Amazon LocalStack. These tests significantly reduce the risk of errors, which are inevitable in a project that develops and maintains an application with complex interaction among 10+ microservices, frequent regular deployments, and more than 30 developers working in different locations and remotely. As a result, they allow the customer to quickly obtain a reliable, high-quality solution. In 2016, Puppet surveyed more than 4,600 developers and found that teams using containers in conjunction with DevOps or Agile practices deployed new applications, on average, 200 times faster than teams using the waterfall model. In our project, the customer received the first release of the system after 12 months of development, and the number of errors on the backend side did not exceed 1 per 20 commits.