API data-driven testing (DDT) describes a way of testing where test input data, assertion keys and expected values are all driven from a table. Typically the table is a spreadsheet in which every row represents a test step or an API call.
The advantage of DDT is the ease of adding rows to the spreadsheet as new test scenarios are added to the product (API) under test. Also, in DDT, the test environment settings and controls are not hard-coded.
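For illustration only, a single row in such a table might carry values like the following (these column names are hypothetical, not the exact headings used in this framework's spreadsheet):

Endpoint: /orders/12345 | Method: GET | Expected HTTP Code: 200 | Assertion Key: $.order.status | Expected Value: CONFIRMED | Run: TRUE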
This API DDT Framework is built on the SmartBear ReadyAPI product. To use the framework you will need a license, or you can download the trial version of SmartBear ReadyAPI.
If you adopt this framework for your API testing, you will not only save a lot of time and money, you will also be able to add security and performance testing to your API product without any extra effort, as both test types are embedded in this DDT framework.
Before you go any further, it is worth going through the article I published on my blog titled DX: How to boost the performance of your APIs - Part I.
- Create new directory
$ mkdir api-ddt-framework
$ cd api-ddt-framework
- Initialize git repository in the newly created directory
$ git init
Initialized empty Git repository in /Users/{username}/Downloads/api-ddt-framework/.git/
$
- Get a copy of this repo
$ git clone git@github.com:earth2digital/automated-api-ddt-framework.git
Cloning into 'automated-api-ddt-framework'...
Enter passphrase for key '/Users/{Username}/.ssh/id_rsa':
Enter passphrase for key '/Users/{Username}/.ssh/id_rsa':
remote: Counting objects: 126, done.
remote: Compressing objects: 100% (109/109), done.
remote: Total 126 (delta 62), reused 58 (delta 15), pack-reused 0
Receiving objects: 100% (126/126), 83.42 KiB | 199.00 KiB/s, done.
Resolving deltas: 100% (62/62), done.
$
The Excel spreadsheet included in the API DDT Framework contains two types of sheets: an Overview sheet and Microservice sheets. The Overview sheet gives an overview of the Microservices that are in scope for testing. The Location column highlighted in red determines which sheet(s) are in scope for testing.
The API DDT Framework loops over the list in that column to get all the in-scope sheets and executes all the test steps included in every sheet. Every sheet represents a Microservice to be tested. If you want to exclude one of the sheets (or Microservices) from testing, remove its location value from the column highlighted in red or remove the whole row.
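To make that loop concrete, below is a minimal Groovy sketch of the idea using the Apache POI library (an assumption for illustration; this is not necessarily how the framework reads the workbook). The workbook file name, sheet name and Location column index are also illustrative assumptions, not the framework's actual values.

import org.apache.poi.ss.usermodel.WorkbookFactory

// Illustration only: file name, sheet name and Location column index are assumptions.
def workbook = WorkbookFactory.create(new File('ApiDdtData.xlsx'))
def overview = workbook.getSheet('Overview')
def inScopeSheets = []

// Skip the header row; rows with an empty Location cell are out of scope.
for (int r = 1; r <= overview.lastRowNum; r++) {
    def location = overview.getRow(r)?.getCell(3)?.toString()?.trim()
    if (location) {
        inScopeSheets << location
    }
}

// Each in-scope sheet represents one Microservice; its rows are the test steps to execute.
inScopeSheets.each { name ->
    def msSheet = workbook.getSheet(name)
    println "Sheet '${name}' has ${msSheet ? msSheet.lastRowNum : 0} test step rows"
}
workbook.close()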
The Microservice sheet has all the test steps that need to be executed in all test scenarios for a particular Microservice. In this example, the Microservices to be tested are:
- Account
- Notification
- Order
- Connection
Every Microservice sheet has two sets of test input data: Request Parameters and Response Assertions. The API DDT Framework supports up to 13 assertions on the API response. The assertions are of two types: HTTP response code and JSON response message payload. For the response payload type, you can define any key you want as a JSON Path (to be retrieved from the JSON payload of the response) and an expected value as either plain text or another JSON Path into the response message.
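To illustrate the payload assertion type, the Groovy sketch below shows how a JSON Path key and an expected value taken from a sheet row would be checked against a response body. The payload, path and expected value are made up, and the Jayway json-path library used here is an assumption for illustration, not necessarily what the framework uses internally.

import com.jayway.jsonpath.JsonPath

// Illustration only: payload, JSON Path key and expected value are hypothetical.
def responseBody = '{"order": {"id": "12345", "status": "CONFIRMED"}}'

def assertionKey  = '$.order.status'   // the assertion's key column (a JSON Path)
def expectedValue = 'CONFIRMED'        // the assertion's expected value column (plain text)

def actualValue = JsonPath.read(responseBody, assertionKey)
assert actualValue == expectedValue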
The Microservice sheet also has a column called "Run", highlighted in red. That column can be used to mark steps that should be skipped: set it to true to run the step or false to skip it.
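As a rough Groovy sketch of that check (the exact cell handling here is an assumption, not the framework's real code), anything other than an explicit true can be treated as a skip:

// Illustration only: treat anything other than an explicit "true" as "skip this step".
def shouldRunStep(String runCellValue) {
    return runCellValue?.trim()?.equalsIgnoreCase('true')
}

assert shouldRunStep('TRUE')
assert !shouldRunStep('false')
assert !shouldRunStep(null)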
For now, you can either update the sheets accordingly and run the API DDT Framework, or run it without any updates, as the example sheets have been tested and work for Functional, Security and Performance testing (although performance testing needs a bit of tweaking to give accurate results).
To run the Functional Test cases, select the project name "GenericProject_v7" in the left-hand side menu and then click the green play arrow highlighted in red.
To run the Security Test cases, select the Test Suite "SecurityTest" in the left-hand side menu and then click the green play arrow highlighted in red.
To run the Performance Test cases, select the Test Suite "LoadTest" in the left-hand side menu and then click the green play arrow highlighted in red.
- Run Functional Test
$ sudo /opt/SmartBear/ReadyAPI-2.3.0/bin/testrunner.sh -e https://api.myprototype.com.au -FPDF -R"Project Report" /Users/{username}/downloads/api-ddt-framework/automated-api-ddt-framework -f /Users/{username}/downloads/api-ddt-framework
$
- Run Security Test
$ sudo /opt/SmartBear/ReadyAPI-2.3.0/bin/securitytestrunner.sh /Users/{username}/downloads/api-ddt-framework/automated-api-ddt-framework -c "SecurityTest" -d https://api.myprototype.com.au -FPDF -R"SecurityTest Report"
$
- Run Performance Test
$ sudo /opt/SmartBear/ReadyAPI-2.3.0/bin/loadtestrunner.sh -E test -FPDF /Users/{username}/downloads/api-ddt-framework/automated-api-ddt-framework -n "LoadTest" -S "Arriving VU/s,Time Taken"
$
To generate the project report, select the Project Name "GenericProject_v7" in the left-hand side menu and then click the gray report icon highlighted in red.
To save the report as PDF or any other format, click on the save icon highlighted in red.
While running the test cases from the command line, you might experience problems because the Microsoft fonts are not installed on AWS Linux. To solve the issue, execute the commands below:
$ wget http://corefonts.sourceforge.net/msttcorefonts-2.5-1.spec
$ wget https://www.cabextract.org.uk/cabextract-1.6-1.i386.rpm
$ sudo yum install cabextract-1.6-1.i386.rpm
$ sudo yum install -y rpmdevtools rpm-build
$ rpmbuild -ba msttcorefonts-2.5-1.spec
$ sudo yum install /home/ec2-user/rpmbuild/RPMS/noarch/msttcorefonts-2.5-1.noarch.rpm