
Your first test

This article explains how to create and run a basic test by manually entering a URL.

There are three main steps when you run a performance test:

  1. Create the test plan
  2. Run the test
  3. Review the results

Create a test plan

The steps for creating a basic test plan are shown in the video below:

A test plan is a form in which each field can be modified to suit the user's needs. For a first test, most of the fields are pre-filled with recommended values.

The only two pieces of information missing are a name for the test plan (called BasicTest in the example above) and the target of the test, i.e. the application we want to test. In this example, we enter a URL manually.
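
For readers who prefer to see the fields at a glance, here is a minimal sketch, in Python, of the information a basic test plan carries. The structure and field names are hypothetical, invented for illustration; only the plan name, the idea of a target URL, the 600 ms threshold, and the linear ramp default come from this article.

    # Hypothetical sketch of a basic test plan's fields; not DiveCloud's
    # actual format. Values reflect the walkthrough in this article.
    basic_test_plan = {
        "name": "BasicTest",                 # the name chosen for the plan
        "target_url": "http://example.com",  # the application under test (assumed URL)
        "threshold_ms": 600,                 # pass/fail threshold (default)
        "traffic_model": "linear_ramp",      # default traffic model in DiveCloud
    }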

 

Run a test plan

Running a test is easy: simply select the test in the Test Plans table and click the RUN TEST button, as shown below.

Note that once the test starts running, the first phase is Queueing: the test is pushed to the load generators, which will execute it. Once execution starts, a message appears on the progress bar. In addition, if there are errors, a message shows the percentage of errors up to that point. So, if for any reason the test is causing too many errors, it can be canceled by clicking the STOP TEST button.
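
As a rough illustration of that lifecycle, the sketch below polls a test's status and stops it when the error percentage climbs too high. The get_status and stop_test helpers, the status fields, and the 10% cutoff are all assumptions for illustration; DiveCloud's actual interface is the RUN TEST and STOP TEST buttons described above.

    import time

    MAX_ERROR_PERCENT = 10  # assumed cutoff; decide what "too many errors" means for you

    def monitor(test_id, get_status, stop_test, poll_seconds=5):
        # Hypothetical phases mirroring the article: Queueing -> executing -> analyzing.
        while True:
            status = get_status(test_id)  # e.g. {"phase": "executing", "error_percent": 3.2}
            if status["phase"] in ("analyzing", "completed"):
                return status
            if status["phase"] == "executing" and status["error_percent"] > MAX_ERROR_PERCENT:
                stop_test(test_id)  # same effect as clicking STOP TEST
                return status
            time.sleep(poll_seconds)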


The last phase is Analyzing, during which the data from the test are processed and prepared for display.

When the test completes, an email is sent to the account owner, and the outcome of the test is displayed:

 

We are now ready for the last step!

Review the results

From the table above, clicking on the RESULT field opens the Result home page:

At the top there is an executive summary, with a brief description of the test and the main outcome. In this case, the test outcome was FAIL. The outcome is based on a threshold value, configurable in the test plan, which is set to 600 ms by default.
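
In other words, the outcome is a simple threshold check. A minimal sketch, assuming the summary reports an average response time in milliseconds:

    def test_outcome(avg_response_ms, threshold_ms=600):
        # FAIL when the measured average exceeds the configurable threshold
        # (600 ms is the default mentioned above).
        return "PASS" if avg_response_ms <= threshold_ms else "FAIL"

    # Example: an average of 742 ms against the default threshold
    print(test_outcome(742))  # -> FAIL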

Next to it are the aggregate total response time and the response time for successful requests.

The two graphs below show the throughput and the total number of requests. The throughput is divided into intervals, in this case of one minute each.

If there were errors, they would also be displayed in the graph, and the ratio of errors to successful requests would be shown in the circular graph.
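
To make the two graphs concrete, here is a sketch of how throughput per one-minute interval and the error ratio could be computed from raw request records. The record format and sample values are assumptions, purely for illustration.

    from collections import Counter

    # Each record: (timestamp_seconds, succeeded). Format and data are invented.
    requests = [(3, True), (61, True), (75, False), (130, True), (140, True)]

    # Throughput: number of requests per one-minute interval.
    throughput = Counter(int(ts // 60) for ts, _ in requests)
    print(dict(throughput))  # {0: 1, 1: 2, 2: 2}

    # Error ratio, as shown in the circular graph.
    errors = sum(1 for _, ok in requests if not ok)
    print(errors / len(requests))  # 0.2 -> 20% errors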

The image below shows the average response time measured during the test. This is the time taken to successfully load the page. The horizontal axis shows the test duration divided into intervals. The green bars represent the number of concurrent users involved in the test. In this case, the test was run using a linear ramp traffic model (the default in DiveCloud), in which the number of users grows linearly during the test execution.

The orange line represents the threshold value, 600 ms by default. It is now very clear why the test failed: the average response time is mostly above the threshold.
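
A short sketch of how the linear ramp and the threshold line read on the graph; the 60-user peak, 10-minute duration, and per-interval averages are all invented for illustration:

    PEAK_USERS = 60      # assumed peak load
    DURATION_MIN = 10    # assumed test duration
    THRESHOLD_MS = 600   # the orange line (default)

    def users_at(minute):
        # Linear ramp: users grow linearly from 0 to the peak over the duration.
        return round(PEAK_USERS * minute / DURATION_MIN)

    # Per-interval averages (invented numbers) compared against the threshold,
    # mirroring how the green bars and orange line read on the graph.
    avg_ms_per_interval = [480, 550, 630, 700, 810]
    for minute, avg in enumerate(avg_ms_per_interval):
        flag = "above threshold" if avg > THRESHOLD_MS else "ok"
        print(f"minute {minute}: {users_at(minute)} users, {avg} ms ({flag})")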

  

More detailed information on specific requests can also be displayed. When a URL is selected from the HTTP REQUEST list, similar information is displayed, this time referring to the individual URL (in this case it is the same, as we only had one URL).

In addition, when clicking on Average Response Time vs. Concurrent Users, a distribution for each interval is also shown:
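
As a final sketch, grouping raw measurements by URL and by interval gives both the per-request view and the per-interval distribution described above. The record format and sample data are assumptions for illustration.

    from collections import defaultdict
    from statistics import mean

    # (url, timestamp_seconds, response_ms) records; data is invented.
    samples = [
        ("http://example.com/", 10, 520),
        ("http://example.com/", 70, 640),
        ("http://example.com/", 80, 700),
        ("http://example.com/", 130, 760),
    ]

    # Per-URL average, as shown when a URL is selected from the HTTP REQUEST list.
    by_url = defaultdict(list)
    for url, _, ms in samples:
        by_url[url].append(ms)
    for url, times in by_url.items():
        print(url, round(mean(times)), "ms average")

    # Per-interval spread, the kind of distribution shown for each interval.
    by_interval = defaultdict(list)
    for _, ts, ms in samples:
        by_interval[int(ts // 60)].append(ms)
    for interval, times in sorted(by_interval.items()):
        print(f"minute {interval}: min={min(times)} max={max(times)} n={len(times)}")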
