Saturday, June 18, 2011

How to use a CSV file with JMeter

I’m testing a new site right now and part of my process required load testing of the “add to cart” functionality. To do this, I wanted to have multiple “users” all on the system at the same time, each adding the same item to their cart. JMeter seemed like the logical way to deal with this, but figuring out how to get it to handle various usernames and passwords wasn’t that easy.

After some trial and error and a bunch of Googling I have it working, so thought I’d do a quick post on it for others to reference:

First, create a CSV file with the logins and passwords. You can just do this in a text editor, and the format should be:
username,password
username2,password2
username3,password3

Save that file in the bin directory where your JMeter installation lives. For this example, I saved the file as “logins.csv”.
Now go into JMeter and find the HTTP request step that you want to modify to use the values in this CSV file. Right click on it and go to Add > Config Element > CSV Data Set Config. This adds the CSV Data Set Config as a child of the HTTP request.
Your tree should now look like this:

    
In the tree above, you can see that I’m using the CSV data to modify the “Log in” HTTP Request.
Now click on the CSV Data Set Config step to modify it. Its screen looks like this:

You’ll need to fill in at least 3 values on this screen:
  1. Filename: if your file is in the /bin directory, this can be just the filename. If it’s somewhere else, use the full path to the file.
  2. Variable names: the equivalent of a “column name” in a spreadsheet. For this example I used “login,password”.
  3. Delimiter: comma is the default delimiter, but if your file uses tabs this is the place to say so.
The other fields are optional but may be useful to you. You can read all about them in the CSV Data Set Config section of the JMeter user manual.
After the CSV step is filled out correctly, go back to your HTTP request and change the value of the login and password fields to variables.
Here’s what that looks like in my test:



Because I had defined my variables as “login” and “password” in the previous step, in this step I need to use those variables as the value of the parameters that are being sent with the request. To do that I replaced the email address with ${login} and the password with ${password}.
Now when you run this test, JMeter will fill in the values of those two variables with what’s in the CSV file. The first thread will use row 1, the second thread row 2, and so on.
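To make that round-robin behavior concrete, here is a small Java sketch. This is plain Java, not JMeter code; the rows are the sample logins.csv from above, and the wrap-around matches the default “All threads” sharing mode, where each thread simply takes the next row:

```java
import java.util.Arrays;
import java.util.List;

public class CsvRoundRobin {
    // Rows standing in for logins.csv; JMeter reads the real file at run
    // time, this is only an illustration of how threads pick rows.
    static final List<String[]> ROWS = Arrays.asList(
        new String[]{"username", "password"},
        new String[]{"username2", "password2"},
        new String[]{"username3", "password3"});

    // Thread n (0-based) gets row n, wrapping around when rows run out.
    static String[] rowForThread(int threadIndex) {
        return ROWS.get(threadIndex % ROWS.size());
    }

    public static void main(String[] args) {
        for (int t = 0; t < 4; t++) {
            String[] row = rowForThread(t);
            System.out.println("thread " + (t + 1) + " -> login=" + row[0]
                + ", password=" + row[1]);
        }
    }
}
```

Note that if there are more threads than rows, the file recycles from the top, so thread 4 here logs in with the first row again.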
If you have any questions please let me know!

Thursday, June 9, 2011

Functional Testing Approach

  • Identify Features
  • Plan and Design Tests
  • Implement Test Design
  • Configure Test Environment
  • Execute Test Cases
  • Analyze Report and Retest
Identify Features:
Identify all the features to be tested using JMeter. Here, we will create a Test Plan (with any website) to demonstrate how to configure the Test Plan to include functional testing capabilities. The modified Test Plan will include these scenarios:
1. Navigate to Home page
2. Navigate to About Us page
3. Navigate to Careers page
4. Navigate to Resource Center page
5. Navigate to Contact page 
and so on.

Following these scenarios, we will simulate various entries and form submissions as requests to pages are made, while checking that each page responds correctly to these user entries. We will add assertions to the samplers in these scenarios to verify the 'correctness' of a requested page. In this manner, we can see whether the pages respond correctly to invalid data.

Implement Test Design:
Identify the pattern to be verified in the response from the server and design test cases accordingly. Here, we use the HTTP Proxy Server to record page requests:
We will need to include the HTTP Proxy Server element in the WorkBench. Some configuration is required, as shown in the following snapshot:


Configuring the Proxy Server:
Simulating these scenarios will require JMeter to make requests to the Home page, About Us page, Competency page, etc.
Selecting Add Assertion will be especially useful, as we add specific patterns of the page that we want to evaluate later in this exercise. The Capture HTTP Headers option would capture header information as we record; however, to keep the recording neater, we will leave this option unchecked.
In addition, since we do not require images in our testing, add these patterns in the URL Patterns to Exclude section: .*\.jpg, .*\.js, .*\.png, .*\.gif, .*\.ico, .*\.css. Otherwise these files, which are not necessary for our testing, will be recorded and cause unnecessary clutter in the recording.
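These exclusion patterns are ordinary regular expressions matched against the whole URL, which is why the dot before the extension should be escaped (`.*\.jpg`, not `.*.jpg`). A small Java sketch (the URLs here are made up) illustrates the matching:

```java
import java.util.Arrays;
import java.util.List;

public class UrlExcludeDemo {
    // Exclusion patterns are full-match regular expressions; "\\." matches
    // a literal dot, while an unescaped "." would match any character.
    static final List<String> EXCLUDE = Arrays.asList(
        ".*\\.jpg", ".*\\.js", ".*\\.png", ".*\\.gif", ".*\\.ico", ".*\\.css");

    static boolean excluded(String url) {
        return EXCLUDE.stream().anyMatch(url::matches);
    }

    public static void main(String[] args) {
        System.out.println(excluded("http://exampleworld.com/logo.png")); // true
        System.out.println(excluded("http://exampleworld.com/careers"));  // false
    }
}
```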
Add thread groups as children of the Test Plan, as shown in the figure below.

Now change the name of each thread group according to your requirements, as shown in the figure below.

Let the Recording Begin...
Let us proceed with the recording following the test cases in the previous table as our guide. As you record each page, select the specific tags or page elements the correctness of which you want to validate and add them to the Patterns to Test section in the Response Assertion element of each sampler. This may take most of your recording time, since as you record, you need to decide carefully which page element(s) would be the most effective measure of correctness. There are plenty of developer tools available to help you in this possibly tedious task.
The Assertion Results listener is used with the Response Assertion elements, to summarize the success or failure of a page in meeting the validation criteria defined in each Response Assertion.

Execute Test Cases:
Once the assertions are properly completed, we expect that running our Test Plan will pass all the assertions. Passed assertions will not show any error in an Assertion Results listener installed within the same scope. As with all Listeners, the captured results can be saved and reviewed at a later time. The following sample shows what passed assertions reveal as the test executes.
On the other hand, a failed Assertion would show an error message in the same Listener as the following snapshot illustrates.
                                      
Since a page error or Page not found error is a real risk in web applications, a failure may originate from such an error, and not just because of a failed Assertion. We can view more information about the sampler that contains the failed Assertion to investigate the origins of a failure. A View Results Tree Listener records the details of requests and logs all errors (indicated by the red warning sign and red fonts).

Summary: Execute the tests, examine the results, and report any defects found. After those defects are resolved, retest the new build with the same scripts.

This exercise provides a visual means to understand the JMeter capabilities that support functional testing, as we directly wrote and implemented a JMeter script. We have demonstrated building a Test Plan containing functional validations (or assertions) by incorporating various essential JMeter components, particularly the 'Response Assertion' element and the 'Assertion Results' listener. By using the 'User Defined Variables' configuration element, we have also parametrized several values to give our Test Plan better flexibility. In addition, we have observed the results of these assertions during a 'live' run of the application under test. Note that an HTTP Request sampler may need to be modified if the parameter(s) it sends with each request change.

If you have any questions or concerns, contact me or leave a reply.

Stability Testing Approach

  • Identify test objectives
  • Identify key scenarios
  • Identify the duration
  • Identify metrics
  • Implement test cases
  • Simulate load and Run
  • Analyze results
Objective of Stability Testing:
To confirm that our web site continues functioning well over an acceptable period of time, or longer.

Key scenarios:
Particular number of users will access home page, and top menu options like “About Us”, “Competency”, “Service Offerings”, “Careers”, “Contact Us”, “Home”, “Policies” and  other sub-menu options over a given duration.

Defining Workload:
To confirm that our server continues to respond well for 2 days, or for 10000 repetitions of the same set of requests.

Metrics to be collected:
Average Response Time and Throughput values will be collected after execution.

Sample Test cases:
1. Open the http://exampleworld.com site and record a click on the Home button in a single thread group using JMeter.
2. Select the thread group and set:
Number of Threads = 10
Ramp-Up Period = 1
Loop Count = 10000
3. Run JMeter.
The server should be able to respond to all the requests successfully.

Implement test cases in JMeter:
In JMeter, record the steps given in the test cases, in the same way as explained under functional testing.

Simulating the load and executing:
Enter: Number of Threads = 10
Ramp-Up Period = 1
Loop Count = 10000
Now run JMeter (Ctrl+R).
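A quick sanity check of what these settings imply (plain Java arithmetic, not JMeter code):

```java
public class StabilityLoad {
    // Total requests the server must handle: every thread runs the
    // recorded request Loop Count times.
    static long totalRequests(int threads, int loopCount) {
        return (long) threads * loopCount;
    }

    public static void main(String[] args) {
        int threads = 10, loopCount = 10000, rampUpSeconds = 1;
        System.out.println("total requests = "
            + totalRequests(threads, loopCount)); // 100000
        // With a 1-second ramp-up, all 10 threads start within one second,
        // i.e. a new thread roughly every 0.1 s.
        System.out.println("seconds between thread starts = "
            + (double) rampUpSeconds / threads);
    }
}
```

So this stability run pushes 100,000 requests at the server in total, which is what makes it a meaningful endurance test rather than a quick smoke test.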

Results Analysis:
If the server is able to respond to all the requests successfully over this duration, the result is a pass.

Conclusion:
All stability cases were executed successfully 10000 times, and all cases passed.

Stress Testing Approach

  • Identify test objectives
  • Identify key scenarios
  • Identify the workload
  • Identify metrics
  • Implement test cases
  • Simulate load and Run
  • Analyze results
Objective of Stress Testing:
To observe how our web site behaves beyond its normal operational capacity, slowly increasing the load until its breaking point.

Key scenarios:
In the first test case there are ten thread groups; a particular number of users will access the home page and top menu options like “About Us”, “Service Offerings”, “Careers”, “Contact Us”, “Home”, and “Policies”.

Defining Workload:
Starting from 100 users, we increase the user count to 200, 400, 800, and so on, until there is no response from the server. All these virtual users simultaneously access the functions given under key scenarios.

Metrics to be collected:
Average Response Time and Throughput values will be collected after execution.

Sample Test cases:
1. Open JMeter.
2. Add 10 thread groups under the test plan.
3. Do the proxy setting and record the following functions for each thread group:
           a. Open http://exampleworld.com
           b. Click on “About Us”.
           c. Click on “Competency”.
           d. Click on “Service Offerings”.
           e. Click on “Careers”.
           f. Click on “Resource Centre”.
           g. Click on “Contact Us”.
           h. Click on “Home”.
           i. Click on “Policies”.
4. Set Number of Threads = 10 for each thread group,
Ramp-Up Period = 0
(10 thread groups × 10 threads = 100 simultaneous users), and run JMeter.
5. Repeat step 4, increasing the Number of Threads by 10 each time (10, 20, 30 ...), until there is no response from the server.

Implement test cases in JMeter:
In JMeter, record the steps given in the test cases, in the same way as explained under functional testing.

Simulating the load and executing:
Set Number of Threads = 10 for each thread group,
Ramp-Up Period = 0
(100 simultaneous users), and run JMeter.
Repeat the above steps, increasing the Number of Threads by 10 each time (10, 20, 30 ...), until there is no response from the server.
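The arithmetic behind the user counts above can be sketched as follows (plain Java, not JMeter code):

```java
public class StressSteps {
    // With 10 thread groups, each holding the same number of threads, the
    // number of simultaneous users is groups * threadsPerGroup.
    static int simultaneousUsers(int threadGroups, int threadsPerGroup) {
        return threadGroups * threadsPerGroup;
    }

    public static void main(String[] args) {
        int groups = 10;
        // Increasing threads per group by 10 each run steps the total load
        // 100 -> 200 -> 300 ... until the server stops responding.
        for (int threads = 10; threads <= 30; threads += 10) {
            System.out.println(threads + " threads/group -> "
                + simultaneousUsers(groups, threads) + " users");
        }
    }
}
```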

Results Analysis :
For 100 users
Average Time taken : 1.67 Seconds
Throughput : 34/Sec

For 200 users
Average Time taken : 3.73 Seconds
Throughput : 31.6/Sec

For 400 users
Average Time taken : 5.32 Seconds
Throughput : 29.5/Sec

For 800 users
Average Time taken : 8.43 Seconds
Throughput : 28/Sec
No Response for 840 Users.
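Plugging the figures above into a quick calculation shows how the server degrades as the load grows (plain Java arithmetic on the recorded numbers, not JMeter output):

```java
public class DegradationCheck {
    // Percent drop in throughput between two load levels.
    static double throughputDropPercent(double before, double after) {
        return (before - after) / before * 100.0;
    }

    public static void main(String[] args) {
        // From 100 to 800 users, throughput fell from 34/sec to 28/sec
        // (a drop of roughly 17.6%), while average response time grew
        // from 1.67 s to 8.43 s (about 5x).
        System.out.printf("throughput drop: %.1f%%%n",
            throughputDropPercent(34.0, 28.0));
        System.out.printf("response-time growth: %.1fx%n", 8.43 / 1.67);
    }
}
```

The lesson is that response time degrades much faster than throughput, which is why both metrics are collected: the server still serves requests near the breaking point, but each user waits far longer.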

Conclusion:
Approximately 800 users can access the server concurrently using the given functions.


Load Testing Approach

Load Testing Using Jmeter
  • Identify Requirements
  • Identify load-critical scenarios
  • Identify the target load levels
  • Implement test Design
  • Design specific tests
  • Run tests
  • Analyze the results
Identify Load Requirements
Load testing helps identify the maximum operating capacity of an application, as well as any bottlenecks that might prevent it from operating at capacity. As you begin load testing, it is recommended that you start with a small number of virtual users (e.g. 1 user in the Thread Group) and then incrementally increase the load from normal to peak (e.g. 500 users in the Thread Group).

We can then observe how our application performs during this gradually increasing load condition. Eventually, we will cross a threshold limit for our performance and load objectives. For example, we might continue to increase the load until the server processor utilization reaches 75 percent, or when end-user response times exceed 5 seconds. 

Identify the load-critical scenarios
1. Identify the workload profile for distributing the entire load among the key scenarios.
2. Identify the metrics that you want to collect in order to verify them against your performance objectives.

By using an iterative testing process, these steps should help you achieve your performance objectives.
We will create a Test Plan (with any website) to demonstrate how to configure the Test Plan to include load testing capabilities. The modified Test Plan will include these scenarios:
  • Access the Home page
  • Access the Careers page
  • Access the Contact page, and so on.
Implementing Load Test Cases in Jmeter
1. Record the Identified Test cases as described in Functional Test part.
2. Now change the number of threads per thread group as given in the test cases, as shown in the figure below.


Running the Test:
Once all the Listeners are added, run the Test Plan; the results captured by the Listeners can be saved and reviewed at a later time.

Analysis on Load Testing Results :-
Analyze the metric data captured during the tests as below:

Aggregate Graph:
The aggregate graph is similar to the aggregate report. The primary difference is the aggregate graph provides an easy way to generate bar graphs and save the graph as a PNG file. By default, the aggregate graph will generate a bar chart 450 x 250 pixels.


  • Label - The label of the sample. If "Include group name in label?" is selected, then the name of the thread group is added as a prefix. This allows identical labels from different thread groups to be collated separately if required.
  • # Samples - The number of samples with the same label
  • Average - The average time of a set of results
  • Median - The median is the time in the middle of a set of results. 50% of the samples took no more than this time; the remainder took at least as long.
  • 90% Line - 90% of the samples took no more than this time; the remaining samples took at least as long as this. (90th percentile)
  • Min - The shortest time for the samples with the same label
  • Max - The longest time for the samples with the same label
  • Error % - Percent of requests with errors
  • Throughput - The throughput is measured in requests per second/minute/hour. The time unit is chosen so that the displayed rate is at least 1.0. When the throughput is saved to a CSV file, it is expressed in requests/second, i.e. 30.0 requests/minute is saved as 0.5.
  • Kb/sec - The throughput measured in kilobytes per second
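As a sketch of how the Median and 90% Line figures can be derived from raw sample times (the sample data here is made up, and JMeter's exact percentile convention may differ slightly from this simple version):

```java
import java.util.Arrays;

public class AggregateStats {
    // Median: the value that 50% of samples do not exceed; here taken as
    // the middle element of the sorted list, one common convention.
    static long median(long[] times) {
        long[] s = times.clone();
        Arrays.sort(s);
        return s[s.length / 2];
    }

    // 90% Line: the value that 90% of samples do not exceed
    // (the 90th percentile).
    static long percentile90(long[] times) {
        long[] s = times.clone();
        Arrays.sort(s);
        int idx = (int) Math.ceil(0.90 * s.length) - 1;
        return s[idx];
    }

    public static void main(String[] args) {
        long[] elapsedMillis = {120, 150, 200, 90, 300, 110, 180, 250, 95, 400};
        System.out.println("median = " + median(elapsedMillis) + " ms");
        System.out.println("90% line = " + percentile90(elapsedMillis) + " ms");
    }
}
```

For these ten samples the median is 180 ms while the 90% line is 300 ms; the gap between the two is a quick indicator of how long the slow tail of requests is.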



Here, as shown in the figure, there are two axes:
  • X-Axis: shows the requests run by JMeter
  • Y-Axis: shows the time in milliseconds


Summary Report :
The summary report creates a table row for each differently named request in your test. This is similar to the Aggregate Report , except that it uses less memory. The throughput is calculated from the point of view of the sampler target (e.g. the remote server in the case of HTTP samples). JMeter takes into account the total time over which the requests have been generated. If other samplers and timers are in the same thread, these will increase the total time, and therefore reduce the throughput value. So two identical samplers with different names will have half the throughput of two samplers with the same name. It is important to choose the sampler labels correctly to get the best results from the Report.

View Result Table:
This visualizer creates a row for every sample result. Like the View Results Tree, this listener uses a lot of memory.

Summary:
This exercise provides a visual means to understand the JMeter capabilities that support load testing, as we directly wrote and implemented a JMeter script. We have demonstrated building a Test Plan for load testing by incorporating various essential JMeter components, particularly the 'Aggregate Graph' element and the 'Summary Report' listener.


Report Analysis

In JMeter there are many types of listeners, through which we can easily analyse the performance result report.
Most of the listeners perform several roles in addition to "listening" to the test results. They also provide means to view, save, and read saved test results.
Note that Listeners are processed at the end of the scope in which they are found.                  
The saving and reading of test results is generic. The various listeners have a panel whereby one can specify the file to which the results will be written (or read from). By default, the results are stored as XML files, typically with a ".jtl" extension. Storing as CSV is the most efficient option, but is less detailed than XML (the other available option).
Listeners do not process sample data in non-GUI mode, but the raw data will be saved if an output file has been configured. In order to analyse the data generated by a non-GUI test run, you need to load the file into the appropriate Listener.



Listeners can use a lot of memory if there are a lot of samples. Most of the listeners currently keep a copy of every sample in their scope, apart from:
  • Simple Data Writer
  • BeanShell/BSF Listener
  • Mailer Visualizer
  • Monitor Results
  • Summary Report
The following Listeners no longer need to keep copies of every single sample. Instead, samples with the same elapsed time are aggregated. Less memory is now needed, especially if most samples only take a second or two at most.
  • Aggregate Report
  • Aggregate Graph
  • Distribution Graph
To minimise the amount of memory needed, use the Simple Data Writer, and use the CSV format.
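For example, assuming a standard JMeter install, the results format can be switched to CSV by setting a property in jmeter.properties (or user.properties):

```
# Store listener results as CSV instead of XML; CSV files are smaller
# and cheaper to process, at the cost of some detail.
jmeter.save.saveservice.output_format=csv
```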

Graph Results:

The Graph Results listener generates a simple graph that plots all sample times. Along the bottom of the graph, the current sample (black), the current average of all samples(blue), the current standard deviation (red), and the current throughput rate (green) are displayed in milliseconds.

The throughput number represents the actual number of requests/minute the server handled. This calculation includes any delays you added to your test and JMeter's own internal processing time. The advantage of doing the calculation like this is that this number represents something real - your server in fact handled that many requests per minute, and you can increase the number of threads and/or decrease the delays to discover your server's maximum throughput. Whereas if you made calculations that factored out delays and JMeter's processing, it would be unclear what you could conclude from that number.
Control Panel                                 
The following table briefly describes the items on the graph. Further details on the precise meaning of the statistical terms can be found on the web - e.g. Wikipedia - or by consulting a book on statistics.
  • Data - plot the actual data values
  • Average - plot the Average
  • Median - plot the Median (midway value)
  • Deviation - plot the Standard Deviation (a measure of the variation)
  • Throughput - plot the number of samples per unit of time
The individual figures at the bottom of the display are the current values. "Latest Sample" is the current elapsed sample time, shown on the graph as "Data".

Spline Visualizer:

The Spline Visualizer provides a view of all sample times from the start of the test till the end, regardless of how many samples have been taken. The spline has 10 points, each representing 10% of the samples, and connected using spline logic to show a single continuous line.
The graph is automatically scaled to fit within the window. This needs to be borne in mind when comparing graphs.
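A rough sketch of that ten-point reduction (a simplification; JMeter's actual spline interpolation is more involved, but each point still summarizes a tenth of the samples):

```java
import java.util.Arrays;

public class SplineBuckets {
    // Condense all sample times into 10 points, each covering 10% of the
    // samples; here each point is simply the average of its tenth of the
    // data, in arrival order.
    static double[] tenPointSummary(long[] times) {
        double[] points = new double[10];
        int n = times.length;
        for (int p = 0; p < 10; p++) {
            int from = p * n / 10, to = (p + 1) * n / 10;
            points[p] = Arrays.stream(times, from, to).average().orElse(0);
        }
        return points;
    }

    public static void main(String[] args) {
        long[] samples = new long[100];
        for (int i = 0; i < 100; i++) samples[i] = 100 + i; // 100..199 ms
        System.out.println(Arrays.toString(tenPointSummary(samples)));
    }
}
```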
Control Panel


                       


Assertion Results:

The Assertion Results visualizer shows the Label of each sample taken. It also reports failures of any Assertions that are part of the test plan.
Control Panel





View Results Tree:

The View Results Tree shows a tree of all sample responses, allowing you to view the response for any sample. In addition to showing the response, you can see the time it took to get this response, and some response codes. Note that the Request panel only shows the headers added by JMeter. It does not show any headers (such as Host) that may be added by the HTTP protocol implementation.
There are several ways to view the response, selectable by a drop-down box at the bottom of the left hand panel.
  • HTML
  • HTML (download embedded resources)
  • JSON
  • Regexp Tester
  • Text
  • XML

Control Panel
Here is the text view of the result.

As shown in the figure below, the result is displayed in rendered HTML format.

Aggregate Graph:

The aggregate graph is similar to the aggregate report. The primary difference is the aggregate graph provides an easy way to generate bar graphs and save the graph as a PNG file. By default, the aggregate graph will generate a bar chart 450 x 250 pixels.
Control Panel

See the Summary Report for a similar listener that does not store individual samples and so needs constant memory.
  • Label - The label of the sample. If "Include group name in label?" is selected, then the name of the thread group is added as a prefix. This allows identical labels from different thread groups to be collated separately if required.
  • # Samples - The number of samples with the same label
  • Average - The average time of a set of results
  • Median - The median is the time in the middle of a set of results. 50% of the samples took no more than this time; the remainder took at least as long.
  • 90% Line - 90% of the samples took no more than this time; the remaining samples took at least as long as this. (90th percentile)
  • Min - The shortest time for the samples with the same label
  • Max - The longest time for the samples with the same label
  • Error % - Percent of requests with errors
  • Throughput - the Throughput is measured in requests per second/minute/hour. The time unit is chosen so that the displayed rate is at least 1.0. When the throughput is saved to a CSV file, it is expressed in requests/second, i.e. 30.0 requests/minute is saved as 0.5.
  • Kb/sec - The throughput measured in Kilobytes per second
Times are in milliseconds.
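The throughput unit conversion mentioned in the list above is just a division by 60; as a quick sketch:

```java
public class ThroughputUnits {
    // JMeter saves throughput to CSV in requests/second, so a rate shown
    // in requests/minute must be divided by 60 before comparing.
    static double perMinuteToPerSecond(double perMinute) {
        return perMinute / 60.0;
    }

    public static void main(String[] args) {
        // 30.0 requests/minute is saved as 0.5 requests/second.
        System.out.println(perMinuteToPerSecond(30.0)); // 0.5
    }
}
```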


Running the Test Plan

To run the test plan, go to the Run menu -> Run, or press Ctrl+R. A green square in the upper-right-hand corner lights up to indicate that the test is running. Even after selecting Stop, the green light remains on until all test threads exit. The number to the left of the green square indicates the total number of currently active threads. The green square changes to grey once JMeter finishes running your Test Plan. You can also stop the test plan mid-run by selecting Shutdown or Stop from the Run menu.
A results file saved by a listener can be opened in any visualizer, and the results will be displayed according to that visualizer. You can also open the same file in more than one visualizer, because during the test run JMeter ensures that no sample is recorded more than once to the same file.

Configuring JMeter
JMeter's properties can be modified by editing jmeter.properties in the /bin directory, or you can make your own copy of jmeter.properties and specify it on the command line.
Additional JMeter properties may be defined in the file named by the JMeter property user.properties, which defaults to user.properties; if that file is in the current directory, it is loaded automatically. In the same way, you can update system properties from system.properties.
Parameters
None of the following attributes is required:
  • remote_hosts - Comma-delimited list of remote JMeter hosts. List the machines where you have JMeter remote servers running; you will then be able to control those servers in a distributed environment from this machine's GUI.
  • not_in_menu - Lets you customize JMeter by listing the classname or class label of components you do not want displayed in the menus.
  • user.properties - Additional JMeter properties are contained in this file; they are added after the initial property file but before the -q and -J options are processed.
  • xml.parser - The XML parser implementation to use. The default value is org.apache.xerces.parsers.SAXParser.
  • search_paths - List of paths, in addition to the jars kept in the lib/ext directory, that JMeter searches for add-on classes such as additional samplers.
  • system.properties - Additional system properties; these are added before the -S and -D options are processed.
  • user.classpath - A path searched by JMeter for utility classes, in addition to the jars kept in the lib directory.
  • ssl.provider - The class to use for your SSL implementation if you do not want the built-in Java implementation.

Record Test Script(test request)


HTTP Proxy Server is a non-test element feature of JMeter which enables user to record scripts from a real browser.
1. Add Thread Group in your Test Plan


Add Thread Group
2. Add Proxy Server to WorkBench
  • Add > Non-Test Elements > HTTP Proxy Server
  • Set “Target Controller” where your recorded scripts will be added
  • Set URL patterns to INCLUDE or EXCLUDE
    • .* – all
    • .*\.png – all png images
    • .*\.gif – all gif images
    • .*\.jpg – all jpg images
    • .*\.php
    • .*\.jsp
    • .*\.html
    • .*\.htm
    • .*\.js
  • Click START
3. Set your Browser to use Proxy (I used IE7 browser)
  • Go to Tools > Internet options > Connections > LAN Setting
  • Select “proxy server”
  • Set Address = localhost; Port = 8080
4. Browse your application and record
5. In HTTP Proxy Server, click “Stop” when finished
To verify if recording was successful, you should be able to see HTTP requests generated under your “Target Controller”
More Tricks:
Add a Constant Timer to the HTTP Proxy Server node to record your think-time (right-click on the “HTTP Proxy Server” node and choose “Add,” “Timer,” and then “Constant Timer.”)
Next change the “Thread Delay” on the timer to “${T}” to tell the proxy server to record your time rather than to add a constant time for each request.

Using JMeter for a Simple Test

Let's see how to run JMeter now. We will conduct a simple test: set up a test plan and stress test a Web application. Before proceeding with the test, we need a test plan, which tells JMeter how to perform the testing in steps. A test plan contains several elements: thread groups, listeners, assertions, sample-generating controllers, logic controllers, etc. These elements will be described later. One point to remember is that a test plan must have at least one thread group, which is the starting point of a test plan and contains all other JMeter elements. All of the threads created by JMeter to simulate simultaneous users are controlled by this thread group. Let's go through the steps now.
Step 1: Start JMeter by running the jmeter.bat file for Windows or the jmeter file on Unix.



Step 2: Create a thread group by right-clicking the Test Plan element as shown in the picture below. Now select Add and then the Thread Group option. JMeter will create a thread group element under the Test Plan element. This is where you specify the number of users to be simulated and the number of times the test plan is to be repeated.
The screen after creating the Thread group appears like this.


The following properties can be set, as shown below:
Name -- You can give any name to the thread group.

Number of Threads -- Enter as many threads as needed to simulate the load test. A single user is represented by each thread, so if you wish to simulate a load test with 5 concurrent users, enter 5 as the value for this property.
Ramp-Up Period -- Indicates the time JMeter takes to create all of the threads needed. If you set 10 seconds as the ramp-up period for 5 threads, JMeter will take 10 seconds to create those 5 threads. By setting the value to 0, all the threads are created at once.
Forever -- If you choose this option, JMeter will keep sending requests to the tested application indefinitely. If disabled, the test is repeated the number of times entered in the Loop Count box.
Loop Count -- Specifies how many times the test is to be repeated, provided that the Forever checkbox is unchecked.
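The ramp-up arithmetic can be sketched as follows (plain Java, not JMeter code; JMeter spaces thread starts evenly over the ramp-up period):

```java
public class RampUpDemo {
    // With rampUpSeconds spread over threadCount threads, JMeter starts
    // one thread every rampUpSeconds / threadCount seconds
    // (thread 0 starts at time 0).
    static double startOffsetSeconds(int threadIndex, int threadCount,
                                     int rampUpSeconds) {
        return (double) rampUpSeconds / threadCount * threadIndex;
    }

    public static void main(String[] args) {
        int threads = 5, rampUp = 10;
        for (int t = 0; t < threads; t++) {
            System.out.println("thread " + (t + 1) + " starts at "
                + startOffsetSeconds(t, threads, rampUp) + " s");
        }
        // A ramp-up of 0 would start all 5 threads at once.
    }
}
```

For 5 threads and a 10-second ramp-up this starts one new thread every 2 seconds, which avoids hammering the server with all users at the exact same instant.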

Step 3: Now you need to specify the HTTP request (URL and parameters). To do this, right-click on the Thread Group node and select Add -> Sampler -> HTTP Request, as shown in the picture.




The picture below shows the HTTP Request screen, in which you can set the properties described below. The address we have taken for the "Hello World" servlet is http://localhost:8080/examples/servlets/servlet/HelloWorldExample.
The following properties can be set on the HTTP Request screen.
Name -- You should put a descriptive name as a thread group can have multiple HTTP Request elements.
Server Name or IP -- Mention the server name or the IP address of the machine which is running the application to be tested.
Port Number -- Give the port number on which the Web application runs, which is usually 80.
Protocol -- Mention the protocol to be used here, i.e. either HTTP or HTTPS.
Method -- The GET or POST method is to be mentioned here.
Path -- You need to mention the path of the resource that will handle this request.                       
Follow Redirects -- follows redirection, if any, sent by the Web application.
Parameters -- This option shows the list of parameters sent with the request. You can add and remove  parameters by using Add and Delete buttons.
Send a file with a request -- With the help of this option a file upload can be simulated to the Web application.
Retrieve all images and Java Applets -- This option is used to download embedded content.

Step 4: Now select the format of the results, to get a page containing the results of every request: right-click on the Thread Group node and select Add -> Listener -> View Results Tree.


Step 5: Now it's time to run the test plan: select Run from the menu and then Start (or Ctrl+R). The test plan will be repeated 10 times, and the results appear in the View Results Tree as shown below.

Step 6: The results can be seen in the Results Tree after the completion of the test plan. As you can see in the picture below, we have selected the first request in the upper pane, along with the request that was generated and its result in the form of an HTML page with the text "Hello World!".




Step 7: An interesting feature of JMeter is that you can add more listeners. For example, add a View Results in Table listener to view the requests in tabular form, as shown below.
Moreover you can add an Aggregate Report listener to get the summary of run as shown below.   
                                                                                    
Step 8: To save the test plan for later use, select File from the menu and then Save Test Plan (or Ctrl+S). It's preferable to save the test plan before running it.

What is JMeter?

JMeter from the Apache Software Foundation is a Java application tool designed to load test functional behavior and measure performance. It was originally designed for testing Web Applications but has since expanded to other test functions. JMeter can be used to simulate a heavy load on a server, network or object to test its strength or to analyze overall performance under different load types.

JMeter is not a browser
JMeter is not a browser. As far as web-services and remote services are concerned, JMeter looks like a browser (or rather, multiple browsers); however JMeter does not perform all the actions supported by browsers. In particular, JMeter does not execute the Javascript found in HTML pages.

Further Information About JMeter
An important capability of JMeter is that it can simulate a heavy load not only on a server but also on a network or object, to test its strength under different load types. JMeter can also produce a graphical analysis of performance, and can test the behavior of your server/script/object under heavy concurrent load.

Before going any further, let's look at how to install it and what is required.

Requirements
  • JMeter requires a fully compliant JVM 1.5 or higher. Because JMeter uses only standard Java APIs, please do not file bug reports if your JRE fails to run JMeter due to JRE implementation issues.
  • Operating Systems: JMeter is a 100% Java application and should run correctly on any system that has a compliant Java implementation. JMeter has been tested and works under:
    • Unix (Solaris, Linux, etc)
    • Windows (98, NT, XP, etc)
    • OpenVMS Alpha 7.3+
Installation: To install a release build, simply unzip the zip/tar file into the directory where you want JMeter to be installed. Provided that you have a JRE/JDK correctly installed and the JAVA_HOME environment variable set, there is nothing more for you to do.

Note: there can be problems (especially with client-server mode) if the directory path contains any spaces.

Running Jmeter: To run JMeter, run the jmeter.bat (for Windows) or jmeter (for Unix) file. These files are found in the bin directory. After a short pause, the JMeter GUI should appear.
There are some additional scripts in the bin directory that you may find useful. Windows script files (the .CMD files require Win2K or later):
  • jmeter.bat - run JMeter (in GUI mode by default)
  • jmeter-n.cmd - drop a JMX file on this to run a non-GUI test
  • jmeter-n-r.cmd - drop a JMX file on this to run a non-GUI test remotely
  • jmeter-t.cmd - drop a JMX file on this to load it in GUI mode
  • jmeter-server.bat - start JMeter in server mode
  • mirror-server.cmd - runs the JMeter Mirror Server in non-GUI mode
  • shutdown.cmd - Run the Shutdown client to stop a non-GUI instance gracefully
  • stoptest.cmd - Run the Shutdown client to stop a non-GUI instance abruptly

Unix script files; should work on most Linux/Unix systems:
  • jmeter - run JMeter (in GUI mode by default). Defines some JVM settings which may not work for all JVMs.
  • jmeter-server - start JMeter in server mode (calls jmeter script with appropriate parameters)
  • jmeter.sh - very basic JMeter script with no JVM options specified.
  • mirror-server.sh - runs the JMeter Mirror Server in non-GUI mode
  • shutdown.sh - Run the Shutdown client to stop a non-GUI instance gracefully
  • stoptest.sh - Run the Shutdown client to stop a non-GUI instance abruptly