Monday, 2 January 2017

Gatling
Overview: Gatling is an open-source load testing framework based on Scala, Akka and Netty.
Features:
·         High performance
·         Ready to present HTML reports
·         Scenario recorder and developer-friendly DSL
Record
·         Compatible with all browsers
·         Easy way to script your scenarios
Edit
·         Write your scenarios with our scripting API or directly in Scala
·         Easy-to-read and developer-friendly
·         Easier maintainability
Launch
Terminal
·         Linux / OSX: gatling.sh
·         Windows: gatling.bat
Build tool
·         Maven: mvn gatling:execute
·         SBT: sbt test
Analyze
·         Clear, exhaustive, dynamic and colorful reports
·         Significant metrics: 99th percentiles
·         Ready-to-present





Setting up Project
1. Install Gatling from http://gatling.io/#/resources/download. We can download the zip bundle or add the dependency in pom.xml (a Maven plugin sketch for running Gatling from the build follows these setup steps):
<dependency>
  <groupId>io.gatling.highcharts</groupId>
  <artifactId>gatling-charts-highcharts</artifactId>
  <version>2.2.2</version>
</dependency>

2. In Gatling > user-files, create a folder named "LoadTest".
3. In order to run Gatling, you need to have a JDK installed. Gatling requires JDK 8; you should use an up-to-date version.
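
To use mvn gatling:execute (mentioned under Launch above), the Gatling Maven plugin also has to be declared in pom.xml. A minimal sketch, assuming a plugin version from the 2.2.x line; adjust it to match your Gatling release:

<plugin>
  <groupId>io.gatling</groupId>
  <artifactId>gatling-maven-plugin</artifactId>
  <version>2.2.1</version> <!-- illustrative; use the plugin version matching your Gatling release -->
</plugin>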

Gatling Configuration:
We can change settings in gatling.conf, such as directory locations, connection timeout, read timeout, etc.
When a simulation runs and a POST request requires a payload, the default directory locations are data = user-files/data and bodies = user-files/bodies. By default the simulation file location is user-files/simulations; we can change this as well.

We change these as follows (a matching gatling.conf block is sketched after this list):
data: folder where user data (e.g. files used by Feeders) is located
bodies: folder where request bodies are located
simulations: folder where the bundle's simulations are located
reportsOnly: if set, name of the report folder to look for in order to generate its report
binaries: if set, name of the folder where compiled classes are located; defaults to GATLING_HOME/target
results: name of the folder where all report folders are located
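
For example, the corresponding section of gatling.conf looks roughly like this (the keys mirror the gatling.core.directory block of the bundled file; the values shown are the defaults, so uncomment and change only what you need):

gatling {
  core {
    directory {
      data = user-files/data                 # Folder where user's data (e.g. files used by Feeders) is located
      bodies = user-files/bodies             # Folder where bodies are located
      simulations = user-files/simulations   # Folder where the bundle's simulations are located
      #reportsOnly = ""                      # If set, name of report folder to look for in order to generate its report
      #binaries = ""                         # If set, name of the folder where compiled classes are located
      results = results                      # Name of the folder where all report folders are located
    }
  }
}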

We also change these values:
keepAlive = true        // Allow pooling HTTP connections (keep-alive header automatically added)
connectTimeout = 60000  // Timeout when establishing a connection
#pooledConnectionIdleTimeout = 60000  // Timeout when a connection stays unused in the pool
readTimeout = 60000     // Timeout when a used connection stays idle
#maxRetry = 2           // Number of times that a request should be retried
requestTimeout = 600000 // Timeout of the requests

Recorder: With the Recorder we can create simulations. We have to set up a proxy in the browser.

The Gatling Recorder helps you to quickly generate scenarios, by either acting as an HTTP proxy between the browser and the HTTP server or converting HAR (Http ARchive) files. Either way, the Recorder generates a simple simulation that mimics your recorded navigation.
4. Go to Gatling > bin, open a command prompt, and run recorder.bat (for Windows) or recorder.sh (for Linux).
5. Fill in the package name and class name (e.g. RecordedSimulation); the Recorder will generate a Scala file containing the simulation.
Listening proxy port:
In the Recorder, you have to define one port (for both HTTP and HTTPS): the local proxy port. This is the port your browser must connect to so that the Recorder is able to capture your navigation.

Running

Once everything has been configured, press the Start button to launch the recorder.

Recorded Events

As you navigate through your application, the recorder will log three kinds of events:
·         Requests: The requests sent by the browser.
·         Pauses: The time between each request.
·         Tags: Manually set markers.
Proxy setting in browser:
            Configure the browser to use the Recorder's listening port as its proxy, then run $GATLING_HOME/bin/recorder.sh (or %GATLING_HOME%\bin\recorder.bat on Windows) from the command line.




Once run, the Recorder produces output like the following:
package computerdatabase // 1 The optional Package

import io.gatling.core.Predef._ // 2 The required imports
import io.gatling.http.Predef._
import scala.concurrent.duration._

class BasicSimulation extends Simulation { // 3 The class declaration, Extends Simulation

  val httpConf = http // 4 Common configuration for all HTTP requests
    .baseURL("http://computer-database.gatling.io") // 5 The base URL
    .acceptHeader("text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8") // 6 Common headers that will be sent with all requests
    .doNotTrackHeader("1")
    .acceptLanguageHeader("en-US,en;q=0.5")
    .acceptEncodingHeader("gzip, deflate")
    .userAgentHeader("Mozilla/5.0 (Windows NT 5.1; rv:31.0) Gecko/20100101 Firefox/31.0")

  val scn = scenario("BasicSimulation") // 7 The scenario definition
    .exec(http("request_1"// 8 Request name with will displayed in final reports
    .get("/")) // 9 URI
    .pause(5) // 10 Pause / Think time pause(5) means 5 seconds

  setUp( // 11 one sets up the scenarios that will be launched in this Simulation
    scn.inject(atOnceUsers(1)) // 12 Declaring to inject into scenario named scn one single user
  ).protocols(httpConf) // 13 Attaching the HTTP configuration declared above.
}

Logging:     changes we made to logback.xml
Set the "DEV_HOME" property to the location where you want the logs to be generated.

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <property name="DEV_HOME" value="c:/logs/gatling" />

  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%-5level] %logger{15} - %msg%n%rEx</pattern>
      <immediateFlush>false</immediateFlush>
    </encoder>
  </appender>

  <appender name="FILE-INFO" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${DEV_HOME}/info.log</file>
    <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
      <Pattern>%d{HH:mm:ss.SSS} [%-5level] %logger{15} - %msg%n%rEx</Pattern>
    </encoder>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <!-- rollover daily -->
      <fileNamePattern>${DEV_HOME}/archived/info.%d{yyyy-MM-dd}.%i.log</fileNamePattern>
      <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
        <maxFileSize>100MB</maxFileSize>
      </timeBasedFileNamingAndTriggeringPolicy>
    </rollingPolicy>
    <filter class="ch.qos.logback.classic.filter.LevelFilter">
      <level>INFO</level>
      <onMatch>ACCEPT</onMatch>
      <onMismatch>DENY</onMismatch>
    </filter>
  </appender>

  <appender name="FILE-ERROR" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${DEV_HOME}/error.log</file>
    <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
      <Pattern>%d{HH:mm:ss.SSS} [%-5level] %logger{15} - %msg%n%rEx</Pattern>
    </encoder>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <!-- rollover daily -->
      <fileNamePattern>${DEV_HOME}/archived/error.%d{yyyy-MM-dd}.%i.log</fileNamePattern>
      <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
        <maxFileSize>100MB</maxFileSize>
      </timeBasedFileNamingAndTriggeringPolicy>
    </rollingPolicy>
    <filter class="ch.qos.logback.classic.filter.LevelFilter">
      <level>ERROR</level>
      <onMatch>ACCEPT</onMatch>
      <onMismatch>DENY</onMismatch>
    </filter>
  </appender>

  <logger name="io.gatling.http" level="DEBUG" />

  <root level="INFO">
    <appender-ref ref="FILE-INFO" />
    <!-- <appender-ref ref="FILE-ERROR" /> -->
  </root>
</configuration>
Running Gatling

Launch the second script located in the bin directory (the first one being the Recorder):
·         On Linux/Unix:
·         $GATLING_HOME/bin/gatling.sh
·         On Windows:
·         %GATLING_HOME%\bin\gatling.bat
You should see a menu with the simulation examples:
Choose a simulation number:
   [0] computerdatabase.BasicSimulation
When the simulation is done, the console will display a link to the HTML reports.
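
Tip: if we already know which simulation we want, the interactive menu can be skipped by passing the class on the command line, e.g. $GATLING_HOME/bin/gatling.sh -s computerdatabase.BasicSimulation (the -s option selects the simulation class in the Gatling 2.x bundle).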
                               
Example of a LoadTest project: Employee-Management
Create a project named LoadTest and create a folder structure like:


1. Common: This folder holds files which will be used across the whole application.
a. ApplicationProperties.scala: loads the properties file and gets the base path (a minimal sketch of the rest of this object follows the getBasePath() method below).
o    With ConfigFactory.load(PATH) we load the properties file.
The getBasePath() method is used to get the base path:
import java.nio.file.Paths

def getBasePath(): String = {
  // getResource returns a URL, so convert it to a Path before walking up to the project root
  val gatlingConfPath = Paths.get(getClass.getClassLoader.getResource("gatling.conf").toURI)
  val projectRootDir = gatlingConfPath.getParent.getParent
  projectRootDir.toString
}
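
The rest of ApplicationProperties is not shown in this post; here is a minimal sketch, assuming a loadtest.conf on the classpath and illustrative key names for the values referenced later (base URL, credentials, feeder/body file names and load figures):

import com.typesafe.config.{Config, ConfigFactory}

object ApplicationProperties {

  // Loads loadtest.conf from the classpath (file name and keys below are illustrative);
  // the getBasePath() method shown above also lives in this object
  private val config: Config = ConfigFactory.load("loadtest")

  val baseURL: String = config.getString("baseURL")
  val userName: String = config.getString("userName")
  val password: String = config.getString("password")

  val organizationsIdFileName: String = config.getString("organizationsIdFileName")
  val employeeFileName: String = config.getString("employeeFileName")
  val employeeBodyName: String = config.getString("employeeBodyName")

  val getEmployeeByOrganizationIdUsers: Int = config.getInt("getEmployeeByOrganizationIdUsers")
  val createUpdateDeleteEmployeeUsers: Int = config.getInt("createUpdateDeleteEmployeeUsers")
  val loadtestDuration: Int = config.getInt("loadtestDuration")
}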
b. URILiterals.scala: all the URLs are set up in this file (a minimal sketch is given after the Constants object below).
Constants.scala:
import io.gatling.core.Predef._
import io.gatling.http.Predef._

object Constants {

  val HTTP_PROTOCOL_CONTENT_TYPE_HEADER_XML = http
    .baseURL(ApplicationProperties.baseURL)
    .inferHtmlResources(BlackList(""".*\.js""", """.*\.css""", """.*\.gif""", """.*\.jpeg""", """.*\.jpg""", """.*\.ico""", """.*\.woff""", """.*\.(t|o)tf""", """.*\.png"""), WhiteList())
    .acceptEncodingHeader("gzip,deflate")
    .contentTypeHeader("text/xmlcharset=UTF-8")
    .basicAuth(ApplicationProperties.userName, ApplicationProperties.password)

  val HTTP_PROTOCOL_CONTENT_TYPE_HEADER_JSON = http
    .baseURL(ApplicationProperties.baseURL)
    .inferHtmlResources(BlackList(""".*\.js""", """.*\.css""", """.*\.gif""", """.*\.jpeg""", """.*\.jpg""", """.*\.ico""", """.*\.woff""", """.*\.(t|o)tf""", """.*\.png"""), WhiteList())
    .acceptHeader("application/xml")
    .acceptEncodingHeader("gzip, deflate")
    .contentTypeHeader("application/json")
    .basicAuth(ApplicationProperties.userName, ApplicationProperties.password)

  val HTTP_PROTOCOL_ACCEPT_TYPE_HEADER_XML = http
    .baseURL(ApplicationProperties.baseURL)
    .inferHtmlResources(BlackList(""".*\.js""", """.*\.css""", """.*\.gif""", """.*\.jpeg""", """.*\.jpg""", """.*\.ico""", """.*\.woff""", """.*\.(t|o)tf""", """.*\.png"""), WhiteList())
    .acceptHeader("application/json")
    .header("DM-Accept-Encoding","application/gzip")
    .basicAuth(ApplicationProperties.userName, ApplicationProperties.password)

  val HTTP_PROTOCOL_ACCEPT_CONTENT_TYPE_HEADER_XML = http
    .baseURL(ApplicationProperties.baseURL)
    .inferHtmlResources(BlackList(""".*\.js""", """.*\.css""", """.*\.gif""", """.*\.jpeg""", """.*\.jpg""", """.*\.ico""", """.*\.woff""", """.*\.(t|o)tf""", """.*\.png"""), WhiteList())
    .acceptHeader("application/json")
    .header("DM-Accept-Encoding","application/gzip")
    .contentTypeHeader("application/xml")
    .basicAuth(ApplicationProperties.userName, ApplicationProperties.password)
}
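
URILiterals.scala is referenced throughout but not shown; a minimal sketch follows. The paths are illustrative (loosely based on the example URIs listed in the Detail section further below), and ${organizationId} / ${employeeId} are Gatling EL placeholders resolved from the feeder and from the value saved by the Location-header check:

object URILiterals {

  // Illustrative endpoint paths; ${...} are Gatling EL expressions resolved at runtime
  val GetEmployeeByOrganizationIdURI = "/Employee/getByOrganization/${organizationId}"
  val CreateEmployeeURI = "/Employee/createEmployee"
  val UpdateEmployeeURI = "/Employee/updateEmployee/${employeeId}"
  val DeleteEmployeeURI = "/Employee/deleteEmployee/${employeeId}"
}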

2. Utility: LoggerUtility.scala provides a utility to log all requests and responses.
import org.slf4j.Logger
import org.slf4j.LoggerFactory

object LoggerUtility {

  // Underlying SLF4J logger (renamed so it does not clash with the logger(...) method below)
  private val log: Logger = LoggerFactory.getLogger("LoggerUtility")

  def logger(extraInfo: ExtraInfo) {

    val url = "\n\n" + " Request: " + extraInfo.request +
        "\n" + " Request Name: " + extraInfo.requestName + ", Status: " + extraInfo.response.statusCode +
        // "\n" + " Request Body: " + getRequestBody(extraInfo.request) +
        "\n" + " Response: " + getResponseBody(extraInfo.response.body, extraInfo.requestName) // getResponseBody is a helper defined elsewhere in this utility

    log.info(url)
  }
}

·         extraInfoExtractor { extraInfo => List(LoggerUtility.logger(extraInfo)) } is used to capture the request/response details for logging.

3. Rest:
This is an example of a GET service where the organization ID is fed from a CSV file; we inject this feeder into the scenario so that each virtual user picks an organization ID.
object GetEmployeeByOrganizationId {

       val feeder = csv(ApplicationProperties.organizationsIdFileName).circular // fetches organization data from the CSV in circular order
 
       val uri = URILiterals.GetEmployeeByOrganizationIdURI
       val requestName = "REST.EmployeeService.GetLegacyMachineByOrganizationId"
  
       val getEmployeeByOrganizationId = http(requestName)
              .get(uri)
              .extraInfoExtractor { extraInfo => List(LoggerUtility.logger(extraInfo)) }

       val scn = scenario("GetEmployeeByOrganizationId").feed(feeder)
              .exec(getEmployeeByOrganizationId)
}
POST/PUT/DELETE request:
object CreateUpdateDeleteEmployee {


       val feeder = csv(ApplicationProperties.employeeFileName)
 
       val createEmployeeURI = URILiterals.CreateEmployeeURI  //1
       val createEmployeeServiceName = "REST.EmployeeService.CreateEmployee"
 
       val updateEmployeeURI = URILiterals.UpdateEmployeeURI //2
       val updateEmployeeServiceName = "REST.EmployeeService.UpdateEmployee"
 
       val deleteEmployeeURI = URILiterals.DeleteEmployeeURI //3
       val deleteEmployeeServiceName = "REST.EmployeeService.DeleteEmployee"

       val execHttpCreate = http(createEmployeeServiceName) //4
       .post(createEmployeeURI)
       .check(headerRegex("Location", "\\b[\\w-]+$").saveAs("employeeId")) //5
       .body(ElFileBody(ApplicationProperties.employeeBodyName)) //6
       .extraInfoExtractor { extraInfo => List(LoggerUtility.logger(extraInfo)) } //7

       val execHttpUpdate = http(updateEmployeeServiceName)
       .put(updateEmployeeURI)
       .body(ElFileBody(ApplicationProperties.employeeBodyName))
       .extraInfoExtractor { extraInfo => List(LoggerUtility.logger(extraInfo)) }

       val execHttpDelete = http(deleteEmployeeServiceName)
       .delete(deleteEmployeeURI)
       .extraInfoExtractor { extraInfo => List(LoggerUtility.logger(extraInfo)) }

       val createUpdateDeleteEmployeeScn = scenario("CreateUpdateDeleteEmployee").feed(feeder) //8
       .exec(execHttpCreate) //9
       .doIf("${employeeId.exists()}") { //10
              exec(execHttpUpdate)
       }.doIf("${employeeId.exists()}") {
              tryMax(3) { //11 it will try up to 3 times to delete the employee data
                     exec(execHttpDelete)
              }
       }
}
Detail
1: Create Employee URI, e.g. Employee/createEmployee, with payload
2: Update Employee URI, e.g. Employee/updateEmployee, with payload
3: Delete Employee URI, e.g. Employee/deleteEmployee, with payload
4: service name
5: check is used to extract data based on a regex; it pulls the value from the response and saves it in the variable employeeId
6: body takes the payload (here an ElFileBody)
7: extraInfoExtractor is used to get information for logging
8: creates the scenario; its name should be unique, and we can feed the feeder into it
9: exec() is used to execute requests
10: doIf runs its block only when employeeId exists in the session
11: tryMax(3) retries the delete request up to 3 times

Note: If we set up our load testing application using any dependencies, we have to put the jars into the Gatling lib folder. We can also run Gatling using Maven, but in the example application we will run from the command line.

Simulation:  Now we will create the Employee simulation.
import io.gatling.core.Predef._
import scala.concurrent.duration._

class EmployeeSimulation extends Simulation {

  setUp(
    // START SCENARIO
    GetEmployeeByOrganizationId.scn.inject( //1
      rampUsers(ApplicationProperties.getEmployeeByOrganizationIdUsers) over (ApplicationProperties.loadtestDuration minutes))
      .protocols(Constants.HTTP_PROTOCOL_ACCEPT_TYPE_HEADER_XML),

    CreateUpdateDeleteEmployee.createUpdateDeleteEmployeeScn.inject(
      rampUsers(ApplicationProperties.createUpdateDeleteEmployeeUsers) over (ApplicationProperties.loadtestDuration minutes))
      .protocols(Constants.HTTP_PROTOCOL_ACCEPT_CONTENT_TYPE_HEADER_XML)
    // END SCENARIO
  )
}
inject: injects the user load into the scenario.
rampUsers: number of users to inject.
over: the time window over which the users are ramped, e.g. 500 users over 1 minute.
protocols: contains the base URL, Content-Type header, Accept header, authentication, etc.
Bodies: in the bodies folder, put the XML or JSON files to be used as payloads via ElFileBody(file path).
employeeBody.xml
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<ns2:employee xmlns:ns2="http://demo.com/v3" xmlns:ns3="http://axiom.demo.com/Platform/Data/Services">
<ns2:name>${employeeName}</ns2:name>
<ns2:rollNo><ns2:name>6140R</ns2:name></ns2:rollNo>
</ns2:employee>

Data: organizationsIdFileName.csv
organizationId
1223
1188

employeeName.csv
employeeName
John martin
Raman raghav