What You Will Learn
This course equips you with production-grade performance testing skills using Apache JMeter. You will design realistic load scenarios with thread groups and think time, build maintainable scripts using recording and correlation, validate performance SLAs with assertions, and execute distributed load tests across AWS EC2 clusters — all integrated into Jenkins CI/CD pipelines.
- Performance testing types — load (normal traffic), stress (push to the limit), spike (sudden surge), soak (run for hours to catch leaks)
- The difference between functional testing ("does it work?") and performance testing ("does it work for everyone at once?")
- Why Java 11 is recommended — newer versions break some JMeter plugins; `JMETER_HOME` environment variable setup
- JMeter's folder structure — `bin/` for executables, `lib/` for drivers, `lib/ext/` for plugin JARs
- GUI mode for building and debugging scripts; non-GUI mode (`jmeter -n -t test.jmx -l results.jtl`) for actual load runs
- The JMeter test plan hierarchy: Test Plan → Thread Group → Samplers → Listeners — each element nests inside the one above it
- An e-commerce demo application runs throughout the course — every test is built on a real website with real data
- How performance testing fits into a QA career — salary ranges, job titles, and what interviewers actually ask
- Thread Group element — `Number of Threads` (= concurrent virtual users), `Ramp-Up Period` (seconds for all users to start), `Loop Count`
- Why gradual ramp-up matters — sudden floods of users create artificial spikes; real traffic builds gradually like morning rush hour
- Scheduler setup — running a test at a specific start time, for a specific duration, with a startup delay
- Steady-state testing — keeping a fixed user count for 30–60 minutes to surface memory leaks and slow degradation
- Load shape patterns — linear ramp, step load (add 100 users every 5 minutes), spike (sudden surge and drop)
- `Delay Thread Creation Until Needed` checkbox — stops JMeter from pre-allocating memory for all users upfront
- Error handling — choosing whether to Continue, Stop Thread, or Stop Test when a sampler fails
- Ultimate Thread Group plugin for complex load shapes that the built-in Thread Group can't express
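A quick way to reason about these load shapes is to compute how many virtual users are active at a given moment. The sketch below is plain Python, not JMeter code; the 500-users-over-10-minutes and 100-users-every-5-minutes figures simply mirror the examples above.

```python
def linear_ramp_users(t_seconds, total_users=500, ramp_up_seconds=600):
    """Linear ramp: threads start evenly spread across the ramp-up period."""
    if t_seconds >= ramp_up_seconds:
        return total_users
    return int(total_users * t_seconds / ramp_up_seconds)


def step_load_users(t_seconds, step_size=100, step_interval=300, max_users=500):
    """Step load: add step_size users every step_interval seconds."""
    steps_started = t_seconds // step_interval + 1
    return min(steps_started * step_size, max_users)


# Halfway through a 10-minute ramp to 500 users, ~250 users are active.
print(linear_ramp_users(300))   # 250
# 650 s into a step load: the third step of 100 users has started.
print(step_load_users(650))     # 300
```

The same shapes are what the Ultimate Thread Group plugin lets you draw directly in the GUI.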
- HTTP Request sampler — `Server Name`, `Port`, `Protocol`, `Method` (GET/POST/PUT/DELETE), and `Path` configuration
- HTTP methods — `GET` fetches data; `POST` submits forms or creates resources; `PUT`/`PATCH` update; `DELETE` removes
- HTTP Header Manager — adding `Content-Type: application/json` and `Authorization` headers to every request
- Cookie Manager — JMeter automatically handles session cookies so you stay logged in across requests
- HTTPS/SSL — JMeter can test secure sites; certificate handling keeps test traffic looking like a real browser
- Status codes decoded: `200` success, `201` created, `302` redirect, `400` bad request, `401` not authenticated, `404` not found, `500` server error
- How redirect following works — JMeter automatically follows HTTP→HTTPS redirects if the checkbox is enabled
- Cache Manager — simulating a browser that remembers previously downloaded assets
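As a memory aid, the status-code families above can be collapsed into a small lookup (plain Python, nothing JMeter-specific):

```python
def describe_status(code):
    """Map an HTTP status code to the shorthand used when reading results."""
    exact = {200: "success", 201: "created", 302: "redirect",
             400: "bad request", 401: "not authenticated",
             404: "not found", 500: "server error"}
    # Fall back to the family: 2xx success, 3xx redirection,
    # 4xx client error (your script), 5xx server error (their problem).
    family = {2: "success", 3: "redirection", 4: "client error", 5: "server error"}
    return exact.get(code, family.get(code // 100, "unknown"))

print(describe_status(201))  # created
print(describe_status(503))  # server error
```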
- HTTP(S) Test Script Recorder — JMeter acts as a "middleman" between your browser and the website, capturing all traffic
- Setting your browser's proxy to `localhost:8888` so it routes through JMeter's recorder
- Installing JMeter's SSL certificate so it can record secure (HTTPS) sites without certificate errors
- Choosing where recorded requests land — picking the right controller as the recording target
- Filtering rules — exclude images, CSS, JavaScript, fonts, analytics — keep only the meaningful app requests
- Request grouping — sorting recorded steps into logical folders: "Login", "Browse Products", "Checkout"
- Recording a complete e-commerce journey: home page → search → product detail → add to cart → checkout
- Post-recording cleanup — removing duplicates, parameterizing hard-coded values, making it reusable
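The exclude-filter idea can be tried outside JMeter. The regex below is an illustrative example of the kind of pattern you would paste into the recorder's "URL Patterns to Exclude" field; the recorded URLs are made up.

```python
import re

# Exclude static assets (images, CSS, JS, fonts) so only meaningful
# application requests survive the recording session.
EXCLUDE = re.compile(r".*\.(png|jpe?g|gif|css|js|woff2?|ico)(\?.*)?$")

recorded = [
    "/home", "/static/app.css", "/search?q=shoes",
    "/img/logo.png", "/cart/add", "/fonts/main.woff2",
]
kept = [url for url in recorded if not EXCLUDE.match(url)]
print(kept)  # ['/home', '/search?q=shoes', '/cart/add']
```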
- `CSV Data Set Config` element — `Filename` path, `Variable Names` (column headers), `Delimiter`, `Recycle on EOF`, `Stop Thread on EOF` settings
- Creating test data files — a spreadsheet saved as CSV with usernames, passwords, products, quantities per row
- Variable syntax `${username}`, `${password}` — drop these into any request field and JMeter substitutes real values
- `${__Random(1,1000,orderId)}` function — generate a random number for order IDs or prices
- Sequential data access (first thread gets row 1, second gets row 2) vs. random access per test run
- Multiple CSV files — separate files for user credentials, product catalog, shipping addresses
- `Recycle on EOF` = `True` loops infinitely through your data; `False` stops when rows run out
- Combining CSV variables with JMeter functions — encoding special characters in usernames for URL safety
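A minimal sketch of the `Recycle on EOF` = `True` behavior, using Python's csv module with illustrative data: when threads exhaust the rows, the file wraps around to the top.

```python
import csv
import io
import itertools

# Illustrative three-row credentials file.
csv_text = "username,password\nalice,pw1\nbob,pw2\ncarol,pw3\n"
rows = list(csv.DictReader(io.StringIO(csv_text)))

recycled = itertools.cycle(rows)          # Recycle on EOF = True
first_five = [next(recycled)["username"] for _ in range(5)]
print(first_five)  # ['alice', 'bob', 'carol', 'alice', 'bob']
```

With `Recycle on EOF` = `False` and `Stop Thread on EOF` = `True`, the equivalent would be a plain iterator that stops threads once the rows run out.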
- Why correlation is critical — session IDs and CSRF tokens change every login; hard-coded values cause instant failures under load
- Regular Expression Extractor — uses patterns like `name="token" value="(.+?)"` with `Template` `$1$` to capture a value
- `Reference Name` — the variable name you give to the extracted value; used as `${tokenName}` in subsequent requests
- JSON Extractor — for API responses, uses JSONPath syntax like `$.data.id` to pull values from nested JSON
- XPath Extractor — for HTML pages, finds hidden form fields using XPath expressions
- CSRF token handling — extract the anti-forgery token from the login page, inject it into the login POST request body
- Cookie-based session persistence — the Cookie Manager keeps your session alive across the full test scenario
- Debugging correlation failures — using View Results Tree to compare expected vs. actual extracted values step by step
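Here is what that extractor pattern does, reproduced in plain Python against a made-up login page (the form and token value are illustrative):

```python
import re

# Stand-in for the HTML the login-page sampler would return.
login_page = '''
<form action="/login" method="post">
  <input type="hidden" name="token" value="a1b2c3d4e5">
  <input name="username"><input name="password" type="password">
</form>
'''

# Same pattern as the Regular Expression Extractor above.
match = re.search(r'name="token" value="(.+?)"', login_page)
csrf_token = match.group(1)   # Template $1$ means "first capture group"
print(csrf_token)             # a1b2c3d4e5

# The extracted value is then injected into the login POST body,
# exactly what ${tokenName} substitution does in the next request.
body = f"username=alice&password=secret&token={csrf_token}"
```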
- Response Assertion — checks that the response body, code, or headers `Contains`, `Matches`, or `Equals` expected text
- Negate assertion — confirm that an error message does NOT appear in a successful response
- JSON Assertion — validates specific fields in API responses using JSONPath, like `$.status` equals `"success"`
- Duration Assertion — your SLA checker: if a response takes more than 3,000 ms, mark it failed regardless of content
- Size Assertion — flags suspiciously small responses that might be empty error pages rather than full page content
- Assertion scope — `Main sample only` checks just the primary request; `Main sample and sub-samples` checks everything including embedded resources
- Combining assertions — stack a Response Assertion AND a Duration Assertion on the same request for thorough validation
- How failed assertions show up in the Aggregate Report — they increase the error rate and trigger CI/CD failures
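Stacking a Response Assertion and a Duration Assertion amounts to two independent checks on the same sample. A sketch, with a dict standing in for a sampler result (field names and values are illustrative):

```python
def check_sample(sample, must_contain="Order confirmed", max_ms=3000):
    """Return the list of assertion failures for one sample (empty = pass)."""
    failures = []
    if must_contain not in sample["body"]:            # Response Assertion (Contains)
        failures.append(f"body does not contain {must_contain!r}")
    if sample["elapsed_ms"] > max_ms:                 # Duration Assertion (SLA)
        failures.append(f"took {sample['elapsed_ms']} ms > {max_ms} ms")
    return failures

ok = {"body": "Order confirmed #123", "elapsed_ms": 850}
slow = {"body": "Order confirmed #124", "elapsed_ms": 4200}
print(check_sample(ok))    # []
print(check_sample(slow))  # ['took 4200 ms > 3000 ms']
```

Any non-empty failure list is what bumps the Error % in the Aggregate Report.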
- Constant Timer — adds a fixed pause (e.g., 2,000ms) before every request it applies to
- Uniform Random Timer — varies the delay within a range (e.g., 1–5 seconds) for more natural pacing
- Gaussian Random Timer — bell-curve distributed delays: most pauses cluster around 3 seconds, a few are shorter or longer
- Poisson Random Timer — mathematically models inter-arrival times to replicate real user behavior patterns
- Synchronizing Timer — "hold all threads here until N users are ready, then release them all at once" for burst testing
- Constant Throughput Timer — instead of setting delays, you set a target rate (e.g., 60 requests/minute) and JMeter adjusts automatically
- Think time rationale — pauses represent reading, decision-making, and form-filling that real humans do between clicks
- Timer scope — placing a timer inside a request applies it only there; placing it at Thread Group level applies it everywhere
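The pacing math behind two of these timers can be sketched in a few lines (values in milliseconds; the defaults mirror the examples above and are otherwise arbitrary):

```python
import random

random.seed(7)  # reproducible demo values

def uniform_random_delay(base_ms=1000, range_ms=4000):
    """Uniform Random Timer: base + U(0, range), i.e. 1-5 s here."""
    return base_ms + random.uniform(0, range_ms)

def constant_throughput_delay(target_per_minute=60):
    """Constant Throughput Timer (the idea only): even spacing for a rate."""
    return 60_000 / target_per_minute   # 60 req/min -> one request every 1000 ms

delay = uniform_random_delay()
assert 1000 <= delay <= 5000            # always lands in the 1-5 s window
print(constant_throughput_delay())      # 1000.0
```

The real Constant Throughput Timer continuously adjusts the delay based on observed throughput rather than using a fixed spacing, but the target arithmetic is the same.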
- `If Controller` — runs its child samplers only when a condition is true, like `"${lastResponseCode}" == "200"`
- `Loop Controller` — repeats a block of requests a set number of times (e.g., add 5 products before checking out)
- `While Controller` — keeps looping until a condition becomes false (e.g., poll an API until a job status is "complete")
- `ForEach Controller` — iterates over a numbered list of variables (`productId_1`, `productId_2`, ...) one by one
- `Transaction Controller` — wraps multiple requests into one named business transaction; reports show one combined response time for "Full Checkout"
- `Simple Controller` — an organizational folder that groups requests visually without affecting execution logic
- `${JMeterThread.last_sample_ok}` — built-in variable that tells you whether the previous sampler passed its assertions
- Nesting controllers — combining If + Loop + Transaction to model complex, realistic multi-step user behaviors
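The While Controller polling pattern, sketched in plain Python with a stub standing in for the HTTP sampler (the job states are illustrative):

```python
# Each call to fake_poll() plays the role of one "check job status" sampler.
job_states = iter(["queued", "running", "running", "complete"])

def fake_poll():
    return next(job_states)

attempts = 0
status = fake_poll()
while status != "complete":        # the While Controller's condition
    attempts += 1
    status = fake_poll()

print(status, attempts)  # complete 3
```

In a real plan you would also cap the loop (e.g., with a counter in the condition) so a stuck job cannot poll forever.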
- View Results Tree — full request/response inspector for debugging; shows exactly what JMeter sent and received
- Aggregate Report — the main results table: Samples, Average, Min, Max, 90th/95th/99th percentile, Error %, Throughput (req/sec)
- Why average response time is misleading — the 95th percentile shows what most users actually experienced under load
- Graph Results — real-time chart of response times; useful for watching trends develop during a test run
- Active Threads Over Time (plugin) — shows how many concurrent users were running at each point in time
- Response Times Over Time (plugin) — trends in latency; reveals when performance degraded during the test
- JTL file format — raw CSV of every sample: timestamp, elapsed time, label, response code, success flag
- Error rate analysis — 4xx errors mean your script has a bug; 5xx errors mean the server is struggling under load
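Because a JTL is just CSV, the Aggregate Report numbers can be recomputed by hand. The excerpt below is illustrative (real JTLs carry more columns), and the percentile uses a simplified nearest-rank method:

```python
import csv
import io
import math

# Illustrative five-sample JTL excerpt.
jtl = """timeStamp,elapsed,label,responseCode,success
1,120,Home,200,true
2,180,Home,200,true
3,95,Home,200,true
4,2400,Home,200,true
5,150,Home,500,false
"""

rows = list(csv.DictReader(io.StringIO(jtl)))
elapsed = sorted(int(r["elapsed"]) for r in rows)

average = sum(elapsed) / len(elapsed)                  # 589.0 ms
p95 = elapsed[math.ceil(len(elapsed) * 0.95) - 1]      # 2400 ms (nearest rank)
error_rate = sum(r["success"] == "false" for r in rows) / len(rows)  # 0.2

# One slow sample drags p95 far above the typical ~150 ms,
# which the average alone would understate.
print(average, p95, error_rate)
```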
- Plugins Manager installation — drop `jmeter-plugins-manager.jar` into `lib/ext/` and restart; browse available plugins from the GUI
- PerfMon Plugin — installs a lightweight agent on the server; sends CPU, memory, disk I/O, and network metrics back to JMeter in real time
- Custom Thread Groups (Ultimate Thread Group, Concurrency Thread Group) — express complex load shapes the standard Thread Group cannot
- Response Times Percentiles graph — visualize how the 90th and 99th percentiles change across the test duration
- Transactions per Second (TPS) graph — see how throughput evolves as load increases
- JDBC sampler — test your database directly; requires the correct driver JAR in `lib/` and a JDBC Connection Configuration element
- WebSocket sampler — load test real-time applications like chat or live dashboards
- Plugin/Java version compatibility — always check that a plugin supports your Java 11 + JMeter 5.x combination before installing
- JMeter's distributed master/agent architecture (also called controller/worker) — one machine coordinates, multiple machines generate load
- AWS EC2 provisioning — choosing an instance type, configuring a security group to allow JMeter's RMI ports
- Installing Java and JMeter on remote EC2 Linux instances — same version as the master to avoid compatibility issues
- `user.properties` configuration — `remote_hosts=IP1,IP2,IP3` registers agent IPs with the master
- Starting the agent daemon on each EC2 instance — `bin/jmeter-server &` makes it listen for the master
- `Remote Start All` from the JMeter GUI master — sends the test plan to all agents simultaneously and starts them
- RMI port configuration — `server.rmi.localport=4000` pins the port so it can be opened in the AWS security group rules
- Result aggregation — all agent JTL files stream back to the master; one combined report covers all virtual users
- Multi-step correlation chains — extract an ID from step 1, use it in step 2 to get a token, use that token in step 3
- JSR223 Pre/Post Processors with Groovy scripting — when regex alone isn't enough, write a few lines of code
- `vars.get("variableName")` and `vars.put("variableName", value)` — the JMeter scripting API for reading and writing variables
- `prev.getResponseDataAsString()` — access the full previous response body inside a post-processor script
- Advanced regex patterns — lookahead `(?=...)` and lookbehind `(?<=...)` to extract values between specific delimiters
- Handling JavaScript-generated dynamic values — scanning raw HTML for hidden fields added by client-side code
- Pre-Processors run before a request; Post-Processors run after — understanding the execution order prevents bugs
- Debug Sampler — adds a fake "request" that prints all current variable values to the results tree for live inspection
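A lookaround example in plain Python (the `sessionToken` variable and the HTML fragment are made up for illustration): lookbehind and lookahead match the surrounding delimiters without capturing them, so the extracted value is clean.

```python
import re

# Stand-in for a page where client-side JS embeds a dynamic value.
html = "var sessionToken = 'abc123xyz'; // set by client-side JS"

# Lookbehind for the opening quote context, lookahead for the closing quote.
token = re.search(r"(?<=sessionToken = ')[^']+(?=')", html).group(0)
print(token)  # abc123xyz
```

The same pattern works in JMeter's Regular Expression Extractor, since it uses compatible (Perl-style) regex syntax.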
- HTML Dashboard Report generation — `jmeter -g results.jtl -o report-output/` produces charts, tables, and trend graphs automatically
- JTL configuration — enabling all fields (`jmeter.save.saveservice.*` properties) so the HTML report has full data to work with
- Jenkins integration — the Performance Plugin shows response time and error rate trends across every build
- Jenkins pipeline Groovy stage — `sh 'jmeter -n -t test.jmx -l results.jtl'` runs the test; results feed into the next stage
- Pass/fail thresholds — configure error rate and response time limits; if exceeded, Jenkins marks the build as UNSTABLE or FAILED
- Non-GUI mode best practices — always disable View Results Tree and other heavy listeners; increase the JVM heap with `JVM_ARGS=-Xmx4g`
- Continuous performance testing strategy — run smoke-sized load tests on every commit; full regression overnight
- Making the performance report part of every sprint review — showing the team trends, not just pass/fail
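The threshold logic can be sketched as a standalone gate over parsed JTL samples. The limits and data below are illustrative; in practice the Jenkins Performance Plugin evaluates these for you and sets the build status.

```python
# Parsed samples from a results file (illustrative values).
samples = [
    {"elapsed": 410, "success": True},
    {"elapsed": 520, "success": True},
    {"elapsed": 3900, "success": False},
    {"elapsed": 480, "success": True},
]

MAX_ERROR_RATE = 0.10   # more than 10% errors -> FAILED
MAX_AVG_MS = 2000       # slow average -> UNSTABLE

error_rate = sum(not s["success"] for s in samples) / len(samples)
avg_ms = sum(s["elapsed"] for s in samples) / len(samples)

if error_rate > MAX_ERROR_RATE:
    verdict = "FAILED"
elif avg_ms > MAX_AVG_MS:
    verdict = "UNSTABLE"
else:
    verdict = "SUCCESS"

print(verdict)  # FAILED (1 of 4 samples errored, a 25% error rate)
```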
Performance KPIs
I track response time (how fast), throughput (TPS — transactions per second), error rate, and the 95th percentile. Average response time lies — if 95% of users get 1 second but 5% get 30 seconds, the average looks fine but your SLA is broken.
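The arithmetic behind that example:

```python
# 95 users get 1 s, 5 users get 30 s (the numbers from the paragraph above).
times = [1.0] * 95 + [30.0] * 5

average = sum(times) / len(times)   # 2.45 s: passes a 3 s SLA on paper
p99 = sorted(times)[98]             # 30.0 s: the tail the average hides

print(average, p99)
```

This is why reports should always show high percentiles alongside the average.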
Correlation
Websites issue unique session IDs and CSRF tokens per login. A Regular Expression Extractor or JSON Extractor captures these values into `${variables}` that are injected into subsequent requests — without this, every post-login request fails with a 401.
Non-GUI Mode
`jmeter -n -t test.jmx -l results.jtl` runs without the interface. The GUI consumes significant resources — disabling it gives you 20–30% more throughput capacity from the same machine. Always run real tests in non-GUI mode.
JMeter vs Gatling/k6
JMeter: GUI-based, Java, huge plugin ecosystem, industry standard for QA teams. Gatling: code-based (Scala/JS), developer-friendly, better async support. k6: JavaScript, cloud-native, great for developer-driven perf testing. For most QA interviews, deep JMeter knowledge is what they ask about.
Thread Group Design
I ramp users up gradually — say 500 users over 10 minutes — to simulate real morning rush traffic. I run soak tests for 30–60 minutes to catch memory leaks. The Scheduler and Duration settings make the test self-terminating so it doesn't run forever in CI/CD.
Distributed Testing
One machine saturates its own NIC around 500–1,000 users. JMeter distributed mode uses multiple AWS EC2 agents registered in `user.properties` via `remote_hosts`. The master sends the test plan to all agents simultaneously and aggregates all JTL results into one report.