# TestNG Comprehensive Cheatsheet
## Installation

| Platform/Tool | Installation Method |
|---|---|
| Maven | Add to `pom.xml`: `<dependency><groupId>org.testng</groupId><artifactId>testng</artifactId><version>7.8.0</version><scope>test</scope></dependency>` |
| Gradle | Add to `build.gradle`: `testImplementation 'org.testng:testng:7.8.0'` |
| Eclipse IDE | Help → Eclipse Marketplace → Search "TestNG" → Install |
| IntelliJ IDEA | Bundled by default (or File → Settings → Plugins → Marketplace → "TestNG") |
| VS Code | `code --install-extension vscjava.vscode-java-pack`<br>`code --install-extension testng.testng` |
| Manual (Linux/macOS) | `wget https://repo1.maven.org/maven2/org/testng/testng/7.8.0/testng-7.8.0.jar`<br>`export CLASSPATH=$CLASSPATH:/path/to/testng-7.8.0.jar` |
| Manual (Windows) | Download the JAR from Maven Central, then `set CLASSPATH=%CLASSPATH%;C:\path\to\testng-7.8.0.jar` |
## Core Annotations

| Annotation | Description |
|---|---|
| `@Test` | Marks a method as a test method |
| `@Test(priority = 1)` | Sets execution order (lower numbers run first) |
| `@Test(description = "...")` | Adds descriptive text to the test for reporting |
| `@Test(timeOut = 5000)` | Fails the test if execution exceeds the timeout (milliseconds) |
| `@Test(expectedExceptions = Exception.class)` | Expects the specified exception to be thrown |
| `@Test(enabled = false)` | Disables/skips the test |
| `@Test(groups = {"smoke", "regression"})` | Assigns the test to one or more groups |
| `@Test(dependsOnMethods = {"testMethod"})` | Runs after the specified method(s) complete |
| `@Test(dependsOnGroups = {"smoke"})` | Runs after all tests in the specified group(s) |
| `@Test(alwaysRun = true)` | Runs the test even if its dependencies fail |
| `@Test(invocationCount = 3)` | Runs the test multiple times |
| `@Test(threadPoolSize = 5)` | Runs multiple invocations in parallel threads (used with `invocationCount`) |
| `@BeforeMethod` | Executes before each `@Test` method |
| `@AfterMethod` | Executes after each `@Test` method |
| `@BeforeClass` | Executes once before any test method in the class |
| `@AfterClass` | Executes once after all test methods in the class |
| `@BeforeTest` | Executes before any test method in a `<test>` tag |
| `@AfterTest` | Executes after all test methods in a `<test>` tag |
| `@BeforeSuite` | Executes once before all tests in the suite |
| `@AfterSuite` | Executes once after all tests in the suite |
| `@BeforeGroups` | Executes before the first test method of the specified group(s) |
| `@AfterGroups` | Executes after the last test method of the specified group(s) |
| `@DataProvider` | Supplies data to test methods for parameterization |
| `@Parameters` | Injects parameters from `testng.xml` into test methods |
| `@Factory` | Creates test class instances dynamically |
| `@Listeners` | Attaches custom listeners to a test class |
## Command Line Execution

| Command | Description |
|---|---|
| `java -cp "classes:lib/*" org.testng.TestNG testng.xml` | Run tests using an XML suite file |
| `java -cp "classes:lib/*" org.testng.TestNG -testclass com.example.MyTest` | Run a specific test class |
| `java -cp "classes:lib/*" org.testng.TestNG -testclass Test1,Test2` | Run multiple test classes (comma-separated) |
| `java -cp "classes:lib/*" org.testng.TestNG -groups smoke testng.xml` | Run tests from specific group(s) |
| `java -cp "classes:lib/*" org.testng.TestNG -excludegroups slow testng.xml` | Exclude specific group(s) from execution |
| `java -cp "classes:lib/*" org.testng.TestNG -d test-output testng.xml` | Specify output directory for reports |
| `java -cp "classes:lib/*" org.testng.TestNG -parallel methods -threadcount 5` | Run tests in parallel with the given thread count |
| `java -cp "classes:lib/*" org.testng.TestNG -verbose 10 testng.xml` | Set verbosity level (0–10; higher = more detail) |
| `java -cp "classes:lib/*" org.testng.TestNG -methods MyTest.test1,MyTest.test2` | Run specific test methods |
| `java -cp "classes:lib/*" org.testng.TestNG -suitename "MySuite" -testname "MyTest"` | Override suite and test names |
| `java -cp "classes:lib/*" org.testng.TestNG -reporter org.testng.reporters.EmailableReporter` | Use a specific reporter |
| `java -cp "classes:lib/*" org.testng.TestNG -listener com.example.MyListener` | Add a custom listener |
## Maven Commands

| Command | Description |
|---|---|
| `mvn test` | Run all tests |
| `mvn test -Dtest=MyTestClass` | Run a specific test class |
| `mvn test -Dtest=MyTestClass#testMethod` | Run a specific test method |
| `mvn test -Dtest=MyTestClass#test*` | Run test methods matching a pattern |
| `mvn test -DsuiteXmlFile=smoke-tests.xml` | Run a specific suite XML file (assumes the pom references `${suiteXmlFile}` in `suiteXmlFiles`) |
| `mvn test -Dgroups=smoke,regression` | Run specific test groups |
| `mvn test -DexcludedGroups=slow` | Exclude specific test groups |
| `mvn test -Denvironment=staging -Dbrowser=chrome` | Pass system properties to tests |
| `mvn test -DskipTests` | Skip test execution |
| `mvn test -Dmaven.test.failure.ignore=true` | Continue the build even if tests fail |
| `mvn test -Dparallel=methods -DthreadCount=4` | Run tests in parallel |
| `mvn clean test` | Clean previous builds and run tests |
| `mvn test -X` | Run tests with debug output |
| `mvn surefire-report:report` | Generate an HTML test report |
## Gradle Commands

| Command | Description |
|---|---|
| `gradle test` | Run all tests |
| `gradle test --tests MyTestClass` | Run a specific test class |
| `gradle test --tests MyTestClass.testMethod` | Run a specific test method |
| `gradle test --tests '*IntegrationTest'` | Run tests matching a pattern (quote the glob so the shell doesn't expand it) |
| `gradle test --tests MyTestClass --tests OtherTest` | Run multiple test classes |
| `gradle test -Denvironment=staging` | Pass system properties |
| `gradle clean test` | Clean and run tests |
| `gradle test --info` | Run with detailed logging |
| `gradle test --debug` | Run with debug-level logging |
| `gradle test --rerun-tasks` | Force a re-run even if up-to-date |
| `gradle test --continue` | Continue execution after test failures |
| `gradle test --fail-fast` | Stop execution on the first test failure |
## Assertion Methods

Note that TestNG's parameter order is `(actual, expected)` — the reverse of JUnit 4.

| Method | Description |
|---|---|
| `Assert.assertEquals(actual, expected)` | Verify two values are equal |
| `Assert.assertEquals(actual, expected, "message")` | Assert with a custom failure message |
| `Assert.assertNotEquals(actual, expected)` | Verify two values are not equal |
| `Assert.assertTrue(condition)` | Verify the condition is true |
| `Assert.assertFalse(condition)` | Verify the condition is false |
| `Assert.assertNull(object)` | Verify the object is null |
| `Assert.assertNotNull(object)` | Verify the object is not null |
| `Assert.assertSame(actual, expected)` | Verify both refer to the same object |
| `Assert.assertNotSame(actual, expected)` | Verify they refer to different objects |
| `Assert.fail("message")` | Explicitly fail a test |
| `Assert.assertThrows(Exception.class, () -> {...})` | Verify the exception is thrown |
| `Assert.expectThrows(Exception.class, () -> {...})` | Like `assertThrows`, but returns the thrown exception for further assertions |
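The difference between the last two rows is subtle: `assertThrows` only verifies that the exception occurs, while `expectThrows` also returns it so you can assert on its message or state. A plain-Java sketch of that contract (simplified and illustrative, not TestNG's actual implementation, which accepts a throwing functional interface rather than `Runnable`):

```java
public class ExpectThrowsSketch {
    // Run the code, demand that it throws the expected type, and hand the
    // caught exception back to the caller for further inspection.
    public static <T extends Throwable> T expectThrows(Class<T> expected, Runnable code) {
        try {
            code.run();
        } catch (Throwable t) {
            if (expected.isInstance(t)) {
                return expected.cast(t); // expected exception: return it
            }
            throw new AssertionError("Unexpected exception type: " + t.getClass(), t);
        }
        throw new AssertionError("Expected " + expected.getName() + " but nothing was thrown");
    }

    public static void main(String[] args) {
        IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
                () -> { throw new IllegalArgumentException("bad input"); });
        System.out.println(e.getMessage()); // the returned exception is inspectable
    }
}
```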
## Data Providers

| Pattern | Description |
|---|---|
| `@DataProvider(name = "testData")` | Define a data provider method |
| `@Test(dataProvider = "testData")` | Use the data provider in a test |
| `@DataProvider(parallel = true)` | Run data provider iterations in parallel |
| `Object[][] dataProvider()` | Return a 2D array of test data |
| `Iterator<Object[]> dataProvider()` | Return an iterator for large datasets |
| `@DataProvider(indices = {0, 2, 4})` | Run only the specified data set indices |
### Data Provider Example

```java
@DataProvider(name = "loginData")
public Object[][] getLoginData() {
    return new Object[][] {
        {"user1", "pass1", true},
        {"user2", "pass2", false},
        {"user3", "pass3", true}
    };
}

@Test(dataProvider = "loginData")
public void testLogin(String username, String password, boolean expected) {
    boolean result = login(username, password);
    Assert.assertEquals(result, expected);
}
```
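Besides the 2D-array form above, a provider can return `Iterator<Object[]>` so rows are generated lazily, which keeps memory flat for large or computed datasets. A runnable sketch of the lazy-generation idea (plain Java; in a real test the method would carry `@DataProvider`, and the `user`/`pass` values here are made up):

```java
import java.util.Iterator;
import java.util.stream.IntStream;

public class LazyLoginData {
    // Returning an Iterator lets TestNG pull one row at a time instead of
    // materializing the whole 2D array up front.
    public static Iterator<Object[]> bulkLogins(int count) {
        return IntStream.rangeClosed(1, count)
                .mapToObj(i -> new Object[] { "user" + i, "pass" + i })
                .iterator();
    }

    public static void main(String[] args) {
        Iterator<Object[]> rows = bulkLogins(3);
        while (rows.hasNext()) {
            Object[] row = rows.next();
            System.out.println(row[0] + "/" + row[1]);
        }
    }
}
```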
## TestNG XML Configuration

### Basic Suite Configuration
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Test Suite" parallel="methods" thread-count="5" verbose="1">
  <!-- Suite-level parameters -->
  <parameter name="browser" value="chrome"/>
  <parameter name="environment" value="staging"/>

  <!-- Define test groups -->
  <test name="Smoke Tests">
    <groups>
      <run>
        <include name="smoke"/>
        <exclude name="slow"/>
      </run>
    </groups>
    <!-- Specify test classes -->
    <classes>
      <class name="com.example.LoginTest"/>
      <class name="com.example.SearchTest">
        <!-- Include specific methods -->
        <methods>
          <include name="testBasicSearch"/>
          <include name="testAdvancedSearch"/>
        </methods>
      </class>
    </classes>
  </test>

  <!-- Another test configuration -->
  <test name="Regression Tests">
    <packages>
      <package name="com.example.regression.*"/>
    </packages>
  </test>

  <!-- Listeners -->
  <listeners>
    <listener class-name="com.example.CustomListener"/>
  </listeners>
</suite>
```
### Parallel Execution Configuration

```xml
<!-- Parallel at suite level -->
<suite name="Parallel Suite" parallel="tests" thread-count="3">
  <test name="Test1">...</test>
  <test name="Test2">...</test>
</suite>
<!-- Parallel options: methods, tests, classes, instances -->
```
## Maven Surefire Plugin Configuration

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>3.0.0</version>
      <configuration>
        <!-- Specify suite files -->
        <suiteXmlFiles>
          <suiteXmlFile>testng.xml</suiteXmlFile>
          <suiteXmlFile>smoke-tests.xml</suiteXmlFile>
        </suiteXmlFiles>
        <!-- Run specific groups -->
        <groups>smoke,regression</groups>
        <excludedGroups>slow,manual</excludedGroups>
        <!-- Parallel execution -->
        <parallel>methods</parallel>
        <threadCount>5</threadCount>
        <!-- System properties -->
        <systemPropertyVariables>
          <browser>chrome</browser>
          <environment>staging</environment>
        </systemPropertyVariables>
        <!-- Set to true to continue the build despite test failures -->
        <testFailureIgnore>false</testFailureIgnore>
      </configuration>
    </plugin>
  </plugins>
</build>
```
## Gradle Test Configuration

```groovy
test {
    useTestNG() {
        // Suite files
        suites 'src/test/resources/testng.xml'
        // Include/exclude groups
        includeGroups 'smoke', 'regression'
        excludeGroups 'slow'
        // Parallel execution
        parallel = 'methods'
        threadCount = 5
        // Preserve order
        preserveOrder = true
        // Group by instances
        groupByInstances = true
    }
    // System properties
    systemProperty 'browser', 'chrome'
    systemProperty 'environment', 'staging'
    // Test output
    testLogging {
        events "passed", "skipped", "failed"
        exceptionFormat "full"
    }
}
```
## Common Use Cases

### Use Case 1: Basic Test Class with Setup and Teardown
```java
import org.testng.annotations.*;
import org.testng.Assert;

public class UserManagementTest {
    private DatabaseConnection db;
    private UserService userService;

    @BeforeClass
    public void setupClass() {
        // Initialize the database connection once for all tests
        db = new DatabaseConnection("jdbc:mysql://localhost:3306/testdb");
        db.connect();
    }

    @BeforeMethod
    public void setupMethod() {
        // Create a fresh service instance before each test
        userService = new UserService(db);
    }

    @Test(priority = 1, groups = {"smoke"})
    public void testCreateUser() {
        User user = userService.createUser("john@example.com", "John Doe");
        Assert.assertNotNull(user.getId());
        Assert.assertEquals(user.getEmail(), "john@example.com");
    }

    @Test(priority = 2, dependsOnMethods = {"testCreateUser"})
    public void testFindUser() {
        User user = userService.findByEmail("john@example.com");
        Assert.assertNotNull(user);
        Assert.assertEquals(user.getName(), "John Doe");
    }

    @AfterMethod
    public void cleanupMethod() {
        // Clean up test data after each test
        userService.deleteAllUsers();
    }

    @AfterClass
    public void cleanupClass() {
        // Close the database connection after all tests
        db.disconnect();
    }
}
```
### Use Case 2: Data-Driven Testing with DataProvider
```java
import org.testng.annotations.*;
import org.testng.Assert;

public class LoginTest {
    private LoginPage loginPage;

    @BeforeMethod
    public void setup() {
        loginPage = new LoginPage();
        loginPage.open();
    }

    @DataProvider(name = "loginCredentials")
    public Object[][] getLoginData() {
        return new Object[][] {
            {"valid@user.com", "ValidPass123", true, "Dashboard"},
            {"invalid@user.com", "WrongPass", false, "Invalid credentials"},
            {"", "password", false, "Email is required"},
            {"user@test.com", "", false, "Password is required"},
            {"admin@test.com", "AdminPass!", true, "Admin Panel"}
        };
    }

    @Test(dataProvider = "loginCredentials")
    public void testLogin(String email, String password,
                          boolean shouldSucceed, String expectedMessage) {
        loginPage.enterEmail(email);
        loginPage.enterPassword(password);
        loginPage.clickLogin();
        if (shouldSucceed) {
            Assert.assertTrue(loginPage.isLoggedIn());
            Assert.assertEquals(loginPage.getPageTitle(), expectedMessage);
        } else {
            Assert.assertTrue(loginPage.hasError());
            Assert.assertTrue(loginPage.getErrorMessage().contains(expectedMessage));
        }
    }

    @AfterMethod
    public void teardown() {
        loginPage.close();
    }
}
```
### Use Case 3: Parallel Test Execution with Groups
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Parallel Test Suite" parallel="tests" thread-count="3">
  <test name="Chrome Tests" parallel="methods" thread-count="2">
    <parameter name="browser" value="chrome"/>
    <groups>
      <run>
        <include name="smoke"/>
      </run>
    </groups>
    <classes>
      <class name="com.example.tests.HomePageTest"/>
      <class name="com.example.tests.SearchTest"/>
    </classes>
  </test>

  <test name="Firefox Tests" parallel="methods" thread-count="2">
    <parameter name="browser" value="firefox"/>
    <groups>
      <run>
        <include name="smoke"/>
      </run>
    </groups>
    <classes>
      <class name="com.example.tests.HomePageTest"/>
      <class name="com.example.tests.SearchTest"/>
    </classes>
  </test>

  <test name="API Tests" parallel="classes" thread-count="3">
    <groups>
      <run>
        <include name="api"/>
      </run>
    </groups>
    <packages>
      <package name="com.example.api.*"/>
    </packages>
  </test>
</suite>
```
```java
// Test class using parameters
import java.time.Duration;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.Assert;
import org.testng.annotations.*;

public class CrossBrowserTest {
    private WebDriver driver;

    @Parameters({"browser"})
    @BeforeMethod
    public void setup(String browser) {
        if (browser.equalsIgnoreCase("chrome")) {
            driver = new ChromeDriver();
        } else if (browser.equalsIgnoreCase("firefox")) {
            driver = new FirefoxDriver();
        } else {
            // Fail fast instead of hitting an NPE on the null driver below
            throw new IllegalArgumentException("Unsupported browser: " + browser);
        }
        driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(10));
    }

    @Test(groups = {"smoke"})
    public void testHomePage() {
        driver.get("https://example.com");
        Assert.assertEquals(driver.getTitle(), "Example Domain");
    }

    @AfterMethod
    public void teardown() {
        if (driver != null) {
            driver.quit();
        }
    }
}
```
### Use Case 4: API Testing with Retry Logic
```java
import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;

// Retry analyzer for flaky tests
public class RetryAnalyzer implements IRetryAnalyzer {
    private int retryCount = 0;
    private static final int MAX_RETRY = 3;

    @Override
    public boolean retry(ITestResult result) {
        if (retryCount < MAX_RETRY) {
            retryCount++;
            return true;
        }
        return false;
    }
}
```
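TestNG calls `retry(...)` once after each failure and re-runs the test while it returns `true`, so with `MAX_RETRY = 3` a permanently failing test executes four times in total (one initial run plus three retries). A plain-Java simulation of that loop (the `CountingRetry` class here is illustrative, not a TestNG type):

```java
public class RetryLoopDemo {
    // Mirrors the retry analyzer's counting logic without the TestNG dependency.
    static class CountingRetry {
        private int retryCount = 0;
        private final int maxRetry;
        CountingRetry(int maxRetry) { this.maxRetry = maxRetry; }
        boolean retry() {
            if (retryCount < maxRetry) {
                retryCount++;
                return true;
            }
            return false;
        }
    }

    // Total executions of a test that always fails, given a retry budget.
    static int executionsUntilGiveUp(int maxRetry) {
        CountingRetry analyzer = new CountingRetry(maxRetry);
        int executions = 0;
        do {
            executions++;           // run the (failing) test
        } while (analyzer.retry()); // TestNG asks: should we retry?
        return executions;
    }

    public static void main(String[] args) {
        System.out.println(executionsUntilGiveUp(3)); // 4: one initial run + 3 retries
    }
}
```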
```java
// API test class
import org.testng.Assert;
import org.testng.annotations.*;

public class APITest {
    private RestClient client;

    @BeforeClass
    public void setup() {
        client = new RestClient("https://api.example.com");
    }

    @Test(groups = {"api"}, retryAnalyzer = RetryAnalyzer.class)
    public void testGetUser() {
        Response response = client.get("/users/1");
        Assert.assertEquals(response.getStatusCode(), 200);
        Assert.assertNotNull(response.jsonPath().getString("name"));
    }

    @Test(groups = {"api"}, dependsOnMethods = {"testGetUser"})
    public void testCreateUser() {
        String payload = "{\"name\":\"John\",\"email\":\"john@test.com\"}";
        Response response = client.post("/users", payload);
        Assert.assertEquals(response.getStatusCode(), 201);
    }

    @Test(groups = {"api"}, timeOut = 5000)
    public void testPerformance() {
        long startTime = System.currentTimeMillis();
        Response response = client.get("/users");
        long endTime = System.currentTimeMillis();
        Assert.assertEquals(response.getStatusCode(), 200);
        Assert.assertTrue((endTime - startTime) < 3000,
                "API response time exceeded 3 seconds");
    }
}
```
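One caveat about the timing pattern in `testPerformance`: `System.currentTimeMillis()` follows the wall clock and can jump if the clock is adjusted, so `System.nanoTime()` is the safer monotonic choice for measuring elapsed time. A minimal sketch of the pattern (the timed work here is just a placeholder `sleep`):

```java
public class ElapsedTimeDemo {
    // Measures how long a piece of work takes, using the monotonic clock.
    static long elapsedMillis(Runnable work) {
        long start = System.nanoTime();
        work.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        long ms = elapsedMillis(() -> {
            try {
                Thread.sleep(50); // placeholder for the call under test
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        System.out.println("elapsed ms: " + ms);
    }
}
```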
### Use Case 5: Custom Listeners and Reporting
```java
import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestResult;

public class CustomTestListener implements ITestListener {
    @Override
    public void onTestStart(ITestResult result) {
        System.out.println("Starting test: " + result.getName());
    }

    @Override
    public void onTestSuccess(ITestResult result) {
        System.out.println("Test passed: " + result.getName());
    }

    @Override
    public void onTestFailure(ITestResult result) {
        System.out.println("Test failed: " + result.getName());
        // Take a screenshot, log the error, etc.
        captureScreenshot(result.getName());
    }

    @Override
    public void onTestSkipped(ITestResult result) {
        System.out.println("Test skipped: " + result.getName());
    }

    @Override
    public void onFinish(ITestContext context) {
        System.out.println("Total test methods: " + context.getAllTestMethods().length);
        System.out.println("Passed: " + context.getPassedTests().size());
        System.out.println("Failed: " + context.getFailedTests().size());
        System.out.println("Skipped: " + context.getSkippedTests().size());
    }

    private void captureScreenshot(String testName) {
        // Screenshot logic here
    }
}

// Using the listener
import org.testng.Assert;
import org.testng.annotations.*;

@Listeners(CustomTestListener.class)
public class MyTest {
    @Test
    public void testExample() {
        Assert.assertTrue(true);
    }
}
```
## Best Practices

- Use meaningful test names: name tests to describe what they verify (e.g., `testUserCanLoginWithValidCredentials` instead of `test1`)
- Leverage groups effectively: organize tests into logical groups (`smoke`, `regression`, `api`, `ui`) so you can run context-appropriate subsets and save execution time
- Implement proper setup and teardown: use `@BeforeMethod`/`@AfterMethod` for per-test setup and `@BeforeClass`/`@AfterClass` for expensive operations such as database connections
- Make tests independent: each test should be self-contained and not rely on execution order or shared state; use `dependsOnMethods` sparingly and only when a logical dependency exists
- Use DataProviders for test data: Separate test data from