TestNG Comprehensive Cheatsheet

Installation

Maven (add to pom.xml):
<dependency>
  <groupId>org.testng</groupId>
  <artifactId>testng</artifactId>
  <version>7.8.0</version>
  <scope>test</scope>
</dependency>

Gradle (add to build.gradle):
testImplementation 'org.testng:testng:7.8.0'

Eclipse IDE:
Help → Eclipse Marketplace → Search "TestNG" → Install

IntelliJ IDEA:
Bundled by default (or File → Settings → Plugins → Marketplace → "TestNG")

VS Code:
code --install-extension vscjava.vscode-java-pack
code --install-extension testng.testng

Manual (Linux/macOS):
wget https://repo1.maven.org/maven2/org/testng/testng/7.8.0/testng-7.8.0.jar
export CLASSPATH=$CLASSPATH:/path/to/testng-7.8.0.jar

Manual (Windows):
Download the JAR from Maven Central, then:
set CLASSPATH=%CLASSPATH%;C:\path\to\testng-7.8.0.jar

Core Annotations

@Test: Marks a method as a test method
@Test(priority = 1): Sets execution order (lower numbers run first)
@Test(description = "..."): Adds descriptive text to the test for reporting
@Test(timeOut = 5000): Fails the test if execution exceeds the timeout (milliseconds)
@Test(expectedExceptions = Exception.class): Expects the specified exception to be thrown
@Test(enabled = false): Disables/skips the test
@Test(groups = {"smoke", "regression"}): Assigns the test to one or more groups
@Test(dependsOnMethods = {"testMethod"}): Runs after the specified method(s) complete
@Test(dependsOnGroups = {"smoke"}): Runs after all tests in the specified group(s)
@Test(alwaysRun = true): Runs the test even if its dependencies fail
@Test(invocationCount = 3): Runs the test multiple times
@Test(threadPoolSize = 5): Runs the invocations in parallel threads (only meaningful with invocationCount)
@BeforeMethod: Executes before each @Test method
@AfterMethod: Executes after each @Test method
@BeforeClass: Executes once before any test method in the class
@AfterClass: Executes once after all test methods in the class
@BeforeTest: Executes before any test method inside a <test> tag
@AfterTest: Executes after all test methods inside a <test> tag
@BeforeSuite: Executes once before all tests in the suite
@AfterSuite: Executes once after all tests in the suite
@BeforeGroups: Executes before the first test method of the specified group(s)
@AfterGroups: Executes after the last test method of the specified group(s)
@DataProvider: Supplies data to test methods for parameterization
@Parameters: Injects parameters from testng.xml into test methods
@Factory: Creates test instances dynamically
@Listeners: Attaches custom listeners to a test class

Command Line Execution

# Run tests using an XML suite file
java -cp "classes:lib/*" org.testng.TestNG testng.xml

# Run a specific test class
java -cp "classes:lib/*" org.testng.TestNG -testclass com.example.MyTest

# Run multiple test classes (comma-separated)
java -cp "classes:lib/*" org.testng.TestNG -testclass Test1,Test2

# Run tests from specific group(s)
java -cp "classes:lib/*" org.testng.TestNG -groups smoke testng.xml

# Exclude specific group(s) from execution
java -cp "classes:lib/*" org.testng.TestNG -excludegroups slow testng.xml

# Specify the output directory for reports
java -cp "classes:lib/*" org.testng.TestNG -d test-output testng.xml

# Run tests in parallel with a given thread count
java -cp "classes:lib/*" org.testng.TestNG -parallel methods -threadcount 5 testng.xml

# Set verbosity level (0-10, higher = more detail)
java -cp "classes:lib/*" org.testng.TestNG -verbose 10 testng.xml

# Run specific test methods
java -cp "classes:lib/*" org.testng.TestNG -methods MyTest.test1,MyTest.test2

# Override suite and test names
java -cp "classes:lib/*" org.testng.TestNG -suitename "MySuite" -testname "MyTest" testng.xml

# Use a specific reporter
java -cp "classes:lib/*" org.testng.TestNG -reporter org.testng.reporters.EmailableReporter testng.xml

# Add a custom listener
java -cp "classes:lib/*" org.testng.TestNG -listener com.example.MyListener testng.xml

Maven Commands

# Run all tests
mvn test

# Run a specific test class
mvn test -Dtest=MyTestClass

# Run a specific test method
mvn test -Dtest=MyTestClass#testMethod

# Run test methods matching a pattern
mvn test -Dtest=MyTestClass#test*

# Run a specific suite XML file
mvn test -Dsurefire.suiteXmlFiles=smoke-tests.xml

# Run specific test groups
mvn test -Dgroups=smoke,regression

# Exclude specific test groups
mvn test -DexcludedGroups=slow

# Pass system properties to tests
mvn test -Denvironment=staging -Dbrowser=chrome

# Skip test execution
mvn test -DskipTests

# Continue the build even if tests fail
mvn test -Dmaven.test.failure.ignore=true

# Run tests in parallel
mvn test -Dparallel=methods -DthreadCount=4

# Clean previous builds and run tests
mvn clean test

# Run tests with full Maven debug output
mvn test -X

# Generate an HTML test report
mvn surefire-report:report

Gradle Commands

# Run all tests
gradle test

# Run a specific test class
gradle test --tests MyTestClass

# Run a specific test method
gradle test --tests MyTestClass.testMethod

# Run tests matching a pattern (quoted to avoid shell globbing)
gradle test --tests '*IntegrationTest'

# Run multiple test classes
gradle test --tests MyTestClass --tests OtherTest

# Pass system properties
gradle test -Denvironment=staging

# Clean and run tests
gradle clean test

# Run with detailed logging
gradle test --info

# Run with debug-level logging
gradle test --debug

# Force a re-run even if tasks are up-to-date
gradle test --rerun-tasks

# Continue execution after test failures
gradle test --continue

# Stop execution on the first test failure
gradle test --fail-fast

Assertion Methods

Assert.assertEquals(actual, expected): Verify two values are equal
Assert.assertEquals(actual, expected, "message"): Assert with a custom failure message
Assert.assertNotEquals(actual, expected): Verify two values are not equal
Assert.assertTrue(condition): Verify the condition is true
Assert.assertFalse(condition): Verify the condition is false
Assert.assertNull(object): Verify the object is null
Assert.assertNotNull(object): Verify the object is not null
Assert.assertSame(actual, expected): Verify both refer to the same object
Assert.assertNotSame(actual, expected): Verify they refer to different objects
Assert.fail("message"): Explicitly fail a test
Assert.assertThrows(Exception.class, () -> {...}): Verify the lambda throws the given exception
Assert.expectThrows(Exception.class, () -> {...}): Like assertThrows, but returns the thrown exception for further inspection

Data Providers

@DataProvider(name = "testData"): Define a data provider method
@Test(dataProvider = "testData"): Use the data provider in a test
@DataProvider(parallel = true): Run data provider iterations in parallel
Object[][] dataProvider(): Return a 2D array of test data
Iterator<Object[]> dataProvider(): Return an iterator for large datasets
@DataProvider(indices = {0, 2, 4}): Run only specific data set indices
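When a data provider is marked parallel = true, its iterations run on a shared thread pool whose size can be tuned at the suite level with the data-provider-thread-count attribute (the default pool size is 10). A minimal testng.xml sketch:

```xml
<!-- Run parallel data provider iterations on 20 threads -->
<suite name="Data Provider Suite" data-provider-thread-count="20">
    <!-- tests as usual -->
</suite>
```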

Data Provider Example

@DataProvider(name = "loginData")
public Object[][] getLoginData() {
    return new Object[][] {
        {"user1", "pass1", true},
        {"user2", "pass2", false},
        {"user3", "pass3", true}
    };
}

@Test(dataProvider = "loginData")
public void testLogin(String username, String password, boolean expected) {
    boolean result = login(username, password);
    Assert.assertEquals(result, expected);
}

TestNG XML Configuration

Basic Suite Configuration

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Test Suite" parallel="methods" thread-count="5" verbose="1">
    
    <!-- Suite-level parameters -->
    <parameter name="browser" value="chrome"/>
    <parameter name="environment" value="staging"/>
    
    <!-- Define test groups -->
    <test name="Smoke Tests">
        <groups>
            <run>
                <include name="smoke"/>
                <exclude name="slow"/>
            </run>
        </groups>
        
        <!-- Specify test classes -->
        <classes>
            <class name="com.example.LoginTest"/>
            <class name="com.example.SearchTest">
                <!-- Include specific methods -->
                <methods>
                    <include name="testBasicSearch"/>
                    <include name="testAdvancedSearch"/>
                </methods>
            </class>
        </classes>
    </test>
    
    <!-- Another test configuration -->
    <test name="Regression Tests">
        <packages>
            <package name="com.example.regression.*"/>
        </packages>
    </test>
    
    <!-- Listeners -->
    <listeners>
        <listener class-name="com.example.CustomListener"/>
    </listeners>
</suite>

Parallel Execution Configuration

<!-- Parallel at suite level -->
<suite name="Parallel Suite" parallel="tests" thread-count="3">
    <test name="Test1">...</test>
    <test name="Test2">...</test>
</suite>

<!-- Parallel options: methods, tests, classes, instances -->

Maven Surefire Plugin Configuration

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>3.0.0</version>
            <configuration>
                <!-- Specify suite files -->
                <suiteXmlFiles>
                    <suiteXmlFile>testng.xml</suiteXmlFile>
                    <suiteXmlFile>smoke-tests.xml</suiteXmlFile>
                </suiteXmlFiles>
                
                <!-- Run specific groups -->
                <groups>smoke,regression</groups>
                <excludedGroups>slow,manual</excludedGroups>
                
                <!-- Parallel execution -->
                <parallel>methods</parallel>
                <threadCount>5</threadCount>
                
                <!-- System properties -->
                <systemPropertyVariables>
                    <browser>chrome</browser>
                    <environment>staging</environment>
                </systemPropertyVariables>
                
                <!-- Continue on failures -->
                <testFailureIgnore>false</testFailureIgnore>
            </configuration>
        </plugin>
    </plugins>
</build>

Gradle Test Configuration

test {
    useTestNG() {
        // Suite files
        suites 'src/test/resources/testng.xml'
        
        // Include/exclude groups
        includeGroups 'smoke', 'regression'
        excludeGroups 'slow'
        
        // Parallel execution
        parallel = 'methods'
        threadCount = 5
        
        // Preserve order
        preserveOrder = true
        
        // Group by instances
        groupByInstances = true
    }
    
    // System properties
    systemProperty 'browser', 'chrome'
    systemProperty 'environment', 'staging'
    
    // Test output
    testLogging {
        events "passed", "skipped", "failed"
        exceptionFormat "full"
    }
}
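TestNG listeners can also be registered from the same useTestNG block, by fully qualified class name; a short sketch (com.example.CustomListener is a hypothetical listener class):

```groovy
test {
    useTestNG() {
        // Register a TestNG listener by fully qualified class name
        listeners << 'com.example.CustomListener'
    }
}
```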

Common Use Cases

Use Case 1: Basic Test Class with Setup and Teardown

import org.testng.annotations.*;
import org.testng.Assert;

public class UserManagementTest {
    
    private DatabaseConnection db;
    private UserService userService;
    
    @BeforeClass
    public void setupClass() {
        // Initialize database connection once for all tests
        db = new DatabaseConnection("jdbc:mysql://localhost:3306/testdb");
        db.connect();
    }
    
    @BeforeMethod
    public void setupMethod() {
        // Create fresh service instance before each test
        userService = new UserService(db);
    }
    
    @Test(priority = 1, groups = {"smoke"})
    public void testCreateUser() {
        User user = userService.createUser("john@example.com", "John Doe");
        Assert.assertNotNull(user.getId());
        Assert.assertEquals(user.getEmail(), "john@example.com");
    }
    
    @Test(priority = 2, dependsOnMethods = {"testCreateUser"})
    public void testFindUser() {
        User user = userService.findByEmail("john@example.com");
        Assert.assertNotNull(user);
        Assert.assertEquals(user.getName(), "John Doe");
    }
    
    @AfterMethod
    public void cleanupMethod() {
        // Clean up test data after each test
        userService.deleteAllUsers();
    }
    
    @AfterClass
    public void cleanupClass() {
        // Close database connection after all tests
        db.disconnect();
    }
}

Use Case 2: Data-Driven Testing with DataProvider

import org.testng.annotations.*;
import org.testng.Assert;

public class LoginTest {
    
    private LoginPage loginPage;
    
    @BeforeMethod
    public void setup() {
        loginPage = new LoginPage();
        loginPage.open();
    }
    
    @DataProvider(name = "loginCredentials")
    public Object[][] getLoginData() {
        return new Object[][] {
            {"valid@user.com", "ValidPass123", true, "Dashboard"},
            {"invalid@user.com", "WrongPass", false, "Invalid credentials"},
            {"", "password", false, "Email is required"},
            {"user@test.com", "", false, "Password is required"},
            {"admin@test.com", "AdminPass!", true, "Admin Panel"}
        };
    }
    
    @Test(dataProvider = "loginCredentials")
    public void testLogin(String email, String password, 
                         boolean shouldSucceed, String expectedMessage) {
        loginPage.enterEmail(email);
        loginPage.enterPassword(password);
        loginPage.clickLogin();
        
        if (shouldSucceed) {
            Assert.assertTrue(loginPage.isLoggedIn());
            Assert.assertEquals(loginPage.getPageTitle(), expectedMessage);
        } else {
            Assert.assertTrue(loginPage.hasError());
            Assert.assertTrue(loginPage.getErrorMessage().contains(expectedMessage));
        }
    }
    
    @AfterMethod
    public void teardown() {
        loginPage.close();
    }
}

Use Case 3: Parallel Test Execution with Groups

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Parallel Test Suite" parallel="tests" thread-count="3">
    
    <test name="Chrome Tests" parallel="methods" thread-count="2">
        <parameter name="browser" value="chrome"/>
        <groups>
            <run>
                <include name="smoke"/>
            </run>
        </groups>
        <classes>
            <class name="com.example.tests.HomePageTest"/>
            <class name="com.example.tests.SearchTest"/>
        </classes>
    </test>
    
    <test name="Firefox Tests" parallel="methods" thread-count="2">
        <parameter name="browser" value="firefox"/>
        <groups>
            <run>
                <include name="smoke"/>
            </run>
        </groups>
        <classes>
            <class name="com.example.tests.HomePageTest"/>
            <class name="com.example.tests.SearchTest"/>
        </classes>
    </test>
    
    <test name="API Tests" parallel="classes" thread-count="3">
        <groups>
            <run>
                <include name="api"/>
            </run>
        </groups>
        <packages>
            <package name="com.example.api.*"/>
        </packages>
    </test>
</suite>

// Test class using parameters (Selenium WebDriver types; imports shown for clarity)
import java.time.Duration;

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.Assert;
import org.testng.annotations.*;

public class CrossBrowserTest {
    
    private WebDriver driver;
    
    @Parameters({"browser"})
    @BeforeMethod
    public void setup(String browser) {
        if (browser.equalsIgnoreCase("chrome")) {
            driver = new ChromeDriver();
        } else if (browser.equalsIgnoreCase("firefox")) {
            driver = new FirefoxDriver();
        }
        driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(10));
    }
    
    @Test(groups = {"smoke"})
    public void testHomePage() {
        driver.get("https://example.com");
        Assert.assertEquals(driver.getTitle(), "Example Domain");
    }
    
    @AfterMethod
    public void teardown() {
        if (driver != null) {
            driver.quit();
        }
    }
}

Use Case 4: API Testing with Retry Logic

import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;

// Retry analyzer for flaky tests
public class RetryAnalyzer implements IRetryAnalyzer {
    private int retryCount = 0;
    private static final int MAX_RETRY = 3;
    
    @Override
    public boolean retry(ITestResult result) {
        if (retryCount < MAX_RETRY) {
            retryCount++;
            return true;
        }
        return false;
    }
}

// API test class (RestClient and Response are placeholder HTTP-client types)
import org.testng.Assert;
import org.testng.annotations.*;

public class APITest {
    
    private RestClient client;
    
    @BeforeClass
    public void setup() {
        client = new RestClient("https://api.example.com");
    }
    
    @Test(groups = {"api"}, retryAnalyzer = RetryAnalyzer.class)
    public void testGetUser() {
        Response response = client.get("/users/1");
        Assert.assertEquals(response.getStatusCode(), 200);
        Assert.assertNotNull(response.jsonPath().getString("name"));
    }
    
    @Test(groups = {"api"}, dependsOnMethods = {"testGetUser"})
    public void testCreateUser() {
        String payload = "{\"name\":\"John\",\"email\":\"john@test.com\"}";
        Response response = client.post("/users", payload);
        Assert.assertEquals(response.getStatusCode(), 201);
    }
    
    @Test(groups = {"api"}, timeOut = 5000)
    public void testPerformance() {
        long startTime = System.currentTimeMillis();
        Response response = client.get("/users");
        long endTime = System.currentTimeMillis();
        
        Assert.assertEquals(response.getStatusCode(), 200);
        Assert.assertTrue((endTime - startTime) < 3000, 
            "API response time exceeded 3 seconds");
    }
}

Use Case 5: Custom Listeners and Reporting

import org.testng.ITestListener;
import org.testng.ITestResult;
import org.testng.ITestContext;

public class CustomTestListener implements ITestListener {
    
    @Override
    public void onTestStart(ITestResult result) {
        System.out.println("Starting test: " + result.getName());
    }
    
    @Override
    public void onTestSuccess(ITestResult result) {
        System.out.println("Test passed: " + result.getName());
    }
    
    @Override
    public void onTestFailure(ITestResult result) {
        System.out.println("Test failed: " + result.getName());
        // Take screenshot, log error, etc.
        captureScreenshot(result.getName());
    }
    
    @Override
    public void onTestSkipped(ITestResult result) {
        System.out.println("Test skipped: " + result.getName());
    }
    
    @Override
    public void onFinish(ITestContext context) {
        System.out.println("Total test methods: " + context.getAllTestMethods().length);
        System.out.println("Passed: " + context.getPassedTests().size());
        System.out.println("Failed: " + context.getFailedTests().size());
        System.out.println("Skipped: " + context.getSkippedTests().size());
    }
    
    private void captureScreenshot(String testName) {
        // Screenshot logic here
    }
}

// Using the listener
import org.testng.Assert;
import org.testng.annotations.*;

@Listeners(CustomTestListener.class)
public class MyTest {
    
    @Test
    public void testExample() {
        Assert.assertTrue(true);
    }
}

Best Practices

  • Use meaningful test names: Name tests clearly to describe what they verify (e.g., testUserCanLoginWithValidCredentials instead of test1)

  • Leverage groups effectively: Organize tests into logical groups (smoke, regression, api, ui) to run subsets of tests based on context and save execution time

  • Implement proper setup and teardown: Use @BeforeMethod/@AfterMethod for test-level setup and @BeforeClass/@AfterClass for expensive operations like database connections

  • Make tests independent: Each test should be self-contained and not rely on execution order or shared state. Use dependsOnMethods sparingly and only when a logical dependency exists

  • Use DataProviders for test data: Separate test data from