
Member "junit5-r5.5.2/documentation/src/docs/asciidoc/user-guide/advanced-topics.adoc" (8 Sep 2019, 129 Bytes) of package /linux/privat/junit5-r5.5.2.tar.gz:


Advanced Topics

JUnit Platform Launcher API

One of the prominent goals of JUnit 5 is to make the interface between JUnit and its programmatic clients – build tools and IDEs – more powerful and stable. The purpose is to decouple the internals of discovering and executing tests from all the filtering and configuration that’s necessary from the outside.

JUnit 5 introduces the concept of a Launcher that can be used to discover, filter, and execute tests. Moreover, third party test libraries – like Spock, Cucumber, and FitNesse – can plug into the JUnit Platform’s launching infrastructure by providing a custom {TestEngine}.

The launcher API is in the {junit-platform-launcher} module.

An example consumer of the launcher API is the {ConsoleLauncher} in the {junit-platform-console} project.

Discovering Tests

Introducing test discovery as a dedicated feature of the platform itself will (hopefully) free IDEs and build tools from most of the difficulties they had to go through to identify test classes and test methods in the past.

Usage Example:

(Unresolved include: {testDir}/example/UsingTheLauncherDemo.java, tags "imports" and "discovery"; the example source is not shown in this rendering.)
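Since the referenced demo is not shown here, the following is a hypothetical sketch of the discovery phase built from the public Launcher API; the package name and test class selected are illustrative, not part of the original example.

```java
import static org.junit.platform.engine.discovery.ClassNameFilter.includeClassNamePatterns;
import static org.junit.platform.engine.discovery.DiscoverySelectors.selectPackage;

import org.junit.platform.launcher.Launcher;
import org.junit.platform.launcher.LauncherDiscoveryRequest;
import org.junit.platform.launcher.TestPlan;
import org.junit.platform.launcher.core.LauncherDiscoveryRequestBuilder;
import org.junit.platform.launcher.core.LauncherFactory;

class UsingTheLauncherSketch {

    void discover() {
        // Selectors state what to look for; filters state what to keep.
        LauncherDiscoveryRequest request = LauncherDiscoveryRequestBuilder.request()
            .selectors(selectPackage("com.example.mytests")) // hypothetical package
            .filters(includeClassNamePatterns(".*Tests"))
            .build();

        Launcher launcher = LauncherFactory.create();

        // Discovery runs across all registered test engines.
        TestPlan testPlan = launcher.discover(request);
    }
}
```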

It is currently possible to select classes, methods, and all classes in a package, or even to search for all tests on the classpath. Discovery takes place across all participating test engines.

The resulting TestPlan is a hierarchical (and read-only) description of all engines, classes, and test methods that fit the LauncherDiscoveryRequest. The client can traverse the tree, retrieve details about a node, and get a link to the original source (like class, method, or file position). Every node in the test plan has a unique ID that can be used to invoke a particular test or group of tests.
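The traversal described above might be sketched as follows; the printer class and its method names are hypothetical, but the TestPlan accessors are the real API.

```java
import org.junit.platform.launcher.TestIdentifier;
import org.junit.platform.launcher.TestPlan;

class TestPlanPrinter {

    // Recursively print the read-only TestPlan tree with each node's unique ID.
    void print(TestPlan testPlan) {
        testPlan.getRoots().forEach(root -> printNode(testPlan, root, ""));
    }

    private void printNode(TestPlan testPlan, TestIdentifier node, String indent) {
        System.out.println(indent + node.getDisplayName() + " [" + node.getUniqueId() + "]");
        // Link back to the original source (class, method, or file position), if present.
        node.getSource().ifPresent(source -> System.out.println(indent + "  source: " + source));
        testPlan.getChildren(node).forEach(child -> printNode(testPlan, child, indent + "  "));
    }
}
```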

Executing Tests

To execute tests, clients can use the same LauncherDiscoveryRequest as in the discovery phase or create a new request. Test progress and reporting can be achieved by registering one or more {TestExecutionListener} implementations with the Launcher as in the following example.

(Unresolved include: {testDir}/example/UsingTheLauncherDemo.java, tag "execution"; the example source is not shown in this rendering.)
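As a sketch of the execution phase, assuming a LauncherDiscoveryRequest built as in the discovery example, the listener registration might look like this (the SummaryGeneratingListener shown is part of the junit-platform-launcher module):

```java
import static org.junit.platform.engine.discovery.DiscoverySelectors.selectPackage;

import java.io.PrintWriter;

import org.junit.platform.launcher.Launcher;
import org.junit.platform.launcher.LauncherDiscoveryRequest;
import org.junit.platform.launcher.core.LauncherDiscoveryRequestBuilder;
import org.junit.platform.launcher.core.LauncherFactory;
import org.junit.platform.launcher.listeners.SummaryGeneratingListener;
import org.junit.platform.launcher.listeners.TestExecutionSummary;

class ExecutionSketch {

    void execute() {
        LauncherDiscoveryRequest request = LauncherDiscoveryRequestBuilder.request()
            .selectors(selectPackage("com.example.mytests")) // hypothetical package
            .build();

        Launcher launcher = LauncherFactory.create();

        // Register a listener that aggregates the final results.
        SummaryGeneratingListener listener = new SummaryGeneratingListener();
        launcher.registerTestExecutionListeners(listener);

        launcher.execute(request); // execute() itself returns no value

        TestExecutionSummary summary = listener.getSummary();
        summary.printTo(new PrintWriter(System.out));
    }
}
```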

There is no return value for the execute() method, but you can easily use a listener to aggregate the final results in an object of your own. For examples, see the {SummaryGeneratingListener} and {LegacyXmlReportGeneratingListener}.

Plugging in your own Test Engine

JUnit currently provides two {TestEngine} implementations.

  • {junit-jupiter-engine}: The core of JUnit Jupiter.

  • {junit-vintage-engine}: A thin layer on top of JUnit 4 to allow running vintage tests with the launcher infrastructure.

Third parties may also contribute their own TestEngine by implementing the interfaces in the {junit-platform-engine} module and registering their engine. By default, engine registration is supported via Java’s java.util.ServiceLoader mechanism. For example, the junit-jupiter-engine module registers its org.junit.jupiter.engine.JupiterTestEngine in a file named org.junit.platform.engine.TestEngine within the /META-INF/services folder of the junit-jupiter-engine JAR.
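A minimal custom engine might be sketched as follows; the class, package, and engine ID are hypothetical, and a real engine would resolve the request's selectors into a descriptor tree rather than returning an empty root.

```java
package org.example;

import org.junit.platform.engine.EngineDiscoveryRequest;
import org.junit.platform.engine.EngineExecutionListener;
import org.junit.platform.engine.ExecutionRequest;
import org.junit.platform.engine.TestDescriptor;
import org.junit.platform.engine.TestEngine;
import org.junit.platform.engine.TestExecutionResult;
import org.junit.platform.engine.UniqueId;
import org.junit.platform.engine.support.descriptor.EngineDescriptor;

// Registered by listing "org.example.MyTestEngine" in a file named
// /META-INF/services/org.junit.platform.engine.TestEngine in the engine's JAR.
public class MyTestEngine implements TestEngine {

    @Override
    public String getId() {
        return "example-engine"; // must NOT use the reserved "junit-" prefix
    }

    @Override
    public TestDescriptor discover(EngineDiscoveryRequest request, UniqueId uniqueId) {
        // A real engine would resolve the request's selectors here.
        return new EngineDescriptor(uniqueId, "Example Engine");
    }

    @Override
    public void execute(ExecutionRequest request) {
        TestDescriptor engine = request.getRootTestDescriptor();
        EngineExecutionListener listener = request.getEngineExecutionListener();
        listener.executionStarted(engine);
        // ... execute the descriptor's children here ...
        listener.executionFinished(engine, TestExecutionResult.successful());
    }
}
```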

Note
{HierarchicalTestEngine} is a convenient abstract base implementation (used by the {junit-jupiter-engine}) that only requires implementors to provide the logic for test discovery. It implements execution of TestDescriptors that implement the Node interface, including support for parallel execution.
Warning
The junit- prefix is reserved for TestEngines from the JUnit Team

The JUnit Platform Launcher enforces that only TestEngine implementations published by the JUnit Team may use the junit- prefix for their TestEngine IDs.

  • If any third-party TestEngine claims to be junit-jupiter or junit-vintage, an exception will be thrown, immediately halting execution of the JUnit Platform.

  • If any third-party TestEngine uses the junit- prefix for its ID, a warning message will be logged. Later releases of the JUnit Platform will throw an exception for such violations.

Plugging in your own Test Execution Listener

In addition to the public {Launcher} API method for registering test execution listeners programmatically, by default custom {TestExecutionListener} implementations will be discovered at runtime via Java’s java.util.ServiceLoader mechanism and automatically registered with the Launcher created via the LauncherFactory. For example, an example.TestInfoPrinter class implementing {TestExecutionListener} and declared within the /META-INF/services/org.junit.platform.launcher.TestExecutionListener file is loaded and registered automatically.
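The example.TestInfoPrinter class mentioned above is not shown in this rendering; a plausible sketch is the following (the method it overrides is illustrative, since TestExecutionListener provides empty default implementations for all of its methods):

```java
package example;

import org.junit.platform.launcher.TestExecutionListener;
import org.junit.platform.launcher.TestIdentifier;

// Picked up automatically when its fully qualified class name is listed in
// /META-INF/services/org.junit.platform.launcher.TestExecutionListener
public class TestInfoPrinter implements TestExecutionListener {

    @Override
    public void executionStarted(TestIdentifier testIdentifier) {
        System.out.println("Started: " + testIdentifier.getDisplayName());
    }
}
```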

JUnit Platform Reporting

The junit-platform-reporting artifact contains {TestExecutionListener} implementations that generate test reports. These listeners are typically used by IDEs and build tools. The package org.junit.platform.reporting.legacy.xml currently contains the following implementation.

  • {LegacyXmlReportGeneratingListener} generates a separate XML report for each root in the {TestPlan}. Note that the generated XML format is compatible with the de facto standard for JUnit 4 based test reports that was made popular by the Ant build system. The LegacyXmlReportGeneratingListener is used by the Console Launcher as well.

Note
The {junit-platform-launcher} module also contains {TestExecutionListener} implementations that can be used for reporting purposes. See {LoggingListener} and {SummaryGeneratingListener} for details.

Configuring the Launcher

If you require fine-grained control over automatic detection and registration of test engines and test execution listeners, you may create an instance of LauncherConfig and supply that to the LauncherFactory.create(LauncherConfig) method. Typically an instance of LauncherConfig is created via the built-in fluent builder API, as demonstrated in the following example.

(Unresolved include: {testDir}/example/UsingTheLauncherDemo.java, tag "launcherConfig"; the example source is not shown in this rendering.)
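A sketch of such a configuration follows; the CustomTestEngine and CustomListener types are hypothetical stand-ins for your own implementations, while the builder methods are the real LauncherConfig API.

```java
import org.junit.platform.launcher.Launcher;
import org.junit.platform.launcher.core.LauncherConfig;
import org.junit.platform.launcher.core.LauncherFactory;

class LauncherConfigSketch {

    Launcher createLauncher() {
        LauncherConfig launcherConfig = LauncherConfig.builder()
            // Disable ServiceLoader-based auto-detection ...
            .enableTestEngineAutoRegistration(false)
            .enableTestExecutionListenerAutoRegistration(false)
            // ... and register engines and listeners explicitly instead.
            .addTestEngines(new CustomTestEngine())          // hypothetical engine
            .addTestExecutionListeners(new CustomListener()) // hypothetical listener
            .build();

        return LauncherFactory.create(launcherConfig);
    }
}
```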

JUnit Platform Test Kit

The junit-platform-testkit artifact provides support for executing a test plan on the JUnit Platform and then verifying the expected results. As of JUnit Platform 1.4, this support is limited to the execution of a single TestEngine (see Engine Test Kit).

Warning
Although the Test Kit is currently an experimental feature, the JUnit Team invites you to try it out and provide feedback to help improve the Test Kit APIs and eventually promote this feature.

Engine Test Kit

The {testkit-engine-package} package provides support for executing a {TestPlan} for a given {TestEngine} running on the JUnit Platform and then accessing the results via a fluent API to verify the expected results. The key entry point into this API is the {EngineTestKit} which provides static factory methods named engine() and execute(). It is recommended that you select one of the engine() variants to benefit from the fluent API for building an EngineDiscoveryRequest.

Note
If you prefer to use the LauncherDiscoveryRequestBuilder from the Launcher API to build your EngineDiscoveryRequest, you must use one of the execute() variants in EngineTestKit.

The following test class written using JUnit Jupiter will be used in subsequent examples.

(Unresolved include: {testDir}/example/ExampleTestCase.java, tag "user_guide"; the example source is not shown in this rendering.)
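Since the class itself is not shown in this rendering, the following reconstruction is inferred from the method names, skip reason, abort message, and execution order that appear in the debug output later in this chapter; the exact assertion bodies are assumptions.

```java
package example;

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assumptions.assumeTrue;

import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.MethodOrderer.OrderAnnotation;
import org.junit.jupiter.api.Order;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestMethodOrder;

@TestMethodOrder(OrderAnnotation.class)
public class ExampleTestCase {

    @Test
    @Disabled("for demonstration purposes")
    @Order(1)
    void skippedTest() {
        // skipped ...
    }

    @Test
    @Order(2)
    void succeedingTest() {
        assertEquals(2, 1 + 1); // assumed body
    }

    @Test
    @Order(3)
    void abortedTest() {
        // Fails the assumption, aborting the test with the message
        // "Assumption failed: abc does not contain Z".
        assumeTrue("abc".contains("Z"), "abc does not contain Z");
    }

    @Test
    @Order(4)
    void failingTest() {
        // Throws an ArithmeticException with the message "/ by zero".
        int quotient = 50 / 0;
    }
}
```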

For the sake of brevity, the following sections demonstrate how to test JUnit’s own JupiterTestEngine whose unique engine ID is "junit-jupiter". If you want to test your own TestEngine implementation, you need to use its unique engine ID. Alternatively, you may test your own TestEngine by supplying an instance of it to the EngineTestKit.engine(TestEngine) static factory method.

Asserting Statistics

One of the most common features of the Test Kit is the ability to assert statistics against events fired during the execution of a TestPlan. The following tests demonstrate how to assert statistics for containers and tests in the JUnit Jupiter TestEngine. For details on what statistics are available, consult the Javadoc for {EventStatistics}.

(Unresolved include: {testDir}/example/testkit/EngineTestKitStatisticsDemo.java, tag "user_guide"; the example source is not shown in this rendering.)
  1. Select the JUnit Jupiter TestEngine.

  2. Select the ExampleTestCase test class.

  3. Execute the TestPlan.

  4. Filter by container events.

  5. Assert statistics for container events.

  6. Filter by test events.

  7. Assert statistics for test events.
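The numbered steps above can be sketched against the EngineTestKit API as follows; the test method names mirror those referenced in this chapter, and the expected counts are taken from the statistics discussed in the surrounding text.

```java
import static org.junit.platform.engine.discovery.DiscoverySelectors.selectClass;

import org.junit.jupiter.api.Test;
import org.junit.platform.testkit.engine.EngineTestKit;

class EngineTestKitStatisticsDemo {

    @Test
    void verifyJupiterContainerStats() {
        EngineTestKit
            .engine("junit-jupiter")                        // 1. select the engine
            .selectors(selectClass(ExampleTestCase.class))  // 2. select the test class
            .execute()                                      // 3. execute the TestPlan
            .containers()                                   // 4. filter by container events
            .assertStatistics(stats -> stats.started(2).succeeded(2)); // 5. assert stats
    }

    @Test
    void verifyJupiterTestStats() {
        EngineTestKit
            .engine("junit-jupiter")
            .selectors(selectClass(ExampleTestCase.class))
            .execute()
            .tests()                                        // 6. filter by test events
            .assertStatistics(stats ->                      // 7. assert stats
                stats.skipped(1).started(3).succeeded(1).aborted(1).failed(1));
    }
}
```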

Note
In the verifyJupiterContainerStats() test method, the counts for the started and succeeded statistics are 2 since the JupiterTestEngine and the ExampleTestCase class are both considered containers.

Asserting Events

If you find that asserting statistics alone is insufficient for verifying the expected behavior of test execution, you can work directly with the recorded {Event} elements and perform assertions against them.

For example, if you want to verify the reason that the skippedTest() method in ExampleTestCase was skipped, you can do that as follows.

Tip

The assertThatEvents() method in the following example is a shortcut for org.assertj.core.api.Assertions.assertThat(events.list()) from the {AssertJ} assertion library.

For details on what conditions are available for use with AssertJ assertions against events, consult the Javadoc for {EventConditions}.

(Unresolved include: {testDir}/example/testkit/EngineTestKitSkippedMethodDemo.java, tag "user_guide"; the example source is not shown in this rendering.)
  1. Select the JUnit Jupiter TestEngine.

  2. Select the skippedTest() method in the ExampleTestCase test class.

  3. Execute the TestPlan.

  4. Filter by test events.

  5. Save the test Events to a local variable.

  6. Optionally assert the expected statistics.

  7. Assert that the recorded test events contain exactly one skipped test named skippedTest with "for demonstration purposes" as the reason.
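The steps above might be sketched as follows, using the EventConditions factory methods from the Test Kit; the skip reason asserted is the one stated in this chapter.

```java
import static org.junit.platform.engine.discovery.DiscoverySelectors.selectMethod;
import static org.junit.platform.testkit.engine.EventConditions.event;
import static org.junit.platform.testkit.engine.EventConditions.skippedWithReason;
import static org.junit.platform.testkit.engine.EventConditions.test;

import org.junit.jupiter.api.Test;
import org.junit.platform.testkit.engine.EngineTestKit;
import org.junit.platform.testkit.engine.Events;

class EngineTestKitSkippedMethodDemo {

    @Test
    void verifyJupiterMethodWasSkipped() {
        String methodName = "skippedTest";

        Events testEvents = EngineTestKit
            .engine("junit-jupiter")                                   // 1.
            .selectors(selectMethod(ExampleTestCase.class, methodName)) // 2.
            .execute()                                                 // 3.
            .tests();                                                  // 4. + 5.

        testEvents.assertStatistics(stats -> stats.skipped(1));        // 6.

        testEvents.assertThatEvents()                                  // 7.
            .have(event(test(methodName),
                skippedWithReason("for demonstration purposes")));
    }
}
```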

If you want to verify the type of exception thrown from the failingTest() method in ExampleTestCase, you can do that as follows.

Tip

For details on what conditions are available for use with AssertJ assertions against events and execution results, consult the Javadoc for {EventConditions} and {TestExecutionResultConditions}, respectively.

(Unresolved include: {testDir}/example/testkit/EngineTestKitFailedMethodDemo.java, tag "user_guide"; the example source is not shown in this rendering.)
  1. Select the JUnit Jupiter TestEngine.

  2. Select the ExampleTestCase test class.

  3. Execute the TestPlan.

  4. Filter by test events.

  5. Assert that the recorded test events contain exactly one failing test named failingTest with an exception of type ArithmeticException and an error message equal to "/ by zero".
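The steps above can be sketched by combining EventConditions with TestExecutionResultConditions; the exception type and message asserted are the ones stated in this chapter.

```java
import static org.junit.platform.engine.discovery.DiscoverySelectors.selectClass;
import static org.junit.platform.testkit.engine.EventConditions.event;
import static org.junit.platform.testkit.engine.EventConditions.finishedWithFailure;
import static org.junit.platform.testkit.engine.EventConditions.test;
import static org.junit.platform.testkit.engine.TestExecutionResultConditions.instanceOf;
import static org.junit.platform.testkit.engine.TestExecutionResultConditions.message;

import org.junit.jupiter.api.Test;
import org.junit.platform.testkit.engine.EngineTestKit;

class EngineTestKitFailedMethodDemo {

    @Test
    void verifyJupiterMethodFailed() {
        EngineTestKit
            .engine("junit-jupiter")                        // 1.
            .selectors(selectClass(ExampleTestCase.class))  // 2.
            .execute()                                      // 3.
            .tests()                                        // 4.
            .assertThatEvents()                             // 5.
            .haveExactly(1, event(test("failingTest"),
                finishedWithFailure(
                    instanceOf(ArithmeticException.class),
                    message("/ by zero"))));
    }
}
```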

Although typically unnecessary, there are times when you need to verify all of the events fired during the execution of a TestPlan. The following test demonstrates how to achieve this via the assertEventsMatchExactly() method in the EngineTestKit API.

Tip

Since assertEventsMatchExactly() matches conditions exactly in the order in which the events were fired, ExampleTestCase has been annotated with @TestMethodOrder(OrderAnnotation.class) and each test method has been annotated with @Order(…). This allows us to enforce the order in which the test methods are executed, which in turn allows our verifyAllJupiterEvents() test to be reliable.

(Unresolved include: {testDir}/example/testkit/EngineTestKitAllEventsDemo.java, tag "user_guide"; the example source is not shown in this rendering.)
  1. Select the JUnit Jupiter TestEngine.

  2. Select the ExampleTestCase test class.

  3. Execute the TestPlan.

  4. Filter by all events.

  5. Print all events to the supplied writer for debugging purposes. Debug information can also be written to an OutputStream such as System.out or System.err.

  6. Assert all events in exactly the order in which they were fired by the test engine.
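The steps above can be sketched as follows; the sequence of conditions mirrors the event stream shown in the debug output below, and writing to a StringWriter is an illustrative choice for capturing that output.

```java
import static org.junit.platform.engine.discovery.DiscoverySelectors.selectClass;
import static org.junit.platform.testkit.engine.EventConditions.abortedWithReason;
import static org.junit.platform.testkit.engine.EventConditions.container;
import static org.junit.platform.testkit.engine.EventConditions.engine;
import static org.junit.platform.testkit.engine.EventConditions.event;
import static org.junit.platform.testkit.engine.EventConditions.finishedSuccessfully;
import static org.junit.platform.testkit.engine.EventConditions.finishedWithFailure;
import static org.junit.platform.testkit.engine.EventConditions.skippedWithReason;
import static org.junit.platform.testkit.engine.EventConditions.started;
import static org.junit.platform.testkit.engine.EventConditions.test;
import static org.junit.platform.testkit.engine.TestExecutionResultConditions.instanceOf;
import static org.junit.platform.testkit.engine.TestExecutionResultConditions.message;

import java.io.StringWriter;

import org.junit.jupiter.api.Test;
import org.junit.platform.testkit.engine.EngineTestKit;
import org.opentest4j.TestAbortedException;

class EngineTestKitAllEventsDemo {

    @Test
    void verifyAllJupiterEvents() {
        StringWriter writer = new StringWriter(); // could also be System.out / System.err

        EngineTestKit
            .engine("junit-jupiter")                        // 1.
            .selectors(selectClass(ExampleTestCase.class))  // 2.
            .execute()                                      // 3.
            .all()                                          // 4.
            .debug(writer)                                  // 5.
            .assertEventsMatchExactly(                      // 6.
                event(engine(), started()),
                event(container(ExampleTestCase.class), started()),
                event(test("skippedTest"), skippedWithReason("for demonstration purposes")),
                event(test("succeedingTest"), started()),
                event(test("succeedingTest"), finishedSuccessfully()),
                event(test("abortedTest"), started()),
                event(test("abortedTest"),
                    abortedWithReason(instanceOf(TestAbortedException.class))),
                event(test("failingTest"), started()),
                event(test("failingTest"), finishedWithFailure(
                    instanceOf(ArithmeticException.class), message("/ by zero"))),
                event(container(ExampleTestCase.class), finishedSuccessfully()),
                event(engine(), finishedSuccessfully()));
    }
}
```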

The debug() invocation from the preceding example results in output similar to the following.

All Events:
	Event [type = STARTED, testDescriptor = JupiterEngineDescriptor: [engine:junit-jupiter], timestamp = 2018-12-14T12:45:14.082280Z, payload = null]
	Event [type = STARTED, testDescriptor = ClassTestDescriptor: [engine:junit-jupiter]/[class:example.ExampleTestCase], timestamp = 2018-12-14T12:45:14.089339Z, payload = null]
	Event [type = SKIPPED, testDescriptor = TestMethodTestDescriptor: [engine:junit-jupiter]/[class:example.ExampleTestCase]/[method:skippedTest()], timestamp = 2018-12-14T12:45:14.094314Z, payload = 'for demonstration purposes']
	Event [type = STARTED, testDescriptor = TestMethodTestDescriptor: [engine:junit-jupiter]/[class:example.ExampleTestCase]/[method:succeedingTest()], timestamp = 2018-12-14T12:45:14.095182Z, payload = null]
	Event [type = FINISHED, testDescriptor = TestMethodTestDescriptor: [engine:junit-jupiter]/[class:example.ExampleTestCase]/[method:succeedingTest()], timestamp = 2018-12-14T12:45:14.104922Z, payload = TestExecutionResult [status = SUCCESSFUL, throwable = null]]
	Event [type = STARTED, testDescriptor = TestMethodTestDescriptor: [engine:junit-jupiter]/[class:example.ExampleTestCase]/[method:abortedTest()], timestamp = 2018-12-14T12:45:14.106121Z, payload = null]
	Event [type = FINISHED, testDescriptor = TestMethodTestDescriptor: [engine:junit-jupiter]/[class:example.ExampleTestCase]/[method:abortedTest()], timestamp = 2018-12-14T12:45:14.109956Z, payload = TestExecutionResult [status = ABORTED, throwable = org.opentest4j.TestAbortedException: Assumption failed: abc does not contain Z]]
	Event [type = STARTED, testDescriptor = TestMethodTestDescriptor: [engine:junit-jupiter]/[class:example.ExampleTestCase]/[method:failingTest()], timestamp = 2018-12-14T12:45:14.110680Z, payload = null]
	Event [type = FINISHED, testDescriptor = TestMethodTestDescriptor: [engine:junit-jupiter]/[class:example.ExampleTestCase]/[method:failingTest()], timestamp = 2018-12-14T12:45:14.111217Z, payload = TestExecutionResult [status = FAILED, throwable = java.lang.ArithmeticException: / by zero]]
	Event [type = FINISHED, testDescriptor = ClassTestDescriptor: [engine:junit-jupiter]/[class:example.ExampleTestCase], timestamp = 2018-12-14T12:45:14.113731Z, payload = TestExecutionResult [status = SUCCESSFUL, throwable = null]]
	Event [type = FINISHED, testDescriptor = JupiterEngineDescriptor: [engine:junit-jupiter], timestamp = 2018-12-14T12:45:14.113806Z, payload = TestExecutionResult [status = SUCCESSFUL, throwable = null]]