advanced.md (googletest-release-1.12.0)

#### Floating-Point Predicate-Format Functions

Some floating-point operations are useful, but not that often used. In order to
avoid an explosion of new macros, we provide them as predicate-format functions
that can be used in the predicate assertion macro
[`EXPECT_PRED_FORMAT2`](reference/assertions.md#EXPECT_PRED_FORMAT), for
example:

```c++
using ::testing::FloatLE;
using ::testing::DoubleLE;
...
EXPECT_PRED_FORMAT2(FloatLE, val1, val2);
EXPECT_PRED_FORMAT2(DoubleLE, val1, val2);
```

The above code verifies that `val1` is less than, or approximately equal to,
`val2`.
### Asserting Using gMock Matchers

See [`EXPECT_THAT`](reference/assertions.md#EXPECT_THAT) in the Assertions
Reference.
### Type Assertions

You can call the function

```c++
::testing::StaticAssertTypeEq<T1, T2>();
```

to assert that types `T1` and `T2` are the same. The function does nothing if
the assertion is satisfied. If the types are different, the function call will
fail to compile, the compiler error message will say that `T1 and T2 are not the
same type` and most likely (depending on the compiler) show you the actual
values of `T1` and `T2`. This is mainly useful inside template code.
**Caveat**: When used inside a member function of a class template or a function
template, `StaticAssertTypeEq<T1, T2>()` is effective only if the function is
instantiated. For example, given:

```c++
template <typename T> class Foo {
 public:
  void Bar() { testing::StaticAssertTypeEq<int, T>(); }
};
```
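A sketch of calling code that shows when the check actually fires (the functions
below are hypothetical, added here for illustration; the assertion is only
evaluated once `Bar()` is instantiated):

```c++
void Test1() { Foo<bool> foo; }             // Compiles: Bar() is never instantiated.
void Test2() { Foo<bool> foo; foo.Bar(); }  // Fails to compile: bool is not int.
```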
```c++
vector<pair<Bar, int> > bar_ints = GetBarIntVector();

EXPECT_TRUE(IsCorrectBarIntVector(bar_ints))
    << "bar_ints = " << testing::PrintToString(bar_ints);
```
## Death Tests

In many applications, there are assertions that can cause application failure if
a condition is not met. These consistency checks, which ensure that the program
is in a known good state, are there to fail at the earliest possible time after
some program state is corrupted. If the assertion checks the wrong condition,
then the program may proceed in an erroneous state, which could lead to memory
corruption, security holes, or worse. Hence it is vitally important to test that
such assertion statements work as expected.

Since these precondition checks cause the processes to die, we call such tests
_death tests_. More generally, any test that checks that a program terminates
(except by throwing an exception) in an expected fashion is also a death test.

Note that if a piece of code throws an exception, we don't consider it "death"
for the purpose of death tests, as the caller of the code could catch the
exception and avoid the crash. If you want to verify exceptions thrown by your
```c++
  // normal test
}

TEST_F(FooDeathTest, DoesThat) {
  // death test
}
```
### Regular Expression Syntax

When built with Bazel and using Abseil, googletest uses the
[RE2](https://github.com/google/re2/wiki/Syntax) syntax. Otherwise, for POSIX
systems (Linux, Cygwin, Mac), googletest uses the
[POSIX extended regular expression](http://www.opengroup.org/onlinepubs/009695399/basedefs/xbd_chap09.html#tag_09_04)
syntax. To learn about POSIX syntax, you may want to read this
[Wikipedia entry](http://en.wikipedia.org/wiki/Regular_expression#POSIX_Extended_Regular_Expressions).

On Windows, googletest uses its own simple regular expression implementation. It
lacks many features. For example, we don't support union (`"x|y"`), grouping
(`"(xy)"`), brackets (`"[xy]"`), and repetition count (`"x{5,7}"`), among
others. Below is what we do support (`A` denotes a literal character, period
(`.`), or a single `\\ ` escape sequence; `x` and `y` denote regular
expressions.):

Expression | Meaning
### Death Test Styles

The "threadsafe" death test style was introduced in order to help mitigate the
risks of testing in a possibly multithreaded environment. It trades increased
test execution time (potentially dramatically so) for improved thread safety.

The automated testing framework does not set the style flag. You can choose a
particular style of death tests by setting the flag programmatically:

```c++
GTEST_FLAG_SET(death_test_style, "threadsafe");
```
You can do this in `main()` to set the style for all death tests in the binary,
or in individual tests. Recall that flags are saved before running each test and
restored afterwards, so you need not do that yourself. For example:

```c++
int main(int argc, char** argv) {
  testing::InitGoogleTest(&argc, argv);
  GTEST_FLAG_SET(death_test_style, "fast");
  return RUN_ALL_TESTS();
}

TEST(MyDeathTest, TestOne) {
  GTEST_FLAG_SET(death_test_style, "threadsafe");
  // This test is run in the "threadsafe" style:
  ASSERT_DEATH(ThisShouldDie(), "");
}

TEST(MyDeathTest, TestTwo) {
  // This test is run in the "fast" style:
  ASSERT_DEATH(ThisShouldDie(), "");
}
```
Despite the improved thread safety afforded by the "threadsafe" style of death
test, thread problems such as deadlock are still possible in the presence of
handlers registered with `pthread_atfork(3)`.

## Using Assertions in Sub-routines

{: .callout .note}
Note: If you want to put a series of test assertions in a subroutine to check
for a complex condition, consider using
[a custom GMock matcher](gmock_cook_book.md#NewMatchers) instead. This lets you
provide a more readable error message in case of failure and avoid all of the
issues described below.
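As a quick sketch of what that alternative can look like (the matcher name, the
condition, and `ComputeValue()` are made up for illustration; `"gmock/gmock.h"`
must be included):

```c++
// A trivial custom matcher; `arg` is the value being matched.
MATCHER(IsEven, "is an even number") { return (arg % 2) == 0; }

TEST(NumberTest, ProducesEvenResult) {
  EXPECT_THAT(ComputeValue(), IsEven());
}
```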
### Adding Traces to Assertions

If a test sub-routine is called from several places, when an assertion inside it
fails, it can be hard to tell which invocation of the sub-routine the failure is
from. You can alleviate this problem using extra logging or custom failure
messages, but that usually clutters up your tests. A better solution is to use
the `SCOPED_TRACE` macro or the `ScopedTrace` utility:

```c++
SCOPED_TRACE(message);
```

```c++
ScopedTrace trace("file_path", line_number, message);
```

where `message` can be anything streamable to `std::ostream`. The `SCOPED_TRACE`
macro will cause the current file name, line number, and the given message to be
added in every failure message. `ScopedTrace` accepts an explicit file name and
line number in its arguments, which is useful for writing test helpers. The
effect will be undone when the control leaves the current lexical scope.
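For example, here is a minimal sketch of how a trace distinguishes two calls to
the same helper (the helper and the widget-lookup functions are hypothetical):

```c++
void CheckWidgetId(int id) {
  EXPECT_GT(id, 0);  // On failure, the active trace points are appended.
}

TEST(WidgetTest, ChecksTwoWidgets) {
  {
    SCOPED_TRACE("first widget");  // Included in every failure in this scope.
    CheckWidgetId(GetFirstWidgetId());
  }
  {
    SCOPED_TRACE("second widget");
    CheckWidgetId(GetSecondWidgetId());
  }
}
```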
```c++
TEST_F(WidgetUsageTest, MinAndMaxWidgets) {
  RecordProperty("MaximumWidgets", ComputeMaxUsage());
  RecordProperty("MinimumWidgets", ComputeMinUsage());
}
```

will output XML like this:

```xml
  ...
    <testcase name="MinAndMaxWidgets" file="test.cpp" line="1" status="run" time="0.006" classname="WidgetUsageTest" MaximumWidgets="12" MinimumWidgets="9" />
  ...
```

{: .callout .note}
> NOTE:
>
> *   `RecordProperty()` is a static member of the `Test` class. Therefore it
>     needs to be prefixed with `::testing::Test::` if used outside of the
>     `TEST` body and the test fixture class.
> *   *`key`* must be a valid XML attribute name, and cannot conflict with the
*first test* in the `FooTest` test suite (i.e. before creating the first
`FooTest` object), and calls `TearDownTestSuite()` after running the *last test*
in it (i.e. after deleting the last `FooTest` object). In between, the tests can
use the shared resources.

Remember that the test order is undefined, so your code can't depend on a test
preceding or following another. Also, the tests must either not modify the state
of any shared resource, or, if they do modify the state, they must restore the
state to its original value before passing control to the next test.

Note that `SetUpTestSuite()` may be called multiple times for a test fixture
class that has derived classes, so you should not expect code in the function
body to be run only once. Also, derived classes still have access to shared
resources defined as static members, so careful consideration is needed when
managing shared resources to avoid memory leaks.
Here's an example of per-test-suite set-up and tear-down:

```c++
class FooTest : public testing::Test {
 protected:
  // Per-test-suite set-up.
  // Called before the first test in this test suite.
  // Can be omitted if not needed.
  static void SetUpTestSuite() {
    // Avoid reallocating static objects if called in subclasses of FooTest.
    if (shared_resource_ == nullptr) {
      shared_resource_ = new ...;
    }
  }

  // Per-test-suite tear-down.
  // Called after the last test in this test suite.
  // Can be omitted if not needed.
  static void TearDownTestSuite() {
    delete shared_resource_;
    shared_resource_ = nullptr;
  }
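  // The code above assumes the shared resource is a static member of the
  // fixture. A sketch of the declaration and its single out-of-class
  // definition follows; `T` stands in for the real resource type (an
  // assumption, not part of the excerpt above).
  static T* shared_resource_;
};

T* FooTest::shared_resource_ = nullptr;
```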
the interface/concept should have. Then, the author of each implementation can
just instantiate the test suite with their type to verify that it conforms to
the requirements, without having to write similar tests repeatedly. Here's an
example:

First, define a fixture class template, as we did with typed tests:

```c++
template <typename T>
class FooTest : public testing::Test {
  void DoSomethingInteresting();
  ...
};
```
Next, declare that you will define a type-parameterized test suite:

```c++
TYPED_TEST_SUITE_P(FooTest);
```

Then, use `TYPED_TEST_P()` to define a type-parameterized test. You can repeat
this as many times as you want:

```c++
TYPED_TEST_P(FooTest, DoesBlah) {
  // Inside a test, refer to TypeParam to get the type parameter.
  TypeParam n = 0;
  // You will need to use `this` explicitly to refer to fixture members.
  this->DoSomethingInteresting();
  ...
}

TYPED_TEST_P(FooTest, HasPropertyA) { ... }
```
Now the tricky part: you need to register all test patterns using the
`REGISTER_TYPED_TEST_SUITE_P` macro before you can instantiate them. The first
argument of the macro is the test suite name; the rest are the names of the
tests in this test suite:
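```c++
// Sketch, using the test suite and the two test names defined above:
REGISTER_TYPED_TEST_SUITE_P(FooTest,
                            DoesBlah, HasPropertyA);
```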
## "Catching" Failures | ## "Catching" Failures | |||
If you are building a testing utility on top of googletest, you'll want to test | If you are building a testing utility on top of googletest, you'll want to test | |||
your utility. What framework would you use to test it? googletest, of course. | your utility. What framework would you use to test it? googletest, of course. | |||
The challenge is to verify that your testing utility reports failures correctly. | The challenge is to verify that your testing utility reports failures correctly. | |||
In frameworks that report a failure by throwing an exception, you could catch | In frameworks that report a failure by throwing an exception, you could catch | |||
the exception and assert on it. But googletest doesn't use exceptions, so how do | the exception and assert on it. But googletest doesn't use exceptions, so how do | |||
we test that a piece of code generates an expected failure? | we test that a piece of code generates an expected failure? | |||
`"gtest/gtest-spi.h"` contains some constructs to do this. After #including this | `"gtest/gtest-spi.h"` contains some constructs to do this. | |||
header, | After #including this header, you can use | |||
you can use | ||||
```c++ | ```c++ | |||
EXPECT_FATAL_FAILURE(statement, substring); | EXPECT_FATAL_FAILURE(statement, substring); | |||
``` | ``` | |||
to assert that `statement` generates a fatal (e.g. `ASSERT_*`) failure in the | to assert that `statement` generates a fatal (e.g. `ASSERT_*`) failure in the | |||
current thread whose message contains the given `substring`, or use | current thread whose message contains the given `substring`, or use | |||
```c++ | ```c++ | |||
EXPECT_NONFATAL_FAILURE(statement, substring); | EXPECT_NONFATAL_FAILURE(statement, substring); | |||
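to check for a non-fatal (e.g. `EXPECT_*`) failure. For example, here is a
minimal sketch of exercising the non-fatal variant; the helper and the expected
message substring are assumptions for illustration, not taken from this excerpt:

```c++
#include "gtest/gtest-spi.h"

// Hypothetical helper under test: it should flag odd values.
void ExpectEven(int n) { EXPECT_EQ(n % 2, 0); }

TEST(ExpectEvenTest, ReportsFailureForOddValues) {
  // Passes only if the statement produces a non-fatal failure whose message
  // contains the given substring (assumed wording of EXPECT_EQ's output).
  EXPECT_NONFATAL_FAILURE(ExpectEven(3), "Expected equality");
}
```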
```c++
    testing::RegisterTest(
        "MyFixture", ("Test" + std::to_string(v)).c_str(), nullptr,
        std::to_string(v).c_str(),
        __FILE__, __LINE__,
        // Important to use the fixture type as the return type here.
        [=]() -> MyFixture* { return new MyTest(v); });
  }
}
...
int main(int argc, char** argv) {
  testing::InitGoogleTest(&argc, argv);
  std::vector<int> values_to_test = LoadValuesFromConfig();
  RegisterMyTests(values_to_test);
  ...
  return RUN_ALL_TESTS();
}
```
## Getting the Current Test's Name

Sometimes a function may need to know the name of the currently running test.
For example, you may be using the `SetUp()` method of your test fixture to set
the golden file name based on which test is running. The
[`TestInfo`](reference/testing.md#TestInfo) class has this information.

To obtain a `TestInfo` object for the currently running test, call
`current_test_info()` on the [`UnitTest`](reference/testing.md#UnitTest)
singleton object:
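```c++
// Sketch of the call; the returned TestInfo is owned by googletest, so do not
// delete it. The printf line is just an illustration.
const testing::TestInfo* const test_info =
    testing::UnitTest::GetInstance()->current_test_info();

printf("We are in test %s of test suite %s.\n",
       test_info->name(), test_info->test_suite_name());
```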
*   `./foo_test --gtest_filter=FooTest.*:BarTest.*-FooTest.Bar:BarTest.Foo` Runs
    everything in test suite `FooTest` except `FooTest.Bar` and everything in
    test suite `BarTest` except `BarTest.Foo`.

#### Stop test execution upon first failure

By default, a googletest program runs all tests the user has defined. In some
cases (e.g. iterative test development & execution) it may be desirable to stop
test execution upon first failure (trading improved latency for completeness).
If the `GTEST_FAIL_FAST` environment variable or the `--gtest_fail_fast` flag is
set, the test runner will stop execution as soon as the first test failure is
found.
#### Temporarily Disabling Tests

If you have a broken test that you cannot fix right away, you can add the
`DISABLED_` prefix to its name. This will exclude it from execution. This is
better than commenting out the code or using `#if 0`, as disabled tests are
still compiled (and thus won't rot).
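For instance, the following sketch (with a hypothetical test) is compiled but
never run:

```c++
// Tests that Foo does Abc; currently broken, so excluded from execution.
TEST(FooTest, DISABLED_DoesAbc) { ... }
```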
If you need to disable all tests in a test suite, you can either add `DISABLED_`
to the front of the name of each test, or alternatively add it to the front of
the random seed value, such that you can reproduce an order-related test failure
later. To specify the random seed explicitly, use the `--gtest_random_seed=SEED`
flag (or set the `GTEST_RANDOM_SEED` environment variable), where `SEED` is an
integer in the range [0, 99999]. The seed value 0 is special: it tells
googletest to do the default behavior of calculating the seed from the current
time.

If you combine this with `--gtest_repeat=N`, googletest will pick a different
random seed and re-shuffle the tests in each iteration.
### Distributing Test Functions to Multiple Machines

If you have more than one machine you can use to run a test program, you might
want to run the test functions in parallel and get the result faster. We call
this technique *sharding*, where each machine is called a *shard*.

GoogleTest is compatible with test sharding. To take advantage of this feature,
your test runner (not part of GoogleTest) needs to do the following:

1.  Allocate a number of machines (shards) to run the tests.
1.  On each shard, set the `GTEST_TOTAL_SHARDS` environment variable to the total
    number of shards. It must be the same for all shards.
1.  On each shard, set the `GTEST_SHARD_INDEX` environment variable to the index
    of the shard. Different shards must be assigned different indices, which
    must be in the range `[0, GTEST_TOTAL_SHARDS - 1]`.
1.  Run the same test program on all shards. When GoogleTest sees the above two
    environment variables, it will select a subset of the test functions to run.
    Across all shards, each test function in the program will be run exactly
    once.
1.  Wait for all shards to finish, then collect and report the results.

Your project may have tests that were written without GoogleTest and thus don't
understand this protocol. In order for your test runner to figure out which test
supports sharding, it can set the environment variable `GTEST_SHARD_STATUS_FILE`
to a non-existent file path. If a test program supports sharding, it will create
this file to acknowledge that fact; otherwise it will not create it. The actual
contents of the file are not important at this time, although we may put some
useful information in it in the future.

Here's an example to make it clear. Suppose you have a test program `foo_test`
that contains the following 5 test functions:

```
TEST(A, V)
TEST(A, W)
TEST(B, X)
TEST(B, Y)
TEST(B, Z)
```

Suppose you have 3 machines at your disposal. To run the test functions in
parallel, you would set `GTEST_TOTAL_SHARDS` to 3 on all machines, and set
`GTEST_SHARD_INDEX` to 0, 1, and 2 on the machines respectively. Then you would
run the same `foo_test` on each machine.

GoogleTest reserves the right to change how the work is distributed across the
shards, but here's one possible scenario:

*   Machine #0 runs `A.V` and `B.X`.
*   Machine #1 runs `A.W` and `B.Y`.
*   Machine #2 runs `B.Z`.
### Controlling Test Output

#### Colored Terminal Output

googletest can use colors in its terminal output to make it easier to spot the
important information:

<pre>...
<font color="green">[----------]</font> 1 test from FooTest
<font color="green">[ RUN      ]</font> FooTest.DoesAbc
```c++
TEST(MathTest, Subtraction) { ... }
TEST(LogicTest, NonContradiction) { ... }
```

could generate this report:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<testsuites tests="3" failures="1" errors="0" time="0.035" timestamp="2011-10-31T18:52:42" name="AllTests">
  <testsuite name="MathTest" tests="2" failures="1" errors="0" time="0.015">
    <testcase name="Addition" file="test.cpp" line="1" status="run" time="0.007" classname="">
      <failure message="Value of: add(1, 1)&#x0A;  Actual: 3&#x0A;Expected: 2" type="">...</failure>
      <failure message="Value of: add(1, -1)&#x0A;  Actual: 1&#x0A;Expected: 0" type="">...</failure>
    </testcase>
    <testcase name="Subtraction" file="test.cpp" line="2" status="run" time="0.005" classname="">
    </testcase>
  </testsuite>
  <testsuite name="LogicTest" tests="1" failures="0" errors="0" time="0.005">
    <testcase name="NonContradiction" file="test.cpp" line="3" status="run" time="0.005" classname="">
    </testcase>
  </testsuite>
</testsuites>
```
Things to note:

*   The `tests` attribute of a `<testsuites>` or `<testsuite>` element tells how
    many test functions the googletest program or test suite contains, while the
    `failures` attribute tells how many of them failed.

*   The `time` attribute expresses the duration of the test, test suite, or
    entire test program in seconds.

*   The `timestamp` attribute records the local date and time of the test
    execution.

*   The `file` and `line` attributes record the source file location where the
    test was defined.

*   Each `<failure>` element corresponds to a single failed googletest
    assertion.
#### Generating a JSON Report

googletest can also emit a JSON report as an alternative format to XML. To
generate the JSON report, set the `GTEST_OUTPUT` environment variable or the
`--gtest_output` flag to the string `"json:path_to_output_file"`, which will
create the file at the given location. You can also just use the string
`"json"`, in which case the output can be found in the `test_detail.json` file
"items": { | "items": { | |||
"$ref": "#/definitions/TestInfo" | "$ref": "#/definitions/TestInfo" | |||
} | } | |||
} | } | |||
} | } | |||
}, | }, | |||
"TestInfo": { | "TestInfo": { | |||
"type": "object", | "type": "object", | |||
"properties": { | "properties": { | |||
"name": { "type": "string" }, | "name": { "type": "string" }, | |||
"file": { "type": "string" }, | ||||
"line": { "type": "integer" }, | ||||
"status": { | "status": { | |||
"type": "string", | "type": "string", | |||
"enum": ["RUN", "NOTRUN"] | "enum": ["RUN", "NOTRUN"] | |||
}, | }, | |||
"time": { "type": "string" }, | "time": { "type": "string" }, | |||
"classname": { "type": "string" }, | "classname": { "type": "string" }, | |||
"failures": { | "failures": { | |||
"type": "array", | "type": "array", | |||
"items": { | "items": { | |||
"$ref": "#/definitions/Failure" | "$ref": "#/definitions/Failure" | |||
```proto
  int32 tests = 2;
  int32 failures = 3;
  int32 disabled = 4;
  int32 errors = 5;
  google.protobuf.Duration time = 6;
  repeated TestInfo testsuite = 7;
}

message TestInfo {
  string name = 1;
  string file = 6;
  int32 line = 7;
  enum Status {
    RUN = 0;
    NOTRUN = 1;
  }
  Status status = 2;
  google.protobuf.Duration time = 3;
  string classname = 4;
  message Failure {
    string failures = 1;
    string type = 2;
```
"testsuites": [ | "testsuites": [ | |||
{ | { | |||
"name": "MathTest", | "name": "MathTest", | |||
"tests": 2, | "tests": 2, | |||
"failures": 1, | "failures": 1, | |||
"errors": 0, | "errors": 0, | |||
"time": "0.015s", | "time": "0.015s", | |||
"testsuite": [ | "testsuite": [ | |||
{ | { | |||
"name": "Addition", | "name": "Addition", | |||
"file": "test.cpp", | ||||
"line": 1, | ||||
"status": "RUN", | "status": "RUN", | |||
"time": "0.007s", | "time": "0.007s", | |||
"classname": "", | "classname": "", | |||
"failures": [ | "failures": [ | |||
{ | { | |||
"message": "Value of: add(1, 1)\n Actual: 3\nExpected: 2", | "message": "Value of: add(1, 1)\n Actual: 3\nExpected: 2", | |||
"type": "" | "type": "" | |||
}, | }, | |||
{ | { | |||
"message": "Value of: add(1, -1)\n Actual: 1\nExpected: 0", | "message": "Value of: add(1, -1)\n Actual: 1\nExpected: 0", | |||
"type": "" | "type": "" | |||
} | } | |||
] | ] | |||
}, | }, | |||
{ | { | |||
"name": "Subtraction", | "name": "Subtraction", | |||
"file": "test.cpp", | ||||
"line": 2, | ||||
"status": "RUN", | "status": "RUN", | |||
"time": "0.005s", | "time": "0.005s", | |||
"classname": "" | "classname": "" | |||
} | } | |||
] | ] | |||
}, | }, | |||
{ | { | |||
"name": "LogicTest", | "name": "LogicTest", | |||
"tests": 1, | "tests": 1, | |||
"failures": 0, | "failures": 0, | |||
"errors": 0, | "errors": 0, | |||
"time": "0.005s", | "time": "0.005s", | |||
"testsuite": [ | "testsuite": [ | |||
{ | { | |||
"name": "NonContradiction", | "name": "NonContradiction", | |||
"file": "test.cpp", | ||||
"line": 3, | ||||
"status": "RUN", | "status": "RUN", | |||
"time": "0.005s", | "time": "0.005s", | |||
"classname": "" | "classname": "" | |||
} | } | |||
] | ] | |||
} | } | |||
] | ] | |||
} | } | |||
``` | ``` | |||
{: .callout .important}
IMPORTANT: The exact format of the JSON document is subject to change.
### Controlling How Failures Are Reported

#### Detecting Test Premature Exit

Google Test implements the _premature-exit-file_ protocol for test runners to
catch any kind of unexpected exits of test programs. Upon start, Google Test
creates the file, which is automatically deleted after all work has been
finished. Then, the test runner can check if this file exists. In case the file
remains undeleted, the inspected test has exited prematurely.

This feature is enabled only if the `TEST_PREMATURE_EXIT_FILE` environment
variable has been set.
#### Turning Assertion Failures into Break-Points

When running test programs under a debugger, it's very convenient if the
debugger can catch an assertion failure and automatically drop into interactive
mode. googletest's *break-on-failure* mode supports this behavior.