Now, run the following command to use the nose2 test runner to execute all the tests we created. The test runner will execute all the methods of our TestHexacopter class that start with the test_ prefix and will display the results. In this case, we just have one method that matches the criteria, but we will add more later.
Run the following command within the same virtual environment we have been using. We will use the -v option to instruct nose2 to print test case names and statuses. The --with-coverage option turns on test coverage report generation:
nose2 -v --with-coverage
The following lines show the sample output. Notice that the numbers shown in the report might have small differences if our code includes additional lines or comments:
test_set_and_get_led_brightness_level (test_hexacopter.TestHexacopter) ...
I've started setting the Blue LED's brightness level
I've finished setting the Blue LED's brightness level
I've started setting the White LED's brightness level
I've finished setting the White LED's brightness level
I've started retrieving Blue LED's status
I've finished retrieving Blue LED's status
I've started retrieving White LED's status
I've finished retrieving White LED's status
ok
----------------------------------------------------------------
Ran 1 test in 1.311s

OK
----------- coverage: platform win32, python 3.5.2-final-0 -----
Name           Stmts   Miss  Cover
----------------------------------
async_api.py     129     69    47%
drone.py          57     18    68%
----------------------------------
TOTAL            186     87    53%
By default, nose2 looks for modules whose names start with the test prefix. In this case, the only module that matches the criteria is the test_hexacopter module. Within the matching modules, nose2 loads tests from all the subclasses of unittest.TestCase and from the functions whose names start with the test prefix. The tornado.testing.AsyncHTTPTestCase class includes unittest.TestCase as one of its superclasses in the class hierarchy, and therefore nose2 loads the tests declared in our TestHexacopter class.
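We can verify the method-name part of these discovery rules from the interactive interpreter. The sketch below uses the standard unittest loader, which applies the same test prefix convention that nose2 follows; the SampleTests class is made up purely for illustration.

```python
import unittest


class SampleTests(unittest.TestCase):
    def test_collected(self):
        """Starts with the test prefix, so the loader collects it."""
        pass

    def helper_not_collected(self):
        """Does not start with the test prefix, so the loader ignores it."""
        pass


# The loader only picks the methods whose names start with the test prefix
names = unittest.TestLoader().getTestCaseNames(SampleTests)
print(names)  # ['test_collected']
```

When Tornado is installed, issubclass(tornado.testing.AsyncHTTPTestCase, unittest.TestCase) evaluates to True, which is why nose2 treats our TestHexacopter class as a source of tests.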
The output provided details indicating that the test runner discovered and executed one test and that it passed. The output displays the method name and the class name for each method in the TestHexacopter class that starts with the test_ prefix and represents a test to be executed.
We definitely have very low coverage for async_api.py and drone.py based on the measurements shown in the report. In fact, we just wrote one test related to LEDs, so it makes sense that the coverage has to be improved; we didn't create tests related to the other hexacopter resources.
We can run the coverage command with the -m command-line option to display the line numbers of the missing statements in a new Missing column:
coverage report -m
The command will use the information from the last execution and will display the missing statements. The next lines show a sample output that corresponds to the previous execution of the unit tests. Notice that the numbers shown in the report might have small differences if our code includes additional lines or comments:
Name           Stmts   Miss  Cover   Missing
--------------------------------------------
async_api.py     129     69    47%   137-150, 154, 158-187, 191, 202-204, 226-228, 233-235, 249-256, 270-282, 286, 311-315
drone.py          57     18    68%   11-12, 24, 27-34, 37, 40-41, 59, 61, 68-69
--------------------------------------------
TOTAL            186     87    53%
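The Missing column lists line numbers and ranges in coverage's compact notation. If we ever need those line numbers programmatically, for example to cross-reference them with a code review tool, a small helper can expand the notation. Note that this parser is our own illustrative code, not part of the coverage package:

```python
def expand_missing(spec):
    """Expand coverage's Missing notation, e.g. '11-12, 24' -> [11, 12, 24]."""
    lines = []
    for chunk in spec.split(','):
        chunk = chunk.strip()
        if '-' in chunk:
            # A range such as '27-34' covers both endpoints
            start, end = chunk.split('-')
            lines.extend(range(int(start), int(end) + 1))
        else:
            # A single missing line such as '24'
            lines.append(int(chunk))
    return lines


print(expand_missing('11-12, 24, 27-34'))
# [11, 12, 24, 27, 28, 29, 30, 31, 32, 33, 34]
```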
Now, run the following command to get annotated HTML listings detailing missed lines:
coverage html
Open the index.html file generated in the htmlcov folder with your Web browser. The following screenshot shows an example report that coverage generated in HTML format:
Click or tap on drone.py and the Web browser will render a Web page that displays the statements that were run, the missing ones, and the excluded ones, each with a different color. We can click or tap on the run, missing, and excluded buttons to show or hide the background color that represents the status for each line of code. By default, the missing lines of code are displayed with a pink background. Thus, we must write unit tests that target these lines of code to improve our test coverage.
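If we prefer not to locate the file manually, we can open the generated report from Python. The htmlcov path below is coverage's default output folder, so adjust it if your configuration overrides it:

```python
import pathlib
import webbrowser

# Build a file:// URL for the report produced by the coverage html command
report = pathlib.Path('htmlcov') / 'index.html'
url = report.resolve().as_uri()
print(url)

# Uncomment to launch the default Web browser on the report
# webbrowser.open(url)
```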