.. SPDX-License-Identifier: CC-BY-SA-2.0-UK

*****************************************
The Yocto Project Test Environment Manual
*****************************************

.. _test-welcome:

Welcome
=======

Welcome to the Yocto Project Test Environment Manual! This manual is a
work in progress. The manual contains information about the testing
environment used by the Yocto Project to make sure each major and minor
release works as intended. All the project's testing infrastructure and
processes are publicly visible and available so that the community can
see what testing is being performed, how it is being done and the current
status of the tests and the project at any given time. Other
organizations can leverage the process and testing environment used by
the Yocto Project to create their own automated, production test
environment, building upon the foundations from the project core.

Currently, the Yocto Project Test Environment Manual has no projected
release date. This manual is a work-in-progress and is being initially
loaded with information from the README files and notes from key
engineers:

-  *yocto-autobuilder2:* This
   :yocto_git:`README.md </cgit.cgi/yocto-autobuilder2/tree/README.md>`
   is the main README which details how to set up the Yocto Project
   Autobuilder. The ``yocto-autobuilder2`` repository represents the
   Yocto Project's console UI plugin to Buildbot and the configuration
   necessary to configure Buildbot to perform the testing the project
   requires.

-  *yocto-autobuilder-helper:* This :yocto_git:`README </cgit.cgi/yocto-autobuilder-helper/tree/README/>`
   and repository contain the Yocto Project Autobuilder Helper scripts and
   configuration. The ``yocto-autobuilder-helper`` repository contains
   the "glue" logic that defines which tests to run and how to run them.
   As a result, it can be used by any Continuous Integration (CI) system
   to run builds, support getting the correct code revisions, configure
   builds and layers, and collect results. The code is
   independent of any CI system, which means the code can work with `Buildbot <https://docs.buildbot.net/0.9.15.post1/>`__,
   Jenkins, or others. This repository has a branch per release of the
   project, defining the tests to run on a per-release basis.

.. _test-yocto-project-autobuilder-overview:

Yocto Project Autobuilder Overview
==================================

The Yocto Project Autobuilder collectively refers to the software,
tools, scripts, and procedures used by the Yocto Project to test
released software across supported hardware in an automated and regular
fashion. Basically, during the development of a Yocto Project release,
the Autobuilder tests if things work. The Autobuilder builds all test
targets and runs all the tests.

The Yocto Project now uses standard upstream
`Buildbot <https://docs.buildbot.net/0.9.15.post1/>`__ (version 9) to
drive its integration and testing. Buildbot Nine has a plug-in interface
that the Yocto Project customizes using code from the
``yocto-autobuilder2`` repository, adding its own console UI plugin. The
resulting UI plug-in allows you to visualize builds in a way suited to
the project's needs.

A ``helper`` layer provides configuration and job management through
scripts found in the ``yocto-autobuilder-helper`` repository. The
``helper`` layer contains the bulk of the build configuration
information and is release-specific, which makes it highly customizable
on a per-project basis. The layer is CI system-agnostic and contains a
number of Helper scripts that can generate build configurations from
simple JSON files.
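
As a sketch of the idea only, the helper scripts read a JSON description
of build targets and layer per-target settings over shared defaults. The
following Python fragment is hypothetical; the real schema lives in
``config.json`` in the ``yocto-autobuilder-helper`` repository and
differs in detail::

   import json

   # Hypothetical, simplified configuration; the key names are illustrative.
   config = json.loads("""
   {
       "defaults": { "DISTRO": "poky", "BBTARGETS": "core-image-sato" },
       "overrides": { "qemux86-64": { "MACHINE": "qemux86-64" } }
   }
   """)

   def build_settings(target):
       # Per-target settings are layered over the shared defaults.
       settings = dict(config["defaults"])
       settings.update(config["overrides"].get(target, {}))
       return settings

   print(build_settings("qemux86-64"))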

.. note::

   The project uses Buildbot for historical reasons but also because
   many of the project developers have knowledge of Python. It is
   possible to use the outer layers from another Continuous Integration
   (CI) system such as
   `Jenkins <https://en.wikipedia.org/wiki/Jenkins_(software)>`__
   instead of Buildbot.

The following figure shows the Yocto Project Autobuilder stack with a
topology that includes a controller and a cluster of workers:

.. image:: figures/ab-test-cluster.png
   :align: center

.. _test-project-tests:

Yocto Project Tests - Types of Testing Overview
===============================================

The Autobuilder tests different elements of the project by using
the following types of tests:

-  *Build Testing:* Tests whether specific configurations build by
   varying :term:`MACHINE`,
   :term:`DISTRO`, other configuration
   options, and the specific target images being built (or world). Used
   to trigger builds of all the different test configurations on the
   Autobuilder. Builds usually cover many different targets for
   different architectures, machines, and distributions, as well as
   different configurations, such as different init systems. The
   Autobuilder tests literally hundreds of configurations and targets.

-  *Sanity Checks During the Build Process:* Tests initiated through
   the :ref:`insane <ref-classes-insane>`
   class. These checks ensure the output of the builds is correct.
   For example, does the ELF architecture in the generated binaries
   match the target system? ARM binaries would not work in a MIPS
   system! See the ``local.conf`` sketch after this list for an
   example of tuning these checks.

-  *Build Performance Testing:* Tests whether or not commonly used steps
   during builds work efficiently and avoid regressions. Tests to time
   commonly used usage scenarios are run through ``oe-build-perf-test``.
   These tests are run on isolated machines so that the time
   measurements of the tests are accurate and no other processes
   interfere with the timing results. The project currently tests
   performance on two different distributions, Fedora and Ubuntu, to
   ensure we have no single point of failure and can ensure the
   different distros work effectively.

-  *eSDK Testing:* Image tests initiated through the following command::

      $ bitbake image -c testsdkext

   The tests utilize the ``testsdkext`` class and the ``do_testsdkext`` task.

-  *Feature Testing:* Various scenario-based tests are run through the
   :ref:`OpenEmbedded Self test (oe-selftest) <ref-manual/ref-release-process:Testing and Quality Assurance>`. We test oe-selftest on each of the main distributions
   we support.

-  *Image Testing:* Image tests initiated through the following command::

      $ bitbake image -c testimage

   The tests utilize the :ref:`testimage* <ref-classes-testimage*>`
   classes and the :ref:`ref-tasks-testimage` task.

-  *Layer Testing:* The Autobuilder can test whether specific layers
   work with the rest of the system. The layers tested
   may be selected by members of the project. Some key community layers
   are also tested periodically.

-  *Package Testing:* A Package Test (ptest) runs tests against packages
   built by the OpenEmbedded build system on the target machine. See the
   :ref:`Testing Packages With
   ptest <dev-manual/dev-manual-common-tasks:Testing Packages With ptest>` section
   in the Yocto Project Development Tasks Manual and the
   ":yocto_wiki:`Ptest </wiki/Ptest>`" Wiki page for more
   information on Ptest.

-  *SDK Testing:* Image tests initiated through the following command::

      $ bitbake image -c testsdk

   The tests utilize the :ref:`testsdk <ref-classes-testsdk>` class and
   the ``do_testsdk`` task.

-  *Unit Testing:* Unit tests on various components of the system run
   through :ref:`bitbake-selftest <ref-manual/ref-release-process:Testing and Quality Assurance>` and
   :ref:`oe-selftest <ref-manual/ref-release-process:Testing and Quality Assurance>`.

-  *Automatic Upgrade Helper:* This target tests whether new versions of
   software are available and whether we can automatically upgrade to
   those new versions. If so, this target emails the maintainers with a
   patch to let them know this is possible.
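
As referenced in the *Sanity Checks* entry above, the checks made by the
:ref:`insane <ref-classes-insane>` class can be tuned from ``local.conf``.
A minimal sketch: ``WARN_QA`` and ``ERROR_QA`` are the real variables and
"arch" and "textrel" are real insane-class checks, but the particular
selection here is illustrative and partly overlaps the defaults::

   # Treat the ELF architecture check as a fatal error and
   # report text relocations as warnings only.
   ERROR_QA_append = " arch"
   WARN_QA_append = " textrel"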

.. _test-test-mapping:

How Tests Map to Areas of Code
==============================

Tests map into the codebase as follows:

-  *bitbake-selftest:*

   These tests are self-contained and test BitBake as well as its APIs,
   which include the fetchers. The tests are located in
   ``bitbake/lib/*/tests``.

   From within the BitBake repository, run the following::

      $ bitbake-selftest

   To skip tests that access the Internet, which are mostly needed to
   test the fetcher modules, set the ``BB_SKIP_NETTEST``
   variable when running ``bitbake-selftest`` as follows::

      $ BB_SKIP_NETTEST=yes bitbake-selftest

   The default output is quiet and just prints a summary of what was
   run. To see more information, there is a verbose option::

      $ bitbake-selftest -v

   To specify individual test modules to run, append the test module
   name to the ``bitbake-selftest`` command. For example, to run the
   tests in the ``bb.tests.data`` module, run::

      $ bitbake-selftest bb.tests.data

   You can also specify individual tests by defining the full name and module
   plus the class path of the test, for example::

      $ bitbake-selftest bb.tests.data.TestOverrides.test_one_override

   The tests are based on `Python
   unittest <https://docs.python.org/3/library/unittest.html>`__.

-  *oe-selftest:*

   -  These tests use OE to test the workflows, which include testing
      specific features, behaviors of tasks, and API unit tests.

   -  The tests can take advantage of parallelism through the ``-j``
      option, which can specify a number of threads to spread the tests
      across. Note that all tests from a given class of tests will run
      in the same thread. To parallelize large numbers of tests you can
      split the class into multiple units. See the example at the end
      of this list.

   -  The tests are based on Python unittest.

   -  The code for the tests resides in
      ``meta/lib/oeqa/selftest/cases/``.

   -  To run all the tests, enter the following command::

         $ oe-selftest -a

   -  To run a specific test, use the following command form where
      testname is the name of the specific test::

         $ oe-selftest -r <testname>

      For example, the following command would run the tinfoil
      getVar API test::

         $ oe-selftest -r tinfoil.TinfoilTests.test_getvar

      It is also possible to run a set
      of tests. For example, the following command will run all of the
      tinfoil tests::

         $ oe-selftest -r tinfoil
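
      As noted in the ``-j`` bullet above, such runs can be spread
      across multiple processes. A sketch, where the process count is
      illustrative::

         $ oe-selftest -j 4 -r tinfoil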

-  *testimage:*

   -  These tests build an image, boot it, and run tests against the
      image's content.

   -  The code for these tests resides in ``meta/lib/oeqa/runtime/cases/``.

   -  You need to set the :term:`IMAGE_CLASSES` variable as follows::

         IMAGE_CLASSES += "testimage"

   -  Run the tests using the following command form::

         $ bitbake image -c testimage
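
   -  The runtime tests executed can be selected with the
      :term:`TEST_SUITES` variable. A minimal ``local.conf`` sketch,
      assuming a QEMU machine; the suite names are illustrative::

         IMAGE_CLASSES += "testimage"
         TEST_SUITES = "ping ssh python"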

-  *testsdk:*

   -  These tests build an SDK, install it, and then run tests against
      that SDK.

   -  The code for these tests resides in ``meta/lib/oeqa/sdk/cases/``.

   -  Run the test using the following command form::

         $ bitbake image -c testsdk

-  *testsdk_ext:*

   -  These tests build an extended SDK (eSDK), install that eSDK, and
      run tests against the eSDK.

   -  The code for these tests resides in ``meta/lib/oeqa/esdk``.

   -  To run the tests, use the following command form::

         $ bitbake image -c testsdkext

-  *oe-build-perf-test:*

   -  These tests run through commonly used usage scenarios and measure
      the performance times.

   -  The code for these tests resides in ``meta/lib/oeqa/buildperf``.

   -  To run the tests, use the following command form::

         $ oe-build-perf-test <options>

      The command takes a number of options,
      such as where to place the test results. The Autobuilder Helper
      Scripts include the ``build-perf-test-wrapper`` script with
      examples of how to use ``oe-build-perf-test`` from the command
      line.

      Use the ``oe-git-archive`` command to store test results into a
      Git repository.

      Use the ``oe-build-perf-report`` command to generate text reports
      and HTML reports with graphs of the performance data. For
      examples, see
      :yocto_dl:`/releases/yocto/yocto-2.7/testresults/buildperf-centos7/perf-centos7.yoctoproject.org_warrior_20190414204758_0e39202.html`
      and
      :yocto_dl:`/releases/yocto/yocto-2.7/testresults/buildperf-centos7/perf-centos7.yoctoproject.org_warrior_20190414204758_0e39202.txt`.

   -  The tests are contained in ``lib/oeqa/buildperf/test_basic.py``.

Test Examples
=============

This section provides example tests for each of the tests listed in the
:ref:`test-manual/test-manual-intro:How Tests Map to Areas of Code` section.

For oe-selftest (oeqa) tests, the testcases for each area reside in the
main test directory at ``meta/lib/oeqa/selftest/cases``.

For bitbake-selftest, the bitbake testcases reside in the ``lib/bb/tests/``
directory of the BitBake repository.

.. _bitbake-selftest-example:

``bitbake-selftest``
--------------------

A simple test example from ``lib/bb/tests/data.py`` is::

   class DataExpansions(unittest.TestCase):
       def setUp(self):
           self.d = bb.data.init()
           self.d["foo"] = "value_of_foo"
           self.d["bar"] = "value_of_bar"
           self.d["value_of_foo"] = "value_of_'value_of_foo'"

       def test_one_var(self):
           val = self.d.expand("${foo}")
           self.assertEqual(str(val), "value_of_foo")

In this example, a ``DataExpansions`` class of tests is created,
derived from standard Python unittest. The class has a common ``setUp``
function which is shared by all the tests in the class. A simple test is
then added to test that when a variable is expanded, the correct value
is found.
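
Using the module, class and test-name form shown earlier, this single
test can be run on its own::

   $ bitbake-selftest bb.tests.data.DataExpansions.test_one_var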

Bitbake selftests are straightforward Python unittest tests. Refer to the
Python unittest documentation for additional information on writing
these tests at: https://docs.python.org/3/library/unittest.html.

.. _oe-selftest-example:

``oe-selftest``
---------------

These tests are more complex due to the setup required behind the scenes
for full builds. Rather than directly using Python's unittest, the code
wraps most of the standard objects. The tests can be simple, such as
testing a command from within the OE build environment using the
following example::

   class BitbakeLayers(OESelftestTestCase):
       def test_bitbakelayers_showcrossdepends(self):
           result = runCmd('bitbake-layers show-cross-depends')
           self.assertTrue('aspell' in result.output, msg="No dependencies were shown. bitbake-layers show-cross-depends output: %s" % result.output)

This example, taken from ``meta/lib/oeqa/selftest/cases/bblayers.py``,
creates a testcase from the ``OESelftestTestCase`` class, derived
from ``unittest.TestCase``, which runs the ``bitbake-layers`` command
and checks the output to ensure it contains something we know should be
here.
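
Given the module name (``bblayers``, from the file name above), this
test can be run on its own with::

   $ oe-selftest -r bblayers.BitbakeLayers.test_bitbakelayers_showcrossdepends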

The ``oeqa.utils.commands`` module contains helpers which can assist
with common tasks, including:

-  *Obtaining the value of a bitbake variable:* Use
   ``oeqa.utils.commands.get_bb_var()`` or use
   ``oeqa.utils.commands.get_bb_vars()`` for more than one variable.

-  *Running a bitbake invocation for a build:* Use
   ``oeqa.utils.commands.bitbake()``.

-  *Running a command:* Use ``oeqa.utils.commands.runCmd()``.

There is also an ``oeqa.utils.commands.runqemu()`` function for launching
the ``runqemu`` command for testing things within a running, virtualized
image.
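
A minimal sketch showing these helpers together in a hypothetical
selftest; the recipe name and assertions are illustrative, not an
existing test::

   from oeqa.selftest.case import OESelftestTestCase
   from oeqa.utils.commands import bitbake, get_bb_var, runCmd

   class HelperExample(OESelftestTestCase):
       def test_helpers(self):
           # Build a small recipe through a bitbake invocation
           bitbake('quilt-native')
           # Query a single variable from the BitBake data store
           tmpdir = get_bb_var('TMPDIR')
           self.assertTrue(tmpdir, msg='TMPDIR should not be empty')
           # Run an arbitrary command and inspect its output
           result = runCmd('bitbake-layers show-layers')
           self.assertEqual(result.status, 0, msg=result.output)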

You can run these tests in parallel. Parallelism works per test class,
so tests within a given test class should always run in the same build,
while tests in different classes or modules may be split into different
builds. There is no data store available for these tests since the tests
launch the ``bitbake`` command and exist outside of its context. As a
result, common bitbake library functions (bb.\*) are also unavailable.

.. _testimage-example:

``testimage``
-------------

These tests are run once an image is up and running, either on target
hardware or under QEMU. As a result, they are assumed to be running in a
target image environment, as opposed to a host build environment. A
simple example from ``meta/lib/oeqa/runtime/cases/python.py`` contains
the following::

   class PythonTest(OERuntimeTestCase):
       @OETestDepends(['ssh.SSHTest.test_ssh'])
       @OEHasPackage(['python3-core'])
       def test_python3(self):
           cmd = "python3 -c \"import codecs; print(codecs.encode('Uryyb, jbeyq', 'rot13'))\""
           status, output = self.target.run(cmd)
           msg = 'Exit status was not 0. Output: %s' % output
           self.assertEqual(status, 0, msg=msg)

In this example, the ``OERuntimeTestCase`` class wraps
``unittest.TestCase``. Within the test, ``self.target`` represents the
target system, where commands can be run on it using the ``run()``
method.

To ensure certain test or package dependencies are met, you can use the
``OETestDepends`` and ``OEHasPackage`` decorators. For example, the test
in this example would only make sense if python3-core is installed in
the image.
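
Following the same pattern, a hypothetical runtime test (not one of the
manual's examples; the package and command are illustrative) might look
like::

   from oeqa.runtime.case import OERuntimeTestCase
   from oeqa.core.decorator.depends import OETestDepends
   from oeqa.runtime.decorator.package import OEHasPackage

   class DateTest(OERuntimeTestCase):
       # Only run once ssh connectivity has been verified, and only if
       # the coreutils package is installed in the image.
       @OETestDepends(['ssh.SSHTest.test_ssh'])
       @OEHasPackage(['coreutils'])
       def test_date(self):
           status, output = self.target.run('date')
           self.assertEqual(status, 0, msg='date failed: %s' % output)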

.. _testsdk_ext-example:

``testsdk_ext``
---------------

These tests are run against built extensible SDKs (eSDKs). The tests can
assume that the eSDK environment has already been set up. An example from
``meta/lib/oeqa/sdk/cases/devtool.py`` contains the following::

   class DevtoolTest(OESDKExtTestCase):
       @classmethod
       def setUpClass(cls):
           myapp_src = os.path.join(cls.tc.esdk_files_dir, "myapp")
           cls.myapp_dst = os.path.join(cls.tc.sdk_dir, "myapp")
           shutil.copytree(myapp_src, cls.myapp_dst)
           subprocess.check_output(['git', 'init', '.'], cwd=cls.myapp_dst)
           subprocess.check_output(['git', 'add', '.'], cwd=cls.myapp_dst)
           subprocess.check_output(['git', 'commit', '-m', "'test commit'"], cwd=cls.myapp_dst)

       @classmethod
       def tearDownClass(cls):
           shutil.rmtree(cls.myapp_dst)

       def _test_devtool_build(self, directory):
           self._run('devtool add myapp %s' % directory)
           try:
               self._run('devtool build myapp')
           finally:
               self._run('devtool reset myapp')

       def test_devtool_build_make(self):
           self._test_devtool_build(self.myapp_dst)

In this example, the ``devtool``
command is tested to see whether a sample application can be built with
the ``devtool build`` command within the eSDK.

.. _testsdk-example:

``testsdk``
-----------

These tests are run against built SDKs. The tests can assume that an SDK
has already been extracted and its environment file has been sourced. A
simple example from ``meta/lib/oeqa/sdk/cases/python.py`` contains the
following::

   class Python3Test(OESDKTestCase):
       def setUp(self):
           if not (self.tc.hasHostPackage("nativesdk-python3-core") or
                   self.tc.hasHostPackage("python3-core-native")):
               raise unittest.SkipTest("No python3 package in the SDK")

       def test_python3(self):
           cmd = "python3 -c \"import codecs; print(codecs.encode('Uryyb, jbeyq', 'rot13'))\""
           output = self._run(cmd)
           self.assertEqual(output, "Hello, world\n")

In this example, if nativesdk-python3-core has been installed into the SDK,
the code runs the python3 interpreter with a basic command to check it is
working correctly. The test would only run if python3 is installed in the SDK.

.. _oe-build-perf-test-example:

``oe-build-perf-test``
----------------------

The performance tests usually measure how long operations take and the
resource utilisation as that happens. An example from
``meta/lib/oeqa/buildperf/test_basic.py`` contains the following::

   class Test3(BuildPerfTestCase):

       def test3(self):
           """Bitbake parsing (bitbake -p)"""
           # Drop all caches and parse
           self.rm_cache()
           oe.path.remove(os.path.join(self.bb_vars['TMPDIR'], 'cache'), True)
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_1',
                                      'bitbake -p (no caches)')
           # Drop tmp/cache
           oe.path.remove(os.path.join(self.bb_vars['TMPDIR'], 'cache'), True)
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_2',
                                      'bitbake -p (no tmp/cache)')
           # Parse with fully cached data
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_3',
                                      'bitbake -p (cached)')

This example shows how three specific parsing timings are
measured, with and without various caches, to show how BitBake's parsing
performance trends over time.

.. _test-writing-considerations:

Considerations When Writing Tests
=================================

When writing good tests, there are several things to keep in mind. Since
resources on the Autobuilder are accessed concurrently by multiple
workers, consider the following:

**Running "cleanall" is not permitted.**

This can delete files from :term:`DL_DIR` which would potentially break other
builds running in parallel. If this is required, :term:`DL_DIR` must be set to
an isolated directory.

**Running "cleansstate" is not permitted.**

This can delete files from :term:`SSTATE_DIR` which would potentially break
other builds running in parallel. If this is required, :term:`SSTATE_DIR` must
be set to an isolated directory. Alternatively, you can use the ``-f``
option with the ``bitbake`` command to "taint" tasks by changing the
sstate checksums to ensure sstate cache items will not be reused.
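
For example, to force the compile task of a recipe to rerun regardless
of existing sstate (the recipe name is illustrative)::

   $ bitbake -f -c compile busybox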

**Tests should not change the metadata.**

This is particularly true for oe-selftests since these can run in
parallel and changing metadata leads to changing checksums, which
confuses BitBake while running in parallel. If this is necessary, copy
layers to a temporary location and modify them. Some tests need to
change metadata, such as the devtool tests. To prevent the metadata
from changing, set up temporary copies of that data first.