New source files and Makefile update for Test Manual

(From yocto-docs rev: d7cff640569a5772f3c366b4136762628fca534d)

Signed-off-by: Mark Morton <mark.morton@windriver.com>
Signed-off-by: Richard Purdie <richard.purdie@linuxfoundation.org>
Mark Morton 2020-06-16 14:14:50 -07:00 committed by Richard Purdie
parent 04118df8b0
commit 444666faa4
9 changed files with 2184 additions and 0 deletions

@@ -358,6 +358,16 @@ STYLESHEET = $(DOC)/*.css
endif
ifeq ($(DOC),test-manual)
XSLTOPTS = --xinclude
ALLPREQ = html tarball
TARFILES = test-manual.html test-manual-style.css \
figures/test-manual-title.png figures/ab-test-cluster.png
MANUALS = $(DOC)/$(DOC).html
FIGURES = figures
STYLESHEET = $(DOC)/*.css
endif
##
# These URI should be rewritten by your distribution's xml catalog to
# match your locally installed XSL stylesheets.

Binary file not shown (new image, 18 KiB).

Binary file not shown (new image, 15 KiB).

@@ -0,0 +1,27 @@
<?xml version='1.0'?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns="http://www.w3.org/1999/xhtml" xmlns:fo="http://www.w3.org/1999/XSL/Format" version="1.0">
<xsl:import href="http://downloads.yoctoproject.org/mirror/docbook-mirror/docbook-xsl-1.76.1/xhtml/docbook.xsl" />
<!--
<xsl:import href="../template/1.76.1/docbook-xsl-1.76.1/xhtml/docbook.xsl" />
<xsl:import href="http://docbook.sourceforge.net/release/xsl/1.76.1/xhtml/docbook.xsl" />
-->
<xsl:include href="../template/permalinks.xsl"/>
<xsl:include href="../template/section.title.xsl"/>
<xsl:include href="../template/component.title.xsl"/>
<xsl:include href="../template/division.title.xsl"/>
<xsl:include href="../template/formal.object.heading.xsl"/>
<xsl:param name="html.stylesheet" select="'test-manual-style.css'" />
<xsl:param name="chapter.autolabel" select="1" />
<xsl:param name="appendix.autolabel" select="A" />
<xsl:param name="section.autolabel" select="1" />
<xsl:param name="section.label.includes.component.label" select="1" />
<xsl:param name="generate.id.attributes" select="1" />
</xsl:stylesheet>

@@ -0,0 +1,634 @@
<!DOCTYPE chapter PUBLIC "-//OASIS//DTD DocBook XML V4.2//EN"
"http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd"
[<!ENTITY % poky SYSTEM "../poky.ent"> %poky; ] >
<chapter id='test-manual-intro'>
<title>The Yocto Project Test Environment Manual</title>
<section id='test-welcome'>
<title>Welcome</title>
<para> Welcome to the Yocto Project Test Environment Manual! This manual is a work in
progress. The manual contains information about the testing environment used by the
Yocto Project to make sure each major and minor release works as intended. All the
project's testing infrastructure and processes are publicly visible and available so
that the community can see what testing is being performed, how it's being done and the
current status of the tests and the project at any given time. It is intended that other
organizations can leverage the process and testing environment used by the Yocto
Project to create their own automated, production test environment, building upon the
foundations from the project core. </para>
<para> Currently, the Yocto Project Test Environment Manual has no projected release date.
This manual is a work-in-progress and is being initially loaded with information from
the <ulink url="">README</ulink> files and notes from key engineers: <itemizedlist>
<listitem>
<para>
<emphasis><filename>yocto-autobuilder2</filename>:</emphasis> This <ulink
url="http://git.yoctoproject.org/clean/cgit.cgi/yocto-autobuilder2/tree/README.md"
><filename>README.md</filename></ulink> is the main README which
details how to set up the Yocto Project Autobuilder. The
<filename>yocto-autobuilder2</filename> repository represents the Yocto
Project's console UI plugin to Buildbot and the configuration necessary to
configure Buildbot to perform the testing the project requires. </para>
</listitem>
<listitem>
<para>
<emphasis><filename>yocto-autobuilder-helper</filename>:</emphasis> This
<ulink
url="http://git.yoctoproject.org/clean/cgit.cgi/yocto-autobuilder-helper/tree/README"
><filename>README</filename></ulink> and repository contains Yocto
Project Autobuilder Helper scripts and configuration. The
<filename>yocto-autobuilder-helper</filename> repository contains the
"glue" logic that defines which tests to run and how to run them. As a
result, it can be used by any Continuous Integration (CI) system to run
builds, support getting the correct code revisions, configure builds and
layers, run builds, and collect results. The code is independent of any CI
system, which means the code can work with Buildbot, Jenkins, or others. This
repository has a branch per release of the project defining the tests to run
on a per release basis.</para>
</listitem>
</itemizedlist>
</para>
</section>
<section id='test-yocto-project-autobuilder-overview'>
<title>Yocto Project Autobuilder Overview</title>
<para>The Yocto Project Autobuilder collectively refers to the software, tools, scripts, and
procedures used by the Yocto Project to test released software across supported hardware
in an automated and regular fashion. Basically, during the development of a Yocto
Project release, the Autobuilder tests if things work. The Autobuilder builds all test
targets and runs all the tests. </para>
<para>The Yocto Project now uses standard upstream <ulink
url="https://docs.buildbot.net/0.9.15.post1/">Buildbot</ulink> (version 9) to drive
its integration and testing. Buildbot Nine has a plug-in interface that the Yocto
Project customizes using code from the <filename>yocto-autobuilder2</filename>
repository, adding its own console UI plugin. The resulting UI plug-in allows you to
visualize builds in a way suited to the project's needs.</para>
<para>A <filename>helper</filename> layer provides configuration and job management through
scripts found in the <filename>yocto-autobuilder-helper</filename> repository. The
<filename>helper</filename> layer contains the bulk of the build configuration
information and is release-specific, which makes it highly customizable on a per-project
basis. The layer is CI system-agnostic and contains a number of Helper scripts that can
generate build configurations from simple JSON files. <note>
<para>The project uses Buildbot for historical reasons but also because many of the
project developers have knowledge of Python. It is possible to use the outer
layers from another Continuous Integration (CI) system such as <ulink
url="https://en.wikipedia.org/wiki/Jenkins_(software)">Jenkins</ulink>
instead of Buildbot. </para>
</note>
</para>
<para> The following figure shows the Yocto Project Autobuilder stack with a topology that
includes a controller and a cluster of workers: <imagedata
fileref="figures/ab-test-cluster.png" width="4.6in" depth="4.35in" align="center"
scalefit="1"/>
</para>
</section>
<section id='test-project-tests'>
<title>Yocto Project Tests - Types of Testing Overview</title>
<para>The Autobuilder tests different elements of the project by using the following types of
tests: <itemizedlist>
<listitem>
<para>
<emphasis>Build Testing:</emphasis> Tests whether specific configurations
build by varying <ulink url="&YOCTO_DOCS_REF_URL;#var-MACHINE"
><filename>MACHINE</filename></ulink>, <ulink
url="&YOCTO_DOCS_REF_URL;#var-DISTRO"
><filename>DISTRO</filename></ulink>, other configuration options, and
the specific target images being built (or world). These tests are used to trigger builds of
all the different test configurations on the Autobuilder. Builds usually
cover many different targets for different architectures, machines, and
distributions, as well as different configurations, such as different init
systems. The Autobuilder tests literally hundreds of configurations and
targets. <itemizedlist>
<listitem>
<para>
<emphasis>Sanity Checks During the Build Process:</emphasis>
Tests initiated through the <ulink
url="&YOCTO_DOCS_REF_URL;#ref-classes-insane"
><filename>insane</filename></ulink> class. These checks
ensure the output of the builds is correct. For example, does
the ELF architecture in the generated binaries match the target
system? ARM binaries would not work on a MIPS system! </para>
</listitem>
</itemizedlist></para>
</listitem>
<listitem>
<para>
<emphasis>Build Performance Testing:</emphasis> Tests whether or not
commonly used steps during builds work efficiently and avoid regressions.
Tests to time commonly used usage scenarios are run through
<filename>oe-build-perf-test</filename>. These tests are run on isolated
machines so that the time measurements of the tests are accurate and no
other processes interfere with the timing results. The project currently
tests performance on two different distributions, Fedora and Ubuntu, to
ensure there is no single point of failure and that the different
distros work effectively. </para>
</listitem>
<listitem>
<para>
<emphasis>eSDK Testing:</emphasis> Image tests initiated through the
following command:
<literallayout class="monospaced">
$ bitbake <replaceable>image</replaceable> -c testsdkext
</literallayout>
The tests utilize the <filename>testsdkext</filename> class and the
<filename>do_testsdkext</filename> task. </para>
</listitem>
<listitem>
<para>
<emphasis>Feature Testing:</emphasis> Various scenario-based tests are run
through the <ulink url="&YOCTO_DOCS_REF_URL;#testing-and-quality-assurance"
>OpenEmbedded Self-Test</ulink> (oe-selftest). We test oe-selftest on
each of the main distributions we support. </para>
</listitem>
<listitem>
<para>
<emphasis>Image Testing:</emphasis> Image tests initiated through the
following command:
<literallayout class="monospaced">
$ bitbake <replaceable>image</replaceable> -c testimage
</literallayout>
The tests utilize the <ulink
url="&YOCTO_DOCS_REF_URL;#ref-classes-testimage*"
><filename>testimage*</filename></ulink> classes and the <ulink
url="&YOCTO_DOCS_REF_URL;#ref-tasks-testimage"
><filename>do_testimage</filename></ulink> task. </para>
</listitem>
<listitem>
<para>
<emphasis>Layer Testing:</emphasis> The Autobuilder can test whether
specific layers work with the rest of the system. The layers
tested may be selected by members of the project. Some key community layers
are also tested periodically.</para>
</listitem>
<listitem>
<para>
<emphasis>Package Testing:</emphasis> A Package Test (ptest) runs tests
against packages built by the OpenEmbedded build system on the target
machine. See the "<ulink
url="&YOCTO_DOCS_DEV_URL;#testing-packages-with-ptest">Testing Packages
With ptest</ulink>" section in the Yocto Project Development Tasks
Manual and the "<ulink url="&YOCTO_WIKI_URL;/wiki/Ptest">Ptest</ulink>" Wiki
page for more information on Ptest. </para>
</listitem>
<listitem>
<para>
<emphasis>SDK Testing:</emphasis> Image tests initiated through the
following command:
<literallayout class="monospaced">
$ bitbake <replaceable>image</replaceable> -c testsdk
</literallayout>
The tests utilize the <ulink url="&YOCTO_DOCS_REF_URL;#ref-classes-testsdk"
><filename>testsdk</filename></ulink> class and the
<filename>do_testsdk</filename> task. </para>
</listitem>
<listitem>
<para>
<emphasis>Unit Testing:</emphasis> Unit tests on various components of the
system run through <filename>oe-selftest</filename> and <ulink
url="&YOCTO_DOCS_REF_URL;#testing-and-quality-assurance"
><filename>bitbake-selftest</filename></ulink>. </para>
</listitem>
<listitem>
<para>
<emphasis>Automatic Upgrade Helper:</emphasis> This target tests whether new
versions of software are available and whether we can automatically upgrade
to those new versions. If so, this target emails the maintainers with a
patch to let them know this is possible.</para>
</listitem>
</itemizedlist>
</para>
</section>
<section id='test-test-mapping'>
<title>How Tests Map to Areas of Code</title>
<para>
Tests map into the codebase as follows:
<itemizedlist>
<listitem><para>
<emphasis>bitbake-selftest</emphasis>: <itemizedlist>
<listitem>
<para>These tests are self-contained and test BitBake as well as its
APIs, which include the fetchers. The tests are located in
<filename>bitbake/lib/*/tests</filename>. </para>
</listitem>
<listitem>
<para>From within the BitBake repository, run the following:
<literallayout class="monospaced">
$ bitbake-selftest
</literallayout>
</para>
</listitem>
<listitem>
<para>To skip tests that access the Internet, use the
<filename>BB_SKIP_NETTEST</filename> variable when running
"bitbake-selftest" as follows:
<literallayout class="monospaced">
$ BB_SKIP_NETTEST=yes bitbake-selftest
</literallayout>The
default output is quiet and just prints a summary of what was
run. To see more information, there is a verbose
option:<literallayout>
$ bitbake-selftest -v
</literallayout></para>
<para>Use the <filename>BB_SKIP_NETTEST</filename> variable when you wish to skip
tests that access the network, which are mostly needed to test the fetcher modules.
To specify individual test modules to run, append the test
module name to the "bitbake-selftest" command. For example, to
specify the tests for the bb.tests.data module, run:
<literallayout class="monospaced">
$ bitbake-selftest bb.tests.data
</literallayout>You
can also specify individual tests by defining the full name and
module plus the class path of the test, for example:
<literallayout>
$ bitbake-selftest bb.tests.data.TestOverrides.test_one_override
</literallayout></para>
</listitem>
<listitem>
<para>The tests are based on <ulink
url="https://docs.python.org/3/library/unittest.html">Python
unittest</ulink>. </para>
</listitem>
</itemizedlist>
</para></listitem>
<listitem><para>
<emphasis>oe-selftest</emphasis>: <itemizedlist>
<listitem>
<para>These tests use OE to test the workflows, which include
testing specific features, behaviors of tasks, and API unit
tests. </para>
</listitem>
<listitem>
<para>The tests can take advantage of parallelism through the "-j"
option, which can specify a number of threads to spread the
tests across. Note that all tests from a given class of tests
will run in the same thread. To parallelize large numbers of
tests you can split the class into multiple units.</para>
</listitem>
<listitem>
<para>The tests are based on Python unittest. </para>
</listitem>
<listitem>
<para>The code for the tests resides in
<filename>meta/lib/oeqa/selftest/cases/</filename>. </para>
</listitem>
<listitem>
<para>To run all the tests, enter the following command:
<literallayout class="monospaced">
$ oe-selftest -a
</literallayout>
</para>
</listitem>
<listitem>
<para>To run a specific test, use the following command form where
<replaceable>testname</replaceable> is the name of the
specific test:
<literallayout class="monospaced">
$ oe-selftest -r <replaceable>testname</replaceable>
</literallayout>
For example, the following command would run the tinfoil getVar
API
test:<literallayout>
$ oe-selftest -r tinfoil.TinfoilTests.test_getvar
</literallayout>It
is also possible to run a set of tests. For example the
following command will run all of the tinfoil
tests:<literallayout>
$ oe-selftest -r tinfoil
</literallayout></para>
</listitem>
</itemizedlist>
</para></listitem>
<listitem><para>
<emphasis>testimage:</emphasis>
<itemizedlist>
<listitem><para>
These tests build an image, boot it, and run tests
against the image's content.
</para></listitem>
<listitem><para> The code for these tests resides in <filename>meta/lib/oeqa/runtime/cases/</filename>. </para></listitem>
<listitem><para>
You need to set the
<ulink url='&YOCTO_DOCS_REF_URL;#var-IMAGE_CLASSES'><filename>IMAGE_CLASSES</filename></ulink>
variable as follows:
<literallayout class='monospaced'>
IMAGE_CLASSES += "testimage"
</literallayout>
</para></listitem>
<listitem><para>
Run the tests using the following command form:
<literallayout class='monospaced'>
$ bitbake <replaceable>image</replaceable> -c testimage
</literallayout>
</para></listitem>
</itemizedlist>
</para></listitem>
<listitem><para>
<emphasis>testsdk:</emphasis>
<itemizedlist>
<listitem><para>These tests build an SDK, install it, and then run tests against that SDK. </para></listitem>
<listitem><para>The code for these tests resides in <filename>meta/lib/oeqa/sdk/cases/</filename>. </para></listitem>
<listitem><para>Run the test using the following command form:
<literallayout class="monospaced">
$ bitbake <replaceable>image</replaceable> -c testsdk
</literallayout>
</para></listitem>
</itemizedlist>
</para></listitem>
<listitem><para>
<emphasis>testsdk_ext:</emphasis>
<itemizedlist>
<listitem><para>These tests build an extended SDK (eSDK), install that eSDK, and run tests against the eSDK. </para></listitem>
<listitem><para>The code for these tests resides in <filename>meta/lib/oeqa/esdk</filename>. </para></listitem>
<listitem><para>To run the tests, use the following command form:
<literallayout class="monospaced">
$ bitbake <replaceable>image</replaceable> -c testsdkext
</literallayout>
</para></listitem>
</itemizedlist>
</para></listitem>
<listitem><para>
<emphasis>oe-build-perf-test:</emphasis>
<itemizedlist>
<listitem><para>These tests run through commonly used usage scenarios and measure the performance times. </para></listitem>
<listitem><para>The code for these tests resides in <filename>meta/lib/oeqa/buildperf</filename>. </para></listitem>
<listitem><para>To run the tests, use the following command form:
<literallayout class="monospaced">
$ oe-build-perf-test <replaceable>options</replaceable>
</literallayout>The
command takes a number of options, such as where to place the
test results. The Autobuilder Helper Scripts include the
<filename>build-perf-test-wrapper</filename> script with
examples of how to use the oe-build-perf-test from the command
line.</para>
<para>Use the <filename>oe-git-archive</filename> command to store
test results into a Git repository. </para>
<para>Use the <filename>oe-build-perf-report</filename> command to
generate text reports and HTML reports with graphs of the
performance data. For examples, see <link linkend=""
>http://downloads.yoctoproject.org/releases/yocto/yocto-2.7/testresults/buildperf-centos7/perf-centos7.yoctoproject.org_warrior_20190414204758_0e39202.html</link>
and <link linkend=""
>http://downloads.yoctoproject.org/releases/yocto/yocto-2.7/testresults/buildperf-centos7/perf-centos7.yoctoproject.org_warrior_20190414204758_0e39202.txt</link>.</para></listitem>
<listitem>
<para>The tests are contained in
<filename>lib/oeqa/buildperf/test_basic.py</filename>.</para>
</listitem>
</itemizedlist>
</para></listitem>
</itemizedlist>
</para>
</section>
<section id='test-examples'>
<title>Test Examples</title>
<para>This section provides example tests for each of the tests listed in the <link
linkend="test-test-mapping">How Tests Map to Areas of Code</link> section. </para>
<para>For oeqa tests, testcases for each area reside in the main test directory at
<filename>meta/lib/oeqa/selftest/cases</filename>.</para>
<para>For bitbake-selftest, testcases reside in the <filename>lib/bb/tests/</filename>
directory. </para>
<section id='bitbake-selftest-example'>
<title><filename>bitbake-selftest</filename></title>
<para>A simple test example from <filename>lib/bb/tests/data.py</filename> is:
<literallayout>
class DataExpansions(unittest.TestCase):
def setUp(self):
self.d = bb.data.init()
self.d["foo"] = "value_of_foo"
self.d["bar"] = "value_of_bar"
self.d["value_of_foo"] = "value_of_'value_of_foo'"
def test_one_var(self):
val = self.d.expand("${foo}")
self.assertEqual(str(val), "value_of_foo")
</literallayout>
</para>
<para>In this example, a <ulink url=""><filename>DataExpansions</filename></ulink> class
of tests is created, derived from standard Python unittest. The class has a common
<filename>setUp</filename> function which is shared by all the tests in the
class. A simple test is then added to test that when a variable is expanded, the
correct value is found.</para>
<para>BitBake selftests are straightforward Python unittest test cases. Refer to the Python
unittest documentation for additional information on writing these tests at: <link
linkend="">https://docs.python.org/3/library/unittest.html</link>.</para>
</section>
<section id='oe-selftest-example'>
<title><filename>oe-selftest</filename></title>
<para>These tests are more complex due to the setup required behind the scenes for full
builds. Rather than directly using Python's unittest, the code wraps most of the
standard objects. The tests can be simple, such as testing a command from within the
OE build environment using the following
example:<literallayout>
class BitbakeLayers(OESelftestTestCase):
def test_bitbakelayers_showcrossdepends(self):
result = runCmd('bitbake-layers show-cross-depends')
self.assertTrue('aspell' in result.output, msg = "No dependencies
were shown. bitbake-layers show-cross-depends output:
%s"% result.output)
</literallayout></para>
<para>This example, taken from
<filename>meta/lib/oeqa/selftest/cases/bblayers.py</filename>, creates a
testcase from the <ulink url=""><filename>OESelftestTestCase</filename></ulink>
class, derived from <filename>unittest.TestCase</filename>, which runs the
<filename>bitbake-layers</filename> command and checks the output to ensure it
contains something we know should be there.</para>
<para>The <filename>oeqa.utils.commands</filename> module contains Helpers which can
assist with common tasks, including:<itemizedlist>
<listitem>
<para><emphasis>Obtaining the value of a bitbake variable:</emphasis> Use
<filename>oeqa.utils.commands.get_bb_var()</filename> or use
<filename>oeqa.utils.commands.get_bb_vars()</filename> for more than
one variable</para>
</listitem>
<listitem>
<para><emphasis>Running a bitbake invocation for a build:</emphasis> Use
<filename>oeqa.utils.commands.bitbake()</filename></para>
</listitem>
<listitem>
<para><emphasis>Running a command:</emphasis> Use
<filename>oeqa.utils.commands.runCmd()</filename></para>
</listitem>
</itemizedlist></para>
<para>There is also an <filename>oeqa.utils.commands.runqemu()</filename> function for
launching the <filename>runqemu</filename> command for testing things within a
running, virtualized image.</para>
<para>You can run these tests in parallel. Parallelism works per test class, so tests
within a given test class should always run in the same build, while tests in
different classes or modules may be split into different builds. There is no data
store available for these tests since the tests launch the
<filename>bitbake</filename> command and exist outside of its context. As a
result, common bitbake library functions (bb.*) are also unavailable.</para>
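<para>The following is a hypothetical sketch (not a test from the commit) showing how the
helpers listed above fit together in practice; the class name, test name, and the recipe
chosen (<filename>quilt-native</filename>) are illustrative only:<literallayout>
    import os
    from oeqa.selftest.case import OESelftestTestCase
    from oeqa.utils.commands import bitbake, get_bb_var

    class TmpdirExists(OESelftestTestCase):
        def test_tmpdir_exists_after_build(self):
            # Build a small target, then query the datastore via the helper
            bitbake('quilt-native')
            tmpdir = get_bb_var('TMPDIR')
            self.assertTrue(os.path.isdir(tmpdir), msg='TMPDIR %s is missing' % tmpdir)
</literallayout></para>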
</section>
<section id='testimage-example'>
<title><filename>testimage</filename></title>
<para>These tests are run once an image is up and running, either on target hardware or
under QEMU. As a result, they are assumed to be running in a target image
environment, as opposed to a host build environment. A simple example from
<filename>meta/lib/oeqa/runtime/cases/python.py</filename> contains the
following:<literallayout>
class PythonTest(OERuntimeTestCase):
@OETestDepends(['ssh.SSHTest.test_ssh'])
@OEHasPackage(['python3-core'])
def test_python3(self):
cmd = "python3 -c \"import codecs; print(codecs.encode('Uryyb,
jbeyq', 'rot13'))\""
status, output = self.target.run(cmd)
msg = 'Exit status was not 0. Output: %s' % output
self.assertEqual(status, 0, msg=msg)
</literallayout></para>
<para>In this example, the <ulink url=""><filename>OERuntimeTestCase</filename></ulink>
class wraps <filename>unittest.TestCase</filename>. Within the test,
<filename>self.target</filename> represents the target system, where commands
can be run using the <filename>run()</filename> method. </para>
<para>To ensure certain test or package dependencies are met, you can use the
<filename>OETestDepends</filename> and <filename>OEHasPackage</filename>
decorators. For example, the test shown here would only make sense if
python3-core is installed in the image.</para>
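<para>As another hypothetical illustration (not part of the commit), a runtime test following
the same pattern could check a basic shell command on the target; the class and test
names here are invented for the example:<literallayout>
    from oeqa.runtime.case import OERuntimeTestCase
    from oeqa.core.decorator.depends import OETestDepends

    class UnameTest(OERuntimeTestCase):
        @OETestDepends(['ssh.SSHTest.test_ssh'])
        def test_uname(self):
            # self.target.run() executes the command on the running image
            status, output = self.target.run('uname -a')
            self.assertEqual(status, 0, msg='uname failed: %s' % output)
</literallayout></para>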
</section>
<section id='testsdk_ext-example'>
<title><filename>testsdk_ext</filename></title>
<para>These tests are run against built extensible SDKs (eSDKs). The tests can assume
that the eSDK environment has already been set up. An example from
<filename>meta/lib/oeqa/sdk/cases/devtool.py</filename> contains the
following:<literallayout>
class DevtoolTest(OESDKExtTestCase):
@classmethod
def setUpClass(cls):
myapp_src = os.path.join(cls.tc.esdk_files_dir, "myapp")
cls.myapp_dst = os.path.join(cls.tc.sdk_dir, "myapp")
shutil.copytree(myapp_src, cls.myapp_dst)
subprocess.check_output(['git', 'init', '.'], cwd=cls.myapp_dst)
subprocess.check_output(['git', 'add', '.'], cwd=cls.myapp_dst)
subprocess.check_output(['git', 'commit', '-m', "'test commit'"], cwd=cls.myapp_dst)
@classmethod
def tearDownClass(cls):
shutil.rmtree(cls.myapp_dst)
def _test_devtool_build(self, directory):
self._run('devtool add myapp %s' % directory)
try:
self._run('devtool build myapp')
finally:
self._run('devtool reset myapp')
def test_devtool_build_make(self):
self._test_devtool_build(self.myapp_dst)
</literallayout>In
this example, the <filename>devtool</filename> command is tested to see whether a
sample application can be built with the <filename>devtool build</filename> command
within the eSDK.</para>
</section>
<section id='testsdk-example'>
<title><filename>testsdk</filename></title>
<para>These tests are run against built SDKs. The tests can assume that an SDK has
already been extracted and its environment file has been sourced. A simple example
from <filename>meta/lib/oeqa/sdk/cases/python2.py</filename> contains the
following:<literallayout>
class Python3Test(OESDKTestCase):
def setUp(self):
if not (self.tc.hasHostPackage("nativesdk-python3-core") or
self.tc.hasHostPackage("python3-core-native")):
raise unittest.SkipTest("No python3 package in the SDK")
def test_python3(self):
cmd = "python3 -c \"import codecs; print(codecs.encode('Uryyb, jbeyq', 'rot13'))\""
output = self._run(cmd)
self.assertEqual(output, "Hello, world\n")
</literallayout>In
this example, if nativesdk-python3-core has been installed into the SDK, the code
runs the python3 interpreter with a basic command to check it is working correctly.
The test would only run if python3 is installed in the SDK.</para>
</section>
<section id='oe-build-perf-test-example'>
<title><filename>oe-build-perf-test</filename></title>
<para>The performance tests usually measure how long operations take and the resource
utilisation as that happens. An example from
<filename>meta/lib/oeqa/buildperf/test_basic.py</filename> contains the
following:<literallayout>
class Test3(BuildPerfTestCase):
def test3(self):
"""Bitbake parsing (bitbake -p)"""
# Drop all caches and parse
self.rm_cache()
oe.path.remove(os.path.join(self.bb_vars['TMPDIR'], 'cache'), True)
self.measure_cmd_resources(['bitbake', '-p'], 'parse_1',
'bitbake -p (no caches)')
# Drop tmp/cache
oe.path.remove(os.path.join(self.bb_vars['TMPDIR'], 'cache'), True)
self.measure_cmd_resources(['bitbake', '-p'], 'parse_2',
'bitbake -p (no tmp/cache)')
# Parse with fully cached data
self.measure_cmd_resources(['bitbake', '-p'], 'parse_3',
'bitbake -p (cached)')
</literallayout>This
example measures three specific parsing timings, with and without
various caches, to show how BitBake's parsing performance trends over time.</para>
</section>
</section>
<section id='test-writing-considerations'>
<title>Considerations When Writing Tests</title>
<para>When writing good tests, there are several things to keep in mind. Since things
running on the Autobuilder are accessed concurrently by multiple workers, consider the
following:</para>
<formalpara>
<title>Running "cleanall" is not permitted</title>
<para>This can delete files from DL_DIR which would potentially break other builds
running in parallel. If this is required, DL_DIR must be set to an isolated
directory.</para>
</formalpara>
<formalpara>
<title>Running "cleansstate" is not permitted</title>
<para>This can delete files from SSTATE_DIR which would potentially break other builds
running in parallel. If this is required, SSTATE_DIR must be set to an isolated
directory. Alternatively, you can use the "-f" option with the
<filename>bitbake</filename> command to "taint" tasks by changing the sstate
checksums to ensure sstate cache items will not be reused.</para>
</formalpara>
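<para>For example, a hypothetical oe-selftest test (not part of the commit) that needs to run
a destructive task could isolate its download directory first, so parallel builds on
other workers are unaffected. The class, recipe, and directory names below are
illustrative:<literallayout>
    import os
    from oeqa.selftest.case import OESelftestTestCase
    from oeqa.utils.commands import bitbake

    class IsolatedClean(OESelftestTestCase):
        def test_cleanall_with_isolated_dl_dir(self):
            # Point DL_DIR at a private location inside the test's build directory
            isolated = os.path.join(self.builddir, 'isolated-downloads')
            self.write_config('DL_DIR = "%s"\n' % isolated)
            bitbake('quilt-native')
            # Safe now: cleanall only touches the isolated DL_DIR
            bitbake('quilt-native -c cleanall')
</literallayout></para>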
<formalpara>
<title>Tests should not change the metadata</title>
<para>This is particularly true for oe-selftests since these can run in parallel and
changing metadata leads to changing checksums, which confuses BitBake while running
in parallel. If this is necessary, copy layers to a temporary location and modify
them. Some tests need to change metadata, such as the devtool tests. To prevent the
metadata from changing, set up temporary copies of that data first.</para>
</formalpara>
</section>
</chapter>
<!--
vim: expandtab tw=80 ts=4
-->

@@ -0,0 +1,989 @@
/*
Generic XHTML / DocBook XHTML CSS Stylesheet.
Browser wrangling and typographic design by
Oyvind Kolas / pippin@gimp.org
Customised for Poky by
Matthew Allum / mallum@o-hand.com
Thanks to:
Liam R. E. Quin
William Skaggs
Jakub Steiner
Structure
---------
The stylesheet is divided into the following sections:
Positioning
Margins, paddings, width, font-size, clearing.
Decorations
Borders, style
Colors
Colors
Graphics
Graphical backgrounds
Nasty IE tweaks
Workarounds needed to make it work in internet explorer,
currently makes the stylesheet non validating, but up until
this point it is validating.
Mozilla extensions
Transparency for footer
Rounded corners on boxes
*/
/*************** /
/ Positioning /
/ ***************/
body {
font-family: Verdana, Sans, sans-serif;
min-width: 640px;
width: 80%;
margin: 0em auto;
padding: 2em 5em 5em 5em;
color: #333;
}
h1,h2,h3,h4,h5,h6,h7 {
font-family: Arial, Sans;
color: #00557D;
clear: both;
}
h1 {
font-size: 2em;
text-align: left;
padding: 0em 0em 0em 0em;
margin: 2em 0em 0em 0em;
}
h2.subtitle {
margin: 0.10em 0em 3.0em 0em;
padding: 0em 0em 0em 0em;
font-size: 1.8em;
padding-left: 20%;
font-weight: normal;
font-style: italic;
}
h2 {
margin: 2em 0em 0.66em 0em;
padding: 0.5em 0em 0em 0em;
font-size: 1.5em;
font-weight: bold;
}
h3.subtitle {
margin: 0em 0em 1em 0em;
padding: 0em 0em 0em 0em;
font-size: 142.14%;
text-align: right;
}
h3 {
margin: 1em 0em 0.5em 0em;
padding: 1em 0em 0em 0em;
font-size: 140%;
font-weight: bold;
}
h4 {
margin: 1em 0em 0.5em 0em;
padding: 1em 0em 0em 0em;
font-size: 120%;
font-weight: bold;
}
h5 {
margin: 1em 0em 0.5em 0em;
padding: 1em 0em 0em 0em;
font-size: 110%;
font-weight: bold;
}
h6 {
margin: 1em 0em 0em 0em;
padding: 1em 0em 0em 0em;
font-size: 110%;
font-weight: bold;
}
.authorgroup {
background-color: transparent;
background-repeat: no-repeat;
padding-top: 256px;
background-image: url("figures/test-manual-title.png");
background-position: left top;
margin-top: -256px;
padding-right: 50px;
margin-left: 0px;
text-align: right;
width: 740px;
}
h3.author {
margin: 0em 0em 0em 0em;
padding: 0em 0em 0em 0em;
font-weight: normal;
font-size: 100%;
color: #333;
clear: both;
}
.author tt.email {
font-size: 66%;
}
.titlepage hr {
width: 0em;
clear: both;
}
.revhistory {
padding-top: 2em;
clear: both;
}
.toc,
.list-of-tables,
.list-of-examples,
.list-of-figures {
padding: 1.33em 0em 2.5em 0em;
color: #00557D;
}
.toc p,
.list-of-tables p,
.list-of-figures p,
.list-of-examples p {
padding: 0em 0em 0em 0em;
padding: 0em 0em 0.3em;
margin: 1.5em 0em 0em 0em;
}
.toc p b,
.list-of-tables p b,
.list-of-figures p b,
.list-of-examples p b{
font-size: 100.0%;
font-weight: bold;
}
.toc dl,
.list-of-tables dl,
.list-of-figures dl,
.list-of-examples dl {
margin: 0em 0em 0.5em 0em;
padding: 0em 0em 0em 0em;
}
.toc dt {
margin: 0em 0em 0em 0em;
padding: 0em 0em 0em 0em;
}
.toc dd {
margin: 0em 0em 0em 2.6em;
padding: 0em 0em 0em 0em;
}
div.glossary dl,
div.variablelist dl {
}
.glossary dl dt,
.variablelist dl dt,
.variablelist dl dt span.term {
font-weight: normal;
width: 20em;
text-align: right;
}
.variablelist dl dt {
margin-top: 0.5em;
}
.glossary dl dd,
.variablelist dl dd {
margin-top: 0em;
margin-left: 25.5em;
}
.glossary dd p,
.variablelist dd p {
margin-top: 0em;
margin-bottom: 1em;
}
div.calloutlist table td {
padding: 0em 0em 0em 0em;
margin: 0em 0em 0em 0em;
}
div.calloutlist table td p {
margin-top: 0em;
margin-bottom: 1em;
}
div p.copyright {
text-align: left;
}
div.legalnotice p.legalnotice-title {
margin-bottom: 0em;
}
p {
line-height: 1.5em;
margin-top: 0em;
}
dl {
padding-top: 0em;
}
hr {
border: solid 1px;
}
.mediaobject,
.mediaobjectco {
text-align: center;
}
img {
border: none;
}
ul {
padding: 0em 0em 0em 1.5em;
}
ul li {
padding: 0em 0em 0em 0em;
}
ul li p {
text-align: left;
}
table {
width: 100%;
}
th {
padding: 0.25em;
text-align: left;
font-weight: normal;
vertical-align: top;
}
td {
padding: 0.25em;
vertical-align: top;
}
p a[id] {
margin: 0px;
padding: 0px;
display: inline;
background-image: none;
}
a {
text-decoration: underline;
color: #444;
}
pre {
overflow: auto;
}
a:hover {
text-decoration: underline;
/*font-weight: bold;*/
}
/* This style defines how the permalink character
appears by itself and when hovered over with
the mouse. */
[alt='Permalink'] { color: #eee; }
[alt='Permalink']:hover { color: black; }
div.informalfigure,
div.informalexample,
div.informaltable,
div.figure,
div.table,
div.example {
margin: 1em 0em;
padding: 1em;
page-break-inside: avoid;
}
div.informalfigure p.title b,
div.informalexample p.title b,
div.informaltable p.title b,
div.figure p.title b,
div.example p.title b,
div.table p.title b{
padding-top: 0em;
margin-top: 0em;
font-size: 100%;
font-weight: normal;
}
.mediaobject .caption,
.mediaobject .caption p {
text-align: center;
font-size: 80%;
padding-top: 0.5em;
padding-bottom: 0.5em;
}
.epigraph {
padding-left: 55%;
margin-bottom: 1em;
}
.epigraph p {
text-align: left;
}
.epigraph .quote {
font-style: italic;
}
.epigraph .attribution {
font-style: normal;
text-align: right;
}
span.application {
font-style: italic;
}
.programlisting {
font-family: monospace;
font-size: 80%;
white-space: pre;
margin: 1.33em 0em;
padding: 1.33em;
}
.tip,
.warning,
.caution,
.note {
margin-top: 1em;
margin-bottom: 1em;
}
/* force full width of table within div */
.tip table,
.warning table,
.caution table,
.note table {
border: none;
width: 100%;
}
.tip table th,
.warning table th,
.caution table th,
.note table th {
padding: 0.8em 0.0em 0.0em 0.0em;
margin : 0em 0em 0em 0em;
}
.tip p,
.warning p,
.caution p,
.note p {
margin-top: 0.5em;
margin-bottom: 0.5em;
padding-right: 1em;
text-align: left;
}
.acronym {
text-transform: uppercase;
}
b.keycap,
.keycap {
padding: 0.09em 0.3em;
margin: 0em;
}
.itemizedlist li {
clear: none;
}
.filename {
font-size: medium;
font-family: Courier, monospace;
}
div.navheader, div.heading{
position: absolute;
left: 0em;
top: 0em;
width: 100%;
background-color: #cdf;
width: 100%;
}
div.navfooter, div.footing{
position: fixed;
left: 0em;
bottom: 0em;
background-color: #eee;
width: 100%;
}
div.navheader td,
div.navfooter td {
font-size: 66%;
}
div.navheader table th {
/*font-family: Georgia, Times, serif;*/
/*font-size: x-large;*/
font-size: 80%;
}
div.navheader table {
border-left: 0em;
border-right: 0em;
border-top: 0em;
width: 100%;
}
div.navfooter table {
border-left: 0em;
border-right: 0em;
border-bottom: 0em;
width: 100%;
}
div.navheader table td a,
div.navfooter table td a {
color: #777;
text-decoration: none;
}
/* normal text in the footer */
div.navfooter table td {
color: black;
}
div.navheader table td a:visited,
div.navfooter table td a:visited {
color: #444;
}
/* links in header and footer */
div.navheader table td a:hover,
div.navfooter table td a:hover {
text-decoration: underline;
background-color: transparent;
color: #33a;
}
div.navheader hr,
div.navfooter hr {
display: none;
}
.qandaset tr.question td p {
margin: 0em 0em 1em 0em;
padding: 0em 0em 0em 0em;
}
.qandaset tr.answer td p {
margin: 0em 0em 1em 0em;
padding: 0em 0em 0em 0em;
}
.answer td {
padding-bottom: 1.5em;
}
.emphasis {
font-weight: bold;
}
/************* /
/ decorations /
/ *************/
.titlepage {
}
.part .title {
}
.subtitle {
border: none;
}
/*
h1 {
border: none;
}
h2 {
border-top: solid 0.2em;
border-bottom: solid 0.06em;
}
h3 {
border-top: 0em;
border-bottom: solid 0.06em;
}
h4 {
border: 0em;
border-bottom: solid 0.06em;
}
h5 {
border: 0em;
}
*/
.programlisting {
border: solid 1px;
}
div.figure,
div.table,
div.informalfigure,
div.informaltable,
div.informalexample,
div.example {
border: 1px solid;
}
.tip,
.warning,
.caution,
.note {
border: 1px solid;
}
.tip table th,
.warning table th,
.caution table th,
.note table th {
border-bottom: 1px solid;
}
.question td {
border-top: 1px solid black;
}
.answer {
}
b.keycap,
.keycap {
border: 1px solid;
}
div.navheader, div.heading{
border-bottom: 1px solid;
}
div.navfooter, div.footing{
border-top: 1px solid;
}
/********* /
/ colors /
/ *********/
body {
color: #333;
background: white;
}
a {
background: transparent;
}
a:hover {
background-color: #dedede;
}
h1,
h2,
h3,
h4,
h5,
h6,
h7,
h8 {
background-color: transparent;
}
hr {
border-color: #aaa;
}
.tip, .warning, .caution, .note {
border-color: #fff;
}
.tip table th,
.warning table th,
.caution table th,
.note table th {
border-bottom-color: #fff;
}
.warning {
background-color: #f0f0f2;
}
.caution {
background-color: #f0f0f2;
}
.tip {
background-color: #f0f0f2;
}
.note {
background-color: #f0f0f2;
}
.glossary dl dt,
.variablelist dl dt,
.variablelist dl dt span.term {
color: #044;
}
div.figure,
div.table,
div.example,
div.informalfigure,
div.informaltable,
div.informalexample {
border-color: #aaa;
}
pre.programlisting {
color: black;
background-color: #fff;
border-color: #aaa;
border-width: 2px;
}
.guimenu,
.guilabel,
.guimenuitem {
background-color: #eee;
}
b.keycap,
.keycap {
background-color: #eee;
border-color: #999;
}
div.navheader {
border-color: black;
}
div.navfooter {
border-color: black;
}
.writernotes {
color: red;
}
/*********** /
/ graphics /
/ ***********/
/*
body {
background-image: url("images/body_bg.jpg");
background-attachment: fixed;
}
.navheader,
.note,
.tip {
background-image: url("images/note_bg.jpg");
background-attachment: fixed;
}
.warning,
.caution {
background-image: url("images/warning_bg.jpg");
background-attachment: fixed;
}
.figure,
.informalfigure,
.example,
.informalexample,
.table,
.informaltable {
background-image: url("images/figure_bg.jpg");
background-attachment: fixed;
}
*/
h1,
h2,
h3,
h4,
h5,
h6,
h7{
}
/*
Example of how to stick an image as part of the title.
div.article .titlepage .title
{
background-image: url("figures/white-on-black.png");
background-position: center;
background-repeat: repeat-x;
}
*/
div.preface .titlepage .title,
div.colophon .title,
div.chapter .titlepage .title,
div.article .titlepage .title
{
}
div.section div.section .titlepage .title,
div.sect2 .titlepage .title {
background: none;
}
h1.title {
background-color: transparent;
background-image: url("figures/test-title.png");
background-repeat: no-repeat;
height: 256px;
text-indent: -9000px;
overflow:hidden;
}
h2.subtitle {
background-color: transparent;
text-indent: -9000px;
overflow:hidden;
width: 0px;
display: none;
}
/*************************************** /
/ pippin.gimp.org specific alterations /
/ ***************************************/
/*
div.heading, div.navheader {
color: #777;
font-size: 80%;
padding: 0;
margin: 0;
text-align: left;
position: absolute;
top: 0px;
left: 0px;
width: 100%;
height: 50px;
background: url('/gfx/heading_bg.png') transparent;
background-repeat: repeat-x;
background-attachment: fixed;
border: none;
}
div.heading a {
color: #444;
}
div.footing, div.navfooter {
border: none;
color: #ddd;
font-size: 80%;
text-align:right;
width: 100%;
padding-top: 10px;
position: absolute;
bottom: 0px;
left: 0px;
background: url('/gfx/footing_bg.png') transparent;
}
*/
/****************** /
/ nasty ie tweaks /
/ ******************/
/*
div.heading, div.navheader {
width:expression(document.body.clientWidth + "px");
}
div.footing, div.navfooter {
width:expression(document.body.clientWidth + "px");
margin-left:expression("-5em");
}
body {
padding:expression("4em 5em 0em 5em");
}
*/
/**************************************** /
/ mozilla vendor specific css extensions /
/ ****************************************/
/*
div.navfooter, div.footing{
-moz-opacity: 0.8em;
}
div.figure,
div.table,
div.informalfigure,
div.informaltable,
div.informalexample,
div.example,
.tip,
.warning,
.caution,
.note {
-moz-border-radius: 0.5em;
}
b.keycap,
.keycap {
-moz-border-radius: 0.3em;
}
*/
table tr td table tr td {
display: none;
}
hr {
display: none;
}
table {
border: 0em;
}
.photo {
float: right;
margin-left: 1.5em;
margin-bottom: 1.5em;
margin-top: 0em;
max-width: 17em;
border: 1px solid gray;
padding: 3px;
background: white;
}
.seperator {
padding-top: 2em;
clear: both;
}
#validators {
margin-top: 5em;
text-align: right;
color: #777;
}
@media print {
body {
font-size: 8pt;
}
.noprint {
display: none;
}
}
.tip,
.note {
background: #f0f0f2;
color: #333;
padding: 20px;
margin: 20px;
}
.tip h3,
.note h3 {
padding: 0em;
margin: 0em;
font-size: 2em;
font-weight: bold;
color: #333;
}
.tip a,
.note a {
color: #333;
text-decoration: underline;
}
.footnote {
font-size: small;
color: #333;
}
/* Changes the announcement text */
.tip h3,
.warning h3,
.caution h3,
.note h3 {
font-size:large;
color: #00557D;
}

@@ -0,0 +1,109 @@
<!DOCTYPE chapter PUBLIC "-//OASIS//DTD DocBook XML V4.2//EN"
"http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd"
[<!ENTITY % poky SYSTEM "../poky.ent"> %poky; ] >
<chapter id='test-manual-test-process'>
<title>Project Testing and Release Process</title>
<section id='test-daily-devel'>
<title>Day to Day Development</title>
<para>This section details how the project tests changes, through automation on the
Autobuilder or with the assistance of QA teams, through to making releases.</para>
<para>The project aims to test changes against our test matrix before those changes are
merged into the master branch. As such, changes are queued up in batches either in the
<filename>master-next</filename> branch in the main trees, or in user trees such as
<filename>ross/mut</filename> in <filename>poky-contrib</filename> (Ross Burton
helps review and test patches and this is his testing tree).</para>
<para>We have two broad categories of test builds, including "full" and "quick". On the
Autobuilder, these can be seen as "a-quick" and "a-full", simply for ease of sorting in
the UI. Use our Autobuilder console view to see where we manage most test-related items,
available at: <link linkend=""
>https://autobuilder.yoctoproject.org/typhoon/#/console</link>.</para>
<para>Builds are triggered manually when the test branches are ready. The builds are
monitored by the SWAT team. For additional information, see <link linkend=""
>https://wiki.yoctoproject.org/wiki/Yocto_Build_Failure_Swat_Team</link>. If
successful, the changes would usually be merged to the <filename>master</filename>
branch. If not successful, someone would respond to the changes on the mailing list
explaining that there was a failure in testing. The choice of quick or full would depend
on the type of changes and the speed with which the result was required.</para>
<para>The Autobuilder does build the <filename>master</filename> branch once daily for
several reasons, in particular, to ensure the current <filename>master</filename> branch
does build, but also to keep <filename>yocto-testresults</filename> (<link linkend=""
>http://git.yoctoproject.org/cgit.cgi/yocto-testresults/</link>), buildhistory
(<link linkend="">http://git.yoctoproject.org/cgit.cgi/poky-buildhistory/</link>),
and our sstate up to date. On the weekend, there is a master-next build instead to
ensure the test results are updated for the less frequently run targets.</para>
<para>Performance builds (buildperf-* targets in the console) are triggered separately every
six hours and automatically push their results to the buildstats repository at: <link
linkend="">http://git.yoctoproject.org/cgit.cgi/yocto-buildstats/</link>. </para>
<para>The 'quick' targets have been selected to be the ones which catch the most failures or
give the most valuable data. We run 'fast' ptests in this case for example but not the
ones which take a long time. The quick target doesn't include *-lsb builds for all
architectures or some of the world builds, and it doesn't trigger performance tests or LTP testing.
The full build includes all these things and is slower but more comprehensive.</para>
</section>
<section id='test-yocto-project-autobuilder-overview'>
<title>Release Builds</title>
<para>The project typically has two major releases a year with a six month cadence in April
and October. Between these there are a number of milestone releases (usually four),
with the final one being stabilization only, along with point releases of our stable
branches.</para>
<para>The build and release process for these project releases is similar to that in <link
linkend="test-daily-devel">Day to Day Development</link>, in that the a-full target
of the Autobuilder is used but in addition the form is configured to generate and
publish artefacts and the milestone number, version, release candidate number and other
information is entered. The box to "generate an email to QA"is also checked.</para>
<para>When the build completes, an email is sent out using the send-qa-email script in the
<filename>yocto-autobuilder-helper</filename> repository to the list of people
configured for that release. Release builds are placed into a directory in <link
linkend="">https://autobuilder.yocto.io/pub/releases</link> on the Autobuilder which
is included in the email. The process from here is more manual and control is
effectively passed to release engineering. The next steps include:<itemizedlist>
<listitem>
<para>QA teams respond to the email saying which tests they plan to run and when
the results will be available.</para>
</listitem>
<listitem>
<para>QA teams run their tests and share their results in the
yocto-testresults-contrib repository, along with a summary of their findings.
</para>
</listitem>
<listitem>
<para>Release engineering prepare the release as per their process. </para>
</listitem>
<listitem>
<para>Test results from the QA teams are included into the release in separate
directories and also uploaded to the yocto-testresults repository alongside
the other test results for the given revision.</para>
</listitem>
<listitem>
<para>The QA report in the final release is regenerated using resulttool to
include the new test results and the test summaries from the teams (as
headers to the generated report).</para>
</listitem>
<listitem>
<para>The release is checked against the release checklist and release readiness
criteria.</para>
</listitem>
<listitem>
<para>A final decision on whether to release is made by the YP TSC who have
final oversight on release readiness.</para>
</listitem>
</itemizedlist></para>
</section>
</chapter>
<!--
vim: expandtab tw=80 ts=4
-->

@@ -0,0 +1,312 @@
<!DOCTYPE chapter PUBLIC "-//OASIS//DTD DocBook XML V4.2//EN"
"http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd"
[<!ENTITY % poky SYSTEM "../poky.ent"> %poky; ] >
<chapter id='test-manual-understand-autobuilder'>
<title>Understanding the Yocto Project Autobuilder</title>
<section>
<title>Execution Flow within the Autobuilder</title>
<para>The “a-full” and “a-quick” targets are the usual entry points into the Autobuilder and
it makes sense to follow the process through the system starting there. This is best
visualised from the Autobuilder Console view (<link linkend=""
>https://autobuilder.yoctoproject.org/typhoon/#/console</link>). </para>
<para>Each item along the top of that view represents some “target build” and these targets
are all run in parallel. The full build will trigger the majority of them, while the “quick”
build will trigger some subset of them. The Autobuilder effectively runs whichever
configuration is defined for each of those targets on a separate Buildbot worker. To
understand the configuration, you need to look at the entry in the
<filename>config.json</filename> file within the
<filename>yocto-autobuilder-helper</filename> repository. The targets are defined in
the overrides section, a quick example could be qemux86-64 which looks
like:<literallayout>
"qemux86-64" : {
"MACHINE" : "qemux86-64",
"TEMPLATE" : "arch-qemu",
"step1" : {
"extravars" : [
"IMAGE_FSTYPES_append = ' wic wic.bmap'"
]
}
},
</literallayout>And
to expand that, you need the “arch-qemu” entry from the “templates” section, which looks
like:<literallayout>
"arch-qemu" : {
"BUILDINFO" : true,
"BUILDHISTORY" : true,
"step1" : {
"BBTARGETS" : "core-image-sato core-image-sato-dev core-image-sato-sdk core-image-minimal core-image-minimal-dev core-image-sato:do_populate_sdk",
"SANITYTARGETS" : "core-image-minimal:do_testimage core-image-sato:do_testimage core-image-sato-sdk:do_testimage core-image-sato:do_testsdk"
},
"step2" : {
"SDKMACHINE" : "x86_64",
"BBTARGETS" : "core-image-sato:do_populate_sdk core-image-minimal:do_populate_sdk_ext core-image-sato:do_populate_sdk_ext",
"SANITYTARGETS" : "core-image-sato:do_testsdk core-image-minimal:do_testsdkext core-image-sato:do_testsdkext"
},
"step3" : {
"BUILDHISTORY" : false,
"EXTRACMDS" : ["${SCRIPTSDIR}/checkvnc; DISPLAY=:1 oe-selftest ${HELPERSTMACHTARGS} -j 15"],
"ADDLAYER" : ["${BUILDDIR}/../meta-selftest"]
}
},
</literallayout>Combining
these two entries you can see that “qemux86-64” is a three step build where the “bitbake
BBTARGETS” would be run, then “bitbake SANITYTARGETS” for each step; all for
MACHINE=”qemux86-64” but with differing SDKMACHINE settings. In step 1 an extra variable
is added to the <filename>auto.conf</filename> file to enable wic image
generation.</para>
<para>While not every detail of this is covered here, you can see how the templating
mechanism allows quite complex configurations to be built up while keeping duplication and
repetition to a minimum.</para>
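<para>As a rough illustration of that combination (this is not the actual Helper code), the
following sketch shows how a target entry and its template could be merged; the
<filename>expand_target()</filename> function is invented for the example:<literallayout>
    import json

    def expand_target(config, target):
        # Start from the override entry, then pull in its template
        entry = dict(config["overrides"][target])
        template = config["templates"].get(entry.pop("TEMPLATE", ""), {})
        merged = dict(template)
        for key, value in entry.items():
            if isinstance(value, dict) and isinstance(merged.get(key), dict):
                merged[key] = {**merged[key], **value}  # merge per-step settings
            else:
                merged[key] = value                     # target-specific value wins
        return merged

    with open("config.json") as f:
        print(json.dumps(expand_target(json.load(f), "qemux86-64"), indent=2))
</literallayout></para>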
<para>The different build targets are designed to allow for parallelisation, so different
machines are usually built in parallel, while operations using the same machine and metadata
are run sequentially, with the aim of optimising build efficiency as much as
possible.</para>
<para>The <filename>config.json</filename> file is processed by the scripts in the Helper
repository in the <filename>scripts</filename> directory. The following section details
how this works.</para>
</section>
<section id='test-autobuilder-target-exec-overview'>
<title>Autobuilder Target Execution Overview</title>
<para>For each given target in a build, the Autobuilder executes several steps. These are
configured in <filename>yocto-autobuilder2/builders.py</filename> and roughly consist
of: <orderedlist>
<listitem id='test-list-tgt-exec-clobberdir'>
<para><emphasis>Run <filename>clobberdir</filename></emphasis></para>
<para>This cleans out any previous build. Old builds are left around to allow
easier debugging of failed builds. For additional information, see <link
linkend="test-clobberdir"><filename>clobberdir</filename></link>.</para>
</listitem>
<listitem>
<para><emphasis>Obtain yocto-autobuilder-helper</emphasis></para>
<para>This step clones the <filename>yocto-autobuilder-helper</filename> git
repository. This is necessary to prevent the requirement to maintain all the
release or project-specific code within Buildbot. The branch chosen matches
the release being built so we can support older releases and still make
changes in newer ones.</para>
</listitem>
<listitem>
<para><emphasis>Write layerinfo.json</emphasis></para>
<para>This transfers the data entered in the Buildbot UI when the build was configured
over to the Helper scripts.</para>
</listitem>
<listitem>
<para><emphasis>Call scripts/shared-repo-unpack</emphasis></para>
<para>This is a call into the Helper scripts to set up a checkout of all the
pieces this build might need. It might clone the BitBake repository and the
OpenEmbedded-Core repository. It may clone the Poky repository, as well as
additional layers. It will use the data from the
<filename>layerinfo.json</filename> file to help understand the
configuration. It will also use a local cache of repositories to speed up
the clone checkouts. For additional information, see <link
linkend="test-autobuilder-clone-cache">Autobuilder Clone
Cache</link>.</para>
<para>This step has two possible modes of operation. If the build is part of a
parent build, it's possible that all the repositories needed may already be
available, ready in a pre-prepared directory. An "a-quick" or "a-full" build
would prepare this before starting the other sub-target builds. This is done
for two reasons:<itemizedlist>
<listitem>
<para>the upstream may change during a build, for example, from a
forced push and this ensures we have matching content for the
whole build</para>
</listitem>
<listitem>
<para>if 15 Workers all tried to pull the same data from the same
repos, we could hit resource limits on upstream servers, as they
may think they are under some kind of network attack</para>
</listitem>
</itemizedlist>This pre-prepared directory is shared among the Workers over
NFS. If the build is an individual build and there is no "shared" directory
available, it would clone from the cache and the upstreams as necessary.
This is considered the fallback mode.</para>
</listitem>
<listitem>
<para><emphasis>Call scripts/run-config</emphasis></para>
<para>This is another call into the Helper scripts where it's expected that the
main functionality of this target will be executed.</para>
</listitem>
</orderedlist></para>
</section>
<section id='test-autobuilder-tech'>
<title>Autobuilder Technology</title>
<para>The Autobuilder has Yocto Project-specific functionality to allow builds to operate
with increased efficiency and speed.</para>
<section id='test-clobberdir'>
<title>clobberdir</title>
<para>When deleting files, the Autobuilder uses <filename>clobberdir</filename>, which
is a special script that moves files to a separate location, rather than deleting
them. Files in this location are later deleted by an <filename>rm</filename> command,
which is run under <filename>ionice -c 3</filename>. In effect, the deletion only
happens when there is idle IO capacity on the Worker. The Autobuilder Worker Janitor
runs this deletion. See <link linkend="test-autobuilder-worker-janitor">Autobuilder
Worker Janitor</link>.</para>
</section>
<section id='test-autobuilder-clone-cache'>
<title>Autobuilder Clone Cache</title>
<para>Cloning repositories from scratch each time they are required was slow on the
Autobuilder. We therefore have a stash of commonly used repositories pre-cloned on
the Workers. Data is fetched from these during clones first, then "topped up" with
later revisions from any upstream when necessary. The cache is maintained by the
Autobuilder Worker Janitor. See <link linkend="test-autobuilder-worker-janitor"
>Autobuilder Worker Janitor</link>.</para>
</section>
<section id='test-autobuilder-worker-janitor'>
<title>Autobuilder Worker Janitor</title>
<para>This is a process running on each Worker that performs two basic operations,
including background file deletion at IO idle (see <link
linkend="test-list-tgt-exec-clobberdir">Target Execution: clobberdir</link>) and
maintenance of a cache of cloned repositories to improve the speed at which the system
can check out repositories.</para>
</section>
<section id='test-shared-dl-dir'>
<title>Shared DL_DIR</title>
<para>The Workers are all connected over NFS which allows DL_DIR to be shared between
them. This reduces network accesses from the system and allows the build to be sped
up. Usage of the directory within the build system is designed so that it can safely be
shared over NFS.</para>
</section>
<section id='test-shared-sstate-cache'>
<title>Shared SSTATE_DIR</title>
<para>The Workers are all connected over NFS which allows the
<filename>sstate</filename> directory to be shared between them. This means once
a Worker has built an artefact, all the others can benefit from it. Usage of the
directory within the build system is designed for sharing over NFS.</para>
</section>
<section id='test-resulttool'>
<title>Resulttool</title>
<para>All of the different tests run as part of the build generate output into
<filename>testresults.json</filename> files. This allows us to determine which
tests ran in a given build and their status. Additional information, such as failure
logs or the time taken to run the tests, may also be included.</para>
            <para>Resulttool is part of OpenEmbedded-Core and is used to manipulate these JSON
                results files. It has the ability to merge files together, display reports of the
                test results and compare different result files.</para>
            <para>For details, see <ulink
                    url="https://wiki.yoctoproject.org/wiki/Resulttool"
                    >https://wiki.yoctoproject.org/wiki/Resulttool</ulink>.</para>
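            <para>A typical invocation, assuming the <filename>resulttool</filename> script from
                the OpenEmbedded-Core <filename>scripts</filename> directory is available and
                using placeholder paths, might look like the following (see
                <filename>resulttool --help</filename> for the exact options and the subcommands
                used for comparing result sets):<literallayout>
     $ resulttool merge base_results/ extra_results/
     $ resulttool report base_results/
                </literallayout>The
                first command merges one set of results into another, and the second produces a
                summary report of the tests that ran and their status.</para>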
</section>
</section>
<section id='test-run-config-tgt-execution'>
<title>run-config Target Execution</title>
        <para>The <filename>scripts/run-config</filename> execution is where most of the work
            within the Autobuilder happens. It runs through a number of steps; the first of these
            are general setup steps that are run once and include:<orderedlist>
<listitem>
<para>Set up any <filename>buildtools-tarball</filename> if configured.</para>
</listitem>
<listitem>
<para>Call "buildhistory-init" if buildhistory is configured.</para>
</listitem>
</orderedlist></para>
        <para>For each step that is configured in <filename>config.json</filename>, it will
            perform the following (a rough shell sketch of one such step appears after the
            list):</para>
<para>
<remark>## WRITER's question: What does "logging in as stepXa" and others refer to
below? ##</remark>
<orderedlist>
<listitem id="test-run-config-add-layers-step">
<para dir="ltr">Add any layers that are specified using the
<filename>bitbake-layers add-layer</filename> command (logging as
stepXa)</para>
</listitem>
<listitem>
<para dir="ltr">Call the <filename>scripts/setup-config</filename> script to
generate the necessary <filename>auto.conf</filename> configuration file for
the build</para>
</listitem>
<listitem>
<para dir="ltr">Run the <filename>bitbake BBTARGETS</filename> command (logging
as stepXb)</para>
</listitem>
<listitem>
<para dir="ltr">Run the <filename>bitbake SANITYTARGETS</filename> command
(logging as stepXc)</para>
</listitem>
<listitem>
<para dir="ltr">Run the <filename>EXTRACMDS</filename> command, which are run
within the BitBake build environment (logging as stepXd)</para>
</listitem>
<listitem>
<para dir="ltr">Run the <filename>EXTRAPLAINCMDS</filename> command(s), which
are run outside the BitBake build environment (logging as stepXd)</para>
</listitem>
<listitem>
<para dir="ltr">Remove any layers added in <link
linkend="test-run-config-add-layers-step">step 1</link> using the
<filename>bitbake-layers remove-layer</filename> command (logging as
stepXa)</para>
</listitem>
</orderedlist>
</para>
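        <para>Expressed as plain shell commands, a single configured step is roughly equivalent
            to the following sketch, where the layer and image names are examples only:<literallayout>
     $ bitbake-layers add-layer ../meta-example        # stepXa
     $ bitbake core-image-sato                         # stepXb: BBTARGETS
     $ bitbake core-image-sato -c testimage            # stepXc: SANITYTARGETS
     $ bitbake-layers remove-layer ../meta-example     # stepXa
            </literallayout>The
            <filename>EXTRACMDS</filename> and <filename>EXTRAPLAINCMDS</filename> commands
            (stepXd) would run between the BitBake invocations and the layer removal.</para>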
<para>Once the execution steps above complete, <filename>run-config</filename> executes a
set of post-build steps, including:<orderedlist>
<listitem>
<para dir="ltr">Call <filename>scripts/publish-artifacts</filename> to collect
any output which is to be saved from the build.</para>
</listitem>
<listitem>
<para dir="ltr">Call <filename>scripts/collect-results</filename> to collect any
test results to be saved from the build.</para>
</listitem>
<listitem>
<para dir="ltr">Call <filename>scripts/upload-error-reports</filename> to send
any error reports generated to the remote server.</para>
</listitem>
<listitem>
<para dir="ltr">Cleanup the build directory using <link
linkend="test-clobberdir"><filename>clobberdir</filename></link> if the
build was successful, else rename it to “build-renamed” for potential future
debugging.</para>
</listitem>
</orderedlist></para>
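        <para>The cleanup logic in the final step amounts to something like the following sketch,
            where the directory names and the success flag are illustrative only:<literallayout>
     if [ "$BUILD_SUCCEEDED" = "true" ]; then
         clobberdir build/
     else
         mv build/ build-renamed/
     fi
            </literallayout></para>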
</section>
<section id='test-deploying-yp-autobuilder'>
<title>Deploying Yocto Autobuilder</title>
        <para>The most up-to-date information about how to set up and deploy your own Autobuilder
            can be found in <filename>README.md</filename> in the
            <filename>yocto-autobuilder2</filename> repository.</para>
        <para>We hope that people can use the <filename>yocto-autobuilder2</filename> code
            directly, but it is inevitable that users will end up needing to heavily customize the
            <filename>yocto-autobuilder-helper</filename> repository, particularly the
            <filename>config.json</filename> file, as they will want to define their own test
            matrix.</para>
        <para>The Autobuilder supports two customization options: <itemizedlist>
<listitem>
<para>variable substitution</para>
</listitem>
<listitem>
<para>overlaying configuration files</para>
</listitem>
</itemizedlist>The standard <filename>config.json</filename> minimally attempts to allow
substitution of the paths. The Helper script repository includes a
<filename>local-example.json</filename> file to show how you could override these
from a separate configuration file. Pass the following into the environment of the
Autobuilder:<literallayout>
$ ABHELPER_JSON="config.json local-example.json"
</literallayout>As
another example, you could also pass the following into the
environment:<literallayout>
$ ABHELPER_JSON="config.json <replaceable>/some/location/</replaceable>local.json"
</literallayout>One
issue users often run into is validation of the <filename>config.json</filename> files.
            A tip for minimizing issues from invalid JSON files is to use a Git
<filename>pre-commit-hook.sh</filename> script to verify the JSON file before
committing it. Create a symbolic link as
follows:<literallayout>
$ ln -s ../../scripts/pre-commit-hook.sh .git/hooks/pre-commit
</literallayout></para>
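        <para>The overlay file itself only needs to contain the values you want to change. A
            minimal sketch, using hypothetical path variables (check
            <filename>local-example.json</filename> in the Helper script repository for the
            variable names actually used), might look like the following:<literallayout>
     {
         "BASE_HOMEDIR" : "/home/pokybuild",
         "BASE_SHAREDDIR" : "/srv/autobuilder/shared"
     }
            </literallayout>The
            intent is that values from files listed later in <filename>ABHELPER_JSON</filename>
            override those from earlier files such as the standard
            <filename>config.json</filename>.</para>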
</section>
</chapter>
<!--
vim: expandtab tw=80 ts=4
-->

View File

@ -0,0 +1,103 @@
<!DOCTYPE book PUBLIC "-//OASIS//DTD DocBook XML V4.2//EN"
"http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd"
[<!ENTITY % poky SYSTEM "../poky.ent"> %poky; ] >
<book id='test-manual' lang='en'
xmlns:xi="http://www.w3.org/2003/XInclude"
xmlns="http://docbook.org/ns/docbook"
>
<bookinfo>
<mediaobject>
<imageobject>
<imagedata fileref='figures/test-manual-title.png'
                       format='PNG'
align='left' scalefit='1' width='100%'/>
</imageobject>
</mediaobject>
<title>
Yocto Project Test Environment Manual
</title>
<authorgroup>
<author>
<affiliation>
<orgname>&ORGNAME;</orgname>
</affiliation>
<email>&ORGEMAIL;</email>
</author>
</authorgroup>
<revhistory>
<revision>
<revnumber>3.1.1</revnumber>
<date>TBD</date>
<revremark>DRAFT - Work-in-Progress - posted June 16, 2020</revremark>
</revision>
</revhistory>
<copyright>
<year>&COPYRIGHT_YEAR;</year>
<holder>Linux Foundation</holder>
</copyright>
<legalnotice>
<para>
Permission is granted to copy, distribute and/or modify this document under
the terms of the <ulink type="http" url="http://creativecommons.org/licenses/by-sa/2.0/uk/">
Creative Commons Attribution-Share Alike 2.0 UK: England &amp; Wales</ulink> as published by
Creative Commons.
</para>
<note><title>Manual Notes</title>
<itemizedlist>
<listitem><para>
This version of the
<emphasis>Yocto Project Test Environment Manual</emphasis>
is for the &YOCTO_DOC_VERSION; release of the
Yocto Project.
To be sure you have the latest version of the manual
for this release, go to the
<ulink url='&YOCTO_HOME_URL;/documentation'>Yocto Project documentation page</ulink>
and select the manual from that site.
Manuals from the site are more up-to-date than manuals
derived from the Yocto Project released TAR files.
</para></listitem>
<listitem><para>
If you located this manual through a web search, the
version of the manual might not be the one you want
(e.g. the search might have returned a manual much
older than the Yocto Project version with which you
are working).
You can see all Yocto Project major releases by
visiting the
<ulink url='&YOCTO_WIKI_URL;/wiki/Releases'>Releases</ulink>
page.
If you need a version of this manual for a different
Yocto Project release, visit the
<ulink url='&YOCTO_HOME_URL;/documentation'>Yocto Project documentation page</ulink>
and select the manual set by using the
"ACTIVE RELEASES DOCUMENTATION" or "DOCUMENTS ARCHIVE"
pull-down menus.
</para></listitem>
<listitem><para>
To report any inaccuracies or problems with this
manual, send an email to the Yocto Project
discussion group at
                    <filename>yocto@yoctoproject.org</filename> or log into
the freenode <filename>#yocto</filename> channel.
</para></listitem>
</itemizedlist>
</note>
</legalnotice>
</bookinfo>
<xi:include href="test-manual-intro.xml"/>
<xi:include href="test-manual-test-process.xml"/>
<xi:include href="test-manual-understand-autobuilder.xml"/>
</book>
<!--
vim: expandtab tw=80 ts=4
-->