subtree updates
meta-raspberrypi: fde68b24f0..4c033eb074:
Harunobu Kurokawa (1):
rpi-cmdline, rpi-u-boot-src: Support USB boot
meta-arm: 0b61cc659a..4d22f982bc:
Debbie Martin (2):
arm-systemready: Add parted dependency and inherit testimage
ci: Add Arm SystemReady firmware and IR ACS builds
Harsimran Singh Tungal (3):
arm-bsp/documentation: corstone1000: fix the steps in the user guide and instructions
corstone1000:arm-bsp/optee: Update optee to v4.0
corstone1000:arm-bsp/tftf: Fix tftf tests on mps3
Jon Mason (5):
arm/trusted-firmware-a: move patch file to bbappend
arm/trusted-firmware-a: update to 2.10
arm/hafnium: update to v2.10
CI: rename meta-secure-core directory
arm/edk2: update to 202311
Ross Burton (1):
CI: switch back to master
poky: 028b6f6226..4675bbb757:
Adrian Freihofer (4):
cmake-qemu.bbclass: make it more usable
oe-selftest: add a cpp-example recipe
oeqa/core/decorator: add skip if not qemu-usermode
oe-selftest: add tests for C and C++ build tools
Alassane Yattara (22):
bitbake: toaster/test: bug-fix on tests/browser/test_all_builds_page
bitbake: toaster/test: from test_no_builds_message.py wait for the empty state div to appear
bitbake: toaster/test: delay driver action until elements to appear
bitbake: toaster/tests: Ensure to kill the toaster process created for functional tests
bitbake: toaster/tests: Added functional/utils, contains useful methods used by functional tests
bitbake: toaster/tests: Refactor tests/functional
bitbake: toaster/tests: Bug fixes, functional tests dependent on each other
bitbake: toaster/tests: Fixes warnings in autobuilder
bitbake: toaster/tests: bug-fix tests writing files into /tmp on the autobuilders
bitbake: toaster/test: fix Copyright
bitbake: toaster/tests: logging warning in console, trying to kill unavailable Runbuilds process
bitbake: toaster/tests: Removed all time.sleep occurrence
bitbake: toaster/tests: Bug-Fix testcase functional/test_project_page_tab_config.py
bitbake: toaster/tests: bug-fix element click intercepted in browser/test_layerdetails_page.py
bitbake: toaster/tests: Update tests/functional/functional_helpers test_functional_basic
bitbake: toaster/tests: Fixes functional tests warning on autobuilder
bitbake: toaster/tests: Bug-fix test_functional_basic, delay driver actions
bitbake: toaster/tests: bug-fix An element matching "#projectstable" should be visible
bitbake: toaster/tests: bug-fix An element matching "#lastest_builds" should be on the page
bitbake: toaster/tests: Skip to show more than 100 items in ToasterTable
bitbake: toaster/tests: Bug-fix "#project-created-notification" should be visible
bitbake: toaster/toastergui: Bug-fix verify given layer path only if import/add local layer
Alex Bennée (1):
qemurunner: more cleanups for output blocking
Alex Kiernan (17):
cargo: Rename MANIFEST_PATH -> CARGO_MANIFEST_PATH
cargo: Move CARGO_MANIFEST_PATH/CARGO_SRC_DIR to cargo_common
rust: cargo: Convert single-valued variables to weak defaults
cargo: Add CARGO_LOCK_PATH for path to Cargo.lock
rust: Upgrade 1.70.0 -> 1.71.0
rust: Upgrade 1.71.0 -> 1.71.1
sstate-cache-management: Rewrite in python
devtool: selftest: Fix test_devtool_modify_git_crates_subpath inequality
devtool: selftest: Fix test_devtool_modify_git_crates_subpath bbappend check
meta-selftest: hello-rs: Simple rust test recipe
devtool: selftest: Swap to hello-rs for crates testing
zvariant: Drop recipe
rust: Upgrade 1.71.1 -> 1.72.0
rust: Upgrade 1.72.0 -> 1.72.1
rust: Upgrade 1.72.1 -> 1.73.0
rust: Upgrade 1.73.0 -> 1.74.0
rust: Upgrade 1.74.0 -> 1.74.1
Alexander Kanavin (21):
selftest/sstatetest: print output from bitbake with actual newlines, not \n
selftest/sstatetests: do not delete custom $TMPDIRs under build-st when testing printdiff
sstatesig/find_siginfo: special-case gcc-source when looking in sstate caches
oeqa/selftest/sstatetests: re-work CDN tests, add local cache tests
gobject-introspection: depend on setuptools to obtain distutils module
libcap-ng-python: depend on setuptools to obtain distutils copy
dnf: remove obsolete python3-gpg dependency (provided by gpgme)
gpgme: disable python support (until upstream fixes 3.12 compatibility)
python3-setuptools-rust: remove distutils dependency
python3-babel: replace distutils with setuptools, as supported by upstream
python3-pip: remove distutils dependency
glib-2.0: replace distutils dependency with setuptools
python3-pytest-runner: remove distutils dependency
python3-numpy: distutils is no longer required
bitbake: bitbake/codeparser.py: address ast module deprecations in py 3.12
glibc-y2038-tests: do not run tests using 32 bit time APIs
bitbake: bitbake/runqueue: add debugging for find_siginfo() calls
bitbake: bitbake-diffsigs/runqueue: adapt to reworked find_siginfo()
bitbake: bitbake/runqueue: prioritize local stamps over sstate signatures in printdiff
sstatesig/find_siginfo: unify a disjointed API
lib/sstatesig/find_siginfo: raise an error instead of returning None when obtaining mtime
Alexander Lussier-Cullen (6):
bitbake: toaster: fix pytest build test execution and test discovery
bitbake: toaster: Add verbose printout for missing chrome(driver) dependencies
bitbake: bitbake: toaster: add functional testing toaster error details
bitbake: toaster/tests: Exit tests on chromedriver creation failure
bitbake: toaster/tests: fix functional tests setup and teardown
bitbake: toaster/tests: fix chrome argument syntax and wait for driver exit
Alexandre Belloni (1):
oeqa/selftest/recipetool: stop looking for md5sum
Anuj Mittal (9):
sqlite3: upgrade 3.44.0 -> 3.44.2
base-passwd: upgrade 3.6.2 -> 3.6.3
bluez5: upgrade 5.70 -> 5.71
glib-2.0: upgrade 2.78.1 -> 2.78.3
glib-networking: upgrade 2.76.1 -> 2.78.0
puzzles: upgrade to latest revision
stress-ng: upgrade 0.17.01 -> 0.17.03
libusb1: fix upstream version check
enchant2: upgrade 2.6.2 -> 2.6.4
Archana Polampalli (1):
bluez5: fix CVE-2023-45866
Bruce Ashfield (31):
linux-yocto/6.5: cfg: split runtime and symbol debug
linux-yocto/6.5: update to v6.5.11
linux-yocto/6.1: update to v6.1.62
linux-yocto-dev: bump to v6.7
linux-yocto/6.5: update to v6.5.12
linux-yocto/6.5: update to v6.5.13
linux-yocto/6.1: update to v6.1.65
linux-yocto/6.1: drop removed IMA option
linux-yocto/6.5: drop removed IMA option
linux-yocto-rt/6.1: update to -rt18
linux-yocto/6.1: update to v6.1.66
linux-yocto/6.1: update to v6.1.67
linux-yocto/6.5: fix AB-INT: QEMU kernel panic: No irq handler for vector
linux-yocto/6.1: update to v6.1.68
oeqa/runtime/parselogs: add qemux86 ACPI ignore for kernel v6.6+
linux-libc-headers: update to v6.6-lts
linux-yocto: introduce 6.6 reference kernel
linux-yocto/6.6: fix AB-INT: QEMU kernel panic: No irq handler for vector
linux-yocto-rt/6.6: fix CVE exclusion include
linux-yocto/6.6: update CVE exclusions
linux-yocto/6.6: update to v6.6.8
linux-yocto/6.1: update to v6.1.69
linux-yocto/6.5: drop 6.5 recipes
linux-yocto-rt/6.6: correct meta data branch
linux-yocto/6.6: update to v6.6.9
linux-yocto/6.6: update CVE exclusions
linux-yocto/6.1: update to v6.1.70
linux-yocto/6.1: update CVE exclusions
linux-yocto/6.6: ARM fix configuration audit warning
linux-yocto/6.6: arm: jitter entropy backport
poky/poky-tiny: make 6.6 the default kernel
Changqing Li (1):
man-pages: remove conflict pages
Chen Qi (1):
devtool: use straight print in check-upgrade-status output
Clay Chang (1):
devtool: deploy: provide max_process to strip_execs
Daniel Ammann (1):
base: Unpack .7z files with p7zip
Deepthi Hemraj (1):
autoconf: Add missing perl modules to RDEPENDS
Dhairya Nagodra (2):
cve-update-nvd2-native: faster requests with API keys
cve-update-nvd2-native: increase the delay between subsequent request failures
Eilís 'pidge' Ní Fhlannagáin (3):
useradd: Fix issues with useradd dependencies
useradd: Add testcase for bugzilla issue (currently disabled)
usergrouptests.py: Add test for switching between static-ids
Enrico Scholz (1):
tcp-wrappers: drop libnsl2 build dependency
Etienne Cordonnier (2):
gdb/systemd: enable minidebuginfo support conditionally
manuals: document minidebuginfo
Fabio Estevam (3):
libdrm: Upgrade to 2.4.119
kmscube: Upgrade to latest revision
bmap-tools: Upgrade to 3.7
Hongxu Jia (2):
socat: 1.7.4.4 -> 1.8.0.0
man-db: 2.11.2 -> 2.12.0
Jason Andryuk (3):
linux-firmware: Package iwlwifi .pnvm files
linux-firmware: Change bnx2 packaging
linux-firmware: Create bnx2x subpackage
Jeremy A. Puhlman (1):
create-spdx-2.2: combine spdx can try to write before dir creation
Jermain Horsman (2):
lib/bblayers/makesetup.py: Remove unused imports
lib/bblayers/buildconf.py: Remove unused imports/variables
Jose Quaresma (2):
go: update 1.20.10 -> 1.20.11
go: update 1.20.11 -> 1.20.12
Joshua Watt (11):
bitbake: bitbake-hashserv: Add description of permissions
bitbake.conf: Add runtimedir
rpcbind: Specify state directory under /run
libinput: Add packageconfig for tests
ipk: Switch to using zstd compression
lib/oe/path.py: Add relsymlink()
lib/packagedata.py: Fix broken symlinks for providers with a '/'
bitbake: contrib/vim: Syntax improvements
classes-global/sstate: Fix variable typo
lib/packagedata.py: Add API to iterate over rprovides
classes-global/insane: Look up all runtime providers for file-rdeps
Julien Stephan (19):
recipetool: create_buildsys_python.py: initialize metadata
recipetool: create: add trailing newlines
recipetool: create: add new optional process_url callback for plugins
recipetool: create_buildsys_python: add pypi support
oeqa/selftest/recipetool: remove spaces on empty lines
oeqa/selftest/recipetool/devtool: add test for pypi class
recipetool: appendsrcfile(s): add dry-run mode
recipeutils: bbappend_recipe: fix undefined variable
recipeutils: bbappend_recipe: fix docstring
recipeutils: bbappend_recipe: add a way to specify the name of the file to add
recipeutils: bbappend_recipe: remove old srcuri entry if parameters are different
recipetool: appendsrcfile(s): use params instead of extraline
recipeutils: bbappend_recipe: allow to patch the recipe itself
recipetool: appendsrcfile(s): add a mode to update the recipe itself
oeqa/selftest/recipetool: appendsrcfile: add test for machine
oeqa/selftest/recipetool: appendsrc: add test for update mode
oeqa/selftest/recipetool: add back checksum checks on pypi tests
oeqa/selftest/recipetool: remove left over from development
oeqa/selftest/recipetool: fix metadata corruption on meta layer
Kevin Hao (2):
beaglebone-yocto: Remove the redundant kernel-devicetree
beaglebone-yocto: Remove the obsolete variables for uImage
Khem Raj (13):
tiff: Backport fixes for CVE-2023-6277
kmod: Fix build with latest musl
elfutils: Use own basename API implementation
util-linux: Fix build with latest musl
sysvinit: Include libgen.h for basename API
attr: Fix build with latest musl
opkg: Use own version of portable basename function
util-linux: Delete md-raid tests
gdb: Update to gdb 14.1 release
systemd: Fix build with latest musl
qemu: Fix build with latest musl
qemu: Add packageconfig knob to enable pipewire support
weston: Include libgen.h for basename
Lee Chee Yang (5):
migration-guides: reword fix in release-notes-4.3.1
migration-guides: add release notes for 4.0.15
perlcross: update to 1.5.2
perl: 5.38.0 -> 5.38.2
curl: update to 8.5.0
Lucas Stach (1):
mesa: upgrade 23.2.1 -> 23.3.1
Ludovic Jozeau (1):
image-live.bbclass: LIVE_ROOTFS_TYPE support compression
Lukas Funke (1):
selftest: wic: add test for zerorize option of empty plugin
Malte Schmidt (1):
wic: extend empty plugin with options to write zeros to partition
Markus Volk (3):
gtk4: upgrade 4.12.3 -> 4.12.4
libadwaita: update 1.4.0 -> 1.4.2
appstream: Upgrade 0.16.3 -> 1.0.0
Marlon Rodriguez Garcia (5):
bitbake: toaster/tests: Update build test
bitbake: toaster: Added new feature to import eventlogs from command line into toaster using replay functionality
bitbake: toaster: remove test and update setup to avoid rebuilding image
bitbake: toaster: Commandline build import table improvements
bitbake: toaster: Added validation to stop import if there is a build in progress
Marta Rybczynska (1):
bitbake: toastergui: verify that an existing layer path is given
Massimiliano Minella (1):
zstd: fix LICENSE statement
Michael Opdenacker (8):
test-manual: text and formatting fixes
test-manual: resource updates
test-manual: use working example
test-manual: add links to python unittest
test-manual: explicit or fix file paths
test-manual: add or improve hyperlinks
dev-manual: runtime-testing: fix test module name
poky.conf: update SANITY_TESTED_DISTROS to match autobuilder
Mikko Rapeli (1):
runqemu: match .rootfs. in addition to -image- for rootfs
Ming Liu (1):
grub: fs/fat: Don't error when mtime is 0
Mingli Yu (2):
python3-license-expression: Fix the ptest failure
ptest-packagelists.inc: Add python3-license-expression
Pavel Zhukov (2):
bitbake: utils: Do not create directories with ${ in the name
oeqa/selftest/bbtests: Add test for unexpanded variables in the dirname
Peter Kjellerstedt (11):
oeqa/selftest/devtool: Correct git clone of local repository
oeqa/selftest/devtool: Avoid global Git hooks when amending a patch
oeqa/selftest/devtool: Make test_devtool_load_plugin more resilient
oeqa/selftest/recipetool: Make test_recipetool_load_plugin more resilient
lib/oe/recipeutils: Avoid wrapping any SRC_URI[sha*sum] variables
recipetool: create: Improve identification of licenses
recipetool: create: Only include the expected SRC_URI checksums
devtool: upgrade: Update all existing checksums for the SRC_URI
devtool: modify: Make --no-extract work again
devtool: modify: Handle recipes with a menuconfig task correctly
dev-manual: Discourage the use of SRC_URI[md5sum]
Peter Marko (1):
dtc: preserve version also from shallow git clones
Philip Balister (1):
sanity.bbclass: Check for additional native perl modules.
Renat Khalikov (1):
python3-maturin: Add missing space appending to CFLAGS
Richard Purdie (41):
bitbake: runqueue: Improve inter setscene task dependency handling
bitbake: bb/toaster: Fix assertEquals deprecation warnings
bitbake: toaster: Fix assertRegexpMatches deprecation warnings
bitbake: toastermain/settings: Avoid python filehandle closure warnings
bitbake: toastergui: Fix regex markup issues
bitbake: bitbake: Move to version 2.6.1 to mark runqueue changes
bitbake: toaster-eventreplay: Remove ordering assumptions
sanity.conf: Require bitbake 2.6.1 for recent runqueue change
sstate: Remove unneeded code from setscene_depvalid() related to useradd
oeqa/runtime/systemd: Ensure test runs only on systemd images
bitbake: toaster: Update to use qemux86-64 machine by default
bitbake: toaster/tests/builds: Add BB_HASHSERVE passthrough
pseudo: Update to pull in syncfs probe fix
useradd: Fix useradd do_populate_sysroot dependency bug
sstate: Fix dir ownership issues in SSTATE_DIR
oeqa/sstatetests: Disable gcc source printdiff test for now
build-appliance-image: Update to master head revision
bitbake: utils: Fix mkdir with PosixPath
bitbake: runqueue: Remove tie between rqexe and starts_worker
build-appliance-image: Update to master head revision
testimage: Exclude wtmp from target-dumper commands
qemurunner: Improve stdout logging handling
qemurunner: Improve handling of serial port output blocking
oeqa/selftest/overlayfs: Don't overwrite DISTRO_FEATURES
testimage: Drop target_dumper and most of monitor_dumper
oeqa/selftest/overlayfs: Fix whitespace
qemu: Clean up DEPENDS
qemu: Ensure pip and the python venv aren't used for meson
curl: Disable two intermittently failing tests
linux/cve-exclusion6.1: Update to latest kernel point release
lib/prservice: Improve lock handling robustness
oeqa/selftest/prservice: Improve test robustness
scripts: Drop shell sstate-cache-management
oeqa/selftest/sstatetests: Update sstate management script tests to python script
curl: Disable test 1091 due to intermittent failures
bitbake: lib/bb: Add workaround for libgcc issues with python 3.8 and 3.9
bitbake: bitbake: Post release version bump to 2.7.0
bitbake: siggen: Ensure version of siggen is verified
bitbake: bitbake: Version bump for find_siginfo changes
sstatesig: Add version information for find_siginfo
sanity: Require bitbake 2.7.1
Robert Berger (1):
uninative-tarball.xz - reproducibility fix
Robert Yang (5):
gettext: Upgrade 0.22.3 -> 0.22.4
nfs-utils: Upgrade 2.6.3 -> 2.6.4
archiver.bbclass: Improve work-shared checking
nfs-utils: Update Upstream-Status
archiver.bbclass: Drop tarfile module to improve performance
Ross Burton (23):
avahi: update URL for new project location
oeqa/runtime/parselogs: load ignores from disk
oeqa/runtime/parselogs: migrate ignores
meta-yocto-bsp/oeqa/parselogs: add BSP-specific ignores
linux-yocto: update CVE exclusions
genericx86: remove redundant assignments
images: remove redundant IMAGE_BASENAME assignments
insane: ensure more paths have the workdir removed
tcl: skip timing-dependent tests in run-ptest
qemurunner: remove unused import
go: set vendor in CVE_PRODUCT
runqemu: add qmp socket support
linux-yocto: update CVE exclusions
tcl: skip async and event tests in run-ptest
images: add core-image-initramfs-boot
machine/arch-armv9: remove crc and sve tunes, they are mandatory
python3: re-enable profile guided optimisation
openssl: mark assembler sections as call targets for PAC/BTI support on aarch64
nativesdk: ensure features don't get backfilled
nativesdk: don't unset MACHINE_FEATURES, let machine-sdk/ set it
conf/machine-sdk: declare qemu-usermode SDK_MACHINE_FEATURE
libseccomp: remove redundant PV assignment
oeqa/parselogs-ignores-qemuarmv5: add comments and organise
Saul Wold (1):
package.py: OEHasPackage: Add MLPREFIX to packagename
Shubham Kulkarni (1):
tzdata: Upgrade to 2023d
Simone Weiß (2):
manuals: brief-yoctoprojectqs: align variable order with default local.conf
patchtest: Add test for deprecated CVE_CHECK_IGNORE
Soumya Sambu (1):
ncurses: Fix - tty is hung after reset
Sundeep KOKKONDA (1):
rust: rustdoc reproducibility issue fix - disable PGO
Tim Orling (12):
python3-bcrypt: upgrade 4.0.1 -> 4.1.1
python3-pygments: upgrade 2.16.1 -> 2.17.2
recipetool: pypi: do not clobber SRC_URI checksums
python3-setuptools-rust: BBCLASSEXTEND + nativesdk
python3-maturin: add v1.4.0
python3-maturin: bzip2-sys reproducibility
classes-recipe: add python_maturin.bbclass
recipetool: add python_maturin support
oe-selftest: add maturin runtime (testimage) test
oeqa: add simple 'maturin' SDK (testsdk) test case
oeqa: add "maturin develop" SDK test case
oeqa: add runtime 'maturin develop' test case
Tom Rini (1):
inetutils: Update to the 2.5 release
Trevor Gamblin (1):
scripts/runqemu: fix regex escape sequences
Victor Kamensky (5):
systemtap: upgrade 4.9 -> 5.0
systemtap: do not install uprobes and uprobes sources
systemtap-uprobes: removed as obsolete
systemtap: explicit handling debuginfod library dependency
systemtap: fix libdebuginfod auto detection logic
Vijay Anusuri (1):
avahi: backport CVE-2023-1981 & CVE's follow-up patches
Viswanath Kraleti (2):
image-uefi.conf: Add EFI_UKI_PATH variable
systemd-boot: Add recipe to compile native
Wang Mingyu (38):
kbd: upgrade 2.6.3 -> 2.6.4
libatomic-ops: upgrade 7.8.0 -> 7.8.2
libnl: upgrade 3.8.0 -> 3.9.0
libseccomp: upgrade 2.5.4 -> 2.5.5
libva-utils: upgrade 2.20.0 -> 2.20.1
dnf: upgrade 4.18.1 -> 4.18.2
gpgme: upgrade 1.23.1 -> 1.23.2
kea: upgrade 2.4.0 -> 2.4.1
opkg-utils: upgrade 0.6.2 -> 0.6.3
repo: upgrade 2.39 -> 2.40
sysstat: upgrade 12.7.4 -> 12.7.5
p11-kit: upgrade 0.25.2 -> 0.25.3
python3-babel: upgrade 2.13.1 -> 2.14.0
python3-dbusmock: upgrade 0.29.1 -> 0.30.0
python3-hatchling: upgrade 1.18.0 -> 1.20.0
python3-hypothesis: upgrade 6.90.0 -> 6.92.1
python3-importlib-metadata: upgrade 6.8.0 -> 7.0.0
python3-license-expression: upgrade 30.1.1 -> 30.2.0
python3-pathspec: upgrade 0.11.2 -> 0.12.1
python3-pip: upgrade 23.3.1 -> 23.3.2
python3-psutil: upgrade 5.9.6 -> 5.9.7
python3-pytest-runner: upgrade 6.0.0 -> 6.0.1
python3-trove-classifiers: upgrade 2023.11.22 -> 2023.11.29
python3-typing-extensions: upgrade 4.8.0 -> 4.9.0
python3-wcwidth: upgrade 0.2.11 -> 0.2.12
ttyrun: upgrade 2.29.0 -> 2.30.0
xwayland: upgrade 23.2.2 -> 23.2.3
diffoscope: upgrade 252 -> 253
iputils: upgrade 20221126 -> 20231222
gstreamer1.0: upgrade 1.22.7 -> 1.22.8
dhcpcd: upgrade 10.0.5 -> 10.0.6
fontconfig: upgrade 2.14.2 -> 2.15.0
python3-setuptools: upgrade 69.0.2 -> 69.0.3
python3-dbusmock: upgrade 0.30.0 -> 0.30.1
python3-hatchling: upgrade 1.20.0 -> 1.21.0
python3-importlib-metadata: upgrade 7.0.0 -> 7.0.1
python3-lxml: upgrade 4.9.3 -> 4.9.4
aspell: upgrade 0.60.8 -> 0.60.8.1
Yash Shinde (1):
rust: Disable rust oe-selftest
Yi Zhao (3):
json-glib: upgrade 1.6.6 -> 1.8.0
psplash: upgrade to latest revision
debianutils: upgrade 5.14 -> 5.15
Yoann Congal (2):
lib/oe/patch: handle creating patches for CRLF sources
strace: Disable bluetooth support by default
Zang Ruochen (2):
ell: upgrade 0.60 -> 0.61
musl: add typedefs for Elf64_Relr and Elf32_Relr
Zoltan Boszormenyi (1):
update_gtk_icon_cache: Fix for GTK4-only builds
venkata pyla (1):
wic: use E2FSPROGS_FAKE_TIME and hash_seed to generate reproducible ext4 images
meta-openembedded: 5ad7203f68..7d8115d550:
Alex Kiernan (7):
mdns: Fix HOMEPAGE URL
mbedtls: Upgrade 3.5.0 -> 3.5.1
c-ares: Upgrade 1.22.1 -> 1.24.0
mdns: Upgrade 2200.40.37.0.1 -> 2200.60.25.0.4
c-ares: Move to tarballs, add ptest and static support
thin-provisioning-tools: Upgrade 1.0.4 -> 1.0.9
bearssl: Upgrade to latest
Alexander Kanavin (29):
python3-pyinotify: remove as unmaintained
python3-supervisor: do not rely on smtpd module
python3-meld3: do not rely on smtpd module
python3-m2crypto: do not rely on smtpd module
python3-uinput: remove as unmaintained
python3-m2crypto: rely on setuptools for distutils copy
python3-joblib: do not rely on distutils
python3-web3: remove distutils dependency
python3-cppy: remove unused distutils dependency
python3-pyroute2: remove unused distutils dependency
python3-eventlet: backport a patch to remove distutils dependency
python3-unoconv: rely on setuptools to obtain distutils copy
python3-astroid: remove unneeded distutils dependency
python3-django: remove unneeded distutils dependency
python3-pillow: remove unneeded distutils dependency
python3-grpcio: update 1.56.2 -> 1.59.3
gstd: correctly delete files in do_install
libplist: fix python 3.12 compatibility
libcamera: skip until upstream resolves python 3.12 compatibility
nodejs: backport (partially) python 3.12 support
nodejs: backport (partially) python 3.12 support
polkit: remove long obsolete 0.119 version
mozjs-115: split the way-too-long PYTHONPATH line
polkit: update mozjs dependency 102 -> 115
mozjs-115: backport py 3.12 compatibility
mozjs-102: remove the recipe
gthumb: update 3.12.2 -> 3.12.4
flatpak: do not rely on executables from the host
bolt: package systemd units
Archana Polampalli (1):
cjson: upgrade 1.7.16 -> 1.7.17
Bruce Ashfield (1):
zfs: update to 2.2.2
Changqing Li (2):
postgresql: upgrade 15.4 -> 15.5
redis: upgrade 6.2.13 -> 6.2.14
Derek Straka (70):
python3-greenlet: update to version 3.0.2
python3-ujson: update to version 5.9.0
python3-termcolor: update to version 2.4.0
python3-cmake: update to version 3.28.0
python3-pint: upgrade to 0.23
python3-gnupg: update to 0.5.2
python3-pyzmq: update to 25.1.2
python3-tox: update to version 4.11.4
python3-olefile: update to version 0.47
python3-distlib: update to version 0.3.8
python3-colorlog: update to version 6.8.0
python3-pymongo: update version to 4.6.1
python3-bandit: update to version 1.7.6
python3-gmqtt: update to version 0.6.13
python3-portion: update to version 2.4.2
python3-prompt-toolkit: update to version 3.0.43
python3-asyncinotify: update to version 4.0.4
python3-bitstring: update to version 4.1.4
python3-ipython: update to version 8.18.1
nginx: update versions for both the stable branch and mainline
python3-portalocker: update to version 2.8.2
python3-astroid: update to version 3.0.2
python3-alembic: update to version 1.13.1
python3-pymisp: update to version 2.4.182
python3-ninja: update to version 1.11.1.1
python3-coverage: update to version 7.3.4
python3-pdm: update to version 2.11.1
python3-paramiko: update to version 3.4.0
python3-zeroconf: update to version 0.131.0
python3-wtforms: update to version 3.1.1
python3-isort: update to version 5.13.2
python3-protobuf: update to version 4.25.1
python3-lazy-object-proxy: update to version 1.10.0
python3-cantools: update to version 39.4.0
python3-sentry-sdk: update to version 1.39.1
python3-xmlschema: update to version 2.5.1
python3-apiflask: update to version 2.1.0
python3-rapidjson: update to version 1.14
python3-bitarray: update to version 2.9.0
python3-pyfanotify: update to version 0.2.2
python3-eventlet: update to version 0.34.1
python3-flask-wtf: update to version 1.2.1
python3-grpcio: update to version 1.60.0
python3-grpcio-tools: update to version 1.60.0
python3-cmake: update to version 3.28.1
python3-flask-sqlalchemy: fix upstream uri check
python3-wtforms: fix upstream uri and version check
gyp: update to the latest commit
python3-ipython-genutils: fix upstream uri and version check
python3-flask: fix upstream uri and version check
python3-wpa-supplicant: fix upstream uri and version check
python3-uswid: update to version 0.4.7
python3-flask-wtf: fix upstream uri and version check
python3-gspread: update to version 5.12.3
python3-pytest-html: update to version 4.1.1
python3-setuptools-scm-git-archive: remove obsolete package
python3-pyroute2: update to version 0.7.10
python3-constantly: update to version 23.10.4
python3-mypy: update to version 1.8.0
python3-flask-jwt-extended: update to version 4.6.0
python3-greenlet: update to version 3.0.3
python3-web3: update to version 6.13.0
python3-parse: update to version 1.20.0
python3-kmod: add comment about update to version 0.9.2
python3-engineio: update to version 4.8.1
python3-sqlalchemy: update to version 2.0.24
python3-pdm-backend: update to version 2.1.8
python3-cantools: update to version 39.4.1
python3-argh: update to version 0.30.5
python3-dominate: update to version 2.9.1
Dmitry Baryshkov (2):
android-tools: remove two Debianisms
networkmanager: drop libnewt dependency
Frederic Martinsons (3):
crash: factorize recipe with inc file to prepare cross-canadian version
crash: add cross canadian version
crash: update to 8.0.4
Jan Vermaete (1):
netdata: added Python as rdepends
Jean-Marc BOUCHE (1):
terminus-font: build compressed archives with -n
Jose Quaresma (1):
ostree: Upgrade 2023.7 -> 2023.8
Joshua Watt (1):
redis: Create state directory in systemd service
Jörg Sommer (1):
i2cdev: New recipe with i2c tools
Kai Kang (1):
lvm2: 2.03.16 -> 2.03.22
Khem Raj (3):
Revert "nodejs: backport (partially) python 3.12 support"
Revert "libcamera: skip until upstream resolves python 3.12 compatibility"
libcamera: Fix build with python 3.12
Leon Anavi (11):
sip: Upgrade 6.7.12 -> 6.8.0
python3-expandvars: add recipe
python3-frozenlist: upgrade 1.4.0 -> 1.4.1
python3-yarl: upgrade 1.9.2 -> 1.9.4
python3-coverage: upgrade 7.3.2 -> 7.3.3
python3-cycler: upgrade 0.11.0 -> 0.12.1
python3-aiohue: upgrade 4.6.2 -> 4.7.0
python3-sdbus: upgrade 0.11.0 -> 0.11.1
python3-zeroconf: upgrade 0.128.4 -> 0.130.0
python3-dominate: upgrade 2.8.0 -> 2.9.0
python3-rlp: upgrade 3.0.0 -> 4.0.0
Marek Vasut (1):
faad2: Upgrade 2.10.0 -> 2.11.1
Markus Volk (3):
wireplumber: update 0.4.15 -> 0.4.17
tracker: don't inherit gsettings
gnome-software: update 45.1 -> 45.2
Martin Jansa (4):
monocypher: pass LIBDIR to fix installed-vs-shipped QA issue with multilib
rygel: fix build with gtk+3 PACKAGECONFIG disabled
rygel: add x11 to DISTRO_FEATURES
driverctl: fix installed-vs-shipped
Meenali Gupta (1):
nginx: upgrade 1.25.2 -> 1.25.3
Mingli Yu (2):
mariadb: Upgrade to 10.11.6
tk: Remove buildpath issue
Nathan BRIENT (1):
cyaml: new recipe
Niko Mauno (1):
pkcs11-provider: Add recipe
Ny Antra Ranaivoarison (1):
python3-click-spinner: backport patch that fixes deprecated methods
Patrick Wicki (1):
poco: upgrade 1.12.4 -> 1.12.5p2
Petr Chernikov (1):
abseil-cpp: remove -Dcmake_cxx_standard=14 flag from extra_oecmake
Robert Yang (1):
minifi-cpp: Fix do_configure error builder aarch64
Ross Burton (13):
Remove unused SRC_DISTRIBUTE_LICENSES
gspell: inherit gtk-doc
gspell: update DEPENDS, switch iso-codes for icu
librest: remove spurious build dependencies
librest: inherit gtk-doc
keybinder: use autotools-brokensep instead of setting B
keybinder: disable gtk-doc documentation
gtksourceview3: remove obsolete DEPENDS
libgsf: remove obsolete DEPENDS
evolution-data-server: remove obsolete intltool DEPENDS
php: remove lemon-native build dependency
lemon: upgrade to 3.44.2
renderdoc: no need to depend on vim-native
Samuli Piippo (1):
jasper: enable opengl only with x11
Theodore A. Roth (1):
python3-flask-sqlalchemy: upgrade 2.5.1 -> 3.1.1
Thomas Perrot (2):
networkmanager: add missing modemmanager rdepends
networkmanager: fix some missing pkgconfig
Tim Orling (8):
python3-pydantic-core: add v2.14.5
python3-annotated-types: add v0.6.0
python3-pydantic: fix RDEPENDS
python3-dirty-equals: add v0.7.1
python3-pydantic-core: enable ptest
python3-cloudpickle: add v3.0.0
python3-pydantic: enable ptest
python3-yappi: upgrade 1.4.0 -> 1.6.0; fix ptests
Wang Mingyu (61):
python3-alembic: upgrade 1.12.1 -> 1.13.0
python3-ansi2html: upgrade 1.8.0 -> 1.9.1
python3-argcomplete: upgrade 3.1.6 -> 3.2.1
python3-dbus-fast: upgrade 2.15.0 -> 2.21.0
python3-django: upgrade 4.2.7 -> 5.0
python3-flask-restx: upgrade 1.2.0 -> 1.3.0
python3-google-api-core: upgrade 2.14.0 -> 2.15.0
python3-google-api-python-client: upgrade 2.108.0 -> 2.111.0
python3-googleapis-common-protos: upgrade 1.61.0 -> 1.62.0
python3-google-auth: upgrade 2.23.4 -> 2.25.2
python3-imageio: upgrade 2.33.0 -> 2.33.1
python3-isort: upgrade 5.12.0 -> 5.13.1
python3-path: upgrade 16.7.1 -> 16.9.0
python3-platformdirs: upgrade 4.0.0 -> 4.1.0
python3-pytest-asyncio: upgrade 0.22.0 -> 0.23.2
python3-sentry-sdk: upgrade 1.37.1 -> 1.39.0
python3-bitarray: upgrade 2.8.3 -> 2.8.5
python3-eth-keyfile: upgrade 0.6.1 -> 0.7.0
python3-eth-rlp: upgrade 0.3.0 -> 1.0.0
python3-fastnumbers: upgrade 5.0.1 -> 5.1.0
python3-pylint: upgrade 3.0.2 -> 3.0.3
python3-tornado: upgrade 6.3.3 -> 6.4
python3-traitlets: upgrade 5.13.0 -> 5.14.0
python3-types-setuptools: upgrade 68.2.0.2 -> 69.0.0.0
python3-virtualenv: upgrade 20.24.7 -> 20.25.0
python3-web3: upgrade 6.11.3 -> 6.12.0
python3-websocket-client: upgrade 1.6.4 -> 1.7.0
python3-zeroconf: upgrade 0.127.0 -> 0.128.4
ctags: upgrade 6.0.20231126.0 -> 6.0.20231210.0
gensio: upgrade 2.8.0 -> 2.8.2
hwdata: upgrade 0.376 -> 0.377
lvgl: upgrade 8.3.10 -> 8.3.11
gjs: upgrade 1.78.0 -> 1.78.1
ifenslave: upgrade 2.13 -> 2.14
libei: upgrade 1.1.0 -> 1.2.0
pkcs11-helper: upgrade 1.29.0 -> 1.30.0
strongswan: upgrade 5.9.12 -> 5.9.13
webkitgtk3: upgrade 2.42.2 -> 2.42.3
sip: upgrade 6.8.0 -> 6.8.1
paho-mqtt-cpp: upgrade 1.3.1 -> 1.3.2
dbus-cxx: upgrade 2.4.0 -> 2.5.0
exiftool: upgrade 12.70 -> 12.71
uftp: upgrade 5.0.2 -> 5.0.3
ctags: upgrade 6.0.20231210.0 -> 6.0.20231224.0
jasper: Fix install conflict when enabling multilib.
jq: upgrade 1.7 -> 1.7.1
libmbim: upgrade 1.31.1 -> 1.31.2
libqmi: upgrade 1.34.0 -> 1.35.1
opencl-headers: upgrade 2023.04.17 -> 2023.12.14
valijson: upgrade 1.0.1 -> 1.0.2
python3-apispec: upgrade 6.3.0 -> 6.3.1
python3-asyncinotify: upgrade 4.0.4 -> 4.0.5
python3-bitarray: upgrade 2.9.0 -> 2.9.1
python3-cassandra-driver: upgrade 3.28.0 -> 3.29.0
python3-ipython: upgrade 8.18.1 -> 8.19.0
python3-pydantic: upgrade 2.5.2 -> 2.5.3
python3-regex: upgrade 2023.10.3 -> 2023.12.25
opencl-icd-loader: upgrade 2023.04.17 -> 2023.12.14
python3-distro: upgrade 1.8.0 -> 1.9.0
zchunk: upgrade 1.3.2 -> 1.4.0
python3-eventlet: upgrade 0.34.1 -> 0.34.2
William Lyu (1):
networkmanager: Improved SUMMARY and added DESCRIPTION
Xiangyu Chen (1):
layer.conf: add libbpf to NON_MULTILIB_RECIPES
Yi Zhao (2):
open-vm-tools: upgrade 12.1.5 -> 12.3.5
samba: upgrade 4.18.8 -> 4.18.9
Zoltán Böszörményi (2):
mutter: Make gnome-desktop and libcanberra dependencies optional
zenity: Upgrade to 4.0.0
alperak (29):
jasper: upgrade 2.0.33 -> 4.1.1
xcursorgen: upgrade 1.0.7 -> 1.0.8
xstdcmap: upgrade 1.0.4 -> 1.0.5
xlsclients: upgrade 1.1.4 -> 1.1.5
xlsatoms: upgrade 1.1.3 -> 1.1.4
xkbevd: upgrade 1.1.4 -> 1.1.5
xgamma: upgrade 1.0.6 -> 1.0.7
sessreg: upgrade 1.1.2 -> 1.1.3
xbitmaps: upgrade 1.1.2 -> 1.1.3
xcursor-themes: add recipe
xorg-docs: add recipe
xorg-sgml-doctools: update summary depends and inc file
xf86-video-ati: upgrade 19.1.0 -> 22.0.0
xf86-input-void: upgrade 1.4.1 -> 1.4.2
libxaw: upgrade 1.0.14 -> 1.0.15
xf86-video-mga: upgrade 2.0.0 -> 2.0.1
snappy: upgrade 1.1.9 -> 1.1.10
xsetroot: upgrade 1.1.2 -> 1.1.3
libbytesize: Removed unnecessary setting of B
libmxml: use autotools-brokensep instead of setting B
libsombok3: use autotools-brokensep instead of setting B
pgpool2: use autotools-brokensep instead of setting B
qpdf: upgrade 11.6.3 -> 11.6.4
cpprest: upgrade 2.10.18 -> 2.10.19
avro-c: upgrade 1.11.2 -> 1.11.3
dool: upgrade 1.1.0 -> 1.3.1
driverctl: upgrade 0.111 -> 0.115
hstr: upgrade 2.5.0 -> 3.1.0
libharu: upgrade 2.3.0 -> 2.4.4
meta-security: 070a1e82cc..b2e1511338:
Armin Kuster (6):
libgssglue: update to 0.8
python3-privacyidea: Update to 3.9.1
lynis: Update SRC_URI to improve updater
layers: Move READMEs to markdown format
arpwatch: adjust CONFIGURE params to allow to build again.
python3-pyinotify: fail2ban needs this module
Dawid Dabrowski (1):
libhoth recipe update
Erik Schilling (2):
dm-verity-img.bbclass: use bc-native
dm-verity-img.bbclass: remove IMAGE_NAME_SUFFIX
Mikko Rapeli (2):
tpm2-tss: support native builds
dm-verity-img.bbclass: add DM_VERITY_DEPLOY_DIR
Change-Id: I94d7f1ee5ff2da4555c05fbf63a1293ec8f249c2
Signed-off-by: Patrick Williams <patrick@stwcx.xyz>
diff --git a/poky/meta/lib/bblayers/buildconf.py b/poky/meta/lib/bblayers/buildconf.py
index ccab332..87a5e5a 100644
--- a/poky/meta/lib/bblayers/buildconf.py
+++ b/poky/meta/lib/bblayers/buildconf.py
@@ -6,13 +6,7 @@
import logging
import os
-import stat
import sys
-import shutil
-import json
-
-import bb.utils
-import bb.process
from bblayers.common import LayerPlugin
@@ -58,7 +52,6 @@
def do_save_build_conf(self, args):
""" Save the currently active build configuration (conf/local.conf, conf/bblayers.conf) as a template into a layer.\n This template can later be used for setting up builds via TEMPLATECONF. """
- repos = {}
layers = oe.buildcfg.get_layer_revisions(self.tinfoil.config_data)
targetlayer = None
oecore = None
diff --git a/poky/meta/lib/bblayers/makesetup.py b/poky/meta/lib/bblayers/makesetup.py
index 5fb6f14..4f27c56 100644
--- a/poky/meta/lib/bblayers/makesetup.py
+++ b/poky/meta/lib/bblayers/makesetup.py
@@ -6,9 +6,7 @@
import logging
import os
-import stat
import sys
-import shutil
import bb.utils
import bb.process
diff --git a/poky/meta/lib/oe/buildhistory_analysis.py b/poky/meta/lib/oe/buildhistory_analysis.py
index b185684..4edad01 100644
--- a/poky/meta/lib/oe/buildhistory_analysis.py
+++ b/poky/meta/lib/oe/buildhistory_analysis.py
@@ -562,7 +562,7 @@
elif not hash2 in hashfiles:
out.append("Unable to find matching sigdata for %s with hash %s" % (desc, hash2))
else:
- out2 = bb.siggen.compare_sigfiles(hashfiles[hash1], hashfiles[hash2], recursecb, collapsed=True)
+ out2 = bb.siggen.compare_sigfiles(hashfiles[hash1]['path'], hashfiles[hash2]['path'], recursecb, collapsed=True)
for line in out2:
m = hashlib.sha256()
m.update(line.encode('utf-8'))
diff --git a/poky/meta/lib/oe/package.py b/poky/meta/lib/oe/package.py
index f69bf9c..9a465ea 100644
--- a/poky/meta/lib/oe/package.py
+++ b/poky/meta/lib/oe/package.py
@@ -1239,7 +1239,7 @@
oe.utils.multiprocess_launch(oe.package.runstrip, sfiles, d)
# Build "minidebuginfo" and reinject it back into the stripped binaries
- if d.getVar('PACKAGE_MINIDEBUGINFO') == '1':
+ if bb.utils.contains('DISTRO_FEATURES', 'minidebuginfo', True, False, d):
oe.utils.multiprocess_launch(inject_minidebuginfo, list(elffiles), d,
extraargs=(dvar, dv, d))
diff --git a/poky/meta/lib/oe/package_manager/ipk/__init__.py b/poky/meta/lib/oe/package_manager/ipk/__init__.py
index e6f9c08..8fcbad5 100644
--- a/poky/meta/lib/oe/package_manager/ipk/__init__.py
+++ b/poky/meta/lib/oe/package_manager/ipk/__init__.py
@@ -133,7 +133,7 @@
tmp_dir = tempfile.mkdtemp()
current_dir = os.getcwd()
os.chdir(tmp_dir)
- data_tar = 'data.tar.xz'
+ data_tar = 'data.tar.zst'
try:
cmd = [ar_cmd, 'x', pkg_path]
@@ -505,6 +505,6 @@
"trying to extract the package." % pkg)
tmp_dir = super(OpkgPM, self).extract(pkg, pkg_info)
- bb.utils.remove(os.path.join(tmp_dir, "data.tar.xz"))
+ bb.utils.remove(os.path.join(tmp_dir, "data.tar.zst"))
return tmp_dir
diff --git a/poky/meta/lib/oe/packagedata.py b/poky/meta/lib/oe/packagedata.py
index 162ff60..2d1d6dd 100644
--- a/poky/meta/lib/oe/packagedata.py
+++ b/poky/meta/lib/oe/packagedata.py
@@ -116,6 +116,21 @@
return pkgmap(d).get(pkg)
+def foreach_runtime_provider_pkgdata(d, rdep, include_rdep=False):
+ pkgdata_dir = d.getVar("PKGDATA_DIR")
+ possibles = set()
+ try:
+ possibles |= set(os.listdir("%s/runtime-rprovides/%s/" % (pkgdata_dir, rdep)))
+ except OSError:
+ pass
+
+ if include_rdep:
+ possibles.add(rdep)
+
+ for p in sorted(list(possibles)):
+ rdep_data = read_subpkgdata(p, d)
+ yield p, rdep_data
+
def get_package_mapping(pkg, basepkg, d, depversions=None):
import oe.packagedata
@@ -317,7 +332,7 @@
for p in bb.utils.explode_deps(rprov):
subdata_sym = pkgdatadir + "/runtime-rprovides/%s/%s" % (p, pkg)
bb.utils.mkdirhier(os.path.dirname(subdata_sym))
- oe.path.symlink("../../runtime/%s" % pkg, subdata_sym, True)
+ oe.path.relsymlink(subdata_file, subdata_sym, True)
allow_empty = d.getVar('ALLOW_EMPTY:%s' % pkg)
if not allow_empty:
@@ -328,7 +343,7 @@
if g or allow_empty == "1":
# Symlinks needed for reverse lookups (from the final package name)
subdata_sym = pkgdatadir + "/runtime-reverse/%s" % pkgval
- oe.path.symlink("../runtime/%s" % pkg, subdata_sym, True)
+ oe.path.relsymlink(subdata_file, subdata_sym, True)
packagedfile = pkgdatadir + '/runtime/%s.packaged' % pkg
open(packagedfile, 'w').close()
diff --git a/poky/meta/lib/oe/patch.py b/poky/meta/lib/oe/patch.py
index e4bb5a7..d5ad4f3 100644
--- a/poky/meta/lib/oe/patch.py
+++ b/poky/meta/lib/oe/patch.py
@@ -478,7 +478,7 @@
patchlines = []
outfile = None
try:
- with open(srcfile, 'r', encoding=encoding) as f:
+ with open(srcfile, 'r', encoding=encoding, newline='') as f:
for line in f:
if line.startswith(GitApplyTree.patch_line_prefix):
outfile = line.split()[-1].strip()
diff --git a/poky/meta/lib/oe/path.py b/poky/meta/lib/oe/path.py
index e2f1913..5d21cdc 100644
--- a/poky/meta/lib/oe/path.py
+++ b/poky/meta/lib/oe/path.py
@@ -172,6 +172,9 @@
if e.errno != errno.EEXIST or os.readlink(destination) != source:
raise
+def relsymlink(target, name, force=False):
+ symlink(os.path.relpath(target, os.path.dirname(name)), name, force=force)
+
def find(dir, **walkoptions):
""" Given a directory, recurses into that directory,
returning all files as absolute paths. """
diff --git a/poky/meta/lib/oe/prservice.py b/poky/meta/lib/oe/prservice.py
index 2f2a0c1..c41242c 100644
--- a/poky/meta/lib/oe/prservice.py
+++ b/poky/meta/lib/oe/prservice.py
@@ -78,8 +78,7 @@
bb.utils.mkdirhier(d.getVar('PRSERV_DUMPDIR'))
df = d.getVar('PRSERV_DUMPFILE')
#write data
- lf = bb.utils.lockfile("%s.lock" % df)
- with open(df, "a") as f:
+ with open(df, "a") as f, bb.utils.fileslocked(["%s.lock" % df]) as locks:
if metainfo:
#dump column info
f.write("#PR_core_ver = \"%s\"\n\n" % metainfo['core_ver']);
@@ -113,7 +112,6 @@
if not nomax:
for i in idx:
f.write("PRAUTO_%s_%s = \"%s\"\n" % (str(datainfo[idx[i]]['version']),str(datainfo[idx[i]]['pkgarch']),str(datainfo[idx[i]]['value'])))
- bb.utils.unlockfile(lf)
def prserv_check_avail(d):
host_params = list([_f for _f in (d.getVar("PRSERV_HOST") or '').split(':') if _f])
diff --git a/poky/meta/lib/oe/recipeutils.py b/poky/meta/lib/oe/recipeutils.py
index 25b159b..de1fbdd 100644
--- a/poky/meta/lib/oe/recipeutils.py
+++ b/poky/meta/lib/oe/recipeutils.py
@@ -26,7 +26,7 @@
# Help us to find places to insert values
recipe_progression = ['SUMMARY', 'DESCRIPTION', 'HOMEPAGE', 'BUGTRACKER', 'SECTION', 'LICENSE', 'LICENSE_FLAGS', 'LIC_FILES_CHKSUM', 'PROVIDES', 'DEPENDS', 'PR', 'PV', 'SRCREV', 'SRC_URI', 'S', 'do_fetch()', 'do_unpack()', 'do_patch()', 'EXTRA_OECONF', 'EXTRA_OECMAKE', 'EXTRA_OESCONS', 'do_configure()', 'EXTRA_OEMAKE', 'do_compile()', 'do_install()', 'do_populate_sysroot()', 'INITSCRIPT', 'USERADD', 'GROUPADD', 'PACKAGES', 'FILES', 'RDEPENDS', 'RRECOMMENDS', 'RSUGGESTS', 'RPROVIDES', 'RREPLACES', 'RCONFLICTS', 'ALLOW_EMPTY', 'populate_packages()', 'do_package()', 'do_deploy()', 'BBCLASSEXTEND']
# Variables that sometimes are a bit long but shouldn't be wrapped
-nowrap_vars = ['SUMMARY', 'HOMEPAGE', 'BUGTRACKER', r'SRC_URI\[(.+\.)?md5sum\]', r'SRC_URI\[(.+\.)?sha256sum\]']
+nowrap_vars = ['SUMMARY', 'HOMEPAGE', 'BUGTRACKER', r'SRC_URI\[(.+\.)?md5sum\]', r'SRC_URI\[(.+\.)?sha[0-9]+sum\]']
list_vars = ['SRC_URI', 'LIC_FILES_CHKSUM']
meta_vars = ['SUMMARY', 'DESCRIPTION', 'HOMEPAGE', 'BUGTRACKER', 'SECTION']
@@ -664,17 +664,21 @@
return (appendpath, pathok)
-def bbappend_recipe(rd, destlayerdir, srcfiles, install=None, wildcardver=False, machine=None, extralines=None, removevalues=None, redirect_output=None, params=None):
+def bbappend_recipe(rd, destlayerdir, srcfiles, install=None, wildcardver=False, machine=None, extralines=None, removevalues=None, redirect_output=None, params=None, update_original_recipe=False):
"""
Writes a bbappend file for a recipe
Parameters:
rd: data dictionary for the recipe
destlayerdir: base directory of the layer to place the bbappend in
(subdirectory path from there will be determined automatically)
- srcfiles: dict of source files to add to SRC_URI, where the value
+ srcfiles: dict of source files to add to SRC_URI, where the key
is the full path to the file to be added, and the value is a
- dict with 'path' key containing the original filename as it
- would appear in SRC_URI or None if it isn't already present.
+ dict with following optional keys:
+ path: the original filename as it would appear in SRC_URI
+ or None if it isn't already present.
+ patchdir: the patchdir parameter
+ newname: the name to give to the new added file. None to use
+ the default value: basename(path)
You may pass None for this parameter if you simply want to specify
your own content via the extralines parameter.
install: dict mapping entries in srcfiles to a tuple of two elements:
@@ -697,18 +701,29 @@
params:
Parameters to use when adding entries to SRC_URI. If specified,
should be a list of dicts with the same length as srcfiles.
+ update_original_recipe:
+ Force to update the original recipe instead of creating/updating
+ a bbapend. destlayerdir must contain the original recipe
"""
if not removevalues:
removevalues = {}
- # Determine how the bbappend should be named
- appendpath, pathok = get_bbappend_path(rd, destlayerdir, wildcardver)
- if not appendpath:
- bb.error('Unable to determine layer directory containing %s' % recipefile)
- return (None, None)
- if not pathok:
- bb.warn('Unable to determine correct subdirectory path for bbappend file - check that what %s adds to BBFILES also matches .bbappend files. Using %s for now, but until you fix this the bbappend will not be applied.' % (os.path.join(destlayerdir, 'conf', 'layer.conf'), os.path.dirname(appendpath)))
+ recipefile = rd.getVar('FILE')
+ if update_original_recipe:
+ if destlayerdir not in recipefile:
+ bb.error("destlayerdir %s doesn't contain the original recipe (%s), cannot update it" % (destlayerdir, recipefile))
+ return (None, None)
+
+ appendpath = recipefile
+ else:
+ # Determine how the bbappend should be named
+ appendpath, pathok = get_bbappend_path(rd, destlayerdir, wildcardver)
+ if not appendpath:
+ bb.error('Unable to determine layer directory containing %s' % recipefile)
+ return (None, None)
+ if not pathok:
+ bb.warn('Unable to determine correct subdirectory path for bbappend file - check that what %s adds to BBFILES also matches .bbappend files. Using %s for now, but until you fix this the bbappend will not be applied.' % (os.path.join(destlayerdir, 'conf', 'layer.conf'), os.path.dirname(appendpath)))
appenddir = os.path.dirname(appendpath)
if not redirect_output:
@@ -753,7 +768,7 @@
bbappendlines.append((varname, op, value))
destsubdir = rd.getVar('PN')
- if srcfiles:
+ if not update_original_recipe and srcfiles:
bbappendlines.append(('FILESEXTRAPATHS:prepend', ':=', '${THISDIR}/${PN}:'))
appendoverride = ''
@@ -766,16 +781,30 @@
for i, (newfile, param) in enumerate(srcfiles.items()):
srcurientry = None
if not 'path' in param or not param['path']:
- srcfile = os.path.basename(newfile)
+ if 'newname' in param and param['newname']:
+ srcfile = param['newname']
+ else:
+ srcfile = os.path.basename(newfile)
srcurientry = 'file://%s' % srcfile
+ oldentry = None
+ for uri in rd.getVar('SRC_URI').split():
+ if srcurientry in uri:
+ oldentry = uri
if params and params[i]:
srcurientry = '%s;%s' % (srcurientry, ';'.join('%s=%s' % (k,v) for k,v in params[i].items()))
# Double-check it's not there already
# FIXME do we care if the entry is added by another bbappend that might go away?
if not srcurientry in rd.getVar('SRC_URI').split():
if machine:
+ if oldentry:
+ appendline('SRC_URI:remove%s' % appendoverride, '=', ' ' + oldentry)
appendline('SRC_URI:append%s' % appendoverride, '=', ' ' + srcurientry)
else:
+ if oldentry:
+ if update_original_recipe:
+ removevalues['SRC_URI'] = oldentry
+ else:
+ appendline('SRC_URI:remove', '=', oldentry)
appendline('SRC_URI', '+=', srcurientry)
param['path'] = srcfile
else:
@@ -800,6 +829,8 @@
# multiple times per operation when we're handling overrides)
if os.path.exists(appendpath) and not os.path.exists(outfile):
shutil.copy2(appendpath, outfile)
+ elif update_original_recipe:
+ outfile = recipefile
else:
bb.note('Writing append file %s' % appendpath)
outfile = appendpath
diff --git a/poky/meta/lib/oe/sstatesig.py b/poky/meta/lib/oe/sstatesig.py
index e250f51..1b4380f 100644
--- a/poky/meta/lib/oe/sstatesig.py
+++ b/poky/meta/lib/oe/sstatesig.py
@@ -349,7 +349,6 @@
pn, taskname = key.split(':', 1)
hashfiles = {}
- filedates = {}
def get_hashval(siginfo):
if siginfo.endswith('.siginfo'):
@@ -357,6 +356,9 @@
else:
return siginfo.rpartition('.')[2]
+ def get_time(fullpath):
+ return os.stat(fullpath).st_mtime
+
# First search in stamps dir
localdata = d.createCopy()
localdata.setVar('MULTIMACH_TARGET_SYS', '*')
@@ -372,24 +374,21 @@
filespec = '%s.%s.sigdata.*' % (stamp, taskname)
foundall = False
import glob
+ bb.debug(1, "Calling glob.glob on {}".format(filespec))
for fullpath in glob.glob(filespec):
match = False
if taskhashlist:
for taskhash in taskhashlist:
if fullpath.endswith('.%s' % taskhash):
- hashfiles[taskhash] = fullpath
+ hashfiles[taskhash] = {'path':fullpath, 'sstate':False, 'time':get_time(fullpath)}
if len(hashfiles) == len(taskhashlist):
foundall = True
break
else:
- try:
- filedates[fullpath] = os.stat(fullpath).st_mtime
- except OSError:
- continue
hashval = get_hashval(fullpath)
- hashfiles[hashval] = fullpath
+ hashfiles[hashval] = {'path':fullpath, 'sstate':False, 'time':get_time(fullpath)}
- if not taskhashlist or (len(filedates) < 2 and not foundall):
+ if not taskhashlist or (len(hashfiles) < 2 and not foundall):
# That didn't work, look in sstate-cache
hashes = taskhashlist or ['?' * 64]
localdata = bb.data.createCopy(d)
@@ -398,6 +397,9 @@
localdata.setVar('TARGET_VENDOR', '*')
localdata.setVar('TARGET_OS', '*')
localdata.setVar('PN', pn)
+ # gcc-source is a special case, same as with local stamps above
+ if pn.startswith("gcc-source"):
+ localdata.setVar('PN', "gcc")
localdata.setVar('PV', '*')
localdata.setVar('PR', '*')
localdata.setVar('BB_TASKHASH', hashval)
@@ -409,24 +411,18 @@
localdata.setVar('SSTATE_EXTRAPATH', "${NATIVELSBSTRING}/")
filespec = '%s.siginfo' % localdata.getVar('SSTATE_PKG')
+ bb.debug(1, "Calling glob.glob on {}".format(filespec))
matchedfiles = glob.glob(filespec)
for fullpath in matchedfiles:
actual_hashval = get_hashval(fullpath)
if actual_hashval in hashfiles:
continue
- hashfiles[hashval] = fullpath
- if not taskhashlist:
- try:
- filedates[fullpath] = os.stat(fullpath).st_mtime
- except:
- continue
+ hashfiles[actual_hashval] = {'path':fullpath, 'sstate':True, 'time':get_time(fullpath)}
- if taskhashlist:
- return hashfiles
- else:
- return filedates
+ return hashfiles
bb.siggen.find_siginfo = find_siginfo
+bb.siggen.find_siginfo_version = 2
def sstate_get_manifest_filename(task, d):
diff --git a/poky/meta/lib/oeqa/core/decorator/data.py b/poky/meta/lib/oeqa/core/decorator/data.py
index de881e0..5444b2c 100644
--- a/poky/meta/lib/oeqa/core/decorator/data.py
+++ b/poky/meta/lib/oeqa/core/decorator/data.py
@@ -186,6 +186,16 @@
self.case.skipTest('Test only runs on qemu machines')
@registerDecorator
+class skipIfNotQemuUsermode(OETestDecorator):
+ """
+ Skip test if MACHINE_FEATURES does not contain qemu-usermode
+ """
+ def setUpDecorator(self):
+ self.logger.debug("Checking if MACHINE_FEATURES does not contain qemu-usermode")
+ if 'qemu-usermode' not in self.case.td.get('MACHINE_FEATURES', '').split():
+ self.case.skipTest('Test requires qemu-usermode in MACHINE_FEATURES')
+
+@registerDecorator
class skipIfQemu(OETestDecorator):
"""
Skip test if MACHINE is qemu*
diff --git a/poky/meta/lib/oeqa/core/target/qemu.py b/poky/meta/lib/oeqa/core/target/qemu.py
index 6893d10..d93b3ac 100644
--- a/poky/meta/lib/oeqa/core/target/qemu.py
+++ b/poky/meta/lib/oeqa/core/target/qemu.py
@@ -14,8 +14,6 @@
from .ssh import OESSHTarget
from oeqa.utils.qemurunner import QemuRunner
-from oeqa.utils.dump import MonitorDumper
-from oeqa.utils.dump import TargetDumper
supported_fstypes = ['ext3', 'ext4', 'cpio.gz', 'wic']
@@ -47,14 +45,6 @@
use_kvm=kvm, use_slirp=slirp, dump_dir=dump_dir, logger=logger,
serial_ports=serial_ports, boot_patterns = boot_patterns,
use_ovmf=ovmf, tmpfsdir=tmpfsdir)
- dump_monitor_cmds = kwargs.get("testimage_dump_monitor")
- self.monitor_dumper = MonitorDumper(dump_monitor_cmds, dump_dir, self.runner)
- if self.monitor_dumper:
- self.monitor_dumper.create_dir("qmp")
-
- dump_target_cmds = kwargs.get("testimage_dump_target")
- self.target_dumper = TargetDumper(dump_target_cmds, dump_dir, self.runner)
- self.target_dumper.create_dir("qemu")
def start(self, params=None, extra_bootparams=None, runqemuparams=''):
if self.use_slirp and not self.server_ip:
diff --git a/poky/meta/lib/oeqa/core/target/ssh.py b/poky/meta/lib/oeqa/core/target/ssh.py
index f4dd0ca..09cdd14 100644
--- a/poky/meta/lib/oeqa/core/target/ssh.py
+++ b/poky/meta/lib/oeqa/core/target/ssh.py
@@ -48,8 +48,6 @@
if port:
self.ssh = self.ssh + [ '-p', port ]
self.scp = self.scp + [ '-P', port ]
- self._monitor_dumper = None
- self.target_dumper = None
def start(self, **kwargs):
pass
@@ -57,15 +55,6 @@
def stop(self, **kwargs):
pass
- @property
- def monitor_dumper(self):
- return self._monitor_dumper
-
- @monitor_dumper.setter
- def monitor_dumper(self, dumper):
- self._monitor_dumper = dumper
- self.monitor_dumper.dump_monitor()
-
def _run(self, command, timeout=None, ignore_status=True):
"""
Runs command in target using SSHProcess.
@@ -104,14 +93,7 @@
status, output = self._run(sshCmd, processTimeout, ignore_status)
self.logger.debug('Command: %s\nStatus: %d Output: %s\n' % (command, status, output))
- if (status == 255) and (('No route to host') in output):
- if self.monitor_dumper:
- self.monitor_dumper.dump_monitor()
- if status == 255:
- if self.target_dumper:
- self.target_dumper.dump_target()
- if self.monitor_dumper:
- self.monitor_dumper.dump_monitor()
+
return (status, output)
def copyTo(self, localSrc, remoteDst):
diff --git a/poky/meta/lib/oeqa/files/maturin/guessing-game/Cargo.toml b/poky/meta/lib/oeqa/files/maturin/guessing-game/Cargo.toml
new file mode 100644
index 0000000..de95025
--- /dev/null
+++ b/poky/meta/lib/oeqa/files/maturin/guessing-game/Cargo.toml
@@ -0,0 +1,20 @@
+[package]
+name = "guessing-game"
+version = "0.1.0"
+edition = "2021"
+
+# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
+
+[lib]
+name = "guessing_game"
+# "cdylib" is necessary to produce a shared library for Python to import from.
+crate-type = ["cdylib"]
+
+[dependencies]
+rand = "0.8.4"
+
+[dependencies.pyo3]
+version = "0.19.0"
+# "abi3-py38" tells pyo3 (and maturin) to build using the stable ABI with minimum Python version 3.8
+features = ["abi3-py38"]
+
diff --git a/poky/meta/lib/oeqa/files/maturin/guessing-game/LICENSE-APACHE b/poky/meta/lib/oeqa/files/maturin/guessing-game/LICENSE-APACHE
new file mode 100644
index 0000000..16fe87b
--- /dev/null
+++ b/poky/meta/lib/oeqa/files/maturin/guessing-game/LICENSE-APACHE
@@ -0,0 +1,201 @@
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+END OF TERMS AND CONDITIONS
+
+APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+Copyright [yyyy] [name of copyright owner]
+
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
diff --git a/poky/meta/lib/oeqa/files/maturin/guessing-game/LICENSE-MIT b/poky/meta/lib/oeqa/files/maturin/guessing-game/LICENSE-MIT
new file mode 100644
index 0000000..c4a9a58
--- /dev/null
+++ b/poky/meta/lib/oeqa/files/maturin/guessing-game/LICENSE-MIT
@@ -0,0 +1,25 @@
+Copyright (c) 2018 konstin
+
+Permission is hereby granted, free of charge, to any
+person obtaining a copy of this software and associated
+documentation files (the "Software"), to deal in the
+Software without restriction, including without
+limitation the rights to use, copy, modify, merge,
+publish, distribute, sublicense, and/or sell copies of
+the Software, and to permit persons to whom the Software
+is furnished to do so, subject to the following
+conditions:
+
+The above copyright notice and this permission notice
+shall be included in all copies or substantial portions
+of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
+ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
+TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
+PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
+SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
+IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+DEALINGS IN THE SOFTWARE.
diff --git a/poky/meta/lib/oeqa/files/maturin/guessing-game/pyproject.toml b/poky/meta/lib/oeqa/files/maturin/guessing-game/pyproject.toml
new file mode 100644
index 0000000..ff35abc
--- /dev/null
+++ b/poky/meta/lib/oeqa/files/maturin/guessing-game/pyproject.toml
@@ -0,0 +1,8 @@
+[build-system]
+requires = ["maturin>=1.0,<2.0"]
+build-backend = "maturin"
+
+[tool.maturin]
+# "extension-module" tells pyo3 we want to build an extension module (skips linking against libpython.so)
+features = ["pyo3/extension-module"]
+
diff --git a/poky/meta/lib/oeqa/files/maturin/guessing-game/src/lib.rs b/poky/meta/lib/oeqa/files/maturin/guessing-game/src/lib.rs
new file mode 100644
index 0000000..6828466
--- /dev/null
+++ b/poky/meta/lib/oeqa/files/maturin/guessing-game/src/lib.rs
@@ -0,0 +1,48 @@
+use pyo3::prelude::*;
+use rand::Rng;
+use std::cmp::Ordering;
+use std::io;
+
+#[pyfunction]
+fn guess_the_number() {
+ println!("Guess the number!");
+
+ let secret_number = rand::thread_rng().gen_range(1..101);
+
+ loop {
+ println!("Please input your guess.");
+
+ let mut guess = String::new();
+
+ io::stdin()
+ .read_line(&mut guess)
+ .expect("Failed to read line");
+
+ let guess: u32 = match guess.trim().parse() {
+ Ok(num) => num,
+ Err(_) => continue,
+ };
+
+ println!("You guessed: {}", guess);
+
+ match guess.cmp(&secret_number) {
+ Ordering::Less => println!("Too small!"),
+ Ordering::Greater => println!("Too big!"),
+ Ordering::Equal => {
+ println!("You win!");
+ break;
+ }
+ }
+ }
+}
+
+/// A Python module implemented in Rust. The name of this function must match
+/// the `lib.name` setting in the `Cargo.toml`, else Python will not be able to
+/// import the module.
+#[pymodule]
+fn guessing_game(_py: Python, m: &PyModule) -> PyResult<()> {
+ m.add_function(wrap_pyfunction!(guess_the_number, m)?)?;
+
+ Ok(())
+}
+
diff --git a/poky/meta/lib/oeqa/runtime/cases/maturin.py b/poky/meta/lib/oeqa/runtime/cases/maturin.py
new file mode 100644
index 0000000..4e6384f
--- /dev/null
+++ b/poky/meta/lib/oeqa/runtime/cases/maturin.py
@@ -0,0 +1,58 @@
+#
+# Copyright OpenEmbedded Contributors
+#
+# SPDX-License-Identifier: MIT
+#
+
+import os
+
+from oeqa.runtime.case import OERuntimeTestCase
+from oeqa.core.decorator.depends import OETestDepends
+from oeqa.runtime.decorator.package import OEHasPackage
+
+
+class MaturinTest(OERuntimeTestCase):
+ @OETestDepends(['ssh.SSHTest.test_ssh', 'python.PythonTest.test_python3'])
+ @OEHasPackage(['python3-maturin'])
+ def test_maturin_list_python(self):
+ status, output = self.target.run("maturin list-python")
+ self.assertEqual(status, 0)
+ _, py_major = self.target.run("python3 -c 'import sys; print(sys.version_info.major)'")
+ _, py_minor = self.target.run("python3 -c 'import sys; print(sys.version_info.minor)'")
+ python_version = "%s.%s" % (py_major, py_minor)
+ self.assertEqual(output, "🐍 1 python interpreter found:\n"
+ " - CPython %s at /usr/bin/python%s" % (python_version, python_version))
+
+
+class MaturinDevelopTest(OERuntimeTestCase):
+ @classmethod
+ def setUp(cls):
+ dst = '/tmp'
+ src = os.path.join(cls.tc.files_dir, "maturin/guessing-game")
+ cls.tc.target.copyTo(src, dst)
+
+ @classmethod
+ def tearDown(cls):
+ cls.tc.target.run('rm -rf %s' % '/tmp/guessing-game/target')
+
+ @OETestDepends(['ssh.SSHTest.test_ssh', 'python.PythonTest.test_python3'])
+ @OEHasPackage(['python3-maturin'])
+ def test_maturin_develop(self):
+ """
+ This test case requires:
+ (1) that a .venv can be created.
+ (2) a DNS nameserver to resolve crate URIs for fetching
+ (3) a functional 'rustc' and 'cargo'
+ """
+ targetdir = os.path.join("/tmp", "guessing-game")
+ self.target.run("cd %s; python3 -m venv .venv" % targetdir)
+ self.target.run("echo 'nameserver 8.8.8.8' > /etc/resolv.conf")
+ cmd = "cd %s; maturin develop" % targetdir
+ status, output = self.target.run(cmd)
+ self.assertRegex(output, r"🔗 Found pyo3 bindings with abi3 support for Python ≥ 3.8")
+ self.assertRegex(output, r"🐍 Not using a specific python interpreter")
+ self.assertRegex(output, r"📡 Using build options features from pyproject.toml")
+ self.assertRegex(output, r"Compiling guessing-game v0.1.0")
+ self.assertRegex(output, r"📦 Built wheel for abi3 Python ≥ 3.8")
+ self.assertRegex(output, r"🛠 Installed guessing-game-0.1.0")
+ self.assertEqual(status, 0)
diff --git a/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-common.txt b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-common.txt
new file mode 100644
index 0000000..f91abbc
--- /dev/null
+++ b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-common.txt
@@ -0,0 +1,62 @@
+# Xserver explains what the short codes mean
+(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
+
+# Xserver warns if compiled with ACPI but no acpid running
+Open ACPI failed (/var/run/acpid.socket) (No such file or directory)
+
+# Some machines (eg qemux86) don't enable PAE (they probably should though)
+NX (Execute Disable) protection cannot be enabled: non-PAE kernel!
+
+# Connman's pacrunner warns if external connectivity isn't available
+Failed to find URL:http://ipv4.connman.net/online/status.html
+Failed to find URL:http://ipv6.connman.net/online/status.html
+
+# x86 on 6.6+ outputs this message, it is informational, not an error
+ACPI: _OSC evaluation for CPUs failed, trying _PDC
+
+# These should be reviewed to see if they are still needed
+dma timeout
+can\'t add hid device:
+usbhid: probe of
+_OSC failed (AE_ERROR)
+_OSC failed (AE_SUPPORT)
+AE_ALREADY_EXISTS
+ACPI _OSC request failed (AE_SUPPORT)
+can\'t disable ASPM
+Failed to load module "vesa"
+Failed to load module "modesetting"
+Failed to load module "glx"
+Failed to load module "fbdev"
+Failed to load module "ati"
+[drm] Cannot find any crtc or sizes
+_OSC failed (AE_NOT_FOUND); disabling ASPM
+hd.: possibly failed opcode
+NETLINK INITIALIZATION FAILED
+kernel: Cannot find map file
+omap_hwmod: debugss: _wait_target_disable failed
+VGA arbiter: cannot open kernel arbiter, no multi-card support
+Online check failed for
+netlink init failed
+Fast TSC calibration
+controller can't do DEVSLP, turning off
+stmmac_dvr_probe: warning: cannot get CSR clock
+error: couldn\'t mount because of unsupported optional features
+GPT: Use GNU Parted to correct GPT errors
+Cannot set xattr user.Librepo.DownloadInProgress
+Failed to read /var/lib/nfs/statd/state: Success
+error retry time-out =
+logind: cannot setup systemd-logind helper (-61), using legacy fallback
+Failed to rename network interface
+Failed to process device, ignoring: Device or resource busy
+Cannot find a map file
+[rdrand]: Initialization Failed
+[rndr ]: Initialization Failed
+[pulseaudio] authkey.c: Failed to open cookie file
+[pulseaudio] authkey.c: Failed to load authentication key
+was skipped because of a failed condition check
+was skipped because all trigger condition checks failed
+xf86OpenConsole: Switching VT failed
+Failed to read LoaderConfigTimeoutOneShot variable, ignoring: Operation not supported
+Failed to read LoaderEntryOneShot variable, ignoring: Operation not supported
+Direct firmware load for regulatory.db
+failed to load regulatory.db
diff --git a/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-mipsarch.txt b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-mipsarch.txt
new file mode 100644
index 0000000..2c0bd9a
--- /dev/null
+++ b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-mipsarch.txt
@@ -0,0 +1,2 @@
+# These should be reviewed to see if they are still needed
+cacheinfo: Failed to find cpu0 device node
diff --git a/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuall.txt b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuall.txt
new file mode 100644
index 0000000..b0c0fc9
--- /dev/null
+++ b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuall.txt
@@ -0,0 +1,27 @@
+# psplash
+FBIOPUT_VSCREENINFO failed, double buffering disabled
+
+# PCI host bridge to bus 0000:00
+# pci_bus 0000:00: root bus resource [mem 0x10000000-0x17ffffff]
+# pci_bus 0000:00: root bus resource [io 0x1000-0x1fffff]
+# pci_bus 0000:00: No busn resource found for root bus, will use [bus 00-ff]
+# pci 0000:00:00.0: [2046:ab11] type 00 class 0x100000
+# pci 0000:00:00.0: [Firmware Bug]: reg 0x10: invalid BAR (can't size)
+# pci 0000:00:00.0: [Firmware Bug]: reg 0x14: invalid BAR (can't size)
+# pci 0000:00:00.0: [Firmware Bug]: reg 0x18: invalid BAR (can't size)
+# pci 0000:00:00.0: [Firmware Bug]: reg 0x1c: invalid BAR (can't size)
+# pci 0000:00:00.0: [Firmware Bug]: reg 0x20: invalid BAR (can't size)
+# pci 0000:00:00.0: [Firmware Bug]: reg 0x24: invalid BAR (can't size)
+invalid BAR (can't size)
+
+# These should be reviewed to see if they are still needed
+wrong ELF class
+fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge
+can't claim BAR
+amd_nb: Cannot enumerate AMD northbridges
+tsc: HPET/PMTIMER calibration failed
+modeset(0): Failed to initialize the DRI2 extension
+glamor initialization failed
+blk_update_request: I/O error, dev fd0, sector 0 op 0x0:(READ)
+floppy: error
+failed to IDENTIFY (I/O error, err_mask=0x4)
diff --git a/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuarm64.txt b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuarm64.txt
new file mode 100644
index 0000000..260cdde
--- /dev/null
+++ b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuarm64.txt
@@ -0,0 +1,6 @@
+# These should be reviewed to see if they are still needed
+Fatal server error:
+(EE) Server terminated with error (1). Closing log file.
+dmi: Firmware registration failed.
+irq: type mismatch, failed to map hwirq-27 for /intc
+logind: failed to get session seat
\ No newline at end of file
diff --git a/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuarmv5.txt b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuarmv5.txt
new file mode 100644
index 0000000..ed91107
--- /dev/null
+++ b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuarmv5.txt
@@ -0,0 +1,19 @@
+# Code 2 is JENT_ECOARSETIME: Timer too coarse for RNG.
+jitterentropy: Initialization failed with host not compliant with requirements: 2
+
+# These should be reviewed to see if they are still needed
+mmci-pl18x: probe of fpga:05 failed with error -22
+mmci-pl18x: probe of fpga:0b failed with error -22
+
+OF: amba_device_add() failed (-19) for /amba/smc@10100000
+OF: amba_device_add() failed (-19) for /amba/mpmc@10110000
+OF: amba_device_add() failed (-19) for /amba/sctl@101e0000
+OF: amba_device_add() failed (-19) for /amba/watchdog@101e1000
+OF: amba_device_add() failed (-19) for /amba/sci@101f0000
+OF: amba_device_add() failed (-19) for /amba/spi@101f4000
+OF: amba_device_add() failed (-19) for /amba/ssp@101f4000
+OF: amba_device_add() failed (-19) for /amba/fpga/sci@a000
+Failed to initialize '/amba/timer@101e3000': -22
+
+clcd-pl11x: probe of 10120000.display failed with error -2
+arm-charlcd 10008000.lcd: error -ENXIO: IRQ index 0 not found
diff --git a/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuppc.txt b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuppc.txt
new file mode 100644
index 0000000..d9b58b5
--- /dev/null
+++ b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuppc.txt
@@ -0,0 +1,6 @@
+# These should be reviewed to see if they are still needed
+PCI 0000:00 Cannot reserve Legacy IO [io 0x0000-0x0fff]
+host side 80-wire cable detection failed, limiting max speed
+mode "640x480" test failed
+can't handle BAR above 4GB
+Cannot reserve Legacy IO
\ No newline at end of file
diff --git a/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuppc64.txt b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuppc64.txt
new file mode 100644
index 0000000..b736a2a
--- /dev/null
+++ b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemuppc64.txt
@@ -0,0 +1,4 @@
+# These should be reviewed to see if they are still needed
+vio vio: uevent: failed to send synthetic uevent
+synth uevent: /devices/vio: failed to send uevent
+PCI 0000:00 Cannot reserve Legacy IO [io 0x10000-0x10fff]
\ No newline at end of file
diff --git a/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemux86.txt b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemux86.txt
new file mode 100644
index 0000000..ebb76f1
--- /dev/null
+++ b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-qemux86.txt
@@ -0,0 +1,2 @@
+# These should be reviewed to see if they are still needed
+Failed to access perfctr msr (MSR
diff --git a/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-x86.txt b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-x86.txt
new file mode 100644
index 0000000..5985247
--- /dev/null
+++ b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-x86.txt
@@ -0,0 +1,10 @@
+# These should be reviewed to see if they are still needed
+[drm:psb_do_init] *ERROR* Debug is
+wrong ELF class
+Could not enable PowerButton event
+probe of LNXPWRBN:00 failed with error -22
+pmd_set_huge: Cannot satisfy
+failed to setup card detect gpio
+amd_nb: Cannot enumerate AMD northbridges
+failed to retrieve link info, disabling eDP
+Direct firmware load for iwlwifi
diff --git a/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-x86_64.txt b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-x86_64.txt
new file mode 120000
index 0000000..404e384
--- /dev/null
+++ b/poky/meta/lib/oeqa/runtime/cases/parselogs-ignores-x86_64.txt
@@ -0,0 +1 @@
+parselogs-ignores-x86.txt
\ No newline at end of file
diff --git a/poky/meta/lib/oeqa/runtime/cases/parselogs.py b/poky/meta/lib/oeqa/runtime/cases/parselogs.py
index cddb846..6966923 100644
--- a/poky/meta/lib/oeqa/runtime/cases/parselogs.py
+++ b/poky/meta/lib/oeqa/runtime/cases/parselogs.py
@@ -6,189 +6,27 @@
import collections
import os
+import sys
from shutil import rmtree
from oeqa.runtime.case import OERuntimeTestCase
from oeqa.core.decorator.depends import OETestDepends
-common_errors = [
- "(WW) warning, (EE) error, (NI) not implemented, (??) unknown.",
- "dma timeout",
- "can\'t add hid device:",
- "usbhid: probe of ",
- "_OSC failed (AE_ERROR)",
- "_OSC failed (AE_SUPPORT)",
- "AE_ALREADY_EXISTS",
- "ACPI _OSC request failed (AE_SUPPORT)",
- "can\'t disable ASPM",
- "Failed to load module \"vesa\"",
- "Failed to load module vesa",
- "Failed to load module \"modesetting\"",
- "Failed to load module modesetting",
- "Failed to load module \"glx\"",
- "Failed to load module \"fbdev\"",
- "Failed to load module fbdev",
- "Failed to load module glx",
- "[drm] Cannot find any crtc or sizes",
- "_OSC failed (AE_NOT_FOUND); disabling ASPM",
- "Open ACPI failed (/var/run/acpid.socket) (No such file or directory)",
- "NX (Execute Disable) protection cannot be enabled: non-PAE kernel!",
- "hd.: possibly failed opcode",
- 'NETLINK INITIALIZATION FAILED',
- 'kernel: Cannot find map file',
- 'omap_hwmod: debugss: _wait_target_disable failed',
- 'VGA arbiter: cannot open kernel arbiter, no multi-card support',
- 'Failed to find URL:http://ipv4.connman.net/online/status.html',
- 'Online check failed for',
- 'netlink init failed',
- 'Fast TSC calibration',
- "BAR 0-9",
- "Failed to load module \"ati\"",
- "controller can't do DEVSLP, turning off",
- "stmmac_dvr_probe: warning: cannot get CSR clock",
- "error: couldn\'t mount because of unsupported optional features",
- "GPT: Use GNU Parted to correct GPT errors",
- "Cannot set xattr user.Librepo.DownloadInProgress",
- "Failed to read /var/lib/nfs/statd/state: Success",
- "error retry time-out =",
- "logind: cannot setup systemd-logind helper (-61), using legacy fallback",
- "Failed to rename network interface",
- "Failed to process device, ignoring: Device or resource busy",
- "Cannot find a map file",
- "[rdrand]: Initialization Failed",
- "[rndr ]: Initialization Failed",
- "[pulseaudio] authkey.c: Failed to open cookie file",
- "[pulseaudio] authkey.c: Failed to load authentication key",
- "was skipped because of a failed condition check",
- "was skipped because all trigger condition checks failed",
- "xf86OpenConsole: Switching VT failed",
- "Failed to read LoaderConfigTimeoutOneShot variable, ignoring: Operation not supported",
- "Failed to read LoaderEntryOneShot variable, ignoring: Operation not supported",
- "invalid BAR (can't size)",
- ]
+# importlib.resources.open_text in Python <3.10 doesn't search all directories
+# when a package is split across multiple directories. Until we can rely on
+# 3.10+, reimplement the searching logic.
+if sys.version_info < (3, 10):
+ def _open_text(package, resource):
+ import importlib, pathlib
+ module = importlib.import_module(package)
+ for path in module.__path__:
+ candidate = pathlib.Path(path) / resource
+ if candidate.exists():
+ return candidate.open(encoding='utf-8')
+ raise FileNotFoundError
+else:
+ from importlib.resources import open_text as _open_text
-x86_common = [
- '[drm:psb_do_init] *ERROR* Debug is',
- 'wrong ELF class',
- 'Could not enable PowerButton event',
- 'probe of LNXPWRBN:00 failed with error -22',
- 'pmd_set_huge: Cannot satisfy',
- 'failed to setup card detect gpio',
- 'amd_nb: Cannot enumerate AMD northbridges',
- 'failed to retrieve link info, disabling eDP',
- 'Direct firmware load for iwlwifi',
- 'Direct firmware load for regulatory.db',
- 'failed to load regulatory.db',
-] + common_errors
-
-qemux86_common = [
- 'wrong ELF class',
- "fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.",
- "can't claim BAR ",
- 'amd_nb: Cannot enumerate AMD northbridges',
- 'tsc: HPET/PMTIMER calibration failed',
- "modeset(0): Failed to initialize the DRI2 extension",
- "glamor initialization failed",
- "blk_update_request: I/O error, dev fd0, sector 0 op 0x0:(READ)",
- "floppy: error",
- 'failed to IDENTIFY (I/O error, err_mask=0x4)',
-] + common_errors
-
-ignore_errors = {
- 'default' : common_errors,
- 'qemux86' : [
- 'Failed to access perfctr msr (MSR',
- ] + qemux86_common,
- 'qemux86-64' : qemux86_common,
- 'qemumips' : [
- 'Failed to load module "glx"',
- 'cacheinfo: Failed to find cpu0 device node',
- ] + common_errors,
- 'qemumips64' : [
- 'cacheinfo: Failed to find cpu0 device node',
- ] + common_errors,
- 'qemuppc' : [
- 'PCI 0000:00 Cannot reserve Legacy IO [io 0x0000-0x0fff]',
- 'host side 80-wire cable detection failed, limiting max speed',
- 'mode "640x480" test failed',
- 'Failed to load module "glx"',
- 'can\'t handle BAR above 4GB',
- 'Cannot reserve Legacy IO',
- ] + common_errors,
- 'qemuppc64' : [
- 'vio vio: uevent: failed to send synthetic uevent',
- 'synth uevent: /devices/vio: failed to send uevent',
- 'PCI 0000:00 Cannot reserve Legacy IO [io 0x10000-0x10fff]',
- ] + common_errors,
- 'qemuarmv5' : [
- 'mmci-pl18x: probe of fpga:05 failed with error -22',
- 'mmci-pl18x: probe of fpga:0b failed with error -22',
- 'Failed to load module "glx"',
- 'OF: amba_device_add() failed (-19) for /amba/smc@10100000',
- 'OF: amba_device_add() failed (-19) for /amba/mpmc@10110000',
- 'OF: amba_device_add() failed (-19) for /amba/sctl@101e0000',
- 'OF: amba_device_add() failed (-19) for /amba/watchdog@101e1000',
- 'OF: amba_device_add() failed (-19) for /amba/sci@101f0000',
- 'OF: amba_device_add() failed (-19) for /amba/spi@101f4000',
- 'OF: amba_device_add() failed (-19) for /amba/ssp@101f4000',
- 'OF: amba_device_add() failed (-19) for /amba/fpga/sci@a000',
- 'Failed to initialize \'/amba/timer@101e3000\': -22',
- 'jitterentropy: Initialization failed with host not compliant with requirements: 2',
- 'clcd-pl11x: probe of 10120000.display failed with error -2',
- 'arm-charlcd 10008000.lcd: error -ENXIO: IRQ index 0 not found'
- ] + common_errors,
- 'qemuarm64' : [
- 'Fatal server error:',
- '(EE) Server terminated with error (1). Closing log file.',
- 'dmi: Firmware registration failed.',
- 'irq: type mismatch, failed to map hwirq-27 for /intc',
- 'logind: failed to get session seat',
- ] + common_errors,
- 'intel-core2-32' : [
- 'ACPI: No _BQC method, cannot determine initial brightness',
- '[Firmware Bug]: ACPI: No _BQC method, cannot determine initial brightness',
- '(EE) Failed to load module "psb"',
- '(EE) Failed to load module psb',
- '(EE) Failed to load module "psbdrv"',
- '(EE) Failed to load module psbdrv',
- '(EE) open /dev/fb0: No such file or directory',
- '(EE) AIGLX: reverting to software rendering',
- 'dmi: Firmware registration failed.',
- 'ioremap error for 0x78',
- ] + x86_common,
- 'intel-corei7-64' : [
- 'can\'t set Max Payload Size to 256',
- 'intel_punit_ipc: can\'t request region for resource',
- '[drm] parse error at position 4 in video mode \'efifb\'',
- 'ACPI Error: Could not enable RealTimeClock event',
- 'ACPI Warning: Could not enable fixed event - RealTimeClock',
- 'hci_intel INT33E1:00: Unable to retrieve gpio',
- 'hci_intel: probe of INT33E1:00 failed',
- 'can\'t derive routing for PCI INT A',
- 'failed to read out thermal zone',
- 'Bluetooth: hci0: Setting Intel event mask failed',
- 'ttyS2 - failed to request DMA',
- 'Bluetooth: hci0: Failed to send firmware data (-38)',
- 'atkbd serio0: Failed to enable keyboard on isa0060/serio0',
- ] + x86_common,
- 'genericx86' : x86_common,
- 'genericx86-64' : [
- 'Direct firmware load for i915',
- 'Failed to load firmware i915',
- 'Failed to fetch GuC',
- 'Failed to initialize GuC',
- 'Failed to load DMC firmware',
- 'The driver is built-in, so to load the firmware you need to',
- ] + x86_common,
- 'beaglebone-yocto' : [
- 'Direct firmware load for regulatory.db',
- 'failed to load regulatory.db',
- 'l4_wkup_cm',
- 'Failed to load module "glx"',
- 'Failed to make EGL context current',
- 'glamor initialization failed',
- ] + common_errors,
-}
class ParseLogsTest(OERuntimeTestCase):
@@ -198,6 +36,9 @@
# The keywords that identify error messages in the log files
errors = ["error", "cannot", "can't", "failed"]
+ # A list of error messages that should be ignored
+ ignore_errors = []
+
@classmethod
def setUpClass(cls):
# When systemd is enabled we need to notice errors on
@@ -212,11 +53,20 @@
cls.errors = [s.casefold() for s in cls.errors]
- try:
- cls.ignore_errors = [s.casefold() for s in ignore_errors[cls.td.get('MACHINE')]]
- except KeyError:
- cls.logger.info('No ignore list found for this machine, using default')
- cls.ignore_errors = [s.casefold() for s in ignore_errors['default']]
+ cls.load_machine_ignores()
+
+ @classmethod
+ def load_machine_ignores(cls):
+ # Add TARGET_ARCH explicitly as not every machine has it in MACHINEOVERRIDES (e.g. qemux86-64)
+ for candidate in ["common", cls.td.get("TARGET_ARCH")] + cls.td.get("MACHINEOVERRIDES").split(":"):
+ try:
+ name = f"parselogs-ignores-{candidate}.txt"
+ for line in _open_text("oeqa.runtime.cases", name):
+ line = line.strip()
+ if line and not line.startswith("#"):
+ cls.ignore_errors.append(line.casefold())
+ except FileNotFoundError:
+ pass
# Go through the log locations provided and if it's a folder
# create a list with all the .log files in it, if it's a file
diff --git a/poky/meta/lib/oeqa/runtime/cases/systemd.py b/poky/meta/lib/oeqa/runtime/cases/systemd.py
index 37f2954..2865887 100644
--- a/poky/meta/lib/oeqa/runtime/cases/systemd.py
+++ b/poky/meta/lib/oeqa/runtime/cases/systemd.py
@@ -5,6 +5,7 @@
#
import re
+import threading
import time
from oeqa.runtime.case import OERuntimeTestCase
@@ -136,6 +137,27 @@
status = self.target.run('mount -oro,remount /')[0]
self.assertTrue(status == 0, msg='Remounting / as r/o failed')
+ @OETestDepends(['systemd.SystemdBasicTests.test_systemd_basic'])
+ @skipIfNotFeature('minidebuginfo', 'Test requires minidebuginfo to be in DISTRO_FEATURES')
+ @OEHasPackage(['busybox'])
+ def test_systemd_coredump_minidebuginfo(self):
+ """
+ Verify that call-stacks generated by systemd-coredump contain symbolicated call-stacks,
+ extracted from the minidebuginfo metadata (.gnu_debugdata elf section).
+ """
+ t_thread = threading.Thread(target=self.target.run, args=("ulimit -c unlimited && sleep 1000",))
+ t_thread.start()
+ time.sleep(1)
+
+ status, output = self.target.run('pidof sleep')
+ # cause segfault on purpose
+ self.target.run('kill -SEGV %s' % output)
+ self.assertEqual(status, 0, msg = 'Not able to find process that runs sleep, output : %s' % output)
+
+ (status, output) = self.target.run('coredumpctl info')
+ self.assertEqual(status, 0, msg='MiniDebugInfo Test failed: %s' % output)
+ self.assertEqual('sleep_for_duration (busybox.nosuid' in output, True, msg='Call stack is missing minidebuginfo symbols (functions shown as "n/a"): %s' % output)
+
class SystemdJournalTests(SystemdTest):
@OETestDepends(['systemd.SystemdBasicTests.test_systemd_basic'])
diff --git a/poky/meta/lib/oeqa/runtime/decorator/package.py b/poky/meta/lib/oeqa/runtime/decorator/package.py
index 8aba3f3..b78ac9f 100644
--- a/poky/meta/lib/oeqa/runtime/decorator/package.py
+++ b/poky/meta/lib/oeqa/runtime/decorator/package.py
@@ -38,11 +38,12 @@
if isinstance(self.need_pkgs, str):
self.need_pkgs = [self.need_pkgs,]
+ mlprefix = self.case.td.get("MLPREFIX")
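+ # Prefix package names with the multilib prefix (MLPREFIX) so the checks also match multilib package names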
for pkg in self.need_pkgs:
if pkg.startswith('!'):
- unneed_pkgs.add(pkg[1:])
+ unneed_pkgs.add(mlprefix + pkg[1:])
else:
- need_pkgs.add(pkg)
+ need_pkgs.add(mlprefix + pkg)
if unneed_pkgs:
msg = 'Checking if %s is not installed' % ', '.join(unneed_pkgs)
diff --git a/poky/meta/lib/oeqa/sdk/cases/maturin.py b/poky/meta/lib/oeqa/sdk/cases/maturin.py
new file mode 100644
index 0000000..ea10f56
--- /dev/null
+++ b/poky/meta/lib/oeqa/sdk/cases/maturin.py
@@ -0,0 +1,79 @@
+#
+# Copyright OpenEmbedded Contributors
+#
+# SPDX-License-Identifier: MIT
+#
+
+import os
+import shutil
+import unittest
+
+from oeqa.core.utils.path import remove_safe
+from oeqa.sdk.case import OESDKTestCase
+from oeqa.utils.subprocesstweak import errors_have_output
+
+errors_have_output()
+
+
+class MaturinTest(OESDKTestCase):
+ def setUp(self):
+ if not (
+ self.tc.hasHostPackage("nativesdk-python3-maturin")
+ or self.tc.hasHostPackage("python3-maturin-native")
+ ):
+ raise unittest.SkipTest("No python3-maturin package in the SDK")
+
+ def test_maturin_list_python(self):
+ py_major = self._run("python3 -c 'import sys; print(sys.version_info.major)'")
+ py_minor = self._run("python3 -c 'import sys; print(sys.version_info.minor)'")
+ python_version = "%s.%s" % (py_major.strip(), py_minor.strip())
+ cmd = "maturin list-python"
+ output = self._run(cmd)
+ self.assertRegex(output, r"^🐍 1 python interpreter found:\n")
+ self.assertRegex(
+ output,
+ r" - CPython %s (.+)/usr/bin/python%s$" % (python_version, python_version),
+ )
+
+
+class MaturinDevelopTest(OESDKTestCase):
+ @classmethod
+ def setUpClass(self):
+ targetdir = os.path.join(self.tc.sdk_dir, "guessing-game")
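+ # Remove any stale copy left over from a previous run before copying the example project into the SDK directory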
+ try:
+ shutil.rmtree(targetdir)
+ except FileNotFoundError:
+ pass
+ shutil.copytree(
+ os.path.join(self.tc.files_dir, "maturin/guessing-game"), targetdir
+ )
+
+ def setUp(self):
+ machine = self.td.get("MACHINE")
+ if not (
+ self.tc.hasHostPackage("nativesdk-python3-maturin")
+ or self.tc.hasHostPackage("python3-maturin-native")
+ ):
+ raise unittest.SkipTest("No python3-maturin package in the SDK")
+ if not (
+ self.tc.hasHostPackage("packagegroup-rust-cross-canadian-%s" % machine)
+ ):
+ raise unittest.SkipTest(
+ "Testing 'maturin develop' requires Rust cross-canadian in the SDK"
+ )
+
+ def test_maturin_develop(self):
+ """
+ This test case requires:
+ (1) that a .venv can be created.
+ (2) a functional 'rustc' and 'cargo'
+ """
+ self._run("cd %s/guessing-game; python3 -m venv .venv" % self.tc.sdk_dir)
+ cmd = "cd %s/guessing-game; maturin develop" % self.tc.sdk_dir
+ output = self._run(cmd)
+ self.assertRegex(output, r"🔗 Found pyo3 bindings with abi3 support for Python ≥ 3.8")
+ self.assertRegex(output, r"🐍 Not using a specific python interpreter")
+ self.assertRegex(output, r"📡 Using build options features from pyproject.toml")
+ self.assertRegex(output, r"Compiling guessing-game v0.1.0")
+ self.assertRegex(output, r"📦 Built wheel for abi3 Python ≥ 3.8")
+ self.assertRegex(output, r"🛠 Installed guessing-game-0.1.0")
diff --git a/poky/meta/lib/oeqa/selftest/cases/bbtests.py b/poky/meta/lib/oeqa/selftest/cases/bbtests.py
index d242352..0da59e0 100644
--- a/poky/meta/lib/oeqa/selftest/cases/bbtests.py
+++ b/poky/meta/lib/oeqa/selftest/cases/bbtests.py
@@ -362,3 +362,14 @@
result = bitbake('gitunpackoffline-fail -c fetch', ignore_status=True)
self.assertTrue(re.search("Recipe uses a floating tag/branch .* for repo .* without a fixed SRCREV yet doesn't call bb.fetch2.get_srcrev()", result.output), msg = "Recipe without PV set to SRCPV should have failed: %s" % result.output)
+
+ def test_unexpanded_variable_in_path(self):
+ """
+ Test that bitbake fails if a directory name contains an unexpanded bitbake variable
+ """
+ recipe_name = "gitunpackoffline"
+ self.write_config('PV:pn-gitunpackoffline:append = "+${UNDEFVAL}"')
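+ # UNDEFVAL is never defined, so the literal '${UNDEFVAL}' remains in PV and therefore in the work directory path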
+ result = bitbake('{}'.format(recipe_name), ignore_status=True)
+ self.assertGreater(result.status, 0, "Build should have failed if ${ is in the path")
+ self.assertTrue(re.search("ERROR: Directory name /.* contains unexpanded bitbake variable. This may cause build failures and WORKDIR polution",
+ result.output), msg = "mkdirhier with unexpanded variable should have failed: %s" % result.output)
diff --git a/poky/meta/lib/oeqa/selftest/cases/c_cpp.py b/poky/meta/lib/oeqa/selftest/cases/c_cpp.py
new file mode 100644
index 0000000..9a70ce2
--- /dev/null
+++ b/poky/meta/lib/oeqa/selftest/cases/c_cpp.py
@@ -0,0 +1,60 @@
+#
+# Copyright OpenEmbedded Contributors
+#
+# SPDX-License-Identifier: MIT
+#
+
+from oeqa.selftest.case import OESelftestTestCase
+from oeqa.core.decorator.data import skipIfNotQemuUsermode
+from oeqa.utils.commands import bitbake
+
+
+class CCppTests(OESelftestTestCase):
+
+ @skipIfNotQemuUsermode()
+ def _qemu_usermode(self, recipe_name):
+ self.add_command_to_tearDown("bitbake -c clean %s" % recipe_name)
+ bitbake("%s -c run_tests" % recipe_name)
+
+ @skipIfNotQemuUsermode()
+ def _qemu_usermode_failing(self, recipe_name):
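+ # Enable the recipe's "failing_test" PACKAGECONFIG so its test suite is expected to fail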
+ config = 'PACKAGECONFIG:pn-%s = "failing_test"' % recipe_name
+ self.write_config(config)
+ self.add_command_to_tearDown("bitbake -c clean %s" % recipe_name)
+ result = bitbake("%s -c run_tests" % recipe_name, ignore_status=True)
+ self.assertNotEqual(0, result.status, "command: %s is expected to fail but passed, status: %s, output: %s, error: %s" % (
+ result.command, result.status, result.output, result.error))
+
+
+class CMakeTests(CCppTests):
+ def test_cmake_qemu(self):
+ """Test for cmake-qemu.bbclass good case
+
+ compile the cmake-example and verify the CTests pass in qemu-user.
+ qemu-user is configured by CMAKE_CROSSCOMPILING_EMULATOR.
+ """
+ self._qemu_usermode("cmake-example")
+
+ def test_cmake_qemu_failing(self):
+ """Test for cmake-qemu.bbclass bad case
+
+ Break the comparison in the test code and verify the CTests do not pass.
+ """
+ self._qemu_usermode_failing("cmake-example")
+
+
+class MesonTests(CCppTests):
+ def test_meson_qemu(self):
+ """Test the qemu-user feature of the meson.bbclass good case
+
+ compile the meson-example and verify the Unit Test pass in qemu-user.
+ qemu-user is configured by meson's exe_wrapper option.
+ """
+ self._qemu_usermode("meson-example")
+
+ def test_meson_qemu_failing(self):
+ """Test the qemu-user feature of the meson.bbclass bad case
+
+ Break the comparison in the test code and verify the Unit Test does not pass in qemu-user.
+ """
+ self._qemu_usermode_failing("meson-example")
diff --git a/poky/meta/lib/oeqa/selftest/cases/devtool.py b/poky/meta/lib/oeqa/selftest/cases/devtool.py
index 2a11886..a877720 100644
--- a/poky/meta/lib/oeqa/selftest/cases/devtool.py
+++ b/poky/meta/lib/oeqa/selftest/cases/devtool.py
@@ -4,6 +4,7 @@
# SPDX-License-Identifier: MIT
#
+import errno
import os
import re
import shutil
@@ -54,7 +55,7 @@
result = runCmd('git rev-parse --show-toplevel', cwd=canonical_layerpath)
oldreporoot = result.output.rstrip()
newmetapath = os.path.join(corecopydir, os.path.relpath(oldmetapath, oldreporoot))
- runCmd('git clone %s %s' % (oldreporoot, corecopydir), cwd=templayerdir)
+ runCmd('git clone file://%s %s' % (oldreporoot, corecopydir), cwd=templayerdir)
# Now we need to copy any modified files
# You might ask "why not just copy the entire tree instead of
# cloning and doing this?" - well, the problem with that is
@@ -543,7 +544,7 @@
self.track_for_cleanup(self.workspacedir)
self.add_command_to_tearDown('bitbake -c cleansstate %s' % testrecipe)
self.add_command_to_tearDown('bitbake-layers remove-layer */workspace')
- result = runCmd('devtool add %s %s -f %s' % (testrecipe, srcdir, url))
+ result = runCmd('devtool add --no-pypi %s %s -f %s' % (testrecipe, srcdir, url))
self.assertExists(os.path.join(self.workspacedir, 'conf', 'layer.conf'), 'Workspace directory not created. %s' % result.output)
self.assertTrue(os.path.isfile(os.path.join(srcdir, 'setup.py')), 'Unable to find setup.py in source directory')
self.assertTrue(os.path.isdir(os.path.join(srcdir, '.git')), 'git repository for external source tree was not created')
@@ -562,7 +563,7 @@
result = runCmd('devtool reset -n %s' % testrecipe)
shutil.rmtree(srcdir)
fakever = '1.9'
- result = runCmd('devtool add %s %s -f %s -V %s' % (testrecipe, srcdir, url, fakever))
+ result = runCmd('devtool add --no-pypi %s %s -f %s -V %s' % (testrecipe, srcdir, url, fakever))
self.assertTrue(os.path.isfile(os.path.join(srcdir, 'setup.py')), 'Unable to find setup.py in source directory')
# Test devtool status
result = runCmd('devtool status')
@@ -927,7 +928,7 @@
# some crate:// in SRC_URI
# others git:// in SRC_URI
# cointains a patch
- testrecipe = 'zvariant'
+ testrecipe = 'hello-rs'
bb_vars = get_bb_vars(['SRC_URI', 'FILE', 'WORKDIR', 'CARGO_HOME'], testrecipe)
recipefile = bb_vars['FILE']
workdir = bb_vars['WORKDIR']
@@ -939,7 +940,7 @@
'This test expects the %s recipe to have a git uri with subpath' % testrecipe)
self.assertTrue(any([uri.startswith('crate://') for uri in src_uri]),
'This test expects the %s recipe to have some crates in its src uris' % testrecipe)
- self.assertGreater(sum(map(lambda x:x.startswith('git://'), src_uri)), 2,
+ self.assertGreaterEqual(sum(map(lambda x:x.startswith('git://'), src_uri)), 2,
'This test expects the %s recipe to have several git:// uris' % testrecipe)
self.assertTrue(any([uri.startswith('file://') and '.patch' in uri for uri in src_uri]),
'This test expects the %s recipe to have a patch in its src uris' % testrecipe)
@@ -957,7 +958,7 @@
result = runCmd('devtool modify %s -x %s' % (testrecipe, tempdir))
self.assertExists(os.path.join(tempdir, 'Cargo.toml'), 'Extracted source could not be found')
self.assertExists(os.path.join(self.workspacedir, 'conf', 'layer.conf'), 'Workspace directory not created. devtool output: %s' % result.output)
- matches = glob.glob(os.path.join(self.workspacedir, 'appends', 'zvariant_*.bbappend'))
+ matches = glob.glob(os.path.join(self.workspacedir, 'appends', '%s_*.bbappend' % testrecipe))
self.assertTrue(matches, 'bbappend not created')
# Test devtool status
result = runCmd('devtool status')
@@ -1495,7 +1496,7 @@
# Modify one file
srctree = os.path.join(self.workspacedir, 'sources', testrecipe)
runCmd('echo "Another line" >> README', cwd=srctree)
- runCmd('git commit -a --amend --no-edit', cwd=srctree)
+ runCmd('git commit -a --amend --no-edit --no-verify', cwd=srctree)
self.add_command_to_tearDown('cd %s; rm %s/*; git checkout %s %s' % (os.path.dirname(recipefile), testrecipe, testrecipe, os.path.basename(recipefile)))
result = runCmd('devtool update-recipe %s' % testrecipe)
expected_status = [(' M', '.*/%s/readme.patch.gz$' % testrecipe)]
@@ -1882,6 +1883,54 @@
self.assertNotIn(recipe, result.output)
self.assertNotExists(os.path.join(self.workspacedir, 'recipes', recipe), 'Recipe directory should not exist after resetting')
+ def test_devtool_upgrade_drop_md5sum(self):
+ # Check preconditions
+ self.assertTrue(not os.path.exists(self.workspacedir), 'This test cannot be run with a workspace directory under the build directory')
+ self.track_for_cleanup(self.workspacedir)
+ self.add_command_to_tearDown('bitbake-layers remove-layer */workspace')
+ # For the moment, we are using a real recipe.
+ recipe = 'devtool-upgrade-test3'
+ version = '1.6.0'
+ oldrecipefile = get_bb_var('FILE', recipe)
+ tempdir = tempfile.mkdtemp(prefix='devtoolqa')
+ self.track_for_cleanup(tempdir)
+ # Check the upgrade. The code does not check whether the new PV is older or newer than the current PV,
+ # so it may be that we are downgrading instead of upgrading.
+ result = runCmd('devtool upgrade %s %s -V %s' % (recipe, tempdir, version))
+ # Check new recipe file is present
+ newrecipefile = os.path.join(self.workspacedir, 'recipes', recipe, '%s_%s.bb' % (recipe, version))
+ self.assertExists(newrecipefile, 'Recipe file should exist after upgrade')
+ # Check recipe got changed as expected
+ with open(oldrecipefile + '.upgraded', 'r') as f:
+ desiredlines = f.readlines()
+ with open(newrecipefile, 'r') as f:
+ newlines = f.readlines()
+ self.assertEqual(desiredlines, newlines)
+
+ def test_devtool_upgrade_all_checksums(self):
+ # Check preconditions
+ self.assertTrue(not os.path.exists(self.workspacedir), 'This test cannot be run with a workspace directory under the build directory')
+ self.track_for_cleanup(self.workspacedir)
+ self.add_command_to_tearDown('bitbake-layers remove-layer */workspace')
+ # For the moment, we are using a real recipe.
+ recipe = 'devtool-upgrade-test4'
+ version = '1.6.0'
+ oldrecipefile = get_bb_var('FILE', recipe)
+ tempdir = tempfile.mkdtemp(prefix='devtoolqa')
+ self.track_for_cleanup(tempdir)
+ # Check the upgrade. The code does not check whether the new PV is older or newer than the current PV,
+ # so it may be that we are downgrading instead of upgrading.
+ result = runCmd('devtool upgrade %s %s -V %s' % (recipe, tempdir, version))
+ # Check new recipe file is present
+ newrecipefile = os.path.join(self.workspacedir, 'recipes', recipe, '%s_%s.bb' % (recipe, version))
+ self.assertExists(newrecipefile, 'Recipe file should exist after upgrade')
+ # Check recipe got changed as expected
+ with open(oldrecipefile + '.upgraded', 'r') as f:
+ desiredlines = f.readlines()
+ with open(newrecipefile, 'r') as f:
+ newlines = f.readlines()
+ self.assertEqual(desiredlines, newlines)
+
def test_devtool_layer_plugins(self):
"""Test that devtool can use plugins from other layers.
@@ -1900,7 +1949,15 @@
for p in paths:
dstdir = os.path.join(dstdir, p)
if not os.path.exists(dstdir):
- os.makedirs(dstdir)
+ try:
+ os.makedirs(dstdir)
+ except PermissionError:
+ return False
+ except OSError as e:
+ if e.errno == errno.EROFS:
+ return False
+ else:
+ raise e
if p == "lib":
# Can race with other tests
self.add_command_to_tearDown('rmdir --ignore-fail-on-non-empty %s' % dstdir)
@@ -1908,8 +1965,12 @@
self.track_for_cleanup(dstdir)
dstfile = os.path.join(dstdir, os.path.basename(srcfile))
if srcfile != dstfile:
- shutil.copy(srcfile, dstfile)
+ try:
+ shutil.copy(srcfile, dstfile)
+ except PermissionError:
+ return False
self.track_for_cleanup(dstfile)
+ return True
def test_devtool_load_plugin(self):
"""Test that devtool loads only the first found plugin in BBPATH."""
@@ -1927,15 +1988,17 @@
plugincontent = fh.readlines()
try:
self.assertIn('meta-selftest', srcfile, 'wrong bbpath plugin found')
- for path in searchpath:
- self._copy_file_with_cleanup(srcfile, path, 'lib', 'devtool')
+ searchpath = [
+ path for path in searchpath
+ if self._copy_file_with_cleanup(srcfile, path, 'lib', 'devtool')
+ ]
result = runCmd("devtool --quiet count")
self.assertEqual(result.output, '1')
result = runCmd("devtool --quiet multiloaded")
self.assertEqual(result.output, "no")
for path in searchpath:
result = runCmd("devtool --quiet bbdir")
- self.assertEqual(result.output, path)
+ self.assertEqual(os.path.realpath(result.output), os.path.realpath(path))
os.unlink(os.path.join(result.output, 'lib', 'devtool', 'bbpath.py'))
finally:
with open(srcfile, 'w') as fh:
diff --git a/poky/meta/lib/oeqa/selftest/cases/minidebuginfo.py b/poky/meta/lib/oeqa/selftest/cases/minidebuginfo.py
index aa1f9fa..2919f07 100644
--- a/poky/meta/lib/oeqa/selftest/cases/minidebuginfo.py
+++ b/poky/meta/lib/oeqa/selftest/cases/minidebuginfo.py
@@ -21,7 +21,7 @@
bb_vars = get_bb_vars(['DEPLOY_DIR_IMAGE', 'IMAGE_LINK_NAME', 'READELF'], image)
self.write_config("""
-PACKAGE_MINIDEBUGINFO = "1"
+DISTRO_FEATURES:append = " minidebuginfo"
IMAGE_FSTYPES = "tar.bz2"
""")
bitbake("{} {}:do_addto_recipe_sysroot".format(image, binutils))
diff --git a/poky/meta/lib/oeqa/selftest/cases/overlayfs.py b/poky/meta/lib/oeqa/selftest/cases/overlayfs.py
index 4031ded..cd0dc60 100644
--- a/poky/meta/lib/oeqa/selftest/cases/overlayfs.py
+++ b/poky/meta/lib/oeqa/selftest/cases/overlayfs.py
@@ -79,7 +79,7 @@
config = """
IMAGE_INSTALL:append = " overlayfs-user"
-DISTRO_FEATURES += "systemd overlayfs usrmerge"
+DISTRO_FEATURES:append = " systemd overlayfs usrmerge"
OVERLAYFS_QA_SKIP[mnt-overlay] = "mount-configured"
"""
diff --git a/poky/meta/lib/oeqa/selftest/cases/prservice.py b/poky/meta/lib/oeqa/selftest/cases/prservice.py
index 9fe3b80..8da3739 100644
--- a/poky/meta/lib/oeqa/selftest/cases/prservice.py
+++ b/poky/meta/lib/oeqa/selftest/cases/prservice.py
@@ -14,6 +14,8 @@
from oeqa.utils.commands import runCmd, bitbake, get_bb_var
from oeqa.utils.network import get_free_port
+import bb.utils
+
class BitbakePrTests(OESelftestTestCase):
@classmethod
@@ -21,6 +23,16 @@
super(BitbakePrTests, cls).setUpClass()
cls.pkgdata_dir = get_bb_var('PKGDATA_DIR')
+ cls.exported_db_path = os.path.join(cls.builddir, 'export.inc')
+ cls.current_db_path = os.path.join(get_bb_var('PERSISTENT_DIR'), 'prserv.sqlite3')
+
+ def cleanup(self):
+ # Ensure any memory resident bitbake is stopped
+ bitbake("-m")
+ # Remove any existing export file or prserv database
+ bb.utils.remove(self.exported_db_path)
+ bb.utils.remove(self.current_db_path + "*")
+
def get_pr_version(self, package_name):
package_data_file = os.path.join(self.pkgdata_dir, 'runtime', package_name)
package_data = ftools.read_file(package_data_file)
@@ -49,6 +61,7 @@
self.assertEqual(res.status, 0, msg=res.output)
def config_pr_tests(self, package_name, package_type='rpm', pr_socket='localhost:0'):
+ self.cleanup()
config_package_data = 'PACKAGE_CLASSES = "package_%s"' % package_type
self.write_config(config_package_data)
config_server_data = 'PRSERV_HOST = "%s"' % pr_socket
@@ -68,24 +81,24 @@
self.assertTrue(pr_2 - pr_1 == 1, "New PR %s did not increment as expected (from %s), difference should be 1" % (pr_2, pr_1))
self.assertTrue(stamp_1 != stamp_2, "Different pkg rev. but same stamp: %s" % stamp_1)
+ self.cleanup()
+
def run_test_pr_export_import(self, package_name, replace_current_db=True):
self.config_pr_tests(package_name)
self.increment_package_pr(package_name)
pr_1 = self.get_pr_version(package_name)
- exported_db_path = os.path.join(self.builddir, 'export.inc')
- export_result = runCmd("bitbake-prserv-tool export %s" % exported_db_path, ignore_status=True)
+ export_result = runCmd("bitbake-prserv-tool export %s" % self.exported_db_path, ignore_status=True)
self.assertEqual(export_result.status, 0, msg="PR Service database export failed: %s" % export_result.output)
- self.assertTrue(os.path.exists(exported_db_path), msg="%s didn't exist, tool output %s" % (exported_db_path, export_result.output))
+ self.assertTrue(os.path.exists(self.exported_db_path), msg="%s didn't exist, tool output %s" % (self.exported_db_path, export_result.output))
if replace_current_db:
- current_db_path = os.path.join(get_bb_var('PERSISTENT_DIR'), 'prserv.sqlite3')
- self.assertTrue(os.path.exists(current_db_path), msg="Path to current PR Service database is invalid: %s" % current_db_path)
- os.remove(current_db_path)
+ self.assertTrue(os.path.exists(self.current_db_path), msg="Path to current PR Service database is invalid: %s" % self.current_db_path)
+ os.remove(self.current_db_path)
- import_result = runCmd("bitbake-prserv-tool import %s" % exported_db_path, ignore_status=True)
- os.remove(exported_db_path)
+ import_result = runCmd("bitbake-prserv-tool import %s" % self.exported_db_path, ignore_status=True)
+ #os.remove(self.exported_db_path)
self.assertEqual(import_result.status, 0, msg="PR Service database import failed: %s" % import_result.output)
self.increment_package_pr(package_name)
@@ -93,6 +106,8 @@
self.assertTrue(pr_2 - pr_1 == 1, "New PR %s did not increment as expected (from %s), difference should be 1" % (pr_2, pr_1))
+ self.cleanup()
+
def test_import_export_replace_db(self):
self.run_test_pr_export_import('m4')
diff --git a/poky/meta/lib/oeqa/selftest/cases/recipetool.py b/poky/meta/lib/oeqa/selftest/cases/recipetool.py
index 55cbba9..df15c80 100644
--- a/poky/meta/lib/oeqa/selftest/cases/recipetool.py
+++ b/poky/meta/lib/oeqa/selftest/cases/recipetool.py
@@ -4,6 +4,7 @@
# SPDX-License-Identifier: MIT
#
+import errno
import os
import shutil
import tempfile
@@ -348,7 +349,6 @@
checkvars['LICENSE'] = 'GPL-2.0-only'
checkvars['LIC_FILES_CHKSUM'] = 'file://COPYING;md5=b234ee4d69f5fce4486a80fdaf4a4263'
checkvars['SRC_URI'] = 'https://github.com/logrotate/logrotate/releases/download/${PV}/logrotate-${PV}.tar.xz'
- checkvars['SRC_URI[md5sum]'] = 'a560c57fac87c45b2fc17406cdf79288'
checkvars['SRC_URI[sha256sum]'] = '2e6a401cac9024db2288297e3be1a8ab60e7401ba8e91225218aaf4a27e82a07'
self._test_recipe_contents(recipefile, checkvars, [])
@@ -406,7 +406,6 @@
checkvars = {}
checkvars['LICENSE'] = set(['LGPL-2.1-only', 'MPL-1.1-only'])
checkvars['SRC_URI'] = 'http://taglib.github.io/releases/taglib-${PV}.tar.gz'
- checkvars['SRC_URI[md5sum]'] = 'cee7be0ccfc892fa433d6c837df9522a'
checkvars['SRC_URI[sha256sum]'] = 'b6d1a5a610aae6ff39d93de5efd0fdc787aa9e9dc1e7026fa4c961b26563526b'
checkvars['DEPENDS'] = set(['boost', 'zlib'])
inherits = ['cmake']
@@ -457,6 +456,26 @@
def test_recipetool_create_python3_setuptools(self):
# Test creating python3 package from tarball (using setuptools3 class)
+ # Use the --no-pypi switch to avoid creating a pypi-enabled recipe, and
+ # check the created recipe as if it were a more general tarball
+ temprecipe = os.path.join(self.tempdir, 'recipe')
+ os.makedirs(temprecipe)
+ pn = 'python-magic'
+ pv = '0.4.15'
+ recipefile = os.path.join(temprecipe, '%s_%s.bb' % (pn, pv))
+ srcuri = 'https://files.pythonhosted.org/packages/84/30/80932401906eaf787f2e9bd86dc458f1d2e75b064b4c187341f29516945c/python-magic-%s.tar.gz' % pv
+ result = runCmd('recipetool create --no-pypi -o %s %s' % (temprecipe, srcuri))
+ self.assertTrue(os.path.isfile(recipefile))
+ checkvars = {}
+ checkvars['LICENSE'] = set(['MIT'])
+ checkvars['LIC_FILES_CHKSUM'] = 'file://LICENSE;md5=16a934f165e8c3245f241e77d401bb88'
+ checkvars['SRC_URI'] = 'https://files.pythonhosted.org/packages/84/30/80932401906eaf787f2e9bd86dc458f1d2e75b064b4c187341f29516945c/python-magic-${PV}.tar.gz'
+ checkvars['SRC_URI[sha256sum]'] = 'f3765c0f582d2dfc72c15f3b5a82aecfae9498bd29ca840d72f37d7bd38bfcd5'
+ inherits = ['setuptools3']
+ self._test_recipe_contents(recipefile, checkvars, inherits)
+
+ def test_recipetool_create_python3_setuptools_pypi_tarball(self):
+ # Test creating python3 package from tarball (using setuptools3 and pypi classes)
temprecipe = os.path.join(self.tempdir, 'recipe')
os.makedirs(temprecipe)
pn = 'python-magic'
@@ -468,10 +487,70 @@
checkvars = {}
checkvars['LICENSE'] = set(['MIT'])
checkvars['LIC_FILES_CHKSUM'] = 'file://LICENSE;md5=16a934f165e8c3245f241e77d401bb88'
- checkvars['SRC_URI'] = 'https://files.pythonhosted.org/packages/84/30/80932401906eaf787f2e9bd86dc458f1d2e75b064b4c187341f29516945c/python-magic-${PV}.tar.gz'
- checkvars['SRC_URI[md5sum]'] = 'e384c95a47218f66c6501cd6dd45ff59'
checkvars['SRC_URI[sha256sum]'] = 'f3765c0f582d2dfc72c15f3b5a82aecfae9498bd29ca840d72f37d7bd38bfcd5'
- inherits = ['setuptools3']
+ checkvars['PYPI_PACKAGE'] = pn
+ inherits = ['setuptools3', 'pypi']
+ self._test_recipe_contents(recipefile, checkvars, inherits)
+
+ def test_recipetool_create_python3_setuptools_pypi(self):
+ # Test creating python3 package from pypi url (using setuptools3 and pypi classes)
+ # Intentionally use the setuptools3 class here instead of any of the pep517 classes
+ # to avoid the toml dependency and allow this test to run on host autobuilders
+ # with older versions of python
+ temprecipe = os.path.join(self.tempdir, 'recipe')
+ os.makedirs(temprecipe)
+ pn = 'python-magic'
+ pv = '0.4.15'
+ recipefile = os.path.join(temprecipe, '%s_%s.bb' % (pn, pv))
+ # First specify the required version in the url
+ srcuri = 'https://pypi.org/project/%s/%s' % (pn, pv)
+ runCmd('recipetool create -o %s %s' % (temprecipe, srcuri))
+ self.assertTrue(os.path.isfile(recipefile))
+ checkvars = {}
+ checkvars['LICENSE'] = set(['MIT'])
+ checkvars['LIC_FILES_CHKSUM'] = 'file://LICENSE;md5=16a934f165e8c3245f241e77d401bb88'
+ checkvars['SRC_URI[sha256sum]'] = 'f3765c0f582d2dfc72c15f3b5a82aecfae9498bd29ca840d72f37d7bd38bfcd5'
+ checkvars['PYPI_PACKAGE'] = pn
+ inherits = ['setuptools3', "pypi"]
+ self._test_recipe_contents(recipefile, checkvars, inherits)
+
+ # Now specify the version as a recipetool parameter
+ runCmd('rm -rf %s' % recipefile)
+ self.assertFalse(os.path.isfile(recipefile))
+ srcuri = 'https://pypi.org/project/%s' % pn
+ runCmd('recipetool create -o %s %s --version %s' % (temprecipe, srcuri, pv))
+ self.assertTrue(os.path.isfile(recipefile))
+ checkvars = {}
+ checkvars['LICENSE'] = set(['MIT'])
+ checkvars['LIC_FILES_CHKSUM'] = 'file://LICENSE;md5=16a934f165e8c3245f241e77d401bb88'
+ checkvars['SRC_URI[sha256sum]'] = 'f3765c0f582d2dfc72c15f3b5a82aecfae9498bd29ca840d72f37d7bd38bfcd5'
+ checkvars['PYPI_PACKAGE'] = pn
+        inherits = ['setuptools3', 'pypi']
+ self._test_recipe_contents(recipefile, checkvars, inherits)
+
+        # Now try to grab the latest version of the package. We cannot guess the recipe
+        # name without hardcoding the latest version, which would mean updating the test
+        # for each release, so match the created file with a regexp instead
+ runCmd('rm -rf %s' % recipefile)
+ self.assertFalse(os.path.isfile(recipefile))
+ recipefile_re = r'%s_(.*)\.bb' % pn
+ result = runCmd('recipetool create -o %s %s' % (temprecipe, srcuri))
+ dirlist = os.listdir(temprecipe)
+ if len(dirlist) > 1:
+ self.fail('recipetool created more than just one file; output:\n%s\ndirlist:\n%s' % (result.output, str(dirlist)))
+ if len(dirlist) < 1 or not os.path.isfile(os.path.join(temprecipe, dirlist[0])):
+ self.fail('recipetool did not create recipe file; output:\n%s\ndirlist:\n%s' % (result.output, str(dirlist)))
+ import re
+ match = re.match(recipefile_re, dirlist[0])
+ self.assertTrue(match)
+ latest_pv = match.group(1)
+ self.assertTrue(latest_pv != pv)
+ recipefile = os.path.join(temprecipe, '%s_%s.bb' % (pn, latest_pv))
+        # Do not check LIC_FILES_CHKSUM and the SRC_URI checksum here to avoid having to update the test on each release
+ checkvars = {}
+ checkvars['LICENSE'] = set(['MIT'])
+ checkvars['PYPI_PACKAGE'] = pn
+        inherits = ['setuptools3', 'pypi']
self._test_recipe_contents(recipefile, checkvars, inherits)
def test_recipetool_create_python3_pep517_setuptools_build_meta(self):
@@ -498,13 +577,8 @@
checkvars['SUMMARY'] = 'A library for working with the color formats defined by HTML and CSS.'
checkvars['LICENSE'] = set(['BSD-3-Clause'])
checkvars['LIC_FILES_CHKSUM'] = 'file://LICENSE;md5=702b1ef12cf66832a88f24c8f2ee9c19'
- checkvars['SRC_URI'] = 'https://files.pythonhosted.org/packages/a1/fb/f95560c6a5d4469d9c49e24cf1b5d4d21ffab5608251c6020a965fb7791c/webcolors-${PV}.tar.gz'
- checkvars['SRC_URI[md5sum]'] = 'c9be30c5b0cf1cad32e4cbacbb2229e9'
- checkvars['SRC_URI[sha1sum]'] = 'c90b84fb65eed9b4c9dea7f08c657bfac0e820a5'
checkvars['SRC_URI[sha256sum]'] = 'c225b674c83fa923be93d235330ce0300373d02885cef23238813b0d5668304a'
- checkvars['SRC_URI[sha384sum]'] = '45652af349660f19f68d01361dd5bda287789e5ea63608f52a8cea526ac04465614db2ea236103fb8456b1fcaea96ed7'
- checkvars['SRC_URI[sha512sum]'] = '074aaf135ac6b0025b88b731d1d6dfa4c539b4fff7195658cc58a4326bb9f0449a231685d312b4a1ec48ca535a838bfa5c680787fe0e61473a2a092c448937d0'
- inherits = ['python_setuptools_build_meta']
+ inherits = ['python_setuptools_build_meta', 'pypi']
self._test_recipe_contents(recipefile, checkvars, inherits)
@@ -532,13 +606,8 @@
checkvars['SUMMARY'] = 'Simple module to parse ISO 8601 dates'
checkvars['LICENSE'] = set(['MIT'])
checkvars['LIC_FILES_CHKSUM'] = 'file://LICENSE;md5=aab31f2ef7ba214a5a341eaa47a7f367'
- checkvars['SRC_URI'] = 'https://files.pythonhosted.org/packages/b9/f3/ef59cee614d5e0accf6fd0cbba025b93b272e626ca89fb70a3e9187c5d15/iso8601-${PV}.tar.gz'
- checkvars['SRC_URI[md5sum]'] = '6e33910eba87066b3be7fcf3d59d16b5'
- checkvars['SRC_URI[sha1sum]'] = 'efd225b2c9fa7d9e4a1ec6ad94f3295cee982e61'
checkvars['SRC_URI[sha256sum]'] = '6b1d3829ee8921c4301998c909f7829fa9ed3cbdac0d3b16af2d743aed1ba8df'
- checkvars['SRC_URI[sha384sum]'] = '255002433fe65c19adfd6b91494271b613cb25ef6a35ac77436de1e03d60cc07bf89fd716451b917f1435e4384860ef6'
- checkvars['SRC_URI[sha512sum]'] = 'db57ab2a25ef91e3bc479c8539d27e853cf1fbf60986820b8999ae15d7e566425a1e0cfba47d0f3b23aa703db0576db368e6c110ba2a2f46c9a34e8ee3611fb7'
- inherits = ['python_poetry_core']
+ inherits = ['python_poetry_core', 'pypi']
self._test_recipe_contents(recipefile, checkvars, inherits)
@@ -566,13 +635,8 @@
checkvars['SUMMARY'] = 'Backported and Experimental Type Hints for Python 3.8+'
checkvars['LICENSE'] = set(['PSF-2.0'])
checkvars['LIC_FILES_CHKSUM'] = 'file://LICENSE;md5=fcf6b249c2641540219a727f35d8d2c2'
- checkvars['SRC_URI'] = 'https://files.pythonhosted.org/packages/1f/7a/8b94bb016069caa12fc9f587b28080ac33b4fbb8ca369b98bc0a4828543e/typing_extensions-${PV}.tar.gz'
- checkvars['SRC_URI[md5sum]'] = '74bafe841fbd1c27324afdeb099babdf'
- checkvars['SRC_URI[sha1sum]'] = 'f8bed69cbad4a57a1a67bf8a31b62b657b47f7a3'
checkvars['SRC_URI[sha256sum]'] = 'df8e4339e9cb77357558cbdbceca33c303714cf861d1eef15e1070055ae8b7ef'
- checkvars['SRC_URI[sha384sum]'] = '0bd0112234134d965c6884f3c1f95d27b6ae49cfb08108101158e31dff33c2dce729331628b69818850f1acb68f6c8d0'
- checkvars['SRC_URI[sha512sum]'] = '5fbff10e085fbf3ac2e35d08d913608d8c8bca66903435ede91cdc7776d775689a53d64f5f0615fe687c6c228ac854c8651d99eb1cb96ec61c56b7ca01fdd440'
- inherits = ['python_flit_core']
+ inherits = ['python_flit_core', 'pypi']
self._test_recipe_contents(recipefile, checkvars, inherits)
@@ -601,13 +665,37 @@
checkvars['HOMEPAGE'] = 'https://github.com/python-jsonschema/jsonschema'
checkvars['LICENSE'] = set(['MIT'])
checkvars['LIC_FILES_CHKSUM'] = 'file://COPYING;md5=7a60a81c146ec25599a3e1dabb8610a8 file://json/LICENSE;md5=9d4de43111d33570c8fe49b4cb0e01af'
- checkvars['SRC_URI'] = 'https://files.pythonhosted.org/packages/e4/43/087b24516db11722c8687e0caf0f66c7785c0b1c51b0ab951dfde924e3f5/jsonschema-${PV}.tar.gz'
- checkvars['SRC_URI[md5sum]'] = '4d6667ce76f820c35082c2d60a4896ab'
- checkvars['SRC_URI[sha1sum]'] = '9173714cb88964d07f3a3f4fcaaef638b8ceac0c'
checkvars['SRC_URI[sha256sum]'] = 'ec84cc37cfa703ef7cd4928db24f9cb31428a5d0fa77747b8b51a847458e0bbf'
- checkvars['SRC_URI[sha384sum]'] = '7a53181f0e679aa3dc3eb4d05a420877b7b9bff2d02e81f5c289a37ed1127d6c0cca1f5a5f9e4e166f089ab36bcc2be9'
- checkvars['SRC_URI[sha512sum]'] = '60fa769faf6e3fc2c14eb9acd189c86e9d366b157230a5681d36552af0c159cb1ad33fd920668a36afdab98bc97253f91501704c5c07b5009fdaf9d29b52060d'
- inherits = ['python_hatchling']
+ inherits = ['python_hatchling', 'pypi']
+
+ self._test_recipe_contents(recipefile, checkvars, inherits)
+
+ def test_recipetool_create_python3_pep517_maturin(self):
+        # This test requires python 3.11 or above for the tomllib module,
+        # or the tomli module to be installed
+ try:
+ import tomllib
+ except ImportError:
+ try:
+ import tomli
+ except ImportError:
+ self.skipTest('Test requires python 3.11 or above for tomllib module or tomli module')
+
+ # Test creating python3 package from tarball (using maturin class)
+ temprecipe = os.path.join(self.tempdir, 'recipe')
+ os.makedirs(temprecipe)
+ pn = 'pydantic-core'
+ pv = '2.14.5'
+ recipefile = os.path.join(temprecipe, 'python3-%s_%s.bb' % (pn, pv))
+ srcuri = 'https://files.pythonhosted.org/packages/64/26/cffb93fe9c6b5a91c497f37fae14a4b073ecbc47fc36a9979c7aa888b245/pydantic_core-%s.tar.gz' % pv
+ result = runCmd('recipetool create -o %s %s' % (temprecipe, srcuri))
+ self.assertTrue(os.path.isfile(recipefile))
+ checkvars = {}
+ checkvars['HOMEPAGE'] = 'https://github.com/pydantic/pydantic-core'
+ checkvars['LICENSE'] = set(['MIT'])
+ checkvars['LIC_FILES_CHKSUM'] = 'file://LICENSE;md5=ab599c188b4a314d2856b3a55030c75c'
+ checkvars['SRC_URI[sha256sum]'] = '6d30226dfc816dd0fdf120cae611dd2215117e4f9b124af8c60ab9093b6e8e71'
+ inherits = ['python_maturin', 'pypi']
self._test_recipe_contents(recipefile, checkvars, inherits)
@@ -853,14 +941,22 @@
self._test_recipe_contents(deps_require_file, checkvars, [])
-
+
def _copy_file_with_cleanup(self, srcfile, basedstdir, *paths):
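+        # Returns True once the file is in place, or False if the destination
+        # hierarchy cannot be created or written to (e.g. a read-only or
+        # permission-restricted layer), so callers can skip that path.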
dstdir = basedstdir
self.assertTrue(os.path.exists(dstdir))
for p in paths:
dstdir = os.path.join(dstdir, p)
if not os.path.exists(dstdir):
- os.makedirs(dstdir)
+ try:
+ os.makedirs(dstdir)
+ except PermissionError:
+ return False
+ except OSError as e:
+ if e.errno == errno.EROFS:
+ return False
+ else:
+ raise e
if p == "lib":
# Can race with other tests
self.add_command_to_tearDown('rmdir --ignore-fail-on-non-empty %s' % dstdir)
@@ -868,8 +964,12 @@
self.track_for_cleanup(dstdir)
dstfile = os.path.join(dstdir, os.path.basename(srcfile))
if srcfile != dstfile:
- shutil.copy(srcfile, dstfile)
+ try:
+ shutil.copy(srcfile, dstfile)
+ except PermissionError:
+ return False
self.track_for_cleanup(dstfile)
+ return True
def test_recipetool_load_plugin(self):
"""Test that recipetool loads only the first found plugin in BBPATH."""
@@ -883,15 +983,17 @@
plugincontent = fh.readlines()
try:
self.assertIn('meta-selftest', srcfile, 'wrong bbpath plugin found')
- for path in searchpath:
- self._copy_file_with_cleanup(srcfile, path, 'lib', 'recipetool')
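+        # Only keep the BBPATH entries the plugin could actually be copied into;
+        # _copy_file_with_cleanup() returns False for read-only locations.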
+ searchpath = [
+ path for path in searchpath
+ if self._copy_file_with_cleanup(srcfile, path, 'lib', 'recipetool')
+ ]
result = runCmd("recipetool --quiet count")
self.assertEqual(result.output, '1')
result = runCmd("recipetool --quiet multiloaded")
self.assertEqual(result.output, "no")
for path in searchpath:
result = runCmd("recipetool --quiet bbdir")
- self.assertEqual(result.output, path)
+ self.assertEqual(os.path.realpath(result.output), os.path.realpath(path))
os.unlink(os.path.join(result.output, 'lib', 'recipetool', 'bbpath.py'))
finally:
with open(srcfile, 'w') as fh:
@@ -1054,9 +1156,9 @@
for uri in src_uri:
p = urllib.parse.urlparse(uri)
if p.scheme == 'file':
- return p.netloc + p.path
+ return p.netloc + p.path, uri
- def _test_appendsrcfile(self, testrecipe, filename=None, destdir=None, has_src_uri=True, srcdir=None, newfile=None, options=''):
+    def _test_appendsrcfile(self, testrecipe, filename=None, destdir=None, has_src_uri=True, srcdir=None, newfile=None, remove=None, machine=None, options=''):
if newfile is None:
newfile = self.testfile
@@ -1083,12 +1185,40 @@
expectedlines = ['FILESEXTRAPATHS:prepend := "${THISDIR}/${PN}:"\n',
'\n']
+
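+        # When a machine is given, recipetool is invoked with -m and the bbappend
+        # is expected to use machine overrides plus a machine-specific PACKAGE_ARCH.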
+ override = ""
+ if machine:
+ options += ' -m %s' % machine
+ override = ':append:%s' % machine
+ expectedlines.extend(['PACKAGE_ARCH = "${MACHINE_ARCH}"\n',
+ '\n'])
+
+ if remove:
+ for entry in remove:
+ if machine:
+ entry_remove_line = 'SRC_URI:remove:%s = " %s"\n' % (machine, entry)
+ else:
+ entry_remove_line = 'SRC_URI:remove = "%s"\n' % entry
+
+ expectedlines.extend([entry_remove_line,
+ '\n'])
+
if has_src_uri:
uri = 'file://%s' % filename
if expected_subdir:
uri += ';subdir=%s' % expected_subdir
- expectedlines[0:0] = ['SRC_URI += "%s"\n' % uri,
- '\n']
+ if machine:
+ src_uri_line = 'SRC_URI%s = " %s"\n' % (override, uri)
+ else:
+ src_uri_line = 'SRC_URI += "%s"\n' % uri
+
+ expectedlines.extend([src_uri_line, '\n'])
+
+ with open("/tmp/tmp.txt", "w") as file:
+ print(expectedlines, file=file)
+
+ if machine:
+ filename = '%s/%s' % (machine, filename)
return self._try_recipetool_appendsrcfile(testrecipe, newfile, destpath, options, expectedlines, [filename])
@@ -1143,18 +1273,46 @@
def test_recipetool_appendsrcfile_existing_in_src_uri(self):
testrecipe = 'base-files'
- filepath = self._get_first_file_uri(testrecipe)
+ filepath,_ = self._get_first_file_uri(testrecipe)
self.assertTrue(filepath, 'Unable to test, no file:// uri found in SRC_URI for %s' % testrecipe)
self._test_appendsrcfile(testrecipe, filepath, has_src_uri=False)
- def test_recipetool_appendsrcfile_existing_in_src_uri_diff_params(self):
+ def test_recipetool_appendsrcfile_existing_in_src_uri_diff_params(self, machine=None):
testrecipe = 'base-files'
subdir = 'tmp'
- filepath = self._get_first_file_uri(testrecipe)
+ filepath, srcuri_entry = self._get_first_file_uri(testrecipe)
self.assertTrue(filepath, 'Unable to test, no file:// uri found in SRC_URI for %s' % testrecipe)
- output = self._test_appendsrcfile(testrecipe, filepath, subdir, has_src_uri=False)
- self.assertTrue(any('with different parameters' in l for l in output))
+ self._test_appendsrcfile(testrecipe, filepath, subdir, machine=machine, remove=[srcuri_entry])
+
+ def test_recipetool_appendsrcfile_machine(self):
+ # A very basic test
+ self._test_appendsrcfile('base-files', 'a-file', machine='mymachine')
+
+        # Force cleaning the output of the previous test
+ self.tearDownLocal()
+
+ # A more complex test: existing entry in src_uri with different param
+ self.test_recipetool_appendsrcfile_existing_in_src_uri_diff_params(machine='mymachine')
+
+ def test_recipetool_appendsrcfile_update_recipe_basic(self):
+ testrecipe = "mtd-utils-selftest"
+ recipefile = get_bb_var('FILE', testrecipe)
+        self.assertIn('meta-selftest', recipefile, 'This test expects the %s recipe to be in meta-selftest' % testrecipe)
+ cmd = 'recipetool appendsrcfile -W -u meta-selftest %s %s' % (testrecipe, self.testfile)
+ result = runCmd(cmd)
+ self.assertNotIn('Traceback', result.output)
+ self.add_command_to_tearDown('cd %s; rm -f %s/%s; git checkout .' % (os.path.dirname(recipefile), testrecipe, os.path.basename(self.testfile)))
+
+ expected_status = [(' M', '.*/%s$' % os.path.basename(recipefile)),
+ ('??', '.*/%s/%s$' % (testrecipe, os.path.basename(self.testfile)))]
+ self._check_repo_status(os.path.dirname(recipefile), expected_status)
+ result = runCmd('git diff %s' % os.path.basename(recipefile), cwd=os.path.dirname(recipefile))
+ removelines = []
+ addlines = [
+ 'file://%s \\\\' % os.path.basename(self.testfile),
+ ]
+ self._check_diff(result.output, addlines, removelines)
def test_recipetool_appendsrcfile_replace_file_srcdir(self):
testrecipe = 'bash'
diff --git a/poky/meta/lib/oeqa/selftest/cases/reproducible.py b/poky/meta/lib/oeqa/selftest/cases/reproducible.py
index 14ccb0b..80e8301 100644
--- a/poky/meta/lib/oeqa/selftest/cases/reproducible.py
+++ b/poky/meta/lib/oeqa/selftest/cases/reproducible.py
@@ -16,8 +16,6 @@
import datetime
exclude_packages = [
- 'rust-rustdoc',
- 'rust-dbg'
]
def is_excluded(package):
diff --git a/poky/meta/lib/oeqa/selftest/cases/rust.py b/poky/meta/lib/oeqa/selftest/cases/rust.py
index 7d14814..6dbc517 100644
--- a/poky/meta/lib/oeqa/selftest/cases/rust.py
+++ b/poky/meta/lib/oeqa/selftest/cases/rust.py
@@ -39,6 +39,9 @@
@OETestTag("runqemu")
class RustSelfTestSystemEmulated(OESelftestTestCase, OEPTestResultTestCase):
def test_rust(self, *args, **kwargs):
+ # Disable Rust Oe-selftest
+ self.skipTest("The Rust Oe-selftest is disabled.")
+
# build remote-test-server before image build
recipe = "rust"
start_time = time.time()
diff --git a/poky/meta/lib/oeqa/selftest/cases/sstatetests.py b/poky/meta/lib/oeqa/selftest/cases/sstatetests.py
index 7c2b14e..393eaf6 100644
--- a/poky/meta/lib/oeqa/selftest/cases/sstatetests.py
+++ b/poky/meta/lib/oeqa/selftest/cases/sstatetests.py
@@ -263,7 +263,7 @@
class SStateCacheManagement(SStateBase):
# Test the sstate-cache-management script. Each element in the global_config list is used with the corresponding element in the target_config list
- # global_config elements are expected to not generate any sstate files that would be removed by sstate-cache-management.sh (such as changing the value of MACHINE)
+ # global_config elements are expected to not generate any sstate files that would be removed by sstate-cache-management.py (such as changing the value of MACHINE)
def run_test_sstate_cache_management_script(self, target, global_config=[''], target_config=[''], ignore_patterns=[]):
self.assertTrue(global_config)
self.assertTrue(target_config)
@@ -277,14 +277,10 @@
# For now this only checks if random sstate tasks are handled correctly as a group.
# In the future we should add control over what tasks we check for.
- sstate_archs_list = []
expected_remaining_sstate = []
for idx in range(len(target_config)):
self.append_config(global_config[idx])
self.append_recipeinc(target, target_config[idx])
- sstate_arch = get_bb_var('SSTATE_PKGARCH', target)
- if not sstate_arch in sstate_archs_list:
- sstate_archs_list.append(sstate_arch)
if target_config[idx] == target_config[-1]:
target_sstate_before_build = self.search_sstate(target + r'.*?\.tar.zst$')
bitbake("-cclean %s" % target)
@@ -296,7 +292,7 @@
self.remove_recipeinc(target, target_config[idx])
self.assertEqual(result.status, 0, msg = "build of %s failed with %s" % (target, result.output))
- runCmd("sstate-cache-management.sh -y --cache-dir=%s --remove-duplicated --extra-archs=%s" % (self.sstate_path, ','.join(map(str, sstate_archs_list))))
+ runCmd("sstate-cache-management.py -y --cache-dir=%s --remove-duplicated" % (self.sstate_path))
actual_remaining_sstate = [x for x in self.search_sstate(target + r'.*?\.tar.zst$') if not any(pattern in x for pattern in ignore_patterns)]
actual_not_expected = [x for x in actual_remaining_sstate if x not in expected_remaining_sstate]
@@ -765,22 +761,22 @@
hashes = [hash1, hash2]
hashfiles = find_siginfo(key, None, hashes)
self.assertCountEqual(hashes, hashfiles)
- bb.siggen.compare_sigfiles(hashfiles[hash1], hashfiles[hash2], recursecb)
+ bb.siggen.compare_sigfiles(hashfiles[hash1]['path'], hashfiles[hash2]['path'], recursecb)
for pn in pns:
recursecb_count = 0
- filedates = find_siginfo(pn, "do_tmptask1")
- self.assertGreaterEqual(len(filedates), 2)
- latestfiles = sorted(filedates.keys(), key=lambda f: filedates[f])[-2:]
- bb.siggen.compare_sigfiles(latestfiles[-2], latestfiles[-1], recursecb)
+ matches = find_siginfo(pn, "do_tmptask1")
+ self.assertGreaterEqual(len(matches), 2)
+ latesthashes = sorted(matches.keys(), key=lambda h: matches[h]['time'])[-2:]
+ bb.siggen.compare_sigfiles(matches[latesthashes[-2]]['path'], matches[latesthashes[-1]]['path'], recursecb)
self.assertEqual(recursecb_count,1)
class SStatePrintdiff(SStateBase):
def run_test_printdiff_changerecipe(self, target, change_recipe, change_bbtask, change_content, expected_sametmp_output, expected_difftmp_output):
+ import time
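+        # Give each run its own timestamped TMPDIR so repeated printdiff tests
+        # do not reuse an earlier build tree.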
self.write_config("""
-TMPDIR = "${TOPDIR}/tmp-sstateprintdiff"
-""")
- self.track_for_cleanup(self.topdir + "/tmp-sstateprintdiff")
+TMPDIR = "${{TOPDIR}}/tmp-sstateprintdiff-sametmp-{}"
+""".format(time.time()))
# Use runall do_build to ensure any indirect sstate is created, e.g. tzcode-native on both x86 and
# aarch64 hosts since only allarch target recipes depend upon it and it may not be built otherwise.
# A bitbake -c cleansstate tzcode-native would cause some of these tests to error for example.
@@ -791,38 +787,36 @@
result_sametmp = bitbake("-S printdiff {}".format(target))
self.write_config("""
-TMPDIR = "${TOPDIR}/tmp-sstateprintdiff-2"
-""")
- self.track_for_cleanup(self.topdir + "/tmp-sstateprintdiff-2")
+TMPDIR = "${{TOPDIR}}/tmp-sstateprintdiff-difftmp-{}"
+""".format(time.time()))
result_difftmp = bitbake("-S printdiff {}".format(target))
self.delete_recipeinc(change_recipe)
for item in expected_sametmp_output:
- self.assertIn(item, result_sametmp.output)
+ self.assertIn(item, result_sametmp.output, msg = "Item {} not found in output:\n{}".format(item, result_sametmp.output))
for item in expected_difftmp_output:
- self.assertIn(item, result_difftmp.output)
+ self.assertIn(item, result_difftmp.output, msg = "Item {} not found in output:\n{}".format(item, result_difftmp.output))
def run_test_printdiff_changeconfig(self, target, change_content, expected_sametmp_output, expected_difftmp_output):
+ import time
self.write_config("""
-TMPDIR = "${TOPDIR}/tmp-sstateprintdiff"
-""")
- self.track_for_cleanup(self.topdir + "/tmp-sstateprintdiff")
+TMPDIR = "${{TOPDIR}}/tmp-sstateprintdiff-sametmp-{}"
+""".format(time.time()))
bitbake("--runall build --runall deploy_source_date_epoch {}".format(target))
bitbake("-S none {}".format(target))
self.append_config(change_content)
result_sametmp = bitbake("-S printdiff {}".format(target))
self.write_config("""
-TMPDIR = "${TOPDIR}/tmp-sstateprintdiff-2"
-""")
+TMPDIR = "${{TOPDIR}}/tmp-sstateprintdiff-difftmp-{}"
+""".format(time.time()))
self.append_config(change_content)
- self.track_for_cleanup(self.topdir + "/tmp-sstateprintdiff-2")
result_difftmp = bitbake("-S printdiff {}".format(target))
for item in expected_sametmp_output:
- self.assertIn(item, result_sametmp.output)
+ self.assertIn(item, result_sametmp.output, msg = "Item {} not found in output:\n{}".format(item, result_sametmp.output))
for item in expected_difftmp_output:
- self.assertIn(item, result_difftmp.output)
+ self.assertIn(item, result_difftmp.output, msg = "Item {} not found in output:\n{}".format(item, result_difftmp.output))
# Check if printdiff walks the full dependency chain from the image target to where the change is in a specific recipe
@@ -842,7 +836,7 @@
expected_sametmp_output, expected_difftmp_output)
# Check if changes to gcc-source (which uses tmp/work-shared) are correctly discovered
- def test_gcc_runtime_vs_gcc_source(self):
+ def _test_gcc_runtime_vs_gcc_source(self):
gcc_source_pn = 'gcc-source-%s' % get_bb_vars(['PV'], 'gcc')['PV']
expected_output = ("Task {}:do_preconfigure couldn't be used from the cache because:".format(gcc_source_pn),
@@ -886,47 +880,82 @@
@OETestTag("yocto-mirrors")
class SStateMirrors(SStateBase):
- def check_bb_output(self, output, exceptions):
- in_tasks = False
- missing_objects = []
- checked_urls = []
- for l in output.splitlines():
- if "Testing URL" in l:
- checked_urls.append(l.split()[3])
- if "The differences between the current build and any cached tasks start at the following tasks" in l:
- in_tasks = True
- continue
- if "Writing task signature files" in l:
- in_tasks = False
- continue
- if in_tasks:
- recipe_task = l.split("/")[-1]
- recipe, task = recipe_task.split(":")
- for e in exceptions:
- if e[0] in recipe and task == e[1]:
+ def check_bb_output(self, output, exceptions, check_cdn):
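+        # Read the number of missed objects from bitbake's "Sstate summary" line,
+        # then reconcile it with the individual fetch failure messages, ignoring
+        # anything matched by the exception patterns.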
+ def is_exception(object, exceptions):
+ for e in exceptions:
+ if re.search(e, object):
+ return True
+ return False
+
+ output_l = output.splitlines()
+ for l in output_l:
+ if l.startswith("Sstate summary"):
+ for idx, item in enumerate(l.split()):
+ if item == 'Missed':
+ missing_objects = int(l.split()[idx+1])
break
else:
- missing_objects.append(recipe_task)
- self.assertTrue(len(missing_objects) == 0, "URLs checked:\n{}\nMissing objects in the cache:\n{}".format("\n".join(checked_urls), "\n".join(missing_objects)))
+ self.fail("Did not find missing objects amount in sstate summary: {}".format(l))
+ break
+ else:
+ self.fail("Did not find 'Sstate summary' line in bitbake output")
- def run_test_cdn_mirror(self, machine, targets, exceptions):
- exceptions = exceptions + [[t, "do_deploy_source_date_epoch"] for t in targets.split()]
- exceptions = exceptions + [[t, "do_image_qa"] for t in targets.split()]
- self.config_sstate(True)
- self.append_config("""
+ failed_urls = []
+ for l in output_l:
+ if "SState: Unsuccessful fetch test for" in l and check_cdn:
+ missing_object = l.split()[6]
+ elif "SState: Looked for but didn't find file" in l and not check_cdn:
+ missing_object = l.split()[8]
+ else:
+ missing_object = None
+ if missing_object:
+ if not is_exception(missing_object, exceptions):
+ failed_urls.append(missing_object)
+ else:
+ missing_objects -= 1
+
+        self.assertEqual(len(failed_urls), missing_objects, "Number of reported missing objects does not match the failed URLs: {}\nFailed URLs:\n{}".format(missing_objects, "\n".join(failed_urls)))
+ self.assertEqual(len(failed_urls), 0, "Missing objects in the cache:\n{}".format("\n".join(failed_urls)))
+
+ def run_test(self, machine, targets, exceptions, check_cdn = True):
+ # sstate is checked for existence of these, but they never get written out to begin with
+ exceptions += ["{}.*image_qa".format(t) for t in targets.split()]
+ exceptions += ["{}.*deploy_source_date_epoch".format(t) for t in targets.split()]
+ exceptions += ["{}.*image_complete".format(t) for t in targets.split()]
+ exceptions += ["linux-yocto.*shared_workdir"]
+        # these get influenced by IMAGE_FSTYPES tweaks in yocto-autobuilder-helper's config.json (on x86-64)
+ # additionally, they depend on noexec (thus, absent stamps) package, install, etc. image tasks,
+ # which makes tracing other changes difficult
+ exceptions += ["{}.*create_spdx".format(t) for t in targets.split()]
+ exceptions += ["{}.*create_runtime_spdx".format(t) for t in targets.split()]
+
+ if check_cdn:
+ self.config_sstate(True)
+ self.append_config("""
MACHINE = "{}"
BB_HASHSERVE_UPSTREAM = "hashserv.yocto.io:8687"
SSTATE_MIRRORS ?= "file://.* http://cdn.jsdelivr.net/yocto/sstate/all/PATH;downloadfilename=PATH"
""".format(machine))
- result = bitbake("-D -S printdiff {}".format(targets))
- self.check_bb_output(result.output, exceptions)
+ else:
+ self.append_config("""
+MACHINE = "{}"
+""".format(machine))
+ result = bitbake("-DD -n {}".format(targets))
+ bitbake("-S none {}".format(targets))
+ self.check_bb_output(result.output, exceptions, check_cdn)
def test_cdn_mirror_qemux86_64(self):
- # Example:
- # exceptions = [ ["packagegroup-core-sdk","do_package"] ]
exceptions = []
- self.run_test_cdn_mirror("qemux86-64", "core-image-minimal core-image-full-cmdline core-image-sato-sdk", exceptions)
+ self.run_test("qemux86-64", "core-image-minimal core-image-full-cmdline core-image-sato-sdk", exceptions)
def test_cdn_mirror_qemuarm64(self):
exceptions = []
- self.run_test_cdn_mirror("qemuarm64", "core-image-minimal core-image-full-cmdline core-image-sato-sdk", exceptions)
+ self.run_test("qemuarm64", "core-image-minimal core-image-full-cmdline core-image-sato-sdk", exceptions)
+
+ def test_local_cache_qemux86_64(self):
+ exceptions = []
+ self.run_test("qemux86-64", "core-image-minimal core-image-full-cmdline core-image-sato-sdk", exceptions, check_cdn = False)
+
+ def test_local_cache_qemuarm64(self):
+ exceptions = []
+ self.run_test("qemuarm64", "core-image-minimal core-image-full-cmdline core-image-sato-sdk", exceptions, check_cdn = False)
diff --git a/poky/meta/lib/oeqa/selftest/cases/usergrouptests.py b/poky/meta/lib/oeqa/selftest/cases/usergrouptests.py
new file mode 100644
index 0000000..a331ca9
--- /dev/null
+++ b/poky/meta/lib/oeqa/selftest/cases/usergrouptests.py
@@ -0,0 +1,53 @@
+#
+# Copyright OpenEmbedded Contributors
+#
+# SPDX-License-Identifier: MIT
+#
+
+import os
+import shutil
+from oeqa.selftest.case import OESelftestTestCase
+from oeqa.utils.commands import bitbake, get_bb_var, get_test_layer
+
+class UserGroupTests(OESelftestTestCase):
+ def test_group_from_dep_package(self):
+ self.logger.info("Building creategroup2")
+ bitbake(' creategroup2 creategroup1')
+ bitbake(' creategroup2 creategroup1 -c clean')
+ self.logger.info("Packaging creategroup2")
+ self.assertTrue(bitbake(' creategroup2 -c package'))
+
+ def test_add_task_between_p_sysroot_and_package(self):
+ # Test for YOCTO #14961
+ self.assertTrue(bitbake('useraddbadtask -C fetch'))
+
+ def test_static_useradd_from_dynamic(self):
+ metaselftestpath = get_test_layer()
+ self.logger.info("Building core-image-minimal to generate passwd/group file")
+ bitbake(' core-image-minimal')
+ self.logger.info("Setting up useradd-staticids")
+ repropassdir = os.path.join(metaselftestpath, "conf/include")
+ os.makedirs(repropassdir)
+        etcdir = os.path.join(get_bb_var("TMPDIR"), "work",
+                              get_bb_var("MACHINE").replace("-", "_") + "-poky-linux",
+                              "core-image-minimal/1.0/rootfs/etc")
+        shutil.copy(os.path.join(etcdir, "passwd"), os.path.join(repropassdir, "reproducible-passwd"))
+        shutil.copy(os.path.join(etcdir, "group"), os.path.join(repropassdir, "reproducible-group"))
+ # Copy the original local.conf
+ shutil.copyfile(os.path.join(os.environ.get('BUILDDIR'), 'conf/local.conf'), os.path.join(os.environ.get('BUILDDIR'), 'conf/local.conf.orig'))
+
+ self.write_config("USERADDEXTENSION = \"useradd-staticids\"")
+ self.write_config("USERADD_ERROR_DYNAMIC ??= \"error\"")
+ self.write_config("USERADD_UID_TABLES += \"conf/include/reproducible-passwd\"")
+ self.write_config("USERADD_GID_TABLES += \"conf/include/reproducible-group\"")
+ self.logger.info("Rebuild with staticids")
+ bitbake(' core-image-minimal')
+ shutil.copyfile(os.path.join(os.environ.get('BUILDDIR'), 'conf/local.conf.orig'), os.path.join(os.environ.get('BUILDDIR'), 'conf/local.conf'))
+ self.logger.info("Rebuild without staticids")
+ bitbake(' core-image-minimal')
+ self.write_config("USERADDEXTENSION = \"useradd-staticids\"")
+ self.write_config("USERADD_ERROR_DYNAMIC ??= \"error\"")
+ self.write_config("USERADD_UID_TABLES += \"files/static-passwd\"")
+ self.write_config("USERADD_GID_TABLES += \"files/static-group\"")
+ self.logger.info("Rebuild with other staticids")
+ self.assertTrue(bitbake(' core-image-minimal'))
diff --git a/poky/meta/lib/oeqa/selftest/cases/wic.py b/poky/meta/lib/oeqa/selftest/cases/wic.py
index 4ebcb76..b616759 100644
--- a/poky/meta/lib/oeqa/selftest/cases/wic.py
+++ b/poky/meta/lib/oeqa/selftest/cases/wic.py
@@ -1483,6 +1483,42 @@
result = runCmd("%s/usr/sbin/sfdisk --part-label %s 3" % (sysroot, image_path))
self.assertEqual('ext-space', result.output)
+ def test_empty_zeroize_plugin(self):
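+        # The "empty" source plugin creates zero-filled partitions; exercise the
+        # supported sourceparams and verify the resulting sizes and contents.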
+ img = 'core-image-minimal'
+ expected_size = [ 1024*1024, # 1M
+ 512*1024, # 512K
+ 2*1024*1024] # 2M
+ # Check combination of sourceparams
+ with NamedTemporaryFile("w", suffix=".wks") as wks:
+ wks.writelines(
+ ['part empty --source empty --sourceparams="fill" --ondisk sda --fixed-size 1M\n',
+ 'part empty --source empty --sourceparams="size=512K" --ondisk sda --size 1M --align 1024\n',
+ 'part empty --source empty --sourceparams="size=2048k,bs=512K" --ondisk sda --size 4M --align 1024\n'
+ ])
+ wks.flush()
+ cmd = "wic create %s -e %s -o %s" % (wks.name, img, self.resultdir)
+ runCmd(cmd)
+ wksname = os.path.splitext(os.path.basename(wks.name))[0]
+ wicout = glob(os.path.join(self.resultdir, "%s-*direct" % wksname))
+ # Skip the complete image and just look at the single partitions
+ for idx, value in enumerate(wicout[1:]):
+ self.logger.info(wicout[idx])
+ # Check if partitions are actually zeroized
+ with open(wicout[idx], mode="rb") as fd:
+ ba = bytearray(fd.read())
+ for b in ba:
+ self.assertEqual(b, 0)
+ self.assertEqual(expected_size[idx], os.path.getsize(wicout[idx]))
+
+        # Check that the inconsistency between "fill" and the "--size" parameter is detected
+ with NamedTemporaryFile("w", suffix=".wks") as wks:
+ wks.writelines(['part empty --source empty --sourceparams="fill" --ondisk sda --size 1M\n'])
+ wks.flush()
+ cmd = "wic create %s -e %s -o %s" % (wks.name, img, self.resultdir)
+ result = runCmd(cmd, ignore_status=True)
+ self.assertIn("Source parameter 'fill' only works with the '--fixed-size' option, exiting.", result.output)
+ self.assertNotEqual(0, result.status)
+
class ModifyTests(WicTestCase):
def test_wic_ls(self):
"""Test listing image content using 'wic ls'"""
diff --git a/poky/meta/lib/oeqa/targetcontrol.py b/poky/meta/lib/oeqa/targetcontrol.py
index e21655c..6e8b781 100644
--- a/poky/meta/lib/oeqa/targetcontrol.py
+++ b/poky/meta/lib/oeqa/targetcontrol.py
@@ -103,7 +103,6 @@
self.rootfs = os.path.join(d.getVar("DEPLOY_DIR_IMAGE"), d.getVar("IMAGE_LINK_NAME") + '.' + self.image_fstype)
self.kernel = os.path.join(d.getVar("DEPLOY_DIR_IMAGE"), d.getVar("KERNEL_IMAGETYPE", False) + '-' + d.getVar('MACHINE', False) + '.bin')
self.qemulog = os.path.join(self.testdir, "qemu_boot_log.%s" % self.datetime)
- dump_target_cmds = d.getVar("testimage_dump_target")
dump_monitor_cmds = d.getVar("testimage_dump_monitor")
dump_dir = d.getVar("TESTIMAGE_DUMP_DIR")
if not dump_dir:
@@ -144,7 +143,6 @@
tmpfsdir = d.getVar("RUNQEMU_TMPFS_DIR"),
serial_ports = len(d.getVar("SERIAL_CONSOLES").split()))
- self.target_dumper = TargetDumper(dump_target_cmds, dump_dir, self.runner)
self.monitor_dumper = MonitorDumper(dump_monitor_cmds, dump_dir, self.runner)
if (self.monitor_dumper):
self.monitor_dumper.create_dir("qmp")
diff --git a/poky/meta/lib/oeqa/utils/qemurunner.py b/poky/meta/lib/oeqa/utils/qemurunner.py
index 29fe271..7273bbc 100644
--- a/poky/meta/lib/oeqa/utils/qemurunner.py
+++ b/poky/meta/lib/oeqa/utils/qemurunner.py
@@ -19,10 +19,11 @@
import string
import threading
import codecs
-import logging
import tempfile
from collections import defaultdict
+from contextlib import contextmanager
import importlib
+import traceback
# Get Unicode non printable control chars
control_range = list(range(0,32))+list(range(127,160))
@@ -30,6 +31,15 @@
if chr(x) not in string.printable]
re_control_char = re.compile('[%s]' % re.escape("".join(control_chars)))
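+# Drain whatever is currently buffered on a subprocess pipe without blocking:
+# the fd is switched to non-blocking mode and an empty string is returned if
+# no data is available yet.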
+def getOutput(o):
+ import fcntl
+ fl = fcntl.fcntl(o, fcntl.F_GETFL)
+ fcntl.fcntl(o, fcntl.F_SETFL, fl | os.O_NONBLOCK)
+ try:
+ return os.read(o.fileno(), 1000000).decode("utf-8")
+ except BlockingIOError:
+ return ""
+
class QemuRunner:
def __init__(self, machine, rootfs, display, tmpdir, deploy_dir_image, logfile, boottime, dump_dir, use_kvm, logger, use_slirp=False,
@@ -56,6 +66,7 @@
self.boottime = boottime
self.logged = False
self.thread = None
+ self.threadsock = None
self.use_kvm = use_kvm
self.use_ovmf = use_ovmf
self.use_slirp = use_slirp
@@ -120,21 +131,11 @@
f.write(msg)
self.msg += self.decode_qemulog(msg)
- def getOutput(self, o):
- import fcntl
- fl = fcntl.fcntl(o, fcntl.F_GETFL)
- fcntl.fcntl(o, fcntl.F_SETFL, fl | os.O_NONBLOCK)
- try:
- return os.read(o.fileno(), 1000000).decode("utf-8")
- except BlockingIOError:
- return ""
-
-
def handleSIGCHLD(self, signum, frame):
if self.runqemu and self.runqemu.poll():
if self.runqemu.returncode:
self.logger.error('runqemu exited with code %d' % self.runqemu.returncode)
- self.logger.error('Output from runqemu:\n%s' % self.getOutput(self.runqemu.stdout))
+ self.logger.error('Output from runqemu:\n%s' % getOutput(self.runqemu.stdout))
self.stop()
def start(self, qemuparams = None, get_ip = True, extra_bootparams = None, runqemuparams='', launch_cmd=None, discard_writes=True):
@@ -283,7 +284,7 @@
if self.runqemu.returncode:
# No point waiting any longer
self.logger.warning('runqemu exited with code %d' % self.runqemu.returncode)
- self.logger.warning("Output from runqemu:\n%s" % self.getOutput(output))
+ self.logger.warning("Output from runqemu:\n%s" % getOutput(output))
self.stop()
return False
time.sleep(0.5)
@@ -310,7 +311,7 @@
ps = subprocess.Popen(['ps', 'axww', '-o', 'pid,ppid,pri,ni,command '], stdout=subprocess.PIPE).communicate()[0]
processes = ps.decode("utf-8")
self.logger.debug("Running processes:\n%s" % processes)
- op = self.getOutput(output)
+ op = getOutput(output)
self.stop()
if op:
self.logger.error("Output from runqemu:\n%s" % op)
@@ -388,7 +389,7 @@
time.time() - connect_time))
# We are alive: qemu is running
- out = self.getOutput(output)
+ out = getOutput(output)
netconf = False # network configuration is not required by default
self.logger.debug("qemu started in %.2f seconds - qemu procces pid is %s (%s)" %
(time.time() - (endtime - self.runqemutime),
@@ -431,9 +432,10 @@
self.logger.debug("Target IP: %s" % self.ip)
self.logger.debug("Server IP: %s" % self.server_ip)
+ self.thread = LoggingThread(self.log, self.threadsock, self.logger, self.runqemu.stdout)
+ self.thread.start()
+
if self.serial_ports >= 2:
- self.thread = LoggingThread(self.log, self.threadsock, self.logger)
- self.thread.start()
if not self.thread.connection_established.wait(self.boottime):
self.logger.error("Didn't receive a console connection from qemu. "
"Here is the qemu command line used:\n%s\nand "
@@ -445,7 +447,7 @@
self.logger.debug("Waiting at most %d seconds for login banner (%s)" %
(self.boottime, time.strftime("%D %H:%M:%S")))
endtime = time.time() + self.boottime
- filelist = [self.server_socket, self.runqemu.stdout]
+ filelist = [self.server_socket]
reachedlogin = False
stopread = False
qemusock = None
@@ -517,8 +519,12 @@
except Exception as e:
self.logger.warning('Extra log data exception %s' % repr(e))
data = None
+ self.thread.serial_lock.release()
return False
+ with self.thread.serial_lock:
+ self.thread.set_serialsock(self.server_socket)
+
# If we are not able to login the tests can continue
try:
(status, output) = self.run_serial(self.boot_patterns['send_login_user'], raw=True, timeout=120)
@@ -565,7 +571,7 @@
self.logger.debug("Sending SIGKILL to runqemu")
os.killpg(os.getpgid(self.runqemu.pid), signal.SIGKILL)
if not self.runqemu.stdout.closed:
- self.logger.info("Output from runqemu:\n%s" % self.getOutput(self.runqemu.stdout))
+ self.logger.info("Output from runqemu:\n%s" % getOutput(self.runqemu.stdout))
self.runqemu.stdin.close()
self.runqemu.stdout.close()
self.runqemu_exited = True
@@ -653,31 +659,32 @@
data = ''
status = 0
- self.server_socket.sendall(command.encode('utf-8'))
- start = time.time()
- end = start + timeout
- while True:
- now = time.time()
- if now >= end:
- data += "<<< run_serial(): command timed out after %d seconds without output >>>\r\n\r\n" % timeout
- break
- try:
- sread, _, _ = select.select([self.server_socket],[],[], end - now)
- except InterruptedError:
- continue
- if sread:
- # try to avoid reading single character at a time
- time.sleep(0.1)
- answer = self.server_socket.recv(1024)
- if answer:
- data += answer.decode('utf-8')
- # Search the prompt to stop
- if re.search(self.boot_patterns['search_cmd_finished'], data):
- break
- else:
- if self.canexit:
- return (1, "")
- raise Exception("No data on serial console socket, connection closed?")
+ with self.thread.serial_lock:
+ self.server_socket.sendall(command.encode('utf-8'))
+ start = time.time()
+ end = start + timeout
+ while True:
+ now = time.time()
+ if now >= end:
+ data += "<<< run_serial(): command timed out after %d seconds without output >>>\r\n\r\n" % timeout
+ break
+ try:
+ sread, _, _ = select.select([self.server_socket],[],[], end - now)
+ except InterruptedError:
+ continue
+ if sread:
+ # try to avoid reading single character at a time
+ time.sleep(0.1)
+ answer = self.server_socket.recv(1024)
+ if answer:
+ data += answer.decode('utf-8')
+ # Search the prompt to stop
+ if re.search(self.boot_patterns['search_cmd_finished'], data):
+ break
+ else:
+ if self.canexit:
+ return (1, "")
+ raise Exception("No data on serial console socket, connection closed?")
if data:
if raw:
@@ -696,14 +703,27 @@
status = 1
return (status, str(data))
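+# Try to take the lock without blocking and yield whether it was acquired, so
+# callers can skip the critical section instead of stalling the event loop.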
+@contextmanager
+def nonblocking_lock(lock):
+ locked = lock.acquire(False)
+ try:
+ yield locked
+ finally:
+ if locked:
+ lock.release()
+
# This class is for reading data from a socket and passing it to logfunc
# to be processed. It's completely event driven and has a straightforward
# event loop. The mechanism for stopping the thread is a simple pipe which
# will wake up the poll and allow for tearing everything down.
class LoggingThread(threading.Thread):
- def __init__(self, logfunc, sock, logger):
+ def __init__(self, logfunc, sock, logger, qemuoutput):
self.connection_established = threading.Event()
+ self.serial_lock = threading.Lock()
+
self.serversock = sock
+ self.serialsock = None
+ self.qemuoutput = qemuoutput
self.logfunc = logfunc
self.logger = logger
self.readsock = None
@@ -715,9 +735,14 @@
threading.Thread.__init__(self, target=self.threadtarget)
+ def set_serialsock(self, serialsock):
+ self.serialsock = serialsock
+
def threadtarget(self):
try:
self.eventloop()
+ except Exception as e:
+ self.logger.warning("Exception %s in logging thread" % traceback.format_exception(e))
finally:
self.teardown()
@@ -733,7 +758,8 @@
def teardown(self):
self.logger.debug("Tearing down logging thread")
- self.close_socket(self.serversock)
+ if self.serversock:
+ self.close_socket(self.serversock)
if self.readsock is not None:
self.close_socket(self.readsock)
@@ -748,27 +774,31 @@
def eventloop(self):
poll = select.poll()
event_read_mask = self.errorevents | self.readevents
- poll.register(self.serversock.fileno())
+ if self.serversock:
+ poll.register(self.serversock.fileno())
+ serial_registered = False
+ poll.register(self.qemuoutput.fileno())
poll.register(self.readpipe, event_read_mask)
breakout = False
self.running = True
self.logger.debug("Starting thread event loop")
while not breakout:
- events = poll.poll()
- for event in events:
+ events = poll.poll(2)
+ for fd, event in events:
+
# An error occurred, bail out
- if event[1] & self.errorevents:
- raise Exception(self.stringify_event(event[1]))
+ if event & self.errorevents:
+ raise Exception(self.stringify_event(event))
# Event to stop the thread
- if self.readpipe == event[0]:
+ if self.readpipe == fd:
self.logger.debug("Stop event received")
breakout = True
break
# A connection request was received
- elif self.serversock.fileno() == event[0]:
+ elif self.serversock and self.serversock.fileno() == fd:
self.logger.debug("Connection request received")
self.readsock, _ = self.serversock.accept()
self.readsock.setblocking(0)
@@ -779,15 +809,35 @@
self.connection_established.set()
# Actual data to be logged
- elif self.readsock.fileno() == event[0]:
- data = self.recv(1024)
+ elif self.readsock and self.readsock.fileno() == fd:
+ data = self.recv(1024, self.readsock)
self.logfunc(data)
+ elif self.qemuoutput.fileno() == fd:
+ data = self.qemuoutput.read()
+ self.logger.debug("Data received on qemu stdout %s" % data)
+ self.logfunc(data, ".stdout")
+ elif self.serialsock and self.serialsock.fileno() == fd:
+ if self.serial_lock.acquire(blocking=False):
+ data = self.recv(1024, self.serialsock)
+ self.logger.debug("Data received serial thread %s" % data.decode('utf-8', 'replace'))
+ self.logfunc(data, ".2")
+ self.serial_lock.release()
+ else:
+ serial_registered = False
+ poll.unregister(self.serialsock.fileno())
+
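+            # run_serial() owns the serial socket while it holds serial_lock, so
+            # only (re-)register the socket with the poller when the lock can be
+            # taken without blocking.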
+ if not serial_registered and self.serialsock:
+ with nonblocking_lock(self.serial_lock) as l:
+ if l:
+ serial_registered = True
+ poll.register(self.serialsock.fileno(), event_read_mask)
+
# Since the socket is non-blocking make sure to honor EAGAIN
# and EWOULDBLOCK.
- def recv(self, count):
+ def recv(self, count, sock):
try:
- data = self.readsock.recv(count)
+ data = sock.recv(count)
except socket.error as e:
if e.errno == errno.EAGAIN or e.errno == errno.EWOULDBLOCK:
return b''
@@ -815,6 +865,9 @@
val = 'POLLHUP'
elif select.POLLNVAL == event:
val = 'POLLNVAL'
+ else:
+ val = "0x%x" % (event)
+
return val
def close_socket(self, sock):
diff --git a/poky/meta/lib/patchtest/tests/test_metadata.py b/poky/meta/lib/patchtest/tests/test_metadata.py
index b6f4456..174dfc3 100644
--- a/poky/meta/lib/patchtest/tests/test_metadata.py
+++ b/poky/meta/lib/patchtest/tests/test_metadata.py
@@ -25,6 +25,8 @@
sha256sum = 'sha256sum'
git_regex = pyparsing.Regex('^git\:\/\/.*')
metadata_summary = 'SUMMARY'
+ cve_check_ignore_var = 'CVE_CHECK_IGNORE'
+ cve_status_var = 'CVE_STATUS'
def test_license_presence(self):
if not self.added:
@@ -178,3 +180,16 @@
# "${PN} version ${PN}-${PR}" is the default, so fail if default
if summary.startswith('%s version' % pn):
self.fail('%s is missing in newly added recipe' % self.metadata_summary)
+
+ def test_cve_check_ignore(self):
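+        # CVE_CHECK_IGNORE has been replaced by CVE_STATUS; fail if a modified
+        # recipe still sets the deprecated variable.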
+ if not self.modified:
+ self.skip('No modified recipes, skipping test')
+ for pn in self.modified:
+ # we are not interested in images
+ if 'core-image' in pn:
+ continue
+ rd = self.tinfoil.parse_recipe(pn)
+ cve_check_ignore = rd.getVar(self.cve_check_ignore_var)
+
+ if cve_check_ignore is not None:
+ self.fail('%s is deprecated and should be replaced by %s' % (self.cve_check_ignore_var, self.cve_status_var))