poky: subtree update:c67f57c09e..c6bc20857c
Adrian Freihofer (2):
oe-publish-sdk: fix layers init via ssh
oe-publish-sdk: add --keep-orig option
Alexander Kanavin (68):
meta-selftest: correct the virgl test for 5.8 kernels
bison: upgrade 3.6.4 -> 3.7.1
util-linux: upgrade 2.35.2 -> 2.36
python3-numpy: upgrade 1.19.0 -> 1.19.1
python3-setuptools: upgrade 49.3.1 -> 49.6.0
rsync: upgrade 3.2.2 -> 3.2.3
util-linux: merge .inc into .bb
acpica: upgrade 20200528 -> 20200717
asciidoc: upgrade 9.0.1 -> 9.0.2
cryptodev: upgrade 1.10 -> 1.11
diffoscope: upgrade 153 -> 156
epiphany: upgrade 3.36.3 -> 3.36.4
font-alias: upgrade 1.0.3 -> 1.0.4
gtk+3: upgrade 3.24.21 -> 3.24.22
libcheck: upgrade 0.15.0 -> 0.15.2
libinput: upgrade 1.16.0 -> 1.16.1
libpipeline: upgrade 1.5.2 -> 1.5.3
libx11: upgrade 1.6.9 -> 1.6.11
linux-firmware: upgrade 20200619 -> 20200721
man-pages: upgrade 5.07 -> 5.08
mc: upgrade 4.8.24 -> 4.8.25
mesa: upgrade 20.1.4 -> 20.1.5
piglit: upgrade to latest revision
re2c: upgrade 2.0 -> 2.0.2
sysstat: upgrade 12.2.2 -> 12.4.0
vala: upgrade 0.48.7 -> 0.48.9
bootchart2: update 0.14.8 -> 0.14.9
harfbuzz: convert to meson, enable gobject introspection
pango: update 1.44.7 -> 1.46.0
boost: update 1.73.0 -> 1.74.0
xev: update 1.2.3 -> 1.2.4
wpebackend-fdo: update 1.6.1 -> 1.7.1
gpgme: update 1.13.1 -> 1.14.0
libpsl: update 0.21.0 -> 0.21.1.
gettext: update 0.20.2 -> 0.21
cmake: update 3.17.3 -> 3.18.1
linux-firmware: update 20200721 -> 20200817
meson: update 0.55.0 -> 0.55.1
systemd-boot: bump version to 246.2
json-glib: inherit upstream-version-is-even
packagegroup-core-device-devel: remove
oeqa/x32lib: rework to use readelf from the host
oeqa/multilib: rework to use readelf from the host
oeqa/multilib: un-skip the connman test
poky.conf: do not install packagegroup-core-device-devel into qemu images
glib-2.0: update 2.64.4 -> 2.64.5
cmake: upgrade 3.18.1 -> 3.18.2
libxcrypt: upgrade 4.4.16 -> 4.4.17
debianutils: upgrade 4.11 -> 4.11.1
enchant2: upgrade 2.2.8 -> 2.2.9
harfbuzz: upgrade 2.7.1 -> 2.7.2
libmpc: upgrade 1.1.0 -> 1.2.0
librepo: upgrade 1.12.0 -> 1.12.1
libuv: upgrade 1.38.1 -> 1.39.0
msmtp: upgrade 1.8.11 -> 1.8.12
ninja: upgrade 1.10.0 -> 1.10.1
p11-kit: upgrade 0.23.20 -> 0.23.21
pango: upgrade 1.46.0 -> 1.46.1
re2c: upgrade 2.0.2 -> 2.0.3
resolvconf: upgrade 1.82 -> 1.83
stress-ng: upgrade 0.11.18 -> 0.11.19
gnu-config: update to latest revision
nasm: update 2.15.03 -> 2.15.05
libva-utils: fix upstream version check
gnupg: update 2.2.21 -> 2.2.22
libx11: update 1.6.11 -> 1.6.12
mesa: update 20.1.5 -> 20.1.6
xserver-xorg: update 1.20.8 -> 1.20.9
Andrey Zhizhikin (1):
insane: check for missing update-alternatives inherit
Anibal Limon (1):
recipes-kernel: linux-firmware add qcom-venus-{5.2,5.4} packages
Aníbal Limón (1):
recipes-graphics/xorg-xserver: Add patch to fix segfault when probe
Armin Kuster (2):
bind: update to 9.11.22 ESV
core-image-sato: qemumips use 512 mem
Bruce Ashfield (30):
linux-yocto/5.4: update to v5.4.59
linux-yocto/5.8: update to v5.8.2
yocto-bsp: update to v5.4.56
yocto-bsp: update to v5.4.58
qemu: bump default reference kernel to v5.8
linux-yocto/5.8: fix perf and virtio_scsi warnings
linux-yocto-rt/5.8: fix lttng-modules build
linux-yocto/5.8: selftests/bpf: Prevent runqslower from racing on building bpftool
linux-yocto/5.8: disable CONFIG_NFS_DISABLE_UDP_SUPPORT
poky: set preferred version for linux-yocto to be v5.8
poky-tiny: set preferred version to 5.8
poky: add preferred version for linux-yocto-rt
linux-yocto/5.8: update to v5.8.3
linux-yocto/5.4: update to v5.4.60
kernel: config cleanups for 5.8+
linux-yocto/5.4: update to v5.4.61
linux-yocto/5.8: update to v5.8.4
linux-yocto/5.8: disable IKHEADERS in default builds
kernel-yocto: allow promotion of configuration warnings to errors
kernel-yocto: checksum all modifications to available kernel fragments directories
lttng-modules/devupstream: bump to latest 2.12 commits
linux-yocto-dev: bump to v5.9+
linux-yocto/5.8: update to v5.8.5
kernel-devsrc: account for HOSTCC and HOSTCXX
linux-yocto/config: netfilter: Enable nat for ipv4 and ipv6
linux-yocto/5.8: update to v5.8.8
linux-yocto/5.4: update to v5.4.64
linux-yocto/config: configuration warning cleanup
linux-yocto/5.8: update to v5.8.9
linux-yocto/5.4: update to v5.4.65
Changhyeok Bae (2):
iw: upgrade 5.4 -> 5.8
iputils: upgrade s20190709 -> s20200821
Chris Laplante (12):
bitbake: compat.py: remove file since it no longer actually implements anything
bitbake: COW: formatting
bitbake: COW: migrate test suite into tests/cow
cve-update-db-native: add progress handler
cve-check/cve-update-db-native: use lockfile to fix usage under multiconfig
cve-update-db-native: use context manager for cve_f
cve-check: avoid FileNotFoundError if no do_cve_check task has run
bitbake: utils: process_profilelog: use context manager
bitbake: utils: fix UnboundLocalError when _print_exception raises
cve-update-db-native: be less magical about checking whether the cve-check class is enabled
cve-update-db-native: move -journal checking into do_fetch
cve-update-db-native: remove unused variable
Christophe GUIBOUT (1):
initramfs-framework: support kernel cmdline with double quotes
Denys Dmytriyenko (2):
weston: upgrade 8.0.0 -> 9.0.0
cryptodev: bump 1 commit past 1.11 to fix 5.9-rc1+
Diego Sueiro (2):
license_image.bbclass: Create symlink to the image license manifest dir
license_image.bbclass: Fix symlink to the image license manifest dir creation
Douglas Royds (1):
tcmode-default: Drop gcc-cross-initial, gcc-crosssdk-initial references
Frazer Clews (1):
bitbake: lib: fix most undefined code picked up by pylint
Geoff Parker (1):
systemd-serialgetty: Replace sed quoting using ' with " to allow var expansion
Jacob Kroon (1):
gcc10: Don't default back to -fcommon
Jean-Francois Dagenais (1):
bitbake: siggen: clean_basepath: remove recipe full path when virtual:xyz present
Jens Rehsack (1):
lttng-modules: backport patches from 2.12.x to fix 5.4.64+ and 5.8.9+ builds
Joe Slater (1):
pseudo: fix renaming to self
Jon Mason (4):
cortex-m0plus.inc: change file permissions
tune-cortexa55.inc: clean-up ARMv8.2a uses
tune-cortexa57-cortexa53.inc: add CRC and set march
tune-cortexa*: Cleanups
Joshua Watt (8):
wic: Add 512 Byte alignment to --offset
oeqa: runtime_tests: Extra GPG debugging
oeqa: sdk: Capture stderr output
oeqa: reproducible: Fix test not producing diffs
diffoscope: upgrade 156 -> 158
bitbake: bitbake: Add parsing torture test
bitbake: cooker: Block SIGINT in worker processes
sphinx: dev-manual: Clarify that virtual providers do not apply to runtime dependencies
Kai Kang (1):
dhcpcd: 9.1.4 -> 9.2.0
Kevin Hao (1):
meta-yocto-bsp: Bump to the v5.8 kernel
Khairul Rohaizzat Jamaluddin (1):
wic/bootimg-efi: IMAGE_EFI_BOOT_FILES variable added to separate bootimg-efi and bootimg-partition
Khem Raj (24):
gcc-cross-canadian: Install gcc/g++ wrappers for musl
uninative: Upgrade to 2.9
packagegroup-core-tools-profile: Disable lttng-modules for riscv64
lttng-modules: Disable on riscv64
kexec-tools: Fix build with -fno-common on ppc
lttng-tools: Do not build for riscv64
util-linux: Allow update alternatives for additional apps
lttng-tools: lttng-ust works on riscv64
json-glib: Backport a build fix with clang
rpcbind: Use update-alternatives for rpcinfo
go: Upgrade to 1.15 major release
weston-init: Redefine weston service and add socket activation option
musl: Upgrade to latest master
libucontext: Recognise riscv32 architecture
linuxloader.bbclass: Define riscv32 ldso for musl
populate_sdk_ext: Do not assume local.conf will always exist
weston: plane_add_prop() calls break musl atomic modesetting
weston-init: Enable RDP screen share
weston-init: Do not use fbdev backend
weston-init: Select drm/fbdev backends for qemu machines
oeqa/weston: Fix tests to run with systemd
core-image-weston: Bump qemu memory to 512M
go: Update to 1.15.2 minor release
bind: Inherit update-alternatives
Mark Hatle (6):
package_tar.bbclass: Sync to the other package_* classes
kernel.bbclass: Remove do_install[prefunc] no longer needed
buildhistory.bbclass: Rework to use read_subpackage_metadata
kernel.bbclass: Move away from calling package_get_auto_pr
package.bbclass: hash equivalency and pr service
bitbake: process.py: Handle SystemExit exception to eliminate backtrace
Mark Morton (1):
sphinx: test-manual code block, link, and format update
Martin Jansa (7):
devtool: expand SRC_URI when guessing recipe update mode
image-artifact-names: introduce new bbclass and move some variables into it
kernel.bbclass: use bash variables like imageType, base_name without {}
kernel.bbclass: eliminate (initramfs_)symlink_name variables
kernel.bbclass: use camelCase notation for bash variables in do_deploy
*-initramfs: don't use .rootfs IMAGE_NAME_SUFFIX
bitbake.conf: use ${TCMODE}-${TCLIBC} directory for CACHE
Matt Madison (1):
image.bbclass: fix REPRODUCIBLE_TIMESTAMP_ROOTFS reference
Michael Gloff (2):
sysvinit rc: Use PSPLASH_FIFO_DIR for progress fifo
sysvinit: Remove ${B} assignment
Michael Tretter (1):
devtool: deploy-target: Fix size calculation for hard links
Ming Liu (2):
systemd: split systemd specific udev rules into its own package
libubootenv: inherit uboot-config
Mingli Yu (3):
qemu: always define unknown_lock_type
qemu: override DEBUG_BUILD
bison: remove the parallel build patch
Naveen Saini (1):
lib/oe/recipeutils.py: add support for BBFILES_DYNAMIC
Nicolas Dechesne (73):
linux-libc-headers: kernel headers are installed in STAGING_KERNEL_BUILDDIR
bitbake: sphinx: add initial build infrastructure
bitbake: sphinx: initial sphinx support
bitbake: sphinx: bitbake-user-manual: use builtin sphinx glossary
bitbake: sphinx: switch to readthedocs theme
bitbake: sphinx: override theme CSS
bitbake: sphinx: fixup for links
bitbake: sphinx: fix links inside notes
bitbake: sphinx: fixes all remaining warnings
bitbake: sphinx: Makefile.sphinx: add clean and publish targets
bitbake: sphinx: tweak html output a bit
bitbake: sphinx: add SPDX headers
bitbake: sphinx: index: move the boilerplate at the end of the page
bitbake: sphinx: conf: enable extlinks extension
bitbake: sphinx: add releases page
bitbake: sphinx: bitbake-user-manual: insert additional blank line after title
bitbake: sphinx: last manual round of fixes/improvements
bitbake: sphinx: update style for important, caution and warnings
bitbake: sphinx: remove leading '/'
bitbake: sphinx: theme_override: properly set font for verbatim text
bitbake: bitbake-user-manual: fix bad links
sphinx: add initial build infrastructure
sphinx: initial sphinx support
sphinx: ref-variables: use builtin sphinx glossary
sphinx: overview-manual: add figures
sphinx: switch to readthedocs theme
sphinx: Add SPDX license headers
sphinx: add CSS theme override
sphinx: bsp-guide: add figures
sphinx: add Yocto project logo
sphinx: conf: update copyright
sphinx: conf: add substitutions/global variables
sphinx: add boilerplate file
sphinx: add boilerplate to manuals
sphinx: ref-manual: add revision history table
sphinx: add a general index
sphinx: conf.py: enable sphinx.ext.autosectionlabel
sphinx: ref-manual: use builtin glossary for the Terms section
sphinx: fix internal links
sphinx: ref-manual: fix typo
sphinx: fix custom term links
sphinx: manual updates for some links
sphinx: dev-manual add figures
sphinx: kernel-dev: add figures
sphinx: profile-manual: add figures
sphinx: fix up bold text for informalexample container
sphinx: ref-manual: add figures
sphinx: sdk-manual: add figures
sphinx: test-manual: add figures
sphinx: toaster-manual: add figures
sphinx: add links for Yocto project website
sphinx: fix links when the link text should be displayed
sphinx: add links to terms in the BitBake glossary
sphinx: add links to section in the Bitbake manual
sphinx: setup extlink for docs.yoctoproject.org
sphinx: enable intersphinx extension
sphinx: insert blank below between title and toc
sphinx: fix up terms related to kernel-fitimage
sphinx: conf: a few rendering tweaks
sphinx: makefile: add publish target
sphinx: conf: include CSS/JS files, the proper way
sphinx: convert 'what I wish I'd known'
sphinx: convert 'transitioning to a custom environment'
sphinx: ref-manual: fix heading for oe-init-build-env
sphinx: brief-yoctoprojectqs: fix up all remaining rendering issues
sphinx: Makefile.sphinx improvements
sphinx: convert bsp-guide
sphinx: remove leading '/'
sphinx: update style for important, caution and warnings
sphinx: profile-manual: convert profile-manual
sphinx: theme_override: properly set font for verbatim text
sphinx: theme_override: add tying-it-together admonition
sphinx: conf: exclude adt-manual/*.rst
Oleksandr Kravchuk (1):
ell: update to 0.33
Ovidiu Panait (1):
libxml2: Fix CVE-2020-24977
Peter A. Bigot (2):
bluez5: fix builds that require ell support
timezone: include leap second data in tzdata-core
Peter Bergin (1):
systemd: avoid failing if no udev rules provided
Pierre-Jean Texier (2):
libubootenv: upgrade 0.3 -> 0.3.1
diffoscope: upgrade 158 -> 160
Quentin Schulz (16):
sphinx: brief-yoctoprojectqs: remove redundant welcome
sphinx: brief-yoctoprojectqs: fix ambiguous note for cyclone5 example
sphinx: brief-yoctoprojectqs: add missing boilerplate
sphinx: overview-manual: add link to AUH how-to section
sphinx: overview-manual: fix bitbake basic explanation
sphinx: brief-yoctoprojectqs: add note on branch consistency between layers
sphinx: what-i-wish-id-known: update "don't be fooled by doc search results"
sphinx: overview-manual: remove highlight in bold section
sphinx: replace special quotes with single and double quotes
sphinx: fix incorrect indentations
sphinx: brief-yoctoprojectqs: put other distros note after Ubuntu-specific packages
sphinx: fix a few typos or missing/too many words
sphinx: "highlight" some variables, tasks or files
sphinx: fix or add missing links and remove mention of Eclipse workflow
ref-manual: examples: hello-autotools: upgrade to 2.10
ref-manual: examples: libxpm: add relative path to .inc
Rahul Kumar (1):
systemd-serialgetty: Fix sed expression quoting
Rasmus Villemoes (1):
kernel.bbclass: run do_symlink_kernsrc before do_patch
Richard Purdie (74):
nativesdk-sdk-provides-dummy: Add /bin/sh
bitbake: fetch2/wget: Remove buffering parameter
bitbake: cooker: Ensure parse_quit thread is closed down
bitbake: cooker: Explictly shut down the sync thread
bitbake: fetch2: Drop cups.org from wget status checks
bitbake: build/msg: Cleanup verbose option handling
bitbake: cooker/cookerdata/main: Improve loglevel handling
bitbake: cookerdata: Ensure UI options are updated to the server
bitbake: cooker/cookerdata: Ensure UI event log is updated from commandline
bitbake: cooker: Defer configuration init to after UI connection
bitbake: server/process: Move the socket code to server process only
bitbake: main/server/process: Drop configuration object passing
bitbake: cooker: Ensure BB_ORIGENV is updated by changes to configuration.env
bitbake: server/process: Log extra threads at exit
bitbake: server/process: Add bitbake-server and exec() a new server process
bitbake: runqueue: Don't use sys.argv
bitbake: cooker: Ensure cooker's environment is updated on updateConfig
connman-gnome/matchbox-desktop: Remove file:// globbing
selftest/recipetool: Drop globbing SRC_URI test, no longer supported
local.conf.sample: Document memory resident bitbake
bitbake: fetch2: Drop globbing support in file:// SRC_URIs
bitbake: server/process: Use sys.executable for bitbake-server
bitbake: process: Avoid bb.utils.timeout
bitbake: utils: Drop broken timeout function
bitbake: server/process: Fix typo in code causing tracebacks
oeqa/selftest: Apply patch to fix cpio build with -fno-common
runqemu: Show an error for conflicting graphics options
lttng: Move platform logic to dedicated inc file
patchelf: upgrade 0.11 -> 0.12
build-appliance/packagegroup-core-base-utils: Replace dhcp-client/dhcp-server with dhcpcd/kea
selftest/prservice: Improve test failure message
iputils: Adapt ${PN}-tftpd package dependency to PACKAGECONFIG
bitbake: process/knotty: Improve early exception handling
bitbake: cooker/cookerdata: Use BBHandledException, not sys.exit()
bitbake: cookerdata: Fix exception raise statements
bitbake: process: Avoid printing binary strings for leftover processes
bitbake: server/process: Ensure logging is flushed
bitbake: server/process: Don't show tracebacks if the lockfile is removed
bitbake: cooker: Ensure parser replacement calls parser final_cleanup
bitbake: cooker: Assign a name to the sync thread to aid debugging
bitbake: server/process: Ensure we don't keep looping if some other server is started
bitbake: server/process: Prefix the log data with pid/time information
bitbake: server/process: Note when commands complete in logs
bitbake: cooker: Ensure parser is cleaned up
runqemu: Add a hook to allow it to renice
bitbake: cooker: Avoid parser deadlocks
bitbake: cooker: Ensure parser worker signal handlers are default
selftest/signing: Ensure build path relocation is safe
oeqa/concurrencytest: Improve builddir path manipulations
bitbake: cooker/command: Fix disconnection handling
bitbake: tinfoil: Ensure sockets don't leak even when exceptions occur
bitbake: tests/fetch: Move away from problematic freedesktop.org urls
bitbake: sphinx: Enhance the sphinx experience/navigation with:
bitbake: sphinx: theme_override: Use bold for emphasis text
Revert "qemu: always define unknown_lock_type"
Revert "core-image-sato: qemumips use 512 mem"
sphinx: Organize top level docs
sphinx: releases.rst: Add index/links to docs for previous releases
sphinx: boilerplate.rst: Drop versions notes as we have better navigation now
sphinx: boilerplate.rst: Sphinx puts the copyright elsewhere
sphinx: history: Move revision history to its own section
sphinx: manuals: Move boilerplate after toctree
sphinx: Add support for multiple docs version
sphinx: index.rst: Fix links
sphinx: ref-system-requirements: Improve formatting of the notes sections, merging them
sphinx: ref-manual links fixes and many other cleanups to import
sphinx: dev-manual: Various URL, code block and other fixes to imported data
sphinx: sdk-manual: Various URL, code block and other fixes to imported data
sphinx: kernel-dev: Various URL, code block and other fixes to imported data
sphinx: theme_override: Use bold for emphasis text
sphinx: ref-tasks: Add populate_sdk_ext task definition
sphinx: ref-manual/migration: Split each release into its own file
sphinx: overview-manual: Various URL, code block and other fixes to imported data
build-appliance-image: Update to master head revision
Robert Yang (3):
bitbake: cooker.py: Save prioritized BBFILES to BBFILES_PRIORITIZED
bitbake: utils.py: get_file_layer(): Exit the loop when file is matched
bitbake: utils.py: get_file_layer(): Improve performance
Ross Burton (25):
package.bbclass: explode the RPROVIDES so we don't think the versions are provides
elfutils: silence a new QA warning
insane: improve gnu-hash-style warning
gdk-pixbuf: add tests PACKAGECONFIG
debianutils: change SRC_URI to use snapshot.debian.org
insane: only load real files as ELF
autoconf: consolidate SRC_URI
autoconf: consolidate DEPENDS
kea: no need to depend on kea-native
kea: don't use PACKAGECONFIG inappropriately
kea: bump to 1.7.10
help2man: rewrite recipe
local.conf.sample.extended: remove help2man reference
curl: add vendors to CVE_PRODUCT to exclude false positives
harfbuzz: update patch status
harfbuzz: fix a build race around hb-version.h
cmake: whitelist CVE-2016-10642
ncurses: remove config.cache
qemu: fix CVE-2020-14364
cve-update-db-native: remove unused import
cve-update-db-native: add more logging when fetching
cve-update-db-native: use fetch task
alsa-plugins: improve .la removal
sato-screenshot: improve .la removal
buildhistory-diff: use BUILDDIR to know where buildhistory is
Saul Wold (1):
gnupg: uprev 2.2.22 -> 2.2.23
Stacy Gaikovaia (2):
bison: uprev from 3.7.1 to 3.7.2
valgrind: fix memcheck vgtests remove fullpath-after flags
Steve Sakoman (1):
xinput-calibrator: change SRC_URI to branch with libinput support
Sumit Garg (1):
insane: fix gnu-hash-style check
TeohJayShen (1):
oeqa/runtime: add test for matchbox-terminal
Tim Orling (1):
sphinx: toaster-manual: fix vars, links, code blocks
Vijai Kumar K (2):
image_types_wic: Add ASSUME_PROVIDED to WICVARS
wic: misc: Add /bin to the list of searchpaths
Yanfei Xu (1):
kernel-yocto: only replace leading -I in include paths
Yi Zhao (1):
glib-networking: add ptest
Zhixiong Chi (1):
gnutls: CVE-2020-24659
akuster (8):
log4cplus: move meta-oe pkg to core
kea: Move from meta-networking
maintainers.inc: Add me as kea & log4cplus maintainer.
dhcpcd: Move from meta-network as OE-Core needs a client
maintainers.inc: Add me as dhcpcd maintainer
dhcp: remove from core
bind: Add 9.16.x
bind: 9.11 remove
hongxu (1):
sysstat: fix installed-vs-shipped QA Issue in systemd
zangrc (4):
libcap:upgrade 2.42 -> 2.43
libcap-ng:upgrade 0.7.10 -> 0.7.11
libgpg-error:upgrade 1.38 -> 1.39
at-spi2-core:upgrade 2.36.0 -> 2.36.1
Signed-off-by: Andrew Geissler <geissonator@yahoo.com>
Change-Id: I5542f5eea751a2641342e945725fd687cd74bebe
diff --git a/poky/bitbake/lib/bb/COW.py b/poky/bitbake/lib/bb/COW.py
index bc20ce3..23c22b6 100644
--- a/poky/bitbake/lib/bb/COW.py
+++ b/poky/bitbake/lib/bb/COW.py
@@ -3,13 +3,14 @@
#
# Copyright (C) 2006 Tim Ansell
#
-#Please Note:
+# Please Note:
# Be careful when using mutable types (ie Dict and Lists) - operations involving these are SLOW.
# Assign a file to __warn__ to get warnings about slow operations.
#
import copy
+
ImmutableTypes = (
bool,
complex,
@@ -22,9 +23,11 @@
MUTABLE = "__mutable__"
+
class COWMeta(type):
pass
+
class COWDictMeta(COWMeta):
__warn__ = False
__hasmutable__ = False
@@ -33,12 +36,15 @@
def __str__(cls):
# FIXME: I have magic numbers!
return "<COWDict Level: %i Current Keys: %i>" % (cls.__count__, len(cls.__dict__) - 3)
+
__repr__ = __str__
def cow(cls):
class C(cls):
__count__ = cls.__count__ + 1
+
return C
+
copy = cow
__call__ = cow
@@ -70,8 +76,9 @@
return value
__getmarker__ = []
+
def __getreadonly__(cls, key, default=__getmarker__):
- """\
+ """
Get a value (even if mutable) which you promise not to change.
"""
return cls.__getitem__(key, default, True)
@@ -138,24 +145,29 @@
def iterkeys(cls):
return cls.iter("keys")
+
def itervalues(cls, readonly=False):
if not cls.__warn__ is False and cls.__hasmutable__ and readonly is False:
- print("Warning: If you arn't going to change any of the values call with True.", file=cls.__warn__)
+ print("Warning: If you aren't going to change any of the values call with True.", file=cls.__warn__)
return cls.iter("values", readonly)
+
def iteritems(cls, readonly=False):
if not cls.__warn__ is False and cls.__hasmutable__ and readonly is False:
- print("Warning: If you arn't going to change any of the values call with True.", file=cls.__warn__)
+ print("Warning: If you aren't going to change any of the values call with True.", file=cls.__warn__)
return cls.iter("items", readonly)
+
class COWSetMeta(COWDictMeta):
def __str__(cls):
# FIXME: I have magic numbers!
- return "<COWSet Level: %i Current Keys: %i>" % (cls.__count__, len(cls.__dict__) -3)
+ return "<COWSet Level: %i Current Keys: %i>" % (cls.__count__, len(cls.__dict__) - 3)
+
__repr__ = __str__
def cow(cls):
class C(cls):
__count__ = cls.__count__ + 1
+
return C
def add(cls, value):
@@ -173,131 +185,11 @@
def iteritems(cls):
raise TypeError("sets don't have 'items'")
+
# These are the actual classes you use!
-class COWDictBase(object, metaclass = COWDictMeta):
+class COWDictBase(metaclass=COWDictMeta):
__count__ = 0
-class COWSetBase(object, metaclass = COWSetMeta):
+
+class COWSetBase(metaclass=COWSetMeta):
__count__ = 0
-
-if __name__ == "__main__":
- import sys
- COWDictBase.__warn__ = sys.stderr
- a = COWDictBase()
- print("a", a)
-
- a['a'] = 'a'
- a['b'] = 'b'
- a['dict'] = {}
-
- b = a.copy()
- print("b", b)
- b['c'] = 'b'
-
- print()
-
- print("a", a)
- for x in a.iteritems():
- print(x)
- print("--")
- print("b", b)
- for x in b.iteritems():
- print(x)
- print()
-
- b['dict']['a'] = 'b'
- b['a'] = 'c'
-
- print("a", a)
- for x in a.iteritems():
- print(x)
- print("--")
- print("b", b)
- for x in b.iteritems():
- print(x)
- print()
-
- try:
- b['dict2']
- except KeyError as e:
- print("Okay!")
-
- a['set'] = COWSetBase()
- a['set'].add("o1")
- a['set'].add("o1")
- a['set'].add("o2")
-
- print("a", a)
- for x in a['set'].itervalues():
- print(x)
- print("--")
- print("b", b)
- for x in b['set'].itervalues():
- print(x)
- print()
-
- b['set'].add('o3')
-
- print("a", a)
- for x in a['set'].itervalues():
- print(x)
- print("--")
- print("b", b)
- for x in b['set'].itervalues():
- print(x)
- print()
-
- a['set2'] = set()
- a['set2'].add("o1")
- a['set2'].add("o1")
- a['set2'].add("o2")
-
- print("a", a)
- for x in a.iteritems():
- print(x)
- print("--")
- print("b", b)
- for x in b.iteritems(readonly=True):
- print(x)
- print()
-
- del b['b']
- try:
- print(b['b'])
- except KeyError:
- print("Yay! deleted key raises error")
-
- if 'b' in b:
- print("Boo!")
- else:
- print("Yay - has_key with delete works!")
-
- print("a", a)
- for x in a.iteritems():
- print(x)
- print("--")
- print("b", b)
- for x in b.iteritems(readonly=True):
- print(x)
- print()
-
- b.__revertitem__('b')
-
- print("a", a)
- for x in a.iteritems():
- print(x)
- print("--")
- print("b", b)
- for x in b.iteritems(readonly=True):
- print(x)
- print()
-
- b.__revertitem__('dict')
- print("a", a)
- for x in a.iteritems():
- print(x)
- print("--")
- print("b", b)
- for x in b.iteritems(readonly=True):
- print(x)
- print()
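For reference, the interactive demo deleted above was moved into tests/cow by the "COW: migrate test suite into tests/cow" change; below is a minimal sketch of the copy-on-write behaviour it exercised, condensed from that deleted code (same classes and methods as shown, nothing new assumed):

    from bb.COW import COWDictBase

    # "Calling" COWDictBase yields a new class one level deeper; writes land in
    # the child level while reads fall through to the parent.
    a = COWDictBase()
    a['a'] = 'a'
    a['dict'] = {}

    b = a.copy()           # cheap copy: no data is duplicated up front
    b['c'] = 'b'           # visible only in b
    b['dict']['a'] = 'x'   # mutable values are copied into b on first non-readonly access

    print(list(a.iteritems(readonly=True)))   # no 'c', dict still empty
    print(list(b.iteritems(readonly=True)))   # 'c' present, dict mutated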
diff --git a/poky/bitbake/lib/bb/__init__.py b/poky/bitbake/lib/bb/__init__.py
index 2c94e10..888dd5c 100644
--- a/poky/bitbake/lib/bb/__init__.py
+++ b/poky/bitbake/lib/bb/__init__.py
@@ -93,7 +93,7 @@
def __repr__(self):
logger = self.logger
- level = getLevelName(logger.getEffectiveLevel())
+        level = logging.getLevelName(logger.getEffectiveLevel())
return '<%s %s (%s)>' % (self.__class__.__name__, logger.name, level)
logging.LoggerAdapter = BBLoggerAdapter
diff --git a/poky/bitbake/lib/bb/build.py b/poky/bitbake/lib/bb/build.py
index 94f9cb3..974d2ff 100644
--- a/poky/bitbake/lib/bb/build.py
+++ b/poky/bitbake/lib/bb/build.py
@@ -29,6 +29,9 @@
bblogger = logging.getLogger('BitBake')
logger = logging.getLogger('BitBake.Build')
+verboseShellLogging = False
+verboseStdoutLogging = False
+
__mtime_cache = {}
def cached_mtime_noerror(f):
@@ -413,7 +416,7 @@
bb.data.emit_func(func, script, d)
- if bb.msg.loggerVerboseLogs:
+ if verboseShellLogging or bb.utils.to_boolean(d.getVar("BB_VERBOSE_LOGS", False)):
script.write("set -x\n")
if cwd:
script.write("cd '%s'\n" % cwd)
@@ -433,7 +436,7 @@
if fakerootcmd:
cmd = [fakerootcmd, runfile]
- if bb.msg.loggerDefaultVerbose:
+ if verboseStdoutLogging:
logfile = LogTee(logger, StdoutNoopContextManager())
else:
logfile = StdoutNoopContextManager()
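The two module-level flags introduced above replace the old bb.msg.loggerVerboseLogs/loggerDefaultVerbose globals: verboseShellLogging turns on "set -x" in the generated shell task scripts, while verboseStdoutLogging tees task output to the console (via LogTee) as well as to the task logfile. A minimal sketch of driving them directly; in the real tree they are set from the client's verbosity settings via the server, so the direct assignment here is only for illustration:

    import bb.build

    bb.build.verboseShellLogging = True    # emit "set -x" into generated shell task scripts
    bb.build.verboseStdoutLogging = True   # echo task output to stdout, not just the logfile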
diff --git a/poky/bitbake/lib/bb/cache.py b/poky/bitbake/lib/bb/cache.py
index b819a0c..9e0c931 100644
--- a/poky/bitbake/lib/bb/cache.py
+++ b/poky/bitbake/lib/bb/cache.py
@@ -636,7 +636,7 @@
# Have to be careful about spaces and colons in filenames
flist = self.filelist_regex.split(fl)
for f in flist:
- if not f or "*" in f:
+ if not f:
continue
f, exist = f.split(":")
if (exist == "True" and not os.path.exists(f)) or (exist == "False" and os.path.exists(f)):
diff --git a/poky/bitbake/lib/bb/command.py b/poky/bitbake/lib/bb/command.py
index 4d152ff..f8c6a03 100644
--- a/poky/bitbake/lib/bb/command.py
+++ b/poky/bitbake/lib/bb/command.py
@@ -54,13 +54,20 @@
self.cooker = cooker
self.cmds_sync = CommandsSync()
self.cmds_async = CommandsAsync()
- self.remotedatastores = bb.remotedata.RemoteDatastores(cooker)
+ self.remotedatastores = None
# FIXME Add lock for this
self.currentAsyncCommand = None
def runCommand(self, commandline, ro_only = False):
command = commandline.pop(0)
+
+ # Ensure cooker is ready for commands
+ if command != "updateConfig" and command != "setFeatures":
+ self.cooker.init_configdata()
+ if not self.remotedatastores:
+ self.remotedatastores = bb.remotedata.RemoteDatastores(self.cooker)
+
if hasattr(CommandsSync, command):
# Can run synchronous commands straight away
command_method = getattr(self.cmds_sync, command)
@@ -136,7 +143,8 @@
self.cooker.finishcommand()
def reset(self):
- self.remotedatastores = bb.remotedata.RemoteDatastores(self.cooker)
+ if self.remotedatastores:
+ self.remotedatastores = bb.remotedata.RemoteDatastores(self.cooker)
class CommandsSync:
"""
diff --git a/poky/bitbake/lib/bb/compat.py b/poky/bitbake/lib/bb/compat.py
deleted file mode 100644
index 4935668..0000000
--- a/poky/bitbake/lib/bb/compat.py
+++ /dev/null
@@ -1,10 +0,0 @@
-#
-# SPDX-License-Identifier: GPL-2.0-only
-#
-
-"""Code pulled from future python versions, here for compatibility"""
-
-from collections import MutableMapping, KeysView, ValuesView, ItemsView, OrderedDict
-from functools import total_ordering
-
-
diff --git a/poky/bitbake/lib/bb/cooker.py b/poky/bitbake/lib/bb/cooker.py
index 9123605..5442f7d 100644
--- a/poky/bitbake/lib/bb/cooker.py
+++ b/poky/bitbake/lib/bb/cooker.py
@@ -148,15 +148,16 @@
Manages one bitbake build run
"""
- def __init__(self, configuration, featureSet=None, idleCallBackRegister=None):
+ def __init__(self, featureSet=None, idleCallBackRegister=None):
self.recipecaches = None
+ self.eventlog = None
self.skiplist = {}
self.featureset = CookerFeatures()
if featureSet:
for f in featureSet:
self.featureset.setFeature(f)
- self.configuration = configuration
+ self.configuration = bb.cookerdata.CookerConfiguration()
self.idleCallBackRegister = idleCallBackRegister
@@ -194,18 +195,6 @@
self.hashserv = None
self.hashservaddr = None
- self.initConfigurationData()
-
- bb.debug(1, "BBCooker parsed base configuration %s" % time.time())
- sys.stdout.flush()
-
- # we log all events to a file if so directed
- if self.configuration.writeeventlog:
- # register the log file writer as UI Handler
- writer = EventWriter(self, self.configuration.writeeventlog)
- EventLogWriteHandler = namedtuple('EventLogWriteHandler', ['event'])
- bb.event.register_UIHhandler(EventLogWriteHandler(writer))
-
self.inotify_modified_files = []
def _process_inotify_updates(server, cooker, abort):
@@ -239,6 +228,13 @@
bb.debug(1, "BBCooker startup complete %s" % time.time())
sys.stdout.flush()
+ def init_configdata(self):
+ if not hasattr(self, "data"):
+ self.initConfigurationData()
+ bb.debug(1, "BBCooker parsed base configuration %s" % time.time())
+ sys.stdout.flush()
+ self.handlePRServ()
+
def process_inotify_updates(self):
for n in [self.confignotifier, self.notifier]:
if n.check_events(timeout=0):
@@ -324,7 +320,7 @@
for feature in features:
self.featureset.setFeature(feature)
bb.debug(1, "Features set %s (was %s)" % (original_featureset, list(self.featureset)))
- if (original_featureset != list(self.featureset)) and self.state != state.error:
+ if (original_featureset != list(self.featureset)) and self.state != state.error and hasattr(self, "data"):
self.reset()
def initConfigurationData(self):
@@ -356,7 +352,7 @@
self.caches_array.append(getattr(module, cache_name))
except ImportError as exc:
logger.critical("Unable to import extra RecipeInfo '%s' from '%s': %s" % (cache_name, module_name, exc))
- sys.exit("FATAL: Failed to import extra cache class '%s'." % cache_name)
+ raise bb.BBHandledException()
self.databuilder = bb.cookerdata.CookerDataBuilder(self.configuration, False)
self.databuilder.parseBaseConfiguration()
@@ -413,11 +409,6 @@
self.data.disableTracking()
def parseConfiguration(self):
- # Set log file verbosity
- verboselogs = bb.utils.to_boolean(self.data.getVar("BB_VERBOSE_LOGS", False))
- if verboselogs:
- bb.msg.loggerVerboseLogs = True
-
# Change nice level if we're asked to
nice = self.data.getVar("BB_NICE_LEVEL")
if nice:
@@ -451,7 +442,28 @@
logger.debug(1, "Marking as dirty due to '%s' option change to '%s'" % (o, options[o]))
print("Marking as dirty due to '%s' option change to '%s'" % (o, options[o]))
clean = False
- setattr(self.configuration, o, options[o])
+ if hasattr(self.configuration, o):
+ setattr(self.configuration, o, options[o])
+
+ if self.configuration.writeeventlog:
+ if self.eventlog and self.eventlog[0] != self.configuration.writeeventlog:
+ bb.event.unregister_UIHhandler(self.eventlog[1])
+ if not self.eventlog or self.eventlog[0] != self.configuration.writeeventlog:
+ # we log all events to a file if so directed
+ # register the log file writer as UI Handler
+ writer = EventWriter(self, self.configuration.writeeventlog)
+ EventLogWriteHandler = namedtuple('EventLogWriteHandler', ['event'])
+ self.eventlog = (self.configuration.writeeventlog, bb.event.register_UIHhandler(EventLogWriteHandler(writer)))
+
+ bb.msg.loggerDefaultLogLevel = self.configuration.default_loglevel
+ bb.msg.loggerDefaultDomains = self.configuration.debug_domains
+
+ if hasattr(self, "data"):
+ origenv = bb.data.init()
+ for k in environment:
+ origenv.setVar(k, environment[k])
+ self.data.setVar("BB_ORIGENV", origenv)
+
for k in bb.utils.approved_variables():
if k in environment and k not in self.configuration.env:
logger.debug(1, "Updating new environment variable %s to %s" % (k, environment[k]))
@@ -467,6 +479,10 @@
logger.debug(1, "Updating environment variable %s from %s to %s" % (k, self.configuration.env[k], environment[k]))
self.configuration.env[k] = environment[k]
clean = False
+
+ # Now update all the variables not in the datastore to match
+ self.configuration.env = environment
+
if not clean:
logger.debug(1, "Base environment change, triggering reparse")
self.reset()
@@ -1111,7 +1127,7 @@
from bb import shell
except ImportError:
parselog.exception("Interactive mode not available")
- sys.exit(1)
+ raise bb.BBHandledException()
else:
shell.start( self )
@@ -1547,6 +1563,7 @@
if self.state in (state.shutdown, state.forceshutdown, state.error):
if hasattr(self.parser, 'shutdown'):
self.parser.shutdown(clean=False, force = True)
+ self.parser.final_cleanup()
raise bb.BBHandledException()
if self.state != state.parsing:
@@ -1654,12 +1671,10 @@
return pkgs_to_build
def pre_serve(self):
- # We now are in our own process so we can call this here.
- # PRServ exits if its parent process exits
- self.handlePRServ()
return
def post_serve(self):
+ self.shutdown(force=True)
prserv.serv.auto_shutdown()
if self.hashserv:
self.hashserv.process.terminate()
@@ -1674,6 +1689,7 @@
if self.parser:
self.parser.shutdown(clean=not force, force=force)
+ self.parser.final_cleanup()
def finishcommand(self):
self.state = state.initial
@@ -1687,8 +1703,9 @@
self.finishcommand()
self.extraconfigdata = {}
self.command.reset()
- self.databuilder.reset()
- self.data = self.databuilder.data
+ if hasattr(self, "data"):
+ self.databuilder.reset()
+ self.data = self.databuilder.data
self.parsecache_valid = False
self.baseconfig_valid = False
@@ -1745,10 +1762,10 @@
collectlog.debug(1, "collecting .bb files")
files = (config.getVar( "BBFILES") or "").split()
- config.setVar("BBFILES", " ".join(files))
# Sort files by priority
files.sort( key=lambda fileitem: self.calc_bbfile_priority(fileitem)[0] )
+ config.setVar("BBFILES_PRIORITIZED", " ".join(files))
if not len(files):
files = self.get_bbfiles()
@@ -1977,7 +1994,8 @@
except queue.Empty:
pass
else:
- self.results.cancel_join_thread()
+ self.results.close()
+ self.results.join_thread()
break
if pending:
@@ -1986,6 +2004,8 @@
try:
job = self.jobs.pop()
except IndexError:
+ self.results.close()
+ self.results.join_thread()
break
result = self.parse(*job)
# Clear the siggen cache after parsing to control memory usage, its huge
@@ -2063,6 +2083,7 @@
self.start()
self.haveshutdown = False
+ self.syncthread = None
def start(self):
self.results = self.load_cached()
@@ -2070,6 +2091,9 @@
if self.toparse:
bb.event.fire(bb.event.ParseStarted(self.toparse), self.cfgdata)
def init():
+ signal.signal(signal.SIGTERM, signal.SIG_DFL)
+ signal.signal(signal.SIGHUP, signal.SIG_DFL)
+ signal.signal(signal.SIGINT, signal.SIG_IGN)
bb.utils.set_process_name(multiprocessing.current_process().name)
multiprocessing.util.Finalize(None, bb.codeparser.parser_cache_save, exitpriority=1)
multiprocessing.util.Finalize(None, bb.fetch.fetcher_parse_save, exitpriority=1)
@@ -2103,12 +2127,9 @@
self.total)
bb.event.fire(event, self.cfgdata)
- for process in self.processes:
- self.parser_quit.put(None)
- else:
- self.parser_quit.cancel_join_thread()
- for process in self.processes:
- self.parser_quit.put(None)
+
+ for process in self.processes:
+ self.parser_quit.put(None)
# Cleanup the queue before call process.join(), otherwise there might be
# deadlocks.
@@ -2125,13 +2146,17 @@
else:
process.join()
+ self.parser_quit.close()
+ # Allow data left in the cancel queue to be discarded
+ self.parser_quit.cancel_join_thread()
+
def sync_caches():
for c in self.bb_caches.values():
c.sync()
- sync = threading.Thread(target=sync_caches)
+ sync = threading.Thread(target=sync_caches, name="SyncThread")
+ self.syncthread = sync
sync.start()
- multiprocessing.util.Finalize(None, sync.join, exitpriority=-100)
bb.codeparser.parser_cache_savemerge()
bb.fetch.fetcher_parse_done()
if self.cooker.configuration.profile:
@@ -2145,6 +2170,10 @@
bb.utils.process_profilelog(profiles, pout = pout)
print("Processed parsing statistics saved to %s" % (pout))
+ def final_cleanup(self):
+ if self.syncthread:
+ self.syncthread.join()
+
def load_cached(self):
for mc, cache, filename, appends in self.fromcache:
cached, infos = cache.load(filename, appends)
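The parser-worker init() above now resets SIGTERM/SIGHUP to their defaults and ignores SIGINT, and the result/quit queues are explicitly closed and joined rather than relying on cancel_join_thread() alone, so only the parent reacts to Ctrl-C and shutdown cannot deadlock on queued data. A standalone sketch of the same signal pattern; the Pool usage is purely illustrative, since the cooker manages its worker Processes and queues by hand:

    import signal
    import multiprocessing

    def worker_init():
        # Restore default handlers inherited from the parent so SIGTERM/SIGHUP
        # terminate workers cleanly, and ignore SIGINT so Ctrl-C is handled only
        # by the parent, which then shuts the workers down in an orderly way.
        signal.signal(signal.SIGTERM, signal.SIG_DFL)
        signal.signal(signal.SIGHUP, signal.SIG_DFL)
        signal.signal(signal.SIGINT, signal.SIG_IGN)

    if __name__ == "__main__":
        with multiprocessing.Pool(4, initializer=worker_init) as pool:
            print(pool.map(abs, [-1, -2, -3]))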
diff --git a/poky/bitbake/lib/bb/cookerdata.py b/poky/bitbake/lib/bb/cookerdata.py
index b86e7d4..91cc434 100644
--- a/poky/bitbake/lib/bb/cookerdata.py
+++ b/poky/bitbake/lib/bb/cookerdata.py
@@ -58,11 +58,18 @@
def updateToServer(self, server, environment):
options = {}
for o in ["abort", "force", "invalidate_stamp",
- "verbose", "debug", "dry_run", "dump_signatures",
- "debug_domains", "extra_assume_provided", "profile",
- "prefile", "postfile", "server_timeout"]:
+ "dry_run", "dump_signatures",
+ "extra_assume_provided", "profile",
+ "prefile", "postfile", "server_timeout",
+ "nosetscene", "setsceneonly", "skipsetscene",
+ "runall", "runonly", "writeeventlog"]:
options[o] = getattr(self.options, o)
+ options['build_verbose_shell'] = self.options.verbose
+ options['build_verbose_stdout'] = self.options.verbose
+ options['default_loglevel'] = bb.msg.loggerDefaultLogLevel
+ options['debug_domains'] = bb.msg.loggerDefaultDomains
+
ret, error = server.runCommand(["updateConfig", options, environment, sys.argv])
if error:
raise Exception("Unable to update the server configuration with local parameters: %s" % error)
@@ -111,11 +118,11 @@
"""
def __init__(self):
- self.debug_domains = []
+ self.debug_domains = bb.msg.loggerDefaultDomains
+ self.default_loglevel = bb.msg.loggerDefaultLogLevel
self.extra_assume_provided = []
self.prefile = []
self.postfile = []
- self.debug = 0
self.cmd = None
self.abort = True
self.force = False
@@ -125,24 +132,17 @@
self.skipsetscene = False
self.invalidate_stamp = False
self.dump_signatures = []
+ self.build_verbose_shell = False
+ self.build_verbose_stdout = False
self.dry_run = False
self.tracking = False
- self.xmlrpcinterface = []
- self.server_timeout = None
self.writeeventlog = False
- self.server_only = False
self.limited_deps = False
self.runall = []
self.runonly = []
self.env = {}
- def setConfigParameters(self, parameters):
- for key in self.__dict__.keys():
- if key in parameters.options.__dict__:
- setattr(self, key, parameters.options.__dict__[key])
- self.env = parameters.environment.copy()
-
def __getstate__(self):
state = {}
for key in self.__dict__.keys():
@@ -164,7 +164,7 @@
import traceback
parselog.critical(traceback.format_exc())
parselog.critical("Unable to parse %s: %s" % (fn, exc))
- sys.exit(1)
+ raise bb.BBHandledException()
except bb.data_smart.ExpansionError as exc:
import traceback
@@ -176,10 +176,10 @@
if not fn.startswith(bbdir):
break
parselog.critical("Unable to parse %s" % fn, exc_info=(exc_class, exc, tb))
- sys.exit(1)
+ raise bb.BBHandledException()
except bb.parse.ParseError as exc:
parselog.critical(str(exc))
- sys.exit(1)
+ raise bb.BBHandledException()
return wrapped
@catch_parse_error
@@ -300,13 +300,13 @@
self.data_hash = data_hash.hexdigest()
except (SyntaxError, bb.BBHandledException):
- raise bb.BBHandledException
+ raise bb.BBHandledException()
except bb.data_smart.ExpansionError as e:
logger.error(str(e))
- raise bb.BBHandledException
+ raise bb.BBHandledException()
except Exception:
logger.exception("Error parsing configuration files")
- raise bb.BBHandledException
+ raise bb.BBHandledException()
# Create a copy so we can reset at a later date when UIs disconnect
self.origdata = self.data
@@ -355,7 +355,7 @@
for layer in broken_layers:
parselog.critical(" %s", layer)
parselog.critical("Please check BBLAYERS in %s" % (layerconf))
- sys.exit(1)
+ raise bb.BBHandledException()
for layer in layers:
parselog.debug(2, "Adding layer %s", layer)
@@ -427,7 +427,7 @@
handlerfn = data.getVarFlag(var, "filename", False)
if not handlerfn:
parselog.critical("Undefined event handler function '%s'" % var)
- sys.exit(1)
+ raise bb.BBHandledException()
handlerln = int(data.getVarFlag(var, "lineno", False))
bb.event.register(var, data.getVar(var, False), (data.getVarFlag(var, "eventmask") or "").split(), handlerfn, handlerln)
diff --git a/poky/bitbake/lib/bb/daemonize.py b/poky/bitbake/lib/bb/daemonize.py
index f01e6ec..c187fcf 100644
--- a/poky/bitbake/lib/bb/daemonize.py
+++ b/poky/bitbake/lib/bb/daemonize.py
@@ -14,6 +14,8 @@
import io
import traceback
+import bb
+
def createDaemon(function, logfile):
"""
Detach a process from the controlling terminal and run it in the
diff --git a/poky/bitbake/lib/bb/data_smart.py b/poky/bitbake/lib/bb/data_smart.py
index 7f1b6dc..c559102 100644
--- a/poky/bitbake/lib/bb/data_smart.py
+++ b/poky/bitbake/lib/bb/data_smart.py
@@ -189,7 +189,7 @@
if self.current.parent:
self.current = self.current.parent
else:
- bb.warn("Include log: Tried to finish '%s' at top level." % filename)
+ bb.warn("Include log: Tried to finish '%s' at top level." % self.filename)
return False
def emit(self, o, level = 0):
diff --git a/poky/bitbake/lib/bb/event.py b/poky/bitbake/lib/bb/event.py
index 0e6d9b2..694b470 100644
--- a/poky/bitbake/lib/bb/event.py
+++ b/poky/bitbake/lib/bb/event.py
@@ -10,17 +10,17 @@
# SPDX-License-Identifier: GPL-2.0-only
#
-import sys
-import pickle
-import logging
-import atexit
-import traceback
import ast
+import atexit
+import collections
+import logging
+import pickle
+import sys
import threading
+import traceback
-import bb.utils
-import bb.compat
import bb.exceptions
+import bb.utils
# This is the pid for which we should generate the event. This is set when
# the runqueue forks off.
@@ -56,7 +56,7 @@
_handlers = h
def clean_class_handlers():
- return bb.compat.OrderedDict()
+ return collections.OrderedDict()
# Internal
_handlers = clean_class_handlers()
diff --git a/poky/bitbake/lib/bb/fetch2/__init__.py b/poky/bitbake/lib/bb/fetch2/__init__.py
index 756f602..7ec1fea 100644
--- a/poky/bitbake/lib/bb/fetch2/__init__.py
+++ b/poky/bitbake/lib/bb/fetch2/__init__.py
@@ -1195,8 +1195,6 @@
paths = ud.method.localpaths(ud, d)
for f in paths:
pth = ud.decodedurl
- if '*' in pth:
- f = os.path.join(os.path.abspath(f), pth)
if f.startswith(dl_dir):
# The local fetcher's behaviour is to return a path under DL_DIR if it couldn't find the file anywhere else
if os.path.exists(f):
@@ -1365,9 +1363,6 @@
# We cannot compute checksums for directories
if os.path.isdir(urldata.localpath):
return False
- if urldata.localpath.find("*") != -1:
- return False
-
return True
def recommends_checksum(self, urldata):
@@ -1430,11 +1425,6 @@
iterate = False
file = urldata.localpath
- # Localpath can't deal with 'dir/*' entries, so it converts them to '.',
- # but it must be corrected back for local files copying
- if urldata.basename == '*' and file.endswith('/.'):
- file = '%s/%s' % (file.rstrip('/.'), urldata.path)
-
try:
unpack = bb.utils.to_boolean(urldata.parm.get('unpack'), True)
except ValueError as exc:
@@ -1613,8 +1603,6 @@
"""
if os.path.exists(ud.localpath):
return True
- if ud.localpath.find("*") != -1:
- return True
return False
def implicit_urldata(self, ud, d):
diff --git a/poky/bitbake/lib/bb/fetch2/local.py b/poky/bitbake/lib/bb/fetch2/local.py
index 01d9ff9..25d4557 100644
--- a/poky/bitbake/lib/bb/fetch2/local.py
+++ b/poky/bitbake/lib/bb/fetch2/local.py
@@ -17,7 +17,7 @@
import urllib.request, urllib.parse, urllib.error
import bb
import bb.utils
-from bb.fetch2 import FetchMethod, FetchError
+from bb.fetch2 import FetchMethod, FetchError, ParameterError
from bb.fetch2 import logger
class Local(FetchMethod):
@@ -33,6 +33,8 @@
ud.basename = os.path.basename(ud.decodedurl)
ud.basepath = ud.decodedurl
ud.needdonestamp = False
+ if "*" in ud.decodedurl:
+ raise bb.fetch2.ParameterError("file:// urls using globbing are no longer supported. Please place the files in a directory and reference that instead.", ud.url)
return
def localpath(self, urldata, d):
@@ -55,12 +57,6 @@
logger.debug(2, "Searching for %s in paths:\n %s" % (path, "\n ".join(filespath.split(":"))))
newpath, hist = bb.utils.which(filespath, path, history=True)
searched.extend(hist)
- if (not newpath or not os.path.exists(newpath)) and path.find("*") != -1:
- # For expressions using '*', best we can do is take the first directory in FILESPATH that exists
- newpath, hist = bb.utils.which(filespath, ".", history=True)
- searched.extend(hist)
- logger.debug(2, "Searching for %s in path: %s" % (path, newpath))
- return searched
if not os.path.exists(newpath):
dldirfile = os.path.join(d.getVar("DL_DIR"), path)
logger.debug(2, "Defaulting to %s for %s" % (dldirfile, path))
@@ -70,8 +66,6 @@
return searched
def need_update(self, ud, d):
- if ud.url.find("*") != -1:
- return False
if os.path.exists(ud.localpath):
return False
return True
@@ -95,9 +89,6 @@
"""
Check the status of the url
"""
- if urldata.localpath.find("*") != -1:
- logger.info("URL %s looks like a glob and was therefore not checked.", urldata.url)
- return True
if os.path.exists(urldata.localpath):
return True
return False
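With the glob special cases removed, the local fetcher now rejects wildcard URLs up front in urldata_init (the ParameterError added earlier in this patch); the expected recipe-level fix is to ship the files in a directory and point SRC_URI at that directory rather than at a wildcard. A minimal sketch of the new guard, with the example paths in the comments invented for illustration:

    from bb.fetch2 import ParameterError

    def reject_globs(decodedurl, url):
        # e.g. SRC_URI entries like "file://configs/*" are now refused; use
        # "file://configs" (a directory, whose contents bitbake checksums) instead.
        if "*" in decodedurl:
            raise ParameterError(
                "file:// urls using globbing are no longer supported. "
                "Please place the files in a directory and reference that instead.",
                url)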
diff --git a/poky/bitbake/lib/bb/fetch2/osc.py b/poky/bitbake/lib/bb/fetch2/osc.py
index 8f091ef..3a6cd29 100644
--- a/poky/bitbake/lib/bb/fetch2/osc.py
+++ b/poky/bitbake/lib/bb/fetch2/osc.py
@@ -8,12 +8,15 @@
"""
import logging
+import os
import bb
from bb.fetch2 import FetchMethod
from bb.fetch2 import FetchError
from bb.fetch2 import MissingParameterError
from bb.fetch2 import runfetchcmd
+logger = logging.getLogger(__name__)
+
class Osc(FetchMethod):
"""Class to fetch a module or modules from Opensuse build server
repositories."""
diff --git a/poky/bitbake/lib/bb/fetch2/ssh.py b/poky/bitbake/lib/bb/fetch2/ssh.py
index 5e982ec..2c8557e 100644
--- a/poky/bitbake/lib/bb/fetch2/ssh.py
+++ b/poky/bitbake/lib/bb/fetch2/ssh.py
@@ -31,8 +31,7 @@
#
import re, os
-from bb.fetch2 import FetchMethod
-from bb.fetch2 import runfetchcmd
+from bb.fetch2 import check_network_access, FetchMethod, ParameterError, runfetchcmd
__pattern__ = re.compile(r'''
@@ -65,7 +64,7 @@
def urldata_init(self, urldata, d):
if 'protocol' in urldata.parm and urldata.parm['protocol'] == 'git':
- raise bb.fetch2.ParameterError(
+ raise ParameterError(
"Invalid protocol - if you wish to fetch from a git " +
"repository using ssh, you need to use " +
"git:// prefix with protocol=ssh", urldata.url)
@@ -105,7 +104,7 @@
dldir
)
- bb.fetch2.check_network_access(d, cmd, urldata.url)
+ check_network_access(d, cmd, urldata.url)
runfetchcmd(cmd, d)
diff --git a/poky/bitbake/lib/bb/fetch2/wget.py b/poky/bitbake/lib/bb/fetch2/wget.py
index f7d1de2..e6d9f52 100644
--- a/poky/bitbake/lib/bb/fetch2/wget.py
+++ b/poky/bitbake/lib/bb/fetch2/wget.py
@@ -208,10 +208,7 @@
fetch.connection_cache.remove_connection(h.host, h.port)
raise urllib.error.URLError(err)
else:
- try:
- r = h.getresponse(buffering=True)
- except TypeError: # buffering kw not supported
- r = h.getresponse()
+ r = h.getresponse()
# Pick apart the HTTPResponse object to get the addinfourl
# object initialized properly.
diff --git a/poky/bitbake/lib/bb/main.py b/poky/bitbake/lib/bb/main.py
index af2880f..7990195 100755
--- a/poky/bitbake/lib/bb/main.py
+++ b/poky/bitbake/lib/bb/main.py
@@ -344,8 +344,6 @@
except:
pass
- configuration.setConfigParameters(configParams)
-
if configParams.server_only and configParams.remote_server:
raise BBMainException("FATAL: The '--server-only' option conflicts with %s.\n" %
("the BBSERVER environment variable" if "BBSERVER" in os.environ \
@@ -357,13 +355,13 @@
if "BBDEBUG" in os.environ:
level = int(os.environ["BBDEBUG"])
- if level > configuration.debug:
- configuration.debug = level
+ if level > configParams.debug:
+ configParams.debug = level
- bb.msg.init_msgconfig(configParams.verbose, configuration.debug,
- configuration.debug_domains)
+ bb.msg.init_msgconfig(configParams.verbose, configParams.debug,
+ configParams.debug_domains)
- server_connection, ui_module = setup_bitbake(configParams, configuration)
+ server_connection, ui_module = setup_bitbake(configParams)
# No server connection
if server_connection is None:
if configParams.status_only:
@@ -390,7 +388,7 @@
return 1
-def setup_bitbake(configParams, configuration, extrafeatures=None):
+def setup_bitbake(configParams, extrafeatures=None):
# Ensure logging messages get sent to the UI as events
handler = bb.event.LogHandler()
if not configParams.status_only:
@@ -431,11 +429,11 @@
logger.info("bitbake server is not running.")
lock.close()
return None, None
- # we start a server with a given configuration
+ # we start a server with a given featureset
logger.info("Starting bitbake server...")
# Clear the event queue since we already displayed messages
bb.event.ui_queue = []
- server = bb.server.process.BitBakeServer(lock, sockname, configuration, featureset)
+ server = bb.server.process.BitBakeServer(lock, sockname, featureset, configParams.server_timeout, configParams.xmlrpcinterface)
else:
logger.info("Reconnecting to bitbake server...")
diff --git a/poky/bitbake/lib/bb/msg.py b/poky/bitbake/lib/bb/msg.py
index 2d88c4e..6f17b6a 100644
--- a/poky/bitbake/lib/bb/msg.py
+++ b/poky/bitbake/lib/bb/msg.py
@@ -14,6 +14,7 @@
import copy
import logging
import logging.config
+import os
from itertools import groupby
import bb
import bb.event
@@ -146,18 +147,12 @@
#
loggerDefaultLogLevel = BBLogFormatter.NOTE
-loggerDefaultVerbose = False
-loggerVerboseLogs = False
loggerDefaultDomains = {}
def init_msgconfig(verbose, debug, debug_domains=None):
"""
Set default verbosity and debug levels config the logger
"""
- bb.msg.loggerDefaultVerbose = verbose
- if verbose:
- bb.msg.loggerVerboseLogs = True
-
if debug:
bb.msg.loggerDefaultLogLevel = BBLogFormatter.DEBUG - debug + 1
elif verbose:
diff --git a/poky/bitbake/lib/bb/namedtuple_with_abc.py b/poky/bitbake/lib/bb/namedtuple_with_abc.py
index 646aed6..e46dbf0 100644
--- a/poky/bitbake/lib/bb/namedtuple_with_abc.py
+++ b/poky/bitbake/lib/bb/namedtuple_with_abc.py
@@ -61,17 +61,9 @@
return ABCMeta.__new__(mcls, name, bases, namespace)
-exec(
- # Python 2.x metaclass declaration syntax
- """class _NamedTupleABC(object):
- '''The abstract base class + mix-in for named tuples.'''
- __metaclass__ = _NamedTupleABCMeta
- _fields = abstractproperty()""" if version_info[0] < 3 else
- # Python 3.x metaclass declaration syntax
- """class _NamedTupleABC(metaclass=_NamedTupleABCMeta):
- '''The abstract base class + mix-in for named tuples.'''
- _fields = abstractproperty()"""
-)
+class _NamedTupleABC(metaclass=_NamedTupleABCMeta):
+ '''The abstract base class + mix-in for named tuples.'''
+ _fields = abstractproperty()
_namedtuple.abc = _NamedTupleABC
diff --git a/poky/bitbake/lib/bb/persist_data.py b/poky/bitbake/lib/bb/persist_data.py
index 7357ab2..5f4fbe3 100644
--- a/poky/bitbake/lib/bb/persist_data.py
+++ b/poky/bitbake/lib/bb/persist_data.py
@@ -12,14 +12,14 @@
#
import collections
+import contextlib
+import functools
import logging
import os.path
+import sqlite3
import sys
import warnings
-from bb.compat import total_ordering
from collections import Mapping
-import sqlite3
-import contextlib
sqlversion = sqlite3.sqlite_version_info
if sqlversion[0] < 3 or (sqlversion[0] == 3 and sqlversion[1] < 3):
@@ -28,7 +28,7 @@
logger = logging.getLogger("BitBake.PersistData")
-@total_ordering
+@functools.total_ordering
class SQLTable(collections.MutableMapping):
class _Decorators(object):
@staticmethod
diff --git a/poky/bitbake/lib/bb/process.py b/poky/bitbake/lib/bb/process.py
index f36c929..7c3995c 100644
--- a/poky/bitbake/lib/bb/process.py
+++ b/poky/bitbake/lib/bb/process.py
@@ -7,6 +7,7 @@
import subprocess
import errno
import select
+import bb
logger = logging.getLogger('BitBake.Process')
diff --git a/poky/bitbake/lib/bb/runqueue.py b/poky/bitbake/lib/bb/runqueue.py
index 02a261e..28bdadb 100644
--- a/poky/bitbake/lib/bb/runqueue.py
+++ b/poky/bitbake/lib/bb/runqueue.py
@@ -376,7 +376,7 @@
self.stampwhitelist = cfgData.getVar("BB_STAMP_WHITELIST") or ""
self.multi_provider_whitelist = (cfgData.getVar("MULTI_PROVIDER_WHITELIST") or "").split()
- self.setscenewhitelist = get_setscene_enforce_whitelist(cfgData)
+ self.setscenewhitelist = get_setscene_enforce_whitelist(cfgData, targets)
self.setscenewhitelist_checked = False
self.setscene_enforce = (cfgData.getVar('BB_SETSCENE_ENFORCE') == "1")
self.init_progress_reporter = bb.progress.DummyMultiStageProcessProgressReporter()
@@ -1263,8 +1263,8 @@
"fakerootnoenv" : self.rqdata.dataCaches[mc].fakerootnoenv,
"sigdata" : bb.parse.siggen.get_taskdata(),
"logdefaultlevel" : bb.msg.loggerDefaultLogLevel,
- "logdefaultverbose" : bb.msg.loggerDefaultVerbose,
- "logdefaultverboselogs" : bb.msg.loggerVerboseLogs,
+ "build_verbose_shell" : self.cooker.configuration.build_verbose_shell,
+ "build_verbose_stdout" : self.cooker.configuration.build_verbose_stdout,
"logdefaultdomain" : bb.msg.loggerDefaultDomains,
"prhost" : self.cooker.prhost,
"buildname" : self.cfgData.getVar("BUILDNAME"),
@@ -2999,16 +2999,15 @@
print("Warning, worker left partial message: %s" % self.queue)
self.input.close()
-def get_setscene_enforce_whitelist(d):
+def get_setscene_enforce_whitelist(d, targets):
if d.getVar('BB_SETSCENE_ENFORCE') != '1':
return None
whitelist = (d.getVar("BB_SETSCENE_ENFORCE_WHITELIST") or "").split()
outlist = []
for item in whitelist[:]:
if item.startswith('%:'):
- for target in sys.argv[1:]:
- if not target.startswith('-'):
- outlist.append(target.split(':')[0] + ':' + item.split(':')[1])
+ for (mc, target, task, fn) in targets:
+ outlist.append(target + ':' + item.split(':')[1])
else:
outlist.append(item)
return outlist
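The whitelist expansion above now derives '%:' entries from the parsed build targets rather than from sys.argv. A worked example of what the loop produces, with the whitelist and target values invented for illustration:

    whitelist = ["%:do_patch", "quilt-native:do_populate_sysroot"]
    targets = [("", "core-image-minimal", "do_build", "/path/to/core-image-minimal.bb")]

    outlist = []
    for item in whitelist:
        if item.startswith('%:'):
            for (mc, target, task, fn) in targets:
                outlist.append(target + ':' + item.split(':')[1])
        else:
            outlist.append(item)

    print(outlist)  # ['core-image-minimal:do_patch', 'quilt-native:do_populate_sysroot']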
diff --git a/poky/bitbake/lib/bb/server/process.py b/poky/bitbake/lib/bb/server/process.py
index 65e1eab..c7cb34f 100644
--- a/poky/bitbake/lib/bb/server/process.py
+++ b/poky/bitbake/lib/bb/server/process.py
@@ -25,6 +25,7 @@
import errno
import re
import datetime
+import pickle
import bb.server.xmlrpcserver
from bb import daemonize
from multiprocessing import queues
@@ -34,11 +35,15 @@
class ProcessTimeout(SystemExit):
pass
+def serverlog(msg):
+ print(str(os.getpid()) + " " + datetime.datetime.now().strftime('%H:%M:%S.%f') + " " + msg)
+ sys.stdout.flush()
+
class ProcessServer():
profile_filename = "profile.log"
profile_processed_filename = "profile.log.processed"
- def __init__(self, lock, sock, sockname, server_timeout, xmlrpcinterface):
+ def __init__(self, lock, lockname, sock, sockname, server_timeout, xmlrpcinterface):
self.command_channel = False
self.command_channel_reply = False
self.quit = False
@@ -54,10 +59,12 @@
self._idlefuns = {}
self.bitbake_lock = lock
+ self.bitbake_lock_name = lockname
self.sock = sock
self.sockname = sockname
self.server_timeout = server_timeout
+ self.timeout = self.server_timeout
self.xmlrpcinterface = xmlrpcinterface
def register_idle_function(self, function, data):
@@ -70,22 +77,7 @@
if self.xmlrpcinterface[0]:
self.xmlrpc = bb.server.xmlrpcserver.BitBakeXMLRPCServer(self.xmlrpcinterface, self.cooker, self)
- print("Bitbake XMLRPC server address: %s, server port: %s" % (self.xmlrpc.host, self.xmlrpc.port))
-
- heartbeat_event = self.cooker.data.getVar('BB_HEARTBEAT_EVENT')
- if heartbeat_event:
- try:
- self.heartbeat_seconds = float(heartbeat_event)
- except:
- bb.warn('Ignoring invalid BB_HEARTBEAT_EVENT=%s, must be a float specifying seconds.' % heartbeat_event)
-
- self.timeout = self.server_timeout or self.cooker.data.getVar('BB_SERVER_TIMEOUT')
- try:
- if self.timeout:
- self.timeout = float(self.timeout)
- except:
- bb.warn('Ignoring invalid BB_SERVER_TIMEOUT=%s, must be a float specifying seconds.' % self.timeout)
-
+ serverlog("Bitbake XMLRPC server address: %s, server port: %s" % (self.xmlrpc.host, self.xmlrpc.port))
try:
self.bitbake_lock.seek(0)
@@ -96,7 +88,7 @@
self.bitbake_lock.write("%s\n" % (os.getpid()))
self.bitbake_lock.flush()
except Exception as e:
- print("Error writing to lock file: %s" % str(e))
+ serverlog("Error writing to lock file: %s" % str(e))
pass
if self.cooker.configuration.profile:
@@ -110,7 +102,7 @@
prof.dump_stats("profile.log")
bb.utils.process_profilelog("profile.log")
- print("Raw profiling information saved to profile.log and processed statistics to profile.log.processed")
+ serverlog("Raw profiling information saved to profile.log and processed statistics to profile.log.processed")
else:
ret = self.main()
@@ -129,10 +121,11 @@
fds = [self.sock]
if self.xmlrpc:
fds.append(self.xmlrpc)
- print("Entering server connection loop")
+ seendata = False
+ serverlog("Entering server connection loop")
def disconnect_client(self, fds):
- print("Disconnecting Client")
+ serverlog("Disconnecting Client")
if self.controllersock:
fds.remove(self.controllersock)
self.controllersock.close()
@@ -150,12 +143,12 @@
self.haveui = False
ready = select.select(fds,[],[],0)[0]
if newconnections:
- print("Starting new client")
+ serverlog("Starting new client")
conn = newconnections.pop(-1)
fds.append(conn)
self.controllersock = conn
elif self.timeout is None and not ready:
- print("No timeout, exiting.")
+ serverlog("No timeout, exiting.")
self.quit = True
self.lastui = time.time()
@@ -164,17 +157,17 @@
while select.select([self.sock],[],[],0)[0]:
controllersock, address = self.sock.accept()
if self.controllersock:
- print("Queuing %s (%s)" % (str(ready), str(newconnections)))
+ serverlog("Queuing %s (%s)" % (str(ready), str(newconnections)))
newconnections.append(controllersock)
else:
- print("Accepting %s (%s)" % (str(ready), str(newconnections)))
+ serverlog("Accepting %s (%s)" % (str(ready), str(newconnections)))
self.controllersock = controllersock
fds.append(controllersock)
if self.controllersock in ready:
try:
- print("Processing Client")
+ serverlog("Processing Client")
ui_fds = recvfds(self.controllersock, 3)
- print("Connecting Client")
+ serverlog("Connecting Client")
# Where to write events to
writer = ConnectionWriter(ui_fds[0])
@@ -198,14 +191,14 @@
if not self.timeout == -1.0 and not self.haveui and self.timeout and \
(self.lastui + self.timeout) < time.time():
- print("Server timeout, exiting.")
+ serverlog("Server timeout, exiting.")
self.quit = True
# If we don't see a UI connection within maxuiwait, it's unlikely we're going to see
# one. We have had issues with processes hanging indefinitely so timing out UI-less
# servers is useful.
if not self.hadanyui and not self.xmlrpc and not self.timeout and (self.lastui + self.maxuiwait) < time.time():
- print("No UI connection within max timeout, exiting to avoid infinite loop.")
+ serverlog("No UI connection within max timeout, exiting to avoid infinite loop.")
self.quit = True
if self.command_channel in ready:
@@ -220,17 +213,37 @@
self.quit = True
continue
try:
- print("Running command %s" % command)
+ serverlog("Running command %s" % command)
self.command_channel_reply.send(self.cooker.command.runCommand(command))
+ serverlog("Command Completed")
except Exception as e:
logger.exception('Exception in server main event loop running command %s (%s)' % (command, str(e)))
if self.xmlrpc in ready:
self.xmlrpc.handle_requests()
+ if not seendata and hasattr(self.cooker, "data"):
+ heartbeat_event = self.cooker.data.getVar('BB_HEARTBEAT_EVENT')
+ if heartbeat_event:
+ try:
+ self.heartbeat_seconds = float(heartbeat_event)
+ except:
+ bb.warn('Ignoring invalid BB_HEARTBEAT_EVENT=%s, must be a float specifying seconds.' % heartbeat_event)
+
+ self.timeout = self.server_timeout or self.cooker.data.getVar('BB_SERVER_TIMEOUT')
+ try:
+ if self.timeout:
+ self.timeout = float(self.timeout)
+ except:
+ bb.warn('Ignoring invalid BB_SERVER_TIMEOUT=%s, must be a float specifying seconds.' % self.timeout)
+ seendata = True
+
ready = self.idle_commands(.1, fds)
- print("Exiting")
+ if len(threading.enumerate()) != 1:
+ serverlog("More than one thread left?: " + str(threading.enumerate()))
+
+ serverlog("Exiting")
# Remove the socket file so we don't get any more connections to avoid races
try:
os.unlink(self.sockname)
@@ -253,39 +266,67 @@
# Finally release the lockfile but warn about other processes holding it open
lock = self.bitbake_lock
- lockfile = lock.name
+ lockfile = self.bitbake_lock_name
+
+ def get_lock_contents(lockfile):
+ try:
+ with open(lockfile, "r") as f:
+ return f.readlines()
+ except FileNotFoundError:
+ return None
+
+ lockcontents = get_lock_contents(lockfile)
+ serverlog("Original lockfile contents: " + str(lockcontents))
+
lock.close()
lock = None
while not lock:
- with bb.utils.timeout(3):
- lock = bb.utils.lockfile(lockfile, shared=False, retry=False, block=True)
- if lock:
- # We hold the lock so we can remove the file (hide stale pid data)
- # via unlockfile.
- bb.utils.unlockfile(lock)
- return
-
+ i = 0
+ lock = None
+ while not lock and i < 30:
+ lock = bb.utils.lockfile(lockfile, shared=False, retry=False, block=False)
if not lock:
- # Some systems may not have lsof available
- procs = None
+ newlockcontents = get_lock_contents(lockfile)
+ if newlockcontents != lockcontents:
+ # A new server was started, the lockfile contents changed, we can exit
+ serverlog("Lockfile now contains different contents, exiting: " + str(newlockcontents))
+ return
+ time.sleep(0.1)
+ i += 1
+ if lock:
+ # We hold the lock so we can remove the file (hide stale pid data)
+ # via unlockfile.
+ bb.utils.unlockfile(lock)
+ serverlog("Exiting as we could obtain the lock")
+ return
+
+ if not lock:
+ # Some systems may not have lsof available
+ procs = None
+ try:
+ procs = subprocess.check_output(["lsof", '-w', lockfile], stderr=subprocess.STDOUT)
+ except subprocess.CalledProcessError:
+ # File was deleted?
+ continue
+ except OSError as e:
+ if e.errno != errno.ENOENT:
+ raise
+ if procs is None:
+ # Fall back to fuser if lsof is unavailable
try:
- procs = subprocess.check_output(["lsof", '-w', lockfile], stderr=subprocess.STDOUT)
+ procs = subprocess.check_output(["fuser", '-v', lockfile], stderr=subprocess.STDOUT)
+ except subprocess.CalledProcessError:
+ # File was deleted?
+ continue
except OSError as e:
if e.errno != errno.ENOENT:
raise
- if procs is None:
- # Fall back to fuser if lsof is unavailable
- try:
- procs = subprocess.check_output(["fuser", '-v', lockfile], stderr=subprocess.STDOUT)
- except OSError as e:
- if e.errno != errno.ENOENT:
- raise
- msg = "Delaying shutdown due to active processes which appear to be holding bitbake.lock"
- if procs:
- msg += ":\n%s" % str(procs)
- print(msg)
+ msg = "Delaying shutdown due to active processes which appear to be holding bitbake.lock"
+ if procs:
+ msg += ":\n%s" % str(procs.decode("utf-8"))
+ serverlog(msg)
def idle_commands(self, delay, fds=None):
nextsleep = delay
@@ -323,8 +364,9 @@
self.next_heartbeat += self.heartbeat_seconds
if self.next_heartbeat <= now:
self.next_heartbeat = now + self.heartbeat_seconds
- heartbeat = bb.event.HeartbeatEvent(now)
- bb.event.fire(heartbeat, self.cooker.data)
+ if hasattr(self.cooker, "data"):
+ heartbeat = bb.event.HeartbeatEvent(now)
+ bb.event.fire(heartbeat, self.cooker.data)
if nextsleep and now + nextsleep > self.next_heartbeat:
# Shorten timeout so that we wake up in time for
# the heartbeat.
@@ -353,7 +395,12 @@
logger.info("No reply from server in 30s")
if not self.recv.poll(30):
raise ProcessTimeout("Timeout while waiting for a reply from the bitbake server (60s)")
- return self.recv.get()
+ ret, exc = self.recv.get()
+ # Should probably turn all exceptions in exc back into exceptions?
+ # For now, at least handle BBHandledException
+ if exc and ("BBHandledException" in exc or "SystemExit" in exc):
+ raise bb.BBHandledException()
+ return ret, exc
def updateFeatureSet(self, featureset):
_, error = self.runCommand(["setFeatures", featureset])
@@ -386,39 +433,26 @@
self.connection.recv.close()
return
+start_log_format = '--- Starting bitbake server pid %s at %s ---'
+start_log_datetime_format = '%Y-%m-%d %H:%M:%S.%f'
+
class BitBakeServer(object):
- start_log_format = '--- Starting bitbake server pid %s at %s ---'
- start_log_datetime_format = '%Y-%m-%d %H:%M:%S.%f'
- def __init__(self, lock, sockname, configuration, featureset):
+ def __init__(self, lock, sockname, featureset, server_timeout, xmlrpcinterface):
- self.configuration = configuration
+ self.server_timeout = server_timeout
+ self.xmlrpcinterface = xmlrpcinterface
self.featureset = featureset
self.sockname = sockname
self.bitbake_lock = lock
self.readypipe, self.readypipein = os.pipe()
- # Create server control socket
- if os.path.exists(sockname):
- os.unlink(sockname)
-
# Place the log in the builddirectory alongside the lock file
logfile = os.path.join(os.path.dirname(self.bitbake_lock.name), "bitbake-cookerdaemon.log")
+ self.logfile = logfile
- self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
- # AF_UNIX has path length issues so chdir here to workaround
- cwd = os.getcwd()
- try:
- os.chdir(os.path.dirname(sockname))
- self.sock.bind(os.path.basename(sockname))
- finally:
- os.chdir(cwd)
- self.sock.listen(1)
-
- os.set_inheritable(self.sock.fileno(), True)
startdatetime = datetime.datetime.now()
bb.daemonize.createDaemon(self._startServer, logfile)
- self.sock.close()
self.bitbake_lock.close()
os.close(self.readypipein)
@@ -437,7 +471,7 @@
ready.close()
bb.error("Unable to start bitbake server (%s)" % str(r))
if os.path.exists(logfile):
- logstart_re = re.compile(self.start_log_format % ('([0-9]+)', '([0-9-]+ [0-9:.]+)'))
+ logstart_re = re.compile(start_log_format % ('([0-9]+)', '([0-9-]+ [0-9:.]+)'))
started = False
lines = []
lastlines = []
@@ -447,9 +481,9 @@
lines.append(line)
else:
lastlines.append(line)
- res = logstart_re.match(line.rstrip())
+ res = logstart_re.search(line.rstrip())
if res:
- ldatetime = datetime.datetime.strptime(res.group(2), self.start_log_datetime_format)
+ ldatetime = datetime.datetime.strptime(res.group(2), start_log_datetime_format)
if ldatetime >= startdatetime:
started = True
lines.append(line)
@@ -470,28 +504,53 @@
ready.close()
def _startServer(self):
- print(self.start_log_format % (os.getpid(), datetime.datetime.now().strftime(self.start_log_datetime_format)))
- sys.stdout.flush()
+ os.close(self.readypipe)
+ os.set_inheritable(self.bitbake_lock.fileno(), True)
+ os.set_inheritable(self.readypipein, True)
+ serverscript = os.path.realpath(os.path.dirname(__file__) + "/../../../bin/bitbake-server")
+ os.execl(sys.executable, "bitbake-server", serverscript, "decafbad", str(self.bitbake_lock.fileno()), str(self.readypipein), self.logfile, self.bitbake_lock.name, self.sockname, str(self.server_timeout), str(self.xmlrpcinterface[0]), str(self.xmlrpcinterface[1]))
+def execServer(lockfd, readypipeinfd, lockname, sockname, server_timeout, xmlrpcinterface):
+
+ import bb.cookerdata
+ import bb.cooker
+
+ serverlog(start_log_format % (os.getpid(), datetime.datetime.now().strftime(start_log_datetime_format)))
+
+ try:
+ bitbake_lock = os.fdopen(lockfd, "w")
+
+ # Create server control socket
+ if os.path.exists(sockname):
+ os.unlink(sockname)
+
+ sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
+ # AF_UNIX has path length issues so chdir here to work around them
+ cwd = os.getcwd()
try:
- server = ProcessServer(self.bitbake_lock, self.sock, self.sockname, self.configuration.server_timeout, self.configuration.xmlrpcinterface)
- os.close(self.readypipe)
- writer = ConnectionWriter(self.readypipein)
- try:
- self.cooker = bb.cooker.BBCooker(self.configuration, self.featureset, server.register_idle_function)
- except bb.BBHandledException:
- return None
- writer.send("r")
- writer.close()
- server.cooker = self.cooker
- print("Started bitbake server pid %d" % os.getpid())
- sys.stdout.flush()
-
- server.run()
+ os.chdir(os.path.dirname(sockname))
+ sock.bind(os.path.basename(sockname))
finally:
- # Flush any ,essages/errors to the logfile before exit
- sys.stdout.flush()
- sys.stderr.flush()
+ os.chdir(cwd)
+ sock.listen(1)
+
+ server = ProcessServer(bitbake_lock, lockname, sock, sockname, server_timeout, xmlrpcinterface)
+ writer = ConnectionWriter(readypipeinfd)
+ try:
+ featureset = []
+ cooker = bb.cooker.BBCooker(featureset, server.register_idle_function)
+ except bb.BBHandledException:
+ return None
+ writer.send("r")
+ writer.close()
+ server.cooker = cooker
+ serverlog("Started bitbake server pid %d" % os.getpid())
+
+ server.run()
+ finally:
+ # Flush any messages/errors to the logfile before exit
+ sys.stdout.flush()
+ sys.stderr.flush()
def connectProcessServer(sockname, featureset):
# Connect to socket
diff --git a/poky/bitbake/lib/bb/siggen.py b/poky/bitbake/lib/bb/siggen.py
index 4c63b0b..ad49d1e 100644
--- a/poky/bitbake/lib/bb/siggen.py
+++ b/poky/bitbake/lib/bb/siggen.py
@@ -752,7 +752,7 @@
_, mc, a = a.split(":", 2)
b = a.rsplit("/", 2)[1] + '/' + a.rsplit("/", 2)[2]
if a.startswith("virtual:"):
- b = b + ":" + a.rsplit(":", 1)[0]
+ b = b + ":" + a.rsplit(":", 2)[0]
if mc:
b = b + ":mc:" + mc
return b
diff --git a/poky/bitbake/lib/bb/tests/cow.py b/poky/bitbake/lib/bb/tests/cow.py
index bf6e79f..7514264 100644
--- a/poky/bitbake/lib/bb/tests/cow.py
+++ b/poky/bitbake/lib/bb/tests/cow.py
@@ -4,9 +4,17 @@
# SPDX-License-Identifier: GPL-2.0-only
#
# Copyright 2006 Holger Freyther <freyther@handhelds.org>
+# Copyright (C) 2020 Agilent Technologies, Inc.
#
+import io
+import re
+import sys
import unittest
+import contextlib
+import collections
+
+from bb.COW import COWDictBase, COWSetBase, COWDictMeta, COWSetMeta
class COWTestCase(unittest.TestCase):
@@ -14,11 +22,61 @@
Test case for the COW module from mithro
"""
+ def setUp(self):
+ self._track_warnings = False
+ self._warning_file = io.StringIO()
+ self._unhandled_warnings = collections.deque()
+ COWDictBase.__warn__ = self._warning_file
+
+ def tearDown(self):
+ COWDictBase.__warn__ = sys.stderr
+ if self._track_warnings:
+ self._checkAllWarningsRead()
+
+ def trackWarnings(self):
+ self._track_warnings = True
+
+ def _collectWarnings(self):
+ self._warning_file.seek(0)
+ for warning in self._warning_file:
+ self._unhandled_warnings.append(warning.rstrip("\n"))
+ self._warning_file.truncate(0)
+ self._warning_file.seek(0)
+
+ def _checkAllWarningsRead(self):
+ self._collectWarnings()
+ self.assertSequenceEqual(self._unhandled_warnings, [])
+
+ @contextlib.contextmanager
+ def checkReportsWarning(self, expected_warning):
+ self._checkAllWarningsRead()
+ yield
+ self._collectWarnings()
+ warning = self._unhandled_warnings.popleft()
+ self.assertEqual(warning, expected_warning)
+
+ def checkStrOutput(self, obj, expected_levels, expected_keys):
+ if obj.__class__ is COWDictMeta:
+ expected_class_name = "COWDict"
+ elif obj.__class__ is COWSetMeta:
+ expected_class_name = "COWSet"
+ else:
+ self.fail("obj is of unknown type {0}".format(type(obj)))
+ s = str(obj)
+ regex = re.compile(r"<(\w+) Level: (\d+) Current Keys: (\d+)>")
+ match = regex.match(s)
+ self.assertIsNotNone(match, "bad str output: '{0}'".format(s))
+ class_name = match.group(1)
+ self.assertEqual(class_name, expected_class_name)
+ levels = int(match.group(2))
+ self.assertEqual(levels, expected_levels, "wrong # levels in str: '{0}'".format(s))
+ keys = int(match.group(3))
+ self.assertEqual(keys, expected_keys, "wrong # keys in str: '{0}'".format(s))
+
def testGetSet(self):
"""
Test and set
"""
- from bb.COW import COWDictBase
a = COWDictBase.copy()
self.assertEqual(False, 'a' in a)
@@ -27,16 +85,14 @@
a['b'] = 'b'
self.assertEqual(True, 'a' in a)
self.assertEqual(True, 'b' in a)
- self.assertEqual('a', a['a'] )
- self.assertEqual('b', a['b'] )
+ self.assertEqual('a', a['a'])
+ self.assertEqual('b', a['b'])
def testCopyCopy(self):
"""
Test the copy of copies
"""
- from bb.COW import COWDictBase
-
# create two COW dict 'instances'
b = COWDictBase.copy()
c = COWDictBase.copy()
@@ -94,30 +150,168 @@
self.assertEqual(False, 'e' in b_2)
def testCow(self):
- from bb.COW import COWDictBase
+ self.trackWarnings()
+
c = COWDictBase.copy()
c['123'] = 1027
c['other'] = 4711
- c['d'] = { 'abc' : 10, 'bcd' : 20 }
+ c['d'] = {'abc': 10, 'bcd': 20}
copy = c.copy()
self.assertEqual(1027, c['123'])
self.assertEqual(4711, c['other'])
- self.assertEqual({'abc':10, 'bcd':20}, c['d'])
+ self.assertEqual({'abc': 10, 'bcd': 20}, c['d'])
self.assertEqual(1027, copy['123'])
self.assertEqual(4711, copy['other'])
- self.assertEqual({'abc':10, 'bcd':20}, copy['d'])
+ with self.checkReportsWarning("Warning: Doing a copy because d is a mutable type."):
+ self.assertEqual({'abc': 10, 'bcd': 20}, copy['d'])
# cow it now
copy['123'] = 1028
copy['other'] = 4712
copy['d']['abc'] = 20
-
self.assertEqual(1027, c['123'])
self.assertEqual(4711, c['other'])
- self.assertEqual({'abc':10, 'bcd':20}, c['d'])
+ self.assertEqual({'abc': 10, 'bcd': 20}, c['d'])
self.assertEqual(1028, copy['123'])
self.assertEqual(4712, copy['other'])
- self.assertEqual({'abc':20, 'bcd':20}, copy['d'])
+ self.assertEqual({'abc': 20, 'bcd': 20}, copy['d'])
+
+ def testOriginalTestSuite(self):
+ # This test suite is a port of the original one from COW.py
+ self.trackWarnings()
+
+ a = COWDictBase.copy()
+ self.checkStrOutput(a, 1, 0)
+
+ a['a'] = 'a'
+ a['b'] = 'b'
+ a['dict'] = {}
+ self.checkStrOutput(a, 1, 4) # 4th member is dict__mutable__
+
+ b = a.copy()
+ self.checkStrOutput(b, 2, 0)
+ b['c'] = 'b'
+ self.checkStrOutput(b, 2, 1)
+
+ with self.checkReportsWarning("Warning: If you aren't going to change any of the values call with True."):
+ self.assertListEqual(list(a.iteritems()),
+ [('a', 'a'),
+ ('b', 'b'),
+ ('dict', {})
+ ])
+
+ with self.checkReportsWarning("Warning: If you aren't going to change any of the values call with True."):
+ b_gen = b.iteritems()
+ self.assertTupleEqual(next(b_gen), ('a', 'a'))
+ self.assertTupleEqual(next(b_gen), ('b', 'b'))
+ self.assertTupleEqual(next(b_gen), ('c', 'b'))
+ with self.checkReportsWarning("Warning: Doing a copy because dict is a mutable type."):
+ self.assertTupleEqual(next(b_gen), ('dict', {}))
+ with self.assertRaises(StopIteration):
+ next(b_gen)
+
+ b['dict']['a'] = 'b'
+ b['a'] = 'c'
+
+ self.checkStrOutput(a, 1, 4)
+ self.checkStrOutput(b, 2, 3)
+
+ with self.checkReportsWarning("Warning: If you aren't going to change any of the values call with True."):
+ self.assertListEqual(list(a.iteritems()),
+ [('a', 'a'),
+ ('b', 'b'),
+ ('dict', {})
+ ])
+
+ with self.checkReportsWarning("Warning: If you aren't going to change any of the values call with True."):
+ b_gen = b.iteritems()
+ self.assertTupleEqual(next(b_gen), ('a', 'c'))
+ self.assertTupleEqual(next(b_gen), ('b', 'b'))
+ self.assertTupleEqual(next(b_gen), ('c', 'b'))
+ self.assertTupleEqual(next(b_gen), ('dict', {'a': 'b'}))
+ with self.assertRaises(StopIteration):
+ next(b_gen)
+
+ with self.assertRaises(KeyError):
+ print(b["dict2"])
+
+ a['set'] = COWSetBase()
+ a['set'].add("o1")
+ a['set'].add("o1")
+ a['set'].add("o2")
+ self.assertSetEqual(set(a['set'].itervalues()), {"o1", "o2"})
+ self.assertSetEqual(set(b['set'].itervalues()), {"o1", "o2"})
+
+ b['set'].add('o3')
+ self.assertSetEqual(set(a['set'].itervalues()), {"o1", "o2"})
+ self.assertSetEqual(set(b['set'].itervalues()), {"o1", "o2", "o3"})
+
+ a['set2'] = set()
+ a['set2'].add("o1")
+ a['set2'].add("o1")
+ a['set2'].add("o2")
+
+ # We don't expect 'a' to change anymore
+ def check_a():
+ with self.checkReportsWarning("Warning: If you aren't going to change any of the values call with True."):
+ a_gen = a.iteritems()
+ self.assertTupleEqual(next(a_gen), ('a', 'a'))
+ self.assertTupleEqual(next(a_gen), ('b', 'b'))
+ self.assertTupleEqual(next(a_gen), ('dict', {}))
+ self.assertTupleEqual(next(a_gen), ('set2', {'o1', 'o2'}))
+ a_sub_set = next(a_gen)
+ self.assertEqual(a_sub_set[0], 'set')
+ self.checkStrOutput(a_sub_set[1], 1, 2)
+ self.assertSetEqual(set(a_sub_set[1].itervalues()), {'o1', 'o2'})
+
+ check_a()
+
+ b_gen = b.iteritems(readonly=True)
+ self.assertTupleEqual(next(b_gen), ('a', 'c'))
+ self.assertTupleEqual(next(b_gen), ('b', 'b'))
+ self.assertTupleEqual(next(b_gen), ('c', 'b'))
+ self.assertTupleEqual(next(b_gen), ('dict', {'a': 'b'}))
+ self.assertTupleEqual(next(b_gen), ('set2', {'o1', 'o2'}))
+ b_sub_set = next(b_gen)
+ self.assertEqual(b_sub_set[0], 'set')
+ self.checkStrOutput(b_sub_set[1], 2, 1)
+ self.assertSetEqual(set(b_sub_set[1].itervalues()), {'o1', 'o2', 'o3'})
+
+ del b['b']
+ with self.assertRaises(KeyError):
+ print(b['b'])
+ self.assertFalse('b' in b)
+
+ check_a()
+
+ b.__revertitem__('b')
+ check_a()
+ self.assertEqual(b['b'], 'b')
+ self.assertTrue('b' in b)
+
+ b.__revertitem__('dict')
+ check_a()
+
+ b_gen = b.iteritems(readonly=True)
+ self.assertTupleEqual(next(b_gen), ('a', 'c'))
+ self.assertTupleEqual(next(b_gen), ('b', 'b'))
+ self.assertTupleEqual(next(b_gen), ('c', 'b'))
+ self.assertTupleEqual(next(b_gen), ('dict', {}))
+ self.assertTupleEqual(next(b_gen), ('set2', {'o1', 'o2'}))
+ b_sub_set = next(b_gen)
+ self.assertEqual(b_sub_set[0], 'set')
+ self.checkStrOutput(b_sub_set[1], 2, 1)
+ self.assertSetEqual(set(b_sub_set[1].itervalues()), {'o1', 'o2', 'o3'})
+
+ self.checkStrOutput(a, 1, 6)
+ self.checkStrOutput(b, 2, 3)
+
+ def testSetMethods(self):
+ s = COWSetBase()
+ with self.assertRaises(TypeError):
+ print(s.iteritems())
+ with self.assertRaises(TypeError):
+ print(s.iterkeys())
diff --git a/poky/bitbake/lib/bb/tests/data.py b/poky/bitbake/lib/bb/tests/data.py
index 5f19504..1d4a64b 100644
--- a/poky/bitbake/lib/bb/tests/data.py
+++ b/poky/bitbake/lib/bb/tests/data.py
@@ -12,6 +12,7 @@
import bb.data
import bb.parse
import logging
+import os
class LogRecord():
def __enter__(self):
diff --git a/poky/bitbake/lib/bb/tests/event.py b/poky/bitbake/lib/bb/tests/event.py
index 9229b63..9ca7e9b 100644
--- a/poky/bitbake/lib/bb/tests/event.py
+++ b/poky/bitbake/lib/bb/tests/event.py
@@ -6,17 +6,18 @@
# SPDX-License-Identifier: GPL-2.0-only
#
-import unittest
-import bb
-import logging
-import bb.compat
-import bb.event
+import collections
import importlib
+import logging
+import pickle
import threading
import time
-import pickle
+import unittest
from unittest.mock import Mock
from unittest.mock import call
+
+import bb
+import bb.event
from bb.msg import BBLogFormatter
@@ -75,7 +76,7 @@
def _create_test_handlers(self):
""" Method used to create a test handler ordered dictionary """
- test_handlers = bb.compat.OrderedDict()
+ test_handlers = collections.OrderedDict()
test_handlers["handler1"] = self._test_process.handler1
test_handlers["handler2"] = self._test_process.handler2
return test_handlers
@@ -96,7 +97,7 @@
def test_clean_class_handlers(self):
""" Test clean_class_handlers method """
- cleanDict = bb.compat.OrderedDict()
+ cleanDict = collections.OrderedDict()
self.assertEqual(cleanDict,
bb.event.clean_class_handlers())
diff --git a/poky/bitbake/lib/bb/tests/fetch.py b/poky/bitbake/lib/bb/tests/fetch.py
index 29c96b2..0ecf044 100644
--- a/poky/bitbake/lib/bb/tests/fetch.py
+++ b/poky/bitbake/lib/bb/tests/fetch.py
@@ -602,8 +602,8 @@
self.assertEqual(tree, ['a', 'dir/c'])
def test_local_wildcard(self):
- tree = self.fetchUnpack(['file://a', 'file://dir/*'])
- self.assertEqual(tree, ['a', 'dir/c', 'dir/d', 'dir/subdir/e'])
+ with self.assertRaises(bb.fetch2.ParameterError):
+ tree = self.fetchUnpack(['file://a', 'file://dir/*'])
def test_local_dir(self):
tree = self.fetchUnpack(['file://a', 'file://dir'])
@@ -1156,7 +1156,8 @@
("mtd-utils", "git://git.yoctoproject.org/mtd-utils.git", "ca39eb1d98e736109c64ff9c1aa2a6ecca222d8f", "")
: "1.5.0",
# version pattern "pkg_name-X.Y"
- ("presentproto", "git://anongit.freedesktop.org/git/xorg/proto/presentproto", "24f3a56e541b0a9e6c6ee76081f441221a120ef9", "")
+ # mirror of git://anongit.freedesktop.org/git/xorg/proto/presentproto since network issues interfered with testing
+ ("presentproto", "git://git.yoctoproject.org/bbfetchtests-presentproto", "24f3a56e541b0a9e6c6ee76081f441221a120ef9", "")
: "1.0",
# version pattern "pkg_name-vX.Y.Z"
("dtc", "git://git.qemu.org/dtc.git", "65cc4d2748a2c2e6f27f1cf39e07a5dbabd80ebf", "")
@@ -1170,7 +1171,8 @@
("mobile-broadband-provider-info", "git://gitlab.gnome.org/GNOME/mobile-broadband-provider-info.git;protocol=https", "4ed19e11c2975105b71b956440acdb25d46a347d", "")
: "20120614",
# packages with a valid UPSTREAM_CHECK_GITTAGREGEX
- ("xf86-video-omap", "git://anongit.freedesktop.org/xorg/driver/xf86-video-omap", "ae0394e687f1a77e966cf72f895da91840dffb8f", "(?P<pver>(\d+\.(\d\.?)*))")
+ # mirror of git://anongit.freedesktop.org/xorg/driver/xf86-video-omap since network issues interfered with testing
+ ("xf86-video-omap", "git://git.yoctoproject.org/bbfetchtests-xf86-video-omap", "ae0394e687f1a77e966cf72f895da91840dffb8f", "(?P<pver>(\d+\.(\d\.?)*))")
: "0.4.3",
("build-appliance-image", "git://git.yoctoproject.org/poky", "b37dd451a52622d5b570183a81583cc34c2ff555", "(?P<pver>(([0-9][\.|_]?)+[0-9]))")
: "11.0.0",
@@ -1262,9 +1264,7 @@
class FetchCheckStatusTest(FetcherTest):
- test_wget_uris = ["http://www.cups.org/software/1.7.2/cups-1.7.2-source.tar.bz2",
- "http://www.cups.org/",
- "http://downloads.yoctoproject.org/releases/sato/sato-engine-0.1.tar.gz",
+ test_wget_uris = ["http://downloads.yoctoproject.org/releases/sato/sato-engine-0.1.tar.gz",
"http://downloads.yoctoproject.org/releases/sato/sato-engine-0.2.tar.gz",
"http://downloads.yoctoproject.org/releases/sato/sato-engine-0.3.tar.gz",
"https://yoctoproject.org/",
diff --git a/poky/bitbake/lib/bb/tinfoil.py b/poky/bitbake/lib/bb/tinfoil.py
index dccbe0e..2fb1bb7 100644
--- a/poky/bitbake/lib/bb/tinfoil.py
+++ b/poky/bitbake/lib/bb/tinfoil.py
@@ -22,7 +22,6 @@
import bb.utils
import bb.command
import bb.remotedata
-from bb.cookerdata import CookerConfiguration
from bb.main import setup_bitbake, BitBakeConfigParameters
import bb.fetch2
@@ -381,18 +380,13 @@
if not config_params:
config_params = TinfoilConfigParameters(config_only=config_only, quiet=quiet)
- cookerconfig = CookerConfiguration()
- cookerconfig.setConfigParameters(config_params)
-
if not config_only:
# Disable local loggers because the UI module is going to set up its own
for handler in self.localhandlers:
self.logger.handlers.remove(handler)
self.localhandlers = []
- self.server_connection, ui_module = setup_bitbake(config_params,
- cookerconfig,
- extrafeatures)
+ self.server_connection, ui_module = setup_bitbake(config_params, extrafeatures)
self.ui_module = ui_module
@@ -738,7 +732,7 @@
continue
if helper.eventHandler(event):
if isinstance(event, bb.build.TaskFailedSilent):
- logger.warning("Logfile for failed setscene task is %s" % event.logfile)
+ self.logger.warning("Logfile for failed setscene task is %s" % event.logfile)
elif isinstance(event, bb.build.TaskFailed):
bb.ui.knotty.print_event_log(event, includelogs, loglines, termfilter)
continue
@@ -812,18 +806,22 @@
prepare() has been called, or use a with... block when you create
the tinfoil object which will ensure that it gets called.
"""
- if self.server_connection:
- self.run_command('clientComplete')
- _server_connections.remove(self.server_connection)
- bb.event.ui_queue = []
- self.server_connection.terminate()
- self.server_connection = None
+ try:
+ if self.server_connection:
+ try:
+ self.run_command('clientComplete')
+ finally:
+ _server_connections.remove(self.server_connection)
+ bb.event.ui_queue = []
+ self.server_connection.terminate()
+ self.server_connection = None
- # Restore logging handlers to how it looked when we started
- if self.oldhandlers:
- for handler in self.logger.handlers:
- if handler not in self.oldhandlers:
- self.logger.handlers.remove(handler)
+ finally:
+ # Restore logging handlers to how it looked when we started
+ if self.oldhandlers:
+ for handler in self.logger.handlers:
+ if handler not in self.oldhandlers:
+ self.logger.handlers.remove(handler)
def _reconvert_type(self, obj, origtypename):
"""
diff --git a/poky/bitbake/lib/bb/ui/knotty.py b/poky/bitbake/lib/bb/ui/knotty.py
index 87e873d..a91e4fd 100644
--- a/poky/bitbake/lib/bb/ui/knotty.py
+++ b/poky/bitbake/lib/bb/ui/knotty.py
@@ -144,7 +144,7 @@
pass
if not cr:
try:
- cr = (env['LINES'], env['COLUMNS'])
+ cr = (os.environ['LINES'], os.environ['COLUMNS'])
except:
cr = (25, 80)
return cr
@@ -380,14 +380,27 @@
"bb.event.BuildBase", "bb.build.TaskStarted", "bb.build.TaskSucceeded", "bb.build.TaskFailedSilent",
"bb.build.TaskProgress", "bb.event.ProcessStarted", "bb.event.ProcessProgress", "bb.event.ProcessFinished"]
+def drain_events_errorhandling(eventHandler):
+ # We don't have logging set up, but we do need to show any events we see before exiting
+ event = True
+ logger = bb.msg.logger_create('bitbake', sys.stdout)
+ while event:
+ event = eventHandler.waitEvent(0)
+ if isinstance(event, logging.LogRecord):
+ logger.handle(event)
+
def main(server, eventHandler, params, tf = TerminalFilter):
- if not params.observe_only:
- params.updateToServer(server, os.environ.copy())
+ try:
+ if not params.observe_only:
+ params.updateToServer(server, os.environ.copy())
- includelogs, loglines, consolelogfile, logconfigfile = _log_settings_from_server(server, params.observe_only)
+ includelogs, loglines, consolelogfile, logconfigfile = _log_settings_from_server(server, params.observe_only)
- loglevel, _ = bb.msg.constructLogOptions()
+ loglevel, _ = bb.msg.constructLogOptions()
+ except bb.BBHandledException:
+ drain_events_errorhandling(eventHandler)
+ return 1
if params.options.quiet == 0:
console_loglevel = loglevel
diff --git a/poky/bitbake/lib/bb/ui/ncurses.py b/poky/bitbake/lib/bb/ui/ncurses.py
index da4fbea..cf1c876 100644
--- a/poky/bitbake/lib/bb/ui/ncurses.py
+++ b/poky/bitbake/lib/bb/ui/ncurses.py
@@ -48,6 +48,8 @@
import xmlrpc.client
from bb.ui import uihelper
+logger = logging.getLogger(__name__)
+
parsespin = itertools.cycle( r'|/-\\' )
X = 0
diff --git a/poky/bitbake/lib/bb/ui/uievent.py b/poky/bitbake/lib/bb/ui/uievent.py
index 13d0d4a..8607d05 100644
--- a/poky/bitbake/lib/bb/ui/uievent.py
+++ b/poky/bitbake/lib/bb/ui/uievent.py
@@ -11,9 +11,13 @@
client/server deadlocks.
"""
-import socket, threading, pickle, collections
+import collections, logging, pickle, socket, threading
from xmlrpc.server import SimpleXMLRPCServer, SimpleXMLRPCRequestHandler
+import bb
+
+logger = logging.getLogger(__name__)
+
class BBUIEventQueue:
def __init__(self, BBServer, clientinfo=("localhost, 0")):
diff --git a/poky/bitbake/lib/bb/utils.py b/poky/bitbake/lib/bb/utils.py
index 50032e5..0b79f92 100644
--- a/poky/bitbake/lib/bb/utils.py
+++ b/poky/bitbake/lib/bb/utils.py
@@ -402,8 +402,8 @@
(t, value, tb) = sys.exc_info()
try:
_print_exception(t, value, tb, realfile, text, context)
- except Exception as e:
- logger.error("Exception handler error: %s" % str(e))
+ except Exception as e2:
+ logger.error("Exception handler error: %s" % str(e2))
e = bb.BBHandledException(e)
raise e
@@ -433,20 +433,6 @@
for lock in locks:
bb.utils.unlockfile(lock)
-@contextmanager
-def timeout(seconds):
- def timeout_handler(signum, frame):
- pass
-
- original_handler = signal.signal(signal.SIGALRM, timeout_handler)
-
- try:
- signal.alarm(seconds)
- yield
- finally:
- signal.alarm(0)
- signal.signal(signal.SIGALRM, original_handler)
-
def lockfile(name, shared=False, retry=True, block=False):
"""
Use the specified file as a lock file, return when the lock has
@@ -1085,21 +1071,20 @@
# Either call with a list of filenames and set pout or a filename and optionally pout.
if not pout:
pout = fn + '.processed'
- pout = open(pout, 'w')
-
- import pstats
- if isinstance(fn, list):
- p = pstats.Stats(*fn, stream=pout)
- else:
- p = pstats.Stats(fn, stream=pout)
- p.sort_stats('time')
- p.print_stats()
- p.print_callers()
- p.sort_stats('cumulative')
- p.print_stats()
- pout.flush()
- pout.close()
+ with open(pout, 'w') as pout:
+ import pstats
+ if isinstance(fn, list):
+ p = pstats.Stats(*fn, stream=pout)
+ else:
+ p = pstats.Stats(fn, stream=pout)
+ p.sort_stats('time')
+ p.print_stats()
+ p.print_callers()
+ p.sort_stats('cumulative')
+ p.print_stats()
+
+ pout.flush()
#
# Was present to work around multiprocessing pool bugs in python < 2.7.3
@@ -1472,14 +1457,20 @@
return (notadded, notremoved)
-
-def get_file_layer(filename, d):
- """Determine the collection (as defined by a layer's layer.conf file) containing the specified file"""
+def get_collection_res(d):
collections = (d.getVar('BBFILE_COLLECTIONS') or '').split()
collection_res = {}
for collection in collections:
collection_res[collection] = d.getVar('BBFILE_PATTERN_%s' % collection) or ''
+ return collection_res
+
+
+def get_file_layer(filename, d, collection_res={}):
+ """Determine the collection (as defined by a layer's layer.conf file) containing the specified file"""
+ if not collection_res:
+ collection_res = get_collection_res(d)
+
def path_to_layer(path):
# Use longest path so we handle nested layers
matchlen = 0
@@ -1491,12 +1482,13 @@
return match
result = None
- bbfiles = (d.getVar('BBFILES') or '').split()
+ bbfiles = (d.getVar('BBFILES_PRIORITIZED') or '').split()
bbfilesmatch = False
for bbfilesentry in bbfiles:
- if fnmatch.fnmatch(filename, bbfilesentry):
+ if fnmatch.fnmatchcase(filename, bbfilesentry):
bbfilesmatch = True
result = path_to_layer(bbfilesentry)
+ break
if not bbfilesmatch:
# Probably a bbclass
diff --git a/poky/bitbake/lib/bblayers/query.py b/poky/bitbake/lib/bblayers/query.py
index ee2db0e..f5e3c84 100644
--- a/poky/bitbake/lib/bblayers/query.py
+++ b/poky/bitbake/lib/bblayers/query.py
@@ -21,6 +21,10 @@
class QueryPlugin(LayerPlugin):
+ def __init__(self):
+ super(QueryPlugin, self).__init__()
+ self.collection_res = {}
+
def do_show_layers(self, args):
"""show current configured layers."""
logger.plain("%s %s %s" % ("layer".ljust(20), "path".ljust(40), "priority"))
@@ -222,7 +226,6 @@
multilayer = True
if prov[0] != pref[0]:
same_ver = False
-
if (multilayer or not show_overlayed_only) and (same_ver or not show_same_ver_only):
if not items_listed:
logger.plain('=== %s ===' % title)
@@ -243,8 +246,13 @@
else:
return '?'
+ def get_collection_res(self):
+ if not self.collection_res:
+ self.collection_res = bb.utils.get_collection_res(self.tinfoil.config_data)
+ return self.collection_res
+
def get_file_layerdir(self, filename):
- layer = bb.utils.get_file_layer(filename, self.tinfoil.config_data)
+ layer = bb.utils.get_file_layer(filename, self.tinfoil.config_data, self.get_collection_res())
return self.bbfile_collections.get(layer, None)
def remove_layer_prefix(self, f):
diff --git a/poky/bitbake/lib/hashserv/tests.py b/poky/bitbake/lib/hashserv/tests.py
index 6e86295..b34c436 100644
--- a/poky/bitbake/lib/hashserv/tests.py
+++ b/poky/bitbake/lib/hashserv/tests.py
@@ -9,6 +9,7 @@
import hashlib
import logging
import multiprocessing
+import os
import sys
import tempfile
import threading
diff --git a/poky/bitbake/lib/layerindexlib/__init__.py b/poky/bitbake/lib/layerindexlib/__init__.py
index 77196b4..45157b6 100644
--- a/poky/bitbake/lib/layerindexlib/__init__.py
+++ b/poky/bitbake/lib/layerindexlib/__init__.py
@@ -7,6 +7,7 @@
import logging
import imp
+import os
from collections import OrderedDict
from layerindexlib.plugin import LayerIndexPluginUrlError
@@ -70,7 +71,7 @@
if self.__class__ != newIndex.__class__ or \
other.__class__ != newIndex.__class__:
- raise TypeException("Can not add different types.")
+ raise TypeError("Can not add different types.")
for indexEnt in self.indexes:
newIndex.indexes.append(indexEnt)
@@ -266,8 +267,8 @@
logger.debug(1, "Store not implemented in %s" % plugin.type)
pass
else:
- logger.debug(1, "No plugins support %s" % url)
- raise LayerIndexException("No plugins support %s" % url)
+ logger.debug(1, "No plugins support %s" % indexURI)
+ raise LayerIndexException("No plugins support %s" % indexURI)
def is_empty(self):
@@ -657,7 +658,7 @@
if obj.id in self._index[indexname]:
if self._index[indexname][obj.id] == obj:
continue
- raise LayerIndexError('Conflict adding object %s(%s) to index' % (indexname, obj.id))
+ raise LayerIndexException('Conflict adding object %s(%s) to index' % (indexname, obj.id))
self._index[indexname][obj.id] = obj
def add_raw_element(self, indexname, objtype, rawobjs):
@@ -842,11 +843,11 @@
def _resolve_dependencies(layerbranches, ignores, dependencies, invalid):
for layerbranch in layerbranches:
- if ignores and layerBranch.layer.name in ignores:
+ if ignores and layerbranch.layer.name in ignores:
continue
- for layerdependency in layerbranch.index.layerDependencies_layerBranchId[layerBranch.id]:
- deplayerbranch = layerDependency.dependency_layerBranch
+ for layerdependency in layerbranch.index.layerDependencies_layerBranchId[layerbranch.id]:
+ deplayerbranch = layerdependency.dependency_layerBranch
if ignores and deplayerbranch.layer.name in ignores:
continue
diff --git a/poky/bitbake/lib/layerindexlib/cooker.py b/poky/bitbake/lib/layerindexlib/cooker.py
index 65b23d0..21ec438 100644
--- a/poky/bitbake/lib/layerindexlib/cooker.py
+++ b/poky/bitbake/lib/layerindexlib/cooker.py
@@ -4,6 +4,7 @@
#
import logging
+import os
from collections import defaultdict
@@ -73,7 +74,7 @@
d = self.layerindex.data
if not branches:
- raise LayerIndexFetchError("No branches specified for _load_bblayers!")
+ raise layerindexlib.LayerIndexFetchError("No branches specified for _load_bblayers!")
index = layerindexlib.LayerIndexObj()
@@ -202,7 +203,7 @@
try:
depDict = bb.utils.explode_dep_versions2(deps)
except bb.utils.VersionStringException as vse:
- bb.fatal('Error parsing LAYERDEPENDS_%s: %s' % (c, str(vse)))
+ bb.fatal('Error parsing LAYERDEPENDS_%s: %s' % (collection, str(vse)))
for dep, oplist in list(depDict.items()):
# We need to search ourselves, so use the _ version...
@@ -268,7 +269,7 @@
layer = bb.utils.get_file_layer(realfn[0], self.config_data)
- depBranchId = collection_layerbranch[layer]
+ depBranchId = collection[layer]
recipeId += 1
recipe = layerindexlib.Recipe(index, None)
diff --git a/poky/bitbake/lib/layerindexlib/restapi.py b/poky/bitbake/lib/layerindexlib/restapi.py
index 21fd144..7023f42 100644
--- a/poky/bitbake/lib/layerindexlib/restapi.py
+++ b/poky/bitbake/lib/layerindexlib/restapi.py
@@ -5,9 +5,13 @@
import logging
import json
+import os
+
from urllib.parse import unquote
from urllib.parse import urlparse
+import bb
+
import layerindexlib
import layerindexlib.plugin
@@ -163,7 +167,7 @@
parsed = _get_json_response(apiurl=up_stripped.geturl(), username=username, password=password, retry=False)
logger.debug(1, "%s: retry successful.")
else:
- raise LayerIndexFetchError('%s: Connection reset by peer. Is there a firewall blocking your connection?' % apiurl)
+ raise layerindexlib.LayerIndexFetchError('%s: Connection reset by peer. Is there a firewall blocking your connection?' % apiurl)
return parsed
diff --git a/poky/bitbake/lib/layerindexlib/tests/restapi.py b/poky/bitbake/lib/layerindexlib/tests/restapi.py
index e5ccafe..4646d01 100644
--- a/poky/bitbake/lib/layerindexlib/tests/restapi.py
+++ b/poky/bitbake/lib/layerindexlib/tests/restapi.py
@@ -112,7 +112,7 @@
break
else:
self.logger.debug(1, "meta-python was not found")
- self.assetTrue(False)
+ raise self.failureException
# Only check the first element...
break
diff --git a/poky/bitbake/lib/ply/lex.py b/poky/bitbake/lib/ply/lex.py
index 267ec10..182f2e8 100644
--- a/poky/bitbake/lib/ply/lex.py
+++ b/poky/bitbake/lib/ply/lex.py
@@ -705,11 +705,7 @@
# Sort the functions by line number
for f in self.funcsym.values():
- if sys.version_info[0] < 3:
- f.sort(lambda x,y: cmp(func_code(x[1]).co_firstlineno,func_code(y[1]).co_firstlineno))
- else:
- # Python 3.0
- f.sort(key=lambda x: func_code(x[1]).co_firstlineno)
+ f.sort(key=lambda x: func_code(x[1]).co_firstlineno)
# Sort the strings by regular expression length
for s in self.strsym.values():
diff --git a/poky/bitbake/lib/ply/yacc.py b/poky/bitbake/lib/ply/yacc.py
index 561784f..46e7dc9 100644
--- a/poky/bitbake/lib/ply/yacc.py
+++ b/poky/bitbake/lib/ply/yacc.py
@@ -1205,7 +1205,7 @@
# Precompute the list of productions immediately following. Hack. Remove later
try:
- p.lr_after = Prodnames[p.prod[n+1]]
+ p.lr_after = self.Prodnames[p.prod[n+1]]
except (IndexError,KeyError):
p.lr_after = []
try:
diff --git a/poky/bitbake/lib/toaster/tests/functional/functional_helpers.py b/poky/bitbake/lib/toaster/tests/functional/functional_helpers.py
index 455c408..5c4ea71 100644
--- a/poky/bitbake/lib/toaster/tests/functional/functional_helpers.py
+++ b/poky/bitbake/lib/toaster/tests/functional/functional_helpers.py
@@ -75,7 +75,7 @@
try:
table_element = self.get_table_element(table_id)
element = table_element.find_element_by_link_text(link_text)
- except NoSuchElementException as e:
+ except self.NoSuchElementException:
print('no element found')
raise
return element
@@ -86,7 +86,7 @@
element_xpath = "//*[@id='" + table_id + "']"
try:
element = self.driver.find_element_by_xpath(element_xpath)
- except NoSuchElementException as e:
+ except self.NoSuchElementException:
raise
return element
row = coordinate[0]
@@ -96,7 +96,7 @@
element_xpath = "//*[@id='" + table_id + "']/tbody/tr[" + str(row) + "]"
try:
element = self.driver.find_element_by_xpath(element_xpath)
- except NoSuchElementException as e:
+ except self.NoSuchElementException:
return False
return element
#now we are looking for an element with specified X and Y
@@ -105,6 +105,6 @@
element_xpath = "//*[@id='" + table_id + "']/tbody/tr[" + str(row) + "]/td[" + str(column) + "]"
try:
element = self.driver.find_element_by_xpath(element_xpath)
- except NoSuchElementException as e:
+ except self.NoSuchElementException:
return False
return element