subtree updates

meta-raspberrypi: fde68b24f0..4c033eb074:
  Harunobu Kurokawa (1):
        rpi-cmdline, rpi-u-boot-src: Support USB boot

meta-arm: 0b61cc659a..4d22f982bc:
  Debbie Martin (2):
        arm-systemready: Add parted dependency and inherit testimage
        ci: Add Arm SystemReady firmware and IR ACS builds

  Harsimran Singh Tungal (3):
        arm-bsp/documentation: corstone1000: fix the steps in the user guide and instructions
        corstone1000:arm-bsp/optee: Update optee to v4.0
        corstone1000:arm-bsp/tftf: Fix tftf tests on mps3

  Jon Mason (5):
        arm/trusted-firmware-a: move patch file to bbappend
        arm/trusted-firmware-a: update to 2.10
        arm/hafnium: update to v2.10
        CI: rename meta-secure-core directory
        arm/edk2: update to 202311

  Ross Burton (1):
        CI: switch back to master

poky: 028b6f6226..4675bbb757:
  Adrian Freihofer (4):
        cmake-qemu.bbclass: make it more usable
        oe-selftest: add a cpp-example recipe
        oeqa/core/decorator: add skip if not qemu-usermode
        oe-selftest: add tests for C and C++ build tools

  Alassane Yattara (22):
        bitbake: toaster/test: bug-fix on tests/browser/test_all_builds_page
        bitbake: toaster/test: from test_no_builds_message.py wait for the empty state div to appear
        bitbake: toaster/test: delay driver action until elements to appear
        bitbake: toaster/tests: Ensure to kill toaster process create for tests functional
        bitbake: toaster/tests: Added functional/utils, containing useful methods used by functional tests
        bitbake: toaster/tests: Refactor tests/functional
        bitbake: toaster/tests: Bug fixes, functional tests dependent on each other
        bitbake: toaster/tests: Fixes warnings in autobuilder
        bitbake: toaster/tests: bug-fix tests writing files into /tmp on the autobuilders
        bitbake: toaster/test: fix Copyright
        bitbake: toaster/tests: logging warning in console, trying to kill unavailable Runbuilds process
        bitbake: toaster/tests: Removed all time.sleep occurrence
        bitbake: toaster/tests: Bug-Fix testcase functional/test_project_page_tab_config.py
        bitbake: toaster/tests: bug-fix element click intercepted in browser/test_layerdetails_page.py
        bitbake: toaster/tests: Update tests/functional/functional_helpers test_functional_basic
        bitbake: toaster/tests: Fixes functional tests warning on autobuilder
        bitbake: toaster/tests: Bug-fix test_functional_basic, delay driver actions
        bitbake: toaster/tests: bug-fix An element matching "#projectstable" should be visible
        bitbake: toaster/tests: bug-fix An element matching "#lastest_builds" should be on the page
        bitbake: toaster/tests: Skip showing more than 100 items in ToasterTable
        bitbake: toaster/tests: Bug-fix "#project-created-notification" should be visible
        bitbake: toaster/toastergui: Bug-fix verify given layer path only if import/add local layer

  Alex Bennée (1):
        qemurunner: more cleanups for output blocking

  Alex Kiernan (17):
        cargo: Rename MANIFEST_PATH -> CARGO_MANIFEST_PATH
        cargo: Move CARGO_MANIFEST_PATH/CARGO_SRC_DIR to cargo_common
        rust: cargo: Convert single-valued variables to weak defaults
        cargo: Add CARGO_LOCK_PATH for path to Cargo.lock
        rust: Upgrade 1.70.0 -> 1.71.0
        rust: Upgrade 1.71.0 -> 1.71.1
        sstate-cache-management: Rewrite in python
        devtool: selftest: Fix test_devtool_modify_git_crates_subpath inequality
        devtool: selftest: Fix test_devtool_modify_git_crates_subpath bbappend check
        meta-selftest: hello-rs: Simple rust test recipe
        devtool: selftest: Swap to hello-rs for crates testing
        zvariant: Drop recipe
        rust: Upgrade 1.71.1 -> 1.72.0
        rust: Upgrade 1.72.0 -> 1.72.1
        rust: Upgrade 1.72.1 -> 1.73.0
        rust: Upgrade 1.73.0 -> 1.74.0
        rust: Upgrade 1.74.0 -> 1.74.1

  Alexander Kanavin (21):
        selftest/sstatetest: print output from bitbake with actual newlines, not \n
        selftest/sstatetests: do not delete custom $TMPDIRs under build-st when testing printdiff
        sstatesig/find_siginfo: special-case gcc-source when looking in sstate caches
        oeqa/selftest/sstatetests: re-work CDN tests, add local cache tests
        gobject-introspection: depend on setuptools to obtain distutils module
        libcap-ng-python: depend on setuptools to obtain distutils copy
        dnf: remove obsolete python3-gpg dependency (provided by gpgme)
        gpgme: disable python support (until upstream fixes 3.12 compatibility)
        python3-setuptools-rust: remove distutils dependency
        python3-babel: replace distutils with setuptools, as supported by upstream
        python3-pip: remove distutils dependency
        glib-2.0: replace distutils dependency with setuptools
        python3-pytest-runner: remove distutils dependency
        python3-numpy: distutils is no longer required
        bitbake: bitbake/codeparser.py: address ast module deprecations in py 3.12
        glibc-y2038-tests: do not run tests using 32 bit time APIs
        bitbake: bitbake/runqueue: add debugging for find_siginfo() calls
        bitbake: bitbake-diffsigs/runqueue: adapt to reworked find_siginfo()
        bitbake: bitbake/runqueue: prioritize local stamps over sstate signatures in printdiff
        sstatesig/find_siginfo: unify a disjointed API
        lib/sstatesig/find_siginfo: raise an error instead of returning None when obtaining mtime

  Alexander Lussier-Cullen (6):
        bitbake: toaster: fix pytest build test execution and test discovery
        bitbake: toaster: Add verbose printout for missing chrome(driver) dependencies
        bitbake: bitbake: toaster: add functional testing toaster error details
        bitbake: toaster/tests: Exit tests on chromedriver creation failure
        bitbake: toaster/tests: fix functional tests setup and teardown
        bitbake: toaster/tests: fix chrome argument syntax and wait for driver exit

  Alexandre Belloni (1):
        oeqa/selftest/recipetool: stop looking for md5sum

  Anuj Mittal (9):
        sqlite3: upgrade 3.44.0 -> 3.44.2
        base-passwd: upgrade 3.6.2 -> 3.6.3
        bluez5: upgrade 5.70 -> 5.71
        glib-2.0: upgrade 2.78.1 -> 2.78.3
        glib-networking: upgrade 2.76.1 -> 2.78.0
        puzzles: upgrade to latest revision
        stress-ng: upgrade 0.17.01 -> 0.17.03
        libusb1: fix upstream version check
        enchant2: upgrade 2.6.2 -> 2.6.4

  Archana Polampalli (1):
        bluez5: fix CVE-2023-45866

  Bruce Ashfield (31):
        linux-yocto/6.5: cfg: split runtime and symbol debug
        linux-yocto/6.5: update to v6.5.11
        linux-yocto/6.1: update to v6.1.62
        linux-yocto-dev: bump to v6.7
        linux-yocto/6.5: update to v6.5.12
        linux-yocto/6.5: update to v6.5.13
        linux-yocto/6.1: update to v6.1.65
        linux-yocto/6.1: drop removed IMA option
        linux-yocto/6.5: drop removed IMA option
        linux-yocto-rt/6.1: update to -rt18
        linux-yocto/6.1: update to v6.1.66
        linux-yocto/6.1: update to v6.1.67
        linux-yocto/6.5: fix AB-INT: QEMU kernel panic: No irq handler for vector
        linux-yocto/6.1: update to v6.1.68
        oeqa/runtime/parselogs: add qemux86 ACPI ignore for kernel v6.6+
        linux-libc-headers: update to v6.6-lts
        linux-yocto: introduce 6.6 reference kernel
        linux-yocto/6.6: fix AB-INT: QEMU kernel panic: No irq handler for vector
        linux-yocto-rt/6.6: fix CVE exclusion include
        linux-yocto/6.6: update CVE exclusions
        linux-yocto/6.6: update to v6.6.8
        linux-yocto/6.1: update to v6.1.69
        linux-yocto/6.5: drop 6.5 recipes
        linux-yocto-rt/6.6: correct meta data branch
        linux-yocto/6.6: update to v6.6.9
        linux-yocto/6.6: update CVE exclusions
        linux-yocto/6.1: update to v6.1.70
        linux-yocto/6.1: update CVE exclusions
        linux-yocto/6.6: ARM fix configuration audit warning
        linux-yocto/6.6: arm: jitter entropy backport
        poky/poky-tiny: make 6.6 the default kernel

  Changqing Li (1):
        man-pages: remove conflicting pages

  Chen Qi (1):
        devtool: use straight print in check-upgrade-status output

  Clay Chang (1):
        devtool: deploy: provide max_process to strip_execs

  Daniel Ammann (1):
        base: Unpack .7z files with p7zip

  Deepthi Hemraj (1):
        autoconf: Add missing perl modules to RDEPENDS

  Dhairya Nagodra (2):
        cve-update-nvd2-native: faster requests with API keys
        cve-update-nvd2-native: increase the delay between subsequent request failures

  Eilís 'pidge' Ní Fhlannagáin (3):
        useradd: Fix issues with useradd dependencies
        useradd: Add testcase for bugzilla issue (currently disabled)
        usergrouptests.py: Add test for switching between static-ids

  Enrico Scholz (1):
        tcp-wrappers: drop libnsl2 build dependency

  Etienne Cordonnier (2):
        gdb/systemd: enable minidebuginfo support conditionally
        manuals: document minidebuginfo

  Fabio Estevam (3):
        libdrm: Upgrade to 2.4.119
        kmscube: Upgrade to latest revision
        bmap-tools: Upgrade to 3.7

  Hongxu Jia (2):
        socat: 1.7.4.4 -> 1.8.0.0
        man-db: 2.11.2 -> 2.12.0

  Jason Andryuk (3):
        linux-firmware: Package iwlwifi .pnvm files
        linux-firmware: Change bnx2 packaging
        linux-firmware: Create bnx2x subpackage

  Jeremy A. Puhlman (1):
        create-spdx-2.2: combine spdx can try to write before dir creation

  Jermain Horsman (2):
        lib/bblayers/makesetup.py: Remove unused imports
        lib/bblayers/buildconf.py: Remove unused imports/variables

  Jose Quaresma (2):
        go: update 1.20.10 -> 1.20.11
        go: update 1.20.11 -> 1.20.12

  Joshua Watt (11):
        bitbake: bitbake-hashserv: Add description of permissions
        bitbake.conf: Add runtimedir
        rpcbind: Specify state directory under /run
        libinput: Add packageconfig for tests
        ipk: Switch to using zstd compression
        lib/oe/path.py: Add relsymlink()
        lib/packagedata.py: Fix broken symlinks for providers with a '/'
        bitbake: contrib/vim: Syntax improvements
        classes-global/sstate: Fix variable typo
        lib/packagedata.py: Add API to iterate over rprovides
        classes-global/insane: Look up all runtime providers for file-rdeps

  Julien Stephan (19):
        recipetool: create_buildsys_python.py: initialize metadata
        recipetool: create: add trailing newlines
        recipetool: create: add new optional process_url callback for plugins
        recipetool: create_buildsys_python: add pypi support
        oeqa/selftest/recipetool: remove spaces on empty lines
        oeqa/selftest/recipetool/devtool: add test for pypi class
        recipetool: appendsrcfile(s): add dry-run mode
        recipeutils: bbappend_recipe: fix undefined variable
        recipeutils: bbappend_recipe: fix docstring
        recipeutils: bbappend_recipe: add a way to specify the name of the file to add
        recipeutils: bbappend_recipe: remove old srcuri entry if parameters are different
        recipetool: appendsrcfile(s): use params instead of extraline
        recipeutils: bbappend_recipe: allow to patch the recipe itself
        recipetool: appendsrcfile(s): add a mode to update the recipe itself
        oeqa/selftest/recipetool: appendsrfile: add test for machine
        oeqa/selftest/recipetool: appendsrc: add test for update mode
        oeqa/selftest/recipetool: add back checksum checks on pypi tests
        oeqa/selftest/recipetool: remove left over from development
        oeqa/selftest/recipetool: fix metadata corruption on meta layer

  Kevin Hao (2):
        beaglebone-yocto: Remove the redundant kernel-devicetree
        beaglebone-yocto: Remove the obsolete variables for uImage

  Khem Raj (13):
        tiff: Backport fixes for CVE-2023-6277
        kmod: Fix build with latest musl
        elfutils: Use own basename API implementation
        util-linux: Fix build with latest musl
        sysvinit: Include libgen.h for basename API
        attr: Fix build with latest musl
        opkg: Use own version of portable basename function
        util-linux: Delete md-raid tests
        gdb: Update to gdb 14.1 release
        systemd: Fix build with latest musl
        qemu: Fix build with latest musl
        qemu: Add packageconfig knob to enable pipewire support
        weston: Include libgen.h for basename

  Lee Chee Yang (5):
        migration-guides: reword fix in release-notes-4.3.1
        migration-guides: add release notes for 4.0.15
        perlcross: update to 1.5.2
        perl: 5.38.0 -> 5.38.2
        curl: update to 8.5.0

  Lucas Stach (1):
        mesa: upgrade 23.2.1 -> 23.3.1

  Ludovic Jozeau (1):
        image-live.bbclass: LIVE_ROOTFS_TYPE support compression

  Lukas Funke (1):
        selftest: wic: add test for zerorize option of empty plugin

  Malte Schmidt (1):
        wic: extend empty plugin with options to write zeros to partition

  Markus Volk (3):
        gtk4: upgrade 4.12.3 -> 4.12.4
        libadwaita: update 1.4.0 -> 1.4.2
        appstream: Upgrade 0.16.3 -> 1.0.0

  Marlon Rodriguez Garcia (5):
        bitbake: toaster/tests: Update build test
        bitbake: toaster: Added new feature to import eventlogs from command line into toaster using replay functionality
        bitbake: toaster: remove test and update setup to avoid rebuilding image
        bitbake: toaster: Commandline build import table improvements
        bitbake: toaster: Added validation to stop import if there is a build in progress

  Marta Rybczynska (1):
        bitbake: toastergui: verify that an existing layer path is given

  Massimiliano Minella (1):
        zstd: fix LICENSE statement

  Michael Opdenacker (8):
        test-manual: text and formatting fixes
        test-manual: resource updates
        test-manual: use working example
        test-manual: add links to python unittest
        test-manual: explicit or fix file paths
        test-manual: add or improve hyperlinks
        dev-manual: runtime-testing: fix test module name
        poky.conf: update SANITY_TESTED_DISTROS to match autobuilder

  Mikko Rapeli (1):
        runqemu: match .rootfs. in addition to -image- for rootfs

  Ming Liu (1):
        grub: fs/fat: Don't error when mtime is 0

  Mingli Yu (2):
        python3-license-expression: Fix the ptest failure
        ptest-packagelists.inc: Add python3-license-expression

  Pavel Zhukov (2):
        bitbake: utils: Do not create directories with ${ in the name
        oeqa/selftest/bbtests: Add test for unexpanded variables in the dirname

  Peter Kjellerstedt (11):
        oeqa/selftest/devtool: Correct git clone of local repository
        oeqa/selftest/devtool: Avoid global Git hooks when amending a patch
        oeqa/selftest/devtool: Make test_devtool_load_plugin more resilient
        oeqa/selftest/recipetool: Make test_recipetool_load_plugin more resilient
        lib/oe/recipeutils: Avoid wrapping any SRC_URI[sha*sum] variables
        recipetool: create: Improve identification of licenses
        recipetool: create: Only include the expected SRC_URI checksums
        devtool: upgrade: Update all existing checksums for the SRC_URI
        devtool: modify: Make --no-extract work again
        devtool: modify: Handle recipes with a menuconfig task correctly
        dev-manual: Discourage the use of SRC_URI[md5sum]

  Peter Marko (1):
        dtc: preserve version also from shallow git clones

  Philip Balister (1):
        sanity.bbclass: Check for additional native perl modules.

  Renat Khalikov (1):
        python3-maturin: Add missing space appending to CFLAGS

  Richard Purdie (41):
        bitbake: runqueue: Improve inter setscene task dependency handling
        bitbake: bb/toaster: Fix assertEquals deprecation warnings
        bitbake: toaster: Fix assertRegexpMatches deprecation warnings
        bitbake: toastermain/settings: Avoid python filehandle closure warnings
        bitbake: toastergui: Fix regex markup issues
        bitbake: bitbake: Move to version 2.6.1 to mark runqueue changes
        bitbake: toaster-eventreplay: Remove ordering assumptions
        sanity.conf: Require bitbake 2.6.1 for recent runqueue change
        sstate: Remove unneeded code from setscene_depvalid() related to useradd
        oeqa/runtime/systemd: Ensure test runs only on systemd images
        bitbake: toaster: Update to use qemux86-64 machine by default
        bitbake: toaster/tests/builds: Add BB_HASHSERVE passthrough
        pseudo: Update to pull in syncfs probe fix
        useradd: Fix useradd do_populate_sysroot dependency bug
        sstate: Fix dir ownership issues in SSTATE_DIR
        oeqa/sstatetests: Disable gcc source printdiff test for now
        build-appliance-image: Update to master head revision
        bitbake: utils: Fix mkdir with PosixPath
        bitbake: runqueue: Remove tie between rqexe and starts_worker
        build-appliance-image: Update to master head revision
        testimage: Exclude wtmp from target-dumper commands
        qemurunner: Improve stdout logging handling
        qemurunner: Improve handling of serial port output blocking
        oeqa/selftest/overlayfs: Don't overwrite DISTRO_FEATURES
        testimage: Drop target_dumper and most of monitor_dumper
        oeqa/selftest/overlayfs: Fix whitespace
        qemu: Clean up DEPENDS
        qemu: Ensure pip and the python venv aren't used for meson
        curl: Disable two intermittently failing tests
        linux/cve-exclusion6.1: Update to latest kernel point release
        lib/prservice: Improve lock handling robustness
        oeqa/selftest/prservice: Improve test robustness
        scripts: Drop shell sstate-cache-management
        oeqa/selftest/sstatetests: Update sstate management script tests to python script
        curl: Disable test 1091 due to intermittent failures
        bitbake: lib/bb: Add workaround for libgcc issues with python 3.8 and 3.9
        bitbake: bitbake: Post release version bump to 2.7.0
        bitbake: siggen: Ensure version of siggen is verified
        bitbake: bitbake: Version bump for find_siginfo changes
        sstatesig: Add version information for find_siginfo
        sanity: Require bitbake 2.7.1

  Robert Berger (1):
        uninative-tarball.xz - reproducibility fix

  Robert Yang (5):
        gettext: Upgrade 0.22.3 -> 0.22.4
        nfs-utils: Upgrade 2.6.3 -> 2.6.4
        archiver.bbclass: Improve work-shared checking
        nfs-utils: Update Upstream-Status
        archiver.bbclass: Drop tarfile module to improve performance

  Ross Burton (23):
        avahi: update URL for new project location
        oeqa/runtime/parselogs: load ignores from disk
        oeqa/runtime/parselogs: migrate ignores
        meta-yocto-bsp/oeqa/parselogs: add BSP-specific ignores
        linux-yocto: update CVE exclusions
        genericx86: remove redundant assignments
        images: remove redundant IMAGE_BASENAME assignments
        insane: ensure more paths have the workdir removed
        tcl: skip timing-dependent tests in run-ptest
        qemurunner: remove unused import
        go: set vendor in CVE_PRODUCT
        runqemu: add qmp socket support
        linux-yocto: update CVE exclusions
        tcl: skip async and event tests in run-ptest
        images: add core-image-initramfs-boot
        machine/arch-armv9: remove crc and sve tunes, they are mandatory
        python3: re-enable profile guided optimisation
        openssl: mark assembler sections as call targets for PAC/BTI support on aarch64
        nativesdk: ensure features don't get backfilled
        nativesdk: don't unset MACHINE_FEATURES, let machine-sdk/ set it
        conf/machine-sdk: declare qemu-usermode SDK_MACHINE_FEATURE
        libseccomp: remove redundant PV assignment
        oeqa/parselogs-ignores-qemuarmv5: add comments and organise

  Saul Wold (1):
        package.py: OEHasPackage: Add MLPREFIX to packagename

  Shubham Kulkarni (1):
        tzdata: Upgrade to 2023d

  Simone Weiß (2):
        manuals: brief-yoctoprojectqs: align variable order with default local.conf
        patchtest: Add test for deprecated CVE_CHECK_IGNORE

  Soumya Sambu (1):
        ncurses: Fix - tty is hung after reset

  Sundeep KOKKONDA (1):
        rust: rustdoc reproducibility issue fix - disable PGO

  Tim Orling (12):
        python3-bcrypt: upgrade 4.0.1 -> 4.1.1
        python3-pygments: upgrade 2.16.1 -> 2.17.2
        recipetool: pypi: do not clobber SRC_URI checksums
        python3-setuptools-rust: BBCLASSEXTEND + nativesdk
        python3-maturin: add v1.4.0
        python3-maturin: bzip2-sys reproducibility
        classes-recipe: add python_maturin.bbclass
        recipetool: add python_maturin support
        oe-selftest: add maturin runtime (testimage) test
        oeqa: add simple 'maturin' SDK (testsdk) test case
        oeqa: add "maturin develop" SDK test case
        oeqa: add runtime 'maturin develop' test case

  Tom Rini (1):
        inetutils: Update to the 2.5 release

  Trevor Gamblin (1):
        scripts/runqemu: fix regex escape sequences

  Victor Kamensky (5):
        systemtap: upgrade 4.9 -> 5.0
        systemtap: do not install uprobes and uprobes sources
        systemtap-uprobes: removed as obsolete
        systemtap: explicitly handle debuginfod library dependency
        systemtap: fix libdebuginfod auto detection logic

  Vijay Anusuri (1):
        avahi: backport CVE-2023-1981 & CVE's follow-up patches

  Viswanath Kraleti (2):
        image-uefi.conf: Add EFI_UKI_PATH variable
        systemd-boot: Add recipe to compile native

  Wang Mingyu (38):
        kbd: upgrade 2.6.3 -> 2.6.4
        libatomic-ops: upgrade 7.8.0 -> 7.8.2
        libnl: upgrade 3.8.0 -> 3.9.0
        libseccomp: upgrade 2.5.4 -> 2.5.5
        libva-utils: upgrade 2.20.0 -> 2.20.1
        dnf: upgrade 4.18.1 -> 4.18.2
        gpgme: upgrade 1.23.1 -> 1.23.2
        kea: upgrade 2.4.0 -> 2.4.1
        opkg-utils: upgrade 0.6.2 -> 0.6.3
        repo: upgrade 2.39 -> 2.40
        sysstat: upgrade 12.7.4 -> 12.7.5
        p11-kit: upgrade 0.25.2 -> 0.25.3
        python3-babel: upgrade 2.13.1 -> 2.14.0
        python3-dbusmock: upgrade 0.29.1 -> 0.30.0
        python3-hatchling: upgrade 1.18.0 -> 1.20.0
        python3-hypothesis: upgrade 6.90.0 -> 6.92.1
        python3-importlib-metadata: upgrade 6.8.0 -> 7.0.0
        python3-license-expression: upgrade 30.1.1 -> 30.2.0
        python3-pathspec: upgrade 0.11.2 -> 0.12.1
        python3-pip: upgrade 23.3.1 -> 23.3.2
        python3-psutil: upgrade 5.9.6 -> 5.9.7
        python3-pytest-runner: upgrade 6.0.0 -> 6.0.1
        python3-trove-classifiers: upgrade 2023.11.22 -> 2023.11.29
        python3-typing-extensions: upgrade 4.8.0 -> 4.9.0
        python3-wcwidth: upgrade 0.2.11 -> 0.2.12
        ttyrun: upgrade 2.29.0 -> 2.30.0
        xwayland: upgrade 23.2.2 -> 23.2.3
        diffoscope: upgrade 252 -> 253
        iputils: upgrade 20221126 -> 20231222
        gstreamer1.0: upgrade 1.22.7 -> 1.22.8
        dhcpcd: upgrade 10.0.5 -> 10.0.6
        fontconfig: upgrade 2.14.2 -> 2.15.0
        python3-setuptools: upgrade 69.0.2 -> 69.0.3
        python3-dbusmock: upgrade 0.30.0 -> 0.30.1
        python3-hatchling: upgrade 1.20.0 -> 1.21.0
        python3-importlib-metadata: upgrade 7.0.0 -> 7.0.1
        python3-lxml: upgrade 4.9.3 -> 4.9.4
        aspell: upgrade 0.60.8 -> 0.60.8.1

  Yash Shinde (1):
        rust: Disable rust oe-selftest

  Yi Zhao (3):
        json-glib: upgrade 1.6.6 -> 1.8.0
        psplash: upgrade to latest revision
        debianutils: upgrade 5.14 -> 5.15

  Yoann Congal (2):
        lib/oe/patch: handle creating patches for CRLF sources
        strace: Disable bluetooth support by default

  Zang Ruochen (2):
        ell: upgrade 0.60 -> 0.61
        musl: add typedefs for Elf64_Relr and Elf32_Relr

  Zoltan Boszormenyi (1):
        update_gtk_icon_cache: Fix for GTK4-only builds

  venkata pyla (1):
        wic: use E2FSPROGS_FAKE_TIME and hash_seed to generate reproducible ext4 images

meta-openembedded: 5ad7203f68..7d8115d550:
  Alex Kiernan (7):
        mdns: Fix HOMEPAGE URL
        mbedtls: Upgrade 3.5.0 -> 3.5.1
        c-ares: Upgrade 1.22.1 -> 1.24.0
        mdns: Upgrade 2200.40.37.0.1 -> 2200.60.25.0.4
        c-ares: Move to tarballs, add ptest and static support
        thin-provisioning-tools: Upgrade 1.0.4 -> 1.0.9
        bearssl: Upgrade to latest

  Alexander Kanavin (29):
        python3-pyinotify: remove as unmaintained
        python3-supervisor: do not rely on smtpd module
        python3-meld3: do not rely on smtpd module
        python3-m2crypto: do not rely on smtpd module
        python3-uinput: remove as unmaintained
        python3-mcrypto: rely on setuptools for distutils copy
        python3-joblib: do not rely on distutils
        python3-web3: remove distutils dependency
        python3-cppy: remove unused distutils dependency
        python3-pyroute2: remove unused distutils dependency
        python3-eventlet: backport a patch to remove distutils dependency
        python3-unoconv: rely on setuptools to obtain distutils copy
        python3-astroid: remove unneeded distutils dependency
        python3-django: remove unneeded distutils dependency
        python3-pillow: remove unneeded distutils dependency
        python3-grpcio: update 1.56.2 -> 1.59.3
        gstd: correctly delete files in do_install
        libplist: fix python 3.12 compatibility
        libcamera: skip until upstream resolves python 3.12 compatibility
        nodejs: backport (partially) python 3.12 support
        nodejs: backport (partially) python 3.12 support
        polkit: remove long obsolete 0.119 version
        mozjs-115: split the way-too-long PYTHONPATH line
        polkit: update mozjs dependency 102 -> 115
        mozjs-115: backport py 3.12 compatibility
        mozjs-102: remove the recipe
        gthumb: update 3.12.2 -> 3.12.4
        flatpak: do not rely on executables from the host
        bolt: package systemd units

  Archana Polampalli (1):
        cjson: upgrade 1.7.16 -> 1.7.17

  Bruce Ashfield (1):
        zfs: update to 2.2.2

  Changqing Li (2):
        postgresql: upgrade 15.4 -> 15.5
        redis: upgrade 6.2.13 -> 6.2.14

  Derek Straka (70):
        python3-greenlet: update to version 3.0.2
        python3-ujson: update to version 5.9.0
        python3-termcolor: update to version 2.4.0
        python3-cmake: update to version 3.28.0
        python3-pint: upgrade to 0.23
        python3-gnupg: update to 0.5.2
        python3-pyzmq: update to 25.1.2
        python3-tox: update to version 4.11.4
        python3-olefile: update to version 0.47
        python3-distlib: update to version 0.3.8
        python3-colorlog: update to version 6.8.0
        python3-pymongo: update version to 4.6.1
        python3-bandit: update to version 1.7.6
        python3-gmqtt: update to version 0.6.13
        python3-portion: update to version 2.4.2
        python3-prompt-toolkit: update to version 3.0.43
        python3-asyncinotify: update to version 4.0.4
        python3-bitstring: update to version 4.1.4
        python3-ipython: update to version 8.18.1
        nginx: update versions for both the stable branch and mainline
        python3-portalocker: update to version 2.8.2
        python3-astroid: update to version 3.0.2
        python3-alembic: update to version 1.13.1
        python3-pymisp: update to version 2.4.182
        python3-ninja: update to version 1.11.1.1
        python3-coverage: update to version 7.3.4
        python3-pdm: update to version 2.11.1
        python3-paramiko: update to version 3.4.0
        python3-zeroconf: update to version 0.131.0
        python3-wtforms: update to version 3.1.1
        python3-isort: update to version 5.13.2
        python3-protobuf: update to version 4.25.1
        python3-lazy-object-proxy: update to version 1.10.0
        python3-cantools: update to version 39.4.0
        python3-sentry-sdk: update to version 1.39.1
        python3-xmlschema: update to version 2.5.1
        python3-apiflask: update to version 2.1.0
        python3-rapidjson: update to version 1.14
        python3-bitarray: update to version 2.9.0
        python3-pyfanotify: update to version 0.2.2
        python3-eventlet: update to version 0.34.1
        python3-flask-wtf: update to version 1.2.1
        python3-grpcio: update to version 1.60.0
        python3-grpcio-tools: update to version 1.60.0
        python3-cmake: update to version 3.28.1
        python3-flask-sqlalchemy: fix upstream uri check
        python3-wtforms: fix upstream uri and version check
        gyp: update to the latest commit
        python3-ipython-genutils: fix upstream uri and version check
        python3-flask: fix upstream uri and version check
        python3-wpa-supplicant: fix upstream uri and version check
        python3-uswid: update to version 0.4.7
        python3-flask-wtf: fix upstream uri and version check
        python3-gspread: update to version 5.12.3
        python3-pytest-html: update to version 4.1.1
        python3-setuptools-scm-git-archive: remove obsolete package
        python3-pyroute2: update to version 0.7.10
        python3-constantly: update to version 23.10.4
        python3-mypy: update to version 1.8.0
        python3-flask-jwt-extended: update to version 4.6.0
        python3-greenlet: update to version 3.0.3
        python3-web3: update to version 6.13.0
        python3-parse: update to version 1.20.0
        python3-kmod: add comment about update to version 0.9.2
        python3-engineio: update to version 4.8.1
        python3-sqlalchemy: update to version 2.0.24
        python3-pdm-backend: update to version 2.1.8
        python3-cantools: update to version 39.4.1
        python3-argh: update to version 0.30.5
        python3-dominate: update to version 2.9.1

  Dmitry Baryshkov (2):
        android-tools: remove two Debianisms
        networkmanager: drop libnewt dependency

  Frederic Martinsons (3):
        crash: factorize recipe with inc file to prepare cross-canadian version
        crash: add cross canadian version
        crash: update to 8.0.4

  Jan Vermaete (1):
        netdata: added Python as rdepends

  Jean-Marc BOUCHE (1):
        terminus-font: build compressed archives with -n

  Jose Quaresma (1):
        ostree: Upgrade 2023.7 -> 2023.8

  Joshua Watt (1):
        redis: Create state directory in systemd service

  Jörg Sommer (1):
        i2cdev: New recipe with i2c tools

  Kai Kang (1):
        lvm2: 2.03.16 -> 2.03.22

  Khem Raj (3):
        Revert "nodejs: backport (partially) python 3.12 support"
        Revert "libcamera: skip until upstream resolves python 3.12 compatibility"
        libcamera: Fix build with python 3.12

  Leon Anavi (11):
        sip: Upgrade 6.7.12 -> 6.8.0
        python3-expandvars: add recipe
        python3-frozenlist: upgrade 1.4.0 -> 1.4.1
        python3-yarl: upgrade 1.9.2 -> 1.9.4
        python3-coverage: upgrade 7.3.2 -> 7.3.3
        python3-cycler: upgrade 0.11.0 -> 0.12.1
        python3-aiohue: upgrade 4.6.2 -> 4.7.0
        python3-sdbus: upgrade 0.11.0 -> 0.11.1
        python3-zeroconf: upgrade 0.128.4 -> 0.130.0
        python3-dominate: upgrade 2.8.0 -> 2.9.0
        python3-rlp: upgrade 3.0.0 -> 4.0.0

  Marek Vasut (1):
        faad2: Upgrade 2.10.0 -> 2.11.1

  Markus Volk (3):
        wireplumber: update 0.4.15 -> 0.4.17
        tracker: don't inherit gsettings
        gnome-software: update 45.1 -> 45.2

  Martin Jansa (4):
        monocypher: pass LIBDIR to fix installed-vs-shipped QA issue with multilib
        rygel: fix build with gtk+3 PACKAGECONFIG disabled
        rygel: add x11 to DISTRO_FEATURES
        driverctl: fix installed-vs-shipped

  Meenali Gupta (1):
        nginx: upgrade 1.25.2 -> 1.25.3

  Mingli Yu (2):
        mariadb: Upgrade to 10.11.6
        tk: Remove buildpath issue

  Nathan BRIENT (1):
        cyaml: new recipe

  Niko Mauno (1):
        pkcs11-provider: Add recipe

  Ny Antra Ranaivoarison (1):
        python3-click-spinner: backport patch that fixes deprecated methods

  Patrick Wicki (1):
        poco: upgrade 1.12.4 -> 1.12.5p2

  Petr Chernikov (1):
        abseil-cpp: remove -Dcmake_cxx_standard=14 flag from extra_oecmake

  Robert Yang (1):
        minifi-cpp: Fix do_configure error on aarch64 builder

  Ross Burton (13):
        Remove unused SRC_DISTRIBUTE_LICENSES
        gspell: inherit gtk-doc
        gspell: update DEPENDS, switch iso-codes for icu
        librest: remove spurious build dependencies
        librest: inherit gtk-doc
        keybinder: use autotools-brokensep instead of setting B
        keybinder: disable gtk-doc documentation
        gtksourceview3: remove obsolete DEPENDS
        libgsf: remove obsolete DEPENDS
        evolution-data-server: remove obsolete intltool DEPENDS
        php: remove lemon-native build dependency
        lemon: upgrade to 3.44.2
        renderdoc: no need to depend on vim-native

  Samuli Piippo (1):
        jasper: enable opengl only with x11

  Theodore A. Roth (1):
        python3-flask-sqlalchemy: upgrade 2.5.1 -> 3.1.1

  Thomas Perrot (2):
        networkmanager: add missing modemmanager rdepends
        networkmanager: fix some missing pkgconfig

  Tim Orling (8):
        python3-pydantic-core: add v2.14.5
        python3-annotated-types: add v0.6.0
        python3-pydantic: fix RDEPENDS
        python3-dirty-equals: add v0.7.1
        python3-pydantic-core: enable ptest
        python3-cloudpickle: add v3.0.0
        python3-pydantic: enable ptest
        python3-yappi: upgrade 1.4.0 -> 1.6.0; fix ptests

  Wang Mingyu (61):
        python3-alembic: upgrade 1.12.1 -> 1.13.0
        python3-ansi2html: upgrade 1.8.0 -> 1.9.1
        python3-argcomplete: upgrade 3.1.6 -> 3.2.1
        python3-dbus-fast: upgrade 2.15.0 -> 2.21.0
        python3-django: upgrade 4.2.7 -> 5.0
        python3-flask-restx: upgrade 1.2.0 -> 1.3.0
        python3-google-api-core: upgrade 2.14.0 -> 2.15.0
        python3-google-api-python-client: upgrade 2.108.0 -> 2.111.0
        python3-googleapis-common-protos: upgrade 1.61.0 -> 1.62.0
        python3-google-auth: upgrade 2.23.4 -> 2.25.2
        python3-imageio: upgrade 2.33.0 -> 2.33.1
        python3-isort: upgrade 5.12.0 -> 5.13.1
        python3-path: upgrade 16.7.1 -> 16.9.0
        python3-platformdirs: upgrade 4.0.0 -> 4.1.0
        python3-pytest-asyncio: upgrade 0.22.0 -> 0.23.2
        python3-sentry-sdk: upgrade 1.37.1 -> 1.39.0
        python3-bitarray: upgrade 2.8.3 -> 2.8.5
        python3-eth-keyfile: upgrade 0.6.1 -> 0.7.0
        python3-eth-rlp: upgrade 0.3.0 -> 1.0.0
        python3-fastnumbers: upgrade 5.0.1 -> 5.1.0
        python3-pylint: upgrade 3.0.2 -> 3.0.3
        python3-tornado: upgrade 6.3.3 -> 6.4
        python3-traitlets: upgrade 5.13.0 -> 5.14.0
        python3-types-setuptools: upgrade 68.2.0.2 -> 69.0.0.0
        python3-virtualenv: upgrade 20.24.7 -> 20.25.0
        python3-web3: upgrade 6.11.3 -> 6.12.0
        python3-websocket-client: upgrade 1.6.4 -> 1.7.0
        python3-zeroconf: upgrade 0.127.0 -> 0.128.4
        ctags: upgrade 6.0.20231126.0 -> 6.0.20231210.0
        gensio: upgrade 2.8.0 -> 2.8.2
        hwdata: upgrade 0.376 -> 0.377
        lvgl: upgrade 8.3.10 -> 8.3.11
        gjs: upgrade 1.78.0 -> 1.78.1
        ifenslave: upgrade 2.13 -> 2.14
        libei: upgrade 1.1.0 -> 1.2.0
        pkcs11-helper: upgrade 1.29.0 -> 1.30.0
        strongswan: upgrade 5.9.12 -> 5.9.13
        webkitgtk3: upgrade 2.42.2 -> 2.42.3
        sip: upgrade 6.8.0 -> 6.8.1
        paho-mqtt-cpp: upgrade 1.3.1 -> 1.3.2
        dbus-cxx: upgrade 2.4.0 -> 2.5.0
        exiftool: upgrade 12.70 -> 12.71
        uftp: upgrade 5.0.2 -> 5.0.3
        ctags: upgrade 6.0.20231210.0 -> 6.0.20231224.0
        jasper: Fix install conflict when multilib is enabled.
        jq: upgrade 1.7 -> 1.7.1
        libmbim: upgrade 1.31.1 -> 1.31.2
        libqmi: upgrade 1.34.0 -> 1.35.1
        opencl-headers: upgrade 2023.04.17 -> 2023.12.14
        valijson: upgrade 1.0.1 -> 1.0.2
        python3-apispec: upgrade 6.3.0 -> 6.3.1
        python3-asyncinotify: upgrade 4.0.4 -> 4.0.5
        python3-bitarray: upgrade 2.9.0 -> 2.9.1
        python3-cassandra-driver: upgrade 3.28.0 -> 3.29.0
        python3-ipython: upgrade 8.18.1 -> 8.19.0
        python3-pydantic: upgrade 2.5.2 -> 2.5.3
        python3-regex: upgrade 2023.10.3 -> 2023.12.25
        opencl-icd-loader: upgrade 2023.04.17 -> 2023.12.14
        python3-distro: upgrade 1.8.0 -> 1.9.0
        zchunk: upgrade 1.3.2 -> 1.4.0
        python3-eventlet: upgrade 0.34.1 -> 0.34.2

  William Lyu (1):
        networkmanager: Improved SUMMARY and added DESCRIPTION

  Xiangyu Chen (1):
        layer.conf: add libbpf to NON_MULTILIB_RECIPES

  Yi Zhao (2):
        open-vm-tools: upgrade 12.1.5 -> 12.3.5
        samba: upgrade 4.18.8 -> 4.18.9

  Zoltán Böszörményi (2):
        mutter: Make gnome-desktop and libcanberra dependencies optional
        zenity: Upgrade to 4.0.0

  alperak (29):
        jasper: upgrade 2.0.33 -> 4.1.1
        xcursorgen: upgrade 1.0.7 -> 1.0.8
        xstdcmap: upgrade 1.0.4 -> 1.0.5
        xlsclients: upgrade 1.1.4 -> 1.1.5
        xlsatoms: upgrade 1.1.3 -> 1.1.4
        xkbevd: upgrade 1.1.4 -> 1.1.5
        xgamma: upgrade 1.0.6 -> 1.0.7
        sessreg: upgrade 1.1.2 -> 1.1.3
        xbitmaps: upgrade 1.1.2 -> 1.1.3
        xcursor-themes: add recipe
        xorg-docs: add recipe
        xorg-sgml-doctools: update summary depends and inc file
        xf86-video-ati: upgrade 19.1.0 -> 22.0.0
        xf86-input-void: upgrade 1.4.1 -> 1.4.2
        libxaw: upgrade 1.0.14 -> 1.0.15
        xf86-video-mga: upgrade 2.0.0 -> 2.0.1
        snappy: upgrade 1.1.9 -> 1.1.10
        xsetroot: upgrade 1.1.2 -> 1.1.3
        libbytesize: Removed unnecessary setting of B
        libmxml: use autotools-brokensep instead of setting B
        libsombok3: use autotools-brokensep instead of setting B
        pgpool2: use autotools-brokensep instead of setting B
        qpdf: upgrade 11.6.3 -> 11.6.4
        cpprest: upgrade 2.10.18 -> 2.10.19
        avro-c: upgrade 1.11.2 -> 1.11.3
        dool: upgrade 1.1.0 -> 1.3.1
        driverctl: upgrade 0.111 -> 0.115
        hstr: upgrade 2.5.0 -> 3.1.0
        libharu: upgrade 2.3.0 -> 2.4.4

meta-security: 070a1e82cc..b2e1511338:
  Armin Kuster (6):
        libgssglue: update to 0.8
        python3-privacyidea: Update to 3.9.1
        lynis: Update SRC_URI to improve updater
        layers: Move READMEs to markdown format
        arpwatch: adjust CONFIGURE params to allow building again.
        python3-pyinotify: fail2ban needs this module

  Dawid Dabrowski (1):
        libhoth recipe update

  Erik Schilling (2):
        dm-verity-img.bbclass: use bc-native
        dm-verity-img.bbclass: remove IMAGE_NAME_SUFFIX

  Mikko Rapeli (2):
        tpm2-tss: support native builds
        dm-verity-img.bbclass: add DM_VERITY_DEPLOY_DIR

Change-Id: I94d7f1ee5ff2da4555c05fbf63a1293ec8f249c2
Signed-off-by: Patrick Williams <patrick@stwcx.xyz>
diff --git a/poky/scripts/lib/devtool/deploy.py b/poky/scripts/lib/devtool/deploy.py
index e14a587..eadf6e1 100644
--- a/poky/scripts/lib/devtool/deploy.py
+++ b/poky/scripts/lib/devtool/deploy.py
@@ -140,6 +140,7 @@
     import math
     import oe.recipeutils
     import oe.package
+    import oe.utils
 
     check_workspace_recipe(workspace, args.recipename, checksrc=False)
 
@@ -174,7 +175,7 @@
             exec_fakeroot(rd, "cp -af %s %s" % (os.path.join(srcdir, '.'), recipe_outdir), shell=True)
             os.environ['PATH'] = ':'.join([os.environ['PATH'], rd.getVar('PATH') or ''])
             oe.package.strip_execs(args.recipename, recipe_outdir, rd.getVar('STRIP'), rd.getVar('libdir'),
-                        rd.getVar('base_libdir'), rd)
+                        rd.getVar('base_libdir'), oe.utils.get_bb_number_threads(rd), rd)
 
         filelist = []
         inodes = set({})
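
The deploy.py hunk above threads a parallelism value into oe.package.strip_execs() via oe.utils.get_bb_number_threads(rd). A minimal sketch of what such a helper can look like, assuming it simply derives the count from BB_NUMBER_THREADS with a CPU-count fallback (the real implementation in meta/lib/oe/utils.py may differ in detail):

import os

def get_bb_number_threads(d):
    # Assumed behaviour: honour BB_NUMBER_THREADS when set, otherwise fall back
    # to the host CPU count so stripping can run one worker per core.
    return int(d.getVar("BB_NUMBER_THREADS") or os.cpu_count() or 1)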
diff --git a/poky/scripts/lib/devtool/standard.py b/poky/scripts/lib/devtool/standard.py
index ad6e346..559fd45 100644
--- a/poky/scripts/lib/devtool/standard.py
+++ b/poky/scripts/lib/devtool/standard.py
@@ -147,6 +147,8 @@
         extracmdopts += ' -a'
     if args.npm_dev:
         extracmdopts += ' --npm-dev'
+    if args.no_pypi:
+        extracmdopts += ' --no-pypi'
     if args.mirrors:
         extracmdopts += ' --mirrors'
     if args.srcrev:
@@ -984,14 +986,15 @@
                         '}\n')
             if rd.getVarFlag('do_menuconfig','task'):
                 f.write('\ndo_configure:append() {\n'
-                '    if [ ${@ oe.types.boolean(\'${KCONFIG_CONFIG_ENABLE_MENUCONFIG}\') } = True ]; then\n'
+                '    if [ ${@oe.types.boolean(d.getVar("KCONFIG_CONFIG_ENABLE_MENUCONFIG"))} = True ]; then\n'
                 '        cp ${KCONFIG_CONFIG_ROOTDIR}/.config ${S}/.config.baseline\n'
                 '        ln -sfT ${KCONFIG_CONFIG_ROOTDIR}/.config ${S}/.config.new\n'
                 '    fi\n'
                 '}\n')
             if initial_revs:
                 for name, rev in initial_revs.items():
-                        f.write('\n# initial_rev %s: %s\n' % (name, rev))
+                    f.write('\n# initial_rev %s: %s\n' % (name, rev))
+                    if name in commits:
                         for commit in commits[name]:
                             f.write('# commit %s: %s\n' % (name, commit))
             if branch_patches:
@@ -2328,6 +2331,7 @@
     group.add_argument('--no-same-dir', help='Force build in a separate build directory', action="store_true")
     parser_add.add_argument('--fetch', '-f', help='Fetch the specified URI and extract it to create the source tree (deprecated - pass as positional argument instead)', metavar='URI')
     parser_add.add_argument('--npm-dev', help='For npm, also fetch devDependencies', action="store_true")
+    parser_add.add_argument('--no-pypi', help='Do not inherit pypi class', action="store_true")
     parser_add.add_argument('--version', '-V', help='Version to use within recipe (PV)')
     parser_add.add_argument('--no-git', '-g', help='If fetching source, do not set up source tree as a git repository', action="store_true")
     group = parser_add.add_mutually_exclusive_group()
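
The --no-pypi plumbing added to standard.py above simply forwards the new devtool flag to the underlying recipetool invocation. A condensed, self-contained illustration of that pattern (the option name comes from the hunk; everything else is hypothetical):

import argparse

parser = argparse.ArgumentParser(prog='devtool add')
parser.add_argument('--no-pypi', help='Do not inherit pypi class', action='store_true')
args = parser.parse_args(['--no-pypi'])

extracmdopts = ''
if args.no_pypi:
    extracmdopts += ' --no-pypi'   # appended to the recipetool create command line
print(repr(extracmdopts))          # ' --no-pypi'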
diff --git a/poky/scripts/lib/devtool/upgrade.py b/poky/scripts/lib/devtool/upgrade.py
index 10827a7..ef58523 100644
--- a/poky/scripts/lib/devtool/upgrade.py
+++ b/poky/scripts/lib/devtool/upgrade.py
@@ -192,8 +192,7 @@
         __run('git submodule foreach \'git tag -f devtool-base-new\'')
         (stdout, _) = __run('git submodule --quiet foreach \'echo $sm_path\'')
         paths += [os.path.join(srctree, p) for p in stdout.splitlines()]
-        md5 = None
-        sha256 = None
+        checksums = {}
         _, _, _, _, _, params = bb.fetch2.decodeurl(uri)
         srcsubdir_rel = params.get('destsuffix', 'git')
         if not srcbranch:
@@ -226,9 +225,6 @@
         if ftmpdir and keep_temp:
             logger.info('Fetch temp directory is %s' % ftmpdir)
 
-        md5 = checksums['md5sum']
-        sha256 = checksums['sha256sum']
-
         tmpsrctree = _get_srctree(tmpdir)
         srctree = os.path.abspath(srctree)
         srcsubdir_rel = os.path.relpath(tmpsrctree, tmpdir)
@@ -297,7 +293,7 @@
             if tmpdir != tmpsrctree:
                 shutil.rmtree(tmpdir)
 
-    return (revs, md5, sha256, srcbranch, srcsubdir_rel)
+    return (revs, checksums, srcbranch, srcsubdir_rel)
 
 def _add_license_diff_to_recipe(path, diff):
     notice_text = """# FIXME: the LIC_FILES_CHKSUM values have been updated by 'devtool upgrade'.
@@ -318,7 +314,7 @@
         f.write("\n#\n\n".encode())
         f.write(orig_content)
 
-def _create_new_recipe(newpv, md5, sha256, srcrev, srcbranch, srcsubdir_old, srcsubdir_new, workspace, tinfoil, rd, license_diff, new_licenses, srctree, keep_failure):
+def _create_new_recipe(newpv, checksums, srcrev, srcbranch, srcsubdir_old, srcsubdir_new, workspace, tinfoil, rd, license_diff, new_licenses, srctree, keep_failure):
     """Creates the new recipe under workspace"""
 
     bpn = rd.getVar('BPN')
@@ -390,30 +386,39 @@
                 addnames.append(params['name'])
     # Find what's been set in the original recipe
     oldnames = []
+    oldsums = []
     noname = False
     for varflag in rd.getVarFlags('SRC_URI'):
-        if varflag.endswith(('.md5sum', '.sha256sum')):
-            name = varflag.rsplit('.', 1)[0]
-            if name not in oldnames:
-                oldnames.append(name)
-        elif varflag in ['md5sum', 'sha256sum']:
-            noname = True
+        for checksum in checksums:
+            if varflag.endswith('.' + checksum):
+                name = varflag.rsplit('.', 1)[0]
+                if name not in oldnames:
+                    oldnames.append(name)
+                oldsums.append(checksum)
+            elif varflag == checksum:
+                noname = True
+                oldsums.append(checksum)
     # Even if SRC_URI has named entries it doesn't have to actually use the name
     if noname and addnames and addnames[0] not in oldnames:
         addnames = []
     # Drop any old names (the name actually might include ${PV})
     for name in oldnames:
         if name not in newnames:
-            newvalues['SRC_URI[%s.md5sum]' % name] = None
-            newvalues['SRC_URI[%s.sha256sum]' % name] = None
+            for checksum in oldsums:
+                newvalues['SRC_URI[%s.%s]' % (name, checksum)] = None
 
-    if sha256:
-        if addnames:
-            nameprefix = '%s.' % addnames[0]
-        else:
-            nameprefix = ''
+    nameprefix = '%s.' % addnames[0] if addnames else ''
+
+    # md5sum is deprecated, remove any traces of it. If it was the only old
+    # checksum, then replace it with the default checksums.
+    if 'md5sum' in oldsums:
         newvalues['SRC_URI[%smd5sum]' % nameprefix] = None
-        newvalues['SRC_URI[%ssha256sum]' % nameprefix] = sha256
+        oldsums.remove('md5sum')
+        if not oldsums:
+            oldsums = ["%ssum" % s for s in bb.fetch2.SHOWN_CHECKSUM_LIST]
+
+    for checksum in oldsums:
+        newvalues['SRC_URI[%s%s]' % (nameprefix, checksum)] = checksums[checksum]
 
     if srcsubdir_new != srcsubdir_old:
         s_subdir_old = os.path.relpath(os.path.abspath(rd.getVar('S')), rd.getVar('WORKDIR'))
@@ -571,12 +576,12 @@
             rev1, srcsubdir1 = standard._extract_source(srctree, False, 'devtool-orig', False, config, basepath, workspace, args.fixed_setup, rd, tinfoil, no_overrides=args.no_overrides)
             old_licenses = _extract_licenses(srctree_s, (rd.getVar('LIC_FILES_CHKSUM') or ""))
             logger.info('Extracting upgraded version source...')
-            rev2, md5, sha256, srcbranch, srcsubdir2 = _extract_new_source(args.version, srctree, args.no_patch,
+            rev2, checksums, srcbranch, srcsubdir2 = _extract_new_source(args.version, srctree, args.no_patch,
                                                     args.srcrev, args.srcbranch, args.branch, args.keep_temp,
                                                     tinfoil, rd)
             new_licenses = _extract_licenses(srctree_s, (rd.getVar('LIC_FILES_CHKSUM') or ""))
             license_diff = _generate_license_diff(old_licenses, new_licenses)
-            rf, copied = _create_new_recipe(args.version, md5, sha256, args.srcrev, srcbranch, srcsubdir1, srcsubdir2, config.workspace_path, tinfoil, rd, license_diff, new_licenses, srctree, args.keep_failure)
+            rf, copied = _create_new_recipe(args.version, checksums, args.srcrev, srcbranch, srcsubdir1, srcsubdir2, config.workspace_path, tinfoil, rd, license_diff, new_licenses, srctree, args.keep_failure)
         except (bb.process.CmdError, DevtoolError) as e:
             recipedir = os.path.join(config.workspace_path, 'recipes', rd.getVar('BPN'))
             _upgrade_error(e, recipedir, srctree, args.keep_failure)
@@ -626,7 +631,7 @@
     for result in results:
         # pn, update_status, current, latest, maintainer, latest_commit, no_update_reason
         if args.all or result[1] != 'MATCH':
-            logger.info("{:25} {:15} {:15} {} {} {}".format(   result[0],
+            print("{:25} {:15} {:15} {} {} {}".format(   result[0],
                                                                result[2],
                                                                result[1] if result[1] != 'UPDATE' else (result[3] if not result[3].endswith("new-commits-available") else "new commits"),
                                                                result[4],
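
In the upgrade.py rework above, the separate md5/sha256 values are replaced by a single checksums dict keyed by '<algorithm>sum', and md5sum is dropped as deprecated. A small sketch of the resulting SRC_URI flag generation, assuming bb.fetch2.SHOWN_CHECKSUM_LIST reduces to sha256 (all values below are placeholders):

# Placeholder values standing in for what the fetcher reports.
checksums = {'md5sum': 'd41d8cd98f00b204...', 'sha256sum': 'e3b0c44298fc1c14...'}
nameprefix = ''                     # would be '<name>.' for named SRC_URI entries
newvalues = {}

oldsums = [k for k in checksums if k != 'md5sum']     # md5sum is deprecated
if not oldsums:
    oldsums = ['%ssum' % s for s in ['sha256']]       # SHOWN_CHECKSUM_LIST fallback
newvalues['SRC_URI[%smd5sum]' % nameprefix] = None    # drop any old md5sum flag
for checksum in oldsums:
    newvalues['SRC_URI[%s%s]' % (nameprefix, checksum)] = checksums[checksum]
# newvalues == {'SRC_URI[md5sum]': None, 'SRC_URI[sha256sum]': 'e3b0c44298fc1c14...'}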
diff --git a/poky/scripts/lib/recipetool/append.py b/poky/scripts/lib/recipetool/append.py
index 4b6a711..341e893 100644
--- a/poky/scripts/lib/recipetool/append.py
+++ b/poky/scripts/lib/recipetool/append.py
@@ -18,6 +18,7 @@
 import scriptutils
 import errno
 from collections import defaultdict
+import difflib
 
 logger = logging.getLogger('recipetool')
 
@@ -299,7 +300,9 @@
                 if st.st_mode & stat.S_IXUSR:
                     perms = '0755'
             install = {args.newfile: (args.targetpath, perms)}
-        oe.recipeutils.bbappend_recipe(rd, args.destlayer, {args.newfile: {'path' : sourcepath}}, install, wildcardver=args.wildcard_version, machine=args.machine)
+        if sourcepath:
+            sourcepath = os.path.basename(sourcepath)
+        oe.recipeutils.bbappend_recipe(rd, args.destlayer, {args.newfile: {'newname' : sourcepath}}, install, wildcardver=args.wildcard_version, machine=args.machine)
         tinfoil.modified_files()
         return 0
     else:
@@ -328,6 +331,7 @@
 
     copyfiles = {}
     extralines = extralines or []
+    params = []
     for newfile, srcfile in files.items():
         src_destdir = os.path.dirname(srcfile)
         if not args.use_workdir:
@@ -338,24 +342,45 @@
             src_destdir = os.path.join(os.path.relpath(srcdir, workdir), src_destdir)
         src_destdir = os.path.normpath(src_destdir)
 
-        source_uri = 'file://{0}'.format(os.path.basename(srcfile))
         if src_destdir and src_destdir != '.':
-            source_uri += ';subdir={0}'.format(src_destdir)
-
-        simple = bb.fetch.URI(source_uri)
-        simple.params = {}
-        simple_str = str(simple)
-        if simple_str in simplified:
-            existing = simplified[simple_str]
-            if source_uri != existing:
-                logger.warning('{0!r} is already in SRC_URI, with different parameters: {1!r}, not adding'.format(source_uri, existing))
-            else:
-                logger.warning('{0!r} is already in SRC_URI, not adding'.format(source_uri))
+            params.append({'subdir': src_destdir})
         else:
-            extralines.append('SRC_URI += {0}'.format(source_uri))
-        copyfiles[newfile] = {'path' : srcfile}
+            params.append({})
 
-    oe.recipeutils.bbappend_recipe(rd, args.destlayer, copyfiles, None, wildcardver=args.wildcard_version, machine=args.machine, extralines=extralines)
+        copyfiles[newfile] = {'newname' : os.path.basename(srcfile)}
+
+    dry_run_output = None
+    dry_run_outdir = None
+    if args.dry_run:
+        import tempfile
+        dry_run_output = tempfile.TemporaryDirectory(prefix='devtool')
+        dry_run_outdir = dry_run_output.name
+
+    appendfile, _ = oe.recipeutils.bbappend_recipe(rd, args.destlayer, copyfiles, None, wildcardver=args.wildcard_version, machine=args.machine, extralines=extralines, params=params,
+                                                   redirect_output=dry_run_outdir, update_original_recipe=args.update_recipe)
+    if not appendfile:
+        return
+    if args.dry_run:
+        output = ''
+        appendfilename = os.path.basename(appendfile)
+        newappendfile = appendfile
+        if appendfile and os.path.exists(appendfile):
+            with open(appendfile, 'r') as f:
+                oldlines = f.readlines()
+        else:
+            appendfile = '/dev/null'
+            oldlines = []
+
+        with open(os.path.join(dry_run_outdir, appendfilename), 'r') as f:
+            newlines = f.readlines()
+        diff = difflib.unified_diff(oldlines, newlines, appendfile, newappendfile)
+        difflines = list(diff)
+        if difflines:
+            output += ''.join(difflines)
+        if output:
+            logger.info('Diff of changed files:\n%s' % output)
+        else:
+            logger.info('No changed files')
     tinfoil.modified_files()
 
 def appendsrcfiles(parser, args):
@@ -436,6 +461,8 @@
                                    help='Create/update a bbappend to add or replace source files',
                                    description='Creates a bbappend (or updates an existing one) to add or replace the specified file in the recipe sources, either those in WORKDIR or those in the source tree. This command lets you specify multiple files with a destination directory, so cannot specify the destination filename. See the `appendsrcfile` command for the other behavior.')
     parser.add_argument('-D', '--destdir', help='Destination directory (relative to S or WORKDIR, defaults to ".")', default='', type=destination_path)
+    parser.add_argument('-u', '--update-recipe', help='Update recipe instead of creating (or updating) a bbapend file. DESTLAYER must contains the recipe to update', action='store_true')
+    parser.add_argument('-n', '--dry-run', help='Dry run mode', action='store_true')
     parser.add_argument('files', nargs='+', metavar='FILE', help='File(s) to be added to the recipe sources (WORKDIR or S)', type=existing_path)
     parser.set_defaults(func=lambda a: appendsrcfiles(parser, a), parserecipes=True)
 
@@ -443,6 +470,8 @@
                                    parents=[common_src],
                                    help='Create/update a bbappend to add or replace a source file',
                                    description='Creates a bbappend (or updates an existing one) to add or replace the specified files in the recipe sources, either those in WORKDIR or those in the source tree. This command lets you specify the destination filename, not just destination directory, but only works for one file. See the `appendsrcfiles` command for the other behavior.')
+    parser.add_argument('-u', '--update-recipe', help='Update recipe instead of creating (or updating) a bbapend file. DESTLAYER must contains the recipe to update', action='store_true')
+    parser.add_argument('-n', '--dry-run', help='Dry run mode', action='store_true')
     parser.add_argument('file', metavar='FILE', help='File to be added to the recipe sources (WORKDIR or S)', type=existing_path)
     parser.add_argument('destfile', metavar='DESTFILE', nargs='?', help='Destination path (relative to S or WORKDIR, optional)', type=destination_path)
     parser.set_defaults(func=lambda a: appendsrcfile(parser, a), parserecipes=True)
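
The dry-run mode added to append.py above renders the would-be bbappend into a temporary directory and compares it against the existing file with difflib. The same pattern in isolation (file names and contents are made up):

import difflib

oldlines = ['SRC_URI += "file://defconfig"\n']
newlines = ['SRC_URI += "file://defconfig;subdir=files"\n']
diff = difflib.unified_diff(oldlines, newlines,
                            'recipe.bbappend', 'recipe.bbappend (dry run)')
output = ''.join(diff)
print(output if output else 'No changed files')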
diff --git a/poky/scripts/lib/recipetool/create.py b/poky/scripts/lib/recipetool/create.py
index 293198d..d2997cc 100644
--- a/poky/scripts/lib/recipetool/create.py
+++ b/poky/scripts/lib/recipetool/create.py
@@ -423,6 +423,36 @@
     storeTagName = ''
     pv_srcpv = False
 
+    handled = []
+    classes = []
+
+    # Find all plugins that want to register handlers
+    logger.debug('Loading recipe handlers')
+    raw_handlers = []
+    for plugin in plugins:
+        if hasattr(plugin, 'register_recipe_handlers'):
+            plugin.register_recipe_handlers(raw_handlers)
+    # Sort handlers by priority
+    handlers = []
+    for i, handler in enumerate(raw_handlers):
+        if isinstance(handler, tuple):
+            handlers.append((handler[0], handler[1], i))
+        else:
+            handlers.append((handler, 0, i))
+    handlers.sort(key=lambda item: (item[1], -item[2]), reverse=True)
+    for handler, priority, _ in handlers:
+        logger.debug('Handler: %s (priority %d)' % (handler.__class__.__name__, priority))
+        setattr(handler, '_devtool', args.devtool)
+    handlers = [item[0] for item in handlers]
+
+    fetchuri = None
+    for handler in handlers:
+        if hasattr(handler, 'process_url'):
+            ret = handler.process_url(args, classes, handled, extravalues)
+            if 'url' in handled and ret:
+                fetchuri = ret
+                break
+
     if os.path.isfile(source):
         source = 'file://%s' % os.path.abspath(source)
 
@@ -431,7 +461,8 @@
         if re.match(r'https?://github.com/[^/]+/[^/]+/archive/.+(\.tar\..*|\.zip)$', source):
             logger.warning('github archive files are not guaranteed to be stable and may be re-generated over time. If the latter occurs, the checksums will likely change and the recipe will fail at do_fetch. It is recommended that you point to an actual commit or tag in the repository instead (using the repository URL in conjunction with the -S/--srcrev option).')
         # Fetch a URL
-        fetchuri = reformat_git_uri(urldefrag(source)[0])
+        if not fetchuri:
+            fetchuri = reformat_git_uri(urldefrag(source)[0])
         if args.binary:
             # Assume the archive contains the directory structure verbatim
             # so we need to extract to a subdirectory
@@ -638,8 +669,6 @@
     # We'll come back and replace this later in handle_license_vars()
     lines_before.append('##LICENSE_PLACEHOLDER##')
 
-    handled = []
-    classes = []
 
     # FIXME This is kind of a hack, we probably ought to be using bitbake to do this
     pn = None
@@ -677,8 +706,10 @@
     if not srcuri:
         lines_before.append('# No information for SRC_URI yet (only an external source tree was specified)')
     lines_before.append('SRC_URI = "%s"' % srcuri)
+    shown_checksums = ["%ssum" % s for s in bb.fetch2.SHOWN_CHECKSUM_LIST]
     for key, value in sorted(checksums.items()):
-        lines_before.append('SRC_URI[%s] = "%s"' % (key, value))
+        if key in shown_checksums:
+            lines_before.append('SRC_URI[%s] = "%s"' % (key, value))
     if srcuri and supports_srcrev(srcuri):
         lines_before.append('')
         lines_before.append('# Modify these as desired')
@@ -718,25 +749,6 @@
     if args.npm_dev:
         extravalues['NPM_INSTALL_DEV'] = 1
 
-    # Find all plugins that want to register handlers
-    logger.debug('Loading recipe handlers')
-    raw_handlers = []
-    for plugin in plugins:
-        if hasattr(plugin, 'register_recipe_handlers'):
-            plugin.register_recipe_handlers(raw_handlers)
-    # Sort handlers by priority
-    handlers = []
-    for i, handler in enumerate(raw_handlers):
-        if isinstance(handler, tuple):
-            handlers.append((handler[0], handler[1], i))
-        else:
-            handlers.append((handler, 0, i))
-    handlers.sort(key=lambda item: (item[1], -item[2]), reverse=True)
-    for handler, priority, _ in handlers:
-        logger.debug('Handler: %s (priority %d)' % (handler.__class__.__name__, priority))
-        setattr(handler, '_devtool', args.devtool)
-    handlers = [item[0] for item in handlers]
-
     # Apply the handlers
     if args.binary:
         classes.append('bin_package')
@@ -873,8 +885,10 @@
         outlines.append('')
     outlines.extend(lines_after)
 
+    outlines = [ line.rstrip('\n') +"\n" for line in outlines]
+
     if extravalues:
-        _, outlines = oe.recipeutils.patch_recipe_lines(outlines, extravalues, trailing_newline=False)
+        _, outlines = oe.recipeutils.patch_recipe_lines(outlines, extravalues, trailing_newline=True)
 
     if args.extract_to:
         scriptutils.git_convert_standalone_clone(srctree)
@@ -890,7 +904,7 @@
         log_info_cond('Source extracted to %s' % args.extract_to, args.devtool)
 
     if outfile == '-':
-        sys.stdout.write('\n'.join(outlines) + '\n')
+        sys.stdout.write(''.join(outlines) + '\n')
     else:
         with open(outfile, 'w') as f:
             lastline = None
@@ -898,7 +912,7 @@
                 if not lastline and not line:
                     # Skip extra blank lines
                     continue
-                f.write('%s\n' % line)
+                f.write('%s' % line)
                 lastline = line
         log_info_cond('Recipe %s has been created; further editing may be required to make it fully functional' % outfile, args.devtool)
         tinfoil.modified_files()
@@ -1059,54 +1073,18 @@
 
     return md5sums
 
-def crunch_license(licfile):
+def crunch_known_licenses(d):
     '''
-    Remove non-material text from a license file and then check
-    its md5sum against a known list. This works well for licenses
-    which contain a copyright statement, but is also a useful way
-    to handle people's insistence upon reformatting the license text
-    slightly (with no material difference to the text of the
-    license).
+    Calculate the MD5 checksums for the crunched versions of all common
+    licenses. Also add additional known checksums.
     '''
-
-    import oe.utils
-
-    # Note: these are carefully constructed!
-    license_title_re = re.compile(r'^#*\(? *(This is )?([Tt]he )?.{0,15} ?[Ll]icen[sc]e( \(.{1,10}\))?\)?[:\.]? ?#*$')
-    license_statement_re = re.compile(r'^((This (project|software)|.{1,10}) is( free software)? (released|licen[sc]ed)|(Released|Licen[cs]ed)) under the .{1,10} [Ll]icen[sc]e:?$')
-    copyright_re = re.compile('^ *[#\*]* *(Modified work |MIT LICENSED )?Copyright ?(\([cC]\))? .*$')
-    disclaimer_re = re.compile('^ *\*? ?All [Rr]ights [Rr]eserved\.$')
-    email_re = re.compile('^.*<[\w\.-]*@[\w\.\-]*>$')
-    header_re = re.compile('^(\/\**!?)? ?[\-=\*]* ?(\*\/)?$')
-    tag_re = re.compile('^ *@?\(?([Ll]icense|MIT)\)?$')
-    url_re = re.compile('^ *[#\*]* *https?:\/\/[\w\.\/\-]+$')
-
+    
     crunched_md5sums = {}
 
     # common licenses
-    crunched_md5sums['89f3bf322f30a1dcfe952e09945842f0'] = 'Apache-2.0'
-    crunched_md5sums['13b6fe3075f8f42f2270a748965bf3a1'] = '0BSD'
-    crunched_md5sums['ba87a7d7c20719c8df4b8beed9b78c43'] = 'BSD-2-Clause'
-    crunched_md5sums['7f8892c03b72de419c27be4ebfa253f8'] = 'BSD-3-Clause'
-    crunched_md5sums['21128c0790b23a8a9f9e260d5f6b3619'] = 'BSL-1.0'
-    crunched_md5sums['975742a59ae1b8abdea63a97121f49f4'] = 'EDL-1.0'
-    crunched_md5sums['5322cee4433d84fb3aafc9e253116447'] = 'EPL-1.0'
-    crunched_md5sums['6922352e87de080f42419bed93063754'] = 'EPL-2.0'
-    crunched_md5sums['793475baa22295cae1d3d4046a3a0ceb'] = 'GPL-2.0-only'
-    crunched_md5sums['ff9047f969b02c20f0559470df5cb433'] = 'GPL-2.0-or-later'
-    crunched_md5sums['ea6de5453fcadf534df246e6cdafadcd'] = 'GPL-3.0-only'
-    crunched_md5sums['b419257d4d153a6fde92ddf96acf5b67'] = 'GPL-3.0-or-later'
-    crunched_md5sums['228737f4c49d3ee75b8fb3706b090b84'] = 'ISC'
-    crunched_md5sums['c6a782e826ca4e85bf7f8b89435a677d'] = 'LGPL-2.0-only'
-    crunched_md5sums['32d8f758a066752f0db09bd7624b8090'] = 'LGPL-2.0-or-later'
-    crunched_md5sums['4820937eb198b4f84c52217ed230be33'] = 'LGPL-2.1-only'
-    crunched_md5sums['db13fe9f3a13af7adab2dc7a76f9e44a'] = 'LGPL-2.1-or-later'
-    crunched_md5sums['d7a0f2e4e0950e837ac3eabf5bd1d246'] = 'LGPL-3.0-only'
-    crunched_md5sums['abbf328e2b434f9153351f06b9f79d02'] = 'LGPL-3.0-or-later'
-    crunched_md5sums['eecf6429523cbc9693547cf2db790b5c'] = 'MIT'
-    crunched_md5sums['b218b0e94290b9b818c4be67c8e1cc82'] = 'MIT-0'
-    crunched_md5sums['ddc18131d6748374f0f35a621c245b49'] = 'Unlicense'
-    crunched_md5sums['51f9570ff32571fc0a443102285c5e33'] = 'WTFPL'
+    crunched_md5sums['ad4e9d34a2e966dfe9837f18de03266d'] = 'GFDL-1.1-only'
+    crunched_md5sums['d014fb11a34eb67dc717fdcfc97e60ed'] = 'GFDL-1.2-only'
+    crunched_md5sums['e020ca655b06c112def28e597ab844f1'] = 'GFDL-1.3-only'
 
     # The following two were gleaned from the "forever" npm package
     crunched_md5sums['0a97f8e4cbaf889d6fa51f84b89a79f6'] = 'ISC'
@@ -1162,6 +1140,39 @@
     # https://raw.githubusercontent.com/stackgl/gl-mat3/v2.0.0/LICENSE.md
     crunched_md5sums['75512892d6f59dddb6d1c7e191957e9c'] = 'Zlib'
 
+    commonlicdir = d.getVar('COMMON_LICENSE_DIR')
+    for fn in sorted(os.listdir(commonlicdir)):
+        md5value, lictext = crunch_license(os.path.join(commonlicdir, fn))
+        if md5value not in crunched_md5sums:
+            crunched_md5sums[md5value] = fn
+        elif fn != crunched_md5sums[md5value]:
+            bb.debug(2, "crunched_md5sums['%s'] is already set to '%s' rather than '%s'" % (md5value, crunched_md5sums[md5value], fn))
+        else:
+            bb.debug(2, "crunched_md5sums['%s'] is already set to '%s'" % (md5value, crunched_md5sums[md5value]))
+
+    return crunched_md5sums
+
+def crunch_license(licfile):
+    '''
+    Remove non-material text from a license file and then calculate its
+    md5sum. This works well for licenses that contain a copyright statement,
+    but is also a useful way to handle people's insistence upon reformatting
+    the license text slightly (with no material difference to the text of the
+    license).
+    '''
+
+    import oe.utils
+
+    # Note: these are carefully constructed!
+    license_title_re = re.compile(r'^#*\(? *(This is )?([Tt]he )?.{0,15} ?[Ll]icen[sc]e( \(.{1,10}\))?\)?[:\.]? ?#*$')
+    license_statement_re = re.compile(r'^((This (project|software)|.{1,10}) is( free software)? (released|licen[sc]ed)|(Released|Licen[cs]ed)) under the .{1,10} [Ll]icen[sc]e:?$')
+    copyright_re = re.compile('^ *[#\*]* *(Modified work |MIT LICENSED )?Copyright ?(\([cC]\))? .*$')
+    disclaimer_re = re.compile('^ *\*? ?All [Rr]ights [Rr]eserved\.$')
+    email_re = re.compile('^.*<[\w\.-]*@[\w\.\-]*>$')
+    header_re = re.compile('^(\/\**!?)? ?[\-=\*]* ?(\*\/)?$')
+    tag_re = re.compile('^ *@?\(?([Ll]icense|MIT)\)?$')
+    url_re = re.compile('^ *[#\*]* *https?:\/\/[\w\.\/\-]+$')
+
     lictext = []
     with open(licfile, 'r', errors='surrogateescape') as f:
         for line in f:
@@ -1203,13 +1214,14 @@
     except UnicodeEncodeError:
         md5val = None
         lictext = ''
-    license = crunched_md5sums.get(md5val, None)
-    return license, md5val, lictext
+    return md5val, lictext
 
 def guess_license(srctree, d):
     import bb
     md5sums = get_license_md5sums(d)
 
+    crunched_md5sums = crunch_known_licenses(d)
+
     licenses = []
     licspecs = ['*LICEN[CS]E*', 'COPYING*', '*[Ll]icense*', 'LEGAL*', '[Ll]egal*', '*GPL*', 'README.lic*', 'COPYRIGHT*', '[Cc]opyright*', 'e[dp]l-v10']
     skip_extensions = (".html", ".js", ".json", ".svg", ".ts", ".go")
@@ -1227,7 +1239,8 @@
         md5value = bb.utils.md5_file(licfile)
         license = md5sums.get(md5value, None)
         if not license:
-            license, crunched_md5, lictext = crunch_license(licfile)
+            crunched_md5, lictext = crunch_license(licfile)
+            license = crunched_md5sums.get(crunched_md5, None)
             if lictext and not license:
                 license = 'Unknown'
                 logger.info("Please add the following line for '%s' to a 'lib/recipetool/licenses.csv' " \
@@ -1401,6 +1414,7 @@
     parser_create.add_argument('-B', '--srcbranch', help='Branch in source repository if fetching from an SCM such as git (default master)')
     parser_create.add_argument('--keep-temp', action="store_true", help='Keep temporary directory (for debugging)')
     parser_create.add_argument('--npm-dev', action="store_true", help='For npm, also fetch devDependencies')
+    parser_create.add_argument('--no-pypi', action="store_true", help='Do not inherit pypi class')
     parser_create.add_argument('--devtool', action="store_true", help=argparse.SUPPRESS)
     parser_create.add_argument('--mirrors', action="store_true", help='Enable PREMIRRORS and MIRRORS for source tree fetching (disabled by default).')
     parser_create.set_defaults(func=create_recipe)
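For reference, the restructured license guessing in create.py now resolves a license file in two passes: an exact md5 lookup against the known-license table, then a lookup of the crunched text against the table built by crunch_known_licenses(). Below is a minimal sketch of that flow, assuming stand-in tables and a hypothetical crunch() callable in place of crunch_license():

    import hashlib

    def guess_one_license(licfile, exact_md5sums, crunched_md5sums, crunch):
        # 1. match the file as shipped
        with open(licfile, 'rb') as f:
            md5value = hashlib.md5(f.read()).hexdigest()
        license = exact_md5sums.get(md5value)
        if not license:
            # 2. fall back to the md5 of the crunched text
            crunched_md5, lictext = crunch(licfile)
            license = crunched_md5sums.get(crunched_md5)
            if lictext and not license:
                license = 'Unknown'   # reported as a licenses.csv candidate
        return license
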
diff --git a/poky/scripts/lib/recipetool/create_buildsys_python.py b/poky/scripts/lib/recipetool/create_buildsys_python.py
index 9312e4a..60c5903 100644
--- a/poky/scripts/lib/recipetool/create_buildsys_python.py
+++ b/poky/scripts/lib/recipetool/create_buildsys_python.py
@@ -18,7 +18,11 @@
 import re
 import sys
 import subprocess
+import json
+import urllib.request
 from recipetool.create import RecipeHandler
+from urllib.parse import urldefrag
+from recipetool.create import determine_from_url
 
 logger = logging.getLogger('recipetool')
 
@@ -111,6 +115,69 @@
     def __init__(self):
         pass
 
+    def process_url(self, args, classes, handled, extravalues):
+        """
+        Convert any pypi url https://pypi.org/project/<package>/<version> into https://files.pythonhosted.org/packages/source/...
+        which corresponds to the archive location, and add pypi class
+        """
+
+        if 'url' in handled:
+            return None
+
+        fetch_uri = None
+        source = args.source
+        required_version = args.version if args.version else None
+        match = re.match(r'https?://pypi.org/project/([^/]+)(?:/([^/]+))?/?$', urldefrag(source)[0])
+        if match:
+            package = match.group(1)
+            version = match.group(2) if match.group(2) else required_version
+
+            json_url = f"https://pypi.org/pypi/%s/json" % package
+            response = urllib.request.urlopen(json_url)
+            if response.status == 200:
+                data = json.loads(response.read())
+                if not version:
+                    # grab latest version
+                    version = data["info"]["version"]
+                pypi_package = data["info"]["name"]
+                for release in reversed(data["releases"][version]):
+                    if release["packagetype"] == "sdist":
+                        fetch_uri = release["url"]
+                        break
+            else:
+                logger.warning("Cannot handle pypi url %s: cannot fetch package information using %s", source, json_url)
+                return None
+        else:
+            match = re.match(r'^https?://files.pythonhosted.org/packages.*/(.*)-.*$', source)
+            if match:
+                fetch_uri = source
+                pypi_package = match.group(1)
+                _, version = determine_from_url(fetch_uri)
+
+        if match and not args.no_pypi:
+            if required_version and version != required_version:
+                raise Exception("Version specified using --version/-V (%s) and version specified in the url (%s) do not match" % (required_version, version))
+            # This is optional if BPN looks like "python-<pypi_package>" or "python3-<pypi_package>" (see pypi.bbclass)
+            # but at this point we cannot know because the user can specify the output name of the recipe on the command line
+            extravalues["PYPI_PACKAGE"] = pypi_package
+            # If the tarball extension is not 'tar.gz' (default value in pypi.bbclass) we should set PYPI_PACKAGE_EXT in the recipe
+            pypi_package_ext = re.match(r'.*%s-%s\.(.*)$' % (pypi_package, version), fetch_uri)
+            if pypi_package_ext:
+                pypi_package_ext = pypi_package_ext.group(1)
+                if pypi_package_ext != "tar.gz":
+                    extravalues["PYPI_PACKAGE_EXT"] = pypi_package_ext
+
+            # Pypi class will handle S and SRC_URI variables, so remove them
+            # TODO: allow oe.recipeutils.patch_recipe_lines() to accept regexp so we can simplify the following to:
+            # extravalues['SRC_URI(?:\[.*?\])?'] = None
+            extravalues['S'] = None
+            extravalues['SRC_URI'] = None
+
+            classes.append('pypi')
+
+        handled.append('url')
+        return fetch_uri
+
     def handle_classifier_license(self, classifiers, existing_licenses=""):
 
         licenses = []
@@ -668,6 +735,7 @@
         "poetry.core.masonry.api": "python_poetry_core",
         "flit_core.buildapi": "python_flit_core",
         "hatchling.build": "python_hatchling",
+        "maturin": "python_maturin",
     }
 
     # setuptools.build_meta and flit declare project metadata into the "project" section of pyproject.toml
@@ -726,6 +794,7 @@
 
     def process(self, srctree, classes, lines_before, lines_after, handled, extravalues):
         info = {}
+        metadata = {}
 
         if 'buildsystem' in handled:
             return False
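For reference, a minimal sketch of the URL recognition step performed by the new process_url() handler above: a https://pypi.org/project/<package>[/<version>] URL is split into package and optional version before the handler queries the PyPI JSON API and switches the recipe to the pypi class. The sample URL below is illustrative only.

    import re
    from urllib.parse import urldefrag

    source = "https://pypi.org/project/docutils/0.20.1/"   # illustrative URL
    match = re.match(r'https?://pypi.org/project/([^/]+)(?:/([^/]+))?/?$',
                     urldefrag(source)[0])
    if match:
        package = match.group(1)   # 'docutils'
        version = match.group(2)   # '0.20.1', or None if the URL omits it
        print(package, version)
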
diff --git a/poky/scripts/lib/wic/partition.py b/poky/scripts/lib/wic/partition.py
index b1a2306..795707e 100644
--- a/poky/scripts/lib/wic/partition.py
+++ b/poky/scripts/lib/wic/partition.py
@@ -284,6 +284,20 @@
 
         extraopts = self.mkfs_extraopts or "-F -i 8192"
 
+        if os.getenv('SOURCE_DATE_EPOCH'):
+            sde_time = int(os.getenv('SOURCE_DATE_EPOCH'))
+            if pseudo:
+                pseudo = "export E2FSPROGS_FAKE_TIME=%s;%s " % (sde_time, pseudo)
+            else:
+                pseudo = "export E2FSPROGS_FAKE_TIME=%s; " % sde_time
+
+            # Set hash_seed to generate deterministic directory indexes
+            namespace = uuid.UUID("e7429877-e7b3-4a68-a5c9-2f2fdf33d460")
+            if self.fsuuid:
+                namespace = uuid.UUID(self.fsuuid)
+            hash_seed = str(uuid.uuid5(namespace, str(sde_time)))
+            extraopts += " -E hash_seed=%s" % hash_seed
+
         label_str = ""
         if self.label:
             label_str = "-L %s" % self.label
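For reference, a minimal sketch of the deterministic hash_seed derivation added above: the seed is a UUIDv5 of SOURCE_DATE_EPOCH in a fixed namespace (or one derived from the partition's fsuuid), so identical inputs always yield the same ext4 directory index. The epoch value below is illustrative.

    import uuid

    sde_time = 1700000000      # stands in for int(os.getenv('SOURCE_DATE_EPOCH'))
    fsuuid = None              # or the partition's --fsuuid value

    namespace = uuid.UUID("e7429877-e7b3-4a68-a5c9-2f2fdf33d460")
    if fsuuid:
        namespace = uuid.UUID(fsuuid)
    hash_seed = str(uuid.uuid5(namespace, str(sde_time)))
    extraopts = "-F -i 8192" + " -E hash_seed=%s" % hash_seed
    print(extraopts)
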
diff --git a/poky/scripts/lib/wic/plugins/source/empty.py b/poky/scripts/lib/wic/plugins/source/empty.py
index 9c492ca..0a9f5fa 100644
--- a/poky/scripts/lib/wic/plugins/source/empty.py
+++ b/poky/scripts/lib/wic/plugins/source/empty.py
@@ -9,9 +9,19 @@
 # To use it you must pass "empty" as argument for the "--source" parameter in
 # the wks file. For example:
 # part foo --source empty --ondisk sda --size="1024" --align 1024
+#
+# The plugin supports writing zeros to the start of the
+# partition. This is useful to overwrite old content like
+# filesystem signatures which may be re-recognized otherwise.
+# This feature can be enabled with
+# '--sourceparams="[fill|size=<N>[S|s|K|k|M|G]][,][bs=<N>[S|s|K|k|M|G]]"'
+# Conflicting or missing options throw errors.
 
 import logging
+import os
 
+from wic import WicError
+from wic.ksparser import sizetype
 from wic.pluginbase import SourcePlugin
 
 logger = logging.getLogger('wic')
@@ -19,6 +29,16 @@
 class EmptyPartitionPlugin(SourcePlugin):
     """
     Populate unformatted empty partition.
+
+    The following sourceparams are supported:
+    - fill
+      Fill the entire partition with zeros. Requires '--fixed-size' option
+      to be set.
+    - size=<N>[S|s|K|k|M|G]
+      Set the first N bytes of the partition to zero. Default unit is 'K'.
+    - bs=<N>[S|s|K|k|M|G]
+      Write at most N bytes at a time during source file creation.
+      Defaults to '1M'. Default unit is 'K'.
     """
 
     name = 'empty'
@@ -31,4 +51,39 @@
         Called to do the actual content population for a partition i.e. it
         'prepares' the partition to be incorporated into the image.
         """
-        return
+        get_byte_count = sizetype('K', True)
+        size = 0
+
+        if 'fill' in source_params and 'size' in source_params:
+            raise WicError("Conflicting source parameters 'fill' and 'size' specified, exiting.")
+
+        # Set the size of the zeros to be written to the partition
+        if 'fill' in source_params:
+            if part.fixed_size == 0:
+                raise WicError("Source parameter 'fill' only works with the '--fixed-size' option, exiting.")
+            size = get_byte_count(part.fixed_size)
+        elif 'size' in source_params:
+            size = get_byte_count(source_params['size'])
+
+        if size == 0:
+            # Nothing to do, create empty partition
+            return
+
+        if 'bs' in source_params:
+            bs = get_byte_count(source_params['bs'])
+        else:
+            bs = get_byte_count('1M')
+
+        # Create a binary file of the requested size filled with zeros
+        source_file = os.path.join(cr_workdir, 'empty-plugin-zeros%s.bin' % part.lineno)
+        if not os.path.exists(os.path.dirname(source_file)):
+            os.makedirs(os.path.dirname(source_file))
+
+        quotient, remainder = divmod(size, bs)
+        with open(source_file, 'wb') as file:
+            for _ in range(quotient):
+                file.write(bytearray(bs))
+            file.write(bytearray(remainder))
+
+        part.size = (size + 1024 - 1) // 1024  # size in KB rounded up
+        part.source_file = source_file
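For reference, a minimal sketch of the zero-fill that the 'empty' plugin now performs: the requested number of zero bytes is written in blocks of bs, and the partition size is rounded up to whole KiB. The sizes and output filename below are illustrative.

    size = 3 * 1024 * 1024 + 512   # bytes of zeros requested (illustrative)
    bs = 1024 * 1024               # block size, default '1M'

    quotient, remainder = divmod(size, bs)
    with open('empty-plugin-zeros.bin', 'wb') as f:
        for _ in range(quotient):
            f.write(bytearray(bs))       # full blocks of zeros
        f.write(bytearray(remainder))    # trailing partial block

    part_size_kb = (size + 1024 - 1) // 1024   # size in KiB, rounded up
    print(part_size_kb)
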
diff --git a/poky/scripts/postinst-intercepts/update_gtk_icon_cache b/poky/scripts/postinst-intercepts/update_gtk_icon_cache
index 99367a2..a92bd84 100644
--- a/poky/scripts/postinst-intercepts/update_gtk_icon_cache
+++ b/poky/scripts/postinst-intercepts/update_gtk_icon_cache
@@ -11,7 +11,11 @@
 
 for icondir in $D/usr/share/icons/*/ ; do
     if [ -d $icondir ] ; then
-        gtk-update-icon-cache -fqt  $icondir
+        for gtkuic_cmd in gtk-update-icon-cache gtk4-update-icon-cache ; do
+            if [ -n "$(which $gtkuic_cmd)" ]; then
+                $gtkuic_cmd -fqt  $icondir
+            fi
+        done
     fi
 done
 
diff --git a/poky/scripts/runqemu b/poky/scripts/runqemu
index 18aeb7f..69cd448 100755
--- a/poky/scripts/runqemu
+++ b/poky/scripts/runqemu
@@ -84,6 +84,7 @@
     publicvnc - enable a VNC server open to all hosts
     audio - enable audio
     guestagent - enable guest agent communication
+    qmp=<path> - create a QMP socket (defaults to unix:qmp.sock if unspecified)
     [*/]ovmf* - OVMF firmware file or base name for booting with UEFI
   tcpserial=<port> - specify tcp serial port number
   qemuparams=<xyz> - specify custom parameters to QEMU
@@ -221,6 +222,7 @@
         self.cleaned = False
         # Files to cleanup after run
         self.cleanup_files = []
+        self.qmp = None
         self.guest_agent = False
         self.guest_agent_sockpath = '/tmp/qga.sock'
 
@@ -369,11 +371,11 @@
         if p.endswith('.qemuboot.conf'):
             self.qemuboot = p
             self.qbconfload = True
-        elif re.search('\.bin$', p) or re.search('bzImage', p) or \
+        elif re.search('\\.bin$', p) or re.search('bzImage', p) or \
              re.search('zImage', p) or re.search('vmlinux', p) or \
              re.search('fitImage', p) or re.search('uImage', p):
             self.kernel =  p
-        elif os.path.exists(p) and (not os.path.isdir(p)) and '-image-' in os.path.basename(p):
+        elif os.path.isfile(p) and ('-image-' in os.path.basename(p) or '.rootfs.' in os.path.basename(p)):
             self.rootfs = p
             # Check filename against self.fstypes can handle <file>.cpio.gz,
             # otherwise, its type would be "gz", which is incorrect.
@@ -383,19 +385,19 @@
                     fst = t
                     break
             if not fst:
-                m = re.search('.*\.(.*)$', self.rootfs)
+                m = re.search('.*\\.(.*)$', self.rootfs)
                 if m:
                     fst =  m.group(1)
             if fst:
                 self.check_arg_fstype(fst)
-                qb = re.sub('\.' + fst + "$", '.qemuboot.conf', self.rootfs)
+                qb = re.sub('\\.' + fst + "$", '.qemuboot.conf', self.rootfs)
                 if os.path.exists(qb):
                     self.qemuboot = qb
                     self.qbconfload = True
                 else:
                     logger.warning("%s doesn't exist, will try to remove '.rootfs' from filename" % qb)
                     # They to remove .rootfs (IMAGE_NAME_SUFFIX) as well
-                    qb = re.sub('\.rootfs.qemuboot.conf$', '.qemuboot.conf', qb)
+                    qb = re.sub('\\.rootfs.qemuboot.conf$', '.qemuboot.conf', qb)
                     if os.path.exists(qb):
                         self.qemuboot = qb
                         self.qbconfload = True
@@ -536,6 +538,10 @@
                 self.qemu_opt_script += ' -vnc :0'
             elif arg == 'guestagent':
                 self.guest_agent = True
+            elif arg == "qmp":
+                self.qmp = "unix:qmp.sock"
+            elif arg.startswith("qmp="):
+                self.qmp = arg[len('qmp='):]
             elif arg.startswith('guestagent-sockpath='):
                 self.guest_agent_sockpath = '%s' % arg[len('guestagent-sockpath='):]
             elif arg.startswith('tcpserial='):
@@ -1406,6 +1412,10 @@
             self.qemu_opt += ' -device virtio-serial '
             self.qemu_opt += ' -device virtserialport,chardev=qga0,name=org.qemu.guest_agent.0 '
 
+    def setup_qmp(self):
+        if self.qmp:
+            self.qemu_opt += " -qmp %s,server,nowait" % self.qmp
+
     def setup_vga(self):
         if self.nographic == True:
             if self.sdl == True:
@@ -1547,6 +1557,7 @@
             self.qemu_opt += " -snapshot"
 
         self.setup_guest_agent()
+        self.setup_qmp()
         self.setup_serial()
         self.setup_vga()
 
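For reference, a minimal sketch of how the new qmp bootparam is handled above: a bare 'qmp' defaults the socket to unix:qmp.sock, 'qmp=<path>' overrides it, and setup_qmp() appends the matching -qmp option to the QEMU command line.

    def qmp_opt(arg):
        # mirrors the argument parsing and setup_qmp() additions above
        qmp = None
        if arg == "qmp":
            qmp = "unix:qmp.sock"
        elif arg.startswith("qmp="):
            qmp = arg[len("qmp="):]
        return " -qmp %s,server,nowait" % qmp if qmp else ""

    print(qmp_opt("qmp"))                      # ' -qmp unix:qmp.sock,server,nowait'
    print(qmp_opt("qmp=tcp:localhost:4444"))
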
diff --git a/poky/scripts/sstate-cache-management.py b/poky/scripts/sstate-cache-management.py
new file mode 100755
index 0000000..09b7aa2
--- /dev/null
+++ b/poky/scripts/sstate-cache-management.py
@@ -0,0 +1,329 @@
+#!/usr/bin/env python3
+#
+# Copyright OpenEmbedded Contributors
+#
+# SPDX-License-Identifier: MIT
+#
+
+import argparse
+import os
+import re
+import sys
+
+from collections import defaultdict
+from concurrent.futures import ThreadPoolExecutor
+from dataclasses import dataclass
+from pathlib import Path
+
+if sys.version_info < (3, 8, 0):
+    raise RuntimeError("Sorry, python 3.8.0 or later is required for this script.")
+
+SSTATE_PREFIX = "sstate:"
+SSTATE_EXTENSION = ".tar.zst"
+# SSTATE_EXTENSION = ".tgz"
+# .siginfo.done files are mentioned in the original script?
+SSTATE_SUFFIXES = (
+    SSTATE_EXTENSION,
+    f"{SSTATE_EXTENSION}.siginfo",
+    f"{SSTATE_EXTENSION}.done",
+)
+
+RE_SSTATE_PKGSPEC = re.compile(
+    rf"""sstate:(?P<pn>[^:]*):
+         (?P<package_target>[^:]*):
+         (?P<pv>[^:]*):
+         (?P<pr>[^:]*):
+         (?P<sstate_pkgarch>[^:]*):
+         (?P<sstate_version>[^_]*):
+         (?P<bb_unihash>[^_]*)_
+         (?P<bb_task>[^:]*)
+         (?P<ext>({"|".join([re.escape(s) for s in SSTATE_SUFFIXES])}))$""",
+    re.X,
+)
+
+
+# Really we'd like something like a Path subclass which implements a stat
+# cache here, unfortunately there's no good way to do that transparently
+# (yet); see:
+#
+# https://github.com/python/cpython/issues/70219
+# https://discuss.python.org/t/make-pathlib-extensible/3428/77
+@dataclass
+class SstateEntry:
+    """Class for keeping track of an entry in sstate-cache."""
+
+    path: Path
+    match: re.Match
+    stat_result: os.stat_result = None
+
+    def __hash__(self):
+        return self.path.__hash__()
+
+    def __getattr__(self, name):
+        return self.match.group(name)
+
+
+# this is what's in the original script; as far as I can tell, it's an
+# implementation artefact which we don't need?
+def find_archs():
+    # all_archs
+    builder_arch = os.uname().machine
+
+    # FIXME
+    layer_paths = [Path("../..")]
+
+    tune_archs = set()
+    re_tune = re.compile(r'AVAILTUNES .*=.*"(.*)"')
+    for path in layer_paths:
+        for tunefile in [
+            p for p in path.glob("meta*/conf/machine/include/**/*") if p.is_file()
+        ]:
+            with open(tunefile) as f:
+                for line in f:
+                    m = re_tune.match(line)
+                    if m:
+                        tune_archs.update(m.group(1).split())
+
+    # all_machines
+    machine_archs = set()
+    for path in layer_paths:
+        for machine_file in path.glob("meta*/conf/machine/*.conf"):
+            machine_archs.add(machine_file.parts[-1][:-5])
+
+    extra_archs = set()
+    all_archs = (
+        set(
+            arch.replace("-", "_")
+            for arch in machine_archs | tune_archs | set(["allarch", builder_arch])
+        )
+        | extra_archs
+    )
+
+    print(all_archs)
+
+
+# again, not needed?
+def find_tasks():
+    print(set([p.bb_task for p in paths]))
+
+
+def collect_sstate_paths(args):
+    def scandir(path, paths):
+        # Assume everything is a directory; by not checking we avoid needing an
+        # additional stat which is potentially a synchronous roundtrip over NFS
+        try:
+            for p in path.iterdir():
+                filename = p.parts[-1]
+                if filename.startswith(SSTATE_PREFIX):
+                    if filename.endswith(SSTATE_SUFFIXES):
+                        m = RE_SSTATE_PKGSPEC.match(p.parts[-1])
+                        assert m
+                        paths.add(SstateEntry(p, m))
+                    # ignore other things (includes things like lockfiles)
+                else:
+                    scandir(p, paths)
+
+        except NotADirectoryError:
+            pass
+
+    paths = set()
+    # TODO: parallelise scandir
+    scandir(Path(args.cache_dir), paths)
+
+    def path_stat(p):
+        p.stat_result = p.path.lstat()
+
+    if args.remove_duplicated:
+        # This is probably slightly performance negative on a local filesystem
+        # when we interact with the GIL; over NFS it's a massive win.
+        with ThreadPoolExecutor(max_workers=args.jobs) as executor:
+            executor.map(path_stat, paths)
+
+    return paths
+
+
+def remove_by_stamps(args, paths):
+    all_sums = set()
+    for stamps_dir in args.stamps_dir:
+        stamps_path = Path(stamps_dir)
+        assert stamps_path.is_dir()
+        re_sigdata = re.compile(r"do_.*.sigdata\.([^.]*)")
+        all_sums |= set(
+            [
+                re_sigdata.search(x.parts[-1]).group(1)
+                for x in stamps_path.glob("*/*/*.do_*.sigdata.*")
+            ]
+        )
+        re_setscene = re.compile(r"do_.*_setscene\.([^.]*)")
+        all_sums |= set(
+            [
+                re_setscene.search(x.parts[-1]).group(1)
+                for x in stamps_path.glob("*/*/*.do_*_setscene.*")
+            ]
+        )
+    return [p for p in paths if p.bb_unihash not in all_sums]
+
+
+def remove_duplicated(args, paths):
+    # Skip populate_lic as it produces duplicates in a normal build
+    #
+    # 9ae16469e707 sstate-cache-management: skip populate_lic archives when removing duplicates
+    valid_paths = [p for p in paths if p.bb_task != "populate_lic"]
+
+    keep = dict()
+    remove = list()
+    for p in valid_paths:
+        sstate_sig = ":".join([p.pn, p.sstate_pkgarch, p.bb_task, p.ext])
+        if sstate_sig not in keep:
+            keep[sstate_sig] = p
+        elif p.stat_result.st_mtime > keep[sstate_sig].stat_result.st_mtime:
+            remove.append(keep[sstate_sig])
+            keep[sstate_sig] = p
+        else:
+            remove.append(p)
+
+    return remove
+
+
+def remove_orphans(args, paths):
+    remove = list()
+    pathsigs = defaultdict(list)
+    for p in paths:
+        sstate_sig = ":".join([p.pn, p.sstate_pkgarch, p.bb_task])
+        pathsigs[sstate_sig].append(p)
+    for k, v in pathsigs.items():
+        if len([p for p in v if p.ext == SSTATE_EXTENSION]) == 0:
+            remove.extend(v)
+    return remove
+
+
+def parse_arguments():
+    parser = argparse.ArgumentParser(description="sstate cache management utility.")
+
+    parser.add_argument(
+        "--cache-dir",
+        default=os.environ.get("SSTATE_CACHE_DIR"),
+        help="""Specify sstate cache directory, will use the environment
+            variable SSTATE_CACHE_DIR if it is not specified.""",
+    )
+
+    # parser.add_argument(
+    #     "--extra-archs",
+    #     help="""Specify list of architectures which should be tested, this list
+    #         will be extended with native arch, allarch and empty arch. The
+    #         script won't be trying to generate list of available archs from
+    #         AVAILTUNES in tune files.""",
+    # )
+
+    # parser.add_argument(
+    #     "--extra-layer",
+    #     help="""Specify the layer which will be used for searching the archs,
+    #         it will search the meta and meta-* layers in the top dir by
+    #         default, and will search meta, meta-*, <layer1>, <layer2>,
+    #         ...<layern> when specified. Use "," as the separator.
+    #
+    #         This is useless for --stamps-dir or when --extra-archs is used.""",
+    # )
+
+    parser.add_argument(
+        "-d",
+        "--remove-duplicated",
+        action="store_true",
+        help="""Remove the duplicated sstate cache files of one package, only
+            the newest one will be kept. The duplicated sstate cache files
+            of one package must have the same arch, which means sstate cache
+            files with multiple archs are not considered duplicate.
+
+            Conflicts with --stamps-dir.""",
+    )
+
+    parser.add_argument(
+        "--remove-orphans",
+        action="store_true",
+        help=f"""Remove orphan siginfo files from the sstate cache, i.e. those
+            where there is no {SSTATE_EXTENSION} file but there are associated
+            tracking files.""",
+    )
+
+    parser.add_argument(
+        "--stamps-dir",
+        action="append",
+        help="""Specify the build directory's stamps directories, the sstate
+            cache file which IS USED by these build directories will be KEPT,
+            other sstate cache files in cache-dir will be removed. Can be
+            specified multiple times for several directories.
+
+            Conflicts with --remove-duplicated.""",
+    )
+
+    parser.add_argument(
+        "-j", "--jobs", default=8, type=int, help="Run JOBS jobs in parallel."
+    )
+
+    # parser.add_argument(
+    #     "-L",
+    #     "--follow-symlink",
+    #     action="store_true",
+    #     help="Remove both the symbol link and the destination file, default: no.",
+    # )
+
+    parser.add_argument(
+        "-y",
+        "--yes",
+        action="store_true",
+        help="""Automatic yes to prompts; assume "yes" as answer to all prompts
+            and run non-interactively.""",
+    )
+
+    parser.add_argument(
+        "-v", "--verbose", action="store_true", help="Explain what is being done."
+    )
+
+    parser.add_argument(
+        "-D",
+        "--debug",
+        action="count",
+        default=0,
+        help="Show debug info, repeat for more debug info.",
+    )
+
+    args = parser.parse_args()
+    if args.cache_dir is None or (
+        not args.remove_duplicated and not args.stamps_dir and not args.remove_orphans
+    ):
+        parser.print_usage()
+        sys.exit(1)
+
+    return args
+
+
+def main():
+    args = parse_arguments()
+
+    paths = collect_sstate_paths(args)
+    if args.remove_duplicated:
+        remove = remove_duplicated(args, paths)
+    elif args.stamps_dir:
+        remove = remove_by_stamps(args, paths)
+    else:
+        remove = list()
+
+    if args.remove_orphans:
+        remove = set(remove) | set(remove_orphans(args, paths))
+
+    if args.debug >= 1:
+        print("\n".join([str(p.path) for p in remove]))
+    print(f"{len(remove)} out of {len(paths)} files will be removed!")
+    if not args.yes:
+        print("Do you want to continue (y/n)?")
+        confirm = input() in ("y", "Y")
+    else:
+        confirm = True
+    if confirm:
+        # TODO: parallelise remove
+        for p in remove:
+            p.path.unlink()
+
+
+if __name__ == "__main__":
+    main()
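For reference, a minimal sketch of what RE_SSTATE_PKGSPEC extracts from a cache entry name; the pattern here is a simplified equivalent with the suffix alternation written out, and the filename is illustrative, not a real cache entry.

    import re

    RE = re.compile(
        r"""sstate:(?P<pn>[^:]*):(?P<package_target>[^:]*):(?P<pv>[^:]*):
            (?P<pr>[^:]*):(?P<sstate_pkgarch>[^:]*):(?P<sstate_version>[^_]*):
            (?P<bb_unihash>[^_]*)_(?P<bb_task>[^:]*)
            (?P<ext>\.tar\.zst(\.siginfo|\.done)?)$""",
        re.X,
    )

    name = "sstate:zlib:core2-64-poky-linux:1.3:r0:core2-64:10:0123abcd_populate_sysroot.tar.zst"
    m = RE.match(name)
    print(m.group("pn"), m.group("bb_task"), m.group("ext"))
    # zlib populate_sysroot .tar.zst
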
diff --git a/poky/scripts/sstate-cache-management.sh b/poky/scripts/sstate-cache-management.sh
deleted file mode 100755
index d39671f..0000000
--- a/poky/scripts/sstate-cache-management.sh
+++ /dev/null
@@ -1,458 +0,0 @@
-#!/bin/bash
-
-#  Copyright (c) 2012 Wind River Systems, Inc.
-#
-# SPDX-License-Identifier: GPL-2.0-only
-#
-
-# Global vars
-cache_dir=
-confirm=
-fsym=
-total_deleted=0
-verbose=
-debug=0
-
-usage () {
-  cat << EOF
-Welcome to sstate cache management utilities.
-sstate-cache-management.sh <OPTION>
-
-Options:
-  -h, --help
-        Display this help and exit.
-
-  --cache-dir=<sstate cache dir>
-        Specify sstate cache directory, will use the environment
-        variable SSTATE_CACHE_DIR if it is not specified.
-
-  --extra-archs=<arch1>,<arch2>...<archn>
-        Specify list of architectures which should be tested, this list
-        will be extended with native arch, allarch and empty arch. The
-        script won't be trying to generate list of available archs from
-        AVAILTUNES in tune files.
-
-  --extra-layer=<layer1>,<layer2>...<layern>
-        Specify the layer which will be used for searching the archs,
-        it will search the meta and meta-* layers in the top dir by
-        default, and will search meta, meta-*, <layer1>, <layer2>,
-        ...<layern> when specified. Use "," as the separator.
-
-        This is useless for --stamps-dir or when --extra-archs is used.
-
-  -d, --remove-duplicated
-        Remove the duplicated sstate cache files of one package, only
-        the newest one will be kept. The duplicated sstate cache files
-        of one package must have the same arch, which means sstate cache
-        files with multiple archs are not considered duplicate.
-
-        Conflicts with --stamps-dir.
-
-  --stamps-dir=<dir1>,<dir2>...<dirn>
-        Specify the build directory's stamps directories, the sstate
-        cache file which IS USED by these build diretories will be KEPT,
-        other sstate cache files in cache-dir will be removed. Use ","
-        as the separator. For example:
-        --stamps-dir=build1/tmp/stamps,build2/tmp/stamps
-
-        Conflicts with --remove-duplicated.
-
-  -L, --follow-symlink
-        Remove both the symbol link and the destination file, default: no.
-
-  -y, --yes
-        Automatic yes to prompts; assume "yes" as answer to all prompts
-        and run non-interactively.
-
-  -v, --verbose
-        Explain what is being done.
-
-  -D, --debug
-        Show debug info, repeat for more debug info.
-
-EOF
-}
-
-if [ $# -lt 1 ]; then
-  usage
-  exit 0
-fi
-
-# Echo no files to remove
-no_files () {
-    echo No files to remove
-}
-
-# Echo nothing to do
-do_nothing () {
-   echo Nothing to do
-}
-
-# Read the input "y"
-read_confirm () {
-  echo "$total_deleted out of $total_files files will be removed! "
-  if [ "$confirm" != "y" ]; then
-      echo "Do you want to continue (y/n)? "
-      while read confirm; do
-          [ "$confirm" = "Y" -o "$confirm" = "y" -o "$confirm" = "n" \
-            -o "$confirm" = "N" ] && break
-          echo "Invalid input \"$confirm\", please input 'y' or 'n': "
-      done
-  else
-      echo
-  fi
-}
-
-# Print error information and exit.
-echo_error () {
-  echo "ERROR: $1" >&2
-  exit 1
-}
-
-# Generate the remove list:
-#
-# * Add .done/.siginfo to the remove list
-# * Add destination of symlink to the remove list
-#
-# $1: output file, others: sstate cache file (.tar.zst)
-gen_rmlist (){
-  local rmlist_file="$1"
-  shift
-  local files="$@"
-  for i in $files; do
-      echo $i >> $rmlist_file
-      # Add the ".siginfo"
-      if [ -e $i.siginfo ]; then
-          echo $i.siginfo >> $rmlist_file
-      fi
-      # Add the destination of symlink
-      if [ -L "$i" ]; then
-          if [ "$fsym" = "y" ]; then
-              dest="`readlink -e $i`"
-              if [ -n "$dest" ]; then
-                  echo $dest >> $rmlist_file
-                  # Remove the .siginfo when .tar.zst is removed
-                  if [ -f "$dest.siginfo" ]; then
-                      echo $dest.siginfo >> $rmlist_file
-                  fi
-              fi
-          fi
-          # Add the ".tar.zst.done" and ".siginfo.done" (may exist in the future)
-          base_fn="${i##/*/}"
-          t_fn="$base_fn.done"
-          s_fn="$base_fn.siginfo.done"
-          for d in $t_fn $s_fn; do
-              if [ -f $cache_dir/$d ]; then
-                  echo $cache_dir/$d >> $rmlist_file
-              fi
-          done
-      fi
-  done
-}
-
-# Remove the duplicated cache files for the pkg, keep the newest one
-remove_duplicated () {
-
-  local topdir
-  local oe_core_dir
-  local tunedirs
-  local all_archs
-  local all_machines
-  local ava_archs
-  local arch
-  local file_names
-  local sstate_files_list
-  local fn_tmp
-  local list_suffix=`mktemp` || exit 1
-
-  if [ -z "$extra_archs" ] ; then
-    # Find out the archs in all the layers
-    echo "Figuring out the archs in the layers ... "
-    oe_core_dir=$(dirname $(dirname $(readlink -e $0)))
-    topdir=$(dirname $oe_core_dir)
-    tunedirs="`find $topdir/meta* ${oe_core_dir}/meta* $layers -path '*/meta*/conf/machine/include' 2>/dev/null`"
-    [ -n "$tunedirs" ] || echo_error "Can't find the tune directory"
-    all_machines="`find $topdir/meta* ${oe_core_dir}/meta* $layers -path '*/meta*/conf/machine/*' -name '*.conf' 2>/dev/null | sed -e 's/.*\///' -e 's/.conf$//'`"
-    all_archs=`grep -r -h "^AVAILTUNES .*=" $tunedirs | sed -e 's/.*=//' -e 's/\"//g'`
-  fi
-
-  # Use the "_" to substitute "-", e.g., x86-64 to x86_64, but not for extra_archs which can be something like cortexa9t2-vfp-neon
-  # Sort to remove the duplicated ones
-  # Add allarch and builder arch (native)
-  builder_arch=$(uname -m)
-  all_archs="$(echo allarch $all_archs $all_machines $builder_arch \
-          | sed -e 's/-/_/g' -e 's/ /\n/g' | sort -u) $extra_archs"
-  echo "Done"
-
-  # Total number of files including sstate-, .siginfo and .done files
-  total_files=`find $cache_dir -name 'sstate*' | wc -l`
-  # Save all the sstate files in a file
-  sstate_files_list=`mktemp` || exit 1
-  find $cache_dir -iname 'sstate:*:*:*:*:*:*:*.tar.zst*' >$sstate_files_list
-
-  echo "Figuring out the suffixes in the sstate cache dir ... "
-  sstate_suffixes="`sed 's%.*/sstate:[^:]*:[^:]*:[^:]*:[^:]*:[^:]*:[^:]*:[^_]*_\([^:]*\)\.tar\.zst.*%\1%g' $sstate_files_list | sort -u`"
-  echo "Done"
-  echo "The following suffixes have been found in the cache dir:"
-  echo $sstate_suffixes
-
-  echo "Figuring out the archs in the sstate cache dir ... "
-  # Using this SSTATE_PKGSPEC definition it's 6th colon separated field
-  # SSTATE_PKGSPEC    = "sstate:${PN}:${PACKAGE_ARCH}${TARGET_VENDOR}-${TARGET_OS}:${PV}:${PR}:${SSTATE_PKGARCH}:${SSTATE_VERSION}:"
-  for arch in $all_archs; do
-      grep -q ".*/sstate:[^:]*:[^:]*:[^:]*:[^:]*:$arch:[^:]*:[^:]*\.tar\.zst$" $sstate_files_list
-      [ $? -eq 0 ] && ava_archs="$ava_archs $arch"
-      # ${builder_arch}_$arch used by toolchain sstate
-      grep -q ".*/sstate:[^:]*:[^:]*:[^:]*:[^:]*:${builder_arch}_$arch:[^:]*:[^:]*\.tar\.zst$" $sstate_files_list
-      [ $? -eq 0 ] && ava_archs="$ava_archs ${builder_arch}_$arch"
-  done
-  echo "Done"
-  echo "The following archs have been found in the cache dir:"
-  echo $ava_archs
-  echo ""
-
-  # Save the file list which needs to be removed
-  local remove_listdir=`mktemp -d` || exit 1
-  for suffix in $sstate_suffixes; do
-      if [ "$suffix" = "populate_lic" ] ; then
-          echo "Skipping populate_lic, because removing duplicates doesn't work correctly for them (use --stamps-dir instead)"
-          continue
-      fi
-      # Total number of files including .siginfo and .done files
-      total_files_suffix=`grep ".*/sstate:[^:]*:[^:]*:[^:]*:[^:]*:[^:]*:[^:]*:[^:_]*_$suffix\.tar\.zst.*" $sstate_files_list | wc -l 2>/dev/null`
-      total_archive_suffix=`grep ".*/sstate:[^:]*:[^:]*:[^:]*:[^:]*:[^:]*:[^:]*:[^:_]*_$suffix\.tar\.zst$" $sstate_files_list | wc -l 2>/dev/null`
-      # Save the file list to a file, some suffix's file may not exist
-      grep ".*/sstate:[^:]*:[^:]*:[^:]*:[^:]*:[^:]*:[^:]*:[^:_]*_$suffix\.tar\.zst.*" $sstate_files_list >$list_suffix 2>/dev/null
-      local deleted_archives=0
-      local deleted_files=0
-      for ext in tar.zst tar.zst.siginfo tar.zst.done; do
-          echo "Figuring out the sstate:xxx_$suffix.$ext ... "
-          # Uniq BPNs
-          file_names=`for arch in $ava_archs ""; do
-              sed -ne "s%.*/sstate:\([^:]*\):[^:]*:[^:]*:[^:]*:$arch:[^:]*:[^:]*\.${ext}$%\1%p" $list_suffix
-          done | sort -u`
-
-          fn_tmp=`mktemp` || exit 1
-          rm_list="$remove_listdir/sstate:xxx_$suffix"
-          for fn in $file_names; do
-              [ -z "$verbose" ] || echo "Analyzing sstate:$fn-xxx_$suffix.${ext}"
-              for arch in $ava_archs ""; do
-                  grep -h ".*/sstate:$fn:[^:]*:[^:]*:[^:]*:$arch:[^:]*:[^:]*\.${ext}$" $list_suffix >$fn_tmp
-                  if [ -s $fn_tmp ] ; then
-                      [ $debug -gt 1 ] && echo "Available files for $fn-$arch- with suffix $suffix.${ext}:" && cat $fn_tmp
-                      # Use the modification time
-                      to_del=$(ls -t $(cat $fn_tmp) | sed -n '1!p')
-                      [ $debug -gt 2 ] && echo "Considering to delete: $to_del"
-                      # The sstate file which is downloaded from the SSTATE_MIRROR is
-                      # put in SSTATE_DIR, and there is a symlink in SSTATE_DIR/??/ to
-                      # it, so filter it out from the remove list if it should not be
-                      # removed.
-                      to_keep=$(ls -t $(cat $fn_tmp) | sed -n '1p')
-                      [ $debug -gt 2 ] && echo "Considering to keep: $to_keep"
-                      for k in $to_keep; do
-                          if [ -L "$k" ]; then
-                              # The symlink's destination
-                              k_dest="`readlink -e $k`"
-                              # Maybe it is the one in cache_dir
-                              k_maybe="$cache_dir/${k##/*/}"
-                              # Remove it from the remove list if they are the same.
-                              if [ "$k_dest" = "$k_maybe" ]; then
-                                  to_del="`echo $to_del | sed 's#'\"$k_maybe\"'##g'`"
-                              fi
-                          fi
-                      done
-                      rm -f $fn_tmp
-                      [ $debug -gt 2 ] && echo "Decided to delete: $to_del"
-                      gen_rmlist $rm_list.$ext "$to_del"
-                  fi
-              done
-          done
-      done
-      deleted_archives=`cat $rm_list.* 2>/dev/null | grep "\.tar\.zst$" | wc -l`
-      deleted_files=`cat $rm_list.* 2>/dev/null | wc -l`
-      [ "$deleted_files" -gt 0 -a $debug -gt 0 ] && cat $rm_list.*
-      echo "($deleted_archives out of $total_archives_suffix .tar.zst files for $suffix suffix will be removed or $deleted_files out of $total_files_suffix when counting also .siginfo and .done files)"
-      let total_deleted=$total_deleted+$deleted_files
-  done
-  deleted_archives=0
-  rm_old_list=$remove_listdir/sstate-old-filenames
-  find $cache_dir -name 'sstate-*.tar.zst' >$rm_old_list
-  [ -s "$rm_old_list" ] && deleted_archives=`cat $rm_old_list | grep "\.tar\.zst$" | wc -l`
-  [ -s "$rm_old_list" ] && deleted_files=`cat $rm_old_list | wc -l`
-  [ -s "$rm_old_list" -a $debug -gt 0 ] && cat $rm_old_list
-  echo "($deleted_archives or .tar.zst files with old sstate-* filenames will be removed or $deleted_files when counting also .siginfo and .done files)"
-  let total_deleted=$total_deleted+$deleted_files
-
-  rm -f $list_suffix
-  rm -f $sstate_files_list
-  if [ $total_deleted -gt 0 ]; then
-      read_confirm
-      if [ "$confirm" = "y" -o "$confirm" = "Y" ]; then
-          for list in `ls $remove_listdir/`; do
-              echo "Removing $list.tar.zst archive (`cat $remove_listdir/$list | wc -w` files) ... "
-              # Remove them one by one to avoid the argument list too long error
-              for i in `cat $remove_listdir/$list`; do
-                  rm -f $verbose $i
-              done
-              echo "Done"
-          done
-          echo "$total_deleted files have been removed!"
-      else
-          do_nothing
-      fi
-  else
-       no_files
-  fi
-  [ -d $remove_listdir ] && rm -fr $remove_listdir
-}
-
-# Remove the sstate file by stamps dir, the file not used by the stamps dir
-# will be removed.
-rm_by_stamps (){
-
-  local cache_list=`mktemp` || exit 1
-  local keep_list=`mktemp` || exit 1
-  local rm_list=`mktemp` || exit 1
-  local sums
-  local all_sums
-
-  # Total number of files including sstate-, .siginfo and .done files
-  total_files=`find $cache_dir -type f -name 'sstate*' | wc -l`
-  # Save all the state file list to a file
-  find $cache_dir -type f -name 'sstate*' | sort -u -o $cache_list
-
-  echo "Figuring out the suffixes in the sstate cache dir ... "
-  local sstate_suffixes="`sed 's%.*/sstate:[^:]*:[^:]*:[^:]*:[^:]*:[^:]*:[^:]*:[^_]*_\([^:]*\)\.tar\.zst.*%\1%g' $cache_list | sort -u`"
-  echo "Done"
-  echo "The following suffixes have been found in the cache dir:"
-  echo $sstate_suffixes
-
-  # Figure out all the md5sums in the stamps dir.
-  echo "Figuring out all the md5sums in stamps dir ... "
-  for i in $sstate_suffixes; do
-      # There is no "\.sigdata" but "_setcene" when it is mirrored
-      # from the SSTATE_MIRRORS, use them to figure out the sum.
-      sums=`find $stamps -maxdepth 3 -name "*.do_$i.*" \
-        -o -name "*.do_${i}_setscene.*" | \
-        sed -ne 's#.*_setscene\.##p' -e 's#.*\.sigdata\.##p' | \
-        sed -e 's#\..*##' | sort -u`
-      all_sums="$all_sums $sums"
-  done
-  echo "Done"
-
-  echo "Figuring out the files which will be removed ... "
-  for i in $all_sums; do
-      grep ".*/sstate:[^:]*:[^:]*:[^:]*:[^:]*:[^:]*:[^:]*:${i}_.*" $cache_list >>$keep_list
-  done
-  echo "Done"
-
-  if [ -s $keep_list ]; then
-      sort -u $keep_list -o $keep_list
-      to_del=`comm -1 -3 $keep_list $cache_list`
-      gen_rmlist $rm_list "$to_del"
-      let total_deleted=`cat $rm_list | sort -u | wc -w`
-      if [ $total_deleted -gt 0 ]; then
-          [ $debug -gt 0 ] && cat $rm_list | sort -u
-          read_confirm
-          if [ "$confirm" = "y" -o "$confirm" = "Y" ]; then
-              echo "Removing sstate cache files ... ($total_deleted files)"
-              # Remove them one by one to avoid the argument list too long error
-              for i in `cat $rm_list | sort -u`; do
-                  rm -f $verbose $i
-              done
-              echo "$total_deleted files have been removed"
-          else
-              do_nothing
-          fi
-      else
-          no_files
-      fi
-  else
-      echo_error "All files in cache dir will be removed! Abort!"
-  fi
-
-  rm -f $cache_list
-  rm -f $keep_list
-  rm -f $rm_list
-}
-
-# Parse arguments
-while [ -n "$1" ]; do
-  case $1 in
-    --cache-dir=*)
-      cache_dir=`echo $1 | sed -e 's#^--cache-dir=##' | xargs readlink -e`
-      [ -d "$cache_dir" ] || echo_error "Invalid argument to --cache-dir"
-      shift
-        ;;
-    --remove-duplicated|-d)
-      rm_duplicated="y"
-      shift
-        ;;
-    --yes|-y)
-      confirm="y"
-      shift
-        ;;
-    --follow-symlink|-L)
-      fsym="y"
-      shift
-        ;;
-    --extra-archs=*)
-      extra_archs=`echo $1 | sed -e 's#^--extra-archs=##' -e 's#,# #g'`
-      [ -n "$extra_archs" ] || echo_error "Invalid extra arch parameter"
-      shift
-        ;;
-    --extra-layer=*)
-      extra_layers=`echo $1 | sed -e 's#^--extra-layer=##' -e 's#,# #g'`
-      [ -n "$extra_layers" ] || echo_error "Invalid extra layer parameter"
-      for i in $extra_layers; do
-          l=`readlink -e $i`
-          if [ -d "$l" ]; then
-              layers="$layers $l"
-          else
-              echo_error "Can't find layer $i"
-          fi
-      done
-      shift
-        ;;
-    --stamps-dir=*)
-      stamps=`echo $1 | sed -e 's#^--stamps-dir=##' -e 's#,# #g'`
-      [ -n "$stamps" ] || echo_error "Invalid stamps dir $i"
-      for i in $stamps; do
-          [ -d "$i" ] || echo_error "Invalid stamps dir $i"
-      done
-      shift
-        ;;
-    --verbose|-v)
-      verbose="-v"
-      shift
-        ;;
-    --debug|-D)
-      debug=`expr $debug + 1`
-      echo "Debug level $debug"
-      shift
-        ;;
-    --help|-h)
-      usage
-      exit 0
-        ;;
-    *)
-      echo "Invalid arguments $*"
-      echo_error "Try 'sstate-cache-management.sh -h' for more information."
-        ;;
-  esac
-done
-
-# sstate cache directory, use environment variable SSTATE_CACHE_DIR
-# if it was not specified, otherwise, error.
-[ -n "$cache_dir" ] || cache_dir=$SSTATE_CACHE_DIR
-[ -n "$cache_dir" ] || echo_error "No cache dir found!"
-[ -d "$cache_dir" ] || echo_error "Invalid cache directory \"$cache_dir\""
-
-[ -n "$rm_duplicated" -a -n "$stamps" ] && \
-    echo_error "Can not use both --remove-duplicated and --stamps-dir"
-
-[ "$rm_duplicated" = "y" ] && remove_duplicated
-[ -n "$stamps" ] && rm_by_stamps
-[ -z "$rm_duplicated" -a -z "$stamps" ] && \
-    echo "What do you want to do?"
-exit 0