subtree updates

poky: 348d9aba33..fc8e5d7c13:
  Adithya Balakumar (1):
        wic: implement reproducible Disk GUID

  Adrian Freihofer (20):
        cmake.bbclass: use --install
        devtool: support plugins with plugins
        devtool: refactor exec_fakeroot
        devtool: refactor deploy to use exec_fakeroot_no_d
        devtool: refactor deploy-target
        recipetool: cleanup imports
        oeqa: replace deprecated assertEquals
        oeqa/selftest/recipetool: fix for python 3.12
        oeqa/selftest/oelib/buildhistory: git default branch
        scripts: python 3.12 regex
        feature-microblaze-versions.inc: python 3.12 regex
        meta/lib/oeqa: python 3.12 regex
        meta/lib/patchtest: python 3.12 regex
        meta/recipes: python 3.12 regex
        bitbake: bitbake/lib/bs4/tests/test_tree.py: python 3.12 regex
        devtool: new ide-sdk plugin
        oe-selftest devtool: ide-sdk tests
        devtool: ide-sdk make deploy-target quicker
        vscode: drop .vscode folder
        oe-init-build-env: generate .vscode from template

  Aleksey Smirnov (2):
        conf/machine: Add Power8 tune to PowerPC architecture
        busybox: Explicitly specify tty device for serial consoles

  Alex Kiernan (1):
        wireless-regdb: Upgrade 2023.09.01 -> 2024.01.23

  Alex Stewart (3):
        opkg: upgrade to 0.6.3
        opkg: add deprecation warning for internal solver
        opkg-arch-config: update recipe HOMEPAGE

  Alexander Kanavin (26):
        sysroot user management postinsts: run with /bin/sh -e to report errors when they happen
        classes/multilib: expand PACKAGE_WRITE_DEPS in addition to DEPENDS
        classes/staging: capture output of sysroot postinsts into logs
        classes/package_rpm: write file permissions and ownership explicitly into .spec
        classes/package_rpm: use weak user/group dependencies
        classes/package_rpm: set bogus locations for passwd/group files
        oeqa/runtime/rpm: fail tests if test rpm file cannot be found
        rpm: update 4.18.1 -> 4.19.1
        classes/package_rpm: correctly escape percent characters
        selftest/cdn tests: check for exceptions also in fetcher diagnostics
        rpm: override curl executable search with just 'curl'
        classes/package_rpm: additionally escape \ and " in filenames
        classes/package_rpm: streamline the logic in one of the condition blocks
        lzlib: add a recipe
        file: enable additional internal compressor support
        selftest/SStateCacheManagement: do not manipulate ERROR_QA
        selftest/SStateCacheManagement: pre-populate the cache
        shadow: add a packageconfig for logind support
        meta/conf/templates/default/conf-notes.txt: remove
        scripts/oe-setup-layers: write a list of layer paths into the checkout's top dir
        meta/conf/templates/default/conf-summary.txt: add a template summary
        meta/lib/bblayers/buildconf.py: add support for configuration summaries
        scripts/oe-setup-builddir: add support for configuration summaries
        oe-setup-build: add a tool for discovering config templates and setting up builds
        meta-poky/conf/templates/default/conf-summary.txt: add a template summary
        bitbake: Revert "bitbake: wget.py: always use the custom user agent"

  Alexis Lothoré (3):
        patchtest-send-results: remove unused variable
        patchtest-send-results: properly parse test status
        testimage: retrieve ptests directory when ptests fail

  André Draszik (4):
        sstate-cache-management: fix regex for 'sigdata' stamp files
        bitbake: fetch/git2: support git's safe.bareRepository
        bitbake: tests/fetch: support git's safe.bareRepository
        bitbake: git-make-shallow: support git's safe.bareRepository

  Anibal Limon (1):
        ptest-runner: Bump to 2.4.3 (92c1b97)

  Anuj Mittal (8):
        enchant2: upgrade 2.6.5 -> 2.6.7
        libproxy: upgrade 0.5.3 -> 0.5.4
        sqlite3: upgrade 3.44.2 -> 3.45.1
        orc: upgrade 0.4.36 -> 0.4.37
        stress-ng: upgrade 0.17.04 -> 0.17.05
        libcap-ng: fix build with swig 4.2.0
        gstreamer1.0: upgrade 1.22.9 -> 1.22.10
        swig: upgrade 4.1.1 -> 4.2.0

  Bruce Ashfield (13):
        lttng-modules: fix v6.8+ build
        linux-yocto-dev: update to v6.8
        linux-yocto/6.6: features/qat/qat.cfg: enable CONFIG_PCIEAER
        linux-yocto/6.6: beaglebone: drop nonassignable kernel options
        linux-yocto/6.6: update to v6.6.13
        linux-yocto/6.6: update CVE exclusions
        linux-yocto/6.6: can: drop obsolete CONFIG_PCH_CAN
        linux-yocto/6.6: update to v6.6.15
        linux-yocto/6.6: update CVE exclusions
        yocto-bsp: update reference boards to v6.6.15
        linux-yocto/6.6: update to v6.6.16
        linux-yocto/6.6: update CVE exclusions
        linux-yocto/6.6: qemuriscv: enable goldfish RTC

  Chen Qi (5):
        multilib_global.bbclass: fix parsing error with no kernel module split
        gnupg: disable tests to avoid running target binaries at build time
        bitbake: fetch2/git.py: fix a corner case in try_premirror
        bitbake: tests/fetch.py: add test case for using premirror in restricted network
        bitbake: fetch2/git.py: add comment in try_premirrors

  Chi Xu (1):
        xz: Add ptest support

  Claus Stovgaard (2):
        kernel-devsrc: fix RDEPENDS for make
        kernel-devsrc: RDEPENDS on gawk

  Clément Péron (1):
        libpcap: extend with nativesdk

  Colin McAllister (1):
        initscripts: Add custom mount args for /var/lib

  David Reyna (1):
        bitbake: taskexp_ncurses: ncurses version of taskexp.py

  Denys Dmytriyenko (3):
        lttng-modules: upgrade 2.13.10 -> 2.13.11
        zlib: upgrade 1.3 -> 1.3.1
        xz: upgrade 5.4.5 -> 5.4.6

  Enguerrand de Ribaucourt (3):
        devtool: ide_sdk: Use bitbake's python3 for generated scripts
        devtool: ide: vscode: Configure read-only files
        meson: use absolute cross-compiler paths

  Enrico Jörns (1):
        rootfs-postcommands: remove make_zimage_symlink_relative()

  Etienne Cordonnier (1):
        dropbear: remove unnecessary line

  Fabien Mahot (1):
        ldconfig-native: Fix to point correctly on the DT_NEEDED entries in an ELF file

  Fabio Estevam (3):
        piglit: Update to latest revision
        mesa: Upgrade 23.3.3 -> 23.3.4
        mesa: Upgrade 23.3.4 -> 23.3.5

  Jamin Lin (3):
        uboot-sign: set load address and entrypoint
        uboot-sign: Fix to install nonexistent dtb file
        u-boot-sign:uboot-config: support to verify signed FIT image

  Jermain Horsman (2):
        bitbake-layers: Add ability to update the reference of repositories
        bitbake-layers: Add test case layers setup for custom references

  Joe Slater (1):
        eudev: allow for predictable network interface names

  Johannes Schneider (2):
        initramfs-framework: overlayroot: fix kernel commandline clash
        initramfs-framework: overlayroot: align bootparams with module name

  Jon Mason (2):
        tunes/sve: Add support for sve2 instructions
        arm/armv*: add all the Arm tunes in GCC 13.2.0

  Jonathan GUILLOT (3):
        lib/oe/package: replace in place PN-locale-* packages in PACKAGES
        lib/oe/package: add LOCALE_PATHS to define all locations for locales
        cups: use LOCALE_PATHS to split localized HTML templates

  Jose Quaresma (3):
        go: update 1.20.12 -> 1.20.13
        systemd: pack pre-defined pcrlock files installed with tpm2
        qemu: disable AF_XDP network backend support

  Joshua Watt (8):
        bitbake: hashserv: Add Unihash Garbage Collection
        bitbake: hashserv: sqlalchemy: Use _execute() helper
        bitbake: hashserv: Add unihash-exists API
        bitbake: asyncrpc: Add Client Pool object
        bitbake: hashserv: Add Client Pool
        bitbake: siggen: Add parallel query API
        bitbake: siggen: Add parallel unihash exist API
        sstatesig: Implement new siggen API

  Kai Kang (2):
        rpm: fix dependency for package config imaevm
        ghostscript: correct LICENSE with AGPLv3

  Khem Raj (27):
        elfutils: Fix build with gcc trunk
        python3: Initialize struct termios before calling tcgetattr()
        qemu: Replace the basename patch with backport
        xwayland: Upgrade 23.2.3 -> 23.2.4
        armv8/armv9: Avoid using -march when -mcpu is chosen
        kexec-tools: Fix build with gas 2.42
        systemtap: Backport GCC-14 related calloc fixes
        sdk/assimp.py: Fix build on 32bit arches with 64bit time_t
        binutils: Upgrade to binutils 2.42
        qemu-native: Use inherit_defer for including native class
        syslinux: Disable error on implicit-function-declaration
        glibc: Upgrade to 2.39
        strace: Upgrade to 6.7
        rust/cargo: Build fixes to rust for rv32 target
        buildcpio.py: Switch to using cpio-2.15
        ptest.bbclass: Handle the case when Makefile does not exist in do_install_ptest_base
        kernel-devsrc: Add needed fixes for 6.1+ kernel build on target on RISCV
        python3: Fix ptests with expat 2.6+
        expat: Upgrade to 2.6.0
        gcc-runtime: Move gdb pretty printer file to auto-load location
        core-image-ptest: Increase disk size to 1.5G for strace ptest image
        tcmode-default: Do not define LLVMVERSION
        glibc: Update to latest on 2.39
        glibc: Update to bring mips32/clone3 fix
        piglit: Fix build with musl
        llvm: Upgrade to LLVM-18 RC2
        binutils: Update to tip of 2.42 release branch

  Konrad Weihmann (1):
        python3-yamllint: add missing dependency

  Lee Chee Yang (1):
        migration-guide: add release notes for 4.0.16

  Maanya Goenka (2):
        toolchain-shar-relocate: allow 'find' access to libraries in symlinked directories
        bash: nativesdk-bash does not provide /bin/bash so don't claim to

  Marek Vasut (1):
        Revert "lzop: remove recipe from oe-core"

  Mark Hatle (5):
        qemu: Allow native and nativesdk versions on Linux older than 4.17
        tune-cortexa78.inc: Add cortexa78 tune, based on cortexa77
        feature-arm-vfp.inc: Allow hard-float on newer simd targets
        tune-cortexr5: Add hard-float variant
        tune-cortexr52: Add hard-float variant

  Markus Volk (6):
        gtk4: update 4.12.4 -> 4.12.5
        mesa: update 23.3.5 -> 24.0.0
        mesa: update 24.0.0 -> 24.0.1
        libadwaita: update 1.4.2 -> 1.4.3
        wayland-protocols: update 1.32 -> 1.33
        ell: update 0.61 -> 0.62

  Martin Jansa (5):
        qemu: fix target build with ccache enabled
        package_manager: ipk: add OPKG_MAKE_INDEX_EXTRA_PARAMS variable
        package_rpm: add RPMBUILD_EXTRA_PARAMS variable
        bitbake: bitbake-diffsigs: fix walking the task dependencies and show better error
        bitbake: tests: fetch.py: use real subversion repository

  Michael Opdenacker (9):
        dev-manual: start: remove idle line
        docs: remove support for mickledore (4.2) release
        release-notes-4.3: fix spacing
        alsa-lib: upgrade 1.2.10 -> 1.2.11
        alsa-tools: upgrade 1.2.5 -> 1.2.11
        alsa-ucm-conf: upgrade 1.2.10 -> 1.2.11
        alsa-utils: upgrade 1.2.10 -> 1.2.11
        oeqa/runtime/cases: fix typo in information message
        bitbake: doc: README: simpler link to contributor guide

  Michal Sieron (1):
        sanity.bbclass: raise_sanity_error if /tmp is noexec

  Nick Owens (1):
        systemd: recommend libelf, libdw for elfutils flag

  Ola x Nilsson (1):
        python3-numpy: Use Large File Support version of fallocate

  Paul Gortmaker (1):
        bitbake: hashserv: improve the loglevel error message to be more helpful

  Pavel Zhukov (3):
        systemd.bbclass: Check for existence of the symlink too
        bitbake: fetch2/git.py: Fetch mirror into HEAD
        bitbake: tests/fetch.py: add multiple fetches test

  Peter Kjellerstedt (12):
        devtool: modify: Correct appending of type=git-dependency to URIs
        devtool: standard: Add some missing whitespace
        devtool: _extract_source: Correct the removal of an old backup directory
        bitbake: tests/fetch: Make test_git_latest_versionstring support a max version
        bitbake: fetch2/git: A bit of clean-up of latest_versionstring()
        bitbake: fetch2/git: Make latest_versionstring extract tags with slashes correctly
        lib/oe/patch: Make extractPatches() not extract ignored commits
        lib/oe/patch: Add GitApplyTree.commitIgnored()
        devtool: Make use of oe.patch.GitApplyTree.commitIgnored()
        patch.bbclass: Make use of oe.patch.GitApplyTree.commitIgnored()
        lib/oe/patch: Use git notes to store the filenames for the patches
        insane.bbclass: Allow the warning about virtual/ to be disabled

  Peter Marko (2):
        openssl: Upgrade 3.2.0 -> 3.2.1
        util-linux: add alternative link for scriptreplay

  Petr Vorel (1):
        ltp: Update to 20240129

  Philip Lorenz (1):
        ipk: Remove temporary package lists during SDK creation

  Priyal Doshi (1):
        tzdata : Upgrade to 2024a

  Quentin Schulz (1):
        u-boot: add missing dependency on pyelftools-native

  Randolph Sapp (1):
        mirrors.bbclass: add infraroot as an https mirror

  Randy MacLeod (4):
        valgrind: make ptest depend on all components
        valgrind: update from 3.21.0 to 3.22.0
        valgrind: skip 14 ptests in 3.22
        valgrind: Skip 22 arm64 ptests

  Richard Purdie (34):
        oeqa/qemurunner: Handle rare shutdown race
        pseudo: Update to pull in gcc14 fix and missing statvfs64 intercept
        numactl: upgrade 2.0.16 -> 2.0.17
        conf: Move selftest config to dedicated inc file
        oeqa/selftest/bbtests: Tweak to use no-gplv3 inc file
        python3-markupsafe: upgrade 2.1.3 -> 2.1.5
        python3-markupsafe: Switch to python_setuptools_build_meta
        qemu: Upgrade 8.2.0 -> 8.2.1
        ltp: Enable extra test groups
        ltp: Try re-enabling problematic test
        meta-yocto-bsp: Remove accidentally added files
        oeqa/runtime: Move files from oe-core to bsp layer
        mirrors: Allow shallow glibc to work correctly
        ptest-packagelists: Mark python3 as problematic on riscv64
        kernel-devsrc: Clean up whitespace
        selftest/recipetool: Factor tomllib test to a function
        selftest/recipetool: Improve test failure output
        layer.conf: Update for the scarthgap release series
        layer.conf: Update for the scarthgap release series
        bitbake: process: Add profile logging for main loop
        bitbake: process/server: Fix typo
        kernel-arch: Simplify strip support
        insane: Clarify runtime/ warning
        bitbake: runqueue: Improve performance for executing tasks
        bitbake: runqueue: Optimise taskname lookups in next_buildable_task
        bitbake: runqueue: Improve setscene performance when encountering many 'hard' dependencies
        openssh: Add a work around for ICE on mips/mips64
        kernel-devsrc: Improve vdso-offsets handling for qemuriscv64
        u-boot: Pass in prefix mapping variables to the compiler
        testsdk: Avoid PATH contamination
        oeqa/selftest/rust: Exclude failing riscv tests
        bitbake: bitbake: Bump version to 2.7.3 for hashserv changes
        sanity.conf: Require bitbake 2.7.3
        python: Drop ${PYTHON_PN}

  Robert Joslyn (2):
        curl: Update to 8.6.0
        gtk: Set CVE_PRODUCT

  Robert Yang (1):
        gnu-config: Update to latest version

  Ross Burton (13):
        grub2: ignore CVE-2023-4001, this is Red Hat-specific
        openssl: backport fix for CVE-2023-6129
        lib/oeqa: rename assertRaisesRegexp to assertRaisesRegex
        oeqa/selftest/recipetool: downgrade meson version to not use pyproject.toml
        recipetool: don't dump stack traces if a toml parser can't be found
        xz: remove redundant PTEST_ENABLED conditional
        libpam: remove redundant PTEST_ENABLED conditional
        glib-2.0: backport memory monitor test fixes
        python3: move dataclasses to python3-core
        python3-unittest-automake-output: upgrade to 0.2
        meson: remove TMPDIR workaround
        meson: set the sysroot in the cross files
        libffi: upgrade to 3.4.5

  Simone Weiß (12):
        gnutls: Upgrade 3.8.2 -> 3.8.3
        maintainers.inc: Add self for libseccomp and gnutls
        bsp-guide: correct formfactor recipe name
        dev-manual: gen-tapdevs need iptables installed
        gnutls: print log if ptest fails
        patchtest: log errors and failures at end
        grub2: ignore CVE-2024-1048, Redhat only issue
        libgit2: update 1.7.1 -> 1.7.2
        libuv: Upgrade 1.47.0 -> 1.48.0
        qemu: Set CVE_STATUS for wrong CVEs
        patchtest: Add selftest for test cve_check_ignore
        patchtest: add stronger indication for failed tests

  Siong W.LIM (1):
        useradd.bbclass: Fix missing space when appending vardeps.

  Thomas Perrot (2):
        opensbi: append LDFLAGS to TARGET_CC_ARCH
        bitbake: wget.py: always use the custom user agent

  Tim Orling (13):
        libxml-parser-perl: upgrade 2.46 -> 2.47
        python3-pyyaml: add PACKAGECONFIG for libyaml
        python3-pyyaml: enable ptest
        python3-cryptography: upgrade 41.0.7 to 42.0.2
        openssh: upgrade 9.5p1 -> 9.6p1
        python3-poetry-core: upgrade 1.8.1 -> 1.9.0
        python3-attrs: skip test failing with pytest-8
        vim: upgrade from 9.0.2130 -> 9.1.0114
        python3-pyproject-metadata: move from meta-python
        python3-pyproject-metadata: HOMEPAGE; DESCRIPTION
        python3-meson-python: move from meta-python
        python_mesonpy.bbclass: move from meta-python
        recipetool: add support for python_mesonpy class

  Tobias Hagelborn (2):
        sstate.bbclass: Only sign packages at the time of their creation
        bitbake: bitbake: hashserv: Postgres adaptations for ignoring duplicate inserts

  Toni Lammi (1):
        bitbake: support temporary AWS credentials

  Trevor Gamblin (7):
        patchtest.README: update mailing list
        cmake: upgrade 3.27.7 -> 3.28.3
        python3-numpy: upgrade 1.26.3 -> 1.26.4
        patchtest-send-results: Add 'References' header
        patchtest-send-results: use Message-ID directly
        patchtest: Fix grammar in log output
        patchtest-send-results: add --debug option

  Valek Andrej (1):
        glibc: Refresh CVE status w.r.t 2.39 release

  Vikas Katariya (1):
        bmap-tools: Add missing runtime dependency

  Wang Mingyu (36):
        at-spi2-core: upgrade 2.50.0 -> 2.50.1
        cpio: upgrade 2.14 -> 2.15
        ethtool: upgrade 6.6 -> 6.7
        iso-codes: upgrade 4.15.0 -> 4.16.0
        libinput: upgrade 1.24.0 -> 1.25.0
        libtest-warnings-perl: upgrade 0.032 -> 0.033
        libwpe: upgrade 1.14.1 -> 1.14.2
        lzip: upgrade 1.23 -> 1.24
        createrepo-c: upgrade 1.0.2 -> 1.0.3
        diffstat: upgrade 1.65 -> 1.66
        dos2unix: upgrade 7.5.1 -> 7.5.2
        ed: upgrade 1.19 -> 1.20
        gnupg: upgrade 2.4.3 -> 2.4.4
        gstreamer: upgrade 1.22.8 -> 1.22.9
        libidn2: upgrade 2.3.4 -> 2.3.7
        libpng: upgrade 1.6.40 -> 1.6.41
        libsolv: upgrade 0.7.27 -> 0.7.28
        liburi-perl: upgrade 5.21 -> 5.25
        nghttp2: upgrade 1.58.0 -> 1.59.0
        repo: upgrade 2.40 -> 2.41
        orc: upgrade 0.4.34 -> 0.4.36
        pkgconf: upgrade 2.0.3 -> 2.1.0
        python3-sphinxcontrib-applehelp: upgrade 1.0.7 -> 1.0.8
        python3-sphinxcontrib-devhelp: upgrade 1.0.5 -> 1.0.6
        python3-sphinxcontrib-htmlhelp: upgrade 2.0.4 -> 2.0.5
        python3-sphinxcontrib-qthelp: upgrade 1.0.6 -> 1.0.7
        python3-sphinxcontrib-serializinghtml: upgrade 1.1.9 -> 1.1.10
        python3-beartype: upgrade 0.16.4 -> 0.17.0
        python3-mako: upgrade 1.3.0 -> 1.3.2
        python3-hatchling: upgrade 1.21.0 -> 1.21.1
        python3-hypothesis: upgrade 6.92.9 -> 6.97.3
        python3-pluggy: upgrade 1.3.0 -> 1.4.0
        python3-psutil: upgrade 5.9.7 -> 5.9.8
        python3-pyopenssl: upgrade 23.3.0 -> 24.0.0
        python3-pytz: upgrade 2023.3 -> 2023.4
        python3-pytest: upgrade 7.4.4 -> 8.0.0

  Xiangyu Chen (1):
        bash: rebase the patch to fix ptest failure

  Yi Zhao (2):
        rpm: add missing dependencies for packageconfig
        libsdl2: upgrade 2.28.5 -> 2.30.0

  Yoann Congal (2):
        kexec-tools: Replace a submitted patch by the backported one
        waf.bbclass: Print waf output on unparsable version

  Yogita Urade (1):
        tiff: fix CVE-2023-52355 and CVE-2023-52356

  baruch@tkos.co.il (3):
        contributor-guide: fix lore URL
        overlayfs: add missing closing parenthesis in selftest
        overlayfs-etc: add option to skip creation of mount dirs

meta-arm: 6bb1fc8d8c..025f76a14f:
  Ali Can Ozaslan (1):
        arm-bsp/u-boot:corstone1000: Fix deployment of capsule files

  Drew Reed (4):
        bsp: Move Corstone-1000 U-Boot configuration entries
        bsp: Move machine settings
        bsp,ci: Switch to poky distro
        bsp: Rename corstone1000-image

  Harsimran Singh Tungal (2):
        n1sdp:arm arm-bsp: fix tftf tests for n1sdp
        arm-bsp/optee: upgrade optee to 4.1.0 for N1SDP

  Jon Mason (3):
        arm/opencsd: update to v1.5.1
        arm/optee: update to 4.1
        arm-bsp/optee: remove unused v3.22.0 recipes

  Khem Raj (1):
        layer.conf: Update for the scarthgap release series

  Ross Burton (5):
        CI: support extra kas files from environment
        CI/cve.yml: add a CVE-checking Kas fragment
        CI: add explanatory comments to variables
        CI: allow the runner to set a NVD API key
        CI: use https: to fetch meta-virtualization

  Vincent Stehlé (1):
        arm-bsp/documentation: corstone1000: fix typo

meta-security: b2e1511338..30e755c592:
  Armin Kuster (3):
        python3-pyinotify: do not rely on smtpd module
        python3-fail2ban: remove unused distutils dependency
        scap-security-guide: update to 0.1.71

  BELOUARGA Mohamed (2):
        checksec: Add more runtime dependencies to checksec tool
        lynis: Add missing runtime dependencies

  Leon Anavi (2):
        linux-yocto%.bbappend: Add audit.cfg
        integrity-image-minimal: Fix IMAGE_INSTALL

  Mikko Rapeli (1):
        parsec-tool: fix serialNumber check

  Yi Zhao (1):
        openscap: fix build with python 3.12

  Yushi Sun (1):
        meta-security: libhoth: SRCREV bump e520f8f...e482716

meta-raspberrypi: 9c901bf170..dbf1113a82:
  Kevin Hao (1):
        rpidistro-ffmpeg: Fix old override syntax

  Khem Raj (3):
        linux-raspberrypi_6.1.bb: Upgrade to 6.1.74
        linux-raspberrypi: Upgrade to 6.1.77
        layer.conf: Update for the scarthgap release series

  Martin Jansa (1):
        libcamera-apps: fix build with libcamera-0.2.0

  Matthew Draws (1):
        rpi-eeprom_git: v.2024.01.05-2712 Update recipe to latest rpi-eeprom repo This follows the current latest release of rpi-eeprom: https://github.com/raspberrypi/rpi-eeprom

  Pascal Huerst (1):
        rpi-base: Add missing hifiberry overlay

meta-openembedded: 9953ca1ac0..528f273006:
  Alex Kiernan (3):
        mdns: Fix SIGSEGV during DumpStateLog()
        mdns: Upgrade 2200.60.25.0.4 -> 2200.80.16
        c-ares: Upgrade 1.24.0 -> 1.26.0

  Angelo Ribeiro (1):
        flatcc: Add tool recipe

  Angelo.Ribeiro (1):
        e2tools: Add tool recipe

  Archana Polampalli (1):
        nodejs: update to latest v20 version 20.11.0

  Beniamin Sandu (3):
        mbedtls: upgrade 3.5.1 -> 3.5.2
        mbedtls: upgrade 2.28.4 -> 2.28.7
        opencv: upgrade 4.8.0 -> 4.9.0

  Changqing Li (1):
        cpuid: fix do_install

  Chirag Shilwant (1):
        kernel-selftest: Add few more testcases

  Christophe Vu-Brugier (4):
        dropwatch: add new recipe
        switchtec-user: upgrade 4.1 -> 4.2
        libnvme: upgrade 1.7.1 -> 1.8
        nvme-cli: upgrade 2.7.1 -> 2.8

  Clément Péron (2):
        proj: extend class to native and nativesdk
        proj: upgrade 9.3.0 -> 9.3.1

  Denys Dmytriyenko (1):
        libcamera: update 0.1.0 -> 0.2.0

  Derek Straka (36):
        python3-bandit: update to version 1.7.7
        python3-web3: update to version 6.15.0
        python3-argcomplete: update to version 3.2.2
        python3-cytoolz: update to version 0.12.3
        python3-pdm: update to version 2.12.2
        python3-google-api-python-client: update to version 2.115.0
        python3-coverage: update to version 7.4.1
        python3-gmqtt: update to version 0.6.14
        python3-colorlog: update to version 6.8.2
        python3-argh: update to version 0.31.2
        python3-luma-core: update to version 2.4.2
        python-pdm: update to version 2.12.3
        python3-parse: update to version 1.20.1
        python3-grpcio: update to version 1.60.1
        python3-dill: update to version 0.3.8
        python3-types-setuptools: update to version 69.0.0.20240125
        python3-pymisp: update to version 2.4.184
        python3-cbor2: update to version 5.6.1
        python3-sentry-sdk: update to version 1.40.0
        python3-pytest-asyncio: update to version 0.23.4
        python3-google-api-core: update to version 2.16.1
        python3-google-api-python-client: update to version 2.116.0
        python3-google-auth: update to version 2.27.0
        python3-jsonrpcclient: update to version 4.0.3
        python3-dnspython: update to version 2.5.0
        python3-eventlet: update to version 0.35.1
        python3-platformdirs: update to version 4.2.0
        python3-ipython: update to version 8.21.0
        python3-grpcio-tools: update to version 1.60.1
        python3-cachecontrol: update to version 0.14.0
        python3-binwalk: update the regex version for upstream checks
        python3-pymodbus: update to version 3.6.3
        python3-pyyaml-include: add initial recipe for version 1.3.2
        python3-appdirs: add ptest into PTESTS_FAST_META_PYTHON items
        python3-yarl: add ptest into PTESTS_FAST_META_PYTHON items
        python3-ujson: add ptest into PTESTS_FAST_META_PYTHON items

  Emil Kronborg (1):
        php-fpm: fix systemd

  Etienne Cordonnier (2):
        uutils-coreutils: upgrade 0.0.23 -> 0.0.24
        uutils_coreutils: merge .inc and .bb

  Fathi Boudra (4):
        whitenoise: add a new recipe
        python3-django: upgrade to Django 4.2.10 LTS release
        libtinyxml2: fix the homepage URL
        libtinyxml2: allow to build both shared and static libraries

  Geoff Parker (2):
        python3-aiodns python3-pycares: Add native & nativesdk support
        python3-aiohappyeyeballs: Add native & nativesdk support

  Jean-Marc BOUCHE (1):
        rtkit: missing files/directories in package

  Jose Quaresma (1):
        ostree: Upgrade 2023.8 -> 2024.1

  Jörg Sommer (1):
        bonnie++: New recipe for version 2.0

  Khem Raj (18):
        uftrace: Upgrade to 0.15.2
        i2cdev: Set PV correctly
        minicoredumper: Fix build with clang
        python3-pytest-mock: Fix ptest failures with python 3.12
        ndctl: Update to v78
        vk-gl-cts: Disable Werror on amber external module
        vulkan-cts: Upgrade to 1.3.7.3
        uftrace: Adjust the summary to reflect rust and python support
        libcamera: Fix build with clang-18
        breakpad: Upgrade to 2023.06.01 release
        bpftool: Add missing dep on elfutils-native
        flatcc: Fix build warnings found with clang-18
        Revert "lzop: add (from oe-core)"
        can-isotp: Update to latest and skip it
        openflow: Switch SRC_URI to github mirror
        ot-br-posix: upgrade to latest trunk
        libcereal: Disable c++11-narrowing-const-reference warning as error
        ot-br-posix: Limit vla-cxx-extension option to clang >= 18

  Li Wang (1):
        radvd: add '--shell /sbin/nologin' to /etc/passwd

  Mark Hatle (1):
        opencv: Fix python3 package generation

  Markus Volk (9):
        luajit: allow to build on supported platforms
        pipewire: fix build with libcamera-0.2
        system-config-printer: fix runtime for system-config-printer
        iwd: update 2.8 -> 2.13
        pipewire: update 1.0.1 -> 1.0.3
        flatpak: remove unneeded RDEPENDS
        libosinfo: use hwdata for ids files
        libnfs: update 5.0.2 -> 5.0.3
        hwdata: update 0.378 -> 0.379

  Martin Jansa (18):
        libtalloc, libtevent, libtdb, libldb: set PYTHONARCHDIR for waf to respect python libdir
        jack: fix build with python3 on host
        redis: restore Upstream-Status
        libvpx: restore Upstream-Status
        python-jsonref: add missing Upstream-Status
        flatcc: respect baselib
        flatcc: drop 'r' from gitr and ${SRCPV}
        recipes: drop ${SRCPV} usage
        recipes: drop remaining +gitr cases
        gitpkgv.bbclass: adjust the example in comment a bit
        ne10: append +git instead of gitr+
        evemu-tools: use better PV
        nana: upgrade to latest commit from github
        xfstests: upgrade to latest 2024.01.14
        xfstests: add gawk to RDEPENDS
        xfstests: use master branch instead of 'for-next'
        xfstests: drop the upstream rejected install-sh hack
        xfstests: fix make install race condition

  Max Krummenacher (2):
        libusbgx: fix usbgx.service stop / restart
        libusbgx: uprev to the latest commit

  Maxime Roussin-Belanger (1):
        xdg-desktop-portal: add missing glib-2.0-native dependency

  Maxime Roussin-Bélanger (1):
        polkit: fix rules.d permissions

  Ming Liu (1):
        plymouth: uprev to 24.004.60

  Niko Mauno (4):
        python3-pybind11: Amend HOMEPAGE
        python3-pybind11: Prune redundant inherit
        python3-pybind11: Fix LICENSE
        python3-pybind11: Cosmetic fixes

  Pavel Zhukov (1):
        python3-tzlocal: Add zoneinfo dependency

  Peter Kjellerstedt (1):
        xfstests: Only specify the main SRCREV once

  Peter Marko (2):
        syslog-ng: ignore CVE-2022-38725
        libqmi: correct PV

  Pratik Manvar (1):
        python3-pybind11: Remove the Boost dependency

  Richard Leitner (7):
        python3-janus: add recipe for v1.0.0
        python3-moteus: add recipe for v0.3.67
        python3-socksio: add recipe for v1.0.0
        python3-anyio: add recipe for v4.2.0
        python3-sniffio: add recipe for v1.3.0
        python3-httpcore: add recipe for v1.0.2
        python3-httpx: add recipe for v0.26.0

  Sascha Hauer (1):
        signing.bbclass: make it work with elliptic curve keys

  Simone Weiß (1):
        scapy: Add difftools and logutils in RDEPENDS

  Thomas Perrot (3):
        dvb-apps: no longer skip ldflags QA
        etcd-cpp-apiv3: no longer skip ldflags QA
        kernel-selftest: no longer skip ldflags QA

  Tim Orling (60):
        python3-uritemplate: switch to pytest --automake
        python3-unidiff: switch to pytest --automake
        python3-ujson: switch to pytest --automake
        python3-pytest-lazy-fixture: switch to pytest --automake
        python3-fastjsonschema: switch to pytest --automake
        python3-tomlkit: switch to pytest --automake
        python3-inotify: switch to pytest --automake
        python3-requests-file: switch to pytest --automake
        python3-covdefaults: switch to pytest --automake
        python3-dominate: switch to pytest --automake
        python3-scrypt: switch to pytest --automake
        python3-u-msgpack-python: switch to pytest --automake
        python3-iso3166: switch to pytest --automake
        python3-trustme: switch to pytest --automake
        python3-asgiref: switch to pytest --automake
        python3-html2text: switch to pytest --automake
        python3-pyasn1-modules: switch to pytest --automake
        python3-intervals: switch to pytest --automake
        python3-py-cpuinfo: switch to pytest --automake
        python3-backports-functools-lru-cache: drop folder
        python3-whoosh: switch to pytest --automake
        python3-xlrd: switch to pytest --automake
        python3-dnspython: switch to pytest --automake
        python3-prettytable: switch to pytest --automake
        python3-ptyprocess: switch to pytest --automake
        python3-gunicorn: switch to pytest --automake
        python3-pytest-mock: switch to pytest --automake
        python3-pyroute2: switch to pytest --automake
        python3-smpplib: switch to pytest --automake
        python3-pyzmq: switch to pytest --automake
        python3-multidict: switch to pytest --automake
        python3-geojson: switch to pytest --automake
        python3-serpent: switch to pytest --automake
        python3-soupsieve: switch to pytest --automake
        python3-requests-toolbelt: switch to pytest --automake
        python3-yarl: switch to pytest --automake
        python3-cbor2: switch to pytest --automake
        python3-ansicolors: switch to pytest --automake
        python3-ipy: switch to pytest --automake
        python3-sqlparse: switch to pytest --automake
        python3-precise-runner: switch to pytest --automake
        python3-parse-type: switch to pytest --automake
        python3-inflection: switch to pytest --automake
        python3-blinker: switch to pytest --automake
        python3-service-identity: switch to pytest --automake
        python3-cachetools: switch to pytest --automake
        python3-simpleeval: switch to pytest --automake
        python3-appdirs: switch to pytest --automake
        python3-pillow: switch to pytest --automake
        python3-semver: switch to pytest --automake
        python3-platformdirs: switch to pytest --automake
        python3-polyline: switch to pytest --automake
        python3-betamax: switch to pytest --automake
        python3-pytoml: switch to pytest --automake
        python3-pyserial: switch to pytest --automake
        python3-typeguard: switch to pytest --automake
        python3-execnet: switch to pytest --automake
        python3-pyyaml-include: switch to pytest --automake
        python3-xxhash: switch to pytest --automake
        python3-pylint: switch to pytest --automake

  Tom Geelen (1):
        python3-pychromecast: add missing RDEPENDS, and add initial recipe for dependency.

  Wang Mingyu (90):
        btop: upgrade 1.2.13 -> 1.3.0
        ccid: upgrade 1.5.4 -> 1.5.5
        ctags: upgrade 6.1.20231231.0 -> 6.1.20240114.0
        gcr3: upgrade 3.41.1 -> 3.41.2
        htop: upgrade 3.2.2 -> 3.3.0
        hwdata: upgrade 0.377 -> 0.378
        libdecor: upgrade 0.2.1 -> 0.2.2
        libvpx: upgrade 1.13.1 -> 1.14.0
        lldpd: upgrade 1.0.17 -> 1.0.18
        gjs: upgrade 1.78.2 -> 1.78.3
        wireshark: upgrade 4.2.0 -> 4.2.2
        capnproto: upgrade 1.0.1.1 -> 1.0.2
        dnfdragora: upgrade 2.1.5 -> 2.1.6
        libyang: upgrade 2.1.128 -> 2.1.148
        lshw: upgrade 02.19.2 -> 02.20
        md4c: upgrade 0.4.8 -> 0.5.0
        python3-apscheduler: add new recipe
        redis: upgrade 7.2.3 -> 7.2.4
        sanlock: upgrade 3.8.5 -> 3.9.0
        python3-eth-keys: upgrade 0.4.0 -> 0.5.0
        python3-xmlschema: upgrade 2.5.1 -> 3.0.1
        plocate: upgrade 1.1.20 -> 1.1.22
        python3-absl: upgrade 2.0.0 -> 2.1.0
        python3-asyncinotify: upgrade 4.0.5 -> 4.0.6
        python3-beautifulsoup4: upgrade 4.12.2 -> 4.12.3
        python3-cantools: upgrade 39.4.2 -> 39.4.3
        python3-cbor2: upgrade 5.5.1 -> 5.6.0
        python3-dbus-fast: upgrade 2.21.0 -> 2.21.1
        python3-django: upgrade 5.0 -> 5.0.1
        python3-eth-abi: upgrade 4.2.1 -> 5.0.0
        python3-eth-typing: upgrade 3.5.2 -> 4.0.0
        python3-eth-utils: upgrade 2.3.1 -> 3.0.0
        python3-eventlet: upgrade 0.34.2 -> 0.34.3
        python3-flask: upgrade 3.0.0 -> 3.0.1
        python3-git-pw: upgrade 2.5.0 -> 2.6.0
        python3-google-api-python-client: upgrade 2.113.0 -> 2.114.0
        python3-haversine: upgrade 2.8.0 -> 2.8.1
        python3-ipython: upgrade 8.19.0 -> 8.20.0
        python3-pdm: upgrade 2.11.2 -> 2.12.1
        python3-pyatspi: upgrade 2.46.0 -> 2.46.1
        python3-sentry-sdk: upgrade 1.39.1 -> 1.39.2
        python3-robotframework: upgrade 6.1.1 -> 7.0
        python3-pychromecast: upgrade 13.0.8 -> 13.1.0
        python3-tox: upgrade 4.11.4 -> 4.12.1
        python3-types-psutil: upgrade 5.9.5.17 -> 5.9.5.20240106
        qpdf: upgrade 11.7.0 -> 11.8.0
        smemstat: upgrade 0.02.12 -> 0.02.13
        tesseract: upgrade 5.3.3 -> 5.3.4
        libsmi: Fix buildpaths warning.
        minicoredumper: upgrade 2.0.6 -> 2.0.7
        cmocka: Fix install conflict when enable multilib.
        czmq: Fix install conflict when enable multilib.
        czmq: Fix buildpaths warning.
        bdwgc: upgrade 8.2.4 -> 8.2.6
        cmark: upgrade 0.30.3 -> 0.31.0
        gensio: upgrade 2.8.2 -> 2.8.3
        geos: upgrade 3.12.0 -> 3.12.1
        imlib2: upgrade 1.12.1 -> 1.12.2
        libcbor: upgrade 0.10.2 -> 0.11.0
        libinih: upgrade 57 -> 58
        libio-socket-ssl-perl: upgrade 2.084 -> 2.085
        libjcat: upgrade 0.2.0 -> 0.2.1
        libqmi: upgrade 1.35.1 -> 1.35.2
        md4c: upgrade 0.5.0 -> 0.5.2
        nanomsg: upgrade 1.2 -> 1.2.1
        neatvnc: upgrade 0.7.1 -> 0.7.2
        network-manager-applet: upgrade 1.34.0 -> 1.36.0
        libgsf: upgrade 1.14.51 -> 1.14.52
        ndisc6: upgrade 1.0.7 -> 1.0.8
        squid: upgrade 6.6 -> 6.7
        iotop: upgrade 1.25 -> 1.26
        libblockdev: upgrade 3.0.4 -> 3.1.0
        neon: upgrade 0.32.5 -> 0.33.0
        pkcs11-provider: upgrade 0.2 -> 0.3
        sanlock: upgrade 3.9.0 -> 3.9.1
        satyr: upgrade 0.42 -> 0.43
        python3-astroid: upgrade 3.0.2 -> 3.0.3
        python3-elementpath: upgrade 4.1.5 -> 4.2.0
        python3-flask: upgrade 3.0.1 -> 3.0.2
        python3-google-api-core: upgrade 2.16.1 -> 2.16.2
        python3-gspread: upgrade 5.12.4 -> 6.0.0
        python3-path: upgrade 16.9.0 -> 16.10.0
        python3-gcovr: upgrade 6.0 -> 7.0
        python3-types-psutil: upgrade 5.9.5.20240106 -> 5.9.5.20240205
        python3-waitress: upgrade 2.1.2 -> 3.0.0
        rdma-core: upgrade 48.0 -> 50.0
        ser2net: upgrade 4.6.0 -> 4.6.1
        sip: upgrade 6.8.1 -> 6.8.2
        span-lite: upgrade 0.10.3 -> 0.11.0
        tcpslice: upgrade 1.6 -> 1.7

  William A. Kennington III (3):
        nanopb: Update 0.4.7 -> 0.4.8
        nanopb: Split into 2 packages
        nanopb-runtime: Enable shared library

  Yoann Congal (6):
        ibus: backport a reproducibility fix
        radvd: Fix build in reproducible test
        mariadb: Move useradd handling in target side of the recipe
        kexec-tools-klibc: Fix building on x86_64 with binutils 2.41
        freeradius: Add missing 'radiusd' static group id
        ntp: Add missing 'ntp' static group id

  alperak (18):
        python3-flask-marshmallow: upgrade 0.15.0 -> 1.1.0
        python3-netaddr: upgrade 0.10.0 -> 0.10.1
        python3-toolz: upgrade 0.12.0 -> 0.12.1
        python3-aiohappyeyeballs: add recipe
        python3-aiohttp: upgrade 3.9.1 -> 3.9.2
        python3-eth-rlp: upgrade 1.0.0 -> 1.0.1
        python3-aiohttp: upgrade 3.9.2 -> 3.9.3
        python3-google-auth-oauthlib: add recipe
        python3-scikit-build: upgrade 0.16.7 -> 0.17.6
        python3-eth-account: upgrade 0.10.0 -> 0.11.0
        python3-pyunormalize: add recipe
        python3-web3: upgrade 6.15.0 -> 6.15.1
        python3-gspread: upgrade 6.0.0 -> 6.0.1
        python3-strenum: add recipe
        python3-flask-marshmallow: upgrade 1.1.0 -> 1.2.0
        python3-werkzeug: upgrade 2.3.6 -> 3.0.1
        python3-imageio: upgrade 2.33.1 -> 2.34.0
        python3-werkzeug: add missing runtime dependencies

  virendra thakur (1):
        nodejs: Set CVE_PRODUCT to "node.js"

Change-Id: If9fadba6ede9e8de3b778d470bbd61f208f48e54
Signed-off-by: Patrick Williams <patrick@stwcx.xyz>
diff --git a/poky/scripts/.oe-layers.json b/poky/scripts/.oe-layers.json
new file mode 100644
index 0000000..1b00a84
--- /dev/null
+++ b/poky/scripts/.oe-layers.json
@@ -0,0 +1,7 @@
+{
+    "layers": [
+        "../meta-poky",
+        "../meta"
+    ],
+    "version": "1.0"
+}
diff --git a/poky/scripts/combo-layer b/poky/scripts/combo-layer
index 2312cef..4a71591 100755
--- a/poky/scripts/combo-layer
+++ b/poky/scripts/combo-layer
@@ -483,7 +483,7 @@
         exit if repo is dirty
     """
     output=runcmd("git status --porcelain", repodir)
-    r = re.compile('\?\? patch-.*/')
+    r = re.compile(r'\?\? patch-.*/')
     dirtyout = [item for item in output.splitlines() if not r.match(item)]
     if dirtyout:
         logger.error("git repo %s is dirty, please fix it first", repodir)
diff --git a/poky/scripts/contrib/bbvars.py b/poky/scripts/contrib/bbvars.py
index 0901336..a9cdf08 100755
--- a/poky/scripts/contrib/bbvars.py
+++ b/poky/scripts/contrib/bbvars.py
@@ -36,8 +36,8 @@
 def collect_documented_vars(docfiles):
     ''' Walk the docfiles and collect the documented variables '''
     documented_vars = []
-    prog = re.compile(".*($|[^A-Z_])<glossentry id=\'var-")
-    var_prog = re.compile('<glossentry id=\'var-(.*)\'>')
+    prog = re.compile(r".*($|[^A-Z_])<glossentry id=\'var-")
+    var_prog = re.compile(r'<glossentry id=\'var-(.*)\'>')
     for d in docfiles:
         with open(d) as f:
             documented_vars += var_prog.findall(f.read())
@@ -45,7 +45,7 @@
     return documented_vars
 
 def bbvar_doctag(var, docconf):
-    prog = re.compile('^%s\[doc\] *= *"(.*)"' % (var))
+    prog = re.compile(r'^%s\[doc\] *= *"(.*)"' % (var))
     if docconf == "":
         return "?"
 
diff --git a/poky/scripts/contrib/convert-overrides.py b/poky/scripts/contrib/convert-overrides.py
index 1939757..c69acb4 100755
--- a/poky/scripts/contrib/convert-overrides.py
+++ b/poky/scripts/contrib/convert-overrides.py
@@ -81,19 +81,19 @@
 
 vars_re = {}
 for exp in vars:
-    vars_re[exp] = (re.compile('((^|[#\'"\s\-\+])[A-Za-z0-9_\-:${}\.]+)_' + exp), r"\1:" + exp)
+    vars_re[exp] = (re.compile(r'((^|[#\'"\s\-\+])[A-Za-z0-9_\-:${}\.]+)_' + exp), r"\1:" + exp)
 
 shortvars_re = {}
 for exp in shortvars:
-    shortvars_re[exp] = (re.compile('((^|[#\'"\s\-\+])[A-Za-z0-9_\-:${}\.]+)_' + exp + '([\(\'"\s:])'), r"\1:" + exp + r"\3")
+    shortvars_re[exp] = (re.compile(r'((^|[#\'"\s\-\+])[A-Za-z0-9_\-:${}\.]+)_' + exp + r'([\(\'"\s:])'), r"\1:" + exp + r"\3")
 
 package_re = {}
 for exp in packagevars:
-    package_re[exp] = (re.compile('(^|[#\'"\s\-\+]+)' + exp + '_' + '([$a-z"\'\s%\[<{\\\*].)'), r"\1" + exp + r":\2")
+    package_re[exp] = (re.compile(r'(^|[#\'"\s\-\+]+)' + exp + r'_' + r'([$a-z"\'\s%\[<{\\\*].)'), r"\1" + exp + r":\2")
 
 # Other substitutions to make
 subs = {
-    'r = re.compile("([^:]+):\s*(.*)")' : 'r = re.compile("(^.+?):\s+(.*)")',
+    'r = re.compile(r"([^:]+):\s*(.*)")' : 'r = re.compile(r"(^.+?):\s+(.*)")',
     "val = d.getVar('%s_%s' % (var, pkg))" : "val = d.getVar('%s:%s' % (var, pkg))",
     "f.write('%s_%s: %s\\n' % (var, pkg, encode(val)))" : "f.write('%s:%s: %s\\n' % (var, pkg, encode(val)))",
     "d.getVar('%s_%s' % (scriptlet_name, pkg))" : "d.getVar('%s:%s' % (scriptlet_name, pkg))",
diff --git a/poky/scripts/devtool b/poky/scripts/devtool
index 3aae7b9..60ea3e8 100755
--- a/poky/scripts/devtool
+++ b/poky/scripts/devtool
@@ -299,8 +299,9 @@
             return 2
 
     # Search BBPATH first to allow layers to override plugins in scripts_path
-    for path in global_args.bbpath.split(':') + [scripts_path]:
-        pluginpath = os.path.join(path, 'lib', 'devtool')
+    pluginpaths = [os.path.join(path, 'lib', 'devtool') for path in global_args.bbpath.split(':') + [scripts_path]]
+    context.pluginpaths = pluginpaths
+    for pluginpath in pluginpaths:
         scriptutils.load_plugins(logger, plugins, pluginpath)
 
     subparsers = parser.add_subparsers(dest="subparser_name", title='subcommands', metavar='<subcommand>')
diff --git a/poky/scripts/lib/checklayer/__init__.py b/poky/scripts/lib/checklayer/__init__.py
index 8271ed7..62ecdfe 100644
--- a/poky/scripts/lib/checklayer/__init__.py
+++ b/poky/scripts/lib/checklayer/__init__.py
@@ -324,8 +324,8 @@
         else:
             raise
 
-    sig_regex = re.compile("^(?P<task>.*:.*):(?P<hash>.*) .$")
-    tune_regex = re.compile("(^|\s)SIGGEN_LOCKEDSIGS_t-(?P<tune>\S*)\s*=\s*")
+    sig_regex = re.compile(r"^(?P<task>.*:.*):(?P<hash>.*) .$")
+    tune_regex = re.compile(r"(^|\s)SIGGEN_LOCKEDSIGS_t-(?P<tune>\S*)\s*=\s*")
     current_tune = None
     with open(sigs_file, 'r') as f:
         for line in f.readlines():
diff --git a/poky/scripts/lib/devtool/__init__.py b/poky/scripts/lib/devtool/__init__.py
index b4f998a..6133c1c 100644
--- a/poky/scripts/lib/devtool/__init__.py
+++ b/poky/scripts/lib/devtool/__init__.py
@@ -78,12 +78,15 @@
     """Run a command under fakeroot (pseudo, in fact) so that it picks up the appropriate file permissions"""
     # Grab the command and check it actually exists
     fakerootcmd = d.getVar('FAKEROOTCMD')
+    fakerootenv = d.getVar('FAKEROOTENV')
+    exec_fakeroot_no_d(fakerootcmd, fakerootenv, cmd, kwargs)
+
+def exec_fakeroot_no_d(fakerootcmd, fakerootenv, cmd, **kwargs):
     if not os.path.exists(fakerootcmd):
         logger.error('pseudo executable %s could not be found - have you run a build yet? pseudo-native should install this and if you have run any build then that should have been built')
         return 2
     # Set up the appropriate environment
     newenv = dict(os.environ)
-    fakerootenv = d.getVar('FAKEROOTENV')
     for varvalue in fakerootenv.split():
         if '=' in varvalue:
             splitval = varvalue.split('=', 1)
@@ -250,9 +253,7 @@
                     bb.process.run('git submodule add %s %s' % (remote_url, os.path.relpath(root, os.path.join(root, ".."))), cwd=os.path.join(root, ".."))
                     found = True
                 if found:
-                    useroptions = []
-                    oe.patch.GitApplyTree.gitCommandUserOptions(useroptions, d=d)
-                    bb.process.run('git %s commit -m "Adding additionnal submodule from SRC_URI\n\n%s"' % (' '.join(useroptions), oe.patch.GitApplyTree.ignore_commit_prefix), cwd=os.path.join(root, ".."))
+                    oe.patch.GitApplyTree.commitIgnored("Add additional submodule from SRC_URI", dir=os.path.join(root, ".."), d=d)
                     found = False
     if os.path.exists(os.path.join(repodir, '.gitmodules')):
         bb.process.run('git submodule foreach --recursive  "git tag -f %s"' % basetag, cwd=repodir)
diff --git a/poky/scripts/lib/devtool/deploy.py b/poky/scripts/lib/devtool/deploy.py
index eadf6e1..b5ca8f2 100644
--- a/poky/scripts/lib/devtool/deploy.py
+++ b/poky/scripts/lib/devtool/deploy.py
@@ -16,7 +16,7 @@
 import argparse_oe
 import oe.types
 
-from devtool import exec_fakeroot, setup_tinfoil, check_workspace_recipe, DevtoolError
+from devtool import exec_fakeroot_no_d, setup_tinfoil, check_workspace_recipe, DevtoolError
 
 logger = logging.getLogger('devtool')
 
@@ -133,17 +133,38 @@
 
     return '\n'.join(lines)
 
-
-
 def deploy(args, config, basepath, workspace):
     """Entry point for the devtool 'deploy' subcommand"""
-    import math
-    import oe.recipeutils
-    import oe.package
     import oe.utils
 
     check_workspace_recipe(workspace, args.recipename, checksrc=False)
 
+    tinfoil = setup_tinfoil(basepath=basepath)
+    try:
+        try:
+            rd = tinfoil.parse_recipe(args.recipename)
+        except Exception as e:
+            raise DevtoolError('Exception parsing recipe %s: %s' %
+                            (args.recipename, e))
+        
+        srcdir = rd.getVar('D')
+        workdir = rd.getVar('WORKDIR')
+        path = rd.getVar('PATH')
+        strip_cmd = rd.getVar('STRIP')
+        libdir = rd.getVar('libdir')
+        base_libdir = rd.getVar('base_libdir')
+        max_process = oe.utils.get_bb_number_threads(rd)
+        fakerootcmd = rd.getVar('FAKEROOTCMD')
+        fakerootenv = rd.getVar('FAKEROOTENV')
+    finally:
+        tinfoil.shutdown()
+
+    return deploy_no_d(srcdir, workdir, path, strip_cmd, libdir, base_libdir, max_process, fakerootcmd, fakerootenv, args)
+
+def deploy_no_d(srcdir, workdir, path, strip_cmd, libdir, base_libdir, max_process, fakerootcmd, fakerootenv, args):
+    import math
+    import oe.package
+
     try:
         host, destdir = args.target.split(':')
     except ValueError:
@@ -153,118 +174,108 @@
     if not destdir.endswith('/'):
         destdir += '/'
 
-    tinfoil = setup_tinfoil(basepath=basepath)
+    recipe_outdir = srcdir
+    if not os.path.exists(recipe_outdir) or not os.listdir(recipe_outdir):
+        raise DevtoolError('No files to deploy - have you built the %s '
+                        'recipe? If so, the install step has not installed '
+                        'any files.' % args.recipename)
+
+    if args.strip and not args.dry_run:
+        # Fakeroot copy to new destination
+        srcdir = recipe_outdir
+        recipe_outdir = os.path.join(workdir, 'devtool-deploy-target-stripped')
+        if os.path.isdir(recipe_outdir):
+            exec_fakeroot_no_d(fakerootcmd, fakerootenv, "rm -rf %s" % recipe_outdir, shell=True)
+        exec_fakeroot_no_d(fakerootcmd, fakerootenv, "cp -af %s %s" % (os.path.join(srcdir, '.'), recipe_outdir), shell=True)
+        os.environ['PATH'] = ':'.join([os.environ['PATH'], path or ''])
+        oe.package.strip_execs(args.recipename, recipe_outdir, strip_cmd, libdir, base_libdir, max_process)
+
+    filelist = []
+    inodes = set({})
+    ftotalsize = 0
+    for root, _, files in os.walk(recipe_outdir):
+        for fn in files:
+            fstat = os.lstat(os.path.join(root, fn))
+            # Get the size in kiB (since we'll be comparing it to the output of du -k)
+            # MUST use lstat() here not stat() or getfilesize() since we don't want to
+            # dereference symlinks
+            if fstat.st_ino in inodes:
+                fsize = 0
+            else:
+                fsize = int(math.ceil(float(fstat.st_size)/1024))
+            inodes.add(fstat.st_ino)
+            ftotalsize += fsize
+            # The path as it would appear on the target
+            fpath = os.path.join(destdir, os.path.relpath(root, recipe_outdir), fn)
+            filelist.append((fpath, fsize))
+
+    if args.dry_run:
+        print('Files to be deployed for %s on target %s:' % (args.recipename, args.target))
+        for item, _ in filelist:
+            print('  %s' % item)
+        return 0
+
+    extraoptions = ''
+    if args.no_host_check:
+        extraoptions += '-o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no'
+    if not args.show_status:
+        extraoptions += ' -q'
+
+    scp_sshexec = ''
+    ssh_sshexec = 'ssh'
+    if args.ssh_exec:
+        scp_sshexec = "-S %s" % args.ssh_exec
+        ssh_sshexec = args.ssh_exec
+    scp_port = ''
+    ssh_port = ''
+    if args.port:
+        scp_port = "-P %s" % args.port
+        ssh_port = "-p %s" % args.port
+
+    if args.key:
+        extraoptions += ' -i %s' % args.key
+
+    # In order to delete previously deployed files and have the manifest file on
+    # the target, we write out a shell script and then copy it to the target
+    # so we can then run it (piping tar output to it).
+    # (We cannot use scp here, because it doesn't preserve symlinks.)
+    tmpdir = tempfile.mkdtemp(prefix='devtool')
     try:
-        try:
-            rd = tinfoil.parse_recipe(args.recipename)
-        except Exception as e:
-            raise DevtoolError('Exception parsing recipe %s: %s' %
-                            (args.recipename, e))
-        recipe_outdir = rd.getVar('D')
-        if not os.path.exists(recipe_outdir) or not os.listdir(recipe_outdir):
-            raise DevtoolError('No files to deploy - have you built the %s '
-                            'recipe? If so, the install step has not installed '
-                            'any files.' % args.recipename)
-
-        if args.strip and not args.dry_run:
-            # Fakeroot copy to new destination
-            srcdir = recipe_outdir
-            recipe_outdir = os.path.join(rd.getVar('WORKDIR'), 'devtool-deploy-target-stripped')
-            if os.path.isdir(recipe_outdir):
-                exec_fakeroot(rd, "rm -rf %s" % recipe_outdir, shell=True)
-            exec_fakeroot(rd, "cp -af %s %s" % (os.path.join(srcdir, '.'), recipe_outdir), shell=True)
-            os.environ['PATH'] = ':'.join([os.environ['PATH'], rd.getVar('PATH') or ''])
-            oe.package.strip_execs(args.recipename, recipe_outdir, rd.getVar('STRIP'), rd.getVar('libdir'),
-                        rd.getVar('base_libdir'), oe.utils.get_bb_number_threads(rd), rd)
-
-        filelist = []
-        inodes = set({})
-        ftotalsize = 0
-        for root, _, files in os.walk(recipe_outdir):
-            for fn in files:
-                fstat = os.lstat(os.path.join(root, fn))
-                # Get the size in kiB (since we'll be comparing it to the output of du -k)
-                # MUST use lstat() here not stat() or getfilesize() since we don't want to
-                # dereference symlinks
-                if fstat.st_ino in inodes:
-                    fsize = 0
-                else:
-                    fsize = int(math.ceil(float(fstat.st_size)/1024))
-                inodes.add(fstat.st_ino)
-                ftotalsize += fsize
-                # The path as it would appear on the target
-                fpath = os.path.join(destdir, os.path.relpath(root, recipe_outdir), fn)
-                filelist.append((fpath, fsize))
-
-        if args.dry_run:
-            print('Files to be deployed for %s on target %s:' % (args.recipename, args.target))
-            for item, _ in filelist:
-                print('  %s' % item)
-            return 0
-
-        extraoptions = ''
-        if args.no_host_check:
-            extraoptions += '-o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no'
-        if not args.show_status:
-            extraoptions += ' -q'
-
-        scp_sshexec = ''
-        ssh_sshexec = 'ssh'
-        if args.ssh_exec:
-            scp_sshexec = "-S %s" % args.ssh_exec
-            ssh_sshexec = args.ssh_exec
-        scp_port = ''
-        ssh_port = ''
-        if args.port:
-            scp_port = "-P %s" % args.port
-            ssh_port = "-p %s" % args.port
-
-        if args.key:
-            extraoptions += ' -i %s' % args.key
-
-        # In order to delete previously deployed files and have the manifest file on
-        # the target, we write out a shell script and then copy it to the target
-        # so we can then run it (piping tar output to it).
-        # (We cannot use scp here, because it doesn't preserve symlinks.)
-        tmpdir = tempfile.mkdtemp(prefix='devtool')
-        try:
-            tmpscript = '/tmp/devtool_deploy.sh'
-            tmpfilelist = os.path.join(os.path.dirname(tmpscript), 'devtool_deploy.list')
-            shellscript = _prepare_remote_script(deploy=True,
-                                                verbose=args.show_status,
-                                                nopreserve=args.no_preserve,
-                                                nocheckspace=args.no_check_space)
-            # Write out the script to a file
-            with open(os.path.join(tmpdir, os.path.basename(tmpscript)), 'w') as f:
-                f.write(shellscript)
-            # Write out the file list
-            with open(os.path.join(tmpdir, os.path.basename(tmpfilelist)), 'w') as f:
-                f.write('%d\n' % ftotalsize)
-                for fpath, fsize in filelist:
-                    f.write('%s %d\n' % (fpath, fsize))
-            # Copy them to the target
-            ret = subprocess.call("scp %s %s %s %s/* %s:%s" % (scp_sshexec, scp_port, extraoptions, tmpdir, args.target, os.path.dirname(tmpscript)), shell=True)
-            if ret != 0:
-                raise DevtoolError('Failed to copy script to %s - rerun with -s to '
-                                'get a complete error message' % args.target)
-        finally:
-            shutil.rmtree(tmpdir)
-
-        # Now run the script
-        ret = exec_fakeroot(rd, 'tar cf - . | %s  %s %s %s \'sh %s %s %s %s\'' % (ssh_sshexec, ssh_port, extraoptions, args.target, tmpscript, args.recipename, destdir, tmpfilelist), cwd=recipe_outdir, shell=True)
+        tmpscript = '/tmp/devtool_deploy.sh'
+        tmpfilelist = os.path.join(os.path.dirname(tmpscript), 'devtool_deploy.list')
+        shellscript = _prepare_remote_script(deploy=True,
+                                            verbose=args.show_status,
+                                            nopreserve=args.no_preserve,
+                                            nocheckspace=args.no_check_space)
+        # Write out the script to a file
+        with open(os.path.join(tmpdir, os.path.basename(tmpscript)), 'w') as f:
+            f.write(shellscript)
+        # Write out the file list
+        with open(os.path.join(tmpdir, os.path.basename(tmpfilelist)), 'w') as f:
+            f.write('%d\n' % ftotalsize)
+            for fpath, fsize in filelist:
+                f.write('%s %d\n' % (fpath, fsize))
+        # Copy them to the target
+        ret = subprocess.call("scp %s %s %s %s/* %s:%s" % (scp_sshexec, scp_port, extraoptions, tmpdir, args.target, os.path.dirname(tmpscript)), shell=True)
         if ret != 0:
-            raise DevtoolError('Deploy failed - rerun with -s to get a complete '
-                            'error message')
-
-        logger.info('Successfully deployed %s' % recipe_outdir)
-
-        files_list = []
-        for root, _, files in os.walk(recipe_outdir):
-            for filename in files:
-                filename = os.path.relpath(os.path.join(root, filename), recipe_outdir)
-                files_list.append(os.path.join(destdir, filename))
+            raise DevtoolError('Failed to copy script to %s - rerun with -s to '
+                            'get a complete error message' % args.target)
     finally:
-        tinfoil.shutdown()
+        shutil.rmtree(tmpdir)
+
+    # Now run the script
+    ret = exec_fakeroot_no_d(fakerootcmd, fakerootenv, 'tar cf - . | %s  %s %s %s \'sh %s %s %s %s\'' % (ssh_sshexec, ssh_port, extraoptions, args.target, tmpscript, args.recipename, destdir, tmpfilelist), cwd=recipe_outdir, shell=True)
+    if ret != 0:
+        raise DevtoolError('Deploy failed - rerun with -s to get a complete '
+                        'error message')
+
+    logger.info('Successfully deployed %s' % recipe_outdir)
+
+    files_list = []
+    for root, _, files in os.walk(recipe_outdir):
+        for filename in files:
+            filename = os.path.relpath(os.path.join(root, filename), recipe_outdir)
+            files_list.append(os.path.join(destdir, filename))
 
     return 0
 
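Note: the refactored deploy path above now calls exec_fakeroot_no_d() with the fakeroot command and environment captured from the recipe, instead of requiring a live datastore. A minimal sketch of what such a helper could look like, assuming FAKEROOTENV is a space-separated list of VAR=value assignments (the body below is an illustration, not the actual devtool implementation):

    import os
    import subprocess

    def exec_fakeroot_no_d(fakerootcmd, fakerootenv, cmd, **kwargs):
        """Run 'cmd' under the captured pseudo/fakeroot command, no datastore needed"""
        if not os.path.exists(fakerootcmd):
            # pseudo-native has not been built yet
            return 2
        # replicate the fakeroot environment on top of the current one
        env = dict(os.environ)
        for assignment in fakerootenv.split():
            if '=' in assignment:
                var, value = assignment.split('=', 1)
                env[var] = value
        return subprocess.call("%s %s" % (fakerootcmd, cmd), env=env, **kwargs)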
diff --git a/poky/scripts/lib/devtool/ide_plugins/__init__.py b/poky/scripts/lib/devtool/ide_plugins/__init__.py
new file mode 100644
index 0000000..3371b24
--- /dev/null
+++ b/poky/scripts/lib/devtool/ide_plugins/__init__.py
@@ -0,0 +1,267 @@
+#
+# Copyright (C) 2023-2024 Siemens AG
+#
+# SPDX-License-Identifier: GPL-2.0-only
+#
+"""Devtool ide-sdk IDE plugin interface definition and helper functions"""
+
+import errno
+import json
+import logging
+import os
+import stat
+from enum import Enum, auto
+from devtool import DevtoolError
+from bb.utils import mkdirhier
+
+logger = logging.getLogger('devtool')
+
+
+class BuildTool(Enum):
+    UNDEFINED = auto()
+    CMAKE = auto()
+    MESON = auto()
+
+    @property
+    def is_c_ccp(self):
+        if self is BuildTool.CMAKE:
+            return True
+        if self is BuildTool.MESON:
+            return True
+        return False
+
+
+class GdbCrossConfig:
+    """Base class defining the GDB configuration generator interface
+
+    Generate a GDB configuration for a binary on the target device.
+    Only one instance per binary is allowed. This makes it possible to assign
+    unique port numbers to all gdbserver instances.
+    """
+    _gdbserver_port_next = 1234
+    _binaries = []
+
+    def __init__(self, image_recipe, modified_recipe, binary, gdbserver_multi=True):
+        self.image_recipe = image_recipe
+        self.modified_recipe = modified_recipe
+        self.gdb_cross = modified_recipe.gdb_cross
+        self.binary = binary
+        if binary in GdbCrossConfig._binaries:
+            raise DevtoolError(
+                "gdbserver config for binary %s is already generated" % binary)
+        GdbCrossConfig._binaries.append(binary)
+        self.script_dir = modified_recipe.ide_sdk_scripts_dir
+        self.gdbinit_dir = os.path.join(self.script_dir, 'gdbinit')
+        self.gdbserver_multi = gdbserver_multi
+        self.binary_pretty = self.binary.replace(os.sep, '-').lstrip('-')
+        self.gdbserver_port = GdbCrossConfig._gdbserver_port_next
+        GdbCrossConfig._gdbserver_port_next += 1
+        self.id_pretty = "%d_%s" % (self.gdbserver_port, self.binary_pretty)
+        # gdbserver start script
+        gdbserver_script_file = 'gdbserver_' + self.id_pretty
+        if self.gdbserver_multi:
+            gdbserver_script_file += "_m"
+        self.gdbserver_script = os.path.join(
+            self.script_dir, gdbserver_script_file)
+        # gdbinit file
+        self.gdbinit = os.path.join(
+            self.gdbinit_dir, 'gdbinit_' + self.id_pretty)
+        # gdb start script
+        self.gdb_script = os.path.join(
+            self.script_dir, 'gdb_' + self.id_pretty)
+
+    def _gen_gdbserver_start_script(self):
+        """Generate a shell command starting the gdbserver on the remote device via ssh
+
+        GDB supports two modes:
+        multi: gdbserver remains running over several debug sessions
+        once: gdbserver terminates after the debugged process terminates
+        """
+        cmd_lines = ['#!/bin/sh']
+        if self.gdbserver_multi:
+            temp_dir = "TEMP_DIR=/tmp/gdbserver_%s; " % self.id_pretty
+            gdbserver_cmd_start = temp_dir
+            gdbserver_cmd_start += "test -f \$TEMP_DIR/pid && exit 0; "
+            gdbserver_cmd_start += "mkdir -p \$TEMP_DIR; "
+            gdbserver_cmd_start += "%s --multi :%s > \$TEMP_DIR/log 2>&1 & " % (
+                self.gdb_cross.gdbserver_path, self.gdbserver_port)
+            gdbserver_cmd_start += "echo \$! > \$TEMP_DIR/pid;"
+
+            gdbserver_cmd_stop = temp_dir
+            gdbserver_cmd_stop += "test -f \$TEMP_DIR/pid && kill \$(cat \$TEMP_DIR/pid); "
+            gdbserver_cmd_stop += "rm -rf \$TEMP_DIR; "
+
+            gdbserver_cmd_l = []
+            gdbserver_cmd_l.append('if [ "$1" = "stop" ]; then')
+            gdbserver_cmd_l.append('  shift')
+            gdbserver_cmd_l.append("  %s %s %s %s 'sh -c \"%s\"'" % (
+                self.gdb_cross.target_device.ssh_sshexec, self.gdb_cross.target_device.ssh_port, self.gdb_cross.target_device.extraoptions, self.gdb_cross.target_device.target, gdbserver_cmd_stop))
+            gdbserver_cmd_l.append('else')
+            gdbserver_cmd_l.append("  %s %s %s %s 'sh -c \"%s\"'" % (
+                self.gdb_cross.target_device.ssh_sshexec, self.gdb_cross.target_device.ssh_port, self.gdb_cross.target_device.extraoptions, self.gdb_cross.target_device.target, gdbserver_cmd_start))
+            gdbserver_cmd_l.append('fi')
+            gdbserver_cmd = os.linesep.join(gdbserver_cmd_l)
+        else:
+            gdbserver_cmd_start = "%s --once :%s %s" % (
+                self.gdb_cross.gdbserver_path, self.gdbserver_port, self.binary)
+            gdbserver_cmd = "%s %s %s %s 'sh -c \"%s\"'" % (
+                self.gdb_cross.target_device.ssh_sshexec, self.gdb_cross.target_device.ssh_port, self.gdb_cross.target_device.extraoptions, self.gdb_cross.target_device.target, gdbserver_cmd_start)
+        cmd_lines.append(gdbserver_cmd)
+        GdbCrossConfig.write_file(self.gdbserver_script, cmd_lines, True)
+
+    def _gen_gdbinit_config(self):
+        """Generate a gdbinit file for this binary and the corresponding gdbserver configuration"""
+        gdbinit_lines = ['# This file is generated by devtool ide-sdk']
+        if self.gdbserver_multi:
+            target_help = '#   gdbserver --multi :%d' % self.gdbserver_port
+            remote_cmd = 'target extended-remote'
+        else:
+            target_help = '#   gdbserver :%d %s' % (
+                self.gdbserver_port, self.binary)
+            remote_cmd = 'target remote'
+        gdbinit_lines.append('# On the remote target:')
+        gdbinit_lines.append(target_help)
+        gdbinit_lines.append('# On the build machine:')
+        gdbinit_lines.append('#   cd ' + self.modified_recipe.real_srctree)
+        gdbinit_lines.append(
+            '#   ' + self.gdb_cross.gdb + ' -ix ' + self.gdbinit)
+
+        gdbinit_lines.append('set sysroot ' + self.modified_recipe.d)
+        gdbinit_lines.append('set substitute-path "/usr/include" "' +
+                             os.path.join(self.modified_recipe.recipe_sysroot, 'usr', 'include') + '"')
+        # Disable debuginfod for now, the IDE configuration uses rootfs-dbg from the image workdir.
+        gdbinit_lines.append('set debuginfod enabled off')
+        if self.image_recipe.rootfs_dbg:
+            gdbinit_lines.append(
+                'set solib-search-path "' + self.modified_recipe.solib_search_path_str(self.image_recipe) + '"')
+            gdbinit_lines.append('set substitute-path "/usr/src/debug" "' + os.path.join(
+                self.image_recipe.rootfs_dbg, 'usr', 'src', 'debug') + '"')
+        gdbinit_lines.append(
+            '%s %s:%d' % (remote_cmd, self.gdb_cross.host, self.gdbserver_port))
+        gdbinit_lines.append('set remote exec-file ' + self.binary)
+        gdbinit_lines.append(
+            'run ' + os.path.join(self.modified_recipe.d, self.binary))
+
+        GdbCrossConfig.write_file(self.gdbinit, gdbinit_lines)
+
+    def _gen_gdb_start_script(self):
+        """Generate a script starting GDB with the corresponding gdbinit configuration."""
+        cmd_lines = ['#!/bin/sh']
+        cmd_lines.append('cd ' + self.modified_recipe.real_srctree)
+        cmd_lines.append(self.gdb_cross.gdb + ' -ix ' +
+                         self.gdbinit + ' "$@"')
+        GdbCrossConfig.write_file(self.gdb_script, cmd_lines, True)
+
+    def initialize(self):
+        self._gen_gdbserver_start_script()
+        self._gen_gdbinit_config()
+        self._gen_gdb_start_script()
+
+    @staticmethod
+    def write_file(script_file, cmd_lines, executable=False):
+        script_dir = os.path.dirname(script_file)
+        mkdirhier(script_dir)
+        with open(script_file, 'w') as script_f:
+            script_f.write(os.linesep.join(cmd_lines))
+            script_f.write(os.linesep)
+        if executable:
+            st = os.stat(script_file)
+            os.chmod(script_file, st.st_mode | stat.S_IEXEC)
+        logger.info("Created: %s" % script_file)
+
+
+class IdeBase:
+    """Base class defining the interface for IDE plugins"""
+
+    def __init__(self):
+        self.ide_name = 'undefined'
+        self.gdb_cross_configs = []
+
+    @classmethod
+    def ide_plugin_priority(cls):
+        """Used to find the default ide handler if --ide is not passed"""
+        return 10
+
+    def setup_shared_sysroots(self, shared_env):
+        logger.warn("Shared sysroot mode is not supported for IDE %s" %
+                    self.ide_name)
+
+    def setup_modified_recipe(self, args, image_recipe, modified_recipe):
+        logger.warn("Modified recipe mode is not supported for IDE %s" %
+                    self.ide_name)
+
+    def initialize_gdb_cross_configs(self, image_recipe, modified_recipe, gdb_cross_config_class=GdbCrossConfig):
+        binaries = modified_recipe.find_installed_binaries()
+        for binary in binaries:
+            gdb_cross_config = gdb_cross_config_class(
+                image_recipe, modified_recipe, binary)
+            gdb_cross_config.initialize()
+            self.gdb_cross_configs.append(gdb_cross_config)
+
+    @staticmethod
+    def gen_oe_scrtips_sym_link(modified_recipe):
+        # create a sym-link from sources to the scripts directory
+        if os.path.isdir(modified_recipe.ide_sdk_scripts_dir):
+            IdeBase.symlink_force(modified_recipe.ide_sdk_scripts_dir,
+                                  os.path.join(modified_recipe.real_srctree, 'oe-scripts'))
+
+    @staticmethod
+    def update_json_file(json_dir, json_file, update_dict):
+        """Update a json file
+
+        By default it uses the dict.update function. If this is not suitable,
+        an update function might be passed via an update_func parameter.
+        """
+        json_path = os.path.join(json_dir, json_file)
+        logger.info("Updating IDE config file: %s (%s)" %
+                    (json_file, json_path))
+        if not os.path.exists(json_dir):
+            os.makedirs(json_dir)
+        try:
+            with open(json_path) as f:
+                orig_dict = json.load(f)
+        except json.decoder.JSONDecodeError:
+            logger.info(
+                "Decoding %s failed. Probably because of comments in the json file" % json_path)
+            orig_dict = {}
+        except FileNotFoundError:
+            orig_dict = {}
+        orig_dict.update(update_dict)
+        with open(json_path, 'w') as f:
+            json.dump(orig_dict, f, indent=4)
+
+    @staticmethod
+    def symlink_force(tgt, dst):
+        try:
+            os.symlink(tgt, dst)
+        except OSError as err:
+            if err.errno == errno.EEXIST:
+                if os.readlink(dst) != tgt:
+                    os.remove(dst)
+                    os.symlink(tgt, dst)
+            else:
+                raise err
+
+
+def get_devtool_deploy_opts(args):
+    """Filter args for devtool deploy-target args"""
+    if not args.target:
+        return None
+    devtool_deploy_opts = [args.target]
+    if args.no_host_check:
+        devtool_deploy_opts += ["-c"]
+    if args.show_status:
+        devtool_deploy_opts += ["-s"]
+    if args.no_preserve:
+        devtool_deploy_opts += ["-p"]
+    if args.no_check_space:
+        devtool_deploy_opts += ["--no-check-space"]
+    if args.ssh_exec:
+        devtool_deploy_opts += ["-e", args.ssh.exec]
+    if args.port:
+        devtool_deploy_opts += ["-P", args.port]
+    if args.key:
+        devtool_deploy_opts += ["-I", args.key]
+    if args.strip is False:
+        devtool_deploy_opts += ["--no-strip"]
+    return devtool_deploy_opts
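Note: get_devtool_deploy_opts() returns the options as a list so callers can splice them into a generated command line (see deploy_opts in ide_code.py below). A hypothetical usage, with an argparse namespace and recipe name made up purely for illustration:

    import argparse

    from devtool.ide_plugins import get_devtool_deploy_opts

    # hypothetical args as devtool ide-sdk would pass them
    args = argparse.Namespace(
        target='root@192.168.7.2', no_host_check=True, show_status=False,
        no_preserve=False, no_check_space=False, ssh_exec=None, port=None,
        key=None, strip=True)

    deploy_cmd = 'devtool deploy-target %s %s' % (
        'example-recipe', ' '.join(get_devtool_deploy_opts(args)))
    # -> devtool deploy-target example-recipe root@192.168.7.2 -c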
diff --git a/poky/scripts/lib/devtool/ide_plugins/ide_code.py b/poky/scripts/lib/devtool/ide_plugins/ide_code.py
new file mode 100644
index 0000000..f44665e
--- /dev/null
+++ b/poky/scripts/lib/devtool/ide_plugins/ide_code.py
@@ -0,0 +1,445 @@
+#
+# Copyright (C) 2023-2024 Siemens AG
+#
+# SPDX-License-Identifier: GPL-2.0-only
+#
+"""Devtool ide-sdk IDE plugin for VSCode and VSCodium"""
+
+import json
+import logging
+import os
+import shutil
+from devtool.ide_plugins import BuildTool, IdeBase, GdbCrossConfig, get_devtool_deploy_opts
+
+logger = logging.getLogger('devtool')
+
+
+class GdbCrossConfigVSCode(GdbCrossConfig):
+    def __init__(self, image_recipe, modified_recipe, binary):
+        super().__init__(image_recipe, modified_recipe, binary, False)
+
+    def initialize(self):
+        self._gen_gdbserver_start_script()
+
+
+class IdeVSCode(IdeBase):
+    """Manage IDE configurations for VSCode
+
+    Modified recipe mode:
+    - cmake: use the cmake-preset generated by devtool ide-sdk
+    - meson: meson is called via a wrapper script generated by devtool ide-sdk
+
+    Shared sysroot mode:
+    In shared sysroot mode, the cross tool-chain is exported to the user's global configuration.
+    A workspace cannot be created because there is no recipe that defines how a workspace could
+    be set up.
+    - cmake: adds a cmake-kit to .local/share/CMakeTools/cmake-tools-kits.json
+             The cmake-kit uses the environment script and the tool-chain file
+             generated by meta-ide-support.
+    - meson: Meson needs manual workspace configuration.
+    """
+
+    @classmethod
+    def ide_plugin_priority(cls):
+        """If --ide is not passed this is the default plugin"""
+        if shutil.which('code'):
+            return 100
+        return 0
+
+    def setup_shared_sysroots(self, shared_env):
+        """Expose the toolchain of the shared sysroots SDK"""
+        datadir = shared_env.ide_support.datadir
+        deploy_dir_image = shared_env.ide_support.deploy_dir_image
+        real_multimach_target_sys = shared_env.ide_support.real_multimach_target_sys
+        standalone_sysroot_native = shared_env.build_sysroots.standalone_sysroot_native
+        vscode_ws_path = os.path.join(
+            os.environ['HOME'], '.local', 'share', 'CMakeTools')
+        cmake_kits_path = os.path.join(vscode_ws_path, 'cmake-tools-kits.json')
+        oecmake_generator = "Ninja"
+        env_script = os.path.join(
+            deploy_dir_image, 'environment-setup-' + real_multimach_target_sys)
+
+        if not os.path.isdir(vscode_ws_path):
+            os.makedirs(vscode_ws_path)
+        cmake_kits_old = []
+        if os.path.exists(cmake_kits_path):
+            with open(cmake_kits_path, 'r', encoding='utf-8') as cmake_kits_file:
+                cmake_kits_old = json.load(cmake_kits_file)
+        cmake_kits = cmake_kits_old.copy()
+
+        cmake_kit_new = {
+            "name": "OE " + real_multimach_target_sys,
+            "environmentSetupScript": env_script,
+            "toolchainFile": standalone_sysroot_native + datadir + "/cmake/OEToolchainConfig.cmake",
+            "preferredGenerator": {
+                "name": oecmake_generator
+            }
+        }
+
+        def merge_kit(cmake_kits, cmake_kit_new):
+            i = 0
+            while i < len(cmake_kits):
+                if 'environmentSetupScript' in cmake_kits[i] and \
+                        cmake_kits[i]['environmentSetupScript'] == cmake_kit_new['environmentSetupScript']:
+                    cmake_kits[i] = cmake_kit_new
+                    return
+                i += 1
+            cmake_kits.append(cmake_kit_new)
+        merge_kit(cmake_kits, cmake_kit_new)
+
+        if cmake_kits != cmake_kits_old:
+            logger.info("Updating: %s" % cmake_kits_path)
+            with open(cmake_kits_path, 'w', encoding='utf-8') as cmake_kits_file:
+                json.dump(cmake_kits, cmake_kits_file, indent=4)
+        else:
+            logger.info("Already up to date: %s" % cmake_kits_path)
+
+        cmake_native = os.path.join(
+            shared_env.build_sysroots.standalone_sysroot_native, 'usr', 'bin', 'cmake')
+        if os.path.isfile(cmake_native):
+            logger.info('cmake-kits call cmake by default. If the cmake provided by this SDK should be used, please add the following line to ".vscode/settings.json" file: "cmake.cmakePath": "%s"' % cmake_native)
+        else:
+            logger.error("Cannot find cmake native at: %s" % cmake_native)
+
+    def dot_code_dir(self, modified_recipe):
+        return os.path.join(modified_recipe.srctree, '.vscode')
+
+    def __vscode_settings_meson(self, settings_dict, modified_recipe):
+        if modified_recipe.build_tool is not BuildTool.MESON:
+            return
+        settings_dict["mesonbuild.mesonPath"] = modified_recipe.meson_wrapper
+
+        confopts = modified_recipe.mesonopts.split()
+        confopts += modified_recipe.meson_cross_file.split()
+        confopts += modified_recipe.extra_oemeson.split()
+        settings_dict["mesonbuild.configureOptions"] = confopts
+        settings_dict["mesonbuild.buildFolder"] = modified_recipe.b
+
+    def __vscode_settings_cmake(self, settings_dict, modified_recipe):
+        """Add cmake specific settings to settings.json.
+
+        Note: most settings are passed to the cmake preset.
+        """
+        if modified_recipe.build_tool is not BuildTool.CMAKE:
+            return
+        settings_dict["cmake.configureOnOpen"] = True
+        settings_dict["cmake.sourceDirectory"] = modified_recipe.real_srctree
+
+    def vscode_settings(self, modified_recipe, image_recipe):
+        files_excludes = {
+            "**/.git/**": True,
+            "**/oe-logs/**": True,
+            "**/oe-workdir/**": True,
+            "**/source-date-epoch/**": True
+        }
+        python_exclude = [
+            "**/.git/**",
+            "**/oe-logs/**",
+            "**/oe-workdir/**",
+            "**/source-date-epoch/**"
+        ]
+        files_readonly = {
+            modified_recipe.recipe_sysroot + '/**': True,
+            modified_recipe.recipe_sysroot_native + '/**': True,
+        }
+        if image_recipe.rootfs_dbg is not None:
+            files_readonly[image_recipe.rootfs_dbg + '/**'] = True
+        settings_dict = {
+            "files.watcherExclude": files_excludes,
+            "files.exclude": files_excludes,
+            "files.readonlyInclude": files_readonly,
+            "python.analysis.exclude": python_exclude
+        }
+        self.__vscode_settings_cmake(settings_dict, modified_recipe)
+        self.__vscode_settings_meson(settings_dict, modified_recipe)
+
+        settings_file = 'settings.json'
+        IdeBase.update_json_file(
+            self.dot_code_dir(modified_recipe), settings_file, settings_dict)
+
+    def __vscode_extensions_cmake(self, modified_recipe, recommendations):
+        if modified_recipe.build_tool is not BuildTool.CMAKE:
+            return
+        recommendations += [
+            "twxs.cmake",
+            "ms-vscode.cmake-tools",
+            "ms-vscode.cpptools",
+            "ms-vscode.cpptools-extension-pack",
+            "ms-vscode.cpptools-themes"
+        ]
+
+    def __vscode_extensions_meson(self, modified_recipe, recommendations):
+        if modified_recipe.build_tool is not BuildTool.MESON:
+            return
+        recommendations += [
+            'mesonbuild.mesonbuild',
+            "ms-vscode.cpptools",
+            "ms-vscode.cpptools-extension-pack",
+            "ms-vscode.cpptools-themes"
+        ]
+
+    def vscode_extensions(self, modified_recipe):
+        recommendations = []
+        self.__vscode_extensions_cmake(modified_recipe, recommendations)
+        self.__vscode_extensions_meson(modified_recipe, recommendations)
+        extensions_file = 'extensions.json'
+        IdeBase.update_json_file(
+            self.dot_code_dir(modified_recipe), extensions_file, {"recommendations": recommendations})
+
+    def vscode_c_cpp_properties(self, modified_recipe):
+        properties_dict = {
+            "name": modified_recipe.recipe_id_pretty,
+        }
+        if modified_recipe.build_tool is BuildTool.CMAKE:
+            properties_dict["configurationProvider"] = "ms-vscode.cmake-tools"
+        elif modified_recipe.build_tool is BuildTool.MESON:
+            properties_dict["configurationProvider"] = "mesonbuild.mesonbuild"
+        else:  # no C/C++ build
+            return
+
+        properties_dicts = {
+            "configurations": [
+                properties_dict
+            ],
+            "version": 4
+        }
+        prop_file = 'c_cpp_properties.json'
+        IdeBase.update_json_file(
+            self.dot_code_dir(modified_recipe), prop_file, properties_dicts)
+
+    def vscode_launch_bin_dbg(self, gdb_cross_config):
+        modified_recipe = gdb_cross_config.modified_recipe
+
+        launch_config = {
+            "name": gdb_cross_config.id_pretty,
+            "type": "cppdbg",
+            "request": "launch",
+            "program": os.path.join(modified_recipe.d, gdb_cross_config.binary.lstrip('/')),
+            "stopAtEntry": True,
+            "cwd": "${workspaceFolder}",
+            "environment": [],
+            "externalConsole": False,
+            "MIMode": "gdb",
+            "preLaunchTask": gdb_cross_config.id_pretty,
+            "miDebuggerPath": modified_recipe.gdb_cross.gdb,
+            "miDebuggerServerAddress": "%s:%d" % (modified_recipe.gdb_cross.host, gdb_cross_config.gdbserver_port)
+        }
+
+        # Search for header files in recipe-sysroot.
+        src_file_map = {
+            "/usr/include": os.path.join(modified_recipe.recipe_sysroot, "usr", "include")
+        }
+        # First of all, search for unstripped binaries in the image folder.
+        # These binaries are copied (and optionally stripped) by deploy-target
+        setup_commands = [
+            {
+                "description": "sysroot",
+                "text": "set sysroot " + modified_recipe.d
+            }
+        ]
+
+        if gdb_cross_config.image_recipe.rootfs_dbg:
+            launch_config['additionalSOLibSearchPath'] = modified_recipe.solib_search_path_str(
+                gdb_cross_config.image_recipe)
+            src_file_map["/usr/src/debug"] = os.path.join(
+                gdb_cross_config.image_recipe.rootfs_dbg, "usr", "src", "debug")
+        else:
+            logger.warning(
+                "Cannot setup debug symbols configuration for GDB. IMAGE_GEN_DEBUGFS is not enabled.")
+
+        launch_config['sourceFileMap'] = src_file_map
+        launch_config['setupCommands'] = setup_commands
+        return launch_config
+
+    def vscode_launch(self, modified_recipe):
+        """GDB Launch configuration for binaries (elf files)"""
+
+        configurations = [self.vscode_launch_bin_dbg(
+            gdb_cross_config) for gdb_cross_config in self.gdb_cross_configs]
+        launch_dict = {
+            "version": "0.2.0",
+            "configurations": configurations
+        }
+        launch_file = 'launch.json'
+        IdeBase.update_json_file(
+            self.dot_code_dir(modified_recipe), launch_file, launch_dict)
+
+    def vscode_tasks_cpp(self, args, modified_recipe):
+        run_install_deploy = modified_recipe.gen_install_deploy_script(args)
+        install_task_name = "install && deploy-target %s" % modified_recipe.recipe_id_pretty
+        tasks_dict = {
+            "version": "2.0.0",
+            "tasks": [
+                {
+                    "label": install_task_name,
+                    "type": "shell",
+                    "command": run_install_deploy,
+                    "problemMatcher": []
+                }
+            ]
+        }
+        for gdb_cross_config in self.gdb_cross_configs:
+            tasks_dict['tasks'].append(
+                {
+                    "label": gdb_cross_config.id_pretty,
+                    "type": "shell",
+                    "isBackground": True,
+                    "dependsOn": [
+                        install_task_name
+                    ],
+                    "command": gdb_cross_config.gdbserver_script,
+                    "problemMatcher": [
+                        {
+                            "pattern": [
+                                {
+                                    "regexp": ".",
+                                    "file": 1,
+                                    "location": 2,
+                                    "message": 3
+                                }
+                            ],
+                            "background": {
+                                "activeOnStart": True,
+                                "beginsPattern": ".",
+                                "endsPattern": ".",
+                            }
+                        }
+                    ]
+                })
+        tasks_file = 'tasks.json'
+        IdeBase.update_json_file(
+            self.dot_code_dir(modified_recipe), tasks_file, tasks_dict)
+
+    def vscode_tasks_fallback(self, args, modified_recipe):
+        oe_init_dir = modified_recipe.oe_init_dir
+        oe_init = ". %s %s > /dev/null && " % (modified_recipe.oe_init_build_env, modified_recipe.topdir)
+        dt_build = "devtool build "
+        dt_build_label = dt_build + modified_recipe.recipe_id_pretty
+        dt_build_cmd = dt_build + modified_recipe.bpn
+        clean_opt = " --clean"
+        dt_build_clean_label = dt_build + modified_recipe.recipe_id_pretty + clean_opt
+        dt_build_clean_cmd = dt_build + modified_recipe.bpn + clean_opt
+        dt_deploy = "devtool deploy-target "
+        dt_deploy_label = dt_deploy + modified_recipe.recipe_id_pretty
+        dt_deploy_cmd = dt_deploy + modified_recipe.bpn
+        dt_build_deploy_label = "devtool build & deploy-target %s" % modified_recipe.recipe_id_pretty
+        deploy_opts = ' '.join(get_devtool_deploy_opts(args))
+        tasks_dict = {
+            "version": "2.0.0",
+            "tasks": [
+                {
+                    "label": dt_build_label,
+                    "type": "shell",
+                    "command": "bash",
+                    "linux": {
+                        "options": {
+                            "cwd": oe_init_dir
+                        }
+                    },
+                    "args": [
+                        "--login",
+                        "-c",
+                        "%s%s" % (oe_init, dt_build_cmd)
+                    ],
+                    "problemMatcher": []
+                },
+                {
+                    "label": dt_deploy_label,
+                    "type": "shell",
+                    "command": "bash",
+                    "linux": {
+                        "options": {
+                            "cwd": oe_init_dir
+                        }
+                    },
+                    "args": [
+                        "--login",
+                        "-c",
+                        "%s%s %s" % (
+                            oe_init, dt_deploy_cmd, deploy_opts)
+                    ],
+                    "problemMatcher": []
+                },
+                {
+                    "label": dt_build_deploy_label,
+                    "dependsOrder": "sequence",
+                    "dependsOn": [
+                        dt_build_label,
+                        dt_deploy_label
+                    ],
+                    "problemMatcher": [],
+                    "group": {
+                        "kind": "build",
+                        "isDefault": True
+                    }
+                },
+                {
+                    "label": dt_build_clean_label,
+                    "type": "shell",
+                    "command": "bash",
+                    "linux": {
+                        "options": {
+                            "cwd": oe_init_dir
+                        }
+                    },
+                    "args": [
+                        "--login",
+                        "-c",
+                        "%s%s" % (oe_init, dt_build_clean_cmd)
+                    ],
+                    "problemMatcher": []
+                }
+            ]
+        }
+        if modified_recipe.gdb_cross:
+            for gdb_cross_config in self.gdb_cross_configs:
+                tasks_dict['tasks'].append(
+                    {
+                        "label": gdb_cross_config.id_pretty,
+                        "type": "shell",
+                        "isBackground": True,
+                        "dependsOn": [
+                            dt_build_deploy_label
+                        ],
+                        "command": gdb_cross_config.gdbserver_script,
+                        "problemMatcher": [
+                            {
+                                "pattern": [
+                                    {
+                                        "regexp": ".",
+                                        "file": 1,
+                                        "location": 2,
+                                        "message": 3
+                                    }
+                                ],
+                                "background": {
+                                    "activeOnStart": True,
+                                    "beginsPattern": ".",
+                                    "endsPattern": ".",
+                                }
+                            }
+                        ]
+                    })
+        tasks_file = 'tasks.json'
+        IdeBase.update_json_file(
+            self.dot_code_dir(modified_recipe), tasks_file, tasks_dict)
+
+    def vscode_tasks(self, args, modified_recipe):
+        if modified_recipe.build_tool.is_c_ccp:
+            self.vscode_tasks_cpp(args, modified_recipe)
+        else:
+            self.vscode_tasks_fallback(args, modified_recipe)
+
+    def setup_modified_recipe(self, args, image_recipe, modified_recipe):
+        self.vscode_settings(modified_recipe, image_recipe)
+        self.vscode_extensions(modified_recipe)
+        self.vscode_c_cpp_properties(modified_recipe)
+        if args.target:
+            self.initialize_gdb_cross_configs(
+                image_recipe, modified_recipe, gdb_cross_config_class=GdbCrossConfigVSCode)
+            self.vscode_launch(modified_recipe)
+            self.vscode_tasks(args, modified_recipe)
+
+
+def register_ide_plugin(ide_plugins):
+    ide_plugins['code'] = IdeVSCode
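Note: plugins like this one register themselves by name via register_ide_plugin() and compete through ide_plugin_priority() when --ide is not given. A minimal sketch of what an additional, out-of-tree plugin could look like against the same interface (class and key names are invented for illustration):

    from devtool.ide_plugins import IdeBase


    class IdeExample(IdeBase):
        """Hypothetical plugin for another editor"""

        @classmethod
        def ide_plugin_priority(cls):
            # stay below IdeVSCode (100) so it is only selected explicitly
            return 5

        def setup_modified_recipe(self, args, image_recipe, modified_recipe):
            # reuse the gdbinit/gdbserver helpers from the base class
            self.initialize_gdb_cross_configs(image_recipe, modified_recipe)


    def register_ide_plugin(ide_plugins):
        ide_plugins['example'] = IdeExample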
diff --git a/poky/scripts/lib/devtool/ide_plugins/ide_none.py b/poky/scripts/lib/devtool/ide_plugins/ide_none.py
new file mode 100644
index 0000000..f106c5a
--- /dev/null
+++ b/poky/scripts/lib/devtool/ide_plugins/ide_none.py
@@ -0,0 +1,53 @@
+#
+# Copyright (C) 2023-2024 Siemens AG
+#
+# SPDX-License-Identifier: GPL-2.0-only
+#
+"""Devtool ide-sdk generic IDE plugin"""
+
+import os
+import logging
+from devtool.ide_plugins import IdeBase, GdbCrossConfig
+
+logger = logging.getLogger('devtool')
+
+
+class IdeNone(IdeBase):
+    """Generate some generic helpers for other IDEs
+
+    Modified recipe mode:
+    Generate some helper scripts for remote debugging with GDB
+
+    Shared sysroot mode:
+    A wrapper for bitbake meta-ide-support and bitbake build-sysroots
+    """
+
+    def __init__(self):
+        super().__init__()
+
+    def setup_shared_sysroots(self, shared_env):
+        real_multimach_target_sys = shared_env.ide_support.real_multimach_target_sys
+        deploy_dir_image = shared_env.ide_support.deploy_dir_image
+        env_script = os.path.join(
+            deploy_dir_image, 'environment-setup-' + real_multimach_target_sys)
+        logger.info(
+            "To use this SDK please source this: %s" % env_script)
+
+    def setup_modified_recipe(self, args, image_recipe, modified_recipe):
+        """generate some helper scripts and config files
+
+        - Execute the do_install task
+        - Execute devtool deploy-target
+        - Generate a gdbinit file per executable
+        - Generate the oe-scripts sym-link
+        """
+        script_path = modified_recipe.gen_install_deploy_script(args)
+        logger.info("Created: %s" % script_path)
+
+        self.initialize_gdb_cross_configs(image_recipe, modified_recipe)
+
+        IdeBase.gen_oe_scrtips_sym_link(modified_recipe)
+
+
+def register_ide_plugin(ide_plugins):
+    ide_plugins['none'] = IdeNone
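Note: with both modules loaded, ide_plugins maps plugin names to IdeBase subclasses (see ide_plugins = {} in ide_sdk.py below). When --ide is not passed, the plugin with the highest ide_plugin_priority() presumably wins; a possible selection helper, shown only as an assumption since the actual logic in ide_sdk.py is not reproduced here:

    def get_default_ide_plugin(ide_plugins):
        # e.g. IdeVSCode returns 100 if the 'code' executable is found,
        # IdeNone falls back to the IdeBase default of 10
        name, plugin_cls = max(
            ide_plugins.items(), key=lambda item: item[1].ide_plugin_priority())
        return name, plugin_cls()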
diff --git a/poky/scripts/lib/devtool/ide_sdk.py b/poky/scripts/lib/devtool/ide_sdk.py
new file mode 100755
index 0000000..1467974
--- /dev/null
+++ b/poky/scripts/lib/devtool/ide_sdk.py
@@ -0,0 +1,1065 @@
+# Development tool - ide-sdk command plugin
+#
+# Copyright (C) 2023-2024 Siemens AG
+#
+# SPDX-License-Identifier: GPL-2.0-only
+#
+"""Devtool ide-sdk plugin"""
+
+import json
+import logging
+import os
+import re
+import shutil
+import stat
+import subprocess
+import sys
+from argparse import RawTextHelpFormatter
+from enum import Enum
+
+import scriptutils
+import bb
+from devtool import exec_build_env_command, setup_tinfoil, check_workspace_recipe, DevtoolError, parse_recipe
+from devtool.standard import get_real_srctree
+from devtool.ide_plugins import BuildTool
+
+
+logger = logging.getLogger('devtool')
+
+# dict of classes derived from IdeBase
+ide_plugins = {}
+
+
+class DevtoolIdeMode(Enum):
+    """Different modes are supported by the ide-sdk plugin.
+
+    The enum might be extended by more advanced modes in the future. Some ideas:
+    - auto: modified if all recipes are modified, shared if none of the recipes is modified.
+    - mixed: modified mode for modified recipes, shared mode for all other recipes.
+    """
+
+    modified = 'modified'
+    shared = 'shared'
+
+
+class TargetDevice:
+    """SSH remote login parameters"""
+
+    def __init__(self, args):
+        self.extraoptions = ''
+        if args.no_host_check:
+            self.extraoptions += '-o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no'
+        self.ssh_sshexec = 'ssh'
+        if args.ssh_exec:
+            self.ssh_sshexec = args.ssh_exec
+        self.ssh_port = ''
+        if args.port:
+            self.ssh_port = "-p %s" % args.port
+        if args.key:
+            self.extraoptions += ' -i %s' % args.key
+
+        self.target = args.target
+        target_sp = args.target.split('@')
+        if len(target_sp) == 1:
+            self.login = ""
+            self.host = target_sp[0]
+        elif len(target_sp) == 2:
+            self.login = target_sp[0]
+            self.host = target_sp[1]
+        else:
+            logger.error("Invalid target argument: %s" % args.target)
+
+
+class RecipeNative:
+    """Base class for calling bitbake to provide a -native recipe"""
+
+    def __init__(self, name, target_arch=None):
+        self.name = name
+        self.target_arch = target_arch
+        self.bootstrap_tasks = [self.name + ':do_addto_recipe_sysroot']
+        self.staging_bindir_native = None
+        self.target_sys = None
+        self.__native_bin = None
+
+    def _initialize(self, config, workspace, tinfoil):
+        """Get the parsed recipe"""
+        recipe_d = parse_recipe(
+            config, tinfoil, self.name, appends=True, filter_workspace=False)
+        if not recipe_d:
+            raise DevtoolError("Parsing %s recipe failed" % self.name)
+        self.staging_bindir_native = os.path.realpath(
+            recipe_d.getVar('STAGING_BINDIR_NATIVE'))
+        self.target_sys = recipe_d.getVar('TARGET_SYS')
+        return recipe_d
+
+    def initialize(self, config, workspace, tinfoil):
+        """Basic initialization that can be overridden by a derived class"""
+        self._initialize(config, workspace, tinfoil)
+
+    @property
+    def native_bin(self):
+        if not self.__native_bin:
+            raise DevtoolError("native binary name is not defined.")
+        return self.__native_bin
+
+
+class RecipeGdbCross(RecipeNative):
+    """Handle handle gdb-cross on the host and the gdbserver on the target device"""
+
+    def __init__(self, args, target_arch, target_device):
+        super().__init__('gdb-cross-' + target_arch, target_arch)
+        self.target_device = target_device
+        self.gdb = None
+        self.gdbserver_port_next = int(args.gdbserver_port_start)
+        self.config_db = {}
+
+    def __find_gdbserver(self, config, tinfoil):
+        """Absolute path of the gdbserver"""
+        recipe_d_gdb = parse_recipe(
+            config, tinfoil, 'gdb', appends=True, filter_workspace=False)
+        if not recipe_d_gdb:
+            raise DevtoolError("Parsing gdb recipe failed")
+        return os.path.join(recipe_d_gdb.getVar('bindir'), 'gdbserver')
+
+    def initialize(self, config, workspace, tinfoil):
+        super()._initialize(config, workspace, tinfoil)
+        gdb_bin = self.target_sys + '-gdb'
+        gdb_path = os.path.join(
+            self.staging_bindir_native, self.target_sys, gdb_bin)
+        self.gdb = gdb_path
+        self.gdbserver_path = self.__find_gdbserver(config, tinfoil)
+
+    @property
+    def host(self):
+        return self.target_device.host
+
+
+class RecipeImage:
+    """Handle some image recipe related properties
+
+    Most workflows require firmware that runs on the target device.
+    This firmware must be consistent with the setup of the host system.
+    In particular, the debug symbols must be compatible. For this, the
+    rootfs must be created as part of the SDK.
+    """
+
+    def __init__(self, name):
+        self.combine_dbg_image = False
+        self.gdbserver_missing = False
+        self.name = name
+        self.rootfs = None
+        self.__rootfs_dbg = None
+        self.bootstrap_tasks = [self.name + ':do_build']
+
+    def initialize(self, config, tinfoil):
+        image_d = parse_recipe(
+            config, tinfoil, self.name, appends=True, filter_workspace=False)
+        if not image_d:
+            raise DevtoolError(
+                "Parsing image recipe %s failed" % self.name)
+
+        self.combine_dbg_image = bb.data.inherits_class(
+            'image-combined-dbg', image_d)
+
+        workdir = image_d.getVar('WORKDIR')
+        self.rootfs = os.path.join(workdir, 'rootfs')
+        if image_d.getVar('IMAGE_GEN_DEBUGFS') == "1":
+            self.__rootfs_dbg = os.path.join(workdir, 'rootfs-dbg')
+
+        self.gdbserver_missing = 'gdbserver' not in image_d.getVar(
+            'IMAGE_INSTALL')
+
+    @property
+    def debug_support(self):
+        return bool(self.rootfs_dbg)
+
+    @property
+    def rootfs_dbg(self):
+        if self.__rootfs_dbg and os.path.isdir(self.__rootfs_dbg):
+            return self.__rootfs_dbg
+        return None
+
+
+class RecipeMetaIdeSupport:
+    """For the shared sysroots mode meta-ide-support is needed
+
+    For use cases where just a cross tool-chain is required but
+    no recipe is used, devtool ide-sdk abstracts calling bitbake meta-ide-support
+    and bitbake build-sysroots. This also makes it possible to expose the
+    cross-toolchains to IDEs. For example, VSCode supports different tool-chains
+    via cmake-kits.
+    """
+
+    def __init__(self):
+        self.bootstrap_tasks = ['meta-ide-support:do_build']
+        self.topdir = None
+        self.datadir = None
+        self.deploy_dir_image = None
+        self.build_sys = None
+        # From toolchain-scripts
+        self.real_multimach_target_sys = None
+
+    def initialize(self, config, tinfoil):
+        meta_ide_support_d = parse_recipe(
+            config, tinfoil, 'meta-ide-support', appends=True, filter_workspace=False)
+        if not meta_ide_support_d:
+            raise DevtoolError("Parsing meta-ide-support recipe failed")
+
+        self.topdir = meta_ide_support_d.getVar('TOPDIR')
+        self.datadir = meta_ide_support_d.getVar('datadir')
+        self.deploy_dir_image = meta_ide_support_d.getVar(
+            'DEPLOY_DIR_IMAGE')
+        self.build_sys = meta_ide_support_d.getVar('BUILD_SYS')
+        self.real_multimach_target_sys = meta_ide_support_d.getVar(
+            'REAL_MULTIMACH_TARGET_SYS')
+
+
+class RecipeBuildSysroots:
+    """For the shared sysroots mode build-sysroots is needed"""
+
+    def __init__(self):
+        self.standalone_sysroot = None
+        self.standalone_sysroot_native = None
+        self.bootstrap_tasks = [
+            'build-sysroots:do_build_target_sysroot',
+            'build-sysroots:do_build_native_sysroot'
+        ]
+
+    def initialize(self, config, tinfoil):
+        build_sysroots_d = parse_recipe(
+            config, tinfoil, 'build-sysroots', appends=True, filter_workspace=False)
+        if not build_sysroots_d:
+            raise DevtoolError("Parsing build-sysroots recipe failed")
+        self.standalone_sysroot = build_sysroots_d.getVar(
+            'STANDALONE_SYSROOT')
+        self.standalone_sysroot_native = build_sysroots_d.getVar(
+            'STANDALONE_SYSROOT_NATIVE')
+
+
+class SharedSysrootsEnv:
+    """Handle the shared sysroots based workflow
+
+    Support the workflow with just a tool-chain without a recipe.
+    It's basically like:
+      bitbake some-dependencies
+      bitbake meta-ide-support
+      bitbake build-sysroots
+      Use the environment-* file found in the deploy folder
+    """
+
+    def __init__(self):
+        self.ide_support = None
+        self.build_sysroots = None
+
+    def initialize(self, ide_support, build_sysroots):
+        self.ide_support = ide_support
+        self.build_sysroots = build_sysroots
+
+    def setup_ide(self, ide):
+        ide.setup(self)
+
+
+class RecipeNotModified:
+    """Handling of recipes added to the Direct DSK shared sysroots."""
+
+    def __init__(self, name):
+        self.name = name
+        self.bootstrap_tasks = [name + ':do_populate_sysroot']
+
+
+class RecipeModified:
+    """Handling of recipes in the workspace created by devtool modify"""
+    OE_INIT_BUILD_ENV = 'oe-init-build-env'
+
+    VALID_BASH_ENV_NAME_CHARS = re.compile(r"^[a-zA-Z0-9_]*$")
+
+    def __init__(self, name):
+        self.name = name
+        self.bootstrap_tasks = [name + ':do_install']
+        self.gdb_cross = None
+        # workspace
+        self.real_srctree = None
+        self.srctree = None
+        self.ide_sdk_dir = None
+        self.ide_sdk_scripts_dir = None
+        self.bbappend = None
+        # recipe variables from d.getVar
+        self.b = None
+        self.base_libdir = None
+        self.bblayers = None
+        self.bpn = None
+        self.d = None
+        self.fakerootcmd = None
+        self.fakerootenv = None
+        self.libdir = None
+        self.max_process = None
+        self.package_arch = None
+        self.package_debug_split_style = None
+        self.path = None
+        self.pn = None
+        self.recipe_sysroot = None
+        self.recipe_sysroot_native = None
+        self.staging_incdir = None
+        self.strip_cmd = None
+        self.target_arch = None
+        self.topdir = None
+        self.workdir = None
+        self.recipe_id = None
+        # replicate bitbake build environment
+        self.exported_vars = None
+        self.cmd_compile = None
+        self.__oe_init_dir = None
+        # main build tool used by this recipe
+        self.build_tool = BuildTool.UNDEFINED
+        # build_tool = cmake
+        self.oecmake_generator = None
+        self.cmake_cache_vars = None
+        # build_tool = meson
+        self.meson_buildtype = None
+        self.meson_wrapper = None
+        self.mesonopts = None
+        self.extra_oemeson = None
+        self.meson_cross_file = None
+
+    def initialize(self, config, workspace, tinfoil):
+        recipe_d = parse_recipe(
+            config, tinfoil, self.name, appends=True, filter_workspace=False)
+        if not recipe_d:
+            raise DevtoolError("Parsing %s recipe failed" % self.name)
+
+        # Verify this recipe is built as externalsrc setup by devtool modify
+        workspacepn = check_workspace_recipe(
+            workspace, self.name, bbclassextend=True)
+        self.srctree = workspace[workspacepn]['srctree']
+        # Need to grab this here in case the source is within a subdirectory
+        self.real_srctree = get_real_srctree(
+            self.srctree, recipe_d.getVar('S'), recipe_d.getVar('WORKDIR'))
+        self.bbappend = workspace[workspacepn]['bbappend']
+
+        self.ide_sdk_dir = os.path.join(
+            config.workspace_path, 'ide-sdk', self.name)
+        if os.path.exists(self.ide_sdk_dir):
+            shutil.rmtree(self.ide_sdk_dir)
+        self.ide_sdk_scripts_dir = os.path.join(self.ide_sdk_dir, 'scripts')
+
+        self.b = recipe_d.getVar('B')
+        self.base_libdir = recipe_d.getVar('base_libdir')
+        self.bblayers = recipe_d.getVar('BBLAYERS').split()
+        self.bpn = recipe_d.getVar('BPN')
+        self.d = recipe_d.getVar('D')
+        self.fakerootcmd = recipe_d.getVar('FAKEROOTCMD')
+        self.fakerootenv = recipe_d.getVar('FAKEROOTENV')
+        self.libdir = recipe_d.getVar('libdir')
+        self.max_process = int(recipe_d.getVar(
+            "BB_NUMBER_THREADS") or os.cpu_count() or 1)
+        self.package_arch = recipe_d.getVar('PACKAGE_ARCH')
+        self.package_debug_split_style = recipe_d.getVar(
+            'PACKAGE_DEBUG_SPLIT_STYLE')
+        self.path = recipe_d.getVar('PATH')
+        self.pn = recipe_d.getVar('PN')
+        self.recipe_sysroot = os.path.realpath(
+            recipe_d.getVar('RECIPE_SYSROOT'))
+        self.recipe_sysroot_native = os.path.realpath(
+            recipe_d.getVar('RECIPE_SYSROOT_NATIVE'))
+        self.staging_incdir = os.path.realpath(
+            recipe_d.getVar('STAGING_INCDIR'))
+        self.strip_cmd = recipe_d.getVar('STRIP')
+        self.target_arch = recipe_d.getVar('TARGET_ARCH')
+        self.topdir = recipe_d.getVar('TOPDIR')
+        self.workdir = os.path.realpath(recipe_d.getVar('WORKDIR'))
+
+        self.__init_exported_variables(recipe_d)
+
+        if bb.data.inherits_class('cmake', recipe_d):
+            self.oecmake_generator = recipe_d.getVar('OECMAKE_GENERATOR')
+            self.__init_cmake_preset_cache(recipe_d)
+            self.build_tool = BuildTool.CMAKE
+        elif bb.data.inherits_class('meson', recipe_d):
+            self.meson_buildtype = recipe_d.getVar('MESON_BUILDTYPE')
+            self.mesonopts = recipe_d.getVar('MESONOPTS')
+            self.extra_oemeson = recipe_d.getVar('EXTRA_OEMESON')
+            self.meson_cross_file = recipe_d.getVar('MESON_CROSS_FILE')
+            self.build_tool = BuildTool.MESON
+
+        # Recipe ID is the identifier for IDE config sections
+        self.recipe_id = self.bpn + "-" + self.package_arch
+        self.recipe_id_pretty = self.bpn + ": " + self.package_arch
+
+    def append_to_bbappend(self, append_text):
+        with open(self.bbappend, 'a') as bbap:
+            bbap.write(append_text)
+
+    def remove_from_bbappend(self, append_text):
+        with open(self.bbappend, 'r') as bbap:
+            text = bbap.read()
+        new_text = text.replace(append_text, '')
+        with open(self.bbappend, 'w') as bbap:
+            bbap.write(new_text)
+
+    @staticmethod
+    def is_valid_shell_variable(var):
+        """Skip strange shell variables like systemd
+
+        This prevents odd bugs caused by strange variables which are not used
+        in this context but break various tools.
+        """
+        if RecipeModified.VALID_BASH_ENV_NAME_CHARS.match(var):
+            return True
+        bb.debug(1, "ignoring variable: %s" % var)
+        return False
+
+    def debug_build_config(self, args):
+        """Explicitely set for example CMAKE_BUILD_TYPE to Debug if not defined otherwise"""
+        if self.build_tool is BuildTool.CMAKE:
+            append_text = os.linesep + \
+                'OECMAKE_ARGS:append = " -DCMAKE_BUILD_TYPE:STRING=Debug"' + os.linesep
+            if args.debug_build_config and not 'CMAKE_BUILD_TYPE' in self.cmake_cache_vars:
+                self.cmake_cache_vars['CMAKE_BUILD_TYPE'] = {
+                    "type": "STRING",
+                    "value": "Debug",
+                }
+                self.append_to_bbappend(append_text)
+            elif 'CMAKE_BUILD_TYPE' in self.cmake_cache_vars:
+                del self.cmake_cache_vars['CMAKE_BUILD_TYPE']
+                self.remove_from_bbappend(append_text)
+        elif self.build_tool is BuildTool.MESON:
+            append_text = os.linesep + 'MESON_BUILDTYPE = "debug"' + os.linesep
+            if args.debug_build_config and self.meson_buildtype != "debug":
+                self.mesonopts = self.mesonopts.replace(
+                    '--buildtype ' + self.meson_buildtype, '--buildtype debug')
+                self.append_to_bbappend(append_text)
+            elif self.meson_buildtype == "debug":
+                self.mesonopts = self.mesonopts.replace(
+                    '--buildtype debug', '--buildtype plain')
+                self.remove_from_bbappend(append_text)
+        elif args.debug_build_config:
+            logger.warn(
+                "--debug-build-config is not implemented for this build tool yet.")
+
+    def solib_search_path(self, image):
+        """Search for debug symbols in the rootfs and rootfs-dbg
+
+        The debug symbols of shared libraries which are provided by other packages
+        are grabbed from the -dbg packages in the rootfs-dbg.
+
+        But most cross debugging tools like gdb, perf, and systemtap need to find
+        the executable/library first and then locate the corresponding symbols file
+        through its debuglink note. Therefore the library paths from the rootfs are
+        added as well.
+
+        Note: For the devtool modified recipe compiled from the IDE, the debug
+        symbols are taken from the unstripped binaries in the image folder.
+        Also, devtool deploy-target takes the files from the image folder.
+        debug symbols in the image folder refer to the corresponding source files
+        with absolute paths of the build machine. Debug symbols found in the
+        rootfs-dbg are relocated and contain paths which refer to the source files
+        installed on the target device e.g. /usr/src/...
+        """
+        base_libdir = self.base_libdir.lstrip('/')
+        libdir = self.libdir.lstrip('/')
+        so_paths = [
+            # debug symbols for package_debug_split_style: debug-with-srcpkg or .debug
+            os.path.join(image.rootfs_dbg, base_libdir, ".debug"),
+            os.path.join(image.rootfs_dbg, libdir, ".debug"),
+            # debug symbols for package_debug_split_style: debug-file-directory
+            os.path.join(image.rootfs_dbg, "usr", "lib", "debug"),
+
+            # The binaries are required as well, the debug packages are not enough
+            # With image-combined-dbg.bbclass the binaries are copied into rootfs-dbg
+            os.path.join(image.rootfs_dbg, base_libdir),
+            os.path.join(image.rootfs_dbg, libdir),
+            # Without image-combined-dbg.bbclass the binaries are only in rootfs.
+            # Note: Stepping into source files located in rootfs-dbg does not
+            #       work without image-combined-dbg.bbclass yet.
+            os.path.join(image.rootfs, base_libdir),
+            os.path.join(image.rootfs, libdir)
+        ]
+        return so_paths
+
+    def solib_search_path_str(self, image):
+        """Return a : separated list of paths usable by GDB's set solib-search-path"""
+        return ':'.join(self.solib_search_path(image))
+
+    def __init_exported_variables(self, d):
+        """Find all variables with export flag set.
+
+        This makes it possible to generate IDE configurations which compile with
+        the same environment as bitbake does. That's at least a reasonable default
+        behavior.
+        """
+        exported_vars = {}
+
+        vars = (key for key in d.keys() if not key.startswith(
+            "__") and not d.getVarFlag(key, "func", False))
+        for var in vars:
+            func = d.getVarFlag(var, "func", False)
+            if d.getVarFlag(var, 'python', False) and func:
+                continue
+            export = d.getVarFlag(var, "export", False)
+            unexport = d.getVarFlag(var, "unexport", False)
+            if not export and not unexport and not func:
+                continue
+            if unexport:
+                continue
+
+            val = d.getVar(var)
+            if val is None:
+                continue
+            if set(var) & set("-.{}+"):
+                logger.warn(
+                    "Warning: Found invalid character in variable name %s", str(var))
+                continue
+            varExpanded = d.expand(var)
+            val = str(val)
+
+            if not RecipeModified.is_valid_shell_variable(varExpanded):
+                continue
+
+            if func:
+                code_line = "line: {0}, file: {1}\n".format(
+                    d.getVarFlag(var, "lineno", False),
+                    d.getVarFlag(var, "filename", False))
+                val = val.rstrip('\n')
+                logger.warn("Warning: exported shell function %s() is not exported (%s)" %
+                            (varExpanded, code_line))
+                continue
+
+            if export:
+                exported_vars[varExpanded] = val.strip()
+                continue
+
+        self.exported_vars = exported_vars
+
+    def __init_cmake_preset_cache(self, d):
+        """Get the arguments passed to cmake
+
+        Replicate the cmake configure arguments with all details to
+        share on build folder between bitbake and SDK.
+        """
+        site_file = os.path.join(self.workdir, 'site-file.cmake')
+        if os.path.exists(site_file):
+            print("Warning: site-file.cmake is not supported")
+
+        cache_vars = {}
+        oecmake_args = d.getVar('OECMAKE_ARGS').split()
+        extra_oecmake = d.getVar('EXTRA_OECMAKE').split()
+        for param in oecmake_args + extra_oecmake:
+            d_pref = "-D"
+            if param.startswith(d_pref):
+                param = param[len(d_pref):]
+            else:
+                print("Error: expected a -D")
+            param_s = param.split('=', 1)
+            param_nt = param_s[0].split(':', 1)
+
+            def handle_undefined_variable(var):
+                if var.startswith('${') and var.endswith('}'):
+                    return ''
+                else:
+                    return var
+            # Example: FOO=ON
+            if len(param_nt) == 1:
+                cache_vars[param_s[0]] = handle_undefined_variable(param_s[1])
+            # Example: FOO:PATH=/tmp
+            elif len(param_nt) == 2:
+                cache_vars[param_nt[0]] = {
+                    "type": param_nt[1],
+                    "value": handle_undefined_variable(param_s[1]),
+                }
+            else:
+                print("Error: cannot parse %s" % param)
+        self.cmake_cache_vars = cache_vars
+
+    def cmake_preset(self):
+        """Create a preset for cmake that mimics how bitbake calls cmake"""
+        toolchain_file = os.path.join(self.workdir, 'toolchain.cmake')
+        cmake_executable = os.path.join(
+            self.recipe_sysroot_native, 'usr', 'bin', 'cmake')
+        self.cmd_compile = cmake_executable + " --build --preset " + self.recipe_id
+
+        preset_dict_configure = {
+            "name": self.recipe_id,
+            "displayName": self.recipe_id_pretty,
+            "description": "Bitbake build environment for the recipe %s compiled for %s" % (self.bpn, self.package_arch),
+            "binaryDir": self.b,
+            "generator": self.oecmake_generator,
+            "toolchainFile": toolchain_file,
+            "cacheVariables": self.cmake_cache_vars,
+            "environment": self.exported_vars,
+            "cmakeExecutable": cmake_executable
+        }
+
+        preset_dict_build = {
+            "name": self.recipe_id,
+            "displayName": self.recipe_id_pretty,
+            "description": "Bitbake build environment for the recipe %s compiled for %s" % (self.bpn, self.package_arch),
+            "configurePreset": self.recipe_id,
+            "inheritConfigureEnvironment": True
+        }
+
+        preset_dict_test = {
+            "name": self.recipe_id,
+            "displayName": self.recipe_id_pretty,
+            "description": "Bitbake build environment for the recipe %s compiled for %s" % (self.bpn, self.package_arch),
+            "configurePreset": self.recipe_id,
+            "inheritConfigureEnvironment": True
+        }
+
+        preset_dict = {
+            "version": 3,  # cmake 3.21, backward compatible with kirkstone
+            "configurePresets": [preset_dict_configure],
+            "buildPresets": [preset_dict_build],
+            "testPresets": [preset_dict_test]
+        }
+
+        # Finally write the json file
+        json_file = 'CMakeUserPresets.json'
+        json_path = os.path.join(self.real_srctree, json_file)
+        logger.info("Updating CMake preset: %s (%s)" % (json_file, json_path))
+        if not os.path.exists(self.real_srctree):
+            os.makedirs(self.real_srctree)
+        try:
+            with open(json_path) as f:
+                orig_dict = json.load(f)
+        except json.decoder.JSONDecodeError:
+            logger.info(
+                "Decoding %s failed. Probably because of comments in the json file" % json_path)
+            orig_dict = {}
+        except FileNotFoundError:
+            orig_dict = {}
+
+        # Add or update the presets for the recipe and keep other presets
+        for k, v in preset_dict.items():
+            if isinstance(v, list):
+                update_preset = v[0]
+                preset_added = False
+                if k in orig_dict:
+                    for index, orig_preset in enumerate(orig_dict[k]):
+                        if 'name' in orig_preset:
+                            if orig_preset['name'] == update_preset['name']:
+                                logger.debug("Updating preset: %s" %
+                                             orig_preset['name'])
+                                orig_dict[k][index] = update_preset
+                                preset_added = True
+                                break
+                            else:
+                                logger.debug("keeping preset: %s" %
+                                             orig_preset['name'])
+                        else:
+                            logger.warn("preset without a name found")
+                if not preset_added:
+                    if not k in orig_dict:
+                        orig_dict[k] = []
+                    orig_dict[k].append(update_preset)
+                    logger.debug("Added preset: %s" %
+                                 update_preset['name'])
+            else:
+                orig_dict[k] = v
+
+        with open(json_path, 'w') as f:
+            json.dump(orig_dict, f, indent=4)
+
+    def gen_meson_wrapper(self):
+        """Generate a wrapper script to call meson with the cross environment"""
+        bb.utils.mkdirhier(self.ide_sdk_scripts_dir)
+        meson_wrapper = os.path.join(self.ide_sdk_scripts_dir, 'meson')
+        meson_real = os.path.join(
+            self.recipe_sysroot_native, 'usr', 'bin', 'meson.real')
+        with open(meson_wrapper, 'w') as mwrap:
+            mwrap.write("#!/bin/sh" + os.linesep)
+            for var, val in self.exported_vars.items():
+                mwrap.write('export %s="%s"' % (var, val) + os.linesep)
+            mwrap.write("unset CC CXX CPP LD AR NM STRIP" + os.linesep)
+            private_temp = os.path.join(self.b, "meson-private", "tmp")
+            mwrap.write('mkdir -p "%s"' % private_temp + os.linesep)
+            mwrap.write('export TMPDIR="%s"' % private_temp + os.linesep)
+            mwrap.write('exec "%s" "$@"' % meson_real + os.linesep)
+        st = os.stat(meson_wrapper)
+        os.chmod(meson_wrapper, st.st_mode | stat.S_IEXEC)
+        self.meson_wrapper = meson_wrapper
+        self.cmd_compile = meson_wrapper + " compile -C " + self.b
+
+    def which(self, executable):
+        bin_path = shutil.which(executable, path=self.path)
+        if not bin_path:
+            raise DevtoolError(
+                'Cannot find %s. Probably the recipe %s is not built yet.' % (executable, self.bpn))
+        return bin_path
+
+    @staticmethod
+    def is_elf_file(file_path):
+        with open(file_path, "rb") as f:
+            data = f.read(4)
+        if data == b'\x7fELF':
+            return True
+        return False
+
+    def find_installed_binaries(self):
+        """find all executable elf files in the image directory"""
+        binaries = []
+        d_len = len(self.d)
+        re_so = re.compile(r'.*\.so[.0-9]*$')
+        for root, _, files in os.walk(self.d, followlinks=False):
+            for file in files:
+                if os.path.islink(os.path.join(root, file)):
+                    continue
+                if re_so.match(file):
+                    continue
+                abs_name = os.path.join(root, file)
+                if os.access(abs_name, os.X_OK) and RecipeModified.is_elf_file(abs_name):
+                    binaries.append(abs_name[d_len:])
+        return sorted(binaries)
+
+    def gen_delete_package_dirs(self):
+        """delete folders of package tasks
+
+        This is a workaround for an issue with recipes having their sources
+        downloaded as file://
+        This likely breaks pseudo like:
+        path mismatch [3 links]: ino 79147802 db
+        .../build/tmp/.../cmake-example/1.0/package/usr/src/debug/
+                             cmake-example/1.0-r0/oe-local-files/cpp-example-lib.cpp
+        .../build/workspace/sources/cmake-example/oe-local-files/cpp-example-lib.cpp
+        Since the files are outdated anyway, let's delete them (also from pseudo's db) to work around this issue.
+        """
+        cmd_lines = ['#!/bin/sh']
+
+        # Set up the appropriate environment
+        newenv = dict(os.environ)
+        for varvalue in self.fakerootenv.split():
+            if '=' in varvalue:
+                splitval = varvalue.split('=', 1)
+                newenv[splitval[0]] = splitval[1]
+
+        # Replicate the environment variables from bitbake
+        for var, val in newenv.items():
+            if not RecipeModified.is_valid_shell_variable(var):
+                continue
+            cmd_lines.append('%s="%s"' % (var, val))
+            cmd_lines.append('export %s' % var)
+
+        # Delete the folders
+        pkg_dirs = ' '.join([os.path.join(self.workdir, d) for d in [
+            "package", "packages-split", "pkgdata", "sstate-install-package", "debugsources.list", "*.spec"]])
+        cmd = "%s rm -rf %s" % (self.fakerootcmd, pkg_dirs)
+        cmd_lines.append('%s || { "%s failed"; exit 1; }' % (cmd, cmd))
+
+        return self.write_script(cmd_lines, 'delete_package_dirs')
+
+    def gen_deploy_target_script(self, args):
+        """Generate a script which does what devtool deploy-target does
+
+        This script is much quicker than devtool deploy-target because it
+        does not need to start a bitbake server. All information from tinfoil
+        is hard-coded in the generated script.
+        """
+        cmd_lines = ['#!%s' % str(sys.executable)]
+        cmd_lines.append('import sys')
+        cmd_lines.append('devtool_sys_path = %s' % str(sys.path))
+        cmd_lines.append('devtool_sys_path.reverse()')
+        cmd_lines.append('for p in devtool_sys_path:')
+        cmd_lines.append('    if p not in sys.path:')
+        cmd_lines.append('        sys.path.insert(0, p)')
+        cmd_lines.append('from devtool.deploy import deploy_no_d')
+        args_filter = ['debug', 'dry_run', 'key', 'no_check_space', 'no_host_check',
+                       'no_preserve', 'port', 'show_status', 'ssh_exec', 'strip', 'target']
+        filtered_args_dict = {key: value for key, value in vars(
+            args).items() if key in args_filter}
+        cmd_lines.append('filtered_args_dict = %s' % str(filtered_args_dict))
+        cmd_lines.append('class Dict2Class(object):')
+        cmd_lines.append('    def __init__(self, my_dict):')
+        cmd_lines.append('        for key in my_dict:')
+        cmd_lines.append('            setattr(self, key, my_dict[key])')
+        cmd_lines.append('filtered_args = Dict2Class(filtered_args_dict)')
+        cmd_lines.append(
+            'setattr(filtered_args, "recipename", "%s")' % self.bpn)
+        cmd_lines.append('deploy_no_d("%s", "%s", "%s", "%s", "%s", "%s", %d, "%s", "%s", filtered_args)' %
+                         (self.d, self.workdir, self.path, self.strip_cmd,
+                          self.libdir, self.base_libdir, self.max_process,
+                          self.fakerootcmd, self.fakerootenv))
+        return self.write_script(cmd_lines, 'deploy_target')
+
+    def gen_install_deploy_script(self, args):
+        """Generate a script which does install and deploy"""
+        cmd_lines = ['#!/bin/bash']
+
+        cmd_lines.append(self.gen_delete_package_dirs())
+
+        # . oe-init-build-env $BUILDDIR
+        # Note: Sourcing scripts with arguments requires bash
+        cmd_lines.append('cd "%s" || { echo "cd %s failed"; exit 1; }' % (
+            self.oe_init_dir, self.oe_init_dir))
+        cmd_lines.append('. "%s" "%s" || { echo ". %s %s failed"; exit 1; }' % (
+            self.oe_init_build_env, self.topdir, self.oe_init_build_env, self.topdir))
+
+        # bitbake -c install
+        cmd_lines.append(
+            'bitbake %s -c install --force || { echo "bitbake %s -c install --force failed"; exit 1; }' % (self.bpn, self.bpn))
+
+        # Self contained devtool deploy-target
+        cmd_lines.append(self.gen_deploy_target_script(args))
+
+        return self.write_script(cmd_lines, 'install_and_deploy')
+
+    def write_script(self, cmd_lines, script_name):
+        bb.utils.mkdirhier(self.ide_sdk_scripts_dir)
+        script_name_arch = script_name + '_' + self.recipe_id
+        script_file = os.path.join(self.ide_sdk_scripts_dir, script_name_arch)
+        with open(script_file, 'w') as script_f:
+            script_f.write(os.linesep.join(cmd_lines))
+        st = os.stat(script_file)
+        os.chmod(script_file, st.st_mode | stat.S_IEXEC)
+        return script_file
+
+    @property
+    def oe_init_build_env(self):
+        """Find the oe-init-build-env used for this setup"""
+        oe_init_dir = self.oe_init_dir
+        if oe_init_dir:
+            return os.path.join(oe_init_dir, RecipeModified.OE_INIT_BUILD_ENV)
+        return None
+
+    @property
+    def oe_init_dir(self):
+        """Find the directory where the oe-init-build-env is located
+
+        Assumption: There might be a layer with higher priority than poky
+        which provides the oe-init-build-env in the layer's toplevel folder.
+        """
+        if not self.__oe_init_dir:
+            for layer in reversed(self.bblayers):
+                result = subprocess.run(
+                    ['git', 'rev-parse', '--show-toplevel'], cwd=layer, capture_output=True)
+                if result.returncode == 0:
+                    oe_init_dir = result.stdout.decode('utf-8').strip()
+                    oe_init_path = os.path.join(
+                        oe_init_dir, RecipeModified.OE_INIT_BUILD_ENV)
+                    if os.path.exists(oe_init_path):
+                        logger.debug("Using %s from: %s" % (
+                            RecipeModified.OE_INIT_BUILD_ENV, oe_init_path))
+                        self.__oe_init_dir = oe_init_dir
+                        break
+            if not self.__oe_init_dir:
+                logger.error("Cannot find the bitbake top level folder")
+        return self.__oe_init_dir
+
+
+def ide_setup(args, config, basepath, workspace):
+    """Generate the IDE configuration for the workspace"""
+
+    # Explicitly passing some special recipes does not make sense
+    for recipe in args.recipenames:
+        if recipe in ['meta-ide-support', 'build-sysroots']:
+            raise DevtoolError("Invalid recipe: %s." % recipe)
+
+    # Collect information about tasks which need to be bitbaked
+    bootstrap_tasks = []
+    bootstrap_tasks_late = []
+    tinfoil = setup_tinfoil(config_only=False, basepath=basepath)
+    try:
+        # define mode depending on recipes which need to be processed
+        recipes_image_names = []
+        recipes_modified_names = []
+        recipes_other_names = []
+        for recipe in args.recipenames:
+            try:
+                check_workspace_recipe(
+                    workspace, recipe, bbclassextend=True)
+                recipes_modified_names.append(recipe)
+            except DevtoolError:
+                recipe_d = parse_recipe(
+                    config, tinfoil, recipe, appends=True, filter_workspace=False)
+                if not recipe_d:
+                    raise DevtoolError("Parsing recipe %s failed" % recipe)
+                if bb.data.inherits_class('image', recipe_d):
+                    recipes_image_names.append(recipe)
+                else:
+                    recipes_other_names.append(recipe)
+
+        invalid_params = False
+        if args.mode == DevtoolIdeMode.shared:
+            if len(recipes_modified_names):
+                logger.error("In shared sysroots mode modified recipes %s cannot be handled." % str(
+                    recipes_modified_names))
+                invalid_params = True
+        if args.mode == DevtoolIdeMode.modified:
+            if len(recipes_other_names):
+                logger.error("Only in shared sysroots mode not modified recipes %s can be handled." % str(
+                    recipes_other_names))
+                invalid_params = True
+            if len(recipes_image_names) != 1:
+                logger.error(
+                    "One image recipe is required as the rootfs for the remote development.")
+                invalid_params = True
+            for modified_recipe_name in recipes_modified_names:
+                if modified_recipe_name.startswith('nativesdk-') or modified_recipe_name.endswith('-native'):
+                    logger.error(
+                        "Only cross compiled recipes are support. %s is not cross." % modified_recipe_name)
+                    invalid_params = True
+
+        if invalid_params:
+            raise DevtoolError("Invalid parameters are passed.")
+
+        # For the shared sysroots mode, add all dependencies of all the images to the sysroots
+        # For the modified mode provide one rootfs and the corresponding debug symbols via rootfs-dbg
+        recipes_images = []
+        for recipes_image_name in recipes_image_names:
+            logger.info("Using image: %s" % recipes_image_name)
+            recipe_image = RecipeImage(recipes_image_name)
+            recipe_image.initialize(config, tinfoil)
+            bootstrap_tasks += recipe_image.bootstrap_tasks
+            recipes_images.append(recipe_image)
+
+        # Provide a Direct SDK with shared sysroots
+        recipes_not_modified = []
+        if args.mode == DevtoolIdeMode.shared:
+            ide_support = RecipeMetaIdeSupport()
+            ide_support.initialize(config, tinfoil)
+            bootstrap_tasks += ide_support.bootstrap_tasks
+
+            logger.info("Adding %s to the Direct SDK sysroots." %
+                        str(recipes_other_names))
+            for recipe_name in recipes_other_names:
+                recipe_not_modified = RecipeNotModified(recipe_name)
+                bootstrap_tasks += recipe_not_modified.bootstrap_tasks
+                recipes_not_modified.append(recipe_not_modified)
+
+            build_sysroots = RecipeBuildSysroots()
+            build_sysroots.initialize(config, tinfoil)
+            bootstrap_tasks_late += build_sysroots.bootstrap_tasks
+            shared_env = SharedSysrootsEnv()
+            shared_env.initialize(ide_support, build_sysroots)
+
+        recipes_modified = []
+        if args.mode == DevtoolIdeMode.modified:
+            logger.info("Setting up workspaces for modified recipe: %s" %
+                        str(recipes_modified_names))
+            gdbs_cross = {}
+            for recipe_name in recipes_modified_names:
+                recipe_modified = RecipeModified(recipe_name)
+                recipe_modified.initialize(config, workspace, tinfoil)
+                bootstrap_tasks += recipe_modified.bootstrap_tasks
+                recipes_modified.append(recipe_modified)
+
+                if recipe_modified.target_arch not in gdbs_cross:
+                    target_device = TargetDevice(args)
+                    gdb_cross = RecipeGdbCross(
+                        args, recipe_modified.target_arch, target_device)
+                    gdb_cross.initialize(config, workspace, tinfoil)
+                    bootstrap_tasks += gdb_cross.bootstrap_tasks
+                    gdbs_cross[recipe_modified.target_arch] = gdb_cross
+                recipe_modified.gdb_cross = gdbs_cross[recipe_modified.target_arch]
+
+    finally:
+        tinfoil.shutdown()
+
+    if not args.skip_bitbake:
+        bb_cmd = 'bitbake '
+        if args.bitbake_k:
+            bb_cmd += "-k "
+        bb_cmd_early = bb_cmd + ' '.join(bootstrap_tasks)
+        exec_build_env_command(
+            config.init_path, basepath, bb_cmd_early, watch=True)
+        if bootstrap_tasks_late:
+            bb_cmd_late = bb_cmd + ' '.join(bootstrap_tasks_late)
+            exec_build_env_command(
+                config.init_path, basepath, bb_cmd_late, watch=True)
+
+    for recipe_image in recipes_images:
+        if (recipe_image.gdbserver_missing):
+            logger.warning(
+                "gdbserver not installed in image %s. Remote debugging will not be available" % recipe_image)
+
+        if recipe_image.combine_dbg_image is False:
+            logger.warning(
+                'IMAGE_CLASSES += "image-combined-dbg" is missing for image %s. Remote debugging will not find debug symbols from rootfs-dbg.' % recipe_image)
+
+    # Instantiate the active IDE plugin
+    ide = ide_plugins[args.ide]()
+    if args.mode == DevtoolIdeMode.shared:
+        ide.setup_shared_sysroots(shared_env)
+    elif args.mode == DevtoolIdeMode.modified:
+        for recipe_modified in recipes_modified:
+            if recipe_modified.build_tool is BuildTool.CMAKE:
+                recipe_modified.cmake_preset()
+            if recipe_modified.build_tool is BuildTool.MESON:
+                recipe_modified.gen_meson_wrapper()
+            ide.setup_modified_recipe(
+                args, recipe_image, recipe_modified)
+    else:
+        raise DevtoolError("Must not end up here.")
+
+
+def register_commands(subparsers, context):
+    """Register devtool subcommands from this plugin"""
+
+    global ide_plugins
+
+    # Search for IDE plugins in all sub-folders named ide_plugins where devtool searches for plugins.
+    pluginpaths = [os.path.join(path, 'ide_plugins')
+                   for path in context.pluginpaths]
+    ide_plugin_modules = []
+    for pluginpath in pluginpaths:
+        scriptutils.load_plugins(logger, ide_plugin_modules, pluginpath)
+
+    for ide_plugin_module in ide_plugin_modules:
+        if hasattr(ide_plugin_module, 'register_ide_plugin'):
+            ide_plugin_module.register_ide_plugin(ide_plugins)
+    # Sort plugins according to their priority. The first entry is the default IDE plugin.
+    ide_plugins = dict(sorted(ide_plugins.items(),
+                       key=lambda p: p[1].ide_plugin_priority(), reverse=True))
+
+    parser_ide_sdk = subparsers.add_parser('ide-sdk', group='working', order=50, formatter_class=RawTextHelpFormatter,
+                                           help='Setup the SDK and configure the IDE')
+    parser_ide_sdk.add_argument(
+        'recipenames', nargs='+', help='Generate an IDE configuration suitable to work on the given recipes.\n'
+        'Depending on the --mode parameter, different types of SDKs and IDE configurations are generated.')
+    parser_ide_sdk.add_argument(
+        '-m', '--mode', type=DevtoolIdeMode, default=DevtoolIdeMode.modified,
+        help='Different SDK types are supported:\n'
+        '- "' + DevtoolIdeMode.modified.name + '" (default):\n'
+        '  devtool modify creates a workspace to work on the source code of a recipe.\n'
+        '  devtool ide-sdk builds the SDK and generates the IDE configuration(s) in the workspace directory(ies)\n'
+        '  Usage example:\n'
+        '    devtool modify cmake-example\n'
+        '    devtool ide-sdk cmake-example core-image-minimal\n'
+        '    Start the IDE in the workspace folder\n'
+        '  At least one devtool modified recipe plus one image recipe are required:\n'
+        '  The image recipe is used to generate the target image and the remote debug configuration.\n'
+        '- "' + DevtoolIdeMode.shared.name + '":\n'
+        '  Usage example:\n'
+        '    devtool ide-sdk -m ' + DevtoolIdeMode.shared.name + ' recipe(s)\n'
+        '  This command generates a cross-toolchain as well as the corresponding shared sysroot directories.\n'
+        '  To use this tool-chain the environment-* file found in the deploy..image folder needs to be sourced into a shell.\n'
+        '  In case of VSCode and cmake the tool-chain is also exposed as a cmake-kit')
+    default_ide = list(ide_plugins.keys())[0]
+    parser_ide_sdk.add_argument(
+        '-i', '--ide', choices=ide_plugins.keys(), default=default_ide,
+        help='Setup the configuration for this IDE (default: %s)' % default_ide)
+    parser_ide_sdk.add_argument(
+        '-t', '--target', default='root@192.168.7.2',
+        help='Live target machine running an ssh server: user@hostname.')
+    parser_ide_sdk.add_argument(
+        '-G', '--gdbserver-port-start', default="1234", help='Port where gdbserver is listening.')
+    parser_ide_sdk.add_argument(
+        '-c', '--no-host-check', help='Disable ssh host key checking', action='store_true')
+    parser_ide_sdk.add_argument(
+        '-e', '--ssh-exec', help='Executable to use in place of ssh')
+    parser_ide_sdk.add_argument(
+        '-P', '--port', help='Specify ssh port to use for connection to the target')
+    parser_ide_sdk.add_argument(
+        '-I', '--key', help='Specify ssh private key for connection to the target')
+    parser_ide_sdk.add_argument(
+        '--skip-bitbake', help='Generate IDE configuration but skip calling bitbake to update the SDK.', action='store_true')
+    parser_ide_sdk.add_argument(
+        '-k', '--bitbake-k', help='Pass -k parameter to bitbake', action='store_true')
+    parser_ide_sdk.add_argument(
+        '--no-strip', help='Do not strip executables prior to deploy', dest='strip', action='store_false')
+    parser_ide_sdk.add_argument(
+        '-n', '--dry-run', help='List files to be undeployed only', action='store_true')
+    parser_ide_sdk.add_argument(
+        '-s', '--show-status', help='Show progress/status output', action='store_true')
+    parser_ide_sdk.add_argument(
+        '-p', '--no-preserve', help='Do not preserve existing files', action='store_true')
+    parser_ide_sdk.add_argument(
+        '--no-check-space', help='Do not check for available space before deploying', action='store_true')
+    parser_ide_sdk.add_argument(
+        '--debug-build-config', help='Use debug build flags, for example set CMAKE_BUILD_TYPE=Debug', action='store_true')
+    parser_ide_sdk.set_defaults(func=ide_setup)
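
The ide-sdk plugin above turns the recipe's OECMAKE_ARGS/EXTRA_OECMAKE -D arguments into the cacheVariables of a CMakeUserPresets.json (version 3) preset, which is then built with "cmake --build --preset <recipe_id>". The following is a minimal, standalone sketch of that mapping only; the -D example values are hypothetical and not taken from the patch:

    # Sketch only: mirrors __init_cmake_preset_cache's -D parsing; example inputs are made up.
    def parse_cmake_args(params):
        cache_vars = {}
        for param in params:
            if param.startswith("-D"):
                param = param[2:]
            name_value = param.split('=', 1)         # "FOO=ON"    -> ["FOO", "ON"]
            name_type = name_value[0].split(':', 1)  # "BAR:PATH"  -> ["BAR", "PATH"]
            if len(name_type) == 2:
                cache_vars[name_type[0]] = {"type": name_type[1], "value": name_value[1]}
            else:
                cache_vars[name_value[0]] = name_value[1]
        return cache_vars

    print(parse_cmake_args(["-DFOO=ON", "-DBAR:PATH=/tmp"]))
    # {'FOO': 'ON', 'BAR': {'type': 'PATH', 'value': '/tmp'}}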
diff --git a/poky/scripts/lib/devtool/standard.py b/poky/scripts/lib/devtool/standard.py
index 8fb4b93..7972b4f 100644
--- a/poky/scripts/lib/devtool/standard.py
+++ b/poky/scripts/lib/devtool/standard.py
@@ -460,7 +460,7 @@
     finally:
         tinfoil.shutdown()
 
-def symlink_oelocal_files_srctree(rd,srctree):
+def symlink_oelocal_files_srctree(rd, srctree):
     import oe.patch
     if os.path.abspath(rd.getVar('S')) == os.path.abspath(rd.getVar('WORKDIR')):
         # If recipe extracts to ${WORKDIR}, symlink the files into the srctree
@@ -484,11 +484,7 @@
                     os.symlink('oe-local-files/%s' % fn, destpth)
                 addfiles.append(os.path.join(relpth, fn))
         if addfiles:
-            bb.process.run('git add %s' % ' '.join(addfiles), cwd=srctree)
-            useroptions = []
-            oe.patch.GitApplyTree.gitCommandUserOptions(useroptions, d=rd)
-            bb.process.run('git %s commit -m "Committing local file symlinks\n\n%s"' % (' '.join(useroptions), oe.patch.GitApplyTree.ignore_commit_prefix), cwd=srctree)
-
+            oe.patch.GitApplyTree.commitIgnored("Add local file symlinks", dir=srctree, files=addfiles, d=rd)
 
 def _extract_source(srctree, keep_temp, devbranch, sync, config, basepath, workspace, fixed_setup, d, tinfoil, no_overrides=False):
     """Extract sources of a recipe"""
@@ -657,9 +653,9 @@
 
             if os.path.exists(workshareddir) and (not os.listdir(workshareddir) or kernelVersion != staging_kerVer):
                 shutil.rmtree(workshareddir)
-                oe.path.copyhardlinktree(srcsubdir,workshareddir)
+                oe.path.copyhardlinktree(srcsubdir, workshareddir)
             elif not os.path.exists(workshareddir):
-                oe.path.copyhardlinktree(srcsubdir,workshareddir)
+                oe.path.copyhardlinktree(srcsubdir, workshareddir)
 
         tempdir_localdir = os.path.join(tempdir, 'oe-local-files')
         srctree_localdir = os.path.join(srctree, 'oe-local-files')
@@ -667,13 +663,13 @@
         if sync:
             bb.process.run('git fetch file://' + srcsubdir + ' ' + devbranch + ':' + devbranch, cwd=srctree)
 
-            # Move oe-local-files directory to srctree
-            # As the oe-local-files is not part of the constructed git tree,
-            # remove them directly during the synchrounizating might surprise
-            # the users.  Instead, we move it to oe-local-files.bak and remind
-            # user in the log message.
+            # Move the oe-local-files directory to srctree.
+            # As oe-local-files is not part of the constructed git tree,
+            # removing it directly during the synchronization might surprise
+            # the user.  Instead, we move it to oe-local-files.bak and remind
+            # the user in the log message.
             if os.path.exists(srctree_localdir + '.bak'):
-                shutil.rmtree(srctree_localdir, srctree_localdir + '.bak')
+                shutil.rmtree(srctree_localdir + '.bak')
 
             if os.path.exists(srctree_localdir):
                 logger.info('Backing up current local file directory %s' % srctree_localdir)
@@ -689,7 +685,7 @@
                 shutil.move(tempdir_localdir, srcsubdir)
 
             shutil.move(srcsubdir, srctree)
-            symlink_oelocal_files_srctree(d,srctree)
+            symlink_oelocal_files_srctree(d, srctree)
 
         if is_kernel_yocto:
             logger.info('Copying kernel config to srctree')
@@ -762,7 +758,7 @@
     kerver = []
     staging_kerVer=""
     if os.path.exists(srcdir) and os.listdir(srcdir):
-        with open(os.path.join(srcdir,"Makefile")) as f:
+        with open(os.path.join(srcdir, "Makefile")) as f:
             version = [next(f) for x in range(5)][1:4]
             for word in version:
                 kerver.append(word.split('= ')[1].split('\n')[0])
@@ -843,10 +839,10 @@
             staging_kerVer = get_staging_kver(srcdir)
             staging_kbranch = get_staging_kbranch(srcdir)
             if (os.path.exists(srcdir) and os.listdir(srcdir)) and (kernelVersion in staging_kerVer and staging_kbranch == kbranch):
-                oe.path.copyhardlinktree(srcdir,srctree)
+                oe.path.copyhardlinktree(srcdir, srctree)
                 workdir = rd.getVar('WORKDIR')
                 srcsubdir = rd.getVar('S')
-                localfilesdir = os.path.join(srctree,'oe-local-files')
+                localfilesdir = os.path.join(srctree, 'oe-local-files')
                 # Move local source files into separate subdir
                 recipe_patches = [os.path.basename(patch) for patch in oe.recipeutils.get_recipe_patches(rd)]
                 local_files = oe.recipeutils.get_recipe_local_files(rd)
@@ -870,9 +866,9 @@
                     for fname in local_files:
                         _move_file(os.path.join(workdir, fname), os.path.join(srctree, 'oe-local-files', fname))
                     with open(os.path.join(srctree, 'oe-local-files', '.gitignore'), 'w') as f:
-                        f.write('# Ignore local files, by default. Remove this file ''if you want to commit the directory to Git\n*\n')
+                        f.write('# Ignore local files, by default. Remove this file if you want to commit the directory to Git\n*\n')
 
-                symlink_oelocal_files_srctree(rd,srctree)
+                symlink_oelocal_files_srctree(rd, srctree)
 
                 task = 'do_configure'
                 res = tinfoil.build_targets(pn, task, handle_events=True)
@@ -880,7 +876,7 @@
                 # Copy .config to workspace
                 kconfpath = rd.getVar('B')
                 logger.info('Copying kernel config to workspace')
-                shutil.copy2(os.path.join(kconfpath, '.config'),srctree)
+                shutil.copy2(os.path.join(kconfpath, '.config'), srctree)
 
                 # Set this to true, we still need to get initial_rev
                 # by parsing the git repo
@@ -941,14 +937,13 @@
             seen_patches = []
             for branch in branches:
                 branch_patches[branch] = []
-                (stdout, _) = bb.process.run('git log devtool-base..%s' % branch, cwd=srctree)
-                for line in stdout.splitlines():
-                    line = line.strip()
-                    if line.startswith(oe.patch.GitApplyTree.patch_line_prefix):
-                        origpatch = line[len(oe.patch.GitApplyTree.patch_line_prefix):].split(':', 1)[-1].strip()
-                        if not origpatch in seen_patches:
-                            seen_patches.append(origpatch)
-                            branch_patches[branch].append(origpatch)
+                (stdout, _) = bb.process.run('git rev-list devtool-base..%s' % branch, cwd=srctree)
+                for sha1 in stdout.splitlines():
+                    notes = oe.patch.GitApplyTree.getNotes(srctree, sha1.strip())
+                    origpatch = notes.get(oe.patch.GitApplyTree.original_patch)
+                    if origpatch and origpatch not in seen_patches:
+                        seen_patches.append(origpatch)
+                        branch_patches[branch].append(origpatch)
 
         # Need to grab this here in case the source is within a subdirectory
         srctreebase = srctree
@@ -966,7 +961,7 @@
             # Assume first entry is main source extracted in ${S} so skip it
             src_uri = src_uri[1::]
 
-            #Add "type=git-dependency" to all non local sources
+            # Add "type=git-dependency" to all non local sources
             for url in src_uri:
                 if not url.startswith('file://') and not 'type=' in url:
                     src_uri_remove.append(url)
@@ -974,7 +969,7 @@
 
             if src_uri_remove:
                 f.write('SRC_URI:remove = "%s"\n' % ' '.join(src_uri_remove))
-                f.write('SRC_URI:append = "%s"\n\n' % ' '.join(src_uri_append))
+                f.write('SRC_URI:append = " %s"\n\n' % ' '.join(src_uri_append))
 
             f.write('FILESEXTRAPATHS:prepend := "${THISDIR}/${PN}:"\n')
             # Local files can be modified/tracked in separate subdir under srctree
@@ -1004,7 +999,7 @@
                         '        mv ${S}/.config ${S}/.config.old\n'
                         '    fi\n'
                         '}\n')
-            if rd.getVarFlag('do_menuconfig','task'):
+            if rd.getVarFlag('do_menuconfig', 'task'):
                 f.write('\ndo_configure:append() {\n'
                 '    if [ ${@oe.types.boolean(d.getVar("KCONFIG_CONFIG_ENABLE_MENUCONFIG"))} = True ]; then\n'
                 '        cp ${KCONFIG_CONFIG_ROOTDIR}/.config ${S}/.config.baseline\n'
@@ -2081,7 +2076,7 @@
                         # We don't want to risk wiping out any work in progress
                         if srctreebase.startswith(os.path.join(config.workspace_path, 'sources')):
                             from datetime import datetime
-                            preservesrc = os.path.join(config.workspace_path, 'attic', 'sources', "{}.{}".format(pn,datetime.now().strftime("%Y%m%d%H%M%S")))
+                            preservesrc = os.path.join(config.workspace_path, 'attic', 'sources', "{}.{}".format(pn, datetime.now().strftime("%Y%m%d%H%M%S")))
                             logger.info('Preserving source tree in %s\nIf you no '
                                         'longer need it then please delete it manually.\n'
                                         'It is also possible to reuse it via devtool source tree argument.'
diff --git a/poky/scripts/lib/devtool/upgrade.py b/poky/scripts/lib/devtool/upgrade.py
index ef58523..fa5b8ef 100644
--- a/poky/scripts/lib/devtool/upgrade.py
+++ b/poky/scripts/lib/devtool/upgrade.py
@@ -261,7 +261,7 @@
     revs = {}
     for path in paths:
         (stdout, _) = _run('git rev-parse HEAD', cwd=path)
-        revs[os.path.relpath(path,srctree)] = stdout.rstrip()
+        revs[os.path.relpath(path, srctree)] = stdout.rstrip()
 
     if no_patch:
         patches = oe.recipeutils.get_recipe_patches(crd)
@@ -272,17 +272,35 @@
             _run('git checkout devtool-patched -b %s' % branch, cwd=path)
             (stdout, _) = _run('git branch --list devtool-override-*', cwd=path)
             branches_to_rebase = [branch] + stdout.split()
+            target_branch = revs[os.path.relpath(path, srctree)]
+
+            # There is a bug (or feature?) in git rebase where if a commit with
+            # a note is fully rebased away by being part of an old commit, the
+            # note is still attached to the old commit. Avoid this by making
+            # sure all old devtool related commits have a note attached to them
+            # (this assumes git config notes.rewriteMode is set to ignore).
+            (stdout, _) = _run('git rev-list devtool-base..%s' % target_branch, cwd=path)
+            for rev in stdout.splitlines():
+                if not oe.patch.GitApplyTree.getNotes(path, rev):
+                    oe.patch.GitApplyTree.addNote(path, rev, "dummy")
+
             for b in branches_to_rebase:
-                logger.info("Rebasing {} onto {}".format(b, revs[os.path.relpath(path,srctree)]))
+                logger.info("Rebasing {} onto {}".format(b, target_branch))
                 _run('git checkout %s' % b, cwd=path)
                 try:
-                    _run('git rebase %s' % revs[os.path.relpath(path, srctree)], cwd=path)
+                    _run('git rebase %s' % target_branch, cwd=path)
                 except bb.process.ExecutionError as e:
                     if 'conflict' in e.stdout:
                         logger.warning('Command \'%s\' failed:\n%s\n\nYou will need to resolve conflicts in order to complete the upgrade.' % (e.command, e.stdout.rstrip()))
                         _run('git rebase --abort', cwd=path)
                     else:
                         logger.warning('Command \'%s\' failed:\n%s' % (e.command, e.stdout))
+
+            # Remove any dummy notes added above.
+            (stdout, _) = _run('git rev-list devtool-base..%s' % target_branch, cwd=path)
+            for rev in stdout.splitlines():
+                oe.patch.GitApplyTree.removeNote(path, rev, "dummy")
+
             _run('git checkout %s' % branch, cwd=path)
 
     if tmpsrctree:
diff --git a/poky/scripts/lib/recipetool/create.py b/poky/scripts/lib/recipetool/create.py
index d2997cc..8e9ff38 100644
--- a/poky/scripts/lib/recipetool/create.py
+++ b/poky/scripts/lib/recipetool/create.py
@@ -1166,12 +1166,12 @@
     # Note: these are carefully constructed!
     license_title_re = re.compile(r'^#*\(? *(This is )?([Tt]he )?.{0,15} ?[Ll]icen[sc]e( \(.{1,10}\))?\)?[:\.]? ?#*$')
     license_statement_re = re.compile(r'^((This (project|software)|.{1,10}) is( free software)? (released|licen[sc]ed)|(Released|Licen[cs]ed)) under the .{1,10} [Ll]icen[sc]e:?$')
-    copyright_re = re.compile('^ *[#\*]* *(Modified work |MIT LICENSED )?Copyright ?(\([cC]\))? .*$')
-    disclaimer_re = re.compile('^ *\*? ?All [Rr]ights [Rr]eserved\.$')
-    email_re = re.compile('^.*<[\w\.-]*@[\w\.\-]*>$')
-    header_re = re.compile('^(\/\**!?)? ?[\-=\*]* ?(\*\/)?$')
-    tag_re = re.compile('^ *@?\(?([Ll]icense|MIT)\)?$')
-    url_re = re.compile('^ *[#\*]* *https?:\/\/[\w\.\/\-]+$')
+    copyright_re = re.compile(r'^ *[#\*]* *(Modified work |MIT LICENSED )?Copyright ?(\([cC]\))? .*$')
+    disclaimer_re = re.compile(r'^ *\*? ?All [Rr]ights [Rr]eserved\.$')
+    email_re = re.compile(r'^.*<[\w\.-]*@[\w\.\-]*>$')
+    header_re = re.compile(r'^(\/\**!?)? ?[\-=\*]* ?(\*\/)?$')
+    tag_re = re.compile(r'^ *@?\(?([Ll]icense|MIT)\)?$')
+    url_re = re.compile(r'^ *[#\*]* *https?:\/\/[\w\.\/\-]+$')
 
     lictext = []
     with open(licfile, 'r', errors='surrogateescape') as f:
diff --git a/poky/scripts/lib/recipetool/create_buildsys.py b/poky/scripts/lib/recipetool/create_buildsys.py
index 5015634..ec9d510 100644
--- a/poky/scripts/lib/recipetool/create_buildsys.py
+++ b/poky/scripts/lib/recipetool/create_buildsys.py
@@ -5,9 +5,9 @@
 # SPDX-License-Identifier: GPL-2.0-only
 #
 
+import os
 import re
 import logging
-import glob
 from recipetool.create import RecipeHandler, validate_pv
 
 logger = logging.getLogger('recipetool')
@@ -137,15 +137,15 @@
         deps = []
         unmappedpkgs = []
 
-        proj_re = re.compile('project\s*\(([^)]*)\)', re.IGNORECASE)
-        pkgcm_re = re.compile('pkg_check_modules\s*\(\s*[a-zA-Z0-9-_]+\s*(REQUIRED)?\s+([^)\s]+)\s*\)', re.IGNORECASE)
-        pkgsm_re = re.compile('pkg_search_module\s*\(\s*[a-zA-Z0-9-_]+\s*(REQUIRED)?((\s+[^)\s]+)+)\s*\)', re.IGNORECASE)
-        findpackage_re = re.compile('find_package\s*\(\s*([a-zA-Z0-9-_]+)\s*.*', re.IGNORECASE)
-        findlibrary_re = re.compile('find_library\s*\(\s*[a-zA-Z0-9-_]+\s*(NAMES\s+)?([a-zA-Z0-9-_ ]+)\s*.*')
-        checklib_re = re.compile('check_library_exists\s*\(\s*([^\s)]+)\s*.*', re.IGNORECASE)
-        include_re = re.compile('include\s*\(\s*([^)\s]*)\s*\)', re.IGNORECASE)
-        subdir_re = re.compile('add_subdirectory\s*\(\s*([^)\s]*)\s*([^)\s]*)\s*\)', re.IGNORECASE)
-        dep_re = re.compile('([^ ><=]+)( *[<>=]+ *[^ ><=]+)?')
+        proj_re = re.compile(r'project\s*\(([^)]*)\)', re.IGNORECASE)
+        pkgcm_re = re.compile(r'pkg_check_modules\s*\(\s*[a-zA-Z0-9-_]+\s*(REQUIRED)?\s+([^)\s]+)\s*\)', re.IGNORECASE)
+        pkgsm_re = re.compile(r'pkg_search_module\s*\(\s*[a-zA-Z0-9-_]+\s*(REQUIRED)?((\s+[^)\s]+)+)\s*\)', re.IGNORECASE)
+        findpackage_re = re.compile(r'find_package\s*\(\s*([a-zA-Z0-9-_]+)\s*.*', re.IGNORECASE)
+        findlibrary_re = re.compile(r'find_library\s*\(\s*[a-zA-Z0-9-_]+\s*(NAMES\s+)?([a-zA-Z0-9-_ ]+)\s*.*')
+        checklib_re = re.compile(r'check_library_exists\s*\(\s*([^\s)]+)\s*.*', re.IGNORECASE)
+        include_re = re.compile(r'include\s*\(\s*([^)\s]*)\s*\)', re.IGNORECASE)
+        subdir_re = re.compile(r'add_subdirectory\s*\(\s*([^)\s]*)\s*([^)\s]*)\s*\)', re.IGNORECASE)
+        dep_re = re.compile(r'([^ ><=]+)( *[<>=]+ *[^ ><=]+)?')
 
         def find_cmake_package(pkg):
             RecipeHandler.load_devel_filemap(tinfoil.config_data)
@@ -423,16 +423,16 @@
                 'makeinfo': 'texinfo',
                 }
 
-        pkg_re = re.compile('PKG_CHECK_MODULES\(\s*\[?[a-zA-Z0-9_]*\]?,\s*\[?([^,\]]*)\]?[),].*')
-        pkgce_re = re.compile('PKG_CHECK_EXISTS\(\s*\[?([^,\]]*)\]?[),].*')
-        lib_re = re.compile('AC_CHECK_LIB\(\s*\[?([^,\]]*)\]?,.*')
-        libx_re = re.compile('AX_CHECK_LIBRARY\(\s*\[?[^,\]]*\]?,\s*\[?([^,\]]*)\]?,\s*\[?([a-zA-Z0-9-]*)\]?,.*')
-        progs_re = re.compile('_PROGS?\(\s*\[?[a-zA-Z0-9_]*\]?,\s*\[?([^,\]]*)\]?[),].*')
-        dep_re = re.compile('([^ ><=]+)( [<>=]+ [^ ><=]+)?')
-        ac_init_re = re.compile('AC_INIT\(\s*([^,]+),\s*([^,]+)[,)].*')
-        am_init_re = re.compile('AM_INIT_AUTOMAKE\(\s*([^,]+),\s*([^,]+)[,)].*')
-        define_re = re.compile('\s*(m4_)?define\(\s*([^,]+),\s*([^,]+)\)')
-        version_re = re.compile('([0-9.]+)')
+        pkg_re = re.compile(r'PKG_CHECK_MODULES\(\s*\[?[a-zA-Z0-9_]*\]?,\s*\[?([^,\]]*)\]?[),].*')
+        pkgce_re = re.compile(r'PKG_CHECK_EXISTS\(\s*\[?([^,\]]*)\]?[),].*')
+        lib_re = re.compile(r'AC_CHECK_LIB\(\s*\[?([^,\]]*)\]?,.*')
+        libx_re = re.compile(r'AX_CHECK_LIBRARY\(\s*\[?[^,\]]*\]?,\s*\[?([^,\]]*)\]?,\s*\[?([a-zA-Z0-9-]*)\]?,.*')
+        progs_re = re.compile(r'_PROGS?\(\s*\[?[a-zA-Z0-9_]*\]?,\s*\[?([^,\]]*)\]?[),].*')
+        dep_re = re.compile(r'([^ ><=]+)( [<>=]+ [^ ><=]+)?')
+        ac_init_re = re.compile(r'AC_INIT\(\s*([^,]+),\s*([^,]+)[,)].*')
+        am_init_re = re.compile(r'AM_INIT_AUTOMAKE\(\s*([^,]+),\s*([^,]+)[,)].*')
+        define_re = re.compile(r'\s*(m4_)?define\(\s*([^,]+),\s*([^,]+)\)')
+        version_re = re.compile(r'([0-9.]+)')
 
         defines = {}
         def subst_defines(value):
diff --git a/poky/scripts/lib/recipetool/create_buildsys_python.py b/poky/scripts/lib/recipetool/create_buildsys_python.py
index 60c5903..a807daf 100644
--- a/poky/scripts/lib/recipetool/create_buildsys_python.py
+++ b/poky/scripts/lib/recipetool/create_buildsys_python.py
@@ -573,12 +573,15 @@
         if 'buildsystem' in handled:
             return False
 
+        logger.debug("Trying setup.py parser")
+
         # Check for non-zero size setup.py files
         setupfiles = RecipeHandler.checkfiles(srctree, ['setup.py'])
         for fn in setupfiles:
             if os.path.getsize(fn):
                 break
         else:
+            logger.debug("No setup.py found")
             return False
 
         # setup.py is always parsed to get at certain required information, such as
@@ -736,6 +739,7 @@
         "flit_core.buildapi": "python_flit_core",
         "hatchling.build": "python_hatchling",
         "maturin": "python_maturin",
+        "mesonpy": "python_mesonpy",
     }
 
     # setuptools.build_meta and flit declare project metadata into the "project" section of pyproject.toml
@@ -776,6 +780,8 @@
         "python3-poetry-core-native",
         # already provided by python_flit_core.bbclass
         "python3-flit-core-native",
+        # already provided by python_mesonpy
+        "python3-meson-python-native",
     ]
 
     # add here a list of known and often used packages and the corresponding bitbake package
@@ -787,6 +793,7 @@
         "setuptools-scm": "python3-setuptools-scm",
         "hatchling": "python3-hatchling",
         "hatch-vcs": "python3-hatch-vcs",
+        "meson-python" : "python3-meson-python",
     }
 
     def __init__(self):
@@ -799,12 +806,15 @@
         if 'buildsystem' in handled:
             return False
 
+        logger.debug("Trying pyproject.toml parser")
+
         # Check for non-zero size setup.py files
         setupfiles = RecipeHandler.checkfiles(srctree, ["pyproject.toml"])
         for fn in setupfiles:
             if os.path.getsize(fn):
                 break
         else:
+            logger.debug("No pyproject.toml found")
             return False
 
         setupscript = os.path.join(srctree, "pyproject.toml")
@@ -816,14 +826,16 @@
                 try:
                     import tomli as tomllib
                 except ImportError:
-                    logger.exception("Neither 'tomllib' nor 'tomli' could be imported. Please use python3.11 or above or install tomli module")
-                    return False
-                except Exception:
-                    logger.exception("Failed to parse pyproject.toml")
+                    logger.error("Neither 'tomllib' nor 'tomli' could be imported, cannot scan pyproject.toml.")
                     return False
 
-            with open(setupscript, "rb") as f:
-                config = tomllib.load(f)
+            try:
+                with open(setupscript, "rb") as f:
+                    config = tomllib.load(f)
+            except Exception:
+                logger.exception("Failed to parse pyproject.toml")
+                return False
+
             build_backend = config["build-system"]["build-backend"]
             if build_backend in self.build_backend_map:
                 classes.append(self.build_backend_map[build_backend])
diff --git a/poky/scripts/lib/wic/plugins/imager/direct.py b/poky/scripts/lib/wic/plugins/imager/direct.py
index 9b619e4..a1d1526 100644
--- a/poky/scripts/lib/wic/plugins/imager/direct.py
+++ b/poky/scripts/lib/wic/plugins/imager/direct.py
@@ -530,6 +530,16 @@
         exec_native_cmd("parted -s %s mklabel %s" % (device, ptable_format),
                         self.native_sysroot)
 
+    def _write_disk_guid(self):
+        if self.ptable_format in ('gpt', 'gpt-hybrid'):
+            if os.getenv('SOURCE_DATE_EPOCH'):
+                self.disk_guid = uuid.UUID(int=int(os.getenv('SOURCE_DATE_EPOCH')))
+            else:
+                self.disk_guid = uuid.uuid4()
+
+            logger.debug("Set disk guid %s", self.disk_guid)
+            sfdisk_cmd = "sfdisk --disk-id %s %s" % (self.path, self.disk_guid)
+            exec_native_cmd(sfdisk_cmd, self.native_sysroot)
 
     def create(self):
         self._make_disk(self.path,
@@ -537,6 +547,7 @@
                         self.min_size)
 
         self._write_identifier(self.path, self.identifier)
+        self._write_disk_guid()
 
         if self.ptable_format == "gpt-hybrid":
             mbr_path = self.path + ".mbr"
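
The _write_disk_guid() hook added above makes the GPT disk GUID reproducible: when SOURCE_DATE_EPOCH is set the GUID is derived from it, otherwise a random UUID is generated. A small sketch of the deterministic path, where the epoch value is a made-up example:

    # Sketch only: the same epoch always yields the same disk GUID, giving a reproducible image.
    import uuid

    source_date_epoch = 1700000000            # hypothetical SOURCE_DATE_EPOCH value
    guid_a = uuid.UUID(int=source_date_epoch)
    guid_b = uuid.UUID(int=source_date_epoch)
    assert guid_a == guid_b
    print(guid_a)                             # 00000000-0000-0000-0000-00006553f100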
diff --git a/poky/scripts/oe-check-sstate b/poky/scripts/oe-check-sstate
index 4187e77..0d171c4 100755
--- a/poky/scripts/oe-check-sstate
+++ b/poky/scripts/oe-check-sstate
@@ -53,7 +53,7 @@
             cmd = ['bitbake', '--dry-run', '--runall=build'] + args.target
             output = subprocess.check_output(cmd, stderr=subprocess.STDOUT, env=env)
 
-            task_re = re.compile('NOTE: Running setscene task [0-9]+ of [0-9]+ \(([^)]+)\)')
+            task_re = re.compile(r'NOTE: Running setscene task [0-9]+ of [0-9]+ \(([^)]+)\)')
             tasks = []
             for line in output.decode('utf-8').splitlines():
                 res = task_re.match(line)
diff --git a/poky/scripts/oe-pkgdata-util b/poky/scripts/oe-pkgdata-util
index 7412cc1..44ae405 100755
--- a/poky/scripts/oe-pkgdata-util
+++ b/poky/scripts/oe-pkgdata-util
@@ -296,7 +296,7 @@
             extra = ''
             for line in f:
                 for var in vars:
-                    m = re.match(var + '(?::\S+)?:\s*(.+?)\s*$', line)
+                    m = re.match(var + r'(?::\S+)?:\s*(.+?)\s*$', line)
                     if m:
                         vals[var] = m.group(1)
             pkg_version = vals['PKGV'] or ''
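
The recurring re.compile('...') -> re.compile(r'...') changes in this series address Python 3.12, which reports invalid escape sequences in plain string literals (such as \s) as a SyntaxWarning rather than the earlier DeprecationWarning, with a SyntaxError expected in a future release. A short illustration, assuming nothing beyond the standard re module:

    # Sketch only: a raw string passes the backslash through to re unchanged.
    import re

    pattern_plain = '\\s+'   # explicit escaping works but is noisy
    pattern_raw = r'\s+'     # raw string, as used in the fixed regexes above
    assert re.findall(pattern_plain, 'a b') == re.findall(pattern_raw, 'a b') == [' ']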
diff --git a/poky/scripts/oe-setup-build b/poky/scripts/oe-setup-build
new file mode 100755
index 0000000..5364f2b
--- /dev/null
+++ b/poky/scripts/oe-setup-build
@@ -0,0 +1,122 @@
+#!/usr/bin/env python3
+#
+# Copyright OpenEmbedded Contributors
+#
+# SPDX-License-Identifier: MIT
+#
+
+import argparse
+import json
+import os
+import subprocess
+
+def defaultlayers():
+    return os.path.abspath(os.path.join(os.path.dirname(__file__), '.oe-layers.json'))
+
+def makebuildpath(topdir, template):
+    return os.path.join(topdir, "build-{}".format(template))
+
+def discover_templates(layers_file):
+    if not os.path.exists(layers_file):
+        print("List of layers {} does not exist; were the layers set up using the setup-layers script?".format(layers_file))
+        return None
+
+    templates = []
+    layers_list = json.load(open(layers_file))["layers"]
+    for layer in layers_list:
+        template_dir = os.path.join(os.path.dirname(layers_file), layer, 'conf','templates')
+        if os.path.exists(template_dir):
+            for d in sorted(os.listdir(template_dir)):
+                templatepath = os.path.join(template_dir,d)
+                if not os.path.isfile(os.path.join(templatepath,'local.conf.sample')):
+                    continue
+                layer_base = os.path.basename(layer)
+                templatename = "{}-{}".format(layer_base[5:] if layer_base.startswith("meta-") else layer_base, d)
+                buildpath = makebuildpath(os.getcwd(), templatename)
+                notespath = os.path.join(template_dir, d, 'conf-notes.txt')
+                try: notes = open(notespath).read()
+                except: notes = None
+                try: summary = open(os.path.join(template_dir, d, 'conf-summary.txt')).read()
+                except: summary = None
+                templates.append({"templatename":templatename,"templatepath":templatepath,"buildpath":buildpath,"notespath":notespath,"notes":notes,"summary":summary})
+
+    return templates
+
+def print_templates(templates, verbose):
+    print("Available build configurations:\n")
+
+    for i in range(len(templates)):
+        t = templates[i]
+        print("{}. {}".format(i+1, t["templatename"]))
+        print("{}".format(t["summary"].strip() if t["summary"] else "This configuration does not have a summary."))
+        if verbose:
+            print("Configuration template path:", t["templatepath"])
+            print("Build path:", t["buildpath"])
+            print("Usage notes:", t["notespath"] if t["notes"] else "This configuration does not have usage notes.")
+        print("")
+    if not verbose:
+        print("Re-run with 'list -v' to see additional information.")
+
+def list_templates(args):
+    templates = discover_templates(args.layerlist)
+    if not templates:
+        return
+
+    verbose = args.v
+    print_templates(templates, verbose)
+
+def find_template(template_name, templates):
+    print_templates(templates, False)
+    if not template_name:
+        n_s = input("Please choose a configuration by its number: ")
+        try: return templates[int(n_s) - 1]
+        except:
+            print("Invalid selection, please try again.")
+            return None
+    else:
+        for t in templates:
+            if t["templatename"] == template_name:
+                return t
+        print("Configuration {} is not one of {}, please try again.".format(tempalte_name, [t["templatename"] for t in templates]))
+        return None
+
+def setup_build_env(args):
+    templates = discover_templates(args.layerlist)
+    if not templates:
+        return
+
+    template = find_template(args.c, templates)
+    if not template:
+        return
+    builddir = args.b if args.b else template["buildpath"]
+    no_shell = args.no_shell
+    coredir = os.path.abspath(os.path.join(os.path.dirname(os.path.realpath(__file__)), '..'))
+    cmd = "TEMPLATECONF={} . {} {}".format(template["templatepath"], os.path.join(coredir, 'oe-init-build-env'), builddir)
+    if not no_shell:
+        cmd = cmd + " && {}".format(os.environ['SHELL'])
+    print("Running:", cmd)
+    subprocess.run(cmd, shell=True, executable=os.environ['SHELL'])
+
+parser = argparse.ArgumentParser(description="A script that discovers available build configurations and sets up a build environment based on one of them. Run without arguments to choose one interactively.")
+parser.add_argument("--layerlist", default=defaultlayers(), help='Where to look for available layers (as written out by setup-layers script) (default is {}).'.format(defaultlayers()))
+
+subparsers = parser.add_subparsers()
+parser_list_templates = subparsers.add_parser('list', help='List available configurations')
+parser_list_templates.add_argument('-v', action='store_true',
+        help='Print detailed information and usage notes for each available build configuration.')
+parser_list_templates.set_defaults(func=list_templates)
+
+parser_setup_env = subparsers.add_parser('setup', help='Set up a build environment and open a shell session with it, ready to run builds.')
+parser_setup_env.add_argument('-c', metavar='configuration_name', help="Use a build configuration configuration_name to set up a build environment (run this script with 'list' to see what is available)")
+parser_setup_env.add_argument('-b', metavar='build_path', help="Set up a build directory in build_path (run this script with 'list -v' to see where it would be by default)")
+parser_setup_env.add_argument('--no-shell', action='store_true',
+        help='Create a build directory but do not start a shell session with the build environment from it.')
+parser_setup_env.set_defaults(func=setup_build_env)
+
+args = parser.parse_args()
+
+if 'func' in args:
+    args.func(args)
+else:
+    from argparse import Namespace
+    setup_build_env(Namespace(layerlist=args.layerlist, c=None, b=None, no_shell=False))
diff --git a/poky/scripts/oe-setup-builddir b/poky/scripts/oe-setup-builddir
index 678aeac..dcb384c 100755
--- a/poky/scripts/oe-setup-builddir
+++ b/poky/scripts/oe-setup-builddir
@@ -57,6 +57,7 @@
     fi
     OECORELAYERCONF="$TEMPLATECONF/bblayers.conf.sample"
     OECORELOCALCONF="$TEMPLATECONF/local.conf.sample"
+    OECORESUMMARYCONF="$TEMPLATECONF/conf-summary.txt"
     OECORENOTESCONF="$TEMPLATECONF/conf-notes.txt"
 fi
 
@@ -98,6 +99,13 @@
     SHOWYPDOC=yes
 fi
 
+if [ -z "$OECORESUMMARYCONF" ]; then
+    OECORESUMMARYCONF="$OEROOT/meta/conf/templates/default/conf-summary.txt"
+fi
+if [ ! -r "$BUILDDIR/conf/conf-summary.txt" ]; then
+    [ ! -r "$OECORESUMMARYCONF" ] || cp "$OECORESUMMARYCONF" "$BUILDDIR/conf/conf-summary.txt"
+fi
+
 if [ -z "$OECORENOTESCONF" ]; then
     OECORENOTESCONF="$OEROOT/meta/conf/templates/default/conf-notes.txt"
 fi
@@ -108,6 +116,7 @@
 # Prevent disturbing a new GIT clone in same console
 unset OECORELOCALCONF
 unset OECORELAYERCONF
+unset OECORESUMMARYCONF
 unset OECORENOTESCONF
 
 # Ending the first-time run message. Show the YP Documentation banner.
@@ -124,6 +133,7 @@
 #    unset SHOWYPDOC
 fi
 
+[ ! -r "$BUILDDIR/conf/conf-summary.txt" ] || cat "$BUILDDIR/conf/conf-summary.txt"
 [ ! -r "$BUILDDIR/conf/conf-notes.txt" ] || cat "$BUILDDIR/conf/conf-notes.txt"
 
 if [ ! -f "$BUILDDIR/conf/templateconf.cfg" ]; then
diff --git a/poky/scripts/oe-setup-layers b/poky/scripts/oe-setup-layers
index 6d49688..6fbfefd 100755
--- a/poky/scripts/oe-setup-layers
+++ b/poky/scripts/oe-setup-layers
@@ -49,11 +49,25 @@
 def _contains_submodules(repodir):
     return os.path.exists(os.path.join(repodir,".gitmodules"))
 
+def _write_layer_list(dest, repodirs):
+    layers = []
+    for r in repodirs:
+        for root, dirs, files in os.walk(r):
+            if os.path.basename(root) == 'conf' and 'layer.conf' in files:
+                layers.append(os.path.relpath(os.path.dirname(root), dest))
+    layers_f = os.path.join(dest, ".oe-layers.json")
+    print("Writing list of layers into {}".format(layers_f))
+    with open(layers_f, 'w') as f:
+        json.dump({"version":"1.0","layers":layers}, f, sort_keys=True, indent=4)
+
 def _do_checkout(args, json):
     repos = json['sources']
+    repodirs = []
+    oesetupbuild = None
     for r_name in repos:
         r_data = repos[r_name]
         repodir = os.path.abspath(os.path.join(args['destdir'], r_data['path']))
+        repodirs.append(repodir)
 
         if 'contains_this_file' in r_data.keys():
             force_arg = 'force_bootstraplayer_checkout'
@@ -95,6 +109,17 @@
 
             if _contains_submodules(repodir):
                 print("Repo {} contains submodules, use 'git submodule update' to ensure they are up to date".format(repodir))
+        if os.path.exists(os.path.join(repodir, 'scripts/oe-setup-build')):
+            oesetupbuild = os.path.join(repodir, 'scripts/oe-setup-build')
+
+    _write_layer_list(args['destdir'], repodirs)
+
+    if oesetupbuild:
+        oesetupbuild_symlink = os.path.join(args['destdir'], 'setup-build')
+        if os.path.exists(oesetupbuild_symlink):
+            os.remove(oesetupbuild_symlink)
+        os.symlink(os.path.relpath(oesetupbuild,args['destdir']),oesetupbuild_symlink)
+        print("\nRun '{}' to list available build configuration templates and set up a build from one of them.".format(oesetupbuild_symlink))
 
 parser = argparse.ArgumentParser(description="A self contained python script that fetches all the needed layers and sets them to correct revisions using data in a json format from a separate file. The json data can be created from an active build directory with 'bitbake-layers create-layers-setup destdir' and there's a sample file and a schema in meta/files/")
 
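The .oe-layers.json file written above records every discovered layer directory relative to the top of the checkout. A minimal sketch of a consumer reading it back (hypothetical helper, not part of this patch):

    import json
    import os

    def read_layer_list(checkout_dir):
        # .oe-layers.json has the form {"version": "1.0", "layers": ["poky/meta", ...]}
        with open(os.path.join(checkout_dir, ".oe-layers.json")) as f:
            data = json.load(f)
        # turn the relative entries back into absolute layer directories
        return [os.path.join(checkout_dir, p) for p in data["layers"]]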
diff --git a/poky/scripts/oe-setup-vscode b/poky/scripts/oe-setup-vscode
new file mode 100755
index 0000000..b864278
--- /dev/null
+++ b/poky/scripts/oe-setup-vscode
@@ -0,0 +1,93 @@
+#!/bin/sh
+
+usage() {
+    echo "$0 <OEINIT> <BUILDDIR>"
+    echo "  OEINIT:   path to directory where the .vscode folder is"
+    echo "  BUILDDIR: directory passed to the oe-init-setup-env script"
+}
+
+if [ "$#" -ne 2 ]; then
+    usage
+    exit 1
+fi
+
+OEINIT=$(readlink -f "$1")
+BUILDDIR=$(readlink -f "$2")
+VSCODEDIR=$OEINIT/.vscode
+
+if [ ! -d "$OEINIT" ] || [ ! -d "$BUILDDIR" ]; then
+    echo "$OEINIT and/or $BUILDDIR directories are not present."
+    exit 1
+fi
+
+VSCODE_SETTINGS=$VSCODEDIR/settings.json
+ws_builddir="$(echo "$BUILDDIR" | sed -e "s|$OEINIT|\${workspaceFolder}|g")"
+
+# If BUILDDIR is in scope of VSCode ensure VSCode does not try to index the build folder.
+# This would lead to a busy CPU and finally to an OOM exception.
+mkdir -p "$VSCODEDIR"
+cat <<EOMsettings > "$VSCODE_SETTINGS"
+{
+    "bitbake.pathToBitbakeFolder": "\${workspaceFolder}/bitbake",
+    "bitbake.pathToEnvScript": "\${workspaceFolder}/oe-init-build-env",
+    "bitbake.pathToBuildFolder": "$ws_builddir",
+    "bitbake.commandWrapper": "",
+    "bitbake.workingDirectory": "\${workspaceFolder}",
+    "files.exclude": {
+        "**/.git/**": true,
+        "**/_build/**": true,
+        "**/buildhistory/**": true,
+        "**/cache/**": true,
+        "**/downloads/**": true,
+        "**/node_modules/**": true,
+        "**/oe-logs/**": true,
+        "**/oe-workdir/**": true,
+        "**/sstate-cache/**": true,
+        "**/tmp*/**": true,
+        "**/workspace/attic/**": true,
+        "**/workspace/sources/**": true
+    },
+    "files.watcherExclude": {
+        "**/.git/**": true,
+        "**/_build/**": true,
+        "**/buildhistory/**": true,
+        "**/cache/**": true,
+        "**/downloads/**": true,
+        "**/node_modules/**": true,
+        "**/oe-logs/**": true,
+        "**/oe-workdir/**": true,
+        "**/sstate-cache/**": true,
+        "**/tmp*/**": true,
+        "**/workspace/attic/**": true,
+        "**/workspace/sources/**": true
+    },
+    "python.analysis.exclude": [
+        "**/_build/**",
+        "**/.git/**",
+        "**/buildhistory/**",
+        "**/cache/**",
+        "**/downloads/**",
+        "**/node_modules/**",
+        "**/oe-logs/**",
+        "**/oe-workdir/**",
+        "**/sstate-cache/**",
+        "**/tmp*/**",
+        "**/workspace/attic/**",
+        "**/workspace/sources/**"
+    ]
+}
+EOMsettings
+
+
+# Ask the user if the yocto-bitbake extension should be installed
+VSCODE_EXTENSIONS=$VSCODEDIR/extensions.json
+cat <<EOMextensions > "$VSCODE_EXTENSIONS"
+{
+    "recommendations": [
+        "yocto-project.yocto-bitbake"
+    ]
+}
+EOMextensions
+
+echo "You had no $VSCODEDIR configuration."
+echo "These configuration files have therefore been created for you."
diff --git a/poky/scripts/opkg-query-helper.py b/poky/scripts/opkg-query-helper.py
index bc3ab43..084d9ef 100755
--- a/poky/scripts/opkg-query-helper.py
+++ b/poky/scripts/opkg-query-helper.py
@@ -29,7 +29,7 @@
         args.append(arg)
 
 # Regex for removing version specs after dependency items
-verregex = re.compile(' \([=<>]* [^ )]*\)')
+verregex = re.compile(r' \([=<>]* [^ )]*\)')
 
 pkg = ""
 ver = ""
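The raw-string conversions in this and the following scripts target Python 3.12, which reports invalid escape sequences such as '\(' in ordinary string literals as a SyntaxWarning (in earlier releases this was only a DeprecationWarning, hidden by default). A small illustration:

    import re

    # Plain string literal: '\(' is an invalid escape sequence, so Python 3.12
    # emits a SyntaxWarning when compiling the file:
    #   verregex = re.compile(' \([=<>]* [^ )]*\)')

    # Raw string literal: the backslashes reach the regex engine unchanged and
    # no warning is emitted; the compiled pattern is identical.
    verregex = re.compile(r' \([=<>]* [^ )]*\)')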
diff --git a/poky/scripts/patchtest b/poky/scripts/patchtest
index a1c824f..3163420 100755
--- a/poky/scripts/patchtest
+++ b/poky/scripts/patchtest
@@ -142,6 +142,8 @@
         logger.error(traceback.print_exc())
         logger.error('patchtest: something went wrong')
         return 1
+    if result.test_failure or result.test_error:
+        return 1
 
     return 0
 
@@ -158,9 +160,14 @@
     postmerge_resultklass = getResult(patch, True, logfile)
     postmerge_result = _runner(postmerge_resultklass, 'test')
 
+    print('----------------------------------------------------------------------\n')
     if premerge_result == 2 and postmerge_result == 2:
-        logger.error('patchtest: any test cases found - did you specify the correct suite directory?')
-
+        logger.error('patchtest: No test cases found - did you specify the correct suite directory?')
+    if premerge_result == 1 or postmerge_result == 1:
+        logger.error('WARNING: patchtest: At least one patchtest caused a failure or an error - please check')
+    else:
+        logger.error('OK: patchtest: All patchtests passed')
+    print('----------------------------------------------------------------------\n')
     return premerge_result or postmerge_result
 
 def main():
diff --git a/poky/scripts/patchtest-get-branch b/poky/scripts/patchtest-get-branch
index b5fb2b0..c6e242f 100755
--- a/poky/scripts/patchtest-get-branch
+++ b/poky/scripts/patchtest-get-branch
@@ -16,7 +16,7 @@
 import re
 import git
 
-re_prefix = re.compile("(\[.*\])", re.DOTALL)
+re_prefix = re.compile(r"(\[.*\])", re.DOTALL)
 
 def get_branch(filepath_repo, filepath_mbox, default_branch):
     branch = None
diff --git a/poky/scripts/patchtest-send-results b/poky/scripts/patchtest-send-results
index 71b73f0..8a3dadb 100755
--- a/poky/scripts/patchtest-send-results
+++ b/poky/scripts/patchtest-send-results
@@ -33,8 +33,12 @@
 failures, see: https://wiki.yoctoproject.org/wiki/Patchtest. Thank
 you!"""
 
+def has_a_failed_test(raw_results):
+    return any(raw_result.split(':')[0] == "FAIL" for raw_result in raw_results.splitlines())
+
 parser = argparse.ArgumentParser(description="Send patchtest results to a submitter for a given patch")
 parser.add_argument("-p", "--patch", dest="patch", required=True, help="The patch file to summarize")
+parser.add_argument("-d", "--debug", dest="debug", required=False, action='store_true', help="Print raw email headers and content, but don't actually send it")
 args = parser.parse_args()
 
 if not os.path.exists(args.patch):
@@ -45,7 +49,6 @@
     sys.exit(1)
 
 result_file = args.patch + ".testresult"
-result_basename = os.path.basename(args.patch)
 testresult = None
 
 with open(result_file, "r") as f:
@@ -62,7 +65,9 @@
 reply_address = mbox[0]['from']
 
 # extract the message ID and use that as the in-reply-to address
-in_reply_to = re.findall("<(.*)>", mbox[0]['Message-ID'])[0]
+# TODO: This will need to change again when patchtest can handle a whole
+# series at once
+in_reply_to = mbox[0]['Message-ID']
 
 # the address the results email is sent from
 from_address = "patchtest@automation.yoctoproject.org"
@@ -70,7 +75,7 @@
 # mailing list to CC
 cc_address = "openembedded-core@lists.openembedded.org"
 
-if "FAIL" in testresult:
+if has_a_failed_test(testresult):
     reply_contents = None
     if len(max(open(result_file, 'r'), key=len)) > 220:
         warning = "Tests failed for the patch, but the results log could not be processed due to excessive result line length."
@@ -79,18 +84,27 @@
         reply_contents = greeting + testresult + suggestions
 
     ses_client = boto3.client('ses', region_name='us-west-2')
+
+    # Construct the headers for the email. We only want to reply
+    # directly to the tested patch, so make In-Reply-To and References
+    # the same value.
     raw_data = 'From: ' + from_address + '\nTo: ' + reply_address + \
         '\nCC: ' + cc_address + '\nSubject:' + subject_line + \
         '\nIn-Reply-To:' + in_reply_to + \
+        '\nReferences:' + in_reply_to + \
         '\nMIME-Version: 1.0" + \
         "\nContent-type: Multipart/Mixed;boundary="NextPart"\n\n--NextPart\nContent-Type: text/plain\n\n' + \
         reply_contents + '\n\n--NextPart'
-    response = ses_client.send_raw_email(
-        Source="patchtest@automation.yoctoproject.org",
-        RawMessage={
-            "Data": raw_data,
-        },
-    )
+
+    if args.debug:
+        print(f"RawMessage: \n\n{raw_data}")
+    else:
+        response = ses_client.send_raw_email(
+            Source="patchtest@automation.yoctoproject.org",
+            RawMessage={
+                "Data": raw_data,
+            },
+        )
 
 else:
     print(f"No failures identified for {args.patch}.")
diff --git a/poky/scripts/patchtest.README b/poky/scripts/patchtest.README
index ad46b02..76b5fcd 100644
--- a/poky/scripts/patchtest.README
+++ b/poky/scripts/patchtest.README
@@ -133,16 +133,13 @@
 
 ## Contributing
 
-The yocto mailing list (yocto@lists.yoctoproject.org) is used for questions,
+The yocto mailing list (openembedded-core@lists.openembedded.org) is used for questions,
 comments and patch review.  It is subscriber only, so please register before
 posting.
 
-Send pull requests to yocto@lists.yoctoproject.org with '[patchtest]' in the
-subject.
-
 When sending single patches, please use something like:
 
-    git send-email -M -1 --to=yocto@lists.yoctoproject.org  --subject-prefix=patchtest][PATCH
+    git send-email -M -1 --to=openembedded-core@lists.openembedded.org --subject-prefix=OE-core][PATCH
 
 ## Maintenance
 -----------
diff --git a/poky/scripts/sstate-cache-management.py b/poky/scripts/sstate-cache-management.py
index 09b7aa2..d3f600b 100755
--- a/poky/scripts/sstate-cache-management.py
+++ b/poky/scripts/sstate-cache-management.py
@@ -147,7 +147,7 @@
     for stamps_dir in args.stamps_dir:
         stamps_path = Path(stamps_dir)
         assert stamps_path.is_dir()
-        re_sigdata = re.compile(r"do_.*.sigdata\.([^.]*)")
+        re_sigdata = re.compile(r"do_.*\.sigdata\.([^.]*)")
         all_sums |= set(
             [
                 re_sigdata.search(x.parts[-1]).group(1)