poky: subtree update:745e38ff0f..81f9e815d3

Adrian Bunk (6):
      openssl: Upgrade 1.1.1c -> 1.1.1d
      glib-2.0: Upgrade 2.60.6 -> 2.60.7
      lttng-modules: Upgrade 2.10.10 -> 2.10.11
      lttng-ust: Upgrade 2.10.4 -> 2.10.5
      squashfs-tools: Remove UPSTREAM_CHECK_COMMITS
      libmpc: Remove dead UPSTREAM_CHECK_URI

Alexander Kanavin (2):
      runqemu: decouple gtk and gl options
      strace: add a timeout for running ptests

Alistair Francis (1):
      gdb: Mark gdbserver as ALLOW_EMPTY for riscv32

Andre McCurdy (9):
      busybox: drop unused mount.busybox and umount.busybox wrappers
      busybox: drop inittab from SRC_URI ( now moved to busybox-inittab )
      busybox-inittab: minor formatting tweaks
      base-files: drop legacy empty file /etc/default/usbd
      busybox: rcS and rcK should not be writeable by everyone
      ffmpeg: add PACKAGECONFIG controls for alsa and zlib (enable by default)
      libwebp: apply ARM specific config options to big endian ARM
      initscripts: enable alignment.sh init script for big endian ARM
      libunwind: apply configure over-ride to both big and little endian ARM

Andrew F. Davis (4):
      libepoxy: Disable x11 when not building for x11
      cogl: Set depends to the virtual needed not explicitly on Mesa
      gtk+3: Set depends to the virtual needed not explicitly on Mesa
      weston: Set depends to the virtual needed not explicitly on Mesa

Armin Kuster (1):
      gcc: Security fix for CVE-2019-15847

Changhyeok Bae (1):
      iw: upgrade to 5.3

Changqing Li (2):
      classextend.py: don't extend file for file dependency
      report-error.bbclass: add local.conf/auto.conf into error report

Chen Qi (1):
      python-numpy: fix build for libn32

Daniel Gomez (1):
      lttng-modules: Add missing SRCREV_FORMAT

Diego Rondini (1):
      initramfs-framework: support PARTLABEL option

Dmitry Eremin-Solenikov (7):
      image-uefi.conf: add config file holding configuration for UEFI images
      grub-bootconf: switch to image-uefi.conf
      grub-efi: switch to image-uefi.conf
      grub-efi.bbclass: switch to image-uefi.conf
      systemd-boot: switch to image-uefi.conf
      systemd-boot.bbclass: switch to image-uefi.conf
      live-vm-common.bbclass: provide efi population functions for live images

Hector Palacios (1):
      udev-extraconf: skip mounting partitions already mounted by systemd

Henning Schild (6):
      oe-git-proxy: allow setting SOCAT from outside
      oeqa: add case for oe-git-proxy
      Revert "oe-git-proxy: Avoid resolving NO_PROXY against local files"
      oe-git-proxy: disable shell pathname expansion for the whole script
      oe-git-proxy: NO_PROXY suffix matching without wildcard for match_host
      oe-git-proxy: fix dash "Bad substitution"

Hongxu Jia (1):
      elfutils: 0.176 -> 0.177

Jack Mitchell (1):
      iptables: add systemd helper unit to load/restore rules

Jaewon Lee (1):
      populate_sdk_ext: Introduce mechanism to keep nativesdk* sstate in esdk

Jason Wessel (1):
      gnupg: Extend -native wrapper to fix gpgme-native's gpgconf problems

Jiang Lu (2):
      glib-networking:enable glib-networking build as native package
      libsoup:enable libsoup build as native package

Joshua Watt (4):
      sstatesig: Update server URI
      Remove SSTATE_HASHEQUIV_SERVER
      bitbake: bitbake: Rework hash equivalence
      classes/archiver: Fix WORKDIR for shared source

Kai Kang (1):
      systemd: provides ${base_sbindir}/udevadm

Khem Raj (10):
      ptrace: Drop ptrace aid for musl/ppc
      elfutils: Fix build on ppc/musl
      cogl: Do not depend PN-dev on empty PN
      musl: Update to latest master
      glibc: Move DISTRO_FEATURE specific do_install code for target recipe only
      populate_sdk_base.bbclass: nativesdk-glibc-locale is required on musl too
      nativesdk.bbclass: Clear out LIBCEXTENSION and ABIEXTENSION
      openssl: Enable os option for with-rand-seed as well
      weston-init: Add possibility to run weston as non-root user
      layer.conf: Remove weston-conf from SIGGEN_EXCLUDE_SAFE_RECIPE_DEPS

Li Zhou (1):
      qemu: Security Advisory - qemu - CVE-2019-15890

Limeng (1):
      tune-cortexa57-cortexa53: add tunes for ARM Cortex-A53-Cortex-A57

Martin Jansa (2):
      perf: fix build on kernels which don't have ${S}/tools/include/linux/bits.h
      bitbake: Revert "bitbake: cooker: Ensure bbappends are found in stable order"

Maxime Roussin-Bélanger (1):
      meta: add missing descriptions and homepage in bsp

Mikko Rapeli (2):
      busybox.inc: handle empty DEBUG_PREFIX_MAP
      bitbake: svn fetcher: allow "svn propget svn:externals" to fail

Nathan Rossi (7):
      resulttool: Handle multiple series containing ptestresults
      gcc-cross.inc: Process binaries in build dir to be relocatable
      oeqa/core/case.py: Add OEPTestResultTestCase for ptestresult helpers
      oeqa/selftest: Rework toolchain tests to use OEPTestResultTestCase
      glibc-testsuite: SkipRecipe if libc is not glibc
      cmake: 3.15.2 -> 3.15.3
      meson.bbclass: Handle microblaze* mapping to cpu family

Oleksandr Kravchuk (5):
      python3-pygobject: update to 3.34.0
      font-util: update to 1.3.2
      expat: update to 2.2.8
      curl: update to 7.66.0
      python3-dbus: update to 1.2.12

Otavio Salvador (1):
      mesa: Upgrade 19.1.1 -> 19.1.6

Peter Kjellerstedt (3):
      glibc: Make it build without ldconfig in DISTRO_FEATURES
      package_rpm.bbclass: Remove a misleading bb.note()
      tzdata: Correct the packaging of /etc/localtime and /etc/timezone

Quentin Schulz (1):
      externalsrc: stop rebuilds of 2+ externalsrc recipes sharing the same git repo

Randy MacLeod (4):
      valgrind: enable ~500 more ptests
      valgrind: make a few more ptests pass
      valgrind: ptest improvements to run-ptest and more
      valgrind: disable 256 ptests for aarch64

Richard Purdie (8):
      bitbake: runqueue/siggen: Optimise hash equiv queries
      runqemu: Mention snapshot in the help output
      initramfs-framework: support PARTLABEL option
      systemd: Handle slow to boot mips hwdb update timeouts
      meta-extsdk: Either an sstate task is a proper task or it isn't
      oeqa/concurrenttest: Use ionice to delete build directories
      bitbake: utils: Add ionice option to prunedir
      build-appliance-image: Update to master head revision

Robert Yang (2):
      conf/multilib.conf: Add ovmf to NON_MULTILIB_RECIPES
      bitbake: runqueue: validate_hashes(): currentcount should be a number

Ross Burton (16):
      libtasn1: fix build with api-documentation enabled
      gstreamer1.0-libav: enable gtk-doc again
      python3: handle STAGING_LIBDIR/INCDIR being unset
      mesa: no need to depend on target python3
      adwaita-icon-theme: fix rare install race
      oeqa/selftest/wic: improve assert messages in test_fixed_size
      oeqa/selftest/imagefeatures: dump the JSON if it can't be parsed
      libical: upgrade to 3.0.6
      acpica: upgrade 20190509 -> 20190816
      gdk-pixbuf: upgrade 2.38.1 -> 2.38.2
      piglit: upgrade to latest revision
      libinput: upgrade 1.14.0 -> 1.14.1
      rootfs-postcommands: check /etc/gconf exists before working on it
      systemd-systemctl-native: don't care about line endings
      opkg-utils: respect SOURCE_DATE_EPOCH when building ipkgs
      bitbake: fetch2/git: add git-lfs toggle option

Scott Murray (1):
      systemd: upgrade to 243

Stefan Ghinea (1):
      ghostscript: CVE-2019-14811, CVE-2019-14817

Tim Blechmann (1):
      icecc: blacklist pixman

Yeoh Ee Peng (3):
      bitbake: bitbake-layers: show-recipes: Show recipes only
      bitbake: bitbake-layers: show-recipes: Select recipes from selected layer
      bitbake: bitbake-layers: show-recipes: Enable bare output

Yi Zhao (3):
      screen: add /etc/screenrc as global config file
      nfs-utils: fix nfs mount error on 32bit nfs server
      grub: remove diffutils and freetype runtime dependencies

Zang Ruochen (2):
      btrfs-tools:upgrade 5.2.1 -> 5.2.2
      timezone:upgrade 2019b -> 2019c

Change-Id: I1ec24480a8964e474cd99d60a0cb0975e49b46b8
Signed-off-by: Brad Bishop <bradleyb@fuzziesquirrel.com>
diff --git a/poky/bitbake/lib/bb/cooker.py b/poky/bitbake/lib/bb/cooker.py
index 5840aa7..0c54002 100644
--- a/poky/bitbake/lib/bb/cooker.py
+++ b/poky/bitbake/lib/bb/cooker.py
@@ -194,7 +194,7 @@
 
         self.ui_cmdline = None
         self.hashserv = None
-        self.hashservport = None
+        self.hashservaddr = None
 
         self.initConfigurationData()
 
@@ -392,19 +392,20 @@
         except prserv.serv.PRServiceConfigError as e:
             bb.fatal("Unable to start PR Server, exitting")
 
-        if self.data.getVar("BB_HASHSERVE") == "localhost:0":
+        if self.data.getVar("BB_HASHSERVE") == "auto":
+            # Create a new hash server bound to a unix domain socket
             if not self.hashserv:
                 dbfile = (self.data.getVar("PERSISTENT_DIR") or self.data.getVar("CACHE")) + "/hashserv.db"
-                self.hashserv = hashserv.create_server(('localhost', 0), dbfile, '')
-                self.hashservport = "localhost:" + str(self.hashserv.server_port)
+                self.hashservaddr = "unix://%s/hashserve.sock" % self.data.getVar("TOPDIR")
+                self.hashserv = hashserv.create_server(self.hashservaddr, dbfile, sync=False)
                 self.hashserv.process = multiprocessing.Process(target=self.hashserv.serve_forever)
                 self.hashserv.process.daemon = True
                 self.hashserv.process.start()
-            self.data.setVar("BB_HASHSERVE", self.hashservport)
-            self.databuilder.origdata.setVar("BB_HASHSERVE", self.hashservport)
-            self.databuilder.data.setVar("BB_HASHSERVE", self.hashservport)
+            self.data.setVar("BB_HASHSERVE", self.hashservaddr)
+            self.databuilder.origdata.setVar("BB_HASHSERVE", self.hashservaddr)
+            self.databuilder.data.setVar("BB_HASHSERVE", self.hashservaddr)
             for mc in self.databuilder.mcdata:
-                self.databuilder.mcdata[mc].setVar("BB_HASHSERVE", self.hashservport)
+                self.databuilder.mcdata[mc].setVar("BB_HASHSERVE", self.hashservaddr)
 
         bb.parse.init_parser(self.data)
 
@@ -1852,7 +1853,6 @@
             (bbappend, filename) = b
             if (bbappend == f) or ('%' in bbappend and bbappend.startswith(f[:bbappend.index('%')])):
                 filelist.append(filename)
-        filelist.sort()
         return filelist
 
     def collection_priorities(self, pkgfns, d):
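
The cooker hunk above moves the automatically started hash equivalence server from an ephemeral localhost TCP port to a unix domain socket under TOPDIR, keyed off the new "auto" value. A minimal, illustrative local.conf fragment to opt in (signature handler configuration is assumed to be set up separately):

    # Ask the cooker to spawn its own hash equivalence server; it binds
    # unix://${TOPDIR}/hashserve.sock and exports that address via BB_HASHSERVE.
    BB_HASHSERVE = "auto"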
diff --git a/poky/bitbake/lib/bb/fetch2/git.py b/poky/bitbake/lib/bb/fetch2/git.py
index e171aa7..5fd63b4 100644
--- a/poky/bitbake/lib/bb/fetch2/git.py
+++ b/poky/bitbake/lib/bb/fetch2/git.py
@@ -464,6 +464,8 @@
         if os.path.exists(destdir):
             bb.utils.prunedir(destdir)
 
+        need_lfs = ud.parm.get("lfs", "1") == "1"
+
         source_found = False
         source_error = []
 
@@ -493,14 +495,16 @@
         runfetchcmd("%s remote set-url origin %s" % (ud.basecmd, repourl), d, workdir=destdir)
 
         if self._contains_lfs(ud, d, destdir):
-            path = d.getVar('PATH')
-            if path:
-                gitlfstool = bb.utils.which(path, "git-lfs", executable=True)
-                if not gitlfstool:
-                    raise bb.fetch2.FetchError("Repository %s has lfs content, install git-lfs plugin on host to download" % (repourl))
+            if need_lfs:
+                path = d.getVar('PATH')
+                if path:
+                    gitlfstool = bb.utils.which(path, "git-lfs", executable=True)
+                    if not gitlfstool:
+                        raise bb.fetch2.FetchError("Repository %s has LFS content, install git-lfs on host to download (or set lfs=0 to ignore it)" % (repourl))
+                else:
+                    bb.note("Could not find 'PATH'")
             else:
-                bb.note("Could not find 'PATH'")
-
+                bb.note("Repository %s has LFS content but it is not being fetched" % (repourl))
 
         if not ud.nocheckout:
             if subdir != "":
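
The git fetcher now honours an "lfs" URL parameter (default "1"): with lfs=0 the git-lfs host requirement is skipped and the LFS content is simply noted as not fetched. A hypothetical SRC_URI showing both forms (the repository URL is a placeholder):

    # Default behaviour: git-lfs must be installed on the host if the repo has LFS content.
    SRC_URI = "git://git.example.com/project.git;branch=master"

    # Explicitly skip LFS content; no git-lfs needed on the host.
    SRC_URI = "git://git.example.com/project.git;branch=master;lfs=0"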
diff --git a/poky/bitbake/lib/bb/fetch2/svn.py b/poky/bitbake/lib/bb/fetch2/svn.py
index 59ce931..96d666b 100644
--- a/poky/bitbake/lib/bb/fetch2/svn.py
+++ b/poky/bitbake/lib/bb/fetch2/svn.py
@@ -145,7 +145,7 @@
 
             if not ("externals" in ud.parm and ud.parm["externals"] == "nowarn"):
                 # Warn the user if this had externals (won't catch them all)
-                output = runfetchcmd("svn propget svn:externals", d, workdir=ud.moddir)
+                output = runfetchcmd("svn propget svn:externals || true", d, workdir=ud.moddir)
                 if output:
                     if "--ignore-externals" in svnfetchcmd.split():
                         bb.warn("%s contains svn:externals." % ud.url)
diff --git a/poky/bitbake/lib/bb/runqueue.py b/poky/bitbake/lib/bb/runqueue.py
index addb2bb..d9a67a3 100644
--- a/poky/bitbake/lib/bb/runqueue.py
+++ b/poky/bitbake/lib/bb/runqueue.py
@@ -1173,6 +1173,7 @@
                     self.prepare_task_hash(tid)
 
         bb.parse.siggen.writeout_file_checksum_cache()
+        bb.parse.siggen.set_setscene_tasks(self.runq_setscene_tids)
 
         #self.dump_data()
         return len(self.runtaskentries)
@@ -1259,7 +1260,7 @@
             "buildname" : self.cfgData.getVar("BUILDNAME"),
             "date" : self.cfgData.getVar("DATE"),
             "time" : self.cfgData.getVar("TIME"),
-            "hashservport" : self.cooker.hashservport,
+            "hashservaddr" : self.cooker.hashservaddr,
         }
 
         worker.stdin.write(b"<cookerconfig>" + pickle.dumps(self.cooker.configuration) + b"</cookerconfig>")
@@ -1393,7 +1394,7 @@
             cache[tid] = iscurrent
         return iscurrent
 
-    def validate_hashes(self, tocheck, data, currentcount=None, siginfo=False):
+    def validate_hashes(self, tocheck, data, currentcount=0, siginfo=False):
         valid = set()
         if self.hashvalidate:
             sq_data = {}
@@ -1600,7 +1601,7 @@
 
             tocheck.add(tid)
 
-        valid_new = self.validate_hashes(tocheck, self.cooker.data, None, True)
+        valid_new = self.validate_hashes(tocheck, self.cooker.data, 0, True)
 
         # Tasks which are both setscene and noexec never care about dependencies
         # We therefore find tasks which are setscene and noexec and mark their
@@ -1981,7 +1982,7 @@
                             continue
                         logger.debug(1, "Task %s no longer deferred" % nexttask)
                         del self.sq_deferred[nexttask]
-                        valid = self.rq.validate_hashes(set([nexttask]), self.cooker.data, None, False)
+                        valid = self.rq.validate_hashes(set([nexttask]), self.cooker.data, 0, False)
                         if not valid:
                             logger.debug(1, "%s didn't become valid, skipping setscene" % nexttask)
                             self.sq_task_failoutright(nexttask)
@@ -2173,7 +2174,7 @@
             ret.add(dep)
         return ret
 
-    # We filter out multiconfig dependencies from taskdepdata we pass to the tasks 
+    # We filter out multiconfig dependencies from taskdepdata we pass to the tasks
     # as most code can't handle them
     def build_taskdepdata(self, task):
         taskdepdata = {}
diff --git a/poky/bitbake/lib/bb/siggen.py b/poky/bitbake/lib/bb/siggen.py
index b503559..e047c21 100644
--- a/poky/bitbake/lib/bb/siggen.py
+++ b/poky/bitbake/lib/bb/siggen.py
@@ -13,6 +13,7 @@
 import simplediff
 from bb.checksum import FileChecksumCache
 from bb import runqueue
+import hashserv
 
 logger = logging.getLogger('BitBake.SigGen')
 
@@ -91,6 +92,8 @@
     def save_unitaskhashes(self):
         return
 
+    def set_setscene_tasks(self, setscene_tasks):
+        return
 
 class SignatureGeneratorBasic(SignatureGenerator):
     """
@@ -106,6 +109,7 @@
         self.taints = {}
         self.gendeps = {}
         self.lookupcache = {}
+        self.setscenetasks = {}
         self.basewhitelist = set((data.getVar("BB_HASHBASE_WHITELIST") or "").split())
         self.taskwhitelist = None
         self.init_rundepcheck(data)
@@ -151,6 +155,9 @@
 
         return taskdeps
 
+    def set_setscene_tasks(self, setscene_tasks):
+        self.setscenetasks = setscene_tasks
+
     def finalise(self, fn, d, variant):
 
         mc = d.getVar("__BBMULTICONFIG", False) or ""
@@ -369,6 +376,11 @@
         self.server, self.method = data[:2]
         super().set_taskdata(data[2:])
 
+    def client(self):
+        if getattr(self, '_client', None) is None:
+            self._client = hashserv.create_client(self.server)
+        return self._client
+
     def __get_task_unihash_key(self, tid):
         # TODO: The key only *needs* to be the taskhash, the tid is just
         # convenient
@@ -389,11 +401,12 @@
         self.unitaskhashes[self.__get_task_unihash_key(tid)] = unihash
 
     def get_unihash(self, tid):
-        import urllib
-        import json
-
         taskhash = self.taskhash[tid]
 
+        # If its not a setscene task we can return
+        if self.setscenetasks and tid not in self.setscenetasks:
+            return taskhash
+
         key = self.__get_task_unihash_key(tid)
 
         # TODO: This cache can grow unbounded. It probably only needs to keep
@@ -418,36 +431,22 @@
         unihash = taskhash
 
         try:
-            url = '%s/v1/equivalent?%s' % (self.server,
-                    urllib.parse.urlencode({'method': self.method, 'taskhash': self.taskhash[tid]}))
-
-            request = urllib.request.Request(url)
-            response = urllib.request.urlopen(request)
-            data = response.read().decode('utf-8')
-
-            json_data = json.loads(data)
-
-            if json_data:
-                unihash = json_data['unihash']
+            data = self.client().get_unihash(self.method, self.taskhash[tid])
+            if data:
+                unihash = data
                 # A unique hash equal to the taskhash is not very interesting,
                 # so it is reported it at debug level 2. If they differ, that
                 # is much more interesting, so it is reported at debug level 1
                 bb.debug((1, 2)[unihash == taskhash], 'Found unihash %s in place of %s for %s from %s' % (unihash, taskhash, tid, self.server))
             else:
                 bb.debug(2, 'No reported unihash for %s:%s from %s' % (tid, taskhash, self.server))
-        except urllib.error.URLError as e:
-            bb.warn('Failure contacting Hash Equivalence Server %s: %s' % (self.server, str(e)))
-        except (KeyError, json.JSONDecodeError) as e:
-            bb.warn('Poorly formatted response from %s: %s' % (self.server, str(e)))
+        except hashserv.HashConnectionError as e:
+            bb.warn('Error contacting Hash Equivalence Server %s: %s' % (self.server, str(e)))
 
         self.unitaskhashes[key] = unihash
         return unihash
 
     def report_unihash(self, path, task, d):
-        import urllib
-        import json
-        import tempfile
-        import base64
         import importlib
 
         taskhash = d.getVar('BB_TASKHASH')
@@ -482,42 +481,31 @@
                 outhash = bb.utils.better_eval(self.method + '(path, sigfile, task, d)', locs)
 
             try:
-                url = '%s/v1/equivalent' % self.server
-                task_data = {
-                    'taskhash': taskhash,
-                    'method': self.method,
-                    'outhash': outhash,
-                    'unihash': unihash,
-                    'owner': d.getVar('SSTATE_HASHEQUIV_OWNER')
-                    }
+                extra_data = {}
+
+                owner = d.getVar('SSTATE_HASHEQUIV_OWNER')
+                if owner:
+                    extra_data['owner'] = owner
 
                 if report_taskdata:
                     sigfile.seek(0)
 
-                    task_data['PN'] = d.getVar('PN')
-                    task_data['PV'] = d.getVar('PV')
-                    task_data['PR'] = d.getVar('PR')
-                    task_data['task'] = task
-                    task_data['outhash_siginfo'] = sigfile.read().decode('utf-8')
+                    extra_data['PN'] = d.getVar('PN')
+                    extra_data['PV'] = d.getVar('PV')
+                    extra_data['PR'] = d.getVar('PR')
+                    extra_data['task'] = task
+                    extra_data['outhash_siginfo'] = sigfile.read().decode('utf-8')
 
-                headers = {'content-type': 'application/json'}
-
-                request = urllib.request.Request(url, json.dumps(task_data).encode('utf-8'), headers)
-                response = urllib.request.urlopen(request)
-                data = response.read().decode('utf-8')
-
-                json_data = json.loads(data)
-                new_unihash = json_data['unihash']
+                data = self.client().report_unihash(taskhash, self.method, outhash, unihash, extra_data)
+                new_unihash = data['unihash']
 
                 if new_unihash != unihash:
                     bb.debug(1, 'Task %s unihash changed %s -> %s by server %s' % (taskhash, unihash, new_unihash, self.server))
                     bb.event.fire(bb.runqueue.taskUniHashUpdate(fn + ':do_' + task, new_unihash), d)
                 else:
                     bb.debug(1, 'Reported task %s as unihash %s to %s' % (taskhash, unihash, self.server))
-            except urllib.error.URLError as e:
-                bb.warn('Failure contacting Hash Equivalence Server %s: %s' % (self.server, str(e)))
-            except (KeyError, json.JSONDecodeError) as e:
-                bb.warn('Poorly formatted response from %s: %s' % (self.server, str(e)))
+            except hashserv.HashConnectionError as e:
+                bb.warn('Error contacting Hash Equivalence Server %s: %s' % (self.server, str(e)))
         finally:
             if sigfile:
                 sigfile.close()
@@ -538,7 +526,7 @@
     name = "TestEquivHash"
     def init_rundepcheck(self, data):
         super().init_rundepcheck(data)
-        self.server = "http://" + data.getVar('BB_HASHSERVE')
+        self.server = data.getVar('BB_HASHSERVE')
         self.method = "sstate_output_hash"
 
 
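
The siggen changes above replace the hand-rolled urllib/json requests with the hashserv client API. A rough sketch of that client path, using only the calls visible in this diff, with a placeholder server address and hash values:

    import hashserv

    server = "unix:///path/to/build/hashserve.sock"  # placeholder address
    client = hashserv.create_client(server)

    method = "sstate_output_hash"
    taskhash = "0123abcd"   # placeholder task hash
    outhash = "4567ef01"    # placeholder output hash

    try:
        # Ask the server for an equivalent unihash; fall back to the taskhash.
        unihash = client.get_unihash(method, taskhash) or taskhash

        # Report a build result; extra_data (owner, PN, PV, PR, ...) is optional.
        data = client.report_unihash(taskhash, method, outhash, unihash, {})
        new_unihash = data['unihash']
    except hashserv.HashConnectionError as e:
        print('Error contacting Hash Equivalence Server %s: %s' % (server, e))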
diff --git a/poky/bitbake/lib/bb/tests/runqueue.py b/poky/bitbake/lib/bb/tests/runqueue.py
index c7f5e55..cb4d526 100644
--- a/poky/bitbake/lib/bb/tests/runqueue.py
+++ b/poky/bitbake/lib/bb/tests/runqueue.py
@@ -11,6 +11,7 @@
 import os
 import tempfile
 import subprocess
+import sys
 
 #
 # TODO:
@@ -232,10 +233,11 @@
             self.assertEqual(set(tasks), set(expected))
 
 
+    @unittest.skipIf(sys.version_info < (3, 5, 0), 'Python 3.5 or later required')
     def test_hashserv_single(self):
         with tempfile.TemporaryDirectory(prefix="runqueuetest") as tempdir:
             extraenv = {
-                "BB_HASHSERVE" : "localhost:0",
+                "BB_HASHSERVE" : "auto",
                 "BB_SIGNATURE_HANDLER" : "TestEquivHash"
             }
             cmd = ["bitbake", "a1", "b1"]
@@ -255,10 +257,11 @@
                         'a1:package_write_ipk_setscene', 'a1:package_qa_setscene']
             self.assertEqual(set(tasks), set(expected))
 
+    @unittest.skipIf(sys.version_info < (3, 5, 0), 'Python 3.5 or later required')
     def test_hashserv_double(self):
         with tempfile.TemporaryDirectory(prefix="runqueuetest") as tempdir:
             extraenv = {
-                "BB_HASHSERVE" : "localhost:0",
+                "BB_HASHSERVE" : "auto",
                 "BB_SIGNATURE_HANDLER" : "TestEquivHash"
             }
             cmd = ["bitbake", "a1", "b1", "e1"]
@@ -278,11 +281,12 @@
             self.assertEqual(set(tasks), set(expected))
 
 
+    @unittest.skipIf(sys.version_info < (3, 5, 0), 'Python 3.5 or later required')
     def test_hashserv_multiple_setscene(self):
         # Runs e1:do_package_setscene twice
         with tempfile.TemporaryDirectory(prefix="runqueuetest") as tempdir:
             extraenv = {
-                "BB_HASHSERVE" : "localhost:0",
+                "BB_HASHSERVE" : "auto",
                 "BB_SIGNATURE_HANDLER" : "TestEquivHash"
             }
             cmd = ["bitbake", "a1", "b1", "e1"]
@@ -308,11 +312,12 @@
                 else:
                     self.assertEqual(tasks.count(i), 1, "%s not in task list once" % i)
 
+    @unittest.skipIf(sys.version_info < (3, 5, 0), 'Python 3.5 or later required')
     def test_hashserv_partial_match(self):
         # e1:do_package matches initial built but not second hash value
         with tempfile.TemporaryDirectory(prefix="runqueuetest") as tempdir:
             extraenv = {
-                "BB_HASHSERVE" : "localhost:0",
+                "BB_HASHSERVE" : "auto",
                 "BB_SIGNATURE_HANDLER" : "TestEquivHash"
             }
             cmd = ["bitbake", "a1", "b1"]
@@ -336,11 +341,12 @@
             expected.remove('e1:package')
             self.assertEqual(set(tasks), set(expected))
 
+    @unittest.skipIf(sys.version_info < (3, 5, 0), 'Python 3.5 or later required')
     def test_hashserv_partial_match2(self):
         # e1:do_package + e1:do_populate_sysroot matches initial built but not second hash value
         with tempfile.TemporaryDirectory(prefix="runqueuetest") as tempdir:
             extraenv = {
-                "BB_HASHSERVE" : "localhost:0",
+                "BB_HASHSERVE" : "auto",
                 "BB_SIGNATURE_HANDLER" : "TestEquivHash"
             }
             cmd = ["bitbake", "a1", "b1"]
@@ -363,13 +369,14 @@
                         'e1:package_setscene', 'e1:populate_sysroot_setscene', 'e1:build', 'e1:package_qa', 'e1:package_write_rpm', 'e1:package_write_ipk', 'e1:packagedata']
             self.assertEqual(set(tasks), set(expected))
 
+    @unittest.skipIf(sys.version_info < (3, 5, 0), 'Python 3.5 or later required')
     def test_hashserv_partial_match3(self):
         # e1:do_package is valid for a1 but not after b1
         # In former buggy code, this triggered e1:do_fetch, then e1:do_populate_sysroot to run
         # with none of the intermediate tasks which is a serious bug
         with tempfile.TemporaryDirectory(prefix="runqueuetest") as tempdir:
             extraenv = {
-                "BB_HASHSERVE" : "localhost:0",
+                "BB_HASHSERVE" : "auto",
                 "BB_SIGNATURE_HANDLER" : "TestEquivHash"
             }
             cmd = ["bitbake", "a1", "b1"]
diff --git a/poky/bitbake/lib/bb/utils.py b/poky/bitbake/lib/bb/utils.py
index 3e90b6a..d035949 100644
--- a/poky/bitbake/lib/bb/utils.py
+++ b/poky/bitbake/lib/bb/utils.py
@@ -677,7 +677,7 @@
         return True
     return False
 
-def remove(path, recurse=False):
+def remove(path, recurse=False, ionice=False):
     """Equivalent to rm -f or rm -rf"""
     if not path:
         return
@@ -686,7 +686,10 @@
             if _check_unsafe_delete_path(path):
                 raise Exception('bb.utils.remove: called with dangerous path "%s" and recurse=True, refusing to delete!' % path)
         # shutil.rmtree(name) would be ideal but its too slow
-        subprocess.check_call(['rm', '-rf'] + glob.glob(path))
+        cmd = []
+        if ionice:
+            cmd = ['ionice', '-c', '3']
+        subprocess.check_call(cmd + ['rm', '-rf'] + glob.glob(path))
         return
     for name in glob.glob(path):
         try:
@@ -695,12 +698,12 @@
             if exc.errno != errno.ENOENT:
                 raise
 
-def prunedir(topdir):
+def prunedir(topdir, ionice=False):
     # Delete everything reachable from the directory named in 'topdir'.
     # CAUTION:  This is dangerous!
     if _check_unsafe_delete_path(topdir):
         raise Exception('bb.utils.prunedir: called with dangerous path "%s", refusing to delete!' % topdir)
-    remove(topdir, recurse=True)
+    remove(topdir, recurse=True, ionice=ionice)
 
 #
 # Could also use return re.compile("(%s)" % "|".join(map(re.escape, suffixes))).sub(lambda mo: "", var)
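
The utils change threads an ionice flag through remove() and prunedir() so that large tree deletions can run at idle I/O priority. A short illustrative call (the directory path is a placeholder):

    import bb.utils

    # Equivalent to: ionice -c 3 rm -rf /path/to/old-build-dir
    bb.utils.prunedir("/path/to/old-build-dir", ionice=True)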