poky: subtree update: 5951cbcabe..968fcf4989

Alejandro Hernandez (3):
      baremetal-helloworld: Use do_image_complete instead of do_deploy
      baremetal-image.bbclass: Create a class for baremetal applications or an RTOS
      baremetal-helloworld: Use baremetal-image class to deploy the application

Alejandro del Castillo (2):
      opkg-utils: upgrade to 0.4.3
      opkg: upgrade to version 0.4.3

Alexander Kanavin (30):
      dnf: upgrade 4.2.21 -> 4.2.23
      meson: upgrade 0.54.2 -> 0.54.3
      libdnf: update 0.47.0 -> 0.48.0
      ffmpeg: disable altivec on ppc by default
      dropbear: update 2019.78 -> 2020.79
      elfutils: upgrade 0.179 -> 0.180
      gnu-config: update to latest revision
      libgpg-error: update 1.37 -> 1.38
      perl: update 5.30.2 -> 5.32.0
      gst-examples: upstream releases are even numbered
      bison: upgrade 3.6.3 -> 3.6.4
      python3-cython: upgrade 0.29.19 -> 0.29.20
      stress-ng: upgrade 0.11.12 -> 0.11.14
      piglit: upgrade to latest revision
      linux-firmware: upgrade 20200519 -> 20200619
      systemtap: upgrade 4.2 -> 4.3
      alsa-lib: upgrade 1.2.2 -> 1.2.3.1
      alsa-topology-conf: upgrade 1.2.2 -> 1.2.3
      alsa-ucm-conf: upgrade 1.2.2 -> 1.2.3
      alsa-utils: upgrade 1.2.2 -> 1.2.3
      puzzles: upgrade to latest revision
      diffoscope: upgrade 147 -> 148
      libcheck: upgrade 0.14.0 -> 0.15.0
      rsync: update 3.1.3 -> 3.2.1
      sudo: upgrade 1.9.0 -> 1.9.1
      python3-numpy: update 1.18.5 -> 1.19.0
      mesa: update 20.0.7 -> 20.1.2
      go-binary-native: fix upstream version check
      Revert "python3-setuptools: patch entrypoints for faster initialization"
      python3-setuptools: upgrade 47.1.1 -> 47.3.1

Alistair Francis (1):
      opensbi: Update to OpenSBI v0.8 release

Andreas Müller (3):
      nfs-utils: upgrade 2.4.3 -> 2.5.1
      ccache: merge ccache.inc into recipe
      ccache: upgrade 3.7.9 -> 3.7.10

Andrej Valek (2):
      busybox: 1.31.1 -> 1.32.0
      dropbear: update to 2020.80

Andrey Zhizhikin (1):
      kernel/yocto: fix search for defconfig from src_uri

Armin Kuster (1):
      wpa-supplicant: Security fix CVE-2020-12695

Bjarne Michelsen (1):
      devtool: default to empty string if LIC_FILES_CHKSUM is not available

Bruce Ashfield (10):
      kernel/yocto: ensure that defconfigs are processed first
      linux-yocto/5.4: update to v5.4.45
      linux-yocto-rt/5.4: update to rt25
      linux-yocto/5.4: update to v5.4.46
      linux-yocto/5.4: update to v5.4.47
      linux-yocto/5.4: update to v5.4.49 and -rt28
      yocto-bsps: bump reference boards to v5.4.49
      linux-yocto/5.4: update to v5.4.50
      linux-yocto-dev: bump to 5.8-rc
      lttng-modules: bump devupstream to v2.12.1+

Changqing Li (5):
      xinit: add rxvt-unicode in RDEPENDS
      modutils-initscripts: update postinst
      initscripts: update postinst
      gtk-icon-cache.bbclass: add runtime dependency
      logrotate.py: fix occasional testimage failure

Chen Qi (2):
      oescripts.py: fix typo
      oescripts: ignore whitespaces when comparing lines

Chris Laplante (2):
      bitbake: contrib/vim: synchronize from kergoth/vim-bitbake rev 4225ee8b4818d7e4696520567216a3a031c26f7d
      bitbake: ui/teamcity: don't use removed logging classes

Christian Eggers (1):
      libnl: Extend for native/nativesdk

Damian Wrobel (1):
      rootfs: do not let ldconfig create symlinks

Daniel Klauer (2):
      uboot-sign: Refactor do_deploy prefunc to do_deploy_prepend
      deploy.bbclass: Clean DEPLOYDIR before do_deploy

David Khouya (2):
      bitbake: lib/ui/taskexp: Validate gi import
      bitbake: lib/ui/taskexp: Fix missing Gtk import

Hannu Lounento (1):
      openssl: move ${libdir}/[...]/openssl.cnf to ${PN}-conf

Hongxu Jia (1):
      iso-codes: switch upstream branch master -> main

Jason Wessel (1):
      runqemu: If using a vmtype image do not add the -no-reboot flag

Joe Slater (1):
      jquery: use ${S}

Joshua Watt (4):
      bitbake: hashserv: Chunkify large messages
      bitbake: siggen: Fix error when hash equivalence has an exception
      classes/archiver: run do_unpack_and_patch after do_preconfigure
      classes/archiver: do_configure should not depend on do_ar_patched

Khem Raj (2):
      musl: Update to tip of master
      rxvt-unicode: Disable wtmp on musl

Konrad Weihmann (2):
      systemd: remove kernel-install from base pkg
      bitbake.conf: fix whitespace issues

Lee Chee Yang (3):
      json-c: fix CVE-2020-12762
      qemu: fix CVE-2020-10761
      oeqa/core/loader: refine regex to find module

Lili Li (1):
      kernel.bbclass: Fix Module.symvers support

Matt Madison (1):
      kernel.bbclass: add gzip-native to do_deploy dependencies

Max Krummenacher (2):
      cogl-1.0: don't require eglmesaext.h
      cogl-1.0: cope with missing x11 headers

Mingli Yu (2):
      python3-libarchive-c: add the missing rdepends
      python3: add ldconfig rdepends for python3-ctypes

Nicolas Dechesne (1):
      checklayer: parse LAYERDEPENDS with bb.utils.explode_dep_versions2()

Pierre-Jean Texier (3):
      libubootenv: bump to revision 86bd30a
      curl: upgrade 7.71.0 -> 7.71.1
      diffoscope: upgrade 148 -> 150

Rahul Kumar (1):
      bzip2: Add test suite for bzip2

Rasmus Villemoes (1):
      coreutils: don't split stdbuf into its own package with single-binary

Richard Purdie (13):
      pseudo: Switch to oe-core branch in git repo
      pseudo: merge in fixes for setfacl issue
      oeqa/selftest: Clean up separate builddir in success case when non-threaded
      populate_sdk_ext: Fix to use python3, not python
      bitbake: taskdata: Improve handling of regex in ASSUME_PROVIDED
      bitbake: runqueue: Avoid unpickle errors in rare cases
      bitbake: msg: Avoid issues where paths have relative components
      oeqa/selftest: recipetool/devtool: Avoid load_plugin test race
      oeqa/targetcontrol: Attempt to fix log closure warning message
      rootfs-postcommands: Improve/fix rootfs_check_host_user_contaminated
      spdx: Remove the class as its obsolete
      adwaita-icon-theme: Add missing license files to LIC_FILES_CHKSUM
      bitbake: server/process: Increase timeout for commands

Ross Burton (3):
      ovmf: build natively everywhere
      common-licenses: fix filename of BSD-2-Clause-Patent
      gtk+3: fix reproducible build failure

Timon Ulrich (2):
      kernel.bbclass: add lz4 dependency and fix the call to lz4
      kernel.bbclass: make dependency on lzop-native conditional

Vacek, Patrick (1):
      oeqa/core/loader: fix regex to include numbers

Wang Mingyu (1):
      gtk+3: upgrade 3.24.20 -> 3.24.21

Yanfei Xu (1):
      classes/kernel: Use a copy of image for kernel*.rpm if fs doesn't support symlinks

akuster (5):
      libuv: update to the last version in meta-oe
      bitbake: test/fetch: change to better svn source
      overview-manual: add SPDX license header
      mega-manual: Add SPDX license headers
      ref-manual: Add SPDX license headers

hongxu (2):
      qemu: switch from libcap to libcap-ng for PACKAGECONFIG virtfs
      cpio: add nativesdk support

zangrc (1):
      libjpeg-turbo: upgrade 2.0.4 -> 2.0.5

Signed-off-by: Andrew Geissler <geissonator@yahoo.com>
Change-Id: I41e066e5957aa74c9a24e86a6c214bcf96e9c46b
diff --git a/poky/bitbake/lib/hashserv/server.py b/poky/bitbake/lib/hashserv/server.py
index cc7e482..8105071 100644
--- a/poky/bitbake/lib/hashserv/server.py
+++ b/poky/bitbake/lib/hashserv/server.py
@@ -13,6 +13,7 @@
 import signal
 import socket
 import time
+from . import chunkify, DEFAULT_MAX_CHUNK
 
 logger = logging.getLogger('hashserv.server')
 
@@ -107,12 +108,29 @@
         return {k: getattr(self, k) for k in ('num', 'total_time', 'max_time', 'average', 'stdev')}
 
 
+class ClientError(Exception):
+    pass
+
 class ServerClient(object):
+    FAST_QUERY = 'SELECT taskhash, method, unihash FROM tasks_v2 WHERE method=:method AND taskhash=:taskhash ORDER BY created ASC LIMIT 1'
+    ALL_QUERY =  'SELECT *                         FROM tasks_v2 WHERE method=:method AND taskhash=:taskhash ORDER BY created ASC LIMIT 1'
+
     def __init__(self, reader, writer, db, request_stats):
         self.reader = reader
         self.writer = writer
         self.db = db
         self.request_stats = request_stats
+        self.max_chunk = DEFAULT_MAX_CHUNK
+
+        self.handlers = {
+            'get': self.handle_get,
+            'report': self.handle_report,
+            'report-equiv': self.handle_equivreport,
+            'get-stream': self.handle_get_stream,
+            'get-stats': self.handle_get_stats,
+            'reset-stats': self.handle_reset_stats,
+            'chunk-stream': self.handle_chunk,
+        }
 
     async def process_requests(self):
         try:
@@ -125,7 +143,11 @@
                 return
 
             (proto_name, proto_version) = protocol.decode('utf-8').rstrip().split()
-            if proto_name != 'OEHASHEQUIV' or proto_version != '1.0':
+            if proto_name != 'OEHASHEQUIV':
+                return
+
+            proto_version = tuple(int(v) for v in proto_version.split('.'))
+            if proto_version < (1, 0) or proto_version > (1, 1):
                 return
 
             # Read headers. Currently, no headers are implemented, so look for
@@ -140,40 +162,34 @@
                     break
 
             # Handle messages
-            handlers = {
-                'get': self.handle_get,
-                'report': self.handle_report,
-                'report-equiv': self.handle_equivreport,
-                'get-stream': self.handle_get_stream,
-                'get-stats': self.handle_get_stats,
-                'reset-stats': self.handle_reset_stats,
-            }
-
             while True:
                 d = await self.read_message()
                 if d is None:
                     break
-
-                for k in handlers.keys():
-                    if k in d:
-                        logger.debug('Handling %s' % k)
-                        if 'stream' in k:
-                            await handlers[k](d[k])
-                        else:
-                            with self.request_stats.start_sample() as self.request_sample, \
-                                    self.request_sample.measure():
-                                await handlers[k](d[k])
-                        break
-                else:
-                    logger.warning("Unrecognized command %r" % d)
-                    break
-
+                await self.dispatch_message(d)
                 await self.writer.drain()
+        except ClientError as e:
+            logger.error(str(e))
         finally:
             self.writer.close()
 
+    async def dispatch_message(self, msg):
+        for k in self.handlers.keys():
+            if k in msg:
+                logger.debug('Handling %s' % k)
+                if 'stream' in k:
+                    await self.handlers[k](msg[k])
+                else:
+                    with self.request_stats.start_sample() as self.request_sample, \
+                            self.request_sample.measure():
+                        await self.handlers[k](msg[k])
+                return
+
+        raise ClientError("Unrecognized command %r" % msg)
+
     def write_message(self, msg):
-        self.writer.write(('%s\n' % json.dumps(msg)).encode('utf-8'))
+        for c in chunkify(json.dumps(msg), self.max_chunk):
+            self.writer.write(c.encode('utf-8'))
 
     async def read_message(self):
         l = await self.reader.readline()
@@ -191,14 +207,38 @@
             logger.error('Bad message from client: %r' % message)
             raise e
 
+    async def handle_chunk(self, request):
+        lines = []
+        try:
+            while True:
+                l = await self.reader.readline()
+                l = l.rstrip(b"\n").decode("utf-8")
+                if not l:
+                    break
+                lines.append(l)
+
+            msg = json.loads(''.join(lines))
+        except (json.JSONDecodeError, UnicodeDecodeError) as e:
+            logger.error('Bad message from client: %r' % (''.join(lines)))
+            raise e
+
+        if 'chunk-stream' in msg:
+            raise ClientError("Nested chunks are not allowed")
+
+        await self.dispatch_message(msg)
+
     async def handle_get(self, request):
         method = request['method']
         taskhash = request['taskhash']
 
-        row = self.query_equivalent(method, taskhash)
+        if request.get('all', False):
+            row = self.query_equivalent(method, taskhash, self.ALL_QUERY)
+        else:
+            row = self.query_equivalent(method, taskhash, self.FAST_QUERY)
+
         if row is not None:
             logger.debug('Found equivalent task %s -> %s', (row['taskhash'], row['unihash']))
-            d = {k: row[k] for k in ('taskhash', 'method', 'unihash')}
+            d = {k: row[k] for k in row.keys()}
 
             self.write_message(d)
         else:
@@ -228,7 +268,7 @@
 
                 (method, taskhash) = l.split()
                 #logger.debug('Looking up %s %s' % (method, taskhash))
-                row = self.query_equivalent(method, taskhash)
+                row = self.query_equivalent(method, taskhash, self.FAST_QUERY)
                 if row is not None:
                     msg = ('%s\n' % row['unihash']).encode('utf-8')
                     #logger.debug('Found equivalent task %s -> %s', (row['taskhash'], row['unihash']))
@@ -328,7 +368,7 @@
             # Fetch the unihash that will be reported for the taskhash. If the
             # unihash matches, it means this row was inserted (or the mapping
             # was already valid)
-            row = self.query_equivalent(data['method'], data['taskhash'])
+            row = self.query_equivalent(data['method'], data['taskhash'], self.FAST_QUERY)
 
             if row['unihash'] == data['unihash']:
                 logger.info('Adding taskhash equivalence for %s with unihash %s',
@@ -354,12 +394,11 @@
         self.request_stats.reset()
         self.write_message(d)
 
-    def query_equivalent(self, method, taskhash):
+    def query_equivalent(self, method, taskhash, query):
         # This is part of the inner loop and must be as fast as possible
         try:
             cursor = self.db.cursor()
-            cursor.execute('SELECT taskhash, method, unihash FROM tasks_v2 WHERE method=:method AND taskhash=:taskhash ORDER BY created ASC LIMIT 1',
-                           {'method': method, 'taskhash': taskhash})
+            cursor.execute(query, {'method': method, 'taskhash': taskhash})
             return cursor.fetchone()
         except:
             cursor.close()
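
Note on the chunking change above: the server imports chunkify() and
DEFAULT_MAX_CHUNK from the hashserv package, but the helper itself lies
outside this diff. A minimal sketch of a compatible generator, assuming
it lives in hashserv/__init__.py next to DEFAULT_MAX_CHUNK (the 32 KiB
value here is illustrative, not taken from this patch):

    import itertools
    import json

    DEFAULT_MAX_CHUNK = 32 * 1024  # assumed default; not shown in the diff

    def chunkify(msg, max_chunk):
        # Small messages stay a single newline-terminated JSON line,
        # which read_message() consumes directly.
        if len(msg) < max_chunk - 1:
            yield ''.join((msg, "\n"))
        else:
            # Announce a chunked payload; dispatch_message() routes this
            # to handle_chunk() via the 'chunk-stream' key.
            yield ''.join((json.dumps({'chunk-stream': None}), "\n"))
            # Slice the serialized message into lines of at most
            # max_chunk - 1 characters (plus the trailing newline).
            args = [iter(msg)] * (max_chunk - 1)
            for m in map(''.join, itertools.zip_longest(*args, fillvalue='')):
                yield ''.join((m, "\n"))
            # A blank line terminates the stream, matching the
            # "if not l: break" test in handle_chunk().
            yield "\n"

On the wire a large request would then look like:

    {"chunk-stream": null}
    <first max_chunk-1 characters of the JSON message>
    <next slice>
    ...
    <empty line>

handle_chunk() joins the slices, re-parses the JSON and hands the result
back to dispatch_message(), rejecting nested 'chunk-stream' messages.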