Squashed 'yocto-poky/' content from commit ea562de
git-subtree-dir: yocto-poky
git-subtree-split: ea562de57590c966cd5a75fda8defecd397e6436
diff --git a/bitbake/AUTHORS b/bitbake/AUTHORS
new file mode 100644
index 0000000..91fd78f
--- /dev/null
+++ b/bitbake/AUTHORS
@@ -0,0 +1,10 @@
+Tim Ansell <mithro@mithis.net>
+Phil Blundell <pb@handhelds.org>
+Seb Frankengul <seb@frankengul.org>
+Holger Freyther <holger@moiji-mobile.com>
+Marcin Juszkiewicz <marcin@juszkiewicz.com.pl>
+Chris Larson <kergoth@handhelds.org>
+Ulrich Luckas <luckas@musoft.de>
+Mickey Lauer <mickey@Vanille.de>
+Richard Purdie <rpurdie@rpsys.net>
+Holger Schurig <holgerschurig@gmx.de>
diff --git a/bitbake/COPYING b/bitbake/COPYING
new file mode 100644
index 0000000..d511905
--- /dev/null
+++ b/bitbake/COPYING
@@ -0,0 +1,339 @@
+ GNU GENERAL PUBLIC LICENSE
+ Version 2, June 1991
+
+ Copyright (C) 1989, 1991 Free Software Foundation, Inc.,
+ 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ Everyone is permitted to copy and distribute verbatim copies
+ of this license document, but changing it is not allowed.
+
+ Preamble
+
+ The licenses for most software are designed to take away your
+freedom to share and change it. By contrast, the GNU General Public
+License is intended to guarantee your freedom to share and change free
+software--to make sure the software is free for all its users. This
+General Public License applies to most of the Free Software
+Foundation's software and to any other program whose authors commit to
+using it. (Some other Free Software Foundation software is covered by
+the GNU Lesser General Public License instead.) You can apply it to
+your programs, too.
+
+ When we speak of free software, we are referring to freedom, not
+price. Our General Public Licenses are designed to make sure that you
+have the freedom to distribute copies of free software (and charge for
+this service if you wish), that you receive source code or can get it
+if you want it, that you can change the software or use pieces of it
+in new free programs; and that you know you can do these things.
+
+ To protect your rights, we need to make restrictions that forbid
+anyone to deny you these rights or to ask you to surrender the rights.
+These restrictions translate to certain responsibilities for you if you
+distribute copies of the software, or if you modify it.
+
+ For example, if you distribute copies of such a program, whether
+gratis or for a fee, you must give the recipients all the rights that
+you have. You must make sure that they, too, receive or can get the
+source code. And you must show them these terms so they know their
+rights.
+
+ We protect your rights with two steps: (1) copyright the software, and
+(2) offer you this license which gives you legal permission to copy,
+distribute and/or modify the software.
+
+ Also, for each author's protection and ours, we want to make certain
+that everyone understands that there is no warranty for this free
+software. If the software is modified by someone else and passed on, we
+want its recipients to know that what they have is not the original, so
+that any problems introduced by others will not reflect on the original
+authors' reputations.
+
+ Finally, any free program is threatened constantly by software
+patents. We wish to avoid the danger that redistributors of a free
+program will individually obtain patent licenses, in effect making the
+program proprietary. To prevent this, we have made it clear that any
+patent must be licensed for everyone's free use or not licensed at all.
+
+ The precise terms and conditions for copying, distribution and
+modification follow.
+
+ GNU GENERAL PUBLIC LICENSE
+ TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
+
+ 0. This License applies to any program or other work which contains
+a notice placed by the copyright holder saying it may be distributed
+under the terms of this General Public License. The "Program", below,
+refers to any such program or work, and a "work based on the Program"
+means either the Program or any derivative work under copyright law:
+that is to say, a work containing the Program or a portion of it,
+either verbatim or with modifications and/or translated into another
+language. (Hereinafter, translation is included without limitation in
+the term "modification".) Each licensee is addressed as "you".
+
+Activities other than copying, distribution and modification are not
+covered by this License; they are outside its scope. The act of
+running the Program is not restricted, and the output from the Program
+is covered only if its contents constitute a work based on the
+Program (independent of having been made by running the Program).
+Whether that is true depends on what the Program does.
+
+ 1. You may copy and distribute verbatim copies of the Program's
+source code as you receive it, in any medium, provided that you
+conspicuously and appropriately publish on each copy an appropriate
+copyright notice and disclaimer of warranty; keep intact all the
+notices that refer to this License and to the absence of any warranty;
+and give any other recipients of the Program a copy of this License
+along with the Program.
+
+You may charge a fee for the physical act of transferring a copy, and
+you may at your option offer warranty protection in exchange for a fee.
+
+ 2. You may modify your copy or copies of the Program or any portion
+of it, thus forming a work based on the Program, and copy and
+distribute such modifications or work under the terms of Section 1
+above, provided that you also meet all of these conditions:
+
+ a) You must cause the modified files to carry prominent notices
+ stating that you changed the files and the date of any change.
+
+ b) You must cause any work that you distribute or publish, that in
+ whole or in part contains or is derived from the Program or any
+ part thereof, to be licensed as a whole at no charge to all third
+ parties under the terms of this License.
+
+ c) If the modified program normally reads commands interactively
+ when run, you must cause it, when started running for such
+ interactive use in the most ordinary way, to print or display an
+ announcement including an appropriate copyright notice and a
+ notice that there is no warranty (or else, saying that you provide
+ a warranty) and that users may redistribute the program under
+ these conditions, and telling the user how to view a copy of this
+ License. (Exception: if the Program itself is interactive but
+ does not normally print such an announcement, your work based on
+ the Program is not required to print an announcement.)
+
+These requirements apply to the modified work as a whole. If
+identifiable sections of that work are not derived from the Program,
+and can be reasonably considered independent and separate works in
+themselves, then this License, and its terms, do not apply to those
+sections when you distribute them as separate works. But when you
+distribute the same sections as part of a whole which is a work based
+on the Program, the distribution of the whole must be on the terms of
+this License, whose permissions for other licensees extend to the
+entire whole, and thus to each and every part regardless of who wrote it.
+
+Thus, it is not the intent of this section to claim rights or contest
+your rights to work written entirely by you; rather, the intent is to
+exercise the right to control the distribution of derivative or
+collective works based on the Program.
+
+In addition, mere aggregation of another work not based on the Program
+with the Program (or with a work based on the Program) on a volume of
+a storage or distribution medium does not bring the other work under
+the scope of this License.
+
+ 3. You may copy and distribute the Program (or a work based on it,
+under Section 2) in object code or executable form under the terms of
+Sections 1 and 2 above provided that you also do one of the following:
+
+ a) Accompany it with the complete corresponding machine-readable
+ source code, which must be distributed under the terms of Sections
+ 1 and 2 above on a medium customarily used for software interchange; or,
+
+ b) Accompany it with a written offer, valid for at least three
+ years, to give any third party, for a charge no more than your
+ cost of physically performing source distribution, a complete
+ machine-readable copy of the corresponding source code, to be
+ distributed under the terms of Sections 1 and 2 above on a medium
+ customarily used for software interchange; or,
+
+ c) Accompany it with the information you received as to the offer
+ to distribute corresponding source code. (This alternative is
+ allowed only for noncommercial distribution and only if you
+ received the program in object code or executable form with such
+ an offer, in accord with Subsection b above.)
+
+The source code for a work means the preferred form of the work for
+making modifications to it. For an executable work, complete source
+code means all the source code for all modules it contains, plus any
+associated interface definition files, plus the scripts used to
+control compilation and installation of the executable. However, as a
+special exception, the source code distributed need not include
+anything that is normally distributed (in either source or binary
+form) with the major components (compiler, kernel, and so on) of the
+operating system on which the executable runs, unless that component
+itself accompanies the executable.
+
+If distribution of executable or object code is made by offering
+access to copy from a designated place, then offering equivalent
+access to copy the source code from the same place counts as
+distribution of the source code, even though third parties are not
+compelled to copy the source along with the object code.
+
+ 4. You may not copy, modify, sublicense, or distribute the Program
+except as expressly provided under this License. Any attempt
+otherwise to copy, modify, sublicense or distribute the Program is
+void, and will automatically terminate your rights under this License.
+However, parties who have received copies, or rights, from you under
+this License will not have their licenses terminated so long as such
+parties remain in full compliance.
+
+ 5. You are not required to accept this License, since you have not
+signed it. However, nothing else grants you permission to modify or
+distribute the Program or its derivative works. These actions are
+prohibited by law if you do not accept this License. Therefore, by
+modifying or distributing the Program (or any work based on the
+Program), you indicate your acceptance of this License to do so, and
+all its terms and conditions for copying, distributing or modifying
+the Program or works based on it.
+
+ 6. Each time you redistribute the Program (or any work based on the
+Program), the recipient automatically receives a license from the
+original licensor to copy, distribute or modify the Program subject to
+these terms and conditions. You may not impose any further
+restrictions on the recipients' exercise of the rights granted herein.
+You are not responsible for enforcing compliance by third parties to
+this License.
+
+ 7. If, as a consequence of a court judgment or allegation of patent
+infringement or for any other reason (not limited to patent issues),
+conditions are imposed on you (whether by court order, agreement or
+otherwise) that contradict the conditions of this License, they do not
+excuse you from the conditions of this License. If you cannot
+distribute so as to satisfy simultaneously your obligations under this
+License and any other pertinent obligations, then as a consequence you
+may not distribute the Program at all. For example, if a patent
+license would not permit royalty-free redistribution of the Program by
+all those who receive copies directly or indirectly through you, then
+the only way you could satisfy both it and this License would be to
+refrain entirely from distribution of the Program.
+
+If any portion of this section is held invalid or unenforceable under
+any particular circumstance, the balance of the section is intended to
+apply and the section as a whole is intended to apply in other
+circumstances.
+
+It is not the purpose of this section to induce you to infringe any
+patents or other property right claims or to contest validity of any
+such claims; this section has the sole purpose of protecting the
+integrity of the free software distribution system, which is
+implemented by public license practices. Many people have made
+generous contributions to the wide range of software distributed
+through that system in reliance on consistent application of that
+system; it is up to the author/donor to decide if he or she is willing
+to distribute software through any other system and a licensee cannot
+impose that choice.
+
+This section is intended to make thoroughly clear what is believed to
+be a consequence of the rest of this License.
+
+ 8. If the distribution and/or use of the Program is restricted in
+certain countries either by patents or by copyrighted interfaces, the
+original copyright holder who places the Program under this License
+may add an explicit geographical distribution limitation excluding
+those countries, so that distribution is permitted only in or among
+countries not thus excluded. In such case, this License incorporates
+the limitation as if written in the body of this License.
+
+ 9. The Free Software Foundation may publish revised and/or new versions
+of the General Public License from time to time. Such new versions will
+be similar in spirit to the present version, but may differ in detail to
+address new problems or concerns.
+
+Each version is given a distinguishing version number. If the Program
+specifies a version number of this License which applies to it and "any
+later version", you have the option of following the terms and conditions
+either of that version or of any later version published by the Free
+Software Foundation. If the Program does not specify a version number of
+this License, you may choose any version ever published by the Free Software
+Foundation.
+
+ 10. If you wish to incorporate parts of the Program into other free
+programs whose distribution conditions are different, write to the author
+to ask for permission. For software which is copyrighted by the Free
+Software Foundation, write to the Free Software Foundation; we sometimes
+make exceptions for this. Our decision will be guided by the two goals
+of preserving the free status of all derivatives of our free software and
+of promoting the sharing and reuse of software generally.
+
+ NO WARRANTY
+
+ 11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
+FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
+OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
+PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
+OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
+MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
+TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
+PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
+REPAIR OR CORRECTION.
+
+ 12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
+WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
+REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
+INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
+OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
+TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
+YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
+PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
+POSSIBILITY OF SUCH DAMAGES.
+
+ END OF TERMS AND CONDITIONS
+
+ How to Apply These Terms to Your New Programs
+
+ If you develop a new program, and you want it to be of the greatest
+possible use to the public, the best way to achieve this is to make it
+free software which everyone can redistribute and change under these terms.
+
+ To do so, attach the following notices to the program. It is safest
+to attach them to the start of each source file to most effectively
+convey the exclusion of warranty; and each file should have at least
+the "copyright" line and a pointer to where the full notice is found.
+
+ <one line to give the program's name and a brief idea of what it does.>
+ Copyright (C) <year> <name of author>
+
+ This program is free software; you can redistribute it and/or modify
+ it under the terms of the GNU General Public License as published by
+ the Free Software Foundation; either version 2 of the License, or
+ (at your option) any later version.
+
+ This program is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+ GNU General Public License for more details.
+
+ You should have received a copy of the GNU General Public License along
+ with this program; if not, write to the Free Software Foundation, Inc.,
+ 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+Also add information on how to contact you by electronic and paper mail.
+
+If the program is interactive, make it output a short notice like this
+when it starts in an interactive mode:
+
+ Gnomovision version 69, Copyright (C) year name of author
+ Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
+ This is free software, and you are welcome to redistribute it
+ under certain conditions; type `show c' for details.
+
+The hypothetical commands `show w' and `show c' should show the appropriate
+parts of the General Public License. Of course, the commands you use may
+be called something other than `show w' and `show c'; they could even be
+mouse-clicks or menu items--whatever suits your program.
+
+You should also get your employer (if you work as a programmer) or your
+school, if any, to sign a "copyright disclaimer" for the program, if
+necessary. Here is a sample; alter the names:
+
+ Yoyodyne, Inc., hereby disclaims all copyright interest in the program
+ `Gnomovision' (which makes passes at compilers) written by James Hacker.
+
+ <signature of Ty Coon>, 1 April 1989
+ Ty Coon, President of Vice
+
+This General Public License does not permit incorporating your program into
+proprietary programs. If your program is a subroutine library, you may
+consider it more useful to permit linking proprietary applications with the
+library. If this is what you want to do, use the GNU Lesser General
+Public License instead of this License.
diff --git a/bitbake/ChangeLog b/bitbake/ChangeLog
new file mode 100644
index 0000000..4ac2a64
--- /dev/null
+++ b/bitbake/ChangeLog
@@ -0,0 +1,317 @@
+Changes in Bitbake 1.9.x:
+ - Add PE (Package Epoch) support from Philipp Zabel (pH5)
+ - Treat python functions the same as shell functions for logging
+ - Use TMPDIR/anonfunc as a __anonfunc temp directory (T)
+ - Catch truncated cache file errors
+ - Allow operations other than assignment on flag variables
+ - Add code to handle inter-task dependencies
+ - Fix cache errors when generating dotGraphs
+ - Make sure __inherit_cache is updated before calling include() (from Michael Krelin)
+ - Fix bug when target was in ASSUME_PROVIDED (#2236)
+ - Raise ParseError for filenames with multiple underscores instead of infinitely looping (#2062)
+ - Fix invalid regexp in BBMASK error handling (missing import) (#1124)
+ - Promote certain warnings from debug to note 2 level
+ - Update manual
+ - Correctly redirect stdin when forking
+ - If parsing errors are found, exit, too many users miss the errors
+ - Remove spurious PREFERRED_PROVIDER warnings
+ - svn fetcher: Add _buildsvncommand function
+ - Improve certain error messages
+ - Rewrite svn fetcher to make adding extra operations easier
+ as part of future SRCDATE="now" fixes
+ (requires new FETCHCMD_svn definition in bitbake.conf)
+ - Change SVNDIR layout to be more unique (fixes #2644 and #2624)
+ - Add ConfigParsed Event after configuration parsing is complete
+ - Add SRCREV support for svn fetcher
+ - data.emit_var() - only call getVar if we need the variable
+ - Stop generating the A variable (seems to be legacy code)
+ - Make sure intertask depends get processed correctly in recursive depends
+ - Add pn-PN to overrides when evaluating PREFERRED_VERSION
+ - Improve the progress indicator by skipping tasks that have
+ already run before starting the build rather than during it
+ - Add profiling option (-P)
+ - Add BB_SRCREV_POLICY variable (clear or cache) to control SRCREV cache
+ - Add SRCREV_FORMAT support
+ - Fix local fetcher's localpath return values
+ - Apply OVERRIDES before performing immediate expansions
+ - Allow the -b -e option combination to take regular expressions
+ - Fix handling of variables with expansion in the name using _append/_prepend
+ e.g. RRECOMMENDS_${PN}_append_xyz = "abc"
+ - Add plain message function to bb.msg
+ - Sort the list of providers before processing so dependency problems are
+ reproducible rather than effectively random
+ - Fix/improve bitbake -s output
+ - Add locking for fetchers so only one tries to fetch a given file at a given time
+ - Fix int(0)/None confusion in runqueue.py which causes random gaps in dependency chains
+ - Expand data in addtasks
+ - Print the list of missing DEPENDS, RDEPENDS for the "No buildable providers available for required...."
+ error message.
+ - Rework add_task to be more efficient (6% speedup, 7% reduction in the number of function calls)
+ - Sort digraph output to make builds more reproducible
+ - Split expandKeys into two for loops to benefit from the expand_cache (12% speedup)
+ - runqueue.py: Fix idepends handling to avoid dependency errors
+ - Clear the terminal TOSTOP flag if set (and warn the user)
+ - Fix regression from r653 and make SRCDATE/CVSDATE work for packages again
+ - Fix a bug in bb.decodeurl where http://some.where.com/somefile.tgz decoded to host="" (#1530)
+ - Warn about malformed PREFERRED_PROVIDERS (#1072)
+ - Add support for BB_NICE_LEVEL option (#1627)
+ - Psyco is used only on x86 as there is no support for other architectures.
+ - Sort initial providers list by default preference (#1145, #2024)
+ - Improve provider sorting so preferred versions have preference over latest versions (#768)
+ - Detect builds of tasks with overlapping providers and warn (will become a fatal error) (#1359)
+ - Add MULTI_PROVIDER_WHITELIST variable to allow known safe multiple providers to be listed
+ - Handle paths in svn fetcher module parameter
+ - Support the syntax "export VARIABLE"
+ - Add bzr fetcher
+ - Add support for cleaning directories before a task in the form:
+ do_taskname[cleandirs] = "dir"
+ - bzr fetcher tweaks from Robert Schuster (#2913)
+ - Add mercurial (hg) fetcher from Robert Schuster (#2913)
+ - Don't add duplicates to BBPATH
+ - Fix preferred_version return values (providers.py)
+ - Fix 'depends' flag splitting
+ - Fix unexport handling (#3135)
+ - Add bb.copyfile function similar to bb.movefile (and improve movefile error reporting)
+ - Allow multiple options for deptask flag
+ - Use git-fetch instead of git-pull removing any need for merges when
+ fetching (we don't care about the index). Fixes fetch errors.
+ - Add BB_GENERATE_MIRROR_TARBALLS option, set to 0 to make git fetches
+ faster at the expense of not creating mirror tarballs.
+ - SRCREV handling updates, improvements and fixes from Poky
+ - Add bb.utils.lockfile() and bb.utils.unlockfile() from Poky
+ - Add support for task selfstamp and lockfiles flags
+ - Disable task number acceleration since it can allow the tasks to run
+ out of sequence
+ - Improve runqueue code comments
+ - Add task scheduler abstraction and some example schedulers
+ - Improve circular dependency chain debugging code and user feedback
+ - Don't give a stacktrace for invalid tasks, have a user friendly message (#3431)
+ - Add support for "-e target" (#3432)
+ - Fix shell showdata command (#3259)
+ - Fix shell data updating problems (#1880)
+ - Properly raise errors for invalid source URI protocols
+ - Change the wget fetcher failure handling to avoid lockfile problems
+ - Add support for branches in git fetcher (Otavio Salvador, Michael Lauer)
+ - Make taskdata and runqueue errors more user friendly
+ - Add norecurse and fullpath options to cvs fetcher
+ - Fix exit code for build failures in --continue mode
+ - Fix git branch tags fetching
+ - Change parseConfigurationFile so it works on real data, not a copy
+ - Handle 'base' inherit and all other INHERITs from parseConfigurationFile
+ instead of BBHandler
+ - Fix getVarFlags bug in data_smart
+ - Optimise cache handling by more quickly detecting an invalid cache, only
+ saving the cache when its changed, moving the cache validity check into
+ the parsing loop and factoring some getVar calls outside a for loop
+ - Cooker: Remove a debug message from the parsing loop to lower overhead
+ - Convert build.py exec_task to use getVarFlags
+ - Update shell to use cooker.buildFile
+ - Add StampUpdate event
+ - Convert -b option to use taskdata/runqueue
+ - Remove digraph and switch to new stamp checking code. exec_task no longer
+ honours dependencies
+ - Make fetcher timestamp updating non-fatal when permissions don't allow
+ updates
+ - Add BB_SCHEDULER variable/option ("completion" or "speed") controlling
+ the way bitbake schedules tasks
+ - Add BB_STAMP_POLICY variable/option ("perfile" or "full") controlling
+ how extensively stamps are looked at for validity
+ - When handling build target failures make sure idepends are checked and
+ failed where needed. Fixes --continue mode crashes.
+ - Fix -f (force) in conjunction with -b
+ - Fix problems with recrdeptask handling where some idepends weren't handled
+ correctly.
+ - Handle exit codes correctly (from pH5)
+ - Work around refs/HEAD issues with git over http (#3410)
+ - Add proxy support to the CVS fetcher (from Cyril Chemparathy)
+ - Improve runfetchcmd so errors are seen and various GIT variables are exported
+ - Add ability to fetchers to check URL validity without downloading
+ - Improve runtime PREFERRED_PROVIDERS warning message
+ - Add BB_STAMP_WHITELIST option which contains a list of stamps to ignore when
+ checking stamp dependencies and using a BB_STAMP_POLICY of "whitelist"
+ - No longer weight providers on the basis of a package being "already staged", since
+   this led to non-deterministic builds.
+ - Flush stdout/stderr before forking to fix duplicate console output
+ - Make sure recrdeps tasks include all inter-task dependencies of a given fn
+ - Add bb.runqueue.check_stamp_fn() for use by packaged-staging
+ - Add PERSISTENT_DIR to store the PersistData in a persistent
+ directory != the cache dir.
+ - Add md5 and sha256 checksum generation functions to utils.py
+ - Correctly handle '-' characters in class names (#2958)
+ - Make sure expandKeys has been called on the data dictionary before running tasks
+ - Correctly add a task override in the form task-TASKNAME.
+ - Revert the '-' character fix in class names since it breaks things
+ - When a regexp fails to compile for PACKAGES_DYNAMIC, print a more useful error (#4444)
+ - Allow checking out CVS by date and time; just add HHmm to the SRCDATE.
+ - Move prunedir function to utils.py and add explode_dep_versions function
+ - Raise an exception if SRCREV == 'INVALID'
+ - Fix hg fetcher username/password handling and fix crash
+ - Fix PACKAGES_DYNAMIC handling of packages with '++' in the name
+ - Rename __depends to __base_depends after configuration parsing so we don't
+ recheck the validity of the config files time after time
+ - Add better environment variable handling. By default it will now only pass certain
+   whitelisted variables into the data store. If BB_PRESERVE_ENV is set, bitbake will use
+   all variables from the environment. If BB_ENV_WHITELIST is set, that whitelist will be
+   used instead of the internal bitbake one. Alternatively, BB_ENV_EXTRAWHITE can be used
+   to extend the internal whitelist.
+ - Perforce fetcher fix to use command-line options instead of being overridden by the environment
+ - bb.utils.prunedir can cope with symlinks to directories without exceptions
+ - Use @rev when doing an svn checkout
+ - Add osc fetcher (from Joshua Lock in Poky)
+ - When SRCREV autorevisioning for a recipe is in use, don't cache the recipe
+ - Add tryaltconfigs option to control whether bitbake tries using alternative providers
+   to fulfil failed dependencies. It now defaults to off, since this behaviour confuses
+   many users and isn't often useful.
+ - Improve lock file function error handling
+ - Add username handling to the git fetcher (Robert Bragg)
+ - Add support for HTTP_PROXY and HTTP_PROXY_IGNORE variables to the wget fetcher
+ - Export more variables to the fetcher commands to allow ssh checkouts and checkouts through
+ proxies to work better. (from Poky)
+ - Also allow user and pswd options in SRC_URIs globally (from Poky)
+ - Improve proxy handling when using mirrors (from Poky)
+ - Add bb.utils.prune_suffix function
+ - Fix hg checkouts of specific revisions (from Poky)
+ - Fix wget fetching of urls with parameters specified (from Poky)
+ - Add username handling to git fetcher (from Poky)
+ - Set HOME environmental variable when running fetcher commands (from Poky)
+ - Make sure allowed variables inherited from the environment are exported again (from Poky)
+ - When running a stage task in bbshell, run populate_staging, not the stage task (from Poky)
+ - Fix + character escaping from PACKAGES_DYNAMIC (thanks Otavio Salvador)
+ - Addition of BBCLASSEXTEND support for allowing one recipe to provide multiple targets (from Poky)
+
+Changes in Bitbake 1.8.0:
+ - Release 1.7.x as a stable series
+
+Changes in BitBake 1.7.x:
+ - Major updates of the dependency handling and execution
+ of tasks. Code from bin/bitbake replaced with runqueue.py
+ and taskdata.py
+ - New task execution code supports multithreading with a simplistic
+ threading algorithm controlled by BB_NUMBER_THREADS
+ - Change of the SVN Fetcher to keep the checkout around
+   courtesy of Paul Sokolovsky (#1367)
+ - PATH fix to bbimage (#1108)
+ - Allow debug domains to be specified on the commandline (-l)
+ - Allow 'interactive' tasks
+ - Logging message improvements
+ - Drop now unneeded BUILD_ALL_DEPS variable
+ - Add support for wildcards to -b option
+ - Major overhaul of the fetchers making a large amount of code common
+ including mirroring code
+ - Fetchers now touch md5 stamps upon access (to show activity)
+ - Fix -f force option when used without -b (long standing bug)
+ - Add expand_cache to data_cache.py, caching expanded data (speedup)
+ - Allow version field in DEPENDS (ignored for now)
+ - Add abort flag support to the shell
+ - Make inherit fail if the class doesn't exist (#1478)
+ - Fix data.emit_env() to expand keynames as well as values
+ - Add ssh fetcher
+ - Add perforce fetcher
+ - Make PREFERRED_PROVIDER_foobar default to foobar if available
+ - Share the parser's mtime_cache, reducing the number of stat syscalls
+ - Compile all anonfuncs at once!
+ *** Anonfuncs must now use common spacing format ***
+ - Memorise the list of handlers in __BBHANDLERS and tasks in __BBTASKS.
+   This removes 2 million function calls, resulting in a 5-10% speedup
+ - Add manpage
+ - Update generateDotGraph to use taskData/runQueue improving accuracy
+ and also adding a task dependency graph
+ - Fix/standardise on GPLv2 licence
+ - Move most functionality from bin/bitbake to cooker.py and split into
+   separate functions
+ - CVS fetcher: Added support for non-default port
+ - Add BBINCLUDELOGS_LINES, the number of lines to read from any logfile
+ - Drop shebangs from lib/bb scripts
+
+Changes in Bitbake 1.6.0:
+ - Better msg handling
+ - COW dict implementation from Tim Ansell (mithro) leading
+ to better performance
+ - Speed up of -s
+
+Changes in Bitbake 1.4.4:
+ - SRCDATE handling now courtesy of Justin Patrin
+ - #1017 fix to work with rm_work
+
+Changes in BitBake 1.4.2:
+ - Send logs to oe.pastebin.com instead of pastebin.com
+ fixes #856
+ - Copy the internal bitbake data before building the
+ dependency graph. This fixes nano not having a
+ virtual/libc dependency
+ - Allow multiple TARBALL_STASH entries
+ - Cache, check if the directory exists before changing
+ into it
+ - Speed up git cloning by not doing a checkout
+ - Allow spaces in filenames (.conf, .bb, .bbclass)
+
+Changes in BitBake 1.4.0:
+ - Fix to check both RDEPENDS and RDEPENDS_${PN}
+ - Fix a RDEPENDS parsing bug in utils:explode_deps()
+ - Update git fetcher behaviour to match git changes
+ - ASSUME_PROVIDED allowed to include runtime packages
+ - git fetcher cleanup and efficiency improvements
+ - Change the format of the cache
+ - Update usermanual to document the Fetchers
+ - Major changes to caching with a new strategy
+ giving a major performance increase when reparsing
+ with few data changes
+
+Changes in BitBake 1.3.3:
+ - Create a new Fetcher module to ease the
+ development of new Fetchers.
+ Issue #438 fixed by rpurdie@openedhand.com
+ - Make the Subversion fetcher honor the SRC Date
+ (CVSDATE).
+ Issue #555 fixed by chris@openedhand.com
+ - Expand PREFERRED_PROVIDER properly
+ Issue #436 fixed by rprudie@openedhand.com
+ - Typo fix for Issue #531 by Philipp Zabel for the
+ BitBake Shell
+ - Introduce a new special variable SRCDATE as
+ a generic naming to replace CVSDATE.
+ - Introduce a new keyword 'required'. In contrast
+   to 'include', parsing will fail if a file to be
+   included cannot be found.
+ - Remove hardcoding of the STAMP directory. Patch
+   courtesy of Philipp Zabel
+ - Track the RDEPENDS of each package (rpurdie@openedhand.com)
+ - Introduce BUILD_ALL_DEPS to build all RDEPENDS. E.g.
+   this is used by the OpenEmbedded Meta Packages.
+ (rpurdie@openedhand.com).
+
+Changes in BitBake 1.3.2:
+ - reintegration of make.py into BitBake
+ - bbread is gone, use bitbake -e
+ - lots of shell updates and bugfixes
+ - Introduction of the .= and =. operator
+ - Sort variables, keys and groups in bitdoc
+ - Fix regression in the handling of BBCOLLECTIONS
+ - Update the bitbake usermanual
+
+Changes in BitBake 1.3.0:
+ - add bitbake interactive shell (bitbake -i)
+ - refactor bitbake utility in OO style
+ - kill default arguments in methods in the bb.data module
+ - kill default arguments in methods in the bb.fetch module
+ - the http/https/ftp fetcher will fail if the file to
+   be downloaded was not found in DL_DIR (this is needed
+   to avoid unpacking the sourceforge mirror page)
+ - Switch to a COW-like data instance for persistent and
+   non-persisting mode (called data_smart.py)
+ - Changed the callback of bb.make.collect_bbfiles to carry
+ additional parameters
+ - Drastically reduced the amount of needed RAM by not holding
+ each data instance in memory when using a cache/persistent
+ storage
+
+Changes in BitBake 1.2.1:
+ The 1.2.1 release is meant as an intermediate release to lay the
+ ground for more radical changes. The most notable changes are:
+
+ - Do not hardcode {}, use bb.data.init() instead if you want to
+   get an instance of a data class
+ - bb.data.init() is a factory and the old bb.data methods are delegates
+ - Do not use deepcopy; use bb.data.createCopy() instead.
+ - Removed default arguments in bb.fetch
+
diff --git a/bitbake/HEADER b/bitbake/HEADER
new file mode 100644
index 0000000..9859255
--- /dev/null
+++ b/bitbake/HEADER
@@ -0,0 +1,19 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# <one line to give the program's name and a brief idea of what it does.>
+# Copyright (C) <year> <name of author>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
diff --git a/bitbake/LICENSE b/bitbake/LICENSE
new file mode 100644
index 0000000..350140c
--- /dev/null
+++ b/bitbake/LICENSE
@@ -0,0 +1,10 @@
+BitBake is licensed under the GNU General Public License version 2.0. See COPYING for further details.
+
+The following external components are distributed with this software:
+
+* The Toaster Simple UI application is based upon the Django project template, the files of which are covered by the BSD license and are copyright (c) Django Software
+Foundation and individual contributors.
+
+* Twitter Bootstrap (including Glyphicons), redistributed under the Apache License 2.0.
+
+* jQuery is redistributed under the MIT license.
diff --git a/bitbake/bin/bitbake b/bitbake/bin/bitbake
new file mode 100755
index 0000000..e3d138b
--- /dev/null
+++ b/bitbake/bin/bitbake
@@ -0,0 +1,53 @@
+#!/usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2003, 2004 Phil Blundell
+# Copyright (C) 2003 - 2005 Michael 'Mickey' Lauer
+# Copyright (C) 2005 Holger Hans Peter Freyther
+# Copyright (C) 2005 ROAD GmbH
+# Copyright (C) 2006 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os
+import sys
+
+sys.path.insert(0, os.path.join(os.path.dirname(os.path.dirname(__file__)),
+ 'lib'))
+try:
+ import bb
+except RuntimeError as exc:
+ sys.exit(str(exc))
+
+from bb import cookerdata
+from bb.main import bitbake_main, BitBakeConfigParameters, BBMainException
+
+__version__ = "1.27.1"
+
+if __name__ == "__main__":
+ if __version__ != bb.__version__:
+ sys.exit("Bitbake core version and program version mismatch!")
+ try:
+ sys.exit(bitbake_main(BitBakeConfigParameters(sys.argv),
+ cookerdata.CookerConfiguration()))
+ except BBMainException as err:
+ sys.exit(err)
+ except bb.BBHandledException:
+ sys.exit(1)
+ except Exception:
+ import traceback
+ traceback.print_exc()
+ sys.exit(1)
diff --git a/bitbake/bin/bitbake-diffsigs b/bitbake/bin/bitbake-diffsigs
new file mode 100755
index 0000000..196f0b7
--- /dev/null
+++ b/bitbake/bin/bitbake-diffsigs
@@ -0,0 +1,138 @@
+#!/usr/bin/env python
+
+# bitbake-diffsigs
+# BitBake task signature data comparison utility
+#
+# Copyright (C) 2012-2013 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os
+import sys
+import warnings
+import fnmatch
+import optparse
+import logging
+
+sys.path.insert(0, os.path.join(os.path.dirname(os.path.dirname(sys.argv[0])), 'lib'))
+
+import bb.tinfoil
+import bb.siggen
+
+def logger_create(name, output=sys.stderr):
+ logger = logging.getLogger(name)
+ console = logging.StreamHandler(output)
+ format = bb.msg.BBLogFormatter("%(levelname)s: %(message)s")
+ if output.isatty():
+ format.enable_color()
+ console.setFormatter(format)
+ logger.addHandler(console)
+ logger.setLevel(logging.INFO)
+ return logger
+
+logger = logger_create('bitbake-diffsigs')
+
+def find_compare_task(bbhandler, pn, taskname):
+ """ Find the most recent signature files for the specified PN/task and compare them """
+
+ def get_hashval(siginfo):
+ if siginfo.endswith('.siginfo'):
+ return siginfo.rpartition(':')[2].partition('_')[0]
+ else:
+ return siginfo.rpartition('.')[2]
+
+ if not hasattr(bb.siggen, 'find_siginfo'):
+ logger.error('Metadata does not support finding signature data files')
+ sys.exit(1)
+
+ if not taskname.startswith('do_'):
+ taskname = 'do_%s' % taskname
+
+ filedates = bb.siggen.find_siginfo(pn, taskname, None, bbhandler.config_data)
+ latestfiles = sorted(filedates.keys(), key=lambda f: filedates[f])[-3:]
+ if not latestfiles:
+ logger.error('No sigdata files found matching %s %s' % (pn, taskname))
+ sys.exit(1)
+ elif len(latestfiles) < 2:
+ logger.error('Only one matching sigdata file found for the specified task (%s %s)' % (pn, taskname))
+ sys.exit(1)
+ else:
+        # latestfiles may contain three elements where the first two have the same hash value;
+        # in that case, drop the second element. This is actually the common case, since a
+        # sigdata file and a siginfo file can share the same hash value, and comparing two
+        # such files makes no sense.
+ if len(latestfiles) == 3:
+ hash0 = get_hashval(latestfiles[0])
+ hash1 = get_hashval(latestfiles[1])
+ if hash0 == hash1:
+ latestfiles.pop(1)
+
+ # Define recursion callback
+ def recursecb(key, hash1, hash2):
+ hashes = [hash1, hash2]
+ hashfiles = bb.siggen.find_siginfo(key, None, hashes, bbhandler.config_data)
+
+ recout = []
+ if len(hashfiles) == 2:
+ out2 = bb.siggen.compare_sigfiles(hashfiles[hash1], hashfiles[hash2], recursecb)
+ recout.extend(list(' ' + l for l in out2))
+ else:
+ recout.append("Unable to find matching sigdata for %s with hashes %s or %s" % (key, hash1, hash2))
+
+ return recout
+
+ # Recurse into signature comparison
+ output = bb.siggen.compare_sigfiles(latestfiles[0], latestfiles[1], recursecb)
+ if output:
+ print '\n'.join(output)
+ sys.exit(0)
+
+
+
+parser = optparse.OptionParser(
+ description = "Compares siginfo/sigdata files written out by BitBake",
+ usage = """
+ %prog -t recipename taskname
+ %prog sigdatafile1 sigdatafile2
+ %prog sigdatafile1""")
+
+parser.add_option("-t", "--task",
+ help = "find the signature data files for last two runs of the specified task and compare them",
+ action="store", dest="taskargs", nargs=2, metavar='recipename taskname')
+
+options, args = parser.parse_args(sys.argv)
+
+if options.taskargs:
+ tinfoil = bb.tinfoil.Tinfoil()
+ tinfoil.prepare(config_only = True)
+ find_compare_task(tinfoil, options.taskargs[0], options.taskargs[1])
+else:
+ if len(args) == 1:
+ parser.print_help()
+ else:
+ import cPickle
+ try:
+ if len(args) == 2:
+ output = bb.siggen.dump_sigfile(sys.argv[1])
+ else:
+ output = bb.siggen.compare_sigfiles(sys.argv[1], sys.argv[2])
+ except IOError as e:
+ logger.error(str(e))
+ sys.exit(1)
+        except (cPickle.UnpicklingError, EOFError):
+ logger.error('Invalid signature data - ensure you are specifying sigdata/siginfo files')
+ sys.exit(1)
+
+ if output:
+ print '\n'.join(output)
diff --git a/bitbake/bin/bitbake-dumpsig b/bitbake/bin/bitbake-dumpsig
new file mode 100755
index 0000000..656d93a
--- /dev/null
+++ b/bitbake/bin/bitbake-dumpsig
@@ -0,0 +1,65 @@
+#!/usr/bin/env python
+
+# bitbake-dumpsig
+# BitBake task signature dump utility
+#
+# Copyright (C) 2013 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os
+import sys
+import warnings
+import optparse
+import logging
+
+sys.path.insert(0, os.path.join(os.path.dirname(os.path.dirname(sys.argv[0])), 'lib'))
+
+import bb.siggen
+
+def logger_create(name, output=sys.stderr):
+ logger = logging.getLogger(name)
+ console = logging.StreamHandler(output)
+ format = bb.msg.BBLogFormatter("%(levelname)s: %(message)s")
+ if output.isatty():
+ format.enable_color()
+ console.setFormatter(format)
+ logger.addHandler(console)
+ logger.setLevel(logging.INFO)
+ return logger
+
+logger = logger_create('bitbake-dumpsig')
+
+parser = optparse.OptionParser(
+ description = "Dumps siginfo/sigdata files written out by BitBake",
+ usage = """
+ %prog sigdatafile""")
+
+options, args = parser.parse_args(sys.argv)
+
+if len(args) == 1:
+ parser.print_help()
+else:
+ import cPickle
+ try:
+ output = bb.siggen.dump_sigfile(args[1])
+ except IOError as e:
+ logger.error(str(e))
+ sys.exit(1)
+    except (cPickle.UnpicklingError, EOFError):
+ logger.error('Invalid signature data - ensure you are specifying a sigdata/siginfo file')
+ sys.exit(1)
+
+ if output:
+ print '\n'.join(output)
diff --git a/bitbake/bin/bitbake-layers b/bitbake/bin/bitbake-layers
new file mode 100755
index 0000000..fb13044
--- /dev/null
+++ b/bitbake/bin/bitbake-layers
@@ -0,0 +1,1072 @@
+#!/usr/bin/env python
+
+# This script has subcommands which operate against your bitbake layers, either
+# displaying useful information about them or acting on them.
+# See the help output for details on available commands.
+
+# Copyright (C) 2011 Mentor Graphics Corporation
+# Copyright (C) 2011-2015 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import logging
+import os
+import sys
+import fnmatch
+from collections import defaultdict
+import argparse
+import re
+import httplib, urlparse, json
+import subprocess
+
+bindir = os.path.dirname(__file__)
+topdir = os.path.dirname(bindir)
+sys.path[0:0] = [os.path.join(topdir, 'lib')]
+
+import bb.cache
+import bb.cooker
+import bb.providers
+import bb.utils
+import bb.tinfoil
+
+
+def logger_create(name, output=sys.stderr):
+ logger = logging.getLogger(name)
+ console = logging.StreamHandler(output)
+ format = bb.msg.BBLogFormatter("%(levelname)s: %(message)s")
+ if output.isatty():
+ format.enable_color()
+ console.setFormatter(format)
+ logger.addHandler(console)
+ logger.setLevel(logging.INFO)
+ return logger
+
+logger = logger_create('bitbake-layers', sys.stdout)
+
+class UserError(Exception):
+ pass
+
+class Commands():
+ def __init__(self):
+ self.bbhandler = None
+ self.bblayers = []
+
+ def init_bbhandler(self, config_only = False):
+ if not self.bbhandler:
+ self.bbhandler = bb.tinfoil.Tinfoil(tracking=True)
+ self.bblayers = (self.bbhandler.config_data.getVar('BBLAYERS', True) or "").split()
+ self.bbhandler.prepare(config_only)
+ layerconfs = self.bbhandler.config_data.varhistory.get_variable_items_files('BBFILE_COLLECTIONS', self.bbhandler.config_data)
+ self.bbfile_collections = {layer: os.path.dirname(os.path.dirname(path)) for layer, path in layerconfs.iteritems()}
+
+
+ def do_show_layers(self, args):
+ """show current configured layers"""
+ self.init_bbhandler(config_only = True)
+ logger.plain("%s %s %s" % ("layer".ljust(20), "path".ljust(40), "priority"))
+ logger.plain('=' * 74)
+ for layer, _, regex, pri in self.bbhandler.cooker.recipecache.bbfile_config_priorities:
+ layerdir = self.bbfile_collections.get(layer, None)
+ layername = self.get_layer_name(layerdir)
+ logger.plain("%s %s %d" % (layername.ljust(20), layerdir.ljust(40), pri))
+
+
+ def do_add_layer(self, args):
+ """Add a layer to bblayers.conf
+
+Adds the specified layer to bblayers.conf
+"""
+ layerdir = os.path.abspath(args.layerdir)
+ if not os.path.exists(layerdir):
+ sys.stderr.write("Specified layer directory doesn't exist\n")
+ return 1
+
+ layer_conf = os.path.join(layerdir, 'conf', 'layer.conf')
+ if not os.path.exists(layer_conf):
+ sys.stderr.write("Specified layer directory doesn't contain a conf/layer.conf file\n")
+ return 1
+
+ bblayers_conf = os.path.join('conf', 'bblayers.conf')
+ if not os.path.exists(bblayers_conf):
+ sys.stderr.write("Unable to find bblayers.conf\n")
+ return 1
+
+ (notadded, _) = bb.utils.edit_bblayers_conf(bblayers_conf, layerdir, None)
+ if notadded:
+ for item in notadded:
+ sys.stderr.write("Specified layer %s is already in BBLAYERS\n" % item)
+
+
+ def do_remove_layer(self, args):
+ """Remove a layer from bblayers.conf
+
+Removes the specified layer from bblayers.conf
+"""
+ bblayers_conf = os.path.join('conf', 'bblayers.conf')
+ if not os.path.exists(bblayers_conf):
+ sys.stderr.write("Unable to find bblayers.conf\n")
+ return 1
+
+ if args.layerdir.startswith('*'):
+ layerdir = args.layerdir
+ elif not '/' in args.layerdir:
+ layerdir = '*/%s' % args.layerdir
+ else:
+ layerdir = os.path.abspath(args.layerdir)
+ (_, notremoved) = bb.utils.edit_bblayers_conf(bblayers_conf, None, layerdir)
+ if notremoved:
+ for item in notremoved:
+ sys.stderr.write("No layers matching %s found in BBLAYERS\n" % item)
+ return 1
+
+
+ def get_json_data(self, apiurl):
+ proxy_settings = os.environ.get("http_proxy", None)
+ conn = None
+ _parsedurl = urlparse.urlparse(apiurl)
+ path = _parsedurl.path
+ query = _parsedurl.query
+ def parse_url(url):
+ parsedurl = urlparse.urlparse(url)
+ if parsedurl.netloc[0] == '[':
+ host, port = parsedurl.netloc[1:].split(']', 1)
+ if ':' in port:
+ port = port.rsplit(':', 1)[1]
+ else:
+ port = None
+ else:
+ if parsedurl.netloc.count(':') == 1:
+ (host, port) = parsedurl.netloc.split(":")
+ else:
+ host = parsedurl.netloc
+ port = None
+ return (host, 80 if port is None else int(port))
+
+ if proxy_settings is None:
+ host, port = parse_url(apiurl)
+ conn = httplib.HTTPConnection(host, port)
+ conn.request("GET", path + "?" + query)
+ else:
+ host, port = parse_url(proxy_settings)
+ conn = httplib.HTTPConnection(host, port)
+ conn.request("GET", apiurl)
+
+ r = conn.getresponse()
+ if r.status != 200:
+ raise Exception("Failed to read " + path + ": %d %s" % (r.status, r.reason))
+ return json.loads(r.read())
+
+
+ def get_layer_deps(self, layername, layeritems, layerbranches, layerdependencies, branchnum, selfname=False):
+ def layeritems_info_id(items_name, layeritems):
+ litems_id = None
+ for li in layeritems:
+ if li['name'] == items_name:
+ litems_id = li['id']
+ break
+ return litems_id
+
+ def layerbranches_info(items_id, layerbranches):
+ lbranch = {}
+ for lb in layerbranches:
+ if lb['layer'] == items_id and lb['branch'] == branchnum:
+ lbranch['id'] = lb['id']
+ lbranch['vcs_subdir'] = lb['vcs_subdir']
+ break
+ return lbranch
+
+ def layerdependencies_info(lb_id, layerdependencies):
+ ld_deps = []
+ for ld in layerdependencies:
+ if ld['layerbranch'] == lb_id and not ld['dependency'] in ld_deps:
+ ld_deps.append(ld['dependency'])
+ if not ld_deps:
+ logger.error("The dependency of layerDependencies is not found.")
+ return ld_deps
+
+ def layeritems_info_name_subdir(items_id, layeritems):
+ litems = {}
+ for li in layeritems:
+ if li['id'] == items_id:
+ litems['vcs_url'] = li['vcs_url']
+ litems['name'] = li['name']
+ break
+ return litems
+
+ if selfname:
+ selfid = layeritems_info_id(layername, layeritems)
+ lbinfo = layerbranches_info(selfid, layerbranches)
+ if lbinfo:
+ selfsubdir = lbinfo['vcs_subdir']
+ else:
+ logger.error("%s is not found in the specified branch" % layername)
+ return
+ selfurl = layeritems_info_name_subdir(selfid, layeritems)['vcs_url']
+ if selfurl:
+ return selfurl, selfsubdir
+ else:
+ logger.error("Cannot get layer %s git repo and subdir" % layername)
+ return
+ ldict = {}
+ itemsid = layeritems_info_id(layername, layeritems)
+ if not itemsid:
+ return layername, None
+ lbid = layerbranches_info(itemsid, layerbranches)
+ if lbid:
+ lbid = layerbranches_info(itemsid, layerbranches)['id']
+ else:
+ logger.error("%s is not found in the specified branch" % layername)
+ return None, None
+ for dependency in layerdependencies_info(lbid, layerdependencies):
+ lname = layeritems_info_name_subdir(dependency, layeritems)['name']
+ lurl = layeritems_info_name_subdir(dependency, layeritems)['vcs_url']
+ lsubdir = layerbranches_info(dependency, layerbranches)['vcs_subdir']
+ ldict[lname] = lurl, lsubdir
+ return None, ldict
+
+
+ def get_fetch_layer(self, fetchdir, url, subdir, fetch_layer):
+ layername = self.get_layer_name(url)
+ if os.path.splitext(layername)[1] == '.git':
+ layername = os.path.splitext(layername)[0]
+ repodir = os.path.join(fetchdir, layername)
+ layerdir = os.path.join(repodir, subdir)
+ if not os.path.exists(repodir):
+ if fetch_layer:
+ result = subprocess.call('git clone %s %s' % (url, repodir), shell = True)
+ if result:
+ logger.error("Failed to download %s" % url)
+ return None, None
+ else:
+ return layername, layerdir
+ else:
+ logger.plain("Repository %s needs to be fetched" % url)
+ return layername, layerdir
+ elif os.path.exists(layerdir):
+ return layername, layerdir
+ else:
+ logger.error("%s is not in %s" % (url, subdir))
+ return None, None
+
+
+ def do_layerindex_fetch(self, args):
+ """Fetches a layer from a layer index along with its dependent layers, and adds them to conf/bblayers.conf.
+"""
+ self.init_bbhandler(config_only = True)
+ apiurl = self.bbhandler.config_data.getVar('BBLAYERS_LAYERINDEX_URL', True)
+ if not apiurl:
+ logger.error("Cannot get BBLAYERS_LAYERINDEX_URL")
+ return 1
+ else:
+ if apiurl[-1] != '/':
+ apiurl += '/'
+ apiurl += "api/"
+ apilinks = self.get_json_data(apiurl)
+ branches = self.get_json_data(apilinks['branches'])
+
+ branchnum = 0
+ for branch in branches:
+ if branch['name'] == args.branch:
+ branchnum = branch['id']
+ break
+ if branchnum == 0:
+ validbranches = ', '.join([branch['name'] for branch in branches])
+ logger.error('Invalid layer branch name "%s". Valid branches: %s' % (args.branch, validbranches))
+ return 1
+
+ ignore_layers = []
+ for collection in self.bbhandler.config_data.getVar('BBFILE_COLLECTIONS', True).split():
+ lname = self.bbhandler.config_data.getVar('BBLAYERS_LAYERINDEX_NAME_%s' % collection, True)
+ if lname:
+ ignore_layers.append(lname)
+
+ if args.ignore:
+ ignore_layers.extend(args.ignore.split(','))
+
+ layeritems = self.get_json_data(apilinks['layerItems'])
+ layerbranches = self.get_json_data(apilinks['layerBranches'])
+ layerdependencies = self.get_json_data(apilinks['layerDependencies'])
+ invaluenames = []
+ repourls = {}
+ printlayers = []
+ def query_dependencies(layers, layeritems, layerbranches, layerdependencies, branchnum):
+ depslayer = []
+ for layername in layers:
+ invaluename, layerdict = self.get_layer_deps(layername, layeritems, layerbranches, layerdependencies, branchnum)
+ if layerdict:
+ repourls[layername] = self.get_layer_deps(layername, layeritems, layerbranches, layerdependencies, branchnum, selfname=True)
+ for layer in layerdict:
+ if not layer in ignore_layers:
+ depslayer.append(layer)
+ printlayers.append((layername, layer, layerdict[layer][0], layerdict[layer][1]))
+ if not layer in ignore_layers and not layer in repourls:
+ repourls[layer] = (layerdict[layer][0], layerdict[layer][1])
+ if invaluename and not invaluename in invaluenames:
+ invaluenames.append(invaluename)
+ return depslayer
+
+ depslayers = query_dependencies(args.layername, layeritems, layerbranches, layerdependencies, branchnum)
+ while depslayers:
+ depslayer = query_dependencies(depslayers, layeritems, layerbranches, layerdependencies, branchnum)
+ depslayers = depslayer
+ if invaluenames:
+ for invaluename in invaluenames:
+ logger.error('Layer "%s" not found in layer index' % invaluename)
+ return 1
+ logger.plain("%s %s %s %s" % ("Layer".ljust(19), "Required by".ljust(19), "Git repository".ljust(54), "Subdirectory"))
+ logger.plain('=' * 115)
+ for layername in args.layername:
+ layerurl = repourls[layername]
+ logger.plain("%s %s %s %s" % (layername.ljust(20), '-'.ljust(20), layerurl[0].ljust(55), layerurl[1]))
+ printedlayers = []
+ for layer, dependency, gitrepo, subdirectory in printlayers:
+ if dependency in printedlayers:
+ continue
+ logger.plain("%s %s %s %s" % (dependency.ljust(20), layer.ljust(20), gitrepo.ljust(55), subdirectory))
+ printedlayers.append(dependency)
+
+ if repourls:
+ fetchdir = self.bbhandler.config_data.getVar('BBLAYERS_FETCH_DIR', True)
+ if not fetchdir:
+ logger.error("Cannot get BBLAYERS_FETCH_DIR")
+ return 1
+ if not os.path.exists(fetchdir):
+ os.makedirs(fetchdir)
+ addlayers = []
+ for repourl, subdir in repourls.values():
+ name, layerdir = self.get_fetch_layer(fetchdir, repourl, subdir, not args.show_only)
+ if not name:
+ # Error already shown
+ return 1
+ addlayers.append((subdir, name, layerdir))
+ if not args.show_only:
+ for subdir, name, layerdir in set(addlayers):
+ if os.path.exists(layerdir):
+ if subdir:
+ logger.plain("Adding layer \"%s\" to conf/bblayers.conf" % subdir)
+ else:
+ logger.plain("Adding layer \"%s\" to conf/bblayers.conf" % name)
+ localargs = argparse.Namespace()
+ localargs.layerdir = layerdir
+ self.do_add_layer(localargs)
+ else:
+ break
+
+
+ def do_layerindex_show_depends(self, args):
+ """Find layer dependencies from layer index.
+"""
+ args.show_only = True
+ args.ignore = []
+ self.do_layerindex_fetch(args)
+
+
+ def version_str(self, pe, pv, pr = None):
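+        # e.g. version_str("1", "2.0", "r3") returns "1:2.0-r3"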
+ verstr = "%s" % pv
+ if pr:
+ verstr = "%s-%s" % (verstr, pr)
+ if pe:
+ verstr = "%s:%s" % (pe, verstr)
+ return verstr
+
+
+ def do_show_overlayed(self, args):
+ """list overlayed recipes (where the same recipe exists in another layer)
+
+Lists the names of overlayed recipes and the available versions in each
+layer, with the preferred version first. Note that skipped recipes that
+are overlayed will also be listed, with a " (skipped)" suffix.
+"""
+ self.init_bbhandler()
+
+ items_listed = self.list_recipes('Overlayed recipes', None, True, args.same_version, args.filenames, True, None)
+
+ # Check for overlayed .bbclass files
+ classes = defaultdict(list)
+ for layerdir in self.bblayers:
+ classdir = os.path.join(layerdir, 'classes')
+ if os.path.exists(classdir):
+ for classfile in os.listdir(classdir):
+ if os.path.splitext(classfile)[1] == '.bbclass':
+ classes[classfile].append(classdir)
+
+ # Locating classes and other files is a bit more complicated than recipes -
+ # layer priority is not a factor; instead BitBake uses the first matching
+ # file in BBPATH, which is manipulated directly by each layer's
+ # conf/layer.conf in turn, thus the order of layers in bblayers.conf is a
+ # factor - however, each layer.conf is free to either prepend or append to
+ # BBPATH (or indeed do crazy stuff with it). Thus the order in BBPATH might
+ # not be exactly the order present in bblayers.conf either.
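+        # For example (illustrative paths), with BBPATH = "/build:/layers/meta-a:/layers/meta-b",
+        # bb.utils.which() returns the copy of a class under /layers/meta-a even if
+        # meta-b has a higher layer priority.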
+ bbpath = str(self.bbhandler.config_data.getVar('BBPATH', True))
+ overlayed_class_found = False
+ for (classfile, classdirs) in classes.items():
+ if len(classdirs) > 1:
+ if not overlayed_class_found:
+ logger.plain('=== Overlayed classes ===')
+ overlayed_class_found = True
+
+ mainfile = bb.utils.which(bbpath, os.path.join('classes', classfile))
+ if args.filenames:
+ logger.plain('%s' % mainfile)
+ else:
+ # We effectively have to guess the layer here
+ logger.plain('%s:' % classfile)
+ mainlayername = '?'
+ for layerdir in self.bblayers:
+ classdir = os.path.join(layerdir, 'classes')
+ if mainfile.startswith(classdir):
+ mainlayername = self.get_layer_name(layerdir)
+ logger.plain(' %s' % mainlayername)
+ for classdir in classdirs:
+ fullpath = os.path.join(classdir, classfile)
+ if fullpath != mainfile:
+ if args.filenames:
+ print(' %s' % fullpath)
+ else:
+ print(' %s' % self.get_layer_name(os.path.dirname(classdir)))
+
+ if overlayed_class_found:
+            items_listed = True
+
+ if not items_listed:
+ logger.plain('No overlayed files found.')
+
+
+ def do_show_recipes(self, args):
+ """list available recipes, showing the layer they are provided by
+
+Lists the names of recipes and the available versions in each
+layer, with the preferred version first. Optionally you may specify
+pnspec to match a specified recipe name (supports wildcards). Note that
+skipped recipes will also be listed, with a " (skipped)" suffix.
+"""
+ self.init_bbhandler()
+
+ inheritlist = args.inherits.split(',') if args.inherits else []
+ if inheritlist or args.pnspec or args.multiple:
+ title = 'Matching recipes:'
+ else:
+ title = 'Available recipes:'
+ self.list_recipes(title, args.pnspec, False, False, args.filenames, args.multiple, inheritlist)
+
+
+ def list_recipes(self, title, pnspec, show_overlayed_only, show_same_ver_only, show_filenames, show_multi_provider_only, inherits):
+ if inherits:
+ bbpath = str(self.bbhandler.config_data.getVar('BBPATH', True))
+ for classname in inherits:
+ classfile = 'classes/%s.bbclass' % classname
+ if not bb.utils.which(bbpath, classfile, history=False):
+ raise UserError('No class named %s found in BBPATH' % classfile)
+
+ pkg_pn = self.bbhandler.cooker.recipecache.pkg_pn
+ (latest_versions, preferred_versions) = bb.providers.findProviders(self.bbhandler.config_data, self.bbhandler.cooker.recipecache, pkg_pn)
+ allproviders = bb.providers.allProviders(self.bbhandler.cooker.recipecache)
+
+ # Ensure we list skipped recipes
+ # We are largely guessing about PN, PV and the preferred version here,
+ # but we have no choice since skipped recipes are not fully parsed
+ skiplist = self.bbhandler.cooker.skiplist.keys()
+ skiplist.sort( key=lambda fileitem: self.bbhandler.cooker.collection.calc_bbfile_priority(fileitem) )
+ skiplist.reverse()
+ for fn in skiplist:
+ recipe_parts = os.path.splitext(os.path.basename(fn))[0].split('_')
+ p = recipe_parts[0]
+ if len(recipe_parts) > 1:
+ ver = (None, recipe_parts[1], None)
+ else:
+ ver = (None, 'unknown', None)
+ allproviders[p].append((ver, fn))
+ if not p in pkg_pn:
+ pkg_pn[p] = 'dummy'
+ preferred_versions[p] = (ver, fn)
+
+ def print_item(f, pn, ver, layer, ispref):
+ if f in skiplist:
+ skipped = ' (skipped)'
+ else:
+ skipped = ''
+ if show_filenames:
+ if ispref:
+ logger.plain("%s%s", f, skipped)
+ else:
+ logger.plain(" %s%s", f, skipped)
+ else:
+ if ispref:
+ logger.plain("%s:", pn)
+ logger.plain(" %s %s%s", layer.ljust(20), ver, skipped)
+
+ global_inherit = (self.bbhandler.config_data.getVar('INHERIT', True) or "").split()
+ cls_re = re.compile('classes/')
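+        # inherit entries are recorded both as "classes/foo" and as an absolute path;
+        # the relative "classes/" form is skipped when matching below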
+
+ preffiles = []
+ items_listed = False
+ for p in sorted(pkg_pn):
+ if pnspec:
+ if not fnmatch.fnmatch(p, pnspec):
+ continue
+
+ if len(allproviders[p]) > 1 or not show_multi_provider_only:
+ pref = preferred_versions[p]
+ realfn = bb.cache.Cache.virtualfn2realfn(pref[1])
+ preffile = realfn[0]
+
+                # We only display once per recipe; we should prefer non-extended versions of the
+ # recipe if present (so e.g. in OpenEmbedded, openssl rather than nativesdk-openssl
+ # which would otherwise sort first).
+ if realfn[1] and realfn[0] in self.bbhandler.cooker.recipecache.pkg_fn:
+ continue
+
+ if inherits:
+ matchcount = 0
+ recipe_inherits = self.bbhandler.cooker_data.inherits.get(preffile, [])
+ for cls in recipe_inherits:
+ if cls_re.match(cls):
+ continue
+ classname = os.path.splitext(os.path.basename(cls))[0]
+ if classname in global_inherit:
+ continue
+ elif classname in inherits:
+ matchcount += 1
+ if matchcount != len(inherits):
+ # No match - skip this recipe
+ continue
+
+ if preffile not in preffiles:
+ preflayer = self.get_file_layer(preffile)
+ multilayer = False
+ same_ver = True
+ provs = []
+ for prov in allproviders[p]:
+ provfile = bb.cache.Cache.virtualfn2realfn(prov[1])[0]
+ provlayer = self.get_file_layer(provfile)
+ provs.append((provfile, provlayer, prov[0]))
+ if provlayer != preflayer:
+ multilayer = True
+ if prov[0] != pref[0]:
+ same_ver = False
+
+ if (multilayer or not show_overlayed_only) and (same_ver or not show_same_ver_only):
+ if not items_listed:
+ logger.plain('=== %s ===' % title)
+ items_listed = True
+ print_item(preffile, p, self.version_str(pref[0][0], pref[0][1]), preflayer, True)
+ for (provfile, provlayer, provver) in provs:
+ if provfile != preffile:
+ print_item(provfile, p, self.version_str(provver[0], provver[1]), provlayer, False)
+ # Ensure we don't show two entries for BBCLASSEXTENDed recipes
+ preffiles.append(preffile)
+
+ return items_listed
+
+
+ def do_flatten(self, args):
+ """flatten layer configuration into a separate output directory.
+
+Takes the specified layers (or all layers in the current layer
+configuration if none are specified) and builds a "flattened" directory
+containing the contents of all layers, with any overlayed recipes removed
+and bbappends appended to the corresponding recipes. Note that some manual
+cleanup may still be necessary afterwards, in particular:
+
+* where non-recipe files (such as patches) are overwritten (the flatten
+ command will show a warning for these)
+* where anything beyond the normal layer setup has been added to
+ layer.conf (only the lowest priority number layer's layer.conf is used)
+* overridden/appended items from bbappends will need to be tidied up
+* when the flattened layers do not have the same directory structure (the
+ flatten command should show a warning when this will cause a problem)
+
+Warning: if you flatten several layers where another layer is intended to
+be used "inbetween" them (in layer priority order) such that recipes /
+bbappends in the layers interact, and then attempt to use the new output
+layer together with that other layer, you may no longer get the same
+build results (as the layer priority order has effectively changed).
+"""
+ if len(args.layer) == 1:
+ logger.error('If you specify layers to flatten you must specify at least two')
+ return 1
+
+ outputdir = args.outputdir
+ if os.path.exists(outputdir) and os.listdir(outputdir):
+ logger.error('Directory %s exists and is non-empty, please clear it out first' % outputdir)
+ return 1
+
+ self.init_bbhandler()
+ layers = self.bblayers
+ if len(args.layer) > 2:
+ layernames = args.layer
+ found_layernames = []
+ found_layerdirs = []
+ for layerdir in layers:
+ layername = self.get_layer_name(layerdir)
+ if layername in layernames:
+ found_layerdirs.append(layerdir)
+ found_layernames.append(layername)
+
+ for layername in layernames:
+ if not layername in found_layernames:
+ logger.error('Unable to find layer %s in current configuration, please run "%s show-layers" to list configured layers' % (layername, os.path.basename(sys.argv[0])))
+ return
+ layers = found_layerdirs
+ else:
+ layernames = []
+
+ # Ensure a specified path matches our list of layers
+ def layer_path_match(path):
+ for layerdir in layers:
+ if path.startswith(os.path.join(layerdir, '')):
+ return layerdir
+ return None
+
+ applied_appends = []
+ for layer in layers:
+ overlayed = []
+ for f in self.bbhandler.cooker.collection.overlayed.iterkeys():
+ for of in self.bbhandler.cooker.collection.overlayed[f]:
+ if of.startswith(layer):
+ overlayed.append(of)
+
+ logger.plain('Copying files from %s...' % layer )
+ for root, dirs, files in os.walk(layer):
+ for f1 in files:
+ f1full = os.sep.join([root, f1])
+ if f1full in overlayed:
+ logger.plain(' Skipping overlayed file %s' % f1full )
+ else:
+ ext = os.path.splitext(f1)[1]
+ if ext != '.bbappend':
+ fdest = f1full[len(layer):]
+ fdest = os.path.normpath(os.sep.join([outputdir,fdest]))
+ bb.utils.mkdirhier(os.path.dirname(fdest))
+ if os.path.exists(fdest):
+ if f1 == 'layer.conf' and root.endswith('/conf'):
+ logger.plain(' Skipping layer config file %s' % f1full )
+ continue
+ else:
+ logger.warn('Overwriting file %s', fdest)
+ bb.utils.copyfile(f1full, fdest)
+ if ext == '.bb':
+ for append in self.bbhandler.cooker.collection.get_file_appends(f1full):
+ if layer_path_match(append):
+ logger.plain(' Applying append %s to %s' % (append, fdest))
+ self.apply_append(append, fdest)
+ applied_appends.append(append)
+
+        # Handle bbappends from included layers that apply to recipes in excluded layers
+ for b in self.bbhandler.cooker.collection.bbappends:
+ (recipename, appendname) = b
+ if appendname not in applied_appends:
+ first_append = None
+ layer = layer_path_match(appendname)
+ if layer:
+ if first_append:
+ self.apply_append(appendname, first_append)
+ else:
+ fdest = appendname[len(layer):]
+ fdest = os.path.normpath(os.sep.join([outputdir,fdest]))
+ bb.utils.mkdirhier(os.path.dirname(fdest))
+ bb.utils.copyfile(appendname, fdest)
+ first_append = fdest
+
+ # Get the regex for the first layer in our list (which is where the conf/layer.conf file will
+ # have come from)
+ first_regex = None
+ layerdir = layers[0]
+ for layername, pattern, regex, _ in self.bbhandler.cooker.recipecache.bbfile_config_priorities:
+ if regex.match(os.path.join(layerdir, 'test')):
+ first_regex = regex
+ break
+
+ if first_regex:
+ # Find the BBFILES entries that match (which will have come from this conf/layer.conf file)
+ bbfiles = str(self.bbhandler.config_data.getVar('BBFILES', True)).split()
+ bbfiles_layer = []
+ for item in bbfiles:
+ if first_regex.match(item):
+ newpath = os.path.join(outputdir, item[len(layerdir)+1:])
+ bbfiles_layer.append(newpath)
+
+ if bbfiles_layer:
+ # Check that all important layer files match BBFILES
+ for root, dirs, files in os.walk(outputdir):
+ for f1 in files:
+ ext = os.path.splitext(f1)[1]
+ if ext in ['.bb', '.bbappend']:
+ f1full = os.sep.join([root, f1])
+ entry_found = False
+ for item in bbfiles_layer:
+ if fnmatch.fnmatch(f1full, item):
+ entry_found = True
+ break
+ if not entry_found:
+ logger.warning("File %s does not match the flattened layer's BBFILES setting, you may need to edit conf/layer.conf or move the file elsewhere" % f1full)
+
+ def get_file_layer(self, filename):
+ layerdir = self.get_file_layerdir(filename)
+ if layerdir:
+ return self.get_layer_name(layerdir)
+ else:
+ return '?'
+
+ def get_file_layerdir(self, filename):
+ layer = bb.utils.get_file_layer(filename, self.bbhandler.config_data)
+ return self.bbfile_collections.get(layer, None)
+
+ def remove_layer_prefix(self, f):
+ """Remove the layer_dir prefix, e.g., f = /path/to/layer_dir/foo/blah, the
+ return value will be: layer_dir/foo/blah"""
+ f_layerdir = self.get_file_layerdir(f)
+ if not f_layerdir:
+ return f
+ prefix = os.path.join(os.path.dirname(f_layerdir), '')
+ return f[len(prefix):] if f.startswith(prefix) else f
+
+ def get_layer_name(self, layerdir):
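+        # e.g. "/path/to/meta-example/" (illustrative) -> "meta-example"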
+ return os.path.basename(layerdir.rstrip(os.sep))
+
+ def apply_append(self, appendname, recipename):
+ with open(appendname, 'r') as appendfile:
+ with open(recipename, 'a') as recipefile:
+ recipefile.write('\n')
+ recipefile.write('##### bbappended from %s #####\n' % self.get_file_layer(appendname))
+ recipefile.writelines(appendfile.readlines())
+
+ def do_show_appends(self, args):
+ """list bbappend files and recipe files they apply to
+
+Lists recipes with the bbappends that apply to them as subitems.
+"""
+ self.init_bbhandler()
+
+ logger.plain('=== Appended recipes ===')
+
+ pnlist = list(self.bbhandler.cooker_data.pkg_pn.keys())
+ pnlist.sort()
+ appends = False
+ for pn in pnlist:
+ if self.show_appends_for_pn(pn):
+ appends = True
+
+ if self.show_appends_for_skipped():
+ appends = True
+
+ if not appends:
+ logger.plain('No append files found')
+
+ def show_appends_for_pn(self, pn):
+ filenames = self.bbhandler.cooker_data.pkg_pn[pn]
+
+ best = bb.providers.findBestProvider(pn,
+ self.bbhandler.config_data,
+ self.bbhandler.cooker_data,
+ self.bbhandler.cooker_data.pkg_pn)
+ best_filename = os.path.basename(best[3])
+
+ return self.show_appends_output(filenames, best_filename)
+
+ def show_appends_for_skipped(self):
+ filenames = [os.path.basename(f)
+ for f in self.bbhandler.cooker.skiplist.iterkeys()]
+ return self.show_appends_output(filenames, None, " (skipped)")
+
+ def show_appends_output(self, filenames, best_filename, name_suffix = ''):
+ appended, missing = self.get_appends_for_files(filenames)
+ if appended:
+ for basename, appends in appended:
+ logger.plain('%s%s:', basename, name_suffix)
+ for append in appends:
+ logger.plain(' %s', append)
+
+ if best_filename:
+ if best_filename in missing:
+ logger.warn('%s: missing append for preferred version',
+ best_filename)
+ return True
+ else:
+ return False
+
+ def get_appends_for_files(self, filenames):
+ appended, notappended = [], []
+ for filename in filenames:
+ _, cls = bb.cache.Cache.virtualfn2realfn(filename)
+ if cls:
+ continue
+
+ basename = os.path.basename(filename)
+ appends = self.bbhandler.cooker.collection.get_file_appends(basename)
+ if appends:
+ appended.append((basename, list(appends)))
+ else:
+ notappended.append(basename)
+ return appended, notappended
+
+ def do_show_cross_depends(self, args):
+ """Show dependencies between recipes that cross layer boundaries.
+
+Figure out the dependencies between recipes that cross layer boundaries.
+
+NOTE: .bbappend files can impact the dependencies.
+"""
+ ignore_layers = (args.ignore or '').split(',')
+
+ self.init_bbhandler()
+
+ pkg_fn = self.bbhandler.cooker_data.pkg_fn
+ bbpath = str(self.bbhandler.config_data.getVar('BBPATH', True))
+ self.require_re = re.compile(r"require\s+(.+)")
+ self.include_re = re.compile(r"include\s+(.+)")
+ self.inherit_re = re.compile(r"inherit\s+(.+)")
+
+ global_inherit = (self.bbhandler.config_data.getVar('INHERIT', True) or "").split()
+
+ # The bb's DEPENDS and RDEPENDS
+ for f in pkg_fn:
+ f = bb.cache.Cache.virtualfn2realfn(f)[0]
+ # Get the layername that the file is in
+ layername = self.get_file_layer(f)
+
+ # The DEPENDS
+ deps = self.bbhandler.cooker_data.deps[f]
+ for pn in deps:
+ if pn in self.bbhandler.cooker_data.pkg_pn:
+ best = bb.providers.findBestProvider(pn,
+ self.bbhandler.config_data,
+ self.bbhandler.cooker_data,
+ self.bbhandler.cooker_data.pkg_pn)
+ self.check_cross_depends("DEPENDS", layername, f, best[3], args.filenames, ignore_layers)
+
+            # The RDEPENDS
+ all_rdeps = self.bbhandler.cooker_data.rundeps[f].values()
+            # Remove duplicate and null entries.
+ sorted_rdeps = {}
+            # all_rdeps is a list of lists, so we need two for loops
+ for k1 in all_rdeps:
+ for k2 in k1:
+ sorted_rdeps[k2] = 1
+ all_rdeps = sorted_rdeps.keys()
+ for rdep in all_rdeps:
+ all_p = bb.providers.getRuntimeProviders(self.bbhandler.cooker_data, rdep)
+ if all_p:
+ if f in all_p:
+ # The recipe provides this one itself, ignore
+ continue
+ best = bb.providers.filterProvidersRunTime(all_p, rdep,
+ self.bbhandler.config_data,
+ self.bbhandler.cooker_data)[0][0]
+ self.check_cross_depends("RDEPENDS", layername, f, best, args.filenames, ignore_layers)
+
+ # The RRECOMMENDS
+ all_rrecs = self.bbhandler.cooker_data.runrecs[f].values()
+            # Remove duplicate and null entries.
+ sorted_rrecs = {}
+            # all_rrecs is a list of lists, so we need two for loops
+ for k1 in all_rrecs:
+ for k2 in k1:
+ sorted_rrecs[k2] = 1
+ all_rrecs = sorted_rrecs.keys()
+ for rrec in all_rrecs:
+ all_p = bb.providers.getRuntimeProviders(self.bbhandler.cooker_data, rrec)
+ if all_p:
+ if f in all_p:
+ # The recipe provides this one itself, ignore
+ continue
+ best = bb.providers.filterProvidersRunTime(all_p, rrec,
+ self.bbhandler.config_data,
+ self.bbhandler.cooker_data)[0][0]
+ self.check_cross_depends("RRECOMMENDS", layername, f, best, args.filenames, ignore_layers)
+
+ # The inherit class
+ cls_re = re.compile('classes/')
+ if f in self.bbhandler.cooker_data.inherits:
+ inherits = self.bbhandler.cooker_data.inherits[f]
+ for cls in inherits:
+                    # Inherit entries appear as both classes/cls and /path/to/classes/cls;
+                    # ignore the relative classes/cls form.
+ if not cls_re.match(cls):
+ classname = os.path.splitext(os.path.basename(cls))[0]
+ if classname in global_inherit:
+ continue
+ inherit_layername = self.get_file_layer(cls)
+ if inherit_layername != layername and not inherit_layername in ignore_layers:
+ if not args.filenames:
+ f_short = self.remove_layer_prefix(f)
+ cls = self.remove_layer_prefix(cls)
+ else:
+ f_short = f
+ logger.plain("%s inherits %s" % (f_short, cls))
+
+ # The 'require/include xxx' in the bb file
+ pv_re = re.compile(r"\${PV}")
+ with open(f, 'r') as fnfile:
+ line = fnfile.readline()
+ while line:
+ m, keyword = self.match_require_include(line)
+ # Found the 'require/include xxxx'
+ if m:
+ needed_file = m.group(1)
+ # Replace the ${PV} with the real PV
+ if pv_re.search(needed_file) and f in self.bbhandler.cooker_data.pkg_pepvpr:
+ pv = self.bbhandler.cooker_data.pkg_pepvpr[f][1]
+ needed_file = re.sub(r"\${PV}", pv, needed_file)
+ self.print_cross_files(bbpath, keyword, layername, f, needed_file, args.filenames, ignore_layers)
+ line = fnfile.readline()
+
+ # The "require/include xxx" in conf/machine/*.conf, .inc and .bbclass
+ conf_re = re.compile(".*/conf/machine/[^\/]*\.conf$")
+ inc_re = re.compile(".*\.inc$")
+ # The "inherit xxx" in .bbclass
+ bbclass_re = re.compile(".*\.bbclass$")
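+        # e.g. conf_re matches "meta/conf/machine/qemuarm.conf" (illustrative path),
+        # while inc_re and bbclass_re match any .inc / .bbclass file respectively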
+ for layerdir in self.bblayers:
+ layername = self.get_layer_name(layerdir)
+ for dirpath, dirnames, filenames in os.walk(layerdir):
+ for name in filenames:
+ f = os.path.join(dirpath, name)
+ s = conf_re.match(f) or inc_re.match(f) or bbclass_re.match(f)
+ if s:
+ with open(f, 'r') as ffile:
+ line = ffile.readline()
+ while line:
+ m, keyword = self.match_require_include(line)
+ # Only bbclass has the "inherit xxx" here.
+ bbclass=""
+ if not m and f.endswith(".bbclass"):
+ m, keyword = self.match_inherit(line)
+ bbclass=".bbclass"
+ # Find a 'require/include xxxx'
+ if m:
+ self.print_cross_files(bbpath, keyword, layername, f, m.group(1) + bbclass, args.filenames, ignore_layers)
+ line = ffile.readline()
+
+ def print_cross_files(self, bbpath, keyword, layername, f, needed_filename, show_filenames, ignore_layers):
+ """Print the depends that crosses a layer boundary"""
+ needed_file = bb.utils.which(bbpath, needed_filename)
+ if needed_file:
+ # Which layer is this file from
+ needed_layername = self.get_file_layer(needed_file)
+ if needed_layername != layername and not needed_layername in ignore_layers:
+ if not show_filenames:
+ f = self.remove_layer_prefix(f)
+ needed_file = self.remove_layer_prefix(needed_file)
+ logger.plain("%s %s %s" %(f, keyword, needed_file))
+
+ def match_inherit(self, line):
+ """Match the inherit xxx line"""
+ return (self.inherit_re.match(line), "inherits")
+
+ def match_require_include(self, line):
+ """Match the require/include xxx line"""
+ m = self.require_re.match(line)
+ keyword = "requires"
+ if not m:
+ m = self.include_re.match(line)
+ keyword = "includes"
+ return (m, keyword)
+
+ def check_cross_depends(self, keyword, layername, f, needed_file, show_filenames, ignore_layers):
+ """Print the DEPENDS/RDEPENDS file that crosses a layer boundary"""
+ best_realfn = bb.cache.Cache.virtualfn2realfn(needed_file)[0]
+ needed_layername = self.get_file_layer(best_realfn)
+ if needed_layername != layername and not needed_layername in ignore_layers:
+ if not show_filenames:
+ f = self.remove_layer_prefix(f)
+ best_realfn = self.remove_layer_prefix(best_realfn)
+
+ logger.plain("%s %s %s" % (f, keyword, best_realfn))
+
+
+def main():
+
+ cmds = Commands()
+
+ def add_command(cmdname, function, *args, **kwargs):
+ # Convert docstring for function to help (one-liner shown in main --help) and description (shown in subcommand --help)
+ docsplit = function.__doc__.splitlines()
+ help = docsplit[0]
+ if len(docsplit) > 1:
+ desc = '\n'.join(docsplit[1:])
+ else:
+ desc = help
+ subparser = subparsers.add_parser(cmdname, *args, help=help, description=desc, formatter_class=argparse.RawTextHelpFormatter, **kwargs)
+ subparser.set_defaults(func=function)
+ return subparser
+
+ parser = argparse.ArgumentParser(description="BitBake layers utility",
+ epilog="Use %(prog)s <subcommand> --help to get help on a specific command")
+ parser.add_argument('-d', '--debug', help='Enable debug output', action='store_true')
+ parser.add_argument('-q', '--quiet', help='Print only errors', action='store_true')
+ subparsers = parser.add_subparsers(title='subcommands', metavar='<subcommand>')
+
+ parser_show_layers = add_command('show-layers', cmds.do_show_layers)
+
+ parser_add_layer = add_command('add-layer', cmds.do_add_layer)
+ parser_add_layer.add_argument('layerdir', help='Layer directory to add')
+
+ parser_remove_layer = add_command('remove-layer', cmds.do_remove_layer)
+ parser_remove_layer.add_argument('layerdir', help='Layer directory to remove (wildcards allowed, enclose in quotes to avoid shell expansion)')
+ parser_remove_layer.set_defaults(func=cmds.do_remove_layer)
+
+ parser_show_overlayed = add_command('show-overlayed', cmds.do_show_overlayed)
+ parser_show_overlayed.add_argument('-f', '--filenames', help='instead of the default formatting, list filenames of higher priority recipes with the ones they overlay indented underneath', action='store_true')
+ parser_show_overlayed.add_argument('-s', '--same-version', help='only list overlayed recipes where the version is the same', action='store_true')
+
+ parser_show_recipes = add_command('show-recipes', cmds.do_show_recipes)
+ parser_show_recipes.add_argument('-f', '--filenames', help='instead of the default formatting, list filenames of higher priority recipes with the ones they overlay indented underneath', action='store_true')
+ parser_show_recipes.add_argument('-m', '--multiple', help='only list where multiple recipes (in the same layer or different layers) exist for the same recipe name', action='store_true')
+ parser_show_recipes.add_argument('-i', '--inherits', help='only list recipes that inherit the named class', metavar='CLASS', default='')
+ parser_show_recipes.add_argument('pnspec', nargs='?', help='optional recipe name specification (wildcards allowed, enclose in quotes to avoid shell expansion)')
+
+ parser_show_appends = add_command('show-appends', cmds.do_show_appends)
+
+ parser_flatten = add_command('flatten', cmds.do_flatten)
+ parser_flatten.add_argument('layer', nargs='*', help='Optional layer(s) to flatten (otherwise all are flattened)')
+ parser_flatten.add_argument('outputdir', help='Output directory')
+
+ parser_show_cross_depends = add_command('show-cross-depends', cmds.do_show_cross_depends)
+ parser_show_cross_depends.add_argument('-f', '--filenames', help='show full file path', action='store_true')
+ parser_show_cross_depends.add_argument('-i', '--ignore', help='ignore dependencies on items in the specified layer(s) (split multiple layer names with commas, no spaces)', metavar='LAYERNAME')
+
+ parser_layerindex_fetch = add_command('layerindex-fetch', cmds.do_layerindex_fetch)
+ parser_layerindex_fetch.add_argument('-n', '--show-only', help='show dependencies and do nothing else', action='store_true')
+ parser_layerindex_fetch.add_argument('-b', '--branch', help='branch name to fetch (default %(default)s)', default='master')
+ parser_layerindex_fetch.add_argument('-i', '--ignore', help='assume the specified layers do not need to be fetched/added (separate multiple layers with commas, no spaces)', metavar='LAYER')
+ parser_layerindex_fetch.add_argument('layername', nargs='+', help='layer to fetch')
+
+ parser_layerindex_show_depends = add_command('layerindex-show-depends', cmds.do_layerindex_show_depends)
+ parser_layerindex_show_depends.add_argument('-b', '--branch', help='branch name to fetch (default %(default)s)', default='master')
+ parser_layerindex_show_depends.add_argument('layername', nargs='+', help='layer to query')
+
+ args = parser.parse_args()
+
+ if args.debug:
+ logger.setLevel(logging.DEBUG)
+ elif args.quiet:
+ logger.setLevel(logging.ERROR)
+
+ try:
+ ret = args.func(args)
+ except UserError as err:
+ logger.error(str(err))
+ ret = 1
+
+ return ret
+
+
+if __name__ == "__main__":
+ try:
+ ret = main()
+ except Exception:
+ ret = 1
+ import traceback
+ traceback.print_exc(5)
+ sys.exit(ret)
diff --git a/bitbake/bin/bitbake-prserv b/bitbake/bin/bitbake-prserv
new file mode 100755
index 0000000..a8d7acb
--- /dev/null
+++ b/bitbake/bin/bitbake-prserv
@@ -0,0 +1,55 @@
+#!/usr/bin/env python
+import os
+import sys,logging
+import optparse
+
+sys.path.insert(0, os.path.join(os.path.dirname(os.path.dirname(__file__)),'lib'))
+
+import prserv
+import prserv.serv
+
+__version__="1.0.0"
+
+PRHOST_DEFAULT='0.0.0.0'
+PRPORT_DEFAULT=8585
+
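+# Example usage: start a PR server bound to all interfaces on the default port,
+# and stop it again later:
+#   bitbake-prserv --start --host 0.0.0.0 --port 8585
+#   bitbake-prserv --stop --host 0.0.0.0 --port 8585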
+def main():
+ parser = optparse.OptionParser(
+ version="Bitbake PR Service Core version %s, %%prog version %s" % (prserv.__version__, __version__),
+ usage = "%prog < --start | --stop > [options]")
+
+ parser.add_option("-f", "--file", help="database filename(default: prserv.sqlite3)", action="store",
+ dest="dbfile", type="string", default="prserv.sqlite3")
+ parser.add_option("-l", "--log", help="log filename(default: prserv.log)", action="store",
+ dest="logfile", type="string", default="prserv.log")
+ parser.add_option("--loglevel", help="logging level, i.e. CRITICAL, ERROR, WARNING, INFO, DEBUG",
+ action = "store", type="string", dest="loglevel", default = "INFO")
+ parser.add_option("--start", help="start daemon",
+ action="store_true", dest="start")
+ parser.add_option("--stop", help="stop daemon",
+ action="store_true", dest="stop")
+ parser.add_option("--host", help="ip address to bind", action="store",
+ dest="host", type="string", default=PRHOST_DEFAULT)
+ parser.add_option("--port", help="port number(default: 8585)", action="store",
+ dest="port", type="int", default=PRPORT_DEFAULT)
+
+ options, args = parser.parse_args(sys.argv)
+ prserv.init_logger(os.path.abspath(options.logfile),options.loglevel)
+
+ if options.start:
+ ret=prserv.serv.start_daemon(options.dbfile, options.host, options.port,os.path.abspath(options.logfile))
+ elif options.stop:
+ ret=prserv.serv.stop_daemon(options.host, options.port)
+ else:
+ ret=parser.print_help()
+ return ret
+
+if __name__ == "__main__":
+ try:
+ ret = main()
+ except Exception:
+ ret = 1
+ import traceback
+ traceback.print_exc(5)
+ sys.exit(ret)
+
diff --git a/bitbake/bin/bitbake-selftest b/bitbake/bin/bitbake-selftest
new file mode 100755
index 0000000..462eb1b
--- /dev/null
+++ b/bitbake/bin/bitbake-selftest
@@ -0,0 +1,55 @@
+#!/usr/bin/env python
+#
+# Copyright (C) 2012 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os
+import sys, logging
+sys.path.insert(0, os.path.join(os.path.dirname(os.path.dirname(__file__)), 'lib'))
+
+import unittest
+try:
+ import bb
+except RuntimeError as exc:
+ sys.exit(str(exc))
+
+def usage():
+ print('usage: [BB_SKIP_NETTESTS=yes] %s [-v] [testname1 [testname2]...]' % os.path.basename(sys.argv[0]))
+
+verbosity = 1
+
+tests = sys.argv[1:]
+if '-v' in sys.argv:
+ tests.remove('-v')
+ verbosity = 2
+
+if tests:
+ if '--help' in sys.argv[1:]:
+ usage()
+ sys.exit(0)
+else:
+ tests = ["bb.tests.codeparser",
+ "bb.tests.cow",
+ "bb.tests.data",
+ "bb.tests.fetch",
+ "bb.tests.parse",
+ "bb.tests.utils"]
+
+for t in tests:
+ t = '.'.join(t.split('.')[:3])
+ __import__(t)
+
+unittest.main(argv=["bitbake-selftest"] + tests, verbosity=verbosity)
+
diff --git a/bitbake/bin/bitbake-worker b/bitbake/bin/bitbake-worker
new file mode 100755
index 0000000..af17b87
--- /dev/null
+++ b/bitbake/bin/bitbake-worker
@@ -0,0 +1,432 @@
+#!/usr/bin/env python
+
+import os
+import sys
+import warnings
+sys.path.insert(0, os.path.join(os.path.dirname(os.path.dirname(sys.argv[0])), 'lib'))
+from bb import fetch2
+import logging
+import bb
+import select
+import errno
+import signal
+
+# Users shouldn't be running this code directly
+if len(sys.argv) != 2 or not sys.argv[1].startswith("decafbad"):
+ print("bitbake-worker is meant for internal execution by bitbake itself, please don't use it standalone.")
+ sys.exit(1)
+
+profiling = False
+if sys.argv[1] == "decafbadbad":
+ profiling = True
+ try:
+ import cProfile as profile
+ except:
+ import profile
+
+# Unbuffer stdout to avoid log truncation in the event
+# of a disorderly exit as well as to provide timely
+# updates to log files for use with tail
+try:
+ if sys.stdout.name == '<stdout>':
+ sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)
+except:
+ pass
+
+logger = logging.getLogger("BitBake")
+
+try:
+ import cPickle as pickle
+except ImportError:
+ import pickle
+ bb.msg.note(1, bb.msg.domain.Cache, "Importing cPickle failed. Falling back to a very slow implementation.")
+
+
+worker_pipe = sys.stdout.fileno()
+bb.utils.nonblockingfd(worker_pipe)
+
+handler = bb.event.LogHandler()
+logger.addHandler(handler)
+
+if 0:
+ # Code to write out a log file of all events passing through the worker
+ logfilename = "/tmp/workerlogfile"
+ format_str = "%(levelname)s: %(message)s"
+ conlogformat = bb.msg.BBLogFormatter(format_str)
+ consolelog = logging.FileHandler(logfilename)
+ bb.msg.addDefaultlogFilter(consolelog)
+ consolelog.setFormatter(conlogformat)
+ logger.addHandler(consolelog)
+
+worker_queue = ""
+
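+# Events are framed as "<event>" + pickled payload + "</event>" so the receiving
+# side can split the byte stream back into individual messages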
+def worker_fire(event, d):
+ data = "<event>" + pickle.dumps(event) + "</event>"
+ worker_fire_prepickled(data)
+
+def worker_fire_prepickled(event):
+ global worker_queue
+
+ worker_queue = worker_queue + event
+ worker_flush()
+
+def worker_flush():
+ global worker_queue, worker_pipe
+
+ if not worker_queue:
+ return
+
+ try:
+ written = os.write(worker_pipe, worker_queue)
+ worker_queue = worker_queue[written:]
+ except (IOError, OSError) as e:
+ if e.errno != errno.EAGAIN and e.errno != errno.EPIPE:
+ raise
+
+def worker_child_fire(event, d):
+ global worker_pipe
+
+ data = "<event>" + pickle.dumps(event) + "</event>"
+ try:
+ worker_pipe.write(data)
+ except IOError:
+ sigterm_handler(None, None)
+ raise
+
+bb.event.worker_fire = worker_fire
+
+lf = None
+#lf = open("/tmp/workercommandlog", "w+")
+def workerlog_write(msg):
+ if lf:
+ lf.write(msg)
+ lf.flush()
+
+def sigterm_handler(signum, frame):
+ signal.signal(signal.SIGTERM, signal.SIG_DFL)
+ os.killpg(0, signal.SIGTERM)
+ sys.exit()
+
+def fork_off_task(cfg, data, workerdata, fn, task, taskname, appends, taskdepdata, quieterrors=False):
+ # We need to setup the environment BEFORE the fork, since
+ # a fork() or exec*() activates PSEUDO...
+
+ envbackup = {}
+ fakeenv = {}
+ umask = None
+
+ taskdep = workerdata["taskdeps"][fn]
+ if 'umask' in taskdep and taskname in taskdep['umask']:
+ # umask might come in as a number or text string..
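+        # e.g. "022" is parsed below as octal (-> 18), while a value that is already
+        # an int triggers the TypeError fallback and is used unchanged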
+ try:
+ umask = int(taskdep['umask'][taskname],8)
+ except TypeError:
+ umask = taskdep['umask'][taskname]
+
+ # We can't use the fakeroot environment in a dry run as it possibly hasn't been built
+ if 'fakeroot' in taskdep and taskname in taskdep['fakeroot'] and not cfg.dry_run:
+ envvars = (workerdata["fakerootenv"][fn] or "").split()
+ for key, value in (var.split('=') for var in envvars):
+ envbackup[key] = os.environ.get(key)
+ os.environ[key] = value
+ fakeenv[key] = value
+
+ fakedirs = (workerdata["fakerootdirs"][fn] or "").split()
+ for p in fakedirs:
+ bb.utils.mkdirhier(p)
+ logger.debug(2, 'Running %s:%s under fakeroot, fakedirs: %s' %
+ (fn, taskname, ', '.join(fakedirs)))
+ else:
+ envvars = (workerdata["fakerootnoenv"][fn] or "").split()
+ for key, value in (var.split('=') for var in envvars):
+ envbackup[key] = os.environ.get(key)
+ os.environ[key] = value
+ fakeenv[key] = value
+
+ sys.stdout.flush()
+ sys.stderr.flush()
+
+ try:
+ pipein, pipeout = os.pipe()
+ pipein = os.fdopen(pipein, 'rb', 4096)
+ pipeout = os.fdopen(pipeout, 'wb', 0)
+ pid = os.fork()
+ except OSError as e:
+ bb.msg.fatal("RunQueue", "fork failed: %d (%s)" % (e.errno, e.strerror))
+
+ if pid == 0:
+ def child():
+ global worker_pipe
+ pipein.close()
+
+ signal.signal(signal.SIGTERM, sigterm_handler)
+ # Let SIGHUP exit as SIGTERM
+ signal.signal(signal.SIGHUP, sigterm_handler)
+ bb.utils.signal_on_parent_exit("SIGTERM")
+
+            # Save out the PID so that events can include it
+ bb.event.worker_pid = os.getpid()
+ bb.event.worker_fire = worker_child_fire
+ worker_pipe = pipeout
+
+ # Make the child the process group leader and ensure no
+ # child process will be controlled by the current terminal
+ # This ensures signals sent to the controlling terminal like Ctrl+C
+ # don't stop the child processes.
+ os.setsid()
+ # No stdin
+ newsi = os.open(os.devnull, os.O_RDWR)
+ os.dup2(newsi, sys.stdin.fileno())
+
+ if umask:
+ os.umask(umask)
+
+ data.setVar("BB_WORKERCONTEXT", "1")
+ data.setVar("BB_TASKDEPDATA", taskdepdata)
+ data.setVar("BUILDNAME", workerdata["buildname"])
+ data.setVar("DATE", workerdata["date"])
+ data.setVar("TIME", workerdata["time"])
+ bb.parse.siggen.set_taskdata(workerdata["sigdata"])
+ ret = 0
+ try:
+ the_data = bb.cache.Cache.loadDataFull(fn, appends, data)
+ the_data.setVar('BB_TASKHASH', workerdata["runq_hash"][task])
+
+ # exported_vars() returns a generator which *cannot* be passed to os.environ.update()
+ # successfully. We also need to unset anything from the environment which shouldn't be there
+ exports = bb.data.exported_vars(the_data)
+ bb.utils.empty_environment()
+ for e, v in exports:
+ os.environ[e] = v
+ for e in fakeenv:
+ os.environ[e] = fakeenv[e]
+ the_data.setVar(e, fakeenv[e])
+ the_data.setVarFlag(e, 'export', "1")
+
+ if quieterrors:
+ the_data.setVarFlag(taskname, "quieterrors", "1")
+
+ except Exception as exc:
+ if not quieterrors:
+ logger.critical(str(exc))
+ os._exit(1)
+ try:
+ if cfg.dry_run:
+ return 0
+ return bb.build.exec_task(fn, taskname, the_data, cfg.profile)
+ except:
+ os._exit(1)
+ if not profiling:
+ os._exit(child())
+ else:
+ profname = "profile-%s.log" % (fn.replace("/", "-") + "-" + taskname)
+ prof = profile.Profile()
+ try:
+ ret = profile.Profile.runcall(prof, child)
+ finally:
+ prof.dump_stats(profname)
+ bb.utils.process_profilelog(profname)
+ os._exit(ret)
+ else:
+ for key, value in envbackup.iteritems():
+ if value is None:
+ del os.environ[key]
+ else:
+ os.environ[key] = value
+
+ return pid, pipein, pipeout
+
+class runQueueWorkerPipe():
+ """
+ Abstraction for a pipe between a worker thread and the worker server
+ """
+ def __init__(self, pipein, pipeout):
+ self.input = pipein
+ if pipeout:
+ pipeout.close()
+ bb.utils.nonblockingfd(self.input)
+ self.queue = ""
+
+ def read(self):
+ start = len(self.queue)
+ try:
+ self.queue = self.queue + self.input.read(102400)
+ except (OSError, IOError) as e:
+ if e.errno != errno.EAGAIN:
+ raise
+
+ end = len(self.queue)
+ index = self.queue.find("</event>")
+ while index != -1:
+ worker_fire_prepickled(self.queue[:index+8])
+ self.queue = self.queue[index+8:]
+ index = self.queue.find("</event>")
+ return (end > start)
+
+ def close(self):
+ while self.read():
+ continue
+ if len(self.queue) > 0:
+ print("Warning, worker child left partial message: %s" % self.queue)
+ self.input.close()
+
+normalexit = False
+
+class BitbakeWorker(object):
+ def __init__(self, din):
+ self.input = din
+ bb.utils.nonblockingfd(self.input)
+ self.queue = ""
+ self.cookercfg = None
+ self.databuilder = None
+ self.data = None
+ self.build_pids = {}
+ self.build_pipes = {}
+
+ signal.signal(signal.SIGTERM, self.sigterm_exception)
+ # Let SIGHUP exit as SIGTERM
+ signal.signal(signal.SIGHUP, self.sigterm_exception)
+
+ def sigterm_exception(self, signum, stackframe):
+ if signum == signal.SIGTERM:
+ bb.warn("Worker recieved SIGTERM, shutting down...")
+ elif signum == signal.SIGHUP:
+ bb.warn("Worker recieved SIGHUP, shutting down...")
+ self.handle_finishnow(None)
+ signal.signal(signal.SIGTERM, signal.SIG_DFL)
+ os.kill(os.getpid(), signal.SIGTERM)
+
+ def serve(self):
+ while True:
+ (ready, _, _) = select.select([self.input] + [i.input for i in self.build_pipes.values()], [] , [], 1)
+ if self.input in ready:
+ try:
+ r = self.input.read()
+ if len(r) == 0:
+ # EOF on pipe, server must have terminated
+ self.sigterm_exception(signal.SIGTERM, None)
+ self.queue = self.queue + r
+ except (OSError, IOError):
+ pass
+ if len(self.queue):
+ self.handle_item("cookerconfig", self.handle_cookercfg)
+ self.handle_item("workerdata", self.handle_workerdata)
+ self.handle_item("runtask", self.handle_runtask)
+ self.handle_item("finishnow", self.handle_finishnow)
+ self.handle_item("ping", self.handle_ping)
+ self.handle_item("quit", self.handle_quit)
+
+ for pipe in self.build_pipes:
+ self.build_pipes[pipe].read()
+ if len(self.build_pids):
+ self.process_waitpid()
+ worker_flush()
+
+
+ def handle_item(self, item, func):
+ if self.queue.startswith("<" + item + ">"):
+ index = self.queue.find("</" + item + ">")
+ while index != -1:
+ func(self.queue[(len(item) + 2):index])
+ self.queue = self.queue[(index + len(item) + 3):]
+ index = self.queue.find("</" + item + ">")
+
+ def handle_cookercfg(self, data):
+ self.cookercfg = pickle.loads(data)
+ self.databuilder = bb.cookerdata.CookerDataBuilder(self.cookercfg, worker=True)
+ self.databuilder.parseBaseConfiguration()
+ self.data = self.databuilder.data
+
+ def handle_workerdata(self, data):
+ self.workerdata = pickle.loads(data)
+ bb.msg.loggerDefaultDebugLevel = self.workerdata["logdefaultdebug"]
+ bb.msg.loggerDefaultVerbose = self.workerdata["logdefaultverbose"]
+ bb.msg.loggerVerboseLogs = self.workerdata["logdefaultverboselogs"]
+ bb.msg.loggerDefaultDomains = self.workerdata["logdefaultdomain"]
+ self.data.setVar("PRSERV_HOST", self.workerdata["prhost"])
+
+ def handle_ping(self, _):
+ workerlog_write("Handling ping\n")
+
+ logger.warn("Pong from bitbake-worker!")
+
+ def handle_quit(self, data):
+ workerlog_write("Handling quit\n")
+
+ global normalexit
+ normalexit = True
+ sys.exit(0)
+
+ def handle_runtask(self, data):
+ fn, task, taskname, quieterrors, appends, taskdepdata = pickle.loads(data)
+ workerlog_write("Handling runtask %s %s %s\n" % (task, fn, taskname))
+
+ pid, pipein, pipeout = fork_off_task(self.cookercfg, self.data, self.workerdata, fn, task, taskname, appends, taskdepdata, quieterrors)
+
+ self.build_pids[pid] = task
+ self.build_pipes[pid] = runQueueWorkerPipe(pipein, pipeout)
+
+ def process_waitpid(self):
+ """
+        Return None if there are no processes awaiting result collection; otherwise
+        collect the process exit codes and close the information pipe.
+ """
+ try:
+ pid, status = os.waitpid(-1, os.WNOHANG)
+ if pid == 0 or os.WIFSTOPPED(status):
+ return None
+ except OSError:
+ return None
+
+ workerlog_write("Exit code of %s for pid %s\n" % (status, pid))
+
+ if os.WIFEXITED(status):
+ status = os.WEXITSTATUS(status)
+ elif os.WIFSIGNALED(status):
+ # Per shell conventions for $?, when a process exits due to
+ # a signal, we return an exit code of 128 + SIGNUM
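+            # (e.g. a child killed by SIGKILL, signal 9, is reported as 137)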
+ status = 128 + os.WTERMSIG(status)
+
+ task = self.build_pids[pid]
+ del self.build_pids[pid]
+
+ self.build_pipes[pid].close()
+ del self.build_pipes[pid]
+
+ worker_fire_prepickled("<exitcode>" + pickle.dumps((task, status)) + "</exitcode>")
+
+ def handle_finishnow(self, _):
+ if self.build_pids:
+ logger.info("Sending SIGTERM to remaining %s tasks", len(self.build_pids))
+ for k, v in self.build_pids.iteritems():
+ try:
+ os.kill(-k, signal.SIGTERM)
+ os.waitpid(-1, 0)
+ except:
+ pass
+ for pipe in self.build_pipes:
+ self.build_pipes[pipe].read()
+
+try:
+ worker = BitbakeWorker(sys.stdin)
+ if not profiling:
+ worker.serve()
+ else:
+ profname = "profile-worker.log"
+ prof = profile.Profile()
+ try:
+ profile.Profile.runcall(prof, worker.serve)
+ finally:
+ prof.dump_stats(profname)
+ bb.utils.process_profilelog(profname)
+except BaseException as e:
+ if not normalexit:
+ import traceback
+ sys.stderr.write(traceback.format_exc())
+ sys.stderr.write(str(e))
+while len(worker_queue):
+ worker_flush()
+workerlog_write("exitting")
+sys.exit(0)
+
diff --git a/bitbake/bin/bitdoc b/bitbake/bin/bitdoc
new file mode 100755
index 0000000..576d88b
--- /dev/null
+++ b/bitbake/bin/bitdoc
@@ -0,0 +1,531 @@
+#!/usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2005 Holger Hans Peter Freyther
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import optparse, os, sys
+
+# bitbake
+sys.path.append(os.path.join(os.path.dirname(os.path.dirname(__file__)), 'lib'))
+import bb
+import bb.parse
+from string import split, join
+
+__version__ = "0.0.2"
+
+class HTMLFormatter:
+ """
+    Simple class to help generate some sort of HTML files. It is
+    quite an inferior solution compared to docbook, gtkdoc, or doxygen, but it
+    should work for now.
+    We have a global introduction site (index.html), one site for
+    the list of keys (alphabetically sorted), one for the list of groups, and
+    one site for each key with links to its relations and groups.
+
+ index.html
+ all_keys.html
+ all_groups.html
+ groupNAME.html
+ keyNAME.html
+ """
+
+ def replace(self, text, *pairs):
+ """
+ From pydoc... almost identical at least
+ """
+ while pairs:
+ (a, b) = pairs[0]
+ text = join(split(text, a), b)
+ pairs = pairs[1:]
+ return text
+ def escape(self, text):
+ """
+        Escape a string so that it is valid HTML
+ """
+ return self.replace(text,
+                            ('&', '&amp;'),
+                            ('<', '&lt;' ),
+                            ('>', '&gt;' ) )
+ def createNavigator(self):
+ """
+        Create the navigator
+ """
+ return """<table class="navigation" width="100%" summary="Navigation header" cellpadding="2" cellspacing="2">
+<tr valign="middle">
+<td><a accesskey="g" href="index.html">Home</a></td>
+<td><a accesskey="n" href="all_groups.html">Groups</a></td>
+<td><a accesskey="u" href="all_keys.html">Keys</a></td>
+</tr></table>
+"""
+
+ def relatedKeys(self, item):
+ """
+ Create HTML to link to foreign keys
+ """
+
+ if len(item.related()) == 0:
+ return ""
+
+ txt = "<p><b>See also:</b><br>"
+ txts = []
+ for it in item.related():
+ txts.append("""<a href="key%(it)s.html">%(it)s</a>""" % vars() )
+
+ return txt + ",".join(txts)
+
+ def groups(self, item):
+ """
+ Create HTML to link to related groups
+ """
+
+ if len(item.groups()) == 0:
+ return ""
+
+
+ txt = "<p><b>See also:</b><br>"
+ txts = []
+ for group in item.groups():
+ txts.append( """<a href="group%s.html">%s</a> """ % (group, group) )
+
+ return txt + ",".join(txts)
+
+
+ def createKeySite(self, item):
+ """
+ Create a site for a key. It contains the header/navigator, a heading,
+ the description, links to related keys and to the groups.
+ """
+
+ return """<!doctype html PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
+<html><head><title>Key %s</title></head>
+<link rel="stylesheet" href="style.css" type="text/css">
+<body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF">
+%s
+<h2><span class="refentrytitle">%s</span></h2>
+
+<div class="refsynopsisdiv">
+<h2>Synopsis</h2>
+<p>
+%s
+</p>
+</div>
+
+<div class="refsynopsisdiv">
+<h2>Related Keys</h2>
+<p>
+%s
+</p>
+</div>
+
+<div class="refsynopsisdiv">
+<h2>Groups</h2>
+<p>
+%s
+</p>
+</div>
+
+
+</body>
+""" % (item.name(), self.createNavigator(), item.name(),
+ self.escape(item.description()), self.relatedKeys(item), self.groups(item))
+
+ def createGroupsSite(self, doc):
+ """
+ Create the Group Overview site
+ """
+
+ groups = ""
+ sorted_groups = sorted(doc.groups())
+ for group in sorted_groups:
+ groups += """<a href="group%s.html">%s</a><br>""" % (group, group)
+
+ return """<!doctype html PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
+<html><head><title>Group overview</title></head>
+<link rel="stylesheet" href="style.css" type="text/css">
+<body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF">
+%s
+<h2>Available Groups</h2>
+%s
+</body>
+""" % (self.createNavigator(), groups)
+
+ def createIndex(self):
+ """
+ Create the index file
+ """
+
+ return """<!doctype html PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
+<html><head><title>Bitbake Documentation</title></head>
+<link rel="stylesheet" href="style.css" type="text/css">
+<body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF">
+%s
+<h2>Documentation Entrance</h2>
+<a href="all_groups.html">All available groups</a><br>
+<a href="all_keys.html">All available keys</a><br>
+</body>
+""" % self.createNavigator()
+
+ def createKeysSite(self, doc):
+ """
+        Create an overview of all available keys
+ """
+ keys = ""
+ sorted_keys = sorted(doc.doc_keys())
+ for key in sorted_keys:
+ keys += """<a href="key%s.html">%s</a><br>""" % (key, key)
+
+ return """<!doctype html PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
+<html><head><title>Key overview</title></head>
+<link rel="stylesheet" href="style.css" type="text/css">
+<body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF">
+%s
+<h2>Available Keys</h2>
+%s
+</body>
+""" % (self.createNavigator(), keys)
+
+ def createGroupSite(self, gr, items, _description = None):
+ """
+ Create a site for a group:
+ Group the name of the group, items contain the name of the keys
+ inside this group
+ """
+ groups = ""
+ description = ""
+
+ # create a section with the group descriptions
+ if _description:
+ description += "<h2 Description of Grozp %s</h2>" % gr
+ description += _description
+
+ items.sort(lambda x, y:cmp(x.name(), y.name()))
+ for group in items:
+ groups += """<a href="key%s.html">%s</a><br>""" % (group.name(), group.name())
+
+ return """<!doctype html PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
+<html><head><title>Group %s</title></head>
+<link rel="stylesheet" href="style.css" type="text/css">
+<body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF">
+%s
+%s
+<div class="refsynopsisdiv">
+<h2>Keys in Group %s</h2>
+<pre class="synopsis">
+%s
+</pre>
+</div>
+</body>
+""" % (gr, self.createNavigator(), description, gr, groups)
+
+
+
+ def createCSS(self):
+ """
+ Create the CSS file
+ """
+ return """.synopsis, .classsynopsis
+{
+ background: #eeeeee;
+ border: solid 1px #aaaaaa;
+ padding: 0.5em;
+}
+.programlisting
+{
+ background: #eeeeff;
+ border: solid 1px #aaaaff;
+ padding: 0.5em;
+}
+.variablelist
+{
+ padding: 4px;
+ margin-left: 3em;
+}
+.variablelist td:first-child
+{
+ vertical-align: top;
+}
+table.navigation
+{
+ background: #ffeeee;
+ border: solid 1px #ffaaaa;
+ margin-top: 0.5em;
+ margin-bottom: 0.5em;
+}
+.navigation a
+{
+ color: #770000;
+}
+.navigation a:visited
+{
+ color: #550000;
+}
+.navigation .title
+{
+ font-size: 200%;
+}
+div.refnamediv
+{
+ margin-top: 2em;
+}
+div.gallery-float
+{
+ float: left;
+ padding: 10px;
+}
+div.gallery-float img
+{
+ border-style: none;
+}
+div.gallery-spacer
+{
+ clear: both;
+}
+a
+{
+ text-decoration: none;
+}
+a:hover
+{
+ text-decoration: underline;
+ color: #FF0000;
+}
+"""
+
+
+
+class DocumentationItem:
+ """
+ A class to hold information about a configuration
+ item. It contains the key name, description, a list of related names,
+ and the group this item is contained in.
+ """
+
+ def __init__(self):
+ self._groups = []
+ self._related = []
+ self._name = ""
+ self._desc = ""
+
+ def groups(self):
+ return self._groups
+
+ def name(self):
+ return self._name
+
+ def description(self):
+ return self._desc
+
+ def related(self):
+ return self._related
+
+ def setName(self, name):
+ self._name = name
+
+ def setDescription(self, desc):
+ self._desc = desc
+
+ def addGroup(self, group):
+ self._groups.append(group)
+
+ def addRelation(self, relation):
+ self._related.append(relation)
+
+ def sort(self):
+ self._related.sort()
+ self._groups.sort()
+
+
+class Documentation:
+ """
+ Holds the documentation... with mappings from key to items...
+ """
+
+ def __init__(self):
+ self.__keys = {}
+ self.__groups = {}
+
+ def insert_doc_item(self, item):
+ """
+        Insert the Doc Item into the internal representation
+ """
+ item.sort()
+ self.__keys[item.name()] = item
+
+ for group in item.groups():
+ if not group in self.__groups:
+ self.__groups[group] = []
+ self.__groups[group].append(item)
+ self.__groups[group].sort()
+
+
+ def doc_item(self, key):
+ """
+ Return the DocumentationInstance describing the key
+ """
+ try:
+ return self.__keys[key]
+ except KeyError:
+ return None
+
+ def doc_keys(self):
+ """
+ Return the documented KEYS (names)
+ """
+ return self.__keys.keys()
+
+ def groups(self):
+ """
+ Return the names of available groups
+ """
+ return self.__groups.keys()
+
+ def group_content(self, group_name):
+ """
+        Return a list of keys/names that are in a specific
+ group or the empty list
+ """
+ try:
+ return self.__groups[group_name]
+ except KeyError:
+ return []
+
+
+def parse_cmdline(args):
+ """
+    Parse the command line and return the result as an n-tuple
+ """
+
+ parser = optparse.OptionParser( version = "Bitbake Documentation Tool Core version %s, %%prog version %s" % (bb.__version__, __version__))
+ usage = """%prog [options]
+
+Create a set of html pages (documentation) for a bitbake.conf....
+"""
+
+ # Add the needed options
+ parser.add_option( "-c", "--config", help = "Use the specified configuration file as source",
+ action = "store", dest = "config", default = os.path.join("conf", "documentation.conf") )
+
+ parser.add_option( "-o", "--output", help = "Output directory for html files",
+ action = "store", dest = "output", default = "html/" )
+
+ parser.add_option( "-D", "--debug", help = "Increase the debug level",
+ action = "count", dest = "debug", default = 0 )
+
+ parser.add_option( "-v", "--verbose", help = "output more chit-char to the terminal",
+ action = "store_true", dest = "verbose", default = False )
+
+ options, args = parser.parse_args( sys.argv )
+
+ bb.msg.init_msgconfig(options.verbose, options.debug)
+
+ return options.config, options.output
+
+def main():
+ """
+ The main Method
+ """
+
+ (config_file, output_dir) = parse_cmdline( sys.argv )
+
+    # right, let us load the file now
+ try:
+ documentation = bb.parse.handle( config_file, bb.data.init() )
+ except IOError:
+ bb.fatal( "Unable to open %s" % config_file )
+ except bb.parse.ParseError:
+ bb.fatal( "Unable to parse %s" % config_file )
+
+ if isinstance(documentation, dict):
+ documentation = documentation[""]
+
+    # Assuming we have the file loaded now, we will initialize the 'tree'
+ doc = Documentation()
+
+ # defined states
+ state_begin = 0
+ state_see = 1
+ state_group = 2
+
+ for key in bb.data.keys(documentation):
+ data = documentation.getVarFlag(key, "doc")
+ if not data:
+ continue
+
+ # The Documentation now starts
+ doc_ins = DocumentationItem()
+ doc_ins.setName(key)
+
+
+ tokens = data.split(' ')
+ state = state_begin
+ string= ""
+ for token in tokens:
+ token = token.strip(',')
+
+ if not state == state_see and token == "@see":
+ state = state_see
+ continue
+ elif not state == state_group and token == "@group":
+ state = state_group
+ continue
+
+ if state == state_begin:
+ string += " %s" % token
+ elif state == state_see:
+ doc_ins.addRelation(token)
+ elif state == state_group:
+ doc_ins.addGroup(token)
+
+ # set the description
+ doc_ins.setDescription(string)
+ doc.insert_doc_item(doc_ins)
+
+ # let us create the HTML now
+ bb.utils.mkdirhier(output_dir)
+ os.chdir(output_dir)
+
+ # Let us create the sites now. We do it in the following order
+ # Start with the index.html. It will point to sites explaining all
+ # keys and groups
+ html_slave = HTMLFormatter()
+
+ f = file('style.css', 'w')
+ print >> f, html_slave.createCSS()
+
+ f = file('index.html', 'w')
+ print >> f, html_slave.createIndex()
+
+ f = file('all_groups.html', 'w')
+ print >> f, html_slave.createGroupsSite(doc)
+
+ f = file('all_keys.html', 'w')
+ print >> f, html_slave.createKeysSite(doc)
+
+ # now for each group create the site
+ for group in doc.groups():
+ f = file('group%s.html' % group, 'w')
+ print >> f, html_slave.createGroupSite(group, doc.group_content(group))
+
+ # now for the keys
+ for key in doc.doc_keys():
+ f = file('key%s.html' % doc.doc_item(key).name(), 'w')
+ print >> f, html_slave.createKeySite(doc.doc_item(key))
+
+
+if __name__ == "__main__":
+ main()
diff --git a/bitbake/bin/image-writer b/bitbake/bin/image-writer
new file mode 100755
index 0000000..7d71167
--- /dev/null
+++ b/bitbake/bin/image-writer
@@ -0,0 +1,122 @@
+#!/usr/bin/env python
+
+# Copyright (c) 2012 Wind River Systems, Inc.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
+# See the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program; if not, write to the Free Software
+# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+
+import os
+import sys
+sys.path.insert(0, os.path.join(os.path.dirname(os.path.dirname( \
+ os.path.abspath(__file__))), 'lib'))
+try:
+ import bb
+except RuntimeError as exc:
+ sys.exit(str(exc))
+
+import gtk
+import optparse
+import pygtk
+
+from bb.ui.crumbs.hobwidget import HobAltButton, HobButton
+from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
+from bb.ui.crumbs.hig.deployimagedialog import DeployImageDialog
+from bb.ui.crumbs.hig.imageselectiondialog import ImageSelectionDialog
+
+# All the image filesystem types supported by bitbake are listed here. More testing is needed.
+DEPLOYABLE_IMAGE_TYPES = ["jffs2", "cramfs", "ext2", "ext3", "ext4", "btrfs", "squashfs", "ubi", "vmdk"]
+Title = "USB Image Writer"
+
+class DeployWindow(gtk.Window):
+ def __init__(self, image_path=''):
+ super(DeployWindow, self).__init__()
+
+ if len(image_path) > 0:
+ valid = True
+ if not os.path.exists(image_path):
+ valid = False
+ lbl = "<b>Invalid image file path: %s.</b>\nPress <b>Select Image</b> to select an image." % image_path
+ else:
+ image_path = os.path.abspath(image_path)
+ extend_name = os.path.splitext(image_path)[1][1:]
+ if extend_name not in DEPLOYABLE_IMAGE_TYPES:
+ valid = False
+ lbl = "<b>Undeployable image type: %s</b>\nPress <b>Select Image</b> to select an image." % extend_name
+
+ if not valid:
+ image_path = ''
+ crumbs_dialog = CrumbsMessageDialog(self, lbl, gtk.STOCK_DIALOG_INFO)
+ button = crumbs_dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ crumbs_dialog.run()
+ crumbs_dialog.destroy()
+
+ self.deploy_dialog = DeployImageDialog(Title, image_path, self,
+ gtk.DIALOG_MODAL | gtk.DIALOG_DESTROY_WITH_PARENT
+ | gtk.DIALOG_NO_SEPARATOR, None, standalone=True)
+ close_button = self.deploy_dialog.add_button("Close", gtk.RESPONSE_NO)
+ HobAltButton.style_button(close_button)
+ close_button.connect('clicked', gtk.main_quit)
+
+ write_button = self.deploy_dialog.add_button("Write USB image", gtk.RESPONSE_YES)
+ HobAltButton.style_button(write_button)
+
+ self.deploy_dialog.connect('select_image_clicked', self.select_image_clicked_cb)
+ self.deploy_dialog.connect('destroy', gtk.main_quit)
+ response = self.deploy_dialog.show()
+
+ def select_image_clicked_cb(self, dialog):
+ cwd = os.getcwd()
+ dialog = ImageSelectionDialog(cwd, DEPLOYABLE_IMAGE_TYPES, Title, self, gtk.FILE_CHOOSER_ACTION_SAVE )
+ button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
+ HobAltButton.style_button(button)
+ button = dialog.add_button("Open", gtk.RESPONSE_YES)
+ HobAltButton.style_button(button)
+ response = dialog.run()
+
+ if response == gtk.RESPONSE_YES:
+ if not dialog.image_names:
+ lbl = "<b>No selections made</b>\nClick the radio button to select an image."
+ crumbs_dialog = CrumbsMessageDialog(self, lbl, gtk.STOCK_DIALOG_INFO)
+ button = crumbs_dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ crumbs_dialog.run()
+ crumbs_dialog.destroy()
+ dialog.destroy()
+ return
+
+ # get the full path of image
+ image_path = os.path.join(dialog.image_folder, dialog.image_names[0])
+ self.deploy_dialog.set_image_text_buffer(image_path)
+ self.deploy_dialog.set_image_path(image_path)
+
+ dialog.destroy()
+
+def main():
+ parser = optparse.OptionParser(
+ usage = """%prog [-h] [image_file]
+
+%prog writes bootable images to USB devices. You can
+provide the image file on the command line or select it using the GUI.""")
+
+ options, args = parser.parse_args(sys.argv)
+ image_file = args[1] if len(args) > 1 else ''
+ dw = DeployWindow(image_file)
+
+if __name__ == '__main__':
+ try:
+ main()
+ gtk.main()
+ except Exception:
+ import traceback
+ traceback.print_exc(3)
diff --git a/bitbake/bin/toaster b/bitbake/bin/toaster
new file mode 100755
index 0000000..411ce2c
--- /dev/null
+++ b/bitbake/bin/toaster
@@ -0,0 +1,352 @@
+#!/bin/sh
+# (c) 2013 Intel Corp.
+
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 2 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program; if not, write to the Free Software
+# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+
+
+# This script can be run in two modes.
+
+# When used with "source", from a build directory,
+# it enables toaster event logging and starts the bitbake resident server.
+# Use as: source toaster [start|stop] [noweb] [noui]
+
+# When it is called as a stand-alone script, it starts just the
+# web server, and builds are done through the web interface.
+# As a script, it will not return to the command prompt. Stop it with Ctrl-C.
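+
+# Illustrative invocations (the options shown are optional; see the
+# parameter handling further down in this script):
+#   source toaster start webport=8400
+#   ./bitbake/bin/toaster nobrowser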
+
+# Helper function to kill a background toaster development server
+
+webserverKillAll()
+{
+ local pidfile
+ for pidfile in ${BUILDDIR}/.toastermain.pid; do
+ if [ -f ${pidfile} ]; then
+ pid=`cat ${pidfile}`
+ while kill -0 $pid 2>/dev/null; do
+ kill -SIGTERM -$pid 2>/dev/null
+ sleep 1
+ # Kill processes if they are still running - may happen in interactive shells
+ ps fux | grep "python.*manage.py runserver" | awk '{print $2}' | xargs kill
+ done
+ rm ${pidfile}
+ fi
+ done
+}
+
+webserverStartAll()
+{
+ # do not start if toastermain points to a valid process
+ if ! cat "${BUILDDIR}/.toastermain.pid" 2>/dev/null | xargs -I{} kill -0 {} ; then
+ retval=1
+ rm "${BUILDDIR}/.toastermain.pid"
+ fi
+
+ retval=0
+ if [ "$TOASTER_MANAGED" '=' '1' ]; then
+ python $BBBASEDIR/lib/toaster/manage.py syncdb || retval=1
+ else
+ python $BBBASEDIR/lib/toaster/manage.py syncdb --noinput || retval=1
+ fi
+ python $BBBASEDIR/lib/toaster/manage.py migrate orm || retval=2
+ if [ $retval -eq 1 ]; then
+ echo "Failed db sync, stopping system start" 1>&2
+ elif [ $retval -eq 2 ]; then
+ printf "\nError on migration, trying to recover... \n"
+ python $BBBASEDIR/lib/toaster/manage.py migrate orm 0001_initial --fake
+ retval=0
+ python $BBBASEDIR/lib/toaster/manage.py migrate orm || retval=1
+ fi
+ if [ "$TOASTER_MANAGED" = '1' ]; then
+ python $BBBASEDIR/lib/toaster/manage.py migrate bldcontrol || retval=1
+ python $BBBASEDIR/lib/toaster/manage.py checksettings --traceback || retval=1
+ fi
+ if [ $retval -eq 0 ]; then
+ echo "Starting webserver..."
+ python $BBBASEDIR/lib/toaster/manage.py runserver "0.0.0.0:$WEB_PORT" </dev/null >>${BUILDDIR}/toaster_web.log 2>&1 & echo $! >${BUILDDIR}/.toastermain.pid
+ sleep 1
+ if ! cat "${BUILDDIR}/.toastermain.pid" | xargs -I{} kill -0 {} ; then
+ retval=1
+ rm "${BUILDDIR}/.toastermain.pid"
+ else
+ echo "Webserver address: http://0.0.0.0:$WEB_PORT/"
+ fi
+ fi
+ return $retval
+}
+
+# Helper functions to add a special configuration file
+
+addtoConfiguration()
+{
+ file=$1
+ shift
+ echo "#Created by toaster start script" > ${BUILDDIR}/conf/$file
+ for var in "$@"; do echo $var >> ${BUILDDIR}/conf/$file; done
+}
+
+INSTOPSYSTEM=0
+
+# define the stop command
+stop_system()
+{
+ # prevent reentry
+ if [ $INSTOPSYSTEM -eq 1 ]; then return; fi
+ INSTOPSYSTEM=1
+ if [ -f ${BUILDDIR}/.toasterui.pid ]; then
+ kill `cat ${BUILDDIR}/.toasterui.pid` 2>/dev/null
+ rm ${BUILDDIR}/.toasterui.pid
+ fi
+ BBSERVER=0.0.0.0:-1 bitbake -m
+ unset BBSERVER
+ webserverKillAll
+ # force stop any misbehaving bitbake server
+ lsof bitbake.lock | awk '{print $2}' | grep "[0-9]\+" | xargs -n1 -r kill
+ trap - SIGHUP
+ #trap - SIGCHLD
+ INSTOPSYSTEM=0
+}
+
+check_pidbyfile() {
+ [ -e $1 ] && kill -0 `cat $1` 2>/dev/null
+}
+
+
+notify_chldexit() {
+ if [ $NOTOASTERUI -eq 0 ]; then
+ check_pidbyfile ${BUILDDIR}/.toasterui.pid && return
+ stop_system
+ fi
+}
+
+
+verify_prereq() {
+ # Verify prerequisites
+
+ if ! echo "import django; print (1,) == django.VERSION[0:1] and django.VERSION[1:2][0] in (6,)" | python 2>/dev/null | grep True >/dev/null; then
+ printf "This program needs Django 1.6. Please install with\n\npip install django==1.6\n"
+ return 2
+ fi
+
+ if ! echo "import south; print reduce(lambda x, y: 2 if x==2 else 0 if x == 0 else y, map(lambda x: 1+cmp(x[1]-x[0],0), zip([0,8,4], map(int,south.__version__.split(\".\"))))) > 0" | python 2>/dev/null | grep True >/dev/null; then
+ printf "This program needs South 0.8.4. Please install with\n\npip install south==0.8.4\n"
+ return 2
+ fi
+ return 0
+}
+
+
+# read command line parameters
+if [ -n "$BASH_SOURCE" ] ; then
+ TOASTER=${BASH_SOURCE}
+elif [ -n "$ZSH_NAME" ] ; then
+ TOASTER=${(%):-%x}
+else
+ TOASTER=$0
+fi
+
+BBBASEDIR=`dirname $TOASTER`/..
+
+RUNNING=0
+
+NOTOASTERUI=0
+WEBSERVER=1
+TOASTER_BRBE=""
+WEB_PORT="8000"
+NOBROWSER=0
+
+for param in $*; do
+ case $param in
+ noui )
+ NOTOASTERUI=1
+ ;;
+ noweb )
+ WEBSERVER=0
+ ;;
+ nobrowser )
+ NOBROWSER=1
+ ;;
+ brbe=* )
+ TOASTER_BRBE=$'\n'"TOASTER_BRBE=\""${param#*=}"\""
+ ;;
+ webport=*)
+ WEB_PORT="${param#*=}"
+ esac
+done
+
+[ -n "${BASH_SOURCE}" ] && SRCFILE=${BASH_SOURCE} || SRCFILE=$_
+
+if [ `basename \"$0\"` = `basename \"${SRCFILE}\"` ]; then
+ # We are called as standalone. We refuse to run in a build environment - we need the interactive mode for that.
+ # Start just the web server, point the web browser to the interface, and start any Django services.
+
+ if ! verify_prereq; then
+ echo "Error: Could not verify that the needed dependencies are installed. Please use virtualenv and pip to install dependencies listed in toaster-requirements.txt" 1>&2
+ exit 1
+ fi
+
+ if [ -n "$BUILDDIR" ]; then
+ printf "Error: It looks like you sourced oe-init-build-env. Toaster cannot start in build mode from an oe-core build environment.\n You should be starting Toaster from a new terminal window." 1>&2
+ exit 1
+ fi
+
+ # Define a fake builddir where only the pid files are actually created. No real builds will take place here.
+ BUILDDIR=/tmp/toaster_$$
+ if [ -d "$BUILDDIR" ]; then
+ echo "Previous toaster run directory $BUILDDIR found, cowardly refusing to start. Please remove the directory once that toaster instance has finished" 1>&2
+ exit 1
+ fi
+
+ mkdir -p "$BUILDDIR"
+
+ RUNNING=1
+ trap_ctrlc() {
+ echo "** Stopping system"
+ webserverKillAll
+ RUNNING=0
+ }
+
+ do_cleanup() {
+ find "$BUILDDIR" -type f | xargs rm
+ rmdir "$BUILDDIR"
+ }
+ cleanup() {
+ if grep -ir error "$BUILDDIR" >/dev/null; then
+ if grep -irn "That port is already in use" "$BUILDDIR"; then
+ echo "You can use the \"webport=PORTNUMBER\" parameter to start Toaster on a different port (port $WEB_PORT is already in use)"
+ do_cleanup
+ else
+ printf "\nErrors found in the Toaster log files present in '$BUILDDIR'. Directory will not be cleaned.\n Please review the errors and notify toaster@yoctoproject.org or submit a bug https://bugzilla.yoctoproject.org/enter_bug.cgi?product=Toaster"
+ fi
+ else
+ echo "No errors found, removing the run directory '$BUILDDIR'"
+ do_cleanup
+ fi
+ }
+ TOASTER_MANAGED=1
+ export TOASTER_MANAGED=1
+ if [ $WEBSERVER -gt 0 ] && ! webserverStartAll; then
+ echo "Failed to start the web server, stopping" 1>&2
+ cleanup
+ exit 1
+ fi
+ if [ $WEBSERVER -gt 0 ] && [ $NOBROWSER -eq 0 ] ; then
+ echo "Starting browser..."
+ xdg-open http://127.0.0.1:$WEB_PORT/ >/dev/null 2>&1 &
+ fi
+ trap trap_ctrlc 2
+ echo "Toaster is now running. You can stop it with Ctrl-C"
+ while [ $RUNNING -gt 0 ]; do
+ python $BBBASEDIR/lib/toaster/manage.py runbuilds 2>&1 | tee -a "$BUILDDIR/toaster.log"
+ sleep 1
+ done
+ cleanup
+ echo "**** Exit"
+ exit 0
+fi
+
+
+if ! verify_prereq; then
+ echo "Error: Could not verify that the needed dependencies are installed. Please use virtualenv and pip to install dependencies listed in toaster-requirements.txt" 1>&2
+ return 1
+fi
+
+
+# We make sure we're running in the current shell and in a good environment
+if [ -z "$BUILDDIR" ] || ! which bitbake >/dev/null 2>&1 ; then
+ echo "Error: Build environment is not setup or bitbake is not in path." 1>&2
+ return 2
+fi
+
+
+# Determine the action. If specified by arguments, fine, if not, toggle it
+if [ "$1" = 'start' ] || [ "$1" = 'stop' ]; then
+ CMD="$1"
+else
+ if [ -z "$BBSERVER" ]; then
+ CMD="start"
+ else
+ CMD="stop"
+ fi
+fi
+
+echo "The system will $CMD."
+
+# Make sure it's safe to run by checking bitbake lock
+
+lock=1
+if [ -e $BUILDDIR/bitbake.lock ]; then
+ python -c "import fcntl; fcntl.flock(open(\"$BUILDDIR/bitbake.lock\"), fcntl.LOCK_EX|fcntl.LOCK_NB)" 2>/dev/null || lock=0
+fi
+
+if [ ${CMD} = 'start' ] && [ $lock -eq 0 ]; then
+ echo "Error: bitbake lock state error. File locks show that the system is already running." 1>&2
+ echo "Please wait for the current build to finish, stop and then start the system again." 1>&2
+ return 3
+fi
+
+if [ ${CMD} = 'start' ] && [ -e $BUILDDIR/.toastermain.pid ] && kill -0 `cat $BUILDDIR/.toastermain.pid`; then
+ echo "Warning: bitbake appears to be dead, but the Toaster web server is running. Something fishy is going on." 1>&2
+ echo "Cleaning up the web server to start from a clean slate."
+ webserverKillAll
+fi
+
+
+# Execute the commands
+
+case $CMD in
+ start )
+ start_success=1
+ addtoConfiguration toaster.conf "INHERIT+=\"toaster buildhistory\"" $TOASTER_BRBE
+ if [ $WEBSERVER -gt 0 ] && ! webserverStartAll; then
+ echo "Failed ${CMD}."
+ return 4
+ fi
+ unset BBSERVER
+ PREREAD=""
+ if [ -e ${BUILDDIR}/conf/toaster-pre.conf ]; then
+ rm ${BUILDDIR}/conf/toaster-pre.conf
+ fi
+ bitbake $PREREAD --postread conf/toaster.conf --server-only -t xmlrpc -B 0.0.0.0:0
+ if [ $? -ne 0 ]; then
+ start_success=0
+ echo "Bitbake server start failed"
+ else
+ export BBSERVER=0.0.0.0:-1
+ if [ $NOTOASTERUI -eq 0 ]; then # we start the TOASTERUI only if not inhibited
+ bitbake --observe-only -u toasterui >>${BUILDDIR}/toaster_ui.log 2>&1 & echo $! >${BUILDDIR}/.toasterui.pid
+ fi
+ fi
+ if [ $start_success -eq 1 ]; then
+ # set fail safe stop system on terminal exit
+ trap stop_system SIGHUP
+ echo "Successful ${CMD}."
+ return 0
+ else
+ # failed start, do stop
+ stop_system
+ echo "Failed ${CMD}."
+ return 1
+ fi
+ # stop system on terminal exit
+ set -o monitor
+ trap stop_system SIGHUP
+ #trap notify_chldexit SIGCHLD
+ ;;
+ stop )
+ stop_system
+ echo "Successful ${CMD}."
+ ;;
+esac
+
diff --git a/bitbake/bin/toaster-eventreplay b/bitbake/bin/toaster-eventreplay
new file mode 100755
index 0000000..615a7ae
--- /dev/null
+++ b/bitbake/bin/toaster-eventreplay
@@ -0,0 +1,174 @@
+#!/usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2014 Alex Damian
+#
+# This file re-uses code spread throughout other Bitbake source files.
+# As such, all other copyrights belong to their own right holders.
+#
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+
+# This command takes a filename as a single parameter. The filename is read
+# as a build eventlog, and the ToasterUI is used to process events in the file
+# and log data in the database
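+#
+# Example invocation (the log file name is illustrative):
+#   toaster-eventreplay event.log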
+
+from __future__ import print_function
+import os
+import sys, logging
+
+# mangle syspath to allow easy import of modules
+sys.path.insert(0, os.path.join(os.path.dirname(os.path.dirname(os.path.abspath(__file__))),
+ 'lib'))
+
+
+import bb.cooker
+from bb.ui import toasterui
+import sys
+import logging
+
+import json, pickle
+
+
+class FileReadEventsServerConnection():
+ """ Emulates a connection to a bitbake server, feeding events
+ that are actually read from a saved log file.
+ """
+
+ class MockConnection():
+ """ Fill-in for the proxy to the server. We just return generic data
+ """
+ def __init__(self, sc):
+ self._sc = sc
+
+ def runCommand(self, commandArray):
+ """ emulates running a command on the server; only read-only commands are accepted """
+ command_name = commandArray[0]
+
+ if command_name == "getVariable":
+ if commandArray[1] in self._sc._variables:
+ return (self._sc._variables[commandArray[1]]['v'], None)
+ return (None, "Missing variable")
+
+ elif command_name == "getAllKeysWithFlags":
+ dump = {}
+ flaglist = commandArray[1]
+ for k in self._sc._variables.keys():
+ try:
+ if not k.startswith("__"):
+ v = self._sc._variables[k]['v']
+ dump[k] = {
+ 'v' : v ,
+ 'history' : self._sc._variables[k]['history'],
+ }
+ for d in flaglist:
+ dump[k][d] = self._sc._variables[k][d]
+ except Exception as e:
+ print(e)
+ return (dump, None)
+ else:
+ raise Exception("Command %s not implemented" % commandArray[0])
+
+ def terminateServer(self):
+ """ do not do anything """
+ pass
+
+
+
+ class EventReader():
+ def __init__(self, sc):
+ self._sc = sc
+ self.firstraise = 0
+
+ def _create_event(self, line):
+ def _import_class(name):
+ assert len(name) > 0
+ assert "." in name, name
+
+ components = name.strip().split(".")
+ modulename = ".".join(components[:-1])
+ moduleklass = components[-1]
+
+ module = __import__(modulename, fromlist=[str(moduleklass)])
+ return getattr(module, moduleklass)
+
+ # we build a toaster event out of current event log line
+ try:
+ event_data = json.loads(line.strip())
+ event_class = _import_class(event_data['class'])
+ event_object = pickle.loads(json.loads(event_data['vars']))
+ except ValueError as e:
+ print("Failed loading ", line)
+ raise e
+
+ if not isinstance(event_object, event_class):
+ raise Exception("Error loading objects %s class %s" % (event_object, event_class))
+
+ return event_object
+
+ def waitEvent(self, timeout):
+
+ nextline = self._sc._eventfile.readline()
+ if len(nextline) == 0:
+ # the build data ended, while toasterui still waits for events.
+ # this happens when the server was abruptly stopped, so we simulate this
+ self.firstraise += 1
+ if self.firstraise == 1:
+ raise KeyboardInterrupt()
+ else:
+ return None
+ else:
+ self._sc.lineno += 1
+ return self._create_event(nextline)
+
+
+ def _readVariables(self, variableline):
+ self._variables = json.loads(variableline.strip())['allvariables']
+
+
+ def __init__(self, file_name):
+ self.connection = FileReadEventsServerConnection.MockConnection(self)
+ self._eventfile = open(file_name, "r")
+
+ # we expect to have the variable dump at the start of the file
+ self.lineno = 1
+ self._readVariables(self._eventfile.readline())
+
+ self.events = FileReadEventsServerConnection.EventReader(self)
+
+
+
+
+
+class MockConfigParameters():
+ """ Stand-in for cookerdata.ConfigParameters; as we don't really configure a cooker,
+ this serves just to supply the interfaces needed for the toaster UI to work """
+ def __init__(self):
+ self.observe_only = True # we can only read files
+
+
+# run toaster ui on our mock bitbake class
+if __name__ == "__main__":
+ if len(sys.argv) < 2:
+ print("Usage: %s event.log " % sys.argv[0])
+ sys.exit(1)
+
+ file_name = sys.argv[-1]
+ mock_connection = FileReadEventsServerConnection(file_name)
+ configParams = MockConfigParameters()
+
+ # run the main program and set exit code to the returned value
+ sys.exit(toasterui.main(mock_connection.connection, mock_connection.events, configParams))
diff --git a/bitbake/contrib/README b/bitbake/contrib/README
new file mode 100644
index 0000000..25e5156
--- /dev/null
+++ b/bitbake/contrib/README
@@ -0,0 +1 @@
+This directory is for additional contributed files which may be useful.
diff --git a/bitbake/contrib/bbdev.sh b/bitbake/contrib/bbdev.sh
new file mode 100644
index 0000000..33a7853
--- /dev/null
+++ b/bitbake/contrib/bbdev.sh
@@ -0,0 +1,31 @@
+# This is a shell function to be sourced into your shell or placed in your .profile,
+# which makes setting things up for BitBake a bit easier.
+#
+# The author disclaims copyright to the contents of this file and places it in the
+# public domain.
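+#
+# Illustrative usage (the paths are examples only):
+#   . ./bbdev.sh
+#   bbdev ~/src/bitbake ~/src/openembedded ~/build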
+
+bbdev () {
+ local BBDIR PKGDIR BUILDDIR
+ if test x"$1" = "x--help"; then echo >&2 "syntax: bbdev [bbdir [pkgdir [builddir]]]"; return 1; fi
+ if test x"$1" = x; then BBDIR=`pwd`; else BBDIR=$1; fi
+ if test x"$2" = x; then PKGDIR=`pwd`; else PKGDIR=$2; fi
+ if test x"$3" = x; then BUILDDIR=`pwd`; else BUILDDIR=$3; fi
+
+ BBDIR=`readlink -f $BBDIR`
+ PKGDIR=`readlink -f $PKGDIR`
+ BUILDDIR=`readlink -f $BUILDDIR`
+ if ! (test -d $BBDIR && test -d $PKGDIR && test -d $BUILDDIR); then
+ echo >&2 "syntax: bbdev [bbdir [pkgdir [builddir]]]"
+ return 1
+ fi
+
+ PATH=$BBDIR/bin:$PATH
+ BBPATH=$BBDIR
+ if test x"$BBDIR" != x"$PKGDIR"; then
+ BBPATH=$PKGDIR:$BBPATH
+ fi
+ if test x"$PKGDIR" != x"$BUILDDIR"; then
+ BBPATH=$BUILDDIR:$BBPATH
+ fi
+ export BBPATH
+}
diff --git a/bitbake/contrib/dump_cache.py b/bitbake/contrib/dump_cache.py
new file mode 100755
index 0000000..e1f2309
--- /dev/null
+++ b/bitbake/contrib/dump_cache.py
@@ -0,0 +1,68 @@
+#!/usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2012 Wind River Systems, Inc.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+#
+# This is used for dumping the bb_cache.dat, the output format is:
+# recipe_path PN PV PACKAGES
+#
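+# Example invocation (the cache file path is illustrative and depends on
+# your build configuration):
+#   ./dump_cache.py <builddir>/tmp/cache/bb_cache.dat
+#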
+import os
+import sys
+import warnings
+
+# For importing bb.cache
+sys.path.insert(0, os.path.join(os.path.abspath(os.path.dirname(sys.argv[0])), '../lib'))
+from bb.cache import CoreRecipeInfo
+
+import cPickle as pickle
+
+def main(argv=None):
+ """
+ Get the mapping for the target recipe.
+ """
+ if len(argv) != 1:
+ print >>sys.stderr, "Error, need one argument!"
+ return 2
+
+ cachefile = argv[0]
+
+ with open(cachefile, "rb") as cachefile:
+ pickled = pickle.Unpickler(cachefile)
+ while cachefile:
+ try:
+ key = pickled.load()
+ val = pickled.load()
+ except Exception:
+ break
+ if isinstance(val, CoreRecipeInfo) and (not val.skipped):
+ pn = val.pn
+ # Filter out the native recipes.
+ if key.startswith('virtual:native:') or pn.endswith("-native"):
+ continue
+
+ # 1.0 is the default version for a no PV recipe.
+ if val.__dict__.has_key("pv"):
+ pv = val.pv
+ else:
+ pv = "1.0"
+
+ print("%s %s %s %s" % (key, pn, pv, ' '.join(val.packages)))
+
+if __name__ == "__main__":
+ sys.exit(main(sys.argv[1:]))
+
diff --git a/bitbake/contrib/vim/ftdetect/bitbake.vim b/bitbake/contrib/vim/ftdetect/bitbake.vim
new file mode 100644
index 0000000..200f8ae
--- /dev/null
+++ b/bitbake/contrib/vim/ftdetect/bitbake.vim
@@ -0,0 +1,24 @@
+" Vim filetype detection file
+" Language: BitBake
+" Author: Ricardo Salveti <rsalveti@rsalveti.net>
+" Copyright: Copyright (C) 2008 Ricardo Salveti <rsalveti@rsalveti.net>
+" Licence: You may redistribute this under the same terms as Vim itself
+"
+" This sets up the syntax highlighting for BitBake files, like .bb, .bbclass and .inc
+
+if &compatible || version < 600
+ finish
+endif
+
+" .bb, .bbappend and .bbclass
+au BufNewFile,BufRead *.{bb,bbappend,bbclass} set filetype=bitbake
+
+" .inc
+au BufNewFile,BufRead *.inc set filetype=bitbake
+
+" .conf
+au BufNewFile,BufRead *.conf
+ \ if (match(expand("%:p:h"), "conf") > 0) |
+ \ set filetype=bitbake |
+ \ endif
+
diff --git a/bitbake/contrib/vim/ftplugin/bitbake.vim b/bitbake/contrib/vim/ftplugin/bitbake.vim
new file mode 100644
index 0000000..db0d753
--- /dev/null
+++ b/bitbake/contrib/vim/ftplugin/bitbake.vim
@@ -0,0 +1,2 @@
+set sts=4 sw=4 et
+set cms=#%s
diff --git a/bitbake/contrib/vim/plugin/newbb.vim b/bitbake/contrib/vim/plugin/newbb.vim
new file mode 100755
index 0000000..874e338
--- /dev/null
+++ b/bitbake/contrib/vim/plugin/newbb.vim
@@ -0,0 +1,84 @@
+" Vim plugin file
+" Purpose: Create a template for new bb files
+" Author: Ricardo Salveti <rsalveti@gmail.com>
+" Copyright: Copyright (C) 2008 Ricardo Salveti <rsalveti@gmail.com>
+"
+" This file is licensed under the MIT license, see COPYING.MIT in
+" this source distribution for the terms.
+"
+" Based on the gentoo-syntax package
+"
+" Will try to use git to find the user name and email
+
+if &compatible || v:version < 600
+ finish
+endif
+
+fun! <SID>GetUserName()
+ let l:user_name = system("git config --get user.name")
+ if v:shell_error
+ return "Unknown User"
+ else
+ return substitute(l:user_name, "\n", "", "")
+ endif
+endfun
+
+fun! <SID>GetUserEmail()
+ let l:user_email = system("git config --get user.email")
+ if v:shell_error
+ return "unknown@user.org"
+ else
+ return substitute(l:user_email, "\n", "", "")
+ endif
+endfun
+
+fun! BBHeader()
+ let l:current_year = strftime("%Y")
+ let l:user_name = <SID>GetUserName()
+ let l:user_email = <SID>GetUserEmail()
+ 0 put ='# Copyright (C) ' . l:current_year .
+ \ ' ' . l:user_name . ' <' . l:user_email . '>'
+ put ='# Released under the MIT license (see COPYING.MIT for the terms)'
+ $
+endfun
+
+fun! NewBBTemplate()
+ let l:paste = &paste
+ set nopaste
+
+ " Get the header
+ call BBHeader()
+
+ " Write the new bb template
+ put ='DESCRIPTION = \"\"'
+ put ='HOMEPAGE = \"\"'
+ put ='LICENSE = \"\"'
+ put ='SECTION = \"\"'
+ put ='DEPENDS = \"\"'
+ put =''
+ put ='SRC_URI = \"\"'
+
+ " Go to the first place to edit
+ 0
+ /^DESCRIPTION =/
+ exec "normal 2f\""
+
+ if paste == 1
+ set paste
+ endif
+endfun
+
+if !exists("g:bb_create_on_empty")
+ let g:bb_create_on_empty = 1
+endif
+
+" disable in case of vimdiff
+if v:progname =~ "vimdiff"
+ let g:bb_create_on_empty = 0
+endif
+
+augroup NewBB
+ au BufNewFile *.bb
+ \ if g:bb_create_on_empty |
+ \ call NewBBTemplate() |
+ \ endif
+augroup END
+
diff --git a/bitbake/contrib/vim/syntax/bitbake.vim b/bitbake/contrib/vim/syntax/bitbake.vim
new file mode 100644
index 0000000..fb55f91
--- /dev/null
+++ b/bitbake/contrib/vim/syntax/bitbake.vim
@@ -0,0 +1,126 @@
+" Vim syntax file
+" Language: BitBake bb/bbclasses/inc
+" Author: Chris Larson <kergoth@handhelds.org>
+" Ricardo Salveti <rsalveti@rsalveti.net>
+" Copyright: Copyright (C) 2004 Chris Larson <kergoth@handhelds.org>
+" Copyright (C) 2008 Ricardo Salveti <rsalveti@rsalveti.net>
+"
+" This file is licensed under the MIT license, see COPYING.MIT in
+" this source distribution for the terms.
+"
+" Syntax highlighting for bb, bbclasses and inc files.
+"
+" It's an entirely new type, just has specific syntax in shell and python code
+
+if &compatible || v:version < 600
+ finish
+endif
+if exists("b:current_syntax")
+ finish
+endif
+
+syn include @python syntax/python.vim
+if exists("b:current_syntax")
+ unlet b:current_syntax
+endif
+
+" BitBake syntax
+
+" Matching case
+syn case match
+
+" Indicates the error when nothing is matched
+syn match bbUnmatched "."
+
+" Comments
+syn cluster bbCommentGroup contains=bbTodo,@Spell
+syn keyword bbTodo COMBAK FIXME TODO XXX contained
+syn match bbComment "#.*$" contains=@bbCommentGroup
+
+" String helpers
+syn match bbQuote +['"]+ contained
+syn match bbDelimiter "[(){}=]" contained
+syn match bbArrayBrackets "[\[\]]" contained
+
+" BitBake strings
+syn match bbContinue "\\$"
+syn region bbString matchgroup=bbQuote start=+"+ skip=+\\$+ end=+"+ contained contains=bbTodo,bbContinue,bbVarDeref,bbVarPyValue,@Spell
+syn region bbString matchgroup=bbQuote start=+'+ skip=+\\$+ end=+'+ contained contains=bbTodo,bbContinue,bbVarDeref,bbVarPyValue,@Spell
+
+" Vars definition
+syn match bbExport "^export" nextgroup=bbIdentifier skipwhite
+syn keyword bbExportFlag export contained nextgroup=bbIdentifier skipwhite
+syn match bbIdentifier "[a-zA-Z0-9\-_\.\/\+]\+" display contained
+syn match bbVarDeref "${[a-zA-Z0-9\-_\.\/\+]\+}" contained
+syn match bbVarEq "\(:=\|+=\|=+\|\.=\|=\.\|?=\|??=\|=\)" contained nextgroup=bbVarValue
+syn match bbVarDef "^\(export\s*\)\?\([a-zA-Z0-9\-_\.\/\+]\+\(_[${}a-zA-Z0-9\-_\.\/\+]\+\)\?\)\s*\(:=\|+=\|=+\|\.=\|=\.\|?=\|??=\|=\)\@=" contains=bbExportFlag,bbIdentifier,bbVarDeref nextgroup=bbVarEq
+syn match bbVarValue ".*$" contained contains=bbString,bbVarDeref,bbVarPyValue
+syn region bbVarPyValue start=+${@+ skip=+\\$+ end=+}+ contained contains=@python
+
+" Vars metadata flags
+syn match bbVarFlagDef "^\([a-zA-Z0-9\-_\.]\+\)\(\[[a-zA-Z0-9\-_\.]\+\]\)\@=" contains=bbIdentifier nextgroup=bbVarFlagFlag
+syn region bbVarFlagFlag matchgroup=bbArrayBrackets start="\[" end="\]\s*\(=\|+=\|=+\|?=\)\@=" contained contains=bbIdentifier nextgroup=bbVarEq
+
+" Includes and requires
+syn keyword bbInclude inherit include require contained
+syn match bbIncludeRest ".*$" contained contains=bbString,bbVarDeref
+syn match bbIncludeLine "^\(inherit\|include\|require\)\s\+" contains=bbInclude nextgroup=bbIncludeRest
+
+" Add tasks and similar
+syn keyword bbStatement addtask addhandler after before EXPORT_FUNCTIONS contained
+syn match bbStatementRest ".*$" skipwhite contained contains=bbStatement
+syn match bbStatementLine "^\(addtask\|addhandler\|after\|before\|EXPORT_FUNCTIONS\)\s\+" contains=bbStatement nextgroup=bbStatementRest
+
+" OE Important Functions
+syn keyword bbOEFunctions do_fetch do_unpack do_patch do_configure do_compile do_stage do_install do_package contained
+
+" Generic Functions
+syn match bbFunction "\h[0-9A-Za-z_-]*" display contained contains=bbOEFunctions
+
+" BitBake shell metadata
+syn include @shell syntax/sh.vim
+if exists("b:current_syntax")
+ unlet b:current_syntax
+endif
+syn keyword bbShFakeRootFlag fakeroot contained
+syn match bbShFuncDef "^\(fakeroot\s*\)\?\([0-9A-Za-z_${}-]\+\)\(python\)\@<!\(\s*()\s*\)\({\)\@=" contains=bbShFakeRootFlag,bbFunction,bbVarDeref,bbDelimiter nextgroup=bbShFuncRegion skipwhite
+syn region bbShFuncRegion matchgroup=bbDelimiter start="{\s*$" end="^}\s*$" contained contains=@shell
+
+" Python value inside shell functions
+syn region shDeref start=+${@+ skip=+\\$+ excludenl end=+}+ contained contains=@python
+
+" BitBake python metadata
+syn keyword bbPyFlag python contained
+syn match bbPyFuncDef "^\(python\s\+\)\([0-9A-Za-z_${}-]\+\)\?\(\s*()\s*\)\({\)\@=" contains=bbPyFlag,bbFunction,bbVarDeref,bbDelimiter nextgroup=bbPyFuncRegion skipwhite
+syn region bbPyFuncRegion matchgroup=bbDelimiter start="{\s*$" end="^}\s*$" contained contains=@python
+
+" BitBake 'def'd python functions
+syn keyword bbPyDef def contained
+syn region bbPyDefRegion start='^\(def\s\+\)\([0-9A-Za-z_-]\+\)\(\s*(.*)\s*\):\s*$' end='^\(\s\|$\)\@!' contains=@python
+
+" Highlighting Definitions
+hi def link bbUnmatched Error
+hi def link bbInclude Include
+hi def link bbTodo Todo
+hi def link bbComment Comment
+hi def link bbQuote String
+hi def link bbString String
+hi def link bbDelimiter Keyword
+hi def link bbArrayBrackets Statement
+hi def link bbContinue Special
+hi def link bbExport Type
+hi def link bbExportFlag Type
+hi def link bbIdentifier Identifier
+hi def link bbVarDeref PreProc
+hi def link bbVarDef Identifier
+hi def link bbVarValue String
+hi def link bbShFakeRootFlag Type
+hi def link bbFunction Function
+hi def link bbPyFlag Type
+hi def link bbPyDef Statement
+hi def link bbStatement Statement
+hi def link bbStatementRest Identifier
+hi def link bbOEFunctions Special
+hi def link bbVarPyValue PreProc
+
+let b:current_syntax = "bb"
diff --git a/bitbake/doc/COPYING.GPL b/bitbake/doc/COPYING.GPL
new file mode 100644
index 0000000..d511905
--- /dev/null
+++ b/bitbake/doc/COPYING.GPL
@@ -0,0 +1,339 @@
+ GNU GENERAL PUBLIC LICENSE
+ Version 2, June 1991
+
+ Copyright (C) 1989, 1991 Free Software Foundation, Inc.,
+ 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ Everyone is permitted to copy and distribute verbatim copies
+ of this license document, but changing it is not allowed.
+
+ Preamble
+
+ The licenses for most software are designed to take away your
+freedom to share and change it. By contrast, the GNU General Public
+License is intended to guarantee your freedom to share and change free
+software--to make sure the software is free for all its users. This
+General Public License applies to most of the Free Software
+Foundation's software and to any other program whose authors commit to
+using it. (Some other Free Software Foundation software is covered by
+the GNU Lesser General Public License instead.) You can apply it to
+your programs, too.
+
+ When we speak of free software, we are referring to freedom, not
+price. Our General Public Licenses are designed to make sure that you
+have the freedom to distribute copies of free software (and charge for
+this service if you wish), that you receive source code or can get it
+if you want it, that you can change the software or use pieces of it
+in new free programs; and that you know you can do these things.
+
+ To protect your rights, we need to make restrictions that forbid
+anyone to deny you these rights or to ask you to surrender the rights.
+These restrictions translate to certain responsibilities for you if you
+distribute copies of the software, or if you modify it.
+
+ For example, if you distribute copies of such a program, whether
+gratis or for a fee, you must give the recipients all the rights that
+you have. You must make sure that they, too, receive or can get the
+source code. And you must show them these terms so they know their
+rights.
+
+ We protect your rights with two steps: (1) copyright the software, and
+(2) offer you this license which gives you legal permission to copy,
+distribute and/or modify the software.
+
+ Also, for each author's protection and ours, we want to make certain
+that everyone understands that there is no warranty for this free
+software. If the software is modified by someone else and passed on, we
+want its recipients to know that what they have is not the original, so
+that any problems introduced by others will not reflect on the original
+authors' reputations.
+
+ Finally, any free program is threatened constantly by software
+patents. We wish to avoid the danger that redistributors of a free
+program will individually obtain patent licenses, in effect making the
+program proprietary. To prevent this, we have made it clear that any
+patent must be licensed for everyone's free use or not licensed at all.
+
+ The precise terms and conditions for copying, distribution and
+modification follow.
+
+ GNU GENERAL PUBLIC LICENSE
+ TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
+
+ 0. This License applies to any program or other work which contains
+a notice placed by the copyright holder saying it may be distributed
+under the terms of this General Public License. The "Program", below,
+refers to any such program or work, and a "work based on the Program"
+means either the Program or any derivative work under copyright law:
+that is to say, a work containing the Program or a portion of it,
+either verbatim or with modifications and/or translated into another
+language. (Hereinafter, translation is included without limitation in
+the term "modification".) Each licensee is addressed as "you".
+
+Activities other than copying, distribution and modification are not
+covered by this License; they are outside its scope. The act of
+running the Program is not restricted, and the output from the Program
+is covered only if its contents constitute a work based on the
+Program (independent of having been made by running the Program).
+Whether that is true depends on what the Program does.
+
+ 1. You may copy and distribute verbatim copies of the Program's
+source code as you receive it, in any medium, provided that you
+conspicuously and appropriately publish on each copy an appropriate
+copyright notice and disclaimer of warranty; keep intact all the
+notices that refer to this License and to the absence of any warranty;
+and give any other recipients of the Program a copy of this License
+along with the Program.
+
+You may charge a fee for the physical act of transferring a copy, and
+you may at your option offer warranty protection in exchange for a fee.
+
+ 2. You may modify your copy or copies of the Program or any portion
+of it, thus forming a work based on the Program, and copy and
+distribute such modifications or work under the terms of Section 1
+above, provided that you also meet all of these conditions:
+
+ a) You must cause the modified files to carry prominent notices
+ stating that you changed the files and the date of any change.
+
+ b) You must cause any work that you distribute or publish, that in
+ whole or in part contains or is derived from the Program or any
+ part thereof, to be licensed as a whole at no charge to all third
+ parties under the terms of this License.
+
+ c) If the modified program normally reads commands interactively
+ when run, you must cause it, when started running for such
+ interactive use in the most ordinary way, to print or display an
+ announcement including an appropriate copyright notice and a
+ notice that there is no warranty (or else, saying that you provide
+ a warranty) and that users may redistribute the program under
+ these conditions, and telling the user how to view a copy of this
+ License. (Exception: if the Program itself is interactive but
+ does not normally print such an announcement, your work based on
+ the Program is not required to print an announcement.)
+
+These requirements apply to the modified work as a whole. If
+identifiable sections of that work are not derived from the Program,
+and can be reasonably considered independent and separate works in
+themselves, then this License, and its terms, do not apply to those
+sections when you distribute them as separate works. But when you
+distribute the same sections as part of a whole which is a work based
+on the Program, the distribution of the whole must be on the terms of
+this License, whose permissions for other licensees extend to the
+entire whole, and thus to each and every part regardless of who wrote it.
+
+Thus, it is not the intent of this section to claim rights or contest
+your rights to work written entirely by you; rather, the intent is to
+exercise the right to control the distribution of derivative or
+collective works based on the Program.
+
+In addition, mere aggregation of another work not based on the Program
+with the Program (or with a work based on the Program) on a volume of
+a storage or distribution medium does not bring the other work under
+the scope of this License.
+
+ 3. You may copy and distribute the Program (or a work based on it,
+under Section 2) in object code or executable form under the terms of
+Sections 1 and 2 above provided that you also do one of the following:
+
+ a) Accompany it with the complete corresponding machine-readable
+ source code, which must be distributed under the terms of Sections
+ 1 and 2 above on a medium customarily used for software interchange; or,
+
+ b) Accompany it with a written offer, valid for at least three
+ years, to give any third party, for a charge no more than your
+ cost of physically performing source distribution, a complete
+ machine-readable copy of the corresponding source code, to be
+ distributed under the terms of Sections 1 and 2 above on a medium
+ customarily used for software interchange; or,
+
+ c) Accompany it with the information you received as to the offer
+ to distribute corresponding source code. (This alternative is
+ allowed only for noncommercial distribution and only if you
+ received the program in object code or executable form with such
+ an offer, in accord with Subsection b above.)
+
+The source code for a work means the preferred form of the work for
+making modifications to it. For an executable work, complete source
+code means all the source code for all modules it contains, plus any
+associated interface definition files, plus the scripts used to
+control compilation and installation of the executable. However, as a
+special exception, the source code distributed need not include
+anything that is normally distributed (in either source or binary
+form) with the major components (compiler, kernel, and so on) of the
+operating system on which the executable runs, unless that component
+itself accompanies the executable.
+
+If distribution of executable or object code is made by offering
+access to copy from a designated place, then offering equivalent
+access to copy the source code from the same place counts as
+distribution of the source code, even though third parties are not
+compelled to copy the source along with the object code.
+
+ 4. You may not copy, modify, sublicense, or distribute the Program
+except as expressly provided under this License. Any attempt
+otherwise to copy, modify, sublicense or distribute the Program is
+void, and will automatically terminate your rights under this License.
+However, parties who have received copies, or rights, from you under
+this License will not have their licenses terminated so long as such
+parties remain in full compliance.
+
+ 5. You are not required to accept this License, since you have not
+signed it. However, nothing else grants you permission to modify or
+distribute the Program or its derivative works. These actions are
+prohibited by law if you do not accept this License. Therefore, by
+modifying or distributing the Program (or any work based on the
+Program), you indicate your acceptance of this License to do so, and
+all its terms and conditions for copying, distributing or modifying
+the Program or works based on it.
+
+ 6. Each time you redistribute the Program (or any work based on the
+Program), the recipient automatically receives a license from the
+original licensor to copy, distribute or modify the Program subject to
+these terms and conditions. You may not impose any further
+restrictions on the recipients' exercise of the rights granted herein.
+You are not responsible for enforcing compliance by third parties to
+this License.
+
+ 7. If, as a consequence of a court judgment or allegation of patent
+infringement or for any other reason (not limited to patent issues),
+conditions are imposed on you (whether by court order, agreement or
+otherwise) that contradict the conditions of this License, they do not
+excuse you from the conditions of this License. If you cannot
+distribute so as to satisfy simultaneously your obligations under this
+License and any other pertinent obligations, then as a consequence you
+may not distribute the Program at all. For example, if a patent
+license would not permit royalty-free redistribution of the Program by
+all those who receive copies directly or indirectly through you, then
+the only way you could satisfy both it and this License would be to
+refrain entirely from distribution of the Program.
+
+If any portion of this section is held invalid or unenforceable under
+any particular circumstance, the balance of the section is intended to
+apply and the section as a whole is intended to apply in other
+circumstances.
+
+It is not the purpose of this section to induce you to infringe any
+patents or other property right claims or to contest validity of any
+such claims; this section has the sole purpose of protecting the
+integrity of the free software distribution system, which is
+implemented by public license practices. Many people have made
+generous contributions to the wide range of software distributed
+through that system in reliance on consistent application of that
+system; it is up to the author/donor to decide if he or she is willing
+to distribute software through any other system and a licensee cannot
+impose that choice.
+
+This section is intended to make thoroughly clear what is believed to
+be a consequence of the rest of this License.
+
+ 8. If the distribution and/or use of the Program is restricted in
+certain countries either by patents or by copyrighted interfaces, the
+original copyright holder who places the Program under this License
+may add an explicit geographical distribution limitation excluding
+those countries, so that distribution is permitted only in or among
+countries not thus excluded. In such case, this License incorporates
+the limitation as if written in the body of this License.
+
+ 9. The Free Software Foundation may publish revised and/or new versions
+of the General Public License from time to time. Such new versions will
+be similar in spirit to the present version, but may differ in detail to
+address new problems or concerns.
+
+Each version is given a distinguishing version number. If the Program
+specifies a version number of this License which applies to it and "any
+later version", you have the option of following the terms and conditions
+either of that version or of any later version published by the Free
+Software Foundation. If the Program does not specify a version number of
+this License, you may choose any version ever published by the Free Software
+Foundation.
+
+ 10. If you wish to incorporate parts of the Program into other free
+programs whose distribution conditions are different, write to the author
+to ask for permission. For software which is copyrighted by the Free
+Software Foundation, write to the Free Software Foundation; we sometimes
+make exceptions for this. Our decision will be guided by the two goals
+of preserving the free status of all derivatives of our free software and
+of promoting the sharing and reuse of software generally.
+
+ NO WARRANTY
+
+ 11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
+FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
+OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
+PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
+OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
+MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
+TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
+PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
+REPAIR OR CORRECTION.
+
+ 12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
+WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
+REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
+INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
+OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
+TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
+YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
+PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
+POSSIBILITY OF SUCH DAMAGES.
+
+ END OF TERMS AND CONDITIONS
+
+ How to Apply These Terms to Your New Programs
+
+ If you develop a new program, and you want it to be of the greatest
+possible use to the public, the best way to achieve this is to make it
+free software which everyone can redistribute and change under these terms.
+
+ To do so, attach the following notices to the program. It is safest
+to attach them to the start of each source file to most effectively
+convey the exclusion of warranty; and each file should have at least
+the "copyright" line and a pointer to where the full notice is found.
+
+ <one line to give the program's name and a brief idea of what it does.>
+ Copyright (C) <year> <name of author>
+
+ This program is free software; you can redistribute it and/or modify
+ it under the terms of the GNU General Public License as published by
+ the Free Software Foundation; either version 2 of the License, or
+ (at your option) any later version.
+
+ This program is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+ GNU General Public License for more details.
+
+ You should have received a copy of the GNU General Public License along
+ with this program; if not, write to the Free Software Foundation, Inc.,
+ 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+Also add information on how to contact you by electronic and paper mail.
+
+If the program is interactive, make it output a short notice like this
+when it starts in an interactive mode:
+
+ Gnomovision version 69, Copyright (C) year name of author
+ Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
+ This is free software, and you are welcome to redistribute it
+ under certain conditions; type `show c' for details.
+
+The hypothetical commands `show w' and `show c' should show the appropriate
+parts of the General Public License. Of course, the commands you use may
+be called something other than `show w' and `show c'; they could even be
+mouse-clicks or menu items--whatever suits your program.
+
+You should also get your employer (if you work as a programmer) or your
+school, if any, to sign a "copyright disclaimer" for the program, if
+necessary. Here is a sample; alter the names:
+
+ Yoyodyne, Inc., hereby disclaims all copyright interest in the program
+ `Gnomovision' (which makes passes at compilers) written by James Hacker.
+
+ <signature of Ty Coon>, 1 April 1989
+ Ty Coon, President of Vice
+
+This General Public License does not permit incorporating your program into
+proprietary programs. If your program is a subroutine library, you may
+consider it more useful to permit linking proprietary applications with the
+library. If this is what you want to do, use the GNU Lesser General
+Public License instead of this License.
diff --git a/bitbake/doc/COPYING.MIT b/bitbake/doc/COPYING.MIT
new file mode 100644
index 0000000..7e7d574
--- /dev/null
+++ b/bitbake/doc/COPYING.MIT
@@ -0,0 +1,17 @@
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
+SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR
+THE USE OR OTHER DEALINGS IN THE SOFTWARE.
diff --git a/bitbake/doc/Makefile b/bitbake/doc/Makefile
new file mode 100644
index 0000000..3c28f4b
--- /dev/null
+++ b/bitbake/doc/Makefile
@@ -0,0 +1,91 @@
+# This is a single Makefile to handle all generated BitBake documents.
+# The Makefile needs to live in the documentation directory and all figures used
+# in any manuals must be .PNG files and live in the individual book's figures
+# directory.
+#
+# The Makefile has these targets:
+#
+# pdf: generates a PDF version of a manual.
+# html: generates an HTML version of a manual.
+# tarball: creates a tarball for the doc files.
+# validate: validates the manual's XML sources.
+# clean: removes generated files.
+#
+# The Makefile generates an HTML version of every document. The
+# variable DOC indicates the folder name for a given manual.
+#
+# To build a manual, you must invoke 'make' with the DOC argument.
+#
+# Examples:
+#
+# make DOC=bitbake-user-manual
+# make pdf DOC=bitbake-user-manual
+#
+# The first example generates the HTML version of the User Manual.
+# The second example generates the PDF version of the User Manual.
+#
+
+ifeq ($(DOC),bitbake-user-manual)
+XSLTOPTS = --stringparam html.stylesheet bitbake-user-manual-style.css \
+ --stringparam chapter.autolabel 1 \
+ --stringparam section.autolabel 1 \
+ --stringparam section.label.includes.component.label 1 \
+ --xinclude
+ALLPREQ = html tarball
+TARFILES = bitbake-user-manual-style.css bitbake-user-manual.html figures/bitbake-title.png
+MANUALS = $(DOC)/$(DOC).html
+FIGURES = figures
+STYLESHEET = $(DOC)/*.css
+
+endif
+
+##
+# These URIs should be rewritten by your distribution's XML catalog to
+# match your locally installed XSL stylesheets.
+XSL_BASE_URI = http://docbook.sourceforge.net/release/xsl/current
+XSL_XHTML_URI = $(XSL_BASE_URI)/xhtml/docbook.xsl
+
+all: $(ALLPREQ)
+
+pdf:
+ifeq ($(DOC),bitbake-user-manual)
+ @echo " "
+ @echo "********** Building "$(DOC)
+ @echo " "
+ cd $(DOC); ../tools/docbook-to-pdf $(DOC).xml ../template; cd ..
+endif
+
+html:
+ifeq ($(DOC),bitbake-user-manual)
+# See http://www.sagehill.net/docbookxsl/HtmlOutput.html
+ @echo " "
+ @echo "******** Building "$(DOC)
+ @echo " "
+ cd $(DOC); xsltproc $(XSLTOPTS) -o $(DOC).html $(DOC)-customization.xsl $(DOC).xml; cd ..
+endif
+
+tarball: html
+ @echo " "
+ @echo "******** Creating Tarball of document files"
+ @echo " "
+ cd $(DOC); tar -cvzf $(DOC).tgz $(TARFILES); cd ..
+
+validate:
+ cd $(DOC); xmllint --postvalid --xinclude --noout $(DOC).xml; cd ..
+
+publish:
+ @if test -f $(DOC)/$(DOC).html; \
+ then \
+ echo " "; \
+ echo "******** Publishing "$(DOC)".html"; \
+ echo " "; \
+ scp -r $(MANUALS) $(STYLESHEET) docs.yp:/var/www/www.yoctoproject.org-docs/$(VER)/$(DOC); \
+ cd $(DOC); scp -r $(FIGURES) docs.yp:/var/www/www.yoctoproject.org-docs/$(VER)/$(DOC); \
+ else \
+ echo " "; \
+ echo $(DOC)".html missing. Generate the file first then try again."; \
+ echo " "; \
+ fi
+
+clean:
+ rm -rf $(MANUALS); rm $(DOC)/$(DOC).tgz;
diff --git a/bitbake/doc/README b/bitbake/doc/README
new file mode 100644
index 0000000..303cf8e
--- /dev/null
+++ b/bitbake/doc/README
@@ -0,0 +1,39 @@
+Documentation
+=============
+
+This is the directory that contains the BitBake documentation.
+
+Manual Organization
+===================
+
+Folders exist for individual manuals as follows:
+
+* bitbake-user-manual - The BitBake User Manual
+
+Each folder is self-contained regarding content and figures.
+
+If you want to find HTML versions of the BitBake manuals on the web,
+go to http://www.openembedded.org/wiki/Documentation.
+
+Makefile
+========
+
+The Makefile processes manual directories to create HTML, PDF,
+tarballs, etc. Details on how the Makefile works are documented
+inside the Makefile. See that file for more information.
+
+To build a manual, you run the make command and pass it the name
+of the folder containing the manual's contents.
+For example, the following command, run from the documentation directory,
+creates an HTML version of the BitBake User Manual.
+The DOC variable specifies the manual you are making:
+
+ $ make DOC=bitbake-user-manual
+
+template
+========
+Contains various templates, fonts, and some old PNG files.
+
+tools
+=====
+Contains a tool to convert the DocBook files to PDF format.
diff --git a/bitbake/doc/bitbake-user-manual/bitbake-user-manual-customization.xsl b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-customization.xsl
new file mode 100644
index 0000000..5985ea7
--- /dev/null
+++ b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-customization.xsl
@@ -0,0 +1,29 @@
+<?xml version='1.0'?>
+<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns="http://www.w3.org/1999/xhtml" xmlns:fo="http://www.w3.org/1999/XSL/Format" version="1.0">
+
+ <xsl:import href="http://downloads.yoctoproject.org/mirror/docbook-mirror/docbook-xsl-1.76.1/xhtml/docbook.xsl" />
+
+<!--
+
+ <xsl:import href="../template/1.76.1/docbook-xsl-1.76.1/xhtml/docbook.xsl" />
+
+ <xsl:import href="http://docbook.sourceforge.net/release/xsl/1.76.1/xhtml/docbook.xsl" />
+
+-->
+
+ <xsl:include href="../template/permalinks.xsl"/>
+ <xsl:include href="../template/section.title.xsl"/>
+ <xsl:include href="../template/component.title.xsl"/>
+ <xsl:include href="../template/division.title.xsl"/>
+ <xsl:include href="../template/formal.object.heading.xsl"/>
+ <xsl:include href="../template/gloss-permalinks.xsl"/>
+
+ <xsl:param name="html.stylesheet" select="'user-manual-style.css'" />
+ <xsl:param name="chapter.autolabel" select="1" />
+ <xsl:param name="section.autolabel" select="1" />
+ <xsl:param name="section.label.includes.component.label" select="1" />
+ <xsl:param name="appendix.autolabel">A</xsl:param>
+
+<!-- <xsl:param name="generate.toc" select="'article nop'"></xsl:param> -->
+
+</xsl:stylesheet>
diff --git a/bitbake/doc/bitbake-user-manual/bitbake-user-manual-execution.xml b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-execution.xml
new file mode 100644
index 0000000..fa52e29
--- /dev/null
+++ b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-execution.xml
@@ -0,0 +1,912 @@
+<!DOCTYPE chapter PUBLIC "-//OASIS//DTD DocBook XML V4.2//EN"
+"http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd">
+
+<chapter id="bitbake-user-manual-execution">
+ <title>Execution</title>
+
+ <para>
+ The primary purpose for running BitBake is to produce some kind
+ of output such as a single installable package, a kernel, a software
+ development kit, or even a full, board-specific bootable Linux image,
+ complete with bootloader, kernel, and root filesystem.
+ Of course, you can execute the <filename>bitbake</filename>
+ command with options that cause it to execute single tasks,
+ compile single recipe files, capture or clear data, or simply
+ return information about the execution environment.
+ </para>
+
+ <para>
+ This chapter describes BitBake's execution process from start
+ to finish when you use it to create an image.
+ The execution process is launched using the following command
+ form:
+ <literallayout class='monospaced'>
+ $ bitbake <replaceable>target</replaceable>
+ </literallayout>
+ For information on the BitBake command and its options,
+ see
+ "<link linkend='bitbake-user-manual-command'>The BitBake Command</link>"
+ section.
+ <note>
+ <para>
+ Prior to executing BitBake, you should take advantage of available
+ parallel thread execution on your build host by setting the
+ <link linkend='var-BB_NUMBER_THREADS'><filename>BB_NUMBER_THREADS</filename></link>
+ variable in your project's <filename>local.conf</filename>
+ configuration file.
+ </para>
+
+ <para>
+ A common method to determine this value for your build host is to run
+ the following:
+ <literallayout class='monospaced'>
+ $ grep processor /proc/cpuinfo
+ </literallayout>
+ This command returns the number of processors, which takes into
+ account hyper-threading.
+ Thus, a quad-core build host with hyper-threading most likely
+ shows eight processors, which is the value you would then assign to
+ <filename>BB_NUMBER_THREADS</filename>.
+ </para>
+
+ <para>
+ A possibly simpler solution is that some Linux distributions
+ (e.g. Debian and Ubuntu) provide the <filename>ncpus</filename> command.
+ </para>
+ </note>
+ </para>
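+
+    <para>
+        For example, on a build host that reports eight processors, you
+        might add the following (illustrative) line to your project's
+        <filename>local.conf</filename> file:
+        <literallayout class='monospaced'>
+     BB_NUMBER_THREADS = "8"
+        </literallayout>
+    </para>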
+
+ <section id='parsing-the-base-configuration-metadata'>
+ <title>Parsing the Base Configuration Metadata</title>
+
+ <para>
+ The first thing BitBake does is parse base configuration
+ metadata.
+ Base configuration metadata consists of your project's
+ <filename>bblayers.conf</filename> file to determine what
+ layers BitBake needs to recognize, all necessary
+ <filename>layer.conf</filename> files (one from each layer),
+ and <filename>bitbake.conf</filename>.
+ The data itself is of various types:
+ <itemizedlist>
+ <listitem><para><emphasis>Recipes:</emphasis>
+ Details about particular pieces of software.
+ </para></listitem>
+ <listitem><para><emphasis>Class Data:</emphasis>
+ An abstraction of common build information
+ (e.g. how to build a Linux kernel).
+ </para></listitem>
+ <listitem><para><emphasis>Configuration Data:</emphasis>
+ Machine-specific settings, policy decisions,
+ and so forth.
+ Configuration data acts as the glue to bind everything
+ together.</para></listitem>
+ </itemizedlist>
+ </para>
+
+ <para>
+ The <filename>layer.conf</filename> files are used to
+ construct key variables such as
+ <link linkend='var-BBPATH'><filename>BBPATH</filename></link>
+ and
+ <link linkend='var-BBFILES'><filename>BBFILES</filename></link>.
+ <filename>BBPATH</filename> is used to search for
+ configuration and class files under the
+ <filename>conf</filename> and <filename>classes</filename>
+ directories, respectively.
+ <filename>BBFILES</filename> is used to locate both recipe
+ and recipe append files
+ (<filename>.bb</filename> and <filename>.bbappend</filename>).
+ If there is no <filename>bblayers.conf</filename> file,
+ it is assumed the user has set the <filename>BBPATH</filename>
+ and <filename>BBFILES</filename> directly in the environment.
+ </para>
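+
+    <para>
+        As a sketch of that fallback case, the two variables could be
+        exported in the environment (the paths here are hypothetical)
+        before invoking BitBake:
+        <literallayout class='monospaced'>
+     $ export BBPATH="/home/user/project"
+     $ export BBFILES="/home/user/project/recipes/*.bb"
+     $ bitbake <replaceable>target</replaceable>
+        </literallayout>
+    </para>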
+
+ <para>
+ Next, the <filename>bitbake.conf</filename> file is located
+ using the <filename>BBPATH</filename> variable that was
+ just constructed.
+ The <filename>bitbake.conf</filename> file may also include other
+ configuration files using the
+ <filename>include</filename> or
+ <filename>require</filename> directives.
+ </para>
+
+ <para>
+        Prior to parsing configuration files, BitBake looks
+ at certain variables, including:
+ <itemizedlist>
+ <listitem><para><link linkend='var-BB_ENV_WHITELIST'><filename>BB_ENV_WHITELIST</filename></link></para></listitem>
+ <listitem><para><link linkend='var-BB_PRESERVE_ENV'><filename>BB_PRESERVE_ENV</filename></link></para></listitem>
+ <listitem><para><link linkend='var-BB_ENV_EXTRAWHITE'><filename>BB_ENV_EXTRAWHITE</filename></link></para></listitem>
+ <listitem><para>
+ <link linkend='var-BITBAKE_UI'><filename>BITBAKE_UI</filename></link>
+ </para></listitem>
+ </itemizedlist>
+ You can find information on how to pass environment variables into the BitBake
+ execution environment in the
+ "<link linkend='passing-information-into-the-build-task-environment'>Passing Information Into the Build Task Environment</link>" section.
+ </para>
+
+ <para>
+ The base configuration metadata is global
+ and therefore affects all recipes and tasks that are executed.
+ </para>
+
+ <para>
+ BitBake first searches the current working directory for an
+ optional <filename>conf/bblayers.conf</filename> configuration file.
+ This file is expected to contain a
+ <link linkend='var-BBLAYERS'><filename>BBLAYERS</filename></link>
+ variable that is a space-delimited list of 'layer' directories.
+ Recall that if BitBake cannot find a <filename>bblayers.conf</filename>
+ file, then it is assumed the user has set the <filename>BBPATH</filename>
+ and <filename>BBFILES</filename> variables directly in the environment.
+ </para>
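+
+    <para>
+        When the file is present, a minimal, illustrative
+        <filename>conf/bblayers.conf</filename> might look like the
+        following (the layer paths are hypothetical):
+        <literallayout class='monospaced'>
+     BBLAYERS = " \
+       /home/user/poky/meta \
+       /home/user/poky/meta-mylayer \
+       "
+        </literallayout>
+    </para>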
+
+ <para>
+ For each directory (layer) in this list, a <filename>conf/layer.conf</filename>
+ file is located and parsed with the
+ <link linkend='var-LAYERDIR'><filename>LAYERDIR</filename></link>
+ variable being set to the directory where the layer was found.
+ The idea is these files automatically set up
+ <link linkend='var-BBPATH'><filename>BBPATH</filename></link>
+ and other variables correctly for a given build directory.
+ </para>
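+
+    <para>
+        The following sketch shows the kind of assignments a
+        <filename>conf/layer.conf</filename> file typically makes in
+        terms of <filename>LAYERDIR</filename> (the recipe path globs
+        are illustrative only):
+        <literallayout class='monospaced'>
+     BBPATH .= ":${LAYERDIR}"
+     BBFILES += "${LAYERDIR}/recipes-*/*/*.bb \
+                 ${LAYERDIR}/recipes-*/*/*.bbappend"
+        </literallayout>
+    </para>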
+
+ <para>
+ BitBake then expects to find the <filename>conf/bitbake.conf</filename>
+ file somewhere in the user-specified <filename>BBPATH</filename>.
+ That configuration file generally has include directives to pull
+ in any other metadata such as files specific to the architecture,
+ the machine, the local environment, and so forth.
+ </para>
+
+ <para>
+ Only variable definitions and include directives are allowed
+ in BitBake <filename>.conf</filename> files.
+ Some variables directly influence BitBake's behavior.
+ These variables might have been set from the environment
+ depending on the environment variables previously
+ mentioned or set in the configuration files.
+ The
+ "<link linkend='ref-variables-glos'>Variables Glossary</link>"
+ chapter presents a full list of variables.
+ </para>
+
+ <para>
+ After parsing configuration files, BitBake uses its rudimentary
+ inheritance mechanism, which is through class files, to inherit
+ some standard classes.
+ BitBake parses a class when the inherit directive responsible
+ for getting that class is encountered.
+ </para>
+
+ <para>
+ The <filename>base.bbclass</filename> file is always included.
+ Other classes that are specified in the configuration using the
+ <link linkend='var-INHERIT'><filename>INHERIT</filename></link>
+ variable are also included.
+ BitBake searches for class files in a
+ <filename>classes</filename> subdirectory under
+ the paths in <filename>BBPATH</filename> in the same way as
+ configuration files.
+ </para>
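+
+    <para>
+        For example, to have BitBake inherit a hypothetical class file
+        named <filename>classes/mycustom.bbclass</filename> found along
+        <filename>BBPATH</filename>, a configuration file could contain:
+        <literallayout class='monospaced'>
+     INHERIT += "mycustom"
+        </literallayout>
+    </para>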
+
+ <para>
+ A good way to get an idea of the configuration files and
+ the class files used in your execution environment is to
+ run the following BitBake command:
+ <literallayout class='monospaced'>
+ $ bitbake -e > mybb.log
+ </literallayout>
+ Examining the top of the <filename>mybb.log</filename>
+ shows you the many configuration files and class files
+ used in your execution environment.
+ </para>
+
+ <note>
+ <para>
+ You need to be aware of how BitBake parses curly braces.
+ If a recipe uses a closing curly brace within the function and
+ the character has no leading spaces, BitBake produces a parsing
+ error.
+ If you use a pair of curly braces in a shell function, the
+ closing curly brace must not be located at the start of the line
+ without leading spaces.
+ </para>
+
+ <para>
+ Here is an example that causes BitBake to produce a parsing
+ error:
+ <literallayout class='monospaced'>
+ fakeroot create_shar() {
+ cat << "EOF" > ${SDK_DEPLOY}/${TOOLCHAIN_OUTPUTNAME}.sh
+ usage()
+ {
+ echo "test"
+ ###### The following "}" at the start of the line causes a parsing error ######
+ }
+ EOF
+ }
+ </literallayout>
+ Writing the recipe this way avoids the error:
+ <literallayout class='monospaced'>
+ fakeroot create_shar() {
+ cat << "EOF" > ${SDK_DEPLOY}/${TOOLCHAIN_OUTPUTNAME}.sh
+ usage()
+ {
+ echo "test"
+ ######The following "}" with a leading space at the start of the line avoids the error ######
+ }
+ EOF
+ }
+ </literallayout>
+ </para>
+ </note>
+ </section>
+
+ <section id='locating-and-parsing-recipes'>
+ <title>Locating and Parsing Recipes</title>
+
+ <para>
+ During the configuration phase, BitBake will have set
+ <link linkend='var-BBFILES'><filename>BBFILES</filename></link>.
+ BitBake now uses it to construct a list of recipes to parse,
+ along with any append files (<filename>.bbappend</filename>)
+ to apply.
+ <filename>BBFILES</filename> is a space-separated list of
+ available files and supports wildcards.
+ An example would be:
+ <literallayout class='monospaced'>
+ BBFILES = "/path/to/bbfiles/*.bb /path/to/appends/*.bbappend"
+ </literallayout>
+ BitBake parses each recipe and append file located
+ with <filename>BBFILES</filename> and stores the values of
+ various variables into the datastore.
+ <note>
+ Append files are applied in the order they are encountered in
+ <filename>BBFILES</filename>.
+ </note>
+ For each file, a fresh copy of the base configuration is
+ made, then the recipe is parsed line by line.
+ Any inherit statements cause BitBake to find and
+ then parse class files (<filename>.bbclass</filename>)
+ using
+ <link linkend='var-BBPATH'><filename>BBPATH</filename></link>
+ as the search path.
+ Finally, BitBake parses in order any append files found in
+ <filename>BBFILES</filename>.
+ </para>
+
+ <para>
+ One common convention is to use the recipe filename to define
+ pieces of metadata.
+ For example, in <filename>bitbake.conf</filename> the recipe
+ name and version are used to set the variables
+ <link linkend='var-PN'><filename>PN</filename></link> and
+ <link linkend='var-PV'><filename>PV</filename></link>:
+ <literallayout class='monospaced'>
+ PN = "${@bb.parse.BBHandler.vars_from_file(d.getVar('FILE', False),d)[0] or 'defaultpkgname'}"
+ PV = "${@bb.parse.BBHandler.vars_from_file(d.getVar('FILE', False),d)[1] or '1.0'}"
+ </literallayout>
+ In this example, a recipe called "something_1.2.3.bb" would set
+ <filename>PN</filename> to "something" and
+ <filename>PV</filename> to "1.2.3".
+ </para>
+
+ <para>
+ By the time parsing is complete for a recipe, BitBake
+ has a list of tasks that the recipe defines and a set of
+ data consisting of keys and values as well as
+ dependency information about the tasks.
+ </para>
+
+ <para>
+ BitBake does not need all of this information.
+ It only needs a small subset of the information to make
+ decisions about the recipe.
+ Consequently, BitBake caches the values in which it is
+ interested and does not store the rest of the information.
+ Experience has shown it is faster to re-parse the metadata than to
+ try and write it out to the disk and then reload it.
+ </para>
+
+ <para>
+ Where possible, subsequent BitBake commands reuse this cache of
+ recipe information.
+ The validity of this cache is determined by first computing a
+ checksum of the base configuration data (see
+ <link linkend='var-BB_HASHCONFIG_WHITELIST'><filename>BB_HASHCONFIG_WHITELIST</filename></link>)
+ and then checking if the checksum matches.
+ If that checksum matches what is in the cache and the recipe
+        and class files have not changed, BitBake is able to use
+ the cache.
+ BitBake then reloads the cached information about the recipe
+ instead of reparsing it from scratch.
+ </para>
+
+ <para>
+ Recipe file collections exist to allow the user to
+ have multiple repositories of
+ <filename>.bb</filename> files that contain the same
+ exact package.
+ For example, one could easily use them to make one's
+ own local copy of an upstream repository, but with
+ custom modifications that one does not want upstream.
+ Here is an example:
+ <literallayout class='monospaced'>
+ BBFILES = "/stuff/openembedded/*/*.bb /stuff/openembedded.modified/*/*.bb"
+ BBFILE_COLLECTIONS = "upstream local"
+ BBFILE_PATTERN_upstream = "^/stuff/openembedded/"
+ BBFILE_PATTERN_local = "^/stuff/openembedded.modified/"
+ BBFILE_PRIORITY_upstream = "5"
+ BBFILE_PRIORITY_local = "10"
+ </literallayout>
+ <note>
+ The layers mechanism is now the preferred method of collecting
+ code.
+ While the collections code remains, its main use is to set layer
+ priorities and to deal with overlap (conflicts) between layers.
+ </note>
+ </para>
+ </section>
+
+ <section id='bb-bitbake-providers'>
+ <title>Providers</title>
+
+ <para>
+ Assuming BitBake has been instructed to execute a target
+ and that all the recipe files have been parsed, BitBake
+ starts to figure out how to build the target.
+ BitBake looks through the <filename>PROVIDES</filename> list
+ for each of the recipes.
+ A <filename>PROVIDES</filename> list is the list of names by which
+ the recipe can be known.
+ Each recipe's <filename>PROVIDES</filename> list is created
+ implicitly through the recipe's
+ <link linkend='var-PN'><filename>PN</filename></link> variable
+ and explicitly through the recipe's
+ <link linkend='var-PROVIDES'><filename>PROVIDES</filename></link>
+ variable, which is optional.
+ </para>
+
+ <para>
+ When a recipe uses <filename>PROVIDES</filename>, that recipe's
+ functionality can be found under an alternative name or names other
+ than the implicit <filename>PN</filename> name.
+ As an example, suppose a recipe named <filename>keyboard_1.0.bb</filename>
+ contained the following:
+ <literallayout class='monospaced'>
+ PROVIDES += "fullkeyboard"
+ </literallayout>
+ The <filename>PROVIDES</filename> list for this recipe becomes
+ "keyboard", which is implicit, and "fullkeyboard", which is explicit.
+ Consequently, the functionality found in
+ <filename>keyboard_1.0.bb</filename> can be found under two
+ different names.
+ </para>
+ </section>
+
+ <section id='bb-bitbake-preferences'>
+ <title>Preferences</title>
+
+ <para>
+ The <filename>PROVIDES</filename> list is only part of the solution
+ for figuring out a target's recipes.
+ Because targets might have multiple providers, BitBake needs
+ to prioritize providers by determining provider preferences.
+ </para>
+
+ <para>
+ A common example in which a target has multiple providers
+ is "virtual/kernel", which is on the
+ <filename>PROVIDES</filename> list for each kernel recipe.
+ Each machine often selects the best kernel provider by using a
+ line similar to the following in the machine configuration file:
+ <literallayout class='monospaced'>
+ PREFERRED_PROVIDER_virtual/kernel = "linux-yocto"
+ </literallayout>
+ The default
+ <link linkend='var-PREFERRED_PROVIDER'><filename>PREFERRED_PROVIDER</filename></link>
+ is the provider with the same name as the target.
+        BitBake iterates through each target it needs to build and
+        resolves it and its dependencies using this process.
+ </para>
+
+ <para>
+ Understanding how providers are chosen is made complicated by the fact
+ that multiple versions might exist for a given provider.
+ BitBake defaults to the highest version of a provider.
+ Version comparisons are made using the same method as Debian.
+ You can use the
+ <link linkend='var-PREFERRED_VERSION'><filename>PREFERRED_VERSION</filename></link>
+ variable to specify a particular version.
+ You can influence the order by using the
+ <link linkend='var-DEFAULT_PREFERENCE'><filename>DEFAULT_PREFERENCE</filename></link>
+ variable.
+ </para>
+
+ <para>
+ By default, files have a preference of "0".
+ Setting <filename>DEFAULT_PREFERENCE</filename> to "-1" makes the
+ recipe unlikely to be used unless it is explicitly referenced.
+ Setting <filename>DEFAULT_PREFERENCE</filename> to "1" makes it
+ likely the recipe is used.
+ <filename>PREFERRED_VERSION</filename> overrides any
+ <filename>DEFAULT_PREFERENCE</filename> setting.
+ <filename>DEFAULT_PREFERENCE</filename> is often used to mark newer
+ and more experimental recipe versions until they have undergone
+ sufficient testing to be considered stable.
+ </para>
+
+ <para>
+ When there are multiple “versions” of a given recipe,
+ BitBake defaults to selecting the most recent
+ version, unless otherwise specified.
+ If the recipe in question has a
+ <link linkend='var-DEFAULT_PREFERENCE'><filename>DEFAULT_PREFERENCE</filename></link>
+ set lower than the other recipes (default is 0), then
+ it will not be selected.
+ This allows the person or persons maintaining
+ the repository of recipe files to specify
+ their preference for the default selected version.
+ Additionally, the user can specify their preferred version.
+ </para>
+
+ <para>
+ If the first recipe is named <filename>a_1.1.bb</filename>, then the
+ <link linkend='var-PN'><filename>PN</filename></link> variable
+ will be set to “a”, and the
+ <link linkend='var-PV'><filename>PV</filename></link>
+ variable will be set to 1.1.
+ </para>
+
+ <para>
+ Thus, if a recipe named <filename>a_1.2.bb</filename> exists, BitBake
+ will choose 1.2 by default.
+ However, if you define the following variable in a
+ <filename>.conf</filename> file that BitBake parses, you
+ can change that preference:
+ <literallayout class='monospaced'>
+ PREFERRED_VERSION_a = "1.1"
+ </literallayout>
+ </para>
+
+ <note>
+ <para>
+ It is common for a recipe to provide two versions -- a stable,
+ numbered (and preferred) version, and a version that is
+ automatically checked out from a source code repository that
+ is considered more "bleeding edge" but can be selected only
+ explicitly.
+ </para>
+
+ <para>
+ For example, in the OpenEmbedded codebase, there is a standard,
+ versioned recipe file for BusyBox,
+ <filename>busybox_1.22.1.bb</filename>,
+ but there is also a Git-based version,
+ <filename>busybox_git.bb</filename>, which explicitly contains the line
+ <literallayout class='monospaced'>
+ DEFAULT_PREFERENCE = "-1"
+ </literallayout>
+ to ensure that the numbered, stable version is always preferred
+ unless the developer selects otherwise.
+ </para>
+ </note>
+ </section>
+
+ <section id='bb-bitbake-dependencies'>
+ <title>Dependencies</title>
+
+ <para>
+ Each target BitBake builds consists of multiple tasks such as
+ <filename>fetch</filename>, <filename>unpack</filename>,
+ <filename>patch</filename>, <filename>configure</filename>,
+ and <filename>compile</filename>.
+ For best performance on multi-core systems, BitBake considers each
+ task as an independent
+ entity with its own set of dependencies.
+ </para>
+
+ <para>
+ Dependencies are defined through several variables.
+ You can find information about variables BitBake uses in
+ the <link linkend='ref-variables-glos'>Variables Glossary</link>
+ near the end of this manual.
+ At a basic level, it is sufficient to know that BitBake uses the
+ <link linkend='var-DEPENDS'><filename>DEPENDS</filename></link> and
+ <link linkend='var-RDEPENDS'><filename>RDEPENDS</filename></link> variables when
+ calculating dependencies.
+ </para>
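+
+    <para>
+        As a brief, illustrative example, a recipe might declare a
+        build-time dependency and a runtime dependency for its main
+        package as follows (the dependency names are hypothetical):
+        <literallayout class='monospaced'>
+     DEPENDS = "somelibrary"
+     RDEPENDS_${PN} = "someutility"
+        </literallayout>
+    </para>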
+
+ <para>
+ For more information on how BitBake handles dependencies, see the
+ "<link linkend='dependencies'>Dependencies</link>" section.
+ </para>
+ </section>
+
+ <section id='ref-bitbake-tasklist'>
+ <title>The Task List</title>
+
+ <para>
+ Based on the generated list of providers and the dependency information,
+ BitBake can now calculate exactly what tasks it needs to run and in what
+ order it needs to run them.
+ The
+ "<link linkend='executing-tasks'>Executing Tasks</link>" section has more
+ information on how BitBake chooses which task to execute next.
+ </para>
+
+ <para>
+ The build now starts with BitBake forking off threads up to the limit set in the
+ <link linkend='var-BB_NUMBER_THREADS'><filename>BB_NUMBER_THREADS</filename></link>
+ variable.
+ BitBake continues to fork threads as long as there are tasks ready to run,
+ those tasks have all their dependencies met, and the thread threshold has not been
+ exceeded.
+ </para>
+
+ <para>
+ It is worth noting that you can greatly speed up the build time by properly setting
+ the <filename>BB_NUMBER_THREADS</filename> variable.
+ </para>
+
+ <para>
+ As each task completes, a timestamp is written to the directory specified by the
+ <link linkend='var-STAMP'><filename>STAMP</filename></link> variable.
+ On subsequent runs, BitBake looks in the build directory within
+ <filename>tmp/stamps</filename> and does not rerun
+ tasks that are already completed unless a timestamp is found to be invalid.
+ Currently, invalid timestamps are only considered on a per
+ recipe file basis.
+ So, for example, if the configure stamp has a timestamp greater than the
+ compile timestamp for a given target, then the compile task would rerun.
+ Running the compile task again, however, has no effect on other providers
+ that depend on that target.
+ </para>
+
+ <para>
+ The exact format of the stamps is partly configurable.
+ In modern versions of BitBake, a hash is appended to the
+ stamp so that if the configuration changes, the stamp becomes
+ invalid and the task is automatically rerun.
+        This hash, or signature, is governed by the signature policy
+ that is configured (see the
+ "<link linkend='checksums'>Checksums (Signatures)</link>"
+ section for information).
+ It is also possible to append extra metadata to the stamp using
+ the "stamp-extra-info" task flag.
+ For example, OpenEmbedded uses this flag to make some tasks machine-specific.
+ </para>
+
+ <note>
+ Some tasks are marked as "nostamp" tasks.
+ No timestamp file is created when these tasks are run.
+ Consequently, "nostamp" tasks are always rerun.
+ </note>
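+
+    <para>
+        As an illustration of the two task flags just mentioned, a recipe
+        or class could contain lines such as the following (the task names
+        are examples only):
+        <literallayout class='monospaced'>
+     do_deploy[stamp-extra-info] = "${MACHINE}"
+     do_listtasks[nostamp] = "1"
+        </literallayout>
+    </para>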
+
+ <para>
+ For more information on tasks, see the
+ "<link linkend='tasks'>Tasks</link>" section.
+ </para>
+ </section>
+
+ <section id='executing-tasks'>
+ <title>Executing Tasks</title>
+
+ <para>
+ Tasks can be either a shell task or a Python task.
+ For shell tasks, BitBake writes a shell script to
+ <filename>${</filename><link linkend='var-T'><filename>T</filename></link><filename>}/run.do_taskname.pid</filename>
+ and then executes the script.
+ The generated shell script contains all the exported variables,
+ and the shell functions with all variables expanded.
+ Output from the shell script goes to the file
+ <filename>${T}/log.do_taskname.pid</filename>.
+ Looking at the expanded shell functions in the run file and
+ the output in the log files is a useful debugging technique.
+ </para>
+
+ <para>
+ For Python tasks, BitBake executes the task internally and logs
+ information to the controlling terminal.
+ Future versions of BitBake will write the functions to files
+ similar to the way shell tasks are handled.
+ Logging will be handled in a way similar to shell tasks as well.
+ </para>
+
+ <para>
+ The order in which BitBake runs the tasks is controlled by its
+ task scheduler.
+ It is possible to configure the scheduler and define custom
+ implementations for specific use cases.
+ For more information, see these variables that control the
+ behavior:
+ <itemizedlist>
+ <listitem><para>
+ <link linkend='var-BB_SCHEDULER'><filename>BB_SCHEDULER</filename></link>
+ </para></listitem>
+ <listitem><para>
+ <link linkend='var-BB_SCHEDULERS'><filename>BB_SCHEDULERS</filename></link>
+ </para></listitem>
+ </itemizedlist>
+ It is possible to have functions run before and after a task's main
+ function.
+        This is done using the "prefuncs" and "postfuncs" flags of the task,
+        which list the functions to run.
+ </para>
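+
+    <para>
+        A short, hypothetical example of using these flags in a recipe
+        follows; <filename>my_setup</filename> and
+        <filename>my_cleanup</filename> are assumed to be functions
+        defined elsewhere in the metadata:
+        <literallayout class='monospaced'>
+     do_compile[prefuncs] += "my_setup"
+     do_compile[postfuncs] += "my_cleanup"
+        </literallayout>
+    </para>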
+ </section>
+
+ <section id='checksums'>
+ <title>Checksums (Signatures)</title>
+
+ <para>
+ A checksum is a unique signature of a task's inputs.
+ The signature of a task can be used to determine if a task
+ needs to be run.
+ Because it is a change in a task's inputs that triggers running
+ the task, BitBake needs to detect all the inputs to a given task.
+ For shell tasks, this turns out to be fairly easy because
+ BitBake generates a "run" shell script for each task and
+ it is possible to create a checksum that gives you a good idea of when
+ the task's data changes.
+ </para>
+
+ <para>
+ To complicate the problem, some things should not be included in
+ the checksum.
+ First, there is the actual specific build path of a given task -
+ the working directory.
+ It does not matter if the working directory changes because it should not
+ affect the output for target packages.
+ The simplistic approach for excluding the working directory is to set
+ it to some fixed value and create the checksum for the "run" script.
+ BitBake goes one step better and uses the
+ <link linkend='var-BB_HASHBASE_WHITELIST'><filename>BB_HASHBASE_WHITELIST</filename></link>
+ variable to define a list of variables that should never be included
+ when generating the signatures.
+ </para>
+
+ <para>
+ Another problem results from the "run" scripts containing functions that
+ might or might not get called.
+ The incremental build solution contains code that figures out dependencies
+ between shell functions.
+ This code is used to prune the "run" scripts down to the minimum set,
+ thereby alleviating this problem and making the "run" scripts much more
+ readable as a bonus.
+ </para>
+
+ <para>
+ So far we have solutions for shell scripts.
+ What about Python tasks?
+ The same approach applies even though these tasks are more difficult.
+ The process needs to figure out what variables a Python function accesses
+ and what functions it calls.
+ Again, the incremental build solution contains code that first figures out
+ the variable and function dependencies, and then creates a checksum for the data
+ used as the input to the task.
+ </para>
+
+ <para>
+ Like the working directory case, situations exist where dependencies
+ should be ignored.
+ For these cases, you can instruct the build process to ignore a dependency
+ by using a line like the following:
+ <literallayout class='monospaced'>
+ PACKAGE_ARCHS[vardepsexclude] = "MACHINE"
+ </literallayout>
+ This example ensures that the <filename>PACKAGE_ARCHS</filename> variable does not
+ depend on the value of <filename>MACHINE</filename>, even if it does reference it.
+ </para>
+
+ <para>
+ Equally, there are cases where we need to add dependencies BitBake
+ is not able to find.
+ You can accomplish this by using a line like the following:
+ <literallayout class='monospaced'>
+ PACKAGE_ARCHS[vardeps] = "MACHINE"
+ </literallayout>
+ This example explicitly adds the <filename>MACHINE</filename> variable as a
+ dependency for <filename>PACKAGE_ARCHS</filename>.
+ </para>
+
+ <para>
+ Consider a case with in-line Python, for example, where BitBake is not
+ able to figure out dependencies.
+ When running in debug mode (i.e. using <filename>-DDD</filename>), BitBake
+ produces output when it discovers something for which it cannot figure out
+ dependencies.
+ </para>
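+
+    <para>
+        For example, the following (illustrative) command runs a build of
+        a hypothetical target with that level of debug output enabled:
+        <literallayout class='monospaced'>
+     $ bitbake -DDD <replaceable>target</replaceable>
+        </literallayout>
+    </para>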
+
+ <para>
+ Thus far, this section has limited discussion to the direct inputs into a task.
+ Information based on direct inputs is referred to as the "basehash" in the
+ code.
+ However, there is still the question of a task's indirect inputs - the
+ things that were already built and present in the build directory.
+ The checksum (or signature) for a particular task needs to add the hashes
+ of all the tasks on which the particular task depends.
+ Choosing which dependencies to add is a policy decision.
+ However, the effect is to generate a master checksum that combines the basehash
+ and the hashes of the task's dependencies.
+ </para>
+
+ <para>
+ At the code level, there are a variety of ways both the basehash and the
+ dependent task hashes can be influenced.
+ Within the BitBake configuration file, we can give BitBake some extra information
+ to help it construct the basehash.
+ The following statement effectively results in a list of global variable
+ dependency excludes - variables never included in any checksum.
+ This example uses variables from OpenEmbedded to help illustrate
+ the concept:
+ <literallayout class='monospaced'>
+ BB_HASHBASE_WHITELIST ?= "TMPDIR FILE PATH PWD BB_TASKHASH BBPATH DL_DIR \
+ SSTATE_DIR THISDIR FILESEXTRAPATHS FILE_DIRNAME HOME LOGNAME SHELL TERM \
+ USER FILESPATH STAGING_DIR_HOST STAGING_DIR_TARGET COREBASE PRSERV_HOST \
+ PRSERV_DUMPDIR PRSERV_DUMPFILE PRSERV_LOCKDOWN PARALLEL_MAKE \
+ CCACHE_DIR EXTERNAL_TOOLCHAIN CCACHE CCACHE_DISABLE LICENSE_PATH SDKPKGSUFFIX"
+ </literallayout>
+ The previous example excludes the work directory, which is part of
+ <filename>TMPDIR</filename>.
+ </para>
+
+ <para>
+ The rules for deciding which hashes of dependent tasks to include through
+ dependency chains are more complex and are generally accomplished with a
+ Python function.
+ The code in <filename>meta/lib/oe/sstatesig.py</filename> shows two examples
+ of this and also illustrates how you can insert your own policy into the system
+ if so desired.
+ This file defines the two basic signature generators OpenEmbedded Core
+ uses: "OEBasic" and "OEBasicHash".
+ By default, there is a dummy "noop" signature handler enabled in BitBake.
+ This means that behavior is unchanged from previous versions.
+ <filename>OE-Core</filename> uses the "OEBasicHash" signature handler by default
+ through this setting in the <filename>bitbake.conf</filename> file:
+ <literallayout class='monospaced'>
+ BB_SIGNATURE_HANDLER ?= "OEBasicHash"
+ </literallayout>
+ The "OEBasicHash" <filename>BB_SIGNATURE_HANDLER</filename> is the same as the
+ "OEBasic" version but adds the task hash to the stamp files.
+        As a result, any metadata change that alters the task hash
+        automatically causes the task to be run again.
+ This removes the need to bump
+ <link linkend='var-PR'><filename>PR</filename></link>
+ values, and changes to metadata automatically ripple across the build.
+ </para>
+
+ <para>
+ It is also worth noting that the end result of these signature generators is to
+ make some dependency and hash information available to the build.
+ This information includes:
+ <itemizedlist>
+ <listitem><para><filename>BB_BASEHASH_task-</filename><replaceable>taskname</replaceable>:
+ The base hashes for each task in the recipe.
+ </para></listitem>
+ <listitem><para><filename>BB_BASEHASH_</filename><replaceable>filename</replaceable><filename>:</filename><replaceable>taskname</replaceable>:
+ The base hashes for each dependent task.
+ </para></listitem>
+ <listitem><para><filename>BBHASHDEPS_</filename><replaceable>filename</replaceable><filename>:</filename><replaceable>taskname</replaceable>:
+ The task dependencies for each task.
+ </para></listitem>
+ <listitem><para><filename>BB_TASKHASH</filename>:
+ The hash of the currently running task.
+ </para></listitem>
+ </itemizedlist>
+ </para>
+
+ <para>
+ It is worth noting that BitBake's "-S" option lets you
+        debug BitBake's processing of signatures.
+ The options passed to -S allow different debugging modes
+ to be used, either using BitBake's own debug functions
+ or possibly those defined in the metadata/signature handler
+ itself.
+ The simplest parameter to pass is "none", which causes a
+ set of signature information to be written out into
+ <filename>STAMP_DIR</filename>
+ corresponding to the targets specified.
+ The other currently available parameter is "printdiff",
+ which causes BitBake to try to establish the closest
+ signature match it can (e.g. in the sstate cache) and then
+ run <filename>bitbake-diffsigs</filename> over the matches
+ to determine the stamps and delta where these two
+ stamp trees diverge.
+ <note>
+ It is likely that future versions of BitBake will
+ provide other signature handlers triggered through
+ additional "-S" parameters.
+ </note>
+ </para>
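+
+    <para>
+        For example, either of the following commands could be used
+        against a hypothetical target:
+        <literallayout class='monospaced'>
+     $ bitbake -S none <replaceable>target</replaceable>
+     $ bitbake -S printdiff <replaceable>target</replaceable>
+        </literallayout>
+    </para>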
+
+ <para>
+ You can find more information on checksum metadata in the
+ "<link linkend='task-checksums-and-setscene'>Task Checksums and Setscene</link>"
+ section.
+ </para>
+ </section>
+
+ <section id='setscene'>
+ <title>Setscene</title>
+
+ <para>
+ The setscene process enables BitBake to handle "pre-built" artifacts.
+ The ability to handle and reuse these artifacts allows BitBake
+ the luxury of not having to build something from scratch every time.
+ Instead, BitBake can use, when possible, existing build artifacts.
+ </para>
+
+ <para>
+ BitBake needs to have reliable data indicating whether or not an
+ artifact is compatible.
+ Signatures, described in the previous section, provide an ideal
+ way of representing whether an artifact is compatible.
+ If a signature is the same, an object can be reused.
+ </para>
+
+ <para>
+ If an object can be reused, the problem then becomes how to
+ replace a given task or set of tasks with the pre-built artifact.
+ BitBake solves the problem with the "setscene" process.
+ </para>
+
+ <para>
+ When BitBake is asked to build a given target, before building anything,
+ it first asks whether cached information is available for any of the
+ targets it's building, or any of the intermediate targets.
+ If cached information is available, BitBake uses this information instead of
+ running the main tasks.
+ </para>
+
+ <para>
+ BitBake first calls the function defined by the
+ <link linkend='var-BB_HASHCHECK_FUNCTION'><filename>BB_HASHCHECK_FUNCTION</filename></link>
+ variable with a list of tasks and corresponding
+ hashes it wants to build.
+ This function is designed to be fast and returns a list
+        of the tasks for which it believes it can obtain artifacts.
+ </para>
+
+ <para>
+ Next, for each of the tasks that were returned as possibilities,
+ BitBake executes a setscene version of the task that the possible
+ artifact covers.
+ Setscene versions of a task have the string "_setscene" appended to the
+ task name.
+ So, for example, the task with the name <filename>xxx</filename> has
+ a setscene task named <filename>xxx_setscene</filename>.
+ The setscene version of the task executes and provides the necessary
+ artifacts returning either success or failure.
+ </para>
+
+ <para>
+ As previously mentioned, an artifact can cover more than one task.
+ For example, it is pointless to obtain a compiler if you
+ already have the compiled binary.
+ To handle this, BitBake calls the
+ <link linkend='var-BB_SETSCENE_DEPVALID'><filename>BB_SETSCENE_DEPVALID</filename></link>
+ function for each successful setscene task to know whether or not it needs
+ to obtain the dependencies of that task.
+ </para>
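+
+    <para>
+        As a concrete illustration, OpenEmbedded's shared state code
+        points these two hooks at its own Python functions with settings
+        along the following lines (shown here only as a sketch):
+        <literallayout class='monospaced'>
+     BB_HASHCHECK_FUNCTION = "sstate_checkhashes"
+     BB_SETSCENE_DEPVALID = "setscene_depvalid"
+        </literallayout>
+    </para>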
+
+ <para>
+ Finally, after all the setscene tasks have executed, BitBake calls the
+ function listed in
+ <link linkend='var-BB_SETSCENE_VERIFY_FUNCTION'><filename>BB_SETSCENE_VERIFY_FUNCTION</filename></link>
+        with the list of tasks BitBake thinks have been "covered".
+ The metadata can then ensure that this list is correct and can
+ inform BitBake that it wants specific tasks to be run regardless
+ of the setscene result.
+ </para>
+
+ <para>
+ You can find more information on setscene metadata in the
+ "<link linkend='task-checksums-and-setscene'>Task Checksums and Setscene</link>"
+ section.
+ </para>
+ </section>
+</chapter>
diff --git a/bitbake/doc/bitbake-user-manual/bitbake-user-manual-fetching.xml b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-fetching.xml
new file mode 100644
index 0000000..f168cfa
--- /dev/null
+++ b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-fetching.xml
@@ -0,0 +1,765 @@
+<!DOCTYPE chapter PUBLIC "-//OASIS//DTD DocBook XML V4.2//EN"
+"http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd">
+
+<chapter>
+<title>File Download Support</title>
+
+ <para>
+ BitBake's fetch module is a standalone piece of library code
+ that deals with the intricacies of downloading source code
+ and files from remote systems.
+ Fetching source code is one of the cornerstones of building software.
+ As such, this module forms an important part of BitBake.
+ </para>
+
+ <para>
+ The current fetch module is called "fetch2" and refers to the
+ fact that it is the second major version of the API.
+ The original version is obsolete and has been removed from the codebase.
+ Thus, in all cases, "fetch" refers to "fetch2" in this
+ manual.
+ </para>
+
+ <section id='the-download-fetch'>
+ <title>The Download (Fetch)</title>
+
+ <para>
+ BitBake takes several steps when fetching source code or files.
+ The fetcher codebase deals with two distinct processes in order:
+ obtaining the files from somewhere (cached or otherwise)
+ and then unpacking those files into a specific location and
+ perhaps in a specific way.
+ Getting and unpacking the files is often optionally followed
+ by patching.
+ Patching, however, is not covered by this module.
+ </para>
+
+ <para>
+ The code to execute the first part of this process, a fetch,
+ looks something like the following:
+ <literallayout class='monospaced'>
+ src_uri = (d.getVar('SRC_URI', True) or "").split()
+ fetcher = bb.fetch2.Fetch(src_uri, d)
+ fetcher.download()
+ </literallayout>
+ This code sets up an instance of the fetch class.
+ The instance uses a space-separated list of URLs from the
+ <link linkend='var-SRC_URI'><filename>SRC_URI</filename></link>
+ variable and then calls the <filename>download</filename>
+ method to download the files.
+ </para>
+
+ <para>
+ The instantiation of the fetch class is usually followed by:
+ <literallayout class='monospaced'>
+     rootdir = d.getVar('WORKDIR', True)
+ fetcher.unpack(rootdir)
+ </literallayout>
+        This code unpacks the downloaded files to the location
+        specified by <filename>WORKDIR</filename>.
+ <note>
+ For convenience, the naming in these examples matches
+ the variables used by OpenEmbedded.
+ If you want to see the above code in action, examine
+ the OpenEmbedded class file <filename>base.bbclass</filename>.
+ </note>
+ The <filename>SRC_URI</filename> and <filename>WORKDIR</filename>
+ variables are not hardcoded into the fetcher, since those fetcher
+ methods can be (and are) called with different variable names.
+ In OpenEmbedded for example, the shared state (sstate) code uses
+ the fetch module to fetch the sstate files.
+ </para>
+
+ <para>
+ When the <filename>download()</filename> method is called,
+ BitBake tries to resolve the URLs by looking for source files
+ in a specific search order:
+ <itemizedlist>
+ <listitem><para><emphasis>Pre-mirror Sites:</emphasis>
+ BitBake first uses pre-mirrors to try and find source files.
+ These locations are defined using the
+ <link linkend='var-PREMIRRORS'><filename>PREMIRRORS</filename></link>
+ variable.
+ </para></listitem>
+ <listitem><para><emphasis>Source URI:</emphasis>
+        If pre-mirrors fail, BitBake uses the original URL (e.g. from
+ <filename>SRC_URI</filename>).
+ </para></listitem>
+ <listitem><para><emphasis>Mirror Sites:</emphasis>
+ If fetch failures occur, BitBake next uses mirror locations as
+ defined by the
+ <link linkend='var-MIRRORS'><filename>MIRRORS</filename></link>
+ variable.
+ </para></listitem>
+ </itemizedlist>
+ </para>
+
+ <para>
+ For each URL passed to the fetcher, the fetcher
+ calls the submodule that handles that particular URL type.
+ This behavior can be the source of some confusion when you
+ are providing URLs for the <filename>SRC_URI</filename>
+ variable.
+ Consider the following two URLs:
+ <literallayout class='monospaced'>
+ http://git.yoctoproject.org/git/poky;protocol=git
+ git://git.yoctoproject.org/git/poky;protocol=http
+ </literallayout>
+ In the former case, the URL is passed to the
+ <filename>wget</filename> fetcher, which does not
+ understand "git".
+ Therefore, the latter case is the correct form since the
+ Git fetcher does know how to use HTTP as a transport.
+ </para>
+
+ <para>
+ Here are some examples that show commonly used mirror
+ definitions:
+ <literallayout class='monospaced'>
+ PREMIRRORS ?= "\
+ bzr://.*/.* http://somemirror.org/sources/ \n \
+ cvs://.*/.* http://somemirror.org/sources/ \n \
+ git://.*/.* http://somemirror.org/sources/ \n \
+ hg://.*/.* http://somemirror.org/sources/ \n \
+ osc://.*/.* http://somemirror.org/sources/ \n \
+ p4://.*/.* http://somemirror.org/sources/ \n \
+ svn://.*/.* http://somemirror.org/sources/ \n"
+
+ MIRRORS =+ "\
+ ftp://.*/.* http://somemirror.org/sources/ \n \
+ http://.*/.* http://somemirror.org/sources/ \n \
+ https://.*/.* http://somemirror.org/sources/ \n"
+ </literallayout>
+ It is useful to note that BitBake supports
+ cross-URLs.
+ It is possible to mirror a Git repository on an HTTP
+ server as a tarball.
+ This is what the <filename>git://</filename> mapping in
+ the previous example does.
+ </para>
+
+ <para>
+        Since network accesses are slow, BitBake maintains a
+ cache of files downloaded from the network.
+ Any source files that are not local (i.e.
+ downloaded from the Internet) are placed into the download
+ directory, which is specified by the
+ <link linkend='var-DL_DIR'><filename>DL_DIR</filename></link>
+ variable.
+ </para>
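+
+    <para>
+        For example, a typical (illustrative) configuration places the
+        download directory alongside the build output:
+        <literallayout class='monospaced'>
+     DL_DIR ?= "${TOPDIR}/downloads"
+        </literallayout>
+    </para>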
+
+ <para>
+ File integrity is of key importance for reproducing builds.
+ For non-local archive downloads, the fetcher code can verify
+ SHA-256 and MD5 checksums to ensure the archives have been
+ downloaded correctly.
+ You can specify these checksums by using the
+ <filename>SRC_URI</filename> variable with the appropriate
+ varflags as follows:
+ <literallayout class='monospaced'>
+ SRC_URI[md5sum] = "<replaceable>value</replaceable>"
+ SRC_URI[sha256sum] = "<replaceable>value</replaceable>"
+ </literallayout>
+ You can also specify the checksums as parameters on the
+ <filename>SRC_URI</filename> as shown below:
+ <literallayout class='monospaced'>
+ SRC_URI = "http://example.com/foobar.tar.bz2;md5sum=4a8e0f237e961fd7785d19d07fdb994d"
+ </literallayout>
+ If multiple URIs exist, you can specify the checksums either
+        directly as in the previous example, or you can name the URIs.
+ The following syntax shows how you name the URIs:
+ <literallayout class='monospaced'>
+ SRC_URI = "http://example.com/foobar.tar.bz2;name=foo"
+     SRC_URI[foo.md5sum] = "4a8e0f237e961fd7785d19d07fdb994d"
+ </literallayout>
+ After a file has been downloaded and has had its checksum checked,
+ a ".done" stamp is placed in <filename>DL_DIR</filename>.
+ BitBake uses this stamp during subsequent builds to avoid
+ downloading or comparing a checksum for the file again.
+ <note>
+ It is assumed that local storage is safe from data corruption.
+ If this were not the case, there would be bigger issues to worry about.
+ </note>
+ </para>
+
+ <para>
+ If
+ <link linkend='var-BB_STRICT_CHECKSUM'><filename>BB_STRICT_CHECKSUM</filename></link>
+ is set, any download without a checksum triggers an
+ error message.
+ The
+ <link linkend='var-BB_NO_NETWORK'><filename>BB_NO_NETWORK</filename></link>
+ variable can be used to make any attempted network access a fatal
+ error, which is useful for checking that mirrors are complete
+ as well as other things.
+ </para>
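+
+    <para>
+        Both behaviors can be enabled from a configuration file, for
+        example:
+        <literallayout class='monospaced'>
+     BB_STRICT_CHECKSUM = "1"
+     BB_NO_NETWORK = "1"
+        </literallayout>
+    </para>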
+ </section>
+
+ <section id='bb-the-unpack'>
+ <title>The Unpack</title>
+
+ <para>
+ The unpack process usually immediately follows the download.
+ For all URLs except Git URLs, BitBake uses the common
+ <filename>unpack</filename> method.
+ </para>
+
+ <para>
+ A number of parameters exist that you can specify within the
+ URL to govern the behavior of the unpack stage:
+ <itemizedlist>
+ <listitem><para><emphasis>unpack:</emphasis>
+ Controls whether the URL components are unpacked.
+ If set to "1", which is the default, the components
+ are unpacked.
+ If set to "0", the unpack stage leaves the file alone.
+ This parameter is useful when you want an archive to be
+ copied in and not be unpacked.
+ </para></listitem>
+ <listitem><para><emphasis>dos:</emphasis>
+ Applies to <filename>.zip</filename> and
+ <filename>.jar</filename> files and specifies whether to
+ use DOS line ending conversion on text files.
+ </para></listitem>
+ <listitem><para><emphasis>basepath:</emphasis>
+ Instructs the unpack stage to strip the specified
+ directories from the source path when unpacking.
+ </para></listitem>
+ <listitem><para><emphasis>subdir:</emphasis>
+ Unpacks the specific URL to the specified subdirectory
+ within the root directory.
+ </para></listitem>
+ </itemizedlist>
+ The unpack call automatically decompresses and extracts files
+        with ".Z", ".z", ".gz", ".xz", ".zip", ".jar", ".ipk", ".rpm",
+ ".srpm", ".deb" and ".bz2" extensions as well as various combinations
+ of tarball extensions.
+ </para>
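+
+    <para>
+        For example, the following (hypothetical) entries show an archive
+        that is copied in without being unpacked and another that is
+        unpacked into a named subdirectory:
+        <literallayout class='monospaced'>
+     SRC_URI = "http://example.com/download/archive.tar.gz;unpack=0"
+     SRC_URI = "http://example.com/download/archive.tar.gz;subdir=mysources"
+        </literallayout>
+    </para>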
+
+ <para>
+ As mentioned, the Git fetcher has its own unpack method that
+ is optimized to work with Git trees.
+ Basically, this method works by cloning the tree into the final
+ directory.
+ The process is completed using references so that there is
+ only one central copy of the Git metadata needed.
+ </para>
+ </section>
+
+ <section id='bb-fetchers'>
+ <title>Fetchers</title>
+
+ <para>
+ As mentioned earlier, the URL prefix determines which
+ fetcher submodule BitBake uses.
+ Each submodule can support different URL parameters,
+ which are described in the following sections.
+ </para>
+
+ <section id='local-file-fetcher'>
+ <title>Local file fetcher (<filename>file://</filename>)</title>
+
+ <para>
+ This submodule handles URLs that begin with
+ <filename>file://</filename>.
+ The filename you specify within the URL can be
+ either an absolute or relative path to a file.
+ If the filename is relative, the contents of the
+ <link linkend='var-FILESPATH'><filename>FILESPATH</filename></link>
+ variable is used in the same way
+ <filename>PATH</filename> is used to find executables.
+ Failing that,
+ <link linkend='var-FILESDIR'><filename>FILESDIR</filename></link>
+ is used to find the appropriate relative file.
+ <note>
+ <filename>FILESDIR</filename> is deprecated and can
+ be replaced with <filename>FILESPATH</filename>.
+ Because <filename>FILESDIR</filename> is likely to be
+ removed, you should not use this variable in any new code.
+ </note>
+ If the file cannot be found, it is assumed that it is available in
+ <link linkend='var-DL_DIR'><filename>DL_DIR</filename></link>
+ by the time the <filename>download()</filename> method is called.
+ </para>
+
+ <para>
+ If you specify a directory, the entire directory is
+ unpacked.
+ </para>
+
+ <para>
+ Here are a couple of example URLs, the first relative and
+ the second absolute:
+ <literallayout class='monospaced'>
+ SRC_URI = "file://relativefile.patch"
+ SRC_URI = "file:///Users/ich/very_important_software"
+ </literallayout>
+ </para>
+ </section>
+
+ <section id='http-ftp-fetcher'>
+ <title>HTTP/FTP wget fetcher (<filename>http://</filename>, <filename>ftp://</filename>, <filename>https://</filename>)</title>
+
+ <para>
+ This fetcher obtains files from web and FTP servers.
+ Internally, the fetcher uses the wget utility.
+ </para>
+
+ <para>
+ The executable and parameters used are specified by the
+ <filename>FETCHCMD_wget</filename> variable, which defaults
+ to sensible values.
+ The fetcher supports a parameter "downloadfilename" that
+ allows the name of the downloaded file to be specified.
+ Specifying the name of the downloaded file is useful
+ for avoiding collisions in
+ <link linkend='var-DL_DIR'><filename>DL_DIR</filename></link>
+ when dealing with multiple files that have the same name.
+ </para>
+
+ <para>
+ Some example URLs are as follows:
+ <literallayout class='monospaced'>
+ SRC_URI = "http://oe.handhelds.org/not_there.aac"
+ SRC_URI = "ftp://oe.handhelds.org/not_there_as_well.aac"
+ SRC_URI = "ftp://you@oe.handhelds.org/home/you/secret.plan"
+ </literallayout>
+ </para>
+ <note>
+ Because URL parameters are delimited by semi-colons, this can
+ introduce ambiguity when parsing URLs that also contain semi-colons,
+ for example:
+ <literallayout class='monospaced'>
+ SRC_URI = "http://abc123.org/git/?p=gcc/gcc.git;a=snapshot;h=a5dd47"
+ </literallayout>
+        Such URLs should be modified by replacing semi-colons with '&' characters:
+ <literallayout class='monospaced'>
+ SRC_URI = "http://abc123.org/git/?p=gcc/gcc.git&a=snapshot&h=a5dd47"
+ </literallayout>
+ In most cases this should work. Treating semi-colons and '&' in queries
+ identically is recommended by the World Wide Web Consortium (W3C).
+ Note that due to the nature of the URL, you may have to specify the name
+ of the downloaded file as well:
+ <literallayout class='monospaced'>
+ SRC_URI = "http://abc123.org/git/?p=gcc/gcc.git&a=snapshot&h=a5dd47;downloadfilename=myfile.bz2"
+ </literallayout>
+ </note>
+ </section>
+
+ <section id='cvs-fetcher'>
+    <title>CVS fetcher (<filename>cvs://</filename>)</title>
+
+ <para>
+ This submodule handles checking out files from the
+ CVS version control system.
+ You can configure it using a number of different variables:
+ <itemizedlist>
+ <listitem><para><emphasis><filename>FETCHCMD_cvs</filename>:</emphasis>
+ The name of the executable to use when running
+ the <filename>cvs</filename> command.
+ This name is usually "cvs".
+ </para></listitem>
+ <listitem><para><emphasis><filename>SRCDATE</filename>:</emphasis>
+ The date to use when fetching the CVS source code.
+ A special value of "now" causes the checkout to
+ be updated on every build.
+ </para></listitem>
+ <listitem><para><emphasis><link linkend='var-CVSDIR'><filename>CVSDIR</filename></link>:</emphasis>
+ Specifies where a temporary checkout is saved.
+ The location is often <filename>DL_DIR/cvs</filename>.
+ </para></listitem>
+ <listitem><para><emphasis><filename>CVS_PROXY_HOST</filename>:</emphasis>
+ The name to use as a "proxy=" parameter to the
+ <filename>cvs</filename> command.
+ </para></listitem>
+ <listitem><para><emphasis><filename>CVS_PROXY_PORT</filename>:</emphasis>
+ The port number to use as a "proxyport=" parameter to
+ the <filename>cvs</filename> command.
+ </para></listitem>
+ </itemizedlist>
+ As well as the standard username and password URL syntax,
+ you can also configure the fetcher with various URL parameters:
+ </para>
+
+ <para>
+ The supported parameters are as follows:
+ <itemizedlist>
+ <listitem><para><emphasis>"method":</emphasis>
+ The protocol over which to communicate with the CVS server.
+ By default, this protocol is "pserver".
+ If "method" is set to "ext", BitBake examines the
+ "rsh" parameter and sets <filename>CVS_RSH</filename>.
+ You can use "dir" for local directories.
+ </para></listitem>
+ <listitem><para><emphasis>"module":</emphasis>
+ Specifies the module to check out.
+ You must supply this parameter.
+ </para></listitem>
+ <listitem><para><emphasis>"tag":</emphasis>
+ Describes which CVS TAG should be used for
+ the checkout.
+ By default, the TAG is empty.
+ </para></listitem>
+ <listitem><para><emphasis>"date":</emphasis>
+ Specifies a date.
+ If no "date" is specified, the
+ <link linkend='var-SRCDATE'><filename>SRCDATE</filename></link>
+ of the configuration is used to checkout a specific date.
+ The special value of "now" causes the checkout to be
+ updated on every build.
+ </para></listitem>
+ <listitem><para><emphasis>"localdir":</emphasis>
+ Used to rename the module.
+ Effectively, you are renaming the output directory
+ to which the module is unpacked.
+ You are forcing the module into a special
+ directory relative to
+ <link linkend='var-CVSDIR'><filename>CVSDIR</filename></link>.
+ </para></listitem>
+        <listitem><para><emphasis>"rsh":</emphasis>
+ Used in conjunction with the "method" parameter.
+ </para></listitem>
+ <listitem><para><emphasis>"scmdata":</emphasis>
+ Causes the CVS metadata to be maintained in the tarball
+ the fetcher creates when set to "keep".
+ The tarball is expanded into the work directory.
+ By default, the CVS metadata is removed.
+ </para></listitem>
+ <listitem><para><emphasis>"fullpath":</emphasis>
+ Controls whether the resulting checkout is at the
+ module level, which is the default, or is at deeper
+ paths.
+ </para></listitem>
+ <listitem><para><emphasis>"norecurse":</emphasis>
+        Causes the fetcher to only check out the specified
+        directory without recursing into any subdirectories.
+ </para></listitem>
+ <listitem><para><emphasis>"port":</emphasis>
+ The port to which the CVS server connects.
+ </para></listitem>
+ </itemizedlist>
+ Some example URLs are as follows:
+ <literallayout class='monospaced'>
+ SRC_URI = "cvs://CVSROOT;module=mymodule;tag=some-version;method=ext"
+ SRC_URI = "cvs://CVSROOT;module=mymodule;date=20060126;localdir=usethat"
+ </literallayout>
+ </para>
+ </section>
+
+ <section id='svn-fetcher'>
+ <title>Subversion (SVN) Fetcher (<filename>svn://</filename>)</title>
+
+ <para>
+ This fetcher submodule fetches code from the
+ Subversion source control system.
+ The executable used is specified by
+ <filename>FETCHCMD_svn</filename>, which defaults
+ to "svn".
+ The fetcher's temporary working directory is set by
+ <link linkend='var-SVNDIR'><filename>SVNDIR</filename></link>,
+ which is usually <filename>DL_DIR/svn</filename>.
+ </para>
+
+ <para>
+ The supported parameters are as follows:
+ <itemizedlist>
+ <listitem><para><emphasis>"module":</emphasis>
+ The name of the svn module to checkout.
+ You must provide this parameter.
+ You can think of this parameter as the top-level
+ directory of the repository data you want.
+ </para></listitem>
+ <listitem><para><emphasis>"protocol":</emphasis>
+ The protocol to use, which defaults to "svn".
+ Other options are "svn+ssh" and "rsh".
+ For "rsh", the "rsh" parameter is also used.
+ </para></listitem>
+ <listitem><para><emphasis>"rev":</emphasis>
+ The revision of the source code to checkout.
+ </para></listitem>
+ <listitem><para><emphasis>"date":</emphasis>
+ The date of the source code to checkout.
+        Checking out a specific revision is generally much safer
+        than checking out by date, as revisions do not involve
+        timezones (i.e. they are much more deterministic).
+ </para></listitem>
+ <listitem><para><emphasis>"scmdata":</emphasis>
+ Causes the “.svn” directories to be available during
+ compile-time when set to "keep".
+ By default, these directories are removed.
+ </para></listitem>
+ <listitem><para><emphasis>"transportuser":</emphasis>
+ When required, sets the username for the transport.
+ By default, this parameter is empty.
+ The transport username is different from the username
+ used in the main URL, which is passed to the subversion
+ command.
+ </para></listitem>
+ </itemizedlist>
+ Following are two examples using svn:
+ <literallayout class='monospaced'>
+ SRC_URI = "svn://svn.oe.handhelds.org/svn;module=vip;proto=http;rev=667"
+ SRC_URI = "svn://svn.oe.handhelds.org/svn/;module=opie;proto=svn+ssh;date=20060126"
+ </literallayout>
+ </para>
+ </section>
+
+ <section id='git-fetcher'>
+ <title>Git Fetcher (<filename>git://</filename>)</title>
+
+ <para>
+ This fetcher submodule fetches code from the Git
+ source control system.
+ The fetcher works by creating a bare clone of the
+ remote into
+ <link linkend='var-GITDIR'><filename>GITDIR</filename></link>,
+ which is usually <filename>DL_DIR/git2</filename>.
+ This bare clone is then cloned into the work directory during the
+ unpack stage when a specific tree is checked out.
+ This is done using alternates and reference cloning to
+ minimize the amount of duplicate data on disk and to
+ make the unpack process fast.
+ The executable used can be set with
+ <filename>FETCHCMD_git</filename>.
+ </para>
+
+ <para>
+ This fetcher supports the following parameters:
+ <itemizedlist>
+ <listitem><para><emphasis>"protocol":</emphasis>
+ The protocol used to fetch the files.
+ The default is "git" when a hostname is set.
+ If a hostname is not set, the Git protocol is "file".
+ You can also use "http", "https", "ssh" and "rsync".
+ </para></listitem>
+ <listitem><para><emphasis>"nocheckout":</emphasis>
+ Tells the fetcher to not checkout source code when
+ unpacking when set to "1".
+ Set this option for the URL where there is a custom
+ routine to checkout code.
+ The default is "0".
+ </para></listitem>
+ <listitem><para><emphasis>"rebaseable":</emphasis>
+ Indicates that the upstream Git repository can be rebased.
+ You should set this parameter to "1" if
+ revisions can become detached from branches.
+ In this case, a source mirror tarball is created per
+ revision, which results in some loss of efficiency.
+ Rebasing the upstream Git repository could cause the
+ current revision to disappear from that repository.
+ This option instructs the fetcher to preserve the local cache
+ carefully for future use.
+ The default value for this parameter is "0".
+ </para></listitem>
+ <listitem><para><emphasis>"nobranch":</emphasis>
+ Tells the fetcher not to perform SHA validation
+ against the branch when set to "1".
+ The default is "0".
+ Set this option for the recipe that refers to
+ the commit that is valid for a tag instead of
+ the branch.
+ </para></listitem>
+ <listitem><para><emphasis>"bareclone":</emphasis>
+ Tells the fetcher to create a bare clone in the
+ destination directory without checking out a working tree.
+ Only the raw Git metadata is provided.
+ This parameter implies the "nocheckout" parameter as well.
+ </para></listitem>
+ <listitem><para><emphasis>"branch":</emphasis>
+ The branch(es) of the Git tree to clone.
+ If unset, this is assumed to be "master".
+ The number of branch parameters must match the number of
+ name parameters.
+ </para></listitem>
+ <listitem><para><emphasis>"rev":</emphasis>
+ The revision to use for the checkout.
+ The default is "master".
+ </para></listitem>
+ <listitem><para><emphasis>"tag":</emphasis>
+ Specifies a tag to use for the checkout.
+ To correctly resolve tags, BitBake must access the
+ network.
+ For that reason, tags are often not used.
+ As far as Git is concerned, the "tag" parameter behaves
+ effectively the same as the "rev" parameter.
+ </para></listitem>
+ <listitem><para><emphasis>"subpath":</emphasis>
+ Limits the checkout to a specific subpath of the tree.
+ By default, the whole tree is checked out.
+ </para></listitem>
+ <listitem><para><emphasis>"destsuffix":</emphasis>
+ The name of the path in which to place the checkout.
+ By default, the path is <filename>git/</filename>.
+ </para></listitem>
+ </itemizedlist>
+ Here are some example URLs:
+ <literallayout class='monospaced'>
+ SRC_URI = "git://git.oe.handhelds.org/git/vip.git;tag=version-1"
+ SRC_URI = "git://git.oe.handhelds.org/git/vip.git;protocol=http"
+ </literallayout>
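+ As one more illustrative sketch (the repository URL and
+ branch name here are hypothetical), a recipe that fetches a
+ specific branch over HTTPS might use:
+ <literallayout class='monospaced'>
+ SRC_URI = "git://git.example.com/myproject.git;protocol=https;branch=develop"
+ </literallayout>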
+ </para>
+ </section>
+
+ <section id='gitsm-fetcher'>
+ <title>Git Submodule Fetcher (<filename>gitsm://</filename>)</title>
+
+ <para>
+ This fetcher submodule inherits from the
+ <link linkend='git-fetcher'>Git fetcher</link> and extends
+ that fetcher's behavior by fetching a repository's submodules.
+ <link linkend='var-SRC_URI'><filename>SRC_URI</filename></link>
+ is passed to the Git fetcher as described in the
+ "<link linkend='git-fetcher'>Git Fetcher (<filename>git://</filename>)</link>"
+ section.
+ <note>
+ <title>Notes and Warnings</title>
+ <para>
+ You must clean a recipe when switching between
+ '<filename>git://</filename>' and
+ '<filename>gitsm://</filename>' URLs.
+ </para>
+
+ <para>
+ The Git Submodules fetcher is not a complete fetcher
+ implementation.
+ The fetcher has known issues where it does not use the
+ normal source mirroring infrastructure properly.
+ </para>
+ </note>
+ </para>
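+
+ <para>
+ As a brief, hypothetical example (the repository URL is
+ illustrative only), a recipe whose upstream source uses Git
+ submodules might simply switch the URL scheme:
+ <literallayout class='monospaced'>
+ SRC_URI = "gitsm://git.example.com/myproject.git;protocol=https;branch=master"
+ </literallayout>
+ The supported URL parameters are the same as those of the
+ Git fetcher.
+ </para>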
+ </section>
+
+ <section id='clearcase-fetcher'>
+ <title>ClearCase Fetcher (<filename>ccrc://</filename>)</title>
+
+ <para>
+ This fetcher submodule fetches code from a
+ <ulink url='http://en.wikipedia.org/wiki/Rational_ClearCase'>ClearCase</ulink>
+ repository.
+ </para>
+
+ <para>
+ To use this fetcher, make sure your recipe has proper
+ <link linkend='var-SRC_URI'><filename>SRC_URI</filename></link>,
+ <link linkend='var-SRCREV'><filename>SRCREV</filename></link>, and
+ <link linkend='var-PV'><filename>PV</filename></link> settings.
+ Here is an example:
+ <literallayout class='monospaced'>
+ SRC_URI = "ccrc://cc.example.org/ccrc;vob=/example_vob;module=/example_module"
+ SRCREV = "EXAMPLE_CLEARCASE_TAG"
+ PV = "${@d.getVar("SRCREV", False).replace("/", "+")}"
+ </literallayout>
+ The fetcher uses the <filename>rcleartool</filename> or
+ <filename>cleartool</filename> remote client, depending on
+ which one is available.
+ </para>
+
+ <para>
+ Following are options for the <filename>SRC_URI</filename>
+ statement:
+ <itemizedlist>
+ <listitem><para><emphasis><filename>vob</filename></emphasis>:
+ The name of the ClearCase VOB, which must include
+ the leading "/" character.
+ This option is required.
+ </para></listitem>
+ <listitem><para><emphasis><filename>module</filename></emphasis>:
+ The module, which must include the
+ leading "/" character, in the selected VOB.
+ <note>
+ The <filename>module</filename> and <filename>vob</filename>
+ options are combined to create the <filename>load</filename> rule in
+ the view config spec.
+ As an example, consider the <filename>vob</filename> and
+ <filename>module</filename> values from the
+ <filename>SRC_URI</filename> statement at the start of this section.
+ Combining those values results in the following:
+ <literallayout class='monospaced'>
+ load /example_vob/example_module
+ </literallayout>
+ </note>
+ </para></listitem>
+ <listitem><para><emphasis><filename>proto</filename></emphasis>:
+ The protocol, which can be either <filename>http</filename> or
+ <filename>https</filename>.
+ </para></listitem>
+ </itemizedlist>
+ </para>
+
+ <para>
+ By default, the fetcher creates a configuration specification.
+ If you want this specification written to an area other than the default,
+ use the <filename>CCASE_CUSTOM_CONFIG_SPEC</filename> variable
+ in your recipe to define where the specification is written.
+ <note>
+ The <filename>SRCREV</filename> variable loses its
+ functionality if you specify this variable.
+ However, <filename>SRCREV</filename> is still used to label the
+ archive after a fetch even though it does not define what is
+ fetched.
+ </note>
+ </para>
+
+ <para>
+ Here are a couple of other behaviors worth mentioning:
+ <itemizedlist>
+ <listitem><para>
+ When using <filename>cleartool</filename>, the login of
+ <filename>cleartool</filename> is handled by the system.
+ The login requires no special steps.
+ </para></listitem>
+ <listitem><para>
+ In order to use <filename>rcleartool</filename> with authenticated
+ users, an "rcleartool login" is necessary before using the fetcher.
+ </para></listitem>
+ </itemizedlist>
+ </para>
+ </section>
+
+ <section id='other-fetchers'>
+ <title>Other Fetchers</title>
+
+ <para>
+ Fetch submodules also exist for the following:
+ <itemizedlist>
+ <listitem><para>
+ Bazaar (<filename>bzr://</filename>)
+ </para></listitem>
+ <listitem><para>
+ Perforce (<filename>p4://</filename>)
+ </para></listitem>
+ <listitem><para>
+ Trees using Git Annex (<filename>gitannex://</filename>)
+ </para></listitem>
+ <listitem><para>
+ Secure FTP (<filename>sftp://</filename>)
+ </para></listitem>
+ <listitem><para>
+ Secure Shell (<filename>ssh://</filename>)
+ </para></listitem>
+ <listitem><para>
+ Repo (<filename>repo://</filename>)
+ </para></listitem>
+ <listitem><para>
+ OSC (<filename>osc://</filename>)
+ </para></listitem>
+ <listitem><para>
+ Mercurial (<filename>hg://</filename>)
+ </para></listitem>
+ </itemizedlist>
+ No documentation currently exists for these lesser-used
+ fetcher submodules.
+ However, you might find the code helpful and readable.
+ </para>
+ </section>
+ </section>
+
+ <section id='auto-revisions'>
+ <title>Auto Revisions</title>
+
+ <para>
+ We need to document <filename>AUTOREV</filename> and
+ <filename>SRCREV_FORMAT</filename> here.
+ </para>
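+
+ <para>
+ In brief, and as a sketch only: setting
+ <link linkend='var-SRCREV'><filename>SRCREV</filename></link>
+ to the value "${AUTOREV}" causes BitBake to use the latest
+ revision available from the upstream source control system
+ rather than a fixed revision:
+ <literallayout class='monospaced'>
+ SRCREV = "${AUTOREV}"
+ </literallayout>
+ When a recipe fetches from more than one source control
+ repository, <filename>SRCREV_FORMAT</filename> describes how
+ the individual revisions are combined into a single version
+ string.
+ </para>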
+ </section>
+</chapter>
diff --git a/bitbake/doc/bitbake-user-manual/bitbake-user-manual-hello.xml b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-hello.xml
new file mode 100644
index 0000000..f3628cf
--- /dev/null
+++ b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-hello.xml
@@ -0,0 +1,506 @@
+<!DOCTYPE chapter PUBLIC "-//OASIS//DTD DocBook XML V4.2//EN"
+"http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd">
+
+<appendix id='hello-world-example'>
+ <title>Hello World Example</title>
+
+ <section id='bitbake-hello-world'>
+ <title>BitBake Hello World</title>
+
+ <para>
+ The simplest example commonly used to demonstrate any new
+ programming language or tool is the
+ "<ulink url="http://en.wikipedia.org/wiki/Hello_world_program">Hello World</ulink>"
+ example.
+ This appendix demonstrates, in tutorial form, Hello
+ World within the context of BitBake.
+ The tutorial describes how to create a new project
+ and the applicable metadata files necessary to allow
+ BitBake to build it.
+ </para>
+ </section>
+
+ <section id='example-obtaining-bitbake'>
+ <title>Obtaining BitBake</title>
+
+ <para>
+ See the
+ "<link linkend='obtaining-bitbake'>Obtaining BitBake</link>"
+ section for information on how to obtain BitBake.
+ Once you have the source code on your machine, the BitBake directory
+ appears as follows:
+ <literallayout class='monospaced'>
+ $ ls -al
+ total 100
+ drwxrwxr-x. 9 wmat wmat 4096 Jan 31 13:44 .
+ drwxrwxr-x. 3 wmat wmat 4096 Feb 4 10:45 ..
+ -rw-rw-r--. 1 wmat wmat 365 Nov 26 04:55 AUTHORS
+ drwxrwxr-x. 2 wmat wmat 4096 Nov 26 04:55 bin
+ drwxrwxr-x. 4 wmat wmat 4096 Jan 31 13:44 build
+ -rw-rw-r--. 1 wmat wmat 16501 Nov 26 04:55 ChangeLog
+ drwxrwxr-x. 2 wmat wmat 4096 Nov 26 04:55 classes
+ drwxrwxr-x. 2 wmat wmat 4096 Nov 26 04:55 conf
+ drwxrwxr-x. 3 wmat wmat 4096 Nov 26 04:55 contrib
+ -rw-rw-r--. 1 wmat wmat 17987 Nov 26 04:55 COPYING
+ drwxrwxr-x. 3 wmat wmat 4096 Nov 26 04:55 doc
+ -rw-rw-r--. 1 wmat wmat 69 Nov 26 04:55 .gitignore
+ -rw-rw-r--. 1 wmat wmat 849 Nov 26 04:55 HEADER
+ drwxrwxr-x. 5 wmat wmat 4096 Jan 31 13:44 lib
+ -rw-rw-r--. 1 wmat wmat 195 Nov 26 04:55 MANIFEST.in
+ -rwxrwxr-x. 1 wmat wmat 3195 Jan 31 11:57 setup.py
+ -rw-rw-r--. 1 wmat wmat 2887 Nov 26 04:55 TODO
+ </literallayout>
+ </para>
+
+ <para>
+ At this point, you should have BitBake cloned to
+ a directory that matches the previous listing except for
+ dates and user names.
+ </para>
+ </section>
+
+ <section id='setting-up-the-bitbake-environment'>
+ <title>Setting Up the BitBake Environment</title>
+
+ <para>
+ First, you need to be sure that you can run BitBake.
+ Set your working directory to where your local BitBake
+ files are and run the following command:
+ <literallayout class='monospaced'>
+ $ ./bin/bitbake --version
+ BitBake Build Tool Core version 1.23.0, bitbake version 1.23.0
+ </literallayout>
+ The console output tells you what version you are running.
+ </para>
+
+ <para>
+ The recommended method to run BitBake is from a directory of your
+ choice.
+ To be able to run BitBake from any directory, you need to add
+ the directory containing the executable binary to your shell's
+ <filename>PATH</filename> environment variable.
+ First, look at your current <filename>PATH</filename> variable
+ by entering the following:
+ <literallayout class='monospaced'>
+ $ echo $PATH
+ </literallayout>
+ Next, add the directory location for the BitBake binary to the
+ <filename>PATH</filename>.
+ Here is an example that adds the
+ <filename>/home/scott-lenovo/bitbake/bin</filename> directory
+ to the front of the <filename>PATH</filename> variable:
+ <literallayout class='monospaced'>
+ $ export PATH=/home/scott-lenovo/bitbake/bin:$PATH
+ </literallayout>
+ You should now be able to enter the <filename>bitbake</filename>
+ command from the command line while working from any directory.
+ </para>
+ </section>
+
+ <section id='the-hello-world-example'>
+ <title>The Hello World Example</title>
+
+ <para>
+ The overall goal of this exercise is to build a
+ complete "Hello World" example utilizing task and layer
+ concepts.
+ Because this is how modern projects such as OpenEmbedded and
+ the Yocto Project utilize BitBake, the example
+ provides an excellent starting point for understanding
+ BitBake.
+ </para>
+
+ <para>
+ To help you understand how to use BitBake to build targets,
+ the example starts with nothing but the <filename>bitbake</filename>
+ command, which causes BitBake to fail and report problems.
+ The example progresses by adding pieces to the build to
+ eventually conclude with a working, minimal "Hello World"
+ example.
+ </para>
+
+ <para>
+ While every attempt is made to explain what is happening during
+ the example, the descriptions cannot cover everything.
+ You can find further information throughout this manual.
+ Also, you can actively participate in the
+ <ulink url='http://lists.openembedded.org/mailman/listinfo/bitbake-devel'></ulink>
+ discussion mailing list about the BitBake build tool.
+ </para>
+
+ <note>
+ This example was inspired by and drew heavily from these sources:
+ <itemizedlist>
+ <listitem><para>
+ <ulink url="http://www.mail-archive.com/yocto@yoctoproject.org/msg09379.html">Mailing List post - The BitBake equivalent of "Hello, World!"</ulink>
+ </para></listitem>
+ <listitem><para>
+ <ulink url="http://hambedded.org/blog/2012/11/24/from-bitbake-hello-world-to-an-image/">Hambedded Linux blog post - From Bitbake Hello World to an Image</ulink>
+ </para></listitem>
+ </itemizedlist>
+ </note>
+
+ <para>
+ As stated earlier, the goal of this example
+ is to eventually compile "Hello World".
+ However, it is unknown what BitBake needs and what you have
+ to provide in order to achieve that goal.
+ Recall that BitBake utilizes three types of metadata files:
+ <link linkend='configuration-files'>Configuration Files</link>,
+ <link linkend='classes'>Classes</link>, and
+ <link linkend='recipes'>Recipes</link>.
+ But where do they go?
+ How does BitBake find them?
+ BitBake's error messaging helps you answer these types of questions
+ and helps you better understand exactly what is going on.
+ </para>
+
+ <para>
+ Following is the complete "Hello World" example.
+ </para>
+
+ <orderedlist>
+ <listitem><para><emphasis>Create a Project Directory:</emphasis>
+ First, set up a directory for the "Hello World" project.
+ Here is how you can do so in your home directory:
+ <literallayout class='monospaced'>
+ $ mkdir ~/hello
+ $ cd ~/hello
+ </literallayout>
+ This is the directory that BitBake will use to do all of
+ its work.
+ You can use this directory to keep all the metadata files needed
+ by BitBake.
+ Having a project directory is a good way to isolate your
+ project.
+ </para></listitem>
+ <listitem><para><emphasis>Run BitBake:</emphasis>
+ At this point, you have nothing but a project directory.
+ Run the <filename>bitbake</filename> command and see what
+ it does:
+ <literallayout class='monospaced'>
+ $ bitbake
+ The BBPATH variable is not set and bitbake did not
+ find a conf/bblayers.conf file in the expected location.
+ Maybe you accidentally invoked bitbake from the wrong directory?
+ DEBUG: Removed the following variables from the environment:
+ GNOME_DESKTOP_SESSION_ID, XDG_CURRENT_DESKTOP,
+ GNOME_KEYRING_CONTROL, DISPLAY, SSH_AGENT_PID, LANG, no_proxy,
+ XDG_SESSION_PATH, XAUTHORITY, SESSION_MANAGER, SHLVL,
+ MANDATORY_PATH, COMPIZ_CONFIG_PROFILE, WINDOWID, EDITOR,
+ GPG_AGENT_INFO, SSH_AUTH_SOCK, GDMSESSION, GNOME_KEYRING_PID,
+ XDG_SEAT_PATH, XDG_CONFIG_DIRS, LESSOPEN, DBUS_SESSION_BUS_ADDRESS,
+ _, XDG_SESSION_COOKIE, DESKTOP_SESSION, LESSCLOSE, DEFAULTS_PATH,
+ UBUNTU_MENUPROXY, OLDPWD, XDG_DATA_DIRS, COLORTERM, LS_COLORS
+ </literallayout>
+ The majority of this output is specific to environment variables
+ that are not directly relevant to BitBake.
+ However, the very first message regarding the
+ <filename>BBPATH</filename> variable and the
+ <filename>conf/bblayers.conf</filename> file
+ is relevant.</para>
+ <para>
+ When you run BitBake, it begins looking for metadata files.
+ The
+ <link linkend='var-BBPATH'><filename>BBPATH</filename></link>
+ variable is what tells BitBake where to look for those files.
+ <filename>BBPATH</filename> is not set and you need to set it.
+ Without <filename>BBPATH</filename>, BitBake cannot
+ find any configuration files (<filename>.conf</filename>)
+ or recipe files (<filename>.bb</filename>) at all.
+ BitBake also cannot find the <filename>bitbake.conf</filename>
+ file.
+ </para></listitem>
+ <listitem><para><emphasis>Setting <filename>BBPATH</filename>:</emphasis>
+ For this example, you can set <filename>BBPATH</filename>
+ in the same manner that you set <filename>PATH</filename>
+ earlier in the appendix.
+ You should realize, though, that it is much more flexible to set the
+ <filename>BBPATH</filename> variable up in a configuration
+ file for each project.</para>
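+ <para>As a sketch of that configuration-file approach (it is
+ not needed for this example), many BitBake-based projects,
+ OpenEmbedded for example, set the variable in
+ <filename>conf/bblayers.conf</filename> relative to the
+ project's top-level directory:
+ <literallayout class='monospaced'>
+ BBPATH = "${<link linkend='var-TOPDIR'>TOPDIR</link>}"
+ </literallayout></para>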
+ <para>From your shell, enter the following commands to set and
+ export the <filename>BBPATH</filename> variable:
+ <literallayout class='monospaced'>
+ $ BBPATH="<replaceable>projectdirectory</replaceable>"
+ $ export BBPATH
+ </literallayout>
+ Use your actual project directory in the command.
+ BitBake uses that directory to find the metadata it needs for
+ your project.
+ <note>
+ When specifying your project directory, do not use the
+ tilde ("~") character as BitBake does not expand that character
+ as the shell would.
+ </note>
+ </para></listitem>
+ <listitem><para><emphasis>Run BitBake:</emphasis>
+ Now that you have <filename>BBPATH</filename> defined, run
+ the <filename>bitbake</filename> command again:
+ <literallayout class='monospaced'>
+ $ bitbake
+ ERROR: Traceback (most recent call last):
+ File "/home/scott-lenovo/bitbake/lib/bb/cookerdata.py", line 163, in wrapped
+ return func(fn, *args)
+ File "/home/scott-lenovo/bitbake/lib/bb/cookerdata.py", line 173, in parse_config_file
+ return bb.parse.handle(fn, data, include)
+ File "/home/scott-lenovo/bitbake/lib/bb/parse/__init__.py", line 99, in handle
+ return h['handle'](fn, data, include)
+ File "/home/scott-lenovo/bitbake/lib/bb/parse/parse_py/ConfHandler.py", line 120, in handle
+ abs_fn = resolve_file(fn, data)
+ File "/home/scott-lenovo/bitbake/lib/bb/parse/__init__.py", line 117, in resolve_file
+ raise IOError("file %s not found in %s" % (fn, bbpath))
+ IOError: file conf/bitbake.conf not found in /home/scott-lenovo/hello
+
+ ERROR: Unable to parse conf/bitbake.conf: file conf/bitbake.conf not found in /home/scott-lenovo/hello
+ </literallayout>
+ This sample output shows that BitBake could not find the
+ <filename>conf/bitbake.conf</filename> file in the project
+ directory.
+ This file is the first thing BitBake must find in order
+ to build a target.
+ And, since the project directory for this example is
+ empty, you need to provide a <filename>conf/bitbake.conf</filename>
+ file.
+ </para></listitem>
+ <listitem><para><emphasis>Creating <filename>conf/bitbake.conf</filename>:</emphasis>
+ The <filename>conf/bitbake.conf</filename> includes a number of
+ configuration variables BitBake uses for metadata and recipe
+ files.
+ For this example, you need to create the file in your project directory
+ and define some key BitBake variables.
+ For more information on the <filename>bitbake.conf</filename> file,
+ see
+ <ulink url='http://hambedded.org/blog/2012/11/24/from-bitbake-hello-world-to-an-image/#an-overview-of-bitbakeconf'></ulink>.
+ </para>
+ <para>Use the following command to create the <filename>conf</filename>
+ directory in the project directory:
+ <literallayout class='monospaced'>
+ $ mkdir conf
+ </literallayout>
+ From within the <filename>conf</filename> directory, use
+ an editor to create the <filename>bitbake.conf</filename> file
+ so that it contains the following:
+ <literallayout class='monospaced'>
+ TMPDIR = "${<link linkend='var-TOPDIR'>TOPDIR</link>}/tmp"
+ <link linkend='var-CACHE'>CACHE</link> = "${TMPDIR}/cache"
+ <link linkend='var-STAMP'>STAMP</link> = "${TMPDIR}/stamps"
+ <link linkend='var-T'>T</link> = "${TMPDIR}/work"
+ <link linkend='var-B'>B</link> = "${TMPDIR}"
+ </literallayout>
+ The <filename>TMPDIR</filename> variable establishes a directory
+ that BitBake uses for build output and intermediate files (other
+ than the cached information used by the
+ <link linkend='setscene'>Setscene</link> process).
+ Here, the <filename>TMPDIR</filename> directory is set to
+ <filename>hello/tmp</filename>.
+ <note><title>Tip</title>
+ You can always safely delete the <filename>tmp</filename>
+ directory in order to rebuild a BitBake target.
+ The build process creates the directory for you
+ when you run BitBake.
+ </note></para>
+ <para>For information about each of the other variables defined in this
+ example, click on the links to take you to the definitions in
+ the glossary.
+ </para></listitem>
+ <listitem><para><emphasis>Run BitBake:</emphasis>
+ After making sure that the <filename>conf/bitbake.conf</filename>
+ file exists, you can run the <filename>bitbake</filename>
+ command again:
+ <literallayout class='monospaced'>
+$ bitbake
+ERROR: Traceback (most recent call last):
+ File "/home/scott-lenovo/bitbake/lib/bb/cookerdata.py", line 163, in wrapped
+ return func(fn, *args)
+ File "/home/scott-lenovo/bitbake/lib/bb/cookerdata.py", line 177, in _inherit
+ bb.parse.BBHandler.inherit(bbclass, "configuration INHERITs", 0, data)
+ File "/home/scott-lenovo/bitbake/lib/bb/parse/parse_py/BBHandler.py", line 92, in inherit
+ include(fn, file, lineno, d, "inherit")
+ File "/home/scott-lenovo/bitbake/lib/bb/parse/parse_py/ConfHandler.py", line 100, in include
+ raise ParseError("Could not %(error_out)s file %(fn)s" % vars(), oldfn, lineno)
+ParseError: ParseError in configuration INHERITs: Could not inherit file classes/base.bbclass
+
+ERROR: Unable to parse base: ParseError in configuration INHERITs: Could not inherit file classes/base.bbclass
+ </literallayout>
+ In the sample output, BitBake could not find the
+ <filename>classes/base.bbclass</filename> file.
+ You need to create that file next.
+ </para></listitem>
+ <listitem><para><emphasis>Creating <filename>classes/base.bbclass</filename>:</emphasis>
+ BitBake uses class files to provide common code and functionality.
+ The minimally required class for BitBake is the
+ <filename>classes/base.bbclass</filename> file.
+ The <filename>base</filename> class is implicitly inherited by
+ every recipe.
+ BitBake looks for the class in the <filename>classes</filename>
+ directory of the project (i.e. <filename>hello/classes</filename>
+ in this example).
+ </para>
+ <para>Create the <filename>classes</filename> directory as follows:
+ <literallayout class='monospaced'>
+ $ cd $HOME/hello
+ $ mkdir classes
+ </literallayout>
+ Move to the <filename>classes</filename> directory and then
+ create the <filename>base.bbclass</filename> file by inserting
+ this single line:
+ <literallayout class='monospaced'>
+ addtask build
+ </literallayout>
+ The minimal task that BitBake runs is the
+ <filename>do_build</filename> task.
+ This is all the example needs in order to build the project.
+ Of course, the <filename>base.bbclass</filename> can have much
+ more depending on which build environments BitBake is
+ supporting.
+ For more information on the <filename>base.bbclass</filename> file,
+ you can look at
+ <ulink url='http://hambedded.org/blog/2012/11/24/from-bitbake-hello-world-to-an-image/#tasks'></ulink>.
+ </para></listitem>
+ <listitem><para><emphasis>Run BitBake:</emphasis>
+ After making sure that the <filename>classes/base.bbclass</filename>
+ file exists, you can run the <filename>bitbake</filename>
+ command again:
+ <literallayout class='monospaced'>
+ $ bitbake
+ Nothing to do. Use 'bitbake world' to build everything, or run 'bitbake --help' for usage information.
+ </literallayout>
+ BitBake is finally reporting no errors.
+ However, you can see that it really does not have anything
+ to do.
+ You need to create a recipe that gives BitBake something to do.
+ </para></listitem>
+ <listitem><para><emphasis>Creating a Layer:</emphasis>
+ While it is not really necessary for such a small example,
+ it is good practice to create a layer in which to keep your
+ code separate from the general metadata used by BitBake.
+ Thus, this example creates and uses a layer called "mylayer".
+ <note>
+ You can find additional information on adding a layer at
+ <ulink url='http://hambedded.org/blog/2012/11/24/from-bitbake-hello-world-to-an-image/#adding-an-example-layer'></ulink>.
+ </note>
+ </para>
+ <para>Minimally, you need a recipe file and a layer configuration
+ file in your layer.
+ The configuration file needs to be in the <filename>conf</filename>
+ directory inside the layer.
+ Use these commands to set up the layer and the <filename>conf</filename>
+ directory:
+ <literallayout class='monospaced'>
+ $ cd $HOME
+ $ mkdir mylayer
+ $ cd mylayer
+ $ mkdir conf
+ </literallayout>
+ Move to the <filename>conf</filename> directory and create a
+ <filename>layer.conf</filename> file that has the following:
+ <literallayout class='monospaced'>
+ BBPATH .= ":${<link linkend='var-LAYERDIR'>LAYERDIR</link>}"
+
+ <link linkend='var-BBFILES'>BBFILES</link> += "${LAYERDIR}/*.bb"
+
+ <link linkend='var-BBFILE_COLLECTIONS'>BBFILE_COLLECTIONS</link> += "mylayer"
+ <link linkend='var-BBFILE_PATTERN'>BBFILE_PATTERN_mylayer</link> := "^${LAYERDIR}/"
+ </literallayout>
+ For information on these variables, click the links
+ to go to the definitions in the glossary.</para>
+ <para>You need to create the recipe file next.
+ Inside your layer at the top-level, use an editor and create
+ a recipe file named <filename>printhello.bb</filename> that
+ has the following:
+ <literallayout class='monospaced'>
+ <link linkend='var-DESCRIPTION'>DESCRIPTION</link> = "Prints Hello World"
+ <link linkend='var-PN'>PN</link> = 'printhello'
+ <link linkend='var-PV'>PV</link> = '1'
+
+ python do_build() {
+ bb.plain("********************");
+ bb.plain("* *");
+ bb.plain("* Hello, World! *");
+ bb.plain("* *");
+ bb.plain("********************");
+ }
+ </literallayout>
+ The recipe file simply provides a description of the
+ recipe, the name, version, and the <filename>do_build</filename>
+ task, which prints out "Hello World" to the console.
+ For more information on these variables, follow the links
+ to the glossary.
+ </para></listitem>
+ <listitem><para><emphasis>Run BitBake With a Target:</emphasis>
+ Now that a BitBake target exists, run the command and provide
+ that target:
+ <literallayout class='monospaced'>
+ $ cd $HOME/hello
+ $ bitbake printhello
+ ERROR: no recipe files to build, check your BBPATH and BBFILES?
+
+ Summary: There was 1 ERROR message shown, returning a non-zero exit code.
+ </literallayout>
+ We have created the layer with the recipe and the layer
+ configuration file but it still seems that BitBake cannot
+ find the recipe.
+ BitBake needs a <filename>conf/bblayers.conf</filename> that
+ lists the layers for the project.
+ Without this file, BitBake cannot find the recipe.
+ </para></listitem>
+ <listitem><para><emphasis>Creating <filename>conf/bblayers.conf</filename>:</emphasis>
+ BitBake uses the <filename>conf/bblayers.conf</filename> file
+ to locate layers needed for the project.
+ This file must reside in the <filename>conf</filename> directory
+ of the project (i.e. <filename>hello/conf</filename> for this
+ example).</para>
+ <para>Set your working directory to the <filename>hello/conf</filename>
+ directory and then create the <filename>bblayers.conf</filename>
+ file so that it contains the following:
+ <literallayout class='monospaced'>
+ BBLAYERS ?= " \
+ /home/<replaceable>you</replaceable>/mylayer \
+ "
+ </literallayout>
+ You need to provide your own information for
+ <filename>you</filename> in the file.
+ </para></listitem>
+ <listitem><para><emphasis>Run BitBake With a Target:</emphasis>
+ Now that you have supplied the <filename>bblayers.conf</filename>
+ file, run the <filename>bitbake</filename> command and provide
+ the target:
+ <literallayout class='monospaced'>
+ $ bitbake printhello
+ Parsing recipes: 100% |##################################################################################|
+ Time: 00:00:00
+ Parsing of 1 .bb files complete (0 cached, 1 parsed). 1 targets, 0 skipped, 0 masked, 0 errors.
+ NOTE: Resolving any missing task queue dependencies
+ NOTE: Preparing RunQueue
+ NOTE: Executing RunQueue Tasks
+ ********************
+ * *
+ * Hello, World! *
+ * *
+ ********************
+ NOTE: Tasks Summary: Attempted 1 tasks of which 0 didn't need to be rerun and all succeeded.
+ </literallayout>
+ BitBake finds the <filename>printhello</filename> recipe and
+ successfully runs the task.
+ <note>
+ After the first execution, re-running
+ <filename>bitbake printhello</filename> again will not
+ result in a BitBake run that prints the same console
+ output.
+ The reason for this is that the first time the
+ <filename>printhello.bb</filename> recipe's
+ <filename>do_build</filename> task executes
+ successfully, BitBake writes a stamp file for the task.
+ Thus, the next time you attempt to run the task
+ using that same <filename>bitbake</filename> command,
+ BitBake notices the stamp and therefore determines
+ that the task does not need to be re-run.
+ If you delete the <filename>tmp</filename> directory
+ or run <filename>bitbake -c clean printhello</filename>
+ and then re-run the build, the "Hello, World!" message will
+ be printed again.
+ </note>
+ </para></listitem>
+ </orderedlist>
+ </section>
+</appendix>
diff --git a/bitbake/doc/bitbake-user-manual/bitbake-user-manual-intro.xml b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-intro.xml
new file mode 100644
index 0000000..2188655
--- /dev/null
+++ b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-intro.xml
@@ -0,0 +1,685 @@
+<!DOCTYPE chapter PUBLIC "-//OASIS//DTD DocBook XML V4.2//EN"
+ "http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd">
+
+<chapter id="bitbake-user-manual-intro">
+ <title>Overview</title>
+
+ <para>
+ Welcome to the BitBake User Manual.
+ This manual provides information on the BitBake tool.
+ The information attempts to be as independent as possible regarding
+ systems that use BitBake, such as OpenEmbedded and the
+ Yocto Project.
+ In some cases, scenarios or examples within the context of
+ a build system are used in the manual to help with understanding.
+ For these cases, the manual clearly states the context.
+ </para>
+
+ <section id="intro">
+ <title>Introduction</title>
+
+ <para>
+ Fundamentally, BitBake is a generic task execution
+ engine that allows shell and Python tasks to be run
+ efficiently and in parallel while working within
+ complex inter-task dependency constraints.
+ One of BitBake's main users, OpenEmbedded, takes this core
+ and builds embedded Linux software stacks using
+ a task-oriented approach.
+ </para>
+
+ <para>
+ Conceptually, BitBake is similar to GNU Make in
+ some regards but has significant differences:
+ <itemizedlist>
+ <listitem><para>
+ BitBake executes tasks according to provided
+ metadata that builds up the tasks.
+ Metadata is stored in recipe (<filename>.bb</filename>)
+ and related recipe "append" (<filename>.bbappend</filename>)
+ files, configuration (<filename>.conf</filename>) and
+ underlying include (<filename>.inc</filename>) files, and
+ in class (<filename>.bbclass</filename>) files.
+ The metadata provides
+ BitBake with instructions on what tasks to run and
+ the dependencies between those tasks.
+ </para></listitem>
+ <listitem><para>
+ BitBake includes a fetcher library for obtaining source
+ code from various places such as local files, source control
+ systems, or websites.
+ </para></listitem>
+ <listitem><para>
+ The instructions for each unit to be built (e.g. a piece
+ of software) are known as "recipe" files and
+ contain all the information about the unit
+ (dependencies, source file locations, checksums, description
+ and so on).
+ </para></listitem>
+ <listitem><para>
+ BitBake includes a client/server abstraction and can
+ be used from the command line or as a service over
+ XML-RPC. It also has several different user interfaces.
+ </para></listitem>
+ </itemizedlist>
+ </para>
+ </section>
+
+ <section id="history-and-goals">
+ <title>History and Goals</title>
+
+ <para>
+ BitBake was originally a part of the OpenEmbedded project.
+ It was inspired by the Portage package management system
+ used by the Gentoo Linux distribution.
+ On December 7, 2004, OpenEmbedded project team member
+ Chris Larson split the project into two distinct pieces:
+ <itemizedlist>
+ <listitem><para>BitBake, a generic task executor</para></listitem>
+ <listitem><para>OpenEmbedded, a metadata set utilized by
+ BitBake</para></listitem>
+ </itemizedlist>
+ Today, BitBake is the primary basis of the
+ <ulink url="http://www.openembedded.org/">OpenEmbedded</ulink>
+ project, which is being used to build and maintain Linux
+ distributions such as the
+ <ulink url='http://www.angstrom-distribution.org/'>Angstrom Distribution</ulink>,
+ and which is also being used as the build tool for Linux projects
+ such as the
+ <ulink url='http://www.yoctoproject.org'>Yocto Project</ulink>.
+ </para>
+
+ <para>
+ Prior to BitBake, no other build tool adequately met the needs of
+ an aspiring embedded Linux distribution.
+ All of the build systems used by traditional desktop Linux
+ distributions lacked important functionality, and none of the
+ ad hoc Buildroot-based systems, prevalent in the
+ embedded space, were scalable or maintainable.
+ </para>
+
+ <para>
+ Some important original goals for BitBake were:
+ <itemizedlist>
+ <listitem><para>
+ Handle cross-compilation.
+ </para></listitem>
+ <listitem><para>
+ Handle inter-package dependencies (build time on
+ target architecture, build time on native
+ architecture, and runtime).
+ </para></listitem>
+ <listitem><para>
+ Support running any number of tasks within a given
+ package, including, but not limited to, fetching
+ upstream sources, unpacking them, patching them,
+ configuring them, and so forth.
+ </para></listitem>
+ <listitem><para>
+ Be Linux distribution agnostic for both build and
+ target systems.
+ </para></listitem>
+ <listitem><para>
+ Be architecture agnostic.
+ </para></listitem>
+ <listitem><para>
+ Support multiple build and target operating systems
+ (e.g. Cygwin, the BSDs, and so forth).
+ </para></listitem>
+ <listitem><para>
+ Be self contained, rather than tightly
+ integrated into the build machine's root
+ filesystem.
+ </para></listitem>
+ <listitem><para>
+ Handle conditional metadata on the target architecture,
+ operating system, distribution, and machine.
+ </para></listitem>
+ <listitem><para>
+ Make it easy to supply local metadata and packages
+ against which to operate.
+ </para></listitem>
+ <listitem><para>
+ Make it easy for multiple projects to use BitBake to
+ collaborate on their builds.
+ </para></listitem>
+ <listitem><para>
+ Provide an inheritance mechanism to share
+ common metadata between many packages.
+ </para></listitem>
+ </itemizedlist>
+ Over time it became apparent that some further requirements
+ were necessary:
+ <itemizedlist>
+ <listitem><para>
+ Handle variants of a base recipe (e.g. native, sdk,
+ and multilib).
+ </para></listitem>
+ <listitem><para>
+ Split metadata into layers and allow layers
+ to enhance or override other layers.
+ </para></listitem>
+ <listitem><para>
+ Allow representation of a given set of input variables
+ to a task as a checksum.
+ Based on that checksum, allow acceleration of builds
+ with prebuilt components.
+ </para></listitem>
+ </itemizedlist>
+ BitBake satisfies all the original requirements and many more
+ with extensions being made to the basic functionality to
+ reflect the additional requirements.
+ Flexibility and power have always been the priorities.
+ BitBake is highly extensible and supports embedded Python code and
+ execution of any arbitrary tasks.
+ </para>
+ </section>
+
+ <section id="Concepts">
+ <title>Concepts</title>
+
+ <para>
+ BitBake is a program written in the Python language.
+ At the highest level, BitBake interprets metadata, decides
+ what tasks are required to run, and executes those tasks.
+ Similar to GNU Make, BitBake controls how software is
+ built.
+ GNU Make achieves its control through "makefiles", while
+ BitBake uses "recipes".
+ </para>
+
+ <para>
+ BitBake extends the capabilities of a simple
+ tool like GNU Make by allowing for the definition of much more
+ complex tasks, such as assembling entire embedded Linux
+ distributions.
+ </para>
+
+ <para>
+ The remainder of this section introduces several concepts
+ that should be understood in order to better leverage
+ the power of BitBake.
+ </para>
+
+ <section id='recipes'>
+ <title>Recipes</title>
+
+ <para>
+ BitBake Recipes, which are denoted by the file extension
+ <filename>.bb</filename>, are the most basic metadata files.
+ These recipe files provide BitBake with the following:
+ <itemizedlist>
+ <listitem><para>Descriptive information about the
+ package (author, homepage, license, and so on)</para></listitem>
+ <listitem><para>The version of the recipe</para></listitem>
+ <listitem><para>Existing dependencies (both build
+ and runtime dependencies)</para></listitem>
+ <listitem><para>Where the source code resides and
+ how to fetch it</para></listitem>
+ <listitem><para>Whether the source code requires
+ any patches, where to find them, and how to apply
+ them</para></listitem>
+ <listitem><para>How to configure and compile the
+ source code</para></listitem>
+ <listitem><para>Where on the target machine to install the
+ package or packages created</para></listitem>
+ </itemizedlist>
+ </para>
+
+ <para>
+ Within the context of BitBake, or any project utilizing BitBake
+ as its build system, files with the <filename>.bb</filename>
+ extension are referred to as recipes.
+ <note>
+ The term "package" is also commonly used to describe recipes.
+ However, since the same word is used to describe packaged
+ output from a project, it is best to maintain a single
+ descriptive term - "recipes".
+ Put another way, a single "recipe" file is quite capable
+ of generating a number of related but separately installable
+ "packages".
+ In fact, that ability is fairly common.
+ </note>
+ </para>
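+
+ <para>
+ As a small, purely illustrative sketch (the package name and
+ source location here are hypothetical), a recipe might contain
+ entries such as the following:
+ <literallayout class='monospaced'>
+ DESCRIPTION = "Example application"
+ PN = "example-app"
+ PV = "1.0"
+ SRC_URI = "http://example.com/releases/example-app-1.0.tar.gz"
+ </literallayout>
+ The variables used here are described in the glossary of this
+ manual.
+ </para>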
+ </section>
+
+ <section id='configuration-files'>
+ <title>Configuration Files</title>
+
+ <para>
+ Configuration files, which are denoted by the
+ <filename>.conf</filename> extension, define
+ various configuration variables that govern the project's build
+ process.
+ These files fall into several areas that define
+ machine configuration options, distribution configuration
+ options, compiler tuning options, general common
+ configuration options, and user configuration options.
+ The main configuration file is the sample
+ <filename>bitbake.conf</filename> file, which is
+ located within the BitBake source tree
+ <filename>conf</filename> directory.
+ </para>
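+
+ <para>
+ As a brief sketch, a configuration file simply contains
+ variable assignments; the values shown here are illustrative
+ only:
+ <literallayout class='monospaced'>
+ BB_NUMBER_THREADS = "4"
+ TMPDIR = "${TOPDIR}/tmp"
+ </literallayout>
+ </para>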
+ </section>
+
+ <section id='classes'>
+ <title>Classes</title>
+
+ <para>
+ Class files, which are denoted by the
+ <filename>.bbclass</filename> extension, contain
+ information that is useful to share between metadata files.
+ The BitBake source tree currently comes with one class metadata file
+ called <filename>base.bbclass</filename>.
+ You can find this file in the
+ <filename>classes</filename> directory.
+ The <filename>base.bbclass</filename> class file is special since it
+ is always included automatically for all recipes
+ and classes.
+ This class contains definitions for standard basic tasks such
+ as fetching, unpacking, configuring (empty by default),
+ compiling (runs any Makefile present), installing (empty by
+ default) and packaging (empty by default).
+ These tasks are often overridden or extended by other classes
+ added during the project development process.
+ </para>
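+
+ <para>
+ A recipe pulls in a class with the
+ <filename>inherit</filename> directive.
+ For example, the following line (using a hypothetical class
+ named <filename>myclass.bbclass</filename>) makes that class's
+ definitions available to the recipe:
+ <literallayout class='monospaced'>
+ inherit myclass
+ </literallayout>
+ </para>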
+ </section>
+
+ <section id='layers'>
+ <title>Layers</title>
+
+ <para>
+ Layers allow you to isolate different types of
+ customizations from each other.
+ While you might find it tempting to keep everything in one layer
+ when working on a single project, the more modular you organize
+ your metadata, the easier it is to cope with future changes.
+ </para>
+
+ <para>
+ To illustrate how you can use layers to keep things modular,
+ consider customizations you might make to support a specific target machine.
+ These types of customizations typically reside in a special layer,
+ rather than a general layer, called a Board Support Package (BSP)
+ Layer.
+ Furthermore, the machine customizations should be isolated from
+ recipes and metadata that support a new GUI environment, for
+ example.
+ This situation gives you a couple of layers: one for the machine
+ configurations and one for the GUI environment.
+ It is important to understand, however, that the BSP layer can still
+ make machine-specific additions to recipes within
+ the GUI environment layer without polluting the GUI layer itself
+ with those machine-specific changes.
+ You can accomplish this through a recipe that is a BitBake append
+ (<filename>.bbappend</filename>) file.
+ </para>
+ </section>
+
+ <section id='append-bbappend-files'>
+ <title>Append Files</title>
+
+ <para>
+ Append files, which are files that have the
+ <filename>.bbappend</filename> file extension, extend or
+ override information in an existing recipe file.
+ </para>
+
+ <para>
+ BitBake expects every append file to have a corresponding recipe file.
+ Furthermore, the append file and corresponding recipe file
+ must use the same root filename.
+ The filenames can differ only in the file type suffix used
+ (e.g. <filename>formfactor_0.0.bb</filename> and
+ <filename>formfactor_0.0.bbappend</filename>).
+ </para>
+
+ <para>
+ Information in append files extends or
+ overrides the information in the underlying,
+ similarly-named recipe files.
+ </para>
+
+ <para>
+ When you name an append file, you can use the
+ wildcard character (%) to allow for matching recipe names.
+ For example, suppose you have an append file named
+ as follows:
+ <literallayout class='monospaced'>
+ busybox_1.21.%.bbappend
+ </literallayout>
+ That append file would match any <filename>busybox_1.21.x.bb</filename>
+ version of the recipe.
+ So, the append file would match the following recipe names:
+ <literallayout class='monospaced'>
+ busybox_1.21.1.bb
+ busybox_1.21.2.bb
+ busybox_1.21.3.bb
+ </literallayout>
+ If the <filename>busybox</filename> recipe was updated to
+ <filename>busybox_1.3.0.bb</filename>, the append name would not
+ match.
+ However, if you named the append file
+ <filename>busybox_1.%.bbappend</filename>, then you would have a match.
+ </para>
+
+ <para>
+ In the most general case, you could name the append file something as
+ simple as <filename>busybox_%.bbappend</filename> to be entirely
+ version independent.
+ </para>
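+
+ <para>
+ The contents of an append file use the same syntax as a
+ recipe file.
+ As a small sketch (the patch name here is hypothetical), an
+ append file might add an extra file to the sources fetched by
+ the original recipe:
+ <literallayout class='monospaced'>
+ SRC_URI += "file://mypatch.patch"
+ </literallayout>
+ </para>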
+ </section>
+ </section>
+
+ <section id='obtaining-bitbake'>
+ <title>Obtaining BitBake</title>
+
+ <para>
+ You can obtain BitBake several different ways:
+ <itemizedlist>
+ <listitem><para><emphasis>Cloning BitBake:</emphasis>
+ Using Git to clone the BitBake source code repository
+ is the recommended method for obtaining BitBake.
+ Cloning the repository makes it easy to get bug fixes
+ and have access to stable branches and the master
+ branch.
+ Once you have cloned BitBake, you should use
+ the latest stable
+ branch for development since the master branch is for
+ BitBake development and might contain less stable changes.
+ </para>
+ <para>You usually need a version of BitBake
+ that matches the metadata you are using.
+ The metadata is generally backwards compatible but
+ not forward compatible.</para>
+ <para>Here is an example that clones the BitBake repository:
+ <literallayout class='monospaced'>
+ $ git clone git://git.openembedded.org/bitbake
+ </literallayout>
+ This command clones the BitBake Git repository into a
+ directory called <filename>bitbake</filename>.
+ Alternatively, you can
+ designate a directory after the
+ <filename>git clone</filename> command
+ if you want to call the new directory something
+ other than <filename>bitbake</filename>.
+ Here is an example that names the directory
+ <filename>bbdev</filename>:
+ <literallayout class='monospaced'>
+ $ git clone git://git.openembedded.org/bitbake bbdev
+ </literallayout></para></listitem>
+ <listitem><para><emphasis>Installation using your Distribution
+ Package Management System:</emphasis>
+ This method is not
+ recommended because the BitBake version that is
+ provided by your distribution, in most cases,
+ is several
+ releases behind a snapshot of the BitBake repository.
+ </para></listitem>
+ <listitem><para><emphasis>Taking a snapshot of BitBake:</emphasis>
+ Downloading a snapshot of BitBake from the
+ source code repository gives you access to a known
+ branch or release of BitBake.
+ <note>
+ Cloning the Git repository, as described earlier,
+ is the preferred method for getting BitBake.
+ Cloning the repository makes it easier to update as
+ patches are added to the stable branches.
+ </note></para>
+ <para>The following example downloads a snapshot of
+ BitBake version 1.17.0:
+ <literallayout class='monospaced'>
+ $ wget http://git.openembedded.org/bitbake/snapshot/bitbake-1.17.0.tar.gz
+ $ tar zxpvf bitbake-1.17.0.tar.gz
+ </literallayout>
+ After extraction of the tarball using the tar utility,
+ you have a directory entitled
+ <filename>bitbake-1.17.0</filename>.
+ </para></listitem>
+ <listitem><para><emphasis>Using the BitBake that Comes With Your
+ Build Checkout:</emphasis>
+ A final possibility for getting a copy of BitBake is that it
+ already comes with your checkout of a larger BitBake-based build
+ system, such as Poky or the Yocto Project.
+ Rather than manually checking out individual layers and
+ gluing them together yourself, you can check
+ out an entire build system.
+ The checkout will already include a version of BitBake that
+ has been thoroughly tested for compatibility with the other
+ components.
+ For information on how to check out a particular BitBake-based
+ build system, consult that build system's supporting documentation.
+ </para></listitem>
+ </itemizedlist>
+ </para>
+ </section>
+
+ <section id="bitbake-user-manual-command">
+ <title>The BitBake Command</title>
+
+ <para>
+ The <filename>bitbake</filename> command is the primary interface
+ to the BitBake tool.
+ This section presents the BitBake command syntax and provides
+ several execution examples.
+ </para>
+
+ <section id='usage-and-syntax'>
+ <title>Usage and syntax</title>
+
+ <para>
+ Following is the usage and syntax for BitBake:
+ <literallayout class='monospaced'>
+ $ bitbake -h
+ Usage: bitbake [options] [recipename/target ...]
+
+ Executes the specified task (default is 'build') for a given set of target recipes (.bb files).
+ It is assumed there is a conf/bblayers.conf available in cwd or in BBPATH which
+ will provide the layer, BBFILES and other configuration information.
+
+ Options:
+ --version show program's version number and exit
+ -h, --help show this help message and exit
+ -b BUILDFILE, --buildfile=BUILDFILE
+ Execute tasks from a specific .bb recipe directly.
+ WARNING: Does not handle any dependencies from other
+ recipes.
+ -k, --continue Continue as much as possible after an error. While the
+ target that failed and anything depending on it cannot
+ be built, as much as possible will be built before
+ stopping.
+ -a, --tryaltconfigs Continue with builds by trying to use alternative
+ providers where possible.
+ -f, --force Force the specified targets/task to run (invalidating
+ any existing stamp file).
+ -c CMD, --cmd=CMD Specify the task to execute. The exact options
+ available depend on the metadata. Some examples might
+ be 'compile' or 'populate_sysroot' or 'listtasks' may
+ give a list of the tasks available.
+ -C INVALIDATE_STAMP, --clear-stamp=INVALIDATE_STAMP
+ Invalidate the stamp for the specified task such as
+ 'compile' and then run the default task for the
+ specified target(s).
+ -r PREFILE, --read=PREFILE
+ Read the specified file before bitbake.conf.
+ -R POSTFILE, --postread=POSTFILE
+ Read the specified file after bitbake.conf.
+ -v, --verbose Output more log message data to the terminal.
+ -D, --debug Increase the debug level. You can specify this more
+ than once.
+ -n, --dry-run Don't execute, just go through the motions.
+ -S SIGNATURE_HANDLER, --dump-signatures=SIGNATURE_HANDLER
+ Dump out the signature construction information, with
+ no task execution. The SIGNATURE_HANDLER parameter is
+ passed to the handler. Two common values are none and
+ printdiff but the handler may define more/less. none
+ means only dump the signature, printdiff means compare
+ the dumped signature with the cached one.
+ -p, --parse-only Quit after parsing the BB recipes.
+ -s, --show-versions Show current and preferred versions of all recipes.
+ -e, --environment Show the global or per-recipe environment complete
+ with information about where variables were
+ set/changed.
+ -g, --graphviz Save dependency tree information for the specified
+ targets in the dot syntax.
+ -I EXTRA_ASSUME_PROVIDED, --ignore-deps=EXTRA_ASSUME_PROVIDED
+ Assume these dependencies don't exist and are already
+ provided (equivalent to ASSUME_PROVIDED). Useful to
+ make dependency graphs more appealing
+ -l DEBUG_DOMAINS, --log-domains=DEBUG_DOMAINS
+ Show debug logging for the specified logging domains
+ -P, --profile Profile the command and save reports.
+ -u UI, --ui=UI The user interface to use (e.g. knotty, hob, depexp).
+ -t SERVERTYPE, --servertype=SERVERTYPE
+ Choose which server to use, process or xmlrpc.
+ --token=XMLRPCTOKEN Specify the connection token to be used when
+ connecting to a remote server.
+ --revisions-changed Set the exit code depending on whether upstream
+ floating revisions have changed or not.
+ --server-only Run bitbake without a UI, only starting a server
+ (cooker) process.
+ -B BIND, --bind=BIND The name/address for the bitbake server to bind to.
+ --no-setscene Do not run any setscene tasks. sstate will be ignored
+ and everything needed, built.
+ --remote-server=REMOTE_SERVER
+ Connect to the specified server.
+ -m, --kill-server Terminate the remote server.
+ --observe-only Connect to a server as an observing-only client.
+ --status-only Check the status of the remote bitbake server.
+ </literallayout>
+ </para>
+ </section>
+
+ <section id='bitbake-examples'>
+ <title>Examples</title>
+
+ <para>
+ This section presents some examples showing how to use BitBake.
+ </para>
+
+ <section id='example-executing-a-task-against-a-single-recipe'>
+ <title>Executing a Task Against a Single Recipe</title>
+
+ <para>
+ Executing tasks for a single recipe file is relatively simple.
+ You specify the file in question, and BitBake parses
+ it and executes the specified task.
+ If you do not specify a task, BitBake executes the default
+ task, which is "build".
+ BitBake obeys inter-task dependencies when doing
+ so.
+ </para>
+
+ <para>
+ The following command runs the build task, which is
+ the default task, on the <filename>foo_1.0.bb</filename>
+ recipe file:
+ <literallayout class='monospaced'>
+ $ bitbake -b foo_1.0.bb
+ </literallayout>
+ The following command runs the clean task on the
+ <filename>foo.bb</filename> recipe file:
+ <literallayout class='monospaced'>
+ $ bitbake -b foo.bb -c clean
+ </literallayout>
+ <note>
+ The "-b" option explicitly does not handle recipe
+ dependencies.
+ Other than for debugging purposes, it is instead
+ recommended that you use the syntax presented in the
+ next section.
+ </note>
+ </para>
+ </section>
+
+ <section id='executing-tasks-against-a-set-of-recipe-files'>
+ <title>Executing Tasks Against a Set of Recipe Files</title>
+
+ <para>
+ There are a number of additional complexities introduced
+ when one wants to manage multiple <filename>.bb</filename>
+ files.
+ Clearly there needs to be a way to tell BitBake what
+ files are available and, of those, which you
+ want to execute.
+ There also needs to be a way for each recipe
+ to express its dependencies, both for build-time and
+ runtime.
+ There must be a way for you to express recipe preferences
+ when multiple recipes provide the same functionality, or when
+ there are multiple versions of a recipe.
+ </para>
+
+ <para>
+ The <filename>bitbake</filename> command, when not using
+ "--buildfile" or "-b" only accepts a "PROVIDES".
+ You cannot provide anything else.
+ By default, a recipe file generally "PROVIDES" its
+ "packagename" as shown in the following example:
+ <literallayout class='monospaced'>
+ $ bitbake foo
+ </literallayout>
+ This next example "PROVIDES" the package name and also uses
+ the "-c" option to tell BitBake to just execute the
+ <filename>do_clean</filename> task:
+ <literallayout class='monospaced'>
+ $ bitbake -c clean foo
+ </literallayout>
+ </para>
+ </section>
+
+ <section id='generating-dependency-graphs'>
+ <title>Generating Dependency Graphs</title>
+
+ <para>
+ BitBake is able to generate dependency graphs using
+ the <filename>dot</filename> syntax.
+ You can convert these graphs into images using the
+ <filename>dot</filename> tool from
+ <ulink url='http://www.graphviz.org'>Graphviz</ulink>.
+ </para>
+
+ <para>
+ When you generate a dependency graph, BitBake writes four files
+ to the current working directory:
+ <itemizedlist>
+ <listitem><para><emphasis><filename>package-depends.dot</filename>:</emphasis>
+ Shows BitBake's knowledge of dependencies between
+ runtime targets.
+ </para></listitem>
+ <listitem><para><emphasis><filename>pn-depends.dot</filename>:</emphasis>
+ Shows dependencies between build-time targets
+ (i.e. recipes).
+ </para></listitem>
+ <listitem><para><emphasis><filename>task-depends.dot</filename>:</emphasis>
+ Shows dependencies between tasks.
+ </para></listitem>
+ <listitem><para><emphasis><filename>pn-buildlist</filename>:</emphasis>
+ Shows a simple list of targets that are to be built.
+ </para></listitem>
+ </itemizedlist>
+ </para>
+
+ <para>
+ To omit common dependencies from the graph, use the "-I"
+ option and BitBake leaves them out.
+ Leaving this information out can produce more readable graphs.
+ This way, you can remove from the graph
+ <filename>DEPENDS</filename> from inherited classes
+ such as <filename>base.bbclass</filename>.
+ </para>
+
+ <para>
+ Here are two examples that create dependency graphs.
+ The second example omits depends common in OpenEmbedded from
+ the graph:
+ <literallayout class='monospaced'>
+ $ bitbake -g foo
+
+ $ bitbake -g -I virtual/kernel -I eglibc foo
+ </literallayout>
+ </para>
+ </section>
+ </section>
+ </section>
+</chapter>
diff --git a/bitbake/doc/bitbake-user-manual/bitbake-user-manual-metadata.xml b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-metadata.xml
new file mode 100644
index 0000000..1b9d800
--- /dev/null
+++ b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-metadata.xml
@@ -0,0 +1,1830 @@
+<!DOCTYPE chapter PUBLIC "-//OASIS//DTD DocBook XML V4.2//EN"
+"http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd">
+
+<chapter id="bitbake-user-manual-metadata">
+ <title>Syntax and Operators</title>
+
+ <para>
+ BitBake files have their own syntax.
+ The syntax has similarities to several
+ other languages but also has some unique features.
+ This section describes the available syntax and operators
+ as well as provides examples.
+ </para>
+
+ <section id='basic-syntax'>
+ <title>Basic Syntax</title>
+
+ <para>
+ This section provides some basic syntax examples.
+ </para>
+
+ <section id='basic-variable-setting'>
+ <title>Basic Variable Setting</title>
+
+ <para>
+ The following example sets <filename>VARIABLE</filename> to
+ "value".
+ This assignment occurs immediately as the statement is parsed.
+ It is a "hard" assignment.
+ <literallayout class='monospaced'>
+ VARIABLE = "value"
+ </literallayout>
+ As expected, if you include leading or trailing spaces as part of
+ an assignment, the spaces are retained:
+ <literallayout class='monospaced'>
+ VARIABLE = " value"
+ VARIABLE = "value "
+ </literallayout>
+ Setting <filename>VARIABLE</filename> to "" sets it to an empty string,
+ while setting the variable to " " sets it to a blank space
+ (i.e. these are not the same values).
+ <literallayout class='monospaced'>
+ VARIABLE = ""
+ VARIABLE = " "
+ </literallayout>
+ </para>
+ </section>
+
+ <section id='variable-expansion'>
+ <title>Variable Expansion</title>
+
+ <para>
+ BitBake supports variables referencing one another's
+ contents using a syntax that is similar to shell scripting.
+ Following is an example that results in <filename>A</filename>
+ containing "aval" and <filename>B</filename> evaluating to
+ "preavalpost" based on that current value of
+ <filename>A</filename>.
+ <literallayout class='monospaced'>
+ A = "aval"
+ B = "pre${A}post"
+ </literallayout>
+ You should realize that whenever <filename>B</filename> is
+ referenced, its evaluation will depend on the state of
+ <filename>A</filename> at that time.
+ Thus, later evaluations of <filename>B</filename> in the
+ previous example could result in different values
+ depending on the value of <filename>A</filename>.
+ </para>
+ </section>
+
+ <section id='setting-a-default-value'>
+ <title>Setting a default value (?=)</title>
+
+ <para>
+ You can use the "?=" operator to achieve a "softer" assignment
+ for a variable.
+ This type of assignment allows you to define a variable if it
+ is undefined when the statement is parsed, but to leave the
+ value alone if the variable has a value.
+ Here is an example:
+ <literallayout class='monospaced'>
+ A ?= "aval"
+ </literallayout>
+ If <filename>A</filename> is set at the time this statement is parsed,
+ the variable retains its value.
+ However, if <filename>A</filename> is not set,
+ the variable is set to "aval".
+ <note>
+ This assignment is immediate.
+ Consequently, if multiple "?=" assignments
+ to a single variable exist, the first of those ends up getting
+ used.
+ </note>
+ </para>
+ </section>
+
+ <section id='setting-a-weak-default-value'>
+ <title>Setting a weak default value (??=)</title>
+
+ <para>
+ It is possible to use a "weaker" assignment than in the
+ previous section by using the "??=" operator.
+ This assignment behaves identically to "?=" except that the
+ assignment is made at the end of the parsing process rather
+ than immediately.
+ Consequently, when multiple "??=" assignments exist, the last
+ one is used.
+ Also, any "=" or "?=" assignment will override the value set with
+ "??=".
+ Here is an example:
+ <literallayout class='monospaced'>
+ A ??= "somevalue"
+ A ??= "someothervalue"
+ </literallayout>
+ If <filename>A</filename> is set before the above statements are parsed,
+ the variable retains its value.
+ If <filename>A</filename> is not set,
+ the variable is set to "someothervalue".
+ </para>
+
+ <para>
+ Again, this assignment is a "lazy" or "weak" assignment
+ because it does not occur until the end
+ of the parsing process.
+ </para>
+ </section>
+
+ <section id='immediate-variable-expansion'>
+ <title>Immediate variable expansion (:=)</title>
+
+ <para>
+ The ":=" operator results in a variable's
+ contents being expanded immediately,
+ rather than when the variable is actually used:
+ <literallayout class='monospaced'>
+ T = "123"
+ A := "${B} ${A} test ${T}"
+ T = "456"
+ B = "${T} bval"
+ C = "cval"
+ C := "${C}append"
+ </literallayout>
+ In this example, <filename>A</filename> contains
+ "test 123" because <filename>${B}</filename> and
+ <filename>${A}</filename> at the time of parsing are undefined,
+ which leaves "test 123".
+ And, the variable <filename>C</filename>
+ contains "cvalappend" since <filename>${C}</filename> immediately
+ expands to "cval".
+ </para>
+ </section>
+
+ <section id='appending-and-prepending'>
+ <title>Appending (+=) and prepending (=+) With Spaces</title>
+
+ <para>
+ Appending and prepending values is common and can be accomplished
+ using the "+=" and "=+" operators.
+ These operators insert a space between the current
+ value and prepended or appended value.
+ </para>
+
+ <para>
+ These operators take immediate effect during parsing.
+ Here are some examples:
+ <literallayout class='monospaced'>
+ B = "bval"
+ B += "additionaldata"
+ C = "cval"
+ C =+ "test"
+ </literallayout>
+ The variable <filename>B</filename> contains
+ "bval additionaldata" and <filename>C</filename>
+ contains "test cval".
+ </para>
+ </section>
+
+ <section id='appending-and-prepending-without-spaces'>
+ <title>Appending (.=) and Prepending (=.) Without Spaces</title>
+
+ <para>
+ If you want to append or prepend values without an
+ inserted space, use the ".=" and "=." operators.
+ </para>
+
+ <para>
+ These operators take immediate effect during parsing.
+ Here are some examples:
+ <literallayout class='monospaced'>
+ B = "bval"
+ B .= "additionaldata"
+ C = "cval"
+ C =. "test"
+ </literallayout>
+ The variable <filename>B</filename> contains
+ "bvaladditionaldata" and
+ <filename>C</filename> contains "testcval".
+ </para>
+ </section>
+
+ <section id='appending-and-prepending-override-style-syntax'>
+ <title>Appending and Prepending (Override Style Syntax)</title>
+
+ <para>
+ You can also append and prepend a variable's value
+ using an override style syntax.
+ When you use this syntax, no spaces are inserted.
+ </para>
+
+ <para>
+ These operators differ from the ":=", ".=", "=.", "+=", and "=+"
+ operators in that their effects are deferred
+ until after parsing completes rather than being immediately
+ applied.
+ Here are some examples:
+ <literallayout class='monospaced'>
+ B = "bval"
+ B_append = " additional data"
+ C = "cval"
+ C_prepend = "additional data "
+ D = "dval"
+ D_append = "additional data"
+ </literallayout>
+ The variable <filename>B</filename> becomes
+ "bval additional data" and <filename>C</filename> becomes
+ "additional data cval".
+ The variable <filename>D</filename> becomes
+ "dvaladditional data".
+ <note>
+ You must control all spacing when you use the
+ override syntax.
+ </note>
+ </para>
+ </section>
+
+ <section id='removing-override-style-syntax'>
+ <title>Removal (Override Style Syntax)</title>
+
+ <para>
+ You can remove values from lists using the removal
+ override style syntax.
+ Specifying a value for removal causes all occurrences of that
+ value to be removed from the variable.
+ </para>
+
+ <para>
+ When you use this syntax, BitBake expects one or more strings.
+ Surrounding spaces are removed as well.
+ Here is an example:
+ <literallayout class='monospaced'>
+ FOO = "123 456 789 123456 123 456 123 456"
+ FOO_remove = "123"
+ FOO_remove = "456"
+ FOO2 = "abc def ghi abcdef abc def abc def"
+ FOO2_remove = "abc def"
+ </literallayout>
+ The variable <filename>FOO</filename> becomes
+ "789 123456" and <filename>FOO2</filename> becomes
+ "ghi abcdef".
+ </para>
+ </section>
+
+ <section id='variable-flag-syntax'>
+ <title>Variable Flag Syntax</title>
+
+ <para>
+ Variable flags are BitBake's implementation of variable properties
+ or attributes.
+ It is a way of tagging extra information onto a variable.
+ You can find more out about variable flags in general in the
+ "<link linkend='variable-flags'>Variable Flags</link>"
+ section.
+ </para>
+
+ <para>
+ You can define, append, and prepend values to variable flags.
+ All the standard syntax operations previously mentioned work
+ for variable flags except for override style syntax
+ (i.e. <filename>_prepend</filename>, <filename>_append</filename>,
+ and <filename>_remove</filename>).
+ </para>
+
+ <para>
+ Here are some examples showing how to set variable flags:
+ <literallayout class='monospaced'>
+ FOO[a] = "abc"
+ FOO[b] = "123"
+ FOO[a] += "456"
+ </literallayout>
+ The variable <filename>FOO</filename> has two flags:
+ <filename>a</filename> and <filename>b</filename>.
+ The flags are immediately set to "abc" and "123", respectively.
+ The <filename>a</filename> flag becomes "abc 456".
+ </para>
+
+ <para>
+ There is no need to pre-define variable flags.
+ You can simply start using them.
+ One extremely common application
+ is to attach some brief documentation to a BitBake variable as
+ follows:
+ <literallayout class='monospaced'>
+ CACHE[doc] = "The directory holding the cache of the metadata."
+ </literallayout>
+ </para>
+ </section>
+
+ <section id='inline-python-variable-expansion'>
+ <title>Inline Python Variable Expansion</title>
+
+ <para>
+ You can use inline Python variable expansion to
+ set variables.
+ Here is an example:
+ <literallayout class='monospaced'>
+ DATE = "${@time.strftime('%Y%m%d',time.gmtime())}"
+ </literallayout>
+ This example results in the <filename>DATE</filename>
+ variable being set to the current date.
+ </para>
+
+ <para>
+ Probably the most common use of this feature is to extract
+ the value of variables from BitBake's internal data dictionary,
+ <filename>d</filename>.
+ The following lines select the values of a package name
+ and its version number, respectively:
+ <literallayout class='monospaced'>
+ PN = "${@bb.parse.BBHandler.vars_from_file(d.getVar('FILE', False),d)[0] or 'defaultpkgname'}"
+ PV = "${@bb.parse.BBHandler.vars_from_file(d.getVar('FILE', False),d)[1] or '1.0'}"
+ </literallayout>
+ </para>
+ </section>
+
+ <section id='providing-pathnames'>
+ <title>Providing Pathnames</title>
+
+ <para>
+ When specifying pathnames for use with BitBake,
+ do not use the tilde ("~") character as a shortcut
+ for your home directory.
+ Doing so might cause BitBake to not recognize the
+ path since BitBake does not expand this character in
+ the same way a shell would.
+ </para>
+
+ <para>
+ Instead, provide a fuller path as the following
+ example illustrates:
+ <literallayout class='monospaced'>
+ BBLAYERS ?= " \
+ /home/scott-lenovo/LayerA \
+ "
+ </literallayout>
+ </para>
+ </section>
+ </section>
+
+ <section id='conditional-syntax-overrides'>
+ <title>Conditional Syntax (Overrides)</title>
+
+ <para>
+ BitBake uses
+ <link linkend='var-OVERRIDES'><filename>OVERRIDES</filename></link>
+ to control what variables are overridden after BitBake
+ parses recipes and configuration files.
+ This section describes how you can use
+ <filename>OVERRIDES</filename> as conditional metadata,
+ talks about key expansion in relationship to
+ <filename>OVERRIDES</filename>, and provides some examples
+ to help with understanding.
+ </para>
+
+ <section id='conditional-metadata'>
+ <title>Conditional Metadata</title>
+
+ <para>
+ You can use <filename>OVERRIDES</filename> to conditionally select
+ a specific version of a variable and to conditionally
+ append or prepend the value of a variable.
+ <itemizedlist>
+ <listitem><para><emphasis>Selecting a Variable:</emphasis>
+ The <filename>OVERRIDES</filename> variable is
+ a colon-character-separated list that contains items
+ for which you want to satisfy conditions.
+ Thus, if you have a variable that is conditional on “arm”, and “arm”
+ is in <filename>OVERRIDES</filename>, then the “arm”-specific
+ version of the variable is used rather than the non-conditional
+ version.
+ Here is an example:
+ <literallayout class='monospaced'>
+ OVERRIDES = "architecture:os:machine"
+ TEST = "default"
+ TEST_os = "osspecific"
+ TEST_nooverride = "othercondvalue"
+ </literallayout>
+ In this example, the <filename>OVERRIDES</filename>
+ variable lists three overrides:
+ "architecture", "os", and "machine".
+ The variable <filename>TEST</filename> by itself has a default
+ value of "default".
+ You select the os-specific version of the <filename>TEST</filename>
+ variable by appending the "os" override to the variable
+ (i.e. <filename>TEST_os</filename>).
+ </para>
+
+ <para>
+ To better understand this, consider a practical example
+ that assumes an OpenEmbedded metadata-based Linux
+ kernel recipe file.
+ The following lines from the recipe file first set
+ the kernel branch variable <filename>KBRANCH</filename>
+ to a default value, then conditionally override that
+ value based on the architecture of the build:
+ <literallayout class='monospaced'>
+ KBRANCH = "standard/base"
+ KBRANCH_qemuarm = "standard/arm-versatile-926ejs"
+ KBRANCH_qemumips = "standard/mti-malta32"
+ KBRANCH_qemuppc = "standard/qemuppc"
+ KBRANCH_qemux86 = "standard/common-pc/base"
+ KBRANCH_qemux86-64 = "standard/common-pc-64/base"
+ KBRANCH_qemumips64 = "standard/mti-malta64"
+ </literallayout>
+ </para></listitem>
+ <listitem><para><emphasis>Appending and Prepending:</emphasis>
+ BitBake also supports append and prepend operations to
+ variable values based on whether a specific item is
+ listed in <filename>OVERRIDES</filename>.
+ Here is an example:
+ <literallayout class='monospaced'>
+ DEPENDS = "glibc ncurses"
+ OVERRIDES = "machine:local"
+ DEPENDS_append_machine = " libmad"
+ </literallayout>
+ In this example, <filename>DEPENDS</filename> becomes
+ "glibc ncurses libmad".
+ </para>
+
+ <para>
+ Again, using an OpenEmbedded metadata-based
+ kernel recipe file as an example, the
+ following lines will conditionally append to the
+ <filename>KERNEL_FEATURES</filename> variable based
+ on the architecture:
+ <literallayout class='monospaced'>
+ KERNEL_FEATURES_append = " ${KERNEL_EXTRA_FEATURES}"
+ KERNEL_FEATURES_append_qemux86=" cfg/sound.scc cfg/paravirt_kvm.scc"
+ KERNEL_FEATURES_append_qemux86-64=" cfg/sound.scc cfg/paravirt_kvm.scc"
+ </literallayout>
+ </para></listitem>
+ </itemizedlist>
+ </para>
+ </section>
+
+ <section id='key-expansion'>
+ <title>Key Expansion</title>
+
+ <para>
+ Key expansion happens when the BitBake datastore is finalized
+ just before BitBake expands overrides.
+ To better understand this, consider the following example:
+ <literallayout class='monospaced'>
+ A${B} = "X"
+ B = "2"
+ A2 = "Y"
+ </literallayout>
+ In this case, after all the parsing is complete, and
+ before any overrides are handled, BitBake expands
+ <filename>${B}</filename> into "2".
+ This expansion causes <filename>A2</filename>, which was
+ set to "Y" before the expansion, to become "X".
+ </para>
+ </section>
+
+ <section id='variable-interaction-worked-examples'>
+ <title>Examples</title>
+
+ <para>
+ Despite the previous explanations that show the different forms of
+ variable definitions, it can be hard to work
+ out exactly what happens when variable operators, conditional
+ overrides, and unconditional overrides are combined.
+ This section presents some common scenarios along
+ with explanations for variable interactions that
+ typically confuse users.
+ </para>
+
+ <para>
+ There is often confusion concerning the order in which
+ overrides and various "append" operators take effect.
+ Recall that an append or prepend operation using "_append"
+ and "_prepend" does not result in an immediate assignment
+ as would "+=", ".=", "=+", or "=.".
+ Consider the following example:
+ <literallayout class='monospaced'>
+ OVERRIDES = "foo"
+ A = "Z"
+ A_foo_append = "X"
+ </literallayout>
+ For this case, <filename>A</filename> is
+ unconditionally set to "Z" and "X" is
+ unconditionally and immediately appended to the variable
+ <filename>A_foo</filename>.
+ Because overrides have not been applied yet,
+ <filename>A_foo</filename> is set to "X" due to the append
+ and <filename>A</filename> simply equals "Z".
+ </para>
+
+ <para>
+ Applying overrides, however, changes things.
+ Since "foo" is listed in <filename>OVERRIDES</filename>,
+ the conditional variable <filename>A</filename> is replaced
+ with the "foo" version, which is equal to "X".
+ So effectively, <filename>A_foo</filename> replaces <filename>A</filename>.
+ </para>
+
+ <para>
+ This next example changes the order of the override and
+ the append:
+ <literallayout class='monospaced'>
+ OVERRIDES = "foo"
+ A = "Z"
+ A_append_foo = "X"
+ </literallayout>
+ For this case, before overrides are handled,
+ <filename>A</filename> is set to "Z" and <filename>A_append_foo</filename>
+ is set to "X".
+ Once the override for "foo" is applied, however,
+ <filename>A</filename> gets appended with "X".
+ Consequently, <filename>A</filename> becomes "ZX".
+ Notice that spaces are not appended.
+ </para>
+
+ <para>
+ This next example has the order of the appends and overrides reversed
+ back as in the first example:
+ <literallayout class='monospaced'>
+ OVERRIDES = "foo"
+ A = "Y"
+ A_foo_append = "Z"
+ A_foo_append += "X"
+ </literallayout>
+ For this case, before any overrides are resolved,
+ <filename>A</filename> is set to "Y" using an immediate assignment.
+ After this immediate assignment, <filename>A_foo</filename> is set
+ to "Z", and then further appended with
+ "X" leaving the variable set to "Z X".
+ Finally, applying the override for "foo" results in the conditional
+ variable <filename>A</filename> becoming "Z X" (i.e.
+ <filename>A</filename> is replaced with <filename>A_foo</filename>).
+ </para>
+
+ <para>
+ This final example mixes in some varying operators:
+ <literallayout class='monospaced'>
+ A = "1"
+ A_append = "2"
+ A_append = "3"
+ A += "4"
+ A .= "5"
+ </literallayout>
+ For this case, the type of append operators are affecting the
+ order of assignments as BitBake passes through the code
+ multiple times.
+ Initially, <filename>A</filename> is set to "1 45" because
+ of the three statements that use immediate operators.
+ After these assignments are made, BitBake applies the
+ <filename>_append</filename> operations.
+ Those operations result in <filename>A</filename> becoming "1 4523".
+ </para>
+ </section>
+ </section>
+
+ <section id='sharing-functionality'>
+ <title>Sharing Functionality</title>
+
+ <para>
+ BitBake allows for metadata sharing through include files
+ (<filename>.inc</filename>) and class files
+ (<filename>.bbclass</filename>).
+ For example, suppose you have a piece of common functionality
+ such as a task definition that you want to share between
+ more than one recipe.
+ In this case, creating a <filename>.bbclass</filename>
+ file that contains the common functionality and then using
+ the <filename>inherit</filename> directive in your recipes to
+ inherit the class would be a common way to share the task.
+ </para>
+
+ <para>
+ This section presents the mechanisms BitBake provides to
+ allow you to share functionality between recipes.
+ Specifically, the mechanisms include <filename>include</filename>,
+ <filename>inherit</filename>, <filename>INHERIT</filename>, and
+ <filename>require</filename> directives.
+ </para>
+
+ <section id='locating-include-and-class-files'>
+ <title>Locating Include and Class Files</title>
+
+ <para>
+ BitBake uses the
+ <link linkend='var-BBPATH'><filename>BBPATH</filename></link>
+ variable to locate needed include and class files.
+ The <filename>BBPATH</filename> variable is analogous to
+ the environment variable <filename>PATH</filename>.
+ </para>
+
+ <para>
+ In order for include and class files to be found by BitBake,
+ they need to be located in a "classes" subdirectory that can
+ be found in <filename>BBPATH</filename>.
+ </para>
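+
+ <para>
+ As an illustrative sketch only (the layer directory shown is
+ hypothetical), a configuration file could extend
+ <filename>BBPATH</filename> so that a layer's
+ <filename>classes</filename> subdirectory and include files
+ become visible to BitBake:
+ <literallayout class='monospaced'>
+ BBPATH .= ":/home/scott-lenovo/LayerA"
+ </literallayout>
+ With this setting, a class file such as
+ <filename>/home/scott-lenovo/LayerA/classes/autotools.bbclass</filename>
+ can be found by the <filename>inherit</filename> directive.
+ </para>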
+ </section>
+
+ <section id='inherit-directive'>
+ <title><filename>inherit</filename> Directive</title>
+
+ <para>
+ When writing a recipe or class file, you can use the
+ <filename>inherit</filename> directive to inherit the
+ functionality of a class (<filename>.bbclass</filename>).
+ BitBake only supports this directive when used within recipe
+ and class files (i.e. <filename>.bb</filename> and
+ <filename>.bbclass</filename>).
+ </para>
+
+ <para>
+ The <filename>inherit</filename> directive is a rudimentary
+ means of specifying what classes of functionality your
+ recipes require.
+ For example, you can easily abstract out the tasks involved in
+ building a package that uses Autoconf and Automake and put
+ those tasks into a class file that can be used by your recipe.
+ </para>
+
+ <para>
+ As an example, your recipes could use the following directive
+ to inherit an <filename>autotools.bbclass</filename> file.
+ The class file would contain common functionality for using
+ Autotools that could be shared across recipes:
+ <literallayout class='monospaced'>
+ inherit autotools
+ </literallayout>
+ In this case, BitBake would search for the file
+ <filename>classes/autotools.bbclass</filename>
+ in <filename>BBPATH</filename>.
+ <note>
+ You can override any values and functions of the
+ inherited class within your recipe by doing so
+ after the "inherit" statement.
+ </note>
+ </para>
+ </section>
+
+ <section id='include-directive'>
+ <title><filename>include</filename> Directive</title>
+
+ <para>
+ BitBake understands the <filename>include</filename>
+ directive.
+ This directive causes BitBake to parse whatever file you specify,
+ and to insert that file at that location.
+ The directive is much like its equivalent in Make except
+ that if the path specified on the include line is a relative
+ path, BitBake locates the first file it can find
+ within <filename>BBPATH</filename>.
+ </para>
+
+ <para>
+ As an example, suppose you needed a recipe to include some
+ self-test definitions:
+ <literallayout class='monospaced'>
+ include test_defs.inc
+ </literallayout>
+ <note>
+ The <filename>include</filename> directive does not
+ produce an error when the file cannot be found.
+ Consequently, it is recommended that if the file you
+ are including is expected to exist, you should use
+ <link linkend='require-inclusion'><filename>require</filename></link>
+ instead of <filename>include</filename>.
+ Doing so makes sure that an error is produced if the
+ file cannot be found.
+ </note>
+ </para>
+ </section>
+
+ <section id='require-inclusion'>
+ <title><filename>require</filename> Directive</title>
+
+ <para>
+ BitBake understands the <filename>require</filename>
+ directive.
+ This directive behaves just like the
+ <filename>include</filename> directive with the exception that
+ BitBake raises a parsing error if the file to be included cannot
+ be found.
+ Thus, any file you require is inserted into the file that is
+ being parsed at the location of the directive.
+ </para>
+
+ <para>
+ Similar to how BitBake handles
+ <link linkend='include-directive'><filename>include</filename></link>,
+ if the path specified
+ on the require line is a relative path, BitBake locates
+ the first file it can find within <filename>BBPATH</filename>.
+ </para>
+
+ <para>
+ As an example, suppose you have two versions of a recipe
+ (e.g. <filename>foo_1.2.2.bb</filename> and
+ <filename>foo_2.0.0.bb</filename>) where
+ each version contains some identical functionality that could be
+ shared.
+ You could create an include file named <filename>foo.inc</filename>
+ that contains the common definitions needed to build "foo".
+ You need to be sure <filename>foo.inc</filename> is located in the
+ same directory as your two recipe files as well.
+ Once these conditions are set up, you can share the functionality
+ using a <filename>require</filename> directive from within each
+ recipe:
+ <literallayout class='monospaced'>
+ require foo.inc
+ </literallayout>
+ </para>
+ </section>
+
+ <section id='inherit-configuration-directive'>
+ <title><filename>INHERIT</filename> Configuration Directive</title>
+
+ <para>
+ When creating a configuration file (<filename>.conf</filename>),
+ you can use the <filename>INHERIT</filename> directive to
+ inherit a class.
+ BitBake only supports this directive when used within
+ a configuration file.
+ </para>
+
+ <para>
+ As an example, suppose you needed to inherit a class
+ file called <filename>abc.bbclass</filename> from a
+ configuration file as follows:
+ <literallayout class='monospaced'>
+ INHERIT += "abc"
+ </literallayout>
+ This configuration directive causes the named
+ class to be inherited at the point of the directive
+ during parsing.
+ As with the <filename>inherit</filename> directive, the
+ <filename>.bbclass</filename> file must be located in a
+ "classes" subdirectory in one of the directories specified
+ in <filename>BBPATH</filename>.
+ <note>
+ Because <filename>.conf</filename> files are parsed
+ first during BitBake's execution, using
+ <filename>INHERIT</filename> to inherit a class effectively
+ inherits the class globally (i.e. for all recipes).
+ </note>
+ </para>
+ </section>
+ </section>
+
+ <section id='functions'>
+ <title>Functions</title>
+
+ <para>
+ As with most languages, functions are the building blocks that
+ are used to build up operations into tasks.
+ BitBake supports these types of functions:
+ <itemizedlist>
+ <listitem><para><emphasis>Shell Functions:</emphasis>
+ Functions written in shell script and executed either
+ directly as functions, tasks, or both.
+ They can also be called by other shell functions.
+ </para></listitem>
+ <listitem><para><emphasis>BitBake Style Python Functions:</emphasis>
+ Functions written in Python and executed by BitBake or other
+ Python functions using <filename>bb.build.exec_func()</filename>.
+ </para></listitem>
+ <listitem><para><emphasis>Python Functions:</emphasis>
+ Functions written in Python and executed by Python.
+ </para></listitem>
+ <listitem><para><emphasis>Anonymous Python Functions:</emphasis>
+ Python functions executed automatically during
+ parsing.
+ </para></listitem>
+ </itemizedlist>
+ Regardless of the type of function, you can only
+ define them in class (<filename>.bbclass</filename>)
+ and recipe (<filename>.bb</filename> or <filename>.inc</filename>)
+ files.
+ </para>
+
+ <section id='shell-functions'>
+ <title>Shell Functions</title>
+
+ <para>
+ Functions written in shell script and executed either
+ directly as functions, tasks, or both.
+ They can also be called by other shell functions.
+ Here is an example shell function definition:
+ <literallayout class='monospaced'>
+ some_function () {
+ echo "Hello World"
+ }
+ </literallayout>
+ When you create these types of functions in your recipe
+ or class files, you need to follow the shell programming
+ rules.
+ The scripts are executed by <filename>/bin/sh</filename>,
+ which may not be a bash shell but might be something
+ such as <filename>dash</filename>.
+ You should not use Bash-specific syntax (bashisms).
+ </para>
+ </section>
+
+ <section id='bitbake-style-python-functions'>
+ <title>BitBake Style Python Functions</title>
+
+ <para>
+ These functions are written in Python and executed by
+ BitBake or other Python functions using
+ <filename>bb.build.exec_func()</filename>.
+ </para>
+
+ <para>
+ An example BitBake function is:
+ <literallayout class='monospaced'>
+ python some_python_function () {
+ d.setVar("TEXT", "Hello World")
+ print d.getVar("TEXT", True)
+ }
+ </literallayout>
+ Because the Python "bb" and "os" modules are already
+ imported, you do not need to import these modules.
+ Also in these types of functions, the datastore ("d")
+ is a global variable and is always automatically
+ available.
+ </para>
+ </section>
+
+ <section id='python-functions'>
+ <title>Python Functions</title>
+
+ <para>
+ These functions are written in Python and are executed by
+ other Python code.
+ Examples of Python functions are utility functions
+ that you intend to call from in-line Python or
+ from within other Python functions.
+ Here is an example:
+ <literallayout class='monospaced'>
+ def get_depends(d):
+ if d.getVar('SOMECONDITION', True):
+ return "dependencywithcond"
+ else:
+ return "dependency"
+ SOMECONDITION = "1"
+ DEPENDS = "${@get_depends(d)}"
+ </literallayout>
+ This would result in <filename>DEPENDS</filename>
+ containing <filename>dependencywithcond</filename>.
+ </para>
+
+ <para>
+ Here are some things to know about Python functions:
+ <itemizedlist>
+ <listitem><para>Python functions can take parameters.
+ </para></listitem>
+ <listitem><para>The BitBake datastore is not
+ automatically available.
+ Consequently, you must pass it in as a
+ parameter to the function.
+ </para></listitem>
+ <listitem><para>The "bb" and "os" Python modules are
+ automatically available.
+ You do not need to import them.
+ </para></listitem>
+ </itemizedlist>
+ </para>
+ </section>
+
+ <section id='anonymous-python-functions'>
+ <title>Anonymous Python Functions</title>
+
+ <para>
+ Sometimes it is useful to run some code during
+ parsing to set variables or to perform other operations
+ programmatically.
+ To do this, you can define an anonymous Python function.
+ Here is an example that conditionally sets a
+ variable based on the value of another variable:
+ <literallayout class='monospaced'>
+ python __anonymous () {
+ if d.getVar('SOMEVAR', True) == 'value':
+ d.setVar('ANOTHERVAR', 'value2')
+ }
+ </literallayout>
+ The "__anonymous" function name is optional, so the
+ following example is functionally equivalent to the above:
+ <literallayout class='monospaced'>
+ python () {
+ if d.getVar('SOMEVAR', True) == 'value':
+ d.setVar('ANOTHERVAR', 'value2')
+ }
+ </literallayout>
+ Because, unlike other Python functions, anonymous
+ Python functions are executed during parsing, the
+ "d" variable within an anonymous Python function represents
+ the datastore for the entire recipe.
+ Consequently, you can set variable values here and
+ those values can be picked up by other functions.
+ </para>
+ </section>
+
+ <section id='flexible-inheritance-for-class-functions'>
+ <title>Flexible Inheritance for Class Functions</title>
+
+ <para>
+ Through coding techniques and the use of
+ <filename>EXPORT_FUNCTIONS</filename>, BitBake supports
+ exporting a function from a class such that the
+ class function appears as the default implementation
+ of the function, but can still be called if a recipe
+ inheriting the class needs to define its own version of
+ the function.
+ </para>
+
+ <para>
+ To understand the benefits of this feature, consider
+ the basic scenario where a class defines a task function
+ and your recipe inherits the class.
+ In this basic scenario, your recipe inherits the task
+ function as defined in the class.
+ If desired, your recipe can add to the start and end of the
+ function by using the "_prepend" or "_append" operations
+ respectively, or it can redefine the function completely.
+ However, if it redefines the function, there is
+ no means for it to call the class version of the function.
+ <filename>EXPORT_FUNCTIONS</filename> provides a mechanism
+ that enables the recipe's version of the function to call
+ the original version of the function.
+ </para>
+
+ <para>
+ To make use of this technique, you need the following
+ things in place:
+ <itemizedlist>
+ <listitem><para>
+ The class needs to define the function as follows:
+ <literallayout class='monospaced'>
+ <replaceable>classname</replaceable><filename>_</filename><replaceable>functionname</replaceable>
+ </literallayout>
+ For example, if you have a class file
+ <filename>bar.bbclass</filename> and a function named
+ <filename>do_foo</filename>, the class must define the function
+ as follows:
+ <literallayout class='monospaced'>
+ bar_do_foo
+ </literallayout>
+ </para></listitem>
+ <listitem><para>
+ The class needs to contain the <filename>EXPORT_FUNCTIONS</filename>
+ statement as follows:
+ <literallayout class='monospaced'>
+ EXPORT_FUNCTIONS <replaceable>functionname</replaceable>
+ </literallayout>
+ For example, continuing with the same example, the
+ statement in the <filename>bar.bbclass</filename> would be
+ as follows:
+ <literallayout class='monospaced'>
+ EXPORT_FUNCTIONS do_foo
+ </literallayout>
+ </para></listitem>
+ <listitem><para>
+ You need to call the function appropriately from within your
+ recipe.
+ Continuing with the same example, if your recipe
+ needs to call the class version of the function,
+ it should call <filename>bar_do_foo</filename>.
+ Assuming <filename>do_foo</filename> was a shell function
+ and <filename>EXPORT_FUNCTIONS</filename> was used as above,
+ the recipe's function could conditionally call the
+ class version of the function as follows:
+ <literallayout class='monospaced'>
+ do_foo() {
+ if [ somecondition ] ; then
+ bar_do_foo
+ else
+ # Do something else
+ fi
+ }
+ </literallayout>
+ To call your modified version of the function as defined
+ in your recipe, call it as <filename>do_foo</filename>.
+ </para></listitem>
+ </itemizedlist>
+ With these conditions met, your single recipe
+ can freely choose between the original function
+ as defined in the class file and the modified function in your recipe.
+ If you do not set up these conditions, you are limited to using one function
+ or the other.
+ </para>
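+
+ <para>
+ To tie these pieces together, here is a minimal sketch; the
+ function body and the test condition are assumptions shown only
+ to illustrate the naming pattern:
+ <literallayout class='monospaced'>
+ # bar.bbclass
+ bar_do_foo() {
+ echo "class version of do_foo"
+ }
+ EXPORT_FUNCTIONS do_foo
+
+ # somerecipe.bb
+ inherit bar
+ do_foo() {
+ if [ "${SOMECONDITION}" = "1" ] ; then
+ bar_do_foo
+ else
+ echo "recipe version of do_foo"
+ fi
+ }
+ </literallayout>
+ When the recipe's <filename>do_foo</filename> function runs, it can
+ still fall back to the class implementation by calling
+ <filename>bar_do_foo</filename>.
+ </para>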
+ </section>
+ </section>
+
+ <section id='tasks'>
+ <title>Tasks</title>
+
+ <para>
+ Tasks are BitBake execution units that originate as
+ functions and make up the steps that BitBake needs to run
+ for a given recipe.
+ Tasks are only supported in recipe (<filename>.bb</filename>
+ or <filename>.inc</filename>) and class
+ (<filename>.bbclass</filename>) files.
+ By convention, task names begin with the string "do_".
+ </para>
+
+ <para>
+ Here is an example of a task that prints out the date:
+ <literallayout class='monospaced'>
+ python do_printdate () {
+ import time
+ print time.strftime('%Y%m%d', time.gmtime())
+ }
+ addtask printdate after do_fetch before do_build
+ </literallayout>
+ </para>
+
+ <section id='promoting-a-function-to-a-task'>
+ <title>Promoting a Function to a Task</title>
+
+ <para>
+ Any function can be promoted to a task by applying the
+ <filename>addtask</filename> command.
+ The <filename>addtask</filename> command also describes
+ inter-task dependencies.
+ Here is the function from the previous section but with the
+ <filename>addtask</filename> command promoting it to a task
+ and defining some dependencies:
+ <literallayout class='monospaced'>
+ python do_printdate () {
+ import time
+ print time.strftime('%Y%m%d', time.gmtime())
+ }
+ addtask printdate after do_fetch before do_build
+ </literallayout>
+ In the example, the function is defined and then promoted
+ as a task.
+ The <filename>do_printdate</filename> task becomes a dependency of
+ the <filename>do_build</filename> task, which is the default
+ task.
+ And, the <filename>do_printdate</filename> task is dependent upon
+ the <filename>do_fetch</filename> task.
+ Execution of the <filename>do_build</filename> task results
+ in the <filename>do_printdate</filename> task running first.
+ </para>
+ </section>
+
+ <section id='deleting-a-task'>
+ <title>Deleting a Task</title>
+
+ <para>
+ As well as being able to add tasks, you can delete them.
+ Simply use the <filename>deltask</filename> command to
+ delete a task.
+ For example, to delete the example task used in the previous
+ sections, you would use:
+ <literallayout class='monospaced'>
+ deltask printdate
+ </literallayout>
+ If you delete a task using the <filename>deltask</filename>
+ command and the task has dependencies, the dependencies are
+ not reconnected.
+ For example, suppose you have three tasks named
+ <filename>do_a</filename>, <filename>do_b</filename>, and
+ <filename>do_c</filename>.
+ Furthermore, <filename>do_c</filename> is dependent on
+ <filename>do_b</filename>, which in turn is dependent on
+ <filename>do_a</filename>.
+ Given this scenario, if you use <filename>deltask</filename>
+ to delete <filename>do_b</filename>, the implicit dependency
+ relationship between <filename>do_c</filename> and
+ <filename>do_a</filename> through <filename>do_b</filename>
+ no longer exists, and <filename>do_c</filename> dependencies
+ are not updated to include <filename>do_a</filename>.
+ Thus, <filename>do_c</filename> is free to run before
+ <filename>do_a</filename>.
+ </para>
+
+ <para>
+ If you want dependencies such as these to remain intact, use
+ the <filename>noexec</filename> varflag to disable the task
+ instead of using the <filename>deltask</filename> command to
+ delete it:
+ <literallayout class='monospaced'>
+ do_b[noexec] = "1"
+ </literallayout>
+ </para>
+ </section>
+
+ <section id='passing-information-into-the-build-task-environment'>
+ <title>Passing Information Into the Build Task Environment</title>
+
+ <para>
+ When running a task, BitBake tightly controls the execution
+ environment of the build tasks to make
+ sure unwanted contamination from the build machine cannot
+ influence the build.
+ Consequently, if you do want something to get passed into the
+ build task environment, you must take these two steps:
+ <orderedlist>
+ <listitem><para>
+ Tell BitBake to load what you want from the environment
+ into the datastore.
+ You can do so through the
+ <link linkend='var-BB_ENV_EXTRAWHITE'><filename>BB_ENV_EXTRAWHITE</filename></link>
+ variable.
+ For example, assume you want to prevent the build system from
+ accessing your <filename>$HOME/.ccache</filename>
+ directory.
+ The following command tells BitBake to load
+ <filename>CCACHE_DIR</filename> from the environment into
+ the datastore:
+ <literallayout class='monospaced'>
+ export BB_ENV_EXTRAWHITE="$BB_ENV_EXTRAWHITE CCACHE_DIR"
+ </literallayout></para></listitem>
+ <listitem><para>
+ Tell BitBake to export what you have loaded into the
+ datastore to the task environment of every running task.
+ Loading something from the environment into the datastore
+ (previous step) only makes it available in the datastore.
+ To export it to the task environment of every running task,
+ use a command similar to the following in your local configuration
+ file <filename>local.conf</filename> or your
+ distribution configuration file:
+ <literallayout class='monospaced'>
+ export CCACHE_DIR
+ </literallayout>
+ <note>
+ A side effect of the previous steps is that BitBake
+ records the variable as a dependency of the build process
+ in things like the setscene checksums.
+ If doing so results in unnecessary rebuilds of tasks, you can
+ whitelist the variable so that the setscene code
+ ignores the dependency when it creates checksums (see the
+ sketch following this list).
+ </note></para></listitem>
+ </orderedlist>
+ </para>
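+
+ <para>
+ Here is a sketch of the whitelisting mentioned in the note above.
+ It assumes your configuration uses the
+ <filename>BB_HASHBASE_WHITELIST</filename> variable, which BitBake's
+ signature code reads as the list of variables to exclude from
+ checksums; check your distribution's configuration before relying
+ on it:
+ <literallayout class='monospaced'>
+ BB_HASHBASE_WHITELIST_append = " CCACHE_DIR"
+ </literallayout>
+ </para>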
+
+ <para>
+ Sometimes, it is useful to be able to obtain information
+ from the original execution environment.
+ BitBake saves a copy of the original environment into
+ a special variable named
+ <link linkend='var-BB_ORIGENV'><filename>BB_ORIGENV</filename></link>.
+ </para>
+
+ <para>
+ The <filename>BB_ORIGENV</filename> variable returns a datastore
+ object that can be queried using the standard datastore operators
+ such as <filename>getVar("X", False)</filename>.
+ The datastore object is useful, for example, to find the original
+ <filename>DISPLAY</filename> variable.
+ Here is an example:
+ <literallayout class='monospaced'>
+ origenv = d.getVar("BB_ORIGENV", False)
+ bar = origenv.getVar("BAR", False)
+ </literallayout>
+ The previous example returns <filename>BAR</filename> from the original
+ execution environment.
+ </para>
+
+ <para>
+ By default, BitBake cleans the environment to include only those
+ things exported or listed in its whitelist to ensure that the build
+ environment is reproducible and consistent.
+ </para>
+ </section>
+ </section>
+
+ <section id='variable-flags'>
+ <title>Variable Flags</title>
+
+ <para>
+ Variable flags (varflags) help control a task's functionality
+ and dependencies.
+ BitBake reads and writes varflags to the datastore using the following
+ command forms:
+ <literallayout class='monospaced'>
+ <replaceable>variable</replaceable> = d.getVarFlags("<replaceable>variable</replaceable>")
+ self.d.setVarFlags("FOO", {"func": True})
+ </literallayout>
+ </para>
+
+ <para>
+ When working with varflags, the same syntax, with the exception of
+ overrides, applies.
+ In other words, you can set, append, and prepend varflags just like
+ variables.
+ See the
+ "<link linkend='variable-flag-syntax'>Variable Flag Syntax</link>"
+ section for details.
+ </para>
+
+ <para>
+ BitBake has a defined set of varflags available for recipes and
+ classes.
+ Tasks support a number of these flags which control various
+ functionality of the task:
+ <itemizedlist>
+ <listitem><para><emphasis>cleandirs:</emphasis>
+ Empty directories that should be created before the task runs.
+ </para></listitem>
+ <listitem><para><emphasis>depends:</emphasis>
+ Controls inter-task dependencies.
+ See the
+ <link linkend='var-DEPENDS'><filename>DEPENDS</filename></link>
+ variable and the
+ "<link linkend='inter-task-dependencies'>Inter-Task Dependencies</link>"
+ section for more information.
+ </para></listitem>
+ <listitem><para><emphasis>deptask:</emphasis>
+ Controls task build-time dependencies.
+ See the
+ <link linkend='var-DEPENDS'><filename>DEPENDS</filename></link>
+ variable and the
+ "<link linkend='build-dependencies'>Build Dependencies</link>"
+ section for more information.
+ </para></listitem>
+ <listitem><para><emphasis>dirs:</emphasis>
+ Directories that should be created before the task runs.
+ </para></listitem>
+ <listitem><para><emphasis>lockfiles:</emphasis>
+ Specifies one or more lockfiles to lock while the task
+ executes.
+ Only one task may hold a lockfile, and any task that
+ attempts to lock an already locked file will block until
+ the lock is released.
+ You can use this variable flag to accomplish mutual
+ exclusion.
+ </para></listitem>
+ <listitem><para><emphasis>noexec:</emphasis>
+ Marks the task as being empty, with no execution required.
+ The <filename>noexec</filename> flag can be used to set up
+ tasks as dependency placeholders, or to disable tasks defined
+ elsewhere that are not needed in a particular recipe.
+ </para></listitem>
+ <listitem><para><emphasis>nostamp:</emphasis>
+ Tells BitBake to not generate a stamp file for a task,
+ which implies the task should always be executed.
+ </para></listitem>
+ <listitem><para><emphasis>postfuncs:</emphasis>
+ List of functions to call after the completion of the task.
+ </para></listitem>
+ <listitem><para><emphasis>prefuncs:</emphasis>
+ List of functions to call before the task executes.
+ </para></listitem>
+ <listitem><para><emphasis>rdepends:</emphasis>
+ Controls inter-task runtime dependencies.
+ See the
+ <link linkend='var-RDEPENDS'><filename>RDEPENDS</filename></link>
+ variable, the
+ <link linkend='var-RRECOMMENDS'><filename>RRECOMMENDS</filename></link>
+ variable, and the
+ "<link linkend='inter-task-dependencies'>Inter-Task Dependencies</link>"
+ section for more information.
+ </para></listitem>
+ <listitem><para><emphasis>rdeptask:</emphasis>
+ Controls task runtime dependencies.
+ See the
+ <link linkend='var-RDEPENDS'><filename>RDEPENDS</filename></link>
+ variable, the
+ <link linkend='var-RRECOMMENDS'><filename>RRECOMMENDS</filename></link>
+ variable, and the
+ "<link linkend='runtime-dependencies'>Runtime Dependencies</link>"
+ section for more information.
+ </para></listitem>
+ <listitem><para><emphasis>recideptask:</emphasis>
+ When set in conjunction with
+ <filename>recrdeptask</filename>, specifies a task that
+ should be inspected for additional dependencies.
+ </para></listitem>
+ <listitem><para><emphasis>recrdeptask:</emphasis>
+ Controls task recursive runtime dependencies.
+ See the
+ <link linkend='var-RDEPENDS'><filename>RDEPENDS</filename></link>
+ variable, the
+ <link linkend='var-RRECOMMENDS'><filename>RRECOMMENDS</filename></link>
+ variable, and the
+ "<link linkend='recursive-dependencies'>Recursive Dependencies</link>"
+ section for more information.
+ </para></listitem>
+ <listitem><para><emphasis>stamp-extra-info:</emphasis>
+ Extra stamp information to append to the task's stamp.
+ As an example, OpenEmbedded uses this flag to allow
+ machine-specific tasks.
+ </para></listitem>
+ <listitem><para><emphasis>umask:</emphasis>
+ The umask to run the task under.
+ </para></listitem>
+ </itemizedlist>
+ </para>
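+
+ <para>
+ For illustration only, the following sketch applies several of
+ these flags to a single task; the task name and directory paths
+ are hypothetical:
+ <literallayout class='monospaced'>
+ do_mytask[dirs] = "${TOPDIR}/mytask-work"
+ do_mytask[cleandirs] = "${TOPDIR}/mytask-output"
+ do_mytask[lockfiles] = "${TOPDIR}/mytask.lock"
+ do_mytask[umask] = "022"
+ do_mytask[nostamp] = "1"
+ </literallayout>
+ In this sketch, the work directory is created if needed, the output
+ directory is created empty, the lockfile serializes concurrent runs
+ of the task, the task runs under a umask of 022, and no stamp file
+ is written, so the task always re-executes.
+ </para>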
+
+ <para>
+ Several varflags are useful for controlling how signatures are
+ calculated for variables.
+ For more information on this process, see the
+ "<link linkend='checksums'>Checksums (Signatures)</link>"
+ section.
+ <itemizedlist>
+ <listitem><para><emphasis>vardeps:</emphasis>
+ Specifies a space-separated list of additional
+ variables to add to a variable's dependencies
+ for the purposes of calculating its signature.
+ Adding variables to this list is useful, for example, when
+ a function refers to a variable in a manner that
+ does not allow BitBake to automatically determine
+ that the variable is referred to.
+ </para></listitem>
+ <listitem><para><emphasis>vardepsexclude:</emphasis>
+ Specifies a space-separated list of variables
+ that should be excluded from a variable's dependencies
+ for the purposes of calculating its signature.
+ </para></listitem>
+ <listitem><para><emphasis>vardepvalue:</emphasis>
+ If set, instructs BitBake to ignore the actual
+ value of the variable and instead use the specified
+ value when calculating the variable's signature.
+ </para></listitem>
+ <listitem><para><emphasis>vardepvalueexclude:</emphasis>
+ Specifies a pipe-separated list of strings to exclude
+ from the variable's value when calculating the
+ variable's signature.
+ </para></listitem>
+ </itemizedlist>
+ </para>
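+
+ <para>
+ Here is a brief sketch of these signature-related flags; the task
+ and variable names are hypothetical:
+ <literallayout class='monospaced'>
+ # Add an extra variable to the signature inputs of do_mytask.
+ do_mytask[vardeps] = "EXTRA_CONFIG_VAR"
+
+ # Exclude a frequently changing variable from the signature.
+ do_mytask[vardepsexclude] = "BUILD_TIMESTAMP"
+
+ # Use a fixed value in place of SOMEVAR's actual value when
+ # computing signatures.
+ SOMEVAR[vardepvalue] = "fixedvalue"
+ </literallayout>
+ </para>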
+ </section>
+
+ <section id='events'>
+ <title>Events</title>
+
+ <para>
+ BitBake allows installation of event handlers within
+ recipe and class files.
+ Events are triggered at certain points during operation,
+ such as the beginning of an operation against a given recipe
+ (<filename>*.bb</filename> file), the start of a given task,
+ task failure, task success, and so forth.
+ The intent is to make it easy to do things like email
+ notification on build failure.
+ </para>
+
+ <para>
+ Following is an example event handler that
+ prints the name of the event and the content of
+ the <filename>FILE</filename> variable:
+ <literallayout class='monospaced'>
+ addhandler myclass_eventhandler
+ python myclass_eventhandler() {
+ from bb.event import getName
+ from bb import data
+ print("The name of the Event is %s" % getName(e))
+ print("The file we run for is %s" % data.getVar('FILE', e.data, True))
+ }
+ </literallayout>
+ This event handler gets called every time an event is
+ triggered.
+ A global variable "<filename>e</filename>" is defined and
+ "<filename>e.data</filename>" contains an instance of
+ "<filename>bb.data</filename>".
+ With the <filename>getName(e)</filename> method, one can get
+ the name of the triggered event.
+ </para>
+
+ <para>
+ Because you probably are only interested in a subset of events,
+ you would likely use the <filename>[eventmask]</filename> flag
+ for your event handler to be sure that only certain events
+ trigger the handler.
+ Given the previous example, suppose you only wanted the
+ <filename>bb.build.TaskFailed</filename> event to trigger that
+ event handler.
+ Use the flag as follows:
+ <literallayout class='monospaced'>
+ addhandler myclass_eventhandler
+ myclass_eventhandler[eventmask] = "bb.build.TaskFailed"
+ python myclass_eventhandler() {
+ from bb.event import getName
+ from bb import data
+ print("The name of the Event is %s" % getName(e))
+ print("The file we run for is %s" % data.getVar('FILE', e.data, True))
+ }
+ </literallayout>
+ </para>
+
+ <para>
+ During a standard build, the following common events might occur:
+ <itemizedlist>
+ <listitem><para>
+ <filename>bb.event.ConfigParsed()</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.event.ParseStarted()</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.event.ParseProgress()</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.event.ParseCompleted()</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.event.BuildStarted()</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.build.TaskStarted()</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.build.TaskInvalid()</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.build.TaskFailedSilent()</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.build.TaskFailed()</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.build.TaskSucceeded()</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.event.BuildCompleted()</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.cooker.CookerExit()</filename>
+ </para></listitem>
+ </itemizedlist>
+ Here is a list of other events that occur based on specific requests
+ to the server:
+ <itemizedlist>
+ <listitem><para>
+ <filename>bb.event.TreeDataPreparationStarted()</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.event.TreeDataPreparationProgress</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.event.TreeDataPreparationCompleted</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.event.DepTreeGenerated</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.event.CoreBaseFilesFound</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.event.ConfigFilePathFound</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.event.FilesMatchingFound</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.event.ConfigFilesFound</filename>
+ </para></listitem>
+ <listitem><para>
+ <filename>bb.event.TargetsTreeGenerated</filename>
+ </para></listitem>
+ </itemizedlist>
+ </para>
+ </section>
+
+ <section id='variants-class-extension-mechanism'>
+ <title>Variants - Class Extension Mechanism</title>
+
+ <para>
+ BitBake supports two features that facilitate creating,
+ from a single recipe file, multiple incarnations of that
+ recipe where all incarnations are buildable.
+ These features are enabled through the
+ <link linkend='var-BBCLASSEXTEND'><filename>BBCLASSEXTEND</filename></link>
+ and
+ <link linkend='var-BBVERSIONS'><filename>BBVERSIONS</filename></link>
+ variables.
+ <note>
+ The mechanism for this class extension is extremely
+ specific to the implementation.
+ Usually, the recipe's
+ <link linkend='var-PROVIDES'><filename>PROVIDES</filename></link>,
+ <link linkend='var-PN'><filename>PN</filename></link>, and
+ <link linkend='var-DEPENDS'><filename>DEPENDS</filename></link>
+ variables would need to be modified by the extension class.
+ For specific examples, see the OE-Core
+ <filename>native</filename>, <filename>nativesdk</filename>,
+ and <filename>multilib</filename> classes.
+ </note>
+ <itemizedlist>
+ <listitem><para><emphasis><filename>BBCLASSEXTEND</filename>:</emphasis>
+ This variable is a space separated list of classes used to "extend" the
+ recipe for each variant.
+ Here is an example that results in a second incarnation of the current
+ recipe being available.
+ This second incarnation will have the "native" class inherited.
+ <literallayout class='monospaced'>
+ BBCLASSEXTEND = "native"
+ </literallayout></para></listitem>
+ <listitem><para><emphasis><filename>BBVERSIONS</filename>:</emphasis>
+ This variable allows a single recipe to build multiple versions of a
+ project from a single recipe file.
+ You can also specify conditional metadata
+ (using the
+ <link linkend='var-OVERRIDES'><filename>OVERRIDES</filename></link>
+ mechanism) for a single version, or an optionally named range of versions.
+ Here is an example:
+ <literallayout class='monospaced'>
+ BBVERSIONS = "1.0 2.0 git"
+ SRC_URI_git = "git://someurl/somepath.git"
+
+ BBVERSIONS = "1.0.[0-6]:1.0.0+ \ 1.0.[7-9]:1.0.7+"
+ SRC_URI_append_1.0.7+ = "file://some_patch_which_the_new_versions_need.patch;patch=1"
+ </literallayout>
+ The name of the range defaults to the original version of the
+ recipe.
+ For example, in OpenEmbedded, the recipe file
+ <filename>foo_1.0.0+.bb</filename> creates a default name range
+ of <filename>1.0.0+</filename>.
+ This is useful because the range name is not only placed
+ into overrides, but it is also made available for the metadata to use
+ in the variable that defines the base recipe versions for use in
+ <filename>file://</filename> search paths
+ (<link linkend='var-FILESPATH'><filename>FILESPATH</filename></link>).
+ </para></listitem>
+ </itemizedlist>
+ </para>
+ </section>
+
+ <section id='dependencies'>
+ <title>Dependencies</title>
+
+ <para>
+ To allow for efficient operation given multiple processes
+ executing in parallel, BitBake handles dependencies at
+ the task level.
+ BitBake supports a robust method to handle these dependencies.
+ </para>
+
+ <para>
+ This section describes several types of dependency mechanisms.
+ </para>
+
+ <section id='dependencies-internal-to-the-bb-file'>
+ <title>Dependencies Internal to the <filename>.bb</filename> File</title>
+
+ <para>
+ BitBake uses the <filename>addtask</filename> directive
+ to manage dependencies that are internal to a given recipe
+ file.
+ You can use the <filename>addtask</filename> directive to
+ indicate when a task is dependent on other tasks or when
+ other tasks depend on that task.
+ Here is an example:
+ <literallayout class='monospaced'>
+ addtask printdate after do_fetch before do_build
+ </literallayout>
+ In this example, the <filename>printdate</filename> task
+ depends on the completion of the <filename>do_fetch</filename>
+ task.
+ And, the <filename>do_build</filename> task depends on the
+ completion of the <filename>printdate</filename> task.
+ </para>
+ </section>
+
+ <section id='build-dependencies'>
+ <title>Build Dependencies</title>
+
+ <para>
+ BitBake uses the
+ <link linkend='var-DEPENDS'><filename>DEPENDS</filename></link>
+ variable to manage build time dependencies.
+ The "deptask" varflag for tasks signifies the task of each
+ item listed in <filename>DEPENDS</filename> that must
+ complete before that task can be executed.
+ Here is an example:
+ <literallayout class='monospaced'>
+ do_configure[deptask] = "do_populate_sysroot"
+ </literallayout>
+ In this example, the <filename>do_populate_sysroot</filename>
+ task of each item in <filename>DEPENDS</filename> must complete before
+ <filename>do_configure</filename> can execute.
+ </para>
+ </section>
+
+ <section id='runtime-dependencies'>
+ <title>Runtime Dependencies</title>
+
+ <para>
+ BitBake uses the
+ <link linkend='var-PACKAGES'><filename>PACKAGES</filename></link>,
+ <link linkend='var-RDEPENDS'><filename>RDEPENDS</filename></link>, and
+ <link linkend='var-RRECOMMENDS'><filename>RRECOMMENDS</filename></link>
+ variables to manage runtime dependencies.
+ </para>
+
+ <para>
+ The <filename>PACKAGES</filename> variable lists runtime
+ packages.
+ Each of those packages can have <filename>RDEPENDS</filename> and
+ <filename>RRECOMMENDS</filename> runtime dependencies.
+ The "rdeptask" flag for tasks is used to signify the task of each
+ item runtime dependency which must have completed before that
+ task can be executed.
+ <literallayout class='monospaced'>
+ do_package_qa[rdeptask] = "do_packagedata"
+ </literallayout>
+ In the previous example, the <filename>do_packagedata</filename>
+ task of each item in <filename>RDEPENDS</filename> must have
+ completed before <filename>do_package_qa</filename> can execute.
+ </para>
+ </section>
+
+ <section id='recursive-dependencies'>
+ <title>Recursive Dependencies</title>
+
+ <para>
+ BitBake uses the "recrdeptask" flag to manage
+ recursive task dependencies.
+ BitBake looks through the build-time and runtime
+ dependencies of the current recipe, looks through
+ the task's inter-task
+ dependencies, and then adds dependencies for the
+ listed task.
+ Once BitBake has accomplished this, it recursively works through
+ the dependencies of those tasks.
+ Iterative passes continue until all dependencies are discovered
+ and added.
+ </para>
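+
+ <para>
+ As a hypothetical sketch of the basic form, the following line
+ asks BitBake to run the <filename>do_fetch</filename> task of the
+ recipe's build-time and runtime dependencies, followed recursively
+ through their own dependencies, before
+ <filename>do_mytask</filename> executes:
+ <literallayout class='monospaced'>
+ do_mytask[recrdeptask] = "do_fetch"
+ </literallayout>
+ </para>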
+
+ <para>
+ You might want to not only have BitBake look for
+ dependencies of those tasks, but also have BitBake look
+ for build-time and runtime dependencies of the dependent
+ tasks as well.
+ If that is the case, you need to reference the task name
+ itself in the task list:
+ <literallayout class='monospaced'>
+ do_a[recrdeptask] = "do_a do_b"
+ </literallayout>
+ </para>
+ </section>
+
+ <section id='inter-task-dependencies'>
+ <title>Inter-Task Dependencies</title>
+
+ <para>
+ BitBake uses the "depends" flag in a more generic form
+ to manage inter-task dependencies.
+ This more generic form allows for inter-dependency
+ checks for specific tasks rather than checks for
+ the data in <filename>DEPENDS</filename>.
+ Here is an example:
+ <literallayout class='monospaced'>
+ do_patch[depends] = "quilt-native:do_populate_sysroot"
+ </literallayout>
+ In this example, the <filename>do_populate_sysroot</filename>
+ task of the target <filename>quilt-native</filename>
+ must have completed before the
+ <filename>do_patch</filename> task can execute.
+ </para>
+
+ <para>
+ The "rdepends" flag works in a similar way but takes targets
+ in the runtime namespace instead of the build-time dependency
+ namespace.
+ </para>
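+
+        <para>
+            The syntax mirrors that of the "depends" flag.
+            The following purely illustrative sketch uses placeholder
+            names (none of them are real recipes or tasks) to show the
+            general form, where the target is resolved in the runtime
+            (package) namespace:
+            <literallayout class='monospaced'>
+     do_sometask[rdepends] = "<replaceable>runtime-target</replaceable>:do_othertask"
+            </literallayout>
+        </para>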
+ </section>
+ </section>
+
+ <section id='accessing-datastore-variables-using-python'>
+ <title>Accessing Datastore Variables Using Python</title>
+
+ <para>
+ It is often necessary to access variables in the
+ BitBake datastore using Python functions.
+        The BitBake datastore has an API that provides this
+        access.
+ Here is a list of available operations:
+ </para>
+
+ <para>
+ <informaltable frame='none'>
+ <tgroup cols='2' align='left' colsep='1' rowsep='1'>
+ <colspec colname='c1' colwidth='1*'/>
+ <colspec colname='c2' colwidth='1*'/>
+ <thead>
+ <row>
+ <entry align="left"><emphasis>Operation</emphasis></entry>
+ <entry align="left"><emphasis>Description</emphasis></entry>
+ </row>
+ </thead>
+ <tbody>
+ <row>
+ <entry align="left"><filename>d.getVar("X", expand=False)</filename></entry>
+ <entry align="left">Returns the value of variable "X".
+ Using "expand=True" expands the value.</entry>
+ </row>
+ <row>
+ <entry align="left"><filename>d.setVar("X", "value")</filename></entry>
+ <entry align="left">Sets the variable "X" to "value".</entry>
+ </row>
+ <row>
+ <entry align="left"><filename>d.appendVar("X", "value")</filename></entry>
+ <entry align="left">Adds "value" to the end of the variable "X".</entry>
+ </row>
+ <row>
+ <entry align="left"><filename>d.prependVar("X", "value")</filename></entry>
+ <entry align="left">Adds "value" to the start of the variable "X".</entry>
+ </row>
+ <row>
+ <entry align="left"><filename>d.delVar("X")</filename></entry>
+ <entry align="left">Deletes the variable "X" from the datastore.</entry>
+ </row>
+ <row>
+ <entry align="left"><filename>d.renameVar("X", "Y")</filename></entry>
+ <entry align="left">Renames the variable "X" to "Y".</entry>
+ </row>
+ <row>
+ <entry align="left"><filename>d.getVarFlag("X", flag, expand=False)</filename></entry>
+                        <entry align="left">Gets the named flag from the variable "X".
+ Using "expand=True" expands the named flag.</entry>
+ </row>
+ <row>
+ <entry align="left"><filename>d.setVarFlag("X", flag, "value")</filename></entry>
+ <entry align="left">Sets the named flag for variable "X" to "value".</entry>
+ </row>
+ <row>
+ <entry align="left"><filename>d.appendVarFlag("X", flag, "value")</filename></entry>
+ <entry align="left">Appends "value" to the named flag on the
+ variable "X".</entry>
+ </row>
+ <row>
+ <entry align="left"><filename>d.prependVarFlag("X", flag, "value")</filename></entry>
+ <entry align="left">Prepends "value" to the named flag on
+ the variable "X".</entry>
+ </row>
+ <row>
+ <entry align="left"><filename>d.delVarFlag("X", flag)</filename></entry>
+ <entry align="left">Deletes the named flag on the variable
+ "X" from the datastore.</entry>
+ </row>
+ <row>
+ <entry align="left"><filename>d.setVarFlags("X", flagsdict)</filename></entry>
+ <entry align="left">Sets the flags specified in
+ the <filename>flagsdict()</filename> parameter.
+ <filename>setVarFlags</filename> does not clear previous flags.
+ Think of this operation as <filename>addVarFlags</filename>.</entry>
+ </row>
+ <row>
+ <entry align="left"><filename>d.getVarFlags("X")</filename></entry>
+ <entry align="left">Returns a <filename>flagsdict</filename> of the flags for
+ the variable "X".</entry>
+ </row>
+ <row>
+ <entry align="left"><filename>d.delVarFlags("X")</filename></entry>
+ <entry align="left">Deletes all the flags for the variable "X".</entry>
+ </row>
+ </tbody>
+ </tgroup>
+ </informaltable>
+ </para>
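+
+    <para>
+        The following short, hypothetical example shows a few of these
+        operations used together from within a Python task.
+        The <filename>do_display_banner</filename> task name and the
+        "MY_GREETING" variable are made up for illustration:
+        <literallayout class='monospaced'>
+     python do_display_banner() {
+         # Read and expand a variable from the datastore.
+         pn = d.getVar("PN", True)
+
+         # Set a variable and then append to it.
+         d.setVar("MY_GREETING", "Hello, ")
+         d.appendVar("MY_GREETING", pn)
+
+         # Attach a flag to the variable and read the result back.
+         d.setVarFlag("MY_GREETING", "doc", "A banner message")
+         bb.plain(d.getVar("MY_GREETING", True))
+     }
+
+     addtask display_banner before do_build
+        </literallayout>
+    </para>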
+ </section>
+
+ <section id='task-checksums-and-setscene'>
+ <title>Task Checksums and Setscene</title>
+
+ <para>
+ BitBake uses checksums (or signatures) along with the setscene
+ to determine if a task needs to be run.
+ This section describes the process.
+ To help understand how BitBake does this, the section assumes an
+ OpenEmbedded metadata-based example.
+ </para>
+
+        This list is a placeholder for content that existed from previous
+        work on the manual.
+        Some or all of it probably needs to be integrated into the
+        subsections that make up this section.
+        For now, each variable has a short glossary-like description.
+        Ultimately, this list goes away.
+ <itemizedlist>
+ <listitem><para><filename>STAMP</filename>:
+ The base path to create stamp files.</para></listitem>
+            <listitem><para><filename>STAMPCLEAN</filename>:
+                Again, the base path to create stamp files but one that can
+                use wildcards for matching a range of files for clean operations.
+ </para></listitem>
+            <listitem><para><filename>BB_STAMP_WHITELIST</filename>:
+ Lists stamp files that are looked at when the stamp policy
+ is "whitelist".
+ </para></listitem>
+            <listitem><para><filename>BB_STAMP_POLICY</filename>:
+ Defines the mode for comparing timestamps of stamp files.
+ </para></listitem>
+            <listitem><para><filename>BB_HASHCHECK_FUNCTION</filename>:
+ Specifies the name of the function to call during
+ the "setscene" part of the task's execution in order
+ to validate the list of task hashes.
+ </para></listitem>
+            <listitem><para><filename>BB_SETSCENE_VERIFY_FUNCTION</filename>:
+ Specifies a function to call that verifies the list of
+ planned task execution before the main task execution
+ happens.
+ </para></listitem>
+            <listitem><para><filename>BB_SETSCENE_DEPVALID</filename>:
+ Specifies a function BitBake calls that determines
+ whether BitBake requires a setscene dependency to
+ be met.
+ </para></listitem>
+            <listitem><para><filename>BB_TASKHASH</filename>:
+ Within an executing task, this variable holds the hash
+ of the task as returned by the currently enabled
+ signature generator.
+ </para></listitem>
+ </itemizedlist>
+ </para>
+ </section>
+</chapter>
diff --git a/bitbake/doc/bitbake-user-manual/bitbake-user-manual-ref-variables.xml b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-ref-variables.xml
new file mode 100644
index 0000000..05e1b95
--- /dev/null
+++ b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-ref-variables.xml
@@ -0,0 +1,2262 @@
+<!DOCTYPE chapter PUBLIC "-//OASIS//DTD DocBook XML V4.2//EN"
+"http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd"
+[<!ENTITY % poky SYSTEM "../poky.ent"> %poky; ] >
+
+<!-- Dummy chapter -->
+<chapter id='ref-variables-glos'>
+
+<title>Variables Glossary</title>
+
+<para>
+ This chapter lists common variables used by BitBake and gives an overview
+ of their function and contents.
+</para>
+
+<note>
+ Following are some points regarding the variables listed in this glossary:
+ <itemizedlist>
+ <listitem><para>The variables listed in this glossary
+ are specific to BitBake.
+ Consequently, the descriptions are limited to that context.
+ </para></listitem>
+ <listitem><para>Also, variables exist in other systems that use BitBake
+ (e.g. The Yocto Project and OpenEmbedded) that have names identical
+ to those found in this glossary.
+ For such cases, the variables in those systems extend the
+ functionality of the variable as it is described here in
+ this glossary.
+ </para></listitem>
+ <listitem><para>Finally, there are variables mentioned in this
+ glossary that do not appear in the BitBake glossary.
+ These other variables are variables used in systems that use
+ BitBake.
+ </para></listitem>
+ </itemizedlist>
+</note>
+
+<glossary id='ref-variables-glossary'>
+
+ <para>
+ <link linkend='var-ASSUME_PROVIDED'>A</link>
+ <link linkend='var-B'>B</link>
+ <link linkend='var-CACHE'>C</link>
+ <link linkend='var-DEFAULT_PREFERENCE'>D</link>
+ <link linkend='var-EXCLUDE_FROM_WORLD'>E</link>
+ <link linkend='var-FAKEROOT'>F</link>
+ <link linkend='var-GITDIR'>G</link>
+ <link linkend='var-HGDIR'>H</link>
+<!-- <link linkend='var-ICECC_DISABLED'>I</link> -->
+<!-- <link linkend='var-glossary-j'>J</link> -->
+<!-- <link linkend='var-KARCH'>K</link> -->
+ <link linkend='var-LAYERDEPENDS'>L</link>
+ <link linkend='var-MIRRORS'>M</link>
+<!-- <link linkend='var-glossary-n'>N</link> -->
+ <link linkend='var-OVERRIDES'>O</link>
+ <link linkend='var-PACKAGES'>P</link>
+<!-- <link linkend='var-QMAKE_PROFILES'>Q</link> -->
+ <link linkend='var-RDEPENDS'>R</link>
+ <link linkend='var-SECTION'>S</link>
+ <link linkend='var-T'>T</link>
+<!-- <link linkend='var-UBOOT_CONFIG'>U</link> -->
+<!-- <link linkend='var-glossary-v'>V</link> -->
+<!-- <link linkend='var-WARN_QA'>W</link> -->
+<!-- <link linkend='var-glossary-x'>X</link> -->
+<!-- <link linkend='var-glossary-y'>Y</link> -->
+<!-- <link linkend='var-glossary-z'>Z</link>-->
+ </para>
+
+ <glossdiv id='var-glossary-a'><title>A</title>
+
+ <glossentry id='var-ASSUME_PROVIDED'><glossterm>ASSUME_PROVIDED</glossterm>
+ <glossdef>
+ <para>
+ Lists recipe names
+ (<link linkend='var-PN'><filename>PN</filename></link>
+ values) BitBake does not attempt to build.
+ Instead, BitBake assumes these recipes have already been
+ built.
+ </para>
+
+ <para>
+ In OpenEmbedded Core, <filename>ASSUME_PROVIDED</filename>
+ mostly specifies native tools that should not be built.
+ An example is <filename>git-native</filename>, which
+ when specified allows for the Git binary from the host to
+ be used rather than building
+ <filename>git-native</filename>.
+ </para>
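+
+                <para>
+                    For example, to use the host's Git rather than building
+                    <filename>git-native</filename>, a configuration could
+                    contain a line along these lines (a sketch based on the
+                    description above):
+                    <literallayout class='monospaced'>
+     ASSUME_PROVIDED += "git-native"
+                    </literallayout>
+                </para>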
+ </glossdef>
+ </glossentry>
+
+ </glossdiv>
+
+
+ <glossdiv id='var-glossary-b'><title>B</title>
+
+ <glossentry id='var-B'><glossterm>B</glossterm>
+ <glossdef>
+ <para>
+ The directory in which BitBake executes functions
+ during a recipe's build process.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_ALLOWED_NETWORKS'><glossterm>BB_ALLOWED_NETWORKS</glossterm>
+ <glossdef>
+ <para>
+ Specifies a space-delimited list of hosts that the fetcher
+ is allowed to use to obtain the required source code.
+ Following are considerations surrounding this variable:
+ <itemizedlist>
+ <listitem><para>
+ This host list is only used if
+ <link linkend='var-BB_NO_NETWORK'><filename>BB_NO_NETWORK</filename></link>
+ is either not set or set to "0".
+ </para></listitem>
+ <listitem><para>
+ Limited support for wildcard matching against the
+ beginning of host names exists.
+ For example, the following setting matches
+ <filename>git.gnu.org</filename>,
+ <filename>ftp.gnu.org</filename>, and
+ <filename>foo.git.gnu.org</filename>.
+ <literallayout class='monospaced'>
+ BB_ALLOWED_NETWORKS = "*.gnu.org"
+ </literallayout>
+ </para></listitem>
+ <listitem><para>
+ Mirrors not in the host list are skipped and
+ logged in debug.
+ </para></listitem>
+ <listitem><para>
+ Attempts to access networks not in the host list
+ cause a failure.
+ </para></listitem>
+ </itemizedlist>
+ Using <filename>BB_ALLOWED_NETWORKS</filename> in
+ conjunction with
+ <link linkend='var-PREMIRRORS'><filename>PREMIRRORS</filename></link>
+ is very useful.
+ Adding the host you want to use to
+ <filename>PREMIRRORS</filename> results in the source code
+ being fetched from an allowed location and avoids raising
+ an error when a host that is not allowed is in a
+ <link linkend='var-SRC_URI'><filename>SRC_URI</filename></link>
+ statement.
+ This is because the fetcher does not attempt to use the
+ host listed in <filename>SRC_URI</filename> after a
+ successful fetch from the
+ <filename>PREMIRRORS</filename> occurs.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_CONSOLELOG'><glossterm>BB_CONSOLELOG</glossterm>
+ <glossdef>
+ <para>
+ Specifies the path to a log file into which BitBake's user
+ interface writes output during the build.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_CURRENTTASK'><glossterm>BB_CURRENTTASK</glossterm>
+ <glossdef>
+ <para>
+ Contains the name of the currently running task.
+ The name does not include the
+ <filename>do_</filename> prefix.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_DANGLINGAPPENDS_WARNONLY'><glossterm>BB_DANGLINGAPPENDS_WARNONLY</glossterm>
+ <glossdef>
+ <para>
+ Defines how BitBake handles situations where an append
+ file (<filename>.bbappend</filename>) has no
+ corresponding recipe file (<filename>.bb</filename>).
+ This condition often occurs when layers get out of sync
+ (e.g. <filename>oe-core</filename> bumps a
+ recipe version and the old recipe no longer exists and the
+ other layer has not been updated to the new version
+ of the recipe yet).
+ </para>
+
+ <para>
+ The default fatal behavior is safest because it is
+ the sane reaction given something is out of sync.
+ It is important to realize when your changes are no longer
+ being applied.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_DEFAULT_TASK'><glossterm>BB_DEFAULT_TASK</glossterm>
+ <glossdef>
+ <para>
+ The default task to use when none is specified (e.g.
+ with the <filename>-c</filename> command line option).
+ The task name specified should not include the
+ <filename>do_</filename> prefix.
+ </para>
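+
+                <para>
+                    For example, OpenEmbedded-style configurations typically
+                    set the following so that <filename>do_build</filename>
+                    runs when no task is given:
+                    <literallayout class='monospaced'>
+     BB_DEFAULT_TASK ?= "build"
+                    </literallayout>
+                </para>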
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_DISKMON_DIRS'><glossterm>BB_DISKMON_DIRS</glossterm>
+ <glossdef>
+ <para>
+ Monitors disk space and available inodes during the build
+ and allows you to control the build based on these
+ parameters.
+ </para>
+
+ <para>
+ Disk space monitoring is disabled by default.
+ When setting this variable, use the following form:
+ <literallayout class='monospaced'>
+ BB_DISKMON_DIRS = "<action>,<dir>,<threshold> [...]"
+
+ where:
+
+ <action> is:
+ ABORT: Immediately abort the build when
+ a threshold is broken.
+ STOPTASKS: Stop the build after the currently
+ executing tasks have finished when
+ a threshold is broken.
+ WARN: Issue a warning but continue the
+ build when a threshold is broken.
+ Subsequent warnings are issued as
+ defined by the
+ <link linkend='var-BB_DISKMON_WARNINTERVAL'>BB_DISKMON_WARNINTERVAL</link> variable,
+ which must be defined.
+
+ <dir> is:
+ Any directory you choose. You can specify one or
+ more directories to monitor by separating the
+ groupings with a space. If two directories are
+ on the same device, only the first directory
+ is monitored.
+
+ <threshold> is:
+ Either the minimum available disk space,
+ the minimum number of free inodes, or
+ both. You must specify at least one. To
+ omit one or the other, simply omit the value.
+ Specify the threshold using G, M, K for Gbytes,
+ Mbytes, and Kbytes, respectively. If you do
+ not specify G, M, or K, Kbytes is assumed by
+ default. Do not use GB, MB, or KB.
+ </literallayout>
+ </para>
+
+ <para>
+ Here are some examples:
+ <literallayout class='monospaced'>
+ BB_DISKMON_DIRS = "ABORT,${TMPDIR},1G,100K WARN,${SSTATE_DIR},1G,100K"
+ BB_DISKMON_DIRS = "STOPTASKS,${TMPDIR},1G"
+ BB_DISKMON_DIRS = "ABORT,${TMPDIR},,100K"
+ </literallayout>
+ The first example works only if you also set
+ the <link linkend='var-BB_DISKMON_WARNINTERVAL'><filename>BB_DISKMON_WARNINTERVAL</filename></link> variable.
+ This example causes the build system to immediately
+ abort when either the disk space in <filename>${TMPDIR}</filename> drops
+ below 1 Gbyte or the available free inodes drops below
+ 100 Kbytes.
+ Because two directories are provided with the variable, the
+ build system also issues a
+ warning when the disk space in the
+ <filename>${SSTATE_DIR}</filename> directory drops
+ below 1 Gbyte or the number of free inodes drops
+ below 100 Kbytes.
+ Subsequent warnings are issued during intervals as
+ defined by the <filename>BB_DISKMON_WARNINTERVAL</filename>
+ variable.
+ </para>
+
+ <para>
+ The second example stops the build after all currently
+ executing tasks complete when the minimum disk space
+ in the <filename>${TMPDIR}</filename>
+ directory drops below 1 Gbyte.
+ No disk monitoring occurs for the free inodes in this case.
+ </para>
+
+ <para>
+ The final example immediately aborts the build when the
+ number of free inodes in the <filename>${TMPDIR}</filename> directory
+ drops below 100 Kbytes.
+ No disk space monitoring for the directory itself occurs
+ in this case.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_DISKMON_WARNINTERVAL'><glossterm>BB_DISKMON_WARNINTERVAL</glossterm>
+ <glossdef>
+ <para>
+ Defines the disk space and free inode warning intervals.
+ </para>
+
+ <para>
+ If you are going to use the
+ <filename>BB_DISKMON_WARNINTERVAL</filename> variable, you must
+ also use the
+ <link linkend='var-BB_DISKMON_DIRS'><filename>BB_DISKMON_DIRS</filename></link> variable
+ and define its action as "WARN".
+ During the build, subsequent warnings are issued each time
+ disk space or number of free inodes further reduces by
+ the respective interval.
+ </para>
+
+ <para>
+ If you do not provide a <filename>BB_DISKMON_WARNINTERVAL</filename>
+ variable and you do use <filename>BB_DISKMON_DIRS</filename> with
+ the "WARN" action, the disk monitoring interval defaults to
+ the following:
+ <literallayout class='monospaced'>
+ BB_DISKMON_WARNINTERVAL = "50M,5K"
+ </literallayout>
+ </para>
+
+ <para>
+ When specifying the variable in your configuration file,
+ use the following form:
+ <literallayout class='monospaced'>
+ BB_DISKMON_WARNINTERVAL = "<disk_space_interval>,<disk_inode_interval>"
+
+ where:
+
+ <disk_space_interval> is:
+ An interval of memory expressed in either
+ G, M, or K for Gbytes, Mbytes, or Kbytes,
+ respectively. You cannot use GB, MB, or KB.
+
+ <disk_inode_interval> is:
+ An interval of free inodes expressed in either
+ G, M, or K for Gbytes, Mbytes, or Kbytes,
+ respectively. You cannot use GB, MB, or KB.
+ </literallayout>
+ </para>
+
+ <para>
+ Here is an example:
+ <literallayout class='monospaced'>
+ BB_DISKMON_DIRS = "WARN,${SSTATE_DIR},1G,100K"
+ BB_DISKMON_WARNINTERVAL = "50M,5K"
+ </literallayout>
+ These variables cause BitBake to
+ issue subsequent warnings each time the available
+ disk space further reduces by 50 Mbytes or the number
+ of free inodes further reduces by 5 Kbytes in the
+ <filename>${SSTATE_DIR}</filename> directory.
+ Subsequent warnings based on the interval occur each time
+ a respective interval is reached beyond the initial warning
+ (i.e. 1 Gbytes and 100 Kbytes).
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_ENV_WHITELIST'><glossterm>BB_ENV_WHITELIST</glossterm>
+ <glossdef>
+ <para>
+ Specifies the internal whitelist of variables to allow
+ through from the external environment into BitBake's
+ datastore.
+ If the value of this variable is not specified
+ (which is the default), the following list is used:
+ <link linkend='var-BBPATH'><filename>BBPATH</filename></link>,
+ <link linkend='var-BB_PRESERVE_ENV'><filename>BB_PRESERVE_ENV</filename></link>,
+ <link linkend='var-BB_ENV_WHITELIST'><filename>BB_ENV_WHITELIST</filename></link>,
+ and
+ <link linkend='var-BB_ENV_EXTRAWHITE'><filename>BB_ENV_EXTRAWHITE</filename></link>.
+ <note>
+ You must set this variable in the external environment
+ in order for it to work.
+ </note>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_ENV_EXTRAWHITE'><glossterm>BB_ENV_EXTRAWHITE</glossterm>
+ <glossdef>
+ <para>
+ Specifies an additional set of variables to allow through
+ (whitelist) from the external environment into BitBake's
+ datastore.
+                    This list of variables is on top of the internal list
+ set in
+ <link linkend='var-BB_ENV_WHITELIST'><filename>BB_ENV_WHITELIST</filename></link>.
+ <note>
+ You must set this variable in the external
+ environment in order for it to work.
+ </note>
+ </para>
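+
+                <para>
+                    For example, commands such as the following (run in the
+                    external environment before invoking BitBake) would
+                    allow a hypothetical "MYVAR" variable into the datastore
+                    in addition to the internal list:
+                    <literallayout class='monospaced'>
+     $ export MYVAR="some value"
+     $ export BB_ENV_EXTRAWHITE="MYVAR"
+     $ bitbake <replaceable>target</replaceable>
+                    </literallayout>
+                </para>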
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_FETCH_PREMIRRORONLY'><glossterm>BB_FETCH_PREMIRRORONLY</glossterm>
+ <glossdef>
+ <para>
+ When set to "1", causes BitBake's fetcher module to only
+ search
+ <link linkend='var-PREMIRRORS'><filename>PREMIRRORS</filename></link>
+ for files.
+ BitBake will not search the main
+ <link linkend='var-SRC_URI'><filename>SRC_URI</filename></link>
+ or
+ <link linkend='var-MIRRORS'><filename>MIRRORS</filename></link>.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_FILENAME'><glossterm>BB_FILENAME</glossterm>
+ <glossdef>
+ <para>
+ Contains the filename of the recipe that owns the currently
+ running task.
+ For example, if the <filename>do_fetch</filename> task that
+ resides in the <filename>my-recipe.bb</filename> is
+ executing, the <filename>BB_FILENAME</filename> variable
+ contains "/foo/path/my-recipe.bb".
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_GENERATE_MIRROR_TARBALLS'><glossterm>BB_GENERATE_MIRROR_TARBALLS</glossterm>
+ <glossdef>
+ <para>
+ Causes tarballs of the Git repositories, including the
+ Git metadata, to be placed in the
+ <link linkend='var-DL_DIR'><filename>DL_DIR</filename></link>
+ directory.
+ Anyone wishing to create a source mirror would want to
+ enable this variable.
+ </para>
+
+ <para>
+ For performance reasons, creating and placing tarballs of
+ the Git repositories is not the default action by BitBake.
+ <literallayout class='monospaced'>
+ BB_GENERATE_MIRROR_TARBALLS = "1"
+ </literallayout>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_HASHCONFIG_WHITELIST'><glossterm>BB_HASHCONFIG_WHITELIST</glossterm>
+ <glossdef>
+ <para>
+ Lists variables that are excluded from base configuration
+ checksum, which is used to determine if the cache can
+ be reused.
+ </para>
+
+ <para>
+ One of the ways BitBake determines whether to re-parse the
+ main metadata is through checksums of the variables in the
+ datastore of the base configuration data.
+ There are variables that you typically want to exclude when
+ checking whether or not to re-parse and thus rebuild the
+ cache.
+ As an example, you would usually exclude
+ <filename>TIME</filename> and <filename>DATE</filename>
+ because these variables are always changing.
+ If you did not exclude them, BitBake would never reuse the
+ cache.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_HASHBASE_WHITELIST'><glossterm>BB_HASHBASE_WHITELIST</glossterm>
+ <glossdef>
+ <para>
+ Lists variables that are excluded from checksum and
+ dependency data.
+ Variables that are excluded can therefore change without
+ affecting the checksum mechanism.
+ A common example would be the variable for the path of
+ the build.
+ BitBake's output should not (and usually does not) depend
+ on the directory in which it was built.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_HASHCHECK_FUNCTION'><glossterm>BB_HASHCHECK_FUNCTION</glossterm>
+ <glossdef>
+ <para>
+ Specifies the name of the function to call during the
+ "setscene" part of the task's execution in order to
+ validate the list of task hashes.
+ The function returns the list of setscene tasks that should
+ be executed.
+ </para>
+
+ <para>
+ At this point in the execution of the code, the objective
+ is to quickly verify if a given setscene function is likely
+ to work or not.
+ It's easier to check the list of setscene functions in
+ one pass than to call many individual tasks.
+ The returned list need not be completely accurate.
+ A given setscene task can still later fail.
+ However, the more accurate the data returned, the more
+ efficient the build will be.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_INVALIDCONF'><glossterm>BB_INVALIDCONF</glossterm>
+ <glossdef>
+ <para>
+ Used in combination with the
+ <filename>ConfigParsed</filename> event to trigger
+ re-parsing the base metadata (i.e. all the
+ recipes).
+ The <filename>ConfigParsed</filename> event can set the
+ variable to trigger the re-parse.
+ You must be careful to avoid recursive loops with this
+ functionality.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_LOGFMT'><glossterm>BB_LOGFMT</glossterm>
+ <glossdef>
+ <para>
+ Specifies the name of the log files saved into
+ <filename>${</filename><link linkend='var-T'><filename>T</filename></link><filename>}</filename>.
+ By default, the <filename>BB_LOGFMT</filename> variable
+ is undefined and the log file names get created using the
+ following form:
+ <literallayout class='monospaced'>
+ log.{task}.{pid}
+ </literallayout>
+ If you want to force log files to take a specific name,
+ you can set this variable in a configuration file.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_NICE_LEVEL'><glossterm>BB_NICE_LEVEL</glossterm>
+ <glossdef>
+ <para>
+ Allows BitBake to run at a specific priority
+ (i.e. nice level).
+ System permissions usually mean that BitBake can reduce its
+ priority but not raise it again.
+ See
+ <link linkend='var-BB_TASK_NICE_LEVEL'><filename>BB_TASK_NICE_LEVEL</filename></link>
+ for additional information.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_NO_NETWORK'><glossterm>BB_NO_NETWORK</glossterm>
+ <glossdef>
+ <para>
+ Disables network access in the BitBake fetcher modules.
+ With this access disabled, any command that attempts to
+ access the network becomes an error.
+ </para>
+
+ <para>
+ Disabling network access is useful for testing source
+ mirrors, running builds when not connected to the Internet,
+ and when operating in certain kinds of firewall
+ environments.
+ </para>
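+
+                <para>
+                    For example, setting the following in a configuration
+                    file (or the external environment) causes any command
+                    that attempts to use the network to fail:
+                    <literallayout class='monospaced'>
+     BB_NO_NETWORK = "1"
+                    </literallayout>
+                </para>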
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_NUMBER_THREADS'><glossterm>BB_NUMBER_THREADS</glossterm>
+ <glossdef>
+ <para>
+ The maximum number of tasks BitBake should run in parallel
+ at any one time.
+ If your host development system supports multiple cores,
+ a good rule of thumb is to set this variable to twice the
+ number of cores.
+ </para>
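+
+                <para>
+                    For example, following the rule of thumb above, a host
+                    with four cores might use:
+                    <literallayout class='monospaced'>
+     BB_NUMBER_THREADS = "8"
+                    </literallayout>
+                </para>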
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_NUMBER_PARSE_THREADS'><glossterm>BB_NUMBER_PARSE_THREADS</glossterm>
+ <glossdef>
+ <para>
+ Sets the number of threads BitBake uses when parsing.
+ By default, the number of threads is equal to the number
+ of cores on the system.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_ORIGENV'><glossterm>BB_ORIGENV</glossterm>
+ <glossdef>
+ <para>
+ Contains a copy of the original external environment in
+ which BitBake was run.
+ The copy is taken before any whitelisted variable values
+ are filtered into BitBake's datastore.
+ <note>
+                        This variable holds a datastore object that can
+                        be queried using the normal datastore
+                        operations.
+ </note>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_PRESERVE_ENV'><glossterm>BB_PRESERVE_ENV</glossterm>
+ <glossdef>
+ <para>
+ Disables whitelisting and instead allows all variables
+ through from the external environment into BitBake's
+ datastore.
+ <note>
+ You must set this variable in the external
+ environment in order for it to work.
+ </note>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_RUNFMT'><glossterm>BB_RUNFMT</glossterm>
+ <glossdef>
+ <para>
+ Specifies the name of the executable script files
+ (i.e. run files) saved into
+ <filename>${</filename><link linkend='var-T'><filename>T</filename></link><filename>}</filename>.
+ By default, the <filename>BB_RUNFMT</filename> variable
+ is undefined and the run file names get created using the
+ following form:
+ <literallayout class='monospaced'>
+ run.{task}.{pid}
+ </literallayout>
+ If you want to force run files to take a specific name,
+ you can set this variable in a configuration file.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_RUNTASK'><glossterm>BB_RUNTASK</glossterm>
+ <glossdef>
+ <para>
+ Contains the name of the currently executing task.
+ The value does not include the "do_" prefix.
+ For example, if the currently executing task is
+ <filename>do_config</filename>, the value is
+ "config".
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_SCHEDULER'><glossterm>BB_SCHEDULER</glossterm>
+ <glossdef>
+ <para>
+ Selects the name of the scheduler to use for the
+ scheduling of BitBake tasks.
+ Three options exist:
+ <itemizedlist>
+ <listitem><para><emphasis>basic</emphasis> -
+ The basic framework from which everything derives.
+ Using this option causes tasks to be ordered
+ numerically as they are parsed.
+ </para></listitem>
+ <listitem><para><emphasis>speed</emphasis> -
+ Executes tasks first that have more tasks
+ depending on them.
+ The "speed" option is the default.
+ </para></listitem>
+ <listitem><para><emphasis>completion</emphasis> -
+ Causes the scheduler to try to complete a given
+ recipe once its build has started.
+ </para></listitem>
+ </itemizedlist>
+ </para>
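+
+                <para>
+                    For example, to favor completing already started recipes
+                    over raw speed, you could set:
+                    <literallayout class='monospaced'>
+     BB_SCHEDULER = "completion"
+                    </literallayout>
+                </para>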
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_SCHEDULERS'><glossterm>BB_SCHEDULERS</glossterm>
+ <glossdef>
+ <para>
+ Defines custom schedulers to import.
+ Custom schedulers need to be derived from the
+ <filename>RunQueueScheduler</filename> class.
+ </para>
+
+ <para>
+ For information how to select a scheduler, see the
+ <link linkend='var-BB_SCHEDULER'><filename>BB_SCHEDULER</filename></link>
+ variable.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_SETSCENE_DEPVALID'><glossterm>BB_SETSCENE_DEPVALID</glossterm>
+ <glossdef>
+ <para>
+ Specifies a function BitBake calls that determines
+ whether BitBake requires a setscene dependency to be met.
+ </para>
+
+ <para>
+ When running a setscene task, BitBake needs to
+ know which dependencies of that setscene task also need
+ to be run.
+ Whether dependencies also need to be run is highly
+ dependent on the metadata.
+ The function specified by this variable returns a
+ "True" or "False" depending on whether the dependency needs
+ to be met.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_SETSCENE_VERIFY_FUNCTION'><glossterm>BB_SETSCENE_VERIFY_FUNCTION</glossterm>
+ <glossdef>
+ <para>
+ Specifies a function to call that verifies the list of
+ planned task execution before the main task execution
+ happens.
+ The function is called once BitBake has a list of setscene
+ tasks that have run and either succeeded or failed.
+ </para>
+
+ <para>
+                    The function allows for a check of the task list to see
+                    if the planned tasks make sense.
+ Even if BitBake was planning to skip a task, the
+ returned value of the function can force BitBake to run
+ the task, which is necessary under certain metadata
+ defined circumstances.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_SIGNATURE_EXCLUDE_FLAGS'><glossterm>BB_SIGNATURE_EXCLUDE_FLAGS</glossterm>
+ <glossdef>
+ <para>
+ Lists variable flags (varflags)
+ that can be safely excluded from checksum
+ and dependency data for keys in the datastore.
+ When generating checksum or dependency data for keys in the
+ datastore, the flags set against that key are normally
+ included in the checksum.
+ </para>
+
+ <para>
+ For more information on varflags, see the
+ "<link linkend='variable-flags'>Variable Flags</link>"
+ section.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_SIGNATURE_HANDLER'><glossterm>BB_SIGNATURE_HANDLER</glossterm>
+ <glossdef>
+ <para>
+ Defines the name of the signature handler BitBake uses.
+ The signature handler defines the way stamp files are
+ created and handled, if and how the signature is
+ incorporated into the stamps, and how the signature
+ itself is generated.
+ </para>
+
+ <para>
+ A new signature handler can be added by injecting a class
+ derived from the
+ <filename>SignatureGenerator</filename> class into the
+ global namespace.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_SRCREV_POLICY'><glossterm>BB_SRCREV_POLICY</glossterm>
+ <glossdef>
+ <para>
+ Defines the behavior of the fetcher when it interacts with
+ source control systems and dynamic source revisions.
+ The <filename>BB_SRCREV_POLICY</filename> variable is
+ useful when working without a network.
+ </para>
+
+ <para>
+ The variable can be set using one of two policies:
+ <itemizedlist>
+ <listitem><para><emphasis>cache</emphasis> -
+ Retains the value the system obtained previously
+ rather than querying the source control system
+ each time.
+ </para></listitem>
+ <listitem><para><emphasis>clear</emphasis> -
+                            Queries the source control system every time.
+ With this policy, there is no cache.
+ The "clear" policy is the default.
+ </para></listitem>
+ </itemizedlist>
+ </para>
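+
+                <para>
+                    For example, when working offline you could reuse
+                    previously obtained revisions by setting:
+                    <literallayout class='monospaced'>
+     BB_SRCREV_POLICY = "cache"
+                    </literallayout>
+                </para>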
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_STAMP_POLICY'><glossterm>BB_STAMP_POLICY</glossterm>
+ <glossdef>
+ <para>
+ Defines the mode used for how timestamps of stamp files
+ are compared.
+ You can set the variable to one of the following modes:
+ <itemizedlist>
+ <listitem><para><emphasis>perfile</emphasis> -
+ Timestamp comparisons are only made
+ between timestamps of a specific recipe.
+ This is the default mode.
+ </para></listitem>
+ <listitem><para><emphasis>full</emphasis> -
+ Timestamp comparisons are made for all
+ dependencies.
+ </para></listitem>
+ <listitem><para><emphasis>whitelist</emphasis> -
+ Identical to "full" mode except timestamp
+ comparisons are made for recipes listed in the
+ <link linkend='var-BB_STAMP_WHITELIST'><filename>BB_STAMP_WHITELIST</filename></link>
+ variable.
+ </para></listitem>
+ </itemizedlist>
+ <note>
+ Stamp policies are largely obsolete with the
+ introduction of setscene tasks.
+ </note>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_STAMP_WHITELIST'><glossterm>BB_STAMP_WHITELIST</glossterm>
+ <glossdef>
+ <para>
+ Lists files whose stamp file timestamps are compared when
+ the stamp policy mode is set to "whitelist".
+ For information on stamp policies, see the
+ <link linkend='var-BB_STAMP_POLICY'><filename>BB_STAMP_POLICY</filename></link>
+ variable.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_STRICT_CHECKSUM'><glossterm>BB_STRICT_CHECKSUM</glossterm>
+ <glossdef>
+ <para>
+ Sets a more strict checksum mechanism for non-local URLs.
+ Setting this variable to a value causes BitBake
+ to report an error if it encounters a non-local URL
+ that does not have at least one checksum specified.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_TASK_NICE_LEVEL'><glossterm>BB_TASK_NICE_LEVEL</glossterm>
+ <glossdef>
+ <para>
+ Allows specific tasks to change their priority
+ (i.e. nice level).
+ </para>
+
+ <para>
+ You can use this variable in combination with task
+ overrides to raise or lower priorities of specific tasks.
+ For example, on the
+ <ulink url='http://www.yoctoproject.org'>Yocto Project</ulink>
+ autobuilder, QEMU emulation in images is given a higher
+ priority as compared to build tasks to ensure that images
+ do not suffer timeouts on loaded systems.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_TASKHASH'><glossterm>BB_TASKHASH</glossterm>
+ <glossdef>
+ <para>
+ Within an executing task, this variable holds the hash
+ of the task as returned by the currently enabled
+ signature generator.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_VERBOSE_LOGS'><glossterm>BB_VERBOSE_LOGS</glossterm>
+ <glossdef>
+ <para>
+ Controls how verbose BitBake is during builds.
+ If set, shell scripts echo commands and shell script output
+ appears on standard out (stdout).
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BB_WORKERCONTEXT'><glossterm>BB_WORKERCONTEXT</glossterm>
+ <glossdef>
+ <para>
+ Specifies if the current context is executing a task.
+ BitBake sets this variable to "1" when a task is
+ being executed.
+ The value is not set when the task is in server context
+ during parsing or event handling.
+ </para>
+ </glossdef>
+ </glossentry>
+
+
+ <glossentry id='var-BBCLASSEXTEND'><glossterm>BBCLASSEXTEND</glossterm>
+ <glossdef>
+ <para>
+ Allows you to extend a recipe so that it builds variants
+ of the software.
+ Some examples of these variants for recipes from the
+ OpenEmbedded Core metadata are "natives" such as
+ <filename>quilt-native</filename>, which is a copy of
+ Quilt built to run on the build system; "crosses" such
+ as <filename>gcc-cross</filename>, which is a compiler
+ built to run on the build machine but produces binaries
+ that run on the target <filename>MACHINE</filename>;
+ "nativesdk", which targets the SDK machine instead of
+                <filename>MACHINE</filename>; and "multilibs" in the form
+ "<filename>multilib:</filename><replaceable>multilib_name</replaceable>".
+ </para>
+
+ <para>
+ To build a different variant of the recipe with a minimal
+ amount of code, it usually is as simple as adding the
+ variable to your recipe.
+ Here are two examples.
+ The "native" variants are from the OpenEmbedded Core
+ metadata:
+ <literallayout class='monospaced'>
+ BBCLASSEXTEND =+ "native nativesdk"
+ BBCLASSEXTEND =+ "multilib:<replaceable>multilib_name</replaceable>"
+ </literallayout>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BBDEBUG'><glossterm>BBDEBUG</glossterm>
+ <glossdef>
+ <para>
+ Sets the BitBake debug output level to a specific value
+ as incremented by the <filename>-d</filename> command line
+ option.
+ <note>
+ You must set this variable in the external environment
+ in order for it to work.
+ </note>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BBFILE_COLLECTIONS'><glossterm>BBFILE_COLLECTIONS</glossterm>
+ <glossdef>
+ <para>Lists the names of configured layers.
+ These names are used to find the other <filename>BBFILE_*</filename>
+ variables.
+ Typically, each layer appends its name to this variable in its
+ <filename>conf/layer.conf</filename> file.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BBFILE_PATTERN'><glossterm>BBFILE_PATTERN</glossterm>
+ <glossdef>
+ <para>Variable that expands to match files from
+ <link linkend='var-BBFILES'><filename>BBFILES</filename></link>
+ in a particular layer.
+ This variable is used in the <filename>conf/layer.conf</filename> file and must
+ be suffixed with the name of the specific layer (e.g.
+ <filename>BBFILE_PATTERN_emenlow</filename>).</para>
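+
+                <para>
+                    As a sketch, a <filename>conf/layer.conf</filename> file
+                    for a hypothetical layer named "mylayer" typically
+                    contains lines along these lines:
+                    <literallayout class='monospaced'>
+     BBFILE_COLLECTIONS += "mylayer"
+     BBFILE_PATTERN_mylayer = "^${LAYERDIR}/"
+     BBFILE_PRIORITY_mylayer = "5"
+                    </literallayout>
+                </para>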
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BBFILE_PRIORITY'><glossterm>BBFILE_PRIORITY</glossterm>
+ <glossdef>
+ <para>Assigns the priority for recipe files in each layer.</para>
+ <para>This variable is useful in situations where the same recipe appears in
+ more than one layer.
+ Setting this variable allows you to prioritize a
+ layer against other layers that contain the same recipe - effectively
+ letting you control the precedence for the multiple layers.
+ The precedence established through this variable stands regardless of a
+ recipe's version
+ (<link linkend='var-PV'><filename>PV</filename></link> variable).
+ For example, a layer that has a recipe with a higher <filename>PV</filename> value but for
+ which the <filename>BBFILE_PRIORITY</filename> is set to have a lower precedence still has a
+ lower precedence.</para>
+ <para>A larger value for the <filename>BBFILE_PRIORITY</filename> variable results in a higher
+ precedence.
+ For example, the value 6 has a higher precedence than the value 5.
+ If not specified, the <filename>BBFILE_PRIORITY</filename> variable is set based on layer
+ dependencies (see the
+ <filename><link linkend='var-LAYERDEPENDS'>LAYERDEPENDS</link></filename> variable for
+                more information).
+ The default priority, if unspecified
+ for a layer with no dependencies, is the lowest defined priority + 1
+ (or 1 if no priorities are defined).</para>
+ <tip>
+ You can use the command <filename>bitbake-layers show-layers</filename> to list
+ all configured layers along with their priorities.
+ </tip>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BBFILES'><glossterm>BBFILES</glossterm>
+ <glossdef>
+ <para>List of recipe files BitBake uses to build software.</para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BBINCLUDED'><glossterm>BBINCLUDED</glossterm>
+ <glossdef>
+ <para>
+                Contains a space-separated list of all of the files that
+ BitBake's parser included during parsing of the current
+ file.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BBINCLUDELOGS'><glossterm>BBINCLUDELOGS</glossterm>
+ <glossdef>
+ <para>
+ If set to a value, enables printing the task log when
+ reporting a failed task.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BBINCLUDELOGS_LINES'><glossterm>BBINCLUDELOGS_LINES</glossterm>
+ <glossdef>
+ <para>
+ If
+ <link linkend='var-BBINCLUDELOGS'><filename>BBINCLUDELOGS</filename></link>
+ is set, specifies the maximum number of lines from the
+ task log file to print when reporting a failed task.
+ If you do not set <filename>BBINCLUDELOGS_LINES</filename>,
+ the entire log is printed.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BBLAYERS'><glossterm>BBLAYERS</glossterm>
+ <glossdef>
+ <para>Lists the layers to enable during the build.
+ This variable is defined in the <filename>bblayers.conf</filename> configuration
+ file in the build directory.
+ Here is an example:
+ <literallayout class='monospaced'>
+ BBLAYERS = " \
+ /home/scottrif/poky/meta \
+ /home/scottrif/poky/meta-yocto \
+ /home/scottrif/poky/meta-yocto-bsp \
+ /home/scottrif/poky/meta-mykernel \
+ "
+
+ </literallayout>
+ This example enables four layers, one of which is a custom, user-defined layer
+ named <filename>meta-mykernel</filename>.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BBLAYERS_FETCH_DIR'><glossterm>BBLAYERS_FETCH_DIR</glossterm>
+ <glossdef>
+ <para>
+ Sets the base location where layers are stored.
+ By default, this location is set to
+ <filename>${COREBASE}</filename>.
+ This setting is used in conjunction with
+ <filename>bitbake-layers layerindex-fetch</filename> and
+ tells <filename>bitbake-layers</filename> where to place
+ the fetched layers.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BBMASK'><glossterm>BBMASK</glossterm>
+ <glossdef>
+ <para>
+ Prevents BitBake from processing recipes and recipe
+ append files.
+ </para>
+
+ <para>
+ You can use the <filename>BBMASK</filename> variable
+ to "hide" these <filename>.bb</filename> and
+ <filename>.bbappend</filename> files.
+ BitBake ignores any recipe or recipe append files that
+ match the expression.
+ It is as if BitBake does not see them at all.
+ Consequently, matching files are not parsed or otherwise
+ used by BitBake.</para>
+ <para>
+ The value you provide is passed to Python's regular
+ expression compiler.
+ The expression is compared against the full paths to
+ the files.
+ For complete syntax information, see Python's
+ documentation at
+ <ulink url='http://docs.python.org/release/2.3/lib/re-syntax.html'></ulink>.
+ </para>
+
+ <para>
+ The following example uses a complete regular expression
+ to tell BitBake to ignore all recipe and recipe append
+ files in the <filename>meta-ti/recipes-misc/</filename>
+ directory:
+ <literallayout class='monospaced'>
+ BBMASK = "meta-ti/recipes-misc/"
+ </literallayout>
+ If you want to mask out multiple directories or recipes,
+ use the vertical bar to separate the regular expression
+ fragments.
+ This next example masks out multiple directories and
+ individual recipes:
+ <literallayout class='monospaced'>
+ BBMASK = "meta-ti/recipes-misc/|meta-ti/recipes-ti/packagegroup/"
+ BBMASK .= "|.*meta-oe/recipes-support/"
+ BBMASK .= "|.*openldap"
+ BBMASK .= "|.*opencv"
+ BBMASK .= "|.*lzma"
+ </literallayout>
+ Notice how the vertical bar is used to append the fragments.
+ <note>
+ When specifying a directory name, use the trailing
+ slash character to ensure you match just that directory
+ name.
+ </note>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BBPATH'><glossterm>BBPATH</glossterm>
+ <glossdef>
+ <para>
+ Used by BitBake to locate class
+ (<filename>.bbclass</filename>) and configuration
+ (<filename>.conf</filename>) files.
+ This variable is analogous to the
+ <filename>PATH</filename> variable.
+ </para>
+
+ <para>
+ If you run BitBake from a directory outside of the
+ build directory,
+ you must be sure to set
+ <filename>BBPATH</filename> to point to the
+ build directory.
+ Set the variable as you would any environment variable
+ and then run BitBake:
+ <literallayout class='monospaced'>
+ $ BBPATH="<replaceable>build_directory</replaceable>"
+ $ export BBPATH
+ $ bitbake <replaceable>target</replaceable>
+ </literallayout>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BBSERVER'><glossterm>BBSERVER</glossterm>
+ <glossdef>
+ <para>
+ Points to the server that runs memory-resident BitBake.
+ The variable is only used when you employ memory-resident
+ BitBake.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BBVERSIONS'><glossterm>BBVERSIONS</glossterm>
+ <glossdef>
+ <para>
+ Allows a single recipe to build multiple versions of a
+ project from a single recipe file.
+                You are also able to specify conditional metadata
+ using the
+ <link linkend='var-OVERRIDES'><filename>OVERRIDES</filename></link>
+ mechanism for a single version or for an optionally named
+ range of versions.
+ </para>
+
+ <para>
+ For more information on <filename>BBVERSIONS</filename>,
+ see the
+ "<link linkend='variants-class-extension-mechanism'>Variants - Class Extension Mechanism</link>"
+ section.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BITBAKE_UI'><glossterm>BITBAKE_UI</glossterm>
+ <glossdef>
+ <para>
+ Used to specify the UI module to use when running BitBake.
+ Using this variable is equivalent to using the
+ <filename>-u</filename> command-line option.
+ <note>
+ You must set this variable in the external environment
+ in order for it to work.
+ </note>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BUILDNAME'><glossterm>BUILDNAME</glossterm>
+ <glossdef>
+ <para>
+ A name assigned to the build.
+ The name defaults to a datetime stamp of when the build was
+ started but can be defined by the metadata.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-BZRDIR'><glossterm>BZRDIR</glossterm>
+ <glossdef>
+ <para>
+ The directory in which files checked out of a Bazaar
+ system are stored.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ </glossdiv>
+
+ <glossdiv id='var-glossary-c'><title>C</title>
+
+ <glossentry id='var-CACHE'><glossterm>CACHE</glossterm>
+ <glossdef>
+ <para>
+ Specifies the directory BitBake uses to store a cache
+ of the metadata so it does not need to be parsed every
+ time BitBake is started.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-CVSDIR'><glossterm>CVSDIR</glossterm>
+ <glossdef>
+ <para>
+ The directory in which files checked out under the
+ CVS system are stored.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ </glossdiv>
+
+ <glossdiv id='var-glossary-d'><title>D</title>
+
+ <glossentry id='var-DEFAULT_PREFERENCE'><glossterm>DEFAULT_PREFERENCE</glossterm>
+ <glossdef>
+ <para>
+ Specifies a weak bias for recipe selection priority.
+ </para>
+ <para>
+                The most common usage of this variable is to set
+ it to "-1" within a recipe for a development version of a
+ piece of software.
+ Using the variable in this way causes the stable version
+ of the recipe to build by default in the absence of
+ <filename><link linkend='var-PREFERRED_VERSION'>PREFERRED_VERSION</link></filename>
+ being used to build the development version.
+ </para>
+ <note>
+ The bias provided by <filename>DEFAULT_PREFERENCE</filename>
+ is weak and is overridden by
+ <filename><link linkend='var-BBFILE_PRIORITY'>BBFILE_PRIORITY</link></filename>
+ if that variable is different between two layers
+ that contain different versions of the same recipe.
+ </note>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-DEPENDS'><glossterm>DEPENDS</glossterm>
+ <glossdef>
+ <para>
+ Lists a recipe's build-time dependencies
+ (i.e. other recipe files).
+ </para>
+
+ <para>
+ Consider this simple example for two recipes named "a" and
+ "b" that produce similarly named packages.
+ In this example, the <filename>DEPENDS</filename>
+ statement appears in the "a" recipe:
+ <literallayout class='monospaced'>
+ DEPENDS = "b"
+ </literallayout>
+ Here, the dependency is such that the
+ <filename>do_configure</filename> task for recipe "a"
+ depends on the <filename>do_populate_sysroot</filename>
+ task of recipe "b".
+ This means anything that recipe "b" puts into sysroot
+ is available when recipe "a" is configuring itself.
+ </para>
+
+ <para>
+ For information on runtime dependencies, see the
+ <link linkend='var-RDEPENDS'><filename>RDEPENDS</filename></link>
+ variable.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-DESCRIPTION'><glossterm>DESCRIPTION</glossterm>
+ <glossdef>
+ <para>
+ A long description for the recipe.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-DL_DIR'><glossterm>DL_DIR</glossterm>
+ <glossdef>
+ <para>
+ The central download directory used by the build process to
+ store downloads.
+ By default, <filename>DL_DIR</filename> gets files
+ suitable for mirroring for everything except Git
+ repositories.
+ If you want tarballs of Git repositories, use the
+ <link linkend='var-BB_GENERATE_MIRROR_TARBALLS'><filename>BB_GENERATE_MIRROR_TARBALLS</filename></link>
+ variable.
+ </para>
+ </glossdef>
+
+ </glossentry>
+ </glossdiv>
+
+ <glossdiv id='var-glossary-e'><title>E</title>
+
+ <glossentry id='var-EXCLUDE_FROM_WORLD'><glossterm>EXCLUDE_FROM_WORLD</glossterm>
+ <glossdef>
+ <para>
+ Directs BitBake to exclude a recipe from world builds (i.e.
+ <filename>bitbake world</filename>).
+ During world builds, BitBake locates, parses and builds all
+ recipes found in every layer exposed in the
+ <filename>bblayers.conf</filename> configuration file.
+ </para>
+
+ <para>
+ To exclude a recipe from a world build using this variable,
+ set the variable to "1" in the recipe.
+ </para>
+
+ <note>
+ Recipes added to <filename>EXCLUDE_FROM_WORLD</filename>
+ may still be built during a world build in order to satisfy
+ dependencies of other recipes.
+ Adding a recipe to <filename>EXCLUDE_FROM_WORLD</filename>
+ only ensures that the recipe is not explicitly added
+ to the list of build targets in a world build.
+ </note>
+ </glossdef>
+ </glossentry>
+
+ </glossdiv>
+
+ <glossdiv id='var-glossary-f'><title>F</title>
+
+ <glossentry id='var-FAKEROOT'><glossterm>FAKEROOT</glossterm>
+ <glossdef>
+ <para>
+ Contains the command to use when running a shell script
+ in a fakeroot environment.
+ The <filename>FAKEROOT</filename> variable is obsolete
+ and has been replaced by the other
+ <filename>FAKEROOT*</filename> variables.
+ See these entries in the glossary for more information.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-FAKEROOTBASEENV'><glossterm>FAKEROOTBASEENV</glossterm>
+ <glossdef>
+ <para>
+ Lists environment variables to set when executing
+ the command defined by
+ <link linkend='var-FAKEROOTCMD'><filename>FAKEROOTCMD</filename></link>
+ that starts the bitbake-worker process
+ in the fakeroot environment.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-FAKEROOTCMD'><glossterm>FAKEROOTCMD</glossterm>
+ <glossdef>
+ <para>
+ Contains the command that starts the bitbake-worker
+ process in the fakeroot environment.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-FAKEROOTDIRS'><glossterm>FAKEROOTDIRS</glossterm>
+ <glossdef>
+ <para>
+ Lists directories to create before running a task in
+ the fakeroot environment.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-FAKEROOTENV'><glossterm>FAKEROOTENV</glossterm>
+ <glossdef>
+ <para>
+ Lists environment variables to set when running a task
+ in the fakeroot environment.
+ For additional information on environment variables and
+ the fakeroot environment, see the
+ <link linkend='var-FAKEROOTBASEENV'><filename>FAKEROOTBASEENV</filename></link>
+ variable.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-FAKEROOTNOENV'><glossterm>FAKEROOTNOENV</glossterm>
+ <glossdef>
+ <para>
+ Lists environment variables to set when running a task
+ that is not in the fakeroot environment.
+ For additional information on environment variables and
+ the fakeroot environment, see the
+ <link linkend='var-FAKEROOTENV'><filename>FAKEROOTENV</filename></link>
+ variable.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-FETCHCMD'><glossterm>FETCHCMD</glossterm>
+ <glossdef>
+ <para>
+ Defines the command the BitBake fetcher module
+ executes when running fetch operations.
+ You need to use an override suffix when you use the
+ variable (e.g. <filename>FETCHCMD_git</filename>
+ or <filename>FETCHCMD_svn</filename>).
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-FILE'><glossterm>FILE</glossterm>
+ <glossdef>
+ <para>
+ Points at the current file.
+ BitBake sets this variable during the parsing process
+ to identify the file being parsed.
+ BitBake also sets this variable when a recipe is being
+ executed to identify the recipe file.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-FILESDIR'><glossterm>FILESDIR</glossterm>
+ <glossdef>
+ <para>
+ Specifies directories BitBake uses when searching for
+ patches and files.
+ The "local" fetcher module uses these directories when
+ handling <filename>file://</filename> URLs if the file
+ was not found using
+ <link linkend='var-FILESPATH'><filename>FILESPATH</filename></link>.
+ <note>
+ The <filename>FILESDIR</filename> variable is
+ deprecated and you should use
+ <filename>FILESPATH</filename> in all new code.
+ </note>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-FILESPATH'><glossterm>FILESPATH</glossterm>
+ <glossdef>
+ <para>
+ Specifies directories BitBake uses when searching for
+ patches and files.
+ The "local" fetcher module uses these directories when
+ handling <filename>file://</filename> URLs.
+ The variable behaves like a shell <filename>PATH</filename>
+ environment variable.
+ The value is a colon-separated list of directories that
+ are searched left-to-right in order.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ </glossdiv>
+
+
+ <glossdiv id='var-glossary-g'><title>G</title>
+
+ <glossentry id='var-GITDIR'><glossterm>GITDIR</glossterm>
+ <glossdef>
+ <para>
+ The directory in which a local copy of a Git repository
+ is stored when it is cloned.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ </glossdiv>
+
+
+ <glossdiv id='var-glossary-h'><title>H</title>
+
+ <glossentry id='var-HGDIR'><glossterm>HGDIR</glossterm>
+ <glossdef>
+ <para>
+ The directory in which files checked out of a Mercurial
+ system are stored.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-HOMEPAGE'><glossterm>HOMEPAGE</glossterm>
+ <glossdef>
+ <para>Website where more information about the software the recipe is building
+ can be found.</para>
+ </glossdef>
+ </glossentry>
+
+ </glossdiv>
+
+ <glossdiv id='var-glossary-i'><title>I</title>
+
+ <glossentry id='var-INHERIT'><glossterm>INHERIT</glossterm>
+ <glossdef>
+ <para>
+ Causes the named class to be inherited at
+ this point during parsing.
+ The variable is only valid in configuration files.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ </glossdiv>
+
+<!--
+ <glossdiv id='var-glossary-j'><title>J</title>
+ </glossdiv>
+
+ <glossdiv id='var-glossary-k'><title>K</title>
+ </glossdiv>
+-->
+
+ <glossdiv id='var-glossary-l'><title>L</title>
+
+ <glossentry id='var-LAYERDEPENDS'><glossterm>LAYERDEPENDS</glossterm>
+ <glossdef>
+            <para>Lists the layers, separated by spaces, upon which this layer depends.
+                Optionally, you can specify a specific layer version for a dependency
+                by adding it to the end of the layer name with a colon (e.g. "anotherlayer:3"
+                to be compared against
+                <link linkend='var-LAYERVERSION'><filename>LAYERVERSION</filename></link><filename>_anotherlayer</filename>
+                in this case).
+ BitBake produces an error if any dependency is missing or
+ the version numbers do not match exactly (if specified).</para>
+ <para>
+ You use this variable in the <filename>conf/layer.conf</filename> file.
+ You must also use the specific layer name as a suffix
+ to the variable (e.g. <filename>LAYERDEPENDS_mylayer</filename>).</para>
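+
+                <para>
+                    For example, the following hypothetical
+                    <filename>conf/layer.conf</filename> line declares that
+                    "mylayer" depends on version 3 of "anotherlayer":
+                    <literallayout class='monospaced'>
+     LAYERDEPENDS_mylayer = "anotherlayer:3"
+                    </literallayout>
+                </para>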
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-LAYERDIR'><glossterm>LAYERDIR</glossterm>
+ <glossdef>
+ <para>When used inside the <filename>layer.conf</filename> configuration
+ file, this variable provides the path of the current layer.
+ This variable is not available outside of <filename>layer.conf</filename>
+ and references are expanded immediately when parsing of the file completes.</para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-LAYERVERSION'><glossterm>LAYERVERSION</glossterm>
+ <glossdef>
+ <para>Optionally specifies the version of a layer as a single number.
+ You can use this variable within
+ <link linkend='var-LAYERDEPENDS'><filename>LAYERDEPENDS</filename></link>
+ for another layer in order to depend on a specific version
+ of the layer.</para>
+ <para>
+ You use this variable in the <filename>conf/layer.conf</filename> file.
+ You must also use the specific layer name as a suffix
+                    to the variable (e.g. <filename>LAYERVERSION_mylayer</filename>).</para>
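+                <para>
+                    For example, the hypothetical layer "mylayer" could
+                    declare its version in
+                    <filename>conf/layer.conf</filename> as follows:
+                    <literallayout class='monospaced'>
+     LAYERVERSION_mylayer = "3"
+                    </literallayout>
+                </para>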
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-LICENSE'><glossterm>LICENSE</glossterm>
+ <glossdef>
+ <para>
+ The list of source licenses for the recipe.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ </glossdiv>
+
+ <glossdiv id='var-glossary-m'><title>M</title>
+
+ <glossentry id='var-MIRRORS'><glossterm>MIRRORS</glossterm>
+ <glossdef>
+ <para>
+ Specifies additional paths from which BitBake gets source code.
+ When the build system searches for source code, it first
+ tries the local download directory.
+ If that location fails, the build system tries locations
+ defined by
+ <link linkend='var-PREMIRRORS'><filename>PREMIRRORS</filename></link>,
+ the upstream source, and then locations specified by
+ <filename>MIRRORS</filename> in that order.
+ </para>
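+            <para>
+                <filename>MIRRORS</filename> entries use the same
+                regular-expression form shown for
+                <link linkend='var-PREMIRRORS'><filename>PREMIRRORS</filename></link>.
+                For example, the following illustrative entry sends any
+                failed HTTP fetch to a hypothetical fallback server:
+                <literallayout class='monospaced'>
+     MIRRORS += "http://.*/.* http://mirror.example.com/sources/ \n"
+                </literallayout>
+            </para>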
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-MULTI_PROVIDER_WHITELIST'><glossterm>MULTI_PROVIDER_WHITELIST</glossterm>
+ <glossdef>
+ <para>
+ Allows you to suppress BitBake warnings caused when
+ building two separate recipes that provide the same
+ output.
+ </para>
+
+ <para>
+                BitBake normally issues a warning when building two
+ different recipes where each provides the same output.
+ This scenario is usually something the user does not
+ want.
+ However, cases do exist where it makes sense, particularly
+ in the <filename>virtual/*</filename> namespace.
+ You can use this variable to suppress BitBake's warnings.
+ </para>
+
+ <para>
+ To use the variable, list provider names (e.g.
+ recipe names, <filename>virtual/kernel</filename>,
+ and so forth).
+ </para>
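+            <para>
+                For example, the following illustrative setting suppresses
+                the warning for multiple providers of the kernel and of
+                <filename>virtual/libgl</filename>:
+                <literallayout class='monospaced'>
+     MULTI_PROVIDER_WHITELIST += "virtual/kernel virtual/libgl"
+                </literallayout>
+            </para>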
+ </glossdef>
+ </glossentry>
+
+ </glossdiv>
+
+<!--
+ <glossdiv id='var-glossary-n'><title>N</title>
+ </glossdiv>
+-->
+
+ <glossdiv id='var-glossary-o'><title>O</title>
+
+ <glossentry id='var-OVERRIDES'><glossterm>OVERRIDES</glossterm>
+ <glossdef>
+ <para>
+ BitBake uses <filename>OVERRIDES</filename> to control
+ what variables are overridden after BitBake parses
+ recipes and configuration files.
+ </para>
+
+ <para>
+ Following is a simple example that uses an overrides
+ list based on machine architectures:
+ <literallayout class='monospaced'>
+ OVERRIDES = "arm:x86:mips:powerpc"
+ </literallayout>
+ You can find information on how to use
+ <filename>OVERRIDES</filename> in the
+ "<link linkend='conditional-syntax-overrides'>Conditional Syntax (Overrides)</link>"
+ section.
+ </para>
+ </glossdef>
+ </glossentry>
+ </glossdiv>
+
+ <glossdiv id='var-glossary-p'><title>P</title>
+
+ <glossentry id='var-PACKAGES'><glossterm>PACKAGES</glossterm>
+ <glossdef>
+ <para>The list of packages the recipe creates.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-PACKAGES_DYNAMIC'><glossterm>PACKAGES_DYNAMIC</glossterm>
+ <glossdef>
+ <para>
+ A promise that your recipe satisfies runtime dependencies
+ for optional modules that are found in other recipes.
+ <filename>PACKAGES_DYNAMIC</filename>
+                does not actually satisfy the dependencies; it only states that
+ they should be satisfied.
+ For example, if a hard, runtime dependency
+ (<link linkend='var-RDEPENDS'><filename>RDEPENDS</filename></link>)
+ of another package is satisfied during the build
+ through the <filename>PACKAGES_DYNAMIC</filename>
+ variable, but a package with the module name is never actually
+ produced, then the other package will be broken.
+ </para>
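+            <para>
+                A minimal sketch, assuming a recipe that generates optional
+                plugin packages whose names are not known in advance, might
+                be the following, where the pattern is purely illustrative:
+                <literallayout class='monospaced'>
+     PACKAGES_DYNAMIC = "^${PN}-plugin-.*"
+                </literallayout>
+            </para>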
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-PE'><glossterm>PE</glossterm>
+ <glossdef>
+ <para>
+ The epoch of the recipe.
+ By default, this variable is unset.
+ The variable is used to make upgrades possible when the
+ versioning scheme changes in some backwards incompatible
+ way.
+ </para>
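+            <para>
+                For example, setting an epoch is a simple assignment in the
+                recipe:
+                <literallayout class='monospaced'>
+     PE = "1"
+                </literallayout>
+            </para>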
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-PERSISTENT_DIR'><glossterm>PERSISTENT_DIR</glossterm>
+ <glossdef>
+ <para>
+ Specifies the directory BitBake uses to store data that
+ should be preserved between builds.
+ In particular, the data stored is the data that uses
+ BitBake's persistent data API and the data used by the
+ PR Server and PR Service.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-PF'><glossterm>PF</glossterm>
+ <glossdef>
+ <para>
+ Specifies the recipe or package name and includes all version and revision
+                numbers (e.g. <filename>eglibc-2.13-r20+svnr15508/</filename> and
+ <filename>bash-4.2-r1/</filename>).
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-PN'><glossterm>PN</glossterm>
+ <glossdef>
+ <para>The recipe name.</para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-PR'><glossterm>PR</glossterm>
+ <glossdef>
+ <para>The revision of the recipe.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-PREFERRED_PROVIDER'><glossterm>PREFERRED_PROVIDER</glossterm>
+ <glossdef>
+ <para>
+ Determines which recipe should be given preference when
+ multiple recipes provide the same item.
+ You should always suffix the variable with the name of the
+ provided item, and you should set it to the
+ <link linkend='var-PN'><filename>PN</filename></link>
+ of the recipe to which you want to give precedence.
+ Some examples:
+ <literallayout class='monospaced'>
+ PREFERRED_PROVIDER_virtual/kernel ?= "linux-yocto"
+ PREFERRED_PROVIDER_virtual/xserver = "xserver-xf86"
+ PREFERRED_PROVIDER_virtual/libgl ?= "mesa"
+ </literallayout>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-PREFERRED_PROVIDERS'><glossterm>PREFERRED_PROVIDERS</glossterm>
+ <glossdef>
+ <para>
+ Determines which recipe should be given preference for
+ cases where multiple recipes provide the same item.
+ Functionally,
+ <filename>PREFERRED_PROVIDERS</filename> is identical to
+ <link linkend='var-PREFERRED_PROVIDER'><filename>PREFERRED_PROVIDER</filename></link>.
+ However, the <filename>PREFERRED_PROVIDERS</filename>
+ variable lets you define preferences for multiple
+ situations using the following form:
+ <literallayout class='monospaced'>
+ PREFERRED_PROVIDERS = "xxx:yyy aaa:bbb ..."
+ </literallayout>
+ This form is a convenient replacement for the following:
+ <literallayout class='monospaced'>
+ PREFERRED_PROVIDER_xxx = "yyy"
+ PREFERRED_PROVIDER_aaa = "bbb"
+ </literallayout>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-PREFERRED_VERSION'><glossterm>PREFERRED_VERSION</glossterm>
+ <glossdef>
+ <para>
+ If there are multiple versions of recipes available, this
+ variable determines which recipe should be given preference.
+ You must always suffix the variable with the
+ <link linkend='var-PN'><filename>PN</filename></link>
+ you want to select, and you should set
+ <link linkend='var-PV'><filename>PV</filename></link>
+ accordingly for precedence.
+ You can use the "<filename>%</filename>" character as a
+ wildcard to match any number of characters, which can be
+ useful when specifying versions that contain long revision
+ numbers that could potentially change.
+ Here are two examples:
+ <literallayout class='monospaced'>
+ PREFERRED_VERSION_python = "2.7.3"
+ PREFERRED_VERSION_linux-yocto = "3.10%"
+ </literallayout>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-PREMIRRORS'><glossterm>PREMIRRORS</glossterm>
+ <glossdef>
+ <para>
+ Specifies additional paths from which BitBake gets source code.
+ When the build system searches for source code, it first
+ tries the local download directory.
+ If that location fails, the build system tries locations
+ defined by <filename>PREMIRRORS</filename>, the upstream
+ source, and then locations specified by
+ <link linkend='var-MIRRORS'><filename>MIRRORS</filename></link>
+ in that order.
+ </para>
+
+ <para>
+ Typically, you would add a specific server for the
+ build system to attempt before any others by adding
+ something like the following to your configuration:
+ <literallayout class='monospaced'>
+ PREMIRRORS_prepend = "\
+ git://.*/.* http://www.yoctoproject.org/sources/ \n \
+ ftp://.*/.* http://www.yoctoproject.org/sources/ \n \
+ http://.*/.* http://www.yoctoproject.org/sources/ \n \
+ https://.*/.* http://www.yoctoproject.org/sources/ \n"
+ </literallayout>
+ These changes cause the build system to intercept
+ Git, FTP, HTTP, and HTTPS requests and direct them to
+ the <filename>http://</filename> sources mirror.
+ You can use <filename>file://</filename> URLs to point
+ to local directories or network shares as well.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-PROVIDES'><glossterm>PROVIDES</glossterm>
+ <glossdef>
+ <para>
+ A list of aliases by which a particular recipe can be
+ known.
+ By default, a recipe's own
+ <filename><link linkend='var-PN'>PN</link></filename>
+ is implicitly already in its <filename>PROVIDES</filename>
+ list.
+ If a recipe uses <filename>PROVIDES</filename>, the
+ additional aliases are synonyms for the recipe and can
+                be useful for satisfying dependencies of other recipes during
+ the build as specified by
+ <filename><link linkend='var-DEPENDS'>DEPENDS</link></filename>.
+ </para>
+
+ <para>
+ Consider the following example
+ <filename>PROVIDES</filename> statement from a recipe
+ file <filename>libav_0.8.11.bb</filename>:
+ <literallayout class='monospaced'>
+ PROVIDES += "libpostproc"
+ </literallayout>
+ The <filename>PROVIDES</filename> statement results in
+ the "libav" recipe also being known as "libpostproc".
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-PRSERV_HOST'><glossterm>PRSERV_HOST</glossterm>
+ <glossdef>
+ <para>
+ The network based
+ <link linkend='var-PR'><filename>PR</filename></link>
+ service host and port.
+ </para>
+
+ <para>
+ Following is an example of how the <filename>PRSERV_HOST</filename> variable is
+ set:
+ <literallayout class='monospaced'>
+ PRSERV_HOST = "localhost:0"
+ </literallayout>
+ You must set the variable if you want to automatically
+ start a local PR service.
+ You can set <filename>PRSERV_HOST</filename> to other
+ values to use a remote PR service.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-PV'><glossterm>PV</glossterm>
+ <glossdef>
+ <para>The version of the recipe.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ </glossdiv>
+
+<!--
+ <glossdiv id='var-glossary-q'><title>Q</title>
+ </glossdiv>
+-->
+
+ <glossdiv id='var-glossary-r'><title>R</title>
+
+ <glossentry id='var-RDEPENDS'><glossterm>RDEPENDS</glossterm>
+ <glossdef>
+ <para>
+ Lists a package's runtime dependencies (i.e. other packages)
+ that must be installed in order for the built package to run
+ correctly.
+ If a package in this list cannot be found during the build,
+ you will get a build error.
+ </para>
+
+ <para>
+ Because the <filename>RDEPENDS</filename> variable applies
+ to packages being built, you should always use the variable
+ in a form with an attached package name.
+ For example, suppose you are building a development package
+ that depends on the <filename>perl</filename> package.
+ In this case, you would use the following
+ <filename>RDEPENDS</filename> statement:
+ <literallayout class='monospaced'>
+ RDEPENDS_${PN}-dev += "perl"
+ </literallayout>
+ In the example, the development package depends on
+ the <filename>perl</filename> package.
+ Thus, the <filename>RDEPENDS</filename> variable has the
+ <filename>${PN}-dev</filename> package name as part of the
+ variable.
+ </para>
+
+ <para>
+ BitBake supports specifying versioned dependencies.
+ Although the syntax varies depending on the packaging
+ format, BitBake hides these differences from you.
+ Here is the general syntax to specify versions with
+ the <filename>RDEPENDS</filename> variable:
+ <literallayout class='monospaced'>
+ RDEPENDS_${PN} = "<replaceable>package</replaceable> (<replaceable>operator</replaceable> <replaceable>version</replaceable>)"
+ </literallayout>
+ For <filename>operator</filename>, you can specify the
+ following:
+ <literallayout class='monospaced'>
+ =
+ <
+ >
+ <=
+ >=
+ </literallayout>
+ For example, the following sets up a dependency on version
+ 1.2 or greater of the package <filename>foo</filename>:
+ <literallayout class='monospaced'>
+ RDEPENDS_${PN} = "foo (>= 1.2)"
+ </literallayout>
+ </para>
+
+ <para>
+ For information on build-time dependencies, see the
+ <link linkend='var-DEPENDS'><filename>DEPENDS</filename></link>
+ variable.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-RPROVIDES'><glossterm>RPROVIDES</glossterm>
+ <glossdef>
+ <para>
+ A list of package name aliases that a package also provides.
+ These aliases are useful for satisfying runtime dependencies
+ of other packages both during the build and on the target
+ (as specified by
+ <filename><link linkend='var-RDEPENDS'>RDEPENDS</link></filename>).
+ </para>
+ <para>
+ As with all package-controlling variables, you must always
+ use the variable in conjunction with a package name override.
+ Here is an example:
+ <literallayout class='monospaced'>
+ RPROVIDES_${PN} = "widget-abi-2"
+ </literallayout>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-RRECOMMENDS'><glossterm>RRECOMMENDS</glossterm>
+ <glossdef>
+ <para>
+ A list of packages that extends the usability of a package
+ being built.
+ The package being built does not depend on this list of
+ packages in order to successfully build, but needs them for
+ the extended usability.
+ To specify runtime dependencies for packages, see the
+ <filename><link linkend='var-RDEPENDS'>RDEPENDS</link></filename>
+ variable.
+ </para>
+
+ <para>
+ BitBake supports specifying versioned recommends.
+ Although the syntax varies depending on the packaging
+ format, BitBake hides these differences from you.
+ Here is the general syntax to specify versions with
+ the <filename>RRECOMMENDS</filename> variable:
+ <literallayout class='monospaced'>
+ RRECOMMENDS_${PN} = "<replaceable>package</replaceable> (<replaceable>operator</replaceable> <replaceable>version</replaceable>)"
+ </literallayout>
+ For <filename>operator</filename>, you can specify the
+ following:
+ <literallayout class='monospaced'>
+ =
+ <
+ >
+ <=
+ >=
+ </literallayout>
+ For example, the following sets up a recommend on version
+ 1.2 or greater of the package <filename>foo</filename>:
+ <literallayout class='monospaced'>
+ RRECOMMENDS_${PN} = "foo (>= 1.2)"
+ </literallayout>
+ </para>
+ </glossdef>
+ </glossentry>
+
+ </glossdiv>
+
+ <glossdiv id='var-glossary-s'><title>S</title>
+
+ <glossentry id='var-SECTION'><glossterm>SECTION</glossterm>
+ <glossdef>
+ <para>The section in which packages should be categorized.</para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-SRC_URI'><glossterm>SRC_URI</glossterm>
+ <glossdef>
+ <para>
+ The list of source files - local or remote.
+ This variable tells BitBake which bits
+ to pull for the build and how to pull them.
+ For example, if the recipe or append file needs to
+ fetch a single tarball from the Internet, the recipe or
+ append file uses a <filename>SRC_URI</filename>
+ entry that specifies that tarball.
+ On the other hand, if the recipe or append file needs to
+ fetch a tarball and include a custom file, the recipe or
+ append file needs an <filename>SRC_URI</filename> variable
+ that specifies all those sources.</para>
+ <para>The following list explains the available URI protocols:
+ <itemizedlist>
+ <listitem><para><emphasis><filename>file://</filename> -</emphasis>
+ Fetches files, which are usually files shipped with
+ the metadata,
+ from the local machine.
+ The path is relative to the
+ <link linkend='var-FILESPATH'><filename>FILESPATH</filename></link>
+ variable.</para></listitem>
+ <listitem><para><emphasis><filename>bzr://</filename> -</emphasis> Fetches files from a
+ Bazaar revision control repository.</para></listitem>
+ <listitem><para><emphasis><filename>git://</filename> -</emphasis> Fetches files from a
+ Git revision control repository.</para></listitem>
+ <listitem><para><emphasis><filename>osc://</filename> -</emphasis> Fetches files from
+ an OSC (OpenSUSE Build service) revision control repository.</para></listitem>
+ <listitem><para><emphasis><filename>repo://</filename> -</emphasis> Fetches files from
+ a repo (Git) repository.</para></listitem>
+ <listitem><para><emphasis><filename>http://</filename> -</emphasis> Fetches files from
+ the Internet using HTTP.</para></listitem>
+ <listitem><para><emphasis><filename>https://</filename> -</emphasis> Fetches files
+ from the Internet using HTTPS.</para></listitem>
+ <listitem><para><emphasis><filename>ftp://</filename> -</emphasis> Fetches files
+ from the Internet using FTP.</para></listitem>
+ <listitem><para><emphasis><filename>cvs://</filename> -</emphasis> Fetches files from
+ a CVS revision control repository.</para></listitem>
+ <listitem><para><emphasis><filename>hg://</filename> -</emphasis> Fetches files from
+ a Mercurial (<filename>hg</filename>) revision control repository.</para></listitem>
+ <listitem><para><emphasis><filename>p4://</filename> -</emphasis> Fetches files from
+ a Perforce (<filename>p4</filename>) revision control repository.</para></listitem>
+ <listitem><para><emphasis><filename>ssh://</filename> -</emphasis> Fetches files from
+ a secure shell.</para></listitem>
+ <listitem><para><emphasis><filename>svn://</filename> -</emphasis> Fetches files from
+ a Subversion (<filename>svn</filename>) revision control repository.</para></listitem>
+ </itemizedlist>
+ </para>
+ <para>Here are some additional options worth mentioning:
+ <itemizedlist>
+ <listitem><para><emphasis><filename>unpack</filename> -</emphasis> Controls
+ whether or not to unpack the file if it is an archive.
+ The default action is to unpack the file.</para></listitem>
+ <listitem><para><emphasis><filename>subdir</filename> -</emphasis> Places the file
+ (or extracts its contents) into the specified
+ subdirectory.
+ This option is useful for unusual tarballs or other archives that
+ do not have their files already in a subdirectory within the archive.
+ </para></listitem>
+ <listitem><para><emphasis><filename>name</filename> -</emphasis> Specifies a
+ name to be used for association with <filename>SRC_URI</filename> checksums
+ when you have more than one file specified in <filename>SRC_URI</filename>.
+ </para></listitem>
+ <listitem><para><emphasis><filename>downloadfilename</filename> -</emphasis> Specifies
+ the filename used when storing the downloaded file.</para></listitem>
+ </itemizedlist>
+ </para>
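+            <para>
+                For example, the following illustrative entry fetches a
+                hypothetical tarball over HTTP, stores the download under a
+                different file name, and unpacks it into a subdirectory:
+                <literallayout class='monospaced'>
+     SRC_URI = "http://example.com/downloads/somepackage.tar.gz;downloadfilename=somepackage-1.0.tar.gz;subdir=somepackage"
+                </literallayout>
+            </para>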
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-SRCDATE'><glossterm>SRCDATE</glossterm>
+ <glossdef>
+ <para>
+ The date of the source code used to build the package.
+ This variable applies only if the source was fetched from a Source Code Manager (SCM).
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-SRCREV'><glossterm>SRCREV</glossterm>
+ <glossdef>
+ <para>
+ The revision of the source code used to build the package.
+ This variable applies only when using Subversion, Git, Mercurial and Bazaar.
+ If you want to build a fixed revision and you want
+ to avoid performing a query on the remote repository every time
+ BitBake parses your recipe, you should specify a <filename>SRCREV</filename> that is a
+ full revision identifier and not just a tag.
+ </para>
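+            <para>
+                For example, when fetching from a Git repository, the
+                variable is typically set to a full commit hash:
+                <literallayout class='monospaced'>
+     SRCREV = "<replaceable>full-commit-hash</replaceable>"
+                </literallayout>
+            </para>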
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-SRCREV_FORMAT'><glossterm>SRCREV_FORMAT</glossterm>
+ <glossdef>
+ <para>
+ Helps construct valid
+ <link linkend='var-SRCREV'><filename>SRCREV</filename></link>
+ values when multiple source controlled URLs are used in
+ <link linkend='var-SRC_URI'><filename>SRC_URI</filename></link>.
+ </para>
+
+ <para>
+ The system needs help constructing these values under these
+ circumstances.
+ Each component in the <filename>SRC_URI</filename>
+ is assigned a name and these are referenced
+ in the <filename>SRCREV_FORMAT</filename> variable.
+ Consider an example with URLs named "machine" and "meta".
+ In this case, <filename>SRCREV_FORMAT</filename> could look
+ like "machine_meta" and those names would have the SCM
+ versions substituted into each position.
+                At most one <filename>AUTOINC</filename> placeholder is
+                added, and only if it is needed.
+                This placeholder is placed at the start of the
+                returned string.
+ </para>
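+            <para>
+                A minimal sketch of the "machine" and "meta" example just
+                described, with purely illustrative repository URLs, might
+                look like the following:
+                <literallayout class='monospaced'>
+     SRC_URI = "git://example.com/machine.git;name=machine \
+                git://example.com/meta.git;name=meta"
+     SRCREV_machine = "<replaceable>commit-hash</replaceable>"
+     SRCREV_meta = "<replaceable>commit-hash</replaceable>"
+     SRCREV_FORMAT = "machine_meta"
+                </literallayout>
+            </para>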
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-STAMP'><glossterm>STAMP</glossterm>
+ <glossdef>
+ <para>
+ Specifies the base path used to create recipe stamp files.
+ The path to an actual stamp file is constructed by evaluating this
+ string and then appending additional information.
+ </para>
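+            <para>
+                An illustrative setting, assuming stamp files should live
+                under the build directory, might be:
+                <literallayout class='monospaced'>
+     STAMP = "${TOPDIR}/stamps/${PF}"
+                </literallayout>
+                BitBake then appends additional information, such as the
+                task name, to this base path when creating each stamp file.
+            </para>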
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-STAMPCLEAN'><glossterm>STAMPCLEAN</glossterm>
+ <glossdef>
+ <para>
+ Specifies the base path used to create recipe stamp files.
+ Unlike the
+ <link linkend='var-STAMP'><filename>STAMP</filename></link>
+ variable, <filename>STAMPCLEAN</filename> can contain
+ wildcards to match the range of files a clean operation
+ should remove.
+ BitBake uses a clean operation to remove any other stamps
+ it should be removing when creating a new stamp.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-SUMMARY'><glossterm>SUMMARY</glossterm>
+ <glossdef>
+ <para>
+                A short summary for the recipe, which should be 72 characters or less.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-SVNDIR'><glossterm>SVNDIR</glossterm>
+ <glossdef>
+ <para>
+ The directory in which files checked out of a Subversion
+ system are stored.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ </glossdiv>
+
+ <glossdiv id='var-glossary-t'><title>T</title>
+
+ <glossentry id='var-T'><glossterm>T</glossterm>
+ <glossdef>
+            <para>Points to a directory where BitBake places
+ temporary files, which consist mostly of task logs and
+ scripts, when building a particular recipe.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ <glossentry id='var-TOPDIR'><glossterm>TOPDIR</glossterm>
+ <glossdef>
+ <para>
+ Points to the build directory.
+ BitBake automatically sets this variable.
+ </para>
+ </glossdef>
+ </glossentry>
+
+ </glossdiv>
+
+<!--
+ <glossdiv id='var-glossary-u'><title>U</title>
+ </glossdiv>
+
+ <glossdiv id='var-glossary-v'><title>V</title>
+ </glossdiv>
+
+ <glossdiv id='var-glossary-w'><title>W</title>
+ </glossdiv>
+
+ <glossdiv id='var-glossary-x'><title>X</title>
+ </glossdiv>
+
+ <glossdiv id='var-glossary-y'><title>Y</title>
+ </glossdiv>
+
+ <glossdiv id='var-glossary-z'><title>Z</title>
+ </glossdiv>
+-->
+
+
+</glossary>
+</chapter>
+<!--
+vim: expandtab tw=80 ts=4
+-->
diff --git a/bitbake/doc/bitbake-user-manual/bitbake-user-manual-style.css b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-style.css
new file mode 100644
index 0000000..65da2a4
--- /dev/null
+++ b/bitbake/doc/bitbake-user-manual/bitbake-user-manual-style.css
@@ -0,0 +1,984 @@
+/*
+ Generic XHTML / DocBook XHTML CSS Stylesheet.
+
+ Browser wrangling and typographic design by
+ Oyvind Kolas / pippin@gimp.org
+
+ Customised for Poky by
+ Matthew Allum / mallum@o-hand.com
+
+ Thanks to:
+ Liam R. E. Quin
+ William Skaggs
+ Jakub Steiner
+
+ Structure
+ ---------
+
+ The stylesheet is divided into the following sections:
+
+ Positioning
+ Margins, paddings, width, font-size, clearing.
+ Decorations
+ Borders, style
+ Colors
+ Colors
+ Graphics
+ Graphical backgrounds
+ Nasty IE tweaks
+ Workarounds needed to make it work in internet explorer,
+ currently makes the stylesheet non validating, but up until
+ this point it is validating.
+ Mozilla extensions
+ Transparency for footer
+ Rounded corners on boxes
+
+*/
+
+
+ /*************** /
+ / Positioning /
+/ ***************/
+
+body {
+ font-family: Verdana, Sans, sans-serif;
+
+ min-width: 640px;
+ width: 80%;
+ margin: 0em auto;
+ padding: 2em 5em 5em 5em;
+ color: #333;
+}
+
+h1,h2,h3,h4,h5,h6,h7 {
+ font-family: Arial, Sans;
+ color: #00557D;
+ clear: both;
+}
+
+h1 {
+ font-size: 2em;
+ text-align: left;
+ padding: 0em 0em 0em 0em;
+ margin: 2em 0em 0em 0em;
+}
+
+h2.subtitle {
+ margin: 0.10em 0em 3.0em 0em;
+ padding: 0em 0em 0em 0em;
+ font-size: 1.8em;
+ padding-left: 20%;
+ font-weight: normal;
+ font-style: italic;
+}
+
+h2 {
+ margin: 2em 0em 0.66em 0em;
+ padding: 0.5em 0em 0em 0em;
+ font-size: 1.5em;
+ font-weight: bold;
+}
+
+h3.subtitle {
+ margin: 0em 0em 1em 0em;
+ padding: 0em 0em 0em 0em;
+ font-size: 142.14%;
+ text-align: right;
+}
+
+h3 {
+ margin: 1em 0em 0.5em 0em;
+ padding: 1em 0em 0em 0em;
+ font-size: 140%;
+ font-weight: bold;
+}
+
+h4 {
+ margin: 1em 0em 0.5em 0em;
+ padding: 1em 0em 0em 0em;
+ font-size: 120%;
+ font-weight: bold;
+}
+
+h5 {
+ margin: 1em 0em 0.5em 0em;
+ padding: 1em 0em 0em 0em;
+ font-size: 110%;
+ font-weight: bold;
+}
+
+h6 {
+ margin: 1em 0em 0em 0em;
+ padding: 1em 0em 0em 0em;
+ font-size: 110%;
+ font-weight: bold;
+}
+
+.authorgroup {
+ background-color: transparent;
+ background-repeat: no-repeat;
+ padding-top: 256px;
+ background-image: url("figures/bitbake-title.png");
+ background-position: left top;
+ margin-top: -256px;
+ padding-right: 50px;
+ margin-left: 0px;
+ text-align: right;
+ width: 740px;
+}
+
+h3.author {
+  margin: 0em 0em 0em 0em;
+ padding: 0em 0em 0em 0em;
+ font-weight: normal;
+ font-size: 100%;
+ color: #333;
+ clear: both;
+}
+
+.author tt.email {
+ font-size: 66%;
+}
+
+.titlepage hr {
+ width: 0em;
+ clear: both;
+}
+
+.revhistory {
+ padding-top: 2em;
+ clear: both;
+}
+
+.toc,
+.list-of-tables,
+.list-of-examples,
+.list-of-figures {
+ padding: 1.33em 0em 2.5em 0em;
+ color: #00557D;
+}
+
+.toc p,
+.list-of-tables p,
+.list-of-figures p,
+.list-of-examples p {
+ padding: 0em 0em 0em 0em;
+ padding: 0em 0em 0.3em;
+ margin: 1.5em 0em 0em 0em;
+}
+
+.toc p b,
+.list-of-tables p b,
+.list-of-figures p b,
+.list-of-examples p b{
+ font-size: 100.0%;
+ font-weight: bold;
+}
+
+.toc dl,
+.list-of-tables dl,
+.list-of-figures dl,
+.list-of-examples dl {
+ margin: 0em 0em 0.5em 0em;
+ padding: 0em 0em 0em 0em;
+}
+
+.toc dt {
+ margin: 0em 0em 0em 0em;
+ padding: 0em 0em 0em 0em;
+}
+
+.toc dd {
+ margin: 0em 0em 0em 2.6em;
+ padding: 0em 0em 0em 0em;
+}
+
+div.glossary dl,
+div.variablelist dl {
+}
+
+.glossary dl dt,
+.variablelist dl dt,
+.variablelist dl dt span.term {
+ font-weight: normal;
+ width: 20em;
+ text-align: right;
+}
+
+.variablelist dl dt {
+ margin-top: 0.5em;
+}
+
+.glossary dl dd,
+.variablelist dl dd {
+ margin-top: -1em;
+ margin-left: 25.5em;
+}
+
+.glossary dd p,
+.variablelist dd p {
+ margin-top: 0em;
+ margin-bottom: 1em;
+}
+
+
+div.calloutlist table td {
+ padding: 0em 0em 0em 0em;
+ margin: 0em 0em 0em 0em;
+}
+
+div.calloutlist table td p {
+ margin-top: 0em;
+ margin-bottom: 1em;
+}
+
+div p.copyright {
+ text-align: left;
+}
+
+div.legalnotice p.legalnotice-title {
+ margin-bottom: 0em;
+}
+
+p {
+ line-height: 1.5em;
+ margin-top: 0em;
+
+}
+
+dl {
+ padding-top: 0em;
+}
+
+hr {
+ border: solid 1px;
+}
+
+
+.mediaobject,
+.mediaobjectco {
+ text-align: center;
+}
+
+img {
+ border: none;
+}
+
+ul {
+ padding: 0em 0em 0em 1.5em;
+}
+
+ul li {
+ padding: 0em 0em 0em 0em;
+}
+
+ul li p {
+ text-align: left;
+}
+
+table {
+ width :100%;
+}
+
+th {
+ padding: 0.25em;
+ text-align: left;
+ font-weight: normal;
+ vertical-align: top;
+}
+
+td {
+ padding: 0.25em;
+ vertical-align: top;
+}
+
+p a[id] {
+ margin: 0px;
+ padding: 0px;
+ display: inline;
+ background-image: none;
+}
+
+a {
+ text-decoration: underline;
+ color: #444;
+}
+
+pre {
+ overflow: auto;
+}
+
+a:hover {
+ text-decoration: underline;
+ /*font-weight: bold;*/
+}
+
+/* This style defines how the permalink character
+ appears by itself and when hovered over with
+ the mouse. */
+
+[alt='Permalink'] { color: #eee; }
+[alt='Permalink']:hover { color: black; }
+
+
+div.informalfigure,
+div.informalexample,
+div.informaltable,
+div.figure,
+div.table,
+div.example {
+ margin: 1em 0em;
+ padding: 1em;
+ page-break-inside: avoid;
+}
+
+
+div.informalfigure p.title b,
+div.informalexample p.title b,
+div.informaltable p.title b,
+div.figure p.title b,
+div.example p.title b,
+div.table p.title b{
+ padding-top: 0em;
+ margin-top: 0em;
+ font-size: 100%;
+ font-weight: normal;
+}
+
+.mediaobject .caption,
+.mediaobject .caption p {
+ text-align: center;
+ font-size: 80%;
+ padding-top: 0.5em;
+ padding-bottom: 0.5em;
+}
+
+.epigraph {
+ padding-left: 55%;
+ margin-bottom: 1em;
+}
+
+.epigraph p {
+ text-align: left;
+}
+
+.epigraph .quote {
+ font-style: italic;
+}
+.epigraph .attribution {
+ font-style: normal;
+ text-align: right;
+}
+
+span.application {
+ font-style: italic;
+}
+
+.programlisting {
+ font-family: monospace;
+ font-size: 80%;
+ white-space: pre;
+ margin: 1.33em 0em;
+ padding: 1.33em;
+}
+
+.tip,
+.warning,
+.caution,
+.note {
+ margin-top: 1em;
+ margin-bottom: 1em;
+
+}
+
+/* force full width of table within div */
+.tip table,
+.warning table,
+.caution table,
+.note table {
+ border: none;
+ width: 100%;
+}
+
+
+.tip table th,
+.warning table th,
+.caution table th,
+.note table th {
+ padding: 0.8em 0.0em 0.0em 0.0em;
+ margin : 0em 0em 0em 0em;
+}
+
+.tip p,
+.warning p,
+.caution p,
+.note p {
+ margin-top: 0.5em;
+ margin-bottom: 0.5em;
+ padding-right: 1em;
+ text-align: left;
+}
+
+.acronym {
+ text-transform: uppercase;
+}
+
+b.keycap,
+.keycap {
+ padding: 0.09em 0.3em;
+ margin: 0em;
+}
+
+.itemizedlist li {
+ clear: none;
+}
+
+.filename {
+ font-size: medium;
+ font-family: Courier, monospace;
+}
+
+
+div.navheader, div.heading{
+ position: absolute;
+ left: 0em;
+ top: 0em;
+ width: 100%;
+ background-color: #cdf;
+ width: 100%;
+}
+
+div.navfooter, div.footing{
+ position: fixed;
+ left: 0em;
+ bottom: 0em;
+ background-color: #eee;
+ width: 100%;
+}
+
+
+div.navheader td,
+div.navfooter td {
+ font-size: 66%;
+}
+
+div.navheader table th {
+ /*font-family: Georgia, Times, serif;*/
+ /*font-size: x-large;*/
+ font-size: 80%;
+}
+
+div.navheader table {
+ border-left: 0em;
+ border-right: 0em;
+ border-top: 0em;
+ width: 100%;
+}
+
+div.navfooter table {
+ border-left: 0em;
+ border-right: 0em;
+ border-bottom: 0em;
+ width: 100%;
+}
+
+div.navheader table td a,
+div.navfooter table td a {
+ color: #777;
+ text-decoration: none;
+}
+
+/* normal text in the footer */
+div.navfooter table td {
+ color: black;
+}
+
+div.navheader table td a:visited,
+div.navfooter table td a:visited {
+ color: #444;
+}
+
+
+/* links in header and footer */
+div.navheader table td a:hover,
+div.navfooter table td a:hover {
+ text-decoration: underline;
+ background-color: transparent;
+ color: #33a;
+}
+
+div.navheader hr,
+div.navfooter hr {
+ display: none;
+}
+
+
+.qandaset tr.question td p {
+ margin: 0em 0em 1em 0em;
+ padding: 0em 0em 0em 0em;
+}
+
+.qandaset tr.answer td p {
+ margin: 0em 0em 1em 0em;
+ padding: 0em 0em 0em 0em;
+}
+.answer td {
+ padding-bottom: 1.5em;
+}
+
+.emphasis {
+ font-weight: bold;
+}
+
+
+ /************* /
+ / decorations /
+/ *************/
+
+.titlepage {
+}
+
+.part .title {
+}
+
+.subtitle {
+ border: none;
+}
+
+/*
+h1 {
+ border: none;
+}
+
+h2 {
+ border-top: solid 0.2em;
+ border-bottom: solid 0.06em;
+}
+
+h3 {
+ border-top: 0em;
+ border-bottom: solid 0.06em;
+}
+
+h4 {
+ border: 0em;
+ border-bottom: solid 0.06em;
+}
+
+h5 {
+ border: 0em;
+}
+*/
+
+.programlisting {
+ border: solid 1px;
+}
+
+div.figure,
+div.table,
+div.informalfigure,
+div.informaltable,
+div.informalexample,
+div.example {
+ border: 1px solid;
+}
+
+
+
+.tip,
+.warning,
+.caution,
+.note {
+ border: 1px solid;
+}
+
+.tip table th,
+.warning table th,
+.caution table th,
+.note table th {
+ border-bottom: 1px solid;
+}
+
+.question td {
+ border-top: 1px solid black;
+}
+
+.answer {
+}
+
+
+b.keycap,
+.keycap {
+ border: 1px solid;
+}
+
+
+div.navheader, div.heading{
+ border-bottom: 1px solid;
+}
+
+
+div.navfooter, div.footing{
+ border-top: 1px solid;
+}
+
+ /********* /
+ / colors /
+/ *********/
+
+body {
+ color: #333;
+ background: white;
+}
+
+a {
+ background: transparent;
+}
+
+a:hover {
+ background-color: #dedede;
+}
+
+
+h1,
+h2,
+h3,
+h4,
+h5,
+h6,
+h7,
+h8 {
+ background-color: transparent;
+}
+
+hr {
+ border-color: #aaa;
+}
+
+
+.tip, .warning, .caution, .note {
+ border-color: #fff;
+}
+
+
+.tip table th,
+.warning table th,
+.caution table th,
+.note table th {
+ border-bottom-color: #fff;
+}
+
+
+.warning {
+ background-color: #f0f0f2;
+}
+
+.caution {
+ background-color: #f0f0f2;
+}
+
+.tip {
+ background-color: #f0f0f2;
+}
+
+.note {
+ background-color: #f0f0f2;
+}
+
+.glossary dl dt,
+.variablelist dl dt,
+.variablelist dl dt span.term {
+ color: #044;
+}
+
+div.figure,
+div.table,
+div.example,
+div.informalfigure,
+div.informaltable,
+div.informalexample {
+ border-color: #aaa;
+}
+
+pre.programlisting {
+ color: black;
+ background-color: #fff;
+ border-color: #aaa;
+ border-width: 2px;
+}
+
+.guimenu,
+.guilabel,
+.guimenuitem {
+ background-color: #eee;
+}
+
+
+b.keycap,
+.keycap {
+ background-color: #eee;
+ border-color: #999;
+}
+
+
+div.navheader {
+ border-color: black;
+}
+
+
+div.navfooter {
+ border-color: black;
+}
+
+
+ /*********** /
+ / graphics /
+/ ***********/
+
+/*
+body {
+ background-image: url("images/body_bg.jpg");
+ background-attachment: fixed;
+}
+
+.navheader,
+.note,
+.tip {
+ background-image: url("images/note_bg.jpg");
+ background-attachment: fixed;
+}
+
+.warning,
+.caution {
+ background-image: url("images/warning_bg.jpg");
+ background-attachment: fixed;
+}
+
+.figure,
+.informalfigure,
+.example,
+.informalexample,
+.table,
+.informaltable {
+ background-image: url("images/figure_bg.jpg");
+ background-attachment: fixed;
+}
+
+*/
+h1,
+h2,
+h3,
+h4,
+h5,
+h6,
+h7{
+}
+
+/*
+Example of how to stick an image as part of the title.
+
+div.article .titlepage .title
+{
+ background-image: url("figures/white-on-black.png");
+ background-position: center;
+ background-repeat: repeat-x;
+}
+*/
+
+div.preface .titlepage .title,
+div.colophon .title,
+div.chapter .titlepage .title,
+div.article .titlepage .title
+{
+}
+
+div.section div.section .titlepage .title,
+div.sect2 .titlepage .title {
+ background: none;
+}
+
+
+h1.title {
+ background-color: transparent;
+ background-repeat: no-repeat;
+ height: 256px;
+ text-indent: -9000px;
+ overflow:hidden;
+}
+
+h2.subtitle {
+ background-color: transparent;
+ text-indent: -9000px;
+ overflow:hidden;
+ width: 0px;
+ display: none;
+}
+
+ /*************************************** /
+ / pippin.gimp.org specific alterations /
+/ ***************************************/
+
+/*
+div.heading, div.navheader {
+ color: #777;
+ font-size: 80%;
+ padding: 0;
+ margin: 0;
+ text-align: left;
+ position: absolute;
+ top: 0px;
+ left: 0px;
+ width: 100%;
+ height: 50px;
+ background: url('/gfx/heading_bg.png') transparent;
+ background-repeat: repeat-x;
+ background-attachment: fixed;
+ border: none;
+}
+
+div.heading a {
+ color: #444;
+}
+
+div.footing, div.navfooter {
+ border: none;
+ color: #ddd;
+ font-size: 80%;
+ text-align:right;
+
+ width: 100%;
+ padding-top: 10px;
+ position: absolute;
+ bottom: 0px;
+ left: 0px;
+
+ background: url('/gfx/footing_bg.png') transparent;
+}
+*/
+
+
+
+ /****************** /
+ / nasty ie tweaks /
+/ ******************/
+
+/*
+div.heading, div.navheader {
+ width:expression(document.body.clientWidth + "px");
+}
+
+div.footing, div.navfooter {
+ width:expression(document.body.clientWidth + "px");
+ margin-left:expression("-5em");
+}
+body {
+ padding:expression("4em 5em 0em 5em");
+}
+*/
+
+ /**************************************** /
+ / mozilla vendor specific css extensions /
+/ ****************************************/
+/*
+div.navfooter, div.footing{
+ -moz-opacity: 0.8em;
+}
+
+div.figure,
+div.table,
+div.informalfigure,
+div.informaltable,
+div.informalexample,
+div.example,
+.tip,
+.warning,
+.caution,
+.note {
+ -moz-border-radius: 0.5em;
+}
+
+b.keycap,
+.keycap {
+ -moz-border-radius: 0.3em;
+}
+*/
+
+table tr td table tr td {
+ display: none;
+}
+
+
+hr {
+ display: none;
+}
+
+table {
+ border: 0em;
+}
+
+ .photo {
+ float: right;
+ margin-left: 1.5em;
+ margin-bottom: 1.5em;
+ margin-top: 0em;
+ max-width: 17em;
+ border: 1px solid gray;
+ padding: 3px;
+ background: white;
+}
+ .seperator {
+ padding-top: 2em;
+ clear: both;
+ }
+
+ #validators {
+ margin-top: 5em;
+ text-align: right;
+ color: #777;
+ }
+ @media print {
+ body {
+ font-size: 8pt;
+ }
+ .noprint {
+ display: none;
+ }
+ }
+
+
+.tip,
+.note {
+ background: #f0f0f2;
+ color: #333;
+ padding: 20px;
+ margin: 20px;
+}
+
+.tip h3,
+.note h3 {
+ padding: 0em;
+ margin: 0em;
+ font-size: 2em;
+ font-weight: bold;
+ color: #333;
+}
+
+.tip a,
+.note a {
+ color: #333;
+ text-decoration: underline;
+}
+
+.footnote {
+ font-size: small;
+ color: #333;
+}
+
+/* Changes the announcement text */
+.tip h3,
+.warning h3,
+.caution h3,
+.note h3 {
+ font-size:large;
+ color: #00557D;
+}
diff --git a/bitbake/doc/bitbake-user-manual/bitbake-user-manual.xml b/bitbake/doc/bitbake-user-manual/bitbake-user-manual.xml
new file mode 100644
index 0000000..7fff933
--- /dev/null
+++ b/bitbake/doc/bitbake-user-manual/bitbake-user-manual.xml
@@ -0,0 +1,88 @@
+<!DOCTYPE book PUBLIC "-//OASIS//DTD DocBook XML V4.2//EN"
+"http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd">
+
+<book id='bitbake-user-manual' lang='en'
+ xmlns:xi="http://www.w3.org/2003/XInclude"
+ xmlns="http://docbook.org/ns/docbook"
+ >
+ <bookinfo>
+
+ <mediaobject>
+ <imageobject>
+ <imagedata fileref='figures/bitbake-title.png'
+ format='SVG'
+ align='left' scalefit='1' width='100%'/>
+ </imageobject>
+ </mediaobject>
+
+ <title>
+ BitBake User Manual
+ </title>
+
+ <authorgroup>
+ <author>
+ <firstname>Richard Purdie, Chris Larson, and </firstname> <surname>Phil Blundell</surname>
+ <affiliation>
+ <orgname>BitBake Community</orgname>
+ </affiliation>
+ <email>bitbake-devel@lists.openembedded.org</email>
+ </author>
+ </authorgroup>
+
+<!--
+# Add in some revision history if we want it here.
+ <revhistory>
+ <revision>
+ <revnumber>x.x</revnumber>
+ <date>dd month year</date>
+            <revremark>Some relevant comment</revremark>
+ </revision>
+ <revision>
+ <revnumber>x.x</revnumber>
+ <date>dd month year</date>
+            <revremark>Some relevant comment</revremark>
+ </revision>
+ <revision>
+ <revnumber>x.x</revnumber>
+ <date>dd month year</date>
+            <revremark>Some relevant comment</revremark>
+ </revision>
+ <revision>
+ <revnumber>x.x</revnumber>
+ <date>dd month year</date>
+            <revremark>Some relevant comment</revremark>
+ </revision>
+ </revhistory>
+-->
+
+ <copyright>
+ <year>2004-2015</year>
+ <holder>Richard Purdie</holder>
+ <holder>Chris Larson</holder>
+ <holder>and Phil Blundell</holder>
+ </copyright>
+
+ <legalnotice>
+ <para>
+ This work is licensed under the Creative Commons Attribution License.
+ To view a copy of this license, visit
+ <ulink url="http://creativecommons.org/licenses/by/2.5/">http://creativecommons.org/licenses/by/2.5/</ulink>
+ or send a letter to Creative Commons, 444 Castro Street,
+ Suite 900, Mountain View, California 94041, USA.
+ </para>
+ </legalnotice>
+ </bookinfo>
+
+ <xi:include href="bitbake-user-manual-intro.xml"/>
+
+ <xi:include href="bitbake-user-manual-execution.xml"/>
+
+ <xi:include href="bitbake-user-manual-metadata.xml"/>
+
+ <xi:include href="bitbake-user-manual-fetching.xml"/>
+
+ <xi:include href="bitbake-user-manual-ref-variables.xml"/>
+
+ <xi:include href="bitbake-user-manual-hello.xml"/>
+
+</book>
diff --git a/bitbake/doc/bitbake-user-manual/figures/bitbake-title.png b/bitbake/doc/bitbake-user-manual/figures/bitbake-title.png
new file mode 100644
index 0000000..cb29015
--- /dev/null
+++ b/bitbake/doc/bitbake-user-manual/figures/bitbake-title.png
Binary files differ
diff --git a/bitbake/doc/bitbake-user-manual/html.css b/bitbake/doc/bitbake-user-manual/html.css
new file mode 100644
index 0000000..6eedfd3
--- /dev/null
+++ b/bitbake/doc/bitbake-user-manual/html.css
@@ -0,0 +1,281 @@
+/* Feuille de style DocBook du projet Traduc.org */
+/* DocBook CSS stylesheet of the Traduc.org project */
+
+/* (c) Jean-Philippe Guérard - 14 août 2004 */
+/* (c) Jean-Philippe Guérard - 14 August 2004 */
+
+/* Cette feuille de style est libre, vous pouvez la */
+/* redistribuer et la modifier selon les termes de la Licence */
+/* Art Libre. Vous trouverez un exemplaire de cette Licence sur */
+/* http://tigreraye.org/Petit-guide-du-traducteur.html#licence-art-libre */
+
+/* This work of art is free, you can redistribute it and/or */
+/* modify it according to terms of the Free Art license. You */
+/* will find a specimen of this license on the Copyleft */
+/* Attitude web site: http://artlibre.org as well as on other */
+/* sites. */
+/* Please note that the French version of this licence as shown */
+/* on http://tigreraye.org/Petit-guide-du-traducteur.html#licence-art-libre */
+/* is the only official licence of this document. The English */
+/* version is only provided to help you understand this licence. */
+
+/* La dernière version de cette feuille de style est toujours */
+/* disponible sur : http://tigreraye.org/style.css */
+/* Elle est également disponible sur : */
+/* http://www.traduc.org/docs/HOWTO/lecture/style.css */
+
+/* The latest version of this stylesheet is available from: */
+/* http://tigreraye.org/style.css */
+/* It is also available on: */
+/* http://www.traduc.org/docs/HOWTO/lecture/style.css */
+
+/* N'hésitez pas à envoyer vos commentaires et corrections à */
+/* Jean-Philippe Guérard <jean-philippe.guerard@tigreraye.org> */
+
+/* Please send feedback and bug reports to */
+/* Jean-Philippe Guérard <jean-philippe.guerard@tigreraye.org> */
+
+/* $Id: style.css,v 1.14 2004/09/10 20:12:09 fevrier Exp fevrier $ */
+
+/* Présentation générale du document */
+/* Overall document presentation */
+
+body {
+ /*
+ font-family: Apolline, "URW Palladio L", Garamond, jGaramond,
+ "Bitstream Cyberbit", "Palatino Linotype", serif;
+ */
+ margin: 7%;
+ background-color: white;
+}
+
+/* Taille du texte */
+/* Text size */
+
+* { font-size: 100%; }
+
+/* Gestion des textes mis en relief imbriqués */
+/* Embedded emphasis */
+
+em { font-style: italic; }
+em em { font-style: normal; }
+em em em { font-style: italic; }
+
+/* Titres */
+/* Titles */
+
+h1 { font-size: 200%; font-weight: 900; }
+h2 { font-size: 160%; font-weight: 900; }
+h3 { font-size: 130%; font-weight: bold; }
+h4 { font-size: 115%; font-weight: bold; }
+h5 { font-size: 108%; font-weight: bold; }
+h6 { font-weight: bold; }
+
+/* Nom de famille en petites majuscules (uniquement en français) */
+/* Last names in small caps (for French only) */
+
+*[class~="surname"]:lang(fr) { font-variant: small-caps; }
+
+/* Blocs de citation */
+/* Quotation blocs */
+
+div[class~="blockquote"] {
+ border: solid 2px #AAA;
+ padding: 5px;
+ margin: 5px;
+}
+
+div[class~="blockquote"] > table {
+ border: none;
+}
+
+/* Blocs litéraux : fond gris clair */
+/* Literal blocs: light gray background */
+
+*[class~="literallayout"] {
+ background: #f0f0f0;
+ padding: 5px;
+ margin: 5px;
+}
+
+/* Programmes et captures texte : fond bleu clair */
+/* Listing and text screen snapshots: light blue background */
+
+*[class~="programlisting"], *[class~="screen"] {
+ background: #f0f0ff;
+ padding: 5px;
+ margin: 5px;
+}
+
+/* Les textes à remplacer sont surlignés en vert pâle */
+/* Replaceable text in highlighted in pale green */
+
+*[class~="replaceable"] {
+ background-color: #98fb98;
+ font-style: normal; }
+
+/* Tables : fonds gris clair & bords simples */
+/* Tables: light gray background and solid borders */
+
+*[class~="table"] *[class~="title"] { width:100%; border: 0px; }
+
+table {
+ border: 1px solid #aaa;
+ border-collapse: collapse;
+ padding: 2px;
+ margin: 5px;
+}
+
+/* Listes simples en style table */
+/* Simples lists in table presentation */
+
+table[class~="simplelist"] {
+ background-color: #F0F0F0;
+ margin: 5px;
+ border: solid 1px #AAA;
+}
+
+table[class~="simplelist"] td {
+ border: solid 1px #AAA;
+}
+
+/* Les tables */
+/* Tables */
+
+*[class~="table"] table {
+ background-color: #F0F0F0;
+ border: solid 1px #AAA;
+}
+*[class~="informaltable"] table { background-color: #F0F0F0; }
+
+th,td {
+ vertical-align: baseline;
+ text-align: left;
+ padding: 0.1em 0.3em;
+ empty-cells: show;
+}
+
+/* Alignement des colonnes */
+/* Colunms alignment */
+
+td[align=center] , th[align=center] { text-align: center; }
+td[align=right] , th[align=right] { text-align: right; }
+td[align=left] , th[align=left] { text-align: left; }
+td[align=justify] , th[align=justify] { text-align: justify; }
+
+/* Pas de marge autour des images */
+/* No inside margins for images */
+
+img { border: 0; }
+
+/* Les liens ne sont pas soulignés */
+/* No underlines for links */
+
+:link , :visited , :active { text-decoration: none; }
+
+/* Prudence : cadre jaune et fond jaune clair */
+/* Caution: yellow border and light yellow background */
+
+*[class~="caution"] {
+ border: solid 2px yellow;
+ background-color: #ffffe0;
+ padding: 1em 6px 1em ;
+ margin: 5px;
+}
+
+*[class~="caution"] th {
+ vertical-align: middle
+}
+
+*[class~="caution"] table {
+ background-color: #ffffe0;
+ border: none;
+}
+
+/* Note importante : cadre jaune et fond jaune clair */
+/* Important: yellow border and light yellow background */
+
+*[class~="important"] {
+ border: solid 2px yellow;
+ background-color: #ffffe0;
+ padding: 1em 6px 1em;
+ margin: 5px;
+}
+
+*[class~="important"] th {
+ vertical-align: middle
+}
+
+*[class~="important"] table {
+ background-color: #ffffe0;
+ border: none;
+}
+
+/* Mise en évidence : texte légèrement plus grand */
+/* Highlights: slightly larger texts */
+
+*[class~="highlights"] {
+ font-size: 110%;
+}
+
+/* Note : cadre bleu et fond bleu clair */
+/* Notes: blue border and light blue background */
+
+*[class~="note"] {
+ border: solid 2px #7099C5;
+ background-color: #f0f0ff;
+ padding: 1em 6px 1em ;
+ margin: 5px;
+}
+
+*[class~="note"] th {
+ vertical-align: middle
+}
+
+*[class~="note"] table {
+ background-color: #f0f0ff;
+ border: none;
+}
+
+/* Astuce : cadre vert et fond vert clair */
+/* Tip: green border and light green background */
+
+*[class~="tip"] {
+ border: solid 2px #00ff00;
+ background-color: #f0ffff;
+ padding: 1em 6px 1em ;
+ margin: 5px;
+}
+
+*[class~="tip"] th {
+ vertical-align: middle;
+}
+
+*[class~="tip"] table {
+ background-color: #f0ffff;
+ border: none;
+}
+
+/* Avertissement : cadre rouge et fond rouge clair */
+/* Warning: red border and light red background */
+
+*[class~="warning"] {
+ border: solid 2px #ff0000;
+ background-color: #fff0f0;
+ padding: 1em 6px 1em ;
+ margin: 5px;
+}
+
+*[class~="warning"] th {
+ vertical-align: middle;
+}
+
+
+*[class~="warning"] table {
+ background-color: #fff0f0;
+ border: none;
+}
+
+/* Fin */
+/* The End */
+
diff --git a/bitbake/doc/bitbake.1 b/bitbake/doc/bitbake.1
new file mode 100644
index 0000000..a6c8d97
--- /dev/null
+++ b/bitbake/doc/bitbake.1
@@ -0,0 +1,142 @@
+.\" Hey, EMACS: -*- nroff -*-
+.\" First parameter, NAME, should be all caps
+.\" Second parameter, SECTION, should be 1-8, maybe w/ subsection
+.\" other parameters are allowed: see man(7), man(1)
+.TH BITBAKE 1 "November 19, 2006"
+.\" Please adjust this date whenever revising the manpage.
+.\"
+.\" Some roff macros, for reference:
+.\" .nh disable hyphenation
+.\" .hy enable hyphenation
+.\" .ad l left justify
+.\" .ad b justify to both left and right margins
+.\" .nf disable filling
+.\" .fi enable filling
+.\" .br insert line break
+.\" .sp <n> insert n+1 empty lines
+.\" for manpage-specific macros, see man(7)
+.SH NAME
+BitBake \- simple tool for the execution of tasks
+.SH SYNOPSIS
+.B bitbake
+.RI [ options ] " packagenames"
+.br
+.SH DESCRIPTION
+This manual page documents briefly the
+.B bitbake
+command.
+.PP
+.\" TeX users may be more comfortable with the \fB<whatever>\fP and
+.\" \fI<whatever>\fP escape sequences to invode bold face and italics,
+.\" respectively.
+\fBbitbake\fP is a program that executes the specified task (default is 'build')
+for a given set of BitBake files.
+.br
+It expects that BBFILES is defined, which is a space separated list of files to
+be executed. BBFILES does support wildcards.
+.br
+Default BBFILES are the .bb files in the current directory.
+.SH OPTIONS
+This program follows the usual GNU command line syntax, with long
+options starting with two dashes (`-').
+.TP
+.B \-h, \-\-help
+Show summary of options.
+.TP
+.B \-\-version
+Show version of program.
+.TP
+.B \-bBUILDFILE, \-\-buildfile=BUILDFILE
+execute the task against this .bb file, rather than a package from BBFILES.
+.TP
+.B \-k, \-\-continue
+continue as much as possible after an error. While the target that failed, and
+those that depend on it, cannot be remade, the other dependencies of these
+targets can be processed all the same.
+.TP
+.B \-a, \-\-tryaltconfigs
+continue with builds by trying to use alternative providers where possible.
+.TP
+.B \-f, \-\-force
+force run of specified cmd, regardless of stamp status
+.TP
+.B \-i, \-\-interactive
+drop into the interactive mode also called the BitBake shell.
+.TP
+.B \-cCMD, \-\-cmd=CMD
+Specify task to execute. Note that this only executes the specified task for
+the providee and the packages it depends on, i.e. 'compile' does not implicitly
+call stage for the dependencies (IOW: use only if you know what you are doing).
+Depending on the base.bbclass, a listtasks task is defined and will show
+available tasks.
+.TP
+.B \-rFILE, \-\-read=FILE
+read the specified file before bitbake.conf
+.TP
+.B \-v, \-\-verbose
+output more chit-chat to the terminal
+.TP
+.B \-D, \-\-debug
+Increase the debug level. You can specify this more than once.
+.TP
+.B \-n, \-\-dry-run
+don't execute, just go through the motions
+.TP
+.B \-p, \-\-parse-only
+quit after parsing the BB files (developers only)
+.TP
+.B \-s, \-\-show-versions
+show current and preferred versions of all packages
+.TP
+.B \-e, \-\-environment
+show the global or per-recipe environment (this is what used to be bbread)
+.TP
+.B \-g, \-\-graphviz
+emit the dependency trees of the specified packages in the dot syntax
+.TP
+.B \-IIGNORED\_DOT\_DEPS, \-\-ignore-deps=IGNORED_DOT_DEPS
+Stop processing at the given list of dependencies when generating dependency
+graphs. This can help to make the graph more appealing.
+.TP
+.B \-lDEBUG_DOMAINS, \-\-log-domains=DEBUG_DOMAINS
+Show debug logging for the specified logging domains
+.TP
+.B \-P, \-\-profile
+profile the command and print a report
+.TP
+.B \-uUI, \-\-ui=UI
+User interface to use. Currently, hob, depexp, goggle or ncurses can be specified as UI.
+.TP
+.B \-tSERVERTYPE, \-\-servertype=SERVERTYPE
+Choose which server to use, none, process or xmlrpc.
+.TP
+.B \-\-revisions-changed
+Set the exit code depending on whether upstream floating revisions have changed or not.
+.TP
+.B \-\-server-only
+Run bitbake without a UI; the frontend can connect to the bitbake server itself.
+.TP
+.B \-BBIND, \-\-bind=BIND
+The name/address for the bitbake server to bind to.
+.TP
+.B \-\-no\-setscene
+Do not run any setscene tasks; forces builds.
+
+.SH ENVIRONMENT VARIABLES
+bitbake uses the following environment variables to control its
+operation:
+.TP
+.B BITBAKE_UI
+The bitbake user interface; overridden by the \fB-u\fP commandline option.
+
+.SH AUTHORS
+BitBake was written by
+Phil Blundell,
+Holger Freyther,
+Chris Larson,
+Mickey Lauer,
+Richard Purdie,
+Holger Schurig
+.PP
+This manual page was written by Marcin Juszkiewicz <marcin@hrw.one.pl>
+for the Debian project (but may be used by others).
diff --git a/bitbake/doc/poky.ent b/bitbake/doc/poky.ent
new file mode 100644
index 0000000..c032e14
--- /dev/null
+++ b/bitbake/doc/poky.ent
@@ -0,0 +1,59 @@
+<!ENTITY DISTRO "1.4">
+<!ENTITY DISTRO_NAME "tbd">
+<!ENTITY YOCTO_DOC_VERSION "1.4">
+<!ENTITY POKYVERSION "8.0">
+<!ENTITY YOCTO_POKY "poky-&DISTRO_NAME;-&POKYVERSION;">
+<!ENTITY COPYRIGHT_YEAR "2010-2013">
+<!ENTITY YOCTO_DL_URL "http://downloads.yoctoproject.org">
+<!ENTITY YOCTO_HOME_URL "http://www.yoctoproject.org">
+<!ENTITY YOCTO_LISTS_URL "http://lists.yoctoproject.org">
+<!ENTITY YOCTO_BUGZILLA_URL "http://bugzilla.yoctoproject.org">
+<!ENTITY YOCTO_WIKI_URL "https://wiki.yoctoproject.org">
+<!ENTITY YOCTO_AB_URL "http://autobuilder.yoctoproject.org">
+<!ENTITY YOCTO_GIT_URL "http://git.yoctoproject.org">
+<!ENTITY YOCTO_ADTREPO_URL "http://adtrepo.yoctoproject.org">
+<!ENTITY OE_HOME_URL "http://www.openembedded.org">
+<!ENTITY OE_LISTS_URL "http://lists.linuxtogo.org/cgi-bin/mailman">
+<!ENTITY OE_DOCS_URL "http://docs.openembedded.org">
+<!ENTITY OH_HOME_URL "http://o-hand.com">
+<!ENTITY BITBAKE_HOME_URL "http://developer.berlios.de/projects/bitbake/">
+<!ENTITY ECLIPSE_MAIN_URL "http://www.eclipse.org/downloads">
+<!ENTITY ECLIPSE_DL_URL "http://download.eclipse.org">
+<!ENTITY ECLIPSE_DL_PLUGIN_URL "&YOCTO_DL_URL;/releases/eclipse-plugin/&DISTRO;">
+<!ENTITY ECLIPSE_UPDATES_URL "&ECLIPSE_DL_URL;/tm/updates/3.3">
+<!ENTITY ECLIPSE_INDIGO_URL "&ECLIPSE_DL_URL;/releases/indigo">
+<!ENTITY ECLIPSE_JUNO_URL "&ECLIPSE_DL_URL;/releases/juno">
+<!ENTITY ECLIPSE_INDIGO_CDT_URL "&ECLIPSE_DL_URL;/tools/cdt/releases/indigo">
+<!ENTITY YOCTO_DOCS_URL "&YOCTO_HOME_URL;/docs">
+<!ENTITY YOCTO_SOURCES_URL "&YOCTO_HOME_URL;/sources/">
+<!ENTITY YOCTO_AB_PORT_URL "&YOCTO_AB_URL;:8010">
+<!ENTITY YOCTO_AB_NIGHTLY_URL "&YOCTO_AB_URL;/nightly/">
+<!ENTITY YOCTO_POKY_URL "&YOCTO_DL_URL;/releases/poky/">
+<!ENTITY YOCTO_RELEASE_DL_URL "&YOCTO_DL_URL;/releases/yocto/yocto-&DISTRO;">
+<!ENTITY YOCTO_TOOLCHAIN_DL_URL "&YOCTO_RELEASE_DL_URL;/toolchain/">
+<!ENTITY YOCTO_ECLIPSE_DL_URL "&YOCTO_RELEASE_DL_URL;/eclipse-plugin/indigo">
+<!ENTITY YOCTO_ADTINSTALLER_DL_URL "&YOCTO_RELEASE_DL_URL;/adt_installer">
+<!ENTITY YOCTO_POKY_DL_URL "&YOCTO_RELEASE_DL_URL;/&YOCTO_POKY;.tar.bz2">
+<!ENTITY YOCTO_MACHINES_DL_URL "&YOCTO_RELEASE_DL_URL;/machines">
+<!ENTITY YOCTO_QEMU_DL_URL "&YOCTO_MACHINES_DL_URL;/qemu">
+<!ENTITY YOCTO_PYTHON-i686_DL_URL "&YOCTO_DL_URL;/releases/miscsupport/python-nativesdk-standalone-i686.tar.bz2">
+<!ENTITY YOCTO_PYTHON-x86_64_DL_URL "&YOCTO_DL_URL;/releases/miscsupport/python-nativesdk-standalone-x86_64.tar.bz2">
+<!ENTITY YOCTO_DOCS_QS_URL "&YOCTO_DOCS_URL;/&YOCTO_DOC_VERSION;/yocto-project-qs/yocto-project-qs.html">
+<!ENTITY YOCTO_DOCS_ADT_URL "&YOCTO_DOCS_URL;/&YOCTO_DOC_VERSION;/adt-manual/adt-manual.html">
+<!ENTITY YOCTO_DOCS_REF_URL "&YOCTO_DOCS_URL;/&YOCTO_DOC_VERSION;/ref-manual/ref-manual.html">
+<!ENTITY YOCTO_DOCS_BSP_URL "&YOCTO_DOCS_URL;/&YOCTO_DOC_VERSION;/bsp-guide/bsp-guide.html">
+<!ENTITY YOCTO_DOCS_DEV_URL "&YOCTO_DOCS_URL;/&YOCTO_DOC_VERSION;/dev-manual/dev-manual.html">
+<!ENTITY YOCTO_DOCS_KERNEL_URL "&YOCTO_DOCS_URL;/&YOCTO_DOC_VERSION;/kernel-manual/kernel-manual.html">
+<!ENTITY YOCTO_ADTPATH_DIR "/opt/poky/&DISTRO;">
+<!ENTITY YOCTO_POKY_TARBALL "&YOCTO_POKY;.tar.bz2">
+<!ENTITY OE_INIT_PATH "&YOCTO_POKY;/oe-init-build-env">
+<!ENTITY OE_INIT_FILE "oe-init-build-env">
+<!ENTITY UBUNTU_HOST_PACKAGES_ESSENTIAL "gawk wget git-core diffstat unzip texinfo \
+ build-essential chrpath">
+<!ENTITY FEDORA_HOST_PACKAGES_ESSENTIAL "gawk make wget tar bzip2 gzip python unzip perl patch \
+ diffutils diffstat git cpp gcc gcc-c++ eglibc-devel texinfo chrpath \
+ ccache">
+<!ENTITY OPENSUSE_HOST_PACKAGES_ESSENTIAL "python gcc gcc-c++ git chrpath make wget python-xml \
+ diffstat texinfo python-curses">
+<!ENTITY CENTOS_HOST_PACKAGES_ESSENTIAL "gawk make wget tar bzip2 gzip python unzip perl patch \
+ diffutils diffstat git cpp gcc gcc-c++ glibc-devel texinfo chrpath">
diff --git a/bitbake/doc/template/Vera.ttf b/bitbake/doc/template/Vera.ttf
new file mode 100644
index 0000000..58cd6b5
--- /dev/null
+++ b/bitbake/doc/template/Vera.ttf
Binary files differ
diff --git a/bitbake/doc/template/Vera.xml b/bitbake/doc/template/Vera.xml
new file mode 100644
index 0000000..3c82043
--- /dev/null
+++ b/bitbake/doc/template/Vera.xml
@@ -0,0 +1 @@
+<?xml version="1.0" encoding="UTF-8"?><font-metrics type="TYPE0"><font-name>BitstreamVeraSans</font-name><embed/><cap-height>729</cap-height><x-height>546</x-height><ascender>928</ascender><descender>-235</descender><bbox><left>-183</left><bottom>-235</bottom><right>1287</right><top>928</top></bbox><flags>32</flags><stemv>0</stemv><italicangle>0</italicangle><subtype>TYPE0</subtype><multibyte-extras><cid-type>CIDFontType2</cid-type><default-width>0</default-width><bfranges><bf gi="3" ue="126" us="32"/><bf gi="172" ue="160" us="160"/><bf gi="163" ue="161" us="161"/><bf gi="132" ue="163" us="162"/><bf gi="189" ue="164" us="164"/><bf gi="150" ue="165" us="165"/><bf gi="231" ue="166" us="166"/><bf gi="134" ue="167" us="167"/><bf gi="142" ue="168" us="168"/><bf gi="139" ue="169" us="169"/><bf gi="157" ue="170" us="170"/><bf gi="169" ue="171" us="171"/><bf gi="164" ue="172" us="172"/><bf gi="256" ue="173" us="173"/><bf gi="138" ue="174" us="174"/><bf gi="217" ue="175" us="175"/><bf gi="131" ue="176" us="176"/><bf gi="147" ue="177" us="177"/><bf gi="241" ue="179" us="178"/><bf gi="141" ue="180" us="180"/><bf gi="151" ue="181" us="181"/><bf gi="136" ue="182" us="182"/><bf gi="195" ue="183" us="183"/><bf gi="221" ue="184" us="184"/><bf gi="240" ue="185" us="185"/><bf gi="158" ue="186" us="186"/><bf gi="170" ue="187" us="187"/><bf gi="243" ue="190" us="188"/><bf gi="162" ue="191" us="191"/><bf gi="173" ue="192" us="192"/><bf gi="201" ue="193" us="193"/><bf gi="199" ue="194" us="194"/><bf gi="174" ue="195" us="195"/><bf gi="98" ue="197" us="196"/><bf gi="144" ue="198" us="198"/><bf gi="100" ue="199" us="199"/><bf gi="203" ue="200" us="200"/><bf gi="101" ue="201" us="201"/><bf gi="200" ue="202" us="202"/><bf gi="202" ue="203" us="203"/><bf gi="207" ue="204" us="204"/><bf gi="204" ue="207" us="205"/><bf gi="232" ue="208" us="208"/><bf gi="102" ue="209" us="209"/><bf gi="210" ue="210" us="210"/><bf gi="208" ue="212" us="211"/><bf gi="175" ue="213" us="213"/><bf gi="103" ue="214" us="214"/><bf gi="239" ue="215" us="215"/><bf gi="145" ue="216" us="216"/><bf gi="213" ue="217" us="217"/><bf gi="211" ue="219" us="218"/><bf gi="104" ue="220" us="220"/><bf gi="234" ue="221" us="221"/><bf gi="236" ue="222" us="222"/><bf gi="137" ue="223" us="223"/><bf gi="106" ue="224" us="224"/><bf gi="105" ue="225" us="225"/><bf gi="107" ue="226" us="226"/><bf gi="109" ue="227" us="227"/><bf gi="108" ue="228" us="228"/><bf gi="110" ue="229" us="229"/><bf gi="160" ue="230" us="230"/><bf gi="111" ue="231" us="231"/><bf gi="113" ue="232" us="232"/><bf gi="112" ue="233" us="233"/><bf gi="114" ue="235" us="234"/><bf gi="117" ue="236" us="236"/><bf gi="116" ue="237" us="237"/><bf gi="118" ue="239" us="238"/><bf gi="233" ue="240" us="240"/><bf gi="120" ue="241" us="241"/><bf gi="122" ue="242" us="242"/><bf gi="121" ue="243" us="243"/><bf gi="123" ue="244" us="244"/><bf gi="125" ue="245" us="245"/><bf gi="124" ue="246" us="246"/><bf gi="184" ue="247" us="247"/><bf gi="161" ue="248" us="248"/><bf gi="127" ue="249" us="249"/><bf gi="126" ue="250" us="250"/><bf gi="128" ue="252" us="251"/><bf gi="235" ue="253" us="253"/><bf gi="237" ue="254" us="254"/><bf gi="186" ue="255" us="255"/><bf gi="251" ue="263" us="262"/><bf gi="253" ue="269" us="268"/><bf gi="0" ue="270" us="270"/><bf gi="0" ue="271" us="271"/><bf gi="0" ue="272" us="272"/><bf gi="255" ue="273" us="273"/><bf gi="246" ue="287" us="286"/><bf gi="248" ue="304" us="304"/><bf gi="214" ue="305" us="305"/><bf gi="225" ue="322" us="321"/><bf gi="176" ue="339" us="338"/><bf gi="249" 
ue="351" us="350"/><bf gi="227" ue="353" us="352"/><bf gi="187" ue="376" us="376"/><bf gi="229" ue="382" us="381"/><bf gi="166" ue="402" us="402"/><bf gi="215" ue="710" us="710"/><bf gi="224" ue="711" us="711"/><bf gi="218" ue="730" us="728"/><bf gi="223" ue="731" us="731"/><bf gi="216" ue="732" us="732"/><bf gi="222" ue="733" us="733"/><bf gi="159" ue="937" us="937"/><bf gi="155" ue="960" us="960"/><bf gi="178" ue="8212" us="8211"/><bf gi="0" ue="8213" us="8213"/><bf gi="0" ue="8214" us="8214"/><bf gi="0" ue="8215" us="8215"/><bf gi="182" ue="8217" us="8216"/><bf gi="196" ue="8218" us="8218"/><bf gi="0" ue="8219" us="8219"/><bf gi="180" ue="8221" us="8220"/><bf gi="197" ue="8222" us="8222"/><bf gi="0" ue="8223" us="8223"/><bf gi="130" ue="8224" us="8224"/><bf gi="194" ue="8225" us="8225"/><bf gi="135" ue="8226" us="8226"/><bf gi="0" ue="8227" us="8227"/><bf gi="0" ue="8228" us="8228"/><bf gi="0" ue="8229" us="8229"/><bf gi="171" ue="8230" us="8230"/><bf gi="198" ue="8240" us="8240"/><bf gi="190" ue="8250" us="8249"/><bf gi="258" ue="8364" us="8364"/><bf gi="140" ue="8482" us="8482"/><bf gi="152" ue="8706" us="8706"/><bf gi="0" ue="8707" us="8707"/><bf gi="0" ue="8708" us="8708"/><bf gi="0" ue="8709" us="8709"/><bf gi="168" ue="8710" us="8710"/><bf gi="154" ue="8719" us="8719"/><bf gi="0" ue="8720" us="8720"/><bf gi="153" ue="8721" us="8721"/><bf gi="238" ue="8722" us="8722"/><bf gi="0" ue="8723" us="8723"/><bf gi="0" ue="8724" us="8724"/><bf gi="188" ue="8725" us="8725"/><bf gi="0" ue="8726" us="8726"/><bf gi="0" ue="8727" us="8727"/><bf gi="0" ue="8728" us="8728"/><bf gi="257" ue="8729" us="8729"/><bf gi="165" ue="8730" us="8730"/><bf gi="0" ue="8731" us="8731"/><bf gi="0" ue="8732" us="8732"/><bf gi="0" ue="8733" us="8733"/><bf gi="146" ue="8734" us="8734"/><bf gi="156" ue="8747" us="8747"/><bf gi="167" ue="8776" us="8776"/><bf gi="143" ue="8800" us="8800"/><bf gi="0" ue="8801" us="8801"/><bf gi="0" ue="8802" us="8802"/><bf gi="0" ue="8803" us="8803"/><bf gi="148" ue="8805" us="8804"/><bf gi="185" ue="9674" us="9674"/><bf gi="192" ue="64258" us="64257"/><bf gi="0" ue="65535" us="65535"/></bfranges><cid-widths start-index="0"><wx w="600"/><wx w="0"/><wx w="317"/><wx w="317"/><wx w="400"/><wx w="459"/><wx w="837"/><wx w="636"/><wx w="950"/><wx w="779"/><wx w="274"/><wx w="390"/><wx w="390"/><wx w="500"/><wx w="837"/><wx w="317"/><wx w="360"/><wx w="317"/><wx w="336"/><wx w="636"/><wx w="636"/><wx w="636"/><wx w="636"/><wx w="636"/><wx w="636"/><wx w="636"/><wx w="636"/><wx w="636"/><wx w="636"/><wx w="336"/><wx w="336"/><wx w="837"/><wx w="837"/><wx w="837"/><wx w="530"/><wx w="1000"/><wx w="684"/><wx w="686"/><wx w="698"/><wx w="770"/><wx w="631"/><wx w="575"/><wx w="774"/><wx w="751"/><wx w="294"/><wx w="294"/><wx w="655"/><wx w="557"/><wx w="862"/><wx w="748"/><wx w="787"/><wx w="603"/><wx w="787"/><wx w="694"/><wx w="634"/><wx w="610"/><wx w="731"/><wx w="684"/><wx w="988"/><wx w="685"/><wx w="610"/><wx w="685"/><wx w="390"/><wx w="336"/><wx w="390"/><wx w="837"/><wx w="500"/><wx w="500"/><wx w="612"/><wx w="634"/><wx w="549"/><wx w="634"/><wx w="615"/><wx w="352"/><wx w="634"/><wx w="633"/><wx w="277"/><wx w="277"/><wx w="579"/><wx w="277"/><wx w="974"/><wx w="633"/><wx w="611"/><wx w="634"/><wx w="634"/><wx w="411"/><wx w="520"/><wx w="392"/><wx w="633"/><wx w="591"/><wx w="817"/><wx w="591"/><wx w="591"/><wx w="524"/><wx w="636"/><wx w="336"/><wx w="636"/><wx w="837"/><wx w="684"/><wx w="684"/><wx w="698"/><wx w="631"/><wx w="748"/><wx w="787"/><wx w="731"/><wx w="612"/><wx 
w="612"/><wx w="612"/><wx w="612"/><wx w="612"/><wx w="612"/><wx w="549"/><wx w="615"/><wx w="615"/><wx w="615"/><wx w="615"/><wx w="277"/><wx w="277"/><wx w="277"/><wx w="277"/><wx w="633"/><wx w="611"/><wx w="611"/><wx w="611"/><wx w="611"/><wx w="611"/><wx w="633"/><wx w="633"/><wx w="633"/><wx w="633"/><wx w="500"/><wx w="500"/><wx w="636"/><wx w="636"/><wx w="500"/><wx w="589"/><wx w="636"/><wx w="629"/><wx w="1000"/><wx w="1000"/><wx w="1000"/><wx w="500"/><wx w="500"/><wx w="837"/><wx w="974"/><wx w="787"/><wx w="833"/><wx w="837"/><wx w="837"/><wx w="837"/><wx w="636"/><wx w="636"/><wx w="517"/><wx w="673"/><wx w="756"/><wx w="588"/><wx w="520"/><wx w="471"/><wx w="471"/><wx w="764"/><wx w="981"/><wx w="611"/><wx w="530"/><wx w="400"/><wx w="837"/><wx w="637"/><wx w="636"/><wx w="837"/><wx w="668"/><wx w="611"/><wx w="611"/><wx w="1000"/><wx w="636"/><wx w="684"/><wx w="684"/><wx w="787"/><wx w="1069"/><wx w="1022"/><wx w="500"/><wx w="1000"/><wx w="518"/><wx w="518"/><wx w="317"/><wx w="317"/><wx w="837"/><wx w="494"/><wx w="591"/><wx w="610"/><wx w="166"/><wx w="636"/><wx w="399"/><wx w="399"/><wx w="629"/><wx w="629"/><wx w="500"/><wx w="317"/><wx w="317"/><wx w="518"/><wx w="1341"/><wx w="684"/><wx w="631"/><wx w="684"/><wx w="631"/><wx w="631"/><wx w="294"/><wx w="294"/><wx w="294"/><wx w="294"/><wx w="787"/><wx w="787"/><wx w="787"/><wx w="731"/><wx w="731"/><wx w="731"/><wx w="277"/><wx w="500"/><wx w="500"/><wx w="500"/><wx w="500"/><wx w="500"/><wx w="500"/><wx w="500"/><wx w="500"/><wx w="500"/><wx w="500"/><wx w="562"/><wx w="284"/><wx w="634"/><wx w="520"/><wx w="685"/><wx w="524"/><wx w="336"/><wx w="774"/><wx w="611"/><wx w="610"/><wx w="591"/><wx w="604"/><wx w="634"/><wx w="837"/><wx w="837"/><wx w="400"/><wx w="400"/><wx w="400"/><wx w="969"/><wx w="969"/><wx w="969"/><wx w="774"/><wx w="634"/><wx w="294"/><wx w="634"/><wx w="520"/><wx w="698"/><wx w="549"/><wx w="698"/><wx w="549"/><wx w="634"/><wx w="360"/><wx w="317"/><wx w="636"/><wx w="500"/><wx w="500"/><wx w="500"/><wx w="500"/><wx w="500"/><wx w="500"/><wx w="400"/><wx w="500"/><wx w="500"/></cid-widths></multibyte-extras><kerning kpx1="246"><pair kern="-21" kpx2="180"/><pair kern="-17" kpx2="169"/><pair kern="-26" kpx2="197"/><pair kern="-35" kpx2="55"/><pair kern="-49" kpx2="60"/><pair kern="-49" kpx2="187"/><pair kern="-21" kpx2="181"/><pair kern="-17" kpx2="170"/><pair kern="-49" kpx2="234"/></kerning><kerning kpx1="235"><pair kern="-142" kpx2="17"/><pair kern="-17" kpx2="169"/><pair kern="-146" kpx2="197"/><pair kern="-17" kpx2="16"/><pair kern="-72" kpx2="29"/><pair kern="-17" kpx2="170"/></kerning><kerning kpx1="43"><pair kern="-35" kpx2="180"/><pair kern="-17" kpx2="17"/><pair kern="-35" kpx2="197"/><pair kern="-30" kpx2="181"/></kerning><kerning kpx1="16"><pair kern="36" kpx2="246"/><pair kern="-17" kpx2="235"/><pair kern="-21" kpx2="199"/><pair kern="18" kpx2="123"/><pair kern="27" kpx2="208"/><pair kern="-118" kpx2="187"/><pair kern="-49" kpx2="59"/><pair kern="18" kpx2="124"/><pair kern="-21" kpx2="201"/><pair kern="-118" kpx2="60"/><pair kern="36" kpx2="52"/><pair kern="18" kpx2="125"/><pair kern="36" kpx2="42"/><pair kern="-118" kpx2="234"/><pair kern="18" kpx2="122"/><pair kern="27" kpx2="210"/><pair kern="-21" kpx2="36"/><pair kern="18" kpx2="82"/><pair kern="-40" kpx2="58"/><pair kern="-91" kpx2="55"/><pair kern="-17" kpx2="186"/><pair kern="27" kpx2="175"/><pair kern="27" kpx2="50"/><pair kern="27" kpx2="209"/><pair kern="27" kpx2="103"/><pair kern="-21" kpx2="98"/><pair kern="55" 
kpx2="45"/><pair kern="-21" kpx2="173"/><pair kern="-17" kpx2="92"/><pair kern="-26" kpx2="89"/><pair kern="18" kpx2="121"/><pair kern="-58" kpx2="57"/><pair kern="-35" kpx2="37"/><pair kern="-21" kpx2="174"/></kerning><kerning kpx1="112"><pair kern="-17" kpx2="91"/></kerning><kerning kpx1="123"><pair kern="-72" kpx2="180"/><pair kern="-17" kpx2="17"/><pair kern="-63" kpx2="197"/><pair kern="18" kpx2="16"/><pair kern="-30" kpx2="91"/><pair kern="-35" kpx2="181"/></kerning><kerning kpx1="251"><pair kern="-17" kpx2="169"/><pair kern="-17" kpx2="60"/><pair kern="-17" kpx2="187"/><pair kern="18" kpx2="181"/><pair kern="-17" kpx2="170"/><pair kern="-17" kpx2="234"/></kerning><kerning kpx1="213"><pair kern="-17" kpx2="229"/><pair kern="-17" kpx2="61"/></kerning><kerning kpx1="208"><pair kern="-17" kpx2="36"/><pair kern="-17" kpx2="199"/><pair kern="27" kpx2="16"/><pair kern="-54" kpx2="187"/><pair kern="-17" kpx2="98"/><pair kern="-17" kpx2="181"/><pair kern="-63" kpx2="59"/><pair kern="-40" kpx2="17"/><pair kern="-21" kpx2="180"/><pair kern="-17" kpx2="173"/><pair kern="-17" kpx2="169"/><pair kern="-91" kpx2="197"/><pair kern="-17" kpx2="201"/><pair kern="-54" kpx2="60"/><pair kern="-17" kpx2="29"/><pair kern="-17" kpx2="57"/><pair kern="-17" kpx2="174"/><pair kern="-54" kpx2="234"/></kerning><kerning kpx1="187"><pair kern="-114" kpx2="126"/><pair kern="-137" kpx2="107"/><pair kern="-132" kpx2="72"/><pair kern="-77" kpx2="199"/><pair kern="-118" kpx2="16"/><pair kern="-132" kpx2="123"/><pair kern="-132" kpx2="112"/><pair kern="-54" kpx2="251"/><pair kern="-54" kpx2="208"/><pair kern="-132" kpx2="113"/><pair kern="-54" kpx2="180"/><pair kern="-137" kpx2="105"/><pair kern="-114" kpx2="129"/><pair kern="-132" kpx2="124"/><pair kern="-109" kpx2="169"/><pair kern="-77" kpx2="201"/><pair kern="-54" kpx2="253"/><pair kern="-137" kpx2="106"/><pair kern="-132" kpx2="29"/><pair kern="-132" kpx2="125"/><pair kern="-72" kpx2="170"/><pair kern="-132" kpx2="115"/><pair kern="-114" kpx2="88"/><pair kern="-132" kpx2="122"/><pair kern="-54" kpx2="100"/><pair kern="-137" kpx2="68"/><pair kern="-54" kpx2="210"/><pair kern="-77" kpx2="36"/><pair kern="-132" kpx2="82"/><pair kern="-132" kpx2="114"/><pair kern="-54" kpx2="175"/><pair kern="-114" kpx2="127"/><pair kern="-54" kpx2="50"/><pair kern="-54" kpx2="209"/><pair kern="-54" kpx2="103"/><pair kern="-137" kpx2="108"/><pair kern="-77" kpx2="98"/><pair kern="-35" kpx2="76"/><pair kern="-17" kpx2="181"/><pair kern="-202" kpx2="17"/><pair kern="-114" kpx2="128"/><pair kern="-77" kpx2="173"/><pair kern="-137" kpx2="109"/><pair kern="-128" kpx2="197"/><pair kern="-54" kpx2="38"/><pair kern="-132" kpx2="121"/><pair kern="-137" kpx2="110"/><pair kern="-77" kpx2="174"/></kerning><kerning kpx1="113"><pair kern="-17" kpx2="91"/></kerning><kerning kpx1="144"><pair kern="-40" kpx2="180"/><pair kern="-54" kpx2="197"/><pair kern="-44" kpx2="181"/></kerning><kerning kpx1="59"><pair kern="-72" kpx2="100"/><pair kern="-63" kpx2="210"/><pair kern="-17" kpx2="55"/><pair kern="-44" kpx2="114"/><pair kern="-44" kpx2="72"/><pair kern="-63" kpx2="175"/><pair kern="-49" kpx2="16"/><pair kern="-63" kpx2="50"/><pair kern="-63" kpx2="209"/><pair kern="-44" kpx2="112"/><pair kern="-72" kpx2="251"/><pair kern="-63" kpx2="103"/><pair kern="-63" kpx2="208"/><pair kern="-44" kpx2="113"/><pair kern="-40" kpx2="181"/><pair kern="-77" kpx2="180"/><pair kern="-54" kpx2="169"/><pair kern="-21" kpx2="197"/><pair kern="-72" kpx2="38"/><pair kern="-72" kpx2="253"/><pair kern="-44" 
kpx2="115"/></kerning><kerning kpx1="73"><pair kern="31" kpx2="180"/><pair kern="-17" kpx2="90"/><pair kern="-72" kpx2="17"/><pair kern="-17" kpx2="235"/><pair kern="-35" kpx2="169"/><pair kern="-114" kpx2="197"/><pair kern="-17" kpx2="186"/><pair kern="-17" kpx2="92"/><pair kern="-17" kpx2="87"/><pair kern="-54" kpx2="16"/><pair kern="-35" kpx2="29"/><pair kern="-17" kpx2="170"/></kerning><kerning kpx1="41"><pair kern="-17" kpx2="227"/><pair kern="-54" kpx2="126"/><pair kern="-91" kpx2="107"/><pair kern="-91" kpx2="235"/><pair kern="-54" kpx2="72"/><pair kern="-91" kpx2="199"/><pair kern="-35" kpx2="123"/><pair kern="-54" kpx2="112"/><pair kern="-54" kpx2="113"/><pair kern="-17" kpx2="54"/><pair kern="-21" kpx2="180"/><pair kern="-91" kpx2="105"/><pair kern="-54" kpx2="129"/><pair kern="-35" kpx2="124"/><pair kern="-91" kpx2="201"/><pair kern="-72" kpx2="85"/><pair kern="-91" kpx2="106"/><pair kern="-77" kpx2="29"/><pair kern="-35" kpx2="125"/><pair kern="-54" kpx2="115"/><pair kern="-54" kpx2="88"/><pair kern="-35" kpx2="122"/><pair kern="-91" kpx2="68"/><pair kern="-91" kpx2="36"/><pair kern="-35" kpx2="82"/><pair kern="-91" kpx2="186"/><pair kern="-17" kpx2="55"/><pair kern="-54" kpx2="114"/><pair kern="-54" kpx2="127"/><pair kern="-91" kpx2="108"/><pair kern="-91" kpx2="98"/><pair kern="-72" kpx2="76"/><pair kern="-160" kpx2="17"/><pair kern="-54" kpx2="128"/><pair kern="-91" kpx2="173"/><pair kern="-91" kpx2="109"/><pair kern="-183" kpx2="197"/><pair kern="-91" kpx2="92"/><pair kern="-35" kpx2="121"/><pair kern="-91" kpx2="110"/><pair kern="-91" kpx2="174"/><pair kern="-17" kpx2="249"/></kerning><kerning kpx1="124"><pair kern="-72" kpx2="180"/><pair kern="-17" kpx2="17"/><pair kern="-63" kpx2="197"/><pair kern="18" kpx2="16"/><pair kern="-30" kpx2="91"/><pair kern="-35" kpx2="181"/></kerning><kerning kpx1="169"><pair kern="-17" kpx2="90"/><pair kern="-17" kpx2="100"/><pair kern="-17" kpx2="246"/><pair kern="-17" kpx2="235"/><pair kern="-17" kpx2="58"/><pair kern="-17" kpx2="186"/><pair kern="-54" kpx2="55"/><pair kern="-17" kpx2="251"/><pair kern="-72" kpx2="187"/><pair kern="-17" kpx2="39"/><pair kern="73" kpx2="144"/><pair kern="-17" kpx2="45"/><pair kern="-17" kpx2="92"/><pair kern="-17" kpx2="38"/><pair kern="-72" kpx2="60"/><pair kern="-17" kpx2="89"/><pair kern="-17" kpx2="253"/><pair kern="-54" kpx2="57"/><pair kern="-17" kpx2="37"/><pair kern="-17" kpx2="42"/><pair kern="-72" kpx2="234"/></kerning><kerning kpx1="201"><pair kern="-17" kpx2="246"/><pair kern="-67" kpx2="235"/><pair kern="-21" kpx2="16"/><pair kern="-17" kpx2="112"/><pair kern="-17" kpx2="123"/><pair kern="-17" kpx2="251"/><pair kern="-17" kpx2="113"/><pair kern="-77" kpx2="187"/><pair kern="-17" kpx2="208"/><pair kern="-35" kpx2="73"/><pair kern="-17" kpx2="124"/><pair kern="-35" kpx2="169"/><pair kern="-17" kpx2="252"/><pair kern="-17" kpx2="70"/><pair kern="-77" kpx2="60"/><pair kern="27" kpx2="201"/><pair kern="-17" kpx2="29"/><pair kern="-77" kpx2="234"/><pair kern="-17" kpx2="100"/><pair kern="-17" kpx2="122"/><pair kern="-17" kpx2="210"/><pair kern="-17" kpx2="82"/><pair kern="-54" kpx2="58"/><pair kern="-67" kpx2="186"/><pair kern="-17" kpx2="175"/><pair kern="-17" kpx2="209"/><pair kern="-17" kpx2="103"/><pair kern="27" kpx2="98"/><pair kern="-123" kpx2="181"/><pair kern="-17" kpx2="17"/><pair kern="-17" kpx2="38"/><pair kern="-17" kpx2="84"/><pair kern="-17" kpx2="121"/><pair kern="-63" kpx2="57"/><pair kern="-17" kpx2="254"/><pair kern="-17" kpx2="87"/><pair kern="-17" kpx2="72"/><pair kern="27" 
kpx2="199"/><pair kern="-17" kpx2="71"/><pair kern="-128" kpx2="180"/><pair kern="-17" kpx2="253"/><pair kern="-17" kpx2="52"/><pair kern="-17" kpx2="125"/><pair kern="-17" kpx2="42"/><pair kern="-17" kpx2="115"/><pair kern="-40" kpx2="90"/><pair kern="-17" kpx2="111"/><pair kern="27" kpx2="36"/><pair kern="-77" kpx2="55"/><pair kern="-17" kpx2="114"/><pair kern="-17" kpx2="50"/><pair kern="27" kpx2="173"/><pair kern="-67" kpx2="92"/><pair kern="22" kpx2="197"/><pair kern="-58" kpx2="89"/><pair kern="27" kpx2="174"/></kerning><kerning kpx1="60"><pair kern="-114" kpx2="126"/><pair kern="-137" kpx2="107"/><pair kern="-132" kpx2="72"/><pair kern="-77" kpx2="199"/><pair kern="-118" kpx2="16"/><pair kern="-132" kpx2="123"/><pair kern="-132" kpx2="112"/><pair kern="-54" kpx2="251"/><pair kern="-54" kpx2="208"/><pair kern="-132" kpx2="113"/><pair kern="-54" kpx2="180"/><pair kern="-137" kpx2="105"/><pair kern="-114" kpx2="129"/><pair kern="-132" kpx2="124"/><pair kern="-109" kpx2="169"/><pair kern="-77" kpx2="201"/><pair kern="-54" kpx2="253"/><pair kern="-137" kpx2="106"/><pair kern="-132" kpx2="29"/><pair kern="-132" kpx2="125"/><pair kern="-72" kpx2="170"/><pair kern="-132" kpx2="115"/><pair kern="-114" kpx2="88"/><pair kern="-132" kpx2="122"/><pair kern="-54" kpx2="100"/><pair kern="-137" kpx2="68"/><pair kern="-54" kpx2="210"/><pair kern="-77" kpx2="36"/><pair kern="-132" kpx2="82"/><pair kern="-132" kpx2="114"/><pair kern="-54" kpx2="175"/><pair kern="-114" kpx2="127"/><pair kern="-54" kpx2="50"/><pair kern="-54" kpx2="209"/><pair kern="-54" kpx2="103"/><pair kern="-137" kpx2="108"/><pair kern="-77" kpx2="98"/><pair kern="-35" kpx2="76"/><pair kern="-17" kpx2="181"/><pair kern="-202" kpx2="17"/><pair kern="-114" kpx2="128"/><pair kern="-77" kpx2="173"/><pair kern="-137" kpx2="109"/><pair kern="-128" kpx2="197"/><pair kern="-54" kpx2="38"/><pair kern="-132" kpx2="121"/><pair kern="-137" kpx2="110"/><pair kern="-77" kpx2="174"/></kerning><kerning kpx1="85"><pair kern="-21" kpx2="254"/><pair kern="-21" kpx2="72"/><pair kern="-63" kpx2="16"/><pair kern="-21" kpx2="112"/><pair kern="-21" kpx2="123"/><pair kern="-17" kpx2="80"/><pair kern="-21" kpx2="113"/><pair kern="-17" kpx2="71"/><pair kern="-21" kpx2="124"/><pair kern="-35" kpx2="169"/><pair kern="-21" kpx2="252"/><pair kern="-21" kpx2="70"/><pair kern="-17" kpx2="85"/><pair kern="-17" kpx2="29"/><pair kern="-21" kpx2="125"/><pair kern="-21" kpx2="115"/><pair kern="-21" kpx2="111"/><pair kern="-21" kpx2="122"/><pair kern="-21" kpx2="82"/><pair kern="-17" kpx2="75"/><pair kern="-21" kpx2="114"/><pair kern="-26" kpx2="91"/><pair kern="-17" kpx2="81"/><pair kern="41" kpx2="181"/><pair kern="-91" kpx2="17"/><pair kern="-151" kpx2="197"/><pair kern="-17" kpx2="74"/><pair kern="-17" kpx2="84"/><pair kern="-21" kpx2="121"/><pair kern="-17" kpx2="247"/><pair kern="-17" kpx2="120"/></kerning><kerning kpx1="61"><pair kern="-17" kpx2="180"/><pair kern="-17" kpx2="197"/><pair kern="-17" kpx2="16"/><pair kern="-17" kpx2="181"/></kerning><kerning kpx1="234"><pair kern="-114" kpx2="126"/><pair kern="-137" kpx2="107"/><pair kern="-132" kpx2="72"/><pair kern="-77" kpx2="199"/><pair kern="-118" kpx2="16"/><pair kern="-132" kpx2="123"/><pair kern="-132" kpx2="112"/><pair kern="-54" kpx2="251"/><pair kern="-54" kpx2="208"/><pair kern="-132" kpx2="113"/><pair kern="-54" kpx2="180"/><pair kern="-137" kpx2="105"/><pair kern="-114" kpx2="129"/><pair kern="-132" kpx2="124"/><pair kern="-109" kpx2="169"/><pair kern="-77" kpx2="201"/><pair kern="-54" kpx2="253"/><pair 
kern="-137" kpx2="106"/><pair kern="-132" kpx2="29"/><pair kern="-132" kpx2="125"/><pair kern="-72" kpx2="170"/><pair kern="-132" kpx2="115"/><pair kern="-114" kpx2="88"/><pair kern="-132" kpx2="122"/><pair kern="-54" kpx2="100"/><pair kern="-137" kpx2="68"/><pair kern="-54" kpx2="210"/><pair kern="-77" kpx2="36"/><pair kern="-132" kpx2="82"/><pair kern="-132" kpx2="114"/><pair kern="-54" kpx2="175"/><pair kern="-114" kpx2="127"/><pair kern="-54" kpx2="50"/><pair kern="-54" kpx2="209"/><pair kern="-54" kpx2="103"/><pair kern="-137" kpx2="108"/><pair kern="-77" kpx2="98"/><pair kern="-35" kpx2="76"/><pair kern="-17" kpx2="181"/><pair kern="-202" kpx2="17"/><pair kern="-114" kpx2="128"/><pair kern="-77" kpx2="173"/><pair kern="-137" kpx2="109"/><pair kern="-128" kpx2="197"/><pair kern="-54" kpx2="38"/><pair kern="-132" kpx2="121"/><pair kern="-137" kpx2="110"/><pair kern="-77" kpx2="174"/></kerning><kerning kpx1="100"><pair kern="-17" kpx2="169"/><pair kern="-17" kpx2="60"/><pair kern="-17" kpx2="187"/><pair kern="18" kpx2="181"/><pair kern="-17" kpx2="170"/><pair kern="-17" kpx2="234"/></kerning><kerning kpx1="122"><pair kern="-72" kpx2="180"/><pair kern="-17" kpx2="17"/><pair kern="-63" kpx2="197"/><pair kern="18" kpx2="16"/><pair kern="-30" kpx2="91"/><pair kern="-35" kpx2="181"/></kerning><kerning kpx1="47"><pair kern="-17" kpx2="126"/><pair kern="-91" kpx2="235"/><pair kern="-49" kpx2="104"/><pair kern="-17" kpx2="72"/><pair kern="22" kpx2="199"/><pair kern="-17" kpx2="16"/><pair kern="-17" kpx2="112"/><pair kern="-17" kpx2="123"/><pair kern="-49" kpx2="213"/><pair kern="-35" kpx2="208"/><pair kern="-132" kpx2="187"/><pair kern="-17" kpx2="113"/><pair kern="-202" kpx2="180"/><pair kern="-17" kpx2="129"/><pair kern="-17" kpx2="124"/><pair kern="22" kpx2="201"/><pair kern="-132" kpx2="60"/><pair kern="-49" kpx2="211"/><pair kern="-17" kpx2="125"/><pair kern="-17" kpx2="115"/><pair kern="-132" kpx2="234"/><pair kern="-17" kpx2="88"/><pair kern="-17" kpx2="122"/><pair kern="-35" kpx2="210"/><pair kern="22" kpx2="36"/><pair kern="-17" kpx2="82"/><pair kern="-91" kpx2="58"/><pair kern="-91" kpx2="186"/><pair kern="-137" kpx2="55"/><pair kern="-17" kpx2="114"/><pair kern="-35" kpx2="175"/><pair kern="-17" kpx2="127"/><pair kern="-35" kpx2="50"/><pair kern="-35" kpx2="209"/><pair kern="-35" kpx2="103"/><pair kern="22" kpx2="98"/><pair kern="-262" kpx2="181"/><pair kern="-17" kpx2="128"/><pair kern="22" kpx2="173"/><pair kern="-49" kpx2="212"/><pair kern="-91" kpx2="92"/><pair kern="-17" kpx2="121"/><pair kern="-109" kpx2="57"/><pair kern="22" kpx2="174"/><pair kern="-49" kpx2="56"/></kerning><kerning kpx1="210"><pair kern="-17" kpx2="36"/><pair kern="-17" kpx2="199"/><pair kern="27" kpx2="16"/><pair kern="-54" kpx2="187"/><pair kern="-17" kpx2="98"/><pair kern="-17" kpx2="181"/><pair kern="-63" kpx2="59"/><pair kern="-40" kpx2="17"/><pair kern="-21" kpx2="180"/><pair kern="-17" kpx2="173"/><pair kern="-17" kpx2="169"/><pair kern="-91" kpx2="197"/><pair kern="-17" kpx2="201"/><pair kern="-54" kpx2="60"/><pair kern="-17" kpx2="29"/><pair kern="-17" kpx2="57"/><pair kern="-17" kpx2="174"/><pair kern="-54" kpx2="234"/></kerning><kerning kpx1="58"><pair kern="-35" kpx2="126"/><pair kern="-63" kpx2="107"/><pair kern="-17" kpx2="235"/><pair kern="-58" kpx2="72"/><pair kern="-54" kpx2="199"/><pair kern="-40" kpx2="16"/><pair kern="-58" kpx2="112"/><pair kern="-58" kpx2="123"/><pair kern="-58" kpx2="113"/><pair kern="-17" kpx2="180"/><pair kern="-63" kpx2="105"/><pair kern="-35" kpx2="129"/><pair 
kern="-58" kpx2="124"/><pair kern="-54" kpx2="169"/><pair kern="-54" kpx2="201"/><pair kern="-44" kpx2="85"/><pair kern="-63" kpx2="106"/><pair kern="-58" kpx2="29"/><pair kern="-58" kpx2="125"/><pair kern="-17" kpx2="170"/><pair kern="-58" kpx2="115"/><pair kern="-35" kpx2="88"/><pair kern="-58" kpx2="122"/><pair kern="-63" kpx2="68"/><pair kern="-54" kpx2="36"/><pair kern="-58" kpx2="82"/><pair kern="-17" kpx2="186"/><pair kern="-58" kpx2="114"/><pair kern="-35" kpx2="127"/><pair kern="-63" kpx2="108"/><pair kern="-54" kpx2="98"/><pair kern="-21" kpx2="76"/><pair kern="-114" kpx2="17"/><pair kern="-35" kpx2="128"/><pair kern="-54" kpx2="173"/><pair kern="-63" kpx2="109"/><pair kern="-128" kpx2="197"/><pair kern="-17" kpx2="92"/><pair kern="-58" kpx2="121"/><pair kern="-63" kpx2="110"/><pair kern="-54" kpx2="174"/></kerning><kerning kpx1="82"><pair kern="-72" kpx2="180"/><pair kern="-17" kpx2="17"/><pair kern="-63" kpx2="197"/><pair kern="18" kpx2="16"/><pair kern="-30" kpx2="91"/><pair kern="-35" kpx2="181"/></kerning><kerning kpx1="186"><pair kern="-142" kpx2="17"/><pair kern="-17" kpx2="169"/><pair kern="-146" kpx2="197"/><pair kern="-17" kpx2="16"/><pair kern="-72" kpx2="29"/><pair kern="-17" kpx2="170"/></kerning><kerning kpx1="175"><pair kern="-17" kpx2="36"/><pair kern="-17" kpx2="199"/><pair kern="27" kpx2="16"/><pair kern="-54" kpx2="187"/><pair kern="-17" kpx2="98"/><pair kern="-17" kpx2="181"/><pair kern="-63" kpx2="59"/><pair kern="-40" kpx2="17"/><pair kern="-21" kpx2="180"/><pair kern="-17" kpx2="173"/><pair kern="-17" kpx2="169"/><pair kern="-91" kpx2="197"/><pair kern="-17" kpx2="201"/><pair kern="-54" kpx2="60"/><pair kern="-17" kpx2="29"/><pair kern="-17" kpx2="57"/><pair kern="-17" kpx2="174"/><pair kern="-54" kpx2="234"/></kerning><kerning kpx1="209"><pair kern="-17" kpx2="36"/><pair kern="-17" kpx2="199"/><pair kern="27" kpx2="16"/><pair kern="-54" kpx2="187"/><pair kern="-17" kpx2="98"/><pair kern="-17" kpx2="181"/><pair kern="-63" kpx2="59"/><pair kern="-40" kpx2="17"/><pair kern="-21" kpx2="180"/><pair kern="-17" kpx2="173"/><pair kern="-17" kpx2="169"/><pair kern="-91" kpx2="197"/><pair kern="-17" kpx2="201"/><pair kern="-54" kpx2="60"/><pair kern="-17" kpx2="29"/><pair kern="-17" kpx2="57"/><pair kern="-17" kpx2="174"/><pair kern="-54" kpx2="234"/></kerning><kerning kpx1="103"><pair kern="-17" kpx2="36"/><pair kern="-17" kpx2="199"/><pair kern="27" kpx2="16"/><pair kern="-54" kpx2="187"/><pair kern="-17" kpx2="98"/><pair kern="-17" kpx2="181"/><pair kern="-63" kpx2="59"/><pair kern="-40" kpx2="17"/><pair kern="-21" kpx2="180"/><pair kern="-17" kpx2="173"/><pair kern="-17" kpx2="169"/><pair kern="-91" kpx2="197"/><pair kern="-17" kpx2="201"/><pair kern="-54" kpx2="60"/><pair kern="-17" kpx2="29"/><pair kern="-17" kpx2="57"/><pair kern="-17" kpx2="174"/><pair kern="-54" kpx2="234"/></kerning><kerning kpx1="81"><pair kern="-72" kpx2="180"/><pair kern="-44" kpx2="197"/><pair kern="-54" kpx2="181"/></kerning><kerning kpx1="98"><pair kern="-17" kpx2="246"/><pair kern="-67" kpx2="235"/><pair kern="-21" kpx2="16"/><pair kern="-17" kpx2="112"/><pair kern="-17" kpx2="123"/><pair kern="-17" kpx2="251"/><pair kern="-17" kpx2="113"/><pair kern="-77" kpx2="187"/><pair kern="-17" kpx2="208"/><pair kern="-35" kpx2="73"/><pair kern="-17" kpx2="124"/><pair kern="-35" kpx2="169"/><pair kern="-17" kpx2="252"/><pair kern="-17" kpx2="70"/><pair kern="-77" kpx2="60"/><pair kern="27" kpx2="201"/><pair kern="-17" kpx2="29"/><pair kern="-77" kpx2="234"/><pair kern="-17" kpx2="100"/><pair 
kern="-17" kpx2="122"/><pair kern="-17" kpx2="210"/><pair kern="-17" kpx2="82"/><pair kern="-54" kpx2="58"/><pair kern="-67" kpx2="186"/><pair kern="-17" kpx2="175"/><pair kern="-17" kpx2="209"/><pair kern="-17" kpx2="103"/><pair kern="27" kpx2="98"/><pair kern="-123" kpx2="181"/><pair kern="-17" kpx2="17"/><pair kern="-17" kpx2="38"/><pair kern="-17" kpx2="84"/><pair kern="-17" kpx2="121"/><pair kern="-63" kpx2="57"/><pair kern="-17" kpx2="254"/><pair kern="-17" kpx2="87"/><pair kern="-17" kpx2="72"/><pair kern="27" kpx2="199"/><pair kern="-17" kpx2="71"/><pair kern="-128" kpx2="180"/><pair kern="-17" kpx2="253"/><pair kern="-17" kpx2="52"/><pair kern="-17" kpx2="125"/><pair kern="-17" kpx2="42"/><pair kern="-17" kpx2="115"/><pair kern="-40" kpx2="90"/><pair kern="-17" kpx2="111"/><pair kern="27" kpx2="36"/><pair kern="-77" kpx2="55"/><pair kern="-17" kpx2="114"/><pair kern="-17" kpx2="50"/><pair kern="27" kpx2="173"/><pair kern="-67" kpx2="92"/><pair kern="22" kpx2="197"/><pair kern="-58" kpx2="89"/><pair kern="27" kpx2="174"/></kerning><kerning kpx1="212"><pair kern="-17" kpx2="229"/><pair kern="-17" kpx2="61"/></kerning><kerning kpx1="229"><pair kern="-17" kpx2="180"/><pair kern="-17" kpx2="197"/><pair kern="-17" kpx2="16"/><pair kern="-17" kpx2="181"/></kerning><kerning kpx1="38"><pair kern="-17" kpx2="169"/><pair kern="-17" kpx2="60"/><pair kern="-17" kpx2="187"/><pair kern="18" kpx2="181"/><pair kern="-17" kpx2="170"/><pair kern="-17" kpx2="234"/></kerning><kerning kpx1="121"><pair kern="-72" kpx2="180"/><pair kern="-17" kpx2="17"/><pair kern="-63" kpx2="197"/><pair kern="18" kpx2="16"/><pair kern="-30" kpx2="91"/><pair kern="-35" kpx2="181"/></kerning><kerning kpx1="57"><pair kern="-67" kpx2="126"/><pair kern="-77" kpx2="107"/><pair kern="-26" kpx2="235"/><pair kern="-77" kpx2="72"/><pair kern="-63" kpx2="199"/><pair kern="-58" kpx2="16"/><pair kern="-77" kpx2="123"/><pair kern="-77" kpx2="112"/><pair kern="-17" kpx2="208"/><pair kern="-77" kpx2="113"/><pair kern="-77" kpx2="105"/><pair kern="-67" kpx2="129"/><pair kern="-77" kpx2="124"/><pair kern="-86" kpx2="169"/><pair kern="-63" kpx2="201"/><pair kern="-77" kpx2="106"/><pair kern="-81" kpx2="29"/><pair kern="-77" kpx2="125"/><pair kern="-54" kpx2="170"/><pair kern="-77" kpx2="115"/><pair kern="-67" kpx2="88"/><pair kern="-77" kpx2="122"/><pair kern="-77" kpx2="68"/><pair kern="-17" kpx2="210"/><pair kern="-63" kpx2="36"/><pair kern="-77" kpx2="82"/><pair kern="-26" kpx2="186"/><pair kern="-77" kpx2="114"/><pair kern="-17" kpx2="175"/><pair kern="-67" kpx2="127"/><pair kern="-17" kpx2="50"/><pair kern="-17" kpx2="209"/><pair kern="-17" kpx2="103"/><pair kern="-77" kpx2="108"/><pair kern="-63" kpx2="98"/><pair kern="-21" kpx2="76"/><pair kern="-128" kpx2="17"/><pair kern="-67" kpx2="128"/><pair kern="-63" kpx2="173"/><pair kern="-77" kpx2="109"/><pair kern="-137" kpx2="197"/><pair kern="-26" kpx2="92"/><pair kern="-77" kpx2="121"/><pair kern="-77" kpx2="110"/><pair kern="-63" kpx2="174"/></kerning><kerning kpx1="37"><pair kern="-17" kpx2="227"/><pair kern="-17" kpx2="246"/><pair kern="-17" kpx2="251"/><pair kern="-54" kpx2="187"/><pair kern="-17" kpx2="208"/><pair kern="-17" kpx2="54"/><pair kern="-54" kpx2="180"/><pair kern="-30" kpx2="169"/><pair kern="-54" kpx2="60"/><pair kern="-17" kpx2="253"/><pair kern="-17" kpx2="42"/><pair kern="-17" kpx2="170"/><pair kern="-54" kpx2="234"/><pair kern="-17" kpx2="100"/><pair kern="-17" kpx2="210"/><pair kern="-35" kpx2="58"/><pair kern="-17" kpx2="175"/><pair kern="-17" kpx2="50"/><pair 
kern="-17" kpx2="209"/><pair kern="-17" kpx2="103"/><pair kern="-54" kpx2="181"/><pair kern="-40" kpx2="197"/><pair kern="-17" kpx2="38"/><pair kern="-30" kpx2="57"/><pair kern="-17" kpx2="249"/></kerning><kerning kpx1="120"><pair kern="-72" kpx2="180"/><pair kern="-44" kpx2="197"/><pair kern="-54" kpx2="181"/></kerning><kerning kpx1="249"><pair kern="18" kpx2="173"/><pair kern="18" kpx2="36"/><pair kern="18" kpx2="201"/><pair kern="18" kpx2="199"/><pair kern="18" kpx2="174"/><pair kern="18" kpx2="98"/></kerning><kerning kpx1="227"><pair kern="18" kpx2="173"/><pair kern="18" kpx2="36"/><pair kern="18" kpx2="201"/><pair kern="18" kpx2="199"/><pair kern="18" kpx2="174"/><pair kern="18" kpx2="98"/></kerning><kerning kpx1="51"><pair kern="-17" kpx2="126"/><pair kern="-44" kpx2="107"/><pair kern="-35" kpx2="72"/><pair kern="-63" kpx2="199"/><pair kern="-21" kpx2="16"/><pair kern="-35" kpx2="123"/><pair kern="-35" kpx2="112"/><pair kern="-21" kpx2="187"/><pair kern="-35" kpx2="113"/><pair kern="-17" kpx2="86"/><pair kern="18" kpx2="180"/><pair kern="-44" kpx2="105"/><pair kern="-17" kpx2="129"/><pair kern="-35" kpx2="124"/><pair kern="-17" kpx2="169"/><pair kern="-63" kpx2="201"/><pair kern="-17" kpx2="85"/><pair kern="-21" kpx2="60"/><pair kern="-44" kpx2="106"/><pair kern="-35" kpx2="125"/><pair kern="-35" kpx2="115"/><pair kern="-21" kpx2="234"/><pair kern="-17" kpx2="88"/><pair kern="-35" kpx2="122"/><pair kern="-44" kpx2="68"/><pair kern="-63" kpx2="36"/><pair kern="-35" kpx2="82"/><pair kern="-35" kpx2="114"/><pair kern="-17" kpx2="250"/><pair kern="-17" kpx2="127"/><pair kern="-44" kpx2="108"/><pair kern="-63" kpx2="98"/><pair kern="-17" kpx2="81"/><pair kern="-21" kpx2="76"/><pair kern="18" kpx2="181"/><pair kern="-155" kpx2="17"/><pair kern="-17" kpx2="128"/><pair kern="-63" kpx2="173"/><pair kern="-44" kpx2="109"/><pair kern="-160" kpx2="197"/><pair kern="-35" kpx2="121"/><pair kern="-17" kpx2="228"/><pair kern="-44" kpx2="110"/><pair kern="-63" kpx2="174"/><pair kern="-17" kpx2="120"/></kerning><kerning kpx1="104"><pair kern="-17" kpx2="229"/><pair kern="-17" kpx2="61"/></kerning><kerning kpx1="72"><pair kern="-17" kpx2="91"/></kerning><kerning kpx1="199"><pair kern="-17" kpx2="246"/><pair kern="-67" kpx2="235"/><pair kern="-21" kpx2="16"/><pair kern="-17" kpx2="112"/><pair kern="-17" kpx2="123"/><pair kern="-17" kpx2="251"/><pair kern="-17" kpx2="113"/><pair kern="-77" kpx2="187"/><pair kern="-17" kpx2="208"/><pair kern="-35" kpx2="73"/><pair kern="-17" kpx2="124"/><pair kern="-35" kpx2="169"/><pair kern="-17" kpx2="252"/><pair kern="-17" kpx2="70"/><pair kern="-77" kpx2="60"/><pair kern="27" kpx2="201"/><pair kern="-17" kpx2="29"/><pair kern="-77" kpx2="234"/><pair kern="-17" kpx2="100"/><pair kern="-17" kpx2="122"/><pair kern="-17" kpx2="210"/><pair kern="-17" kpx2="82"/><pair kern="-54" kpx2="58"/><pair kern="-67" kpx2="186"/><pair kern="-17" kpx2="175"/><pair kern="-17" kpx2="209"/><pair kern="-17" kpx2="103"/><pair kern="27" kpx2="98"/><pair kern="-123" kpx2="181"/><pair kern="-17" kpx2="17"/><pair kern="-17" kpx2="38"/><pair kern="-17" kpx2="84"/><pair kern="-17" kpx2="121"/><pair kern="-63" kpx2="57"/><pair kern="-17" kpx2="254"/><pair kern="-17" kpx2="87"/><pair kern="-17" kpx2="72"/><pair kern="27" kpx2="199"/><pair kern="-17" kpx2="71"/><pair kern="-128" kpx2="180"/><pair kern="-17" kpx2="253"/><pair kern="-17" kpx2="52"/><pair kern="-17" kpx2="125"/><pair kern="-17" kpx2="42"/><pair kern="-17" kpx2="115"/><pair kern="-40" kpx2="90"/><pair kern="-17" kpx2="111"/><pair 
kern="27" kpx2="36"/><pair kern="-77" kpx2="55"/><pair kern="-17" kpx2="114"/><pair kern="-17" kpx2="50"/><pair kern="27" kpx2="173"/><pair kern="-67" kpx2="92"/><pair kern="22" kpx2="197"/><pair kern="-58" kpx2="89"/><pair kern="27" kpx2="174"/></kerning><kerning kpx1="54"><pair kern="18" kpx2="173"/><pair kern="18" kpx2="36"/><pair kern="18" kpx2="201"/><pair kern="18" kpx2="199"/><pair kern="18" kpx2="174"/><pair kern="18" kpx2="98"/></kerning><kerning kpx1="180"><pair kern="-35" kpx2="235"/><pair kern="-35" kpx2="246"/><pair kern="-30" kpx2="43"/><pair kern="-72" kpx2="123"/><pair kern="-35" kpx2="251"/><pair kern="-35" kpx2="208"/><pair kern="-188" kpx2="144"/><pair kern="-58" kpx2="59"/><pair kern="-35" kpx2="73"/><pair kern="-30" kpx2="41"/><pair kern="-72" kpx2="124"/><pair kern="-54" kpx2="85"/><pair kern="-128" kpx2="201"/><pair kern="-17" kpx2="61"/><pair kern="-35" kpx2="100"/><pair kern="-72" kpx2="122"/><pair kern="-30" kpx2="47"/><pair kern="-35" kpx2="210"/><pair kern="-72" kpx2="82"/><pair kern="-35" kpx2="186"/><pair kern="-35" kpx2="175"/><pair kern="-35" kpx2="209"/><pair kern="-35" kpx2="103"/><pair kern="-128" kpx2="98"/><pair kern="-54" kpx2="81"/><pair kern="-17" kpx2="229"/><pair kern="-35" kpx2="38"/><pair kern="-72" kpx2="121"/><pair kern="-30" kpx2="37"/><pair kern="-54" kpx2="120"/><pair kern="-30" kpx2="51"/><pair kern="-128" kpx2="199"/><pair kern="-30" kpx2="53"/><pair kern="-30" kpx2="137"/><pair kern="-35" kpx2="233"/><pair kern="-35" kpx2="253"/><pair kern="-35" kpx2="52"/><pair kern="-72" kpx2="125"/><pair kern="-35" kpx2="42"/><pair kern="-35" kpx2="90"/><pair kern="-128" kpx2="36"/><pair kern="-35" kpx2="50"/><pair kern="-30" kpx2="39"/><pair kern="-30" kpx2="236"/><pair kern="-30" kpx2="45"/><pair kern="-128" kpx2="173"/><pair kern="-35" kpx2="92"/><pair kern="-35" kpx2="89"/><pair kern="-30" kpx2="46"/><pair kern="-128" kpx2="174"/></kerning><kerning kpx1="53"><pair kern="-21" kpx2="107"/><pair kern="-54" kpx2="235"/><pair kern="-40" kpx2="16"/><pair kern="-44" kpx2="112"/><pair kern="-44" kpx2="123"/><pair kern="-49" kpx2="251"/><pair kern="-44" kpx2="113"/><pair kern="-63" kpx2="187"/><pair kern="-44" kpx2="129"/><pair kern="-44" kpx2="124"/><pair kern="-54" kpx2="169"/><pair kern="-63" kpx2="60"/><pair kern="-40" kpx2="201"/><pair kern="-21" kpx2="106"/><pair kern="-30" kpx2="29"/><pair kern="-63" kpx2="234"/><pair kern="-49" kpx2="100"/><pair kern="-44" kpx2="122"/><pair kern="-21" kpx2="68"/><pair kern="-40" kpx2="58"/><pair kern="-44" kpx2="82"/><pair kern="-54" kpx2="186"/><pair kern="-40" kpx2="98"/><pair kern="-63" kpx2="181"/><pair kern="-35" kpx2="17"/><pair kern="-49" kpx2="38"/><pair kern="-44" kpx2="121"/><pair kern="-54" kpx2="57"/><pair kern="-44" kpx2="126"/><pair kern="-44" kpx2="72"/><pair kern="-40" kpx2="199"/><pair kern="-72" kpx2="180"/><pair kern="-21" kpx2="105"/><pair kern="-49" kpx2="253"/><pair kern="-44" kpx2="125"/><pair kern="-44" kpx2="115"/><pair kern="-17" kpx2="170"/><pair kern="-44" kpx2="88"/><pair kern="-40" kpx2="36"/><pair kern="-44" kpx2="114"/><pair kern="-72" kpx2="55"/><pair kern="-44" kpx2="127"/><pair kern="-21" kpx2="108"/><pair kern="-44" kpx2="128"/><pair kern="-40" kpx2="173"/><pair kern="-21" kpx2="109"/><pair kern="-54" kpx2="92"/><pair kern="-17" kpx2="197"/><pair kern="-21" kpx2="110"/><pair kern="-40" kpx2="174"/></kerning><kerning kpx1="137"><pair kern="-54" kpx2="180"/><pair kern="-40" kpx2="197"/><pair kern="18" kpx2="16"/><pair kern="-54" kpx2="181"/></kerning><kerning kpx1="233"><pair 
kern="-44" kpx2="180"/><pair kern="-35" kpx2="197"/><pair kern="-54" kpx2="181"/></kerning><kerning kpx1="253"><pair kern="-17" kpx2="169"/><pair kern="-17" kpx2="60"/><pair kern="-17" kpx2="187"/><pair kern="18" kpx2="181"/><pair kern="-17" kpx2="170"/><pair kern="-17" kpx2="234"/></kerning><kerning kpx1="211"><pair kern="-17" kpx2="229"/><pair kern="-17" kpx2="61"/></kerning><kerning kpx1="78"><pair kern="-17" kpx2="107"/><pair kern="-30" kpx2="126"/><pair kern="-35" kpx2="235"/><pair kern="-35" kpx2="72"/><pair kern="-35" kpx2="112"/><pair kern="-35" kpx2="123"/><pair kern="-35" kpx2="113"/><pair kern="-17" kpx2="105"/><pair kern="-30" kpx2="129"/><pair kern="-35" kpx2="124"/><pair kern="-17" kpx2="106"/><pair kern="-35" kpx2="125"/><pair kern="-35" kpx2="115"/><pair kern="-30" kpx2="88"/><pair kern="-35" kpx2="122"/><pair kern="-17" kpx2="68"/><pair kern="-35" kpx2="82"/><pair kern="-35" kpx2="114"/><pair kern="-35" kpx2="186"/><pair kern="-30" kpx2="127"/><pair kern="-17" kpx2="108"/><pair kern="-30" kpx2="128"/><pair kern="-17" kpx2="109"/><pair kern="-35" kpx2="92"/><pair kern="-35" kpx2="121"/><pair kern="-17" kpx2="110"/></kerning><kerning kpx1="52"><pair kern="-21" kpx2="180"/><pair kern="-63" kpx2="197"/><pair kern="27" kpx2="16"/><pair kern="-17" kpx2="181"/></kerning><kerning kpx1="125"><pair kern="-72" kpx2="180"/><pair kern="-17" kpx2="17"/><pair kern="-63" kpx2="197"/><pair kern="18" kpx2="16"/><pair kern="-30" kpx2="91"/><pair kern="-35" kpx2="181"/></kerning><kerning kpx1="42"><pair kern="-21" kpx2="180"/><pair kern="-17" kpx2="169"/><pair kern="-26" kpx2="197"/><pair kern="-35" kpx2="55"/><pair kern="-49" kpx2="60"/><pair kern="-49" kpx2="187"/><pair kern="-21" kpx2="181"/><pair kern="-17" kpx2="170"/><pair kern="-49" kpx2="234"/></kerning><kerning kpx1="170"><pair kern="-17" kpx2="235"/><pair kern="-35" kpx2="199"/><pair kern="-17" kpx2="251"/><pair kern="-109" kpx2="187"/><pair kern="-17" kpx2="208"/><pair kern="-54" kpx2="59"/><pair kern="-109" kpx2="60"/><pair kern="-35" kpx2="201"/><pair kern="-17" kpx2="253"/><pair kern="-109" kpx2="234"/><pair kern="-17" kpx2="90"/><pair kern="-17" kpx2="100"/><pair kern="-17" kpx2="210"/><pair kern="-35" kpx2="36"/><pair kern="-54" kpx2="58"/><pair kern="-91" kpx2="55"/><pair kern="-17" kpx2="186"/><pair kern="-17" kpx2="175"/><pair kern="-17" kpx2="50"/><pair kern="-17" kpx2="209"/><pair kern="-17" kpx2="103"/><pair kern="-17" kpx2="39"/><pair kern="-35" kpx2="98"/><pair kern="-17" kpx2="45"/><pair kern="-35" kpx2="173"/><pair kern="-17" kpx2="92"/><pair kern="-17" kpx2="38"/><pair kern="-17" kpx2="89"/><pair kern="-86" kpx2="57"/><pair kern="-35" kpx2="37"/><pair kern="-35" kpx2="174"/></kerning><kerning kpx1="115"><pair kern="-17" kpx2="91"/></kerning><kerning kpx1="90"><pair kern="-91" kpx2="17"/><pair kern="-17" kpx2="169"/><pair kern="-104" kpx2="197"/><pair kern="-54" kpx2="29"/><pair kern="-17" kpx2="170"/></kerning><kerning kpx1="36"><pair kern="-17" kpx2="246"/><pair kern="-67" kpx2="235"/><pair kern="-21" kpx2="16"/><pair kern="-17" kpx2="112"/><pair kern="-17" kpx2="123"/><pair kern="-17" kpx2="251"/><pair kern="-17" kpx2="113"/><pair kern="-77" kpx2="187"/><pair kern="-17" kpx2="208"/><pair kern="-35" kpx2="73"/><pair kern="-17" kpx2="124"/><pair kern="-35" kpx2="169"/><pair kern="-17" kpx2="252"/><pair kern="-17" kpx2="70"/><pair kern="-77" kpx2="60"/><pair kern="27" kpx2="201"/><pair kern="-17" kpx2="29"/><pair kern="-77" kpx2="234"/><pair kern="-17" kpx2="100"/><pair kern="-17" kpx2="122"/><pair kern="-17" 
kpx2="210"/><pair kern="-17" kpx2="82"/><pair kern="-54" kpx2="58"/><pair kern="-67" kpx2="186"/><pair kern="-17" kpx2="175"/><pair kern="-17" kpx2="209"/><pair kern="-17" kpx2="103"/><pair kern="27" kpx2="98"/><pair kern="-123" kpx2="181"/><pair kern="-17" kpx2="17"/><pair kern="-17" kpx2="38"/><pair kern="-17" kpx2="84"/><pair kern="-17" kpx2="121"/><pair kern="-63" kpx2="57"/><pair kern="-17" kpx2="254"/><pair kern="-17" kpx2="87"/><pair kern="-17" kpx2="72"/><pair kern="27" kpx2="199"/><pair kern="-17" kpx2="71"/><pair kern="-128" kpx2="180"/><pair kern="-17" kpx2="253"/><pair kern="-17" kpx2="52"/><pair kern="-17" kpx2="125"/><pair kern="-17" kpx2="42"/><pair kern="-17" kpx2="115"/><pair kern="-40" kpx2="90"/><pair kern="-17" kpx2="111"/><pair kern="27" kpx2="36"/><pair kern="-77" kpx2="55"/><pair kern="-17" kpx2="114"/><pair kern="-17" kpx2="50"/><pair kern="27" kpx2="173"/><pair kern="-67" kpx2="92"/><pair kern="22" kpx2="197"/><pair kern="-58" kpx2="89"/><pair kern="27" kpx2="174"/></kerning><kerning kpx1="55"><pair kern="-165" kpx2="107"/><pair kern="-155" kpx2="235"/><pair kern="-91" kpx2="16"/><pair kern="-169" kpx2="112"/><pair kern="-169" kpx2="123"/><pair kern="-58" kpx2="251"/><pair kern="-169" kpx2="113"/><pair kern="-165" kpx2="86"/><pair kern="-151" kpx2="129"/><pair kern="-169" kpx2="124"/><pair kern="-91" kpx2="169"/><pair kern="-169" kpx2="252"/><pair kern="-169" kpx2="70"/><pair kern="-146" kpx2="85"/><pair kern="-77" kpx2="201"/><pair kern="-165" kpx2="106"/><pair kern="-109" kpx2="29"/><pair kern="-58" kpx2="100"/><pair kern="-169" kpx2="122"/><pair kern="-165" kpx2="68"/><pair kern="-169" kpx2="82"/><pair kern="-155" kpx2="186"/><pair kern="-165" kpx2="250"/><pair kern="-77" kpx2="98"/><pair kern="-21" kpx2="181"/><pair kern="-118" kpx2="17"/><pair kern="-58" kpx2="38"/><pair kern="-169" kpx2="121"/><pair kern="-165" kpx2="228"/><pair kern="-169" kpx2="254"/><pair kern="-151" kpx2="126"/><pair kern="-169" kpx2="72"/><pair kern="-77" kpx2="199"/><pair kern="-165" kpx2="105"/><pair kern="-58" kpx2="253"/><pair kern="-169" kpx2="125"/><pair kern="-169" kpx2="115"/><pair kern="-54" kpx2="170"/><pair kern="-151" kpx2="88"/><pair kern="-169" kpx2="111"/><pair kern="-165" kpx2="90"/><pair kern="-77" kpx2="36"/><pair kern="-17" kpx2="55"/><pair kern="-169" kpx2="114"/><pair kern="-151" kpx2="127"/><pair kern="-165" kpx2="108"/><pair kern="-30" kpx2="76"/><pair kern="-151" kpx2="128"/><pair kern="-77" kpx2="173"/><pair kern="-165" kpx2="109"/><pair kern="-155" kpx2="92"/><pair kern="-128" kpx2="197"/><pair kern="-165" kpx2="110"/><pair kern="-77" kpx2="174"/></kerning><kerning kpx1="114"><pair kern="-17" kpx2="91"/></kerning><kerning kpx1="50"><pair kern="-17" kpx2="36"/><pair kern="-17" kpx2="199"/><pair kern="27" kpx2="16"/><pair kern="-54" kpx2="187"/><pair kern="-17" kpx2="98"/><pair kern="-17" kpx2="181"/><pair kern="-63" kpx2="59"/><pair kern="-40" kpx2="17"/><pair kern="-21" kpx2="180"/><pair kern="-17" kpx2="173"/><pair kern="-17" kpx2="169"/><pair kern="-91" kpx2="197"/><pair kern="-17" kpx2="201"/><pair kern="-54" kpx2="60"/><pair kern="-17" kpx2="29"/><pair kern="-17" kpx2="57"/><pair kern="-17" kpx2="174"/><pair kern="-54" kpx2="234"/></kerning><kerning kpx1="91"><pair kern="-17" kpx2="254"/><pair kern="-17" kpx2="111"/><pair kern="-30" kpx2="122"/><pair kern="-30" kpx2="82"/><pair kern="-30" kpx2="114"/><pair kern="-30" kpx2="72"/><pair kern="-30" kpx2="112"/><pair kern="-30" kpx2="123"/><pair kern="-30" kpx2="113"/><pair kern="-30" kpx2="124"/><pair kern="-17" 
kpx2="252"/><pair kern="-17" kpx2="70"/><pair kern="-30" kpx2="121"/><pair kern="-30" kpx2="125"/><pair kern="-30" kpx2="115"/></kerning><kerning kpx1="39"><pair kern="-17" kpx2="36"/><pair kern="-17" kpx2="199"/><pair kern="-17" kpx2="98"/><pair kern="-54" kpx2="187"/><pair kern="-26" kpx2="181"/><pair kern="-21" kpx2="180"/><pair kern="-17" kpx2="173"/><pair kern="-17" kpx2="169"/><pair kern="-91" kpx2="197"/><pair kern="-17" kpx2="201"/><pair kern="-54" kpx2="60"/><pair kern="-17" kpx2="57"/><pair kern="-17" kpx2="174"/><pair kern="-17" kpx2="170"/><pair kern="-54" kpx2="234"/></kerning><kerning kpx1="236"><pair kern="-17" kpx2="180"/><pair kern="-72" kpx2="17"/><pair kern="-91" kpx2="197"/><pair kern="-35" kpx2="29"/></kerning><kerning kpx1="45"><pair kern="-35" kpx2="180"/><pair kern="-17" kpx2="173"/><pair kern="-17" kpx2="36"/><pair kern="-17" kpx2="169"/><pair kern="-54" kpx2="197"/><pair kern="-17" kpx2="201"/><pair kern="-17" kpx2="199"/><pair kern="-35" kpx2="16"/><pair kern="-17" kpx2="174"/><pair kern="-17" kpx2="98"/><pair kern="-30" kpx2="181"/><pair kern="-17" kpx2="170"/></kerning><kerning kpx1="173"><pair kern="-17" kpx2="246"/><pair kern="-67" kpx2="235"/><pair kern="-21" kpx2="16"/><pair kern="-17" kpx2="112"/><pair kern="-17" kpx2="123"/><pair kern="-17" kpx2="251"/><pair kern="-17" kpx2="113"/><pair kern="-77" kpx2="187"/><pair kern="-17" kpx2="208"/><pair kern="-35" kpx2="73"/><pair kern="-17" kpx2="124"/><pair kern="-35" kpx2="169"/><pair kern="-17" kpx2="252"/><pair kern="-17" kpx2="70"/><pair kern="-77" kpx2="60"/><pair kern="27" kpx2="201"/><pair kern="-17" kpx2="29"/><pair kern="-77" kpx2="234"/><pair kern="-17" kpx2="100"/><pair kern="-17" kpx2="122"/><pair kern="-17" kpx2="210"/><pair kern="-17" kpx2="82"/><pair kern="-54" kpx2="58"/><pair kern="-67" kpx2="186"/><pair kern="-17" kpx2="175"/><pair kern="-17" kpx2="209"/><pair kern="-17" kpx2="103"/><pair kern="27" kpx2="98"/><pair kern="-123" kpx2="181"/><pair kern="-17" kpx2="17"/><pair kern="-17" kpx2="38"/><pair kern="-17" kpx2="84"/><pair kern="-17" kpx2="121"/><pair kern="-63" kpx2="57"/><pair kern="-17" kpx2="254"/><pair kern="-17" kpx2="87"/><pair kern="-17" kpx2="72"/><pair kern="27" kpx2="199"/><pair kern="-17" kpx2="71"/><pair kern="-128" kpx2="180"/><pair kern="-17" kpx2="253"/><pair kern="-17" kpx2="52"/><pair kern="-17" kpx2="125"/><pair kern="-17" kpx2="42"/><pair kern="-17" kpx2="115"/><pair kern="-40" kpx2="90"/><pair kern="-17" kpx2="111"/><pair kern="27" kpx2="36"/><pair kern="-77" kpx2="55"/><pair kern="-17" kpx2="114"/><pair kern="-17" kpx2="50"/><pair kern="27" kpx2="173"/><pair kern="-67" kpx2="92"/><pair kern="22" kpx2="197"/><pair kern="-58" kpx2="89"/><pair kern="27" kpx2="174"/></kerning><kerning kpx1="197"><pair kern="-35" kpx2="246"/><pair kern="-54" kpx2="235"/><pair kern="-35" kpx2="43"/><pair kern="-35" kpx2="123"/><pair kern="-54" kpx2="251"/><pair kern="-183" kpx2="187"/><pair kern="-54" kpx2="208"/><pair kern="18" kpx2="144"/><pair kern="-35" kpx2="59"/><pair kern="-17" kpx2="73"/><pair kern="-35" kpx2="41"/><pair kern="-35" kpx2="124"/><pair kern="-35" kpx2="85"/><pair kern="-183" kpx2="60"/><pair kern="18" kpx2="201"/><pair kern="-183" kpx2="234"/><pair kern="-54" kpx2="100"/><pair kern="-35" kpx2="122"/><pair kern="-35" kpx2="47"/><pair kern="-54" kpx2="210"/><pair kern="-35" kpx2="82"/><pair kern="-123" kpx2="58"/><pair kern="-54" kpx2="186"/><pair kern="-54" kpx2="175"/><pair kern="-54" kpx2="209"/><pair kern="-54" kpx2="103"/><pair kern="-35" kpx2="81"/><pair kern="18" 
kpx2="98"/><pair kern="-54" kpx2="38"/><pair kern="-35" kpx2="121"/><pair kern="-183" kpx2="57"/><pair kern="-35" kpx2="37"/><pair kern="-35" kpx2="120"/><pair kern="-35" kpx2="51"/><pair kern="18" kpx2="199"/><pair kern="-35" kpx2="53"/><pair kern="-35" kpx2="137"/><pair kern="-35" kpx2="233"/><pair kern="-54" kpx2="253"/><pair kern="-54" kpx2="52"/><pair kern="-35" kpx2="125"/><pair kern="-35" kpx2="42"/><pair kern="-95" kpx2="90"/><pair kern="18" kpx2="36"/><pair kern="-137" kpx2="55"/><pair kern="-54" kpx2="50"/><pair kern="-35" kpx2="39"/><pair kern="-35" kpx2="236"/><pair kern="22" kpx2="45"/><pair kern="18" kpx2="173"/><pair kern="-54" kpx2="92"/><pair kern="-114" kpx2="89"/><pair kern="-35" kpx2="46"/><pair kern="18" kpx2="174"/></kerning><kerning kpx1="92"><pair kern="-142" kpx2="17"/><pair kern="-17" kpx2="169"/><pair kern="-146" kpx2="197"/><pair kern="-17" kpx2="16"/><pair kern="-72" kpx2="29"/><pair kern="-17" kpx2="170"/></kerning><kerning kpx1="89"><pair kern="-77" kpx2="17"/><pair kern="-17" kpx2="169"/><pair kern="-132" kpx2="197"/><pair kern="-26" kpx2="16"/><pair kern="-54" kpx2="29"/><pair kern="-17" kpx2="181"/><pair kern="-17" kpx2="170"/></kerning><kerning kpx1="46"><pair kern="-17" kpx2="107"/><pair kern="-72" kpx2="235"/><pair kern="-104" kpx2="16"/><pair kern="-49" kpx2="112"/><pair kern="-49" kpx2="123"/><pair kern="-54" kpx2="251"/><pair kern="-26" kpx2="213"/><pair kern="-49" kpx2="113"/><pair kern="-35" kpx2="187"/><pair kern="-54" kpx2="208"/><pair kern="-49" kpx2="129"/><pair kern="-49" kpx2="124"/><pair kern="-63" kpx2="169"/><pair kern="-35" kpx2="60"/><pair kern="-17" kpx2="201"/><pair kern="-17" kpx2="106"/><pair kern="-35" kpx2="234"/><pair kern="-54" kpx2="100"/><pair kern="-49" kpx2="122"/><pair kern="-17" kpx2="68"/><pair kern="-54" kpx2="210"/><pair kern="-35" kpx2="58"/><pair kern="-49" kpx2="82"/><pair kern="-72" kpx2="186"/><pair kern="-54" kpx2="175"/><pair kern="-54" kpx2="209"/><pair kern="-54" kpx2="103"/><pair kern="-17" kpx2="98"/><pair kern="-30" kpx2="181"/><pair kern="-26" kpx2="212"/><pair kern="-54" kpx2="38"/><pair kern="-49" kpx2="121"/><pair kern="-49" kpx2="126"/><pair kern="-26" kpx2="104"/><pair kern="-49" kpx2="72"/><pair kern="-17" kpx2="199"/><pair kern="-30" kpx2="180"/><pair kern="-17" kpx2="105"/><pair kern="-54" kpx2="253"/><pair kern="-26" kpx2="211"/><pair kern="-49" kpx2="125"/><pair kern="-49" kpx2="115"/><pair kern="-49" kpx2="88"/><pair kern="-17" kpx2="36"/><pair kern="-77" kpx2="55"/><pair kern="-49" kpx2="114"/><pair kern="-54" kpx2="50"/><pair kern="-49" kpx2="127"/><pair kern="-17" kpx2="108"/><pair kern="-49" kpx2="128"/><pair kern="-17" kpx2="173"/><pair kern="-17" kpx2="109"/><pair kern="-72" kpx2="92"/><pair kern="-17" kpx2="110"/><pair kern="-17" kpx2="174"/><pair kern="-26" kpx2="56"/></kerning><kerning kpx1="174"><pair kern="-17" kpx2="246"/><pair kern="-67" kpx2="235"/><pair kern="-21" kpx2="16"/><pair kern="-17" kpx2="112"/><pair kern="-17" kpx2="123"/><pair kern="-17" kpx2="251"/><pair kern="-17" kpx2="113"/><pair kern="-77" kpx2="187"/><pair kern="-17" kpx2="208"/><pair kern="-35" kpx2="73"/><pair kern="-17" kpx2="124"/><pair kern="-35" kpx2="169"/><pair kern="-17" kpx2="252"/><pair kern="-17" kpx2="70"/><pair kern="-77" kpx2="60"/><pair kern="27" kpx2="201"/><pair kern="-17" kpx2="29"/><pair kern="-77" kpx2="234"/><pair kern="-17" kpx2="100"/><pair kern="-17" kpx2="122"/><pair kern="-17" kpx2="210"/><pair kern="-17" kpx2="82"/><pair kern="-54" kpx2="58"/><pair kern="-67" kpx2="186"/><pair kern="-17" 
kpx2="175"/><pair kern="-17" kpx2="209"/><pair kern="-17" kpx2="103"/><pair kern="27" kpx2="98"/><pair kern="-123" kpx2="181"/><pair kern="-17" kpx2="17"/><pair kern="-17" kpx2="38"/><pair kern="-17" kpx2="84"/><pair kern="-17" kpx2="121"/><pair kern="-63" kpx2="57"/><pair kern="-17" kpx2="254"/><pair kern="-17" kpx2="87"/><pair kern="-17" kpx2="72"/><pair kern="27" kpx2="199"/><pair kern="-17" kpx2="71"/><pair kern="-128" kpx2="180"/><pair kern="-17" kpx2="253"/><pair kern="-17" kpx2="52"/><pair kern="-17" kpx2="125"/><pair kern="-17" kpx2="42"/><pair kern="-17" kpx2="115"/><pair kern="-40" kpx2="90"/><pair kern="-17" kpx2="111"/><pair kern="27" kpx2="36"/><pair kern="-77" kpx2="55"/><pair kern="-17" kpx2="114"/><pair kern="-17" kpx2="50"/><pair kern="27" kpx2="173"/><pair kern="-67" kpx2="92"/><pair kern="22" kpx2="197"/><pair kern="-58" kpx2="89"/><pair kern="27" kpx2="174"/></kerning><kerning kpx1="56"><pair kern="-17" kpx2="229"/><pair kern="-17" kpx2="61"/></kerning></font-metrics>
\ No newline at end of file
diff --git a/bitbake/doc/template/VeraMoBd.ttf b/bitbake/doc/template/VeraMoBd.ttf
new file mode 100644
index 0000000..9be6547
--- /dev/null
+++ b/bitbake/doc/template/VeraMoBd.ttf
Binary files differ
diff --git a/bitbake/doc/template/VeraMoBd.xml b/bitbake/doc/template/VeraMoBd.xml
new file mode 100644
index 0000000..9b33107
--- /dev/null
+++ b/bitbake/doc/template/VeraMoBd.xml
@@ -0,0 +1 @@
+<?xml version="1.0" encoding="UTF-8"?><font-metrics metrics-version="2" type="TYPE0"><font-name>BitstreamVeraSansMono-Bold</font-name><full-name>Bitstream Vera Sans Mono Bold</full-name><family-name>Bitstream Vera Sans Mono</family-name><embed/><cap-height>729</cap-height><x-height>546</x-height><ascender>759</ascender><descender>-240</descender><bbox><left>-19</left><bottom>-235</bottom><right>605</right><top>928</top></bbox><flags>34</flags><stemv>0</stemv><italicangle>0</italicangle><subtype>TYPE0</subtype><multibyte-extras><cid-type>CIDFontType2</cid-type><default-width>0</default-width><bfranges><bf gi="3" ue="126" us="32"/><bf gi="172" ue="160" us="160"/><bf gi="163" ue="161" us="161"/><bf gi="132" ue="163" us="162"/><bf gi="189" ue="164" us="164"/><bf gi="150" ue="165" us="165"/><bf gi="231" ue="166" us="166"/><bf gi="134" ue="167" us="167"/><bf gi="142" ue="168" us="168"/><bf gi="139" ue="169" us="169"/><bf gi="157" ue="170" us="170"/><bf gi="169" ue="171" us="171"/><bf gi="164" ue="172" us="172"/><bf gi="256" ue="173" us="173"/><bf gi="138" ue="174" us="174"/><bf gi="217" ue="175" us="175"/><bf gi="131" ue="176" us="176"/><bf gi="147" ue="177" us="177"/><bf gi="241" ue="179" us="178"/><bf gi="141" ue="180" us="180"/><bf gi="151" ue="181" us="181"/><bf gi="136" ue="182" us="182"/><bf gi="195" ue="183" us="183"/><bf gi="221" ue="184" us="184"/><bf gi="240" ue="185" us="185"/><bf gi="158" ue="186" us="186"/><bf gi="170" ue="187" us="187"/><bf gi="243" ue="190" us="188"/><bf gi="162" ue="191" us="191"/><bf gi="173" ue="192" us="192"/><bf gi="201" ue="193" us="193"/><bf gi="199" ue="194" us="194"/><bf gi="174" ue="195" us="195"/><bf gi="98" ue="197" us="196"/><bf gi="144" ue="198" us="198"/><bf gi="100" ue="199" us="199"/><bf gi="203" ue="200" us="200"/><bf gi="101" ue="201" us="201"/><bf gi="200" ue="202" us="202"/><bf gi="202" ue="203" us="203"/><bf gi="207" ue="204" us="204"/><bf gi="204" ue="207" us="205"/><bf gi="232" ue="208" us="208"/><bf gi="102" ue="209" us="209"/><bf gi="210" ue="210" us="210"/><bf gi="208" ue="212" us="211"/><bf gi="175" ue="213" us="213"/><bf gi="103" ue="214" us="214"/><bf gi="239" ue="215" us="215"/><bf gi="145" ue="216" us="216"/><bf gi="213" ue="217" us="217"/><bf gi="211" ue="219" us="218"/><bf gi="104" ue="220" us="220"/><bf gi="234" ue="221" us="221"/><bf gi="236" ue="222" us="222"/><bf gi="137" ue="223" us="223"/><bf gi="106" ue="224" us="224"/><bf gi="105" ue="225" us="225"/><bf gi="107" ue="226" us="226"/><bf gi="109" ue="227" us="227"/><bf gi="108" ue="228" us="228"/><bf gi="110" ue="229" us="229"/><bf gi="160" ue="230" us="230"/><bf gi="111" ue="231" us="231"/><bf gi="113" ue="232" us="232"/><bf gi="112" ue="233" us="233"/><bf gi="114" ue="235" us="234"/><bf gi="117" ue="236" us="236"/><bf gi="116" ue="237" us="237"/><bf gi="118" ue="239" us="238"/><bf gi="233" ue="240" us="240"/><bf gi="120" ue="241" us="241"/><bf gi="122" ue="242" us="242"/><bf gi="121" ue="243" us="243"/><bf gi="123" ue="244" us="244"/><bf gi="125" ue="245" us="245"/><bf gi="124" ue="246" us="246"/><bf gi="184" ue="247" us="247"/><bf gi="161" ue="248" us="248"/><bf gi="127" ue="249" us="249"/><bf gi="126" ue="250" us="250"/><bf gi="128" ue="252" us="251"/><bf gi="235" ue="253" us="253"/><bf gi="237" ue="254" us="254"/><bf gi="186" ue="255" us="255"/><bf gi="251" ue="263" us="262"/><bf gi="253" ue="269" us="268"/><bf gi="0" ue="270" us="270"/><bf gi="0" ue="271" us="271"/><bf gi="0" ue="272" us="272"/><bf gi="255" ue="273" us="273"/><bf gi="246" ue="287" us="286"/><bf 
gi="248" ue="304" us="304"/><bf gi="214" ue="305" us="305"/><bf gi="225" ue="322" us="321"/><bf gi="176" ue="339" us="338"/><bf gi="249" ue="351" us="350"/><bf gi="227" ue="353" us="352"/><bf gi="187" ue="376" us="376"/><bf gi="229" ue="382" us="381"/><bf gi="166" ue="402" us="402"/><bf gi="215" ue="710" us="710"/><bf gi="224" ue="711" us="711"/><bf gi="218" ue="730" us="728"/><bf gi="223" ue="731" us="731"/><bf gi="216" ue="732" us="732"/><bf gi="222" ue="733" us="733"/><bf gi="159" ue="937" us="937"/><bf gi="155" ue="960" us="960"/><bf gi="178" ue="8212" us="8211"/><bf gi="0" ue="8213" us="8213"/><bf gi="0" ue="8214" us="8214"/><bf gi="0" ue="8215" us="8215"/><bf gi="182" ue="8217" us="8216"/><bf gi="196" ue="8218" us="8218"/><bf gi="0" ue="8219" us="8219"/><bf gi="180" ue="8221" us="8220"/><bf gi="197" ue="8222" us="8222"/><bf gi="0" ue="8223" us="8223"/><bf gi="130" ue="8224" us="8224"/><bf gi="194" ue="8225" us="8225"/><bf gi="135" ue="8226" us="8226"/><bf gi="0" ue="8227" us="8227"/><bf gi="0" ue="8228" us="8228"/><bf gi="0" ue="8229" us="8229"/><bf gi="171" ue="8230" us="8230"/><bf gi="198" ue="8240" us="8240"/><bf gi="190" ue="8250" us="8249"/><bf gi="258" ue="8364" us="8364"/><bf gi="140" ue="8482" us="8482"/><bf gi="152" ue="8706" us="8706"/><bf gi="0" ue="8707" us="8707"/><bf gi="0" ue="8708" us="8708"/><bf gi="0" ue="8709" us="8709"/><bf gi="168" ue="8710" us="8710"/><bf gi="154" ue="8719" us="8719"/><bf gi="0" ue="8720" us="8720"/><bf gi="153" ue="8721" us="8721"/><bf gi="238" ue="8722" us="8722"/><bf gi="0" ue="8723" us="8723"/><bf gi="0" ue="8724" us="8724"/><bf gi="188" ue="8725" us="8725"/><bf gi="0" ue="8726" us="8726"/><bf gi="0" ue="8727" us="8727"/><bf gi="0" ue="8728" us="8728"/><bf gi="257" ue="8729" us="8729"/><bf gi="165" ue="8730" us="8730"/><bf gi="0" ue="8731" us="8731"/><bf gi="0" ue="8732" us="8732"/><bf gi="0" ue="8733" us="8733"/><bf gi="146" ue="8734" us="8734"/><bf gi="156" ue="8747" us="8747"/><bf gi="167" ue="8776" us="8776"/><bf gi="143" ue="8800" us="8800"/><bf gi="0" ue="8801" us="8801"/><bf gi="0" ue="8802" us="8802"/><bf gi="0" ue="8803" us="8803"/><bf gi="148" ue="8805" us="8804"/><bf gi="185" ue="9674" us="9674"/><bf gi="192" ue="64258" us="64257"/><bf gi="0" ue="65535" us="65535"/></bfranges><cid-widths start-index="0"><wx w="602"/><wx w="0"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx 
w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/></cid-widths></multibyte-extras></font-metrics>
\ No newline at end of file
diff --git a/bitbake/doc/template/VeraMono.ttf b/bitbake/doc/template/VeraMono.ttf
new file mode 100644
index 0000000..139f0b4
--- /dev/null
+++ b/bitbake/doc/template/VeraMono.ttf
Binary files differ
diff --git a/bitbake/doc/template/VeraMono.xml b/bitbake/doc/template/VeraMono.xml
new file mode 100644
index 0000000..3a0a866
--- /dev/null
+++ b/bitbake/doc/template/VeraMono.xml
@@ -0,0 +1 @@
+<?xml version="1.0" encoding="UTF-8"?><font-metrics metrics-version="2" type="TYPE0"><font-name>BitstreamVeraSansMono-Roman</font-name><full-name>Bitstream Vera Sans Mono</full-name><family-name>Bitstream Vera Sans Mono</family-name><embed/><cap-height>729</cap-height><x-height>546</x-height><ascender>759</ascender><descender>-240</descender><bbox><left>-4</left><bottom>-235</bottom><right>605</right><top>928</top></bbox><flags>34</flags><stemv>0</stemv><italicangle>0</italicangle><subtype>TYPE0</subtype><multibyte-extras><cid-type>CIDFontType2</cid-type><default-width>0</default-width><bfranges><bf gi="3" ue="126" us="32"/><bf gi="172" ue="160" us="160"/><bf gi="163" ue="161" us="161"/><bf gi="132" ue="163" us="162"/><bf gi="189" ue="164" us="164"/><bf gi="150" ue="165" us="165"/><bf gi="231" ue="166" us="166"/><bf gi="134" ue="167" us="167"/><bf gi="142" ue="168" us="168"/><bf gi="139" ue="169" us="169"/><bf gi="157" ue="170" us="170"/><bf gi="169" ue="171" us="171"/><bf gi="164" ue="172" us="172"/><bf gi="256" ue="173" us="173"/><bf gi="138" ue="174" us="174"/><bf gi="217" ue="175" us="175"/><bf gi="131" ue="176" us="176"/><bf gi="147" ue="177" us="177"/><bf gi="241" ue="179" us="178"/><bf gi="141" ue="180" us="180"/><bf gi="151" ue="181" us="181"/><bf gi="136" ue="182" us="182"/><bf gi="195" ue="183" us="183"/><bf gi="221" ue="184" us="184"/><bf gi="240" ue="185" us="185"/><bf gi="158" ue="186" us="186"/><bf gi="170" ue="187" us="187"/><bf gi="243" ue="190" us="188"/><bf gi="162" ue="191" us="191"/><bf gi="173" ue="192" us="192"/><bf gi="201" ue="193" us="193"/><bf gi="199" ue="194" us="194"/><bf gi="174" ue="195" us="195"/><bf gi="98" ue="197" us="196"/><bf gi="144" ue="198" us="198"/><bf gi="100" ue="199" us="199"/><bf gi="203" ue="200" us="200"/><bf gi="101" ue="201" us="201"/><bf gi="200" ue="202" us="202"/><bf gi="202" ue="203" us="203"/><bf gi="207" ue="204" us="204"/><bf gi="204" ue="207" us="205"/><bf gi="232" ue="208" us="208"/><bf gi="102" ue="209" us="209"/><bf gi="210" ue="210" us="210"/><bf gi="208" ue="212" us="211"/><bf gi="175" ue="213" us="213"/><bf gi="103" ue="214" us="214"/><bf gi="239" ue="215" us="215"/><bf gi="145" ue="216" us="216"/><bf gi="213" ue="217" us="217"/><bf gi="211" ue="219" us="218"/><bf gi="104" ue="220" us="220"/><bf gi="234" ue="221" us="221"/><bf gi="236" ue="222" us="222"/><bf gi="137" ue="223" us="223"/><bf gi="106" ue="224" us="224"/><bf gi="105" ue="225" us="225"/><bf gi="107" ue="226" us="226"/><bf gi="109" ue="227" us="227"/><bf gi="108" ue="228" us="228"/><bf gi="110" ue="229" us="229"/><bf gi="160" ue="230" us="230"/><bf gi="111" ue="231" us="231"/><bf gi="113" ue="232" us="232"/><bf gi="112" ue="233" us="233"/><bf gi="114" ue="235" us="234"/><bf gi="117" ue="236" us="236"/><bf gi="116" ue="237" us="237"/><bf gi="118" ue="239" us="238"/><bf gi="233" ue="240" us="240"/><bf gi="120" ue="241" us="241"/><bf gi="122" ue="242" us="242"/><bf gi="121" ue="243" us="243"/><bf gi="123" ue="244" us="244"/><bf gi="125" ue="245" us="245"/><bf gi="124" ue="246" us="246"/><bf gi="184" ue="247" us="247"/><bf gi="161" ue="248" us="248"/><bf gi="127" ue="249" us="249"/><bf gi="126" ue="250" us="250"/><bf gi="128" ue="252" us="251"/><bf gi="235" ue="253" us="253"/><bf gi="237" ue="254" us="254"/><bf gi="186" ue="255" us="255"/><bf gi="251" ue="263" us="262"/><bf gi="253" ue="269" us="268"/><bf gi="0" ue="270" us="270"/><bf gi="0" ue="271" us="271"/><bf gi="0" ue="272" us="272"/><bf gi="255" ue="273" us="273"/><bf gi="246" ue="287" us="286"/><bf gi="248" 
ue="304" us="304"/><bf gi="214" ue="305" us="305"/><bf gi="225" ue="322" us="321"/><bf gi="176" ue="339" us="338"/><bf gi="249" ue="351" us="350"/><bf gi="227" ue="353" us="352"/><bf gi="187" ue="376" us="376"/><bf gi="229" ue="382" us="381"/><bf gi="166" ue="402" us="402"/><bf gi="215" ue="710" us="710"/><bf gi="224" ue="711" us="711"/><bf gi="218" ue="730" us="728"/><bf gi="223" ue="731" us="731"/><bf gi="216" ue="732" us="732"/><bf gi="222" ue="733" us="733"/><bf gi="159" ue="937" us="937"/><bf gi="155" ue="960" us="960"/><bf gi="178" ue="8212" us="8211"/><bf gi="0" ue="8213" us="8213"/><bf gi="0" ue="8214" us="8214"/><bf gi="0" ue="8215" us="8215"/><bf gi="182" ue="8217" us="8216"/><bf gi="196" ue="8218" us="8218"/><bf gi="0" ue="8219" us="8219"/><bf gi="180" ue="8221" us="8220"/><bf gi="197" ue="8222" us="8222"/><bf gi="0" ue="8223" us="8223"/><bf gi="130" ue="8224" us="8224"/><bf gi="194" ue="8225" us="8225"/><bf gi="135" ue="8226" us="8226"/><bf gi="0" ue="8227" us="8227"/><bf gi="0" ue="8228" us="8228"/><bf gi="0" ue="8229" us="8229"/><bf gi="171" ue="8230" us="8230"/><bf gi="198" ue="8240" us="8240"/><bf gi="190" ue="8250" us="8249"/><bf gi="258" ue="8364" us="8364"/><bf gi="140" ue="8482" us="8482"/><bf gi="152" ue="8706" us="8706"/><bf gi="0" ue="8707" us="8707"/><bf gi="0" ue="8708" us="8708"/><bf gi="0" ue="8709" us="8709"/><bf gi="168" ue="8710" us="8710"/><bf gi="154" ue="8719" us="8719"/><bf gi="0" ue="8720" us="8720"/><bf gi="153" ue="8721" us="8721"/><bf gi="238" ue="8722" us="8722"/><bf gi="0" ue="8723" us="8723"/><bf gi="0" ue="8724" us="8724"/><bf gi="188" ue="8725" us="8725"/><bf gi="0" ue="8726" us="8726"/><bf gi="0" ue="8727" us="8727"/><bf gi="0" ue="8728" us="8728"/><bf gi="257" ue="8729" us="8729"/><bf gi="165" ue="8730" us="8730"/><bf gi="0" ue="8731" us="8731"/><bf gi="0" ue="8732" us="8732"/><bf gi="0" ue="8733" us="8733"/><bf gi="146" ue="8734" us="8734"/><bf gi="156" ue="8747" us="8747"/><bf gi="167" ue="8776" us="8776"/><bf gi="143" ue="8800" us="8800"/><bf gi="0" ue="8801" us="8801"/><bf gi="0" ue="8802" us="8802"/><bf gi="0" ue="8803" us="8803"/><bf gi="148" ue="8805" us="8804"/><bf gi="185" ue="9674" us="9674"/><bf gi="192" ue="64258" us="64257"/><bf gi="0" ue="65535" us="65535"/></bfranges><cid-widths start-index="0"><wx w="602"/><wx w="0"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx 
w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/><wx w="602"/></cid-widths></multibyte-extras></font-metrics>
\ No newline at end of file
diff --git a/bitbake/doc/template/component.title.xsl b/bitbake/doc/template/component.title.xsl
new file mode 100644
index 0000000..faef043
--- /dev/null
+++ b/bitbake/doc/template/component.title.xsl
@@ -0,0 +1,39 @@
+<xsl:stylesheet version="1.0"
+ xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
+ xmlns:d="http://docbook.org/ns/docbook"
+ xmlns="http://www.w3.org/1999/xhtml"
+ exclude-result-prefixes="d">
+
+ <xsl:template name="component.title">
+ <xsl:param name="node" select="."/>
+
+ <xsl:variable name="level">
+ <xsl:choose>
+ <xsl:when test="ancestor::d:section">
+ <xsl:value-of select="count(ancestor::d:section)+1"/>
+ </xsl:when>
+ <xsl:when test="ancestor::d:sect5">6</xsl:when>
+ <xsl:when test="ancestor::d:sect4">5</xsl:when>
+ <xsl:when test="ancestor::d:sect3">4</xsl:when>
+ <xsl:when test="ancestor::d:sect2">3</xsl:when>
+ <xsl:when test="ancestor::d:sect1">2</xsl:when>
+ <xsl:otherwise>1</xsl:otherwise>
+ </xsl:choose>
+ </xsl:variable>
+ <xsl:element name="h{$level+1}" namespace="http://www.w3.org/1999/xhtml">
+ <xsl:attribute name="class">title</xsl:attribute>
+ <xsl:if test="$generate.id.attributes = 0">
+ <xsl:call-template name="anchor">
+ <xsl:with-param name="node" select="$node"/>
+ <xsl:with-param name="conditional" select="0"/>
+ </xsl:call-template>
+ </xsl:if>
+ <xsl:apply-templates select="$node" mode="object.title.markup">
+ <xsl:with-param name="allow-anchors" select="1"/>
+ </xsl:apply-templates>
+ <xsl:call-template name="permalink">
+ <xsl:with-param name="node" select="$node"/>
+ </xsl:call-template>
+ </xsl:element>
+ </xsl:template>
+</xsl:stylesheet>
diff --git a/bitbake/doc/template/db-pdf.xsl b/bitbake/doc/template/db-pdf.xsl
new file mode 100644
index 0000000..3dd065a
--- /dev/null
+++ b/bitbake/doc/template/db-pdf.xsl
@@ -0,0 +1,64 @@
+<?xml version='1.0'?>
+<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns="http://www.w3.org/1999/xhtml" xmlns:fo="http://www.w3.org/1999/XSL/Format" version="1.0">
+
+ <xsl:import href="http://docbook.sourceforge.net/release/xsl/current/fo/docbook.xsl" />
+
+ <!-- check project-plan.sh for how this is generated, needed to tweak
+ the cover page
+ -->
+ <xsl:include href="/tmp/titlepage.xsl"/>
+
+ <!-- To force a page break in document, i.e per section add a
+ <?hard-pagebreak?> tag.
+ -->
+ <xsl:template match="processing-instruction('hard-pagebreak')">
+ <fo:block break-before='page' />
+ </xsl:template>
+
+  <!--Fix for default indent getting TOC all weird..
+ See http://sources.redhat.com/ml/docbook-apps/2005-q1/msg00455.html
+ FIXME: must be a better fix
+ -->
+ <xsl:param name="body.start.indent" select="'0'"/>
+ <!--<xsl:param name="title.margin.left" select="'0'"/>-->
+
+ <!-- stop long-ish header titles getting wrapped -->
+ <xsl:param name="header.column.widths">1 10 1</xsl:param>
+
+ <!-- customise headers and footers a little -->
+
+ <xsl:template name="head.sep.rule">
+ <xsl:if test="$header.rule != 0">
+ <xsl:attribute name="border-bottom-width">0.5pt</xsl:attribute>
+ <xsl:attribute name="border-bottom-style">solid</xsl:attribute>
+ <xsl:attribute name="border-bottom-color">#cccccc</xsl:attribute>
+ </xsl:if>
+ </xsl:template>
+
+ <xsl:template name="foot.sep.rule">
+ <xsl:if test="$footer.rule != 0">
+ <xsl:attribute name="border-top-width">0.5pt</xsl:attribute>
+ <xsl:attribute name="border-top-style">solid</xsl:attribute>
+ <xsl:attribute name="border-top-color">#cccccc</xsl:attribute>
+ </xsl:if>
+ </xsl:template>
+
+ <xsl:attribute-set name="header.content.properties">
+ <xsl:attribute name="color">#cccccc</xsl:attribute>
+ </xsl:attribute-set>
+
+ <xsl:attribute-set name="footer.content.properties">
+ <xsl:attribute name="color">#cccccc</xsl:attribute>
+ </xsl:attribute-set>
+
+
+ <!-- general settings -->
+
+ <xsl:param name="fop1.extensions" select="1"></xsl:param>
+ <xsl:param name="paper.type" select="'A4'"></xsl:param>
+ <xsl:param name="section.autolabel" select="1"></xsl:param>
+ <xsl:param name="body.font.family" select="'verasans'"></xsl:param>
+ <xsl:param name="title.font.family" select="'verasans'"></xsl:param>
+ <xsl:param name="monospace.font.family" select="'veramono'"></xsl:param>
+
+</xsl:stylesheet>
diff --git a/bitbake/doc/template/division.title.xsl b/bitbake/doc/template/division.title.xsl
new file mode 100644
index 0000000..9c843bc
--- /dev/null
+++ b/bitbake/doc/template/division.title.xsl
@@ -0,0 +1,25 @@
+<xsl:stylesheet version="1.0"
+ xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
+ xmlns:d="http://docbook.org/ns/docbook"
+ xmlns="http://www.w3.org/1999/xhtml"
+ exclude-result-prefixes="d">
+
+ <xsl:template name="division.title">
+ <xsl:param name="node" select="."/>
+
+ <h1>
+ <xsl:attribute name="class">title</xsl:attribute>
+ <xsl:call-template name="anchor">
+ <xsl:with-param name="node" select="$node"/>
+ <xsl:with-param name="conditional" select="0"/>
+ </xsl:call-template>
+ <xsl:apply-templates select="$node" mode="object.title.markup">
+ <xsl:with-param name="allow-anchors" select="1"/>
+ </xsl:apply-templates>
+ <xsl:call-template name="permalink">
+ <xsl:with-param name="node" select="$node"/>
+ </xsl:call-template>
+ </h1>
+ </xsl:template>
+</xsl:stylesheet>
+
diff --git a/bitbake/doc/template/draft.png b/bitbake/doc/template/draft.png
new file mode 100644
index 0000000..53051a9
--- /dev/null
+++ b/bitbake/doc/template/draft.png
Binary files differ
diff --git a/bitbake/doc/template/fop-config.xml b/bitbake/doc/template/fop-config.xml
new file mode 100644
index 0000000..09cc5ca
--- /dev/null
+++ b/bitbake/doc/template/fop-config.xml
@@ -0,0 +1,58 @@
+<fop version="1.0">
+
+ <!-- Strict user configuration -->
+ <strict-configuration>true</strict-configuration>
+
+ <!-- Strict FO validation -->
+ <strict-validation>true</strict-validation>
+
+ <!--
+ Set the baseDir so common/openedhand.svg references in plans still
+ work ok. Note, relative file references to current dir should still work.
+ -->
+ <base>../template</base>
+ <font-base>../template</font-base>
+
+ <!-- Source resolution in dpi (dots/pixels per inch) for determining the
+ size of pixels in SVG and bitmap images, default: 72dpi -->
+ <!-- <source-resolution>72</source-resolution> -->
+ <!-- Target resolution in dpi (dots/pixels per inch) for specifying the
+ target resolution for generated bitmaps, default: 72dpi -->
+ <!-- <target-resolution>72</target-resolution> -->
+
+ <!-- default page-height and page-width, in case
+ value is specified as auto -->
+ <default-page-settings height="11in" width="8.26in"/>
+
+ <!-- <use-cache>false</use-cache> -->
+
+ <renderers>
+ <renderer mime="application/pdf">
+ <fonts>
+ <font metrics-file="VeraMono.xml"
+ kerning="yes"
+ embed-url="VeraMono.ttf">
+ <font-triplet name="veramono" style="normal" weight="normal"/>
+ </font>
+
+ <font metrics-file="VeraMoBd.xml"
+ kerning="yes"
+ embed-url="VeraMoBd.ttf">
+ <font-triplet name="veramono" style="normal" weight="bold"/>
+ </font>
+
+ <font metrics-file="Vera.xml"
+ kerning="yes"
+ embed-url="Vera.ttf">
+ <font-triplet name="verasans" style="normal" weight="normal"/>
+ <font-triplet name="verasans" style="normal" weight="bold"/>
+ <font-triplet name="verasans" style="italic" weight="normal"/>
+ <font-triplet name="verasans" style="italic" weight="bold"/>
+ </font>
+
+ <auto-detect/>
+ </fonts>
+ </renderer>
+ </renderers>
+</fop>
+
diff --git a/bitbake/doc/template/formal.object.heading.xsl b/bitbake/doc/template/formal.object.heading.xsl
new file mode 100644
index 0000000..4f3900d
--- /dev/null
+++ b/bitbake/doc/template/formal.object.heading.xsl
@@ -0,0 +1,21 @@
+<xsl:stylesheet version="1.0"
+ xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
+ xmlns:d="http://docbook.org/ns/docbook"
+ xmlns="http://www.w3.org/1999/xhtml"
+ exclude-result-prefixes="d">
+
+ <xsl:template name="formal.object.heading">
+ <xsl:param name="object" select="."/>
+ <xsl:param name="title">
+ <xsl:apply-templates select="$object" mode="object.title.markup">
+ <xsl:with-param name="allow-anchors" select="1"/>
+ </xsl:apply-templates>
+ </xsl:param>
+ <p class="title">
+ <b><xsl:copy-of select="$title"/></b>
+ <xsl:call-template name="permalink">
+ <xsl:with-param name="node" select="$object"/>
+ </xsl:call-template>
+ </p>
+ </xsl:template>
+</xsl:stylesheet>
\ No newline at end of file
diff --git a/bitbake/doc/template/gloss-permalinks.xsl b/bitbake/doc/template/gloss-permalinks.xsl
new file mode 100644
index 0000000..6bf5811
--- /dev/null
+++ b/bitbake/doc/template/gloss-permalinks.xsl
@@ -0,0 +1,14 @@
+<xsl:stylesheet version="1.0"
+ xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
+ xmlns:d="http://docbook.org/ns/docbook"
+ xmlns="http://www.w3.org/1999/xhtml">
+
+ <xsl:template match="glossentry/glossterm">
+ <xsl:apply-imports/>
+ <xsl:if test="$generate.permalink != 0">
+ <xsl:call-template name="permalink">
+ <xsl:with-param name="node" select=".."/>
+ </xsl:call-template>
+ </xsl:if>
+ </xsl:template>
+</xsl:stylesheet>
diff --git a/bitbake/doc/template/permalinks.xsl b/bitbake/doc/template/permalinks.xsl
new file mode 100644
index 0000000..d2a1c14
--- /dev/null
+++ b/bitbake/doc/template/permalinks.xsl
@@ -0,0 +1,25 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<xsl:stylesheet version="1.0"
+ xmlns="http://www.w3.org/1999/xhtml"
+ xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
+
+ <xsl:param name="generate.permalink" select="1"/>
+ <xsl:param name="permalink.text">¶</xsl:param>
+
+ <xsl:template name="permalink">
+ <xsl:param name="node"/>
+
+ <xsl:if test="$generate.permalink != '0'">
+ <span class="permalink">
+ <a alt="Permalink" title="Permalink">
+ <xsl:attribute name="href">
+ <xsl:call-template name="href.target">
+ <xsl:with-param name="object" select="$node"/>
+ </xsl:call-template>
+ </xsl:attribute>
+ <xsl:copy-of select="$permalink.text"/>
+ </a>
+ </span>
+ </xsl:if>
+ </xsl:template>
+</xsl:stylesheet>
diff --git a/bitbake/doc/template/section.title.xsl b/bitbake/doc/template/section.title.xsl
new file mode 100644
index 0000000..5c6ff9a
--- /dev/null
+++ b/bitbake/doc/template/section.title.xsl
@@ -0,0 +1,55 @@
+<xsl:stylesheet version="1.0"
+ xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
+ xmlns:d="http://docbook.org/ns/docbook"
+ xmlns="http://www.w3.org/1999/xhtml" exclude-result-prefixes="d">
+
+ <xsl:template name="section.title">
+ <xsl:variable name="section"
+ select="(ancestor::section |
+ ancestor::simplesect|
+ ancestor::sect1|
+ ancestor::sect2|
+ ancestor::sect3|
+ ancestor::sect4|
+ ancestor::sect5)[last()]"/>
+
+ <xsl:variable name="renderas">
+ <xsl:choose>
+ <xsl:when test="$section/@renderas = 'sect1'">1</xsl:when>
+ <xsl:when test="$section/@renderas = 'sect2'">2</xsl:when>
+ <xsl:when test="$section/@renderas = 'sect3'">3</xsl:when>
+ <xsl:when test="$section/@renderas = 'sect4'">4</xsl:when>
+ <xsl:when test="$section/@renderas = 'sect5'">5</xsl:when>
+ <xsl:otherwise><xsl:value-of select="''"/></xsl:otherwise>
+ </xsl:choose>
+ </xsl:variable>
+
+ <xsl:variable name="level">
+ <xsl:choose>
+ <xsl:when test="$renderas != ''">
+ <xsl:value-of select="$renderas"/>
+ </xsl:when>
+ <xsl:otherwise>
+ <xsl:call-template name="section.level">
+ <xsl:with-param name="node" select="$section"/>
+ </xsl:call-template>
+ </xsl:otherwise>
+ </xsl:choose>
+ </xsl:variable>
+
+ <xsl:call-template name="section.heading">
+ <xsl:with-param name="section" select="$section"/>
+ <xsl:with-param name="level" select="$level"/>
+ <xsl:with-param name="title">
+ <xsl:apply-templates select="$section" mode="object.title.markup">
+ <xsl:with-param name="allow-anchors" select="1"/>
+ </xsl:apply-templates>
+ <xsl:if test="$level > 0">
+ <xsl:call-template name="permalink">
+ <xsl:with-param name="node" select="$section"/>
+ </xsl:call-template>
+ </xsl:if>
+ </xsl:with-param>
+ </xsl:call-template>
+ </xsl:template>
+</xsl:stylesheet>
diff --git a/bitbake/doc/template/titlepage.templates.xml b/bitbake/doc/template/titlepage.templates.xml
new file mode 100644
index 0000000..38ec11a
--- /dev/null
+++ b/bitbake/doc/template/titlepage.templates.xml
@@ -0,0 +1,1259 @@
+<!DOCTYPE t:templates [
+<!ENTITY hsize0 "10pt">
+<!ENTITY hsize1 "12pt">
+<!ENTITY hsize2 "14.4pt">
+<!ENTITY hsize3 "17.28pt">
+<!ENTITY hsize4 "20.736pt">
+<!ENTITY hsize5 "24.8832pt">
+<!ENTITY hsize0space "7.5pt"> <!-- 0.75 * hsize0 -->
+<!ENTITY hsize1space "9pt"> <!-- 0.75 * hsize1 -->
+<!ENTITY hsize2space "10.8pt"> <!-- 0.75 * hsize2 -->
+<!ENTITY hsize3space "12.96pt"> <!-- 0.75 * hsize3 -->
+<!ENTITY hsize4space "15.552pt"> <!-- 0.75 * hsize4 -->
+<!ENTITY hsize5space "18.6624pt"> <!-- 0.75 * hsize5 -->
+]>
+<t:templates xmlns:t="http://nwalsh.com/docbook/xsl/template/1.0"
+ xmlns:param="http://nwalsh.com/docbook/xsl/template/1.0/param"
+ xmlns:fo="http://www.w3.org/1999/XSL/Format"
+ xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
+
+<!-- ********************************************************************
+ $Id: titlepage.templates.xml,v 1.23 2003/12/16 00:30:49 bobstayton Exp $
+ ********************************************************************
+
+ This file is part of the DocBook XSL Stylesheet distribution.
+ See ../README or http://docbook.sf.net/ for copyright
+ and other information.
+
+ ******************************************************************** -->
+
+<!-- ==================================================================== -->
+
+<t:titlepage t:element="article" t:wrapper="fo:block"
+ font-family="{$title.fontset}">
+
+ <t:titlepage-content t:side="recto"
+ text-align="center">
+
+ <mediaobject/>
+
+ <title t:named-template="component.title"
+ param:node="ancestor-or-self::article[1]"
+ keep-with-next="always"
+ font-size="&hsize5;"
+ font-weight="bold"/>
+
+ <subtitle param:node="ancestor-or-self::article[1]"
+ keep-with-next="always"
+ font-size="&hsize3;"
+ font-weight="bold"
+ space-after="0.8em"/>
+
+ <corpauthor space-before="0.5em"
+ font-size="&hsize3;"/>
+ <authorgroup space-before="0.5em"
+ font-size="&hsize2;"/>
+ <author space-before="0.5em"
+ font-size="&hsize2;"
+ space-after="0.8em"/>
+
+ <email font-size="&hsize2;"/>
+
+ <othercredit space-before="0.5em"/>
+ <releaseinfo space-before="0.5em"/>
+ <copyright space-before="0.5em"/>
+ <legalnotice text-align="start"
+ margin-left="0.5in"
+ margin-right="0.5in"
+ font-family="{$body.fontset}"/>
+ <pubdate space-before="0.5em"/>
+ <para></para>
+ <revision space-before="0.5em"/>
+ <revhistory space-before="0.5em"/>
+ <abstract space-before="0.5em"
+ text-align="start"
+ margin-left="0.5in"
+ margin-right="0.5in"
+ font-family="{$body.fontset}"/>
+
+ <para></para>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+<t:titlepage t:element="set" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:named-template="division.title"
+ param:node="ancestor-or-self::set[1]"
+ text-align="center"
+ font-size="&hsize5;"
+ space-before="&hsize5space;"
+ font-weight="bold"
+ font-family="{$title.fontset}"/>
+ <subtitle
+ font-family="{$title.fontset}"
+ text-align="center"/>
+ <corpauthor/>
+ <authorgroup/>
+ <author/>
+ <othercredit/>
+ <releaseinfo/>
+ <copyright/>
+ <legalnotice/>
+ <pubdate/>
+ <revision/>
+ <revhistory/>
+ <abstract/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+ <t:titlepage t:element="book" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+
+ <mediaobject/>
+
+<!--
+
+# If you leave this block of code in, the text title from the
+# <title>BitBake User Manual</title> statement of the
+# bitbake-user-manual.xml file is rendered on the title page below the
+# image. Commenting it out keeps it off the title page while still
+# allowing it to be used as the tab text for the HTML version of the
+# manual.
+
+ <title
+ t:named-template="division.title"
+ param:node="ancestor-or-self::book[1]"
+ text-align="center"
+ font-size="&hsize5;"
+ space-before="&hsize5space;"
+ font-weight="bold"
+ font-family="{$title.fontset}"/>
+-->
+ <subtitle
+ text-align="center"
+ font-size="&hsize4;"
+ space-before="&hsize4space;"
+ font-family="{$title.fontset}"/>
+ <corpauthor font-size="&hsize3;"
+ keep-with-next="always"
+ space-before="2in"/>
+ <authorgroup space-before="2in"/>
+ <author font-size="&hsize3;"
+ space-before="&hsize2space;"
+ keep-with-next="always"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+<!--
+# If you leave this block of code in, the text title from the
+# <title>BitBake User Manual</title> statement of the
+# bitbake-user-manual.xml file is rendered on the title page below the
+# image. Commenting it out keeps it off the title page while still
+# allowing it to be used as the tab text for the HTML version of the
+# manual.
+
+ <title
+ t:named-template="book.verso.title"
+ font-size="&hsize2;"
+ font-weight="bold"
+ font-family="{$title.fontset}"/>
+-->
+ <corpauthor/>
+ <authorgroup t:named-template="verso.authorgroup"/>
+ <author/>
+ <othercredit/>
+ <pubdate space-before="1em"/>
+ <copyright/>
+ <abstract/>
+ <legalnotice font-size="8pt"/>
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ <fo:block break-after="page"/>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ <fo:block break-after="page"/>
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+<t:titlepage t:element="part" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:named-template="division.title"
+ param:node="ancestor-or-self::part[1]"
+ text-align="center"
+ font-size="&hsize5;"
+ space-before="&hsize5space;"
+ font-weight="bold"
+ font-family="{$title.fontset}"/>
+ <subtitle
+ text-align="center"
+ font-size="&hsize4;"
+ space-before="&hsize4space;"
+ font-weight='bold'
+ font-style='italic'
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<t:titlepage t:element="partintro" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ text-align="center"
+ font-size="&hsize5;"
+ font-weight="bold"
+ space-before="1em"
+ font-family="{$title.fontset}"/>
+ <subtitle
+ text-align="center"
+ font-size="&hsize2;"
+ font-weight="bold"
+ font-style="italic"
+ font-family="{$title.fontset}"/>
+ <corpauthor/>
+ <authorgroup/>
+ <author/>
+ <othercredit/>
+ <releaseinfo/>
+ <copyright/>
+ <legalnotice/>
+ <pubdate/>
+ <revision/>
+ <revhistory/>
+ <abstract/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+<t:titlepage t:element="reference" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:named-template="division.title"
+ param:node="ancestor-or-self::reference[1]"
+ text-align="center"
+ font-size="&hsize5;"
+ space-before="&hsize5space;"
+ font-weight="bold"
+ font-family="{$title.fontset}"/>
+ <subtitle
+ font-family="{$title.fontset}"
+ text-align="center"/>
+ <corpauthor/>
+ <authorgroup/>
+ <author/>
+ <othercredit/>
+ <releaseinfo/>
+ <copyright/>
+ <legalnotice/>
+ <pubdate/>
+ <revision/>
+ <revhistory/>
+ <abstract/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+<t:titlepage t:element="refsynopsisdiv" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+<t:titlepage t:element="refsection" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+<t:titlepage t:element="refsect1" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+<t:titlepage t:element="refsect2" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+<t:titlepage t:element="refsect3" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+ <t:titlepage t:element="dedication" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:force="1"
+ t:named-template="component.title"
+ param:node="ancestor-or-self::dedication[1]"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize5;"
+ font-family="{$title.fontset}"
+ font-weight="bold"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+ <t:titlepage t:element="preface" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:force="1"
+ t:named-template="component.title"
+ param:node="ancestor-or-self::preface[1]"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize5;"
+ font-family="{$title.fontset}"
+ font-weight="bold"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ <corpauthor/>
+ <authorgroup/>
+ <author/>
+ <othercredit/>
+ <releaseinfo/>
+ <copyright/>
+ <legalnotice/>
+ <pubdate/>
+ <revision/>
+ <revhistory/>
+ <abstract/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+ <t:titlepage t:element="chapter" t:wrapper="fo:block"
+ font-family="{$title.fontset}">
+ <t:titlepage-content t:side="recto" margin-left="{$title.margin.left}">
+ <title t:named-template="component.title"
+ param:node="ancestor-or-self::chapter[1]"
+ font-size="&hsize5;"
+ font-weight="bold"/>
+
+ <subtitle space-before="0.5em"
+ font-style="italic"
+ font-size="&hsize2;"
+ font-weight="bold"/>
+
+ <corpauthor space-before="0.5em"
+ space-after="0.5em"
+ font-size="&hsize2;"/>
+
+ <authorgroup space-before="0.5em"
+ space-after="0.5em"
+ font-size="&hsize2;"/>
+
+ <author space-before="0.5em"
+ space-after="0.5em"
+ font-size="&hsize2;"/>
+
+ <othercredit/>
+ <releaseinfo/>
+ <copyright/>
+ <legalnotice/>
+ <pubdate/>
+ <revision/>
+ <revhistory/>
+ <abstract/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+ <t:titlepage t:element="appendix" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:named-template="component.title"
+ param:node="ancestor-or-self::appendix[1]"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize5;"
+ font-weight="bold"
+ font-family="{$title.fontset}"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ <corpauthor/>
+ <authorgroup/>
+ <author/>
+ <othercredit/>
+ <releaseinfo/>
+ <copyright/>
+ <legalnotice/>
+ <pubdate/>
+ <revision/>
+ <revhistory/>
+ <abstract/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+<t:titlepage t:element="section" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ margin-left="{$title.margin.left}"
+ font-family="{$title.fontset}"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ <corpauthor/>
+ <authorgroup/>
+ <author/>
+ <othercredit/>
+ <releaseinfo/>
+ <copyright/>
+ <legalnotice/>
+ <pubdate/>
+ <revision/>
+ <revhistory/>
+ <abstract/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<t:titlepage t:element="sect1" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ margin-left="{$title.margin.left}"
+ font-family="{$title.fontset}"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ <corpauthor/>
+ <authorgroup/>
+ <author/>
+ <othercredit/>
+ <releaseinfo/>
+ <copyright/>
+ <legalnotice/>
+ <pubdate/>
+ <revision/>
+ <revhistory/>
+ <abstract/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<t:titlepage t:element="sect2" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ margin-left="{$title.margin.left}"
+ font-family="{$title.fontset}"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ <corpauthor/>
+ <authorgroup/>
+ <author/>
+ <othercredit/>
+ <releaseinfo/>
+ <copyright/>
+ <legalnotice/>
+ <pubdate/>
+ <revision/>
+ <revhistory/>
+ <abstract/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<t:titlepage t:element="sect3" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ margin-left="{$title.margin.left}"
+ font-family="{$title.fontset}"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ <corpauthor/>
+ <authorgroup/>
+ <author/>
+ <othercredit/>
+ <releaseinfo/>
+ <copyright/>
+ <legalnotice/>
+ <pubdate/>
+ <revision/>
+ <revhistory/>
+ <abstract/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<t:titlepage t:element="sect4" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ margin-left="{$title.margin.left}"
+ font-family="{$title.fontset}"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ <corpauthor/>
+ <authorgroup/>
+ <author/>
+ <othercredit/>
+ <releaseinfo/>
+ <copyright/>
+ <legalnotice/>
+ <pubdate/>
+ <revision/>
+ <revhistory/>
+ <abstract/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<t:titlepage t:element="sect5" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ margin-left="{$title.margin.left}"
+ font-family="{$title.fontset}"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ <corpauthor/>
+ <authorgroup/>
+ <author/>
+ <othercredit/>
+ <releaseinfo/>
+ <copyright/>
+ <legalnotice/>
+ <pubdate/>
+ <revision/>
+ <revhistory/>
+ <abstract/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<t:titlepage t:element="simplesect" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ margin-left="{$title.margin.left}"
+ font-family="{$title.fontset}"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ <corpauthor/>
+ <authorgroup/>
+ <author/>
+ <othercredit/>
+ <releaseinfo/>
+ <copyright/>
+ <legalnotice/>
+ <pubdate/>
+ <revision/>
+ <revhistory/>
+ <abstract/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+ <t:titlepage t:element="bibliography" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:force="1"
+ t:named-template="component.title"
+ param:node="ancestor-or-self::bibliography[1]"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize5;"
+ font-family="{$title.fontset}"
+ font-weight="bold"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+ </t:titlepage>
+
+<!-- ==================================================================== -->
+
+ <t:titlepage t:element="bibliodiv" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title t:named-template="component.title"
+ param:node="ancestor-or-self::bibliodiv[1]"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize4;"
+ font-family="{$title.fontset}"
+ font-weight="bold"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+ </t:titlepage>
+
+<!-- ==================================================================== -->
+
+ <t:titlepage t:element="glossary" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:force="1"
+ t:named-template="component.title"
+ param:node="ancestor-or-self::glossary[1]"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize5;"
+ font-family="{$title.fontset}"
+ font-weight="bold"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+ </t:titlepage>
+
+<!-- ==================================================================== -->
+
+ <t:titlepage t:element="glossdiv" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title t:named-template="component.title"
+ param:node="ancestor-or-self::glossdiv[1]"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize4;"
+ font-family="{$title.fontset}"
+ font-weight="bold"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+ </t:titlepage>
+
+<!-- ==================================================================== -->
+
+ <t:titlepage t:element="index" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:force="1"
+ t:named-template="component.title"
+ param:node="ancestor-or-self::index[1]"
+ param:pagewide="1"
+ margin-left="0pt"
+ font-size="&hsize5;"
+ font-family="{$title.fontset}"
+ font-weight="bold"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+ </t:titlepage>
+
+<!-- ==================================================================== -->
+
+ <!-- The indexdiv.title template is used so that manual and -->
+ <!-- automatically generated indexdiv titles get the same -->
+ <!-- formatting. -->
+
+ <t:titlepage t:element="indexdiv" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title t:force="1"
+ t:named-template="indexdiv.title"
+ param:title="title"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+ </t:titlepage>
+
+<!-- ==================================================================== -->
+
+ <t:titlepage t:element="setindex" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:force="1"
+ t:named-template="component.title"
+ param:node="ancestor-or-self::setindex[1]"
+ param:pagewide="1"
+ margin-left="0pt"
+ font-size="&hsize5;"
+ font-family="{$title.fontset}"
+ font-weight="bold"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+ </t:titlepage>
+
+<!-- ==================================================================== -->
+
+ <t:titlepage t:element="colophon" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:force="1"
+ t:named-template="component.title"
+ param:node="ancestor-or-self::colophon[1]"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize5;"
+ font-family="{$title.fontset}"
+ font-weight="bold"/>
+ <subtitle
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+</t:titlepage>
+
+<!-- ==================================================================== -->
+
+ <t:titlepage t:element="table.of.contents" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:force="1"
+ t:named-template="gentext"
+ param:key="'TableofContents'"
+ space-before.minimum="1em"
+ space-before.optimum="1.5em"
+ space-before.maximum="2em"
+ space-after="0.5em"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize3;"
+ font-weight="bold"
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+ </t:titlepage>
+
+ <t:titlepage t:element="list.of.tables" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:force="1"
+ t:named-template="gentext"
+ param:key="'ListofTables'"
+ space-before.minimum="1em"
+ space-before.optimum="1.5em"
+ space-before.maximum="2em"
+ space-after="0.5em"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize3;"
+ font-weight="bold"
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+ </t:titlepage>
+
+ <t:titlepage t:element="list.of.figures" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:force="1"
+ t:named-template="gentext"
+ param:key="'ListofFigures'"
+ space-before.minimum="1em"
+ space-before.optimum="1.5em"
+ space-before.maximum="2em"
+ space-after="0.5em"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize3;"
+ font-weight="bold"
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+ </t:titlepage>
+
+ <t:titlepage t:element="list.of.examples" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:force="1"
+ t:named-template="gentext"
+ param:key="'ListofExamples'"
+ space-before.minimum="1em"
+ space-before.optimum="1.5em"
+ space-before.maximum="2em"
+ space-after="0.5em"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize3;"
+ font-weight="bold"
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+ </t:titlepage>
+
+ <t:titlepage t:element="list.of.equations" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:force="1"
+ t:named-template="gentext"
+ param:key="'ListofEquations'"
+ space-before.minimum="1em"
+ space-before.optimum="1.5em"
+ space-before.maximum="2em"
+ space-after="0.5em"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize3;"
+ font-weight="bold"
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+ </t:titlepage>
+
+ <t:titlepage t:element="list.of.procedures" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:force="1"
+ t:named-template="gentext"
+ param:key="'ListofProcedures'"
+ space-before.minimum="1em"
+ space-before.optimum="1.5em"
+ space-before.maximum="2em"
+ space-after="0.5em"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize3;"
+ font-weight="bold"
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+ </t:titlepage>
+
+ <t:titlepage t:element="list.of.unknowns" t:wrapper="fo:block">
+ <t:titlepage-content t:side="recto">
+ <title
+ t:force="1"
+ t:named-template="gentext"
+ param:key="'ListofUnknown'"
+ space-before.minimum="1em"
+ space-before.optimum="1.5em"
+ space-before.maximum="2em"
+ space-after="0.5em"
+ margin-left="{$title.margin.left}"
+ font-size="&hsize3;"
+ font-weight="bold"
+ font-family="{$title.fontset}"/>
+ </t:titlepage-content>
+
+ <t:titlepage-content t:side="verso">
+ </t:titlepage-content>
+
+ <t:titlepage-separator>
+ </t:titlepage-separator>
+
+ <t:titlepage-before t:side="recto">
+ </t:titlepage-before>
+
+ <t:titlepage-before t:side="verso">
+ </t:titlepage-before>
+ </t:titlepage>
+
+<!-- ==================================================================== -->
+
+</t:templates>
diff --git a/bitbake/doc/tools/docbook-to-pdf b/bitbake/doc/tools/docbook-to-pdf
new file mode 100755
index 0000000..558ded9e
--- /dev/null
+++ b/bitbake/doc/tools/docbook-to-pdf
@@ -0,0 +1,51 @@
+#!/bin/sh
+
+if [ -z "$1" -o -z "$2" ]; then
+ echo "usage: [-v] $0 <docbook file> <templatedir>"
+ echo
+ echo "*NOTE* you need xsltproc, fop and nwalsh docbook stylesheets"
+ echo " installed for this to work!"
+ echo
+ exit 0
+fi
+
+FO=`echo $1 | sed s/.xml/.fo/` || exit 1
+PDF=`echo $1 | sed s/.xml/.pdf/` || exit 1
+TEMPLATEDIR=$2
+
+##
+# These URIs should be rewritten by your distribution's XML catalog to
+# match your locally installed XSL stylesheets.
+XSL_BASE_URI="http://docbook.sourceforge.net/release/xsl/current"
+
+# Creates a temporary XSL stylesheet based on titlepage.xsl
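+# (db-pdf.xsl includes /tmp/titlepage.xsl, so this file must exist before
+# the next xsltproc step runs)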
+xsltproc -o /tmp/titlepage.xsl \
+ --xinclude \
+ $XSL_BASE_URI/template/titlepage.xsl \
+ $TEMPLATEDIR/titlepage.templates.xml || exit 1
+
+# Creates the file needed for FOP
+xsltproc --xinclude \
+ --stringparam hyphenate false \
+ --stringparam formal.title.placement "figure after" \
+ --stringparam ulink.show 1 \
+ --stringparam body.font.master 9 \
+ --stringparam title.font.master 11 \
+ --stringparam draft.watermark.image "$TEMPLATEDIR/draft.png" \
+ --stringparam chapter.autolabel 1 \
+ --stringparam appendix.autolabel A \
+ --stringparam section.autolabel 1 \
+ --stringparam section.label.includes.component.label 1 \
+ --output $FO \
+ $TEMPLATEDIR/db-pdf.xsl \
+ $1 || exit 1
+
+# Invokes the Java version of FOP. Uses the additional configuration file $TEMPLATEDIR/fop-config.xml
+fop -c $TEMPLATEDIR/fop-config.xml -fo $FO -pdf $PDF || exit 1
+
+rm -f $FO
+rm -f /tmp/titlepage.xsl
+
+echo
+echo " #### Success! $PDF ready. ####"
+echo
diff --git a/bitbake/lib/bb/COW.py b/bitbake/lib/bb/COW.py
new file mode 100644
index 0000000..6917ec3
--- /dev/null
+++ b/bitbake/lib/bb/COW.py
@@ -0,0 +1,323 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# This is a copy on write dictionary and set which abuses classes to try and be nice and fast.
+#
+# Copyright (C) 2006 Tim Amsell
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+# Please Note:
+# Be careful when using mutable types (i.e. Dicts and Lists) - operations involving these are SLOW.
+# Assign a file to __warn__ to get warnings about slow operations.
+#
+
+from __future__ import print_function
+import copy
+import types
+ImmutableTypes = (
+ types.NoneType,
+ bool,
+ complex,
+ float,
+ int,
+ long,
+ tuple,
+ frozenset,
+ basestring
+)
+
+MUTABLE = "__mutable__"
+
+class COWMeta(type):
+ pass
+
+class COWDictMeta(COWMeta):
+ __warn__ = False
+ __hasmutable__ = False
+ __marker__ = tuple()
+
+ def __str__(cls):
+ # FIXME: I have magic numbers!
+ return "<COWDict Level: %i Current Keys: %i>" % (cls.__count__, len(cls.__dict__) - 3)
+ __repr__ = __str__
+
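+    # cow() returns a fresh subclass of the current class: writes land in the
+    # child's __dict__, while reads fall through to the parent via normal
+    # attribute lookup, which is what makes the structure copy-on-write.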
+ def cow(cls):
+ class C(cls):
+ __count__ = cls.__count__ + 1
+ return C
+ copy = cow
+ __call__ = cow
+
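+    # Mutable values are stored under "<key>__mutable__" so that __getmutable__
+    # can copy them down into the current level on first writable access.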
+ def __setitem__(cls, key, value):
+ if not isinstance(value, ImmutableTypes):
+ if not isinstance(value, COWMeta):
+ cls.__hasmutable__ = True
+ key += MUTABLE
+ setattr(cls, key, value)
+
+ def __getmutable__(cls, key, readonly=False):
+ nkey = key + MUTABLE
+ try:
+ return cls.__dict__[nkey]
+ except KeyError:
+ pass
+
+ value = getattr(cls, nkey)
+ if readonly:
+ return value
+
+ if not cls.__warn__ is False and not isinstance(value, COWMeta):
+ print("Warning: Doing a copy because %s is a mutable type." % key, file=cls.__warn__)
+ try:
+ value = value.copy()
+ except AttributeError as e:
+ value = copy.copy(value)
+ setattr(cls, nkey, value)
+ return value
+
+ __getmarker__ = []
+ def __getreadonly__(cls, key, default=__getmarker__):
+ """\
+ Get a value (even if mutable) which you promise not to change.
+ """
+ return cls.__getitem__(key, default, True)
+
+ def __getitem__(cls, key, default=__getmarker__, readonly=False):
+ try:
+ try:
+ value = getattr(cls, key)
+ except AttributeError:
+ value = cls.__getmutable__(key, readonly)
+
+ # This is for values which have been deleted
+ if value is cls.__marker__:
+ raise AttributeError("key %s does not exist." % key)
+
+ return value
+ except AttributeError as e:
+ if not default is cls.__getmarker__:
+ return default
+
+ raise KeyError(str(e))
+
+ def __delitem__(cls, key):
+ cls.__setitem__(key, cls.__marker__)
+
+ def __revertitem__(cls, key):
+ if not cls.__dict__.has_key(key):
+ key += MUTABLE
+ delattr(cls, key)
+
+ def __contains__(cls, key):
+ return cls.has_key(key)
+
+ def has_key(cls, key):
+ value = cls.__getreadonly__(key, cls.__marker__)
+ if value is cls.__marker__:
+ return False
+ return True
+
+ def iter(cls, type, readonly=False):
+ for key in dir(cls):
+ if key.startswith("__"):
+ continue
+
+ if key.endswith(MUTABLE):
+ key = key[:-len(MUTABLE)]
+
+ if type == "keys":
+ yield key
+
+ try:
+ if readonly:
+ value = cls.__getreadonly__(key)
+ else:
+ value = cls[key]
+ except KeyError:
+ continue
+
+ if type == "values":
+ yield value
+ if type == "items":
+ yield (key, value)
+ raise StopIteration()
+
+ def iterkeys(cls):
+ return cls.iter("keys")
+ def itervalues(cls, readonly=False):
+ if not cls.__warn__ is False and cls.__hasmutable__ and readonly is False:
+ print("Warning: If you arn't going to change any of the values call with True.", file=cls.__warn__)
+ return cls.iter("values", readonly)
+ def iteritems(cls, readonly=False):
+ if not cls.__warn__ is False and cls.__hasmutable__ and readonly is False:
+ print("Warning: If you arn't going to change any of the values call with True.", file=cls.__warn__)
+ return cls.iter("items", readonly)
+
+class COWSetMeta(COWDictMeta):
+ def __str__(cls):
+ # FIXME: I have magic numbers!
+ return "<COWSet Level: %i Current Keys: %i>" % (cls.__count__, len(cls.__dict__) -3)
+ __repr__ = __str__
+
+ def cow(cls):
+ class C(cls):
+ __count__ = cls.__count__ + 1
+ return C
+
+ def add(cls, value):
+ COWDictMeta.__setitem__(cls, repr(hash(value)), value)
+
+ def remove(cls, value):
+ COWDictMeta.__delitem__(cls, repr(hash(value)))
+
+ def __in__(cls, value):
+        return COWDictMeta.has_key(cls, repr(hash(value)))
+
+ def iterkeys(cls):
+ raise TypeError("sets don't have keys")
+
+ def iteritems(cls):
+ raise TypeError("sets don't have 'items'")
+
+# These are the actual classes you use!
+class COWDictBase(object):
+ __metaclass__ = COWDictMeta
+ __count__ = 0
+
+class COWSetBase(object):
+ __metaclass__ = COWSetMeta
+ __count__ = 0
+
+if __name__ == "__main__":
+ import sys
+ COWDictBase.__warn__ = sys.stderr
+ a = COWDictBase()
+ print("a", a)
+
+ a['a'] = 'a'
+ a['b'] = 'b'
+ a['dict'] = {}
+
+ b = a.copy()
+ print("b", b)
+ b['c'] = 'b'
+
+ print()
+
+ print("a", a)
+ for x in a.iteritems():
+ print(x)
+ print("--")
+ print("b", b)
+ for x in b.iteritems():
+ print(x)
+ print()
+
+ b['dict']['a'] = 'b'
+ b['a'] = 'c'
+
+ print("a", a)
+ for x in a.iteritems():
+ print(x)
+ print("--")
+ print("b", b)
+ for x in b.iteritems():
+ print(x)
+ print()
+
+ try:
+ b['dict2']
+ except KeyError as e:
+ print("Okay!")
+
+ a['set'] = COWSetBase()
+ a['set'].add("o1")
+ a['set'].add("o1")
+ a['set'].add("o2")
+
+ print("a", a)
+ for x in a['set'].itervalues():
+ print(x)
+ print("--")
+ print("b", b)
+ for x in b['set'].itervalues():
+ print(x)
+ print()
+
+ b['set'].add('o3')
+
+ print("a", a)
+ for x in a['set'].itervalues():
+ print(x)
+ print("--")
+ print("b", b)
+ for x in b['set'].itervalues():
+ print(x)
+ print()
+
+ a['set2'] = set()
+ a['set2'].add("o1")
+ a['set2'].add("o1")
+ a['set2'].add("o2")
+
+ print("a", a)
+ for x in a.iteritems():
+ print(x)
+ print("--")
+ print("b", b)
+ for x in b.iteritems(readonly=True):
+ print(x)
+ print()
+
+ del b['b']
+ try:
+ print(b['b'])
+ except KeyError:
+ print("Yay! deleted key raises error")
+
+ if b.has_key('b'):
+ print("Boo!")
+ else:
+ print("Yay - has_key with delete works!")
+
+ print("a", a)
+ for x in a.iteritems():
+ print(x)
+ print("--")
+ print("b", b)
+ for x in b.iteritems(readonly=True):
+ print(x)
+ print()
+
+ b.__revertitem__('b')
+
+ print("a", a)
+ for x in a.iteritems():
+ print(x)
+ print("--")
+ print("b", b)
+ for x in b.iteritems(readonly=True):
+ print(x)
+ print()
+
+ b.__revertitem__('dict')
+ print("a", a)
+ for x in a.iteritems():
+ print(x)
+ print("--")
+ print("b", b)
+ for x in b.iteritems(readonly=True):
+ print(x)
+ print()
diff --git a/bitbake/lib/bb/__init__.py b/bitbake/lib/bb/__init__.py
new file mode 100644
index 0000000..1f7946e
--- /dev/null
+++ b/bitbake/lib/bb/__init__.py
@@ -0,0 +1,142 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Build System Python Library
+#
+# Copyright (C) 2003 Holger Schurig
+# Copyright (C) 2003, 2004 Chris Larson
+#
+# Based on Gentoo's portage.py.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+__version__ = "1.27.1"
+
+import sys
+if sys.version_info < (2, 7, 3):
+ raise RuntimeError("Sorry, python 2.7.3 or later is required for this version of bitbake")
+
+
+class BBHandledException(Exception):
+ """
+ The big dilemma for generic bitbake code is what information to give the user
+ when an exception occurs. Any exception inheriting this base exception class
+ has already provided information to the user via some 'fired' message type such as
+ an explicitly fired event using bb.fire, or a bb.error message. If bitbake
+ encounters an exception derived from this class, no backtrace or other information
+    will be given to the user; it is assumed the earlier event provided the relevant information.
+ """
+ pass
+
+import os
+import logging
+
+
+class NullHandler(logging.Handler):
+ def emit(self, record):
+ pass
+
+Logger = logging.getLoggerClass()
+class BBLogger(Logger):
+ def __init__(self, name):
+ if name.split(".")[0] == "BitBake":
+ self.debug = self.bbdebug
+ Logger.__init__(self, name)
+
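+    # Debug level 1 maps to logging.DEBUG, level 2 to DEBUG - 1, and so on, so
+    # higher bitbake debug levels become progressively lower-priority records.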
+ def bbdebug(self, level, msg, *args, **kwargs):
+ return self.log(logging.DEBUG - level + 1, msg, *args, **kwargs)
+
+ def plain(self, msg, *args, **kwargs):
+ return self.log(logging.INFO + 1, msg, *args, **kwargs)
+
+ def verbose(self, msg, *args, **kwargs):
+ return self.log(logging.INFO - 1, msg, *args, **kwargs)
+
+logging.raiseExceptions = False
+logging.setLoggerClass(BBLogger)
+
+logger = logging.getLogger("BitBake")
+logger.addHandler(NullHandler())
+logger.setLevel(logging.DEBUG - 2)
+
+# This has to be imported after the setLoggerClass, as the import of bb.msg
+# can result in construction of the various loggers.
+import bb.msg
+
+from bb import fetch2 as fetch
+sys.modules['bb.fetch'] = sys.modules['bb.fetch2']
+
+# Messaging convenience functions
+def plain(*args):
+ logger.plain(''.join(args))
+
+def debug(lvl, *args):
+ if isinstance(lvl, basestring):
+ logger.warn("Passed invalid debug level '%s' to bb.debug", lvl)
+ args = (lvl,) + args
+ lvl = 1
+ logger.debug(lvl, ''.join(args))
+
+def note(*args):
+ logger.info(''.join(args))
+
+def warn(*args):
+ logger.warn(''.join(args))
+
+def error(*args, **kwargs):
+ logger.error(''.join(args), extra=kwargs)
+
+def fatal(*args, **kwargs):
+ logger.critical(''.join(args), extra=kwargs)
+ raise BBHandledException()
+
+def deprecated(func, name=None, advice=""):
+ """This is a decorator which can be used to mark functions
+ as deprecated. It will result in a warning being emitted
+ when the function is used."""
+ import warnings
+
+ if advice:
+ advice = ": %s" % advice
+ if name is None:
+ name = func.__name__
+
+ def newFunc(*args, **kwargs):
+ warnings.warn("Call to deprecated function %s%s." % (name,
+ advice),
+ category=DeprecationWarning,
+ stacklevel=2)
+ return func(*args, **kwargs)
+ newFunc.__name__ = func.__name__
+ newFunc.__doc__ = func.__doc__
+ newFunc.__dict__.update(func.__dict__)
+ return newFunc
+
+# For compatibility
+def deprecate_import(current, modulename, fromlist, renames = None):
+ """Import objects from one module into another, wrapping them with a DeprecationWarning"""
+ import sys
+
+ module = __import__(modulename, fromlist = fromlist)
+ for position, objname in enumerate(fromlist):
+ obj = getattr(module, objname)
+ newobj = deprecated(obj, "{0}.{1}".format(current, objname),
+ "Please use {0}.{1} instead".format(modulename, objname))
+ if renames:
+ newname = renames[position]
+ else:
+ newname = objname
+
+ setattr(sys.modules[current], newname, newobj)
+
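The messaging helpers above (plain, debug, note, warn, error, fatal) all route through the "BitBake" logger, which only has a NullHandler attached until a frontend installs its own. A minimal usage sketch, assuming a bitbake checkout with bitbake/lib on sys.path and the Python 2 interpreter this code targets (the path below is illustrative, not part of the patch):

    import sys
    sys.path.insert(0, "bitbake/lib")   # assumption: run from the top of a checkout
    import bb

    bb.note("starting")              # INFO on the "BitBake" logger
    bb.debug(2, "extra detail")      # BBLogger.bbdebug maps this below logging.DEBUG
    bb.warn("something looks off")
    try:
        bb.fatal("giving up")        # logs CRITICAL, then raises BBHandledException
    except bb.BBHandledException:
        pass                         # handled exceptions show no traceback to the user
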
diff --git a/bitbake/lib/bb/build.py b/bitbake/lib/bb/build.py
new file mode 100644
index 0000000..948c395
--- /dev/null
+++ b/bitbake/lib/bb/build.py
@@ -0,0 +1,770 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake 'Build' implementation
+#
+# Core code for function execution and task handling in the
+# BitBake build tools.
+#
+# Copyright (C) 2003, 2004 Chris Larson
+#
+# Based on Gentoo's portage.py.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+# Based on functions from the base bb module, Copyright 2003 Holger Schurig
+
+import os
+import sys
+import logging
+import shlex
+import glob
+import time
+import stat
+import bb
+import bb.msg
+import bb.process
+from contextlib import nested
+from bb import event, utils
+
+bblogger = logging.getLogger('BitBake')
+logger = logging.getLogger('BitBake.Build')
+
+NULL = open(os.devnull, 'r+')
+
+__mtime_cache = {}
+
+def cached_mtime_noerror(f):
+ if f not in __mtime_cache:
+ try:
+ __mtime_cache[f] = os.stat(f)[stat.ST_MTIME]
+ except OSError:
+ return 0
+ return __mtime_cache[f]
+
+def reset_cache():
+ global __mtime_cache
+ __mtime_cache = {}
+
+# When we execute a Python function, we'd like certain things
+# in all namespaces, hence we add them to __builtins__.
+# If we do not do this and use the exec globals, they will
+# not be available to subfunctions.
+__builtins__['bb'] = bb
+__builtins__['os'] = os
+
+class FuncFailed(Exception):
+ def __init__(self, name = None, logfile = None):
+ self.logfile = logfile
+ self.name = name
+ if name:
+ self.msg = 'Function failed: %s' % name
+ else:
+ self.msg = "Function failed"
+
+ def __str__(self):
+ if self.logfile and os.path.exists(self.logfile):
+ msg = ("%s (log file is located at %s)" %
+ (self.msg, self.logfile))
+ else:
+ msg = self.msg
+ return msg
+
+class TaskBase(event.Event):
+ """Base class for task events"""
+
+ def __init__(self, t, logfile, d):
+ self._task = t
+ self._package = d.getVar("PF", True)
+ self.taskfile = d.getVar("FILE", True)
+ self.taskname = self._task
+ self.logfile = logfile
+ self.time = time.time()
+ event.Event.__init__(self)
+ self._message = "recipe %s: task %s: %s" % (d.getVar("PF", True), t, self.getDisplayName())
+
+ def getTask(self):
+ return self._task
+
+ def setTask(self, task):
+ self._task = task
+
+ def getDisplayName(self):
+ return bb.event.getName(self)[4:]
+
+ task = property(getTask, setTask, None, "task property")
+
+class TaskStarted(TaskBase):
+ """Task execution started"""
+ def __init__(self, t, logfile, taskflags, d):
+ super(TaskStarted, self).__init__(t, logfile, d)
+ self.taskflags = taskflags
+
+class TaskSucceeded(TaskBase):
+ """Task execution completed"""
+
+class TaskFailed(TaskBase):
+ """Task execution failed"""
+
+ def __init__(self, task, logfile, metadata, errprinted = False):
+ self.errprinted = errprinted
+ super(TaskFailed, self).__init__(task, logfile, metadata)
+
+class TaskFailedSilent(TaskBase):
+ """Task execution failed (silently)"""
+ def getDisplayName(self):
+ # Don't need to tell the user it was silent
+ return "Failed"
+
+class TaskInvalid(TaskBase):
+
+ def __init__(self, task, metadata):
+ super(TaskInvalid, self).__init__(task, None, metadata)
+ self._message = "No such task '%s'" % task
+
+
+class LogTee(object):
+ def __init__(self, logger, outfile):
+ self.outfile = outfile
+ self.logger = logger
+ self.name = self.outfile.name
+
+ def write(self, string):
+ self.logger.plain(string)
+ self.outfile.write(string)
+
+ def __enter__(self):
+ self.outfile.__enter__()
+ return self
+
+ def __exit__(self, *excinfo):
+ self.outfile.__exit__(*excinfo)
+
+ def __repr__(self):
+ return '<LogTee {0}>'.format(self.name)
+ def flush(self):
+ self.outfile.flush()
+
+def exec_func(func, d, dirs = None):
+ """Execute a BB 'function'"""
+
+ body = d.getVar(func, False)
+ if not body:
+ if body is None:
+ logger.warn("Function %s doesn't exist", func)
+ return
+
+ flags = d.getVarFlags(func)
+ cleandirs = flags.get('cleandirs')
+ if cleandirs:
+ for cdir in d.expand(cleandirs).split():
+ bb.utils.remove(cdir, True)
+ bb.utils.mkdirhier(cdir)
+
+ if dirs is None:
+ dirs = flags.get('dirs')
+ if dirs:
+ dirs = d.expand(dirs).split()
+
+ if dirs:
+ for adir in dirs:
+ bb.utils.mkdirhier(adir)
+ adir = dirs[-1]
+ else:
+ adir = d.getVar('B', True)
+ bb.utils.mkdirhier(adir)
+
+ ispython = flags.get('python')
+
+ lockflag = flags.get('lockfiles')
+ if lockflag:
+ lockfiles = [f for f in d.expand(lockflag).split()]
+ else:
+ lockfiles = None
+
+ tempdir = d.getVar('T', True)
+
+    # Falling back to 'func' allows functions to be executed outside of the
+    # normal task set, such as buildhistory
+ task = d.getVar('BB_RUNTASK', True) or func
+ if task == func:
+ taskfunc = task
+ else:
+ taskfunc = "%s.%s" % (task, func)
+
+ runfmt = d.getVar('BB_RUNFMT', True) or "run.{func}.{pid}"
+ runfn = runfmt.format(taskfunc=taskfunc, task=task, func=func, pid=os.getpid())
+ runfile = os.path.join(tempdir, runfn)
+ bb.utils.mkdirhier(os.path.dirname(runfile))
+
+    # Set up the courtesy link to the runfn, only for tasks.
+    # We create the link just before the run script is created;
+    # if we created it afterwards and the run script failed, the
+    # link would never be created, as an exception would be raised.
+ if task == func:
+ runlink = os.path.join(tempdir, 'run.{0}'.format(task))
+ if runlink:
+ bb.utils.remove(runlink)
+
+ try:
+ os.symlink(runfn, runlink)
+ except OSError:
+ pass
+
+ with bb.utils.fileslocked(lockfiles):
+ if ispython:
+ exec_func_python(func, d, runfile, cwd=adir)
+ else:
+ exec_func_shell(func, d, runfile, cwd=adir)
+
+_functionfmt = """
+def {function}(d):
+{body}
+
+{function}(d)
+"""
+logformatter = bb.msg.BBLogFormatter("%(levelname)s: %(message)s")
+def exec_func_python(func, d, runfile, cwd=None):
+ """Execute a python BB 'function'"""
+
+ bbfile = d.getVar('FILE', True)
+ code = _functionfmt.format(function=func, body=d.getVar(func, True))
+ bb.utils.mkdirhier(os.path.dirname(runfile))
+ with open(runfile, 'w') as script:
+ bb.data.emit_func_python(func, script, d)
+
+ if cwd:
+ try:
+ olddir = os.getcwd()
+ except OSError:
+ olddir = None
+ os.chdir(cwd)
+
+ bb.debug(2, "Executing python function %s" % func)
+
+ try:
+ comp = utils.better_compile(code, func, bbfile)
+ utils.better_exec(comp, {"d": d}, code, bbfile)
+ except (bb.parse.SkipRecipe, bb.build.FuncFailed):
+ raise
+ except:
+ raise FuncFailed(func, None)
+ finally:
+ bb.debug(2, "Python function %s finished" % func)
+
+ if cwd and olddir:
+ try:
+ os.chdir(olddir)
+ except OSError:
+ pass
+
+def shell_trap_code():
+ return '''#!/bin/sh\n
+# Emit a useful diagnostic if something fails:
+bb_exit_handler() {
+ ret=$?
+ case $ret in
+ 0) ;;
+ *) case $BASH_VERSION in
+ "") echo "WARNING: exit code $ret from a shell command.";;
+ *) echo "WARNING: ${BASH_SOURCE[0]}:${BASH_LINENO[0]} exit $ret from
+ \"$BASH_COMMAND\"";;
+ esac
+ exit $ret
+ esac
+}
+trap 'bb_exit_handler' 0
+set -e
+'''
+
+def exec_func_shell(func, d, runfile, cwd=None):
+ """Execute a shell function from the metadata
+
+ Note on directory behavior. The 'dirs' varflag should contain a list
+ of the directories you need created prior to execution. The last
+ item in the list is where we will chdir/cd to.
+ """
+
+ # Don't let the emitted shell script override PWD
+ d.delVarFlag('PWD', 'export')
+
+ with open(runfile, 'w') as script:
+ script.write(shell_trap_code())
+
+ bb.data.emit_func(func, script, d)
+
+ if bb.msg.loggerVerboseLogs:
+ script.write("set -x\n")
+ if cwd:
+ script.write("cd '%s'\n" % cwd)
+ script.write("%s\n" % func)
+ script.write('''
+# cleanup
+ret=$?
+trap '' 0
+exit $ret
+''')
+
+ os.chmod(runfile, 0775)
+
+ cmd = runfile
+ if d.getVarFlag(func, 'fakeroot'):
+ fakerootcmd = d.getVar('FAKEROOT', True)
+ if fakerootcmd:
+ cmd = [fakerootcmd, runfile]
+
+ if bb.msg.loggerDefaultVerbose:
+ logfile = LogTee(logger, sys.stdout)
+ else:
+ logfile = sys.stdout
+
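+    # The shell side of the task writes NUL-separated "<command> <message>"
+    # records (bbplain, bbnote, bbwarn, bberror, bbfatal, bbdebug, ...) into the
+    # FIFO created below; readfifo() maps them back onto the Python log helpers.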
+ def readfifo(data):
+ lines = data.split('\0')
+ for line in lines:
+ splitval = line.split(' ', 1)
+ cmd = splitval[0]
+ if len(splitval) > 1:
+ value = splitval[1]
+ else:
+ value = ''
+ if cmd == 'bbplain':
+ bb.plain(value)
+ elif cmd == 'bbnote':
+ bb.note(value)
+ elif cmd == 'bbwarn':
+ bb.warn(value)
+ elif cmd == 'bberror':
+ bb.error(value)
+ elif cmd == 'bbfatal':
+ # The caller will call exit themselves, so bb.error() is
+ # what we want here rather than bb.fatal()
+ bb.error(value)
+ elif cmd == 'bbfatal_log':
+ bb.error(value, forcelog=True)
+ elif cmd == 'bbdebug':
+ splitval = value.split(' ', 1)
+ level = int(splitval[0])
+ value = splitval[1]
+ bb.debug(level, value)
+
+ tempdir = d.getVar('T', True)
+ fifopath = os.path.join(tempdir, 'fifo.%s' % os.getpid())
+ if os.path.exists(fifopath):
+ os.unlink(fifopath)
+ os.mkfifo(fifopath)
+ with open(fifopath, 'r+') as fifo:
+ try:
+ bb.debug(2, "Executing shell function %s" % func)
+
+ try:
+ with open(os.devnull, 'r+') as stdin:
+ bb.process.run(cmd, shell=False, stdin=stdin, log=logfile, extrafiles=[(fifo,readfifo)])
+ except bb.process.CmdError:
+ logfn = d.getVar('BB_LOGFILE', True)
+ raise FuncFailed(func, logfn)
+ finally:
+ os.unlink(fifopath)
+
+ bb.debug(2, "Shell function %s finished" % func)
+
+def _task_data(fn, task, d):
+ localdata = bb.data.createCopy(d)
+ localdata.setVar('BB_FILENAME', fn)
+ localdata.setVar('BB_CURRENTTASK', task[3:])
+ localdata.setVar('OVERRIDES', 'task-%s:%s' %
+ (task[3:].replace('_', '-'), d.getVar('OVERRIDES', False)))
+ localdata.finalize()
+ bb.data.expandKeys(localdata)
+ return localdata
+
+def _exec_task(fn, task, d, quieterr):
+ """Execute a BB 'task'
+
+ Execution of a task involves a bit more setup than executing a function,
+ running it with its own local metadata, and with some useful variables set.
+ """
+ if not d.getVarFlag(task, 'task'):
+ event.fire(TaskInvalid(task, d), d)
+ logger.error("No such task: %s" % task)
+ return 1
+
+ logger.debug(1, "Executing task %s", task)
+
+ localdata = _task_data(fn, task, d)
+ tempdir = localdata.getVar('T', True)
+ if not tempdir:
+ bb.fatal("T variable not set, unable to build")
+
+ # Change nice level if we're asked to
+ nice = localdata.getVar("BB_TASK_NICE_LEVEL", True)
+ if nice:
+ curnice = os.nice(0)
+ nice = int(nice) - curnice
+ newnice = os.nice(nice)
+ logger.debug(1, "Renice to %s " % newnice)
+
+ bb.utils.mkdirhier(tempdir)
+
+ # Determine the logfile to generate
+ logfmt = localdata.getVar('BB_LOGFMT', True) or 'log.{task}.{pid}'
+ logbase = logfmt.format(task=task, pid=os.getpid())
+
+ # Document the order of the tasks...
+ logorder = os.path.join(tempdir, 'log.task_order')
+ try:
+ with open(logorder, 'a') as logorderfile:
+ logorderfile.write('{0} ({1}): {2}\n'.format(task, os.getpid(), logbase))
+ except OSError:
+ logger.exception("Opening log file '%s'", logorder)
+ pass
+
+ # Setup the courtesy link to the logfn
+ loglink = os.path.join(tempdir, 'log.{0}'.format(task))
+ logfn = os.path.join(tempdir, logbase)
+ if loglink:
+ bb.utils.remove(loglink)
+
+ try:
+ os.symlink(logbase, loglink)
+ except OSError:
+ pass
+
+ prefuncs = localdata.getVarFlag(task, 'prefuncs', expand=True)
+ postfuncs = localdata.getVarFlag(task, 'postfuncs', expand=True)
+
+ class ErrorCheckHandler(logging.Handler):
+ def __init__(self):
+ self.triggered = False
+ logging.Handler.__init__(self, logging.ERROR)
+ def emit(self, record):
+ if getattr(record, 'forcelog', False):
+ self.triggered = False
+ else:
+ self.triggered = True
+
+ # Handle logfiles
+ si = open('/dev/null', 'r')
+ try:
+ bb.utils.mkdirhier(os.path.dirname(logfn))
+ logfile = open(logfn, 'w')
+ except OSError:
+ logger.exception("Opening log file '%s'", logfn)
+ pass
+
+    # Dup the existing fds so we don't lose them
+ osi = [os.dup(sys.stdin.fileno()), sys.stdin.fileno()]
+ oso = [os.dup(sys.stdout.fileno()), sys.stdout.fileno()]
+ ose = [os.dup(sys.stderr.fileno()), sys.stderr.fileno()]
+
+ # Replace those fds with our own
+ os.dup2(si.fileno(), osi[1])
+ os.dup2(logfile.fileno(), oso[1])
+ os.dup2(logfile.fileno(), ose[1])
+
+ # Ensure Python logging goes to the logfile
+ handler = logging.StreamHandler(logfile)
+ handler.setFormatter(logformatter)
+ # Always enable full debug output into task logfiles
+ handler.setLevel(logging.DEBUG - 2)
+ bblogger.addHandler(handler)
+
+ errchk = ErrorCheckHandler()
+ bblogger.addHandler(errchk)
+
+ localdata.setVar('BB_LOGFILE', logfn)
+ localdata.setVar('BB_RUNTASK', task)
+
+ flags = localdata.getVarFlags(task)
+
+ event.fire(TaskStarted(task, logfn, flags, localdata), localdata)
+ try:
+ for func in (prefuncs or '').split():
+ exec_func(func, localdata)
+ exec_func(task, localdata)
+ for func in (postfuncs or '').split():
+ exec_func(func, localdata)
+ except FuncFailed as exc:
+ if quieterr:
+ event.fire(TaskFailedSilent(task, logfn, localdata), localdata)
+ else:
+ errprinted = errchk.triggered
+ logger.error(str(exc))
+ event.fire(TaskFailed(task, logfn, localdata, errprinted), localdata)
+ return 1
+ finally:
+ sys.stdout.flush()
+ sys.stderr.flush()
+
+ bblogger.removeHandler(handler)
+
+ # Restore the backup fds
+ os.dup2(osi[0], osi[1])
+ os.dup2(oso[0], oso[1])
+ os.dup2(ose[0], ose[1])
+
+ # Close the backup fds
+ os.close(osi[0])
+ os.close(oso[0])
+ os.close(ose[0])
+ si.close()
+
+ logfile.close()
+ if os.path.exists(logfn) and os.path.getsize(logfn) == 0:
+ logger.debug(2, "Zero size logfn %s, removing", logfn)
+ bb.utils.remove(logfn)
+ bb.utils.remove(loglink)
+ event.fire(TaskSucceeded(task, logfn, localdata), localdata)
+
+ if not localdata.getVarFlag(task, 'nostamp') and not localdata.getVarFlag(task, 'selfstamp'):
+ make_stamp(task, localdata)
+
+ return 0
+
+def exec_task(fn, task, d, profile = False):
+ try:
+ quieterr = False
+ if d.getVarFlag(task, "quieterrors") is not None:
+ quieterr = True
+
+ if profile:
+ profname = "profile-%s.log" % (d.getVar("PN", True) + "-" + task)
+ try:
+ import cProfile as profile
+ except:
+ import profile
+ prof = profile.Profile()
+ ret = profile.Profile.runcall(prof, _exec_task, fn, task, d, quieterr)
+ prof.dump_stats(profname)
+ bb.utils.process_profilelog(profname)
+
+ return ret
+ else:
+ return _exec_task(fn, task, d, quieterr)
+
+ except Exception:
+ from traceback import format_exc
+ if not quieterr:
+ logger.error("Build of %s failed" % (task))
+ logger.error(format_exc())
+ failedevent = TaskFailed(task, None, d, True)
+ event.fire(failedevent, d)
+ return 1
+
+def stamp_internal(taskname, d, file_name, baseonly=False):
+ """
+ Internal stamp helper function
+ Makes sure the stamp directory exists
+ Returns the stamp path+filename
+
+ In the bitbake core, d can be a CacheData and file_name will be set.
+    When called in task context, d will be a data store and file_name will not be set.
+ """
+ taskflagname = taskname
+ if taskname.endswith("_setscene") and taskname != "do_setscene":
+ taskflagname = taskname.replace("_setscene", "")
+
+ if file_name:
+ stamp = d.stamp_base[file_name].get(taskflagname) or d.stamp[file_name]
+ extrainfo = d.stamp_extrainfo[file_name].get(taskflagname) or ""
+ else:
+ stamp = d.getVarFlag(taskflagname, 'stamp-base', True) or d.getVar('STAMP', True)
+ file_name = d.getVar('BB_FILENAME', True)
+ extrainfo = d.getVarFlag(taskflagname, 'stamp-extra-info', True) or ""
+
+ if baseonly:
+ return stamp
+
+ if not stamp:
+ return
+
+ stamp = bb.parse.siggen.stampfile(stamp, file_name, taskname, extrainfo)
+
+ stampdir = os.path.dirname(stamp)
+ if cached_mtime_noerror(stampdir) == 0:
+ bb.utils.mkdirhier(stampdir)
+
+ return stamp
+
+def stamp_cleanmask_internal(taskname, d, file_name):
+ """
+ Internal stamp helper function to generate stamp cleaning mask
+ Returns the stamp path+filename
+
+ In the bitbake core, d can be a CacheData and file_name will be set.
+    When called in task context, d will be a data store and file_name will not be set.
+ """
+ taskflagname = taskname
+ if taskname.endswith("_setscene") and taskname != "do_setscene":
+ taskflagname = taskname.replace("_setscene", "")
+
+ if file_name:
+ stamp = d.stamp_base_clean[file_name].get(taskflagname) or d.stampclean[file_name]
+ extrainfo = d.stamp_extrainfo[file_name].get(taskflagname) or ""
+ else:
+ stamp = d.getVarFlag(taskflagname, 'stamp-base-clean', True) or d.getVar('STAMPCLEAN', True)
+ file_name = d.getVar('BB_FILENAME', True)
+ extrainfo = d.getVarFlag(taskflagname, 'stamp-extra-info', True) or ""
+
+ if not stamp:
+ return []
+
+ cleanmask = bb.parse.siggen.stampcleanmask(stamp, file_name, taskname, extrainfo)
+
+ return [cleanmask, cleanmask.replace(taskflagname, taskflagname + "_setscene")]
+
+def make_stamp(task, d, file_name = None):
+ """
+ Creates/updates a stamp for a given task
+ (d can be a data dict or dataCache)
+ """
+ cleanmask = stamp_cleanmask_internal(task, d, file_name)
+ for mask in cleanmask:
+ for name in glob.glob(mask):
+ # Preserve sigdata files in the stamps directory
+ if "sigdata" in name:
+ continue
+ # Preserve taint files in the stamps directory
+ if name.endswith('.taint'):
+ continue
+ os.unlink(name)
+
+ stamp = stamp_internal(task, d, file_name)
+ # Remove the file and recreate to force timestamp
+ # change on broken NFS filesystems
+ if stamp:
+ bb.utils.remove(stamp)
+ open(stamp, "w").close()
+
+ # If we're in task context, write out a signature file for each task
+ # as it completes
+ if not task.endswith("_setscene") and task != "do_setscene" and not file_name:
+ stampbase = stamp_internal(task, d, None, True)
+ file_name = d.getVar('BB_FILENAME', True)
+ bb.parse.siggen.dump_sigtask(file_name, task, stampbase, True)
+
+def del_stamp(task, d, file_name = None):
+ """
+ Removes a stamp for a given task
+ (d can be a data dict or dataCache)
+ """
+ stamp = stamp_internal(task, d, file_name)
+ bb.utils.remove(stamp)
+
+def write_taint(task, d, file_name = None):
+ """
+ Creates a "taint" file which will force the specified task and its
+ dependents to be re-run the next time by influencing the value of its
+ taskhash.
+ (d can be a data dict or dataCache)
+ """
+ import uuid
+ if file_name:
+ taintfn = d.stamp[file_name] + '.' + task + '.taint'
+ else:
+ taintfn = d.getVar('STAMP', True) + '.' + task + '.taint'
+ bb.utils.mkdirhier(os.path.dirname(taintfn))
+ # The specific content of the taint file is not really important,
+ # we just need it to be random, so a random UUID is used
+ with open(taintfn, 'w') as taintf:
+ taintf.write(str(uuid.uuid4()))
+
+def stampfile(taskname, d, file_name = None):
+ """
+ Return the stamp for a given task
+ (d can be a data dict or dataCache)
+ """
+ return stamp_internal(taskname, d, file_name)
+
+def add_tasks(tasklist, d):
+ task_deps = d.getVar('_task_deps', False)
+ if not task_deps:
+ task_deps = {}
+ if not 'tasks' in task_deps:
+ task_deps['tasks'] = []
+ if not 'parents' in task_deps:
+ task_deps['parents'] = {}
+
+ for task in tasklist:
+ task = d.expand(task)
+
+ d.setVarFlag(task, 'task', 1)
+
+ if not task in task_deps['tasks']:
+ task_deps['tasks'].append(task)
+
+ flags = d.getVarFlags(task)
+ def getTask(name):
+ if not name in task_deps:
+ task_deps[name] = {}
+ if name in flags:
+ deptask = d.expand(flags[name])
+ task_deps[name][task] = deptask
+ getTask('depends')
+ getTask('rdepends')
+ getTask('deptask')
+ getTask('rdeptask')
+ getTask('recrdeptask')
+ getTask('recideptask')
+ getTask('nostamp')
+ getTask('fakeroot')
+ getTask('noexec')
+ getTask('umask')
+ task_deps['parents'][task] = []
+ if 'deps' in flags:
+ for dep in flags['deps']:
+ dep = d.expand(dep)
+ task_deps['parents'][task].append(dep)
+
+    # Don't rely on the datastore still holding a reference to this dict; store it back explicitly
+ d.setVar('_task_deps', task_deps)
+
+def addtask(task, before, after, d):
+ if task[:3] != "do_":
+ task = "do_" + task
+
+ d.setVarFlag(task, "task", 1)
+ bbtasks = d.getVar('__BBTASKS', False) or []
+ if task not in bbtasks:
+ bbtasks.append(task)
+ d.setVar('__BBTASKS', bbtasks)
+
+ existing = d.getVarFlag(task, "deps") or []
+ if after is not None:
+ # set up deps for function
+ for entry in after.split():
+ if entry not in existing:
+ existing.append(entry)
+ d.setVarFlag(task, "deps", existing)
+ if before is not None:
+ # set up things that depend on this func
+ for entry in before.split():
+ existing = d.getVarFlag(entry, "deps") or []
+ if task not in existing:
+ d.setVarFlag(entry, "deps", [task] + existing)
+
+def deltask(task, d):
+ if task[:3] != "do_":
+ task = "do_" + task
+
+ bbtasks = d.getVar('__BBTASKS', False) or []
+ if task in bbtasks:
+ bbtasks.remove(task)
+ d.setVar('__BBTASKS', bbtasks)
+
+ d.delVarFlag(task, 'deps')
+ for bbtask in d.getVar('__BBTASKS', False) or []:
+ deps = d.getVarFlag(bbtask, 'deps') or []
+ if task in deps:
+ deps.remove(task)
+ d.setVarFlag(bbtask, 'deps', deps)
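addtask() and deltask() above do nothing more than maintain the '__BBTASKS' list and the per-task 'deps' varflags on the datastore. The sketch below drives them with a hypothetical FakeData stand-in (the real object is BitBake's datastore; FakeData exists only for illustration) and assumes bitbake/lib is importable under Python 2:

    from bb.build import addtask, deltask

    class FakeData(object):
        """Just enough of the datastore API for addtask()/deltask()."""
        def __init__(self):
            self.vars, self.flags = {}, {}
        def getVar(self, name, expand=False):
            return self.vars.get(name)
        def setVar(self, name, value):
            self.vars[name] = value
        def getVarFlag(self, name, flag, expand=False):
            return self.flags.get((name, flag))
        def setVarFlag(self, name, flag, value):
            self.flags[(name, flag)] = value
        def delVarFlag(self, name, flag):
            self.flags.pop((name, flag), None)

    d = FakeData()
    addtask("fetch", None, None, d)
    addtask("compile", None, "do_fetch", d)      # do_compile runs after do_fetch
    addtask("install", None, "do_compile", d)
    print(d.getVarFlag("do_install", "deps"))    # ['do_compile']
    deltask("compile", d)
    print(d.getVarFlag("do_install", "deps"))    # [] - the dependency was dropped
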
diff --git a/bitbake/lib/bb/cache.py b/bitbake/lib/bb/cache.py
new file mode 100644
index 0000000..ef4d660
--- /dev/null
+++ b/bitbake/lib/bb/cache.py
@@ -0,0 +1,837 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Cache implementation
+#
+# Caching of bitbake variables before task execution
+
+# Copyright (C) 2006 Richard Purdie
+# Copyright (C) 2012 Intel Corporation
+
+# but small sections based on code from bin/bitbake:
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2003, 2004 Phil Blundell
+# Copyright (C) 2003 - 2005 Michael 'Mickey' Lauer
+# Copyright (C) 2005 Holger Hans Peter Freyther
+# Copyright (C) 2005 ROAD GmbH
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+
+import os
+import logging
+from collections import defaultdict
+import bb.utils
+
+logger = logging.getLogger("BitBake.Cache")
+
+try:
+ import cPickle as pickle
+except ImportError:
+ import pickle
+ logger.info("Importing cPickle failed. "
+ "Falling back to a very slow implementation.")
+
+__cache_version__ = "148"
+
+def getCacheFile(path, filename, data_hash):
+ return os.path.join(path, filename + "." + data_hash)
+
+# RecipeInfoCommon defines common data retrieval methods
+# from metadata for caches. CoreRecipeInfo, as well as any other
+# extra RecipeInfo, needs to inherit from this class.
+class RecipeInfoCommon(object):
+
+ @classmethod
+ def listvar(cls, var, metadata):
+ return cls.getvar(var, metadata).split()
+
+ @classmethod
+ def intvar(cls, var, metadata):
+ return int(cls.getvar(var, metadata) or 0)
+
+ @classmethod
+ def depvar(cls, var, metadata):
+ return bb.utils.explode_deps(cls.getvar(var, metadata))
+
+ @classmethod
+ def pkgvar(cls, var, packages, metadata):
+ return dict((pkg, cls.depvar("%s_%s" % (var, pkg), metadata))
+ for pkg in packages)
+
+ @classmethod
+ def taskvar(cls, var, tasks, metadata):
+ return dict((task, cls.getvar("%s_task-%s" % (var, task), metadata))
+ for task in tasks)
+
+ @classmethod
+ def flaglist(cls, flag, varlist, metadata, squash=False):
+ out_dict = dict((var, metadata.getVarFlag(var, flag, True))
+ for var in varlist)
+ if squash:
+ return dict((k,v) for (k,v) in out_dict.iteritems() if v)
+ else:
+ return out_dict
+
+ @classmethod
+ def getvar(cls, var, metadata):
+ return metadata.getVar(var, True) or ''
+
+
+class CoreRecipeInfo(RecipeInfoCommon):
+ __slots__ = ()
+
+ cachefile = "bb_cache.dat"
+
+ def __init__(self, filename, metadata):
+ self.file_depends = metadata.getVar('__depends', False)
+ self.timestamp = bb.parse.cached_mtime(filename)
+ self.variants = self.listvar('__VARIANTS', metadata) + ['']
+ self.appends = self.listvar('__BBAPPEND', metadata)
+ self.nocache = self.getvar('__BB_DONT_CACHE', metadata)
+
+ self.skipreason = self.getvar('__SKIPPED', metadata)
+ if self.skipreason:
+ self.pn = self.getvar('PN', metadata) or bb.parse.BBHandler.vars_from_file(filename,metadata)[0]
+ self.skipped = True
+ self.provides = self.depvar('PROVIDES', metadata)
+ self.rprovides = self.depvar('RPROVIDES', metadata)
+ return
+
+ self.tasks = metadata.getVar('__BBTASKS', False)
+
+ self.pn = self.getvar('PN', metadata)
+ self.packages = self.listvar('PACKAGES', metadata)
+ if not self.pn in self.packages:
+ self.packages.append(self.pn)
+
+ self.basetaskhashes = self.taskvar('BB_BASEHASH', self.tasks, metadata)
+ self.hashfilename = self.getvar('BB_HASHFILENAME', metadata)
+
+ self.task_deps = metadata.getVar('_task_deps', False) or {'tasks': [], 'parents': {}}
+
+ self.skipped = False
+ self.pe = self.getvar('PE', metadata)
+ self.pv = self.getvar('PV', metadata)
+ self.pr = self.getvar('PR', metadata)
+ self.defaultpref = self.intvar('DEFAULT_PREFERENCE', metadata)
+ self.not_world = self.getvar('EXCLUDE_FROM_WORLD', metadata)
+ self.stamp = self.getvar('STAMP', metadata)
+ self.stampclean = self.getvar('STAMPCLEAN', metadata)
+ self.stamp_base = self.flaglist('stamp-base', self.tasks, metadata)
+ self.stamp_base_clean = self.flaglist('stamp-base-clean', self.tasks, metadata)
+ self.stamp_extrainfo = self.flaglist('stamp-extra-info', self.tasks, metadata)
+ self.file_checksums = self.flaglist('file-checksums', self.tasks, metadata, True)
+ self.packages_dynamic = self.listvar('PACKAGES_DYNAMIC', metadata)
+ self.depends = self.depvar('DEPENDS', metadata)
+ self.provides = self.depvar('PROVIDES', metadata)
+ self.rdepends = self.depvar('RDEPENDS', metadata)
+ self.rprovides = self.depvar('RPROVIDES', metadata)
+ self.rrecommends = self.depvar('RRECOMMENDS', metadata)
+ self.rprovides_pkg = self.pkgvar('RPROVIDES', self.packages, metadata)
+ self.rdepends_pkg = self.pkgvar('RDEPENDS', self.packages, metadata)
+ self.rrecommends_pkg = self.pkgvar('RRECOMMENDS', self.packages, metadata)
+ self.inherits = self.getvar('__inherit_cache', metadata)
+ self.fakerootenv = self.getvar('FAKEROOTENV', metadata)
+ self.fakerootdirs = self.getvar('FAKEROOTDIRS', metadata)
+ self.fakerootnoenv = self.getvar('FAKEROOTNOENV', metadata)
+
+ @classmethod
+ def init_cacheData(cls, cachedata):
+ # CacheData in Core RecipeInfo Class
+ cachedata.task_deps = {}
+ cachedata.pkg_fn = {}
+ cachedata.pkg_pn = defaultdict(list)
+ cachedata.pkg_pepvpr = {}
+ cachedata.pkg_dp = {}
+
+ cachedata.stamp = {}
+ cachedata.stampclean = {}
+ cachedata.stamp_base = {}
+ cachedata.stamp_base_clean = {}
+ cachedata.stamp_extrainfo = {}
+ cachedata.file_checksums = {}
+ cachedata.fn_provides = {}
+ cachedata.pn_provides = defaultdict(list)
+ cachedata.all_depends = []
+
+ cachedata.deps = defaultdict(list)
+ cachedata.packages = defaultdict(list)
+ cachedata.providers = defaultdict(list)
+ cachedata.rproviders = defaultdict(list)
+ cachedata.packages_dynamic = defaultdict(list)
+
+ cachedata.rundeps = defaultdict(lambda: defaultdict(list))
+ cachedata.runrecs = defaultdict(lambda: defaultdict(list))
+ cachedata.possible_world = []
+ cachedata.universe_target = []
+ cachedata.hashfn = {}
+
+ cachedata.basetaskhash = {}
+ cachedata.inherits = {}
+ cachedata.fakerootenv = {}
+ cachedata.fakerootnoenv = {}
+ cachedata.fakerootdirs = {}
+
+ def add_cacheData(self, cachedata, fn):
+ cachedata.task_deps[fn] = self.task_deps
+ cachedata.pkg_fn[fn] = self.pn
+ cachedata.pkg_pn[self.pn].append(fn)
+ cachedata.pkg_pepvpr[fn] = (self.pe, self.pv, self.pr)
+ cachedata.pkg_dp[fn] = self.defaultpref
+ cachedata.stamp[fn] = self.stamp
+ cachedata.stampclean[fn] = self.stampclean
+ cachedata.stamp_base[fn] = self.stamp_base
+ cachedata.stamp_base_clean[fn] = self.stamp_base_clean
+ cachedata.stamp_extrainfo[fn] = self.stamp_extrainfo
+ cachedata.file_checksums[fn] = self.file_checksums
+
+ provides = [self.pn]
+ for provide in self.provides:
+ if provide not in provides:
+ provides.append(provide)
+ cachedata.fn_provides[fn] = provides
+
+ for provide in provides:
+ cachedata.providers[provide].append(fn)
+ if provide not in cachedata.pn_provides[self.pn]:
+ cachedata.pn_provides[self.pn].append(provide)
+
+ for dep in self.depends:
+ if dep not in cachedata.deps[fn]:
+ cachedata.deps[fn].append(dep)
+ if dep not in cachedata.all_depends:
+ cachedata.all_depends.append(dep)
+
+ rprovides = self.rprovides
+ for package in self.packages:
+ cachedata.packages[package].append(fn)
+ rprovides += self.rprovides_pkg[package]
+
+ for rprovide in rprovides:
+ cachedata.rproviders[rprovide].append(fn)
+
+ for package in self.packages_dynamic:
+ cachedata.packages_dynamic[package].append(fn)
+
+ # Build hash of runtime depends and recommends
+ for package in self.packages + [self.pn]:
+ cachedata.rundeps[fn][package] = list(self.rdepends) + self.rdepends_pkg[package]
+ cachedata.runrecs[fn][package] = list(self.rrecommends) + self.rrecommends_pkg[package]
+
+ # Collect files we may need for possible world-dep
+ # calculations
+ if self.not_world:
+ logger.debug(1, "EXCLUDE FROM WORLD: %s", fn)
+ else:
+ cachedata.possible_world.append(fn)
+
+ # create a collection of all targets for sanity checking
+ # tasks, such as upstream versions, license, and tools for
+ # task and image creation.
+ cachedata.universe_target.append(self.pn)
+
+ cachedata.hashfn[fn] = self.hashfilename
+ for task, taskhash in self.basetaskhashes.iteritems():
+ identifier = '%s.%s' % (fn, task)
+ cachedata.basetaskhash[identifier] = taskhash
+
+ cachedata.inherits[fn] = self.inherits
+ cachedata.fakerootenv[fn] = self.fakerootenv
+ cachedata.fakerootnoenv[fn] = self.fakerootnoenv
+ cachedata.fakerootdirs[fn] = self.fakerootdirs
+
+
+
+class Cache(object):
+ """
+ BitBake Cache implementation
+ """
+
+ def __init__(self, data, data_hash, caches_array):
+ # Pass caches_array information into Cache Constructor
+ # It will be used later for deciding whether we
+ # need extra cache file dump/load support
+ self.caches_array = caches_array
+ self.cachedir = data.getVar("CACHE", True)
+ self.clean = set()
+ self.checked = set()
+ self.depends_cache = {}
+ self.data = None
+ self.data_fn = None
+ self.cacheclean = True
+ self.data_hash = data_hash
+
+ if self.cachedir in [None, '']:
+ self.has_cache = False
+ logger.info("Not using a cache. "
+ "Set CACHE = <directory> to enable.")
+ return
+
+ self.has_cache = True
+ self.cachefile = getCacheFile(self.cachedir, "bb_cache.dat", self.data_hash)
+
+ logger.debug(1, "Using cache in '%s'", self.cachedir)
+ bb.utils.mkdirhier(self.cachedir)
+
+ cache_ok = True
+ if self.caches_array:
+ for cache_class in self.caches_array:
+ if type(cache_class) is type and issubclass(cache_class, RecipeInfoCommon):
+ cachefile = getCacheFile(self.cachedir, cache_class.cachefile, self.data_hash)
+ cache_ok = cache_ok and os.path.exists(cachefile)
+ cache_class.init_cacheData(self)
+ if cache_ok:
+ self.load_cachefile()
+ elif os.path.isfile(self.cachefile):
+ logger.info("Out of date cache found, rebuilding...")
+
+ def load_cachefile(self):
+        # First, use the core cache file information for
+        # validity checking
+ with open(self.cachefile, "rb") as cachefile:
+ pickled = pickle.Unpickler(cachefile)
+ try:
+ cache_ver = pickled.load()
+ bitbake_ver = pickled.load()
+ except Exception:
+ logger.info('Invalid cache, rebuilding...')
+ return
+
+ if cache_ver != __cache_version__:
+ logger.info('Cache version mismatch, rebuilding...')
+ return
+ elif bitbake_ver != bb.__version__:
+ logger.info('Bitbake version mismatch, rebuilding...')
+ return
+
+
+ cachesize = 0
+ previous_progress = 0
+ previous_percent = 0
+
+        # Calculate the total size of all the cache files
+ for cache_class in self.caches_array:
+ if type(cache_class) is type and issubclass(cache_class, RecipeInfoCommon):
+ cachefile = getCacheFile(self.cachedir, cache_class.cachefile, self.data_hash)
+ with open(cachefile, "rb") as cachefile:
+ cachesize += os.fstat(cachefile.fileno()).st_size
+
+ bb.event.fire(bb.event.CacheLoadStarted(cachesize), self.data)
+
+ for cache_class in self.caches_array:
+ if type(cache_class) is type and issubclass(cache_class, RecipeInfoCommon):
+ cachefile = getCacheFile(self.cachedir, cache_class.cachefile, self.data_hash)
+ with open(cachefile, "rb") as cachefile:
+ pickled = pickle.Unpickler(cachefile)
+ while cachefile:
+ try:
+ key = pickled.load()
+ value = pickled.load()
+ except Exception:
+ break
+ if self.depends_cache.has_key(key):
+ self.depends_cache[key].append(value)
+ else:
+ self.depends_cache[key] = [value]
+                        # only fire progress events when the percentage changes
+ current_progress = cachefile.tell() + previous_progress
+ current_percent = 100 * current_progress / cachesize
+ if current_percent > previous_percent:
+ previous_percent = current_percent
+ bb.event.fire(bb.event.CacheLoadProgress(current_progress, cachesize),
+ self.data)
+
+ previous_progress += current_progress
+
+        # Note: the depends cache count corresponds to the number of parsed files.
+        # The same file may have several caches, but it is still regarded as one item in the cache.
+ bb.event.fire(bb.event.CacheLoadCompleted(cachesize,
+ len(self.depends_cache)),
+ self.data)
+
+
+ @staticmethod
+ def virtualfn2realfn(virtualfn):
+ """
+ Convert a virtual file name to a real one + the associated subclass keyword
+ """
+
+ fn = virtualfn
+ cls = ""
+ if virtualfn.startswith('virtual:'):
+ elems = virtualfn.split(':')
+ cls = ":".join(elems[1:-1])
+ fn = elems[-1]
+ return (fn, cls)
+
+ @staticmethod
+ def realfn2virtual(realfn, cls):
+ """
+ Convert a real filename + the associated subclass keyword to a virtual filename
+ """
+ if cls == "":
+ return realfn
+ return "virtual:" + cls + ":" + realfn
+
+ @classmethod
+ def loadDataFull(cls, virtualfn, appends, cfgData):
+ """
+ Return a complete set of data for fn.
+ To do this, we need to parse the file.
+ """
+
+ (fn, virtual) = cls.virtualfn2realfn(virtualfn)
+
+ logger.debug(1, "Parsing %s (full)", fn)
+
+ cfgData.setVar("__ONLYFINALISE", virtual or "default")
+ bb_data = cls.load_bbfile(fn, appends, cfgData)
+ return bb_data[virtual]
+
+ @classmethod
+ def parse(cls, filename, appends, configdata, caches_array):
+ """Parse the specified filename, returning the recipe information"""
+ infos = []
+ datastores = cls.load_bbfile(filename, appends, configdata)
+ depends = []
+ for variant, data in sorted(datastores.iteritems(),
+ key=lambda i: i[0],
+ reverse=True):
+ virtualfn = cls.realfn2virtual(filename, variant)
+ depends = depends + (data.getVar("__depends", False) or [])
+ if depends and not variant:
+ data.setVar("__depends", depends)
+
+ info_array = []
+ for cache_class in caches_array:
+ if type(cache_class) is type and issubclass(cache_class, RecipeInfoCommon):
+ info = cache_class(filename, data)
+ info_array.append(info)
+ infos.append((virtualfn, info_array))
+
+ return infos
+
+ def load(self, filename, appends, configdata):
+ """Obtain the recipe information for the specified filename,
+ using cached values if available, otherwise parsing.
+
+ Note that if it does parse to obtain the info, it will not
+ automatically add the information to the cache or to your
+ CacheData. Use the add or add_info method to do so after
+ running this, or use loadData instead."""
+ cached = self.cacheValid(filename, appends)
+ if cached:
+ infos = []
+ # info_array item is a list of [CoreRecipeInfo, XXXRecipeInfo]
+ info_array = self.depends_cache[filename]
+ for variant in info_array[0].variants:
+ virtualfn = self.realfn2virtual(filename, variant)
+ infos.append((virtualfn, self.depends_cache[virtualfn]))
+ else:
+ logger.debug(1, "Parsing %s", filename)
+ return self.parse(filename, appends, configdata, self.caches_array)
+
+ return cached, infos
+
+ def loadData(self, fn, appends, cfgData, cacheData):
+ """Load the recipe info for the specified filename,
+ parsing and adding to the cache if necessary, and adding
+ the recipe information to the supplied CacheData instance."""
+ skipped, virtuals = 0, 0
+
+ cached, infos = self.load(fn, appends, cfgData)
+ for virtualfn, info_array in infos:
+ if info_array[0].skipped:
+ logger.debug(1, "Skipping %s: %s", virtualfn, info_array[0].skipreason)
+ skipped += 1
+ else:
+ self.add_info(virtualfn, info_array, cacheData, not cached)
+ virtuals += 1
+
+ return cached, skipped, virtuals
+
+ def cacheValid(self, fn, appends):
+ """
+ Is the cache valid for fn?
+ Fast version, no timestamps checked.
+ """
+ if fn not in self.checked:
+ self.cacheValidUpdate(fn, appends)
+
+ # Is cache enabled?
+ if not self.has_cache:
+ return False
+ if fn in self.clean:
+ return True
+ return False
+
+ def cacheValidUpdate(self, fn, appends):
+ """
+ Is the cache valid for fn?
+ Make thorough (slower) checks including timestamps.
+ """
+ # Is cache enabled?
+ if not self.has_cache:
+ return False
+
+ self.checked.add(fn)
+
+ # File isn't in depends_cache
+ if not fn in self.depends_cache:
+ logger.debug(2, "Cache: %s is not cached", fn)
+ return False
+
+ mtime = bb.parse.cached_mtime_noerror(fn)
+
+ # Check file still exists
+ if mtime == 0:
+ logger.debug(2, "Cache: %s no longer exists", fn)
+ self.remove(fn)
+ return False
+
+ info_array = self.depends_cache[fn]
+ # Check the file's timestamp
+ if mtime != info_array[0].timestamp:
+ logger.debug(2, "Cache: %s changed", fn)
+ self.remove(fn)
+ return False
+
+ # Check dependencies are still valid
+ depends = info_array[0].file_depends
+ if depends:
+ for f, old_mtime in depends:
+ fmtime = bb.parse.cached_mtime_noerror(f)
+ # Check if file still exists
+ if old_mtime != 0 and fmtime == 0:
+ logger.debug(2, "Cache: %s's dependency %s was removed",
+ fn, f)
+ self.remove(fn)
+ return False
+
+ if (fmtime != old_mtime):
+ logger.debug(2, "Cache: %s's dependency %s changed",
+ fn, f)
+ self.remove(fn)
+ return False
+
+ if hasattr(info_array[0], 'file_checksums'):
+ for _, fl in info_array[0].file_checksums.items():
+ for f in fl.split():
+ if "*" in f:
+ continue
+ f, exist = f.split(":")
+ if (exist == "True" and not os.path.exists(f)) or (exist == "False" and os.path.exists(f)):
+ logger.debug(2, "Cache: %s's file checksum list file %s changed",
+ fn, f)
+ self.remove(fn)
+ return False
+
+ if appends != info_array[0].appends:
+ logger.debug(2, "Cache: appends for %s changed", fn)
+ logger.debug(2, "%s to %s" % (str(appends), str(info_array[0].appends)))
+ self.remove(fn)
+ return False
+
+ invalid = False
+ for cls in info_array[0].variants:
+ virtualfn = self.realfn2virtual(fn, cls)
+ self.clean.add(virtualfn)
+ if virtualfn not in self.depends_cache:
+ logger.debug(2, "Cache: %s is not cached", virtualfn)
+ invalid = True
+
+ # If any one of the variants is not present, mark as invalid for all
+ if invalid:
+ for cls in info_array[0].variants:
+ virtualfn = self.realfn2virtual(fn, cls)
+ if virtualfn in self.clean:
+ logger.debug(2, "Cache: Removing %s from cache", virtualfn)
+ self.clean.remove(virtualfn)
+ if fn in self.clean:
+ logger.debug(2, "Cache: Marking %s as not clean", fn)
+ self.clean.remove(fn)
+ return False
+
+ self.clean.add(fn)
+ return True
+
+ def remove(self, fn):
+ """
+ Remove a fn from the cache
+ Called from the parser in error cases
+ """
+ if fn in self.depends_cache:
+ logger.debug(1, "Removing %s from cache", fn)
+ del self.depends_cache[fn]
+ if fn in self.clean:
+ logger.debug(1, "Marking %s as unclean", fn)
+ self.clean.remove(fn)
+
+ def sync(self):
+ """
+ Save the cache
+ Called from the parser when complete (or exiting)
+ """
+
+ if not self.has_cache:
+ return
+
+ if self.cacheclean:
+ logger.debug(2, "Cache is clean, not saving.")
+ return
+
+ file_dict = {}
+ pickler_dict = {}
+ for cache_class in self.caches_array:
+ if type(cache_class) is type and issubclass(cache_class, RecipeInfoCommon):
+ cache_class_name = cache_class.__name__
+ cachefile = getCacheFile(self.cachedir, cache_class.cachefile, self.data_hash)
+ file_dict[cache_class_name] = open(cachefile, "wb")
+ pickler_dict[cache_class_name] = pickle.Pickler(file_dict[cache_class_name], pickle.HIGHEST_PROTOCOL)
+
+ pickler_dict['CoreRecipeInfo'].dump(__cache_version__)
+ pickler_dict['CoreRecipeInfo'].dump(bb.__version__)
+
+ try:
+ for key, info_array in self.depends_cache.iteritems():
+ for info in info_array:
+ if isinstance(info, RecipeInfoCommon):
+ cache_class_name = info.__class__.__name__
+ pickler_dict[cache_class_name].dump(key)
+ pickler_dict[cache_class_name].dump(info)
+ finally:
+ for cache_class in self.caches_array:
+ if type(cache_class) is type and issubclass(cache_class, RecipeInfoCommon):
+ cache_class_name = cache_class.__name__
+ file_dict[cache_class_name].close()
+
+ del self.depends_cache
+
+ @staticmethod
+ def mtime(cachefile):
+ return bb.parse.cached_mtime_noerror(cachefile)
+
+ def add_info(self, filename, info_array, cacheData, parsed=None, watcher=None):
+ if isinstance(info_array[0], CoreRecipeInfo) and (not info_array[0].skipped):
+ cacheData.add_from_recipeinfo(filename, info_array)
+
+ if watcher:
+ watcher(info_array[0].file_depends)
+
+ if not self.has_cache:
+ return
+
+ if (info_array[0].skipped or 'SRCREVINACTION' not in info_array[0].pv) and not info_array[0].nocache:
+ if parsed:
+ self.cacheclean = False
+ self.depends_cache[filename] = info_array
+
+ def add(self, file_name, data, cacheData, parsed=None):
+ """
+ Save data we need into the cache
+ """
+
+ realfn = self.virtualfn2realfn(file_name)[0]
+
+ info_array = []
+ for cache_class in self.caches_array:
+ if type(cache_class) is type and issubclass(cache_class, RecipeInfoCommon):
+ info_array.append(cache_class(realfn, data))
+ self.add_info(file_name, info_array, cacheData, parsed)
+
+ @staticmethod
+ def load_bbfile(bbfile, appends, config):
+ """
+ Load and parse one .bb build file
+ Return the data and whether parsing resulted in the file being skipped
+ """
+ chdir_back = False
+
+ from bb import parse
+
+ # expand tmpdir to include this topdir
+ config.setVar('TMPDIR', config.getVar('TMPDIR', True) or "")
+ bbfile_loc = os.path.abspath(os.path.dirname(bbfile))
+ oldpath = os.path.abspath(os.getcwd())
+ parse.cached_mtime_noerror(bbfile_loc)
+ bb_data = config.createCopy()
+        # The ConfHandler first checks whether there is a TOPDIR and,
+        # if not, calls getcwd().
+ # Previously, we chdir()ed to bbfile_loc, called the handler
+ # and finally chdir()ed back, a couple of thousand times. We now
+ # just fill in TOPDIR to point to bbfile_loc if there is no TOPDIR yet.
+ if not bb_data.getVar('TOPDIR', False):
+ chdir_back = True
+ bb_data.setVar('TOPDIR', bbfile_loc)
+ try:
+ if appends:
+ bb_data.setVar('__BBAPPEND', " ".join(appends))
+ bb_data = parse.handle(bbfile, bb_data)
+ if chdir_back:
+ os.chdir(oldpath)
+ return bb_data
+ except:
+ if chdir_back:
+ os.chdir(oldpath)
+ raise
+
+
+def init(cooker):
+ """
+ The Objective: Cache the minimum amount of data possible yet get to the
+ stage of building packages (i.e. tryBuild) without reparsing any .bb files.
+
+ To do this, we intercept getVar calls and only cache the variables we see
+ being accessed. We rely on the cache getVar calls being made for all
+ variables bitbake might need to use to reach this stage. For each cached
+ file we need to track:
+
+ * Its mtime
+ * The mtimes of all its dependencies
+ * Whether it caused a parse.SkipRecipe exception
+
+ Files causing parsing errors are evicted from the cache.
+
+ """
+ return Cache(cooker.configuration.data, cooker.configuration.data_hash)
+
+
+class CacheData(object):
+ """
+ The data structures we compile from the cached data
+ """
+
+ def __init__(self, caches_array):
+ self.caches_array = caches_array
+ for cache_class in self.caches_array:
+ if type(cache_class) is type and issubclass(cache_class, RecipeInfoCommon):
+ cache_class.init_cacheData(self)
+
+ # Direct cache variables
+ self.task_queues = {}
+ self.preferred = {}
+ self.tasks = {}
+ # Indirect Cache variables (set elsewhere)
+ self.ignored_dependencies = []
+ self.world_target = set()
+ self.bbfile_priority = {}
+
+ def add_from_recipeinfo(self, fn, info_array):
+ for info in info_array:
+ info.add_cacheData(self, fn)
+
+class MultiProcessCache(object):
+ """
+ BitBake multi-process cache implementation
+
+ Used by the codeparser & file checksum caches
+ """
+
+ def __init__(self):
+ self.cachefile = None
+ self.cachedata = self.create_cachedata()
+ self.cachedata_extras = self.create_cachedata()
+
+ def init_cache(self, d):
+ cachedir = (d.getVar("PERSISTENT_DIR", True) or
+ d.getVar("CACHE", True))
+ if cachedir in [None, '']:
+ return
+ bb.utils.mkdirhier(cachedir)
+ self.cachefile = os.path.join(cachedir, self.__class__.cache_file_name)
+ logger.debug(1, "Using cache in '%s'", self.cachefile)
+
+ glf = bb.utils.lockfile(self.cachefile + ".lock")
+
+ try:
+ with open(self.cachefile, "rb") as f:
+ p = pickle.Unpickler(f)
+ data, version = p.load()
+ except:
+ bb.utils.unlockfile(glf)
+ return
+
+ bb.utils.unlockfile(glf)
+
+ if version != self.__class__.CACHE_VERSION:
+ return
+
+ self.cachedata = data
+
+ def create_cachedata(self):
+ data = [{}]
+ return data
+
+ def save_extras(self, d):
+ if not self.cachefile:
+ return
+
+ glf = bb.utils.lockfile(self.cachefile + ".lock", shared=True)
+
+ i = os.getpid()
+ lf = None
+ while not lf:
+ lf = bb.utils.lockfile(self.cachefile + ".lock." + str(i), retry=False)
+ if not lf or os.path.exists(self.cachefile + "-" + str(i)):
+ if lf:
+ bb.utils.unlockfile(lf)
+ lf = None
+ i = i + 1
+ continue
+
+ with open(self.cachefile + "-" + str(i), "wb") as f:
+ p = pickle.Pickler(f, -1)
+ p.dump([self.cachedata_extras, self.__class__.CACHE_VERSION])
+
+ bb.utils.unlockfile(lf)
+ bb.utils.unlockfile(glf)
+
+ def merge_data(self, source, dest):
+ for j in range(0,len(dest)):
+ for h in source[j]:
+ if h not in dest[j]:
+ dest[j][h] = source[j][h]
+
+ def save_merge(self, d):
+ if not self.cachefile:
+ return
+
+ glf = bb.utils.lockfile(self.cachefile + ".lock")
+
+ data = self.cachedata
+
+ for f in [y for y in os.listdir(os.path.dirname(self.cachefile)) if y.startswith(os.path.basename(self.cachefile) + '-')]:
+ f = os.path.join(os.path.dirname(self.cachefile), f)
+ try:
+ with open(f, "rb") as fd:
+ p = pickle.Unpickler(fd)
+ extradata, version = p.load()
+ except (IOError, EOFError):
+ os.unlink(f)
+ continue
+
+ if version != self.__class__.CACHE_VERSION:
+ os.unlink(f)
+ continue
+
+ self.merge_data(extradata, data)
+ os.unlink(f)
+
+ with open(self.cachefile, "wb") as f:
+ p = pickle.Pickler(f, -1)
+ p.dump([data, self.__class__.CACHE_VERSION])
+
+ bb.utils.unlockfile(glf)
+
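+# A compact, stand-alone sketch (editorial illustration, not taken from the
+# BitBake sources) of the protocol MultiProcessCache implements above: each
+# worker process dumps its new entries into a private "<cachefile>-<pid>"
+# pickle via save_extras(), and the parent later folds those files back into
+# the main cache file, mirroring save_merge()/merge_data(). The helper name
+# and the flat-dict layout are simplifications for illustration only.
+def _example_multiprocess_merge(cachefile):
+    import glob, os, pickle
+    merged = {}
+    if os.path.exists(cachefile):
+        with open(cachefile, "rb") as f:
+            merged = pickle.load(f)
+    for extra in glob.glob(cachefile + "-*"):
+        with open(extra, "rb") as f:
+            for key, value in pickle.load(f).items():
+                # existing entries win, as in merge_data() above
+                merged.setdefault(key, value)
+        os.unlink(extra)
+    with open(cachefile, "wb") as f:
+        pickle.dump(merged, f, -1)
+    return merged
+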
diff --git a/bitbake/lib/bb/cache_extra.py b/bitbake/lib/bb/cache_extra.py
new file mode 100644
index 0000000..83f4959
--- /dev/null
+++ b/bitbake/lib/bb/cache_extra.py
@@ -0,0 +1,75 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# All extra RecipeInfo classes are defined in this file. Currently,
+# only Hob (the Image Creator) requests some extra fields, so
+# HobRecipeInfo is defined here; it is named after 'hob' because that
+# tool introduced it. Users can also introduce other RecipeInfo
+# classes or simply reuse the ones already defined.
+# In a following patch, this newly defined extra RecipeInfo will be
+# dynamically loaded and used for loading/saving the extra cache
+# fields.
+
+# Copyright (C) 2011, Intel Corporation. All rights reserved.
+
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from bb.cache import RecipeInfoCommon
+
+class HobRecipeInfo(RecipeInfoCommon):
+ __slots__ = ()
+
+ classname = "HobRecipeInfo"
+ # please override this member with the correct data cache file
+ # such as (bb_cache.dat, bb_extracache_hob.dat)
+ cachefile = "bb_extracache_" + classname +".dat"
+
+ # override this member with the list of extra cache fields
+ # that this class will provide
+ cachefields = ['summary', 'license', 'section',
+ 'description', 'homepage', 'bugtracker',
+ 'prevision', 'files_info']
+
+ def __init__(self, filename, metadata):
+
+ self.summary = self.getvar('SUMMARY', metadata)
+ self.license = self.getvar('LICENSE', metadata)
+ self.section = self.getvar('SECTION', metadata)
+ self.description = self.getvar('DESCRIPTION', metadata)
+ self.homepage = self.getvar('HOMEPAGE', metadata)
+ self.bugtracker = self.getvar('BUGTRACKER', metadata)
+ self.prevision = self.getvar('PR', metadata)
+ self.files_info = self.getvar('FILES_INFO', metadata)
+
+ @classmethod
+ def init_cacheData(cls, cachedata):
+ # CacheData in Hob RecipeInfo Class
+ cachedata.summary = {}
+ cachedata.license = {}
+ cachedata.section = {}
+ cachedata.description = {}
+ cachedata.homepage = {}
+ cachedata.bugtracker = {}
+ cachedata.prevision = {}
+ cachedata.files_info = {}
+
+ def add_cacheData(self, cachedata, fn):
+ cachedata.summary[fn] = self.summary
+ cachedata.license[fn] = self.license
+ cachedata.section[fn] = self.section
+ cachedata.description[fn] = self.description
+ cachedata.homepage[fn] = self.homepage
+ cachedata.bugtracker[fn] = self.bugtracker
+ cachedata.prevision[fn] = self.prevision
+ cachedata.files_info[fn] = self.files_info
diff --git a/bitbake/lib/bb/checksum.py b/bitbake/lib/bb/checksum.py
new file mode 100644
index 0000000..514ff0b
--- /dev/null
+++ b/bitbake/lib/bb/checksum.py
@@ -0,0 +1,90 @@
+# Local file checksum cache implementation
+#
+# Copyright (C) 2012 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os
+import stat
+import bb.utils
+import logging
+from bb.cache import MultiProcessCache
+
+logger = logging.getLogger("BitBake.Cache")
+
+try:
+ import cPickle as pickle
+except ImportError:
+ import pickle
+ logger.info("Importing cPickle failed. "
+ "Falling back to a very slow implementation.")
+
+
+# mtime cache (non-persistent)
+# based upon the assumption that files do not change during bitbake run
+class FileMtimeCache(object):
+ cache = {}
+
+ def cached_mtime(self, f):
+ if f not in self.cache:
+ self.cache[f] = os.stat(f)[stat.ST_MTIME]
+ return self.cache[f]
+
+ def cached_mtime_noerror(self, f):
+ if f not in self.cache:
+ try:
+ self.cache[f] = os.stat(f)[stat.ST_MTIME]
+ except OSError:
+ return 0
+ return self.cache[f]
+
+ def update_mtime(self, f):
+ self.cache[f] = os.stat(f)[stat.ST_MTIME]
+ return self.cache[f]
+
+ def clear(self):
+ self.cache.clear()
+
+# Checksum + mtime cache (persistent)
+class FileChecksumCache(MultiProcessCache):
+ cache_file_name = "local_file_checksum_cache.dat"
+ CACHE_VERSION = 1
+
+ def __init__(self):
+ self.mtime_cache = FileMtimeCache()
+ MultiProcessCache.__init__(self)
+
+ def get_checksum(self, f):
+ entry = self.cachedata[0].get(f)
+ cmtime = self.mtime_cache.cached_mtime(f)
+ if entry:
+ (mtime, hashval) = entry
+ if cmtime == mtime:
+ return hashval
+ else:
+ bb.debug(2, "file %s changed mtime, recompute checksum" % f)
+
+ hashval = bb.utils.md5_file(f)
+ self.cachedata_extras[0][f] = (cmtime, hashval)
+ return hashval
+
+    def merge_data(self, source, dest):
+        for h in source[0]:
+            if h in dest[0]:
+                # both processes hashed this file; keep the entry with the newer mtime
+                (smtime, _) = source[0][h]
+                (dmtime, _) = dest[0][h]
+                if smtime > dmtime:
+                    dest[0][h] = source[0][h]
+            else:
+                dest[0][h] = source[0][h]
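+
+# A small, stand-alone sketch (editorial illustration, not taken from the
+# BitBake sources) of the idea behind get_checksum() above: the expensive
+# hash is only recomputed when a file's mtime no longer matches the mtime
+# stored next to the cached hash value.
+_example_checksum_cache = {}
+
+def _example_cached_md5(path):
+    import hashlib, os
+    mtime = os.stat(path).st_mtime
+    entry = _example_checksum_cache.get(path)
+    if entry and entry[0] == mtime:
+        return entry[1]
+    with open(path, "rb") as f:
+        hashval = hashlib.md5(f.read()).hexdigest()
+    _example_checksum_cache[path] = (mtime, hashval)
+    return hashval
+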
diff --git a/bitbake/lib/bb/codeparser.py b/bitbake/lib/bb/codeparser.py
new file mode 100644
index 0000000..82a3af4
--- /dev/null
+++ b/bitbake/lib/bb/codeparser.py
@@ -0,0 +1,414 @@
+import ast
+import codegen
+import logging
+import os.path
+import bb.utils, bb.data
+from itertools import chain
+from pysh import pyshyacc, pyshlex, sherrors
+from bb.cache import MultiProcessCache
+
+
+logger = logging.getLogger('BitBake.CodeParser')
+
+try:
+ import cPickle as pickle
+except ImportError:
+ import pickle
+ logger.info('Importing cPickle failed. Falling back to a very slow implementation.')
+
+
+def check_indent(codestr):
+ """If the code is indented, add a top level piece of code to 'remove' the indentation"""
+
+ i = 0
+ while codestr[i] in ["\n", "\t", " "]:
+ i = i + 1
+
+ if i == 0:
+ return codestr
+
+ if codestr[i-1] == "\t" or codestr[i-1] == " ":
+ return "if 1:\n" + codestr
+
+ return codestr
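+
+# A short, illustrative example (editorial, not from the original sources) of
+# the transformation above: indented bodies, as they typically appear in
+# metadata python functions, are wrapped so they still compile at top level:
+#
+#   check_indent("    d.setVar('X', '1')\n")
+#   # -> "if 1:\n    d.setVar('X', '1')\n"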
+
+
+# Basically pickle, in python 2.7.3 at least, does badly with data duplication
+# upon pickling and unpickling. Combine this with duplicate objects and things
+# are a mess.
+#
+# When the sets are originally created, python calls intern() on the set keys
+# which significantly improves memory usage. Sadly the pickle/unpickle process
+# doesn't call intern() on the keys and results in the same strings being duplicated
+# in memory. This also means pickle will save the same string multiple times in
+# the cache file.
+#
+# By having shell and python cacheline objects with setstate/getstate, we force
+# the object creation through our own routine where we can call intern (via internSet).
+#
+# We also use hashable frozensets and ensure we use references to these so that
+# duplicates can be removed, both in memory and in the resulting pickled data.
+#
+# By playing these games, the size of the cache file shrinks dramatically
+# meaning faster load times and the reloaded cache files also consume much less
+# memory. Smaller cache files, faster load times and lower memory usage is good.
+#
+# A custom getstate/setstate using tuples is actually worth 15% cachesize by
+# avoiding duplication of the attribute names!
+
+class SetCache(object):
+ def __init__(self):
+ self.setcache = {}
+
+ def internSet(self, items):
+
+ new = []
+ for i in items:
+ new.append(intern(i))
+ s = frozenset(new)
+ if hash(s) in self.setcache:
+ return self.setcache[hash(s)]
+ self.setcache[hash(s)] = s
+ return s
+
+codecache = SetCache()
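+
+# A small demonstration (editorial illustration, not part of the original
+# file) of why internSet() matters: two key sets built independently from
+# equal strings collapse to a single shared frozenset object, so memory and
+# the pickled cache only pay for it once.
+def _example_intern_dedup():
+    demo = SetCache()
+    a = demo.internSet(["CC", "CFLAGS", "LDFLAGS"])
+    b = demo.internSet(["LDFLAGS", "CFLAGS", "CC"])
+    assert a is b  # the same object, not merely an equal one
+    return a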
+
+class pythonCacheLine(object):
+ def __init__(self, refs, execs, contains):
+ self.refs = codecache.internSet(refs)
+ self.execs = codecache.internSet(execs)
+ self.contains = {}
+ for c in contains:
+ self.contains[c] = codecache.internSet(contains[c])
+
+ def __getstate__(self):
+ return (self.refs, self.execs, self.contains)
+
+ def __setstate__(self, state):
+ (refs, execs, contains) = state
+ self.__init__(refs, execs, contains)
+ def __hash__(self):
+ l = (hash(self.refs), hash(self.execs))
+ for c in sorted(self.contains.keys()):
+ l = l + (c, hash(self.contains[c]))
+ return hash(l)
+ def __repr__(self):
+ return " ".join([str(self.refs), str(self.execs), str(self.contains)])
+
+
+class shellCacheLine(object):
+ def __init__(self, execs):
+ self.execs = codecache.internSet(execs)
+
+ def __getstate__(self):
+ return (self.execs)
+
+ def __setstate__(self, state):
+ (execs) = state
+ self.__init__(execs)
+ def __hash__(self):
+ return hash(self.execs)
+ def __repr__(self):
+ return str(self.execs)
+
+class CodeParserCache(MultiProcessCache):
+ cache_file_name = "bb_codeparser.dat"
+ CACHE_VERSION = 7
+
+ def __init__(self):
+ MultiProcessCache.__init__(self)
+ self.pythoncache = self.cachedata[0]
+ self.shellcache = self.cachedata[1]
+ self.pythoncacheextras = self.cachedata_extras[0]
+ self.shellcacheextras = self.cachedata_extras[1]
+
+ # To avoid duplication in the codeparser cache, keep
+ # a lookup of hashes of objects we already have
+ self.pythoncachelines = {}
+ self.shellcachelines = {}
+
+ def newPythonCacheLine(self, refs, execs, contains):
+ cacheline = pythonCacheLine(refs, execs, contains)
+ h = hash(cacheline)
+ if h in self.pythoncachelines:
+ return self.pythoncachelines[h]
+ self.pythoncachelines[h] = cacheline
+ return cacheline
+
+ def newShellCacheLine(self, execs):
+ cacheline = shellCacheLine(execs)
+ h = hash(cacheline)
+ if h in self.shellcachelines:
+ return self.shellcachelines[h]
+ self.shellcachelines[h] = cacheline
+ return cacheline
+
+ def init_cache(self, d):
+ MultiProcessCache.init_cache(self, d)
+
+ # cachedata gets re-assigned in the parent
+ self.pythoncache = self.cachedata[0]
+ self.shellcache = self.cachedata[1]
+
+ def create_cachedata(self):
+ data = [{}, {}]
+ return data
+
+codeparsercache = CodeParserCache()
+
+def parser_cache_init(d):
+ codeparsercache.init_cache(d)
+
+def parser_cache_save(d):
+ codeparsercache.save_extras(d)
+
+def parser_cache_savemerge(d):
+ codeparsercache.save_merge(d)
+
+Logger = logging.getLoggerClass()
+class BufferedLogger(Logger):
+ def __init__(self, name, level=0, target=None):
+ Logger.__init__(self, name)
+ self.setLevel(level)
+ self.buffer = []
+ self.target = target
+
+ def handle(self, record):
+ self.buffer.append(record)
+
+ def flush(self):
+ for record in self.buffer:
+ self.target.handle(record)
+ self.buffer = []
+
+class PythonParser():
+ getvars = (".getVar", ".appendVar", ".prependVar")
+ containsfuncs = ("bb.utils.contains", "base_contains", "bb.utils.contains_any")
+ execfuncs = ("bb.build.exec_func", "bb.build.exec_task")
+
+ def warn(self, func, arg):
+ """Warn about calls of bitbake APIs which pass a non-literal
+ argument for the variable name, as we're not able to track such
+ a reference.
+ """
+
+ try:
+ funcstr = codegen.to_source(func)
+ argstr = codegen.to_source(arg)
+ except TypeError:
+ self.log.debug(2, 'Failed to convert function and argument to source form')
+ else:
+ self.log.debug(1, self.unhandled_message % (funcstr, argstr))
+
+ def visit_Call(self, node):
+ name = self.called_node_name(node.func)
+ if name and name.endswith(self.getvars) or name in self.containsfuncs:
+ if isinstance(node.args[0], ast.Str):
+ varname = node.args[0].s
+ if name in self.containsfuncs and isinstance(node.args[1], ast.Str):
+ if varname not in self.contains:
+ self.contains[varname] = set()
+ self.contains[varname].add(node.args[1].s)
+ else:
+ self.references.add(node.args[0].s)
+ else:
+ self.warn(node.func, node.args[0])
+ elif name in self.execfuncs:
+ if isinstance(node.args[0], ast.Str):
+ self.var_execs.add(node.args[0].s)
+ else:
+ self.warn(node.func, node.args[0])
+ elif name and isinstance(node.func, (ast.Name, ast.Attribute)):
+ self.execs.add(name)
+
+ def called_node_name(self, node):
+ """Given a called node, return its original string form"""
+ components = []
+ while node:
+ if isinstance(node, ast.Attribute):
+ components.append(node.attr)
+ node = node.value
+ elif isinstance(node, ast.Name):
+ components.append(node.id)
+ return '.'.join(reversed(components))
+ else:
+ break
+
+ def __init__(self, name, log):
+ self.var_execs = set()
+ self.contains = {}
+ self.execs = set()
+ self.references = set()
+ self.log = BufferedLogger('BitBake.Data.PythonParser', logging.DEBUG, log)
+
+ self.unhandled_message = "in call of %s, argument '%s' is not a string literal"
+ self.unhandled_message = "while parsing %s, %s" % (name, self.unhandled_message)
+
+ def parse_python(self, node):
+ if not node or not node.strip():
+ return
+
+ h = hash(str(node))
+
+ if h in codeparsercache.pythoncache:
+ self.references = set(codeparsercache.pythoncache[h].refs)
+ self.execs = set(codeparsercache.pythoncache[h].execs)
+ self.contains = {}
+ for i in codeparsercache.pythoncache[h].contains:
+ self.contains[i] = set(codeparsercache.pythoncache[h].contains[i])
+ return
+
+ if h in codeparsercache.pythoncacheextras:
+ self.references = set(codeparsercache.pythoncacheextras[h].refs)
+ self.execs = set(codeparsercache.pythoncacheextras[h].execs)
+ self.contains = {}
+ for i in codeparsercache.pythoncacheextras[h].contains:
+ self.contains[i] = set(codeparsercache.pythoncacheextras[h].contains[i])
+ return
+
+ code = compile(check_indent(str(node)), "<string>", "exec",
+ ast.PyCF_ONLY_AST)
+
+ for n in ast.walk(code):
+ if n.__class__.__name__ == "Call":
+ self.visit_Call(n)
+
+ self.execs.update(self.var_execs)
+
+ codeparsercache.pythoncacheextras[h] = codeparsercache.newPythonCacheLine(self.references, self.execs, self.contains)
+
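+# A brief usage sketch (editorial illustration, not taken from the BitBake
+# sources): feeding a python task body through PythonParser records which
+# variables it reads and which functions it executes.
+def _example_parse_python():
+    p = PythonParser("example", logger)
+    p.parse_python('if d.getVar("FOO", True):\n'
+                   '    bb.build.exec_func("do_something", d)\n')
+    # p.references == set(['FOO']); p.execs includes 'do_something'
+    return p.references, p.execs
+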
+class ShellParser():
+ def __init__(self, name, log):
+ self.funcdefs = set()
+ self.allexecs = set()
+ self.execs = set()
+ self.log = BufferedLogger('BitBake.Data.%s' % name, logging.DEBUG, log)
+ self.unhandled_template = "unable to handle non-literal command '%s'"
+ self.unhandled_template = "while parsing %s, %s" % (name, self.unhandled_template)
+
+ def parse_shell(self, value):
+ """Parse the supplied shell code in a string, returning the external
+ commands it executes.
+ """
+
+ h = hash(str(value))
+
+ if h in codeparsercache.shellcache:
+ self.execs = set(codeparsercache.shellcache[h].execs)
+ return self.execs
+
+ if h in codeparsercache.shellcacheextras:
+ self.execs = set(codeparsercache.shellcacheextras[h].execs)
+ return self.execs
+
+ self._parse_shell(value)
+ self.execs = set(cmd for cmd in self.allexecs if cmd not in self.funcdefs)
+
+ codeparsercache.shellcacheextras[h] = codeparsercache.newShellCacheLine(self.execs)
+
+ return self.execs
+
+ def _parse_shell(self, value):
+ try:
+ tokens, _ = pyshyacc.parse(value, eof=True, debug=False)
+ except pyshlex.NeedMore:
+ raise sherrors.ShellSyntaxError("Unexpected EOF")
+
+ for token in tokens:
+ self.process_tokens(token)
+
+ def process_tokens(self, tokens):
+ """Process a supplied portion of the syntax tree as returned by
+ pyshyacc.parse.
+ """
+
+ def function_definition(value):
+ self.funcdefs.add(value.name)
+ return [value.body], None
+
+ def case_clause(value):
+ # Element 0 of each item in the case is the list of patterns, and
+ # Element 1 of each item in the case is the list of commands to be
+ # executed when that pattern matches.
+ words = chain(*[item[0] for item in value.items])
+ cmds = chain(*[item[1] for item in value.items])
+ return cmds, words
+
+ def if_clause(value):
+ main = chain(value.cond, value.if_cmds)
+ rest = value.else_cmds
+ if isinstance(rest, tuple) and rest[0] == "elif":
+ return chain(main, if_clause(rest[1]))
+ else:
+ return chain(main, rest)
+
+ def simple_command(value):
+ return None, chain(value.words, (assign[1] for assign in value.assigns))
+
+ token_handlers = {
+ "and_or": lambda x: ((x.left, x.right), None),
+ "async": lambda x: ([x], None),
+ "brace_group": lambda x: (x.cmds, None),
+ "for_clause": lambda x: (x.cmds, x.items),
+ "function_definition": function_definition,
+ "if_clause": lambda x: (if_clause(x), None),
+ "pipeline": lambda x: (x.commands, None),
+ "redirect_list": lambda x: ([x.cmd], None),
+ "subshell": lambda x: (x.cmds, None),
+ "while_clause": lambda x: (chain(x.condition, x.cmds), None),
+ "until_clause": lambda x: (chain(x.condition, x.cmds), None),
+ "simple_command": simple_command,
+ "case_clause": case_clause,
+ }
+
+ for token in tokens:
+ name, value = token
+ try:
+ more_tokens, words = token_handlers[name](value)
+ except KeyError:
+ raise NotImplementedError("Unsupported token type " + name)
+
+ if more_tokens:
+ self.process_tokens(more_tokens)
+
+ if words:
+ self.process_words(words)
+
+ def process_words(self, words):
+ """Process a set of 'words' in pyshyacc parlance, which includes
+ extraction of executed commands from $() blocks, as well as grabbing
+ the command name argument.
+ """
+
+ words = list(words)
+ for word in list(words):
+ wtree = pyshlex.make_wordtree(word[1])
+ for part in wtree:
+ if not isinstance(part, list):
+ continue
+
+ if part[0] in ('`', '$('):
+ command = pyshlex.wordtree_as_string(part[1:-1])
+ self._parse_shell(command)
+
+ if word[0] in ("cmd_name", "cmd_word"):
+ if word in words:
+ words.remove(word)
+
+ usetoken = False
+ for word in words:
+ if word[0] in ("cmd_name", "cmd_word") or \
+ (usetoken and word[0] == "TOKEN"):
+ if "=" in word[1]:
+ usetoken = True
+ continue
+
+ cmd = word[1]
+ if cmd.startswith("$"):
+ self.log.debug(1, self.unhandled_template % cmd)
+ elif cmd == "eval":
+ command = " ".join(word for _, word in words[1:])
+ self._parse_shell(command)
+ else:
+ self.allexecs.add(cmd)
+ break
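+
+# A brief usage sketch (editorial illustration, not taken from the BitBake
+# sources): parse_shell() reports the external commands a shell fragment
+# would run, while functions defined within the fragment are excluded.
+def _example_parse_shell():
+    parser = ShellParser("example", logger)
+    execs = parser.parse_shell('do_clean() { rm -rf "${B}"; }\n'
+                               'do_clean\n'
+                               'make -C "${S}" all\n')
+    # execs contains 'rm' and 'make'; 'do_clean' is a locally defined function
+    return execs
+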
diff --git a/bitbake/lib/bb/command.py b/bitbake/lib/bb/command.py
new file mode 100644
index 0000000..398c1d6
--- /dev/null
+++ b/bitbake/lib/bb/command.py
@@ -0,0 +1,463 @@
+"""
+BitBake 'Command' module
+
+Provide an interface to interact with the bitbake server through 'commands'
+"""
+
+# Copyright (C) 2006-2007 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+"""
+The bitbake server takes 'commands' from its UI/commandline.
+Commands are either synchronous or asynchronous.
+Async commands return data to the client in the form of events.
+Sync commands must only return data through the function return value
+and must not trigger events, directly or indirectly.
+Commands are queued in a CommandQueue
+"""
+
+import bb.event
+import bb.cooker
+
+class CommandCompleted(bb.event.Event):
+ pass
+
+class CommandExit(bb.event.Event):
+ def __init__(self, exitcode):
+ bb.event.Event.__init__(self)
+ self.exitcode = int(exitcode)
+
+class CommandFailed(CommandExit):
+ def __init__(self, message):
+ self.error = message
+ CommandExit.__init__(self, 1)
+
+class CommandError(Exception):
+ pass
+
+class Command:
+ """
+ A queue of asynchronous commands for bitbake
+ """
+ def __init__(self, cooker):
+ self.cooker = cooker
+ self.cmds_sync = CommandsSync()
+ self.cmds_async = CommandsAsync()
+
+ # FIXME Add lock for this
+ self.currentAsyncCommand = None
+
+ def runCommand(self, commandline, ro_only = False):
+ command = commandline.pop(0)
+ if hasattr(CommandsSync, command):
+ # Can run synchronous commands straight away
+ command_method = getattr(self.cmds_sync, command)
+ if ro_only:
+ if not hasattr(command_method, 'readonly') or False == getattr(command_method, 'readonly'):
+                    return None, "Not able to execute non-readonly commands in readonly mode"
+ try:
+ if getattr(command_method, 'needconfig', False):
+ self.cooker.updateCacheSync()
+ result = command_method(self, commandline)
+ except CommandError as exc:
+ return None, exc.args[0]
+ except (Exception, SystemExit):
+ import traceback
+ return None, traceback.format_exc()
+ else:
+ return result, None
+ if self.currentAsyncCommand is not None:
+ return None, "Busy (%s in progress)" % self.currentAsyncCommand[0]
+ if command not in CommandsAsync.__dict__:
+ return None, "No such command"
+ self.currentAsyncCommand = (command, commandline)
+ self.cooker.configuration.server_register_idlecallback(self.cooker.runCommands, self.cooker)
+ return True, None
+
+ def runAsyncCommand(self):
+ try:
+ if self.cooker.state in (bb.cooker.state.error, bb.cooker.state.shutdown, bb.cooker.state.forceshutdown):
+ # updateCache will trigger a shutdown of the parser
+ # and then raise BBHandledException triggering an exit
+ self.cooker.updateCache()
+ return False
+ if self.currentAsyncCommand is not None:
+ (command, options) = self.currentAsyncCommand
+ commandmethod = getattr(CommandsAsync, command)
+ needcache = getattr( commandmethod, "needcache" )
+ if needcache and self.cooker.state != bb.cooker.state.running:
+ self.cooker.updateCache()
+ return True
+ else:
+ commandmethod(self.cmds_async, self, options)
+ return False
+ else:
+ return False
+ except KeyboardInterrupt as exc:
+ self.finishAsyncCommand("Interrupted")
+ return False
+ except SystemExit as exc:
+ arg = exc.args[0]
+ if isinstance(arg, basestring):
+ self.finishAsyncCommand(arg)
+ else:
+ self.finishAsyncCommand("Exited with %s" % arg)
+ return False
+ except Exception as exc:
+ import traceback
+ if isinstance(exc, bb.BBHandledException):
+ self.finishAsyncCommand("")
+ else:
+ self.finishAsyncCommand(traceback.format_exc())
+ return False
+
+ def finishAsyncCommand(self, msg=None, code=None):
+ if msg or msg == "":
+ bb.event.fire(CommandFailed(msg), self.cooker.expanded_data)
+ elif code:
+ bb.event.fire(CommandExit(code), self.cooker.expanded_data)
+ else:
+ bb.event.fire(CommandCompleted(), self.cooker.expanded_data)
+ self.currentAsyncCommand = None
+ self.cooker.finishcommand()
+
+class CommandsSync:
+ """
+ A class of synchronous commands
+ These should run quickly so as not to hurt interactive performance.
+    These must not influence any running asynchronous command.
+ """
+
+ def stateShutdown(self, command, params):
+ """
+ Trigger cooker 'shutdown' mode
+ """
+ command.cooker.shutdown(False)
+
+ def stateForceShutdown(self, command, params):
+ """
+ Stop the cooker
+ """
+ command.cooker.shutdown(True)
+
+ def getAllKeysWithFlags(self, command, params):
+ """
+ Returns a dump of the global state. Call with
+ variable flags to be retrieved as params.
+ """
+ flaglist = params[0]
+ return command.cooker.getAllKeysWithFlags(flaglist)
+ getAllKeysWithFlags.readonly = True
+
+ def getVariable(self, command, params):
+ """
+ Read the value of a variable from data
+ """
+ varname = params[0]
+ expand = True
+ if len(params) > 1:
+ expand = (params[1] == "True")
+
+ return command.cooker.data.getVar(varname, expand)
+ getVariable.readonly = True
+
+ def setVariable(self, command, params):
+ """
+ Set the value of variable in data
+ """
+ varname = params[0]
+ value = str(params[1])
+ command.cooker.data.setVar(varname, value)
+
+ def setConfig(self, command, params):
+ """
+ Set the value of variable in configuration
+ """
+ varname = params[0]
+ value = str(params[1])
+ setattr(command.cooker.configuration, varname, value)
+
+ def enableDataTracking(self, command, params):
+ """
+ Enable history tracking for variables
+ """
+ command.cooker.enableDataTracking()
+
+ def disableDataTracking(self, command, params):
+ """
+ Disable history tracking for variables
+ """
+ command.cooker.disableDataTracking()
+
+ def setPrePostConfFiles(self, command, params):
+ prefiles = params[0].split()
+ postfiles = params[1].split()
+ command.cooker.configuration.prefile = prefiles
+ command.cooker.configuration.postfile = postfiles
+ setPrePostConfFiles.needconfig = False
+
+ def getCpuCount(self, command, params):
+ """
+ Get the CPU count on the bitbake server
+ """
+ return bb.utils.cpu_count()
+ getCpuCount.readonly = True
+ getCpuCount.needconfig = False
+
+ def matchFile(self, command, params):
+ fMatch = params[0]
+ return command.cooker.matchFile(fMatch)
+ matchFile.needconfig = False
+
+ def generateNewImage(self, command, params):
+ image = params[0]
+ base_image = params[1]
+ package_queue = params[2]
+ timestamp = params[3]
+ description = params[4]
+ return command.cooker.generateNewImage(image, base_image,
+ package_queue, timestamp, description)
+
+ def ensureDir(self, command, params):
+ directory = params[0]
+ bb.utils.mkdirhier(directory)
+ ensureDir.needconfig = False
+
+ def setVarFile(self, command, params):
+ """
+ Save a variable in a file; used for saving in a configuration file
+ """
+ var = params[0]
+ val = params[1]
+ default_file = params[2]
+ op = params[3]
+ command.cooker.modifyConfigurationVar(var, val, default_file, op)
+ setVarFile.needconfig = False
+
+ def removeVarFile(self, command, params):
+ """
+ Remove a variable declaration from a file
+ """
+ var = params[0]
+ command.cooker.removeConfigurationVar(var)
+ removeVarFile.needconfig = False
+
+ def createConfigFile(self, command, params):
+ """
+ Create an extra configuration file
+ """
+ name = params[0]
+ command.cooker.createConfigFile(name)
+ createConfigFile.needconfig = False
+
+ def setEventMask(self, command, params):
+ handlerNum = params[0]
+ llevel = params[1]
+ debug_domains = params[2]
+ mask = params[3]
+ return bb.event.set_UIHmask(handlerNum, llevel, debug_domains, mask)
+ setEventMask.needconfig = False
+
+ def setFeatures(self, command, params):
+ """
+ Set the cooker features to include the passed list of features
+ """
+ features = params[0]
+ command.cooker.setFeatures(features)
+ setFeatures.needconfig = False
+ # although we change the internal state of the cooker, this is transparent since
+ # we always take and leave the cooker in state.initial
+ setFeatures.readonly = True
+
+ def updateConfig(self, command, params):
+ options = params[0]
+ environment = params[1]
+ command.cooker.updateConfigOpts(options, environment)
+ updateConfig.needconfig = False
+
+class CommandsAsync:
+ """
+ A class of asynchronous commands
+ These functions communicate via generated events.
+ Any function that requires metadata parsing should be here.
+ """
+
+ def buildFile(self, command, params):
+ """
+ Build a single specified .bb file
+ """
+ bfile = params[0]
+ task = params[1]
+
+ command.cooker.buildFile(bfile, task)
+ buildFile.needcache = False
+
+ def buildTargets(self, command, params):
+ """
+ Build a set of targets
+ """
+ pkgs_to_build = params[0]
+ task = params[1]
+
+ command.cooker.buildTargets(pkgs_to_build, task)
+ buildTargets.needcache = True
+
+ def generateDepTreeEvent(self, command, params):
+ """
+ Generate an event containing the dependency information
+ """
+ pkgs_to_build = params[0]
+ task = params[1]
+
+ command.cooker.generateDepTreeEvent(pkgs_to_build, task)
+ command.finishAsyncCommand()
+ generateDepTreeEvent.needcache = True
+
+ def generateDotGraph(self, command, params):
+ """
+ Dump dependency information to disk as .dot files
+ """
+ pkgs_to_build = params[0]
+ task = params[1]
+
+ command.cooker.generateDotGraphFiles(pkgs_to_build, task)
+ command.finishAsyncCommand()
+ generateDotGraph.needcache = True
+
+ def generateTargetsTree(self, command, params):
+ """
+ Generate a tree of buildable targets.
+ If klass is provided ensure all recipes that inherit the class are
+ included in the package list.
+ If pkg_list provided use that list (plus any extras brought in by
+ klass) rather than generating a tree for all packages.
+ """
+ klass = params[0]
+ pkg_list = params[1]
+
+ command.cooker.generateTargetsTree(klass, pkg_list)
+ command.finishAsyncCommand()
+ generateTargetsTree.needcache = True
+
+ def findCoreBaseFiles(self, command, params):
+ """
+        Find certain files in the COREBASE directory, e.g. layers
+ """
+ subdir = params[0]
+ filename = params[1]
+
+ command.cooker.findCoreBaseFiles(subdir, filename)
+ command.finishAsyncCommand()
+ findCoreBaseFiles.needcache = False
+
+ def findConfigFiles(self, command, params):
+ """
+        Find config files which provide appropriate values
+        for the passed configuration variable, e.g. MACHINE
+ """
+ varname = params[0]
+
+ command.cooker.findConfigFiles(varname)
+ command.finishAsyncCommand()
+ findConfigFiles.needcache = False
+
+ def findFilesMatchingInDir(self, command, params):
+ """
+ Find implementation files matching the specified pattern
+ in the requested subdirectory of a BBPATH
+ """
+ pattern = params[0]
+ directory = params[1]
+
+ command.cooker.findFilesMatchingInDir(pattern, directory)
+ command.finishAsyncCommand()
+ findFilesMatchingInDir.needcache = False
+
+ def findConfigFilePath(self, command, params):
+ """
+ Find the path of the requested configuration file
+ """
+ configfile = params[0]
+
+ command.cooker.findConfigFilePath(configfile)
+ command.finishAsyncCommand()
+ findConfigFilePath.needcache = False
+
+ def showVersions(self, command, params):
+ """
+ Show the currently selected versions
+ """
+ command.cooker.showVersions()
+ command.finishAsyncCommand()
+ showVersions.needcache = True
+
+ def showEnvironmentTarget(self, command, params):
+ """
+ Print the environment of a target recipe
+ (needs the cache to work out which recipe to use)
+ """
+ pkg = params[0]
+
+ command.cooker.showEnvironment(None, pkg)
+ command.finishAsyncCommand()
+ showEnvironmentTarget.needcache = True
+
+ def showEnvironment(self, command, params):
+ """
+ Print the standard environment
+ or if specified the environment for a specified recipe
+ """
+ bfile = params[0]
+
+ command.cooker.showEnvironment(bfile)
+ command.finishAsyncCommand()
+ showEnvironment.needcache = False
+
+ def parseFiles(self, command, params):
+ """
+ Parse the .bb files
+ """
+ command.cooker.updateCache()
+ command.finishAsyncCommand()
+ parseFiles.needcache = True
+
+ def compareRevisions(self, command, params):
+ """
+        Compare the stored and current source revisions
+ """
+ if bb.fetch.fetcher_compare_revisions(command.cooker.data):
+ command.finishAsyncCommand(code=1)
+ else:
+ command.finishAsyncCommand()
+ compareRevisions.needcache = True
+
+ def triggerEvent(self, command, params):
+ """
+ Trigger a certain event
+ """
+ event = params[0]
+ bb.event.fire(eval(event), command.cooker.data)
+ command.currentAsyncCommand = None
+ triggerEvent.needcache = False
+
+ def resetCooker(self, command, params):
+ """
+ Reset the cooker to its initial state, thus forcing a reparse for
+ any async command that has the needcache property set to True
+ """
+ command.cooker.reset()
+ command.finishAsyncCommand()
+ resetCooker.needcache = False
+
diff --git a/bitbake/lib/bb/compat.py b/bitbake/lib/bb/compat.py
new file mode 100644
index 0000000..de1923d
--- /dev/null
+++ b/bitbake/lib/bb/compat.py
@@ -0,0 +1,6 @@
+"""Code pulled from future python versions, here for compatibility"""
+
+from collections import MutableMapping, KeysView, ValuesView, ItemsView, OrderedDict
+from functools import total_ordering
+
+
diff --git a/bitbake/lib/bb/cooker.py b/bitbake/lib/bb/cooker.py
new file mode 100644
index 0000000..a0d7d59
--- /dev/null
+++ b/bitbake/lib/bb/cooker.py
@@ -0,0 +1,2139 @@
+#!/usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2003, 2004 Phil Blundell
+# Copyright (C) 2003 - 2005 Michael 'Mickey' Lauer
+# Copyright (C) 2005 Holger Hans Peter Freyther
+# Copyright (C) 2005 ROAD GmbH
+# Copyright (C) 2006 - 2007 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from __future__ import print_function
+import sys, os, glob, os.path, re, time
+import atexit
+import itertools
+import logging
+import multiprocessing
+import sre_constants
+import threading
+from cStringIO import StringIO
+from contextlib import closing
+from functools import wraps
+from collections import defaultdict
+import bb, bb.exceptions, bb.command
+from bb import utils, data, parse, event, cache, providers, taskdata, runqueue, build
+import Queue
+import signal
+import subprocess
+import errno
+import prserv.serv
+import pyinotify
+
+logger = logging.getLogger("BitBake")
+collectlog = logging.getLogger("BitBake.Collection")
+buildlog = logging.getLogger("BitBake.Build")
+parselog = logging.getLogger("BitBake.Parsing")
+providerlog = logging.getLogger("BitBake.Provider")
+
+class NoSpecificMatch(bb.BBHandledException):
+ """
+ Exception raised when no or multiple file matches are found
+ """
+
+class NothingToBuild(Exception):
+ """
+ Exception raised when there is nothing to build
+ """
+
+class CollectionError(bb.BBHandledException):
+ """
+ Exception raised when layer configuration is incorrect
+ """
+
+class state:
+ initial, parsing, running, shutdown, forceshutdown, stopped, error = range(7)
+
+
+class SkippedPackage:
+ def __init__(self, info = None, reason = None):
+ self.pn = None
+ self.skipreason = None
+ self.provides = None
+ self.rprovides = None
+
+ if info:
+ self.pn = info.pn
+ self.skipreason = info.skipreason
+ self.provides = info.provides
+ self.rprovides = info.rprovides
+ elif reason:
+ self.skipreason = reason
+
+
+class CookerFeatures(object):
+ _feature_list = [HOB_EXTRA_CACHES, SEND_DEPENDS_TREE, BASEDATASTORE_TRACKING, SEND_SANITYEVENTS] = range(4)
+
+ def __init__(self):
+ self._features=set()
+
+ def setFeature(self, f):
+ # validate we got a request for a feature we support
+ if f not in CookerFeatures._feature_list:
+ return
+ self._features.add(f)
+
+ def __contains__(self, f):
+ return f in self._features
+
+ def __iter__(self):
+ return self._features.__iter__()
+
+ def next(self):
+ return self._features.next()
+
+
+#============================================================================#
+# BBCooker
+#============================================================================#
+class BBCooker:
+ """
+ Manages one bitbake build run
+ """
+
+ def __init__(self, configuration, featureSet=None):
+ self.recipecache = None
+ self.skiplist = {}
+ self.featureset = CookerFeatures()
+ if featureSet:
+ for f in featureSet:
+ self.featureset.setFeature(f)
+
+ self.configuration = configuration
+
+ self.configwatcher = pyinotify.WatchManager()
+ self.configwatcher.bbseen = []
+ self.configwatcher.bbwatchedfiles = []
+ self.confignotifier = pyinotify.Notifier(self.configwatcher, self.config_notifications)
+ self.watchmask = pyinotify.IN_CLOSE_WRITE | pyinotify.IN_CREATE | pyinotify.IN_DELETE | \
+ pyinotify.IN_DELETE_SELF | pyinotify.IN_MODIFY | pyinotify.IN_MOVE_SELF | \
+ pyinotify.IN_MOVED_FROM | pyinotify.IN_MOVED_TO
+ self.watcher = pyinotify.WatchManager()
+ self.watcher.bbseen = []
+ self.watcher.bbwatchedfiles = []
+ self.notifier = pyinotify.Notifier(self.watcher, self.notifications)
+
+
+ self.initConfigurationData()
+
+ self.inotify_modified_files = []
+
+ def _process_inotify_updates(server, notifier_list, abort):
+ for n in notifier_list:
+ if n.check_events(timeout=0):
+                    # read notified events and enqueue them
+ n.read_events()
+ n.process_events()
+ return 1.0
+
+ self.configuration.server_register_idlecallback(_process_inotify_updates, [self.confignotifier, self.notifier])
+
+ self.baseconfig_valid = True
+ self.parsecache_valid = False
+
+ # Take a lock so only one copy of bitbake can run against a given build
+ # directory at a time
+ if not self.lockBitbake():
+ bb.fatal("Only one copy of bitbake should be run against a build directory")
+ try:
+ self.lock.seek(0)
+ self.lock.truncate()
+ if len(configuration.interface) >= 2:
+                self.lock.write("%s:%s\n" % (configuration.interface[0], configuration.interface[1]))
+ self.lock.flush()
+ except:
+ pass
+
+ # TOSTOP must not be set or our children will hang when they output
+ fd = sys.stdout.fileno()
+ if os.isatty(fd):
+ import termios
+ tcattr = termios.tcgetattr(fd)
+ if tcattr[3] & termios.TOSTOP:
+ buildlog.info("The terminal had the TOSTOP bit set, clearing...")
+ tcattr[3] = tcattr[3] & ~termios.TOSTOP
+ termios.tcsetattr(fd, termios.TCSANOW, tcattr)
+
+ self.command = bb.command.Command(self)
+ self.state = state.initial
+
+ self.parser = None
+
+ signal.signal(signal.SIGTERM, self.sigterm_exception)
+ # Let SIGHUP exit as SIGTERM
+ signal.signal(signal.SIGHUP, self.sigterm_exception)
+
+ def config_notifications(self, event):
+ if not event.pathname in self.configwatcher.bbwatchedfiles:
+ return
+ if not event.path in self.inotify_modified_files:
+ self.inotify_modified_files.append(event.path)
+ self.baseconfig_valid = False
+
+ def notifications(self, event):
+ if not event.path in self.inotify_modified_files:
+ self.inotify_modified_files.append(event.path)
+ self.parsecache_valid = False
+
+ def add_filewatch(self, deps, watcher=None):
+ if not watcher:
+ watcher = self.watcher
+ for i in deps:
+ watcher.bbwatchedfiles.append(i[0])
+ f = os.path.dirname(i[0])
+ if f in watcher.bbseen:
+ continue
+ watcher.bbseen.append(f)
+ watchtarget = None
+ while True:
+                # We try to add watches for files that don't exist but which, if they did,
+                # would influence the parser. The parent directory of these files may not
+                # exist either, in which case we need to watch any parent that does exist
+                # for changes.
+ try:
+ watcher.add_watch(f, self.watchmask, quiet=False)
+ if watchtarget:
+ watcher.bbwatchedfiles.append(watchtarget)
+ break
+ except pyinotify.WatchManagerError as e:
+ if 'ENOENT' in str(e):
+ watchtarget = f
+ f = os.path.dirname(f)
+ if f in watcher.bbseen:
+ break
+ watcher.bbseen.append(f)
+ continue
+ if 'ENOSPC' in str(e):
+ providerlog.error("No space left on device or exceeds fs.inotify.max_user_watches?")
+ providerlog.error("To check max_user_watches: sysctl -n fs.inotify.max_user_watches.")
+ providerlog.error("To modify max_user_watches: sysctl -n -w fs.inotify.max_user_watches=<value>.")
+ providerlog.error("Root privilege is required to modify max_user_watches.")
+ raise
+
+ def sigterm_exception(self, signum, stackframe):
+ if signum == signal.SIGTERM:
+            bb.warn("Cooker received SIGTERM, shutting down...")
+ elif signum == signal.SIGHUP:
+            bb.warn("Cooker received SIGHUP, shutting down...")
+ self.state = state.forceshutdown
+
+ def setFeatures(self, features):
+ # we only accept a new feature set if we're in state initial, so we can reset without problems
+ if not self.state in [state.initial, state.shutdown, state.forceshutdown, state.stopped, state.error]:
+ raise Exception("Illegal state for feature set change")
+ original_featureset = list(self.featureset)
+ for feature in features:
+ self.featureset.setFeature(feature)
+ bb.debug(1, "Features set %s (was %s)" % (original_featureset, list(self.featureset)))
+ if (original_featureset != list(self.featureset)) and self.state != state.error:
+ self.reset()
+
+ def initConfigurationData(self):
+
+ self.state = state.initial
+ self.caches_array = []
+
+ if CookerFeatures.BASEDATASTORE_TRACKING in self.featureset:
+ self.enableDataTracking()
+
+ all_extra_cache_names = []
+ # We hardcode all known cache types in a single place, here.
+ if CookerFeatures.HOB_EXTRA_CACHES in self.featureset:
+ all_extra_cache_names.append("bb.cache_extra:HobRecipeInfo")
+
+ caches_name_array = ['bb.cache:CoreRecipeInfo'] + all_extra_cache_names
+
+ # At least CoreRecipeInfo will be loaded, so caches_array will never be empty!
+ # This is the entry point, no further check needed!
+ for var in caches_name_array:
+ try:
+ module_name, cache_name = var.split(':')
+ module = __import__(module_name, fromlist=(cache_name,))
+ self.caches_array.append(getattr(module, cache_name))
+ except ImportError as exc:
+ logger.critical("Unable to import extra RecipeInfo '%s' from '%s': %s" % (cache_name, module_name, exc))
+ sys.exit("FATAL: Failed to import extra cache class '%s'." % cache_name)
+
+ self.databuilder = bb.cookerdata.CookerDataBuilder(self.configuration, False)
+ self.databuilder.parseBaseConfiguration()
+ self.data = self.databuilder.data
+ self.data_hash = self.databuilder.data_hash
+
+
+ # we log all events to a file if so directed
+ if self.configuration.writeeventlog:
+ import json, pickle
+ DEFAULT_EVENTFILE = self.configuration.writeeventlog
+ class EventLogWriteHandler():
+
+ class EventWriter():
+ def __init__(self, cooker):
+ self.file_inited = None
+ self.cooker = cooker
+ self.event_queue = []
+
+ def init_file(self):
+ try:
+ # delete the old log
+ os.remove(DEFAULT_EVENTFILE)
+ except:
+ pass
+
+ # write current configuration data
+ with open(DEFAULT_EVENTFILE, "w") as f:
+ f.write("%s\n" % json.dumps({ "allvariables" : self.cooker.getAllKeysWithFlags(["doc", "func"])}))
+
+ def write_event(self, event):
+ with open(DEFAULT_EVENTFILE, "a") as f:
+ try:
+ f.write("%s\n" % json.dumps({"class":event.__module__ + "." + event.__class__.__name__, "vars":json.dumps(pickle.dumps(event)) }))
+ except Exception as e:
+ import traceback
+ print(e, traceback.format_exc(e))
+
+
+ def send(self, event):
+ event_class = event.__module__ + "." + event.__class__.__name__
+
+ # init on bb.event.BuildStarted
+ if self.file_inited is None:
+ if event_class == "bb.event.BuildStarted":
+ self.init_file()
+ self.file_inited = True
+
+ # write pending events
+ for e in self.event_queue:
+ self.write_event(e)
+
+ # also write the current event
+ self.write_event(event)
+
+ else:
+ # queue all events until the file is inited
+ self.event_queue.append(event)
+
+ else:
+ # we have the file, just write the event
+ self.write_event(event)
+
+ # set our handler's event processor
+ event = EventWriter(self) # self is the cooker here
+
+
+ # set up cooker features for this mock UI handler
+
+ # we need to write the dependency tree in the log
+ self.featureset.setFeature(CookerFeatures.SEND_DEPENDS_TREE)
+ # register the log file writer as UI Handler
+ bb.event.register_UIHhandler(EventLogWriteHandler())
+
+
+ #
+ # Copy of the data store which has been expanded.
+ # Used for firing events and accessing variables where expansion needs to be accounted for
+ #
+ self.expanded_data = bb.data.createCopy(self.data)
+ bb.data.update_data(self.expanded_data)
+ bb.parse.init_parser(self.expanded_data)
+
+ if CookerFeatures.BASEDATASTORE_TRACKING in self.featureset:
+ self.disableDataTracking()
+
+ self.data.renameVar("__depends", "__base_depends")
+ self.add_filewatch(self.data.getVar("__base_depends", False), self.configwatcher)
+
+
+ def enableDataTracking(self):
+ self.configuration.tracking = True
+ if hasattr(self, "data"):
+ self.data.enableTracking()
+
+ def disableDataTracking(self):
+ self.configuration.tracking = False
+ if hasattr(self, "data"):
+ self.data.disableTracking()
+
+ def modifyConfigurationVar(self, var, val, default_file, op):
+ if op == "append":
+ self.appendConfigurationVar(var, val, default_file)
+ elif op == "set":
+ self.saveConfigurationVar(var, val, default_file, "=")
+ elif op == "earlyAssign":
+ self.saveConfigurationVar(var, val, default_file, "?=")
+
+
+ def appendConfigurationVar(self, var, val, default_file):
+ #add append var operation to the end of default_file
+ default_file = bb.cookerdata.findConfigFile(default_file, self.data)
+
+ total = "#added by hob"
+ total += "\n%s += \"%s\"\n" % (var, val)
+
+ with open(default_file, 'a') as f:
+ f.write(total)
+
+ #add to history
+ loginfo = {"op":"append", "file":default_file, "line":total.count("\n")}
+ self.data.appendVar(var, val, **loginfo)
+
+ def saveConfigurationVar(self, var, val, default_file, op):
+
+ replaced = False
+ #do not save if nothing changed
+ if str(val) == self.data.getVar(var, False):
+ return
+
+ conf_files = self.data.varhistory.get_variable_files(var)
+
+ #format the value when it is a list
+ if isinstance(val, list):
+ listval = ""
+ for value in val:
+ listval += "%s " % value
+ val = listval
+
+ topdir = self.data.getVar("TOPDIR", False)
+
+ #comment or replace operations made on var
+ for conf_file in conf_files:
+ if topdir in conf_file:
+ with open(conf_file, 'r') as f:
+ contents = f.readlines()
+
+ lines = self.data.varhistory.get_variable_lines(var, conf_file)
+ for line in lines:
+ total = ""
+ i = 0
+ for c in contents:
+ total += c
+ i = i + 1
+ if i==int(line):
+ end_index = len(total)
+ index = total.rfind(var, 0, end_index)
+
+ begin_line = total.count("\n",0,index)
+ end_line = int(line)
+
+                            #check if the variable was saved before in the same way;
+                            #if so, replace the line where it was declared,
+                            #otherwise comment it out
+ if contents[begin_line-1]== "#added by hob\n":
+ contents[begin_line] = "%s %s \"%s\"\n" % (var, op, val)
+ replaced = True
+ else:
+ for ii in range(begin_line, end_line):
+ contents[ii] = "#" + contents[ii]
+
+ with open(conf_file, 'w') as f:
+ f.writelines(contents)
+
+ if replaced == False:
+ #remove var from history
+ self.data.varhistory.del_var_history(var)
+
+ #add var to the end of default_file
+ default_file = bb.cookerdata.findConfigFile(default_file, self.data)
+
+ #add the variable on a single line, to be easy to replace the second time
+ total = "\n#added by hob"
+ total += "\n%s %s \"%s\"\n" % (var, op, val)
+
+ with open(default_file, 'a') as f:
+ f.write(total)
+
+ #add to history
+ loginfo = {"op":"set", "file":default_file, "line":total.count("\n")}
+ self.data.setVar(var, val, **loginfo)
+
+ def removeConfigurationVar(self, var):
+ conf_files = self.data.varhistory.get_variable_files(var)
+ topdir = self.data.getVar("TOPDIR", False)
+
+ for conf_file in conf_files:
+ if topdir in conf_file:
+ with open(conf_file, 'r') as f:
+ contents = f.readlines()
+
+ lines = self.data.varhistory.get_variable_lines(var, conf_file)
+ for line in lines:
+ total = ""
+ i = 0
+ for c in contents:
+ total += c
+ i = i + 1
+ if i==int(line):
+ end_index = len(total)
+ index = total.rfind(var, 0, end_index)
+
+ begin_line = total.count("\n",0,index)
+
+ #check if the variable was saved before in the same way
+ if contents[begin_line-1]== "#added by hob\n":
+ contents[begin_line-1] = contents[begin_line] = "\n"
+ else:
+ contents[begin_line] = "\n"
+ #remove var from history
+ self.data.varhistory.del_var_history(var, conf_file, line)
+ #remove variable
+ self.data.delVar(var)
+
+ with open(conf_file, 'w') as f:
+ f.writelines(contents)
+
+ def createConfigFile(self, name):
+ path = os.getcwd()
+ confpath = os.path.join(path, "conf", name)
+ open(confpath, 'w').close()
+
+ def parseConfiguration(self):
+ # Set log file verbosity
+ verboselogs = bb.utils.to_boolean(self.data.getVar("BB_VERBOSE_LOGS", False))
+ if verboselogs:
+ bb.msg.loggerVerboseLogs = True
+
+ # Change nice level if we're asked to
+ nice = self.data.getVar("BB_NICE_LEVEL", True)
+ if nice:
+ curnice = os.nice(0)
+ nice = int(nice) - curnice
+ buildlog.verbose("Renice to %s " % os.nice(nice))
+
+ if self.recipecache:
+ del self.recipecache
+ self.recipecache = bb.cache.CacheData(self.caches_array)
+
+ self.handleCollections( self.data.getVar("BBFILE_COLLECTIONS", True) )
+
+ def updateConfigOpts(self, options, environment):
+ clean = True
+ for o in options:
+ if o in ['prefile', 'postfile']:
+ clean = False
+ setattr(self.configuration, o, options[o])
+ for k in bb.utils.approved_variables():
+ if k in environment and k not in self.configuration.env:
+ logger.debug(1, "Updating environment variable %s to %s" % (k, environment[k]))
+ self.configuration.env[k] = environment[k]
+ clean = False
+ if k in self.configuration.env and k not in environment:
+ logger.debug(1, "Updating environment variable %s (deleted)" % (k))
+ del self.configuration.env[k]
+ clean = False
+ if k not in self.configuration.env and k not in environment:
+ continue
+ if environment[k] != self.configuration.env[k]:
+ logger.debug(1, "Updating environment variable %s to %s" % (k, environment[k]))
+ self.configuration.env[k] = environment[k]
+ clean = False
+ if not clean:
+ logger.debug(1, "Base environment change, triggering reparse")
+ self.baseconfig_valid = False
+ self.reset()
+
+ def runCommands(self, server, data, abort):
+ """
+ Run any queued asynchronous command
+ This is done by the idle handler so it runs in true context rather than
+ tied to any UI.
+ """
+
+ return self.command.runAsyncCommand()
+
+ def showVersions(self):
+
+ pkg_pn = self.recipecache.pkg_pn
+ (latest_versions, preferred_versions) = bb.providers.findProviders(self.data, self.recipecache, pkg_pn)
+
+ logger.plain("%-35s %25s %25s", "Recipe Name", "Latest Version", "Preferred Version")
+ logger.plain("%-35s %25s %25s\n", "===========", "==============", "=================")
+
+ for p in sorted(pkg_pn):
+ pref = preferred_versions[p]
+ latest = latest_versions[p]
+
+ prefstr = pref[0][0] + ":" + pref[0][1] + '-' + pref[0][2]
+ lateststr = latest[0][0] + ":" + latest[0][1] + "-" + latest[0][2]
+
+ if pref == latest:
+ prefstr = ""
+
+ logger.plain("%-35s %25s %25s", p, lateststr, prefstr)
+
+ def showEnvironment(self, buildfile=None, pkgs_to_build=None):
+ """
+ Show the outer or per-recipe environment
+ """
+ fn = None
+ envdata = None
+ if not pkgs_to_build:
+ pkgs_to_build = []
+
+ if buildfile:
+ # Parse the configuration here. We need to do it explicitly here since
+ # this showEnvironment() code path doesn't use the cache
+ self.parseConfiguration()
+
+ fn, cls = bb.cache.Cache.virtualfn2realfn(buildfile)
+ fn = self.matchFile(fn)
+ fn = bb.cache.Cache.realfn2virtual(fn, cls)
+ elif len(pkgs_to_build) == 1:
+ ignore = self.expanded_data.getVar("ASSUME_PROVIDED", True) or ""
+ if pkgs_to_build[0] in set(ignore.split()):
+ bb.fatal("%s is in ASSUME_PROVIDED" % pkgs_to_build[0])
+
+ taskdata, runlist, pkgs_to_build = self.buildTaskData(pkgs_to_build, None, self.configuration.abort, allowincomplete=True)
+
+ targetid = taskdata.getbuild_id(pkgs_to_build[0])
+ fnid = taskdata.build_targets[targetid][0]
+ fn = taskdata.fn_index[fnid]
+ else:
+ envdata = self.data
+
+ if fn:
+ try:
+ envdata = bb.cache.Cache.loadDataFull(fn, self.collection.get_file_appends(fn), self.data)
+ except Exception as e:
+ parselog.exception("Unable to read %s", fn)
+ raise
+
+ # Display history
+ with closing(StringIO()) as env:
+ self.data.inchistory.emit(env)
+ logger.plain(env.getvalue())
+
+ # emit variables and shell functions
+ data.update_data(envdata)
+ with closing(StringIO()) as env:
+ data.emit_env(env, envdata, True)
+ logger.plain(env.getvalue())
+
+        # emit the metadata which isn't valid shell
+ data.expandKeys(envdata)
+ for e in envdata.keys():
+ if data.getVarFlag( e, 'python', envdata ):
+ logger.plain("\npython %s () {\n%s}\n", e, envdata.getVar(e, True))
+
+
+ def buildTaskData(self, pkgs_to_build, task, abort, allowincomplete=False):
+ """
+ Prepare a runqueue and taskdata object for iteration over pkgs_to_build
+ """
+ bb.event.fire(bb.event.TreeDataPreparationStarted(), self.data)
+
+ # A task of None means use the default task
+ if task is None:
+ task = self.configuration.cmd
+
+ fulltargetlist = self.checkPackages(pkgs_to_build)
+
+ localdata = data.createCopy(self.data)
+ bb.data.update_data(localdata)
+ bb.data.expandKeys(localdata)
+ taskdata = bb.taskdata.TaskData(abort, skiplist=self.skiplist, allowincomplete=allowincomplete)
+
+ current = 0
+ runlist = []
+ for k in fulltargetlist:
+ ktask = task
+ if ":do_" in k:
+ k2 = k.split(":do_")
+ k = k2[0]
+ ktask = k2[1]
+ taskdata.add_provider(localdata, self.recipecache, k)
+ current += 1
+ if not ktask.startswith("do_"):
+ ktask = "do_%s" % ktask
+ runlist.append([k, ktask])
+ bb.event.fire(bb.event.TreeDataPreparationProgress(current, len(fulltargetlist)), self.data)
+ taskdata.add_unresolved(localdata, self.recipecache)
+ bb.event.fire(bb.event.TreeDataPreparationCompleted(len(fulltargetlist)), self.data)
+ return taskdata, runlist, fulltargetlist
+
+ def prepareTreeData(self, pkgs_to_build, task):
+ """
+ Prepare a runqueue and taskdata object for iteration over pkgs_to_build
+ """
+
+ # We set abort to False here to prevent unbuildable targets raising
+ # an exception when we're just generating data
+ taskdata, runlist, pkgs_to_build = self.buildTaskData(pkgs_to_build, task, False, allowincomplete=True)
+
+ return runlist, taskdata
+
+ ######## WARNING : this function requires cache_extra to be enabled ########
+
+ def generateTaskDepTreeData(self, pkgs_to_build, task):
+ """
+ Create a dependency graph of pkgs_to_build including reverse dependency
+ information.
+ """
+ runlist, taskdata = self.prepareTreeData(pkgs_to_build, task)
+ rq = bb.runqueue.RunQueue(self, self.data, self.recipecache, taskdata, runlist)
+ rq.rqdata.prepare()
+ return self.buildDependTree(rq, taskdata)
+
+
+ def buildDependTree(self, rq, taskdata):
+ seen_fnids = []
+ depend_tree = {}
+ depend_tree["depends"] = {}
+ depend_tree["tdepends"] = {}
+ depend_tree["pn"] = {}
+ depend_tree["rdepends-pn"] = {}
+ depend_tree["packages"] = {}
+ depend_tree["rdepends-pkg"] = {}
+ depend_tree["rrecs-pkg"] = {}
+ depend_tree["layer-priorities"] = self.recipecache.bbfile_config_priorities
+
+ for task in xrange(len(rq.rqdata.runq_fnid)):
+ taskname = rq.rqdata.runq_task[task]
+ fnid = rq.rqdata.runq_fnid[task]
+ fn = taskdata.fn_index[fnid]
+ pn = self.recipecache.pkg_fn[fn]
+ version = "%s:%s-%s" % self.recipecache.pkg_pepvpr[fn]
+ if pn not in depend_tree["pn"]:
+ depend_tree["pn"][pn] = {}
+ depend_tree["pn"][pn]["filename"] = fn
+ depend_tree["pn"][pn]["version"] = version
+ depend_tree["pn"][pn]["inherits"] = self.recipecache.inherits.get(fn, None)
+
+ # if we have extra caches, list all attributes they bring in
+ extra_info = []
+ for cache_class in self.caches_array:
+ if type(cache_class) is type and issubclass(cache_class, bb.cache.RecipeInfoCommon) and hasattr(cache_class, 'cachefields'):
+ cachefields = getattr(cache_class, 'cachefields', [])
+ extra_info = extra_info + cachefields
+
+ # for all attributes stored, add them to the dependency tree
+ for ei in extra_info:
+ depend_tree["pn"][pn][ei] = vars(self.recipecache)[ei][fn]
+
+
+ for dep in rq.rqdata.runq_depends[task]:
+ depfn = taskdata.fn_index[rq.rqdata.runq_fnid[dep]]
+ deppn = self.recipecache.pkg_fn[depfn]
+ dotname = "%s.%s" % (pn, rq.rqdata.runq_task[task])
+ if not dotname in depend_tree["tdepends"]:
+ depend_tree["tdepends"][dotname] = []
+ depend_tree["tdepends"][dotname].append("%s.%s" % (deppn, rq.rqdata.runq_task[dep]))
+ if fnid not in seen_fnids:
+ seen_fnids.append(fnid)
+ packages = []
+
+ depend_tree["depends"][pn] = []
+ for dep in taskdata.depids[fnid]:
+ depend_tree["depends"][pn].append(taskdata.build_names_index[dep])
+
+ depend_tree["rdepends-pn"][pn] = []
+ for rdep in taskdata.rdepids[fnid]:
+ depend_tree["rdepends-pn"][pn].append(taskdata.run_names_index[rdep])
+
+ rdepends = self.recipecache.rundeps[fn]
+ for package in rdepends:
+ depend_tree["rdepends-pkg"][package] = []
+ for rdepend in rdepends[package]:
+ depend_tree["rdepends-pkg"][package].append(rdepend)
+ packages.append(package)
+
+ rrecs = self.recipecache.runrecs[fn]
+ for package in rrecs:
+ depend_tree["rrecs-pkg"][package] = []
+ for rdepend in rrecs[package]:
+ depend_tree["rrecs-pkg"][package].append(rdepend)
+ if not package in packages:
+ packages.append(package)
+
+ for package in packages:
+ if package not in depend_tree["packages"]:
+ depend_tree["packages"][package] = {}
+ depend_tree["packages"][package]["pn"] = pn
+ depend_tree["packages"][package]["filename"] = fn
+ depend_tree["packages"][package]["version"] = version
+
+ return depend_tree
+
+ ######## WARNING : this function requires cache_extra to be enabled ########
+ def generatePkgDepTreeData(self, pkgs_to_build, task):
+ """
+ Create a dependency tree of pkgs_to_build, returning the data.
+ """
+ _, taskdata = self.prepareTreeData(pkgs_to_build, task)
+ tasks_fnid = []
+ if len(taskdata.tasks_name) != 0:
+ for task in xrange(len(taskdata.tasks_name)):
+ tasks_fnid.append(taskdata.tasks_fnid[task])
+
+ seen_fnids = []
+ depend_tree = {}
+ depend_tree["depends"] = {}
+ depend_tree["pn"] = {}
+ depend_tree["rdepends-pn"] = {}
+ depend_tree["rdepends-pkg"] = {}
+ depend_tree["rrecs-pkg"] = {}
+
+ # if we have extra caches, list all attributes they bring in
+ extra_info = []
+ for cache_class in self.caches_array:
+ if type(cache_class) is type and issubclass(cache_class, bb.cache.RecipeInfoCommon) and hasattr(cache_class, 'cachefields'):
+ cachefields = getattr(cache_class, 'cachefields', [])
+ extra_info = extra_info + cachefields
+
+ for task in xrange(len(tasks_fnid)):
+ fnid = tasks_fnid[task]
+ fn = taskdata.fn_index[fnid]
+ pn = self.recipecache.pkg_fn[fn]
+
+ if pn not in depend_tree["pn"]:
+ depend_tree["pn"][pn] = {}
+ depend_tree["pn"][pn]["filename"] = fn
+ version = "%s:%s-%s" % self.recipecache.pkg_pepvpr[fn]
+ depend_tree["pn"][pn]["version"] = version
+ rdepends = self.recipecache.rundeps[fn]
+ rrecs = self.recipecache.runrecs[fn]
+ depend_tree["pn"][pn]["inherits"] = self.recipecache.inherits.get(fn, None)
+
+ # for all extra attributes stored, add them to the dependency tree
+ for ei in extra_info:
+ depend_tree["pn"][pn][ei] = vars(self.recipecache)[ei][fn]
+
+ if fnid not in seen_fnids:
+ seen_fnids.append(fnid)
+
+ depend_tree["depends"][pn] = []
+ for dep in taskdata.depids[fnid]:
+ item = taskdata.build_names_index[dep]
+ pn_provider = ""
+ targetid = taskdata.getbuild_id(item)
+ if targetid in taskdata.build_targets and taskdata.build_targets[targetid]:
+ id = taskdata.build_targets[targetid][0]
+ fn_provider = taskdata.fn_index[id]
+ pn_provider = self.recipecache.pkg_fn[fn_provider]
+ else:
+ pn_provider = item
+ depend_tree["depends"][pn].append(pn_provider)
+
+ depend_tree["rdepends-pn"][pn] = []
+ for rdep in taskdata.rdepids[fnid]:
+ item = taskdata.run_names_index[rdep]
+ pn_rprovider = ""
+ targetid = taskdata.getrun_id(item)
+ if targetid in taskdata.run_targets and taskdata.run_targets[targetid]:
+ id = taskdata.run_targets[targetid][0]
+ fn_rprovider = taskdata.fn_index[id]
+ pn_rprovider = self.recipecache.pkg_fn[fn_rprovider]
+ else:
+ pn_rprovider = item
+ depend_tree["rdepends-pn"][pn].append(pn_rprovider)
+
+ depend_tree["rdepends-pkg"].update(rdepends)
+ depend_tree["rrecs-pkg"].update(rrecs)
+
+ return depend_tree
+
+ def generateDepTreeEvent(self, pkgs_to_build, task):
+ """
+ Create a task dependency graph of pkgs_to_build.
+ Generate an event with the result
+ """
+ depgraph = self.generateTaskDepTreeData(pkgs_to_build, task)
+ bb.event.fire(bb.event.DepTreeGenerated(depgraph), self.data)
+
+ def generateDotGraphFiles(self, pkgs_to_build, task):
+ """
+ Create a task dependency graph of pkgs_to_build.
+ Save the result to a set of .dot files.
+ """
+
+ depgraph = self.generateTaskDepTreeData(pkgs_to_build, task)
+
+        # Prints a flattened form of package-depends below, where subpackages of a package are merged into the main pn
+ depends_file = file('pn-depends.dot', 'w' )
+ buildlist_file = file('pn-buildlist', 'w' )
+ print("digraph depends {", file=depends_file)
+ for pn in depgraph["pn"]:
+ fn = depgraph["pn"][pn]["filename"]
+ version = depgraph["pn"][pn]["version"]
+ print('"%s" [label="%s %s\\n%s"]' % (pn, pn, version, fn), file=depends_file)
+ print("%s" % pn, file=buildlist_file)
+ buildlist_file.close()
+ logger.info("PN build list saved to 'pn-buildlist'")
+ for pn in depgraph["depends"]:
+ for depend in depgraph["depends"][pn]:
+ print('"%s" -> "%s"' % (pn, depend), file=depends_file)
+ for pn in depgraph["rdepends-pn"]:
+ for rdepend in depgraph["rdepends-pn"][pn]:
+ print('"%s" -> "%s" [style=dashed]' % (pn, rdepend), file=depends_file)
+ print("}", file=depends_file)
+ logger.info("PN dependencies saved to 'pn-depends.dot'")
+
+ depends_file = file('package-depends.dot', 'w' )
+ print("digraph depends {", file=depends_file)
+ for package in depgraph["packages"]:
+ pn = depgraph["packages"][package]["pn"]
+ fn = depgraph["packages"][package]["filename"]
+ version = depgraph["packages"][package]["version"]
+ if package == pn:
+ print('"%s" [label="%s %s\\n%s"]' % (pn, pn, version, fn), file=depends_file)
+ else:
+ print('"%s" [label="%s(%s) %s\\n%s"]' % (package, package, pn, version, fn), file=depends_file)
+ for depend in depgraph["depends"][pn]:
+ print('"%s" -> "%s"' % (package, depend), file=depends_file)
+ for package in depgraph["rdepends-pkg"]:
+ for rdepend in depgraph["rdepends-pkg"][package]:
+ print('"%s" -> "%s" [style=dashed]' % (package, rdepend), file=depends_file)
+ for package in depgraph["rrecs-pkg"]:
+ for rdepend in depgraph["rrecs-pkg"][package]:
+ print('"%s" -> "%s" [style=dashed]' % (package, rdepend), file=depends_file)
+ print("}", file=depends_file)
+ logger.info("Package dependencies saved to 'package-depends.dot'")
+
+ tdepends_file = file('task-depends.dot', 'w' )
+ print("digraph depends {", file=tdepends_file)
+ for task in depgraph["tdepends"]:
+ (pn, taskname) = task.rsplit(".", 1)
+ fn = depgraph["pn"][pn]["filename"]
+ version = depgraph["pn"][pn]["version"]
+ print('"%s.%s" [label="%s %s\\n%s\\n%s"]' % (pn, taskname, pn, taskname, version, fn), file=tdepends_file)
+ for dep in depgraph["tdepends"][task]:
+ print('"%s" -> "%s"' % (task, dep), file=tdepends_file)
+ print("}", file=tdepends_file)
+ logger.info("Task dependencies saved to 'task-depends.dot'")
+
+ def show_appends_with_no_recipes(self):
+ # Determine which bbappends haven't been applied
+
+ # First get list of recipes, including skipped
+ recipefns = self.recipecache.pkg_fn.keys()
+ recipefns.extend(self.skiplist.keys())
+
+ # Work out list of bbappends that have been applied
+ applied_appends = []
+ for fn in recipefns:
+ applied_appends.extend(self.collection.get_file_appends(fn))
+
+ appends_without_recipes = []
+ for _, appendfn in self.collection.bbappends:
+ if not appendfn in applied_appends:
+ appends_without_recipes.append(appendfn)
+
+ if appends_without_recipes:
+ msg = 'No recipes available for:\n %s' % '\n '.join(appends_without_recipes)
+ warn_only = self.data.getVar("BB_DANGLINGAPPENDS_WARNONLY", \
+ False) or "no"
+ if warn_only.lower() in ("1", "yes", "true"):
+ bb.warn(msg)
+ else:
+ bb.fatal(msg)
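+        # For example: with a dangling append such as the hypothetical
+        # meta-foo/recipes-bar/busybox_%.bbappend and no matching recipe, setting
+        # BB_DANGLINGAPPENDS_WARNONLY = "1" (or "yes"/"true") in the configuration
+        # turns the fatal error above into a warning.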
+
+ def handlePrefProviders(self):
+
+ localdata = data.createCopy(self.data)
+ bb.data.update_data(localdata)
+ bb.data.expandKeys(localdata)
+
+ # Handle PREFERRED_PROVIDERS
+ for p in (localdata.getVar('PREFERRED_PROVIDERS', True) or "").split():
+ try:
+ (providee, provider) = p.split(':')
+ except:
+ providerlog.critical("Malformed option in PREFERRED_PROVIDERS variable: %s" % p)
+ continue
+ if providee in self.recipecache.preferred and self.recipecache.preferred[providee] != provider:
+ providerlog.error("conflicting preferences for %s: both %s and %s specified", providee, provider, self.recipecache.preferred[providee])
+ self.recipecache.preferred[providee] = provider
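+        # PREFERRED_PROVIDERS is parsed as space-separated "providee:provider"
+        # pairs, e.g. (hypothetical values)
+        #   PREFERRED_PROVIDERS = "virtual/kernel:linux-yocto virtual/libc:glibc"
+        # records linux-yocto as the preferred provider of virtual/kernel.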
+
+ def findCoreBaseFiles(self, subdir, configfile):
+ corebase = self.data.getVar('COREBASE', True) or ""
+ paths = []
+ for root, dirs, files in os.walk(corebase + '/' + subdir):
+ for d in dirs:
+ configfilepath = os.path.join(root, d, configfile)
+ if os.path.exists(configfilepath):
+ paths.append(os.path.join(root, d))
+
+ if paths:
+ bb.event.fire(bb.event.CoreBaseFilesFound(paths), self.data)
+
+ def findConfigFilePath(self, configfile):
+ """
+        Find the location on disk of configfile and, if it exists and was parsed by
+        BitBake, emit the ConfigFilePathFound event with the path to the file.
+ """
+ path = bb.cookerdata.findConfigFile(configfile, self.data)
+ if not path:
+ return
+
+ # Generate a list of parsed configuration files by searching the files
+ # listed in the __depends and __base_depends variables with a .conf suffix.
+ conffiles = []
+ dep_files = self.data.getVar('__base_depends', False) or []
+ dep_files = dep_files + (self.data.getVar('__depends', False) or [])
+
+ for f in dep_files:
+ if f[0].endswith(".conf"):
+ conffiles.append(f[0])
+
+ _, conf, conffile = path.rpartition("conf/")
+ match = os.path.join(conf, conffile)
+ # Try and find matches for conf/conffilename.conf as we don't always
+ # have the full path to the file.
+ for cfg in conffiles:
+ if cfg.endswith(match):
+ bb.event.fire(bb.event.ConfigFilePathFound(path),
+ self.data)
+ break
+
+ def findFilesMatchingInDir(self, filepattern, directory):
+ """
+        Searches for files matching the pattern 'filepattern' which are children of
+        'directory' in each BBPATH, e.g. to find all rootfs package classes available
+ to BitBake one could call findFilesMatchingInDir(self, 'rootfs_', 'classes')
+ or to find all machine configuration files one could call:
+ findFilesMatchingInDir(self, 'conf/machines', 'conf')
+ """
+
+ matches = []
+ p = re.compile(re.escape(filepattern))
+ bbpaths = self.data.getVar('BBPATH', True).split(':')
+ for path in bbpaths:
+ dirpath = os.path.join(path, directory)
+ if os.path.exists(dirpath):
+ for root, dirs, files in os.walk(dirpath):
+ for f in files:
+ if p.search(f):
+ matches.append(f)
+
+ if matches:
+ bb.event.fire(bb.event.FilesMatchingFound(filepattern, matches), self.data)
+
+ def findConfigFiles(self, varname):
+ """
+ Find config files which are appropriate values for varname.
+ i.e. MACHINE, DISTRO
+ """
+ possible = []
+ var = varname.lower()
+
+ data = self.data
+ # iterate configs
+ bbpaths = data.getVar('BBPATH', True).split(':')
+ for path in bbpaths:
+ confpath = os.path.join(path, "conf", var)
+ if os.path.exists(confpath):
+ for root, dirs, files in os.walk(confpath):
+ # get all child files, these are appropriate values
+ for f in files:
+ val, sep, end = f.rpartition('.')
+ if end == 'conf':
+ possible.append(val)
+
+ if possible:
+ bb.event.fire(bb.event.ConfigFilesFound(var, possible), self.data)
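+        # Example: findConfigFiles("MACHINE") walks <BBPATH entry>/conf/machine/
+        # and collects the basename of every *.conf file found there, so a
+        # hypothetical conf/machine/qemux86.conf would be reported as "qemux86".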
+
+ def findInheritsClass(self, klass):
+ """
+ Find all recipes which inherit the specified class
+ """
+ pkg_list = []
+
+ for pfn in self.recipecache.pkg_fn:
+ inherits = self.recipecache.inherits.get(pfn, None)
+ if inherits and inherits.count(klass) > 0:
+ pkg_list.append(self.recipecache.pkg_fn[pfn])
+
+ return pkg_list
+
+ def generateTargetsTree(self, klass=None, pkgs=None):
+ """
+ Generate a dependency tree of buildable targets
+ Generate an event with the result
+ """
+ # if the caller hasn't specified a pkgs list default to universe
+ if not pkgs:
+ pkgs = ['universe']
+        # if klass is passed, ensure all recipes which inherit the
+        # specified class are included in pkgs
+ if klass:
+ extra_pkgs = self.findInheritsClass(klass)
+ pkgs = pkgs + extra_pkgs
+
+ # generate a dependency tree for all our packages
+ tree = self.generatePkgDepTreeData(pkgs, 'build')
+ bb.event.fire(bb.event.TargetsTreeGenerated(tree), self.data)
+
+ def buildWorldTargetList(self):
+ """
+ Build package list for "bitbake world"
+ """
+ parselog.debug(1, "collating packages for \"world\"")
+ for f in self.recipecache.possible_world:
+ terminal = True
+ pn = self.recipecache.pkg_fn[f]
+
+ for p in self.recipecache.pn_provides[pn]:
+ if p.startswith('virtual/'):
+ parselog.debug(2, "World build skipping %s due to %s provider starting with virtual/", f, p)
+ terminal = False
+ break
+ for pf in self.recipecache.providers[p]:
+ if self.recipecache.pkg_fn[pf] != pn:
+ parselog.debug(2, "World build skipping %s due to both us and %s providing %s", f, pf, p)
+ terminal = False
+ break
+ if terminal:
+ self.recipecache.world_target.add(pn)
+
+ def interactiveMode( self ):
+ """Drop off into a shell"""
+ try:
+ from bb import shell
+ except ImportError:
+ parselog.exception("Interactive mode not available")
+ sys.exit(1)
+ else:
+ shell.start( self )
+
+
+ def handleCollections( self, collections ):
+ """Handle collections"""
+ errors = False
+ self.recipecache.bbfile_config_priorities = []
+ if collections:
+ collection_priorities = {}
+ collection_depends = {}
+ collection_list = collections.split()
+ min_prio = 0
+ for c in collection_list:
+ # Get collection priority if defined explicitly
+ priority = self.data.getVar("BBFILE_PRIORITY_%s" % c, True)
+ if priority:
+ try:
+ prio = int(priority)
+ except ValueError:
+ parselog.error("invalid value for BBFILE_PRIORITY_%s: \"%s\"", c, priority)
+ errors = True
+ if min_prio == 0 or prio < min_prio:
+ min_prio = prio
+ collection_priorities[c] = prio
+ else:
+ collection_priorities[c] = None
+
+ # Check dependencies and store information for priority calculation
+ deps = self.data.getVar("LAYERDEPENDS_%s" % c, True)
+ if deps:
+ try:
+ deplist = bb.utils.explode_dep_versions2(deps)
+ except bb.utils.VersionStringException as vse:
+ bb.fatal('Error parsing LAYERDEPENDS_%s: %s' % (c, str(vse)))
+ for dep, oplist in deplist.iteritems():
+ if dep in collection_list:
+ for opstr in oplist:
+ layerver = self.data.getVar("LAYERVERSION_%s" % dep, True)
+ (op, depver) = opstr.split()
+ if layerver:
+ try:
+ res = bb.utils.vercmp_string_op(layerver, depver, op)
+ except bb.utils.VersionStringException as vse:
+ bb.fatal('Error parsing LAYERDEPENDS_%s: %s' % (c, str(vse)))
+ if not res:
+ parselog.error("Layer '%s' depends on version %s of layer '%s', but version %s is currently enabled in your configuration. Check that you are using the correct matching versions/branches of these two layers.", c, opstr, dep, layerver)
+ errors = True
+ else:
+ parselog.error("Layer '%s' depends on version %s of layer '%s', which exists in your configuration but does not specify a version. Check that you are using the correct matching versions/branches of these two layers.", c, opstr, dep)
+ errors = True
+ else:
+ parselog.error("Layer '%s' depends on layer '%s', but this layer is not enabled in your configuration", c, dep)
+ errors = True
+ collection_depends[c] = deplist.keys()
+ else:
+ collection_depends[c] = []
+
+ # Recursively work out collection priorities based on dependencies
+ def calc_layer_priority(collection):
+ if not collection_priorities[collection]:
+ max_depprio = min_prio
+ for dep in collection_depends[collection]:
+ calc_layer_priority(dep)
+ depprio = collection_priorities[dep]
+ if depprio > max_depprio:
+ max_depprio = depprio
+ max_depprio += 1
+ parselog.debug(1, "Calculated priority of layer %s as %d", collection, max_depprio)
+ collection_priorities[collection] = max_depprio
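+            # Example: a layer with no explicit BBFILE_PRIORITY is given
+            # max(priorities of its LAYERDEPENDS, lowest explicit priority) + 1, so a
+            # hypothetical layer whose only dependency has priority 5 typically ends
+            # up with priority 6.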
+
+ # Calculate all layer priorities using calc_layer_priority and store in bbfile_config_priorities
+ for c in collection_list:
+ calc_layer_priority(c)
+ regex = self.data.getVar("BBFILE_PATTERN_%s" % c, True)
+                if regex is None:
+ parselog.error("BBFILE_PATTERN_%s not defined" % c)
+ errors = True
+ continue
+ try:
+ cre = re.compile(regex)
+ except re.error:
+ parselog.error("BBFILE_PATTERN_%s \"%s\" is not a valid regular expression", c, regex)
+ errors = True
+ continue
+ self.recipecache.bbfile_config_priorities.append((c, regex, cre, collection_priorities[c]))
+ if errors:
+ # We've already printed the actual error(s)
+ raise CollectionError("Errors during parsing layer configuration")
+
+ def buildSetVars(self):
+ """
+ Setup any variables needed before starting a build
+ """
+ t = time.gmtime()
+ if not self.data.getVar("BUILDNAME", False):
+ self.data.setVar("BUILDNAME", "${DATE}${TIME}")
+ self.data.setVar("BUILDSTART", time.strftime('%m/%d/%Y %H:%M:%S', t))
+ self.data.setVar("DATE", time.strftime('%Y%m%d', t))
+ self.data.setVar("TIME", time.strftime('%H%M%S', t))
+
+ def matchFiles(self, bf):
+ """
+ Find the .bb files which match the expression in 'buildfile'.
+ """
+ if bf.startswith("/") or bf.startswith("../"):
+ bf = os.path.abspath(bf)
+
+ self.collection = CookerCollectFiles(self.recipecache.bbfile_config_priorities)
+ filelist, masked = self.collection.collect_bbfiles(self.data, self.expanded_data)
+ try:
+ os.stat(bf)
+ bf = os.path.abspath(bf)
+ return [bf]
+ except OSError:
+ regexp = re.compile(bf)
+ matches = []
+ for f in filelist:
+ if regexp.search(f) and os.path.isfile(f):
+ matches.append(f)
+ return matches
+
+ def matchFile(self, buildfile):
+ """
+ Find the .bb file which matches the expression in 'buildfile'.
+        Raise an error if multiple files match.
+ """
+ matches = self.matchFiles(buildfile)
+ if len(matches) != 1:
+ if matches:
+ msg = "Unable to match '%s' to a specific recipe file - %s matches found:" % (buildfile, len(matches))
+                for f in matches:
+                    msg += "\n    %s" % f
+ parselog.error(msg)
+ else:
+ parselog.error("Unable to find any recipe file matching '%s'" % buildfile)
+ raise NoSpecificMatch
+ return matches[0]
+
+ def buildFile(self, buildfile, task):
+ """
+ Build the file matching regexp buildfile
+ """
+
+ # Too many people use -b because they think it's how you normally
+ # specify a target to be built, so show a warning
+ bb.warn("Buildfile specified, dependencies will not be handled. If this is not what you want, do not use -b / --buildfile.")
+
+ # Parse the configuration here. We need to do it explicitly here since
+ # buildFile() doesn't use the cache
+ self.parseConfiguration()
+
+ # If we are told to do the None task then query the default task
+        if task is None:
+ task = self.configuration.cmd
+
+ fn, cls = bb.cache.Cache.virtualfn2realfn(buildfile)
+ fn = self.matchFile(fn)
+
+ self.buildSetVars()
+
+ infos = bb.cache.Cache.parse(fn, self.collection.get_file_appends(fn), \
+ self.data,
+ self.caches_array)
+ infos = dict(infos)
+
+ fn = bb.cache.Cache.realfn2virtual(fn, cls)
+ try:
+ info_array = infos[fn]
+ except KeyError:
+ bb.fatal("%s does not exist" % fn)
+
+ if info_array[0].skipped:
+ bb.fatal("%s was skipped: %s" % (fn, info_array[0].skipreason))
+
+ self.recipecache.add_from_recipeinfo(fn, info_array)
+
+ # Tweak some variables
+ item = info_array[0].pn
+ self.recipecache.ignored_dependencies = set()
+ self.recipecache.bbfile_priority[fn] = 1
+
+ # Remove external dependencies
+ self.recipecache.task_deps[fn]['depends'] = {}
+ self.recipecache.deps[fn] = []
+ self.recipecache.rundeps[fn] = []
+ self.recipecache.runrecs[fn] = []
+
+ # Invalidate task for target if force mode active
+ if self.configuration.force:
+ logger.verbose("Invalidate task %s, %s", task, fn)
+ if not task.startswith("do_"):
+ task = "do_%s" % task
+ bb.parse.siggen.invalidate_task(task, self.recipecache, fn)
+
+ # Setup taskdata structure
+ taskdata = bb.taskdata.TaskData(self.configuration.abort)
+ taskdata.add_provider(self.data, self.recipecache, item)
+
+ buildname = self.data.getVar("BUILDNAME", True)
+ bb.event.fire(bb.event.BuildStarted(buildname, [item]), self.expanded_data)
+
+ # Execute the runqueue
+ if not task.startswith("do_"):
+ task = "do_%s" % task
+ runlist = [[item, task]]
+
+ rq = bb.runqueue.RunQueue(self, self.data, self.recipecache, taskdata, runlist)
+
+ def buildFileIdle(server, rq, abort):
+
+ msg = None
+ interrupted = 0
+ if abort or self.state == state.forceshutdown:
+ rq.finish_runqueue(True)
+ msg = "Forced shutdown"
+ interrupted = 2
+ elif self.state == state.shutdown:
+ rq.finish_runqueue(False)
+ msg = "Stopped build"
+ interrupted = 1
+ failures = 0
+ try:
+ retval = rq.execute_runqueue()
+ except runqueue.TaskFailure as exc:
+ failures += len(exc.args)
+ retval = False
+ except SystemExit as exc:
+ self.command.finishAsyncCommand()
+ return False
+
+ if not retval:
+ bb.event.fire(bb.event.BuildCompleted(len(rq.rqdata.runq_fnid), buildname, item, failures, interrupted), self.expanded_data)
+ self.command.finishAsyncCommand(msg)
+ return False
+ if retval is True:
+ return True
+ return retval
+
+ self.configuration.server_register_idlecallback(buildFileIdle, rq)
+
+ def buildTargets(self, targets, task):
+ """
+ Attempt to build the targets specified
+ """
+
+ def buildTargetsIdle(server, rq, abort):
+ msg = None
+ interrupted = 0
+ if abort or self.state == state.forceshutdown:
+ rq.finish_runqueue(True)
+ msg = "Forced shutdown"
+ interrupted = 2
+ elif self.state == state.shutdown:
+ rq.finish_runqueue(False)
+ msg = "Stopped build"
+ interrupted = 1
+ failures = 0
+ try:
+ retval = rq.execute_runqueue()
+ except runqueue.TaskFailure as exc:
+ failures += len(exc.args)
+ retval = False
+ except SystemExit as exc:
+ self.command.finishAsyncCommand()
+ return False
+
+ if not retval:
+ bb.event.fire(bb.event.BuildCompleted(len(rq.rqdata.runq_fnid), buildname, targets, failures, interrupted), self.data)
+ self.command.finishAsyncCommand(msg)
+ return False
+ if retval is True:
+ return True
+ return retval
+
+ build.reset_cache()
+ self.buildSetVars()
+
+ taskdata, runlist, fulltargetlist = self.buildTaskData(targets, task, self.configuration.abort)
+
+ buildname = self.data.getVar("BUILDNAME", False)
+ bb.event.fire(bb.event.BuildStarted(buildname, fulltargetlist), self.data)
+
+ rq = bb.runqueue.RunQueue(self, self.data, self.recipecache, taskdata, runlist)
+ if 'universe' in targets:
+ rq.rqdata.warn_multi_bb = True
+
+ self.configuration.server_register_idlecallback(buildTargetsIdle, rq)
+
+
+ def getAllKeysWithFlags(self, flaglist):
+ dump = {}
+ for k in self.data.keys():
+ try:
+ v = self.data.getVar(k, True)
+ if not k.startswith("__") and not isinstance(v, bb.data_smart.DataSmart):
+ dump[k] = {
+ 'v' : v ,
+ 'history' : self.data.varhistory.variable(k),
+ }
+ for d in flaglist:
+ dump[k][d] = self.data.getVarFlag(k, d)
+ except Exception as e:
+ print(e)
+ return dump
+
+
+ def generateNewImage(self, image, base_image, package_queue, timestamp, description):
+ '''
+ Create a new image with a "require"/"inherit" base_image statement
+ '''
+ if timestamp:
+ image_name = os.path.splitext(image)[0]
+ timestr = time.strftime("-%Y%m%d-%H%M%S")
+ dest = image_name + str(timestr) + ".bb"
+ else:
+ if not image.endswith(".bb"):
+ dest = image + ".bb"
+ else:
+ dest = image
+
+ basename = False
+ if base_image:
+ with open(base_image, 'r') as f:
+ require_line = f.readline()
+ p = re.compile("IMAGE_BASENAME *=")
+ for line in f:
+ if p.search(line):
+ basename = True
+
+ with open(dest, "w") as imagefile:
+ if base_image is None:
+ imagefile.write("inherit core-image\n")
+ else:
+ topdir = self.data.getVar("TOPDIR", False)
+ if topdir in base_image:
+ base_image = require_line.split()[1]
+ imagefile.write("require " + base_image + "\n")
+ image_install = "IMAGE_INSTALL = \""
+ for package in package_queue:
+ image_install += str(package) + " "
+ image_install += "\"\n"
+ imagefile.write(image_install)
+
+ description_var = "DESCRIPTION = \"" + description + "\"\n"
+ imagefile.write(description_var)
+
+ if basename:
+                # If this is overwritten in an inherited image, reset it to default
+ image_basename = "IMAGE_BASENAME = \"${PN}\"\n"
+ imagefile.write(image_basename)
+
+ self.state = state.initial
+ if timestamp:
+ return timestr
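+        # A generated image recipe looks roughly like this (hypothetical contents):
+        #   require base-image.bb
+        #   IMAGE_INSTALL = "pkg-a pkg-b "
+        #   DESCRIPTION = "my custom image"
+        #   IMAGE_BASENAME = "${PN}"      (only when the base image overrides it)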
+
+ def updateCacheSync(self):
+ if self.state == state.running:
+ return
+
+ # reload files for which we got notifications
+ for p in self.inotify_modified_files:
+ bb.parse.update_cache(p)
+ self.inotify_modified_files = []
+
+ if not self.baseconfig_valid:
+ logger.debug(1, "Reloading base configuration data")
+ self.initConfigurationData()
+ self.baseconfig_valid = True
+ self.parsecache_valid = False
+
+ # This is called for all async commands when self.state != running
+ def updateCache(self):
+ if self.state == state.running:
+ return
+
+ if self.state in (state.shutdown, state.forceshutdown, state.error):
+ if hasattr(self.parser, 'shutdown'):
+ self.parser.shutdown(clean=False, force = True)
+ raise bb.BBHandledException()
+
+ if self.state != state.parsing:
+ self.updateCacheSync()
+
+ if self.state != state.parsing and not self.parsecache_valid:
+ self.parseConfiguration ()
+ if CookerFeatures.SEND_SANITYEVENTS in self.featureset:
+ bb.event.fire(bb.event.SanityCheck(False), self.data)
+
+ ignore = self.expanded_data.getVar("ASSUME_PROVIDED", True) or ""
+ self.recipecache.ignored_dependencies = set(ignore.split())
+
+ for dep in self.configuration.extra_assume_provided:
+ self.recipecache.ignored_dependencies.add(dep)
+
+ self.collection = CookerCollectFiles(self.recipecache.bbfile_config_priorities)
+ (filelist, masked) = self.collection.collect_bbfiles(self.data, self.expanded_data)
+
+ self.parser = CookerParser(self, filelist, masked)
+ self.parsecache_valid = True
+
+ self.state = state.parsing
+
+ if not self.parser.parse_next():
+ collectlog.debug(1, "parsing complete")
+ if self.parser.error:
+ raise bb.BBHandledException()
+ self.show_appends_with_no_recipes()
+ self.handlePrefProviders()
+ self.recipecache.bbfile_priority = self.collection.collection_priorities(self.recipecache.pkg_fn, self.data)
+ self.state = state.running
+
+ # Send an event listing all stamps reachable after parsing
+ # which the metadata may use to clean up stale data
+ event = bb.event.ReachableStamps(self.recipecache.stamp)
+ bb.event.fire(event, self.expanded_data)
+ return None
+
+ return True
+
+ def checkPackages(self, pkgs_to_build):
+
+ # Return a copy, don't modify the original
+ pkgs_to_build = pkgs_to_build[:]
+
+ if len(pkgs_to_build) == 0:
+ raise NothingToBuild
+
+ ignore = (self.expanded_data.getVar("ASSUME_PROVIDED", True) or "").split()
+ for pkg in pkgs_to_build:
+ if pkg in ignore:
+ parselog.warn("Explicit target \"%s\" is in ASSUME_PROVIDED, ignoring" % pkg)
+
+ if 'world' in pkgs_to_build:
+ self.buildWorldTargetList()
+ pkgs_to_build.remove('world')
+ for t in self.recipecache.world_target:
+ pkgs_to_build.append(t)
+
+ if 'universe' in pkgs_to_build:
+ parselog.warn("The \"universe\" target is only intended for testing and may produce errors.")
+ parselog.debug(1, "collating packages for \"universe\"")
+ pkgs_to_build.remove('universe')
+ for t in self.recipecache.universe_target:
+ pkgs_to_build.append(t)
+
+ return pkgs_to_build
+
+
+
+
+ def pre_serve(self):
+ # Empty the environment. The environment will be populated as
+ # necessary from the data store.
+ #bb.utils.empty_environment()
+ try:
+ self.prhost = prserv.serv.auto_start(self.data)
+ except prserv.serv.PRServiceConfigError:
+ bb.event.fire(CookerExit(), self.expanded_data)
+ self.state = state.error
+ return
+
+ def post_serve(self):
+ prserv.serv.auto_shutdown(self.data)
+ bb.event.fire(CookerExit(), self.expanded_data)
+ lockfile = self.lock.name
+ self.lock.close()
+ self.lock = None
+
+ while not self.lock:
+ with bb.utils.timeout(3):
+ self.lock = bb.utils.lockfile(lockfile, shared=False, retry=False, block=True)
+ if not self.lock:
+ # Some systems may not have lsof available
+ procs = None
+ try:
+ procs = subprocess.check_output(["lsof", '-w', lockfile], stderr=subprocess.STDOUT)
+ except OSError as e:
+ if e.errno != errno.ENOENT:
+ raise
+ if procs is None:
+ # Fall back to fuser if lsof is unavailable
+ try:
+ procs = subprocess.check_output(["fuser", '-v', lockfile], stderr=subprocess.STDOUT)
+ except OSError as e:
+ if e.errno != errno.ENOENT:
+ raise
+
+ msg = "Delaying shutdown due to active processes which appear to be holding bitbake.lock"
+ if procs:
+ msg += ":\n%s" % str(procs)
+ print(msg)
+
+
+ def shutdown(self, force = False):
+ if force:
+ self.state = state.forceshutdown
+ else:
+ self.state = state.shutdown
+
+ def finishcommand(self):
+ self.state = state.initial
+
+ def reset(self):
+ self.initConfigurationData()
+
+ def lockBitbake(self):
+ if not hasattr(self, 'lock'):
+ self.lock = None
+ if self.data:
+ lockfile = self.data.expand("${TOPDIR}/bitbake.lock")
+ if lockfile:
+ self.lock = bb.utils.lockfile(lockfile, False, False)
+ return self.lock
+
+ def unlockBitbake(self):
+ if hasattr(self, 'lock') and self.lock:
+ bb.utils.unlockfile(self.lock)
+
+def server_main(cooker, func, *args):
+ cooker.pre_serve()
+
+ if cooker.configuration.profile:
+ try:
+ import cProfile as profile
+ except:
+ import profile
+ prof = profile.Profile()
+
+ ret = profile.Profile.runcall(prof, func, *args)
+
+ prof.dump_stats("profile.log")
+ bb.utils.process_profilelog("profile.log")
+ print("Raw profiling information saved to profile.log and processed statistics to profile.log.processed")
+
+ else:
+ ret = func(*args)
+
+ cooker.post_serve()
+
+ return ret
+
+class CookerExit(bb.event.Event):
+ """
+ Notify clients of the Cooker shutdown
+ """
+
+ def __init__(self):
+ bb.event.Event.__init__(self)
+
+
+class CookerCollectFiles(object):
+ def __init__(self, priorities):
+ self.bbappends = []
+ self.bbfile_config_priorities = priorities
+
+ def calc_bbfile_priority( self, filename, matched = None ):
+ for _, _, regex, pri in self.bbfile_config_priorities:
+ if regex.match(filename):
+                if matched is not None:
+                    if regex not in matched:
+ matched.add(regex)
+ return pri
+ return 0
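+        # Example: for a hypothetical collection "core" whose BBFILE_PATTERN_core
+        # matches "^/srv/meta/" and whose BBFILE_PRIORITY_core is 5, any recipe
+        # path under /srv/meta/ gets priority 5; files matching no collection fall
+        # back to priority 0.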
+
+ def get_bbfiles(self):
+ """Get list of default .bb files by reading out the current directory"""
+ path = os.getcwd()
+ contents = os.listdir(path)
+ bbfiles = []
+ for f in contents:
+ if f.endswith(".bb"):
+ bbfiles.append(os.path.abspath(os.path.join(path, f)))
+ return bbfiles
+
+ def find_bbfiles(self, path):
+ """Find all the .bb and .bbappend files in a directory"""
+ found = []
+ for dir, dirs, files in os.walk(path):
+ for ignored in ('SCCS', 'CVS', '.svn'):
+ if ignored in dirs:
+ dirs.remove(ignored)
+            found += [os.path.join(dir, f) for f in files if f.endswith(('.bb', '.bbappend'))]
+
+ return found
+
+ def collect_bbfiles(self, config, eventdata):
+ """Collect all available .bb build files"""
+ masked = 0
+
+ collectlog.debug(1, "collecting .bb files")
+
+ files = (config.getVar( "BBFILES", True) or "").split()
+ config.setVar("BBFILES", " ".join(files))
+
+ # Sort files by priority
+ files.sort( key=lambda fileitem: self.calc_bbfile_priority(fileitem) )
+
+ if not len(files):
+ files = self.get_bbfiles()
+
+ if not len(files):
+ collectlog.error("no recipe files to build, check your BBPATH and BBFILES?")
+ bb.event.fire(CookerExit(), eventdata)
+
+ # Can't use set here as order is important
+ newfiles = []
+ for f in files:
+ if os.path.isdir(f):
+ dirfiles = self.find_bbfiles(f)
+ for g in dirfiles:
+ if g not in newfiles:
+ newfiles.append(g)
+ else:
+ globbed = glob.glob(f)
+ if not globbed and os.path.exists(f):
+ globbed = [f]
+ for g in globbed:
+ if g not in newfiles:
+ newfiles.append(g)
+
+ bbmask = config.getVar('BBMASK', True)
+
+ if bbmask:
+ try:
+ bbmask_compiled = re.compile(bbmask)
+ except sre_constants.error:
+ collectlog.critical("BBMASK is not a valid regular expression, ignoring.")
+ return list(newfiles), 0
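+        # Note: BBMASK here is a single regular expression matched against each
+        # candidate path, so a hypothetical BBMASK = "meta-foo/recipes-bad/" skips
+        # every .bb and .bbappend under that subtree.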
+
+ bbfiles = []
+ bbappend = []
+ for f in newfiles:
+ if bbmask and bbmask_compiled.search(f):
+ collectlog.debug(1, "skipping masked file %s", f)
+ masked += 1
+ continue
+ if f.endswith('.bb'):
+ bbfiles.append(f)
+ elif f.endswith('.bbappend'):
+ bbappend.append(f)
+ else:
+ collectlog.debug(1, "skipping %s: unknown file extension", f)
+
+ # Build a list of .bbappend files for each .bb file
+ for f in bbappend:
+ base = os.path.basename(f).replace('.bbappend', '.bb')
+ self.bbappends.append((base, f))
+
+ # Find overlayed recipes
+ # bbfiles will be in priority order which makes this easy
+ bbfile_seen = dict()
+ self.overlayed = defaultdict(list)
+ for f in reversed(bbfiles):
+ base = os.path.basename(f)
+ if base not in bbfile_seen:
+ bbfile_seen[base] = f
+ else:
+ topfile = bbfile_seen[base]
+ self.overlayed[topfile].append(f)
+
+ return (bbfiles, masked)
+
+ def get_file_appends(self, fn):
+ """
+ Returns a list of .bbappend files to apply to fn
+ """
+ filelist = []
+ f = os.path.basename(fn)
+ for b in self.bbappends:
+ (bbappend, filename) = b
+ if (bbappend == f) or ('%' in bbappend and bbappend.startswith(f[:bbappend.index('%')])):
+ filelist.append(filename)
+ return filelist
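+        # Example: a hypothetical busybox_1.%.bbappend is recorded under the base
+        # name "busybox_1.%.bb", and the prefix test above then applies it to both
+        # busybox_1.23.bb and busybox_1.24.1.bb.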
+
+ def collection_priorities(self, pkgfns, d):
+
+ priorities = {}
+
+ # Calculate priorities for each file
+ matched = set()
+ for p in pkgfns:
+ realfn, cls = bb.cache.Cache.virtualfn2realfn(p)
+ priorities[p] = self.calc_bbfile_priority(realfn, matched)
+
+ # Don't show the warning if the BBFILE_PATTERN did match .bbappend files
+ unmatched = set()
+ for _, _, regex, pri in self.bbfile_config_priorities:
+ if not regex in matched:
+ unmatched.add(regex)
+
+ def findmatch(regex):
+ for b in self.bbappends:
+ (bbfile, append) = b
+ if regex.match(append):
+ return True
+ return False
+
+ for unmatch in unmatched.copy():
+ if findmatch(unmatch):
+ unmatched.remove(unmatch)
+
+ for collection, pattern, regex, _ in self.bbfile_config_priorities:
+ if regex in unmatched:
+ if d.getVar('BBFILE_PATTERN_IGNORE_EMPTY_%s' % collection, True) != '1':
+ collectlog.warn("No bb files matched BBFILE_PATTERN_%s '%s'" % (collection, pattern))
+
+ return priorities
+
+class ParsingFailure(Exception):
+ def __init__(self, realexception, recipe):
+ self.realexception = realexception
+ self.recipe = recipe
+ Exception.__init__(self, realexception, recipe)
+
+class Feeder(multiprocessing.Process):
+ def __init__(self, jobs, to_parsers, quit):
+ self.quit = quit
+ self.jobs = jobs
+ self.to_parsers = to_parsers
+ multiprocessing.Process.__init__(self)
+
+ def run(self):
+ while True:
+ try:
+ quit = self.quit.get_nowait()
+ except Queue.Empty:
+ pass
+ else:
+ if quit == 'cancel':
+ self.to_parsers.cancel_join_thread()
+ break
+
+ try:
+ job = self.jobs.pop()
+ except IndexError:
+ break
+
+ try:
+ self.to_parsers.put(job, timeout=0.5)
+ except Queue.Full:
+ self.jobs.insert(0, job)
+ continue
+
+class Parser(multiprocessing.Process):
+ def __init__(self, jobs, results, quit, init, profile):
+ self.jobs = jobs
+ self.results = results
+ self.quit = quit
+ self.init = init
+ multiprocessing.Process.__init__(self)
+ self.context = bb.utils.get_context().copy()
+ self.handlers = bb.event.get_class_handlers().copy()
+ self.profile = profile
+
+ def run(self):
+
+ if not self.profile:
+ self.realrun()
+ return
+
+ try:
+ import cProfile as profile
+ except:
+ import profile
+ prof = profile.Profile()
+ try:
+ profile.Profile.runcall(prof, self.realrun)
+ finally:
+ logfile = "profile-parse-%s.log" % multiprocessing.current_process().name
+ prof.dump_stats(logfile)
+
+ def realrun(self):
+ if self.init:
+ self.init()
+
+ pending = []
+ while True:
+ try:
+ self.quit.get_nowait()
+ except Queue.Empty:
+ pass
+ else:
+ self.results.cancel_join_thread()
+ break
+
+ if pending:
+ result = pending.pop()
+ else:
+ try:
+ job = self.jobs.get(timeout=0.25)
+ except Queue.Empty:
+ continue
+
+ if job is None:
+ break
+ result = self.parse(*job)
+
+ try:
+ self.results.put(result, timeout=0.25)
+ except Queue.Full:
+ pending.append(result)
+
+ def parse(self, filename, appends, caches_array):
+ try:
+ # Reset our environment and handlers to the original settings
+ bb.utils.set_context(self.context.copy())
+ bb.event.set_class_handlers(self.handlers.copy())
+ return True, bb.cache.Cache.parse(filename, appends, self.cfg, caches_array)
+ except Exception as exc:
+ tb = sys.exc_info()[2]
+ exc.recipe = filename
+ exc.traceback = list(bb.exceptions.extract_traceback(tb, context=3))
+ return True, exc
+ # Need to turn BaseExceptions into Exceptions here so we gracefully shutdown
+ # and for example a worker thread doesn't just exit on its own in response to
+ # a SystemExit event for example.
+ except BaseException as exc:
+ return True, ParsingFailure(exc, filename)
+
+class CookerParser(object):
+ def __init__(self, cooker, filelist, masked):
+ self.filelist = filelist
+ self.cooker = cooker
+ self.cfgdata = cooker.data
+ self.cfghash = cooker.data_hash
+
+ # Accounting statistics
+ self.parsed = 0
+ self.cached = 0
+ self.error = 0
+ self.masked = masked
+
+ self.skipped = 0
+ self.virtuals = 0
+ self.total = len(filelist)
+
+ self.current = 0
+ self.num_processes = int(self.cfgdata.getVar("BB_NUMBER_PARSE_THREADS", True) or
+ multiprocessing.cpu_count())
+ self.process_names = []
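+        # Note: setting BB_NUMBER_PARSE_THREADS = "4" caps the parser pool started
+        # later at four processes; otherwise one parser per CPU is used.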
+
+ self.bb_cache = bb.cache.Cache(self.cfgdata, self.cfghash, cooker.caches_array)
+ self.fromcache = []
+ self.willparse = []
+ for filename in self.filelist:
+ appends = self.cooker.collection.get_file_appends(filename)
+ if not self.bb_cache.cacheValid(filename, appends):
+ self.willparse.append((filename, appends, cooker.caches_array))
+ else:
+ self.fromcache.append((filename, appends))
+ self.toparse = self.total - len(self.fromcache)
+ self.progress_chunk = max(self.toparse / 100, 1)
+
+ self.start()
+ self.haveshutdown = False
+
+ def start(self):
+ self.results = self.load_cached()
+ self.processes = []
+ if self.toparse:
+ bb.event.fire(bb.event.ParseStarted(self.toparse), self.cfgdata)
+ def init():
+ Parser.cfg = self.cfgdata
+ multiprocessing.util.Finalize(None, bb.codeparser.parser_cache_save, args=(self.cfgdata,), exitpriority=1)
+ multiprocessing.util.Finalize(None, bb.fetch.fetcher_parse_save, args=(self.cfgdata,), exitpriority=1)
+
+ self.feeder_quit = multiprocessing.Queue(maxsize=1)
+ self.parser_quit = multiprocessing.Queue(maxsize=self.num_processes)
+ self.jobs = multiprocessing.Queue(maxsize=self.num_processes)
+ self.result_queue = multiprocessing.Queue()
+ self.feeder = Feeder(self.willparse, self.jobs, self.feeder_quit)
+ self.feeder.start()
+ for i in range(0, self.num_processes):
+ parser = Parser(self.jobs, self.result_queue, self.parser_quit, init, self.cooker.configuration.profile)
+ parser.start()
+ self.process_names.append(parser.name)
+ self.processes.append(parser)
+
+ self.results = itertools.chain(self.results, self.parse_generator())
+
+ def shutdown(self, clean=True, force=False):
+ if not self.toparse:
+ return
+ if self.haveshutdown:
+ return
+ self.haveshutdown = True
+
+ if clean:
+ event = bb.event.ParseCompleted(self.cached, self.parsed,
+ self.skipped, self.masked,
+ self.virtuals, self.error,
+ self.total)
+
+ bb.event.fire(event, self.cfgdata)
+ self.feeder_quit.put(None)
+ for process in self.processes:
+ self.jobs.put(None)
+ else:
+ self.feeder_quit.put('cancel')
+
+ self.parser_quit.cancel_join_thread()
+ for process in self.processes:
+ self.parser_quit.put(None)
+
+ self.jobs.cancel_join_thread()
+
+ for process in self.processes:
+ if force:
+ process.join(.1)
+ process.terminate()
+ else:
+ process.join()
+ self.feeder.join()
+
+ sync = threading.Thread(target=self.bb_cache.sync)
+ sync.start()
+ multiprocessing.util.Finalize(None, sync.join, exitpriority=-100)
+ bb.codeparser.parser_cache_savemerge(self.cooker.data)
+ bb.fetch.fetcher_parse_done(self.cooker.data)
+ if self.cooker.configuration.profile:
+ profiles = []
+ for i in self.process_names:
+ logfile = "profile-parse-%s.log" % i
+ if os.path.exists(logfile):
+ profiles.append(logfile)
+
+ pout = "profile-parse.log.processed"
+ bb.utils.process_profilelog(profiles, pout = pout)
+ print("Processed parsing statistics saved to %s" % (pout))
+
+ def load_cached(self):
+ for filename, appends in self.fromcache:
+ cached, infos = self.bb_cache.load(filename, appends, self.cfgdata)
+ yield not cached, infos
+
+ def parse_generator(self):
+ while True:
+ if self.parsed >= self.toparse:
+ break
+
+ try:
+ result = self.result_queue.get(timeout=0.25)
+ except Queue.Empty:
+ pass
+ else:
+ value = result[1]
+ if isinstance(value, BaseException):
+ raise value
+ else:
+ yield result
+
+ def parse_next(self):
+ result = []
+ parsed = None
+ try:
+ parsed, result = self.results.next()
+ except StopIteration:
+ self.shutdown()
+ return False
+ except bb.BBHandledException as exc:
+ self.error += 1
+ logger.error('Failed to parse recipe: %s' % exc.recipe)
+ self.shutdown(clean=False)
+ return False
+ except ParsingFailure as exc:
+ self.error += 1
+ logger.error('Unable to parse %s: %s' %
+ (exc.recipe, bb.exceptions.to_string(exc.realexception)))
+ self.shutdown(clean=False)
+ return False
+ except bb.parse.ParseError as exc:
+ self.error += 1
+ logger.error(str(exc))
+ self.shutdown(clean=False)
+ return False
+ except bb.data_smart.ExpansionError as exc:
+ self.error += 1
+ _, value, _ = sys.exc_info()
+ logger.error('ExpansionError during parsing %s: %s', value.recipe, str(exc))
+ self.shutdown(clean=False)
+ return False
+ except SyntaxError as exc:
+ self.error += 1
+ logger.error('Unable to parse %s', exc.recipe)
+ self.shutdown(clean=False)
+ return False
+ except Exception as exc:
+ self.error += 1
+ etype, value, tb = sys.exc_info()
+ if hasattr(value, "recipe"):
+ logger.error('Unable to parse %s', value.recipe,
+ exc_info=(etype, value, exc.traceback))
+ else:
+ # Most likely, an exception occurred during raising an exception
+ import traceback
+ logger.error('Exception during parse: %s' % traceback.format_exc())
+ self.shutdown(clean=False)
+ return False
+
+ self.current += 1
+ self.virtuals += len(result)
+ if parsed:
+ self.parsed += 1
+ if self.parsed % self.progress_chunk == 0:
+ bb.event.fire(bb.event.ParseProgress(self.parsed, self.toparse),
+ self.cfgdata)
+ else:
+ self.cached += 1
+
+ for virtualfn, info_array in result:
+ if info_array[0].skipped:
+ self.skipped += 1
+ self.cooker.skiplist[virtualfn] = SkippedPackage(info_array[0])
+ self.bb_cache.add_info(virtualfn, info_array, self.cooker.recipecache,
+ parsed=parsed, watcher = self.cooker.add_filewatch)
+ return True
+
+ def reparse(self, filename):
+ infos = self.bb_cache.parse(filename,
+ self.cooker.collection.get_file_appends(filename),
+ self.cfgdata, self.cooker.caches_array)
+ for vfn, info_array in infos:
+ self.cooker.recipecache.add_from_recipeinfo(vfn, info_array)
diff --git a/bitbake/lib/bb/cookerdata.py b/bitbake/lib/bb/cookerdata.py
new file mode 100644
index 0000000..f19c283
--- /dev/null
+++ b/bitbake/lib/bb/cookerdata.py
@@ -0,0 +1,336 @@
+#!/usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2003, 2004 Phil Blundell
+# Copyright (C) 2003 - 2005 Michael 'Mickey' Lauer
+# Copyright (C) 2005 Holger Hans Peter Freyther
+# Copyright (C) 2005 ROAD GmbH
+# Copyright (C) 2006 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os, sys
+from functools import wraps
+import logging
+import bb
+from bb import data
+import bb.parse
+
+logger = logging.getLogger("BitBake")
+parselog = logging.getLogger("BitBake.Parsing")
+
+class ConfigParameters(object):
+ def __init__(self, argv=sys.argv):
+ self.options, targets = self.parseCommandLine(argv)
+ self.environment = self.parseEnvironment()
+
+ self.options.pkgs_to_build = targets or []
+
+ self.options.tracking = False
+ if hasattr(self.options, "show_environment") and self.options.show_environment:
+ self.options.tracking = True
+
+ for key, val in self.options.__dict__.items():
+ setattr(self, key, val)
+
+ def parseCommandLine(self, argv=sys.argv):
+ raise Exception("Caller must implement commandline option parsing")
+
+ def parseEnvironment(self):
+ return os.environ.copy()
+
+ def updateFromServer(self, server):
+ if not self.options.cmd:
+ defaulttask, error = server.runCommand(["getVariable", "BB_DEFAULT_TASK"])
+ if error:
+ raise Exception("Unable to get the value of BB_DEFAULT_TASK from the server: %s" % error)
+ self.options.cmd = defaulttask or "build"
+ _, error = server.runCommand(["setConfig", "cmd", self.options.cmd])
+ if error:
+ raise Exception("Unable to set configuration option 'cmd' on the server: %s" % error)
+
+ if not self.options.pkgs_to_build:
+ bbpkgs, error = server.runCommand(["getVariable", "BBPKGS"])
+ if error:
+ raise Exception("Unable to get the value of BBPKGS from the server: %s" % error)
+ if bbpkgs:
+ self.options.pkgs_to_build.extend(bbpkgs.split())
+
+ def updateToServer(self, server, environment):
+ options = {}
+ for o in ["abort", "tryaltconfigs", "force", "invalidate_stamp",
+ "verbose", "debug", "dry_run", "dump_signatures",
+ "debug_domains", "extra_assume_provided", "profile",
+ "prefile", "postfile"]:
+ options[o] = getattr(self.options, o)
+
+ ret, error = server.runCommand(["updateConfig", options, environment])
+ if error:
+ raise Exception("Unable to update the server configuration with local parameters: %s" % error)
+
+ def parseActions(self):
+ # Parse any commandline into actions
+ action = {'action':None, 'msg':None}
+ if self.options.show_environment:
+ if 'world' in self.options.pkgs_to_build:
+ action['msg'] = "'world' is not a valid target for --environment."
+ elif 'universe' in self.options.pkgs_to_build:
+ action['msg'] = "'universe' is not a valid target for --environment."
+ elif len(self.options.pkgs_to_build) > 1:
+ action['msg'] = "Only one target can be used with the --environment option."
+ elif self.options.buildfile and len(self.options.pkgs_to_build) > 0:
+ action['msg'] = "No target should be used with the --environment and --buildfile options."
+ elif len(self.options.pkgs_to_build) > 0:
+ action['action'] = ["showEnvironmentTarget", self.options.pkgs_to_build]
+ else:
+ action['action'] = ["showEnvironment", self.options.buildfile]
+ elif self.options.buildfile is not None:
+ action['action'] = ["buildFile", self.options.buildfile, self.options.cmd]
+ elif self.options.revisions_changed:
+ action['action'] = ["compareRevisions"]
+ elif self.options.show_versions:
+ action['action'] = ["showVersions"]
+ elif self.options.parse_only:
+ action['action'] = ["parseFiles"]
+ elif self.options.dot_graph:
+ if self.options.pkgs_to_build:
+ action['action'] = ["generateDotGraph", self.options.pkgs_to_build, self.options.cmd]
+ else:
+ action['msg'] = "Please specify a package name for dependency graph generation."
+ else:
+ if self.options.pkgs_to_build:
+ action['action'] = ["buildTargets", self.options.pkgs_to_build, self.options.cmd]
+ else:
+ #action['msg'] = "Nothing to do. Use 'bitbake world' to build everything, or run 'bitbake --help' for usage information."
+ action = None
+ self.options.initialaction = action
+ return action
+
+class CookerConfiguration(object):
+ """
+ Manages build options and configurations for one run
+ """
+
+ def __init__(self):
+ self.debug_domains = []
+ self.extra_assume_provided = []
+ self.prefile = []
+ self.postfile = []
+ self.debug = 0
+ self.cmd = None
+ self.abort = True
+ self.force = False
+ self.profile = False
+ self.nosetscene = False
+ self.invalidate_stamp = False
+ self.dump_signatures = []
+ self.dry_run = False
+ self.tracking = False
+ self.interface = []
+ self.writeeventlog = False
+
+ self.env = {}
+
+ def setConfigParameters(self, parameters):
+ for key in self.__dict__.keys():
+ if key in parameters.options.__dict__:
+ setattr(self, key, parameters.options.__dict__[key])
+ self.env = parameters.environment.copy()
+ self.tracking = parameters.tracking
+
+ def setServerRegIdleCallback(self, srcb):
+ self.server_register_idlecallback = srcb
+
+ def __getstate__(self):
+ state = {}
+ for key in self.__dict__.keys():
+ if key == "server_register_idlecallback":
+ state[key] = None
+ else:
+ state[key] = getattr(self, key)
+ return state
+
+ def __setstate__(self,state):
+ for k in state:
+ setattr(self, k, state[k])
+
+
+def catch_parse_error(func):
+ """Exception handling bits for our parsing"""
+ @wraps(func)
+ def wrapped(fn, *args):
+ try:
+ return func(fn, *args)
+ except IOError as exc:
+ import traceback
+ parselog.critical(traceback.format_exc())
+ parselog.critical("Unable to parse %s: %s" % (fn, exc))
+ sys.exit(1)
+ except (bb.parse.ParseError, bb.data_smart.ExpansionError) as exc:
+ import traceback
+
+ bbdir = os.path.dirname(__file__) + os.sep
+ exc_class, exc, tb = sys.exc_info()
+ for tb in iter(lambda: tb.tb_next, None):
+ # Skip frames in bitbake itself, we only want the metadata
+ fn, _, _, _ = traceback.extract_tb(tb, 1)[0]
+ if not fn.startswith(bbdir):
+ break
+ parselog.critical("Unable to parse %s", fn, exc_info=(exc_class, exc, tb))
+ sys.exit(1)
+ return wrapped
+
+@catch_parse_error
+def parse_config_file(fn, data, include=True):
+ return bb.parse.handle(fn, data, include)
+
+@catch_parse_error
+def _inherit(bbclass, data):
+ bb.parse.BBHandler.inherit(bbclass, "configuration INHERITs", 0, data)
+ return data
+
+def findConfigFile(configfile, data):
+ search = []
+ bbpath = data.getVar("BBPATH", True)
+ if bbpath:
+ for i in bbpath.split(":"):
+ search.append(os.path.join(i, "conf", configfile))
+ path = os.getcwd()
+ while path != "/":
+ search.append(os.path.join(path, "conf", configfile))
+ path, _ = os.path.split(path)
+
+ for i in search:
+ if os.path.exists(i):
+ return i
+
+ return None
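+    # Example: findConfigFile("bblayers.conf", d) first checks
+    # <entry>/conf/bblayers.conf for every entry in BBPATH, then walks upwards from
+    # the current directory checking <dir>/conf/bblayers.conf until it reaches "/".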
+
+class CookerDataBuilder(object):
+
+ def __init__(self, cookercfg, worker = False):
+
+ self.prefiles = cookercfg.prefile
+ self.postfiles = cookercfg.postfile
+ self.tracking = cookercfg.tracking
+
+ bb.utils.set_context(bb.utils.clean_context())
+ bb.event.set_class_handlers(bb.event.clean_class_handlers())
+ self.data = bb.data.init()
+ if self.tracking:
+ self.data.enableTracking()
+
+ # Keep a datastore of the initial environment variables and their
+ # values from when BitBake was launched to enable child processes
+ # to use environment variables which have been cleaned from the
+ # BitBake processes env
+ self.savedenv = bb.data.init()
+ for k in cookercfg.env:
+ self.savedenv.setVar(k, cookercfg.env[k])
+
+ filtered_keys = bb.utils.approved_variables()
+ bb.data.inheritFromOS(self.data, self.savedenv, filtered_keys)
+ self.data.setVar("BB_ORIGENV", self.savedenv)
+
+ if worker:
+ self.data.setVar("BB_WORKERCONTEXT", "1")
+
+ def parseBaseConfiguration(self):
+ try:
+ self.parseConfigurationFiles(self.prefiles, self.postfiles)
+ except SyntaxError:
+ raise bb.BBHandledException
+ except bb.data_smart.ExpansionError as e:
+ logger.error(str(e))
+ raise bb.BBHandledException
+ except Exception:
+ logger.exception("Error parsing configuration files")
+ raise bb.BBHandledException
+
+ def _findLayerConf(self, data):
+ return findConfigFile("bblayers.conf", data)
+
+ def parseConfigurationFiles(self, prefiles, postfiles):
+ data = self.data
+ bb.parse.init_parser(data)
+
+ # Parse files for loading *before* bitbake.conf and any includes
+ for f in prefiles:
+ data = parse_config_file(f, data)
+
+ layerconf = self._findLayerConf(data)
+ if layerconf:
+ parselog.debug(2, "Found bblayers.conf (%s)", layerconf)
+ # By definition bblayers.conf is in conf/ of TOPDIR.
+ # We may have been called with cwd somewhere else so reset TOPDIR
+ data.setVar("TOPDIR", os.path.dirname(os.path.dirname(layerconf)))
+ data = parse_config_file(layerconf, data)
+
+ layers = (data.getVar('BBLAYERS', True) or "").split()
+
+ data = bb.data.createCopy(data)
+ approved = bb.utils.approved_variables()
+ for layer in layers:
+ parselog.debug(2, "Adding layer %s", layer)
+ if 'HOME' in approved and '~' in layer:
+ layer = os.path.expanduser(layer)
+ data.setVar('LAYERDIR', layer)
+ data = parse_config_file(os.path.join(layer, "conf", "layer.conf"), data)
+ data.expandVarref('LAYERDIR')
+
+ data.delVar('LAYERDIR')
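+            # Note: each layer.conf typically extends BBPATH and BBFILES via
+            # ${LAYERDIR} (e.g. BBPATH .= ":${LAYERDIR}"); the expandVarref() call
+            # above pins those references to the layer's real path before LAYERDIR
+            # is deleted again.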
+
+ if not data.getVar("BBPATH", True):
+ msg = "The BBPATH variable is not set"
+ if not layerconf:
+ msg += (" and bitbake did not find a conf/bblayers.conf file in"
+ " the expected location.\nMaybe you accidentally"
+ " invoked bitbake from the wrong directory?")
+ raise SystemExit(msg)
+
+ data = parse_config_file(os.path.join("conf", "bitbake.conf"), data)
+
+ # Parse files for loading *after* bitbake.conf and any includes
+ for p in postfiles:
+ data = parse_config_file(p, data)
+
+ # Handle any INHERITs and inherit the base class
+ bbclasses = ["base"] + (data.getVar('INHERIT', True) or "").split()
+ for bbclass in bbclasses:
+ data = _inherit(bbclass, data)
+
+        # Normally we only register event handlers at the end of parsing .bb files
+ # We register any handlers we've found so far here...
+ for var in data.getVar('__BBHANDLERS', False) or []:
+ bb.event.register(var, data.getVar(var, False), (data.getVarFlag(var, "eventmask", True) or "").split())
+
+ if data.getVar("BB_WORKERCONTEXT", False) is None:
+ bb.fetch.fetcher_init(data)
+ bb.codeparser.parser_cache_init(data)
+ bb.event.fire(bb.event.ConfigParsed(), data)
+
+ if data.getVar("BB_INVALIDCONF", False) is True:
+ data.setVar("BB_INVALIDCONF", False)
+ self.parseConfigurationFiles(self.prefiles, self.postfiles)
+ return
+
+ bb.parse.init_parser(data)
+ data.setVar('BBINCLUDED',bb.parse.get_file_depends(data))
+ self.data = data
+ self.data_hash = data.get_hash()
+
+
+
diff --git a/bitbake/lib/bb/daemonize.py b/bitbake/lib/bb/daemonize.py
new file mode 100644
index 0000000..346a618
--- /dev/null
+++ b/bitbake/lib/bb/daemonize.py
@@ -0,0 +1,193 @@
+"""
+Python Daemonizing helper
+
+Configurable daemon behaviors:
+
+ 1.) The current working directory set to the "/" directory.
+ 2.) The current file creation mode mask set to 0.
+ 3.) Close all open files (1024).
+ 4.) Redirect standard I/O streams to "/dev/null".
+
+A failed call to fork() now raises an exception.
+
+References:
+ 1) Advanced Programming in the Unix Environment: W. Richard Stevens
+ http://www.apuebook.com/apue3e.html
+ 2) The Linux Programming Interface: Michael Kerrisk
+ http://man7.org/tlpi/index.html
+ 3) Unix Programming Frequently Asked Questions:
+ http://www.faqs.org/faqs/unix-faq/programmer/faq/
+
+Modified to allow a function to be daemonized and return for
+bitbake use by Richard Purdie
+"""
+
+__author__ = "Chad J. Schroeder"
+__copyright__ = "Copyright (C) 2005 Chad J. Schroeder"
+__version__ = "0.2"
+
+# Standard Python modules.
+import os # Miscellaneous OS interfaces.
+import sys # System-specific parameters and functions.
+
+# Default daemon parameters.
+# File mode creation mask of the daemon.
+# For BitBake's children, we do want to inherit the parent umask.
+UMASK = None
+
+# Default maximum for the number of available file descriptors.
+MAXFD = 1024
+
+# The standard I/O file descriptors are redirected to /dev/null by default.
+if (hasattr(os, "devnull")):
+ REDIRECT_TO = os.devnull
+else:
+ REDIRECT_TO = "/dev/null"
+
+def createDaemon(function, logfile):
+ """
+ Detach a process from the controlling terminal and run it in the
+ background as a daemon, returning control to the caller.
+ """
+
+ try:
+ # Fork a child process so the parent can exit. This returns control to
+ # the command-line or shell. It also guarantees that the child will not
+ # be a process group leader, since the child receives a new process ID
+ # and inherits the parent's process group ID. This step is required
+ # to ensure that the next call to os.setsid is successful.
+ pid = os.fork()
+ except OSError as e:
+ raise Exception("%s [%d]" % (e.strerror, e.errno))
+
+ if (pid == 0): # The first child.
+ # To become the session leader of this new session and the process group
+ # leader of the new process group, we call os.setsid(). The process is
+ # also guaranteed not to have a controlling terminal.
+ os.setsid()
+
+ # Is ignoring SIGHUP necessary?
+ #
+ # It's often suggested that the SIGHUP signal should be ignored before
+ # the second fork to avoid premature termination of the process. The
+ # reason is that when the first child terminates, all processes, e.g.
+ # the second child, in the orphaned group will be sent a SIGHUP.
+ #
+ # "However, as part of the session management system, there are exactly
+ # two cases where SIGHUP is sent on the death of a process:
+ #
+ # 1) When the process that dies is the session leader of a session that
+ # is attached to a terminal device, SIGHUP is sent to all processes
+ # in the foreground process group of that terminal device.
+ # 2) When the death of a process causes a process group to become
+ # orphaned, and one or more processes in the orphaned group are
+ # stopped, then SIGHUP and SIGCONT are sent to all members of the
+ # orphaned group." [2]
+ #
+ # The first case can be ignored since the child is guaranteed not to have
+ # a controlling terminal. The second case isn't so easy to dismiss.
+ # The process group is orphaned when the first child terminates and
+ # POSIX.1 requires that every STOPPED process in an orphaned process
+ # group be sent a SIGHUP signal followed by a SIGCONT signal. Since the
+ # second child is not STOPPED though, we can safely forego ignoring the
+ # SIGHUP signal. In any case, there are no ill-effects if it is ignored.
+ #
+ # import signal # Set handlers for asynchronous events.
+ # signal.signal(signal.SIGHUP, signal.SIG_IGN)
+
+ try:
+ # Fork a second child and exit immediately to prevent zombies. This
+ # causes the second child process to be orphaned, making the init
+ # process responsible for its cleanup. And, since the first child is
+ # a session leader without a controlling terminal, it's possible for
+ # it to acquire one by opening a terminal in the future (System V-
+ # based systems). This second fork guarantees that the child is no
+ # longer a session leader, preventing the daemon from ever acquiring
+ # a controlling terminal.
+ pid = os.fork() # Fork a second child.
+ except OSError as e:
+ raise Exception("%s [%d]" % (e.strerror, e.errno))
+
+ if (pid == 0): # The second child.
+ # Give the child control over its permissions only if a umask was
+ # explicitly configured; by default (UMASK is None) BitBake's children
+ # keep the file mode creation mask inherited from the parent.
+ if UMASK is not None:
+ os.umask(UMASK)
+ else:
+ # Parent (the first child) of the second child.
+ os._exit(0)
+ else:
+ # exit() or _exit()?
+ # _exit is like exit(), but it doesn't call any functions registered
+ # with atexit (and on_exit) or any registered signal handlers. It also
+ # closes any open file descriptors. Using exit() may cause all stdio
+ # streams to be flushed twice and any temporary files may be unexpectedly
+ # removed. It's therefore recommended that child branches of a fork()
+ # and the parent branch(es) of a daemon use _exit().
+ return
+
+ # Close all open file descriptors. This prevents the child from keeping
+ # open any file descriptors inherited from the parent. There is a variety
+ # of methods to accomplish this task. Three are listed below.
+ #
+ # Try the system configuration variable, SC_OPEN_MAX, to obtain the maximum
+ # number of open file descriptors to close. If it doesn't exist, use
+ # the default value (configurable).
+ #
+ # try:
+ # maxfd = os.sysconf("SC_OPEN_MAX")
+ # except (AttributeError, ValueError):
+ # maxfd = MAXFD
+ #
+ # OR
+ #
+ # if (os.sysconf_names.has_key("SC_OPEN_MAX")):
+ # maxfd = os.sysconf("SC_OPEN_MAX")
+ # else:
+ # maxfd = MAXFD
+ #
+ # OR
+ #
+ # Use the getrlimit method to retrieve the maximum file descriptor number
+ # that can be opened by this process. If there is no limit on the
+ # resource, use the default value.
+ #
+ import resource # Resource usage information.
+ maxfd = resource.getrlimit(resource.RLIMIT_NOFILE)[1]
+ if (maxfd == resource.RLIM_INFINITY):
+ maxfd = MAXFD
+
+ # Iterate through and close all file descriptors.
+# for fd in range(0, maxfd):
+# try:
+# os.close(fd)
+# except OSError: # ERROR, fd wasn't open to begin with (ignored)
+# pass
+
+ # Redirect the standard I/O file descriptors to the specified file. Since
+ # the daemon has no controlling terminal, most daemons redirect stdin,
+ # stdout, and stderr to /dev/null. This is done to prevent side-effects
+ # from reads and writes to the standard I/O file descriptors.
+
+ # This call to open is guaranteed to return the lowest file descriptor,
+ # which will be 0 (stdin), since it was closed above.
+# os.open(REDIRECT_TO, os.O_RDWR) # standard input (0)
+
+ # Duplicate standard input to standard output and standard error.
+# os.dup2(0, 1) # standard output (1)
+# os.dup2(0, 2) # standard error (2)
+
+
+ si = open('/dev/null', 'r')
+ so = open(logfile, 'w')
+ se = so
+
+
+ # Replace those fds with our own
+ os.dup2(si.fileno(), sys.stdin.fileno())
+ os.dup2(so.fileno(), sys.stdout.fileno())
+ os.dup2(se.fileno(), sys.stderr.fileno())
+
+ function()
+
+ os._exit(0)
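+
+# A minimal usage sketch; the function and log path below are hypothetical
+# examples, not anything this module defines:
+#
+#   def _serve():
+#       # long-running work runs in the detached grandchild process
+#       while True:
+#           do_work()          # hypothetical work function
+#
+#   createDaemon(_serve, "/tmp/example-daemon.log")
+#   # createDaemon() returns immediately in the caller; _serve() continues in
+#   # the background with stdout/stderr redirected to the log file.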
diff --git a/bitbake/lib/bb/data.py b/bitbake/lib/bb/data.py
new file mode 100644
index 0000000..f6415a4
--- /dev/null
+++ b/bitbake/lib/bb/data.py
@@ -0,0 +1,446 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'Data' implementations
+
+Functions for interacting with the data structure used by the
+BitBake build tools.
+
+The expandKeys and update_data functions are the most expensive
+operations. At night the cookie monster came by and
+suggested 'give me cookies on setting the variables and
+things will work out'. Taking this suggestion into account,
+and applying the skills from the not yet passed 'Entwurf und
+Analyse von Algorithmen' lecture, the cookie monster seems
+to be right. We will track setVar more carefully
+to have faster update_data and expandKeys operations.
+
+This is a trade-off between speed and memory again but
+the speed is more critical here.
+"""
+
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2005 Holger Hans Peter Freyther
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+# Based on functions from the base bb module, Copyright 2003 Holger Schurig
+
+import sys, os, re
+if sys.argv[0][-5:] == "pydoc":
+ path = os.path.dirname(os.path.dirname(sys.argv[1]))
+else:
+ path = os.path.dirname(os.path.dirname(sys.argv[0]))
+sys.path.insert(0, path)
+from itertools import groupby
+
+from bb import data_smart
+from bb import codeparser
+import bb
+
+logger = data_smart.logger
+_dict_type = data_smart.DataSmart
+
+def init():
+ """Return a new object representing the Bitbake data"""
+ return _dict_type()
+
+def init_db(parent = None):
+ """Return a new object representing the Bitbake data,
+ optionally based on an existing object"""
+ if parent is not None:
+ return parent.createCopy()
+ else:
+ return _dict_type()
+
+def createCopy(source):
+ """Link the source set to the destination
+ If one does not find the value in the destination set,
+ search will go on to the source set to get the value.
+ Value from source are copy-on-write. i.e. any try to
+ modify one of them will end up putting the modified value
+ in the destination set.
+ """
+ return source.createCopy()
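+
+# A minimal sketch of the copy-on-write behaviour described above; 'FOO' and
+# the values are illustrative only:
+#
+#   d = init()
+#   d.setVar("FOO", "original")
+#   child = createCopy(d)
+#   child.setVar("FOO", "changed")        # the write lands in the copy only
+#   # d.getVar("FOO", False)      -> "original"
+#   # child.getVar("FOO", False)  -> "changed"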
+
+def initVar(var, d):
+ """Non-destructive var init for data structure"""
+ d.initVar(var)
+
+
+def setVar(var, value, d):
+ """Set a variable to a given value"""
+ d.setVar(var, value)
+
+
+def getVar(var, d, exp = False):
+ """Gets the value of a variable"""
+ return d.getVar(var, exp)
+
+
+def renameVar(key, newkey, d):
+ """Renames a variable from key to newkey"""
+ d.renameVar(key, newkey)
+
+def delVar(var, d):
+ """Removes a variable from the data set"""
+ d.delVar(var)
+
+def appendVar(var, value, d):
+ """Append additional value to a variable"""
+ d.appendVar(var, value)
+
+def setVarFlag(var, flag, flagvalue, d):
+ """Set a flag for a given variable to a given value"""
+ d.setVarFlag(var, flag, flagvalue)
+
+def getVarFlag(var, flag, d):
+ """Gets given flag from given var"""
+ return d.getVarFlag(var, flag)
+
+def delVarFlag(var, flag, d):
+ """Removes a given flag from the variable's flags"""
+ d.delVarFlag(var, flag)
+
+def setVarFlags(var, flags, d):
+ """Set the flags for a given variable
+
+ Note:
+ setVarFlags will not clear previous
+ flags. Think of this method as
+ addVarFlags
+ """
+ d.setVarFlags(var, flags)
+
+def getVarFlags(var, d):
+ """Gets a variable's flags"""
+ return d.getVarFlags(var)
+
+def delVarFlags(var, d):
+ """Removes a variable's flags"""
+ d.delVarFlags(var)
+
+def keys(d):
+ """Return a list of keys in d"""
+ return d.keys()
+
+
+__expand_var_regexp__ = re.compile(r"\${[^{}]+}")
+__expand_python_regexp__ = re.compile(r"\${@.+?}")
+
+def expand(s, d, varname = None):
+ """Variable expansion using the data store"""
+ return d.expand(s, varname)
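+
+# Expansion sketch with illustrative values: "${VAR}" references are looked up
+# in the datastore and "${@...}" fragments are evaluated as python expressions:
+#
+#   d = init()
+#   d.setVar("BAR", "world")
+#   expand("hello ${BAR}", d)     # -> "hello world"
+#   expand("${@ 2 + 3}", d)       # -> "5"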
+
+def expandKeys(alterdata, readdata = None):
+ if readdata == None:
+ readdata = alterdata
+
+ todolist = {}
+ for key in alterdata:
+ if not '${' in key:
+ continue
+
+ ekey = expand(key, readdata)
+ if key == ekey:
+ continue
+ todolist[key] = ekey
+
+ # These two for loops are split for performance to maximise the
+ # usefulness of the expand cache
+ for key in sorted(todolist):
+ ekey = todolist[key]
+ newval = alterdata.getVar(ekey, False)
+ if newval is not None:
+ val = alterdata.getVar(key, False)
+ if val is not None:
+ bb.warn("Variable key %s (%s) replaces original key %s (%s)." % (key, val, ekey, newval))
+ alterdata.renameVar(key, ekey)
+
+def inheritFromOS(d, savedenv, permitted):
+ """Inherit variables from the initial environment."""
+ exportlist = bb.utils.preserved_envvars_exported()
+ for s in savedenv.keys():
+ if s in permitted:
+ try:
+ d.setVar(s, savedenv.getVar(s, True), op = 'from env')
+ if s in exportlist:
+ d.setVarFlag(s, "export", True, op = 'auto env export')
+ except TypeError:
+ pass
+
+def emit_var(var, o=sys.__stdout__, d = init(), all=False):
+ """Emit a variable to be sourced by a shell."""
+ if d.getVarFlag(var, "python"):
+ return False
+
+ export = d.getVarFlag(var, "export")
+ unexport = d.getVarFlag(var, "unexport")
+ func = d.getVarFlag(var, "func")
+ if not all and not export and not unexport and not func:
+ return False
+
+ try:
+ if all:
+ oval = d.getVar(var, False)
+ val = d.getVar(var, True)
+ except (KeyboardInterrupt, bb.build.FuncFailed):
+ raise
+ except Exception as exc:
+ o.write('# expansion of %s threw %s: %s\n' % (var, exc.__class__.__name__, str(exc)))
+ return False
+
+ if all:
+ d.varhistory.emit(var, oval, val, o, d)
+
+ if (var.find("-") != -1 or var.find(".") != -1 or var.find('{') != -1 or var.find('}') != -1 or var.find('+') != -1) and not all:
+ return False
+
+ varExpanded = d.expand(var)
+
+ if unexport:
+ o.write('unset %s\n' % varExpanded)
+ return False
+
+ if val is None:
+ return False
+
+ val = str(val)
+
+ if varExpanded.startswith("BASH_FUNC_"):
+ varExpanded = varExpanded[10:-2]
+ val = val[3:] # Strip off "() "
+ o.write("%s() %s\n" % (varExpanded, val))
+ o.write("export -f %s\n" % (varExpanded))
+ return True
+
+ if func:
+ # NOTE: should probably check for unbalanced {} within the var
+ o.write("%s() {\n%s\n}\n" % (varExpanded, val))
+ return 1
+
+ if export:
+ o.write('export ')
+
+ # if we're going to output this within doublequotes,
+ # to a shell, we need to escape the quotes in the var
+ alter = re.sub('"', '\\"', val)
+ alter = re.sub('\n', ' \\\n', alter)
+ alter = re.sub('\\$', '\\\\$', alter)
+ o.write('%s="%s"\n' % (varExpanded, alter))
+ return False
+
+def emit_env(o=sys.__stdout__, d = init(), all=False):
+ """Emits all items in the data store in a format such that it can be sourced by a shell."""
+
+ isfunc = lambda key: bool(d.getVarFlag(key, "func"))
+ keys = sorted((key for key in d.keys() if not key.startswith("__")), key=isfunc)
+ grouped = groupby(keys, isfunc)
+ for isfunc, keys in grouped:
+ for key in keys:
+ emit_var(key, o, d, all and not isfunc) and o.write('\n')
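+
+# Usage sketch: dumping a datastore so a shell can source it (the path is an
+# arbitrary example and 'd' is assumed to be a populated datastore):
+#
+#   with open("/tmp/bb_env.sh", "w") as f:
+#       emit_env(f, d, all=True)
+#   # '. /tmp/bb_env.sh' in a shell then reproduces the exported variables
+#   # and shell functions held in the datastore.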
+
+def exported_keys(d):
+ return (key for key in d.keys() if not key.startswith('__') and
+ d.getVarFlag(key, 'export') and
+ not d.getVarFlag(key, 'unexport'))
+
+def exported_vars(d):
+ for key in exported_keys(d):
+ try:
+ value = d.getVar(key, True)
+ except Exception:
+ # skip variables whose expansion fails; keep 'value' defined so
+ # the check below cannot raise NameError on the first failing key
+ value = None
+
+ if value is not None:
+ yield key, str(value)
+
+def emit_func(func, o=sys.__stdout__, d = init()):
+ """Emits all items in the data store in a format such that it can be sourced by a shell."""
+
+ keys = (key for key in d.keys() if not key.startswith("__") and not d.getVarFlag(key, "func"))
+ for key in keys:
+ emit_var(key, o, d, False)
+
+ o.write('\n')
+ emit_var(func, o, d, False) and o.write('\n')
+ newdeps = bb.codeparser.ShellParser(func, logger).parse_shell(d.getVar(func, True))
+ newdeps |= set((d.getVarFlag(func, "vardeps", True) or "").split())
+ seen = set()
+ while newdeps:
+ deps = newdeps
+ seen |= deps
+ newdeps = set()
+ for dep in deps:
+ if d.getVarFlag(dep, "func") and not d.getVarFlag(dep, "python"):
+ emit_var(dep, o, d, False) and o.write('\n')
+ newdeps |= bb.codeparser.ShellParser(dep, logger).parse_shell(d.getVar(dep, True))
+ newdeps |= set((d.getVarFlag(dep, "vardeps", True) or "").split())
+ newdeps -= seen
+
+_functionfmt = """
+def {function}(d):
+{body}"""
+
+def emit_func_python(func, o=sys.__stdout__, d = init()):
+ """Emits all items in the data store in a format such that it can be sourced by a shell."""
+
+ def write_func(func, o, call = False):
+ body = d.getVar(func, True)
+ if not body.startswith("def"):
+ body = _functionfmt.format(function=func, body=body)
+
+ o.write(body.strip() + "\n\n")
+ if call:
+ o.write(func + "(d)" + "\n\n")
+
+ write_func(func, o, True)
+ pp = bb.codeparser.PythonParser(func, logger)
+ pp.parse_python(d.getVar(func, True))
+ newdeps = pp.execs
+ newdeps |= set((d.getVarFlag(func, "vardeps", True) or "").split())
+ seen = set()
+ while newdeps:
+ deps = newdeps
+ seen |= deps
+ newdeps = set()
+ for dep in deps:
+ if d.getVarFlag(dep, "func") and d.getVarFlag(dep, "python"):
+ write_func(dep, o)
+ pp = bb.codeparser.PythonParser(dep, logger)
+ pp.parse_python(d.getVar(dep, True))
+ newdeps |= pp.execs
+ newdeps |= set((d.getVarFlag(dep, "vardeps", True) or "").split())
+ newdeps -= seen
+
+def update_data(d):
+ """Performs final steps upon the datastore, including application of overrides"""
+ d.finalize(parent = True)
+
+def build_dependencies(key, keys, shelldeps, varflagsexcl, d):
+ deps = set()
+ try:
+ if key[-1] == ']':
+ vf = key[:-1].split('[')
+ value = d.getVarFlag(vf[0], vf[1], False)
+ parser = d.expandWithRefs(value, key)
+ deps |= parser.references
+ deps = deps | (keys & parser.execs)
+ return deps, value
+ varflags = d.getVarFlags(key, ["vardeps", "vardepvalue", "vardepsexclude", "vardepvalueexclude", "postfuncs", "prefuncs"]) or {}
+ vardeps = varflags.get("vardeps")
+ value = d.getVar(key, False)
+
+ def handle_contains(value, contains, d):
+ newvalue = ""
+ for k in sorted(contains):
+ l = (d.getVar(k, True) or "").split()
+ for word in sorted(contains[k]):
+ if word in l:
+ newvalue += "\n%s{%s} = Set" % (k, word)
+ else:
+ newvalue += "\n%s{%s} = Unset" % (k, word)
+ if not newvalue:
+ return value
+ if not value:
+ return newvalue
+ return value + newvalue
+
+ if "vardepvalue" in varflags:
+ value = varflags.get("vardepvalue")
+ elif varflags.get("func"):
+ if varflags.get("python"):
+ parsedvar = d.expandWithRefs(value, key)
+ parser = bb.codeparser.PythonParser(key, logger)
+ if parsedvar.value and "\t" in parsedvar.value:
+ logger.warn("Variable %s contains tabs, please remove these (%s)" % (key, d.getVar("FILE", True)))
+ parser.parse_python(parsedvar.value)
+ deps = deps | parser.references
+ value = handle_contains(value, parser.contains, d)
+ else:
+ parsedvar = d.expandWithRefs(value, key)
+ parser = bb.codeparser.ShellParser(key, logger)
+ parser.parse_shell(parsedvar.value)
+ deps = deps | shelldeps
+ if vardeps is None:
+ parser.log.flush()
+ if "prefuncs" in varflags:
+ deps = deps | set(varflags["prefuncs"].split())
+ if "postfuncs" in varflags:
+ deps = deps | set(varflags["postfuncs"].split())
+ deps = deps | parsedvar.references
+ deps = deps | (keys & parser.execs) | (keys & parsedvar.execs)
+ value = handle_contains(value, parsedvar.contains, d)
+ else:
+ parser = d.expandWithRefs(value, key)
+ deps |= parser.references
+ deps = deps | (keys & parser.execs)
+ value = handle_contains(value, parser.contains, d)
+
+ if "vardepvalueexclude" in varflags:
+ exclude = varflags.get("vardepvalueexclude")
+ for excl in exclude.split('|'):
+ if excl:
+ value = value.replace(excl, '')
+
+ # Add varflags, assuming an exclusion list is set
+ if varflagsexcl:
+ varfdeps = []
+ for f in varflags:
+ if f not in varflagsexcl:
+ varfdeps.append('%s[%s]' % (key, f))
+ if varfdeps:
+ deps |= set(varfdeps)
+
+ deps |= set((vardeps or "").split())
+ deps -= set(varflags.get("vardepsexclude", "").split())
+ except Exception as e:
+ raise bb.data_smart.ExpansionError(key, None, e)
+ return deps, value
+ #bb.note("Variable %s references %s and calls %s" % (key, str(deps), str(execs)))
+ #d.setVarFlag(key, "vardeps", deps)
+
+def generate_dependencies(d):
+
+ keys = set(key for key in d if not key.startswith("__"))
+ shelldeps = set(key for key in d.getVar("__exportlist", False) if d.getVarFlag(key, "export") and not d.getVarFlag(key, "unexport"))
+ varflagsexcl = d.getVar('BB_SIGNATURE_EXCLUDE_FLAGS', True)
+
+ deps = {}
+ values = {}
+
+ tasklist = d.getVar('__BBTASKS', False) or []
+ for task in tasklist:
+ deps[task], values[task] = build_dependencies(task, keys, shelldeps, varflagsexcl, d)
+ newdeps = deps[task]
+ seen = set()
+ while newdeps:
+ nextdeps = newdeps
+ seen |= nextdeps
+ newdeps = set()
+ for dep in nextdeps:
+ if dep not in deps:
+ deps[dep], values[dep] = build_dependencies(dep, keys, shelldeps, varflagsexcl, d)
+ newdeps |= deps[dep]
+ newdeps -= seen
+ #print "For %s: %s" % (task, str(deps[task]))
+ return tasklist, deps, values
+
+def inherits_class(klass, d):
+ val = d.getVar('__inherit_cache', False) or []
+ needle = os.path.join('classes', '%s.bbclass' % klass)
+ for v in val:
+ if v.endswith(needle):
+ return True
+ return False
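+
+# Usage sketch: inherits_class() answers from the recipe's __inherit_cache, so
+# it only knows about classes pulled in via 'inherit' or INHERIT. The class
+# name below is an illustrative example:
+#
+#   if inherits_class("autotools", d):
+#       bb.note("this recipe inherits autotools.bbclass")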
diff --git a/bitbake/lib/bb/data_smart.py b/bitbake/lib/bb/data_smart.py
new file mode 100644
index 0000000..79b4ed9
--- /dev/null
+++ b/bitbake/lib/bb/data_smart.py
@@ -0,0 +1,942 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake Smart Dictionary Implementation
+
+Functions for interacting with the data structure used by the
+BitBake build tools.
+
+"""
+
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2004, 2005 Seb Frankengul
+# Copyright (C) 2005, 2006 Holger Hans Peter Freyther
+# Copyright (C) 2005 Uli Luckas
+# Copyright (C) 2005 ROAD GmbH
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+# Based on functions from the base bb module, Copyright 2003 Holger Schurig
+
+import copy, re, sys, traceback
+from collections import MutableMapping
+import logging
+import hashlib
+import bb, bb.codeparser
+from bb import utils
+from bb.COW import COWDictBase
+
+logger = logging.getLogger("BitBake.Data")
+
+__setvar_keyword__ = ["_append", "_prepend", "_remove"]
+__setvar_regexp__ = re.compile('(?P<base>.*?)(?P<keyword>_append|_prepend|_remove)(_(?P<add>.*))?$')
+__expand_var_regexp__ = re.compile(r"\${[^{}@\n\t ]+}")
+__expand_python_regexp__ = re.compile(r"\${@.+?}")
+
+def infer_caller_details(loginfo, parent = False, varval = True):
+ """Save the caller the trouble of specifying everything."""
+ # Save effort.
+ if 'ignore' in loginfo and loginfo['ignore']:
+ return
+ # If nothing was provided, mark this as possibly unneeded.
+ if not loginfo:
+ loginfo['ignore'] = True
+ return
+ # Infer caller's likely values for variable (var) and value (value),
+ # to reduce clutter in the rest of the code.
+ above = None
+ def set_above():
+ try:
+ raise Exception
+ except Exception:
+ tb = sys.exc_info()[2]
+ if parent:
+ return tb.tb_frame.f_back.f_back.f_back
+ else:
+ return tb.tb_frame.f_back.f_back
+
+ if varval and ('variable' not in loginfo or 'detail' not in loginfo):
+ if not above:
+ above = set_above()
+ lcls = above.f_locals.items()
+ for k, v in lcls:
+ if k == 'value' and 'detail' not in loginfo:
+ loginfo['detail'] = v
+ if k == 'var' and 'variable' not in loginfo:
+ loginfo['variable'] = v
+ # Infer file/line/function from traceback
+ # Don't use traceback.extract_stack() since it fills the line contents which
+ # we don't need and that hits stat syscalls
+ if 'file' not in loginfo:
+ if not above:
+ above = set_above()
+ f = above.f_back
+ line = f.f_lineno
+ file = f.f_code.co_filename
+ func = f.f_code.co_name
+ loginfo['file'] = file
+ loginfo['line'] = line
+ if 'func' not in loginfo:
+ loginfo['func'] = func
+
+class VariableParse:
+ def __init__(self, varname, d, val = None):
+ self.varname = varname
+ self.d = d
+ self.value = val
+
+ self.references = set()
+ self.execs = set()
+ self.contains = {}
+
+ def var_sub(self, match):
+ key = match.group()[2:-1]
+ if self.varname and key:
+ if self.varname == key:
+ raise Exception("variable %s references itself!" % self.varname)
+ if key in self.d.expand_cache:
+ varparse = self.d.expand_cache[key]
+ var = varparse.value
+ else:
+ var = self.d.getVarFlag(key, "_content", True)
+ self.references.add(key)
+ if var is not None:
+ return var
+ else:
+ return match.group()
+
+ def python_sub(self, match):
+ code = match.group()[3:-1]
+ codeobj = compile(code.strip(), self.varname or "<expansion>", "eval")
+
+ parser = bb.codeparser.PythonParser(self.varname, logger)
+ parser.parse_python(code)
+ if self.varname:
+ vardeps = self.d.getVarFlag(self.varname, "vardeps", True)
+ if vardeps is None:
+ parser.log.flush()
+ else:
+ parser.log.flush()
+ self.references |= parser.references
+ self.execs |= parser.execs
+
+ for k in parser.contains:
+ if k not in self.contains:
+ self.contains[k] = parser.contains[k].copy()
+ else:
+ self.contains[k].update(parser.contains[k])
+ value = utils.better_eval(codeobj, DataContext(self.d))
+ return str(value)
+
+
+class DataContext(dict):
+ def __init__(self, metadata, **kwargs):
+ self.metadata = metadata
+ dict.__init__(self, **kwargs)
+ self['d'] = metadata
+
+ def __missing__(self, key):
+ value = self.metadata.getVar(key, True)
+ if value is None or self.metadata.getVarFlag(key, 'func'):
+ raise KeyError(key)
+ else:
+ return value
+
+class ExpansionError(Exception):
+ def __init__(self, varname, expression, exception):
+ self.expression = expression
+ self.variablename = varname
+ self.exception = exception
+ if varname:
+ if expression:
+ self.msg = "Failure expanding variable %s, expression was %s which triggered exception %s: %s" % (varname, expression, type(exception).__name__, exception)
+ else:
+ self.msg = "Failure expanding variable %s: %s: %s" % (varname, type(exception).__name__, exception)
+ else:
+ self.msg = "Failure expanding expression %s which triggered exception %s: %s" % (expression, type(exception).__name__, exception)
+ Exception.__init__(self, self.msg)
+ self.args = (varname, expression, exception)
+ def __str__(self):
+ return self.msg
+
+class IncludeHistory(object):
+ def __init__(self, parent = None, filename = '[TOP LEVEL]'):
+ self.parent = parent
+ self.filename = filename
+ self.children = []
+ self.current = self
+
+ def copy(self):
+ new = IncludeHistory(self.parent, self.filename)
+ for c in self.children:
+ new.children.append(c)
+ return new
+
+ def include(self, filename):
+ newfile = IncludeHistory(self.current, filename)
+ self.current.children.append(newfile)
+ self.current = newfile
+ return self
+
+ def __enter__(self):
+ pass
+
+ def __exit__(self, a, b, c):
+ if self.current.parent:
+ self.current = self.current.parent
+ else:
+ bb.warn("Include log: Tried to finish '%s' at top level." % filename)
+ return False
+
+ def emit(self, o, level = 0):
+ """Emit an include history file, and its children."""
+ if level:
+ spaces = " " * (level - 1)
+ o.write("# %s%s" % (spaces, self.filename))
+ if len(self.children) > 0:
+ o.write(" includes:")
+ else:
+ o.write("#\n# INCLUDE HISTORY:\n#")
+ level = level + 1
+ for child in self.children:
+ o.write("\n")
+ child.emit(o, level)
+
+class VariableHistory(object):
+ def __init__(self, dataroot):
+ self.dataroot = dataroot
+ self.variables = COWDictBase.copy()
+
+ def copy(self):
+ new = VariableHistory(self.dataroot)
+ new.variables = self.variables.copy()
+ return new
+
+ def record(self, *kwonly, **loginfo):
+ if not self.dataroot._tracking:
+ return
+ if len(kwonly) > 0:
+ raise TypeError
+ infer_caller_details(loginfo, parent = True)
+ if 'ignore' in loginfo and loginfo['ignore']:
+ return
+ if 'op' not in loginfo or not loginfo['op']:
+ loginfo['op'] = 'set'
+ if 'detail' in loginfo:
+ loginfo['detail'] = str(loginfo['detail'])
+ if 'variable' not in loginfo or 'file' not in loginfo:
+ raise ValueError("record() missing variable or file.")
+ var = loginfo['variable']
+
+ if var not in self.variables:
+ self.variables[var] = []
+ if not isinstance(self.variables[var], list):
+ return
+ if 'nodups' in loginfo and loginfo in self.variables[var]:
+ return
+ self.variables[var].append(loginfo.copy())
+
+ def variable(self, var):
+ if var in self.variables:
+ return self.variables[var]
+ else:
+ return []
+
+ def emit(self, var, oval, val, o, d):
+ history = self.variable(var)
+
+ # Append override history
+ if var in d.overridedata:
+ for (r, override) in d.overridedata[var]:
+ for event in self.variable(r):
+ loginfo = event.copy()
+ if 'flag' in loginfo and not loginfo['flag'].startswith("_"):
+ continue
+ loginfo['variable'] = var
+ loginfo['op'] = 'override[%s]:%s' % (override, loginfo['op'])
+ history.append(loginfo)
+
+ commentVal = re.sub('\n', '\n#', str(oval))
+ if history:
+ if len(history) == 1:
+ o.write("#\n# $%s\n" % var)
+ else:
+ o.write("#\n# $%s [%d operations]\n" % (var, len(history)))
+ for event in history:
+ # o.write("# %s\n" % str(event))
+ if 'func' in event:
+ # If we have a function listed, this is internal
+ # code, not an operation in a config file, and the
+ # full path is distracting.
+ event['file'] = re.sub('.*/', '', event['file'])
+ display_func = ' [%s]' % event['func']
+ else:
+ display_func = ''
+ if 'flag' in event:
+ flag = '[%s] ' % (event['flag'])
+ else:
+ flag = ''
+ o.write("# %s %s:%s%s\n# %s\"%s\"\n" % (event['op'], event['file'], event['line'], display_func, flag, re.sub('\n', '\n# ', event['detail'])))
+ if len(history) > 1:
+ o.write("# pre-expansion value:\n")
+ o.write('# "%s"\n' % (commentVal))
+ else:
+ o.write("#\n# $%s\n# [no history recorded]\n#\n" % var)
+ o.write('# "%s"\n' % (commentVal))
+
+ def get_variable_files(self, var):
+ """Get the files where operations are made on a variable"""
+ var_history = self.variable(var)
+ files = []
+ for event in var_history:
+ files.append(event['file'])
+ return files
+
+ def get_variable_lines(self, var, f):
+ """Get the line where a operation is made on a variable in file f"""
+ var_history = self.variable(var)
+ lines = []
+ for event in var_history:
+ if f == event['file']:
+ line = event['line']
+ lines.append(line)
+ return lines
+
+ def get_variable_items_files(self, var, d):
+ """
+ Use variable history to map items added to a list variable and
+ the files in which they were added.
+ """
+ history = self.variable(var)
+ finalitems = (d.getVar(var, True) or '').split()
+ filemap = {}
+ isset = False
+ for event in history:
+ if 'flag' in event:
+ continue
+ if event['op'] == '_remove':
+ continue
+ if isset and event['op'] == 'set?':
+ continue
+ isset = True
+ items = d.expand(event['detail']).split()
+ for item in items:
+ # This is a little crude but is belt-and-braces to avoid us
+ # having to handle every possible operation type specifically
+ if item in finalitems and not item in filemap:
+ filemap[item] = event['file']
+ return filemap
+
+ def del_var_history(self, var, f=None, line=None):
+ """If file f and line are not given, the entire history of var is deleted"""
+ if var in self.variables:
+ if f and line:
+ self.variables[var] = [ x for x in self.variables[var] if x['file']!=f and x['line']!=line]
+ else:
+ self.variables[var] = []
+
+class DataSmart(MutableMapping):
+ def __init__(self):
+ self.dict = {}
+
+ self.inchistory = IncludeHistory()
+ self.varhistory = VariableHistory(self)
+ self._tracking = False
+
+ self.expand_cache = {}
+
+ # cookie monster tribute
+ # Need to be careful about writes to overridedata as it's
+ # only a shallow copy; writes could influence other data store
+ # copies!
+ self.overridedata = {}
+ self.overrides = None
+ self.overridevars = set(["OVERRIDES", "FILE"])
+ self.inoverride = False
+
+ def enableTracking(self):
+ self._tracking = True
+
+ def disableTracking(self):
+ self._tracking = False
+
+ def expandWithRefs(self, s, varname):
+
+ if not isinstance(s, basestring): # sanity check
+ return VariableParse(varname, self, s)
+
+ if varname and varname in self.expand_cache:
+ return self.expand_cache[varname]
+
+ varparse = VariableParse(varname, self)
+
+ while s.find('${') != -1:
+ olds = s
+ try:
+ s = __expand_var_regexp__.sub(varparse.var_sub, s)
+ s = __expand_python_regexp__.sub(varparse.python_sub, s)
+ if s == olds:
+ break
+ except ExpansionError:
+ raise
+ except bb.parse.SkipRecipe:
+ raise
+ except Exception as exc:
+ exc_class, exc, tb = sys.exc_info()
+ raise ExpansionError, ExpansionError(varname, s, exc), tb
+
+ varparse.value = s
+
+ if varname:
+ self.expand_cache[varname] = varparse
+
+ return varparse
+
+ def expand(self, s, varname = None):
+ return self.expandWithRefs(s, varname).value
+
+ def finalize(self, parent = False):
+ return
+
+ def internal_finalize(self, parent = False):
+ """Performs final steps upon the datastore, including application of overrides"""
+ self.overrides = None
+
+ def need_overrides(self):
+ if self.overrides is None:
+ if self.inoverride:
+ return
+ self.inoverride = True
+ # Can end up here recursively so setup dummy values
+ self.overrides = []
+ self.overridesset = set()
+ self.overrides = (self.getVar("OVERRIDES", True) or "").split(":") or []
+ self.overridesset = set(self.overrides)
+ self.inoverride = False
+ self.expand_cache = {}
+
+ def initVar(self, var):
+ self.expand_cache = {}
+ if not var in self.dict:
+ self.dict[var] = {}
+
+ def _findVar(self, var):
+ dest = self.dict
+ while dest:
+ if var in dest:
+ return dest[var]
+
+ if "_data" not in dest:
+ break
+ dest = dest["_data"]
+
+ def _makeShadowCopy(self, var):
+ if var in self.dict:
+ return
+
+ local_var = self._findVar(var)
+
+ if local_var:
+ self.dict[var] = copy.copy(local_var)
+ else:
+ self.initVar(var)
+
+
+ def setVar(self, var, value, **loginfo):
+ #print("var=" + str(var) + " val=" + str(value))
+ parsing=False
+ if 'parsing' in loginfo:
+ parsing=True
+
+ if 'op' not in loginfo:
+ loginfo['op'] = "set"
+ self.expand_cache = {}
+ match = __setvar_regexp__.match(var)
+ if match and match.group("keyword") in __setvar_keyword__:
+ base = match.group('base')
+ keyword = match.group("keyword")
+ override = match.group('add')
+ l = self.getVarFlag(base, keyword) or []
+ l.append([value, override])
+ self.setVarFlag(base, keyword, l, ignore=True)
+ # And cause that to be recorded:
+ loginfo['detail'] = value
+ loginfo['variable'] = base
+ if override:
+ loginfo['op'] = '%s[%s]' % (keyword, override)
+ else:
+ loginfo['op'] = keyword
+ self.varhistory.record(**loginfo)
+ # todo make sure keyword is not __doc__ or __module__
+ # pay the cookie monster
+
+ # more cookies for the cookie monster
+ if '_' in var:
+ self._setvar_update_overrides(base, **loginfo)
+
+
+ if base in self.overridevars:
+ self.overridevars.update(self.expandWithRefs(value, var).references)
+ self.internal_finalize(True)
+ return
+
+ if not var in self.dict:
+ self._makeShadowCopy(var)
+
+ if not parsing:
+ if "_append" in self.dict[var]:
+ del self.dict[var]["_append"]
+ if "_prepend" in self.dict[var]:
+ del self.dict[var]["_prepend"]
+ if var in self.overridedata:
+ active = []
+ self.need_overrides()
+ for (r, o) in self.overridedata[var]:
+ if o in self.overridesset:
+ active.append(r)
+ elif "_" in o:
+ if set(o.split("_")).issubset(self.overridesset):
+ active.append(r)
+ for a in active:
+ self.delVar(a)
+ del self.overridedata[var]
+
+ # more cookies for the cookie monster
+ if '_' in var:
+ self._setvar_update_overrides(var, **loginfo)
+
+ # setting var
+ self.dict[var]["_content"] = value
+ self.varhistory.record(**loginfo)
+
+ if var in self.overridevars:
+ self.overridevars.update(self.expandWithRefs(value, var).references)
+ self.internal_finalize(True)
+
+ def _setvar_update_overrides(self, var, **loginfo):
+ # aka pay the cookie monster
+ override = var[var.rfind('_')+1:]
+ shortvar = var[:var.rfind('_')]
+ while override:
+ if shortvar not in self.overridedata:
+ self.overridedata[shortvar] = []
+ if [var, override] not in self.overridedata[shortvar]:
+ # Force CoW by recreating the list first
+ self.overridedata[shortvar] = list(self.overridedata[shortvar])
+ self.overridedata[shortvar].append([var, override])
+ override = None
+ if "_" in shortvar:
+ override = var[shortvar.rfind('_')+1:]
+ shortvar = var[:shortvar.rfind('_')]
+ if len(shortvar) == 0:
+ override = None
+
+ def getVar(self, var, expand=False, noweakdefault=False, parsing=False):
+ return self.getVarFlag(var, "_content", expand, noweakdefault, parsing)
+
+ def renameVar(self, key, newkey, **loginfo):
+ """
+ Rename the variable key to newkey
+ """
+ val = self.getVar(key, 0, parsing=True)
+ if val is not None:
+ loginfo['variable'] = newkey
+ loginfo['op'] = 'rename from %s' % key
+ loginfo['detail'] = val
+ self.varhistory.record(**loginfo)
+ self.setVar(newkey, val, ignore=True, parsing=True)
+
+ for i in (__setvar_keyword__):
+ src = self.getVarFlag(key, i)
+ if src is None:
+ continue
+
+ dest = self.getVarFlag(newkey, i) or []
+ dest.extend(src)
+ self.setVarFlag(newkey, i, dest, ignore=True)
+
+ if key in self.overridedata:
+ self.overridedata[newkey] = []
+ for (v, o) in self.overridedata[key]:
+ self.overridedata[newkey].append([v.replace(key, newkey), o])
+ self.renameVar(v, v.replace(key, newkey))
+
+ if '_' in newkey and val is None:
+ self._setvar_update_overrides(newkey, **loginfo)
+
+ loginfo['variable'] = key
+ loginfo['op'] = 'rename (to)'
+ loginfo['detail'] = newkey
+ self.varhistory.record(**loginfo)
+ self.delVar(key, ignore=True)
+
+ def appendVar(self, var, value, **loginfo):
+ loginfo['op'] = 'append'
+ self.varhistory.record(**loginfo)
+ self.setVar(var + "_append", value, ignore=True, parsing=True)
+
+ def prependVar(self, var, value, **loginfo):
+ loginfo['op'] = 'prepend'
+ self.varhistory.record(**loginfo)
+ self.setVar(var + "_prepend", value, ignore=True, parsing=True)
+
+ def delVar(self, var, **loginfo):
+ loginfo['detail'] = ""
+ loginfo['op'] = 'del'
+ self.varhistory.record(**loginfo)
+ self.expand_cache = {}
+ self.dict[var] = {}
+ if var in self.overridedata:
+ del self.overridedata[var]
+ if '_' in var:
+ override = var[var.rfind('_')+1:]
+ shortvar = var[:var.rfind('_')]
+ while override:
+ try:
+ if shortvar in self.overridedata:
+ # Force CoW by recreating the list first
+ self.overridedata[shortvar] = list(self.overridedata[shortvar])
+ self.overridedata[shortvar].remove([var, override])
+ except ValueError as e:
+ pass
+ override = None
+ if "_" in shortvar:
+ override = var[shortvar.rfind('_')+1:]
+ shortvar = var[:shortvar.rfind('_')]
+ if len(shortvar) == 0:
+ override = None
+
+ def setVarFlag(self, var, flag, value, **loginfo):
+ self.expand_cache = {}
+ if 'op' not in loginfo:
+ loginfo['op'] = "set"
+ loginfo['flag'] = flag
+ self.varhistory.record(**loginfo)
+ if not var in self.dict:
+ self._makeShadowCopy(var)
+ self.dict[var][flag] = value
+
+ if flag == "_defaultval" and '_' in var:
+ self._setvar_update_overrides(var, **loginfo)
+
+ if flag == "unexport" or flag == "export":
+ if not "__exportlist" in self.dict:
+ self._makeShadowCopy("__exportlist")
+ if not "_content" in self.dict["__exportlist"]:
+ self.dict["__exportlist"]["_content"] = set()
+ self.dict["__exportlist"]["_content"].add(var)
+
+ def getVarFlag(self, var, flag, expand=False, noweakdefault=False, parsing=False):
+ local_var = self._findVar(var)
+ value = None
+ if flag == "_content" and var in self.overridedata and not parsing:
+ match = False
+ active = {}
+ self.need_overrides()
+ for (r, o) in self.overridedata[var]:
+ # What about double overrides both with "_" in the name?
+ if o in self.overridesset:
+ active[o] = r
+ elif "_" in o:
+ if set(o.split("_")).issubset(self.overridesset):
+ active[o] = r
+
+ mod = True
+ while mod:
+ mod = False
+ for o in self.overrides:
+ for a in active.copy():
+ if a.endswith("_" + o):
+ t = active[a]
+ del active[a]
+ active[a.replace("_" + o, "")] = t
+ mod = True
+ elif a == o:
+ match = active[a]
+ del active[a]
+ if match:
+ value = self.getVar(match)
+
+ if local_var is not None and value is None:
+ if flag in local_var:
+ value = copy.copy(local_var[flag])
+ elif flag == "_content" and "_defaultval" in local_var and not noweakdefault:
+ value = copy.copy(local_var["_defaultval"])
+
+
+ if flag == "_content" and local_var is not None and "_append" in local_var and not parsing:
+ if not value:
+ value = ""
+ self.need_overrides()
+ for (r, o) in local_var["_append"]:
+ match = True
+ if o:
+ for o2 in o.split("_"):
+ if not o2 in self.overrides:
+ match = False
+ if match:
+ value = value + r
+
+ if flag == "_content" and local_var is not None and "_prepend" in local_var and not parsing:
+ if not value:
+ value = ""
+ self.need_overrides()
+ for (r, o) in local_var["_prepend"]:
+
+ match = True
+ if o:
+ for o2 in o.split("_"):
+ if not o2 in self.overrides:
+ match = False
+ if match:
+ value = r + value
+
+ if expand and value:
+ # Only getvar (flag == _content) hits the expand cache
+ cachename = None
+ if flag == "_content":
+ cachename = var
+ else:
+ cachename = var + "[" + flag + "]"
+ value = self.expand(value, cachename)
+
+ if value and flag == "_content" and local_var is not None and "_remove" in local_var:
+ removes = []
+ self.need_overrides()
+ for (r, o) in local_var["_remove"]:
+ match = True
+ if o:
+ for o2 in o.split("_"):
+ if not o2 in self.overrides:
+ match = False
+ if match:
+ removes.extend(self.expand(r).split())
+
+ filtered = filter(lambda v: v not in removes,
+ value.split())
+ value = " ".join(filtered)
+ if expand and var in self.expand_cache:
+ # We need to ensure the expand cache has the correct value
+ # flag == "_content" here
+ self.expand_cache[var].value = value
+ return value
+
+ def delVarFlag(self, var, flag, **loginfo):
+ self.expand_cache = {}
+ local_var = self._findVar(var)
+ if not local_var:
+ return
+ if not var in self.dict:
+ self._makeShadowCopy(var)
+
+ if var in self.dict and flag in self.dict[var]:
+ loginfo['detail'] = ""
+ loginfo['op'] = 'delFlag'
+ loginfo['flag'] = flag
+ self.varhistory.record(**loginfo)
+
+ del self.dict[var][flag]
+
+ def appendVarFlag(self, var, flag, value, **loginfo):
+ loginfo['op'] = 'append'
+ loginfo['flag'] = flag
+ self.varhistory.record(**loginfo)
+ newvalue = (self.getVarFlag(var, flag, False) or "") + value
+ self.setVarFlag(var, flag, newvalue, ignore=True)
+
+ def prependVarFlag(self, var, flag, value, **loginfo):
+ loginfo['op'] = 'prepend'
+ loginfo['flag'] = flag
+ self.varhistory.record(**loginfo)
+ newvalue = value + (self.getVarFlag(var, flag, False) or "")
+ self.setVarFlag(var, flag, newvalue, ignore=True)
+
+ def setVarFlags(self, var, flags, **loginfo):
+ self.expand_cache = {}
+ infer_caller_details(loginfo)
+ if not var in self.dict:
+ self._makeShadowCopy(var)
+
+ for i in flags:
+ if i == "_content":
+ continue
+ loginfo['flag'] = i
+ loginfo['detail'] = flags[i]
+ self.varhistory.record(**loginfo)
+ self.dict[var][i] = flags[i]
+
+ def getVarFlags(self, var, expand = False, internalflags=False):
+ local_var = self._findVar(var)
+ flags = {}
+
+ if local_var:
+ for i in local_var:
+ if i.startswith("_") and not internalflags:
+ continue
+ flags[i] = local_var[i]
+ if expand and i in expand:
+ flags[i] = self.expand(flags[i], var + "[" + i + "]")
+ if len(flags) == 0:
+ return None
+ return flags
+
+
+ def delVarFlags(self, var, **loginfo):
+ self.expand_cache = {}
+ if not var in self.dict:
+ self._makeShadowCopy(var)
+
+ if var in self.dict:
+ content = None
+
+ loginfo['op'] = 'delete flags'
+ self.varhistory.record(**loginfo)
+
+ # try to save the content
+ if "_content" in self.dict[var]:
+ content = self.dict[var]["_content"]
+ self.dict[var] = {}
+ self.dict[var]["_content"] = content
+ else:
+ del self.dict[var]
+
+ def createCopy(self):
+ """
+ Create a copy of self by setting _data to self
+ """
+ # we really want this to be a DataSmart...
+ data = DataSmart()
+ data.dict["_data"] = self.dict
+ data.varhistory = self.varhistory.copy()
+ data.varhistory.dataroot = data
+ data.inchistory = self.inchistory.copy()
+
+ data._tracking = self._tracking
+
+ data.overrides = None
+ data.overridevars = copy.copy(self.overridevars)
+ # Should really be a deepcopy but has heavy overhead.
+ # Instead, we're careful with writes.
+ data.overridedata = copy.copy(self.overridedata)
+
+ return data
+
+ def expandVarref(self, variable, parents=False):
+ """Find all references to variable in the data and expand it
+ in place, optionally descending to parent datastores."""
+
+ if parents:
+ keys = iter(self)
+ else:
+ keys = self.localkeys()
+
+ ref = '${%s}' % variable
+ value = self.getVar(variable, False)
+ for key in keys:
+ referrervalue = self.getVar(key, False)
+ if referrervalue and ref in referrervalue:
+ self.setVar(key, referrervalue.replace(ref, value))
+
+ def localkeys(self):
+ for key in self.dict:
+ if key != '_data':
+ yield key
+
+ def __iter__(self):
+ deleted = set()
+ overrides = set()
+ def keylist(d):
+ klist = set()
+ for key in d:
+ if key == "_data":
+ continue
+ if key in deleted:
+ continue
+ if key in overrides:
+ continue
+ if not d[key]:
+ deleted.add(key)
+ continue
+ klist.add(key)
+
+ if "_data" in d:
+ klist |= keylist(d["_data"])
+
+ return klist
+
+ self.need_overrides()
+ for var in self.overridedata:
+ for (r, o) in self.overridedata[var]:
+ if o in self.overridesset:
+ overrides.add(var)
+ elif "_" in o:
+ if set(o.split("_")).issubset(self.overridesset):
+ overrides.add(var)
+
+ for k in keylist(self.dict):
+ yield k
+
+ for k in overrides:
+ yield k
+
+ def __len__(self):
+ return len(frozenset(self))
+
+ def __getitem__(self, item):
+ value = self.getVar(item, False)
+ if value is None:
+ raise KeyError(item)
+ else:
+ return value
+
+ def __setitem__(self, var, value):
+ self.setVar(var, value)
+
+ def __delitem__(self, var):
+ self.delVar(var)
+
+ def get_hash(self):
+ data = {}
+ d = self.createCopy()
+ bb.data.expandKeys(d)
+ bb.data.update_data(d)
+
+ config_whitelist = set((d.getVar("BB_HASHCONFIG_WHITELIST", True) or "").split())
+ keys = set(key for key in iter(d) if not key.startswith("__"))
+ for key in keys:
+ if key in config_whitelist:
+ continue
+
+ value = d.getVar(key, False) or ""
+ data.update({key:value})
+
+ varflags = d.getVarFlags(key, internalflags = True)
+ if not varflags:
+ continue
+ for f in varflags:
+ if f == "_content":
+ continue
+ data.update({'%s[%s]' % (key, f):varflags[f]})
+
+ for key in ["__BBTASKS", "__BBANONFUNCS", "__BBHANDLERS"]:
+ bb_list = d.getVar(key, False) or []
+ bb_list.sort()
+ data.update({key:str(bb_list)})
+
+ if key == "__BBANONFUNCS":
+ for i in bb_list:
+ value = d.getVar(i, True) or ""
+ data.update({i:value})
+
+ data_str = str([(k, data[k]) for k in sorted(data.keys())])
+ return hashlib.md5(data_str).hexdigest()
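+
+# A minimal sketch of the override and append handling implemented above;
+# the variable names and override values are illustrative only:
+#
+#   d = DataSmart()
+#   d.setVar("OVERRIDES", "arm:qemu")
+#   d.setVar("TESTVAR", "base")
+#   d.setVar("TESTVAR_arm", "arm-value")   # active, since "arm" is in OVERRIDES
+#   d.setVar("TESTVAR_append", " extra")
+#   # d.getVar("TESTVAR", True)  -> "arm-value extra"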
diff --git a/bitbake/lib/bb/event.py b/bitbake/lib/bb/event.py
new file mode 100644
index 0000000..366bc41
--- /dev/null
+++ b/bitbake/lib/bb/event.py
@@ -0,0 +1,668 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'Event' implementation
+
+Classes and functions for manipulating 'events' in the
+BitBake build tools.
+"""
+
+# Copyright (C) 2003, 2004 Chris Larson
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os, sys
+import warnings
+try:
+ import cPickle as pickle
+except ImportError:
+ import pickle
+import logging
+import atexit
+import traceback
+import bb.utils
+import bb.compat
+import bb.exceptions
+
+# This is the pid for which we should generate the event. This is set when
+# the runqueue forks off.
+worker_pid = 0
+worker_fire = None
+
+logger = logging.getLogger('BitBake.Event')
+
+class Event(object):
+ """Base class for events"""
+
+ def __init__(self):
+ self.pid = worker_pid
+
+Registered = 10
+AlreadyRegistered = 14
+
+def get_class_handlers():
+ return _handlers
+
+def set_class_handlers(h):
+ global _handlers
+ _handlers = h
+
+def clean_class_handlers():
+ return bb.compat.OrderedDict()
+
+# Internal
+_handlers = clean_class_handlers()
+_ui_handlers = {}
+_ui_logfilters = {}
+_ui_handler_seq = 0
+_event_handler_map = {}
+_catchall_handlers = {}
+_eventfilter = None
+_uiready = False
+
+def execute_handler(name, handler, event, d):
+ event.data = d
+ addedd = False
+ if 'd' not in __builtins__:
+ __builtins__['d'] = d
+ addedd = True
+ try:
+ ret = handler(event)
+ except (bb.parse.SkipRecipe, bb.BBHandledException):
+ raise
+ except Exception:
+ etype, value, tb = sys.exc_info()
+ logger.error("Execution of event handler '%s' failed" % name,
+ exc_info=(etype, value, tb.tb_next))
+ raise
+ except SystemExit as exc:
+ if exc.code != 0:
+ logger.error("Execution of event handler '%s' failed" % name)
+ raise
+ finally:
+ del event.data
+ if addedd:
+ del __builtins__['d']
+
+def fire_class_handlers(event, d):
+ if isinstance(event, logging.LogRecord):
+ return
+
+ eid = str(event.__class__)[8:-2]
+ evt_hmap = _event_handler_map.get(eid, {})
+ for name, handler in _handlers.iteritems():
+ if name in _catchall_handlers or name in evt_hmap:
+ if _eventfilter:
+ if not _eventfilter(name, handler, event, d):
+ continue
+ execute_handler(name, handler, event, d)
+
+ui_queue = []
+@atexit.register
+def print_ui_queue():
+ """If we're exiting before a UI has been spawned, display any queued
+ LogRecords to the console."""
+ logger = logging.getLogger("BitBake")
+ if not _uiready:
+ from bb.msg import BBLogFormatter
+ console = logging.StreamHandler(sys.stdout)
+ console.setFormatter(BBLogFormatter("%(levelname)s: %(message)s"))
+ logger.handlers = [console]
+
+ # First check to see if we have any proper messages
+ msgprint = False
+ for event in ui_queue:
+ if isinstance(event, logging.LogRecord):
+ if event.levelno > logging.DEBUG:
+ logger.handle(event)
+ msgprint = True
+ if msgprint:
+ return
+
+ # Nope, so just print all of the messages we have (including debug messages)
+ for event in ui_queue:
+ if isinstance(event, logging.LogRecord):
+ logger.handle(event)
+
+def fire_ui_handlers(event, d):
+ if not _uiready:
+ # No UI handlers registered yet, queue up the messages
+ ui_queue.append(event)
+ return
+
+ errors = []
+ for h in _ui_handlers:
+ #print "Sending event %s" % event
+ try:
+ if not _ui_logfilters[h].filter(event):
+ continue
+ # We use pickle here since it better handles object instances
+ # which xmlrpc's marshaller does not. Events *must* be serializable
+ # by pickle.
+ if hasattr(_ui_handlers[h].event, "sendpickle"):
+ _ui_handlers[h].event.sendpickle((pickle.dumps(event)))
+ else:
+ _ui_handlers[h].event.send(event)
+ except:
+ errors.append(h)
+ for h in errors:
+ del _ui_handlers[h]
+
+def fire(event, d):
+ """Fire off an Event"""
+
+ # We can fire class handlers in the worker process context and this is
+ # desired so they get the task based datastore.
+ # UI handlers need to be fired in the server context so we defer this. They
+ # don't have a datastore so the datastore context isn't a problem.
+
+ fire_class_handlers(event, d)
+ if worker_fire:
+ worker_fire(event, d)
+ else:
+ fire_ui_handlers(event, d)
+
+def fire_from_worker(event, d):
+ fire_ui_handlers(event, d)
+
+noop = lambda _: None
+def register(name, handler, mask=None):
+ """Register an Event handler"""
+
+ # already registered
+ if name in _handlers:
+ return AlreadyRegistered
+
+ if handler is not None:
+ # handle string containing python code
+ if isinstance(handler, basestring):
+ tmp = "def %s(e):\n%s" % (name, handler)
+ try:
+ code = compile(tmp, "%s(e)" % name, "exec")
+ except SyntaxError:
+ logger.error("Unable to register event handler '%s':\n%s", name,
+ ''.join(traceback.format_exc(limit=0)))
+ _handlers[name] = noop
+ return
+ env = {}
+ bb.utils.better_exec(code, env)
+ func = bb.utils.better_eval(name, env)
+ _handlers[name] = func
+ else:
+ _handlers[name] = handler
+
+ if not mask or '*' in mask:
+ _catchall_handlers[name] = True
+ else:
+ for m in mask:
+ if _event_handler_map.get(m, None) is None:
+ _event_handler_map[m] = {}
+ _event_handler_map[m][name] = True
+
+ return Registered
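+
+# Usage sketch: registering a class handler and firing an event through it.
+# The handler and the 'd' datastore are illustrative; real handlers normally
+# come from 'addhandler' blocks in the metadata:
+#
+#   def example_handler(e):
+#       print("saw event %s" % bb.event.getName(e))
+#
+#   bb.event.register("example_handler", example_handler,
+#                     mask=["bb.event.ConfigParsed"])
+#   bb.event.fire(bb.event.ConfigParsed(), d)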
+
+def remove(name, handler):
+ """Remove an Event handler"""
+ _handlers.pop(name)
+
+def set_eventfilter(func):
+ global _eventfilter
+ _eventfilter = func
+
+def register_UIHhandler(handler, mainui=False):
+ if mainui:
+ global _uiready
+ _uiready = True
+ bb.event._ui_handler_seq = bb.event._ui_handler_seq + 1
+ _ui_handlers[_ui_handler_seq] = handler
+ level, debug_domains = bb.msg.constructLogOptions()
+ _ui_logfilters[_ui_handler_seq] = UIEventFilter(level, debug_domains)
+ return _ui_handler_seq
+
+def unregister_UIHhandler(handlerNum):
+ if handlerNum in _ui_handlers:
+ del _ui_handlers[handlerNum]
+ return
+
+# Class to allow filtering of events and specific filtering of LogRecords *before* we put them over the IPC
+class UIEventFilter(object):
+ def __init__(self, level, debug_domains):
+ self.update(None, level, debug_domains)
+
+ def update(self, eventmask, level, debug_domains):
+ self.eventmask = eventmask
+ self.stdlevel = level
+ self.debug_domains = debug_domains
+
+ def filter(self, event):
+ if isinstance(event, logging.LogRecord):
+ if event.levelno >= self.stdlevel:
+ return True
+ if event.name in self.debug_domains and event.levelno >= self.debug_domains[event.name]:
+ return True
+ return False
+ eid = str(event.__class__)[8:-2]
+ if self.eventmask and eid not in self.eventmask:
+ return False
+ return True
+
+def set_UIHmask(handlerNum, level, debug_domains, mask):
+ if not handlerNum in _ui_handlers:
+ return False
+ if '*' in mask:
+ _ui_logfilters[handlerNum].update(None, level, debug_domains)
+ else:
+ _ui_logfilters[handlerNum].update(mask, level, debug_domains)
+ return True
+
+def getName(e):
+ """Returns the name of a class or class instance"""
+ if getattr(e, "__name__", None) == None:
+ return e.__class__.__name__
+ else:
+ return e.__name__
+
+class OperationStarted(Event):
+ """An operation has begun"""
+ def __init__(self, msg = "Operation Started"):
+ Event.__init__(self)
+ self.msg = msg
+
+class OperationCompleted(Event):
+ """An operation has completed"""
+ def __init__(self, total, msg = "Operation Completed"):
+ Event.__init__(self)
+ self.total = total
+ self.msg = msg
+
+class OperationProgress(Event):
+ """An operation is in progress"""
+ def __init__(self, current, total, msg = "Operation in Progress"):
+ Event.__init__(self)
+ self.current = current
+ self.total = total
+ self.msg = msg + ": %s/%s" % (current, total)
+
+class ConfigParsed(Event):
+ """Configuration Parsing Complete"""
+
+class RecipeEvent(Event):
+ def __init__(self, fn):
+ self.fn = fn
+ Event.__init__(self)
+
+class RecipePreFinalise(RecipeEvent):
+ """ Recipe Parsing Complete but not yet finialised"""
+
+class RecipeParsed(RecipeEvent):
+ """ Recipe Parsing Complete """
+
+class StampUpdate(Event):
+ """Trigger for any adjustment of the stamp files to happen"""
+
+ def __init__(self, targets, stampfns):
+ self._targets = targets
+ self._stampfns = stampfns
+ Event.__init__(self)
+
+ def getStampPrefix(self):
+ return self._stampfns
+
+ def getTargets(self):
+ return self._targets
+
+ stampPrefix = property(getStampPrefix)
+ targets = property(getTargets)
+
+class BuildBase(Event):
+ """Base class for bbmake run events"""
+
+ def __init__(self, n, p, failures = 0):
+ self._name = n
+ self._pkgs = p
+ Event.__init__(self)
+ self._failures = failures
+
+ def getPkgs(self):
+ return self._pkgs
+
+ def setPkgs(self, pkgs):
+ self._pkgs = pkgs
+
+ def getName(self):
+ return self._name
+
+ def setName(self, name):
+ self._name = name
+
+ def getCfg(self):
+ return self.data
+
+ def setCfg(self, cfg):
+ self.data = cfg
+
+ def getFailures(self):
+ """
+ Return the number of failed packages
+ """
+ return self._failures
+
+ pkgs = property(getPkgs, setPkgs, None, "pkgs property")
+ name = property(getName, setName, None, "name property")
+ cfg = property(getCfg, setCfg, None, "cfg property")
+
+
+
+
+
+class BuildStarted(BuildBase, OperationStarted):
+ """bbmake build run started"""
+ def __init__(self, n, p, failures = 0):
+ OperationStarted.__init__(self, "Building Started")
+ BuildBase.__init__(self, n, p, failures)
+
+class BuildCompleted(BuildBase, OperationCompleted):
+ """bbmake build run completed"""
+ def __init__(self, total, n, p, failures=0, interrupted=0):
+ if not failures:
+ OperationCompleted.__init__(self, total, "Building Succeeded")
+ else:
+ OperationCompleted.__init__(self, total, "Building Failed")
+ self._interrupted = interrupted
+ BuildBase.__init__(self, n, p, failures)
+
+class DiskFull(Event):
+ """Disk full case build aborted"""
+ def __init__(self, dev, type, freespace, mountpoint):
+ Event.__init__(self)
+ self._dev = dev
+ self._type = type
+ self._free = freespace
+ self._mountpoint = mountpoint
+
+class NoProvider(Event):
+ """No Provider for an Event"""
+
+ def __init__(self, item, runtime=False, dependees=None, reasons=None, close_matches=None):
+ Event.__init__(self)
+ self._item = item
+ self._runtime = runtime
+ self._dependees = dependees
+ self._reasons = reasons
+ self._close_matches = close_matches
+
+ def getItem(self):
+ return self._item
+
+ def isRuntime(self):
+ return self._runtime
+
+class MultipleProviders(Event):
+ """Multiple Providers"""
+
+ def __init__(self, item, candidates, runtime = False):
+ Event.__init__(self)
+ self._item = item
+ self._candidates = candidates
+ self._is_runtime = runtime
+
+ def isRuntime(self):
+ """
+ Is this a runtime issue?
+ """
+ return self._is_runtime
+
+ def getItem(self):
+ """
+ The name of the item to be built
+ """
+ return self._item
+
+ def getCandidates(self):
+ """
+ Get the possible Candidates for a PROVIDER.
+ """
+ return self._candidates
+
+class ParseStarted(OperationStarted):
+ """Recipe parsing for the runqueue has begun"""
+ def __init__(self, total):
+ OperationStarted.__init__(self, "Recipe parsing Started")
+ self.total = total
+
+class ParseCompleted(OperationCompleted):
+ """Recipe parsing for the runqueue has completed"""
+ def __init__(self, cached, parsed, skipped, masked, virtuals, errors, total):
+ OperationCompleted.__init__(self, total, "Recipe parsing Completed")
+ self.cached = cached
+ self.parsed = parsed
+ self.skipped = skipped
+ self.virtuals = virtuals
+ self.masked = masked
+ self.errors = errors
+ self.sofar = cached + parsed
+
+class ParseProgress(OperationProgress):
+ """Recipe parsing progress"""
+ def __init__(self, current, total):
+ OperationProgress.__init__(self, current, total, "Recipe parsing")
+
+
+class CacheLoadStarted(OperationStarted):
+ """Loading of the dependency cache has begun"""
+ def __init__(self, total):
+ OperationStarted.__init__(self, "Loading cache Started")
+ self.total = total
+
+class CacheLoadProgress(OperationProgress):
+ """Cache loading progress"""
+ def __init__(self, current, total):
+ OperationProgress.__init__(self, current, total, "Loading cache")
+
+class CacheLoadCompleted(OperationCompleted):
+ """Cache loading is complete"""
+ def __init__(self, total, num_entries):
+ OperationCompleted.__init__(self, total, "Loading cache Completed")
+ self.num_entries = num_entries
+
+class TreeDataPreparationStarted(OperationStarted):
+ """Tree data preparation started"""
+ def __init__(self):
+ OperationStarted.__init__(self, "Preparing tree data Started")
+
+class TreeDataPreparationProgress(OperationProgress):
+ """Tree data preparation is in progress"""
+ def __init__(self, current, total):
+ OperationProgress.__init__(self, current, total, "Preparing tree data")
+
+class TreeDataPreparationCompleted(OperationCompleted):
+ """Tree data preparation completed"""
+ def __init__(self, total):
+ OperationCompleted.__init__(self, total, "Preparing tree data Completed")
+
+class DepTreeGenerated(Event):
+ """
+ Event when a dependency tree has been generated
+ """
+
+ def __init__(self, depgraph):
+ Event.__init__(self)
+ self._depgraph = depgraph
+
+class TargetsTreeGenerated(Event):
+ """
+ Event when a set of buildable targets has been generated
+ """
+ def __init__(self, model):
+ Event.__init__(self)
+ self._model = model
+
+class ReachableStamps(Event):
+ """
+ An event listing all stamps reachable after parsing
+ which the metadata may use to clean up stale data
+ """
+
+ def __init__(self, stamps):
+ Event.__init__(self)
+ self.stamps = stamps
+
+class FilesMatchingFound(Event):
+ """
+ Event when a list of files matching the supplied pattern has
+ been generated
+ """
+ def __init__(self, pattern, matches):
+ Event.__init__(self)
+ self._pattern = pattern
+ self._matches = matches
+
+class CoreBaseFilesFound(Event):
+ """
+ Event when a list of appropriate config files has been generated
+ """
+ def __init__(self, paths):
+ Event.__init__(self)
+ self._paths = paths
+
+class ConfigFilesFound(Event):
+ """
+ Event when a list of appropriate config files has been generated
+ """
+ def __init__(self, variable, values):
+ Event.__init__(self)
+ self._variable = variable
+ self._values = values
+
+class ConfigFilePathFound(Event):
+ """
+ Event when a path for a config file has been found
+ """
+ def __init__(self, path):
+ Event.__init__(self)
+ self._path = path
+
+class MsgBase(Event):
+ """Base class for messages"""
+
+ def __init__(self, msg):
+ self._message = msg
+ Event.__init__(self)
+
+class MsgDebug(MsgBase):
+ """Debug Message"""
+
+class MsgNote(MsgBase):
+ """Note Message"""
+
+class MsgWarn(MsgBase):
+ """Warning Message"""
+
+class MsgError(MsgBase):
+ """Error Message"""
+
+class MsgFatal(MsgBase):
+ """Fatal Message"""
+
+class MsgPlain(MsgBase):
+ """General output"""
+
+class LogExecTTY(Event):
+ """Send event containing program to spawn on tty of the logger"""
+ def __init__(self, msg, prog, sleep_delay, retries):
+ Event.__init__(self)
+ self.msg = msg
+ self.prog = prog
+ self.sleep_delay = sleep_delay
+ self.retries = retries
+
+class LogHandler(logging.Handler):
+ """Dispatch logging messages as bitbake events"""
+
+ def emit(self, record):
+ if record.exc_info:
+ etype, value, tb = record.exc_info
+ if hasattr(tb, 'tb_next'):
+ tb = list(bb.exceptions.extract_traceback(tb, context=3))
+ record.bb_exc_info = (etype, value, tb)
+ record.exc_info = None
+ fire(record, None)
+
+ def filter(self, record):
+ record.taskpid = worker_pid
+ return True
+
+class RequestPackageInfo(Event):
+ """
+ Event to request package information
+ """
+
+class PackageInfo(Event):
+ """
+ Package information for GUI
+ """
+ def __init__(self, pkginfolist):
+ Event.__init__(self)
+ self._pkginfolist = pkginfolist
+
+class MetadataEvent(Event):
+ """
+ Generic event used by OE-Core classes
+ to report information during asynchronous execution
+ """
+ def __init__(self, eventtype, eventdata):
+ Event.__init__(self)
+ self.type = eventtype
+ self._localdata = eventdata
+
+class SanityCheck(Event):
+ """
+ Event to run sanity checks; either raise errors or generate events as the return status.
+ """
+ def __init__(self, generateevents = True):
+ Event.__init__(self)
+ self.generateevents = generateevents
+
+class SanityCheckPassed(Event):
+ """
+ Event to indicate sanity check has passed
+ """
+
+class SanityCheckFailed(Event):
+ """
+ Event to indicate sanity check has failed
+ """
+ def __init__(self, msg, network_error=False):
+ Event.__init__(self)
+ self._msg = msg
+ self._network_error = network_error
+
+class NetworkTest(Event):
+ """
+ Event to run network connectivity tests; either raise errors or generate events as the return status.
+ """
+ def __init__(self, generateevents = True):
+ Event.__init__(self)
+ self.generateevents = generateevents
+
+class NetworkTestPassed(Event):
+ """
+ Event to indicate network test has passed
+ """
+
+class NetworkTestFailed(Event):
+ """
+ Event to indicate network test has failed
+ """
+
diff --git a/bitbake/lib/bb/exceptions.py b/bitbake/lib/bb/exceptions.py
new file mode 100644
index 0000000..f182c8f
--- /dev/null
+++ b/bitbake/lib/bb/exceptions.py
@@ -0,0 +1,91 @@
+from __future__ import absolute_import
+import inspect
+import traceback
+import bb.namedtuple_with_abc
+from collections import namedtuple
+
+
+class TracebackEntry(namedtuple.abc):
+ """Pickleable representation of a traceback entry"""
+ _fields = 'filename lineno function args code_context index'
+ _header = ' File "{0.filename}", line {0.lineno}, in {0.function}{0.args}'
+
+ def format(self, formatter=None):
+ if not self.code_context:
+ return self._header.format(self) + '\n'
+
+ formatted = [self._header.format(self) + ':\n']
+
+ for lineindex, line in enumerate(self.code_context):
+ if formatter:
+ line = formatter(line)
+
+ if lineindex == self.index:
+ formatted.append(' >%s' % line)
+ else:
+ formatted.append(' %s' % line)
+ return formatted
+
+ def __str__(self):
+ return ''.join(self.format())
+
+def _get_frame_args(frame):
+ """Get the formatted arguments and class (if available) for a frame"""
+ arginfo = inspect.getargvalues(frame)
+
+ try:
+ if not arginfo.args:
+ return '', None
+ # There have been reports from the field of Python 2.6 returning a plain
+ # tuple here rather than a namedtuple, so fall back gracefully if the
+ # args attribute isn't present.
+ except AttributeError:
+ return '', None
+
+ firstarg = arginfo.args[0]
+ if firstarg == 'self':
+ self = arginfo.locals['self']
+ cls = self.__class__.__name__
+
+ arginfo.args.pop(0)
+ del arginfo.locals['self']
+ else:
+ cls = None
+
+ formatted = inspect.formatargvalues(*arginfo)
+ return formatted, cls
+
+def extract_traceback(tb, context=1):
+ frames = inspect.getinnerframes(tb, context)
+ for frame, filename, lineno, function, code_context, index in frames:
+ formatted_args, cls = _get_frame_args(frame)
+ if cls:
+ function = '%s.%s' % (cls, function)
+ yield TracebackEntry(filename, lineno, function, formatted_args,
+ code_context, index)
+
+def format_extracted(extracted, formatter=None, limit=None):
+ if limit:
+ extracted = extracted[-limit:]
+
+ formatted = []
+ for tracebackinfo in extracted:
+ formatted.extend(tracebackinfo.format(formatter))
+ return formatted
+
+
+def format_exception(etype, value, tb, context=1, limit=None, formatter=None):
+ formatted = ['Traceback (most recent call last):\n']
+
+ if hasattr(tb, 'tb_next'):
+ tb = extract_traceback(tb, context)
+
+ formatted.extend(format_extracted(tb, formatter, limit))
+ formatted.extend(traceback.format_exception_only(etype, value))
+ return formatted
+
+def to_string(exc):
+ if isinstance(exc, SystemExit):
+ if not isinstance(exc.code, basestring):
+ return 'Exited with "%d"' % exc.code
+ return str(exc)
diff --git a/bitbake/lib/bb/fetch2/__init__.py b/bitbake/lib/bb/fetch2/__init__.py
new file mode 100644
index 0000000..3d53b63
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/__init__.py
@@ -0,0 +1,1787 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'Fetch' implementations
+
+Classes for obtaining upstream sources for the
+BitBake build tools.
+"""
+
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2012 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+# Based on functions from the base bb module, Copyright 2003 Holger Schurig
+
+from __future__ import absolute_import
+from __future__ import print_function
+import os, re
+import signal
+import glob
+import logging
+import urllib
+import urlparse
+import operator
+import bb.persist_data, bb.utils
+import bb.checksum
+from bb import data
+import bb.process
+import subprocess
+
+__version__ = "2"
+_checksum_cache = bb.checksum.FileChecksumCache()
+
+logger = logging.getLogger("BitBake.Fetcher")
+
+try:
+ import cPickle as pickle
+except ImportError:
+ import pickle
+ logger.info("Importing cPickle failed. "
+ "Falling back to a very slow implementation.")
+
+class BBFetchException(Exception):
+ """Class all fetch exceptions inherit from"""
+ def __init__(self, message):
+ self.msg = message
+ Exception.__init__(self, message)
+
+ def __str__(self):
+ return self.msg
+
+class UntrustedUrl(BBFetchException):
+ """Exception raised when encountering a host not listed in BB_ALLOWED_NETWORKS"""
+ def __init__(self, url, message=''):
+ if message:
+ msg = message
+ else:
+ msg = "The URL: '%s' is not trusted and cannot be used" % url
+ self.url = url
+ BBFetchException.__init__(self, msg)
+ self.args = (url,)
+
+class MalformedUrl(BBFetchException):
+ """Exception raised when encountering an invalid url"""
+ def __init__(self, url, message=''):
+ if message:
+ msg = message
+ else:
+ msg = "The URL: '%s' is invalid and cannot be interpreted" % url
+ self.url = url
+ BBFetchException.__init__(self, msg)
+ self.args = (url,)
+
+class FetchError(BBFetchException):
+ """General fetcher exception when something happens incorrectly"""
+ def __init__(self, message, url = None):
+ if url:
+ msg = "Fetcher failure for URL: '%s'. %s" % (url, message)
+ else:
+ msg = "Fetcher failure: %s" % message
+ self.url = url
+ BBFetchException.__init__(self, msg)
+ self.args = (message, url)
+
+class ChecksumError(FetchError):
+ """Exception when mismatched checksum encountered"""
+ def __init__(self, message, url = None, checksum = None):
+ self.checksum = checksum
+ FetchError.__init__(self, message, url)
+
+class NoChecksumError(FetchError):
+ """Exception when no checksum is specified, but BB_STRICT_CHECKSUM is set"""
+
+class UnpackError(BBFetchException):
+ """General fetcher exception when something happens incorrectly when unpacking"""
+ def __init__(self, message, url):
+ msg = "Unpack failure for URL: '%s'. %s" % (url, message)
+ self.url = url
+ BBFetchException.__init__(self, msg)
+ self.args = (message, url)
+
+class NoMethodError(BBFetchException):
+ """Exception raised when there is no method to obtain a supplied url or set of urls"""
+ def __init__(self, url):
+ msg = "Could not find a fetcher which supports the URL: '%s'" % url
+ self.url = url
+ BBFetchException.__init__(self, msg)
+ self.args = (url,)
+
+class MissingParameterError(BBFetchException):
+ """Exception raised when a fetch method is missing a critical parameter in the url"""
+ def __init__(self, missing, url):
+ msg = "URL: '%s' is missing the required parameter '%s'" % (url, missing)
+ self.url = url
+ self.missing = missing
+ BBFetchException.__init__(self, msg)
+ self.args = (missing, url)
+
+class ParameterError(BBFetchException):
+ """Exception raised when a url cannot be proccessed due to invalid parameters."""
+ def __init__(self, message, url):
+ msg = "URL: '%s' has invalid parameters. %s" % (url, message)
+ self.url = url
+ BBFetchException.__init__(self, msg)
+ self.args = (message, url)
+
+class NetworkAccess(BBFetchException):
+ """Exception raised when network access is disabled but it is required."""
+ def __init__(self, url, cmd):
+ msg = "Network access disabled through BB_NO_NETWORK (or set indirectly due to use of BB_FETCH_PREMIRRORONLY) but access requested with command %s (for url %s)" % (cmd, url)
+ self.url = url
+ self.cmd = cmd
+ BBFetchException.__init__(self, msg)
+ self.args = (url, cmd)
+
+class NonLocalMethod(Exception):
+ def __init__(self):
+ Exception.__init__(self)
+
+
+class URI(object):
+ """
+ A class representing a generic URI, with methods for
+ accessing the URI components, and stringifies to the
+ URI.
+
+ It is constructed by calling it with a URI, or setting
+ the attributes manually:
+
+ uri = URI("http://example.com/")
+
+ uri = URI()
+ uri.scheme = 'http'
+ uri.hostname = 'example.com'
+ uri.path = '/'
+
+ It has the following attributes:
+
+ * scheme (read/write)
+ * userinfo (authentication information) (read/write)
+ * username (read/write)
+ * password (read/write)
+
+ Note, password is deprecated as of RFC 3986.
+
+ * hostname (read/write)
+ * port (read/write)
+ * hostport (read only)
+ "hostname:port", if both are set, otherwise just "hostname"
+ * path (read/write)
+ * path_quoted (read/write)
+ A URI quoted version of path
+ * params (dict) (read/write)
+ * query (dict) (read/write)
+ * relative (bool) (read only)
+ True if this is a "relative URI", (e.g. file:foo.diff)
+
+ It stringifies to the URI itself.
+
+ Some notes about relative URIs: while it's specified that
+ a URI beginning with <scheme>:// should either be directly
+ followed by a hostname or a /, the old URI handling of the
+ fetch2 library did not conform to this. Therefore, this URI
+ class has some kludges to make sure that URIs are parsed in
+ a way conforming to bitbake's current usage. This URI class
+ supports the following:
+
+ file:relative/path.diff (IETF compliant)
+ git:relative/path.git (IETF compliant)
+ git:///absolute/path.git (IETF compliant)
+ file:///absolute/path.diff (IETF compliant)
+
+ file://relative/path.diff (not IETF compliant)
+
+ But it does not support the following:
+
+ file://hostname/absolute/path.diff (would be IETF compliant)
+
+ Note that the last case only applies to a list of
+ "whitelisted" schemes (currently only file://) whose URIs
+ are required not to have a network location.
+ """
+
+ _relative_schemes = ['file', 'git']
+ _netloc_forbidden = ['file']
+
+ def __init__(self, uri=None):
+ self.scheme = ''
+ self.userinfo = ''
+ self.hostname = ''
+ self.port = None
+ self._path = ''
+ self.params = {}
+ self.query = {}
+ self.relative = False
+
+ if not uri:
+ return
+
+ # We hijack the URL parameters, since the way bitbake uses
+ # them are not quite RFC compliant.
+ uri, param_str = (uri.split(";", 1) + [None])[:2]
+
+ urlp = urlparse.urlparse(uri)
+ self.scheme = urlp.scheme
+
+ reparse = 0
+
+ # Coerce urlparse to make URI scheme use netloc
+ if not self.scheme in urlparse.uses_netloc:
+ urlparse.uses_params.append(self.scheme)
+ reparse = 1
+
+ # Make urlparse happy(/ier) by converting local resources
+ # to RFC compliant URL format. E.g.:
+ # file://foo.diff -> file:foo.diff
+ if urlp.scheme in self._netloc_forbidden:
+ uri = re.sub("(?<=:)//(?!/)", "", uri, 1)
+ reparse = 1
+
+ if reparse:
+ urlp = urlparse.urlparse(uri)
+
+ # Identify if the URI is relative or not
+ if urlp.scheme in self._relative_schemes and \
+ re.compile("^\w+:(?!//)").match(uri):
+ self.relative = True
+
+ if not self.relative:
+ self.hostname = urlp.hostname or ''
+ self.port = urlp.port
+
+ self.userinfo += urlp.username or ''
+
+ if urlp.password:
+ self.userinfo += ':%s' % urlp.password
+
+ self.path = urllib.unquote(urlp.path)
+
+ if param_str:
+ self.params = self._param_str_split(param_str, ";")
+ if urlp.query:
+ self.query = self._param_str_split(urlp.query, "&")
+
+ def __str__(self):
+ userinfo = self.userinfo
+ if userinfo:
+ userinfo += '@'
+
+ return "%s:%s%s%s%s%s%s" % (
+ self.scheme,
+ '' if self.relative else '//',
+ userinfo,
+ self.hostport,
+ self.path_quoted,
+ self._query_str(),
+ self._param_str())
+
+ def _param_str(self):
+ return (
+ ''.join([';', self._param_str_join(self.params, ";")])
+ if self.params else '')
+
+ def _query_str(self):
+ return (
+ ''.join(['?', self._param_str_join(self.query, "&")])
+ if self.query else '')
+
+ def _param_str_split(self, string, elmdelim, kvdelim="="):
+ ret = {}
+ for k, v in [x.split(kvdelim, 1) for x in string.split(elmdelim)]:
+ ret[k] = v
+ return ret
+
+ def _param_str_join(self, dict_, elmdelim, kvdelim="="):
+ return elmdelim.join([kvdelim.join([k, v]) for k, v in dict_.items()])
+
+ @property
+ def hostport(self):
+ if not self.port:
+ return self.hostname
+ return "%s:%d" % (self.hostname, self.port)
+
+ @property
+ def path_quoted(self):
+ return urllib.quote(self.path)
+
+ @path_quoted.setter
+ def path_quoted(self, path):
+ self.path = urllib.unquote(path)
+
+ @property
+ def path(self):
+ return self._path
+
+ @path.setter
+ def path(self, path):
+ self._path = path
+
+ if re.compile("^/").match(path):
+ self.relative = False
+ else:
+ self.relative = True
+
+ @property
+ def username(self):
+ if self.userinfo:
+ return (self.userinfo.split(":", 1))[0]
+ return ''
+
+ @username.setter
+ def username(self, username):
+ password = self.password
+ self.userinfo = username
+ if password:
+ self.userinfo += ":%s" % password
+
+ @property
+ def password(self):
+ if self.userinfo and ":" in self.userinfo:
+ return (self.userinfo.split(":", 1))[1]
+ return ''
+
+ @password.setter
+ def password(self, password):
+ self.userinfo = "%s:%s" % (self.username, password)
+
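+# Illustrative use of the URI class above (a minimal sketch; the host name is
+# made up):
+#
+#   uri = URI("git://git.example.com/repo.git;protocol=https")
+#   uri.scheme    -> 'git'
+#   uri.hostname  -> 'git.example.com'
+#   uri.path      -> '/repo.git'
+#   uri.params    -> {'protocol': 'https'}
+#   str(uri)      -> 'git://git.example.com/repo.git;protocol=https'
+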
+def decodeurl(url):
+ """Decodes an URL into the tokens (scheme, network location, path,
+ user, password, parameters).
+ """
+
+ m = re.compile('(?P<type>[^:]*)://((?P<user>[^/]+)@)?(?P<location>[^;]+)(;(?P<parm>.*))?').match(url)
+ if not m:
+ raise MalformedUrl(url)
+
+ type = m.group('type')
+ location = m.group('location')
+ if not location:
+ raise MalformedUrl(url)
+ user = m.group('user')
+ parm = m.group('parm')
+
+ locidx = location.find('/')
+ if locidx != -1 and type.lower() != 'file':
+ host = location[:locidx]
+ path = location[locidx:]
+ else:
+ host = ""
+ path = location
+ if user:
+ m = re.compile('(?P<user>[^:]+)(:?(?P<pswd>.*))').match(user)
+ if m:
+ user = m.group('user')
+ pswd = m.group('pswd')
+ else:
+ user = ''
+ pswd = ''
+
+ p = {}
+ if parm:
+ for s in parm.split(';'):
+ if s:
+ if not '=' in s:
+ raise MalformedUrl(url, "The URL: '%s' is invalid: parameter %s does not specify a value (missing '=')" % (url, s))
+ s1, s2 = s.split('=')
+ p[s1] = s2
+
+ return type, host, urllib.unquote(path), user, pswd, p
+
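+# For example (an illustrative URL, not taken from any recipe), decodeurl()
+# splits "http://www.example.com/software/foo-1.0.tar.gz;name=foo" into:
+#   ('http', 'www.example.com', '/software/foo-1.0.tar.gz', '', '', {'name': 'foo'})
+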
+def encodeurl(decoded):
+ """Encodes a URL from tokens (scheme, network location, path,
+ user, password, parameters).
+ """
+
+ type, host, path, user, pswd, p = decoded
+
+ if not path:
+ raise MissingParameterError('path', "encoded from the data %s" % str(decoded))
+ if not type:
+ raise MissingParameterError('type', "encoded from the data %s" % str(decoded))
+ url = '%s://' % type
+ if user and type != "file":
+ url += "%s" % user
+ if pswd:
+ url += ":%s" % pswd
+ url += "@"
+ if host and type != "file":
+ url += "%s" % host
+ # Standardise path to ensure comparisons work
+ while '//' in path:
+ path = path.replace("//", "/")
+ url += "%s" % urllib.quote(path)
+ if p:
+ for parm in p:
+ url += ";%s=%s" % (parm, p[parm])
+
+ return url
+
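+# encodeurl() is the inverse of decodeurl(); for the illustrative tuple above,
+#   encodeurl(('http', 'www.example.com', '/software/foo-1.0.tar.gz', '', '', {'name': 'foo'}))
+# yields "http://www.example.com/software/foo-1.0.tar.gz;name=foo".
+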
+def uri_replace(ud, uri_find, uri_replace, replacements, d):
+ if not ud.url or not uri_find or not uri_replace:
+ logger.error("uri_replace: passed an undefined value, not replacing")
+ return None
+ uri_decoded = list(decodeurl(ud.url))
+ uri_find_decoded = list(decodeurl(uri_find))
+ uri_replace_decoded = list(decodeurl(uri_replace))
+ logger.debug(2, "For url %s comparing %s to %s" % (uri_decoded, uri_find_decoded, uri_replace_decoded))
+ result_decoded = ['', '', '', '', '', {}]
+ for loc, i in enumerate(uri_find_decoded):
+ result_decoded[loc] = uri_decoded[loc]
+ regexp = i
+ if loc == 0 and regexp and not regexp.endswith("$"):
+ # Leaving the type unanchored would let a pattern such as "file" also
+ # match types like "files", which is clearly undesirable.
+ regexp += "$"
+ if loc == 5:
+ # Handle URL parameters
+ if i:
+ # Any specified URL parameters must match
+ for k in uri_replace_decoded[loc]:
+ if uri_decoded[loc][k] != uri_replace_decoded[loc][k]:
+ return None
+ # Overwrite any specified replacement parameters
+ for k in uri_replace_decoded[loc]:
+ for l in replacements:
+ uri_replace_decoded[loc][k] = uri_replace_decoded[loc][k].replace(l, replacements[l])
+ result_decoded[loc][k] = uri_replace_decoded[loc][k]
+ elif (re.match(regexp, uri_decoded[loc])):
+ if not uri_replace_decoded[loc]:
+ result_decoded[loc] = ""
+ else:
+ for k in replacements:
+ uri_replace_decoded[loc] = uri_replace_decoded[loc].replace(k, replacements[k])
+ #bb.note("%s %s %s" % (regexp, uri_replace_decoded[loc], uri_decoded[loc]))
+ result_decoded[loc] = re.sub(regexp, uri_replace_decoded[loc], uri_decoded[loc])
+ if loc == 2:
+ # Handle path manipulations
+ basename = None
+ if uri_decoded[0] != uri_replace_decoded[0] and ud.mirrortarball:
+ # If the source and destination url types differ, must be a mirrortarball mapping
+ basename = os.path.basename(ud.mirrortarball)
+ # Kill parameters, they make no sense for mirror tarballs
+ uri_decoded[5] = {}
+ elif ud.localpath and ud.method.supports_checksum(ud):
+ basename = os.path.basename(ud.localpath)
+ if basename and not result_decoded[loc].endswith(basename):
+ result_decoded[loc] = os.path.join(result_decoded[loc], basename)
+ else:
+ return None
+ result = encodeurl(result_decoded)
+ if result == ud.url:
+ return None
+ logger.debug(2, "For url %s returning %s" % (ud.url, result))
+ return result
+
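+# A sketch of how uri_replace() performs mirror mapping (made-up hosts, and
+# assuming a plain http download whose localpath and checksums are known):
+# with the mirror line ("http://.*/.*", "http://mirror.example.com/sources/"),
+# the url "http://www.example.com/pub/foo-1.0.tar.gz" is rewritten to
+# "http://mirror.example.com/sources/foo-1.0.tar.gz"; the basename is appended
+# because the replacement path does not already end with it.
+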
+methods = []
+urldata_cache = {}
+saved_headrevs = {}
+
+def fetcher_init(d):
+ """
+ Called to initialize the fetchers once the configuration data is known.
+ Calls before this must not hit the cache.
+ """
+ # When to drop SCM head revisions controlled by user policy
+ srcrev_policy = d.getVar('BB_SRCREV_POLICY', True) or "clear"
+ if srcrev_policy == "cache":
+ logger.debug(1, "Keeping SRCREV cache due to cache policy of: %s", srcrev_policy)
+ elif srcrev_policy == "clear":
+ logger.debug(1, "Clearing SRCREV cache due to cache policy of: %s", srcrev_policy)
+ revs = bb.persist_data.persist('BB_URI_HEADREVS', d)
+ try:
+ bb.fetch2.saved_headrevs = revs.items()
+ except:
+ pass
+ revs.clear()
+ else:
+ raise FetchError("Invalid SRCREV cache policy of: %s" % srcrev_policy)
+
+ _checksum_cache.init_cache(d)
+
+ for m in methods:
+ if hasattr(m, "init"):
+ m.init(d)
+
+def fetcher_parse_save(d):
+ _checksum_cache.save_extras(d)
+
+def fetcher_parse_done(d):
+ _checksum_cache.save_merge(d)
+
+def fetcher_compare_revisions(d):
+ """
+ Compare the revisions in the persistent cache with current values and
+ return true/false on whether they've changed.
+ """
+
+ data = bb.persist_data.persist('BB_URI_HEADREVS', d).items()
+ data2 = bb.fetch2.saved_headrevs
+
+ changed = False
+ for key in data:
+ if key not in data2 or data2[key] != data[key]:
+ logger.debug(1, "%s changed", key)
+ changed = True
+ return True
+ else:
+ logger.debug(2, "%s did not change", key)
+ return False
+
+def mirror_from_string(data):
+ return [ i.split() for i in (data or "").replace('\\n','\n').split('\n') if i ]
+
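+# mirror_from_string() turns a PREMIRRORS/MIRRORS style value into (regex,
+# replacement) pairs. For example (illustrative hosts), the value
+#   "http://.*/.* http://mirror.example.com/sources/ \n ftp://.*/.* http://mirror.example.com/sources/"
+# (with a literal backslash-n separator, as written in configuration files)
+# becomes
+#   [['http://.*/.*', 'http://mirror.example.com/sources/'],
+#    ['ftp://.*/.*', 'http://mirror.example.com/sources/']]
+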
+def verify_checksum(ud, d, precomputed={}):
+ """
+ Verify the MD5 and SHA256 checksums for the downloaded source.
+
+ Raises a FetchError if one or both of the SRC_URI checksums do not match
+ the downloaded file, or if BB_STRICT_CHECKSUM is set and there are no
+ checksums specified.
+
+ Returns a dict of checksums that can be stored in a done stamp file and
+ passed in as precomputed parameter in a later call to avoid re-computing
+ the checksums from the file. This allows verifying the checksums of the
+ file against those in the recipe each time, rather than only after
+ downloading. See https://bugzilla.yoctoproject.org/show_bug.cgi?id=5571.
+ """
+
+ _MD5_KEY = "md5"
+ _SHA256_KEY = "sha256"
+
+ if ud.ignore_checksums or not ud.method.supports_checksum(ud):
+ return {}
+
+ if _MD5_KEY in precomputed:
+ md5data = precomputed[_MD5_KEY]
+ else:
+ md5data = bb.utils.md5_file(ud.localpath)
+
+ if _SHA256_KEY in precomputed:
+ sha256data = precomputed[_SHA256_KEY]
+ else:
+ sha256data = bb.utils.sha256_file(ud.localpath)
+
+ if ud.method.recommends_checksum(ud):
+ # If strict checking enabled and neither sum defined, raise error
+ strict = d.getVar("BB_STRICT_CHECKSUM", True) or "0"
+ if (strict == "1") and not (ud.md5_expected or ud.sha256_expected):
+ logger.error('No checksum specified for %s, please add at least one to the recipe:\n'
+ 'SRC_URI[%s] = "%s"\nSRC_URI[%s] = "%s"' %
+ (ud.localpath, ud.md5_name, md5data,
+ ud.sha256_name, sha256data))
+ raise NoChecksumError('Missing SRC_URI checksum', ud.url)
+
+ # Log missing sums so user can more easily add them
+ if not ud.md5_expected:
+ logger.warn('Missing md5 SRC_URI checksum for %s, consider adding to the recipe:\n'
+ 'SRC_URI[%s] = "%s"',
+ ud.localpath, ud.md5_name, md5data)
+
+ if not ud.sha256_expected:
+ logger.warn('Missing sha256 SRC_URI checksum for %s, consider adding to the recipe:\n'
+ 'SRC_URI[%s] = "%s"',
+ ud.localpath, ud.sha256_name, sha256data)
+
+ md5mismatch = False
+ sha256mismatch = False
+
+ if ud.md5_expected != md5data:
+ md5mismatch = True
+
+ if ud.sha256_expected != sha256data:
+ sha256mismatch = True
+
+ # We want to alert the user if a checksum is defined in the recipe but
+ # it does not match.
+ msg = ""
+ mismatch = False
+ if md5mismatch and ud.md5_expected:
+ msg = msg + "\nFile: '%s' has %s checksum %s when %s was expected" % (ud.localpath, 'md5', md5data, ud.md5_expected)
+ mismatch = True
+
+ if sha256mismatch and ud.sha256_expected:
+ msg = msg + "\nFile: '%s' has %s checksum %s when %s was expected" % (ud.localpath, 'sha256', sha256data, ud.sha256_expected)
+ mismatch = True
+
+ if mismatch:
+ msg = msg + '\nIf this change is expected (e.g. you have upgraded to a new version without updating the checksums) then you can use these lines within the recipe:\nSRC_URI[%s] = "%s"\nSRC_URI[%s] = "%s"\nOtherwise you should retry the download and/or check with upstream to determine if the file has become corrupted or otherwise unexpectedly modified.\n' % (ud.md5_name, md5data, ud.sha256_name, sha256data)
+
+ if len(msg):
+ raise ChecksumError('Checksum mismatch!%s' % msg, ud.url, md5data)
+
+ return {
+ _MD5_KEY: md5data,
+ _SHA256_KEY: sha256data
+ }
+
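+# The returned dictionary maps the checksum names used above, e.g.
+#   {'md5': '...', 'sha256': '...'}
+# and is what update_stamp()/verify_donestamp() below pickle into the .done
+# stamp file so that later calls can pass it back in as 'precomputed'.
+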
+
+def verify_donestamp(ud, d, origud=None):
+ """
+ Check whether the done stamp file has the right checksums (if the fetch
+ method supports them). If it doesn't, delete the done stamp and force
+ a re-download.
+
+ Returns True, if the donestamp exists and is valid, False otherwise. When
+ returning False, any existing done stamps are removed.
+ """
+ if not os.path.exists(ud.donestamp):
+ return False
+
+ if (not ud.method.supports_checksum(ud) or
+ (origud and not origud.method.supports_checksum(origud))):
+ # done stamp exists, checksums not supported; assume the local file is
+ # current
+ return True
+
+ if not os.path.exists(ud.localpath):
+ # done stamp exists, but the downloaded file does not; the done stamp
+ # must be incorrect, re-trigger the download
+ bb.utils.remove(ud.donestamp)
+ return False
+
+ precomputed_checksums = {}
+ # Only re-use the precomputed checksums if the donestamp is newer than the
+ # file. Do not rely on the mtime of directories, though. If ud.localpath is
+ # a directory, there will probably not be any checksums anyway.
+ if (os.path.isdir(ud.localpath) or
+ os.path.getmtime(ud.localpath) < os.path.getmtime(ud.donestamp)):
+ try:
+ with open(ud.donestamp, "rb") as cachefile:
+ pickled = pickle.Unpickler(cachefile)
+ precomputed_checksums.update(pickled.load())
+ except Exception as e:
+ # Avoid the warnings on the upgrade path from empty done stamp
+ # files to those containing the checksums.
+ if not isinstance(e, EOFError):
+ # Ignore errors, they aren't fatal
+ logger.warn("Couldn't load checksums from donestamp %s: %s "
+ "(msg: %s)" % (ud.donestamp, type(e).__name__,
+ str(e)))
+
+ try:
+ checksums = verify_checksum(ud, d, precomputed_checksums)
+ # If the cache file did not have the checksums, compute and store them
+ # as an upgrade path from the previous done stamp file format.
+ if checksums != precomputed_checksums:
+ with open(ud.donestamp, "wb") as cachefile:
+ p = pickle.Pickler(cachefile, pickle.HIGHEST_PROTOCOL)
+ p.dump(checksums)
+ return True
+ except ChecksumError as e:
+ # Checksums failed to verify, trigger re-download and remove the
+ # incorrect stamp file.
+ logger.warn("Checksum mismatch for local file %s\n"
+ "Cleaning and trying again." % ud.localpath)
+ rename_bad_checksum(ud, e.checksum)
+ bb.utils.remove(ud.donestamp)
+ return False
+
+
+def update_stamp(ud, d):
+ """
+ The donestamp is a stamp file indicating that the whole fetch is done;
+ this function updates the stamp after verifying the checksum
+ """
+ if os.path.exists(ud.donestamp):
+ # Touch the done stamp file to show active use of the download
+ try:
+ os.utime(ud.donestamp, None)
+ except:
+ # Errors aren't fatal here
+ pass
+ else:
+ checksums = verify_checksum(ud, d)
+ # Store the checksums for later re-verification against the recipe
+ with open(ud.donestamp, "wb") as cachefile:
+ p = pickle.Pickler(cachefile, pickle.HIGHEST_PROTOCOL)
+ p.dump(checksums)
+
+def subprocess_setup():
+ # Python installs a SIGPIPE handler by default. This is usually not what
+ # non-Python subprocesses expect.
+ # SIGPIPE errors are known issues with gzip/bash
+ signal.signal(signal.SIGPIPE, signal.SIG_DFL)
+
+def get_autorev(d):
+ # Don't cache the src rev in the autorev case unless the SRCREV cache policy allows it
+ if d.getVar('BB_SRCREV_POLICY', True) != "cache":
+ d.setVar('__BB_DONT_CACHE', '1')
+ return "AUTOINC"
+
+def get_srcrev(d, method_name='sortable_revision'):
+ """
+ Return the revision string, usually for use in the version string (PV) of the current package.
+ Most packages usually only have one SCM so we just pass on the call.
+ In the multi SCM case, we build a value based on SRCREV_FORMAT which must
+ have been set.
+
+ The idea here is that we put the string "AUTOINC+" into the return value if the revisions are not
+ incremental; other code is then responsible for turning that into an increasing value (if needed)
+
+ A method_name can be supplied to retrieve an alternatively formatted revision from a fetcher, if
+ that fetcher provides a method with the given name and the same signature as sortable_revision.
+ """
+
+ scms = []
+ fetcher = Fetch(d.getVar('SRC_URI', True).split(), d)
+ urldata = fetcher.ud
+ for u in urldata:
+ if urldata[u].method.supports_srcrev():
+ scms.append(u)
+
+ if len(scms) == 0:
+ raise FetchError("SRCREV was used yet no valid SCM was found in SRC_URI")
+
+ if len(scms) == 1 and len(urldata[scms[0]].names) == 1:
+ autoinc, rev = getattr(urldata[scms[0]].method, method_name)(urldata[scms[0]], d, urldata[scms[0]].names[0])
+ if len(rev) > 10:
+ rev = rev[:10]
+ if autoinc:
+ return "AUTOINC+" + rev
+ return rev
+
+ #
+ # Multiple SCMs are in SRC_URI so we resort to SRCREV_FORMAT
+ #
+ format = d.getVar('SRCREV_FORMAT', True)
+ if not format:
+ raise FetchError("The SRCREV_FORMAT variable must be set when multiple SCMs are used.")
+
+ seenautoinc = False
+ for scm in scms:
+ ud = urldata[scm]
+ for name in ud.names:
+ autoinc, rev = getattr(ud.method, method_name)(ud, d, name)
+ seenautoinc = seenautoinc or autoinc
+ if len(rev) > 10:
+ rev = rev[:10]
+ format = format.replace(name, rev)
+ if seenautoinc:
+ format = "AUTOINC+" + format
+
+ return format
+
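+# Illustrative multi-SCM case: with two SRC_URI entries named "machine" and
+# "meta" and SRCREV_FORMAT = "machine_meta", each name is replaced with its
+# (truncated) revision, giving something like "a1b2c3d4e5_f6e7d8c9b0", with an
+# "AUTOINC+" prefix if any of the revisions was AUTOINC.
+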
+def localpath(url, d):
+ fetcher = bb.fetch2.Fetch([url], d)
+ return fetcher.localpath(url)
+
+def runfetchcmd(cmd, d, quiet=False, cleanup=None):
+ """
+ Run cmd returning the command output
+ Raise an error if interrupted or cmd fails
+ Optionally echo command output to stdout
+ Optionally remove the files/directories listed in cleanup upon failure
+ """
+
+ # Need to export PATH as binary could be in metadata paths
+ # rather than host provided
+ # Also include some other variables.
+ # FIXME: Should really include all export variables?
+ exportvars = ['HOME', 'PATH',
+ 'HTTP_PROXY', 'http_proxy',
+ 'HTTPS_PROXY', 'https_proxy',
+ 'FTP_PROXY', 'ftp_proxy',
+ 'FTPS_PROXY', 'ftps_proxy',
+ 'NO_PROXY', 'no_proxy',
+ 'ALL_PROXY', 'all_proxy',
+ 'GIT_PROXY_COMMAND',
+ 'GIT_SSL_CAINFO',
+ 'GIT_SMART_HTTP',
+ 'SSH_AUTH_SOCK', 'SSH_AGENT_PID',
+ 'SOCKS5_USER', 'SOCKS5_PASSWD']
+
+ if not cleanup:
+ cleanup = []
+
+ for var in exportvars:
+ val = d.getVar(var, True)
+ if val:
+ cmd = 'export ' + var + '=\"%s\"; %s' % (val, cmd)
+
+ logger.debug(1, "Running %s", cmd)
+
+ success = False
+ error_message = ""
+
+ try:
+ (output, errors) = bb.process.run(cmd, shell=True, stderr=subprocess.PIPE)
+ success = True
+ except bb.process.NotFoundError as e:
+ error_message = "Fetch command %s" % (e.command)
+ except bb.process.ExecutionError as e:
+ if e.stdout:
+ output = "output:\n%s\n%s" % (e.stdout, e.stderr)
+ elif e.stderr:
+ output = "output:\n%s" % e.stderr
+ else:
+ output = "no output"
+ error_message = "Fetch command failed with exit code %s, %s" % (e.exitcode, output)
+ except bb.process.CmdError as e:
+ error_message = "Fetch command %s could not be run:\n%s" % (e.command, e.msg)
+ if not success:
+ for f in cleanup:
+ try:
+ bb.utils.remove(f, True)
+ except OSError:
+ pass
+
+ raise FetchError(error_message)
+
+ return output
+
+def check_network_access(d, info = "", url = None):
+ """
+ Log remote network access, and raise an error if BB_NO_NETWORK is set
+ """
+ if d.getVar("BB_NO_NETWORK", True) == "1":
+ raise NetworkAccess(url, info)
+ else:
+ logger.debug(1, "Fetcher accessed the network with the command %s" % info)
+
+def build_mirroruris(origud, mirrors, ld):
+ uris = []
+ uds = []
+
+ replacements = {}
+ replacements["TYPE"] = origud.type
+ replacements["HOST"] = origud.host
+ replacements["PATH"] = origud.path
+ replacements["BASENAME"] = origud.path.split("/")[-1]
+ replacements["MIRRORNAME"] = origud.host.replace(':','.') + origud.path.replace('/', '.').replace('*', '.')
+
+ def adduri(ud, uris, uds):
+ for line in mirrors:
+ try:
+ (find, replace) = line
+ except ValueError:
+ continue
+ newuri = uri_replace(ud, find, replace, replacements, ld)
+ if not newuri or newuri in uris or newuri == origud.url:
+ continue
+
+ if not trusted_network(ld, newuri):
+ logger.debug(1, "Mirror %s not in the list of trusted networks, skipping" % (newuri))
+ continue
+
+ try:
+ newud = FetchData(newuri, ld)
+ newud.setup_localpath(ld)
+ except bb.fetch2.BBFetchException as e:
+ logger.debug(1, "Mirror fetch failure for url %s (original url: %s)" % (newuri, origud.url))
+ logger.debug(1, str(e))
+ try:
+ # setup_localpath of file:// urls may fail, we should still see
+ # if mirrors of the url exist
+ adduri(newud, uris, uds)
+ except UnboundLocalError:
+ pass
+ continue
+ uris.append(newuri)
+ uds.append(newud)
+
+ adduri(newud, uris, uds)
+
+ adduri(origud, uris, uds)
+
+ return uris, uds
+
+def rename_bad_checksum(ud, suffix):
+ """
+ Renames files to have suffix from parameter
+ """
+
+ if ud.localpath is None:
+ return
+
+ new_localpath = "%s_bad-checksum_%s" % (ud.localpath, suffix)
+ bb.warn("Renaming %s to %s" % (ud.localpath, new_localpath))
+ bb.utils.movefile(ud.localpath, new_localpath)
+
+
+def try_mirror_url(fetch, origud, ud, ld, check = False):
+ # Return of None or a value means we're finished
+ # False means try another url
+ try:
+ if check:
+ found = ud.method.checkstatus(fetch, ud, ld)
+ if found:
+ return found
+ return False
+
+ os.chdir(ld.getVar("DL_DIR", True))
+
+ if not verify_donestamp(ud, ld, origud) or ud.method.need_update(ud, ld):
+ ud.method.download(ud, ld)
+ if hasattr(ud.method,"build_mirror_data"):
+ ud.method.build_mirror_data(ud, ld)
+
+ if not ud.localpath or not os.path.exists(ud.localpath):
+ return False
+
+ if ud.localpath == origud.localpath:
+ return ud.localpath
+
+ # We may be obtaining a mirror tarball which needs further processing by the real fetcher
+ # If that tarball is a local file:// we need to provide a symlink to it
+ dldir = ld.getVar("DL_DIR", True)
+ if origud.mirrortarball and os.path.basename(ud.localpath) == os.path.basename(origud.mirrortarball) \
+ and os.path.basename(ud.localpath) != os.path.basename(origud.localpath):
+ # Create donestamp in old format to avoid triggering a re-download
+ bb.utils.mkdirhier(os.path.dirname(ud.donestamp))
+ open(ud.donestamp, 'w').close()
+ dest = os.path.join(dldir, os.path.basename(ud.localpath))
+ if not os.path.exists(dest):
+ os.symlink(ud.localpath, dest)
+ if not verify_donestamp(origud, ld) or origud.method.need_update(origud, ld):
+ origud.method.download(origud, ld)
+ if hasattr(origud.method,"build_mirror_data"):
+ origud.method.build_mirror_data(origud, ld)
+ return ud.localpath
+ # Otherwise the result is a local file:// and we symlink to it
+ if not os.path.exists(origud.localpath):
+ if os.path.islink(origud.localpath):
+ # Broken symbolic link
+ os.unlink(origud.localpath)
+
+ os.symlink(ud.localpath, origud.localpath)
+ update_stamp(origud, ld)
+ return ud.localpath
+
+ except bb.fetch2.NetworkAccess:
+ raise
+
+ except bb.fetch2.BBFetchException as e:
+ if isinstance(e, ChecksumError):
+ logger.warn("Mirror checksum failure for url %s (original url: %s)\nCleaning and trying again." % (ud.url, origud.url))
+ logger.warn(str(e))
+ rename_bad_checksum(ud, e.checksum)
+ elif isinstance(e, NoChecksumError):
+ raise
+ else:
+ logger.debug(1, "Mirror fetch failure for url %s (original url: %s)" % (ud.url, origud.url))
+ logger.debug(1, str(e))
+ try:
+ ud.method.clean(ud, ld)
+ except UnboundLocalError:
+ pass
+ return False
+
+def try_mirrors(fetch, d, origud, mirrors, check = False):
+ """
+ Try to use a mirrored version of the sources.
+ This method will be automatically called before the fetchers go.
+
+ d is a bb.data instance
+ origud is the FetchData for the original uri we're trying to download
+ mirrors is the list of mirrors we're going to try
+ """
+ ld = d.createCopy()
+
+ uris, uds = build_mirroruris(origud, mirrors, ld)
+
+ for index, uri in enumerate(uris):
+ ret = try_mirror_url(fetch, origud, uds[index], ld, check)
+ if ret != False:
+ return ret
+ return None
+
+def trusted_network(d, url):
+ """
+ Return True if the url's host may be used for downloading, i.e. networking
+ is disabled, BB_ALLOWED_NETWORKS is unset, or the host matches an entry in
+ BB_ALLOWED_NETWORKS (set globally or for a specific recipe).
+ """
+ if d.getVar('BB_NO_NETWORK', True) == "1":
+ return True
+
+ pkgname = d.expand(d.getVar('PN', False))
+ trusted_hosts = d.getVarFlag('BB_ALLOWED_NETWORKS', pkgname)
+
+ if not trusted_hosts:
+ trusted_hosts = d.getVar('BB_ALLOWED_NETWORKS', True)
+
+ # Not enabled.
+ if not trusted_hosts:
+ return True
+
+ scheme, network, path, user, passwd, param = decodeurl(url)
+
+ if not network:
+ return True
+
+ network = network.lower()
+
+ for host in trusted_hosts.split(" "):
+ host = host.lower()
+ if host.startswith("*.") and ("." + network).endswith(host[1:]):
+ return True
+ if host == network:
+ return True
+
+ return False
+
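+# For example (made-up hosts): with BB_ALLOWED_NETWORKS = "*.example.com git.other.org",
+# urls on downloads.example.com or git.other.org are considered trusted, while
+# an upstream fetch from any unlisted host raises UntrustedUrl and mirror
+# candidates on unlisted hosts are simply skipped.
+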
+def srcrev_internal_helper(ud, d, name):
+ """
+ Return:
+ a) a source revision if specified
+ b) latest revision if SRCREV="AUTOINC"
+ c) None if not specified
+ """
+
+ srcrev = None
+ pn = d.getVar("PN", True)
+ attempts = []
+ if name != '' and pn:
+ attempts.append("SRCREV_%s_pn-%s" % (name, pn))
+ if name != '':
+ attempts.append("SRCREV_%s" % name)
+ if pn:
+ attempts.append("SRCREV_pn-%s" % pn)
+ attempts.append("SRCREV")
+
+ for a in attempts:
+ srcrev = d.getVar(a, True)
+ if srcrev and srcrev != "INVALID":
+ break
+
+ if 'rev' in ud.parm and 'tag' in ud.parm:
+ raise FetchError("Please specify a ;rev= parameter or a ;tag= parameter in the url %s but not both." % (ud.url))
+
+ if 'rev' in ud.parm or 'tag' in ud.parm:
+ if 'rev' in ud.parm:
+ parmrev = ud.parm['rev']
+ else:
+ parmrev = ud.parm['tag']
+ if srcrev == "INVALID" or not srcrev:
+ return parmrev
+ if srcrev != parmrev:
+ raise FetchError("Conflicting revisions (%s from SRCREV and %s from the url) found, please spcify one valid value" % (srcrev, parmrev))
+ return parmrev
+
+ if srcrev == "INVALID" or not srcrev:
+ raise FetchError("Please set a valid SRCREV for url %s (possible key names are %s, or use a ;rev=X URL parameter)" % (str(attempts), ud.url), ud.url)
+ if srcrev == "AUTOINC":
+ srcrev = ud.method.latest_revision(ud, d, name)
+
+ return srcrev
+
+def get_checksum_file_list(d):
+ """ Get a list of files checksum in SRC_URI
+
+ Returns the resolved local paths of all local file entries in
+ SRC_URI as a space-separated string
+ """
+ fetch = Fetch([], d, cache = False, localonly = True)
+
+ dl_dir = d.getVar('DL_DIR', True)
+ filelist = []
+ for u in fetch.urls:
+ ud = fetch.ud[u]
+
+ if ud and isinstance(ud.method, local.Local):
+ paths = ud.method.localpaths(ud, d)
+ for f in paths:
+ pth = ud.decodedurl
+ if '*' in pth:
+ f = os.path.join(os.path.abspath(f), pth)
+ if f.startswith(dl_dir):
+ # The local fetcher's behaviour is to return a path under DL_DIR if it couldn't find the file anywhere else
+ if os.path.exists(f):
+ bb.warn("Getting checksum for %s SRC_URI entry %s: file not found except in DL_DIR" % (d.getVar('PN', True), os.path.basename(f)))
+ else:
+ bb.warn("Unable to get checksum for %s SRC_URI entry %s: file could not be found" % (d.getVar('PN', True), os.path.basename(f)))
+ filelist.append(f + ":" + str(os.path.exists(f)))
+
+ return " ".join(filelist)
+
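+# The returned string encodes existence alongside each path, e.g. (illustrative
+# paths) "/work/recipe/files/defconfig:True /work/recipe/files/missing.patch:False",
+# which is the format get_file_checksums() below expects.
+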
+def get_file_checksums(filelist, pn):
+ """Get a list of the checksums for a list of local files
+
+ Returns the checksums for a list of local files, caching the results as
+ it proceeds
+
+ """
+
+ def checksum_file(f):
+ try:
+ checksum = _checksum_cache.get_checksum(f)
+ except OSError as e:
+ bb.warn("Unable to get checksum for %s SRC_URI entry %s: %s" % (pn, os.path.basename(f), e))
+ return None
+ return checksum
+
+ def checksum_dir(pth):
+ # Handle directories recursively
+ dirchecksums = []
+ for root, dirs, files in os.walk(pth):
+ for name in files:
+ fullpth = os.path.join(root, name)
+ checksum = checksum_file(fullpth)
+ if checksum:
+ dirchecksums.append((fullpth, checksum))
+ return dirchecksums
+
+ checksums = []
+ for pth in filelist.split():
+ exist = pth.split(":")[1]
+ if exist == "False":
+ continue
+ pth = pth.split(":")[0]
+ if '*' in pth:
+ # Handle globs
+ for f in glob.glob(pth):
+ if os.path.isdir(f):
+ checksums.extend(checksum_dir(f))
+ else:
+ checksum = checksum_file(f)
+ checksums.append((f, checksum))
+ elif os.path.isdir(pth):
+ checksums.extend(checksum_dir(pth))
+ else:
+ checksum = checksum_file(pth)
+ checksums.append((pth, checksum))
+
+ checksums.sort(key=operator.itemgetter(1))
+ return checksums
+
+
+class FetchData(object):
+ """
+ A class which represents the fetcher state for a given URI.
+ """
+ def __init__(self, url, d, localonly = False):
+ # localpath is the location of a downloaded result. If not set, the file is local.
+ self.donestamp = None
+ self.localfile = ""
+ self.localpath = None
+ self.lockfile = None
+ self.mirrortarball = None
+ self.basename = None
+ self.basepath = None
+ (self.type, self.host, self.path, self.user, self.pswd, self.parm) = decodeurl(data.expand(url, d))
+ self.date = self.getSRCDate(d)
+ self.url = url
+ if not self.user and "user" in self.parm:
+ self.user = self.parm["user"]
+ if not self.pswd and "pswd" in self.parm:
+ self.pswd = self.parm["pswd"]
+ self.setup = False
+
+ if "name" in self.parm:
+ self.md5_name = "%s.md5sum" % self.parm["name"]
+ self.sha256_name = "%s.sha256sum" % self.parm["name"]
+ else:
+ self.md5_name = "md5sum"
+ self.sha256_name = "sha256sum"
+ if self.md5_name in self.parm:
+ self.md5_expected = self.parm[self.md5_name]
+ elif self.type not in ["http", "https", "ftp", "ftps", "sftp"]:
+ self.md5_expected = None
+ else:
+ self.md5_expected = d.getVarFlag("SRC_URI", self.md5_name)
+ if self.sha256_name in self.parm:
+ self.sha256_expected = self.parm[self.sha256_name]
+ elif self.type not in ["http", "https", "ftp", "ftps", "sftp"]:
+ self.sha256_expected = None
+ else:
+ self.sha256_expected = d.getVarFlag("SRC_URI", self.sha256_name)
+ self.ignore_checksums = False
+
+ self.names = self.parm.get("name",'default').split(',')
+
+ self.method = None
+ for m in methods:
+ if m.supports(self, d):
+ self.method = m
+ break
+
+ if not self.method:
+ raise NoMethodError(url)
+
+ if localonly and not isinstance(self.method, local.Local):
+ raise NonLocalMethod()
+
+ if self.parm.get("proto", None) and "protocol" not in self.parm:
+ logger.warn('Consider updating %s recipe to use "protocol" not "proto" in SRC_URI.', d.getVar('PN', True))
+ self.parm["protocol"] = self.parm.get("proto", None)
+
+ if hasattr(self.method, "urldata_init"):
+ self.method.urldata_init(self, d)
+
+ if "localpath" in self.parm:
+ # if user sets localpath for file, use it instead.
+ self.localpath = self.parm["localpath"]
+ self.basename = os.path.basename(self.localpath)
+ elif self.localfile:
+ self.localpath = self.method.localpath(self, d)
+
+ dldir = d.getVar("DL_DIR", True)
+ # Note: .done and .lock files should always be in DL_DIR whereas localpath may not be.
+ if self.localpath and self.localpath.startswith(dldir):
+ basepath = self.localpath
+ elif self.localpath:
+ basepath = dldir + os.sep + os.path.basename(self.localpath)
+ else:
+ basepath = dldir + os.sep + (self.basepath or self.basename)
+ self.donestamp = basepath + '.done'
+ self.lockfile = basepath + '.lock'
+
+ def setup_revisons(self, d):
+ self.revisions = {}
+ for name in self.names:
+ self.revisions[name] = srcrev_internal_helper(self, d, name)
+
+ # Add compatibility code for the case where no name is specified
+ if len(self.names) == 1:
+ self.revision = self.revisions[self.names[0]]
+
+ def setup_localpath(self, d):
+ if not self.localpath:
+ self.localpath = self.method.localpath(self, d)
+
+ def getSRCDate(self, d):
+ """
+ Return the SRC Date for the component
+
+ d the bb.data instance
+ """
+ if "srcdate" in self.parm:
+ return self.parm['srcdate']
+
+ pn = d.getVar("PN", True)
+
+ if pn:
+ return d.getVar("SRCDATE_%s" % pn, True) or d.getVar("SRCDATE", True) or d.getVar("DATE", True)
+
+ return d.getVar("SRCDATE", True) or d.getVar("DATE", True)
+
+class FetchMethod(object):
+ """Base class for 'fetch'ing data"""
+
+ def __init__(self, urls=None):
+ self.urls = []
+
+ def supports(self, urldata, d):
+ """
+ Check to see if this fetch class supports a given url.
+ """
+ return 0
+
+ def localpath(self, urldata, d):
+ """
+ Return the local filename of a given url assuming a successful fetch.
+ Can also set up variables in urldata for use in download() (saving code
+ duplication and duplicate code execution)
+ """
+ return os.path.join(data.getVar("DL_DIR", d, True), urldata.localfile)
+
+ def supports_checksum(self, urldata):
+ """
+ Is localpath something that can be represented by a checksum?
+ """
+
+ # We cannot compute checksums for directories
+ if os.path.isdir(urldata.localpath) == True:
+ return False
+ if urldata.localpath.find("*") != -1:
+ return False
+
+ return True
+
+ def recommends_checksum(self, urldata):
+ """
+ Is checksumming recommended for this backend (i.e. should warnings
+ be displayed if there is no checksum)?
+ """
+ return False
+
+ def _strip_leading_slashes(self, relpath):
+ """
+ Remove leading slash as os.path.join can't cope
+ """
+ while os.path.isabs(relpath):
+ relpath = relpath[1:]
+ return relpath
+
+ def setUrls(self, urls):
+ self.__urls = urls
+
+ def getUrls(self):
+ return self.__urls
+
+ urls = property(getUrls, setUrls, None, "Urls property")
+
+ def need_update(self, ud, d):
+ """
+ Force a fetch, even if localpath exists?
+ """
+ if os.path.exists(ud.localpath):
+ return False
+ return True
+
+ def supports_srcrev(self):
+ """
+ The fetcher supports auto source revisions (SRCREV)
+ """
+ return False
+
+ def download(self, urldata, d):
+ """
+ Fetch urls
+ Assumes localpath was called first
+ """
+ raise NoMethodError(urldata.url)
+
+ def unpack(self, urldata, rootdir, data):
+ iterate = False
+ file = urldata.localpath
+
+ try:
+ unpack = bb.utils.to_boolean(urldata.parm.get('unpack'), True)
+ except ValueError as exc:
+ bb.fatal("Invalid value for 'unpack' parameter for %s: %s" %
+ (file, urldata.parm.get('unpack')))
+
+ base, ext = os.path.splitext(file)
+ if ext in ['.gz', '.bz2', '.Z', '.xz', '.lz']:
+ efile = os.path.join(rootdir, os.path.basename(base))
+ else:
+ efile = file
+ cmd = None
+
+ if unpack:
+ if file.endswith('.tar'):
+ cmd = 'tar x --no-same-owner -f %s' % file
+ elif file.endswith('.tgz') or file.endswith('.tar.gz') or file.endswith('.tar.Z'):
+ cmd = 'tar xz --no-same-owner -f %s' % file
+ elif file.endswith('.tbz') or file.endswith('.tbz2') or file.endswith('.tar.bz2'):
+ cmd = 'bzip2 -dc %s | tar x --no-same-owner -f -' % file
+ elif file.endswith('.gz') or file.endswith('.Z') or file.endswith('.z'):
+ cmd = 'gzip -dc %s > %s' % (file, efile)
+ elif file.endswith('.bz2'):
+ cmd = 'bzip2 -dc %s > %s' % (file, efile)
+ elif file.endswith('.tar.xz'):
+ cmd = 'xz -dc %s | tar x --no-same-owner -f -' % file
+ elif file.endswith('.xz'):
+ cmd = 'xz -dc %s > %s' % (file, efile)
+ elif file.endswith('.tar.lz'):
+ cmd = 'lzip -dc %s | tar x --no-same-owner -f -' % file
+ elif file.endswith('.lz'):
+ cmd = 'lzip -dc %s > %s' % (file, efile)
+ elif file.endswith('.zip') or file.endswith('.jar'):
+ try:
+ dos = bb.utils.to_boolean(urldata.parm.get('dos'), False)
+ except ValueError as exc:
+ bb.fatal("Invalid value for 'dos' parameter for %s: %s" %
+ (file, urldata.parm.get('dos')))
+ cmd = 'unzip -q -o'
+ if dos:
+ cmd = '%s -a' % cmd
+ cmd = "%s '%s'" % (cmd, file)
+ elif file.endswith('.rpm') or file.endswith('.srpm'):
+ if 'extract' in urldata.parm:
+ unpack_file = urldata.parm.get('extract')
+ cmd = 'rpm2cpio.sh %s | cpio -id %s' % (file, unpack_file)
+ iterate = True
+ iterate_file = unpack_file
+ else:
+ cmd = 'rpm2cpio.sh %s | cpio -id' % (file)
+ elif file.endswith('.deb') or file.endswith('.ipk'):
+ cmd = 'ar -p %s data.tar.gz | zcat | tar --no-same-owner -xpf -' % file
+
+ if not unpack or not cmd:
+ # If file == dest, then avoid any copies, as we already put the file into dest!
+ dest = os.path.join(rootdir, os.path.basename(file))
+ if (file != dest) and not (os.path.exists(dest) and os.path.samefile(file, dest)):
+ if os.path.isdir(file):
+ # If for example we're asked to copy file://foo/bar, we need to unpack the result into foo/bar
+ basepath = getattr(urldata, "basepath", None)
+ destdir = "."
+ if basepath and basepath.endswith("/"):
+ basepath = basepath.rstrip("/")
+ elif basepath:
+ basepath = os.path.dirname(basepath)
+ if basepath and basepath.find("/") != -1:
+ destdir = basepath[:basepath.rfind('/')]
+ destdir = destdir.strip('/')
+ if destdir != "." and not os.access("%s/%s" % (rootdir, destdir), os.F_OK):
+ os.makedirs("%s/%s" % (rootdir, destdir))
+ cmd = 'cp -fpPR %s %s/%s/' % (file, rootdir, destdir)
+ #cmd = 'tar -cf - -C "%d" -ps . | tar -xf - -C "%s/%s/"' % (file, rootdir, destdir)
+ else:
+ # The "destdir" handling was specifically done for FILESPATH
+ # items. So, only do so for file:// entries.
+ if urldata.type == "file" and urldata.path.find("/") != -1:
+ destdir = urldata.path.rsplit("/", 1)[0]
+ if urldata.parm.get('subdir') != None:
+ destdir = urldata.parm.get('subdir') + "/" + destdir
+ else:
+ if urldata.parm.get('subdir') != None:
+ destdir = urldata.parm.get('subdir')
+ else:
+ destdir = "."
+ bb.utils.mkdirhier("%s/%s" % (rootdir, destdir))
+ cmd = 'cp -f %s %s/%s/' % (file, rootdir, destdir)
+
+ if not cmd:
+ return
+
+ # Change to subdir before executing command
+ save_cwd = os.getcwd()
+ os.chdir(rootdir)
+ if 'subdir' in urldata.parm:
+ newdir = ("%s/%s" % (rootdir, urldata.parm.get('subdir')))
+ bb.utils.mkdirhier(newdir)
+ os.chdir(newdir)
+
+ path = data.getVar('PATH', True)
+ if path:
+ cmd = "PATH=\"%s\" %s" % (path, cmd)
+ bb.note("Unpacking %s to %s/" % (file, os.getcwd()))
+ ret = subprocess.call(cmd, preexec_fn=subprocess_setup, shell=True)
+
+ os.chdir(save_cwd)
+
+ if ret != 0:
+ raise UnpackError("Unpack command %s failed with return value %s" % (cmd, ret), urldata.url)
+
+ if iterate is True:
+ iterate_urldata = urldata
+ iterate_urldata.localpath = "%s/%s" % (rootdir, iterate_file)
+ self.unpack(urldata, rootdir, data)
+
+ return
+
+ def clean(self, urldata, d):
+ """
+ Clean any existing full or partial download
+ """
+ bb.utils.remove(urldata.localpath)
+
+ def try_premirror(self, urldata, d):
+ """
+ Should premirrors be used?
+ """
+ return True
+
+ def checkstatus(self, fetch, urldata, d):
+ """
+ Check the status of a URL
+ Assumes localpath was called first
+ """
+ logger.info("URL %s could not be checked for status since no method exists.", url)
+ return True
+
+ def latest_revision(self, ud, d, name):
+ """
+ Look in the cache for the latest revision, if not present ask the SCM.
+ """
+ if not hasattr(self, "_latest_revision"):
+ raise ParameterError("The fetcher for this URL does not support _latest_revision", url)
+
+ revs = bb.persist_data.persist('BB_URI_HEADREVS', d)
+ key = self.generate_revision_key(ud, d, name)
+ try:
+ return revs[key]
+ except KeyError:
+ revs[key] = rev = self._latest_revision(ud, d, name)
+ return rev
+
+ def sortable_revision(self, ud, d, name):
+ latest_rev = self._build_revision(ud, d, name)
+ return True, str(latest_rev)
+
+ def generate_revision_key(self, ud, d, name):
+ key = self._revision_key(ud, d, name)
+ return "%s-%s" % (key, d.getVar("PN", True) or "")
+
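+# A minimal sketch of how the Fetch class below is typically driven (as done by
+# callers such as the OE fetch and unpack tasks; 'd' is the datastore and
+# 'workdir' an assumed destination directory):
+#
+#   fetcher = bb.fetch2.Fetch(d.getVar('SRC_URI', True).split(), d)
+#   fetcher.download()
+#   fetcher.unpack(workdir)
+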
+class Fetch(object):
+ def __init__(self, urls, d, cache = True, localonly = False, connection_cache = None):
+ if localonly and cache:
+ raise Exception("bb.fetch2.Fetch.__init__: cannot set cache and localonly at same time")
+
+ if len(urls) == 0:
+ urls = d.getVar("SRC_URI", True).split()
+ self.urls = urls
+ self.d = d
+ self.ud = {}
+ self.connection_cache = connection_cache
+
+ fn = d.getVar('FILE', True)
+ if cache and fn and fn in urldata_cache:
+ self.ud = urldata_cache[fn]
+
+ for url in urls:
+ if url not in self.ud:
+ try:
+ self.ud[url] = FetchData(url, d, localonly)
+ except NonLocalMethod:
+ if localonly:
+ self.ud[url] = None
+ pass
+
+ if fn and cache:
+ urldata_cache[fn] = self.ud
+
+ def localpath(self, url):
+ if url not in self.urls:
+ self.ud[url] = FetchData(url, self.d)
+
+ self.ud[url].setup_localpath(self.d)
+ return self.d.expand(self.ud[url].localpath)
+
+ def localpaths(self):
+ """
+ Return a list of the local filenames, assuming successful fetch
+ """
+ local = []
+
+ for u in self.urls:
+ ud = self.ud[u]
+ ud.setup_localpath(self.d)
+ local.append(ud.localpath)
+
+ return local
+
+ def download(self, urls=None):
+ """
+ Fetch all urls
+ """
+ if not urls:
+ urls = self.urls
+
+ network = self.d.getVar("BB_NO_NETWORK", True)
+ premirroronly = (self.d.getVar("BB_FETCH_PREMIRRORONLY", True) == "1")
+
+ for u in urls:
+ ud = self.ud[u]
+ ud.setup_localpath(self.d)
+ m = ud.method
+ localpath = ""
+
+ lf = bb.utils.lockfile(ud.lockfile)
+
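+ # Overall order: use the existing local file if the donestamp verifies,
+ # otherwise try PREMIRRORS, then the upstream URL, then MIRRORS; if
+ # nothing produces a usable file, log the first upstream error and fail.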
+ try:
+ self.d.setVar("BB_NO_NETWORK", network)
+
+ if verify_donestamp(ud, self.d) and not m.need_update(ud, self.d):
+ localpath = ud.localpath
+ elif m.try_premirror(ud, self.d):
+ logger.debug(1, "Trying PREMIRRORS")
+ mirrors = mirror_from_string(self.d.getVar('PREMIRRORS', True))
+ localpath = try_mirrors(self, self.d, ud, mirrors, False)
+
+ if premirroronly:
+ self.d.setVar("BB_NO_NETWORK", "1")
+
+ os.chdir(self.d.getVar("DL_DIR", True))
+
+ firsterr = None
+ verified_stamp = verify_donestamp(ud, self.d)
+ if not localpath and (not verified_stamp or m.need_update(ud, self.d)):
+ try:
+ if not trusted_network(self.d, ud.url):
+ raise UntrustedUrl(ud.url)
+ logger.debug(1, "Trying Upstream")
+ m.download(ud, self.d)
+ if hasattr(m, "build_mirror_data"):
+ m.build_mirror_data(ud, self.d)
+ localpath = ud.localpath
+ # Verify the checksum early so that, if it mismatches, the
+ # fetcher still has a chance to fetch from a mirror
+ update_stamp(ud, self.d)
+
+ except bb.fetch2.NetworkAccess:
+ raise
+
+ except BBFetchException as e:
+ if isinstance(e, ChecksumError):
+ logger.warn("Checksum failure encountered with download of %s - will attempt other sources if available" % u)
+ logger.debug(1, str(e))
+ rename_bad_checksum(ud, e.checksum)
+ elif isinstance(e, NoChecksumError):
+ raise
+ else:
+ logger.warn('Failed to fetch URL %s, attempting MIRRORS if available' % u)
+ logger.debug(1, str(e))
+ firsterr = e
+ # Remove any incomplete fetch
+ if not verified_stamp:
+ m.clean(ud, self.d)
+ logger.debug(1, "Trying MIRRORS")
+ mirrors = mirror_from_string(self.d.getVar('MIRRORS', True))
+ localpath = try_mirrors(self, self.d, ud, mirrors)
+
+ if not localpath or ((not os.path.exists(localpath)) and localpath.find("*") == -1):
+ if firsterr:
+ logger.error(str(firsterr))
+ raise FetchError("Unable to fetch URL from any source.", u)
+
+ update_stamp(ud, self.d)
+
+ except BBFetchException as e:
+ if isinstance(e, ChecksumError):
+ logger.error("Checksum failure fetching %s" % u)
+ raise
+
+ finally:
+ bb.utils.unlockfile(lf)
+
+ def checkstatus(self, urls=None):
+ """
+ Check all urls exist upstream
+ """
+
+ if not urls:
+ urls = self.urls
+
+ for u in urls:
+ ud = self.ud[u]
+ ud.setup_localpath(self.d)
+ m = ud.method
+ logger.debug(1, "Testing URL %s", u)
+ # First try checking uri, u, from PREMIRRORS
+ mirrors = mirror_from_string(self.d.getVar('PREMIRRORS', True))
+ ret = try_mirrors(self, self.d, ud, mirrors, True)
+ if not ret:
+ # Next try checking from the original uri, u
+ try:
+ ret = m.checkstatus(self, ud, self.d)
+ except:
+ # Finally, try checking uri, u, from MIRRORS
+ mirrors = mirror_from_string(self.d.getVar('MIRRORS', True))
+ ret = try_mirrors(self, self.d, ud, mirrors, True)
+
+ if not ret:
+ raise FetchError("URL %s doesn't work" % u, u)
+
+ def unpack(self, root, urls=None):
+ """
+ Unpack all urls into the given root directory
+ """
+
+ if not urls:
+ urls = self.urls
+
+ for u in urls:
+ ud = self.ud[u]
+ ud.setup_localpath(self.d)
+
+ if self.d.expand(ud.localpath) is None:
+ continue
+
+ if ud.lockfile:
+ lf = bb.utils.lockfile(ud.lockfile)
+
+ ud.method.unpack(ud, root, self.d)
+
+ if ud.lockfile:
+ bb.utils.unlockfile(lf)
+
+ def clean(self, urls=None):
+ """
+ Clean files that the fetcher gets or places
+ """
+
+ if not urls:
+ urls = self.urls
+
+ for url in urls:
+ if url not in self.ud:
+ self.ud[url] = FetchData(url, self.d)
+ ud = self.ud[url]
+ ud.setup_localpath(self.d)
+
+ if not ud.localfile and ud.localpath is None:
+ continue
+
+ if ud.lockfile:
+ lf = bb.utils.lockfile(ud.lockfile)
+
+ ud.method.clean(ud, self.d)
+ if ud.donestamp:
+ bb.utils.remove(ud.donestamp)
+
+ if ud.lockfile:
+ bb.utils.unlockfile(lf)
+
+class FetchConnectionCache(object):
+ """
+ A class which represents a container for socket connections.
+ """
+ def __init__(self):
+ self.cache = {}
+
+ def get_connection_name(self, host, port):
+ return host + ':' + str(port)
+
+ def add_connection(self, host, port, connection):
+ cn = self.get_connection_name(host, port)
+
+ if cn not in self.cache:
+ self.cache[cn] = connection
+
+ def get_connection(self, host, port):
+ connection = None
+
+ cn = self.get_connection_name(host, port)
+ if cn in self.cache:
+ connection = self.cache[cn]
+
+ return connection
+
+ def remove_connection(self, host, port):
+ cn = self.get_connection_name(host, port)
+ if cn in self.cache:
+ self.cache[cn].close()
+ del self.cache[cn]
+
+ def close_connections(self):
+ for cn in self.cache.keys():
+ self.cache[cn].close()
+ del self.cache[cn]
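+
+# Illustrative (non-normative) use of FetchConnectionCache together with the
+# Fetch class above, assuming a populated datastore 'd'; the URL is a placeholder:
+#
+#   cache = FetchConnectionCache()
+#   fetcher = Fetch(["http://example.org/src.tar.gz"], d, connection_cache=cache)
+#   fetcher.download()
+#   print(fetcher.localpath("http://example.org/src.tar.gz"))
+#   cache.close_connections()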
+
+from . import cvs
+from . import git
+from . import gitsm
+from . import gitannex
+from . import local
+from . import svn
+from . import wget
+from . import ssh
+from . import sftp
+from . import perforce
+from . import bzr
+from . import hg
+from . import osc
+from . import repo
+from . import clearcase
+
+methods.append(local.Local())
+methods.append(wget.Wget())
+methods.append(svn.Svn())
+methods.append(git.Git())
+methods.append(gitsm.GitSM())
+methods.append(gitannex.GitANNEX())
+methods.append(cvs.Cvs())
+methods.append(ssh.SSH())
+methods.append(sftp.SFTP())
+methods.append(perforce.Perforce())
+methods.append(bzr.Bzr())
+methods.append(hg.Hg())
+methods.append(osc.Osc())
+methods.append(repo.Repo())
+methods.append(clearcase.ClearCase())
diff --git a/bitbake/lib/bb/fetch2/bzr.py b/bitbake/lib/bb/fetch2/bzr.py
new file mode 100644
index 0000000..03e9ac4
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/bzr.py
@@ -0,0 +1,143 @@
+"""
+BitBake 'Fetch' implementation for bzr.
+
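+An illustrative SRC_URI (the host and path are placeholders; the protocol
+parameter defaults to "http", and the revision is taken from SRCREV):
+
+SRC_URI = "bzr://bzr.example.org/trunk/project;protocol=http"
+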
+"""
+
+# Copyright (C) 2007 Ross Burton
+# Copyright (C) 2007 Richard Purdie
+#
+# Classes for obtaining upstream sources for the
+# BitBake build tools.
+# Copyright (C) 2003, 2004 Chris Larson
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os
+import sys
+import logging
+import bb
+from bb import data
+from bb.fetch2 import FetchMethod
+from bb.fetch2 import FetchError
+from bb.fetch2 import runfetchcmd
+from bb.fetch2 import logger
+
+class Bzr(FetchMethod):
+ def supports(self, ud, d):
+ return ud.type in ['bzr']
+
+ def urldata_init(self, ud, d):
+ """
+ init bzr specific variable within url data
+ """
+ # Create paths to bzr checkouts
+ relpath = self._strip_leading_slashes(ud.path)
+ ud.pkgdir = os.path.join(data.expand('${BZRDIR}', d), ud.host, relpath)
+
+ ud.setup_revisons(d)
+
+ if not ud.revision:
+ ud.revision = self.latest_revision(ud, d)
+
+ ud.localfile = data.expand('bzr_%s_%s_%s.tar.gz' % (ud.host, ud.path.replace('/', '.'), ud.revision), d)
+
+ def _buildbzrcommand(self, ud, d, command):
+ """
+ Build up a bzr command line based on ud
+ command is "fetch", "update", "revno"
+ """
+
+ basecmd = data.expand('${FETCHCMD_bzr}', d)
+
+ proto = ud.parm.get('protocol', 'http')
+
+ bzrroot = ud.host + ud.path
+
+ options = []
+
+ if command == "revno":
+ bzrcmd = "%s revno %s %s://%s" % (basecmd, " ".join(options), proto, bzrroot)
+ else:
+ if ud.revision:
+ options.append("-r %s" % ud.revision)
+
+ if command == "fetch":
+ bzrcmd = "%s branch %s %s://%s" % (basecmd, " ".join(options), proto, bzrroot)
+ elif command == "update":
+ bzrcmd = "%s pull %s --overwrite" % (basecmd, " ".join(options))
+ else:
+ raise FetchError("Invalid bzr command %s" % command, ud.url)
+
+ return bzrcmd
+
+ def download(self, ud, d):
+ """Fetch url"""
+
+ if os.access(os.path.join(ud.pkgdir, os.path.basename(ud.pkgdir), '.bzr'), os.R_OK):
+ bzrcmd = self._buildbzrcommand(ud, d, "update")
+ logger.debug(1, "BZR Update %s", ud.url)
+ bb.fetch2.check_network_access(d, bzrcmd, ud.url)
+ os.chdir(os.path.join (ud.pkgdir, os.path.basename(ud.path)))
+ runfetchcmd(bzrcmd, d)
+ else:
+ bb.utils.remove(os.path.join(ud.pkgdir, os.path.basename(ud.pkgdir)), True)
+ bzrcmd = self._buildbzrcommand(ud, d, "fetch")
+ bb.fetch2.check_network_access(d, bzrcmd, ud.url)
+ logger.debug(1, "BZR Checkout %s", ud.url)
+ bb.utils.mkdirhier(ud.pkgdir)
+ os.chdir(ud.pkgdir)
+ logger.debug(1, "Running %s", bzrcmd)
+ runfetchcmd(bzrcmd, d)
+
+ os.chdir(ud.pkgdir)
+
+ scmdata = ud.parm.get("scmdata", "")
+ if scmdata == "keep":
+ tar_flags = ""
+ else:
+ tar_flags = "--exclude '.bzr' --exclude '.bzrtags'"
+
+ # tar them up to a defined filename
+ runfetchcmd("tar %s -czf %s %s" % (tar_flags, ud.localpath, os.path.basename(ud.pkgdir)), d, cleanup = [ud.localpath])
+
+ def supports_srcrev(self):
+ return True
+
+ def _revision_key(self, ud, d, name):
+ """
+ Return a unique key for the url
+ """
+ return "bzr:" + ud.pkgdir
+
+ def _latest_revision(self, ud, d, name):
+ """
+ Return the latest upstream revision number
+ """
+ logger.debug(2, "BZR fetcher hitting network for %s", ud.url)
+
+ bb.fetch2.check_network_access(d, self._buildbzrcommand(ud, d, "revno"), ud.url)
+
+ output = runfetchcmd(self._buildbzrcommand(ud, d, "revno"), d, True)
+
+ return output.strip()
+
+ def sortable_revision(self, ud, d, name):
+ """
+ Return a sortable revision number, which for bzr is simply the revision number (revno)
+ """
+
+ return False, self._build_revision(ud, d)
+
+ def _build_revision(self, ud, d):
+ return ud.revision
diff --git a/bitbake/lib/bb/fetch2/clearcase.py b/bitbake/lib/bb/fetch2/clearcase.py
new file mode 100644
index 0000000..ba83e7c
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/clearcase.py
@@ -0,0 +1,263 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'Fetch' clearcase implementation
+
+The clearcase fetcher is used to retrieve files from a ClearCase repository.
+
+Usage in the recipe:
+
+ SRC_URI = "ccrc://cc.example.org/ccrc;vob=/example_vob;module=/example_module"
+ SRCREV = "EXAMPLE_CLEARCASE_TAG"
+ PV = "${@d.getVar("SRCREV", False).replace("/", "+")}"
+
+The fetcher uses the rcleartool or cleartool remote client, depending on which one is available.
+
+Supported SRC_URI options are:
+
+- vob
+ (required) The name of the clearcase VOB (with a leading "/")
+
+- module
+ The module in the selected VOB (with a leading "/")
+
+ The module and vob parameters are combined to create
+ the following load rule in the view config spec:
+ load <vob><module>
+
+- proto
+ http or https
+
+Related variables:
+
+ CCASE_CUSTOM_CONFIG_SPEC
+ Write a config spec to this variable in your recipe to use it instead
+ of the default config spec generated by this fetcher.
+ Please note that the SRCREV loses its functionality if you specify
+ this variable. SRCREV is still used to label the archive after a fetch,
+ but it doesn't define what's fetched.
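+
+ A minimal illustrative example in a recipe (the view rules and load path
+ below are placeholders):
+
+ CCASE_CUSTOM_CONFIG_SPEC = "element * CHECKEDOUT\nelement * /main/LATEST\nload /example_vob/example_module"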
+
+User credentials:
+ cleartool:
+ The login of cleartool is handled by the system. No special steps needed.
+
+ rcleartool:
+ In order to use rcleartool with authenticated users an `rcleartool login` is
+ necessary before using the fetcher.
+"""
+# Copyright (C) 2014 Siemens AG
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+
+import os
+import sys
+import shutil
+import bb
+from bb import data
+from bb.fetch2 import FetchMethod
+from bb.fetch2 import FetchError, MissingParameterError, ParameterError
+from bb.fetch2 import runfetchcmd
+from bb.fetch2 import logger
+from distutils import spawn
+
+class ClearCase(FetchMethod):
+ """Class to fetch urls via 'clearcase'"""
+ def init(self, d):
+ pass
+
+ def supports(self, ud, d):
+ """
+ Check to see if a given url can be fetched with Clearcase.
+ """
+ return ud.type in ['ccrc']
+
+ def debug(self, msg):
+ logger.debug(1, "ClearCase: %s", msg)
+
+ def urldata_init(self, ud, d):
+ """
+ init ClearCase specific variable within url data
+ """
+ ud.proto = "https"
+ if 'protocol' in ud.parm:
+ ud.proto = ud.parm['protocol']
+ if ud.proto not in ('http', 'https'):
+ raise ParameterError("Invalid protocol type", ud.url)
+
+ ud.vob = ''
+ if 'vob' in ud.parm:
+ ud.vob = ud.parm['vob']
+ else:
+ msg = ud.url+": vob must be defined so the fetcher knows what to get."
+ raise MissingParameterError('vob', msg)
+
+ if 'module' in ud.parm:
+ ud.module = ud.parm['module']
+ else:
+ ud.module = ""
+
+ ud.basecmd = d.getVar("FETCHCMD_ccrc", True) or spawn.find_executable("cleartool") or spawn.find_executable("rcleartool")
+
+ if data.getVar("SRCREV", d, True) == "INVALID":
+ raise FetchError("Set a valid SRCREV for the clearcase fetcher in your recipe, e.g. SRCREV = \"/main/LATEST\" or any other label of your choice.")
+
+ ud.label = d.getVar("SRCREV", False)
+ ud.customspec = d.getVar("CCASE_CUSTOM_CONFIG_SPEC", True)
+
+ ud.server = "%s://%s%s" % (ud.proto, ud.host, ud.path)
+
+ ud.identifier = "clearcase-%s%s-%s" % ( ud.vob.replace("/", ""),
+ ud.module.replace("/", "."),
+ ud.label.replace("/", "."))
+
+ ud.viewname = "%s-view%s" % (ud.identifier, d.getVar("DATETIME", d, True))
+ ud.csname = "%s-config-spec" % (ud.identifier)
+ ud.ccasedir = os.path.join(data.getVar("DL_DIR", d, True), ud.type)
+ ud.viewdir = os.path.join(ud.ccasedir, ud.viewname)
+ ud.configspecfile = os.path.join(ud.ccasedir, ud.csname)
+ ud.localfile = "%s.tar.gz" % (ud.identifier)
+
+ self.debug("host = %s" % ud.host)
+ self.debug("path = %s" % ud.path)
+ self.debug("server = %s" % ud.server)
+ self.debug("proto = %s" % ud.proto)
+ self.debug("type = %s" % ud.type)
+ self.debug("vob = %s" % ud.vob)
+ self.debug("module = %s" % ud.module)
+ self.debug("basecmd = %s" % ud.basecmd)
+ self.debug("label = %s" % ud.label)
+ self.debug("ccasedir = %s" % ud.ccasedir)
+ self.debug("viewdir = %s" % ud.viewdir)
+ self.debug("viewname = %s" % ud.viewname)
+ self.debug("configspecfile = %s" % ud.configspecfile)
+ self.debug("localfile = %s" % ud.localfile)
+
+ ud.localfile = os.path.join(data.getVar("DL_DIR", d, True), ud.localfile)
+
+ def _build_ccase_command(self, ud, command):
+ """
+ Build up a commandline based on ud
+ command is: mkview, setcs, rmview
+ """
+ options = []
+
+ if "rcleartool" in ud.basecmd:
+ options.append("-server %s" % ud.server)
+
+ basecmd = "%s %s" % (ud.basecmd, command)
+
+ if command == 'mkview':
+ if not "rcleartool" in ud.basecmd:
+ # Cleartool needs a -snapshot view
+ options.append("-snapshot")
+ options.append("-tag %s" % ud.viewname)
+ options.append(ud.viewdir)
+
+ elif command == 'rmview':
+ options.append("-force")
+ options.append("%s" % ud.viewdir)
+
+ elif command == 'setcs':
+ options.append("-overwrite")
+ options.append(ud.configspecfile)
+
+ else:
+ raise FetchError("Invalid ccase command %s" % command)
+
+ ccasecmd = "%s %s" % (basecmd, " ".join(options))
+ self.debug("ccasecmd = %s" % ccasecmd)
+ return ccasecmd
+
+ def _write_configspec(self, ud, d):
+ """
+ Create config spec file (ud.configspecfile) for ccase view
+ """
+ config_spec = ""
+ custom_config_spec = d.getVar("CCASE_CUSTOM_CONFIG_SPEC", True)
+ if custom_config_spec is not None:
+ for line in custom_config_spec.split("\\n"):
+ config_spec += line+"\n"
+ bb.warn("A custom config spec has been set, SRCREV is only relevant for the tarball name.")
+ else:
+ config_spec += "element * CHECKEDOUT\n"
+ config_spec += "element * %s\n" % ud.label
+ config_spec += "load %s%s\n" % (ud.vob, ud.module)
+
+ logger.info("Using config spec: \n%s" % config_spec)
+
+ with open(ud.configspecfile, 'w') as f:
+ f.write(config_spec)
+
+ def _remove_view(self, ud, d):
+ if os.path.exists(ud.viewdir):
+ os.chdir(ud.ccasedir)
+ cmd = self._build_ccase_command(ud, 'rmview');
+ logger.info("cleaning up [VOB=%s label=%s view=%s]", ud.vob, ud.label, ud.viewname)
+ bb.fetch2.check_network_access(d, cmd, ud.url)
+ output = runfetchcmd(cmd, d)
+ logger.info("rmview output: %s", output)
+
+ def need_update(self, ud, d):
+ if ("LATEST" in ud.label) or (ud.customspec and "LATEST" in ud.customspec):
+ ud.identifier += "-%s" % d.getVar("DATETIME",d, True)
+ return True
+ if os.path.exists(ud.localpath):
+ return False
+ return True
+
+ def supports_srcrev(self):
+ return True
+
+ def sortable_revision(self, ud, d, name):
+ return False, ud.identifier
+
+ def download(self, ud, d):
+ """Fetch url"""
+
+ # Make a fresh view
+ bb.utils.mkdirhier(ud.ccasedir)
+ self._write_configspec(ud, d)
+ cmd = self._build_ccase_command(ud, 'mkview')
+ logger.info("creating view [VOB=%s label=%s view=%s]", ud.vob, ud.label, ud.viewname)
+ bb.fetch2.check_network_access(d, cmd, ud.url)
+ try:
+ runfetchcmd(cmd, d)
+ except FetchError as e:
+ if "CRCLI2008E" in e.msg:
+ raise FetchError("%s\n%s\n" % (e.msg, "Call `rcleartool login` in your console to authenticate to the clearcase server before running bitbake."))
+ else:
+ raise e
+
+ # Set configspec: Setting the configspec effectively fetches the files as defined in the configspec
+ os.chdir(ud.viewdir)
+ cmd = self._build_ccase_command(ud, 'setcs');
+ logger.info("fetching data [VOB=%s label=%s view=%s]", ud.vob, ud.label, ud.viewname)
+ bb.fetch2.check_network_access(d, cmd, ud.url)
+ output = runfetchcmd(cmd, d)
+ logger.info("%s", output)
+
+ # Copy the configspec to the viewdir so we have it in our source tarball later
+ shutil.copyfile(ud.configspecfile, os.path.join(ud.viewdir, ud.csname))
+
+ # Clean clearcase meta-data before tar
+
+ runfetchcmd('tar -czf "%s" .' % (ud.localpath), d, cleanup = [ud.localpath])
+
+ # Clean up so we can create a new view next time
+ self.clean(ud, d);
+
+ def clean(self, ud, d):
+ self._remove_view(ud, d)
+ bb.utils.remove(ud.configspecfile)
diff --git a/bitbake/lib/bb/fetch2/cvs.py b/bitbake/lib/bb/fetch2/cvs.py
new file mode 100644
index 0000000..d27d96f
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/cvs.py
@@ -0,0 +1,171 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'Fetch' implementations
+
+Classes for obtaining upstream sources for the
+BitBake build tools.
+
+"""
+
+# Copyright (C) 2003, 2004 Chris Larson
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+#Based on functions from the base bb module, Copyright 2003 Holger Schurig
+#
+
+import os
+import logging
+import bb
+from bb.fetch2 import FetchMethod, FetchError, MissingParameterError, logger
+from bb.fetch2 import runfetchcmd
+
+class Cvs(FetchMethod):
+ """
+ Class to fetch a module or modules from cvs repositories
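+
+ A typical SRC_URI looks like the following (the host, cvsroot path, module
+ and tag are illustrative only):
+
+ SRC_URI = "cvs://anonymous@cvs.example.org/cvsroot;module=mymodule;tag=SOME_TAG"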
+ """
+ def supports(self, ud, d):
+ """
+ Check to see if a given url can be fetched with cvs.
+ """
+ return ud.type in ['cvs']
+
+ def urldata_init(self, ud, d):
+ if not "module" in ud.parm:
+ raise MissingParameterError("module", ud.url)
+ ud.module = ud.parm["module"]
+
+ ud.tag = ud.parm.get('tag', "")
+
+ # Override the default date in certain cases
+ if 'date' in ud.parm:
+ ud.date = ud.parm['date']
+ elif ud.tag:
+ ud.date = ""
+
+ norecurse = ''
+ if 'norecurse' in ud.parm:
+ norecurse = '_norecurse'
+
+ fullpath = ''
+ if 'fullpath' in ud.parm:
+ fullpath = '_fullpath'
+
+ ud.localfile = bb.data.expand('%s_%s_%s_%s%s%s.tar.gz' % (ud.module.replace('/', '.'), ud.host, ud.tag, ud.date, norecurse, fullpath), d)
+
+ def need_update(self, ud, d):
+ if (ud.date == "now"):
+ return True
+ if not os.path.exists(ud.localpath):
+ return True
+ return False
+
+ def download(self, ud, d):
+
+ method = ud.parm.get('method', 'pserver')
+ localdir = ud.parm.get('localdir', ud.module)
+ cvs_port = ud.parm.get('port', '')
+
+ cvs_rsh = None
+ if method == "ext":
+ if "rsh" in ud.parm:
+ cvs_rsh = ud.parm["rsh"]
+
+ if method == "dir":
+ cvsroot = ud.path
+ else:
+ cvsroot = ":" + method
+ cvsproxyhost = d.getVar('CVS_PROXY_HOST', True)
+ if cvsproxyhost:
+ cvsroot += ";proxy=" + cvsproxyhost
+ cvsproxyport = d.getVar('CVS_PROXY_PORT', True)
+ if cvsproxyport:
+ cvsroot += ";proxyport=" + cvsproxyport
+ cvsroot += ":" + ud.user
+ if ud.pswd:
+ cvsroot += ":" + ud.pswd
+ cvsroot += "@" + ud.host + ":" + cvs_port + ud.path
+
+ options = []
+ if 'norecurse' in ud.parm:
+ options.append("-l")
+ if ud.date:
+ # treat YYYYMMDDHHMM specially for CVS
+ if len(ud.date) == 12:
+ options.append("-D \"%s %s:%s UTC\"" % (ud.date[0:8], ud.date[8:10], ud.date[10:12]))
+ else:
+ options.append("-D \"%s UTC\"" % ud.date)
+ if ud.tag:
+ options.append("-r %s" % ud.tag)
+
+ cvsbasecmd = d.getVar("FETCHCMD_cvs", True)
+ cvscmd = cvsbasecmd + " '-d" + cvsroot + "' co " + " ".join(options) + " " + ud.module
+ cvsupdatecmd = cvsbasecmd + " '-d" + cvsroot + "' update -d -P " + " ".join(options)
+
+ if cvs_rsh:
+ cvscmd = "CVS_RSH=\"%s\" %s" % (cvs_rsh, cvscmd)
+ cvsupdatecmd = "CVS_RSH=\"%s\" %s" % (cvs_rsh, cvsupdatecmd)
+
+ # create module directory
+ logger.debug(2, "Fetch: checking for module directory")
+ pkg = d.getVar('PN', True)
+ pkgdir = os.path.join(d.getVar('CVSDIR', True), pkg)
+ moddir = os.path.join(pkgdir, localdir)
+ if os.access(os.path.join(moddir, 'CVS'), os.R_OK):
+ logger.info("Update " + ud.url)
+ bb.fetch2.check_network_access(d, cvsupdatecmd, ud.url)
+ # update sources there
+ os.chdir(moddir)
+ cmd = cvsupdatecmd
+ else:
+ logger.info("Fetch " + ud.url)
+ # check out sources there
+ bb.utils.mkdirhier(pkgdir)
+ os.chdir(pkgdir)
+ logger.debug(1, "Running %s", cvscmd)
+ bb.fetch2.check_network_access(d, cvscmd, ud.url)
+ cmd = cvscmd
+
+ runfetchcmd(cmd, d, cleanup = [moddir])
+
+ if not os.access(moddir, os.R_OK):
+ raise FetchError("Directory %s was not readable despite sucessful fetch?!" % moddir, ud.url)
+
+ scmdata = ud.parm.get("scmdata", "")
+ if scmdata == "keep":
+ tar_flags = ""
+ else:
+ tar_flags = "--exclude 'CVS'"
+
+ # tar them up to a defined filename
+ if 'fullpath' in ud.parm:
+ os.chdir(pkgdir)
+ cmd = "tar %s -czf %s %s" % (tar_flags, ud.localpath, localdir)
+ else:
+ os.chdir(moddir)
+ os.chdir('..')
+ cmd = "tar %s -czf %s %s" % (tar_flags, ud.localpath, os.path.basename(moddir))
+
+ runfetchcmd(cmd, d, cleanup = [ud.localpath])
+
+ def clean(self, ud, d):
+ """ Clean CVS Files and tarballs """
+
+ pkg = d.getVar('PN', True)
+ pkgdir = os.path.join(d.getVar("CVSDIR", True), pkg)
+
+ bb.utils.remove(pkgdir, True)
+ bb.utils.remove(ud.localpath)
+
diff --git a/bitbake/lib/bb/fetch2/git.py b/bitbake/lib/bb/fetch2/git.py
new file mode 100644
index 0000000..40658ff
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/git.py
@@ -0,0 +1,447 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'Fetch' git implementation
+
+The git fetcher supports SRC_URIs with the following format:
+SRC_URI = "git://some.host/somepath;OptionA=xxx;OptionB=xxx;..."
+
+Supported SRC_URI options are:
+
+- branch
+ The git branch to retrieve from. The default is "master"
+
+ This option also supports fetching multiple branches, with branches
+ separated by commas. In the multiple-branch case, the name option
+ must have the same number of names to match the branches; each name
+ is used to specify the SRCREV for its branch,
+ e.g:
+ SRC_URI="git://some.host/somepath;branch=branchX,branchY;name=nameX,nameY"
+ SRCREV_nameX = "xxxxxxxxxxxxxxxxxxxx"
+ SRCREV_nameY = "YYYYYYYYYYYYYYYYYYYY"
+
+- tag
+ The git tag to retrieve. The default is "master"
+
+- protocol
+ The method to use to access the repository. Common options are "git",
+ "http", "https", "file", "ssh" and "rsync". The default is "git".
+
+- rebaseable
+ rebaseable indicates that the upstream git repo may rebase in the future,
+ and the current revision may disappear from the upstream repo. This option
+ reminds the fetcher to preserve the local cache carefully for future use.
+ The default value is "0"; set rebaseable=1 for a rebaseable git repo.
+
+- nocheckout
+ Don't check out source code when unpacking. Set this option for recipes
+ that have their own routine to check out code.
+ The default is "0", set nocheckout=1 if needed.
+
+- bareclone
+ Create a bare clone of the source code and don't check out the source code
+ when unpacking. Set this option for recipes that have their own routine to
+ check out code and their own tracking branch requirements.
+ The default is "0", set bareclone=1 if needed.
+
+- nobranch
+ Don't check that the SHA is valid for the branch. Set this option for recipes
+ referring to a commit which is valid in a tag rather than in a branch.
+ The default is "0", set nobranch=1 if needed.
+
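+A combined example (the host, path and revision below are illustrative only):
+
+SRC_URI = "git://git.example.com/project;protocol=https;branch=stable"
+SRCREV = "0123456789abcdef0123456789abcdef01234567"
+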
+"""
+
+#Copyright (C) 2005 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os
+import re
+import bb
+from bb import data
+from bb.fetch2 import FetchMethod
+from bb.fetch2 import runfetchcmd
+from bb.fetch2 import logger
+
+class Git(FetchMethod):
+ """Class to fetch a module or modules from git repositories"""
+ def init(self, d):
+ pass
+
+ def supports(self, ud, d):
+ """
+ Check to see if a given url can be fetched with git.
+ """
+ return ud.type in ['git']
+
+ def supports_checksum(self, urldata):
+ return False
+
+ def urldata_init(self, ud, d):
+ """
+ init git specific variable within url data
+ so that the git method like latest_revision() can work
+ """
+ if 'protocol' in ud.parm:
+ ud.proto = ud.parm['protocol']
+ elif not ud.host:
+ ud.proto = 'file'
+ else:
+ ud.proto = "git"
+
+ if not ud.proto in ('git', 'file', 'ssh', 'http', 'https', 'rsync'):
+ raise bb.fetch2.ParameterError("Invalid protocol type", ud.url)
+
+ ud.nocheckout = ud.parm.get("nocheckout","0") == "1"
+
+ ud.rebaseable = ud.parm.get("rebaseable","0") == "1"
+
+ ud.nobranch = ud.parm.get("nobranch","0") == "1"
+
+ # bareclone implies nocheckout
+ ud.bareclone = ud.parm.get("bareclone","0") == "1"
+ if ud.bareclone:
+ ud.nocheckout = 1
+
+ ud.unresolvedrev = {}
+ branches = ud.parm.get("branch", "master").split(',')
+ if len(branches) != len(ud.names):
+ raise bb.fetch2.ParameterError("The number of name and branch parameters is not balanced", ud.url)
+ ud.branches = {}
+ for name in ud.names:
+ branch = branches[ud.names.index(name)]
+ ud.branches[name] = branch
+ ud.unresolvedrev[name] = branch
+
+ ud.basecmd = data.getVar("FETCHCMD_git", d, True) or "git -c core.fsyncobjectfiles=0"
+
+ ud.write_tarballs = ((data.getVar("BB_GENERATE_MIRROR_TARBALLS", d, True) or "0") != "0") or ud.rebaseable
+
+ ud.setup_revisons(d)
+
+ for name in ud.names:
+ # Ensure anything that doesn't look like a SHA-1 revision (40 hex characters) is translated into one
+ if not ud.revisions[name] or len(ud.revisions[name]) != 40 or (False in [c in "abcdef0123456789" for c in ud.revisions[name]]):
+ if ud.revisions[name]:
+ ud.unresolvedrev[name] = ud.revisions[name]
+ ud.revisions[name] = self.latest_revision(ud, d, name)
+
+ gitsrcname = '%s%s' % (ud.host.replace(':', '.'), ud.path.replace('/', '.').replace('*', '.'))
+ if gitsrcname.startswith('.'):
+ gitsrcname = gitsrcname[1:]
+
+ # for rebaseable git repo, it is necessary to keep mirror tar ball
+ # per revision, so that even the revision disappears from the
+ # upstream repo in the future, the mirror will remain intact and still
+ # contains the revision
+ if ud.rebaseable:
+ for name in ud.names:
+ gitsrcname = gitsrcname + '_' + ud.revisions[name]
+ ud.mirrortarball = 'git2_%s.tar.gz' % (gitsrcname)
+ ud.fullmirror = os.path.join(d.getVar("DL_DIR", True), ud.mirrortarball)
+ gitdir = d.getVar("GITDIR", True) or (d.getVar("DL_DIR", True) + "/git2/")
+ ud.clonedir = os.path.join(gitdir, gitsrcname)
+
+ ud.localfile = ud.clonedir
+
+ def localpath(self, ud, d):
+ return ud.clonedir
+
+ def need_update(self, ud, d):
+ if not os.path.exists(ud.clonedir):
+ return True
+ os.chdir(ud.clonedir)
+ for name in ud.names:
+ if not self._contains_ref(ud, d, name):
+ return True
+ if ud.write_tarballs and not os.path.exists(ud.fullmirror):
+ return True
+ return False
+
+ def try_premirror(self, ud, d):
+ # If we don't do this, updating an existing checkout with only premirrors
+ # is not possible
+ if d.getVar("BB_FETCH_PREMIRRORONLY", True) is not None:
+ return True
+ if os.path.exists(ud.clonedir):
+ return False
+ return True
+
+ def download(self, ud, d):
+ """Fetch url"""
+
+ ud.repochanged = not os.path.exists(ud.fullmirror)
+
+ # If the checkout doesn't exist and the mirror tarball does, extract it
+ if not os.path.exists(ud.clonedir) and os.path.exists(ud.fullmirror):
+ bb.utils.mkdirhier(ud.clonedir)
+ os.chdir(ud.clonedir)
+ runfetchcmd("tar -xzf %s" % (ud.fullmirror), d)
+
+ repourl = self._get_repo_url(ud)
+
+ # If the repo still doesn't exist, fallback to cloning it
+ if not os.path.exists(ud.clonedir):
+ # We do this since git will use a "-l" option automatically for local urls where possible
+ if repourl.startswith("file://"):
+ repourl = repourl[7:]
+ clone_cmd = "%s clone --bare --mirror %s %s" % (ud.basecmd, repourl, ud.clonedir)
+ if ud.proto.lower() != 'file':
+ bb.fetch2.check_network_access(d, clone_cmd)
+ runfetchcmd(clone_cmd, d)
+
+ os.chdir(ud.clonedir)
+ # Update the checkout if needed
+ needupdate = False
+ for name in ud.names:
+ if not self._contains_ref(ud, d, name):
+ needupdate = True
+ if needupdate:
+ try:
+ runfetchcmd("%s remote rm origin" % ud.basecmd, d)
+ except bb.fetch2.FetchError:
+ logger.debug(1, "No Origin")
+
+ runfetchcmd("%s remote add --mirror=fetch origin %s" % (ud.basecmd, repourl), d)
+ fetch_cmd = "%s fetch -f --prune %s refs/*:refs/*" % (ud.basecmd, repourl)
+ if ud.proto.lower() != 'file':
+ bb.fetch2.check_network_access(d, fetch_cmd, ud.url)
+ runfetchcmd(fetch_cmd, d)
+ runfetchcmd("%s prune-packed" % ud.basecmd, d)
+ runfetchcmd("%s pack-redundant --all | xargs -r rm" % ud.basecmd, d)
+ ud.repochanged = True
+ os.chdir(ud.clonedir)
+ for name in ud.names:
+ if not self._contains_ref(ud, d, name):
+ raise bb.fetch2.FetchError("Unable to find revision %s in branch %s even from upstream" % (ud.revisions[name], ud.branches[name]))
+
+ def build_mirror_data(self, ud, d):
+ # Generate a mirror tarball if needed
+ if ud.write_tarballs and (ud.repochanged or not os.path.exists(ud.fullmirror)):
+ # it's possible that this symlink points to a read-only filesystem populated by a PREMIRROR
+ if os.path.islink(ud.fullmirror):
+ os.unlink(ud.fullmirror)
+
+ os.chdir(ud.clonedir)
+ logger.info("Creating tarball of git repository")
+ runfetchcmd("tar -czf %s %s" % (ud.fullmirror, os.path.join(".") ), d)
+ runfetchcmd("touch %s.done" % (ud.fullmirror), d)
+
+ def unpack(self, ud, destdir, d):
+ """ unpack the downloaded src to destdir"""
+
+ subdir = ud.parm.get("subpath", "")
+ if subdir != "":
+ readpathspec = ":%s" % (subdir)
+ def_destsuffix = "%s/" % os.path.basename(subdir.rstrip('/'))
+ else:
+ readpathspec = ""
+ def_destsuffix = "git/"
+
+ destsuffix = ud.parm.get("destsuffix", def_destsuffix)
+ destdir = ud.destdir = os.path.join(destdir, destsuffix)
+ if os.path.exists(destdir):
+ bb.utils.prunedir(destdir)
+
+ cloneflags = "-s -n"
+ if ud.bareclone:
+ cloneflags += " --mirror"
+
+ # Versions of git prior to 1.7.9.2 have issues where foo.git and foo get confused
+ # and you end up with some horrible union of the two when you attempt to clone it
+ # The least invasive workaround seems to be a symlink to the real directory to
+ # fool git into ignoring any .git version that may also be present.
+ #
+ # The issue is fixed in more recent versions of git so we can drop this hack in future
+ # when that version becomes common enough.
+ clonedir = ud.clonedir
+ if not ud.path.endswith(".git"):
+ indirectiondir = destdir[:-1] + ".indirectionsymlink"
+ if os.path.exists(indirectiondir):
+ os.remove(indirectiondir)
+ bb.utils.mkdirhier(os.path.dirname(indirectiondir))
+ os.symlink(ud.clonedir, indirectiondir)
+ clonedir = indirectiondir
+
+ runfetchcmd("%s clone %s %s/ %s" % (ud.basecmd, cloneflags, clonedir, destdir), d)
+ os.chdir(destdir)
+ repourl = self._get_repo_url(ud)
+ runfetchcmd("%s remote set-url origin %s" % (ud.basecmd, repourl), d)
+ if not ud.nocheckout:
+ if subdir != "":
+ runfetchcmd("%s read-tree %s%s" % (ud.basecmd, ud.revisions[ud.names[0]], readpathspec), d)
+ runfetchcmd("%s checkout-index -q -f -a" % ud.basecmd, d)
+ elif not ud.nobranch:
+ branchname = ud.branches[ud.names[0]]
+ runfetchcmd("%s checkout -B %s %s" % (ud.basecmd, branchname, \
+ ud.revisions[ud.names[0]]), d)
+ runfetchcmd("%s branch --set-upstream %s origin/%s" % (ud.basecmd, branchname, \
+ branchname), d)
+ else:
+ runfetchcmd("%s checkout %s" % (ud.basecmd, ud.revisions[ud.names[0]]), d)
+
+ return True
+
+ def clean(self, ud, d):
+ """ clean the git directory """
+
+ bb.utils.remove(ud.localpath, True)
+ bb.utils.remove(ud.fullmirror)
+ bb.utils.remove(ud.fullmirror + ".done")
+
+ def supports_srcrev(self):
+ return True
+
+ def _contains_ref(self, ud, d, name):
+ cmd = ""
+ if ud.nobranch:
+ cmd = "%s log --pretty=oneline -n 1 %s -- 2> /dev/null | wc -l" % (
+ ud.basecmd, ud.revisions[name])
+ else:
+ cmd = "%s branch --contains %s --list %s 2> /dev/null | wc -l" % (
+ ud.basecmd, ud.revisions[name], ud.branches[name])
+ try:
+ output = runfetchcmd(cmd, d, quiet=True)
+ except bb.fetch2.FetchError:
+ return False
+ if len(output.split()) > 1:
+ raise bb.fetch2.FetchError("The command '%s' gave output with more then 1 line unexpectedly, output: '%s'" % (cmd, output))
+ return output.split()[0] != "0"
+
+ def _get_repo_url(self, ud):
+ """
+ Return the repository URL
+ """
+ if ud.user:
+ username = ud.user + '@'
+ else:
+ username = ""
+ return "%s://%s%s%s" % (ud.proto, username, ud.host, ud.path)
+
+ def _revision_key(self, ud, d, name):
+ """
+ Return a unique key for the url
+ """
+ return "git:" + ud.host + ud.path.replace('/', '.') + ud.unresolvedrev[name]
+
+ def _lsremote(self, ud, d, search):
+ """
+ Run git ls-remote with the specified search string
+ """
+ repourl = self._get_repo_url(ud)
+ cmd = "%s ls-remote %s %s" % \
+ (ud.basecmd, repourl, search)
+ if ud.proto.lower() != 'file':
+ bb.fetch2.check_network_access(d, cmd)
+ output = runfetchcmd(cmd, d, True)
+ if not output:
+ raise bb.fetch2.FetchError("The command %s gave empty output unexpectedly" % cmd, ud.url)
+ return output
+
+ def _latest_revision(self, ud, d, name):
+ """
+ Compute the HEAD revision for the url
+ """
+ output = self._lsremote(ud, d, "")
+ # Tags of the form ^{} may not work, need to fallback to other form
+ if ud.unresolvedrev[name][:5] == "refs/":
+ head = ud.unresolvedrev[name]
+ tag = ud.unresolvedrev[name]
+ else:
+ head = "refs/heads/%s" % ud.unresolvedrev[name]
+ tag = "refs/tags/%s" % ud.unresolvedrev[name]
+ for s in [head, tag + "^{}", tag]:
+ for l in output.split('\n'):
+ if s in l:
+ return l.split()[0]
+ raise bb.fetch2.FetchError("Unable to resolve '%s' in upstream git repository in git ls-remote output for %s" % \
+ (ud.unresolvedrev[name], ud.host+ud.path))
+
+ def latest_versionstring(self, ud, d):
+ """
+ Compute the latest release name like "x.y.z" in "x.y.z+gitHASH"
+ by searching through the tags output of ls-remote, comparing
+ versions and returning the highest match.
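+
+ A recipe can narrow which tags are considered via GITTAGREGEX, for
+ example (illustrative only): GITTAGREGEX = "v(?P<pver>\d+(\.\d+)+)"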
+ """
+ pupver = ('', '')
+
+ tagregex = re.compile(d.getVar('GITTAGREGEX', True) or "(?P<pver>([0-9][\.|_]?)+)")
+ try:
+ output = self._lsremote(ud, d, "refs/tags/*")
+ except (bb.fetch2.FetchError, bb.fetch2.NetworkAccess):
+ return pupver
+
+ verstring = ""
+ revision = ""
+ for line in output.split("\n"):
+ if not line:
+ break
+
+ tag_head = line.split("/")[-1]
+ # Ignore non-release tags
+ m = re.search("(alpha|beta|rc|final)+", tag_head)
+ if m:
+ continue
+
+ # search for version in the line
+ tag = tagregex.search(tag_head)
+ if tag is None:
+ continue
+
+ tag = tag.group('pver')
+ tag = tag.replace("_", ".")
+
+ if verstring and bb.utils.vercmp(("0", tag, ""), ("0", verstring, "")) < 0:
+ continue
+
+ verstring = tag
+ revision = line.split()[0]
+ pupver = (verstring, revision)
+
+ return pupver
+
+ def _build_revision(self, ud, d, name):
+ return ud.revisions[name]
+
+ def gitpkgv_revision(self, ud, d, name):
+ """
+ Return a sortable revision number by counting commits in the history
+ Based on gitpkgv.bbclass in meta-openembedded
+ """
+ rev = self._build_revision(ud, d, name)
+ localpath = ud.localpath
+ rev_file = os.path.join(localpath, "oe-gitpkgv_" + rev)
+ if not os.path.exists(localpath):
+ commits = None
+ else:
+ if not os.path.exists(rev_file) or not os.path.getsize(rev_file):
+ from pipes import quote
+ commits = bb.fetch2.runfetchcmd(
+ "git rev-list %s -- | wc -l" % (quote(rev)),
+ d, quiet=True).strip().lstrip('0')
+ if commits:
+ open(rev_file, "w").write("%d\n" % int(commits))
+ else:
+ commits = open(rev_file, "r").readline(128).strip()
+ if commits:
+ return False, "%s+%s" % (commits, rev[:7])
+ else:
+ return True, str(rev)
+
+ def checkstatus(self, fetch, ud, d):
+ try:
+ self._lsremote(ud, d, "")
+ return True
+ except bb.fetch2.FetchError:
+ return False
diff --git a/bitbake/lib/bb/fetch2/gitannex.py b/bitbake/lib/bb/fetch2/gitannex.py
new file mode 100644
index 0000000..0f37897
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/gitannex.py
@@ -0,0 +1,76 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'Fetch' git annex implementation
+"""
+
+# Copyright (C) 2014 Otavio Salvador
+# Copyright (C) 2014 O.S. Systems Software LTDA.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os
+import bb
+from bb import data
+from bb.fetch2.git import Git
+from bb.fetch2 import runfetchcmd
+from bb.fetch2 import logger
+
+class GitANNEX(Git):
+ def supports(self, ud, d):
+ """
+ Check to see if a given url can be fetched with git.
+ """
+ return ud.type in ['gitannex']
+
+ def uses_annex(self, ud, d):
+ for name in ud.names:
+ try:
+ runfetchcmd("%s rev-list git-annex" % (ud.basecmd), d, quiet=True)
+ return True
+ except bb.fetch2.FetchError:
+ pass
+
+ return False
+
+ def update_annex(self, ud, d):
+ try:
+ runfetchcmd("%s annex get --all" % (ud.basecmd), d, quiet=True)
+ except bb.fetch2.FetchError:
+ return False
+ runfetchcmd("chmod u+w -R %s/annex" % (ud.clonedir), d, quiet=True)
+
+ return True
+
+ def download(self, ud, d):
+ Git.download(self, ud, d)
+
+ os.chdir(ud.clonedir)
+ annex = self.uses_annex(ud, d)
+ if annex:
+ self.update_annex(ud, d)
+
+ def unpack(self, ud, destdir, d):
+ Git.unpack(self, ud, destdir, d)
+
+ os.chdir(ud.destdir)
+ try:
+ runfetchcmd("%s annex sync" % (ud.basecmd), d)
+ except bb.fetch2.FetchError:
+ pass
+
+ annex = self.uses_annex(ud, d)
+ if annex:
+ runfetchcmd("%s annex get" % (ud.basecmd), d)
+ runfetchcmd("chmod u+w -R %s/.git/annex" % (ud.destdir), d, quiet=True)
diff --git a/bitbake/lib/bb/fetch2/gitsm.py b/bitbake/lib/bb/fetch2/gitsm.py
new file mode 100644
index 0000000..0392e48
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/gitsm.py
@@ -0,0 +1,137 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'Fetch' git submodules implementation
+
+Inherits from and extends the Git fetcher to retrieve submodules of a git repository
+after cloning.
+
+SRC_URI = "gitsm://<see Git fetcher for syntax>"
+
+See the Git fetcher, git://, for usage documentation.
+
+NOTE: Switching a SRC_URI from "git://" to "gitsm://" requires a clean of your recipe.
+
+"""
+
+# Copyright (C) 2013 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os
+import bb
+from bb import data
+from bb.fetch2.git import Git
+from bb.fetch2 import runfetchcmd
+from bb.fetch2 import logger
+
+class GitSM(Git):
+ def supports(self, ud, d):
+ """
+ Check to see if a given url can be fetched with git.
+ """
+ return ud.type in ['gitsm']
+
+ def uses_submodules(self, ud, d):
+ for name in ud.names:
+ try:
+ runfetchcmd("%s show %s:.gitmodules" % (ud.basecmd, ud.revisions[name]), d, quiet=True)
+ return True
+ except bb.fetch2.FetchError:
+ pass
+ return False
+
+ def _set_relative_paths(self, repopath):
+ """
+ Fix submodule paths to be relative instead of absolute,
+ so that when we move the repo it doesn't break
+ (In Git 1.7.10+ this is done automatically)
+ """
+ submodules = []
+ with open(os.path.join(repopath, '.gitmodules'), 'r') as f:
+ for line in f.readlines():
+ if line.startswith('[submodule'):
+ submodules.append(line.split('"')[1])
+
+ for module in submodules:
+ repo_conf = os.path.join(repopath, module, '.git')
+ if os.path.exists(repo_conf):
+ with open(repo_conf, 'r') as f:
+ lines = f.readlines()
+ newpath = ''
+ for i, line in enumerate(lines):
+ if line.startswith('gitdir:'):
+ oldpath = line.split(': ')[-1].rstrip()
+ if oldpath.startswith('/'):
+ newpath = '../' * (module.count('/') + 1) + '.git/modules/' + module
+ lines[i] = 'gitdir: %s\n' % newpath
+ break
+ if newpath:
+ with open(repo_conf, 'w') as f:
+ for line in lines:
+ f.write(line)
+
+ repo_conf2 = os.path.join(repopath, '.git', 'modules', module, 'config')
+ if os.path.exists(repo_conf2):
+ with open(repo_conf2, 'r') as f:
+ lines = f.readlines()
+ newpath = ''
+ for i, line in enumerate(lines):
+ if line.lstrip().startswith('worktree = '):
+ oldpath = line.split(' = ')[-1].rstrip()
+ if oldpath.startswith('/'):
+ newpath = '../' * (module.count('/') + 3) + module
+ lines[i] = '\tworktree = %s\n' % newpath
+ break
+ if newpath:
+ with open(repo_conf2, 'w') as f:
+ for line in lines:
+ f.write(line)
+
+ def update_submodules(self, ud, d):
+ # We have to convert bare -> full repo, do the submodule bit, then convert back
+ tmpclonedir = ud.clonedir + ".tmp"
+ gitdir = tmpclonedir + os.sep + ".git"
+ bb.utils.remove(tmpclonedir, True)
+ os.mkdir(tmpclonedir)
+ os.rename(ud.clonedir, gitdir)
+ runfetchcmd("sed " + gitdir + "/config -i -e 's/bare.*=.*true/bare = false/'", d)
+ os.chdir(tmpclonedir)
+ runfetchcmd(ud.basecmd + " reset --hard", d)
+ runfetchcmd(ud.basecmd + " checkout " + ud.revisions[ud.names[0]], d)
+ runfetchcmd(ud.basecmd + " submodule init", d)
+ runfetchcmd(ud.basecmd + " submodule update", d)
+ self._set_relative_paths(tmpclonedir)
+ runfetchcmd("sed " + gitdir + "/config -i -e 's/bare.*=.*false/bare = true/'", d)
+ os.rename(gitdir, ud.clonedir)
+ bb.utils.remove(tmpclonedir, True)
+
+ def download(self, ud, d):
+ Git.download(self, ud, d)
+
+ os.chdir(ud.clonedir)
+ submodules = self.uses_submodules(ud, d)
+ if submodules:
+ self.update_submodules(ud, d)
+
+ def unpack(self, ud, destdir, d):
+ Git.unpack(self, ud, destdir, d)
+
+ os.chdir(ud.destdir)
+ submodules = self.uses_submodules(ud, d)
+ if submodules:
+ runfetchcmd("cp -r " + ud.clonedir + "/modules " + ud.destdir + "/.git/", d)
+ runfetchcmd(ud.basecmd + " submodule init", d)
+ runfetchcmd(ud.basecmd + " submodule update", d)
+
diff --git a/bitbake/lib/bb/fetch2/hg.py b/bitbake/lib/bb/fetch2/hg.py
new file mode 100644
index 0000000..d978630
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/hg.py
@@ -0,0 +1,275 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'Fetch' implementation for mercurial DRCS (hg).
+
+"""
+
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2004 Marcin Juszkiewicz
+# Copyright (C) 2007 Robert Schuster
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+# Based on functions from the base bb module, Copyright 2003 Holger Schurig
+
+import os
+import sys
+import logging
+import bb
+from bb import data
+from bb.fetch2 import FetchMethod
+from bb.fetch2 import FetchError
+from bb.fetch2 import MissingParameterError
+from bb.fetch2 import runfetchcmd
+from bb.fetch2 import logger
+
+class Hg(FetchMethod):
+ """Class to fetch from mercurial repositories"""
+ def supports(self, ud, d):
+ """
+ Check to see if a given url can be fetched with mercurial.
+ """
+ return ud.type in ['hg']
+
+ def supports_checksum(self, urldata):
+ """
+ Don't require checksums for local archives created from
+ repository checkouts.
+ """
+ return False
+
+ def urldata_init(self, ud, d):
+ """
+ init hg specific variable within url data
+ """
+ if not "module" in ud.parm:
+ raise MissingParameterError('module', ud.url)
+
+ ud.module = ud.parm["module"]
+
+ if 'protocol' in ud.parm:
+ ud.proto = ud.parm['protocol']
+ elif not ud.host:
+ ud.proto = 'file'
+ else:
+ ud.proto = "hg"
+
+ ud.setup_revisons(d)
+
+ if 'rev' in ud.parm:
+ ud.revision = ud.parm['rev']
+ elif not ud.revision:
+ ud.revision = self.latest_revision(ud, d)
+
+ # Create paths to mercurial checkouts
+ hgsrcname = '%s_%s_%s' % (ud.module.replace('/', '.'), \
+ ud.host, ud.path.replace('/', '.'))
+ ud.mirrortarball = 'hg_%s.tar.gz' % hgsrcname
+ ud.fullmirror = os.path.join(d.getVar("DL_DIR", True), ud.mirrortarball)
+
+ hgdir = d.getVar("HGDIR", True) or (d.getVar("DL_DIR", True) + "/hg/")
+ ud.pkgdir = os.path.join(hgdir, hgsrcname)
+ ud.moddir = os.path.join(ud.pkgdir, ud.module)
+ ud.localfile = ud.moddir
+ ud.basecmd = data.getVar("FETCHCMD_hg", d, True) or "/usr/bin/env hg"
+
+ ud.write_tarballs = d.getVar("BB_GENERATE_MIRROR_TARBALLS", True)
+
+ def need_update(self, ud, d):
+ revTag = ud.parm.get('rev', 'tip')
+ if revTag == "tip":
+ return True
+ if not os.path.exists(ud.localpath):
+ return True
+ return False
+
+ def try_premirror(self, ud, d):
+ # If we don't do this, updating an existing checkout with only premirrors
+ # is not possible
+ if d.getVar("BB_FETCH_PREMIRRORONLY", True) is not None:
+ return True
+ if os.path.exists(ud.moddir):
+ return False
+ return True
+
+ def _buildhgcommand(self, ud, d, command):
+ """
+ Build up an hg commandline based on ud
+ command is "fetch", "update", "info"
+ """
+
+ proto = ud.parm.get('protocol', 'http')
+
+ host = ud.host
+ if proto == "file":
+ host = "/"
+ ud.host = "localhost"
+
+ if not ud.user:
+ hgroot = host + ud.path
+ else:
+ if ud.pswd:
+ hgroot = ud.user + ":" + ud.pswd + "@" + host + ud.path
+ else:
+ hgroot = ud.user + "@" + host + ud.path
+
+ if command == "info":
+ return "%s identify -i %s://%s/%s" % (ud.basecmd, proto, hgroot, ud.module)
+
+ options = [];
+
+ # Don't specify revision for the fetch; clone the entire repo.
+ # This avoids an issue if the specified revision is a tag, because
+ # the tag actually exists in the specified revision + 1, so it won't
+ # be available when used in any successive commands.
+ if ud.revision and command != "fetch":
+ options.append("-r %s" % ud.revision)
+
+ if command == "fetch":
+ if ud.user and ud.pswd:
+ cmd = "%s --config auth.default.prefix=* --config auth.default.username=%s --config auth.default.password=%s --config \"auth.default.schemes=%s\" clone %s %s://%s/%s %s" % (ud.basecmd, ud.user, ud.pswd, proto, " ".join(options), proto, hgroot, ud.module, ud.module)
+ else:
+ cmd = "%s clone %s %s://%s/%s %s" % (ud.basecmd, " ".join(options), proto, hgroot, ud.module, ud.module)
+ elif command == "pull":
+ # do not pass the options list; limiting pull to rev causes the local
+ # repo not to contain it, and the immediately following "update" command
+ # will fail
+ if ud.user and ud.pswd:
+ cmd = "%s --config auth.default.prefix=* --config auth.default.username=%s --config auth.default.password=%s --config \"auth.default.schemes=%s\" pull" % (ud.basecmd, ud.user, ud.pswd, proto)
+ else:
+ cmd = "%s pull" % (ud.basecmd)
+ elif command == "update":
+ if ud.user and ud.pswd:
+ cmd = "%s --config auth.default.prefix=* --config auth.default.username=%s --config auth.default.password=%s --config \"auth.default.schemes=%s\" update -C %s" % (ud.basecmd, ud.user, ud.pswd, proto, " ".join(options))
+ else:
+ cmd = "%s update -C %s" % (ud.basecmd, " ".join(options))
+ else:
+ raise FetchError("Invalid hg command %s" % command, ud.url)
+
+ return cmd
+
+ def download(self, ud, d):
+ """Fetch url"""
+
+ ud.repochanged = not os.path.exists(ud.fullmirror)
+
+ logger.debug(2, "Fetch: checking for module directory '" + ud.moddir + "'")
+
+ # If the checkout doesn't exist and the mirror tarball does, extract it
+ if not os.path.exists(ud.pkgdir) and os.path.exists(ud.fullmirror):
+ bb.utils.mkdirhier(ud.pkgdir)
+ os.chdir(ud.pkgdir)
+ runfetchcmd("tar -xzf %s" % (ud.fullmirror), d)
+
+ if os.access(os.path.join(ud.moddir, '.hg'), os.R_OK):
+ # Found the source; check whether a pull is needed
+ updatecmd = self._buildhgcommand(ud, d, "update")
+ os.chdir(ud.moddir)
+ logger.debug(1, "Running %s", updatecmd)
+ try:
+ runfetchcmd(updatecmd, d)
+ except bb.fetch2.FetchError:
+ # Running pull in the repo
+ pullcmd = self._buildhgcommand(ud, d, "pull")
+ logger.info("Pulling " + ud.url)
+ # update sources there
+ os.chdir(ud.moddir)
+ logger.debug(1, "Running %s", pullcmd)
+ bb.fetch2.check_network_access(d, pullcmd, ud.url)
+ runfetchcmd(pullcmd, d)
+ ud.repochanged = True
+
+ # No source found, clone it.
+ if not os.path.exists(ud.moddir):
+ fetchcmd = self._buildhgcommand(ud, d, "fetch")
+ logger.info("Fetch " + ud.url)
+ # check out sources there
+ bb.utils.mkdirhier(ud.pkgdir)
+ os.chdir(ud.pkgdir)
+ logger.debug(1, "Running %s", fetchcmd)
+ bb.fetch2.check_network_access(d, fetchcmd, ud.url)
+ runfetchcmd(fetchcmd, d)
+
+ # Even when we clone (fetch), we still need to update as hg's clone
+ # won't check out the specified revision if it's on a branch
+ updatecmd = self._buildhgcommand(ud, d, "update")
+ os.chdir(ud.moddir)
+ logger.debug(1, "Running %s", updatecmd)
+ runfetchcmd(updatecmd, d)
+
+ def clean(self, ud, d):
+ """ Clean the hg dir """
+
+ bb.utils.remove(ud.localpath, True)
+ bb.utils.remove(ud.fullmirror)
+ bb.utils.remove(ud.fullmirror + ".done")
+
+ def supports_srcrev(self):
+ return True
+
+ def _latest_revision(self, ud, d, name):
+ """
+ Compute tip revision for the url
+ """
+ bb.fetch2.check_network_access(d, self._buildhgcommand(ud, d, "info"))
+ output = runfetchcmd(self._buildhgcommand(ud, d, "info"), d)
+ return output.strip()
+
+ def _build_revision(self, ud, d, name):
+ return ud.revision
+
+ def _revision_key(self, ud, d, name):
+ """
+ Return a unique key for the url
+ """
+ return "hg:" + ud.moddir
+
+ def build_mirror_data(self, ud, d):
+ # Generate a mirror tarball if needed
+ if ud.write_tarballs == "1" and (ud.repochanged or not os.path.exists(ud.fullmirror)):
+ # it's possible that this symlink points to a read-only filesystem populated by a PREMIRROR
+ if os.path.islink(ud.fullmirror):
+ os.unlink(ud.fullmirror)
+
+ os.chdir(ud.pkgdir)
+ logger.info("Creating tarball of hg repository")
+ runfetchcmd("tar -czf %s %s" % (ud.fullmirror, ud.module), d)
+ runfetchcmd("touch %s.done" % (ud.fullmirror), d)
+
+ def localpath(self, ud, d):
+ return ud.pkgdir
+
+ def unpack(self, ud, destdir, d):
+ """
+ Make a local clone or export for the url
+ """
+
+ revflag = "-r %s" % ud.revision
+ subdir = ud.parm.get("destsuffix", ud.module)
+ codir = "%s/%s" % (destdir, subdir)
+
+ scmdata = ud.parm.get("scmdata", "")
+ if scmdata != "nokeep":
+ if not os.access(os.path.join(codir, '.hg'), os.R_OK):
+ logger.debug(2, "Unpack: creating new hg repository in '" + codir + "'")
+ runfetchcmd("%s init %s" % (ud.basecmd, codir), d)
+ logger.debug(2, "Unpack: updating source in '" + codir + "'")
+ os.chdir(codir)
+ runfetchcmd("%s pull %s" % (ud.basecmd, ud.moddir), d)
+ runfetchcmd("%s up -C %s" % (ud.basecmd, revflag), d)
+ else:
+ logger.debug(2, "Unpack: extracting source to '" + codir + "'")
+ os.chdir(ud.moddir)
+ runfetchcmd("%s archive -t files %s %s" % (ud.basecmd, revflag, codir), d)
diff --git a/bitbake/lib/bb/fetch2/local.py b/bitbake/lib/bb/fetch2/local.py
new file mode 100644
index 0000000..2d921f7
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/local.py
@@ -0,0 +1,128 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'Fetch' implementations
+
+Classes for obtaining upstream sources for the
+BitBake build tools.
+
+"""
+
+# Copyright (C) 2003, 2004 Chris Larson
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+# Based on functions from the base bb module, Copyright 2003 Holger Schurig
+
+import os
+import urllib
+import bb
+import bb.utils
+from bb import data
+from bb.fetch2 import FetchMethod, FetchError
+from bb.fetch2 import logger
+
+class Local(FetchMethod):
+ def supports(self, urldata, d):
+ """
+ Check to see if a given url represents a local fetch.
+ """
+ return urldata.type in ['file']
+
+ def urldata_init(self, ud, d):
+ # We don't set localfile as for this fetcher the file is already local!
+ ud.decodedurl = urllib.unquote(ud.url.split("://")[1].split(";")[0])
+ ud.basename = os.path.basename(ud.decodedurl)
+ ud.basepath = ud.decodedurl
+ return
+
+ def localpath(self, urldata, d):
+ """
+ Return the local filename of a given url assuming a successful fetch.
+ """
+ return self.localpaths(urldata, d)[-1]
+
+ def localpaths(self, urldata, d):
+ """
+ Return the local filename of a given url assuming a successful fetch.
+ """
+ searched = []
+ path = urldata.decodedurl
+ newpath = path
+ if path[0] == "/":
+ return [path]
+ filespath = data.getVar('FILESPATH', d, True)
+ if filespath:
+ logger.debug(2, "Searching for %s in paths:\n %s" % (path, "\n ".join(filespath.split(":"))))
+ newpath, hist = bb.utils.which(filespath, path, history=True)
+ searched.extend(hist)
+ if not newpath:
+ filesdir = data.getVar('FILESDIR', d, True)
+ if filesdir:
+ logger.debug(2, "Searching for %s in path: %s" % (path, filesdir))
+ newpath = os.path.join(filesdir, path)
+ searched.append(newpath)
+ if (not newpath or not os.path.exists(newpath)) and path.find("*") != -1:
+            # For expressions using '*', the best we can do is take the first directory in FILESPATH that exists
+ newpath, hist = bb.utils.which(filespath, ".", history=True)
+ searched.extend(hist)
+ logger.debug(2, "Searching for %s in path: %s" % (path, newpath))
+ return searched
+ if not os.path.exists(newpath):
+ dldirfile = os.path.join(d.getVar("DL_DIR", True), path)
+ logger.debug(2, "Defaulting to %s for %s" % (dldirfile, path))
+ bb.utils.mkdirhier(os.path.dirname(dldirfile))
+ searched.append(dldirfile)
+ return searched
+ return searched
+
+ def need_update(self, ud, d):
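+        # A wildcard URL or an already-present file is treated as up to date; anything else needs resolving.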
+ if ud.url.find("*") != -1:
+ return False
+ if os.path.exists(ud.localpath):
+ return False
+ return True
+
+ def download(self, urldata, d):
+ """Fetch urls (no-op for Local method)"""
+ # no need to fetch local files, we'll deal with them in place.
+ if self.supports_checksum(urldata) and not os.path.exists(urldata.localpath):
+ locations = []
+ filespath = data.getVar('FILESPATH', d, True)
+ if filespath:
+ locations = filespath.split(":")
+ filesdir = data.getVar('FILESDIR', d, True)
+ if filesdir:
+ locations.append(filesdir)
+ locations.append(d.getVar("DL_DIR", True))
+
+ msg = "Unable to find file " + urldata.url + " anywhere. The paths that were searched were:\n " + "\n ".join(locations)
+ raise FetchError(msg)
+
+ return True
+
+ def checkstatus(self, fetch, urldata, d):
+ """
+ Check the status of the url
+ """
+ if urldata.localpath.find("*") != -1:
+ logger.info("URL %s looks like a glob and was therefore not checked.", urldata.url)
+ return True
+ if os.path.exists(urldata.localpath):
+ return True
+ return False
+
+ def clean(self, urldata, d):
+ return
+
diff --git a/bitbake/lib/bb/fetch2/osc.py b/bitbake/lib/bb/fetch2/osc.py
new file mode 100644
index 0000000..3d87796
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/osc.py
@@ -0,0 +1,135 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+Bitbake "Fetch" implementation for osc (Opensuse build service client).
+Based on the svn "Fetch" implementation.
+
+"""
+
+import os
+import sys
+import logging
+import bb
+from bb import data
+from bb.fetch2 import FetchMethod
+from bb.fetch2 import FetchError
+from bb.fetch2 import MissingParameterError
+from bb.fetch2 import runfetchcmd
+from bb.fetch2 import logger
+
+class Osc(FetchMethod):
+ """Class to fetch a module or modules from Opensuse build server
+ repositories."""
+
+ def supports(self, ud, d):
+ """
+ Check to see if a given url can be fetched with osc.
+ """
+ return ud.type in ['osc']
+
+ def urldata_init(self, ud, d):
+ if not "module" in ud.parm:
+ raise MissingParameterError('module', ud.url)
+
+ ud.module = ud.parm["module"]
+
+ # Create paths to osc checkouts
+ relpath = self._strip_leading_slashes(ud.path)
+ ud.pkgdir = os.path.join(data.expand('${OSCDIR}', d), ud.host)
+ ud.moddir = os.path.join(ud.pkgdir, relpath, ud.module)
+
+ if 'rev' in ud.parm:
+ ud.revision = ud.parm['rev']
+ else:
+ pv = data.getVar("PV", d, 0)
+ rev = bb.fetch2.srcrev_internal_helper(ud, d)
+ if rev and rev != True:
+ ud.revision = rev
+ else:
+ ud.revision = ""
+
+ ud.localfile = data.expand('%s_%s_%s.tar.gz' % (ud.module.replace('/', '.'), ud.path.replace('/', '.'), ud.revision), d)
+
+ def _buildosccommand(self, ud, d, command):
+ """
+        Build up an osc command line based on ud
+ command is "fetch", "update", "info"
+ """
+
+ basecmd = data.expand('${FETCHCMD_osc}', d)
+
+        proto = ud.parm.get('protocol', 'osc')
+
+ options = []
+
+ config = "-c %s" % self.generate_config(ud, d)
+
+ if ud.revision:
+ options.append("-r %s" % ud.revision)
+
+ coroot = self._strip_leading_slashes(ud.path)
+
+ if command == "fetch":
+ osccmd = "%s %s co %s/%s %s" % (basecmd, config, coroot, ud.module, " ".join(options))
+ elif command == "update":
+ osccmd = "%s %s up %s" % (basecmd, config, " ".join(options))
+ else:
+ raise FetchError("Invalid osc command %s" % command, ud.url)
+
+ return osccmd
+
+ def download(self, ud, d):
+ """
+ Fetch url
+ """
+
+ logger.debug(2, "Fetch: checking for module directory '" + ud.moddir + "'")
+
+ if os.access(os.path.join(data.expand('${OSCDIR}', d), ud.path, ud.module), os.R_OK):
+ oscupdatecmd = self._buildosccommand(ud, d, "update")
+ logger.info("Update "+ ud.url)
+ # update sources there
+ os.chdir(ud.moddir)
+ logger.debug(1, "Running %s", oscupdatecmd)
+ bb.fetch2.check_network_access(d, oscupdatecmd, ud.url)
+ runfetchcmd(oscupdatecmd, d)
+ else:
+ oscfetchcmd = self._buildosccommand(ud, d, "fetch")
+ logger.info("Fetch " + ud.url)
+ # check out sources there
+ bb.utils.mkdirhier(ud.pkgdir)
+ os.chdir(ud.pkgdir)
+ logger.debug(1, "Running %s", oscfetchcmd)
+ bb.fetch2.check_network_access(d, oscfetchcmd, ud.url)
+ runfetchcmd(oscfetchcmd, d)
+
+ os.chdir(os.path.join(ud.pkgdir + ud.path))
+ # tar them up to a defined filename
+ runfetchcmd("tar -czf %s %s" % (ud.localpath, ud.module), d, cleanup = [ud.localpath])
+
+ def supports_srcrev(self):
+ return False
+
+ def generate_config(self, ud, d):
+ """
+ Generate a .oscrc to be used for this run.
+ """
+
+ config_path = os.path.join(data.expand('${OSCDIR}', d), "oscrc")
+ if (os.path.exists(config_path)):
+ os.remove(config_path)
+
+ f = open(config_path, 'w')
+ f.write("[general]\n")
+ f.write("apisrv = %s\n" % ud.host)
+ f.write("scheme = http\n")
+ f.write("su-wrapper = su -c\n")
+ f.write("build-root = %s\n" % data.expand('${WORKDIR}', d))
+ f.write("urllist = http://moblin-obs.jf.intel.com:8888/build/%(project)s/%(repository)s/%(buildarch)s/:full/%(name)s.rpm\n")
+ f.write("extra-pkgs = gzip\n")
+ f.write("\n")
+ f.write("[%s]\n" % ud.host)
+ f.write("user = %s\n" % ud.parm["user"])
+ f.write("pass = %s\n" % ud.parm["pswd"])
+ f.close()
+
+ return config_path
diff --git a/bitbake/lib/bb/fetch2/perforce.py b/bitbake/lib/bb/fetch2/perforce.py
new file mode 100644
index 0000000..3a10c7c
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/perforce.py
@@ -0,0 +1,187 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'Fetch' implementations
+
+Classes for obtaining upstream sources for the
+BitBake build tools.
+
+"""
+
+# Copyright (C) 2003, 2004 Chris Larson
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+# Based on functions from the base bb module, Copyright 2003 Holger Schurig
+
+from future_builtins import zip
+import os
+import subprocess
+import logging
+import bb
+from bb import data
+from bb.fetch2 import FetchMethod
+from bb.fetch2 import FetchError
+from bb.fetch2 import logger
+from bb.fetch2 import runfetchcmd
+
+class Perforce(FetchMethod):
+ def supports(self, ud, d):
+ return ud.type in ['p4']
+
+ def doparse(url, d):
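+        # Roughly expected URL shape: p4://[user:pswd:host:port@]depot/path[;key=value...];
+        # when no credential block is present, host and port are taken from P4PORT instead.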
+ parm = {}
+ path = url.split("://")[1]
+ delim = path.find("@");
+ if delim != -1:
+ (user, pswd, host, port) = path.split('@')[0].split(":")
+ path = path.split('@')[1]
+ else:
+ (host, port) = d.getVar('P4PORT', False).split(':')
+ user = ""
+ pswd = ""
+
+ if path.find(";") != -1:
+ keys=[]
+ values=[]
+ plist = path.split(';')
+ for item in plist:
+ if item.count('='):
+ (key, value) = item.split('=')
+ keys.append(key)
+ values.append(value)
+
+ parm = dict(zip(keys, values))
+ path = "//" + path.split(';')[0]
+ host += ":%s" % (port)
+ parm["cset"] = Perforce.getcset(d, path, host, user, pswd, parm)
+
+ return host, path, user, pswd, parm
+ doparse = staticmethod(doparse)
+
+ def getcset(d, depot, host, user, pswd, parm):
+ p4opt = ""
+ if "cset" in parm:
+ return parm["cset"];
+ if user:
+ p4opt += " -u %s" % (user)
+ if pswd:
+ p4opt += " -P %s" % (pswd)
+ if host:
+ p4opt += " -p %s" % (host)
+
+ p4date = d.getVar("P4DATE", True)
+ if "revision" in parm:
+ depot += "#%s" % (parm["revision"])
+ elif "label" in parm:
+ depot += "@%s" % (parm["label"])
+ elif p4date:
+ depot += "@%s" % (p4date)
+
+ p4cmd = d.getVar('FETCHCMD_p4', True) or "p4"
+ logger.debug(1, "Running %s%s changes -m 1 %s", p4cmd, p4opt, depot)
+ p4file, errors = bb.process.run("%s%s changes -m 1 %s" % (p4cmd, p4opt, depot))
+ cset = p4file.strip()
+ logger.debug(1, "READ %s", cset)
+ if not cset:
+ return -1
+
+ return cset.split(' ')[1]
+ getcset = staticmethod(getcset)
+
+ def urldata_init(self, ud, d):
+ (host, path, user, pswd, parm) = Perforce.doparse(ud.url, d)
+
+ base_path = path.replace('/...', '')
+ base_path = self._strip_leading_slashes(base_path)
+
+ if "label" in parm:
+ version = parm["label"]
+ else:
+ version = Perforce.getcset(d, path, host, user, pswd, parm)
+
+ ud.localfile = data.expand('%s+%s+%s.tar.gz' % (host, base_path.replace('/', '.'), version), d)
+
+ def download(self, ud, d):
+ """
+ Fetch urls
+ """
+
+ (host, depot, user, pswd, parm) = Perforce.doparse(ud.url, d)
+
+ if depot.find('/...') != -1:
+ path = depot[:depot.find('/...')]
+ else:
+ path = depot[:depot.rfind('/')]
+
+ module = parm.get('module', os.path.basename(path))
+
+ # Get the p4 command
+ p4opt = ""
+ if user:
+ p4opt += " -u %s" % (user)
+
+ if pswd:
+ p4opt += " -P %s" % (pswd)
+
+ if host:
+ p4opt += " -p %s" % (host)
+
+ p4cmd = d.getVar('FETCHCMD_p4', True) or "p4"
+
+ # create temp directory
+ logger.debug(2, "Fetch: creating temporary directory")
+ bb.utils.mkdirhier(d.expand('${WORKDIR}'))
+ mktemp = d.getVar("FETCHCMD_p4mktemp", True) or d.expand("mktemp -d -q '${WORKDIR}/oep4.XXXXXX'")
+ tmpfile, errors = bb.process.run(mktemp)
+ tmpfile = tmpfile.strip()
+ if not tmpfile:
+ raise FetchError("Fetch: unable to create temporary directory.. make sure 'mktemp' is in the PATH.", ud.url)
+
+ if "label" in parm:
+ depot = "%s@%s" % (depot, parm["label"])
+ else:
+ cset = Perforce.getcset(d, depot, host, user, pswd, parm)
+ depot = "%s@%s" % (depot, cset)
+
+ os.chdir(tmpfile)
+ logger.info("Fetch " + ud.url)
+ logger.info("%s%s files %s", p4cmd, p4opt, depot)
+ p4file, errors = bb.process.run("%s%s files %s" % (p4cmd, p4opt, depot))
+ p4file = [f.rstrip() for f in p4file.splitlines()]
+
+ if not p4file:
+ raise FetchError("Fetch: unable to get the P4 files from %s" % depot, ud.url)
+
+ count = 0
+
+ for file in p4file:
+ list = file.split()
+
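+            # Each 'p4 files' output line looks roughly like "//depot/path/file#rev - action change ...";
+            # entries whose action is 'delete' are skipped below.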
+ if list[2] == "delete":
+ continue
+
+ dest = list[0][len(path)+1:]
+ where = dest.find("#")
+
+ subprocess.call("%s%s print -o %s/%s %s" % (p4cmd, p4opt, module, dest[:where], list[0]), shell=True)
+ count = count + 1
+
+ if count == 0:
+            logger.error("Fetch: No files gathered from the P4 fetch")
+ raise FetchError("Fetch: No files gathered from the P4 fetch", ud.url)
+
+ runfetchcmd("tar -czf %s %s" % (ud.localpath, module), d, cleanup = [ud.localpath])
+ # cleanup
+ bb.utils.prunedir(tmpfile)
diff --git a/bitbake/lib/bb/fetch2/repo.py b/bitbake/lib/bb/fetch2/repo.py
new file mode 100644
index 0000000..21678eb
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/repo.py
@@ -0,0 +1,98 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake "Fetch" repo (git) implementation
+
+"""
+
+# Copyright (C) 2009 Tom Rini <trini@embeddedalley.com>
+#
+# Based on git.py which is:
+#Copyright (C) 2005 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os
+import bb
+from bb import data
+from bb.fetch2 import FetchMethod
+from bb.fetch2 import runfetchcmd
+from bb.fetch2 import logger
+
+class Repo(FetchMethod):
+ """Class to fetch a module or modules from repo (git) repositories"""
+ def supports(self, ud, d):
+ """
+ Check to see if a given url can be fetched with repo.
+ """
+ return ud.type in ["repo"]
+
+ def urldata_init(self, ud, d):
+ """
+        We don't care about the git rev of the manifests repository, but
+ we do care about the manifest to use. The default is "default".
+ We also care about the branch or tag to be used. The default is
+ "master".
+ """
+
+ ud.proto = ud.parm.get('protocol', 'git')
+ ud.branch = ud.parm.get('branch', 'master')
+ ud.manifest = ud.parm.get('manifest', 'default.xml')
+ if not ud.manifest.endswith('.xml'):
+ ud.manifest += '.xml'
+
+ ud.localfile = data.expand("repo_%s%s_%s_%s.tar.gz" % (ud.host, ud.path.replace("/", "."), ud.manifest, ud.branch), d)
+
+ def download(self, ud, d):
+ """Fetch url"""
+
+ if os.access(os.path.join(data.getVar("DL_DIR", d, True), ud.localfile), os.R_OK):
+ logger.debug(1, "%s already exists (or was stashed). Skipping repo init / sync.", ud.localpath)
+ return
+
+ gitsrcname = "%s%s" % (ud.host, ud.path.replace("/", "."))
+ repodir = data.getVar("REPODIR", d, True) or os.path.join(data.getVar("DL_DIR", d, True), "repo")
+ codir = os.path.join(repodir, gitsrcname, ud.manifest)
+
+ if ud.user:
+ username = ud.user + "@"
+ else:
+ username = ""
+
+ bb.utils.mkdirhier(os.path.join(codir, "repo"))
+ os.chdir(os.path.join(codir, "repo"))
+ if not os.path.exists(os.path.join(codir, "repo", ".repo")):
+ bb.fetch2.check_network_access(d, "repo init -m %s -b %s -u %s://%s%s%s" % (ud.manifest, ud.branch, ud.proto, username, ud.host, ud.path), ud.url)
+ runfetchcmd("repo init -m %s -b %s -u %s://%s%s%s" % (ud.manifest, ud.branch, ud.proto, username, ud.host, ud.path), d)
+
+ bb.fetch2.check_network_access(d, "repo sync %s" % ud.url, ud.url)
+ runfetchcmd("repo sync", d)
+ os.chdir(codir)
+
+ scmdata = ud.parm.get("scmdata", "")
+ if scmdata == "keep":
+ tar_flags = ""
+ else:
+ tar_flags = "--exclude '.repo' --exclude '.git'"
+
+ # Create a cache
+ runfetchcmd("tar %s -czf %s %s" % (tar_flags, ud.localpath, os.path.join(".", "*") ), d)
+
+ def supports_srcrev(self):
+ return False
+
+ def _build_revision(self, ud, d):
+ return ud.manifest
+
+ def _want_sortable_revision(self, ud, d):
+ return False
diff --git a/bitbake/lib/bb/fetch2/sftp.py b/bitbake/lib/bb/fetch2/sftp.py
new file mode 100644
index 0000000..cb2f753
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/sftp.py
@@ -0,0 +1,129 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake SFTP Fetch implementation
+
+Class for fetching files via SFTP. It tries to adhere to the (now
+expired) IETF Internet Draft for "Uniform Resource Identifier (URI)
+Scheme for Secure File Transfer Protocol (SFTP) and Secure Shell
+(SSH)" (SECSH URI).
+
+It uses SFTP (so as to adhere to the SECSH URI specification). It only
+supports key based authentication, not password. This class, unlike
+the SSH fetcher, does not support fetching a directory tree from the
+remote.
+
+ http://tools.ietf.org/html/draft-ietf-secsh-scp-sftp-ssh-uri-04
+ https://www.iana.org/assignments/uri-schemes/prov/sftp
+ https://tools.ietf.org/html/draft-ietf-secsh-filexfer-13
+
+Please note that '/' is used as the host path separator, and not ":"
+as you may be used to from the scp/sftp commands. You can use a
+~ (tilde) to specify a path relative to your home directory.
+(The /~user/ syntax, for specifying a path relative to another
+user's home directory, is not supported.) Note that the tilde must
+still follow the host path separator ("/"). See examples below.
+
+Example SRC_URIs:
+
+SRC_URI = "sftp://host.example.com/dir/path.file.txt"
+
+A path relative to your home directory.
+
+SRC_URI = "sftp://host.example.com/~/dir/path.file.txt"
+
+You can also specify a username (specifying a password in the
+URI is not supported, use SSH keys to authenticate):
+
+SRC_URI = "sftp://user@host.example.com/dir/path.file.txt"
+
+"""
+
+# Copyright (C) 2013, Olof Johansson <olof.johansson@axis.com>
+#
+# Based in part on bb.fetch2.wget:
+# Copyright (C) 2003, 2004 Chris Larson
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+# Based on functions from the base bb module, Copyright 2003 Holger Schurig
+
+import os
+import bb
+import urllib
+import commands
+from bb import data
+from bb.fetch2 import URI
+from bb.fetch2 import FetchMethod
+from bb.fetch2 import runfetchcmd
+
+
+class SFTP(FetchMethod):
+ """Class to fetch urls via 'sftp'"""
+
+ def supports(self, ud, d):
+ """
+ Check to see if a given url can be fetched with sftp.
+ """
+ return ud.type in ['sftp']
+
+ def recommends_checksum(self, urldata):
+ return True
+
+ def urldata_init(self, ud, d):
+ if 'protocol' in ud.parm and ud.parm['protocol'] == 'git':
+ raise bb.fetch2.ParameterError(
+ "Invalid protocol - if you wish to fetch from a " +
+ "git repository using ssh, you need to use the " +
+ "git:// prefix with protocol=ssh", ud.url)
+
+ if 'downloadfilename' in ud.parm:
+ ud.basename = ud.parm['downloadfilename']
+ else:
+ ud.basename = os.path.basename(ud.path)
+
+ ud.localfile = data.expand(urllib.unquote(ud.basename), d)
+
+ def download(self, ud, d):
+ """Fetch urls"""
+
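+        # The command built below is roughly: sftp -oBatchMode=yes [-P port] [user@]host:path <DL_DIR>/<localfile>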
+ urlo = URI(ud.url)
+ basecmd = 'sftp -oBatchMode=yes'
+ port = ''
+ if urlo.port:
+ port = '-P %d' % urlo.port
+ urlo.port = None
+
+ dldir = data.getVar('DL_DIR', d, True)
+ lpath = os.path.join(dldir, ud.localfile)
+
+ user = ''
+ if urlo.userinfo:
+ user = urlo.userinfo + '@'
+
+ path = urlo.path
+
+        # Support URIs relative to the user's home directory, with
+ # the tilde syntax. (E.g. <sftp://example.com/~/foo.diff>).
+ if path[:3] == '/~/':
+ path = path[3:]
+
+ remote = '%s%s:%s' % (user, urlo.hostname, path)
+
+ cmd = '%s %s %s %s' % (basecmd, port, commands.mkarg(remote),
+ commands.mkarg(lpath))
+
+ bb.fetch2.check_network_access(d, cmd, ud.url)
+ runfetchcmd(cmd, d)
+ return True
diff --git a/bitbake/lib/bb/fetch2/ssh.py b/bitbake/lib/bb/fetch2/ssh.py
new file mode 100644
index 0000000..635578a
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/ssh.py
@@ -0,0 +1,128 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+'''
+BitBake 'Fetch' implementations
+
+This implementation is for Secure Shell (SSH), and attempts to comply with the
+IETF secsh internet draft:
+ http://tools.ietf.org/wg/secsh/draft-ietf-secsh-scp-sftp-ssh-uri/
+
+ Currently does not support the sftp parameters, as this uses scp
+ Also does not support the 'fingerprint' connection parameter.
+
+    Please note that '/' is used as the host/path separator, not ':' as you may
+    be used to; '~' can be used to specify the user's HOME, but again only after '/'
+
+ Example SRC_URI:
+ SRC_URI = "ssh://user@host.example.com/dir/path/file.txt"
+ SRC_URI = "ssh://user@host.example.com/~/file.txt"
+'''
+
+# Copyright (C) 2006 OpenedHand Ltd.
+#
+#
+# Based in part on svk.py:
+# Copyright (C) 2006 Holger Hans Peter Freyther
+# Based on svn.py:
+# Copyright (C) 2003, 2004 Chris Larson
+# Based on functions from the base bb module:
+# Copyright 2003 Holger Schurig
+#
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import re, os
+from bb import data
+from bb.fetch2 import FetchMethod
+from bb.fetch2 import FetchError
+from bb.fetch2 import logger
+from bb.fetch2 import runfetchcmd
+
+
+__pattern__ = re.compile(r'''
+ \s* # Skip leading whitespace
+ ssh:// # scheme
+ ( # Optional username/password block
+ (?P<user>\S+) # username
+ (:(?P<pass>\S+))? # colon followed by the password (optional)
+ )?
+ (?P<cparam>(;[^;]+)*)? # connection parameters block (optional)
+ @
+ (?P<host>\S+?) # non-greedy match of the host
+ (:(?P<port>[0-9]+))? # colon followed by the port (optional)
+ /
+ (?P<path>[^;]+) # path on the remote system, may be absolute or relative,
+ # and may include the use of '~' to reference the remote home
+ # directory
+ (?P<sparam>(;[^;]+)*)? # parameters block (optional)
+ $
+''', re.VERBOSE)
+
+class SSH(FetchMethod):
+ '''Class to fetch a module or modules via Secure Shell'''
+
+ def supports(self, urldata, d):
+ return __pattern__.match(urldata.url) != None
+
+ def supports_checksum(self, urldata):
+ return False
+
+ def urldata_init(self, urldata, d):
+ if 'protocol' in urldata.parm and urldata.parm['protocol'] == 'git':
+ raise bb.fetch2.ParameterError(
+ "Invalid protocol - if you wish to fetch from a git " +
+ "repository using ssh, you need to use " +
+ "git:// prefix with protocol=ssh", urldata.url)
+ m = __pattern__.match(urldata.url)
+ path = m.group('path')
+ host = m.group('host')
+ urldata.localpath = os.path.join(d.getVar('DL_DIR', True),
+ os.path.basename(os.path.normpath(path)))
+
+ def download(self, urldata, d):
+ dldir = d.getVar('DL_DIR', True)
+
+ m = __pattern__.match(urldata.url)
+ path = m.group('path')
+ host = m.group('host')
+ port = m.group('port')
+ user = m.group('user')
+ password = m.group('pass')
+
+ if port:
+ portarg = '-P %s' % port
+ else:
+ portarg = ''
+
+ if user:
+ fr = user
+ if password:
+ fr += ':%s' % password
+ fr += '@%s' % host
+ else:
+ fr = host
+ fr += ':%s' % path
+
+
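+        # The resulting command is roughly: scp -B -r [-P port] [user[:password]@]host:path <DL_DIR>/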
+ import commands
+ cmd = 'scp -B -r %s %s %s/' % (
+ portarg,
+ commands.mkarg(fr),
+ commands.mkarg(dldir)
+ )
+
+ bb.fetch2.check_network_access(d, cmd, urldata.url)
+
+ runfetchcmd(cmd, d)
+
diff --git a/bitbake/lib/bb/fetch2/svn.py b/bitbake/lib/bb/fetch2/svn.py
new file mode 100644
index 0000000..1733c2b
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/svn.py
@@ -0,0 +1,192 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'Fetch' implementation for svn.
+
+"""
+
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2004 Marcin Juszkiewicz
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+# Based on functions from the base bb module, Copyright 2003 Holger Schurig
+
+import os
+import sys
+import logging
+import bb
+import re
+from bb import data
+from bb.fetch2 import FetchMethod
+from bb.fetch2 import FetchError
+from bb.fetch2 import MissingParameterError
+from bb.fetch2 import runfetchcmd
+from bb.fetch2 import logger
+
+class Svn(FetchMethod):
+ """Class to fetch a module or modules from svn repositories"""
+ def supports(self, ud, d):
+ """
+ Check to see if a given url can be fetched with svn.
+ """
+ return ud.type in ['svn']
+
+ def urldata_init(self, ud, d):
+ """
+ init svn specific variable within url data
+ """
+ if not "module" in ud.parm:
+ raise MissingParameterError('module', ud.url)
+
+ ud.basecmd = d.getVar('FETCHCMD_svn', True)
+
+ ud.module = ud.parm["module"]
+
+ # Create paths to svn checkouts
+ relpath = self._strip_leading_slashes(ud.path)
+ ud.pkgdir = os.path.join(data.expand('${SVNDIR}', d), ud.host, relpath)
+ ud.moddir = os.path.join(ud.pkgdir, ud.module)
+
+ ud.setup_revisons(d)
+
+ if 'rev' in ud.parm:
+ ud.revision = ud.parm['rev']
+
+ ud.localfile = data.expand('%s_%s_%s_%s_.tar.gz' % (ud.module.replace('/', '.'), ud.host, ud.path.replace('/', '.'), ud.revision), d)
+
+ def _buildsvncommand(self, ud, d, command):
+ """
+ Build up an svn commandline based on ud
+ command is "fetch", "update", "info"
+ """
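+        # For example, "fetch" produces roughly:
+        #   svn co --no-auth-cache [-r REV] <proto>://<host><path>/<module>[@REV] <module>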
+
+ proto = ud.parm.get('protocol', 'svn')
+
+ svn_rsh = None
+ if proto == "svn+ssh" and "rsh" in ud.parm:
+ svn_rsh = ud.parm["rsh"]
+
+ svnroot = ud.host + ud.path
+
+ options = []
+
+ options.append("--no-auth-cache")
+
+ if ud.user:
+ options.append("--username %s" % ud.user)
+
+ if ud.pswd:
+ options.append("--password %s" % ud.pswd)
+
+ if command == "info":
+ svncmd = "%s info %s %s://%s/%s/" % (ud.basecmd, " ".join(options), proto, svnroot, ud.module)
+ elif command == "log1":
+ svncmd = "%s log --limit 1 %s %s://%s/%s/" % (ud.basecmd, " ".join(options), proto, svnroot, ud.module)
+ else:
+ suffix = ""
+ if ud.revision:
+ options.append("-r %s" % ud.revision)
+ suffix = "@%s" % (ud.revision)
+
+ if command == "fetch":
+ transportuser = ud.parm.get("transportuser", "")
+ svncmd = "%s co %s %s://%s%s/%s%s %s" % (ud.basecmd, " ".join(options), proto, transportuser, svnroot, ud.module, suffix, ud.module)
+ elif command == "update":
+ svncmd = "%s update %s" % (ud.basecmd, " ".join(options))
+ else:
+ raise FetchError("Invalid svn command %s" % command, ud.url)
+
+ if svn_rsh:
+ svncmd = "svn_RSH=\"%s\" %s" % (svn_rsh, svncmd)
+
+ return svncmd
+
+ def download(self, ud, d):
+ """Fetch url"""
+
+ logger.debug(2, "Fetch: checking for module directory '" + ud.moddir + "'")
+
+ if os.access(os.path.join(ud.moddir, '.svn'), os.R_OK):
+ svnupdatecmd = self._buildsvncommand(ud, d, "update")
+ logger.info("Update " + ud.url)
+ # update sources there
+ os.chdir(ud.moddir)
+            # We need to attempt to run svn upgrade first in case it's an older working copy format
+ try:
+ runfetchcmd(ud.basecmd + " upgrade", d)
+ except FetchError:
+ pass
+ logger.debug(1, "Running %s", svnupdatecmd)
+ bb.fetch2.check_network_access(d, svnupdatecmd, ud.url)
+ runfetchcmd(svnupdatecmd, d)
+ else:
+ svnfetchcmd = self._buildsvncommand(ud, d, "fetch")
+ logger.info("Fetch " + ud.url)
+ # check out sources there
+ bb.utils.mkdirhier(ud.pkgdir)
+ os.chdir(ud.pkgdir)
+ logger.debug(1, "Running %s", svnfetchcmd)
+ bb.fetch2.check_network_access(d, svnfetchcmd, ud.url)
+ runfetchcmd(svnfetchcmd, d)
+
+ scmdata = ud.parm.get("scmdata", "")
+ if scmdata == "keep":
+ tar_flags = ""
+ else:
+ tar_flags = "--exclude '.svn'"
+
+ os.chdir(ud.pkgdir)
+ # tar them up to a defined filename
+ runfetchcmd("tar %s -czf %s %s" % (tar_flags, ud.localpath, ud.module), d, cleanup = [ud.localpath])
+
+ def clean(self, ud, d):
+ """ Clean SVN specific files and dirs """
+
+ bb.utils.remove(ud.localpath)
+ bb.utils.remove(ud.moddir, True)
+
+
+ def supports_srcrev(self):
+ return True
+
+ def _revision_key(self, ud, d, name):
+ """
+ Return a unique key for the url
+ """
+ return "svn:" + ud.moddir
+
+ def _latest_revision(self, ud, d, name):
+ """
+ Return the latest upstream revision number
+ """
+ bb.fetch2.check_network_access(d, self._buildsvncommand(ud, d, "log1"))
+
+ output = runfetchcmd("LANG=C LC_ALL=C " + self._buildsvncommand(ud, d, "log1"), d, True)
+
+ # skip the first line, as per output of svn log
+ # then we expect the revision on the 2nd line
+ revision = re.search('^r([0-9]*)', output.splitlines()[1]).group(1)
+
+ return revision
+
+ def sortable_revision(self, ud, d, name):
+ """
+ Return a sortable revision number which in our case is the revision number
+ """
+
+ return False, self._build_revision(ud, d)
+
+ def _build_revision(self, ud, d):
+ return ud.revision
diff --git a/bitbake/lib/bb/fetch2/wget.py b/bitbake/lib/bb/fetch2/wget.py
new file mode 100644
index 0000000..bd2a897
--- /dev/null
+++ b/bitbake/lib/bb/fetch2/wget.py
@@ -0,0 +1,541 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'Fetch' implementations
+
+Classes for obtaining upstream sources for the
+BitBake build tools.
+
+"""
+
+# Copyright (C) 2003, 2004 Chris Larson
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+# Based on functions from the base bb module, Copyright 2003 Holger Schurig
+
+import re
+import tempfile
+import subprocess
+import os
+import logging
+import bb
+import urllib
+from bb import data
+from bb.fetch2 import FetchMethod
+from bb.fetch2 import FetchError
+from bb.fetch2 import logger
+from bb.fetch2 import runfetchcmd
+from bs4 import BeautifulSoup
+
+class Wget(FetchMethod):
+ """Class to fetch urls via 'wget'"""
+ def supports(self, ud, d):
+ """
+ Check to see if a given url can be fetched with wget.
+ """
+ return ud.type in ['http', 'https', 'ftp']
+
+ def recommends_checksum(self, urldata):
+ return True
+
+ def urldata_init(self, ud, d):
+ if 'protocol' in ud.parm:
+ if ud.parm['protocol'] == 'git':
+ raise bb.fetch2.ParameterError("Invalid protocol - if you wish to fetch from a git repository using http, you need to instead use the git:// prefix with protocol=http", ud.url)
+
+ if 'downloadfilename' in ud.parm:
+ ud.basename = ud.parm['downloadfilename']
+ else:
+ ud.basename = os.path.basename(ud.path)
+
+ ud.localfile = data.expand(urllib.unquote(ud.basename), d)
+
+ self.basecmd = d.getVar("FETCHCMD_wget", True) or "/usr/bin/env wget -t 2 -T 30 -nv --passive-ftp --no-check-certificate"
+
+ def _runwget(self, ud, d, command, quiet):
+
+ logger.debug(2, "Fetching %s using command '%s'" % (ud.url, command))
+ bb.fetch2.check_network_access(d, command)
+ runfetchcmd(command, d, quiet)
+
+ def download(self, ud, d):
+ """Fetch urls"""
+
+ fetchcmd = self.basecmd
+
+ if 'downloadfilename' in ud.parm:
+ dldir = d.getVar("DL_DIR", True)
+ bb.utils.mkdirhier(os.path.dirname(dldir + os.sep + ud.localfile))
+ fetchcmd += " -O " + dldir + os.sep + ud.localfile
+
+ uri = ud.url.split(";")[0]
+ if os.path.exists(ud.localpath):
+            # file exists, but we didn't complete it... trying again
+ fetchcmd += d.expand(" -c -P ${DL_DIR} '%s'" % uri)
+ else:
+ fetchcmd += d.expand(" -P ${DL_DIR} '%s'" % uri)
+
+ self._runwget(ud, d, fetchcmd, False)
+
+        # Sanity check since wget can pretend it succeeded when it didn't
+ # Also, this used to happen if sourceforge sent us to the mirror page
+ if not os.path.exists(ud.localpath):
+ raise FetchError("The fetch command returned success for url %s but %s doesn't exist?!" % (uri, ud.localpath), uri)
+
+ if os.path.getsize(ud.localpath) == 0:
+ os.remove(ud.localpath)
+ raise FetchError("The fetch of %s resulted in a zero size file?! Deleting and failing since this isn't right." % (uri), uri)
+
+ return True
+
+ def checkstatus(self, fetch, ud, d):
+ import urllib2, socket, httplib
+ from urllib import addinfourl
+ from bb.fetch2 import FetchConnectionCache
+
+ class HTTPConnectionCache(httplib.HTTPConnection):
+ if fetch.connection_cache:
+ def connect(self):
+ """Connect to the host and port specified in __init__."""
+
+ sock = fetch.connection_cache.get_connection(self.host, self.port)
+ if sock:
+ self.sock = sock
+ else:
+ self.sock = socket.create_connection((self.host, self.port),
+ self.timeout, self.source_address)
+ fetch.connection_cache.add_connection(self.host, self.port, self.sock)
+
+ if self._tunnel_host:
+ self._tunnel()
+
+ class CacheHTTPHandler(urllib2.HTTPHandler):
+ def http_open(self, req):
+ return self.do_open(HTTPConnectionCache, req)
+
+ def do_open(self, http_class, req):
+ """Return an addinfourl object for the request, using http_class.
+
+ http_class must implement the HTTPConnection API from httplib.
+ The addinfourl return value is a file-like object. It also
+ has methods and attributes including:
+ - info(): return a mimetools.Message object for the headers
+ - geturl(): return the original request URL
+ - code: HTTP status code
+ """
+ host = req.get_host()
+ if not host:
+                    raise urllib2.URLError('no host given')
+
+ h = http_class(host, timeout=req.timeout) # will parse host:port
+ h.set_debuglevel(self._debuglevel)
+
+ headers = dict(req.unredirected_hdrs)
+ headers.update(dict((k, v) for k, v in req.headers.items()
+ if k not in headers))
+
+ # We want to make an HTTP/1.1 request, but the addinfourl
+ # class isn't prepared to deal with a persistent connection.
+ # It will try to read all remaining data from the socket,
+ # which will block while the server waits for the next request.
+ # So make sure the connection gets closed after the (only)
+ # request.
+
+ # Don't close connection when connection_cache is enabled,
+ if fetch.connection_cache is None:
+ headers["Connection"] = "close"
+ else:
+ headers["Connection"] = "Keep-Alive" # Works for HTTP/1.0
+
+ headers = dict(
+ (name.title(), val) for name, val in headers.items())
+
+ if req._tunnel_host:
+ tunnel_headers = {}
+ proxy_auth_hdr = "Proxy-Authorization"
+ if proxy_auth_hdr in headers:
+ tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
+ # Proxy-Authorization should not be sent to origin
+ # server.
+ del headers[proxy_auth_hdr]
+ h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
+
+ try:
+ h.request(req.get_method(), req.get_selector(), req.data, headers)
+ except socket.error, err: # XXX what error?
+ # Don't close connection when cache is enabled.
+ if fetch.connection_cache is None:
+ h.close()
+ raise urllib2.URLError(err)
+ else:
+ try:
+ r = h.getresponse(buffering=True)
+ except TypeError: # buffering kw not supported
+ r = h.getresponse()
+
+ # Pick apart the HTTPResponse object to get the addinfourl
+ # object initialized properly.
+
+ # Wrap the HTTPResponse object in socket's file object adapter
+ # for Windows. That adapter calls recv(), so delegate recv()
+ # to read(). This weird wrapping allows the returned object to
+ # have readline() and readlines() methods.
+
+ # XXX It might be better to extract the read buffering code
+ # out of socket._fileobject() and into a base class.
+ r.recv = r.read
+
+ # no data, just have to read
+ r.read()
+ class fp_dummy(object):
+ def read(self):
+ return ""
+ def readline(self):
+ return ""
+ def close(self):
+ pass
+
+ resp = addinfourl(fp_dummy(), r.msg, req.get_full_url())
+ resp.code = r.status
+ resp.msg = r.reason
+
+ # Close connection when server request it.
+ if fetch.connection_cache is not None:
+ if 'Connection' in r.msg and r.msg['Connection'] == 'close':
+ fetch.connection_cache.remove_connection(h.host, h.port)
+
+ return resp
+
+ def export_proxies(d):
+ variables = ['http_proxy', 'HTTP_PROXY', 'https_proxy', 'HTTPS_PROXY',
+ 'ftp_proxy', 'FTP_PROXY', 'no_proxy', 'NO_PROXY']
+ exported = False
+
+ for v in variables:
+ if v in os.environ.keys():
+ exported = True
+ else:
+ v_proxy = d.getVar(v, True)
+ if v_proxy is not None:
+ os.environ[v] = v_proxy
+ exported = True
+
+ return exported
+
+ def head_method(self):
+ return "HEAD"
+
+ exported_proxies = export_proxies(d)
+
+ # XXX: Since Python 2.7.9 ssl cert validation is enabled by default
+ # see PEP-0476, this causes verification errors on some https servers
+ # so disable by default.
+ import ssl
+ ssl_context = None
+ if hasattr(ssl, '_create_unverified_context'):
+ ssl_context = ssl._create_unverified_context()
+
+ if exported_proxies == True and ssl_context is not None:
+ opener = urllib2.build_opener(urllib2.ProxyHandler, CacheHTTPHandler,
+ urllib2.HTTPSHandler(context=ssl_context))
+ elif exported_proxies == False and ssl_context is not None:
+ opener = urllib2.build_opener(CacheHTTPHandler,
+ urllib2.HTTPSHandler(context=ssl_context))
+ elif exported_proxies == True and ssl_context is None:
+ opener = urllib2.build_opener(urllib2.ProxyHandler, CacheHTTPHandler)
+ else:
+ opener = urllib2.build_opener(CacheHTTPHandler)
+
+ urllib2.Request.get_method = head_method
+ urllib2.install_opener(opener)
+
+ uri = ud.url.split(";")[0]
+
+ try:
+ urllib2.urlopen(uri)
+ except:
+ return False
+ return True
+
+ def _parse_path(self, regex, s):
+ """
+ Find and group name, version and archive type in the given string s
+ """
+
+ m = regex.search(s)
+ if m:
+ pname = ''
+ pver = ''
+ ptype = ''
+
+ mdict = m.groupdict()
+ if 'name' in mdict.keys():
+ pname = mdict['name']
+ if 'pver' in mdict.keys():
+ pver = mdict['pver']
+ if 'type' in mdict.keys():
+ ptype = mdict['type']
+
+ bb.debug(3, "_parse_path: %s, %s, %s" % (pname, pver, ptype))
+
+ return (pname, pver, ptype)
+
+ return None
+
+ def _modelate_version(self, version):
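+        # Normalise a version string for comparison: strip a leading separator or 'v', turn '-' and
+        # '_' into '.', and map the pre-release tags rc/beta/alpha to the sortable components .1000./.100./.10.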
+ if version[0] in ['.', '-']:
+ if version[1].isdigit():
+ version = version[1] + version[0] + version[2:len(version)]
+ else:
+ version = version[1:len(version)]
+
+ version = re.sub('-', '.', version)
+ version = re.sub('_', '.', version)
+ version = re.sub('(rc)+', '.1000.', version)
+ version = re.sub('(beta)+', '.100.', version)
+ version = re.sub('(alpha)+', '.10.', version)
+ if version[0] == 'v':
+ version = version[1:len(version)]
+ return version
+
+ def _vercmp(self, old, new):
+ """
+        Check whether 'new' is newer than the 'old' version. We use the existing vercmp() for
+        this purpose. PE is ignored in the comparison as it is not relevant to the build, and PR
+        is ignored too for simplicity, as it is hard to recover from the various upstream formats.
+ """
+
+ (oldpn, oldpv, oldsuffix) = old
+ (newpn, newpv, newsuffix) = new
+
+        # Check for a new suffix type that we have never heard of before
+ if (newsuffix):
+ m = self.suffix_regex_comp.search(newsuffix)
+ if not m:
+ bb.warn("%s has a possible unknown suffix: %s" % (newpn, newsuffix))
+ return False
+
+        # Not our package, so ignore it
+ if oldpn != newpn:
+ return False
+
+ oldpv = self._modelate_version(oldpv)
+ newpv = self._modelate_version(newpv)
+
+ return bb.utils.vercmp(("0", oldpv, ""), ("0", newpv, ""))
+
+ def _fetch_index(self, uri, ud, d):
+ """
+ Run fetch checkstatus to get directory information
+ """
+ f = tempfile.NamedTemporaryFile()
+
+ agent = "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.12) Gecko/20101027 Ubuntu/9.10 (karmic) Firefox/3.6.12"
+ fetchcmd = self.basecmd
+ fetchcmd += " -O " + f.name + " --user-agent='" + agent + "' '" + uri + "'"
+ try:
+ self._runwget(ud, d, fetchcmd, True)
+ fetchresult = f.read()
+ except bb.fetch2.BBFetchException:
+ fetchresult = ""
+
+ f.close()
+ return fetchresult
+
+ def _check_latest_version(self, url, package, package_regex, current_version, ud, d):
+ """
+ Return the latest version of a package inside a given directory path
+ If error or no version, return ""
+ """
+ valid = 0
+ version = ['', '', '']
+
+ bb.debug(3, "VersionURL: %s" % (url))
+ soup = BeautifulSoup(self._fetch_index(url, ud, d))
+ if not soup:
+ bb.debug(3, "*** %s NO SOUP" % (url))
+ return ""
+
+ for line in soup.find_all('a', href=True):
+ bb.debug(3, "line['href'] = '%s'" % (line['href']))
+ bb.debug(3, "line = '%s'" % (str(line)))
+
+ newver = self._parse_path(package_regex, line['href'])
+ if not newver:
+ newver = self._parse_path(package_regex, str(line))
+
+ if newver:
+ bb.debug(3, "Upstream version found: %s" % newver[1])
+ if valid == 0:
+ version = newver
+ valid = 1
+ elif self._vercmp(version, newver) < 0:
+ version = newver
+
+ pupver = re.sub('_', '.', version[1])
+
+ bb.debug(3, "*** %s -> UpstreamVersion = %s (CurrentVersion = %s)" %
+ (package, pupver or "N/A", current_version[1]))
+
+ if valid:
+ return pupver
+
+ return ""
+
+ def _check_latest_version_by_dir(self, dirver, package, package_regex,
+ current_version, ud, d):
+ """
+ Scan every directory in order to get upstream version.
+ """
+ version_dir = ['', '', '']
+ version = ['', '', '']
+
+ dirver_regex = re.compile("(\D*)((\d+[\.\-_])+(\d+))")
+ s = dirver_regex.search(dirver)
+ if s:
+ version_dir[1] = s.group(2)
+ else:
+ version_dir[1] = dirver
+
+ dirs_uri = bb.fetch.encodeurl([ud.type, ud.host,
+ ud.path.split(dirver)[0], ud.user, ud.pswd, {}])
+ bb.debug(3, "DirURL: %s, %s" % (dirs_uri, package))
+
+ soup = BeautifulSoup(self._fetch_index(dirs_uri, ud, d))
+ if not soup:
+ return version[1]
+
+ for line in soup.find_all('a', href=True):
+ s = dirver_regex.search(line['href'].strip("/"))
+ if s:
+ version_dir_new = ['', s.group(2), '']
+ if self._vercmp(version_dir, version_dir_new) <= 0:
+ dirver_new = s.group(1) + s.group(2)
+ path = ud.path.replace(dirver, dirver_new, True) \
+ .split(package)[0]
+ uri = bb.fetch.encodeurl([ud.type, ud.host, path,
+ ud.user, ud.pswd, {}])
+
+ pupver = self._check_latest_version(uri,
+ package, package_regex, current_version, ud, d)
+ if pupver:
+ version[1] = pupver
+
+ version_dir = version_dir_new
+
+ return version[1]
+
+ def _init_regexes(self, package, ud, d):
+ """
+ Match as many patterns as possible such as:
+ gnome-common-2.20.0.tar.gz (most common format)
+ gtk+-2.90.1.tar.gz
+ xf86-input-synaptics-12.6.9.tar.gz
+ dri2proto-2.3.tar.gz
+ blktool_4.orig.tar.gz
+ libid3tag-0.15.1b.tar.gz
+ unzip552.tar.gz
+ icu4c-3_6-src.tgz
+ genext2fs_1.3.orig.tar.gz
+ gst-fluendo-mp3
+ """
+        # match most patterns, which use "-" as the separator before the version digits
+ pn_prefix1 = "[a-zA-Z][a-zA-Z0-9]*([-_][a-zA-Z]\w+)*\+?[-_]"
+ # a loose pattern such as for unzip552.tar.gz
+ pn_prefix2 = "[a-zA-Z]+"
+ # a loose pattern such as for 80325-quicky-0.4.tar.gz
+ pn_prefix3 = "[0-9]+[-]?[a-zA-Z]+"
+ # Save the Package Name (pn) Regex for use later
+ pn_regex = "(%s|%s|%s)" % (pn_prefix1, pn_prefix2, pn_prefix3)
+
+ # match version
+ pver_regex = "(([A-Z]*\d+[a-zA-Z]*[\.\-_]*)+)"
+
+ # match arch
+ parch_regex = "-source|_all_"
+
+ # src.rpm extension was added only for rpm package. Can be removed if the rpm
+ # packaged will always be considered as having to be manually upgraded
+ psuffix_regex = "(tar\.gz|tgz|tar\.bz2|zip|xz|rpm|bz2|orig\.tar\.gz|tar\.xz|src\.tar\.gz|src\.tgz|svnr\d+\.tar\.bz2|stable\.tar\.gz|src\.rpm)"
+
+ # match name, version and archive type of a package
+ package_regex_comp = re.compile("(?P<name>%s?\.?v?)(?P<pver>%s)(?P<arch>%s)?[\.-](?P<type>%s$)"
+ % (pn_regex, pver_regex, parch_regex, psuffix_regex))
+ self.suffix_regex_comp = re.compile(psuffix_regex)
+
+ # compile regex, can be specific by package or generic regex
+ pn_regex = d.getVar('REGEX', True)
+ if pn_regex:
+ package_custom_regex_comp = re.compile(pn_regex)
+ else:
+ version = self._parse_path(package_regex_comp, package)
+ if version:
+ package_custom_regex_comp = re.compile(
+ "(?P<name>%s)(?P<pver>%s)(?P<arch>%s)?[\.-](?P<type>%s)" %
+ (re.escape(version[0]), pver_regex, parch_regex, psuffix_regex))
+ else:
+ package_custom_regex_comp = None
+
+ return package_custom_regex_comp
+
+ def latest_versionstring(self, ud, d):
+ """
+        Manipulate the URL and try to obtain the latest package version.
+
+        A sanity check ensures the candidate has the same name and archive type.
+ """
+ package = ud.path.split("/")[-1]
+ current_version = ['', d.getVar('PV', True), '']
+
+        # possible to have no version in pkg name, such as spectrum-fw
+ if not re.search("\d+", package):
+ current_version[1] = re.sub('_', '.', current_version[1])
+ current_version[1] = re.sub('-', '.', current_version[1])
+ return (current_version[1], '')
+
+ package_regex = self._init_regexes(package, ud, d)
+ if package_regex is None:
+            bb.warn("latest_versionstring: package %s doesn't match pattern" % (package))
+ return ('', '')
+ bb.debug(3, "latest_versionstring, regex: %s" % (package_regex.pattern))
+
+ uri = ""
+ regex_uri = d.getVar("REGEX_URI", True)
+ if not regex_uri:
+ path = ud.path.split(package)[0]
+
+ # search for version matches on folders inside the path, like:
+ # "5.7" in http://download.gnome.org/sources/${PN}/5.7/${PN}-${PV}.tar.gz
+ dirver_regex = re.compile("(?P<dirver>[^/]*(\d+\.)*\d+([-_]r\d+)*)/")
+ m = dirver_regex.search(path)
+ if m:
+ pn = d.getVar('PN', True)
+ dirver = m.group('dirver')
+
+ dirver_pn_regex = re.compile("%s\d?" % (re.escape(pn)))
+ if not dirver_pn_regex.search(dirver):
+ return (self._check_latest_version_by_dir(dirver,
+ package, package_regex, current_version, ud, d), '')
+
+ uri = bb.fetch.encodeurl([ud.type, ud.host, path, ud.user, ud.pswd, {}])
+ else:
+ uri = regex_uri
+
+ return (self._check_latest_version(uri, package, package_regex,
+ current_version, ud, d), '')
diff --git a/bitbake/lib/bb/main.py b/bitbake/lib/bb/main.py
new file mode 100755
index 0000000..8762f72
--- /dev/null
+++ b/bitbake/lib/bb/main.py
@@ -0,0 +1,427 @@
+#!/usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2003, 2004 Phil Blundell
+# Copyright (C) 2003 - 2005 Michael 'Mickey' Lauer
+# Copyright (C) 2005 Holger Hans Peter Freyther
+# Copyright (C) 2005 ROAD GmbH
+# Copyright (C) 2006 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os
+import sys
+import logging
+import optparse
+import warnings
+
+import bb
+from bb import event
+import bb.msg
+from bb import cooker
+from bb import ui
+from bb import server
+from bb import cookerdata
+
+logger = logging.getLogger("BitBake")
+
+class BBMainException(Exception):
+ pass
+
+def present_options(optionlist):
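+    # Join a list of option names into a human readable "a, b or c" string for help text.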
+ if len(optionlist) > 1:
+ return ' or '.join([', '.join(optionlist[:-1]), optionlist[-1]])
+ else:
+ return optionlist[0]
+
+class BitbakeHelpFormatter(optparse.IndentedHelpFormatter):
+ def format_option(self, option):
+ # We need to do this here rather than in the text we supply to
+ # add_option() because we don't want to call list_extension_modules()
+ # on every execution (since it imports all of the modules)
+ # Note also that we modify option.help rather than the returned text
+ # - this is so that we don't have to re-format the text ourselves
+ if option.dest == 'ui':
+ valid_uis = list_extension_modules(bb.ui, 'main')
+ option.help = option.help.replace('@CHOICES@', present_options(valid_uis))
+ elif option.dest == 'servertype':
+ valid_server_types = list_extension_modules(bb.server, 'BitBakeServer')
+ option.help = option.help.replace('@CHOICES@', present_options(valid_server_types))
+
+ return optparse.IndentedHelpFormatter.format_option(self, option)
+
+def list_extension_modules(pkg, checkattr):
+ """
+ Lists extension modules in a specific Python package
+ (e.g. UIs, servers). NOTE: Calling this function will import all of the
+ submodules of the specified module in order to check for the specified
+ attribute; this can have unusual side-effects. As a result, this should
+ only be called when displaying help text or error messages.
+ Parameters:
+ pkg: previously imported Python package to list
+ checkattr: attribute to look for in module to determine if it's valid
+ as the type of extension you are looking for
+ """
+ import pkgutil
+ pkgdir = os.path.dirname(pkg.__file__)
+
+ modules = []
+ for _, modulename, _ in pkgutil.iter_modules([pkgdir]):
+ if os.path.isdir(os.path.join(pkgdir, modulename)):
+ # ignore directories
+ continue
+ try:
+ module = __import__(pkg.__name__, fromlist=[modulename])
+ except:
+ # If we can't import it, it's not valid
+ continue
+ module_if = getattr(module, modulename)
+ if getattr(module_if, 'hidden_extension', False):
+ continue
+ if not checkattr or hasattr(module_if, checkattr):
+ modules.append(modulename)
+ return modules
+
+def import_extension_module(pkg, modulename, checkattr):
+ try:
+ # Dynamically load the UI based on the ui name. Although we
+ # suggest a fixed set this allows you to have flexibility in which
+ # ones are available.
+ module = __import__(pkg.__name__, fromlist = [modulename])
+ return getattr(module, modulename)
+ except AttributeError:
+ raise BBMainException('FATAL: Unable to import extension module "%s" from %s. Valid extension modules: %s' % (modulename, pkg.__name__, present_options(list_extension_modules(pkg, checkattr))))
+
+
+# Display bitbake/OE warnings via the BitBake.Warnings logger, ignoring others
+warnlog = logging.getLogger("BitBake.Warnings")
+_warnings_showwarning = warnings.showwarning
+def _showwarning(message, category, filename, lineno, file=None, line=None):
+ if file is not None:
+ if _warnings_showwarning is not None:
+ _warnings_showwarning(message, category, filename, lineno, file, line)
+ else:
+ s = warnings.formatwarning(message, category, filename, lineno)
+ warnlog.warn(s)
+
+warnings.showwarning = _showwarning
+warnings.filterwarnings("ignore")
+warnings.filterwarnings("default", module="(<string>$|(oe|bb)\.)")
+warnings.filterwarnings("ignore", category=PendingDeprecationWarning)
+warnings.filterwarnings("ignore", category=ImportWarning)
+warnings.filterwarnings("ignore", category=DeprecationWarning, module="<string>$")
+warnings.filterwarnings("ignore", message="With-statements now directly support multiple context managers")
+
+class BitBakeConfigParameters(cookerdata.ConfigParameters):
+
+ def parseCommandLine(self, argv=sys.argv):
+ parser = optparse.OptionParser(
+ formatter = BitbakeHelpFormatter(),
+ version = "BitBake Build Tool Core version %s" % bb.__version__,
+ usage = """%prog [options] [recipename/target recipe:do_task ...]
+
+ Executes the specified task (default is 'build') for a given set of target recipes (.bb files).
+ It is assumed there is a conf/bblayers.conf available in cwd or in BBPATH which
+ will provide the layer, BBFILES and other configuration information.""")
+
+ parser.add_option("-b", "--buildfile", help = "Execute tasks from a specific .bb recipe directly. WARNING: Does not handle any dependencies from other recipes.",
+ action = "store", dest = "buildfile", default = None)
+
+ parser.add_option("-k", "--continue", help = "Continue as much as possible after an error. While the target that failed and anything depending on it cannot be built, as much as possible will be built before stopping.",
+ action = "store_false", dest = "abort", default = True)
+
+ parser.add_option("-a", "--tryaltconfigs", help = "Continue with builds by trying to use alternative providers where possible.",
+ action = "store_true", dest = "tryaltconfigs", default = False)
+
+ parser.add_option("-f", "--force", help = "Force the specified targets/task to run (invalidating any existing stamp file).",
+ action = "store_true", dest = "force", default = False)
+
+ parser.add_option("-c", "--cmd", help = "Specify the task to execute. The exact options available depend on the metadata. Some examples might be 'compile' or 'populate_sysroot' or 'listtasks' may give a list of the tasks available.",
+ action = "store", dest = "cmd")
+
+ parser.add_option("-C", "--clear-stamp", help = "Invalidate the stamp for the specified task such as 'compile' and then run the default task for the specified target(s).",
+ action = "store", dest = "invalidate_stamp")
+
+ parser.add_option("-r", "--read", help = "Read the specified file before bitbake.conf.",
+ action = "append", dest = "prefile", default = [])
+
+ parser.add_option("-R", "--postread", help = "Read the specified file after bitbake.conf.",
+ action = "append", dest = "postfile", default = [])
+
+ parser.add_option("-v", "--verbose", help = "Output more log message data to the terminal.",
+ action = "store_true", dest = "verbose", default = False)
+
+ parser.add_option("-D", "--debug", help = "Increase the debug level. You can specify this more than once.",
+ action = "count", dest="debug", default = 0)
+
+ parser.add_option("-n", "--dry-run", help = "Don't execute, just go through the motions.",
+ action = "store_true", dest = "dry_run", default = False)
+
+ parser.add_option("-S", "--dump-signatures", help = "Dump out the signature construction information, with no task execution. The SIGNATURE_HANDLER parameter is passed to the handler. Two common values are none and printdiff but the handler may define more/less. none means only dump the signature, printdiff means compare the dumped signature with the cached one.",
+ action = "append", dest = "dump_signatures", default = [], metavar="SIGNATURE_HANDLER")
+
+ parser.add_option("-p", "--parse-only", help = "Quit after parsing the BB recipes.",
+ action = "store_true", dest = "parse_only", default = False)
+
+ parser.add_option("-s", "--show-versions", help = "Show current and preferred versions of all recipes.",
+ action = "store_true", dest = "show_versions", default = False)
+
+ parser.add_option("-e", "--environment", help = "Show the global or per-recipe environment complete with information about where variables were set/changed.",
+ action = "store_true", dest = "show_environment", default = False)
+
+ parser.add_option("-g", "--graphviz", help = "Save dependency tree information for the specified targets in the dot syntax.",
+ action = "store_true", dest = "dot_graph", default = False)
+
+ parser.add_option("-I", "--ignore-deps", help = """Assume these dependencies don't exist and are already provided (equivalent to ASSUME_PROVIDED). Useful to make dependency graphs more appealing""",
+ action = "append", dest = "extra_assume_provided", default = [])
+
+ parser.add_option("-l", "--log-domains", help = """Show debug logging for the specified logging domains""",
+ action = "append", dest = "debug_domains", default = [])
+
+ parser.add_option("-P", "--profile", help = "Profile the command and save reports.",
+ action = "store_true", dest = "profile", default = False)
+
+ env_ui = os.environ.get('BITBAKE_UI', None)
+ default_ui = env_ui or 'knotty'
+ # @CHOICES@ is substituted out by BitbakeHelpFormatter above
+ parser.add_option("-u", "--ui", help = "The user interface to use (@CHOICES@ - default %default).",
+ action="store", dest="ui", default=default_ui)
+
+ # @CHOICES@ is substituted out by BitbakeHelpFormatter above
+ parser.add_option("-t", "--servertype", help = "Choose which server type to use (@CHOICES@ - default %default).",
+ action = "store", dest = "servertype", default = "process")
+
+ parser.add_option("", "--token", help = "Specify the connection token to be used when connecting to a remote server.",
+ action = "store", dest = "xmlrpctoken")
+
+ parser.add_option("", "--revisions-changed", help = "Set the exit code depending on whether upstream floating revisions have changed or not.",
+ action = "store_true", dest = "revisions_changed", default = False)
+
+ parser.add_option("", "--server-only", help = "Run bitbake without a UI, only starting a server (cooker) process.",
+ action = "store_true", dest = "server_only", default = False)
+
+ parser.add_option("-B", "--bind", help = "The name/address for the bitbake server to bind to.",
+ action = "store", dest = "bind", default = False)
+
+ parser.add_option("", "--no-setscene", help = "Do not run any setscene tasks. sstate will be ignored and everything needed, built.",
+ action = "store_true", dest = "nosetscene", default = False)
+
+ parser.add_option("", "--remote-server", help = "Connect to the specified server.",
+ action = "store", dest = "remote_server", default = False)
+
+ parser.add_option("-m", "--kill-server", help = "Terminate the remote server.",
+ action = "store_true", dest = "kill_server", default = False)
+
+ parser.add_option("", "--observe-only", help = "Connect to a server as an observing-only client.",
+ action = "store_true", dest = "observe_only", default = False)
+
+ parser.add_option("", "--status-only", help = "Check the status of the remote bitbake server.",
+ action = "store_true", dest = "status_only", default = False)
+
+ parser.add_option("-w", "--write-log", help = "Writes the event log of the build to a bitbake event json file. Use '' (empty string) to assign the name automatically.",
+ action = "store", dest = "writeeventlog")
+
+ options, targets = parser.parse_args(argv)
+
+    # some environment variables also set configuration options
+ if "BBSERVER" in os.environ:
+ options.servertype = "xmlrpc"
+ options.remote_server = os.environ["BBSERVER"]
+
+ if "BBTOKEN" in os.environ:
+ options.xmlrpctoken = os.environ["BBTOKEN"]
+
+ if "BBEVENTLOG" is os.environ:
+ options.writeeventlog = os.environ["BBEVENTLOG"]
+
+ # fill in proper log name if not supplied
+ if options.writeeventlog is not None and len(options.writeeventlog) == 0:
+ import datetime
+ options.writeeventlog = "bitbake_eventlog_%s.json" % datetime.datetime.now().strftime("%Y%m%d%H%M%S")
+
+ # if BBSERVER says to autodetect, let's do that
+ if options.remote_server:
+ [host, port] = options.remote_server.split(":", 2)
+ port = int(port)
+        # If the port is set to -1, use an automatic port: read it from
+        # the bitbake.lock file. This is a bit tricky, but we always expect
+        # to be in the base of the build directory if we are to have a
+        # chance to start the server later, anyway
+ if port == -1:
+ lock_location = "./bitbake.lock"
+ # we try to read the address at all times; if the server is not started,
+ # we'll try to start it after the first connect fails, below
+ try:
+ lf = open(lock_location, 'r')
+ remotedef = lf.readline()
+ [host, port] = remotedef.split(":")
+ port = int(port)
+ lf.close()
+ options.remote_server = remotedef
+ except Exception as e:
+ raise BBMainException("Failed to read bitbake.lock (%s), invalid port" % str(e))
+
+ return options, targets[1:]
+
+
+def start_server(servermodule, configParams, configuration, features):
+ server = servermodule.BitBakeServer()
+ if configParams.bind:
+ (host, port) = configParams.bind.split(':')
+ server.initServer((host, int(port)))
+ configuration.interface = [ server.serverImpl.host, server.serverImpl.port ]
+ else:
+ server.initServer()
+ configuration.interface = []
+
+ try:
+ configuration.setServerRegIdleCallback(server.getServerIdleCB())
+
+ cooker = bb.cooker.BBCooker(configuration, features)
+
+ server.addcooker(cooker)
+ server.saveConnectionDetails()
+ except Exception as e:
+ exc_info = sys.exc_info()
+ while hasattr(server, "event_queue"):
+ try:
+ import queue
+ except ImportError:
+ import Queue as queue
+ try:
+ event = server.event_queue.get(block=False)
+ except (queue.Empty, IOError):
+ break
+ if isinstance(event, logging.LogRecord):
+ logger.handle(event)
+ raise exc_info[1], None, exc_info[2]
+ server.detach()
+ cooker.lock.close()
+ return server
+
+
+def bitbake_main(configParams, configuration):
+
+ # Python multiprocessing requires /dev/shm on Linux
+ if sys.platform.startswith('linux') and not os.access('/dev/shm', os.W_OK | os.X_OK):
+ raise BBMainException("FATAL: /dev/shm does not exist or is not writable")
+
+ # Unbuffer stdout to avoid log truncation in the event
+    # of an abrupt exit, as well as to provide timely
+ # updates to log files for use with tail
+ try:
+ if sys.stdout.name == '<stdout>':
+ sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)
+ except:
+ pass
+
+
+ configuration.setConfigParameters(configParams)
+
+ ui_module = import_extension_module(bb.ui, configParams.ui, 'main')
+ servermodule = import_extension_module(bb.server, configParams.servertype, 'BitBakeServer')
+
+ if configParams.server_only:
+ if configParams.servertype != "xmlrpc":
+ raise BBMainException("FATAL: If '--server-only' is defined, we must set the "
+ "servertype as 'xmlrpc'.\n")
+ if not configParams.bind:
+ raise BBMainException("FATAL: The '--server-only' option requires a name/address "
+ "to bind to with the -B option.\n")
+ if configParams.remote_server:
+ raise BBMainException("FATAL: The '--server-only' option conflicts with %s.\n" %
+ ("the BBSERVER environment variable" if "BBSERVER" in os.environ \
+ else "the '--remote-server' option" ))
+
+ if configParams.bind and configParams.servertype != "xmlrpc":
+ raise BBMainException("FATAL: If '-B' or '--bind' is defined, we must "
+ "set the servertype as 'xmlrpc'.\n")
+
+ if configParams.remote_server and configParams.servertype != "xmlrpc":
+ raise BBMainException("FATAL: If '--remote-server' is defined, we must "
+ "set the servertype as 'xmlrpc'.\n")
+
+ if configParams.observe_only and (not configParams.remote_server or configParams.bind):
+ raise BBMainException("FATAL: '--observe-only' can only be used by UI clients "
+ "connecting to a server.\n")
+
+ if configParams.kill_server and not configParams.remote_server:
+ raise BBMainException("FATAL: '--kill-server' can only be used to terminate a remote server")
+
+ if "BBDEBUG" in os.environ:
+ level = int(os.environ["BBDEBUG"])
+ if level > configuration.debug:
+ configuration.debug = level
+
+ bb.msg.init_msgconfig(configParams.verbose, configuration.debug,
+ configuration.debug_domains)
+
+ # Ensure logging messages get sent to the UI as events
+ handler = bb.event.LogHandler()
+ if not configParams.status_only:
+ # In status only mode there are no logs and no UI
+ logger.addHandler(handler)
+
+ # Clear away any spurious environment variables while we stoke up the cooker
+ cleanedvars = bb.utils.clean_environment()
+
+ featureset = []
+ if not configParams.server_only:
+ # Collect the feature set for the UI
+ featureset = getattr(ui_module, "featureSet", [])
+
+ if not configParams.remote_server:
+ # we start a server with a given configuration
+ server = start_server(servermodule, configParams, configuration, featureset)
+ bb.event.ui_queue = []
+ else:
+        # we start a stub server that is actually an XMLRPC client that connects to a real server
+ server = servermodule.BitBakeXMLRPCClient(configParams.observe_only, configParams.xmlrpctoken)
+ server.saveConnectionDetails(configParams.remote_server)
+
+
+ if not configParams.server_only:
+ try:
+ server_connection = server.establishConnection(featureset)
+ except Exception as e:
+ bb.fatal("Could not connect to server %s: %s" % (configParams.remote_server, str(e)))
+
+ # Restore the environment in case the UI needs it
+ for k in cleanedvars:
+ os.environ[k] = cleanedvars[k]
+
+ logger.removeHandler(handler)
+
+
+ if configParams.status_only:
+ server_connection.terminate()
+ return 0
+
+ if configParams.kill_server:
+ server_connection.connection.terminateServer()
+ bb.event.ui_queue = []
+ return 0
+
+ try:
+ return ui_module.main(server_connection.connection, server_connection.events, configParams)
+ finally:
+ bb.event.ui_queue = []
+ server_connection.terminate()
+ else:
+ print("Bitbake server address: %s, server port: %s" % (server.serverImpl.host, server.serverImpl.port))
+ return 0
+
+ return 1
diff --git a/bitbake/lib/bb/methodpool.py b/bitbake/lib/bb/methodpool.py
new file mode 100644
index 0000000..bf2e9f5
--- /dev/null
+++ b/bitbake/lib/bb/methodpool.py
@@ -0,0 +1,29 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+#
+# Copyright (C) 2006 Holger Hans Peter Freyther
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from bb.utils import better_compile, better_exec
+
+def insert_method(modulename, code, fn):
+ """
+ Add code of a module should be added. The methods
+ will be simply added, no checking will be done
+ """
+ comp = better_compile(code, modulename, fn )
+ better_exec(comp, None, code, fn)
+
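+# Illustrative usage (hypothetical module name and code; a sketch only):
+#   insert_method("my_module", "def do_hello(d):\n    bb.note('hello')\n", "my_module.bb")
+# The code is compiled with better_compile() and executed immediately with
+# better_exec(); no validation of the inserted methods is performed.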
diff --git a/bitbake/lib/bb/monitordisk.py b/bitbake/lib/bb/monitordisk.py
new file mode 100644
index 0000000..466523c
--- /dev/null
+++ b/bitbake/lib/bb/monitordisk.py
@@ -0,0 +1,263 @@
+#!/usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2012 Robert Yang
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os, logging, re, sys
+import bb
+logger = logging.getLogger("BitBake.Monitor")
+
+def printErr(info):
+ logger.error("%s\n Disk space monitor will NOT be enabled" % info)
+
+def convertGMK(unit):
+
+ """ Convert the space unit G, M, K, the unit is case-insensitive """
+
+ unitG = re.match('([1-9][0-9]*)[gG]\s?$', unit)
+ if unitG:
+ return int(unitG.group(1)) * (1024 ** 3)
+ unitM = re.match('([1-9][0-9]*)[mM]\s?$', unit)
+ if unitM:
+ return int(unitM.group(1)) * (1024 ** 2)
+ unitK = re.match('([1-9][0-9]*)[kK]\s?$', unit)
+ if unitK:
+ return int(unitK.group(1)) * 1024
+ unitN = re.match('([1-9][0-9]*)\s?$', unit)
+ if unitN:
+ return int(unitN.group(1))
+ else:
+ return None
+
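+# Worked examples (illustrative; derived from the regexes above):
+#   convertGMK("1G")   -> 1024 ** 3  (1073741824)
+#   convertGMK("512k") -> 512 * 1024 (524288)
+#   convertGMK("1024") -> 1024
+#   convertGMK("1.5G") -> None (only whole numbers with a leading 1-9 digit match)
+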
+def getMountedDev(path):
+
+ """ Get the device mounted at the path, uses /proc/mounts """
+
+ # Get the mount point of the filesystem containing path
+ # st_dev is the ID of device containing file
+ parentDev = os.stat(path).st_dev
+ currentDev = parentDev
+ # When the current directory's device is different from the
+ # parent's, then the current directory is a mount point
+ while parentDev == currentDev:
+ mountPoint = path
+ # Use dirname to get the parent's directory
+ path = os.path.dirname(path)
+ # Reach the "/"
+ if path == mountPoint:
+ break
+        parentDev = os.stat(path).st_dev
+
+ try:
+ with open("/proc/mounts", "r") as ifp:
+ for line in ifp:
+ procLines = line.rstrip('\n').split()
+ if procLines[1] == mountPoint:
+ return procLines[0]
+ except EnvironmentError:
+ pass
+ return None
+
+def getDiskData(BBDirs, configuration):
+
+ """Prepare disk data for disk space monitor"""
+
+ # Save the device IDs, need the ID to be unique (the dictionary's key is
+ # unique), so that when more than one directory is located on the same
+ # device, we just monitor it once
+ devDict = {}
+ for pathSpaceInode in BBDirs.split():
+        # The input format is: "action,dir,space,inode"; action and dir are
+        # required, space and inode are optional
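+        # Illustrative value in that format (hypothetical directories):
+        #   BB_DISKMON_DIRS = "STOPTASKS,${TMPDIR},1G,100K ABORT,${DL_DIR},100M,1K"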
+ pathSpaceInodeRe = re.match('([^,]*),([^,]*),([^,]*),?(.*)', pathSpaceInode)
+ if not pathSpaceInodeRe:
+ printErr("Invalid value in BB_DISKMON_DIRS: %s" % pathSpaceInode)
+ return None
+
+ action = pathSpaceInodeRe.group(1)
+ if action not in ("ABORT", "STOPTASKS", "WARN"):
+ printErr("Unknown disk space monitor action: %s" % action)
+ return None
+
+ path = os.path.realpath(pathSpaceInodeRe.group(2))
+ if not path:
+ printErr("Invalid path value in BB_DISKMON_DIRS: %s" % pathSpaceInode)
+ return None
+
+ # The disk space or inode is optional, but it should have a correct
+ # value once it is specified
+ minSpace = pathSpaceInodeRe.group(3)
+ if minSpace:
+ minSpace = convertGMK(minSpace)
+ if not minSpace:
+ printErr("Invalid disk space value in BB_DISKMON_DIRS: %s" % pathSpaceInodeRe.group(3))
+ return None
+ else:
+ # None means that it is not specified
+ minSpace = None
+
+ minInode = pathSpaceInodeRe.group(4)
+ if minInode:
+ minInode = convertGMK(minInode)
+ if not minInode:
+ printErr("Invalid inode value in BB_DISKMON_DIRS: %s" % pathSpaceInodeRe.group(4))
+ return None
+ else:
+ # None means that it is not specified
+ minInode = None
+
+ if minSpace is None and minInode is None:
+ printErr("No disk space or inode value in found BB_DISKMON_DIRS: %s" % pathSpaceInode)
+ return None
+ # mkdir for the directory since it may not exist, for example the
+ # DL_DIR may not exist at the very beginning
+ if not os.path.exists(path):
+ bb.utils.mkdirhier(path)
+ dev = getMountedDev(path)
+ # Use path/action as the key
+ devDict[os.path.join(path, action)] = [dev, minSpace, minInode]
+
+ return devDict
+
+def getInterval(configuration):
+
+ """ Get the disk space interval """
+
+    # The default values are 50M (space) and 5K (inodes).
+ spaceDefault = 50 * 1024 * 1024
+ inodeDefault = 5 * 1024
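+    # Illustrative setting, matching the "space,inode" format parsed below
+    # (assumed example, not taken from this file):
+    #   BB_DISKMON_WARNINTERVAL = "50M,5K"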
+
+ interval = configuration.getVar("BB_DISKMON_WARNINTERVAL", True)
+ if not interval:
+ return spaceDefault, inodeDefault
+ else:
+ # The disk space or inode interval is optional, but it should
+ # have a correct value once it is specified
+ intervalRe = re.match('([^,]*),?\s*(.*)', interval)
+ if intervalRe:
+ intervalSpace = intervalRe.group(1)
+ if intervalSpace:
+ intervalSpace = convertGMK(intervalSpace)
+ if not intervalSpace:
+ printErr("Invalid disk space interval value in BB_DISKMON_WARNINTERVAL: %s" % intervalRe.group(1))
+ return None, None
+ else:
+ intervalSpace = spaceDefault
+ intervalInode = intervalRe.group(2)
+ if intervalInode:
+ intervalInode = convertGMK(intervalInode)
+ if not intervalInode:
+ printErr("Invalid disk inode interval value in BB_DISKMON_WARNINTERVAL: %s" % intervalRe.group(2))
+ return None, None
+ else:
+ intervalInode = inodeDefault
+ return intervalSpace, intervalInode
+ else:
+ printErr("Invalid interval value in BB_DISKMON_WARNINTERVAL: %s" % interval)
+ return None, None
+
+class diskMonitor:
+
+ """Prepare the disk space monitor data"""
+
+ def __init__(self, configuration):
+
+ self.enableMonitor = False
+ self.configuration = configuration
+
+ BBDirs = configuration.getVar("BB_DISKMON_DIRS", True) or None
+ if BBDirs:
+ self.devDict = getDiskData(BBDirs, configuration)
+ if self.devDict:
+ self.spaceInterval, self.inodeInterval = getInterval(configuration)
+ if self.spaceInterval and self.inodeInterval:
+ self.enableMonitor = True
+                    # These are for saving the previous free disk space and inode
+                    # count; we use them to avoid printing too many warning messages
+ self.preFreeS = {}
+ self.preFreeI = {}
+ # This is for STOPTASKS and ABORT, to avoid printing the message
+ # repeatedly while waiting for the tasks to finish
+ self.checked = {}
+ for k in self.devDict:
+ self.preFreeS[k] = 0
+ self.preFreeI[k] = 0
+ self.checked[k] = False
+ if self.spaceInterval is None and self.inodeInterval is None:
+ self.enableMonitor = False
+
+ def check(self, rq):
+
+ """ Take action for the monitor """
+
+ if self.enableMonitor:
+ for k in self.devDict:
+ path = os.path.dirname(k)
+ action = os.path.basename(k)
+ dev = self.devDict[k][0]
+ minSpace = self.devDict[k][1]
+ minInode = self.devDict[k][2]
+
+ st = os.statvfs(path)
+
+                # The free space, in bytes (f_bavail * f_frsize)
+ freeSpace = st.f_bavail * st.f_frsize
+
+ if minSpace and freeSpace < minSpace:
+                    # Always show the warning; self.checked is always False if the action is WARN
+ if self.preFreeS[k] == 0 or self.preFreeS[k] - freeSpace > self.spaceInterval and not self.checked[k]:
+ logger.warn("The free space of %s (%s) is running low (%.3fGB left)" % \
+ (path, dev, freeSpace / 1024 / 1024 / 1024.0))
+ self.preFreeS[k] = freeSpace
+
+ if action == "STOPTASKS" and not self.checked[k]:
+ logger.error("No new tasks can be executed since the disk space monitor action is \"STOPTASKS\"!")
+ self.checked[k] = True
+ rq.finish_runqueue(False)
+ bb.event.fire(bb.event.DiskFull(dev, 'disk', freeSpace, path), self.configuration)
+ elif action == "ABORT" and not self.checked[k]:
+ logger.error("Immediately abort since the disk space monitor action is \"ABORT\"!")
+ self.checked[k] = True
+ rq.finish_runqueue(True)
+ bb.event.fire(bb.event.DiskFull(dev, 'disk', freeSpace, path), self.configuration)
+
+                # The number of free inodes (f_favail)
+ freeInode = st.f_favail
+
+ if minInode and freeInode < minInode:
+ # Some filesystems use dynamic inodes so can't run out
+ # (e.g. btrfs). This is reported by the inode count being 0.
+ if st.f_files == 0:
+ self.devDict[k][2] = None
+ continue
+                    # Always show the warning; self.checked is always False if the action is WARN
+ if self.preFreeI[k] == 0 or self.preFreeI[k] - freeInode > self.inodeInterval and not self.checked[k]:
+ logger.warn("The free inode of %s (%s) is running low (%.3fK left)" % \
+ (path, dev, freeInode / 1024.0))
+ self.preFreeI[k] = freeInode
+
+ if action == "STOPTASKS" and not self.checked[k]:
+ logger.error("No new tasks can be executed since the disk space monitor action is \"STOPTASKS\"!")
+ self.checked[k] = True
+ rq.finish_runqueue(False)
+ bb.event.fire(bb.event.DiskFull(dev, 'inode', freeInode, path), self.configuration)
+ elif action == "ABORT" and not self.checked[k]:
+ logger.error("Immediately abort since the disk space monitor action is \"ABORT\"!")
+ self.checked[k] = True
+ rq.finish_runqueue(True)
+ bb.event.fire(bb.event.DiskFull(dev, 'inode', freeInode, path), self.configuration)
+ return
diff --git a/bitbake/lib/bb/msg.py b/bitbake/lib/bb/msg.py
new file mode 100644
index 0000000..786b5ae
--- /dev/null
+++ b/bitbake/lib/bb/msg.py
@@ -0,0 +1,199 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'msg' implementation
+
+Message handling infrastructure for bitbake
+
+"""
+
+# Copyright (C) 2006 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import sys
+import copy
+import logging
+import collections
+from itertools import groupby
+import warnings
+import bb
+import bb.event
+
+class BBLogFormatter(logging.Formatter):
+ """Formatter which ensures that our 'plain' messages (logging.INFO + 1) are used as is"""
+
+ DEBUG3 = logging.DEBUG - 2
+ DEBUG2 = logging.DEBUG - 1
+ DEBUG = logging.DEBUG
+ VERBOSE = logging.INFO - 1
+ NOTE = logging.INFO
+ PLAIN = logging.INFO + 1
+ ERROR = logging.ERROR
+ WARNING = logging.WARNING
+ CRITICAL = logging.CRITICAL
+
+ levelnames = {
+ DEBUG3 : 'DEBUG',
+ DEBUG2 : 'DEBUG',
+ DEBUG : 'DEBUG',
+ VERBOSE: 'NOTE',
+ NOTE : 'NOTE',
+ PLAIN : '',
+ WARNING : 'WARNING',
+ ERROR : 'ERROR',
+ CRITICAL: 'ERROR',
+ }
+
+ color_enabled = False
+ BASECOLOR, BLACK, RED, GREEN, YELLOW, BLUE, MAGENTA, CYAN, WHITE = range(29,38)
+
+ COLORS = {
+ DEBUG3 : CYAN,
+ DEBUG2 : CYAN,
+ DEBUG : CYAN,
+ VERBOSE : BASECOLOR,
+ NOTE : BASECOLOR,
+ PLAIN : BASECOLOR,
+ WARNING : YELLOW,
+ ERROR : RED,
+ CRITICAL: RED,
+ }
+
+ BLD = '\033[1;%dm'
+ STD = '\033[%dm'
+ RST = '\033[0m'
+
+ def getLevelName(self, levelno):
+ try:
+ return self.levelnames[levelno]
+ except KeyError:
+ self.levelnames[levelno] = value = 'Level %d' % levelno
+ return value
+
+ def format(self, record):
+ record.levelname = self.getLevelName(record.levelno)
+ if record.levelno == self.PLAIN:
+ msg = record.getMessage()
+ else:
+ if self.color_enabled:
+ record = self.colorize(record)
+ msg = logging.Formatter.format(self, record)
+
+ if hasattr(record, 'bb_exc_info'):
+ etype, value, tb = record.bb_exc_info
+ formatted = bb.exceptions.format_exception(etype, value, tb, limit=5)
+ msg += '\n' + ''.join(formatted)
+ return msg
+
+ def colorize(self, record):
+ color = self.COLORS[record.levelno]
+ if self.color_enabled and color is not None:
+ record = copy.copy(record)
+ record.levelname = "".join([self.BLD % color, record.levelname, self.RST])
+ record.msg = "".join([self.STD % color, record.msg, self.RST])
+ return record
+
+ def enable_color(self):
+ self.color_enabled = True
+
+class BBLogFilter(object):
+ def __init__(self, handler, level, debug_domains):
+ self.stdlevel = level
+ self.debug_domains = debug_domains
+ loglevel = level
+ for domain in debug_domains:
+ if debug_domains[domain] < loglevel:
+ loglevel = debug_domains[domain]
+ handler.setLevel(loglevel)
+ handler.addFilter(self)
+
+ def filter(self, record):
+ if record.levelno >= self.stdlevel:
+ return True
+ if record.name in self.debug_domains and record.levelno >= self.debug_domains[record.name]:
+ return True
+ return False
+
+class BBLogFilterStdErr(BBLogFilter):
+ def filter(self, record):
+ if not BBLogFilter.filter(self, record):
+ return False
+ if record.levelno >= logging.ERROR:
+ return True
+ return False
+
+class BBLogFilterStdOut(BBLogFilter):
+ def filter(self, record):
+ if not BBLogFilter.filter(self, record):
+ return False
+ if record.levelno < logging.ERROR:
+ return True
+ return False
+
+# Message control functions
+#
+
+loggerDefaultDebugLevel = 0
+loggerDefaultVerbose = False
+loggerVerboseLogs = False
+loggerDefaultDomains = []
+
+def init_msgconfig(verbose, debug, debug_domains=None):
+ """
+    Set the default verbosity and debug levels and configure the logger
+ """
+ bb.msg.loggerDefaultDebugLevel = debug
+ bb.msg.loggerDefaultVerbose = verbose
+ if verbose:
+ bb.msg.loggerVerboseLogs = True
+ if debug_domains:
+ bb.msg.loggerDefaultDomains = debug_domains
+ else:
+ bb.msg.loggerDefaultDomains = []
+
+def constructLogOptions():
+ debug = loggerDefaultDebugLevel
+ verbose = loggerDefaultVerbose
+ domains = loggerDefaultDomains
+
+ if debug:
+ level = BBLogFormatter.DEBUG - debug + 1
+ elif verbose:
+ level = BBLogFormatter.VERBOSE
+ else:
+ level = BBLogFormatter.NOTE
+
+ debug_domains = {}
+ for (domainarg, iterator) in groupby(domains):
+ dlevel = len(tuple(iterator))
+ debug_domains["BitBake.%s" % domainarg] = logging.DEBUG - dlevel + 1
+ return level, debug_domains
+
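+# Illustrative mapping (derived from the arithmetic above): "-DD" (debug=2) gives
+# level == BBLogFormatter.DEBUG2, and passing "-l foo -l foo" enables the
+# "BitBake.foo" domain at logging.DEBUG - 1.
+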
+def addDefaultlogFilter(handler, cls = BBLogFilter):
+ level, debug_domains = constructLogOptions()
+
+ cls(handler, level, debug_domains)
+
+#
+# Message handling functions
+#
+
+def fatal(msgdomain, msg):
+ if msgdomain:
+ logger = logging.getLogger("BitBake.%s" % msgdomain)
+ else:
+ logger = logging.getLogger("BitBake")
+ logger.critical(msg)
+ sys.exit(1)
diff --git a/bitbake/lib/bb/namedtuple_with_abc.py b/bitbake/lib/bb/namedtuple_with_abc.py
new file mode 100644
index 0000000..32f2fc6
--- /dev/null
+++ b/bitbake/lib/bb/namedtuple_with_abc.py
@@ -0,0 +1,255 @@
+# http://code.activestate.com/recipes/577629-namedtupleabc-abstract-base-class-mix-in-for-named/
+#!/usr/bin/env python
+# Copyright (c) 2011 Jan Kaliszewski (zuo). Available under the MIT License.
+
+"""
+namedtuple_with_abc.py:
+* named tuple mix-in + ABC (abstract base class) recipe,
+* works under Python 2.6, 2.7 as well as 3.x.
+
+Import this module to patch collections.namedtuple() factory function
+-- enriching it with the 'abc' attribute (an abstract base class + mix-in
+for named tuples) and decorating it with a wrapper that registers each
+newly created named tuple as a subclass of namedtuple.abc.
+
+How to import:
+ import collections, namedtuple_with_abc
+or:
+ import namedtuple_with_abc
+ from collections import namedtuple
+ # ^ in this variant you must import namedtuple function
+ # *after* importing namedtuple_with_abc module
+or simply:
+ from namedtuple_with_abc import namedtuple
+
+Simple usage example:
+ class Credentials(namedtuple.abc):
+ _fields = 'username password'
+ def __str__(self):
+ return ('{0.__class__.__name__}'
+ '(username={0.username}, password=...)'.format(self))
+ print(Credentials("alice", "Alice's password"))
+
+For more advanced examples -- see below the "if __name__ == '__main__':".
+"""
+
+import collections
+from abc import ABCMeta, abstractproperty
+from functools import wraps
+from sys import version_info
+
+__all__ = ('namedtuple',)
+_namedtuple = collections.namedtuple
+
+
+class _NamedTupleABCMeta(ABCMeta):
+ '''The metaclass for the abstract base class + mix-in for named tuples.'''
+ def __new__(mcls, name, bases, namespace):
+ fields = namespace.get('_fields')
+ for base in bases:
+ if fields is not None:
+ break
+ fields = getattr(base, '_fields', None)
+ if not isinstance(fields, abstractproperty):
+ basetuple = _namedtuple(name, fields)
+ bases = (basetuple,) + bases
+ namespace.pop('_fields', None)
+ namespace.setdefault('__doc__', basetuple.__doc__)
+ namespace.setdefault('__slots__', ())
+ return ABCMeta.__new__(mcls, name, bases, namespace)
+
+
+exec(
+ # Python 2.x metaclass declaration syntax
+ """class _NamedTupleABC(object):
+ '''The abstract base class + mix-in for named tuples.'''
+ __metaclass__ = _NamedTupleABCMeta
+ _fields = abstractproperty()""" if version_info[0] < 3 else
+ # Python 3.x metaclass declaration syntax
+ """class _NamedTupleABC(metaclass=_NamedTupleABCMeta):
+ '''The abstract base class + mix-in for named tuples.'''
+ _fields = abstractproperty()"""
+)
+
+
+_namedtuple.abc = _NamedTupleABC
+#_NamedTupleABC.register(type(version_info)) # (and similar, in the future...)
+
+@wraps(_namedtuple)
+def namedtuple(*args, **kwargs):
+ '''Named tuple factory with namedtuple.abc subclass registration.'''
+ cls = _namedtuple(*args, **kwargs)
+ _NamedTupleABC.register(cls)
+ return cls
+
+collections.namedtuple = namedtuple
+
+
+
+
+if __name__ == '__main__':
+
+ '''Examples and explanations'''
+
+ # Simple usage
+
+ class MyRecord(namedtuple.abc):
+ _fields = 'x y z' # such form will be transformed into ('x', 'y', 'z')
+ def _my_custom_method(self):
+ return list(self._asdict().items())
+ # (the '_fields' attribute belongs to the named tuple public API anyway)
+
+ rec = MyRecord(1, 2, 3)
+ print(rec)
+ print(rec._my_custom_method())
+ print(rec._replace(y=222))
+ print(rec._replace(y=222)._my_custom_method())
+
+ # Custom abstract classes...
+
+ class MyAbstractRecord(namedtuple.abc):
+ def _my_custom_method(self):
+ return list(self._asdict().items())
+
+ try:
+ MyAbstractRecord() # (abstract classes cannot be instantiated)
+ except TypeError as exc:
+ print(exc)
+
+ class AnotherAbstractRecord(MyAbstractRecord):
+ def __str__(self):
+ return '<<<{0}>>>'.format(super(AnotherAbstractRecord,
+ self).__str__())
+
+ # ...and their non-abstract subclasses
+
+ class MyRecord2(MyAbstractRecord):
+ _fields = 'a, b'
+
+ class MyRecord3(AnotherAbstractRecord):
+ _fields = 'p', 'q', 'r'
+
+ rec2 = MyRecord2('foo', 'bar')
+ print(rec2)
+ print(rec2._my_custom_method())
+ print(rec2._replace(b=222))
+ print(rec2._replace(b=222)._my_custom_method())
+
+ rec3 = MyRecord3('foo', 'bar', 'baz')
+ print(rec3)
+ print(rec3._my_custom_method())
+ print(rec3._replace(q=222))
+ print(rec3._replace(q=222)._my_custom_method())
+
+ # You can also subclass non-abstract ones...
+
+ class MyRecord33(MyRecord3):
+ def __str__(self):
+            return '< {0!r}, ..., {1!r} >'.format(self.p, self.r)
+
+ rec33 = MyRecord33('foo', 'bar', 'baz')
+ print(rec33)
+ print(rec33._my_custom_method())
+ print(rec33._replace(q=222))
+ print(rec33._replace(q=222)._my_custom_method())
+
+ # ...and even override the magic '_fields' attribute again
+
+ class MyRecord345(MyRecord3):
+ _fields = 'e f g h i j k'
+
+ rec345 = MyRecord345(1, 2, 3, 4, 3, 2, 1)
+ print(rec345)
+ print(rec345._my_custom_method())
+ print(rec345._replace(f=222))
+ print(rec345._replace(f=222)._my_custom_method())
+
+ # Mixing-in some other classes is also possible:
+
+ class MyMixIn(object):
+ def method(self):
+ return "MyMixIn.method() called"
+ def _my_custom_method(self):
+ return "MyMixIn._my_custom_method() called"
+ def count(self, item):
+ return "MyMixIn.count({0}) called".format(item)
+ def _asdict(self): # (cannot override a namedtuple method, see below)
+ return "MyMixIn._asdict() called"
+
+ class MyRecord4(MyRecord33, MyMixIn): # mix-in on the right
+ _fields = 'j k l x'
+
+ class MyRecord5(MyMixIn, MyRecord33): # mix-in on the left
+ _fields = 'j k l x y'
+
+ rec4 = MyRecord4(1, 2, 3, 2)
+ print(rec4)
+ print(rec4.method())
+ print(rec4._my_custom_method()) # MyRecord33's
+ print(rec4.count(2)) # tuple's
+ print(rec4._replace(k=222))
+ print(rec4._replace(k=222).method())
+ print(rec4._replace(k=222)._my_custom_method()) # MyRecord33's
+ print(rec4._replace(k=222).count(8)) # tuple's
+
+ rec5 = MyRecord5(1, 2, 3, 2, 1)
+ print(rec5)
+ print(rec5.method())
+ print(rec5._my_custom_method()) # MyMixIn's
+ print(rec5.count(2)) # MyMixIn's
+ print(rec5._replace(k=222))
+ print(rec5._replace(k=222).method())
+ print(rec5._replace(k=222)._my_custom_method()) # MyMixIn's
+ print(rec5._replace(k=222).count(2)) # MyMixIn's
+
+ # Note that behavior: the standard namedtuple methods cannot be
+ # overridden by a foreign mix-in -- even if the mix-in is declared
+ # as the leftmost base class (but, obviously, you can override them
+ # in the defined class or its subclasses):
+
+ print(rec4._asdict()) # (returns a dict, not "MyMixIn._asdict() called")
+ print(rec5._asdict()) # (returns a dict, not "MyMixIn._asdict() called")
+
+ class MyRecord6(MyRecord33):
+ _fields = 'j k l x y z'
+ def _asdict(self):
+ return "MyRecord6._asdict() called"
+ rec6 = MyRecord6(1, 2, 3, 1, 2, 3)
+ print(rec6._asdict()) # (this returns "MyRecord6._asdict() called")
+
+ # All that record classes are real subclasses of namedtuple.abc:
+
+ assert issubclass(MyRecord, namedtuple.abc)
+ assert issubclass(MyAbstractRecord, namedtuple.abc)
+ assert issubclass(AnotherAbstractRecord, namedtuple.abc)
+ assert issubclass(MyRecord2, namedtuple.abc)
+ assert issubclass(MyRecord3, namedtuple.abc)
+ assert issubclass(MyRecord33, namedtuple.abc)
+ assert issubclass(MyRecord345, namedtuple.abc)
+ assert issubclass(MyRecord4, namedtuple.abc)
+ assert issubclass(MyRecord5, namedtuple.abc)
+ assert issubclass(MyRecord6, namedtuple.abc)
+
+ # ...but abstract ones are not subclasses of tuple
+ # (and this is what you probably want):
+
+ assert not issubclass(MyAbstractRecord, tuple)
+ assert not issubclass(AnotherAbstractRecord, tuple)
+
+ assert issubclass(MyRecord, tuple)
+ assert issubclass(MyRecord2, tuple)
+ assert issubclass(MyRecord3, tuple)
+ assert issubclass(MyRecord33, tuple)
+ assert issubclass(MyRecord345, tuple)
+ assert issubclass(MyRecord4, tuple)
+ assert issubclass(MyRecord5, tuple)
+ assert issubclass(MyRecord6, tuple)
+
+ # Named tuple classes created with namedtuple() factory function
+ # (in the "traditional" way) are registered as "virtual" subclasses
+ # of namedtuple.abc:
+
+ MyTuple = namedtuple('MyTuple', 'a b c')
+ mt = MyTuple(1, 2, 3)
+ assert issubclass(MyTuple, namedtuple.abc)
+ assert isinstance(mt, namedtuple.abc)
diff --git a/bitbake/lib/bb/parse/__init__.py b/bitbake/lib/bb/parse/__init__.py
new file mode 100644
index 0000000..67ec71f
--- /dev/null
+++ b/bitbake/lib/bb/parse/__init__.py
@@ -0,0 +1,170 @@
+"""
+BitBake Parsers
+
+File parsers for the BitBake build tools.
+
+"""
+
+
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2003, 2004 Phil Blundell
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+# Based on functions from the base bb module, Copyright 2003 Holger Schurig
+
+handlers = []
+
+import errno
+import logging
+import os
+import stat
+import bb
+import bb.utils
+import bb.siggen
+
+logger = logging.getLogger("BitBake.Parsing")
+
+class ParseError(Exception):
+ """Exception raised when parsing fails"""
+ def __init__(self, msg, filename, lineno=0):
+ self.msg = msg
+ self.filename = filename
+ self.lineno = lineno
+ Exception.__init__(self, msg, filename, lineno)
+
+ def __str__(self):
+ if self.lineno:
+ return "ParseError at %s:%d: %s" % (self.filename, self.lineno, self.msg)
+ else:
+ return "ParseError in %s: %s" % (self.filename, self.msg)
+
+class SkipRecipe(Exception):
+ """Exception raised to skip this recipe"""
+
+class SkipPackage(SkipRecipe):
+ """Exception raised to skip this recipe (use SkipRecipe in new code)"""
+
+__mtime_cache = {}
+def cached_mtime(f):
+ if f not in __mtime_cache:
+ __mtime_cache[f] = os.stat(f)[stat.ST_MTIME]
+ return __mtime_cache[f]
+
+def cached_mtime_noerror(f):
+ if f not in __mtime_cache:
+ try:
+ __mtime_cache[f] = os.stat(f)[stat.ST_MTIME]
+ except OSError:
+ return 0
+ return __mtime_cache[f]
+
+def update_mtime(f):
+ try:
+ __mtime_cache[f] = os.stat(f)[stat.ST_MTIME]
+ except OSError:
+ if f in __mtime_cache:
+ del __mtime_cache[f]
+ return 0
+ return __mtime_cache[f]
+
+def update_cache(f):
+ if f in __mtime_cache:
+ logger.debug(1, "Updating mtime cache for %s" % f)
+ update_mtime(f)
+
+def mark_dependency(d, f):
+ if f.startswith('./'):
+ f = "%s/%s" % (os.getcwd(), f[2:])
+ deps = (d.getVar('__depends', False) or [])
+ s = (f, cached_mtime_noerror(f))
+ if s not in deps:
+ deps.append(s)
+ d.setVar('__depends', deps)
+
+def check_dependency(d, f):
+ s = (f, cached_mtime_noerror(f))
+ deps = (d.getVar('__depends', False) or [])
+ return s in deps
+
+def supports(fn, data):
+ """Returns true if we have a handler for this file, false otherwise"""
+ for h in handlers:
+ if h['supports'](fn, data):
+ return 1
+ return 0
+
+def handle(fn, data, include = 0):
+ """Call the handler that is appropriate for this file"""
+ for h in handlers:
+ if h['supports'](fn, data):
+ with data.inchistory.include(fn):
+ return h['handle'](fn, data, include)
+ raise ParseError("not a BitBake file", fn)
+
+def init(fn, data):
+ for h in handlers:
+ if h['supports'](fn):
+ return h['init'](data)
+
+def init_parser(d):
+ bb.parse.siggen = bb.siggen.init(d)
+
+def resolve_file(fn, d):
+ if not os.path.isabs(fn):
+ bbpath = d.getVar("BBPATH", True)
+ newfn, attempts = bb.utils.which(bbpath, fn, history=True)
+ for af in attempts:
+ mark_dependency(d, af)
+ if not newfn:
+ raise IOError(errno.ENOENT, "file %s not found in %s" % (fn, bbpath))
+ fn = newfn
+
+ mark_dependency(d, fn)
+ if not os.path.isfile(fn):
+ raise IOError(errno.ENOENT, "file %s not found" % fn)
+
+ return fn
+
+# Used by OpenEmbedded metadata
+__pkgsplit_cache__={}
+def vars_from_file(mypkg, d):
+ if not mypkg or not mypkg.endswith((".bb", ".bbappend")):
+ return (None, None, None)
+ if mypkg in __pkgsplit_cache__:
+ return __pkgsplit_cache__[mypkg]
+
+ myfile = os.path.splitext(os.path.basename(mypkg))
+ parts = myfile[0].split('_')
+ __pkgsplit_cache__[mypkg] = parts
+ if len(parts) > 3:
+ raise ParseError("Unable to generate default variables from filename (too many underscores)", mypkg)
+ exp = 3 - len(parts)
+ tmplist = []
+ while exp != 0:
+ exp -= 1
+ tmplist.append(None)
+ parts.extend(tmplist)
+ return parts
+
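+# Illustrative results of the splitting above (filenames are hypothetical):
+#   vars_from_file("foo_1.0.bb", d)       -> ['foo', '1.0', None]
+#   vars_from_file("foo_1.0_r0.bb", d)    -> ['foo', '1.0', 'r0']
+#   vars_from_file("busybox.bbappend", d) -> ['busybox', None, None]
+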
+def get_file_depends(d):
+ '''Return the dependent files'''
+ dep_files = []
+ depends = d.getVar('__base_depends', True) or []
+ depends = depends + (d.getVar('__depends', True) or [])
+ for (fn, _) in depends:
+ dep_files.append(os.path.abspath(fn))
+ return " ".join(dep_files)
+
+from bb.parse.parse_py import __version__, ConfHandler, BBHandler
diff --git a/bitbake/lib/bb/parse/ast.py b/bitbake/lib/bb/parse/ast.py
new file mode 100644
index 0000000..11db180
--- /dev/null
+++ b/bitbake/lib/bb/parse/ast.py
@@ -0,0 +1,481 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+ AbstractSyntaxTree classes for the Bitbake language
+"""
+
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2003, 2004 Phil Blundell
+# Copyright (C) 2009 Holger Hans Peter Freyther
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from __future__ import absolute_import
+from future_builtins import filter
+import re
+import string
+import logging
+import bb
+import itertools
+from bb import methodpool
+from bb.parse import logger
+
+_bbversions_re = re.compile(r"\[(?P<from>[0-9]+)-(?P<to>[0-9]+)\]")
+
+class StatementGroup(list):
+ def eval(self, data):
+ for statement in self:
+ statement.eval(data)
+
+class AstNode(object):
+ def __init__(self, filename, lineno):
+ self.filename = filename
+ self.lineno = lineno
+
+class IncludeNode(AstNode):
+ def __init__(self, filename, lineno, what_file, force):
+ AstNode.__init__(self, filename, lineno)
+ self.what_file = what_file
+ self.force = force
+
+ def eval(self, data):
+ """
+ Include the file and evaluate the statements
+ """
+ s = data.expand(self.what_file)
+ logger.debug(2, "CONF %s:%s: including %s", self.filename, self.lineno, s)
+
+ # TODO: Cache those includes... maybe not here though
+ if self.force:
+ bb.parse.ConfHandler.include(self.filename, s, self.lineno, data, "include required")
+ else:
+ bb.parse.ConfHandler.include(self.filename, s, self.lineno, data, False)
+
+class ExportNode(AstNode):
+ def __init__(self, filename, lineno, var):
+ AstNode.__init__(self, filename, lineno)
+ self.var = var
+
+ def eval(self, data):
+ data.setVarFlag(self.var, "export", 1, op = 'exported')
+
+class DataNode(AstNode):
+ """
+ Various data related updates. For the sake of sanity
+ we have one class doing all this. This means that all
+    of this needs to be re-evaluated... we might be able to do
+ that faster with multiple classes.
+ """
+ def __init__(self, filename, lineno, groupd):
+ AstNode.__init__(self, filename, lineno)
+ self.groupd = groupd
+
+ def getFunc(self, key, data):
+ if 'flag' in self.groupd and self.groupd['flag'] != None:
+ return data.getVarFlag(key, self.groupd['flag'], noweakdefault=True)
+ else:
+ return data.getVar(key, False, noweakdefault=True, parsing=True)
+
+ def eval(self, data):
+ groupd = self.groupd
+ key = groupd["var"]
+ loginfo = {
+ 'variable': key,
+ 'file': self.filename,
+ 'line': self.lineno,
+ }
+ if "exp" in groupd and groupd["exp"] != None:
+ data.setVarFlag(key, "export", 1, op = 'exported', **loginfo)
+
+ op = "set"
+ if "ques" in groupd and groupd["ques"] != None:
+ val = self.getFunc(key, data)
+ op = "set?"
+ if val == None:
+ val = groupd["value"]
+ elif "colon" in groupd and groupd["colon"] != None:
+ e = data.createCopy()
+ bb.data.update_data(e)
+ op = "immediate"
+ val = e.expand(groupd["value"], key + "[:=]")
+ elif "append" in groupd and groupd["append"] != None:
+ op = "append"
+ val = "%s %s" % ((self.getFunc(key, data) or ""), groupd["value"])
+ elif "prepend" in groupd and groupd["prepend"] != None:
+ op = "prepend"
+ val = "%s %s" % (groupd["value"], (self.getFunc(key, data) or ""))
+ elif "postdot" in groupd and groupd["postdot"] != None:
+ op = "postdot"
+ val = "%s%s" % ((self.getFunc(key, data) or ""), groupd["value"])
+ elif "predot" in groupd and groupd["predot"] != None:
+ op = "predot"
+ val = "%s%s" % (groupd["value"], (self.getFunc(key, data) or ""))
+ else:
+ val = groupd["value"]
+
+ flag = None
+ if 'flag' in groupd and groupd['flag'] != None:
+ flag = groupd['flag']
+ elif groupd["lazyques"]:
+ flag = "_defaultval"
+
+ loginfo['op'] = op
+ loginfo['detail'] = groupd["value"]
+
+ if flag:
+ data.setVarFlag(key, flag, val, **loginfo)
+ else:
+ data.setVar(key, val, parsing=True, **loginfo)
+
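+# For reference, the BitBake assignment operators that drive the branches above
+# (standard BitBake syntax, listed here only as a reminder):
+#   FOO = "v"   -> set         FOO ?= "v" -> set? (only if currently unset)
+#   FOO ??= "v" -> weak default (stored in the "_defaultval" flag)
+#   FOO := "v"  -> immediate   FOO += "v" -> append    FOO =+ "v" -> prepend
+#   FOO .= "v"  -> postdot     FOO =. "v" -> predot
+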
+class MethodNode(AstNode):
+ tr_tbl = string.maketrans('/.+-@%&', '_______')
+
+ def __init__(self, filename, lineno, func_name, body):
+ AstNode.__init__(self, filename, lineno)
+ self.func_name = func_name
+ self.body = body
+
+ def eval(self, data):
+ text = '\n'.join(self.body)
+ if self.func_name == "__anonymous":
+ funcname = ("__anon_%s_%s" % (self.lineno, self.filename.translate(MethodNode.tr_tbl)))
+ text = "def %s(d):\n" % (funcname) + text
+ bb.methodpool.insert_method(funcname, text, self.filename)
+ anonfuncs = data.getVar('__BBANONFUNCS', False) or []
+ anonfuncs.append(funcname)
+ data.setVar('__BBANONFUNCS', anonfuncs)
+ data.setVar(funcname, text, parsing=True)
+ else:
+ data.setVarFlag(self.func_name, "func", 1)
+ data.setVar(self.func_name, text, parsing=True)
+
+class PythonMethodNode(AstNode):
+ def __init__(self, filename, lineno, function, modulename, body):
+ AstNode.__init__(self, filename, lineno)
+ self.function = function
+ self.modulename = modulename
+ self.body = body
+
+ def eval(self, data):
+        # Note we will add root to parsedmethods after having parsed
+ # 'this' file. This means we will not parse methods from
+ # bb classes twice
+ text = '\n'.join(self.body)
+ bb.methodpool.insert_method(self.modulename, text, self.filename)
+ data.setVarFlag(self.function, "func", 1)
+ data.setVarFlag(self.function, "python", 1)
+ data.setVar(self.function, text, parsing=True)
+
+class MethodFlagsNode(AstNode):
+ def __init__(self, filename, lineno, key, m):
+ AstNode.__init__(self, filename, lineno)
+ self.key = key
+ self.m = m
+
+ def eval(self, data):
+ if data.getVar(self.key, False):
+ # clean up old version of this piece of metadata, as its
+ # flags could cause problems
+ data.setVarFlag(self.key, 'python', None)
+ data.setVarFlag(self.key, 'fakeroot', None)
+ if self.m.group("py") is not None:
+ data.setVarFlag(self.key, "python", "1")
+ else:
+ data.delVarFlag(self.key, "python")
+ if self.m.group("fr") is not None:
+ data.setVarFlag(self.key, "fakeroot", "1")
+ else:
+ data.delVarFlag(self.key, "fakeroot")
+
+class ExportFuncsNode(AstNode):
+ def __init__(self, filename, lineno, fns, classname):
+ AstNode.__init__(self, filename, lineno)
+ self.n = fns.split()
+ self.classname = classname
+
+ def eval(self, data):
+
+ for func in self.n:
+ calledfunc = self.classname + "_" + func
+
+ if data.getVar(func, False) and not data.getVarFlag(func, 'export_func'):
+ continue
+
+ if data.getVar(func, False):
+ data.setVarFlag(func, 'python', None)
+ data.setVarFlag(func, 'func', None)
+
+ for flag in [ "func", "python" ]:
+ if data.getVarFlag(calledfunc, flag):
+ data.setVarFlag(func, flag, data.getVarFlag(calledfunc, flag))
+ for flag in [ "dirs" ]:
+ if data.getVarFlag(func, flag):
+ data.setVarFlag(calledfunc, flag, data.getVarFlag(func, flag))
+
+ if data.getVarFlag(calledfunc, "python"):
+ data.setVar(func, " bb.build.exec_func('" + calledfunc + "', d)\n", parsing=True)
+ else:
+ if "-" in self.classname:
+ bb.fatal("The classname %s contains a dash character and is calling an sh function %s using EXPORT_FUNCTIONS. Since a dash is illegal in sh function names, this cannot work, please rename the class or don't use EXPORT_FUNCTIONS." % (self.classname, calledfunc))
+ data.setVar(func, " " + calledfunc + "\n", parsing=True)
+ data.setVarFlag(func, 'export_func', '1')
+
+class AddTaskNode(AstNode):
+ def __init__(self, filename, lineno, func, before, after):
+ AstNode.__init__(self, filename, lineno)
+ self.func = func
+ self.before = before
+ self.after = after
+
+ def eval(self, data):
+ bb.build.addtask(self.func, self.before, self.after, data)
+
+class DelTaskNode(AstNode):
+ def __init__(self, filename, lineno, func):
+ AstNode.__init__(self, filename, lineno)
+ self.func = func
+
+ def eval(self, data):
+ bb.build.deltask(self.func, data)
+
+class BBHandlerNode(AstNode):
+ def __init__(self, filename, lineno, fns):
+ AstNode.__init__(self, filename, lineno)
+ self.hs = fns.split()
+
+ def eval(self, data):
+ bbhands = data.getVar('__BBHANDLERS', False) or []
+ for h in self.hs:
+ bbhands.append(h)
+ data.setVarFlag(h, "handler", 1)
+ data.setVar('__BBHANDLERS', bbhands)
+
+class InheritNode(AstNode):
+ def __init__(self, filename, lineno, classes):
+ AstNode.__init__(self, filename, lineno)
+ self.classes = classes
+
+ def eval(self, data):
+ bb.parse.BBHandler.inherit(self.classes, self.filename, self.lineno, data)
+
+def handleInclude(statements, filename, lineno, m, force):
+ statements.append(IncludeNode(filename, lineno, m.group(1), force))
+
+def handleExport(statements, filename, lineno, m):
+ statements.append(ExportNode(filename, lineno, m.group(1)))
+
+def handleData(statements, filename, lineno, groupd):
+ statements.append(DataNode(filename, lineno, groupd))
+
+def handleMethod(statements, filename, lineno, func_name, body):
+ statements.append(MethodNode(filename, lineno, func_name, body))
+
+def handlePythonMethod(statements, filename, lineno, funcname, modulename, body):
+ statements.append(PythonMethodNode(filename, lineno, funcname, modulename, body))
+
+def handleMethodFlags(statements, filename, lineno, key, m):
+ statements.append(MethodFlagsNode(filename, lineno, key, m))
+
+def handleExportFuncs(statements, filename, lineno, m, classname):
+ statements.append(ExportFuncsNode(filename, lineno, m.group(1), classname))
+
+def handleAddTask(statements, filename, lineno, m):
+ func = m.group("func")
+ before = m.group("before")
+ after = m.group("after")
+ if func is None:
+ return
+
+ statements.append(AddTaskNode(filename, lineno, func, before, after))
+
+def handleDelTask(statements, filename, lineno, m):
+ func = m.group("func")
+ if func is None:
+ return
+
+ statements.append(DelTaskNode(filename, lineno, func))
+
+def handleBBHandlers(statements, filename, lineno, m):
+ statements.append(BBHandlerNode(filename, lineno, m.group(1)))
+
+def handleInherit(statements, filename, lineno, m):
+ classes = m.group(1)
+ statements.append(InheritNode(filename, lineno, classes))
+
+def finalize(fn, d, variant = None):
+ all_handlers = {}
+ for var in d.getVar('__BBHANDLERS', False) or []:
+ # try to add the handler
+ bb.event.register(var, d.getVar(var, False), (d.getVarFlag(var, "eventmask", True) or "").split())
+
+ bb.event.fire(bb.event.RecipePreFinalise(fn), d)
+
+ bb.data.expandKeys(d)
+ bb.data.update_data(d)
+ code = []
+ for funcname in d.getVar("__BBANONFUNCS", False) or []:
+ code.append("%s(d)" % funcname)
+ bb.utils.better_exec("\n".join(code), {"d": d})
+ bb.data.update_data(d)
+
+ tasklist = d.getVar('__BBTASKS', False) or []
+ bb.build.add_tasks(tasklist, d)
+
+ bb.parse.siggen.finalise(fn, d, variant)
+
+ d.setVar('BBINCLUDED', bb.parse.get_file_depends(d))
+
+ bb.event.fire(bb.event.RecipeParsed(fn), d)
+
+def _create_variants(datastores, names, function, onlyfinalise):
+ def create_variant(name, orig_d, arg = None):
+ if onlyfinalise and name not in onlyfinalise:
+ return
+ new_d = bb.data.createCopy(orig_d)
+ function(arg or name, new_d)
+ datastores[name] = new_d
+
+ for variant, variant_d in datastores.items():
+ for name in names:
+ if not variant:
+ # Based on main recipe
+ create_variant(name, variant_d)
+ else:
+ create_variant("%s-%s" % (variant, name), variant_d, name)
+
+def _expand_versions(versions):
+ def expand_one(version, start, end):
+ for i in xrange(start, end + 1):
+ ver = _bbversions_re.sub(str(i), version, 1)
+ yield ver
+
+ versions = iter(versions)
+ while True:
+ try:
+ version = next(versions)
+ except StopIteration:
+ break
+
+ range_ver = _bbversions_re.search(version)
+ if not range_ver:
+ yield version
+ else:
+ newversions = expand_one(version, int(range_ver.group("from")),
+ int(range_ver.group("to")))
+ versions = itertools.chain(newversions, versions)
+
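+# Illustrative expansion (derived from _bbversions_re above):
+#   list(_expand_versions(["1.[1-3]", "2.0"])) -> ["1.1", "1.2", "1.3", "2.0"]
+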
+def multi_finalize(fn, d):
+ appends = (d.getVar("__BBAPPEND", True) or "").split()
+ for append in appends:
+ logger.debug(1, "Appending .bbappend file %s to %s", append, fn)
+ bb.parse.BBHandler.handle(append, d, True)
+
+ onlyfinalise = d.getVar("__ONLYFINALISE", False)
+
+ safe_d = d
+ d = bb.data.createCopy(safe_d)
+ try:
+ finalize(fn, d)
+ except bb.parse.SkipRecipe as e:
+ d.setVar("__SKIPPED", e.args[0])
+ datastores = {"": safe_d}
+
+ versions = (d.getVar("BBVERSIONS", True) or "").split()
+ if versions:
+ pv = orig_pv = d.getVar("PV", True)
+ baseversions = {}
+
+ def verfunc(ver, d, pv_d = None):
+ if pv_d is None:
+ pv_d = d
+
+ overrides = d.getVar("OVERRIDES", True).split(":")
+ pv_d.setVar("PV", ver)
+ overrides.append(ver)
+ bpv = baseversions.get(ver) or orig_pv
+ pv_d.setVar("BPV", bpv)
+ overrides.append(bpv)
+ d.setVar("OVERRIDES", ":".join(overrides))
+
+ versions = list(_expand_versions(versions))
+ for pos, version in enumerate(list(versions)):
+ try:
+ pv, bpv = version.split(":", 2)
+ except ValueError:
+ pass
+ else:
+ versions[pos] = pv
+ baseversions[pv] = bpv
+
+ if pv in versions and not baseversions.get(pv):
+ versions.remove(pv)
+ else:
+ pv = versions.pop()
+
+ # This is necessary because our existing main datastore
+            # has already been finalized with the old PV; we need one
+ # that's been finalized with the new PV.
+ d = bb.data.createCopy(safe_d)
+ verfunc(pv, d, safe_d)
+ try:
+ finalize(fn, d)
+ except bb.parse.SkipRecipe as e:
+ d.setVar("__SKIPPED", e.args[0])
+
+ _create_variants(datastores, versions, verfunc, onlyfinalise)
+
+ extended = d.getVar("BBCLASSEXTEND", True) or ""
+ if extended:
+        # The following supports bbextends with arguments, e.g. multilib.
+        # An example is as follows:
+ # BBCLASSEXTEND = "multilib:lib32"
+ # it will create foo-lib32, inheriting multilib.bbclass and set
+ # BBEXTENDCURR to "multilib" and BBEXTENDVARIANT to "lib32"
+ extendedmap = {}
+ variantmap = {}
+
+ for ext in extended.split():
+ eext = ext.split(':', 2)
+ if len(eext) > 1:
+ extendedmap[ext] = eext[0]
+ variantmap[ext] = eext[1]
+ else:
+ extendedmap[ext] = ext
+
+ pn = d.getVar("PN", True)
+ def extendfunc(name, d):
+ if name != extendedmap[name]:
+ d.setVar("BBEXTENDCURR", extendedmap[name])
+ d.setVar("BBEXTENDVARIANT", variantmap[name])
+ else:
+ d.setVar("PN", "%s-%s" % (pn, name))
+ bb.parse.BBHandler.inherit(extendedmap[name], fn, 0, d)
+
+ safe_d.setVar("BBCLASSEXTEND", extended)
+ _create_variants(datastores, extendedmap.keys(), extendfunc, onlyfinalise)
+
+ for variant, variant_d in datastores.iteritems():
+ if variant:
+ try:
+ if not onlyfinalise or variant in onlyfinalise:
+ finalize(fn, variant_d, variant)
+ except bb.parse.SkipRecipe as e:
+ variant_d.setVar("__SKIPPED", e.args[0])
+
+ if len(datastores) > 1:
+ variants = filter(None, datastores.iterkeys())
+ safe_d.setVar("__VARIANTS", " ".join(variants))
+
+ datastores[""] = d
+ return datastores
diff --git a/bitbake/lib/bb/parse/parse_py/BBHandler.py b/bitbake/lib/bb/parse/parse_py/BBHandler.py
new file mode 100644
index 0000000..ec097ba
--- /dev/null
+++ b/bitbake/lib/bb/parse/parse_py/BBHandler.py
@@ -0,0 +1,265 @@
+#!/usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+ class for handling .bb files
+
+ Reads a .bb file and obtains its metadata
+
+"""
+
+
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2003, 2004 Phil Blundell
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from __future__ import absolute_import
+import re, bb, os
+import logging
+import bb.build, bb.utils
+from bb import data
+
+from . import ConfHandler
+from .. import resolve_file, ast, logger, ParseError
+from .ConfHandler import include, init
+
+# For compatibility
+bb.deprecate_import(__name__, "bb.parse", ["vars_from_file"])
+
+__func_start_regexp__ = re.compile( r"(((?P<py>python)|(?P<fr>fakeroot))\s*)*(?P<func>[\w\.\-\+\{\}\$]+)?\s*\(\s*\)\s*{$" )
+__inherit_regexp__ = re.compile( r"inherit\s+(.+)" )
+__export_func_regexp__ = re.compile( r"EXPORT_FUNCTIONS\s+(.+)" )
+__addtask_regexp__ = re.compile("addtask\s+(?P<func>\w+)\s*((before\s*(?P<before>((.*(?=after))|(.*))))|(after\s*(?P<after>((.*(?=before))|(.*)))))*")
+__deltask_regexp__ = re.compile("deltask\s+(?P<func>\w+)")
+__addhandler_regexp__ = re.compile( r"addhandler\s+(.+)" )
+__def_regexp__ = re.compile( r"def\s+(\w+).*:" )
+__python_func_regexp__ = re.compile( r"(\s+.*)|(^$)" )
+
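+# Illustrative metadata lines matched by the regexps above (standard BitBake syntax):
+#   addtask compile after do_configure before do_install
+#   deltask do_package
+#   addhandler my_event_handler
+#   EXPORT_FUNCTIONS do_compile do_install
+#   inherit autotools pkgconfig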
+
+__infunc__ = []
+__inpython__ = False
+__body__ = []
+__classname__ = ""
+
+cached_statements = {}
+
+# We need to indicate EOF to the feeder. This code is so messy that
+# factoring it out to a close_parse_file method is out of the question.
+# We will use the IN_PYTHON_EOF as an indicator to just close the method
+#
+# The two parts using it are tightly integrated anyway
+IN_PYTHON_EOF = -9999999999999
+
+
+
+def supports(fn, d):
+ """Return True if fn has a supported extension"""
+ return os.path.splitext(fn)[-1] in [".bb", ".bbclass", ".inc"]
+
+def inherit(files, fn, lineno, d):
+ __inherit_cache = d.getVar('__inherit_cache', False) or []
+ files = d.expand(files).split()
+ for file in files:
+ if not os.path.isabs(file) and not file.endswith(".bbclass"):
+ file = os.path.join('classes', '%s.bbclass' % file)
+
+ if not os.path.isabs(file):
+ bbpath = d.getVar("BBPATH", True)
+ abs_fn, attempts = bb.utils.which(bbpath, file, history=True)
+ for af in attempts:
+ if af != abs_fn:
+ bb.parse.mark_dependency(d, af)
+ if abs_fn:
+ file = abs_fn
+
+ if not file in __inherit_cache:
+ logger.debug(1, "Inheriting %s (from %s:%d)" % (file, fn, lineno))
+ __inherit_cache.append( file )
+ d.setVar('__inherit_cache', __inherit_cache)
+ include(fn, file, lineno, d, "inherit")
+ __inherit_cache = d.getVar('__inherit_cache', False) or []
+
+def get_statements(filename, absolute_filename, base_name):
+ global cached_statements
+
+ try:
+ return cached_statements[absolute_filename]
+ except KeyError:
+ file = open(absolute_filename, 'r')
+ statements = ast.StatementGroup()
+
+ lineno = 0
+ while True:
+ lineno = lineno + 1
+ s = file.readline()
+ if not s: break
+ s = s.rstrip()
+ feeder(lineno, s, filename, base_name, statements)
+ file.close()
+ if __inpython__:
+ # add a blank line to close out any python definition
+ feeder(IN_PYTHON_EOF, "", filename, base_name, statements)
+
+ if filename.endswith(".bbclass") or filename.endswith(".inc"):
+ cached_statements[absolute_filename] = statements
+ return statements
+
+def handle(fn, d, include):
+ global __func_start_regexp__, __inherit_regexp__, __export_func_regexp__, __addtask_regexp__, __addhandler_regexp__, __infunc__, __body__, __residue__, __classname__
+ __body__ = []
+ __infunc__ = []
+ __classname__ = ""
+ __residue__ = []
+
+ base_name = os.path.basename(fn)
+ (root, ext) = os.path.splitext(base_name)
+ init(d)
+
+ if ext == ".bbclass":
+ __classname__ = root
+ __inherit_cache = d.getVar('__inherit_cache', False) or []
+ if not fn in __inherit_cache:
+ __inherit_cache.append(fn)
+ d.setVar('__inherit_cache', __inherit_cache)
+
+ if include != 0:
+ oldfile = d.getVar('FILE', False)
+ else:
+ oldfile = None
+
+ abs_fn = resolve_file(fn, d)
+
+ if include:
+ bb.parse.mark_dependency(d, abs_fn)
+
+ # actual loading
+ statements = get_statements(fn, abs_fn, base_name)
+
+ # DONE WITH PARSING... time to evaluate
+ if ext != ".bbclass" and abs_fn != oldfile:
+ d.setVar('FILE', abs_fn)
+
+ try:
+ statements.eval(d)
+ except bb.parse.SkipRecipe:
+ bb.data.setVar("__SKIPPED", True, d)
+ if include == 0:
+ return { "" : d }
+
+ if __infunc__:
+ raise ParseError("Shell function %s is never closed" % __infunc__[0], __infunc__[1], __infunc__[2])
+ if __residue__:
+ raise ParseError("Leftover unparsed (incomplete?) data %s from %s" % __residue__, fn)
+
+ if ext != ".bbclass" and include == 0:
+ return ast.multi_finalize(fn, d)
+
+ if ext != ".bbclass" and oldfile and abs_fn != oldfile:
+ d.setVar("FILE", oldfile)
+
+ return d
+
+def feeder(lineno, s, fn, root, statements):
+ global __func_start_regexp__, __inherit_regexp__, __export_func_regexp__, __addtask_regexp__, __addhandler_regexp__, __def_regexp__, __python_func_regexp__, __inpython__, __infunc__, __body__, bb, __residue__, __classname__
+ if __infunc__:
+ if s == '}':
+ __body__.append('')
+ ast.handleMethod(statements, fn, lineno, __infunc__[0], __body__)
+ __infunc__ = []
+ __body__ = []
+ else:
+ __body__.append(s)
+ return
+
+ if __inpython__:
+ m = __python_func_regexp__.match(s)
+ if m and lineno != IN_PYTHON_EOF:
+ __body__.append(s)
+ return
+ else:
+ ast.handlePythonMethod(statements, fn, lineno, __inpython__,
+ root, __body__)
+ __body__ = []
+ __inpython__ = False
+
+ if lineno == IN_PYTHON_EOF:
+ return
+
+ if s and s[0] == '#':
+ if len(__residue__) != 0 and __residue__[0][0] != "#":
+ bb.fatal("There is a comment on line %s of file %s (%s) which is in the middle of a multiline expression.\nBitbake used to ignore these but no longer does so, please fix your metadata as errors are likely as a result of this change." % (lineno, fn, s))
+
+ if len(__residue__) != 0 and __residue__[0][0] == "#" and (not s or s[0] != "#"):
+ bb.fatal("There is a confusing multiline, partially commented expression on line %s of file %s (%s).\nPlease clarify whether this is all a comment or should be parsed." % (lineno, fn, s))
+
+ if s and s[-1] == '\\':
+ __residue__.append(s[:-1])
+ return
+
+ s = "".join(__residue__) + s
+ __residue__ = []
+
+ # Skip empty lines
+ if s == '':
+ return
+
+ # Skip comments
+ if s[0] == '#':
+ return
+
+ m = __func_start_regexp__.match(s)
+ if m:
+ __infunc__ = [m.group("func") or "__anonymous", fn, lineno]
+ ast.handleMethodFlags(statements, fn, lineno, __infunc__[0], m)
+ return
+
+ m = __def_regexp__.match(s)
+ if m:
+ __body__.append(s)
+ __inpython__ = m.group(1)
+
+ return
+
+ m = __export_func_regexp__.match(s)
+ if m:
+ ast.handleExportFuncs(statements, fn, lineno, m, __classname__)
+ return
+
+ m = __addtask_regexp__.match(s)
+ if m:
+ ast.handleAddTask(statements, fn, lineno, m)
+ return
+
+ m = __deltask_regexp__.match(s)
+ if m:
+ ast.handleDelTask(statements, fn, lineno, m)
+ return
+
+ m = __addhandler_regexp__.match(s)
+ if m:
+ ast.handleBBHandlers(statements, fn, lineno, m)
+ return
+
+ m = __inherit_regexp__.match(s)
+ if m:
+ ast.handleInherit(statements, fn, lineno, m)
+ return
+
+ return ConfHandler.feeder(lineno, s, fn, statements)
+
+# Add us to the handlers list
+from .. import handlers
+handlers.append({'supports': supports, 'handle': handle, 'init': init})
+del handlers
diff --git a/bitbake/lib/bb/parse/parse_py/ConfHandler.py b/bitbake/lib/bb/parse/parse_py/ConfHandler.py
new file mode 100644
index 0000000..fbd75b1
--- /dev/null
+++ b/bitbake/lib/bb/parse/parse_py/ConfHandler.py
@@ -0,0 +1,193 @@
+#!/usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+ class for handling configuration data files
+
+ Reads a .conf file and obtains its metadata
+
+"""
+
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2003, 2004 Phil Blundell
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import errno
+import re
+import os
+import bb.utils
+from bb.parse import ParseError, resolve_file, ast, logger, handle
+
+__config_regexp__ = re.compile( r"""
+ ^
+ (?P<exp>export\s*)?
+ (?P<var>[a-zA-Z0-9\-~_+.${}/]+?)
+ (\[(?P<flag>[a-zA-Z0-9\-_+.]+)\])?
+
+ \s* (
+ (?P<colon>:=) |
+ (?P<lazyques>\?\?=) |
+ (?P<ques>\?=) |
+ (?P<append>\+=) |
+ (?P<prepend>=\+) |
+ (?P<predot>=\.) |
+ (?P<postdot>\.=) |
+ =
+ ) \s*
+
+ (?!'[^']*'[^']*'$)
+ (?!\"[^\"]*\"[^\"]*\"$)
+ (?P<apo>['\"])
+ (?P<value>.*)
+ (?P=apo)
+ $
+ """, re.X)
+__include_regexp__ = re.compile( r"include\s+(.+)" )
+__require_regexp__ = re.compile( r"require\s+(.+)" )
+__export_regexp__ = re.compile( r"export\s+([a-zA-Z0-9\-_+.${}/]+)$" )
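+
+# Illustrative examples (hypothetical metadata, not part of this module) of
+# lines handled by the expressions above:
+#
+#   FOO = "bar"              -> __config_regexp__ (plain assignment)
+#   FOO ?= "bar"             -> __config_regexp__ ('ques' group, weak default)
+#   FOO[md5sum] = "abc123"   -> __config_regexp__ ('flag' group set to 'md5sum')
+#   include conf/site.conf   -> __include_regexp__
+#   require conf/must.conf   -> __require_regexp__
+#   export PATH              -> __export_regexp__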
+
+def init(data):
+ topdir = data.getVar('TOPDIR', False)
+ if not topdir:
+ data.setVar('TOPDIR', os.getcwd())
+
+
+def supports(fn, d):
+ return fn[-5:] == ".conf"
+
+def include(parentfn, fn, lineno, data, error_out):
+ """
+ error_out: A string indicating the verb (e.g. "include", "inherit") to be
+ used in a ParseError that will be raised if the file to be included could
+ not be included. Specify False to avoid raising an error in this case.
+ """
+ if parentfn == fn: # prevent infinite recursion
+ return None
+
+ fn = data.expand(fn)
+ parentfn = data.expand(parentfn)
+
+ if not os.path.isabs(fn):
+ dname = os.path.dirname(parentfn)
+ bbpath = "%s:%s" % (dname, data.getVar("BBPATH", True))
+ abs_fn, attempts = bb.utils.which(bbpath, fn, history=True)
+ if abs_fn and bb.parse.check_dependency(data, abs_fn):
+ logger.warn("Duplicate inclusion for %s in %s" % (abs_fn, data.getVar('FILE', True)))
+ for af in attempts:
+ bb.parse.mark_dependency(data, af)
+ if abs_fn:
+ fn = abs_fn
+ elif bb.parse.check_dependency(data, fn):
+ logger.warn("Duplicate inclusion for %s in %s" % (fn, data.getVar('FILE', True)))
+
+ try:
+ bb.parse.handle(fn, data, True)
+ except (IOError, OSError) as exc:
+ if exc.errno == errno.ENOENT:
+ if error_out:
+ raise ParseError("Could not %s file %s" % (error_out, fn), parentfn, lineno)
+ logger.debug(2, "CONF file '%s' not found", fn)
+ else:
+ if error_out:
+ raise ParseError("Could not %s file %s: %s" % (error_out, fn, exc.strerror), parentfn, lineno)
+ else:
+ raise ParseError("Error parsing %s: %s" % (fn, exc.strerror), parentfn, lineno)
+
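+# Illustrative sketch of the error_out parameter documented above; the file
+# names and line number are hypothetical:
+#
+#   # Raise a ParseError ("Could not include file site.inc") if it is missing:
+#   include("conf/local.conf", "site.inc", 12, d, "include")
+#   # Log a debug message and continue instead of raising:
+#   include("conf/local.conf", "site.inc", 12, d, False)
+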
+# We have an issue where a UI might want to enforce particular settings such as
+# an empty DISTRO variable. If configuration files do something like assigning
+# a weak default, it turns out to be very difficult to filter out these changes,
+# particularly when the weak default might appear halfway through parsing a chain
+# of configuration files. We therefore let the UIs hook into configuration file
+# parsing. This turns out to be a hard problem to solve any other way.
+confFilters = []
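+
+# A minimal sketch of how a front-end might use the hook above; the filter
+# function and the variable it forces are hypothetical, not part of BitBake:
+#
+#   def _force_empty_distro(fn, data):
+#       # Called once per parsed configuration file; may rewrite the store.
+#       data.setVar('DISTRO', '')
+#
+#   confFilters.append(_force_empty_distro)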
+
+def handle(fn, data, include):
+ init(data)
+
+ if include == 0:
+ oldfile = None
+ else:
+ oldfile = data.getVar('FILE', False)
+
+ abs_fn = resolve_file(fn, data)
+ f = open(abs_fn, 'r')
+
+ if include:
+ bb.parse.mark_dependency(data, abs_fn)
+
+ statements = ast.StatementGroup()
+ lineno = 0
+ while True:
+ lineno = lineno + 1
+ s = f.readline()
+ if not s:
+ break
+ w = s.strip()
+ # skip empty lines
+ if not w:
+ continue
+ s = s.rstrip()
+ while s[-1] == '\\':
+ s2 = f.readline().strip()
+ lineno = lineno + 1
+ if (not s2 or s2 and s2[0] != "#") and s[0] == "#" :
+ bb.fatal("There is a confusing multiline, partially commented expression on line %s of file %s (%s).\nPlease clarify whether this is all a comment or should be parsed." % (lineno, fn, s))
+ s = s[:-1] + s2
+ # skip comments
+ if s[0] == '#':
+ continue
+ feeder(lineno, s, abs_fn, statements)
+
+ # DONE WITH PARSING... time to evaluate
+ data.setVar('FILE', abs_fn)
+ statements.eval(data)
+ if oldfile:
+ data.setVar('FILE', oldfile)
+
+ f.close()
+
+ for f in confFilters:
+ f(fn, data)
+
+ return data
+
+def feeder(lineno, s, fn, statements):
+ m = __config_regexp__.match(s)
+ if m:
+ groupd = m.groupdict()
+ ast.handleData(statements, fn, lineno, groupd)
+ return
+
+ m = __include_regexp__.match(s)
+ if m:
+ ast.handleInclude(statements, fn, lineno, m, False)
+ return
+
+ m = __require_regexp__.match(s)
+ if m:
+ ast.handleInclude(statements, fn, lineno, m, True)
+ return
+
+ m = __export_regexp__.match(s)
+ if m:
+ ast.handleExport(statements, fn, lineno, m)
+ return
+
+ raise ParseError("unparsed line: '%s'" % s, fn, lineno);
+
+# Add us to the handlers list
+from bb.parse import handlers
+handlers.append({'supports': supports, 'handle': handle, 'init': init})
+del handlers
diff --git a/bitbake/lib/bb/parse/parse_py/__init__.py b/bitbake/lib/bb/parse/parse_py/__init__.py
new file mode 100644
index 0000000..3e658d0
--- /dev/null
+++ b/bitbake/lib/bb/parse/parse_py/__init__.py
@@ -0,0 +1,33 @@
+#!/usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake Parsers
+
+File parsers for the BitBake build tools.
+
+"""
+
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2003, 2004 Phil Blundell
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+# Based on functions from the base bb module, Copyright 2003 Holger Schurig
+
+from __future__ import absolute_import
+from . import ConfHandler
+from . import BBHandler
+
+__version__ = '1.0'
diff --git a/bitbake/lib/bb/persist_data.py b/bitbake/lib/bb/persist_data.py
new file mode 100644
index 0000000..5795bc8
--- /dev/null
+++ b/bitbake/lib/bb/persist_data.py
@@ -0,0 +1,217 @@
+"""BitBake Persistent Data Store
+
+Used to store data in a central location such that other threads/tasks can
+access it at some future date. Currently acts as a convenience wrapper around
+sqlite, providing a key/value store accessed by 'domain'.
+"""
+
+# Copyright (C) 2007 Richard Purdie
+# Copyright (C) 2010 Chris Larson <chris_larson@mentor.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import collections
+import logging
+import os.path
+import sys
+import warnings
+from bb.compat import total_ordering
+from collections import Mapping
+
+try:
+ import sqlite3
+except ImportError:
+ from pysqlite2 import dbapi2 as sqlite3
+
+sqlversion = sqlite3.sqlite_version_info
+if sqlversion[0] < 3 or (sqlversion[0] == 3 and sqlversion[1] < 3):
+ raise Exception("sqlite3 version 3.3.0 or later is required.")
+
+
+logger = logging.getLogger("BitBake.PersistData")
+if hasattr(sqlite3, 'enable_shared_cache'):
+ try:
+ sqlite3.enable_shared_cache(True)
+ except sqlite3.OperationalError:
+ pass
+
+
+@total_ordering
+class SQLTable(collections.MutableMapping):
+ """Object representing a table/domain in the database"""
+ def __init__(self, cachefile, table):
+ self.cachefile = cachefile
+ self.table = table
+ self.cursor = connect(self.cachefile)
+
+ self._execute("CREATE TABLE IF NOT EXISTS %s(key TEXT, value TEXT);"
+ % table)
+
+ def _execute(self, *query):
+ """Execute a query, waiting to acquire a lock if necessary"""
+ count = 0
+ while True:
+ try:
+ return self.cursor.execute(*query)
+ except sqlite3.OperationalError as exc:
+ if 'database is locked' in str(exc) and count < 500:
+ count = count + 1
+ self.cursor.close()
+ self.cursor = connect(self.cachefile)
+ continue
+ raise
+
+ def __enter__(self):
+ self.cursor.__enter__()
+ return self
+
+ def __exit__(self, *excinfo):
+ self.cursor.__exit__(*excinfo)
+
+ def __getitem__(self, key):
+ data = self._execute("SELECT * from %s where key=?;" %
+ self.table, [key])
+ for row in data:
+ return row[1]
+ raise KeyError(key)
+
+ def __delitem__(self, key):
+ if key not in self:
+ raise KeyError(key)
+ self._execute("DELETE from %s where key=?;" % self.table, [key])
+
+ def __setitem__(self, key, value):
+ if not isinstance(key, basestring):
+ raise TypeError('Only string keys are supported')
+ elif not isinstance(value, basestring):
+ raise TypeError('Only string values are supported')
+
+ data = self._execute("SELECT * from %s where key=?;" %
+ self.table, [key])
+ exists = len(list(data))
+ if exists:
+ self._execute("UPDATE %s SET value=? WHERE key=?;" % self.table,
+ [value, key])
+ else:
+ self._execute("INSERT into %s(key, value) values (?, ?);" %
+ self.table, [key, value])
+
+ def __contains__(self, key):
+ return key in set(self)
+
+ def __len__(self):
+ data = self._execute("SELECT COUNT(key) FROM %s;" % self.table)
+ for row in data:
+ return row[0]
+
+ def __iter__(self):
+ data = self._execute("SELECT key FROM %s;" % self.table)
+ return (row[0] for row in data)
+
+ def __lt__(self, other):
+ if not isinstance(other, Mapping):
+            return NotImplemented
+
+ return len(self) < len(other)
+
+ def get_by_pattern(self, pattern):
+ data = self._execute("SELECT * FROM %s WHERE key LIKE ?;" %
+ self.table, [pattern])
+ return [row[1] for row in data]
+
+ def values(self):
+ return list(self.itervalues())
+
+ def itervalues(self):
+ data = self._execute("SELECT value FROM %s;" % self.table)
+ return (row[0] for row in data)
+
+ def items(self):
+ return list(self.iteritems())
+
+ def iteritems(self):
+ return self._execute("SELECT * FROM %s;" % self.table)
+
+ def clear(self):
+ self._execute("DELETE FROM %s;" % self.table)
+
+ def has_key(self, key):
+ return key in self
+
+
+class PersistData(object):
+ """Deprecated representation of the bitbake persistent data store"""
+ def __init__(self, d):
+ warnings.warn("Use of PersistData is deprecated. Please use "
+ "persist(domain, d) instead.",
+ category=DeprecationWarning,
+ stacklevel=2)
+
+ self.data = persist(d)
+ logger.debug(1, "Using '%s' as the persistent data cache",
+ self.data.filename)
+
+ def addDomain(self, domain):
+ """
+ Add a domain (pending deprecation)
+ """
+ return self.data[domain]
+
+ def delDomain(self, domain):
+ """
+ Removes a domain and all the data it contains
+ """
+ del self.data[domain]
+
+ def getKeyValues(self, domain):
+ """
+ Return a list of key + value pairs for a domain
+ """
+ return self.data[domain].items()
+
+ def getValue(self, domain, key):
+ """
+ Return the value of a key for a domain
+ """
+ return self.data[domain][key]
+
+ def setValue(self, domain, key, value):
+ """
+ Sets the value of a key for a domain
+ """
+ self.data[domain][key] = value
+
+ def delValue(self, domain, key):
+ """
+ Deletes a key/value pair
+ """
+ del self.data[domain][key]
+
+def connect(database):
+ connection = sqlite3.connect(database, timeout=5, isolation_level=None)
+ connection.execute("pragma synchronous = off;")
+ return connection
+
+def persist(domain, d):
+ """Convenience factory for SQLTable objects based upon metadata"""
+ import bb.utils
+ cachedir = (d.getVar("PERSISTENT_DIR", True) or
+ d.getVar("CACHE", True))
+ if not cachedir:
+ logger.critical("Please set the 'PERSISTENT_DIR' or 'CACHE' variable")
+ sys.exit(1)
+
+ bb.utils.mkdirhier(cachedir)
+ cachefile = os.path.join(cachedir, "bb_persist_data.sqlite3")
+ return SQLTable(cachefile, domain)
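+
+# Minimal usage sketch, assuming 'd' is a datastore with PERSISTENT_DIR or
+# CACHE set; the domain and key names below are hypothetical:
+#
+#   import bb.persist_data
+#   table = bb.persist_data.persist('EXAMPLE_DOMAIN', d)
+#   table['some-key'] = 'some-value'          # keys and values must be strings
+#   assert table['some-key'] == 'some-value'
+#   matches = table.get_by_pattern('some-%')  # SQL LIKE pattern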
diff --git a/bitbake/lib/bb/process.py b/bitbake/lib/bb/process.py
new file mode 100644
index 0000000..1c07f2d
--- /dev/null
+++ b/bitbake/lib/bb/process.py
@@ -0,0 +1,156 @@
+import logging
+import signal
+import subprocess
+import errno
+import select
+import bb.utils
+
+logger = logging.getLogger('BitBake.Process')
+
+def subprocess_setup():
+ # Python installs a SIGPIPE handler by default. This is usually not what
+ # non-Python subprocesses expect.
+ signal.signal(signal.SIGPIPE, signal.SIG_DFL)
+
+class CmdError(RuntimeError):
+ def __init__(self, command, msg=None):
+ self.command = command
+ self.msg = msg
+
+ def __str__(self):
+ if not isinstance(self.command, basestring):
+ cmd = subprocess.list2cmdline(self.command)
+ else:
+ cmd = self.command
+
+ msg = "Execution of '%s' failed" % cmd
+ if self.msg:
+ msg += ': %s' % self.msg
+ return msg
+
+class NotFoundError(CmdError):
+ def __str__(self):
+ return CmdError.__str__(self) + ": command not found"
+
+class ExecutionError(CmdError):
+ def __init__(self, command, exitcode, stdout = None, stderr = None):
+ CmdError.__init__(self, command)
+ self.exitcode = exitcode
+ self.stdout = stdout
+ self.stderr = stderr
+
+ def __str__(self):
+ message = ""
+ if self.stderr:
+ message += self.stderr
+ if self.stdout:
+ message += self.stdout
+ if message:
+ message = ":\n" + message
+ return (CmdError.__str__(self) +
+ " with exit code %s" % self.exitcode + message)
+
+class Popen(subprocess.Popen):
+ defaults = {
+ "close_fds": True,
+ "preexec_fn": subprocess_setup,
+ "stdout": subprocess.PIPE,
+ "stderr": subprocess.STDOUT,
+ "stdin": subprocess.PIPE,
+ "shell": False,
+ }
+
+ def __init__(self, *args, **kwargs):
+ options = dict(self.defaults)
+ options.update(kwargs)
+ subprocess.Popen.__init__(self, *args, **options)
+
+def _logged_communicate(pipe, log, input, extrafiles):
+ if pipe.stdin:
+ if input is not None:
+ pipe.stdin.write(input)
+ pipe.stdin.close()
+
+ outdata, errdata = [], []
+ rin = []
+
+ if pipe.stdout is not None:
+ bb.utils.nonblockingfd(pipe.stdout.fileno())
+ rin.append(pipe.stdout)
+ if pipe.stderr is not None:
+ bb.utils.nonblockingfd(pipe.stderr.fileno())
+ rin.append(pipe.stderr)
+ for fobj, _ in extrafiles:
+ bb.utils.nonblockingfd(fobj.fileno())
+ rin.append(fobj)
+
+ def readextras(selected):
+ for fobj, func in extrafiles:
+ if fobj in selected:
+ try:
+ data = fobj.read()
+ except IOError as err:
+ if err.errno == errno.EAGAIN or err.errno == errno.EWOULDBLOCK:
+ data = None
+ if data is not None:
+ func(data)
+
+ try:
+ while pipe.poll() is None:
+ rlist = rin
+ try:
+ r,w,e = select.select (rlist, [], [], 1)
+ except OSError as e:
+ if e.errno != errno.EINTR:
+ raise
+
+ if pipe.stdout in r:
+ data = pipe.stdout.read()
+ if data is not None:
+ outdata.append(data)
+ log.write(data)
+
+ if pipe.stderr in r:
+ data = pipe.stderr.read()
+ if data is not None:
+ errdata.append(data)
+ log.write(data)
+
+ readextras(r)
+
+ finally:
+ log.flush()
+
+ readextras([fobj for fobj, _ in extrafiles])
+
+ if pipe.stdout is not None:
+ pipe.stdout.close()
+ if pipe.stderr is not None:
+ pipe.stderr.close()
+ return ''.join(outdata), ''.join(errdata)
+
+def run(cmd, input=None, log=None, extrafiles=None, **options):
+ """Convenience function to run a command and return its output, raising an
+ exception when the command fails"""
+
+ if not extrafiles:
+ extrafiles = []
+
+ if isinstance(cmd, basestring) and not "shell" in options:
+ options["shell"] = True
+
+ try:
+ pipe = Popen(cmd, **options)
+ except OSError as exc:
+ if exc.errno == 2:
+ raise NotFoundError(cmd)
+ else:
+ raise CmdError(cmd, exc)
+
+ if log:
+ stdout, stderr = _logged_communicate(pipe, log, input, extrafiles)
+ else:
+ stdout, stderr = pipe.communicate(input)
+
+ if pipe.returncode != 0:
+ raise ExecutionError(cmd, pipe.returncode, stdout, stderr)
+ return stdout, stderr
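+
+# Minimal usage sketch (the command line is illustrative only):
+#
+#   import bb.process
+#   try:
+#       stdout, stderr = bb.process.run("echo hello")  # strings imply shell=True
+#   except bb.process.ExecutionError as exc:
+#       print("command failed (%s): %s" % (exc.exitcode, exc.stderr))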
diff --git a/bitbake/lib/bb/providers.py b/bitbake/lib/bb/providers.py
new file mode 100644
index 0000000..637e1fa
--- /dev/null
+++ b/bitbake/lib/bb/providers.py
@@ -0,0 +1,381 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2003, 2004 Chris Larson
+# Copyright (C) 2003, 2004 Phil Blundell
+# Copyright (C) 2003 - 2005 Michael 'Mickey' Lauer
+# Copyright (C) 2005 Holger Hans Peter Freyther
+# Copyright (C) 2005 ROAD GmbH
+# Copyright (C) 2006 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import re
+import logging
+from bb import data, utils
+from collections import defaultdict
+import bb
+
+logger = logging.getLogger("BitBake.Provider")
+
+class NoProvider(bb.BBHandledException):
+ """Exception raised when no provider of a build dependency can be found"""
+
+class NoRProvider(bb.BBHandledException):
+ """Exception raised when no provider of a runtime dependency can be found"""
+
+class MultipleRProvider(bb.BBHandledException):
+ """Exception raised when multiple providers of a runtime dependency can be found"""
+
+def findProviders(cfgData, dataCache, pkg_pn = None):
+ """
+ Convenience function to get latest and preferred providers in pkg_pn
+ """
+
+ if not pkg_pn:
+ pkg_pn = dataCache.pkg_pn
+
+ # Need to ensure data store is expanded
+ localdata = data.createCopy(cfgData)
+ bb.data.update_data(localdata)
+ bb.data.expandKeys(localdata)
+
+ preferred_versions = {}
+ latest_versions = {}
+
+ for pn in pkg_pn:
+ (last_ver, last_file, pref_ver, pref_file) = findBestProvider(pn, localdata, dataCache, pkg_pn)
+ preferred_versions[pn] = (pref_ver, pref_file)
+ latest_versions[pn] = (last_ver, last_file)
+
+ return (latest_versions, preferred_versions)
+
+
+def allProviders(dataCache):
+ """
+ Find all providers for each pn
+ """
+ all_providers = defaultdict(list)
+ for (fn, pn) in dataCache.pkg_fn.items():
+ ver = dataCache.pkg_pepvpr[fn]
+ all_providers[pn].append((ver, fn))
+ return all_providers
+
+
+def sortPriorities(pn, dataCache, pkg_pn = None):
+ """
+ Reorder pkg_pn by file priority and default preference
+ """
+
+ if not pkg_pn:
+ pkg_pn = dataCache.pkg_pn
+
+ files = pkg_pn[pn]
+ priorities = {}
+ for f in files:
+ priority = dataCache.bbfile_priority[f]
+ preference = dataCache.pkg_dp[f]
+ if priority not in priorities:
+ priorities[priority] = {}
+ if preference not in priorities[priority]:
+ priorities[priority][preference] = []
+ priorities[priority][preference].append(f)
+ tmp_pn = []
+ for pri in sorted(priorities):
+ tmp_pref = []
+ for pref in sorted(priorities[pri]):
+ tmp_pref.extend(priorities[pri][pref])
+ tmp_pn = [tmp_pref] + tmp_pn
+
+ return tmp_pn
+
+def preferredVersionMatch(pe, pv, pr, preferred_e, preferred_v, preferred_r):
+ """
+ Check if the version pe,pv,pr is the preferred one.
+    If there is a preferred version defined and it ends with '%', then pv has
+    to start with that version after the '%' is removed.
+ """
+ if (pr == preferred_r or preferred_r == None):
+ if (pe == preferred_e or preferred_e == None):
+ if preferred_v == pv:
+ return True
+ if preferred_v != None and preferred_v.endswith('%') and pv.startswith(preferred_v[:len(preferred_v)-1]):
+ return True
+ return False
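+
+# Worked examples of the '%' rule above (version strings are hypothetical):
+#
+#   preferredVersionMatch("0", "1.2.3", "r0", None, "1.2.3", None)  -> True
+#   preferredVersionMatch("0", "1.2.3", "r0", None, "1.2.%", None)  -> True  (wildcard)
+#   preferredVersionMatch("0", "1.3.0", "r0", None, "1.2.%", None)  -> False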
+
+def findPreferredProvider(pn, cfgData, dataCache, pkg_pn = None, item = None):
+ """
+ Find the first provider in pkg_pn with a PREFERRED_VERSION set.
+ """
+
+ preferred_file = None
+ preferred_ver = None
+
+ localdata = data.createCopy(cfgData)
+ localdata.setVar('OVERRIDES', "%s:pn-%s:%s" % (data.getVar('OVERRIDES', localdata), pn, pn))
+ bb.data.update_data(localdata)
+
+ preferred_v = localdata.getVar('PREFERRED_VERSION', True)
+ if preferred_v:
+ m = re.match('(\d+:)*(.*)(_.*)*', preferred_v)
+ if m:
+ if m.group(1):
+ preferred_e = m.group(1)[:-1]
+ else:
+ preferred_e = None
+ preferred_v = m.group(2)
+ if m.group(3):
+ preferred_r = m.group(3)[1:]
+ else:
+ preferred_r = None
+ else:
+ preferred_e = None
+ preferred_r = None
+
+ for file_set in pkg_pn:
+ for f in file_set:
+ pe, pv, pr = dataCache.pkg_pepvpr[f]
+ if preferredVersionMatch(pe, pv, pr, preferred_e, preferred_v, preferred_r):
+ preferred_file = f
+ preferred_ver = (pe, pv, pr)
+ break
+ if preferred_file:
+ break;
+ if preferred_r:
+ pv_str = '%s-%s' % (preferred_v, preferred_r)
+ else:
+ pv_str = preferred_v
+ if not (preferred_e is None):
+ pv_str = '%s:%s' % (preferred_e, pv_str)
+ itemstr = ""
+ if item:
+ itemstr = " (for item %s)" % item
+ if preferred_file is None:
+ logger.info("preferred version %s of %s not available%s", pv_str, pn, itemstr)
+ available_vers = []
+ for file_set in pkg_pn:
+ for f in file_set:
+ pe, pv, pr = dataCache.pkg_pepvpr[f]
+ ver_str = pv
+ if pe:
+ ver_str = "%s:%s" % (pe, ver_str)
+ if not ver_str in available_vers:
+ available_vers.append(ver_str)
+ if available_vers:
+ available_vers.sort()
+ logger.info("versions of %s available: %s", pn, ' '.join(available_vers))
+ else:
+ logger.debug(1, "selecting %s as PREFERRED_VERSION %s of package %s%s", preferred_file, pv_str, pn, itemstr)
+
+ return (preferred_ver, preferred_file)
+
+
+def findLatestProvider(pn, cfgData, dataCache, file_set):
+ """
+ Return the highest version of the providers in file_set.
+ Take default preferences into account.
+ """
+ latest = None
+ latest_p = 0
+ latest_f = None
+ for file_name in file_set:
+ pe, pv, pr = dataCache.pkg_pepvpr[file_name]
+ dp = dataCache.pkg_dp[file_name]
+
+ if (latest is None) or ((latest_p == dp) and (utils.vercmp(latest, (pe, pv, pr)) < 0)) or (dp > latest_p):
+ latest = (pe, pv, pr)
+ latest_f = file_name
+ latest_p = dp
+
+ return (latest, latest_f)
+
+
+def findBestProvider(pn, cfgData, dataCache, pkg_pn = None, item = None):
+ """
+ If there is a PREFERRED_VERSION, find the highest-priority bbfile
+ providing that version. If not, find the latest version provided by
+    a bbfile in the highest-priority set.
+ """
+
+ sortpkg_pn = sortPriorities(pn, dataCache, pkg_pn)
+ # Find the highest priority provider with a PREFERRED_VERSION set
+ (preferred_ver, preferred_file) = findPreferredProvider(pn, cfgData, dataCache, sortpkg_pn, item)
+ # Find the latest version of the highest priority provider
+ (latest, latest_f) = findLatestProvider(pn, cfgData, dataCache, sortpkg_pn[0])
+
+ if preferred_file is None:
+ preferred_file = latest_f
+ preferred_ver = latest
+
+ return (latest, latest_f, preferred_ver, preferred_file)
+
+
+def _filterProviders(providers, item, cfgData, dataCache):
+ """
+    Take a list of providers and filter/reorder them according to the
+    environment variables and previous build results.
+ """
+ eligible = []
+ preferred_versions = {}
+ sortpkg_pn = {}
+
+ # The order of providers depends on the order of the files on the disk
+ # up to here. Sort pkg_pn to make dependency issues reproducible rather
+ # than effectively random.
+ providers.sort()
+
+ # Collate providers by PN
+ pkg_pn = {}
+ for p in providers:
+ pn = dataCache.pkg_fn[p]
+ if pn not in pkg_pn:
+ pkg_pn[pn] = []
+ pkg_pn[pn].append(p)
+
+ logger.debug(1, "providers for %s are: %s", item, pkg_pn.keys())
+
+ # First add PREFERRED_VERSIONS
+ for pn in pkg_pn:
+ sortpkg_pn[pn] = sortPriorities(pn, dataCache, pkg_pn)
+ preferred_versions[pn] = findPreferredProvider(pn, cfgData, dataCache, sortpkg_pn[pn], item)
+ if preferred_versions[pn][1]:
+ eligible.append(preferred_versions[pn][1])
+
+ # Now add latest versions
+ for pn in sortpkg_pn:
+ if pn in preferred_versions and preferred_versions[pn][1]:
+ continue
+ preferred_versions[pn] = findLatestProvider(pn, cfgData, dataCache, sortpkg_pn[pn][0])
+ eligible.append(preferred_versions[pn][1])
+
+ if len(eligible) == 0:
+ logger.error("no eligible providers for %s", item)
+ return 0
+
+ # If pn == item, give it a slight default preference
+ # This means PREFERRED_PROVIDER_foobar defaults to foobar if available
+ for p in providers:
+ pn = dataCache.pkg_fn[p]
+ if pn != item:
+ continue
+ (newvers, fn) = preferred_versions[pn]
+ if not fn in eligible:
+ continue
+ eligible.remove(fn)
+ eligible = [fn] + eligible
+
+ return eligible
+
+
+def filterProviders(providers, item, cfgData, dataCache):
+ """
+    Take a list of providers and filter/reorder them according to the
+    environment variables and previous build results.
+    Takes a "normal" target item.
+ """
+
+ eligible = _filterProviders(providers, item, cfgData, dataCache)
+
+ prefervar = cfgData.getVar('PREFERRED_PROVIDER_%s' % item, True)
+ if prefervar:
+ dataCache.preferred[item] = prefervar
+
+ foundUnique = False
+ if item in dataCache.preferred:
+ for p in eligible:
+ pn = dataCache.pkg_fn[p]
+ if dataCache.preferred[item] == pn:
+ logger.verbose("selecting %s to satisfy %s due to PREFERRED_PROVIDERS", pn, item)
+ eligible.remove(p)
+ eligible = [p] + eligible
+ foundUnique = True
+ break
+
+ logger.debug(1, "sorted providers for %s are: %s", item, eligible)
+
+ return eligible, foundUnique
+
+def filterProvidersRunTime(providers, item, cfgData, dataCache):
+ """
+    Take a list of providers and filter/reorder them according to the
+    environment variables and previous build results.
+    Takes a "runtime" target item.
+ """
+
+ eligible = _filterProviders(providers, item, cfgData, dataCache)
+
+ # Should use dataCache.preferred here?
+ preferred = []
+ preferred_vars = []
+ pns = {}
+ for p in eligible:
+ pns[dataCache.pkg_fn[p]] = p
+ for p in eligible:
+ pn = dataCache.pkg_fn[p]
+ provides = dataCache.pn_provides[pn]
+ for provide in provides:
+ prefervar = cfgData.getVar('PREFERRED_PROVIDER_%s' % provide, True)
+ #logger.debug(1, "checking PREFERRED_PROVIDER_%s (value %s) against %s", provide, prefervar, pns.keys())
+ if prefervar in pns and pns[prefervar] not in preferred:
+ var = "PREFERRED_PROVIDER_%s = %s" % (provide, prefervar)
+ logger.verbose("selecting %s to satisfy runtime %s due to %s", prefervar, item, var)
+ preferred_vars.append(var)
+ pref = pns[prefervar]
+ eligible.remove(pref)
+ eligible = [pref] + eligible
+ preferred.append(pref)
+ break
+
+ numberPreferred = len(preferred)
+
+ if numberPreferred > 1:
+ logger.error("Trying to resolve runtime dependency %s resulted in conflicting PREFERRED_PROVIDER entries being found.\nThe providers found were: %s\nThe PREFERRED_PROVIDER entries resulting in this conflict were: %s", item, preferred, preferred_vars)
+
+ logger.debug(1, "sorted runtime providers for %s are: %s", item, eligible)
+
+ return eligible, numberPreferred
+
+regexp_cache = {}
+
+def getRuntimeProviders(dataCache, rdepend):
+ """
+ Return any providers of runtime dependency
+ """
+ rproviders = []
+
+ if rdepend in dataCache.rproviders:
+ rproviders += dataCache.rproviders[rdepend]
+
+ if rdepend in dataCache.packages:
+ rproviders += dataCache.packages[rdepend]
+
+ if rproviders:
+ return rproviders
+
+ # Only search dynamic packages if we can't find anything in other variables
+ for pattern in dataCache.packages_dynamic:
+ pattern = pattern.replace('+', "\+")
+ if pattern in regexp_cache:
+ regexp = regexp_cache[pattern]
+ else:
+ try:
+ regexp = re.compile(pattern)
+ except:
+ logger.error("Error parsing regular expression '%s'", pattern)
+ raise
+ regexp_cache[pattern] = regexp
+ if regexp.match(rdepend):
+ rproviders += dataCache.packages_dynamic[pattern]
+ logger.debug(1, "Assuming %s is a dynamic package, but it may not exist" % rdepend)
+
+ return rproviders
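+
+# Illustrative sketch of the dynamic-package fallback above (the cache contents
+# are hypothetical):
+#
+#   # dataCache.packages_dynamic == {"^perl-module-.*": ["recipes/perl_5.22.bb"]}
+#   getRuntimeProviders(dataCache, "perl-module-strict")
+#   # -> ["recipes/perl_5.22.bb"], found via the compiled regexp match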
diff --git a/bitbake/lib/bb/pysh/__init__.py b/bitbake/lib/bb/pysh/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/bb/pysh/__init__.py
diff --git a/bitbake/lib/bb/pysh/builtin.py b/bitbake/lib/bb/pysh/builtin.py
new file mode 100644
index 0000000..b748e4a
--- /dev/null
+++ b/bitbake/lib/bb/pysh/builtin.py
@@ -0,0 +1,710 @@
+# builtin.py - builtins and utilities definitions for pysh.
+#
+# Copyright 2007 Patrick Mezard
+#
+# This software may be used and distributed according to the terms
+# of the GNU General Public License, incorporated herein by reference.
+
+"""Builtin and internal utilities implementations.
+
+- Beware not to use the Python interpreter environment as if it were the shell
+environment. For instance, a command's working directory must be explicitly
+handled through env['PWD'] instead of relying on the Python working directory.
+"""
+import errno
+import optparse
+import os
+import re
+import subprocess
+import sys
+import time
+
+def has_subprocess_bug():
+ return getattr(subprocess, 'list2cmdline') and \
+ ( subprocess.list2cmdline(['']) == '' or \
+ subprocess.list2cmdline(['foo|bar']) == 'foo|bar')
+
+# Detect python bug 1634343: "subprocess swallows empty arguments under win32"
+# <http://sourceforge.net/tracker/index.php?func=detail&aid=1634343&group_id=5470&atid=105470>
+# Also detect: "[ 1710802 ] subprocess must escape redirection characters under win32"
+# <http://sourceforge.net/tracker/index.php?func=detail&aid=1710802&group_id=5470&atid=105470>
+if has_subprocess_bug():
+ import subprocess_fix
+ subprocess.list2cmdline = subprocess_fix.list2cmdline
+
+from sherrors import *
+
+class NonExitingParser(optparse.OptionParser):
+ """OptionParser default behaviour upon error is to print the error message and
+ exit. Raise a utility error instead.
+ """
+ def error(self, msg):
+ raise UtilityError(msg)
+
+#-------------------------------------------------------------------------------
+# set special builtin
+#-------------------------------------------------------------------------------
+OPT_SET = NonExitingParser(usage="set - set or unset options and positional parameters")
+OPT_SET.add_option( '-f', action='store_true', dest='has_f', default=False,
+ help='The shell shall disable pathname expansion.')
+OPT_SET.add_option('-e', action='store_true', dest='has_e', default=False,
+ help="""When this option is on, if a simple command fails for any of the \
+ reasons listed in Consequences of Shell Errors or returns an exit status \
+ value >0, and is not part of the compound list following a while, until, \
+ or if keyword, and is not a part of an AND or OR list, and is not a \
+ pipeline preceded by the ! reserved word, then the shell shall immediately \
+ exit.""")
+OPT_SET.add_option('-x', action='store_true', dest='has_x', default=False,
+ help="""The shell shall write to standard error a trace for each command \
+ after it expands the command and before it executes it. It is unspecified \
+ whether the command that turns tracing off is traced.""")
+
+def builtin_set(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ option, args = OPT_SET.parse_args(args)
+ env = interp.get_env()
+
+ if option.has_f:
+ env.set_opt('-f')
+ if option.has_e:
+ env.set_opt('-e')
+ if option.has_x:
+ env.set_opt('-x')
+ return 0
+
+#-------------------------------------------------------------------------------
+# shift special builtin
+#-------------------------------------------------------------------------------
+def builtin_shift(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ params = interp.get_env().get_positional_args()
+ if args:
+ try:
+ n = int(args[0])
+ if n > len(params):
+ raise ValueError()
+ except ValueError:
+ return 1
+ else:
+ n = 1
+
+ params[:n] = []
+ interp.get_env().set_positional_args(params)
+ return 0
+
+#-------------------------------------------------------------------------------
+# export special builtin
+#-------------------------------------------------------------------------------
+OPT_EXPORT = NonExitingParser(usage="set - set or unset options and positional parameters")
+OPT_EXPORT.add_option('-p', action='store_true', dest='has_p', default=False)
+
+def builtin_export(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ option, args = OPT_EXPORT.parse_args(args)
+ if option.has_p:
+ raise NotImplementedError()
+
+ for arg in args:
+ try:
+ name, value = arg.split('=', 1)
+ except ValueError:
+ name, value = arg, None
+ env = interp.get_env().export(name, value)
+
+ return 0
+
+#-------------------------------------------------------------------------------
+# return special builtin
+#-------------------------------------------------------------------------------
+def builtin_return(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+ res = 0
+ if args:
+ try:
+ res = int(args[0])
+ except ValueError:
+ res = 0
+ if not 0<=res<=255:
+ res = 0
+
+ # BUG: should be last executed command exit code
+ raise ReturnSignal(res)
+
+#-------------------------------------------------------------------------------
+# trap special builtin
+#-------------------------------------------------------------------------------
+def builtin_trap(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+ if len(args) < 2:
+ stderr.write('trap: usage: trap [[arg] signal_spec ...]\n')
+ return 2
+
+ action = args[0]
+ for sig in args[1:]:
+ try:
+ env.traps[sig] = action
+ except Exception as e:
+ stderr.write('trap: %s\n' % str(e))
+ return 0
+
+#-------------------------------------------------------------------------------
+# unset special builtin
+#-------------------------------------------------------------------------------
+OPT_UNSET = NonExitingParser("unset - unset values and attributes of variables and functions")
+OPT_UNSET.add_option( '-f', action='store_true', dest='has_f', default=False)
+OPT_UNSET.add_option( '-v', action='store_true', dest='has_v', default=False)
+
+def builtin_unset(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ option, args = OPT_UNSET.parse_args(args)
+
+ status = 0
+ env = interp.get_env()
+ for arg in args:
+ try:
+ if option.has_f:
+ env.remove_function(arg)
+ else:
+ del env[arg]
+ except KeyError:
+ pass
+ except VarAssignmentError:
+ status = 1
+
+ return status
+
+#-------------------------------------------------------------------------------
+# wait special builtin
+#-------------------------------------------------------------------------------
+def builtin_wait(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ return interp.wait([int(arg) for arg in args])
+
+#-------------------------------------------------------------------------------
+# cat utility
+#-------------------------------------------------------------------------------
+def utility_cat(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ if not args:
+ args = ['-']
+
+ status = 0
+ for arg in args:
+ if arg == '-':
+ data = stdin.read()
+ else:
+ path = os.path.join(env['PWD'], arg)
+ try:
+ f = file(path, 'rb')
+ try:
+ data = f.read()
+ finally:
+ f.close()
+ except IOError as e:
+ if e.errno != errno.ENOENT:
+ raise
+ status = 1
+ continue
+ stdout.write(data)
+ stdout.flush()
+ return status
+
+#-------------------------------------------------------------------------------
+# cd utility
+#-------------------------------------------------------------------------------
+OPT_CD = NonExitingParser("cd - change the working directory")
+
+def utility_cd(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ option, args = OPT_CD.parse_args(args)
+ env = interp.get_env()
+
+ directory = None
+ printdir = False
+ if not args:
+ home = env.get('HOME')
+        if not home:
+            # HOME is unset: behaviour is unspecified, do nothing
+            return 0
+        else:
+            directory = home
+ elif len(args)==1:
+ directory = args[0]
+ if directory=='-':
+ if 'OLDPWD' not in env:
+ raise UtilityError("OLDPWD not set")
+ printdir = True
+ directory = env['OLDPWD']
+ else:
+ raise UtilityError("too many arguments")
+
+ curpath = None
+ # Absolute directories will be handled correctly by the os.path.join call.
+ if not directory.startswith('.') and not directory.startswith('..'):
+ cdpaths = env.get('CDPATH', '.').split(';')
+ for cdpath in cdpaths:
+ p = os.path.join(cdpath, directory)
+ if os.path.isdir(p):
+ curpath = p
+ break
+
+ if curpath is None:
+ curpath = directory
+    curpath = os.path.join(env['PWD'], curpath)
+
+ env['OLDPWD'] = env['PWD']
+ env['PWD'] = curpath
+ if printdir:
+ stdout.write('%s\n' % curpath)
+ return 0
+
+#-------------------------------------------------------------------------------
+# colon utility
+#-------------------------------------------------------------------------------
+def utility_colon(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+ return 0
+
+#-------------------------------------------------------------------------------
+# echo utility
+#-------------------------------------------------------------------------------
+def utility_echo(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ # Echo only takes arguments, no options. Use printf if you need fancy stuff.
+ output = ' '.join(args) + '\n'
+ stdout.write(output)
+ stdout.flush()
+ return 0
+
+#-------------------------------------------------------------------------------
+# egrep utility
+#-------------------------------------------------------------------------------
+# egrep is usually a shell script.
+# Unfortunately, pysh does not support shell scripts *with arguments* right now,
+# so the redirection is implemented here, assuming grep is available.
+def utility_egrep(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ return run_command('grep', ['-E'] + args, interp, env, stdin, stdout,
+ stderr, debugflags)
+
+#-------------------------------------------------------------------------------
+# env utility
+#-------------------------------------------------------------------------------
+def utility_env(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ if args and args[0]=='-i':
+ raise NotImplementedError('env: -i option is not implemented')
+
+ i = 0
+ for arg in args:
+ if '=' not in arg:
+ break
+ # Update the current environment
+ name, value = arg.split('=', 1)
+ env[name] = value
+ i += 1
+
+ if args[i:]:
+ # Find then execute the specified interpreter
+ utility = env.find_in_path(args[i])
+ if not utility:
+ return 127
+ args[i:i+1] = utility
+ name = args[i]
+ args = args[i+1:]
+ try:
+ return run_command(name, args, interp, env, stdin, stdout, stderr,
+ debugflags)
+ except UtilityError:
+            stderr.write('env: failed to execute %s\n' % ' '.join([name]+args))
+ return 126
+ else:
+ for pair in env.get_variables().iteritems():
+ stdout.write('%s=%s\n' % pair)
+ return 0
+
+#-------------------------------------------------------------------------------
+# exit utility
+#-------------------------------------------------------------------------------
+def utility_exit(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ res = None
+ if args:
+ try:
+ res = int(args[0])
+ except ValueError:
+ res = None
+ if not 0<=res<=255:
+ res = None
+
+ if res is None:
+ # BUG: should be last executed command exit code
+ res = 0
+
+ raise ExitSignal(res)
+
+#-------------------------------------------------------------------------------
+# fgrep utility
+#-------------------------------------------------------------------------------
+# see egrep
+def utility_fgrep(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ return run_command('grep', ['-F'] + args, interp, env, stdin, stdout,
+ stderr, debugflags)
+
+#-------------------------------------------------------------------------------
+# gunzip utility
+#-------------------------------------------------------------------------------
+# see egrep
+def utility_gunzip(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ return run_command('gzip', ['-d'] + args, interp, env, stdin, stdout,
+ stderr, debugflags)
+
+#-------------------------------------------------------------------------------
+# kill utility
+#-------------------------------------------------------------------------------
+def utility_kill(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ for arg in args:
+ pid = int(arg)
+ status = subprocess.call(['pskill', '/T', str(pid)],
+ shell=True,
+ stdout=subprocess.PIPE,
+ stderr=subprocess.PIPE)
+ # pskill is asynchronous, hence the stupid polling loop
+ while 1:
+ p = subprocess.Popen(['pslist', str(pid)],
+ shell=True,
+ stdout=subprocess.PIPE,
+ stderr=subprocess.STDOUT)
+ output = p.communicate()[0]
+ if ('process %d was not' % pid) in output:
+ break
+ time.sleep(1)
+ return status
+
+#-------------------------------------------------------------------------------
+# mkdir utility
+#-------------------------------------------------------------------------------
+OPT_MKDIR = NonExitingParser("mkdir - make directories.")
+OPT_MKDIR.add_option('-p', action='store_true', dest='has_p', default=False)
+
+def utility_mkdir(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ # TODO: implement umask
+ # TODO: implement proper utility error report
+ option, args = OPT_MKDIR.parse_args(args)
+ for arg in args:
+ path = os.path.join(env['PWD'], arg)
+ if option.has_p:
+ try:
+ os.makedirs(path)
+ except IOError as e:
+ if e.errno != errno.EEXIST:
+ raise
+ else:
+ os.mkdir(path)
+ return 0
+
+#-------------------------------------------------------------------------------
+# netstat utility
+#-------------------------------------------------------------------------------
+def utility_netstat(name, args, interp, env, stdin, stdout, stderr, debugflags):
+    # Do you really expect me to implement netstat?
+ # This empty form is enough for Mercurial tests since it's
+ # supposed to generate nothing upon success. Faking this test
+ # is not a big deal either.
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+ return 0
+
+#-------------------------------------------------------------------------------
+# pwd utility
+#-------------------------------------------------------------------------------
+OPT_PWD = NonExitingParser("pwd - return working directory name")
+OPT_PWD.add_option('-L', action='store_true', dest='has_L', default=True,
+ help="""If the PWD environment variable contains an absolute pathname of \
+ the current directory that does not contain the filenames dot or dot-dot, \
+ pwd shall write this pathname to standard output. Otherwise, the -L option \
+ shall behave as the -P option.""")
+OPT_PWD.add_option('-P', action='store_true', dest='has_L', default=False,
+ help="""The absolute pathname written shall not contain filenames that, in \
+ the context of the pathname, refer to files of type symbolic link.""")
+
+def utility_pwd(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ option, args = OPT_PWD.parse_args(args)
+ stdout.write('%s\n' % env['PWD'])
+ return 0
+
+#-------------------------------------------------------------------------------
+# printf utility
+#-------------------------------------------------------------------------------
+RE_UNESCAPE = re.compile(r'(\\x[a-zA-Z0-9]{2}|\\[0-7]{1,3}|\\.)')
+
+def utility_printf(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ def replace(m):
+ assert m.group()
+ g = m.group()[1:]
+ if g.startswith('x'):
+ return chr(int(g[1:], 16))
+ if len(g) <= 3 and len([c for c in g if c in '01234567']) == len(g):
+ # Yay, an octal number
+ return chr(int(g, 8))
+ return {
+ 'a': '\a',
+ 'b': '\b',
+ 'f': '\f',
+ 'n': '\n',
+ 'r': '\r',
+ 't': '\t',
+ 'v': '\v',
+ '\\': '\\',
+ }.get(g)
+
+ # Convert escape sequences
+ format = re.sub(RE_UNESCAPE, replace, args[0])
+ stdout.write(format % tuple(args[1:]))
+ return 0
+
+#-------------------------------------------------------------------------------
+# true utility
+#-------------------------------------------------------------------------------
+def utility_true(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+ return 0
+
+#-------------------------------------------------------------------------------
+# sed utility
+#-------------------------------------------------------------------------------
+RE_SED = re.compile(r'^s(.).*\1[a-zA-Z]*$')
+
+# Cygwin sed fails with some expressions when they do not end with a single space.
+# See the unit tests for details. Interestingly, the same expressions work perfectly
+# in the Cygwin shell.
+def utility_sed(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ # Scan pattern arguments and append a space if necessary
+ for i in xrange(len(args)):
+ if not RE_SED.search(args[i]):
+ continue
+ args[i] = args[i] + ' '
+
+ return run_command(name, args, interp, env, stdin, stdout,
+ stderr, debugflags)
+
+#-------------------------------------------------------------------------------
+# sleep utility
+#-------------------------------------------------------------------------------
+def utility_sleep(name, args, interp, env, stdin, stdout, stderr, debugflags):
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+ time.sleep(int(args[0]))
+ return 0
+
+#-------------------------------------------------------------------------------
+# sort utility
+#-------------------------------------------------------------------------------
+OPT_SORT = NonExitingParser("sort - sort, merge, or sequence check text files")
+
+def utility_sort(name, args, interp, env, stdin, stdout, stderr, debugflags):
+
+ def sort(path):
+ if path == '-':
+ lines = stdin.readlines()
+ else:
+ try:
+ f = file(path)
+ try:
+ lines = f.readlines()
+ finally:
+ f.close()
+ except IOError as e:
+ stderr.write(str(e) + '\n')
+ return 1
+
+ if lines and lines[-1][-1]!='\n':
+ lines[-1] = lines[-1] + '\n'
+ return lines
+
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ option, args = OPT_SORT.parse_args(args)
+ alllines = []
+
+ if len(args)<=0:
+ args += ['-']
+
+ # Load all files lines
+ curdir = os.getcwd()
+ try:
+ os.chdir(env['PWD'])
+ for path in args:
+ alllines += sort(path)
+ finally:
+ os.chdir(curdir)
+
+ alllines.sort()
+ for line in alllines:
+ stdout.write(line)
+ return 0
+
+#-------------------------------------------------------------------------------
+# hg utility
+#-------------------------------------------------------------------------------
+
+hgcommands = [
+ 'add',
+ 'addremove',
+ 'commit', 'ci',
+ 'debugrename',
+ 'debugwalk',
+ 'falabala', # Dummy command used in a mercurial test
+ 'incoming',
+ 'locate',
+ 'pull',
+ 'push',
+ 'qinit',
+ 'remove', 'rm',
+ 'rename', 'mv',
+ 'revert',
+ 'showconfig',
+ 'status', 'st',
+ 'strip',
+ ]
+
+def rewriteslashes(name, args):
+ # Several hg commands output file paths, rewrite the separators
+ if len(args) > 1 and name.lower().endswith('python') \
+ and args[0].endswith('hg'):
+ for cmd in hgcommands:
+ if cmd in args[1:]:
+ return True
+
+ # svn output contains many paths with OS specific separators.
+ # Normalize these to unix paths.
+ base = os.path.basename(name)
+ if base.startswith('svn'):
+ return True
+
+ return False
+
+def rewritehg(output):
+ if not output:
+ return output
+ # Rewrite os specific messages
+ output = output.replace(': The system cannot find the file specified',
+ ': No such file or directory')
+ output = re.sub(': Access is denied.*$', ': Permission denied', output)
+ output = output.replace(': No connection could be made because the target machine actively refused it',
+ ': Connection refused')
+ return output
+
+
+def run_command(name, args, interp, env, stdin, stdout,
+ stderr, debugflags):
+ # Execute the command
+ if 'debug-utility' in debugflags:
+ print interp.log(' '.join([name, str(args), interp['PWD']]) + '\n')
+
+ hgbin = interp.options().hgbinary
+ ishg = hgbin and ('hg' in name or args and 'hg' in args[0])
+ unixoutput = 'cygwin' in name or ishg
+
+ exec_env = env.get_variables()
+ try:
+        # BUG: comparing file descriptors is clearly not a reliable way to tell
+        # whether they point to the same underlying object. But within pysh's
+        # limited scope this is usually right; we do not expect complicated
+        # redirections beyond the usual 2>&1.
+        # Still, there is one case we cannot deal with: when stdout and stderr
+        # are redirected *by the pysh caller*. This is the reason for the
+        # --redirect pysh() option.
+        # Now, we want to know whether they are the same because we sometimes
+        # need to transform the command output, mostly removing CR-LF to ensure
+        # that the command output is unix-like. Cygwin utilities are a special
+        # case because they explicitly set their output streams to binary mode,
+        # so we have nothing to do. For all other commands, we have to guess
+        # whether they are sending text data, in which case the transformation
+        # must be done. Again, the NUL character test is unreliable but should
+        # be enough for hg tests.
+ redirected = stdout.fileno()==stderr.fileno()
+ if not redirected:
+ p = subprocess.Popen([name] + args, cwd=env['PWD'], env=exec_env,
+ stdin=stdin, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+ else:
+ p = subprocess.Popen([name] + args, cwd=env['PWD'], env=exec_env,
+ stdin=stdin, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
+ out, err = p.communicate()
+ except WindowsError as e:
+ raise UtilityError(str(e))
+
+ if not unixoutput:
+ def encode(s):
+ if '\0' in s:
+ return s
+ return s.replace('\r\n', '\n')
+ else:
+ encode = lambda s: s
+
+ if rewriteslashes(name, args):
+ encode1_ = encode
+ def encode(s):
+ s = encode1_(s)
+ s = s.replace('\\\\', '\\')
+ s = s.replace('\\', '/')
+ return s
+
+ if ishg:
+ encode2_ = encode
+ def encode(s):
+ return rewritehg(encode2_(s))
+
+ stdout.write(encode(out))
+ if not redirected:
+ stderr.write(encode(err))
+ return p.returncode
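+
+# Editor-added note (illustrative, not in the original source): for an ordinary
+# command (not cygwin, hg, or one matched by rewriteslashes), the encode() chain
+# above only normalizes line endings, so 'a\r\nb' is written out as 'a\nb',
+# while output containing a NUL byte is assumed binary and left untouched.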
+
diff --git a/bitbake/lib/bb/pysh/interp.py b/bitbake/lib/bb/pysh/interp.py
new file mode 100644
index 0000000..25d8c92
--- /dev/null
+++ b/bitbake/lib/bb/pysh/interp.py
@@ -0,0 +1,1367 @@
+# interp.py - shell interpreter for pysh.
+#
+# Copyright 2007 Patrick Mezard
+#
+# This software may be used and distributed according to the terms
+# of the GNU General Public License, incorporated herein by reference.
+
+"""Implement the shell interpreter.
+
+Most references are made to "The Open Group Base Specifications Issue 6".
+<http://www.opengroup.org/onlinepubs/009695399/utilities/xcu_chap02.html>
+"""
+# TODO: document the fact input streams must implement fileno() so Popen will work correctly.
+# it requires non-stdin stream to be implemented as files. Still to be tested...
+# DOC: pathsep is used in PATH instead of ':'. Clearly, there are path syntax issues here.
+# TODO: stop command execution upon error.
+# TODO: sort out the filename/io_number mess. It should be possible to use filenames only.
+# TODO: review subshell implementation
+# TODO: test environment cloning for non-special builtins
+# TODO: set -x should not rebuild commands from tokens, assignments/redirections are lost
+# TODO: unit test for variable assignment
+# TODO: test error management wrt error type/utility type
+# TODO: test for binary output everywhere
+# BUG: debug-parsing does not pass log file to PLY. Maybe a PLY upgrade is necessary.
+import base64
+import cPickle as pickle
+import errno
+import glob
+import os
+import re
+import subprocess
+import sys
+import tempfile
+
+try:
+ s = set()
+ del s
+except NameError:
+ from Set import Set as set
+
+import builtin
+from sherrors import *
+import pyshlex
+import pyshyacc
+
+def mappend(func, *args, **kargs):
+ """Like map but assume func returns a list. Returned lists are merged into
+ a single one.
+ """
+ return reduce(lambda a,b: a+b, map(func, *args, **kargs), [])
+
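+# Editor-added example (not in the original source):
+#   mappend(lambda x: [x, x], [1, 2]) returns [1, 1, 2, 2]
+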
+class FileWrapper:
+ """File object wrapper to ease debugging.
+
+ Allow mode checking and implement file duplication through a simple
+ reference counting scheme. Not sure the latter is really useful since
+ only real file descriptors can be used.
+ """
+ def __init__(self, mode, file, close=True):
+ if mode not in ('r', 'w', 'a'):
+ raise IOError('invalid mode: %s' % mode)
+ self._mode = mode
+ self._close = close
+ if isinstance(file, FileWrapper):
+ if file._refcount[0] <= 0:
+ raise IOError(0, 'Error')
+ self._refcount = file._refcount
+ self._refcount[0] += 1
+ self._file = file._file
+ else:
+ self._refcount = [1]
+ self._file = file
+
+ def dup(self):
+ return FileWrapper(self._mode, self, self._close)
+
+ def fileno(self):
+ """fileno() should be only necessary for input streams."""
+ return self._file.fileno()
+
+ def read(self, size=-1):
+ if self._mode!='r':
+ raise IOError(0, 'Error')
+ return self._file.read(size)
+
+ def readlines(self, *args, **kwargs):
+ return self._file.readlines(*args, **kwargs)
+
+ def write(self, s):
+ if self._mode not in ('w', 'a'):
+ raise IOError(0, 'Error')
+ return self._file.write(s)
+
+ def flush(self):
+ self._file.flush()
+
+ def close(self):
+ if not self._refcount:
+ return
+ assert self._refcount[0] > 0
+
+ self._refcount[0] -= 1
+ if self._refcount[0] == 0:
+ self._mode = 'c'
+ if self._close:
+ self._file.close()
+ self._refcount = None
+
+ def mode(self):
+ return self._mode
+
+ def __getattr__(self, name):
+ if name == 'name':
+ self.name = getattr(self._file, name)
+ return self.name
+ else:
+ raise AttributeError(name)
+
+ def __del__(self):
+ self.close()
+
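+# Usage sketch (editor-added, not in the original source; 'log.txt' is only an
+# illustrative name): duplicated wrappers share a reference count, so with
+#   w = FileWrapper('w', open('log.txt', 'w'))
+#   d = w.dup()
+# the underlying file is closed only after both w.close() and d.close().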
+
+def win32_open_devnull(mode):
+ return open('NUL', mode)
+
+
+class Redirections:
+ """Stores open files and their mapping to pseudo-sh file descriptor.
+ """
+ # BUG: redirections are not handled correctly: 1>&3 2>&3 3>&4 does
+ # not make 1 redirect to 4
+ def __init__(self, stdin=None, stdout=None, stderr=None):
+ self._descriptors = {}
+ if stdin is not None:
+ self._add_descriptor(0, stdin)
+ if stdout is not None:
+ self._add_descriptor(1, stdout)
+ if stderr is not None:
+ self._add_descriptor(2, stderr)
+
+ def add_here_document(self, interp, name, content, io_number=None):
+ if io_number is None:
+ io_number = 0
+
+ if name==pyshlex.unquote_wordtree(name):
+ content = interp.expand_here_document(('TOKEN', content))
+
+ # Write document content in a temporary file
+ tmp = tempfile.TemporaryFile()
+ try:
+ tmp.write(content)
+ tmp.flush()
+ tmp.seek(0)
+ self._add_descriptor(io_number, FileWrapper('r', tmp))
+ except:
+ tmp.close()
+ raise
+
+ def add(self, interp, op, filename, io_number=None):
+ if op not in ('<', '>', '>|', '>>', '>&'):
+ # TODO: add descriptor duplication and here_documents
+ raise RedirectionError('Unsupported redirection operator "%s"' % op)
+
+ if io_number is not None:
+ io_number = int(io_number)
+
+ if (op == '>&' and filename.isdigit()) or filename=='-':
+ # No expansion for file descriptors, quote them if you want a filename
+ fullname = filename
+ else:
+ if filename.startswith('/'):
+ # TODO: win32 kludge
+ if filename=='/dev/null':
+ fullname = 'NUL'
+ else:
+ # TODO: handle absolute pathnames, they are unlikely to exist on the
+ # current platform (win32 for instance).
+ raise NotImplementedError()
+ else:
+ fullname = interp.expand_redirection(('TOKEN', filename))
+ if not fullname:
+ raise RedirectionError('%s: ambiguous redirect' % filename)
+ # Build absolute path based on PWD
+ fullname = os.path.join(interp.get_env()['PWD'], fullname)
+
+ if op=='<':
+ return self._add_input_redirection(interp, fullname, io_number)
+ elif op in ('>', '>|'):
+ clobber = ('>|'==op)
+ return self._add_output_redirection(interp, fullname, io_number, clobber)
+ elif op=='>>':
+ return self._add_output_appending(interp, fullname, io_number)
+ elif op=='>&':
+ return self._dup_output_descriptor(fullname, io_number)
+
+ def close(self):
+ if self._descriptors is not None:
+ for desc in self._descriptors.itervalues():
+ desc.flush()
+ desc.close()
+ self._descriptors = None
+
+ def stdin(self):
+ return self._descriptors[0]
+
+ def stdout(self):
+ return self._descriptors[1]
+
+ def stderr(self):
+ return self._descriptors[2]
+
+ def clone(self):
+ clone = Redirections()
+ for desc, fileobj in self._descriptors.iteritems():
+ clone._descriptors[desc] = fileobj.dup()
+ return clone
+
+ def _add_output_redirection(self, interp, filename, io_number, clobber):
+ if io_number is None:
+ # io_number defaults to standard output
+ io_number = 1
+
+ if not clobber and interp.get_env().has_opt('-C') and os.path.isfile(filename):
+ # File already exists in no-clobber mode, bail out
+ raise RedirectionError('File "%s" already exists' % filename)
+
+ # Open and register
+ self._add_file_descriptor(io_number, filename, 'w')
+
+ def _add_output_appending(self, interp, filename, io_number):
+ if io_number is None:
+ io_number = 1
+ self._add_file_descriptor(io_number, filename, 'a')
+
+ def _add_input_redirection(self, interp, filename, io_number):
+ if io_number is None:
+ io_number = 0
+ self._add_file_descriptor(io_number, filename, 'r')
+
+ def _add_file_descriptor(self, io_number, filename, mode):
+ try:
+ if filename.startswith('/'):
+ if filename=='/dev/null':
+ f = win32_open_devnull(mode+'b')
+ else:
+ # TODO: handle absolute pathnames, they are unlikely to exist on the
+ # current platform (win32 for instance).
+ raise NotImplementedError('cannot open absolute path %s' % repr(filename))
+ else:
+ f = file(filename, mode+'b')
+ except IOError as e:
+ raise RedirectionError(str(e))
+
+ wrapper = None
+ try:
+ wrapper = FileWrapper(mode, f)
+ f = None
+ self._add_descriptor(io_number, wrapper)
+ except:
+ if f: f.close()
+ if wrapper: wrapper.close()
+ raise
+
+ def _dup_output_descriptor(self, source_fd, dest_fd):
+ if source_fd is None:
+ source_fd = 1
+ self._dup_file_descriptor(source_fd, dest_fd, 'w')
+
+ def _dup_file_descriptor(self, source_fd, dest_fd, mode):
+ source_fd = int(source_fd)
+ if source_fd not in self._descriptors:
+ raise RedirectionError('"%s" is not a valid file descriptor' % str(source_fd))
+ source = self._descriptors[source_fd]
+
+ if source.mode()!=mode:
+ raise RedirectionError('Descriptor %s cannot be duplicated in mode "%s"' % (str(source), mode))
+
+ if dest_fd=='-':
+ # Close the source descriptor
+ del self._descriptors[source_fd]
+ source.close()
+ else:
+ dest_fd = int(dest_fd)
+ if dest_fd not in self._descriptors:
+ raise RedirectionError('Cannot replace file descriptor %s' % str(dest_fd))
+
+ dest = self._descriptors[dest_fd]
+ if dest.mode()!=mode:
+ raise RedirectionError('Descriptor %s cannot be redirected in mode "%s"' % (str(dest), mode))
+
+ self._descriptors[dest_fd] = source.dup()
+ dest.close()
+
+ def _add_descriptor(self, io_number, file):
+ io_number = int(io_number)
+
+ if io_number in self._descriptors:
+ # Close the current descriptor
+ d = self._descriptors[io_number]
+ del self._descriptors[io_number]
+ d.close()
+
+ self._descriptors[io_number] = file
+
+ def __str__(self):
+ names = [('%d=%r' % (k, getattr(v, 'name', None))) for k,v
+ in self._descriptors.iteritems()]
+ names = ','.join(names)
+ return 'Redirections(%s)' % names
+
+ def __del__(self):
+ self.close()
+
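+# Usage sketch (editor-added, not in the original source): a shell redirection
+# such as 2>&1 maps to
+#   redirs.add(interp, '>&', '1', io_number=2)
+# which makes descriptor 2 a duplicate of descriptor 1.
+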
+def cygwin_to_windows_path(path):
+ """Turn /cygdrive/c/foo into c:/foo, or return path if it
+ is not a cygwin path.
+ """
+ if not path.startswith('/cygdrive/'):
+ return path
+ path = path[len('/cygdrive/'):]
+ path = path[:1] + ':' + path[1:]
+ return path
+
+def win32_to_unix_path(path):
+ if path is not None:
+ path = path.replace('\\', '/')
+ return path
+
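+# Editor-added examples (not in the original source):
+#   cygwin_to_windows_path('/cygdrive/c/foo') returns 'c:/foo'
+#   win32_to_unix_path('c:\\foo\\bar') returns 'c:/foo/bar'
+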
+_RE_SHEBANG = re.compile(r'^\#!\s?([^\s]+)(?:\s([^\s]+))?')
+_SHEBANG_CMDS = {
+ '/usr/bin/env': 'env',
+ '/bin/sh': 'pysh',
+ 'python': 'python',
+}
+
+def resolve_shebang(path, ignoreshell=False):
+ """Return a list of arguments as shebang interpreter call or an empty list
+ if path does not refer to an executable script.
+ See <http://www.opengroup.org/austin/docs/austin_51r2.txt>.
+
+ ignoreshell - set to True to ignore sh shebangs. Return an empty list instead.
+ """
+ try:
+ f = file(path)
+ try:
+ # At most 80 characters in the first line
+ header = f.read(80).splitlines()[0]
+ finally:
+ f.close()
+
+ m = _RE_SHEBANG.search(header)
+ if not m:
+ return []
+ cmd, arg = m.group(1,2)
+ if os.path.isfile(cmd):
+ # Keep this one, the hg script for instance contains a weird windows
+ # shebang referencing the current python install.
+ cmdfile = os.path.basename(cmd).lower()
+ if cmdfile == 'python.exe':
+ cmd = 'python'
+ pass
+ elif cmd not in _SHEBANG_CMDS:
+ raise CommandNotFound('Unknown interpreter "%s" referenced in '\
+ 'shebang' % header)
+ cmd = _SHEBANG_CMDS.get(cmd)
+ if cmd is None or (ignoreshell and cmd == 'pysh'):
+ return []
+ if arg is None:
+ return [cmd, win32_to_unix_path(path)]
+ return [cmd, arg, win32_to_unix_path(path)]
+ except IOError as e:
+ if e.errno!=errno.ENOENT and \
+ (e.errno!=errno.EPERM and not os.path.isdir(path)): # Opening a directory raises EPERM
+ raise
+ return []
+
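+# Usage sketch (editor-added, not in the original source): for a readable script
+# whose first line is '#!/bin/sh', resolve_shebang(path) returns
+# ['pysh', path], or [] when called with ignoreshell=True.
+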
+def win32_find_in_path(name, path):
+ if isinstance(path, str):
+ path = path.split(os.pathsep)
+
+ exts = os.environ.get('PATHEXT', '').lower().split(os.pathsep)
+ for p in path:
+ p_name = os.path.join(p, name)
+
+ prefix = resolve_shebang(p_name)
+ if prefix:
+ return prefix
+
+ for ext in exts:
+ p_name_ext = p_name + ext
+ if os.path.exists(p_name_ext):
+ return [win32_to_unix_path(p_name_ext)]
+ return []
+
+class Traps(dict):
+ def __setitem__(self, key, value):
+ if key not in ('EXIT',):
+ raise NotImplementedError()
+ super(Traps, self).__setitem__(key, value)
+
+# IFS white spaces character class
+_IFS_WHITESPACES = (' ', '\t', '\n')
+
+class Environment:
+ """Environment holds environment variables, export table, function
+ definitions and whatever is defined in 2.12 "Shell Execution Environment",
+ redirection excepted.
+ """
+ def __init__(self, pwd):
+ self._opt = set() #Shell options
+
+ self._functions = {}
+ self._env = {'?': '0', '#': '0'}
+ self._exported = set([
+ 'HOME', 'IFS', 'PATH'
+ ])
+
+ # Set environment vars with side-effects
+ self._ifs_ws = None # Set of IFS whitespace characters
+ self._ifs_re = None # Regular expression used to split between words using IFS classes
+ self['IFS'] = ''.join(_IFS_WHITESPACES) #Default environment values
+ self['PWD'] = pwd
+ self.traps = Traps()
+
+ def clone(self, subshell=False):
+ env = Environment(self['PWD'])
+ env._opt = set(self._opt)
+ for k,v in self.get_variables().iteritems():
+ if k in self._exported:
+ env.export(k,v)
+ elif subshell:
+ env[k] = v
+
+ if subshell:
+ env._functions = dict(self._functions)
+
+ return env
+
+ def __getitem__(self, key):
+ if key in ('@', '*', '-', '$'):
+ raise NotImplementedError('%s is not implemented' % repr(key))
+ return self._env[key]
+
+ def get(self, key, defval=None):
+ try:
+ return self[key]
+ except KeyError:
+ return defval
+
+ def __setitem__(self, key, value):
+ if key=='IFS':
+ # Update the whitespace/non-whitespace classes
+ self._update_ifs(value)
+ elif key=='PWD':
+ pwd = os.path.abspath(value)
+ if not os.path.isdir(pwd):
+ raise VarAssignmentError('Invalid directory %s' % value)
+ value = pwd
+ elif key in ('?', '!'):
+ value = str(int(value))
+ self._env[key] = value
+
+ def __delitem__(self, key):
+ if key in ('IFS', 'PWD', '?'):
+ raise VarAssignmentError('%s cannot be unset' % key)
+ del self._env[key]
+
+ def __contains__(self, item):
+ return item in self._env
+
+ def set_positional_args(self, args):
+ """Set the content of 'args' as positional argument from 1 to len(args).
+ Return previous argument as a list of strings.
+ """
+ # Save and remove previous arguments
+ prevargs = []
+ for i in xrange(int(self._env['#'])):
+ i = str(i+1)
+ prevargs.append(self._env[i])
+ del self._env[i]
+ self._env['#'] = '0'
+
+ #Set new ones
+ for i,arg in enumerate(args):
+ self._env[str(i+1)] = str(arg)
+ self._env['#'] = str(len(args))
+
+ return prevargs
+
+ def get_positional_args(self):
+ return [self._env[str(i+1)] for i in xrange(int(self._env['#']))]
+
+ def get_variables(self):
+ return dict(self._env)
+
+ def export(self, key, value=None):
+ if value is not None:
+ self[key] = value
+ self._exported.add(key)
+
+ def get_exported(self):
+ return [(k,self._env.get(k)) for k in self._exported]
+
+ def split_fields(self, word):
+ if not self._ifs_ws or not word:
+ return [word]
+ return re.split(self._ifs_re, word)
+
+ def _update_ifs(self, value):
+ """Update the split_fields related variables when IFS character set is
+ changed.
+ """
+ # TODO: handle NULL IFS
+
+ # Separate characters in whitespace and non-whitespace
+ chars = set(value)
+ ws = [c for c in chars if c in _IFS_WHITESPACES]
+ nws = [c for c in chars if c not in _IFS_WHITESPACES]
+
+ # Keep whitespaces in a string for left and right stripping
+ self._ifs_ws = ''.join(ws)
+
+ # Build a regexp to split fields
+ trailing = '[' + ''.join([re.escape(c) for c in ws]) + ']'
+ if nws:
+ # A single non-whitespace separator, optionally surrounded by
+ # whitespace, or a run of whitespace alone
+ nws = '[' + ''.join([re.escape(c) for c in nws]) + ']'
+ nws = '(?:' + trailing + '*' + nws + trailing + '*' + '|' + trailing + '+)'
+ else:
+ # Whitespace separators only
+ nws = trailing + '+'
+ self._ifs_re = re.compile(nws)
+
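+ # Editor-added example (not in the original source): with the default
+ # IFS of space, tab and newline, split_fields('a  b') returns
+ # ['a', 'b']; an empty IFS leaves the word unsplit.
+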
+ def has_opt(self, opt, val=None):
+ return (opt, val) in self._opt
+
+ def set_opt(self, opt, val=None):
+ self._opt.add((opt, val))
+
+ def find_in_path(self, name, pwd=False):
+ path = self._env.get('PATH', '').split(os.pathsep)
+ if pwd:
+ path[:0] = [self['PWD']]
+ if os.name == 'nt':
+ return win32_find_in_path(name, self._env.get('PATH', ''))
+ else:
+ raise NotImplementedError()
+
+ def define_function(self, name, body):
+ if not is_name(name):
+ raise ShellSyntaxError('%s is not a valid function name' % repr(name))
+ self._functions[name] = body
+
+ def remove_function(self, name):
+ del self._functions[name]
+
+ def is_function(self, name):
+ return name in self._functions
+
+ def get_function(self, name):
+ return self._functions.get(name)
+
+
+name_charset = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_'
+name_charset = dict(zip(name_charset,name_charset))
+
+def match_name(s):
+ """Return the length in characters of the longest prefix made of name
+ allowed characters in s.
+ """
+ for i,c in enumerate(s):
+ if c not in name_charset:
+ return s[:i]
+ return s
+
+def is_name(s):
+ return len([c for c in s if c not in name_charset])<=0
+
+def is_special_param(c):
+ return len(c)==1 and c in ('@','*','#','?','-','$','!','0')
+
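+# Editor-added examples (not in the original source):
+#   match_name('foo-bar') returns 'foo'
+#   is_name('foo_1') is True, is_name('foo-bar') is False
+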
+def utility_not_implemented(name, *args, **kwargs):
+ raise NotImplementedError('%s utility is not implemented' % name)
+
+
+class Utility:
+ """Define utilities properties:
+ func -- utility callable. See builtin module for utility samples.
+ is_special -- see XCU 2.8.
+ """
+ def __init__(self, func, is_special=0):
+ self.func = func
+ self.is_special = bool(is_special)
+
+
+def encodeargs(args):
+ def encodearg(s):
+ lines = base64.encodestring(s)
+ lines = [l.splitlines()[0] for l in lines]
+ return ''.join(lines)
+
+ s = pickle.dumps(args)
+ return encodearg(s)
+
+def decodeargs(s):
+ s = base64.decodestring(s)
+ return pickle.loads(s)
+
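+# Editor-added example (not in the original source): the pair round-trips any
+# picklable value through a newline-free string, e.g.
+#   decodeargs(encodeargs(['echo', 1])) returns ['echo', 1]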
+
+class GlobError(Exception):
+ pass
+
+class Options:
+ def __init__(self):
+ # True if Mercurial operates with binary streams
+ self.hgbinary = True
+
+class Interpreter:
+ # Implementation is very basic: the execute() method just makes a DFS on the
+ # AST and executes nodes one by one. Nodes are tuples (name, obj) where name
+ # is a string identifier and obj is the AST element returned by the parser.
+ #
+ # Handlers are named after the node identifiers.
+ # TODO: check node names and remove the switch in execute with some
+ # dynamic getattr() call to find node handlers.
+ """Shell interpreter.
+
+ The following debugging flags can be passed:
+ debug-parsing - enable PLY debugging.
+ debug-tree - print the generated AST.
+ debug-cmd - trace command execution before word expansion, plus exit status.
+ debug-utility - trace utility execution.
+ """
+
+ # List supported commands.
+ COMMANDS = {
+ 'cat': Utility(builtin.utility_cat,),
+ 'cd': Utility(builtin.utility_cd,),
+ ':': Utility(builtin.utility_colon,),
+ 'echo': Utility(builtin.utility_echo),
+ 'env': Utility(builtin.utility_env),
+ 'exit': Utility(builtin.utility_exit),
+ 'export': Utility(builtin.builtin_export, is_special=1),
+ 'egrep': Utility(builtin.utility_egrep),
+ 'fgrep': Utility(builtin.utility_fgrep),
+ 'gunzip': Utility(builtin.utility_gunzip),
+ 'kill': Utility(builtin.utility_kill),
+ 'mkdir': Utility(builtin.utility_mkdir),
+ 'netstat': Utility(builtin.utility_netstat),
+ 'printf': Utility(builtin.utility_printf),
+ 'pwd': Utility(builtin.utility_pwd),
+ 'return': Utility(builtin.builtin_return, is_special=1),
+ 'sed': Utility(builtin.utility_sed,),
+ 'set': Utility(builtin.builtin_set,),
+ 'shift': Utility(builtin.builtin_shift,),
+ 'sleep': Utility(builtin.utility_sleep,),
+ 'sort': Utility(builtin.utility_sort,),
+ 'trap': Utility(builtin.builtin_trap, is_special=1),
+ 'true': Utility(builtin.utility_true),
+ 'unset': Utility(builtin.builtin_unset, is_special=1),
+ 'wait': Utility(builtin.builtin_wait, is_special=1),
+ }
+
+ def __init__(self, pwd, debugflags = [], env=None, redirs=None, stdin=None,
+ stdout=None, stderr=None, opts=Options()):
+ self._env = env
+ if self._env is None:
+ self._env = Environment(pwd)
+ self._children = {}
+
+ self._redirs = redirs
+ self._close_redirs = False
+
+ if self._redirs is None:
+ if stdin is None:
+ stdin = sys.stdin
+ if stdout is None:
+ stdout = sys.stdout
+ if stderr is None:
+ stderr = sys.stderr
+ stdin = FileWrapper('r', stdin, False)
+ stdout = FileWrapper('w', stdout, False)
+ stderr = FileWrapper('w', stderr, False)
+ self._redirs = Redirections(stdin, stdout, stderr)
+ self._close_redirs = True
+
+ self._debugflags = list(debugflags)
+ self._logfile = sys.stderr
+ self._options = opts
+
+ def close(self):
+ """Must be called when the interpreter is no longer used."""
+ script = self._env.traps.get('EXIT')
+ if script:
+ try:
+ self.execute_script(script=script)
+ except:
+ pass
+
+ if self._redirs is not None and self._close_redirs:
+ self._redirs.close()
+ self._redirs = None
+
+ def log(self, s):
+ self._logfile.write(s)
+ self._logfile.flush()
+
+ def __getitem__(self, key):
+ return self._env[key]
+
+ def __setitem__(self, key, value):
+ self._env[key] = value
+
+ def options(self):
+ return self._options
+
+ def redirect(self, redirs, ios):
+ def add_redir(io):
+ if isinstance(io, pyshyacc.IORedirect):
+ redirs.add(self, io.op, io.filename, io.io_number)
+ else:
+ redirs.add_here_document(self, io.name, io.content, io.io_number)
+
+ map(add_redir, ios)
+ return redirs
+
+ def execute_script(self, script=None, ast=None, sourced=False,
+ scriptpath=None):
+ """If script is not None, parse the input. Otherwise takes the supplied
+ AST. Then execute the AST.
+ Return the script exit status.
+ """
+ try:
+ if scriptpath is not None:
+ self._env['0'] = os.path.abspath(scriptpath)
+
+ if script is not None:
+ debug_parsing = ('debug-parsing' in self._debugflags)
+ cmds, script = pyshyacc.parse(script, True, debug_parsing)
+ if 'debug-tree' in self._debugflags:
+ pyshyacc.print_commands(cmds, self._logfile)
+ self._logfile.flush()
+ else:
+ cmds, script = ast, ''
+
+ status = 0
+ for cmd in cmds:
+ try:
+ status = self.execute(cmd)
+ except ExitSignal as e:
+ if sourced:
+ raise
+ status = int(e.args[0])
+ return status
+ except ShellError:
+ self._env['?'] = 1
+ raise
+ if 'debug-utility' in self._debugflags or 'debug-cmd' in self._debugflags:
+ self.log('returncode ' + str(status)+ '\n')
+ return status
+ except CommandNotFound as e:
+ print >>self._redirs.stderr, str(e)
+ self._redirs.stderr.flush()
+ # Command not found by non-interactive shell
+ # return 127
+ raise
+ except RedirectionError as e:
+ # TODO: should be handled depending on the utility status
+ print >>self._redirs.stderr, str(e)
+ self._redirs.stderr.flush()
+ # Command not found by non-interactive shell
+ # return 127
+ raise
+
+ def dotcommand(self, env, args):
+ if len(args) < 1:
+ raise ShellError('. expects at least one argument')
+ path = args[0]
+ if '/' not in path:
+ found = env.find_in_path(args[0], True)
+ if found:
+ path = found[0]
+ script = file(path).read()
+ return self.execute_script(script=script, sourced=True)
+
+ def execute(self, token, redirs=None):
+ """Execute and AST subtree with supplied redirections overriding default
+ interpreter ones.
+ Return the exit status.
+ """
+ if not token:
+ return 0
+
+ if redirs is None:
+ redirs = self._redirs
+
+ if isinstance(token, list):
+ # Commands sequence
+ res = 0
+ for t in token:
+ res = self.execute(t, redirs)
+ return res
+
+ type, value = token
+ status = 0
+ if type=='simple_command':
+ redirs_copy = redirs.clone()
+ try:
+ # TODO: define and handle command return values
+ # TODO: implement set -e
+ status = self._execute_simple_command(value, redirs_copy)
+ finally:
+ redirs_copy.close()
+ elif type=='pipeline':
+ status = self._execute_pipeline(value, redirs)
+ elif type=='and_or':
+ status = self._execute_and_or(value, redirs)
+ elif type=='for_clause':
+ status = self._execute_for_clause(value, redirs)
+ elif type=='while_clause':
+ status = self._execute_while_clause(value, redirs)
+ elif type=='function_definition':
+ status = self._execute_function_definition(value, redirs)
+ elif type=='brace_group':
+ status = self._execute_brace_group(value, redirs)
+ elif type=='if_clause':
+ status = self._execute_if_clause(value, redirs)
+ elif type=='subshell':
+ status = self.subshell(ast=value.cmds, redirs=redirs)
+ elif type=='async':
+ status = self._asynclist(value)
+ elif type=='redirect_list':
+ redirs_copy = self.redirect(redirs.clone(), value.redirs)
+ try:
+ status = self.execute(value.cmd, redirs_copy)
+ finally:
+ redirs_copy.close()
+ else:
+ raise NotImplementedError('Unsupported token type ' + type)
+
+ if status < 0:
+ status = 255
+ return status
+
+ def _execute_if_clause(self, if_clause, redirs):
+ cond_status = self.execute(if_clause.cond, redirs)
+ if cond_status==0:
+ return self.execute(if_clause.if_cmds, redirs)
+ else:
+ return self.execute(if_clause.else_cmds, redirs)
+
+ def _execute_brace_group(self, group, redirs):
+ status = 0
+ for cmd in group.cmds:
+ status = self.execute(cmd, redirs)
+ return status
+
+ def _execute_function_definition(self, fundef, redirs):
+ self._env.define_function(fundef.name, fundef.body)
+ return 0
+
+ def _execute_while_clause(self, while_clause, redirs):
+ status = 0
+ while 1:
+ cond_status = 0
+ for cond in while_clause.condition:
+ cond_status = self.execute(cond, redirs)
+
+ if cond_status:
+ break
+
+ for cmd in while_clause.cmds:
+ status = self.execute(cmd, redirs)
+
+ return status
+
+ def _execute_for_clause(self, for_clause, redirs):
+ if not is_name(for_clause.name):
+ raise ShellSyntaxError('%s is not a valid name' % repr(for_clause.name))
+ items = mappend(self.expand_token, for_clause.items)
+
+ status = 0
+ for item in items:
+ self._env[for_clause.name] = item
+ for cmd in for_clause.cmds:
+ status = self.execute(cmd, redirs)
+ return status
+
+ def _execute_and_or(self, or_and, redirs):
+ res = self.execute(or_and.left, redirs)
+ if (or_and.op=='&&' and res==0) or (or_and.op!='&&' and res!=0):
+ res = self.execute(or_and.right, redirs)
+ return res
+
+ def _execute_pipeline(self, pipeline, redirs):
+ if len(pipeline.commands)==1:
+ status = self.execute(pipeline.commands[0], redirs)
+ else:
+ # Execute all commands one after the other
+ status = 0
+ inpath, outpath = None, None
+ try:
+ # Command inputs and outputs cannot really be plugged together as
+ # a real shell does. Run commands sequentially and chain their
+ # input/output through temporary files.
+ tmpfd, inpath = tempfile.mkstemp()
+ os.close(tmpfd)
+ tmpfd, outpath = tempfile.mkstemp()
+ os.close(tmpfd)
+
+ inpath = win32_to_unix_path(inpath)
+ outpath = win32_to_unix_path(outpath)
+
+ for i, cmd in enumerate(pipeline.commands):
+ call_redirs = redirs.clone()
+ try:
+ if i!=0:
+ call_redirs.add(self, '<', inpath)
+ if i!=len(pipeline.commands)-1:
+ call_redirs.add(self, '>', outpath)
+
+ status = self.execute(cmd, call_redirs)
+
+ # Chain inputs/outputs
+ inpath, outpath = outpath, inpath
+ finally:
+ call_redirs.close()
+ finally:
+ if inpath: os.remove(inpath)
+ if outpath: os.remove(outpath)
+
+ if pipeline.reverse_status:
+ status = int(not status)
+ self._env['?'] = status
+ return status
+
+ def _execute_function(self, name, args, interp, env, stdin, stdout, stderr, *others):
+ assert interp is self
+
+ func = env.get_function(name)
+ #Set positional parameters
+ prevargs = None
+ try:
+ prevargs = env.set_positional_args(args)
+ try:
+ redirs = Redirections(stdin.dup(), stdout.dup(), stderr.dup())
+ try:
+ status = self.execute(func, redirs)
+ finally:
+ redirs.close()
+ except ReturnSignal as e:
+ status = int(e.args[0])
+ env['?'] = status
+ return status
+ finally:
+ #Reset positional parameters
+ if prevargs is not None:
+ env.set_positional_args(prevargs)
+
+ def _execute_simple_command(self, token, redirs):
+ """Can raise ReturnSignal when return builtin is called, ExitSignal when
+ exit is called, and other shell exceptions upon builtin failures.
+ """
+ debug_command = 'debug-cmd' in self._debugflags
+ if debug_command:
+ self.log('word' + repr(token.words) + '\n')
+ self.log('assigns' + repr(token.assigns) + '\n')
+ self.log('redirs' + repr(token.redirs) + '\n')
+
+ is_special = None
+ env = self._env
+
+ try:
+ # Word expansion
+ args = []
+ for word in token.words:
+ args += self.expand_token(word)
+ if is_special is None and args:
+ is_special = env.is_function(args[0]) or \
+ (args[0] in self.COMMANDS and self.COMMANDS[args[0]].is_special)
+
+ if debug_command:
+ self.log('_execute_simple_command' + str(args) + '\n')
+
+ if not args:
+ # Redirections happen in a subshell
+ redirs = redirs.clone()
+ elif not is_special:
+ env = self._env.clone()
+
+ # Redirections
+ self.redirect(redirs, token.redirs)
+
+ # Variables assignments
+ res = 0
+ for type,(k,v) in token.assigns:
+ status, expanded = self.expand_variable((k,v))
+ if status is not None:
+ res = status
+ if args:
+ env.export(k, expanded)
+ else:
+ env[k] = expanded
+
+ if args and args[0] in ('.', 'source'):
+ res = self.dotcommand(env, args[1:])
+ elif args:
+ if args[0] in self.COMMANDS:
+ command = self.COMMANDS[args[0]]
+ elif env.is_function(args[0]):
+ command = Utility(self._execute_function, is_special=True)
+ else:
+ if not '/' in args[0].replace('\\', '/'):
+ cmd = env.find_in_path(args[0])
+ if not cmd:
+ # TODO: test error code on unknown command => 127
+ raise CommandNotFound('Unknown command: "%s"' % args[0])
+ else:
+ # Handle commands like '/cygdrive/c/foo.bat'
+ cmd = cygwin_to_windows_path(args[0])
+ if not os.path.exists(cmd):
+ raise CommandNotFound('%s: No such file or directory' % args[0])
+ shebang = resolve_shebang(cmd)
+ if shebang:
+ cmd = shebang
+ else:
+ cmd = [cmd]
+ args[0:1] = cmd
+ command = Utility(builtin.run_command)
+
+ # Command execution
+ if 'debug-cmd' in self._debugflags:
+ self.log('redirections ' + str(redirs) + '\n')
+
+ res = command.func(args[0], args[1:], self, env,
+ redirs.stdin(), redirs.stdout(),
+ redirs.stderr(), self._debugflags)
+
+ if self._env.has_opt('-x'):
+ # Trace command execution in shell environment
+ # BUG: would be hard to reproduce a real shell behaviour since
+ # the AST is not annotated with source lines/tokens.
+ self._redirs.stdout().write(' '.join(args))
+
+ except ReturnSignal:
+ raise
+ except ShellError as e:
+ if is_special or isinstance(e, (ExitSignal,
+ ShellSyntaxError, ExpansionError)):
+ raise e
+ self._redirs.stderr().write(str(e)+'\n')
+ return 1
+
+ return res
+
+ def expand_token(self, word):
+ """Expand a word as specified in [2.6 Word Expansions]. Return the list
+ of expanded words.
+ """
+ status, wtrees = self._expand_word(word)
+ return map(pyshlex.wordtree_as_string, wtrees)
+
+ def expand_variable(self, word):
+ """Return a status code (or None if no command expansion occurred)
+ and a single word.
+ """
+ status, wtrees = self._expand_word(word, pathname=False, split=False)
+ words = map(pyshlex.wordtree_as_string, wtrees)
+ assert len(words)==1
+ return status, words[0]
+
+ def expand_here_document(self, word):
+ """Return the expanded document as a single word. The here document is
+ assumed to be unquoted.
+ """
+ status, wtrees = self._expand_word(word, pathname=False,
+ split=False, here_document=True)
+ words = map(pyshlex.wordtree_as_string, wtrees)
+ assert len(words)==1
+ return words[0]
+
+ def expand_redirection(self, word):
+ """Return a single word."""
+ return self.expand_variable(word)[1]
+
+ def get_env(self):
+ return self._env
+
+ def _expand_word(self, token, pathname=True, split=True, here_document=False):
+ wtree = pyshlex.make_wordtree(token[1], here_document=here_document)
+
+ # TODO: implement tilde expansion
+ def expand(wtree):
+ """Return a pseudo wordtree: the tree or its subelements can be empty
+ lists when no value results from the expansion.
+ """
+ status = None
+ for part in wtree:
+ if not isinstance(part, list):
+ continue
+ if part[0]in ("'", '\\'):
+ continue
+ elif part[0] in ('`', '$('):
+ status, result = self._expand_command(part)
+ part[:] = result
+ elif part[0] in ('$', '${'):
+ part[:] = self._expand_parameter(part, wtree[0]=='"', split)
+ elif part[0] in ('', '"'):
+ status, result = expand(part)
+ part[:] = result
+ else:
+ raise NotImplementedError('%s expansion is not implemented'
+ % part[0])
+ # [] is returned when an expansion results in no field,
+ # like an empty $@
+ wtree = [p for p in wtree if p != []]
+ if len(wtree) < 3:
+ return status, []
+ return status, wtree
+
+ status, wtree = expand(wtree)
+ if len(wtree) == 0:
+ return status, wtree
+ wtree = pyshlex.normalize_wordtree(wtree)
+
+ if split:
+ wtrees = self._split_fields(wtree)
+ else:
+ wtrees = [wtree]
+
+ if pathname:
+ wtrees = mappend(self._expand_pathname, wtrees)
+
+ wtrees = map(self._remove_quotes, wtrees)
+ return status, wtrees
+
+ def _expand_command(self, wtree):
+ # BUG: there is something to do with backslashes and quoted
+ # characters here
+ command = pyshlex.wordtree_as_string(wtree[1:-1])
+ status, output = self.subshell_output(command)
+ return status, ['', output, '']
+
+ def _expand_parameter(self, wtree, quoted=False, split=False):
+ """Return a valid wtree or an empty list when no parameter results."""
+ # Get the parameter name
+ # TODO: implement weird expansion rules with ':'
+ name = pyshlex.wordtree_as_string(wtree[1:-1])
+ if not is_name(name) and not is_special_param(name):
+ raise ExpansionError('Bad substitution "%s"' % name)
+ # TODO: implement special parameters
+ if name in ('@', '*'):
+ args = self._env.get_positional_args()
+ if len(args) == 0:
+ return []
+ if len(args)<2:
+ return ['', ''.join(args), '']
+
+ sep = self._env.get('IFS', '')[:1]
+ if split and quoted and name=='@':
+ # Introduce a new token to tell the caller that these parameters
+ # cause a split as specified in 2.5.2
+ return ['@'] + args + ['']
+ else:
+ return ['', sep.join(args), '']
+
+ return ['', self._env.get(name, ''), '']
+
+ def _split_fields(self, wtree):
+ def is_empty(split):
+ return split==['', '', '']
+
+ def split_positional(quoted):
+ # Return a list of wtrees split according to positional parameter rules.
+ # All remaining '@' groups are removed.
+ assert quoted[0]=='"'
+
+ splits = [[]]
+ for part in quoted:
+ if not isinstance(part, list) or part[0]!='@':
+ splits[-1].append(part)
+ else:
+ # Empty or single-argument lists were dealt with already
+ assert len(part)>3
+ # First argument must join with the beginning part of the original word
+ splits[-1].append(part[1])
+ # Create double-quotes expressions for every argument after the first
+ for arg in part[2:-1]:
+ splits[-1].append('"')
+ splits.append(['"', arg])
+ return splits
+
+ # At this point, all expansions but pathnames have occurred. Only quoted
+ # and positional sequences remain. Thus, all candidates for field splitting
+ # are in the tree root, or are positional splits ('@') and lie in root
+ # children.
+ if not wtree or wtree[0] not in ('', '"'):
+ # The whole token is quoted or empty, nothing to split
+ return [wtree]
+
+ if wtree[0]=='"':
+ wtree = ['', wtree, '']
+
+ result = [['', '']]
+ for part in wtree[1:-1]:
+ if isinstance(part, list):
+ if part[0]=='"':
+ splits = split_positional(part)
+ if len(splits)<=1:
+ result[-1] += [part, '']
+ else:
+ # Terminate the current split
+ result[-1] += [splits[0], '']
+ result += splits[1:-1]
+ # Create a new split
+ result += [['', splits[-1], '']]
+ else:
+ result[-1] += [part, '']
+ else:
+ splits = self._env.split_fields(part)
+ if len(splits)<=1:
+ # No split
+ result[-1][-1] += part
+ else:
+ # Terminate the current resulting part and create a new one
+ result[-1][-1] += splits[0]
+ result[-1].append('')
+ result += [['', r, ''] for r in splits[1:-1]]
+ result += [['', splits[-1]]]
+ result[-1].append('')
+
+ # Leading and trailing empty groups come from leading/trailing blanks
+ if result and is_empty(result[-1]):
+ result[-1:] = []
+ if result and is_empty(result[0]):
+ result[:1] = []
+ return result
+
+ def _expand_pathname(self, wtree):
+ """See [2.6.6 Pathname Expansion]."""
+ if self._env.has_opt('-f'):
+ return [wtree]
+
+ # All expansions have been performed, only quoted sequences should remain
+ # in the tree. Generate the pattern by folding the tree, escaping special
+ # characters when they appear quoted
+ special_chars = '*?[]'
+
+ def make_pattern(wtree):
+ subpattern = []
+ for part in wtree[1:-1]:
+ if isinstance(part, list):
+ part = make_pattern(part)
+ elif wtree[0]!='':
+ for c in part:
+ # Meta-characters cannot be quoted
+ if c in special_chars:
+ raise GlobError()
+ subpattern.append(part)
+ return ''.join(subpattern)
+
+ def pwd_glob(pattern):
+ cwd = os.getcwd()
+ os.chdir(self._env['PWD'])
+ try:
+ return glob.glob(pattern)
+ finally:
+ os.chdir(cwd)
+
+ #TODO: check working directory issues here wrt relative patterns
+ try:
+ pattern = make_pattern(wtree)
+ paths = pwd_glob(pattern)
+ except GlobError:
+ # BUG: Meta-characters were found in quoted sequences. They should
+ # have been used literally, but this is unsupported by the current glob module.
+ # Instead we consider the whole tree must be used literally and
+ # therefore there is no point in globbing. This is wrong when meta
+ # characters are mixed with quoted meta in the same pattern like:
+ # < foo*"py*" >
+ paths = []
+
+ if not paths:
+ return [wtree]
+ return [['', path, ''] for path in paths]
+
+ def _remove_quotes(self, wtree):
+ """See [2.6.7 Quote Removal]."""
+
+ def unquote(wtree):
+ unquoted = []
+ for part in wtree[1:-1]:
+ if isinstance(part, list):
+ part = unquote(part)
+ unquoted.append(part)
+ return ''.join(unquoted)
+
+ return ['', unquote(wtree), '']
+
+ def subshell(self, script=None, ast=None, redirs=None):
+ """Execute the script or AST in a subshell, with inherited redirections
+ if redirs is not None.
+ """
+ if redirs:
+ sub_redirs = redirs
+ else:
+ # No redirections supplied: inherit a clone of the interpreter's own
+ sub_redirs = self._redirs.clone()
+
+ subshell = None
+ try:
+ subshell = Interpreter(None, self._debugflags, self._env.clone(True),
+ sub_redirs, opts=self._options)
+ return subshell.execute_script(script, ast)
+ finally:
+ if not redirs: sub_redirs.close()
+ if subshell: subshell.close()
+
+ def subshell_output(self, script):
+ """Execute the script in a subshell and return the captured output."""
+ # Create temporary file to capture subshell output
+ tmpfd, tmppath = tempfile.mkstemp()
+ try:
+ tmpfile = os.fdopen(tmpfd, 'wb')
+ stdout = FileWrapper('w', tmpfile)
+
+ redirs = Redirections(self._redirs.stdin().dup(),
+ stdout,
+ self._redirs.stderr().dup())
+ try:
+ status = self.subshell(script=script, redirs=redirs)
+ finally:
+ redirs.close()
+ redirs = None
+
+ # Extract subshell standard output
+ tmpfile = open(tmppath, 'rb')
+ try:
+ output = tmpfile.read()
+ return status, output.rstrip('\n')
+ finally:
+ tmpfile.close()
+ finally:
+ os.remove(tmppath)
+
+ def _asynclist(self, cmd):
+ args = (self._env.get_variables(), cmd)
+ arg = encodeargs(args)
+ assert len(args) < 30*1024
+ cmd = ['pysh.bat', '--ast', '-c', arg]
+ p = subprocess.Popen(cmd, cwd=self._env['PWD'])
+ self._children[p.pid] = p
+ self._env['!'] = p.pid
+ return 0
+
+ def wait(self, pids=None):
+ if not pids:
+ pids = self._children.keys()
+
+ status = 127
+ for pid in pids:
+ if pid not in self._children:
+ continue
+ p = self._children.pop(pid)
+ status = p.wait()
+
+ return status
+
diff --git a/bitbake/lib/bb/pysh/lsprof.py b/bitbake/lib/bb/pysh/lsprof.py
new file mode 100644
index 0000000..b1831c2
--- /dev/null
+++ b/bitbake/lib/bb/pysh/lsprof.py
@@ -0,0 +1,116 @@
+#! /usr/bin/env python
+
+import sys
+from _lsprof import Profiler, profiler_entry
+
+__all__ = ['profile', 'Stats']
+
+def profile(f, *args, **kwds):
+ """XXX docstring"""
+ p = Profiler()
+ p.enable(subcalls=True, builtins=True)
+ try:
+ f(*args, **kwds)
+ finally:
+ p.disable()
+ return Stats(p.getstats())
+
+
+class Stats(object):
+ """XXX docstring"""
+
+ def __init__(self, data):
+ self.data = data
+
+ def sort(self, crit="inlinetime"):
+ """XXX docstring"""
+ if crit not in profiler_entry.__dict__:
+ raise ValueError("Can't sort by %s" % crit)
+ self.data.sort(lambda b, a: cmp(getattr(a, crit),
+ getattr(b, crit)))
+ for e in self.data:
+ if e.calls:
+ e.calls.sort(lambda b, a: cmp(getattr(a, crit),
+ getattr(b, crit)))
+
+ def pprint(self, top=None, file=None, limit=None, climit=None):
+ """XXX docstring"""
+ if file is None:
+ file = sys.stdout
+ d = self.data
+ if top is not None:
+ d = d[:top]
+ cols = "% 12s %12s %11.4f %11.4f %s\n"
+ hcols = "% 12s %12s %12s %12s %s\n"
+ cols2 = "+%12s %12s %11.4f %11.4f + %s\n"
+ file.write(hcols % ("CallCount", "Recursive", "Total(ms)",
+ "Inline(ms)", "module:lineno(function)"))
+ count = 0
+ for e in d:
+ file.write(cols % (e.callcount, e.reccallcount, e.totaltime,
+ e.inlinetime, label(e.code)))
+ count += 1
+ if limit is not None and count == limit:
+ return
+ ccount = 0
+ if e.calls:
+ for se in e.calls:
+ file.write(cols % ("+%s" % se.callcount, se.reccallcount,
+ se.totaltime, se.inlinetime,
+ "+%s" % label(se.code)))
+ count += 1
+ ccount += 1
+ if limit is not None and count == limit:
+ return
+ if climit is not None and ccount == climit:
+ break
+
+ def freeze(self):
+ """Replace all references to code objects with string
+ descriptions; this makes it possible to pickle the instance."""
+
+ # this code is probably rather ickier than it needs to be!
+ for i in range(len(self.data)):
+ e = self.data[i]
+ if not isinstance(e.code, str):
+ self.data[i] = type(e)((label(e.code),) + e[1:])
+ if e.calls:
+ for j in range(len(e.calls)):
+ se = e.calls[j]
+ if not isinstance(se.code, str):
+ e.calls[j] = type(se)((label(se.code),) + se[1:])
+
+_fn2mod = {}
+
+def label(code):
+ if isinstance(code, str):
+ return code
+ try:
+ mname = _fn2mod[code.co_filename]
+ except KeyError:
+ for k, v in sys.modules.items():
+ if v is None:
+ continue
+ if not hasattr(v, '__file__'):
+ continue
+ if not isinstance(v.__file__, str):
+ continue
+ if v.__file__.startswith(code.co_filename):
+ mname = _fn2mod[code.co_filename] = k
+ break
+ else:
+ mname = _fn2mod[code.co_filename] = '<%s>'%code.co_filename
+
+ return '%s:%d(%s)' % (mname, code.co_firstlineno, code.co_name)
+
+
+if __name__ == '__main__':
+ import os
+ sys.argv = sys.argv[1:]
+ if not sys.argv:
+ print >> sys.stderr, "usage: lsprof.py <script> <arguments...>"
+ sys.exit(2)
+ sys.path.insert(0, os.path.abspath(os.path.dirname(sys.argv[0])))
+ stats = profile(execfile, sys.argv[0], globals(), locals())
+ stats.sort()
+ stats.pprint()
diff --git a/bitbake/lib/bb/pysh/pysh.py b/bitbake/lib/bb/pysh/pysh.py
new file mode 100644
index 0000000..b4e6145
--- /dev/null
+++ b/bitbake/lib/bb/pysh/pysh.py
@@ -0,0 +1,167 @@
+# pysh.py - command processing for pysh.
+#
+# Copyright 2007 Patrick Mezard
+#
+# This software may be used and distributed according to the terms
+# of the GNU General Public License, incorporated herein by reference.
+
+import optparse
+import os
+import sys
+
+import interp
+
+SH_OPT = optparse.OptionParser(prog='pysh', usage="%prog [OPTIONS]", version='0.1')
+SH_OPT.add_option('-c', action='store_true', dest='command_string', default=None,
+ help='A string that shall be interpreted by the shell as one or more commands')
+SH_OPT.add_option('--redirect-to', dest='redirect_to', default=None,
+ help='Redirect script commands stdout and stderr to the specified file')
+# See utility_command in builtin.py about the reason for this flag.
+SH_OPT.add_option('--redirected', dest='redirected', action='store_true', default=False,
+ help='Tell the interpreter that stdout and stderr are actually the same objects, which is really stdout')
+SH_OPT.add_option('--debug-parsing', action='store_true', dest='debug_parsing', default=False,
+ help='Trace PLY execution')
+SH_OPT.add_option('--debug-tree', action='store_true', dest='debug_tree', default=False,
+ help='Display the generated syntax tree.')
+SH_OPT.add_option('--debug-cmd', action='store_true', dest='debug_cmd', default=False,
+ help='Trace command execution before parameters expansion and exit status.')
+SH_OPT.add_option('--debug-utility', action='store_true', dest='debug_utility', default=False,
+ help='Trace utility calls, after parameters expansions')
+SH_OPT.add_option('--ast', action='store_true', dest='ast', default=False,
+ help='Encoded commands to execute in a subprocess')
+SH_OPT.add_option('--profile', action='store_true', default=False,
+ help='Profile pysh run')
+
+
+def split_args(args):
+ # Separate shell arguments from command ones
+ # Just stop at the first argument not starting with a dash. This is admittedly broken:
+ # it ignores files starting with a dash and may mistake option values for the command file.
+ # That is not supposed to happen for now.
+ command_index = len(args)
+ for i,arg in enumerate(args):
+ if not arg.startswith('-'):
+ command_index = i
+ break
+
+ return args[:command_index], args[command_index:]
+
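+# Editor-added example (not in the original source):
+#   split_args(['-c', 'echo hi']) returns (['-c'], ['echo hi'])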
+
+def fixenv(env):
+ path = env.get('PATH')
+ if path is not None:
+ parts = path.split(os.pathsep)
+ # Remove Windows utilities from PATH, they are useless at best and
+ # some of them (find) may be confused with other utilities.
+ parts = [p for p in parts if 'system32' not in p.lower()]
+ env['PATH'] = os.pathsep.join(parts)
+ if env.get('HOME') is None:
+ # Several utilities, including cvsps, cannot work without
+ # a defined HOME directory.
+ env['HOME'] = os.path.expanduser('~')
+ return env
+
+def _sh(cwd, shargs, cmdargs, options, debugflags=None, env=None):
+ if os.environ.get('PYSH_TEXT') != '1':
+ import msvcrt
+ for fp in (sys.stdin, sys.stdout, sys.stderr):
+ msvcrt.setmode(fp.fileno(), os.O_BINARY)
+
+ hgbin = os.environ.get('PYSH_HGTEXT') != '1'
+
+ if debugflags is None:
+ debugflags = []
+ if options.debug_parsing: debugflags.append('debug-parsing')
+ if options.debug_utility: debugflags.append('debug-utility')
+ if options.debug_cmd: debugflags.append('debug-cmd')
+ if options.debug_tree: debugflags.append('debug-tree')
+
+ if env is None:
+ env = fixenv(dict(os.environ))
+ if cwd is None:
+ cwd = os.getcwd()
+
+ if not cmdargs:
+ # Nothing to do
+ return 0
+
+ ast = None
+ command_file = None
+ if options.command_string:
+ input = cmdargs[0]
+ if not options.ast:
+ input += '\n'
+ else:
+ args, input = interp.decodeargs(input), None
+ env, ast = args
+ cwd = env.get('PWD', cwd)
+ else:
+ command_file = cmdargs[0]
+ arguments = cmdargs[1:]
+
+ prefix = interp.resolve_shebang(command_file, ignoreshell=True)
+ if prefix:
+ input = ' '.join(prefix + [command_file] + arguments)
+ else:
+ # Read commands from file
+ f = file(command_file)
+ try:
+ # Trailing newline to help the parser
+ input = f.read() + '\n'
+ finally:
+ f.close()
+
+ redirect = None
+ try:
+ if options.redirected:
+ stdout = sys.stdout
+ stderr = stdout
+ elif options.redirect_to:
+ redirect = open(options.redirect_to, 'wb')
+ stdout = redirect
+ stderr = redirect
+ else:
+ stdout = sys.stdout
+ stderr = sys.stderr
+
+ # TODO: set arguments to environment variables
+ opts = interp.Options()
+ opts.hgbinary = hgbin
+ ip = interp.Interpreter(cwd, debugflags, stdout=stdout, stderr=stderr,
+ opts=opts)
+ try:
+ # Export given environment in shell object
+ for k,v in env.iteritems():
+ ip.get_env().export(k,v)
+ return ip.execute_script(input, ast, scriptpath=command_file)
+ finally:
+ ip.close()
+ finally:
+ if redirect is not None:
+ redirect.close()
+
+def sh(cwd=None, args=None, debugflags=None, env=None):
+ if args is None:
+ args = sys.argv[1:]
+ shargs, cmdargs = split_args(args)
+ options, shargs = SH_OPT.parse_args(shargs)
+
+ if options.profile:
+ import lsprof
+ p = lsprof.Profiler()
+ p.enable(subcalls=True)
+ try:
+ return _sh(cwd, shargs, cmdargs, options, debugflags, env)
+ finally:
+ p.disable()
+ stats = lsprof.Stats(p.getstats())
+ stats.sort()
+ stats.pprint(top=10, file=sys.stderr, climit=5)
+ else:
+ return _sh(cwd, shargs, cmdargs, options, debugflags, env)
+
+def main():
+ sys.exit(sh())
+
+if __name__=='__main__':
+ main()
diff --git a/bitbake/lib/bb/pysh/pyshlex.py b/bitbake/lib/bb/pysh/pyshlex.py
new file mode 100644
index 0000000..b301236
--- /dev/null
+++ b/bitbake/lib/bb/pysh/pyshlex.py
@@ -0,0 +1,888 @@
+# pyshlex.py - PLY compatible lexer for pysh.
+#
+# Copyright 2007 Patrick Mezard
+#
+# This software may be used and distributed according to the terms
+# of the GNU General Public License, incorporated herein by reference.
+
+# TODO:
+# - review all "char in 'abc'" snippets: the empty string can be matched
+# - test line continuations within quoted/expansion strings
+# - eof is buggy wrt sublexers
+# - the lexer cannot really work in pull mode as it would be required to run
+# PLY in pull mode. It was designed to work incrementally and it would not be
+# that hard to enable pull mode.
+import re
+try:
+ s = set()
+ del s
+except NameError:
+ from Set import Set as set
+
+from ply import lex
+from sherrors import *
+
+class NeedMore(Exception):
+ pass
+
+def is_blank(c):
+ return c in (' ', '\t')
+
+_RE_DIGITS = re.compile(r'^\d+$')
+
+def are_digits(s):
+ return _RE_DIGITS.search(s) is not None
+
+_OPERATORS = dict([
+ ('&&', 'AND_IF'),
+ ('||', 'OR_IF'),
+ (';;', 'DSEMI'),
+ ('<<', 'DLESS'),
+ ('>>', 'DGREAT'),
+ ('<&', 'LESSAND'),
+ ('>&', 'GREATAND'),
+ ('<>', 'LESSGREAT'),
+ ('<<-', 'DLESSDASH'),
+ ('>|', 'CLOBBER'),
+ ('&', 'AMP'),
+ (';', 'COMMA'),
+ ('<', 'LESS'),
+ ('>', 'GREATER'),
+ ('(', 'LPARENS'),
+ (')', 'RPARENS'),
+])
+
+#Make a function to silence pychecker "Local variable shadows global"
+def make_partial_ops():
+ partials = {}
+ for k in _OPERATORS:
+ for i in range(1, len(k)+1):
+ partials[k[:i]] = None
+ return partials
+
+_PARTIAL_OPERATORS = make_partial_ops()
+
+def is_partial_op(s):
+ """Return True if s matches a non-empty subpart of an operator starting
+ at its first character.
+ """
+ return s in _PARTIAL_OPERATORS
+
+def is_op(s):
+ """If s matches an operator, returns the operator identifier. Return None
+ otherwise.
+ """
+ return _OPERATORS.get(s)
+
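+# Editor-added examples (not in the original source):
+#   is_op('&&') returns 'AND_IF', is_op('&&&') returns None
+#   is_partial_op('<<') is True ('<<' is an operator and a prefix of '<<-')
+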
+_RESERVEDS = dict([
+ ('if', 'If'),
+ ('then', 'Then'),
+ ('else', 'Else'),
+ ('elif', 'Elif'),
+ ('fi', 'Fi'),
+ ('do', 'Do'),
+ ('done', 'Done'),
+ ('case', 'Case'),
+ ('esac', 'Esac'),
+ ('while', 'While'),
+ ('until', 'Until'),
+ ('for', 'For'),
+ ('{', 'Lbrace'),
+ ('}', 'Rbrace'),
+ ('!', 'Bang'),
+ ('in', 'In'),
+ ('|', 'PIPE'),
+])
+
+def get_reserved(s):
+ return _RESERVEDS.get(s)
+
+_RE_NAME = re.compile(r'^[0-9a-zA-Z_]+$')
+
+def is_name(s):
+ return _RE_NAME.search(s) is not None
+
+def find_chars(seq, chars):
+ for i,v in enumerate(seq):
+ if v in chars:
+ return i,v
+ return -1, None
+
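+# Editor-added example (not in the original source):
+#   find_chars('abc$def', '$`') returns (3, '$')
+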
+class WordLexer:
+ """WordLexer parse quoted or expansion expressions and return an expression
+ tree. The input string can be any well formed sequence beginning with quoting
+ or expansion character. Embedded expressions are handled recursively. The
+ resulting tree is made of lists and strings. Lists represent quoted or
+ expansion expressions. Each list first element is the opening separator,
+ the last one the closing separator. In-between can be any number of strings
+ or lists for sub-expressions. Non quoted/expansion expression can written as
+ strings or as lists with empty strings as starting and ending delimiters.
+ """
+
+ NAME_CHARSET = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_'
+ NAME_CHARSET = dict(zip(NAME_CHARSET, NAME_CHARSET))
+
+ SPECIAL_CHARSET = '@*#?-$!0'
+
+ #Characters which can be escaped depends on the current delimiters
+ ESCAPABLE = {
+ '`': set(['$', '\\', '`']),
+ '"': set(['$', '\\', '`', '"']),
+ "'": set(),
+ }
+
+ def __init__(self, heredoc = False):
+ # _buffer is the unprocessed input characters buffer
+ self._buffer = []
+ # _stack is empty or contains a quoted list being processed
+ # (this is the DFS path to the quoted expression being evaluated).
+ self._stack = []
+ self._escapable = None
+ # True when parsing unquoted here documents
+ self._heredoc = heredoc
+
+ def add(self, data, eof=False):
+ """Feed the lexer with more data. If the quoted expression can be
+ delimited, return a tuple (expr, remaining) containing the expression
+ tree and the unconsumed data.
+ Otherwise, raise NeedMore.
+ """
+ self._buffer += list(data)
+ self._parse(eof)
+
+ result = self._stack[0]
+ remaining = ''.join(self._buffer)
+ self._stack = []
+ self._buffer = []
+ return result, remaining
+
+ def _is_escapable(self, c, delim=None):
+ if delim is None:
+ if self._heredoc:
+ # Backslashes work as if they were double-quoted in unquoted
+ # here-documents
+ delim = '"'
+ else:
+ if len(self._stack)<=1:
+ return True
+ delim = self._stack[-2][0]
+
+ escapables = self.ESCAPABLE.get(delim, None)
+ return escapables is None or c in escapables
+
+ def _parse_squote(self, buf, result, eof):
+ if not buf:
+ raise NeedMore()
+ try:
+ pos = buf.index("'")
+ except ValueError:
+ raise NeedMore()
+ result[-1] += ''.join(buf[:pos])
+ result += ["'"]
+ return pos+1, True
+
+ def _parse_bquote(self, buf, result, eof):
+ if not buf:
+ raise NeedMore()
+
+ if buf[0]=='\n':
+ #Remove line continuations
+ result[:] = ['', '', '']
+ elif self._is_escapable(buf[0]):
+ result[-1] += buf[0]
+ result += ['']
+ else:
+ #Keep as such
+ result[:] = ['', '\\'+buf[0], '']
+
+ return 1, True
+
+ def _parse_dquote(self, buf, result, eof):
+ if not buf:
+ raise NeedMore()
+ pos, sep = find_chars(buf, '$\\`"')
+ if pos==-1:
+ raise NeedMore()
+
+ result[-1] += ''.join(buf[:pos])
+ if sep=='"':
+ result += ['"']
+ return pos+1, True
+ else:
+ #Keep everything until the separator and defer processing
+ return pos, False
+
+ def _parse_command(self, buf, result, eof):
+ if not buf:
+ raise NeedMore()
+
+ chars = '$\\`"\''
+ if result[0] == '$(':
+ chars += ')'
+ pos, sep = find_chars(buf, chars)
+ if pos == -1:
+ raise NeedMore()
+
+ result[-1] += ''.join(buf[:pos])
+ if (result[0]=='$(' and sep==')') or (result[0]=='`' and sep=='`'):
+ result += [sep]
+ return pos+1, True
+ else:
+ return pos, False
+
+ def _parse_parameter(self, buf, result, eof):
+ if not buf:
+ raise NeedMore()
+
+ pos, sep = find_chars(buf, '$\\`"\'}')
+ if pos==-1:
+ raise NeedMore()
+
+ result[-1] += ''.join(buf[:pos])
+ if sep=='}':
+ result += [sep]
+ return pos+1, True
+ else:
+ return pos, False
+
+ def _parse_dollar(self, buf, result, eof):
+ sep = result[0]
+ if sep=='$':
+ if not buf:
+ #TODO: handle empty $
+ raise NeedMore()
+ if buf[0]=='(':
+ if len(buf)==1:
+ raise NeedMore()
+
+ if buf[1]=='(':
+ result[0] = '$(('
+ buf[:2] = []
+ else:
+ result[0] = '$('
+ buf[:1] = []
+
+ elif buf[0]=='{':
+ result[0] = '${'
+ buf[:1] = []
+ else:
+ if buf[0] in self.SPECIAL_CHARSET:
+ result[-1] = buf[0]
+ read = 1
+ else:
+ for read,c in enumerate(buf):
+ if c not in self.NAME_CHARSET:
+ break
+ else:
+ if not eof:
+ raise NeedMore()
+ read += 1
+
+ result[-1] += ''.join(buf[0:read])
+
+ if not result[-1]:
+ result[:] = ['', result[0], '']
+ else:
+ result += ['']
+ return read,True
+
+ sep = result[0]
+ if sep=='$(':
+ parsefunc = self._parse_command
+ elif sep=='${':
+ parsefunc = self._parse_parameter
+ else:
+ raise NotImplementedError(sep)
+
+ pos, closed = parsefunc(buf, result, eof)
+ return pos, closed
+
+ def _parse(self, eof):
+ buf = self._buffer
+ stack = self._stack
+ recurse = False
+
+ while 1:
+ if not stack or recurse:
+ if not buf:
+ raise NeedMore()
+ if buf[0] not in ('"\\`$\''):
+ raise ShellSyntaxError('Invalid quoted string sequence')
+ stack.append([buf[0], ''])
+ buf[:1] = []
+ recurse = False
+
+ result = stack[-1]
+ if result[0]=="'":
+ parsefunc = self._parse_squote
+ elif result[0]=='\\':
+ parsefunc = self._parse_bquote
+ elif result[0]=='"':
+ parsefunc = self._parse_dquote
+ elif result[0]=='`':
+ parsefunc = self._parse_command
+ elif result[0][0]=='$':
+ parsefunc = self._parse_dollar
+ else:
+ raise NotImplementedError()
+
+ read, closed = parsefunc(buf, result, eof)
+
+ buf[:read] = []
+ if closed:
+ if len(stack)>1:
+ #Merge in parent expression
+ parsed = stack.pop()
+ stack[-1] += [parsed]
+ stack[-1] += ['']
+ else:
+ break
+ else:
+ recurse = True
+
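+# Usage sketch (editor-added, not in the original source): feeding a complete
+# double-quoted word returns its expression tree plus the unconsumed input:
+#   WordLexer().add('"foo"', eof=True) returns (['"', 'foo', '"'], '')
+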
+def normalize_wordtree(wtree):
+ """Fold back every literal sequence (delimited with empty strings) into
+ parent sequence.
+ """
+ def normalize(wtree):
+ result = []
+ for part in wtree[1:-1]:
+ if isinstance(part, list):
+ part = normalize(part)
+ if part[0]=='':
+ #Move the part content back at current level
+ result += part[1:-1]
+ continue
+ elif not part:
+ #Remove empty strings
+ continue
+ result.append(part)
+ if not result:
+ result = ['']
+ return [wtree[0]] + result + [wtree[-1]]
+
+ return normalize(wtree)
+
+
+def make_wordtree(token, here_document=False):
+ """Parse a delimited token and return a tree similar to the ones returned by
+ WordLexer. token may contain any combinations of expansion/quoted fields and
+ non-ones.
+ """
+ tree = ['']
+ remaining = token
+ delimiters = '\\$`'
+ if not here_document:
+ delimiters += '\'"'
+
+ while 1:
+ pos, sep = find_chars(remaining, delimiters)
+ if pos==-1:
+ tree += [remaining, '']
+ return normalize_wordtree(tree)
+ tree.append(remaining[:pos])
+ remaining = remaining[pos:]
+
+ try:
+ result, remaining = WordLexer(heredoc = here_document).add(remaining, True)
+ except NeedMore:
+        raise ShellSyntaxError('Invalid token "%s"' % remaining)
+ tree.append(result)
+
+
+def wordtree_as_string(wtree):
+ """Rewrite an expression tree generated by make_wordtree as string."""
+ def visit(node, output):
+ for child in node:
+ if isinstance(child, list):
+ visit(child, output)
+ else:
+ output.append(child)
+
+ output = []
+ visit(wtree, output)
+ return ''.join(output)
+
+
+def unquote_wordtree(wtree):
+ """Fold the word tree while removing quotes everywhere. Other expansion
+    sequences are joined as-is.
+ """
+ def unquote(wtree):
+ unquoted = []
+ if wtree[0] in ('', "'", '"', '\\'):
+ wtree = wtree[1:-1]
+
+ for part in wtree:
+ if isinstance(part, list):
+ part = unquote(part)
+ unquoted.append(part)
+ return ''.join(unquoted)
+
+ return unquote(wtree)
+
+
+class HereDocLexer:
+ """HereDocLexer delimits whatever comes from the here-document starting newline
+ not included to the closing delimiter line included.
+ """
+ def __init__(self, op, delim):
+ assert op in ('<<', '<<-')
+ if not delim:
+ raise ShellSyntaxError('invalid here document delimiter %s' % str(delim))
+
+ self._op = op
+ self._delim = delim
+ self._buffer = []
+ self._token = []
+
+ def add(self, data, eof):
+ """If the here-document was delimited, return a tuple (content, remaining).
+ Raise NeedMore() otherwise.
+ """
+ self._buffer += list(data)
+ self._parse(eof)
+ token = ''.join(self._token)
+ remaining = ''.join(self._buffer)
+        self._token, self._buffer = [], []
+ return token, remaining
+
+ def _parse(self, eof):
+ while 1:
+ #Look for first unescaped newline. Quotes may be ignored
+ escaped = False
+ for i,c in enumerate(self._buffer):
+ if escaped:
+ escaped = False
+ elif c=='\\':
+ escaped = True
+ elif c=='\n':
+ break
+ else:
+ i = -1
+
+ if i==-1 or self._buffer[i]!='\n':
+ if not eof:
+ raise NeedMore()
+ #No more data, maybe the last line is closing delimiter
+ line = ''.join(self._buffer)
+ eol = ''
+ self._buffer[:] = []
+ else:
+ line = ''.join(self._buffer[:i])
+ eol = self._buffer[i]
+ self._buffer[:i+1] = []
+
+ if self._op=='<<-':
+ line = line.lstrip('\t')
+
+ if line==self._delim:
+ break
+
+ self._token += [line, eol]
+ if i==-1:
+ break
+
+class Token:
+ #TODO: check this is still in use
+ OPERATOR = 'OPERATOR'
+ WORD = 'WORD'
+
+ def __init__(self):
+ self.value = ''
+ self.type = None
+
+ def __getitem__(self, key):
+        #Behave like a two-element tuple
+ if key==0:
+ return self.type
+ if key==1:
+ return self.value
+ raise IndexError(key)
+
+
+class HereDoc:
+ def __init__(self, op, name=None):
+ self.op = op
+ self.name = name
+ self.pendings = []
+
+TK_COMMA = 'COMMA'
+TK_AMPERSAND = 'AMP'
+TK_OP = 'OP'
+TK_TOKEN = 'TOKEN'
+TK_COMMENT = 'COMMENT'
+TK_NEWLINE = 'NEWLINE'
+TK_IONUMBER = 'IO_NUMBER'
+TK_ASSIGNMENT = 'ASSIGNMENT_WORD'
+TK_HERENAME = 'HERENAME'
+
+class Lexer:
+ """Main lexer.
+
+ Call add() until the script AST is returned.
+ """
+    # Here-document handling makes the whole thing more complex because here-documents
+    # basically force tokens to be reordered: here-content must come right after the operator
+ # and the here-document name, while some other tokens might be following the
+ # here-document expression on the same line.
+ #
+ # So, here-doc states are basically:
+ # *self._state==ST_NORMAL
+ # - self._heredoc.op is None: no here-document
+ # - self._heredoc.op is not None but name is: here-document operator matched,
+ # waiting for the document name/delimiter
+ # - self._heredoc.op and name are not None: here-document is ready, following
+ # tokens are being stored and will be pushed again when the document is
+ # completely parsed.
+ # *self._state==ST_HEREDOC
+ # - The here-document is being delimited by self._herelexer. Once it is done
+ # the content is pushed in front of the pending token list then all these
+ # tokens are pushed once again.
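+    #
+    # For example, with "cat <<EOF >out", the tokens for '>out' and the newline
+    # are buffered in self._heredoc.pendings; once the here-document body has
+    # been delimited it is pushed first, then the buffered tokens are pushed again.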
+ ST_NORMAL = 'ST_NORMAL'
+ ST_OP = 'ST_OP'
+ ST_BACKSLASH = 'ST_BACKSLASH'
+ ST_QUOTED = 'ST_QUOTED'
+ ST_COMMENT = 'ST_COMMENT'
+ ST_HEREDOC = 'ST_HEREDOC'
+
+ #Match end of backquote strings
+ RE_BACKQUOTE_END = re.compile(r'(?<!\\)(`)')
+
+ def __init__(self, parent_state = None):
+ self._input = []
+ self._pos = 0
+
+ self._token = ''
+ self._type = TK_TOKEN
+
+ self._state = self.ST_NORMAL
+ self._parent_state = parent_state
+ self._wordlexer = None
+
+ self._heredoc = HereDoc(None)
+ self._herelexer = None
+
+        ### The following attributes are not used for delimiting tokens and can
+        ### safely be changed after here-document detection (see _push_token)
+
+ # Count the number of tokens following a 'For' reserved word. Needed to
+ # return an 'In' reserved word if it comes in third place.
+ self._for_count = None
+
+ def add(self, data, eof=False):
+ """Feed the lexer with data.
+
+        When eof is set to True, return unconsumed data or raise if the lexer
+ is in the middle of a delimiting operation.
+ Raise NeedMore otherwise.
+ """
+ self._input += list(data)
+ self._parse(eof)
+ self._input[:self._pos] = []
+ return ''.join(self._input)
+
+ def _parse(self, eof):
+ while self._state:
+ if self._pos>=len(self._input):
+ if not eof:
+ raise NeedMore()
+ elif self._state not in (self.ST_OP, self.ST_QUOTED, self.ST_HEREDOC):
+ #Delimit the current token and leave cleanly
+ self._push_token('')
+ break
+ else:
+                #Let the sublexer handle the eof itself
+ pass
+
+ if self._state==self.ST_NORMAL:
+ self._parse_normal()
+ elif self._state==self.ST_COMMENT:
+ self._parse_comment()
+ elif self._state==self.ST_OP:
+ self._parse_op(eof)
+ elif self._state==self.ST_QUOTED:
+ self._parse_quoted(eof)
+ elif self._state==self.ST_HEREDOC:
+ self._parse_heredoc(eof)
+ else:
+ assert False, "Unknown state " + str(self._state)
+
+ if self._heredoc.op is not None:
+ raise ShellSyntaxError('missing here-document delimiter')
+
+ def _parse_normal(self):
+ c = self._input[self._pos]
+ if c=='\n':
+ self._push_token(c)
+ self._token = c
+ self._type = TK_NEWLINE
+ self._push_token('')
+ self._pos += 1
+ elif c in ('\\', '\'', '"', '`', '$'):
+ self._state = self.ST_QUOTED
+ elif is_partial_op(c):
+ self._push_token(c)
+
+ self._type = TK_OP
+ self._token += c
+ self._pos += 1
+ self._state = self.ST_OP
+ elif is_blank(c):
+ self._push_token(c)
+
+ #Discard blanks
+ self._pos += 1
+ elif self._token:
+ self._token += c
+ self._pos += 1
+ elif c=='#':
+ self._state = self.ST_COMMENT
+ self._type = TK_COMMENT
+ self._pos += 1
+ else:
+ self._pos += 1
+ self._token += c
+
+ def _parse_op(self, eof):
+ assert self._token
+
+ while 1:
+ if self._pos>=len(self._input):
+ if not eof:
+ raise NeedMore()
+ c = ''
+ else:
+ c = self._input[self._pos]
+
+ op = self._token + c
+ if c and is_partial_op(op):
+ #Still parsing an operator
+ self._token = op
+ self._pos += 1
+ else:
+ #End of operator
+ self._push_token(c)
+ self._state = self.ST_NORMAL
+ break
+
+ def _parse_comment(self):
+ while 1:
+ if self._pos>=len(self._input):
+ raise NeedMore()
+
+ c = self._input[self._pos]
+ if c=='\n':
+ #End of comment, do not consume the end of line
+ self._state = self.ST_NORMAL
+ break
+ else:
+ self._token += c
+ self._pos += 1
+
+ def _parse_quoted(self, eof):
+ """Precondition: the starting backquote/dollar is still in the input queue."""
+ if not self._wordlexer:
+ self._wordlexer = WordLexer()
+
+ if self._pos<len(self._input):
+            #Transfer input queue characters into the subparser
+ input = self._input[self._pos:]
+ self._pos += len(input)
+
+ wtree, remaining = self._wordlexer.add(input, eof)
+ self._wordlexer = None
+ self._token += wordtree_as_string(wtree)
+
+ #Put unparsed character back in the input queue
+ if remaining:
+ self._input[self._pos:self._pos] = list(remaining)
+ self._state = self.ST_NORMAL
+
+ def _parse_heredoc(self, eof):
+ assert not self._token
+
+ if self._herelexer is None:
+ self._herelexer = HereDocLexer(self._heredoc.op, self._heredoc.name)
+
+ if self._pos<len(self._input):
+            #Transfer input queue characters into the subparser
+ input = self._input[self._pos:]
+ self._pos += len(input)
+
+ self._token, remaining = self._herelexer.add(input, eof)
+
+ #Reset here-document state
+ self._herelexer = None
+ heredoc, self._heredoc = self._heredoc, HereDoc(None)
+ if remaining:
+ self._input[self._pos:self._pos] = list(remaining)
+ self._state = self.ST_NORMAL
+
+ #Push pending tokens
+ heredoc.pendings[:0] = [(self._token, self._type, heredoc.name)]
+ for token, type, delim in heredoc.pendings:
+ self._token = token
+ self._type = type
+ self._push_token(delim)
+
+ def _push_token(self, delim):
+ if not self._token:
+ return 0
+
+ if self._heredoc.op is not None:
+ if self._heredoc.name is None:
+ #Here-document name
+ if self._type!=TK_TOKEN:
+ raise ShellSyntaxError("expecting here-document name, got '%s'" % self._token)
+ self._heredoc.name = unquote_wordtree(make_wordtree(self._token))
+ self._type = TK_HERENAME
+ else:
+ #Capture all tokens until the newline starting the here-document
+ if self._type==TK_NEWLINE:
+ assert self._state==self.ST_NORMAL
+ self._state = self.ST_HEREDOC
+
+ self._heredoc.pendings.append((self._token, self._type, delim))
+ self._token = ''
+ self._type = TK_TOKEN
+ return 1
+
+ # BEWARE: do not change parser state from here to the end of the function:
+        # when parsing between a here-document operator and the end of the line,
+        # tokens are stored in self._heredoc.pendings. Therefore, they will not
+ # reach the section below.
+
+ #Check operators
+ if self._type==TK_OP:
+ #False positive because of partial op matching
+ op = is_op(self._token)
+ if not op:
+ self._type = TK_TOKEN
+ else:
+ #Map to the specific operator
+ self._type = op
+ if self._token in ('<<', '<<-'):
+ #Done here rather than in _parse_op because there is no need
+ #to change the parser state since we are still waiting for
+ #the here-document name
+ if self._heredoc.op is not None:
+ raise ShellSyntaxError("syntax error near token '%s'" % self._token)
+ assert self._heredoc.op is None
+ self._heredoc.op = self._token
+
+ if self._type==TK_TOKEN:
+ if '=' in self._token and not delim:
+ if self._token.startswith('='):
+ #Token is a WORD... a TOKEN that is.
+ pass
+ else:
+ prev = self._token[:self._token.find('=')]
+ if is_name(prev):
+ self._type = TK_ASSIGNMENT
+ else:
+ #Just a token (unspecified)
+ pass
+ else:
+ reserved = get_reserved(self._token)
+ if reserved is not None:
+ if reserved=='In' and self._for_count!=2:
+ #Sorry, not a reserved word after all
+ pass
+ else:
+ self._type = reserved
+ if reserved in ('For', 'Case'):
+ self._for_count = 0
+ elif are_digits(self._token) and delim in ('<', '>'):
+ #Detect IO_NUMBER
+ self._type = TK_IONUMBER
+ elif self._token==';':
+ self._type = TK_COMMA
+ elif self._token=='&':
+ self._type = TK_AMPERSAND
+ elif self._type==TK_COMMENT:
+ #Comments are not part of sh grammar, ignore them
+ self._token = ''
+ self._type = TK_TOKEN
+ return 0
+
+ if self._for_count is not None:
+ #Track token count in 'For' expression to detect 'In' reserved words.
+ #Can only be in third position, no need to go beyond
+ self._for_count += 1
+ if self._for_count==3:
+ self._for_count = None
+
+ self.on_token((self._token, self._type))
+ self._token = ''
+ self._type = TK_TOKEN
+ return 1
+
+ def on_token(self, token):
+ raise NotImplementedError
+
+
+tokens = [
+ TK_TOKEN,
+# To silence yacc unused token warnings
+# TK_COMMENT,
+ TK_NEWLINE,
+ TK_IONUMBER,
+ TK_ASSIGNMENT,
+ TK_HERENAME,
+]
+
+#Add specific operators
+tokens += _OPERATORS.values()
+#Add reserved words
+tokens += _RESERVEDS.values()
+
+class PLYLexer(Lexer):
+ """Bridge Lexer and PLY lexer interface."""
+ def __init__(self):
+ Lexer.__init__(self)
+ self._tokens = []
+ self._current = 0
+ self.lineno = 0
+
+ def on_token(self, token):
+ value, type = token
+
+ self.lineno = 0
+ t = lex.LexToken()
+ t.value = value
+ t.type = type
+ t.lexer = self
+ t.lexpos = 0
+ t.lineno = 0
+
+ self._tokens.append(t)
+
+ def is_empty(self):
+ return not bool(self._tokens)
+
+ #PLY compliant interface
+ def token(self):
+ if self._current>=len(self._tokens):
+ return None
+ t = self._tokens[self._current]
+ self._current += 1
+ return t
+
+
+def get_tokens(s):
+ """Parse the input string and return a tuple (tokens, unprocessed) where
+ tokens is a list of parsed tokens and unprocessed is the part of the input
+ string left untouched by the lexer.
+ """
+ lexer = PLYLexer()
+ untouched = lexer.add(s, True)
+ tokens = []
+ while 1:
+ token = lexer.token()
+ if token is None:
+ break
+ tokens.append(token)
+
+ tokens = [(t.value, t.type) for t in tokens]
+ return tokens, untouched
diff --git a/bitbake/lib/bb/pysh/pyshyacc.py b/bitbake/lib/bb/pysh/pyshyacc.py
new file mode 100644
index 0000000..e8e80aa
--- /dev/null
+++ b/bitbake/lib/bb/pysh/pyshyacc.py
@@ -0,0 +1,779 @@
+# pyshyacc.py - PLY grammar definition for pysh
+#
+# Copyright 2007 Patrick Mezard
+#
+# This software may be used and distributed according to the terms
+# of the GNU General Public License, incorporated herein by reference.
+
+"""PLY grammar file.
+"""
+import os.path
+import sys
+
+import pyshlex
+tokens = pyshlex.tokens
+
+from ply import yacc
+import sherrors
+
+class IORedirect:
+ def __init__(self, op, filename, io_number=None):
+ self.op = op
+ self.filename = filename
+ self.io_number = io_number
+
+class HereDocument:
+ def __init__(self, op, name, content, io_number=None):
+ self.op = op
+ self.name = name
+ self.content = content
+ self.io_number = io_number
+
+def make_io_redirect(p):
+ """Make an IORedirect instance from the input 'io_redirect' production."""
+ name, io_number, io_target = p
+ assert name=='io_redirect'
+
+ if io_target[0]=='io_file':
+ io_type, io_op, io_file = io_target
+ return IORedirect(io_op, io_file, io_number)
+ elif io_target[0]=='io_here':
+ io_type, io_op, io_name, io_content = io_target
+ return HereDocument(io_op, io_name, io_content, io_number)
+ else:
+ assert False, "Invalid IO redirection token %s" % repr(io_type)
+
+class SimpleCommand:
+ """
+ assigns contains (name, value) pairs.
+ """
+ def __init__(self, words, redirs, assigns):
+ self.words = list(words)
+ self.redirs = list(redirs)
+ self.assigns = list(assigns)
+
+class Pipeline:
+ def __init__(self, commands, reverse_status=False):
+ self.commands = list(commands)
+ assert self.commands #Grammar forbids this
+ self.reverse_status = reverse_status
+
+class AndOr:
+ def __init__(self, op, left, right):
+ self.op = str(op)
+ self.left = left
+ self.right = right
+
+class ForLoop:
+ def __init__(self, name, items, cmds):
+ self.name = str(name)
+ self.items = list(items)
+ self.cmds = list(cmds)
+
+class WhileLoop:
+ def __init__(self, condition, cmds):
+ self.condition = list(condition)
+ self.cmds = list(cmds)
+
+class UntilLoop:
+ def __init__(self, condition, cmds):
+ self.condition = list(condition)
+ self.cmds = list(cmds)
+
+class FunDef:
+ def __init__(self, name, body):
+ self.name = str(name)
+ self.body = body
+
+class BraceGroup:
+ def __init__(self, cmds):
+ self.cmds = list(cmds)
+
+class IfCond:
+ def __init__(self, cond, if_cmds, else_cmds):
+ self.cond = list(cond)
+ self.if_cmds = if_cmds
+ self.else_cmds = else_cmds
+
+class Case:
+ def __init__(self, name, items):
+ self.name = name
+ self.items = items
+
+class SubShell:
+ def __init__(self, cmds):
+ self.cmds = cmds
+
+class RedirectList:
+ def __init__(self, cmd, redirs):
+ self.cmd = cmd
+ self.redirs = list(redirs)
+
+def get_production(productions, ptype):
+ """productions must be a list of production tuples like (name, obj) where
+ name is the production string identifier.
+    Return the first production named 'ptype'. Raise KeyError if none can be
+ found.
+ """
+ for production in productions:
+ if production is not None and production[0]==ptype:
+ return production
+ raise KeyError(ptype)
+
+#-------------------------------------------------------------------------------
+# PLY grammar definition
+#-------------------------------------------------------------------------------
+
+def p_multiple_commands(p):
+ """multiple_commands : newline_sequence
+ | complete_command
+ | multiple_commands complete_command"""
+ if len(p)==2:
+ if p[1] is not None:
+ p[0] = [p[1]]
+ else:
+ p[0] = []
+ else:
+ p[0] = p[1] + [p[2]]
+
+def p_complete_command(p):
+ """complete_command : list separator
+ | list"""
+ if len(p)==3 and p[2] and p[2][1] == '&':
+ p[0] = ('async', p[1])
+ else:
+ p[0] = p[1]
+
+def p_list(p):
+ """list : list separator_op and_or
+ | and_or"""
+ if len(p)==2:
+ p[0] = [p[1]]
+ else:
+ #if p[2]!=';':
+ # raise NotImplementedError('AND-OR list asynchronous execution is not implemented')
+ p[0] = p[1] + [p[3]]
+
+def p_and_or(p):
+ """and_or : pipeline
+ | and_or AND_IF linebreak pipeline
+ | and_or OR_IF linebreak pipeline"""
+ if len(p)==2:
+ p[0] = p[1]
+ else:
+ p[0] = ('and_or', AndOr(p[2], p[1], p[4]))
+
+def p_maybe_bang_word(p):
+ """maybe_bang_word : Bang"""
+ p[0] = ('maybe_bang_word', p[1])
+
+def p_pipeline(p):
+ """pipeline : pipe_sequence
+ | bang_word pipe_sequence"""
+ if len(p)==3:
+ p[0] = ('pipeline', Pipeline(p[2][1:], True))
+ else:
+ p[0] = ('pipeline', Pipeline(p[1][1:]))
+
+def p_pipe_sequence(p):
+ """pipe_sequence : command
+ | pipe_sequence PIPE linebreak command"""
+ if len(p)==2:
+ p[0] = ['pipe_sequence', p[1]]
+ else:
+ p[0] = p[1] + [p[4]]
+
+def p_command(p):
+ """command : simple_command
+ | compound_command
+ | compound_command redirect_list
+ | function_definition"""
+
+ if p[1][0] in ( 'simple_command',
+ 'for_clause',
+ 'while_clause',
+ 'until_clause',
+ 'case_clause',
+ 'if_clause',
+ 'function_definition',
+ 'subshell',
+ 'brace_group',):
+ if len(p) == 2:
+ p[0] = p[1]
+ else:
+ p[0] = ('redirect_list', RedirectList(p[1], p[2][1:]))
+ else:
+ raise NotImplementedError('%s command is not implemented' % repr(p[1][0]))
+
+def p_compound_command(p):
+ """compound_command : brace_group
+ | subshell
+ | for_clause
+ | case_clause
+ | if_clause
+ | while_clause
+ | until_clause"""
+ p[0] = p[1]
+
+def p_subshell(p):
+ """subshell : LPARENS compound_list RPARENS"""
+ p[0] = ('subshell', SubShell(p[2][1:]))
+
+def p_compound_list(p):
+ """compound_list : term
+ | newline_list term
+ | term separator
+ | newline_list term separator"""
+ productions = p[1:]
+ try:
+ sep = get_production(productions, 'separator')
+ if sep[1]!=';':
+ raise NotImplementedError()
+ except KeyError:
+ pass
+ term = get_production(productions, 'term')
+ p[0] = ['compound_list'] + term[1:]
+
+def p_term(p):
+ """term : term separator and_or
+ | and_or"""
+ if len(p)==2:
+ p[0] = ['term', p[1]]
+ else:
+ if p[2] is not None and p[2][1] == '&':
+ p[0] = ['term', ('async', p[1][1:])] + [p[3]]
+ else:
+ p[0] = p[1] + [p[3]]
+
+def p_maybe_for_word(p):
+ # Rearrange 'For' priority wrt TOKEN. See p_for_word
+ """maybe_for_word : For"""
+ p[0] = ('maybe_for_word', p[1])
+
+def p_for_clause(p):
+ """for_clause : for_word name linebreak do_group
+ | for_word name linebreak in sequential_sep do_group
+ | for_word name linebreak in wordlist sequential_sep do_group"""
+ productions = p[1:]
+ do_group = get_production(productions, 'do_group')
+ try:
+ items = get_production(productions, 'in')[1:]
+ except KeyError:
+ raise NotImplementedError('"in" omission is not implemented')
+
+ try:
+ items = get_production(productions, 'wordlist')[1:]
+ except KeyError:
+ items = []
+
+ name = p[2]
+ p[0] = ('for_clause', ForLoop(name, items, do_group[1:]))
+
+def p_name(p):
+ """name : token""" #Was NAME instead of token
+ p[0] = p[1]
+
+def p_in(p):
+ """in : In"""
+ p[0] = ('in', p[1])
+
+def p_wordlist(p):
+ """wordlist : wordlist token
+ | token"""
+ if len(p)==2:
+ p[0] = ['wordlist', ('TOKEN', p[1])]
+ else:
+ p[0] = p[1] + [('TOKEN', p[2])]
+
+def p_case_clause(p):
+ """case_clause : Case token linebreak in linebreak case_list Esac
+ | Case token linebreak in linebreak case_list_ns Esac
+ | Case token linebreak in linebreak Esac"""
+ if len(p) < 8:
+ items = []
+ else:
+ items = p[6][1:]
+ name = p[2]
+ p[0] = ('case_clause', Case(name, [c[1] for c in items]))
+
+def p_case_list_ns(p):
+ """case_list_ns : case_list case_item_ns
+ | case_item_ns"""
+ p_case_list(p)
+
+def p_case_list(p):
+ """case_list : case_list case_item
+ | case_item"""
+ if len(p)==2:
+ p[0] = ['case_list', p[1]]
+ else:
+ p[0] = p[1] + [p[2]]
+
+def p_case_item_ns(p):
+ """case_item_ns : pattern RPARENS linebreak
+ | pattern RPARENS compound_list linebreak
+ | LPARENS pattern RPARENS linebreak
+ | LPARENS pattern RPARENS compound_list linebreak"""
+ p_case_item(p)
+
+def p_case_item(p):
+ """case_item : pattern RPARENS linebreak DSEMI linebreak
+ | pattern RPARENS compound_list DSEMI linebreak
+ | LPARENS pattern RPARENS linebreak DSEMI linebreak
+ | LPARENS pattern RPARENS compound_list DSEMI linebreak"""
+ if len(p) < 7:
+ name = p[1][1:]
+ else:
+ name = p[2][1:]
+
+ try:
+ cmds = get_production(p[1:], "compound_list")[1:]
+ except KeyError:
+ cmds = []
+
+ p[0] = ('case_item', (name, cmds))
+
+def p_pattern(p):
+ """pattern : token
+ | pattern PIPE token"""
+ if len(p)==2:
+ p[0] = ['pattern', ('TOKEN', p[1])]
+ else:
+ p[0] = p[1] + [('TOKEN', p[2])]
+
+def p_maybe_if_word(p):
+ # Rearrange 'If' priority wrt TOKEN. See p_if_word
+ """maybe_if_word : If"""
+ p[0] = ('maybe_if_word', p[1])
+
+def p_maybe_then_word(p):
+ # Rearrange 'Then' priority wrt TOKEN. See p_then_word
+ """maybe_then_word : Then"""
+ p[0] = ('maybe_then_word', p[1])
+
+def p_if_clause(p):
+ """if_clause : if_word compound_list then_word compound_list else_part Fi
+ | if_word compound_list then_word compound_list Fi"""
+ else_part = []
+ if len(p)==7:
+ else_part = p[5]
+ p[0] = ('if_clause', IfCond(p[2][1:], p[4][1:], else_part))
+
+def p_else_part(p):
+ """else_part : Elif compound_list then_word compound_list else_part
+ | Elif compound_list then_word compound_list
+ | Else compound_list"""
+ if len(p)==3:
+ p[0] = p[2][1:]
+ else:
+ else_part = []
+ if len(p)==6:
+ else_part = p[5]
+ p[0] = ('elif', IfCond(p[2][1:], p[4][1:], else_part))
+
+def p_while_clause(p):
+ """while_clause : While compound_list do_group"""
+ p[0] = ('while_clause', WhileLoop(p[2][1:], p[3][1:]))
+
+def p_maybe_until_word(p):
+ # Rearrange 'Until' priority wrt TOKEN. See p_until_word
+ """maybe_until_word : Until"""
+ p[0] = ('maybe_until_word', p[1])
+
+def p_until_clause(p):
+ """until_clause : until_word compound_list do_group"""
+ p[0] = ('until_clause', UntilLoop(p[2][1:], p[3][1:]))
+
+def p_function_definition(p):
+ """function_definition : fname LPARENS RPARENS linebreak function_body"""
+ p[0] = ('function_definition', FunDef(p[1], p[5]))
+
+def p_function_body(p):
+ """function_body : compound_command
+ | compound_command redirect_list"""
+ if len(p)!=2:
+        raise NotImplementedError('function redirection lists are not implemented')
+ p[0] = p[1]
+
+def p_fname(p):
+ """fname : TOKEN""" #Was NAME instead of token
+ p[0] = p[1]
+
+def p_brace_group(p):
+ """brace_group : Lbrace compound_list Rbrace"""
+ p[0] = ('brace_group', BraceGroup(p[2][1:]))
+
+def p_maybe_done_word(p):
+ #See p_assignment_word for details.
+ """maybe_done_word : Done"""
+ p[0] = ('maybe_done_word', p[1])
+
+def p_maybe_do_word(p):
+ """maybe_do_word : Do"""
+ p[0] = ('maybe_do_word', p[1])
+
+def p_do_group(p):
+ """do_group : do_word compound_list done_word"""
+ #Do group contains a list of AndOr
+ p[0] = ['do_group'] + p[2][1:]
+
+def p_simple_command(p):
+ """simple_command : cmd_prefix cmd_word cmd_suffix
+ | cmd_prefix cmd_word
+ | cmd_prefix
+ | cmd_name cmd_suffix
+ | cmd_name"""
+ words, redirs, assigns = [], [], []
+ for e in p[1:]:
+ name = e[0]
+ if name in ('cmd_prefix', 'cmd_suffix'):
+ for sube in e[1:]:
+ subname = sube[0]
+ if subname=='io_redirect':
+ redirs.append(make_io_redirect(sube))
+ elif subname=='ASSIGNMENT_WORD':
+ assigns.append(sube)
+ else:
+ words.append(sube)
+ elif name in ('cmd_word', 'cmd_name'):
+ words.append(e)
+
+ cmd = SimpleCommand(words, redirs, assigns)
+ p[0] = ('simple_command', cmd)
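+    # e.g. a prefix assignment such as FOO=bar ends up in assigns as
+    # ('ASSIGNMENT_WORD', ['FOO', 'bar']), io redirections end up in redirs as
+    # IORedirect/HereDocument instances, and everything else goes into words.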
+
+def p_cmd_name(p):
+ """cmd_name : TOKEN"""
+ p[0] = ('cmd_name', p[1])
+
+def p_cmd_word(p):
+ """cmd_word : token"""
+ p[0] = ('cmd_word', p[1])
+
+def p_maybe_assignment_word(p):
+ #See p_assignment_word for details.
+ """maybe_assignment_word : ASSIGNMENT_WORD"""
+ p[0] = ('maybe_assignment_word', p[1])
+
+def p_cmd_prefix(p):
+ """cmd_prefix : io_redirect
+ | cmd_prefix io_redirect
+ | assignment_word
+ | cmd_prefix assignment_word"""
+ try:
+ prefix = get_production(p[1:], 'cmd_prefix')
+ except KeyError:
+ prefix = ['cmd_prefix']
+
+ try:
+ value = get_production(p[1:], 'assignment_word')[1]
+ value = ('ASSIGNMENT_WORD', value.split('=', 1))
+ except KeyError:
+ value = get_production(p[1:], 'io_redirect')
+ p[0] = prefix + [value]
+
+def p_cmd_suffix(p):
+ """cmd_suffix : io_redirect
+ | cmd_suffix io_redirect
+ | token
+ | cmd_suffix token
+ | maybe_for_word
+ | cmd_suffix maybe_for_word
+ | maybe_done_word
+ | cmd_suffix maybe_done_word
+ | maybe_do_word
+ | cmd_suffix maybe_do_word
+ | maybe_until_word
+ | cmd_suffix maybe_until_word
+ | maybe_assignment_word
+ | cmd_suffix maybe_assignment_word
+ | maybe_if_word
+ | cmd_suffix maybe_if_word
+ | maybe_then_word
+ | cmd_suffix maybe_then_word
+ | maybe_bang_word
+ | cmd_suffix maybe_bang_word"""
+ try:
+ suffix = get_production(p[1:], 'cmd_suffix')
+ token = p[2]
+ except KeyError:
+ suffix = ['cmd_suffix']
+ token = p[1]
+
+ if isinstance(token, tuple):
+ if token[0]=='io_redirect':
+ p[0] = suffix + [token]
+ else:
+ #Convert maybe_* to TOKEN if necessary
+ p[0] = suffix + [('TOKEN', token[1])]
+ else:
+ p[0] = suffix + [('TOKEN', token)]
+
+def p_redirect_list(p):
+ """redirect_list : io_redirect
+ | redirect_list io_redirect"""
+ if len(p) == 2:
+ p[0] = ['redirect_list', make_io_redirect(p[1])]
+ else:
+ p[0] = p[1] + [make_io_redirect(p[2])]
+
+def p_io_redirect(p):
+ """io_redirect : io_file
+ | IO_NUMBER io_file
+ | io_here
+ | IO_NUMBER io_here"""
+ if len(p)==3:
+ p[0] = ('io_redirect', p[1], p[2])
+ else:
+ p[0] = ('io_redirect', None, p[1])
+
+def p_io_file(p):
+ #Return the tuple (operator, filename)
+ """io_file : LESS filename
+ | LESSAND filename
+ | GREATER filename
+ | GREATAND filename
+ | DGREAT filename
+ | LESSGREAT filename
+ | CLOBBER filename"""
+    #Extract the filename string from the 'filename' production
+ p[0] = ('io_file', p[1], p[2][1])
+
+def p_filename(p):
+ #Return the filename
+ """filename : TOKEN"""
+ p[0] = ('filename', p[1])
+
+def p_io_here(p):
+ """io_here : DLESS here_end
+ | DLESSDASH here_end"""
+ p[0] = ('io_here', p[1], p[2][1], p[2][2])
+
+def p_here_end(p):
+ """here_end : HERENAME TOKEN"""
+ p[0] = ('here_document', p[1], p[2])
+
+def p_newline_sequence(p):
+ # Nothing in the grammar can handle leading NEWLINE productions, so add
+    # this one with the lowest possible priority relative to newline_list.
+ """newline_sequence : newline_list"""
+ p[0] = None
+
+def p_newline_list(p):
+ """newline_list : NEWLINE
+ | newline_list NEWLINE"""
+ p[0] = None
+
+def p_linebreak(p):
+ """linebreak : newline_list
+ | empty"""
+ p[0] = None
+
+def p_separator_op(p):
+ """separator_op : COMMA
+ | AMP"""
+ p[0] = p[1]
+
+def p_separator(p):
+ """separator : separator_op linebreak
+ | newline_list"""
+ if len(p)==2:
+ #Ignore newlines
+ p[0] = None
+ else:
+ #Keep the separator operator
+ p[0] = ('separator', p[1])
+
+def p_sequential_sep(p):
+ """sequential_sep : COMMA linebreak
+ | newline_list"""
+ p[0] = None
+
+# Low priority TOKEN => for_word conversion.
+# Let maybe_for_word be used as a token when necessary in higher priority
+# rules.
+def p_for_word(p):
+ """for_word : maybe_for_word"""
+ p[0] = p[1]
+
+def p_if_word(p):
+ """if_word : maybe_if_word"""
+ p[0] = p[1]
+
+def p_then_word(p):
+ """then_word : maybe_then_word"""
+ p[0] = p[1]
+
+def p_done_word(p):
+ """done_word : maybe_done_word"""
+ p[0] = p[1]
+
+def p_do_word(p):
+ """do_word : maybe_do_word"""
+ p[0] = p[1]
+
+def p_until_word(p):
+ """until_word : maybe_until_word"""
+ p[0] = p[1]
+
+def p_assignment_word(p):
+ """assignment_word : maybe_assignment_word"""
+ p[0] = ('assignment_word', p[1][1])
+
+def p_bang_word(p):
+ """bang_word : maybe_bang_word"""
+ p[0] = ('bang_word', p[1][1])
+
+def p_token(p):
+ """token : TOKEN
+ | Fi"""
+ p[0] = p[1]
+
+def p_empty(p):
+ 'empty :'
+ p[0] = None
+
+# Error rule for syntax errors
+def p_error(p):
+ msg = []
+ w = msg.append
+ w('%r\n' % p)
+ w('followed by:\n')
+ for i in range(5):
+ n = yacc.token()
+ if not n:
+ break
+ w(' %r\n' % n)
+ raise sherrors.ShellSyntaxError(''.join(msg))
+
+# Build the parser
+try:
+ import pyshtables
+except ImportError:
+ outputdir = os.path.dirname(__file__)
+ if not os.access(outputdir, os.W_OK):
+ outputdir = ''
+ yacc.yacc(tabmodule = 'pyshtables', outputdir = outputdir, debug = 0)
+else:
+ yacc.yacc(tabmodule = 'pysh.pyshtables', write_tables = 0, debug = 0)
+
+
+def parse(input, eof=False, debug=False):
+ """Parse a whole script at once and return the generated AST and unconsumed
+ data in a tuple.
+
+ NOTE: eof is probably meaningless for now, the parser being unable to work
+ in pull mode. It should be set to True.
+ """
+ lexer = pyshlex.PLYLexer()
+ remaining = lexer.add(input, eof)
+ if lexer.is_empty():
+ return [], remaining
+ if debug:
+ debug = 2
+ return yacc.parse(lexer=lexer, debug=debug), remaining
+
+#-------------------------------------------------------------------------------
+# AST rendering helpers
+#-------------------------------------------------------------------------------
+
+def format_commands(v):
+ """Return a tree made of strings and lists. Make command trees easier to
+ display.
+ """
+ if isinstance(v, list):
+ return [format_commands(c) for c in v]
+ if isinstance(v, tuple):
+ if len(v)==2 and isinstance(v[0], str) and not isinstance(v[1], str):
+ if v[0] == 'async':
+ return ['AsyncList', map(format_commands, v[1])]
+ else:
+ #Avoid decomposing tuples like ('pipeline', Pipeline(...))
+ return format_commands(v[1])
+ return format_commands(list(v))
+ elif isinstance(v, IfCond):
+ name = ['IfCond']
+ name += ['if', map(format_commands, v.cond)]
+ name += ['then', map(format_commands, v.if_cmds)]
+ name += ['else', map(format_commands, v.else_cmds)]
+ return name
+ elif isinstance(v, ForLoop):
+ name = ['ForLoop']
+ name += [repr(v.name)+' in ', map(str, v.items)]
+ name += ['commands', map(format_commands, v.cmds)]
+ return name
+ elif isinstance(v, AndOr):
+ return [v.op, format_commands(v.left), format_commands(v.right)]
+ elif isinstance(v, Pipeline):
+ name = 'Pipeline'
+ if v.reverse_status:
+ name = '!' + name
+ return [name, format_commands(v.commands)]
+ elif isinstance(v, Case):
+ name = ['Case']
+        name += [v.name, format_commands(v.items)]
+        return name
+ elif isinstance(v, SimpleCommand):
+ name = ['SimpleCommand']
+ if v.words:
+ name += ['words', map(str, v.words)]
+ if v.assigns:
+ assigns = [tuple(a[1]) for a in v.assigns]
+ name += ['assigns', map(str, assigns)]
+ if v.redirs:
+ name += ['redirs', map(format_commands, v.redirs)]
+ return name
+ elif isinstance(v, RedirectList):
+ name = ['RedirectList']
+ if v.redirs:
+ name += ['redirs', map(format_commands, v.redirs)]
+ name += ['command', format_commands(v.cmd)]
+ return name
+ elif isinstance(v, IORedirect):
+ return ' '.join(map(str, (v.io_number, v.op, v.filename)))
+ elif isinstance(v, HereDocument):
+ return ' '.join(map(str, (v.io_number, v.op, repr(v.name), repr(v.content))))
+ elif isinstance(v, SubShell):
+ return ['SubShell', map(format_commands, v.cmds)]
+ else:
+ return repr(v)
+
+def print_commands(cmds, output=sys.stdout):
+ """Pretty print a command tree."""
+ def print_tree(cmd, spaces, output):
+ if isinstance(cmd, list):
+ for c in cmd:
+ print_tree(c, spaces + 3, output)
+ else:
+ print >>output, ' '*spaces + str(cmd)
+
+ formatted = format_commands(cmds)
+ print_tree(formatted, 0, output)
+
+
+def stringify_commands(cmds):
+ """Serialize a command tree as a string.
+
+ Returned string is not pretty and is currently used for unit tests only.
+ """
+ def stringify(value):
+ output = []
+ if isinstance(value, list):
+ formatted = []
+ for v in value:
+ formatted.append(stringify(v))
+ formatted = ' '.join(formatted)
+ output.append(''.join(['<', formatted, '>']))
+ else:
+ output.append(value)
+ return ' '.join(output)
+
+ return stringify(format_commands(cmds))
+
+
+def visit_commands(cmds, callable):
+ """Visit the command tree and execute callable on every Pipeline and
+    SimpleCommand instance.
+ """
+ if isinstance(cmds, (tuple, list)):
+ map(lambda c: visit_commands(c,callable), cmds)
+ elif isinstance(cmds, (Pipeline, SimpleCommand)):
+ callable(cmds)
diff --git a/bitbake/lib/bb/pysh/sherrors.py b/bitbake/lib/bb/pysh/sherrors.py
new file mode 100644
index 0000000..49d0533
--- /dev/null
+++ b/bitbake/lib/bb/pysh/sherrors.py
@@ -0,0 +1,41 @@
+# sherrors.py - shell errors and signals
+#
+# Copyright 2007 Patrick Mezard
+#
+# This software may be used and distributed according to the terms
+# of the GNU General Public License, incorporated herein by reference.
+
+"""Define shell exceptions and error codes.
+"""
+
+class ShellError(Exception):
+ pass
+
+class ShellSyntaxError(ShellError):
+ pass
+
+class UtilityError(ShellError):
+ """Raised upon utility syntax error (option or operand error)."""
+ pass
+
+class ExpansionError(ShellError):
+ pass
+
+class CommandNotFound(ShellError):
+ """Specified command was not found."""
+ pass
+
+class RedirectionError(ShellError):
+ pass
+
+class VarAssignmentError(ShellError):
+ """Variable assignment error."""
+ pass
+
+class ExitSignal(ShellError):
+ """Exit signal."""
+ pass
+
+class ReturnSignal(ShellError):
+ """Exit signal."""
+ pass
diff --git a/bitbake/lib/bb/pysh/subprocess_fix.py b/bitbake/lib/bb/pysh/subprocess_fix.py
new file mode 100644
index 0000000..46eca22
--- /dev/null
+++ b/bitbake/lib/bb/pysh/subprocess_fix.py
@@ -0,0 +1,77 @@
+# subprocess - Subprocesses with accessible I/O streams
+#
+# For more information about this module, see PEP 324.
+#
+# This module should remain compatible with Python 2.2, see PEP 291.
+#
+# Copyright (c) 2003-2005 by Peter Astrand <astrand@lysator.liu.se>
+#
+# Licensed to PSF under a Contributor Agreement.
+# See http://www.python.org/2.4/license for licensing details.
+
+def list2cmdline(seq):
+ """
+ Translate a sequence of arguments into a command line
+ string, using the same rules as the MS C runtime:
+
+ 1) Arguments are delimited by white space, which is either a
+ space or a tab.
+
+ 2) A string surrounded by double quotation marks is
+ interpreted as a single argument, regardless of white space
+ contained within. A quoted string can be embedded in an
+ argument.
+
+ 3) A double quotation mark preceded by a backslash is
+ interpreted as a literal double quotation mark.
+
+ 4) Backslashes are interpreted literally, unless they
+ immediately precede a double quotation mark.
+
+ 5) If backslashes immediately precede a double quotation mark,
+ every pair of backslashes is interpreted as a literal
+ backslash. If the number of backslashes is odd, the last
+ backslash escapes the next double quotation mark as
+ described in rule 3.
+ """
+
+ # See
+ # http://msdn.microsoft.com/library/en-us/vccelng/htm/progs_12.asp
+ result = []
+ needquote = False
+ for arg in seq:
+ bs_buf = []
+
+ # Add a space to separate this argument from the others
+ if result:
+ result.append(' ')
+
+ needquote = (" " in arg) or ("\t" in arg) or ("|" in arg) or arg == ""
+ if needquote:
+ result.append('"')
+
+ for c in arg:
+ if c == '\\':
+ # Don't know if we need to double yet.
+ bs_buf.append(c)
+ elif c == '"':
+                # Double backslashes.
+ result.append('\\' * len(bs_buf)*2)
+ bs_buf = []
+ result.append('\\"')
+ else:
+ # Normal char
+ if bs_buf:
+ result.extend(bs_buf)
+ bs_buf = []
+ result.append(c)
+
+        # Add remaining backslashes, if any.
+ if bs_buf:
+ result.extend(bs_buf)
+
+ if needquote:
+ result.extend(bs_buf)
+ result.append('"')
+
+ return ''.join(result)
diff --git a/bitbake/lib/bb/runqueue.py b/bitbake/lib/bb/runqueue.py
new file mode 100644
index 0000000..2b71eed
--- /dev/null
+++ b/bitbake/lib/bb/runqueue.py
@@ -0,0 +1,2195 @@
+#!/usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'RunQueue' implementation
+
+Handles preparation and execution of a queue of tasks
+"""
+
+# Copyright (C) 2006-2007 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import copy
+import os
+import sys
+import signal
+import stat
+import fcntl
+import errno
+import logging
+import re
+import bb
+from bb import msg, data, event
+from bb import monitordisk
+import subprocess
+
+try:
+ import cPickle as pickle
+except ImportError:
+ import pickle
+
+bblogger = logging.getLogger("BitBake")
+logger = logging.getLogger("BitBake.RunQueue")
+
+__find_md5__ = re.compile( r'(?i)(?<![a-z0-9])[a-f0-9]{32}(?![a-z0-9])' )
+
+class RunQueueStats:
+ """
+ Holds statistics on the tasks handled by the associated runQueue
+ """
+ def __init__(self, total):
+ self.completed = 0
+ self.skipped = 0
+ self.failed = 0
+ self.active = 0
+ self.total = total
+
+ def copy(self):
+ obj = self.__class__(self.total)
+ obj.__dict__.update(self.__dict__)
+ return obj
+
+ def taskFailed(self):
+ self.active = self.active - 1
+ self.failed = self.failed + 1
+
+ def taskCompleted(self, number = 1):
+ self.active = self.active - number
+ self.completed = self.completed + number
+
+ def taskSkipped(self, number = 1):
+ self.active = self.active + number
+ self.skipped = self.skipped + number
+
+ def taskActive(self):
+ self.active = self.active + 1
+
+# These values indicate the next step due to be run in the
+# runQueue state machine
+runQueuePrepare = 2
+runQueueSceneInit = 3
+runQueueSceneRun = 4
+runQueueRunInit = 5
+runQueueRunning = 6
+runQueueFailed = 7
+runQueueCleanUp = 8
+runQueueComplete = 9
+
+class RunQueueScheduler(object):
+ """
+ Control the order tasks are scheduled in.
+ """
+ name = "basic"
+
+ def __init__(self, runqueue, rqdata):
+ """
+ The default scheduler just returns the first buildable task (the
+ priority map is sorted by task number)
+ """
+ self.rq = runqueue
+ self.rqdata = rqdata
+ self.numTasks = len(self.rqdata.runq_fnid)
+
+ self.prio_map = []
+ self.prio_map.extend(range(self.numTasks))
+
+ self.buildable = []
+ self.stamps = {}
+ for taskid in xrange(self.numTasks):
+ fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[taskid]]
+ taskname = self.rqdata.runq_task[taskid]
+ self.stamps[taskid] = bb.build.stampfile(taskname, self.rqdata.dataCache, fn)
+ if self.rq.runq_buildable[taskid] == 1:
+ self.buildable.append(taskid)
+
+ self.rev_prio_map = None
+
+ def next_buildable_task(self):
+ """
+ Return the id of the first task we find that is buildable
+ """
+ self.buildable = [x for x in self.buildable if not self.rq.runq_running[x] == 1]
+ if not self.buildable:
+ return None
+ if len(self.buildable) == 1:
+ taskid = self.buildable[0]
+ stamp = self.stamps[taskid]
+ if stamp not in self.rq.build_stamps.itervalues():
+ return taskid
+
+ if not self.rev_prio_map:
+ self.rev_prio_map = range(self.numTasks)
+ for taskid in xrange(self.numTasks):
+ self.rev_prio_map[self.prio_map[taskid]] = taskid
+
+ best = None
+ bestprio = None
+ for taskid in self.buildable:
+ prio = self.rev_prio_map[taskid]
+ if bestprio is None or bestprio > prio:
+ stamp = self.stamps[taskid]
+ if stamp in self.rq.build_stamps.itervalues():
+ continue
+ bestprio = prio
+ best = taskid
+
+ return best
+
+ def next(self):
+ """
+ Return the id of the task we should build next
+ """
+ if self.rq.stats.active < self.rq.number_tasks:
+ return self.next_buildable_task()
+
+ def newbuilable(self, task):
+ self.buildable.append(task)
+
+class RunQueueSchedulerSpeed(RunQueueScheduler):
+ """
+ A scheduler optimised for speed. The priority map is sorted by task weight,
+ heavier weighted tasks (tasks needed by the most other tasks) are run first.
+ """
+ name = "speed"
+
+ def __init__(self, runqueue, rqdata):
+ """
+ The priority map is sorted by task weight.
+ """
+ RunQueueScheduler.__init__(self, runqueue, rqdata)
+
+ sortweight = sorted(copy.deepcopy(self.rqdata.runq_weight))
+ copyweight = copy.deepcopy(self.rqdata.runq_weight)
+ self.prio_map = []
+
+ for weight in sortweight:
+ idx = copyweight.index(weight)
+ self.prio_map.append(idx)
+ copyweight[idx] = -1
+
+ self.prio_map.reverse()
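+        # e.g. runq_weight [3, 1, 2] yields prio_map [0, 2, 1]: heaviest task first.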
+
+class RunQueueSchedulerCompletion(RunQueueSchedulerSpeed):
+ """
+    A scheduler optimised to complete .bb files as quickly as possible. The
+ priority map is sorted by task weight, but then reordered so once a given
+ .bb file starts to build, it's completed as quickly as possible. This works
+ well where disk space is at a premium and classes like OE's rm_work are in
+ force.
+ """
+ name = "completion"
+
+ def __init__(self, runqueue, rqdata):
+ RunQueueSchedulerSpeed.__init__(self, runqueue, rqdata)
+
+ #FIXME - whilst this groups all fnids together it does not reorder the
+ #fnid groups optimally.
+
+ basemap = copy.deepcopy(self.prio_map)
+ self.prio_map = []
+ while (len(basemap) > 0):
+ entry = basemap.pop(0)
+ self.prio_map.append(entry)
+ fnid = self.rqdata.runq_fnid[entry]
+ todel = []
+ for entry in basemap:
+ entry_fnid = self.rqdata.runq_fnid[entry]
+ if entry_fnid == fnid:
+ todel.append(basemap.index(entry))
+ self.prio_map.append(entry)
+ todel.reverse()
+ for idx in todel:
+ del basemap[idx]
+
+class RunQueueData:
+ """
+ BitBake Run Queue implementation
+ """
+ def __init__(self, rq, cooker, cfgData, dataCache, taskData, targets):
+ self.cooker = cooker
+ self.dataCache = dataCache
+ self.taskData = taskData
+ self.targets = targets
+ self.rq = rq
+ self.warn_multi_bb = False
+
+ self.stampwhitelist = cfgData.getVar("BB_STAMP_WHITELIST", True) or ""
+ self.multi_provider_whitelist = (cfgData.getVar("MULTI_PROVIDER_WHITELIST", True) or "").split()
+
+ self.reset()
+
+ def reset(self):
+ self.runq_fnid = []
+ self.runq_task = []
+ self.runq_depends = []
+ self.runq_revdeps = []
+ self.runq_hash = []
+
+ def runq_depends_names(self, ids):
+ import re
+ ret = []
+ for id in self.runq_depends[ids]:
+ nam = os.path.basename(self.get_user_idstring(id))
+ nam = re.sub("_[^,]*,", ",", nam)
+ ret.extend([nam])
+ return ret
+
+ def get_task_name(self, task):
+ return self.runq_task[task]
+
+ def get_task_file(self, task):
+ return self.taskData.fn_index[self.runq_fnid[task]]
+
+ def get_task_hash(self, task):
+ return self.runq_hash[task]
+
+ def get_user_idstring(self, task, task_name_suffix = ""):
+ fn = self.taskData.fn_index[self.runq_fnid[task]]
+ taskname = self.runq_task[task] + task_name_suffix
+ return "%s, %s" % (fn, taskname)
+
+ def get_task_id(self, fnid, taskname):
+ for listid in xrange(len(self.runq_fnid)):
+ if self.runq_fnid[listid] == fnid and self.runq_task[listid] == taskname:
+ return listid
+ return None
+
+ def circular_depchains_handler(self, tasks):
+ """
+ Some tasks aren't buildable, likely due to circular dependency issues.
+ Identify the circular dependencies and print them in a user readable format.
+ """
+ from copy import deepcopy
+
+ valid_chains = []
+ explored_deps = {}
+ msgs = []
+
+ def chain_reorder(chain):
+ """
+ Reorder a dependency chain so the lowest task id is first
+ """
+ lowest = 0
+ new_chain = []
+ for entry in xrange(len(chain)):
+ if chain[entry] < chain[lowest]:
+ lowest = entry
+ new_chain.extend(chain[lowest:])
+ new_chain.extend(chain[:lowest])
+ return new_chain
+
+ def chain_compare_equal(chain1, chain2):
+ """
+ Compare two dependency chains and see if they're the same
+ """
+ if len(chain1) != len(chain2):
+ return False
+ for index in xrange(len(chain1)):
+ if chain1[index] != chain2[index]:
+ return False
+ return True
+
+ def chain_array_contains(chain, chain_array):
+ """
+ Return True if chain_array contains chain
+ """
+ for ch in chain_array:
+ if chain_compare_equal(ch, chain):
+ return True
+ return False
+
+ def find_chains(taskid, prev_chain):
+ prev_chain.append(taskid)
+ total_deps = []
+ total_deps.extend(self.runq_revdeps[taskid])
+ for revdep in self.runq_revdeps[taskid]:
+ if revdep in prev_chain:
+ idx = prev_chain.index(revdep)
+ # To prevent duplicates, reorder the chain to start with the lowest taskid
+ # and search through an array of those we've already printed
+ chain = prev_chain[idx:]
+ new_chain = chain_reorder(chain)
+ if not chain_array_contains(new_chain, valid_chains):
+ valid_chains.append(new_chain)
+ msgs.append("Dependency loop #%d found:\n" % len(valid_chains))
+ for dep in new_chain:
+ msgs.append(" Task %s (%s) (dependent Tasks %s)\n" % (dep, self.get_user_idstring(dep), self.runq_depends_names(dep)))
+ msgs.append("\n")
+ if len(valid_chains) > 10:
+ msgs.append("Aborted dependency loops search after 10 matches.\n")
+ return msgs
+ continue
+ scan = False
+ if revdep not in explored_deps:
+ scan = True
+ elif revdep in explored_deps[revdep]:
+ scan = True
+ else:
+ for dep in prev_chain:
+ if dep in explored_deps[revdep]:
+ scan = True
+ if scan:
+ find_chains(revdep, copy.deepcopy(prev_chain))
+ for dep in explored_deps[revdep]:
+ if dep not in total_deps:
+ total_deps.append(dep)
+
+ explored_deps[taskid] = total_deps
+
+ for task in tasks:
+ find_chains(task, [])
+
+ return msgs
+
+ def calculate_task_weights(self, endpoints):
+ """
+        Calculate a number representing the "weight" of each task. Heavier weighted tasks
+        have more tasks depending on them and hence should be executed sooner for maximum speed.
+
+ This function also sanity checks the task list finding tasks that are not
+ possible to execute due to circular dependencies.
+ """
+
+ numTasks = len(self.runq_fnid)
+ weight = []
+ deps_left = []
+ task_done = []
+
+ for listid in xrange(numTasks):
+ task_done.append(False)
+ weight.append(1)
+ deps_left.append(len(self.runq_revdeps[listid]))
+
+ for listid in endpoints:
+ weight[listid] = 10
+ task_done[listid] = True
+
+ while True:
+ next_points = []
+ for listid in endpoints:
+ for revdep in self.runq_depends[listid]:
+ weight[revdep] = weight[revdep] + weight[listid]
+ deps_left[revdep] = deps_left[revdep] - 1
+ if deps_left[revdep] == 0:
+ next_points.append(revdep)
+ task_done[revdep] = True
+ endpoints = next_points
+ if len(next_points) == 0:
+ break
+
+ # Circular dependency sanity check
+ problem_tasks = []
+ for task in xrange(numTasks):
+ if task_done[task] is False or deps_left[task] != 0:
+ problem_tasks.append(task)
+ logger.debug(2, "Task %s (%s) is not buildable", task, self.get_user_idstring(task))
+ logger.debug(2, "(Complete marker was %s and the remaining dependency count was %s)\n", task_done[task], deps_left[task])
+
+ if problem_tasks:
+ message = "Unbuildable tasks were found.\n"
+ message = message + "These are usually caused by circular dependencies and any circular dependency chains found will be printed below. Increase the debug level to see a list of unbuildable tasks.\n\n"
+ message = message + "Identifying dependency loops (this may take a short while)...\n"
+ logger.error(message)
+
+ msgs = self.circular_depchains_handler(problem_tasks)
+
+ message = "\n"
+ for msg in msgs:
+ message = message + msg
+ bb.msg.fatal("RunQueue", message)
+
+ return weight
+
+ def prepare(self):
+ """
+ Turn a set of taskData into a RunQueue and compute data needed
+ to optimise the execution order.
+ """
+
+ runq_build = []
+ recursivetasks = {}
+ recursiveitasks = {}
+ recursivetasksselfref = set()
+
+ taskData = self.taskData
+
+ if len(taskData.tasks_name) == 0:
+ # Nothing to do
+ return 0
+
+ logger.info("Preparing RunQueue")
+
+ # Step A - Work out a list of tasks to run
+ #
+ # Taskdata gives us a list of possible providers for every build and run
+ # target ordered by priority. It also gives information on each of those
+ # providers.
+ #
+ # To create the actual list of tasks to execute we fix the list of
+ # providers and then resolve the dependencies into task IDs. This
+ # process is repeated for each type of dependency (tdepends, deptask,
+        # rdeptask, recrdeptask, idepends).
+
+ def add_build_dependencies(depids, tasknames, depends):
+ for depid in depids:
+ # Won't be in build_targets if ASSUME_PROVIDED
+ if depid not in taskData.build_targets:
+ continue
+ depdata = taskData.build_targets[depid][0]
+ if depdata is None:
+ continue
+ for taskname in tasknames:
+ taskid = taskData.gettask_id_fromfnid(depdata, taskname)
+ if taskid is not None:
+ depends.add(taskid)
+
+ def add_runtime_dependencies(depids, tasknames, depends):
+ for depid in depids:
+ if depid not in taskData.run_targets:
+ continue
+ depdata = taskData.run_targets[depid][0]
+ if depdata is None:
+ continue
+ for taskname in tasknames:
+ taskid = taskData.gettask_id_fromfnid(depdata, taskname)
+ if taskid is not None:
+ depends.add(taskid)
+
+ def add_resolved_dependencies(depids, tasknames, depends):
+ for depid in depids:
+ for taskname in tasknames:
+ taskid = taskData.gettask_id_fromfnid(depid, taskname)
+ if taskid is not None:
+ depends.add(taskid)
+
+ for task in xrange(len(taskData.tasks_name)):
+ depends = set()
+ fnid = taskData.tasks_fnid[task]
+ fn = taskData.fn_index[fnid]
+ task_deps = self.dataCache.task_deps[fn]
+
+ #logger.debug(2, "Processing %s:%s", fn, taskData.tasks_name[task])
+
+ if fnid not in taskData.failed_fnids:
+
+ # Resolve task internal dependencies
+ #
+ # e.g. addtask before X after Y
+ depends = set(taskData.tasks_tdepends[task])
+
+ # Resolve 'deptask' dependencies
+ #
+ # e.g. do_sometask[deptask] = "do_someothertask"
+ # (makes sure sometask runs after someothertask of all DEPENDS)
+ if 'deptask' in task_deps and taskData.tasks_name[task] in task_deps['deptask']:
+ tasknames = task_deps['deptask'][taskData.tasks_name[task]].split()
+ add_build_dependencies(taskData.depids[fnid], tasknames, depends)
+
+ # Resolve 'rdeptask' dependencies
+ #
+ # e.g. do_sometask[rdeptask] = "do_someothertask"
+ # (makes sure sometask runs after someothertask of all RDEPENDS)
+ if 'rdeptask' in task_deps and taskData.tasks_name[task] in task_deps['rdeptask']:
+ tasknames = task_deps['rdeptask'][taskData.tasks_name[task]].split()
+ add_runtime_dependencies(taskData.rdepids[fnid], tasknames, depends)
+
+ # Resolve inter-task dependencies
+ #
+ # e.g. do_sometask[depends] = "targetname:do_someothertask"
+ # (makes sure sometask runs after targetname's someothertask)
+ idepends = taskData.tasks_idepends[task]
+ for (depid, idependtask) in idepends:
+ if depid in taskData.build_targets and not depid in taskData.failed_deps:
+ # Won't be in build_targets if ASSUME_PROVIDED
+ depdata = taskData.build_targets[depid][0]
+ if depdata is not None:
+ taskid = taskData.gettask_id_fromfnid(depdata, idependtask)
+ if taskid is None:
+ bb.msg.fatal("RunQueue", "Task %s in %s depends upon non-existent task %s in %s" % (taskData.tasks_name[task], fn, idependtask, taskData.fn_index[depdata]))
+ depends.add(taskid)
+ irdepends = taskData.tasks_irdepends[task]
+ for (depid, idependtask) in irdepends:
+ if depid in taskData.run_targets:
+ # Won't be in run_targets if ASSUME_PROVIDED
+ depdata = taskData.run_targets[depid][0]
+ if depdata is not None:
+ taskid = taskData.gettask_id_fromfnid(depdata, idependtask)
+ if taskid is None:
+ bb.msg.fatal("RunQueue", "Task %s in %s rdepends upon non-existent task %s in %s" % (taskData.tasks_name[task], fn, idependtask, taskData.fn_index[depdata]))
+ depends.add(taskid)
+
+ # Resolve recursive 'recrdeptask' dependencies (Part A)
+ #
+ # e.g. do_sometask[recrdeptask] = "do_someothertask"
+ # (makes sure sometask runs after someothertask of all DEPENDS, RDEPENDS and intertask dependencies, recursively)
+ # We cover the recursive part of the dependencies below
+ if 'recrdeptask' in task_deps and taskData.tasks_name[task] in task_deps['recrdeptask']:
+ tasknames = task_deps['recrdeptask'][taskData.tasks_name[task]].split()
+ recursivetasks[task] = tasknames
+ add_build_dependencies(taskData.depids[fnid], tasknames, depends)
+ add_runtime_dependencies(taskData.rdepids[fnid], tasknames, depends)
+ if taskData.tasks_name[task] in tasknames:
+ recursivetasksselfref.add(task)
+
+ if 'recideptask' in task_deps and taskData.tasks_name[task] in task_deps['recideptask']:
+ recursiveitasks[task] = []
+ for t in task_deps['recideptask'][taskData.tasks_name[task]].split():
+ newdep = taskData.gettask_id_fromfnid(fnid, t)
+ recursiveitasks[task].append(newdep)
+
+ self.runq_fnid.append(taskData.tasks_fnid[task])
+ self.runq_task.append(taskData.tasks_name[task])
+ self.runq_depends.append(depends)
+ self.runq_revdeps.append(set())
+ self.runq_hash.append("")
+
+ runq_build.append(0)
+
+ # Resolve recursive 'recrdeptask' dependencies (Part B)
+ #
+ # e.g. do_sometask[recrdeptask] = "do_someothertask"
+ # (makes sure sometask runs after someothertask of all DEPENDS, RDEPENDS and intertask dependencies, recursively)
+ # We need to do this separately since we need all of self.runq_depends to be complete before this is processed
+ extradeps = {}
+ for task in recursivetasks:
+ extradeps[task] = set(self.runq_depends[task])
+ tasknames = recursivetasks[task]
+ seendeps = set()
+ seenfnid = []
+
+ def generate_recdeps(t):
+ newdeps = set()
+ add_resolved_dependencies([taskData.tasks_fnid[t]], tasknames, newdeps)
+ extradeps[task].update(newdeps)
+ seendeps.add(t)
+ newdeps.add(t)
+ for i in newdeps:
+ for n in self.runq_depends[i]:
+ if n not in seendeps:
+ generate_recdeps(n)
+ generate_recdeps(task)
+
+ if task in recursiveitasks:
+ for dep in recursiveitasks[task]:
+ generate_recdeps(dep)
+
+ # Remove circular references so that do_a[recrdeptask] = "do_a do_b" can work
+ for task in recursivetasks:
+ extradeps[task].difference_update(recursivetasksselfref)
+
+ for task in xrange(len(taskData.tasks_name)):
+ # Add in extra dependencies
+ if task in extradeps:
+ self.runq_depends[task] = extradeps[task]
+ # Remove all self references
+ if task in self.runq_depends[task]:
+ logger.debug(2, "Task %s (%s %s) contains self reference! %s", task, taskData.fn_index[taskData.tasks_fnid[task]], taskData.tasks_name[task], self.runq_depends[task])
+ self.runq_depends[task].remove(task)
+
+ # Step B - Mark all active tasks
+ #
+ # Start with the tasks we were asked to run and mark all dependencies
+ # as active too. If the task is to be 'forced', clear its stamp. Once
+ # all active tasks are marked, prune the ones we don't need.
+
+ logger.verbose("Marking Active Tasks")
+
+ def mark_active(listid, depth):
+ """
+ Mark an item as active along with its depends
+ (calls itself recursively)
+ """
+
+ if runq_build[listid] == 1:
+ return
+
+ runq_build[listid] = 1
+
+ depends = self.runq_depends[listid]
+ for depend in depends:
+ mark_active(depend, depth+1)
+
+ self.target_pairs = []
+ for target in self.targets:
+ targetid = taskData.getbuild_id(target[0])
+
+ if targetid not in taskData.build_targets:
+ continue
+
+ if targetid in taskData.failed_deps:
+ continue
+
+ fnid = taskData.build_targets[targetid][0]
+ fn = taskData.fn_index[fnid]
+ self.target_pairs.append((fn, target[1]))
+
+ if fnid in taskData.failed_fnids:
+ continue
+
+ if target[1] not in taskData.tasks_lookup[fnid]:
+ import difflib
+ close_matches = difflib.get_close_matches(target[1], taskData.tasks_lookup[fnid], cutoff=0.7)
+ if close_matches:
+ extra = ". Close matches:\n %s" % "\n ".join(close_matches)
+ else:
+ extra = ""
+ bb.msg.fatal("RunQueue", "Task %s does not exist for target %s%s" % (target[1], target[0], extra))
+
+ listid = taskData.tasks_lookup[fnid][target[1]]
+
+ mark_active(listid, 1)
+
+ # Step C - Prune all inactive tasks
+ #
+ # Once all active tasks are marked, prune the ones we don't need.
+
+ maps = []
+ delcount = 0
+ for listid in xrange(len(self.runq_fnid)):
+ if runq_build[listid-delcount] == 1:
+ maps.append(listid-delcount)
+ else:
+ del self.runq_fnid[listid-delcount]
+ del self.runq_task[listid-delcount]
+ del self.runq_depends[listid-delcount]
+ del runq_build[listid-delcount]
+ del self.runq_revdeps[listid-delcount]
+ del self.runq_hash[listid-delcount]
+ delcount = delcount + 1
+ maps.append(-1)
+
+ #
+ # Step D - Sanity checks and computation
+ #
+
+ # Check to make sure we still have tasks to run
+ if len(self.runq_fnid) == 0:
+ if not taskData.abort:
+ bb.msg.fatal("RunQueue", "All buildable tasks have been run but the build is incomplete (--continue mode). Errors for the tasks that failed will have been printed above.")
+ else:
+ bb.msg.fatal("RunQueue", "No active tasks and not in --continue mode?! Please report this bug.")
+
+ logger.verbose("Pruned %s inactive tasks, %s left", delcount, len(self.runq_fnid))
+
+ # Remap the dependencies to account for the deleted tasks
+ # Check we didn't delete a task we depend on
+ for listid in xrange(len(self.runq_fnid)):
+ newdeps = []
+ origdeps = self.runq_depends[listid]
+ for origdep in origdeps:
+ if maps[origdep] == -1:
+ bb.msg.fatal("RunQueue", "Invalid mapping - Should never happen!")
+ newdeps.append(maps[origdep])
+ self.runq_depends[listid] = set(newdeps)
+
+ logger.verbose("Assign Weightings")
+
+ # Generate a list of reverse dependencies to ease future calculations
+ for listid in xrange(len(self.runq_fnid)):
+ for dep in self.runq_depends[listid]:
+ self.runq_revdeps[dep].add(listid)
+
+ # Identify tasks at the end of dependency chains
+ # Error on circular dependency loops (length two)
+ endpoints = []
+ for listid in xrange(len(self.runq_fnid)):
+ revdeps = self.runq_revdeps[listid]
+ if len(revdeps) == 0:
+ endpoints.append(listid)
+ for dep in revdeps:
+ if dep in self.runq_depends[listid]:
+ #self.dump_data(taskData)
+ bb.msg.fatal("RunQueue", "Task %s (%s) has circular dependency on %s (%s)" % (taskData.fn_index[self.runq_fnid[dep]], self.runq_task[dep], taskData.fn_index[self.runq_fnid[listid]], self.runq_task[listid]))
+
+ logger.verbose("Compute totals (have %s endpoint(s))", len(endpoints))
+
+ # Calculate task weights
+        # Check for circular dependencies of length greater than two
+ self.runq_weight = self.calculate_task_weights(endpoints)
+
+ # Sanity Check - Check for multiple tasks building the same provider
+ prov_list = {}
+ seen_fn = []
+ for task in xrange(len(self.runq_fnid)):
+ fn = taskData.fn_index[self.runq_fnid[task]]
+ if fn in seen_fn:
+ continue
+ seen_fn.append(fn)
+ for prov in self.dataCache.fn_provides[fn]:
+ if prov not in prov_list:
+ prov_list[prov] = [fn]
+ elif fn not in prov_list[prov]:
+ prov_list[prov].append(fn)
+ for prov in prov_list:
+ if len(prov_list[prov]) > 1 and prov not in self.multi_provider_whitelist:
+ seen_pn = []
+                # If two versions of the same PN are due to be built, it's fatal; we don't support that.
+ for fn in prov_list[prov]:
+ pn = self.dataCache.pkg_fn[fn]
+ if pn not in seen_pn:
+ seen_pn.append(pn)
+ else:
+ bb.fatal("Multiple versions of %s are due to be built (%s). Only one version of a given PN should be built in any given build. You likely need to set PREFERRED_VERSION_%s to select the correct version or don't depend on multiple versions." % (pn, " ".join(prov_list[prov]), pn))
+ msg = "Multiple .bb files are due to be built which each provide %s (%s)." % (prov, " ".join(prov_list[prov]))
+ if self.warn_multi_bb:
+ logger.warn(msg)
+ else:
+ msg += "\n This usually means one provides something the other doesn't and should."
+ logger.error(msg)
+
+ # Create a whitelist usable by the stamp checks
+ stampfnwhitelist = []
+ for entry in self.stampwhitelist.split():
+ entryid = self.taskData.getbuild_id(entry)
+ if entryid not in self.taskData.build_targets:
+ continue
+ fnid = self.taskData.build_targets[entryid][0]
+ fn = self.taskData.fn_index[fnid]
+ stampfnwhitelist.append(fn)
+ self.stampfnwhitelist = stampfnwhitelist
+
+ # Iterate over the task list looking for tasks with a 'setscene' function
+ self.runq_setscene = []
+ if not self.cooker.configuration.nosetscene:
+ for task in range(len(self.runq_fnid)):
+ setscene = taskData.gettask_id(self.taskData.fn_index[self.runq_fnid[task]], self.runq_task[task] + "_setscene", False)
+ if not setscene:
+ continue
+ self.runq_setscene.append(task)
+
+ def invalidate_task(fn, taskname, error_nostamp):
+ taskdep = self.dataCache.task_deps[fn]
+ fnid = self.taskData.getfn_id(fn)
+ if taskname not in taskData.tasks_lookup[fnid]:
+ logger.warn("Task %s does not exist, invalidating this task will have no effect" % taskname)
+ if 'nostamp' in taskdep and taskname in taskdep['nostamp']:
+ if error_nostamp:
+ bb.fatal("Task %s is marked nostamp, cannot invalidate this task" % taskname)
+ else:
+ bb.debug(1, "Task %s is marked nostamp, cannot invalidate this task" % taskname)
+ else:
+ logger.verbose("Invalidate task %s, %s", taskname, fn)
+ bb.parse.siggen.invalidate_task(taskname, self.dataCache, fn)
+
+ # Invalidate task if force mode active
+ if self.cooker.configuration.force:
+ for (fn, target) in self.target_pairs:
+ invalidate_task(fn, target, False)
+
+ # Invalidate task if invalidate mode active
+ if self.cooker.configuration.invalidate_stamp:
+ for (fn, target) in self.target_pairs:
+ for st in self.cooker.configuration.invalidate_stamp.split(','):
+ if not st.startswith("do_"):
+ st = "do_%s" % st
+ invalidate_task(fn, st, True)
+
+ # Iterate over the task list and call into the siggen code
+ dealtwith = set()
+ todeal = set(range(len(self.runq_fnid)))
+ while len(todeal) > 0:
+ for task in todeal.copy():
+ if len(self.runq_depends[task] - dealtwith) == 0:
+ dealtwith.add(task)
+ todeal.remove(task)
+ procdep = []
+ for dep in self.runq_depends[task]:
+ procdep.append(self.taskData.fn_index[self.runq_fnid[dep]] + "." + self.runq_task[dep])
+ self.runq_hash[task] = bb.parse.siggen.get_taskhash(self.taskData.fn_index[self.runq_fnid[task]], self.runq_task[task], procdep, self.dataCache)
+
+ return len(self.runq_fnid)
+
+ def dump_data(self, taskQueue):
+ """
+ Dump some debug information on the internal data structures
+ """
+ logger.debug(3, "run_tasks:")
+ for task in xrange(len(self.rqdata.runq_task)):
+ logger.debug(3, " (%s)%s - %s: %s Deps %s RevDeps %s", task,
+ taskQueue.fn_index[self.rqdata.runq_fnid[task]],
+ self.rqdata.runq_task[task],
+ self.rqdata.runq_weight[task],
+ self.rqdata.runq_depends[task],
+ self.rqdata.runq_revdeps[task])
+
+ logger.debug(3, "sorted_tasks:")
+ for task1 in xrange(len(self.rqdata.runq_task)):
+ if task1 in self.prio_map:
+ task = self.prio_map[task1]
+ logger.debug(3, " (%s)%s - %s: %s Deps %s RevDeps %s", task,
+ taskQueue.fn_index[self.rqdata.runq_fnid[task]],
+ self.rqdata.runq_task[task],
+ self.rqdata.runq_weight[task],
+ self.rqdata.runq_depends[task],
+ self.rqdata.runq_revdeps[task])
+
+class RunQueue:
+ def __init__(self, cooker, cfgData, dataCache, taskData, targets):
+
+ self.cooker = cooker
+ self.cfgData = cfgData
+ self.rqdata = RunQueueData(self, cooker, cfgData, dataCache, taskData, targets)
+
+ self.stamppolicy = cfgData.getVar("BB_STAMP_POLICY", True) or "perfile"
+ self.hashvalidate = cfgData.getVar("BB_HASHCHECK_FUNCTION", True) or None
+ self.setsceneverify = cfgData.getVar("BB_SETSCENE_VERIFY_FUNCTION", True) or None
+ self.depvalidate = cfgData.getVar("BB_SETSCENE_DEPVALID", True) or None
+
+ self.state = runQueuePrepare
+
+ # For disk space monitor
+ self.dm = monitordisk.diskMonitor(cfgData)
+
+ self.rqexe = None
+ self.worker = None
+ self.workerpipe = None
+ self.fakeworker = None
+ self.fakeworkerpipe = None
+
+ def _start_worker(self, fakeroot = False, rqexec = None):
+ logger.debug(1, "Starting bitbake-worker")
+ magic = "decafbad"
+ if self.cooker.configuration.profile:
+ magic = "decafbadbad"
+ if fakeroot:
+ fakerootcmd = self.cfgData.getVar("FAKEROOTCMD", True)
+ fakerootenv = (self.cfgData.getVar("FAKEROOTBASEENV", True) or "").split()
+ env = os.environ.copy()
+ for key, value in (var.split('=') for var in fakerootenv):
+ env[key] = value
+ worker = subprocess.Popen([fakerootcmd, "bitbake-worker", magic], stdout=subprocess.PIPE, stdin=subprocess.PIPE, env=env)
+ else:
+ worker = subprocess.Popen(["bitbake-worker", magic], stdout=subprocess.PIPE, stdin=subprocess.PIPE)
+ bb.utils.nonblockingfd(worker.stdout)
+ workerpipe = runQueuePipe(worker.stdout, None, self.cfgData, self, rqexec)
+
+ workerdata = {
+ "taskdeps" : self.rqdata.dataCache.task_deps,
+ "fakerootenv" : self.rqdata.dataCache.fakerootenv,
+ "fakerootdirs" : self.rqdata.dataCache.fakerootdirs,
+ "fakerootnoenv" : self.rqdata.dataCache.fakerootnoenv,
+ "sigdata" : bb.parse.siggen.get_taskdata(),
+ "runq_hash" : self.rqdata.runq_hash,
+ "logdefaultdebug" : bb.msg.loggerDefaultDebugLevel,
+ "logdefaultverbose" : bb.msg.loggerDefaultVerbose,
+ "logdefaultverboselogs" : bb.msg.loggerVerboseLogs,
+ "logdefaultdomain" : bb.msg.loggerDefaultDomains,
+ "prhost" : self.cooker.prhost,
+ "buildname" : self.cfgData.getVar("BUILDNAME", True),
+ "date" : self.cfgData.getVar("DATE", True),
+ "time" : self.cfgData.getVar("TIME", True),
+ }
+
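+        # Configuration and worker data are sent to the worker as pickled
+        # payloads framed in XML-style tags on its stdin; replies come back
+        # on stdout and are decoded by runQueuePipe.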
+ worker.stdin.write("<cookerconfig>" + pickle.dumps(self.cooker.configuration) + "</cookerconfig>")
+ worker.stdin.write("<workerdata>" + pickle.dumps(workerdata) + "</workerdata>")
+ worker.stdin.flush()
+
+ return worker, workerpipe
+
+ def _teardown_worker(self, worker, workerpipe):
+ if not worker:
+ return
+ logger.debug(1, "Teardown for bitbake-worker")
+ try:
+ worker.stdin.write("<quit></quit>")
+ worker.stdin.flush()
+ except IOError:
+ pass
+ while worker.returncode is None:
+ workerpipe.read()
+ worker.poll()
+ while workerpipe.read():
+ continue
+ workerpipe.close()
+
+ def start_worker(self):
+ if self.worker:
+ self.teardown_workers()
+ self.teardown = False
+ self.worker, self.workerpipe = self._start_worker()
+
+ def start_fakeworker(self, rqexec):
+ if not self.fakeworker:
+ self.fakeworker, self.fakeworkerpipe = self._start_worker(True, rqexec)
+
+ def teardown_workers(self):
+ self.teardown = True
+ self._teardown_worker(self.worker, self.workerpipe)
+ self.worker = None
+ self.workerpipe = None
+ self._teardown_worker(self.fakeworker, self.fakeworkerpipe)
+ self.fakeworker = None
+ self.fakeworkerpipe = None
+
+ def read_workers(self):
+ self.workerpipe.read()
+ if self.fakeworkerpipe:
+ self.fakeworkerpipe.read()
+
+ def active_fds(self):
+ fds = []
+ if self.workerpipe:
+ fds.append(self.workerpipe.input)
+ if self.fakeworkerpipe:
+ fds.append(self.fakeworkerpipe.input)
+ return fds
+
+ def check_stamp_task(self, task, taskname = None, recurse = False, cache = None):
+ def get_timestamp(f):
+ try:
+ if not os.access(f, os.F_OK):
+ return None
+ return os.stat(f)[stat.ST_MTIME]
+ except:
+ return None
+
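+        # BB_STAMP_POLICY: "perfile" only compares stamps belonging to the
+        # same recipe file, "whitelist" compares across the dependency tree
+        # but ignores whitelisted recipes, and any other value compares the
+        # full dependency tree.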
+ if self.stamppolicy == "perfile":
+ fulldeptree = False
+ else:
+ fulldeptree = True
+ stampwhitelist = []
+ if self.stamppolicy == "whitelist":
+ stampwhitelist = self.rqdata.stampfnwhitelist
+
+ fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[task]]
+ if taskname is None:
+ taskname = self.rqdata.runq_task[task]
+
+ stampfile = bb.build.stampfile(taskname, self.rqdata.dataCache, fn)
+
+ # If the stamp is missing, it's not current
+ if not os.access(stampfile, os.F_OK):
+ logger.debug(2, "Stampfile %s not available", stampfile)
+ return False
+ # If it's a 'nostamp' task, it's not current
+ taskdep = self.rqdata.dataCache.task_deps[fn]
+ if 'nostamp' in taskdep and taskname in taskdep['nostamp']:
+ logger.debug(2, "%s.%s is nostamp\n", fn, taskname)
+ return False
+
+ if taskname != "do_setscene" and taskname.endswith("_setscene"):
+ return True
+
+ if cache is None:
+ cache = {}
+
+ iscurrent = True
+ t1 = get_timestamp(stampfile)
+ for dep in self.rqdata.runq_depends[task]:
+ if iscurrent:
+ fn2 = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[dep]]
+ taskname2 = self.rqdata.runq_task[dep]
+ stampfile2 = bb.build.stampfile(taskname2, self.rqdata.dataCache, fn2)
+ stampfile3 = bb.build.stampfile(taskname2 + "_setscene", self.rqdata.dataCache, fn2)
+ t2 = get_timestamp(stampfile2)
+ t3 = get_timestamp(stampfile3)
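+                # A setscene stamp newer than the normal stamp means the
+                # dependency was provided via setscene, so its timestamp
+                # doesn't invalidate this task.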
+ if t3 and t3 > t2:
+ continue
+ if fn == fn2 or (fulldeptree and fn2 not in stampwhitelist):
+ if not t2:
+ logger.debug(2, 'Stampfile %s does not exist', stampfile2)
+ iscurrent = False
+ if t1 < t2:
+ logger.debug(2, 'Stampfile %s < %s', stampfile, stampfile2)
+ iscurrent = False
+ if recurse and iscurrent:
+ if dep in cache:
+ iscurrent = cache[dep]
+ if not iscurrent:
+ logger.debug(2, 'Stampfile for dependency %s:%s invalid (cached)' % (fn2, taskname2))
+ else:
+ iscurrent = self.check_stamp_task(dep, recurse=True, cache=cache)
+ cache[dep] = iscurrent
+ if recurse:
+ cache[task] = iscurrent
+ return iscurrent
+
+ def _execute_runqueue(self):
+ """
+ Run the tasks in a queue prepared by rqdata.prepare()
+ Upon failure, optionally try to recover the build using any alternate providers
+ (if the abort on failure configuration option isn't set)
+ """
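+        # This acts as a state machine, advancing self.state on each call
+        # (prepare -> scene init/run -> run init -> running -> cleanup ->
+        # complete/failed); it returns False once everything is done and a
+        # "keep calling me" value otherwise.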
+
+ retval = True
+
+ if self.state is runQueuePrepare:
+ self.rqexe = RunQueueExecuteDummy(self)
+ if self.rqdata.prepare() == 0:
+ self.state = runQueueComplete
+ else:
+ self.state = runQueueSceneInit
+
+ # we are ready to run, see if any UI client needs the dependency info
+ if bb.cooker.CookerFeatures.SEND_DEPENDS_TREE in self.cooker.featureset:
+ depgraph = self.cooker.buildDependTree(self, self.rqdata.taskData)
+ bb.event.fire(bb.event.DepTreeGenerated(depgraph), self.cooker.data)
+
+ if self.state is runQueueSceneInit:
+ dump = self.cooker.configuration.dump_signatures
+ if dump:
+ if 'printdiff' in dump:
+ invalidtasks = self.print_diffscenetasks()
+ self.dump_signatures(dump)
+ if 'printdiff' in dump:
+ self.write_diffscenetasks(invalidtasks)
+ self.state = runQueueComplete
+ else:
+ self.start_worker()
+ self.rqexe = RunQueueExecuteScenequeue(self)
+
+ if self.state in [runQueueSceneRun, runQueueRunning, runQueueCleanUp]:
+ self.dm.check(self)
+
+ if self.state is runQueueSceneRun:
+ retval = self.rqexe.execute()
+
+ if self.state is runQueueRunInit:
+ logger.info("Executing RunQueue Tasks")
+ self.rqexe = RunQueueExecuteTasks(self)
+ self.state = runQueueRunning
+
+ if self.state is runQueueRunning:
+ retval = self.rqexe.execute()
+
+ if self.state is runQueueCleanUp:
+ retval = self.rqexe.finish()
+
+ if (self.state is runQueueComplete or self.state is runQueueFailed) and self.rqexe:
+ self.teardown_workers()
+ if self.rqexe.stats.failed:
+ logger.info("Tasks Summary: Attempted %d tasks of which %d didn't need to be rerun and %d failed.", self.rqexe.stats.completed + self.rqexe.stats.failed, self.rqexe.stats.skipped, self.rqexe.stats.failed)
+ else:
+ # Let's avoid the word "failed" if nothing actually did
+ logger.info("Tasks Summary: Attempted %d tasks of which %d didn't need to be rerun and all succeeded.", self.rqexe.stats.completed, self.rqexe.stats.skipped)
+
+ if self.state is runQueueFailed:
+ if not self.rqdata.taskData.tryaltconfigs:
+ raise bb.runqueue.TaskFailure(self.rqexe.failed_fnids)
+ for fnid in self.rqexe.failed_fnids:
+ self.rqdata.taskData.fail_fnid(fnid)
+ self.rqdata.reset()
+
+ if self.state is runQueueComplete:
+ # All done
+ return False
+
+ # Loop
+ return retval
+
+ def execute_runqueue(self):
+ # Catch unexpected exceptions and ensure we exit when an error occurs, not loop.
+ try:
+ return self._execute_runqueue()
+ except bb.runqueue.TaskFailure:
+ raise
+ except SystemExit:
+ raise
+ except bb.BBHandledException:
+ try:
+ self.teardown_workers()
+ except:
+ pass
+ self.state = runQueueComplete
+ raise
+ except:
+            logger.error("An uncaught exception occurred in runqueue, please see the failure below:")
+ try:
+ self.teardown_workers()
+ except:
+ pass
+ self.state = runQueueComplete
+ raise
+
+ def finish_runqueue(self, now = False):
+ if not self.rqexe:
+ self.state = runQueueComplete
+ return
+
+ if now:
+ self.rqexe.finish_now()
+ else:
+ self.rqexe.finish()
+
+ def dump_signatures(self, options):
+ done = set()
+ bb.note("Reparsing files to collect dependency data")
+ for task in range(len(self.rqdata.runq_fnid)):
+ if self.rqdata.runq_fnid[task] not in done:
+ fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[task]]
+ the_data = bb.cache.Cache.loadDataFull(fn, self.cooker.collection.get_file_appends(fn), self.cooker.data)
+ done.add(self.rqdata.runq_fnid[task])
+
+ bb.parse.siggen.dump_sigs(self.rqdata.dataCache, options)
+
+ return
+
+ def print_diffscenetasks(self):
+
+ valid = []
+ sq_hash = []
+ sq_hashfn = []
+ sq_fn = []
+ sq_taskname = []
+ sq_task = []
+ noexec = []
+ stamppresent = []
+ valid_new = set()
+
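+        # Collect the filename/hash/taskname tuples for every task that isn't
+        # noexec and ask the BB_HASHCHECK_FUNCTION which of them can be
+        # satisfied from existing signature data.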
+ for task in xrange(len(self.rqdata.runq_fnid)):
+ fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[task]]
+ taskname = self.rqdata.runq_task[task]
+ taskdep = self.rqdata.dataCache.task_deps[fn]
+
+ if 'noexec' in taskdep and taskname in taskdep['noexec']:
+ noexec.append(task)
+ continue
+
+ sq_fn.append(fn)
+ sq_hashfn.append(self.rqdata.dataCache.hashfn[fn])
+ sq_hash.append(self.rqdata.runq_hash[task])
+ sq_taskname.append(taskname)
+ sq_task.append(task)
+ locs = { "sq_fn" : sq_fn, "sq_task" : sq_taskname, "sq_hash" : sq_hash, "sq_hashfn" : sq_hashfn, "d" : self.cooker.expanded_data }
+ try:
+ call = self.hashvalidate + "(sq_fn, sq_task, sq_hash, sq_hashfn, d, siginfo=True)"
+ valid = bb.utils.better_eval(call, locs)
+ # Handle version with no siginfo parameter
+ except TypeError:
+ call = self.hashvalidate + "(sq_fn, sq_task, sq_hash, sq_hashfn, d)"
+ valid = bb.utils.better_eval(call, locs)
+ for v in valid:
+ valid_new.add(sq_task[v])
+
+ # Tasks which are both setscene and noexec never care about dependencies
+ # We therefore find tasks which are setscene and noexec and mark their
+ # unique dependencies as valid.
+ for task in noexec:
+ if task not in self.rqdata.runq_setscene:
+ continue
+ for dep in self.rqdata.runq_depends[task]:
+ hasnoexecparents = True
+ for dep2 in self.rqdata.runq_revdeps[dep]:
+ if dep2 in self.rqdata.runq_setscene and dep2 in noexec:
+ continue
+ hasnoexecparents = False
+ break
+ if hasnoexecparents:
+ valid_new.add(dep)
+
+ invalidtasks = set()
+ for task in xrange(len(self.rqdata.runq_fnid)):
+ if task not in valid_new and task not in noexec:
+ invalidtasks.add(task)
+
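+        # Trim the invalid set down to the points where the differences start:
+        # any invalid task that has an invalid task somewhere in its
+        # dependencies is filtered out, leaving only the roots.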
+ found = set()
+ processed = set()
+ for task in invalidtasks:
+ toprocess = set([task])
+ while toprocess:
+ next = set()
+ for t in toprocess:
+ for dep in self.rqdata.runq_depends[t]:
+ if dep in invalidtasks:
+ found.add(task)
+ if dep not in processed:
+ processed.add(dep)
+ next.add(dep)
+ toprocess = next
+ if task in found:
+ toprocess = set()
+
+ tasklist = []
+ for task in invalidtasks.difference(found):
+ tasklist.append(self.rqdata.get_user_idstring(task))
+
+ if tasklist:
+ bb.plain("The differences between the current build and any cached tasks start at the following tasks:\n" + "\n".join(tasklist))
+
+ return invalidtasks.difference(found)
+
+ def write_diffscenetasks(self, invalidtasks):
+
+ # Define recursion callback
+ def recursecb(key, hash1, hash2):
+ hashes = [hash1, hash2]
+ hashfiles = bb.siggen.find_siginfo(key, None, hashes, self.cfgData)
+
+ recout = []
+ if len(hashfiles) == 2:
+ out2 = bb.siggen.compare_sigfiles(hashfiles[hash1], hashfiles[hash2], recursecb)
+ recout.extend(list(' ' + l for l in out2))
+ else:
+ recout.append("Unable to find matching sigdata for %s with hashes %s or %s" % (key, hash1, hash2))
+
+ return recout
+
+
+ for task in invalidtasks:
+ fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[task]]
+ pn = self.rqdata.dataCache.pkg_fn[fn]
+ taskname = self.rqdata.runq_task[task]
+ h = self.rqdata.runq_hash[task]
+ matches = bb.siggen.find_siginfo(pn, taskname, [], self.cfgData)
+ match = None
+ for m in matches:
+ if h in m:
+ match = m
+ if match is None:
+ bb.fatal("Can't find a task we're supposed to have written out? (hash: %s)?" % h)
+ matches = {k : v for k, v in matches.iteritems() if h not in k}
+ if matches:
+ latestmatch = sorted(matches.keys(), key=lambda f: matches[f])[-1]
+ prevh = __find_md5__.search(latestmatch).group(0)
+ output = bb.siggen.compare_sigfiles(latestmatch, match, recursecb)
+ bb.plain("\nTask %s:%s couldn't be used from the cache because:\n We need hash %s, closest matching task was %s\n " % (pn, taskname, h, prevh) + '\n '.join(output))
+
+class RunQueueExecute:
+
+ def __init__(self, rq):
+ self.rq = rq
+ self.cooker = rq.cooker
+ self.cfgData = rq.cfgData
+ self.rqdata = rq.rqdata
+
+ self.number_tasks = int(self.cfgData.getVar("BB_NUMBER_THREADS", True) or 1)
+ self.scheduler = self.cfgData.getVar("BB_SCHEDULER", True) or "speed"
+
+ self.runq_buildable = []
+ self.runq_running = []
+ self.runq_complete = []
+
+ self.build_stamps = {}
+ self.build_stamps2 = []
+ self.failed_fnids = []
+
+ self.stampcache = {}
+
+ rq.workerpipe.setrunqueueexec(self)
+ if rq.fakeworkerpipe:
+ rq.fakeworkerpipe.setrunqueueexec(self)
+
+ if self.number_tasks <= 0:
+ bb.fatal("Invalid BB_NUMBER_THREADS %s" % self.number_tasks)
+
+ def runqueue_process_waitpid(self, task, status):
+
+        # self.build_stamps[task] may not exist when using a shared work directory.
+ if task in self.build_stamps:
+ self.build_stamps2.remove(self.build_stamps[task])
+ del self.build_stamps[task]
+
+ if status != 0:
+ self.task_fail(task, status)
+ else:
+ self.task_complete(task)
+ return True
+
+ def finish_now(self):
+
+ for worker in [self.rq.worker, self.rq.fakeworker]:
+ if not worker:
+ continue
+ try:
+ worker.stdin.write("<finishnow></finishnow>")
+ worker.stdin.flush()
+ except IOError:
+ # worker must have died?
+ pass
+
+ if len(self.failed_fnids) != 0:
+ self.rq.state = runQueueFailed
+ return
+
+ self.rq.state = runQueueComplete
+ return
+
+ def finish(self):
+ self.rq.state = runQueueCleanUp
+
+ if self.stats.active > 0:
+ bb.event.fire(runQueueExitWait(self.stats.active), self.cfgData)
+ self.rq.read_workers()
+ return self.rq.active_fds()
+
+ if len(self.failed_fnids) != 0:
+ self.rq.state = runQueueFailed
+ return True
+
+ self.rq.state = runQueueComplete
+ return True
+
+ def check_dependencies(self, task, taskdeps, setscene = False):
+ if not self.rq.depvalidate:
+ return False
+
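+        # BB_SETSCENE_DEPVALID names a metadata function which is called with
+        # (task, taskdata, notneeded, d) and decides whether the dependencies
+        # in question actually need to be present for 'task'.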
+ taskdata = {}
+ taskdeps.add(task)
+ for dep in taskdeps:
+ if setscene:
+ depid = self.rqdata.runq_setscene[dep]
+ else:
+ depid = dep
+ fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[depid]]
+ pn = self.rqdata.dataCache.pkg_fn[fn]
+ taskname = self.rqdata.runq_task[depid]
+ taskdata[dep] = [pn, taskname, fn]
+ call = self.rq.depvalidate + "(task, taskdata, notneeded, d)"
+ locs = { "task" : task, "taskdata" : taskdata, "notneeded" : self.scenequeue_notneeded, "d" : self.cooker.expanded_data }
+ valid = bb.utils.better_eval(call, locs)
+ return valid
+
+class RunQueueExecuteDummy(RunQueueExecute):
+ def __init__(self, rq):
+ self.rq = rq
+ self.stats = RunQueueStats(0)
+
+ def finish(self):
+ self.rq.state = runQueueComplete
+ return
+
+class RunQueueExecuteTasks(RunQueueExecute):
+ def __init__(self, rq):
+ RunQueueExecute.__init__(self, rq)
+
+ self.stats = RunQueueStats(len(self.rqdata.runq_fnid))
+
+ self.stampcache = {}
+
+ initial_covered = self.rq.scenequeue_covered.copy()
+
+ # Mark initial buildable tasks
+ for task in xrange(self.stats.total):
+ self.runq_running.append(0)
+ self.runq_complete.append(0)
+ if len(self.rqdata.runq_depends[task]) == 0:
+ self.runq_buildable.append(1)
+ else:
+ self.runq_buildable.append(0)
+ if len(self.rqdata.runq_revdeps[task]) > 0 and self.rqdata.runq_revdeps[task].issubset(self.rq.scenequeue_covered) and task not in self.rq.scenequeue_notcovered:
+ self.rq.scenequeue_covered.add(task)
+
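+        # Iterate to a fixed point: a task whose reverse dependencies are all
+        # already covered by the scenequeue (and which isn't explicitly
+        # notcovered) can be considered covered as well.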
+ found = True
+ while found:
+ found = False
+ for task in xrange(self.stats.total):
+ if task in self.rq.scenequeue_covered:
+ continue
+ logger.debug(1, 'Considering %s (%s): %s' % (task, self.rqdata.get_user_idstring(task), str(self.rqdata.runq_revdeps[task])))
+
+ if len(self.rqdata.runq_revdeps[task]) > 0 and self.rqdata.runq_revdeps[task].issubset(self.rq.scenequeue_covered) and task not in self.rq.scenequeue_notcovered:
+ found = True
+ self.rq.scenequeue_covered.add(task)
+
+ logger.debug(1, 'Skip list (pre setsceneverify) %s', sorted(self.rq.scenequeue_covered))
+
+ # Allow the metadata to elect for setscene tasks to run anyway
+ covered_remove = set()
+ if self.rq.setsceneverify:
+ invalidtasks = []
+ for task in xrange(len(self.rqdata.runq_task)):
+ fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[task]]
+ taskname = self.rqdata.runq_task[task]
+ taskdep = self.rqdata.dataCache.task_deps[fn]
+
+ if 'noexec' in taskdep and taskname in taskdep['noexec']:
+ continue
+ if self.rq.check_stamp_task(task, taskname + "_setscene", cache=self.stampcache):
+ logger.debug(2, 'Setscene stamp current for task %s(%s)', task, self.rqdata.get_user_idstring(task))
+ continue
+ if self.rq.check_stamp_task(task, taskname, recurse = True, cache=self.stampcache):
+ logger.debug(2, 'Normal stamp current for task %s(%s)', task, self.rqdata.get_user_idstring(task))
+ continue
+ invalidtasks.append(task)
+
+ call = self.rq.setsceneverify + "(covered, tasknames, fnids, fns, d, invalidtasks=invalidtasks)"
+ call2 = self.rq.setsceneverify + "(covered, tasknames, fnids, fns, d)"
+ locs = { "covered" : self.rq.scenequeue_covered, "tasknames" : self.rqdata.runq_task, "fnids" : self.rqdata.runq_fnid, "fns" : self.rqdata.taskData.fn_index, "d" : self.cooker.expanded_data, "invalidtasks" : invalidtasks }
+ # Backwards compatibility with older versions without invalidtasks
+ try:
+ covered_remove = bb.utils.better_eval(call, locs)
+ except TypeError:
+ covered_remove = bb.utils.better_eval(call2, locs)
+
+ def removecoveredtask(task):
+ fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[task]]
+ taskname = self.rqdata.runq_task[task] + '_setscene'
+ bb.build.del_stamp(taskname, self.rqdata.dataCache, fn)
+ self.rq.scenequeue_covered.remove(task)
+
+ toremove = covered_remove
+ for task in toremove:
+ logger.debug(1, 'Not skipping task %s due to setsceneverify', task)
+ while toremove:
+ covered_remove = []
+ for task in toremove:
+ removecoveredtask(task)
+ for deptask in self.rqdata.runq_depends[task]:
+ if deptask not in self.rq.scenequeue_covered:
+ continue
+ if deptask in toremove or deptask in covered_remove or deptask in initial_covered:
+ continue
+ logger.debug(1, 'Task %s depends on task %s so not skipping' % (task, deptask))
+ covered_remove.append(deptask)
+ toremove = covered_remove
+
+ logger.debug(1, 'Full skip list %s', self.rq.scenequeue_covered)
+
+ event.fire(bb.event.StampUpdate(self.rqdata.target_pairs, self.rqdata.dataCache.stamp), self.cfgData)
+
+ schedulers = self.get_schedulers()
+ for scheduler in schedulers:
+ if self.scheduler == scheduler.name:
+ self.sched = scheduler(self, self.rqdata)
+ logger.debug(1, "Using runqueue scheduler '%s'", scheduler.name)
+ break
+ else:
+ bb.fatal("Invalid scheduler '%s'. Available schedulers: %s" %
+ (self.scheduler, ", ".join(obj.name for obj in schedulers)))
+
+ def get_schedulers(self):
+ schedulers = set(obj for obj in globals().values()
+ if type(obj) is type and
+ issubclass(obj, RunQueueScheduler))
+
+ user_schedulers = self.cfgData.getVar("BB_SCHEDULERS", True)
+ if user_schedulers:
+ for sched in user_schedulers.split():
+ if not "." in sched:
+ bb.note("Ignoring scheduler '%s' from BB_SCHEDULERS: not an import" % sched)
+ continue
+
+ modname, name = sched.rsplit(".", 1)
+ try:
+ module = __import__(modname, fromlist=(name,))
+ except ImportError as exc:
+ logger.critical("Unable to import scheduler '%s' from '%s': %s" % (name, modname, exc))
+ raise SystemExit(1)
+ else:
+ schedulers.add(getattr(module, name))
+ return schedulers
+
+ def setbuildable(self, task):
+ self.runq_buildable[task] = 1
+ self.sched.newbuilable(task)
+
+ def task_completeoutright(self, task):
+ """
+ Mark a task as completed
+ Look at the reverse dependencies and mark any task with
+ completed dependencies as buildable
+ """
+ self.runq_complete[task] = 1
+ for revdep in self.rqdata.runq_revdeps[task]:
+ if self.runq_running[revdep] == 1:
+ continue
+ if self.runq_buildable[revdep] == 1:
+ continue
+ alldeps = 1
+ for dep in self.rqdata.runq_depends[revdep]:
+ if self.runq_complete[dep] != 1:
+ alldeps = 0
+ if alldeps == 1:
+ self.setbuildable(revdep)
+ fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[revdep]]
+ taskname = self.rqdata.runq_task[revdep]
+ logger.debug(1, "Marking task %s (%s, %s) as buildable", revdep, fn, taskname)
+
+ def task_complete(self, task):
+ self.stats.taskCompleted()
+ bb.event.fire(runQueueTaskCompleted(task, self.stats, self.rq), self.cfgData)
+ self.task_completeoutright(task)
+
+ def task_fail(self, task, exitcode):
+ """
+ Called when a task has failed
+ Updates the state engine with the failure
+ """
+ self.stats.taskFailed()
+ fnid = self.rqdata.runq_fnid[task]
+ self.failed_fnids.append(fnid)
+ bb.event.fire(runQueueTaskFailed(task, self.stats, exitcode, self.rq), self.cfgData)
+ if self.rqdata.taskData.abort:
+ self.rq.state = runQueueCleanUp
+
+ def task_skip(self, task, reason):
+ self.runq_running[task] = 1
+ self.setbuildable(task)
+ bb.event.fire(runQueueTaskSkipped(task, self.stats, self.rq, reason), self.cfgData)
+ self.task_completeoutright(task)
+ self.stats.taskCompleted()
+ self.stats.taskSkipped()
+
+ def execute(self):
+ """
+ Run the tasks in a queue prepared by rqdata.prepare()
+ """
+
+ self.rq.read_workers()
+
+
+ if self.stats.total == 0:
+ # nothing to do
+ self.rq.state = runQueueCleanUp
+
+ task = self.sched.next()
+ if task is not None:
+ fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[task]]
+ taskname = self.rqdata.runq_task[task]
+
+ if task in self.rq.scenequeue_covered:
+ logger.debug(2, "Setscene covered task %s (%s)", task,
+ self.rqdata.get_user_idstring(task))
+ self.task_skip(task, "covered")
+ return True
+
+ if self.rq.check_stamp_task(task, taskname, cache=self.stampcache):
+ logger.debug(2, "Stamp current task %s (%s)", task,
+ self.rqdata.get_user_idstring(task))
+ self.task_skip(task, "existing")
+ return True
+
+ taskdep = self.rqdata.dataCache.task_deps[fn]
+ if 'noexec' in taskdep and taskname in taskdep['noexec']:
+ startevent = runQueueTaskStarted(task, self.stats, self.rq,
+ noexec=True)
+ bb.event.fire(startevent, self.cfgData)
+ self.runq_running[task] = 1
+ self.stats.taskActive()
+ if not self.cooker.configuration.dry_run:
+ bb.build.make_stamp(taskname, self.rqdata.dataCache, fn)
+ self.task_complete(task)
+ return True
+ else:
+ startevent = runQueueTaskStarted(task, self.stats, self.rq)
+ bb.event.fire(startevent, self.cfgData)
+
+ taskdepdata = self.build_taskdepdata(task)
+
+ taskdep = self.rqdata.dataCache.task_deps[fn]
+ if 'fakeroot' in taskdep and taskname in taskdep['fakeroot'] and not self.cooker.configuration.dry_run:
+ if not self.rq.fakeworker:
+ try:
+ self.rq.start_fakeworker(self)
+ except OSError as exc:
+ logger.critical("Failed to spawn fakeroot worker to run %s:%s: %s" % (fn, taskname, str(exc)))
+ self.rq.state = runQueueFailed
+ return True
+ self.rq.fakeworker.stdin.write("<runtask>" + pickle.dumps((fn, task, taskname, False, self.cooker.collection.get_file_appends(fn), taskdepdata)) + "</runtask>")
+ self.rq.fakeworker.stdin.flush()
+ else:
+ self.rq.worker.stdin.write("<runtask>" + pickle.dumps((fn, task, taskname, False, self.cooker.collection.get_file_appends(fn), taskdepdata)) + "</runtask>")
+ self.rq.worker.stdin.flush()
+
+ self.build_stamps[task] = bb.build.stampfile(taskname, self.rqdata.dataCache, fn)
+ self.build_stamps2.append(self.build_stamps[task])
+ self.runq_running[task] = 1
+ self.stats.taskActive()
+ if self.stats.active < self.number_tasks:
+ return True
+
+ if self.stats.active > 0:
+ self.rq.read_workers()
+ return self.rq.active_fds()
+
+ if len(self.failed_fnids) != 0:
+ self.rq.state = runQueueFailed
+ return True
+
+ # Sanity Checks
+ for task in xrange(self.stats.total):
+ if self.runq_buildable[task] == 0:
+ logger.error("Task %s never buildable!", task)
+ if self.runq_running[task] == 0:
+ logger.error("Task %s never ran!", task)
+ if self.runq_complete[task] == 0:
+ logger.error("Task %s never completed!", task)
+ self.rq.state = runQueueComplete
+
+ return True
+
+ def build_taskdepdata(self, task):
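+        # Build a mapping of task id -> [PN, taskname, filename, depends,
+        # provides] for this task and its transitive dependencies; it is
+        # shipped to the worker along with the task to execute.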
+ taskdepdata = {}
+ next = self.rqdata.runq_depends[task]
+ next.add(task)
+ while next:
+ additional = []
+ for revdep in next:
+ fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[revdep]]
+ pn = self.rqdata.dataCache.pkg_fn[fn]
+ taskname = self.rqdata.runq_task[revdep]
+ deps = self.rqdata.runq_depends[revdep]
+ provides = self.rqdata.dataCache.fn_provides[fn]
+ taskdepdata[revdep] = [pn, taskname, fn, deps, provides]
+ for revdep2 in deps:
+ if revdep2 not in taskdepdata:
+ additional.append(revdep2)
+ next = additional
+
+ #bb.note("Task %s: " % task + str(taskdepdata).replace("], ", "],\n"))
+ return taskdepdata
+
+class RunQueueExecuteScenequeue(RunQueueExecute):
+ def __init__(self, rq):
+ RunQueueExecute.__init__(self, rq)
+
+ self.scenequeue_covered = set()
+ self.scenequeue_notcovered = set()
+ self.scenequeue_notneeded = set()
+
+ # If we don't have any setscene functions, skip this step
+ if len(self.rqdata.runq_setscene) == 0:
+ rq.scenequeue_covered = set()
+ rq.state = runQueueRunInit
+ return
+
+ self.stats = RunQueueStats(len(self.rqdata.runq_setscene))
+
+ sq_revdeps = []
+ sq_revdeps_new = []
+ sq_revdeps_squash = []
+ self.sq_harddeps = {}
+
+ # We need to construct a dependency graph for the setscene functions. Intermediate
+ # dependencies between the setscene tasks only complicate the code. This code
+ # therefore aims to collapse the huge runqueue dependency tree into a smaller one
+ # only containing the setscene functions.
+
+ for task in xrange(self.stats.total):
+ self.runq_running.append(0)
+ self.runq_complete.append(0)
+ self.runq_buildable.append(0)
+
+ # First process the chains up to the first setscene task.
+ endpoints = {}
+ for task in xrange(len(self.rqdata.runq_fnid)):
+ sq_revdeps.append(copy.copy(self.rqdata.runq_revdeps[task]))
+ sq_revdeps_new.append(set())
+ if (len(self.rqdata.runq_revdeps[task]) == 0) and task not in self.rqdata.runq_setscene:
+ endpoints[task] = set()
+
+ # Secondly process the chains between setscene tasks.
+ for task in self.rqdata.runq_setscene:
+ for dep in self.rqdata.runq_depends[task]:
+ if dep not in endpoints:
+ endpoints[dep] = set()
+ endpoints[dep].add(task)
+
+ def process_endpoints(endpoints):
+ newendpoints = {}
+ for point, task in endpoints.items():
+ tasks = set()
+ if task:
+ tasks |= task
+ if sq_revdeps_new[point]:
+ tasks |= sq_revdeps_new[point]
+ sq_revdeps_new[point] = set()
+ if point in self.rqdata.runq_setscene:
+ sq_revdeps_new[point] = tasks
+ for dep in self.rqdata.runq_depends[point]:
+ if point in sq_revdeps[dep]:
+ sq_revdeps[dep].remove(point)
+ if tasks:
+ sq_revdeps_new[dep] |= tasks
+ if (len(sq_revdeps[dep]) == 0 or len(sq_revdeps_new[dep]) != 0) and dep not in self.rqdata.runq_setscene:
+ newendpoints[dep] = task
+ if len(newendpoints) != 0:
+ process_endpoints(newendpoints)
+
+ process_endpoints(endpoints)
+
+ # Build a list of setscene tasks which are "unskippable"
+ # These are direct endpoints referenced by the build
+ endpoints2 = {}
+ sq_revdeps2 = []
+ sq_revdeps_new2 = []
+ def process_endpoints2(endpoints):
+ newendpoints = {}
+ for point, task in endpoints.items():
+ tasks = set([point])
+ if task:
+ tasks |= task
+ if sq_revdeps_new2[point]:
+ tasks |= sq_revdeps_new2[point]
+ sq_revdeps_new2[point] = set()
+ if point in self.rqdata.runq_setscene:
+ sq_revdeps_new2[point] = tasks
+ for dep in self.rqdata.runq_depends[point]:
+ if point in sq_revdeps2[dep]:
+ sq_revdeps2[dep].remove(point)
+ if tasks:
+ sq_revdeps_new2[dep] |= tasks
+ if (len(sq_revdeps2[dep]) == 0 or len(sq_revdeps_new2[dep]) != 0) and dep not in self.rqdata.runq_setscene:
+ newendpoints[dep] = tasks
+ if len(newendpoints) != 0:
+ process_endpoints2(newendpoints)
+ for task in xrange(len(self.rqdata.runq_fnid)):
+ sq_revdeps2.append(copy.copy(self.rqdata.runq_revdeps[task]))
+ sq_revdeps_new2.append(set())
+ if (len(self.rqdata.runq_revdeps[task]) == 0) and task not in self.rqdata.runq_setscene:
+ endpoints2[task] = set()
+ process_endpoints2(endpoints2)
+ self.unskippable = []
+ for task in self.rqdata.runq_setscene:
+ if sq_revdeps_new2[task]:
+ self.unskippable.append(self.rqdata.runq_setscene.index(task))
+
+ for task in xrange(len(self.rqdata.runq_fnid)):
+ if task in self.rqdata.runq_setscene:
+ deps = set()
+ for dep in sq_revdeps_new[task]:
+ deps.add(self.rqdata.runq_setscene.index(dep))
+ sq_revdeps_squash.append(deps)
+ elif len(sq_revdeps_new[task]) != 0:
+ bb.msg.fatal("RunQueue", "Something went badly wrong during scenequeue generation, aborting. Please report this problem.")
+
+ # Resolve setscene inter-task dependencies
+ # e.g. do_sometask_setscene[depends] = "targetname:do_someothertask_setscene"
+ # Note that anything explicitly depended upon will have its reverse dependencies removed to avoid circular dependencies
+ for task in self.rqdata.runq_setscene:
+ realid = self.rqdata.taskData.gettask_id(self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[task]], self.rqdata.runq_task[task] + "_setscene", False)
+ idepends = self.rqdata.taskData.tasks_idepends[realid]
+ for (depid, idependtask) in idepends:
+ if depid not in self.rqdata.taskData.build_targets:
+ continue
+
+ depdata = self.rqdata.taskData.build_targets[depid][0]
+ if depdata is None:
+ continue
+ dep = self.rqdata.taskData.fn_index[depdata]
+ taskid = self.rqdata.get_task_id(self.rqdata.taskData.getfn_id(dep), idependtask.replace("_setscene", ""))
+ if taskid is None:
+ bb.msg.fatal("RunQueue", "Task %s_setscene depends upon non-existent task %s:%s" % (self.rqdata.get_user_idstring(task), dep, idependtask))
+
+ if not self.rqdata.runq_setscene.index(taskid) in self.sq_harddeps:
+ self.sq_harddeps[self.rqdata.runq_setscene.index(taskid)] = set()
+ self.sq_harddeps[self.rqdata.runq_setscene.index(taskid)].add(self.rqdata.runq_setscene.index(task))
+
+ sq_revdeps_squash[self.rqdata.runq_setscene.index(task)].add(self.rqdata.runq_setscene.index(taskid))
+ # Have to zero this to avoid circular dependencies
+ sq_revdeps_squash[self.rqdata.runq_setscene.index(taskid)] = set()
+
+ for task in self.sq_harddeps:
+ for dep in self.sq_harddeps[task]:
+ sq_revdeps_squash[dep].add(task)
+
+ #for task in xrange(len(sq_revdeps_squash)):
+ # realtask = self.rqdata.runq_setscene[task]
+ # bb.warn("Task %s: %s_setscene is %s " % (task, self.rqdata.get_user_idstring(realtask) , sq_revdeps_squash[task]))
+
+ self.sq_deps = []
+ self.sq_revdeps = sq_revdeps_squash
+ self.sq_revdeps2 = copy.deepcopy(self.sq_revdeps)
+
+ for task in xrange(len(self.sq_revdeps)):
+ self.sq_deps.append(set())
+ for task in xrange(len(self.sq_revdeps)):
+ for dep in self.sq_revdeps[task]:
+ self.sq_deps[dep].add(task)
+
+ for task in xrange(len(self.sq_revdeps)):
+ if len(self.sq_revdeps[task]) == 0:
+ self.runq_buildable[task] = 1
+
+ self.outrightfail = []
+ if self.rq.hashvalidate:
+ sq_hash = []
+ sq_hashfn = []
+ sq_fn = []
+ sq_taskname = []
+ sq_task = []
+ noexec = []
+ stamppresent = []
+ for task in xrange(len(self.sq_revdeps)):
+ realtask = self.rqdata.runq_setscene[task]
+ fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[realtask]]
+ taskname = self.rqdata.runq_task[realtask]
+ taskdep = self.rqdata.dataCache.task_deps[fn]
+
+ if 'noexec' in taskdep and taskname in taskdep['noexec']:
+ noexec.append(task)
+ self.task_skip(task)
+ bb.build.make_stamp(taskname + "_setscene", self.rqdata.dataCache, fn)
+ continue
+
+ if self.rq.check_stamp_task(realtask, taskname + "_setscene", cache=self.stampcache):
+ logger.debug(2, 'Setscene stamp current for task %s(%s)', task, self.rqdata.get_user_idstring(realtask))
+ stamppresent.append(task)
+ self.task_skip(task)
+ continue
+
+ if self.rq.check_stamp_task(realtask, taskname, recurse = True, cache=self.stampcache):
+ logger.debug(2, 'Normal stamp current for task %s(%s)', task, self.rqdata.get_user_idstring(realtask))
+ stamppresent.append(task)
+ self.task_skip(task)
+ continue
+
+ sq_fn.append(fn)
+ sq_hashfn.append(self.rqdata.dataCache.hashfn[fn])
+ sq_hash.append(self.rqdata.runq_hash[realtask])
+ sq_taskname.append(taskname)
+ sq_task.append(task)
+ call = self.rq.hashvalidate + "(sq_fn, sq_task, sq_hash, sq_hashfn, d)"
+ locs = { "sq_fn" : sq_fn, "sq_task" : sq_taskname, "sq_hash" : sq_hash, "sq_hashfn" : sq_hashfn, "d" : self.cooker.expanded_data }
+ valid = bb.utils.better_eval(call, locs)
+
+ valid_new = stamppresent
+ for v in valid:
+ valid_new.append(sq_task[v])
+
+ for task in xrange(len(self.sq_revdeps)):
+ if task not in valid_new and task not in noexec:
+ realtask = self.rqdata.runq_setscene[task]
+ logger.debug(2, 'No package found, so skipping setscene task %s',
+ self.rqdata.get_user_idstring(realtask))
+ self.outrightfail.append(task)
+
+ logger.info('Executing SetScene Tasks')
+
+ self.rq.state = runQueueSceneRun
+
+ def scenequeue_updatecounters(self, task, fail = False):
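+        # Update the bookkeeping of every setscene task waiting on 'task':
+        # each waiter becomes buildable once its outstanding set (sq_revdeps2)
+        # is empty, while failed hard dependencies propagate onwards so their
+        # waiters get skipped too.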
+ for dep in self.sq_deps[task]:
+ if fail and task in self.sq_harddeps and dep in self.sq_harddeps[task]:
+ realtask = self.rqdata.runq_setscene[task]
+ realdep = self.rqdata.runq_setscene[dep]
+ logger.debug(2, "%s was unavailable and is a hard dependency of %s so skipping" % (self.rqdata.get_user_idstring(realtask), self.rqdata.get_user_idstring(realdep)))
+ self.scenequeue_updatecounters(dep, fail)
+ continue
+ if task not in self.sq_revdeps2[dep]:
+ # May already have been removed by the fail case above
+ continue
+ self.sq_revdeps2[dep].remove(task)
+ if len(self.sq_revdeps2[dep]) == 0:
+ self.runq_buildable[dep] = 1
+
+ def task_completeoutright(self, task):
+ """
+ Mark a task as completed
+ Look at the reverse dependencies and mark any task with
+ completed dependencies as buildable
+ """
+
+ index = self.rqdata.runq_setscene[task]
+ logger.debug(1, 'Found task %s which could be accelerated',
+ self.rqdata.get_user_idstring(index))
+
+ self.scenequeue_covered.add(task)
+ self.scenequeue_updatecounters(task)
+
+ def task_complete(self, task):
+ self.stats.taskCompleted()
+ bb.event.fire(sceneQueueTaskCompleted(task, self.stats, self.rq), self.cfgData)
+ self.task_completeoutright(task)
+
+ def task_fail(self, task, result):
+ self.stats.taskFailed()
+ bb.event.fire(sceneQueueTaskFailed(task, self.stats, result, self), self.cfgData)
+ self.scenequeue_notcovered.add(task)
+ self.scenequeue_updatecounters(task, True)
+
+ def task_failoutright(self, task):
+ self.runq_running[task] = 1
+ self.runq_buildable[task] = 1
+ self.stats.taskCompleted()
+ self.stats.taskSkipped()
+ index = self.rqdata.runq_setscene[task]
+ self.scenequeue_notcovered.add(task)
+ self.scenequeue_updatecounters(task, True)
+
+ def task_skip(self, task):
+ self.runq_running[task] = 1
+ self.runq_buildable[task] = 1
+ self.task_completeoutright(task)
+ self.stats.taskCompleted()
+ self.stats.taskSkipped()
+
+ def execute(self):
+ """
+ Run the tasks in a queue prepared by prepare_runqueue
+ """
+
+ self.rq.read_workers()
+
+ task = None
+ if self.stats.active < self.number_tasks:
+ # Find the next setscene to run
+ for nexttask in xrange(self.stats.total):
+ if self.runq_buildable[nexttask] == 1 and self.runq_running[nexttask] != 1:
+ if nexttask in self.unskippable:
+ logger.debug(2, "Setscene task %s is unskippable" % self.rqdata.get_user_idstring(self.rqdata.runq_setscene[nexttask]))
+ if nexttask not in self.unskippable and len(self.sq_revdeps[nexttask]) > 0 and self.sq_revdeps[nexttask].issubset(self.scenequeue_covered) and self.check_dependencies(nexttask, self.sq_revdeps[nexttask], True):
+ realtask = self.rqdata.runq_setscene[nexttask]
+ fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[realtask]]
+ foundtarget = False
+ for target in self.rqdata.target_pairs:
+ if target[0] == fn and target[1] == self.rqdata.runq_task[realtask]:
+ foundtarget = True
+ break
+ if not foundtarget:
+ logger.debug(2, "Skipping setscene for task %s" % self.rqdata.get_user_idstring(self.rqdata.runq_setscene[nexttask]))
+ self.task_skip(nexttask)
+ self.scenequeue_notneeded.add(nexttask)
+ return True
+ if nexttask in self.outrightfail:
+ self.task_failoutright(nexttask)
+ return True
+ task = nexttask
+ break
+ if task is not None:
+ realtask = self.rqdata.runq_setscene[task]
+ fn = self.rqdata.taskData.fn_index[self.rqdata.runq_fnid[realtask]]
+
+ taskname = self.rqdata.runq_task[realtask] + "_setscene"
+ if self.rq.check_stamp_task(realtask, self.rqdata.runq_task[realtask], recurse = True, cache=self.stampcache):
+ logger.debug(2, 'Stamp for underlying task %s(%s) is current, so skipping setscene variant',
+ task, self.rqdata.get_user_idstring(realtask))
+ self.task_failoutright(task)
+ return True
+
+ if self.cooker.configuration.force:
+ for target in self.rqdata.target_pairs:
+ if target[0] == fn and target[1] == self.rqdata.runq_task[realtask]:
+ self.task_failoutright(task)
+ return True
+
+ if self.rq.check_stamp_task(realtask, taskname, cache=self.stampcache):
+ logger.debug(2, 'Setscene stamp current task %s(%s), so skip it and its dependencies',
+ task, self.rqdata.get_user_idstring(realtask))
+ self.task_skip(task)
+ return True
+
+ startevent = sceneQueueTaskStarted(task, self.stats, self.rq)
+ bb.event.fire(startevent, self.cfgData)
+
+ taskdep = self.rqdata.dataCache.task_deps[fn]
+ if 'fakeroot' in taskdep and taskname in taskdep['fakeroot']:
+ if not self.rq.fakeworker:
+ self.rq.start_fakeworker(self)
+ self.rq.fakeworker.stdin.write("<runtask>" + pickle.dumps((fn, realtask, taskname, True, self.cooker.collection.get_file_appends(fn), None)) + "</runtask>")
+ self.rq.fakeworker.stdin.flush()
+ else:
+ self.rq.worker.stdin.write("<runtask>" + pickle.dumps((fn, realtask, taskname, True, self.cooker.collection.get_file_appends(fn), None)) + "</runtask>")
+ self.rq.worker.stdin.flush()
+
+ self.runq_running[task] = 1
+ self.stats.taskActive()
+ if self.stats.active < self.number_tasks:
+ return True
+
+ if self.stats.active > 0:
+ self.rq.read_workers()
+ return self.rq.active_fds()
+
+ #for task in xrange(self.stats.total):
+ # if self.runq_running[task] != 1:
+ # buildable = self.runq_buildable[task]
+ # revdeps = self.sq_revdeps[task]
+ # bb.warn("Found we didn't run %s %s %s %s" % (task, buildable, str(revdeps), self.rqdata.get_user_idstring(self.rqdata.runq_setscene[task])))
+
+ # Convert scenequeue_covered task numbers into full taskgraph ids
+ oldcovered = self.scenequeue_covered
+ self.rq.scenequeue_covered = set()
+ for task in oldcovered:
+ self.rq.scenequeue_covered.add(self.rqdata.runq_setscene[task])
+ self.rq.scenequeue_notcovered = set()
+ for task in self.scenequeue_notcovered:
+ self.rq.scenequeue_notcovered.add(self.rqdata.runq_setscene[task])
+
+ logger.debug(1, 'We can skip tasks %s', sorted(self.rq.scenequeue_covered))
+
+ self.rq.state = runQueueRunInit
+
+ completeevent = sceneQueueComplete(self.stats, self.rq)
+ bb.event.fire(completeevent, self.cfgData)
+
+ return True
+
+ def runqueue_process_waitpid(self, task, status):
+ task = self.rq.rqdata.runq_setscene.index(task)
+
+ RunQueueExecute.runqueue_process_waitpid(self, task, status)
+
+class TaskFailure(Exception):
+ """
+ Exception raised when a task in a runqueue fails
+ """
+ def __init__(self, x):
+ self.args = x
+
+
+class runQueueExitWait(bb.event.Event):
+ """
+ Event when waiting for task processes to exit
+ """
+
+ def __init__(self, remain):
+ self.remain = remain
+ self.message = "Waiting for %s active tasks to finish" % remain
+ bb.event.Event.__init__(self)
+
+class runQueueEvent(bb.event.Event):
+ """
+ Base runQueue event class
+ """
+ def __init__(self, task, stats, rq):
+ self.taskid = task
+ self.taskstring = rq.rqdata.get_user_idstring(task)
+ self.taskname = rq.rqdata.get_task_name(task)
+ self.taskfile = rq.rqdata.get_task_file(task)
+ self.taskhash = rq.rqdata.get_task_hash(task)
+ self.stats = stats.copy()
+ bb.event.Event.__init__(self)
+
+class sceneQueueEvent(runQueueEvent):
+ """
+ Base sceneQueue event class
+ """
+ def __init__(self, task, stats, rq, noexec=False):
+ runQueueEvent.__init__(self, task, stats, rq)
+ realtask = rq.rqdata.runq_setscene[task]
+ self.taskstring = rq.rqdata.get_user_idstring(realtask, "_setscene")
+ self.taskname = rq.rqdata.get_task_name(realtask) + "_setscene"
+ self.taskfile = rq.rqdata.get_task_file(realtask)
+ self.taskhash = rq.rqdata.get_task_hash(realtask)
+
+class runQueueTaskStarted(runQueueEvent):
+ """
+ Event notifying a task was started
+ """
+ def __init__(self, task, stats, rq, noexec=False):
+ runQueueEvent.__init__(self, task, stats, rq)
+ self.noexec = noexec
+
+class sceneQueueTaskStarted(sceneQueueEvent):
+ """
+ Event notifying a setscene task was started
+ """
+ def __init__(self, task, stats, rq, noexec=False):
+ sceneQueueEvent.__init__(self, task, stats, rq)
+ self.noexec = noexec
+
+class runQueueTaskFailed(runQueueEvent):
+ """
+ Event notifying a task failed
+ """
+ def __init__(self, task, stats, exitcode, rq):
+ runQueueEvent.__init__(self, task, stats, rq)
+ self.exitcode = exitcode
+
+class sceneQueueTaskFailed(sceneQueueEvent):
+ """
+ Event notifying a setscene task failed
+ """
+ def __init__(self, task, stats, exitcode, rq):
+ sceneQueueEvent.__init__(self, task, stats, rq)
+ self.exitcode = exitcode
+
+class sceneQueueComplete(sceneQueueEvent):
+ """
+ Event when all the sceneQueue tasks are complete
+ """
+ def __init__(self, stats, rq):
+ self.stats = stats.copy()
+ bb.event.Event.__init__(self)
+
+class runQueueTaskCompleted(runQueueEvent):
+ """
+ Event notifying a task completed
+ """
+
+class sceneQueueTaskCompleted(sceneQueueEvent):
+ """
+ Event notifying a setscene task completed
+ """
+
+class runQueueTaskSkipped(runQueueEvent):
+ """
+ Event notifying a task was skipped
+ """
+ def __init__(self, task, stats, rq, reason):
+ runQueueEvent.__init__(self, task, stats, rq)
+ self.reason = reason
+
+class runQueuePipe():
+ """
+ Abstraction for a pipe between a worker thread and the server
+ """
+ def __init__(self, pipein, pipeout, d, rq, rqexec):
+ self.input = pipein
+ if pipeout:
+ pipeout.close()
+ bb.utils.nonblockingfd(self.input)
+ self.queue = ""
+ self.d = d
+ self.rq = rq
+ self.rqexec = rqexec
+
+ def setrunqueueexec(self, rqexec):
+ self.rqexec = rqexec
+
+ def read(self):
+ for w in [self.rq.worker, self.rq.fakeworker]:
+ if not w:
+ continue
+ w.poll()
+ if w.returncode is not None and not self.rq.teardown:
+ name = None
+ if self.rq.worker and w.pid == self.rq.worker.pid:
+ name = "Worker"
+ elif self.rq.fakeworker and w.pid == self.rq.fakeworker.pid:
+ name = "Fakeroot"
+ bb.error("%s process (%s) exited unexpectedly (%s), shutting down..." % (name, w.pid, str(w.returncode)))
+ self.rq.finish_runqueue(True)
+
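+        # Worker output arrives as a byte stream of pickled payloads framed in
+        # <event>...</event> and <exitcode>...</exitcode> tags; buffer it in
+        # self.queue and peel off complete frames below.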
+ start = len(self.queue)
+ try:
+ self.queue = self.queue + self.input.read(102400)
+ except (OSError, IOError) as e:
+ if e.errno != errno.EAGAIN:
+ raise
+ end = len(self.queue)
+ found = True
+ while found and len(self.queue):
+ found = False
+ index = self.queue.find("</event>")
+ while index != -1 and self.queue.startswith("<event>"):
+ try:
+ event = pickle.loads(self.queue[7:index])
+ except ValueError as e:
+                    bb.msg.fatal("RunQueue", "failed to load pickle '%s': '%s'" % (e, self.queue[7:index]))
+ bb.event.fire_from_worker(event, self.d)
+ found = True
+ self.queue = self.queue[index+8:]
+ index = self.queue.find("</event>")
+ index = self.queue.find("</exitcode>")
+ while index != -1 and self.queue.startswith("<exitcode>"):
+ try:
+ task, status = pickle.loads(self.queue[10:index])
+ except ValueError as e:
+                    bb.msg.fatal("RunQueue", "failed to load pickle '%s': '%s'" % (e, self.queue[10:index]))
+ self.rqexec.runqueue_process_waitpid(task, status)
+ found = True
+ self.queue = self.queue[index+11:]
+ index = self.queue.find("</exitcode>")
+ return (end > start)
+
+ def close(self):
+ while self.read():
+ continue
+ if len(self.queue) > 0:
+ print("Warning, worker left partial message: %s" % self.queue)
+ self.input.close()
diff --git a/bitbake/lib/bb/server/__init__.py b/bitbake/lib/bb/server/__init__.py
new file mode 100644
index 0000000..da5e480
--- /dev/null
+++ b/bitbake/lib/bb/server/__init__.py
@@ -0,0 +1,96 @@
+#
+# BitBake Base Server Code
+#
+# Copyright (C) 2006 - 2007 Michael 'Mickey' Lauer
+# Copyright (C) 2006 - 2008 Richard Purdie
+# Copyright (C) 2013 Alexandru Damian
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+""" Base code for Bitbake server process
+
+Having a common base for all Bitbake server classes ensures a consistent
+approach to the interface and minimizes the risks associated with code duplication.
+
+"""
+
+""" BaseImplServer() the base class for all XXServer() implementations.
+
+ These classes contain the actual code that runs the server side, i.e.
+ listens for the commands and executes them. Although these implementations
+    contain all the data of the original bitbake command, i.e. the cooker instance,
+    they may well run in a different process or even on a different machine.
+
+"""
+
+class BaseImplServer():
+ def __init__(self):
+ self._idlefuns = {}
+
+ def addcooker(self, cooker):
+ self.cooker = cooker
+
+ def register_idle_function(self, function, data):
+ """Register a function to be called while the server is idle"""
+ assert hasattr(function, '__call__')
+ self._idlefuns[function] = data
+
+
+
+""" BitBakeBaseServerConnection class is the common ancestor to all
+ BitBakeServerConnection classes.
+
+ These classes control the remote server. The only command currently
+ implemented is the terminate() command.
+
+"""
+
+class BitBakeBaseServerConnection():
+ def __init__(self, serverImpl):
+ pass
+
+ def terminate(self):
+ pass
+
+
+""" BitBakeBaseServer class is the common ancestor to all Bitbake servers
+
+ Derive this class in order to implement a BitBakeServer which is the
+ controlling stub for the actual server implementation
+
+"""
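+# A concrete server (such as the process-based one in bb/server/process.py)
+# derives from BitBakeBaseServer, sets up its implementation in initServer()
+# and returns its connection object from establishConnection().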
+class BitBakeBaseServer(object):
+ def initServer(self):
+ self.serverImpl = None # we ensure a runtime crash if not overloaded
+ self.connection = None
+ return
+
+ def addcooker(self, cooker):
+ self.cooker = cooker
+ self.serverImpl.addcooker(cooker)
+
+ def getServerIdleCB(self):
+ return self.serverImpl.register_idle_function
+
+ def saveConnectionDetails(self):
+ return
+
+ def detach(self):
+ return
+
+ def establishConnection(self, featureset):
+        raise NotImplementedError("Must redefine the %s.establishConnection()" % self.__class__.__name__)
+
+ def endSession(self):
+ self.connection.terminate()
diff --git a/bitbake/lib/bb/server/process.py b/bitbake/lib/bb/server/process.py
new file mode 100644
index 0000000..5fca350
--- /dev/null
+++ b/bitbake/lib/bb/server/process.py
@@ -0,0 +1,264 @@
+#
+# BitBake Process based server.
+#
+# Copyright (C) 2010 Bob Foerster <robert@erafx.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+"""
+ This module implements a multiprocessing.Process based server for bitbake.
+"""
+
+import bb
+import bb.event
+import itertools
+import logging
+import multiprocessing
+import os
+import signal
+import sys
+import time
+import select
+from Queue import Empty
+from multiprocessing import Event, Process, util, Queue, Pipe, queues, Manager
+
+from . import BitBakeBaseServer, BitBakeBaseServerConnection, BaseImplServer
+
+logger = logging.getLogger('BitBake')
+
+class ServerCommunicator():
+ def __init__(self, connection, event_handle, server):
+ self.connection = connection
+ self.event_handle = event_handle
+ self.server = server
+
+ def runCommand(self, command):
+ # @todo try/except
+ self.connection.send(command)
+
+ if not self.server.is_alive():
+ raise SystemExit
+
+ while True:
+ # don't let the user ctrl-c while we're waiting for a response
+ try:
+ if self.connection.poll(20):
+ return self.connection.recv()
+ else:
+ bb.fatal("Timeout while attempting to communicate with bitbake server")
+ except KeyboardInterrupt:
+ pass
+
+ def getEventHandle(self):
+ return self.event_handle.value
+
+class EventAdapter():
+ """
+ Adapter to wrap our event queue since the caller (bb.event) expects to
+ call a send() method, but our actual queue only has put()
+ """
+ def __init__(self, queue):
+ self.queue = queue
+
+ def send(self, event):
+ try:
+ self.queue.put(event)
+ except Exception as err:
+ print("EventAdapter puked: %s" % str(err))
+
+
+class ProcessServer(Process, BaseImplServer):
+ profile_filename = "profile.log"
+ profile_processed_filename = "profile.log.processed"
+
+ def __init__(self, command_channel, event_queue, featurelist):
+ BaseImplServer.__init__(self)
+ Process.__init__(self)
+ self.command_channel = command_channel
+ self.event_queue = event_queue
+ self.event = EventAdapter(event_queue)
+ self.featurelist = featurelist
+ self.quit = False
+
+ self.quitin, self.quitout = Pipe()
+ self.event_handle = multiprocessing.Value("i")
+
+ def run(self):
+ for event in bb.event.ui_queue:
+ self.event_queue.put(event)
+ self.event_handle.value = bb.event.register_UIHhandler(self, True)
+
+ bb.cooker.server_main(self.cooker, self.main)
+
+ def main(self):
+ # Ignore SIGINT within the server, as all SIGINT handling is done by
+ # the UI and communicated to us
+ self.quitin.close()
+ signal.signal(signal.SIGINT, signal.SIG_IGN)
+ while not self.quit:
+ try:
+ if self.command_channel.poll():
+ command = self.command_channel.recv()
+ self.runCommand(command)
+ if self.quitout.poll():
+ self.quitout.recv()
+ self.quit = True
+ try:
+ self.runCommand(["stateForceShutdown"])
+ except:
+ pass
+
+ self.idle_commands(.1, [self.command_channel, self.quitout])
+ except Exception:
+ logger.exception('Running command %s', command)
+
+ self.event_queue.close()
+ bb.event.unregister_UIHhandler(self.event_handle.value)
+ self.command_channel.close()
+ self.cooker.shutdown(True)
+ self.quitout.close()
+
+ def idle_commands(self, delay, fds=None):
+ nextsleep = delay
+ if not fds:
+ fds = []
+
+ for function, data in self._idlefuns.items():
+ try:
+ retval = function(self, data, False)
+ if retval is False:
+ del self._idlefuns[function]
+ nextsleep = None
+ elif retval is True:
+ nextsleep = None
+ elif isinstance(retval, float):
+ if (retval < nextsleep):
+ nextsleep = retval
+ elif nextsleep is None:
+ continue
+ else:
+ fds = fds + retval
+ except SystemExit:
+ raise
+ except Exception as exc:
+ if not isinstance(exc, bb.BBHandledException):
+ logger.exception('Running idle function')
+ del self._idlefuns[function]
+ self.quit = True
+
+ if nextsleep is not None:
+ select.select(fds,[],[],nextsleep)
+
+ def runCommand(self, command):
+ """
+ Run a cooker command on the server
+ """
+ self.command_channel.send(self.cooker.command.runCommand(command))
+
+ def stop(self):
+ self.quitin.send("quit")
+ self.quitin.close()
+
+class BitBakeProcessServerConnection(BitBakeBaseServerConnection):
+ def __init__(self, serverImpl, ui_channel, event_queue):
+ self.procserver = serverImpl
+ self.ui_channel = ui_channel
+ self.event_queue = event_queue
+ self.connection = ServerCommunicator(self.ui_channel, self.procserver.event_handle, self.procserver)
+ self.events = self.event_queue
+ self.terminated = False
+
+ def sigterm_terminate(self):
+ bb.error("UI received SIGTERM")
+ self.terminate()
+
+ def terminate(self):
+ if self.terminated:
+ return
+ self.terminated = True
+ def flushevents():
+ while True:
+ try:
+ event = self.event_queue.get(block=False)
+ except (Empty, IOError):
+ break
+ if isinstance(event, logging.LogRecord):
+ logger.handle(event)
+
+ signal.signal(signal.SIGINT, signal.SIG_IGN)
+ self.procserver.stop()
+
+ while self.procserver.is_alive():
+ flushevents()
+ self.procserver.join(0.1)
+
+ self.ui_channel.close()
+ self.event_queue.close()
+ self.event_queue.setexit()
+
+# Wrap Queue to provide API which isn't server implementation specific
+class ProcessEventQueue(multiprocessing.queues.Queue):
+ def __init__(self, maxsize):
+ multiprocessing.queues.Queue.__init__(self, maxsize)
+ self.exit = False
+
+ def setexit(self):
+ self.exit = True
+
+ def waitEvent(self, timeout):
+ if self.exit:
+ sys.exit(1)
+ try:
+ if not self.server.is_alive():
+ self.setexit()
+ return None
+ return self.get(True, timeout)
+ except Empty:
+ return None
+
+ def getEvent(self):
+ try:
+ if not self.server.is_alive():
+ self.setexit()
+ return None
+ return self.get(False)
+ except Empty:
+ return None
+
+
+class BitBakeServer(BitBakeBaseServer):
+ def initServer(self):
+ # establish communication channels. We use bidirectional pipes for
+ # ui <--> server command/response pairs
+ # and a queue for server -> ui event notifications
+ #
+ self.ui_channel, self.server_channel = Pipe()
+ self.event_queue = ProcessEventQueue(0)
+ self.serverImpl = ProcessServer(self.server_channel, self.event_queue, None)
+ self.event_queue.server = self.serverImpl
+
+ def detach(self):
+ self.serverImpl.start()
+ return
+
+ def establishConnection(self, featureset):
+
+ self.connection = BitBakeProcessServerConnection(self.serverImpl, self.ui_channel, self.event_queue)
+
+ _, error = self.connection.connection.runCommand(["setFeatures", featureset])
+ if error:
+ logger.error("Unable to set the cooker to the correct featureset: %s" % error)
+ raise BaseException(error)
+ signal.signal(signal.SIGTERM, lambda i, s: self.connection.sigterm_terminate())
+ return self.connection
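+
+# --- Editor's illustrative sketch (not part of the original module) ---
+# How this process-based server is typically driven, pieced together from the
+# methods above. The 'cooker' object is created elsewhere by bitbake and the
+# command name "someCommand" is hypothetical.
+def _example_process_server_session(cooker, featureset):
+    server = BitBakeServer()
+    server.initServer()
+    server.addcooker(cooker)            # cooker is built elsewhere by bitbake
+    server.detach()                     # starts the ProcessServer child process
+    connection = server.establishConnection(featureset)
+    result = connection.connection.runCommand(["someCommand"])  # hypothetical command name
+    event = connection.events.waitEvent(0.25)
+    server.endSession()                 # shuts the connection and the server down
+    return result, event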
diff --git a/bitbake/lib/bb/server/xmlrpc.py b/bitbake/lib/bb/server/xmlrpc.py
new file mode 100644
index 0000000..b7647c1
--- /dev/null
+++ b/bitbake/lib/bb/server/xmlrpc.py
@@ -0,0 +1,392 @@
+#
+# BitBake XMLRPC Server
+#
+# Copyright (C) 2006 - 2007 Michael 'Mickey' Lauer
+# Copyright (C) 2006 - 2008 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+"""
+ This module implements an xmlrpc server for BitBake.
+
+ Use this by deriving a class from BitBakeXMLRPCServer and then adding
+ methods which you want to "export" via XMLRPC. If the methods have the
+ prefix xmlrpc_, then registering those functions will happen automatically;
+ if not, you need to call register_function.
+
+ Use register_idle_function() to add a function which the xmlrpc server
+ calls from within serve_forever when no requests are pending. Make sure
+ that those functions are non-blocking or else you will introduce latency
+ in the server's main loop.
+"""
+
+import bb
+import xmlrpclib, sys
+from bb import daemonize
+from bb.ui import uievent
+import hashlib, time
+import socket
+import os, signal
+import threading
+try:
+ import cPickle as pickle
+except ImportError:
+ import pickle
+
+DEBUG = False
+
+from SimpleXMLRPCServer import SimpleXMLRPCServer, SimpleXMLRPCRequestHandler
+import inspect, select, httplib
+
+from . import BitBakeBaseServer, BitBakeBaseServerConnection, BaseImplServer
+
+class BBTransport(xmlrpclib.Transport):
+ def __init__(self, timeout):
+ self.timeout = timeout
+ self.connection_token = None
+ xmlrpclib.Transport.__init__(self)
+
+ # Modified from default to pass timeout to HTTPConnection
+ def make_connection(self, host):
+ #return an existing connection if possible. This allows
+ #HTTP/1.1 keep-alive.
+ if self._connection and host == self._connection[0]:
+ return self._connection[1]
+
+ # create a HTTP connection object from a host descriptor
+ chost, self._extra_headers, x509 = self.get_host_info(host)
+ #store the host argument along with the connection object
+ self._connection = host, httplib.HTTPConnection(chost, timeout=self.timeout)
+ return self._connection[1]
+
+ def set_connection_token(self, token):
+ self.connection_token = token
+
+ def send_content(self, h, body):
+ if self.connection_token:
+ h.putheader("Bitbake-token", self.connection_token)
+ xmlrpclib.Transport.send_content(self, h, body)
+
+def _create_server(host, port, timeout = 60):
+ t = BBTransport(timeout)
+ s = xmlrpclib.ServerProxy("http://%s:%d/" % (host, port), transport=t, allow_none=True)
+ return s, t
+
+class BitBakeServerCommands():
+
+ def __init__(self, server):
+ self.server = server
+ self.has_client = False
+
+ def registerEventHandler(self, host, port):
+ """
+ Register a remote UI Event Handler
+ """
+ s, t = _create_server(host, port)
+
+ # we don't allow connections if the cooker is running
+ if (self.cooker.state in [bb.cooker.state.parsing, bb.cooker.state.running]):
+ return None
+
+ self.event_handle = bb.event.register_UIHhandler(s, True)
+ return self.event_handle
+
+ def unregisterEventHandler(self, handlerNum):
+ """
+ Unregister a remote UI Event Handler
+ """
+ return bb.event.unregister_UIHhandler(handlerNum)
+
+ def runCommand(self, command):
+ """
+ Run a cooker command on the server
+ """
+ return self.cooker.command.runCommand(command, self.server.readonly)
+
+ def getEventHandle(self):
+ return self.event_handle
+
+ def terminateServer(self):
+ """
+ Trigger the server to quit
+ """
+ self.server.quit = True
+ print("Server (cooker) exiting")
+ return
+
+ def addClient(self):
+ if self.has_client:
+ return None
+ token = hashlib.md5(str(time.time())).hexdigest()
+ self.server.set_connection_token(token)
+ self.has_client = True
+ return token
+
+ def removeClient(self):
+ if self.has_client:
+ self.server.set_connection_token(None)
+ self.has_client = False
+ if self.server.single_use:
+ self.server.quit = True
+
+# This request handler checks if the request has a "Bitbake-token" header
+# field (this comes from the client side) and compares it with its internal
+# "Bitbake-token" field (this comes from the server). If the two are not
+# equal, it is assumed that a client is trying to connect to the server
+# while another client is connected to the server. In this case, a 503 error
+# ("service unavailable") is returned to the client.
+class BitBakeXMLRPCRequestHandler(SimpleXMLRPCRequestHandler):
+ def __init__(self, request, client_address, server):
+ self.server = server
+ SimpleXMLRPCRequestHandler.__init__(self, request, client_address, server)
+
+ def do_POST(self):
+ try:
+ remote_token = self.headers["Bitbake-token"]
+ except:
+ remote_token = None
+ if remote_token != self.server.connection_token and remote_token != "observer":
+ self.report_503()
+ else:
+ if remote_token == "observer":
+ self.server.readonly = True
+ else:
+ self.server.readonly = False
+ SimpleXMLRPCRequestHandler.do_POST(self)
+
+ def report_503(self):
+ self.send_response(503)
+        response = 'No more clients allowed'
+ self.send_header("Content-type", "text/plain")
+ self.send_header("Content-length", str(len(response)))
+ self.end_headers()
+ self.wfile.write(response)
+
+
+class XMLRPCProxyServer(BaseImplServer):
+ """ not a real working server, but a stub for a proxy server connection
+
+ """
+ def __init__(self, host, port):
+ self.host = host
+ self.port = port
+
+class XMLRPCServer(SimpleXMLRPCServer, BaseImplServer):
+ # remove this when you're done with debugging
+ # allow_reuse_address = True
+
+ def __init__(self, interface):
+ """
+ Constructor
+ """
+        BaseImplServer.__init__(self)
+        self.single_use = False
+        if (interface[1] == 0): # anonymous port, not getting reused
+            self.single_use = True
+ # Use auto port configuration
+ if (interface[1] == -1):
+ interface = (interface[0], 0)
+ SimpleXMLRPCServer.__init__(self, interface,
+ requestHandler=BitBakeXMLRPCRequestHandler,
+ logRequests=False, allow_none=True)
+ self.host, self.port = self.socket.getsockname()
+ self.connection_token = None
+ #self.register_introspection_functions()
+ self.commands = BitBakeServerCommands(self)
+ self.autoregister_all_functions(self.commands, "")
+ self.interface = interface
+
+ def addcooker(self, cooker):
+ BaseImplServer.addcooker(self, cooker)
+ self.commands.cooker = cooker
+
+ def autoregister_all_functions(self, context, prefix):
+ """
+ Convenience method for registering all functions in the scope
+ of this class that start with a common prefix
+ """
+ methodlist = inspect.getmembers(context, inspect.ismethod)
+ for name, method in methodlist:
+ if name.startswith(prefix):
+ self.register_function(method, name[len(prefix):])
+
+
+ def serve_forever(self):
+ # Start the actual XMLRPC server
+ bb.cooker.server_main(self.cooker, self._serve_forever)
+
+ def _serve_forever(self):
+ """
+ Serve Requests. Overloaded to honor a quit command
+ """
+ self.quit = False
+ while not self.quit:
+ fds = [self]
+ nextsleep = 0.1
+ for function, data in self._idlefuns.items():
+ retval = None
+ try:
+ retval = function(self, data, False)
+ if retval is False:
+ del self._idlefuns[function]
+ elif retval is True:
+ nextsleep = 0
+ elif isinstance(retval, float):
+ if (retval < nextsleep):
+ nextsleep = retval
+ else:
+ fds = fds + retval
+ except SystemExit:
+ raise
+ except:
+ import traceback
+ traceback.print_exc()
+                    if retval is None:
+                        # the function call failed; drop it from the idle list
+                        del self._idlefuns[function]
+
+ socktimeout = self.socket.gettimeout() or nextsleep
+ socktimeout = min(socktimeout, nextsleep)
+ # Mirror what BaseServer handle_request would do
+ try:
+ fd_sets = select.select(fds, [], [], socktimeout)
+ if fd_sets[0] and self in fd_sets[0]:
+ self._handle_request_noblock()
+ except IOError:
+ # we ignore interrupted calls
+ pass
+
+ # Tell idle functions we're exiting
+ for function, data in self._idlefuns.items():
+ try:
+ retval = function(self, data, True)
+ except:
+ pass
+ self.server_close()
+ return
+
+ def set_connection_token(self, token):
+ self.connection_token = token
+
+class BitBakeXMLRPCServerConnection(BitBakeBaseServerConnection):
+ def __init__(self, serverImpl, clientinfo=("localhost", 0), observer_only = False, featureset = None):
+ self.connection, self.transport = _create_server(serverImpl.host, serverImpl.port)
+ self.clientinfo = clientinfo
+ self.serverImpl = serverImpl
+ self.observer_only = observer_only
+ if featureset:
+ self.featureset = featureset
+ else:
+ self.featureset = []
+
+ def connect(self, token = None):
+ if token is None:
+ if self.observer_only:
+ token = "observer"
+ else:
+ token = self.connection.addClient()
+
+ if token is None:
+ return None
+
+ self.transport.set_connection_token(token)
+
+ self.events = uievent.BBUIEventQueue(self.connection, self.clientinfo)
+ for event in bb.event.ui_queue:
+ self.events.queue_event(event)
+
+ _, error = self.connection.runCommand(["setFeatures", self.featureset])
+ if error:
+                # disconnect the client, we can't make setFeatures work
+ self.connection.removeClient()
+ # no need to log it here, the error shall be sent to the client
+ raise BaseException(error)
+
+ return self
+
+ def removeClient(self):
+ if not self.observer_only:
+ self.connection.removeClient()
+
+ def terminate(self):
+ # Don't wait for server indefinitely
+ import socket
+ socket.setdefaulttimeout(2)
+ try:
+ self.events.system_quit()
+ except:
+ pass
+ try:
+ self.connection.removeClient()
+ except:
+ pass
+
+class BitBakeServer(BitBakeBaseServer):
+ def initServer(self, interface = ("localhost", 0)):
+ self.interface = interface
+ self.serverImpl = XMLRPCServer(interface)
+
+ def detach(self):
+ daemonize.createDaemon(self.serverImpl.serve_forever, "bitbake-cookerdaemon.log")
+ del self.cooker
+
+ def establishConnection(self, featureset):
+ self.connection = BitBakeXMLRPCServerConnection(self.serverImpl, self.interface, False, featureset)
+ return self.connection.connect()
+
+ def set_connection_token(self, token):
+ self.connection.transport.set_connection_token(token)
+
+class BitBakeXMLRPCClient(BitBakeBaseServer):
+
+ def __init__(self, observer_only = False, token = None):
+ self.token = token
+
+ self.observer_only = observer_only
+ # if we need extra caches, just tell the server to load them all
+ pass
+
+ def saveConnectionDetails(self, remote):
+ self.remote = remote
+
+ def establishConnection(self, featureset):
+ # The format of "remote" must be "server:port"
+ try:
+ [host, port] = self.remote.split(":")
+ port = int(port)
+ except Exception as e:
+ bb.warn("Failed to read remote definition (%s)" % str(e))
+ raise e
+
+        # We need our IP for the server connection. We get the IP
+        # by trying to connect to the server
+ try:
+ s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
+ s.connect((host, port))
+ ip = s.getsockname()[0]
+ s.close()
+ except Exception as e:
+ bb.warn("Could not create socket for %s:%s (%s)" % (host, port, str(e)))
+ raise e
+ try:
+ self.serverImpl = XMLRPCProxyServer(host, port)
+ self.connection = BitBakeXMLRPCServerConnection(self.serverImpl, (ip, 0), self.observer_only, featureset)
+ return self.connection.connect(self.token)
+ except Exception as e:
+ bb.warn("Could not connect to server at %s:%s (%s)" % (host, port, str(e)))
+ raise e
+
+ def endSession(self):
+ self.connection.removeClient()
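+
+# --- Editor's illustrative sketch (not part of the original module) ---
+# Connecting to an already running XMLRPC server from another process, using the
+# client class above. The "localhost:8200" host/port value and the command name
+# are placeholders.
+def _example_remote_client(featureset):
+    client = BitBakeXMLRPCClient(observer_only=True)
+    client.saveConnectionDetails("localhost:8200")   # placeholder "server:port" string
+    connection = client.establishConnection(featureset)
+    # observer connections use the "observer" token, so the server treats them as read-only
+    result = connection.connection.runCommand(["someReadOnlyCommand"])  # hypothetical command name
+    client.endSession()
+    return result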
diff --git a/bitbake/lib/bb/shell.py b/bitbake/lib/bb/shell.py
new file mode 100644
index 0000000..1dd8d54
--- /dev/null
+++ b/bitbake/lib/bb/shell.py
@@ -0,0 +1,820 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+##########################################################################
+#
+# Copyright (C) 2005-2006 Michael 'Mickey' Lauer <mickey@Vanille.de>
+# Copyright (C) 2005-2006 Vanille Media
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+##########################################################################
+#
+# Thanks to:
+# * Holger Freyther <zecke@handhelds.org>
+# * Justin Patrin <papercrane@reversefold.com>
+#
+##########################################################################
+
+"""
+BitBake Shell
+
+IDEAS:
+ * list defined tasks per package
+ * list classes
+ * toggle force
+ * command to reparse just one (or more) bbfile(s)
+ * automatic check if reparsing is necessary (inotify?)
+ * frontend for bb file manipulation
+ * more shell-like features:
+ - output control, i.e. pipe output into grep, sort, etc.
+ - job control, i.e. bring running commands into background and foreground
+ * start parsing in background right after startup
+ * ncurses interface
+
+PROBLEMS:
+ * force doesn't always work
+    * readline completion for commands with more than one parameter
+
+"""
+
+##########################################################################
+# Import and setup global variables
+##########################################################################
+
+from __future__ import print_function
+from functools import reduce
+try:
+ set
+except NameError:
+ from sets import Set as set
+import sys, os, readline, socket, httplib, urllib, commands, popen2, shlex, Queue, fnmatch
+from bb import data, parse, build, cache, taskdata, runqueue, providers as Providers
+
+__version__ = "0.5.3.1"
+__credits__ = """BitBake Shell Version %s (C) 2005 Michael 'Mickey' Lauer <mickey@Vanille.de>
+Type 'help' for more information, press CTRL-D to exit.""" % __version__
+
+cmds = {}
+leave_mainloop = False
+last_exception = None
+cooker = None
+parsed = False
+debug = os.environ.get( "BBSHELL_DEBUG", "" )
+
+##########################################################################
+# Class BitBakeShellCommands
+##########################################################################
+
+class BitBakeShellCommands:
+ """This class contains the valid commands for the shell"""
+
+ def __init__( self, shell ):
+ """Register all the commands"""
+ self._shell = shell
+ for attr in BitBakeShellCommands.__dict__:
+ if not attr.startswith( "_" ):
+ if attr.endswith( "_" ):
+ command = attr[:-1].lower()
+ else:
+ command = attr[:].lower()
+ method = getattr( BitBakeShellCommands, attr )
+ debugOut( "registering command '%s'" % command )
+ # scan number of arguments
+ usage = getattr( method, "usage", "" )
+ if usage != "<...>":
+ numArgs = len( usage.split() )
+ else:
+ numArgs = -1
+ shell.registerCommand( command, method, numArgs, "%s %s" % ( command, usage ), method.__doc__ )
+
+ def _checkParsed( self ):
+ if not parsed:
+ print("SHELL: This command needs to parse bbfiles...")
+ self.parse( None )
+
+ def _findProvider( self, item ):
+ self._checkParsed()
+ # Need to use taskData for this information
+ preferred = data.getVar( "PREFERRED_PROVIDER_%s" % item, cooker.configuration.data, 1 )
+ if not preferred: preferred = item
+ try:
+ lv, lf, pv, pf = Providers.findBestProvider(preferred, cooker.configuration.data, cooker.status)
+ except KeyError:
+ if item in cooker.status.providers:
+ pf = cooker.status.providers[item][0]
+ else:
+ pf = None
+ return pf
+
+ def alias( self, params ):
+ """Register a new name for a command"""
+ new, old = params
+ if not old in cmds:
+ print("ERROR: Command '%s' not known" % old)
+ else:
+ cmds[new] = cmds[old]
+ print("OK")
+ alias.usage = "<alias> <command>"
+
+ def buffer( self, params ):
+ """Dump specified output buffer"""
+ index = params[0]
+ print(self._shell.myout.buffer( int( index ) ))
+ buffer.usage = "<index>"
+
+ def buffers( self, params ):
+ """Show the available output buffers"""
+ commands = self._shell.myout.bufferedCommands()
+ if not commands:
+ print("SHELL: No buffered commands available yet. Start doing something.")
+ else:
+ print("="*35, "Available Output Buffers", "="*27)
+ for index, cmd in enumerate( commands ):
+ print("| %s %s" % ( str( index ).ljust( 3 ), cmd ))
+ print("="*88)
+
+ def build( self, params, cmd = "build" ):
+ """Build a providee"""
+ global last_exception
+ globexpr = params[0]
+ self._checkParsed()
+ names = globfilter( cooker.status.pkg_pn, globexpr )
+ if len( names ) == 0: names = [ globexpr ]
+ print("SHELL: Building %s" % ' '.join( names ))
+
+ td = taskdata.TaskData(cooker.configuration.abort)
+ localdata = data.createCopy(cooker.configuration.data)
+ data.update_data(localdata)
+ data.expandKeys(localdata)
+
+ try:
+ tasks = []
+ for name in names:
+ td.add_provider(localdata, cooker.status, name)
+ providers = td.get_provider(name)
+
+ if len(providers) == 0:
+ raise Providers.NoProvider
+
+ tasks.append([name, "do_%s" % cmd])
+
+ td.add_unresolved(localdata, cooker.status)
+
+ rq = runqueue.RunQueue(cooker, localdata, cooker.status, td, tasks)
+ rq.prepare_runqueue()
+ rq.execute_runqueue()
+
+ except Providers.NoProvider:
+ print("ERROR: No Provider")
+ last_exception = Providers.NoProvider
+
+ except runqueue.TaskFailure as fnids:
+ last_exception = runqueue.TaskFailure
+
+ except build.FuncFailed as e:
+ print("ERROR: Couldn't build '%s'" % names)
+ last_exception = e
+
+
+ build.usage = "<providee>"
+
+ def clean( self, params ):
+ """Clean a providee"""
+ self.build( params, "clean" )
+ clean.usage = "<providee>"
+
+ def compile( self, params ):
+ """Execute 'compile' on a providee"""
+ self.build( params, "compile" )
+ compile.usage = "<providee>"
+
+ def configure( self, params ):
+ """Execute 'configure' on a providee"""
+ self.build( params, "configure" )
+ configure.usage = "<providee>"
+
+ def install( self, params ):
+ """Execute 'install' on a providee"""
+ self.build( params, "install" )
+ install.usage = "<providee>"
+
+ def edit( self, params ):
+ """Call $EDITOR on a providee"""
+ name = params[0]
+ bbfile = self._findProvider( name )
+ if bbfile is not None:
+ os.system( "%s %s" % ( os.environ.get( "EDITOR", "vi" ), bbfile ) )
+ else:
+ print("ERROR: Nothing provides '%s'" % name)
+ edit.usage = "<providee>"
+
+ def environment( self, params ):
+ """Dump out the outer BitBake environment"""
+ cooker.showEnvironment()
+
+ def exit_( self, params ):
+ """Leave the BitBake Shell"""
+ debugOut( "setting leave_mainloop to true" )
+ global leave_mainloop
+ leave_mainloop = True
+
+ def fetch( self, params ):
+ """Fetch a providee"""
+ self.build( params, "fetch" )
+ fetch.usage = "<providee>"
+
+ def fileBuild( self, params, cmd = "build" ):
+ """Parse and build a .bb file"""
+ global last_exception
+ name = params[0]
+ bf = completeFilePath( name )
+ print("SHELL: Calling '%s' on '%s'" % ( cmd, bf ))
+
+ try:
+ cooker.buildFile(bf, cmd)
+ except parse.ParseError:
+ print("ERROR: Unable to open or parse '%s'" % bf)
+ except build.FuncFailed as e:
+ print("ERROR: Couldn't build '%s'" % name)
+ last_exception = e
+
+ fileBuild.usage = "<bbfile>"
+
+ def fileClean( self, params ):
+ """Clean a .bb file"""
+ self.fileBuild( params, "clean" )
+ fileClean.usage = "<bbfile>"
+
+ def fileEdit( self, params ):
+ """Call $EDITOR on a .bb file"""
+ name = params[0]
+ os.system( "%s %s" % ( os.environ.get( "EDITOR", "vi" ), completeFilePath( name ) ) )
+ fileEdit.usage = "<bbfile>"
+
+ def fileRebuild( self, params ):
+ """Rebuild (clean & build) a .bb file"""
+ self.fileBuild( params, "rebuild" )
+ fileRebuild.usage = "<bbfile>"
+
+ def fileReparse( self, params ):
+ """(re)Parse a bb file"""
+ bbfile = params[0]
+ print("SHELL: Parsing '%s'" % bbfile)
+ parse.update_mtime( bbfile )
+ cooker.parser.reparse(bbfile)
+ if False: #fromCache:
+ print("SHELL: File has not been updated, not reparsing")
+ else:
+ print("SHELL: Parsed")
+ fileReparse.usage = "<bbfile>"
+
+ def abort( self, params ):
+ """Toggle abort task execution flag (see bitbake -k)"""
+ cooker.configuration.abort = not cooker.configuration.abort
+ print("SHELL: Abort Flag is now '%s'" % repr( cooker.configuration.abort ))
+
+ def force( self, params ):
+ """Toggle force task execution flag (see bitbake -f)"""
+ cooker.configuration.force = not cooker.configuration.force
+ print("SHELL: Force Flag is now '%s'" % repr( cooker.configuration.force ))
+
+ def help( self, params ):
+ """Show a comprehensive list of commands and their purpose"""
+ print("="*30, "Available Commands", "="*30)
+ for cmd in sorted(cmds):
+ function, numparams, usage, helptext = cmds[cmd]
+ print("| %s | %s" % (usage.ljust(30), helptext))
+ print("="*78)
+
+ def lastError( self, params ):
+ """Show the reason or log that was produced by the last BitBake event exception"""
+ if last_exception is None:
+ print("SHELL: No Errors yet (Phew)...")
+ else:
+ reason, event = last_exception.args
+ print("SHELL: Reason for the last error: '%s'" % reason)
+ if ':' in reason:
+ msg, filename = reason.split( ':' )
+ filename = filename.strip()
+ print("SHELL: Dumping log file for last error:")
+ try:
+ print(open( filename ).read())
+ except IOError:
+ print("ERROR: Couldn't open '%s'" % filename)
+
+ def match( self, params ):
+ """Dump all files or providers matching a glob expression"""
+ what, globexpr = params
+ if what == "files":
+ self._checkParsed()
+ for key in globfilter( cooker.status.pkg_fn, globexpr ): print(key)
+ elif what == "providers":
+ self._checkParsed()
+ for key in globfilter( cooker.status.pkg_pn, globexpr ): print(key)
+ else:
+ print("Usage: match %s" % self.print_.usage)
+ match.usage = "<files|providers> <glob>"
+
+ def new( self, params ):
+ """Create a new .bb file and open the editor"""
+ dirname, filename = params
+ packages = '/'.join( data.getVar( "BBFILES", cooker.configuration.data, 1 ).split('/')[:-2] )
+ fulldirname = "%s/%s" % ( packages, dirname )
+
+ if not os.path.exists( fulldirname ):
+ print("SHELL: Creating '%s'" % fulldirname)
+ os.mkdir( fulldirname )
+ if os.path.exists( fulldirname ) and os.path.isdir( fulldirname ):
+ if os.path.exists( "%s/%s" % ( fulldirname, filename ) ):
+ print("SHELL: ERROR: %s/%s already exists" % ( fulldirname, filename ))
+ return False
+ print("SHELL: Creating '%s/%s'" % ( fulldirname, filename ))
+ newpackage = open( "%s/%s" % ( fulldirname, filename ), "w" )
+ print("""DESCRIPTION = ""
+SECTION = ""
+AUTHOR = ""
+HOMEPAGE = ""
+MAINTAINER = ""
+LICENSE = "GPL"
+PR = "r0"
+
+SRC_URI = ""
+
+#inherit base
+
+#do_configure() {
+#
+#}
+
+#do_compile() {
+#
+#}
+
+#do_stage() {
+#
+#}
+
+#do_install() {
+#
+#}
+""", file=newpackage)
+ newpackage.close()
+ os.system( "%s %s/%s" % ( os.environ.get( "EDITOR" ), fulldirname, filename ) )
+ new.usage = "<directory> <filename>"
+
+ def package( self, params ):
+ """Execute 'package' on a providee"""
+ self.build( params, "package" )
+ package.usage = "<providee>"
+
+ def pasteBin( self, params ):
+ """Send a command + output buffer to the pastebin at http://rafb.net/paste"""
+ index = params[0]
+ contents = self._shell.myout.buffer( int( index ) )
+ sendToPastebin( "output of " + params[0], contents )
+ pasteBin.usage = "<index>"
+
+ def pasteLog( self, params ):
+ """Send the last event exception error log (if there is one) to http://rafb.net/paste"""
+ if last_exception is None:
+ print("SHELL: No Errors yet (Phew)...")
+ else:
+ reason, event = last_exception.args
+ print("SHELL: Reason for the last error: '%s'" % reason)
+ if ':' in reason:
+ msg, filename = reason.split( ':' )
+ filename = filename.strip()
+ print("SHELL: Pasting log file to pastebin...")
+
+ file = open( filename ).read()
+ sendToPastebin( "contents of " + filename, file )
+
+ def patch( self, params ):
+ """Execute 'patch' command on a providee"""
+ self.build( params, "patch" )
+ patch.usage = "<providee>"
+
+ def parse( self, params ):
+ """(Re-)parse .bb files and calculate the dependency graph"""
+ cooker.status = cache.CacheData(cooker.caches_array)
+ ignore = data.getVar("ASSUME_PROVIDED", cooker.configuration.data, 1) or ""
+ cooker.status.ignored_dependencies = set( ignore.split() )
+ cooker.handleCollections( data.getVar("BBFILE_COLLECTIONS", cooker.configuration.data, 1) )
+
+ (filelist, masked) = cooker.collect_bbfiles()
+ cooker.parse_bbfiles(filelist, masked, cooker.myProgressCallback)
+ cooker.buildDepgraph()
+ global parsed
+ parsed = True
+ print()
+
+ def reparse( self, params ):
+ """(re)Parse a providee's bb file"""
+ bbfile = self._findProvider( params[0] )
+ if bbfile is not None:
+ print("SHELL: Found bbfile '%s' for '%s'" % ( bbfile, params[0] ))
+ self.fileReparse( [ bbfile ] )
+ else:
+ print("ERROR: Nothing provides '%s'" % params[0])
+ reparse.usage = "<providee>"
+
+ def getvar( self, params ):
+ """Dump the contents of an outer BitBake environment variable"""
+ var = params[0]
+ value = data.getVar( var, cooker.configuration.data, 1 )
+ print(value)
+ getvar.usage = "<variable>"
+
+ def peek( self, params ):
+ """Dump contents of variable defined in providee's metadata"""
+ name, var = params
+ bbfile = self._findProvider( name )
+ if bbfile is not None:
+ the_data = cache.Cache.loadDataFull(bbfile, cooker.configuration.data)
+ value = the_data.getVar( var, 1 )
+ print(value)
+ else:
+ print("ERROR: Nothing provides '%s'" % name)
+ peek.usage = "<providee> <variable>"
+
+ def poke( self, params ):
+ """Set contents of variable defined in providee's metadata"""
+ name, var, value = params
+ bbfile = self._findProvider( name )
+ if bbfile is not None:
+ print("ERROR: Sorry, this functionality is currently broken")
+ #d = cooker.pkgdata[bbfile]
+ #data.setVar( var, value, d )
+
+            # mark the change semi-persistent
+ #cooker.pkgdata.setDirty(bbfile, d)
+ #print "OK"
+ else:
+ print("ERROR: Nothing provides '%s'" % name)
+ poke.usage = "<providee> <variable> <value>"
+
+ def print_( self, params ):
+ """Dump all files or providers"""
+ what = params[0]
+ if what == "files":
+ self._checkParsed()
+ for key in cooker.status.pkg_fn: print(key)
+ elif what == "providers":
+ self._checkParsed()
+ for key in cooker.status.providers: print(key)
+ else:
+ print("Usage: print %s" % self.print_.usage)
+ print_.usage = "<files|providers>"
+
+ def python( self, params ):
+ """Enter the expert mode - an interactive BitBake Python Interpreter"""
+ sys.ps1 = "EXPERT BB>>> "
+ sys.ps2 = "EXPERT BB... "
+ import code
+ interpreter = code.InteractiveConsole( dict( globals() ) )
+ interpreter.interact( "SHELL: Expert Mode - BitBake Python %s\nType 'help' for more information, press CTRL-D to switch back to BBSHELL." % sys.version )
+
+ def showdata( self, params ):
+ """Execute 'showdata' on a providee"""
+ cooker.showEnvironment(None, params)
+ showdata.usage = "<providee>"
+
+ def setVar( self, params ):
+ """Set an outer BitBake environment variable"""
+ var, value = params
+ data.setVar( var, value, cooker.configuration.data )
+ print("OK")
+ setVar.usage = "<variable> <value>"
+
+ def rebuild( self, params ):
+ """Clean and rebuild a .bb file or a providee"""
+ self.build( params, "clean" )
+ self.build( params, "build" )
+ rebuild.usage = "<providee>"
+
+ def shell( self, params ):
+ """Execute a shell command and dump the output"""
+ if params != "":
+ print(commands.getoutput( " ".join( params ) ))
+ shell.usage = "<...>"
+
+ def stage( self, params ):
+ """Execute 'stage' on a providee"""
+ self.build( params, "populate_staging" )
+ stage.usage = "<providee>"
+
+ def status( self, params ):
+ """<just for testing>"""
+ print("-" * 78)
+ print("building list = '%s'" % cooker.building_list)
+ print("build path = '%s'" % cooker.build_path)
+ print("consider_msgs_cache = '%s'" % cooker.consider_msgs_cache)
+ print("build stats = '%s'" % cooker.stats)
+ if last_exception is not None: print("last_exception = '%s'" % repr( last_exception.args ))
+ print("memory output contents = '%s'" % self._shell.myout._buffer)
+
+ def test( self, params ):
+ """<just for testing>"""
+ print("testCommand called with '%s'" % params)
+
+ def unpack( self, params ):
+ """Execute 'unpack' on a providee"""
+ self.build( params, "unpack" )
+ unpack.usage = "<providee>"
+
+ def which( self, params ):
+ """Computes the providers for a given providee"""
+ # Need to use taskData for this information
+ item = params[0]
+
+ self._checkParsed()
+
+ preferred = data.getVar( "PREFERRED_PROVIDER_%s" % item, cooker.configuration.data, 1 )
+ if not preferred: preferred = item
+
+ try:
+ lv, lf, pv, pf = Providers.findBestProvider(preferred, cooker.configuration.data, cooker.status)
+ except KeyError:
+ lv, lf, pv, pf = (None,)*4
+
+ try:
+ providers = cooker.status.providers[item]
+ except KeyError:
+ print("SHELL: ERROR: Nothing provides", preferred)
+ else:
+ for provider in providers:
+ if provider == pf: provider = " (***) %s" % provider
+ else: provider = " %s" % provider
+ print(provider)
+ which.usage = "<providee>"
+
+##########################################################################
+# Common helper functions
+##########################################################################
+
+def completeFilePath( bbfile ):
+ """Get the complete bbfile path"""
+ if not cooker.status: return bbfile
+ if not cooker.status.pkg_fn: return bbfile
+ for key in cooker.status.pkg_fn:
+ if key.endswith( bbfile ):
+ return key
+ return bbfile
+
+def sendToPastebin( desc, content ):
+ """Send content to http://oe.pastebin.com"""
+ mydata = {}
+ mydata["lang"] = "Plain Text"
+ mydata["desc"] = desc
+ mydata["cvt_tabs"] = "No"
+ mydata["nick"] = "%s@%s" % ( os.environ.get( "USER", "unknown" ), socket.gethostname() or "unknown" )
+ mydata["text"] = content
+ params = urllib.urlencode( mydata )
+ headers = {"Content-type": "application/x-www-form-urlencoded", "Accept": "text/plain"}
+
+ host = "rafb.net"
+ conn = httplib.HTTPConnection( "%s:80" % host )
+ conn.request("POST", "/paste/paste.php", params, headers )
+
+ response = conn.getresponse()
+ conn.close()
+
+ if response.status == 302:
+ location = response.getheader( "location" ) or "unknown"
+ print("SHELL: Pasted to http://%s%s" % ( host, location ))
+ else:
+ print("ERROR: %s %s" % ( response.status, response.reason ))
+
+def completer( text, state ):
+ """Return a possible readline completion"""
+ debugOut( "completer called with text='%s', state='%d'" % ( text, state ) )
+
+ if state == 0:
+ line = readline.get_line_buffer()
+ if " " in line:
+ line = line.split()
+ # we are in second (or more) argument
+ if line[0] in cmds and hasattr( cmds[line[0]][0], "usage" ): # known command and usage
+ u = getattr( cmds[line[0]][0], "usage" ).split()[0]
+ if u == "<variable>":
+ allmatches = cooker.configuration.data.keys()
+ elif u == "<bbfile>":
+ if cooker.status.pkg_fn is None: allmatches = [ "(No Matches Available. Parsed yet?)" ]
+ else: allmatches = [ x.split("/")[-1] for x in cooker.status.pkg_fn ]
+ elif u == "<providee>":
+ if cooker.status.pkg_fn is None: allmatches = [ "(No Matches Available. Parsed yet?)" ]
+ else: allmatches = cooker.status.providers.iterkeys()
+ else: allmatches = [ "(No tab completion available for this command)" ]
+ else: allmatches = [ "(No tab completion available for this command)" ]
+ else:
+ # we are in first argument
+ allmatches = cmds.iterkeys()
+
+ completer.matches = [ x for x in allmatches if x[:len(text)] == text ]
+ #print "completer.matches = '%s'" % completer.matches
+ if len( completer.matches ) > state:
+ return completer.matches[state]
+ else:
+ return None
+
+def debugOut( text ):
+ if debug:
+ sys.stderr.write( "( %s )\n" % text )
+
+def columnize( alist, width = 80 ):
+ """
+ A word-wrap function that preserves existing line breaks
+ and most spaces in the text. Expects that existing line
+ breaks are posix newlines (\n).
+ """
+ return reduce(lambda line, word, width=width: '%s%s%s' %
+ (line,
+ ' \n'[(len(line[line.rfind('\n')+1:])
+ + len(word.split('\n', 1)[0]
+ ) >= width)],
+ word),
+ alist
+ )
+
+def globfilter( names, pattern ):
+ return fnmatch.filter( names, pattern )
+
+##########################################################################
+# Class MemoryOutput
+##########################################################################
+
+class MemoryOutput:
+ """File-like output class buffering the output of the last 10 commands"""
+ def __init__( self, delegate ):
+ self.delegate = delegate
+ self._buffer = []
+ self.text = []
+ self._command = None
+
+ def startCommand( self, command ):
+ self._command = command
+ self.text = []
+ def endCommand( self ):
+ if self._command is not None:
+ if len( self._buffer ) == 10: del self._buffer[0]
+ self._buffer.append( ( self._command, self.text ) )
+ def removeLast( self ):
+ if self._buffer:
+ del self._buffer[ len( self._buffer ) - 1 ]
+ self.text = []
+ self._command = None
+ def lastBuffer( self ):
+ if self._buffer:
+ return self._buffer[ len( self._buffer ) -1 ][1]
+ def bufferedCommands( self ):
+ return [ cmd for cmd, output in self._buffer ]
+ def buffer( self, i ):
+ if i < len( self._buffer ):
+ return "BB>> %s\n%s" % ( self._buffer[i][0], "".join( self._buffer[i][1] ) )
+ else: return "ERROR: Invalid buffer number. Buffer needs to be in (0, %d)" % ( len( self._buffer ) - 1 )
+ def write( self, text ):
+ if self._command is not None and text != "BB>> ": self.text.append( text )
+ if self.delegate is not None: self.delegate.write( text )
+ def flush( self ):
+ return self.delegate.flush()
+ def fileno( self ):
+ return self.delegate.fileno()
+ def isatty( self ):
+ return self.delegate.isatty()
+
+##########################################################################
+# Class BitBakeShell
+##########################################################################
+
+class BitBakeShell:
+
+ def __init__( self ):
+ """Register commands and set up readline"""
+ self.commandQ = Queue.Queue()
+ self.commands = BitBakeShellCommands( self )
+ self.myout = MemoryOutput( sys.stdout )
+ self.historyfilename = os.path.expanduser( "~/.bbsh_history" )
+ self.startupfilename = os.path.expanduser( "~/.bbsh_startup" )
+
+ readline.set_completer( completer )
+ readline.set_completer_delims( " " )
+ readline.parse_and_bind("tab: complete")
+
+ try:
+ readline.read_history_file( self.historyfilename )
+ except IOError:
+ pass # It doesn't exist yet.
+
+ print(__credits__)
+
+ def cleanup( self ):
+ """Write readline history and clean up resources"""
+ debugOut( "writing command history" )
+ try:
+ readline.write_history_file( self.historyfilename )
+ except:
+ print("SHELL: Unable to save command history")
+
+ def registerCommand( self, command, function, numparams = 0, usage = "", helptext = "" ):
+ """Register a command"""
+ if usage == "": usage = command
+ if helptext == "": helptext = function.__doc__ or "<not yet documented>"
+ cmds[command] = ( function, numparams, usage, helptext )
+
+ def processCommand( self, command, params ):
+ """Process a command. Check number of params and print a usage string, if appropriate"""
+ debugOut( "processing command '%s'..." % command )
+ try:
+ function, numparams, usage, helptext = cmds[command]
+ except KeyError:
+ print("SHELL: ERROR: '%s' command is not a valid command." % command)
+ self.myout.removeLast()
+ else:
+ if (numparams != -1) and (not len( params ) == numparams):
+ print("Usage: '%s'" % usage)
+ return
+
+ result = function( self.commands, params )
+ debugOut( "result was '%s'" % result )
+
+ def processStartupFile( self ):
+ """Read and execute all commands found in $HOME/.bbsh_startup"""
+ if os.path.exists( self.startupfilename ):
+ startupfile = open( self.startupfilename, "r" )
+ for cmdline in startupfile:
+ debugOut( "processing startup line '%s'" % cmdline )
+ if not cmdline:
+ continue
+ if "|" in cmdline:
+ print("ERROR: '|' in startup file is not allowed. Ignoring line")
+ continue
+ self.commandQ.put( cmdline.strip() )
+
+ def main( self ):
+ """The main command loop"""
+ while not leave_mainloop:
+ try:
+ if self.commandQ.empty():
+ sys.stdout = self.myout.delegate
+ cmdline = raw_input( "BB>> " )
+ sys.stdout = self.myout
+ else:
+ cmdline = self.commandQ.get()
+ if cmdline:
+ allCommands = cmdline.split( ';' )
+ for command in allCommands:
+ pipecmd = None
+ #
+ # special case for expert mode
+ if command == 'python':
+ sys.stdout = self.myout.delegate
+ self.processCommand( command, "" )
+ sys.stdout = self.myout
+ else:
+ self.myout.startCommand( command )
+ if '|' in command: # disable output
+ command, pipecmd = command.split( '|' )
+ delegate = self.myout.delegate
+ self.myout.delegate = None
+ tokens = shlex.split( command, True )
+ self.processCommand( tokens[0], tokens[1:] or "" )
+ self.myout.endCommand()
+ if pipecmd is not None: # restore output
+ self.myout.delegate = delegate
+
+ pipe = popen2.Popen4( pipecmd )
+ pipe.tochild.write( "\n".join( self.myout.lastBuffer() ) )
+ pipe.tochild.close()
+ sys.stdout.write( pipe.fromchild.read() )
+ #
+ except EOFError:
+ print()
+ return
+ except KeyboardInterrupt:
+ print()
+
+##########################################################################
+# Start function - called from the BitBake command line utility
+##########################################################################
+
+def start( aCooker ):
+ global cooker
+ cooker = aCooker
+ bbshell = BitBakeShell()
+ bbshell.processStartupFile()
+ bbshell.main()
+ bbshell.cleanup()
+
+if __name__ == "__main__":
+ print("SHELL: Sorry, this program should only be called by BitBake.")
diff --git a/bitbake/lib/bb/siggen.py b/bitbake/lib/bb/siggen.py
new file mode 100644
index 0000000..2985272
--- /dev/null
+++ b/bitbake/lib/bb/siggen.py
@@ -0,0 +1,526 @@
+import hashlib
+import logging
+import os
+import re
+import tempfile
+import bb.data
+
+logger = logging.getLogger('BitBake.SigGen')
+
+try:
+ import cPickle as pickle
+except ImportError:
+ import pickle
+ logger.info('Importing cPickle failed. Falling back to a very slow implementation.')
+
+def init(d):
+ siggens = [obj for obj in globals().itervalues()
+ if type(obj) is type and issubclass(obj, SignatureGenerator)]
+
+ desired = d.getVar("BB_SIGNATURE_HANDLER", True) or "noop"
+ for sg in siggens:
+ if desired == sg.name:
+ return sg(d)
+ else:
+ logger.error("Invalid signature generator '%s', using default 'noop'\n"
+ "Available generators: %s", desired,
+ ', '.join(obj.name for obj in siggens))
+ return SignatureGenerator(d)
+
+class SignatureGenerator(object):
+ """
+ """
+ name = "noop"
+
+ def __init__(self, data):
+ self.taskhash = {}
+ self.runtaskdeps = {}
+ self.file_checksum_values = {}
+
+    def finalise(self, fn, d, variant):
+ return
+
+ def get_taskhash(self, fn, task, deps, dataCache):
+ return "0"
+
+ def set_taskdata(self, hashes, deps, checksum):
+ return
+
+ def stampfile(self, stampbase, file_name, taskname, extrainfo):
+ return ("%s.%s.%s" % (stampbase, taskname, extrainfo)).rstrip('.')
+
+ def stampcleanmask(self, stampbase, file_name, taskname, extrainfo):
+ return ("%s.%s.%s" % (stampbase, taskname, extrainfo)).rstrip('.')
+
+ def dump_sigtask(self, fn, task, stampbase, runtime):
+ return
+
+ def invalidate_task(self, task, d, fn):
+ bb.build.del_stamp(task, d, fn)
+
+ def dump_sigs(self, dataCache, options):
+ return
+
+ def get_taskdata(self):
+ return (self.runtaskdeps, self.taskhash, self.file_checksum_values)
+
+ def set_taskdata(self, data):
+ self.runtaskdeps, self.taskhash, self.file_checksum_values = data
+
+
+class SignatureGeneratorBasic(SignatureGenerator):
+ """
+ """
+ name = "basic"
+
+ def __init__(self, data):
+ self.basehash = {}
+ self.taskhash = {}
+ self.taskdeps = {}
+ self.runtaskdeps = {}
+ self.file_checksum_values = {}
+ self.gendeps = {}
+ self.lookupcache = {}
+ self.pkgnameextract = re.compile("(?P<fn>.*)\..*")
+ self.basewhitelist = set((data.getVar("BB_HASHBASE_WHITELIST", True) or "").split())
+ self.taskwhitelist = None
+ self.init_rundepcheck(data)
+
+ def init_rundepcheck(self, data):
+ self.taskwhitelist = data.getVar("BB_HASHTASK_WHITELIST", True) or None
+ if self.taskwhitelist:
+ self.twl = re.compile(self.taskwhitelist)
+ else:
+ self.twl = None
+
+ def _build_data(self, fn, d):
+
+ tasklist, gendeps, lookupcache = bb.data.generate_dependencies(d)
+
+ taskdeps = {}
+ basehash = {}
+
+ for task in tasklist:
+ data = lookupcache[task]
+
+ if data is None:
+ bb.error("Task %s from %s seems to be empty?!" % (task, fn))
+ data = ''
+
+ gendeps[task] -= self.basewhitelist
+ newdeps = gendeps[task]
+ seen = set()
+ while newdeps:
+ nextdeps = newdeps
+ seen |= nextdeps
+ newdeps = set()
+ for dep in nextdeps:
+ if dep in self.basewhitelist:
+ continue
+ gendeps[dep] -= self.basewhitelist
+ newdeps |= gendeps[dep]
+ newdeps -= seen
+
+ alldeps = sorted(seen)
+ for dep in alldeps:
+ data = data + dep
+ var = lookupcache[dep]
+ if var is not None:
+ data = data + str(var)
+ self.basehash[fn + "." + task] = hashlib.md5(data).hexdigest()
+ taskdeps[task] = alldeps
+
+ self.taskdeps[fn] = taskdeps
+ self.gendeps[fn] = gendeps
+ self.lookupcache[fn] = lookupcache
+
+ return taskdeps
+
+ def finalise(self, fn, d, variant):
+
+ if variant:
+ fn = "virtual:" + variant + ":" + fn
+
+ try:
+ taskdeps = self._build_data(fn, d)
+ except:
+ bb.note("Error during finalise of %s" % fn)
+ raise
+
+ #Slow but can be useful for debugging mismatched basehashes
+ #for task in self.taskdeps[fn]:
+ # self.dump_sigtask(fn, task, d.getVar("STAMP", True), False)
+
+ for task in taskdeps:
+ d.setVar("BB_BASEHASH_task-%s" % task, self.basehash[fn + "." + task])
+
+ def rundep_check(self, fn, recipename, task, dep, depname, dataCache):
+ # Return True if we should keep the dependency, False to drop it
+ # We only manipulate the dependencies for packages not in the whitelist
+ if self.twl and not self.twl.search(recipename):
+ # then process the actual dependencies
+ if self.twl.search(depname):
+ return False
+ return True
+
+ def read_taint(self, fn, task, stampbase):
+ taint = None
+ try:
+ with open(stampbase + '.' + task + '.taint', 'r') as taintf:
+ taint = taintf.read()
+ except IOError:
+ pass
+ return taint
+
+ def get_taskhash(self, fn, task, deps, dataCache):
+ k = fn + "." + task
+ data = dataCache.basetaskhash[k]
+ self.runtaskdeps[k] = []
+ self.file_checksum_values[k] = {}
+ recipename = dataCache.pkg_fn[fn]
+ for dep in sorted(deps, key=clean_basepath):
+ depname = dataCache.pkg_fn[self.pkgnameextract.search(dep).group('fn')]
+ if not self.rundep_check(fn, recipename, task, dep, depname, dataCache):
+ continue
+ if dep not in self.taskhash:
+                bb.fatal("%s is not in taskhash, caller isn't calling in dependency order?" % dep)
+ data = data + self.taskhash[dep]
+ self.runtaskdeps[k].append(dep)
+
+ if task in dataCache.file_checksums[fn]:
+ checksums = bb.fetch2.get_file_checksums(dataCache.file_checksums[fn][task], recipename)
+ for (f,cs) in checksums:
+ self.file_checksum_values[k][f] = cs
+ if cs:
+ data = data + cs
+
+ taskdep = dataCache.task_deps[fn]
+ if 'nostamp' in taskdep and task in taskdep['nostamp']:
+ # Nostamp tasks need an implicit taint so that they force any dependent tasks to run
+ import uuid
+ data = data + str(uuid.uuid4())
+
+ taint = self.read_taint(fn, task, dataCache.stamp[fn])
+ if taint:
+ data = data + taint
+ logger.warn("%s is tainted from a forced run" % k)
+
+ h = hashlib.md5(data).hexdigest()
+ self.taskhash[k] = h
+ #d.setVar("BB_TASKHASH_task-%s" % task, taskhash[task])
+ return h
+
+ def dump_sigtask(self, fn, task, stampbase, runtime):
+ k = fn + "." + task
+ if runtime == "customfile":
+ sigfile = stampbase
+ elif runtime and k in self.taskhash:
+ sigfile = stampbase + "." + task + ".sigdata" + "." + self.taskhash[k]
+ else:
+ sigfile = stampbase + "." + task + ".sigbasedata" + "." + self.basehash[k]
+
+ bb.utils.mkdirhier(os.path.dirname(sigfile))
+
+ data = {}
+ data['basewhitelist'] = self.basewhitelist
+ data['taskwhitelist'] = self.taskwhitelist
+ data['taskdeps'] = self.taskdeps[fn][task]
+ data['basehash'] = self.basehash[k]
+ data['gendeps'] = {}
+ data['varvals'] = {}
+ data['varvals'][task] = self.lookupcache[fn][task]
+ for dep in self.taskdeps[fn][task]:
+ if dep in self.basewhitelist:
+ continue
+ data['gendeps'][dep] = self.gendeps[fn][dep]
+ data['varvals'][dep] = self.lookupcache[fn][dep]
+
+ if runtime and k in self.taskhash:
+ data['runtaskdeps'] = self.runtaskdeps[k]
+ data['file_checksum_values'] = [(os.path.basename(f), cs) for f,cs in self.file_checksum_values[k].items()]
+ data['runtaskhashes'] = {}
+ for dep in data['runtaskdeps']:
+ data['runtaskhashes'][dep] = self.taskhash[dep]
+
+ taint = self.read_taint(fn, task, stampbase)
+ if taint:
+ data['taint'] = taint
+
+ fd, tmpfile = tempfile.mkstemp(dir=os.path.dirname(sigfile), prefix="sigtask.")
+ try:
+ with os.fdopen(fd, "wb") as stream:
+ p = pickle.dump(data, stream, -1)
+ stream.flush()
+ os.chmod(tmpfile, 0664)
+ os.rename(tmpfile, sigfile)
+ except (OSError, IOError) as err:
+ try:
+ os.unlink(tmpfile)
+ except OSError:
+ pass
+ raise err
+
+ def dump_sigs(self, dataCache, options):
+ for fn in self.taskdeps:
+ for task in self.taskdeps[fn]:
+ k = fn + "." + task
+ if k not in self.taskhash:
+ continue
+ if dataCache.basetaskhash[k] != self.basehash[k]:
+ bb.error("Bitbake's cached basehash does not match the one we just generated (%s)!" % k)
+ bb.error("The mismatched hashes were %s and %s" % (dataCache.basetaskhash[k], self.basehash[k]))
+ self.dump_sigtask(fn, task, dataCache.stamp[fn], True)
+
+class SignatureGeneratorBasicHash(SignatureGeneratorBasic):
+ name = "basichash"
+
+ def stampfile(self, stampbase, fn, taskname, extrainfo, clean=False):
+ if taskname != "do_setscene" and taskname.endswith("_setscene"):
+ k = fn + "." + taskname[:-9]
+ else:
+ k = fn + "." + taskname
+ if clean:
+ h = "*"
+ elif k in self.taskhash:
+ h = self.taskhash[k]
+ else:
+ # If k is not in basehash, then error
+ h = self.basehash[k]
+ return ("%s.%s.%s.%s" % (stampbase, taskname, h, extrainfo)).rstrip('.')
+
+ def stampcleanmask(self, stampbase, fn, taskname, extrainfo):
+ return self.stampfile(stampbase, fn, taskname, extrainfo, clean=True)
+
+ def invalidate_task(self, task, d, fn):
+ bb.note("Tainting hash to force rebuild of task %s, %s" % (fn, task))
+ bb.build.write_taint(task, d, fn)
+
+def dump_this_task(outfile, d):
+ import bb.parse
+ fn = d.getVar("BB_FILENAME", True)
+ task = "do_" + d.getVar("BB_CURRENTTASK", True)
+ bb.parse.siggen.dump_sigtask(fn, task, outfile, "customfile")
+
+def clean_basepath(a):
+ b = a.rsplit("/", 2)[1] + a.rsplit("/", 2)[2]
+ if a.startswith("virtual:"):
+ b = b + ":" + a.rsplit(":", 1)[0]
+ return b
+
+def clean_basepaths(a):
+ b = {}
+ for x in a:
+ b[clean_basepath(x)] = a[x]
+ return b
+
+def clean_basepaths_list(a):
+ b = []
+ for x in a:
+ b.append(clean_basepath(x))
+ return b
+
+def compare_sigfiles(a, b, recursecb = None):
+ output = []
+
+ p1 = pickle.Unpickler(open(a, "rb"))
+ a_data = p1.load()
+ p2 = pickle.Unpickler(open(b, "rb"))
+ b_data = p2.load()
+
+ def dict_diff(a, b, whitelist=set()):
+ sa = set(a.keys())
+ sb = set(b.keys())
+ common = sa & sb
+ changed = set()
+ for i in common:
+ if a[i] != b[i] and i not in whitelist:
+ changed.add(i)
+ added = sb - sa
+ removed = sa - sb
+ return changed, added, removed
+
+ def file_checksums_diff(a, b):
+ from collections import Counter
+ # Handle old siginfo format
+ if isinstance(a, dict):
+ a = [(os.path.basename(f), cs) for f, cs in a.items()]
+ if isinstance(b, dict):
+ b = [(os.path.basename(f), cs) for f, cs in b.items()]
+ # Compare lists, ensuring we can handle duplicate filenames if they exist
+ removedcount = Counter(a)
+ removedcount.subtract(b)
+ addedcount = Counter(b)
+ addedcount.subtract(a)
+ added = []
+ for x in b:
+ if addedcount[x] > 0:
+ addedcount[x] -= 1
+ added.append(x)
+ removed = []
+ changed = []
+ for x in a:
+ if removedcount[x] > 0:
+ removedcount[x] -= 1
+ for y in added:
+ if y[0] == x[0]:
+ changed.append((x[0], x[1], y[1]))
+ added.remove(y)
+ break
+ else:
+ removed.append(x)
+ added = [x[0] for x in added]
+ removed = [x[0] for x in removed]
+ return changed, added, removed
+
+ if 'basewhitelist' in a_data and a_data['basewhitelist'] != b_data['basewhitelist']:
+ output.append("basewhitelist changed from '%s' to '%s'" % (a_data['basewhitelist'], b_data['basewhitelist']))
+ if a_data['basewhitelist'] and b_data['basewhitelist']:
+ output.append("changed items: %s" % a_data['basewhitelist'].symmetric_difference(b_data['basewhitelist']))
+
+ if 'taskwhitelist' in a_data and a_data['taskwhitelist'] != b_data['taskwhitelist']:
+ output.append("taskwhitelist changed from '%s' to '%s'" % (a_data['taskwhitelist'], b_data['taskwhitelist']))
+ if a_data['taskwhitelist'] and b_data['taskwhitelist']:
+ output.append("changed items: %s" % a_data['taskwhitelist'].symmetric_difference(b_data['taskwhitelist']))
+
+ if a_data['taskdeps'] != b_data['taskdeps']:
+ output.append("Task dependencies changed from:\n%s\nto:\n%s" % (sorted(a_data['taskdeps']), sorted(b_data['taskdeps'])))
+
+ if a_data['basehash'] != b_data['basehash']:
+ output.append("basehash changed from %s to %s" % (a_data['basehash'], b_data['basehash']))
+
+ changed, added, removed = dict_diff(a_data['gendeps'], b_data['gendeps'], a_data['basewhitelist'] & b_data['basewhitelist'])
+ if changed:
+ for dep in changed:
+ output.append("List of dependencies for variable %s changed from '%s' to '%s'" % (dep, a_data['gendeps'][dep], b_data['gendeps'][dep]))
+ if a_data['gendeps'][dep] and b_data['gendeps'][dep]:
+ output.append("changed items: %s" % a_data['gendeps'][dep].symmetric_difference(b_data['gendeps'][dep]))
+ if added:
+ for dep in added:
+ output.append("Dependency on variable %s was added" % (dep))
+ if removed:
+ for dep in removed:
+ output.append("Dependency on Variable %s was removed" % (dep))
+
+
+ changed, added, removed = dict_diff(a_data['varvals'], b_data['varvals'])
+ if changed:
+ for dep in changed:
+ output.append("Variable %s value changed from '%s' to '%s'" % (dep, a_data['varvals'][dep], b_data['varvals'][dep]))
+
+ changed, added, removed = file_checksums_diff(a_data['file_checksum_values'], b_data['file_checksum_values'])
+ if changed:
+ for f, old, new in changed:
+ output.append("Checksum for file %s changed from %s to %s" % (f, old, new))
+ if added:
+ for f in added:
+ output.append("Dependency on checksum of file %s was added" % (f))
+ if removed:
+ for f in removed:
+ output.append("Dependency on checksum of file %s was removed" % (f))
+
+
+ if len(a_data['runtaskdeps']) != len(b_data['runtaskdeps']):
+ changed = ["Number of task dependencies changed"]
+ else:
+ changed = []
+ for idx, task in enumerate(a_data['runtaskdeps']):
+ a = a_data['runtaskdeps'][idx]
+ b = b_data['runtaskdeps'][idx]
+ if a_data['runtaskhashes'][a] != b_data['runtaskhashes'][b]:
+ changed.append("%s with hash %s\n changed to\n%s with hash %s" % (a, a_data['runtaskhashes'][a], b, b_data['runtaskhashes'][b]))
+
+ if changed:
+ output.append("runtaskdeps changed from %s to %s" % (clean_basepaths_list(a_data['runtaskdeps']), clean_basepaths_list(b_data['runtaskdeps'])))
+ output.append("\n".join(changed))
+
+
+ if 'runtaskhashes' in a_data and 'runtaskhashes' in b_data:
+ a = a_data['runtaskhashes']
+ b = b_data['runtaskhashes']
+ changed, added, removed = dict_diff(a, b)
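+        # A dependency that appears in both 'added' and 'removed' with the same
+        # hash was most likely just renamed, so it is not reported as added or
+        # removed below.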
+ if added:
+ for dep in added:
+ bdep_found = False
+ if removed:
+ for bdep in removed:
+ if b[dep] == a[bdep]:
+ #output.append("Dependency on task %s was replaced by %s with same hash" % (dep, bdep))
+ bdep_found = True
+ if not bdep_found:
+ output.append("Dependency on task %s was added with hash %s" % (clean_basepath(dep), b[dep]))
+ if removed:
+ for dep in removed:
+ adep_found = False
+ if added:
+ for adep in added:
+ if b[adep] == a[dep]:
+ #output.append("Dependency on task %s was replaced by %s with same hash" % (adep, dep))
+ adep_found = True
+ if not adep_found:
+ output.append("Dependency on task %s was removed with hash %s" % (clean_basepath(dep), a[dep]))
+ if changed:
+ for dep in changed:
+ output.append("Hash for dependent task %s changed from %s to %s" % (clean_basepath(dep), a[dep], b[dep]))
+ if callable(recursecb):
+                        # If a dependent hash changed, might as well print the line above and then defer to the changes in
+                        # that hash since in all likelihood, they're the same changes this task also saw.
+ recout = recursecb(dep, a[dep], b[dep])
+ if recout:
+ output = [output[-1]] + recout
+
+ a_taint = a_data.get('taint', None)
+ b_taint = b_data.get('taint', None)
+ if a_taint != b_taint:
+ output.append("Taint (by forced/invalidated task) changed from %s to %s" % (a_taint, b_taint))
+
+ return output
+
+
+def dump_sigfile(a):
+ output = []
+
+ p1 = pickle.Unpickler(open(a, "rb"))
+ a_data = p1.load()
+
+ output.append("basewhitelist: %s" % (a_data['basewhitelist']))
+
+ output.append("taskwhitelist: %s" % (a_data['taskwhitelist']))
+
+ output.append("Task dependencies: %s" % (sorted(a_data['taskdeps'])))
+
+ output.append("basehash: %s" % (a_data['basehash']))
+
+ for dep in a_data['gendeps']:
+ output.append("List of dependencies for variable %s is %s" % (dep, a_data['gendeps'][dep]))
+
+ for dep in a_data['varvals']:
+ output.append("Variable %s value is %s" % (dep, a_data['varvals'][dep]))
+
+ if 'runtaskdeps' in a_data:
+ output.append("Tasks this task depends on: %s" % (a_data['runtaskdeps']))
+
+ if 'file_checksum_values' in a_data:
+ output.append("This task depends on the checksums of files: %s" % (a_data['file_checksum_values']))
+
+ if 'runtaskhashes' in a_data:
+ for dep in a_data['runtaskhashes']:
+ output.append("Hash for dependent task %s is %s" % (dep, a_data['runtaskhashes'][dep]))
+
+ if 'taint' in a_data:
+ output.append("Tainted (by forced/invalidated task): %s" % a_data['taint'])
+
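+    # Recompute the overall task hash from its recorded components: the base
+    # hash, the hashes of all dependent tasks, the file checksums and any taint.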
+ data = a_data['basehash']
+ for dep in a_data['runtaskdeps']:
+ data = data + a_data['runtaskhashes'][dep]
+
+ for c in a_data['file_checksum_values']:
+ data = data + c[1]
+
+ if 'taint' in a_data:
+ data = data + a_data['taint']
+
+ h = hashlib.md5(data).hexdigest()
+ output.append("Computed Hash is %s" % h)
+
+ return output
diff --git a/bitbake/lib/bb/taskdata.py b/bitbake/lib/bb/taskdata.py
new file mode 100644
index 0000000..5fab704
--- /dev/null
+++ b/bitbake/lib/bb/taskdata.py
@@ -0,0 +1,655 @@
+#!/usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake 'TaskData' implementation
+
+Task data collection and handling
+
+"""
+
+# Copyright (C) 2006 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import logging
+import re
+import bb
+
+logger = logging.getLogger("BitBake.TaskData")
+
+def re_match_strings(target, strings):
+ """
+    Return whether the string 'target' matches any of the given strings;
+    each string may be either an exact name or a regular expression.
+ """
+ return any(name == target or re.match(name, target)
+ for name in strings)
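+# For example, re_match_strings("virtual/kernel", ["^virtual/"]) is True, since
+# each entry is treated as an exact name or as a regular expression to match.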
+
+class TaskData:
+ """
+ BitBake Task Data implementation
+ """
+ def __init__(self, abort = True, tryaltconfigs = False, skiplist = None, allowincomplete = False):
+ self.build_names_index = []
+ self.run_names_index = []
+ self.fn_index = []
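+        # Target and file names are interned into the index lists above and are
+        # referred to elsewhere by their integer position (see the get*_id methods).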
+
+ self.build_targets = {}
+ self.run_targets = {}
+
+ self.external_targets = []
+
+ self.tasks_fnid = []
+ self.tasks_name = []
+ self.tasks_tdepends = []
+ self.tasks_idepends = []
+ self.tasks_irdepends = []
+ # Cache to speed up task ID lookups
+ self.tasks_lookup = {}
+
+ self.depids = {}
+ self.rdepids = {}
+
+ self.consider_msgs_cache = []
+
+ self.failed_deps = []
+ self.failed_rdeps = []
+ self.failed_fnids = []
+
+ self.abort = abort
+ self.tryaltconfigs = tryaltconfigs
+ self.allowincomplete = allowincomplete
+
+ self.skiplist = skiplist
+
+ def getbuild_id(self, name):
+ """
+ Return an ID number for the build target name.
+ If it doesn't exist, create one.
+ """
+ if not name in self.build_names_index:
+ self.build_names_index.append(name)
+ return len(self.build_names_index) - 1
+
+ return self.build_names_index.index(name)
+
+ def getrun_id(self, name):
+ """
+ Return an ID number for the run target name.
+ If it doesn't exist, create one.
+ """
+ if not name in self.run_names_index:
+ self.run_names_index.append(name)
+ return len(self.run_names_index) - 1
+
+ return self.run_names_index.index(name)
+
+ def getfn_id(self, name):
+ """
+ Return an ID number for the filename.
+ If it doesn't exist, create one.
+ """
+ if not name in self.fn_index:
+ self.fn_index.append(name)
+ return len(self.fn_index) - 1
+
+ return self.fn_index.index(name)
+
+ def gettask_ids(self, fnid):
+ """
+ Return an array of the ID numbers matching a given fnid.
+ """
+ ids = []
+ if fnid in self.tasks_lookup:
+ for task in self.tasks_lookup[fnid]:
+ ids.append(self.tasks_lookup[fnid][task])
+ return ids
+
+ def gettask_id_fromfnid(self, fnid, task):
+ """
+ Return an ID number for the task matching fnid and task.
+ """
+ if fnid in self.tasks_lookup:
+ if task in self.tasks_lookup[fnid]:
+ return self.tasks_lookup[fnid][task]
+
+ return None
+
+ def gettask_id(self, fn, task, create = True):
+ """
+ Return an ID number for the task matching fn and task.
+ If it doesn't exist, create one by default.
+ Optionally return None instead.
+ """
+ fnid = self.getfn_id(fn)
+
+ if fnid in self.tasks_lookup:
+ if task in self.tasks_lookup[fnid]:
+ return self.tasks_lookup[fnid][task]
+
+ if not create:
+ return None
+
+ self.tasks_name.append(task)
+ self.tasks_fnid.append(fnid)
+ self.tasks_tdepends.append([])
+ self.tasks_idepends.append([])
+ self.tasks_irdepends.append([])
+
+ listid = len(self.tasks_name) - 1
+
+ if fnid not in self.tasks_lookup:
+ self.tasks_lookup[fnid] = {}
+ self.tasks_lookup[fnid][task] = listid
+
+ return listid
+
+ def add_tasks(self, fn, dataCache):
+ """
+ Add tasks for a given fn to the database
+ """
+
+ task_deps = dataCache.task_deps[fn]
+
+ fnid = self.getfn_id(fn)
+
+ if fnid in self.failed_fnids:
+ bb.msg.fatal("TaskData", "Trying to re-add a failed file? Something is broken...")
+
+ # Check if we've already seen this fn
+ if fnid in self.tasks_fnid:
+ return
+
+ for task in task_deps['tasks']:
+
+ # Work out task dependencies
+ parentids = []
+ for dep in task_deps['parents'][task]:
+ if dep not in task_deps['tasks']:
+                    bb.debug(2, "Not adding dependency of %s on %s since %s does not exist" % (task, dep, dep))
+ continue
+ parentid = self.gettask_id(fn, dep)
+ parentids.append(parentid)
+ taskid = self.gettask_id(fn, task)
+ self.tasks_tdepends[taskid].extend(parentids)
+
+ # Touch all intertask dependencies
+ if 'depends' in task_deps and task in task_deps['depends']:
+ ids = []
+ for dep in task_deps['depends'][task].split():
+ if dep:
+ if ":" not in dep:
+                            bb.msg.fatal("TaskData", "Error for %s, dependency %s does not contain ':' character.\nTask 'depends' should be specified in the form 'packagename:task'" % (fn, dep))
+ ids.append(((self.getbuild_id(dep.split(":")[0])), dep.split(":")[1]))
+ self.tasks_idepends[taskid].extend(ids)
+ if 'rdepends' in task_deps and task in task_deps['rdepends']:
+ ids = []
+ for dep in task_deps['rdepends'][task].split():
+ if dep:
+ if ":" not in dep:
+                            bb.msg.fatal("TaskData", "Error for %s, dependency %s does not contain ':' character.\nTask 'rdepends' should be specified in the form 'packagename:task'" % (fn, dep))
+ ids.append(((self.getrun_id(dep.split(":")[0])), dep.split(":")[1]))
+ self.tasks_irdepends[taskid].extend(ids)
+
+
+ # Work out build dependencies
+ if not fnid in self.depids:
+ dependids = {}
+ for depend in dataCache.deps[fn]:
+ dependids[self.getbuild_id(depend)] = None
+ self.depids[fnid] = dependids.keys()
+ logger.debug(2, "Added dependencies %s for %s", str(dataCache.deps[fn]), fn)
+
+ # Work out runtime dependencies
+ if not fnid in self.rdepids:
+ rdependids = {}
+ rdepends = dataCache.rundeps[fn]
+ rrecs = dataCache.runrecs[fn]
+ rdependlist = []
+ rreclist = []
+ for package in rdepends:
+ for rdepend in rdepends[package]:
+ rdependlist.append(rdepend)
+ rdependids[self.getrun_id(rdepend)] = None
+ for package in rrecs:
+ for rdepend in rrecs[package]:
+ rreclist.append(rdepend)
+ rdependids[self.getrun_id(rdepend)] = None
+ if rdependlist:
+ logger.debug(2, "Added runtime dependencies %s for %s", str(rdependlist), fn)
+ if rreclist:
+ logger.debug(2, "Added runtime recommendations %s for %s", str(rreclist), fn)
+ self.rdepids[fnid] = rdependids.keys()
+
+ for dep in self.depids[fnid]:
+ if dep in self.failed_deps:
+ self.fail_fnid(fnid)
+ return
+ for dep in self.rdepids[fnid]:
+ if dep in self.failed_rdeps:
+ self.fail_fnid(fnid)
+ return
+
+ def have_build_target(self, target):
+ """
+ Have we a build target matching this name?
+ """
+ targetid = self.getbuild_id(target)
+
+ if targetid in self.build_targets:
+ return True
+ return False
+
+ def have_runtime_target(self, target):
+ """
+ Have we a runtime target matching this name?
+ """
+ targetid = self.getrun_id(target)
+
+ if targetid in self.run_targets:
+ return True
+ return False
+
+ def add_build_target(self, fn, item):
+ """
+ Add a build target.
+ If already present, append the provider fn to the list
+ """
+ targetid = self.getbuild_id(item)
+ fnid = self.getfn_id(fn)
+
+ if targetid in self.build_targets:
+ if fnid in self.build_targets[targetid]:
+ return
+ self.build_targets[targetid].append(fnid)
+ return
+ self.build_targets[targetid] = [fnid]
+
+ def add_runtime_target(self, fn, item):
+ """
+ Add a runtime target.
+ If already present, append the provider fn to the list
+ """
+ targetid = self.getrun_id(item)
+ fnid = self.getfn_id(fn)
+
+ if targetid in self.run_targets:
+ if fnid in self.run_targets[targetid]:
+ return
+ self.run_targets[targetid].append(fnid)
+ return
+ self.run_targets[targetid] = [fnid]
+
+ def mark_external_target(self, item):
+ """
+ Mark a build target as being externally requested
+ """
+ targetid = self.getbuild_id(item)
+
+ if targetid not in self.external_targets:
+ self.external_targets.append(targetid)
+
+ def get_unresolved_build_targets(self, dataCache):
+ """
+        Return a list of build targets whose providers
+ are unknown.
+ """
+ unresolved = []
+ for target in self.build_names_index:
+ if re_match_strings(target, dataCache.ignored_dependencies):
+ continue
+ if self.build_names_index.index(target) in self.failed_deps:
+ continue
+ if not self.have_build_target(target):
+ unresolved.append(target)
+ return unresolved
+
+ def get_unresolved_run_targets(self, dataCache):
+ """
+        Return a list of runtime targets whose providers
+ are unknown.
+ """
+ unresolved = []
+ for target in self.run_names_index:
+ if re_match_strings(target, dataCache.ignored_dependencies):
+ continue
+ if self.run_names_index.index(target) in self.failed_rdeps:
+ continue
+ if not self.have_runtime_target(target):
+ unresolved.append(target)
+ return unresolved
+
+ def get_provider(self, item):
+ """
+ Return a list of providers of item
+ """
+ targetid = self.getbuild_id(item)
+
+ return self.build_targets[targetid]
+
+ def get_dependees(self, itemid):
+ """
+ Return a list of targets which depend on item
+ """
+ dependees = []
+ for fnid in self.depids:
+ if itemid in self.depids[fnid]:
+ dependees.append(fnid)
+ return dependees
+
+ def get_dependees_str(self, item):
+ """
+        Return a list of the recipe filenames which depend on item, for user-readable output
+ """
+ itemid = self.getbuild_id(item)
+ dependees = []
+ for fnid in self.depids:
+ if itemid in self.depids[fnid]:
+ dependees.append(self.fn_index[fnid])
+ return dependees
+
+ def get_rdependees(self, itemid):
+ """
+ Return a list of targets which depend on runtime item
+ """
+ dependees = []
+ for fnid in self.rdepids:
+ if itemid in self.rdepids[fnid]:
+ dependees.append(fnid)
+ return dependees
+
+ def get_rdependees_str(self, item):
+ """
+        Return a list of the recipe filenames which depend on the runtime item, for user-readable output
+ """
+ itemid = self.getrun_id(item)
+ dependees = []
+ for fnid in self.rdepids:
+ if itemid in self.rdepids[fnid]:
+ dependees.append(self.fn_index[fnid])
+ return dependees
+
+ def get_reasons(self, item, runtime=False):
+ """
+ Get the reason(s) for an item not being provided, if any
+ """
+ reasons = []
+ if self.skiplist:
+ for fn in self.skiplist:
+ skipitem = self.skiplist[fn]
+ if skipitem.pn == item:
+ reasons.append("%s was skipped: %s" % (skipitem.pn, skipitem.skipreason))
+ elif runtime and item in skipitem.rprovides:
+ reasons.append("%s RPROVIDES %s but was skipped: %s" % (skipitem.pn, item, skipitem.skipreason))
+ elif not runtime and item in skipitem.provides:
+ reasons.append("%s PROVIDES %s but was skipped: %s" % (skipitem.pn, item, skipitem.skipreason))
+ return reasons
+
+ def get_close_matches(self, item, provider_list):
+ import difflib
+ if self.skiplist:
+ skipped = []
+ for fn in self.skiplist:
+ skipped.append(self.skiplist[fn].pn)
+ full_list = provider_list + skipped
+ else:
+ full_list = provider_list
+ return difflib.get_close_matches(item, full_list, cutoff=0.7)
+
+ def add_provider(self, cfgData, dataCache, item):
+ try:
+ self.add_provider_internal(cfgData, dataCache, item)
+ except bb.providers.NoProvider:
+ if self.abort:
+ raise
+ self.remove_buildtarget(self.getbuild_id(item))
+
+ self.mark_external_target(item)
+
+ def add_provider_internal(self, cfgData, dataCache, item):
+ """
+ Add the providers of item to the task data
+        Mark entries that were specifically added externally, as opposed to
+        dependencies added internally during dependency resolution
+ """
+
+ if re_match_strings(item, dataCache.ignored_dependencies):
+ return
+
+ if not item in dataCache.providers:
+ bb.event.fire(bb.event.NoProvider(item, dependees=self.get_dependees_str(item), reasons=self.get_reasons(item), close_matches=self.get_close_matches(item, dataCache.providers.keys())), cfgData)
+ raise bb.providers.NoProvider(item)
+
+ if self.have_build_target(item):
+ return
+
+ all_p = dataCache.providers[item]
+
+ eligible, foundUnique = bb.providers.filterProviders(all_p, item, cfgData, dataCache)
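+        # Drop any providers whose recipe files have already been marked unbuildable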
+ eligible = [p for p in eligible if not self.getfn_id(p) in self.failed_fnids]
+
+ if not eligible:
+ bb.event.fire(bb.event.NoProvider(item, dependees=self.get_dependees_str(item), reasons=["No eligible PROVIDERs exist for '%s'" % item]), cfgData)
+ raise bb.providers.NoProvider(item)
+
+ if len(eligible) > 1 and foundUnique == False:
+ if item not in self.consider_msgs_cache:
+ providers_list = []
+ for fn in eligible:
+ providers_list.append(dataCache.pkg_fn[fn])
+ bb.event.fire(bb.event.MultipleProviders(item, providers_list), cfgData)
+ self.consider_msgs_cache.append(item)
+
+ for fn in eligible:
+ fnid = self.getfn_id(fn)
+ if fnid in self.failed_fnids:
+ continue
+ logger.debug(2, "adding %s to satisfy %s", fn, item)
+ self.add_build_target(fn, item)
+ self.add_tasks(fn, dataCache)
+
+
+ #item = dataCache.pkg_fn[fn]
+
+ def add_rprovider(self, cfgData, dataCache, item):
+ """
+ Add the runtime providers of item to the task data
+ (takes item names from RDEPENDS/PACKAGES namespace)
+ """
+
+ if re_match_strings(item, dataCache.ignored_dependencies):
+ return
+
+ if self.have_runtime_target(item):
+ return
+
+ all_p = bb.providers.getRuntimeProviders(dataCache, item)
+
+ if not all_p:
+ bb.event.fire(bb.event.NoProvider(item, runtime=True, dependees=self.get_rdependees_str(item), reasons=self.get_reasons(item, True)), cfgData)
+ raise bb.providers.NoRProvider(item)
+
+ eligible, numberPreferred = bb.providers.filterProvidersRunTime(all_p, item, cfgData, dataCache)
+ eligible = [p for p in eligible if not self.getfn_id(p) in self.failed_fnids]
+
+ if not eligible:
+ bb.event.fire(bb.event.NoProvider(item, runtime=True, dependees=self.get_rdependees_str(item), reasons=["No eligible RPROVIDERs exist for '%s'" % item]), cfgData)
+ raise bb.providers.NoRProvider(item)
+
+ if len(eligible) > 1 and numberPreferred == 0:
+ if item not in self.consider_msgs_cache:
+ providers_list = []
+ for fn in eligible:
+ providers_list.append(dataCache.pkg_fn[fn])
+ bb.event.fire(bb.event.MultipleProviders(item, providers_list, runtime=True), cfgData)
+ self.consider_msgs_cache.append(item)
+
+ if numberPreferred > 1:
+ if item not in self.consider_msgs_cache:
+ providers_list = []
+ for fn in eligible:
+ providers_list.append(dataCache.pkg_fn[fn])
+ bb.event.fire(bb.event.MultipleProviders(item, providers_list, runtime=True), cfgData)
+ self.consider_msgs_cache.append(item)
+ raise bb.providers.MultipleRProvider(item)
+
+ # run through the list until we find one that we can build
+ for fn in eligible:
+ fnid = self.getfn_id(fn)
+ if fnid in self.failed_fnids:
+ continue
+ logger.debug(2, "adding '%s' to satisfy runtime '%s'", fn, item)
+ self.add_runtime_target(fn, item)
+ self.add_tasks(fn, dataCache)
+
+ def fail_fnid(self, fnid, missing_list=None):
+ """
+ Mark a file as failed (unbuildable)
+ Remove any references from build and runtime provider lists
+
+        missing_list: a list of missing requirements for this target
+ """
+ if fnid in self.failed_fnids:
+ return
+ if not missing_list:
+ missing_list = []
+ logger.debug(1, "File '%s' is unbuildable, removing...", self.fn_index[fnid])
+ self.failed_fnids.append(fnid)
+ for target in self.build_targets:
+ if fnid in self.build_targets[target]:
+ self.build_targets[target].remove(fnid)
+ if len(self.build_targets[target]) == 0:
+ self.remove_buildtarget(target, missing_list)
+ for target in self.run_targets:
+ if fnid in self.run_targets[target]:
+ self.run_targets[target].remove(fnid)
+ if len(self.run_targets[target]) == 0:
+ self.remove_runtarget(target, missing_list)
+
+ def remove_buildtarget(self, targetid, missing_list=None):
+ """
+ Mark a build target as failed (unbuildable)
+ Trigger removal of any files that have this as a dependency
+ """
+ if not missing_list:
+ missing_list = [self.build_names_index[targetid]]
+ else:
+ missing_list = [self.build_names_index[targetid]] + missing_list
+ logger.verbose("Target '%s' is unbuildable, removing...\nMissing or unbuildable dependency chain was: %s", self.build_names_index[targetid], missing_list)
+ self.failed_deps.append(targetid)
+ dependees = self.get_dependees(targetid)
+ for fnid in dependees:
+ self.fail_fnid(fnid, missing_list)
+ for taskid in xrange(len(self.tasks_idepends)):
+ idepends = self.tasks_idepends[taskid]
+ for (idependid, idependtask) in idepends:
+ if idependid == targetid:
+ self.fail_fnid(self.tasks_fnid[taskid], missing_list)
+
+ if self.abort and targetid in self.external_targets:
+ target = self.build_names_index[targetid]
+ logger.error("Required build target '%s' has no buildable providers.\nMissing or unbuildable dependency chain was: %s", target, missing_list)
+ raise bb.providers.NoProvider(target)
+
+ def remove_runtarget(self, targetid, missing_list=None):
+ """
+ Mark a run target as failed (unbuildable)
+ Trigger removal of any files that have this as a dependency
+ """
+ if not missing_list:
+ missing_list = [self.run_names_index[targetid]]
+ else:
+ missing_list = [self.run_names_index[targetid]] + missing_list
+
+ logger.info("Runtime target '%s' is unbuildable, removing...\nMissing or unbuildable dependency chain was: %s", self.run_names_index[targetid], missing_list)
+ self.failed_rdeps.append(targetid)
+ dependees = self.get_rdependees(targetid)
+ for fnid in dependees:
+ self.fail_fnid(fnid, missing_list)
+ for taskid in xrange(len(self.tasks_irdepends)):
+ irdepends = self.tasks_irdepends[taskid]
+ for (idependid, idependtask) in irdepends:
+ if idependid == targetid:
+ self.fail_fnid(self.tasks_fnid[taskid], missing_list)
+
+ def add_unresolved(self, cfgData, dataCache):
+ """
+ Resolve all unresolved build and runtime targets
+ """
+ logger.info("Resolving any missing task queue dependencies")
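+        # Loop until a fixed point is reached: resolving one target can pull in
+        # further unresolved build and runtime targets, so keep going until a
+        # full pass adds nothing new.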
+ while True:
+ added = 0
+ for target in self.get_unresolved_build_targets(dataCache):
+ try:
+ self.add_provider_internal(cfgData, dataCache, target)
+ added = added + 1
+ except bb.providers.NoProvider:
+ targetid = self.getbuild_id(target)
+ if self.abort and targetid in self.external_targets and not self.allowincomplete:
+ raise
+ if not self.allowincomplete:
+ self.remove_buildtarget(targetid)
+ for target in self.get_unresolved_run_targets(dataCache):
+ try:
+ self.add_rprovider(cfgData, dataCache, target)
+ added = added + 1
+ except (bb.providers.NoRProvider, bb.providers.MultipleRProvider):
+ self.remove_runtarget(self.getrun_id(target))
+ logger.debug(1, "Resolved " + str(added) + " extra dependencies")
+ if added == 0:
+ break
+ # self.dump_data()
+
+ def dump_data(self):
+ """
+ Dump some debug information on the internal data structures
+ """
+ logger.debug(3, "build_names:")
+ logger.debug(3, ", ".join(self.build_names_index))
+
+ logger.debug(3, "run_names:")
+ logger.debug(3, ", ".join(self.run_names_index))
+
+ logger.debug(3, "build_targets:")
+ for buildid in xrange(len(self.build_names_index)):
+ target = self.build_names_index[buildid]
+ targets = "None"
+ if buildid in self.build_targets:
+ targets = self.build_targets[buildid]
+ logger.debug(3, " (%s)%s: %s", buildid, target, targets)
+
+ logger.debug(3, "run_targets:")
+ for runid in xrange(len(self.run_names_index)):
+ target = self.run_names_index[runid]
+ targets = "None"
+ if runid in self.run_targets:
+ targets = self.run_targets[runid]
+ logger.debug(3, " (%s)%s: %s", runid, target, targets)
+
+ logger.debug(3, "tasks:")
+ for task in xrange(len(self.tasks_name)):
+ logger.debug(3, " (%s)%s - %s: %s",
+ task,
+ self.fn_index[self.tasks_fnid[task]],
+ self.tasks_name[task],
+ self.tasks_tdepends[task])
+
+ logger.debug(3, "dependency ids (per fn):")
+ for fnid in self.depids:
+ logger.debug(3, " %s %s: %s", fnid, self.fn_index[fnid], self.depids[fnid])
+
+ logger.debug(3, "runtime dependency ids (per fn):")
+ for fnid in self.rdepids:
+ logger.debug(3, " %s %s: %s", fnid, self.fn_index[fnid], self.rdepids[fnid])
diff --git a/bitbake/lib/bb/tests/__init__.py b/bitbake/lib/bb/tests/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/bb/tests/__init__.py
diff --git a/bitbake/lib/bb/tests/codeparser.py b/bitbake/lib/bb/tests/codeparser.py
new file mode 100644
index 0000000..4454bc5
--- /dev/null
+++ b/bitbake/lib/bb/tests/codeparser.py
@@ -0,0 +1,375 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Test for codeparser.py
+#
+# Copyright (C) 2010 Chris Larson
+# Copyright (C) 2012 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+
+import unittest
+import logging
+import bb
+
+logger = logging.getLogger('BitBake.TestCodeParser')
+
+# bb.data references bb.parse but can't directly import due to circular dependencies.
+# Hack around it for now :(
+import bb.parse
+import bb.data
+
+class ReferenceTest(unittest.TestCase):
+ def setUp(self):
+ self.d = bb.data.init()
+
+ def setEmptyVars(self, varlist):
+ for k in varlist:
+ self.d.setVar(k, "")
+
+ def setValues(self, values):
+ for k, v in values.items():
+ self.d.setVar(k, v)
+
+ def assertReferences(self, refs):
+ self.assertEqual(self.references, refs)
+
+ def assertExecs(self, execs):
+ self.assertEqual(self.execs, execs)
+
+class VariableReferenceTest(ReferenceTest):
+
+ def parseExpression(self, exp):
+ parsedvar = self.d.expandWithRefs(exp, None)
+ self.references = parsedvar.references
+
+ def test_simple_reference(self):
+ self.setEmptyVars(["FOO"])
+ self.parseExpression("${FOO}")
+ self.assertReferences(set(["FOO"]))
+
+ def test_nested_reference(self):
+ self.setEmptyVars(["BAR"])
+ self.d.setVar("FOO", "BAR")
+ self.parseExpression("${${FOO}}")
+ self.assertReferences(set(["FOO", "BAR"]))
+
+ def test_python_reference(self):
+ self.setEmptyVars(["BAR"])
+ self.parseExpression("${@bb.data.getVar('BAR', d, True) + 'foo'}")
+ self.assertReferences(set(["BAR"]))
+
+class ShellReferenceTest(ReferenceTest):
+
+ def parseExpression(self, exp):
+ parsedvar = self.d.expandWithRefs(exp, None)
+ parser = bb.codeparser.ShellParser("ParserTest", logger)
+ parser.parse_shell(parsedvar.value)
+
+ self.references = parsedvar.references
+ self.execs = parser.execs
+
+ def test_quotes_inside_assign(self):
+ self.parseExpression('foo=foo"bar"baz')
+ self.assertReferences(set([]))
+
+ def test_quotes_inside_arg(self):
+ self.parseExpression('sed s#"bar baz"#"alpha beta"#g')
+ self.assertExecs(set(["sed"]))
+
+ def test_arg_continuation(self):
+ self.parseExpression("sed -i -e s,foo,bar,g \\\n *.pc")
+ self.assertExecs(set(["sed"]))
+
+ def test_dollar_in_quoted(self):
+ self.parseExpression('sed -i -e "foo$" *.pc')
+ self.assertExecs(set(["sed"]))
+
+ def test_quotes_inside_arg_continuation(self):
+ self.setEmptyVars(["bindir", "D", "libdir"])
+ self.parseExpression("""
+sed -i -e s#"moc_location=.*$"#"moc_location=${bindir}/moc4"# \\
+-e s#"uic_location=.*$"#"uic_location=${bindir}/uic4"# \\
+${D}${libdir}/pkgconfig/*.pc
+""")
+ self.assertReferences(set(["bindir", "D", "libdir"]))
+
+ def test_assign_subshell_expansion(self):
+ self.parseExpression("foo=$(echo bar)")
+ self.assertExecs(set(["echo"]))
+
+ def test_shell_unexpanded(self):
+ self.setEmptyVars(["QT_BASE_NAME"])
+ self.parseExpression('echo "${QT_BASE_NAME}"')
+ self.assertExecs(set(["echo"]))
+ self.assertReferences(set(["QT_BASE_NAME"]))
+
+ def test_incomplete_varexp_single_quotes(self):
+ self.parseExpression("sed -i -e 's:IP{:I${:g' $pc")
+ self.assertExecs(set(["sed"]))
+
+
+ def test_until(self):
+ self.parseExpression("until false; do echo true; done")
+ self.assertExecs(set(["false", "echo"]))
+ self.assertReferences(set())
+
+ def test_case(self):
+ self.parseExpression("""
+case $foo in
+*)
+bar
+;;
+esac
+""")
+ self.assertExecs(set(["bar"]))
+ self.assertReferences(set())
+
+ def test_assign_exec(self):
+ self.parseExpression("a=b c='foo bar' alpha 1 2 3")
+ self.assertExecs(set(["alpha"]))
+
+ def test_redirect_to_file(self):
+ self.setEmptyVars(["foo"])
+ self.parseExpression("echo foo >${foo}/bar")
+ self.assertExecs(set(["echo"]))
+ self.assertReferences(set(["foo"]))
+
+ def test_heredoc(self):
+ self.setEmptyVars(["theta"])
+ self.parseExpression("""
+cat <<END
+alpha
+beta
+${theta}
+END
+""")
+ self.assertReferences(set(["theta"]))
+
+ def test_redirect_from_heredoc(self):
+ v = ["B", "SHADOW_MAILDIR", "SHADOW_MAILFILE", "SHADOW_UTMPDIR", "SHADOW_LOGDIR", "bindir"]
+ self.setEmptyVars(v)
+ self.parseExpression("""
+cat <<END >${B}/cachedpaths
+shadow_cv_maildir=${SHADOW_MAILDIR}
+shadow_cv_mailfile=${SHADOW_MAILFILE}
+shadow_cv_utmpdir=${SHADOW_UTMPDIR}
+shadow_cv_logdir=${SHADOW_LOGDIR}
+shadow_cv_passwd_dir=${bindir}
+END
+""")
+ self.assertReferences(set(v))
+ self.assertExecs(set(["cat"]))
+
+# def test_incomplete_command_expansion(self):
+# self.assertRaises(reftracker.ShellSyntaxError, reftracker.execs,
+# bbvalue.shparse("cp foo`", self.d), self.d)
+
+# def test_rogue_dollarsign(self):
+# self.setValues({"D" : "/tmp"})
+# self.parseExpression("install -d ${D}$")
+# self.assertReferences(set(["D"]))
+# self.assertExecs(set(["install"]))
+
+
+class PythonReferenceTest(ReferenceTest):
+
+ def setUp(self):
+ self.d = bb.data.init()
+ if hasattr(bb.utils, "_context"):
+ self.context = bb.utils._context
+ else:
+ import __builtin__
+ self.context = __builtin__.__dict__
+
+ def parseExpression(self, exp):
+ parsedvar = self.d.expandWithRefs(exp, None)
+ parser = bb.codeparser.PythonParser("ParserTest", logger)
+ parser.parse_python(parsedvar.value)
+
+ self.references = parsedvar.references | parser.references
+ self.execs = parser.execs
+
+ @staticmethod
+ def indent(value):
+        """Python snippets have to be indented, python values don't have to be.
+        These unit tests are testing snippets."""
+ return " " + value
+
+ def test_getvar_reference(self):
+ self.parseExpression("bb.data.getVar('foo', d, True)")
+ self.assertReferences(set(["foo"]))
+ self.assertExecs(set())
+
+ def test_getvar_computed_reference(self):
+ self.parseExpression("bb.data.getVar('f' + 'o' + 'o', d, True)")
+ self.assertReferences(set())
+ self.assertExecs(set())
+
+ def test_getvar_exec_reference(self):
+ self.parseExpression("eval('bb.data.getVar(\"foo\", d, True)')")
+ self.assertReferences(set())
+ self.assertExecs(set(["eval"]))
+
+ def test_var_reference(self):
+ self.context["foo"] = lambda x: x
+ self.setEmptyVars(["FOO"])
+ self.parseExpression("foo('${FOO}')")
+ self.assertReferences(set(["FOO"]))
+ self.assertExecs(set(["foo"]))
+ del self.context["foo"]
+
+ def test_var_exec(self):
+ for etype in ("func", "task"):
+ self.d.setVar("do_something", "echo 'hi mom! ${FOO}'")
+ self.d.setVarFlag("do_something", etype, True)
+ self.parseExpression("bb.build.exec_func('do_something', d)")
+ self.assertReferences(set([]))
+ self.assertExecs(set(["do_something"]))
+
+ def test_function_reference(self):
+ self.context["testfunc"] = lambda msg: bb.msg.note(1, None, msg)
+ self.d.setVar("FOO", "Hello, World!")
+ self.parseExpression("testfunc('${FOO}')")
+ self.assertReferences(set(["FOO"]))
+ self.assertExecs(set(["testfunc"]))
+ del self.context["testfunc"]
+
+ def test_qualified_function_reference(self):
+ self.parseExpression("time.time()")
+ self.assertExecs(set(["time.time"]))
+
+ def test_qualified_function_reference_2(self):
+ self.parseExpression("os.path.dirname('/foo/bar')")
+ self.assertExecs(set(["os.path.dirname"]))
+
+ def test_qualified_function_reference_nested(self):
+ self.parseExpression("time.strftime('%Y%m%d',time.gmtime())")
+ self.assertExecs(set(["time.strftime", "time.gmtime"]))
+
+ def test_function_reference_chained(self):
+ self.context["testget"] = lambda: "\tstrip me "
+ self.parseExpression("testget().strip()")
+ self.assertExecs(set(["testget"]))
+ del self.context["testget"]
+
+
+class DependencyReferenceTest(ReferenceTest):
+
+ pydata = """
+bb.data.getVar('somevar', d, True)
+def test(d):
+ foo = 'bar %s' % 'foo'
+def test2(d):
+ d.getVar(foo, True)
+ d.getVar('bar', False)
+ test2(d)
+
+def a():
+ \"\"\"some
+ stuff
+ \"\"\"
+ return "heh"
+
+test(d)
+
+bb.data.expand(bb.data.getVar("something", False, d), d)
+bb.data.expand("${inexpand} somethingelse", d)
+bb.data.getVar(a(), d, False)
+"""
+
+ def test_python(self):
+ self.d.setVar("FOO", self.pydata)
+ self.setEmptyVars(["inexpand", "a", "test2", "test"])
+ self.d.setVarFlags("FOO", {"func": True, "python": True})
+
+ deps, values = bb.data.build_dependencies("FOO", set(self.d.keys()), set(), set(), self.d)
+
+ self.assertEquals(deps, set(["somevar", "bar", "something", "inexpand", "test", "test2", "a"]))
+
+
+ shelldata = """
+foo () {
+bar
+}
+{
+echo baz
+$(heh)
+eval `moo`
+}
+a=b
+c=d
+(
+true && false
+test -f foo
+testval=something
+$testval
+) || aiee
+! inverted
+echo ${somevar}
+
+case foo in
+bar)
+echo bar
+;;
+baz)
+echo baz
+;;
+foo*)
+echo foo
+;;
+esac
+"""
+
+ def test_shell(self):
+ execs = ["bar", "echo", "heh", "moo", "true", "aiee"]
+ self.d.setVar("somevar", "heh")
+ self.d.setVar("inverted", "echo inverted...")
+ self.d.setVarFlag("inverted", "func", True)
+ self.d.setVar("FOO", self.shelldata)
+ self.d.setVarFlags("FOO", {"func": True})
+ self.setEmptyVars(execs)
+
+ deps, values = bb.data.build_dependencies("FOO", set(self.d.keys()), set(), set(), self.d)
+
+ self.assertEquals(deps, set(["somevar", "inverted"] + execs))
+
+
+ def test_vardeps(self):
+ self.d.setVar("oe_libinstall", "echo test")
+ self.d.setVar("FOO", "foo=oe_libinstall; eval $foo")
+ self.d.setVarFlag("FOO", "vardeps", "oe_libinstall")
+
+ deps, values = bb.data.build_dependencies("FOO", set(self.d.keys()), set(), set(), self.d)
+
+ self.assertEquals(deps, set(["oe_libinstall"]))
+
+ def test_vardeps_expand(self):
+ self.d.setVar("oe_libinstall", "echo test")
+ self.d.setVar("FOO", "foo=oe_libinstall; eval $foo")
+ self.d.setVarFlag("FOO", "vardeps", "${@'oe_libinstall'}")
+
+ deps, values = bb.data.build_dependencies("FOO", set(self.d.keys()), set(), set(), self.d)
+
+ self.assertEquals(deps, set(["oe_libinstall"]))
+
+ #Currently no wildcard support
+ #def test_vardeps_wildcards(self):
+ # self.d.setVar("oe_libinstall", "echo test")
+ # self.d.setVar("FOO", "foo=oe_libinstall; eval $foo")
+ # self.d.setVarFlag("FOO", "vardeps", "oe_*")
+ # self.assertEquals(deps, set(["oe_libinstall"]))
+
+
diff --git a/bitbake/lib/bb/tests/cow.py b/bitbake/lib/bb/tests/cow.py
new file mode 100644
index 0000000..35c5841
--- /dev/null
+++ b/bitbake/lib/bb/tests/cow.py
@@ -0,0 +1,136 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Tests for Copy-on-Write (cow.py)
+#
+# Copyright 2006 Holger Freyther <freyther@handhelds.org>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+
+import unittest
+import os
+
+class COWTestCase(unittest.TestCase):
+ """
+ Test case for the COW module from mithro
+ """
+
+ def testGetSet(self):
+ """
+        Test get and set
+ """
+ from bb.COW import COWDictBase
+ a = COWDictBase.copy()
+
+ self.assertEquals(False, a.has_key('a'))
+
+ a['a'] = 'a'
+ a['b'] = 'b'
+ self.assertEquals(True, a.has_key('a'))
+ self.assertEquals(True, a.has_key('b'))
+ self.assertEquals('a', a['a'] )
+ self.assertEquals('b', a['b'] )
+
+ def testCopyCopy(self):
+ """
+ Test the copy of copies
+ """
+
+ from bb.COW import COWDictBase
+
+ # create two COW dict 'instances'
+ b = COWDictBase.copy()
+ c = COWDictBase.copy()
+
+ # assign some keys to one instance, some keys to another
+ b['a'] = 10
+ b['c'] = 20
+ c['a'] = 30
+
+ # test separation of the two instances
+ self.assertEquals(False, c.has_key('c'))
+ self.assertEquals(30, c['a'])
+ self.assertEquals(10, b['a'])
+
+ # test copy
+ b_2 = b.copy()
+ c_2 = c.copy()
+
+ self.assertEquals(False, c_2.has_key('c'))
+ self.assertEquals(10, b_2['a'])
+
+ b_2['d'] = 40
+ self.assertEquals(False, c_2.has_key('d'))
+ self.assertEquals(True, b_2.has_key('d'))
+ self.assertEquals(40, b_2['d'])
+ self.assertEquals(False, b.has_key('d'))
+ self.assertEquals(False, c.has_key('d'))
+
+ c_2['d'] = 30
+ self.assertEquals(True, c_2.has_key('d'))
+ self.assertEquals(True, b_2.has_key('d'))
+ self.assertEquals(30, c_2['d'])
+ self.assertEquals(40, b_2['d'])
+ self.assertEquals(False, b.has_key('d'))
+ self.assertEquals(False, c.has_key('d'))
+
+ # test copy of the copy
+ c_3 = c_2.copy()
+ b_3 = b_2.copy()
+ b_3_2 = b_2.copy()
+
+ c_3['e'] = 4711
+ self.assertEquals(4711, c_3['e'])
+ self.assertEquals(False, c_2.has_key('e'))
+ self.assertEquals(False, b_3.has_key('e'))
+ self.assertEquals(False, b_3_2.has_key('e'))
+ self.assertEquals(False, b_2.has_key('e'))
+
+ b_3['e'] = 'viel'
+ self.assertEquals('viel', b_3['e'])
+ self.assertEquals(4711, c_3['e'])
+ self.assertEquals(False, c_2.has_key('e'))
+ self.assertEquals(True, b_3.has_key('e'))
+ self.assertEquals(False, b_3_2.has_key('e'))
+ self.assertEquals(False, b_2.has_key('e'))
+
+ def testCow(self):
+ from bb.COW import COWDictBase
+ c = COWDictBase.copy()
+ c['123'] = 1027
+ c['other'] = 4711
+ c['d'] = { 'abc' : 10, 'bcd' : 20 }
+
+ copy = c.copy()
+
+ self.assertEquals(1027, c['123'])
+ self.assertEquals(4711, c['other'])
+ self.assertEquals({'abc':10, 'bcd':20}, c['d'])
+ self.assertEquals(1027, copy['123'])
+ self.assertEquals(4711, copy['other'])
+ self.assertEquals({'abc':10, 'bcd':20}, copy['d'])
+
+ # cow it now
+ copy['123'] = 1028
+ copy['other'] = 4712
+ copy['d']['abc'] = 20
+
+
+ self.assertEquals(1027, c['123'])
+ self.assertEquals(4711, c['other'])
+ self.assertEquals({'abc':10, 'bcd':20}, c['d'])
+ self.assertEquals(1028, copy['123'])
+ self.assertEquals(4712, copy['other'])
+ self.assertEquals({'abc':20, 'bcd':20}, copy['d'])
diff --git a/bitbake/lib/bb/tests/data.py b/bitbake/lib/bb/tests/data.py
new file mode 100644
index 0000000..e9aab57
--- /dev/null
+++ b/bitbake/lib/bb/tests/data.py
@@ -0,0 +1,441 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Tests for the Data Store (data.py/data_smart.py)
+#
+# Copyright (C) 2010 Chris Larson
+# Copyright (C) 2012 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+
+import unittest
+import bb
+import bb.data
+import bb.parse
+import logging
+
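+# Context manager that captures records emitted by the "BitBake" logger while
+# the with-block runs, so tests can assert on the logged messages.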
+class LogRecord():
+ def __enter__(self):
+ logs = []
+ class LogHandler(logging.Handler):
+ def emit(self, record):
+ logs.append(record)
+ logger = logging.getLogger("BitBake")
+ handler = LogHandler()
+ self.handler = handler
+ logger.addHandler(handler)
+ return logs
+ def __exit__(self, type, value, traceback):
+ logger = logging.getLogger("BitBake")
+ logger.removeHandler(self.handler)
+ return
+
+def logContains(item, logs):
+ for l in logs:
+ m = l.getMessage()
+ if item in m:
+ return True
+ return False
+
+class DataExpansions(unittest.TestCase):
+ def setUp(self):
+ self.d = bb.data.init()
+ self.d["foo"] = "value_of_foo"
+ self.d["bar"] = "value_of_bar"
+ self.d["value_of_foo"] = "value_of_'value_of_foo'"
+
+ def test_one_var(self):
+ val = self.d.expand("${foo}")
+ self.assertEqual(str(val), "value_of_foo")
+
+ def test_indirect_one_var(self):
+ val = self.d.expand("${${foo}}")
+ self.assertEqual(str(val), "value_of_'value_of_foo'")
+
+ def test_indirect_and_another(self):
+ val = self.d.expand("${${foo}} ${bar}")
+ self.assertEqual(str(val), "value_of_'value_of_foo' value_of_bar")
+
+ def test_python_snippet(self):
+ val = self.d.expand("${@5*12}")
+ self.assertEqual(str(val), "60")
+
+ def test_expand_in_python_snippet(self):
+ val = self.d.expand("${@'boo ' + '${foo}'}")
+ self.assertEqual(str(val), "boo value_of_foo")
+
+ def test_python_snippet_getvar(self):
+ val = self.d.expand("${@d.getVar('foo', True) + ' ${bar}'}")
+ self.assertEqual(str(val), "value_of_foo value_of_bar")
+
+ def test_python_snippet_syntax_error(self):
+ self.d.setVar("FOO", "${@foo = 5}")
+ self.assertRaises(bb.data_smart.ExpansionError, self.d.getVar, "FOO", True)
+
+ def test_python_snippet_runtime_error(self):
+ self.d.setVar("FOO", "${@int('test')}")
+ self.assertRaises(bb.data_smart.ExpansionError, self.d.getVar, "FOO", True)
+
+ def test_python_snippet_error_path(self):
+ self.d.setVar("FOO", "foo value ${BAR}")
+ self.d.setVar("BAR", "bar value ${@int('test')}")
+ self.assertRaises(bb.data_smart.ExpansionError, self.d.getVar, "FOO", True)
+
+ def test_value_containing_value(self):
+ val = self.d.expand("${@d.getVar('foo', True) + ' ${bar}'}")
+ self.assertEqual(str(val), "value_of_foo value_of_bar")
+
+ def test_reference_undefined_var(self):
+ val = self.d.expand("${undefinedvar} meh")
+ self.assertEqual(str(val), "${undefinedvar} meh")
+
+ def test_double_reference(self):
+ self.d.setVar("BAR", "bar value")
+ self.d.setVar("FOO", "${BAR} foo ${BAR}")
+ val = self.d.getVar("FOO", True)
+ self.assertEqual(str(val), "bar value foo bar value")
+
+ def test_direct_recursion(self):
+ self.d.setVar("FOO", "${FOO}")
+ self.assertRaises(bb.data_smart.ExpansionError, self.d.getVar, "FOO", True)
+
+ def test_indirect_recursion(self):
+ self.d.setVar("FOO", "${BAR}")
+ self.d.setVar("BAR", "${BAZ}")
+ self.d.setVar("BAZ", "${FOO}")
+ self.assertRaises(bb.data_smart.ExpansionError, self.d.getVar, "FOO", True)
+
+ def test_recursion_exception(self):
+ self.d.setVar("FOO", "${BAR}")
+ self.d.setVar("BAR", "${${@'FOO'}}")
+ self.assertRaises(bb.data_smart.ExpansionError, self.d.getVar, "FOO", True)
+
+ def test_incomplete_varexp_single_quotes(self):
+ self.d.setVar("FOO", "sed -i -e 's:IP{:I${:g' $pc")
+ val = self.d.getVar("FOO", True)
+ self.assertEqual(str(val), "sed -i -e 's:IP{:I${:g' $pc")
+
+ def test_nonstring(self):
+ self.d.setVar("TEST", 5)
+ val = self.d.getVar("TEST", True)
+ self.assertEqual(str(val), "5")
+
+ def test_rename(self):
+ self.d.renameVar("foo", "newfoo")
+ self.assertEqual(self.d.getVar("newfoo", False), "value_of_foo")
+ self.assertEqual(self.d.getVar("foo", False), None)
+
+ def test_deletion(self):
+ self.d.delVar("foo")
+ self.assertEqual(self.d.getVar("foo", False), None)
+
+ def test_keys(self):
+ keys = self.d.keys()
+ self.assertEqual(keys, ['value_of_foo', 'foo', 'bar'])
+
+ def test_keys_deletion(self):
+ newd = bb.data.createCopy(self.d)
+ newd.delVar("bar")
+ keys = newd.keys()
+ self.assertEqual(keys, ['value_of_foo', 'foo'])
+
+class TestNestedExpansions(unittest.TestCase):
+ def setUp(self):
+ self.d = bb.data.init()
+ self.d["foo"] = "foo"
+ self.d["bar"] = "bar"
+ self.d["value_of_foobar"] = "187"
+
+ def test_refs(self):
+ val = self.d.expand("${value_of_${foo}${bar}}")
+ self.assertEqual(str(val), "187")
+
+ #def test_python_refs(self):
+ # val = self.d.expand("${@${@3}**2 + ${@4}**2}")
+ # self.assertEqual(str(val), "25")
+
+ def test_ref_in_python_ref(self):
+ val = self.d.expand("${@'${foo}' + 'bar'}")
+ self.assertEqual(str(val), "foobar")
+
+ def test_python_ref_in_ref(self):
+ val = self.d.expand("${${@'f'+'o'+'o'}}")
+ self.assertEqual(str(val), "foo")
+
+ def test_deep_nesting(self):
+ depth = 100
+ val = self.d.expand("${" * depth + "foo" + "}" * depth)
+ self.assertEqual(str(val), "foo")
+
+ #def test_deep_python_nesting(self):
+ # depth = 50
+ # val = self.d.expand("${@" * depth + "1" + "+1}" * depth)
+ # self.assertEqual(str(val), str(depth + 1))
+
+ def test_mixed(self):
+ val = self.d.expand("${value_of_${@('${foo}'+'bar')[0:3]}${${@'BAR'.lower()}}}")
+ self.assertEqual(str(val), "187")
+
+ def test_runtime(self):
+ val = self.d.expand("${${@'value_of' + '_f'+'o'+'o'+'b'+'a'+'r'}}")
+ self.assertEqual(str(val), "187")
+
+class TestMemoize(unittest.TestCase):
+ def test_memoized(self):
+ d = bb.data.init()
+ d.setVar("FOO", "bar")
+ self.assertTrue(d.getVar("FOO", False) is d.getVar("FOO", False))
+
+ def test_not_memoized(self):
+ d1 = bb.data.init()
+ d2 = bb.data.init()
+ d1.setVar("FOO", "bar")
+ d2.setVar("FOO", "bar2")
+ self.assertTrue(d1.getVar("FOO", False) is not d2.getVar("FOO", False))
+
+ def test_changed_after_memoized(self):
+ d = bb.data.init()
+ d.setVar("foo", "value of foo")
+ self.assertEqual(str(d.getVar("foo", False)), "value of foo")
+ d.setVar("foo", "second value of foo")
+ self.assertEqual(str(d.getVar("foo", False)), "second value of foo")
+
+ def test_same_value(self):
+ d = bb.data.init()
+ d.setVar("foo", "value of")
+ d.setVar("bar", "value of")
+ self.assertEqual(d.getVar("foo", False),
+ d.getVar("bar", False))
+
+class TestConcat(unittest.TestCase):
+ def setUp(self):
+ self.d = bb.data.init()
+ self.d.setVar("FOO", "foo")
+ self.d.setVar("VAL", "val")
+ self.d.setVar("BAR", "bar")
+
+ def test_prepend(self):
+ self.d.setVar("TEST", "${VAL}")
+ self.d.prependVar("TEST", "${FOO}:")
+ self.assertEqual(self.d.getVar("TEST", True), "foo:val")
+
+ def test_append(self):
+ self.d.setVar("TEST", "${VAL}")
+ self.d.appendVar("TEST", ":${BAR}")
+ self.assertEqual(self.d.getVar("TEST", True), "val:bar")
+
+ def test_multiple_append(self):
+ self.d.setVar("TEST", "${VAL}")
+ self.d.prependVar("TEST", "${FOO}:")
+ self.d.appendVar("TEST", ":val2")
+ self.d.appendVar("TEST", ":${BAR}")
+ self.assertEqual(self.d.getVar("TEST", True), "foo:val:val2:bar")
+
+class TestConcatOverride(unittest.TestCase):
+ def setUp(self):
+ self.d = bb.data.init()
+ self.d.setVar("FOO", "foo")
+ self.d.setVar("VAL", "val")
+ self.d.setVar("BAR", "bar")
+
+ def test_prepend(self):
+ self.d.setVar("TEST", "${VAL}")
+ self.d.setVar("TEST_prepend", "${FOO}:")
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST", True), "foo:val")
+
+ def test_append(self):
+ self.d.setVar("TEST", "${VAL}")
+ self.d.setVar("TEST_append", ":${BAR}")
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST", True), "val:bar")
+
+ def test_multiple_append(self):
+ self.d.setVar("TEST", "${VAL}")
+ self.d.setVar("TEST_prepend", "${FOO}:")
+ self.d.setVar("TEST_append", ":val2")
+ self.d.setVar("TEST_append", ":${BAR}")
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST", True), "foo:val:val2:bar")
+
+ def test_append_unset(self):
+ self.d.setVar("TEST_prepend", "${FOO}:")
+ self.d.setVar("TEST_append", ":val2")
+ self.d.setVar("TEST_append", ":${BAR}")
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST", True), "foo::val2:bar")
+
+ def test_remove(self):
+ self.d.setVar("TEST", "${VAL} ${BAR}")
+ self.d.setVar("TEST_remove", "val")
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST", True), "bar")
+
+ def test_doubleref_remove(self):
+ self.d.setVar("TEST", "${VAL} ${BAR}")
+ self.d.setVar("TEST_remove", "val")
+ self.d.setVar("TEST_TEST", "${TEST} ${TEST}")
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST_TEST", True), "bar bar")
+
+ def test_empty_remove(self):
+ self.d.setVar("TEST", "")
+ self.d.setVar("TEST_remove", "val")
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST", True), "")
+
+ def test_remove_expansion(self):
+ self.d.setVar("BAR", "Z")
+ self.d.setVar("TEST", "${BAR}/X Y")
+ self.d.setVar("TEST_remove", "${BAR}/X")
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST", True), "Y")
+
+ def test_remove_expansion_items(self):
+ self.d.setVar("TEST", "A B C D")
+ self.d.setVar("BAR", "B D")
+ self.d.setVar("TEST_remove", "${BAR}")
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST", True), "A C")
+
+class TestOverrides(unittest.TestCase):
+ def setUp(self):
+ self.d = bb.data.init()
+ self.d.setVar("OVERRIDES", "foo:bar:local")
+ self.d.setVar("TEST", "testvalue")
+
+ def test_no_override(self):
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST", True), "testvalue")
+
+ def test_one_override(self):
+ self.d.setVar("TEST_bar", "testvalue2")
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST", True), "testvalue2")
+
+ def test_one_override_unset(self):
+ self.d.setVar("TEST2_bar", "testvalue2")
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST2", True), "testvalue2")
+ self.assertItemsEqual(self.d.keys(), ['TEST', 'TEST2', 'OVERRIDES', 'TEST2_bar'])
+
+ def test_multiple_override(self):
+ self.d.setVar("TEST_bar", "testvalue2")
+ self.d.setVar("TEST_local", "testvalue3")
+ self.d.setVar("TEST_foo", "testvalue4")
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST", True), "testvalue3")
+ self.assertItemsEqual(self.d.keys(), ['TEST', 'TEST_foo', 'OVERRIDES', 'TEST_bar', 'TEST_local'])
+
+ def test_multiple_combined_overrides(self):
+ self.d.setVar("TEST_local_foo_bar", "testvalue3")
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST", True), "testvalue3")
+
+ def test_multiple_overrides_unset(self):
+ self.d.setVar("TEST2_local_foo_bar", "testvalue3")
+ bb.data.update_data(self.d)
+ self.assertEqual(self.d.getVar("TEST2", True), "testvalue3")
+
+ def test_keyexpansion_override(self):
+ self.d.setVar("LOCAL", "local")
+ self.d.setVar("TEST_bar", "testvalue2")
+ self.d.setVar("TEST_${LOCAL}", "testvalue3")
+ self.d.setVar("TEST_foo", "testvalue4")
+ bb.data.update_data(self.d)
+ bb.data.expandKeys(self.d)
+ self.assertEqual(self.d.getVar("TEST", True), "testvalue3")
+
+ def test_rename_override(self):
+ self.d.setVar("ALTERNATIVE_ncurses-tools_class-target", "a")
+ self.d.setVar("OVERRIDES", "class-target")
+ bb.data.update_data(self.d)
+ self.d.renameVar("ALTERNATIVE_ncurses-tools", "ALTERNATIVE_lib32-ncurses-tools")
+ self.assertEqual(self.d.getVar("ALTERNATIVE_lib32-ncurses-tools", True), "a")
+
+ def test_underscore_override(self):
+ self.d.setVar("TEST_bar", "testvalue2")
+ self.d.setVar("TEST_some_val", "testvalue3")
+ self.d.setVar("TEST_foo", "testvalue4")
+ self.d.setVar("OVERRIDES", "foo:bar:some_val")
+ self.assertEqual(self.d.getVar("TEST", True), "testvalue3")
+
+class TestKeyExpansion(unittest.TestCase):
+ def setUp(self):
+ self.d = bb.data.init()
+ self.d.setVar("FOO", "foo")
+ self.d.setVar("BAR", "foo")
+
+ def test_keyexpand(self):
+ self.d.setVar("VAL_${FOO}", "A")
+ self.d.setVar("VAL_${BAR}", "B")
+ with LogRecord() as logs:
+ bb.data.expandKeys(self.d)
+ self.assertTrue(logContains("Variable key VAL_${FOO} (A) replaces original key VAL_foo (B)", logs))
+ self.assertEqual(self.d.getVar("VAL_foo", True), "A")
+
+class TestFlags(unittest.TestCase):
+ def setUp(self):
+ self.d = bb.data.init()
+ self.d.setVar("foo", "value of foo")
+ self.d.setVarFlag("foo", "flag1", "value of flag1")
+ self.d.setVarFlag("foo", "flag2", "value of flag2")
+
+ def test_setflag(self):
+ self.assertEqual(self.d.getVarFlag("foo", "flag1"), "value of flag1")
+ self.assertEqual(self.d.getVarFlag("foo", "flag2"), "value of flag2")
+
+ def test_delflag(self):
+ self.d.delVarFlag("foo", "flag2")
+ self.assertEqual(self.d.getVarFlag("foo", "flag1"), "value of flag1")
+ self.assertEqual(self.d.getVarFlag("foo", "flag2"), None)
+
+
+class Contains(unittest.TestCase):
+ def setUp(self):
+ self.d = bb.data.init()
+ self.d.setVar("SOMEFLAG", "a b c")
+
+ def test_contains(self):
+ self.assertTrue(bb.utils.contains("SOMEFLAG", "a", True, False, self.d))
+ self.assertTrue(bb.utils.contains("SOMEFLAG", "b", True, False, self.d))
+ self.assertTrue(bb.utils.contains("SOMEFLAG", "c", True, False, self.d))
+
+ self.assertTrue(bb.utils.contains("SOMEFLAG", "a b", True, False, self.d))
+ self.assertTrue(bb.utils.contains("SOMEFLAG", "b c", True, False, self.d))
+ self.assertTrue(bb.utils.contains("SOMEFLAG", "c a", True, False, self.d))
+
+ self.assertTrue(bb.utils.contains("SOMEFLAG", "a b c", True, False, self.d))
+ self.assertTrue(bb.utils.contains("SOMEFLAG", "c b a", True, False, self.d))
+
+ self.assertFalse(bb.utils.contains("SOMEFLAG", "x", True, False, self.d))
+ self.assertFalse(bb.utils.contains("SOMEFLAG", "a x", True, False, self.d))
+ self.assertFalse(bb.utils.contains("SOMEFLAG", "x c b", True, False, self.d))
+ self.assertFalse(bb.utils.contains("SOMEFLAG", "x c b a", True, False, self.d))
+
+ def test_contains_any(self):
+ self.assertTrue(bb.utils.contains_any("SOMEFLAG", "a", True, False, self.d))
+ self.assertTrue(bb.utils.contains_any("SOMEFLAG", "b", True, False, self.d))
+ self.assertTrue(bb.utils.contains_any("SOMEFLAG", "c", True, False, self.d))
+
+ self.assertTrue(bb.utils.contains_any("SOMEFLAG", "a b", True, False, self.d))
+ self.assertTrue(bb.utils.contains_any("SOMEFLAG", "b c", True, False, self.d))
+ self.assertTrue(bb.utils.contains_any("SOMEFLAG", "c a", True, False, self.d))
+
+ self.assertTrue(bb.utils.contains_any("SOMEFLAG", "a x", True, False, self.d))
+ self.assertTrue(bb.utils.contains_any("SOMEFLAG", "x c", True, False, self.d))
+
+ self.assertFalse(bb.utils.contains_any("SOMEFLAG", "x", True, False, self.d))
+ self.assertFalse(bb.utils.contains_any("SOMEFLAG", "x y z", True, False, self.d))
diff --git a/bitbake/lib/bb/tests/fetch.py b/bitbake/lib/bb/tests/fetch.py
new file mode 100644
index 0000000..1e61f3a
--- /dev/null
+++ b/bitbake/lib/bb/tests/fetch.py
@@ -0,0 +1,759 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Tests for the Fetcher (fetch2/)
+#
+# Copyright (C) 2012 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+
+import unittest
+import tempfile
+import subprocess
+import os
+from bb.fetch2 import URI
+from bb.fetch2 import FetchMethod
+import bb
+
+class URITest(unittest.TestCase):
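+    # Maps each input URI string to the attribute values expected after parsing
+    # it with bb.fetch2.URI (exercised by test_uri below).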
+ test_uris = {
+ "http://www.google.com/index.html" : {
+ 'uri': 'http://www.google.com/index.html',
+ 'scheme': 'http',
+ 'hostname': 'www.google.com',
+ 'port': None,
+ 'hostport': 'www.google.com',
+ 'path': '/index.html',
+ 'userinfo': '',
+ 'username': '',
+ 'password': '',
+ 'params': {},
+ 'query': {},
+ 'relative': False
+ },
+ "http://www.google.com/index.html;param1=value1" : {
+ 'uri': 'http://www.google.com/index.html;param1=value1',
+ 'scheme': 'http',
+ 'hostname': 'www.google.com',
+ 'port': None,
+ 'hostport': 'www.google.com',
+ 'path': '/index.html',
+ 'userinfo': '',
+ 'username': '',
+ 'password': '',
+ 'params': {
+ 'param1': 'value1'
+ },
+ 'query': {},
+ 'relative': False
+ },
+ "http://www.example.org/index.html?param1=value1" : {
+ 'uri': 'http://www.example.org/index.html?param1=value1',
+ 'scheme': 'http',
+ 'hostname': 'www.example.org',
+ 'port': None,
+ 'hostport': 'www.example.org',
+ 'path': '/index.html',
+ 'userinfo': '',
+ 'username': '',
+ 'password': '',
+ 'params': {},
+ 'query': {
+ 'param1': 'value1'
+ },
+ 'relative': False
+ },
+ "http://www.example.org/index.html?qparam1=qvalue1;param2=value2" : {
+ 'uri': 'http://www.example.org/index.html?qparam1=qvalue1;param2=value2',
+ 'scheme': 'http',
+ 'hostname': 'www.example.org',
+ 'port': None,
+ 'hostport': 'www.example.org',
+ 'path': '/index.html',
+ 'userinfo': '',
+ 'username': '',
+ 'password': '',
+ 'params': {
+ 'param2': 'value2'
+ },
+ 'query': {
+ 'qparam1': 'qvalue1'
+ },
+ 'relative': False
+ },
+ "http://www.example.com:8080/index.html" : {
+ 'uri': 'http://www.example.com:8080/index.html',
+ 'scheme': 'http',
+ 'hostname': 'www.example.com',
+ 'port': 8080,
+ 'hostport': 'www.example.com:8080',
+ 'path': '/index.html',
+ 'userinfo': '',
+ 'username': '',
+ 'password': '',
+ 'params': {},
+ 'query': {},
+ 'relative': False
+ },
+ "cvs://anoncvs@cvs.handhelds.org/cvs;module=familiar/dist/ipkg" : {
+ 'uri': 'cvs://anoncvs@cvs.handhelds.org/cvs;module=familiar/dist/ipkg',
+ 'scheme': 'cvs',
+ 'hostname': 'cvs.handhelds.org',
+ 'port': None,
+ 'hostport': 'cvs.handhelds.org',
+ 'path': '/cvs',
+ 'userinfo': 'anoncvs',
+ 'username': 'anoncvs',
+ 'password': '',
+ 'params': {
+ 'module': 'familiar/dist/ipkg'
+ },
+ 'query': {},
+ 'relative': False
+ },
+ "cvs://anoncvs:anonymous@cvs.handhelds.org/cvs;tag=V0-99-81;module=familiar/dist/ipkg": {
+ 'uri': 'cvs://anoncvs:anonymous@cvs.handhelds.org/cvs;tag=V0-99-81;module=familiar/dist/ipkg',
+ 'scheme': 'cvs',
+ 'hostname': 'cvs.handhelds.org',
+ 'port': None,
+ 'hostport': 'cvs.handhelds.org',
+ 'path': '/cvs',
+ 'userinfo': 'anoncvs:anonymous',
+ 'username': 'anoncvs',
+ 'password': 'anonymous',
+ 'params': {
+ 'tag': 'V0-99-81',
+ 'module': 'familiar/dist/ipkg'
+ },
+ 'query': {},
+ 'relative': False
+ },
+ "file://example.diff": { # NOTE: Not RFC compliant!
+ 'uri': 'file:example.diff',
+ 'scheme': 'file',
+ 'hostname': '',
+ 'port': None,
+ 'hostport': '',
+ 'path': 'example.diff',
+ 'userinfo': '',
+ 'username': '',
+ 'password': '',
+ 'params': {},
+ 'query': {},
+ 'relative': True
+ },
+ "file:example.diff": { # NOTE: RFC compliant version of the former
+ 'uri': 'file:example.diff',
+ 'scheme': 'file',
+ 'hostname': '',
+ 'port': None,
+ 'hostport': '',
+ 'path': 'example.diff',
+ 'userinfo': '',
+ 'username': '',
+ 'password': '',
+ 'params': {},
+ 'query': {},
+ 'relative': True
+ },
+ "file:///tmp/example.diff": {
+ 'uri': 'file:///tmp/example.diff',
+ 'scheme': 'file',
+ 'hostname': '',
+ 'port': None,
+ 'hostport': '',
+ 'path': '/tmp/example.diff',
+ 'userinfo': '',
+ 'username': '',
+ 'password': '',
+ 'params': {},
+ 'query': {},
+ 'relative': False
+ },
+ "git:///path/example.git": {
+ 'uri': 'git:///path/example.git',
+ 'scheme': 'git',
+ 'hostname': '',
+ 'port': None,
+ 'hostport': '',
+ 'path': '/path/example.git',
+ 'userinfo': '',
+ 'username': '',
+ 'password': '',
+ 'params': {},
+ 'query': {},
+ 'relative': False
+ },
+ "git:path/example.git": {
+ 'uri': 'git:path/example.git',
+ 'scheme': 'git',
+ 'hostname': '',
+ 'port': None,
+ 'hostport': '',
+ 'path': 'path/example.git',
+ 'userinfo': '',
+ 'username': '',
+ 'password': '',
+ 'params': {},
+ 'query': {},
+ 'relative': True
+ },
+ "git://example.net/path/example.git": {
+ 'uri': 'git://example.net/path/example.git',
+ 'scheme': 'git',
+ 'hostname': 'example.net',
+ 'port': None,
+ 'hostport': 'example.net',
+ 'path': '/path/example.git',
+ 'userinfo': '',
+ 'username': '',
+ 'password': '',
+ 'params': {},
+ 'query': {},
+ 'relative': False
+ }
+ }
+
+ def test_uri(self):
+ for test_uri, ref in self.test_uris.items():
+ uri = URI(test_uri)
+
+ self.assertEqual(str(uri), ref['uri'])
+
+ # expected attributes
+ self.assertEqual(uri.scheme, ref['scheme'])
+
+ self.assertEqual(uri.userinfo, ref['userinfo'])
+ self.assertEqual(uri.username, ref['username'])
+ self.assertEqual(uri.password, ref['password'])
+
+ self.assertEqual(uri.hostname, ref['hostname'])
+ self.assertEqual(uri.port, ref['port'])
+ self.assertEqual(uri.hostport, ref['hostport'])
+
+ self.assertEqual(uri.path, ref['path'])
+ self.assertEqual(uri.params, ref['params'])
+
+ self.assertEqual(uri.relative, ref['relative'])
+
+ def test_dict(self):
+ for test in self.test_uris.values():
+ uri = URI()
+
+ self.assertEqual(uri.scheme, '')
+ self.assertEqual(uri.userinfo, '')
+ self.assertEqual(uri.username, '')
+ self.assertEqual(uri.password, '')
+ self.assertEqual(uri.hostname, '')
+ self.assertEqual(uri.port, None)
+ self.assertEqual(uri.path, '')
+ self.assertEqual(uri.params, {})
+
+
+ uri.scheme = test['scheme']
+ self.assertEqual(uri.scheme, test['scheme'])
+
+ uri.userinfo = test['userinfo']
+ self.assertEqual(uri.userinfo, test['userinfo'])
+ self.assertEqual(uri.username, test['username'])
+ self.assertEqual(uri.password, test['password'])
+
+ # make sure changing the values doesn't do anything unexpected
+ uri.username = 'changeme'
+ self.assertEqual(uri.username, 'changeme')
+ self.assertEqual(uri.password, test['password'])
+ uri.password = 'insecure'
+ self.assertEqual(uri.username, 'changeme')
+ self.assertEqual(uri.password, 'insecure')
+
+ # reset back after our trickery
+ uri.userinfo = test['userinfo']
+ self.assertEqual(uri.userinfo, test['userinfo'])
+ self.assertEqual(uri.username, test['username'])
+ self.assertEqual(uri.password, test['password'])
+
+ uri.hostname = test['hostname']
+ self.assertEqual(uri.hostname, test['hostname'])
+ self.assertEqual(uri.hostport, test['hostname'])
+
+ uri.port = test['port']
+ self.assertEqual(uri.port, test['port'])
+ self.assertEqual(uri.hostport, test['hostport'])
+
+ uri.path = test['path']
+ self.assertEqual(uri.path, test['path'])
+
+ uri.params = test['params']
+ self.assertEqual(uri.params, test['params'])
+
+ uri.query = test['query']
+ self.assertEqual(uri.query, test['query'])
+
+ self.assertEqual(str(uri), test['uri'])
+
+ uri.params = {}
+ self.assertEqual(uri.params, {})
+ self.assertEqual(str(uri), (str(uri).split(";"))[0])
+
+class FetcherTest(unittest.TestCase):
+
+ def setUp(self):
+ self.origdir = os.getcwd()
+ self.d = bb.data.init()
+ self.tempdir = tempfile.mkdtemp()
+ self.dldir = os.path.join(self.tempdir, "download")
+ os.mkdir(self.dldir)
+ self.d.setVar("DL_DIR", self.dldir)
+ self.unpackdir = os.path.join(self.tempdir, "unpacked")
+ os.mkdir(self.unpackdir)
+ persistdir = os.path.join(self.tempdir, "persistdata")
+ self.d.setVar("PERSISTENT_DIR", persistdir)
+
+ def tearDown(self):
+ os.chdir(self.origdir)
+ bb.utils.prunedir(self.tempdir)
+
+class MirrorUriTest(FetcherTest):
+
+ replaceuris = {
+ ("git://git.invalid.infradead.org/mtd-utils.git;tag=1234567890123456789012345678901234567890", "git://.*/.*", "http://somewhere.org/somedir/")
+ : "http://somewhere.org/somedir/git2_git.invalid.infradead.org.mtd-utils.git.tar.gz",
+ ("git://git.invalid.infradead.org/mtd-utils.git;tag=1234567890123456789012345678901234567890", "git://.*/([^/]+/)*([^/]*)", "git://somewhere.org/somedir/\\2;protocol=http")
+ : "git://somewhere.org/somedir/mtd-utils.git;tag=1234567890123456789012345678901234567890;protocol=http",
+ ("git://git.invalid.infradead.org/foo/mtd-utils.git;tag=1234567890123456789012345678901234567890", "git://.*/([^/]+/)*([^/]*)", "git://somewhere.org/somedir/\\2;protocol=http")
+ : "git://somewhere.org/somedir/mtd-utils.git;tag=1234567890123456789012345678901234567890;protocol=http",
+ ("git://git.invalid.infradead.org/foo/mtd-utils.git;tag=1234567890123456789012345678901234567890", "git://.*/([^/]+/)*([^/]*)", "git://somewhere.org/\\2;protocol=http")
+ : "git://somewhere.org/mtd-utils.git;tag=1234567890123456789012345678901234567890;protocol=http",
+ ("git://someserver.org/bitbake;tag=1234567890123456789012345678901234567890", "git://someserver.org/bitbake", "git://git.openembedded.org/bitbake")
+ : "git://git.openembedded.org/bitbake;tag=1234567890123456789012345678901234567890",
+ ("file://sstate-xyz.tgz", "file://.*", "file:///somewhere/1234/sstate-cache")
+ : "file:///somewhere/1234/sstate-cache/sstate-xyz.tgz",
+ ("file://sstate-xyz.tgz", "file://.*", "file:///somewhere/1234/sstate-cache/")
+ : "file:///somewhere/1234/sstate-cache/sstate-xyz.tgz",
+ ("http://somewhere.org/somedir1/somedir2/somefile_1.2.3.tar.gz", "http://.*/.*", "http://somewhere2.org/somedir3")
+ : "http://somewhere2.org/somedir3/somefile_1.2.3.tar.gz",
+ ("http://somewhere.org/somedir1/somefile_1.2.3.tar.gz", "http://somewhere.org/somedir1/somefile_1.2.3.tar.gz", "http://somewhere2.org/somedir3/somefile_1.2.3.tar.gz")
+ : "http://somewhere2.org/somedir3/somefile_1.2.3.tar.gz",
+ ("http://www.apache.org/dist/subversion/subversion-1.7.1.tar.bz2", "http://www.apache.org/dist", "http://archive.apache.org/dist")
+ : "http://archive.apache.org/dist/subversion/subversion-1.7.1.tar.bz2",
+ ("http://www.apache.org/dist/subversion/subversion-1.7.1.tar.bz2", "http://.*/.*", "file:///somepath/downloads/")
+ : "file:///somepath/downloads/subversion-1.7.1.tar.bz2",
+ ("git://git.invalid.infradead.org/mtd-utils.git;tag=1234567890123456789012345678901234567890", "git://.*/.*", "git://somewhere.org/somedir/BASENAME;protocol=http")
+ : "git://somewhere.org/somedir/mtd-utils.git;tag=1234567890123456789012345678901234567890;protocol=http",
+ ("git://git.invalid.infradead.org/foo/mtd-utils.git;tag=1234567890123456789012345678901234567890", "git://.*/.*", "git://somewhere.org/somedir/BASENAME;protocol=http")
+ : "git://somewhere.org/somedir/mtd-utils.git;tag=1234567890123456789012345678901234567890;protocol=http",
+ ("git://git.invalid.infradead.org/foo/mtd-utils.git;tag=1234567890123456789012345678901234567890", "git://.*/.*", "git://somewhere.org/somedir/MIRRORNAME;protocol=http")
+ : "git://somewhere.org/somedir/git.invalid.infradead.org.foo.mtd-utils.git;tag=1234567890123456789012345678901234567890;protocol=http",
+
+ #Renaming files doesn't work
+ #("http://somewhere.org/somedir1/somefile_1.2.3.tar.gz", "http://somewhere.org/somedir1/somefile_1.2.3.tar.gz", "http://somewhere2.org/somedir3/somefile_2.3.4.tar.gz") : "http://somewhere2.org/somedir3/somefile_2.3.4.tar.gz"
+ #("file://sstate-xyz.tgz", "file://.*/.*", "file:///somewhere/1234/sstate-cache") : "file:///somewhere/1234/sstate-cache/sstate-xyz.tgz",
+ }
+
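+ # mirrorvar mimics the MIRRORS/PREMIRRORS format: whitespace-separated
+ # (source URI regex, replacement base URI) pairs, which
+ # bb.fetch2.mirror_from_string() turns into a list of pairs in the tests below.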
+ mirrorvar = "http://.*/.* file:///somepath/downloads/ \n" \
+ "git://someserver.org/bitbake git://git.openembedded.org/bitbake \n" \
+ "https://.*/.* file:///someotherpath/downloads/ \n" \
+ "http://.*/.* file:///someotherpath/downloads/ \n"
+
+ def test_urireplace(self):
+ for k, v in self.replaceuris.items():
+ ud = bb.fetch.FetchData(k[0], self.d)
+ ud.setup_localpath(self.d)
+ mirrors = bb.fetch2.mirror_from_string("%s %s" % (k[1], k[2]))
+ newuris, uds = bb.fetch2.build_mirroruris(ud, mirrors, self.d)
+ self.assertEqual([v], newuris)
+
+ def test_urilist1(self):
+ fetcher = bb.fetch.FetchData("http://downloads.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz", self.d)
+ mirrors = bb.fetch2.mirror_from_string(self.mirrorvar)
+ uris, uds = bb.fetch2.build_mirroruris(fetcher, mirrors, self.d)
+ self.assertEqual(uris, ['file:///somepath/downloads/bitbake-1.0.tar.gz', 'file:///someotherpath/downloads/bitbake-1.0.tar.gz'])
+
+ def test_urilist2(self):
+ # Catch https:// -> file:// bug
+ fetcher = bb.fetch.FetchData("https://downloads.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz", self.d)
+ mirrors = bb.fetch2.mirror_from_string(self.mirrorvar)
+ uris, uds = bb.fetch2.build_mirroruris(fetcher, mirrors, self.d)
+ self.assertEqual(uris, ['file:///someotherpath/downloads/bitbake-1.0.tar.gz'])
+
+ def test_mirror_of_mirror(self):
+ # Test if mirror of a mirror works
+ mirrorvar = self.mirrorvar + " http://.*/.* http://otherdownloads.yoctoproject.org/downloads/ \n"
+ mirrorvar = mirrorvar + " http://otherdownloads.yoctoproject.org/.* http://downloads2.yoctoproject.org/downloads/ \n"
+ fetcher = bb.fetch.FetchData("http://downloads.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz", self.d)
+ mirrors = bb.fetch2.mirror_from_string(mirrorvar)
+ uris, uds = bb.fetch2.build_mirroruris(fetcher, mirrors, self.d)
+ self.assertEqual(uris, ['file:///somepath/downloads/bitbake-1.0.tar.gz',
+ 'file:///someotherpath/downloads/bitbake-1.0.tar.gz',
+ 'http://otherdownloads.yoctoproject.org/downloads/bitbake-1.0.tar.gz',
+ 'http://downloads2.yoctoproject.org/downloads/bitbake-1.0.tar.gz'])
+
+
+class FetcherLocalTest(FetcherTest):
+ def setUp(self):
+ def touch(fn):
+ with open(fn, 'a'):
+ os.utime(fn, None)
+
+ super(FetcherLocalTest, self).setUp()
+ self.localsrcdir = os.path.join(self.tempdir, 'localsrc')
+ os.makedirs(self.localsrcdir)
+ touch(os.path.join(self.localsrcdir, 'a'))
+ touch(os.path.join(self.localsrcdir, 'b'))
+ os.makedirs(os.path.join(self.localsrcdir, 'dir'))
+ touch(os.path.join(self.localsrcdir, 'dir', 'c'))
+ touch(os.path.join(self.localsrcdir, 'dir', 'd'))
+ os.makedirs(os.path.join(self.localsrcdir, 'dir', 'subdir'))
+ touch(os.path.join(self.localsrcdir, 'dir', 'subdir', 'e'))
+ self.d.setVar("FILESPATH", self.localsrcdir)
+
+ def fetchUnpack(self, uris):
+ fetcher = bb.fetch.Fetch(uris, self.d)
+ fetcher.download()
+ fetcher.unpack(self.unpackdir)
+ flst = []
+ for root, dirs, files in os.walk(self.unpackdir):
+ for f in files:
+ flst.append(os.path.relpath(os.path.join(root, f), self.unpackdir))
+ flst.sort()
+ return flst
+
+ def test_local(self):
+ tree = self.fetchUnpack(['file://a', 'file://dir/c'])
+ self.assertEqual(tree, ['a', 'dir/c'])
+
+ def test_local_wildcard(self):
+ tree = self.fetchUnpack(['file://a', 'file://dir/*'])
+ # FIXME: this is broken - it should return ['a', 'dir/c', 'dir/d', 'dir/subdir/e']
+ # see https://bugzilla.yoctoproject.org/show_bug.cgi?id=6128
+ self.assertEqual(tree, ['a', 'b', 'dir/c', 'dir/d', 'dir/subdir/e'])
+
+ def test_local_dir(self):
+ tree = self.fetchUnpack(['file://a', 'file://dir'])
+ self.assertEqual(tree, ['a', 'dir/c', 'dir/d', 'dir/subdir/e'])
+
+ def test_local_subdir(self):
+ tree = self.fetchUnpack(['file://dir/subdir'])
+ # FIXME: this is broken - it should return ['dir/subdir/e']
+ # see https://bugzilla.yoctoproject.org/show_bug.cgi?id=6129
+ self.assertEqual(tree, ['subdir/e'])
+
+ def test_local_subdir_file(self):
+ tree = self.fetchUnpack(['file://dir/subdir/e'])
+ self.assertEqual(tree, ['dir/subdir/e'])
+
+ def test_local_subdirparam(self):
+ tree = self.fetchUnpack(['file://a;subdir=bar'])
+ self.assertEqual(tree, ['bar/a'])
+
+ def test_local_deepsubdirparam(self):
+ tree = self.fetchUnpack(['file://dir/subdir/e;subdir=bar'])
+ self.assertEqual(tree, ['bar/dir/subdir/e'])
+
+class FetcherNetworkTest(FetcherTest):
+
+ if os.environ.get("BB_SKIP_NETTESTS") == "yes":
+ print("Unset BB_SKIP_NETTESTS to run network tests")
+ else:
+ def test_fetch(self):
+ fetcher = bb.fetch.Fetch(["http://downloads.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz", "http://downloads.yoctoproject.org/releases/bitbake/bitbake-1.1.tar.gz"], self.d)
+ fetcher.download()
+ self.assertEqual(os.path.getsize(self.dldir + "/bitbake-1.0.tar.gz"), 57749)
+ self.assertEqual(os.path.getsize(self.dldir + "/bitbake-1.1.tar.gz"), 57892)
+ self.d.setVar("BB_NO_NETWORK", "1")
+ fetcher = bb.fetch.Fetch(["http://downloads.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz", "http://downloads.yoctoproject.org/releases/bitbake/bitbake-1.1.tar.gz"], self.d)
+ fetcher.download()
+ fetcher.unpack(self.unpackdir)
+ self.assertEqual(len(os.listdir(self.unpackdir + "/bitbake-1.0/")), 9)
+ self.assertEqual(len(os.listdir(self.unpackdir + "/bitbake-1.1/")), 9)
+
+ def test_fetch_mirror(self):
+ self.d.setVar("MIRRORS", "http://.*/.* http://downloads.yoctoproject.org/releases/bitbake")
+ fetcher = bb.fetch.Fetch(["http://invalid.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz"], self.d)
+ fetcher.download()
+ self.assertEqual(os.path.getsize(self.dldir + "/bitbake-1.0.tar.gz"), 57749)
+
+ def test_fetch_mirror_of_mirror(self):
+ self.d.setVar("MIRRORS", "http://.*/.* http://invalid2.yoctoproject.org/ \n http://invalid2.yoctoproject.org/.* http://downloads.yoctoproject.org/releases/bitbake")
+ fetcher = bb.fetch.Fetch(["http://invalid.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz"], self.d)
+ fetcher.download()
+ self.assertEqual(os.path.getsize(self.dldir + "/bitbake-1.0.tar.gz"), 57749)
+
+ def test_fetch_file_mirror_of_mirror(self):
+ self.d.setVar("MIRRORS", "http://.*/.* file:///some1where/ \n file:///some1where/.* file://some2where/ \n file://some2where/.* http://downloads.yoctoproject.org/releases/bitbake")
+ fetcher = bb.fetch.Fetch(["http://invalid.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz"], self.d)
+ os.mkdir(self.dldir + "/some2where")
+ fetcher.download()
+ self.assertEqual(os.path.getsize(self.dldir + "/bitbake-1.0.tar.gz"), 57749)
+
+ def test_fetch_premirror(self):
+ self.d.setVar("PREMIRRORS", "http://.*/.* http://downloads.yoctoproject.org/releases/bitbake")
+ fetcher = bb.fetch.Fetch(["http://invalid.yoctoproject.org/releases/bitbake/bitbake-1.0.tar.gz"], self.d)
+ fetcher.download()
+ self.assertEqual(os.path.getsize(self.dldir + "/bitbake-1.0.tar.gz"), 57749)
+
+ def gitfetcher(self, url1, url2):
+ def checkrevision(self, fetcher):
+ fetcher.unpack(self.unpackdir)
+ revision = bb.process.run("git rev-parse HEAD", shell=True, cwd=self.unpackdir + "/git")[0].strip()
+ self.assertEqual(revision, "270a05b0b4ba0959fe0624d2a4885d7b70426da5")
+
+ self.d.setVar("BB_GENERATE_MIRROR_TARBALLS", "1")
+ self.d.setVar("SRCREV", "270a05b0b4ba0959fe0624d2a4885d7b70426da5")
+ fetcher = bb.fetch.Fetch([url1], self.d)
+ fetcher.download()
+ checkrevision(self, fetcher)
+ # Wipe out the dldir clone and the unpacked source, turn off the network and check mirror tarball works
+ bb.utils.prunedir(self.dldir + "/git2/")
+ bb.utils.prunedir(self.unpackdir)
+ self.d.setVar("BB_NO_NETWORK", "1")
+ fetcher = bb.fetch.Fetch([url2], self.d)
+ fetcher.download()
+ checkrevision(self, fetcher)
+
+ def test_gitfetch(self):
+ url1 = url2 = "git://git.openembedded.org/bitbake"
+ self.gitfetcher(url1, url2)
+
+ def test_gitfetch_goodsrcrev(self):
+ # SRCREV is set but matches rev= parameter
+ url1 = url2 = "git://git.openembedded.org/bitbake;rev=270a05b0b4ba0959fe0624d2a4885d7b70426da5"
+ self.gitfetcher(url1, url2)
+
+ def test_gitfetch_badsrcrev(self):
+ # SRCREV is set but does not match rev= parameter
+ url1 = url2 = "git://git.openembedded.org/bitbake;rev=dead05b0b4ba0959fe0624d2a4885d7b70426da5"
+ self.assertRaises(bb.fetch.FetchError, self.gitfetcher, url1, url2)
+
+ def test_gitfetch_tagandrev(self):
+ # Specifying both tag= and rev= is not supported and should raise a fetch error
+ url1 = url2 = "git://git.openembedded.org/bitbake;rev=270a05b0b4ba0959fe0624d2a4885d7b70426da5;tag=270a05b0b4ba0959fe0624d2a4885d7b70426da5"
+ self.assertRaises(bb.fetch.FetchError, self.gitfetcher, url1, url2)
+
+ def test_gitfetch_premirror(self):
+ url1 = "git://git.openembedded.org/bitbake"
+ url2 = "git://someserver.org/bitbake"
+ self.d.setVar("PREMIRRORS", "git://someserver.org/bitbake git://git.openembedded.org/bitbake \n")
+ self.gitfetcher(url1, url2)
+
+ def test_gitfetch_premirror2(self):
+ url1 = url2 = "git://someserver.org/bitbake"
+ self.d.setVar("PREMIRRORS", "git://someserver.org/bitbake git://git.openembedded.org/bitbake \n")
+ self.gitfetcher(url1, url2)
+
+ def test_gitfetch_premirror3(self):
+ realurl = "git://git.openembedded.org/bitbake"
+ dummyurl = "git://someserver.org/bitbake"
+ self.sourcedir = self.unpackdir.replace("unpacked", "sourcemirror.git")
+ os.chdir(self.tempdir)
+ bb.process.run("git clone %s %s 2> /dev/null" % (realurl, self.sourcedir), shell=True)
+ self.d.setVar("PREMIRRORS", "%s git://%s;protocol=file \n" % (dummyurl, self.sourcedir))
+ self.gitfetcher(dummyurl, dummyurl)
+
+ def test_git_submodule(self):
+ fetcher = bb.fetch.Fetch(["gitsm://git.yoctoproject.org/git-submodule-test;rev=f12e57f2edf0aa534cf1616fa983d165a92b0842"], self.d)
+ fetcher.download()
+ # Previous cwd has been deleted
+ os.chdir(os.path.dirname(self.unpackdir))
+ fetcher.unpack(self.unpackdir)
+
+ def test_trusted_network(self):
+ # Ensure trusted_network returns True when the host IS in the list.
+ url = "git://Someserver.org/foo;rev=1"
+ self.d.setVar("BB_ALLOWED_NETWORKS", "server1.org someserver.org server2.org server3.org")
+ self.assertTrue(bb.fetch.trusted_network(self.d, url))
+
+ def test_wild_trusted_network(self):
+ # Ensure trusted_network returns true when the *.host IS in the list.
+ url = "git://Someserver.org/foo;rev=1"
+ self.d.setVar("BB_ALLOWED_NETWORKS", "server1.org *.someserver.org server2.org server3.org")
+ self.assertTrue(bb.fetch.trusted_network(self.d, url))
+
+ def test_prefix_wild_trusted_network(self):
+ # Ensure trusted_network returns true when the prefix matches *.host.
+ url = "git://git.Someserver.org/foo;rev=1"
+ self.d.setVar("BB_ALLOWED_NETWORKS", "server1.org *.someserver.org server2.org server3.org")
+ self.assertTrue(bb.fetch.trusted_network(self.d, url))
+
+ def test_two_prefix_wild_trusted_network(self):
+ # Ensure trusted_network returns true when the prefix matches *.host.
+ url = "git://something.git.Someserver.org/foo;rev=1"
+ self.d.setVar("BB_ALLOWED_NETWORKS", "server1.org *.someserver.org server2.org server3.org")
+ self.assertTrue(bb.fetch.trusted_network(self.d, url))
+
+ def test_untrusted_network(self):
+ # Ensure trusted_network returns False when the host is NOT in the list.
+ url = "git://someserver.org/foo;rev=1"
+ self.d.setVar("BB_ALLOWED_NETWORKS", "server1.org server2.org server3.org")
+ self.assertFalse(bb.fetch.trusted_network(self.d, url))
+
+ def test_wild_untrusted_network(self):
+ # Ensure trusted_network returns False when the host is NOT in the list.
+ url = "git://*.someserver.org/foo;rev=1"
+ self.d.setVar("BB_ALLOWED_NETWORKS", "server1.org server2.org server3.org")
+ self.assertFalse(bb.fetch.trusted_network(self.d, url))
+
+
+class URLHandle(unittest.TestCase):
+
+ datatable = {
+ "http://www.google.com/index.html" : ('http', 'www.google.com', '/index.html', '', '', {}),
+ "cvs://anoncvs@cvs.handhelds.org/cvs;module=familiar/dist/ipkg" : ('cvs', 'cvs.handhelds.org', '/cvs', 'anoncvs', '', {'module': 'familiar/dist/ipkg'}),
+ "cvs://anoncvs:anonymous@cvs.handhelds.org/cvs;tag=V0-99-81;module=familiar/dist/ipkg" : ('cvs', 'cvs.handhelds.org', '/cvs', 'anoncvs', 'anonymous', {'tag': 'V0-99-81', 'module': 'familiar/dist/ipkg'}),
+ "git://git.openembedded.org/bitbake;branch=@foo" : ('git', 'git.openembedded.org', '/bitbake', '', '', {'branch': '@foo'})
+ }
+
+ def test_decodeurl(self):
+ for k, v in self.datatable.items():
+ result = bb.fetch.decodeurl(k)
+ self.assertEqual(result, v)
+
+ def test_encodeurl(self):
+ for k, v in self.datatable.items():
+ result = bb.fetch.encodeurl(v)
+ self.assertEqual(result, k)
+
+class FetchLatestVersionTest(FetcherTest):
+
+ test_git_uris = {
+ # version pattern "X.Y.Z"
+ ("mx-1.0", "git://github.com/clutter-project/mx.git;branch=mx-1.4", "9b1db6b8060bd00b121a692f942404a24ae2960f", "")
+ : "1.99.4",
+ # version pattern "vX.Y"
+ ("mtd-utils", "git://git.infradead.org/mtd-utils.git", "ca39eb1d98e736109c64ff9c1aa2a6ecca222d8f", "")
+ : "1.5.0",
+ # version pattern "pkg_name-X.Y"
+ ("presentproto", "git://anongit.freedesktop.org/git/xorg/proto/presentproto", "24f3a56e541b0a9e6c6ee76081f441221a120ef9", "")
+ : "1.0",
+ # version pattern "pkg_name-vX.Y.Z"
+ ("dtc", "git://git.qemu.org/dtc.git", "65cc4d2748a2c2e6f27f1cf39e07a5dbabd80ebf", "")
+ : "1.4.0",
+ # combination version pattern
+ ("sysprof", "git://git.gnome.org/sysprof", "cd44ee6644c3641507fb53b8a2a69137f2971219", "")
+ : "1.2.0",
+ ("u-boot-mkimage", "git://git.denx.de/u-boot.git;branch=master;protocol=git", "62c175fbb8a0f9a926c88294ea9f7e88eb898f6c", "")
+ : "2014.01",
+ # version pattern "yyyymmdd"
+ ("mobile-broadband-provider-info", "git://git.gnome.org/mobile-broadband-provider-info", "4ed19e11c2975105b71b956440acdb25d46a347d", "")
+ : "20120614",
+ # packages with a valid GITTAGREGEX
+ ("xf86-video-omap", "git://anongit.freedesktop.org/xorg/driver/xf86-video-omap", "ae0394e687f1a77e966cf72f895da91840dffb8f", "(?P<pver>(\d+\.(\d\.?)*))")
+ : "0.4.3",
+ ("build-appliance-image", "git://git.yoctoproject.org/poky", "b37dd451a52622d5b570183a81583cc34c2ff555", "(?P<pver>(([0-9][\.|_]?)+[0-9]))")
+ : "11.0.0",
+ ("chkconfig-alternatives-native", "git://github.com/kergoth/chkconfig;branch=sysroot", "cd437ecbd8986c894442f8fce1e0061e20f04dee", "chkconfig\-(?P<pver>((\d+[\.\-_]*)+))")
+ : "1.3.59",
+ ("remake", "git://github.com/rocky/remake.git", "f05508e521987c8494c92d9c2871aec46307d51d", "(?P<pver>(\d+\.(\d+\.)*\d*(\+dbg\d+(\.\d+)*)*))")
+ : "3.82+dbg0.9",
+ }
+
+ test_wget_uris = {
+ # packages with versions inside directory name
+ ("util-linux", "http://kernel.org/pub/linux/utils/util-linux/v2.23/util-linux-2.24.2.tar.bz2", "", "")
+ : "2.24.2",
+ ("enchant", "http://www.abisource.com/downloads/enchant/1.6.0/enchant-1.6.0.tar.gz", "", "")
+ : "1.6.0",
+ ("cmake", "http://www.cmake.org/files/v2.8/cmake-2.8.12.1.tar.gz", "", "")
+ : "2.8.12.1",
+ # packages with versions only in current directory
+ ("eglic", "http://downloads.yoctoproject.org/releases/eglibc/eglibc-2.18-svnr23787.tar.bz2", "", "")
+ : "2.19",
+ ("gnu-config", "http://downloads.yoctoproject.org/releases/gnu-config/gnu-config-20120814.tar.bz2", "", "")
+ : "20120814",
+ # packages with "99" in the name of possible version
+ ("pulseaudio", "http://freedesktop.org/software/pulseaudio/releases/pulseaudio-4.0.tar.xz", "", "")
+ : "5.0",
+ ("xserver-xorg", "http://xorg.freedesktop.org/releases/individual/xserver/xorg-server-1.15.1.tar.bz2", "", "")
+ : "1.15.1",
+ # packages with valid REGEX_URI and REGEX
+ ("cups", "http://www.cups.org/software/1.7.2/cups-1.7.2-source.tar.bz2", "http://www.cups.org/software.php", "(?P<name>cups\-)(?P<pver>((\d+[\.\-_]*)+))\-source\.tar\.gz")
+ : "2.0.0",
+ ("db", "http://download.oracle.com/berkeley-db/db-5.3.21.tar.gz", "http://www.oracle.com/technetwork/products/berkeleydb/downloads/index-082944.html", "http://download.oracle.com/otn/berkeley-db/(?P<name>db-)(?P<pver>((\d+[\.\-_]*)+))\.tar\.gz")
+ : "6.1.19",
+ }
+ if os.environ.get("BB_SKIP_NETTESTS") == "yes":
+ print("Unset BB_SKIP_NETTESTS to run network tests")
+ else:
+ def test_git_latest_versionstring(self):
+ for k, v in self.test_git_uris.items():
+ self.d.setVar("PN", k[0])
+ self.d.setVar("SRCREV", k[2])
+ self.d.setVar("GITTAGREGEX", k[3])
+ ud = bb.fetch2.FetchData(k[1], self.d)
+ pupver = ud.method.latest_versionstring(ud, self.d)
+ verstring = pupver[0]
+ r = bb.utils.vercmp_string(v, verstring)
+ self.assertTrue(r == -1 or r == 0, msg="Package %s, version: %s <= %s" % (k[0], v, verstring))
+
+ def test_wget_latest_versionstring(self):
+ for k, v in self.test_wget_uris.items():
+ self.d.setVar("PN", k[0])
+ self.d.setVar("REGEX_URI", k[2])
+ self.d.setVar("REGEX", k[3])
+ ud = bb.fetch2.FetchData(k[1], self.d)
+ pupver = ud.method.latest_versionstring(ud, self.d)
+ verstring = pupver[0]
+ r = bb.utils.vercmp_string(v, verstring)
+ self.assertTrue(r == -1 or r == 0, msg="Package %s, version: %s <= %s" % (k[0], v, verstring))
+
+
+class FetchCheckStatusTest(FetcherTest):
+ test_wget_uris = ["http://www.cups.org/software/1.7.2/cups-1.7.2-source.tar.bz2",
+ "http://www.cups.org/software/ipptool/ipptool-20130731-linux-ubuntu-i686.tar.gz",
+ "http://www.cups.org/",
+ "http://downloads.yoctoproject.org/releases/sato/sato-engine-0.1.tar.gz",
+ "http://downloads.yoctoproject.org/releases/sato/sato-engine-0.2.tar.gz",
+ "http://downloads.yoctoproject.org/releases/sato/sato-engine-0.3.tar.gz",
+ "https://yoctoproject.org/",
+ "https://yoctoproject.org/documentation",
+ "http://downloads.yoctoproject.org/releases/opkg/opkg-0.1.7.tar.gz",
+ "http://downloads.yoctoproject.org/releases/opkg/opkg-0.3.0.tar.gz",
+ "ftp://ftp.gnu.org/gnu/autoconf/autoconf-2.60.tar.gz",
+ "ftp://ftp.gnu.org/gnu/chess/gnuchess-5.08.tar.gz",
+ "ftp://ftp.gnu.org/gnu/gmp/gmp-4.0.tar.gz",
+ ]
+
+ if os.environ.get("BB_SKIP_NETTESTS") == "yes":
+ print("Unset BB_SKIP_NETTESTS to run network tests")
+ else:
+
+ def test_wget_checkstatus(self):
+ fetch = bb.fetch2.Fetch(self.test_wget_uris, self.d)
+ for u in self.test_wget_uris:
+ ud = fetch.ud[u]
+ m = ud.method
+ ret = m.checkstatus(fetch, ud, self.d)
+ self.assertTrue(ret, msg="URI %s, can't check status" % (u))
+
+
+ def test_wget_checkstatus_connection_cache(self):
+ from bb.fetch2 import FetchConnectionCache
+
+ connection_cache = FetchConnectionCache()
+ fetch = bb.fetch2.Fetch(self.test_wget_uris, self.d,
+ connection_cache = connection_cache)
+
+ for u in self.test_wget_uris:
+ ud = fetch.ud[u]
+ m = ud.method
+ ret = m.checkstatus(fetch, ud, self.d)
+ self.assertTrue(ret, msg="URI %s, can't check status" % (u))
+
+ connection_cache.close_connections()
diff --git a/bitbake/lib/bb/tests/parse.py b/bitbake/lib/bb/tests/parse.py
new file mode 100644
index 0000000..6beb76a
--- /dev/null
+++ b/bitbake/lib/bb/tests/parse.py
@@ -0,0 +1,147 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Test for lib/bb/parse/
+#
+# Copyright (C) 2015 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+
+import unittest
+import tempfile
+import logging
+import bb
+import os
+
+logger = logging.getLogger('BitBake.TestParse')
+
+import bb.parse
+import bb.data
+import bb.siggen
+
+class ParseTest(unittest.TestCase):
+
+ testfile = """
+A = "1"
+B = "2"
+do_install() {
+ echo "hello"
+}
+
+C = "3"
+"""
+
+ def setUp(self):
+ self.d = bb.data.init()
+ bb.parse.siggen = bb.siggen.init(self.d)
+
+ def parsehelper(self, content, suffix = ".bb"):
+
+ f = tempfile.NamedTemporaryFile(suffix = suffix)
+ f.write(content)
+ f.flush()
+ os.chdir(os.path.dirname(f.name))
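+ # return the open NamedTemporaryFile so the caller keeps it referenced;
+ # the underlying file is removed automatically once the object is closed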
+ return f
+
+ def test_parse_simple(self):
+ f = self.parsehelper(self.testfile)
+ d = bb.parse.handle(f.name, self.d)['']
+ self.assertEqual(d.getVar("A", True), "1")
+ self.assertEqual(d.getVar("B", True), "2")
+ self.assertEqual(d.getVar("C", True), "3")
+
+ def test_parse_incomplete_function(self):
+ testfileB = self.testfile.replace("}", "")
+ f = self.parsehelper(testfileB)
+ with self.assertRaises(bb.parse.ParseError):
+ d = bb.parse.handle(f.name, self.d)['']
+
+ overridetest = """
+RRECOMMENDS_${PN} = "a"
+RRECOMMENDS_${PN}_libc = "b"
+OVERRIDES = "libc:${PN}"
+PN = "gtk+"
+"""
+
+ def test_parse_overrides(self):
+ f = self.parsehelper(self.overridetest)
+ d = bb.parse.handle(f.name, self.d)['']
+ self.assertEqual(d.getVar("RRECOMMENDS", True), "b")
+ bb.data.expandKeys(d)
+ self.assertEqual(d.getVar("RRECOMMENDS", True), "b")
+ d.setVar("RRECOMMENDS_gtk+", "c")
+ self.assertEqual(d.getVar("RRECOMMENDS", True), "c")
+
+ overridetest2 = """
+EXTRA_OECONF = ""
+EXTRA_OECONF_class-target = "b"
+EXTRA_OECONF_append = " c"
+"""
+
+ def test_parse_overrides2(self):
+ f = self.parsehelper(self.overridetest2)
+ d = bb.parse.handle(f.name, self.d)['']
+ d.appendVar("EXTRA_OECONF", " d")
+ d.setVar("OVERRIDES", "class-target")
+ self.assertEqual(d.getVar("EXTRA_OECONF", True), "b c d")
+
+ overridetest3 = """
+DESCRIPTION = "A"
+DESCRIPTION_${PN}-dev = "${DESCRIPTION} B"
+PN = "bc"
+"""
+
+ def test_parse_combinations(self):
+ f = self.parsehelper(self.overridetest3)
+ d = bb.parse.handle(f.name, self.d)['']
+ bb.data.expandKeys(d)
+ self.assertEqual(d.getVar("DESCRIPTION_bc-dev", True), "A B")
+ d.setVar("DESCRIPTION", "E")
+ d.setVar("DESCRIPTION_bc-dev", "C D")
+ d.setVar("OVERRIDES", "bc-dev")
+ self.assertEqual(d.getVar("DESCRIPTION", True), "C D")
+
+
+ classextend = """
+VAR_var_override1 = "B"
+EXTRA = ":override1"
+OVERRIDES = "nothing${EXTRA}"
+
+BBCLASSEXTEND = "###CLASS###"
+"""
+ classextend_bbclass = """
+EXTRA = ""
+python () {
+ d.renameVar("VAR_var", "VAR_var2")
+}
+"""
+
+ #
+ # Test based upon a real-world data corruption issue: a variable changed
+ # in one data store poked through into a different data store. This test
+ # case replicates that issue, where the value 'B' would become
+ # unset/disappear.
+ #
+ def test_parse_classextend_contamination(self):
+ cls = self.parsehelper(self.classextend_bbclass, suffix=".bbclass")
+ #clsname = os.path.basename(cls.name).replace(".bbclass", "")
+ self.classextend = self.classextend.replace("###CLASS###", cls.name)
+ f = self.parsehelper(self.classextend)
+ alldata = bb.parse.handle(f.name, self.d)
+ d1 = alldata['']
+ d2 = alldata[cls.name]
+ self.assertEqual(d1.getVar("VAR_var", True), "B")
+ self.assertEqual(d2.getVar("VAR_var", True), None)
+
diff --git a/bitbake/lib/bb/tests/utils.py b/bitbake/lib/bb/tests/utils.py
new file mode 100644
index 0000000..9171509
--- /dev/null
+++ b/bitbake/lib/bb/tests/utils.py
@@ -0,0 +1,378 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Tests for utils.py
+#
+# Copyright (C) 2012 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+
+import unittest
+import bb
+import os
+import tempfile
+
+class VerCmpString(unittest.TestCase):
+
+ def test_vercmpstring(self):
+ result = bb.utils.vercmp_string('1', '2')
+ self.assertTrue(result < 0)
+ result = bb.utils.vercmp_string('2', '1')
+ self.assertTrue(result > 0)
+ result = bb.utils.vercmp_string('1', '1.0')
+ self.assertTrue(result < 0)
+ result = bb.utils.vercmp_string('1', '1.1')
+ self.assertTrue(result < 0)
+ result = bb.utils.vercmp_string('1.1', '1_p2')
+ self.assertTrue(result < 0)
+ result = bb.utils.vercmp_string('1.0', '1.0+1.1-beta1')
+ self.assertTrue(result < 0)
+ result = bb.utils.vercmp_string('1.1', '1.0+1.1-beta1')
+ self.assertTrue(result > 0)
+
+ def test_explode_dep_versions(self):
+ correctresult = {"foo" : ["= 1.10"]}
+ result = bb.utils.explode_dep_versions2("foo (= 1.10)")
+ self.assertEqual(result, correctresult)
+ result = bb.utils.explode_dep_versions2("foo (=1.10)")
+ self.assertEqual(result, correctresult)
+ result = bb.utils.explode_dep_versions2("foo ( = 1.10)")
+ self.assertEqual(result, correctresult)
+ result = bb.utils.explode_dep_versions2("foo ( =1.10)")
+ self.assertEqual(result, correctresult)
+ result = bb.utils.explode_dep_versions2("foo ( = 1.10 )")
+ self.assertEqual(result, correctresult)
+ result = bb.utils.explode_dep_versions2("foo ( =1.10 )")
+ self.assertEqual(result, correctresult)
+
+ def test_vercmp_string_op(self):
+ compareops = [('1', '1', '=', True),
+ ('1', '1', '==', True),
+ ('1', '1', '!=', False),
+ ('1', '1', '>', False),
+ ('1', '1', '<', False),
+ ('1', '1', '>=', True),
+ ('1', '1', '<=', True),
+ ('1', '0', '=', False),
+ ('1', '0', '==', False),
+ ('1', '0', '!=', True),
+ ('1', '0', '>', True),
+ ('1', '0', '<', False),
+ ('1', '0', '>>', True),
+ ('1', '0', '<<', False),
+ ('1', '0', '>=', True),
+ ('1', '0', '<=', False),
+ ('0', '1', '=', False),
+ ('0', '1', '==', False),
+ ('0', '1', '!=', True),
+ ('0', '1', '>', False),
+ ('0', '1', '<', True),
+ ('0', '1', '>>', False),
+ ('0', '1', '<<', True),
+ ('0', '1', '>=', False),
+ ('0', '1', '<=', True)]
+
+ for arg1, arg2, op, correctresult in compareops:
+ result = bb.utils.vercmp_string_op(arg1, arg2, op)
+ self.assertEqual(result, correctresult, 'vercmp_string_op("%s", "%s", "%s") != %s' % (arg1, arg2, op, correctresult))
+
+ # Check that clearly invalid operator raises an exception
+ self.assertRaises(bb.utils.VersionStringException, bb.utils.vercmp_string_op, '0', '0', '$')
+
+
+class Path(unittest.TestCase):
+ def test_unsafe_delete_path(self):
+ checkitems = [('/', True),
+ ('//', True),
+ ('///', True),
+ (os.getcwd().count(os.sep) * ('..' + os.sep), True),
+ (os.environ.get('HOME', '/home/test'), True),
+ ('/home/someone', True),
+ ('/home/other/', True),
+ ('/home/other/subdir', False),
+ ('', False)]
+ for arg1, correctresult in checkitems:
+ result = bb.utils._check_unsafe_delete_path(arg1)
+ self.assertEqual(result, correctresult, '_check_unsafe_delete_path("%s") != %s' % (arg1, correctresult))
+
+
+class EditMetadataFile(unittest.TestCase):
+ _origfile = """
+# A comment
+HELLO = "oldvalue"
+
+THIS = "that"
+
+# Another comment
+NOCHANGE = "samevalue"
+OTHER = 'anothervalue'
+
+MULTILINE = "a1 \\
+ a2 \\
+ a3"
+
+MULTILINE2 := " \\
+ b1 \\
+ b2 \\
+ b3 \\
+ "
+
+
+MULTILINE3 = " \\
+ c1 \\
+ c2 \\
+ c3 \\
+"
+
+do_functionname() {
+ command1 ${VAL1} ${VAL2}
+ command2 ${VAL3} ${VAL4}
+}
+"""
+ def _testeditfile(self, varvalues, compareto, dummyvars=None):
+ if dummyvars is None:
+ dummyvars = []
+ with tempfile.NamedTemporaryFile('w', delete=False) as tf:
+ tf.write(self._origfile)
+ tf.close()
+ try:
+ varcalls = []
+ def handle_file(varname, origvalue, op, newlines):
+ self.assertIn(varname, varvalues, 'Callback called for variable %s not in the list!' % varname)
+ self.assertNotIn(varname, dummyvars, 'Callback called for variable %s in dummy list!' % varname)
+ varcalls.append(varname)
+ return varvalues[varname]
+
+ bb.utils.edit_metadata_file(tf.name, varvalues.keys(), handle_file)
+ with open(tf.name) as f:
+ modfile = f.readlines()
+ # Ensure the output matches the expected output
+ self.assertEqual(compareto.splitlines(True), modfile)
+ # Ensure the callback function was called for every variable we asked for
+ # (plus allow testing behaviour when a requested variable is not present)
+ self.assertEqual(sorted(varvalues.keys()), sorted(varcalls + dummyvars))
+ finally:
+ os.remove(tf.name)
+
+
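+ # The value tuples used below follow the shape returned by the callback that
+ # bb.utils.edit_metadata() expects: (newvalue, newop, indent, minbreak).
+ # As the tests show, a newvalue of None drops the variable and a list value
+ # is written out as a multi-line assignment.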
+ def test_edit_metadata_file_nochange(self):
+ # Test file doesn't get modified with nothing to do
+ self._testeditfile({}, self._origfile)
+ # Test file doesn't get modified with only dummy variables
+ self._testeditfile({'DUMMY1': ('should_not_set', None, 0, True),
+ 'DUMMY2': ('should_not_set_again', None, 0, True)}, self._origfile, dummyvars=['DUMMY1', 'DUMMY2'])
+ # Test file doesn't get modified when some of the values are already the same
+ self._testeditfile({'THIS': ('that', None, 0, True),
+ 'OTHER': ('anothervalue', None, 0, True),
+ 'MULTILINE3': (' c1 c2 c3', None, 4, False)}, self._origfile)
+
+ def test_edit_metadata_file_1(self):
+
+ newfile1 = """
+# A comment
+HELLO = "newvalue"
+
+THIS = "that"
+
+# Another comment
+NOCHANGE = "samevalue"
+OTHER = 'anothervalue'
+
+MULTILINE = "a1 \\
+ a2 \\
+ a3"
+
+MULTILINE2 := " \\
+ b1 \\
+ b2 \\
+ b3 \\
+ "
+
+
+MULTILINE3 = " \\
+ c1 \\
+ c2 \\
+ c3 \\
+"
+
+do_functionname() {
+ command1 ${VAL1} ${VAL2}
+ command2 ${VAL3} ${VAL4}
+}
+"""
+ self._testeditfile({'HELLO': ('newvalue', None, 4, True)}, newfile1)
+
+
+ def test_edit_metadata_file_2(self):
+
+ newfile2 = """
+# A comment
+HELLO = "oldvalue"
+
+THIS = "that"
+
+# Another comment
+NOCHANGE = "samevalue"
+OTHER = 'anothervalue'
+
+MULTILINE = " \\
+ d1 \\
+ d2 \\
+ d3 \\
+ "
+
+MULTILINE2 := " \\
+ b1 \\
+ b2 \\
+ b3 \\
+ "
+
+
+MULTILINE3 = "nowsingle"
+
+do_functionname() {
+ command1 ${VAL1} ${VAL2}
+ command2 ${VAL3} ${VAL4}
+}
+"""
+ self._testeditfile({'MULTILINE': (['d1','d2','d3'], None, 4, False),
+ 'MULTILINE3': ('nowsingle', None, 4, True),
+ 'NOTPRESENT': (['a', 'b'], None, 4, False)}, newfile2, dummyvars=['NOTPRESENT'])
+
+
+ def test_edit_metadata_file_3(self):
+
+ newfile3 = """
+# A comment
+HELLO = "oldvalue"
+
+# Another comment
+NOCHANGE = "samevalue"
+OTHER = "yetanothervalue"
+
+MULTILINE = "e1 \\
+ e2 \\
+ e3 \\
+ "
+
+MULTILINE2 := "f1 \\
+\tf2 \\
+\t"
+
+
+MULTILINE3 = " \\
+ c1 \\
+ c2 \\
+ c3 \\
+"
+
+do_functionname() {
+ othercommand_one a b c
+ othercommand_two d e f
+}
+"""
+
+ self._testeditfile({'do_functionname()': (['othercommand_one a b c', 'othercommand_two d e f'], None, 4, False),
+ 'MULTILINE2': (['f1', 'f2'], None, '\t', True),
+ 'MULTILINE': (['e1', 'e2', 'e3'], None, -1, True),
+ 'THIS': (None, None, 0, False),
+ 'OTHER': ('yetanothervalue', None, 0, True)}, newfile3)
+
+
+ def test_edit_metadata_file_4(self):
+
+ newfile4 = """
+# A comment
+HELLO = "oldvalue"
+
+THIS = "that"
+
+# Another comment
+OTHER = 'anothervalue'
+
+MULTILINE = "a1 \\
+ a2 \\
+ a3"
+
+MULTILINE2 := " \\
+ b1 \\
+ b2 \\
+ b3 \\
+ "
+
+
+"""
+
+ self._testeditfile({'NOCHANGE': (None, None, 0, False),
+ 'MULTILINE3': (None, None, 0, False),
+ 'THIS': ('that', None, 0, False),
+ 'do_functionname()': (None, None, 0, False)}, newfile4)
+
+
+ def test_edit_metadata(self):
+ newfile5 = """
+# A comment
+HELLO = "hithere"
+
+# A new comment
+THIS += "that"
+
+# Another comment
+NOCHANGE = "samevalue"
+OTHER = 'anothervalue'
+
+MULTILINE = "a1 \\
+ a2 \\
+ a3"
+
+MULTILINE2 := " \\
+ b1 \\
+ b2 \\
+ b3 \\
+ "
+
+
+MULTILINE3 = " \\
+ c1 \\
+ c2 \\
+ c3 \\
+"
+
+NEWVAR = "value"
+
+do_functionname() {
+ command1 ${VAL1} ${VAL2}
+ command2 ${VAL3} ${VAL4}
+}
+"""
+
+
+ def handle_var(varname, origvalue, op, newlines):
+ if varname == 'THIS':
+ newlines.append('# A new comment\n')
+ elif varname == 'do_functionname()':
+ newlines.append('NEWVAR = "value"\n')
+ newlines.append('\n')
+ valueitem = varvalues.get(varname, None)
+ if valueitem:
+ return valueitem
+ else:
+ return (origvalue, op, 0, True)
+
+ varvalues = {'HELLO': ('hithere', None, 0, True), 'THIS': ('that', '+=', 0, True)}
+ varlist = ['HELLO', 'THIS', 'do_functionname()']
+ (updated, newlines) = bb.utils.edit_metadata(self._origfile.splitlines(True), varlist, handle_var)
+ self.assertTrue(updated, 'List should be updated but isn\'t')
+ self.assertEqual(newlines, newfile5.splitlines(True))
diff --git a/bitbake/lib/bb/tinfoil.py b/bitbake/lib/bb/tinfoil.py
new file mode 100644
index 0000000..1ea46d8
--- /dev/null
+++ b/bitbake/lib/bb/tinfoil.py
@@ -0,0 +1,104 @@
+# tinfoil: a simple wrapper around cooker for bitbake-based command-line utilities
+#
+# Copyright (C) 2012 Intel Corporation
+# Copyright (C) 2011 Mentor Graphics Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import logging
+import warnings
+import os
+import sys
+
+import bb.cache
+import bb.cooker
+import bb.providers
+import bb.utils
+from bb.cooker import state, BBCooker, CookerFeatures
+from bb.cookerdata import CookerConfiguration, ConfigParameters
+import bb.fetch2
+
+class Tinfoil:
+ def __init__(self, output=sys.stdout, tracking=False):
+ # Needed to avoid deprecation warnings with python 2.6
+ warnings.filterwarnings("ignore", category=DeprecationWarning)
+
+ # Set up logging
+ self.logger = logging.getLogger('BitBake')
+ console = logging.StreamHandler(output)
+ bb.msg.addDefaultlogFilter(console)
+ format = bb.msg.BBLogFormatter("%(levelname)s: %(message)s")
+ if output.isatty():
+ format.enable_color()
+ console.setFormatter(format)
+ self.logger.addHandler(console)
+
+ self.config = CookerConfiguration()
+ configparams = TinfoilConfigParameters(parse_only=True)
+ self.config.setConfigParameters(configparams)
+ self.config.setServerRegIdleCallback(self.register_idle_function)
+ features = []
+ if tracking:
+ features.append(CookerFeatures.BASEDATASTORE_TRACKING)
+ self.cooker = BBCooker(self.config, features)
+ self.config_data = self.cooker.data
+ bb.providers.logger.setLevel(logging.ERROR)
+ self.cooker_data = None
+
+ def register_idle_function(self, function, data):
+ pass
+
+ def parseRecipes(self):
+ sys.stderr.write("Parsing recipes..")
+ self.logger.setLevel(logging.WARNING)
+
+ try:
+ while self.cooker.state in (state.initial, state.parsing):
+ self.cooker.updateCache()
+ except KeyboardInterrupt:
+ self.cooker.shutdown()
+ self.cooker.updateCache()
+ sys.exit(2)
+
+ self.logger.setLevel(logging.INFO)
+ sys.stderr.write("done.\n")
+
+ self.cooker_data = self.cooker.recipecache
+
+ def prepare(self, config_only = False):
+ if not self.cooker_data:
+ if config_only:
+ self.cooker.parseConfiguration()
+ self.cooker_data = self.cooker.recipecache
+ else:
+ self.parseRecipes()
+
+ def shutdown(self):
+ self.cooker.shutdown(force=True)
+ self.cooker.post_serve()
+ self.cooker.unlockBitbake()
+
+class TinfoilConfigParameters(ConfigParameters):
+
+ def __init__(self, **options):
+ self.initial_options = options
+ super(TinfoilConfigParameters, self).__init__()
+
+ def parseCommandLine(self, argv=sys.argv):
+ class DummyOptions:
+ def __init__(self, initial_options):
+ for key, val in initial_options.items():
+ setattr(self, key, val)
+
+ return DummyOptions(self.initial_options), None
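+
+# A minimal usage sketch (illustrative only, not part of the original module);
+# it assumes config_data exposes the usual data-store getVar() interface:
+#
+#   import bb.tinfoil
+#   tinfoil = bb.tinfoil.Tinfoil()
+#   tinfoil.prepare(config_only=True)   # parse configuration only
+#   machine = tinfoil.config_data.getVar("MACHINE", True)
+#   tinfoil.shutdown()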
diff --git a/bitbake/lib/bb/ui/__init__.py b/bitbake/lib/bb/ui/__init__.py
new file mode 100644
index 0000000..a4805ed
--- /dev/null
+++ b/bitbake/lib/bb/ui/__init__.py
@@ -0,0 +1,17 @@
+#
+# BitBake UI Implementation
+#
+# Copyright (C) 2006-2007 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
diff --git a/bitbake/lib/bb/ui/buildinfohelper.py b/bitbake/lib/bb/ui/buildinfohelper.py
new file mode 100644
index 0000000..2d1ed51
--- /dev/null
+++ b/bitbake/lib/bb/ui/buildinfohelper.py
@@ -0,0 +1,1339 @@
+#
+# BitBake ToasterUI Implementation
+#
+# Copyright (C) 2013 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import sys
+import bb
+import re
+import os
+
+os.environ["DJANGO_SETTINGS_MODULE"] = "toaster.toastermain.settings"
+
+
+from django.utils import timezone
+
+
+def _configure_toaster():
+ """ Add toaster to sys path for importing modules
+ """
+ sys.path.append(os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(__file__))), 'toaster'))
+_configure_toaster()
+
+from toaster.orm.models import Build, Task, Recipe, Layer_Version, Layer, Target, LogMessage, HelpText
+from toaster.orm.models import Target_Image_File, BuildArtifact
+from toaster.orm.models import Variable, VariableHistory
+from toaster.orm.models import Package, Package_File, Target_Installed_Package, Target_File
+from toaster.orm.models import Task_Dependency, Package_Dependency
+from toaster.orm.models import Recipe_Dependency
+
+from toaster.orm.models import Project
+from bldcontrol.models import BuildEnvironment, BuildRequest
+
+from bb.msg import BBLogFormatter as formatter
+from django.db import models
+from pprint import pformat
+import logging
+
+from django.db import transaction, connection
+
+# pylint: disable=invalid-name
+# the logger name is standard throughout BitBake
+logger = logging.getLogger("ToasterLogger")
+
+
+class NotExisting(Exception):
+ pass
+
+class ORMWrapper(object):
+ """ This class creates the dictionaries needed to store information in the database
+ following the format defined by the Django models. It is also used to save this
+ information in the database.
+ """
+
+ def __init__(self):
+ self.layer_version_objects = []
+ self.task_objects = {}
+ self.recipe_objects = {}
+
+ @staticmethod
+ def _build_key(**kwargs):
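+ # Builds a deterministic cache key from the keyword arguments, e.g.
+ # _build_key(build=<Build pk=3>, task_name="do_compile") -> "0-3-do_compile"
+ # (model instances contribute their id, every other value its str()).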
+ key = "0"
+ for k in sorted(kwargs.keys()):
+ if isinstance(kwargs[k], models.Model):
+ key += "-%d" % kwargs[k].id
+ else:
+ key += "-%s" % str(kwargs[k])
+ return key
+
+
+ def _cached_get_or_create(self, clazz, **kwargs):
+ """ This is a memory-cached get_or_create. We assume that the objects will not be created in the
+ database through any other means.
+ """
+
+ assert issubclass(clazz, models.Model), "_cached_get_or_create needs to get the class as first argument"
+
+ key = ORMWrapper._build_key(**kwargs)
+ dictname = "objects_%s" % clazz.__name__
+ if not dictname in vars(self).keys():
+ vars(self)[dictname] = {}
+
+ created = False
+ if not key in vars(self)[dictname].keys():
+ vars(self)[dictname][key] = clazz.objects.create(**kwargs)
+ created = True
+
+ return (vars(self)[dictname][key], created)
+
+
+ def _cached_get(self, clazz, **kwargs):
+ """ This is a memory-cached get. We assume that the objects will not change in the database between gets.
+ """
+ assert issubclass(clazz, models.Model), "_cached_get needs to get the class as first argument"
+
+ key = ORMWrapper._build_key(**kwargs)
+ dictname = "objects_%s" % clazz.__name__
+
+ if not dictname in vars(self).keys():
+ vars(self)[dictname] = {}
+
+ if not key in vars(self)[dictname].keys():
+ vars(self)[dictname][key] = clazz.objects.get(**kwargs)
+
+ return vars(self)[dictname][key]
+
+ # pylint: disable=no-self-use
+ # we disable detection of no self use in functions because the methods actually work on the object
+ # even if they don't touch self anywhere
+
+ # pylint: disable=bad-continuation
+ # we do not follow the python conventions for continuation indentation due to long lines here
+
+ def create_build_object(self, build_info, brbe, project_id):
+ assert 'machine' in build_info
+ assert 'distro' in build_info
+ assert 'distro_version' in build_info
+ assert 'started_on' in build_info
+ assert 'cooker_log_path' in build_info
+ assert 'build_name' in build_info
+ assert 'bitbake_version' in build_info
+
+ prj = None
+ buildrequest = None
+ if brbe is not None: # this build was triggered by a request from a user
+ logger.debug(1, "buildinfohelper: brbe is %s" % brbe)
+ br, _ = brbe.split(":")
+ buildrequest = BuildRequest.objects.get(pk = br)
+ prj = buildrequest.project
+
+ elif project_id is not None: # this build was triggered by an external system for a specific project
+ logger.debug(1, "buildinfohelper: project is %s" % prj)
+ prj = Project.objects.get(pk = project_id)
+
+ else: # this build was triggered by a legacy system, or command line interactive mode
+ prj = Project.objects.get_default_project()
+ logger.debug(1, "buildinfohelper: project is not specified, defaulting to %s" % prj)
+
+
+ if buildrequest is not None:
+ build = buildrequest.build
+ logger.info("Updating existing build, with %s", build_info)
+ build.project = prj
+ build.machine=build_info['machine']
+ build.distro=build_info['distro']
+ build.distro_version=build_info['distro_version']
+ build.cooker_log_path=build_info['cooker_log_path']
+ build.build_name=build_info['build_name']
+ build.bitbake_version=build_info['bitbake_version']
+ build.save()
+
+ Target.objects.filter(build = build).delete()
+
+ else:
+ build = Build.objects.create(
+ project = prj,
+ machine=build_info['machine'],
+ distro=build_info['distro'],
+ distro_version=build_info['distro_version'],
+ started_on=build_info['started_on'],
+ completed_on=build_info['started_on'],
+ cooker_log_path=build_info['cooker_log_path'],
+ build_name=build_info['build_name'],
+ bitbake_version=build_info['bitbake_version'])
+
+ logger.debug(1, "buildinfohelper: build is created %s" % build)
+
+ if buildrequest is not None:
+ buildrequest.build = build
+ buildrequest.save()
+
+ return build
+
+ def create_target_objects(self, target_info):
+ assert 'build' in target_info
+ assert 'targets' in target_info
+
+ targets = []
+ for tgt_name in target_info['targets']:
+ tgt_object = Target.objects.create( build = target_info['build'],
+ target = tgt_name,
+ is_image = False,
+ )
+ targets.append(tgt_object)
+ return targets
+
+ def update_build_object(self, build, errors, warnings, taskfailures):
+ assert isinstance(build,Build)
+ assert isinstance(errors, int)
+ assert isinstance(warnings, int)
+
+ outcome = Build.SUCCEEDED
+ if errors or taskfailures:
+ outcome = Build.FAILED
+
+ build.completed_on = timezone.now()
+ build.outcome = outcome
+ build.save()
+
+ def update_target_set_license_manifest(self, target, license_manifest_path):
+ target.license_manifest_path = license_manifest_path
+ target.save()
+
+ def get_update_task_object(self, task_information, must_exist = False):
+ assert 'build' in task_information
+ assert 'recipe' in task_information
+ assert 'task_name' in task_information
+
+ # we use must_exist info for database look-up optimization
+ task_object, created = self._cached_get_or_create(Task,
+ build=task_information['build'],
+ recipe=task_information['recipe'],
+ task_name=task_information['task_name']
+ )
+ if created and must_exist:
+ task_information['debug'] = "build id %d, recipe id %d" % (task_information['build'].pk, task_information['recipe'].pk)
+ raise NotExisting("Task object created when expected to exist", task_information)
+
+ object_changed = False
+ for v in vars(task_object):
+ if v in task_information.keys():
+ if vars(task_object)[v] != task_information[v]:
+ vars(task_object)[v] = task_information[v]
+ object_changed = True
+
+ # update setscene-related information if the task has a setscene
+ if task_object.outcome == Task.OUTCOME_COVERED and 1 == task_object.get_related_setscene().count():
+ task_object.outcome = Task.OUTCOME_CACHED
+ object_changed = True
+
+ outcome_task_setscene = Task.objects.get(task_executed=True, build = task_object.build,
+ recipe = task_object.recipe, task_name=task_object.task_name+"_setscene").outcome
+ if outcome_task_setscene == Task.OUTCOME_SUCCESS:
+ task_object.sstate_result = Task.SSTATE_RESTORED
+ object_changed = True
+ elif outcome_task_setscene == Task.OUTCOME_FAILED:
+ task_object.sstate_result = Task.SSTATE_FAILED
+ object_changed = True
+
+ # record the duration if we have both a start time and an end time
+ if 'start_time' in task_information.keys() and 'end_time' in task_information.keys():
+ duration = task_information['end_time'] - task_information['start_time']
+ task_object.elapsed_time = duration
+ object_changed = True
+ del task_information['start_time']
+ del task_information['end_time']
+
+ if object_changed:
+ task_object.save()
+ return task_object
+
+
+ def get_update_recipe_object(self, recipe_information, must_exist = False):
+ assert 'layer_version' in recipe_information
+ assert 'file_path' in recipe_information
+ assert 'pathflags' in recipe_information
+
+ assert not recipe_information['file_path'].startswith("/") # we should have layer-relative paths at all times
+
+ recipe_object, created = self._cached_get_or_create(Recipe, layer_version=recipe_information['layer_version'],
+ file_path=recipe_information['file_path'], pathflags = recipe_information['pathflags'])
+ if created and must_exist:
+ raise NotExisting("Recipe object created when expected to exist", recipe_information)
+
+ object_changed = False
+ for v in vars(recipe_object):
+ if v in recipe_information.keys():
+ object_changed = True
+ vars(recipe_object)[v] = recipe_information[v]
+
+ if object_changed:
+ recipe_object.save()
+
+ return recipe_object
+
+ def get_update_layer_version_object(self, build_obj, layer_obj, layer_version_information):
+ assert isinstance(build_obj, Build)
+ assert isinstance(layer_obj, Layer)
+ assert 'branch' in layer_version_information
+ assert 'commit' in layer_version_information
+ assert 'priority' in layer_version_information
+ assert 'local_path' in layer_version_information
+
+ layer_version_object, _ = Layer_Version.objects.get_or_create(
+ build = build_obj,
+ layer = layer_obj,
+ branch = layer_version_information['branch'],
+ commit = layer_version_information['commit'],
+ priority = layer_version_information['priority'],
+ local_path = layer_version_information['local_path'],
+ )
+
+ self.layer_version_objects.append(layer_version_object)
+
+ return layer_version_object
+
+ def get_update_layer_object(self, layer_information, brbe):
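+ # brbe is the "buildrequest:buildenvironment" id pair set when the build runs
+ # under Toaster managed mode; when it is None we are in interactive mode and
+ # can simply get-or-create the layer by name and layer index URL.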
+ assert 'name' in layer_information
+ assert 'layer_index_url' in layer_information
+
+ if brbe is None:
+ layer_object, _ = Layer.objects.get_or_create(
+ name=layer_information['name'],
+ layer_index_url=layer_information['layer_index_url'])
+ return layer_object
+ else:
+ # we are under managed mode; we must match the layer used in the Project Layer
+ br_id, be_id = brbe.split(":")
+
+ # find layer by checkout path;
+ from bldcontrol import bbcontroller
+ bc = bbcontroller.getBuildEnvironmentController(pk = be_id)
+
+ # we might have a race condition here, as the project layers may change between the build trigger and the actual build execution,
+ # but we can only match on the layer name, so the worst that can happen is a mis-identification of the layer, not a total failure
+
+ # note that this is different
+ buildrequest = BuildRequest.objects.get(pk = br_id)
+ for brl in buildrequest.brlayer_set.all():
+ localdirname = os.path.join(bc.getGitCloneDirectory(brl.giturl, brl.commit), brl.dirpath)
+ # we get a relative path, unless running in HEAD mode where the path is absolute
+ if not localdirname.startswith("/"):
+ localdirname = os.path.join(bc.be.sourcedir, localdirname)
+ #logger.debug(1, "Localdirname %s lcal_path %s" % (localdirname, layer_information['local_path']))
+ if localdirname.startswith(layer_information['local_path']):
+ # we matched the BRLayer, but we need the layer_version that generated this BR; this is the reverse of Project.schedule_build()
+ #logger.debug(1, "Matched %s to BRlayer %s" % (pformat(layer_information["local_path"]), localdirname))
+ for pl in buildrequest.project.projectlayer_set.filter(layercommit__layer__name = brl.name):
+ if pl.layercommit.layer.vcs_url == brl.giturl :
+ layer = pl.layercommit.layer
+ layer.save()
+ return layer
+
+ raise NotExisting("Unidentified layer %s" % pformat(layer_information))
+
+
+ def save_target_file_information(self, build_obj, target_obj, filedata):
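+ # filedata carries three lists -- 'dirs', 'files' and 'syms'; directories are
+ # inserted first (parents before children), then regular/special files, then
+ # symlinks with their targets resolved to absolute paths.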
+ assert isinstance(build_obj, Build)
+ assert isinstance(target_obj, Target)
+ dirs = filedata['dirs']
+ files = filedata['files']
+ syms = filedata['syms']
+
+ # we insert directories ordered by path depth, so that parents are created before their children
+ for d in sorted(dirs, key=lambda x:len(x[-1].split("/"))):
+ (user, group, size) = d[1:4]
+ permission = d[0][1:]
+ path = d[4].lstrip(".")
+ if len(path) == 0:
+ # we create the root directory as a special case
+ path = "/"
+ tf_obj = Target_File.objects.create(
+ target = target_obj,
+ path = path,
+ size = size,
+ inodetype = Target_File.ITYPE_DIRECTORY,
+ permission = permission,
+ owner = user,
+ group = group,
+ )
+ tf_obj.directory = tf_obj
+ tf_obj.save()
+ continue
+ parent_path = "/".join(path.split("/")[:len(path.split("/")) - 1])
+ if len(parent_path) == 0:
+ parent_path = "/"
+ parent_obj = self._cached_get(Target_File, target = target_obj, path = parent_path, inodetype = Target_File.ITYPE_DIRECTORY)
+ tf_obj = Target_File.objects.create(
+ target = target_obj,
+ path = path,
+ size = size,
+ inodetype = Target_File.ITYPE_DIRECTORY,
+ permission = permission,
+ owner = user,
+ group = group,
+ directory = parent_obj)
+
+
+ # we insert files
+ for d in files:
+ (user, group, size) = d[1:4]
+ permission = d[0][1:]
+ path = d[4].lstrip(".")
+ parent_path = "/".join(path.split("/")[:len(path.split("/")) - 1])
+ inodetype = Target_File.ITYPE_REGULAR
+ if d[0].startswith('b'):
+ inodetype = Target_File.ITYPE_BLOCK
+ if d[0].startswith('c'):
+ inodetype = Target_File.ITYPE_CHARACTER
+ if d[0].startswith('p'):
+ inodetype = Target_File.ITYPE_FIFO
+
+ tf_obj = Target_File.objects.create(
+ target = target_obj,
+ path = path,
+ size = size,
+ inodetype = inodetype,
+ permission = permission,
+ owner = user,
+ group = group)
+ parent_obj = self._cached_get(Target_File, target = target_obj, path = parent_path, inodetype = Target_File.ITYPE_DIRECTORY)
+ tf_obj.directory = parent_obj
+ tf_obj.save()
+
+ # we insert symlinks
+ for d in syms:
+ (user, group, size) = d[1:4]
+ permission = d[0][1:]
+ path = d[4].lstrip(".")
+ filetarget_path = d[6]
+
+ parent_path = "/".join(path.split("/")[:len(path.split("/")) - 1])
+ if not filetarget_path.startswith("/"):
+ # we have a relative path, get a normalized absolute one
+ filetarget_path = parent_path + "/" + filetarget_path
+ fcp = filetarget_path.split("/")
+ fcpl = []
+ for i in fcp:
+ if i == "..":
+ fcpl.pop()
+ else:
+ fcpl.append(i)
+ filetarget_path = "/".join(fcpl)
+
+ try:
+ filetarget_obj = Target_File.objects.get(target = target_obj, path = filetarget_path)
+ except Target_File.DoesNotExist:
+ # we might have a dangling symlink; there is no way to detect this, so just set the target to None
+ filetarget_obj = None
+
+ parent_obj = Target_File.objects.get(target = target_obj, path = parent_path, inodetype = Target_File.ITYPE_DIRECTORY)
+
+ tf_obj = Target_File.objects.create(
+ target = target_obj,
+ path = path,
+ size = size,
+ inodetype = Target_File.ITYPE_SYMLINK,
+ permission = permission,
+ owner = user,
+ group = group,
+ directory = parent_obj,
+ sym_target = filetarget_obj)
+
+
+ def save_target_package_information(self, build_obj, target_obj, packagedict, pkgpnmap, recipes):
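+ # For every installed package in packagedict, get or create a Package row,
+ # fill it in from the pkgpnmap runtime data, link it to the target through
+ # Target_Installed_Package, and finally record the runtime dependencies.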
+ assert isinstance(build_obj, Build)
+ assert isinstance(target_obj, Target)
+
+ errormsg = ""
+ for p in packagedict:
+ searchname = p
+ if 'OPKGN' in pkgpnmap[p].keys():
+ searchname = pkgpnmap[p]['OPKGN']
+
+ packagedict[p]['object'], created = Package.objects.get_or_create( build = build_obj, name = searchname )
+ if created or packagedict[p]['object'].size == -1: # save the data any way we can, not just if it was not created here; bug [YOCTO #6887]
+ # fill in everything we can from the runtime-reverse package data
+ try:
+ packagedict[p]['object'].recipe = recipes[pkgpnmap[p]['PN']]
+ packagedict[p]['object'].version = pkgpnmap[p]['PV']
+ packagedict[p]['object'].installed_name = p
+ packagedict[p]['object'].revision = pkgpnmap[p]['PR']
+ packagedict[p]['object'].license = pkgpnmap[p]['LICENSE']
+ packagedict[p]['object'].section = pkgpnmap[p]['SECTION']
+ packagedict[p]['object'].summary = pkgpnmap[p]['SUMMARY']
+ packagedict[p]['object'].description = pkgpnmap[p]['DESCRIPTION']
+ packagedict[p]['object'].size = int(pkgpnmap[p]['PKGSIZE'])
+
+ # no files have been recorded for this package yet, so save the file info now
+ packagefile_objects = []
+ for targetpath in pkgpnmap[p]['FILES_INFO']:
+ targetfilesize = pkgpnmap[p]['FILES_INFO'][targetpath]
+ packagefile_objects.append(Package_File( package = packagedict[p]['object'],
+ path = targetpath,
+ size = targetfilesize))
+ if len(packagefile_objects):
+ Package_File.objects.bulk_create(packagefile_objects)
+ except KeyError as e:
+ errormsg += " stpi: Key error, package %s key %s \n" % ( p, e )
+
+ # save disk installed size
+ packagedict[p]['object'].installed_size = packagedict[p]['size']
+ packagedict[p]['object'].save()
+
+ Target_Installed_Package.objects.create(target = target_obj, package = packagedict[p]['object'])
+
+ packagedeps_objs = []
+ for p in packagedict:
+ for (px,deptype) in packagedict[p]['depends']:
+ if deptype == 'depends':
+ tdeptype = Package_Dependency.TYPE_TRDEPENDS
+ elif deptype == 'recommends':
+ tdeptype = Package_Dependency.TYPE_TRECOMMENDS
+
+ packagedeps_objs.append(Package_Dependency( package = packagedict[p]['object'],
+ depends_on = packagedict[px]['object'],
+ dep_type = tdeptype,
+ target = target_obj))
+
+ if len(packagedeps_objs) > 0:
+ Package_Dependency.objects.bulk_create(packagedeps_objs)
+
+ if len(errormsg) > 0:
+ logger.warn("buildinfohelper: target_package_info could not identify recipes: \n%s", errormsg)
+
+ def save_target_image_file_information(self, target_obj, file_name, file_size):
+ Target_Image_File.objects.create( target = target_obj,
+ file_name = file_name,
+ file_size = file_size)
+
+ def save_artifact_information(self, build_obj, file_name, file_size):
+ # skip files that are already registered as target image files
+ if Target_Image_File.objects.filter(file_name = file_name).count() > 0:
+ return
+
+ # do not update artifacts found in other builds
+ if BuildArtifact.objects.filter(file_name = file_name).count() > 0:
+ return
+
+ BuildArtifact.objects.create(build = build_obj, file_name = file_name, file_size = file_size)
+
+ def create_logmessage(self, log_information):
+ assert 'build' in log_information
+ assert 'level' in log_information
+ assert 'message' in log_information
+
+ log_object = LogMessage.objects.create(
+ build = log_information['build'],
+ level = log_information['level'],
+ message = log_information['message'])
+
+ for v in vars(log_object):
+ if v in log_information.keys():
+ vars(log_object)[v] = log_information[v]
+
+ return log_object.save()
+
+
+ def save_build_package_information(self, build_obj, package_info, recipes):
+ assert isinstance(build_obj, Build)
+
+ # create and save the object
+ pname = package_info['PKG']
+ if 'OPKGN' in package_info.keys():
+ pname = package_info['OPKGN']
+
+ bp_object, _ = Package.objects.get_or_create( build = build_obj,
+ name = pname )
+
+ bp_object.installed_name = package_info['PKG']
+ bp_object.recipe = recipes[package_info['PN']]
+ bp_object.version = package_info['PKGV']
+ bp_object.revision = package_info['PKGR']
+ bp_object.summary = package_info['SUMMARY']
+ bp_object.description = package_info['DESCRIPTION']
+ bp_object.size = int(package_info['PKGSIZE'])
+ bp_object.section = package_info['SECTION']
+ bp_object.license = package_info['LICENSE']
+ bp_object.save()
+
+ # save any attached file information
+ packagefile_objects = []
+ for path in package_info['FILES_INFO']:
+ packagefile_objects.append(Package_File( package = bp_object,
+ path = path,
+ size = package_info['FILES_INFO'][path] ))
+ if len(packagefile_objects):
+ Package_File.objects.bulk_create(packagefile_objects)
+
+ def _po_byname(p):
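+ # return the Package for this build by name, creating a placeholder entry
+ # (size -1) when the dependency target has not been stored yet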
+ pkg, created = Package.objects.get_or_create(build = build_obj, name = p)
+ if created:
+ pkg.size = -1
+ pkg.save()
+ return pkg
+
+ packagedeps_objs = []
+ # save soft dependency information
+ if 'RDEPENDS' in package_info and package_info['RDEPENDS']:
+ for p in bb.utils.explode_deps(package_info['RDEPENDS']):
+ packagedeps_objs.append(Package_Dependency( package = bp_object,
+ depends_on = _po_byname(p), dep_type = Package_Dependency.TYPE_RDEPENDS))
+ if 'RPROVIDES' in package_info and package_info['RPROVIDES']:
+ for p in bb.utils.explode_deps(package_info['RPROVIDES']):
+ packagedeps_objs.append(Package_Dependency( package = bp_object,
+ depends_on = _po_byname(p), dep_type = Package_Dependency.TYPE_RPROVIDES))
+ if 'RRECOMMENDS' in package_info and package_info['RRECOMMENDS']:
+ for p in bb.utils.explode_deps(package_info['RRECOMMENDS']):
+ packagedeps_objs.append(Package_Dependency( package = bp_object,
+ depends_on = _po_byname(p), dep_type = Package_Dependency.TYPE_RRECOMMENDS))
+ if 'RSUGGESTS' in package_info and package_info['RSUGGESTS']:
+ for p in bb.utils.explode_deps(package_info['RSUGGESTS']):
+ packagedeps_objs.append(Package_Dependency( package = bp_object,
+ depends_on = _po_byname(p), dep_type = Package_Dependency.TYPE_RSUGGESTS))
+ if 'RREPLACES' in package_info and package_info['RREPLACES']:
+ for p in bb.utils.explode_deps(package_info['RREPLACES']):
+ packagedeps_objs.append(Package_Dependency( package = bp_object,
+ depends_on = _po_byname(p), dep_type = Package_Dependency.TYPE_RREPLACES))
+ if 'RCONFLICTS' in package_info and package_info['RCONFLICTS']:
+ for p in bb.utils.explode_deps(package_info['RCONFLICTS']):
+ packagedeps_objs.append(Package_Dependency( package = bp_object,
+ depends_on = _po_byname(p), dep_type = Package_Dependency.TYPE_RCONFLICTS))
+
+ if len(packagedeps_objs) > 0:
+ Package_Dependency.objects.bulk_create(packagedeps_objs)
+
+ return bp_object
+
+ def save_build_variables(self, build_obj, vardump):
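+ # vardump maps each variable name to its value ('v'), documentation ('doc'),
+ # function flag ('func') and history; help texts and variable history entries
+ # are collected and bulk-created to keep the number of queries down.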
+ assert isinstance(build_obj, Build)
+
+ helptext_objects = []
+ for k in vardump:
+ desc = vardump[k]['doc']
+ if desc is None:
+ var_words = k.split('_')
+ root_var = "_".join([word for word in var_words if word.isupper()])
+ if root_var and root_var != k and root_var in vardump:
+ desc = vardump[root_var]['doc']
+ if desc is None:
+ desc = ''
+ if len(desc):
+ helptext_objects.append(HelpText(build=build_obj,
+ area=HelpText.VARIABLE,
+ key=k,
+ text=desc))
+ if not bool(vardump[k]['func']):
+ value = vardump[k]['v']
+ if value is None:
+ value = ''
+ variable_obj = Variable.objects.create( build = build_obj,
+ variable_name = k,
+ variable_value = value,
+ description = desc)
+
+ varhist_objects = []
+ for vh in vardump[k]['history']:
+ if not 'documentation.conf' in vh['file']:
+ varhist_objects.append(VariableHistory( variable = variable_obj,
+ file_name = vh['file'],
+ line_number = vh['line'],
+ operation = vh['op']))
+ if len(varhist_objects):
+ VariableHistory.objects.bulk_create(varhist_objects)
+
+ HelpText.objects.bulk_create(helptext_objects)
+
+
+class MockEvent(object):
+ """ This object is used to create event, for which normal event-processing methods can
+ be used, out of data that is not coming via an actual event
+ """
+ def __init__(self):
+ self.msg = None
+ self.levelno = None
+ self.taskname = None
+ self.taskhash = None
+ self.pathname = None
+ self.lineno = None
+
+
+class BuildInfoHelper(object):
+ """ This class gathers the build information from the server and sends it
+ towards the ORM wrapper for storing in the database
+ It is instantiated once per build
+ Keeps in memory all data that needs matching before writing it to the database
+ """
+
+ # pylint: disable=protected-access
+ # the code will look into the protected variables of the event; no easy way around this
+ # pylint: disable=bad-continuation
+ # we do not follow the python conventions for continuation indentation due to long lines here
+
+ def __init__(self, server, has_build_history = False):
+ self.internal_state = {}
+ self.internal_state['taskdata'] = {}
+ self.task_order = 0
+ self.autocommit_step = 1
+ self.server = server
+ # we use manual transactions if the database doesn't autocommit on us
+ if not connection.features.autocommits_when_autocommit_is_off:
+ transaction.set_autocommit(False)
+ self.orm_wrapper = ORMWrapper()
+ self.has_build_history = has_build_history
+ self.tmp_dir = self.server.runCommand(["getVariable", "TMPDIR"])[0]
+ self.brbe = self.server.runCommand(["getVariable", "TOASTER_BRBE"])[0]
+ self.project = self.server.runCommand(["getVariable", "TOASTER_PROJECT"])[0]
+ logger.debug(1, "buildinfohelper: Build info helper inited %s" % vars(self))
+
+
+ ###################
+ ## methods to convert event/external info into objects that the ORM layer uses
+
+
+ def _get_build_information(self):
+ build_info = {}
+ # Generate an identifier for each new build
+
+ build_info['machine'] = self.server.runCommand(["getVariable", "MACHINE"])[0]
+ build_info['distro'] = self.server.runCommand(["getVariable", "DISTRO"])[0]
+ build_info['distro_version'] = self.server.runCommand(["getVariable", "DISTRO_VERSION"])[0]
+ build_info['started_on'] = timezone.now()
+ build_info['completed_on'] = timezone.now()
+ build_info['cooker_log_path'] = self.server.runCommand(["getVariable", "BB_CONSOLELOG"])[0]
+ build_info['build_name'] = self.server.runCommand(["getVariable", "BUILDNAME"])[0]
+ build_info['bitbake_version'] = self.server.runCommand(["getVariable", "BB_VERSION"])[0]
+
+ return build_info
+
+ def _get_task_information(self, event, recipe):
+ assert 'taskname' in vars(event)
+
+ task_information = {}
+ task_information['build'] = self.internal_state['build']
+ task_information['outcome'] = Task.OUTCOME_NA
+ task_information['recipe'] = recipe
+ task_information['task_name'] = event.taskname
+ try:
+ # some tasks don't come with a hash, and that's OK
+ task_information['sstate_checksum'] = event.taskhash
+ except AttributeError:
+ pass
+ return task_information
+
+ def _get_layer_version_for_path(self, path):
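+ # Map an absolute recipe/file path back to the Layer_Version it belongs to,
+ # using path-prefix heuristics; if nothing matches, a placeholder
+ # "__FIXME__unidentified_layer" version is created and cached.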
+ assert path.startswith("/")
+ assert 'build' in self.internal_state
+
+ if self.brbe is None:
+ def _slkey_interactive(layer_version):
+ assert isinstance(layer_version, Layer_Version)
+ return len(layer_version.local_path)
+
+ # Heuristic: we always match the recipe to the deepest layer path among the discovered layers
+ for lvo in sorted(self.orm_wrapper.layer_version_objects, reverse=True, key=_slkey_interactive):
+ # we can match to the recipe file path
+ if path.startswith(lvo.local_path):
+ return lvo
+
+ else:
+ br_id, be_id = self.brbe.split(":")
+ from bldcontrol.bbcontroller import getBuildEnvironmentController
+ bc = getBuildEnvironmentController(pk = be_id)
+
+ def _slkey_managed(layer_version):
+ return len(bc.getGitCloneDirectory(layer_version.giturl, layer_version.commit) + layer_version.dirpath)
+
+ # Heuristics: we match the path to where the layers have been checked out
+ for brl in sorted(BuildRequest.objects.get(pk = br_id).brlayer_set.all(), reverse = True, key = _slkey_managed):
+ localdirname = os.path.join(bc.getGitCloneDirectory(brl.giturl, brl.commit), brl.dirpath)
+ # we get a relative path, unless running in HEAD mode where the path is absolute
+ if not localdirname.startswith("/"):
+ localdirname = os.path.join(bc.be.sourcedir, localdirname)
+ if path.startswith(localdirname):
+ #logger.warn("-- managed: matched path %s with layer %s " % (path, localdirname))
+ # we matched the BRLayer, but we need the layer_version that generated this br
+ for lvo in self.orm_wrapper.layer_version_objects:
+ if brl.name == lvo.layer.name:
+ return lvo
+
+ # if we get here, we did not read the layers correctly; dump whatever information we have to the error log
+ logger.warn("Could not match layer version for recipe path %s : %s", path, self.orm_wrapper.layer_version_objects)
+
+ # mock up the unknown layer
+ unknown_layer, _ = Layer.objects.get_or_create(name="__FIXME__unidentified_layer", layer_index_url="")
+ unknown_layer_version_obj, _ = Layer_Version.objects.get_or_create(layer = unknown_layer, build = self.internal_state['build'])
+
+ # append it so we don't run into this error again and again
+ self.orm_wrapper.layer_version_objects.append(unknown_layer_version_obj)
+
+ return unknown_layer_version_obj
+
+ def _get_recipe_information_from_taskfile(self, taskfile):
+ localfilepath = taskfile.split(":")[-1]
+ filepath_flags = ":".join(sorted(taskfile.split(":")[:-1]))
+ layer_version_obj = self._get_layer_version_for_path(localfilepath)
+
+ recipe_info = {}
+ recipe_info['layer_version'] = layer_version_obj
+ recipe_info['file_path'] = localfilepath
+ recipe_info['pathflags'] = filepath_flags
+
+ if recipe_info['file_path'].startswith(recipe_info['layer_version'].local_path):
+ recipe_info['file_path'] = recipe_info['file_path'][len(recipe_info['layer_version'].local_path):].lstrip("/")
+ else:
+ raise RuntimeError("Recipe file path %s is not under layer version at %s" % (recipe_info['file_path'], recipe_info['layer_version'].local_path))
+
+ return recipe_info
+
+ def _get_path_information(self, task_object):
+ assert isinstance(task_object, Task)
+ build_stats_format = "{tmpdir}/buildstats/{target}-{machine}/{buildname}/{package}/"
+ build_stats_path = []
+
+ for t in self.internal_state['targets']:
+ target = t.target
+ machine = self.internal_state['build'].machine
+ buildname = self.internal_state['build'].build_name
+ pe, pv = task_object.recipe.version.split(":",1)
+ if len(pe) > 0:
+ package = task_object.recipe.name + "-" + pe + "_" + pv
+ else:
+ package = task_object.recipe.name + "-" + pv
+
+ build_stats_path.append(build_stats_format.format(tmpdir=self.tmp_dir, target=target,
+ machine=machine, buildname=buildname,
+ package=package))
+
+ return build_stats_path
+
+
+ ################################
+ ## externally available methods to store information
+ @staticmethod
+ def _get_data_from_event(event):
+ evdata = None
+ if '_localdata' in vars(event):
+ evdata = event._localdata
+ elif 'data' in vars(event):
+ evdata = event.data
+ else:
+ raise Exception("Event with neither _localdata or data properties")
+ return evdata
+
+ def store_layer_info(self, event):
+ layerinfos = BuildInfoHelper._get_data_from_event(event)
+ self.internal_state['lvs'] = {}
+ for layer in layerinfos:
+ try:
+ self.internal_state['lvs'][self.orm_wrapper.get_update_layer_object(layerinfos[layer], self.brbe)] = layerinfos[layer]['version']
+ self.internal_state['lvs'][self.orm_wrapper.get_update_layer_object(layerinfos[layer], self.brbe)]['local_path'] = layerinfos[layer]['local_path']
+ except NotExisting as nee:
+ logger.warn("buildinfohelper: cannot identify layer exception:%s ", nee)
+
+
+ def store_started_build(self, event):
+ assert '_pkgs' in vars(event)
+ build_information = self._get_build_information()
+
+ build_obj = self.orm_wrapper.create_build_object(build_information, self.brbe, self.project)
+
+ self.internal_state['build'] = build_obj
+
+ # save layer version information for this build
+ if not 'lvs' in self.internal_state:
+ logger.error("Layer version information not found; Check if the bitbake server was configured to inherit toaster.bbclass.")
+ else:
+ for layer_obj in self.internal_state['lvs']:
+ self.orm_wrapper.get_update_layer_version_object(build_obj, layer_obj, self.internal_state['lvs'][layer_obj])
+
+ del self.internal_state['lvs']
+
+ # create target information
+ target_information = {}
+ target_information['targets'] = event._pkgs
+ target_information['build'] = build_obj
+
+ self.internal_state['targets'] = self.orm_wrapper.create_target_objects(target_information)
+
+ # Save build configuration
+ data = self.server.runCommand(["getAllKeysWithFlags", ["doc", "func"]])[0]
+
+ # convert the paths from absolute to relative to either the build directory or layer checkouts
+ path_prefixes = []
+
+ if self.brbe is not None:
+ _, be_id = self.brbe.split(":")
+ be = BuildEnvironment.objects.get(pk = be_id)
+ path_prefixes.append(be.builddir)
+
+ for layer in sorted(self.orm_wrapper.layer_version_objects, key = lambda x:len(x.local_path), reverse=True):
+ path_prefixes.append(layer.local_path)
+
+ # we strip the prefixes
+ for k in data:
+ if not bool(data[k]['func']):
+ for vh in data[k]['history']:
+ if not 'documentation.conf' in vh['file']:
+ abs_file_name = vh['file']
+ for pp in path_prefixes:
+ if abs_file_name.startswith(pp + "/"):
+ vh['file']=abs_file_name[len(pp + "/"):]
+ break
+
+ # save the variables
+ self.orm_wrapper.save_build_variables(build_obj, data)
+
+ return self.brbe
+
+
+ def update_target_image_file(self, event):
+ evdata = BuildInfoHelper._get_data_from_event(event)
+
+ for t in self.internal_state['targets']:
+ if t.is_image:
+ output_files = list(evdata.viewkeys())
+ for output in output_files:
+ if t.target in output and 'rootfs' in output and not output.endswith(".manifest"):
+ self.orm_wrapper.save_target_image_file_information(t, output, evdata[output])
+
+ def update_artifact_image_file(self, event):
+ evdata = BuildInfoHelper._get_data_from_event(event)
+ for artifact_path in evdata.keys():
+ self.orm_wrapper.save_artifact_information(self.internal_state['build'], artifact_path, evdata[artifact_path])
+
+ def update_build_information(self, event, errors, warnings, taskfailures):
+ if 'build' in self.internal_state:
+ self.orm_wrapper.update_build_object(self.internal_state['build'], errors, warnings, taskfailures)
+
+
+ def store_license_manifest_path(self, event):
+ deploy_dir = BuildInfoHelper._get_data_from_event(event)['deploy_dir']
+ image_name = BuildInfoHelper._get_data_from_event(event)['image_name']
+ path = deploy_dir + "/licenses/" + image_name + "/license.manifest"
+ for target in self.internal_state['targets']:
+ if target.target in image_name:
+ self.orm_wrapper.update_target_set_license_manifest(target, path)
+
+
+ def store_started_task(self, event):
+ assert isinstance(event, (bb.runqueue.sceneQueueTaskStarted, bb.runqueue.runQueueTaskStarted, bb.runqueue.runQueueTaskSkipped))
+ assert 'taskfile' in vars(event)
+ localfilepath = event.taskfile.split(":")[-1]
+ assert localfilepath.startswith("/")
+
+ identifier = event.taskfile + ":" + event.taskname
+
+ recipe_information = self._get_recipe_information_from_taskfile(event.taskfile)
+ recipe = self.orm_wrapper.get_update_recipe_object(recipe_information, True)
+
+ task_information = self._get_task_information(event, recipe)
+ task_information['outcome'] = Task.OUTCOME_NA
+
+ if isinstance(event, bb.runqueue.runQueueTaskSkipped):
+ assert 'reason' in vars(event)
+ task_information['task_executed'] = False
+ if event.reason == "covered":
+ task_information['outcome'] = Task.OUTCOME_COVERED
+ if event.reason == "existing":
+ task_information['outcome'] = Task.OUTCOME_PREBUILT
+ else:
+ task_information['task_executed'] = True
+ if 'noexec' in vars(event) and event.noexec:
+ task_information['task_executed'] = False
+ task_information['outcome'] = Task.OUTCOME_EMPTY
+ task_information['script_type'] = Task.CODING_NA
+
+ # do not assign order numbers to setscene tasks
+ if not isinstance(event, bb.runqueue.sceneQueueTaskStarted):
+ self.task_order += 1
+ task_information['order'] = self.task_order
+
+ self.orm_wrapper.get_update_task_object(task_information)
+
+ self.internal_state['taskdata'][identifier] = {
+ 'outcome': task_information['outcome'],
+ }
+
+
+ def store_tasks_stats(self, event):
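+ # The event payload is a list of (taskfile, taskname, taskstats, recipename)
+ # tuples; each entry updates cpu usage, disk I/O and elapsed time on an
+ # already-existing Task object.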
+ for (taskfile, taskname, taskstats, recipename) in BuildInfoHelper._get_data_from_event(event):
+ localfilepath = taskfile.split(":")[-1]
+ assert localfilepath.startswith("/")
+
+ recipe_information = self._get_recipe_information_from_taskfile(taskfile)
+ try:
+ if recipe_information['file_path'].startswith(recipe_information['layer_version'].local_path):
+ recipe_information['file_path'] = recipe_information['file_path'][len(recipe_information['layer_version'].local_path):].lstrip("/")
+
+ recipe_object = Recipe.objects.get(layer_version = recipe_information['layer_version'],
+ file_path__endswith = recipe_information['file_path'],
+ name = recipename)
+ except Recipe.DoesNotExist:
+ logger.error("Could not find recipe for recipe_information %s name %s" , pformat(recipe_information), recipename)
+ raise
+
+ task_information = {}
+ task_information['build'] = self.internal_state['build']
+ task_information['recipe'] = recipe_object
+ task_information['task_name'] = taskname
+ task_information['cpu_usage'] = taskstats['cpu_usage']
+ task_information['disk_io'] = taskstats['disk_io']
+ if 'elapsed_time' in taskstats:
+ task_information['elapsed_time'] = taskstats['elapsed_time']
+ self.orm_wrapper.get_update_task_object(task_information, True) # must exist
+
+ def update_and_store_task(self, event):
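+ # Called for task started/succeeded/failed style events: resolve the task
+ # identifier (guessing the prefix for bb.build.TaskBase events), accumulate
+ # start/end times and outcome, and push the result to the ORM wrapper.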
+ assert 'taskfile' in vars(event)
+ localfilepath = event.taskfile.split(":")[-1]
+ assert localfilepath.startswith("/")
+
+ identifier = event.taskfile + ":" + event.taskname
+ if not identifier in self.internal_state['taskdata']:
+ if isinstance(event, bb.build.TaskBase):
+ # we do a bit of guessing
+ candidates = [x for x in self.internal_state['taskdata'].keys() if x.endswith(identifier)]
+ if len(candidates) == 1:
+ identifier = candidates[0]
+
+ assert identifier in self.internal_state['taskdata']
+ identifierlist = identifier.split(":")
+ realtaskfile = ":".join(identifierlist[0:len(identifierlist)-1])
+ recipe_information = self._get_recipe_information_from_taskfile(realtaskfile)
+ recipe = self.orm_wrapper.get_update_recipe_object(recipe_information, True)
+ task_information = self._get_task_information(event,recipe)
+
+ if 'time' in vars(event):
+ if not 'start_time' in self.internal_state['taskdata'][identifier]:
+ self.internal_state['taskdata'][identifier]['start_time'] = event.time
+ else:
+ task_information['end_time'] = event.time
+ task_information['start_time'] = self.internal_state['taskdata'][identifier]['start_time']
+
+ task_information['outcome'] = self.internal_state['taskdata'][identifier]['outcome']
+
+ if 'logfile' in vars(event):
+ task_information['logfile'] = event.logfile
+
+ if '_message' in vars(event):
+ task_information['message'] = event._message
+
+ if 'taskflags' in vars(event):
+ # with TaskStarted, we get even more information
+ if 'python' in event.taskflags.keys() and event.taskflags['python'] == '1':
+ task_information['script_type'] = Task.CODING_PYTHON
+ else:
+ task_information['script_type'] = Task.CODING_SHELL
+
+ if task_information['outcome'] == Task.OUTCOME_NA:
+ if isinstance(event, (bb.runqueue.runQueueTaskCompleted, bb.runqueue.sceneQueueTaskCompleted)):
+ task_information['outcome'] = Task.OUTCOME_SUCCESS
+ del self.internal_state['taskdata'][identifier]
+
+ if isinstance(event, (bb.runqueue.runQueueTaskFailed, bb.runqueue.sceneQueueTaskFailed)):
+ task_information['outcome'] = Task.OUTCOME_FAILED
+ del self.internal_state['taskdata'][identifier]
+
+ if not connection.features.autocommits_when_autocommit_is_off:
+ # we force a sync point here, to get the progress bar to show
+ if self.autocommit_step % 3 == 0:
+ transaction.set_autocommit(True)
+ transaction.set_autocommit(False)
+ self.autocommit_step += 1
+
+ self.orm_wrapper.get_update_task_object(task_information, True) # must exist
+
+
+ def store_missed_state_tasks(self, event):
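+ # The event data contains 'missed' and 'found' lists of
+ # (fn, taskname, taskhash, sstatefile) tuples; missed entries are recorded
+ # as sstate misses, found entries only get their sstate object path stored.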
+ for (fn, taskname, taskhash, sstatefile) in BuildInfoHelper._get_data_from_event(event)['missed']:
+
+ # identifier = fn + taskname + "_setscene"
+ recipe_information = self._get_recipe_information_from_taskfile(fn)
+ recipe = self.orm_wrapper.get_update_recipe_object(recipe_information)
+ mevent = MockEvent()
+ mevent.taskname = taskname
+ mevent.taskhash = taskhash
+ task_information = self._get_task_information(mevent,recipe)
+
+ task_information['start_time'] = timezone.now()
+ task_information['outcome'] = Task.OUTCOME_NA
+ task_information['sstate_checksum'] = taskhash
+ task_information['sstate_result'] = Task.SSTATE_MISS
+ task_information['path_to_sstate_obj'] = sstatefile
+
+ self.orm_wrapper.get_update_task_object(task_information)
+
+ for (fn, taskname, taskhash, sstatefile) in BuildInfoHelper._get_data_from_event(event)['found']:
+
+ # identifier = fn + taskname + "_setscene"
+ recipe_information = self._get_recipe_information_from_taskfile(fn)
+ recipe = self.orm_wrapper.get_update_recipe_object(recipe_information)
+ mevent = MockEvent()
+ mevent.taskname = taskname
+ mevent.taskhash = taskhash
+ task_information = self._get_task_information(mevent,recipe)
+
+ task_information['path_to_sstate_obj'] = sstatefile
+
+ self.orm_wrapper.get_update_task_object(task_information)
+
+
+ def store_target_package_data(self, event):
+ # for all image targets
+ for target in self.internal_state['targets']:
+ if target.is_image:
+ try:
+ pkgdata = BuildInfoHelper._get_data_from_event(event)['pkgdata']
+ imgdata = BuildInfoHelper._get_data_from_event(event)['imgdata'][target.target]
+ self.orm_wrapper.save_target_package_information(self.internal_state['build'], target, imgdata, pkgdata, self.internal_state['recipes'])
+ filedata = BuildInfoHelper._get_data_from_event(event)['filedata'][target.target]
+ self.orm_wrapper.save_target_file_information(self.internal_state['build'], target, filedata)
+ except KeyError:
+ # we must not have received the data for this image; nothing to save
+ pass
+
+
+
+ def store_dependency_information(self, event):
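+ # Walk the dependency graph event: store layer priorities, recipe metadata,
+ # build-time recipe dependencies, and finally the task objects together with
+ # their inter-task dependencies.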
+ assert '_depgraph' in vars(event)
+ assert 'layer-priorities' in event._depgraph
+ assert 'pn' in event._depgraph
+ assert 'tdepends' in event._depgraph
+
+ errormsg = ""
+
+ # save layer version priorities
+ if 'layer-priorities' in event._depgraph.keys():
+ for lv in event._depgraph['layer-priorities']:
+ (_, path, _, priority) = lv
+ layer_version_obj = self._get_layer_version_for_path(path[1:]) # paths start with a ^
+ assert layer_version_obj is not None
+ layer_version_obj.priority = priority
+ layer_version_obj.save()
+
+ # save recipe information
+ self.internal_state['recipes'] = {}
+ for pn in event._depgraph['pn']:
+
+ file_name = event._depgraph['pn'][pn]['filename'].split(":")[-1]
+ pathflags = ":".join(sorted(event._depgraph['pn'][pn]['filename'].split(":")[:-1]))
+ layer_version_obj = self._get_layer_version_for_path(file_name)
+
+ assert layer_version_obj is not None
+
+ recipe_info = {}
+ recipe_info['name'] = pn
+ recipe_info['layer_version'] = layer_version_obj
+
+ if 'version' in event._depgraph['pn'][pn]:
+ recipe_info['version'] = event._depgraph['pn'][pn]['version'].lstrip(":")
+
+ if 'summary' in event._depgraph['pn'][pn]:
+ recipe_info['summary'] = event._depgraph['pn'][pn]['summary']
+
+ if 'license' in event._depgraph['pn'][pn]:
+ recipe_info['license'] = event._depgraph['pn'][pn]['license']
+
+ if 'description' in event._depgraph['pn'][pn]:
+ recipe_info['description'] = event._depgraph['pn'][pn]['description']
+
+ if 'section' in event._depgraph['pn'][pn]:
+ recipe_info['section'] = event._depgraph['pn'][pn]['section']
+
+ if 'homepage' in event._depgraph['pn'][pn]:
+ recipe_info['homepage'] = event._depgraph['pn'][pn]['homepage']
+
+ if 'bugtracker' in event._depgraph['pn'][pn]:
+ recipe_info['bugtracker'] = event._depgraph['pn'][pn]['bugtracker']
+
+ recipe_info['file_path'] = file_name
+ recipe_info['pathflags'] = pathflags
+
+ if recipe_info['file_path'].startswith(recipe_info['layer_version'].local_path):
+ recipe_info['file_path'] = recipe_info['file_path'][len(recipe_info['layer_version'].local_path):].lstrip("/")
+ else:
+ raise RuntimeError("Recipe file path %s is not under layer version at %s" % (recipe_info['file_path'], recipe_info['layer_version'].local_path))
+
+ recipe = self.orm_wrapper.get_update_recipe_object(recipe_info)
+ recipe.is_image = False
+ if 'inherits' in event._depgraph['pn'][pn].keys():
+ for cls in event._depgraph['pn'][pn]['inherits']:
+ if cls.endswith('/image.bbclass'):
+ recipe.is_image = True
+ break
+ if recipe.is_image:
+ for t in self.internal_state['targets']:
+ if pn == t.target:
+ t.is_image = True
+ t.save()
+ self.internal_state['recipes'][pn] = recipe
+
+ # we will not get recipes for keys whose values are listed in ASSUME_PROVIDED
+
+ assume_provided = self.server.runCommand(["getVariable", "ASSUME_PROVIDED"])[0].split()
+
+ # save build-time recipe dependencies
+ recipedeps_objects = []
+ for recipe in event._depgraph['depends']:
+ try:
+ target = self.internal_state['recipes'][recipe]
+ for dep in event._depgraph['depends'][recipe]:
+ dependency = self.internal_state['recipes'][dep]
+ recipedeps_objects.append(Recipe_Dependency( recipe = target,
+ depends_on = dependency, dep_type = Recipe_Dependency.TYPE_DEPENDS))
+ except KeyError as e:
+ if e not in assume_provided and not str(e).startswith("virtual/"):
+ errormsg += " stpd: KeyError saving recipe dependency for %s, %s \n" % (recipe, e)
+ Recipe_Dependency.objects.bulk_create(recipedeps_objects)
+
+ # save all task information
+ def _save_a_task(taskdesc):
+ spec = re.split(r'\.', taskdesc)
+ pn = ".".join(spec[0:-1])
+ taskname = spec[-1]
+ e = event
+ e.taskname = pn
+ recipe = self.internal_state['recipes'][pn]
+ task_info = self._get_task_information(e, recipe)
+ task_info['task_name'] = taskname
+ task_obj = self.orm_wrapper.get_update_task_object(task_info)
+ return task_obj
+
+ # create tasks
+ tasks = {}
+ for taskdesc in event._depgraph['tdepends']:
+ tasks[taskdesc] = _save_a_task(taskdesc)
+
+ # create dependencies between tasks
+ taskdeps_objects = []
+ for taskdesc in event._depgraph['tdepends']:
+ target = tasks[taskdesc]
+ for taskdep in event._depgraph['tdepends'][taskdesc]:
+ if taskdep not in tasks:
+ # task info was not collected previously; fetch and save it now
+ dep = _save_a_task(taskdep)
+ else:
+ dep = tasks[taskdep]
+ taskdeps_objects.append(Task_Dependency( task = target, depends_on = dep ))
+ Task_Dependency.objects.bulk_create(taskdeps_objects)
+
+ if len(errormsg) > 0:
+ logger.warn("buildinfohelper: dependency info not identify recipes: \n%s", errormsg)
+
+
+ def store_build_package_information(self, event):
+ package_info = BuildInfoHelper._get_data_from_event(event)
+ self.orm_wrapper.save_build_package_information(self.internal_state['build'],
+ package_info,
+ self.internal_state['recipes'],
+ )
+
+ def _store_build_done(self, errorcode):
+ logger.info("Build exited with errorcode %d", errorcode)
+ br_id, be_id = self.brbe.split(":")
+ be = BuildEnvironment.objects.get(pk = be_id)
+ be.lock = BuildEnvironment.LOCK_LOCK
+ be.save()
+ br = BuildRequest.objects.get(pk = br_id)
+ if errorcode == 0:
+ # request archival of the project artifacts
+ br.state = BuildRequest.REQ_ARCHIVE
+ else:
+ br.state = BuildRequest.REQ_FAILED
+ br.save()
+
+
+ def store_log_error(self, text):
+ mockevent = MockEvent()
+ mockevent.levelno = formatter.ERROR
+ mockevent.msg = text
+ mockevent.pathname = '-- None'
+ mockevent.lineno = LogMessage.ERROR
+ self.store_log_event(mockevent)
+
+ def store_log_exception(self, text, backtrace = ""):
+ mockevent = MockEvent()
+ mockevent.levelno = -1
+ mockevent.msg = text
+ mockevent.pathname = backtrace
+ mockevent.lineno = -1
+ self.store_log_event(mockevent)
+
+
+ def store_log_event(self, event):
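+ # Events that arrive before the Build object exists are kept in a backlog
+ # and flushed to the database once the build is known (or reported as
+ # unsaved in close() when no build ever materializes).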
+ if event.levelno < formatter.WARNING:
+ return
+
+ if 'args' in vars(event):
+ event.msg = event.msg % event.args
+
+ if not 'build' in self.internal_state:
+ if self.brbe is None:
+ if not 'backlog' in self.internal_state:
+ self.internal_state['backlog'] = []
+ self.internal_state['backlog'].append(event)
+ return
+ else: # we're under Toaster control, the build is already created
+ br, _ = self.brbe.split(":")
+ buildrequest = BuildRequest.objects.get(pk = br)
+ self.internal_state['build'] = buildrequest.build
+
+ if 'build' in self.internal_state and 'backlog' in self.internal_state:
+ # if we have a backlog of events, do our best to save them here
+ if len(self.internal_state['backlog']):
+ tempevent = self.internal_state['backlog'].pop()
+ logger.debug(1, "buildinfohelper: Saving stored event %s " % tempevent)
+ self.store_log_event(tempevent)
+ else:
+ logger.info("buildinfohelper: All events saved")
+ del self.internal_state['backlog']
+
+ log_information = {}
+ log_information['build'] = self.internal_state['build']
+ if event.levelno == formatter.ERROR:
+ log_information['level'] = LogMessage.ERROR
+ elif event.levelno == formatter.WARNING:
+ log_information['level'] = LogMessage.WARNING
+ elif event.levelno == -2: # toaster self-logging
+ log_information['level'] = -2
+ else:
+ log_information['level'] = LogMessage.INFO
+
+ log_information['message'] = event.msg
+ log_information['pathname'] = event.pathname
+ log_information['lineno'] = event.lineno
+ logger.info("Logging error 2: %s", log_information)
+ self.orm_wrapper.create_logmessage(log_information)
+
+ def close(self, errorcode):
+ if self.brbe is not None:
+ self._store_build_done(errorcode)
+
+ if 'backlog' in self.internal_state:
+ if 'build' in self.internal_state:
+ # we save missed events in the database for the current build
+ tempevent = self.internal_state['backlog'].pop()
+ self.store_log_event(tempevent)
+ else:
+ # we have no build, and we still have events; something amazingly wrong happened
+ for event in self.internal_state['backlog']:
+ logger.error("UNSAVED log: %s", event.msg)
+
+ if not connection.features.autocommits_when_autocommit_is_off:
+ transaction.set_autocommit(True)
diff --git a/bitbake/lib/bb/ui/crumbs/__init__.py b/bitbake/lib/bb/ui/crumbs/__init__.py
new file mode 100644
index 0000000..b7cbe1a
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/__init__.py
@@ -0,0 +1,17 @@
+#
+# Gtk+ UI pieces for BitBake
+#
+# Copyright (C) 2006-2007 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
diff --git a/bitbake/lib/bb/ui/crumbs/builddetailspage.py b/bitbake/lib/bb/ui/crumbs/builddetailspage.py
new file mode 100755
index 0000000..7fc690e
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/builddetailspage.py
@@ -0,0 +1,437 @@
+#!/usr/bin/env python
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2012 Intel Corporation
+#
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os
+import gtk
+import pango
+import gobject
+import bb.process
+from bb.ui.crumbs.progressbar import HobProgressBar
+from bb.ui.crumbs.hobwidget import hic, HobNotebook, HobAltButton, HobWarpCellRendererText, HobButton, HobInfoButton
+from bb.ui.crumbs.runningbuild import RunningBuildTreeView
+from bb.ui.crumbs.runningbuild import BuildFailureTreeView
+from bb.ui.crumbs.hobpages import HobPage
+from bb.ui.crumbs.hobcolor import HobColors
+
+class BuildConfigurationTreeView(gtk.TreeView):
+ def __init__ (self):
+ gtk.TreeView.__init__(self)
+ self.set_rules_hint(False)
+ self.set_headers_visible(False)
+ self.set_property("hover-expand", True)
+ self.get_selection().set_mode(gtk.SELECTION_SINGLE)
+
+ # The column showing the configuration variable names.
+ renderer0 = gtk.CellRendererText()
+ renderer0.set_property('font-desc', pango.FontDescription('courier bold 12'))
+ col0 = gtk.TreeViewColumn ("Name", renderer0, text=0)
+ self.append_column (col0)
+
+ # The column showing the configuration values.
+ renderer1 = HobWarpCellRendererText(col_number=1)
+ col1 = gtk.TreeViewColumn ("Values", renderer1, text=1)
+ self.append_column (col1)
+
+ def set_vars(self, key="", var=[""]):
+ d = {}
+ if type(var) == str:
+ d = {key: [var]}
+ elif type(var) == list and len(var) > 1:
+ # create the sub-item lines
+ l = []
+ text = ""
+ for item in var:
+ text = " - " + item
+ l.append(text)
+ d = {key: var}
+
+ return d
+
+ def set_config_model(self, show_vars):
+ listmodel = gtk.TreeStore(gobject.TYPE_STRING, gobject.TYPE_STRING)
+ parent = None
+ for var in show_vars:
+ for subitem in var.items():
+ name = subitem[0]
+ is_parent = True
+ for value in subitem[1]:
+ if is_parent:
+ parent = listmodel.append(parent, (name, value))
+ is_parent = False
+ else:
+ listmodel.append(parent, (None, value))
+ name = " - "
+ parent = None
+ # renew the tree model after getting the configuration messages
+ self.set_model(listmodel)
+
+ def show(self, src_config_info, src_params):
+ vars = []
+ vars.append(self.set_vars("BB version:", src_params.bb_version))
+ vars.append(self.set_vars("Target arch:", src_params.target_arch))
+ vars.append(self.set_vars("Target OS:", src_params.target_os))
+ vars.append(self.set_vars("Machine:", src_config_info.curr_mach))
+ vars.append(self.set_vars("Distro:", src_config_info.curr_distro))
+ vars.append(self.set_vars("Distro version:", src_params.distro_version))
+ vars.append(self.set_vars("SDK machine:", src_config_info.curr_sdk_machine))
+ vars.append(self.set_vars("Tune features:", src_params.tune_pkgarch))
+ vars.append(self.set_vars("Layers:", src_config_info.layers))
+
+ for path in src_config_info.layers:
+ import os, os.path
+ if os.path.exists(path):
+ branch = bb.process.run('cd %s; git branch | grep "^* " | tr -d "* "' % path)[0]
+ if branch.startswith("fatal:"):
+ branch = "(unknown)"
+ if branch:
+ branch = branch.strip('\n')
+ vars.append(self.set_vars("Branch:", branch))
+ break
+
+ self.set_config_model(vars)
+
+ def reset(self):
+ self.set_model(None)
+
+#
+# BuildDetailsPage
+#
+
+class BuildDetailsPage (HobPage):
+
+ def __init__(self, builder):
+ super(BuildDetailsPage, self).__init__(builder, "Building ...")
+
+ self.num_of_issues = 0
+ self.endpath = (0,)
+ # create visual elements
+ self.create_visual_elements()
+
+ def create_visual_elements(self):
+ # create visual elements
+ self.vbox = gtk.VBox(False, 12)
+
+ self.progress_box = gtk.VBox(False, 12)
+ self.task_status = gtk.Label("\n") # to ensure layout is correct
+ self.task_status.set_alignment(0.0, 0.5)
+ self.progress_box.pack_start(self.task_status, expand=False, fill=False)
+ self.progress_hbox = gtk.HBox(False, 6)
+ self.progress_box.pack_end(self.progress_hbox, expand=True, fill=True)
+ self.progress_bar = HobProgressBar()
+ self.progress_hbox.pack_start(self.progress_bar, expand=True, fill=True)
+ self.stop_button = HobAltButton("Stop")
+ self.stop_button.connect("clicked", self.stop_button_clicked_cb)
+ self.stop_button.set_sensitive(False)
+ self.progress_hbox.pack_end(self.stop_button, expand=False, fill=False)
+
+ self.notebook = HobNotebook()
+ self.config_tv = BuildConfigurationTreeView()
+ self.scrolled_view_config = gtk.ScrolledWindow ()
+ self.scrolled_view_config.set_policy(gtk.POLICY_NEVER, gtk.POLICY_ALWAYS)
+ self.scrolled_view_config.add(self.config_tv)
+ self.notebook.append_page(self.scrolled_view_config, "Build configuration")
+
+ self.failure_tv = BuildFailureTreeView()
+ self.failure_model = self.builder.handler.build.model.failure_model()
+ self.failure_tv.set_model(self.failure_model)
+ self.scrolled_view_failure = gtk.ScrolledWindow ()
+ self.scrolled_view_failure.set_policy(gtk.POLICY_NEVER, gtk.POLICY_ALWAYS)
+ self.scrolled_view_failure.add(self.failure_tv)
+ self.notebook.append_page(self.scrolled_view_failure, "Issues")
+
+ self.build_tv = RunningBuildTreeView(readonly=True, hob=True)
+ self.build_tv.set_model(self.builder.handler.build.model)
+ self.scrolled_view_build = gtk.ScrolledWindow ()
+ self.scrolled_view_build.set_policy(gtk.POLICY_NEVER, gtk.POLICY_ALWAYS)
+ self.scrolled_view_build.add(self.build_tv)
+ self.notebook.append_page(self.scrolled_view_build, "Log")
+
+ self.builder.handler.build.model.connect_after("row-changed", self.scroll_to_present_row, self.scrolled_view_build.get_vadjustment(), self.build_tv)
+
+ self.button_box = gtk.HBox(False, 6)
+ self.back_button = HobAltButton('<< Back')
+ self.back_button.connect("clicked", self.back_button_clicked_cb)
+ self.button_box.pack_start(self.back_button, expand=False, fill=False)
+
+ def update_build_status(self, current, total, task):
+ recipe_path, recipe_task = task.split(", ")
+ recipe = os.path.basename(recipe_path).rstrip(".bb")
+ tsk_msg = "<b>Running task %s of %s:</b> %s\n<b>Recipe:</b> %s" % (current, total, recipe_task, recipe)
+ self.task_status.set_markup(tsk_msg)
+ self.stop_button.set_sensitive(True)
+
+ def reset_build_status(self):
+ self.task_status.set_markup("\n") # to ensure layout is correct
+ self.endpath = (0,)
+
+ def show_issues(self):
+ self.num_of_issues += 1
+ self.notebook.show_indicator_icon("Issues", self.num_of_issues)
+ self.notebook.queue_draw()
+
+ def reset_issues(self):
+ self.num_of_issues = 0
+ self.notebook.hide_indicator_icon("Issues")
+
+ def _remove_all_widget(self):
+ children = self.vbox.get_children() or []
+ for child in children:
+ self.vbox.remove(child)
+ children = self.box_group_area.get_children() or []
+ for child in children:
+ self.box_group_area.remove(child)
+ children = self.get_children() or []
+ for child in children:
+ self.remove(child)
+
+ def add_build_fail_top_bar(self, actions, log_file=None):
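+ # Build the error-colored "build failed" banner: error icon, title,
+ # explanatory text (with a dedicated message when the disk is full) and the
+ # edit / open-log / file-a-bug action buttons.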
+ primary_action = "Edit %s" % actions
+
+ color = HobColors.ERROR
+ build_fail_top = gtk.EventBox()
+ #build_fail_top.set_size_request(-1, 200)
+ build_fail_top.modify_bg(gtk.STATE_NORMAL, gtk.gdk.color_parse(color))
+
+ build_fail_tab = gtk.Table(14, 46, True)
+ build_fail_top.add(build_fail_tab)
+
+ icon = gtk.Image()
+ icon_pix_buffer = gtk.gdk.pixbuf_new_from_file(hic.ICON_INDI_ERROR_FILE)
+ icon.set_from_pixbuf(icon_pix_buffer)
+ build_fail_tab.attach(icon, 1, 4, 0, 6)
+
+ label = gtk.Label()
+ label.set_alignment(0.0, 0.5)
+ label.set_markup("<span size='x-large'><b>%s</b></span>" % self.title)
+ build_fail_tab.attach(label, 4, 26, 0, 6)
+
+ label = gtk.Label()
+ label.set_alignment(0.0, 0.5)
+ # Ensure variable disk_full is defined
+ if not hasattr(self.builder, 'disk_full'):
+ self.builder.disk_full = False
+
+ if self.builder.disk_full:
+ markup = "<span size='medium'>There is no disk space left, so Hob cannot finish building your image. Free up some disk space\n"
+ markup += "and restart the build. Check the \"Issues\" tab for more details</span>"
+ label.set_markup(markup)
+ else:
+ label.set_markup("<span size='medium'>Check the \"Issues\" information for more details</span>")
+ build_fail_tab.attach(label, 4, 40, 4, 9)
+
+ # create button 'Edit packages'
+ action_button = HobButton(primary_action)
+ #action_button.set_size_request(-1, 40)
+ action_button.set_tooltip_text("Edit the %s parameters" % actions)
+ action_button.connect('clicked', self.failure_primary_action_button_clicked_cb, primary_action)
+
+ if log_file:
+ open_log_button = HobAltButton("Open log")
+ open_log_button.set_relief(gtk.RELIEF_HALF)
+ open_log_button.set_tooltip_text("Open the build's log file")
+ open_log_button.connect('clicked', self.open_log_button_clicked_cb, log_file)
+
+ attach_pos = (24 if log_file else 14)
+ file_bug_button = HobAltButton('File a bug')
+ file_bug_button.set_relief(gtk.RELIEF_HALF)
+ file_bug_button.set_tooltip_text("Open the Yocto Project bug tracking website")
+ file_bug_button.connect('clicked', self.failure_activate_file_bug_link_cb)
+
+ if not self.builder.disk_full:
+ build_fail_tab.attach(action_button, 4, 13, 9, 12)
+ if log_file:
+ build_fail_tab.attach(open_log_button, 14, 23, 9, 12)
+ build_fail_tab.attach(file_bug_button, attach_pos, attach_pos + 9, 9, 12)
+
+ else:
+ restart_build = HobButton("Restart the build")
+ restart_build.set_tooltip_text("Restart the build")
+ restart_build.connect('clicked', self.restart_build_button_clicked_cb)
+
+ build_fail_tab.attach(restart_build, 4, 13, 9, 12)
+ build_fail_tab.attach(action_button, 14, 23, 9, 12)
+ if log_file:
+ build_fail_tab.attach(open_log_button, attach_pos, attach_pos + 9, 9, 12)
+
+ self.builder.disk_full = False
+ return build_fail_top
+
+ def show_fail_page(self, title):
+ self._remove_all_widget()
+ self.title = "Hob cannot build your %s" % title
+
+ self.build_fail_bar = self.add_build_fail_top_bar(title, self.builder.current_logfile)
+
+ self.pack_start(self.group_align, expand=True, fill=True)
+ self.box_group_area.pack_start(self.build_fail_bar, expand=False, fill=False)
+ self.box_group_area.pack_start(self.vbox, expand=True, fill=True)
+
+ self.vbox.pack_start(self.notebook, expand=True, fill=True)
+ self.show_all()
+ self.notebook.set_page("Issues")
+ self.back_button.hide()
+
+ def add_build_stop_top_bar(self, action, log_file=None):
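+ # Build the light-gray "build stopped" banner with an edit action button,
+ # an optional "Open log" button and a "Build new image" button.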
+ color = HobColors.LIGHT_GRAY
+ build_stop_top = gtk.EventBox()
+ #build_stop_top.set_size_request(-1, 200)
+ build_stop_top.modify_bg(gtk.STATE_NORMAL, gtk.gdk.color_parse(color))
+ build_stop_top.set_flags(gtk.CAN_DEFAULT)
+ build_stop_top.grab_default()
+
+ build_stop_tab = gtk.Table(11, 46, True)
+ build_stop_top.add(build_stop_tab)
+
+ icon = gtk.Image()
+ icon_pix_buffer = gtk.gdk.pixbuf_new_from_file(hic.ICON_INFO_HOVER_FILE)
+ icon.set_from_pixbuf(icon_pix_buffer)
+ build_stop_tab.attach(icon, 1, 4, 0, 6)
+
+ label = gtk.Label()
+ label.set_alignment(0.0, 0.5)
+ label.set_markup("<span size='x-large'><b>%s</b></span>" % self.title)
+ build_stop_tab.attach(label, 4, 26, 0, 6)
+
+ action_button = HobButton("Edit %s" % action)
+ action_button.set_size_request(-1, 40)
+ if action == "image":
+ action_button.set_tooltip_text("Edit the image parameters")
+ elif action == "recipes":
+ action_button.set_tooltip_text("Edit the included recipes")
+ elif action == "packages":
+ action_button.set_tooltip_text("Edit the included packages")
+ action_button.connect('clicked', self.stop_primary_action_button_clicked_cb, action)
+ build_stop_tab.attach(action_button, 4, 13, 6, 9)
+
+ if log_file:
+ open_log_button = HobAltButton("Open log")
+ open_log_button.set_relief(gtk.RELIEF_HALF)
+ open_log_button.set_tooltip_text("Open the build's log file")
+ open_log_button.connect('clicked', self.open_log_button_clicked_cb, log_file)
+ build_stop_tab.attach(open_log_button, 14, 23, 6, 9)
+
+ attach_pos = (24 if log_file else 14)
+ build_button = HobAltButton("Build new image")
+ #build_button.set_size_request(-1, 40)
+ build_button.set_tooltip_text("Create a new image from scratch")
+ build_button.connect('clicked', self.new_image_button_clicked_cb)
+ build_stop_tab.attach(build_button, attach_pos, attach_pos + 9, 6, 9)
+
+ return build_stop_top, action_button
+
+ def show_stop_page(self, action):
+ self._remove_all_widget()
+ self.title = "Build stopped"
+ self.build_stop_bar, action_button = self.add_build_stop_top_bar(action, self.builder.current_logfile)
+
+ self.pack_start(self.group_align, expand=True, fill=True)
+ self.box_group_area.pack_start(self.build_stop_bar, expand=False, fill=False)
+ self.box_group_area.pack_start(self.vbox, expand=True, fill=True)
+
+ self.vbox.pack_start(self.notebook, expand=True, fill=True)
+ self.show_all()
+ self.back_button.hide()
+ return action_button
+
+ def show_page(self, step):
+ self._remove_all_widget()
+ if step == self.builder.PACKAGE_GENERATING or step == self.builder.FAST_IMAGE_GENERATING:
+ self.title = "Building packages ..."
+ else:
+ self.title = "Building image ..."
+ self.build_details_top = self.add_onto_top_bar(None)
+ self.pack_start(self.build_details_top, expand=False, fill=False)
+ self.pack_start(self.group_align, expand=True, fill=True)
+
+ self.box_group_area.pack_start(self.vbox, expand=True, fill=True)
+
+ self.progress_bar.reset()
+ self.config_tv.reset()
+ self.vbox.pack_start(self.progress_box, expand=False, fill=False)
+
+ self.vbox.pack_start(self.notebook, expand=True, fill=True)
+
+ self.box_group_area.pack_end(self.button_box, expand=False, fill=False)
+ self.show_all()
+ self.notebook.set_page("Log")
+ self.back_button.hide()
+
+ self.reset_build_status()
+ self.reset_issues()
+
+ def update_progress_bar(self, title, fraction, status=None):
+ self.progress_bar.update(fraction)
+ self.progress_bar.set_title(title)
+ self.progress_bar.set_rcstyle(status)
+
+ def back_button_clicked_cb(self, button):
+ self.builder.show_configuration()
+
+ def new_image_button_clicked_cb(self, button):
+ self.builder.reset()
+
+ def show_back_button(self):
+ self.back_button.show()
+
+ def stop_button_clicked_cb(self, button):
+ self.builder.stop_build()
+
+ def hide_stop_button(self):
+ self.stop_button.set_sensitive(False)
+ self.stop_button.hide()
+
+ def scroll_to_present_row(self, model, path, iter, v_adj, treeview):
+ if treeview and v_adj:
+ if path[0] > self.endpath[0]: # check whether the event is a new row being appended
+ self.endpath = path
+ # check whether the gtk.adjustment position is at the end boundary
+ if (v_adj.upper <= v_adj.page_size) or (v_adj.value == v_adj.upper - v_adj.page_size):
+ treeview.scroll_to_cell(path)
+
+ def show_configurations(self, configurations, params):
+ self.config_tv.show(configurations, params)
+
+ def failure_primary_action_button_clicked_cb(self, button, action):
+ if "Edit recipes" in action:
+ self.builder.show_recipes()
+ elif "Edit packages" in action:
+ self.builder.show_packages()
+ elif "Edit image" in action:
+ self.builder.show_configuration()
+
+ def restart_build_button_clicked_cb(self, button):
+ self.builder.just_bake()
+
+ def stop_primary_action_button_clicked_cb(self, button, action):
+ if "recipes" in action:
+ self.builder.show_recipes()
+ elif "packages" in action:
+ self.builder.show_packages()
+ elif "image" in action:
+ self.builder.show_configuration()
+
+ def open_log_button_clicked_cb(self, button, log_file):
+ if log_file:
+ log_file = "file:///" + log_file
+ gtk.show_uri(screen=button.get_screen(), uri=log_file, timestamp=0)
+
+ def failure_activate_file_bug_link_cb(self, button):
+ button.child.emit('activate-link', "http://bugzilla.yoctoproject.org")
diff --git a/bitbake/lib/bb/ui/crumbs/builder.py b/bitbake/lib/bb/ui/crumbs/builder.py
new file mode 100755
index 0000000..dcc4104
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/builder.py
@@ -0,0 +1,1475 @@
+#!/usr/bin/env python
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011-2012 Intel Corporation
+#
+# Authored by Joshua Lock <josh@linux.intel.com>
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import glib
+import gtk, gobject
+import copy
+import os
+import subprocess
+import shlex
+import re
+import logging
+import sys
+import signal
+import time
+from bb.ui.crumbs.imageconfigurationpage import ImageConfigurationPage
+from bb.ui.crumbs.recipeselectionpage import RecipeSelectionPage
+from bb.ui.crumbs.packageselectionpage import PackageSelectionPage
+from bb.ui.crumbs.builddetailspage import BuildDetailsPage
+from bb.ui.crumbs.imagedetailspage import ImageDetailsPage
+from bb.ui.crumbs.sanitycheckpage import SanityCheckPage
+from bb.ui.crumbs.hobwidget import hwc, HobButton, HobAltButton
+from bb.ui.crumbs.persistenttooltip import PersistentTooltip
+import bb.ui.crumbs.utils
+from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
+from bb.ui.crumbs.hig.simplesettingsdialog import SimpleSettingsDialog
+from bb.ui.crumbs.hig.advancedsettingsdialog import AdvancedSettingsDialog
+from bb.ui.crumbs.hig.deployimagedialog import DeployImageDialog
+from bb.ui.crumbs.hig.layerselectiondialog import LayerSelectionDialog
+from bb.ui.crumbs.hig.imageselectiondialog import ImageSelectionDialog
+from bb.ui.crumbs.hig.parsingwarningsdialog import ParsingWarningsDialog
+from bb.ui.crumbs.hig.propertydialog import PropertyDialog
+
+hobVer = 20120808
+
+class Configuration:
+ '''Represents the build configuration data.'''
+
+ @classmethod
+ def parse_proxy_string(cls, proxy):
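+ # Illustrative example: "http://user:pass@proxy.example.com:8080" parses to
+ # ("http", "user", "pass", "proxy.example.com", "8080"); a string the
+ # pattern cannot match yields (None, None, None, "", "").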
+ pattern = "^\s*((http|https|ftp|socks|cvs)://)?((\S+):(\S+)@)?([^\s:]+)(:(\d+))?/?"
+ match = re.search(pattern, proxy)
+ if match:
+ return match.group(2), match.group(4), match.group(5), match.group(6), match.group(8)
+ else:
+ return None, None, None, "", ""
+
+ @classmethod
+ def make_host_string(cls, prot, user, passwd, host, default_prot=""):
+ if host == None or host == "":
+ return ""
+
+ passwd = passwd or ""
+
+ if user != None and user != "":
+ if prot == None or prot == "":
+ prot = default_prot
+ return prot + "://" + user + ":" + passwd + "@" + host
+ else:
+ if prot == None or prot == "":
+ return host
+ else:
+ return prot + "://" + host
+
+ @classmethod
+ def make_port_string(cls, port):
+ port = port or ""
+ return port
+
+ @classmethod
+ def make_proxy_string(cls, prot, user, passwd, host, port, default_prot=""):
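+ # For example (illustrative values), ("http", "user", "pass",
+ # "proxy.example.com", "8080") combines back into
+ # "http://user:pass@proxy.example.com:8080".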
+ if host == None or host == "":# or port == None or port == "":
+ return ""
+
+ return Configuration.make_host_string(prot, user, passwd, host, default_prot) + (":" + Configuration.make_port_string(port) if port else "")
+
+ def __init__(self):
+ self.curr_mach = ""
+ self.selected_image = None
+ # settings
+ self.curr_distro = ""
+ self.dldir = self.sstatedir = self.sstatemirror = ""
+ self.pmake = self.bbthread = 0
+ self.curr_package_format = ""
+ self.image_rootfs_size = self.image_extra_size = 0
+ self.image_overhead_factor = 1
+ self.incompat_license = ""
+ self.curr_sdk_machine = ""
+ self.conf_version = self.lconf_version = ""
+ self.extra_setting = {}
+ self.toolchain_build = False
+ self.image_fstypes = ""
+ self.image_size = None
+ self.image_packages = []
+ # bblayers.conf
+ self.layers = []
+ # image/recipes/packages
+ self.clear_selection()
+
+ self.user_selected_packages = []
+
+ self.default_task = "build"
+
+ # proxy settings
+ self.enable_proxy = None
+ self.same_proxy = False
+ self.proxies = {
+ "http" : [None, None, None, "", ""], # protocol : [prot, user, passwd, host, port]
+ "https" : [None, None, None, "", ""],
+ "ftp" : [None, None, None, "", ""],
+ "socks" : [None, None, None, "", ""],
+ "cvs" : [None, None, None, "", ""],
+ }
+
+ def clear_selection(self):
+ self.selected_recipes = []
+ self.selected_packages = []
+ self.initial_selected_image = None
+ self.initial_selected_packages = []
+ self.initial_user_selected_packages = []
+
+ def split_proxy(self, protocol, proxy):
+ entry = []
+ prot, user, passwd, host, port = Configuration.parse_proxy_string(proxy)
+ entry.append(prot)
+ entry.append(user)
+ entry.append(passwd)
+ entry.append(host)
+ entry.append(port)
+ self.proxies[protocol] = entry
+
+ def combine_proxy(self, protocol):
+ entry = self.proxies[protocol]
+ return Configuration.make_proxy_string(entry[0], entry[1], entry[2], entry[3], entry[4], protocol)
+
+ def combine_host_only(self, protocol):
+ entry = self.proxies[protocol]
+ return Configuration.make_host_string(entry[0], entry[1], entry[2], entry[3], protocol)
+
+ def combine_port_only(self, protocol):
+ entry = self.proxies[protocol]
+ return Configuration.make_port_string(entry[4])
+
+ def update(self, params):
+ # settings
+ self.curr_distro = params["distro"]
+ self.dldir = params["dldir"]
+ self.sstatedir = params["sstatedir"]
+ self.sstatemirror = params["sstatemirror"]
+ self.pmake = int(params["pmake"].split()[1])
+ self.bbthread = params["bbthread"]
+ self.curr_package_format = " ".join(params["pclass"].split("package_")).strip()
+ self.image_rootfs_size = params["image_rootfs_size"]
+ self.image_extra_size = params["image_extra_size"]
+ self.image_overhead_factor = params['image_overhead_factor']
+ self.incompat_license = params["incompat_license"]
+ self.curr_sdk_machine = params["sdk_machine"]
+ self.conf_version = params["conf_version"]
+ self.lconf_version = params["lconf_version"]
+ self.image_fstypes = params["image_fstypes"]
+ # self.extra_setting/self.toolchain_build
+ # bblayers.conf
+ self.layers = params["layer"].split()
+ self.layers_non_removable = params["layers_non_removable"].split()
+ self.default_task = params["default_task"]
+
+ # proxy settings
+ self.enable_proxy = params["http_proxy"] != "" or params["https_proxy"] != "" \
+ or params["ftp_proxy"] != "" or params["socks_proxy"] != "" \
+ or params["cvs_proxy_host"] != "" or params["cvs_proxy_port"] != ""
+ self.split_proxy("http", params["http_proxy"])
+ self.split_proxy("https", params["https_proxy"])
+ self.split_proxy("ftp", params["ftp_proxy"])
+ self.split_proxy("socks", params["socks_proxy"])
+ self.split_proxy("cvs", params["cvs_proxy_host"] + ":" + params["cvs_proxy_port"])
+
+ def save(self, handler, defaults=False):
+ # bblayers.conf
+ handler.set_var_in_file("BBLAYERS", self.layers, "bblayers.conf")
+ # local.conf
+ if not defaults:
+ handler.early_assign_var_in_file("MACHINE", self.curr_mach, "local.conf")
+ handler.set_var_in_file("DISTRO", self.curr_distro, "local.conf")
+ handler.set_var_in_file("DL_DIR", self.dldir, "local.conf")
+ handler.set_var_in_file("SSTATE_DIR", self.sstatedir, "local.conf")
+ sstate_mirror_list = self.sstatemirror.split("\\n ")
+ sstate_mirror_list_modified = []
+ for mirror in sstate_mirror_list:
+ if mirror != "":
+ mirror = mirror + "\\n"
+ sstate_mirror_list_modified.append(mirror)
+ handler.set_var_in_file("SSTATE_MIRRORS", sstate_mirror_list_modified, "local.conf")
+ handler.set_var_in_file("PARALLEL_MAKE", "-j %s" % self.pmake, "local.conf")
+ handler.set_var_in_file("BB_NUMBER_THREADS", self.bbthread, "local.conf")
+ handler.set_var_in_file("PACKAGE_CLASSES", " ".join(["package_" + i for i in self.curr_package_format.split()]), "local.conf")
+ handler.set_var_in_file("IMAGE_ROOTFS_SIZE", self.image_rootfs_size, "local.conf")
+ handler.set_var_in_file("IMAGE_EXTRA_SPACE", self.image_extra_size, "local.conf")
+ handler.set_var_in_file("INCOMPATIBLE_LICENSE", self.incompat_license, "local.conf")
+ handler.set_var_in_file("SDKMACHINE", self.curr_sdk_machine, "local.conf")
+ handler.set_var_in_file("CONF_VERSION", self.conf_version, "local.conf")
+ handler.set_var_in_file("LCONF_VERSION", self.lconf_version, "bblayers.conf")
+ handler.set_extra_config(self.extra_setting)
+ handler.set_var_in_file("TOOLCHAIN_BUILD", self.toolchain_build, "local.conf")
+ handler.set_var_in_file("IMAGE_FSTYPES", self.image_fstypes, "local.conf")
+ if not defaults:
+ # image/recipes/packages
+ handler.set_var_in_file("__SELECTED_IMAGE__", self.selected_image, "local.conf")
+ handler.set_var_in_file("DEPENDS", self.selected_recipes, "local.conf")
+ handler.set_var_in_file("IMAGE_INSTALL", self.user_selected_packages, "local.conf")
+ # proxy
+ if self.enable_proxy == True:
+ handler.set_var_in_file("http_proxy", self.combine_proxy("http"), "local.conf")
+ handler.set_var_in_file("https_proxy", self.combine_proxy("https"), "local.conf")
+ handler.set_var_in_file("ftp_proxy", self.combine_proxy("ftp"), "local.conf")
+ handler.set_var_in_file("all_proxy", self.combine_proxy("socks"), "local.conf")
+ handler.set_var_in_file("CVS_PROXY_HOST", self.combine_host_only("cvs"), "local.conf")
+ handler.set_var_in_file("CVS_PROXY_PORT", self.combine_port_only("cvs"), "local.conf")
+ else:
+ handler.set_var_in_file("http_proxy", "", "local.conf")
+ handler.set_var_in_file("https_proxy", "", "local.conf")
+ handler.set_var_in_file("ftp_proxy", "", "local.conf")
+ handler.set_var_in_file("all_proxy", "", "local.conf")
+ handler.set_var_in_file("CVS_PROXY_HOST", "", "local.conf")
+ handler.set_var_in_file("CVS_PROXY_PORT", "", "local.conf")
+
+ def __str__(self):
+ s = "VERSION: '%s', BBLAYERS: '%s', MACHINE: '%s', DISTRO: '%s', DL_DIR: '%s'," % \
+ (hobVer, " ".join(self.layers), self.curr_mach, self.curr_distro, self.dldir )
+ s += "SSTATE_DIR: '%s', SSTATE_MIRROR: '%s', PARALLEL_MAKE: '-j %s', BB_NUMBER_THREADS: '%s', PACKAGE_CLASSES: '%s', " % \
+ (self.sstatedir, self.sstatemirror, self.pmake, self.bbthread, " ".join(["package_" + i for i in self.curr_package_format.split()]))
+ s += "IMAGE_ROOTFS_SIZE: '%s', IMAGE_EXTRA_SPACE: '%s', INCOMPATIBLE_LICENSE: '%s', SDKMACHINE: '%s', CONF_VERSION: '%s', " % \
+ (self.image_rootfs_size, self.image_extra_size, self.incompat_license, self.curr_sdk_machine, self.conf_version)
+ s += "LCONF_VERSION: '%s', EXTRA_SETTING: '%s', TOOLCHAIN_BUILD: '%s', IMAGE_FSTYPES: '%s', __SELECTED_IMAGE__: '%s', " % \
+ (self.lconf_version, self.extra_setting, self.toolchain_build, self.image_fstypes, self.selected_image)
+ s += "DEPENDS: '%s', IMAGE_INSTALL: '%s', enable_proxy: '%s', use_same_proxy: '%s', http_proxy: '%s', " % \
+ (self.selected_recipes, self.user_selected_packages, self.enable_proxy, self.same_proxy, self.combine_proxy("http"))
+ s += "https_proxy: '%s', ftp_proxy: '%s', all_proxy: '%s', CVS_PROXY_HOST: '%s', CVS_PROXY_PORT: '%s'" % \
+ (self.combine_proxy("https"), self.combine_proxy("ftp"), self.combine_proxy("socks"),
+ self.combine_host_only("cvs"), self.combine_port_only("cvs"))
+ return s
+
+class Parameters:
+ '''Represents other build parameters, such as the available machines.'''
+
+ def __init__(self):
+ # Variables
+ self.max_threads = 65535
+ self.core_base = ""
+ self.image_addr = ""
+ self.image_types = []
+ self.runnable_image_types = []
+ self.runnable_machine_patterns = []
+ self.deployable_image_types = []
+ self.tmpdir = ""
+
+ self.all_machines = []
+ self.all_package_formats = []
+ self.all_distros = []
+ self.all_sdk_machines = []
+ self.all_layers = []
+ self.image_names = []
+ self.image_white_pattern = ""
+ self.image_black_pattern = ""
+
+ # for build log to show
+ self.bb_version = ""
+ self.target_arch = ""
+ self.target_os = ""
+ self.distro_version = ""
+ self.tune_pkgarch = ""
+
+ def update(self, params):
+ self.max_threads = params["max_threads"]
+ self.core_base = params["core_base"]
+ self.image_addr = params["image_addr"]
+ self.image_types = params["image_types"].split()
+ self.runnable_image_types = params["runnable_image_types"].split()
+ self.runnable_machine_patterns = params["runnable_machine_patterns"].split()
+ self.deployable_image_types = params["deployable_image_types"].split()
+ self.tmpdir = params["tmpdir"]
+ self.image_white_pattern = params["image_white_pattern"]
+ self.image_black_pattern = params["image_black_pattern"]
+ self.kernel_image_type = params["kernel_image_type"]
+ # for build log to show
+ self.bb_version = params["bb_version"]
+ self.target_arch = params["target_arch"]
+ self.target_os = params["target_os"]
+ self.distro_version = params["distro_version"]
+ self.tune_pkgarch = params["tune_pkgarch"]
+
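+# Hob stores the values it manages under *_HOB variable names (e.g.
+# MACHINE_HOB, DISTRO_HOB). This filter runs while local.conf and
+# bblayers.conf are parsed and copies each such value over the variable it
+# shadows, so Hob's settings override what the files themselves set.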
+def hob_conf_filter(fn, data):
+ if fn.endswith("/local.conf"):
+ distro = data.getVar("DISTRO_HOB", False)
+ if distro:
+ if distro != "defaultsetup":
+ data.setVar("DISTRO", distro)
+ else:
+ data.delVar("DISTRO")
+
+ keys = ["MACHINE_HOB", "SDKMACHINE_HOB", "PACKAGE_CLASSES_HOB", \
+ "BB_NUMBER_THREADS_HOB", "PARALLEL_MAKE_HOB", "DL_DIR_HOB", \
+ "SSTATE_DIR_HOB", "SSTATE_MIRRORS_HOB", "INCOMPATIBLE_LICENSE_HOB"]
+ for key in keys:
+ var_hob = data.getVar(key, False)
+ if var_hob:
+ data.setVar(key.split("_HOB")[0], var_hob)
+ return
+
+ if fn.endswith("/bblayers.conf"):
+ layers = data.getVar("BBLAYERS_HOB", False)
+ if layers:
+ data.setVar("BBLAYERS", layers)
+ return
+
+class Builder(gtk.Window):
+
+ (INITIAL_CHECKS,
+ MACHINE_SELECTION,
+ RCPPKGINFO_POPULATING,
+ RCPPKGINFO_POPULATED,
+ BASEIMG_SELECTED,
+ RECIPE_SELECTION,
+ PACKAGE_GENERATING,
+ PACKAGE_GENERATED,
+ PACKAGE_SELECTION,
+ FAST_IMAGE_GENERATING,
+ IMAGE_GENERATING,
+ IMAGE_GENERATED,
+ MY_IMAGE_OPENED,
+ BACK,
+ END_NOOP) = range(15)
+
+ (SANITY_CHECK,
+ IMAGE_CONFIGURATION,
+ RECIPE_DETAILS,
+ BUILD_DETAILS,
+ PACKAGE_DETAILS,
+ IMAGE_DETAILS,
+ END_TAB) = range(7)
+
+ __step2page__ = {
+ INITIAL_CHECKS : SANITY_CHECK,
+ MACHINE_SELECTION : IMAGE_CONFIGURATION,
+ RCPPKGINFO_POPULATING : IMAGE_CONFIGURATION,
+ RCPPKGINFO_POPULATED : IMAGE_CONFIGURATION,
+ BASEIMG_SELECTED : IMAGE_CONFIGURATION,
+ RECIPE_SELECTION : RECIPE_DETAILS,
+ PACKAGE_GENERATING : BUILD_DETAILS,
+ PACKAGE_GENERATED : PACKAGE_DETAILS,
+ PACKAGE_SELECTION : PACKAGE_DETAILS,
+ FAST_IMAGE_GENERATING : BUILD_DETAILS,
+ IMAGE_GENERATING : BUILD_DETAILS,
+ IMAGE_GENERATED : IMAGE_DETAILS,
+ MY_IMAGE_OPENED : IMAGE_DETAILS,
+ END_NOOP : None,
+ }
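+ # Note that several workflow steps share a notebook page: package and
+ # fast-image generation both reuse BUILD_DETAILS, and the populating /
+ # populated / base-image-selected steps all stay on IMAGE_CONFIGURATION.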
+
+ SANITY_CHECK_MIN_DISPLAY_TIME = 5
+
+ def __init__(self, hobHandler, recipe_model, package_model):
+ super(Builder, self).__init__()
+
+ self.hob_image = "hob-image"
+
+ # handler
+ self.handler = hobHandler
+
+ # logger
+ self.logger = logging.getLogger("BitBake")
+ self.consolelog = None
+ self.current_logfile = None
+
+ # configuration and parameters
+ self.configuration = Configuration()
+ self.parameters = Parameters()
+
+ # build step
+ self.current_step = None
+ self.previous_step = None
+
+ self.stopping = False
+
+ # recipe model and package model
+ self.recipe_model = recipe_model
+ self.package_model = package_model
+
+ # Indicate whether user has customized the image
+ self.customized = False
+
+ # Indicate whether the UI is working
+ self.sensitive = True
+
+ # Indicate whether the sanity check ran
+ self.sanity_checked = False
+
+ # save parsing warnings
+ self.parsing_warnings = []
+
+ # create visual elements
+ self.create_visual_elements()
+
+ # connect the signals to functions
+ self.connect("delete-event", self.destroy_window_cb)
+ self.recipe_model.connect ("recipe-selection-changed", self.recipelist_changed_cb)
+ self.package_model.connect("package-selection-changed", self.packagelist_changed_cb)
+ self.handler.connect("config-updated", self.handler_config_updated_cb)
+ self.handler.connect("package-formats-updated", self.handler_package_formats_updated_cb)
+ self.handler.connect("parsing-started", self.handler_parsing_started_cb)
+ self.handler.connect("parsing", self.handler_parsing_cb)
+ self.handler.connect("parsing-completed", self.handler_parsing_completed_cb)
+ self.handler.build.connect("build-started", self.handler_build_started_cb)
+ self.handler.build.connect("build-succeeded", self.handler_build_succeeded_cb)
+ self.handler.build.connect("build-failed", self.handler_build_failed_cb)
+ self.handler.build.connect("build-aborted", self.handler_build_aborted_cb)
+ self.handler.build.connect("task-started", self.handler_task_started_cb)
+ self.handler.build.connect("disk-full", self.handler_disk_full_cb)
+ self.handler.build.connect("log-error", self.handler_build_failure_cb)
+ self.handler.build.connect("log-warning", self.handler_build_failure_cb)
+ self.handler.build.connect("log", self.handler_build_log_cb)
+ self.handler.build.connect("no-provider", self.handler_no_provider_cb)
+ self.handler.connect("generating-data", self.handler_generating_data_cb)
+ self.handler.connect("data-generated", self.handler_data_generated_cb)
+ self.handler.connect("command-succeeded", self.handler_command_succeeded_cb)
+ self.handler.connect("command-failed", self.handler_command_failed_cb)
+ self.handler.connect("parsing-warning", self.handler_parsing_warning_cb)
+ self.handler.connect("sanity-failed", self.handler_sanity_failed_cb)
+ self.handler.connect("recipe-populated", self.handler_recipe_populated_cb)
+ self.handler.connect("package-populated", self.handler_package_populated_cb)
+
+ self.handler.append_to_bbfiles("${TOPDIR}/recipes/images/custom/*.bb")
+ self.handler.append_to_bbfiles("${TOPDIR}/recipes/images/*.bb")
+ self.initiate_new_build_async()
+
+ signal.signal(signal.SIGINT, self.event_handle_SIGINT)
+
+ def create_visual_elements(self):
+ self.set_title("Hob")
+ self.set_icon_name("applications-development")
+ self.set_resizable(True)
+
+ try:
+ window_width = self.get_screen().get_width()
+ window_height = self.get_screen().get_height()
+ except AttributeError:
+ print "Please set DISPLAY variable before running Hob."
+ sys.exit(1)
+
+ if window_width >= hwc.MAIN_WIN_WIDTH:
+ window_width = hwc.MAIN_WIN_WIDTH
+ window_height = hwc.MAIN_WIN_HEIGHT
+ self.set_size_request(window_width, window_height)
+
+ self.vbox = gtk.VBox(False, 0)
+ self.vbox.set_border_width(0)
+ self.add(self.vbox)
+
+ # create pages
+ self.image_configuration_page = ImageConfigurationPage(self)
+ self.recipe_details_page = RecipeSelectionPage(self)
+ self.build_details_page = BuildDetailsPage(self)
+ self.package_details_page = PackageSelectionPage(self)
+ self.image_details_page = ImageDetailsPage(self)
+ self.sanity_check_page = SanityCheckPage(self)
+ self.display_sanity_check = False
+ self.sanity_check_post_func = False
+ self.had_network_error = False
+
+ self.nb = gtk.Notebook()
+ self.nb.set_show_tabs(False)
+ self.nb.insert_page(self.sanity_check_page, None, self.SANITY_CHECK)
+ self.nb.insert_page(self.image_configuration_page, None, self.IMAGE_CONFIGURATION)
+ self.nb.insert_page(self.recipe_details_page, None, self.RECIPE_DETAILS)
+ self.nb.insert_page(self.build_details_page, None, self.BUILD_DETAILS)
+ self.nb.insert_page(self.package_details_page, None, self.PACKAGE_DETAILS)
+ self.nb.insert_page(self.image_details_page, None, self.IMAGE_DETAILS)
+ self.vbox.pack_start(self.nb, expand=True, fill=True)
+
+ self.show_all()
+ self.nb.set_current_page(0)
+
+ def sanity_check_timeout(self):
+ # The minimum time for showing the 'sanity check' page has passed.
+ # If a 'sanity_check_post_func' was set in the meantime, execute it now.
+ self.display_sanity_check = False
+ if self.sanity_check_post_func:
+ temp = self.sanity_check_post_func
+ self.sanity_check_post_func = None
+ temp()
+ return False
+
+ def show_sanity_check_page(self):
+ # This window must stay on screen for at least 5 seconds, according to the design document
+ self.nb.set_current_page(self.SANITY_CHECK)
+ self.sanity_check_post_func = None
+ self.display_sanity_check = True
+ self.sanity_check_page.start()
+ gobject.timeout_add(self.SANITY_CHECK_MIN_DISPLAY_TIME * 1000, self.sanity_check_timeout)
+
+ def execute_after_sanity_check(self, func):
+ if not self.display_sanity_check:
+ func()
+ else:
+ self.sanity_check_post_func = func
+
+ def generate_configuration(self):
+ if not self.sanity_checked:
+ self.show_sanity_check_page()
+ self.handler.generate_configuration()
+
+ def initiate_new_build_async(self):
+ self.configuration.selected_image = None
+ self.switch_page(self.MACHINE_SELECTION)
+ self.handler.init_cooker()
+ self.handler.set_extra_inherit("image_types")
+ self.generate_configuration()
+
+ def update_config_async(self):
+ self.set_user_config()
+ self.generate_configuration()
+ self.switch_page(self.MACHINE_SELECTION)
+
+ def sanity_check(self):
+ self.handler.trigger_sanity_check()
+
+ def populate_recipe_package_info_async(self):
+ self.switch_page(self.RCPPKGINFO_POPULATING)
+ # Parse recipes
+ self.set_user_config()
+ self.handler.generate_recipes()
+
+ def generate_packages_async(self, log = False):
+ self.switch_page(self.PACKAGE_GENERATING)
+ if log:
+ self.current_logfile = self.handler.get_logfile()
+ self.do_log(self.current_logfile)
+ # Build packages
+ _, all_recipes = self.recipe_model.get_selected_recipes()
+ self.set_user_config()
+ self.handler.reset_build()
+ self.handler.generate_packages(all_recipes, self.configuration.default_task)
+
+ def restore_initial_selected_packages(self):
+ self.package_model.set_selected_packages(self.configuration.initial_user_selected_packages, True)
+ self.package_model.set_selected_packages(self.configuration.initial_selected_packages)
+ for package in self.configuration.selected_packages:
+ if package not in self.configuration.initial_selected_packages:
+ self.package_model.exclude_item(self.package_model.find_path_for_item(package))
+
+ def fast_generate_image_async(self, log = False):
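+ # A "fast" image build first bakes the selected packages; when that command
+ # succeeds, handler_command_succeeded_cb() notices the FAST_IMAGE_GENERATING
+ # step and continues with generate_image_async(True) to assemble the image.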
+ self.switch_page(self.FAST_IMAGE_GENERATING)
+ if log:
+ self.current_logfile = self.handler.get_logfile()
+ self.do_log(self.current_logfile)
+ # Build packages
+ _, all_recipes = self.recipe_model.get_selected_recipes()
+ self.set_user_config()
+ self.handler.reset_build()
+ self.handler.generate_packages(all_recipes, self.configuration.default_task)
+
+ def generate_image_async(self, cont = False):
+ self.switch_page(self.IMAGE_GENERATING)
+ self.handler.reset_build()
+ if not cont:
+ self.current_logfile = self.handler.get_logfile()
+ self.do_log(self.current_logfile)
+ # Build image
+ self.set_user_config()
+ toolchain_packages = []
+ base_image = None
+ if self.configuration.toolchain_build:
+ toolchain_packages = self.package_model.get_selected_packages_toolchain()
+ if self.configuration.selected_image == self.recipe_model.__custom_image__:
+ packages = self.package_model.get_selected_packages()
+ image = self.hob_image
+ base_image = self.configuration.initial_selected_image
+ else:
+ packages = []
+ image = self.configuration.selected_image
+ self.handler.generate_image(image,
+ base_image,
+ packages,
+ toolchain_packages,
+ self.configuration.default_task)
+
+ def generate_new_image(self, image, description):
+ base_image = self.configuration.initial_selected_image
+ if base_image == self.recipe_model.__custom_image__:
+ base_image = None
+ packages = self.package_model.get_selected_packages()
+ self.handler.generate_new_image(image, base_image, packages, description)
+
+ def ensure_dir(self, directory):
+ self.handler.ensure_dir(directory)
+
+ def get_parameters_sync(self):
+ return self.handler.get_parameters()
+
+ def request_package_info_async(self):
+ self.handler.request_package_info()
+
+ def cancel_build_sync(self, force=False):
+ self.handler.cancel_build(force)
+
+ def cancel_parse_sync(self):
+ self.handler.cancel_parse()
+
+ def switch_page(self, next_step):
+ # Main Workflow (Business Logic)
+ self.nb.set_current_page(self.__step2page__[next_step])
+
+ if next_step == self.MACHINE_SELECTION: # init step
+ self.image_configuration_page.show_machine()
+
+ elif next_step == self.RCPPKGINFO_POPULATING:
+ # MACHINE CHANGED action or SETTINGS CHANGED
+ # show the progress bar
+ self.image_configuration_page.show_info_populating()
+
+ elif next_step == self.RCPPKGINFO_POPULATED:
+ self.image_configuration_page.show_info_populated()
+
+ elif next_step == self.BASEIMG_SELECTED:
+ self.image_configuration_page.show_baseimg_selected()
+
+ elif next_step == self.RECIPE_SELECTION:
+ if self.recipe_model.get_selected_image() == self.recipe_model.__custom_image__:
+ self.recipe_details_page.set_recipe_curr_tab(self.recipe_details_page.ALL)
+ else:
+ self.recipe_details_page.set_recipe_curr_tab(self.recipe_details_page.INCLUDED)
+
+ elif next_step == self.PACKAGE_SELECTION:
+ self.configuration.initial_selected_packages = self.configuration.selected_packages
+ self.configuration.initial_user_selected_packages = self.configuration.user_selected_packages
+ self.package_details_page.set_title("Edit packages")
+ if self.recipe_model.get_selected_image() == self.recipe_model.__custom_image__:
+ self.package_details_page.set_packages_curr_tab(self.package_details_page.ALL)
+ else:
+ self.package_details_page.set_packages_curr_tab(self.package_details_page.INCLUDED)
+ self.package_details_page.show_page(self.current_logfile)
+
+
+ elif next_step == self.PACKAGE_GENERATING or next_step == self.FAST_IMAGE_GENERATING:
+ # both PACKAGE_GENERATING and FAST_IMAGE_GENERATING share the same page
+ self.build_details_page.show_page(next_step)
+
+ elif next_step == self.PACKAGE_GENERATED:
+ self.package_details_page.set_title("Step 2 of 2: Edit packages")
+ if self.recipe_model.get_selected_image() == self.recipe_model.__custom_image__:
+ self.package_details_page.set_packages_curr_tab(self.package_details_page.ALL)
+ else:
+ self.package_details_page.set_packages_curr_tab(self.package_details_page.INCLUDED)
+ self.package_details_page.show_page(self.current_logfile)
+
+ elif next_step == self.IMAGE_GENERATING:
+ # after packages are generated, selected_packages need to
+ # be updated in package_model per selected_image in recipe_model
+ self.build_details_page.show_page(next_step)
+
+ elif next_step == self.IMAGE_GENERATED:
+ self.image_details_page.show_page(next_step)
+
+ elif next_step == self.MY_IMAGE_OPENED:
+ self.image_details_page.show_page(next_step)
+
+ self.previous_step = self.current_step
+ self.current_step = next_step
+
+ def set_user_config_proxies(self):
+ if self.configuration.enable_proxy == True:
+ self.handler.set_http_proxy(self.configuration.combine_proxy("http"))
+ self.handler.set_https_proxy(self.configuration.combine_proxy("https"))
+ self.handler.set_ftp_proxy(self.configuration.combine_proxy("ftp"))
+ self.handler.set_socks_proxy(self.configuration.combine_proxy("socks"))
+ self.handler.set_cvs_proxy(self.configuration.combine_host_only("cvs"), self.configuration.combine_port_only("cvs"))
+ elif self.configuration.enable_proxy == False:
+ self.handler.set_http_proxy("")
+ self.handler.set_https_proxy("")
+ self.handler.set_ftp_proxy("")
+ self.handler.set_socks_proxy("")
+ self.handler.set_cvs_proxy("", "")
+
+ def set_user_config_extra(self):
+ self.handler.set_rootfs_size(self.configuration.image_rootfs_size)
+ self.handler.set_extra_size(self.configuration.image_extra_size)
+ self.handler.set_incompatible_license(self.configuration.incompat_license)
+ self.handler.set_sdk_machine(self.configuration.curr_sdk_machine)
+ self.handler.set_image_fstypes(self.configuration.image_fstypes)
+ self.handler.set_extra_config(self.configuration.extra_setting)
+ self.handler.set_extra_inherit("packageinfo image_types")
+ self.set_user_config_proxies()
+
+ def set_user_config(self):
+ # set bb layers
+ self.handler.set_bblayers(self.configuration.layers)
+ # set local configuration
+ self.handler.set_machine(self.configuration.curr_mach)
+ self.handler.set_package_format(self.configuration.curr_package_format)
+ self.handler.set_distro(self.configuration.curr_distro)
+ self.handler.set_dl_dir(self.configuration.dldir)
+ self.handler.set_sstate_dir(self.configuration.sstatedir)
+ self.handler.set_sstate_mirrors(self.configuration.sstatemirror)
+ self.handler.set_pmake(self.configuration.pmake)
+ self.handler.set_bbthreads(self.configuration.bbthread)
+ self.set_user_config_extra()
+
+ def update_recipe_model(self, selected_image, selected_recipes):
+ self.recipe_model.set_selected_image(selected_image)
+ self.recipe_model.set_selected_recipes(selected_recipes)
+
+ def update_package_model(self, selected_packages, user_selected_packages=None):
+ if user_selected_packages:
+ left = self.package_model.set_selected_packages(user_selected_packages, True)
+ self.configuration.user_selected_packages += left
+ left = self.package_model.set_selected_packages(selected_packages)
+ self.configuration.selected_packages += left
+
+ def update_configuration_parameters(self, params):
+ if params:
+ self.configuration.update(params)
+ self.parameters.update(params)
+
+ def set_base_image(self):
+ self.configuration.initial_selected_image = self.configuration.selected_image
+ if self.configuration.selected_image != self.recipe_model.__custom_image__:
+ self.hob_image = self.configuration.selected_image + "-edited"
+
+ def reset(self):
+ self.configuration.curr_mach = ""
+ self.configuration.clear_selection()
+ self.image_configuration_page.switch_machine_combo()
+ self.switch_page(self.MACHINE_SELECTION)
+
+ # Callback Functions
+ def handler_config_updated_cb(self, handler, which, values):
+ if which == "distro":
+ self.parameters.all_distros = values
+ elif which == "machine":
+ self.parameters.all_machines = values
+ self.image_configuration_page.update_machine_combo()
+ elif which == "machine-sdk":
+ self.parameters.all_sdk_machines = values
+
+ def handler_package_formats_updated_cb(self, handler, formats):
+ self.parameters.all_package_formats = formats
+
+ def switch_to_image_configuration_helper(self):
+ self.sanity_check_page.stop()
+ self.switch_page(self.IMAGE_CONFIGURATION)
+ self.image_configuration_page.switch_machine_combo()
+
+ def show_network_error_dialog_helper(self):
+ self.sanity_check_page.stop()
+ self.show_network_error_dialog()
+
+ def handler_command_succeeded_cb(self, handler, initcmd):
+ if initcmd == self.handler.GENERATE_CONFIGURATION:
+ if not self.configuration.curr_mach:
+ self.configuration.curr_mach = self.handler.runCommand(["getVariable", "HOB_MACHINE"]) or ""
+ self.update_configuration_parameters(self.get_parameters_sync())
+ if not self.sanity_checked:
+ self.sanity_check()
+ self.sanity_checked = True
+ elif initcmd == self.handler.SANITY_CHECK:
+ if self.had_network_error:
+ self.had_network_error = False
+ self.execute_after_sanity_check(self.show_network_error_dialog_helper)
+ else:
+ # Switch to the 'image configuration' page now, but we might need
+ # to wait for the minimum display time of the sanity check page
+ self.execute_after_sanity_check(self.switch_to_image_configuration_helper)
+ elif initcmd in [self.handler.GENERATE_RECIPES,
+ self.handler.GENERATE_PACKAGES,
+ self.handler.GENERATE_IMAGE]:
+ self.update_configuration_parameters(self.get_parameters_sync())
+ self.request_package_info_async()
+ elif initcmd == self.handler.POPULATE_PACKAGEINFO:
+ if self.current_step == self.RCPPKGINFO_POPULATING:
+ self.switch_page(self.RCPPKGINFO_POPULATED)
+ self.rcppkglist_populated()
+ return
+
+ self.rcppkglist_populated()
+ if self.current_step == self.FAST_IMAGE_GENERATING:
+ self.generate_image_async(True)
+
+ def show_error_dialog(self, msg):
+ lbl = "<b>Hob found an error</b>"
+ dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_ERROR, msg)
+ button = dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ response = dialog.run()
+ dialog.destroy()
+
+ def show_warning_dialog(self):
+ dialog = ParsingWarningsDialog(title = "View warnings",
+ warnings = self.parsing_warnings,
+ parent = None,
+ flags = gtk.DIALOG_DESTROY_WITH_PARENT
+ | gtk.DIALOG_NO_SEPARATOR)
+ response = dialog.run()
+ dialog.destroy()
+
+ def show_network_error_dialog(self):
+ lbl = "<b>Hob cannot connect to the network</b>"
+ msg = msg + "Please check your network connection. If you are using a proxy server, please make sure it is configured correctly."
+ dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_ERROR, msg)
+ button = dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ button = dialog.add_button("Proxy settings", gtk.RESPONSE_CANCEL)
+ HobButton.style_button(button)
+ res = dialog.run()
+ dialog.destroy()
+ if res == gtk.RESPONSE_CANCEL:
+ res, settings_changed = self.show_simple_settings_dialog(SimpleSettingsDialog.PROXIES_PAGE_ID)
+ if not res:
+ return
+ if settings_changed:
+ self.reparse_post_adv_settings()
+
+ def handler_command_failed_cb(self, handler, msg):
+ if msg:
+ self.show_error_dialog(msg)
+ self.reset()
+
+ def handler_parsing_warning_cb(self, handler, warn_msg):
+ self.parsing_warnings.append(warn_msg)
+
+ def handler_sanity_failed_cb(self, handler, msg, network_error):
+ self.reset()
+ if network_error:
+ # Mark this in an internal field. The "network error" dialog will be
+ # shown later, when the SanityCheckPassed event is handled
+ # (as sent by sanity.bbclass)
+ self.had_network_error = True
+ else:
+ msg = msg.replace("your local.conf", "Settings")
+ self.show_error_dialog(msg)
+ self.reset()
+
+ def window_sensitive(self, sensitive):
+ self.image_configuration_page.machine_combo.set_sensitive(sensitive)
+ self.image_configuration_page.machine_combo.child.set_sensitive(sensitive)
+ self.image_configuration_page.image_combo.set_sensitive(sensitive)
+ self.image_configuration_page.image_combo.child.set_sensitive(sensitive)
+ self.image_configuration_page.layer_button.set_sensitive(sensitive)
+ self.image_configuration_page.layer_info_icon.set_sensitive(sensitive)
+ self.image_configuration_page.toolbar.set_sensitive(sensitive)
+ self.image_configuration_page.view_adv_configuration_button.set_sensitive(sensitive)
+ self.image_configuration_page.config_build_button.set_sensitive(sensitive)
+
+ self.recipe_details_page.set_sensitive(sensitive)
+ self.package_details_page.set_sensitive(sensitive)
+ self.build_details_page.set_sensitive(sensitive)
+ self.image_details_page.set_sensitive(sensitive)
+
+ if sensitive:
+ self.window.set_cursor(None)
+ else:
+ self.window.set_cursor(gtk.gdk.Cursor(gtk.gdk.WATCH))
+ self.sensitive = sensitive
+
+
+ def handler_generating_data_cb(self, handler):
+ self.window_sensitive(False)
+
+ def handler_data_generated_cb(self, handler):
+ self.window_sensitive(True)
+
+ def rcppkglist_populated(self):
+ selected_image = self.configuration.selected_image
+ selected_recipes = self.configuration.selected_recipes[:]
+ selected_packages = self.configuration.selected_packages[:]
+ user_selected_packages = self.configuration.user_selected_packages[:]
+
+ self.image_configuration_page.update_image_combo(self.recipe_model, selected_image)
+ self.image_configuration_page.update_image_desc()
+ self.update_recipe_model(selected_image, selected_recipes)
+ self.update_package_model(selected_packages, user_selected_packages)
+
+ def recipelist_changed_cb(self, recipe_model):
+ self.recipe_details_page.refresh_selection()
+
+ def packagelist_changed_cb(self, package_model):
+ self.package_details_page.refresh_selection()
+
+ def handler_recipe_populated_cb(self, handler):
+ self.image_configuration_page.update_progress_bar("Populating recipes", 0.99)
+
+ def handler_package_populated_cb(self, handler):
+ self.image_configuration_page.update_progress_bar("Populating packages", 1.0)
+
+ def handler_parsing_started_cb(self, handler, message):
+ if self.current_step != self.RCPPKGINFO_POPULATING:
+ return
+
+ fraction = 0
+ if message["eventname"] == "TreeDataPreparationStarted":
+ fraction = 0.6 + fraction
+ self.image_configuration_page.stop_button.set_sensitive(False)
+ self.image_configuration_page.update_progress_bar("Generating dependency tree", fraction)
+ else:
+ self.image_configuration_page.stop_button.set_sensitive(True)
+ self.image_configuration_page.update_progress_bar(message["title"], fraction)
+
+ def handler_parsing_cb(self, handler, message):
+ if self.current_step != self.RCPPKGINFO_POPULATING:
+ return
+
+ fraction = message["current"] * 1.0/message["total"]
+ if message["eventname"] == "TreeDataPreparationProgress":
+ fraction = 0.6 + 0.38 * fraction
+ self.image_configuration_page.update_progress_bar("Generating dependency tree", fraction)
+ else:
+ fraction = 0.6 * fraction
+ self.image_configuration_page.update_progress_bar(message["title"], fraction)
+
+ def handler_parsing_completed_cb(self, handler, message):
+ if self.current_step != self.RCPPKGINFO_POPULATING:
+ return
+
+ if message["eventname"] == "TreeDataPreparationCompleted":
+ fraction = 0.98
+ else:
+ fraction = 0.6
+ self.image_configuration_page.update_progress_bar("Generating dependency tree", fraction)
+
+ def handler_build_started_cb(self, running_build):
+ if self.current_step == self.FAST_IMAGE_GENERATING:
+ fraction = 0
+ elif self.current_step == self.IMAGE_GENERATING:
+ if self.previous_step == self.FAST_IMAGE_GENERATING:
+ fraction = 0.9
+ else:
+ fraction = 0
+ elif self.current_step == self.PACKAGE_GENERATING:
+ fraction = 0
+ self.build_details_page.update_progress_bar("Build Started: ", fraction)
+ self.build_details_page.show_configurations(self.configuration, self.parameters)
+
+ def build_succeeded(self):
+ if self.current_step == self.FAST_IMAGE_GENERATING:
+ fraction = 0.9
+ elif self.current_step == self.IMAGE_GENERATING:
+ fraction = 1.0
+ version = ""
+ self.parameters.image_names = []
+ selected_image = self.recipe_model.get_selected_image()
+ if selected_image == self.recipe_model.__custom_image__:
+ if self.configuration.initial_selected_image != selected_image:
+ version = self.recipe_model.get_custom_image_version()
+ linkname = self.hob_image + version + "-" + self.configuration.curr_mach
+ else:
+ linkname = selected_image + '-' + self.configuration.curr_mach
+ image_extension = self.get_image_extension()
+ for image_type in self.parameters.image_types:
+ if image_type in image_extension:
+ real_types = image_extension[image_type]
+ else:
+ real_types = [image_type]
+ for real_image_type in real_types:
+ linkpath = self.parameters.image_addr + '/' + linkname + '.' + real_image_type
+ if os.path.exists(linkpath):
+ self.parameters.image_names.append(os.readlink(linkpath))
+ elif self.current_step == self.PACKAGE_GENERATING:
+ fraction = 1.0
+ self.build_details_page.update_progress_bar("Build Completed: ", fraction)
+ self.handler.build_succeeded_async()
+ self.stopping = False
+
+ if self.current_step == self.PACKAGE_GENERATING:
+ self.switch_page(self.PACKAGE_GENERATED)
+ elif self.current_step == self.IMAGE_GENERATING:
+ self.switch_page(self.IMAGE_GENERATED)
+
+ def build_failed(self):
+ if self.stopping:
+ status = "stop"
+ message = "Build stopped: "
+ fraction = self.build_details_page.progress_bar.get_fraction()
+ stop_to_next_edit = ""
+ if self.current_step == self.FAST_IMAGE_GENERATING:
+ stop_to_next_edit = "image configuration"
+ elif self.current_step == self.IMAGE_GENERATING:
+ if self.previous_step == self.FAST_IMAGE_GENERATING:
+ stop_to_next_edit = "image configuration"
+ else:
+ stop_to_next_edit = "packages"
+ elif self.current_step == self.PACKAGE_GENERATING:
+ stop_to_next_edit = "recipes"
+ button = self.build_details_page.show_stop_page(stop_to_next_edit.split(' ')[0])
+ self.set_default(button)
+ else:
+ fail_to_next_edit = ""
+ if self.current_step == self.FAST_IMAGE_GENERATING:
+ fail_to_next_edit = "image configuration"
+ fraction = 0.9
+ elif self.current_step == self.IMAGE_GENERATING:
+ if self.previous_step == self.FAST_IMAGE_GENERATING:
+ fail_to_next_edit = "image configuration"
+ else:
+ fail_to_next_edit = "packages"
+ fraction = 1.0
+ elif self.current_step == self.PACKAGE_GENERATING:
+ fail_to_next_edit = "recipes"
+ fraction = 1.0
+ self.build_details_page.show_fail_page(fail_to_next_edit.split(' ')[0])
+ status = "fail"
+ message = "Build failed: "
+ self.build_details_page.update_progress_bar(message, fraction, status)
+ self.build_details_page.show_back_button()
+ self.build_details_page.hide_stop_button()
+ self.handler.build_failed_async()
+ self.stopping = False
+
+ def handler_build_succeeded_cb(self, running_build):
+ if not self.stopping:
+ self.build_succeeded()
+ else:
+ self.build_failed()
+
+
+ def handler_build_failed_cb(self, running_build):
+ self.build_failed()
+
+ def handler_build_aborted_cb(self, running_build):
+ self.build_failed()
+
+ def handler_no_provider_cb(self, running_build, msg):
+ dialog = CrumbsMessageDialog(self, glib.markup_escape_text(msg), gtk.MESSAGE_INFO)
+ button = dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ dialog.run()
+ dialog.destroy()
+ self.build_failed()
+
+ def handler_task_started_cb(self, running_build, message):
+ fraction = message["current"] * 1.0/message["total"]
+ title = "Build packages"
+ if self.current_step == self.FAST_IMAGE_GENERATING:
+ if message["eventname"] == "sceneQueueTaskStarted":
+ fraction = 0.27 * fraction
+ elif message["eventname"] == "runQueueTaskStarted":
+ fraction = 0.27 + 0.63 * fraction
+ elif self.current_step == self.IMAGE_GENERATING:
+ title = "Build image"
+ if self.previous_step == self.FAST_IMAGE_GENERATING:
+ if message["eventname"] == "sceneQueueTaskStarted":
+ fraction = 0.27 + 0.63 + 0.03 * fraction
+ elif message["eventname"] == "runQueueTaskStarted":
+ fraction = 0.27 + 0.63 + 0.03 + 0.07 * fraction
+ else:
+ if message["eventname"] == "sceneQueueTaskStarted":
+ fraction = 0.2 * fraction
+ elif message["eventname"] == "runQueueTaskStarted":
+ fraction = 0.2 + 0.8 * fraction
+ elif self.current_step == self.PACKAGE_GENERATING:
+ if message["eventname"] == "sceneQueueTaskStarted":
+ fraction = 0.2 * fraction
+ elif message["eventname"] == "runQueueTaskStarted":
+ fraction = 0.2 + 0.8 * fraction
+ self.build_details_page.update_progress_bar(title + ": ", fraction)
+ self.build_details_page.update_build_status(message["current"], message["total"], message["task"])
+
+ def handler_disk_full_cb(self, running_build):
+ self.disk_full = True
+
+ def handler_build_failure_cb(self, running_build):
+ self.build_details_page.show_issues()
+
+ def handler_build_log_cb(self, running_build, func, obj):
+ if hasattr(self.logger, func):
+ getattr(self.logger, func)(obj)
+
+ def destroy_window_cb(self, widget, event):
+ if not self.sensitive:
+ return True
+ elif self.handler.building:
+ self.stop_build()
+ return True
+ else:
+ gtk.main_quit()
+
+ def event_handle_SIGINT(self, signal, frame):
+ for w in gtk.window_list_toplevels():
+ if w.get_modal():
+ w.response(gtk.RESPONSE_DELETE_EVENT)
+ sys.exit(0)
+
+ def build_packages(self):
+ _, all_recipes = self.recipe_model.get_selected_recipes()
+ if not all_recipes:
+ lbl = "<b>No selections made</b>"
+ msg = "You have not made any selections"
+ msg = msg + " so there isn't anything to bake at this time."
+ dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_INFO, msg)
+ button = dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ dialog.run()
+ dialog.destroy()
+ return
+ self.generate_packages_async(True)
+
+ def build_image(self):
+ selected_packages = self.package_model.get_selected_packages()
+ if not selected_packages:
+ lbl = "<b>No selections made</b>"
+ msg = "You have not made any selections"
+ msg = msg + " so there isn't anything to bake at this time."
+ dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_INFO, msg)
+ button = dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ dialog.run()
+ dialog.destroy()
+ return
+ self.generate_image_async(True)
+
+ def just_bake(self):
+ selected_image = self.recipe_model.get_selected_image()
+ selected_packages = self.package_model.get_selected_packages() or []
+
+ # If no base image and no selected packages don't build anything
+ if not (selected_packages or selected_image != self.recipe_model.__custom_image__):
+ lbl = "<b>No selections made</b>"
+ msg = "You have not made any selections"
+ msg = msg + " so there isn't anything to bake at this time."
+ dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_INFO, msg)
+ button = dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ dialog.run()
+ dialog.destroy()
+ return
+
+ self.fast_generate_image_async(True)
+
+ def show_recipe_property_dialog(self, properties):
+ information = {}
+ dialog = PropertyDialog(title = properties["name"] +' '+ "properties",
+ parent = self,
+ information = properties,
+ flags = gtk.DIALOG_DESTROY_WITH_PARENT
+ | gtk.DIALOG_NO_SEPARATOR)
+
+ dialog.set_modal(False)
+
+ button = dialog.add_button("Close", gtk.RESPONSE_NO)
+ HobAltButton.style_button(button)
+ button.connect("clicked", lambda w: dialog.destroy())
+
+ dialog.run()
+
+ def show_packages_property_dialog(self, properties):
+ information = {}
+ dialog = PropertyDialog(title = properties["name"] +' '+ "properties",
+ parent = self,
+ information = properties,
+ flags = gtk.DIALOG_DESTROY_WITH_PARENT
+ | gtk.DIALOG_NO_SEPARATOR)
+
+ dialog.set_modal(False)
+
+ button = dialog.add_button("Close", gtk.RESPONSE_NO)
+ HobAltButton.style_button(button)
+ button.connect("clicked", lambda w: dialog.destroy())
+
+ dialog.run()
+
+ def show_layer_selection_dialog(self):
+ dialog = LayerSelectionDialog(title = "Layers",
+ layers = copy.deepcopy(self.configuration.layers),
+ layers_non_removable = copy.deepcopy(self.configuration.layers_non_removable),
+ all_layers = self.parameters.all_layers,
+ parent = self,
+ flags = gtk.DIALOG_MODAL
+ | gtk.DIALOG_DESTROY_WITH_PARENT
+ | gtk.DIALOG_NO_SEPARATOR)
+ button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
+ HobAltButton.style_button(button)
+ button = dialog.add_button("OK", gtk.RESPONSE_YES)
+ HobButton.style_button(button)
+ response = dialog.run()
+ if response == gtk.RESPONSE_YES:
+ self.configuration.layers = dialog.layers
+ # DO refresh layers
+ if dialog.layers_changed:
+ self.update_config_async()
+ dialog.destroy()
+
+ def get_image_extension(self):
+ image_extension = {}
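+ # IMAGE_EXTENSION_<type>, when defined, lists the real file extensions
+ # produced for an image type (for example a "live" image may yield both
+ # .hddimg and .iso files); callers fall back to the type name otherwise.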
+ for type in self.parameters.image_types:
+ ext = self.handler.runCommand(["getVariable", "IMAGE_EXTENSION_%s" % type])
+ if ext:
+ image_extension[type] = ext.split(' ')
+
+ return image_extension
+
+ def show_load_my_images_dialog(self):
+ image_extension = self.get_image_extension()
+ dialog = ImageSelectionDialog(self.parameters.image_addr, self.parameters.image_types,
+ "Open My Images", self,
+ gtk.FILE_CHOOSER_ACTION_SAVE, None,
+ image_extension)
+ button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
+ HobAltButton.style_button(button)
+ button = dialog.add_button("Open", gtk.RESPONSE_YES)
+ HobButton.style_button(button)
+ response = dialog.run()
+ if response == gtk.RESPONSE_YES:
+ if not dialog.image_names:
+ lbl = "<b>No selections made</b>"
+ msg = "You have not made any selections"
+ crumbs_dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_INFO, msg)
+ button = crumbs_dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ crumbs_dialog.run()
+ crumbs_dialog.destroy()
+ dialog.destroy()
+ return
+
+ self.parameters.image_addr = dialog.image_folder
+ self.parameters.image_names = dialog.image_names[:]
+ self.switch_page(self.MY_IMAGE_OPENED)
+
+ dialog.destroy()
+
+ def show_adv_settings_dialog(self, tab=None):
+ dialog = AdvancedSettingsDialog(title = "Advanced configuration",
+ configuration = copy.deepcopy(self.configuration),
+ all_image_types = self.parameters.image_types,
+ all_package_formats = self.parameters.all_package_formats,
+ all_distros = self.parameters.all_distros,
+ all_sdk_machines = self.parameters.all_sdk_machines,
+ max_threads = self.parameters.max_threads,
+ parent = self,
+ flags = gtk.DIALOG_MODAL
+ | gtk.DIALOG_DESTROY_WITH_PARENT
+ | gtk.DIALOG_NO_SEPARATOR)
+ button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
+ HobAltButton.style_button(button)
+ button = dialog.add_button("Save", gtk.RESPONSE_YES)
+ HobButton.style_button(button)
+ dialog.set_save_button(button)
+ response = dialog.run()
+ settings_changed = False
+ if response == gtk.RESPONSE_YES:
+ self.configuration = dialog.configuration
+ self.configuration.save(self.handler, True) # remember settings
+ settings_changed = dialog.settings_changed
+ dialog.destroy()
+ return response == gtk.RESPONSE_YES, settings_changed
+
+ def show_simple_settings_dialog(self, tab=None):
+ dialog = SimpleSettingsDialog(title = "Settings",
+ configuration = copy.deepcopy(self.configuration),
+ all_image_types = self.parameters.image_types,
+ all_package_formats = self.parameters.all_package_formats,
+ all_distros = self.parameters.all_distros,
+ all_sdk_machines = self.parameters.all_sdk_machines,
+ max_threads = self.parameters.max_threads,
+ parent = self,
+ flags = gtk.DIALOG_MODAL
+ | gtk.DIALOG_DESTROY_WITH_PARENT
+ | gtk.DIALOG_NO_SEPARATOR,
+ handler = self.handler)
+ button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
+ HobAltButton.style_button(button)
+ button = dialog.add_button("Save", gtk.RESPONSE_YES)
+ HobButton.style_button(button)
+ if tab:
+ dialog.switch_to_page(tab)
+ response = dialog.run()
+ settings_changed = False
+ if response == gtk.RESPONSE_YES:
+ self.configuration = dialog.configuration
+ self.configuration.save(self.handler, True) # remember settings
+ settings_changed = dialog.settings_changed
+ if dialog.proxy_settings_changed:
+ self.set_user_config_proxies()
+ elif dialog.proxy_test_ran:
+ # The user might have modified the proxies in the "Proxy"
+ # tab, which in turn changed the proxy settings inside bb.
+ # If "Cancel" was pressed, restore the previous proxy
+ # settings inside bb.
+ self.set_user_config_proxies()
+ dialog.destroy()
+ return response == gtk.RESPONSE_YES, settings_changed
+
+ def reparse_post_adv_settings(self):
+ if not self.configuration.curr_mach:
+ self.update_config_async()
+ else:
+ self.configuration.clear_selection()
+ # DO reparse recipes
+ self.populate_recipe_package_info_async()
+
+ def deploy_image(self, image_name):
+ if not image_name:
+ lbl = "<b>Please select an image to deploy.</b>"
+ dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_INFO)
+ button = dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ dialog.run()
+ dialog.destroy()
+ return
+
+ image_path = os.path.join(self.parameters.image_addr, image_name)
+ dialog = DeployImageDialog(title = "Usb Image Maker",
+ image_path = image_path,
+ parent = self,
+ flags = gtk.DIALOG_MODAL
+ | gtk.DIALOG_DESTROY_WITH_PARENT
+ | gtk.DIALOG_NO_SEPARATOR)
+ button = dialog.add_button("Close", gtk.RESPONSE_NO)
+ HobAltButton.style_button(button)
+ button = dialog.add_button("Make usb image", gtk.RESPONSE_YES)
+ HobButton.style_button(button)
+ response = dialog.run()
+ dialog.destroy()
+
+ def show_load_kernel_dialog(self):
+ dialog = gtk.FileChooserDialog("Load Kernel Files", self,
+ gtk.FILE_CHOOSER_ACTION_SAVE)
+ button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
+ HobAltButton.style_button(button)
+ button = dialog.add_button("Open", gtk.RESPONSE_YES)
+ HobButton.style_button(button)
+ filter = gtk.FileFilter()
+ filter.set_name("Kernel Files")
+ filter.add_pattern("*.bin")
+ dialog.add_filter(filter)
+
+ dialog.set_current_folder(self.parameters.image_addr)
+
+ response = dialog.run()
+ kernel_path = ""
+ if response == gtk.RESPONSE_YES:
+ kernel_path = dialog.get_filename()
+
+ dialog.destroy()
+
+ return kernel_path
+
+ def runqemu_image(self, image_name, kernel_name):
+ if not image_name or not kernel_name:
+ lbl = "<b>Please select %s to launch in QEMU.</b>" % ("a kernel" if image_name else "an image")
+ dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_INFO)
+ button = dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ dialog.run()
+ dialog.destroy()
+ return
+
+ kernel_path = os.path.join(self.parameters.image_addr, kernel_name)
+ image_path = os.path.join(self.parameters.image_addr, image_name)
+
+ source_env_path = os.path.join(self.parameters.core_base, "oe-init-build-env")
+ tmp_path = self.parameters.tmpdir
+ cmdline = bb.ui.crumbs.utils.which_terminal()
+ if os.path.exists(image_path) and os.path.exists(kernel_path) \
+ and os.path.exists(source_env_path) and os.path.exists(tmp_path) \
+ and cmdline:
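+ # Run runqemu inside the user's preferred terminal: the wrapped shell
+ # exports OE_TMPDIR, sources the build environment script and then calls
+ # "runqemu <kernel_path> <image_path>".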
+ cmdline += "\' bash -c \"export OE_TMPDIR=" + tmp_path + "; "
+ cmdline += "source " + source_env_path + " " + os.getcwd() + "; "
+ cmdline += "runqemu " + kernel_path + " " + image_path + "\"\'"
+ subprocess.Popen(shlex.split(cmdline))
+ else:
+ lbl = "<b>Path error</b>"
+ msg = "One of your paths is wrong,"
+ msg = msg + " please make sure the following paths exist:\n"
+ msg = msg + "image path:" + image_path + "\n"
+ msg = msg + "kernel path:" + kernel_path + "\n"
+ msg = msg + "source environment path:" + source_env_path + "\n"
+ msg = msg + "tmp path: " + tmp_path + "."
+ msg = msg + "You may be missing either xterm or vte for terminal services."
+ dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_ERROR, msg)
+ button = dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ dialog.run()
+ dialog.destroy()
+
+ def show_packages(self):
+ self.package_details_page.refresh_tables()
+ self.switch_page(self.PACKAGE_SELECTION)
+
+ def show_recipes(self):
+ self.switch_page(self.RECIPE_SELECTION)
+
+ def show_image_details(self):
+ self.switch_page(self.IMAGE_GENERATED)
+
+ def show_configuration(self):
+ self.switch_page(self.BASEIMG_SELECTED)
+
+ def stop_build(self):
+ if self.stopping:
+ lbl = "<b>Force Stop build?</b>"
+ msg = "You've already selected Stop once,"
+ msg = msg + " would you like to 'Force Stop' the build?\n\n"
+ msg = msg + "This will stop the build as quickly as possible but may"
+ msg = msg + " well leave your build directory in an unusable state"
+ msg = msg + " that requires manual steps to fix."
+ dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_WARNING, msg)
+ button = dialog.add_button("Cancel", gtk.RESPONSE_CANCEL)
+ HobAltButton.style_button(button)
+ button = dialog.add_button("Force Stop", gtk.RESPONSE_YES)
+ HobButton.style_button(button)
+ else:
+ lbl = "<b>Stop build?</b>"
+ msg = "Are you sure you want to stop this"
+ msg = msg + " build?\n\n'Stop' will stop the build as soon as all in"
+ msg = msg + " progress build tasks are finished. However if a"
+ msg = msg + " lengthy compilation phase is in progress this may take"
+ msg = msg + " some time.\n\n"
+ msg = msg + "'Force Stop' will stop the build as quickly as"
+ msg = msg + " possible but may well leave your build directory in an"
+ msg = msg + " unusable state that requires manual steps to fix."
+ dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_WARNING, msg)
+ button = dialog.add_button("Cancel", gtk.RESPONSE_CANCEL)
+ HobAltButton.style_button(button)
+ button = dialog.add_button("Force stop", gtk.RESPONSE_YES)
+ HobAltButton.style_button(button)
+ button = dialog.add_button("Stop", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ response = dialog.run()
+ dialog.destroy()
+ if response != gtk.RESPONSE_CANCEL:
+ self.stopping = True
+ if response == gtk.RESPONSE_OK:
+ self.build_details_page.progress_bar.set_stop_title("Stopping the build....")
+ self.build_details_page.progress_bar.set_rcstyle("stop")
+ self.cancel_build_sync()
+ elif response == gtk.RESPONSE_YES:
+ self.cancel_build_sync(True)
+
+ def do_log(self, consolelogfile = None):
+ if consolelogfile:
+ bb.utils.mkdirhier(os.path.dirname(consolelogfile))
+ if self.consolelog:
+ self.logger.removeHandler(self.consolelog)
+ self.consolelog = None
+ self.consolelog = logging.FileHandler(consolelogfile)
+ bb.msg.addDefaultlogFilter(self.consolelog)
+ format = bb.msg.BBLogFormatter("%(levelname)s: %(message)s")
+ self.consolelog.setFormatter(format)
+
+ self.logger.addHandler(self.consolelog)
+
+ def get_topdir(self):
+ return self.handler.get_topdir()
+
+ def wait(self, delay):
+ time_start = time.time()
+ time_end = time_start + delay
+ while time_end > time.time():
+ while gtk.events_pending():
+ gtk.main_iteration()
diff --git a/bitbake/lib/bb/ui/crumbs/buildmanager.py b/bitbake/lib/bb/ui/crumbs/buildmanager.py
new file mode 100644
index 0000000..e858d75
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/buildmanager.py
@@ -0,0 +1,455 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2008 Intel Corporation
+#
+# Authored by Rob Bradford <rob@linux.intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+import gobject
+import threading
+import os
+import datetime
+import time
+
+class BuildConfiguration:
+ """ Represents a potential *or* historic *or* concrete build. It
+ encompasses all the things that we need to tell bitbake to do to make it
+ build what we want it to build.
+
+ It also stores the metadata URL and the set of possible machines (and the
+ distros / images / uris for these). Apart from the metadata URL these are
+ not serialised to file (since they may be transient). In some ways this
+ functionality might be shifted to the loader class."""
+
+ def __init__ (self):
+ self.metadata_url = None
+
+ # Tuple of (distros, images, urls)
+ self.machine_options = {}
+
+ self.machine = None
+ self.distro = None
+ self.image = None
+ self.urls = []
+ self.extra_urls = []
+ self.extra_pkgs = []
+
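+ # Return a one-column ListStore of the known machine names, suitable
+ # for use as a combo box model.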
+ def get_machines_model (self):
+ model = gtk.ListStore (gobject.TYPE_STRING)
+ for machine in self.machine_options.keys():
+ model.append ([machine])
+
+ return model
+
+ def get_distro_and_images_models (self, machine):
+ distro_model = gtk.ListStore (gobject.TYPE_STRING)
+
+ for distro in self.machine_options[machine][0]:
+ distro_model.append ([distro])
+
+ image_model = gtk.ListStore (gobject.TYPE_STRING)
+
+ for image in self.machine_options[machine][1]:
+ image_model.append ([image])
+
+ return (distro_model, image_model)
+
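+ # Record and return the package feed URLs for the currently selected
+ # machine.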
+ def get_repos (self):
+ self.urls = self.machine_options[self.machine][2]
+ return self.urls
+
+ # It might be a lot better if we stored these in bitbake conf file
+ # format.
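+ # Each line of the serialised file is "<key>;<value>"; the recognised keys
+ # are metadata-url, url, extra-url, machine, distribution and image.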
+ @staticmethod
+ def load_from_file (filename):
+
+ conf = BuildConfiguration()
+ with open(filename, "r") as f:
+ for line in f:
+ data = line.split (";")[1]
+ if (line.startswith ("metadata-url;")):
+ conf.metadata_url = data.strip()
+ continue
+ if (line.startswith ("url;")):
+ conf.urls += [data.strip()]
+ continue
+ if (line.startswith ("extra-url;")):
+ conf.extra_urls += [data.strip()]
+ continue
+ if (line.startswith ("machine;")):
+ conf.machine = data.strip()
+ continue
+ if (line.startswith ("distribution;")):
+ conf.distro = data.strip()
+ continue
+ if (line.startswith ("image;")):
+ conf.image = data.strip()
+ continue
+
+ return conf
+
+ # Serialise to a file. This is part of the build process and we use this
+ # to be able to repeat a given build (using the same set of parameters)
+ # but also so that we can include the details of the image / machine /
+ # distro in the build manager tree view.
+ def write_to_file (self, filename):
+ f = open (filename, "w")
+
+ lines = []
+
+ if (self.metadata_url):
+ lines += ["metadata-url;%s\n" % (self.metadata_url)]
+
+ for url in self.urls:
+ lines += ["url;%s\n" % (url)]
+
+ for url in self.extra_urls:
+ lines += ["extra-url;%s\n" % (url)]
+
+ if (self.machine):
+ lines += ["machine;%s\n" % (self.machine)]
+
+ if (self.distro):
+ lines += ["distribution;%s\n" % (self.distro)]
+
+ if (self.image):
+ lines += ["image;%s\n" % (self.image)]
+
+ f.writelines (lines)
+ f.close ()
+
+class BuildResult(gobject.GObject):
+ """ Represents an historic build. Perhaps not successful. But it includes
+ things such as the files that are in the directory (the output from the
+ build) as well as a deserialised BuildConfiguration file that is stored in
+ ".conf" in the directory for the build.
+
+ This is GObject so that it can be included in the TreeStore."""
+
+ (STATE_COMPLETE, STATE_FAILED, STATE_ONGOING) = \
+ (0, 1, 2)
+
+ def __init__ (self, parent, identifier):
+ gobject.GObject.__init__ (self)
+ self.date = None
+
+ self.files = []
+ self.status = None
+ self.identifier = identifier
+ self.path = os.path.join (parent, identifier)
+
+ # Extract the date, since the directory name is of the
+ # format build-<year><month><day>-<ordinal> we can easily
+ # pull it out.
+ # TODO: Better to stat a file?
+ (_, date, revision) = identifier.split ("-")
+ print(date)
+
+ year = int (date[0:4])
+ month = int (date[4:6])
+ day = int (date[6:8])
+
+ self.date = datetime.date (year, month, day)
+
+ self.conf = None
+
+ # By default builds are STATE_FAILED unless we find a "complete" file
+ # in which case they are STATE_COMPLETE
+ self.state = BuildResult.STATE_FAILED
+ for file in os.listdir (self.path):
+ if (file.startswith (".conf")):
+ conffile = os.path.join (self.path, file)
+ self.conf = BuildConfiguration.load_from_file (conffile)
+ elif (file.startswith ("complete")):
+ self.state = BuildResult.STATE_COMPLETE
+ else:
+ self.add_file (file)
+
+ def add_file (self, file):
+ # Just add the file for now. Don't care about the type.
+ self.files += [(file, None)]
+
+class BuildManagerModel (gtk.TreeStore):
+ """ Model for the BuildManagerTreeView. This derives from gtk.TreeStore
+ but it abstracts nicely what the columns mean and the setup of the columns
+ in the model. """
+
+ (COL_IDENT, COL_DESC, COL_MACHINE, COL_DISTRO, COL_BUILD_RESULT, COL_DATE, COL_STATE) = \
+ (0, 1, 2, 3, 4, 5, 6)
+
+ def __init__ (self):
+ gtk.TreeStore.__init__ (self,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_OBJECT,
+ gobject.TYPE_INT64,
+ gobject.TYPE_INT)
+
+class BuildManager (gobject.GObject):
+ """ This class manages the historic builds that have been found in the
+ "results" directory but is also used for starting a new build."""
+
+ __gsignals__ = {
+ 'population-finished' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ 'populate-error' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ())
+ }
+
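+ # Fill in a top-level row of the model from a BuildResult, then add one
+ # child row per file produced by that build.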
+ def update_build_result (self, result, iter):
+ # Convert the date into something we can sort by.
+ date = long (time.mktime (result.date.timetuple()))
+
+ # Add a top level entry for the build
+
+ self.model.set (iter,
+ BuildManagerModel.COL_IDENT, result.identifier,
+ BuildManagerModel.COL_DESC, result.conf.image,
+ BuildManagerModel.COL_MACHINE, result.conf.machine,
+ BuildManagerModel.COL_DISTRO, result.conf.distro,
+ BuildManagerModel.COL_BUILD_RESULT, result,
+ BuildManagerModel.COL_DATE, date,
+ BuildManagerModel.COL_STATE, result.state)
+
+ # And then we use the files in the directory as the children for the
+ # top level iter.
+ for file in result.files:
+ self.model.append (iter, (None, file[0], None, None, None, date, -1))
+
+ # This function is called as an idle by the BuildManagerPopulaterThread
+ def add_build_result (self, result):
+ gtk.gdk.threads_enter()
+ self.known_builds += [result]
+
+ self.update_build_result (result, self.model.append (None))
+
+ gtk.gdk.threads_leave()
+
+ def notify_build_finished (self):
+ # This is a bit of a hack. If we have a build running then we will
+ # have a row in the model in STATE_ONGOING. Find it and update it as
+ # if it were a proper historic build (well, it is completed now).
+
+ # We need to use the iters here rather than the Python iterator
+ # interface to the model since we need to pass it into
+ # update_build_result
+
+ iter = self.model.get_iter_first()
+
+ while (iter):
+ (ident, state) = self.model.get(iter,
+ BuildManagerModel.COL_IDENT,
+ BuildManagerModel.COL_STATE)
+
+ if state == BuildResult.STATE_ONGOING:
+ result = BuildResult (self.results_directory, ident)
+ self.update_build_result (result, iter)
+ iter = self.model.iter_next(iter)
+
+ def notify_build_succeeded (self):
+ # Write the "complete" file so that when we create the BuildResult
+ # object we put into the model
+
+ complete_file_path = os.path.join (self.cur_build_directory, "complete")
+ f = open (complete_file_path, "w")
+ f.close()
+ self.notify_build_finished()
+
+ def notify_build_failed (self):
+ # Without a "complete" file then this will mark the build as failed:
+ self.notify_build_finished()
+
+ # This function is called as an idle
+ def emit_population_finished_signal (self):
+ gtk.gdk.threads_enter()
+ self.emit ("population-finished")
+ gtk.gdk.threads_leave()
+
+ class BuildManagerPopulaterThread (threading.Thread):
+ def __init__ (self, manager, directory):
+ threading.Thread.__init__ (self)
+ self.manager = manager
+ self.directory = directory
+
+ def run (self):
+ # For each of the "build-<...>" directories ..
+
+ if os.path.exists (self.directory):
+ for directory in os.listdir (self.directory):
+
+ if not directory.startswith ("build-"):
+ continue
+
+ build_result = BuildResult (self.directory, directory)
+ self.manager.add_build_result (build_result)
+
+ gobject.idle_add (BuildManager.emit_population_finished_signal,
+ self.manager)
+
+ def __init__ (self, server, results_directory):
+ gobject.GObject.__init__ (self)
+
+ # The builds that we've found from walking the result directory
+ self.known_builds = []
+
+ # Save out the bitbake server, we need this for issuing commands to
+ # the cooker:
+ self.server = server
+
+ # The TreeStore that we use
+ self.model = BuildManagerModel ()
+
+ # The results directory is where we create (and look for) the
+ # build-<xyz>-<n> directories. We need to populate ourselves from
+ # this directory.
+ self.results_directory = results_directory
+ self.populate_from_directory (self.results_directory)
+
+ def populate_from_directory (self, directory):
+ thread = BuildManager.BuildManagerPopulaterThread (self, directory)
+ thread.start()
+
+ # Come up with the name for the next build ident by combining "build-"
+ # with the date formatted as yyyymmdd and then an ordinal. We use an
+ # optimistic algorithm, incrementing the ordinal if we find that the
+ # name already exists.
+ def get_next_build_ident (self):
+ today = datetime.date.today ()
+ # Zero-pad the month and day so that BuildResult.__init__ can slice
+ # the date back out of the identifier.
+ datestr = today.strftime ("%Y%m%d")
+
+ revision = 0
+ test_name = "build-%s-%d" % (datestr, revision)
+ test_path = os.path.join (self.results_directory, test_name)
+
+ while (os.path.exists (test_path)):
+ revision += 1
+ test_name = "build-%s-%d" % (datestr, revision)
+ test_path = os.path.join (self.results_directory, test_name)
+
+ return test_name
+
+ # Take a BuildConfiguration and then try and build it based on the
+ # parameters of that configuration.
+ def do_build (self, conf):
+ server = self.server
+
+ # Work out the build directory. Note we actually create the
+ # directories here since we need to write the ".conf" file. Otherwise
+ # we could have relied on bitbake's builder thread to actually make
+ # the directories as it proceeds with the build.
+ ident = self.get_next_build_ident ()
+ build_directory = os.path.join (self.results_directory,
+ ident)
+ self.cur_build_directory = build_directory
+ os.makedirs (build_directory)
+
+ conffile = os.path.join (build_directory, ".conf")
+ conf.write_to_file (conffile)
+
+ # Add a row to the model representing this ongoing build. It's a
+ # placeholder entry: if this build completes or fails it gets updated
+ # with the real data, like the historic builds.
+ date = long (time.time())
+ self.model.append (None, (ident, conf.image, conf.machine, conf.distro,
+ None, date, BuildResult.STATE_ONGOING))
+ try:
+ server.runCommand(["setVariable", "BUILD_IMAGES_FROM_FEEDS", 1])
+ server.runCommand(["setVariable", "MACHINE", conf.machine])
+ server.runCommand(["setVariable", "DISTRO", conf.distro])
+ server.runCommand(["setVariable", "PACKAGE_CLASSES", "package_ipk"])
+ server.runCommand(["setVariable", "BBFILES", \
+ """${OEROOT}/meta/packages/*/*.bb ${OEROOT}/meta-moblin/packages/*/*.bb"""])
+ server.runCommand(["setVariable", "TMPDIR", "${OEROOT}/build/tmp"])
+ server.runCommand(["setVariable", "IPK_FEED_URIS", \
+ " ".join(conf.get_repos())])
+ server.runCommand(["setVariable", "DEPLOY_DIR_IMAGE",
+ build_directory])
+ server.runCommand(["buildTargets", [conf.image], "rootfs"])
+
+ except Exception as e:
+ print(e)
+
+class BuildManagerTreeView (gtk.TreeView):
+ """ The tree view for the build manager. This shows the historic builds
+ and so forth. """
+
+ # We use this function to control what goes in the cell since we store
+ # the date in the model as seconds since the epoch (for sorting) and so we
+ # need to make it human readable.
+ def date_format_custom_cell_data_func (self, col, cell, model, iter):
+ date = model.get (iter, BuildManagerModel.COL_DATE)[0]
+ datestr = time.strftime("%A %d %B %Y", time.localtime(date))
+ cell.set_property ("text", datestr)
+
+ # This format function controls what goes in the cell. We use this to map
+ # the integer state to a string and also to colourise the text
+ def state_format_custom_cell_data_fun (self, col, cell, model, iter):
+ state = model.get (iter, BuildManagerModel.COL_STATE)[0]
+
+ if (state == BuildResult.STATE_ONGOING):
+ cell.set_property ("text", "Active")
+ cell.set_property ("foreground", "#000000")
+ elif (state == BuildResult.STATE_FAILED):
+ cell.set_property ("text", "Failed")
+ cell.set_property ("foreground", "#ff0000")
+ elif (state == BuildResult.STATE_COMPLETE):
+ cell.set_property ("text", "Complete")
+ cell.set_property ("foreground", "#00ff00")
+ else:
+ cell.set_property ("text", "")
+
+ def __init__ (self):
+ gtk.TreeView.__init__(self)
+
+ # Miscellaneous description column
+ renderer = gtk.CellRendererText ()
+ col = gtk.TreeViewColumn (None, renderer,
+ text=BuildManagerModel.COL_DESC)
+ self.append_column (col)
+
+ # Machine
+ renderer = gtk.CellRendererText ()
+ col = gtk.TreeViewColumn ("Machine", renderer,
+ text=BuildManagerModel.COL_MACHINE)
+ self.append_column (col)
+
+ # distro
+ renderer = gtk.CellRendererText ()
+ col = gtk.TreeViewColumn ("Distribution", renderer,
+ text=BuildManagerModel.COL_DISTRO)
+ self.append_column (col)
+
+ # Date (uses a custom cell data function to format the contents: it
+ # converts the stored epoch value into a human readable string)
+ renderer = gtk.CellRendererText ()
+ col = gtk.TreeViewColumn ("Date", renderer,
+ text=BuildManagerModel.COL_DATE)
+ self.append_column (col)
+ col.set_cell_data_func (renderer,
+ self.date_format_custom_cell_data_func)
+
+ # For status.
+ renderer = gtk.CellRendererText ()
+ col = gtk.TreeViewColumn ("Status", renderer,
+ text = BuildManagerModel.COL_STATE)
+ self.append_column (col)
+ col.set_cell_data_func (renderer,
+ self.state_format_custom_cell_data_fun)
diff --git a/bitbake/lib/bb/ui/crumbs/hig/__init__.py b/bitbake/lib/bb/ui/crumbs/hig/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hig/__init__.py
diff --git a/bitbake/lib/bb/ui/crumbs/hig/advancedsettingsdialog.py b/bitbake/lib/bb/ui/crumbs/hig/advancedsettingsdialog.py
new file mode 100644
index 0000000..e0b3553
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hig/advancedsettingsdialog.py
@@ -0,0 +1,341 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011-2012 Intel Corporation
+#
+# Authored by Joshua Lock <josh@linux.intel.com>
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+import hashlib
+from bb.ui.crumbs.hobwidget import HobInfoButton, HobButton
+from bb.ui.crumbs.progressbar import HobProgressBar
+from bb.ui.crumbs.hig.settingsuihelper import SettingsUIHelper
+from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
+from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
+from bb.ui.crumbs.hig.proxydetailsdialog import ProxyDetailsDialog
+
+"""
+The following are convenience classes for implementing GNOME HIG compliant
+BitBake GUIs.
+In summary: spacing = 12px, border-width = 6px.
+"""
+
+class AdvancedSettingsDialog (CrumbsDialog, SettingsUIHelper):
+
+ def details_cb(self, button, parent, protocol):
+ dialog = ProxyDetailsDialog(title = protocol.upper() + " Proxy Details",
+ user = self.configuration.proxies[protocol][1],
+ passwd = self.configuration.proxies[protocol][2],
+ parent = parent,
+ flags = gtk.DIALOG_MODAL
+ | gtk.DIALOG_DESTROY_WITH_PARENT
+ | gtk.DIALOG_NO_SEPARATOR)
+ dialog.add_button(gtk.STOCK_CLOSE, gtk.RESPONSE_OK)
+ response = dialog.run()
+ if response == gtk.RESPONSE_OK:
+ self.configuration.proxies[protocol][1] = dialog.user
+ self.configuration.proxies[protocol][2] = dialog.passwd
+ self.refresh_proxy_components()
+ dialog.destroy()
+
+ def set_save_button(self, button):
+ self.save_button = button
+
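+ # When the root file system package format changes, rebuild the row of
+ # "additional package formats" check buttons so that the newly selected
+ # rootfs format is excluded from it.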
+ def rootfs_combo_changed_cb(self, rootfs_combo, all_package_format, check_hbox):
+ combo_item = self.rootfs_combo.get_active_text()
+ modified = False
+ for child in check_hbox.get_children():
+ if isinstance(child, gtk.CheckButton):
+ check_hbox.remove(child)
+ modified = True
+ for format in all_package_format:
+ if format != combo_item:
+ check_button = gtk.CheckButton(format)
+ check_hbox.pack_start(check_button, expand=False, fill=False)
+ modified = True
+ if modified:
+ check_hbox.remove(self.pkgfmt_info)
+ check_hbox.pack_start(self.pkgfmt_info, expand=False, fill=False)
+ check_hbox.show_all()
+
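+ # Build the package format section: a combo box for the root file system
+ # format plus check buttons for any additional formats to generate.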
+ def gen_pkgfmt_widget(self, curr_package_format, all_package_format, tooltip_combo="", tooltip_extra=""):
+ pkgfmt_vbox = gtk.VBox(False, 6)
+
+ label = self.gen_label_widget("Root file system package format")
+ pkgfmt_vbox.pack_start(label, expand=False, fill=False)
+
+ rootfs_format = ""
+ if curr_package_format:
+ rootfs_format = curr_package_format.split()[0]
+
+ rootfs_format_widget, rootfs_combo = self.gen_combo_widget(rootfs_format, all_package_format, tooltip_combo)
+ pkgfmt_vbox.pack_start(rootfs_format_widget, expand=False, fill=False)
+
+ label = self.gen_label_widget("Additional package formats")
+ pkgfmt_vbox.pack_start(label, expand=False, fill=False)
+
+ check_hbox = gtk.HBox(False, 12)
+ pkgfmt_vbox.pack_start(check_hbox, expand=False, fill=False)
+ for format in all_package_format:
+ if format != rootfs_format:
+ check_button = gtk.CheckButton(format)
+ is_active = (format in curr_package_format.split())
+ check_button.set_active(is_active)
+ check_hbox.pack_start(check_button, expand=False, fill=False)
+
+ self.pkgfmt_info = HobInfoButton(tooltip_extra, self)
+ check_hbox.pack_start(self.pkgfmt_info, expand=False, fill=False)
+
+ rootfs_combo.connect("changed", self.rootfs_combo_changed_cb, all_package_format, check_hbox)
+
+ pkgfmt_vbox.show_all()
+
+ return pkgfmt_vbox, rootfs_combo, check_hbox
+
+ def __init__(self, title, configuration, all_image_types,
+ all_package_formats, all_distros, all_sdk_machines,
+ max_threads, parent, flags, buttons=None):
+ super(AdvancedSettingsDialog, self).__init__(title, parent, flags, buttons)
+
+ # class members from other objects
+ # bitbake settings from Builder.Configuration
+ self.configuration = configuration
+ self.image_types = all_image_types
+ self.all_package_formats = all_package_formats
+ self.all_distros = all_distros[:]
+ self.all_sdk_machines = all_sdk_machines
+ self.max_threads = max_threads
+
+ # class members for internal use
+ self.distro_combo = None
+ self.dldir_text = None
+ self.sstatedir_text = None
+ self.sstatemirror_text = None
+ self.bb_spinner = None
+ self.pmake_spinner = None
+ self.rootfs_size_spinner = None
+ self.extra_size_spinner = None
+ self.gplv3_checkbox = None
+ self.sdk_checkbox = None
+ self.image_types_checkbuttons = {}
+
+ self.md5 = self.config_md5()
+ self.settings_changed = False
+
+ # create visual elements on the dialog
+ self.save_button = None
+ self.create_visual_elements()
+ self.connect("response", self.response_cb)
+
+ def _get_sorted_value(self, var):
+ return " ".join(sorted(str(var).split())) + "\n"
+
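+ # Hash the settings shown in this dialog so that response_cb can detect
+ # whether anything was actually changed.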
+ def config_md5(self):
+ data = ""
+ data += ("PACKAGE_CLASSES: " + self.configuration.curr_package_format + '\n')
+ data += ("DISTRO: " + self._get_sorted_value(self.configuration.curr_distro))
+ data += ("IMAGE_ROOTFS_SIZE: " + self._get_sorted_value(self.configuration.image_rootfs_size))
+ data += ("IMAGE_EXTRA_SIZE: " + self._get_sorted_value(self.configuration.image_extra_size))
+ data += ("INCOMPATIBLE_LICENSE: " + self._get_sorted_value(self.configuration.incompat_license))
+ data += ("SDK_MACHINE: " + self._get_sorted_value(self.configuration.curr_sdk_machine))
+ data += ("TOOLCHAIN_BUILD: " + self._get_sorted_value(self.configuration.toolchain_build))
+ data += ("IMAGE_FSTYPES: " + self._get_sorted_value(self.configuration.image_fstypes))
+ return hashlib.md5(data).hexdigest()
+
+ def create_visual_elements(self):
+ self.nb = gtk.Notebook()
+ self.nb.set_show_tabs(True)
+ self.nb.append_page(self.create_image_types_page(), gtk.Label("Image types"))
+ self.nb.append_page(self.create_output_page(), gtk.Label("Output"))
+ self.nb.set_current_page(0)
+ self.vbox.pack_start(self.nb, expand=True, fill=True)
+ self.vbox.pack_end(gtk.HSeparator(), expand=True, fill=True)
+
+ self.show_all()
+
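+ # Count the selected image type check buttons; the Save button is only
+ # sensitive while at least one is checked.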
+ def get_num_checked_image_types(self):
+ total = 0
+ for b in self.image_types_checkbuttons.values():
+ if b.get_active():
+ total = total + 1
+ return total
+
+ def set_save_button_state(self):
+ if self.save_button:
+ self.save_button.set_sensitive(self.get_num_checked_image_types() > 0)
+
+ def image_type_checkbutton_clicked_cb(self, button):
+ self.set_save_button_state()
+ if self.get_num_checked_image_types() == 0:
+ # Show an error dialog
+ lbl = "<b>Select an image type</b>"
+ msg = "You need to select at least one image type."
+ dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_WARNING, msg)
+ button = dialog.add_button("OK", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ response = dialog.run()
+ dialog.destroy()
+
+ def create_image_types_page(self):
+ main_vbox = gtk.VBox(False, 16)
+ main_vbox.set_border_width(6)
+
+ advanced_vbox = gtk.VBox(False, 6)
+ advanced_vbox.set_border_width(6)
+
+ distro_vbox = gtk.VBox(False, 6)
+ label = self.gen_label_widget("Distro:")
+ tooltip = "Selects the Yocto Project distribution you want"
+ try:
+ i = self.all_distros.index( "defaultsetup" )
+ except ValueError:
+ i = -1
+ if i != -1:
+ self.all_distros[ i ] = "Default"
+ if self.configuration.curr_distro == "defaultsetup":
+ self.configuration.curr_distro = "Default"
+ distro_widget, self.distro_combo = self.gen_combo_widget(self.configuration.curr_distro, self.all_distros,"<b>Distro</b>" + "*" + tooltip)
+ distro_vbox.pack_start(label, expand=False, fill=False)
+ distro_vbox.pack_start(distro_widget, expand=False, fill=False)
+ main_vbox.pack_start(distro_vbox, expand=False, fill=False)
+
+
+ rows = (len(self.image_types)+1)/3
+ table = gtk.Table(rows + 1, 10, True)
+ advanced_vbox.pack_start(table, expand=False, fill=False)
+
+ tooltip = "Image file system types you want."
+ info = HobInfoButton("<b>Image types</b>" + "*" + tooltip, self)
+ label = self.gen_label_widget("Image types:")
+ align = gtk.Alignment(0, 0.5, 0, 0)
+ table.attach(align, 0, 4, 0, 1)
+ align.add(label)
+ table.attach(info, 4, 5, 0, 1)
+
+ i = 1
+ j = 1
+ for image_type in sorted(self.image_types):
+ self.image_types_checkbuttons[image_type] = gtk.CheckButton(image_type)
+ self.image_types_checkbuttons[image_type].connect("toggled", self.image_type_checkbutton_clicked_cb)
+ article = ""
+ if image_type.startswith(("a", "e", "i", "o", "u")):
+ article = "n"
+ if image_type == "live":
+ self.image_types_checkbuttons[image_type].set_tooltip_text("Build iso and hddimg images")
+ else:
+ self.image_types_checkbuttons[image_type].set_tooltip_text("Build a%s %s image" % (article, image_type))
+ table.attach(self.image_types_checkbuttons[image_type], j - 1, j + 3, i, i + 1)
+ if image_type in self.configuration.image_fstypes.split():
+ self.image_types_checkbuttons[image_type].set_active(True)
+ i += 1
+ if i > rows:
+ i = 1
+ j = j + 4
+
+ main_vbox.pack_start(advanced_vbox, expand=False, fill=False)
+ self.set_save_button_state()
+
+ return main_vbox
+
+ def create_output_page(self):
+ advanced_vbox = gtk.VBox(False, 6)
+ advanced_vbox.set_border_width(6)
+
+ advanced_vbox.pack_start(self.gen_label_widget('<span weight="bold">Package format</span>'), expand=False, fill=False)
+ sub_vbox = gtk.VBox(False, 6)
+ advanced_vbox.pack_start(sub_vbox, expand=False, fill=False)
+ tooltip_combo = "Selects the package format used to generate rootfs."
+ tooltip_extra = "Selects extra package formats to build"
+ pkgfmt_widget, self.rootfs_combo, self.check_hbox = self.gen_pkgfmt_widget(self.configuration.curr_package_format, self.all_package_formats,"<b>Root file system package format</b>" + "*" + tooltip_combo,"<b>Additional package formats</b>" + "*" + tooltip_extra)
+ sub_vbox.pack_start(pkgfmt_widget, expand=False, fill=False)
+
+ advanced_vbox.pack_start(self.gen_label_widget('<span weight="bold">Image size</span>'), expand=False, fill=False)
+ sub_vbox = gtk.VBox(False, 6)
+ advanced_vbox.pack_start(sub_vbox, expand=False, fill=False)
+ label = self.gen_label_widget("Image basic size (in MB)")
+ tooltip = "Defines the size for the generated image. The OpenEmbedded build system determines the final size for the generated image using an algorithm that takes into account the initial disk space used for the generated image, the Image basic size value, and the Additional free space value.\n\nFor more information, check the <a href=\"http://www.yoctoproject.org/docs/current/poky-ref-manual/poky-ref-manual.html#var-IMAGE_ROOTFS_SIZE\">Yocto Project Reference Manual</a>."
+ rootfs_size_widget, self.rootfs_size_spinner = self.gen_spinner_widget(int(self.configuration.image_rootfs_size*1.0/1024), 0, 65536,"<b>Image basic size</b>" + "*" + tooltip)
+ sub_vbox.pack_start(label, expand=False, fill=False)
+ sub_vbox.pack_start(rootfs_size_widget, expand=False, fill=False)
+
+ sub_vbox = gtk.VBox(False, 6)
+ advanced_vbox.pack_start(sub_vbox, expand=False, fill=False)
+ label = self.gen_label_widget("Additional free space (in MB)")
+ tooltip = "Sets extra free disk space to be added to the generated image. Use this variable when you want to ensure that a specific amount of free disk space is available on a device after an image is installed and running."
+ extra_size_widget, self.extra_size_spinner = self.gen_spinner_widget(int(self.configuration.image_extra_size*1.0/1024), 0, 65536,"<b>Additional free space</b>" + "*" + tooltip)
+ sub_vbox.pack_start(label, expand=False, fill=False)
+ sub_vbox.pack_start(extra_size_widget, expand=False, fill=False)
+
+ advanced_vbox.pack_start(self.gen_label_widget('<span weight="bold">Licensing</span>'), expand=False, fill=False)
+ self.gplv3_checkbox = gtk.CheckButton("Exclude GPLv3 packages")
+ self.gplv3_checkbox.set_tooltip_text("Check this box to prevent GPLv3 packages from being included in your image")
+ if "GPLv3" in self.configuration.incompat_license.split():
+ self.gplv3_checkbox.set_active(True)
+ else:
+ self.gplv3_checkbox.set_active(False)
+ advanced_vbox.pack_start(self.gplv3_checkbox, expand=False, fill=False)
+
+ advanced_vbox.pack_start(self.gen_label_widget('<span weight="bold">SDK</span>'), expand=False, fill=False)
+ sub_hbox = gtk.HBox(False, 6)
+ advanced_vbox.pack_start(sub_hbox, expand=False, fill=False)
+ self.sdk_checkbox = gtk.CheckButton("Populate SDK")
+ tooltip = "Check this box to generate an SDK tarball that consists of the cross-toolchain and a sysroot that contains development packages for your image."
+ self.sdk_checkbox.set_tooltip_text(tooltip)
+ self.sdk_checkbox.set_active(self.configuration.toolchain_build)
+ sub_hbox.pack_start(self.sdk_checkbox, expand=False, fill=False)
+
+ tooltip = "Select the host platform for which you want to run the toolchain contained in the SDK tarball."
+ sdk_machine_widget, self.sdk_machine_combo = self.gen_combo_widget(self.configuration.curr_sdk_machine, self.all_sdk_machines,"<b>Populate SDK</b>" + "*" + tooltip)
+ sub_hbox.pack_start(sdk_machine_widget, expand=False, fill=False)
+
+ return advanced_vbox
+
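+ # Copy the widget state back into self.configuration and compare the new
+ # settings hash with the one taken at construction time.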
+ def response_cb(self, dialog, response_id):
+ package_format = []
+ package_format.append(self.rootfs_combo.get_active_text())
+ for child in self.check_hbox:
+ if isinstance(child, gtk.CheckButton) and child.get_active():
+ package_format.append(child.get_label())
+ self.configuration.curr_package_format = " ".join(package_format)
+
+ distro = self.distro_combo.get_active_text()
+ if distro == "Default":
+ distro = "defaultsetup"
+ self.configuration.curr_distro = distro
+ self.configuration.image_rootfs_size = self.rootfs_size_spinner.get_value_as_int() * 1024
+ self.configuration.image_extra_size = self.extra_size_spinner.get_value_as_int() * 1024
+
+ self.configuration.image_fstypes = ""
+ for image_type in self.image_types:
+ if self.image_types_checkbuttons[image_type].get_active():
+ self.configuration.image_fstypes += (" " + image_type)
+ self.configuration.image_fstypes = self.configuration.image_fstypes.strip()
+
+ if self.gplv3_checkbox.get_active():
+ if "GPLv3" not in self.configuration.incompat_license.split():
+ self.configuration.incompat_license += " GPLv3"
+ else:
+ if "GPLv3" in self.configuration.incompat_license.split():
+ self.configuration.incompat_license = self.configuration.incompat_license.split().remove("GPLv3")
+ self.configuration.incompat_license = " ".join(self.configuration.incompat_license or [])
+ self.configuration.incompat_license = self.configuration.incompat_license.strip()
+
+ self.configuration.toolchain_build = self.sdk_checkbox.get_active()
+ self.configuration.curr_sdk_machine = self.sdk_machine_combo.get_active_text()
+ md5 = self.config_md5()
+ self.settings_changed = (self.md5 != md5)
diff --git a/bitbake/lib/bb/ui/crumbs/hig/crumbsdialog.py b/bitbake/lib/bb/ui/crumbs/hig/crumbsdialog.py
new file mode 100644
index 0000000..c679f9a
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hig/crumbsdialog.py
@@ -0,0 +1,44 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011-2012 Intel Corporation
+#
+# Authored by Joshua Lock <josh@linux.intel.com>
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+
+"""
+The following are convenience classes for implementing GNOME HIG compliant
+BitBake GUIs.
+In summary: spacing = 12px, border-width = 6px.
+"""
+
+class CrumbsDialog(gtk.Dialog):
+ """
+ A GNOME HIG compliant dialog widget.
+ Add buttons with gtk.Dialog.add_button or gtk.Dialog.add_buttons
+ """
+ def __init__(self, title="", parent=None, flags=0, buttons=None):
+ super(CrumbsDialog, self).__init__(title, parent, flags, buttons)
+
+ self.set_property("has-separator", False) # note: deprecated in 2.22
+
+ self.set_border_width(6)
+ self.vbox.set_property("spacing", 12)
+ self.action_area.set_property("spacing", 12)
+ self.action_area.set_property("border-width", 6)
diff --git a/bitbake/lib/bb/ui/crumbs/hig/crumbsmessagedialog.py b/bitbake/lib/bb/ui/crumbs/hig/crumbsmessagedialog.py
new file mode 100644
index 0000000..3b998e4
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hig/crumbsmessagedialog.py
@@ -0,0 +1,70 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011-2012 Intel Corporation
+#
+# Authored by Joshua Lock <josh@linux.intel.com>
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import glib
+import gtk
+from bb.ui.crumbs.hobwidget import HobIconChecker
+from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
+
+"""
+The following are convenience classes for implementing GNOME HIG compliant
+BitBake GUIs.
+In summary: spacing = 12px, border-width = 6px.
+"""
+
+class CrumbsMessageDialog(gtk.MessageDialog):
+ """
+ A GNOME HIG compliant dialog widget.
+ Add buttons with gtk.Dialog.add_button or gtk.Dialog.add_buttons
+ """
+ def __init__(self, parent = None, label="", dialog_type = gtk.MESSAGE_QUESTION, msg=""):
+ super(CrumbsMessageDialog, self).__init__(None,
+ gtk.DIALOG_MODAL | gtk.DIALOG_DESTROY_WITH_PARENT,
+ dialog_type,
+ gtk.BUTTONS_NONE,
+ None)
+
+ self.set_skip_taskbar_hint(False)
+
+ self.set_markup(label)
+
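+ # Short messages use the standard secondary markup area; longer ones are
+ # shown in a scrollable, read-only TextView instead.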
+ if 0 <= len(msg) < 300:
+ self.format_secondary_markup(msg)
+ else:
+ vbox = self.get_message_area()
+ vbox.set_border_width(1)
+ vbox.set_property("spacing", 12)
+ self.textWindow = gtk.ScrolledWindow()
+ self.textWindow.set_shadow_type(gtk.SHADOW_IN)
+ self.textWindow.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
+ self.msgView = gtk.TextView()
+ self.msgView.set_editable(False)
+ self.msgView.set_wrap_mode(gtk.WRAP_WORD)
+ self.msgView.set_cursor_visible(False)
+ self.msgView.set_size_request(300, 300)
+ self.buf = gtk.TextBuffer()
+ self.buf.set_text(msg)
+ self.msgView.set_buffer(self.buf)
+ self.textWindow.add(self.msgView)
+ self.msgView.show()
+ vbox.add(self.textWindow)
+ self.textWindow.show()
diff --git a/bitbake/lib/bb/ui/crumbs/hig/deployimagedialog.py b/bitbake/lib/bb/ui/crumbs/hig/deployimagedialog.py
new file mode 100644
index 0000000..a13fff9
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hig/deployimagedialog.py
@@ -0,0 +1,219 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011-2012 Intel Corporation
+#
+# Authored by Joshua Lock <josh@linux.intel.com>
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import glob
+import gtk
+import gobject
+import os
+import re
+import shlex
+import subprocess
+import tempfile
+from bb.ui.crumbs.hobwidget import hic, HobButton
+from bb.ui.crumbs.progressbar import HobProgressBar
+import bb.ui.crumbs.utils
+import bb.process
+from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
+from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
+
+"""
+The following are convenience classes for implementing GNOME HIG compliant
+BitBake GUIs.
+In summary: spacing = 12px, border-width = 6px.
+"""
+
+class DeployImageDialog (CrumbsDialog):
+
+ __dummy_usb__ = "--select a usb drive--"
+
+ def __init__(self, title, image_path, parent, flags, buttons=None, standalone=False):
+ super(DeployImageDialog, self).__init__(title, parent, flags, buttons)
+
+ self.image_path = image_path
+ self.standalone = standalone
+
+ self.create_visual_elements()
+ self.connect("response", self.response_cb)
+
+ def create_visual_elements(self):
+ self.set_size_request(600, 400)
+ label = gtk.Label()
+ label.set_alignment(0.0, 0.5)
+ markup = "<span font_desc='12'>The image to be written into usb drive:</span>"
+ label.set_markup(markup)
+ self.vbox.pack_start(label, expand=False, fill=False, padding=2)
+
+ table = gtk.Table(2, 10, False)
+ table.set_col_spacings(5)
+ table.set_row_spacings(5)
+ self.vbox.pack_start(table, expand=True, fill=True)
+
+ scroll = gtk.ScrolledWindow()
+ scroll.set_policy(gtk.POLICY_NEVER, gtk.POLICY_AUTOMATIC)
+ scroll.set_shadow_type(gtk.SHADOW_IN)
+ tv = gtk.TextView()
+ tv.set_editable(False)
+ tv.set_wrap_mode(gtk.WRAP_WORD)
+ tv.set_cursor_visible(False)
+ self.buf = gtk.TextBuffer()
+ self.buf.set_text(self.image_path)
+ tv.set_buffer(self.buf)
+ scroll.add(tv)
+ table.attach(scroll, 0, 10, 0, 1)
+
+ # There are two ways to use DeployImageDialog:
+ # it is either invoked by Hob when the 'Deploy Image' button is clicked,
+ # or it is invoked by a standalone script.
+ # The following block handles the latter case: it adds a 'Select Image'
+ # button and emits a signal when that button is clicked.
+ if self.standalone:
+ gobject.signal_new("select_image_clicked", self, gobject.SIGNAL_RUN_FIRST,
+ gobject.TYPE_NONE, ())
+ icon = gtk.Image()
+ pix_buffer = gtk.gdk.pixbuf_new_from_file(hic.ICON_IMAGES_DISPLAY_FILE)
+ icon.set_from_pixbuf(pix_buffer)
+ button = gtk.Button("Select Image")
+ button.set_image(icon)
+ #button.set_size_request(140, 50)
+ table.attach(button, 9, 10, 1, 2, gtk.FILL, 0, 0, 0)
+ button.connect("clicked", self.select_image_button_clicked_cb)
+
+ separator = gtk.HSeparator()
+ self.vbox.pack_start(separator, expand=False, fill=False, padding=10)
+
+ self.usb_desc = gtk.Label()
+ self.usb_desc.set_alignment(0.0, 0.5)
+ markup = "<span font_desc='12'>You haven't chosen any USB drive.</span>"
+ self.usb_desc.set_markup(markup)
+
+ self.usb_combo = gtk.combo_box_new_text()
+ self.usb_combo.connect("changed", self.usb_combo_changed_cb)
+ model = self.usb_combo.get_model()
+ model.clear()
+ self.usb_combo.append_text(self.__dummy_usb__)
+ for usb in self.find_all_usb_devices():
+ self.usb_combo.append_text("/dev/" + usb)
+ self.usb_combo.set_active(0)
+ self.vbox.pack_start(self.usb_combo, expand=False, fill=False)
+ self.vbox.pack_start(self.usb_desc, expand=False, fill=False, padding=2)
+
+ self.progress_bar = HobProgressBar()
+ self.vbox.pack_start(self.progress_bar, expand=False, fill=False)
+ separator = gtk.HSeparator()
+ self.vbox.pack_start(separator, expand=False, fill=True, padding=10)
+
+ self.vbox.show_all()
+ self.progress_bar.hide()
+
+ def set_image_text_buffer(self, image_path):
+ self.buf.set_text(image_path)
+
+ def set_image_path(self, image_path):
+ self.image_path = image_path
+
+ def popen_read(self, cmd):
+ tmpout, errors = bb.process.run("%s" % cmd)
+ return tmpout.strip()
+
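+ # Resolve the /dev/disk/by-id/usb* symlinks to device names, skipping
+ # partition entries so that only whole drives are listed.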
+ def find_all_usb_devices(self):
+ usb_devs = [ os.readlink(u)
+ for u in glob.glob('/dev/disk/by-id/usb*')
+ if not re.search(r'part\d+', u) ]
+ return [ '%s' % u[u.rfind('/')+1:] for u in usb_devs ]
+
+ def get_usb_info(self, dev):
+ return "%s %s" % \
+ (self.popen_read('cat /sys/class/block/%s/device/vendor' % dev),
+ self.popen_read('cat /sys/class/block/%s/device/model' % dev))
+
+ def select_image_button_clicked_cb(self, button):
+ self.emit('select_image_clicked')
+
+ def usb_combo_changed_cb(self, usb_combo):
+ combo_item = self.usb_combo.get_active_text()
+ if not combo_item or combo_item == self.__dummy_usb__:
+ markup = "<span font_desc='12'>You haven't chosen any USB drive.</span>"
+ self.usb_desc.set_markup(markup)
+ else:
+ markup = "<span font_desc='12'>" + self.get_usb_info(combo_item.lstrip("/dev/")) + "</span>"
+ self.usb_desc.set_markup(markup)
+
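+ # On confirmation (gtk.RESPONSE_YES), run 'sudo dd' in an external
+ # terminal and read its exit status back from a temporary file to
+ # report success or failure.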
+ def response_cb(self, dialog, response_id):
+ if response_id == gtk.RESPONSE_YES:
+ lbl = ''
+ msg = ''
+ combo_item = self.usb_combo.get_active_text()
+ if combo_item and combo_item != self.__dummy_usb__ and self.image_path:
+ cmdline = bb.ui.crumbs.utils.which_terminal()
+ if cmdline:
+ tmpfile = tempfile.NamedTemporaryFile()
+ cmdline += "\"sudo dd if=" + self.image_path + \
+ " of=" + combo_item + " && sync; echo $? > " + tmpfile.name + "\""
+ subprocess.call(shlex.split(cmdline))
+
+ if int(tmpfile.readline().strip()) == 0:
+ lbl = "<b>Deploy image successfully.</b>"
+ else:
+ lbl = "<b>Failed to deploy image.</b>"
+ msg = "Please check image <b>%s</b> exists and USB device <b>%s</b> is writable." % (self.image_path, combo_item)
+ tmpfile.close()
+ else:
+ if not self.image_path:
+ lbl = "<b>No selection made.</b>"
+ msg = "You have not selected an image to deploy."
+ else:
+ lbl = "<b>No selection made.</b>"
+ msg = "You have not selected a USB device."
+ if len(lbl):
+ crumbs_dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_INFO, msg)
+ button = crumbs_dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ crumbs_dialog.run()
+ crumbs_dialog.destroy()
+
+ def update_progress_bar(self, title, fraction, status=None):
+ self.progress_bar.update(fraction)
+ self.progress_bar.set_title(title)
+ self.progress_bar.set_rcstyle(status)
+
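+ # Copy the image to the target device in 1 MB chunks, updating the
+ # progress bar as data is written.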
+ def write_file(self, ifile, ofile):
+ self.progress_bar.reset()
+ self.progress_bar.show()
+
+ f_from = os.open(ifile, os.O_RDONLY)
+ f_to = os.open(ofile, os.O_WRONLY)
+
+ total_size = os.stat(ifile).st_size
+ written_size = 0
+
+ while True:
+ buf = os.read(f_from, 1024*1024)
+ if not buf:
+ break
+ os.write(f_to, buf)
+ # Count the bytes actually read so the final, possibly short, chunk
+ # does not push the fraction past 1.0.
+ written_size += len(buf)
+ self.update_progress_bar("Writing to USB:", written_size * 1.0/total_size)
+
+ self.update_progress_bar("Writing completed:", 1.0)
+ os.close(f_from)
+ os.close(f_to)
+ self.progress_bar.hide()
diff --git a/bitbake/lib/bb/ui/crumbs/hig/imageselectiondialog.py b/bitbake/lib/bb/ui/crumbs/hig/imageselectiondialog.py
new file mode 100644
index 0000000..21216ad
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hig/imageselectiondialog.py
@@ -0,0 +1,172 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011-2012 Intel Corporation
+#
+# Authored by Joshua Lock <josh@linux.intel.com>
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+import gobject
+import os
+from bb.ui.crumbs.hobwidget import HobViewTable, HobInfoButton, HobButton, HobAltButton
+from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
+from bb.ui.crumbs.hig.layerselectiondialog import LayerSelectionDialog
+
+"""
+The following are convenience classes for implementing GNOME HIG compliant
+BitBake GUIs.
+In summary: spacing = 12px, border-width = 6px.
+"""
+
+class ImageSelectionDialog (CrumbsDialog):
+
+ __columns__ = [{
+ 'col_name' : 'Image name',
+ 'col_id' : 0,
+ 'col_style': 'text',
+ 'col_min' : 400,
+ 'col_max' : 400
+ }, {
+ 'col_name' : 'Select',
+ 'col_id' : 1,
+ 'col_style': 'radio toggle',
+ 'col_min' : 160,
+ 'col_max' : 160
+ }]
+
+
+ def __init__(self, image_folder, image_types, title, parent, flags, buttons=None, image_extension = {}):
+ super(ImageSelectionDialog, self).__init__(title, parent, flags, buttons)
+ self.connect("response", self.response_cb)
+
+ self.image_folder = image_folder
+ self.image_types = image_types
+ self.image_list = []
+ self.image_names = []
+ self.image_extension = image_extension
+
+ # create visual elements on the dialog
+ self.create_visual_elements()
+
+ self.image_store = gtk.ListStore(gobject.TYPE_STRING, gobject.TYPE_BOOLEAN)
+ self.fill_image_store()
+
+ def create_visual_elements(self):
+ hbox = gtk.HBox(False, 6)
+
+ self.vbox.pack_start(hbox, expand=False, fill=False)
+
+ entry = gtk.Entry()
+ entry.set_text(self.image_folder)
+ table = gtk.Table(1, 10, True)
+ table.set_size_request(560, -1)
+ hbox.pack_start(table, expand=False, fill=False)
+ table.attach(entry, 0, 9, 0, 1)
+ image = gtk.Image()
+ image.set_from_stock(gtk.STOCK_OPEN, gtk.ICON_SIZE_BUTTON)
+ open_button = gtk.Button()
+ open_button.set_image(image)
+ open_button.connect("clicked", self.select_path_cb, self, entry)
+ table.attach(open_button, 9, 10, 0, 1)
+
+ self.image_table = HobViewTable(self.__columns__, "Images")
+ self.image_table.set_size_request(-1, 300)
+ self.image_table.connect("toggled", self.toggled_cb)
+ self.image_table.connect_group_selection(self.table_selected_cb)
+ self.image_table.connect("row-activated", self.row_actived_cb)
+ self.vbox.pack_start(self.image_table, expand=True, fill=True)
+
+ self.show_all()
+
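+ # Make the 'Select' column behave like a radio group: clear the toggle
+ # on every row, then set it on the chosen row only.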
+ def change_image_cb(self, model, path, columnid):
+ if not model:
+ return
+ iter = model.get_iter_first()
+ while iter:
+ rowpath = model.get_path(iter)
+ model[rowpath][columnid] = False
+ iter = model.iter_next(iter)
+
+ model[path][columnid] = True
+
+ def toggled_cb(self, table, cell, path, columnid, tree):
+ model = tree.get_model()
+ self.change_image_cb(model, path, columnid)
+
+ def table_selected_cb(self, selection):
+ model, paths = selection.get_selected_rows()
+ if paths:
+ self.change_image_cb(model, paths[0], 1)
+
+ def row_actived_cb(self, tab, model, path):
+ self.change_image_cb(model, path, 1)
+ self.emit('response', gtk.RESPONSE_YES)
+
+ def select_path_cb(self, action, parent, entry):
+ dialog = gtk.FileChooserDialog("", parent,
+ gtk.FILE_CHOOSER_ACTION_SELECT_FOLDER)
+ text = entry.get_text()
+ dialog.set_current_folder(text if len(text) > 0 else os.getcwd())
+ button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
+ HobAltButton.style_button(button)
+ button = dialog.add_button("Open", gtk.RESPONSE_YES)
+ HobButton.style_button(button)
+ response = dialog.run()
+ if response == gtk.RESPONSE_YES:
+ path = dialog.get_filename()
+ entry.set_text(path)
+ self.image_folder = path
+ self.fill_image_store()
+
+ dialog.destroy()
+
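+ # Scan the top level of the image folder and group files that share a
+ # base name (ignoring the '.rootfs' suffix and the image type extension)
+ # into single entries in the image store.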
+ def fill_image_store(self):
+ self.image_list = []
+ self.image_store.clear()
+ imageset = set()
+ for root, dirs, files in os.walk(self.image_folder):
+ # ignore the sub directories
+ dirs[:] = []
+ for f in files:
+ for image_type in self.image_types:
+ if image_type in self.image_extension:
+ real_types = self.image_extension[image_type]
+ else:
+ real_types = [image_type]
+ for real_image_type in real_types:
+ if f.endswith('.' + real_image_type):
+ imageset.add(f.rsplit('.' + real_image_type)[0].rsplit('.rootfs')[0])
+ self.image_list.append(f)
+
+ for image in imageset:
+ self.image_store.set(self.image_store.append(), 0, image, 1, False)
+
+ self.image_table.set_model(self.image_store)
+
+ def response_cb(self, dialog, response_id):
+ self.image_names = []
+ if response_id == gtk.RESPONSE_YES:
+ iter = self.image_store.get_iter_first()
+ while iter:
+ path = self.image_store.get_path(iter)
+ if self.image_store[path][1]:
+ for f in self.image_list:
+ if f.startswith(self.image_store[path][0] + '.'):
+ self.image_names.append(f)
+ break
+ iter = self.image_store.iter_next(iter)
diff --git a/bitbake/lib/bb/ui/crumbs/hig/layerselectiondialog.py b/bitbake/lib/bb/ui/crumbs/hig/layerselectiondialog.py
new file mode 100644
index 0000000..52d57b6
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hig/layerselectiondialog.py
@@ -0,0 +1,298 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011-2012 Intel Corporation
+#
+# Authored by Joshua Lock <josh@linux.intel.com>
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+import gobject
+import os
+import tempfile
+from bb.ui.crumbs.hobwidget import hic, HobButton, HobAltButton
+from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
+from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
+
+"""
+The following are convenience classes for implementing GNOME HIG compliant
+BitBake GUIs.
+In summary: spacing = 12px, border-width = 6px.
+"""
+
+class CellRendererPixbufActivatable(gtk.CellRendererPixbuf):
+ """
+ A custom CellRenderer implementation which is activatable
+ so that we can handle user clicks
+ """
+ __gsignals__ = { 'clicked' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_STRING,)), }
+
+ def __init__(self):
+ gtk.CellRendererPixbuf.__init__(self)
+ self.set_property('mode', gtk.CELL_RENDERER_MODE_ACTIVATABLE)
+ self.set_property('follow-state', True)
+
+ """
+ Respond to a user click on a cell
+ """
+ def do_activate(self, event, widget, path, background_area, cell_area, flags):
+ self.emit('clicked', path)
+
+#
+# LayerSelectionDialog
+#
+class LayerSelectionDialog (CrumbsDialog):
+
+ TARGETS = [
+ ("MY_TREE_MODEL_ROW", gtk.TARGET_SAME_WIDGET, 0),
+ ("text/plain", 0, 1),
+ ("TEXT", 0, 2),
+ ("STRING", 0, 3),
+ ]
+
+ def gen_label_widget(self, content):
+ label = gtk.Label()
+ label.set_alignment(0, 0)
+ label.set_markup(content)
+ label.show()
+ return label
+
+ def layer_widget_toggled_cb(self, cell, path, layer_store):
+ name = layer_store[path][0]
+ toggle = not layer_store[path][1]
+ layer_store[path][1] = toggle
+
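+ # Let the user pick a layer directory and check that the path exists,
+ # contains conf/layer.conf and is not already loaded before adding it to
+ # the layer store.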
+ def layer_widget_add_clicked_cb(self, action, layer_store, parent):
+ dialog = gtk.FileChooserDialog("Add new layer", parent,
+ gtk.FILE_CHOOSER_ACTION_SELECT_FOLDER)
+ button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
+ HobAltButton.style_button(button)
+ button = dialog.add_button("Open", gtk.RESPONSE_YES)
+ HobButton.style_button(button)
+ label = gtk.Label("Select the layer you wish to add")
+ label.show()
+ dialog.set_extra_widget(label)
+ response = dialog.run()
+ path = dialog.get_filename()
+ dialog.destroy()
+
+ lbl = "<b>Error</b>"
+ msg = "Unable to load layer <i>%s</i> because " % path
+ if response == gtk.RESPONSE_YES:
+ import os
+ import os.path
+ layers = []
+ it = layer_store.get_iter_first()
+ while it:
+ layers.append(layer_store.get_value(it, 0))
+ it = layer_store.iter_next(it)
+
+ if not path:
+ msg += "it is an invalid path."
+ elif not os.path.exists(path+"/conf/layer.conf"):
+ msg += "there is no layer.conf inside the directory."
+ elif path in layers:
+ msg += "it is already in loaded layers."
+ else:
+ layer_store.append([path])
+ return
+ dialog = CrumbsMessageDialog(parent, lbl, gtk.MESSAGE_ERROR, msg)
+ dialog.add_button(gtk.STOCK_CLOSE, gtk.RESPONSE_OK)
+ response = dialog.run()
+ dialog.destroy()
+
+ def layer_widget_del_clicked_cb(self, action, tree_selection, layer_store):
+ model, iter = tree_selection.get_selected()
+ if iter:
+ layer_store.remove(iter)
+
+
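+ # Build the layer list widget: a TreeView backed by a ListStore of layer
+ # paths, with drag-and-drop reordering, a per-row remove button and an
+ # 'Add layer' button underneath.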
+ def gen_layer_widget(self, layers, layers_avail, window, tooltip=""):
+ hbox = gtk.HBox(False, 6)
+
+ layer_tv = gtk.TreeView()
+ layer_tv.set_rules_hint(True)
+ layer_tv.set_headers_visible(False)
+ tree_selection = layer_tv.get_selection()
+ tree_selection.set_mode(gtk.SELECTION_SINGLE)
+
+ # Enable drag and drop of rows, including row moves
+ dnd_internal_target = ''
+ dnd_targets = [(dnd_internal_target, gtk.TARGET_SAME_WIDGET, 0)]
+ layer_tv.enable_model_drag_source( gtk.gdk.BUTTON1_MASK,
+ dnd_targets,
+ gtk.gdk.ACTION_MOVE)
+ layer_tv.enable_model_drag_dest(dnd_targets,
+ gtk.gdk.ACTION_MOVE)
+ layer_tv.connect("drag_data_get", self.drag_data_get_cb)
+ layer_tv.connect("drag_data_received", self.drag_data_received_cb)
+
+ col0= gtk.TreeViewColumn('Path')
+ cell0 = gtk.CellRendererText()
+ cell0.set_padding(5,2)
+ col0.pack_start(cell0, True)
+ col0.set_cell_data_func(cell0, self.draw_layer_path_cb)
+ layer_tv.append_column(col0)
+
+ scroll = gtk.ScrolledWindow()
+ scroll.set_policy(gtk.POLICY_NEVER, gtk.POLICY_AUTOMATIC)
+ scroll.set_shadow_type(gtk.SHADOW_IN)
+ scroll.add(layer_tv)
+
+ table_layer = gtk.Table(2, 10, False)
+ hbox.pack_start(table_layer, expand=True, fill=True)
+
+ table_layer.attach(scroll, 0, 10, 0, 1)
+
+ layer_store = gtk.ListStore(gobject.TYPE_STRING)
+ for layer in layers:
+ layer_store.append([layer])
+
+ col1 = gtk.TreeViewColumn('Enabled')
+ layer_tv.append_column(col1)
+
+ cell1 = CellRendererPixbufActivatable()
+ cell1.set_fixed_size(-1,35)
+ cell1.connect("clicked", self.del_cell_clicked_cb, layer_store)
+ col1.pack_start(cell1, True)
+ col1.set_cell_data_func(cell1, self.draw_delete_button_cb, layer_tv)
+
+ add_button = gtk.Button()
+ add_button.set_relief(gtk.RELIEF_NONE)
+ box = gtk.HBox(False, 6)
+ box.show()
+ add_button.add(box)
+ add_button.connect("enter-notify-event", self.add_hover_cb)
+ add_button.connect("leave-notify-event", self.add_leave_cb)
+ self.im = gtk.Image()
+ self.im.set_from_file(hic.ICON_INDI_ADD_FILE)
+ self.im.show()
+ box.pack_start(self.im, expand=False, fill=False, padding=6)
+ lbl = gtk.Label("Add layer")
+ lbl.set_alignment(0.0, 0.5)
+ lbl.show()
+ box.pack_start(lbl, expand=True, fill=True, padding=6)
+ add_button.connect("clicked", self.layer_widget_add_clicked_cb, layer_store, window)
+ table_layer.attach(add_button, 0, 10, 1, 2, gtk.EXPAND | gtk.FILL, 0, 0, 6)
+ layer_tv.set_model(layer_store)
+
+ hbox.show_all()
+
+ return hbox, layer_store
+
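+ # Drag-and-drop handlers: serialise the dragged row's layer path, then
+ # re-insert it before or after the drop target (or append it when dropped
+ # in empty space).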
+ def drag_data_get_cb(self, treeview, context, selection, target_id, etime):
+ treeselection = treeview.get_selection()
+ model, iter = treeselection.get_selected()
+ data = model.get_value(iter, 0)
+ selection.set(selection.target, 8, data)
+
+ def drag_data_received_cb(self, treeview, context, x, y, selection, info, etime):
+ model = treeview.get_model()
+ data = selection.data
+ drop_info = treeview.get_dest_row_at_pos(x, y)
+ if drop_info:
+ path, position = drop_info
+ iter = model.get_iter(path)
+ if (position == gtk.TREE_VIEW_DROP_BEFORE or position == gtk.TREE_VIEW_DROP_INTO_OR_BEFORE):
+ model.insert_before(iter, [data])
+ else:
+ model.insert_after(iter, [data])
+ else:
+ model.append([data])
+ if context.action == gtk.gdk.ACTION_MOVE:
+ context.finish(True, True, etime)
+ return
+
+ def add_hover_cb(self, button, event):
+ self.im.set_from_file(hic.ICON_INDI_ADD_HOVER_FILE)
+
+ def add_leave_cb(self, button, event):
+ self.im.set_from_file(hic.ICON_INDI_ADD_FILE)
+
+ def __init__(self, title, layers, layers_non_removable, all_layers, parent, flags, buttons=None):
+ super(LayerSelectionDialog, self).__init__(title, parent, flags, buttons)
+
+ # class members from other objects
+ self.layers = layers
+ self.layers_non_removable = layers_non_removable
+ self.all_layers = all_layers
+ self.layers_changed = False
+
+ # icon for remove button in TreeView
+ im = gtk.Image()
+ im.set_from_file(hic.ICON_INDI_REMOVE_FILE)
+ self.rem_icon = im.get_pixbuf()
+
+ # class members for internal use
+ self.layer_store = None
+
+ # create visual elements on the dialog
+ self.create_visual_elements()
+ self.connect("response", self.response_cb)
+
+ def create_visual_elements(self):
+ layer_widget, self.layer_store = self.gen_layer_widget(self.layers, self.all_layers, self, None)
+ layer_widget.set_size_request(450, 250)
+ self.vbox.pack_start(layer_widget, expand=True, fill=True)
+ self.show_all()
+
+ def response_cb(self, dialog, response_id):
+ model = self.layer_store
+ it = model.get_iter_first()
+ layers = []
+ while it:
+ layers.append(model.get_value(it, 0))
+ it = model.iter_next(it)
+
+ self.layers_changed = (self.layers != layers)
+ self.layers = layers
+
+ """
+    A custom cell_data_func to draw a delete 'button' in the TreeView for layers
+    other than the meta layer, whose removal is prevented so that the user can't
+    shoot themselves in the foot too badly.
+ """
+ def draw_delete_button_cb(self, col, cell, model, it, tv):
+ path = model.get_value(it, 0)
+ if path in self.layers_non_removable:
+ cell.set_sensitive(False)
+ cell.set_property('pixbuf', None)
+ cell.set_property('mode', gtk.CELL_RENDERER_MODE_INERT)
+ else:
+ cell.set_property('pixbuf', self.rem_icon)
+ cell.set_sensitive(True)
+ cell.set_property('mode', gtk.CELL_RENDERER_MODE_ACTIVATABLE)
+
+ return True
+
+ """
+    A custom cell_data_func to write an extra message into the layer path cell
+    for the meta layer, informing the user that it can't be removed for their
+    own safety.
+ """
+ def draw_layer_path_cb(self, col, cell, model, it):
+ path = model.get_value(it, 0)
+ if path in self.layers_non_removable:
+ cell.set_property('markup', "<b>It cannot be removed</b>\n%s" % path)
+ else:
+ cell.set_property('text', path)
+
+ def del_cell_clicked_cb(self, cell, path, model):
+ it = model.get_iter_from_string(path)
+ model.remove(it)
diff --git a/bitbake/lib/bb/ui/crumbs/hig/parsingwarningsdialog.py b/bitbake/lib/bb/ui/crumbs/hig/parsingwarningsdialog.py
new file mode 100644
index 0000000..33bac39
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hig/parsingwarningsdialog.py
@@ -0,0 +1,163 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011-2012 Intel Corporation
+#
+# Authored by Cristiana Voicu <cristiana.voicu@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+import gobject
+from bb.ui.crumbs.hobwidget import HobAltButton
+from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
+
+"""
+The following are convenience classes for implementing GNOME HIG compliant
+BitBake GUIs.
+In summary: spacing = 12px, border-width = 6px
+"""
+
+#
+# ParsingWarningsDialog
+#
+class ParsingWarningsDialog (CrumbsDialog):
+
+ def __init__(self, title, warnings, parent, flags, buttons=None):
+ super(ParsingWarningsDialog, self).__init__(title, parent, flags, buttons)
+
+ self.warnings = warnings
+ self.warning_on = 0
+ self.warn_nb = len(warnings)
+
+ # create visual elements on the dialog
+ self.create_visual_elements()
+
+ def cancel_button_cb(self, button):
+ self.destroy()
+
+ def previous_button_cb(self, button):
+ self.warning_on = self.warning_on - 1
+ self.refresh_components()
+
+ def next_button_cb(self, button):
+ self.warning_on = self.warning_on + 1
+ self.refresh_components()
+
+ def refresh_components(self):
+ lbl = self.warnings[self.warning_on]
+        # when the warning text has more than 400 chars, show it in a scrolled text view instead of a label
+ if 0<= len(lbl) < 400:
+ self.warning_label.set_size_request(320, 230)
+ self.warning_label.set_use_markup(True)
+ self.warning_label.set_line_wrap(True)
+ self.warning_label.set_markup(lbl)
+ self.warning_label.set_property("yalign", 0.00)
+ else:
+ self.textWindow.set_shadow_type(gtk.SHADOW_IN)
+ self.textWindow.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
+ self.msgView = gtk.TextView()
+ self.msgView.set_editable(False)
+ self.msgView.set_wrap_mode(gtk.WRAP_WORD)
+ self.msgView.set_cursor_visible(False)
+ self.msgView.set_size_request(320, 230)
+ self.buf = gtk.TextBuffer()
+ self.buf.set_text(lbl)
+ self.msgView.set_buffer(self.buf)
+ self.textWindow.add(self.msgView)
+ self.msgView.show()
+
+ if self.warning_on==0:
+ self.previous_button.set_sensitive(False)
+ else:
+ self.previous_button.set_sensitive(True)
+
+ if self.warning_on==self.warn_nb-1:
+ self.next_button.set_sensitive(False)
+ else:
+ self.next_button.set_sensitive(True)
+
+ if self.warn_nb>1:
+ self.heading = "Warning " + str(self.warning_on + 1) + " of " + str(self.warn_nb)
+ self.heading_label.set_markup('<span weight="bold">%s</span>' % self.heading)
+ else:
+ self.heading = "Warning"
+ self.heading_label.set_markup('<span weight="bold">%s</span>' % self.heading)
+
+ self.show_all()
+
+ if 0<= len(lbl) < 400:
+ self.textWindow.hide()
+ else:
+ self.warning_label.hide()
+
+ def create_visual_elements(self):
+ self.set_size_request(350, 350)
+ self.heading_label = gtk.Label()
+ self.heading_label.set_alignment(0, 0)
+ self.warning_label = gtk.Label()
+ self.warning_label.set_selectable(True)
+ self.warning_label.set_alignment(0, 0)
+ self.textWindow = gtk.ScrolledWindow()
+
+ table = gtk.Table(1, 10, False)
+
+ cancel_button = gtk.Button()
+ cancel_button.set_label("Close")
+ cancel_button.connect("clicked", self.cancel_button_cb)
+ cancel_button.set_size_request(110, 30)
+
+ self.previous_button = gtk.Button()
+ image1 = gtk.image_new_from_stock(gtk.STOCK_GO_BACK, gtk.ICON_SIZE_BUTTON)
+ image1.show()
+ box = gtk.HBox(False, 6)
+ box.show()
+ self.previous_button.add(box)
+ lbl = gtk.Label("Previous")
+ lbl.show()
+ box.pack_start(image1, expand=False, fill=False, padding=3)
+ box.pack_start(lbl, expand=True, fill=True, padding=3)
+ self.previous_button.connect("clicked", self.previous_button_cb)
+ self.previous_button.set_size_request(110, 30)
+
+ self.next_button = gtk.Button()
+ image2 = gtk.image_new_from_stock(gtk.STOCK_GO_FORWARD, gtk.ICON_SIZE_BUTTON)
+ image2.show()
+ box = gtk.HBox(False, 6)
+ box.show()
+ self.next_button.add(box)
+ lbl = gtk.Label("Next")
+ lbl.show()
+ box.pack_start(lbl, expand=True, fill=True, padding=3)
+ box.pack_start(image2, expand=False, fill=False, padding=3)
+ self.next_button.connect("clicked", self.next_button_cb)
+ self.next_button.set_size_request(110, 30)
+
+        # when there is more than one warning, we need "Previous" and "Next" buttons
+ if self.warn_nb>1:
+ self.vbox.pack_start(self.heading_label, expand=False, fill=False)
+ self.vbox.pack_start(self.warning_label, expand=False, fill=False)
+ self.vbox.pack_start(self.textWindow, expand=False, fill=False)
+ table.attach(cancel_button, 6, 7, 0, 1, xoptions=gtk.SHRINK)
+ table.attach(self.previous_button, 7, 8, 0, 1, xoptions=gtk.SHRINK)
+ table.attach(self.next_button, 8, 9, 0, 1, xoptions=gtk.SHRINK)
+ self.vbox.pack_end(table, expand=False, fill=False)
+ else:
+ self.vbox.pack_start(self.heading_label, expand=False, fill=False)
+ self.vbox.pack_start(self.warning_label, expand=False, fill=False)
+ self.vbox.pack_start(self.textWindow, expand=False, fill=False)
+ cancel_button = self.add_button("Close", gtk.RESPONSE_CANCEL)
+ HobAltButton.style_button(cancel_button)
+
+ self.refresh_components()
diff --git a/bitbake/lib/bb/ui/crumbs/hig/propertydialog.py b/bitbake/lib/bb/ui/crumbs/hig/propertydialog.py
new file mode 100644
index 0000000..09b9ce6
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hig/propertydialog.py
@@ -0,0 +1,437 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011-2013 Intel Corporation
+#
+# Authored by Andrei Dinu <andrei.adrianx.dinu@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import string
+import gtk
+import gobject
+import os
+import tempfile
+import glib
+from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
+from bb.ui.crumbs.hig.settingsuihelper import SettingsUIHelper
+from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
+from bb.ui.crumbs.hig.layerselectiondialog import LayerSelectionDialog
+
+"""
+The following are convenience classes for implementing GNOME HIG compliant
+BitBake GUIs.
+In summary: spacing = 12px, border-width = 6px
+"""
+
+class PropertyDialog(CrumbsDialog):
+
+ def __init__(self, title, parent, information, flags, buttons=None):
+
+ super(PropertyDialog, self).__init__(title, parent, flags, buttons)
+
+ self.properties = information
+
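+        # The length of 'information' decides what the dialog shows: 10 fields
+        # for a recipe, 5 for a package; anything else is treated as a plain
+        # "image_info*info" string.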
+ if len(self.properties) == 10:
+ self.create_recipe_visual_elements()
+ elif len(self.properties) == 5:
+ self.create_package_visual_elements()
+ else:
+ self.create_information_visual_elements()
+
+
+ def create_information_visual_elements(self):
+
+ HOB_ICON_BASE_DIR = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(__file__))), ("icons/"))
+ ICON_PACKAGES_DISPLAY_FILE = os.path.join(HOB_ICON_BASE_DIR, ('info/info_display.png'))
+
+ self.set_resizable(False)
+
+ self.table = gtk.Table(1,1,False)
+ self.table.set_row_spacings(0)
+ self.table.set_col_spacings(0)
+
+ self.image = gtk.Image()
+ self.image.set_from_file(ICON_PACKAGES_DISPLAY_FILE)
+ self.image.set_property("xalign",0)
+ #self.vbox.add(self.image)
+
+ image_info = self.properties.split("*")[0]
+ info = self.properties.split("*")[1]
+
+ vbox = gtk.VBox(True, spacing=30)
+
+ self.label_short = gtk.Label()
+ self.label_short.set_line_wrap(False)
+ self.label_short.set_markup(image_info)
+ self.label_short.set_property("xalign", 0)
+
+ self.info_label = gtk.Label()
+ self.info_label.set_line_wrap(True)
+ self.info_label.set_markup(info)
+ self.info_label.set_property("yalign", 0.5)
+
+ self.table.attach(self.image, 0,1,0,1, xoptions=gtk.FILL|gtk.EXPAND, yoptions=gtk.FILL,xpadding=5,ypadding=5)
+ self.table.attach(self.label_short, 0,1,0,1, xoptions=gtk.FILL|gtk.EXPAND, yoptions=gtk.FILL,xpadding=40,ypadding=5)
+ self.table.attach(self.info_label, 0,1,1,2, xoptions=gtk.FILL|gtk.EXPAND, yoptions=gtk.FILL,xpadding=40,ypadding=10)
+
+ self.vbox.add(self.table)
+ self.connect('delete-event', lambda w, e: self.destroy() or True)
+
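+    # motion-notify handler for the package files view: show the full path
+    # stored in self.tooltip_items as a tooltip for (possibly truncated) rows.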
+ def treeViewTooltip( self, widget, e, tooltips, cell, emptyText="" ):
+ try:
+ (path,col,x,y) = widget.get_path_at_pos( int(e.x), int(e.y) )
+ it = widget.get_model().get_iter(path)
+ value = widget.get_model().get_value(it,cell)
+ if value in self.tooltip_items:
+ tooltips.set_tip(widget, self.tooltip_items[value])
+ tooltips.enable()
+ else:
+ tooltips.set_tip(widget, emptyText)
+ except:
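+            # get_path_at_pos() returns None when the pointer is not over a row,
+            # so the unpacking above can raise; fall back to the empty tooltip.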
+ tooltips.set_tip(widget, emptyText)
+
+
+ def create_package_visual_elements(self):
+
+ import json
+
+ name = self.properties['name']
+ binb = self.properties['binb']
+ size = self.properties['size']
+ recipe = self.properties['recipe']
+ file_list = json.loads(self.properties['files_list'])
+
+ files_temp = ''
+ paths_temp = ''
+ files_binb = []
+ paths_binb = []
+
+ self.tooltip_items = {}
+
+ self.set_resizable(False)
+
+ #cleaning out the recipe variable
+ recipe = recipe.split("+")[0]
+
+ vbox = gtk.VBox(True,spacing = 0)
+
+ ###################################### NAME ROW + COL #################################
+
+ self.label_short = gtk.Label()
+ self.label_short.set_size_request(300,-1)
+ self.label_short.set_selectable(True)
+ self.label_short.set_line_wrap(True)
+ self.label_short.set_markup("<span weight=\"bold\">Name: </span>" + name)
+ self.label_short.set_property("xalign", 0)
+
+ self.vbox.add(self.label_short)
+
+ ###################################### SIZE ROW + COL ######################################
+
+ self.label_short = gtk.Label()
+ self.label_short.set_size_request(300,-1)
+ self.label_short.set_selectable(True)
+ self.label_short.set_line_wrap(True)
+ self.label_short.set_markup("<span weight=\"bold\">Size: </span>" + size)
+ self.label_short.set_property("xalign", 0)
+
+ self.vbox.add(self.label_short)
+
+ ##################################### RECIPE ROW + COL #########################################
+
+ self.label_short = gtk.Label()
+ self.label_short.set_size_request(300,-1)
+ self.label_short.set_selectable(True)
+ self.label_short.set_line_wrap(True)
+ self.label_short.set_markup("<span weight=\"bold\">Recipe: </span>" + recipe)
+ self.label_short.set_property("xalign", 0)
+
+ self.vbox.add(self.label_short)
+
+ ##################################### BINB ROW + COL #######################################
+
+ if binb != '':
+ self.label_short = gtk.Label()
+ self.label_short.set_selectable(True)
+ self.label_short.set_line_wrap(True)
+ self.label_short.set_markup("<span weight=\"bold\">Brought in by: </span>")
+ self.label_short.set_property("xalign", 0)
+
+ self.label_info = gtk.Label()
+ self.label_info.set_size_request(300,-1)
+ self.label_info.set_selectable(True)
+ self.label_info.set_line_wrap(True)
+ self.label_info.set_markup(binb)
+ self.label_info.set_property("xalign", 0)
+
+ self.vbox.add(self.label_short)
+ self.vbox.add(self.label_info)
+
+ #################################### FILES BROUGHT BY PACKAGES ###################################
+
+ if file_list:
+
+ self.textWindow = gtk.ScrolledWindow()
+ self.textWindow.set_shadow_type(gtk.SHADOW_IN)
+ self.textWindow.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
+ self.textWindow.set_size_request(100, 170)
+
+ packagefiles_store = gtk.ListStore(str)
+
+ self.packagefiles_tv = gtk.TreeView()
+ self.packagefiles_tv.set_rules_hint(True)
+ self.packagefiles_tv.set_headers_visible(True)
+ self.textWindow.add(self.packagefiles_tv)
+
+ self.cell1 = gtk.CellRendererText()
+ col1 = gtk.TreeViewColumn('Package files', self.cell1)
+ col1.set_cell_data_func(self.cell1, self.regex_field)
+ self.packagefiles_tv.append_column(col1)
+
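+            # Trim long paths down to 35 characters for display; the full path
+            # is kept in self.tooltip_items so treeViewTooltip() can show it on hover.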
+ items = file_list.keys()
+ items.sort()
+ for item in items:
+ fullpath = item
+ while len(item) > 35:
+ item = item[:len(item)/2] + "" + item[len(item)/2+1:]
+ if len(item) == 35:
+ item = item[:len(item)/2] + "..." + item[len(item)/2+3:]
+ self.tooltip_items[item] = fullpath
+
+ packagefiles_store.append([str(item)])
+
+ self.packagefiles_tv.set_model(packagefiles_store)
+
+ tips = gtk.Tooltips()
+ tips.set_tip(self.packagefiles_tv, "")
+ self.packagefiles_tv.connect("motion-notify-event", self.treeViewTooltip, tips, 0)
+ self.packagefiles_tv.set_events(gtk.gdk.POINTER_MOTION_MASK)
+
+ self.vbox.add(self.textWindow)
+
+ self.vbox.show_all()
+
+
+ def regex_field(self, column, cell, model, iter):
+ cell.set_property('text', model.get_value(iter, 0))
+ return
+
+
+ def create_recipe_visual_elements(self):
+
+ summary = self.properties['summary']
+ name = self.properties['name']
+ version = self.properties['version']
+ revision = self.properties['revision']
+ binb = self.properties['binb']
+ group = self.properties['group']
+ license = self.properties['license']
+ homepage = self.properties['homepage']
+ bugtracker = self.properties['bugtracker']
+ description = self.properties['description']
+
+ self.set_resizable(False)
+
+ #cleaning out the version variable and also the summary
+ version = version.split(":")[1]
+ if len(version) > 30:
+ version = version.split("+")[0]
+ else:
+ version = version.split("-")[0]
+ license = license.replace("&" , "and")
+ if (homepage == ''):
+ homepage = 'unknown'
+ if (bugtracker == ''):
+ bugtracker = 'unknown'
+ summary = summary.split("+")[0]
+
+        # split the "brought in by" field into its component items
+ binb_items_count = len(binb.split(','))
+ binb_items = binb.split(',')
+
+ vbox = gtk.VBox(False,spacing = 0)
+
+ ######################################## SUMMARY LABEL #########################################
+
+ if summary != '':
+ self.label_short = gtk.Label()
+ self.label_short.set_width_chars(37)
+ self.label_short.set_selectable(True)
+ self.label_short.set_line_wrap(True)
+ self.label_short.set_markup("<b>" + summary + "</b>")
+ self.label_short.set_property("xalign", 0)
+
+ self.vbox.add(self.label_short)
+
+ ########################################## NAME ROW + COL #######################################
+
+ self.label_short = gtk.Label()
+ self.label_short.set_selectable(True)
+ self.label_short.set_line_wrap(True)
+ self.label_short.set_markup("<span weight=\"bold\">Name: </span>" + name)
+ self.label_short.set_property("xalign", 0)
+
+ self.vbox.add(self.label_short)
+
+ ####################################### VERSION ROW + COL ####################################
+
+ self.label_short = gtk.Label()
+ self.label_short.set_selectable(True)
+ self.label_short.set_line_wrap(True)
+ self.label_short.set_markup("<span weight=\"bold\">Version: </span>" + version)
+ self.label_short.set_property("xalign", 0)
+
+ self.vbox.add(self.label_short)
+
+ ##################################### REVISION ROW + COL #####################################
+
+ self.label_short = gtk.Label()
+ self.label_short.set_line_wrap(True)
+ self.label_short.set_selectable(True)
+ self.label_short.set_markup("<span weight=\"bold\">Revision: </span>" + revision)
+ self.label_short.set_property("xalign", 0)
+
+ self.vbox.add(self.label_short)
+
+ ################################## GROUP ROW + COL ############################################
+
+ self.label_short = gtk.Label()
+ self.label_short.set_selectable(True)
+ self.label_short.set_line_wrap(True)
+ self.label_short.set_markup("<span weight=\"bold\">Group: </span>" + group)
+ self.label_short.set_property("xalign", 0)
+
+ self.vbox.add(self.label_short)
+
+ ################################# HOMEPAGE ROW + COL ############################################
+
+ if homepage != 'unknown':
+ self.label_info = gtk.Label()
+ self.label_info.set_selectable(True)
+ self.label_info.set_line_wrap(True)
+ if len(homepage) > 35:
+ self.label_info.set_markup("<a href=\"" + homepage + "\">" + homepage[0:35] + "..." + "</a>")
+ else:
+ self.label_info.set_markup("<a href=\"" + homepage + "\">" + homepage[0:60] + "</a>")
+
+ self.label_info.set_property("xalign", 0)
+
+ self.label_short = gtk.Label()
+ self.label_short.set_selectable(True)
+ self.label_short.set_line_wrap(True)
+ self.label_short.set_markup("<b>Homepage: </b>")
+ self.label_short.set_property("xalign", 0)
+
+ self.vbox.add(self.label_short)
+ self.vbox.add(self.label_info)
+
+ ################################# BUGTRACKER ROW + COL ###########################################
+
+ if bugtracker != 'unknown':
+ self.label_info = gtk.Label()
+ self.label_info.set_selectable(True)
+ self.label_info.set_line_wrap(True)
+ if len(bugtracker) > 35:
+ self.label_info.set_markup("<a href=\"" + bugtracker + "\">" + bugtracker[0:35] + "..." + "</a>")
+ else:
+ self.label_info.set_markup("<a href=\"" + bugtracker + "\">" + bugtracker[0:60] + "</a>")
+ self.label_info.set_property("xalign", 0)
+
+ self.label_short = gtk.Label()
+ self.label_short.set_selectable(True)
+ self.label_short.set_line_wrap(True)
+ self.label_short.set_markup("<b>Bugtracker: </b>")
+ self.label_short.set_property("xalign", 0)
+
+ self.vbox.add(self.label_short)
+ self.vbox.add(self.label_info)
+
+ ################################# LICENSE ROW + COL ############################################
+
+ self.label_info = gtk.Label()
+ self.label_info.set_selectable(True)
+ self.label_info.set_line_wrap(True)
+ self.label_info.set_markup(license)
+ self.label_info.set_property("xalign", 0)
+
+ self.label_short = gtk.Label()
+ self.label_short.set_selectable(True)
+ self.label_short.set_line_wrap(True)
+ self.label_short.set_markup("<span weight=\"bold\">License: </span>")
+ self.label_short.set_property("xalign", 0)
+
+ self.vbox.add(self.label_short)
+ self.vbox.add(self.label_info)
+
+ ################################### BINB ROW+COL #############################################
+
+ if binb != '':
+ self.label_short = gtk.Label()
+ self.label_short.set_selectable(True)
+ self.label_short.set_line_wrap(True)
+ self.label_short.set_markup("<span weight=\"bold\">Brought in by: </span>")
+ self.label_short.set_property("xalign", 0)
+ self.vbox.add(self.label_short)
+ self.label_info = gtk.Label()
+ self.label_info.set_selectable(True)
+ self.label_info.set_width_chars(36)
+ if len(binb) > 200:
+ scrolled_window = gtk.ScrolledWindow()
+ scrolled_window.set_policy(gtk.POLICY_NEVER,gtk.POLICY_ALWAYS)
+ scrolled_window.set_size_request(100,100)
+ self.label_info.set_markup(binb)
+ self.label_info.set_padding(6,6)
+ self.label_info.set_alignment(0,0)
+ self.label_info.set_line_wrap(True)
+ scrolled_window.add_with_viewport(self.label_info)
+ self.vbox.add(scrolled_window)
+ else:
+ self.label_info.set_markup(binb)
+ self.label_info.set_property("xalign", 0)
+ self.label_info.set_line_wrap(True)
+ self.vbox.add(self.label_info)
+
+ ################################ DESCRIPTION TAG ROW #################################################
+
+ self.label_short = gtk.Label()
+ self.label_short.set_line_wrap(True)
+ self.label_short.set_markup("<span weight=\"bold\">Description </span>")
+ self.label_short.set_property("xalign", 0)
+ self.vbox.add(self.label_short)
+
+ ################################ DESCRIPTION INFORMATION ROW ##########################################
+
+ hbox = gtk.HBox(True,spacing = 0)
+
+ self.label_short = gtk.Label()
+ self.label_short.set_selectable(True)
+ self.label_short.set_width_chars(36)
+ if len(description) > 200:
+ scrolled_window = gtk.ScrolledWindow()
+ scrolled_window.set_policy(gtk.POLICY_NEVER,gtk.POLICY_ALWAYS)
+ scrolled_window.set_size_request(100,100)
+ self.label_short.set_markup(description)
+ self.label_short.set_padding(6,6)
+ self.label_short.set_alignment(0,0)
+ self.label_short.set_line_wrap(True)
+ scrolled_window.add_with_viewport(self.label_short)
+ self.vbox.add(scrolled_window)
+ else:
+ self.label_short.set_markup(description)
+ self.label_short.set_property("xalign", 0)
+ self.label_short.set_line_wrap(True)
+ self.vbox.add(self.label_short)
+
+ self.vbox.show_all()
diff --git a/bitbake/lib/bb/ui/crumbs/hig/proxydetailsdialog.py b/bitbake/lib/bb/ui/crumbs/hig/proxydetailsdialog.py
new file mode 100644
index 0000000..69e7dff
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hig/proxydetailsdialog.py
@@ -0,0 +1,90 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011-2012 Intel Corporation
+#
+# Authored by Joshua Lock <josh@linux.intel.com>
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
+
+"""
+The following are convenience classes for implementing GNOME HIG compliant
+BitBake GUIs.
+In summary: spacing = 12px, border-width = 6px
+"""
+
+class ProxyDetailsDialog (CrumbsDialog):
+
+ def __init__(self, title, user, passwd, parent, flags, buttons=None):
+ super(ProxyDetailsDialog, self).__init__(title, parent, flags, buttons)
+ self.connect("response", self.response_cb)
+
+        self.auth = not (user is None or passwd is None or user == "")
+ self.user = user or ""
+ self.passwd = passwd or ""
+
+ # create visual elements on the dialog
+ self.create_visual_elements()
+
+ def create_visual_elements(self):
+ self.auth_checkbox = gtk.CheckButton("Use authentication")
+ self.auth_checkbox.set_tooltip_text("Check this box to set the username and the password")
+ self.auth_checkbox.set_active(self.auth)
+ self.auth_checkbox.connect("toggled", self.auth_checkbox_toggled_cb)
+ self.vbox.pack_start(self.auth_checkbox, expand=False, fill=False)
+
+ hbox = gtk.HBox(False, 6)
+ self.user_label = gtk.Label("Username:")
+ self.user_text = gtk.Entry()
+ self.user_text.set_text(self.user)
+ hbox.pack_start(self.user_label, expand=False, fill=False)
+ hbox.pack_end(self.user_text, expand=False, fill=False)
+ self.vbox.pack_start(hbox, expand=False, fill=False)
+
+ hbox = gtk.HBox(False, 6)
+ self.passwd_label = gtk.Label("Password:")
+ self.passwd_text = gtk.Entry()
+ self.passwd_text.set_text(self.passwd)
+ hbox.pack_start(self.passwd_label, expand=False, fill=False)
+ hbox.pack_end(self.passwd_text, expand=False, fill=False)
+ self.vbox.pack_start(hbox, expand=False, fill=False)
+
+ self.refresh_auth_components()
+ self.show_all()
+
+ def refresh_auth_components(self):
+ self.user_label.set_sensitive(self.auth)
+ self.user_text.set_editable(self.auth)
+ self.user_text.set_sensitive(self.auth)
+ self.passwd_label.set_sensitive(self.auth)
+ self.passwd_text.set_editable(self.auth)
+ self.passwd_text.set_sensitive(self.auth)
+
+ def auth_checkbox_toggled_cb(self, button):
+ self.auth = self.auth_checkbox.get_active()
+ self.refresh_auth_components()
+
+ def response_cb(self, dialog, response_id):
+ if response_id == gtk.RESPONSE_OK:
+ if self.auth:
+ self.user = self.user_text.get_text()
+ self.passwd = self.passwd_text.get_text()
+ else:
+ self.user = None
+ self.passwd = None
diff --git a/bitbake/lib/bb/ui/crumbs/hig/retrieveimagedialog.py b/bitbake/lib/bb/ui/crumbs/hig/retrieveimagedialog.py
new file mode 100644
index 0000000..9017139
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hig/retrieveimagedialog.py
@@ -0,0 +1,51 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2013 Intel Corporation
+#
+# Authored by Cristiana Voicu <cristiana.voicu@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+
+class RetrieveImageDialog (gtk.FileChooserDialog):
+ """
+    This class creates a dialog that allows the user to retrieve a custom
+    image previously saved from Hob.
+ """
+    def __init__(self, directory, title, parent, flags, buttons=None):
+ super(RetrieveImageDialog, self).__init__(title, None, gtk.FILE_CHOOSER_ACTION_OPEN,
+ (gtk.STOCK_CANCEL, gtk.RESPONSE_CANCEL,gtk.STOCK_OPEN, gtk.RESPONSE_OK))
+ self.directory = directory
+
+ # create visual elements on the dialog
+ self.create_visual_elements()
+
+ def create_visual_elements(self):
+ self.set_show_hidden(True)
+ self.set_default_response(gtk.RESPONSE_OK)
+ self.set_current_folder(self.directory)
+
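+        # Reach into the FileChooserDialog's internal widget tree (GTK+ 2
+        # layout) to replace the location widgets with a plain label for the
+        # predefined directory, and relabel the default button as "Select".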
+ vbox = self.get_children()[0].get_children()[0].get_children()[0]
+ for child in vbox.get_children()[0].get_children()[0].get_children()[0].get_children():
+ vbox.get_children()[0].get_children()[0].get_children()[0].remove(child)
+
+ label1 = gtk.Label()
+ label1.set_text("File system" + self.directory)
+ label1.show()
+ vbox.get_children()[0].get_children()[0].get_children()[0].pack_start(label1, expand=False, fill=False, padding=0)
+ vbox.get_children()[0].get_children()[1].get_children()[0].hide()
+
+ self.get_children()[0].get_children()[1].get_children()[0].set_label("Select")
diff --git a/bitbake/lib/bb/ui/crumbs/hig/saveimagedialog.py b/bitbake/lib/bb/ui/crumbs/hig/saveimagedialog.py
new file mode 100644
index 0000000..4195f70
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hig/saveimagedialog.py
@@ -0,0 +1,159 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2013 Intel Corporation
+#
+# Authored by Cristiana Voicu <cristiana.voicu@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+import glib
+from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
+from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
+from bb.ui.crumbs.hobwidget import HobButton
+
+class SaveImageDialog (CrumbsDialog):
+ """
+    This class creates a dialog that allows the user to save a custom image
+    to a predefined directory.
+ """
+ def __init__(self, directory, name, description, title, parent, flags, buttons=None):
+ super(SaveImageDialog, self).__init__(title, parent, flags, buttons)
+ self.directory = directory
+ self.builder = parent
+ self.name_field = name
+ self.description_field = description
+
+ # create visual elements on the dialog
+ self.create_visual_elements()
+
+ def create_visual_elements(self):
+ self.set_default_response(gtk.RESPONSE_OK)
+ self.vbox.set_border_width(6)
+
+ sub_vbox = gtk.VBox(False, 12)
+ self.vbox.pack_start(sub_vbox, expand=False, fill=False)
+ label = gtk.Label()
+ label.set_alignment(0, 0)
+ label.set_markup("<b>Name</b>")
+ sub_label = gtk.Label()
+ sub_label.set_alignment(0, 0)
+ content = "Image recipe names should be all lowercase and include only alphanumeric\n"
+ content += "characters. The only special character you can use is the ASCII hyphen (-)."
+ sub_label.set_markup(content)
+ self.name_entry = gtk.Entry()
+ self.name_entry.set_text(self.name_field)
+ self.name_entry.set_size_request(350,30)
+ self.name_entry.connect("changed", self.name_entry_changed)
+ sub_vbox.pack_start(label, expand=False, fill=False)
+ sub_vbox.pack_start(sub_label, expand=False, fill=False)
+ sub_vbox.pack_start(self.name_entry, expand=False, fill=False)
+
+ sub_vbox = gtk.VBox(False, 12)
+ self.vbox.pack_start(sub_vbox, expand=False, fill=False)
+ label = gtk.Label()
+ label.set_alignment(0, 0)
+ label.set_markup("<b>Description</b> (optional)")
+ sub_label = gtk.Label()
+ sub_label.set_alignment(0, 0)
+ sub_label.set_markup("The description should be less than 150 characters long.")
+ self.description_entry = gtk.TextView()
+ description_buffer = self.description_entry.get_buffer()
+ description_buffer.set_text(self.description_field)
+ description_buffer.connect("insert-text", self.limit_description_length)
+ self.description_entry.set_wrap_mode(gtk.WRAP_WORD)
+ self.description_entry.set_size_request(350,50)
+ sub_vbox.pack_start(label, expand=False, fill=False)
+ sub_vbox.pack_start(sub_label, expand=False, fill=False)
+ sub_vbox.pack_start(self.description_entry, expand=False, fill=False)
+
+ sub_vbox = gtk.VBox(False, 12)
+ self.vbox.pack_start(sub_vbox, expand=False, fill=False)
+ label = gtk.Label()
+ label.set_alignment(0, 0)
+ label.set_markup("Your image recipe will be saved to:")
+ sub_label = gtk.Label()
+ sub_label.set_alignment(0, 0)
+ sub_label.set_markup(self.directory)
+ sub_vbox.pack_start(label, expand=False, fill=False)
+ sub_vbox.pack_start(sub_label, expand=False, fill=False)
+
+ table = gtk.Table(1, 4, True)
+
+ cancel_button = gtk.Button()
+ cancel_button.set_label("Cancel")
+ cancel_button.connect("clicked", self.cancel_button_cb)
+ cancel_button.set_size_request(110, 30)
+
+ self.save_button = gtk.Button()
+ self.save_button.set_label("Save")
+ self.save_button.connect("clicked", self.save_button_cb)
+ self.save_button.set_size_request(110, 30)
+ if self.name_entry.get_text() == '':
+ self.save_button.set_sensitive(False)
+
+ table.attach(cancel_button, 2, 3, 0, 1)
+ table.attach(self.save_button, 3, 4, 0, 1)
+ self.vbox.pack_end(table, expand=False, fill=False)
+
+ self.show_all()
+
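+    # "insert-text" handler: block the insertion once the description would
+    # exceed 150 characters or when a newline is typed.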
+ def limit_description_length(self, textbuffer, iter, text, length):
+ buffer_bounds = textbuffer.get_bounds()
+ entire_text = textbuffer.get_text(*buffer_bounds)
+ entire_text += text
+ if len(entire_text)>150 or text=="\n":
+ textbuffer.emit_stop_by_name("insert-text")
+
+ def name_entry_changed(self, entry):
+ text = entry.get_text()
+ if text == '':
+ self.save_button.set_sensitive(False)
+ else:
+ self.save_button.set_sensitive(True)
+
+ def cancel_button_cb(self, button):
+ self.destroy()
+
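+    # Validate the name (lowercase alphanumeric plus hyphens) before saving
+    # the image recipe; otherwise show the invalid input error dialog.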
+ def save_button_cb(self, button):
+ text = self.name_entry.get_text()
+ new_text = text.replace("-","")
+ description_buffer = self.description_entry.get_buffer()
+ description = description_buffer.get_text(description_buffer.get_start_iter(),description_buffer.get_end_iter())
+ if new_text.islower() and new_text.isalnum():
+ self.builder.image_details_page.image_saved = True
+ self.builder.customized = False
+ self.builder.generate_new_image(self.directory+text, description)
+ self.builder.recipe_model.set_in_list(text, description)
+ self.builder.recipe_model.set_selected_image(text)
+ self.builder.image_details_page.show_page(self.builder.IMAGE_GENERATED)
+ self.builder.image_details_page.name_field_template = text
+ self.builder.image_details_page.description_field_template = description
+ self.destroy()
+ else:
+ self.show_invalid_input_error_dialog()
+
+ def show_invalid_input_error_dialog(self):
+ lbl = "<b>Invalid characters in image recipe name</b>"
+ msg = "Image recipe names should be all lowercase and\n"
+ msg += "include only alphanumeric characters. The only\n"
+ msg += "special character you can use is the ASCII hyphen (-)."
+ dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_ERROR, msg)
+ button = dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+
+ res = dialog.run()
+ self.name_entry.grab_focus()
+ dialog.destroy()
diff --git a/bitbake/lib/bb/ui/crumbs/hig/settingsuihelper.py b/bitbake/lib/bb/ui/crumbs/hig/settingsuihelper.py
new file mode 100644
index 0000000..e0285c9
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hig/settingsuihelper.py
@@ -0,0 +1,122 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011-2012 Intel Corporation
+#
+# Authored by Joshua Lock <josh@linux.intel.com>
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+import os
+from bb.ui.crumbs.hobwidget import HobInfoButton, HobButton, HobAltButton
+
+"""
+The following are convenience classes for implementing GNOME HIG compliant
+BitBake GUIs.
+In summary: spacing = 12px, border-width = 6px
+"""
+
+class SettingsUIHelper():
+
+ def gen_label_widget(self, content):
+ label = gtk.Label()
+ label.set_alignment(0, 0)
+ label.set_markup(content)
+ label.show()
+ return label
+
+ def gen_label_info_widget(self, content, tooltip):
+ table = gtk.Table(1, 10, False)
+ label = self.gen_label_widget(content)
+ info = HobInfoButton(tooltip, self)
+ table.attach(label, 0, 1, 0, 1, xoptions=gtk.FILL)
+ table.attach(info, 1, 2, 0, 1, xoptions=gtk.FILL, xpadding=10)
+ return table
+
+ def gen_spinner_widget(self, content, lower, upper, tooltip=""):
+ hbox = gtk.HBox(False, 12)
+ adjust = gtk.Adjustment(value=content, lower=lower, upper=upper, step_incr=1)
+ spinner = gtk.SpinButton(adjustment=adjust, climb_rate=1, digits=0)
+
+ spinner.set_value(content)
+ hbox.pack_start(spinner, expand=False, fill=False)
+
+ info = HobInfoButton(tooltip, self)
+ hbox.pack_start(info, expand=False, fill=False)
+
+ hbox.show_all()
+ return hbox, spinner
+
+ def gen_combo_widget(self, curr_item, all_item, tooltip=""):
+ hbox = gtk.HBox(False, 12)
+ combo = gtk.combo_box_new_text()
+ hbox.pack_start(combo, expand=False, fill=False)
+
+ index = 0
+ for item in all_item or []:
+ combo.append_text(item)
+ if item == curr_item:
+ combo.set_active(index)
+ index += 1
+
+ info = HobInfoButton(tooltip, self)
+ hbox.pack_start(info, expand=False, fill=False)
+
+ hbox.show_all()
+ return hbox, combo
+
+ def entry_widget_select_path_cb(self, action, parent, entry):
+ dialog = gtk.FileChooserDialog("", parent,
+ gtk.FILE_CHOOSER_ACTION_SELECT_FOLDER)
+ text = entry.get_text()
+ dialog.set_current_folder(text if len(text) > 0 else os.getcwd())
+ button = dialog.add_button("Cancel", gtk.RESPONSE_NO)
+ HobAltButton.style_button(button)
+ button = dialog.add_button("Open", gtk.RESPONSE_YES)
+ HobButton.style_button(button)
+ response = dialog.run()
+ if response == gtk.RESPONSE_YES:
+ path = dialog.get_filename()
+ entry.set_text(path)
+
+ dialog.destroy()
+
+ def gen_entry_widget(self, content, parent, tooltip="", need_button=True):
+ hbox = gtk.HBox(False, 12)
+ entry = gtk.Entry()
+ entry.set_text(content)
+ entry.set_size_request(350,30)
+
+ if need_button:
+ table = gtk.Table(1, 10, False)
+ hbox.pack_start(table, expand=True, fill=True)
+ table.attach(entry, 0, 9, 0, 1, xoptions=gtk.SHRINK)
+ image = gtk.Image()
+ image.set_from_stock(gtk.STOCK_OPEN,gtk.ICON_SIZE_BUTTON)
+ open_button = gtk.Button()
+ open_button.set_image(image)
+ open_button.connect("clicked", self.entry_widget_select_path_cb, parent, entry)
+ table.attach(open_button, 9, 10, 0, 1, xoptions=gtk.SHRINK)
+ else:
+ hbox.pack_start(entry, expand=True, fill=True)
+
+ if tooltip != "":
+ info = HobInfoButton(tooltip, self)
+ hbox.pack_start(info, expand=False, fill=False)
+
+ hbox.show_all()
+ return hbox, entry
diff --git a/bitbake/lib/bb/ui/crumbs/hig/simplesettingsdialog.py b/bitbake/lib/bb/ui/crumbs/hig/simplesettingsdialog.py
new file mode 100644
index 0000000..b5eb3d8
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hig/simplesettingsdialog.py
@@ -0,0 +1,891 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011-2012 Intel Corporation
+#
+# Authored by Joshua Lock <josh@linux.intel.com>
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+import gobject
+import hashlib
+from bb.ui.crumbs.hobwidget import hic, HobInfoButton, HobButton, HobAltButton
+from bb.ui.crumbs.progressbar import HobProgressBar
+from bb.ui.crumbs.hig.settingsuihelper import SettingsUIHelper
+from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
+from bb.ui.crumbs.hig.crumbsmessagedialog import CrumbsMessageDialog
+from bb.ui.crumbs.hig.proxydetailsdialog import ProxyDetailsDialog
+
+"""
+The following are convenience classes for implementing GNOME HIG compliant
+BitBake GUIs.
+In summary: spacing = 12px, border-width = 6px
+"""
+
+class SimpleSettingsDialog (CrumbsDialog, SettingsUIHelper):
+
+ (BUILD_ENV_PAGE_ID,
+ SHARED_STATE_PAGE_ID,
+ PROXIES_PAGE_ID,
+ OTHERS_PAGE_ID) = range(4)
+
+ (TEST_NETWORK_NONE,
+ TEST_NETWORK_INITIAL,
+ TEST_NETWORK_RUNNING,
+ TEST_NETWORK_PASSED,
+ TEST_NETWORK_FAILED,
+ TEST_NETWORK_CANCELED) = range(6)
+
+ TARGETS = [
+ ("MY_TREE_MODEL_ROW", gtk.TARGET_SAME_WIDGET, 0),
+ ("text/plain", 0, 1),
+ ("TEXT", 0, 2),
+ ("STRING", 0, 3),
+ ]
+
+ def __init__(self, title, configuration, all_image_types,
+ all_package_formats, all_distros, all_sdk_machines,
+ max_threads, parent, flags, handler, buttons=None):
+ super(SimpleSettingsDialog, self).__init__(title, parent, flags, buttons)
+
+ # class members from other objects
+ # bitbake settings from Builder.Configuration
+ self.configuration = configuration
+ self.image_types = all_image_types
+ self.all_package_formats = all_package_formats
+ self.all_distros = all_distros
+ self.all_sdk_machines = all_sdk_machines
+ self.max_threads = max_threads
+
+ # class members for internal use
+ self.dldir_text = None
+ self.sstatedir_text = None
+ self.sstatemirrors_list = []
+ self.sstatemirrors_changed = 0
+ self.bb_spinner = None
+ self.pmake_spinner = None
+ self.rootfs_size_spinner = None
+ self.extra_size_spinner = None
+ self.gplv3_checkbox = None
+ self.toolchain_checkbox = None
+ self.setting_store = None
+ self.image_types_checkbuttons = {}
+
+ self.md5 = self.config_md5()
+ self.proxy_md5 = self.config_proxy_md5()
+ self.settings_changed = False
+ self.proxy_settings_changed = False
+ self.handler = handler
+ self.proxy_test_ran = False
+ self.selected_mirror_row = 0
+ self.new_mirror = False
+
+ # create visual elements on the dialog
+ self.create_visual_elements()
+ self.connect("response", self.response_cb)
+
+ def _get_sorted_value(self, var):
+ return " ".join(sorted(str(var).split())) + "\n"
+
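+    # Hash the proxy settings so response_cb() can detect whether they changed.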
+ def config_proxy_md5(self):
+ data = ("ENABLE_PROXY: " + self._get_sorted_value(self.configuration.enable_proxy))
+ if self.configuration.enable_proxy:
+ for protocol in self.configuration.proxies.keys():
+ data += (protocol + ": " + self._get_sorted_value(self.configuration.combine_proxy(protocol)))
+ return hashlib.md5(data).hexdigest()
+
+ def config_md5(self):
+ data = ""
+ for key in self.configuration.extra_setting.keys():
+ data += (key + ": " + self._get_sorted_value(self.configuration.extra_setting[key]))
+ return hashlib.md5(data).hexdigest()
+
+ def gen_proxy_entry_widget(self, protocol, parent, need_button=True, line=0):
+ label = gtk.Label(protocol.upper() + " proxy")
+ self.proxy_table.attach(label, 0, 1, line, line+1, xpadding=24)
+
+ proxy_entry = gtk.Entry()
+ proxy_entry.set_size_request(300, -1)
+ self.proxy_table.attach(proxy_entry, 1, 2, line, line+1, ypadding=4)
+
+ self.proxy_table.attach(gtk.Label(":"), 2, 3, line, line+1, xpadding=12, ypadding=4)
+
+ port_entry = gtk.Entry()
+ port_entry.set_size_request(60, -1)
+ self.proxy_table.attach(port_entry, 3, 4, line, line+1, ypadding=4)
+
+ details_button = HobAltButton("Details")
+ details_button.connect("clicked", self.details_cb, parent, protocol)
+ self.proxy_table.attach(details_button, 4, 5, line, line+1, xpadding=4, yoptions=gtk.EXPAND)
+
+ return proxy_entry, port_entry, details_button
+
+ def refresh_proxy_components(self):
+ self.same_checkbox.set_sensitive(self.configuration.enable_proxy)
+
+ self.http_proxy.set_text(self.configuration.combine_host_only("http"))
+ self.http_proxy.set_editable(self.configuration.enable_proxy)
+ self.http_proxy.set_sensitive(self.configuration.enable_proxy)
+ self.http_proxy_port.set_text(self.configuration.combine_port_only("http"))
+ self.http_proxy_port.set_editable(self.configuration.enable_proxy)
+ self.http_proxy_port.set_sensitive(self.configuration.enable_proxy)
+ self.http_proxy_details.set_sensitive(self.configuration.enable_proxy)
+
+ self.https_proxy.set_text(self.configuration.combine_host_only("https"))
+ self.https_proxy.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.https_proxy.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.https_proxy_port.set_text(self.configuration.combine_port_only("https"))
+ self.https_proxy_port.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.https_proxy_port.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.https_proxy_details.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+
+ self.ftp_proxy.set_text(self.configuration.combine_host_only("ftp"))
+ self.ftp_proxy.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.ftp_proxy.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.ftp_proxy_port.set_text(self.configuration.combine_port_only("ftp"))
+ self.ftp_proxy_port.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.ftp_proxy_port.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.ftp_proxy_details.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+
+ self.socks_proxy.set_text(self.configuration.combine_host_only("socks"))
+ self.socks_proxy.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.socks_proxy.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.socks_proxy_port.set_text(self.configuration.combine_port_only("socks"))
+ self.socks_proxy_port.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.socks_proxy_port.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.socks_proxy_details.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+
+ self.cvs_proxy.set_text(self.configuration.combine_host_only("cvs"))
+ self.cvs_proxy.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.cvs_proxy.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.cvs_proxy_port.set_text(self.configuration.combine_port_only("cvs"))
+ self.cvs_proxy_port.set_editable(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.cvs_proxy_port.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+ self.cvs_proxy_details.set_sensitive(self.configuration.enable_proxy and (not self.configuration.same_proxy))
+
+ if self.configuration.same_proxy:
+ if self.http_proxy.get_text():
+ [w.set_text(self.http_proxy.get_text()) for w in self.same_proxy_addresses]
+ if self.http_proxy_port.get_text():
+ [w.set_text(self.http_proxy_port.get_text()) for w in self.same_proxy_ports]
+
+ def proxy_checkbox_toggled_cb(self, button):
+ self.configuration.enable_proxy = self.proxy_checkbox.get_active()
+ if not self.configuration.enable_proxy:
+ self.configuration.same_proxy = False
+ self.same_checkbox.set_active(self.configuration.same_proxy)
+ self.save_proxy_data()
+ self.refresh_proxy_components()
+
+ def same_checkbox_toggled_cb(self, button):
+ self.configuration.same_proxy = self.same_checkbox.get_active()
+ self.save_proxy_data()
+ self.refresh_proxy_components()
+
+ def save_proxy_data(self):
+ self.configuration.split_proxy("http", self.http_proxy.get_text() + ":" + self.http_proxy_port.get_text())
+ if self.configuration.same_proxy:
+ self.configuration.split_proxy("https", self.http_proxy.get_text() + ":" + self.http_proxy_port.get_text())
+ self.configuration.split_proxy("ftp", self.http_proxy.get_text() + ":" + self.http_proxy_port.get_text())
+ self.configuration.split_proxy("socks", self.http_proxy.get_text() + ":" + self.http_proxy_port.get_text())
+ self.configuration.split_proxy("cvs", self.http_proxy.get_text() + ":" + self.http_proxy_port.get_text())
+ else:
+ self.configuration.split_proxy("https", self.https_proxy.get_text() + ":" + self.https_proxy_port.get_text())
+ self.configuration.split_proxy("ftp", self.ftp_proxy.get_text() + ":" + self.ftp_proxy_port.get_text())
+ self.configuration.split_proxy("socks", self.socks_proxy.get_text() + ":" + self.socks_proxy_port.get_text())
+ self.configuration.split_proxy("cvs", self.cvs_proxy.get_text() + ":" + self.cvs_proxy_port.get_text())
+
+ def response_cb(self, dialog, response_id):
+ if response_id == gtk.RESPONSE_YES:
+ if self.proxy_checkbox.get_active():
+ # Check that all proxy entries have a corresponding port
+ for proxy, port in zip(self.all_proxy_addresses, self.all_proxy_ports):
+ if proxy.get_text() and not port.get_text():
+ lbl = "<b>Enter all port numbers</b>"
+ msg = "Proxy servers require a port number. Please make sure you have entered a port number for each proxy server."
+ dialog = CrumbsMessageDialog(self, lbl, gtk.MESSAGE_WARNING, msg)
+ button = dialog.add_button("Close", gtk.RESPONSE_OK)
+ HobButton.style_button(button)
+ response = dialog.run()
+ dialog.destroy()
+ self.emit_stop_by_name("response")
+ return
+
+ self.configuration.dldir = self.dldir_text.get_text()
+ self.configuration.sstatedir = self.sstatedir_text.get_text()
+ self.configuration.sstatemirror = ""
+ for mirror in self.sstatemirrors_list:
+ if mirror[1] != "" and mirror[2].startswith("file://"):
+ smirror = mirror[2] + " " + mirror[1] + " \\n "
+ self.configuration.sstatemirror += smirror
+ self.configuration.bbthread = self.bb_spinner.get_value_as_int()
+ self.configuration.pmake = self.pmake_spinner.get_value_as_int()
+ self.save_proxy_data()
+ self.configuration.extra_setting = {}
+ it = self.setting_store.get_iter_first()
+ while it:
+ key = self.setting_store.get_value(it, 0)
+ value = self.setting_store.get_value(it, 1)
+ self.configuration.extra_setting[key] = value
+ it = self.setting_store.iter_next(it)
+
+ md5 = self.config_md5()
+ self.settings_changed = (self.md5 != md5)
+ self.proxy_settings_changed = (self.proxy_md5 != self.config_proxy_md5())
+
+ def create_build_environment_page(self):
+ advanced_vbox = gtk.VBox(False, 6)
+ advanced_vbox.set_border_width(6)
+
+ advanced_vbox.pack_start(self.gen_label_widget('<span weight="bold">Parallel threads</span>'), expand=False, fill=False)
+ sub_vbox = gtk.VBox(False, 6)
+ advanced_vbox.pack_start(sub_vbox, expand=False, fill=False)
+ label = self.gen_label_widget("BitBake parallel threads")
+        tooltip = "Sets the number of threads BitBake can run simultaneously. See the <a href=\""
+ tooltip += "http://www.yoctoproject.org/docs/current/poky-ref-manual/"
+ tooltip += "poky-ref-manual.html#var-BB_NUMBER_THREADS\">Poky reference manual</a> for information"
+        bbthread_widget, self.bb_spinner = self.gen_spinner_widget(self.configuration.bbthread, 1, self.max_threads,"<b>BitBake parallel threads</b>" + "*" + tooltip)
+ sub_vbox.pack_start(label, expand=False, fill=False)
+ sub_vbox.pack_start(bbthread_widget, expand=False, fill=False)
+
+ sub_vbox = gtk.VBox(False, 6)
+ advanced_vbox.pack_start(sub_vbox, expand=False, fill=False)
+ label = self.gen_label_widget("Make parallel threads")
+ tooltip = "Sets the maximum number of threads the host can use during the build. See the <a href=\""
+ tooltip += "http://www.yoctoproject.org/docs/current/poky-ref-manual/"
+ tooltip += "poky-ref-manual.html#var-PARALLEL_MAKE\">Poky reference manual</a> for information"
+ pmake_widget, self.pmake_spinner = self.gen_spinner_widget(self.configuration.pmake, 1, self.max_threads,"<b>Make parallel threads</b>" + "*" + tooltip)
+ sub_vbox.pack_start(label, expand=False, fill=False)
+ sub_vbox.pack_start(pmake_widget, expand=False, fill=False)
+
+ advanced_vbox.pack_start(self.gen_label_widget('<span weight="bold">Downloaded source code</span>'), expand=False, fill=False)
+ sub_vbox = gtk.VBox(False, 6)
+ advanced_vbox.pack_start(sub_vbox, expand=False, fill=False)
+ label = self.gen_label_widget("Downloads directory")
+ tooltip = "Select a folder that caches the upstream project source code"
+ dldir_widget, self.dldir_text = self.gen_entry_widget(self.configuration.dldir, self,"<b>Downloaded source code</b>" + "*" + tooltip)
+ sub_vbox.pack_start(label, expand=False, fill=False)
+ sub_vbox.pack_start(dldir_widget, expand=False, fill=False)
+
+ return advanced_vbox
+
+ def create_shared_state_page(self):
+ advanced_vbox = gtk.VBox(False)
+ advanced_vbox.set_border_width(12)
+
+ sub_vbox = gtk.VBox(False)
+ advanced_vbox.pack_start(sub_vbox, expand=False, fill=False, padding=24)
+ content = "<span>Shared state directory</span>"
+ tooltip = "Select a folder that caches your prebuilt results"
+ label = self.gen_label_info_widget(content,"<b>Shared state directory</b>" + "*" + tooltip)
+ sstatedir_widget, self.sstatedir_text = self.gen_entry_widget(self.configuration.sstatedir, self)
+ sub_vbox.pack_start(label, expand=False, fill=False)
+ sub_vbox.pack_start(sstatedir_widget, expand=False, fill=False, padding=6)
+
+ content = "<span weight=\"bold\">Shared state mirrors</span>"
+ tooltip = "URLs pointing to pre-built mirrors that will speed your build. "
+ tooltip += "Select the \'Standard\' configuration if the structure of your "
+ tooltip += "mirror replicates the structure of your local shared state directory. "
+ tooltip += "For more information on shared state mirrors, check the <a href=\""
+ tooltip += "http://www.yoctoproject.org/docs/current/poky-ref-manual/"
+ tooltip += "poky-ref-manual.html#shared-state\">Yocto Project Reference Manual</a>."
+ table = self.gen_label_info_widget(content,"<b>Shared state mirrors</b>" + "*" + tooltip)
+ advanced_vbox.pack_start(table, expand=False, fill=False, padding=6)
+
+ sub_vbox = gtk.VBox(False)
+ advanced_vbox.pack_start(sub_vbox, gtk.TRUE, gtk.TRUE, 0)
+
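+        # On first use, split the stored mirror string ('<regex> <url>' entries
+        # separated by a literal '\n') into [configuration, url, regex] rows.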
+ if self.sstatemirrors_changed == 0:
+ self.sstatemirrors_changed = 1
+ sstatemirrors = self.configuration.sstatemirror
+ if sstatemirrors == "":
+ sm_list = ["Standard", "", "file://(.*)"]
+ self.sstatemirrors_list.append(sm_list)
+ else:
+ sstatemirrors = [x for x in sstatemirrors.split('\\n')]
+ for sstatemirror in sstatemirrors:
+ sstatemirror_fields = [x for x in sstatemirror.split(' ') if x.strip()]
+ if len(sstatemirror_fields) == 2:
+ if sstatemirror_fields[0] == "file://(.*)" or sstatemirror_fields[0] == "file://.*":
+ sm_list = ["Standard", sstatemirror_fields[1], sstatemirror_fields[0]]
+ else:
+ sm_list = ["Custom", sstatemirror_fields[1], sstatemirror_fields[0]]
+ self.sstatemirrors_list.append(sm_list)
+
+ sstatemirrors_widget, sstatemirrors_store = self.gen_shared_sstate_widget(self.sstatemirrors_list, self)
+ sub_vbox.pack_start(sstatemirrors_widget, expand=True, fill=True)
+
+ table = gtk.Table(1, 10, False)
+ table.set_col_spacings(6)
+ add_mirror_button = HobAltButton("Add mirror")
+ add_mirror_button.connect("clicked", self.add_mirror)
+ add_mirror_button.set_size_request(120,30)
+ table.attach(add_mirror_button, 1, 2, 0, 1, xoptions=gtk.SHRINK)
+
+ self.delete_button = HobAltButton("Delete mirror")
+ self.delete_button.connect("clicked", self.delete_cb)
+ self.delete_button.set_size_request(120, 30)
+ table.attach(self.delete_button, 3, 4, 0, 1, xoptions=gtk.SHRINK)
+
+ advanced_vbox.pack_start(table, expand=False, fill=False, padding=6)
+
+ return advanced_vbox
+
+ def gen_shared_sstate_widget(self, sstatemirrors_list, window):
+ hbox = gtk.HBox(False)
+
+ sstatemirrors_store = gtk.ListStore(str, str, str)
+ for sstatemirror in sstatemirrors_list:
+ sstatemirrors_store.append(sstatemirror)
+
+ self.sstatemirrors_tv = gtk.TreeView()
+ self.sstatemirrors_tv.set_rules_hint(True)
+ self.sstatemirrors_tv.set_headers_visible(True)
+ tree_selection = self.sstatemirrors_tv.get_selection()
+ tree_selection.set_mode(gtk.SELECTION_SINGLE)
+
+        # Enable drag and drop of rows, including row moves
+ self.sstatemirrors_tv.enable_model_drag_source( gtk.gdk.BUTTON1_MASK,
+ self.TARGETS,
+ gtk.gdk.ACTION_DEFAULT|
+ gtk.gdk.ACTION_MOVE)
+ self.sstatemirrors_tv.enable_model_drag_dest(self.TARGETS,
+ gtk.gdk.ACTION_DEFAULT)
+ self.sstatemirrors_tv.connect("drag_data_get", self.drag_data_get_cb)
+ self.sstatemirrors_tv.connect("drag_data_received", self.drag_data_received_cb)
+
+ self.scroll = gtk.ScrolledWindow()
+ self.scroll.set_policy(gtk.POLICY_NEVER, gtk.POLICY_AUTOMATIC)
+ self.scroll.set_shadow_type(gtk.SHADOW_IN)
+ self.scroll.connect('size-allocate', self.scroll_changed)
+ self.scroll.add(self.sstatemirrors_tv)
+
+        # List store for the "Configuration" combo cell renderer
+ m = gtk.ListStore(gobject.TYPE_STRING)
+ m.append(["Standard"])
+ m.append(["Custom"])
+
+ cell0 = gtk.CellRendererCombo()
+ cell0.set_property("model",m)
+ cell0.set_property("text-column", 0)
+ cell0.set_property("editable", True)
+ cell0.set_property("has-entry", False)
+ col0 = gtk.TreeViewColumn("Configuration")
+ col0.pack_start(cell0, False)
+ col0.add_attribute(cell0, "text", 0)
+ col0.set_cell_data_func(cell0, self.configuration_field)
+ self.sstatemirrors_tv.append_column(col0)
+
+ cell0.connect("edited", self.combo_changed, sstatemirrors_store)
+
+ self.cell1 = gtk.CellRendererText()
+ self.cell1.set_padding(5,2)
+ col1 = gtk.TreeViewColumn('Regex', self.cell1)
+ col1.set_cell_data_func(self.cell1, self.regex_field)
+ self.sstatemirrors_tv.append_column(col1)
+
+ self.cell1.connect("edited", self.regex_changed, sstatemirrors_store)
+
+ cell2 = gtk.CellRendererText()
+ cell2.set_padding(5,2)
+ cell2.set_property("editable", True)
+ col2 = gtk.TreeViewColumn('URL', cell2)
+ col2.set_cell_data_func(cell2, self.url_field)
+ self.sstatemirrors_tv.append_column(col2)
+
+ cell2.connect("edited", self.url_changed, sstatemirrors_store)
+
+ self.sstatemirrors_tv.set_model(sstatemirrors_store)
+ self.sstatemirrors_tv.set_cursor(self.selected_mirror_row)
+ hbox.pack_start(self.scroll, expand=True, fill=True)
+ hbox.show_all()
+
+ return hbox, sstatemirrors_store
+
+ def drag_data_get_cb(self, treeview, context, selection, target_id, etime):
+ treeselection = treeview.get_selection()
+ model, iter = treeselection.get_selected()
+ data = model.get_string_from_iter(iter)
+ selection.set(selection.target, 8, data)
+
+ def drag_data_received_cb(self, treeview, context, x, y, selection, info, etime):
+ model = treeview.get_model()
+ data = []
+ tree_iter = model.get_iter_from_string(selection.data)
+ data.append(model.get_value(tree_iter, 0))
+ data.append(model.get_value(tree_iter, 1))
+ data.append(model.get_value(tree_iter, 2))
+
+ drop_info = treeview.get_dest_row_at_pos(x, y)
+ if drop_info:
+ path, position = drop_info
+ iter = model.get_iter(path)
+ if (position == gtk.TREE_VIEW_DROP_BEFORE or position == gtk.TREE_VIEW_DROP_INTO_OR_BEFORE):
+ model.insert_before(iter, data)
+ else:
+ model.insert_after(iter, data)
+ else:
+ model.append(data)
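+        # When the drag is a move, finishing with delete=True removes the source
+        # row, so the operation behaves as a reorder rather than a copy.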
+ if context.action == gtk.gdk.ACTION_MOVE:
+ context.finish(True, True, etime)
+ return
+
+ def delete_cb(self, button):
+ selection = self.sstatemirrors_tv.get_selection()
+ tree_model, tree_iter = selection.get_selected()
+ index = int(tree_model.get_string_from_iter(tree_iter))
+ if index == 0:
+ self.selected_mirror_row = index
+ else:
+ self.selected_mirror_row = index - 1
+ self.sstatemirrors_list.pop(index)
+ self.refresh_shared_state_page()
+ if not self.sstatemirrors_list:
+ self.delete_button.set_sensitive(False)
+
+ def add_mirror(self, button):
+ self.new_mirror = True
+ tooltip = "Select the pre-built mirror that will speed your build"
+ index = len(self.sstatemirrors_list)
+ self.selected_mirror_row = index
+ sm_list = ["Standard", "", "file://(.*)"]
+ self.sstatemirrors_list.append(sm_list)
+ self.refresh_shared_state_page()
+
+ def scroll_changed(self, widget, event, data=None):
+ if self.new_mirror == True:
+ adj = widget.get_vadjustment()
+ adj.set_value(adj.upper - adj.page_size)
+ self.new_mirror = False
+
+ def combo_changed(self, widget, path, text, model):
+ model[path][0] = text
+ selection = self.sstatemirrors_tv.get_selection()
+ tree_model, tree_iter = selection.get_selected()
+ index = int(tree_model.get_string_from_iter(tree_iter))
+ self.sstatemirrors_list[index][0] = text
+
+ def regex_changed(self, cell, path, new_text, user_data):
+ user_data[path][2] = new_text
+ selection = self.sstatemirrors_tv.get_selection()
+ tree_model, tree_iter = selection.get_selected()
+ index = int(tree_model.get_string_from_iter(tree_iter))
+ self.sstatemirrors_list[index][2] = new_text
+ return
+
+ def url_changed(self, cell, path, new_text, user_data):
+ if new_text!="Enter the mirror URL" and new_text!="Match regex and replace it with this URL":
+ user_data[path][1] = new_text
+ selection = self.sstatemirrors_tv.get_selection()
+ tree_model, tree_iter = selection.get_selected()
+ index = int(tree_model.get_string_from_iter(tree_iter))
+ self.sstatemirrors_list[index][1] = new_text
+ return
+
+ def configuration_field(self, column, cell, model, iter):
+ cell.set_property('text', model.get_value(iter, 0))
+ if model.get_value(iter, 0) == "Standard":
+ self.cell1.set_property("sensitive", False)
+ self.cell1.set_property("editable", False)
+ else:
+ self.cell1.set_property("sensitive", True)
+ self.cell1.set_property("editable", True)
+ return
+
+ def regex_field(self, column, cell, model, iter):
+ cell.set_property('text', model.get_value(iter, 2))
+ return
+
+ def url_field(self, column, cell, model, iter):
+ text = model.get_value(iter, 1)
+ if text == "":
+ if model.get_value(iter, 0) == "Standard":
+ text = "Enter the mirror URL"
+ else:
+ text = "Match regex and replace it with this URL"
+ cell.set_property('text', text)
+ return
+
+ def refresh_shared_state_page(self):
+ page_num = self.nb.get_current_page()
+        self.nb.remove_page(page_num)
+        self.nb.insert_page(self.create_shared_state_page(), gtk.Label("Shared state"), page_num)
+ self.show_all()
+ self.nb.set_current_page(page_num)
+
+ def test_proxy_ended(self, passed):
+ self.proxy_test_running = False
+ self.set_test_proxy_state(self.TEST_NETWORK_PASSED if passed else self.TEST_NETWORK_FAILED)
+ self.set_sensitive(True)
+ self.refresh_proxy_components()
+
+ def timer_func(self):
+ self.test_proxy_progress.pulse()
+ return self.proxy_test_running
+
+ def test_network_button_cb(self, b):
+ self.set_test_proxy_state(self.TEST_NETWORK_RUNNING)
+ self.set_sensitive(False)
+ self.save_proxy_data()
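+        # Push the current proxy settings to the server before triggering the
+        # test so the result reflects what the user has entered.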
+ if self.configuration.enable_proxy == True:
+ self.handler.set_http_proxy(self.configuration.combine_proxy("http"))
+ self.handler.set_https_proxy(self.configuration.combine_proxy("https"))
+ self.handler.set_ftp_proxy(self.configuration.combine_proxy("ftp"))
+ self.handler.set_socks_proxy(self.configuration.combine_proxy("socks"))
+ self.handler.set_cvs_proxy(self.configuration.combine_host_only("cvs"), self.configuration.combine_port_only("cvs"))
+ elif self.configuration.enable_proxy == False:
+ self.handler.set_http_proxy("")
+ self.handler.set_https_proxy("")
+ self.handler.set_ftp_proxy("")
+ self.handler.set_socks_proxy("")
+ self.handler.set_cvs_proxy("", "")
+ self.proxy_test_ran = True
+ self.proxy_test_running = True
+ gobject.timeout_add(100, self.timer_func)
+ self.handler.trigger_network_test()
+
+ def test_proxy_focus_event(self, w, direction):
+ if self.test_proxy_state in [self.TEST_NETWORK_PASSED, self.TEST_NETWORK_FAILED]:
+ self.set_test_proxy_state(self.TEST_NETWORK_INITIAL)
+ return False
+
+ def http_proxy_changed(self, e):
+ if not self.configuration.same_proxy:
+ return
+ if e == self.http_proxy:
+ [w.set_text(self.http_proxy.get_text()) for w in self.same_proxy_addresses]
+ else:
+ [w.set_text(self.http_proxy_port.get_text()) for w in self.same_proxy_ports]
+
+ def proxy_address_focus_out_event(self, w, direction):
+ text = w.get_text()
+ if not text:
+ return False
+ if text.find("//") == -1:
+ w.set_text("http://" + text)
+ return False
+
+ def set_test_proxy_state(self, state):
+ if self.test_proxy_state == state:
+ return
+ [self.proxy_table.remove(w) for w in self.test_gui_elements]
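+        # All test widgets were detached above; attach only the ones that
+        # belong to the new state.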
+ if state == self.TEST_NETWORK_INITIAL:
+ self.proxy_table.attach(self.test_network_button, 1, 2, 5, 6)
+ self.test_network_button.show()
+ elif state == self.TEST_NETWORK_RUNNING:
+ self.test_proxy_progress.set_rcstyle("running")
+ self.test_proxy_progress.set_text("Testing network configuration")
+ self.proxy_table.attach(self.test_proxy_progress, 0, 5, 5, 6, xpadding=4)
+ self.test_proxy_progress.show()
+ else: # passed or failed
+ self.dummy_progress.update(1.0)
+ if state == self.TEST_NETWORK_PASSED:
+ self.dummy_progress.set_text("Your network is properly configured")
+ self.dummy_progress.set_rcstyle("running")
+ else:
+ self.dummy_progress.set_text("Network test failed")
+ self.dummy_progress.set_rcstyle("fail")
+ self.proxy_table.attach(self.dummy_progress, 0, 4, 5, 6)
+ self.proxy_table.attach(self.retest_network_button, 4, 5, 5, 6, xpadding=4)
+ self.dummy_progress.show()
+ self.retest_network_button.show()
+ self.test_proxy_state = state
+
+ def create_network_page(self):
+ advanced_vbox = gtk.VBox(False, 6)
+ advanced_vbox.set_border_width(6)
+ self.same_proxy_addresses = []
+ self.same_proxy_ports = []
+ self.all_proxy_ports = []
+ self.all_proxy_addresses = []
+
+ sub_vbox = gtk.VBox(False, 6)
+ advanced_vbox.pack_start(sub_vbox, expand=False, fill=False)
+ label = self.gen_label_widget("<span weight=\"bold\">Set the proxies used when fetching source code</span>")
+ tooltip = "Set the proxies used when fetching source code. A blank field uses a direct internet connection."
+ info = HobInfoButton("<span weight=\"bold\">Set the proxies used when fetching source code</span>" + "*" + tooltip, self)
+ hbox = gtk.HBox(False, 12)
+ hbox.pack_start(label, expand=True, fill=True)
+ hbox.pack_start(info, expand=False, fill=False)
+ sub_vbox.pack_start(hbox, expand=False, fill=False)
+
+ proxy_test_focus = []
+ self.direct_checkbox = gtk.RadioButton(None, "Direct network connection")
+ proxy_test_focus.append(self.direct_checkbox)
+ self.direct_checkbox.set_tooltip_text("Check this box to use a direct internet connection with no proxy")
+ self.direct_checkbox.set_active(not self.configuration.enable_proxy)
+ sub_vbox.pack_start(self.direct_checkbox, expand=False, fill=False)
+
+ self.proxy_checkbox = gtk.RadioButton(self.direct_checkbox, "Manual proxy configuration")
+ proxy_test_focus.append(self.proxy_checkbox)
+ self.proxy_checkbox.set_tooltip_text("Check this box to manually set up a specific proxy")
+ self.proxy_checkbox.set_active(self.configuration.enable_proxy)
+ sub_vbox.pack_start(self.proxy_checkbox, expand=False, fill=False)
+
+ self.same_checkbox = gtk.CheckButton("Use the HTTP proxy for all protocols")
+ proxy_test_focus.append(self.same_checkbox)
+ self.same_checkbox.set_tooltip_text("Check this box to use the HTTP proxy for all five proxies")
+ self.same_checkbox.set_active(self.configuration.same_proxy)
+ hbox = gtk.HBox(False, 12)
+ hbox.pack_start(self.same_checkbox, expand=False, fill=False, padding=24)
+ sub_vbox.pack_start(hbox, expand=False, fill=False)
+
+ self.proxy_table = gtk.Table(6, 5, False)
+ self.http_proxy, self.http_proxy_port, self.http_proxy_details = self.gen_proxy_entry_widget(
+ "http", self, True, 0)
+ proxy_test_focus +=[self.http_proxy, self.http_proxy_port]
+ self.http_proxy.connect("changed", self.http_proxy_changed)
+ self.http_proxy_port.connect("changed", self.http_proxy_changed)
+
+ self.https_proxy, self.https_proxy_port, self.https_proxy_details = self.gen_proxy_entry_widget(
+ "https", self, True, 1)
+ proxy_test_focus += [self.https_proxy, self.https_proxy_port]
+ self.same_proxy_addresses.append(self.https_proxy)
+ self.same_proxy_ports.append(self.https_proxy_port)
+
+ self.ftp_proxy, self.ftp_proxy_port, self.ftp_proxy_details = self.gen_proxy_entry_widget(
+ "ftp", self, True, 2)
+ proxy_test_focus += [self.ftp_proxy, self.ftp_proxy_port]
+ self.same_proxy_addresses.append(self.ftp_proxy)
+ self.same_proxy_ports.append(self.ftp_proxy_port)
+
+ self.socks_proxy, self.socks_proxy_port, self.socks_proxy_details = self.gen_proxy_entry_widget(
+ "socks", self, True, 3)
+ proxy_test_focus += [self.socks_proxy, self.socks_proxy_port]
+ self.same_proxy_addresses.append(self.socks_proxy)
+ self.same_proxy_ports.append(self.socks_proxy_port)
+
+ self.cvs_proxy, self.cvs_proxy_port, self.cvs_proxy_details = self.gen_proxy_entry_widget(
+ "cvs", self, True, 4)
+ proxy_test_focus += [self.cvs_proxy, self.cvs_proxy_port]
+ self.same_proxy_addresses.append(self.cvs_proxy)
+ self.same_proxy_ports.append(self.cvs_proxy_port)
+ self.all_proxy_ports = self.same_proxy_ports + [self.http_proxy_port]
+ self.all_proxy_addresses = self.same_proxy_addresses + [self.http_proxy]
+ sub_vbox.pack_start(self.proxy_table, expand=False, fill=False)
+ self.proxy_table.show_all()
+
+ # Create the graphical elements for the network test feature, but don't display them yet
+ self.test_network_button = HobAltButton("Test network configuration")
+ self.test_network_button.connect("clicked", self.test_network_button_cb)
+ self.test_proxy_progress = HobProgressBar()
+ self.dummy_progress = HobProgressBar()
+ self.retest_network_button = HobAltButton("Retest")
+ self.retest_network_button.connect("clicked", self.test_network_button_cb)
+ self.test_gui_elements = [self.test_network_button, self.test_proxy_progress, self.dummy_progress, self.retest_network_button]
+ # Initialize the network tester
+ self.test_proxy_state = self.TEST_NETWORK_NONE
+ self.set_test_proxy_state(self.TEST_NETWORK_INITIAL)
+ self.proxy_test_passed_id = self.handler.connect("network-passed", lambda h:self.test_proxy_ended(True))
+ self.proxy_test_failed_id = self.handler.connect("network-failed", lambda h:self.test_proxy_ended(False))
+ [w.connect("focus-in-event", self.test_proxy_focus_event) for w in proxy_test_focus]
+ [w.connect("focus-out-event", self.proxy_address_focus_out_event) for w in self.all_proxy_addresses]
+
+ self.direct_checkbox.connect("toggled", self.proxy_checkbox_toggled_cb)
+ self.proxy_checkbox.connect("toggled", self.proxy_checkbox_toggled_cb)
+ self.same_checkbox.connect("toggled", self.same_checkbox_toggled_cb)
+
+ self.refresh_proxy_components()
+ return advanced_vbox
+
+ def switch_to_page(self, page_id):
+ self.nb.set_current_page(page_id)
+
+ def details_cb(self, button, parent, protocol):
+ self.save_proxy_data()
+ dialog = ProxyDetailsDialog(title = protocol.upper() + " Proxy Details",
+ user = self.configuration.proxies[protocol][1],
+ passwd = self.configuration.proxies[protocol][2],
+ parent = parent,
+ flags = gtk.DIALOG_MODAL
+ | gtk.DIALOG_DESTROY_WITH_PARENT
+ | gtk.DIALOG_NO_SEPARATOR)
+ dialog.add_button(gtk.STOCK_CLOSE, gtk.RESPONSE_OK)
+ response = dialog.run()
+ if response == gtk.RESPONSE_OK:
+ self.configuration.proxies[protocol][1] = dialog.user
+ self.configuration.proxies[protocol][2] = dialog.passwd
+ self.refresh_proxy_components()
+ dialog.destroy()
+
+ def rootfs_combo_changed_cb(self, rootfs_combo, all_package_format, check_hbox):
+ combo_item = self.rootfs_combo.get_active_text()
+ for child in check_hbox.get_children():
+ if isinstance(child, gtk.CheckButton):
+ check_hbox.remove(child)
+ for format in all_package_format:
+ if format != combo_item:
+ check_button = gtk.CheckButton(format)
+ check_hbox.pack_start(check_button, expand=False, fill=False)
+ check_hbox.show_all()
+
+ def gen_pkgfmt_widget(self, curr_package_format, all_package_format, tooltip_combo="", tooltip_extra=""):
+ pkgfmt_hbox = gtk.HBox(False, 24)
+
+ rootfs_vbox = gtk.VBox(False, 6)
+ pkgfmt_hbox.pack_start(rootfs_vbox, expand=False, fill=False)
+
+ label = self.gen_label_widget("Root file system package format")
+ rootfs_vbox.pack_start(label, expand=False, fill=False)
+
+ rootfs_format = ""
+ if curr_package_format:
+ rootfs_format = curr_package_format.split()[0]
+
+ rootfs_format_widget, rootfs_combo = self.gen_combo_widget(rootfs_format, all_package_format, tooltip_combo)
+ rootfs_vbox.pack_start(rootfs_format_widget, expand=False, fill=False)
+
+ extra_vbox = gtk.VBox(False, 6)
+ pkgfmt_hbox.pack_start(extra_vbox, expand=False, fill=False)
+
+ label = self.gen_label_widget("Additional package formats")
+ extra_vbox.pack_start(label, expand=False, fill=False)
+
+ check_hbox = gtk.HBox(False, 12)
+ extra_vbox.pack_start(check_hbox, expand=False, fill=False)
+ for format in all_package_format:
+ if format != rootfs_format:
+ check_button = gtk.CheckButton(format)
+ is_active = (format in curr_package_format.split())
+ check_button.set_active(is_active)
+ check_hbox.pack_start(check_button, expand=False, fill=False)
+
+ info = HobInfoButton(tooltip_extra, self)
+ check_hbox.pack_end(info, expand=False, fill=False)
+
+ rootfs_combo.connect("changed", self.rootfs_combo_changed_cb, all_package_format, check_hbox)
+
+ pkgfmt_hbox.show_all()
+
+ return pkgfmt_hbox, rootfs_combo, check_hbox
+
+ def editable_settings_cell_edited(self, cell, path_string, new_text, model):
+ it = model.get_iter_from_string(path_string)
+ column = cell.get_data("column")
+ model.set(it, column, new_text)
+
+ def editable_settings_add_item_clicked(self, button, model):
+ new_item = ["##KEY##", "##VALUE##"]
+
+ iter = model.append()
+ model.set (iter,
+ 0, new_item[0],
+ 1, new_item[1],
+ )
+
+ def editable_settings_remove_item_clicked(self, button, treeview):
+ selection = treeview.get_selection()
+ model, iter = selection.get_selected()
+
+ if iter:
+ path = model.get_path(iter)[0]
+ model.remove(iter)
+
+ def gen_editable_settings(self, setting, tooltip=""):
+ setting_hbox = gtk.HBox(False, 12)
+
+ vbox = gtk.VBox(False, 12)
+ setting_hbox.pack_start(vbox, expand=True, fill=True)
+
+ setting_store = gtk.ListStore(gobject.TYPE_STRING, gobject.TYPE_STRING)
+ for key in setting.keys():
+ setting_store.set(setting_store.append(), 0, key, 1, setting[key])
+
+ setting_tree = gtk.TreeView(setting_store)
+ setting_tree.set_headers_visible(True)
+ setting_tree.set_size_request(300, 100)
+
+ col = gtk.TreeViewColumn('Key')
+ col.set_min_width(100)
+ col.set_max_width(150)
+ col.set_resizable(True)
+ col1 = gtk.TreeViewColumn('Value')
+ col1.set_min_width(100)
+ col1.set_max_width(150)
+ col1.set_resizable(True)
+ setting_tree.append_column(col)
+ setting_tree.append_column(col1)
+ cell = gtk.CellRendererText()
+ cell.set_property('width-chars', 10)
+ cell.set_property('editable', True)
+ cell.set_data("column", 0)
+ cell.connect("edited", self.editable_settings_cell_edited, setting_store)
+ cell1 = gtk.CellRendererText()
+ cell1.set_property('width-chars', 10)
+ cell1.set_property('editable', True)
+ cell1.set_data("column", 1)
+ cell1.connect("edited", self.editable_settings_cell_edited, setting_store)
+ col.pack_start(cell, True)
+ col1.pack_end(cell1, True)
+ col.set_attributes(cell, text=0)
+ col1.set_attributes(cell1, text=1)
+
+ scroll = gtk.ScrolledWindow()
+ scroll.set_shadow_type(gtk.SHADOW_IN)
+ scroll.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
+ scroll.add(setting_tree)
+ vbox.pack_start(scroll, expand=True, fill=True)
+
+        # Add / Remove buttons for the key/value list
+ hbox = gtk.HBox(True, 6)
+ vbox.pack_start(hbox, False, False)
+
+ button = gtk.Button(stock=gtk.STOCK_ADD)
+ button.connect("clicked", self.editable_settings_add_item_clicked, setting_store)
+ hbox.pack_start(button)
+
+ button = gtk.Button(stock=gtk.STOCK_REMOVE)
+ button.connect("clicked", self.editable_settings_remove_item_clicked, setting_tree)
+ hbox.pack_start(button)
+
+ info = HobInfoButton(tooltip, self)
+ setting_hbox.pack_start(info, expand=False, fill=False)
+
+ return setting_hbox, setting_store
+
+ def create_others_page(self):
+ advanced_vbox = gtk.VBox(False, 6)
+ advanced_vbox.set_border_width(6)
+
+ sub_vbox = gtk.VBox(False, 6)
+ advanced_vbox.pack_start(sub_vbox, expand=True, fill=True)
+ label = self.gen_label_widget("<span weight=\"bold\">Add your own variables:</span>")
+ tooltip = "These are key/value pairs for your extra settings. Click \'Add\' and then directly edit the key and the value"
+ setting_widget, self.setting_store = self.gen_editable_settings(self.configuration.extra_setting,"<b>Add your own variables</b>" + "*" + tooltip)
+ sub_vbox.pack_start(label, expand=False, fill=False)
+ sub_vbox.pack_start(setting_widget, expand=True, fill=True)
+
+ return advanced_vbox
+
+ def create_visual_elements(self):
+ self.nb = gtk.Notebook()
+ self.nb.set_show_tabs(True)
+ self.nb.append_page(self.create_build_environment_page(), gtk.Label("Build environment"))
+ self.nb.append_page(self.create_shared_state_page(), gtk.Label("Shared state"))
+ self.nb.append_page(self.create_network_page(), gtk.Label("Network"))
+ self.nb.append_page(self.create_others_page(), gtk.Label("Others"))
+ self.nb.set_current_page(0)
+ self.vbox.pack_start(self.nb, expand=True, fill=True)
+ self.vbox.pack_end(gtk.HSeparator(), expand=True, fill=True)
+
+ self.show_all()
+
+ def destroy(self):
+ self.handler.disconnect(self.proxy_test_passed_id)
+ self.handler.disconnect(self.proxy_test_failed_id)
+ super(SimpleSettingsDialog, self).destroy()
diff --git a/bitbake/lib/bb/ui/crumbs/hobcolor.py b/bitbake/lib/bb/ui/crumbs/hobcolor.py
new file mode 100644
index 0000000..3316542
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hobcolor.py
@@ -0,0 +1,38 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2012 Intel Corporation
+#
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+class HobColors:
+ WHITE = "#ffffff"
+ PALE_GREEN = "#aaffaa"
+ ORANGE = "#eb8e68"
+ PALE_RED = "#ffaaaa"
+ GRAY = "#aaaaaa"
+ LIGHT_GRAY = "#dddddd"
+ SLIGHT_DARK = "#5f5f5f"
+ DARK = "#3c3b37"
+ BLACK = "#000000"
+ PALE_BLUE = "#53b8ff"
+ DEEP_RED = "#aa3e3e"
+ KHAKI = "#fff68f"
+
+ OK = WHITE
+ RUNNING = PALE_GREEN
+ WARNING = ORANGE
+ ERROR = PALE_RED
diff --git a/bitbake/lib/bb/ui/crumbs/hobeventhandler.py b/bitbake/lib/bb/ui/crumbs/hobeventhandler.py
new file mode 100644
index 0000000..b71fb33
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hobeventhandler.py
@@ -0,0 +1,645 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011 Intel Corporation
+#
+# Authored by Joshua Lock <josh@linux.intel.com>
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gobject
+import logging
+import ast
+from bb.ui.crumbs.runningbuild import RunningBuild
+
+class HobHandler(gobject.GObject):
+
+ """
+ This object does BitBake event handling for the hob gui.
+ """
+ __gsignals__ = {
+ "package-formats-updated" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_PYOBJECT,)),
+ "config-updated" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_STRING, gobject.TYPE_PYOBJECT,)),
+ "command-succeeded" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_INT,)),
+ "command-failed" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_STRING,)),
+ "parsing-warning" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_STRING,)),
+ "sanity-failed" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_STRING, gobject.TYPE_INT)),
+ "generating-data" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ "data-generated" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ "parsing-started" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_PYOBJECT,)),
+ "parsing" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_PYOBJECT,)),
+ "parsing-completed" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_PYOBJECT,)),
+ "recipe-populated" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ "package-populated" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ "network-passed" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ "network-failed" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ }
+
+ (GENERATE_CONFIGURATION, GENERATE_RECIPES, GENERATE_PACKAGES, GENERATE_IMAGE, POPULATE_PACKAGEINFO, SANITY_CHECK, NETWORK_TEST) = range(7)
+ (SUB_PATH_LAYERS, SUB_FILES_DISTRO, SUB_FILES_MACH, SUB_FILES_SDKMACH, SUB_MATCH_CLASS, SUB_PARSE_CONFIG, SUB_SANITY_CHECK,
+ SUB_GNERATE_TGTS, SUB_GENERATE_PKGINFO, SUB_BUILD_RECIPES, SUB_BUILD_IMAGE, SUB_NETWORK_TEST) = range(12)
+
+ def __init__(self, server, recipe_model, package_model):
+ super(HobHandler, self).__init__()
+
+ self.build = RunningBuild(sequential=True)
+
+ self.recipe_model = recipe_model
+ self.package_model = package_model
+
+ self.commands_async = []
+ self.generating = False
+ self.current_phase = None
+ self.building = False
+ self.recipe_queue = []
+ self.package_queue = []
+
+ self.server = server
+ self.error_msg = ""
+ self.initcmd = None
+ self.parsing = False
+
+ def set_busy(self):
+ if not self.generating:
+ self.emit("generating-data")
+ self.generating = True
+
+ def clear_busy(self):
+ if self.generating:
+ self.emit("data-generated")
+ self.generating = False
+
+ def runCommand(self, commandline):
+ try:
+ result, error = self.server.runCommand(commandline)
+ if error:
+ raise Exception("Error running command '%s': %s" % (commandline, error))
+ return result
+ except Exception as e:
+ self.commands_async = []
+ self.clear_busy()
+ self.emit("command-failed", "Hob Exception - %s" % (str(e)))
+ return None
+
+ def run_next_command(self, initcmd=None):
+ if initcmd != None:
+ self.initcmd = initcmd
+
+ if self.commands_async:
+ self.set_busy()
+ next_command = self.commands_async.pop(0)
+ else:
+ self.clear_busy()
+ if self.initcmd != None:
+ self.emit("command-succeeded", self.initcmd)
+ return
+
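+        # Dispatch the next queued sub-command; completion events re-enter
+        # run_next_command() so the queue is drained one command at a time.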
+ if next_command == self.SUB_PATH_LAYERS:
+ self.runCommand(["findConfigFilePath", "bblayers.conf"])
+ elif next_command == self.SUB_FILES_DISTRO:
+ self.runCommand(["findConfigFiles", "DISTRO"])
+ elif next_command == self.SUB_FILES_MACH:
+ self.runCommand(["findConfigFiles", "MACHINE"])
+ elif next_command == self.SUB_FILES_SDKMACH:
+ self.runCommand(["findConfigFiles", "MACHINE-SDK"])
+ elif next_command == self.SUB_MATCH_CLASS:
+ self.runCommand(["findFilesMatchingInDir", "rootfs_", "classes"])
+ elif next_command == self.SUB_PARSE_CONFIG:
+ self.runCommand(["resetCooker"])
+ elif next_command == self.SUB_GNERATE_TGTS:
+ self.runCommand(["generateTargetsTree", "classes/image.bbclass", []])
+ elif next_command == self.SUB_GENERATE_PKGINFO:
+ self.runCommand(["triggerEvent", "bb.event.RequestPackageInfo()"])
+ elif next_command == self.SUB_SANITY_CHECK:
+ self.runCommand(["triggerEvent", "bb.event.SanityCheck()"])
+ elif next_command == self.SUB_NETWORK_TEST:
+ self.runCommand(["triggerEvent", "bb.event.NetworkTest()"])
+ elif next_command == self.SUB_BUILD_RECIPES:
+ self.clear_busy()
+ self.building = True
+ self.runCommand(["buildTargets", self.recipe_queue, self.default_task])
+ self.recipe_queue = []
+ elif next_command == self.SUB_BUILD_IMAGE:
+ self.clear_busy()
+ self.building = True
+ target = self.image
+
+ if self.base_image:
+ # Request the build of a custom image
+ self.generate_hob_base_image(target)
+ self.set_var_in_file("LINGUAS_INSTALL", "", "local.conf")
+ hobImage = self.runCommand(["matchFile", target + ".bb"])
+ if self.base_image != self.recipe_model.__custom_image__:
+ baseImage = self.runCommand(["matchFile", self.base_image + ".bb"])
+ version = self.runCommand(["generateNewImage", hobImage, baseImage, self.package_queue, True, ""])
+ target += version
+ self.recipe_model.set_custom_image_version(version)
+
+ targets = [target]
+ if self.toolchain_packages:
+ self.set_var_in_file("TOOLCHAIN_TARGET_TASK", " ".join(self.toolchain_packages), "local.conf")
+ targets.append(target + ":do_populate_sdk")
+
+ self.runCommand(["buildTargets", targets, self.default_task])
+
+ def display_error(self):
+ self.clear_busy()
+ self.emit("command-failed", self.error_msg)
+ self.error_msg = ""
+ if self.building:
+ self.building = False
+
+ def handle_event(self, event):
+ if not event:
+ return
+ if self.building:
+ self.current_phase = "building"
+ self.build.handle_event(event)
+
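+        # Translate BitBake server events into GObject signals for the GUI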
+ if isinstance(event, bb.event.PackageInfo):
+ self.package_model.populate(event._pkginfolist)
+ self.emit("package-populated")
+ self.run_next_command()
+
+ elif isinstance(event, bb.event.SanityCheckPassed):
+ reparse = self.runCommand(["getVariable", "BB_INVALIDCONF"]) or None
+ if reparse is True:
+ self.set_var_in_file("BB_INVALIDCONF", False, "local.conf")
+ self.runCommand(["setPrePostConfFiles", "conf/.hob.conf", ""])
+                self.commands_async.insert(0, self.SUB_PARSE_CONFIG)
+ self.run_next_command()
+
+ elif isinstance(event, bb.event.SanityCheckFailed):
+ self.emit("sanity-failed", event._msg, event._network_error)
+
+ elif isinstance(event, logging.LogRecord):
+ if not self.building:
+ if event.levelno >= logging.ERROR:
+ formatter = bb.msg.BBLogFormatter()
+ msg = formatter.format(event)
+ self.error_msg += msg + '\n'
+ elif event.levelno >= logging.WARNING and self.parsing == True:
+ formatter = bb.msg.BBLogFormatter()
+ msg = formatter.format(event)
+ warn_msg = msg + '\n'
+ self.emit("parsing-warning", warn_msg)
+
+ elif isinstance(event, bb.event.TargetsTreeGenerated):
+ self.current_phase = "data generation"
+ if event._model:
+ self.recipe_model.populate(event._model)
+ self.emit("recipe-populated")
+ elif isinstance(event, bb.event.ConfigFilesFound):
+ self.current_phase = "configuration lookup"
+ var = event._variable
+ values = event._values
+ values.sort()
+ self.emit("config-updated", var, values)
+ elif isinstance(event, bb.event.ConfigFilePathFound):
+ self.current_phase = "configuration lookup"
+ elif isinstance(event, bb.event.FilesMatchingFound):
+ self.current_phase = "configuration lookup"
+ # FIXME: hard coding, should at least be a variable shared between
+ # here and the caller
+ if event._pattern == "rootfs_":
+ formats = []
+ for match in event._matches:
+ classname, sep, cls = match.rpartition(".")
+ fs, sep, format = classname.rpartition("_")
+ formats.append(format)
+ formats.sort()
+ self.emit("package-formats-updated", formats)
+ elif isinstance(event, bb.command.CommandCompleted):
+ self.current_phase = None
+ self.run_next_command()
+ elif isinstance(event, bb.command.CommandFailed):
+ if event.error not in ("Forced shutdown", "Stopped build"):
+ self.error_msg += event.error
+ self.commands_async = []
+ self.display_error()
+ elif isinstance(event, (bb.event.ParseStarted,
+ bb.event.CacheLoadStarted,
+ bb.event.TreeDataPreparationStarted,
+ )):
+ message = {}
+ message["eventname"] = bb.event.getName(event)
+ message["current"] = 0
+ message["total"] = None
+ message["title"] = "Parsing recipes"
+ self.emit("parsing-started", message)
+ if isinstance(event, bb.event.ParseStarted):
+ self.parsing = True
+ elif isinstance(event, (bb.event.ParseProgress,
+ bb.event.CacheLoadProgress,
+ bb.event.TreeDataPreparationProgress)):
+ message = {}
+ message["eventname"] = bb.event.getName(event)
+ message["current"] = event.current
+ message["total"] = event.total
+ message["title"] = "Parsing recipes"
+ self.emit("parsing", message)
+ elif isinstance(event, (bb.event.ParseCompleted,
+ bb.event.CacheLoadCompleted,
+ bb.event.TreeDataPreparationCompleted)):
+ message = {}
+ message["eventname"] = bb.event.getName(event)
+ message["current"] = event.total
+ message["total"] = event.total
+ message["title"] = "Parsing recipes"
+ self.emit("parsing-completed", message)
+ if isinstance(event, bb.event.ParseCompleted):
+ self.parsing = False
+ elif isinstance(event, bb.event.NetworkTestFailed):
+ self.emit("network-failed")
+ self.run_next_command()
+ elif isinstance(event, bb.event.NetworkTestPassed):
+ self.emit("network-passed")
+ self.run_next_command()
+
+ if self.error_msg and not self.commands_async:
+ self.display_error()
+
+ return
+
+ def init_cooker(self):
+ self.runCommand(["createConfigFile", ".hob.conf"])
+
+ def set_extra_inherit(self, bbclass):
+ self.append_var_in_file("INHERIT", bbclass, ".hob.conf")
+
+ def set_bblayers(self, bblayers):
+ self.set_var_in_file("BBLAYERS", " ".join(bblayers), "bblayers.conf")
+
+ def set_machine(self, machine):
+ if machine:
+ self.early_assign_var_in_file("MACHINE", machine, "local.conf")
+
+ def set_sdk_machine(self, sdk_machine):
+ self.set_var_in_file("SDKMACHINE", sdk_machine, "local.conf")
+
+ def set_image_fstypes(self, image_fstypes):
+ self.set_var_in_file("IMAGE_FSTYPES", image_fstypes, "local.conf")
+
+ def set_distro(self, distro):
+ self.set_var_in_file("DISTRO", distro, "local.conf")
+
+ def set_package_format(self, format):
+ package_classes = ""
+ for pkgfmt in format.split():
+ package_classes += ("package_%s" % pkgfmt + " ")
+ self.set_var_in_file("PACKAGE_CLASSES", package_classes, "local.conf")
+
+ def set_bbthreads(self, threads):
+ self.set_var_in_file("BB_NUMBER_THREADS", threads, "local.conf")
+
+ def set_pmake(self, threads):
+ pmake = "-j %s" % threads
+ self.set_var_in_file("PARALLEL_MAKE", pmake, "local.conf")
+
+ def set_dl_dir(self, directory):
+ self.set_var_in_file("DL_DIR", directory, "local.conf")
+
+ def set_sstate_dir(self, directory):
+ self.set_var_in_file("SSTATE_DIR", directory, "local.conf")
+
+ def set_sstate_mirrors(self, url):
+ self.set_var_in_file("SSTATE_MIRRORS", url, "local.conf")
+
+ def set_extra_size(self, image_extra_size):
+ self.set_var_in_file("IMAGE_ROOTFS_EXTRA_SPACE", str(image_extra_size), "local.conf")
+
+ def set_rootfs_size(self, image_rootfs_size):
+ self.set_var_in_file("IMAGE_ROOTFS_SIZE", str(image_rootfs_size), "local.conf")
+
+ def set_incompatible_license(self, incompat_license):
+ self.set_var_in_file("INCOMPATIBLE_LICENSE", incompat_license, "local.conf")
+
+ def set_extra_setting(self, extra_setting):
+ self.set_var_in_file("EXTRA_SETTING", extra_setting, "local.conf")
+
+ def set_extra_config(self, extra_setting):
+ old_extra_setting = self.runCommand(["getVariable", "EXTRA_SETTING"]) or {}
+ old_extra_setting = str(old_extra_setting)
+
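+        # EXTRA_SETTING is stored as the string form of a dict, so parse it
+        # back with ast.literal_eval before comparing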
+ old_extra_setting = ast.literal_eval(old_extra_setting)
+ if not type(old_extra_setting) == dict:
+ old_extra_setting = {}
+
+ # settings not changed
+ if old_extra_setting == extra_setting:
+ return
+
+ # remove the old EXTRA SETTING variable
+ self.remove_var_from_file("EXTRA_SETTING")
+
+ # remove old settings from conf
+ for key in old_extra_setting.keys():
+ if key not in extra_setting:
+ self.remove_var_from_file(key)
+
+ # add new settings
+ for key, value in extra_setting.iteritems():
+ self.set_var_in_file(key, value, "local.conf")
+
+ if extra_setting:
+ self.set_var_in_file("EXTRA_SETTING", extra_setting, "local.conf")
+
+ def set_http_proxy(self, http_proxy):
+ self.set_var_in_file("http_proxy", http_proxy, "local.conf")
+
+ def set_https_proxy(self, https_proxy):
+ self.set_var_in_file("https_proxy", https_proxy, "local.conf")
+
+ def set_ftp_proxy(self, ftp_proxy):
+ self.set_var_in_file("ftp_proxy", ftp_proxy, "local.conf")
+
+ def set_socks_proxy(self, socks_proxy):
+ self.set_var_in_file("all_proxy", socks_proxy, "local.conf")
+
+ def set_cvs_proxy(self, host, port):
+ self.set_var_in_file("CVS_PROXY_HOST", host, "local.conf")
+ self.set_var_in_file("CVS_PROXY_PORT", port, "local.conf")
+
+ def request_package_info(self):
+ self.commands_async.append(self.SUB_GENERATE_PKGINFO)
+ self.run_next_command(self.POPULATE_PACKAGEINFO)
+
+ def trigger_sanity_check(self):
+ self.commands_async.append(self.SUB_SANITY_CHECK)
+ self.run_next_command(self.SANITY_CHECK)
+
+ def trigger_network_test(self):
+ self.commands_async.append(self.SUB_NETWORK_TEST)
+ self.run_next_command(self.NETWORK_TEST)
+
+ def generate_configuration(self):
+ self.runCommand(["setPrePostConfFiles", "conf/.hob.conf", ""])
+ self.commands_async.append(self.SUB_PARSE_CONFIG)
+ self.commands_async.append(self.SUB_PATH_LAYERS)
+ self.commands_async.append(self.SUB_FILES_DISTRO)
+ self.commands_async.append(self.SUB_FILES_MACH)
+ self.commands_async.append(self.SUB_FILES_SDKMACH)
+ self.commands_async.append(self.SUB_MATCH_CLASS)
+ self.run_next_command(self.GENERATE_CONFIGURATION)
+
+ def generate_recipes(self):
+ self.runCommand(["setPrePostConfFiles", "conf/.hob.conf", ""])
+ self.commands_async.append(self.SUB_PARSE_CONFIG)
+ self.commands_async.append(self.SUB_GNERATE_TGTS)
+ self.run_next_command(self.GENERATE_RECIPES)
+
+ def generate_packages(self, tgts, default_task="build"):
+ targets = []
+ targets.extend(tgts)
+ self.recipe_queue = targets
+ self.default_task = default_task
+ self.runCommand(["setPrePostConfFiles", "conf/.hob.conf", ""])
+ self.commands_async.append(self.SUB_PARSE_CONFIG)
+ self.commands_async.append(self.SUB_BUILD_RECIPES)
+ self.run_next_command(self.GENERATE_PACKAGES)
+
+ def generate_image(self, image, base_image, image_packages=None, toolchain_packages=None, default_task="build"):
+ self.image = image
+ self.base_image = base_image
+ if image_packages:
+ self.package_queue = image_packages
+ else:
+ self.package_queue = []
+ if toolchain_packages:
+ self.toolchain_packages = toolchain_packages
+ else:
+ self.toolchain_packages = []
+ self.default_task = default_task
+ self.runCommand(["setPrePostConfFiles", "conf/.hob.conf", ""])
+ self.commands_async.append(self.SUB_PARSE_CONFIG)
+ self.commands_async.append(self.SUB_BUILD_IMAGE)
+ self.run_next_command(self.GENERATE_IMAGE)
+
+ def generate_new_image(self, image, base_image, package_queue, description):
+ if base_image:
+ base_image = self.runCommand(["matchFile", self.base_image + ".bb"])
+ self.runCommand(["generateNewImage", image, base_image, package_queue, False, description])
+
+ def generate_hob_base_image(self, hob_image):
+ image_dir = self.get_topdir() + "/recipes/images/"
+ recipe_name = hob_image + ".bb"
+ self.ensure_dir(image_dir)
+ self.generate_new_image(image_dir + recipe_name, None, [], "")
+
+ def ensure_dir(self, directory):
+ self.runCommand(["ensureDir", directory])
+
+ def build_succeeded_async(self):
+ self.building = False
+
+ def build_failed_async(self):
+ self.initcmd = None
+ self.commands_async = []
+ self.building = False
+
+ def cancel_parse(self):
+ self.runCommand(["stateForceShutdown"])
+
+ def cancel_build(self, force=False):
+ if force:
+ # Force the cooker to stop as quickly as possible
+ self.runCommand(["stateForceShutdown"])
+ else:
+ # Wait for tasks to complete before shutting down, this helps
+ # leave the workdir in a usable state
+ self.runCommand(["stateShutdown"])
+
+ def reset_build(self):
+ self.build.reset()
+
+ def get_logfile(self):
+ return self.server.runCommand(["getVariable", "BB_CONSOLELOG"])[0]
+
+ def get_topdir(self):
+ return self.runCommand(["getVariable", "TOPDIR"]) or ""
+
+ def _remove_redundant(self, string):
+ ret = []
+ for i in string.split():
+ if i not in ret:
+ ret.append(i)
+ return " ".join(ret)
+
+ def set_var_in_file(self, var, val, default_file=None):
+ self.runCommand(["enableDataTracking"])
+ self.server.runCommand(["setVarFile", var, val, default_file, "set"])
+ self.runCommand(["disableDataTracking"])
+
+ def early_assign_var_in_file(self, var, val, default_file=None):
+ self.runCommand(["enableDataTracking"])
+ self.server.runCommand(["setVarFile", var, val, default_file, "earlyAssign"])
+ self.runCommand(["disableDataTracking"])
+
+ def remove_var_from_file(self, var):
+ self.server.runCommand(["removeVarFile", var])
+
+ def append_var_in_file(self, var, val, default_file=None):
+ self.server.runCommand(["setVarFile", var, val, default_file, "append"])
+
+ def append_to_bbfiles(self, val):
+ bbfiles = self.runCommand(["getVariable", "BBFILES", "False"]) or ""
+ bbfiles = bbfiles.split()
+ if val not in bbfiles:
+ self.append_var_in_file("BBFILES", val, "bblayers.conf")
+
+ def get_parameters(self):
+ # retrieve the parameters from bitbake
+ params = {}
+ params["core_base"] = self.runCommand(["getVariable", "COREBASE"]) or ""
+ params["layer"] = self.runCommand(["getVariable", "BBLAYERS"]) or ""
+ params["layers_non_removable"] = self.runCommand(["getVariable", "BBLAYERS_NON_REMOVABLE"]) or ""
+ params["dldir"] = self.runCommand(["getVariable", "DL_DIR"]) or ""
+ params["machine"] = self.runCommand(["getVariable", "MACHINE"]) or ""
+ params["distro"] = self.runCommand(["getVariable", "DISTRO"]) or "defaultsetup"
+ params["pclass"] = self.runCommand(["getVariable", "PACKAGE_CLASSES"]) or ""
+ params["sstatedir"] = self.runCommand(["getVariable", "SSTATE_DIR"]) or ""
+ params["sstatemirror"] = self.runCommand(["getVariable", "SSTATE_MIRRORS"]) or ""
+
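+        # Derive a default and an upper bound for build parallelism from the
+        # detected CPU count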
+ num_threads = self.runCommand(["getCpuCount"])
+ if not num_threads:
+ num_threads = 1
+ max_threads = 65536
+ else:
+ try:
+ num_threads = int(num_threads)
+ max_threads = 16 * num_threads
+ except:
+ num_threads = 1
+ max_threads = 65536
+ params["max_threads"] = max_threads
+
+ bbthread = self.runCommand(["getVariable", "BB_NUMBER_THREADS"])
+ if not bbthread:
+ bbthread = num_threads
+ else:
+ try:
+ bbthread = int(bbthread)
+ except:
+ bbthread = num_threads
+ params["bbthread"] = bbthread
+
+ pmake = self.runCommand(["getVariable", "PARALLEL_MAKE"])
+ if not pmake:
+ pmake = num_threads
+ elif isinstance(pmake, int):
+ pass
+ else:
+ try:
+ pmake = int(pmake.lstrip("-j "))
+ except:
+ pmake = num_threads
+ params["pmake"] = "-j %s" % pmake
+
+ params["image_addr"] = self.runCommand(["getVariable", "DEPLOY_DIR_IMAGE"]) or ""
+
+ image_extra_size = self.runCommand(["getVariable", "IMAGE_ROOTFS_EXTRA_SPACE"])
+ if not image_extra_size:
+ image_extra_size = 0
+ else:
+ try:
+ image_extra_size = int(image_extra_size)
+ except:
+ image_extra_size = 0
+ params["image_extra_size"] = image_extra_size
+
+ image_rootfs_size = self.runCommand(["getVariable", "IMAGE_ROOTFS_SIZE"])
+ if not image_rootfs_size:
+ image_rootfs_size = 0
+ else:
+ try:
+ image_rootfs_size = int(image_rootfs_size)
+ except:
+ image_rootfs_size = 0
+ params["image_rootfs_size"] = image_rootfs_size
+
+ image_overhead_factor = self.runCommand(["getVariable", "IMAGE_OVERHEAD_FACTOR"])
+ if not image_overhead_factor:
+ image_overhead_factor = 1
+ else:
+ try:
+ image_overhead_factor = float(image_overhead_factor)
+ except:
+ image_overhead_factor = 1
+ params['image_overhead_factor'] = image_overhead_factor
+
+ params["incompat_license"] = self._remove_redundant(self.runCommand(["getVariable", "INCOMPATIBLE_LICENSE"]) or "")
+ params["sdk_machine"] = self.runCommand(["getVariable", "SDKMACHINE"]) or self.runCommand(["getVariable", "SDK_ARCH"]) or ""
+
+ params["image_fstypes"] = self._remove_redundant(self.runCommand(["getVariable", "IMAGE_FSTYPES"]) or "")
+
+ params["image_types"] = self._remove_redundant(self.runCommand(["getVariable", "IMAGE_TYPES"]) or "")
+
+ params["conf_version"] = self.runCommand(["getVariable", "CONF_VERSION"]) or ""
+ params["lconf_version"] = self.runCommand(["getVariable", "LCONF_VERSION"]) or ""
+
+ params["runnable_image_types"] = self._remove_redundant(self.runCommand(["getVariable", "RUNNABLE_IMAGE_TYPES"]) or "")
+ params["runnable_machine_patterns"] = self._remove_redundant(self.runCommand(["getVariable", "RUNNABLE_MACHINE_PATTERNS"]) or "")
+ params["deployable_image_types"] = self._remove_redundant(self.runCommand(["getVariable", "DEPLOYABLE_IMAGE_TYPES"]) or "")
+ params["kernel_image_type"] = self.runCommand(["getVariable", "KERNEL_IMAGETYPE"]) or ""
+ params["tmpdir"] = self.runCommand(["getVariable", "TMPDIR"]) or ""
+ params["distro_version"] = self.runCommand(["getVariable", "DISTRO_VERSION"]) or ""
+ params["target_os"] = self.runCommand(["getVariable", "TARGET_OS"]) or ""
+ params["target_arch"] = self.runCommand(["getVariable", "TARGET_ARCH"]) or ""
+ params["tune_pkgarch"] = self.runCommand(["getVariable", "TUNE_PKGARCH"]) or ""
+ params["bb_version"] = self.runCommand(["getVariable", "BB_MIN_VERSION"]) or ""
+
+ params["default_task"] = self.runCommand(["getVariable", "BB_DEFAULT_TASK"]) or "build"
+
+ params["socks_proxy"] = self.runCommand(["getVariable", "all_proxy"]) or ""
+ params["http_proxy"] = self.runCommand(["getVariable", "http_proxy"]) or ""
+ params["ftp_proxy"] = self.runCommand(["getVariable", "ftp_proxy"]) or ""
+ params["https_proxy"] = self.runCommand(["getVariable", "https_proxy"]) or ""
+
+ params["cvs_proxy_host"] = self.runCommand(["getVariable", "CVS_PROXY_HOST"]) or ""
+ params["cvs_proxy_port"] = self.runCommand(["getVariable", "CVS_PROXY_PORT"]) or ""
+
+ params["image_white_pattern"] = self.runCommand(["getVariable", "BBUI_IMAGE_WHITE_PATTERN"]) or ""
+ params["image_black_pattern"] = self.runCommand(["getVariable", "BBUI_IMAGE_BLACK_PATTERN"]) or ""
+ return params
diff --git a/bitbake/lib/bb/ui/crumbs/hoblistmodel.py b/bitbake/lib/bb/ui/crumbs/hoblistmodel.py
new file mode 100644
index 0000000..50df156
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hoblistmodel.py
@@ -0,0 +1,903 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011 Intel Corporation
+#
+# Authored by Joshua Lock <josh@linux.intel.com>
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+import gobject
+from bb.ui.crumbs.hobpages import HobPage
+
+#
+# PackageListModel
+#
+class PackageListModel(gtk.ListStore):
+ """
+ This class defines an gtk.ListStore subclass which will convert the output
+ of the bb.event.TargetsTreeGenerated event into a gtk.ListStore whilst also
+ providing convenience functions to access gtk.TreeModel subclasses which
+ provide filtered views of the data.
+ """
+
+ (COL_NAME, COL_VER, COL_REV, COL_RNM, COL_SEC, COL_SUM, COL_RDEP, COL_RPROV, COL_SIZE, COL_RCP, COL_BINB, COL_INC, COL_FADE_INC, COL_FONT, COL_FLIST) = range(15)
+
+ __gsignals__ = {
+ "package-selection-changed" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ }
+
+ __toolchain_required_packages__ = ["packagegroup-core-standalone-sdk-target", "packagegroup-core-standalone-sdk-target-dbg"]
+
+ def __init__(self):
+ self.rprov_pkg = {}
+ gtk.ListStore.__init__ (self,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_BOOLEAN,
+ gobject.TYPE_BOOLEAN,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING)
+ self.sort_column_id, self.sort_order = PackageListModel.COL_NAME, gtk.SORT_ASCENDING
+
+ """
+ Find the model path for the item_name
+ Returns the path in the model or None
+ """
+ def find_path_for_item(self, item_name):
+ pkg = item_name
+ if item_name not in self.pn_path.keys():
+ if item_name not in self.rprov_pkg.keys():
+ return None
+ pkg = self.rprov_pkg[item_name]
+ if pkg not in self.pn_path.keys():
+ return None
+
+ return self.pn_path[pkg]
+
+ def find_item_for_path(self, item_path):
+ return self[item_path][self.COL_NAME]
+
+ """
+ Helper function to determine whether an item is an item specified by filter
+ """
+ def tree_model_filter(self, model, it, filter):
+ name = model.get_value(it, self.COL_NAME)
+
+ for key in filter.keys():
+ if key == self.COL_NAME:
+ if filter[key] != 'Search packages by name':
+ if name and filter[key] not in name:
+ return False
+ else:
+ if model.get_value(it, key) not in filter[key]:
+ return False
+ self.filtered_nb += 1
+ return True
+
+ """
+ Create, if required, and return a filtered gtk.TreeModelSort
+ containing only the items specified by filter
+ """
+ def tree_model(self, filter, excluded_items_ahead=False, included_items_ahead=False, search_data=None, initial=False):
+ model = self.filter_new()
+ self.filtered_nb = 0
+ model.set_visible_func(self.tree_model_filter, filter)
+
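+        # Wrap the filtered model in a TreeModelSort so views can re-sort
+        # without touching the underlying list store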
+ sort = gtk.TreeModelSort(model)
+ sort.connect ('sort-column-changed', self.sort_column_changed_cb)
+ if initial:
+ sort.set_sort_column_id(PackageListModel.COL_NAME, gtk.SORT_ASCENDING)
+ sort.set_default_sort_func(None)
+ elif excluded_items_ahead:
+ sort.set_default_sort_func(self.exclude_item_sort_func, search_data)
+ elif included_items_ahead:
+ sort.set_default_sort_func(self.include_item_sort_func, search_data)
+ else:
+ if search_data and search_data!='Search recipes by name' and search_data!='Search package groups by name':
+ sort.set_default_sort_func(self.sort_func, search_data)
+ else:
+ sort.set_sort_column_id(self.sort_column_id, self.sort_order)
+ sort.set_default_sort_func(None)
+
+ sort.set_sort_func(PackageListModel.COL_INC, self.sort_column, PackageListModel.COL_INC)
+ sort.set_sort_func(PackageListModel.COL_SIZE, self.sort_column, PackageListModel.COL_SIZE)
+ sort.set_sort_func(PackageListModel.COL_BINB, self.sort_binb_column)
+ sort.set_sort_func(PackageListModel.COL_RCP, self.sort_column, PackageListModel.COL_RCP)
+ return sort
+
+ def sort_column_changed_cb (self, data):
+ self.sort_column_id, self.sort_order = data.get_sort_column_id ()
+
+ def sort_column(self, model, row1, row2, col):
+ value1 = model.get_value(row1, col)
+ value2 = model.get_value(row2, col)
+ if col==PackageListModel.COL_SIZE:
+ value1 = HobPage._string_to_size(value1)
+ value2 = HobPage._string_to_size(value2)
+
+ cmp_res = cmp(value1, value2)
+ if cmp_res!=0:
+ if col==PackageListModel.COL_INC:
+ return -cmp_res
+ else:
+ return cmp_res
+ else:
+ name1 = model.get_value(row1, PackageListModel.COL_NAME)
+ name2 = model.get_value(row2, PackageListModel.COL_NAME)
+ return cmp(name1,name2)
+
+ def sort_binb_column(self, model, row1, row2):
+ value1 = model.get_value(row1, PackageListModel.COL_BINB)
+ value2 = model.get_value(row2, PackageListModel.COL_BINB)
+ value1_list = value1.split(', ')
+ value2_list = value2.split(', ')
+
+ value1 = value1_list[0]
+ value2 = value2_list[0]
+
+ cmp_res = cmp(value1, value2)
+ if cmp_res==0:
+ cmp_size = cmp(len(value1_list), len(value2_list))
+ if cmp_size==0:
+ name1 = model.get_value(row1, PackageListModel.COL_NAME)
+ name2 = model.get_value(row2, PackageListModel.COL_NAME)
+ return cmp(name1,name2)
+ else:
+ return cmp_size
+ else:
+ return cmp_res
+
+ def exclude_item_sort_func(self, model, iter1, iter2, user_data=None):
+ if user_data:
+ val1 = model.get_value(iter1, PackageListModel.COL_NAME)
+ val2 = model.get_value(iter2, PackageListModel.COL_NAME)
+ return self.cmp_vals(val1, val2, user_data)
+ else:
+ val1 = model.get_value(iter1, PackageListModel.COL_FADE_INC)
+ val2 = model.get_value(iter2, PackageListModel.COL_INC)
+ return ((val1 == True) and (val2 == False))
+
+ def include_item_sort_func(self, model, iter1, iter2, user_data=None):
+ if user_data:
+ val1 = model.get_value(iter1, PackageListModel.COL_NAME)
+ val2 = model.get_value(iter2, PackageListModel.COL_NAME)
+ return self.cmp_vals(val1, val2, user_data)
+ else:
+ val1 = model.get_value(iter1, PackageListModel.COL_INC)
+ val2 = model.get_value(iter2, PackageListModel.COL_INC)
+ return ((val1 == False) and (val2 == True))
+
+ def sort_func(self, model, iter1, iter2, user_data):
+ val1 = model.get_value(iter1, PackageListModel.COL_NAME)
+ val2 = model.get_value(iter2, PackageListModel.COL_NAME)
+ return self.cmp_vals(val1, val2, user_data)
+
+ def cmp_vals(self, val1, val2, user_data):
+ if val1 is None or val2 is None:
+ return 0
+ elif val1.startswith(user_data) and not val2.startswith(user_data):
+ return -1
+ elif not val1.startswith(user_data) and val2.startswith(user_data):
+ return 1
+ else:
+ return cmp(val1, val2)
+
+ def convert_vpath_to_path(self, view_model, view_path):
+ # view_model is the model sorted
+ # get the path of the model filtered
+ filtered_model_path = view_model.convert_path_to_child_path(view_path)
+ # get the model filtered
+ filtered_model = view_model.get_model()
+ # get the path of the original model
+ path = filtered_model.convert_path_to_child_path(filtered_model_path)
+ return path
+
+ def convert_path_to_vpath(self, view_model, path):
+ it = view_model.get_iter_first()
+ while it:
+ name = self.find_item_for_path(path)
+ view_name = view_model.get_value(it, PackageListModel.COL_NAME)
+ if view_name == name:
+ view_path = view_model.get_path(it)
+ return view_path
+ it = view_model.iter_next(it)
+ return None
+
+ """
+ The populate() function takes as input the data from a
+ bb.event.PackageInfo event and populates the package list.
+ """
+ def populate(self, pkginfolist):
+ # First clear the model, in case repopulating
+ self.clear()
+
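+        # Per-package values may appear either with a _<pkgname> override suffix
+        # or as the plain variable; prefer the suffixed form when present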
+ def getpkgvalue(pkgdict, key, pkgname, defaultval = None):
+ value = pkgdict.get('%s_%s' % (key, pkgname), None)
+ if not value:
+ value = pkgdict.get(key, defaultval)
+ return value
+
+ for pkginfo in pkginfolist:
+ pn = pkginfo['PN']
+ pv = pkginfo['PV']
+ pr = pkginfo['PR']
+ pkg = pkginfo['PKG']
+ pkgv = getpkgvalue(pkginfo, 'PKGV', pkg)
+ pkgr = getpkgvalue(pkginfo, 'PKGR', pkg)
+            # PKGSIZE is artificial; only the package-name-suffixed PKGSIZE_<pkg> is read here, defaulting to 0
+ pkgsize = int(pkginfo.get('PKGSIZE_%s' % pkg, "0"))
+ # PKG_%s is the renamed version
+ pkg_rename = pkginfo.get('PKG_%s' % pkg, "")
+ # The rest may be overridden or not
+ section = getpkgvalue(pkginfo, 'SECTION', pkg, "")
+ summary = getpkgvalue(pkginfo, 'SUMMARY', pkg, "")
+ rdep = getpkgvalue(pkginfo, 'RDEPENDS', pkg, "")
+ rrec = getpkgvalue(pkginfo, 'RRECOMMENDS', pkg, "")
+ rprov = getpkgvalue(pkginfo, 'RPROVIDES', pkg, "")
+ files_list = getpkgvalue(pkginfo, 'FILES_INFO', pkg, "")
+ for i in rprov.split():
+ self.rprov_pkg[i] = pkg
+
+ recipe = pn + '-' + pv + '-' + pr
+
+ allow_empty = getpkgvalue(pkginfo, 'ALLOW_EMPTY', pkg, "")
+
+ if pkgsize == 0 and not allow_empty:
+ continue
+
+ size = HobPage._size_to_string(pkgsize)
+ self.set(self.append(), self.COL_NAME, pkg, self.COL_VER, pkgv,
+ self.COL_REV, pkgr, self.COL_RNM, pkg_rename,
+ self.COL_SEC, section, self.COL_SUM, summary,
+ self.COL_RDEP, rdep + ' ' + rrec,
+ self.COL_RPROV, rprov, self.COL_SIZE, size,
+ self.COL_RCP, recipe, self.COL_BINB, "",
+ self.COL_INC, False, self.COL_FONT, '10', self.COL_FLIST, files_list)
+
+ self.pn_path = {}
+ it = self.get_iter_first()
+ while it:
+ pn = self.get_value(it, self.COL_NAME)
+ path = self.get_path(it)
+ self.pn_path[pn] = path
+ it = self.iter_next(it)
+
+ """
+ Update the model, send out the notification.
+ """
+ def selection_change_notification(self):
+ self.emit("package-selection-changed")
+
+ """
+ Check whether the item at item_path is included or not
+ """
+ def path_included(self, item_path):
+ return self[item_path][self.COL_INC]
+
+ """
+ Add this item, and any of its dependencies, to the image contents
+ """
+ def include_item(self, item_path, binb=""):
+ if self.path_included(item_path):
+ return
+
+ item_name = self[item_path][self.COL_NAME]
+ item_deps = self[item_path][self.COL_RDEP]
+
+ self[item_path][self.COL_INC] = True
+
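+        # COL_BINB ("brought in by") is a comma-separated list of the items
+        # that caused this one to be included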
+ item_bin = self[item_path][self.COL_BINB].split(', ')
+ if binb and not binb in item_bin:
+ item_bin.append(binb)
+ self[item_path][self.COL_BINB] = ', '.join(item_bin).lstrip(', ')
+
+ if item_deps:
+ # Ensure all of the items deps are included and, where appropriate,
+ # add this item to their COL_BINB
+ for dep in item_deps.split(" "):
+ if dep.startswith('('):
+ continue
+ # If the contents model doesn't already contain dep, add it
+ dep_path = self.find_path_for_item(dep)
+ if not dep_path:
+ continue
+ dep_included = self.path_included(dep_path)
+
+ if dep_included and not dep in item_bin:
+ # don't set the COL_BINB to this item if the target is an
+ # item in our own COL_BINB
+ dep_bin = self[dep_path][self.COL_BINB].split(', ')
+ if not item_name in dep_bin:
+ dep_bin.append(item_name)
+ self[dep_path][self.COL_BINB] = ', '.join(dep_bin).lstrip(', ')
+ elif not dep_included:
+ self.include_item(dep_path, binb=item_name)
+
+ def exclude_item(self, item_path):
+ if not self.path_included(item_path):
+ return
+
+ self[item_path][self.COL_INC] = False
+
+ item_name = self[item_path][self.COL_NAME]
+ item_deps = self[item_path][self.COL_RDEP]
+ if item_deps:
+ for dep in item_deps.split(" "):
+ if dep.startswith('('):
+ continue
+ dep_path = self.find_path_for_item(dep)
+ if not dep_path:
+ continue
+ dep_bin = self[dep_path][self.COL_BINB].split(', ')
+ if item_name in dep_bin:
+ dep_bin.remove(item_name)
+ self[dep_path][self.COL_BINB] = ', '.join(dep_bin).lstrip(', ')
+
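+        # Excluding this item also excludes the items that brought it in, since their
+        # dependency on it can no longer be satisfied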
+ item_bin = self[item_path][self.COL_BINB].split(', ')
+ if item_bin:
+ for binb in item_bin:
+ binb_path = self.find_path_for_item(binb)
+ if not binb_path:
+ continue
+ self.exclude_item(binb_path)
+
+ """
+    Reset the model by clearing the include flag and brought-in-by value of each entry
+ """
+ def reset(self):
+ it = self.get_iter_first()
+ while it:
+ self.set(it,
+ self.COL_INC, False,
+ self.COL_BINB, "")
+ it = self.iter_next(it)
+
+ self.selection_change_notification()
+
+ def get_selected_packages(self):
+ packagelist = []
+
+ it = self.get_iter_first()
+ while it:
+ if self.get_value(it, self.COL_INC):
+ name = self.get_value(it, self.COL_NAME)
+ packagelist.append(name)
+ it = self.iter_next(it)
+
+ return packagelist
+
+ def get_user_selected_packages(self):
+ packagelist = []
+
+ it = self.get_iter_first()
+ while it:
+ if self.get_value(it, self.COL_INC):
+ binb = self.get_value(it, self.COL_BINB)
+ if binb == "User Selected":
+ name = self.get_value(it, self.COL_NAME)
+ packagelist.append(name)
+ it = self.iter_next(it)
+
+ return packagelist
+
+ def get_selected_packages_toolchain(self):
+ packagelist = []
+
+ it = self.get_iter_first()
+ while it:
+ if self.get_value(it, self.COL_INC):
+ name = self.get_value(it, self.COL_NAME)
+ if name.endswith("-dev") or name.endswith("-dbg"):
+ packagelist.append(name)
+ it = self.iter_next(it)
+
+        return list(set(packagelist + self.__toolchain_required_packages__))
+
+ """
+    The package model may be incomplete, so when set_selected_packages() is called
+    some packages may not be marked as included.
+    Return the list of packages that could not be set.
+ """
+ def set_selected_packages(self, packagelist, user_selected=False):
+ left = []
+ binb = 'User Selected' if user_selected else ''
+ for pn in packagelist:
+ if pn in self.pn_path.keys():
+ path = self.pn_path[pn]
+ self.include_item(item_path=path, binb=binb)
+ else:
+ left.append(pn)
+
+ self.selection_change_notification()
+ return left
+
+ """
+    Return the total size of the selected packages, in bytes.
+ """
+ def get_packages_size(self):
+ packages_size = 0
+ it = self.get_iter_first()
+ while it:
+            if self.get_value(it, self.COL_INC):
+                str_size = self.get_value(it, self.COL_SIZE)
+                # only count entries that actually have a size string set
+                if str_size:
+                    packages_size += HobPage._string_to_size(str_size)
+
+ it = self.iter_next(it)
+ return packages_size
+
+ """
+ Resync the state of included items to a backup column before performing the fadeout visible effect
+ """
+ def resync_fadeout_column(self, model_first_iter=None):
+ it = model_first_iter
+ while it:
+ active = self.get_value(it, self.COL_INC)
+ self.set(it, self.COL_FADE_INC, active)
+ it = self.iter_next(it)
+
+#
+# RecipeListModel
+#
+class RecipeListModel(gtk.ListStore):
+ """
+ This class defines an gtk.ListStore subclass which will convert the output
+ of the bb.event.TargetsTreeGenerated event into a gtk.ListStore whilst also
+ providing convenience functions to access gtk.TreeModel subclasses which
+ provide filtered views of the data.
+ """
+ (COL_NAME, COL_DESC, COL_LIC, COL_GROUP, COL_DEPS, COL_BINB, COL_TYPE, COL_INC, COL_IMG, COL_INSTALL, COL_PN, COL_FADE_INC, COL_SUMMARY, COL_VERSION,
+ COL_REVISION, COL_HOMEPAGE, COL_BUGTRACKER, COL_FILE) = range(18)
+
+ __custom_image__ = "Start with an empty image recipe"
+
+ __gsignals__ = {
+ "recipe-selection-changed" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ }
+
+    """
+    Set up the gtk.ListStore with one column for each of the COL_* constants defined above
+    """
+ def __init__(self):
+ gtk.ListStore.__init__ (self,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_BOOLEAN,
+ gobject.TYPE_BOOLEAN,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_BOOLEAN,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING)
+ self.sort_column_id, self.sort_order = RecipeListModel.COL_NAME, gtk.SORT_ASCENDING
+
+ """
+ Find the model path for the item_name
+ Returns the path in the model or None
+ """
+ def find_path_for_item(self, item_name):
+ if self.non_target_name(item_name) or item_name not in self.pn_path.keys():
+ return None
+ else:
+ return self.pn_path[item_name]
+
+ def find_item_for_path(self, item_path):
+ return self[item_path][self.COL_NAME]
+
+ """
+    Helper method to determine whether name is a non-target pn (e.g. a -native recipe)
+ """
+ def non_target_name(self, name):
+ if name and ('-native' in name):
+ return True
+ return False
+
+ """
+    Helper function to determine whether an item matches the criteria specified by filter
+ """
+ def tree_model_filter(self, model, it, filter):
+ name = model.get_value(it, self.COL_NAME)
+ if self.non_target_name(name):
+ return False
+
+ for key in filter.keys():
+ if key == self.COL_NAME:
+ if filter[key] != 'Search recipes by name' and filter[key] != 'Search package groups by name':
+ if filter[key] not in name:
+ return False
+ else:
+ if model.get_value(it, key) not in filter[key]:
+ return False
+ self.filtered_nb += 1
+
+ return True
+
+ def exclude_item_sort_func(self, model, iter1, iter2, user_data=None):
+ if user_data:
+ val1 = model.get_value(iter1, RecipeListModel.COL_NAME)
+ val2 = model.get_value(iter2, RecipeListModel.COL_NAME)
+ return self.cmp_vals(val1, val2, user_data)
+ else:
+ val1 = model.get_value(iter1, RecipeListModel.COL_FADE_INC)
+ val2 = model.get_value(iter2, RecipeListModel.COL_INC)
+ return ((val1 == True) and (val2 == False))
+
+ def include_item_sort_func(self, model, iter1, iter2, user_data=None):
+ if user_data:
+ val1 = model.get_value(iter1, RecipeListModel.COL_NAME)
+ val2 = model.get_value(iter2, RecipeListModel.COL_NAME)
+ return self.cmp_vals(val1, val2, user_data)
+ else:
+ val1 = model.get_value(iter1, RecipeListModel.COL_INC)
+ val2 = model.get_value(iter2, RecipeListModel.COL_INC)
+ return ((val1 == False) and (val2 == True))
+
+ def sort_func(self, model, iter1, iter2, user_data):
+ val1 = model.get_value(iter1, RecipeListModel.COL_NAME)
+ val2 = model.get_value(iter2, RecipeListModel.COL_NAME)
+ return self.cmp_vals(val1, val2, user_data)
+
+ def cmp_vals(self, val1, val2, user_data):
+ if val1 is None or val2 is None:
+ return 0
+ elif val1.startswith(user_data) and not val2.startswith(user_data):
+ return -1
+ elif not val1.startswith(user_data) and val2.startswith(user_data):
+ return 1
+ else:
+ return cmp(val1, val2)
+
+ """
+ Create, if required, and return a filtered gtk.TreeModelSort
+ containing only the items specified by filter
+ """
+ def tree_model(self, filter, excluded_items_ahead=False, included_items_ahead=False, search_data=None, initial=False):
+ model = self.filter_new()
+ self.filtered_nb = 0
+ model.set_visible_func(self.tree_model_filter, filter)
+
+ sort = gtk.TreeModelSort(model)
+ sort.connect ('sort-column-changed', self.sort_column_changed_cb)
+ if initial:
+ sort.set_sort_column_id(RecipeListModel.COL_NAME, gtk.SORT_ASCENDING)
+ sort.set_default_sort_func(None)
+ elif excluded_items_ahead:
+ sort.set_default_sort_func(self.exclude_item_sort_func, search_data)
+ elif included_items_ahead:
+ sort.set_default_sort_func(self.include_item_sort_func, search_data)
+ else:
+ if search_data and search_data!='Search recipes by name' and search_data!='Search package groups by name':
+ sort.set_default_sort_func(self.sort_func, search_data)
+ else:
+ sort.set_sort_column_id(self.sort_column_id, self.sort_order)
+ sort.set_default_sort_func(None)
+
+ sort.set_sort_func(RecipeListModel.COL_INC, self.sort_column, RecipeListModel.COL_INC)
+ sort.set_sort_func(RecipeListModel.COL_GROUP, self.sort_column, RecipeListModel.COL_GROUP)
+ sort.set_sort_func(RecipeListModel.COL_BINB, self.sort_binb_column)
+ sort.set_sort_func(RecipeListModel.COL_LIC, self.sort_column, RecipeListModel.COL_LIC)
+ return sort
+
+ def sort_column_changed_cb (self, data):
+ self.sort_column_id, self.sort_order = data.get_sort_column_id ()
+
+ def sort_column(self, model, row1, row2, col):
+ value1 = model.get_value(row1, col)
+ value2 = model.get_value(row2, col)
+ cmp_res = cmp(value1, value2)
+ if cmp_res!=0:
+ if col==RecipeListModel.COL_INC:
+ return -cmp_res
+ else:
+ return cmp_res
+ else:
+ name1 = model.get_value(row1, RecipeListModel.COL_NAME)
+ name2 = model.get_value(row2, RecipeListModel.COL_NAME)
+ return cmp(name1,name2)
+
+ def sort_binb_column(self, model, row1, row2):
+ value1 = model.get_value(row1, RecipeListModel.COL_BINB)
+ value2 = model.get_value(row2, RecipeListModel.COL_BINB)
+ value1_list = value1.split(', ')
+ value2_list = value2.split(', ')
+
+ value1 = value1_list[0]
+ value2 = value2_list[0]
+
+ cmp_res = cmp(value1, value2)
+ if cmp_res==0:
+ cmp_size = cmp(len(value1_list), len(value2_list))
+ if cmp_size==0:
+ name1 = model.get_value(row1, RecipeListModel.COL_NAME)
+ name2 = model.get_value(row2, RecipeListModel.COL_NAME)
+ return cmp(name1,name2)
+ else:
+ return cmp_size
+ else:
+ return cmp_res
+
+ def convert_vpath_to_path(self, view_model, view_path):
+ filtered_model_path = view_model.convert_path_to_child_path(view_path)
+ filtered_model = view_model.get_model()
+
+ # get the path of the original model
+ path = filtered_model.convert_path_to_child_path(filtered_model_path)
+ return path
+
+ def convert_path_to_vpath(self, view_model, path):
+ it = view_model.get_iter_first()
+ while it:
+ name = self.find_item_for_path(path)
+ view_name = view_model.get_value(it, RecipeListModel.COL_NAME)
+ if view_name == name:
+ view_path = view_model.get_path(it)
+ return view_path
+ it = view_model.iter_next(it)
+ return None
+
+ """
+ The populate() function takes as input the data from a
+ bb.event.TargetsTreeGenerated event and populates the RecipeList.
+ """
+ def populate(self, event_model):
+ # First clear the model, in case repopulating
+ self.clear()
+
+ # dummy image for prompt
+ self.set_in_list(self.__custom_image__, "Use 'Edit image recipe' to customize recipes and packages " \
+ "to be included in your image ")
+
+ for item in event_model["pn"]:
+ name = item
+ desc = event_model["pn"][item]["description"]
+ lic = event_model["pn"][item]["license"]
+ group = event_model["pn"][item]["section"]
+ inherits = event_model["pn"][item]["inherits"]
+ summary = event_model["pn"][item]["summary"]
+ version = event_model["pn"][item]["version"]
+ revision = event_model["pn"][item]["prevision"]
+ homepage = event_model["pn"][item]["homepage"]
+ bugtracker = event_model["pn"][item]["bugtracker"]
+ filename = event_model["pn"][item]["filename"]
+ install = []
+
+ depends = event_model["depends"].get(item, []) + event_model["rdepends-pn"].get(item, [])
+
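+            # Classify the recipe from its inherited classes and its name:
+            # packagegroup, image, toolchain, dummy or plain recipe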
+ if ('packagegroup.bbclass' in " ".join(inherits)):
+ atype = 'packagegroup'
+ elif ('/image.bbclass' in " ".join(inherits)):
+ if "edited" not in name:
+ atype = 'image'
+ install = event_model["rdepends-pkg"].get(item, []) + event_model["rrecs-pkg"].get(item, [])
+ elif ('meta-' in name):
+ atype = 'toolchain'
+ elif (name == 'dummy-image' or name == 'dummy-toolchain'):
+ atype = 'dummy'
+ else:
+ atype = 'recipe'
+
+ self.set(self.append(), self.COL_NAME, item, self.COL_DESC, desc,
+ self.COL_LIC, lic, self.COL_GROUP, group,
+ self.COL_DEPS, " ".join(depends), self.COL_BINB, "",
+ self.COL_TYPE, atype, self.COL_INC, False,
+ self.COL_IMG, False, self.COL_INSTALL, " ".join(install), self.COL_PN, item,
+ self.COL_SUMMARY, summary, self.COL_VERSION, version, self.COL_REVISION, revision,
+ self.COL_HOMEPAGE, homepage, self.COL_BUGTRACKER, bugtracker,
+ self.COL_FILE, filename)
+
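+        # Build a recipe-name -> tree path map for quick lookups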
+ self.pn_path = {}
+ it = self.get_iter_first()
+ while it:
+ pn = self.get_value(it, self.COL_NAME)
+ path = self.get_path(it)
+ self.pn_path[pn] = path
+ it = self.iter_next(it)
+
+ def set_in_list(self, item, desc):
+ self.set(self.append(), self.COL_NAME, item,
+ self.COL_DESC, desc,
+ self.COL_LIC, "", self.COL_GROUP, "",
+ self.COL_DEPS, "", self.COL_BINB, "",
+ self.COL_TYPE, "image", self.COL_INC, False,
+ self.COL_IMG, False, self.COL_INSTALL, "", self.COL_PN, item,
+ self.COL_SUMMARY, "", self.COL_VERSION, "", self.COL_REVISION, "",
+ self.COL_HOMEPAGE, "", self.COL_BUGTRACKER, "")
+ self.pn_path = {}
+ it = self.get_iter_first()
+ while it:
+ pn = self.get_value(it, self.COL_NAME)
+ path = self.get_path(it)
+ self.pn_path[pn] = path
+ it = self.iter_next(it)
+
+ """
+ Update the model, send out the notification.
+ """
+ def selection_change_notification(self):
+ self.emit("recipe-selection-changed")
+
+ def path_included(self, item_path):
+ return self[item_path][self.COL_INC]
+
+ """
+ Add this item, and any of its dependencies, to the image contents
+ """
+ def include_item(self, item_path, binb="", image_contents=False):
+ if self.path_included(item_path):
+ return
+
+ item_name = self[item_path][self.COL_NAME]
+ item_deps = self[item_path][self.COL_DEPS]
+
+ self[item_path][self.COL_INC] = True
+
+ item_bin = self[item_path][self.COL_BINB].split(', ')
+ if binb and not binb in item_bin:
+ item_bin.append(binb)
+ self[item_path][self.COL_BINB] = ', '.join(item_bin).lstrip(', ')
+
+        # Items brought in by the base image are treated specially later,
+        # so tag them here
+ if image_contents:
+ self[item_path][self.COL_IMG] = True
+
+ if item_deps:
+ # Ensure all of the items deps are included and, where appropriate,
+ # add this item to their COL_BINB
+ for dep in item_deps.split(" "):
+ # If the contents model doesn't already contain dep, add it
+ dep_path = self.find_path_for_item(dep)
+ if not dep_path:
+ continue
+ dep_included = self.path_included(dep_path)
+
+ if dep_included and not dep in item_bin:
+ # don't set the COL_BINB to this item if the target is an
+ # item in our own COL_BINB
+ dep_bin = self[dep_path][self.COL_BINB].split(', ')
+ if not item_name in dep_bin:
+ dep_bin.append(item_name)
+ self[dep_path][self.COL_BINB] = ', '.join(dep_bin).lstrip(', ')
+ elif not dep_included:
+ self.include_item(dep_path, binb=item_name, image_contents=image_contents)
+ dep_bin = self[item_path][self.COL_BINB].split(', ')
+ if self[item_path][self.COL_NAME] in dep_bin:
+ dep_bin.remove(self[item_path][self.COL_NAME])
+ self[item_path][self.COL_BINB] = ', '.join(dep_bin).lstrip(', ')
+
+ def exclude_item(self, item_path):
+ if not self.path_included(item_path):
+ return
+
+ self[item_path][self.COL_INC] = False
+
+ item_name = self[item_path][self.COL_NAME]
+ item_deps = self[item_path][self.COL_DEPS]
+ if item_deps:
+ for dep in item_deps.split(" "):
+ dep_path = self.find_path_for_item(dep)
+ if not dep_path:
+ continue
+ dep_bin = self[dep_path][self.COL_BINB].split(', ')
+ if item_name in dep_bin:
+ dep_bin.remove(item_name)
+ self[dep_path][self.COL_BINB] = ', '.join(dep_bin).lstrip(', ')
+
+ item_bin = self[item_path][self.COL_BINB].split(', ')
+ if item_bin:
+ for binb in item_bin:
+ binb_path = self.find_path_for_item(binb)
+ if not binb_path:
+ continue
+ self.exclude_item(binb_path)
+
+ def reset(self):
+ it = self.get_iter_first()
+ while it:
+ self.set(it,
+ self.COL_INC, False,
+ self.COL_BINB, "",
+ self.COL_IMG, False)
+ it = self.iter_next(it)
+
+ self.selection_change_notification()
+
+ """
+ Returns two lists. One of user selected recipes and the other containing
+ all selected recipes
+ """
+ def get_selected_recipes(self):
+ allrecipes = []
+ userrecipes = []
+
+ it = self.get_iter_first()
+ while it:
+ if self.get_value(it, self.COL_INC):
+ name = self.get_value(it, self.COL_PN)
+ type = self.get_value(it, self.COL_TYPE)
+ if type != "image":
+ allrecipes.append(name)
+ sel = "User Selected" in self.get_value(it, self.COL_BINB)
+ if sel:
+ userrecipes.append(name)
+ it = self.iter_next(it)
+
+ return list(set(userrecipes)), list(set(allrecipes))
+
+ def set_selected_recipes(self, recipelist):
+ for pn in recipelist:
+ if pn in self.pn_path.keys():
+ path = self.pn_path[pn]
+ self.include_item(item_path=path,
+ binb="User Selected")
+ self.selection_change_notification()
+
+ def get_selected_image(self):
+ it = self.get_iter_first()
+ while it:
+ if self.get_value(it, self.COL_INC):
+ name = self.get_value(it, self.COL_PN)
+ type = self.get_value(it, self.COL_TYPE)
+ if type == "image":
+ sel = "User Selected" in self.get_value(it, self.COL_BINB)
+ if sel:
+ return name
+ it = self.iter_next(it)
+ return None
+
+ def set_selected_image(self, img):
+ if not img:
+ return
+ self.reset()
+ path = self.find_path_for_item(img)
+ self.include_item(item_path=path,
+ binb="User Selected",
+ image_contents=True)
+ self.selection_change_notification()
+
+ def set_custom_image_version(self, version):
+ self.custom_image_version = version
+
+ def get_custom_image_version(self):
+ return self.custom_image_version
+
+ def is_custom_image(self):
+ return self.get_selected_image() == self.__custom_image__
diff --git a/bitbake/lib/bb/ui/crumbs/hobpages.py b/bitbake/lib/bb/ui/crumbs/hobpages.py
new file mode 100755
index 0000000..0fd3598
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hobpages.py
@@ -0,0 +1,128 @@
+#!/usr/bin/env python
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2012 Intel Corporation
+#
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+from bb.ui.crumbs.hobcolor import HobColors
+from bb.ui.crumbs.hobwidget import hwc
+
+#
+# HobPage: the super class for all Hob-related pages
+#
+class HobPage (gtk.VBox):
+
+ def __init__(self, builder, title = None):
+ super(HobPage, self).__init__(False, 0)
+ self.builder = builder
+ self.builder_width, self.builder_height = self.builder.size_request()
+
+ if not title:
+ self.title = "Hob -- Image Creator"
+ else:
+ self.title = title
+ self.title_label = gtk.Label()
+
+ self.box_group_area = gtk.VBox(False, 12)
+ self.box_group_area.set_size_request(self.builder_width - 73 - 73, self.builder_height - 88 - 15 - 15)
+ self.group_align = gtk.Alignment(xalign = 0, yalign=0.5, xscale=1, yscale=1)
+ self.group_align.set_padding(15, 15, 73, 73)
+ self.group_align.add(self.box_group_area)
+ self.box_group_area.set_homogeneous(False)
+
+ def set_title(self, title):
+ self.title = title
+ self.title_label.set_markup("<span size='x-large'>%s</span>" % self.title)
+
+ def add_onto_top_bar(self, widget = None, padding = 0):
+ # the top button occupies 1/7 of the page height
+ # setup an event box
+ eventbox = gtk.EventBox()
+ style = eventbox.get_style().copy()
+ style.bg[gtk.STATE_NORMAL] = eventbox.get_colormap().alloc_color(HobColors.LIGHT_GRAY, False, False)
+ eventbox.set_style(style)
+ eventbox.set_size_request(-1, 88)
+
+ hbox = gtk.HBox()
+
+ self.title_label = gtk.Label()
+ self.title_label.set_markup("<span size='x-large'>%s</span>" % self.title)
+ hbox.pack_start(self.title_label, expand=False, fill=False, padding=20)
+
+ if widget:
+ # add the widget in the event box
+ hbox.pack_end(widget, expand=False, fill=False, padding=padding)
+ eventbox.add(hbox)
+
+ return eventbox
+
+ def span_tag(self, size="medium", weight="normal", forground="#1c1c1c"):
+ span_tag = "weight='%s' foreground='%s' size='%s'" % (weight, forground, size)
+ return span_tag
+
+ def append_toolbar_button(self, toolbar, buttonname, icon_disp, icon_hovor, tip, cb):
+ # Create a button and append it on the toolbar according to button name
+ icon = gtk.Image()
+ icon_display = icon_disp
+ icon_hover = icon_hovor
+ pix_buffer = gtk.gdk.pixbuf_new_from_file(icon_display)
+ icon.set_from_pixbuf(pix_buffer)
+ tip_text = tip
+ button = toolbar.append_item(buttonname, tip, None, icon, cb)
+ return button
+
+ @staticmethod
+ def _size_to_string(size):
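+        # Render a byte count as 'N B', 'N.N KB' or 'N.N MB', chosen by the number of decimal digits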
+ try:
+ if not size:
+ size_str = "0 B"
+ else:
+ if len(str(int(size))) > 6:
+ size_str = '%.1f' % (size*1.0/(1024*1024)) + ' MB'
+ elif len(str(int(size))) > 3:
+ size_str = '%.1f' % (size*1.0/1024) + ' KB'
+ else:
+ size_str = str(size) + ' B'
+ except:
+ size_str = "0 B"
+ return size_str
+
+ @staticmethod
+ def _string_to_size(str_size):
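+        # Parse a size string such as '1.5 MB' (as produced by _size_to_string) back
+        # into a byte count; malformed input yields 0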
+ try:
+ if not str_size:
+ size = 0
+ else:
+ unit = str_size.split()
+ if len(unit) > 1:
+ if unit[1] == 'MB':
+ size = float(unit[0])*1024*1024
+ elif unit[1] == 'KB':
+ size = float(unit[0])*1024
+ elif unit[1] == 'B':
+ size = float(unit[0])
+ else:
+ size = 0
+ else:
+ size = float(unit[0])
+ except:
+ size = 0
+ return size
+
diff --git a/bitbake/lib/bb/ui/crumbs/hobwidget.py b/bitbake/lib/bb/ui/crumbs/hobwidget.py
new file mode 100644
index 0000000..2b969c1
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/hobwidget.py
@@ -0,0 +1,904 @@
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011-2012 Intel Corporation
+#
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+import gtk
+import gobject
+import os
+import os.path
+import sys
+import pango, pangocairo
+import cairo
+import math
+
+from bb.ui.crumbs.hobcolor import HobColors
+from bb.ui.crumbs.persistenttooltip import PersistentTooltip
+
+class hwc:
+
+ MAIN_WIN_WIDTH = 1024
+ MAIN_WIN_HEIGHT = 700
+
+class hic:
+
+ HOB_ICON_BASE_DIR = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(__file__))), ("ui/icons/"))
+
+ ICON_RCIPE_DISPLAY_FILE = os.path.join(HOB_ICON_BASE_DIR, ('recipe/recipe_display.png'))
+ ICON_RCIPE_HOVER_FILE = os.path.join(HOB_ICON_BASE_DIR, ('recipe/recipe_hover.png'))
+ ICON_PACKAGES_DISPLAY_FILE = os.path.join(HOB_ICON_BASE_DIR, ('packages/packages_display.png'))
+ ICON_PACKAGES_HOVER_FILE = os.path.join(HOB_ICON_BASE_DIR, ('packages/packages_hover.png'))
+ ICON_LAYERS_DISPLAY_FILE = os.path.join(HOB_ICON_BASE_DIR, ('layers/layers_display.png'))
+ ICON_LAYERS_HOVER_FILE = os.path.join(HOB_ICON_BASE_DIR, ('layers/layers_hover.png'))
+ ICON_IMAGES_DISPLAY_FILE = os.path.join(HOB_ICON_BASE_DIR, ('images/images_display.png'))
+ ICON_IMAGES_HOVER_FILE = os.path.join(HOB_ICON_BASE_DIR, ('images/images_hover.png'))
+ ICON_SETTINGS_DISPLAY_FILE = os.path.join(HOB_ICON_BASE_DIR, ('settings/settings_display.png'))
+ ICON_SETTINGS_HOVER_FILE = os.path.join(HOB_ICON_BASE_DIR, ('settings/settings_hover.png'))
+ ICON_INFO_DISPLAY_FILE = os.path.join(HOB_ICON_BASE_DIR, ('info/info_display.png'))
+ ICON_INFO_HOVER_FILE = os.path.join(HOB_ICON_BASE_DIR, ('info/info_hover.png'))
+ ICON_INDI_CONFIRM_FILE = os.path.join(HOB_ICON_BASE_DIR, ('indicators/confirmation.png'))
+ ICON_INDI_ERROR_FILE = os.path.join(HOB_ICON_BASE_DIR, ('indicators/denied.png'))
+ ICON_INDI_REMOVE_FILE = os.path.join(HOB_ICON_BASE_DIR, ('indicators/remove.png'))
+ ICON_INDI_REMOVE_HOVER_FILE = os.path.join(HOB_ICON_BASE_DIR, ('indicators/remove-hover.png'))
+ ICON_INDI_ADD_FILE = os.path.join(HOB_ICON_BASE_DIR, ('indicators/add.png'))
+ ICON_INDI_ADD_HOVER_FILE = os.path.join(HOB_ICON_BASE_DIR, ('indicators/add-hover.png'))
+ ICON_INDI_REFRESH_FILE = os.path.join(HOB_ICON_BASE_DIR, ('indicators/refresh.png'))
+ ICON_INDI_ALERT_FILE = os.path.join(HOB_ICON_BASE_DIR, ('indicators/alert.png'))
+ ICON_INDI_TICK_FILE = os.path.join(HOB_ICON_BASE_DIR, ('indicators/tick.png'))
+ ICON_INDI_INFO_FILE = os.path.join(HOB_ICON_BASE_DIR, ('indicators/info.png'))
+
+class HobViewTable (gtk.VBox):
+ """
+ A VBox to contain the table for different recipe views and package view
+ """
+ __gsignals__ = {
+ "toggled" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_PYOBJECT,
+ gobject.TYPE_STRING,
+ gobject.TYPE_INT,
+ gobject.TYPE_PYOBJECT,)),
+ "row-activated" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_PYOBJECT,
+ gobject.TYPE_PYOBJECT,)),
+ "cell-fadeinout-stopped" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_PYOBJECT,
+ gobject.TYPE_PYOBJECT,
+ gobject.TYPE_PYOBJECT,)),
+ }
+
+ def __init__(self, columns, name):
+ gtk.VBox.__init__(self, False, 6)
+ self.table_tree = gtk.TreeView()
+ self.table_tree.set_headers_visible(True)
+ self.table_tree.set_headers_clickable(True)
+ self.table_tree.set_rules_hint(True)
+ self.table_tree.set_enable_tree_lines(True)
+ self.table_tree.get_selection().set_mode(gtk.SELECTION_SINGLE)
+ self.toggle_columns = []
+ self.table_tree.connect("row-activated", self.row_activated_cb)
+ self.top_bar = None
+ self.tab_name = name
+
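+        # Each column descriptor is a dict with 'col_name', 'col_id' and optionally
+        # 'col_style' ('text', 'check toggle', 'radio toggle' or 'binb'), 'col_min',
+        # 'col_max', 'col_t_id', 'expand' and 'col_group'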
+ for i, column in enumerate(columns):
+ col_name = column['col_name']
+ col = gtk.TreeViewColumn(col_name)
+ col.set_clickable(True)
+ col.set_resizable(True)
+ if self.tab_name.startswith('Included'):
+ if col_name!='Included':
+ col.set_sort_column_id(column['col_id'])
+ else:
+ col.set_sort_column_id(column['col_id'])
+ if 'col_min' in column.keys():
+ col.set_min_width(column['col_min'])
+ if 'col_max' in column.keys():
+ col.set_max_width(column['col_max'])
+ if 'expand' in column.keys():
+ col.set_expand(True)
+ self.table_tree.append_column(col)
+
+ if (not 'col_style' in column.keys()) or column['col_style'] == 'text':
+ cell = gtk.CellRendererText()
+ col.pack_start(cell, True)
+ col.set_attributes(cell, text=column['col_id'])
+ if 'col_t_id' in column.keys():
+ col.add_attribute(cell, 'font', column['col_t_id'])
+ elif column['col_style'] == 'check toggle':
+ cell = HobCellRendererToggle()
+ cell.set_property('activatable', True)
+ cell.connect("toggled", self.toggled_cb, i, self.table_tree)
+ cell.connect_render_state_changed(self.stop_cell_fadeinout_cb, self.table_tree)
+ self.toggle_id = i
+ col.pack_end(cell, True)
+ col.set_attributes(cell, active=column['col_id'])
+ self.toggle_columns.append(col_name)
+ if 'col_group' in column.keys():
+ col.set_cell_data_func(cell, self.set_group_number_cb)
+ elif column['col_style'] == 'radio toggle':
+ cell = gtk.CellRendererToggle()
+ cell.set_property('activatable', True)
+ cell.set_radio(True)
+ cell.connect("toggled", self.toggled_cb, i, self.table_tree)
+ self.toggle_id = i
+ col.pack_end(cell, True)
+ col.set_attributes(cell, active=column['col_id'])
+ self.toggle_columns.append(col_name)
+ elif column['col_style'] == 'binb':
+ cell = gtk.CellRendererText()
+ col.pack_start(cell, True)
+ col.set_cell_data_func(cell, self.display_binb_cb, column['col_id'])
+ if 'col_t_id' in column.keys():
+ col.add_attribute(cell, 'font', column['col_t_id'])
+
+ self.scroll = gtk.ScrolledWindow()
+ self.scroll.set_policy(gtk.POLICY_NEVER, gtk.POLICY_AUTOMATIC)
+ self.scroll.add(self.table_tree)
+
+ self.pack_end(self.scroll, True, True, 0)
+
+ def add_no_result_bar(self, entry):
+ color = HobColors.KHAKI
+ self.top_bar = gtk.EventBox()
+ self.top_bar.set_size_request(-1, 70)
+ self.top_bar.modify_bg(gtk.STATE_NORMAL, gtk.gdk.color_parse(color))
+ self.top_bar.set_flags(gtk.CAN_DEFAULT)
+ self.top_bar.grab_default()
+
+ no_result_tab = gtk.Table(5, 20, True)
+ self.top_bar.add(no_result_tab)
+
+ label = gtk.Label()
+ label.set_alignment(0.0, 0.5)
+ title = "No results matching your search"
+ label.set_markup("<span size='x-large'><b>%s</b></span>" % title)
+ no_result_tab.attach(label, 1, 14, 1, 4)
+
+ clear_button = HobButton("Clear search")
+ clear_button.set_tooltip_text("Clear search query")
+ clear_button.connect('clicked', self.set_search_entry_clear_cb, entry)
+ no_result_tab.attach(clear_button, 16, 19, 1, 4)
+
+ self.pack_start(self.top_bar, False, True, 12)
+ self.top_bar.show_all()
+
+ def set_search_entry_clear_cb(self, button, search):
+ if search.get_editable() == True:
+ search.set_text("")
+ search.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, False)
+ search.grab_focus()
+
+ def display_binb_cb(self, col, cell, model, it, col_id):
+ binb = model.get_value(it, col_id)
+ # Just display the first item
+ if binb:
+ bin = binb.split(', ')
+ total_no = len(bin)
+ if total_no > 1 and bin[0] == "User Selected":
+ if total_no > 2:
+ present_binb = bin[1] + ' (+' + str(total_no - 1) + ')'
+ else:
+ present_binb = bin[1]
+ else:
+ if total_no > 1:
+ present_binb = bin[0] + ' (+' + str(total_no - 1) + ')'
+ else:
+ present_binb = bin[0]
+ cell.set_property('text', present_binb)
+ else:
+ cell.set_property('text', "")
+ return True
+
+ def set_model(self, tree_model):
+ self.table_tree.set_model(tree_model)
+
+ def toggle_default(self):
+ model = self.table_tree.get_model()
+ if not model:
+ return
+ iter = model.get_iter_first()
+ if iter:
+ rowpath = model.get_path(iter)
+ model[rowpath][self.toggle_id] = True
+
+ def toggled_cb(self, cell, path, columnid, tree):
+ self.emit("toggled", cell, path, columnid, tree)
+
+ def row_activated_cb(self, tree, path, view_column):
+ if not view_column.get_title() in self.toggle_columns:
+ self.emit("row-activated", tree.get_model(), path)
+
+ def stop_cell_fadeinout_cb(self, ctrl, cell, tree):
+ self.emit("cell-fadeinout-stopped", ctrl, cell, tree)
+
+ def set_group_number_cb(self, col, cell, model, iter):
+ if model and (model.iter_parent(iter) == None):
+ cell.cell_attr["number_of_children"] = model.iter_n_children(iter)
+ else:
+ cell.cell_attr["number_of_children"] = 0
+
+ def connect_group_selection(self, cb_func):
+ self.table_tree.get_selection().connect("changed", cb_func)
+
+"""
+A method to calculate a softened value for the colour of widget when in the
+provided state.
+
+widget: the widget whose style to use
+state: the state of the widget to use the style for
+
+Returns a string value representing the softened colour
+"""
+def soften_color(widget, state=gtk.STATE_NORMAL):
+    # this colour munging routine is heavily inspired by gdu_util_get_mix_color()
+ # from gnome-disk-utility:
+ # http://git.gnome.org/browse/gnome-disk-utility/tree/src/gdu-gtk/gdu-gtk.c?h=gnome-3-0
+ blend = 0.7
+ style = widget.get_style()
+ color = style.text[state]
+ color.red = color.red * blend + style.base[state].red * (1.0 - blend)
+ color.green = color.green * blend + style.base[state].green * (1.0 - blend)
+ color.blue = color.blue * blend + style.base[state].blue * (1.0 - blend)
+ return color.to_string()
+
+class BaseHobButton(gtk.Button):
+ """
+ A gtk.Button subclass which follows the visual design of Hob for primary
+ action buttons
+
+ label: the text to display as the button's label
+ """
+ def __init__(self, label):
+ gtk.Button.__init__(self, label)
+ HobButton.style_button(self)
+
+ @staticmethod
+ def style_button(button):
+ style = button.get_style()
+ style = gtk.rc_get_style_by_paths(gtk.settings_get_default(), 'gtk-button', 'gtk-button', gobject.TYPE_NONE)
+
+ button.set_flags(gtk.CAN_DEFAULT)
+ button.grab_default()
+
+# label = "<span size='x-large'><b>%s</b></span>" % gobject.markup_escape_text(button.get_label())
+ label = button.get_label()
+ button.set_label(label)
+ button.child.set_use_markup(True)
+
+class HobButton(BaseHobButton):
+ """
+ A gtk.Button subclass which follows the visual design of Hob for primary
+ action buttons
+
+ label: the text to display as the button's label
+ """
+ def __init__(self, label):
+ BaseHobButton.__init__(self, label)
+ HobButton.style_button(self)
+
+class HobAltButton(BaseHobButton):
+ """
+    A gtk.Button subclass which has no relief, and so is more discreet
+ """
+ def __init__(self, label):
+ BaseHobButton.__init__(self, label)
+ HobAltButton.style_button(self)
+
+ """
+ A callback for the state-changed event to ensure the text is displayed
+ differently when the widget is not sensitive
+ """
+ @staticmethod
+ def desensitise_on_state_change_cb(button, state):
+ if not button.get_property("sensitive"):
+ HobAltButton.set_text(button, False)
+ else:
+ HobAltButton.set_text(button, True)
+
+ """
+ Set the button label with an appropriate colour for the current widget state
+ """
+ @staticmethod
+ def set_text(button, sensitive=True):
+ if sensitive:
+ colour = HobColors.PALE_BLUE
+ else:
+ colour = HobColors.LIGHT_GRAY
+ button.set_label("<span size='large' color='%s'><b>%s</b></span>" % (colour, gobject.markup_escape_text(button.text)))
+ button.child.set_use_markup(True)
+
+class HobImageButton(gtk.Button):
+ """
+ A gtk.Button with an icon and two rows of text, the second of which is
+ displayed in a blended colour.
+
+ primary_text: the main button label
+ secondary_text: optional second line of text
+ icon_path: path to the icon file to display on the button
+ """
+ def __init__(self, primary_text, secondary_text="", icon_path="", hover_icon_path=""):
+ gtk.Button.__init__(self)
+ self.set_relief(gtk.RELIEF_NONE)
+
+ self.icon_path = icon_path
+ self.hover_icon_path = hover_icon_path
+
+ hbox = gtk.HBox(False, 10)
+ hbox.show()
+ self.add(hbox)
+ self.icon = gtk.Image()
+ self.icon.set_from_file(self.icon_path)
+ self.icon.set_alignment(0.5, 0.0)
+ self.icon.show()
+ if self.hover_icon_path and len(self.hover_icon_path):
+ self.connect("enter-notify-event", self.set_hover_icon_cb)
+ self.connect("leave-notify-event", self.set_icon_cb)
+ hbox.pack_start(self.icon, False, False, 0)
+ label = gtk.Label()
+ label.set_alignment(0.0, 0.5)
+ colour = soften_color(label)
+ mark = "<span size='x-large'>%s</span>\n<span size='medium' fgcolor='%s' weight='ultralight'>%s</span>" % (primary_text, colour, secondary_text)
+ label.set_markup(mark)
+ label.show()
+ hbox.pack_start(label, True, True, 0)
+
+ def set_hover_icon_cb(self, widget, event):
+ self.icon.set_from_file(self.hover_icon_path)
+
+ def set_icon_cb(self, widget, event):
+ self.icon.set_from_file(self.icon_path)
+
+class HobInfoButton(gtk.EventBox):
+ """
+ This class implements a button-like widget per the Hob visual and UX designs
+ which will display a persistent tooltip, with the contents of tip_markup, when
+ clicked.
+
+ tip_markup: the Pango Markup to be displayed in the persistent tooltip
+ """
+ def __init__(self, tip_markup, parent=None):
+ gtk.EventBox.__init__(self)
+ self.image = gtk.Image()
+ self.image.set_from_file(
+ hic.ICON_INFO_DISPLAY_FILE)
+ self.image.show()
+ self.add(self.image)
+ self.tip_markup = tip_markup
+ self.my_parent = parent
+
+ self.set_events(gtk.gdk.BUTTON_RELEASE |
+ gtk.gdk.ENTER_NOTIFY_MASK |
+ gtk.gdk.LEAVE_NOTIFY_MASK)
+
+ self.connect("button-release-event", self.button_release_cb)
+ self.connect("enter-notify-event", self.mouse_in_cb)
+ self.connect("leave-notify-event", self.mouse_out_cb)
+
+ """
+    When the mouse click is released, emulate a button-click and show the associated
+    information dialog (PropertyDialog)
+ """
+ def button_release_cb(self, widget, event):
+ from bb.ui.crumbs.hig.propertydialog import PropertyDialog
+ self.dialog = PropertyDialog(title = '',
+ parent = self.my_parent,
+ information = self.tip_markup,
+ flags = gtk.DIALOG_DESTROY_WITH_PARENT
+ | gtk.DIALOG_NO_SEPARATOR)
+
+ button = self.dialog.add_button("Close", gtk.RESPONSE_CANCEL)
+ HobAltButton.style_button(button)
+ button.connect("clicked", lambda w: self.dialog.destroy())
+ self.dialog.show_all()
+ self.dialog.run()
+
+ """
+ Change to the prelight image when the mouse enters the widget
+ """
+ def mouse_in_cb(self, widget, event):
+ self.image.set_from_file(hic.ICON_INFO_HOVER_FILE)
+
+ """
+    Change back to the stock image when the mouse leaves the widget
+ """
+ def mouse_out_cb(self, widget, event):
+ self.image.set_from_file(hic.ICON_INFO_DISPLAY_FILE)
+
+class HobIndicator(gtk.DrawingArea):
+ def __init__(self, count):
+ gtk.DrawingArea.__init__(self)
+ # Set no window for transparent background
+ self.set_has_window(False)
+ self.set_size_request(38,38)
+ # We need to pass through button clicks
+ self.add_events(gtk.gdk.BUTTON_PRESS_MASK | gtk.gdk.BUTTON_RELEASE_MASK)
+
+ self.connect('expose-event', self.expose)
+
+ self.count = count
+ self.color = HobColors.GRAY
+
+ def expose(self, widget, event):
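+        # When the count is positive, draw a filled circle in the indicator colour
+        # with the count rendered in its centre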
+ if self.count and self.count > 0:
+ ctx = widget.window.cairo_create()
+
+ x, y, w, h = self.allocation
+
+ ctx.set_operator(cairo.OPERATOR_OVER)
+ ctx.set_source_color(gtk.gdk.color_parse(self.color))
+ ctx.translate(w/2, h/2)
+ ctx.arc(x, y, min(w,h)/2 - 2, 0, 2*math.pi)
+ ctx.fill_preserve()
+
+ layout = self.create_pango_layout(str(self.count))
+ textw, texth = layout.get_pixel_size()
+ x = (w/2)-(textw/2) + x
+ y = (h/2) - (texth/2) + y
+ ctx.move_to(x, y)
+ self.window.draw_layout(self.style.light_gc[gtk.STATE_NORMAL], int(x), int(y), layout)
+
+ def set_count(self, count):
+ self.count = count
+
+ def set_active(self, active):
+ if active:
+ self.color = HobColors.DEEP_RED
+ else:
+ self.color = HobColors.GRAY
+
+class HobTabLabel(gtk.HBox):
+ def __init__(self, text, count=0):
+ gtk.HBox.__init__(self, False, 0)
+ self.indicator = HobIndicator(count)
+ self.indicator.show()
+ self.pack_end(self.indicator, False, False)
+ self.lbl = gtk.Label(text)
+ self.lbl.set_alignment(0.0, 0.5)
+ self.lbl.show()
+ self.pack_end(self.lbl, True, True, 6)
+
+ def set_count(self, count):
+ self.indicator.set_count(count)
+
+ def set_active(self, active=True):
+ self.indicator.set_active(active)
+
+class HobNotebook(gtk.Notebook):
+ def __init__(self):
+ gtk.Notebook.__init__(self)
+ self.set_property('homogeneous', True)
+
+ self.pages = []
+
+ self.search = None
+ self.search_focus = False
+ self.page_changed = False
+
+ self.connect("switch-page", self.page_changed_cb)
+
+ self.show_all()
+
+ def page_changed_cb(self, nb, page, page_num):
+ for p, lbl in enumerate(self.pages):
+ if p == page_num:
+ lbl.set_active()
+ else:
+ lbl.set_active(False)
+
+ if self.search:
+ self.page_changed = True
+ self.reset_entry(self.search, page_num)
+
+ def append_page(self, child, tab_label, tab_tooltip=None):
+ label = HobTabLabel(tab_label)
+ if tab_tooltip:
+ label.set_tooltip_text(tab_tooltip)
+ label.set_active(False)
+ self.pages.append(label)
+ gtk.Notebook.append_page(self, child, label)
+
+ def set_entry(self, names, tips):
+ self.search = gtk.Entry()
+ self.search_names = names
+ self.search_tips = tips
+ style = self.search.get_style()
+ style.text[gtk.STATE_NORMAL] = self.get_colormap().alloc_color(HobColors.GRAY, False, False)
+ self.search.set_style(style)
+ self.search.set_text(names[0])
+ self.search.set_tooltip_text(self.search_tips[0])
+ self.search.props.has_tooltip = True
+
+ self.search.set_editable(False)
+ self.search.set_icon_from_stock(gtk.ENTRY_ICON_SECONDARY, gtk.STOCK_CLEAR)
+ self.search.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, False)
+ self.search.connect("icon-release", self.set_search_entry_clear_cb)
+ self.search.set_width_chars(30)
+ self.search.show()
+
+ self.search.connect("focus-in-event", self.set_search_entry_editable_cb)
+ self.search.connect("focus-out-event", self.set_search_entry_reset_cb)
+ self.set_action_widget(self.search, gtk.PACK_END)
+
+ def show_indicator_icon(self, title, number):
+ for child in self.pages:
+ if child.lbl.get_label() == title:
+ child.set_count(number)
+
+ def hide_indicator_icon(self, title):
+ for child in self.pages:
+ if child.lbl.get_label() == title:
+ child.set_count(0)
+
+ def set_search_entry_editable_cb(self, search, event):
+ self.search_focus = True
+ search.set_editable(True)
+ text = search.get_text()
+ if text in self.search_names:
+ search.set_text("")
+ style = self.search.get_style()
+ style.text[gtk.STATE_NORMAL] = self.get_colormap().alloc_color(HobColors.BLACK, False, False)
+ search.set_style(style)
+
+ def set_search_entry_reset_cb(self, search, event):
+ page_num = self.get_current_page()
+ text = search.get_text()
+ if not text:
+ self.reset_entry(search, page_num)
+
+ def reset_entry(self, entry, page_num):
+ style = entry.get_style()
+ style.text[gtk.STATE_NORMAL] = self.get_colormap().alloc_color(HobColors.GRAY, False, False)
+ entry.set_style(style)
+ entry.set_text(self.search_names[page_num])
+ entry.set_tooltip_text(self.search_tips[page_num])
+ entry.set_editable(False)
+ entry.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, False)
+
+ def set_search_entry_clear_cb(self, search, icon_pos, event):
+ if search.get_editable() == True:
+ search.set_text("")
+ search.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, False)
+ search.grab_focus()
+
+ def set_page(self, title):
+ for child in self.pages:
+ if child.lbl.get_label() == title:
+ child.grab_focus()
+ self.set_current_page(self.pages.index(child))
+ return
+
+class HobWarpCellRendererText(gtk.CellRendererText):
+ def __init__(self, col_number):
+ gtk.CellRendererText.__init__(self)
+ self.set_property("wrap-mode", pango.WRAP_WORD_CHAR)
+ self.set_property("wrap-width", 300) # default value wrap width is 300
+ self.col_n = col_number
+
+ def do_render(self, window, widget, background_area, cell_area, expose_area, flags):
+ if widget:
+ self.props.wrap_width = self.get_resized_wrap_width(widget, widget.get_column(self.col_n))
+ return gtk.CellRendererText.do_render(self, window, widget, background_area, cell_area, expose_area, flags)
+
+ def get_resized_wrap_width(self, treeview, column):
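+        # The wrap width is the treeview width minus the width of the other columns
+        # and the column separators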
+ otherCols = []
+ for col in treeview.get_columns():
+ if col != column:
+ otherCols.append(col)
+ adjwidth = treeview.allocation.width - sum(c.get_width() for c in otherCols)
+ adjwidth -= treeview.style_get_property("horizontal-separator") * 4
+ if self.props.wrap_width == adjwidth or adjwidth <= 0:
+ adjwidth = self.props.wrap_width
+ return adjwidth
+
+gobject.type_register(HobWarpCellRendererText)
+
+class HobIconChecker(hic):
+ def set_hob_icon_to_stock_icon(self, file_path, stock_id=""):
+ try:
+ pixbuf = gtk.gdk.pixbuf_new_from_file(file_path)
+ except Exception, e:
+ return None
+
+ if stock_id and (gtk.icon_factory_lookup_default(stock_id) == None):
+ icon_factory = gtk.IconFactory()
+ icon_factory.add_default()
+ icon_factory.add(stock_id, gtk.IconSet(pixbuf))
+ gtk.stock_add([(stock_id, '_label', 0, 0, '')])
+
+ return icon_factory.lookup(stock_id)
+
+ return None
+
+ """
+    To keep Hob icons consistent and avoid differences between systems and GTK versions,
+    some 'gtk' stock icons are replaced with 'hob' icons.
+    This function checks stock_name, registers the matching hob icon if necessary and
+    returns the valid stock id to use (or stock_name unchanged)
+ """
+ def check_stock_icon(self, stock_name=""):
+ HOB_CHECK_STOCK_NAME = {
+ ('hic-dialog-info', 'gtk-dialog-info', 'dialog-info') : self.ICON_INDI_INFO_FILE,
+ ('hic-ok', 'gtk-ok', 'ok') : self.ICON_INDI_TICK_FILE,
+ ('hic-dialog-error', 'gtk-dialog-error', 'dialog-error') : self.ICON_INDI_ERROR_FILE,
+ ('hic-dialog-warning', 'gtk-dialog-warning', 'dialog-warning') : self.ICON_INDI_ALERT_FILE,
+ ('hic-task-refresh', 'gtk-execute', 'execute') : self.ICON_INDI_REFRESH_FILE,
+ }
+ valid_stock_id = stock_name
+ if stock_name:
+ for names, path in HOB_CHECK_STOCK_NAME.iteritems():
+ if stock_name in names:
+ valid_stock_id = names[0]
+ if not gtk.icon_factory_lookup_default(valid_stock_id):
+ self.set_hob_icon_to_stock_icon(path, valid_stock_id)
+
+ return valid_stock_id
+
+class HobCellRendererController(gobject.GObject):
+ (MODE_CYCLE_RUNNING, MODE_ONE_SHORT) = range(2)
+ __gsignals__ = {
+ "run-timer-stopped" : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ }
+ def __init__(self, runningmode=MODE_CYCLE_RUNNING, is_draw_row=False):
+ gobject.GObject.__init__(self)
+ self.timeout_id = None
+ self.current_angle_pos = 0.0
+ self.step_angle = 0.0
+ self.tree_headers_height = 0
+ self.running_cell_areas = []
+ self.running_mode = runningmode
+ self.is_queue_draw_row_area = is_draw_row
+ self.force_stop_enable = False
+
+ def is_active(self):
+ if self.timeout_id:
+ return True
+ else:
+ return False
+
+ def reset_run(self):
+ self.force_stop()
+ self.running_cell_areas = []
+ self.current_angle_pos = 0.0
+ self.step_angle = 0.0
+
+    ''' time_iterval: basic timer interval in ms (1~1000)
+        init_usrdata: the user-data value the progress indicator starts at
+        min_usrdata: minimum of the user-data range
+        max_usrdata: maximum of the user-data range
+        step: how much to advance on each timer tick
+        Note: init_usrdata should lie between min and max, max should be > min
+              and step should be < (max - min)
+ '''
+ def start_run(self, time_iterval, init_usrdata, min_usrdata, max_usrdata, step, tree):
+ if (not time_iterval) or (not max_usrdata):
+ return
+ usr_range = (max_usrdata - min_usrdata) * 1.0
+ self.current_angle_pos = (init_usrdata * 1.0) / usr_range
+ self.step_angle = (step * 1) / usr_range
+ self.timeout_id = gobject.timeout_add(int(time_iterval),
+ self.make_image_on_progressing_cb, tree)
+ self.tree_headers_height = self.get_treeview_headers_height(tree)
+ self.force_stop_enable = False
+
+ def force_stop(self):
+ self.emit("run-timer-stopped")
+ self.force_stop_enable = True
+ if self.timeout_id:
+ if gobject.source_remove(self.timeout_id):
+ self.timeout_id = None
+
+ def on_draw_pixbuf_cb(self, pixbuf, cr, x, y, img_width, img_height, do_refresh=True):
+ if pixbuf:
+ r = max(img_width/2, img_height/2)
+ cr.translate(x + r, y + r)
+ if do_refresh:
+ cr.rotate(2 * math.pi * self.current_angle_pos)
+
+ cr.set_source_pixbuf(pixbuf, -img_width/2, -img_height/2)
+ cr.paint()
+
+ def on_draw_fadeinout_cb(self, cr, color, x, y, width, height, do_fadeout=True):
+ if do_fadeout:
+ alpha = self.current_angle_pos * 0.8
+ else:
+ alpha = (1.0 - self.current_angle_pos) * 0.8
+
+ cr.set_source_rgba(color.red, color.green, color.blue, alpha)
+ cr.rectangle(x, y, width, height)
+ cr.fill()
+
+ def get_treeview_headers_height(self, tree):
+ if tree and (tree.get_property("headers-visible") == True):
+ height = tree.get_allocation().height - tree.get_bin_window().get_size()[1]
+ return height
+
+ return 0
+
+ def make_image_on_progressing_cb(self, tree):
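+        # Timer callback: advance the animation angle and queue a redraw of each
+        # running cell area; returning False stops the gobject timer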
+ self.current_angle_pos += self.step_angle
+ if self.running_mode == self.MODE_CYCLE_RUNNING:
+ if (self.current_angle_pos >= 1):
+ self.current_angle_pos = 0
+ else:
+ if self.current_angle_pos > 1:
+ self.force_stop()
+ return False
+
+ if self.is_queue_draw_row_area:
+ for path in self.running_cell_areas:
+ rect = tree.get_cell_area(path, tree.get_column(0))
+ row_x, _, row_width, _ = tree.get_visible_rect()
+ tree.queue_draw_area(row_x, rect.y + self.tree_headers_height, row_width, rect.height)
+ else:
+ for rect in self.running_cell_areas:
+ tree.queue_draw_area(rect.x, rect.y + self.tree_headers_height, rect.width, rect.height)
+
+ return (not self.force_stop_enable)
+
+ def append_running_cell_area(self, cell_area):
+ if cell_area and (cell_area not in self.running_cell_areas):
+ self.running_cell_areas.append(cell_area)
+
+ def remove_running_cell_area(self, cell_area):
+ if cell_area in self.running_cell_areas:
+ self.running_cell_areas.remove(cell_area)
+ if not self.running_cell_areas:
+ self.reset_run()
+
+gobject.type_register(HobCellRendererController)
+
+class HobCellRendererPixbuf(gtk.CellRendererPixbuf):
+ def __init__(self):
+ gtk.CellRendererPixbuf.__init__(self)
+ self.control = HobCellRendererController()
+        # add an icon checker so gtk stock icon names can be mapped onto hob icons
+ self.checker = HobIconChecker()
+ self.set_property("stock-size", gtk.ICON_SIZE_DND)
+
+ def get_pixbuf_from_stock_icon(self, widget, stock_id="", size=gtk.ICON_SIZE_DIALOG):
+ if widget and stock_id and gtk.icon_factory_lookup_default(stock_id):
+ return widget.render_icon(stock_id, size)
+
+ return None
+
+ def set_icon_name_to_id(self, new_name):
+ if new_name and type(new_name) == str:
+            # check whether the name needs to be mapped to a hob icon
+ name = self.checker.check_stock_icon(new_name)
+ if name.startswith("hic") or name.startswith("gtk"):
+ stock_id = name
+ else:
+ stock_id = 'gtk-' + name
+
+ return stock_id
+
+    ''' Render the cell; the "icon-name" property takes priority over "stock-id" and "pixbuf".
+        The 'hic-task-refresh' icon is drawn with a spinning animation,
+        any other stock icon or pixbuf is drawn as a static image.
+ '''
+ def do_render(self, window, tree, background_area,cell_area, expose_area, flags):
+ if (not self.control) or (not tree):
+ return
+
+ x, y, w, h = self.on_get_size(tree, cell_area)
+ x += cell_area.x
+ y += cell_area.y
+ w -= 2 * self.get_property("xpad")
+ h -= 2 * self.get_property("ypad")
+
+ stock_id = ""
+ if self.props.icon_name:
+ stock_id = self.set_icon_name_to_id(self.props.icon_name)
+ elif self.props.stock_id:
+ stock_id = self.props.stock_id
+ elif self.props.pixbuf:
+ pix = self.props.pixbuf
+ else:
+ return
+
+ if stock_id:
+ pix = self.get_pixbuf_from_stock_icon(tree, stock_id, self.props.stock_size)
+ if stock_id == 'hic-task-refresh':
+ self.control.append_running_cell_area(cell_area)
+ if self.control.is_active():
+ self.control.on_draw_pixbuf_cb(pix, window.cairo_create(), x, y, w, h, True)
+ else:
+ self.control.start_run(200, 0, 0, 1000, 150, tree)
+ else:
+ self.control.remove_running_cell_area(cell_area)
+ self.control.on_draw_pixbuf_cb(pix, window.cairo_create(), x, y, w, h, False)
+
+ def on_get_size(self, widget, cell_area):
+ if self.props.icon_name or self.props.pixbuf or self.props.stock_id:
+ w, h = gtk.icon_size_lookup(self.props.stock_size)
+ calc_width = self.get_property("xpad") * 2 + w
+ calc_height = self.get_property("ypad") * 2 + h
+ x_offset = 0
+ y_offset = 0
+ if cell_area and w > 0 and h > 0:
+ x_offset = self.get_property("xalign") * (cell_area.width - calc_width - self.get_property("xpad"))
+ y_offset = self.get_property("yalign") * (cell_area.height - calc_height - self.get_property("ypad"))
+
+ return x_offset, y_offset, w, h
+
+ return 0, 0, 0, 0
+
+gobject.type_register(HobCellRendererPixbuf)
+
+class HobCellRendererToggle(gtk.CellRendererToggle):
+ def __init__(self):
+ gtk.CellRendererToggle.__init__(self)
+ self.ctrl = HobCellRendererController(is_draw_row=True)
+ self.ctrl.running_mode = self.ctrl.MODE_ONE_SHORT
+ self.cell_attr = {"fadeout": False, "number_of_children": 0}
+
+ def do_render(self, window, widget, background_area, cell_area, expose_area, flags):
+ if (not self.ctrl) or (not widget):
+ return
+
+ if flags & gtk.CELL_RENDERER_SELECTED:
+ state = gtk.STATE_SELECTED
+ else:
+ state = gtk.STATE_NORMAL
+
+ if self.ctrl.is_active():
+ path = widget.get_path_at_pos(cell_area.x + cell_area.width/2, cell_area.y + cell_area.height/2)
+            # cell_area coordinates can be negative (e.g. when dragging the scroll bar
+            # past the tree container range), in which case no valid path is found
+ if not path: return
+ path = path[0]
+ if path in self.ctrl.running_cell_areas:
+ cr = window.cairo_create()
+ color = widget.get_style().base[state]
+
+ row_x, _, row_width, _ = widget.get_visible_rect()
+ border_y = self.get_property("ypad")
+ self.ctrl.on_draw_fadeinout_cb(cr, color, row_x, cell_area.y - border_y, row_width, \
+ cell_area.height + border_y * 2, self.cell_attr["fadeout"])
+ # draw number of a group
+ if self.cell_attr["number_of_children"]:
+ text = "%d pkg" % self.cell_attr["number_of_children"]
+ pangolayout = widget.create_pango_layout(text)
+ textw, texth = pangolayout.get_pixel_size()
+ x = cell_area.x + (cell_area.width/2) - (textw/2)
+ y = cell_area.y + (cell_area.height/2) - (texth/2)
+
+ widget.style.paint_layout(window, state, True, cell_area, widget, "checkbox", x, y, pangolayout)
+ else:
+ return gtk.CellRendererToggle.do_render(self, window, widget, background_area, cell_area, expose_area, flags)
+
+    '''delay: fade-out duration in ms (normally 1000)
+       cell_list: the cells that need to be re-rendered
+ '''
+ def fadeout(self, tree, delay, cell_list=None):
+ if (delay < 200) or (not tree):
+ return
+ self.cell_attr["fadeout"] = True
+ self.ctrl.running_cell_areas = cell_list
+ self.ctrl.start_run(200, 0, 0, delay, (delay * 200 / 1000), tree)
+
+ def connect_render_state_changed(self, func, usrdata=None):
+ if not func:
+ return
+ if usrdata:
+ self.ctrl.connect("run-timer-stopped", func, self, usrdata)
+ else:
+ self.ctrl.connect("run-timer-stopped", func, self)
+
+gobject.type_register(HobCellRendererToggle)
diff --git a/bitbake/lib/bb/ui/crumbs/imageconfigurationpage.py b/bitbake/lib/bb/ui/crumbs/imageconfigurationpage.py
new file mode 100644
index 0000000..2766bea
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/imageconfigurationpage.py
@@ -0,0 +1,561 @@
+#!/usr/bin/env python
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2012 Intel Corporation
+#
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+import glib
+import re
+from bb.ui.crumbs.progressbar import HobProgressBar
+from bb.ui.crumbs.hobcolor import HobColors
+from bb.ui.crumbs.hobwidget import hic, HobImageButton, HobInfoButton, HobAltButton, HobButton
+from bb.ui.crumbs.hoblistmodel import RecipeListModel
+from bb.ui.crumbs.hobpages import HobPage
+from bb.ui.crumbs.hig.retrieveimagedialog import RetrieveImageDialog
+
+#
+# ImageConfigurationPage
+#
+class ImageConfigurationPage (HobPage):
+
+ __dummy_machine__ = "--select a machine--"
+ __dummy_image__ = "--select an image recipe--"
+ __custom_image__ = "Select from my image recipes"
+
+ def __init__(self, builder):
+ super(ImageConfigurationPage, self).__init__(builder, "Image configuration")
+
+ self.image_combo_id = None
+        # machine_combo_changed_by_manual identifies whether the machine was changed by
+        # code or manually by the user. If changed manually, all of the user's recipe
+        # and package selections are cleared.
+ self.machine_combo_changed_by_manual = True
+ self.stopping = False
+ self.warning_shift = 0
+ self.custom_image_selected = None
+ self.create_visual_elements()
+
+ def create_visual_elements(self):
+ # create visual elements
+ self.toolbar = gtk.Toolbar()
+ self.toolbar.set_orientation(gtk.ORIENTATION_HORIZONTAL)
+ self.toolbar.set_style(gtk.TOOLBAR_BOTH)
+
+ my_images_button = self.append_toolbar_button(self.toolbar,
+ "Images",
+ hic.ICON_IMAGES_DISPLAY_FILE,
+ hic.ICON_IMAGES_HOVER_FILE,
+ "Open previously built images",
+ self.my_images_button_clicked_cb)
+ settings_button = self.append_toolbar_button(self.toolbar,
+ "Settings",
+ hic.ICON_SETTINGS_DISPLAY_FILE,
+ hic.ICON_SETTINGS_HOVER_FILE,
+ "View additional build settings",
+ self.settings_button_clicked_cb)
+
+ self.config_top_button = self.add_onto_top_bar(self.toolbar)
+
+ self.gtable = gtk.Table(40, 40, True)
+ self.create_config_machine()
+ self.create_config_baseimg()
+ self.config_build_button = self.create_config_build_button()
+
+ def _remove_all_widget(self):
+ children = self.gtable.get_children() or []
+ for child in children:
+ self.gtable.remove(child)
+ children = self.box_group_area.get_children() or []
+ for child in children:
+ self.box_group_area.remove(child)
+ children = self.get_children() or []
+ for child in children:
+ self.remove(child)
+
+ def _pack_components(self, pack_config_build_button = False):
+ self._remove_all_widget()
+ self.pack_start(self.config_top_button, expand=False, fill=False)
+ self.pack_start(self.group_align, expand=True, fill=True)
+
+ self.box_group_area.pack_start(self.gtable, expand=True, fill=True)
+ if pack_config_build_button:
+ self.box_group_area.pack_end(self.config_build_button, expand=False, fill=False)
+ else:
+ box = gtk.HBox(False, 6)
+ box.show()
+ subbox = gtk.HBox(False, 0)
+ subbox.set_size_request(205, 49)
+ subbox.show()
+ box.add(subbox)
+ self.box_group_area.pack_end(box, False, False)
+
+ def show_machine(self):
+ self.progress_bar.reset()
+ self._pack_components(pack_config_build_button = False)
+ self.set_config_machine_layout(show_progress_bar = False)
+ self.show_all()
+
+ def update_progress_bar(self, title, fraction, status=None):
+ if self.stopping == False:
+ self.progress_bar.update(fraction)
+ self.progress_bar.set_text(title)
+ self.progress_bar.set_rcstyle(status)
+
+ def show_info_populating(self):
+ self._pack_components(pack_config_build_button = False)
+ self.set_config_machine_layout(show_progress_bar = True)
+ self.show_all()
+
+ def show_info_populated(self):
+ self.progress_bar.reset()
+ self._pack_components(pack_config_build_button = False)
+ self.set_config_machine_layout(show_progress_bar = False)
+ self.set_config_baseimg_layout()
+ self.show_all()
+
+ def show_baseimg_selected(self):
+ self.progress_bar.reset()
+ self._pack_components(pack_config_build_button = True)
+ self.set_config_machine_layout(show_progress_bar = False)
+ self.set_config_baseimg_layout()
+ self.show_all()
+ if self.builder.recipe_model.get_selected_image() == self.builder.recipe_model.__custom_image__:
+ self.just_bake_button.hide()
+
+ def add_warnings_bar(self):
+ # create the warnings bar shown when recipe parsing generates warnings
+ color = HobColors.KHAKI
+ warnings_bar = gtk.EventBox()
+ warnings_bar.modify_bg(gtk.STATE_NORMAL, gtk.gdk.color_parse(color))
+ warnings_bar.set_flags(gtk.CAN_DEFAULT)
+ warnings_bar.grab_default()
+
+ build_stop_tab = gtk.Table(10, 20, True)
+ warnings_bar.add(build_stop_tab)
+
+ icon = gtk.Image()
+ icon_pix_buffer = gtk.gdk.pixbuf_new_from_file(hic.ICON_INDI_ALERT_FILE)
+ icon.set_from_pixbuf(icon_pix_buffer)
+ build_stop_tab.attach(icon, 0, 2, 0, 10)
+
+ label = gtk.Label()
+ label.set_alignment(0.0, 0.5)
+ warnings_nb = len(self.builder.parsing_warnings)
+ if warnings_nb == 1:
+ label.set_markup("<span size='x-large'><b>1 recipe parsing warning</b></span>")
+ else:
+ label.set_markup("<span size='x-large'><b>%s recipe parsing warnings</b></span>" % warnings_nb)
+ build_stop_tab.attach(label, 2, 12, 0, 10)
+
+ view_warnings_button = HobButton("View warnings")
+ view_warnings_button.connect('clicked', self.view_warnings_button_clicked_cb)
+ build_stop_tab.attach(view_warnings_button, 15, 19, 1, 9)
+
+ return warnings_bar
+
+ def disable_warnings_bar(self):
+ if self.builder.parsing_warnings:
+ if hasattr(self, 'warnings_bar'):
+ self.warnings_bar.hide_all()
+ self.builder.parsing_warnings = []
+
+ def create_config_machine(self):
+ self.machine_title = gtk.Label()
+ self.machine_title.set_alignment(0.0, 0.5)
+ mark = "<span %s>Select a machine</span>" % self.span_tag('x-large', 'bold')
+ self.machine_title.set_markup(mark)
+
+ self.machine_title_desc = gtk.Label()
+ self.machine_title_desc.set_alignment(0.0, 0.5)
+ mark = ("<span %s>Your selection is the profile of the target machine for which you"
+ " are building the image.\n</span>") % (self.span_tag('medium'))
+ self.machine_title_desc.set_markup(mark)
+
+ self.machine_combo = gtk.combo_box_new_text()
+ self.machine_combo.connect("changed", self.machine_combo_changed_cb)
+
+ icon_file = hic.ICON_LAYERS_DISPLAY_FILE
+ hover_file = hic.ICON_LAYERS_HOVER_FILE
+ self.layer_button = HobImageButton("Layers", "Add support for machines, software, etc.",
+ icon_file, hover_file)
+ self.layer_button.connect("clicked", self.layer_button_clicked_cb)
+
+ markup = "Layers are a powerful mechanism to extend the Yocto Project "
+ markup += "with your own functionality.\n"
+ markup += "For more on layers, check the <a href=\""
+ markup += "http://www.yoctoproject.org/docs/current/dev-manual/"
+ markup += "dev-manual.html#understanding-and-using-layers\">reference manual</a>."
+ self.layer_info_icon = HobInfoButton("<b>Layers</b>" + "*" + markup, self.get_parent())
+ self.progress_bar = HobProgressBar()
+ self.stop_button = HobAltButton("Stop")
+ self.stop_button.connect("clicked", self.stop_button_clicked_cb)
+ self.machine_separator = gtk.HSeparator()
+
+ def set_config_machine_layout(self, show_progress_bar = False):
+ self.gtable.attach(self.machine_title, 0, 40, 0, 4)
+ self.gtable.attach(self.machine_title_desc, 0, 40, 4, 6)
+ self.gtable.attach(self.machine_combo, 0, 12, 7, 10)
+ self.gtable.attach(self.layer_button, 14, 36, 7, 12)
+ self.gtable.attach(self.layer_info_icon, 36, 40, 7, 11)
+ if show_progress_bar:
+ #self.gtable.attach(self.progress_box, 0, 40, 15, 18)
+ self.gtable.attach(self.progress_bar, 0, 37, 15, 18)
+ self.gtable.attach(self.stop_button, 37, 40, 15, 18, 0, 0)
+ if self.builder.parsing_warnings:
+ self.warnings_bar = self.add_warnings_bar()
+ self.gtable.attach(self.warnings_bar, 0, 40, 14, 18)
+ self.warning_shift = 4
+ else:
+ self.warning_shift = 0
+ self.gtable.attach(self.machine_separator, 0, 40, 13, 14)
+
+ def create_config_baseimg(self):
+ self.image_title = gtk.Label()
+ self.image_title.set_alignment(0, 1.0)
+ mark = "<span %s>Select an image recipe</span>" % self.span_tag('x-large', 'bold')
+ self.image_title.set_markup(mark)
+
+ self.image_title_desc = gtk.Label()
+ self.image_title_desc.set_alignment(0, 0.5)
+
+ mark = ("<span %s>Image recipes are a starting point for the type of image you want. "
+ "You can build them as \n"
+ "they are or edit them to suit your needs.\n</span>") % self.span_tag('medium')
+ self.image_title_desc.set_markup(mark)
+
+ self.image_combo = gtk.combo_box_new_text()
+ self.image_combo.set_row_separator_func(self.combo_separator_func, None)
+ self.image_combo_id = self.image_combo.connect("changed", self.image_combo_changed_cb)
+
+ self.image_desc = gtk.Label()
+ self.image_desc.set_alignment(0.0, 0.5)
+ self.image_desc.set_size_request(256, -1)
+ self.image_desc.set_justify(gtk.JUSTIFY_LEFT)
+ self.image_desc.set_line_wrap(True)
+
+ # button to view recipes
+ icon_file = hic.ICON_RCIPE_DISPLAY_FILE
+ hover_file = hic.ICON_RCIPE_HOVER_FILE
+ self.view_adv_configuration_button = HobImageButton("Advanced configuration",
+ "Select image types, package formats, etc",
+ icon_file, hover_file)
+ self.view_adv_configuration_button.connect("clicked", self.view_adv_configuration_button_clicked_cb)
+
+ self.image_separator = gtk.HSeparator()
+
+ def combo_separator_func(self, model, iter, user_data):
+ name = model.get_value(iter, 0)
+ if name == "--Separator--":
+ return True
+
+ def set_config_baseimg_layout(self):
+ self.gtable.attach(self.image_title, 0, 40, 15+self.warning_shift, 17+self.warning_shift)
+ self.gtable.attach(self.image_title_desc, 0, 40, 18+self.warning_shift, 22+self.warning_shift)
+ self.gtable.attach(self.image_combo, 0, 12, 23+self.warning_shift, 26+self.warning_shift)
+ self.gtable.attach(self.image_desc, 0, 12, 27+self.warning_shift, 33+self.warning_shift)
+ self.gtable.attach(self.view_adv_configuration_button, 14, 36, 23+self.warning_shift, 28+self.warning_shift)
+ self.gtable.attach(self.image_separator, 0, 40, 35+self.warning_shift, 36+self.warning_shift)
+
+ def create_config_build_button(self):
+ # Create the "Build packages" and "Build image" buttons at the bottom
+ button_box = gtk.HBox(False, 6)
+
+ # create button "Build image"
+ self.just_bake_button = HobButton("Build image")
+ self.just_bake_button.set_tooltip_text("Build the image recipe as it is")
+ self.just_bake_button.connect("clicked", self.just_bake_button_clicked_cb)
+ button_box.pack_end(self.just_bake_button, expand=False, fill=False)
+
+ # create button "Edit image recipe"
+ self.edit_image_button = HobAltButton("Edit image recipe")
+ self.edit_image_button.set_tooltip_text("Customize the recipes and packages to be included in your image")
+ self.edit_image_button.connect("clicked", self.edit_image_button_clicked_cb)
+ button_box.pack_end(self.edit_image_button, expand=False, fill=False)
+
+ return button_box
+
+ def stop_button_clicked_cb(self, button):
+ self.stopping = True
+ self.progress_bar.set_text("Stopping recipe parsing")
+ self.progress_bar.set_rcstyle("stop")
+ self.builder.cancel_parse_sync()
+
+ def view_warnings_button_clicked_cb(self, button):
+ self.builder.show_warning_dialog()
+
+ def machine_combo_changed_idle_cb(self):
+ self.builder.window.set_cursor(None)
+
+ def machine_combo_changed_cb(self, machine_combo):
+ self.stopping = False
+ self.builder.parsing_warnings = []
+ combo_item = machine_combo.get_active_text()
+ if not combo_item or combo_item == self.__dummy_machine__:
+ return
+
+ self.builder.window.set_cursor(gtk.gdk.Cursor(gtk.gdk.WATCH))
+ self.builder.wait(0.1) #wait for combo and cursor to update
+
+ # remove __dummy_machine__ item from the store list after first user selection
+ # because it is no longer valid
+ combo_store = machine_combo.get_model()
+ if len(combo_store) and (combo_store[0][0] == self.__dummy_machine__):
+ machine_combo.remove_text(0)
+
+ self.builder.configuration.curr_mach = combo_item
+ if self.machine_combo_changed_by_manual:
+ self.builder.configuration.clear_selection()
+ # reset machine_combo_changed_by_manual
+ self.machine_combo_changed_by_manual = True
+
+ self.builder.configuration.selected_image = None
+
+ # Do reparse recipes
+ self.builder.populate_recipe_package_info_async()
+
+ glib.idle_add(self.machine_combo_changed_idle_cb)
+
+ def update_machine_combo(self):
+ self.disable_warnings_bar()
+ all_machines = [self.__dummy_machine__] + self.builder.parameters.all_machines
+
+ model = self.machine_combo.get_model()
+ model.clear()
+ for machine in all_machines:
+ self.machine_combo.append_text(machine)
+ self.machine_combo.set_active(0)
+
+ def switch_machine_combo(self):
+ self.disable_warnings_bar()
+ self.machine_combo_changed_by_manual = False
+ model = self.machine_combo.get_model()
+ active = 0
+ while active < len(model):
+ if model[active][0] == self.builder.configuration.curr_mach:
+ self.machine_combo.set_active(active)
+ return
+ active += 1
+
+ if model[0][0] != self.__dummy_machine__:
+ self.machine_combo.insert_text(0, self.__dummy_machine__)
+
+ self.machine_combo.set_active(0)
+
+ def update_image_desc(self):
+ desc = ""
+ selected_image = self.image_combo.get_active_text()
+ if selected_image and selected_image in self.builder.recipe_model.pn_path.keys():
+ image_path = self.builder.recipe_model.pn_path[selected_image]
+ image_iter = self.builder.recipe_model.get_iter(image_path)
+ desc = self.builder.recipe_model.get_value(image_iter, self.builder.recipe_model.COL_DESC)
+
+ mark = ("<span %s>%s</span>\n") % (self.span_tag('small'), desc)
+ self.image_desc.set_markup(mark)
+
+ def image_combo_changed_idle_cb(self, selected_image, selected_recipes, selected_packages):
+ self.builder.update_recipe_model(selected_image, selected_recipes)
+ self.builder.update_package_model(selected_packages)
+ self.builder.window_sensitive(True)
+
+ def image_combo_changed_cb(self, combo):
+ self.builder.window_sensitive(False)
+ selected_image = self.image_combo.get_active_text()
+ if selected_image == self.__custom_image__:
+ topdir = self.builder.get_topdir()
+ images_dir = topdir + "/recipes/images/custom/"
+ self.builder.ensure_dir(images_dir)
+
+ dialog = RetrieveImageDialog(images_dir, "Select from my image recipes",
+ self.builder, gtk.DIALOG_MODAL | gtk.DIALOG_DESTROY_WITH_PARENT)
+ response = dialog.run()
+ if response == gtk.RESPONSE_OK:
+ image_name = dialog.get_filename()
+ head, tail = os.path.split(image_name)
+ selected_image = os.path.splitext(tail)[0]
+ self.custom_image_selected = selected_image
+ self.update_image_combo(self.builder.recipe_model, selected_image)
+ else:
+ selected_image = self.__dummy_image__
+ self.update_image_combo(self.builder.recipe_model, None)
+ dialog.destroy()
+ else:
+ if self.custom_image_selected:
+ self.custom_image_selected = None
+ self.update_image_combo(self.builder.recipe_model, selected_image)
+
+ if not selected_image or (selected_image == self.__dummy_image__):
+ self.builder.window_sensitive(True)
+ self.just_bake_button.hide()
+ self.edit_image_button.hide()
+ return
+
+ # remove __dummy_image__ item from the store list after first user selection
+ # because it is no longer valid
+ combo_store = combo.get_model()
+ if len(combo_store) and (combo_store[0][0] == self.__dummy_image__):
+ combo.remove_text(0)
+
+ self.builder.customized = False
+
+ selected_recipes = []
+
+ image_path = self.builder.recipe_model.pn_path[selected_image]
+ image_iter = self.builder.recipe_model.get_iter(image_path)
+ selected_packages = self.builder.recipe_model.get_value(image_iter, self.builder.recipe_model.COL_INSTALL).split()
+ self.update_image_desc()
+
+ self.builder.recipe_model.reset()
+ self.builder.package_model.reset()
+
+ self.show_baseimg_selected()
+
+ if selected_image == self.builder.recipe_model.__custom_image__:
+ self.just_bake_button.hide()
+
+ glib.idle_add(self.image_combo_changed_idle_cb, selected_image, selected_recipes, selected_packages)
+
+ def _image_combo_connect_signal(self):
+ if not self.image_combo_id:
+ self.image_combo_id = self.image_combo.connect("changed", self.image_combo_changed_cb)
+
+ def _image_combo_disconnect_signal(self):
+ if self.image_combo_id:
+ self.image_combo.disconnect(self.image_combo_id)
+ self.image_combo_id = None
+
+ def update_image_combo(self, recipe_model, selected_image):
+ # Update the image combo according to the images in the recipe_model
+ # populate image combo
+ filter = {RecipeListModel.COL_TYPE : ['image']}
+ image_model = recipe_model.tree_model(filter)
+ image_model.set_sort_column_id(recipe_model.COL_NAME, gtk.SORT_ASCENDING)
+ active = 0
+ cnt = 0
+
+ white_pattern = []
+ if self.builder.parameters.image_white_pattern:
+ for i in self.builder.parameters.image_white_pattern.split():
+ white_pattern.append(re.compile(i))
+
+ black_pattern = []
+ if self.builder.parameters.image_black_pattern:
+ for i in self.builder.parameters.image_black_pattern.split():
+ black_pattern.append(re.compile(i))
+ black_pattern.append(re.compile("hob-image"))
+ black_pattern.append(re.compile("edited(-[0-9]*)*.bb$"))
+
+ it = image_model.get_iter_first()
+ self._image_combo_disconnect_signal()
+ model = self.image_combo.get_model()
+ model.clear()
+ # Set an indicator text in the combo store when it is first opened
+ if not selected_image:
+ self.image_combo.append_text(self.__dummy_image__)
+ cnt = cnt + 1
+
+ self.image_combo.append_text(self.__custom_image__)
+ self.image_combo.append_text("--Separator--")
+ cnt = cnt + 2
+
+ topdir = self.builder.get_topdir()
+ # append and set active
+ while it:
+ path = image_model.get_path(it)
+ it = image_model.iter_next(it)
+ image_name = image_model[path][recipe_model.COL_NAME]
+ if image_name == self.builder.recipe_model.__custom_image__:
+ continue
+
+ if black_pattern:
+ allow = True
+ for pattern in black_pattern:
+ if pattern.search(image_name):
+ allow = False
+ break
+ elif white_pattern:
+ allow = False
+ for pattern in white_pattern:
+ if pattern.search(image_name):
+ allow = True
+ break
+ else:
+ allow = True
+
+ file_name = image_model[path][recipe_model.COL_FILE]
+ if file_name and topdir in file_name:
+ allow = False
+
+ if allow:
+ self.image_combo.append_text(image_name)
+ if image_name == selected_image:
+ active = cnt
+ cnt = cnt + 1
+ self.image_combo.append_text(self.builder.recipe_model.__custom_image__)
+
+ if selected_image == self.builder.recipe_model.__custom_image__:
+ active = cnt
+
+ if self.custom_image_selected:
+ self.image_combo.append_text("--Separator--")
+ self.image_combo.append_text(self.custom_image_selected)
+ cnt = cnt + 2
+ if self.custom_image_selected == selected_image:
+ active = cnt
+
+ self.image_combo.set_active(active)
+
+ if active != 0:
+ self.show_baseimg_selected()
+
+ self._image_combo_connect_signal()
+
+ def layer_button_clicked_cb(self, button):
+ # Create a layer selection dialog
+ self.builder.show_layer_selection_dialog()
+
+ def view_adv_configuration_button_clicked_cb(self, button):
+ # Create an advanced settings dialog
+ response, settings_changed = self.builder.show_adv_settings_dialog()
+ if not response:
+ return
+ if settings_changed:
+ self.builder.window.set_cursor(gtk.gdk.Cursor(gtk.gdk.WATCH))
+ self.builder.wait(0.1) #wait for adv_settings_dialog to terminate
+ self.builder.reparse_post_adv_settings()
+ self.builder.window.set_cursor(None)
+
+ def just_bake_button_clicked_cb(self, button):
+ self.builder.parsing_warnings = []
+ self.builder.just_bake()
+
+ def edit_image_button_clicked_cb(self, button):
+ self.builder.set_base_image()
+ self.builder.show_recipes()
+
+ def my_images_button_clicked_cb(self, button):
+ self.builder.show_load_my_images_dialog()
+
+ def settings_button_clicked_cb(self, button):
+ # Create an advanced settings dialog
+ response, settings_changed = self.builder.show_simple_settings_dialog()
+ if not response:
+ return
+ if settings_changed:
+ self.builder.reparse_post_adv_settings()
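The _image_combo_connect_signal()/_image_combo_disconnect_signal() helpers above let update_image_combo() repopulate the combo box without re-entering image_combo_changed_cb() for every programmatic append_text()/set_active() call. A minimal, self-contained sketch of that connect/disconnect pattern, assuming PyGTK 2.x (the ComboOwner class and its names are illustrative, not part of Hob):

    import gtk

    class ComboOwner(object):
        def __init__(self):
            self.combo = gtk.combo_box_new_text()
            self.handler_id = self.combo.connect("changed", self.changed_cb)

        def changed_cb(self, combo):
            # Only reached for genuine user selections once repopulation is done
            print("selected: %s" % combo.get_active_text())

        def repopulate(self, items, active_index=0):
            # Detach the handler so the updates below stay silent
            if self.handler_id:
                self.combo.disconnect(self.handler_id)
                self.handler_id = None
            self.combo.get_model().clear()
            for item in items:
                self.combo.append_text(item)
            self.combo.set_active(active_index)
            # Re-attach once the model reflects the desired state
            self.handler_id = self.combo.connect("changed", self.changed_cb)

    owner = ComboOwner()
    owner.repopulate(["--select an image recipe--", "core-image-minimal"])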
diff --git a/bitbake/lib/bb/ui/crumbs/imagedetailspage.py b/bitbake/lib/bb/ui/crumbs/imagedetailspage.py
new file mode 100755
index 0000000..352e948
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/imagedetailspage.py
@@ -0,0 +1,669 @@
+#!/usr/bin/env python
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2012 Intel Corporation
+#
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gobject
+import gtk
+from bb.ui.crumbs.hobcolor import HobColors
+from bb.ui.crumbs.hobwidget import hic, HobViewTable, HobAltButton, HobButton
+from bb.ui.crumbs.hobpages import HobPage
+import os
+import subprocess
+from bb.ui.crumbs.hig.crumbsdialog import CrumbsDialog
+from bb.ui.crumbs.hig.saveimagedialog import SaveImageDialog
+
+#
+# ImageDetailsPage
+#
+class ImageDetailsPage (HobPage):
+
+ class DetailBox (gtk.EventBox):
+ def __init__(self, widget = None, varlist = None, vallist = None, icon = None, button = None, button2=None, color = HobColors.LIGHT_GRAY):
+ gtk.EventBox.__init__(self)
+
+ # set color
+ style = self.get_style().copy()
+ style.bg[gtk.STATE_NORMAL] = self.get_colormap().alloc_color(color, False, False)
+ self.set_style(style)
+
+ self.row = gtk.Table(1, 2, False)
+ self.row.set_border_width(10)
+ self.add(self.row)
+
+ total_rows = 0
+ if widget:
+ total_rows = 10
+ if varlist and vallist:
+ # pack the icon and the text on the left
+ total_rows += len(varlist)
+ self.table = gtk.Table(total_rows, 20, True)
+ self.table.set_row_spacings(6)
+ self.table.set_size_request(100, -1)
+ self.row.attach(self.table, 0, 1, 0, 1, xoptions=gtk.FILL|gtk.EXPAND, yoptions=gtk.FILL)
+
+ colid = 0
+ rowid = 0
+ self.line_widgets = {}
+ if icon:
+ self.table.attach(icon, colid, colid + 2, 0, 1)
+ colid = colid + 2
+ if widget:
+ self.table.attach(widget, colid, 20, 0, 10)
+ rowid = 10
+ if varlist and vallist:
+ for row in range(rowid, total_rows):
+ index = row - rowid
+ self.line_widgets[varlist[index]] = self.text2label(varlist[index], vallist[index])
+ self.table.attach(self.line_widgets[varlist[index]], colid, 20, row, row + 1)
+ # pack the button on the right
+ if button:
+ self.bbox = gtk.VBox()
+ self.bbox.pack_start(button, expand=True, fill=False)
+ if button2:
+ self.bbox.pack_start(button2, expand=True, fill=False)
+ self.bbox.set_size_request(150,-1)
+ self.row.attach(self.bbox, 1, 2, 0, 1, xoptions=gtk.FILL, yoptions=gtk.EXPAND)
+
+ def update_line_widgets(self, variable, value):
+ if len(self.line_widgets) == 0:
+ return
+ if not isinstance(self.line_widgets[variable], gtk.Label):
+ return
+ self.line_widgets[variable].set_markup(self.format_line(variable, value))
+
+ def wrap_line(self, inputs):
+ # wrap the long text of inputs
+ wrap_width_chars = 75
+ outputs = ""
+ tmps = inputs
+ less_chars = len(inputs)
+ while (less_chars - wrap_width_chars) > 0:
+ less_chars -= wrap_width_chars
+ outputs += tmps[:wrap_width_chars] + "\n "
+ tmps = inputs[less_chars:]
+ outputs += tmps
+ return outputs
+
+ def format_line(self, variable, value):
+ wrapped_value = self.wrap_line(value)
+ markup = "<span weight=\'bold\'>%s</span>" % variable
+ markup += "<span weight=\'normal\' foreground=\'#1c1c1c\' font_desc=\'14px\'>%s</span>" % wrapped_value
+ return markup
+
+ def text2label(self, variable, value):
+ # append the name:value to the left box
+ # such as "Name: hob-core-minimal-variant-2011-12-15-beagleboard"
+ label = gtk.Label()
+ label.set_alignment(0.0, 0.5)
+ label.set_markup(self.format_line(variable, value))
+ return label
+
+ class BuildDetailBox (gtk.EventBox):
+ def __init__(self, varlist = None, vallist = None, icon = None, color = HobColors.LIGHT_GRAY):
+ gtk.EventBox.__init__(self)
+
+ # set color
+ style = self.get_style().copy()
+ style.bg[gtk.STATE_NORMAL] = self.get_colormap().alloc_color(color, False, False)
+ self.set_style(style)
+
+ self.hbox = gtk.HBox()
+ self.hbox.set_border_width(10)
+ self.add(self.hbox)
+
+ total_rows = 0
+ if varlist and vallist:
+ # pack the icon and the text on the left
+ total_rows += len(varlist)
+ self.table = gtk.Table(total_rows, 20, True)
+ self.table.set_row_spacings(6)
+ self.table.set_size_request(100, -1)
+ self.hbox.pack_start(self.table, expand=True, fill=True, padding=15)
+
+ colid = 0
+ rowid = 0
+ self.line_widgets = {}
+ if icon:
+ self.table.attach(icon, colid, colid + 2, 0, 1)
+ colid = colid + 2
+ if varlist and vallist:
+ for row in range(rowid, total_rows):
+ index = row - rowid
+ self.line_widgets[varlist[index]] = self.text2label(varlist[index], vallist[index])
+ self.table.attach(self.line_widgets[varlist[index]], colid, 20, row, row + 1)
+
+ def update_line_widgets(self, variable, value):
+ if len(self.line_widgets) == 0:
+ return
+ if not isinstance(self.line_widgets[variable], gtk.Label):
+ return
+ self.line_widgets[variable].set_markup(self.format_line(variable, value))
+
+ def wrap_line(self, inputs):
+ # wrap the long text of inputs
+ wrap_width_chars = 75
+ outputs = ""
+ tmps = inputs
+ less_chars = len(inputs)
+ while (less_chars - wrap_width_chars) > 0:
+ less_chars -= wrap_width_chars
+ outputs += tmps[:wrap_width_chars] + "\n "
+ tmps = inputs[less_chars:]
+ outputs += tmps
+ return outputs
+
+ def format_line(self, variable, value):
+ wrapped_value = self.wrap_line(value)
+ markup = "<span weight=\'bold\'>%s</span>" % variable
+ markup += "<span weight=\'normal\' foreground=\'#1c1c1c\' font_desc=\'14px\'>%s</span>" % wrapped_value
+ return markup
+
+ def text2label(self, variable, value):
+ # append the name:value to the left box
+ # such as "Name: hob-core-minimal-variant-2011-12-15-beagleboard"
+ label = gtk.Label()
+ label.set_alignment(0.0, 0.5)
+ label.set_markup(self.format_line(variable, value))
+ return label
+
+ def __init__(self, builder):
+ super(ImageDetailsPage, self).__init__(builder, "Image details")
+
+ self.image_store = []
+ self.button_ids = {}
+ self.details_bottom_buttons = gtk.HBox(False, 6)
+ self.image_saved = False
+ self.create_visual_elements()
+ self.name_field_template = ""
+ self.description_field_template = ""
+
+ def create_visual_elements(self):
+ # create visual elements
+ # create the toolbar
+ self.toolbar = gtk.Toolbar()
+ self.toolbar.set_orientation(gtk.ORIENTATION_HORIZONTAL)
+ self.toolbar.set_style(gtk.TOOLBAR_BOTH)
+
+ my_images_button = self.append_toolbar_button(self.toolbar,
+ "Images",
+ hic.ICON_IMAGES_DISPLAY_FILE,
+ hic.ICON_IMAGES_HOVER_FILE,
+ "Open previously built images",
+ self.my_images_button_clicked_cb)
+ settings_button = self.append_toolbar_button(self.toolbar,
+ "Settings",
+ hic.ICON_SETTINGS_DISPLAY_FILE,
+ hic.ICON_SETTINGS_HOVER_FILE,
+ "View additional build settings",
+ self.settings_button_clicked_cb)
+
+ self.details_top_buttons = self.add_onto_top_bar(self.toolbar)
+
+ def _remove_all_widget(self):
+ children = self.get_children() or []
+ for child in children:
+ self.remove(child)
+ children = self.box_group_area.get_children() or []
+ for child in children:
+ self.box_group_area.remove(child)
+ children = self.details_bottom_buttons.get_children() or []
+ for child in children:
+ self.details_bottom_buttons.remove(child)
+
+ def show_page(self, step):
+ self.build_succeeded = (step == self.builder.IMAGE_GENERATED)
+ image_addr = self.builder.parameters.image_addr
+ image_names = self.builder.parameters.image_names
+ if self.build_succeeded:
+ machine = self.builder.configuration.curr_mach
+ base_image = self.builder.recipe_model.get_selected_image()
+ layers = self.builder.configuration.layers
+ pkg_num = "%s" % len(self.builder.package_model.get_selected_packages())
+ log_file = self.builder.current_logfile
+ else:
+ pkg_num = "N/A"
+ log_file = None
+
+ # remove
+ for button_id, button in self.button_ids.items():
+ button.disconnect(button_id)
+ self._remove_all_widget()
+
+ # repack
+ self.pack_start(self.details_top_buttons, expand=False, fill=False)
+ self.pack_start(self.group_align, expand=True, fill=True)
+
+ self.build_result = None
+ if self.image_saved or (self.build_succeeded and self.builder.current_step == self.builder.IMAGE_GENERATING):
+ # building is the previous step
+ icon = gtk.Image()
+ pixmap_path = hic.ICON_INDI_CONFIRM_FILE
+ color = HobColors.RUNNING
+ pix_buffer = gtk.gdk.pixbuf_new_from_file(pixmap_path)
+ icon.set_from_pixbuf(pix_buffer)
+ varlist = [""]
+ if self.image_saved:
+ vallist = ["Your image recipe has been saved"]
+ else:
+ vallist = ["Your image is ready"]
+ self.build_result = self.BuildDetailBox(varlist=varlist, vallist=vallist, icon=icon, color=color)
+ self.box_group_area.pack_start(self.build_result, expand=False, fill=False)
+
+ self.buttonlist = ["Build new image", "Save image recipe", "Run image", "Deploy image"]
+
+ # Name
+ self.image_store = []
+ self.toggled_image = ""
+ default_image_size = 0
+ self.num_toggled = 0
+ i = 0
+ for image_name in image_names:
+ image_size = HobPage._size_to_string(os.stat(os.path.join(image_addr, image_name)).st_size)
+
+ image_attr = ("run" if (self.test_type_runnable(image_name) and self.test_mach_runnable(image_name)) else \
+ ("deploy" if self.test_deployable(image_name) else ""))
+ is_toggled = (image_attr != "")
+
+ if not self.toggled_image:
+ if i == (len(image_names) - 1):
+ is_toggled = True
+ if is_toggled:
+ default_image_size = image_size
+ self.toggled_image = image_name
+
+ split_stuff = image_name.split('.')
+ if "rootfs" in split_stuff:
+ image_type = image_name[(len(split_stuff[0]) + len(".rootfs") + 1):]
+ else:
+ image_type = image_name[(len(split_stuff[0]) + 1):]
+
+ self.image_store.append({'name': image_name,
+ 'type': image_type,
+ 'size': image_size,
+ 'is_toggled': is_toggled,
+ 'action_attr': image_attr,})
+
+ i = i + 1
+ self.num_toggled += is_toggled
+
+ is_runnable = self.create_bottom_buttons(self.buttonlist, self.toggled_image)
+
+ # Generated image files info
+ varlist = ["Name: ", "Files created: ", "Directory: "]
+ vallist = []
+
+ vallist.append(image_name.split('.')[0])
+ vallist.append(', '.join(fileitem['type'] for fileitem in self.image_store))
+ vallist.append(image_addr)
+
+ view_files_button = HobAltButton("View files")
+ view_files_button.connect("clicked", self.view_files_clicked_cb, image_addr)
+ view_files_button.set_tooltip_text("Open the directory containing the image files")
+ open_log_button = None
+ if log_file:
+ open_log_button = HobAltButton("Open log")
+ open_log_button.connect("clicked", self.open_log_clicked_cb, log_file)
+ open_log_button.set_tooltip_text("Open the build's log file")
+ self.image_detail = self.DetailBox(varlist=varlist, vallist=vallist, button=view_files_button, button2=open_log_button)
+ self.box_group_area.pack_start(self.image_detail, expand=False, fill=True)
+
+ # The default kernel box for the qemu images
+ self.sel_kernel = ""
+ self.kernel_detail = None
+ if 'qemu' in image_name:
+ self.sel_kernel = self.get_kernel_file_name()
+
+ # varlist = ["Kernel: "]
+ # vallist = []
+ # vallist.append(self.sel_kernel)
+
+ # change_kernel_button = HobAltButton("Change")
+ # change_kernel_button.connect("clicked", self.change_kernel_cb)
+ # change_kernel_button.set_tooltip_text("Change qemu kernel file")
+ # self.kernel_detail = self.DetailBox(varlist=varlist, vallist=vallist, button=change_kernel_button)
+ # self.box_group_area.pack_start(self.kernel_detail, expand=True, fill=True)
+
+ # Machine, Image recipe and Layers
+ layer_num_limit = 15
+ varlist = ["Machine: ", "Image recipe: ", "Layers: "]
+ vallist = []
+ self.setting_detail = None
+ if self.build_succeeded:
+ vallist.append(machine)
+ if self.builder.recipe_model.is_custom_image():
+ if self.builder.configuration.initial_selected_image == self.builder.recipe_model.__custom_image__:
+ base_image = "New image recipe"
+ else:
+ base_image = self.builder.configuration.initial_selected_image + " (edited)"
+ vallist.append(base_image)
+ i = 0
+ for layer in layers:
+ if i > layer_num_limit:
+ break
+ varlist.append(" - ")
+ i += 1
+ vallist.append("")
+ i = 0
+ for layer in layers:
+ if i > layer_num_limit:
+ break
+ elif i == layer_num_limit:
+ vallist.append("and more...")
+ else:
+ vallist.append(layer)
+ i += 1
+
+ edit_config_button = HobAltButton("Edit configuration")
+ edit_config_button.set_tooltip_text("Edit machine and image recipe")
+ edit_config_button.connect("clicked", self.edit_config_button_clicked_cb)
+ self.setting_detail = self.DetailBox(varlist=varlist, vallist=vallist, button=edit_config_button)
+ self.box_group_area.pack_start(self.setting_detail, expand=True, fill=True)
+
+ # Packages included, and Total image size
+ varlist = ["Packages included: ", "Total image size: "]
+ vallist = []
+ vallist.append(pkg_num)
+ vallist.append(default_image_size)
+ self.builder.configuration.image_size = default_image_size
+ self.builder.configuration.image_packages = self.builder.configuration.selected_packages
+ if self.build_succeeded:
+ edit_packages_button = HobAltButton("Edit packages")
+ edit_packages_button.set_tooltip_text("Edit the packages included in your image")
+ edit_packages_button.connect("clicked", self.edit_packages_button_clicked_cb)
+ else: # get to this page from "My images"
+ edit_packages_button = None
+ self.package_detail = self.DetailBox(varlist=varlist, vallist=vallist, button=edit_packages_button)
+ self.box_group_area.pack_start(self.package_detail, expand=True, fill=True)
+
+ # pack the buttons at the bottom, at this time they are already created.
+ if self.build_succeeded:
+ self.box_group_area.pack_end(self.details_bottom_buttons, expand=False, fill=False)
+ else: # for "My images" page
+ self.details_separator = gtk.HSeparator()
+ self.box_group_area.pack_start(self.details_separator, expand=False, fill=False)
+ self.box_group_area.pack_start(self.details_bottom_buttons, expand=False, fill=False)
+
+ self.show_all()
+ if self.kernel_detail and (not is_runnable):
+ self.kernel_detail.hide()
+ self.image_saved = False
+
+ def view_files_clicked_cb(self, button, image_addr):
+ subprocess.call("xdg-open /%s" % image_addr, shell=True)
+
+ def open_log_clicked_cb(self, button, log_file):
+ if log_file:
+ log_file = "file:///" + log_file
+ gtk.show_uri(screen=button.get_screen(), uri=log_file, timestamp=0)
+
+ def refresh_package_detail_box(self, image_size):
+ self.package_detail.update_line_widgets("Total image size: ", image_size)
+
+ def test_type_runnable(self, image_name):
+ type_runnable = False
+ for t in self.builder.parameters.runnable_image_types:
+ if image_name.endswith(t):
+ type_runnable = True
+ break
+ return type_runnable
+
+ def test_mach_runnable(self, image_name):
+ mach_runnable = False
+ for t in self.builder.parameters.runnable_machine_patterns:
+ if t in image_name:
+ mach_runnable = True
+ break
+ return mach_runnable
+
+ def test_deployable(self, image_name):
+ if self.builder.configuration.curr_mach.startswith("qemu"):
+ return False
+ deployable = False
+ for t in self.builder.parameters.deployable_image_types:
+ if image_name.endswith(t):
+ deployable = True
+ break
+ return deployable
+
+ def get_kernel_file_name(self, kernel_addr=""):
+ kernel_name = ""
+
+ if not kernel_addr:
+ kernel_addr = self.builder.parameters.image_addr
+
+ files = [f for f in os.listdir(kernel_addr) if f[0] != '.']
+ for check_file in files:
+ if check_file.endswith(".bin"):
+ name_splits = check_file.split(".")[0]
+ if self.builder.parameters.kernel_image_type in name_splits.split("-"):
+ kernel_name = check_file
+ break
+
+ return kernel_name
+
+ def show_builded_images_dialog(self, widget, primary_action=""):
+ title = primary_action if primary_action else "Your built images"
+ dialog = CrumbsDialog(title, self.builder,
+ gtk.DIALOG_MODAL | gtk.DIALOG_DESTROY_WITH_PARENT)
+ dialog.set_border_width(12)
+
+ label = gtk.Label()
+ label.set_use_markup(True)
+ label.set_alignment(0.0, 0.5)
+ label.set_padding(12,0)
+ if primary_action == "Run image":
+ label.set_markup("<span font_desc='12'>Select the image file you want to run:</span>")
+ elif primary_action == "Deploy image":
+ label.set_markup("<span font_desc='12'>Select the image file you want to deploy:</span>")
+ else:
+ label.set_markup("<span font_desc='12'>Select the image file you want to %s</span>" % primary_action)
+ dialog.vbox.pack_start(label, expand=False, fill=False)
+
+ # filter created images as action attribution (deploy or run)
+ action_attr = ""
+ action_images = []
+ for fileitem in self.image_store:
+ action_attr = fileitem['action_attr']
+ if (action_attr == 'run' and primary_action == "Run image") \
+ or (action_attr == 'deploy' and primary_action == "Deploy image"):
+ action_images.append(fileitem)
+
+ # pack the corresponding 'runnable' or 'deploy' radio buttons when there is more than one file.
+ # By design we assume a single build result never contains both 'deploy' and 'runnable' files.
+ curr_row = 0
+ rows = (len(action_images)) if len(action_images) < 10 else 10
+ table = gtk.Table(rows, 10, True)
+ table.set_row_spacings(6)
+ table.set_col_spacing(0, 12)
+ table.set_col_spacing(5, 12)
+
+ sel_parent_btn = None
+ for fileitem in action_images:
+ sel_btn = gtk.RadioButton(sel_parent_btn, fileitem['type'])
+ sel_parent_btn = sel_btn if not sel_parent_btn else sel_parent_btn
+ sel_btn.set_active(fileitem['is_toggled'])
+ sel_btn.connect('toggled', self.table_selected_cb, fileitem)
+ if curr_row < 10:
+ table.attach(sel_btn, 0, 4, curr_row, curr_row + 1, xpadding=24)
+ else:
+ table.attach(sel_btn, 5, 9, curr_row - 10, curr_row - 9, xpadding=24)
+ curr_row += 1
+
+ dialog.vbox.pack_start(table, expand=False, fill=False, padding=6)
+
+ button = dialog.add_button("Cancel", gtk.RESPONSE_CANCEL)
+ HobAltButton.style_button(button)
+
+ if primary_action:
+ button = dialog.add_button(primary_action, gtk.RESPONSE_YES)
+ HobButton.style_button(button)
+
+ dialog.show_all()
+
+ response = dialog.run()
+ dialog.destroy()
+
+ if response != gtk.RESPONSE_YES:
+ return
+
+ for fileitem in self.image_store:
+ if fileitem['is_toggled']:
+ if fileitem['action_attr'] == 'run':
+ self.builder.runqemu_image(fileitem['name'], self.sel_kernel)
+ elif fileitem['action_attr'] == 'deploy':
+ self.builder.deploy_image(fileitem['name'])
+
+ def table_selected_cb(self, tbutton, image):
+ image['is_toggled'] = tbutton.get_active()
+ if image['is_toggled']:
+ self.toggled_image = image['name']
+
+ def change_kernel_cb(self, widget):
+ kernel_path = self.builder.show_load_kernel_dialog()
+ if kernel_path and self.kernel_detail:
+ import os.path
+ self.sel_kernel = os.path.basename(kernel_path)
+ markup = self.kernel_detail.format_line("Kernel: ", self.sel_kernel)
+ label = ((self.kernel_detail.get_children()[0]).get_children()[0]).get_children()[0]
+ label.set_markup(markup)
+
+ def create_bottom_buttons(self, buttonlist, image_name):
+ # Create the buttons at the bottom
+ created = False
+ packed = False
+ self.button_ids = {}
+ is_runnable = False
+
+ # create button "Deploy image"
+ name = "Deploy image"
+ if name in buttonlist and self.test_deployable(image_name):
+ deploy_button = HobButton('Deploy image')
+ #deploy_button.set_size_request(205, 49)
+ deploy_button.set_tooltip_text("Burn a live image to a USB drive or flash memory")
+ deploy_button.set_flags(gtk.CAN_DEFAULT)
+ button_id = deploy_button.connect("clicked", self.deploy_button_clicked_cb)
+ self.button_ids[button_id] = deploy_button
+ self.details_bottom_buttons.pack_end(deploy_button, expand=False, fill=False)
+ created = True
+ packed = True
+
+ name = "Run image"
+ if name in buttonlist and self.test_type_runnable(image_name) and self.test_mach_runnable(image_name):
+ if created == True:
+ # separator
+ #label = gtk.Label(" or ")
+ #self.details_bottom_buttons.pack_end(label, expand=False, fill=False)
+
+ # create button "Run image"
+ run_button = HobAltButton("Run image")
+ else:
+ # create button "Run image" as the primary button
+ run_button = HobButton("Run image")
+ #run_button.set_size_request(205, 49)
+ run_button.set_flags(gtk.CAN_DEFAULT)
+ packed = True
+ run_button.set_tooltip_text("Start up an image with qemu emulator")
+ button_id = run_button.connect("clicked", self.run_button_clicked_cb)
+ self.button_ids[button_id] = run_button
+ self.details_bottom_buttons.pack_end(run_button, expand=False, fill=False)
+ created = True
+ is_runnable = True
+
+ name = "Save image recipe"
+ if name in buttonlist and self.builder.recipe_model.is_custom_image():
+ save_button = HobAltButton("Save image recipe")
+ save_button.set_tooltip_text("Keep your changes by saving them as an image recipe")
+ save_button.set_sensitive(not self.image_saved)
+ button_id = save_button.connect("clicked", self.save_button_clicked_cb)
+ self.button_ids[button_id] = save_button
+ self.details_bottom_buttons.pack_end(save_button, expand=False, fill=False)
+
+ name = "Build new image"
+ if name in buttonlist:
+ # create button "Build new image"
+ if packed:
+ build_new_button = HobAltButton("Build new image")
+ else:
+ build_new_button = HobButton("Build new image")
+ build_new_button.set_flags(gtk.CAN_DEFAULT)
+ #build_new_button.set_size_request(205, 49)
+ self.details_bottom_buttons.pack_end(build_new_button, expand=False, fill=False)
+ build_new_button.set_tooltip_text("Create a new image from scratch")
+ button_id = build_new_button.connect("clicked", self.build_new_button_clicked_cb)
+ self.button_ids[button_id] = build_new_button
+
+ return is_runnable
+
+ def deploy_button_clicked_cb(self, button):
+ if self.toggled_image:
+ if self.num_toggled > 1:
+ self.set_sensitive(False)
+ self.show_builded_images_dialog(None, "Deploy image")
+ self.set_sensitive(True)
+ else:
+ self.builder.deploy_image(self.toggled_image)
+
+ def run_button_clicked_cb(self, button):
+ if self.toggled_image:
+ if self.num_toggled > 1:
+ self.set_sensitive(False)
+ self.show_builded_images_dialog(None, "Run image")
+ self.set_sensitive(True)
+ else:
+ self.builder.runqemu_image(self.toggled_image, self.sel_kernel)
+
+ def save_button_clicked_cb(self, button):
+ topdir = self.builder.get_topdir()
+ images_dir = topdir + "/recipes/images/custom/"
+ self.builder.ensure_dir(images_dir)
+
+ self.name_field_template = self.builder.image_configuration_page.custom_image_selected
+ if self.name_field_template:
+ image_path = self.builder.recipe_model.pn_path[self.name_field_template]
+ image_iter = self.builder.recipe_model.get_iter(image_path)
+ self.description_field_template = self.builder.recipe_model.get_value(image_iter, self.builder.recipe_model.COL_DESC)
+ else:
+ self.name_field_template = ""
+
+ dialog = SaveImageDialog(images_dir, self.name_field_template, self.description_field_template,
+ "Save image recipe", self.builder, gtk.DIALOG_MODAL | gtk.DIALOG_DESTROY_WITH_PARENT)
+ response = dialog.run()
+ dialog.destroy()
+
+ def build_new_button_clicked_cb(self, button):
+ self.builder.initiate_new_build_async()
+
+ def edit_config_button_clicked_cb(self, button):
+ self.builder.show_configuration()
+
+ def edit_packages_button_clicked_cb(self, button):
+ self.builder.show_packages()
+
+ def my_images_button_clicked_cb(self, button):
+ self.builder.show_load_my_images_dialog()
+
+ def settings_button_clicked_cb(self, button):
+ # Create an advanced settings dialog
+ response, settings_changed = self.builder.show_simple_settings_dialog()
+ if not response:
+ return
+ if settings_changed:
+ self.builder.reparse_post_adv_settings()
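show_page() above tags each generated file with an action attribute: a file is runnable when its suffix matches one of the runnable image types and its name matches a runnable machine pattern, and deployable when its suffix matches a deployable image type (qemu machines are never offered the deploy action). A small standalone sketch of that classification, where the type and pattern lists are illustrative assumptions rather than the values Hob reads from its build parameters:

    # Illustrative stand-ins for the builder.parameters.* lists
    RUNNABLE_IMAGE_TYPES = ["ext3", "ext4"]
    RUNNABLE_MACHINE_PATTERNS = ["qemu"]
    DEPLOYABLE_IMAGE_TYPES = ["hddimg", "vmdk"]

    def classify(image_name, current_machine):
        type_runnable = any(image_name.endswith(t) for t in RUNNABLE_IMAGE_TYPES)
        mach_runnable = any(p in image_name for p in RUNNABLE_MACHINE_PATTERNS)
        if type_runnable and mach_runnable:
            return "run"
        # deployment is never offered for qemu machines
        if not current_machine.startswith("qemu"):
            if any(image_name.endswith(t) for t in DEPLOYABLE_IMAGE_TYPES):
                return "deploy"
        return ""

    print(classify("core-image-minimal-qemux86.ext4", "qemux86"))       # run
    print(classify("core-image-sato-genericx86.hddimg", "genericx86"))  # deploy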
diff --git a/bitbake/lib/bb/ui/crumbs/packageselectionpage.py b/bitbake/lib/bb/ui/crumbs/packageselectionpage.py
new file mode 100755
index 0000000..7c62b36
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/packageselectionpage.py
@@ -0,0 +1,355 @@
+#!/usr/bin/env python
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2012 Intel Corporation
+#
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+import glib
+from bb.ui.crumbs.hobcolor import HobColors
+from bb.ui.crumbs.hobwidget import HobViewTable, HobNotebook, HobAltButton, HobButton
+from bb.ui.crumbs.hoblistmodel import PackageListModel
+from bb.ui.crumbs.hobpages import HobPage
+
+#
+# PackageSelectionPage
+#
+class PackageSelectionPage (HobPage):
+
+ pages = [
+ {
+ 'name' : 'Included packages',
+ 'tooltip' : 'The packages currently included for your image',
+ 'filter' : { PackageListModel.COL_INC : [True] },
+ 'search' : 'Search packages by name',
+ 'searchtip' : 'Enter a package name to find it',
+ 'columns' : [{
+ 'col_name' : 'Package name',
+ 'col_id' : PackageListModel.COL_NAME,
+ 'col_style': 'text',
+ 'col_min' : 100,
+ 'col_max' : 300,
+ 'expand' : 'True'
+ }, {
+ 'col_name' : 'Size',
+ 'col_id' : PackageListModel.COL_SIZE,
+ 'col_style': 'text',
+ 'col_min' : 100,
+ 'col_max' : 300,
+ 'expand' : 'True'
+ }, {
+ 'col_name' : 'Recipe',
+ 'col_id' : PackageListModel.COL_RCP,
+ 'col_style': 'text',
+ 'col_min' : 100,
+ 'col_max' : 250,
+ 'expand' : 'True'
+ }, {
+ 'col_name' : 'Brought in by (+others)',
+ 'col_id' : PackageListModel.COL_BINB,
+ 'col_style': 'binb',
+ 'col_min' : 100,
+ 'col_max' : 350,
+ 'expand' : 'True'
+ }, {
+ 'col_name' : 'Included',
+ 'col_id' : PackageListModel.COL_INC,
+ 'col_style': 'check toggle',
+ 'col_min' : 100,
+ 'col_max' : 100
+ }]
+ }, {
+ 'name' : 'All packages',
+ 'tooltip' : 'All packages that have been built',
+ 'filter' : {},
+ 'search' : 'Search packages by name',
+ 'searchtip' : 'Enter a package name to find it',
+ 'columns' : [{
+ 'col_name' : 'Package name',
+ 'col_id' : PackageListModel.COL_NAME,
+ 'col_style': 'text',
+ 'col_min' : 100,
+ 'col_max' : 400,
+ 'expand' : 'True'
+ }, {
+ 'col_name' : 'Size',
+ 'col_id' : PackageListModel.COL_SIZE,
+ 'col_style': 'text',
+ 'col_min' : 100,
+ 'col_max' : 500,
+ 'expand' : 'True'
+ }, {
+ 'col_name' : 'Recipe',
+ 'col_id' : PackageListModel.COL_RCP,
+ 'col_style': 'text',
+ 'col_min' : 100,
+ 'col_max' : 250,
+ 'expand' : 'True'
+ }, {
+ 'col_name' : 'Included',
+ 'col_id' : PackageListModel.COL_INC,
+ 'col_style': 'check toggle',
+ 'col_min' : 100,
+ 'col_max' : 100
+ }]
+ }
+ ]
+
+ (INCLUDED,
+ ALL) = range(2)
+
+ def __init__(self, builder):
+ super(PackageSelectionPage, self).__init__(builder, "Edit packages")
+
+ # set invisible members
+ self.recipe_model = self.builder.recipe_model
+ self.package_model = self.builder.package_model
+
+ # create visual elements
+ self.create_visual_elements()
+
+ def included_clicked_cb(self, button):
+ self.ins.set_current_page(self.INCLUDED)
+
+ def create_visual_elements(self):
+ self.label = gtk.Label("Packages included: 0\nSelected packages size: 0 MB")
+ self.eventbox = self.add_onto_top_bar(self.label, 73)
+ self.pack_start(self.eventbox, expand=False, fill=False)
+ self.pack_start(self.group_align, expand=True, fill=True)
+
+ # set visible members
+ self.ins = HobNotebook()
+ self.tables = [] # we need to modify table when the dialog is shown
+
+ search_names = []
+ search_tips = []
+ # append the tab
+ for page in self.pages:
+ columns = page['columns']
+ name = page['name']
+ tab = HobViewTable(columns, name)
+ search_names.append(page['search'])
+ search_tips.append(page['searchtip'])
+ filter = page['filter']
+ sort_model = self.package_model.tree_model(filter, initial=True)
+ tab.set_model(sort_model)
+ tab.connect("toggled", self.table_toggled_cb, name)
+ tab.connect("button-release-event", self.button_click_cb)
+ tab.connect("cell-fadeinout-stopped", self.after_fadeout_checkin_include, filter)
+ self.ins.append_page(tab, page['name'], page['tooltip'])
+ self.tables.append(tab)
+
+ self.ins.set_entry(search_names, search_tips)
+ self.ins.search.connect("changed", self.search_entry_changed)
+
+ # add all into the dialog
+ self.box_group_area.pack_start(self.ins, expand=True, fill=True)
+
+ self.button_box = gtk.HBox(False, 6)
+ self.box_group_area.pack_start(self.button_box, expand=False, fill=False)
+
+ self.build_image_button = HobButton('Build image')
+ #self.build_image_button.set_size_request(205, 49)
+ self.build_image_button.set_tooltip_text("Build target image")
+ self.build_image_button.set_flags(gtk.CAN_DEFAULT)
+ self.build_image_button.grab_default()
+ self.build_image_button.connect("clicked", self.build_image_clicked_cb)
+ self.button_box.pack_end(self.build_image_button, expand=False, fill=False)
+
+ self.back_button = HobAltButton('Cancel')
+ self.back_button.connect("clicked", self.back_button_clicked_cb)
+ self.button_box.pack_end(self.back_button, expand=False, fill=False)
+
+ def search_entry_changed(self, entry):
+ text = entry.get_text()
+ if self.ins.search_focus:
+ self.ins.search_focus = False
+ elif self.ins.page_changed:
+ self.ins.page_changed = False
+ self.filter_search(entry)
+ elif text not in self.ins.search_names:
+ self.filter_search(entry)
+
+ def filter_search(self, entry):
+ text = entry.get_text()
+ current_tab = self.ins.get_current_page()
+ filter = self.pages[current_tab]['filter']
+ filter[PackageListModel.COL_NAME] = text
+ self.tables[current_tab].set_model(self.package_model.tree_model(filter, search_data=text))
+ if self.package_model.filtered_nb == 0:
+ if not self.ins.get_nth_page(current_tab).top_bar:
+ self.ins.get_nth_page(current_tab).add_no_result_bar(entry)
+ self.ins.get_nth_page(current_tab).top_bar.set_no_show_all(True)
+ self.ins.get_nth_page(current_tab).top_bar.show()
+ self.ins.get_nth_page(current_tab).scroll.hide()
+ else:
+ if self.ins.get_nth_page(current_tab).top_bar:
+ self.ins.get_nth_page(current_tab).top_bar.hide()
+ self.ins.get_nth_page(current_tab).scroll.show()
+ if entry.get_text() == '':
+ entry.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, False)
+ else:
+ entry.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, True)
+
+ def button_click_cb(self, widget, event):
+ path, col = widget.table_tree.get_cursor()
+ tree_model = widget.table_tree.get_model()
+ if path and col.get_title() != 'Included': # else activation is likely a removal
+ properties = {'binb': '' , 'name': '', 'size':'', 'recipe':'', 'files_list':''}
+ properties['binb'] = tree_model.get_value(tree_model.get_iter(path), PackageListModel.COL_BINB)
+ properties['name'] = tree_model.get_value(tree_model.get_iter(path), PackageListModel.COL_NAME)
+ properties['size'] = tree_model.get_value(tree_model.get_iter(path), PackageListModel.COL_SIZE)
+ properties['recipe'] = tree_model.get_value(tree_model.get_iter(path), PackageListModel.COL_RCP)
+ properties['files_list'] = tree_model.get_value(tree_model.get_iter(path), PackageListModel.COL_FLIST)
+
+ self.builder.show_recipe_property_dialog(properties)
+
+ def open_log_clicked_cb(self, button, log_file):
+ if log_file:
+ log_file = "file:///" + log_file
+ gtk.show_uri(screen=button.get_screen(), uri=log_file, timestamp=0)
+
+ def show_page(self, log_file):
+ children = self.button_box.get_children() or []
+ for child in children:
+ self.button_box.remove(child)
+ # re-pack the buttons as requested; add the 'Open log' button if the build succeeded
+ self.button_box.pack_end(self.build_image_button, expand=False, fill=False)
+ if log_file:
+ open_log_button = HobAltButton("Open log")
+ open_log_button.connect("clicked", self.open_log_clicked_cb, log_file)
+ open_log_button.set_tooltip_text("Open the build's log file")
+ self.button_box.pack_end(open_log_button, expand=False, fill=False)
+ self.button_box.pack_end(self.back_button, expand=False, fill=False)
+ self.show_all()
+
+ def build_image_clicked_cb(self, button):
+ self.builder.parsing_warnings = []
+ self.builder.build_image()
+
+ def refresh_tables(self):
+ self.ins.reset_entry(self.ins.search, 0)
+ for tab in self.tables:
+ index = self.tables.index(tab)
+ filter = self.pages[index]['filter']
+ tab.set_model(self.package_model.tree_model(filter, initial=True))
+
+ def back_button_clicked_cb(self, button):
+ if self.builder.previous_step == self.builder.IMAGE_GENERATED:
+ self.builder.restore_initial_selected_packages()
+ self.refresh_selection()
+ self.builder.show_image_details()
+ else:
+ self.builder.show_configuration()
+ self.refresh_tables()
+
+ def refresh_selection(self):
+ self.builder.configuration.selected_packages = self.package_model.get_selected_packages()
+ self.builder.configuration.user_selected_packages = self.package_model.get_user_selected_packages()
+ selected_packages_num = len(self.builder.configuration.selected_packages)
+ selected_packages_size = self.package_model.get_packages_size()
+ selected_packages_size_str = HobPage._size_to_string(selected_packages_size)
+
+ if self.builder.configuration.image_packages == self.builder.configuration.selected_packages:
+ image_total_size_str = self.builder.configuration.image_size
+ else:
+ image_overhead_factor = self.builder.configuration.image_overhead_factor
+ image_rootfs_size = self.builder.configuration.image_rootfs_size / 1024 # image_rootfs_size is KB
+ image_extra_size = self.builder.configuration.image_extra_size / 1024 # image_extra_size is KB
+ base_size = image_overhead_factor * selected_packages_size
+ image_total_size = max(base_size, image_rootfs_size) + image_extra_size
+ if "zypper" in self.builder.configuration.selected_packages:
+ image_total_size += (51200 * 1024)
+ image_total_size_str = HobPage._size_to_string(image_total_size)
+
+ self.label.set_label("Packages included: %s\nSelected packages size: %s\nEstimated image size: %s" %
+ (selected_packages_num, selected_packages_size_str, image_total_size_str))
+ self.ins.show_indicator_icon("Included packages", selected_packages_num)
+
+ def toggle_item_idle_cb(self, path, view_tree, cell, pagename):
+ if not self.package_model.path_included(path):
+ self.package_model.include_item(item_path=path, binb="User Selected")
+ else:
+ self.pre_fadeout_checkout_include(view_tree)
+ self.package_model.exclude_item(item_path=path)
+ self.render_fadeout(view_tree, cell)
+
+ self.refresh_selection()
+ if not self.builder.customized:
+ self.builder.customized = True
+ self.builder.set_base_image()
+ self.builder.configuration.selected_image = self.recipe_model.__custom_image__
+ self.builder.rcppkglist_populated()
+
+ self.builder.window_sensitive(True)
+ view_model = view_tree.get_model()
+ vpath = self.package_model.convert_path_to_vpath(view_model, path)
+ view_tree.set_cursor(vpath)
+
+ def table_toggled_cb(self, table, cell, view_path, toggled_columnid, view_tree, pagename):
+ # Click to include a package
+ self.builder.window_sensitive(False)
+ view_model = view_tree.get_model()
+ path = self.package_model.convert_vpath_to_path(view_model, view_path)
+ glib.idle_add(self.toggle_item_idle_cb, path, view_tree, cell, pagename)
+
+ def pre_fadeout_checkout_include(self, tree):
+ #after the fadeout the table will be sorted as before
+ self.sort_column_id = self.package_model.sort_column_id
+ self.sort_order = self.package_model.sort_order
+
+ self.package_model.resync_fadeout_column(self.package_model.get_iter_first())
+ # Check out a model based on the column COL_FADE_INC,
+ # which saves the previous state of column COL_INC before exclude_item is called
+ filter = { PackageListModel.COL_FADE_INC : [True]}
+ new_model = self.package_model.tree_model(filter, excluded_items_ahead=True)
+ tree.set_model(new_model)
+ tree.expand_all()
+
+ def get_excluded_rows(self, to_render_cells, model, it):
+ while it:
+ path = model.get_path(it)
+ prev_cell_is_active = model.get_value(it, PackageListModel.COL_FADE_INC)
+ curr_cell_is_active = model.get_value(it, PackageListModel.COL_INC)
+ if (prev_cell_is_active == True) and (curr_cell_is_active == False):
+ to_render_cells.append(path)
+ if model.iter_has_child(it):
+ self.get_excluded_rows(to_render_cells, model, model.iter_children(it))
+ it = model.iter_next(it)
+
+ return to_render_cells
+
+ def render_fadeout(self, tree, cell):
+ if (not cell) or (not tree):
+ return
+ to_render_cells = []
+ view_model = tree.get_model()
+ self.get_excluded_rows(to_render_cells, view_model, view_model.get_iter_first())
+
+ cell.fadeout(tree, 1000, to_render_cells)
+
+ def after_fadeout_checkin_include(self, table, ctrl, cell, tree, filter):
+ self.package_model.sort_column_id = self.sort_column_id
+ self.package_model.sort_order = self.sort_order
+ tree.set_model(self.package_model.tree_model(filter))
+ tree.expand_all()
+
+ def set_packages_curr_tab(self, curr_page):
+ self.ins.set_current_page(curr_page)
+
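refresh_selection() above estimates the final image size from the current package selection: the rootfs is the larger of (image_overhead_factor x selected package size) and the configured minimum rootfs size, plus the configured extra space, with a fixed allowance added when the zypper package manager is selected. A minimal sketch of that arithmetic; the default values and KB units below are illustrative, not the figures Hob reads from its configuration:

    def estimate_image_size_kb(selected_packages_size_kb,
                               image_overhead_factor=1.3,
                               image_rootfs_size_kb=8192,
                               image_extra_size_kb=10240,
                               zypper_selected=False):
        base_size = image_overhead_factor * selected_packages_size_kb
        total = max(base_size, image_rootfs_size_kb) + image_extra_size_kb
        if zypper_selected:
            total += 51200  # roughly 50 MB of extra space for zypper
        return total

    print(estimate_image_size_kb(20000))                        # 36240.0
    print(estimate_image_size_kb(20000, zypper_selected=True))  # 87440.0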
diff --git a/bitbake/lib/bb/ui/crumbs/persistenttooltip.py b/bitbake/lib/bb/ui/crumbs/persistenttooltip.py
new file mode 100644
index 0000000..927c194
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/persistenttooltip.py
@@ -0,0 +1,186 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2012 Intel Corporation
+#
+# Authored by Joshua Lock <josh@linux.intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gobject
+import gtk
+try:
+ import gconf
+except ImportError:
+ pass
+
+class PersistentTooltip(gtk.Window):
+ """
+ A tooltip which persists once shown until the user dismisses it with the Esc
+ key or by clicking the close button.
+
+ # FIXME: the PersistentTooltip should be disabled when the user clicks anywhere off
+ # it. We can't do this with focus-out-event because modal ensures we have focus?
+
+ markup: some Pango text markup to display in the tooltip
+ """
+ def __init__(self, markup, parent_win=None):
+ gtk.Window.__init__(self, gtk.WINDOW_POPUP)
+
+ # Inherit the system theme for a tooltip
+ style = gtk.rc_get_style_by_paths(gtk.settings_get_default(),
+ 'gtk-tooltip', 'gtk-tooltip', gobject.TYPE_NONE)
+ self.set_style(style)
+
+ # The placement of the close button on the tip should reflect how the
+ # window manager of the user's system places close buttons. Try to read
+ # the metacity gconf key to determine whether the close button is on the
+ # left or the right.
+ # In the case that we can't determine the user's configuration we default
+ # to close buttons being on the right.
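+ # For example (illustrative values only), a button_layout such as
+ # "close,minimize,maximize:" puts the close button on the left, while
+ # ":minimize,maximize,close" puts it on the right.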
+ __button_right = True
+ try:
+ client = gconf.client_get_default()
+ order = client.get_string("/apps/metacity/general/button_layout")
+ if order and order.endswith(":"):
+ __button_right = False
+ except NameError:
+ pass
+
+ # We need to ensure we're only shown once
+ self.shown = False
+
+ # We don't want any WM decorations
+ self.set_decorated(False)
+ # We don't want to show in the taskbar or window switcher
+ self.set_skip_pager_hint(True)
+ self.set_skip_taskbar_hint(True)
+ # We must be modal to ensure we grab focus when presented from a gtk.Dialog
+ self.set_modal(True)
+
+ self.set_border_width(0)
+ self.set_position(gtk.WIN_POS_MOUSE)
+ self.set_opacity(0.95)
+
+ # Ensure a reasonable minimum size
+ self.set_geometry_hints(self, 100, 50)
+
+ # Set this window as a transient window for parent(main window)
+ if parent_win:
+ self.set_transient_for(parent_win)
+ self.set_destroy_with_parent(True)
+ # Draw our label and close buttons
+ hbox = gtk.HBox(False, 0)
+ hbox.show()
+ self.add(hbox)
+
+ img = gtk.Image()
+ img.set_from_stock(gtk.STOCK_CLOSE, gtk.ICON_SIZE_BUTTON)
+
+ self.button = gtk.Button()
+ self.button.set_image(img)
+ self.button.connect("clicked", self._dismiss_cb)
+ self.button.set_flags(gtk.CAN_DEFAULT)
+ self.button.grab_focus()
+ self.button.show()
+ vbox = gtk.VBox(False, 0)
+ vbox.show()
+ vbox.pack_start(self.button, False, False, 0)
+ if __button_right:
+ hbox.pack_end(vbox, True, True, 0)
+ else:
+ hbox.pack_start(vbox, True, True, 0)
+
+ self.set_default(self.button)
+
+ bin = gtk.HBox(True, 6)
+ bin.set_border_width(6)
+ bin.show()
+ self.label = gtk.Label()
+ self.label.set_line_wrap(True)
+ # We want to match the colours of the normal tooltips, as dictated by
+ # the user's gtk+-2.0 theme, wherever possible - on some systems this
+ # requires explicitly setting a fg_color for the label which matches the
+ # tooltip_fg_color
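+ # The gtk-color-scheme property is a newline-separated list of
+ # "name: colour" pairs, e.g. "tooltip_fg_color: #000000" (value illustrative).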
+ settings = gtk.settings_get_default()
+ colours = settings.get_property('gtk-color-scheme').split('\n')
+ # remove any empty lines; there's likely to be a trailing one after
+ # calling split on a dictionary-like string
+ colours = filter(None, colours)
+ for col in colours:
+ item, val = col.split(': ')
+ if item == 'tooltip_fg_color':
+ style = self.label.get_style()
+ style.fg[gtk.STATE_NORMAL] = gtk.gdk.color_parse(val)
+ self.label.set_style(style)
+ break # we only care for the tooltip_fg_color
+
+ self.label.set_markup(markup)
+ self.label.show()
+ bin.add(self.label)
+ hbox.pack_end(bin, True, True, 6)
+
+ # add the original URL display for user reference
+ if 'a href' in markup:
+ hbox.set_tooltip_text(self.get_markup_url(markup))
+ hbox.show()
+
+ self.connect("key-press-event", self._catch_esc_cb)
+
+ """
+ Callback when the PersistentTooltip's close button is clicked.
+ Hides the PersistentTooltip.
+ """
+ def _dismiss_cb(self, button):
+ self.hide()
+ return True
+
+ """
+ Callback when the Esc key is detected. Hides the PersistentTooltip.
+ """
+ def _catch_esc_cb(self, widget, event):
+ keyname = gtk.gdk.keyval_name(event.keyval)
+ if keyname == "Escape":
+ self.hide()
+ return True
+
+ """
+ Called to present the PersistentTooltip.
+ Overrides the superclass's show() method to include state tracking.
+ """
+ def show(self):
+ if not self.shown:
+ self.shown = True
+ gtk.Window.show(self)
+
+ """
+ Called to hide the PersistentTooltip.
+ Overrides the superclass's hide() method to include state tracking.
+ """
+ def hide(self):
+ self.shown = False
+ gtk.Window.hide(self)
+
+ """
+ Called to get the hyperlink URL from markup text.
+ """
+ def get_markup_url(self, markup):
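+ # e.g. for markup like '<a href="http://example.com/page">docs</a>' (illustrative)
+ # this returns "http://example.com/page"; with no http: link present it
+ # falls back to the bare "http:" prefix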
+ url = "http:"
+ if markup and type(markup) == str:
+ s = markup
+ if 'http:' in s:
+ import re
+ url = re.search('(http:[^,\\ "]+)', s).group(0)
+
+ return url
diff --git a/bitbake/lib/bb/ui/crumbs/progress.py b/bitbake/lib/bb/ui/crumbs/progress.py
new file mode 100644
index 0000000..1d28a11
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/progress.py
@@ -0,0 +1,23 @@
+import gtk
+
+class ProgressBar(gtk.Dialog):
+ def __init__(self, parent):
+
+ gtk.Dialog.__init__(self, flags=(gtk.DIALOG_MODAL | gtk.DIALOG_DESTROY_WITH_PARENT))
+ self.set_title("Parsing metadata, please wait...")
+ self.set_default_size(500, 0)
+ self.set_transient_for(parent)
+ self.progress = gtk.ProgressBar()
+ self.vbox.pack_start(self.progress)
+ self.show_all()
+
+ def set_text(self, msg):
+ self.progress.set_text(msg)
+
+ def update(self, x, y):
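+ # e.g. update(3, 12) sets the bar fraction to 0.25 and the text to "25 %"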
+ self.progress.set_fraction(float(x)/float(y))
+ self.progress.set_text("%2d %%" % (x*100/y))
+
+ def pulse(self):
+ self.progress.set_text("Loading...")
+ self.progress.pulse()
diff --git a/bitbake/lib/bb/ui/crumbs/progressbar.py b/bitbake/lib/bb/ui/crumbs/progressbar.py
new file mode 100644
index 0000000..3e2c660
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/progressbar.py
@@ -0,0 +1,59 @@
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011 Intel Corporation
+#
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+from bb.ui.crumbs.hobcolor import HobColors
+
+class HobProgressBar (gtk.ProgressBar):
+ def __init__(self):
+ gtk.ProgressBar.__init__(self)
+ self.set_rcstyle(True)
+ self.percentage = 0
+
+ def set_rcstyle(self, status):
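+ # The numeric indices below are gtk state constants (2 == gtk.STATE_PRELIGHT,
+ # 3 == gtk.STATE_SELECTED); 'status' may also be the boolean passed from
+ # __init__()/reset(), which falls through to the default (running) colour.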
+ rcstyle = gtk.RcStyle()
+ rcstyle.fg[2] = gtk.gdk.Color(HobColors.BLACK)
+ if status == "stop":
+ rcstyle.bg[3] = gtk.gdk.Color(HobColors.WARNING)
+ elif status == "fail":
+ rcstyle.bg[3] = gtk.gdk.Color(HobColors.ERROR)
+ else:
+ rcstyle.bg[3] = gtk.gdk.Color(HobColors.RUNNING)
+ self.modify_style(rcstyle)
+
+ def set_title(self, text=None):
+ if not text:
+ text = ""
+ text += " %.0f%%" % self.percentage
+ self.set_text(text)
+
+ def set_stop_title(self, text=None):
+ if not text:
+ text = ""
+ self.set_text(text)
+
+ def reset(self):
+ self.set_fraction(0)
+ self.set_text("")
+ self.set_rcstyle(True)
+ self.percentage = 0
+
+ def update(self, fraction):
+ self.percentage = int(fraction * 100)
+ self.set_fraction(fraction)
diff --git a/bitbake/lib/bb/ui/crumbs/puccho.glade b/bitbake/lib/bb/ui/crumbs/puccho.glade
new file mode 100644
index 0000000..d7553a6
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/puccho.glade
@@ -0,0 +1,606 @@
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<!DOCTYPE glade-interface SYSTEM "glade-2.0.dtd">
+<!--Generated with glade3 3.4.5 on Mon Nov 10 12:24:12 2008 -->
+<glade-interface>
+ <widget class="GtkDialog" id="build_dialog">
+ <property name="title" translatable="yes">Start a build</property>
+ <property name="window_position">GTK_WIN_POS_CENTER_ON_PARENT</property>
+ <property name="type_hint">GDK_WINDOW_TYPE_HINT_DIALOG</property>
+ <property name="has_separator">False</property>
+ <child internal-child="vbox">
+ <widget class="GtkVBox" id="dialog-vbox1">
+ <property name="visible">True</property>
+ <property name="spacing">2</property>
+ <child>
+ <widget class="GtkTable" id="build_table">
+ <property name="visible">True</property>
+ <property name="border_width">6</property>
+ <property name="n_rows">7</property>
+ <property name="n_columns">3</property>
+ <property name="column_spacing">5</property>
+ <property name="row_spacing">6</property>
+ <child>
+ <widget class="GtkAlignment" id="status_alignment">
+ <property name="visible">True</property>
+ <property name="left_padding">12</property>
+ <child>
+ <widget class="GtkHBox" id="status_hbox">
+ <property name="spacing">6</property>
+ <child>
+ <widget class="GtkImage" id="status_image">
+ <property name="visible">True</property>
+ <property name="no_show_all">True</property>
+ <property name="xalign">0</property>
+ <property name="stock">gtk-dialog-error</property>
+ </widget>
+ <packing>
+ <property name="expand">False</property>
+ <property name="fill">False</property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkLabel" id="status_label">
+ <property name="visible">True</property>
+ <property name="xalign">0</property>
+ <property name="label" translatable="yes">If you see this text something is wrong...</property>
+ <property name="use_markup">True</property>
+ <property name="use_underline">True</property>
+ </widget>
+ <packing>
+ <property name="position">1</property>
+ </packing>
+ </child>
+ </widget>
+ </child>
+ </widget>
+ <packing>
+ <property name="right_attach">3</property>
+ <property name="top_attach">2</property>
+ <property name="bottom_attach">3</property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkLabel" id="label2">
+ <property name="visible">True</property>
+ <property name="xalign">0</property>
+ <property name="label" translatable="yes"><b>Build configuration</b></property>
+ <property name="use_markup">True</property>
+ </widget>
+ <packing>
+ <property name="right_attach">3</property>
+ <property name="top_attach">3</property>
+ <property name="bottom_attach">4</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkComboBox" id="image_combo">
+ <property name="visible">True</property>
+ <property name="sensitive">False</property>
+ </widget>
+ <packing>
+ <property name="left_attach">1</property>
+ <property name="right_attach">2</property>
+ <property name="top_attach">6</property>
+ <property name="bottom_attach">7</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkLabel" id="image_label">
+ <property name="visible">True</property>
+ <property name="sensitive">False</property>
+ <property name="xalign">0</property>
+ <property name="xpad">12</property>
+ <property name="label" translatable="yes">Image:</property>
+ </widget>
+ <packing>
+ <property name="top_attach">6</property>
+ <property name="bottom_attach">7</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkComboBox" id="distribution_combo">
+ <property name="visible">True</property>
+ <property name="sensitive">False</property>
+ </widget>
+ <packing>
+ <property name="left_attach">1</property>
+ <property name="right_attach">2</property>
+ <property name="top_attach">5</property>
+ <property name="bottom_attach">6</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkLabel" id="distribution_label">
+ <property name="visible">True</property>
+ <property name="sensitive">False</property>
+ <property name="xalign">0</property>
+ <property name="xpad">12</property>
+ <property name="label" translatable="yes">Distribution:</property>
+ </widget>
+ <packing>
+ <property name="top_attach">5</property>
+ <property name="bottom_attach">6</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkComboBox" id="machine_combo">
+ <property name="visible">True</property>
+ <property name="sensitive">False</property>
+ </widget>
+ <packing>
+ <property name="left_attach">1</property>
+ <property name="right_attach">2</property>
+ <property name="top_attach">4</property>
+ <property name="bottom_attach">5</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkLabel" id="machine_label">
+ <property name="visible">True</property>
+ <property name="sensitive">False</property>
+ <property name="xalign">0</property>
+ <property name="xpad">12</property>
+ <property name="label" translatable="yes">Machine:</property>
+ </widget>
+ <packing>
+ <property name="top_attach">4</property>
+ <property name="bottom_attach">5</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkButton" id="refresh_button">
+ <property name="visible">True</property>
+ <property name="sensitive">False</property>
+ <property name="can_focus">True</property>
+ <property name="receives_default">True</property>
+ <property name="label" translatable="yes">gtk-refresh</property>
+ <property name="use_stock">True</property>
+ <property name="response_id">0</property>
+ </widget>
+ <packing>
+ <property name="left_attach">2</property>
+ <property name="right_attach">3</property>
+ <property name="top_attach">1</property>
+ <property name="bottom_attach">2</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkEntry" id="location_entry">
+ <property name="visible">True</property>
+ <property name="can_focus">True</property>
+ <property name="width_chars">32</property>
+ </widget>
+ <packing>
+ <property name="left_attach">1</property>
+ <property name="right_attach">2</property>
+ <property name="top_attach">1</property>
+ <property name="bottom_attach">2</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkLabel" id="label3">
+ <property name="visible">True</property>
+ <property name="xalign">0</property>
+ <property name="xpad">12</property>
+ <property name="label" translatable="yes">Location:</property>
+ </widget>
+ <packing>
+ <property name="top_attach">1</property>
+ <property name="bottom_attach">2</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkLabel" id="label1">
+ <property name="visible">True</property>
+ <property name="xalign">0</property>
+ <property name="label" translatable="yes"><b>Repository</b></property>
+ <property name="use_markup">True</property>
+ </widget>
+ <packing>
+ <property name="right_attach">3</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkAlignment" id="alignment1">
+ <property name="visible">True</property>
+ <child>
+ <placeholder/>
+ </child>
+ </widget>
+ <packing>
+ <property name="left_attach">2</property>
+ <property name="right_attach">3</property>
+ <property name="top_attach">4</property>
+ <property name="bottom_attach">5</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkAlignment" id="alignment2">
+ <property name="visible">True</property>
+ <child>
+ <placeholder/>
+ </child>
+ </widget>
+ <packing>
+ <property name="left_attach">2</property>
+ <property name="right_attach">3</property>
+ <property name="top_attach">5</property>
+ <property name="bottom_attach">6</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkAlignment" id="alignment3">
+ <property name="visible">True</property>
+ <child>
+ <placeholder/>
+ </child>
+ </widget>
+ <packing>
+ <property name="left_attach">2</property>
+ <property name="right_attach">3</property>
+ <property name="top_attach">6</property>
+ <property name="bottom_attach">7</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ </widget>
+ <packing>
+ <property name="position">1</property>
+ </packing>
+ </child>
+ <child internal-child="action_area">
+ <widget class="GtkHButtonBox" id="dialog-action_area1">
+ <property name="visible">True</property>
+ <property name="layout_style">GTK_BUTTONBOX_END</property>
+ <child>
+ <placeholder/>
+ </child>
+ <child>
+ <placeholder/>
+ </child>
+ <child>
+ <placeholder/>
+ </child>
+ </widget>
+ <packing>
+ <property name="expand">False</property>
+ <property name="pack_type">GTK_PACK_END</property>
+ </packing>
+ </child>
+ </widget>
+ </child>
+ </widget>
+ <widget class="GtkDialog" id="dialog2">
+ <property name="window_position">GTK_WIN_POS_CENTER_ON_PARENT</property>
+ <property name="type_hint">GDK_WINDOW_TYPE_HINT_DIALOG</property>
+ <property name="has_separator">False</property>
+ <child internal-child="vbox">
+ <widget class="GtkVBox" id="dialog-vbox2">
+ <property name="visible">True</property>
+ <property name="spacing">2</property>
+ <child>
+ <widget class="GtkTable" id="table2">
+ <property name="visible">True</property>
+ <property name="border_width">6</property>
+ <property name="n_rows">7</property>
+ <property name="n_columns">3</property>
+ <property name="column_spacing">6</property>
+ <property name="row_spacing">6</property>
+ <child>
+ <widget class="GtkLabel" id="label7">
+ <property name="visible">True</property>
+ <property name="xalign">0</property>
+ <property name="label" translatable="yes"><b>Repositories</b></property>
+ <property name="use_markup">True</property>
+ </widget>
+ <packing>
+ <property name="right_attach">3</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkAlignment" id="alignment4">
+ <property name="visible">True</property>
+ <property name="xalign">0</property>
+ <property name="left_padding">12</property>
+ <child>
+ <widget class="GtkScrolledWindow" id="scrolledwindow1">
+ <property name="visible">True</property>
+ <property name="can_focus">True</property>
+ <property name="hscrollbar_policy">GTK_POLICY_AUTOMATIC</property>
+ <property name="vscrollbar_policy">GTK_POLICY_AUTOMATIC</property>
+ <child>
+ <widget class="GtkTreeView" id="treeview1">
+ <property name="visible">True</property>
+ <property name="can_focus">True</property>
+ <property name="headers_clickable">True</property>
+ </widget>
+ </child>
+ </widget>
+ </child>
+ </widget>
+ <packing>
+ <property name="right_attach">3</property>
+ <property name="top_attach">2</property>
+ <property name="bottom_attach">3</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkEntry" id="entry1">
+ <property name="visible">True</property>
+ <property name="can_focus">True</property>
+ </widget>
+ <packing>
+ <property name="left_attach">1</property>
+ <property name="right_attach">3</property>
+ <property name="top_attach">1</property>
+ <property name="bottom_attach">2</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkLabel" id="label9">
+ <property name="visible">True</property>
+ <property name="xalign">0</property>
+ <property name="label" translatable="yes"><b>Additional packages</b></property>
+ <property name="use_markup">True</property>
+ </widget>
+ <packing>
+ <property name="right_attach">3</property>
+ <property name="top_attach">4</property>
+ <property name="bottom_attach">5</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkAlignment" id="alignment6">
+ <property name="visible">True</property>
+ <property name="xalign">0</property>
+ <property name="xscale">0</property>
+ <child>
+ <widget class="GtkLabel" id="label8">
+ <property name="visible">True</property>
+ <property name="xalign">0</property>
+ <property name="yalign">0</property>
+ <property name="xpad">12</property>
+ <property name="label" translatable="yes">Location: </property>
+ </widget>
+ </child>
+ </widget>
+ <packing>
+ <property name="top_attach">1</property>
+ <property name="bottom_attach">2</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkAlignment" id="alignment7">
+ <property name="visible">True</property>
+ <property name="xalign">1</property>
+ <property name="xscale">0</property>
+ <child>
+ <widget class="GtkHButtonBox" id="hbuttonbox1">
+ <property name="visible">True</property>
+ <property name="spacing">5</property>
+ <child>
+ <widget class="GtkButton" id="button7">
+ <property name="visible">True</property>
+ <property name="can_focus">True</property>
+ <property name="receives_default">True</property>
+ <property name="label" translatable="yes">gtk-remove</property>
+ <property name="use_stock">True</property>
+ <property name="response_id">0</property>
+ </widget>
+ </child>
+ <child>
+ <widget class="GtkButton" id="button6">
+ <property name="visible">True</property>
+ <property name="can_focus">True</property>
+ <property name="receives_default">True</property>
+ <property name="label" translatable="yes">gtk-edit</property>
+ <property name="use_stock">True</property>
+ <property name="response_id">0</property>
+ </widget>
+ <packing>
+ <property name="position">1</property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkButton" id="button5">
+ <property name="visible">True</property>
+ <property name="can_focus">True</property>
+ <property name="receives_default">True</property>
+ <property name="label" translatable="yes">gtk-add</property>
+ <property name="use_stock">True</property>
+ <property name="response_id">0</property>
+ </widget>
+ <packing>
+ <property name="position">2</property>
+ </packing>
+ </child>
+ </widget>
+ </child>
+ </widget>
+ <packing>
+ <property name="left_attach">1</property>
+ <property name="right_attach">3</property>
+ <property name="top_attach">3</property>
+ <property name="bottom_attach">4</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkAlignment" id="alignment5">
+ <property name="visible">True</property>
+ <child>
+ <placeholder/>
+ </child>
+ </widget>
+ <packing>
+ <property name="top_attach">3</property>
+ <property name="bottom_attach">4</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkLabel" id="label10">
+ <property name="visible">True</property>
+ <property name="xalign">0</property>
+ <property name="yalign">0</property>
+ <property name="xpad">12</property>
+ <property name="label" translatable="yes">Search:</property>
+ </widget>
+ <packing>
+ <property name="top_attach">5</property>
+ <property name="bottom_attach">6</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkEntry" id="entry2">
+ <property name="visible">True</property>
+ <property name="can_focus">True</property>
+ </widget>
+ <packing>
+ <property name="left_attach">1</property>
+ <property name="right_attach">3</property>
+ <property name="top_attach">5</property>
+ <property name="bottom_attach">6</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkAlignment" id="alignment8">
+ <property name="visible">True</property>
+ <property name="xalign">0</property>
+ <property name="left_padding">12</property>
+ <child>
+ <widget class="GtkScrolledWindow" id="scrolledwindow2">
+ <property name="visible">True</property>
+ <property name="can_focus">True</property>
+ <property name="hscrollbar_policy">GTK_POLICY_AUTOMATIC</property>
+ <property name="vscrollbar_policy">GTK_POLICY_AUTOMATIC</property>
+ <child>
+ <widget class="GtkTreeView" id="treeview2">
+ <property name="visible">True</property>
+ <property name="can_focus">True</property>
+ <property name="headers_clickable">True</property>
+ </widget>
+ </child>
+ </widget>
+ </child>
+ </widget>
+ <packing>
+ <property name="right_attach">3</property>
+ <property name="top_attach">6</property>
+ <property name="bottom_attach">7</property>
+ <property name="y_options"></property>
+ </packing>
+ </child>
+ </widget>
+ <packing>
+ <property name="position">1</property>
+ </packing>
+ </child>
+ <child internal-child="action_area">
+ <widget class="GtkHButtonBox" id="dialog-action_area2">
+ <property name="visible">True</property>
+ <property name="layout_style">GTK_BUTTONBOX_END</property>
+ <child>
+ <widget class="GtkButton" id="button4">
+ <property name="visible">True</property>
+ <property name="can_focus">True</property>
+ <property name="receives_default">True</property>
+ <property name="label" translatable="yes">gtk-close</property>
+ <property name="use_stock">True</property>
+ <property name="response_id">0</property>
+ </widget>
+ </child>
+ </widget>
+ <packing>
+ <property name="expand">False</property>
+ <property name="pack_type">GTK_PACK_END</property>
+ </packing>
+ </child>
+ </widget>
+ </child>
+ </widget>
+ <widget class="GtkWindow" id="main_window">
+ <child>
+ <widget class="GtkVBox" id="main_window_vbox">
+ <property name="visible">True</property>
+ <child>
+ <widget class="GtkToolbar" id="main_toolbar">
+ <property name="visible">True</property>
+ <child>
+ <widget class="GtkToolButton" id="main_toolbutton_build">
+ <property name="visible">True</property>
+ <property name="label" translatable="yes">Build</property>
+ <property name="stock_id">gtk-execute</property>
+ </widget>
+ <packing>
+ <property name="expand">False</property>
+ </packing>
+ </child>
+ </widget>
+ <packing>
+ <property name="expand">False</property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkVPaned" id="vpaned1">
+ <property name="visible">True</property>
+ <property name="can_focus">True</property>
+ <child>
+ <widget class="GtkScrolledWindow" id="results_scrolledwindow">
+ <property name="visible">True</property>
+ <property name="can_focus">True</property>
+ <property name="hscrollbar_policy">GTK_POLICY_AUTOMATIC</property>
+ <property name="vscrollbar_policy">GTK_POLICY_AUTOMATIC</property>
+ <child>
+ <placeholder/>
+ </child>
+ </widget>
+ <packing>
+ <property name="resize">False</property>
+ <property name="shrink">True</property>
+ </packing>
+ </child>
+ <child>
+ <widget class="GtkScrolledWindow" id="progress_scrolledwindow">
+ <property name="visible">True</property>
+ <property name="can_focus">True</property>
+ <property name="hscrollbar_policy">GTK_POLICY_AUTOMATIC</property>
+ <property name="vscrollbar_policy">GTK_POLICY_AUTOMATIC</property>
+ <child>
+ <placeholder/>
+ </child>
+ </widget>
+ <packing>
+ <property name="resize">True</property>
+ <property name="shrink">True</property>
+ </packing>
+ </child>
+ </widget>
+ <packing>
+ <property name="position">1</property>
+ </packing>
+ </child>
+ </widget>
+ </child>
+ </widget>
+</glade-interface>
diff --git a/bitbake/lib/bb/ui/crumbs/recipeselectionpage.py b/bitbake/lib/bb/ui/crumbs/recipeselectionpage.py
new file mode 100755
index 0000000..58db43f
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/recipeselectionpage.py
@@ -0,0 +1,335 @@
+#!/usr/bin/env python
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2012 Intel Corporation
+#
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+# Authored by Shane Wang <shane.wang@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+import glib
+from bb.ui.crumbs.hobcolor import HobColors
+from bb.ui.crumbs.hobwidget import HobViewTable, HobNotebook, HobAltButton, HobButton
+from bb.ui.crumbs.hoblistmodel import RecipeListModel
+from bb.ui.crumbs.hobpages import HobPage
+
+#
+# RecipeSelectionPage
+#
+class RecipeSelectionPage (HobPage):
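+ # Each entry in 'pages' below describes one tab of the notebook: its name and
+ # tooltip, the filter applied to the recipe model, the search entry hint text,
+ # and the column definitions (header, model column id, renderer style and
+ # min/max widths) used to build the HobViewTable for that tab.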
+ pages = [
+ {
+ 'name' : 'Included recipes',
+ 'tooltip' : 'The recipes currently included for your image',
+ 'filter' : { RecipeListModel.COL_INC : [True],
+ RecipeListModel.COL_TYPE : ['recipe', 'packagegroup'] },
+ 'search' : 'Search recipes by name',
+ 'searchtip' : 'Enter a recipe name to find it',
+ 'columns' : [{
+ 'col_name' : 'Recipe name',
+ 'col_id' : RecipeListModel.COL_NAME,
+ 'col_style': 'text',
+ 'col_min' : 100,
+ 'col_max' : 400,
+ 'expand' : 'True'
+ }, {
+ 'col_name' : 'Group',
+ 'col_id' : RecipeListModel.COL_GROUP,
+ 'col_style': 'text',
+ 'col_min' : 100,
+ 'col_max' : 300,
+ 'expand' : 'True'
+ }, {
+ 'col_name' : 'Brought in by (+others)',
+ 'col_id' : RecipeListModel.COL_BINB,
+ 'col_style': 'binb',
+ 'col_min' : 100,
+ 'col_max' : 500,
+ 'expand' : 'True'
+ }, {
+ 'col_name' : 'Included',
+ 'col_id' : RecipeListModel.COL_INC,
+ 'col_style': 'check toggle',
+ 'col_min' : 100,
+ 'col_max' : 100
+ }]
+ }, {
+ 'name' : 'All recipes',
+ 'tooltip' : 'All recipes in your configured layers',
+ 'filter' : { RecipeListModel.COL_TYPE : ['recipe'] },
+ 'search' : 'Search recipes by name',
+ 'searchtip' : 'Enter a recipe name to find it',
+ 'columns' : [{
+ 'col_name' : 'Recipe name',
+ 'col_id' : RecipeListModel.COL_NAME,
+ 'col_style': 'text',
+ 'col_min' : 100,
+ 'col_max' : 400,
+ 'expand' : 'True'
+ }, {
+ 'col_name' : 'Group',
+ 'col_id' : RecipeListModel.COL_GROUP,
+ 'col_style': 'text',
+ 'col_min' : 100,
+ 'col_max' : 400,
+ 'expand' : 'True'
+ }, {
+ 'col_name' : 'License',
+ 'col_id' : RecipeListModel.COL_LIC,
+ 'col_style': 'text',
+ 'col_min' : 100,
+ 'col_max' : 400,
+ 'expand' : 'True'
+ }, {
+ 'col_name' : 'Included',
+ 'col_id' : RecipeListModel.COL_INC,
+ 'col_style': 'check toggle',
+ 'col_min' : 100,
+ 'col_max' : 100
+ }]
+ }, {
+ 'name' : 'Package Groups',
+ 'tooltip' : 'All package groups in your configured layers',
+ 'filter' : { RecipeListModel.COL_TYPE : ['packagegroup'] },
+ 'search' : 'Search package groups by name',
+ 'searchtip' : 'Enter a package group name to find it',
+ 'columns' : [{
+ 'col_name' : 'Package group name',
+ 'col_id' : RecipeListModel.COL_NAME,
+ 'col_style': 'text',
+ 'col_min' : 100,
+ 'col_max' : 400,
+ 'expand' : 'True'
+ }, {
+ 'col_name' : 'Included',
+ 'col_id' : RecipeListModel.COL_INC,
+ 'col_style': 'check toggle',
+ 'col_min' : 100,
+ 'col_max' : 100
+ }]
+ }
+ ]
+
+ (INCLUDED,
+ ALL,
+ TASKS) = range(3)
+
+ def __init__(self, builder = None):
+ super(RecipeSelectionPage, self).__init__(builder, "Step 1 of 2: Edit recipes")
+
+ # set invisible members
+ self.recipe_model = self.builder.recipe_model
+
+ # create visual elements
+ self.create_visual_elements()
+
+ def included_clicked_cb(self, button):
+ self.ins.set_current_page(self.INCLUDED)
+
+ def create_visual_elements(self):
+ self.eventbox = self.add_onto_top_bar(None, 73)
+ self.pack_start(self.eventbox, expand=False, fill=False)
+ self.pack_start(self.group_align, expand=True, fill=True)
+
+ # set visible members
+ self.ins = HobNotebook()
+ self.tables = [] # we need modify table when the dialog is shown
+
+ search_names = []
+ search_tips = []
+ # append the tabs in order
+ for page in self.pages:
+ columns = page['columns']
+ name = page['name']
+ tab = HobViewTable(columns, name)
+ search_names.append(page['search'])
+ search_tips.append(page['searchtip'])
+ filter = page['filter']
+ sort_model = self.recipe_model.tree_model(filter, initial=True)
+ tab.set_model(sort_model)
+ tab.connect("toggled", self.table_toggled_cb, name)
+ tab.connect("button-release-event", self.button_click_cb)
+ tab.connect("cell-fadeinout-stopped", self.after_fadeout_checkin_include, filter)
+ self.ins.append_page(tab, page['name'], page['tooltip'])
+ self.tables.append(tab)
+
+ self.ins.set_entry(search_names, search_tips)
+ self.ins.search.connect("changed", self.search_entry_changed)
+
+ # add all into the window
+ self.box_group_area.pack_start(self.ins, expand=True, fill=True)
+
+ button_box = gtk.HBox(False, 6)
+ self.box_group_area.pack_end(button_box, expand=False, fill=False)
+
+ self.build_packages_button = HobButton('Build packages')
+ #self.build_packages_button.set_size_request(205, 49)
+ self.build_packages_button.set_tooltip_text("Build selected recipes into packages")
+ self.build_packages_button.set_flags(gtk.CAN_DEFAULT)
+ self.build_packages_button.grab_default()
+ self.build_packages_button.connect("clicked", self.build_packages_clicked_cb)
+ button_box.pack_end(self.build_packages_button, expand=False, fill=False)
+
+ self.back_button = HobAltButton('Cancel')
+ self.back_button.connect("clicked", self.back_button_clicked_cb)
+ button_box.pack_end(self.back_button, expand=False, fill=False)
+
+ def search_entry_changed(self, entry):
+ text = entry.get_text()
+ if self.ins.search_focus:
+ self.ins.search_focus = False
+ elif self.ins.page_changed:
+ self.ins.page_changed = False
+ self.filter_search(entry)
+ elif text not in self.ins.search_names:
+ self.filter_search(entry)
+
+ def filter_search(self, entry):
+ text = entry.get_text()
+ current_tab = self.ins.get_current_page()
+ filter = self.pages[current_tab]['filter']
+ filter[RecipeListModel.COL_NAME] = text
+ self.tables[current_tab].set_model(self.recipe_model.tree_model(filter, search_data=text))
+ if self.recipe_model.filtered_nb == 0:
+ if not self.ins.get_nth_page(current_tab).top_bar:
+ self.ins.get_nth_page(current_tab).add_no_result_bar(entry)
+ self.ins.get_nth_page(current_tab).top_bar.set_no_show_all(True)
+ self.ins.get_nth_page(current_tab).top_bar.show()
+ self.ins.get_nth_page(current_tab).scroll.hide()
+ else:
+ if self.ins.get_nth_page(current_tab).top_bar:
+ self.ins.get_nth_page(current_tab).top_bar.hide()
+ self.ins.get_nth_page(current_tab).scroll.show()
+ if entry.get_text() == '':
+ entry.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, False)
+ else:
+ entry.set_icon_sensitive(gtk.ENTRY_ICON_SECONDARY, True)
+
+ def button_click_cb(self, widget, event):
+ path, col = widget.table_tree.get_cursor()
+ tree_model = widget.table_tree.get_model()
+ if path and col.get_title() != 'Included': # else activation is likely a removal
+ properties = {'summary': '', 'name': '', 'version': '', 'revision': '', 'binb': '', 'group': '', 'license': '', 'homepage': '', 'bugtracker': '', 'description': ''}
+ properties['summary'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_SUMMARY)
+ properties['name'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_NAME)
+ properties['version'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_VERSION)
+ properties['revision'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_REVISION)
+ properties['binb'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_BINB)
+ properties['group'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_GROUP)
+ properties['license'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_LIC)
+ properties['homepage'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_HOMEPAGE)
+ properties['bugtracker'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_BUGTRACKER)
+ properties['description'] = tree_model.get_value(tree_model.get_iter(path), RecipeListModel.COL_DESC)
+ self.builder.show_recipe_property_dialog(properties)
+
+ def build_packages_clicked_cb(self, button):
+ self.refresh_tables()
+ self.builder.build_packages()
+
+ def refresh_tables(self):
+ self.ins.reset_entry(self.ins.search, 0)
+ for tab in self.tables:
+ index = self.tables.index(tab)
+ filter = self.pages[index]['filter']
+ tab.set_model(self.recipe_model.tree_model(filter, search_data="", initial=True))
+
+ def back_button_clicked_cb(self, button):
+ self.builder.recipe_model.set_selected_image(self.builder.configuration.initial_selected_image)
+ self.builder.image_configuration_page.update_image_combo(self.builder.recipe_model, self.builder.configuration.initial_selected_image)
+ self.builder.image_configuration_page.update_image_desc()
+ self.builder.show_configuration()
+ self.refresh_tables()
+
+ def refresh_selection(self):
+ self.builder.configuration.selected_image = self.recipe_model.get_selected_image()
+ _, self.builder.configuration.selected_recipes = self.recipe_model.get_selected_recipes()
+ self.ins.show_indicator_icon("Included recipes", len(self.builder.configuration.selected_recipes))
+
+ def toggle_item_idle_cb(self, path, view_tree, cell, pagename):
+ if not self.recipe_model.path_included(path):
+ self.recipe_model.include_item(item_path=path, binb="User Selected", image_contents=False)
+ else:
+ self.pre_fadeout_checkout_include(view_tree, pagename)
+ self.recipe_model.exclude_item(item_path=path)
+ self.render_fadeout(view_tree, cell)
+
+ self.refresh_selection()
+ if not self.builder.customized:
+ self.builder.customized = True
+ self.builder.configuration.selected_image = self.recipe_model.__custom_image__
+ self.builder.rcppkglist_populated()
+
+ self.builder.window_sensitive(True)
+
+ view_model = view_tree.get_model()
+ vpath = self.recipe_model.convert_path_to_vpath(view_model, path)
+ view_tree.set_cursor(vpath)
+
+ def table_toggled_cb(self, table, cell, view_path, toggled_columnid, view_tree, pagename):
+ # Toggle handler: include or exclude a recipe when its checkbox is clicked
+ self.builder.window_sensitive(False)
+ view_model = view_tree.get_model()
+ path = self.recipe_model.convert_vpath_to_path(view_model, view_path)
+ glib.idle_add(self.toggle_item_idle_cb, path, view_tree, cell, pagename)
+
+ def pre_fadeout_checkout_include(self, tree, pagename):
+ # After the fadeout the table will be re-sorted as it was before
+ self.sort_column_id = self.recipe_model.sort_column_id
+ self.sort_order = self.recipe_model.sort_order
+
+ # Resync the current inclusion state into the backup COL_FADE_INC column
+ it = self.recipe_model.get_iter_first()
+ while it:
+ active = self.recipe_model.get_value(it, self.recipe_model.COL_INC)
+ self.recipe_model.set(it, self.recipe_model.COL_FADE_INC, active)
+ it = self.recipe_model.iter_next(it)
+ # Check out a model based on the COL_FADE_INC column, which saves the
+ # previous state of COL_INC from before exclude_item() was called
+ filter = { RecipeListModel.COL_FADE_INC:[True] }
+ if pagename == "Included recipes":
+ filter[RecipeListModel.COL_TYPE] = ['recipe', 'packagegroup']
+ elif pagename == "All recipes":
+ filter[RecipeListModel.COL_TYPE] = ['recipe']
+ else:
+ filter[RecipeListModel.COL_TYPE] = ['packagegroup']
+
+ new_model = self.recipe_model.tree_model(filter, excluded_items_ahead=True)
+ tree.set_model(new_model)
+
+ def render_fadeout(self, tree, cell):
+ if (not cell) or (not tree):
+ return
+ to_render_cells = []
+ model = tree.get_model()
+ it = model.get_iter_first()
+ while it:
+ path = model.get_path(it)
+ prev_cell_is_active = model.get_value(it, RecipeListModel.COL_FADE_INC)
+ curr_cell_is_active = model.get_value(it, RecipeListModel.COL_INC)
+ if (prev_cell_is_active == True) and (curr_cell_is_active == False):
+ to_render_cells.append(path)
+ it = model.iter_next(it)
+
+ cell.fadeout(tree, 1000, to_render_cells)
+
+ def after_fadeout_checkin_include(self, table, ctrl, cell, tree, filter):
+ self.recipe_model.sort_column_id = self.sort_column_id
+ self.recipe_model.sort_order = self.sort_order
+ tree.set_model(self.recipe_model.tree_model(filter))
+
+ def set_recipe_curr_tab(self, curr_page):
+ self.ins.set_current_page(curr_page)
diff --git a/bitbake/lib/bb/ui/crumbs/runningbuild.py b/bitbake/lib/bb/ui/crumbs/runningbuild.py
new file mode 100644
index 0000000..16a955d
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/runningbuild.py
@@ -0,0 +1,551 @@
+
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2008 Intel Corporation
+#
+# Authored by Rob Bradford <rob@linux.intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+import gobject
+import logging
+import os  # needed for os.path.exists() in handle_event()
+import time
+import urllib
+import urllib2
+import pango
+# handle_event() references bb.build, bb.command, bb.event and bb.runqueue directly
+import bb.build, bb.command, bb.event, bb.runqueue
+from bb.ui.crumbs.hobcolor import HobColors
+from bb.ui.crumbs.hobwidget import HobWarpCellRendererText, HobCellRendererPixbuf
+
+class RunningBuildModel (gtk.TreeStore):
+ (COL_LOG, COL_PACKAGE, COL_TASK, COL_MESSAGE, COL_ICON, COL_COLOR, COL_NUM_ACTIVE) = range(7)
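+ # COL_LOG holds a marker such as 'pastebin' for rows carrying log data,
+ # COL_ICON a stock icon name, COL_COLOR the row background colour and
+ # COL_NUM_ACTIVE the number of tasks still running under a package row.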
+
+ def __init__ (self):
+ gtk.TreeStore.__init__ (self,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_STRING,
+ gobject.TYPE_INT)
+
+ def failure_model_filter(self, model, it):
+ color = model.get(it, self.COL_COLOR)[0]
+ if not color:
+ return False
+ if color == HobColors.ERROR or color == HobColors.WARNING:
+ return True
+ return False
+
+ def failure_model(self):
+ model = self.filter_new()
+ model.set_visible_func(self.failure_model_filter)
+ return model
+
+ def foreach_cell_func(self, model, path, iter, usr_data=None):
+ if model.get_value(iter, self.COL_ICON) == "gtk-execute":
+ model.set(iter, self.COL_ICON, "")
+
+ def close_task_refresh(self):
+ self.foreach(self.foreach_cell_func, None)
+
+class RunningBuild (gobject.GObject):
+ __gsignals__ = {
+ 'build-started' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ 'build-succeeded' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ 'build-failed' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ 'build-complete' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ 'build-aborted' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ 'task-started' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_PYOBJECT,)),
+ 'log-error' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ 'log-warning' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ 'disk-full' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ 'no-provider' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_PYOBJECT,)),
+ 'log' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_STRING, gobject.TYPE_PYOBJECT,)),
+ }
+ pids_to_task = {}
+ tasks_to_iter = {}
+
+ def __init__ (self, sequential=False):
+ gobject.GObject.__init__ (self)
+ self.model = RunningBuildModel()
+ self.sequential = sequential
+ self.buildaborted = False
+
+ def reset (self):
+ self.pids_to_task.clear()
+ self.tasks_to_iter.clear()
+ self.model.clear()
+
+ def handle_event (self, event, pbar=None):
+ # Handle an event from the event queue; this may result in updating
+ # the model and thus the UI. Or it may be to tell us that the build
+ # has finished successfully (or not, as the case may be).
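+ # The branches below cover: logging.LogRecord messages, bb.build task
+ # start/finish events, bb.event build/cache/parse progress events,
+ # bb.command failures, bb.runqueue task events and provider warnings;
+ # anything unrecognised is reported as an unknown event.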
+
+ parent = None
+ pid = 0
+ package = None
+ task = None
+
+ # If we have a pid attached to this message/event try and get the
+ # (package, task) pair for it. If we get that then get the parent iter
+ # for the message.
+ if hasattr(event, 'pid'):
+ pid = event.pid
+ if hasattr(event, 'process'):
+ pid = event.process
+
+ if pid and pid in self.pids_to_task:
+ (package, task) = self.pids_to_task[pid]
+ parent = self.tasks_to_iter[(package, task)]
+
+ if(isinstance(event, logging.LogRecord)):
+ if event.taskpid == 0 or event.levelno > logging.INFO:
+ self.emit("log", "handle", event)
+ # FIXME: this is a hack! More info in Yocto #1433
+ # http://bugzilla.pokylinux.org/show_bug.cgi?id=1433, temporarily
+ # mask the error message as it's not informative for the user.
+ if event.msg.startswith("Execution of event handler 'run_buildstats' failed"):
+ return
+
+ if (event.levelno < logging.INFO or
+ event.msg.startswith("Running task")):
+ return # don't add these to the list
+
+ if event.levelno >= logging.ERROR:
+ icon = "dialog-error"
+ color = HobColors.ERROR
+ self.emit("log-error")
+ elif event.levelno >= logging.WARNING:
+ icon = "dialog-warning"
+ color = HobColors.WARNING
+ self.emit("log-warning")
+ else:
+ icon = None
+ color = HobColors.OK
+
+ # if we know which package we belong to, we'll append onto its list.
+ # otherwise, we'll jump to the top of the master list
+ if self.sequential or not parent:
+ tree_add = self.model.append
+ else:
+ tree_add = self.model.prepend
+ tree_add(parent,
+ (None,
+ package,
+ task,
+ event.getMessage(),
+ icon,
+ color,
+ 0))
+
+ # if there are warnings while processing a package
+ # (parent), mark the task with warning color;
+ # in case there are errors, the updates will be
+ # handled on TaskFailed.
+ if color == HobColors.WARNING and parent:
+ self.model.set(parent, self.model.COL_COLOR, color)
+ if task: # then we have a parent (package), and update its color
+ self.model.set(self.tasks_to_iter[(package, None)], self.model.COL_COLOR, color)
+
+ elif isinstance(event, bb.build.TaskStarted):
+ (package, task) = (event._package, event._task)
+
+ # Save out this PID.
+ self.pids_to_task[pid] = (package, task)
+
+ # Check if we already have this package in our model. If so then
+ # that can be the parent for the task. Otherwise we create a new
+ # top level for the package.
+ if ((package, None) in self.tasks_to_iter):
+ parent = self.tasks_to_iter[(package, None)]
+ else:
+ if self.sequential:
+ add = self.model.append
+ else:
+ add = self.model.prepend
+ parent = add(None, (None,
+ package,
+ None,
+ "Package: %s" % (package),
+ None,
+ HobColors.OK,
+ 0))
+ self.tasks_to_iter[(package, None)] = parent
+
+ # Because this parent package now has an active child, mark it as
+ # such.
+ self.model.set(parent, self.model.COL_ICON, "gtk-execute")
+ parent_color = self.model.get(parent, self.model.COL_COLOR)[0]
+ if parent_color != HobColors.ERROR and parent_color != HobColors.WARNING:
+ self.model.set(parent, self.model.COL_COLOR, HobColors.RUNNING)
+
+ # Add an entry in the model for this task
+ i = self.model.append (parent, (None,
+ package,
+ task,
+ "Task: %s" % (task),
+ "gtk-execute",
+ HobColors.RUNNING,
+ 0))
+
+ # update the parent's active task count
+ num_active = self.model.get(parent, self.model.COL_NUM_ACTIVE)[0] + 1
+ self.model.set(parent, self.model.COL_NUM_ACTIVE, num_active)
+
+ # Save out the iter so that we can find it when we have a message
+ # that we need to attach to a task.
+ self.tasks_to_iter[(package, task)] = i
+
+ elif isinstance(event, bb.build.TaskBase):
+ self.emit("log", "info", event._message)
+ current = self.tasks_to_iter[(package, task)]
+ parent = self.tasks_to_iter[(package, None)]
+
+ # remove this task from the parent's active count
+ num_active = self.model.get(parent, self.model.COL_NUM_ACTIVE)[0] - 1
+ self.model.set(parent, self.model.COL_NUM_ACTIVE, num_active)
+
+ if isinstance(event, bb.build.TaskFailed):
+ # Mark the task and parent as failed
+ icon = "dialog-error"
+ color = HobColors.ERROR
+
+ logfile = event.logfile
+ if logfile and os.path.exists(logfile):
+ with open(logfile) as f:
+ logdata = f.read()
+ self.model.append(current, ('pastebin', None, None, logdata, 'gtk-error', HobColors.OK, 0))
+
+ for i in (current, parent):
+ self.model.set(i, self.model.COL_ICON, icon,
+ self.model.COL_COLOR, color)
+ else:
+ # Mark the parent package and the task as inactive,
+ # but make sure to preserve error, warnings and active
+ # states
+ parent_color = self.model.get(parent, self.model.COL_COLOR)[0]
+ task_color = self.model.get(current, self.model.COL_COLOR)[0]
+
+ # Mark the task as inactive
+ self.model.set(current, self.model.COL_ICON, None)
+ if task_color != HobColors.ERROR:
+ if task_color == HobColors.WARNING:
+ self.model.set(current, self.model.COL_ICON, 'dialog-warning')
+ else:
+ self.model.set(current, self.model.COL_COLOR, HobColors.OK)
+
+ # Mark the parent as inactive
+ if parent_color != HobColors.ERROR:
+ if parent_color == HobColors.WARNING:
+ self.model.set(parent, self.model.COL_ICON, "dialog-warning")
+ else:
+ self.model.set(parent, self.model.COL_ICON, None)
+ if num_active == 0:
+ self.model.set(parent, self.model.COL_COLOR, HobColors.OK)
+
+ # Clear the iters and the pids since when the task goes away the
+ # pid will no longer be used for messages
+ del self.tasks_to_iter[(package, task)]
+ del self.pids_to_task[pid]
+
+ elif isinstance(event, bb.event.BuildStarted):
+
+ self.emit("build-started")
+ self.model.prepend(None, (None,
+ None,
+ None,
+ "Build Started (%s)" % time.strftime('%m/%d/%Y %H:%M:%S'),
+ None,
+ HobColors.OK,
+ 0))
+ if pbar:
+ pbar.update(0, self.progress_total)
+ pbar.set_title(bb.event.getName(event))
+
+ elif isinstance(event, bb.event.BuildCompleted):
+ failures = int (event._failures)
+ self.model.prepend(None, (None,
+ None,
+ None,
+ "Build Completed (%s)" % time.strftime('%m/%d/%Y %H:%M:%S'),
+ None,
+ HobColors.OK,
+ 0))
+
+ # Emit the appropriate signal depending on the number of failures
+ if self.buildaborted:
+ self.emit ("build-aborted")
+ self.buildaborted = False
+ elif (failures >= 1):
+ self.emit ("build-failed")
+ else:
+ self.emit ("build-succeeded")
+ # Emit a generic "build-complete" signal for things wishing to
+ # handle when the build is finished
+ self.emit("build-complete")
+ # reset all cells' icon indicators
+ self.model.close_task_refresh()
+ if pbar:
+ pbar.set_text(event.msg)
+
+ elif isinstance(event, bb.event.DiskFull):
+ self.buildaborted = True
+ self.emit("disk-full")
+
+ elif isinstance(event, bb.command.CommandFailed):
+ self.emit("log", "error", "Command execution failed: %s" % (event.error))
+ if event.error.startswith("Exited with"):
+ # If the command fails with an exit code we're done, emit the
+ # generic signal for the UI to notify the user
+ self.emit("build-complete")
+ # reset all cells' icon indicators
+ self.model.close_task_refresh()
+
+ elif isinstance(event, bb.event.CacheLoadStarted) and pbar:
+ pbar.set_title("Loading cache")
+ self.progress_total = event.total
+ pbar.update(0, self.progress_total)
+ elif isinstance(event, bb.event.CacheLoadProgress) and pbar:
+ pbar.update(event.current, self.progress_total)
+ elif isinstance(event, bb.event.CacheLoadCompleted) and pbar:
+ pbar.update(self.progress_total, self.progress_total)
+ pbar.hide()
+ elif isinstance(event, bb.event.ParseStarted) and pbar:
+ if event.total == 0:
+ return
+ pbar.set_title("Processing recipes")
+ self.progress_total = event.total
+ pbar.update(0, self.progress_total)
+ elif isinstance(event, bb.event.ParseProgress) and pbar:
+ pbar.update(event.current, self.progress_total)
+ elif isinstance(event, bb.event.ParseCompleted) and pbar:
+ pbar.hide()
+ # Use runqueue events as much as possible to update the progress bar
+ elif isinstance(event, bb.runqueue.runQueueTaskFailed):
+ self.emit("log", "error", "Task %s (%s) failed with exit code '%s'" % (event.taskid, event.taskstring, event.exitcode))
+ elif isinstance(event, bb.runqueue.sceneQueueTaskFailed):
+ self.emit("log", "warn", "Setscene task %s (%s) failed with exit code '%s' - real task will be run instead" \
+ % (event.taskid, event.taskstring, event.exitcode))
+ elif isinstance(event, (bb.runqueue.runQueueTaskStarted, bb.runqueue.sceneQueueTaskStarted)):
+ if isinstance(event, bb.runqueue.sceneQueueTaskStarted):
+ self.emit("log", "info", "Running setscene task %d of %d (%s)" % \
+ (event.stats.completed + event.stats.active + event.stats.failed + 1,
+ event.stats.total, event.taskstring))
+ else:
+ if event.noexec:
+ tasktype = 'noexec task'
+ else:
+ tasktype = 'task'
+ self.emit("log", "info", "Running %s %s of %s (ID: %s, %s)" % \
+ (tasktype, event.stats.completed + event.stats.active + event.stats.failed + 1,
+ event.stats.total, event.taskid, event.taskstring))
+ message = {}
+ message["eventname"] = bb.event.getName(event)
+ num_of_completed = event.stats.completed + event.stats.failed
+ message["current"] = num_of_completed
+ message["total"] = event.stats.total
+ message["title"] = ""
+ message["task"] = event.taskstring
+ self.emit("task-started", message)
+ elif isinstance(event, bb.event.MultipleProviders):
+ self.emit("log", "info", "multiple providers are available for %s%s (%s)" \
+ % (event._is_runtime and "runtime " or "", event._item, ", ".join(event._candidates)))
+ self.emit("log", "info", "consider defining a PREFERRED_PROVIDER entry to match %s" % (event._item))
+ elif isinstance(event, bb.event.NoProvider):
+ msg = ""
+ if event._runtime:
+ r = "R"
+ else:
+ r = ""
+
+ extra = ''
+ if not event._reasons:
+ if event._close_matches:
+ extra = ". Close matches:\n %s" % '\n '.join(event._close_matches)
+
+ if event._dependees:
+ msg = "Nothing %sPROVIDES '%s' (but %s %sDEPENDS on or otherwise requires it)%s\n" % (r, event._item, ", ".join(event._dependees), r, extra)
+ else:
+ msg = "Nothing %sPROVIDES '%s'%s\n" % (r, event._item, extra)
+ if event._reasons:
+ for reason in event._reasons:
+ msg += ("%s\n" % reason)
+ self.emit("no-provider", msg)
+ self.emit("log", "error", msg)
+ elif isinstance(event, bb.event.LogExecTTY):
+ icon = "dialog-warning"
+ color = HobColors.WARNING
+ if self.sequential or not parent:
+ tree_add = self.model.append
+ else:
+ tree_add = self.model.prepend
+ tree_add(parent,
+ (None,
+ package,
+ task,
+ event.msg,
+ icon,
+ color,
+ 0))
+ else:
+ if not isinstance(event, (bb.event.BuildBase,
+ bb.event.StampUpdate,
+ bb.event.ConfigParsed,
+ bb.event.RecipeParsed,
+ bb.event.RecipePreFinalise,
+ bb.runqueue.runQueueEvent,
+ bb.runqueue.runQueueExitWait,
+ bb.event.OperationStarted,
+ bb.event.OperationCompleted,
+ bb.event.OperationProgress)):
+ self.emit("log", "error", "Unknown event: %s" % (event.error if hasattr(event, 'error') else 'error'))
+
+ return
+
+
+def do_pastebin(text):
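+ # Post the text to the pastebin public API; the raw response body is
+ # returned and treated by callers as the URL of the new paste.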
+ url = 'http://pastebin.com/api_public.php'
+ params = {'paste_code': text, 'paste_format': 'text'}
+
+ req = urllib2.Request(url, urllib.urlencode(params))
+ response = urllib2.urlopen(req)
+ paste_url = response.read()
+
+ return paste_url
+
+
+class RunningBuildTreeView (gtk.TreeView):
+ __gsignals__ = {
+ "button_press_event" : "override"
+ }
+ def __init__ (self, readonly=False, hob=False):
+ gtk.TreeView.__init__ (self)
+ self.readonly = readonly
+
+ # The icon that indicates whether we're building or failed.
+ # the 'hob' flag is needed because Hob is not the only UI sharing this code
+ if hob:
+ renderer = HobCellRendererPixbuf ()
+ else:
+ renderer = gtk.CellRendererPixbuf()
+ col = gtk.TreeViewColumn ("Status", renderer)
+ col.add_attribute (renderer, "icon-name", 4)
+ self.append_column (col)
+
+ # The message of the build.
+ # the 'hob' flag is needed because Hob is not the only UI sharing this code
+ if hob:
+ self.message_renderer = HobWarpCellRendererText (col_number=1)
+ else:
+ self.message_renderer = gtk.CellRendererText ()
+ self.message_column = gtk.TreeViewColumn ("Message", self.message_renderer, text=3)
+ self.message_column.add_attribute(self.message_renderer, 'background', 5)
+ self.message_renderer.set_property('editable', (not self.readonly))
+ self.append_column (self.message_column)
+
+ def do_button_press_event(self, event):
+ gtk.TreeView.do_button_press_event(self, event)
+
+ if event.button == 3:
+ selection = super(RunningBuildTreeView, self).get_selection()
+ (model, it) = selection.get_selected()
+ if it is not None:
+ can_paste = model.get(it, model.COL_LOG)[0]
+ if can_paste == 'pastebin':
+ # build a simple menu with a pastebin option
+ menu = gtk.Menu()
+ menuitem = gtk.MenuItem("Copy")
+ menu.append(menuitem)
+ menuitem.connect("activate", self.clipboard_handler, (model, it))
+ menuitem.show()
+ menuitem = gtk.MenuItem("Send log to pastebin")
+ menu.append(menuitem)
+ menuitem.connect("activate", self.pastebin_handler, (model, it))
+ menuitem.show()
+ menu.show()
+ menu.popup(None, None, None, event.button, event.time)
+
+ def _add_to_clipboard(self, clipping):
+ """
+ Add the contents of clipping to the system clipboard.
+ """
+ clipboard = gtk.clipboard_get()
+ clipboard.set_text(clipping)
+ clipboard.store()
+
+ def pastebin_handler(self, widget, data):
+ """
+ Send the log data to pastebin, then add the new paste url to the
+ clipboard.
+ """
+ (model, it) = data
+ paste_url = do_pastebin(model.get(it, model.COL_MESSAGE)[0])
+
+ # @todo Provide visual feedback to the user that it is done and that
+ # it worked.
+ print paste_url
+
+ self._add_to_clipboard(paste_url)
+
+ def clipboard_handler(self, widget, data):
+ """
+ """
+ (model, it) = data
+ message = model.get(it, model.COL_MESSAGE)[0]
+
+ self._add_to_clipboard(message)
+
+class BuildFailureTreeView(gtk.TreeView):
+
+ def __init__ (self):
+ gtk.TreeView.__init__(self)
+ self.set_rules_hint(False)
+ self.set_headers_visible(False)
+ self.get_selection().set_mode(gtk.SELECTION_SINGLE)
+
+ # The icon that indicates whether we're building or failed.
+ renderer = HobCellRendererPixbuf ()
+ col = gtk.TreeViewColumn ("Status", renderer)
+ col.add_attribute (renderer, "icon-name", RunningBuildModel.COL_ICON)
+ self.append_column (col)
+
+ # The message of the build.
+ self.message_renderer = HobWarpCellRendererText (col_number=1)
+ self.message_column = gtk.TreeViewColumn ("Message", self.message_renderer, text=RunningBuildModel.COL_MESSAGE, background=RunningBuildModel.COL_COLOR)
+ self.append_column (self.message_column)
diff --git a/bitbake/lib/bb/ui/crumbs/sanitycheckpage.py b/bitbake/lib/bb/ui/crumbs/sanitycheckpage.py
new file mode 100644
index 0000000..76ce2ec
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/sanitycheckpage.py
@@ -0,0 +1,85 @@
+#!/usr/bin/env python
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2012 Intel Corporation
+#
+# Authored by Bogdan Marinescu <bogdan.a.marinescu@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk, gobject
+from bb.ui.crumbs.progressbar import HobProgressBar
+from bb.ui.crumbs.hobwidget import hic
+from bb.ui.crumbs.hobpages import HobPage
+
+#
+# SanityCheckPage
+#
+class SanityCheckPage (HobPage):
+
+ def __init__(self, builder):
+ super(SanityCheckPage, self).__init__(builder)
+ self.running = False
+ self.create_visual_elements()
+ self.show_all()
+
+ def make_label(self, text, bold=True):
+ label = gtk.Label()
+ label.set_alignment(0.0, 0.5)
+ mark = "<span %s>%s</span>" % (self.span_tag('x-large', 'bold') if bold else self.span_tag('medium'), text)
+ label.set_markup(mark)
+ return label
+
+ def start(self):
+ if not self.running:
+ self.running = True
+ gobject.timeout_add(100, self.timer_func)
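+ # gobject.timeout_add re-invokes timer_func every 100 ms for as long as
+ # it returns True, so clearing self.running in stop() ends the pulsing.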
+
+ def stop(self):
+ self.running = False
+
+ def is_running(self):
+ return self.running
+
+ def timer_func(self):
+ self.progress_bar.pulse()
+ return self.running
+
+ def create_visual_elements(self):
+ # Table-based layout: 'rows' and 'cols' give the table size
+ rows, cols = 30, 50
+ self.table = gtk.Table(rows, cols, True)
+ self.pack_start(self.table, expand=False, fill=False)
+ sx, sy = 2, 2
+ # 'info' icon
+ image = gtk.Image()
+ image.set_from_file(hic.ICON_INFO_DISPLAY_FILE)
+ self.table.attach(image, sx, sx + 2, sy, sy + 3 )
+ image.show()
+ # 'Checking' message
+ label = self.make_label('Hob is checking for correct build system setup')
+ self.table.attach(label, sx + 2, cols, sy, sy + 3, xpadding=5 )
+ label.show()
+ # 'Shouldn't take long' message.
+ label = self.make_label("The check shouldn't take long.", False)
+ self.table.attach(label, sx + 2, cols, sy + 3, sy + 4, xpadding=5)
+ label.show()
+ # Progress bar
+ self.progress_bar = HobProgressBar()
+ self.table.attach(self.progress_bar, sx + 2, cols - 3, sy + 5, sy + 7, xpadding=5)
+ self.progress_bar.show()
+ # All done
+ self.table.show()
+
diff --git a/bitbake/lib/bb/ui/crumbs/utils.py b/bitbake/lib/bb/ui/crumbs/utils.py
new file mode 100644
index 0000000..939864f
--- /dev/null
+++ b/bitbake/lib/bb/ui/crumbs/utils.py
@@ -0,0 +1,34 @@
+#
+# BitBake UI Utils
+#
+# Copyright (C) 2012 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+# This utility method looks for xterm or vte and returns the
+# first one found. For now we are keeping this simple, but
+# we will likely move the oe.terminal implementation into
+# bitbake, which will allow more flexibility.
+
+import os
+import bb
+
+def which_terminal():
+ term = bb.utils.which(os.environ["PATH"], "xterm")
+ if term:
+ return term + " -e "
+ term = bb.utils.which(os.environ["PATH"], "vte")
+ if term:
+ return term + " -c "
+ return None
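+
+# A minimal usage sketch (the bitbake command is illustrative only):
+#
+#   terminal = which_terminal()
+#   if terminal:
+#       # runs e.g. "xterm -e <command>" or "vte -c <command>"
+#       os.system(terminal + "bitbake -c devshell busybox")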
diff --git a/bitbake/lib/bb/ui/depexp.py b/bitbake/lib/bb/ui/depexp.py
new file mode 100644
index 0000000..240aafc
--- /dev/null
+++ b/bitbake/lib/bb/ui/depexp.py
@@ -0,0 +1,333 @@
+#
+# BitBake Graphical GTK based Dependency Explorer
+#
+# Copyright (C) 2007 Ross Burton
+# Copyright (C) 2007 - 2008 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import sys
+import gobject
+import gtk
+import Queue
+import threading
+import xmlrpclib
+import bb
+import bb.event
+from bb.ui.crumbs.progressbar import HobProgressBar
+
+# Package Model
+(COL_PKG_NAME) = (0)
+
+# Dependency Model
+(TYPE_DEP, TYPE_RDEP) = (0, 1)
+(COL_DEP_TYPE, COL_DEP_PARENT, COL_DEP_PACKAGE) = (0, 1, 2)
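+# Each row of the dependency model below is a (type, parent, package) tuple,
+# e.g. (TYPE_DEP, "busybox", "virtual/libc") -- package names are illustrative.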
+
+
+class PackageDepView(gtk.TreeView):
+ def __init__(self, model, dep_type, label):
+ gtk.TreeView.__init__(self)
+ self.current = None
+ self.dep_type = dep_type
+ self.filter_model = model.filter_new()
+ self.filter_model.set_visible_func(self._filter)
+ self.set_model(self.filter_model)
+ #self.connect("row-activated", self.on_package_activated, COL_DEP_PACKAGE)
+ self.append_column(gtk.TreeViewColumn(label, gtk.CellRendererText(), text=COL_DEP_PACKAGE))
+
+ def _filter(self, model, iter):
+ (this_type, package) = model.get(iter, COL_DEP_TYPE, COL_DEP_PARENT)
+ if this_type != self.dep_type: return False
+ return package == self.current
+
+ def set_current_package(self, package):
+ self.current = package
+ self.filter_model.refilter()
+
+
+class PackageReverseDepView(gtk.TreeView):
+ def __init__(self, model, label):
+ gtk.TreeView.__init__(self)
+ self.current = None
+ self.filter_model = model.filter_new()
+ self.filter_model.set_visible_func(self._filter)
+ self.set_model(self.filter_model)
+ self.append_column(gtk.TreeViewColumn(label, gtk.CellRendererText(), text=COL_DEP_PARENT))
+
+ def _filter(self, model, iter):
+ package = model.get_value(iter, COL_DEP_PACKAGE)
+ return package == self.current
+
+ def set_current_package(self, package):
+ self.current = package
+ self.filter_model.refilter()
+
+
+class DepExplorer(gtk.Window):
+ def __init__(self):
+ gtk.Window.__init__(self)
+ self.set_title("Dependency Explorer")
+ self.set_default_size(500, 500)
+ self.connect("delete-event", gtk.main_quit)
+
+ # Create the data models
+ self.pkg_model = gtk.ListStore(gobject.TYPE_STRING)
+ self.pkg_model.set_sort_column_id(COL_PKG_NAME, gtk.SORT_ASCENDING)
+ self.depends_model = gtk.ListStore(gobject.TYPE_INT, gobject.TYPE_STRING, gobject.TYPE_STRING)
+ self.depends_model.set_sort_column_id(COL_DEP_PACKAGE, gtk.SORT_ASCENDING)
+
+ pane = gtk.HPaned()
+ pane.set_position(250)
+ self.add(pane)
+
+ # The master list of packages
+ scrolled = gtk.ScrolledWindow()
+ scrolled.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
+ scrolled.set_shadow_type(gtk.SHADOW_IN)
+
+ self.pkg_treeview = gtk.TreeView(self.pkg_model)
+ self.pkg_treeview.get_selection().connect("changed", self.on_cursor_changed)
+ column = gtk.TreeViewColumn("Package", gtk.CellRendererText(), text=COL_PKG_NAME)
+ self.pkg_treeview.append_column(column)
+ pane.add1(scrolled)
+ scrolled.add(self.pkg_treeview)
+
+ box = gtk.VBox(homogeneous=True, spacing=4)
+
+ # Runtime Depends
+ scrolled = gtk.ScrolledWindow()
+ scrolled.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
+ scrolled.set_shadow_type(gtk.SHADOW_IN)
+ self.rdep_treeview = PackageDepView(self.depends_model, TYPE_RDEP, "Runtime Depends")
+ self.rdep_treeview.connect("row-activated", self.on_package_activated, COL_DEP_PACKAGE)
+ scrolled.add(self.rdep_treeview)
+ box.add(scrolled)
+
+ # Build Depends
+ scrolled = gtk.ScrolledWindow()
+ scrolled.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
+ scrolled.set_shadow_type(gtk.SHADOW_IN)
+ self.dep_treeview = PackageDepView(self.depends_model, TYPE_DEP, "Build Depends")
+ self.dep_treeview.connect("row-activated", self.on_package_activated, COL_DEP_PACKAGE)
+ scrolled.add(self.dep_treeview)
+ box.add(scrolled)
+ pane.add2(box)
+
+ # Reverse Depends
+ scrolled = gtk.ScrolledWindow()
+ scrolled.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
+ scrolled.set_shadow_type(gtk.SHADOW_IN)
+ self.revdep_treeview = PackageReverseDepView(self.depends_model, "Reverse Depends")
+ self.revdep_treeview.connect("row-activated", self.on_package_activated, COL_DEP_PARENT)
+ scrolled.add(self.revdep_treeview)
+ box.add(scrolled)
+ pane.add2(box)
+
+ self.show_all()
+
+ def on_package_activated(self, treeview, path, column, data_col):
+ model = treeview.get_model()
+ package = model.get_value(model.get_iter(path), data_col)
+
+ pkg_path = []
+ def finder(model, path, iter, needle):
+ package = model.get_value(iter, COL_PKG_NAME)
+ if package == needle:
+ pkg_path.append(path)
+ return True
+ else:
+ return False
+ self.pkg_model.foreach(finder, package)
+ if pkg_path:
+ self.pkg_treeview.get_selection().select_path(pkg_path[0])
+ self.pkg_treeview.scroll_to_cell(pkg_path[0])
+
+ def on_cursor_changed(self, selection):
+ (model, it) = selection.get_selected()
+ if it is None:
+ current_package = None
+ else:
+ current_package = model.get_value(it, COL_PKG_NAME)
+ self.rdep_treeview.set_current_package(current_package)
+ self.dep_treeview.set_current_package(current_package)
+ self.revdep_treeview.set_current_package(current_package)
+
+
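+ # parse() relies on three parts of the depgraph data: iterating
+ # depgraph["pn"] yields recipe names, depgraph["depends"] maps each recipe
+ # to its build-time dependencies, and depgraph["rdepends-pn"] maps each
+ # recipe to its runtime dependencies.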
+ def parse(self, depgraph):
+ for package in depgraph["pn"]:
+ self.pkg_model.insert(0, (package,))
+
+ for package in depgraph["depends"]:
+ for depend in depgraph["depends"][package]:
+ self.depends_model.insert (0, (TYPE_DEP, package, depend))
+
+ for package in depgraph["rdepends-pn"]:
+ for rdepend in depgraph["rdepends-pn"][package]:
+ self.depends_model.insert (0, (TYPE_RDEP, package, rdepend))
+
+
+class gtkthread(threading.Thread):
+ quit = threading.Event()
+ def __init__(self, shutdown):
+ threading.Thread.__init__(self)
+ self.setDaemon(True)
+ self.shutdown = shutdown
+
+ def run(self):
+ gobject.threads_init()
+ gtk.gdk.threads_init()
+ gtk.main()
+ gtkthread.quit.set()
+
+
+def main(server, eventHandler, params):
+ try:
+ params.updateFromServer(server)
+ cmdline = params.parseActions()
+ if not cmdline:
+ print("Nothing to do. Use 'bitbake world' to build everything, or run 'bitbake --help' for usage information.")
+ return 1
+ if 'msg' in cmdline and cmdline['msg']:
+ print(cmdline['msg'])
+ return 1
+ cmdline = cmdline['action']
+ if not cmdline or cmdline[0] != "generateDotGraph":
+ print("This UI requires the -g option")
+ return 1
+ ret, error = server.runCommand(["generateDepTreeEvent", cmdline[1], cmdline[2]])
+ if error:
+ print("Error running command '%s': %s" % (cmdline, error))
+ return 1
+ elif ret != True:
+ print("Error running command '%s': returned %s" % (cmdline, ret))
+ return 1
+ except xmlrpclib.Fault as x:
+ print("XMLRPC Fault getting commandline:\n %s" % x)
+ return
+
+ try:
+ gtk.init_check()
+ except RuntimeError:
+ sys.stderr.write("Please set DISPLAY variable before running this command\n")
+ return
+
+ shutdown = 0
+
+ gtkgui = gtkthread(shutdown)
+ gtkgui.start()
+
+ gtk.gdk.threads_enter()
+ dep = DepExplorer()
+ bardialog = gtk.Dialog(parent=dep,
+ flags=gtk.DIALOG_MODAL|gtk.DIALOG_DESTROY_WITH_PARENT)
+ bardialog.set_default_size(400, 50)
+ pbar = HobProgressBar()
+ bardialog.vbox.pack_start(pbar)
+ bardialog.show_all()
+ bardialog.connect("delete-event", gtk.main_quit)
+ gtk.gdk.threads_leave()
+
+ progress_total = 0
+ while True:
+ try:
+ event = eventHandler.waitEvent(0.25)
+ if gtkthread.quit.isSet():
+ _, error = server.runCommand(["stateForceShutdown"])
+ if error:
+ print('Unable to cleanly stop: %s' % error)
+ break
+
+ if event is None:
+ continue
+
+ if isinstance(event, bb.event.CacheLoadStarted):
+ progress_total = event.total
+ gtk.gdk.threads_enter()
+ bardialog.set_title("Loading Cache")
+ pbar.update(0)
+ gtk.gdk.threads_leave()
+
+ if isinstance(event, bb.event.CacheLoadProgress):
+ x = event.current
+ gtk.gdk.threads_enter()
+ pbar.update(x * 1.0 / progress_total)
+ pbar.set_title('')
+ gtk.gdk.threads_leave()
+ continue
+
+ if isinstance(event, bb.event.CacheLoadCompleted):
+ bardialog.hide()
+ continue
+
+ if isinstance(event, bb.event.ParseStarted):
+ progress_total = event.total
+ if progress_total == 0:
+ continue
+ gtk.gdk.threads_enter()
+ pbar.update(0)
+ bardialog.set_title("Processing recipes")
+
+ gtk.gdk.threads_leave()
+
+ if isinstance(event, bb.event.ParseProgress):
+ x = event.current
+ gtk.gdk.threads_enter()
+ pbar.update(x * 1.0 / progress_total)
+ pbar.set_title('')
+ gtk.gdk.threads_leave()
+ continue
+
+ if isinstance(event, bb.event.ParseCompleted):
+ bardialog.hide()
+ continue
+
+ if isinstance(event, bb.event.DepTreeGenerated):
+ gtk.gdk.threads_enter()
+ dep.parse(event._depgraph)
+ gtk.gdk.threads_leave()
+
+ if isinstance(event, bb.command.CommandCompleted):
+ continue
+
+ if isinstance(event, bb.command.CommandFailed):
+ print("Command execution failed: %s" % event.error)
+ return event.exitcode
+
+ if isinstance(event, bb.command.CommandExit):
+ return event.exitcode
+
+ if isinstance(event, bb.cooker.CookerExit):
+ break
+
+ continue
+ except EnvironmentError as ioerror:
+ # ignore interrupted io
+ if ioerror.args[0] == 4:
+ pass
+ except KeyboardInterrupt:
+ if shutdown == 2:
+ print("\nThird Keyboard Interrupt, exit.\n")
+ break
+ if shutdown == 1:
+ print("\nSecond Keyboard Interrupt, stopping...\n")
+ _, error = server.runCommand(["stateForceShutdown"])
+ if error:
+ print('Unable to cleanly stop: %s' % error)
+ if shutdown == 0:
+ print("\nKeyboard Interrupt, closing down...\n")
+ _, error = server.runCommand(["stateShutdown"])
+ if error:
+ print('Unable to cleanly shutdown: %s' % error)
+ shutdown = shutdown + 1
+ pass
diff --git a/bitbake/lib/bb/ui/goggle.py b/bitbake/lib/bb/ui/goggle.py
new file mode 100644
index 0000000..f4ee7b4
--- /dev/null
+++ b/bitbake/lib/bb/ui/goggle.py
@@ -0,0 +1,121 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2008 Intel Corporation
+#
+# Authored by Rob Bradford <rob@linux.intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gobject
+import gtk
+import xmlrpclib
+from bb.ui.crumbs.runningbuild import RunningBuildTreeView, RunningBuild
+from bb.ui.crumbs.progress import ProgressBar
+
+import Queue
+
+
+def event_handle_idle_func (eventHandler, build, pbar):
+
+ # Consume as many messages as we can in the time available to us
+ event = eventHandler.getEvent()
+ while event:
+ build.handle_event (event, pbar)
+ event = eventHandler.getEvent()
+
+ return True
+
+def scroll_tv_cb (model, path, iter, view):
+ view.scroll_to_cell (path)
+
+
+# @todo hook these into the GUI so the user has feedback...
+def running_build_failed_cb (running_build):
+ pass
+
+
+def running_build_succeeded_cb (running_build):
+ pass
+
+
+class MainWindow (gtk.Window):
+ def __init__ (self):
+ gtk.Window.__init__ (self, gtk.WINDOW_TOPLEVEL)
+
+ # Setup tree view and the scrolled window
+ scrolled_window = gtk.ScrolledWindow ()
+ self.add (scrolled_window)
+ self.cur_build_tv = RunningBuildTreeView()
+ self.connect("delete-event", gtk.main_quit)
+ self.set_default_size(640, 480)
+ scrolled_window.add (self.cur_build_tv)
+
+
+def main (server, eventHandler, params):
+ gobject.threads_init()
+ gtk.gdk.threads_init()
+
+ window = MainWindow ()
+ window.show_all ()
+ pbar = ProgressBar(window)
+ pbar.connect("delete-event", gtk.main_quit)
+
+ # Create the object for the current build
+ running_build = RunningBuild ()
+ window.cur_build_tv.set_model (running_build.model)
+ running_build.model.connect("row-inserted", scroll_tv_cb, window.cur_build_tv)
+ running_build.connect ("build-succeeded", running_build_succeeded_cb)
+ running_build.connect ("build-failed", running_build_failed_cb)
+
+ try:
+ params.updateFromServer(server)
+ cmdline = params.parseActions()
+ if not cmdline:
+ print("Nothing to do. Use 'bitbake world' to build everything, or run 'bitbake --help' for usage information.")
+ return 1
+ if 'msg' in cmdline and cmdline['msg']:
+ print(cmdline['msg'])
+ return 1
+ cmdline = cmdline['action']
+ ret, error = server.runCommand(cmdline)
+ if error:
+ print("Error running command '%s': %s" % (cmdline, error))
+ return 1
+ elif ret != True:
+ print("Error running command '%s': returned %s" % (cmdline, ret))
+ return 1
+ except xmlrpclib.Fault as x:
+ print("XMLRPC Fault getting commandline:\n %s" % x)
+ return 1
+
+ # Use a timeout function for probing the event queue to find out if we
+ # have a message waiting for us.
+ gobject.timeout_add (100,
+ event_handle_idle_func,
+ eventHandler,
+ running_build,
+ pbar)
+
+ try:
+ gtk.main()
+ except EnvironmentError as ioerror:
+ # ignore interrupted io
+ if ioerror.args[0] == 4:
+ pass
+ except KeyboardInterrupt:
+ pass
+ finally:
+ server.runCommand(["stateForceShutdown"])
+
diff --git a/bitbake/lib/bb/ui/hob.py b/bitbake/lib/bb/ui/hob.py
new file mode 100755
index 0000000..da5b411
--- /dev/null
+++ b/bitbake/lib/bb/ui/hob.py
@@ -0,0 +1,109 @@
+#!/usr/bin/env python
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2011 Intel Corporation
+#
+# Authored by Joshua Lock <josh@linux.intel.com>
+# Authored by Dongxiao Xu <dongxiao.xu@intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import sys
+import os
+requirements = "FATAL: Hob requires Gtk+ 2.20.0 or higher, PyGtk 2.21.0 or higher"
+try:
+ import gobject
+ import gtk
+ import pygtk
+ pygtk.require('2.0') # to be certain we don't have gtk+ 1.x !?!
+ gtkver = gtk.gtk_version
+ pygtkver = gtk.pygtk_version
+ if gtkver < (2, 20, 0) or pygtkver < (2, 21, 0):
+ sys.exit("%s,\nYou have Gtk+ %s and PyGtk %s." % (requirements,
+ ".".join(map(str, gtkver)),
+ ".".join(map(str, pygtkver))))
+except ImportError as exc:
+ sys.exit("%s (%s)." % (requirements, str(exc)))
+sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.dirname(__file__))))
+try:
+ import bb
+except RuntimeError as exc:
+ sys.exit(str(exc))
+from bb.ui import uihelper
+from bb.ui.crumbs.hoblistmodel import RecipeListModel, PackageListModel
+from bb.ui.crumbs.hobeventhandler import HobHandler
+from bb.ui.crumbs.builder import Builder
+
+featureSet = [bb.cooker.CookerFeatures.HOB_EXTRA_CACHES, bb.cooker.CookerFeatures.BASEDATASTORE_TRACKING]
+
+def event_handle_idle_func(eventHandler, hobHandler):
+ # Consume as many messages as we can in the time available to us
+ if not eventHandler:
+ return False
+ event = eventHandler.getEvent()
+ while event:
+ hobHandler.handle_event(event)
+ event = eventHandler.getEvent()
+ return True
+
+_evt_list = [ "bb.runqueue.runQueueExitWait", "bb.event.LogExecTTY", "logging.LogRecord",
+ "bb.build.TaskFailed", "bb.build.TaskBase", "bb.event.ParseStarted",
+ "bb.event.ParseProgress", "bb.event.ParseCompleted", "bb.event.CacheLoadStarted",
+ "bb.event.CacheLoadProgress", "bb.event.CacheLoadCompleted", "bb.command.CommandFailed",
+ "bb.command.CommandExit", "bb.command.CommandCompleted", "bb.cooker.CookerExit",
+ "bb.event.MultipleProviders", "bb.event.NoProvider", "bb.runqueue.sceneQueueTaskStarted",
+ "bb.runqueue.runQueueTaskStarted", "bb.runqueue.runQueueTaskFailed", "bb.runqueue.sceneQueueTaskFailed",
+ "bb.event.BuildBase", "bb.build.TaskStarted", "bb.build.TaskSucceeded", "bb.build.TaskFailedSilent",
+ "bb.event.SanityCheckPassed", "bb.event.SanityCheckFailed", "bb.event.PackageInfo",
+ "bb.event.TargetsTreeGenerated", "bb.event.ConfigFilesFound", "bb.event.ConfigFilePathFound",
+ "bb.event.FilesMatchingFound", "bb.event.NetworkTestFailed", "bb.event.NetworkTestPassed",
+ "bb.event.BuildStarted", "bb.event.BuildCompleted", "bb.event.DiskFull"]
+
+def main (server, eventHandler, params):
+ params.updateFromServer(server)
+ gobject.threads_init()
+
+ # Create the recipe and package list models shared by the event
+ # handler and the builder
+ recipe_model = RecipeListModel()
+ package_model = PackageListModel()
+
+ llevel, debug_domains = bb.msg.constructLogOptions()
+ server.runCommand(["setEventMask", server.getEventHandle(), llevel, debug_domains, _evt_list])
+ hobHandler = HobHandler(server, recipe_model, package_model)
+ builder = Builder(hobHandler, recipe_model, package_model)
+
+ # This timeout function regularly probes the event queue to find out if we
+ # have any messages waiting for us.
+ gobject.timeout_add(10, event_handle_idle_func, eventHandler, hobHandler)
+
+ try:
+ gtk.main()
+ except EnvironmentError as ioerror:
+ # ignore interrupted io
+ if ioerror.args[0] == 4:
+ pass
+ finally:
+ hobHandler.cancel_build(force = True)
+
+if __name__ == "__main__":
+ try:
+ ret = main()
+ except Exception:
+ ret = 1
+ import traceback
+ traceback.print_exc(15)
+ sys.exit(ret)
diff --git a/bitbake/lib/bb/ui/icons/images/images_display.png b/bitbake/lib/bb/ui/icons/images/images_display.png
new file mode 100644
index 0000000..a7f8710
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/images/images_display.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/images/images_hover.png b/bitbake/lib/bb/ui/icons/images/images_hover.png
new file mode 100644
index 0000000..2d9cd99
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/images/images_hover.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/indicators/add-hover.png b/bitbake/lib/bb/ui/icons/indicators/add-hover.png
new file mode 100644
index 0000000..526df77
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/indicators/add-hover.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/indicators/add.png b/bitbake/lib/bb/ui/icons/indicators/add.png
new file mode 100644
index 0000000..31e7090
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/indicators/add.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/indicators/alert.png b/bitbake/lib/bb/ui/icons/indicators/alert.png
new file mode 100644
index 0000000..d1c6f55
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/indicators/alert.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/indicators/confirmation.png b/bitbake/lib/bb/ui/icons/indicators/confirmation.png
new file mode 100644
index 0000000..3a5402d
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/indicators/confirmation.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/indicators/denied.png b/bitbake/lib/bb/ui/icons/indicators/denied.png
new file mode 100644
index 0000000..ee35c7d
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/indicators/denied.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/indicators/error.png b/bitbake/lib/bb/ui/icons/indicators/error.png
new file mode 100644
index 0000000..d06a8c1
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/indicators/error.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/indicators/info.png b/bitbake/lib/bb/ui/icons/indicators/info.png
new file mode 100644
index 0000000..ee8e8d8
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/indicators/info.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/indicators/issues.png b/bitbake/lib/bb/ui/icons/indicators/issues.png
new file mode 100644
index 0000000..b0c7461
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/indicators/issues.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/indicators/refresh.png b/bitbake/lib/bb/ui/icons/indicators/refresh.png
new file mode 100644
index 0000000..eb6c419
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/indicators/refresh.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/indicators/remove-hover.png b/bitbake/lib/bb/ui/icons/indicators/remove-hover.png
new file mode 100644
index 0000000..aa57c69
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/indicators/remove-hover.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/indicators/remove.png b/bitbake/lib/bb/ui/icons/indicators/remove.png
new file mode 100644
index 0000000..05c3c29
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/indicators/remove.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/indicators/tick.png b/bitbake/lib/bb/ui/icons/indicators/tick.png
new file mode 100644
index 0000000..beaad36
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/indicators/tick.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/info/info_display.png b/bitbake/lib/bb/ui/icons/info/info_display.png
new file mode 100644
index 0000000..5afbba2
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/info/info_display.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/info/info_hover.png b/bitbake/lib/bb/ui/icons/info/info_hover.png
new file mode 100644
index 0000000..f9d294d
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/info/info_hover.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/layers/layers_display.png b/bitbake/lib/bb/ui/icons/layers/layers_display.png
new file mode 100644
index 0000000..b7f9053
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/layers/layers_display.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/layers/layers_hover.png b/bitbake/lib/bb/ui/icons/layers/layers_hover.png
new file mode 100644
index 0000000..0bf3ce0
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/layers/layers_hover.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/packages/packages_display.png b/bitbake/lib/bb/ui/icons/packages/packages_display.png
new file mode 100644
index 0000000..f5d0a50
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/packages/packages_display.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/packages/packages_hover.png b/bitbake/lib/bb/ui/icons/packages/packages_hover.png
new file mode 100644
index 0000000..c081165
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/packages/packages_hover.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/recipe/recipe_display.png b/bitbake/lib/bb/ui/icons/recipe/recipe_display.png
new file mode 100644
index 0000000..e9809bc
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/recipe/recipe_display.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/recipe/recipe_hover.png b/bitbake/lib/bb/ui/icons/recipe/recipe_hover.png
new file mode 100644
index 0000000..7e48da9
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/recipe/recipe_hover.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/settings/settings_display.png b/bitbake/lib/bb/ui/icons/settings/settings_display.png
new file mode 100644
index 0000000..88c464d
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/settings/settings_display.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/settings/settings_hover.png b/bitbake/lib/bb/ui/icons/settings/settings_hover.png
new file mode 100644
index 0000000..d92a0bf
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/settings/settings_hover.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/templates/templates_display.png b/bitbake/lib/bb/ui/icons/templates/templates_display.png
new file mode 100644
index 0000000..153c7af
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/templates/templates_display.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/icons/templates/templates_hover.png b/bitbake/lib/bb/ui/icons/templates/templates_hover.png
new file mode 100644
index 0000000..afb7165
--- /dev/null
+++ b/bitbake/lib/bb/ui/icons/templates/templates_hover.png
Binary files differ
diff --git a/bitbake/lib/bb/ui/knotty.py b/bitbake/lib/bb/ui/knotty.py
new file mode 100644
index 0000000..2bee242
--- /dev/null
+++ b/bitbake/lib/bb/ui/knotty.py
@@ -0,0 +1,564 @@
+#
+# BitBake (No)TTY UI Implementation
+#
+# Handling output to TTYs or files (no TTY)
+#
+# Copyright (C) 2006-2012 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from __future__ import division
+
+import os
+import sys
+import xmlrpclib
+import logging
+import progressbar
+import signal
+import bb.msg
+import time
+import fcntl
+import struct
+import copy
+import atexit
+from bb.ui import uihelper
+
+featureSet = [bb.cooker.CookerFeatures.SEND_SANITYEVENTS]
+
+logger = logging.getLogger("BitBake")
+interactive = sys.stdout.isatty()
+
+class BBProgress(progressbar.ProgressBar):
+ def __init__(self, msg, maxval):
+ self.msg = msg
+ widgets = [progressbar.Percentage(), ' ', progressbar.Bar(), ' ',
+ progressbar.ETA()]
+
+ try:
+ self._resize_default = signal.getsignal(signal.SIGWINCH)
+ except:
+ self._resize_default = None
+ progressbar.ProgressBar.__init__(self, maxval, [self.msg + ": "] + widgets, fd=sys.stdout)
+
+ def _handle_resize(self, signum, frame):
+ progressbar.ProgressBar._handle_resize(self, signum, frame)
+ if self._resize_default:
+ self._resize_default(signum, frame)
+ def finish(self):
+ progressbar.ProgressBar.finish(self)
+ if self._resize_default:
+ signal.signal(signal.SIGWINCH, self._resize_default)
+
+class NonInteractiveProgress(object):
+ fobj = sys.stdout
+
+ def __init__(self, msg, maxval):
+ self.msg = msg
+ self.maxval = maxval
+
+ def start(self):
+ self.fobj.write("%s..." % self.msg)
+ self.fobj.flush()
+ return self
+
+ def update(self, value):
+ pass
+
+ def finish(self):
+ self.fobj.write("done.\n")
+ self.fobj.flush()
+
+def new_progress(msg, maxval):
+ if interactive:
+ return BBProgress(msg, maxval)
+ else:
+ return NonInteractiveProgress(msg, maxval)
+
+def pluralise(singular, plural, qty):
+ if(qty == 1):
+ return singular % qty
+ else:
+ return plural % qty
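+# For example: pluralise("%s task failed", "%s tasks failed", 3) -> "3 tasks failed"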
+
+
+class InteractConsoleLogFilter(logging.Filter):
+ def __init__(self, tf, format):
+ self.tf = tf
+ self.format = format
+
+ def filter(self, record):
+ if record.levelno == self.format.NOTE and (record.msg.startswith("Running") or record.msg.startswith("recipe ")):
+ return False
+ self.tf.clearFooter()
+ return True
+
+class TerminalFilter(object):
+ columns = 80
+
+ def sigwinch_handle(self, signum, frame):
+ self.columns = self.getTerminalColumns()
+ if self._sigwinch_default:
+ self._sigwinch_default(signum, frame)
+
+ def getTerminalColumns(self):
+ def ioctl_GWINSZ(fd):
+ try:
+ cr = struct.unpack('hh', fcntl.ioctl(fd, self.termios.TIOCGWINSZ, '1234'))
+ except:
+ return None
+ return cr
+ cr = ioctl_GWINSZ(sys.stdout.fileno())
+ if not cr:
+ try:
+ fd = os.open(os.ctermid(), os.O_RDONLY)
+ cr = ioctl_GWINSZ(fd)
+ os.close(fd)
+ except:
+ pass
+ if not cr:
+ try:
+ cr = (os.environ['LINES'], os.environ['COLUMNS'])
+ except:
+ cr = (25, 80)
+ return cr[1]
+
+ def __init__(self, main, helper, console, errconsole, format):
+ self.main = main
+ self.helper = helper
+ self.cuu = None
+ self.stdinbackup = None
+ self.interactive = sys.stdout.isatty()
+ self.footer_present = False
+ self.lastpids = []
+
+ if not self.interactive:
+ return
+
+ try:
+ import curses
+ except ImportError:
+ sys.exit("FATAL: The knotty ui could not load the required curses python module.")
+
+ import termios
+ self.curses = curses
+ self.termios = termios
+ try:
+ fd = sys.stdin.fileno()
+ self.stdinbackup = termios.tcgetattr(fd)
+ new = copy.deepcopy(self.stdinbackup)
+ new[3] = new[3] & ~termios.ECHO
+ termios.tcsetattr(fd, termios.TCSADRAIN, new)
+ curses.setupterm()
+ if curses.tigetnum("colors") > 2:
+ format.enable_color()
+ self.ed = curses.tigetstr("ed")
+ if self.ed:
+ self.cuu = curses.tigetstr("cuu")
+ try:
+ self._sigwinch_default = signal.getsignal(signal.SIGWINCH)
+ signal.signal(signal.SIGWINCH, self.sigwinch_handle)
+ except:
+ pass
+ self.columns = self.getTerminalColumns()
+ except:
+ self.cuu = None
+ console.addFilter(InteractConsoleLogFilter(self, format))
+ errconsole.addFilter(InteractConsoleLogFilter(self, format))
+
+ def clearFooter(self):
+ if self.footer_present:
+ lines = self.footer_present
+ sys.stdout.write(self.curses.tparm(self.cuu, lines))
+ sys.stdout.write(self.curses.tparm(self.ed))
+ self.footer_present = False
+
+ def updateFooter(self):
+ if not self.cuu:
+ return
+ activetasks = self.helper.running_tasks
+ failedtasks = self.helper.failed_tasks
+ runningpids = self.helper.running_pids
+ if self.footer_present and (self.lastcount == self.helper.tasknumber_current) and (self.lastpids == runningpids):
+ return
+ if self.footer_present:
+ self.clearFooter()
+ if (not self.helper.tasknumber_total or self.helper.tasknumber_current == self.helper.tasknumber_total) and not len(activetasks):
+ return
+ tasks = []
+ for t in runningpids:
+ tasks.append("%s (pid %s)" % (activetasks[t]["title"], t))
+
+ if self.main.shutdown:
+ content = "Waiting for %s running tasks to finish:" % len(activetasks)
+ elif not len(activetasks):
+ content = "No currently running tasks (%s of %s)" % (self.helper.tasknumber_current, self.helper.tasknumber_total)
+ else:
+ content = "Currently %s running tasks (%s of %s):" % (len(activetasks), self.helper.tasknumber_current, self.helper.tasknumber_total)
+ print(content)
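+ # The status line may wrap on narrow terminals, so count how many screen
+ # lines the footer occupies; clearFooter() uses this count to move the
+ # cursor back up and erase it.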
+ lines = 1 + int(len(content) / (self.columns + 1))
+ for tasknum, task in enumerate(tasks):
+ content = "%s: %s" % (tasknum, task)
+ print(content)
+ lines = lines + 1 + int(len(content) / (self.columns + 1))
+ self.footer_present = lines
+ self.lastpids = runningpids[:]
+ self.lastcount = self.helper.tasknumber_current
+
+ def finish(self):
+ if self.stdinbackup:
+ fd = sys.stdin.fileno()
+ self.termios.tcsetattr(fd, self.termios.TCSADRAIN, self.stdinbackup)
+
+def _log_settings_from_server(server):
+ # Get values of variables which control our output
+ includelogs, error = server.runCommand(["getVariable", "BBINCLUDELOGS"])
+ if error:
+ logger.error("Unable to get the value of BBINCLUDELOGS variable: %s" % error)
+ raise BaseException(error)
+ loglines, error = server.runCommand(["getVariable", "BBINCLUDELOGS_LINES"])
+ if error:
+ logger.error("Unable to get the value of BBINCLUDELOGS_LINES variable: %s" % error)
+ raise BaseException(error)
+ consolelogfile, error = server.runCommand(["getVariable", "BB_CONSOLELOG"])
+ if error:
+ logger.error("Unable to get the value of BB_CONSOLELOG variable: %s" % error)
+ raise BaseException(error)
+ return includelogs, loglines, consolelogfile
+
+_evt_list = [ "bb.runqueue.runQueueExitWait", "bb.event.LogExecTTY", "logging.LogRecord",
+ "bb.build.TaskFailed", "bb.build.TaskBase", "bb.event.ParseStarted",
+ "bb.event.ParseProgress", "bb.event.ParseCompleted", "bb.event.CacheLoadStarted",
+ "bb.event.CacheLoadProgress", "bb.event.CacheLoadCompleted", "bb.command.CommandFailed",
+ "bb.command.CommandExit", "bb.command.CommandCompleted", "bb.cooker.CookerExit",
+ "bb.event.MultipleProviders", "bb.event.NoProvider", "bb.runqueue.sceneQueueTaskStarted",
+ "bb.runqueue.runQueueTaskStarted", "bb.runqueue.runQueueTaskFailed", "bb.runqueue.sceneQueueTaskFailed",
+ "bb.event.BuildBase", "bb.build.TaskStarted", "bb.build.TaskSucceeded", "bb.build.TaskFailedSilent"]
+
+def main(server, eventHandler, params, tf = TerminalFilter):
+
+ includelogs, loglines, consolelogfile = _log_settings_from_server(server)
+
+ if sys.stdin.isatty() and sys.stdout.isatty():
+ log_exec_tty = True
+ else:
+ log_exec_tty = False
+
+ helper = uihelper.BBUIHelper()
+
+ console = logging.StreamHandler(sys.stdout)
+ errconsole = logging.StreamHandler(sys.stderr)
+ format_str = "%(levelname)s: %(message)s"
+ format = bb.msg.BBLogFormatter(format_str)
+ bb.msg.addDefaultlogFilter(console, bb.msg.BBLogFilterStdOut)
+ bb.msg.addDefaultlogFilter(errconsole, bb.msg.BBLogFilterStdErr)
+ console.setFormatter(format)
+ errconsole.setFormatter(format)
+ logger.addHandler(console)
+ logger.addHandler(errconsole)
+
+ if params.options.remote_server and params.options.kill_server:
+ server.terminateServer()
+ return
+
+ if consolelogfile and not params.options.show_environment and not params.options.show_versions:
+ bb.utils.mkdirhier(os.path.dirname(consolelogfile))
+ conlogformat = bb.msg.BBLogFormatter(format_str)
+ consolelog = logging.FileHandler(consolelogfile)
+ bb.msg.addDefaultlogFilter(consolelog)
+ consolelog.setFormatter(conlogformat)
+ logger.addHandler(consolelog)
+
+ llevel, debug_domains = bb.msg.constructLogOptions()
+ server.runCommand(["setEventMask", server.getEventHandle(), llevel, debug_domains, _evt_list])
+
+ if not params.observe_only:
+ params.updateFromServer(server)
+ params.updateToServer(server, os.environ.copy())
+ cmdline = params.parseActions()
+ if not cmdline:
+ print("Nothing to do. Use 'bitbake world' to build everything, or run 'bitbake --help' for usage information.")
+ return 1
+ if 'msg' in cmdline and cmdline['msg']:
+ logger.error(cmdline['msg'])
+ return 1
+
+ ret, error = server.runCommand(cmdline['action'])
+ if error:
+ logger.error("Command '%s' failed: %s" % (cmdline, error))
+ return 1
+ elif ret != True:
+ logger.error("Command '%s' failed: returned %s" % (cmdline, ret))
+ return 1
+
+
+ parseprogress = None
+ cacheprogress = None
+ main.shutdown = 0
+ interrupted = False
+ return_value = 0
+ errors = 0
+ warnings = 0
+ taskfailures = []
+
+ termfilter = tf(main, helper, console, errconsole, format)
+ atexit.register(termfilter.finish)
+
+ while True:
+ try:
+ event = eventHandler.waitEvent(0)
+ if event is None:
+ if main.shutdown > 1:
+ break
+ termfilter.updateFooter()
+ event = eventHandler.waitEvent(0.25)
+ if event is None:
+ continue
+ helper.eventHandler(event)
+ if isinstance(event, bb.runqueue.runQueueExitWait):
+ if not main.shutdown:
+ main.shutdown = 1
+ continue
+ if isinstance(event, bb.event.LogExecTTY):
+ if log_exec_tty:
+ tries = event.retries
+ while tries:
+ print("Trying to run: %s" % event.prog)
+ if os.system(event.prog) == 0:
+ break
+ time.sleep(event.sleep_delay)
+ tries -= 1
+ if tries:
+ continue
+ logger.warn(event.msg)
+ continue
+
+ if isinstance(event, logging.LogRecord):
+ if event.levelno >= format.ERROR:
+ errors = errors + 1
+ return_value = 1
+ elif event.levelno == format.WARNING:
+ warnings = warnings + 1
+ # For "normal" logging conditions, don't show note logs from tasks
+ # but do show them if the user has changed the default log level to
+ # include verbose/debug messages
+ if event.taskpid != 0 and event.levelno <= format.NOTE and (event.levelno < llevel or (event.levelno == format.NOTE and llevel != format.VERBOSE)):
+ continue
+ logger.handle(event)
+ continue
+
+ if isinstance(event, bb.build.TaskFailedSilent):
+ logger.warn("Logfile for failed setscene task is %s" % event.logfile)
+ continue
+ if isinstance(event, bb.build.TaskFailed):
+ return_value = 1
+ logfile = event.logfile
+ if logfile and os.path.exists(logfile):
+ termfilter.clearFooter()
+ bb.error("Logfile of failure stored in: %s" % logfile)
+ if includelogs and not event.errprinted:
+ print("Log data follows:")
+ f = open(logfile, "r")
+ lines = []
+ while True:
+ l = f.readline()
+ if l == '':
+ break
+ l = l.rstrip()
+ if loglines:
+ lines.append(' | %s' % l)
+ if len(lines) > int(loglines):
+ lines.pop(0)
+ else:
+ print('| %s' % l)
+ f.close()
+ if lines:
+ for line in lines:
+ print(line)
+ if isinstance(event, bb.build.TaskBase):
+ logger.info(event._message)
+ continue
+ if isinstance(event, bb.event.ParseStarted):
+ if event.total == 0:
+ continue
+ parseprogress = new_progress("Parsing recipes", event.total).start()
+ continue
+ if isinstance(event, bb.event.ParseProgress):
+ parseprogress.update(event.current)
+ continue
+ if isinstance(event, bb.event.ParseCompleted):
+ if not parseprogress:
+ continue
+
+ parseprogress.finish()
+ print(("Parsing of %d .bb files complete (%d cached, %d parsed). %d targets, %d skipped, %d masked, %d errors."
+ % ( event.total, event.cached, event.parsed, event.virtuals, event.skipped, event.masked, event.errors)))
+ continue
+
+ if isinstance(event, bb.event.CacheLoadStarted):
+ cacheprogress = new_progress("Loading cache", event.total).start()
+ continue
+ if isinstance(event, bb.event.CacheLoadProgress):
+ cacheprogress.update(event.current)
+ continue
+ if isinstance(event, bb.event.CacheLoadCompleted):
+ cacheprogress.finish()
+ print("Loaded %d entries from dependency cache." % event.num_entries)
+ continue
+
+ if isinstance(event, bb.command.CommandFailed):
+ return_value = event.exitcode
+ if event.error:
+ errors = errors + 1
+ logger.error("Command execution failed: %s", event.error)
+ main.shutdown = 2
+ continue
+ if isinstance(event, bb.command.CommandExit):
+ if not return_value:
+ return_value = event.exitcode
+ continue
+ if isinstance(event, (bb.command.CommandCompleted, bb.cooker.CookerExit)):
+ main.shutdown = 2
+ continue
+ if isinstance(event, bb.event.MultipleProviders):
+ logger.info("multiple providers are available for %s%s (%s)", event._is_runtime and "runtime " or "",
+ event._item,
+ ", ".join(event._candidates))
+ logger.info("consider defining a PREFERRED_PROVIDER entry to match %s", event._item)
+ continue
+ if isinstance(event, bb.event.NoProvider):
+ return_value = 1
+ errors = errors + 1
+ if event._runtime:
+ r = "R"
+ else:
+ r = ""
+
+ extra = ''
+ if not event._reasons:
+ if event._close_matches:
+ extra = ". Close matches:\n %s" % '\n '.join(event._close_matches)
+
+ if event._dependees:
+ logger.error("Nothing %sPROVIDES '%s' (but %s %sDEPENDS on or otherwise requires it)%s", r, event._item, ", ".join(event._dependees), r, extra)
+ else:
+ logger.error("Nothing %sPROVIDES '%s'%s", r, event._item, extra)
+ if event._reasons:
+ for reason in event._reasons:
+ logger.error("%s", reason)
+ continue
+
+ if isinstance(event, bb.runqueue.sceneQueueTaskStarted):
+ logger.info("Running setscene task %d of %d (%s)" % (event.stats.completed + event.stats.active + event.stats.failed + 1, event.stats.total, event.taskstring))
+ continue
+
+ if isinstance(event, bb.runqueue.runQueueTaskStarted):
+ if event.noexec:
+ tasktype = 'noexec task'
+ else:
+ tasktype = 'task'
+ logger.info("Running %s %s of %s (ID: %s, %s)",
+ tasktype,
+ event.stats.completed + event.stats.active +
+ event.stats.failed + 1,
+ event.stats.total, event.taskid, event.taskstring)
+ continue
+
+ if isinstance(event, bb.runqueue.runQueueTaskFailed):
+ taskfailures.append(event.taskstring)
+ logger.error("Task %s (%s) failed with exit code '%s'",
+ event.taskid, event.taskstring, event.exitcode)
+ continue
+
+ if isinstance(event, bb.runqueue.sceneQueueTaskFailed):
+ logger.warn("Setscene task %s (%s) failed with exit code '%s' - real task will be run instead",
+ event.taskid, event.taskstring, event.exitcode)
+ continue
+
+ if isinstance(event, bb.event.DepTreeGenerated):
+ continue
+
+ # ignore
+ if isinstance(event, (bb.event.BuildBase,
+ bb.event.MetadataEvent,
+ bb.event.StampUpdate,
+ bb.event.ConfigParsed,
+ bb.event.RecipeParsed,
+ bb.event.RecipePreFinalise,
+ bb.runqueue.runQueueEvent,
+ bb.event.OperationStarted,
+ bb.event.OperationCompleted,
+ bb.event.OperationProgress,
+ bb.event.DiskFull)):
+ continue
+
+ logger.error("Unknown event: %s", event)
+
+ except EnvironmentError as ioerror:
+ termfilter.clearFooter()
+ # ignore interrupted io
+ if ioerror.args[0] == 4:
+ continue
+ sys.stderr.write(str(ioerror))
+ if not params.observe_only:
+ _, error = server.runCommand(["stateForceShutdown"])
+ main.shutdown = 2
+ except KeyboardInterrupt:
+ termfilter.clearFooter()
+ if params.observe_only:
+ print("\nKeyboard Interrupt, exiting observer...")
+ main.shutdown = 2
+ if not params.observe_only and main.shutdown == 1:
+ print("\nSecond Keyboard Interrupt, stopping...\n")
+ _, error = server.runCommand(["stateForceShutdown"])
+ if error:
+ logger.error("Unable to cleanly stop: %s" % error)
+ if not params.observe_only and main.shutdown == 0:
+ print("\nKeyboard Interrupt, closing down...\n")
+ interrupted = True
+ _, error = server.runCommand(["stateShutdown"])
+ if error:
+ logger.error("Unable to cleanly shutdown: %s" % error)
+ main.shutdown = main.shutdown + 1
+ pass
+ except Exception as e:
+ sys.stderr.write(str(e))
+ if not params.observe_only:
+ _, error = server.runCommand(["stateForceShutdown"])
+ main.shutdown = 2
+ try:
+ summary = ""
+ if taskfailures:
+ summary += pluralise("\nSummary: %s task failed:",
+ "\nSummary: %s tasks failed:", len(taskfailures))
+ for failure in taskfailures:
+ summary += "\n %s" % failure
+ if warnings:
+ summary += pluralise("\nSummary: There was %s WARNING message shown.",
+ "\nSummary: There were %s WARNING messages shown.", warnings)
+ if return_value and errors:
+ summary += pluralise("\nSummary: There was %s ERROR message shown, returning a non-zero exit code.",
+ "\nSummary: There were %s ERROR messages shown, returning a non-zero exit code.", errors)
+ if summary:
+ print(summary)
+
+ if interrupted:
+ print("Execution was interrupted, returning a non-zero exit code.")
+ if return_value == 0:
+ return_value = 1
+ except IOError as e:
+ import errno
+ if e.errno == errno.EPIPE:
+ pass
+
+ return return_value
diff --git a/bitbake/lib/bb/ui/ncurses.py b/bitbake/lib/bb/ui/ncurses.py
new file mode 100644
index 0000000..9589a77
--- /dev/null
+++ b/bitbake/lib/bb/ui/ncurses.py
@@ -0,0 +1,373 @@
+#
+# BitBake Curses UI Implementation
+#
+# Implements an ncurses frontend for the BitBake utility.
+#
+# Copyright (C) 2006 Michael 'Mickey' Lauer
+# Copyright (C) 2006-2007 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+"""
+ We have the following windows:
+
+ 1.) Main Window: Shows what we are ultimately building and how far we are. Includes status bar
+ 2.) Thread Activity Window: Shows one status line for every concurrent bitbake thread.
+ 3.) Command Line Window: Contains an interactive command line where you can interact w/ Bitbake.
+
+ Basic window layout is like that:
+
+ |---------------------------------------------------------|
+ | <Main Window> | <Thread Activity Window> |
+ | | 0: foo do_compile complete|
+ | Building Gtk+-2.6.10 | 1: bar do_patch complete |
+ | Status: 60% | ... |
+ | | ... |
+ | | ... |
+ |---------------------------------------------------------|
+ |<Command Line Window> |
+ |>>> which virtual/kernel |
+ |openzaurus-kernel |
+ |>>> _ |
+ |---------------------------------------------------------|
+
+"""
+
+
+from __future__ import division
+import logging
+import os, sys, itertools, time, subprocess
+
+try:
+ import curses
+except ImportError:
+ sys.exit("FATAL: The ncurses ui could not load the required curses python module.")
+
+import bb
+import xmlrpclib
+from bb import ui
+from bb.ui import uihelper
+
+parsespin = itertools.cycle( r'|/-\\' )
+
+X = 0
+Y = 1
+WIDTH = 2
+HEIGHT = 3
+
+MAXSTATUSLENGTH = 32
+
+class NCursesUI:
+ """
+ NCurses UI Class
+ """
+ class Window:
+ """Base Window Class"""
+ def __init__( self, x, y, width, height, fg=curses.COLOR_BLACK, bg=curses.COLOR_WHITE ):
+ self.win = curses.newwin( height, width, y, x )
+ self.dimensions = ( x, y, width, height )
+ """
+ if curses.has_colors():
+ color = 1
+ curses.init_pair( color, fg, bg )
+ self.win.bkgdset( ord(' '), curses.color_pair(color) )
+ else:
+ self.win.bkgdset( ord(' '), curses.A_BOLD )
+ """
+ self.erase()
+ self.setScrolling()
+ self.win.noutrefresh()
+
+ def erase( self ):
+ self.win.erase()
+
+ def setScrolling( self, b = True ):
+ self.win.scrollok( b )
+ self.win.idlok( b )
+
+ def setBoxed( self ):
+ self.boxed = True
+ self.win.box()
+ self.win.noutrefresh()
+
+ def setText( self, x, y, text, *args ):
+ self.win.addstr( y, x, text, *args )
+ self.win.noutrefresh()
+
+ def appendText( self, text, *args ):
+ self.win.addstr( text, *args )
+ self.win.noutrefresh()
+
+ def drawHline( self, y ):
+ self.win.hline( y, 0, curses.ACS_HLINE, self.dimensions[WIDTH] )
+ self.win.noutrefresh()
+
+ class DecoratedWindow( Window ):
+ """Base class for windows with a box and a title bar"""
+ def __init__( self, title, x, y, width, height, fg=curses.COLOR_BLACK, bg=curses.COLOR_WHITE ):
+ NCursesUI.Window.__init__( self, x+1, y+3, width-2, height-4, fg, bg )
+ self.decoration = NCursesUI.Window( x, y, width, height, fg, bg )
+ self.decoration.setBoxed()
+ self.decoration.win.hline( 2, 1, curses.ACS_HLINE, width-2 )
+ self.setTitle( title )
+
+ def setTitle( self, title ):
+ self.decoration.setText( 1, 1, title.center( self.dimensions[WIDTH]-2 ), curses.A_BOLD )
+
+ #-------------------------------------------------------------------------#
+# class TitleWindow( Window ):
+ #-------------------------------------------------------------------------#
+# """Title Window"""
+# def __init__( self, x, y, width, height ):
+# NCursesUI.Window.__init__( self, x, y, width, height )
+# version = bb.__version__
+# title = "BitBake %s" % version
+# credit = "(C) 2003-2007 Team BitBake"
+# #self.win.hline( 2, 1, curses.ACS_HLINE, width-2 )
+# self.win.border()
+# self.setText( 1, 1, title.center( self.dimensions[WIDTH]-2 ), curses.A_BOLD )
+# self.setText( 1, 2, credit.center( self.dimensions[WIDTH]-2 ), curses.A_BOLD )
+
+ #-------------------------------------------------------------------------#
+ class ThreadActivityWindow( DecoratedWindow ):
+ #-------------------------------------------------------------------------#
+ """Thread Activity Window"""
+ def __init__( self, x, y, width, height ):
+ NCursesUI.DecoratedWindow.__init__( self, "Thread Activity", x, y, width, height )
+
+ def setStatus( self, thread, text ):
+ line = "%02d: %s" % ( thread, text )
+ width = self.dimensions[WIDTH]
+ if ( len(line) > width ):
+ line = line[:width-3] + "..."
+ else:
+ line = line.ljust( width )
+ self.setText( 0, thread, line )
+
+ #-------------------------------------------------------------------------#
+ class MainWindow( DecoratedWindow ):
+ #-------------------------------------------------------------------------#
+ """Main Window"""
+ def __init__( self, x, y, width, height ):
+ self.StatusPosition = width - MAXSTATUSLENGTH
+ NCursesUI.DecoratedWindow.__init__( self, None, x, y, width, height )
+ curses.nl()
+
+ def setTitle( self, title ):
+ title = "BitBake %s" % bb.__version__
+ self.decoration.setText( 2, 1, title, curses.A_BOLD )
+ self.decoration.setText( self.StatusPosition - 8, 1, "Status:", curses.A_BOLD )
+
+ def setStatus(self, status):
+ while len(status) < MAXSTATUSLENGTH:
+ status = status + " "
+ self.decoration.setText( self.StatusPosition, 1, status, curses.A_BOLD )
+
+
+ #-------------------------------------------------------------------------#
+ class ShellOutputWindow( DecoratedWindow ):
+ #-------------------------------------------------------------------------#
+ """Interactive Command Line Output"""
+ def __init__( self, x, y, width, height ):
+ NCursesUI.DecoratedWindow.__init__( self, "Command Line Window", x, y, width, height )
+
+ #-------------------------------------------------------------------------#
+ class ShellInputWindow( Window ):
+ #-------------------------------------------------------------------------#
+ """Interactive Command Line Input"""
+ def __init__( self, x, y, width, height ):
+ NCursesUI.Window.__init__( self, x, y, width, height )
+
+# TODO: move this import back to the top of the file:
+# from curses.textpad import Textbox
+# self.textbox = Textbox( self.win )
+# t = threading.Thread()
+# t.run = self.textbox.edit
+# t.start()
+
+ #-------------------------------------------------------------------------#
+ def main(self, stdscr, server, eventHandler, params):
+ #-------------------------------------------------------------------------#
+ height, width = stdscr.getmaxyx()
+
+ # for now split it like that:
+ # MAIN_y + THREAD_y = 2/3 screen at the top
+ # MAIN_x = 2/3 left, THREAD_y = 1/3 right
+ # CLI_y = 1/3 of screen at the bottom
+ # CLI_x = full
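+ # Worked example, assuming an 80x24 terminal: main window 52x16, thread
+ # activity window 28x16, command output window 80x7, command input 80x1.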
+
+ main_left = 0
+ main_top = 0
+ main_height = ( height // 3 * 2 )
+ main_width = ( width // 3 ) * 2
+ clo_left = main_left
+ clo_top = main_top + main_height
+ clo_height = height - main_height - main_top - 1
+ clo_width = width
+ cli_left = main_left
+ cli_top = clo_top + clo_height
+ cli_height = 1
+ cli_width = width
+ thread_left = main_left + main_width
+ thread_top = main_top
+ thread_height = main_height
+ thread_width = width - main_width
+
+ #tw = self.TitleWindow( 0, 0, width, main_top )
+ mw = self.MainWindow( main_left, main_top, main_width, main_height )
+ taw = self.ThreadActivityWindow( thread_left, thread_top, thread_width, thread_height )
+ clo = self.ShellOutputWindow( clo_left, clo_top, clo_width, clo_height )
+ cli = self.ShellInputWindow( cli_left, cli_top, cli_width, cli_height )
+ cli.setText( 0, 0, "BB>" )
+
+ mw.setStatus("Idle")
+
+ helper = uihelper.BBUIHelper()
+ shutdown = 0
+
+ try:
+ params.updateFromServer(server)
+ cmdline = params.parseActions()
+ if not cmdline:
+ print("Nothing to do. Use 'bitbake world' to build everything, or run 'bitbake --help' for usage information.")
+ return 1
+ if 'msg' in cmdline and cmdline['msg']:
+ print(cmdline['msg'])
+ return 1
+ cmdline = cmdline['action']
+ ret, error = server.runCommand(cmdline)
+ if error:
+ print("Error running command '%s': %s" % (cmdline, error))
+ return
+ elif ret != True:
+ print("Couldn't get default commandline! %s" % ret)
+ return
+ except xmlrpclib.Fault as x:
+ print("XMLRPC Fault getting commandline:\n %s" % x)
+ return
+
+ exitflag = False
+ while not exitflag:
+ try:
+ event = eventHandler.waitEvent(0.25)
+ if not event:
+ continue
+
+ helper.eventHandler(event)
+ if isinstance(event, bb.build.TaskBase):
+ mw.appendText("NOTE: %s\n" % event._message)
+ if isinstance(event, logging.LogRecord):
+ mw.appendText(logging.getLevelName(event.levelno) + ': ' + event.getMessage() + '\n')
+
+ if isinstance(event, bb.event.CacheLoadStarted):
+ self.parse_total = event.total
+ if isinstance(event, bb.event.CacheLoadProgress):
+ x = event.current
+ y = self.parse_total
+ mw.setStatus("Loading Cache: %s [%2d %%]" % ( next(parsespin), x*100/y ) )
+ if isinstance(event, bb.event.CacheLoadCompleted):
+ mw.setStatus("Idle")
+ mw.appendText("Loaded %d entries from dependency cache.\n"
+ % ( event.num_entries))
+
+ if isinstance(event, bb.event.ParseStarted):
+ self.parse_total = event.total
+ if isinstance(event, bb.event.ParseProgress):
+ x = event.current
+ y = self.parse_total
+ mw.setStatus("Parsing Recipes: %s [%2d %%]" % ( next(parsespin), x*100/y ) )
+ if isinstance(event, bb.event.ParseCompleted):
+ mw.setStatus("Idle")
+ mw.appendText("Parsing finished. %d cached, %d parsed, %d skipped, %d masked.\n"
+ % ( event.cached, event.parsed, event.skipped, event.masked ))
+
+# if isinstance(event, bb.build.TaskFailed):
+# if event.logfile:
+# if data.getVar("BBINCLUDELOGS", d):
+# bb.error("log data follows (%s)" % logfile)
+# number_of_lines = data.getVar("BBINCLUDELOGS_LINES", d)
+# if number_of_lines:
+# subprocess.call('tail -n%s %s' % (number_of_lines, logfile), shell=True)
+# else:
+# f = open(logfile, "r")
+# while True:
+# l = f.readline()
+# if l == '':
+# break
+# l = l.rstrip()
+# print '| %s' % l
+# f.close()
+# else:
+# bb.error("see log in %s" % logfile)
+
+ if isinstance(event, bb.command.CommandCompleted):
+ # stop so the user can see the result of the build, but
+ # also allow them to now exit with a single ^C
+ shutdown = 2
+ if isinstance(event, bb.command.CommandFailed):
+ mw.appendText("Command execution failed: %s" % event.error)
+ time.sleep(2)
+ exitflag = True
+ if isinstance(event, bb.command.CommandExit):
+ exitflag = True
+ if isinstance(event, bb.cooker.CookerExit):
+ exitflag = True
+
+ if isinstance(event, bb.event.LogExecTTY):
+ mw.appendText('WARN: ' + event.msg + '\n')
+ if helper.needUpdate:
+ activetasks, failedtasks = helper.getTasks()
+ taw.erase()
+ taw.setText(0, 0, "")
+ if activetasks:
+ taw.appendText("Active Tasks:\n")
+ for task in activetasks.itervalues():
+ taw.appendText(task["title"] + '\n')
+ if failedtasks:
+ taw.appendText("Failed Tasks:\n")
+ for task in failedtasks:
+ taw.appendText(task["title"] + '\n')
+
+ curses.doupdate()
+ except EnvironmentError as ioerror:
+ # ignore interrupted io
+ if ioerror.args[0] == 4:
+ pass
+
+ except KeyboardInterrupt:
+ if shutdown == 2:
+ mw.appendText("Third Keyboard Interrupt, exit.\n")
+ exitflag = True
+ if shutdown == 1:
+ mw.appendText("Second Keyboard Interrupt, stopping...\n")
+ _, error = server.runCommand(["stateForceShutdown"])
+ if error:
+ print("Unable to cleanly stop: %s" % error)
+ if shutdown == 0:
+ mw.appendText("Keyboard Interrupt, closing down...\n")
+ _, error = server.runCommand(["stateShutdown"])
+ if error:
+ print("Unable to cleanly shutdown: %s" % error)
+ shutdown = shutdown + 1
+ pass
+
+def main(server, eventHandler, params):
+ if not os.isatty(sys.stdout.fileno()):
+ print("FATAL: Unable to run 'ncurses' UI without a TTY.")
+ return
+ ui = NCursesUI()
+ try:
+ curses.wrapper(ui.main, server, eventHandler, params)
+ except:
+ import traceback
+ traceback.print_exc()
diff --git a/bitbake/lib/bb/ui/puccho.py b/bitbake/lib/bb/ui/puccho.py
new file mode 100644
index 0000000..3ce4590
--- /dev/null
+++ b/bitbake/lib/bb/ui/puccho.py
@@ -0,0 +1,425 @@
+#
+# BitBake Graphical GTK User Interface
+#
+# Copyright (C) 2008 Intel Corporation
+#
+# Authored by Rob Bradford <rob@linux.intel.com>
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import gtk
+import gobject
+import gtk.glade
+import threading
+import urllib2
+import os
+import contextlib
+
+from bb.ui.crumbs.buildmanager import BuildManager, BuildConfiguration
+from bb.ui.crumbs.buildmanager import BuildManagerTreeView
+
+from bb.ui.crumbs.runningbuild import RunningBuild, RunningBuildTreeView
+
+# The metadata loader is used by the BuildSetupDialog to download the
+# available options to populate the dialog
+class MetaDataLoader(gobject.GObject):
+ """ This class provides the mechanism for loading the metadata (the
+ fetching and parsing) from a given URL. The metadata encompasses details
+    on what machines are available, the distributions and images available for
+    each machine, and the URIs to use for building it."""
+ __gsignals__ = {
+ 'success' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ ()),
+ 'error' : (gobject.SIGNAL_RUN_LAST,
+ gobject.TYPE_NONE,
+ (gobject.TYPE_STRING,))
+ }
+
+ # We use these little helper functions to ensure that we take the gdk lock
+ # when emitting the signal. These functions are called as idles (so that
+    # they happen in the gtk / main thread's main loop).
+ def emit_error_signal (self, remark):
+ gtk.gdk.threads_enter()
+ self.emit ("error", remark)
+ gtk.gdk.threads_leave()
+
+ def emit_success_signal (self):
+ gtk.gdk.threads_enter()
+ self.emit ("success")
+ gtk.gdk.threads_leave()
+
+ def __init__ (self):
+ gobject.GObject.__init__ (self)
+
+ class LoaderThread(threading.Thread):
+ """ This class provides an asynchronous loader for the metadata (by
+ using threads and signals). This is useful since the metadata may be
+ at a remote URL."""
+ class LoaderImportException (Exception):
+ pass
+
+ def __init__(self, loader, url):
+ threading.Thread.__init__ (self)
+ self.url = url
+ self.loader = loader
+
+ def run (self):
+ result = {}
+ try:
+ with contextlib.closing (urllib2.urlopen (self.url)) as f:
+ # Parse the metadata format. The format is....
+ # <machine>;<default distro>|<distro>...;<default image>|<image>...;<type##url>|...
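+                    # A purely illustrative line (hypothetical machine, distro, image
+                    # and URL values):
+                    #   qemux86;poky|poky-tiny;core-image-minimal|core-image-sato;git##git://example.com/meta.git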
+ for line in f:
+ components = line.split(";")
+ if (len (components) < 4):
+ raise MetaDataLoader.LoaderThread.LoaderImportException
+ machine = components[0]
+ distros = components[1].split("|")
+ images = components[2].split("|")
+ urls = components[3].split("|")
+
+ result[machine] = (distros, images, urls)
+
+ # Create an object representing this *potential*
+ # configuration. It can become concrete if the machine, distro
+ # and image are all chosen in the UI
+ configuration = BuildConfiguration()
+ configuration.metadata_url = self.url
+ configuration.machine_options = result
+ self.loader.configuration = configuration
+
+ # Emit that we've actually got a configuration
+ gobject.idle_add (MetaDataLoader.emit_success_signal,
+ self.loader)
+
+ except MetaDataLoader.LoaderThread.LoaderImportException as e:
+ gobject.idle_add (MetaDataLoader.emit_error_signal, self.loader,
+ "Repository metadata corrupt")
+ except Exception as e:
+ gobject.idle_add (MetaDataLoader.emit_error_signal, self.loader,
+ "Unable to download repository metadata")
+ print(e)
+
+ def try_fetch_from_url (self, url):
+ # Try and download the metadata. Firing a signal if successful
+ thread = MetaDataLoader.LoaderThread(self, url)
+ thread.start()
+
+class BuildSetupDialog (gtk.Dialog):
+ RESPONSE_BUILD = 1
+
+ # A little helper method that just sets the states on the widgets based on
+ # whether we've got good metadata or not.
+ def set_configurable (self, configurable):
+ if (self.configurable == configurable):
+ return
+
+ self.configurable = configurable
+ for widget in self.conf_widgets:
+ widget.set_sensitive (configurable)
+
+ if not configurable:
+ self.machine_combo.set_active (-1)
+ self.distribution_combo.set_active (-1)
+ self.image_combo.set_active (-1)
+
+ # GTK widget callbacks
+ def refresh_button_clicked (self, button):
+ # Refresh button clicked.
+
+ url = self.location_entry.get_chars (0, -1)
+ self.loader.try_fetch_from_url(url)
+
+ def repository_entry_editable_changed (self, entry):
+ if (len (entry.get_chars (0, -1)) > 0):
+ self.refresh_button.set_sensitive (True)
+ else:
+ self.refresh_button.set_sensitive (False)
+ self.clear_status_message()
+
+ # If we were previously configurable we are no longer since the
+ # location entry has been changed
+ self.set_configurable (False)
+
+ def machine_combo_changed (self, combobox):
+ active_iter = combobox.get_active_iter()
+
+ if not active_iter:
+ return
+
+ model = combobox.get_model()
+
+ if model:
+ chosen_machine = model.get (active_iter, 0)[0]
+
+ (distros_model, images_model) = \
+ self.loader.configuration.get_distro_and_images_models (chosen_machine)
+
+ self.distribution_combo.set_model (distros_model)
+ self.image_combo.set_model (images_model)
+
+ # Callbacks from the loader
+ def loader_success_cb (self, loader):
+ self.status_image.set_from_icon_name ("info",
+ gtk.ICON_SIZE_BUTTON)
+ self.status_image.show()
+ self.status_label.set_label ("Repository metadata successfully downloaded")
+
+ # Set the models on the combo boxes based on the models generated from
+ # the configuration that the loader has created
+
+ # We just need to set the machine here, that then determines the
+ # distro and image options. Cunning huh? :-)
+
+ self.configuration = self.loader.configuration
+ model = self.configuration.get_machines_model ()
+ self.machine_combo.set_model (model)
+
+ self.set_configurable (True)
+
+ def loader_error_cb (self, loader, message):
+ self.status_image.set_from_icon_name ("error",
+ gtk.ICON_SIZE_BUTTON)
+ self.status_image.show()
+ self.status_label.set_text ("Error downloading repository metadata")
+ for widget in self.conf_widgets:
+ widget.set_sensitive (False)
+
+ def clear_status_message (self):
+ self.status_image.hide()
+ self.status_label.set_label (
+ """<i>Enter the repository location and press _Refresh</i>""")
+
+ def __init__ (self):
+ gtk.Dialog.__init__ (self)
+
+ # Cancel
+ self.add_button (gtk.STOCK_CANCEL, gtk.RESPONSE_CANCEL)
+
+ # Build
+ button = gtk.Button ("_Build", None, True)
+ image = gtk.Image ()
+ image.set_from_stock (gtk.STOCK_EXECUTE, gtk.ICON_SIZE_BUTTON)
+ button.set_image (image)
+ self.add_action_widget (button, BuildSetupDialog.RESPONSE_BUILD)
+ button.show_all ()
+
+ # Pull in *just* the table from the Glade XML data.
+ gxml = gtk.glade.XML (os.path.dirname(__file__) + "/crumbs/puccho.glade",
+ root = "build_table")
+ table = gxml.get_widget ("build_table")
+ self.vbox.pack_start (table, True, False, 0)
+
+ # Grab all the widgets that we need to turn on/off when we refresh...
+ self.conf_widgets = []
+ self.conf_widgets += [gxml.get_widget ("machine_label")]
+ self.conf_widgets += [gxml.get_widget ("distribution_label")]
+ self.conf_widgets += [gxml.get_widget ("image_label")]
+ self.conf_widgets += [gxml.get_widget ("machine_combo")]
+ self.conf_widgets += [gxml.get_widget ("distribution_combo")]
+ self.conf_widgets += [gxml.get_widget ("image_combo")]
+
+ # Grab the status widgets
+ self.status_image = gxml.get_widget ("status_image")
+ self.status_label = gxml.get_widget ("status_label")
+
+ # Grab the refresh button and connect to the clicked signal
+ self.refresh_button = gxml.get_widget ("refresh_button")
+ self.refresh_button.connect ("clicked", self.refresh_button_clicked)
+
+ # Grab the location entry and connect to editable::changed
+ self.location_entry = gxml.get_widget ("location_entry")
+ self.location_entry.connect ("changed",
+ self.repository_entry_editable_changed)
+
+ # Grab the machine combo and hook onto the changed signal. This then
+ # allows us to populate the distro and image combos
+ self.machine_combo = gxml.get_widget ("machine_combo")
+ self.machine_combo.connect ("changed", self.machine_combo_changed)
+
+ # Setup the combo
+ cell = gtk.CellRendererText()
+ self.machine_combo.pack_start(cell, True)
+ self.machine_combo.add_attribute(cell, 'text', 0)
+
+ # Grab the distro and image combos. We need these to populate with
+ # models once the machine is chosen
+ self.distribution_combo = gxml.get_widget ("distribution_combo")
+ cell = gtk.CellRendererText()
+ self.distribution_combo.pack_start(cell, True)
+ self.distribution_combo.add_attribute(cell, 'text', 0)
+
+ self.image_combo = gxml.get_widget ("image_combo")
+ cell = gtk.CellRendererText()
+ self.image_combo.pack_start(cell, True)
+ self.image_combo.add_attribute(cell, 'text', 0)
+
+ # Put the default descriptive text in the status box
+ self.clear_status_message()
+
+        # Mark as non-configurable; this just greys out the widgets the
+ # user can't yet use
+ self.configurable = False
+ self.set_configurable(False)
+
+ # Show the table
+ table.show_all ()
+
+ # The loader and some signals connected to it to update the status
+ # area
+ self.loader = MetaDataLoader()
+ self.loader.connect ("success", self.loader_success_cb)
+ self.loader.connect ("error", self.loader_error_cb)
+
+ def update_configuration (self):
+ """ A poorly named function but it updates the internal configuration
+ from the widgets. This can make that configuration concrete and can
+ thus be used for building """
+ # Extract the chosen machine from the combo
+ model = self.machine_combo.get_model()
+ active_iter = self.machine_combo.get_active_iter()
+ if (active_iter):
+ self.configuration.machine = model.get(active_iter, 0)[0]
+
+ # Extract the chosen distro from the combo
+ model = self.distribution_combo.get_model()
+ active_iter = self.distribution_combo.get_active_iter()
+ if (active_iter):
+ self.configuration.distro = model.get(active_iter, 0)[0]
+
+ # Extract the chosen image from the combo
+ model = self.image_combo.get_model()
+ active_iter = self.image_combo.get_active_iter()
+ if (active_iter):
+ self.configuration.image = model.get(active_iter, 0)[0]
+
+# This function pulls events out of the event queue and pushes them into the
+# RunningBuild (which in turn updates the progress tree view).
+#
+# TODO: Should be a method on the RunningBuild class
+def event_handle_timeout (eventHandler, build):
+ # Consume as many messages as we can ...
+ event = eventHandler.getEvent()
+ while event:
+ build.handle_event (event)
+ event = eventHandler.getEvent()
+ return True
+
+class MainWindow (gtk.Window):
+
+ # Callback that gets fired when the user hits a button in the
+ # BuildSetupDialog.
+ def build_dialog_box_response_cb (self, dialog, response_id):
+ conf = None
+ if (response_id == BuildSetupDialog.RESPONSE_BUILD):
+ dialog.update_configuration()
+ print(dialog.configuration.machine, dialog.configuration.distro, \
+ dialog.configuration.image)
+ conf = dialog.configuration
+
+ dialog.destroy()
+
+ if conf:
+ self.manager.do_build (conf)
+
+ def build_button_clicked_cb (self, button):
+ dialog = BuildSetupDialog ()
+
+ # For some unknown reason Dialog.run causes nice little deadlocks ... :-(
+ dialog.connect ("response", self.build_dialog_box_response_cb)
+ dialog.show()
+
+ def __init__ (self):
+ gtk.Window.__init__ (self)
+
+ # Pull in *just* the main vbox from the Glade XML data and then pack
+ # that inside the window
+ gxml = gtk.glade.XML (os.path.dirname(__file__) + "/crumbs/puccho.glade",
+ root = "main_window_vbox")
+ vbox = gxml.get_widget ("main_window_vbox")
+ self.add (vbox)
+
+ # Create the tree views for the build manager view and the progress view
+ self.build_manager_view = BuildManagerTreeView()
+ self.running_build_view = RunningBuildTreeView()
+
+ # Grab the scrolled windows that we put the tree views into
+ self.results_scrolledwindow = gxml.get_widget ("results_scrolledwindow")
+ self.progress_scrolledwindow = gxml.get_widget ("progress_scrolledwindow")
+
+ # Put the tree views inside ...
+ self.results_scrolledwindow.add (self.build_manager_view)
+ self.progress_scrolledwindow.add (self.running_build_view)
+
+ # Hook up the build button...
+ self.build_button = gxml.get_widget ("main_toolbutton_build")
+ self.build_button.connect ("clicked", self.build_button_clicked_cb)
+
+# I'm not very happy about the current ownership of the RunningBuild. I have
+# my suspicions that this object should be held by the BuildManager since we
+# care about the signals in the manager
+
+def running_build_succeeded_cb (running_build, manager):
+ # Notify the manager that a build has succeeded. This is necessary as part
+ # of the 'hack' that we use for making the row in the model / view
+ # representing the ongoing build change into a row representing the
+    # completed build. Since we know only one build can be running at a time,
+ # we can handle this.
+
+ # FIXME: Refactor all this so that the RunningBuild is owned by the
+ # BuildManager. It can then hook onto the signals directly and drive
+ # interesting things it cares about.
+ manager.notify_build_succeeded ()
+ print("build succeeded")
+
+def running_build_failed_cb (running_build, manager):
+ # As above
+ print("build failed")
+ manager.notify_build_failed ()
+
+def main (server, eventHandler):
+ # Initialise threading...
+ gobject.threads_init()
+ gtk.gdk.threads_init()
+
+ main_window = MainWindow ()
+ main_window.show_all ()
+
+ # Set up the build manager stuff in general
+ builds_dir = os.path.join (os.getcwd(), "results")
+ manager = BuildManager (server, builds_dir)
+ main_window.build_manager_view.set_model (manager.model)
+
+ # Do the running build setup
+ running_build = RunningBuild ()
+ main_window.running_build_view.set_model (running_build.model)
+ running_build.connect ("build-succeeded", running_build_succeeded_cb,
+ manager)
+ running_build.connect ("build-failed", running_build_failed_cb, manager)
+
+ # We need to save the manager into the MainWindow so that the toolbar
+ # button can use it.
+ # FIXME: Refactor ?
+ main_window.manager = manager
+
+ # Use a timeout function for probing the event queue to find out if we
+ # have a message waiting for us.
+ gobject.timeout_add (200,
+ event_handle_timeout,
+ eventHandler,
+ running_build)
+
+ gtk.main()
diff --git a/bitbake/lib/bb/ui/toasterui.py b/bitbake/lib/bb/ui/toasterui.py
new file mode 100644
index 0000000..9c7e87d
--- /dev/null
+++ b/bitbake/lib/bb/ui/toasterui.py
@@ -0,0 +1,355 @@
+#
+# BitBake ToasterUI Implementation
+# based on (No)TTY UI Implementation by Richard Purdie
+#
+# Handling output to TTYs or files (no TTY)
+#
+# Copyright (C) 2006-2012 Richard Purdie
+# Copyright (C) 2013 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from __future__ import division
+import sys
+try:
+ import bb
+except RuntimeError as exc:
+ sys.exit(str(exc))
+
+from bb.ui import uihelper
+from bb.ui.buildinfohelper import BuildInfoHelper
+
+import bb.msg
+import logging
+import os
+
+# pylint: disable=invalid-name
+# module properties for UI modules are read by bitbake and the contract should not be broken
+
+
+featureSet = [bb.cooker.CookerFeatures.HOB_EXTRA_CACHES, bb.cooker.CookerFeatures.SEND_DEPENDS_TREE, bb.cooker.CookerFeatures.BASEDATASTORE_TRACKING, bb.cooker.CookerFeatures.SEND_SANITYEVENTS]
+
+logger = logging.getLogger("ToasterLogger")
+interactive = sys.stdout.isatty()
+
+
+
+def _log_settings_from_server(server):
+ # Get values of variables which control our output
+ includelogs, error = server.runCommand(["getVariable", "BBINCLUDELOGS"])
+ if error:
+ logger.error("Unable to get the value of BBINCLUDELOGS variable: %s", error)
+ raise BaseException(error)
+ loglines, error = server.runCommand(["getVariable", "BBINCLUDELOGS_LINES"])
+ if error:
+ logger.error("Unable to get the value of BBINCLUDELOGS_LINES variable: %s", error)
+ raise BaseException(error)
+ consolelogfile, error = server.runCommand(["getVariable", "BB_CONSOLELOG"])
+ if error:
+ logger.error("Unable to get the value of BB_CONSOLELOG variable: %s", error)
+ raise BaseException(error)
+ return includelogs, loglines, consolelogfile
+
+
+def main(server, eventHandler, params ):
+ helper = uihelper.BBUIHelper()
+
+ console = logging.StreamHandler(sys.stdout)
+ format_str = "%(levelname)s: %(message)s"
+ formatter = bb.msg.BBLogFormatter(format_str)
+ bb.msg.addDefaultlogFilter(console)
+ console.setFormatter(formatter)
+ logger.addHandler(console)
+ logger.setLevel(logging.INFO)
+
+ _, _, consolelogfile = _log_settings_from_server(server)
+
+ # verify and warn
+ build_history_enabled = True
+ inheritlist, _ = server.runCommand(["getVariable", "INHERIT"])
+
+ if not "buildhistory" in inheritlist.split(" "):
+ logger.warn("buildhistory is not enabled. Please enable INHERIT += \"buildhistory\" to see image details.")
+ build_history_enabled = False
+
+ if not params.observe_only:
+ logger.error("ToasterUI can only work in observer mode")
+ return 1
+
+
+ main.shutdown = 0
+ interrupted = False
+ return_value = 0
+ errors = 0
+ warnings = 0
+ taskfailures = []
+ first = True
+
+ buildinfohelper = BuildInfoHelper(server, build_history_enabled)
+
+ if buildinfohelper.brbe is not None and consolelogfile:
+ # if we are under managed mode we have no other UI and we need to write our own file
+ bb.utils.mkdirhier(os.path.dirname(consolelogfile))
+ conlogformat = bb.msg.BBLogFormatter(format_str)
+ consolelog = logging.FileHandler(consolelogfile)
+ bb.msg.addDefaultlogFilter(consolelog)
+ consolelog.setFormatter(conlogformat)
+ logger.addHandler(consolelog)
+
+
+ while True:
+ try:
+ event = eventHandler.waitEvent(0.25)
+ if first:
+ first = False
+ logger.info("ToasterUI waiting for events")
+
+ if event is None:
+ if main.shutdown > 0:
+ break
+ continue
+
+ helper.eventHandler(event)
+
+ # pylint: disable=protected-access
+ # the code will look into the protected variables of the event; no easy way around this
+
+ if isinstance(event, bb.event.BuildStarted):
+ buildinfohelper.store_started_build(event)
+
+ if isinstance(event, (bb.build.TaskStarted, bb.build.TaskSucceeded, bb.build.TaskFailedSilent)):
+ buildinfohelper.update_and_store_task(event)
+ logger.warn("Logfile for task %s", event.logfile)
+ continue
+
+ if isinstance(event, bb.build.TaskBase):
+ logger.info(event._message)
+
+ if isinstance(event, bb.event.LogExecTTY):
+ logger.warn(event.msg)
+ continue
+
+ if isinstance(event, logging.LogRecord):
+ if event.levelno == -1:
+ event.levelno = formatter.ERROR
+
+ buildinfohelper.store_log_event(event)
+ if event.levelno >= formatter.ERROR:
+ errors = errors + 1
+ elif event.levelno == formatter.WARNING:
+ warnings = warnings + 1
+ # For "normal" logging conditions, don't show note logs from tasks
+ # but do show them if the user has changed the default log level to
+ # include verbose/debug messages
+ if event.taskpid != 0 and event.levelno <= formatter.NOTE:
+ continue
+
+ logger.handle(event)
+ continue
+
+ if isinstance(event, bb.build.TaskFailed):
+ buildinfohelper.update_and_store_task(event)
+ logfile = event.logfile
+ if logfile and os.path.exists(logfile):
+ bb.error("Logfile of failure stored in: %s" % logfile)
+ continue
+
+ # these events are unprocessed now, but may be used in the future to log
+            # timing and error information from the parsing phase in Toaster
+ if isinstance(event, (bb.event.SanityCheckPassed, bb.event.SanityCheck)):
+ continue
+ if isinstance(event, bb.event.ParseStarted):
+ continue
+ if isinstance(event, bb.event.ParseProgress):
+ continue
+ if isinstance(event, bb.event.ParseCompleted):
+ continue
+ if isinstance(event, bb.event.CacheLoadStarted):
+ continue
+ if isinstance(event, bb.event.CacheLoadProgress):
+ continue
+ if isinstance(event, bb.event.CacheLoadCompleted):
+ continue
+ if isinstance(event, bb.event.MultipleProviders):
+ logger.info("multiple providers are available for %s%s (%s)", event._is_runtime and "runtime " or "",
+ event._item,
+ ", ".join(event._candidates))
+ logger.info("consider defining a PREFERRED_PROVIDER entry to match %s", event._item)
+ continue
+
+ if isinstance(event, bb.event.NoProvider):
+ errors = errors + 1
+ if event._runtime:
+ r = "R"
+ else:
+ r = ""
+
+ if event._dependees:
+ text = "Nothing %sPROVIDES '%s' (but %s %sDEPENDS on or otherwise requires it)" % (r, event._item, ", ".join(event._dependees), r)
+ else:
+ text = "Nothing %sPROVIDES '%s'" % (r, event._item)
+
+ logger.error(text)
+ if event._reasons:
+ for reason in event._reasons:
+ logger.error("%s", reason)
+ text += reason
+ buildinfohelper.store_log_error(text)
+ continue
+
+ if isinstance(event, bb.event.ConfigParsed):
+ continue
+ if isinstance(event, bb.event.RecipeParsed):
+ continue
+
+ # end of saved events
+
+ if isinstance(event, (bb.runqueue.sceneQueueTaskStarted, bb.runqueue.runQueueTaskStarted, bb.runqueue.runQueueTaskSkipped)):
+ buildinfohelper.store_started_task(event)
+ continue
+
+ if isinstance(event, bb.runqueue.runQueueTaskCompleted):
+ buildinfohelper.update_and_store_task(event)
+ continue
+
+ if isinstance(event, bb.runqueue.runQueueTaskFailed):
+ buildinfohelper.update_and_store_task(event)
+ taskfailures.append(event.taskstring)
+ logger.error("Task %s (%s) failed with exit code '%s'",
+ event.taskid, event.taskstring, event.exitcode)
+ continue
+
+ if isinstance(event, (bb.runqueue.sceneQueueTaskCompleted, bb.runqueue.sceneQueueTaskFailed)):
+ buildinfohelper.update_and_store_task(event)
+ continue
+
+
+ if isinstance(event, (bb.event.TreeDataPreparationStarted, bb.event.TreeDataPreparationCompleted)):
+ continue
+
+ if isinstance(event, (bb.event.BuildCompleted, bb.command.CommandFailed)):
+
+ errorcode = 0
+ if isinstance(event, bb.command.CommandFailed):
+ errors += 1
+ errorcode = 1
+ logger.error("Command execution failed: %s", event.error)
+
+ # update the build info helper on BuildCompleted, not on CommandXXX
+ buildinfohelper.update_build_information(event, errors, warnings, taskfailures)
+ buildinfohelper.close(errorcode)
+ # mark the log output; controllers may kill the toasterUI after seeing this log
+ logger.info("ToasterUI build done 1, brbe: %s", buildinfohelper.brbe )
+
+ # we start a new build info
+ if buildinfohelper.brbe is not None:
+
+ logger.debug("ToasterUI under BuildEnvironment management - exiting after the build")
+ server.terminateServer()
+ else:
+ logger.debug("ToasterUI prepared for new build")
+ errors = 0
+ warnings = 0
+ taskfailures = []
+ buildinfohelper = BuildInfoHelper(server, build_history_enabled)
+
+ logger.info("ToasterUI build done 2")
+ continue
+
+ if isinstance(event, (bb.command.CommandCompleted,
+ bb.command.CommandFailed,
+ bb.command.CommandExit)):
+ errorcode = 0
+
+ continue
+
+ if isinstance(event, bb.event.MetadataEvent):
+ if event.type == "SinglePackageInfo":
+ buildinfohelper.store_build_package_information(event)
+ elif event.type == "LayerInfo":
+ buildinfohelper.store_layer_info(event)
+ elif event.type == "BuildStatsList":
+ buildinfohelper.store_tasks_stats(event)
+ elif event.type == "ImagePkgList":
+ buildinfohelper.store_target_package_data(event)
+ elif event.type == "MissedSstate":
+ buildinfohelper.store_missed_state_tasks(event)
+ elif event.type == "ImageFileSize":
+ buildinfohelper.update_target_image_file(event)
+ elif event.type == "ArtifactFileSize":
+ buildinfohelper.update_artifact_image_file(event)
+ elif event.type == "LicenseManifestPath":
+ buildinfohelper.store_license_manifest_path(event)
+ else:
+ logger.error("Unprocessed MetadataEvent %s ", str(event))
+ continue
+
+ if isinstance(event, bb.cooker.CookerExit):
+ # exit when the server exits
+ break
+
+ # ignore
+ if isinstance(event, (bb.event.BuildBase,
+ bb.event.StampUpdate,
+ bb.event.RecipePreFinalise,
+ bb.runqueue.runQueueEvent,
+ bb.runqueue.runQueueExitWait,
+ bb.event.OperationProgress,
+ bb.command.CommandFailed,
+ bb.command.CommandExit,
+ bb.command.CommandCompleted)):
+ continue
+
+ if isinstance(event, bb.event.DepTreeGenerated):
+ buildinfohelper.store_dependency_information(event)
+ continue
+
+ logger.error("Unknown event: %s", event)
+ return_value += 1
+
+ except EnvironmentError as ioerror:
+ # ignore interrupted io
+ if ioerror.args[0] == 4:
+ pass
+ except KeyboardInterrupt:
+ main.shutdown = 1
+ except Exception as e:
+ # print errors to log
+ import traceback
+ from pprint import pformat
+ exception_data = traceback.format_exc()
+ logger.error("%s\n%s" , e, exception_data)
+
+ _, _, tb = sys.exc_info()
+ if tb is not None:
+ curr = tb
+ while curr is not None:
+ logger.warn("Error data dump %s\n%s\n" , traceback.format_tb(curr,1), pformat(curr.tb_frame.f_locals))
+ curr = curr.tb_next
+
+ # save them to database, if possible; if it fails, we already logged to console.
+ try:
+ buildinfohelper.store_log_exception("%s\n%s" % (str(e), exception_data))
+ except Exception as ce:
+                logger.error("CRITICAL - Failed to save toaster exception to the database: %s", str(ce))
+
+ # make sure we return with an error
+ return_value += 1
+
+ if interrupted:
+ if return_value == 0:
+ return_value += 1
+
+ logger.warn("Return value is %d", return_value)
+ return return_value
diff --git a/bitbake/lib/bb/ui/uievent.py b/bitbake/lib/bb/ui/uievent.py
new file mode 100644
index 0000000..7fc50c7
--- /dev/null
+++ b/bitbake/lib/bb/ui/uievent.py
@@ -0,0 +1,155 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2006 - 2007 Michael 'Mickey' Lauer
+# Copyright (C) 2006 - 2007 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+
+"""
+Use this class to fork off a thread to receive event callbacks from the bitbake
+server and queue them for the UI to process. This process must be used to avoid
+client/server deadlocks.
+"""
+
+import socket, threading, pickle
+import bb
+from SimpleXMLRPCServer import SimpleXMLRPCServer, SimpleXMLRPCRequestHandler
+
+class BBUIEventQueue:
+    def __init__(self, BBServer, clientinfo=("localhost", 0)):
+
+ self.eventQueue = []
+ self.eventQueueLock = threading.Lock()
+ self.eventQueueNotify = threading.Event()
+
+ self.BBServer = BBServer
+ self.clientinfo = clientinfo
+
+ server = UIXMLRPCServer(self.clientinfo)
+ self.host, self.port = server.socket.getsockname()
+
+ server.register_function( self.system_quit, "event.quit" )
+ server.register_function( self.send_event, "event.sendpickle" )
+ server.socket.settimeout(1)
+
+        self.EventHandle = None
+ count_tries = 0
+
+ # the event handler registration may fail here due to cooker being in invalid state
+ # this is a transient situation, and we should retry a couple of times before
+ # giving up
+
+        while self.EventHandle is None and count_tries < 5:
+ self.EventHandle = self.BBServer.registerEventHandler(self.host, self.port)
+
+ if (self.EventHandle != None):
+ break
+
+ bb.warn("Could not register UI event handler %s:%d, retry" % (self.host, self.port))
+ count_tries += 1
+ import time
+ time.sleep(1)
+
+
+ if self.EventHandle == None:
+ raise Exception("Could not register UI event handler")
+
+ self.server = server
+
+ self.t = threading.Thread()
+ self.t.setDaemon(True)
+ self.t.run = self.startCallbackHandler
+ self.t.start()
+
+ def getEvent(self):
+
+ self.eventQueueLock.acquire()
+
+ if len(self.eventQueue) == 0:
+ self.eventQueueLock.release()
+ return None
+
+ item = self.eventQueue.pop(0)
+
+ if len(self.eventQueue) == 0:
+ self.eventQueueNotify.clear()
+
+ self.eventQueueLock.release()
+ return item
+
+ def waitEvent(self, delay):
+ self.eventQueueNotify.wait(delay)
+ return self.getEvent()
+
+ def queue_event(self, event):
+ self.eventQueueLock.acquire()
+ self.eventQueue.append(event)
+ self.eventQueueNotify.set()
+ self.eventQueueLock.release()
+
+ def send_event(self, event):
+ self.queue_event(pickle.loads(event))
+
+ def startCallbackHandler(self):
+
+ self.server.timeout = 1
+ while not self.server.quit:
+ try:
+ self.server.handle_request()
+ except Exception as e:
+ import traceback
+                bb.error("BBUIEventQueue.startCallbackHandler: Exception while trying to handle request: %s\n%s" % (e, traceback.format_exc()))
+
+ self.server.server_close()
+
+ def system_quit( self ):
+ """
+ Shut down the callback thread
+ """
+ try:
+ self.BBServer.unregisterEventHandler(self.EventHandle)
+ except:
+ pass
+ self.server.quit = True
+
+class UIXMLRPCServer (SimpleXMLRPCServer):
+
+ def __init__( self, interface ):
+ self.quit = False
+ SimpleXMLRPCServer.__init__( self,
+ interface,
+ requestHandler=SimpleXMLRPCRequestHandler,
+ logRequests=False, allow_none=True)
+
+ def get_request(self):
+ while not self.quit:
+ try:
+ sock, addr = self.socket.accept()
+ sock.settimeout(1)
+ return (sock, addr)
+ except socket.timeout:
+ pass
+ return (None, None)
+
+ def close_request(self, request):
+ if request is None:
+ return
+ SimpleXMLRPCServer.close_request(self, request)
+
+ def process_request(self, request, client_address):
+ if request is None:
+ return
+ SimpleXMLRPCServer.process_request(self, request, client_address)
+
diff --git a/bitbake/lib/bb/ui/uihelper.py b/bitbake/lib/bb/ui/uihelper.py
new file mode 100644
index 0000000..a703387
--- /dev/null
+++ b/bitbake/lib/bb/ui/uihelper.py
@@ -0,0 +1,100 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2006 - 2007 Michael 'Mickey' Lauer
+# Copyright (C) 2006 - 2007 Richard Purdie
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import bb.build
+
+class BBUIHelper:
+ def __init__(self):
+ self.needUpdate = False
+ self.running_tasks = {}
+ # Running PIDs preserves the order tasks were executed in
+ self.running_pids = []
+ self.failed_tasks = []
+ self.tasknumber_current = 0
+ self.tasknumber_total = 0
+
+ def eventHandler(self, event):
+ if isinstance(event, bb.build.TaskStarted):
+ self.running_tasks[event.pid] = { 'title' : "%s %s" % (event._package, event._task) }
+ self.running_pids.append(event.pid)
+ self.needUpdate = True
+ if isinstance(event, bb.build.TaskSucceeded):
+ del self.running_tasks[event.pid]
+ self.running_pids.remove(event.pid)
+ self.needUpdate = True
+ if isinstance(event, bb.build.TaskFailedSilent):
+ del self.running_tasks[event.pid]
+ self.running_pids.remove(event.pid)
+ # Don't add to the failed tasks list since this is e.g. a setscene task failure
+ self.needUpdate = True
+ if isinstance(event, bb.build.TaskFailed):
+ del self.running_tasks[event.pid]
+ self.running_pids.remove(event.pid)
+ self.failed_tasks.append( { 'title' : "%s %s" % (event._package, event._task)})
+ self.needUpdate = True
+ if isinstance(event, bb.runqueue.runQueueTaskStarted) or isinstance(event, bb.runqueue.sceneQueueTaskStarted):
+ self.tasknumber_current = event.stats.completed + event.stats.active + event.stats.failed + 1
+ self.tasknumber_total = event.stats.total
+ self.needUpdate = True
+
+ def getTasks(self):
+ self.needUpdate = False
+ return (self.running_tasks, self.failed_tasks)
+
+ def findServerDetails(self):
+ import sys
+ import optparse
+ from bb.server.xmlrpc import BitbakeServerInfo, BitBakeServerConnection
+ host = ""
+ port = 0
+ bind = ""
+ parser = optparse.OptionParser(
+ usage = """%prog -H host -P port -B bindaddr""")
+
+ parser.add_option("-H", "--host", help = "Bitbake server's IP address",
+ action = "store", dest = "host", default = None)
+
+ parser.add_option("-P", "--port", help = "Bitbake server's Port number",
+ action = "store", dest = "port", default = None)
+
+ parser.add_option("-B", "--bind", help = "Hob2 local bind address",
+ action = "store", dest = "bind", default = None)
+
+ options, args = parser.parse_args(sys.argv)
+ for key, val in options.__dict__.items():
+ if key == 'host' and val:
+ host = val
+ elif key == 'port' and val:
+ port = int(val)
+ elif key == 'bind' and val:
+ bind = val
+
+ if not host or not port or not bind:
+ parser.print_usage()
+ sys.exit(1)
+
+ serverinfo = BitbakeServerInfo(host, port)
+ clientinfo = (bind, 0)
+ connection = BitBakeServerConnection(serverinfo, clientinfo)
+
+ server = connection.connection
+ eventHandler = connection.events
+
+ return server, eventHandler, host, bind
+
diff --git a/bitbake/lib/bb/utils.py b/bitbake/lib/bb/utils.py
new file mode 100644
index 0000000..91faa49
--- /dev/null
+++ b/bitbake/lib/bb/utils.py
@@ -0,0 +1,1312 @@
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+"""
+BitBake Utility Functions
+"""
+
+# Copyright (C) 2004 Michael Lauer
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import re, fcntl, os, string, stat, shutil, time
+import sys
+import errno
+import logging
+import bb
+import bb.msg
+import multiprocessing
+import fcntl
+import subprocess
+import glob
+import fnmatch
+import traceback
+import errno
+import signal
+from commands import getstatusoutput
+from contextlib import contextmanager
+from ctypes import cdll
+
+
+logger = logging.getLogger("BitBake.Util")
+
+def clean_context():
+ return {
+ "os": os,
+ "bb": bb,
+ "time": time,
+ }
+
+def get_context():
+ return _context
+
+
+def set_context(ctx):
+ _context = ctx
+
+# Context used in better_exec, eval
+_context = clean_context()
+
+class VersionStringException(Exception):
+ """Exception raised when an invalid version specification is found"""
+
+def explode_version(s):
+ r = []
+ alpha_regexp = re.compile('^([a-zA-Z]+)(.*)$')
+ numeric_regexp = re.compile('^(\d+)(.*)$')
+ while (s != ''):
+ if s[0] in string.digits:
+ m = numeric_regexp.match(s)
+ r.append((0, int(m.group(1))))
+ s = m.group(2)
+ continue
+ if s[0] in string.letters:
+ m = alpha_regexp.match(s)
+ r.append((1, m.group(1)))
+ s = m.group(2)
+ continue
+ if s[0] == '~':
+ r.append((-1, s[0]))
+ else:
+ r.append((2, s[0]))
+ s = s[1:]
+ return r
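+
+# For example:
+#   explode_version("1.2rc3") -> [(0, 1), (2, '.'), (0, 2), (1, 'rc'), (0, 3)]
+# i.e. digit runs become (0, int), letter runs (1, str), '~' becomes (-1, '~')
+# and any other separator character becomes (2, char).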
+
+def split_version(s):
+ """Split a version string into its constituent parts (PE, PV, PR)"""
+ s = s.strip(" <>=")
+ e = 0
+ if s.count(':'):
+ e = int(s.split(":")[0])
+ s = s.split(":")[1]
+ r = ""
+ if s.count('-'):
+ r = s.rsplit("-", 1)[1]
+ s = s.rsplit("-", 1)[0]
+ v = s
+ return (e, v, r)
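+
+# For example:
+#   split_version("2:1.0-r1") -> (2, '1.0', 'r1')
+#   split_version("1.2.3")    -> (0, '1.2.3', '')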
+
+def vercmp_part(a, b):
+ va = explode_version(a)
+ vb = explode_version(b)
+ while True:
+ if va == []:
+ (oa, ca) = (0, None)
+ else:
+ (oa, ca) = va.pop(0)
+ if vb == []:
+ (ob, cb) = (0, None)
+ else:
+ (ob, cb) = vb.pop(0)
+ if (oa, ca) == (0, None) and (ob, cb) == (0, None):
+ return 0
+ if oa < ob:
+ return -1
+ elif oa > ob:
+ return 1
+ elif ca < cb:
+ return -1
+ elif ca > cb:
+ return 1
+
+def vercmp(ta, tb):
+ (ea, va, ra) = ta
+ (eb, vb, rb) = tb
+
+ r = int(ea or 0) - int(eb or 0)
+ if (r == 0):
+ r = vercmp_part(va, vb)
+ if (r == 0):
+ r = vercmp_part(ra, rb)
+ return r
+
+def vercmp_string(a, b):
+ ta = split_version(a)
+ tb = split_version(b)
+ return vercmp(ta, tb)
+
+def vercmp_string_op(a, b, op):
+ """
+ Compare two versions and check if the specified comparison operator matches the result of the comparison.
+ This function is fairly liberal about what operators it will accept since there are a variety of styles
+ depending on the context.
+ """
+ res = vercmp_string(a, b)
+ if op in ('=', '=='):
+ return res == 0
+ elif op == '<=':
+ return res <= 0
+ elif op == '>=':
+ return res >= 0
+ elif op in ('>', '>>'):
+ return res > 0
+ elif op in ('<', '<<'):
+ return res < 0
+ elif op == '!=':
+ return res != 0
+ else:
+ raise VersionStringException('Unsupported comparison operator "%s"' % op)
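+
+# For example:
+#   vercmp_string("1.0", "1.1") < 0
+#   vercmp_string_op("1.0", "1.1", "<=") -> True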
+
+def explode_deps(s):
+ """
+ Take an RDEPENDS style string of format:
+ "DEPEND1 (optional version) DEPEND2 (optional version) ..."
+ and return a list of dependencies.
+ Version information is ignored.
+ """
+ r = []
+ l = s.split()
+ flag = False
+ for i in l:
+ if i[0] == '(':
+ flag = True
+ #j = []
+ if not flag:
+ r.append(i)
+ #else:
+ # j.append(i)
+ if flag and i.endswith(')'):
+ flag = False
+ # Ignore version
+ #r[-1] += ' ' + ' '.join(j)
+ return r
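+
+# For example:
+#   explode_deps("foo (>= 1.0) bar") -> ['foo', 'bar']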
+
+def explode_dep_versions2(s):
+ """
+ Take an RDEPENDS style string of format:
+ "DEPEND1 (optional version) DEPEND2 (optional version) ..."
+ and return a dictionary of dependencies and versions.
+ """
+ r = {}
+ l = s.replace(",", "").split()
+ lastdep = None
+ lastcmp = ""
+ lastver = ""
+ incmp = False
+ inversion = False
+ for i in l:
+ if i[0] == '(':
+ incmp = True
+ i = i[1:].strip()
+ if not i:
+ continue
+
+ if incmp:
+ incmp = False
+ inversion = True
+ # This list is based on behavior and supported comparisons from deb, opkg and rpm.
+ #
+ # Even though =<, <<, ==, !=, =>, and >> may not be supported,
+ # we list each possibly valid item.
+ # The build system is responsible for validation of what it supports.
+ if i.startswith(('<=', '=<', '<<', '==', '!=', '>=', '=>', '>>')):
+ lastcmp = i[0:2]
+ i = i[2:]
+ elif i.startswith(('<', '>', '=')):
+ lastcmp = i[0:1]
+ i = i[1:]
+ else:
+ # This is an unsupported case!
+ raise VersionStringException('Invalid version specification in "(%s" - invalid or missing operator' % i)
+ lastcmp = (i or "")
+ i = ""
+ i.strip()
+ if not i:
+ continue
+
+ if inversion:
+ if i.endswith(')'):
+ i = i[:-1] or ""
+ inversion = False
+ if lastver and i:
+ lastver += " "
+ if i:
+ lastver += i
+ if lastdep not in r:
+ r[lastdep] = []
+ r[lastdep].append(lastcmp + " " + lastver)
+ continue
+
+ #if not inversion:
+ lastdep = i
+ lastver = ""
+ lastcmp = ""
+ if not (i in r and r[i]):
+ r[lastdep] = []
+
+ return r
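+
+# For example:
+#   explode_dep_versions2("foo (>= 1.0) bar") -> {'foo': ['>= 1.0'], 'bar': []}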
+
+def explode_dep_versions(s):
+ r = explode_dep_versions2(s)
+ for d in r:
+ if not r[d]:
+ r[d] = None
+ continue
+ if len(r[d]) > 1:
+ bb.warn("explode_dep_versions(): Item %s appeared in dependency string '%s' multiple times with different values. explode_dep_versions cannot cope with this." % (d, s))
+ r[d] = r[d][0]
+ return r
+
+def join_deps(deps, commasep=True):
+ """
+ Take the result from explode_dep_versions and generate a dependency string
+ """
+ result = []
+ for dep in deps:
+ if deps[dep]:
+ if isinstance(deps[dep], list):
+ for v in deps[dep]:
+ result.append(dep + " (" + v + ")")
+ else:
+ result.append(dep + " (" + deps[dep] + ")")
+ else:
+ result.append(dep)
+ if commasep:
+ return ", ".join(result)
+ else:
+ return " ".join(result)
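+
+# For example:
+#   join_deps({'foo': ['>= 1.0'], 'bar': []}) -> "foo (>= 1.0), bar"
+# (key order follows dictionary iteration order)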
+
+def _print_trace(body, line):
+ """
+ Print the Environment of a Text Body
+ """
+ error = []
+ # print the environment of the method
+ min_line = max(1, line-4)
+ max_line = min(line + 4, len(body))
+ for i in range(min_line, max_line + 1):
+ if line == i:
+ error.append(' *** %.4d:%s' % (i, body[i-1].rstrip()))
+ else:
+ error.append(' %.4d:%s' % (i, body[i-1].rstrip()))
+ return error
+
+def better_compile(text, file, realfile, mode = "exec"):
+ """
+ A better compile method. This method
+ will print the offending lines.
+ """
+ try:
+ return compile(text, file, mode)
+ except Exception as e:
+ error = []
+ # split the text into lines again
+ body = text.split('\n')
+ error.append("Error in compiling python function in %s:\n" % realfile)
+ if e.lineno:
+ error.append("The code lines resulting in this error were:")
+ error.extend(_print_trace(body, e.lineno))
+ else:
+ error.append("The function causing this error was:")
+ for line in body:
+ error.append(line)
+ error.append("%s: %s" % (e.__class__.__name__, str(e)))
+
+ logger.error("\n".join(error))
+
+ e = bb.BBHandledException(e)
+ raise e
+
+def _print_exception(t, value, tb, realfile, text, context):
+ error = []
+ try:
+ exception = traceback.format_exception_only(t, value)
+ error.append('Error executing a python function in %s:\n' % realfile)
+
+ # Strip 'us' from the stack (better_exec call)
+ tb = tb.tb_next
+
+ textarray = text.split('\n')
+
+ linefailed = tb.tb_lineno
+
+ tbextract = traceback.extract_tb(tb)
+ tbformat = traceback.format_list(tbextract)
+ error.append("The stack trace of python calls that resulted in this exception/failure was:")
+ error.append("File: '%s', lineno: %s, function: %s" % (tbextract[0][0], tbextract[0][1], tbextract[0][2]))
+ error.extend(_print_trace(textarray, linefailed))
+
+ # See if this is a function we constructed and has calls back into other functions in
+ # "text". If so, try and improve the context of the error by diving down the trace
+ level = 0
+ nexttb = tb.tb_next
+ while nexttb is not None and (level+1) < len(tbextract):
+ error.append("File: '%s', lineno: %s, function: %s" % (tbextract[level+1][0], tbextract[level+1][1], tbextract[level+1][2]))
+ if tbextract[level][0] == tbextract[level+1][0] and tbextract[level+1][2] == tbextract[level][0]:
+ # The code was possibly in the string we compiled ourselves
+ error.extend(_print_trace(textarray, tbextract[level+1][1]))
+ elif tbextract[level+1][0].startswith("/"):
+ # The code looks like it might be in a file, try and load it
+ try:
+ with open(tbextract[level+1][0], "r") as f:
+ text = f.readlines()
+ error.extend(_print_trace(text, tbextract[level+1][1]))
+ except:
+ error.append(tbformat[level+1])
+ elif "d" in context and tbextract[level+1][2]:
+ # Try and find the code in the datastore based on the functionname
+ d = context["d"]
+ functionname = tbextract[level+1][2]
+ text = d.getVar(functionname, True)
+ if text:
+ error.extend(_print_trace(text.split('\n'), tbextract[level+1][1]))
+ else:
+ error.append(tbformat[level+1])
+ else:
+ error.append(tbformat[level+1])
+ nexttb = tb.tb_next
+ level = level + 1
+
+ error.append("Exception: %s" % ''.join(exception))
+ finally:
+ logger.error("\n".join(error))
+
+def better_exec(code, context, text = None, realfile = "<code>"):
+ """
+    Similar to better_compile, better_exec will
+ print the lines that are responsible for the
+ error.
+ """
+ import bb.parse
+ if not text:
+ text = code
+ if not hasattr(code, "co_filename"):
+ code = better_compile(code, realfile, realfile)
+ try:
+ exec(code, get_context(), context)
+ except (bb.BBHandledException, bb.parse.SkipRecipe, bb.build.FuncFailed, bb.data_smart.ExpansionError):
+ # Error already shown so passthrough, no need for traceback
+ raise
+ except Exception as e:
+ (t, value, tb) = sys.exc_info()
+ try:
+ _print_exception(t, value, tb, realfile, text, context)
+ except Exception as e:
+ logger.error("Exception handler error: %s" % str(e))
+
+ e = bb.BBHandledException(e)
+ raise e
+
+def simple_exec(code, context):
+ exec(code, get_context(), context)
+
+def better_eval(source, locals):
+ return eval(source, get_context(), locals)
+
+@contextmanager
+def fileslocked(files):
+ """Context manager for locking and unlocking file locks."""
+ locks = []
+ if files:
+ for lockfile in files:
+ locks.append(bb.utils.lockfile(lockfile))
+
+ yield
+
+ for lock in locks:
+ bb.utils.unlockfile(lock)
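+
+# Typical usage (a sketch; the lock file path and helper are hypothetical):
+#   with bb.utils.fileslocked(["/tmp/example.lock"]):
+#       do_exclusive_work()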
+
+@contextmanager
+def timeout(seconds):
+ def timeout_handler(signum, frame):
+ pass
+
+ original_handler = signal.signal(signal.SIGALRM, timeout_handler)
+
+ try:
+ signal.alarm(seconds)
+ yield
+ finally:
+ signal.alarm(0)
+ signal.signal(signal.SIGALRM, original_handler)
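+
+# Typical usage (a sketch; the wrapped call is hypothetical):
+#   with bb.utils.timeout(10):
+#       possibly_blocking_call()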
+
+def lockfile(name, shared=False, retry=True, block=False):
+ """
+ Use the specified file as a lock file, return when the lock has
+ been acquired. Returns a variable to pass to unlockfile().
+ Parameters:
+ retry: True to re-try locking if it fails, False otherwise
+ block: True to block until the lock succeeds, False otherwise
+ The retry and block parameters are kind of equivalent unless you
+ consider the possibility of sending a signal to the process to break
+ out - at which point you want block=True rather than retry=True.
+ """
+ dirname = os.path.dirname(name)
+ mkdirhier(dirname)
+
+ if not os.access(dirname, os.W_OK):
+ logger.error("Unable to acquire lock '%s', directory is not writable",
+ name)
+ sys.exit(1)
+
+ op = fcntl.LOCK_EX
+ if shared:
+ op = fcntl.LOCK_SH
+ if not retry and not block:
+ op = op | fcntl.LOCK_NB
+
+ while True:
+ # If we leave the lockfiles lying around there is no problem
+ # but we should clean up after ourselves. This gives potential
+ # for races though. To work around this, when we acquire the lock
+        # we check that the file we locked is still the lock file on disk
+        # by comparing inode numbers. If they don't match or the lockfile
+ # no longer exists, we start again.
+
+ # This implementation is unfair since the last person to request the
+ # lock is the most likely to win it.
+
+ try:
+ lf = open(name, 'a+')
+ fileno = lf.fileno()
+ fcntl.flock(fileno, op)
+ statinfo = os.fstat(fileno)
+ if os.path.exists(lf.name):
+ statinfo2 = os.stat(lf.name)
+ if statinfo.st_ino == statinfo2.st_ino:
+ return lf
+ lf.close()
+ except Exception:
+ try:
+ lf.close()
+ except Exception:
+ pass
+ pass
+ if not retry:
+ return None
+
+def unlockfile(lf):
+ """
+ Unlock a file locked using lockfile()
+ """
+ try:
+ # If we had a shared lock, we need to promote to exclusive before
+ # removing the lockfile. Attempt this, ignore failures.
+ fcntl.flock(lf.fileno(), fcntl.LOCK_EX|fcntl.LOCK_NB)
+ os.unlink(lf.name)
+ except (IOError, OSError):
+ pass
+ fcntl.flock(lf.fileno(), fcntl.LOCK_UN)
+ lf.close()
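+
+# Typical usage (a sketch; the lock file path and helper are hypothetical):
+#   lock = bb.utils.lockfile("/tmp/example.lock")
+#   try:
+#       do_exclusive_work()
+#   finally:
+#       bb.utils.unlockfile(lock)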
+
+def md5_file(filename):
+ """
+ Return the hex string representation of the MD5 checksum of filename.
+ """
+ try:
+ import hashlib
+ m = hashlib.md5()
+ except ImportError:
+ import md5
+ m = md5.new()
+
+ with open(filename, "rb") as f:
+ for line in f:
+ m.update(line)
+ return m.hexdigest()
+
+def sha256_file(filename):
+ """
+ Return the hex string representation of the 256-bit SHA checksum of
+ filename. On Python 2.4 this will return None, so callers will need to
+ handle that by either skipping SHA checks, or running a standalone sha256sum
+ binary.
+ """
+ try:
+ import hashlib
+ except ImportError:
+ return None
+
+ s = hashlib.sha256()
+ with open(filename, "rb") as f:
+ for line in f:
+ s.update(line)
+ return s.hexdigest()
+
+def preserved_envvars_exported():
+ """Variables which are taken from the environment and placed in and exported
+ from the metadata"""
+ return [
+ 'BB_TASKHASH',
+ 'HOME',
+ 'LOGNAME',
+ 'PATH',
+ 'PWD',
+ 'SHELL',
+ 'TERM',
+ 'USER',
+ ]
+
+def preserved_envvars():
+ """Variables which are taken from the environment and placed in the metadata"""
+ v = [
+ 'BBPATH',
+ 'BB_PRESERVE_ENV',
+ 'BB_ENV_WHITELIST',
+ 'BB_ENV_EXTRAWHITE',
+ ]
+ return v + preserved_envvars_exported()
+
+def filter_environment(good_vars):
+ """
+ Create a pristine environment for bitbake. This will remove variables that
+ are not known and may influence the build in a negative way.
+ """
+
+ removed_vars = {}
+ for key in os.environ.keys():
+ if key in good_vars:
+ continue
+
+ removed_vars[key] = os.environ[key]
+ os.unsetenv(key)
+ del os.environ[key]
+
+ if removed_vars:
+ logger.debug(1, "Removed the following variables from the environment: %s", ", ".join(removed_vars.keys()))
+
+ return removed_vars
+
+def approved_variables():
+ """
+ Determine and return the list of whitelisted variables which are approved
+ to remain in the environment.
+ """
+ if 'BB_PRESERVE_ENV' in os.environ:
+ return os.environ.keys()
+ approved = []
+ if 'BB_ENV_WHITELIST' in os.environ:
+ approved = os.environ['BB_ENV_WHITELIST'].split()
+ approved.extend(['BB_ENV_WHITELIST'])
+ else:
+ approved = preserved_envvars()
+ if 'BB_ENV_EXTRAWHITE' in os.environ:
+ approved.extend(os.environ['BB_ENV_EXTRAWHITE'].split())
+ if 'BB_ENV_EXTRAWHITE' not in approved:
+ approved.extend(['BB_ENV_EXTRAWHITE'])
+ return approved
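+
+# For example (illustrative values): with BB_ENV_WHITELIST unset and
+# BB_ENV_EXTRAWHITE="http_proxy https_proxy", this returns preserved_envvars()
+# plus the two proxy variables.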
+
+def clean_environment():
+ """
+ Clean up any spurious environment variables. This will remove any
+ variables the user hasn't chosen to preserve.
+ """
+ if 'BB_PRESERVE_ENV' not in os.environ:
+ good_vars = approved_variables()
+ return filter_environment(good_vars)
+
+ return {}
+
+def empty_environment():
+ """
+ Remove all variables from the environment.
+ """
+ for s in os.environ.keys():
+ os.unsetenv(s)
+ del os.environ[s]
+
+def build_environment(d):
+ """
+ Build an environment from all exported variables.
+ """
+ import bb.data
+ for var in bb.data.keys(d):
+ export = d.getVarFlag(var, "export")
+ if export:
+ os.environ[var] = d.getVar(var, True) or ""
+
+def _check_unsafe_delete_path(path):
+ """
+ Basic safeguard against recursively deleting something we shouldn't. If it returns True,
+ the caller should raise an exception with an appropriate message.
+ NOTE: This is NOT meant to be a security mechanism - just a guard against silly mistakes
+ with potentially disastrous results.
+ """
+ extra = ''
+ # HOME might not be /home/something, so in case we can get it, check against it
+ homedir = os.environ.get('HOME', '')
+ if homedir:
+ extra = '|%s' % homedir
+ if re.match('(/|//|/home|/home/[^/]*%s)$' % extra, os.path.abspath(path)):
+ return True
+ return False
+
+def remove(path, recurse=False):
+ """Equivalent to rm -f or rm -rf"""
+ if not path:
+ return
+ if recurse:
+ for name in glob.glob(path):
+ if _check_unsafe_delete_path(path):
+ raise Exception('bb.utils.remove: called with dangerous path "%s" and recurse=True, refusing to delete!' % path)
+            # shutil.rmtree(name) would be ideal but it's too slow
+ subprocess.call(['rm', '-rf'] + glob.glob(path))
+ return
+ for name in glob.glob(path):
+ try:
+ os.unlink(name)
+ except OSError as exc:
+ if exc.errno != errno.ENOENT:
+ raise
+
+def prunedir(topdir):
+ # Delete everything reachable from the directory named in 'topdir'.
+ # CAUTION: This is dangerous!
+ if _check_unsafe_delete_path(topdir):
+ raise Exception('bb.utils.prunedir: called with dangerous path "%s", refusing to delete!' % topdir)
+ for root, dirs, files in os.walk(topdir, topdown = False):
+ for name in files:
+ os.remove(os.path.join(root, name))
+ for name in dirs:
+ if os.path.islink(os.path.join(root, name)):
+ os.remove(os.path.join(root, name))
+ else:
+ os.rmdir(os.path.join(root, name))
+ os.rmdir(topdir)
+
+#
+# Could also use return re.compile("(%s)" % "|".join(map(re.escape, suffixes))).sub(lambda mo: "", var)
+# but that's possibly insane and suffixes is probably going to be small
+#
+def prune_suffix(var, suffixes, d):
+ # See if var ends with any of the suffixes listed and
+ # remove it if found
+ for suffix in suffixes:
+ if var.endswith(suffix):
+ return var.replace(suffix, "")
+ return var
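+
+# For example:
+#   prune_suffix("glibc-initial", ["-initial"], d) -> "glibc"
+# (the datastore argument d is accepted but not used here)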
+
+def mkdirhier(directory):
+ """Create a directory like 'mkdir -p', but does not complain if
+    the directory already exists (unlike os.makedirs).
+ """
+
+ try:
+ os.makedirs(directory)
+ except OSError as e:
+ if e.errno != errno.EEXIST:
+ raise e
+
+def movefile(src, dest, newmtime = None, sstat = None):
+ """Moves a file from src to dest, preserving all permissions and
+ attributes; mtime will be preserved even when moving across
+ filesystems. Returns true on success and false on failure. Move is
+ atomic.
+ """
+
+ #print "movefile(" + src + "," + dest + "," + str(newmtime) + "," + str(sstat) + ")"
+ try:
+ if not sstat:
+ sstat = os.lstat(src)
+ except Exception as e:
+ print("movefile: Stating source file failed...", e)
+ return None
+
+ destexists = 1
+ try:
+ dstat = os.lstat(dest)
+ except:
+ dstat = os.lstat(os.path.dirname(dest))
+ destexists = 0
+
+ if destexists:
+ if stat.S_ISLNK(dstat[stat.ST_MODE]):
+ try:
+ os.unlink(dest)
+ destexists = 0
+ except Exception as e:
+ pass
+
+ if stat.S_ISLNK(sstat[stat.ST_MODE]):
+ try:
+ target = os.readlink(src)
+ if destexists and not stat.S_ISDIR(dstat[stat.ST_MODE]):
+ os.unlink(dest)
+ os.symlink(target, dest)
+ #os.lchown(dest,sstat[stat.ST_UID],sstat[stat.ST_GID])
+ os.unlink(src)
+ return os.lstat(dest)
+ except Exception as e:
+ print("movefile: failed to properly create symlink:", dest, "->", target, e)
+ return None
+
+ renamefailed = 1
+ if sstat[stat.ST_DEV] == dstat[stat.ST_DEV]:
+ try:
+            # os.rename needs the destination path to end with the file name,
+            # so append the source file name only if dest is a directory
+ srcfname = os.path.basename(src)
+ destpath = os.path.join(dest, srcfname) if os.path.isdir(dest) \
+ else dest
+ os.rename(src, destpath)
+ renamefailed = 0
+ except Exception as e:
+ if e[0] != errno.EXDEV:
+ # Some random error.
+ print("movefile: Failed to move", src, "to", dest, e)
+ return None
+            # Invalid cross-device link: dest is on a 'bind' mount or genuinely on another device
+
+ if renamefailed:
+ didcopy = 0
+ if stat.S_ISREG(sstat[stat.ST_MODE]):
+ try: # For safety copy then move it over.
+ shutil.copyfile(src, dest + "#new")
+ os.rename(dest + "#new", dest)
+ didcopy = 1
+ except Exception as e:
+ print('movefile: copy', src, '->', dest, 'failed.', e)
+ return None
+ else:
+        # We don't yet handle special files, so we need to fall back to /bin/mv
+ a = getstatusoutput("/bin/mv -f " + "'" + src + "' '" + dest + "'")
+ if a[0] != 0:
+ print("movefile: Failed to move special file:" + src + "' to '" + dest + "'", a)
+ return None # failure
+ try:
+ if didcopy:
+ os.lchown(dest, sstat[stat.ST_UID], sstat[stat.ST_GID])
+ os.chmod(dest, stat.S_IMODE(sstat[stat.ST_MODE])) # Sticky is reset on chown
+ os.unlink(src)
+ except Exception as e:
+ print("movefile: Failed to chown/chmod/unlink", dest, e)
+ return None
+
+ if newmtime:
+ os.utime(dest, (newmtime, newmtime))
+ else:
+ os.utime(dest, (sstat[stat.ST_ATIME], sstat[stat.ST_MTIME]))
+ newmtime = sstat[stat.ST_MTIME]
+ return newmtime
+
+def copyfile(src, dest, newmtime = None, sstat = None):
+ """
+ Copies a file from src to dest, preserving all permissions and
+    attributes; mtime will be preserved even when copying across
+ filesystems. Returns true on success and false on failure.
+ """
+ #print "copyfile(" + src + "," + dest + "," + str(newmtime) + "," + str(sstat) + ")"
+ try:
+ if not sstat:
+ sstat = os.lstat(src)
+ except Exception as e:
+ logger.warn("copyfile: stat of %s failed (%s)" % (src, e))
+ return False
+
+ destexists = 1
+ try:
+ dstat = os.lstat(dest)
+ except:
+ dstat = os.lstat(os.path.dirname(dest))
+ destexists = 0
+
+ if destexists:
+ if stat.S_ISLNK(dstat[stat.ST_MODE]):
+ try:
+ os.unlink(dest)
+ destexists = 0
+ except Exception as e:
+ pass
+
+ if stat.S_ISLNK(sstat[stat.ST_MODE]):
+ try:
+ target = os.readlink(src)
+ if destexists and not stat.S_ISDIR(dstat[stat.ST_MODE]):
+ os.unlink(dest)
+ os.symlink(target, dest)
+ #os.lchown(dest,sstat[stat.ST_UID],sstat[stat.ST_GID])
+ return os.lstat(dest)
+ except Exception as e:
+ logger.warn("copyfile: failed to create symlink %s to %s (%s)" % (dest, target, e))
+ return False
+
+ if stat.S_ISREG(sstat[stat.ST_MODE]):
+ try:
+ srcchown = False
+ if not os.access(src, os.R_OK):
+ # Make sure we can read it
+ srcchown = True
+ os.chmod(src, sstat[stat.ST_MODE] | stat.S_IRUSR)
+
+ # For safety copy then move it over.
+ shutil.copyfile(src, dest + "#new")
+ os.rename(dest + "#new", dest)
+ except Exception as e:
+ logger.warn("copyfile: copy %s to %s failed (%s)" % (src, dest, e))
+ return False
+ finally:
+ if srcchown:
+ os.chmod(src, sstat[stat.ST_MODE])
+ os.utime(src, (sstat[stat.ST_ATIME], sstat[stat.ST_MTIME]))
+
+ else:
+        # We don't yet handle special files, so we need to fall back to /bin/cp
+ a = getstatusoutput("/bin/cp -f " + "'" + src + "' '" + dest + "'")
+ if a[0] != 0:
+ logger.warn("copyfile: failed to copy special file %s to %s (%s)" % (src, dest, a))
+ return False # failure
+ try:
+ os.lchown(dest, sstat[stat.ST_UID], sstat[stat.ST_GID])
+ os.chmod(dest, stat.S_IMODE(sstat[stat.ST_MODE])) # Sticky is reset on chown
+ except Exception as e:
+ logger.warn("copyfile: failed to chown/chmod %s (%s)" % (dest, e))
+ return False
+
+ if newmtime:
+ os.utime(dest, (newmtime, newmtime))
+ else:
+ os.utime(dest, (sstat[stat.ST_ATIME], sstat[stat.ST_MTIME]))
+ newmtime = sstat[stat.ST_MTIME]
+ return newmtime
+
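+# Illustrative usage sketch (hypothetical paths, not part of the original
+# module): movefile() and copyfile() return a true value (the resulting mtime,
+# or the stat result for symlinks) on success and a false value on failure,
+# so callers can simply test the return value.
+#
+#   if not bb.utils.copyfile('/tmp/work/image.ext4', '/tmp/deploy/image.ext4'):
+#       bb.fatal('copy failed')
+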
+def which(path, item, direction = 0, history = False):
+ """
+ Locate a file in a PATH
+ """
+
+ hist = []
+ paths = (path or "").split(':')
+ if direction != 0:
+ paths.reverse()
+
+ for p in paths:
+ next = os.path.join(p, item)
+ hist.append(next)
+ if os.path.exists(next):
+ if not os.path.isabs(next):
+ next = os.path.abspath(next)
+ if history:
+ return next, hist
+ return next
+
+ if history:
+ return "", hist
+ return ""
+
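+# Illustrative usage sketch: search a PATH-style string left to right
+# (direction=0) or right to left, optionally also returning every candidate
+# path that was tried.
+#
+#   found, tried = bb.utils.which(os.environ.get('PATH', ''), 'sh', history=True)
+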
+def to_boolean(string, default=None):
+ if not string:
+ return default
+
+ normalized = string.lower()
+ if normalized in ("y", "yes", "1", "true"):
+ return True
+ elif normalized in ("n", "no", "0", "false"):
+ return False
+ else:
+ raise ValueError("Invalid value for to_boolean: %s" % string)
+
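+# Illustrative sketch: to_boolean('Yes') and to_boolean('1') return True,
+# to_boolean('no') returns False, to_boolean('') returns the default, and any
+# other string raises ValueError.
+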
+def contains(variable, checkvalues, truevalue, falsevalue, d):
+ val = d.getVar(variable, True)
+ if not val:
+ return falsevalue
+ val = set(val.split())
+ if isinstance(checkvalues, basestring):
+ checkvalues = set(checkvalues.split())
+ else:
+ checkvalues = set(checkvalues)
+ if checkvalues.issubset(val):
+ return truevalue
+ return falsevalue
+
+def contains_any(variable, checkvalues, truevalue, falsevalue, d):
+ val = d.getVar(variable, True)
+ if not val:
+ return falsevalue
+ val = set(val.split())
+ if isinstance(checkvalues, basestring):
+ checkvalues = set(checkvalues.split())
+ else:
+ checkvalues = set(checkvalues)
+ if checkvalues & val:
+ return truevalue
+ return falsevalue
+
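+# Illustrative sketch (hypothetical variable contents): with
+# DISTRO_FEATURES = "systemd x11 wayland" in the datastore d,
+#   bb.utils.contains('DISTRO_FEATURES', 'x11 wayland', 'y', 'n', d)      -> 'y'
+#   bb.utils.contains('DISTRO_FEATURES', 'x11 directfb', 'y', 'n', d)     -> 'n'
+#   bb.utils.contains_any('DISTRO_FEATURES', 'x11 directfb', 'y', 'n', d) -> 'y'
+# contains() requires every checkvalue to be present, contains_any() just one.
+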
+def cpu_count():
+ return multiprocessing.cpu_count()
+
+def nonblockingfd(fd):
+ fcntl.fcntl(fd, fcntl.F_SETFL, fcntl.fcntl(fd, fcntl.F_GETFL) | os.O_NONBLOCK)
+
+def process_profilelog(fn, pout = None):
+    # Call either with a list of filenames (pout must be set) or with a single filename (pout is optional).
+ if not pout:
+ pout = fn + '.processed'
+ pout = open(pout, 'w')
+
+ import pstats
+ if isinstance(fn, list):
+ p = pstats.Stats(*fn, stream=pout)
+ else:
+ p = pstats.Stats(fn, stream=pout)
+ p.sort_stats('time')
+ p.print_stats()
+ p.print_callers()
+ p.sort_stats('cumulative')
+ p.print_stats()
+
+ pout.flush()
+ pout.close()
+
+#
+# Was present to work around multiprocessing pool bugs in python < 2.7.3
+#
+def multiprocessingpool(*args, **kwargs):
+
+ import multiprocessing.pool
+ #import multiprocessing.util
+ #multiprocessing.util.log_to_stderr(10)
+ # Deal with a multiprocessing bug where signals to the processes would be delayed until the work
+ # completes. Putting in a timeout means the signals (like SIGINT/SIGTERM) get processed.
+ def wrapper(func):
+ def wrap(self, timeout=None):
+ return func(self, timeout=timeout if timeout is not None else 1e100)
+ return wrap
+ multiprocessing.pool.IMapIterator.next = wrapper(multiprocessing.pool.IMapIterator.next)
+
+ return multiprocessing.Pool(*args, **kwargs)
+
+def exec_flat_python_func(func, *args, **kwargs):
+ """Execute a flat python function (defined with def funcname(args):...)"""
+ # Prepare a small piece of python code which calls the requested function
+ # To do this we need to prepare two things - a set of variables we can use to pass
+ # the values of arguments into the calling function, and the list of arguments for
+ # the function being called
+ context = {}
+ funcargs = []
+ # Handle unnamed arguments
+ aidx = 1
+ for arg in args:
+ argname = 'arg_%s' % aidx
+ context[argname] = arg
+ funcargs.append(argname)
+ aidx += 1
+ # Handle keyword arguments
+ context.update(kwargs)
+ funcargs.extend(['%s=%s' % (arg, arg) for arg in kwargs.iterkeys()])
+ code = 'retval = %s(%s)' % (func, ', '.join(funcargs))
+ comp = bb.utils.better_compile(code, '<string>', '<string>')
+ bb.utils.better_exec(comp, context, code, '<string>')
+ return context['retval']
+
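+# Illustrative sketch (not part of the original module): positional arguments
+# are bound to generated names (arg_1, arg_2, ...) and keyword arguments keep
+# their own names, then the generated call is compiled and run through
+# better_exec(), so for example
+#   bb.utils.exec_flat_python_func('min', 3, 7)
+# returns 3 (min resolves because builtins are available to the generated code).
+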
+def edit_metadata(meta_lines, variables, varfunc, match_overrides=False):
+ """Edit lines from a recipe or config file and modify one or more
+ specified variable values set in the file using a specified callback
+ function. Lines are expected to have trailing newlines.
+ Parameters:
+ meta_lines: lines from the file; can be a list or an iterable
+ (e.g. file pointer)
+ variables: a list of variable names to look for. Functions
+ may also be specified, but must be specified with '()' at
+ the end of the name. Note that the function doesn't have
+ any intrinsic understanding of _append, _prepend, _remove,
+ or overrides, so these are considered as part of the name.
+ These values go into a regular expression, so regular
+ expression syntax is allowed.
+ varfunc: callback function called for every variable matching
+ one of the entries in the variables parameter. The function
+ should take four arguments:
+ varname: name of variable matched
+ origvalue: current value in file
+ op: the operator (e.g. '+=')
+ newlines: list of lines up to this point. You can use
+ this to prepend lines before this variable setting
+ if you wish.
+            and should return a four-element tuple:
+ newvalue: new value to substitute in, or None to drop
+ the variable setting entirely. (If the removal
+ results in two consecutive blank lines, one of the
+ blank lines will also be dropped).
+ newop: the operator to use - if you specify None here,
+ the original operation will be used.
+ indent: number of spaces to indent multi-line entries,
+ or -1 to indent up to the level of the assignment
+ and opening quote, or a string to use as the indent.
+ minbreak: True to allow the first element of a
+ multi-line value to continue on the same line as
+ the assignment, False to indent before the first
+ element.
+ match_overrides: True to match items with _overrides on the end,
+ False otherwise
+ Returns a tuple:
+ updated:
+ True if changes were made, False otherwise.
+ newlines:
+ Lines after processing
+ """
+
+ var_res = {}
+ if match_overrides:
+ override_re = '(_[a-zA-Z0-9-_$(){}]+)?'
+ else:
+ override_re = ''
+ for var in variables:
+ if var.endswith('()'):
+ var_res[var] = re.compile('^(%s%s)[ \\t]*\([ \\t]*\)[ \\t]*{' % (var[:-2].rstrip(), override_re))
+ else:
+ var_res[var] = re.compile('^(%s%s)[ \\t]*[?+:.]*=[+.]*[ \\t]*(["\'])' % (var, override_re))
+
+ updated = False
+ varset_start = ''
+ varlines = []
+ newlines = []
+ in_var = None
+ full_value = ''
+ var_end = ''
+
+ def handle_var_end():
+ prerun_newlines = newlines[:]
+ op = varset_start[len(in_var):].strip()
+ (newvalue, newop, indent, minbreak) = varfunc(in_var, full_value, op, newlines)
+ changed = (prerun_newlines != newlines)
+
+ if newvalue is None:
+ # Drop the value
+ return True
+ elif newvalue != full_value or (newop not in [None, op]):
+ if newop not in [None, op]:
+ # Callback changed the operator
+ varset_new = "%s %s" % (in_var, newop)
+ else:
+ varset_new = varset_start
+
+ if isinstance(indent, (int, long)):
+ if indent == -1:
+ indentspc = ' ' * (len(varset_new) + 2)
+ else:
+ indentspc = ' ' * indent
+ else:
+ indentspc = indent
+ if in_var.endswith('()'):
+ # A function definition
+ if isinstance(newvalue, list):
+ newlines.append('%s {\n%s%s\n}\n' % (varset_new, indentspc, ('\n%s' % indentspc).join(newvalue)))
+ else:
+ if not newvalue.startswith('\n'):
+ newvalue = '\n' + newvalue
+ if not newvalue.endswith('\n'):
+ newvalue = newvalue + '\n'
+ newlines.append('%s {%s}\n' % (varset_new, newvalue))
+ else:
+ # Normal variable
+ if isinstance(newvalue, list):
+ if not newvalue:
+ # Empty list -> empty string
+ newlines.append('%s ""\n' % varset_new)
+ elif minbreak:
+ # First item on first line
+ if len(newvalue) == 1:
+ newlines.append('%s "%s"\n' % (varset_new, newvalue[0]))
+ else:
+ newlines.append('%s "%s \\\n' % (varset_new, newvalue[0]))
+ for item in newvalue[1:]:
+ newlines.append('%s%s \\\n' % (indentspc, item))
+ newlines.append('%s"\n' % indentspc)
+ else:
+ # No item on first line
+ newlines.append('%s " \\\n' % varset_new)
+ for item in newvalue:
+ newlines.append('%s%s \\\n' % (indentspc, item))
+ newlines.append('%s"\n' % indentspc)
+ else:
+ newlines.append('%s "%s"\n' % (varset_new, newvalue))
+ return True
+ else:
+ # Put the old lines back where they were
+ newlines.extend(varlines)
+ # If newlines was touched by the function, we'll need to return True
+ return changed
+
+ checkspc = False
+
+ for line in meta_lines:
+ if in_var:
+ value = line.rstrip()
+ varlines.append(line)
+ if in_var.endswith('()'):
+ full_value += '\n' + value
+ else:
+ full_value += value[:-1]
+ if value.endswith(var_end):
+ if in_var.endswith('()'):
+ if full_value.count('{') - full_value.count('}') >= 0:
+ continue
+ full_value = full_value[:-1]
+ if handle_var_end():
+ updated = True
+ checkspc = True
+ in_var = None
+ else:
+ skip = False
+ for (varname, var_re) in var_res.iteritems():
+ res = var_re.match(line)
+ if res:
+ isfunc = varname.endswith('()')
+ if isfunc:
+ splitvalue = line.split('{', 1)
+ var_end = '}'
+ else:
+ var_end = res.groups()[-1]
+ splitvalue = line.split(var_end, 1)
+ varset_start = splitvalue[0].rstrip()
+ value = splitvalue[1].rstrip()
+ if not isfunc and value.endswith('\\'):
+ value = value[:-1]
+ full_value = value
+ varlines = [line]
+ in_var = res.group(1)
+ if isfunc:
+ in_var += '()'
+ if value.endswith(var_end):
+ full_value = full_value[:-1]
+ if handle_var_end():
+ updated = True
+ checkspc = True
+ in_var = None
+ skip = True
+ break
+ if not skip:
+ if checkspc:
+ checkspc = False
+ if newlines[-1] == '\n' and line == '\n':
+ # Squash blank line if there are two consecutive blanks after a removal
+ continue
+ newlines.append(line)
+ return (updated, newlines)
+
+
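+# Illustrative sketch of a varfunc callback for edit_metadata() (hypothetical
+# helper, not part of the original module): replace the value of DESCRIPTION
+# and leave the operator, indentation and everything else alone.
+#
+#   def _set_desc(varname, origvalue, op, newlines):
+#       return ('An edited description', None, -1, True)
+#
+#   updated, newlines = bb.utils.edit_metadata(lines, ['DESCRIPTION'], _set_desc)
+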
+def edit_metadata_file(meta_file, variables, varfunc):
+ """Edit a recipe or config file and modify one or more specified
+ variable values set in the file using a specified callback function.
+ The file is only written to if the value(s) actually change.
+ This is basically the file version of edit_metadata(), see that
+ function's description for parameter/usage information.
+ Returns True if the file was written to, False otherwise.
+ """
+ with open(meta_file, 'r') as f:
+ (updated, newlines) = edit_metadata(f, variables, varfunc)
+ if updated:
+ with open(meta_file, 'w') as f:
+ f.writelines(newlines)
+ return updated
+
+
+def edit_bblayers_conf(bblayers_conf, add, remove):
+ """Edit bblayers.conf, adding and/or removing layers"""
+
+ import fnmatch
+
+ def remove_trailing_sep(pth):
+ if pth and pth[-1] == os.sep:
+ pth = pth[:-1]
+ return pth
+
+ def layerlist_param(value):
+ if not value:
+ return []
+ elif isinstance(value, list):
+ return [remove_trailing_sep(x) for x in value]
+ else:
+ return [remove_trailing_sep(value)]
+
+ notadded = []
+ notremoved = []
+
+ addlayers = layerlist_param(add)
+ removelayers = layerlist_param(remove)
+
+ # Need to use a list here because we can't set non-local variables from a callback in python 2.x
+ bblayercalls = []
+
+ def handle_bblayers(varname, origvalue, op, newlines):
+ bblayercalls.append(varname)
+ updated = False
+ bblayers = [remove_trailing_sep(x) for x in origvalue.split()]
+ if removelayers:
+ for removelayer in removelayers:
+ matched = False
+ for layer in bblayers:
+ if fnmatch.fnmatch(layer, removelayer):
+ updated = True
+ matched = True
+ bblayers.remove(layer)
+ break
+ if not matched:
+ notremoved.append(removelayer)
+ if addlayers:
+ for addlayer in addlayers:
+ if addlayer not in bblayers:
+ updated = True
+ bblayers.append(addlayer)
+ else:
+ notadded.append(addlayer)
+
+ if updated:
+ return (bblayers, None, 2, False)
+ else:
+ return (origvalue, None, 2, False)
+
+ edit_metadata_file(bblayers_conf, ['BBLAYERS'], handle_bblayers)
+
+ if not bblayercalls:
+ raise Exception('Unable to find BBLAYERS in %s' % bblayers_conf)
+
+ return (notadded, notremoved)
+
+
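+# Illustrative usage sketch (hypothetical paths): add one layer and remove
+# another; layers that could not be added or removed are reported back rather
+# than raising an exception.
+#
+#   notadded, notremoved = bb.utils.edit_bblayers_conf(
+#       'conf/bblayers.conf', ['/srv/layers/meta-example'], ['*/meta-obsolete'])
+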
+def get_file_layer(filename, d):
+ """Determine the collection (as defined by a layer's layer.conf file) containing the specified file"""
+ collections = (d.getVar('BBFILE_COLLECTIONS', True) or '').split()
+ collection_res = {}
+ for collection in collections:
+ collection_res[collection] = d.getVar('BBFILE_PATTERN_%s' % collection, True) or ''
+
+ def path_to_layer(path):
+ # Use longest path so we handle nested layers
+ matchlen = 0
+ match = None
+ for collection, regex in collection_res.iteritems():
+ if len(regex) > matchlen and re.match(regex, path):
+ matchlen = len(regex)
+ match = collection
+ return match
+
+ result = None
+ bbfiles = (d.getVar('BBFILES', True) or '').split()
+ bbfilesmatch = False
+ for bbfilesentry in bbfiles:
+ if fnmatch.fnmatch(filename, bbfilesentry):
+ bbfilesmatch = True
+ result = path_to_layer(bbfilesentry)
+
+ if not bbfilesmatch:
+ # Probably a bbclass
+ result = path_to_layer(filename)
+
+ return result
+
+
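+# Illustrative sketch (hypothetical layout): a recipe picked up via a BBFILES
+# entry yields the collection whose BBFILE_PATTERN matches that entry, e.g.
+# 'core' for a file below meta/recipes-core/; files not matched by BBFILES
+# (such as a bbclass) are matched against the patterns directly.
+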
+# Constant taken from http://linux.die.net/include/linux/prctl.h
+PR_SET_PDEATHSIG = 1
+
+class PrCtlError(Exception):
+ pass
+
+def signal_on_parent_exit(signame):
+ """
+ Trigger signame to be sent when the parent process dies
+ """
+ signum = getattr(signal, signame)
+ # http://linux.die.net/man/2/prctl
+ result = cdll['libc.so.6'].prctl(PR_SET_PDEATHSIG, signum)
+ if result != 0:
+ raise PrCtlError('prctl failed with error code %s' % result)
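+
+# Illustrative usage sketch: a helper child process can ask the kernel to send
+# it SIGTERM if the main bitbake process dies unexpectedly.
+#
+#   bb.utils.signal_on_parent_exit('SIGTERM')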
diff --git a/bitbake/lib/bs4/AUTHORS.txt b/bitbake/lib/bs4/AUTHORS.txt
new file mode 100644
index 0000000..2ac8fcc
--- /dev/null
+++ b/bitbake/lib/bs4/AUTHORS.txt
@@ -0,0 +1,43 @@
+Behold, mortal, the origins of Beautiful Soup...
+================================================
+
+Leonard Richardson is the primary programmer.
+
+Aaron DeVore is awesome.
+
+Mark Pilgrim provided the encoding detection code that forms the base
+of UnicodeDammit.
+
+Thomas Kluyver and Ezio Melotti finished the work of getting Beautiful
+Soup 4 working under Python 3.
+
+Simon Willison wrote soupselect, which was used to make Beautiful Soup
+support CSS selectors.
+
+Sam Ruby helped with a lot of edge cases.
+
+Jonathan Ellis was awarded the prestigious Beau Potage D'Or for his
+work in solving the nestable tags conundrum.
+
+An incomplete list of people who have contributed patches to Beautiful
+Soup:
+
+ Istvan Albert, Andrew Lin, Anthony Baxter, Andrew Boyko, Tony Chang,
+ Zephyr Fang, Fuzzy, Roman Gaufman, Yoni Gilad, Richie Hindle, Peteris
+ Krumins, Kent Johnson, Ben Last, Robert Leftwich, Staffan Malmgren,
+ Ksenia Marasanova, JP Moins, Adam Monsen, John Nagle, "Jon", Ed
+ Oskiewicz, Greg Phillips, Giles Radford, Arthur Rudolph, Marko
+ Samastur, Jouni Seppänen, Alexander Schmolck, Andy Theyers, Glyn
+ Webster, Paul Wright, Danny Yoo
+
+An incomplete list of people who made suggestions or found bugs or
+found ways to break Beautiful Soup:
+
+ Hanno Böck, Matteo Bertini, Chris Curvey, Simon Cusack, Bruce Eckel,
+ Matt Ernst, Michael Foord, Tom Harris, Bill de hOra, Donald Howes,
+ Matt Patterson, Scott Roberts, Steve Strassmann, Mike Williams,
+ warchild at redho dot com, Sami Kuisma, Carlos Rocha, Bob Hutchison,
+ Joren Mc, Michal Migurski, John Kleven, Tim Heaney, Tripp Lilley, Ed
+ Summers, Dennis Sutch, Chris Smith, Aaron Sweep^W Swartz, Stuart
+ Turner, Greg Edwards, Kevin J Kalupson, Nikos Kouremenos, Artur de
+ Sousa Rocha, Yichun Wei, Per Vognsen
diff --git a/bitbake/lib/bs4/COPYING.txt b/bitbake/lib/bs4/COPYING.txt
new file mode 100644
index 0000000..d668d13
--- /dev/null
+++ b/bitbake/lib/bs4/COPYING.txt
@@ -0,0 +1,26 @@
+Beautiful Soup is made available under the MIT license:
+
+ Copyright (c) 2004-2012 Leonard Richardson
+
+ Permission is hereby granted, free of charge, to any person obtaining
+ a copy of this software and associated documentation files (the
+ "Software"), to deal in the Software without restriction, including
+ without limitation the rights to use, copy, modify, merge, publish,
+ distribute, sublicense, and/or sell copies of the Software, and to
+ permit persons to whom the Software is furnished to do so, subject to
+ the following conditions:
+
+ The above copyright notice and this permission notice shall be
+ included in all copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+ EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+ MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+ NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS
+ BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN
+ ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
+ CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE, DAMMIT.
+
+Beautiful Soup incorporates code from the html5lib library, which is
+also made available under the MIT license.
diff --git a/bitbake/lib/bs4/NEWS.txt b/bitbake/lib/bs4/NEWS.txt
new file mode 100644
index 0000000..88a60a2
--- /dev/null
+++ b/bitbake/lib/bs4/NEWS.txt
@@ -0,0 +1,1066 @@
+= 4.3.2 (20131002) =
+
+* Fixed a bug in which short Unicode input was improperly encoded to
+ ASCII when checking whether or not it was the name of a file on
+ disk. [bug=1227016]
+
+* Fixed a crash when a short input contains data not valid in
+ filenames. [bug=1232604]
+
+* Fixed a bug that caused Unicode data put into UnicodeDammit to
+ return None instead of the original data. [bug=1214983]
+
+* Combined two tests to stop a spurious test failure when tests are
+ run by nosetests. [bug=1212445]
+
+= 4.3.1 (20130815) =
+
+* Fixed yet another problem with the html5lib tree builder, caused by
+ html5lib's tendency to rearrange the tree during
+ parsing. [bug=1189267]
+
+* Fixed a bug that caused the optimized version of find_all() to
+ return nothing. [bug=1212655]
+
+= 4.3.0 (20130812) =
+
+* Instead of converting incoming data to Unicode and feeding it to the
+ lxml tree builder in chunks, Beautiful Soup now makes successive
+ guesses at the encoding of the incoming data, and tells lxml to
+ parse the data as that encoding. Giving lxml more control over the
+ parsing process improves performance and avoids a number of bugs and
+ issues with the lxml parser which had previously required elaborate
+ workarounds:
+
+ - An issue in which lxml refuses to parse Unicode strings on some
+ systems. [bug=1180527]
+
+ - A returning bug that truncated documents longer than a (very
+ small) size. [bug=963880]
+
+ - A returning bug in which extra spaces were added to a document if
+ the document defined a charset other than UTF-8. [bug=972466]
+
+ This required a major overhaul of the tree builder architecture. If
+ you wrote your own tree builder and didn't tell me, you'll need to
+ modify your prepare_markup() method.
+
+* The UnicodeDammit code that makes guesses at encodings has been
+ split into its own class, EncodingDetector. A lot of apparently
+ redundant code has been removed from Unicode, Dammit, and some
+ undocumented features have also been removed.
+
+* Beautiful Soup will issue a warning if instead of markup you pass it
+ a URL or the name of a file on disk (a common beginner's mistake).
+
+* A number of optimizations improve the performance of the lxml tree
+ builder by about 33%, the html.parser tree builder by about 20%, and
+ the html5lib tree builder by about 15%.
+
+* All find_all calls should now return a ResultSet object. Patch by
+ Aaron DeVore. [bug=1194034]
+
+= 4.2.1 (20130531) =
+
+* The default XML formatter will now replace ampersands even if they
+  appear to be part of entities. That is, "&lt;" will become
+  "&amp;lt;". The old code was left over from Beautiful Soup 3, which
+ didn't always turn entities into Unicode characters.
+
+ If you really want the old behavior (maybe because you add new
+ strings to the tree, those strings include entities, and you want
+ the formatter to leave them alone on output), it can be found in
+ EntitySubstitution.substitute_xml_containing_entities(). [bug=1182183]
+
+* Gave new_string() the ability to create subclasses of
+ NavigableString. [bug=1181986]
+
+* Fixed another bug by which the html5lib tree builder could create a
+ disconnected tree. [bug=1182089]
+
+* The .previous_element of a BeautifulSoup object is now always None,
+ not the last element to be parsed. [bug=1182089]
+
+* Fixed test failures when lxml is not installed. [bug=1181589]
+
+* html5lib now supports Python 3. Fixed some Python 2-specific
+ code in the html5lib test suite. [bug=1181624]
+
+* The html.parser treebuilder can now handle numeric attributes in
+  text when the hexadecimal name of the attribute starts with a
+ capital X. Patch by Tim Shirley. [bug=1186242]
+
+= 4.2.0 (20130514) =
+
+* The Tag.select() method now supports a much wider variety of CSS
+ selectors.
+
+ - Added support for the adjacent sibling combinator (+) and the
+ general sibling combinator (~). Tests by "liquider". [bug=1082144]
+
+ - The combinators (>, +, and ~) can now combine with any supported
+ selector, not just one that selects based on tag name.
+
+ - Added limited support for the "nth-of-type" pseudo-class. Code
+ by Sven Slootweg. [bug=1109952]
+
+* The BeautifulSoup class is now aliased to "_s" and "_soup", making
+ it quicker to type the import statement in an interactive session:
+
+ from bs4 import _s
+ or
+ from bs4 import _soup
+
+ The alias may change in the future, so don't use this in code you're
+ going to run more than once.
+
+* Added the 'diagnose' submodule, which includes several useful
+ functions for reporting problems and doing tech support.
+
+ - diagnose(data) tries the given markup on every installed parser,
+ reporting exceptions and displaying successes. If a parser is not
+ installed, diagnose() mentions this fact.
+
+ - lxml_trace(data, html=True) runs the given markup through lxml's
+ XML parser or HTML parser, and prints out the parser events as
+ they happen. This helps you quickly determine whether a given
+ problem occurs in lxml code or Beautiful Soup code.
+
+ - htmlparser_trace(data) is the same thing, but for Python's
+ built-in HTMLParser class.
+
+* In an HTML document, the contents of a <script> or <style> tag will
+ no longer undergo entity substitution by default. XML documents work
+ the same way they did before. [bug=1085953]
+
+* Methods like get_text() and properties like .strings now only give
+ you strings that are visible in the document--no comments or
+ processing commands. [bug=1050164]
+
+* The prettify() method now leaves the contents of <pre> tags
+ alone. [bug=1095654]
+
+* Fix a bug in the html5lib treebuilder which sometimes created
+ disconnected trees. [bug=1039527]
+
+* Fix a bug in the lxml treebuilder which crashed when a tag included
+ an attribute from the predefined "xml:" namespace. [bug=1065617]
+
+* Fix a bug by which keyword arguments to find_parent() were not
+ being passed on. [bug=1126734]
+
+* Stop a crash when unwisely messing with a tag that's been
+ decomposed. [bug=1097699]
+
+* Now that lxml's segfault on invalid doctype has been fixed, fixed a
+ corresponding problem on the Beautiful Soup end that was previously
+ invisible. [bug=984936]
+
+* Fixed an exception when an overspecified CSS selector didn't match
+ anything. Code by Stefaan Lippens. [bug=1168167]
+
+= 4.1.3 (20120820) =
+
+* Skipped a test under Python 2.6 and Python 3.1 to avoid a spurious
+ test failure caused by the lousy HTMLParser in those
+ versions. [bug=1038503]
+
+* Raise a more specific error (FeatureNotFound) when a requested
+ parser or parser feature is not installed. Raise NotImplementedError
+ instead of ValueError when the user calls insert_before() or
+ insert_after() on the BeautifulSoup object itself. Patch by Aaron
+ Devore. [bug=1038301]
+
+= 4.1.2 (20120817) =
+
+* As per PEP-8, allow searching by CSS class using the 'class_'
+ keyword argument. [bug=1037624]
+
+* Display namespace prefixes for namespaced attribute names, instead of
+ the fully-qualified names given by the lxml parser. [bug=1037597]
+
+* Fixed a crash on encoding when an attribute name contained
+ non-ASCII characters.
+
+* When sniffing encodings, if the cchardet library is installed,
+ Beautiful Soup uses it instead of chardet. cchardet is much
+ faster. [bug=1020748]
+
+* Use logging.warning() instead of warning.warn() to notify the user
+ that characters were replaced with REPLACEMENT
+ CHARACTER. [bug=1013862]
+
+= 4.1.1 (20120703) =
+
+* Fixed an html5lib tree builder crash which happened when html5lib
+ moved a tag with a multivalued attribute from one part of the tree
+ to another. [bug=1019603]
+
+* Correctly display closing tags with an XML namespace declared. Patch
+ by Andreas Kostyrka. [bug=1019635]
+
+* Fixed a typo that made parsing significantly slower than it should
+ have been, and also waited too long to close tags with XML
+ namespaces. [bug=1020268]
+
+* get_text() now returns an empty Unicode string if there is no text,
+ rather than an empty bytestring. [bug=1020387]
+
+= 4.1.0 (20120529) =
+
+* Added experimental support for fixing Windows-1252 characters
+ embedded in UTF-8 documents. (UnicodeDammit.detwingle())
+
+* Fixed the handling of " with the built-in parser. [bug=993871]
+
+* Comments, processing instructions, document type declarations, and
+ markup declarations are now treated as preformatted strings, the way
+ CData blocks are. [bug=1001025]
+
+* Fixed a bug with the lxml treebuilder that prevented the user from
+ adding attributes to a tag that didn't originally have
+ attributes. [bug=1002378] Thanks to Oliver Beattie for the patch.
+
+* Fixed some edge-case bugs having to do with inserting an element
+ into a tag it's already inside, and replacing one of a tag's
+ children with another. [bug=997529]
+
+* Added the ability to search for attribute values specified in UTF-8. [bug=1003974]
+
+ This caused a major refactoring of the search code. All the tests
+ pass, but it's possible that some searches will behave differently.
+
+= 4.0.5 (20120427) =
+
+* Added a new method, wrap(), which wraps an element in a tag.
+
+* Renamed replace_with_children() to unwrap(), which is easier to
+ understand and also the jQuery name of the function.
+
+* Made encoding substitution in <meta> tags completely transparent (no
+ more %SOUP-ENCODING%).
+
+* Fixed a bug in decoding data that contained a byte-order mark, such
+ as data encoded in UTF-16LE. [bug=988980]
+
+* Fixed a bug that made the HTMLParser treebuilder generate XML
+ definitions ending with two question marks instead of
+ one. [bug=984258]
+
+* Upon document generation, CData objects are no longer run through
+ the formatter. [bug=988905]
+
+* The test suite now passes when lxml is not installed, whether or not
+ html5lib is installed. [bug=987004]
+
+* Print a warning on HTMLParseErrors to let people know they should
+ install a better parser library.
+
+= 4.0.4 (20120416) =
+
+* Fixed a bug that sometimes created disconnected trees.
+
+* Fixed a bug with the string setter that moved a string around the
+ tree instead of copying it. [bug=983050]
+
+* Attribute values are now run through the provided output formatter.
+ Previously they were always run through the 'minimal' formatter. In
+ the future I may make it possible to specify different formatters
+ for attribute values and strings, but for now, consistent behavior
+ is better than inconsistent behavior. [bug=980237]
+
+* Added the missing renderContents method from Beautiful Soup 3. Also
+ added an encode_contents() method to go along with decode_contents().
+
+* Give a more useful error when the user tries to run the Python 2
+ version of BS under Python 3.
+
+* UnicodeDammit can now convert Microsoft smart quotes to ASCII with
+ UnicodeDammit(markup, smart_quotes_to="ascii").
+
+= 4.0.3 (20120403) =
+
+* Fixed a typo that caused some versions of Python 3 to convert the
+ Beautiful Soup codebase incorrectly.
+
+* Got rid of the 4.0.2 workaround for HTML documents--it was
+ unnecessary and the workaround was triggering a (possibly different,
+ but related) bug in lxml. [bug=972466]
+
+= 4.0.2 (20120326) =
+
+* Worked around a possible bug in lxml that prevents non-tiny XML
+ documents from being parsed. [bug=963880, bug=963936]
+
+* Fixed a bug where specifying `text` while also searching for a tag
+ only worked if `text` wanted an exact string match. [bug=955942]
+
+= 4.0.1 (20120314) =
+
+* This is the first official release of Beautiful Soup 4. There is no
+ 4.0.0 release, to eliminate any possibility that packaging software
+ might treat "4.0.0" as being an earlier version than "4.0.0b10".
+
+* Brought BS up to date with the latest release of soupselect, adding
+ CSS selector support for direct descendant matches and multiple CSS
+ class matches.
+
+= 4.0.0b10 (20120302) =
+
+* Added support for simple CSS selectors, taken from the soupselect project.
+
+* Fixed a crash when using html5lib. [bug=943246]
+
+* In HTML5-style <meta charset="foo"> tags, the value of the "charset"
+ attribute is now replaced with the appropriate encoding on
+ output. [bug=942714]
+
+* Fixed a bug that caused calling a tag to sometimes call find_all()
+ with the wrong arguments. [bug=944426]
+
+* For backwards compatibility, brought back the BeautifulStoneSoup
+ class as a deprecated wrapper around BeautifulSoup.
+
+= 4.0.0b9 (20120228) =
+
+* Fixed the string representation of DOCTYPEs that have both a public
+ ID and a system ID.
+
+* Fixed the generated XML declaration.
+
+* Renamed Tag.nsprefix to Tag.prefix, for consistency with
+ NamespacedAttribute.
+
+* Fixed a test failure that occurred on Python 3.x when chardet was
+ installed.
+
+* Made prettify() return Unicode by default, so it will look nice on
+ Python 3 when passed into print().
+
+= 4.0.0b8 (20120224) =
+
+* All tree builders now preserve namespace information in the
+ documents they parse. If you use the html5lib parser or lxml's XML
+ parser, you can access the namespace URL for a tag as tag.namespace.
+
+ However, there is no special support for namespace-oriented
+ searching or tree manipulation. When you search the tree, you need
+ to use namespace prefixes exactly as they're used in the original
+ document.
+
+* The string representation of a DOCTYPE always ends in a newline.
+
+* Issue a warning if the user tries to use a SoupStrainer in
+ conjunction with the html5lib tree builder, which doesn't support
+ them.
+
+= 4.0.0b7 (20120223) =
+
+* Upon decoding to string, any characters that can't be represented in
+ your chosen encoding will be converted into numeric XML entity
+ references.
+
+* Issue a warning if characters were replaced with REPLACEMENT
+ CHARACTER during Unicode conversion.
+
+* Restored compatibility with Python 2.6.
+
+* The install process no longer installs docs or auxiliary text files.
+
+* It's now possible to deepcopy a BeautifulSoup object created with
+ Python's built-in HTML parser.
+
+* About 100 unit tests that "test" the behavior of various parsers on
+ invalid markup have been removed. Legitimate changes to those
+ parsers caused these tests to fail, indicating that perhaps
+ Beautiful Soup should not test the behavior of foreign
+ libraries.
+
+ The problematic unit tests have been reformulated as informational
+ comparisons generated by the script
+ scripts/demonstrate_parser_differences.py.
+
+ This makes Beautiful Soup compatible with html5lib version 0.95 and
+ future versions of HTMLParser.
+
+= 4.0.0b6 (20120216) =
+
+* Multi-valued attributes like "class" always have a list of values,
+ even if there's only one value in the list.
+
+* Added a number of multi-valued attributes defined in HTML5.
+
+* Stopped generating a space before the slash that closes an
+ empty-element tag. This may come back if I add a special XHTML mode
+ (http://www.w3.org/TR/xhtml1/#C_2), but right now it's pretty
+ useless.
+
+* Passing text along with tag-specific arguments to a find* method:
+
+ find("a", text="Click here")
+
+ will find tags that contain the given text as their
+ .string. Previously, the tag-specific arguments were ignored and
+ only strings were searched.
+
+* Fixed a bug that caused the html5lib tree builder to build a
+ partially disconnected tree. Generally cleaned up the html5lib tree
+ builder.
+
+* If you restrict a multi-valued attribute like "class" to a string
+ that contains spaces, Beautiful Soup will only consider it a match
+ if the values correspond to that specific string.
+
+= 4.0.0b5 (20120209) =
+
+* Rationalized Beautiful Soup's treatment of CSS class. A tag
+ belonging to multiple CSS classes is treated as having a list of
+ values for the 'class' attribute. Searching for a CSS class will
+ match *any* of the CSS classes.
+
+ This actually affects all attributes that the HTML standard defines
+ as taking multiple values (class, rel, rev, archive, accept-charset,
+ and headers), but 'class' is by far the most common. [bug=41034]
+
+* If you pass anything other than a dictionary as the second argument
+ to one of the find* methods, it'll assume you want to use that
+ object to search against a tag's CSS classes. Previously this only
+ worked if you passed in a string.
+
+* Fixed a bug that caused a crash when you passed a dictionary as an
+ attribute value (possibly because you mistyped "attrs"). [bug=842419]
+
+* Unicode, Dammit now detects the encoding in HTML 5-style <meta> tags
+ like <meta charset="utf-8" />. [bug=837268]
+
+* If Unicode, Dammit can't figure out a consistent encoding for a
+ page, it will try each of its guesses again, with errors="replace"
+ instead of errors="strict". This may mean that some data gets
+ replaced with REPLACEMENT CHARACTER, but at least most of it will
+ get turned into Unicode. [bug=754903]
+
+* Patched over a bug in html5lib (?) that was crashing Beautiful Soup
+ on certain kinds of markup. [bug=838800]
+
+* Fixed a bug that wrecked the tree if you replaced an element with an
+ empty string. [bug=728697]
+
+* Improved Unicode, Dammit's behavior when you give it Unicode to
+ begin with.
+
+= 4.0.0b4 (20120208) =
+
+* Added BeautifulSoup.new_string() to go along with BeautifulSoup.new_tag()
+
+* BeautifulSoup.new_tag() will follow the rules of whatever
+ tree-builder was used to create the original BeautifulSoup object. A
+ new <p> tag will look like "<p />" if the soup object was created to
+ parse XML, but it will look like "<p></p>" if the soup object was
+ created to parse HTML.
+
+* We pass in strict=False to html.parser on Python 3, greatly
+ improving html.parser's ability to handle bad HTML.
+
+* We also monkeypatch a serious bug in html.parser that made
+ strict=False disastrous on Python 3.2.2.
+
+* Replaced the "substitute_html_entities" argument with the
+ more general "formatter" argument.
+
+* Bare ampersands and angle brackets are always converted to XML
+ entities unless the user prevents it.
+
+* Added PageElement.insert_before() and PageElement.insert_after(),
+ which let you put an element into the parse tree with respect to
+ some other element.
+
+* Raise an exception when the user tries to do something nonsensical
+ like insert a tag into itself.
+
+
+= 4.0.0b3 (20120203) =
+
+Beautiful Soup 4 is a nearly-complete rewrite that removes Beautiful
+Soup's custom HTML parser in favor of a system that lets you write a
+little glue code and plug in any HTML or XML parser you want.
+
+Beautiful Soup 4.0 comes with glue code for four parsers:
+
+ * Python's standard HTMLParser (html.parser in Python 3)
+ * lxml's HTML and XML parsers
+ * html5lib's HTML parser
+
+HTMLParser is the default, but I recommend you install lxml if you
+can.
+
+For complete documentation, see the Sphinx documentation in
+bs4/doc/source/. What follows is a summary of the changes from
+Beautiful Soup 3.
+
+=== The module name has changed ===
+
+Previously you imported the BeautifulSoup class from a module also
+called BeautifulSoup. To save keystrokes and make it clear which
+version of the API is in use, the module is now called 'bs4':
+
+ >>> from bs4 import BeautifulSoup
+
+=== It works with Python 3 ===
+
+Beautiful Soup 3.1.0 worked with Python 3, but the parser it used was
+so bad that it barely worked at all. Beautiful Soup 4 works with
+Python 3, and since its parser is pluggable, you don't sacrifice
+quality.
+
+Special thanks to Thomas Kluyver and Ezio Melotti for getting Python 3
+support to the finish line. Ezio Melotti is also to thank for greatly
+improving the HTML parser that comes with Python 3.2.
+
+=== CDATA sections are normal text, if they're understood at all. ===
+
+Currently, the lxml and html5lib HTML parsers ignore CDATA sections in
+markup:
+
+ <p><![CDATA[foo]]></p> => <p></p>
+
+A future version of html5lib will turn CDATA sections into text nodes,
+but only within tags like <svg> and <math>:
+
+ <svg><![CDATA[foo]]></svg> => <p>foo</p>
+
+The default XML parser (which uses lxml behind the scenes) turns CDATA
+sections into ordinary text elements:
+
+ <p><![CDATA[foo]]></p> => <p>foo</p>
+
+In theory it's possible to preserve the CDATA sections when using the
+XML parser, but I don't see how to get it to work in practice.
+
+=== Miscellaneous other stuff ===
+
+If the BeautifulSoup instance has .is_xml set to True, an appropriate
+XML declaration will be emitted when the tree is transformed into a
+string:
+
+ <?xml version="1.0" encoding="utf-8">
+ <markup>
+ ...
+ </markup>
+
+The ['lxml', 'xml'] tree builder sets .is_xml to True; the other tree
+builders set it to False. If you want to parse XHTML with an HTML
+parser, you can set it manually.
+
+
+= 3.2.0 =
+
+The 3.1 series wasn't very useful, so I renamed the 3.0 series to 3.2
+to make it obvious which one you should use.
+
+= 3.1.0 =
+
+A hybrid version that supports 2.4 and can be automatically converted
+to run under Python 3.0. There are three backwards-incompatible
+changes you should be aware of, but no new features or deliberate
+behavior changes.
+
+1. str() may no longer do what you want. This is because the meaning
+of str() inverts between Python 2 and 3; in Python 2 it gives you a
+byte string, in Python 3 it gives you a Unicode string.
+
+The effect of this is that you can't pass an encoding to .__str__
+anymore. Use encode() to get a string and decode() to get Unicode, and
+you'll be ready (well, readier) for Python 3.
+
+2. Beautiful Soup is now based on HTMLParser rather than SGMLParser,
+which is gone in Python 3. There's some bad HTML that SGMLParser
+handled but HTMLParser doesn't, usually to do with attribute values
+that aren't closed or have brackets inside them:
+
+ <a href="foo</a>, </a><a href="bar">baz</a>
+ <a b="<a>">', '<a b="<a>"></a><a>"></a>
+
+A later version of Beautiful Soup will allow you to plug in different
+parsers to make tradeoffs between speed and the ability to handle bad
+HTML.
+
+3. In Python 3 (but not Python 2), HTMLParser converts entities within
+attributes to the corresponding Unicode characters. In Python 2 it's
+possible to parse this string and leave the &eacute; intact.
+
+ <a href="http://crummy.com?sacr&eacute;&bleu">
+
+In Python 3, the &eacute; is always converted to \xe9 during
+parsing.
+
+
+= 3.0.7a =
+
+Added an import that makes BS work in Python 2.3.
+
+
+= 3.0.7 =
+
+Fixed a UnicodeDecodeError when unpickling documents that contain
+non-ASCII characters.
+
+Fixed a TypeError that occurred in some circumstances when a tag
+contained no text.
+
+Jump through hoops to avoid the use of chardet, which can be extremely
+slow in some circumstances. UTF-8 documents should never trigger the
+use of chardet.
+
+Whitespace is preserved inside <pre> and <textarea> tags that contain
+nothing but whitespace.
+
+Beautiful Soup can now parse a doctype that's scoped to an XML namespace.
+
+
+= 3.0.6 =
+
+Got rid of a very old debug line that prevented chardet from working.
+
+Added a Tag.decompose() method that completely disconnects a tree or a
+subset of a tree, breaking it up into bite-sized pieces that are
+easy for the garbage collector to collect.
+
+Tag.extract() now returns the tag that was extracted.
+
+Tag.findNext() now does something with the keyword arguments you pass
+it instead of dropping them on the floor.
+
+Fixed a Unicode conversion bug.
+
+Fixed a bug that garbled some <meta> tags when rewriting them.
+
+
+= 3.0.5 =
+
+Soup objects can now be pickled, and copied with copy.deepcopy.
+
+Tag.append now works properly on existing BS objects. (It wasn't
+originally intended for outside use, but it can be now.) (Giles
+Radford)
+
+Passing in a nonexistent encoding will no longer crash the parser on
+Python 2.4 (John Nagle).
+
+Fixed an underlying bug in SGMLParser that thinks ASCII has 255
+characters instead of 127 (John Nagle).
+
+Entities are converted more consistently to Unicode characters.
+
+Entity references in attribute values are now converted to Unicode
+characters when appropriate. Numeric entities are always converted,
+because SGMLParser always converts them outside of attribute values.
+
+ALL_ENTITIES happens to just be the XHTML entities, so I renamed it to
+XHTML_ENTITIES.
+
+The regular expression for bare ampersands was too loose. In some
+cases ampersands were not being escaped. (Sam Ruby?)
+
+Non-breaking spaces and other special Unicode space characters are no
+longer folded to ASCII spaces. (Robert Leftwich)
+
+Information inside a TEXTAREA tag is now parsed literally, not as HTML
+tags. TEXTAREA now works exactly the same way as SCRIPT. (Zephyr Fang)
+
+= 3.0.4 =
+
+Fixed a bug that crashed Unicode conversion in some cases.
+
+Fixed a bug that prevented UnicodeDammit from being used as a
+general-purpose data scrubber.
+
+Fixed some unit test failures when running against Python 2.5.
+
+When considering whether to convert smart quotes, UnicodeDammit now
+looks at the original encoding in a case-insensitive way.
+
+= 3.0.3 (20060606) =
+
+Beautiful Soup is now usable as a way to clean up invalid XML/HTML (be
+sure to pass in an appropriate value for convertEntities, or XML/HTML
+entities might stick around that aren't valid in HTML/XML). The result
+may not validate, but it should be good enough to not choke a
+real-world XML parser. Specifically, the output of a properly
+constructed soup object should always be valid as part of an XML
+document, but parts may be missing if they were missing in the
+original. As always, if the input is valid XML, the output will also
+be valid.
+
+= 3.0.2 (20060602) =
+
+Previously, Beautiful Soup correctly handled attribute values that
+contained embedded quotes (sometimes by escaping), but not other kinds
+of XML character. Now, it correctly handles or escapes all special XML
+characters in attribute values.
+
+I aliased methods to the 2.x names (fetch, find, findText, etc.) for
+backwards compatibility purposes. Those names are deprecated and if I
+ever do a 4.0 I will remove them. I will, I tell you!
+
+Fixed a bug where the findAll method wasn't passing along any keyword
+arguments.
+
+When run from the command line, Beautiful Soup now acts as an HTML
+pretty-printer, not an XML pretty-printer.
+
+= 3.0.1 (20060530) =
+
+Reintroduced the "fetch by CSS class" shortcut. I thought keyword
+arguments would replace it, but they don't. You can't call soup('a',
+class='foo') because class is a Python keyword.
+
+If Beautiful Soup encounters a meta tag that declares the encoding,
+but a SoupStrainer tells it not to parse that tag, Beautiful Soup will
+no longer try to rewrite the meta tag to mention the new
+encoding. Basically, this makes SoupStrainers work in real-world
+applications instead of crashing the parser.
+
+= 3.0.0 "Who would not give all else for two p" (20060528) =
+
+This release is not backward-compatible with previous releases. If
+you've got code written with a previous version of the library, go
+ahead and keep using it, unless one of the features mentioned here
+really makes your life easier. Since the library is self-contained,
+you can include an old copy of the library in your old applications,
+and use the new version for everything else.
+
+The documentation has been rewritten and greatly expanded with many
+more examples.
+
+Beautiful Soup autodetects the encoding of a document (or uses the one
+you specify), and converts it from its native encoding to
+Unicode. Internally, it only deals with Unicode strings. When you
+print out the document, it converts to UTF-8 (or another encoding you
+specify). [Doc reference]
+
+It's now easy to make large-scale changes to the parse tree without
+screwing up the navigation members. The methods are extract,
+replaceWith, and insert. [Doc reference. See also Improving Memory
+Usage with extract]
+
+Passing True in as an attribute value gives you tags that have any
+value for that attribute. You don't have to create a regular
+expression. Passing None for an attribute value gives you tags that
+don't have that attribute at all.
+
+Tag objects now know whether or not they're self-closing. This avoids
+the problem where Beautiful Soup thought that tags like <BR /> were
+self-closing even in XML documents. You can customize the self-closing
+tags for a parser object by passing them in as a list of
+selfClosingTags: you don't have to subclass anymore.
+
+There's a new built-in parser, MinimalSoup, which has most of
+BeautifulSoup's HTML-specific rules, but no tag nesting rules. [Doc
+reference]
+
+You can use a SoupStrainer to tell Beautiful Soup to parse only part
+of a document. This saves time and memory, often making Beautiful Soup
+about as fast as a custom-built SGMLParser subclass. [Doc reference,
+SoupStrainer reference]
+
+You can (usually) use keyword arguments instead of passing a
+dictionary of attributes to a search method. That is, you can replace
+soup(args={"id" : "5"}) with soup(id="5"). You can still use args if
+(for instance) you need to find an attribute whose name clashes with
+the name of an argument to findAll. [Doc reference: **kwargs attrs]
+
+The method names have changed to the better method names used in
+Rubyful Soup. Instead of find methods and fetch methods, there are
+only find methods. Instead of a scheme where you can't remember which
+method finds one element and which one finds them all, we have find
+and findAll. In general, if the method name mentions All or a plural
+noun (eg. findNextSiblings), then it finds many elements.
+Otherwise, it only finds one element. [Doc reference]
+
+Some of the argument names have been renamed for clarity. For instance
+avoidParserProblems is now parserMassage.
+
+Beautiful Soup no longer implements a feed method. You need to pass a
+string or a filehandle into the soup constructor, not with feed after
+the soup has been created. There is still a feed method, but it's the
+feed method implemented by SGMLParser and calling it will bypass
+Beautiful Soup and cause problems.
+
+The NavigableText class has been renamed to NavigableString. There is
+no NavigableUnicodeString anymore, because every string inside a
+Beautiful Soup parse tree is a Unicode string.
+
+findText and fetchText are gone. Just pass a text argument into find
+or findAll.
+
+Null was more trouble than it was worth, so I got rid of it. Anything
+that used to return Null now returns None.
+
+Special XML constructs like comments and CDATA now have their own
+NavigableString subclasses, instead of being treated as oddly-formed
+data. If you parse a document that contains CDATA and write it back
+out, the CDATA will still be there.
+
+When you're parsing a document, you can get Beautiful Soup to convert
+XML or HTML entities into the corresponding Unicode characters. [Doc
+reference]
+
+= 2.1.1 (20050918) =
+
+Fixed a serious performance bug in BeautifulStoneSoup which was
+causing parsing to be incredibly slow.
+
+Corrected several entities that were previously being incorrectly
+translated from Microsoft smart-quote-like characters.
+
+Fixed a bug that was breaking text fetch.
+
+Fixed a bug that crashed the parser when text chunks that look like
+HTML tag names showed up within a SCRIPT tag.
+
+THEAD, TBODY, and TFOOT tags are now nestable within TABLE
+tags. Nested tables should parse more sensibly now.
+
+BASE is now considered a self-closing tag.
+
+= 2.1.0 "Game, or any other dish?" (20050504) =
+
+Added a wide variety of new search methods which, given a starting
+point inside the tree, follow a particular navigation member (like
+nextSibling) over and over again, looking for Tag and NavigableText
+objects that match certain criteria. The new methods are findNext,
+fetchNext, findPrevious, fetchPrevious, findNextSibling,
+fetchNextSiblings, findPreviousSibling, fetchPreviousSiblings,
+findParent, and fetchParents. All of these use the same basic code
+used by first and fetch, so you can pass your weird ways of matching
+things into these methods.
+
+The fetch method and its derivatives now accept a limit argument.
+
+You can now pass keyword arguments when calling a Tag object as though
+it were a method.
+
+Fixed a bug that caused all hand-created tags to share a single set of
+attributes.
+
+= 2.0.3 (20050501) =
+
+Fixed Python 2.2 support for iterators.
+
+Fixed a bug that gave the wrong representation to tags within quote
+tags like <script>.
+
+Took some code from Mark Pilgrim that treats CDATA declarations as
+data instead of ignoring them.
+
+Beautiful Soup's setup.py will now do an install even if the unit
+tests fail. It won't build a source distribution if the unit tests
+fail, so I can't release a new version unless they pass.
+
+= 2.0.2 (20050416) =
+
+Added the unit tests in a separate module, and packaged it with
+distutils.
+
+Fixed a bug that sometimes caused renderContents() to return a Unicode
+string even if there was no Unicode in the original string.
+
+Added the done() method, which closes all of the parser's open
+tags. It gets called automatically when you pass in some text to the
+constructor of a parser class; otherwise you must call it yourself.
+
+Reinstated some backwards compatibility with 1.x versions: referencing
+the string member of a NavigableText object returns the NavigableText
+object instead of throwing an error.
+
+= 2.0.1 (20050412) =
+
+Fixed a bug that caused bad results when you tried to reference a tag
+name shorter than 3 characters as a member of a Tag, eg. tag.table.td.
+
+Made sure all Tags have the 'hidden' attribute so that an attempt to
+access tag.hidden doesn't spawn an attempt to find a tag named
+'hidden'.
+
+Fixed a bug in the comparison operator.
+
+= 2.0.0 "Who cares for fish?" (20050410) =
+
+Beautiful Soup version 1 was very useful but also pretty stupid. I
+originally wrote it without noticing any of the problems inherent in
+trying to build a parse tree out of ambiguous HTML tags. This version
+solves all of those problems to my satisfaction. It also adds many new
+clever things to make up for the removal of the stupid things.
+
+== Parsing ==
+
+The parser logic has been greatly improved, and the BeautifulSoup
+class should much more reliably yield a parse tree that looks like
+what the page author intended. For a particular class of odd edge
+cases that now causes problems, there is a new class,
+ICantBelieveItsBeautifulSoup.
+
+By default, Beautiful Soup now performs some cleanup operations on
+text before parsing it. This is to avoid common problems with bad
+definitions and self-closing tags that crash SGMLParser. You can
+provide your own set of cleanup operations, or turn it off
+altogether. The cleanup operations include fixing self-closing tags
+that don't close, and replacing Microsoft smart quotes and similar
+characters with their HTML entity equivalents.
+
+You can now get a pretty-print version of parsed HTML to get a visual
+picture of how Beautiful Soup parses it, with the Tag.prettify()
+method.
+
+== Strings and Unicode ==
+
+There are separate NavigableText subclasses for ASCII and Unicode
+strings. These classes directly subclass the corresponding base data
+types. This means you can treat NavigableText objects as strings
+instead of having to call methods on them to get the strings.
+
+str() on a Tag always returns a string, and unicode() always returns
+Unicode. Previously it was inconsistent.
+
+== Tree traversal ==
+
+In a first() or fetch() call, the tag name or the desired value of an
+attribute can now be any of the following:
+
+ * A string (matches that specific tag or that specific attribute value)
+ * A list of strings (matches any tag or attribute value in the list)
+ * A compiled regular expression object (matches any tag or attribute
+ value that matches the regular expression)
+ * A callable object that takes the Tag object or attribute value as a
+ string. It returns None/false/empty string if the given string
+ doesn't match, and any other value if it does.
+
+This is much easier to use than SQL-style wildcards (see, regular
+expressions are good for something). Because of this, I took out
+SQL-style wildcards. I'll put them back if someone complains, but
+their removal simplifies the code a lot.
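+
+To illustrate the four match types (a rough sketch against the 2.x API
+described in these notes; the import name is an assumption, and this is
+not the bs4 API found elsewhere in this tree):
+
+    import re
+    from BeautifulSoup import BeautifulSoup   # hypothetical 2.x-era import
+
+    soup = BeautifulSoup("<b>one</b><i>two</i><tt>three</tt>")
+    soup.fetch('b')                        # a specific tag name
+    soup.fetch(['b', 'i'])                 # any name in the list
+    soup.fetch(re.compile('^t'))           # names matching a regular expression
+    soup.fetch(lambda tag: not tag.attrs)  # a callable applied to each Tag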
+
+You can use fetch() and first() to search for text in the parse tree,
+not just tags. There are new alias methods fetchText() and firstText()
+designed for this purpose. As with searching for tags, you can pass in
+a string, a regular expression object, or a method to match your text.
+
+If you pass in something besides a map to the attrs argument of
+fetch() or first(), Beautiful Soup will assume you want to match that
+thing against the "class" attribute. When you're scraping
+well-structured HTML, this makes your code a lot cleaner.
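+
+For example (again a sketch against the 2.x API described above):
+
+    soup.fetchText(re.compile('Soup'))   # every string mentioning 'Soup'
+    soup.fetch('td', 'highlight')        # <td> tags whose class is "highlight"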
+
+1.x and 2.x both let you call a Tag object as a shorthand for
+fetch(). For instance, foo("bar") is a shorthand for
+foo.fetch("bar"). In 2.x, you can also access a specially-named member
+of a Tag object as a shorthand for first(). For instance, foo.barTag
+is a shorthand for foo.first("bar"). By chaining these shortcuts you
+traverse a tree in very little code: for header in
+soup.bodyTag.pTag.tableTag('th'):
+
+If an element relationship (like parent or next) doesn't apply to a
+tag, it'll now show up as Null instead of None. first() will also return
+Null if you ask it for a nonexistent tag. Null is an object that's
+just like None, except you can do whatever you want to it and it'll
+give you Null instead of throwing an error.
+
+This lets you do tree traversals like soup.htmlTag.headTag.titleTag
+without having to worry if the intermediate stages are actually
+there. Previously, if there was no 'head' tag in the document, headTag
+in that instance would have been None, and accessing its 'titleTag'
+member would have thrown an AttributeError. Now, you can get what you
+want when it exists, and get Null when it doesn't, without having to
+do a lot of conditional checks to see if every stage is None.
+
+There are two new relations between page elements: previousSibling and
+nextSibling. They reference the previous and next element at the same
+level of the parse tree. For instance, if you have HTML like this:
+
+ <p><ul><li>Foo<br /><li>Bar</ul>
+
+The first 'li' tag has a previousSibling of Null and its nextSibling
+is the second 'li' tag. The second 'li' tag has a nextSibling of Null
+and its previousSibling is the first 'li' tag. The previousSibling of
+the 'ul' tag is the first 'p' tag. The nextSibling of 'Foo' is the
+'br' tag.
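+
+As a quick sketch (2.x API, assuming a soup built from the HTML above):
+
+    soup.first('li').nextSibling       # the second 'li' tag
+    soup.first('li').previousSibling   # Null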
+
+I took out the ability to use fetch() to find tags that have a
+specific list of contents. See, I can't even explain it well. It was
+really difficult to use, I never used it, and I don't think anyone
+else ever used it. To the extent anyone did, they can probably use
+fetchText() instead. If it turns out someone needs it I'll think of
+another solution.
+
+== Tree manipulation ==
+
+You can add new attributes to a tag, and delete attributes from a
+tag. In 1.x you could only change a tag's existing attributes.
+
+== Porting Considerations ==
+
+There are three changes in 2.0 that break old code:
+
+In the post-1.2 release you could pass a function into fetch(). The
+function took a string, the tag name. In 2.0, the function takes the
+actual Tag object.
+
+It's no longer possible to pass in SQL-style wildcards to fetch(). Use a
+regular expression instead.
+
+The different parsing algorithm means the parse tree may not be shaped
+like you expect. This will only actually affect you if your code uses
+one of the affected parts. I haven't run into this problem yet while
+porting my code.
+
+= Between 1.2 and 2.0 =
+
+This is the release to get if you want Python 1.5 compatibility.
+
+The desired value of an attribute can now be any of the following:
+
+ * A string
+ * A string with SQL-style wildcards
+ * A compiled RE object
+ * A callable that returns None/false/empty string if the given value
+ doesn't match, and any other value otherwise.
+
+This is much easier to use than SQL-style wildcards (see, regular
+expressions are good for something). Because of this, I no longer
+recommend you use SQL-style wildcards. They may go away in a future
+release to clean up the code.
+
+Made Beautiful Soup handle processing instructions as text instead of
+ignoring them.
+
+Applied patch from Richie Hindle (richie at entrian dot com) that
+makes tag.string a shorthand for tag.contents[0].string when the tag
+has only one string-owning child.
+
+Added still more nestable tags. The nestable tags thing won't work in
+a lot of cases and needs to be rethought.
+
+Fixed an edge case where searching for "%foo" would match any string
+shorter than "foo".
+
+= 1.2 "Who for such dainties would not stoop?" (20040708) =
+
+Applied patch from Ben Last (ben at benlast dot com) that made
+Tag.renderContents() correctly handle Unicode.
+
+Made BeautifulStoneSoup even dumber by making it not implicitly close
+a tag when another tag of the same type is encountered; only when an
+actual closing tag is encountered. This change courtesy of Fuzzy (mike
+at pcblokes dot com). BeautifulSoup still works as before.
+
+= 1.1 "Swimming in a hot tureen" =
+
+Added more 'nestable' tags. Changed popping semantics so that when a
+nestable tag is encountered, tags are popped up to the previously
+encountered nestable tag (of whatever kind). I will revert this if
+enough people complain, but it should make more people's lives easier
+than harder. This enhancement was suggested by Anthony Baxter (anthony
+at interlink dot com dot au).
+
+= 1.0 "So rich and green" (20040420) =
+
+Initial release.
diff --git a/bitbake/lib/bs4/__init__.py b/bitbake/lib/bs4/__init__.py
new file mode 100644
index 0000000..7ba3426
--- /dev/null
+++ b/bitbake/lib/bs4/__init__.py
@@ -0,0 +1,406 @@
+"""Beautiful Soup
+Elixir and Tonic
+"The Screen-Scraper's Friend"
+http://www.crummy.com/software/BeautifulSoup/
+
+Beautiful Soup uses a pluggable XML or HTML parser to parse a
+(possibly invalid) document into a tree representation. Beautiful Soup
+provides methods and Pythonic idioms that make it easy to
+navigate, search, and modify the parse tree.
+
+Beautiful Soup works with Python 2.6 and up. It works better if lxml
+and/or html5lib is installed.
+
+For more than you ever wanted to know about Beautiful Soup, see the
+documentation:
+http://www.crummy.com/software/BeautifulSoup/bs4/doc/
+"""
+
+__author__ = "Leonard Richardson (leonardr@segfault.org)"
+__version__ = "4.3.2"
+__copyright__ = "Copyright (c) 2004-2013 Leonard Richardson"
+__license__ = "MIT"
+
+__all__ = ['BeautifulSoup']
+
+import os
+import re
+import warnings
+
+from .builder import builder_registry, ParserRejectedMarkup
+from .dammit import UnicodeDammit
+from .element import (
+ CData,
+ Comment,
+ DEFAULT_OUTPUT_ENCODING,
+ Declaration,
+ Doctype,
+ NavigableString,
+ PageElement,
+ ProcessingInstruction,
+ ResultSet,
+ SoupStrainer,
+ Tag,
+ )
+
+# The very first thing we do is give a useful error if someone is
+# running this code under Python 3 without converting it.
+syntax_error = u'You are trying to run the Python 2 version of Beautiful Soup under Python 3. This will not work. You need to convert the code, either by installing it (`python setup.py install`) or by running 2to3 (`2to3 -w bs4`).'
+
+class BeautifulSoup(Tag):
+ """
+ This class defines the basic interface called by the tree builders.
+
+ These methods will be called by the parser:
+ reset()
+ feed(markup)
+
+ The tree builder may call these methods from its feed() implementation:
+ handle_starttag(name, attrs) # See note about return value
+ handle_endtag(name)
+ handle_data(data) # Appends to the current data node
+ endData(containerClass=NavigableString) # Ends the current data node
+
+ No matter how complicated the underlying parser is, you should be
+ able to build a tree using 'start tag' events, 'end tag' events,
+ 'data' events, and "done with data" events.
+
+ If you encounter an empty-element tag (aka a self-closing tag,
+ like HTML's <br> tag), call handle_starttag and then
+ handle_endtag.
+ """
+ ROOT_TAG_NAME = u'[document]'
+
+ # If the end-user gives no indication which tree builder they
+ # want, look for one with these features.
+ DEFAULT_BUILDER_FEATURES = ['html', 'fast']
+
+ ASCII_SPACES = '\x20\x0a\x09\x0c\x0d'
+
+ def __init__(self, markup="", features=None, builder=None,
+ parse_only=None, from_encoding=None, **kwargs):
+ """The Soup object is initialized as the 'root tag', and the
+ provided markup (which can be a string or a file-like object)
+ is fed into the underlying parser."""
+
+ if 'convertEntities' in kwargs:
+ warnings.warn(
+ "BS4 does not respect the convertEntities argument to the "
+ "BeautifulSoup constructor. Entities are always converted "
+ "to Unicode characters.")
+
+ if 'markupMassage' in kwargs:
+ del kwargs['markupMassage']
+ warnings.warn(
+ "BS4 does not respect the markupMassage argument to the "
+ "BeautifulSoup constructor. The tree builder is responsible "
+ "for any necessary markup massage.")
+
+ if 'smartQuotesTo' in kwargs:
+ del kwargs['smartQuotesTo']
+ warnings.warn(
+ "BS4 does not respect the smartQuotesTo argument to the "
+ "BeautifulSoup constructor. Smart quotes are always converted "
+ "to Unicode characters.")
+
+ if 'selfClosingTags' in kwargs:
+ del kwargs['selfClosingTags']
+ warnings.warn(
+ "BS4 does not respect the selfClosingTags argument to the "
+ "BeautifulSoup constructor. The tree builder is responsible "
+ "for understanding self-closing tags.")
+
+ if 'isHTML' in kwargs:
+ del kwargs['isHTML']
+ warnings.warn(
+ "BS4 does not respect the isHTML argument to the "
+ "BeautifulSoup constructor. You can pass in features='html' "
+ "or features='xml' to get a builder capable of handling "
+ "one or the other.")
+
+ def deprecated_argument(old_name, new_name):
+ if old_name in kwargs:
+ warnings.warn(
+ 'The "%s" argument to the BeautifulSoup constructor '
+ 'has been renamed to "%s."' % (old_name, new_name))
+ value = kwargs[old_name]
+ del kwargs[old_name]
+ return value
+ return None
+
+ parse_only = parse_only or deprecated_argument(
+ "parseOnlyThese", "parse_only")
+
+ from_encoding = from_encoding or deprecated_argument(
+ "fromEncoding", "from_encoding")
+
+ if len(kwargs) > 0:
+ arg = kwargs.keys().pop()
+ raise TypeError(
+ "__init__() got an unexpected keyword argument '%s'" % arg)
+
+ if builder is None:
+ if isinstance(features, basestring):
+ features = [features]
+ if features is None or len(features) == 0:
+ features = self.DEFAULT_BUILDER_FEATURES
+ builder_class = builder_registry.lookup(*features)
+ if builder_class is None:
+ raise FeatureNotFound(
+ "Couldn't find a tree builder with the features you "
+ "requested: %s. Do you need to install a parser library?"
+ % ",".join(features))
+ builder = builder_class()
+ self.builder = builder
+ self.is_xml = builder.is_xml
+ self.builder.soup = self
+
+ self.parse_only = parse_only
+
+ if hasattr(markup, 'read'): # It's a file-type object.
+ markup = markup.read()
+ elif len(markup) <= 256:
+ # Print out warnings for a couple beginner problems
+ # involving passing non-markup to Beautiful Soup.
+ # Beautiful Soup will still parse the input as markup,
+ # just in case that's what the user really wants.
+ if (isinstance(markup, unicode)
+ and not os.path.supports_unicode_filenames):
+ possible_filename = markup.encode("utf8")
+ else:
+ possible_filename = markup
+ is_file = False
+ try:
+ is_file = os.path.exists(possible_filename)
+ except Exception, e:
+ # This is almost certainly a problem involving
+ # characters not valid in filenames on this
+ # system. Just let it go.
+ pass
+ if is_file:
+ warnings.warn(
+ '"%s" looks like a filename, not markup. You should probably open this file and pass the filehandle into Beautiful Soup.' % markup)
+ if markup[:5] == "http:" or markup[:6] == "https:":
+ # TODO: This is ugly but I couldn't get it to work in
+ # Python 3 otherwise.
+ if ((isinstance(markup, bytes) and not b' ' in markup)
+ or (isinstance(markup, unicode) and not u' ' in markup)):
+ warnings.warn(
+ '"%s" looks like a URL. Beautiful Soup is not an HTTP client. You should probably use an HTTP client to get the document behind the URL, and feed that document to Beautiful Soup.' % markup)
+
+ for (self.markup, self.original_encoding, self.declared_html_encoding,
+ self.contains_replacement_characters) in (
+ self.builder.prepare_markup(markup, from_encoding)):
+ self.reset()
+ try:
+ self._feed()
+ break
+ except ParserRejectedMarkup:
+ pass
+
+ # Clear out the markup and remove the builder's circular
+ # reference to this object.
+ self.markup = None
+ self.builder.soup = None
+
+ def _feed(self):
+ # Convert the document to Unicode.
+ self.builder.reset()
+
+ self.builder.feed(self.markup)
+ # Close out any unfinished strings and close all the open tags.
+ self.endData()
+ while self.currentTag.name != self.ROOT_TAG_NAME:
+ self.popTag()
+
+ def reset(self):
+ Tag.__init__(self, self, self.builder, self.ROOT_TAG_NAME)
+ self.hidden = 1
+ self.builder.reset()
+ self.current_data = []
+ self.currentTag = None
+ self.tagStack = []
+ self.preserve_whitespace_tag_stack = []
+ self.pushTag(self)
+
+ def new_tag(self, name, namespace=None, nsprefix=None, **attrs):
+ """Create a new tag associated with this soup."""
+ return Tag(None, self.builder, name, namespace, nsprefix, attrs)
+
+ def new_string(self, s, subclass=NavigableString):
+ """Create a new NavigableString associated with this soup."""
+ navigable = subclass(s)
+ navigable.setup()
+ return navigable
+
+ def insert_before(self, successor):
+ raise NotImplementedError("BeautifulSoup objects don't support insert_before().")
+
+ def insert_after(self, successor):
+ raise NotImplementedError("BeautifulSoup objects don't support insert_after().")
+
+ def popTag(self):
+ tag = self.tagStack.pop()
+ if self.preserve_whitespace_tag_stack and tag == self.preserve_whitespace_tag_stack[-1]:
+ self.preserve_whitespace_tag_stack.pop()
+ #print "Pop", tag.name
+ if self.tagStack:
+ self.currentTag = self.tagStack[-1]
+ return self.currentTag
+
+ def pushTag(self, tag):
+ #print "Push", tag.name
+ if self.currentTag:
+ self.currentTag.contents.append(tag)
+ self.tagStack.append(tag)
+ self.currentTag = self.tagStack[-1]
+ if tag.name in self.builder.preserve_whitespace_tags:
+ self.preserve_whitespace_tag_stack.append(tag)
+
+ def endData(self, containerClass=NavigableString):
+ if self.current_data:
+ current_data = u''.join(self.current_data)
+ # If whitespace is not preserved, and this string contains
+ # nothing but ASCII spaces, replace it with a single space
+ # or newline.
+ if not self.preserve_whitespace_tag_stack:
+ strippable = True
+ for i in current_data:
+ if i not in self.ASCII_SPACES:
+ strippable = False
+ break
+ if strippable:
+ if '\n' in current_data:
+ current_data = '\n'
+ else:
+ current_data = ' '
+
+ # Reset the data collector.
+ self.current_data = []
+
+ # Should we add this string to the tree at all?
+ if self.parse_only and len(self.tagStack) <= 1 and \
+ (not self.parse_only.text or \
+ not self.parse_only.search(current_data)):
+ return
+
+ o = containerClass(current_data)
+ self.object_was_parsed(o)
+
+ def object_was_parsed(self, o, parent=None, most_recent_element=None):
+ """Add an object to the parse tree."""
+ parent = parent or self.currentTag
+ most_recent_element = most_recent_element or self._most_recent_element
+ o.setup(parent, most_recent_element)
+
+ if most_recent_element is not None:
+ most_recent_element.next_element = o
+ self._most_recent_element = o
+ parent.contents.append(o)
+
+ def _popToTag(self, name, nsprefix=None, inclusivePop=True):
+ """Pops the tag stack up to and including the most recent
+ instance of the given tag. If inclusivePop is false, pops the tag
+        stack up to but *not* including the most recent instance of
+ the given tag."""
+ #print "Popping to %s" % name
+ if name == self.ROOT_TAG_NAME:
+ # The BeautifulSoup object itself can never be popped.
+ return
+
+ most_recently_popped = None
+
+ stack_size = len(self.tagStack)
+ for i in range(stack_size - 1, 0, -1):
+ t = self.tagStack[i]
+ if (name == t.name and nsprefix == t.prefix):
+ if inclusivePop:
+ most_recently_popped = self.popTag()
+ break
+ most_recently_popped = self.popTag()
+
+ return most_recently_popped
+
+ def handle_starttag(self, name, namespace, nsprefix, attrs):
+ """Push a start tag on to the stack.
+
+ If this method returns None, the tag was rejected by the
+        SoupStrainer. You should proceed as if the tag had not occurred
+ in the document. For instance, if this was a self-closing tag,
+ don't call handle_endtag.
+ """
+
+ # print "Start tag %s: %s" % (name, attrs)
+ self.endData()
+
+ if (self.parse_only and len(self.tagStack) <= 1
+ and (self.parse_only.text
+ or not self.parse_only.search_tag(name, attrs))):
+ return None
+
+ tag = Tag(self, self.builder, name, namespace, nsprefix, attrs,
+ self.currentTag, self._most_recent_element)
+ if tag is None:
+ return tag
+ if self._most_recent_element:
+ self._most_recent_element.next_element = tag
+ self._most_recent_element = tag
+ self.pushTag(tag)
+ return tag
+
+ def handle_endtag(self, name, nsprefix=None):
+ #print "End tag: " + name
+ self.endData()
+ self._popToTag(name, nsprefix)
+
+ def handle_data(self, data):
+ self.current_data.append(data)
+
+ def decode(self, pretty_print=False,
+ eventual_encoding=DEFAULT_OUTPUT_ENCODING,
+ formatter="minimal"):
+ """Returns a string or Unicode representation of this document.
+ To get Unicode, pass None for encoding."""
+
+ if self.is_xml:
+ # Print the XML declaration
+ encoding_part = ''
+ if eventual_encoding != None:
+ encoding_part = ' encoding="%s"' % eventual_encoding
+ prefix = u'<?xml version="1.0"%s?>\n' % encoding_part
+ else:
+ prefix = u''
+ if not pretty_print:
+ indent_level = None
+ else:
+ indent_level = 0
+ return prefix + super(BeautifulSoup, self).decode(
+ indent_level, eventual_encoding, formatter)
+
+# Alias to make it easier to type import: 'from bs4 import _soup'
+_s = BeautifulSoup
+_soup = BeautifulSoup
+
+class BeautifulStoneSoup(BeautifulSoup):
+ """Deprecated interface to an XML parser."""
+
+ def __init__(self, *args, **kwargs):
+ kwargs['features'] = 'xml'
+ warnings.warn(
+ 'The BeautifulStoneSoup class is deprecated. Instead of using '
+ 'it, pass features="xml" into the BeautifulSoup constructor.')
+ super(BeautifulStoneSoup, self).__init__(*args, **kwargs)
+
+
+class StopParsing(Exception):
+ pass
+
+class FeatureNotFound(ValueError):
+ pass
+
+
+#By default, act as an HTML pretty-printer.
+if __name__ == '__main__':
+ import sys
+ soup = BeautifulSoup(sys.stdin)
+ print soup.prettify()
diff --git a/bitbake/lib/bs4/builder/__init__.py b/bitbake/lib/bs4/builder/__init__.py
new file mode 100644
index 0000000..740f5f2
--- /dev/null
+++ b/bitbake/lib/bs4/builder/__init__.py
@@ -0,0 +1,321 @@
+from collections import defaultdict
+import itertools
+import sys
+from bs4.element import (
+ CharsetMetaAttributeValue,
+ ContentMetaAttributeValue,
+ whitespace_re
+ )
+
+__all__ = [
+ 'HTMLTreeBuilder',
+ 'SAXTreeBuilder',
+ 'TreeBuilder',
+ 'TreeBuilderRegistry',
+ ]
+
+# Some useful features for a TreeBuilder to have.
+FAST = 'fast'
+PERMISSIVE = 'permissive'
+STRICT = 'strict'
+XML = 'xml'
+HTML = 'html'
+HTML_5 = 'html5'
+
+
+class TreeBuilderRegistry(object):
+
+ def __init__(self):
+ self.builders_for_feature = defaultdict(list)
+ self.builders = []
+
+ def register(self, treebuilder_class):
+ """Register a treebuilder based on its advertised features."""
+ for feature in treebuilder_class.features:
+ self.builders_for_feature[feature].insert(0, treebuilder_class)
+ self.builders.insert(0, treebuilder_class)
+
+ def lookup(self, *features):
+ if len(self.builders) == 0:
+ # There are no builders at all.
+ return None
+
+ if len(features) == 0:
+ # They didn't ask for any features. Give them the most
+ # recently registered builder.
+ return self.builders[0]
+
+ # Go down the list of features in order, and eliminate any builders
+ # that don't match every feature.
+ features = list(features)
+ features.reverse()
+ candidates = None
+ candidate_set = None
+ while len(features) > 0:
+ feature = features.pop()
+ we_have_the_feature = self.builders_for_feature.get(feature, [])
+ if len(we_have_the_feature) > 0:
+ if candidates is None:
+ candidates = we_have_the_feature
+ candidate_set = set(candidates)
+ else:
+ # Eliminate any candidates that don't have this feature.
+ candidate_set = candidate_set.intersection(
+ set(we_have_the_feature))
+
+ # The only valid candidates are the ones in candidate_set.
+ # Go through the original list of candidates and pick the first one
+ # that's in candidate_set.
+ if candidate_set is None:
+ return None
+ for candidate in candidates:
+ if candidate in candidate_set:
+ return candidate
+ return None
+
+# The BeautifulSoup class will take feature lists from developers and use them
+# to look up builders in this registry.
+builder_registry = TreeBuilderRegistry()
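+
+# For example (illustrative; which class comes back depends on what parser
+# libraries are installed when the modules below register themselves):
+#
+#   builder_class = builder_registry.lookup('html', 'fast')
+#   builder = builder_class() if builder_class is not None else None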
+
+class TreeBuilder(object):
+ """Turn a document into a Beautiful Soup object tree."""
+
+ features = []
+
+ is_xml = False
+ preserve_whitespace_tags = set()
+ empty_element_tags = None # A tag will be considered an empty-element
+ # tag when and only when it has no contents.
+
+ # A value for these tag/attribute combinations is a space- or
+ # comma-separated list of CDATA, rather than a single CDATA.
+ cdata_list_attributes = {}
+
+
+ def __init__(self):
+ self.soup = None
+
+ def reset(self):
+ pass
+
+ def can_be_empty_element(self, tag_name):
+ """Might a tag with this name be an empty-element tag?
+
+ The final markup may or may not actually present this tag as
+ self-closing.
+
+ For instance: an HTMLBuilder does not consider a <p> tag to be
+ an empty-element tag (it's not in
+ HTMLBuilder.empty_element_tags). This means an empty <p> tag
+ will be presented as "<p></p>", not "<p />".
+
+ The default implementation has no opinion about which tags are
+ empty-element tags, so a tag will be presented as an
+ empty-element tag if and only if it has no contents.
+ "<foo></foo>" will become "<foo />", and "<foo>bar</foo>" will
+ be left alone.
+ """
+ if self.empty_element_tags is None:
+ return True
+ return tag_name in self.empty_element_tags
+
+ def feed(self, markup):
+ raise NotImplementedError()
+
+ def prepare_markup(self, markup, user_specified_encoding=None,
+ document_declared_encoding=None):
+ return markup, None, None, False
+
+ def test_fragment_to_document(self, fragment):
+ """Wrap an HTML fragment to make it look like a document.
+
+ Different parsers do this differently. For instance, lxml
+ introduces an empty <head> tag, and html5lib
+ doesn't. Abstracting this away lets us write simple tests
+ which run HTML fragments through the parser and compare the
+ results against other HTML fragments.
+
+ This method should not be used outside of tests.
+ """
+ return fragment
+
+ def set_up_substitutions(self, tag):
+ return False
+
+ def _replace_cdata_list_attribute_values(self, tag_name, attrs):
+ """Replaces class="foo bar" with class=["foo", "bar"]
+
+ Modifies its input in place.
+ """
+ if not attrs:
+ return attrs
+ if self.cdata_list_attributes:
+ universal = self.cdata_list_attributes.get('*', [])
+ tag_specific = self.cdata_list_attributes.get(
+ tag_name.lower(), None)
+ for attr in attrs.keys():
+ if attr in universal or (tag_specific and attr in tag_specific):
+ # We have a "class"-type attribute whose string
+ # value is a whitespace-separated list of
+ # values. Split it into a list.
+ value = attrs[attr]
+ if isinstance(value, basestring):
+ values = whitespace_re.split(value)
+ else:
+ # html5lib sometimes calls setAttributes twice
+ # for the same tag when rearranging the parse
+ # tree. On the second call the attribute value
+ # here is already a list. If this happens,
+ # leave the value alone rather than trying to
+ # split it again.
+ values = value
+ attrs[attr] = values
+ return attrs
+
+class SAXTreeBuilder(TreeBuilder):
+ """A Beautiful Soup treebuilder that listens for SAX events."""
+
+ def feed(self, markup):
+ raise NotImplementedError()
+
+ def close(self):
+ pass
+
+ def startElement(self, name, attrs):
+ attrs = dict((key[1], value) for key, value in list(attrs.items()))
+ #print "Start %s, %r" % (name, attrs)
+ self.soup.handle_starttag(name, attrs)
+
+ def endElement(self, name):
+ #print "End %s" % name
+ self.soup.handle_endtag(name)
+
+ def startElementNS(self, nsTuple, nodeName, attrs):
+ # Throw away (ns, nodeName) for now.
+ self.startElement(nodeName, attrs)
+
+ def endElementNS(self, nsTuple, nodeName):
+ # Throw away (ns, nodeName) for now.
+ self.endElement(nodeName)
+ #handler.endElementNS((ns, node.nodeName), node.nodeName)
+
+ def startPrefixMapping(self, prefix, nodeValue):
+ # Ignore the prefix for now.
+ pass
+
+ def endPrefixMapping(self, prefix):
+ # Ignore the prefix for now.
+ # handler.endPrefixMapping(prefix)
+ pass
+
+ def characters(self, content):
+ self.soup.handle_data(content)
+
+ def startDocument(self):
+ pass
+
+ def endDocument(self):
+ pass
+
+
+class HTMLTreeBuilder(TreeBuilder):
+ """This TreeBuilder knows facts about HTML.
+
+ Such as which tags are empty-element tags.
+ """
+
+ preserve_whitespace_tags = set(['pre', 'textarea'])
+ empty_element_tags = set(['br' , 'hr', 'input', 'img', 'meta',
+ 'spacer', 'link', 'frame', 'base'])
+
+ # The HTML standard defines these attributes as containing a
+ # space-separated list of values, not a single value. That is,
+ # class="foo bar" means that the 'class' attribute has two values,
+ # 'foo' and 'bar', not the single value 'foo bar'. When we
+ # encounter one of these attributes, we will parse its value into
+ # a list of values if possible. Upon output, the list will be
+ # converted back into a string.
+ cdata_list_attributes = {
+ "*" : ['class', 'accesskey', 'dropzone'],
+ "a" : ['rel', 'rev'],
+ "link" : ['rel', 'rev'],
+ "td" : ["headers"],
+ "th" : ["headers"],
+ "td" : ["headers"],
+ "form" : ["accept-charset"],
+ "object" : ["archive"],
+
+ # These are HTML5 specific, as are *.accesskey and *.dropzone above.
+ "area" : ["rel"],
+ "icon" : ["sizes"],
+ "iframe" : ["sandbox"],
+ "output" : ["for"],
+ }
+
+ def set_up_substitutions(self, tag):
+ # We are only interested in <meta> tags
+ if tag.name != 'meta':
+ return False
+
+ http_equiv = tag.get('http-equiv')
+ content = tag.get('content')
+ charset = tag.get('charset')
+
+ # We are interested in <meta> tags that say what encoding the
+ # document was originally in. This means HTML 5-style <meta>
+ # tags that provide the "charset" attribute. It also means
+ # HTML 4-style <meta> tags that provide the "content"
+ # attribute and have "http-equiv" set to "content-type".
+ #
+ # In both cases we will replace the value of the appropriate
+ # attribute with a standin object that can take on any
+ # encoding.
+ meta_encoding = None
+ if charset is not None:
+ # HTML 5 style:
+ # <meta charset="utf8">
+ meta_encoding = charset
+ tag['charset'] = CharsetMetaAttributeValue(charset)
+
+ elif (content is not None and http_equiv is not None
+ and http_equiv.lower() == 'content-type'):
+ # HTML 4 style:
+ # <meta http-equiv="content-type" content="text/html; charset=utf8">
+ tag['content'] = ContentMetaAttributeValue(content)
+
+ return (meta_encoding is not None)
+
+def register_treebuilders_from(module):
+ """Copy TreeBuilders from the given module into this module."""
+ # I'm fairly sure this is not the best way to do this.
+ this_module = sys.modules['bs4.builder']
+ for name in module.__all__:
+ obj = getattr(module, name)
+
+ if issubclass(obj, TreeBuilder):
+ setattr(this_module, name, obj)
+ this_module.__all__.append(name)
+ # Register the builder while we're at it.
+ this_module.builder_registry.register(obj)
+
+class ParserRejectedMarkup(Exception):
+ pass
+
+# Builders are registered in reverse order of priority, so that custom
+# builder registrations will take precedence. In general, we want lxml
+# to take precedence over html5lib, because it's faster. And we only
+# want to use HTMLParser as a last resort.
+from . import _htmlparser
+register_treebuilders_from(_htmlparser)
+try:
+ from . import _html5lib
+ register_treebuilders_from(_html5lib)
+except ImportError:
+ # They don't have html5lib installed.
+ pass
+try:
+ from . import _lxml
+ register_treebuilders_from(_lxml)
+except ImportError:
+ # They don't have lxml installed.
+ pass
diff --git a/bitbake/lib/bs4/builder/_html5lib.py b/bitbake/lib/bs4/builder/_html5lib.py
new file mode 100644
index 0000000..7de36ae7
--- /dev/null
+++ b/bitbake/lib/bs4/builder/_html5lib.py
@@ -0,0 +1,285 @@
+__all__ = [
+ 'HTML5TreeBuilder',
+ ]
+
+import warnings
+from bs4.builder import (
+ PERMISSIVE,
+ HTML,
+ HTML_5,
+ HTMLTreeBuilder,
+ )
+from bs4.element import NamespacedAttribute
+import html5lib
+from html5lib.constants import namespaces
+from bs4.element import (
+ Comment,
+ Doctype,
+ NavigableString,
+ Tag,
+ )
+
+class HTML5TreeBuilder(HTMLTreeBuilder):
+ """Use html5lib to build a tree."""
+
+ features = ['html5lib', PERMISSIVE, HTML_5, HTML]
+
+ def prepare_markup(self, markup, user_specified_encoding):
+ # Store the user-specified encoding for use later on.
+ self.user_specified_encoding = user_specified_encoding
+ yield (markup, None, None, False)
+
+ # These methods are defined by Beautiful Soup.
+ def feed(self, markup):
+ if self.soup.parse_only is not None:
+ warnings.warn("You provided a value for parse_only, but the html5lib tree builder doesn't support parse_only. The entire document will be parsed.")
+ parser = html5lib.HTMLParser(tree=self.create_treebuilder)
+ doc = parser.parse(markup, encoding=self.user_specified_encoding)
+
+ # Set the character encoding detected by the tokenizer.
+ if isinstance(markup, unicode):
+ # We need to special-case this because html5lib sets
+ # charEncoding to UTF-8 if it gets Unicode input.
+ doc.original_encoding = None
+ else:
+ doc.original_encoding = parser.tokenizer.stream.charEncoding[0]
+
+ def create_treebuilder(self, namespaceHTMLElements):
+ self.underlying_builder = TreeBuilderForHtml5lib(
+ self.soup, namespaceHTMLElements)
+ return self.underlying_builder
+
+ def test_fragment_to_document(self, fragment):
+ """See `TreeBuilder`."""
+ return u'<html><head></head><body>%s</body></html>' % fragment
+
+
+class TreeBuilderForHtml5lib(html5lib.treebuilders._base.TreeBuilder):
+
+ def __init__(self, soup, namespaceHTMLElements):
+ self.soup = soup
+ super(TreeBuilderForHtml5lib, self).__init__(namespaceHTMLElements)
+
+ def documentClass(self):
+ self.soup.reset()
+ return Element(self.soup, self.soup, None)
+
+ def insertDoctype(self, token):
+ name = token["name"]
+ publicId = token["publicId"]
+ systemId = token["systemId"]
+
+ doctype = Doctype.for_name_and_ids(name, publicId, systemId)
+ self.soup.object_was_parsed(doctype)
+
+ def elementClass(self, name, namespace):
+ tag = self.soup.new_tag(name, namespace)
+ return Element(tag, self.soup, namespace)
+
+ def commentClass(self, data):
+ return TextNode(Comment(data), self.soup)
+
+    def fragmentClass(self):
+        # Imported here to avoid a circular import at module load time;
+        # BeautifulSoup is not imported at the top of this module.
+        from bs4 import BeautifulSoup
+        self.soup = BeautifulSoup("")
+ self.soup.name = "[document_fragment]"
+ return Element(self.soup, self.soup, None)
+
+ def appendChild(self, node):
+ # XXX This code is not covered by the BS4 tests.
+ self.soup.append(node.element)
+
+ def getDocument(self):
+ return self.soup
+
+ def getFragment(self):
+ return html5lib.treebuilders._base.TreeBuilder.getFragment(self).element
+
+class AttrList(object):
+ def __init__(self, element):
+ self.element = element
+ self.attrs = dict(self.element.attrs)
+ def __iter__(self):
+ return list(self.attrs.items()).__iter__()
+ def __setitem__(self, name, value):
+ "set attr", name, value
+ self.element[name] = value
+ def items(self):
+ return list(self.attrs.items())
+ def keys(self):
+ return list(self.attrs.keys())
+ def __len__(self):
+ return len(self.attrs)
+ def __getitem__(self, name):
+ return self.attrs[name]
+ def __contains__(self, name):
+ return name in list(self.attrs.keys())
+
+
+class Element(html5lib.treebuilders._base.Node):
+ def __init__(self, element, soup, namespace):
+ html5lib.treebuilders._base.Node.__init__(self, element.name)
+ self.element = element
+ self.soup = soup
+ self.namespace = namespace
+
+ def appendChild(self, node):
+ string_child = child = None
+ if isinstance(node, basestring):
+ # Some other piece of code decided to pass in a string
+ # instead of creating a TextElement object to contain the
+ # string.
+ string_child = child = node
+ elif isinstance(node, Tag):
+ # Some other piece of code decided to pass in a Tag
+ # instead of creating an Element object to contain the
+ # Tag.
+ child = node
+ elif node.element.__class__ == NavigableString:
+ string_child = child = node.element
+ else:
+ child = node.element
+
+ if not isinstance(child, basestring) and child.parent is not None:
+ node.element.extract()
+
+ if (string_child and self.element.contents
+ and self.element.contents[-1].__class__ == NavigableString):
+ # We are appending a string onto another string.
+ # TODO This has O(n^2) performance, for input like
+ # "a</a>a</a>a</a>..."
+ old_element = self.element.contents[-1]
+ new_element = self.soup.new_string(old_element + string_child)
+ old_element.replace_with(new_element)
+ self.soup._most_recent_element = new_element
+ else:
+ if isinstance(node, basestring):
+ # Create a brand new NavigableString from this string.
+ child = self.soup.new_string(node)
+
+ # Tell Beautiful Soup to act as if it parsed this element
+ # immediately after the parent's last descendant. (Or
+ # immediately after the parent, if it has no children.)
+ if self.element.contents:
+ most_recent_element = self.element._last_descendant(False)
+ else:
+ most_recent_element = self.element
+
+ self.soup.object_was_parsed(
+ child, parent=self.element,
+ most_recent_element=most_recent_element)
+
+ def getAttributes(self):
+ return AttrList(self.element)
+
+ def setAttributes(self, attributes):
+ if attributes is not None and len(attributes) > 0:
+
+ converted_attributes = []
+ for name, value in list(attributes.items()):
+ if isinstance(name, tuple):
+ new_name = NamespacedAttribute(*name)
+ del attributes[name]
+ attributes[new_name] = value
+
+ self.soup.builder._replace_cdata_list_attribute_values(
+ self.name, attributes)
+ for name, value in attributes.items():
+ self.element[name] = value
+
+ # The attributes may contain variables that need substitution.
+ # Call set_up_substitutions manually.
+ #
+ # The Tag constructor called this method when the Tag was created,
+ # but we just set/changed the attributes, so call it again.
+ self.soup.builder.set_up_substitutions(self.element)
+ attributes = property(getAttributes, setAttributes)
+
+ def insertText(self, data, insertBefore=None):
+ if insertBefore:
+ text = TextNode(self.soup.new_string(data), self.soup)
+            self.insertBefore(text, insertBefore)
+ else:
+ self.appendChild(data)
+
+ def insertBefore(self, node, refNode):
+ index = self.element.index(refNode.element)
+ if (node.element.__class__ == NavigableString and self.element.contents
+ and self.element.contents[index-1].__class__ == NavigableString):
+ # (See comments in appendChild)
+ old_node = self.element.contents[index-1]
+ new_str = self.soup.new_string(old_node + node.element)
+ old_node.replace_with(new_str)
+ else:
+ self.element.insert(index, node.element)
+ node.parent = self
+
+ def removeChild(self, node):
+ node.element.extract()
+
+ def reparentChildren(self, new_parent):
+ """Move all of this tag's children into another tag."""
+ element = self.element
+ new_parent_element = new_parent.element
+ # Determine what this tag's next_element will be once all the children
+ # are removed.
+ final_next_element = element.next_sibling
+
+ new_parents_last_descendant = new_parent_element._last_descendant(False, False)
+ if len(new_parent_element.contents) > 0:
+ # The new parent already contains children. We will be
+ # appending this tag's children to the end.
+ new_parents_last_child = new_parent_element.contents[-1]
+ new_parents_last_descendant_next_element = new_parents_last_descendant.next_element
+ else:
+ # The new parent contains no children.
+ new_parents_last_child = None
+ new_parents_last_descendant_next_element = new_parent_element.next_element
+
+ to_append = element.contents
+ append_after = new_parent.element.contents
+ if len(to_append) > 0:
+ # Set the first child's previous_element and previous_sibling
+ # to elements within the new parent
+ first_child = to_append[0]
+ first_child.previous_element = new_parents_last_descendant
+ first_child.previous_sibling = new_parents_last_child
+
+ # Fix the last child's next_element and next_sibling
+ last_child = to_append[-1]
+ last_child.next_element = new_parents_last_descendant_next_element
+ last_child.next_sibling = None
+
+ for child in to_append:
+ child.parent = new_parent_element
+ new_parent_element.contents.append(child)
+
+ # Now that this element has no children, change its .next_element.
+ element.contents = []
+ element.next_element = final_next_element
+
+ def cloneNode(self):
+ tag = self.soup.new_tag(self.element.name, self.namespace)
+ node = Element(tag, self.soup, self.namespace)
+ for key,value in self.attributes:
+ node.attributes[key] = value
+ return node
+
+ def hasContent(self):
+ return self.element.contents
+
+ def getNameTuple(self):
+ if self.namespace == None:
+ return namespaces["html"], self.name
+ else:
+ return self.namespace, self.name
+
+ nameTuple = property(getNameTuple)
+
+class TextNode(Element):
+ def __init__(self, element, soup):
+ html5lib.treebuilders._base.Node.__init__(self, None)
+ self.element = element
+ self.soup = soup
+
+ def cloneNode(self):
+ raise NotImplementedError
diff --git a/bitbake/lib/bs4/builder/_htmlparser.py b/bitbake/lib/bs4/builder/_htmlparser.py
new file mode 100644
index 0000000..ca8d8b8
--- /dev/null
+++ b/bitbake/lib/bs4/builder/_htmlparser.py
@@ -0,0 +1,258 @@
+"""Use the HTMLParser library to parse HTML files that aren't too bad."""
+
+__all__ = [
+ 'HTMLParserTreeBuilder',
+ ]
+
+from HTMLParser import (
+ HTMLParser,
+ HTMLParseError,
+ )
+import sys
+import warnings
+
+# Starting in Python 3.2, the HTMLParser constructor takes a 'strict'
+# argument, which we'd like to set to False. Unfortunately,
+# http://bugs.python.org/issue13273 makes strict=True a better bet
+# before Python 3.2.3.
+#
+# At the end of this file, we monkeypatch HTMLParser so that
+# strict=True works well on Python 3.2.2.
+major, minor, release = sys.version_info[:3]
+CONSTRUCTOR_TAKES_STRICT = (
+ major > 3
+ or (major == 3 and minor > 2)
+ or (major == 3 and minor == 2 and release >= 3))
+
+from bs4.element import (
+ CData,
+ Comment,
+ Declaration,
+ Doctype,
+ ProcessingInstruction,
+ )
+from bs4.dammit import EntitySubstitution, UnicodeDammit
+
+from bs4.builder import (
+ HTML,
+ HTMLTreeBuilder,
+ STRICT,
+ )
+
+
+HTMLPARSER = 'html.parser'
+
+class BeautifulSoupHTMLParser(HTMLParser):
+ def handle_starttag(self, name, attrs):
+ # XXX namespace
+ attr_dict = {}
+ for key, value in attrs:
+ # Change None attribute values to the empty string
+ # for consistency with the other tree builders.
+ if value is None:
+ value = ''
+ attr_dict[key] = value
+ attrvalue = '""'
+ self.soup.handle_starttag(name, None, None, attr_dict)
+
+ def handle_endtag(self, name):
+ self.soup.handle_endtag(name)
+
+ def handle_data(self, data):
+ self.soup.handle_data(data)
+
+ def handle_charref(self, name):
+ # XXX workaround for a bug in HTMLParser. Remove this once
+ # it's fixed.
+ if name.startswith('x'):
+ real_name = int(name.lstrip('x'), 16)
+ elif name.startswith('X'):
+ real_name = int(name.lstrip('X'), 16)
+ else:
+ real_name = int(name)
+
+ try:
+ data = unichr(real_name)
+ except (ValueError, OverflowError), e:
+ data = u"\N{REPLACEMENT CHARACTER}"
+
+ self.handle_data(data)
+
+ def handle_entityref(self, name):
+ character = EntitySubstitution.HTML_ENTITY_TO_CHARACTER.get(name)
+ if character is not None:
+ data = character
+ else:
+ data = "&%s;" % name
+ self.handle_data(data)
+
+ def handle_comment(self, data):
+ self.soup.endData()
+ self.soup.handle_data(data)
+ self.soup.endData(Comment)
+
+ def handle_decl(self, data):
+ self.soup.endData()
+ if data.startswith("DOCTYPE "):
+ data = data[len("DOCTYPE "):]
+ elif data == 'DOCTYPE':
+ # i.e. "<!DOCTYPE>"
+ data = ''
+ self.soup.handle_data(data)
+ self.soup.endData(Doctype)
+
+ def unknown_decl(self, data):
+ if data.upper().startswith('CDATA['):
+ cls = CData
+ data = data[len('CDATA['):]
+ else:
+ cls = Declaration
+ self.soup.endData()
+ self.soup.handle_data(data)
+ self.soup.endData(cls)
+
+ def handle_pi(self, data):
+ self.soup.endData()
+ if data.endswith("?") and data.lower().startswith("xml"):
+ # "An XHTML processing instruction using the trailing '?'
+ # will cause the '?' to be included in data." - HTMLParser
+ # docs.
+ #
+ # Strip the question mark so we don't end up with two
+ # question marks.
+ data = data[:-1]
+ self.soup.handle_data(data)
+ self.soup.endData(ProcessingInstruction)
+
+
+class HTMLParserTreeBuilder(HTMLTreeBuilder):
+
+ is_xml = False
+ features = [HTML, STRICT, HTMLPARSER]
+
+ def __init__(self, *args, **kwargs):
+ if CONSTRUCTOR_TAKES_STRICT:
+ kwargs['strict'] = False
+ self.parser_args = (args, kwargs)
+
+ def prepare_markup(self, markup, user_specified_encoding=None,
+ document_declared_encoding=None):
+ """
+ :return: A 4-tuple (markup, original encoding, encoding
+ declared within markup, whether any characters had to be
+ replaced with REPLACEMENT CHARACTER).
+ """
+ if isinstance(markup, unicode):
+ yield (markup, None, None, False)
+ return
+
+ try_encodings = [user_specified_encoding, document_declared_encoding]
+ dammit = UnicodeDammit(markup, try_encodings, is_html=True)
+ yield (dammit.markup, dammit.original_encoding,
+ dammit.declared_html_encoding,
+ dammit.contains_replacement_characters)
+
+ def feed(self, markup):
+ args, kwargs = self.parser_args
+ parser = BeautifulSoupHTMLParser(*args, **kwargs)
+ parser.soup = self.soup
+ try:
+ parser.feed(markup)
+ except HTMLParseError, e:
+ warnings.warn(RuntimeWarning(
+ "Python's built-in HTMLParser cannot parse the given document. This is not a bug in Beautiful Soup. The best solution is to install an external parser (lxml or html5lib), and use Beautiful Soup with that parser. See http://www.crummy.com/software/BeautifulSoup/bs4/doc/#installing-a-parser for help."))
+ raise e
+
+# Patch 3.2 versions of HTMLParser earlier than 3.2.3 to use some
+# 3.2.3 code. This ensures they don't treat markup like <p></p> as a
+# string.
+#
+# XXX This code can be removed once most Python 3 users are on 3.2.3.
+if major == 3 and minor == 2 and not CONSTRUCTOR_TAKES_STRICT:
+ import re
+ attrfind_tolerant = re.compile(
+ r'\s*((?<=[\'"\s])[^\s/>][^\s/=>]*)(\s*=+\s*'
+ r'(\'[^\']*\'|"[^"]*"|(?![\'"])[^>\s]*))?')
+ HTMLParserTreeBuilder.attrfind_tolerant = attrfind_tolerant
+
+ locatestarttagend = re.compile(r"""
+ <[a-zA-Z][-.a-zA-Z0-9:_]* # tag name
+ (?:\s+ # whitespace before attribute name
+ (?:[a-zA-Z_][-.:a-zA-Z0-9_]* # attribute name
+ (?:\s*=\s* # value indicator
+ (?:'[^']*' # LITA-enclosed value
+ |\"[^\"]*\" # LIT-enclosed value
+ |[^'\">\s]+ # bare value
+ )
+ )?
+ )
+ )*
+ \s* # trailing whitespace
+""", re.VERBOSE)
+ BeautifulSoupHTMLParser.locatestarttagend = locatestarttagend
+
+ from html.parser import tagfind, attrfind
+
+ def parse_starttag(self, i):
+ self.__starttag_text = None
+ endpos = self.check_for_whole_start_tag(i)
+ if endpos < 0:
+ return endpos
+ rawdata = self.rawdata
+ self.__starttag_text = rawdata[i:endpos]
+
+ # Now parse the data between i+1 and j into a tag and attrs
+ attrs = []
+ match = tagfind.match(rawdata, i+1)
+ assert match, 'unexpected call to parse_starttag()'
+ k = match.end()
+ self.lasttag = tag = rawdata[i+1:k].lower()
+ while k < endpos:
+ if self.strict:
+ m = attrfind.match(rawdata, k)
+ else:
+ m = attrfind_tolerant.match(rawdata, k)
+ if not m:
+ break
+ attrname, rest, attrvalue = m.group(1, 2, 3)
+ if not rest:
+ attrvalue = None
+ elif attrvalue[:1] == '\'' == attrvalue[-1:] or \
+ attrvalue[:1] == '"' == attrvalue[-1:]:
+ attrvalue = attrvalue[1:-1]
+ if attrvalue:
+ attrvalue = self.unescape(attrvalue)
+ attrs.append((attrname.lower(), attrvalue))
+ k = m.end()
+
+ end = rawdata[k:endpos].strip()
+ if end not in (">", "/>"):
+ lineno, offset = self.getpos()
+ if "\n" in self.__starttag_text:
+ lineno = lineno + self.__starttag_text.count("\n")
+ offset = len(self.__starttag_text) \
+ - self.__starttag_text.rfind("\n")
+ else:
+ offset = offset + len(self.__starttag_text)
+ if self.strict:
+ self.error("junk characters in start tag: %r"
+ % (rawdata[k:endpos][:20],))
+ self.handle_data(rawdata[i:endpos])
+ return endpos
+ if end.endswith('/>'):
+ # XHTML-style empty tag: <span attr="value" />
+ self.handle_startendtag(tag, attrs)
+ else:
+ self.handle_starttag(tag, attrs)
+ if tag in self.CDATA_CONTENT_ELEMENTS:
+ self.set_cdata_mode(tag)
+ return endpos
+
+ def set_cdata_mode(self, elem):
+ self.cdata_elem = elem.lower()
+ self.interesting = re.compile(r'</\s*%s\s*>' % self.cdata_elem, re.I)
+
+ BeautifulSoupHTMLParser.parse_starttag = parse_starttag
+ BeautifulSoupHTMLParser.set_cdata_mode = set_cdata_mode
+
+ CONSTRUCTOR_TAKES_STRICT = True
diff --git a/bitbake/lib/bs4/builder/_lxml.py b/bitbake/lib/bs4/builder/_lxml.py
new file mode 100644
index 0000000..fa5d498
--- /dev/null
+++ b/bitbake/lib/bs4/builder/_lxml.py
@@ -0,0 +1,233 @@
+__all__ = [
+ 'LXMLTreeBuilderForXML',
+ 'LXMLTreeBuilder',
+ ]
+
+from io import BytesIO
+from StringIO import StringIO
+import collections
+from lxml import etree
+from bs4.element import Comment, Doctype, NamespacedAttribute
+from bs4.builder import (
+ FAST,
+ HTML,
+ HTMLTreeBuilder,
+ PERMISSIVE,
+ ParserRejectedMarkup,
+ TreeBuilder,
+ XML)
+from bs4.dammit import EncodingDetector
+
+LXML = 'lxml'
+
+class LXMLTreeBuilderForXML(TreeBuilder):
+ DEFAULT_PARSER_CLASS = etree.XMLParser
+
+ is_xml = True
+
+ # Well, it's permissive by XML parser standards.
+ features = [LXML, XML, FAST, PERMISSIVE]
+
+ CHUNK_SIZE = 512
+
+ # This namespace mapping is specified in the XML Namespace
+ # standard.
+ DEFAULT_NSMAPS = {'http://www.w3.org/XML/1998/namespace' : "xml"}
+
+ def default_parser(self, encoding):
+ # This can either return a parser object or a class, which
+ # will be instantiated with default arguments.
+ if self._default_parser is not None:
+ return self._default_parser
+ return etree.XMLParser(
+ target=self, strip_cdata=False, recover=True, encoding=encoding)
+
+ def parser_for(self, encoding):
+ # Use the default parser.
+ parser = self.default_parser(encoding)
+
+ if isinstance(parser, collections.Callable):
+ # Instantiate the parser with default arguments
+ parser = parser(target=self, strip_cdata=False, encoding=encoding)
+ return parser
+
+ def __init__(self, parser=None, empty_element_tags=None):
+ # TODO: Issue a warning if parser is present but not a
+ # callable, since that means there's no way to create new
+ # parsers for different encodings.
+ self._default_parser = parser
+ if empty_element_tags is not None:
+ self.empty_element_tags = set(empty_element_tags)
+ self.soup = None
+ self.nsmaps = [self.DEFAULT_NSMAPS]
+
+ def _getNsTag(self, tag):
+ # Split the namespace URL out of a fully-qualified lxml tag
+ # name. Copied from lxml's src/lxml/sax.py.
+ if tag[0] == '{':
+ return tuple(tag[1:].split('}', 1))
+ else:
+ return (None, tag)
+
+ def prepare_markup(self, markup, user_specified_encoding=None,
+ document_declared_encoding=None):
+ """
+ :yield: A series of 4-tuples.
+ (markup, encoding, declared encoding,
+ has undergone character replacement)
+
+ Each 4-tuple represents a strategy for parsing the document.
+ """
+ if isinstance(markup, unicode):
+ # We were given Unicode. Maybe lxml can parse Unicode on
+ # this system?
+ yield markup, None, document_declared_encoding, False
+
+ if isinstance(markup, unicode):
+ # No, apparently not. Convert the Unicode to UTF-8 and
+ # tell lxml to parse it as UTF-8.
+ yield (markup.encode("utf8"), "utf8",
+ document_declared_encoding, False)
+
+ # Instead of using UnicodeDammit to convert the bytestring to
+ # Unicode using different encodings, use EncodingDetector to
+ # iterate over the encodings, and tell lxml to try to parse
+ # the document as each one in turn.
+ is_html = not self.is_xml
+ try_encodings = [user_specified_encoding, document_declared_encoding]
+ detector = EncodingDetector(markup, try_encodings, is_html)
+ for encoding in detector.encodings:
+ yield (detector.markup, encoding, document_declared_encoding, False)
+
+ def feed(self, markup):
+ if isinstance(markup, bytes):
+ markup = BytesIO(markup)
+ elif isinstance(markup, unicode):
+ markup = StringIO(markup)
+
+ # Call feed() at least once, even if the markup is empty,
+ # or the parser won't be initialized.
+ data = markup.read(self.CHUNK_SIZE)
+ try:
+ self.parser = self.parser_for(self.soup.original_encoding)
+ self.parser.feed(data)
+ while len(data) != 0:
+ # Now call feed() on the rest of the data, chunk by chunk.
+ data = markup.read(self.CHUNK_SIZE)
+ if len(data) != 0:
+ self.parser.feed(data)
+ self.parser.close()
+ except (UnicodeDecodeError, LookupError, etree.ParserError), e:
+ raise ParserRejectedMarkup(str(e))
+
+ def close(self):
+ self.nsmaps = [self.DEFAULT_NSMAPS]
+
+ def start(self, name, attrs, nsmap={}):
+ # Make sure attrs is a mutable dict--lxml may send an immutable dictproxy.
+ attrs = dict(attrs)
+ nsprefix = None
+ # Invert each namespace map as it comes in.
+ if len(self.nsmaps) > 1:
+ # There are no new namespaces for this tag, but
+ # non-default namespaces are in play, so we need a
+ # separate tag stack to know when they end.
+ self.nsmaps.append(None)
+ elif len(nsmap) > 0:
+ # A new namespace mapping has come into play.
+ inverted_nsmap = dict((value, key) for key, value in nsmap.items())
+ self.nsmaps.append(inverted_nsmap)
+ # Also treat the namespace mapping as a set of attributes on the
+ # tag, so we can recreate it later.
+ attrs = attrs.copy()
+ for prefix, namespace in nsmap.items():
+ attribute = NamespacedAttribute(
+ "xmlns", prefix, "http://www.w3.org/2000/xmlns/")
+ attrs[attribute] = namespace
+
+ # Namespaces are in play. Find any attributes that came in
+ # from lxml with namespaces attached to their names, and
+ # turn then into NamespacedAttribute objects.
+ new_attrs = {}
+ for attr, value in attrs.items():
+ namespace, attr = self._getNsTag(attr)
+ if namespace is None:
+ new_attrs[attr] = value
+ else:
+ nsprefix = self._prefix_for_namespace(namespace)
+ attr = NamespacedAttribute(nsprefix, attr, namespace)
+ new_attrs[attr] = value
+ attrs = new_attrs
+
+ namespace, name = self._getNsTag(name)
+ nsprefix = self._prefix_for_namespace(namespace)
+ self.soup.handle_starttag(name, namespace, nsprefix, attrs)
+
+ def _prefix_for_namespace(self, namespace):
+ """Find the currently active prefix for the given namespace."""
+ if namespace is None:
+ return None
+ for inverted_nsmap in reversed(self.nsmaps):
+ if inverted_nsmap is not None and namespace in inverted_nsmap:
+ return inverted_nsmap[namespace]
+ return None
+
+ def end(self, name):
+ self.soup.endData()
+ completed_tag = self.soup.tagStack[-1]
+ namespace, name = self._getNsTag(name)
+ nsprefix = None
+ if namespace is not None:
+ for inverted_nsmap in reversed(self.nsmaps):
+ if inverted_nsmap is not None and namespace in inverted_nsmap:
+ nsprefix = inverted_nsmap[namespace]
+ break
+ self.soup.handle_endtag(name, nsprefix)
+ if len(self.nsmaps) > 1:
+ # This tag, or one of its parents, introduced a namespace
+ # mapping, so pop it off the stack.
+ self.nsmaps.pop()
+
+ def pi(self, target, data):
+ pass
+
+ def data(self, content):
+ self.soup.handle_data(content)
+
+ def doctype(self, name, pubid, system):
+ self.soup.endData()
+ doctype = Doctype.for_name_and_ids(name, pubid, system)
+ self.soup.object_was_parsed(doctype)
+
+ def comment(self, content):
+ "Handle comments as Comment objects."
+ self.soup.endData()
+ self.soup.handle_data(content)
+ self.soup.endData(Comment)
+
+ def test_fragment_to_document(self, fragment):
+ """See `TreeBuilder`."""
+ return u'<?xml version="1.0" encoding="utf-8"?>\n%s' % fragment
+
+
+class LXMLTreeBuilder(HTMLTreeBuilder, LXMLTreeBuilderForXML):
+
+ features = [LXML, HTML, FAST, PERMISSIVE]
+ is_xml = False
+
+ def default_parser(self, encoding):
+ return etree.HTMLParser
+
+ def feed(self, markup):
+ encoding = self.soup.original_encoding
+ try:
+ self.parser = self.parser_for(encoding)
+ self.parser.feed(markup)
+ self.parser.close()
+ except (UnicodeDecodeError, LookupError, etree.ParserError), e:
+ raise ParserRejectedMarkup(str(e))
+
+
+ def test_fragment_to_document(self, fragment):
+ """See `TreeBuilder`."""
+ return u'<html><body>%s</body></html>' % fragment
diff --git a/bitbake/lib/bs4/dammit.py b/bitbake/lib/bs4/dammit.py
new file mode 100644
index 0000000..59640b7
--- /dev/null
+++ b/bitbake/lib/bs4/dammit.py
@@ -0,0 +1,829 @@
+# -*- coding: utf-8 -*-
+"""Beautiful Soup bonus library: Unicode, Dammit
+
+This library converts a bytestream to Unicode through any means
+necessary. It is heavily based on code from Mark Pilgrim's Universal
+Feed Parser. It works best on XML and HTML, but it does not rewrite the
+XML or HTML to reflect a new encoding; that's the tree builder's job.
+"""
+
+import codecs
+from htmlentitydefs import codepoint2name
+import re
+import logging
+import string
+
+# Import a library to autodetect character encodings.
+chardet_type = None
+try:
+ # First try the fast C implementation.
+ # PyPI package: cchardet
+ import cchardet
+ def chardet_dammit(s):
+ return cchardet.detect(s)['encoding']
+except ImportError:
+ try:
+ # Fall back to the pure Python implementation
+ # Debian package: python-chardet
+ # PyPI package: chardet
+ import chardet
+ def chardet_dammit(s):
+ return chardet.detect(s)['encoding']
+ #import chardet.constants
+ #chardet.constants._debug = 1
+ except ImportError:
+ # No chardet available.
+ def chardet_dammit(s):
+ return None
+
+# Available from http://cjkpython.i18n.org/.
+try:
+ import iconv_codec
+except ImportError:
+ pass
+
+xml_encoding_re = re.compile(
+ '^<\?.*encoding=[\'"](.*?)[\'"].*\?>'.encode(), re.I)
+html_meta_re = re.compile(
+ '<\s*meta[^>]+charset\s*=\s*["\']?([^>]*?)[ /;\'">]'.encode(), re.I)
+
+class EntitySubstitution(object):
+
+ """Substitute XML or HTML entities for the corresponding characters."""
+
+ def _populate_class_variables():
+ lookup = {}
+ reverse_lookup = {}
+ characters_for_re = []
+ for codepoint, name in list(codepoint2name.items()):
+ character = unichr(codepoint)
+ if codepoint != 34:
+ # There's no point in turning the quotation mark into
+ # ", unless it happens within an attribute value, which
+ # is handled elsewhere.
+ characters_for_re.append(character)
+ lookup[character] = name
+            # But we do want to turn &quot; into the quotation mark.
+ reverse_lookup[name] = character
+ re_definition = "[%s]" % "".join(characters_for_re)
+ return lookup, reverse_lookup, re.compile(re_definition)
+ (CHARACTER_TO_HTML_ENTITY, HTML_ENTITY_TO_CHARACTER,
+ CHARACTER_TO_HTML_ENTITY_RE) = _populate_class_variables()
+
+ CHARACTER_TO_XML_ENTITY = {
+ "'": "apos",
+ '"': "quot",
+ "&": "amp",
+ "<": "lt",
+ ">": "gt",
+ }
+
+ BARE_AMPERSAND_OR_BRACKET = re.compile("([<>]|"
+ "&(?!#\d+;|#x[0-9a-fA-F]+;|\w+;)"
+ ")")
+
+ AMPERSAND_OR_BRACKET = re.compile("([<>&])")
+
+ @classmethod
+ def _substitute_html_entity(cls, matchobj):
+ entity = cls.CHARACTER_TO_HTML_ENTITY.get(matchobj.group(0))
+ return "&%s;" % entity
+
+ @classmethod
+ def _substitute_xml_entity(cls, matchobj):
+ """Used with a regular expression to substitute the
+ appropriate XML entity for an XML special character."""
+ entity = cls.CHARACTER_TO_XML_ENTITY[matchobj.group(0)]
+ return "&%s;" % entity
+
+ @classmethod
+ def quoted_attribute_value(self, value):
+ """Make a value into a quoted XML attribute, possibly escaping it.
+
+ Most strings will be quoted using double quotes.
+
+ Bob's Bar -> "Bob's Bar"
+
+ If a string contains double quotes, it will be quoted using
+ single quotes.
+
+ Welcome to "my bar" -> 'Welcome to "my bar"'
+
+ If a string contains both single and double quotes, the
+ double quotes will be escaped, and the string will be quoted
+ using double quotes.
+
+ Welcome to "Bob's Bar" -> "Welcome to &quot;Bob's bar&quot;
+ """
+ quote_with = '"'
+ if '"' in value:
+ if "'" in value:
+ # The string contains both single and double
+ # quotes. Turn the double quotes into
+ # entities. We quote the double quotes rather than
+ # the single quotes because the entity name is
+ # "&quot;" whether this is HTML or XML. If we
+ # quoted the single quotes, we'd have to decide
+ # between &apos; and &squot;.
+ replace_with = "&quot;"
+ value = value.replace('"', replace_with)
+ else:
+ # There are double quotes but no single quotes.
+ # We can use single quotes to quote the attribute.
+ quote_with = "'"
+ return quote_with + value + quote_with
+
+ @classmethod
+ def substitute_xml(cls, value, make_quoted_attribute=False):
+ """Substitute XML entities for special XML characters.
+
+ :param value: A string to be substituted. The less-than sign
+ will become &lt;, the greater-than sign will become &gt;,
+ and any ampersands will become &amp;. If you want ampersands
+ that appear to be part of an entity definition to be left
+ alone, use substitute_xml_containing_entities() instead.
+
+ :param make_quoted_attribute: If True, then the string will be
+ quoted, as befits an attribute value.
+ """
+ # Escape angle brackets and ampersands.
+ value = cls.AMPERSAND_OR_BRACKET.sub(
+ cls._substitute_xml_entity, value)
+
+ if make_quoted_attribute:
+ value = cls.quoted_attribute_value(value)
+ return value
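+
+ # Illustrative usage (added for clarity; not part of the upstream module):
+ # substitute_xml() escapes the three XML metacharacters and can also quote
+ # the result for use as an attribute value, e.g.
+ #
+ #   EntitySubstitution.substitute_xml('AT&T <designs>')
+ #   # -> 'AT&amp;T &lt;designs&gt;'
+ #   EntitySubstitution.substitute_xml('AT&T', make_quoted_attribute=True)
+ #   # -> '"AT&amp;T"'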
+
+ @classmethod
+ def substitute_xml_containing_entities(
+ cls, value, make_quoted_attribute=False):
+ """Substitute XML entities for special XML characters.
+
+ :param value: A string to be substituted. The less-than sign will
+ become &lt;, the greater-than sign will become &gt;, and any
+ ampersands that are not part of an entity definition will
+ become &amp;.
+
+ :param make_quoted_attribute: If True, then the string will be
+ quoted, as befits an attribute value.
+ """
+ # Escape angle brackets, and ampersands that aren't part of
+ # entities.
+ value = cls.BARE_AMPERSAND_OR_BRACKET.sub(
+ cls._substitute_xml_entity, value)
+
+ if make_quoted_attribute:
+ value = cls.quoted_attribute_value(value)
+ return value
+
+ @classmethod
+ def substitute_html(cls, s):
+ """Replace certain Unicode characters with named HTML entities.
+
+ This differs from data.encode(encoding, 'xmlcharrefreplace')
+ in that the goal is to make the result more readable (to those
+ with ASCII displays) rather than to recover from
+ errors. There's absolutely nothing wrong with a UTF-8 string
+ containing a LATIN SMALL LETTER E WITH ACUTE, but replacing that
+ character with "&eacute;" will make it more readable to some
+ people.
+ """
+ return cls.CHARACTER_TO_HTML_ENTITY_RE.sub(
+ cls._substitute_html_entity, s)
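+
+ # Illustrative usage (added for clarity; not part of the upstream module):
+ # characters that have named HTML entities are swapped for those entities,
+ # e.g.
+ #
+ #   EntitySubstitution.substitute_html(u'caf\xe9 \u2192 bar')
+ #   # -> u'caf&eacute; &rarr; bar'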
+
+
+class EncodingDetector:
+ """Suggests a number of possible encodings for a bytestring.
+
+ Order of precedence:
+
+ 1. Encodings you specifically tell EncodingDetector to try first
+ (the override_encodings argument to the constructor).
+
+ 2. An encoding declared within the bytestring itself, either in an
+ XML declaration (if the bytestring is to be interpreted as an XML
+ document), or in a <meta> tag (if the bytestring is to be
+ interpreted as an HTML document.)
+
+ 3. An encoding detected through textual analysis by chardet,
+ cchardet, or a similar external library.
+
+ 4. UTF-8.
+
+ 5. Windows-1252.
+ """
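+ # Illustrative sketch (added for clarity; not part of the upstream module):
+ # for a plain UTF-8 document with no byte-order mark, no declaration, and
+ # no chardet/cchardet installed, the generator below yields only the two
+ # last-ditch fallbacks:
+ #
+ #   detector = EncodingDetector(b'<html>caf\xc3\xa9</html>', is_html=True)
+ #   list(detector.encodings)   # -> ['utf-8', 'windows-1252']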
+ def __init__(self, markup, override_encodings=None, is_html=False):
+ self.override_encodings = override_encodings or []
+ self.chardet_encoding = None
+ self.is_html = is_html
+ self.declared_encoding = None
+
+ # First order of business: strip a byte-order mark.
+ self.markup, self.sniffed_encoding = self.strip_byte_order_mark(markup)
+
+ def _usable(self, encoding, tried):
+ if encoding is not None:
+ encoding = encoding.lower()
+ if encoding not in tried:
+ tried.add(encoding)
+ return True
+ return False
+
+ @property
+ def encodings(self):
+ """Yield a number of encodings that might work for this markup."""
+ tried = set()
+ for e in self.override_encodings:
+ if self._usable(e, tried):
+ yield e
+
+ # Did the document originally start with a byte-order mark
+ # that indicated its encoding?
+ if self._usable(self.sniffed_encoding, tried):
+ yield self.sniffed_encoding
+
+ # Look within the document for an XML or HTML encoding
+ # declaration.
+ if self.declared_encoding is None:
+ self.declared_encoding = self.find_declared_encoding(
+ self.markup, self.is_html)
+ if self._usable(self.declared_encoding, tried):
+ yield self.declared_encoding
+
+ # Use third-party character set detection to guess at the
+ # encoding.
+ if self.chardet_encoding is None:
+ self.chardet_encoding = chardet_dammit(self.markup)
+ if self._usable(self.chardet_encoding, tried):
+ yield self.chardet_encoding
+
+ # As a last-ditch effort, try utf-8 and windows-1252.
+ for e in ('utf-8', 'windows-1252'):
+ if self._usable(e, tried):
+ yield e
+
+ @classmethod
+ def strip_byte_order_mark(cls, data):
+ """If a byte-order mark is present, strip it and return the encoding it implies."""
+ encoding = None
+ if (len(data) >= 4) and (data[:2] == b'\xfe\xff') \
+ and (data[2:4] != b'\x00\x00'):
+ encoding = 'utf-16be'
+ data = data[2:]
+ elif (len(data) >= 4) and (data[:2] == b'\xff\xfe') \
+ and (data[2:4] != b'\x00\x00'):
+ encoding = 'utf-16le'
+ data = data[2:]
+ elif data[:3] == b'\xef\xbb\xbf':
+ encoding = 'utf-8'
+ data = data[3:]
+ elif data[:4] == b'\x00\x00\xfe\xff':
+ encoding = 'utf-32be'
+ data = data[4:]
+ elif data[:4] == b'\xff\xfe\x00\x00':
+ encoding = 'utf-32le'
+ data = data[4:]
+ return data, encoding
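+
+ # Illustrative usage (added for clarity; not part of the upstream module):
+ #
+ #   EncodingDetector.strip_byte_order_mark(b'\xef\xbb\xbfhello')
+ #   # -> (b'hello', 'utf-8')
+ #   EncodingDetector.strip_byte_order_mark(b'hello')
+ #   # -> (b'hello', None)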
+
+ @classmethod
+ def find_declared_encoding(cls, markup, is_html=False, search_entire_document=False):
+ """Given a document, tries to find its declared encoding.
+
+ An XML encoding is declared at the beginning of the document.
+
+ An HTML encoding is declared in a <meta> tag, hopefully near the
+ beginning of the document.
+ """
+ if search_entire_document:
+ xml_endpos = html_endpos = len(markup)
+ else:
+ xml_endpos = 1024
+ html_endpos = max(2048, int(len(markup) * 0.05))
+
+ declared_encoding = None
+ declared_encoding_match = xml_encoding_re.search(markup, endpos=xml_endpos)
+ if not declared_encoding_match and is_html:
+ declared_encoding_match = html_meta_re.search(markup, endpos=html_endpos)
+ if declared_encoding_match is not None:
+ declared_encoding = declared_encoding_match.groups()[0].decode(
+ 'ascii')
+ if declared_encoding:
+ return declared_encoding.lower()
+ return None
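+
+ # Illustrative usage (added for clarity; not part of the upstream module):
+ #
+ #   EncodingDetector.find_declared_encoding(
+ #       b'<?xml version="1.0" encoding="ISO-8859-1"?><doc/>')
+ #   # -> 'iso-8859-1'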
+
+class UnicodeDammit:
+ """A class for detecting the encoding of a *ML document and
+ converting it to a Unicode string. If the source encoding is
+ windows-1252, can replace MS smart quotes with their HTML or XML
+ equivalents."""
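+
+ # Illustrative usage (added for clarity; not part of the upstream module):
+ # hand in raw bytes, get back Unicode plus the encoding that worked, e.g.
+ #
+ #   dammit = UnicodeDammit(b'Sacr\xe9 bleu!', ["latin-1"])
+ #   dammit.unicode_markup      # u'Sacr\xe9 bleu!'
+ #   dammit.original_encoding   # 'latin-1'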
+
+ # This dictionary maps commonly seen values for "charset" in HTML
+ # meta tags to the corresponding Python codec names. It only covers
+ # values that aren't in Python's aliases and can't be determined
+ # by the heuristics in find_codec.
+ CHARSET_ALIASES = {"macintosh": "mac-roman",
+ "x-sjis": "shift-jis"}
+
+ ENCODINGS_WITH_SMART_QUOTES = [
+ "windows-1252",
+ "iso-8859-1",
+ "iso-8859-2",
+ ]
+
+ def __init__(self, markup, override_encodings=[],
+ smart_quotes_to=None, is_html=False):
+ self.smart_quotes_to = smart_quotes_to
+ self.tried_encodings = []
+ self.contains_replacement_characters = False
+ self.is_html = is_html
+
+ self.detector = EncodingDetector(markup, override_encodings, is_html)
+
+ # Short-circuit if the data is in Unicode to begin with.
+ if isinstance(markup, unicode) or markup == '':
+ self.markup = markup
+ self.unicode_markup = unicode(markup)
+ self.original_encoding = None
+ return
+
+ # The encoding detector may have stripped a byte-order mark.
+ # Use the stripped markup from this point on.
+ self.markup = self.detector.markup
+
+ u = None
+ for encoding in self.detector.encodings:
+ markup = self.detector.markup
+ u = self._convert_from(encoding)
+ if u is not None:
+ break
+
+ if not u:
+ # None of the encodings worked. As an absolute last resort,
+ # try them again with character replacement.
+
+ for encoding in self.detector.encodings:
+ if encoding != "ascii":
+ u = self._convert_from(encoding, "replace")
+ if u is not None:
+ logging.warning(
+ "Some characters could not be decoded, and were "
+ "replaced with REPLACEMENT CHARACTER.")
+ self.contains_replacement_characters = True
+ break
+
+ # If none of that worked, we could at this point force it to
+ # ASCII, but that would destroy so much data that I think
+ # giving up is better.
+ self.unicode_markup = u
+ if not u:
+ self.original_encoding = None
+
+ def _sub_ms_char(self, match):
+ """Changes a MS smart quote character to an XML or HTML
+ entity, or an ASCII character."""
+ orig = match.group(1)
+ if self.smart_quotes_to == 'ascii':
+ sub = self.MS_CHARS_TO_ASCII.get(orig).encode()
+ else:
+ sub = self.MS_CHARS.get(orig)
+ if type(sub) == tuple:
+ if self.smart_quotes_to == 'xml':
+ sub = '&#x'.encode() + sub[1].encode() + ';'.encode()
+ else:
+ sub = '&'.encode() + sub[0].encode() + ';'.encode()
+ else:
+ sub = sub.encode()
+ return sub
+
+ def _convert_from(self, proposed, errors="strict"):
+ proposed = self.find_codec(proposed)
+ if not proposed or (proposed, errors) in self.tried_encodings:
+ return None
+ self.tried_encodings.append((proposed, errors))
+ markup = self.markup
+ # Convert smart quotes to HTML if coming from an encoding
+ # that might have them.
+ if (self.smart_quotes_to is not None
+ and proposed in self.ENCODINGS_WITH_SMART_QUOTES):
+ smart_quotes_re = b"([\x80-\x9f])"
+ smart_quotes_compiled = re.compile(smart_quotes_re)
+ markup = smart_quotes_compiled.sub(self._sub_ms_char, markup)
+
+ try:
+ #print "Trying to convert document to %s (errors=%s)" % (
+ # proposed, errors)
+ u = self._to_unicode(markup, proposed, errors)
+ self.markup = u
+ self.original_encoding = proposed
+ except Exception as e:
+ #print "That didn't work!"
+ #print e
+ return None
+ #print "Correct encoding: %s" % proposed
+ return self.markup
+
+ def _to_unicode(self, data, encoding, errors="strict"):
+ '''Given a string and its encoding, decodes the string into Unicode.
+ %encoding is a string recognized by encodings.aliases'''
+ return unicode(data, encoding, errors)
+
+ @property
+ def declared_html_encoding(self):
+ if not self.is_html:
+ return None
+ return self.detector.declared_encoding
+
+ def find_codec(self, charset):
+ value = (self._codec(self.CHARSET_ALIASES.get(charset, charset))
+ or (charset and self._codec(charset.replace("-", "")))
+ or (charset and self._codec(charset.replace("-", "_")))
+ or (charset and charset.lower())
+ or charset
+ )
+ if value:
+ return value.lower()
+ return None
+
+ def _codec(self, charset):
+ if not charset:
+ return charset
+ codec = None
+ try:
+ codecs.lookup(charset)
+ codec = charset
+ except (LookupError, ValueError):
+ pass
+ return codec
+
+
+ # A partial mapping of ISO-Latin-1 to HTML entities/XML numeric entities.
+ MS_CHARS = {b'\x80': ('euro', '20AC'),
+ b'\x81': ' ',
+ b'\x82': ('sbquo', '201A'),
+ b'\x83': ('fnof', '192'),
+ b'\x84': ('bdquo', '201E'),
+ b'\x85': ('hellip', '2026'),
+ b'\x86': ('dagger', '2020'),
+ b'\x87': ('Dagger', '2021'),
+ b'\x88': ('circ', '2C6'),
+ b'\x89': ('permil', '2030'),
+ b'\x8A': ('Scaron', '160'),
+ b'\x8B': ('lsaquo', '2039'),
+ b'\x8C': ('OElig', '152'),
+ b'\x8D': '?',
+ b'\x8E': ('#x17D', '17D'),
+ b'\x8F': '?',
+ b'\x90': '?',
+ b'\x91': ('lsquo', '2018'),
+ b'\x92': ('rsquo', '2019'),
+ b'\x93': ('ldquo', '201C'),
+ b'\x94': ('rdquo', '201D'),
+ b'\x95': ('bull', '2022'),
+ b'\x96': ('ndash', '2013'),
+ b'\x97': ('mdash', '2014'),
+ b'\x98': ('tilde', '2DC'),
+ b'\x99': ('trade', '2122'),
+ b'\x9a': ('scaron', '161'),
+ b'\x9b': ('rsaquo', '203A'),
+ b'\x9c': ('oelig', '153'),
+ b'\x9d': '?',
+ b'\x9e': ('#x17E', '17E'),
+ b'\x9f': ('Yuml', ''),}
+
+ # A parochial partial mapping of ISO-Latin-1 to ASCII. Contains
+ # horrors like stripping diacritical marks to turn á into a, but also
+ # contains non-horrors like turning “ into ".
+ MS_CHARS_TO_ASCII = {
+ b'\x80' : 'EUR',
+ b'\x81' : ' ',
+ b'\x82' : ',',
+ b'\x83' : 'f',
+ b'\x84' : ',,',
+ b'\x85' : '...',
+ b'\x86' : '+',
+ b'\x87' : '++',
+ b'\x88' : '^',
+ b'\x89' : '%',
+ b'\x8a' : 'S',
+ b'\x8b' : '<',
+ b'\x8c' : 'OE',
+ b'\x8d' : '?',
+ b'\x8e' : 'Z',
+ b'\x8f' : '?',
+ b'\x90' : '?',
+ b'\x91' : "'",
+ b'\x92' : "'",
+ b'\x93' : '"',
+ b'\x94' : '"',
+ b'\x95' : '*',
+ b'\x96' : '-',
+ b'\x97' : '--',
+ b'\x98' : '~',
+ b'\x99' : '(TM)',
+ b'\x9a' : 's',
+ b'\x9b' : '>',
+ b'\x9c' : 'oe',
+ b'\x9d' : '?',
+ b'\x9e' : 'z',
+ b'\x9f' : 'Y',
+ b'\xa0' : ' ',
+ b'\xa1' : '!',
+ b'\xa2' : 'c',
+ b'\xa3' : 'GBP',
+ b'\xa4' : '$', #This approximation is especially parochial--this is the
+ #generic currency symbol.
+ b'\xa5' : 'YEN',
+ b'\xa6' : '|',
+ b'\xa7' : 'S',
+ b'\xa8' : '..',
+ b'\xa9' : '',
+ b'\xaa' : '(th)',
+ b'\xab' : '<<',
+ b'\xac' : '!',
+ b'\xad' : ' ',
+ b'\xae' : '(R)',
+ b'\xaf' : '-',
+ b'\xb0' : 'o',
+ b'\xb1' : '+-',
+ b'\xb2' : '2',
+ b'\xb3' : '3',
+ b'\xb4' : ("'", 'acute'),
+ b'\xb5' : 'u',
+ b'\xb6' : 'P',
+ b'\xb7' : '*',
+ b'\xb8' : ',',
+ b'\xb9' : '1',
+ b'\xba' : '(th)',
+ b'\xbb' : '>>',
+ b'\xbc' : '1/4',
+ b'\xbd' : '1/2',
+ b'\xbe' : '3/4',
+ b'\xbf' : '?',
+ b'\xc0' : 'A',
+ b'\xc1' : 'A',
+ b'\xc2' : 'A',
+ b'\xc3' : 'A',
+ b'\xc4' : 'A',
+ b'\xc5' : 'A',
+ b'\xc6' : 'AE',
+ b'\xc7' : 'C',
+ b'\xc8' : 'E',
+ b'\xc9' : 'E',
+ b'\xca' : 'E',
+ b'\xcb' : 'E',
+ b'\xcc' : 'I',
+ b'\xcd' : 'I',
+ b'\xce' : 'I',
+ b'\xcf' : 'I',
+ b'\xd0' : 'D',
+ b'\xd1' : 'N',
+ b'\xd2' : 'O',
+ b'\xd3' : 'O',
+ b'\xd4' : 'O',
+ b'\xd5' : 'O',
+ b'\xd6' : 'O',
+ b'\xd7' : '*',
+ b'\xd8' : 'O',
+ b'\xd9' : 'U',
+ b'\xda' : 'U',
+ b'\xdb' : 'U',
+ b'\xdc' : 'U',
+ b'\xdd' : 'Y',
+ b'\xde' : 'b',
+ b'\xdf' : 'B',
+ b'\xe0' : 'a',
+ b'\xe1' : 'a',
+ b'\xe2' : 'a',
+ b'\xe3' : 'a',
+ b'\xe4' : 'a',
+ b'\xe5' : 'a',
+ b'\xe6' : 'ae',
+ b'\xe7' : 'c',
+ b'\xe8' : 'e',
+ b'\xe9' : 'e',
+ b'\xea' : 'e',
+ b'\xeb' : 'e',
+ b'\xec' : 'i',
+ b'\xed' : 'i',
+ b'\xee' : 'i',
+ b'\xef' : 'i',
+ b'\xf0' : 'o',
+ b'\xf1' : 'n',
+ b'\xf2' : 'o',
+ b'\xf3' : 'o',
+ b'\xf4' : 'o',
+ b'\xf5' : 'o',
+ b'\xf6' : 'o',
+ b'\xf7' : '/',
+ b'\xf8' : 'o',
+ b'\xf9' : 'u',
+ b'\xfa' : 'u',
+ b'\xfb' : 'u',
+ b'\xfc' : 'u',
+ b'\xfd' : 'y',
+ b'\xfe' : 'b',
+ b'\xff' : 'y',
+ }
+
+ # A map used when removing rogue Windows-1252/ISO-8859-1
+ # characters in otherwise UTF-8 documents.
+ #
+ # Note that \x81, \x8d, \x8f, \x90, and \x9d are undefined in
+ # Windows-1252.
+ WINDOWS_1252_TO_UTF8 = {
+ 0x80 : b'\xe2\x82\xac', # €
+ 0x82 : b'\xe2\x80\x9a', # ‚
+ 0x83 : b'\xc6\x92', # ƒ
+ 0x84 : b'\xe2\x80\x9e', # „
+ 0x85 : b'\xe2\x80\xa6', # …
+ 0x86 : b'\xe2\x80\xa0', # †
+ 0x87 : b'\xe2\x80\xa1', # ‡
+ 0x88 : b'\xcb\x86', # ˆ
+ 0x89 : b'\xe2\x80\xb0', # ‰
+ 0x8a : b'\xc5\xa0', # Š
+ 0x8b : b'\xe2\x80\xb9', # ‹
+ 0x8c : b'\xc5\x92', # Œ
+ 0x8e : b'\xc5\xbd', # Ž
+ 0x91 : b'\xe2\x80\x98', # ‘
+ 0x92 : b'\xe2\x80\x99', # ’
+ 0x93 : b'\xe2\x80\x9c', # “
+ 0x94 : b'\xe2\x80\x9d', # ”
+ 0x95 : b'\xe2\x80\xa2', # •
+ 0x96 : b'\xe2\x80\x93', # –
+ 0x97 : b'\xe2\x80\x94', # —
+ 0x98 : b'\xcb\x9c', # ˜
+ 0x99 : b'\xe2\x84\xa2', # ™
+ 0x9a : b'\xc5\xa1', # š
+ 0x9b : b'\xe2\x80\xba', # ›
+ 0x9c : b'\xc5\x93', # œ
+ 0x9e : b'\xc5\xbe', # ž
+ 0x9f : b'\xc5\xb8', # Ÿ
+ 0xa0 : b'\xc2\xa0', #
+ 0xa1 : b'\xc2\xa1', # ¡
+ 0xa2 : b'\xc2\xa2', # ¢
+ 0xa3 : b'\xc2\xa3', # £
+ 0xa4 : b'\xc2\xa4', # ¤
+ 0xa5 : b'\xc2\xa5', # ¥
+ 0xa6 : b'\xc2\xa6', # ¦
+ 0xa7 : b'\xc2\xa7', # §
+ 0xa8 : b'\xc2\xa8', # ¨
+ 0xa9 : b'\xc2\xa9', # ©
+ 0xaa : b'\xc2\xaa', # ª
+ 0xab : b'\xc2\xab', # «
+ 0xac : b'\xc2\xac', # ¬
+ 0xad : b'\xc2\xad', #
+ 0xae : b'\xc2\xae', # ®
+ 0xaf : b'\xc2\xaf', # ¯
+ 0xb0 : b'\xc2\xb0', # °
+ 0xb1 : b'\xc2\xb1', # ±
+ 0xb2 : b'\xc2\xb2', # ²
+ 0xb3 : b'\xc2\xb3', # ³
+ 0xb4 : b'\xc2\xb4', # ´
+ 0xb5 : b'\xc2\xb5', # µ
+ 0xb6 : b'\xc2\xb6', # ¶
+ 0xb7 : b'\xc2\xb7', # ·
+ 0xb8 : b'\xc2\xb8', # ¸
+ 0xb9 : b'\xc2\xb9', # ¹
+ 0xba : b'\xc2\xba', # º
+ 0xbb : b'\xc2\xbb', # »
+ 0xbc : b'\xc2\xbc', # ¼
+ 0xbd : b'\xc2\xbd', # ½
+ 0xbe : b'\xc2\xbe', # ¾
+ 0xbf : b'\xc2\xbf', # ¿
+ 0xc0 : b'\xc3\x80', # À
+ 0xc1 : b'\xc3\x81', # Á
+ 0xc2 : b'\xc3\x82', # Â
+ 0xc3 : b'\xc3\x83', # Ã
+ 0xc4 : b'\xc3\x84', # Ä
+ 0xc5 : b'\xc3\x85', # Å
+ 0xc6 : b'\xc3\x86', # Æ
+ 0xc7 : b'\xc3\x87', # Ç
+ 0xc8 : b'\xc3\x88', # È
+ 0xc9 : b'\xc3\x89', # É
+ 0xca : b'\xc3\x8a', # Ê
+ 0xcb : b'\xc3\x8b', # Ë
+ 0xcc : b'\xc3\x8c', # Ì
+ 0xcd : b'\xc3\x8d', # Í
+ 0xce : b'\xc3\x8e', # Î
+ 0xcf : b'\xc3\x8f', # Ï
+ 0xd0 : b'\xc3\x90', # Ð
+ 0xd1 : b'\xc3\x91', # Ñ
+ 0xd2 : b'\xc3\x92', # Ò
+ 0xd3 : b'\xc3\x93', # Ó
+ 0xd4 : b'\xc3\x94', # Ô
+ 0xd5 : b'\xc3\x95', # Õ
+ 0xd6 : b'\xc3\x96', # Ö
+ 0xd7 : b'\xc3\x97', # ×
+ 0xd8 : b'\xc3\x98', # Ø
+ 0xd9 : b'\xc3\x99', # Ù
+ 0xda : b'\xc3\x9a', # Ú
+ 0xdb : b'\xc3\x9b', # Û
+ 0xdc : b'\xc3\x9c', # Ü
+ 0xdd : b'\xc3\x9d', # Ý
+ 0xde : b'\xc3\x9e', # Þ
+ 0xdf : b'\xc3\x9f', # ß
+ 0xe0 : b'\xc3\xa0', # à
+ 0xe1 : b'\xc3\xa1', # á
+ 0xe2 : b'\xc3\xa2', # â
+ 0xe3 : b'\xc3\xa3', # ã
+ 0xe4 : b'\xc3\xa4', # ä
+ 0xe5 : b'\xc3\xa5', # å
+ 0xe6 : b'\xc3\xa6', # æ
+ 0xe7 : b'\xc3\xa7', # ç
+ 0xe8 : b'\xc3\xa8', # è
+ 0xe9 : b'\xc3\xa9', # é
+ 0xea : b'\xc3\xaa', # ê
+ 0xeb : b'\xc3\xab', # ë
+ 0xec : b'\xc3\xac', # ì
+ 0xed : b'\xc3\xad', # í
+ 0xee : b'\xc3\xae', # î
+ 0xef : b'\xc3\xaf', # ï
+ 0xf0 : b'\xc3\xb0', # ð
+ 0xf1 : b'\xc3\xb1', # ñ
+ 0xf2 : b'\xc3\xb2', # ò
+ 0xf3 : b'\xc3\xb3', # ó
+ 0xf4 : b'\xc3\xb4', # ô
+ 0xf5 : b'\xc3\xb5', # õ
+ 0xf6 : b'\xc3\xb6', # ö
+ 0xf7 : b'\xc3\xb7', # ÷
+ 0xf8 : b'\xc3\xb8', # ø
+ 0xf9 : b'\xc3\xb9', # ù
+ 0xfa : b'\xc3\xba', # ú
+ 0xfb : b'\xc3\xbb', # û
+ 0xfc : b'\xc3\xbc', # ü
+ 0xfd : b'\xc3\xbd', # ý
+ 0xfe : b'\xc3\xbe', # þ
+ }
+
+ MULTIBYTE_MARKERS_AND_SIZES = [
+ (0xc2, 0xdf, 2), # 2-byte characters start with a byte C2-DF
+ (0xe0, 0xef, 3), # 3-byte characters start with E0-EF
+ (0xf0, 0xf4, 4), # 4-byte characters start with F0-F4
+ ]
+
+ FIRST_MULTIBYTE_MARKER = MULTIBYTE_MARKERS_AND_SIZES[0][0]
+ LAST_MULTIBYTE_MARKER = MULTIBYTE_MARKERS_AND_SIZES[-1][1]
+
+ @classmethod
+ def detwingle(cls, in_bytes, main_encoding="utf8",
+ embedded_encoding="windows-1252"):
+ """Fix characters from one encoding embedded in some other encoding.
+
+ Currently the only situation supported is Windows-1252 (or its
+ subset ISO-8859-1), embedded in UTF-8.
+
+ The input must be a bytestring. If you've already converted
+ the document to Unicode, you're too late.
+
+ The output is a bytestring in which `embedded_encoding`
+ characters have been converted to their `main_encoding`
+ equivalents.
+ """
+ if embedded_encoding.replace('_', '-').lower() not in (
+ 'windows-1252', 'windows_1252'):
+ raise NotImplementedError(
+ "Windows-1252 and ISO-8859-1 are the only currently supported "
+ "embedded encodings.")
+
+ if main_encoding.lower() not in ('utf8', 'utf-8'):
+ raise NotImplementedError(
+ "UTF-8 is the only currently supported main encoding.")
+
+ byte_chunks = []
+
+ chunk_start = 0
+ pos = 0
+ while pos < len(in_bytes):
+ byte = in_bytes[pos]
+ if not isinstance(byte, int):
+ # Python 2.x
+ byte = ord(byte)
+ if (byte >= cls.FIRST_MULTIBYTE_MARKER
+ and byte <= cls.LAST_MULTIBYTE_MARKER):
+ # This is the start of a UTF-8 multibyte character. Skip
+ # to the end.
+ for start, end, size in cls.MULTIBYTE_MARKERS_AND_SIZES:
+ if byte >= start and byte <= end:
+ pos += size
+ break
+ elif byte >= 0x80 and byte in cls.WINDOWS_1252_TO_UTF8:
+ # We found a Windows-1252 character!
+ # Save the string up to this point as a chunk.
+ byte_chunks.append(in_bytes[chunk_start:pos])
+
+ # Now translate the Windows-1252 character into UTF-8
+ # and add it as another, one-byte chunk.
+ byte_chunks.append(cls.WINDOWS_1252_TO_UTF8[byte])
+ pos += 1
+ chunk_start = pos
+ else:
+ # Go on to the next character.
+ pos += 1
+ if chunk_start == 0:
+ # The string is unchanged.
+ return in_bytes
+ else:
+ # Store the final chunk.
+ byte_chunks.append(in_bytes[chunk_start:])
+ return b''.join(byte_chunks)
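+
+ # Illustrative usage (added for clarity; not part of the upstream module):
+ # a document that is mostly UTF-8 but has Windows-1252 curly quotes pasted
+ # in gets the rogue bytes rewritten so the whole thing decodes as UTF-8:
+ #
+ #   doc = u'caf\xe9 '.encode('utf8') + u'\u201cok\u201d'.encode('windows-1252')
+ #   UnicodeDammit.detwingle(doc).decode('utf8')
+ #   # -> u'caf\xe9 \u201cok\u201d'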
+
diff --git a/bitbake/lib/bs4/diagnose.py b/bitbake/lib/bs4/diagnose.py
new file mode 100644
index 0000000..4d0b00a
--- /dev/null
+++ b/bitbake/lib/bs4/diagnose.py
@@ -0,0 +1,204 @@
+"""Diagnostic functions, mainly for use when doing tech support."""
+import cProfile
+from StringIO import StringIO
+from HTMLParser import HTMLParser
+import bs4
+from bs4 import BeautifulSoup, __version__
+from bs4.builder import builder_registry
+
+import os
+import pstats
+import random
+import tempfile
+import time
+import traceback
+import sys
+import cProfile
+
+def diagnose(data):
+ """Diagnostic suite for isolating common problems."""
+ print "Diagnostic running on Beautiful Soup %s" % __version__
+ print "Python version %s" % sys.version
+
+ basic_parsers = ["html.parser", "html5lib", "lxml"]
+ # Iterate over a copy, since missing parsers are removed from the list.
+ for name in list(basic_parsers):
+ for builder in builder_registry.builders:
+ if name in builder.features:
+ break
+ else:
+ basic_parsers.remove(name)
+ print (
+ "I noticed that %s is not installed. Installing it may help." %
+ name)
+
+ if 'lxml' in basic_parsers:
+ basic_parsers.append(["lxml", "xml"])
+ from lxml import etree
+ print "Found lxml version %s" % ".".join(map(str,etree.LXML_VERSION))
+
+ if 'html5lib' in basic_parsers:
+ import html5lib
+ print "Found html5lib version %s" % html5lib.__version__
+
+ if hasattr(data, 'read'):
+ data = data.read()
+ elif os.path.exists(data):
+ print '"%s" looks like a filename. Reading data from the file.' % data
+ data = open(data).read()
+ elif data.startswith("http:") or data.startswith("https:"):
+ print '"%s" looks like a URL. Beautiful Soup is not an HTTP client.' % data
+ print "You need to use some other library to get the document behind the URL, and feed that document to Beautiful Soup."
+ return
+ print
+
+ for parser in basic_parsers:
+ print "Trying to parse your markup with %s" % parser
+ success = False
+ try:
+ soup = BeautifulSoup(data, parser)
+ success = True
+ except Exception, e:
+ print "%s could not parse the markup." % parser
+ traceback.print_exc()
+ if success:
+ print "Here's what %s did with the markup:" % parser
+ print soup.prettify()
+
+ print "-" * 80
+
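+# Illustrative invocation of diagnose() above (added for clarity; not part
+# of the upstream module):
+#
+#   from bs4.diagnose import diagnose
+#   diagnose(open("problem.html").read())
+#
+# This prints which parsers are available and what each of them makes of
+# the markup.
+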
+def lxml_trace(data, html=True, **kwargs):
+ """Print out the lxml events that occur during parsing.
+
+ This lets you see how lxml parses a document when no Beautiful
+ Soup code is running.
+ """
+ from lxml import etree
+ for event, element in etree.iterparse(StringIO(data), html=html, **kwargs):
+ print("%s, %4s, %s" % (event, element.tag, element.text))
+
+class AnnouncingParser(HTMLParser):
+ """Announces HTMLParser parse events, without doing anything else."""
+
+ def _p(self, s):
+ print(s)
+
+ def handle_starttag(self, name, attrs):
+ self._p("%s START" % name)
+
+ def handle_endtag(self, name):
+ self._p("%s END" % name)
+
+ def handle_data(self, data):
+ self._p("%s DATA" % data)
+
+ def handle_charref(self, name):
+ self._p("%s CHARREF" % name)
+
+ def handle_entityref(self, name):
+ self._p("%s ENTITYREF" % name)
+
+ def handle_comment(self, data):
+ self._p("%s COMMENT" % data)
+
+ def handle_decl(self, data):
+ self._p("%s DECL" % data)
+
+ def unknown_decl(self, data):
+ self._p("%s UNKNOWN-DECL" % data)
+
+ def handle_pi(self, data):
+ self._p("%s PI" % data)
+
+def htmlparser_trace(data):
+ """Print out the HTMLParser events that occur during parsing.
+
+ This lets you see how HTMLParser parses a document when no
+ Beautiful Soup code is running.
+ """
+ parser = AnnouncingParser()
+ parser.feed(data)
+
+_vowels = "aeiou"
+_consonants = "bcdfghjklmnpqrstvwxyz"
+
+def rword(length=5):
+ "Generate a random word-like string."
+ s = ''
+ for i in range(length):
+ if i % 2 == 0:
+ t = _consonants
+ else:
+ t = _vowels
+ s += random.choice(t)
+ return s
+
+def rsentence(length=4):
+ "Generate a random sentence-like string."
+ return " ".join(rword(random.randint(4,9)) for i in range(length))
+
+def rdoc(num_elements=1000):
+ """Randomly generate an invalid HTML document."""
+ tag_names = ['p', 'div', 'span', 'i', 'b', 'script', 'table']
+ elements = []
+ for i in range(num_elements):
+ choice = random.randint(0,3)
+ if choice == 0:
+ # New tag.
+ tag_name = random.choice(tag_names)
+ elements.append("<%s>" % tag_name)
+ elif choice == 1:
+ elements.append(rsentence(random.randint(1,4)))
+ elif choice == 2:
+ # Close a tag.
+ tag_name = random.choice(tag_names)
+ elements.append("</%s>" % tag_name)
+ return "<html>" + "\n".join(elements) + "</html>"
+
+def benchmark_parsers(num_elements=100000):
+ """Very basic head-to-head performance benchmark."""
+ print "Comparative parser benchmark on Beautiful Soup %s" % __version__
+ data = rdoc(num_elements)
+ print "Generated a large invalid HTML document (%d bytes)." % len(data)
+
+ for parser in ["lxml", ["lxml", "html"], "html5lib", "html.parser"]:
+ success = False
+ try:
+ a = time.time()
+ soup = BeautifulSoup(data, parser)
+ b = time.time()
+ success = True
+ except Exception, e:
+ print "%s could not parse the markup." % parser
+ traceback.print_exc()
+ if success:
+ print "BS4+%s parsed the markup in %.2fs." % (parser, b-a)
+
+ from lxml import etree
+ a = time.time()
+ etree.HTML(data)
+ b = time.time()
+ print "Raw lxml parsed the markup in %.2fs." % (b-a)
+
+ import html5lib
+ parser = html5lib.HTMLParser()
+ a = time.time()
+ parser.parse(data)
+ b = time.time()
+ print "Raw html5lib parsed the markup in %.2fs." % (b-a)
+
+def profile(num_elements=100000, parser="lxml"):
+
+ filehandle = tempfile.NamedTemporaryFile()
+ filename = filehandle.name
+
+ data = rdoc(num_elements)
+ vars = dict(bs4=bs4, data=data, parser=parser)
+ cProfile.runctx('bs4.BeautifulSoup(data, parser)' , vars, vars, filename)
+
+ stats = pstats.Stats(filename)
+ # stats.strip_dirs()
+ stats.sort_stats("cumulative")
+ stats.print_stats('_html5lib|bs4', 50)
+
+if __name__ == '__main__':
+ diagnose(sys.stdin.read())
diff --git a/bitbake/lib/bs4/element.py b/bitbake/lib/bs4/element.py
new file mode 100644
index 0000000..da9afdf
--- /dev/null
+++ b/bitbake/lib/bs4/element.py
@@ -0,0 +1,1611 @@
+import collections
+import re
+import sys
+import warnings
+from bs4.dammit import EntitySubstitution
+
+DEFAULT_OUTPUT_ENCODING = "utf-8"
+PY3K = (sys.version_info[0] > 2)
+
+whitespace_re = re.compile("\s+")
+
+def _alias(attr):
+ """Alias one attribute name to another for backward compatibility"""
+ @property
+ def alias(self):
+ return getattr(self, attr)
+
+ @alias.setter
+ def alias(self, value):
+ return setattr(self, attr, value)
+ return alias
+
+
+class NamespacedAttribute(unicode):
+
+ def __new__(cls, prefix, name, namespace=None):
+ if name is None:
+ obj = unicode.__new__(cls, prefix)
+ elif prefix is None:
+ # Not really namespaced.
+ obj = unicode.__new__(cls, name)
+ else:
+ obj = unicode.__new__(cls, prefix + ":" + name)
+ obj.prefix = prefix
+ obj.name = name
+ obj.namespace = namespace
+ return obj
+
+class AttributeValueWithCharsetSubstitution(unicode):
+ """A stand-in object for a character encoding specified in HTML."""
+
+class CharsetMetaAttributeValue(AttributeValueWithCharsetSubstitution):
+ """A generic stand-in for the value of a meta tag's 'charset' attribute.
+
+ When Beautiful Soup parses the markup '<meta charset="utf8">', the
+ value of the 'charset' attribute will be one of these objects.
+ """
+
+ def __new__(cls, original_value):
+ obj = unicode.__new__(cls, original_value)
+ obj.original_value = original_value
+ return obj
+
+ def encode(self, encoding):
+ return encoding
+
+
+class ContentMetaAttributeValue(AttributeValueWithCharsetSubstitution):
+ """A generic stand-in for the value of a meta tag's 'content' attribute.
+
+ When Beautiful Soup parses the markup:
+ <meta http-equiv="content-type" content="text/html; charset=utf8">
+
+ The value of the 'content' attribute will be one of these objects.
+ """
+
+ CHARSET_RE = re.compile("((^|;)\s*charset=)([^;]*)", re.M)
+
+ def __new__(cls, original_value):
+ match = cls.CHARSET_RE.search(original_value)
+ if match is None:
+ # No substitution necessary.
+ return unicode.__new__(unicode, original_value)
+
+ obj = unicode.__new__(cls, original_value)
+ obj.original_value = original_value
+ return obj
+
+ def encode(self, encoding):
+ def rewrite(match):
+ return match.group(1) + encoding
+ return self.CHARSET_RE.sub(rewrite, self.original_value)
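+
+ # Illustrative usage (added for clarity; not part of the upstream module):
+ # when the tree is re-encoded, the charset inside a content="..." value is
+ # rewritten to match, e.g.
+ #
+ #   val = ContentMetaAttributeValue(u'text/html; charset=euc-jp')
+ #   val.encode('utf8')   # -> u'text/html; charset=utf8'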
+
+class HTMLAwareEntitySubstitution(EntitySubstitution):
+
+ """Entity substitution rules that are aware of some HTML quirks.
+
+ Specifically, the contents of <script> and <style> tags should not
+ undergo entity substitution.
+
+ Incoming NavigableString objects are checked to see if they're the
+ direct children of a <script> or <style> tag.
+ """
+
+ cdata_containing_tags = set(["script", "style"])
+
+ preformatted_tags = set(["pre"])
+
+ @classmethod
+ def _substitute_if_appropriate(cls, ns, f):
+ if (isinstance(ns, NavigableString)
+ and ns.parent is not None
+ and ns.parent.name in cls.cdata_containing_tags):
+ # Do nothing.
+ return ns
+ # Substitute.
+ return f(ns)
+
+ @classmethod
+ def substitute_html(cls, ns):
+ return cls._substitute_if_appropriate(
+ ns, EntitySubstitution.substitute_html)
+
+ @classmethod
+ def substitute_xml(cls, ns):
+ return cls._substitute_if_appropriate(
+ ns, EntitySubstitution.substitute_xml)
+
+class PageElement(object):
+ """Contains the navigational information for some part of the page
+ (either a tag or a piece of text)"""
+
+ # There are five possible values for the "formatter" argument passed in
+ # to methods like encode() and prettify():
+ #
+ # "html" - All Unicode characters with corresponding HTML entities
+ # are converted to those entities on output.
+ # "minimal" - Bare ampersands and angle brackets are converted to
+ # XML entities: &amp; &lt; &gt;
+ # None - The null formatter. Unicode characters are never
+ # converted to entities. This is not recommended, but it's
+ # faster than "minimal".
+ # A function - This function will be called on every string that
+ # needs to undergo entity substitution.
+ #
+
+ # In an HTML document, the default "html" and "minimal" functions
+ # will leave the contents of <script> and <style> tags alone. For
+ # an XML document, all tags will be given the same treatment.
+
+ HTML_FORMATTERS = {
+ "html" : HTMLAwareEntitySubstitution.substitute_html,
+ "minimal" : HTMLAwareEntitySubstitution.substitute_xml,
+ None : None
+ }
+
+ XML_FORMATTERS = {
+ "html" : EntitySubstitution.substitute_html,
+ "minimal" : EntitySubstitution.substitute_xml,
+ None : None
+ }
+
+ def format_string(self, s, formatter='minimal'):
+ """Format the given string using the given formatter."""
+ if not callable(formatter):
+ formatter = self._formatter_for_name(formatter)
+ if formatter is None:
+ output = s
+ else:
+ output = formatter(s)
+ return output
+
+ @property
+ def _is_xml(self):
+ """Is this element part of an XML tree or an HTML tree?
+
+ This is used when mapping a formatter name ("minimal") to an
+ appropriate function (one that performs entity-substitution on
+ the contents of <script> and <style> tags, or not). It's
+ inefficient, but it should be called very rarely.
+ """
+ if self.parent is None:
+ # This is the top-level object. It should have .is_xml set
+ # from tree creation. If not, take a guess--BS is usually
+ # used on HTML markup.
+ return getattr(self, 'is_xml', False)
+ return self.parent._is_xml
+
+ def _formatter_for_name(self, name):
+ "Look up a formatter function based on its name and the tree."
+ if self._is_xml:
+ return self.XML_FORMATTERS.get(
+ name, EntitySubstitution.substitute_xml)
+ else:
+ return self.HTML_FORMATTERS.get(
+ name, HTMLAwareEntitySubstitution.substitute_xml)
+
+ def setup(self, parent=None, previous_element=None):
+ """Sets up the initial relations between this element and
+ other elements."""
+ self.parent = parent
+ self.previous_element = previous_element
+ if previous_element is not None:
+ self.previous_element.next_element = self
+ self.next_element = None
+ self.previous_sibling = None
+ self.next_sibling = None
+ if self.parent is not None and self.parent.contents:
+ self.previous_sibling = self.parent.contents[-1]
+ self.previous_sibling.next_sibling = self
+
+ nextSibling = _alias("next_sibling") # BS3
+ previousSibling = _alias("previous_sibling") # BS3
+
+ def replace_with(self, replace_with):
+ if replace_with is self:
+ return
+ if replace_with is self.parent:
+ raise ValueError("Cannot replace a Tag with its parent.")
+ old_parent = self.parent
+ my_index = self.parent.index(self)
+ self.extract()
+ old_parent.insert(my_index, replace_with)
+ return self
+ replaceWith = replace_with # BS3
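+
+ # Illustrative usage (added for clarity; not part of the upstream module;
+ # assumes a soup built from hypothetical markup):
+ #
+ #   soup = BeautifulSoup('<p>Hello <b>world</b></p>')
+ #   new_tag = soup.new_tag('i')
+ #   new_tag.string = 'there'
+ #   soup.b.replace_with(new_tag)   # <p>Hello <i>there</i></p>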
+
+ def unwrap(self):
+ my_parent = self.parent
+ my_index = self.parent.index(self)
+ self.extract()
+ for child in reversed(self.contents[:]):
+ my_parent.insert(my_index, child)
+ return self
+ replace_with_children = unwrap
+ replaceWithChildren = unwrap # BS3
+
+ def wrap(self, wrap_inside):
+ me = self.replace_with(wrap_inside)
+ wrap_inside.append(me)
+ return wrap_inside
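+
+ # Illustrative usage of wrap()/unwrap() (added for clarity; not part of
+ # the upstream module):
+ #
+ #   soup = BeautifulSoup('<p>I wish I was bold.</p>')
+ #   soup.p.string.wrap(soup.new_tag('b'))  # <p><b>I wish I was bold.</b></p>
+ #   soup.p.b.unwrap()                      # back to <p>I wish I was bold.</p>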
+
+ def extract(self):
+ """Destructively rips this element out of the tree."""
+ if self.parent is not None:
+ del self.parent.contents[self.parent.index(self)]
+
+ #Find the two elements that would be next to each other if
+ #this element (and any children) hadn't been parsed. Connect
+ #the two.
+ last_child = self._last_descendant()
+ next_element = last_child.next_element
+
+ if self.previous_element is not None:
+ self.previous_element.next_element = next_element
+ if next_element is not None:
+ next_element.previous_element = self.previous_element
+ self.previous_element = None
+ last_child.next_element = None
+
+ self.parent = None
+ if self.previous_sibling is not None:
+ self.previous_sibling.next_sibling = self.next_sibling
+ if self.next_sibling is not None:
+ self.next_sibling.previous_sibling = self.previous_sibling
+ self.previous_sibling = self.next_sibling = None
+ return self
+
+ def _last_descendant(self, is_initialized=True, accept_self=True):
+ "Finds the last element beneath this object to be parsed."
+ if is_initialized and self.next_sibling:
+ last_child = self.next_sibling.previous_element
+ else:
+ last_child = self
+ while isinstance(last_child, Tag) and last_child.contents:
+ last_child = last_child.contents[-1]
+ if not accept_self and last_child == self:
+ last_child = None
+ return last_child
+ # BS3: Not part of the API!
+ _lastRecursiveChild = _last_descendant
+
+ def insert(self, position, new_child):
+ if new_child is self:
+ raise ValueError("Cannot insert a tag into itself.")
+ if (isinstance(new_child, basestring)
+ and not isinstance(new_child, NavigableString)):
+ new_child = NavigableString(new_child)
+
+ position = min(position, len(self.contents))
+ if hasattr(new_child, 'parent') and new_child.parent is not None:
+ # We're 'inserting' an element that's already one
+ # of this object's children.
+ if new_child.parent is self:
+ current_index = self.index(new_child)
+ if current_index < position:
+ # We're moving this element further down the list
+ # of this object's children. That means that when
+ # we extract this element, our target index will
+ # jump down one.
+ position -= 1
+ new_child.extract()
+
+ new_child.parent = self
+ previous_child = None
+ if position == 0:
+ new_child.previous_sibling = None
+ new_child.previous_element = self
+ else:
+ previous_child = self.contents[position - 1]
+ new_child.previous_sibling = previous_child
+ new_child.previous_sibling.next_sibling = new_child
+ new_child.previous_element = previous_child._last_descendant(False)
+ if new_child.previous_element is not None:
+ new_child.previous_element.next_element = new_child
+
+ new_childs_last_element = new_child._last_descendant(False)
+
+ if position >= len(self.contents):
+ new_child.next_sibling = None
+
+ parent = self
+ parents_next_sibling = None
+ while parents_next_sibling is None and parent is not None:
+ parents_next_sibling = parent.next_sibling
+ parent = parent.parent
+ if parents_next_sibling is not None:
+ # We found the element that comes next in the document.
+ break
+ if parents_next_sibling is not None:
+ new_childs_last_element.next_element = parents_next_sibling
+ else:
+ # The last element of this tag is the last element in
+ # the document.
+ new_childs_last_element.next_element = None
+ else:
+ next_child = self.contents[position]
+ new_child.next_sibling = next_child
+ if new_child.next_sibling is not None:
+ new_child.next_sibling.previous_sibling = new_child
+ new_childs_last_element.next_element = next_child
+
+ if new_childs_last_element.next_element is not None:
+ new_childs_last_element.next_element.previous_element = new_childs_last_element
+ self.contents.insert(position, new_child)
+
+ def append(self, tag):
+ """Appends the given tag to the contents of this tag."""
+ self.insert(len(self.contents), tag)
+
+ def insert_before(self, predecessor):
+ """Makes the given element the immediate predecessor of this one.
+
+ The two elements will have the same parent, and the given element
+ will be immediately before this one.
+ """
+ if self is predecessor:
+ raise ValueError("Can't insert an element before itself.")
+ parent = self.parent
+ if parent is None:
+ raise ValueError(
+ "Element has no parent, so 'before' has no meaning.")
+ # Extract first so that the index won't be screwed up if they
+ # are siblings.
+ if isinstance(predecessor, PageElement):
+ predecessor.extract()
+ index = parent.index(self)
+ parent.insert(index, predecessor)
+
+ def insert_after(self, successor):
+ """Makes the given element the immediate successor of this one.
+
+ The two elements will have the same parent, and the given element
+ will be immediately after this one.
+ """
+ if self is successor:
+ raise ValueError("Can't insert an element after itself.")
+ parent = self.parent
+ if parent is None:
+ raise ValueError(
+ "Element has no parent, so 'after' has no meaning.")
+ # Extract first so that the index won't be screwed up if they
+ # are siblings.
+ if isinstance(successor, PageElement):
+ successor.extract()
+ index = parent.index(self)
+ parent.insert(index+1, successor)
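+
+ # Illustrative usage of insert_before()/insert_after() (added for clarity;
+ # not part of the upstream module):
+ #
+ #   soup = BeautifulSoup('<b>stop</b>')
+ #   tag = soup.new_tag('i')
+ #   tag.string = "Don't"
+ #   soup.b.string.insert_before(tag)                   # <b><i>Don't</i>stop</b>
+ #   soup.b.i.insert_after(soup.new_string(' ever '))   # <b><i>Don't</i> ever stop</b>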
+
+ def find_next(self, name=None, attrs={}, text=None, **kwargs):
+ """Returns the first item that matches the given criteria and
+ appears after this Tag in the document."""
+ return self._find_one(self.find_all_next, name, attrs, text, **kwargs)
+ findNext = find_next # BS3
+
+ def find_all_next(self, name=None, attrs={}, text=None, limit=None,
+ **kwargs):
+ """Returns all items that match the given criteria and appear
+ after this Tag in the document."""
+ return self._find_all(name, attrs, text, limit, self.next_elements,
+ **kwargs)
+ findAllNext = find_all_next # BS3
+
+ def find_next_sibling(self, name=None, attrs={}, text=None, **kwargs):
+ """Returns the closest sibling to this Tag that matches the
+ given criteria and appears after this Tag in the document."""
+ return self._find_one(self.find_next_siblings, name, attrs, text,
+ **kwargs)
+ findNextSibling = find_next_sibling # BS3
+
+ def find_next_siblings(self, name=None, attrs={}, text=None, limit=None,
+ **kwargs):
+ """Returns the siblings of this Tag that match the given
+ criteria and appear after this Tag in the document."""
+ return self._find_all(name, attrs, text, limit,
+ self.next_siblings, **kwargs)
+ findNextSiblings = find_next_siblings # BS3
+ fetchNextSiblings = find_next_siblings # BS2
+
+ def find_previous(self, name=None, attrs={}, text=None, **kwargs):
+ """Returns the first item that matches the given criteria and
+ appears before this Tag in the document."""
+ return self._find_one(
+ self.find_all_previous, name, attrs, text, **kwargs)
+ findPrevious = find_previous # BS3
+
+ def find_all_previous(self, name=None, attrs={}, text=None, limit=None,
+ **kwargs):
+ """Returns all items that match the given criteria and appear
+ before this Tag in the document."""
+ return self._find_all(name, attrs, text, limit, self.previous_elements,
+ **kwargs)
+ findAllPrevious = find_all_previous # BS3
+ fetchPrevious = find_all_previous # BS2
+
+ def find_previous_sibling(self, name=None, attrs={}, text=None, **kwargs):
+ """Returns the closest sibling to this Tag that matches the
+ given criteria and appears before this Tag in the document."""
+ return self._find_one(self.find_previous_siblings, name, attrs, text,
+ **kwargs)
+ findPreviousSibling = find_previous_sibling # BS3
+
+ def find_previous_siblings(self, name=None, attrs={}, text=None,
+ limit=None, **kwargs):
+ """Returns the siblings of this Tag that match the given
+ criteria and appear before this Tag in the document."""
+ return self._find_all(name, attrs, text, limit,
+ self.previous_siblings, **kwargs)
+ findPreviousSiblings = find_previous_siblings # BS3
+ fetchPreviousSiblings = find_previous_siblings # BS2
+
+ def find_parent(self, name=None, attrs={}, **kwargs):
+ """Returns the closest parent of this Tag that matches the given
+ criteria."""
+ # NOTE: We can't use _find_one because findParents takes a different
+ # set of arguments.
+ r = None
+ l = self.find_parents(name, attrs, 1, **kwargs)
+ if l:
+ r = l[0]
+ return r
+ findParent = find_parent # BS3
+
+ def find_parents(self, name=None, attrs={}, limit=None, **kwargs):
+ """Returns the parents of this Tag that match the given
+ criteria."""
+
+ return self._find_all(name, attrs, None, limit, self.parents,
+ **kwargs)
+ findParents = find_parents # BS3
+ fetchParents = find_parents # BS2
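+
+ # Illustrative usage of the find_* family above (added for clarity; not
+ # part of the upstream module; assumes some parsed soup):
+ #
+ #   first_b = soup.find('b')
+ #   first_b.find_next('p')       # first <p> after the <b> in document order
+ #   first_b.find_all_next('p')   # every <p> after it
+ #   first_b.find_parent('div')   # nearest enclosing <div>, or None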
+
+ @property
+ def next(self):
+ return self.next_element
+
+ @property
+ def previous(self):
+ return self.previous_element
+
+ #These methods do the real heavy lifting.
+
+ def _find_one(self, method, name, attrs, text, **kwargs):
+ r = None
+ l = method(name, attrs, text, 1, **kwargs)
+ if l:
+ r = l[0]
+ return r
+
+ def _find_all(self, name, attrs, text, limit, generator, **kwargs):
+ "Iterates over a generator looking for things that match."
+
+ if isinstance(name, SoupStrainer):
+ strainer = name
+ else:
+ strainer = SoupStrainer(name, attrs, text, **kwargs)
+
+ if text is None and not limit and not attrs and not kwargs:
+ if name is True or name is None:
+ # Optimization to find all tags.
+ result = (element for element in generator
+ if isinstance(element, Tag))
+ return ResultSet(strainer, result)
+ elif isinstance(name, basestring):
+ # Optimization to find all tags with a given name.
+ result = (element for element in generator
+ if isinstance(element, Tag)
+ and element.name == name)
+ return ResultSet(strainer, result)
+ results = ResultSet(strainer)
+ while True:
+ try:
+ i = next(generator)
+ except StopIteration:
+ break
+ if i:
+ found = strainer.search(i)
+ if found:
+ results.append(found)
+ if limit and len(results) >= limit:
+ break
+ return results
+
+ #These generators can be used to navigate starting from both
+ #NavigableStrings and Tags.
+ @property
+ def next_elements(self):
+ i = self.next_element
+ while i is not None:
+ yield i
+ i = i.next_element
+
+ @property
+ def next_siblings(self):
+ i = self.next_sibling
+ while i is not None:
+ yield i
+ i = i.next_sibling
+
+ @property
+ def previous_elements(self):
+ i = self.previous_element
+ while i is not None:
+ yield i
+ i = i.previous_element
+
+ @property
+ def previous_siblings(self):
+ i = self.previous_sibling
+ while i is not None:
+ yield i
+ i = i.previous_sibling
+
+ @property
+ def parents(self):
+ i = self.parent
+ while i is not None:
+ yield i
+ i = i.parent
+
+ # Methods for supporting CSS selectors.
+
+ tag_name_re = re.compile('^[a-z0-9]+$')
+
+ # /^(\w+)\[(\w+)([=~\|\^\$\*]?)=?"?([^\]"]*)"?\]$/
+ # \---/ \---/\-------------/ \-------/
+ # | | | |
+ # | | | The value
+ # | | ~,|,^,$,* or =
+ # | Attribute
+ # Tag
+ attribselect_re = re.compile(
+ r'^(?P<tag>\w+)?\[(?P<attribute>\w+)(?P<operator>[=~\|\^\$\*]?)' +
+ r'=?"?(?P<value>[^\]"]*)"?\]$'
+ )
+
+ def _attr_value_as_string(self, value, default=None):
+ """Force an attribute value into a string representation.
+
+ A multi-valued attribute will be converted into a
+ space-separated string.
+ """
+ value = self.get(value, default)
+ if isinstance(value, list) or isinstance(value, tuple):
+ value = " ".join(value)
+ return value
+
+ def _tag_name_matches_and(self, function, tag_name):
+ if not tag_name:
+ return function
+ else:
+ def _match(tag):
+ return tag.name == tag_name and function(tag)
+ return _match
+
+ def _attribute_checker(self, operator, attribute, value=''):
+ """Create a function that performs a CSS selector operation.
+
+ Takes an operator, attribute and optional value. Returns a
+ function that will return True for elements that match that
+ combination.
+ """
+ if operator == '=':
+ # string representation of `attribute` is equal to `value`
+ return lambda el: el._attr_value_as_string(attribute) == value
+ elif operator == '~':
+ # space-separated list representation of `attribute`
+ # contains `value`
+ def _includes_value(element):
+ attribute_value = element.get(attribute, [])
+ if not isinstance(attribute_value, list):
+ attribute_value = attribute_value.split()
+ return value in attribute_value
+ return _includes_value
+ elif operator == '^':
+ # string representation of `attribute` starts with `value`
+ return lambda el: el._attr_value_as_string(
+ attribute, '').startswith(value)
+ elif operator == '$':
+ # string representation of `attribute` ends with `value`
+ return lambda el: el._attr_value_as_string(
+ attribute, '').endswith(value)
+ elif operator == '*':
+ # string representation of `attribute` contains `value`
+ return lambda el: value in el._attr_value_as_string(attribute, '')
+ elif operator == '|':
+ # string representation of `attribute` is either exactly
+ # `value` or starts with `value` and then a dash.
+ def _is_or_starts_with_dash(element):
+ attribute_value = element._attr_value_as_string(attribute, '')
+ return (attribute_value == value or attribute_value.startswith(
+ value + '-'))
+ return _is_or_starts_with_dash
+ else:
+ return lambda el: el.has_attr(attribute)
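+
+ # Quick reference for the CSS attribute selectors handled above (added
+ # for clarity; not part of the upstream module):
+ #
+ #   [href=foo]   exact match          [href^=foo]  starts with 'foo'
+ #   [href~=foo]  word in a list       [href$=foo]  ends with 'foo'
+ #   [href*=foo]  substring            [href|=foo]  'foo' or 'foo-...'
+ #   [href]       attribute present, any value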
+
+ # Old non-property versions of the generators, for backwards
+ # compatibility with BS3.
+ def nextGenerator(self):
+ return self.next_elements
+
+ def nextSiblingGenerator(self):
+ return self.next_siblings
+
+ def previousGenerator(self):
+ return self.previous_elements
+
+ def previousSiblingGenerator(self):
+ return self.previous_siblings
+
+ def parentGenerator(self):
+ return self.parents
+
+
+class NavigableString(unicode, PageElement):
+
+ PREFIX = ''
+ SUFFIX = ''
+
+ def __new__(cls, value):
+ """Create a new NavigableString.
+
+ When unpickling a NavigableString, this method is called with
+ the string in DEFAULT_OUTPUT_ENCODING. That encoding needs to be
+ passed in to the superclass's __new__ or the superclass won't know
+ how to handle non-ASCII characters.
+ """
+ if isinstance(value, unicode):
+ return unicode.__new__(cls, value)
+ return unicode.__new__(cls, value, DEFAULT_OUTPUT_ENCODING)
+
+ def __copy__(self):
+ return self
+
+ def __getnewargs__(self):
+ return (unicode(self),)
+
+ def __getattr__(self, attr):
+ """text.string gives you text. This is for backwards
+ compatibility for Navigable*String, but for CData* it lets you
+ get the string without the CData wrapper."""
+ if attr == 'string':
+ return self
+ else:
+ raise AttributeError(
+ "'%s' object has no attribute '%s'" % (
+ self.__class__.__name__, attr))
+
+ def output_ready(self, formatter="minimal"):
+ output = self.format_string(self, formatter)
+ return self.PREFIX + output + self.SUFFIX
+
+ @property
+ def name(self):
+ return None
+
+ @name.setter
+ def name(self, name):
+ raise AttributeError("A NavigableString cannot be given a name.")
+
+class PreformattedString(NavigableString):
+ """A NavigableString not subject to the normal formatting rules.
+
+ The string will be passed into the formatter (to trigger side effects),
+ but the return value will be ignored.
+ """
+
+ def output_ready(self, formatter="minimal"):
+ """CData strings are passed into the formatter.
+ But the return value is ignored."""
+ self.format_string(self, formatter)
+ return self.PREFIX + self + self.SUFFIX
+
+class CData(PreformattedString):
+
+ PREFIX = u'<![CDATA['
+ SUFFIX = u']]>'
+
+class ProcessingInstruction(PreformattedString):
+
+ PREFIX = u'<?'
+ SUFFIX = u'?>'
+
+class Comment(PreformattedString):
+
+ PREFIX = u'<!--'
+ SUFFIX = u'-->'
+
+
+class Declaration(PreformattedString):
+ PREFIX = u'<!'
+ SUFFIX = u'!>'
+
+
+class Doctype(PreformattedString):
+
+ @classmethod
+ def for_name_and_ids(cls, name, pub_id, system_id):
+ value = name or ''
+ if pub_id is not None:
+ value += ' PUBLIC "%s"' % pub_id
+ if system_id is not None:
+ value += ' "%s"' % system_id
+ elif system_id is not None:
+ value += ' SYSTEM "%s"' % system_id
+
+ return Doctype(value)
+
+ PREFIX = u'<!DOCTYPE '
+ SUFFIX = u'>\n'
+
+
+class Tag(PageElement):
+
+ """Represents a found HTML tag with its attributes and contents."""
+
+ def __init__(self, parser=None, builder=None, name=None, namespace=None,
+ prefix=None, attrs=None, parent=None, previous=None):
+ "Basic constructor."
+
+ if parser is None:
+ self.parser_class = None
+ else:
+ # We don't actually store the parser object: that lets extracted
+ # chunks be garbage-collected.
+ self.parser_class = parser.__class__
+ if name is None:
+ raise ValueError("No value provided for new tag's name.")
+ self.name = name
+ self.namespace = namespace
+ self.prefix = prefix
+ if attrs is None:
+ attrs = {}
+ elif attrs and builder.cdata_list_attributes:
+ attrs = builder._replace_cdata_list_attribute_values(
+ self.name, attrs)
+ else:
+ attrs = dict(attrs)
+ self.attrs = attrs
+ self.contents = []
+ self.setup(parent, previous)
+ self.hidden = False
+
+ # Set up any substitutions, such as the charset in a META tag.
+ if builder is not None:
+ builder.set_up_substitutions(self)
+ self.can_be_empty_element = builder.can_be_empty_element(name)
+ else:
+ self.can_be_empty_element = False
+
+ parserClass = _alias("parser_class") # BS3
+
+ @property
+ def is_empty_element(self):
+ """Is this tag an empty-element tag? (aka a self-closing tag)
+
+ A tag that has contents is never an empty-element tag.
+
+ A tag that has no contents may or may not be an empty-element
+ tag. It depends on the builder used to create the tag. If the
+ builder has a designated list of empty-element tags, then only
+ a tag whose name shows up in that list is considered an
+ empty-element tag.
+
+ If the builder has no designated list of empty-element tags,
+ then any tag with no contents is an empty-element tag.
+ """
+ return len(self.contents) == 0 and self.can_be_empty_element
+ isSelfClosing = is_empty_element # BS3
+
+ @property
+ def string(self):
+ """Convenience property to get the single string within this tag.
+
+ :Return: If this tag has a single string child, return value
+ is that string. If this tag has no children, or more than one
+ child, return value is None. If this tag has one child tag,
+ return value is the 'string' attribute of the child tag,
+ recursively.
+ """
+ if len(self.contents) != 1:
+ return None
+ child = self.contents[0]
+ if isinstance(child, NavigableString):
+ return child
+ return child.string
+
+ @string.setter
+ def string(self, string):
+ self.clear()
+ self.append(string.__class__(string))
+
+ def _all_strings(self, strip=False, types=(NavigableString, CData)):
+ """Yield all strings of certain classes, possibly stripping them.
+
+ By default, yields only NavigableString and CData objects. So
+ no comments, processing instructions, etc.
+ """
+ for descendant in self.descendants:
+ if (
+ (types is None and not isinstance(descendant, NavigableString))
+ or
+ (types is not None and type(descendant) not in types)):
+ continue
+ if strip:
+ descendant = descendant.strip()
+ if len(descendant) == 0:
+ continue
+ yield descendant
+
+ strings = property(_all_strings)
+
+ @property
+ def stripped_strings(self):
+ for string in self._all_strings(True):
+ yield string
+
+ def get_text(self, separator=u"", strip=False,
+ types=(NavigableString, CData)):
+ """
+ Get all child strings, concatenated using the given separator.
+ """
+ return separator.join([s for s in self._all_strings(
+ strip, types=types)])
+ getText = get_text
+ text = property(get_text)
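+
+ # Illustrative usage (added for clarity; not part of the upstream module):
+ #
+ #   soup = BeautifulSoup('<p>Hello <b>world</b>!</p>')
+ #   soup.get_text()                  # u'Hello world!'
+ #   soup.get_text(' ', strip=True)   # u'Hello world !'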
+
+ def decompose(self):
+ """Recursively destroys the contents of this tree."""
+ self.extract()
+ i = self
+ while i is not None:
+ next = i.next_element
+ i.__dict__.clear()
+ i.contents = []
+ i = next
+
+ def clear(self, decompose=False):
+ """
+ Extract all children. If decompose is True, decompose instead.
+ """
+ if decompose:
+ for element in self.contents[:]:
+ if isinstance(element, Tag):
+ element.decompose()
+ else:
+ element.extract()
+ else:
+ for element in self.contents[:]:
+ element.extract()
+
+ def index(self, element):
+ """
+ Find the index of a child by identity, not value. Avoids issues with
+ tag.contents.index(element) getting the index of equal elements.
+ """
+ for i, child in enumerate(self.contents):
+ if child is element:
+ return i
+ raise ValueError("Tag.index: element not in tag")
+
+ def get(self, key, default=None):
+ """Returns the value of the 'key' attribute for the tag, or
+ the value given for 'default' if it doesn't have that
+ attribute."""
+ return self.attrs.get(key, default)
+
+ def has_attr(self, key):
+ return key in self.attrs
+
+ def __hash__(self):
+ return str(self).__hash__()
+
+ def __getitem__(self, key):
+ """tag[key] returns the value of the 'key' attribute for the tag,
+ and throws an exception if it's not there."""
+ return self.attrs[key]
+
+ def __iter__(self):
+ "Iterating over a tag iterates over its contents."
+ return iter(self.contents)
+
+ def __len__(self):
+ "The length of a tag is the length of its list of contents."
+ return len(self.contents)
+
+ def __contains__(self, x):
+ return x in self.contents
+
+ def __nonzero__(self):
+ "A tag is non-None even if it has no contents."
+ return True
+
+ def __setitem__(self, key, value):
+ """Setting tag[key] sets the value of the 'key' attribute for the
+ tag."""
+ self.attrs[key] = value
+
+ def __delitem__(self, key):
+ "Deleting tag[key] deletes all 'key' attributes for the tag."
+ self.attrs.pop(key, None)
+
+ def __call__(self, *args, **kwargs):
+ """Calling a tag like a function is the same as calling its
+ find_all() method. Eg. tag('a') returns a list of all the A tags
+ found within this tag."""
+ return self.find_all(*args, **kwargs)
+
+ def __getattr__(self, tag):
+ #print "Getattr %s.%s" % (self.__class__, tag)
+ if len(tag) > 3 and tag.endswith('Tag'):
+ # BS3: soup.aTag -> soup.find("a")
+ tag_name = tag[:-3]
+ warnings.warn(
+ '.%sTag is deprecated, use .find("%s") instead.' % (
+ tag_name, tag_name))
+ return self.find(tag_name)
+ # We special case contents to avoid recursion.
+ elif not tag.startswith("__") and not tag=="contents":
+ return self.find(tag)
+ raise AttributeError(
+ "'%s' object has no attribute '%s'" % (self.__class__, tag))
+
+ def __eq__(self, other):
+ """Returns true iff this tag has the same name, the same attributes,
+ and the same contents (recursively) as the given tag."""
+ if self is other:
+ return True
+ if (not hasattr(other, 'name') or
+ not hasattr(other, 'attrs') or
+ not hasattr(other, 'contents') or
+ self.name != other.name or
+ self.attrs != other.attrs or
+ len(self) != len(other)):
+ return False
+ for i, my_child in enumerate(self.contents):
+ if my_child != other.contents[i]:
+ return False
+ return True
+
+ def __ne__(self, other):
+ """Returns true iff this tag is not identical to the other tag,
+ as defined in __eq__."""
+ return not self == other
+
+ def __repr__(self, encoding=DEFAULT_OUTPUT_ENCODING):
+ """Renders this tag as a string."""
+ return self.encode(encoding)
+
+ def __unicode__(self):
+ return self.decode()
+
+ def __str__(self):
+ return self.encode()
+
+ if PY3K:
+ __str__ = __repr__ = __unicode__
+
+ def encode(self, encoding=DEFAULT_OUTPUT_ENCODING,
+ indent_level=None, formatter="minimal",
+ errors="xmlcharrefreplace"):
+ # Turn the data structure into Unicode, then encode the
+ # Unicode.
+ u = self.decode(indent_level, encoding, formatter)
+ return u.encode(encoding, errors)
+
+ def _should_pretty_print(self, indent_level):
+ """Should this tag be pretty-printed?"""
+ return (
+ indent_level is not None and
+ (self.name not in HTMLAwareEntitySubstitution.preformatted_tags
+ or self._is_xml))
+
+ def decode(self, indent_level=None,
+ eventual_encoding=DEFAULT_OUTPUT_ENCODING,
+ formatter="minimal"):
+ """Returns a Unicode representation of this tag and its contents.
+
+ :param eventual_encoding: The tag is destined to be
+ encoded into this encoding. This method is _not_
+ responsible for performing that encoding. This information
+ is passed in so that it can be substituted in if the
+ document contains a <META> tag that mentions the document's
+ encoding.
+ """
+
+ # First off, turn a string formatter into a function. This
+ # will stop the lookup from happening over and over again.
+ if not callable(formatter):
+ formatter = self._formatter_for_name(formatter)
+
+ attrs = []
+ if self.attrs:
+ for key, val in sorted(self.attrs.items()):
+ if val is None:
+ decoded = key
+ else:
+ if isinstance(val, list) or isinstance(val, tuple):
+ val = ' '.join(val)
+ elif not isinstance(val, basestring):
+ val = unicode(val)
+ elif (
+ isinstance(val, AttributeValueWithCharsetSubstitution)
+ and eventual_encoding is not None):
+ val = val.encode(eventual_encoding)
+
+ text = self.format_string(val, formatter)
+ decoded = (
+ unicode(key) + '='
+ + EntitySubstitution.quoted_attribute_value(text))
+ attrs.append(decoded)
+ close = ''
+ closeTag = ''
+
+ prefix = ''
+ if self.prefix:
+ prefix = self.prefix + ":"
+
+ if self.is_empty_element:
+ close = '/'
+ else:
+ closeTag = '</%s%s>' % (prefix, self.name)
+
+ pretty_print = self._should_pretty_print(indent_level)
+ space = ''
+ indent_space = ''
+ if indent_level is not None:
+ indent_space = (' ' * (indent_level - 1))
+ if pretty_print:
+ space = indent_space
+ indent_contents = indent_level + 1
+ else:
+ indent_contents = None
+ contents = self.decode_contents(
+ indent_contents, eventual_encoding, formatter)
+
+ if self.hidden:
+ # This is the 'document root' object.
+ s = contents
+ else:
+ s = []
+ attribute_string = ''
+ if attrs:
+ attribute_string = ' ' + ' '.join(attrs)
+ if indent_level is not None:
+ # Even if this particular tag is not pretty-printed,
+ # we should indent up to the start of the tag.
+ s.append(indent_space)
+ s.append('<%s%s%s%s>' % (
+ prefix, self.name, attribute_string, close))
+ if pretty_print:
+ s.append("\n")
+ s.append(contents)
+ if pretty_print and contents and contents[-1] != "\n":
+ s.append("\n")
+ if pretty_print and closeTag:
+ s.append(space)
+ s.append(closeTag)
+ if indent_level is not None and closeTag and self.next_sibling:
+ # Even if this particular tag is not pretty-printed,
+ # we're now done with the tag, and we should add a
+ # newline if appropriate.
+ s.append("\n")
+ s = ''.join(s)
+ return s
+
+ def prettify(self, encoding=None, formatter="minimal"):
+ if encoding is None:
+ return self.decode(True, formatter=formatter)
+ else:
+ return self.encode(encoding, True, formatter=formatter)
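+
+    # A minimal sketch of the three output paths above (assumed examples,
+    # not upstream documentation):
+    #
+    #   tag.decode()         # Unicode, no pretty-printing
+    #   tag.encode("utf-8")  # bytes in the requested encoding
+    #   tag.prettify()       # Unicode, indented, one tag per line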
+
+ def decode_contents(self, indent_level=None,
+ eventual_encoding=DEFAULT_OUTPUT_ENCODING,
+ formatter="minimal"):
+ """Renders the contents of this tag as a Unicode string.
+
+ :param eventual_encoding: The tag is destined to be
+ encoded into this encoding. This method is _not_
+ responsible for performing that encoding. This information
+ is passed in so that it can be substituted in if the
+ document contains a <META> tag that mentions the document's
+ encoding.
+ """
+ # First off, turn a string formatter into a function. This
+ # will stop the lookup from happening over and over again.
+ if not callable(formatter):
+ formatter = self._formatter_for_name(formatter)
+
+ pretty_print = (indent_level is not None)
+ s = []
+ for c in self:
+ text = None
+ if isinstance(c, NavigableString):
+ text = c.output_ready(formatter)
+ elif isinstance(c, Tag):
+ s.append(c.decode(indent_level, eventual_encoding,
+ formatter))
+ if text and indent_level and not self.name == 'pre':
+ text = text.strip()
+ if text:
+ if pretty_print and not self.name == 'pre':
+ s.append(" " * (indent_level - 1))
+ s.append(text)
+ if pretty_print and not self.name == 'pre':
+ s.append("\n")
+ return ''.join(s)
+
+ def encode_contents(
+ self, indent_level=None, encoding=DEFAULT_OUTPUT_ENCODING,
+ formatter="minimal"):
+ """Renders the contents of this tag as a bytestring."""
+ contents = self.decode_contents(indent_level, encoding, formatter)
+ return contents.encode(encoding)
+
+ # Old method for BS3 compatibility
+ def renderContents(self, encoding=DEFAULT_OUTPUT_ENCODING,
+ prettyPrint=False, indentLevel=0):
+ if not prettyPrint:
+ indentLevel = None
+ return self.encode_contents(
+ indent_level=indentLevel, encoding=encoding)
+
+ #Soup methods
+
+ def find(self, name=None, attrs={}, recursive=True, text=None,
+ **kwargs):
+ """Return only the first child of this Tag matching the given
+ criteria."""
+ r = None
+ l = self.find_all(name, attrs, recursive, text, 1, **kwargs)
+ if l:
+ r = l[0]
+ return r
+ findChild = find
+
+ def find_all(self, name=None, attrs={}, recursive=True, text=None,
+ limit=None, **kwargs):
+ """Extracts a list of Tag objects that match the given
+ criteria. You can specify the name of the Tag and any
+ attributes you want the Tag to have.
+
+ The value of a key-value pair in the 'attrs' map can be a
+ string, a list of strings, a regular expression object, or a
+ callable that takes a string and returns whether or not the
+ string matches for some custom definition of 'matches'. The
+ same is true of the tag name."""
+
+ generator = self.descendants
+ if not recursive:
+ generator = self.children
+ return self._find_all(name, attrs, text, limit, generator, **kwargs)
+ findAll = find_all # BS3
+ findChildren = find_all # BS2
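+
+    # A minimal sketch of the matching forms the find_all() docstring
+    # describes (assumed examples; `soup` is any parsed document):
+    #
+    #   soup.find_all("a")                          # by tag name
+    #   soup.find_all("a", href=True)               # attribute present
+    #   soup.find_all("a", attrs={"class": "nav"})  # attribute value
+    #   soup.find_all(re.compile("^h[1-6]$"))       # regex on the name
+    #   soup.find_all(lambda tag: not tag.attrs)    # custom callable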
+
+ #Generator methods
+ @property
+ def children(self):
+ # return iter() to make the purpose of the method clear
+ return iter(self.contents) # XXX This seems to be untested.
+
+ @property
+ def descendants(self):
+ if not len(self.contents):
+ return
+ stopNode = self._last_descendant().next_element
+ current = self.contents[0]
+ while current is not stopNode:
+ yield current
+ current = current.next_element
+
+ # CSS selector code
+
+ _selector_combinators = ['>', '+', '~']
+ _select_debug = False
+ def select(self, selector, _candidate_generator=None):
+ """Perform a CSS selection operation on the current element."""
+ tokens = selector.split()
+ current_context = [self]
+
+ if tokens[-1] in self._selector_combinators:
+ raise ValueError(
+ 'Final combinator "%s" is missing an argument.' % tokens[-1])
+ if self._select_debug:
+ print 'Running CSS selector "%s"' % selector
+ for index, token in enumerate(tokens):
+ if self._select_debug:
+ print ' Considering token "%s"' % token
+ recursive_candidate_generator = None
+ tag_name = None
+ if tokens[index-1] in self._selector_combinators:
+ # This token was consumed by the previous combinator. Skip it.
+ if self._select_debug:
+ print ' Token was consumed by the previous combinator.'
+ continue
+ # Each operation corresponds to a checker function, a rule
+ # for determining whether a candidate matches the
+ # selector. Candidates are generated by the active
+ # iterator.
+ checker = None
+
+ m = self.attribselect_re.match(token)
+ if m is not None:
+ # Attribute selector
+ tag_name, attribute, operator, value = m.groups()
+ checker = self._attribute_checker(operator, attribute, value)
+
+ elif '#' in token:
+ # ID selector
+ tag_name, tag_id = token.split('#', 1)
+ def id_matches(tag):
+ return tag.get('id', None) == tag_id
+ checker = id_matches
+
+ elif '.' in token:
+ # Class selector
+ tag_name, klass = token.split('.', 1)
+ classes = set(klass.split('.'))
+ def classes_match(candidate):
+ return classes.issubset(candidate.get('class', []))
+ checker = classes_match
+
+ elif ':' in token:
+ # Pseudo-class
+ tag_name, pseudo = token.split(':', 1)
+ if tag_name == '':
+ raise ValueError(
+ "A pseudo-class must be prefixed with a tag name.")
+ pseudo_attributes = re.match('([a-zA-Z\d-]+)\(([a-zA-Z\d]+)\)', pseudo)
+ found = []
+ if pseudo_attributes is not None:
+ pseudo_type, pseudo_value = pseudo_attributes.groups()
+ if pseudo_type == 'nth-of-type':
+ try:
+ pseudo_value = int(pseudo_value)
+ except:
+ raise NotImplementedError(
+ 'Only numeric values are currently supported for the nth-of-type pseudo-class.')
+ if pseudo_value < 1:
+ raise ValueError(
+ 'nth-of-type pseudo-class value must be at least 1.')
+ class Counter(object):
+ def __init__(self, destination):
+ self.count = 0
+ self.destination = destination
+
+ def nth_child_of_type(self, tag):
+ self.count += 1
+ if self.count == self.destination:
+ return True
+ if self.count > self.destination:
+ # Stop the generator that's sending us
+ # these things.
+ raise StopIteration()
+ return False
+ checker = Counter(pseudo_value).nth_child_of_type
+ else:
+ raise NotImplementedError(
+ 'Only the following pseudo-classes are implemented: nth-of-type.')
+
+ elif token == '*':
+ # Star selector -- matches everything
+ pass
+ elif token == '>':
+ # Run the next token as a CSS selector against the
+ # direct children of each tag in the current context.
+ recursive_candidate_generator = lambda tag: tag.children
+ elif token == '~':
+ # Run the next token as a CSS selector against the
+ # siblings of each tag in the current context.
+ recursive_candidate_generator = lambda tag: tag.next_siblings
+ elif token == '+':
+ # For each tag in the current context, run the next
+ # token as a CSS selector against the tag's next
+ # sibling that's a tag.
+ def next_tag_sibling(tag):
+ yield tag.find_next_sibling(True)
+ recursive_candidate_generator = next_tag_sibling
+
+ elif self.tag_name_re.match(token):
+ # Just a tag name.
+ tag_name = token
+ else:
+ raise ValueError(
+ 'Unsupported or invalid CSS selector: "%s"' % token)
+
+ if recursive_candidate_generator:
+ # This happens when the selector looks like "> foo".
+ #
+ # The generator calls select() recursively on every
+ # member of the current context, passing in a different
+ # candidate generator and a different selector.
+ #
+ # In the case of "> foo", the candidate generator is
+ # one that yields a tag's direct children (">"), and
+ # the selector is "foo".
+ next_token = tokens[index+1]
+ def recursive_select(tag):
+ if self._select_debug:
+ print ' Calling select("%s") recursively on %s %s' % (next_token, tag.name, tag.attrs)
+ print '-' * 40
+ for i in tag.select(next_token, recursive_candidate_generator):
+ if self._select_debug:
+ print '(Recursive select picked up candidate %s %s)' % (i.name, i.attrs)
+ yield i
+ if self._select_debug:
+ print '-' * 40
+ _use_candidate_generator = recursive_select
+ elif _candidate_generator is None:
+ # By default, a tag's candidates are all of its
+ # children. If tag_name is defined, only yield tags
+ # with that name.
+ if self._select_debug:
+                    if tag_name:
+                        check = tag_name
+                    else:
+                        check = "[any]"
+ print ' Default candidate generator, tag name="%s"' % check
+ if self._select_debug:
+ # This is redundant with later code, but it stops
+ # a bunch of bogus tags from cluttering up the
+ # debug log.
+ def default_candidate_generator(tag):
+ for child in tag.descendants:
+ if not isinstance(child, Tag):
+ continue
+ if tag_name and not child.name == tag_name:
+ continue
+ yield child
+ _use_candidate_generator = default_candidate_generator
+ else:
+ _use_candidate_generator = lambda tag: tag.descendants
+ else:
+ _use_candidate_generator = _candidate_generator
+
+ new_context = []
+ new_context_ids = set([])
+ for tag in current_context:
+ if self._select_debug:
+ print " Running candidate generator on %s %s" % (
+ tag.name, repr(tag.attrs))
+ for candidate in _use_candidate_generator(tag):
+ if not isinstance(candidate, Tag):
+ continue
+ if tag_name and candidate.name != tag_name:
+ continue
+ if checker is not None:
+ try:
+ result = checker(candidate)
+ except StopIteration:
+ # The checker has decided we should no longer
+ # run the generator.
+ break
+ if checker is None or result:
+ if self._select_debug:
+ print " SUCCESS %s %s" % (candidate.name, repr(candidate.attrs))
+ if id(candidate) not in new_context_ids:
+ # If a tag matches a selector more than once,
+ # don't include it in the context more than once.
+ new_context.append(candidate)
+ new_context_ids.add(id(candidate))
+ elif self._select_debug:
+ print " FAILURE %s %s" % (candidate.name, repr(candidate.attrs))
+
+ current_context = new_context
+
+ if self._select_debug:
+ print "Final verdict:"
+ for i in current_context:
+ print " %s %s" % (i.name, i.attrs)
+ return current_context
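+
+    # A minimal sketch of the selector forms the token loop above handles
+    # (assumed examples; attribute and pseudo-class support is limited to
+    # the branches implemented here):
+    #
+    #   soup.select("p")                  # plain tag name
+    #   soup.select("#content")           # id selector
+    #   soup.select("div.note.wide")      # class selector(s)
+    #   soup.select('a[rel="nofollow"]')  # attribute selector
+    #   soup.select("ul > li")            # child combinator
+    #   soup.select("h1 ~ p")             # sibling combinator
+    #   soup.select("li:nth-of-type(2)")  # only supported pseudo-class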
+
+ # Old names for backwards compatibility
+ def childGenerator(self):
+ return self.children
+
+ def recursiveChildGenerator(self):
+ return self.descendants
+
+    def has_key(self, key):
+        """This was kind of misleading because has_key() (attributes)
+        was different from __contains__ (contents). has_key() is gone in
+        Python 3, anyway."""
+ warnings.warn('has_key is deprecated. Use has_attr("%s") instead.' % (
+ key))
+ return self.has_attr(key)
+
+# Next, a couple classes to represent queries and their results.
+class SoupStrainer(object):
+ """Encapsulates a number of ways of matching a markup element (tag or
+ text)."""
+
+ def __init__(self, name=None, attrs={}, text=None, **kwargs):
+ self.name = self._normalize_search_value(name)
+ if not isinstance(attrs, dict):
+ # Treat a non-dict value for attrs as a search for the 'class'
+ # attribute.
+ kwargs['class'] = attrs
+ attrs = None
+
+ if 'class_' in kwargs:
+ # Treat class_="foo" as a search for the 'class'
+ # attribute, overriding any non-dict value for attrs.
+ kwargs['class'] = kwargs['class_']
+ del kwargs['class_']
+
+ if kwargs:
+ if attrs:
+ attrs = attrs.copy()
+ attrs.update(kwargs)
+ else:
+ attrs = kwargs
+ normalized_attrs = {}
+ for key, value in attrs.items():
+ normalized_attrs[key] = self._normalize_search_value(value)
+
+ self.attrs = normalized_attrs
+ self.text = self._normalize_search_value(text)
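+
+    # A minimal sketch of how strainers are built (assumed examples;
+    # `re` must be imported by the caller):
+    #
+    #   SoupStrainer("b")                         # match <b> tags
+    #   SoupStrainer("a", href=re.compile("^/"))  # keyword args become attrs
+    #   SoupStrainer(attrs="headline")            # non-dict attrs -> 'class' search
+    #   SoupStrainer(class_="headline")           # class_ also maps to 'class'
+    #   SoupStrainer(text="More information")     # match text instead of tags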
+
+ def _normalize_search_value(self, value):
+ # Leave it alone if it's a Unicode string, a callable, a
+ # regular expression, a boolean, or None.
+ if (isinstance(value, unicode) or callable(value) or hasattr(value, 'match')
+ or isinstance(value, bool) or value is None):
+ return value
+
+ # If it's a bytestring, convert it to Unicode, treating it as UTF-8.
+ if isinstance(value, bytes):
+ return value.decode("utf8")
+
+ # If it's listlike, convert it into a list of strings.
+ if hasattr(value, '__iter__'):
+ new_value = []
+ for v in value:
+ if (hasattr(v, '__iter__') and not isinstance(v, bytes)
+ and not isinstance(v, unicode)):
+ # This is almost certainly the user's mistake. In the
+ # interests of avoiding infinite loops, we'll let
+ # it through as-is rather than doing a recursive call.
+ new_value.append(v)
+ else:
+ new_value.append(self._normalize_search_value(v))
+ return new_value
+
+ # Otherwise, convert it into a Unicode string.
+ # The unicode(str()) thing is so this will do the same thing on Python 2
+ # and Python 3.
+ return unicode(str(value))
+
+ def __str__(self):
+ if self.text:
+ return self.text
+ else:
+ return "%s|%s" % (self.name, self.attrs)
+
+ def search_tag(self, markup_name=None, markup_attrs={}):
+ found = None
+ markup = None
+ if isinstance(markup_name, Tag):
+ markup = markup_name
+ markup_attrs = markup
+ call_function_with_tag_data = (
+ isinstance(self.name, collections.Callable)
+ and not isinstance(markup_name, Tag))
+
+ if ((not self.name)
+ or call_function_with_tag_data
+ or (markup and self._matches(markup, self.name))
+ or (not markup and self._matches(markup_name, self.name))):
+ if call_function_with_tag_data:
+ match = self.name(markup_name, markup_attrs)
+ else:
+ match = True
+ markup_attr_map = None
+ for attr, match_against in list(self.attrs.items()):
+ if not markup_attr_map:
+ if hasattr(markup_attrs, 'get'):
+ markup_attr_map = markup_attrs
+ else:
+ markup_attr_map = {}
+ for k, v in markup_attrs:
+ markup_attr_map[k] = v
+ attr_value = markup_attr_map.get(attr)
+ if not self._matches(attr_value, match_against):
+ match = False
+ break
+ if match:
+ if markup:
+ found = markup
+ else:
+ found = markup_name
+ if found and self.text and not self._matches(found.string, self.text):
+ found = None
+ return found
+ searchTag = search_tag
+
+ def search(self, markup):
+ # print 'looking for %s in %s' % (self, markup)
+ found = None
+ # If given a list of items, scan it for a text element that
+ # matches.
+ if hasattr(markup, '__iter__') and not isinstance(markup, (Tag, basestring)):
+ for element in markup:
+ if isinstance(element, NavigableString) \
+ and self.search(element):
+ found = element
+ break
+ # If it's a Tag, make sure its name or attributes match.
+ # Don't bother with Tags if we're searching for text.
+ elif isinstance(markup, Tag):
+ if not self.text or self.name or self.attrs:
+ found = self.search_tag(markup)
+ # If it's text, make sure the text matches.
+ elif isinstance(markup, NavigableString) or \
+ isinstance(markup, basestring):
+ if not self.name and not self.attrs and self._matches(markup, self.text):
+ found = markup
+ else:
+ raise Exception(
+ "I don't know how to match against a %s" % markup.__class__)
+ return found
+
+ def _matches(self, markup, match_against):
+ # print u"Matching %s against %s" % (markup, match_against)
+ result = False
+ if isinstance(markup, list) or isinstance(markup, tuple):
+ # This should only happen when searching a multi-valued attribute
+ # like 'class'.
+ if (isinstance(match_against, unicode)
+ and ' ' in match_against):
+ # A bit of a special case. If they try to match "foo
+ # bar" on a multivalue attribute's value, only accept
+ # the literal value "foo bar"
+ #
+ # XXX This is going to be pretty slow because we keep
+ # splitting match_against. But it shouldn't come up
+ # too often.
+ return (whitespace_re.split(match_against) == markup)
+ else:
+ for item in markup:
+ if self._matches(item, match_against):
+ return True
+ return False
+
+ if match_against is True:
+ # True matches any non-None value.
+ return markup is not None
+
+ if isinstance(match_against, collections.Callable):
+ return match_against(markup)
+
+ # Custom callables take the tag as an argument, but all
+ # other ways of matching match the tag name as a string.
+ if isinstance(markup, Tag):
+ markup = markup.name
+
+ # Ensure that `markup` is either a Unicode string, or None.
+ markup = self._normalize_search_value(markup)
+
+ if markup is None:
+ # None matches None, False, an empty string, an empty list, and so on.
+ return not match_against
+
+ if isinstance(match_against, unicode):
+ # Exact string match
+ return markup == match_against
+
+ if hasattr(match_against, 'match'):
+ # Regexp match
+ return match_against.search(markup)
+
+ if hasattr(match_against, '__iter__'):
+ # The markup must be an exact match against something
+ # in the iterable.
+ return markup in match_against
+
+
+class ResultSet(list):
+ """A ResultSet is just a list that keeps track of the SoupStrainer
+ that created it."""
+ def __init__(self, source, result=()):
+ super(ResultSet, self).__init__(result)
+ self.source = source
diff --git a/bitbake/lib/bs4/testing.py b/bitbake/lib/bs4/testing.py
new file mode 100644
index 0000000..fd4495a
--- /dev/null
+++ b/bitbake/lib/bs4/testing.py
@@ -0,0 +1,592 @@
+"""Helper classes for tests."""
+
+import copy
+import functools
+import unittest
+from unittest import TestCase
+from bs4 import BeautifulSoup
+from bs4.element import (
+ CharsetMetaAttributeValue,
+ Comment,
+ ContentMetaAttributeValue,
+ Doctype,
+ SoupStrainer,
+)
+
+from bs4.builder import HTMLParserTreeBuilder
+default_builder = HTMLParserTreeBuilder
+
+
+class SoupTest(unittest.TestCase):
+
+ @property
+ def default_builder(self):
+ return default_builder()
+
+ def soup(self, markup, **kwargs):
+ """Build a Beautiful Soup object from markup."""
+ builder = kwargs.pop('builder', self.default_builder)
+ return BeautifulSoup(markup, builder=builder, **kwargs)
+
+ def document_for(self, markup):
+ """Turn an HTML fragment into a document.
+
+ The details depend on the builder.
+ """
+ return self.default_builder.test_fragment_to_document(markup)
+
+ def assertSoupEquals(self, to_parse, compare_parsed_to=None):
+ builder = self.default_builder
+ obj = BeautifulSoup(to_parse, builder=builder)
+ if compare_parsed_to is None:
+ compare_parsed_to = to_parse
+
+ self.assertEqual(obj.decode(), self.document_for(compare_parsed_to))
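+
+    # A minimal sketch of how a concrete test case uses these helpers
+    # (hypothetical subclass, for illustration only):
+    #
+    #   class MyBuilderTest(SoupTest):
+    #       def test_roundtrip(self):
+    #           self.assertSoupEquals("<p>", "<p></p>")  # parse + re-serialize
+    #           soup = self.soup("<b>bold</b>")
+    #           self.assertEqual("bold", soup.b.string)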
+
+
+class HTMLTreeBuilderSmokeTest(object):
+
+ """A basic test of a treebuilder's competence.
+
+ Any HTML treebuilder, present or future, should be able to pass
+ these tests. With invalid markup, there's room for interpretation,
+ and different parsers can handle it differently. But with the
+ markup in these tests, there's not much room for interpretation.
+ """
+
+ def assertDoctypeHandled(self, doctype_fragment):
+ """Assert that a given doctype string is handled correctly."""
+ doctype_str, soup = self._document_with_doctype(doctype_fragment)
+
+ # Make sure a Doctype object was created.
+ doctype = soup.contents[0]
+ self.assertEqual(doctype.__class__, Doctype)
+ self.assertEqual(doctype, doctype_fragment)
+ self.assertEqual(str(soup)[:len(doctype_str)], doctype_str)
+
+ # Make sure that the doctype was correctly associated with the
+ # parse tree and that the rest of the document parsed.
+ self.assertEqual(soup.p.contents[0], 'foo')
+
+ def _document_with_doctype(self, doctype_fragment):
+ """Generate and parse a document with the given doctype."""
+ doctype = '<!DOCTYPE %s>' % doctype_fragment
+ markup = doctype + '\n<p>foo</p>'
+ soup = self.soup(markup)
+ return doctype, soup
+
+ def test_normal_doctypes(self):
+ """Make sure normal, everyday HTML doctypes are handled correctly."""
+ self.assertDoctypeHandled("html")
+ self.assertDoctypeHandled(
+ 'html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"')
+
+ def test_empty_doctype(self):
+ soup = self.soup("<!DOCTYPE>")
+ doctype = soup.contents[0]
+ self.assertEqual("", doctype.strip())
+
+ def test_public_doctype_with_url(self):
+ doctype = 'html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"'
+ self.assertDoctypeHandled(doctype)
+
+ def test_system_doctype(self):
+ self.assertDoctypeHandled('foo SYSTEM "http://www.example.com/"')
+
+ def test_namespaced_system_doctype(self):
+ # We can handle a namespaced doctype with a system ID.
+ self.assertDoctypeHandled('xsl:stylesheet SYSTEM "htmlent.dtd"')
+
+ def test_namespaced_public_doctype(self):
+ # Test a namespaced doctype with a public id.
+ self.assertDoctypeHandled('xsl:stylesheet PUBLIC "htmlent.dtd"')
+
+ def test_real_xhtml_document(self):
+ """A real XHTML document should come out more or less the same as it went in."""
+ markup = b"""<?xml version="1.0" encoding="utf-8"?>
+<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN">
+<html xmlns="http://www.w3.org/1999/xhtml">
+<head><title>Hello.</title></head>
+<body>Goodbye.</body>
+</html>"""
+ soup = self.soup(markup)
+ self.assertEqual(
+ soup.encode("utf-8").replace(b"\n", b""),
+ markup.replace(b"\n", b""))
+
+ def test_deepcopy(self):
+ """Make sure you can copy the tree builder.
+
+ This is important because the builder is part of a
+ BeautifulSoup object, and we want to be able to copy that.
+ """
+ copy.deepcopy(self.default_builder)
+
+ def test_p_tag_is_never_empty_element(self):
+ """A <p> tag is never designated as an empty-element tag.
+
+ Even if the markup shows it as an empty-element tag, it
+ shouldn't be presented that way.
+ """
+ soup = self.soup("<p/>")
+ self.assertFalse(soup.p.is_empty_element)
+ self.assertEqual(str(soup.p), "<p></p>")
+
+ def test_unclosed_tags_get_closed(self):
+ """A tag that's not closed by the end of the document should be closed.
+
+ This applies to all tags except empty-element tags.
+ """
+ self.assertSoupEquals("<p>", "<p></p>")
+ self.assertSoupEquals("<b>", "<b></b>")
+
+ self.assertSoupEquals("<br>", "<br/>")
+
+ def test_br_is_always_empty_element_tag(self):
+ """A <br> tag is designated as an empty-element tag.
+
+ Some parsers treat <br></br> as one <br/> tag, some parsers as
+ two tags, but it should always be an empty-element tag.
+ """
+ soup = self.soup("<br></br>")
+ self.assertTrue(soup.br.is_empty_element)
+ self.assertEqual(str(soup.br), "<br/>")
+
+ def test_nested_formatting_elements(self):
+ self.assertSoupEquals("<em><em></em></em>")
+
+ def test_comment(self):
+ # Comments are represented as Comment objects.
+ markup = "<p>foo<!--foobar-->baz</p>"
+ self.assertSoupEquals(markup)
+
+ soup = self.soup(markup)
+ comment = soup.find(text="foobar")
+ self.assertEqual(comment.__class__, Comment)
+
+ # The comment is properly integrated into the tree.
+ foo = soup.find(text="foo")
+ self.assertEqual(comment, foo.next_element)
+ baz = soup.find(text="baz")
+ self.assertEqual(comment, baz.previous_element)
+
+ def test_preserved_whitespace_in_pre_and_textarea(self):
+ """Whitespace must be preserved in <pre> and <textarea> tags."""
+ self.assertSoupEquals("<pre> </pre>")
+ self.assertSoupEquals("<textarea> woo </textarea>")
+
+ def test_nested_inline_elements(self):
+ """Inline elements can be nested indefinitely."""
+ b_tag = "<b>Inside a B tag</b>"
+ self.assertSoupEquals(b_tag)
+
+ nested_b_tag = "<p>A <i>nested <b>tag</b></i></p>"
+ self.assertSoupEquals(nested_b_tag)
+
+ double_nested_b_tag = "<p>A <a>doubly <i>nested <b>tag</b></i></a></p>"
+        self.assertSoupEquals(double_nested_b_tag)
+
+ def test_nested_block_level_elements(self):
+ """Block elements can be nested."""
+ soup = self.soup('<blockquote><p><b>Foo</b></p></blockquote>')
+ blockquote = soup.blockquote
+ self.assertEqual(blockquote.p.b.string, 'Foo')
+ self.assertEqual(blockquote.b.string, 'Foo')
+
+ def test_correctly_nested_tables(self):
+ """One table can go inside another one."""
+ markup = ('<table id="1">'
+ '<tr>'
+ "<td>Here's another table:"
+ '<table id="2">'
+ '<tr><td>foo</td></tr>'
+ '</table></td>')
+
+ self.assertSoupEquals(
+ markup,
+ '<table id="1"><tr><td>Here\'s another table:'
+ '<table id="2"><tr><td>foo</td></tr></table>'
+ '</td></tr></table>')
+
+ self.assertSoupEquals(
+ "<table><thead><tr><td>Foo</td></tr></thead>"
+ "<tbody><tr><td>Bar</td></tr></tbody>"
+ "<tfoot><tr><td>Baz</td></tr></tfoot></table>")
+
+ def test_deeply_nested_multivalued_attribute(self):
+ # html5lib can set the attributes of the same tag many times
+ # as it rearranges the tree. This has caused problems with
+ # multivalued attributes.
+ markup = '<table><div><div class="css"></div></div></table>'
+ soup = self.soup(markup)
+ self.assertEqual(["css"], soup.div.div['class'])
+
+ def test_angle_brackets_in_attribute_values_are_escaped(self):
+        self.assertSoupEquals('<a b="<a>"></a>', '<a b="&lt;a&gt;"></a>')
+
+ def test_entities_in_attributes_converted_to_unicode(self):
+ expect = u'<p id="pi\N{LATIN SMALL LETTER N WITH TILDE}ata"></p>'
+        self.assertSoupEquals('<p id="pi&#241;ata"></p>', expect)
+        self.assertSoupEquals('<p id="pi&#xf1;ata"></p>', expect)
+        self.assertSoupEquals('<p id="pi&#Xf1;ata"></p>', expect)
+        self.assertSoupEquals('<p id="pi&ntilde;ata"></p>', expect)
+
+ def test_entities_in_text_converted_to_unicode(self):
+ expect = u'<p>pi\N{LATIN SMALL LETTER N WITH TILDE}ata</p>'
+        self.assertSoupEquals("<p>pi&#241;ata</p>", expect)
+        self.assertSoupEquals("<p>pi&#xf1;ata</p>", expect)
+        self.assertSoupEquals("<p>pi&#Xf1;ata</p>", expect)
+        self.assertSoupEquals("<p>pi&ntilde;ata</p>", expect)
+
+ def test_quot_entity_converted_to_quotation_mark(self):
+        self.assertSoupEquals("<p>I said &quot;good day!&quot;</p>",
+                              '<p>I said "good day!"</p>')
+
+ def test_out_of_range_entity(self):
+ expect = u"\N{REPLACEMENT CHARACTER}"
+        self.assertSoupEquals("&#10000000000000;", expect)
+        self.assertSoupEquals("&#x10000000000000;", expect)
+        self.assertSoupEquals("&#1000000000;", expect)
+
+ def test_multipart_strings(self):
+ "Mostly to prevent a recurrence of a bug in the html5lib treebuilder."
+ soup = self.soup("<html><h2>\nfoo</h2><p></p></html>")
+ self.assertEqual("p", soup.h2.string.next_element.name)
+ self.assertEqual("p", soup.p.name)
+
+ def test_basic_namespaces(self):
+ """Parsers don't need to *understand* namespaces, but at the
+ very least they should not choke on namespaces or lose
+ data."""
+
+ markup = b'<html xmlns="http://www.w3.org/1999/xhtml" xmlns:mathml="http://www.w3.org/1998/Math/MathML" xmlns:svg="http://www.w3.org/2000/svg"><head></head><body><mathml:msqrt>4</mathml:msqrt><b svg:fill="red"></b></body></html>'
+ soup = self.soup(markup)
+ self.assertEqual(markup, soup.encode())
+ html = soup.html
+ self.assertEqual('http://www.w3.org/1999/xhtml', soup.html['xmlns'])
+ self.assertEqual(
+ 'http://www.w3.org/1998/Math/MathML', soup.html['xmlns:mathml'])
+ self.assertEqual(
+ 'http://www.w3.org/2000/svg', soup.html['xmlns:svg'])
+
+ def test_multivalued_attribute_value_becomes_list(self):
+ markup = b'<a class="foo bar">'
+ soup = self.soup(markup)
+ self.assertEqual(['foo', 'bar'], soup.a['class'])
+
+ #
+ # Generally speaking, tests below this point are more tests of
+ # Beautiful Soup than tests of the tree builders. But parsers are
+ # weird, so we run these tests separately for every tree builder
+ # to detect any differences between them.
+ #
+
+ def test_can_parse_unicode_document(self):
+ # A seemingly innocuous document... but it's in Unicode! And
+ # it contains characters that can't be represented in the
+ # encoding found in the declaration! The horror!
+ markup = u'<html><head><meta encoding="euc-jp"></head><body>Sacr\N{LATIN SMALL LETTER E WITH ACUTE} bleu!</body>'
+ soup = self.soup(markup)
+ self.assertEqual(u'Sacr\xe9 bleu!', soup.body.string)
+
+ def test_soupstrainer(self):
+ """Parsers should be able to work with SoupStrainers."""
+ strainer = SoupStrainer("b")
+ soup = self.soup("A <b>bold</b> <meta/> <i>statement</i>",
+ parse_only=strainer)
+ self.assertEqual(soup.decode(), "<b>bold</b>")
+
+ def test_single_quote_attribute_values_become_double_quotes(self):
+ self.assertSoupEquals("<foo attr='bar'></foo>",
+ '<foo attr="bar"></foo>')
+
+ def test_attribute_values_with_nested_quotes_are_left_alone(self):
+ text = """<foo attr='bar "brawls" happen'>a</foo>"""
+ self.assertSoupEquals(text)
+
+ def test_attribute_values_with_double_nested_quotes_get_quoted(self):
+ text = """<foo attr='bar "brawls" happen'>a</foo>"""
+ soup = self.soup(text)
+ soup.foo['attr'] = 'Brawls happen at "Bob\'s Bar"'
+ self.assertSoupEquals(
+ soup.foo.decode(),
+ """<foo attr="Brawls happen at "Bob\'s Bar"">a</foo>""")
+
+ def test_ampersand_in_attribute_value_gets_escaped(self):
+        self.assertSoupEquals('<this is="really messed up & stuff"></this>',
+                              '<this is="really messed up &amp; stuff"></this>')
+
+        self.assertSoupEquals(
+            '<a href="http://example.org?a=1&b=2;3">foo</a>',
+            '<a href="http://example.org?a=1&amp;b=2;3">foo</a>')
+
+ def test_escaped_ampersand_in_attribute_value_is_left_alone(self):
+        self.assertSoupEquals('<a href="http://example.org?a=1&amp;b=2;3"></a>')
+
+ def test_entities_in_strings_converted_during_parsing(self):
+ # Both XML and HTML entities are converted to Unicode characters
+ # during parsing.
+        text = "<p>&lt;&lt;sacr&eacute;&#32;bleu!&gt;&gt;</p>"
+        expected = u"<p>&lt;&lt;sacr\N{LATIN SMALL LETTER E WITH ACUTE} bleu!&gt;&gt;</p>"
+ self.assertSoupEquals(text, expected)
+
+ def test_smart_quotes_converted_on_the_way_in(self):
+ # Microsoft smart quotes are converted to Unicode characters during
+ # parsing.
+ quote = b"<p>\x91Foo\x92</p>"
+ soup = self.soup(quote)
+ self.assertEqual(
+ soup.p.string,
+ u"\N{LEFT SINGLE QUOTATION MARK}Foo\N{RIGHT SINGLE QUOTATION MARK}")
+
+ def test_non_breaking_spaces_converted_on_the_way_in(self):
+ soup = self.soup("<a> </a>")
+ self.assertEqual(soup.a.string, u"\N{NO-BREAK SPACE}" * 2)
+
+ def test_entities_converted_on_the_way_out(self):
+        text = "<p>&lt;&lt;sacr&eacute;&#32;bleu!&gt;&gt;</p>"
+        expected = u"<p>&lt;&lt;sacr\N{LATIN SMALL LETTER E WITH ACUTE} bleu!&gt;&gt;</p>".encode("utf-8")
+ soup = self.soup(text)
+ self.assertEqual(soup.p.encode("utf-8"), expected)
+
+ def test_real_iso_latin_document(self):
+ # Smoke test of interrelated functionality, using an
+ # easy-to-understand document.
+
+ # Here it is in Unicode. Note that it claims to be in ISO-Latin-1.
+ unicode_html = u'<html><head><meta content="text/html; charset=ISO-Latin-1" http-equiv="Content-type"/></head><body><p>Sacr\N{LATIN SMALL LETTER E WITH ACUTE} bleu!</p></body></html>'
+
+ # That's because we're going to encode it into ISO-Latin-1, and use
+ # that to test.
+ iso_latin_html = unicode_html.encode("iso-8859-1")
+
+ # Parse the ISO-Latin-1 HTML.
+ soup = self.soup(iso_latin_html)
+ # Encode it to UTF-8.
+ result = soup.encode("utf-8")
+
+ # What do we expect the result to look like? Well, it would
+ # look like unicode_html, except that the META tag would say
+ # UTF-8 instead of ISO-Latin-1.
+ expected = unicode_html.replace("ISO-Latin-1", "utf-8")
+
+ # And, of course, it would be in UTF-8, not Unicode.
+ expected = expected.encode("utf-8")
+
+ # Ta-da!
+ self.assertEqual(result, expected)
+
+ def test_real_shift_jis_document(self):
+ # Smoke test to make sure the parser can handle a document in
+ # Shift-JIS encoding, without choking.
+ shift_jis_html = (
+ b'<html><head></head><body><pre>'
+ b'\x82\xb1\x82\xea\x82\xcdShift-JIS\x82\xc5\x83R\x81[\x83f'
+ b'\x83B\x83\x93\x83O\x82\xb3\x82\xea\x82\xbd\x93\xfa\x96{\x8c'
+ b'\xea\x82\xcc\x83t\x83@\x83C\x83\x8b\x82\xc5\x82\xb7\x81B'
+ b'</pre></body></html>')
+ unicode_html = shift_jis_html.decode("shift-jis")
+ soup = self.soup(unicode_html)
+
+ # Make sure the parse tree is correctly encoded to various
+ # encodings.
+ self.assertEqual(soup.encode("utf-8"), unicode_html.encode("utf-8"))
+ self.assertEqual(soup.encode("euc_jp"), unicode_html.encode("euc_jp"))
+
+ def test_real_hebrew_document(self):
+        # A real-world test to make sure we can convert ISO-8859-8 (a
+        # Hebrew encoding) to UTF-8.
+ hebrew_document = b'<html><head><title>Hebrew (ISO 8859-8) in Visual Directionality</title></head><body><h1>Hebrew (ISO 8859-8) in Visual Directionality</h1>\xed\xe5\xec\xf9</body></html>'
+ soup = self.soup(
+ hebrew_document, from_encoding="iso8859-8")
+ self.assertEqual(soup.original_encoding, 'iso8859-8')
+ self.assertEqual(
+ soup.encode('utf-8'),
+ hebrew_document.decode("iso8859-8").encode("utf-8"))
+
+ def test_meta_tag_reflects_current_encoding(self):
+ # Here's the <meta> tag saying that a document is
+ # encoded in Shift-JIS.
+ meta_tag = ('<meta content="text/html; charset=x-sjis" '
+ 'http-equiv="Content-type"/>')
+
+ # Here's a document incorporating that meta tag.
+ shift_jis_html = (
+ '<html><head>\n%s\n'
+ '<meta http-equiv="Content-language" content="ja"/>'
+ '</head><body>Shift-JIS markup goes here.') % meta_tag
+ soup = self.soup(shift_jis_html)
+
+ # Parse the document, and the charset is seemingly unaffected.
+ parsed_meta = soup.find('meta', {'http-equiv': 'Content-type'})
+ content = parsed_meta['content']
+ self.assertEqual('text/html; charset=x-sjis', content)
+
+ # But that value is actually a ContentMetaAttributeValue object.
+ self.assertTrue(isinstance(content, ContentMetaAttributeValue))
+
+ # And it will take on a value that reflects its current
+ # encoding.
+ self.assertEqual('text/html; charset=utf8', content.encode("utf8"))
+
+ # For the rest of the story, see TestSubstitutions in
+ # test_tree.py.
+
+ def test_html5_style_meta_tag_reflects_current_encoding(self):
+ # Here's the <meta> tag saying that a document is
+ # encoded in Shift-JIS.
+ meta_tag = ('<meta id="encoding" charset="x-sjis" />')
+
+ # Here's a document incorporating that meta tag.
+ shift_jis_html = (
+ '<html><head>\n%s\n'
+ '<meta http-equiv="Content-language" content="ja"/>'
+ '</head><body>Shift-JIS markup goes here.') % meta_tag
+ soup = self.soup(shift_jis_html)
+
+ # Parse the document, and the charset is seemingly unaffected.
+ parsed_meta = soup.find('meta', id="encoding")
+ charset = parsed_meta['charset']
+ self.assertEqual('x-sjis', charset)
+
+ # But that value is actually a CharsetMetaAttributeValue object.
+ self.assertTrue(isinstance(charset, CharsetMetaAttributeValue))
+
+ # And it will take on a value that reflects its current
+ # encoding.
+ self.assertEqual('utf8', charset.encode("utf8"))
+
+ def test_tag_with_no_attributes_can_have_attributes_added(self):
+ data = self.soup("<a>text</a>")
+ data.a['foo'] = 'bar'
+ self.assertEqual('<a foo="bar">text</a>', data.a.decode())
+
+class XMLTreeBuilderSmokeTest(object):
+
+ def test_docstring_generated(self):
+ soup = self.soup("<root/>")
+ self.assertEqual(
+ soup.encode(), b'<?xml version="1.0" encoding="utf-8"?>\n<root/>')
+
+ def test_real_xhtml_document(self):
+ """A real XHTML document should come out *exactly* the same as it went in."""
+ markup = b"""<?xml version="1.0" encoding="utf-8"?>
+<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN">
+<html xmlns="http://www.w3.org/1999/xhtml">
+<head><title>Hello.</title></head>
+<body>Goodbye.</body>
+</html>"""
+ soup = self.soup(markup)
+ self.assertEqual(
+ soup.encode("utf-8"), markup)
+
+ def test_formatter_processes_script_tag_for_xml_documents(self):
+ doc = """
+ <script type="text/javascript">
+ </script>
+"""
+ soup = BeautifulSoup(doc, "xml")
+ # lxml would have stripped this while parsing, but we can add
+ # it later.
+ soup.script.string = 'console.log("< < hey > > ");'
+ encoded = soup.encode()
+        self.assertTrue(b"&lt; &lt; hey &gt; &gt;" in encoded)
+
+ def test_can_parse_unicode_document(self):
+ markup = u'<?xml version="1.0" encoding="euc-jp"><root>Sacr\N{LATIN SMALL LETTER E WITH ACUTE} bleu!</root>'
+ soup = self.soup(markup)
+ self.assertEqual(u'Sacr\xe9 bleu!', soup.root.string)
+
+ def test_popping_namespaced_tag(self):
+ markup = '<rss xmlns:dc="foo"><dc:creator>b</dc:creator><dc:date>2012-07-02T20:33:42Z</dc:date><dc:rights>c</dc:rights><image>d</image></rss>'
+ soup = self.soup(markup)
+ self.assertEqual(
+ unicode(soup.rss), markup)
+
+ def test_docstring_includes_correct_encoding(self):
+ soup = self.soup("<root/>")
+ self.assertEqual(
+ soup.encode("latin1"),
+ b'<?xml version="1.0" encoding="latin1"?>\n<root/>')
+
+ def test_large_xml_document(self):
+ """A large XML document should come out the same as it went in."""
+ markup = (b'<?xml version="1.0" encoding="utf-8"?>\n<root>'
+ + b'0' * (2**12)
+ + b'</root>')
+ soup = self.soup(markup)
+ self.assertEqual(soup.encode("utf-8"), markup)
+
+
+ def test_tags_are_empty_element_if_and_only_if_they_are_empty(self):
+ self.assertSoupEquals("<p>", "<p/>")
+ self.assertSoupEquals("<p>foo</p>")
+
+ def test_namespaces_are_preserved(self):
+ markup = '<root xmlns:a="http://example.com/" xmlns:b="http://example.net/"><a:foo>This tag is in the a namespace</a:foo><b:foo>This tag is in the b namespace</b:foo></root>'
+ soup = self.soup(markup)
+ root = soup.root
+ self.assertEqual("http://example.com/", root['xmlns:a'])
+ self.assertEqual("http://example.net/", root['xmlns:b'])
+
+ def test_closing_namespaced_tag(self):
+ markup = '<p xmlns:dc="http://purl.org/dc/elements/1.1/"><dc:date>20010504</dc:date></p>'
+ soup = self.soup(markup)
+ self.assertEqual(unicode(soup.p), markup)
+
+ def test_namespaced_attributes(self):
+ markup = '<foo xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><bar xsi:schemaLocation="http://www.example.com"/></foo>'
+ soup = self.soup(markup)
+ self.assertEqual(unicode(soup.foo), markup)
+
+ def test_namespaced_attributes_xml_namespace(self):
+ markup = '<foo xml:lang="fr">bar</foo>'
+ soup = self.soup(markup)
+ self.assertEqual(unicode(soup.foo), markup)
+
+class HTML5TreeBuilderSmokeTest(HTMLTreeBuilderSmokeTest):
+ """Smoke test for a tree builder that supports HTML5."""
+
+ def test_real_xhtml_document(self):
+ # Since XHTML is not HTML5, HTML5 parsers are not tested to handle
+ # XHTML documents in any particular way.
+ pass
+
+ def test_html_tags_have_namespace(self):
+ markup = "<a>"
+ soup = self.soup(markup)
+ self.assertEqual("http://www.w3.org/1999/xhtml", soup.a.namespace)
+
+ def test_svg_tags_have_namespace(self):
+ markup = '<svg><circle/></svg>'
+ soup = self.soup(markup)
+ namespace = "http://www.w3.org/2000/svg"
+ self.assertEqual(namespace, soup.svg.namespace)
+ self.assertEqual(namespace, soup.circle.namespace)
+
+
+ def test_mathml_tags_have_namespace(self):
+ markup = '<math><msqrt>5</msqrt></math>'
+ soup = self.soup(markup)
+ namespace = 'http://www.w3.org/1998/Math/MathML'
+ self.assertEqual(namespace, soup.math.namespace)
+ self.assertEqual(namespace, soup.msqrt.namespace)
+
+ def test_xml_declaration_becomes_comment(self):
+ markup = '<?xml version="1.0" encoding="utf-8"?><html></html>'
+ soup = self.soup(markup)
+ self.assertTrue(isinstance(soup.contents[0], Comment))
+ self.assertEqual(soup.contents[0], '?xml version="1.0" encoding="utf-8"?')
+ self.assertEqual("html", soup.contents[0].next_element.name)
+
+def skipIf(condition, reason):
+ def nothing(test, *args, **kwargs):
+ return None
+
+ def decorator(test_item):
+ if condition:
+ return nothing
+ else:
+ return test_item
+
+ return decorator
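+
+# Unlike unittest.skipIf, this helper simply replaces the decorated test
+# with a no-op when `condition` is true. A minimal sketch (assumed usage):
+#
+#   @skipIf(not LXML_PRESENT, "lxml seems not to be present")
+#   def test_needs_lxml(self):
+#       ...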
diff --git a/bitbake/lib/bs4/tests/__init__.py b/bitbake/lib/bs4/tests/__init__.py
new file mode 100644
index 0000000..142c8cc
--- /dev/null
+++ b/bitbake/lib/bs4/tests/__init__.py
@@ -0,0 +1 @@
+"The beautifulsoup tests."
diff --git a/bitbake/lib/bs4/tests/test_builder_registry.py b/bitbake/lib/bs4/tests/test_builder_registry.py
new file mode 100644
index 0000000..92ad10f
--- /dev/null
+++ b/bitbake/lib/bs4/tests/test_builder_registry.py
@@ -0,0 +1,141 @@
+"""Tests of the builder registry."""
+
+import unittest
+
+from bs4 import BeautifulSoup
+from bs4.builder import (
+ builder_registry as registry,
+ HTMLParserTreeBuilder,
+ TreeBuilderRegistry,
+)
+
+try:
+ from bs4.builder import HTML5TreeBuilder
+ HTML5LIB_PRESENT = True
+except ImportError:
+ HTML5LIB_PRESENT = False
+
+try:
+ from bs4.builder import (
+ LXMLTreeBuilderForXML,
+ LXMLTreeBuilder,
+ )
+ LXML_PRESENT = True
+except ImportError:
+ LXML_PRESENT = False
+
+
+class BuiltInRegistryTest(unittest.TestCase):
+ """Test the built-in registry with the default builders registered."""
+
+ def test_combination(self):
+ if LXML_PRESENT:
+ self.assertEqual(registry.lookup('fast', 'html'),
+ LXMLTreeBuilder)
+
+ if LXML_PRESENT:
+ self.assertEqual(registry.lookup('permissive', 'xml'),
+ LXMLTreeBuilderForXML)
+ self.assertEqual(registry.lookup('strict', 'html'),
+ HTMLParserTreeBuilder)
+ if HTML5LIB_PRESENT:
+ self.assertEqual(registry.lookup('html5lib', 'html'),
+ HTML5TreeBuilder)
+
+ def test_lookup_by_markup_type(self):
+ if LXML_PRESENT:
+ self.assertEqual(registry.lookup('html'), LXMLTreeBuilder)
+ self.assertEqual(registry.lookup('xml'), LXMLTreeBuilderForXML)
+ else:
+ self.assertEqual(registry.lookup('xml'), None)
+ if HTML5LIB_PRESENT:
+ self.assertEqual(registry.lookup('html'), HTML5TreeBuilder)
+ else:
+ self.assertEqual(registry.lookup('html'), HTMLParserTreeBuilder)
+
+ def test_named_library(self):
+ if LXML_PRESENT:
+ self.assertEqual(registry.lookup('lxml', 'xml'),
+ LXMLTreeBuilderForXML)
+ self.assertEqual(registry.lookup('lxml', 'html'),
+ LXMLTreeBuilder)
+ if HTML5LIB_PRESENT:
+ self.assertEqual(registry.lookup('html5lib'),
+ HTML5TreeBuilder)
+
+ self.assertEqual(registry.lookup('html.parser'),
+ HTMLParserTreeBuilder)
+
+ def test_beautifulsoup_constructor_does_lookup(self):
+ # You can pass in a string.
+ BeautifulSoup("", features="html")
+ # Or a list of strings.
+ BeautifulSoup("", features=["html", "fast"])
+
+ # You'll get an exception if BS can't find an appropriate
+ # builder.
+ self.assertRaises(ValueError, BeautifulSoup,
+ "", features="no-such-feature")
+
+class RegistryTest(unittest.TestCase):
+ """Test the TreeBuilderRegistry class in general."""
+
+ def setUp(self):
+ self.registry = TreeBuilderRegistry()
+
+ def builder_for_features(self, *feature_list):
+ cls = type('Builder_' + '_'.join(feature_list),
+ (object,), {'features' : feature_list})
+
+ self.registry.register(cls)
+ return cls
+
+ def test_register_with_no_features(self):
+ builder = self.builder_for_features()
+
+ # Since the builder advertises no features, you can't find it
+ # by looking up features.
+ self.assertEqual(self.registry.lookup('foo'), None)
+
+ # But you can find it by doing a lookup with no features, if
+ # this happens to be the only registered builder.
+ self.assertEqual(self.registry.lookup(), builder)
+
+ def test_register_with_features_makes_lookup_succeed(self):
+ builder = self.builder_for_features('foo', 'bar')
+ self.assertEqual(self.registry.lookup('foo'), builder)
+ self.assertEqual(self.registry.lookup('bar'), builder)
+
+ def test_lookup_fails_when_no_builder_implements_feature(self):
+ builder = self.builder_for_features('foo', 'bar')
+ self.assertEqual(self.registry.lookup('baz'), None)
+
+ def test_lookup_gets_most_recent_registration_when_no_feature_specified(self):
+ builder1 = self.builder_for_features('foo')
+ builder2 = self.builder_for_features('bar')
+ self.assertEqual(self.registry.lookup(), builder2)
+
+ def test_lookup_fails_when_no_tree_builders_registered(self):
+ self.assertEqual(self.registry.lookup(), None)
+
+ def test_lookup_gets_most_recent_builder_supporting_all_features(self):
+ has_one = self.builder_for_features('foo')
+ has_the_other = self.builder_for_features('bar')
+ has_both_early = self.builder_for_features('foo', 'bar', 'baz')
+ has_both_late = self.builder_for_features('foo', 'bar', 'quux')
+ lacks_one = self.builder_for_features('bar')
+ has_the_other = self.builder_for_features('foo')
+
+ # There are two builders featuring 'foo' and 'bar', but
+ # the one that also features 'quux' was registered later.
+ self.assertEqual(self.registry.lookup('foo', 'bar'),
+ has_both_late)
+
+ # There is only one builder featuring 'foo', 'bar', and 'baz'.
+ self.assertEqual(self.registry.lookup('foo', 'bar', 'baz'),
+ has_both_early)
+
+ def test_lookup_fails_when_cannot_reconcile_requested_features(self):
+ builder1 = self.builder_for_features('foo', 'bar')
+ builder2 = self.builder_for_features('foo', 'baz')
+ self.assertEqual(self.registry.lookup('bar', 'baz'), None)
diff --git a/bitbake/lib/bs4/tests/test_docs.py b/bitbake/lib/bs4/tests/test_docs.py
new file mode 100644
index 0000000..5b9f677
--- /dev/null
+++ b/bitbake/lib/bs4/tests/test_docs.py
@@ -0,0 +1,36 @@
+"Test harness for doctests."
+
+# pylint: disable-msg=E0611,W0142
+
+__metaclass__ = type
+__all__ = [
+ 'additional_tests',
+ ]
+
+import atexit
+import doctest
+import os
+#from pkg_resources import (
+# resource_filename, resource_exists, resource_listdir, cleanup_resources)
+import unittest
+
+DOCTEST_FLAGS = (
+ doctest.ELLIPSIS |
+ doctest.NORMALIZE_WHITESPACE |
+ doctest.REPORT_NDIFF)
+
+
+# def additional_tests():
+# "Run the doc tests (README.txt and docs/*, if any exist)"
+# doctest_files = [
+# os.path.abspath(resource_filename('bs4', 'README.txt'))]
+# if resource_exists('bs4', 'docs'):
+# for name in resource_listdir('bs4', 'docs'):
+# if name.endswith('.txt'):
+# doctest_files.append(
+# os.path.abspath(
+# resource_filename('bs4', 'docs/%s' % name)))
+# kwargs = dict(module_relative=False, optionflags=DOCTEST_FLAGS)
+# atexit.register(cleanup_resources)
+# return unittest.TestSuite((
+# doctest.DocFileSuite(*doctest_files, **kwargs)))
diff --git a/bitbake/lib/bs4/tests/test_html5lib.py b/bitbake/lib/bs4/tests/test_html5lib.py
new file mode 100644
index 0000000..594c3e1
--- /dev/null
+++ b/bitbake/lib/bs4/tests/test_html5lib.py
@@ -0,0 +1,85 @@
+"""Tests to ensure that the html5lib tree builder generates good trees."""
+
+import warnings
+
+try:
+ from bs4.builder import HTML5TreeBuilder
+ HTML5LIB_PRESENT = True
+except ImportError, e:
+ HTML5LIB_PRESENT = False
+from bs4.element import SoupStrainer
+from bs4.testing import (
+ HTML5TreeBuilderSmokeTest,
+ SoupTest,
+ skipIf,
+)
+
+@skipIf(
+ not HTML5LIB_PRESENT,
+ "html5lib seems not to be present, not testing its tree builder.")
+class HTML5LibBuilderSmokeTest(SoupTest, HTML5TreeBuilderSmokeTest):
+ """See ``HTML5TreeBuilderSmokeTest``."""
+
+ @property
+ def default_builder(self):
+ return HTML5TreeBuilder()
+
+ def test_soupstrainer(self):
+ # The html5lib tree builder does not support SoupStrainers.
+ strainer = SoupStrainer("b")
+ markup = "<p>A <b>bold</b> statement.</p>"
+ with warnings.catch_warnings(record=True) as w:
+ soup = self.soup(markup, parse_only=strainer)
+ self.assertEqual(
+ soup.decode(), self.document_for(markup))
+
+ self.assertTrue(
+ "the html5lib tree builder doesn't support parse_only" in
+ str(w[0].message))
+
+ def test_correctly_nested_tables(self):
+ """html5lib inserts <tbody> tags where other parsers don't."""
+ markup = ('<table id="1">'
+ '<tr>'
+ "<td>Here's another table:"
+ '<table id="2">'
+ '<tr><td>foo</td></tr>'
+ '</table></td>')
+
+ self.assertSoupEquals(
+ markup,
+ '<table id="1"><tbody><tr><td>Here\'s another table:'
+ '<table id="2"><tbody><tr><td>foo</td></tr></tbody></table>'
+ '</td></tr></tbody></table>')
+
+ self.assertSoupEquals(
+ "<table><thead><tr><td>Foo</td></tr></thead>"
+ "<tbody><tr><td>Bar</td></tr></tbody>"
+ "<tfoot><tr><td>Baz</td></tr></tfoot></table>")
+
+ def test_xml_declaration_followed_by_doctype(self):
+ markup = '''<?xml version="1.0" encoding="utf-8"?>
+<!DOCTYPE html>
+<html>
+ <head>
+ </head>
+ <body>
+ <p>foo</p>
+ </body>
+</html>'''
+ soup = self.soup(markup)
+ # Verify that we can reach the <p> tag; this means the tree is connected.
+ self.assertEqual(b"<p>foo</p>", soup.p.encode())
+
+ def test_reparented_markup(self):
+ markup = '<p><em>foo</p>\n<p>bar<a></a></em></p>'
+ soup = self.soup(markup)
+ self.assertEqual(u"<body><p><em>foo</em></p><em>\n</em><p><em>bar<a></a></em></p></body>", soup.body.decode())
+ self.assertEqual(2, len(soup.find_all('p')))
+
+
+ def test_reparented_markup_ends_with_whitespace(self):
+ markup = '<p><em>foo</p>\n<p>bar<a></a></em></p>\n'
+ soup = self.soup(markup)
+ self.assertEqual(u"<body><p><em>foo</em></p><em>\n</em><p><em>bar<a></a></em></p>\n</body>", soup.body.decode())
+ self.assertEqual(2, len(soup.find_all('p')))
diff --git a/bitbake/lib/bs4/tests/test_htmlparser.py b/bitbake/lib/bs4/tests/test_htmlparser.py
new file mode 100644
index 0000000..bcb5ed2
--- /dev/null
+++ b/bitbake/lib/bs4/tests/test_htmlparser.py
@@ -0,0 +1,19 @@
+"""Tests to ensure that the html.parser tree builder generates good
+trees."""
+
+from bs4.testing import SoupTest, HTMLTreeBuilderSmokeTest
+from bs4.builder import HTMLParserTreeBuilder
+
+class HTMLParserTreeBuilderSmokeTest(SoupTest, HTMLTreeBuilderSmokeTest):
+
+ @property
+ def default_builder(self):
+ return HTMLParserTreeBuilder()
+
+ def test_namespaced_system_doctype(self):
+ # html.parser can't handle namespaced doctypes, so skip this one.
+ pass
+
+ def test_namespaced_public_doctype(self):
+ # html.parser can't handle namespaced doctypes, so skip this one.
+ pass
diff --git a/bitbake/lib/bs4/tests/test_lxml.py b/bitbake/lib/bs4/tests/test_lxml.py
new file mode 100644
index 0000000..2b2e9b7
--- /dev/null
+++ b/bitbake/lib/bs4/tests/test_lxml.py
@@ -0,0 +1,91 @@
+"""Tests to ensure that the lxml tree builder generates good trees."""
+
+import re
+import warnings
+
+try:
+ import lxml.etree
+ LXML_PRESENT = True
+ LXML_VERSION = lxml.etree.LXML_VERSION
+except ImportError, e:
+ LXML_PRESENT = False
+ LXML_VERSION = (0,)
+
+if LXML_PRESENT:
+ from bs4.builder import LXMLTreeBuilder, LXMLTreeBuilderForXML
+
+from bs4 import (
+ BeautifulSoup,
+ BeautifulStoneSoup,
+ )
+from bs4.element import Comment, Doctype, SoupStrainer
+from bs4.testing import skipIf
+from bs4.tests import test_htmlparser
+from bs4.testing import (
+ HTMLTreeBuilderSmokeTest,
+ XMLTreeBuilderSmokeTest,
+ SoupTest,
+ skipIf,
+)
+
+@skipIf(
+ not LXML_PRESENT,
+ "lxml seems not to be present, not testing its tree builder.")
+class LXMLTreeBuilderSmokeTest(SoupTest, HTMLTreeBuilderSmokeTest):
+ """See ``HTMLTreeBuilderSmokeTest``."""
+
+ @property
+ def default_builder(self):
+ return LXMLTreeBuilder()
+
+ def test_out_of_range_entity(self):
+        self.assertSoupEquals(
+            "<p>foo&#10000000000000;bar</p>", "<p>foobar</p>")
+        self.assertSoupEquals(
+            "<p>foo&#x10000000000000;bar</p>", "<p>foobar</p>")
+        self.assertSoupEquals(
+            "<p>foo&#1000000000;bar</p>", "<p>foobar</p>")
+
+ # In lxml < 2.3.5, an empty doctype causes a segfault. Skip this
+ # test if an old version of lxml is installed.
+
+ @skipIf(
+ not LXML_PRESENT or LXML_VERSION < (2,3,5,0),
+ "Skipping doctype test for old version of lxml to avoid segfault.")
+ def test_empty_doctype(self):
+ soup = self.soup("<!DOCTYPE>")
+ doctype = soup.contents[0]
+ self.assertEqual("", doctype.strip())
+
+ def test_beautifulstonesoup_is_xml_parser(self):
+ # Make sure that the deprecated BSS class uses an xml builder
+ # if one is installed.
+ with warnings.catch_warnings(record=True) as w:
+ soup = BeautifulStoneSoup("<b />")
+ self.assertEqual(u"<b/>", unicode(soup.b))
+ self.assertTrue("BeautifulStoneSoup class is deprecated" in str(w[0].message))
+
+ def test_real_xhtml_document(self):
+ """lxml strips the XML definition from an XHTML doc, which is fine."""
+ markup = b"""<?xml version="1.0" encoding="utf-8"?>
+<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN">
+<html xmlns="http://www.w3.org/1999/xhtml">
+<head><title>Hello.</title></head>
+<body>Goodbye.</body>
+</html>"""
+ soup = self.soup(markup)
+ self.assertEqual(
+ soup.encode("utf-8").replace(b"\n", b''),
+ markup.replace(b'\n', b'').replace(
+ b'<?xml version="1.0" encoding="utf-8"?>', b''))
+
+
+@skipIf(
+ not LXML_PRESENT,
+ "lxml seems not to be present, not testing its XML tree builder.")
+class LXMLXMLTreeBuilderSmokeTest(SoupTest, XMLTreeBuilderSmokeTest):
+ """See ``HTMLTreeBuilderSmokeTest``."""
+
+ @property
+ def default_builder(self):
+ return LXMLTreeBuilderForXML()
diff --git a/bitbake/lib/bs4/tests/test_soup.py b/bitbake/lib/bs4/tests/test_soup.py
new file mode 100644
index 0000000..47ac245
--- /dev/null
+++ b/bitbake/lib/bs4/tests/test_soup.py
@@ -0,0 +1,434 @@
+# -*- coding: utf-8 -*-
+"""Tests of Beautiful Soup as a whole."""
+
+import logging
+import unittest
+import sys
+import tempfile
+
+from bs4 import (
+ BeautifulSoup,
+ BeautifulStoneSoup,
+)
+from bs4.element import (
+ CharsetMetaAttributeValue,
+ ContentMetaAttributeValue,
+ SoupStrainer,
+ NamespacedAttribute,
+ )
+import bs4.dammit
+from bs4.dammit import (
+ EntitySubstitution,
+ UnicodeDammit,
+)
+from bs4.testing import (
+ SoupTest,
+ skipIf,
+)
+import warnings
+
+try:
+ from bs4.builder import LXMLTreeBuilder, LXMLTreeBuilderForXML
+ LXML_PRESENT = True
+except ImportError, e:
+ LXML_PRESENT = False
+
+PYTHON_2_PRE_2_7 = (sys.version_info < (2,7))
+PYTHON_3_PRE_3_2 = (sys.version_info[0] == 3 and sys.version_info < (3,2))
+
+class TestConstructor(SoupTest):
+
+ def test_short_unicode_input(self):
+ data = u"<h1>éé</h1>"
+ soup = self.soup(data)
+ self.assertEqual(u"éé", soup.h1.string)
+
+ def test_embedded_null(self):
+ data = u"<h1>foo\0bar</h1>"
+ soup = self.soup(data)
+ self.assertEqual(u"foo\0bar", soup.h1.string)
+
+
+class TestDeprecatedConstructorArguments(SoupTest):
+
+ def test_parseOnlyThese_renamed_to_parse_only(self):
+ with warnings.catch_warnings(record=True) as w:
+ soup = self.soup("<a><b></b></a>", parseOnlyThese=SoupStrainer("b"))
+ msg = str(w[0].message)
+ self.assertTrue("parseOnlyThese" in msg)
+ self.assertTrue("parse_only" in msg)
+ self.assertEqual(b"<b></b>", soup.encode())
+
+ def test_fromEncoding_renamed_to_from_encoding(self):
+ with warnings.catch_warnings(record=True) as w:
+ utf8 = b"\xc3\xa9"
+ soup = self.soup(utf8, fromEncoding="utf8")
+ msg = str(w[0].message)
+ self.assertTrue("fromEncoding" in msg)
+ self.assertTrue("from_encoding" in msg)
+ self.assertEqual("utf8", soup.original_encoding)
+
+ def test_unrecognized_keyword_argument(self):
+ self.assertRaises(
+ TypeError, self.soup, "<a>", no_such_argument=True)
+
+class TestWarnings(SoupTest):
+
+ def test_disk_file_warning(self):
+ filehandle = tempfile.NamedTemporaryFile()
+ filename = filehandle.name
+ try:
+ with warnings.catch_warnings(record=True) as w:
+ soup = self.soup(filename)
+ msg = str(w[0].message)
+ self.assertTrue("looks like a filename" in msg)
+ finally:
+ filehandle.close()
+
+ # The file no longer exists, so Beautiful Soup will no longer issue the warning.
+ with warnings.catch_warnings(record=True) as w:
+ soup = self.soup(filename)
+ self.assertEqual(0, len(w))
+
+ def test_url_warning(self):
+ with warnings.catch_warnings(record=True) as w:
+ soup = self.soup("http://www.crummy.com/")
+ msg = str(w[0].message)
+ self.assertTrue("looks like a URL" in msg)
+
+ with warnings.catch_warnings(record=True) as w:
+ soup = self.soup("http://www.crummy.com/ is great")
+ self.assertEqual(0, len(w))
+
+class TestSelectiveParsing(SoupTest):
+
+ def test_parse_with_soupstrainer(self):
+ markup = "No<b>Yes</b><a>No<b>Yes <c>Yes</c></b>"
+ strainer = SoupStrainer("b")
+ soup = self.soup(markup, parse_only=strainer)
+ self.assertEqual(soup.encode(), b"<b>Yes</b><b>Yes <c>Yes</c></b>")
+
+
+class TestEntitySubstitution(unittest.TestCase):
+ """Standalone tests of the EntitySubstitution class."""
+ def setUp(self):
+ self.sub = EntitySubstitution
+
+ def test_simple_html_substitution(self):
+        # Unicode characters corresponding to named HTML entities
+        # are substituted, and no others.
+        s = u"foo\u2200\N{SNOWMAN}\u00f5bar"
+        self.assertEqual(self.sub.substitute_html(s),
+                         u"foo&forall;\N{SNOWMAN}&otilde;bar")
+
+ def test_smart_quote_substitution(self):
+ # MS smart quotes are a common source of frustration, so we
+ # give them a special test.
+ quotes = b"\x91\x92foo\x93\x94"
+ dammit = UnicodeDammit(quotes)
+ self.assertEqual(self.sub.substitute_html(dammit.markup),
+                         "&lsquo;&rsquo;foo&ldquo;&rdquo;")
+
+ def test_xml_converstion_includes_no_quotes_if_make_quoted_attribute_is_false(self):
+ s = 'Welcome to "my bar"'
+ self.assertEqual(self.sub.substitute_xml(s, False), s)
+
+ def test_xml_attribute_quoting_normally_uses_double_quotes(self):
+ self.assertEqual(self.sub.substitute_xml("Welcome", True),
+ '"Welcome"')
+ self.assertEqual(self.sub.substitute_xml("Bob's Bar", True),
+ '"Bob\'s Bar"')
+
+ def test_xml_attribute_quoting_uses_single_quotes_when_value_contains_double_quotes(self):
+ s = 'Welcome to "my bar"'
+ self.assertEqual(self.sub.substitute_xml(s, True),
+ "'Welcome to \"my bar\"'")
+
+ def test_xml_attribute_quoting_escapes_single_quotes_when_value_contains_both_single_and_double_quotes(self):
+ s = 'Welcome to "Bob\'s Bar"'
+ self.assertEqual(
+ self.sub.substitute_xml(s, True),
+            '"Welcome to &quot;Bob\'s Bar&quot;"')
+
+ def test_xml_quotes_arent_escaped_when_value_is_not_being_quoted(self):
+ quoted = 'Welcome to "Bob\'s Bar"'
+ self.assertEqual(self.sub.substitute_xml(quoted), quoted)
+
+ def test_xml_quoting_handles_angle_brackets(self):
+ self.assertEqual(
+ self.sub.substitute_xml("foo<bar>"),
+            "foo&lt;bar&gt;")
+
+ def test_xml_quoting_handles_ampersands(self):
+        self.assertEqual(self.sub.substitute_xml("AT&T"), "AT&amp;T")
+
+ def test_xml_quoting_including_ampersands_when_they_are_part_of_an_entity(self):
+ self.assertEqual(
+            self.sub.substitute_xml("&Aacute;T&T"),
+            "&amp;Aacute;T&amp;T")
+
+ def test_xml_quoting_ignoring_ampersands_when_they_are_part_of_an_entity(self):
+ self.assertEqual(
+            self.sub.substitute_xml_containing_entities("&Aacute;T&T"),
+            "&Aacute;T&amp;T")
+
+ def test_quotes_not_html_substituted(self):
+ """There's no need to do this except inside attribute values."""
+ text = 'Bob\'s "bar"'
+ self.assertEqual(self.sub.substitute_html(text), text)
+
+
+class TestEncodingConversion(SoupTest):
+ # Test Beautiful Soup's ability to decode and encode from various
+ # encodings.
+
+ def setUp(self):
+ super(TestEncodingConversion, self).setUp()
+ self.unicode_data = u'<html><head><meta charset="utf-8"/></head><body><foo>Sacr\N{LATIN SMALL LETTER E WITH ACUTE} bleu!</foo></body></html>'
+ self.utf8_data = self.unicode_data.encode("utf-8")
+ # Just so you know what it looks like.
+ self.assertEqual(
+ self.utf8_data,
+ b'<html><head><meta charset="utf-8"/></head><body><foo>Sacr\xc3\xa9 bleu!</foo></body></html>')
+
+ def test_ascii_in_unicode_out(self):
+ # ASCII input is converted to Unicode. The original_encoding
+ # attribute is set to 'utf-8', a superset of ASCII.
+ chardet = bs4.dammit.chardet_dammit
+ logging.disable(logging.WARNING)
+ try:
+ def noop(str):
+ return None
+ # Disable chardet, which will realize that the ASCII is ASCII.
+ bs4.dammit.chardet_dammit = noop
+ ascii = b"<foo>a</foo>"
+ soup_from_ascii = self.soup(ascii)
+ unicode_output = soup_from_ascii.decode()
+ self.assertTrue(isinstance(unicode_output, unicode))
+ self.assertEqual(unicode_output, self.document_for(ascii.decode()))
+ self.assertEqual(soup_from_ascii.original_encoding.lower(), "utf-8")
+ finally:
+ logging.disable(logging.NOTSET)
+ bs4.dammit.chardet_dammit = chardet
+
+ def test_unicode_in_unicode_out(self):
+ # Unicode input is left alone. The original_encoding attribute
+ # is not set.
+ soup_from_unicode = self.soup(self.unicode_data)
+ self.assertEqual(soup_from_unicode.decode(), self.unicode_data)
+ self.assertEqual(soup_from_unicode.foo.string, u'Sacr\xe9 bleu!')
+ self.assertEqual(soup_from_unicode.original_encoding, None)
+
+ def test_utf8_in_unicode_out(self):
+ # UTF-8 input is converted to Unicode. The original_encoding
+ # attribute is set.
+ soup_from_utf8 = self.soup(self.utf8_data)
+ self.assertEqual(soup_from_utf8.decode(), self.unicode_data)
+ self.assertEqual(soup_from_utf8.foo.string, u'Sacr\xe9 bleu!')
+
+ def test_utf8_out(self):
+ # The internal data structures can be encoded as UTF-8.
+ soup_from_unicode = self.soup(self.unicode_data)
+ self.assertEqual(soup_from_unicode.encode('utf-8'), self.utf8_data)
+
+ @skipIf(
+ PYTHON_2_PRE_2_7 or PYTHON_3_PRE_3_2,
+ "Bad HTMLParser detected; skipping test of non-ASCII characters in attribute name.")
+ def test_attribute_name_containing_unicode_characters(self):
+ markup = u'<div><a \N{SNOWMAN}="snowman"></a></div>'
+ self.assertEqual(self.soup(markup).div.encode("utf8"), markup.encode("utf8"))
+
+class TestUnicodeDammit(unittest.TestCase):
+ """Standalone tests of UnicodeDammit."""
+
+ def test_unicode_input(self):
+ markup = u"I'm already Unicode! \N{SNOWMAN}"
+ dammit = UnicodeDammit(markup)
+ self.assertEqual(dammit.unicode_markup, markup)
+
+ def test_smart_quotes_to_unicode(self):
+ markup = b"<foo>\x91\x92\x93\x94</foo>"
+ dammit = UnicodeDammit(markup)
+ self.assertEqual(
+ dammit.unicode_markup, u"<foo>\u2018\u2019\u201c\u201d</foo>")
+
+ def test_smart_quotes_to_xml_entities(self):
+ markup = b"<foo>\x91\x92\x93\x94</foo>"
+ dammit = UnicodeDammit(markup, smart_quotes_to="xml")
+ self.assertEqual(
+            dammit.unicode_markup, "<foo>&#x2018;&#x2019;&#x201C;&#x201D;</foo>")
+
+ def test_smart_quotes_to_html_entities(self):
+ markup = b"<foo>\x91\x92\x93\x94</foo>"
+ dammit = UnicodeDammit(markup, smart_quotes_to="html")
+ self.assertEqual(
+            dammit.unicode_markup, "<foo>&lsquo;&rsquo;&ldquo;&rdquo;</foo>")
+
+ def test_smart_quotes_to_ascii(self):
+ markup = b"<foo>\x91\x92\x93\x94</foo>"
+ dammit = UnicodeDammit(markup, smart_quotes_to="ascii")
+ self.assertEqual(
+ dammit.unicode_markup, """<foo>''""</foo>""")
+
+ def test_detect_utf8(self):
+ utf8 = b"\xc3\xa9"
+ dammit = UnicodeDammit(utf8)
+ self.assertEqual(dammit.unicode_markup, u'\xe9')
+ self.assertEqual(dammit.original_encoding.lower(), 'utf-8')
+
+ def test_convert_hebrew(self):
+ hebrew = b"\xed\xe5\xec\xf9"
+ dammit = UnicodeDammit(hebrew, ["iso-8859-8"])
+ self.assertEqual(dammit.original_encoding.lower(), 'iso-8859-8')
+ self.assertEqual(dammit.unicode_markup, u'\u05dd\u05d5\u05dc\u05e9')
+
+ def test_dont_see_smart_quotes_where_there_are_none(self):
+ utf_8 = b"\343\202\261\343\203\274\343\202\277\343\202\244 Watch"
+ dammit = UnicodeDammit(utf_8)
+ self.assertEqual(dammit.original_encoding.lower(), 'utf-8')
+ self.assertEqual(dammit.unicode_markup.encode("utf-8"), utf_8)
+
+ def test_ignore_inappropriate_codecs(self):
+ utf8_data = u"Räksmörgås".encode("utf-8")
+ dammit = UnicodeDammit(utf8_data, ["iso-8859-8"])
+ self.assertEqual(dammit.original_encoding.lower(), 'utf-8')
+
+ def test_ignore_invalid_codecs(self):
+ utf8_data = u"Räksmörgås".encode("utf-8")
+ for bad_encoding in ['.utf8', '...', 'utF---16.!']:
+ dammit = UnicodeDammit(utf8_data, [bad_encoding])
+ self.assertEqual(dammit.original_encoding.lower(), 'utf-8')
+
+ def test_detect_html5_style_meta_tag(self):
+
+ for data in (
+ b'<html><meta charset="euc-jp" /></html>',
+ b"<html><meta charset='euc-jp' /></html>",
+ b"<html><meta charset=euc-jp /></html>",
+ b"<html><meta charset=euc-jp/></html>"):
+ dammit = UnicodeDammit(data, is_html=True)
+ self.assertEqual(
+ "euc-jp", dammit.original_encoding)
+
+ def test_last_ditch_entity_replacement(self):
+ # This is a UTF-8 document that contains bytestrings
+ # completely incompatible with UTF-8 (ie. encoded with some other
+ # encoding).
+ #
+ # Since there is no consistent encoding for the document,
+ # Unicode, Dammit will eventually encode the document as UTF-8
+ # and encode the incompatible characters as REPLACEMENT
+ # CHARACTER.
+ #
+ # If chardet is installed, it will detect that the document
+ # can be converted into ISO-8859-1 without errors. This happens
+ # to be the wrong encoding, but it is a consistent encoding, so the
+ # code we're testing here won't run.
+ #
+ # So we temporarily disable chardet if it's present.
+ doc = b"""\357\273\277<?xml version="1.0" encoding="UTF-8"?>
+<html><b>\330\250\330\252\330\261</b>
+<i>\310\322\321\220\312\321\355\344</i></html>"""
+ chardet = bs4.dammit.chardet_dammit
+ logging.disable(logging.WARNING)
+ try:
+ def noop(str):
+ return None
+ bs4.dammit.chardet_dammit = noop
+ dammit = UnicodeDammit(doc)
+ self.assertEqual(True, dammit.contains_replacement_characters)
+ self.assertTrue(u"\ufffd" in dammit.unicode_markup)
+
+ soup = BeautifulSoup(doc, "html.parser")
+ self.assertTrue(soup.contains_replacement_characters)
+ finally:
+ logging.disable(logging.NOTSET)
+ bs4.dammit.chardet_dammit = chardet
+
+ def test_byte_order_mark_removed(self):
+ # A document written in UTF-16LE will have its byte order marker stripped.
+ data = b'\xff\xfe<\x00a\x00>\x00\xe1\x00\xe9\x00<\x00/\x00a\x00>\x00'
+ dammit = UnicodeDammit(data)
+ self.assertEqual(u"<a>áé</a>", dammit.unicode_markup)
+ self.assertEqual("utf-16le", dammit.original_encoding)
+
+ def test_detwingle(self):
+ # Here's a UTF8 document.
+ utf8 = (u"\N{SNOWMAN}" * 3).encode("utf8")
+
+ # Here's a Windows-1252 document.
+ windows_1252 = (
+ u"\N{LEFT DOUBLE QUOTATION MARK}Hi, I like Windows!"
+ u"\N{RIGHT DOUBLE QUOTATION MARK}").encode("windows_1252")
+
+ # Through some unholy alchemy, they've been stuck together.
+ doc = utf8 + windows_1252 + utf8
+
+ # The document can't be turned into UTF-8:
+ self.assertRaises(UnicodeDecodeError, doc.decode, "utf8")
+
+ # Unicode, Dammit thinks the whole document is Windows-1252,
+ # and decodes it into "☃☃☃“Hi, I like Windows!”☃☃☃"
+
+ # But if we run it through fix_embedded_windows_1252, it's fixed:
+
+ fixed = UnicodeDammit.detwingle(doc)
+ self.assertEqual(
+ u"☃☃☃“Hi, I like Windows!”☃☃☃", fixed.decode("utf8"))
+
+ def test_detwingle_ignores_multibyte_characters(self):
+ # Each of these characters has a UTF-8 representation ending
+ # in \x93. \x93 is a smart quote if interpreted as
+ # Windows-1252. But our code knows to skip over multibyte
+ # UTF-8 characters, so they'll survive the process unscathed.
+ for tricky_unicode_char in (
+ u"\N{LATIN SMALL LIGATURE OE}", # 2-byte char '\xc5\x93'
+ u"\N{LATIN SUBSCRIPT SMALL LETTER X}", # 3-byte char '\xe2\x82\x93'
+ u"\xf0\x90\x90\x93", # This is a CJK character, not sure which one.
+ ):
+ input = tricky_unicode_char.encode("utf8")
+ self.assertTrue(input.endswith(b'\x93'))
+ output = UnicodeDammit.detwingle(input)
+ self.assertEqual(output, input)
+
+class TestNamedspacedAttribute(SoupTest):
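+    # NamespacedAttribute is a string subclass built from a prefix, a name
+    # and an optional namespace URI; the string value is the colon-joined
+    # prefix and name, and the namespace plays no part in equality.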
+
+ def test_name_may_be_none(self):
+ a = NamespacedAttribute("xmlns", None)
+ self.assertEqual(a, "xmlns")
+
+ def test_attribute_is_equivalent_to_colon_separated_string(self):
+ a = NamespacedAttribute("a", "b")
+ self.assertEqual("a:b", a)
+
+ def test_attributes_are_equivalent_if_prefix_and_name_identical(self):
+ a = NamespacedAttribute("a", "b", "c")
+ b = NamespacedAttribute("a", "b", "c")
+ self.assertEqual(a, b)
+
+ # The actual namespace is not considered.
+ c = NamespacedAttribute("a", "b", None)
+ self.assertEqual(a, c)
+
+ # But name and prefix are important.
+ d = NamespacedAttribute("a", "z", "c")
+ self.assertNotEqual(a, d)
+
+ e = NamespacedAttribute("z", "b", "c")
+ self.assertNotEqual(a, e)
+
+
+class TestAttributeValueWithCharsetSubstitution(unittest.TestCase):
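+    # CharsetMetaAttributeValue and ContentMetaAttributeValue are string
+    # subclasses used for the charset-bearing attributes of <meta> tags;
+    # they keep the original value, and encoding them substitutes in the
+    # encoding the document is being converted to.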
+
+    def test_charset_meta_attribute_value(self):
+ value = CharsetMetaAttributeValue("euc-jp")
+ self.assertEqual("euc-jp", value)
+ self.assertEqual("euc-jp", value.original_value)
+ self.assertEqual("utf8", value.encode("utf8"))
+
+
+ def test_content_meta_attribute_value(self):
+ value = ContentMetaAttributeValue("text/html; charset=euc-jp")
+ self.assertEqual("text/html; charset=euc-jp", value)
+ self.assertEqual("text/html; charset=euc-jp", value.original_value)
+ self.assertEqual("text/html; charset=utf8", value.encode("utf8"))
diff --git a/bitbake/lib/bs4/tests/test_tree.py b/bitbake/lib/bs4/tests/test_tree.py
new file mode 100644
index 0000000..f8515c0
--- /dev/null
+++ b/bitbake/lib/bs4/tests/test_tree.py
@@ -0,0 +1,1829 @@
+# -*- coding: utf-8 -*-
+"""Tests for Beautiful Soup's tree traversal methods.
+
+The tree traversal methods are the main advantage of using Beautiful
+Soup over just using a parser.
+
+Different parsers will build different Beautiful Soup trees given the
+same markup, but all Beautiful Soup trees can be traversed with the
+methods tested here.
+"""
+
+import copy
+import pickle
+import re
+import warnings
+from bs4 import BeautifulSoup
+from bs4.builder import (
+ builder_registry,
+ HTMLParserTreeBuilder,
+)
+from bs4.element import (
+ CData,
+ Comment,
+ Doctype,
+ NavigableString,
+ SoupStrainer,
+ Tag,
+)
+from bs4.testing import (
+ SoupTest,
+ skipIf,
+)
+
+XML_BUILDER_PRESENT = (builder_registry.lookup("xml") is not None)
+LXML_PRESENT = (builder_registry.lookup("lxml") is not None)
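+# These flags let the suite skip tests that require an optional tree builder
+# (the XML builder or lxml's HTML builder) when it is not installed.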
+
+class TreeTest(SoupTest):
+
+ def assertSelects(self, tags, should_match):
+ """Make sure that the given tags have the correct text.
+
+ This is used in tests that define a bunch of tags, each
+ containing a single string, and then select certain strings by
+ some mechanism.
+ """
+ self.assertEqual([tag.string for tag in tags], should_match)
+
+ def assertSelectsIDs(self, tags, should_match):
+ """Make sure that the given tags have the correct IDs.
+
+ This is used in tests that define a bunch of tags, each
+ containing a single string, and then select certain strings by
+ some mechanism.
+ """
+ self.assertEqual([tag['id'] for tag in tags], should_match)
+
+
+class TestFind(TreeTest):
+ """Basic tests of the find() method.
+
+ find() just calls find_all() with limit=1, so it's not tested all
+    that thoroughly here.
+ """
+
+ def test_find_tag(self):
+ soup = self.soup("<a>1</a><b>2</b><a>3</a><b>4</b>")
+ self.assertEqual(soup.find("b").string, "2")
+
+ def test_unicode_text_find(self):
+ soup = self.soup(u'<h1>Räksmörgås</h1>')
+ self.assertEqual(soup.find(text=u'Räksmörgås'), u'Räksmörgås')
+
+ def test_find_everything(self):
+ """Test an optimization that finds all tags."""
+ soup = self.soup("<a>foo</a><b>bar</b>")
+ self.assertEqual(2, len(soup.find_all()))
+
+ def test_find_everything_with_name(self):
+ """Test an optimization that finds all tags with a given name."""
+ soup = self.soup("<a>foo</a><b>bar</b><a>baz</a>")
+ self.assertEqual(2, len(soup.find_all('a')))
+
+class TestFindAll(TreeTest):
+ """Basic tests of the find_all() method."""
+
+ def test_find_all_text_nodes(self):
+ """You can search the tree for text nodes."""
+ soup = self.soup("<html>Foo<b>bar</b>\xbb</html>")
+ # Exact match.
+ self.assertEqual(soup.find_all(text="bar"), [u"bar"])
+ # Match any of a number of strings.
+ self.assertEqual(
+ soup.find_all(text=["Foo", "bar"]), [u"Foo", u"bar"])
+ # Match a regular expression.
+ self.assertEqual(soup.find_all(text=re.compile('.*')),
+ [u"Foo", u"bar", u'\xbb'])
+ # Match anything.
+ self.assertEqual(soup.find_all(text=True),
+ [u"Foo", u"bar", u'\xbb'])
+
+ def test_find_all_limit(self):
+ """You can limit the number of items returned by find_all."""
+ soup = self.soup("<a>1</a><a>2</a><a>3</a><a>4</a><a>5</a>")
+ self.assertSelects(soup.find_all('a', limit=3), ["1", "2", "3"])
+ self.assertSelects(soup.find_all('a', limit=1), ["1"])
+ self.assertSelects(
+ soup.find_all('a', limit=10), ["1", "2", "3", "4", "5"])
+
+ # A limit of 0 means no limit.
+ self.assertSelects(
+ soup.find_all('a', limit=0), ["1", "2", "3", "4", "5"])
+
+ def test_calling_a_tag_is_calling_findall(self):
+ soup = self.soup("<a>1</a><b>2<a id='foo'>3</a></b>")
+ self.assertSelects(soup('a', limit=1), ["1"])
+ self.assertSelects(soup.b(id="foo"), ["3"])
+
+ def test_find_all_with_self_referential_data_structure_does_not_cause_infinite_recursion(self):
+ soup = self.soup("<a></a>")
+ # Create a self-referential list.
+ l = []
+ l.append(l)
+
+ # Without special code in _normalize_search_value, this would cause infinite
+ # recursion.
+ self.assertEqual([], soup.find_all(l))
+
+ def test_find_all_resultset(self):
+ """All find_all calls return a ResultSet"""
+ soup = self.soup("<a></a>")
+ result = soup.find_all("a")
+ self.assertTrue(hasattr(result, "source"))
+
+ result = soup.find_all(True)
+ self.assertTrue(hasattr(result, "source"))
+
+ result = soup.find_all(text="foo")
+ self.assertTrue(hasattr(result, "source"))
+
+
+class TestFindAllBasicNamespaces(TreeTest):
+
+ def test_find_by_namespaced_name(self):
+ soup = self.soup('<mathml:msqrt>4</mathml:msqrt><a svg:fill="red">')
+ self.assertEqual("4", soup.find("mathml:msqrt").string)
+ self.assertEqual("a", soup.find(attrs= { "svg:fill" : "red" }).name)
+
+
+class TestFindAllByName(TreeTest):
+ """Test ways of finding tags by tag name."""
+
+ def setUp(self):
+ super(TreeTest, self).setUp()
+ self.tree = self.soup("""<a>First tag.</a>
+ <b>Second tag.</b>
+ <c>Third <a>Nested tag.</a> tag.</c>""")
+
+ def test_find_all_by_tag_name(self):
+ # Find all the <a> tags.
+ self.assertSelects(
+ self.tree.find_all('a'), ['First tag.', 'Nested tag.'])
+
+ def test_find_all_by_name_and_text(self):
+ self.assertSelects(
+ self.tree.find_all('a', text='First tag.'), ['First tag.'])
+
+ self.assertSelects(
+ self.tree.find_all('a', text=True), ['First tag.', 'Nested tag.'])
+
+ self.assertSelects(
+ self.tree.find_all('a', text=re.compile("tag")),
+ ['First tag.', 'Nested tag.'])
+
+
+ def test_find_all_on_non_root_element(self):
+ # You can call find_all on any node, not just the root.
+ self.assertSelects(self.tree.c.find_all('a'), ['Nested tag.'])
+
+ def test_calling_element_invokes_find_all(self):
+ self.assertSelects(self.tree('a'), ['First tag.', 'Nested tag.'])
+
+ def test_find_all_by_tag_strainer(self):
+ self.assertSelects(
+ self.tree.find_all(SoupStrainer('a')),
+ ['First tag.', 'Nested tag.'])
+
+ def test_find_all_by_tag_names(self):
+ self.assertSelects(
+ self.tree.find_all(['a', 'b']),
+ ['First tag.', 'Second tag.', 'Nested tag.'])
+
+ def test_find_all_by_tag_dict(self):
+ self.assertSelects(
+ self.tree.find_all({'a' : True, 'b' : True}),
+ ['First tag.', 'Second tag.', 'Nested tag.'])
+
+ def test_find_all_by_tag_re(self):
+ self.assertSelects(
+ self.tree.find_all(re.compile('^[ab]$')),
+ ['First tag.', 'Second tag.', 'Nested tag.'])
+
+ def test_find_all_with_tags_matching_method(self):
+ # You can define an oracle method that determines whether
+ # a tag matches the search.
+ def id_matches_name(tag):
+ return tag.name == tag.get('id')
+
+ tree = self.soup("""<a id="a">Match 1.</a>
+ <a id="1">Does not match.</a>
+ <b id="b">Match 2.</a>""")
+
+ self.assertSelects(
+ tree.find_all(id_matches_name), ["Match 1.", "Match 2."])
+
+
+class TestFindAllByAttribute(TreeTest):
+
+ def test_find_all_by_attribute_name(self):
+ # You can pass in keyword arguments to find_all to search by
+ # attribute.
+ tree = self.soup("""
+ <a id="first">Matching a.</a>
+ <a id="second">
+ Non-matching <b id="first">Matching b.</b>a.
+ </a>""")
+ self.assertSelects(tree.find_all(id='first'),
+ ["Matching a.", "Matching b."])
+
+ def test_find_all_by_utf8_attribute_value(self):
+ peace = u"םולש".encode("utf8")
+ data = u'<a title="םולש"></a>'.encode("utf8")
+ soup = self.soup(data)
+ self.assertEqual([soup.a], soup.find_all(title=peace))
+ self.assertEqual([soup.a], soup.find_all(title=peace.decode("utf8")))
+ self.assertEqual([soup.a], soup.find_all(title=[peace, "something else"]))
+
+ def test_find_all_by_attribute_dict(self):
+ # You can pass in a dictionary as the argument 'attrs'. This
+ # lets you search for attributes like 'name' (a fixed argument
+ # to find_all) and 'class' (a reserved word in Python.)
+ tree = self.soup("""
+ <a name="name1" class="class1">Name match.</a>
+ <a name="name2" class="class2">Class match.</a>
+ <a name="name3" class="class3">Non-match.</a>
+ <name1>A tag called 'name1'.</name1>
+ """)
+
+ # This doesn't do what you want.
+ self.assertSelects(tree.find_all(name='name1'),
+ ["A tag called 'name1'."])
+ # This does what you want.
+ self.assertSelects(tree.find_all(attrs={'name' : 'name1'}),
+ ["Name match."])
+
+ self.assertSelects(tree.find_all(attrs={'class' : 'class2'}),
+ ["Class match."])
+
+ def test_find_all_by_class(self):
+ tree = self.soup("""
+ <a class="1">Class 1.</a>
+ <a class="2">Class 2.</a>
+ <b class="1">Class 1.</b>
+ <c class="3 4">Class 3 and 4.</c>
+ """)
+
+ # Passing in the class_ keyword argument will search against
+ # the 'class' attribute.
+ self.assertSelects(tree.find_all('a', class_='1'), ['Class 1.'])
+ self.assertSelects(tree.find_all('c', class_='3'), ['Class 3 and 4.'])
+ self.assertSelects(tree.find_all('c', class_='4'), ['Class 3 and 4.'])
+
+ # Passing in a string to 'attrs' will also search the CSS class.
+ self.assertSelects(tree.find_all('a', '1'), ['Class 1.'])
+ self.assertSelects(tree.find_all(attrs='1'), ['Class 1.', 'Class 1.'])
+ self.assertSelects(tree.find_all('c', '3'), ['Class 3 and 4.'])
+ self.assertSelects(tree.find_all('c', '4'), ['Class 3 and 4.'])
+
+ def test_find_by_class_when_multiple_classes_present(self):
+ tree = self.soup("<gar class='foo bar'>Found it</gar>")
+
+ f = tree.find_all("gar", class_=re.compile("o"))
+ self.assertSelects(f, ["Found it"])
+
+ f = tree.find_all("gar", class_=re.compile("a"))
+ self.assertSelects(f, ["Found it"])
+
+ # Since the class is not the string "foo bar", but the two
+ # strings "foo" and "bar", this will not find anything.
+ f = tree.find_all("gar", class_=re.compile("o b"))
+ self.assertSelects(f, [])
+
+ def test_find_all_with_non_dictionary_for_attrs_finds_by_class(self):
+ soup = self.soup("<a class='bar'>Found it</a>")
+
+ self.assertSelects(soup.find_all("a", re.compile("ba")), ["Found it"])
+
+ def big_attribute_value(value):
+ return len(value) > 3
+
+ self.assertSelects(soup.find_all("a", big_attribute_value), [])
+
+ def small_attribute_value(value):
+ return len(value) <= 3
+
+ self.assertSelects(
+ soup.find_all("a", small_attribute_value), ["Found it"])
+
+ def test_find_all_with_string_for_attrs_finds_multiple_classes(self):
+ soup = self.soup('<a class="foo bar"></a><a class="foo"></a>')
+ a, a2 = soup.find_all("a")
+ self.assertEqual([a, a2], soup.find_all("a", "foo"))
+ self.assertEqual([a], soup.find_all("a", "bar"))
+
+ # If you specify the class as a string that contains a
+ # space, only that specific value will be found.
+ self.assertEqual([a], soup.find_all("a", class_="foo bar"))
+ self.assertEqual([a], soup.find_all("a", "foo bar"))
+ self.assertEqual([], soup.find_all("a", "bar foo"))
+
+ def test_find_all_by_attribute_soupstrainer(self):
+ tree = self.soup("""
+ <a id="first">Match.</a>
+ <a id="second">Non-match.</a>""")
+
+ strainer = SoupStrainer(attrs={'id' : 'first'})
+ self.assertSelects(tree.find_all(strainer), ['Match.'])
+
+ def test_find_all_with_missing_atribute(self):
+ # You can pass in None as the value of an attribute to find_all.
+ # This will match tags that do not have that attribute set.
+ tree = self.soup("""<a id="1">ID present.</a>
+ <a>No ID present.</a>
+ <a id="">ID is empty.</a>""")
+ self.assertSelects(tree.find_all('a', id=None), ["No ID present."])
+
+ def test_find_all_with_defined_attribute(self):
+ # You can pass in None as the value of an attribute to find_all.
+ # This will match tags that have that attribute set to any value.
+ tree = self.soup("""<a id="1">ID present.</a>
+ <a>No ID present.</a>
+ <a id="">ID is empty.</a>""")
+ self.assertSelects(
+ tree.find_all(id=True), ["ID present.", "ID is empty."])
+
+ def test_find_all_with_numeric_attribute(self):
+ # If you search for a number, it's treated as a string.
+ tree = self.soup("""<a id=1>Unquoted attribute.</a>
+ <a id="1">Quoted attribute.</a>""")
+
+ expected = ["Unquoted attribute.", "Quoted attribute."]
+ self.assertSelects(tree.find_all(id=1), expected)
+ self.assertSelects(tree.find_all(id="1"), expected)
+
+ def test_find_all_with_list_attribute_values(self):
+ # You can pass a list of attribute values instead of just one,
+ # and you'll get tags that match any of the values.
+ tree = self.soup("""<a id="1">1</a>
+ <a id="2">2</a>
+ <a id="3">3</a>
+ <a>No ID.</a>""")
+ self.assertSelects(tree.find_all(id=["1", "3", "4"]),
+ ["1", "3"])
+
+ def test_find_all_with_regular_expression_attribute_value(self):
+ # You can pass a regular expression as an attribute value, and
+ # you'll get tags whose values for that attribute match the
+ # regular expression.
+ tree = self.soup("""<a id="a">One a.</a>
+ <a id="aa">Two as.</a>
+ <a id="ab">Mixed as and bs.</a>
+ <a id="b">One b.</a>
+ <a>No ID.</a>""")
+
+ self.assertSelects(tree.find_all(id=re.compile("^a+$")),
+ ["One a.", "Two as."])
+
+ def test_find_by_name_and_containing_string(self):
+ soup = self.soup("<b>foo</b><b>bar</b><a>foo</a>")
+ a = soup.a
+
+ self.assertEqual([a], soup.find_all("a", text="foo"))
+ self.assertEqual([], soup.find_all("a", text="bar"))
+ self.assertEqual([], soup.find_all("a", text="bar"))
+
+ def test_find_by_name_and_containing_string_when_string_is_buried(self):
+ soup = self.soup("<a>foo</a><a><b><c>foo</c></b></a>")
+ self.assertEqual(soup.find_all("a"), soup.find_all("a", text="foo"))
+
+ def test_find_by_attribute_and_containing_string(self):
+ soup = self.soup('<b id="1">foo</b><a id="2">foo</a>')
+ a = soup.a
+
+ self.assertEqual([a], soup.find_all(id=2, text="foo"))
+ self.assertEqual([], soup.find_all(id=1, text="bar"))
+
+
+
+
+class TestIndex(TreeTest):
+ """Test Tag.index"""
+ def test_index(self):
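+        # Tag.index() finds a child by identity rather than equality, so
+        # siblings with identical markup still report their own positions.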
+ tree = self.soup("""<div>
+ <a>Identical</a>
+ <b>Not identical</b>
+ <a>Identical</a>
+
+ <c><d>Identical with child</d></c>
+ <b>Also not identical</b>
+ <c><d>Identical with child</d></c>
+ </div>""")
+ div = tree.div
+ for i, element in enumerate(div.contents):
+ self.assertEqual(i, div.index(element))
+ self.assertRaises(ValueError, tree.index, 1)
+
+
+class TestParentOperations(TreeTest):
+ """Test navigation and searching through an element's parents."""
+
+ def setUp(self):
+ super(TestParentOperations, self).setUp()
+ self.tree = self.soup('''<ul id="empty"></ul>
+ <ul id="top">
+ <ul id="middle">
+ <ul id="bottom">
+ <b>Start here</b>
+ </ul>
+ </ul>''')
+ self.start = self.tree.b
+
+
+ def test_parent(self):
+ self.assertEqual(self.start.parent['id'], 'bottom')
+ self.assertEqual(self.start.parent.parent['id'], 'middle')
+ self.assertEqual(self.start.parent.parent.parent['id'], 'top')
+
+ def test_parent_of_top_tag_is_soup_object(self):
+ top_tag = self.tree.contents[0]
+ self.assertEqual(top_tag.parent, self.tree)
+
+ def test_soup_object_has_no_parent(self):
+ self.assertEqual(None, self.tree.parent)
+
+ def test_find_parents(self):
+ self.assertSelectsIDs(
+ self.start.find_parents('ul'), ['bottom', 'middle', 'top'])
+ self.assertSelectsIDs(
+ self.start.find_parents('ul', id="middle"), ['middle'])
+
+ def test_find_parent(self):
+ self.assertEqual(self.start.find_parent('ul')['id'], 'bottom')
+ self.assertEqual(self.start.find_parent('ul', id='top')['id'], 'top')
+
+ def test_parent_of_text_element(self):
+ text = self.tree.find(text="Start here")
+ self.assertEqual(text.parent.name, 'b')
+
+ def test_text_element_find_parent(self):
+ text = self.tree.find(text="Start here")
+ self.assertEqual(text.find_parent('ul')['id'], 'bottom')
+
+ def test_parent_generator(self):
+ parents = [parent['id'] for parent in self.start.parents
+ if parent is not None and 'id' in parent.attrs]
+ self.assertEqual(parents, ['bottom', 'middle', 'top'])
+
+
+class ProximityTest(TreeTest):
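+    # "Proximity" here means document order: next_element/previous_element
+    # step through tags and strings in the order the parser saw them,
+    # regardless of nesting.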
+
+ def setUp(self):
+ super(TreeTest, self).setUp()
+ self.tree = self.soup(
+ '<html id="start"><head></head><body><b id="1">One</b><b id="2">Two</b><b id="3">Three</b></body></html>')
+
+
+class TestNextOperations(ProximityTest):
+
+ def setUp(self):
+ super(TestNextOperations, self).setUp()
+ self.start = self.tree.b
+
+ def test_next(self):
+ self.assertEqual(self.start.next_element, "One")
+ self.assertEqual(self.start.next_element.next_element['id'], "2")
+
+ def test_next_of_last_item_is_none(self):
+ last = self.tree.find(text="Three")
+ self.assertEqual(last.next_element, None)
+
+ def test_next_of_root_is_none(self):
+ # The document root is outside the next/previous chain.
+ self.assertEqual(self.tree.next_element, None)
+
+ def test_find_all_next(self):
+ self.assertSelects(self.start.find_all_next('b'), ["Two", "Three"])
+ self.start.find_all_next(id=3)
+ self.assertSelects(self.start.find_all_next(id=3), ["Three"])
+
+ def test_find_next(self):
+ self.assertEqual(self.start.find_next('b')['id'], '2')
+ self.assertEqual(self.start.find_next(text="Three"), "Three")
+
+ def test_find_next_for_text_element(self):
+ text = self.tree.find(text="One")
+ self.assertEqual(text.find_next("b").string, "Two")
+ self.assertSelects(text.find_all_next("b"), ["Two", "Three"])
+
+ def test_next_generator(self):
+ start = self.tree.find(text="Two")
+ successors = [node for node in start.next_elements]
+ # There are two successors: the final <b> tag and its text contents.
+ tag, contents = successors
+ self.assertEqual(tag['id'], '3')
+ self.assertEqual(contents, "Three")
+
+class TestPreviousOperations(ProximityTest):
+
+ def setUp(self):
+ super(TestPreviousOperations, self).setUp()
+ self.end = self.tree.find(text="Three")
+
+ def test_previous(self):
+ self.assertEqual(self.end.previous_element['id'], "3")
+ self.assertEqual(self.end.previous_element.previous_element, "Two")
+
+ def test_previous_of_first_item_is_none(self):
+ first = self.tree.find('html')
+ self.assertEqual(first.previous_element, None)
+
+ def test_previous_of_root_is_none(self):
+ # The document root is outside the next/previous chain.
+ # XXX This is broken!
+ #self.assertEqual(self.tree.previous_element, None)
+ pass
+
+ def test_find_all_previous(self):
+ # The <b> tag containing the "Three" node is the predecessor
+ # of the "Three" node itself, which is why "Three" shows up
+ # here.
+ self.assertSelects(
+ self.end.find_all_previous('b'), ["Three", "Two", "One"])
+ self.assertSelects(self.end.find_all_previous(id=1), ["One"])
+
+ def test_find_previous(self):
+ self.assertEqual(self.end.find_previous('b')['id'], '3')
+ self.assertEqual(self.end.find_previous(text="One"), "One")
+
+ def test_find_previous_for_text_element(self):
+ text = self.tree.find(text="Three")
+ self.assertEqual(text.find_previous("b").string, "Three")
+ self.assertSelects(
+ text.find_all_previous("b"), ["Three", "Two", "One"])
+
+ def test_previous_generator(self):
+ start = self.tree.find(text="One")
+ predecessors = [node for node in start.previous_elements]
+
+ # There are four predecessors: the <b> tag containing "One"
+ # the <body> tag, the <head> tag, and the <html> tag.
+ b, body, head, html = predecessors
+ self.assertEqual(b['id'], '1')
+ self.assertEqual(body.name, "body")
+ self.assertEqual(head.name, "head")
+ self.assertEqual(html.name, "html")
+
+
+class SiblingTest(TreeTest):
+
+ def setUp(self):
+ super(SiblingTest, self).setUp()
+ markup = '''<html>
+ <span id="1">
+ <span id="1.1"></span>
+ </span>
+ <span id="2">
+ <span id="2.1"></span>
+ </span>
+ <span id="3">
+ <span id="3.1"></span>
+ </span>
+ <span id="4"></span>
+ </html>'''
+ # All that whitespace looks good but makes the tests more
+ # difficult. Get rid of it.
+ markup = re.compile("\n\s*").sub("", markup)
+ self.tree = self.soup(markup)
+
+
+class TestNextSibling(SiblingTest):
+
+ def setUp(self):
+ super(TestNextSibling, self).setUp()
+ self.start = self.tree.find(id="1")
+
+ def test_next_sibling_of_root_is_none(self):
+ self.assertEqual(self.tree.next_sibling, None)
+
+ def test_next_sibling(self):
+ self.assertEqual(self.start.next_sibling['id'], '2')
+ self.assertEqual(self.start.next_sibling.next_sibling['id'], '3')
+
+ # Note the difference between next_sibling and next_element.
+ self.assertEqual(self.start.next_element['id'], '1.1')
+
+ def test_next_sibling_may_not_exist(self):
+ self.assertEqual(self.tree.html.next_sibling, None)
+
+ nested_span = self.tree.find(id="1.1")
+ self.assertEqual(nested_span.next_sibling, None)
+
+ last_span = self.tree.find(id="4")
+ self.assertEqual(last_span.next_sibling, None)
+
+ def test_find_next_sibling(self):
+ self.assertEqual(self.start.find_next_sibling('span')['id'], '2')
+
+ def test_next_siblings(self):
+ self.assertSelectsIDs(self.start.find_next_siblings("span"),
+ ['2', '3', '4'])
+
+ self.assertSelectsIDs(self.start.find_next_siblings(id='3'), ['3'])
+
+ def test_next_sibling_for_text_element(self):
+ soup = self.soup("Foo<b>bar</b>baz")
+ start = soup.find(text="Foo")
+ self.assertEqual(start.next_sibling.name, 'b')
+ self.assertEqual(start.next_sibling.next_sibling, 'baz')
+
+ self.assertSelects(start.find_next_siblings('b'), ['bar'])
+ self.assertEqual(start.find_next_sibling(text="baz"), "baz")
+ self.assertEqual(start.find_next_sibling(text="nonesuch"), None)
+
+
+class TestPreviousSibling(SiblingTest):
+
+ def setUp(self):
+ super(TestPreviousSibling, self).setUp()
+ self.end = self.tree.find(id="4")
+
+ def test_previous_sibling_of_root_is_none(self):
+ self.assertEqual(self.tree.previous_sibling, None)
+
+ def test_previous_sibling(self):
+ self.assertEqual(self.end.previous_sibling['id'], '3')
+ self.assertEqual(self.end.previous_sibling.previous_sibling['id'], '2')
+
+ # Note the difference between previous_sibling and previous_element.
+ self.assertEqual(self.end.previous_element['id'], '3.1')
+
+ def test_previous_sibling_may_not_exist(self):
+ self.assertEqual(self.tree.html.previous_sibling, None)
+
+ nested_span = self.tree.find(id="1.1")
+ self.assertEqual(nested_span.previous_sibling, None)
+
+ first_span = self.tree.find(id="1")
+ self.assertEqual(first_span.previous_sibling, None)
+
+ def test_find_previous_sibling(self):
+ self.assertEqual(self.end.find_previous_sibling('span')['id'], '3')
+
+ def test_previous_siblings(self):
+ self.assertSelectsIDs(self.end.find_previous_siblings("span"),
+ ['3', '2', '1'])
+
+ self.assertSelectsIDs(self.end.find_previous_siblings(id='1'), ['1'])
+
+ def test_previous_sibling_for_text_element(self):
+ soup = self.soup("Foo<b>bar</b>baz")
+ start = soup.find(text="baz")
+ self.assertEqual(start.previous_sibling.name, 'b')
+ self.assertEqual(start.previous_sibling.previous_sibling, 'Foo')
+
+ self.assertSelects(start.find_previous_siblings('b'), ['bar'])
+ self.assertEqual(start.find_previous_sibling(text="Foo"), "Foo")
+ self.assertEqual(start.find_previous_sibling(text="nonesuch"), None)
+
+
+class TestTagCreation(SoupTest):
+ """Test the ability to create new tags."""
+ def test_new_tag(self):
+ soup = self.soup("")
+ new_tag = soup.new_tag("foo", bar="baz")
+ self.assertTrue(isinstance(new_tag, Tag))
+ self.assertEqual("foo", new_tag.name)
+ self.assertEqual(dict(bar="baz"), new_tag.attrs)
+ self.assertEqual(None, new_tag.parent)
+
+ def test_tag_inherits_self_closing_rules_from_builder(self):
+ if XML_BUILDER_PRESENT:
+ xml_soup = BeautifulSoup("", "xml")
+ xml_br = xml_soup.new_tag("br")
+ xml_p = xml_soup.new_tag("p")
+
+            # Both the <br> and <p> tags are empty-element tags, just
+            # because they have no contents.
+ self.assertEqual(b"<br/>", xml_br.encode())
+ self.assertEqual(b"<p/>", xml_p.encode())
+
+ html_soup = BeautifulSoup("", "html")
+ html_br = html_soup.new_tag("br")
+ html_p = html_soup.new_tag("p")
+
+        # The HTML builder uses HTML's rules about which tags are
+ # empty-element tags, and the new tags reflect these rules.
+ self.assertEqual(b"<br/>", html_br.encode())
+ self.assertEqual(b"<p></p>", html_p.encode())
+
+ def test_new_string_creates_navigablestring(self):
+ soup = self.soup("")
+ s = soup.new_string("foo")
+ self.assertEqual("foo", s)
+ self.assertTrue(isinstance(s, NavigableString))
+
+ def test_new_string_can_create_navigablestring_subclass(self):
+ soup = self.soup("")
+ s = soup.new_string("foo", Comment)
+ self.assertEqual("foo", s)
+ self.assertTrue(isinstance(s, Comment))
+
+class TestTreeModification(SoupTest):
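+    # Exercises the mutation API: insert(), append(), insert_before(),
+    # insert_after(), replace_with(), unwrap(), wrap(), extract() and
+    # clear(), plus the bookkeeping that keeps sibling/element links intact.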
+
+ def test_attribute_modification(self):
+ soup = self.soup('<a id="1"></a>')
+ soup.a['id'] = 2
+ self.assertEqual(soup.decode(), self.document_for('<a id="2"></a>'))
+ del(soup.a['id'])
+ self.assertEqual(soup.decode(), self.document_for('<a></a>'))
+ soup.a['id2'] = 'foo'
+ self.assertEqual(soup.decode(), self.document_for('<a id2="foo"></a>'))
+
+ def test_new_tag_creation(self):
+ builder = builder_registry.lookup('html')()
+ soup = self.soup("<body></body>", builder=builder)
+ a = Tag(soup, builder, 'a')
+ ol = Tag(soup, builder, 'ol')
+ a['href'] = 'http://foo.com/'
+ soup.body.insert(0, a)
+ soup.body.insert(1, ol)
+ self.assertEqual(
+ soup.body.encode(),
+ b'<body><a href="http://foo.com/"></a><ol></ol></body>')
+
+ def test_append_to_contents_moves_tag(self):
+ doc = """<p id="1">Don't leave me <b>here</b>.</p>
+ <p id="2">Don\'t leave!</p>"""
+ soup = self.soup(doc)
+ second_para = soup.find(id='2')
+ bold = soup.b
+
+ # Move the <b> tag to the end of the second paragraph.
+ soup.find(id='2').append(soup.b)
+
+ # The <b> tag is now a child of the second paragraph.
+ self.assertEqual(bold.parent, second_para)
+
+ self.assertEqual(
+ soup.decode(), self.document_for(
+ '<p id="1">Don\'t leave me .</p>\n'
+ '<p id="2">Don\'t leave!<b>here</b></p>'))
+
+ def test_replace_with_returns_thing_that_was_replaced(self):
+ text = "<a></a><b><c></c></b>"
+ soup = self.soup(text)
+ a = soup.a
+ new_a = a.replace_with(soup.c)
+ self.assertEqual(a, new_a)
+
+ def test_unwrap_returns_thing_that_was_replaced(self):
+ text = "<a><b></b><c></c></a>"
+ soup = self.soup(text)
+ a = soup.a
+ new_a = a.unwrap()
+ self.assertEqual(a, new_a)
+
+ def test_replace_tag_with_itself(self):
+ text = "<a><b></b><c>Foo<d></d></c></a><a><e></e></a>"
+ soup = self.soup(text)
+ c = soup.c
+ soup.c.replace_with(c)
+ self.assertEqual(soup.decode(), self.document_for(text))
+
+ def test_replace_tag_with_its_parent_raises_exception(self):
+ text = "<a><b></b></a>"
+ soup = self.soup(text)
+ self.assertRaises(ValueError, soup.b.replace_with, soup.a)
+
+ def test_insert_tag_into_itself_raises_exception(self):
+ text = "<a><b></b></a>"
+ soup = self.soup(text)
+ self.assertRaises(ValueError, soup.a.insert, 0, soup.a)
+
+ def test_replace_with_maintains_next_element_throughout(self):
+ soup = self.soup('<p><a>one</a><b>three</b></p>')
+ a = soup.a
+ b = a.contents[0]
+ # Make it so the <a> tag has two text children.
+ a.insert(1, "two")
+
+ # Now replace each one with the empty string.
+ left, right = a.contents
+ left.replaceWith('')
+ right.replaceWith('')
+
+ # The <b> tag is still connected to the tree.
+ self.assertEqual("three", soup.b.string)
+
+ def test_replace_final_node(self):
+ soup = self.soup("<b>Argh!</b>")
+ soup.find(text="Argh!").replace_with("Hooray!")
+ new_text = soup.find(text="Hooray!")
+ b = soup.b
+ self.assertEqual(new_text.previous_element, b)
+ self.assertEqual(new_text.parent, b)
+ self.assertEqual(new_text.previous_element.next_element, new_text)
+ self.assertEqual(new_text.next_element, None)
+
+ def test_consecutive_text_nodes(self):
+ # A builder should never create two consecutive text nodes,
+ # but if you insert one next to another, Beautiful Soup will
+ # handle it correctly.
+ soup = self.soup("<a><b>Argh!</b><c></c></a>")
+ soup.b.insert(1, "Hooray!")
+
+ self.assertEqual(
+ soup.decode(), self.document_for(
+ "<a><b>Argh!Hooray!</b><c></c></a>"))
+
+ new_text = soup.find(text="Hooray!")
+ self.assertEqual(new_text.previous_element, "Argh!")
+ self.assertEqual(new_text.previous_element.next_element, new_text)
+
+ self.assertEqual(new_text.previous_sibling, "Argh!")
+ self.assertEqual(new_text.previous_sibling.next_sibling, new_text)
+
+ self.assertEqual(new_text.next_sibling, None)
+ self.assertEqual(new_text.next_element, soup.c)
+
+ def test_insert_string(self):
+ soup = self.soup("<a></a>")
+ soup.a.insert(0, "bar")
+ soup.a.insert(0, "foo")
+        # The strings were added to the tag.
+ self.assertEqual(["foo", "bar"], soup.a.contents)
+ # And they were converted to NavigableStrings.
+ self.assertEqual(soup.a.contents[0].next_element, "bar")
+
+ def test_insert_tag(self):
+ builder = self.default_builder
+ soup = self.soup(
+ "<a><b>Find</b><c>lady!</c><d></d></a>", builder=builder)
+ magic_tag = Tag(soup, builder, 'magictag')
+ magic_tag.insert(0, "the")
+ soup.a.insert(1, magic_tag)
+
+ self.assertEqual(
+ soup.decode(), self.document_for(
+ "<a><b>Find</b><magictag>the</magictag><c>lady!</c><d></d></a>"))
+
+ # Make sure all the relationships are hooked up correctly.
+ b_tag = soup.b
+ self.assertEqual(b_tag.next_sibling, magic_tag)
+ self.assertEqual(magic_tag.previous_sibling, b_tag)
+
+ find = b_tag.find(text="Find")
+ self.assertEqual(find.next_element, magic_tag)
+ self.assertEqual(magic_tag.previous_element, find)
+
+ c_tag = soup.c
+ self.assertEqual(magic_tag.next_sibling, c_tag)
+ self.assertEqual(c_tag.previous_sibling, magic_tag)
+
+ the = magic_tag.find(text="the")
+ self.assertEqual(the.parent, magic_tag)
+ self.assertEqual(the.next_element, c_tag)
+ self.assertEqual(c_tag.previous_element, the)
+
+ def test_append_child_thats_already_at_the_end(self):
+ data = "<a><b></b></a>"
+ soup = self.soup(data)
+ soup.a.append(soup.b)
+ self.assertEqual(data, soup.decode())
+
+ def test_move_tag_to_beginning_of_parent(self):
+ data = "<a><b></b><c></c><d></d></a>"
+ soup = self.soup(data)
+ soup.a.insert(0, soup.d)
+ self.assertEqual("<a><d></d><b></b><c></c></a>", soup.decode())
+
+ def test_insert_works_on_empty_element_tag(self):
+ # This is a little strange, since most HTML parsers don't allow
+ # markup like this to come through. But in general, we don't
+ # know what the parser would or wouldn't have allowed, so
+ # I'm letting this succeed for now.
+ soup = self.soup("<br/>")
+ soup.br.insert(1, "Contents")
+ self.assertEqual(str(soup.br), "<br>Contents</br>")
+
+ def test_insert_before(self):
+ soup = self.soup("<a>foo</a><b>bar</b>")
+ soup.b.insert_before("BAZ")
+ soup.a.insert_before("QUUX")
+ self.assertEqual(
+ soup.decode(), self.document_for("QUUX<a>foo</a>BAZ<b>bar</b>"))
+
+ soup.a.insert_before(soup.b)
+ self.assertEqual(
+ soup.decode(), self.document_for("QUUX<b>bar</b><a>foo</a>BAZ"))
+
+ def test_insert_after(self):
+ soup = self.soup("<a>foo</a><b>bar</b>")
+ soup.b.insert_after("BAZ")
+ soup.a.insert_after("QUUX")
+ self.assertEqual(
+ soup.decode(), self.document_for("<a>foo</a>QUUX<b>bar</b>BAZ"))
+ soup.b.insert_after(soup.a)
+ self.assertEqual(
+ soup.decode(), self.document_for("QUUX<b>bar</b><a>foo</a>BAZ"))
+
+ def test_insert_after_raises_exception_if_after_has_no_meaning(self):
+ soup = self.soup("")
+ tag = soup.new_tag("a")
+ string = soup.new_string("")
+ self.assertRaises(ValueError, string.insert_after, tag)
+ self.assertRaises(NotImplementedError, soup.insert_after, tag)
+ self.assertRaises(ValueError, tag.insert_after, tag)
+
+ def test_insert_before_raises_notimplementederror_if_before_has_no_meaning(self):
+ soup = self.soup("")
+ tag = soup.new_tag("a")
+ string = soup.new_string("")
+ self.assertRaises(ValueError, string.insert_before, tag)
+ self.assertRaises(NotImplementedError, soup.insert_before, tag)
+ self.assertRaises(ValueError, tag.insert_before, tag)
+
+ def test_replace_with(self):
+ soup = self.soup(
+ "<p>There's <b>no</b> business like <b>show</b> business</p>")
+ no, show = soup.find_all('b')
+ show.replace_with(no)
+ self.assertEqual(
+ soup.decode(),
+ self.document_for(
+                "<p>There's  business like <b>no</b> business</p>"))
+
+ self.assertEqual(show.parent, None)
+ self.assertEqual(no.parent, soup.p)
+ self.assertEqual(no.next_element, "no")
+ self.assertEqual(no.next_sibling, " business")
+
+ def test_replace_first_child(self):
+ data = "<a><b></b><c></c></a>"
+ soup = self.soup(data)
+ soup.b.replace_with(soup.c)
+ self.assertEqual("<a><c></c></a>", soup.decode())
+
+ def test_replace_last_child(self):
+ data = "<a><b></b><c></c></a>"
+ soup = self.soup(data)
+ soup.c.replace_with(soup.b)
+ self.assertEqual("<a><b></b></a>", soup.decode())
+
+ def test_nested_tag_replace_with(self):
+ soup = self.soup(
+ """<a>We<b>reserve<c>the</c><d>right</d></b></a><e>to<f>refuse</f><g>service</g></e>""")
+
+ # Replace the entire <b> tag and its contents ("reserve the
+ # right") with the <f> tag ("refuse").
+ remove_tag = soup.b
+ move_tag = soup.f
+ remove_tag.replace_with(move_tag)
+
+ self.assertEqual(
+ soup.decode(), self.document_for(
+ "<a>We<f>refuse</f></a><e>to<g>service</g></e>"))
+
+ # The <b> tag is now an orphan.
+ self.assertEqual(remove_tag.parent, None)
+ self.assertEqual(remove_tag.find(text="right").next_element, None)
+ self.assertEqual(remove_tag.previous_element, None)
+ self.assertEqual(remove_tag.next_sibling, None)
+ self.assertEqual(remove_tag.previous_sibling, None)
+
+ # The <f> tag is now connected to the <a> tag.
+ self.assertEqual(move_tag.parent, soup.a)
+ self.assertEqual(move_tag.previous_element, "We")
+ self.assertEqual(move_tag.next_element.next_element, soup.e)
+ self.assertEqual(move_tag.next_sibling, None)
+
+ # The gap where the <f> tag used to be has been mended, and
+ # the word "to" is now connected to the <g> tag.
+ to_text = soup.find(text="to")
+ g_tag = soup.g
+ self.assertEqual(to_text.next_element, g_tag)
+ self.assertEqual(to_text.next_sibling, g_tag)
+ self.assertEqual(g_tag.previous_element, to_text)
+ self.assertEqual(g_tag.previous_sibling, to_text)
+
+ def test_unwrap(self):
+ tree = self.soup("""
+ <p>Unneeded <em>formatting</em> is unneeded</p>
+ """)
+ tree.em.unwrap()
+ self.assertEqual(tree.em, None)
+ self.assertEqual(tree.p.text, "Unneeded formatting is unneeded")
+
+ def test_wrap(self):
+ soup = self.soup("I wish I was bold.")
+ value = soup.string.wrap(soup.new_tag("b"))
+ self.assertEqual(value.decode(), "<b>I wish I was bold.</b>")
+ self.assertEqual(
+ soup.decode(), self.document_for("<b>I wish I was bold.</b>"))
+
+ def test_wrap_extracts_tag_from_elsewhere(self):
+ soup = self.soup("<b></b>I wish I was bold.")
+ soup.b.next_sibling.wrap(soup.b)
+ self.assertEqual(
+ soup.decode(), self.document_for("<b>I wish I was bold.</b>"))
+
+ def test_wrap_puts_new_contents_at_the_end(self):
+ soup = self.soup("<b>I like being bold.</b>I wish I was bold.")
+ soup.b.next_sibling.wrap(soup.b)
+ self.assertEqual(2, len(soup.b.contents))
+ self.assertEqual(
+ soup.decode(), self.document_for(
+ "<b>I like being bold.I wish I was bold.</b>"))
+
+ def test_extract(self):
+ soup = self.soup(
+ '<html><body>Some content. <div id="nav">Nav crap</div> More content.</body></html>')
+
+ self.assertEqual(len(soup.body.contents), 3)
+ extracted = soup.find(id="nav").extract()
+
+ self.assertEqual(
+            soup.decode(), "<html><body>Some content.  More content.</body></html>")
+ self.assertEqual(extracted.decode(), '<div id="nav">Nav crap</div>')
+
+ # The extracted tag is now an orphan.
+ self.assertEqual(len(soup.body.contents), 2)
+ self.assertEqual(extracted.parent, None)
+ self.assertEqual(extracted.previous_element, None)
+ self.assertEqual(extracted.next_element.next_element, None)
+
+ # The gap where the extracted tag used to be has been mended.
+ content_1 = soup.find(text="Some content. ")
+ content_2 = soup.find(text=" More content.")
+ self.assertEqual(content_1.next_element, content_2)
+ self.assertEqual(content_1.next_sibling, content_2)
+ self.assertEqual(content_2.previous_element, content_1)
+ self.assertEqual(content_2.previous_sibling, content_1)
+
+ def test_extract_distinguishes_between_identical_strings(self):
+ soup = self.soup("<a>foo</a><b>bar</b>")
+ foo_1 = soup.a.string
+ bar_1 = soup.b.string
+ foo_2 = soup.new_string("foo")
+ bar_2 = soup.new_string("bar")
+ soup.a.append(foo_2)
+ soup.b.append(bar_2)
+
+ # Now there are two identical strings in the <a> tag, and two
+ # in the <b> tag. Let's remove the first "foo" and the second
+ # "bar".
+ foo_1.extract()
+ bar_2.extract()
+ self.assertEqual(foo_2, soup.a.string)
+ self.assertEqual(bar_2, soup.b.string)
+
+ def test_clear(self):
+ """Tag.clear()"""
+ soup = self.soup("<p><a>String <em>Italicized</em></a> and another</p>")
+ # clear using extract()
+ a = soup.a
+ soup.p.clear()
+ self.assertEqual(len(soup.p.contents), 0)
+ self.assertTrue(hasattr(a, "contents"))
+
+ # clear using decompose()
+ em = a.em
+ a.clear(decompose=True)
+ self.assertEqual(0, len(em.contents))
+
+ def test_string_set(self):
+ """Tag.string = 'string'"""
+ soup = self.soup("<a></a> <b><c></c></b>")
+ soup.a.string = "foo"
+ self.assertEqual(soup.a.contents, ["foo"])
+ soup.b.string = "bar"
+ self.assertEqual(soup.b.contents, ["bar"])
+
+ def test_string_set_does_not_affect_original_string(self):
+ soup = self.soup("<a><b>foo</b><c>bar</c>")
+ soup.b.string = soup.c.string
+ self.assertEqual(soup.a.encode(), b"<a><b>bar</b><c>bar</c></a>")
+
+ def test_set_string_preserves_class_of_string(self):
+ soup = self.soup("<a></a>")
+ cdata = CData("foo")
+ soup.a.string = cdata
+ self.assertTrue(isinstance(soup.a.string, CData))
+
+class TestElementObjects(SoupTest):
+ """Test various features of element objects."""
+
+ def test_len(self):
+ """The length of an element is its number of children."""
+ soup = self.soup("<top>1<b>2</b>3</top>")
+
+ # The BeautifulSoup object itself contains one element: the
+ # <top> tag.
+ self.assertEqual(len(soup.contents), 1)
+ self.assertEqual(len(soup), 1)
+
+ # The <top> tag contains three elements: the text node "1", the
+ # <b> tag, and the text node "3".
+ self.assertEqual(len(soup.top), 3)
+ self.assertEqual(len(soup.top.contents), 3)
+
+ def test_member_access_invokes_find(self):
+ """Accessing a Python member .foo invokes find('foo')"""
+ soup = self.soup('<b><i></i></b>')
+ self.assertEqual(soup.b, soup.find('b'))
+ self.assertEqual(soup.b.i, soup.find('b').find('i'))
+ self.assertEqual(soup.a, None)
+
+ def test_deprecated_member_access(self):
+ soup = self.soup('<b><i></i></b>')
+ with warnings.catch_warnings(record=True) as w:
+ tag = soup.bTag
+ self.assertEqual(soup.b, tag)
+ self.assertEqual(
+ '.bTag is deprecated, use .find("b") instead.',
+ str(w[0].message))
+
+ def test_has_attr(self):
+ """has_attr() checks for the presence of an attribute.
+
+        Please note: has_attr() is different from
+        __in__. has_attr() checks the tag's attributes and __in__
+        checks the tag's children.
+ """
+ soup = self.soup("<foo attr='bar'>")
+ self.assertTrue(soup.foo.has_attr('attr'))
+ self.assertFalse(soup.foo.has_attr('attr2'))
+
+
+ def test_attributes_come_out_in_alphabetical_order(self):
+ markup = '<b a="1" z="5" m="3" f="2" y="4"></b>'
+ self.assertSoupEquals(markup, '<b a="1" f="2" m="3" y="4" z="5"></b>')
+
+ def test_string(self):
+ # A tag that contains only a text node makes that node
+ # available as .string.
+ soup = self.soup("<b>foo</b>")
+ self.assertEqual(soup.b.string, 'foo')
+
+ def test_empty_tag_has_no_string(self):
+        # A tag with no children has no .string.
+ soup = self.soup("<b></b>")
+ self.assertEqual(soup.b.string, None)
+
+ def test_tag_with_multiple_children_has_no_string(self):
+        # A tag with multiple children has no .string.
+ soup = self.soup("<a>foo<b></b><b></b></b>")
+ self.assertEqual(soup.b.string, None)
+
+ soup = self.soup("<a>foo<b></b>bar</b>")
+ self.assertEqual(soup.b.string, None)
+
+ # Even if all the children are strings, due to trickery,
+ # it won't work--but this would be a good optimization.
+ soup = self.soup("<a>foo</b>")
+ soup.a.insert(1, "bar")
+ self.assertEqual(soup.a.string, None)
+
+ def test_tag_with_recursive_string_has_string(self):
+ # A tag with a single child which has a .string inherits that
+ # .string.
+ soup = self.soup("<a><b>foo</b></a>")
+ self.assertEqual(soup.a.string, "foo")
+ self.assertEqual(soup.string, "foo")
+
+ def test_lack_of_string(self):
+ """Only a tag containing a single text node has a .string."""
+ soup = self.soup("<b>f<i>e</i>o</b>")
+ self.assertFalse(soup.b.string)
+
+ soup = self.soup("<b></b>")
+ self.assertFalse(soup.b.string)
+
+ def test_all_text(self):
+ """Tag.text and Tag.get_text(sep=u"") -> all child text, concatenated"""
+ soup = self.soup("<a>a<b>r</b> <r> t </r></a>")
+ self.assertEqual(soup.a.text, "ar t ")
+ self.assertEqual(soup.a.get_text(strip=True), "art")
+ self.assertEqual(soup.a.get_text(","), "a,r, , t ")
+ self.assertEqual(soup.a.get_text(",", strip=True), "a,r,t")
+
+ def test_get_text_ignores_comments(self):
+ soup = self.soup("foo<!--IGNORE-->bar")
+ self.assertEqual(soup.get_text(), "foobar")
+
+ self.assertEqual(
+ soup.get_text(types=(NavigableString, Comment)), "fooIGNOREbar")
+ self.assertEqual(
+ soup.get_text(types=None), "fooIGNOREbar")
+
+ def test_all_strings_ignores_comments(self):
+ soup = self.soup("foo<!--IGNORE-->bar")
+ self.assertEqual(['foo', 'bar'], list(soup.strings))
+
+class TestCDAtaListAttributes(SoupTest):
+
+ """Testing cdata-list attributes like 'class'.
+ """
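+    # HTML defines some attributes (such as 'class', or 'accept-charset' on
+    # a <form>) as whitespace-separated lists; Beautiful Soup parses those,
+    # and only those, into Python lists.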
+ def test_single_value_becomes_list(self):
+ soup = self.soup("<a class='foo'>")
+ self.assertEqual(["foo"],soup.a['class'])
+
+ def test_multiple_values_becomes_list(self):
+ soup = self.soup("<a class='foo bar'>")
+ self.assertEqual(["foo", "bar"], soup.a['class'])
+
+ def test_multiple_values_separated_by_weird_whitespace(self):
+ soup = self.soup("<a class='foo\tbar\nbaz'>")
+ self.assertEqual(["foo", "bar", "baz"],soup.a['class'])
+
+ def test_attributes_joined_into_string_on_output(self):
+ soup = self.soup("<a class='foo\tbar'>")
+ self.assertEqual(b'<a class="foo bar"></a>', soup.a.encode())
+
+ def test_accept_charset(self):
+ soup = self.soup('<form accept-charset="ISO-8859-1 UTF-8">')
+ self.assertEqual(['ISO-8859-1', 'UTF-8'], soup.form['accept-charset'])
+
+ def test_cdata_attribute_applying_only_to_one_tag(self):
+ data = '<a accept-charset="ISO-8859-1 UTF-8"></a>'
+ soup = self.soup(data)
+ # We saw in another test that accept-charset is a cdata-list
+ # attribute for the <form> tag. But it's not a cdata-list
+ # attribute for any other tag.
+ self.assertEqual('ISO-8859-1 UTF-8', soup.a['accept-charset'])
+
+ def test_string_has_immutable_name_property(self):
+ string = self.soup("s").string
+ self.assertEqual(None, string.name)
+ def t():
+ string.name = 'foo'
+ self.assertRaises(AttributeError, t)
+
+class TestPersistence(SoupTest):
+ "Testing features like pickle and deepcopy."
+
+ def setUp(self):
+ super(TestPersistence, self).setUp()
+ self.page = """<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN"
+"http://www.w3.org/TR/REC-html40/transitional.dtd">
+<html>
+<head>
+<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
+<title>Beautiful Soup: We called him Tortoise because he taught us.</title>
+<link rev="made" href="mailto:leonardr@segfault.org">
+<meta name="Description" content="Beautiful Soup: an HTML parser optimized for screen-scraping.">
+<meta name="generator" content="Markov Approximation 1.4 (module: leonardr)">
+<meta name="author" content="Leonard Richardson">
+</head>
+<body>
+<a href="foo">foo</a>
+<a href="foo"><b>bar</b></a>
+</body>
+</html>"""
+ self.tree = self.soup(self.page)
+
+ def test_pickle_and_unpickle_identity(self):
+ # Pickling a tree, then unpickling it, yields a tree identical
+ # to the original.
+ dumped = pickle.dumps(self.tree, 2)
+ loaded = pickle.loads(dumped)
+ self.assertEqual(loaded.__class__, BeautifulSoup)
+ self.assertEqual(loaded.decode(), self.tree.decode())
+
+ def test_deepcopy_identity(self):
+ # Making a deepcopy of a tree yields an identical tree.
+ copied = copy.deepcopy(self.tree)
+ self.assertEqual(copied.decode(), self.tree.decode())
+
+ def test_unicode_pickle(self):
+ # A tree containing Unicode characters can be pickled.
+ html = u"<b>\N{SNOWMAN}</b>"
+ soup = self.soup(html)
+ dumped = pickle.dumps(soup, pickle.HIGHEST_PROTOCOL)
+ loaded = pickle.loads(dumped)
+ self.assertEqual(loaded.decode(), soup.decode())
+
+
+class TestSubstitutions(SoupTest):
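+    # Output formatters control entity substitution when a tree is rendered:
+    # "minimal" (the default) escapes only &, < and >, "html" also uses
+    # named HTML entities, None performs no substitution at all, and a
+    # callable is applied to every string as-is.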
+
+ def test_default_formatter_is_minimal(self):
+        markup = u"<b>&lt;&lt;Sacr\N{LATIN SMALL LETTER E WITH ACUTE} bleu!&gt;&gt;</b>"
+ soup = self.soup(markup)
+ decoded = soup.decode(formatter="minimal")
+        # The < is converted back into &lt; but the e-with-acute is left alone.
+ self.assertEqual(
+ decoded,
+ self.document_for(
+                u"<b>&lt;&lt;Sacr\N{LATIN SMALL LETTER E WITH ACUTE} bleu!&gt;&gt;</b>"))
+
+ def test_formatter_html(self):
+        markup = u"<b>&lt;&lt;Sacr\N{LATIN SMALL LETTER E WITH ACUTE} bleu!&gt;&gt;</b>"
+ soup = self.soup(markup)
+ decoded = soup.decode(formatter="html")
+ self.assertEqual(
+ decoded,
+            self.document_for("<b>&lt;&lt;Sacr&eacute; bleu!&gt;&gt;</b>"))
+
+ def test_formatter_minimal(self):
+        markup = u"<b>&lt;&lt;Sacr\N{LATIN SMALL LETTER E WITH ACUTE} bleu!&gt;&gt;</b>"
+ soup = self.soup(markup)
+ decoded = soup.decode(formatter="minimal")
+        # The < is converted back into &lt; but the e-with-acute is left alone.
+ self.assertEqual(
+ decoded,
+ self.document_for(
+ u"<b><<Sacr\N{LATIN SMALL LETTER E WITH ACUTE} bleu!>></b>"))
+
+ def test_formatter_null(self):
+ markup = u"<b><<Sacr\N{LATIN SMALL LETTER E WITH ACUTE} bleu!>></b>"
+ soup = self.soup(markup)
+ decoded = soup.decode(formatter=None)
+ # Neither the angle brackets nor the e-with-acute are converted.
+ # This is not valid HTML, but it's what the user wanted.
+ self.assertEqual(decoded,
+ self.document_for(u"<b><<Sacr\N{LATIN SMALL LETTER E WITH ACUTE} bleu!>></b>"))
+
+ def test_formatter_custom(self):
+ markup = u"<b><foo></b><b>bar</b>"
+ soup = self.soup(markup)
+ decoded = soup.decode(formatter = lambda x: x.upper())
+ # Instead of normal entity conversion code, the custom
+ # callable is called on every string.
+ self.assertEqual(
+ decoded,
+ self.document_for(u"<b><FOO></b><b>BAR</b>"))
+
+ def test_formatter_is_run_on_attribute_values(self):
+ markup = u'<a href="http://a.com?a=b&c=é">e</a>'
+ soup = self.soup(markup)
+ a = soup.a
+
+ expect_minimal = u'<a href="http://a.com?a=b&amp;c=é">e</a>'
+
+ self.assertEqual(expect_minimal, a.decode())
+ self.assertEqual(expect_minimal, a.decode(formatter="minimal"))
+
+ expect_html = u'<a href="http://a.com?a=b&amp;c=&eacute;">e</a>'
+ self.assertEqual(expect_html, a.decode(formatter="html"))
+
+ self.assertEqual(markup, a.decode(formatter=None))
+ expect_upper = u'<a href="HTTP://A.COM?A=B&C=É">E</a>'
+ self.assertEqual(expect_upper, a.decode(formatter=lambda x: x.upper()))
+
+ def test_formatter_skips_script_tag_for_html_documents(self):
+ doc = """
+ <script type="text/javascript">
+ console.log("< < hey > > ");
+ </script>
+"""
+ encoded = BeautifulSoup(doc).encode()
+ self.assertTrue(b"< < hey > >" in encoded)
+
+ def test_formatter_skips_style_tag_for_html_documents(self):
+ doc = """
+ <style type="text/css">
+ console.log("< < hey > > ");
+ </style>
+"""
+ encoded = BeautifulSoup(doc).encode()
+ self.assertTrue(b"< < hey > >" in encoded)
+
+ def test_prettify_leaves_preformatted_text_alone(self):
+ soup = self.soup("<div> foo <pre> \tbar\n \n </pre> baz ")
+ # Everything outside the <pre> tag is reformatted, but everything
+ # inside is left alone.
+ self.assertEqual(
+ u'<div>\n foo\n <pre> \tbar\n \n </pre>\n baz\n</div>',
+ soup.div.prettify())
+
+ def test_prettify_accepts_formatter(self):
+ soup = BeautifulSoup("<html><body>foo</body></html>")
+ pretty = soup.prettify(formatter = lambda x: x.upper())
+ self.assertTrue("FOO" in pretty)
+
+ def test_prettify_outputs_unicode_by_default(self):
+ soup = self.soup("<a></a>")
+ self.assertEqual(unicode, type(soup.prettify()))
+
+ def test_prettify_can_encode_data(self):
+ soup = self.soup("<a></a>")
+ self.assertEqual(bytes, type(soup.prettify("utf-8")))
+
+ def test_html_entity_substitution_off_by_default(self):
+ markup = u"<b>Sacr\N{LATIN SMALL LETTER E WITH ACUTE} bleu!</b>"
+ soup = self.soup(markup)
+ encoded = soup.b.encode("utf-8")
+ self.assertEqual(encoded, markup.encode('utf-8'))
+
+ def test_encoding_substitution(self):
+ # Here's the <meta> tag saying that a document is
+ # encoded in Shift-JIS.
+ meta_tag = ('<meta content="text/html; charset=x-sjis" '
+ 'http-equiv="Content-type"/>')
+ soup = self.soup(meta_tag)
+
+ # Parse the document, and the charset appears unchanged.
+ self.assertEqual(soup.meta['content'], 'text/html; charset=x-sjis')
+
+ # Encode the document into some encoding, and the encoding is
+ # substituted into the meta tag.
+ utf_8 = soup.encode("utf-8")
+ self.assertTrue(b"charset=utf-8" in utf_8)
+
+ euc_jp = soup.encode("euc_jp")
+ self.assertTrue(b"charset=euc_jp" in euc_jp)
+
+ shift_jis = soup.encode("shift-jis")
+ self.assertTrue(b"charset=shift-jis" in shift_jis)
+
+ utf_16_u = soup.encode("utf-16").decode("utf-16")
+ self.assertTrue("charset=utf-16" in utf_16_u)
+
+ def test_encoding_substitution_doesnt_happen_if_tag_is_strained(self):
+ markup = ('<head><meta content="text/html; charset=x-sjis" '
+ 'http-equiv="Content-type"/></head><pre>foo</pre>')
+
+ # Beautiful Soup used to try to rewrite the meta tag even if the
+ # meta tag got filtered out by the strainer. This test makes
+ # sure that doesn't happen.
+ strainer = SoupStrainer('pre')
+ soup = self.soup(markup, parse_only=strainer)
+ self.assertEqual(soup.contents[0].name, 'pre')
+
+class TestEncoding(SoupTest):
+ """Test the ability to encode objects into strings."""
+
+ def test_unicode_string_can_be_encoded(self):
+ html = u"<b>\N{SNOWMAN}</b>"
+ soup = self.soup(html)
+ self.assertEqual(soup.b.string.encode("utf-8"),
+ u"\N{SNOWMAN}".encode("utf-8"))
+
+ def test_tag_containing_unicode_string_can_be_encoded(self):
+ html = u"<b>\N{SNOWMAN}</b>"
+ soup = self.soup(html)
+ self.assertEqual(
+ soup.b.encode("utf-8"), html.encode("utf-8"))
+
+ def test_encoding_substitutes_unrecognized_characters_by_default(self):
+ html = u"<b>\N{SNOWMAN}</b>"
+ soup = self.soup(html)
+ self.assertEqual(soup.b.encode("ascii"), b"<b>☃</b>")
+
+ def test_encoding_can_be_made_strict(self):
+ html = u"<b>\N{SNOWMAN}</b>"
+ soup = self.soup(html)
+ self.assertRaises(
+ UnicodeEncodeError, soup.encode, "ascii", errors="strict")
+
+ def test_decode_contents(self):
+ html = u"<b>\N{SNOWMAN}</b>"
+ soup = self.soup(html)
+ self.assertEqual(u"\N{SNOWMAN}", soup.b.decode_contents())
+
+ def test_encode_contents(self):
+ html = u"<b>\N{SNOWMAN}</b>"
+ soup = self.soup(html)
+ self.assertEqual(
+ u"\N{SNOWMAN}".encode("utf8"), soup.b.encode_contents(
+ encoding="utf8"))
+
+ def test_deprecated_renderContents(self):
+ html = u"<b>\N{SNOWMAN}</b>"
+ soup = self.soup(html)
+ self.assertEqual(
+ u"\N{SNOWMAN}".encode("utf8"), soup.b.renderContents())
+
+class TestNavigableStringSubclasses(SoupTest):
+
+ def test_cdata(self):
+ # None of the current builders turn CDATA sections into CData
+ # objects, but you can create them manually.
+ soup = self.soup("")
+ cdata = CData("foo")
+ soup.insert(1, cdata)
+ self.assertEqual(str(soup), "<![CDATA[foo]]>")
+ self.assertEqual(soup.find(text="foo"), "foo")
+ self.assertEqual(soup.contents[0], "foo")
+
+ def test_cdata_is_never_formatted(self):
+ """Text inside a CData object is passed into the formatter.
+
+ But the return value is ignored.
+ """
+
+ self.count = 0
+ def increment(*args):
+ self.count += 1
+ return "BITTER FAILURE"
+
+ soup = self.soup("")
+ cdata = CData("<><><>")
+ soup.insert(1, cdata)
+ self.assertEqual(
+ b"<![CDATA[<><><>]]>", soup.encode(formatter=increment))
+ self.assertEqual(1, self.count)
+
+ def test_doctype_ends_in_newline(self):
+ # Unlike other NavigableString subclasses, a DOCTYPE always ends
+ # in a newline.
+ doctype = Doctype("foo")
+ soup = self.soup("")
+ soup.insert(1, doctype)
+ self.assertEqual(soup.encode(), b"<!DOCTYPE foo>\n")
+
+
+class TestSoupSelector(TreeTest):
+
+ HTML = """
+<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
+"http://www.w3.org/TR/html4/strict.dtd">
+<html>
+<head>
+<title>The title</title>
+<link rel="stylesheet" href="blah.css" type="text/css" id="l1">
+</head>
+<body>
+
+<div id="main" class="fancy">
+<div id="inner">
+<h1 id="header1">An H1</h1>
+<p>Some text</p>
+<p class="onep" id="p1">Some more text</p>
+<h2 id="header2">An H2</h2>
+<p class="class1 class2 class3" id="pmulti">Another</p>
+<a href="http://bob.example.org/" rel="friend met" id="bob">Bob</a>
+<h2 id="header3">Another H2</h2>
+<a id="me" href="http://simonwillison.net/" rel="me">me</a>
+<span class="s1">
+<a href="#" id="s1a1">span1a1</a>
+<a href="#" id="s1a2">span1a2 <span id="s1a2s1">test</span></a>
+<span class="span2">
+<a href="#" id="s2a1">span2a1</a>
+</span>
+<span class="span3"></span>
+</span>
+</div>
+<p lang="en" id="lang-en">English</p>
+<p lang="en-gb" id="lang-en-gb">English UK</p>
+<p lang="en-us" id="lang-en-us">English US</p>
+<p lang="fr" id="lang-fr">French</p>
+</div>
+
+<div id="footer">
+</div>
+"""
+
+ def setUp(self):
+ self.soup = BeautifulSoup(self.HTML)
+
+ def assertSelects(self, selector, expected_ids):
+ el_ids = [el['id'] for el in self.soup.select(selector)]
+ el_ids.sort()
+ expected_ids.sort()
+ self.assertEqual(expected_ids, el_ids,
+ "Selector %s, expected [%s], got [%s]" % (
+ selector, ', '.join(expected_ids), ', '.join(el_ids)
+ )
+ )
+
+ assertSelect = assertSelects
+
+ def assertSelectMultiple(self, *tests):
+ for selector, expected_ids in tests:
+ self.assertSelect(selector, expected_ids)
+
+ def test_one_tag_one(self):
+ els = self.soup.select('title')
+ self.assertEqual(len(els), 1)
+ self.assertEqual(els[0].name, 'title')
+ self.assertEqual(els[0].contents, [u'The title'])
+
+ def test_one_tag_many(self):
+ els = self.soup.select('div')
+ self.assertEqual(len(els), 3)
+ for div in els:
+ self.assertEqual(div.name, 'div')
+
+ def test_tag_in_tag_one(self):
+ els = self.soup.select('div div')
+ self.assertSelects('div div', ['inner'])
+
+ def test_tag_in_tag_many(self):
+ for selector in ('html div', 'html body div', 'body div'):
+ self.assertSelects(selector, ['main', 'inner', 'footer'])
+
+ def test_tag_no_match(self):
+ self.assertEqual(len(self.soup.select('del')), 0)
+
+ def test_invalid_tag(self):
+ self.assertRaises(ValueError, self.soup.select, 'tag%t')
+
+ def test_header_tags(self):
+ self.assertSelectMultiple(
+ ('h1', ['header1']),
+ ('h2', ['header2', 'header3']),
+ )
+
+ def test_class_one(self):
+ for selector in ('.onep', 'p.onep', 'html p.onep'):
+ els = self.soup.select(selector)
+ self.assertEqual(len(els), 1)
+ self.assertEqual(els[0].name, 'p')
+ self.assertEqual(els[0]['class'], ['onep'])
+
+ def test_class_mismatched_tag(self):
+ els = self.soup.select('div.onep')
+ self.assertEqual(len(els), 0)
+
+ def test_one_id(self):
+ for selector in ('div#inner', '#inner', 'div div#inner'):
+ self.assertSelects(selector, ['inner'])
+
+ def test_bad_id(self):
+ els = self.soup.select('#doesnotexist')
+ self.assertEqual(len(els), 0)
+
+ def test_items_in_id(self):
+ els = self.soup.select('div#inner p')
+ self.assertEqual(len(els), 3)
+ for el in els:
+ self.assertEqual(el.name, 'p')
+ self.assertEqual(els[1]['class'], ['onep'])
+ self.assertFalse(els[0].has_attr('class'))
+
+ def test_a_bunch_of_emptys(self):
+ for selector in ('div#main del', 'div#main div.oops', 'div div#main'):
+ self.assertEqual(len(self.soup.select(selector)), 0)
+
+ def test_multi_class_support(self):
+ for selector in ('.class1', 'p.class1', '.class2', 'p.class2',
+ '.class3', 'p.class3', 'html p.class2', 'div#inner .class2'):
+ self.assertSelects(selector, ['pmulti'])
+
+ def test_multi_class_selection(self):
+ for selector in ('.class1.class3', '.class3.class2',
+ '.class1.class2.class3'):
+ self.assertSelects(selector, ['pmulti'])
+
+ def test_child_selector(self):
+ self.assertSelects('.s1 > a', ['s1a1', 's1a2'])
+ self.assertSelects('.s1 > a span', ['s1a2s1'])
+
+ def test_child_selector_id(self):
+ self.assertSelects('.s1 > a#s1a2 span', ['s1a2s1'])
+
+ def test_attribute_equals(self):
+ self.assertSelectMultiple(
+ ('p[class="onep"]', ['p1']),
+ ('p[id="p1"]', ['p1']),
+ ('[class="onep"]', ['p1']),
+ ('[id="p1"]', ['p1']),
+ ('link[rel="stylesheet"]', ['l1']),
+ ('link[type="text/css"]', ['l1']),
+ ('link[href="blah.css"]', ['l1']),
+ ('link[href="no-blah.css"]', []),
+ ('[rel="stylesheet"]', ['l1']),
+ ('[type="text/css"]', ['l1']),
+ ('[href="blah.css"]', ['l1']),
+ ('[href="no-blah.css"]', []),
+ ('p[href="no-blah.css"]', []),
+ ('[href="no-blah.css"]', []),
+ )
+
+ def test_attribute_tilde(self):
+ self.assertSelectMultiple(
+ ('p[class~="class1"]', ['pmulti']),
+ ('p[class~="class2"]', ['pmulti']),
+ ('p[class~="class3"]', ['pmulti']),
+ ('[class~="class1"]', ['pmulti']),
+ ('[class~="class2"]', ['pmulti']),
+ ('[class~="class3"]', ['pmulti']),
+ ('a[rel~="friend"]', ['bob']),
+ ('a[rel~="met"]', ['bob']),
+ ('[rel~="friend"]', ['bob']),
+ ('[rel~="met"]', ['bob']),
+ )
+
+ def test_attribute_startswith(self):
+ self.assertSelectMultiple(
+ ('[rel^="style"]', ['l1']),
+ ('link[rel^="style"]', ['l1']),
+ ('notlink[rel^="notstyle"]', []),
+ ('[rel^="notstyle"]', []),
+ ('link[rel^="notstyle"]', []),
+ ('link[href^="bla"]', ['l1']),
+ ('a[href^="http://"]', ['bob', 'me']),
+ ('[href^="http://"]', ['bob', 'me']),
+ ('[id^="p"]', ['pmulti', 'p1']),
+ ('[id^="m"]', ['me', 'main']),
+ ('div[id^="m"]', ['main']),
+ ('a[id^="m"]', ['me']),
+ )
+
+ def test_attribute_endswith(self):
+ self.assertSelectMultiple(
+ ('[href$=".css"]', ['l1']),
+ ('link[href$=".css"]', ['l1']),
+ ('link[id$="1"]', ['l1']),
+ ('[id$="1"]', ['l1', 'p1', 'header1', 's1a1', 's2a1', 's1a2s1']),
+ ('div[id$="1"]', []),
+ ('[id$="noending"]', []),
+ )
+
+ def test_attribute_contains(self):
+ self.assertSelectMultiple(
+ # From test_attribute_startswith
+ ('[rel*="style"]', ['l1']),
+ ('link[rel*="style"]', ['l1']),
+ ('notlink[rel*="notstyle"]', []),
+ ('[rel*="notstyle"]', []),
+ ('link[rel*="notstyle"]', []),
+ ('link[href*="bla"]', ['l1']),
+ ('a[href*="http://"]', ['bob', 'me']),
+ ('[href*="http://"]', ['bob', 'me']),
+ ('[id*="p"]', ['pmulti', 'p1']),
+ ('div[id*="m"]', ['main']),
+ ('a[id*="m"]', ['me']),
+ # From test_attribute_endswith
+ ('[href*=".css"]', ['l1']),
+ ('link[href*=".css"]', ['l1']),
+ ('link[id*="1"]', ['l1']),
+ ('[id*="1"]', ['l1', 'p1', 'header1', 's1a1', 's1a2', 's2a1', 's1a2s1']),
+ ('div[id*="1"]', []),
+ ('[id*="noending"]', []),
+ # New for this test
+ ('[href*="."]', ['bob', 'me', 'l1']),
+ ('a[href*="."]', ['bob', 'me']),
+ ('link[href*="."]', ['l1']),
+ ('div[id*="n"]', ['main', 'inner']),
+ ('div[id*="nn"]', ['inner']),
+ )
+
+ def test_attribute_exact_or_hyphen(self):
+ self.assertSelectMultiple(
+ ('p[lang|="en"]', ['lang-en', 'lang-en-gb', 'lang-en-us']),
+ ('[lang|="en"]', ['lang-en', 'lang-en-gb', 'lang-en-us']),
+ ('p[lang|="fr"]', ['lang-fr']),
+ ('p[lang|="gb"]', []),
+ )
+
+ def test_attribute_exists(self):
+ self.assertSelectMultiple(
+ ('[rel]', ['l1', 'bob', 'me']),
+ ('link[rel]', ['l1']),
+ ('a[rel]', ['bob', 'me']),
+ ('[lang]', ['lang-en', 'lang-en-gb', 'lang-en-us', 'lang-fr']),
+ ('p[class]', ['p1', 'pmulti']),
+ ('[blah]', []),
+ ('p[blah]', []),
+ )
+
+ def test_nth_of_type(self):
+ # Try to select first paragraph
+ els = self.soup.select('div#inner p:nth-of-type(1)')
+ self.assertEqual(len(els), 1)
+ self.assertEqual(els[0].string, u'Some text')
+
+ # Try to select third paragraph
+ els = self.soup.select('div#inner p:nth-of-type(3)')
+ self.assertEqual(len(els), 1)
+ self.assertEqual(els[0].string, u'Another')
+
+ # Try to select (non-existent!) fourth paragraph
+ els = self.soup.select('div#inner p:nth-of-type(4)')
+ self.assertEqual(len(els), 0)
+
+ # Pass in an invalid value.
+ self.assertRaises(
+ ValueError, self.soup.select, 'div p:nth-of-type(0)')
+
+ def test_nth_of_type_direct_descendant(self):
+ els = self.soup.select('div#inner > p:nth-of-type(1)')
+ self.assertEqual(len(els), 1)
+ self.assertEqual(els[0].string, u'Some text')
+
+ def test_id_child_selector_nth_of_type(self):
+ self.assertSelects('#inner > p:nth-of-type(2)', ['p1'])
+
+ def test_select_on_element(self):
+ # Other tests operate on the tree; this operates on an element
+ # within the tree.
+ inner = self.soup.find("div", id="main")
+ selected = inner.select("div")
+ # The <div id="inner"> tag was selected. The <div id="footer">
+ # tag was not.
+ self.assertSelectsIDs(selected, ['inner'])
+
+ def test_overspecified_child_id(self):
+ self.assertSelects(".fancy #inner", ['inner'])
+ self.assertSelects(".normal #inner", [])
+
+ def test_adjacent_sibling_selector(self):
+ self.assertSelects('#p1 + h2', ['header2'])
+ self.assertSelects('#p1 + h2 + p', ['pmulti'])
+ self.assertSelects('#p1 + #header2 + .class1', ['pmulti'])
+ self.assertEqual([], self.soup.select('#p1 + p'))
+
+ def test_general_sibling_selector(self):
+ self.assertSelects('#p1 ~ h2', ['header2', 'header3'])
+ self.assertSelects('#p1 ~ #header2', ['header2'])
+ self.assertSelects('#p1 ~ h2 + a', ['me'])
+ self.assertSelects('#p1 ~ h2 + [rel="me"]', ['me'])
+ self.assertEqual([], self.soup.select('#inner ~ h2'))
+
+ def test_dangling_combinator(self):
+ self.assertRaises(ValueError, self.soup.select, 'h1 >')
+
+ def test_sibling_combinator_wont_select_same_tag_twice(self):
+ self.assertSelects('p[lang] ~ p', ['lang-en-gb', 'lang-en-us', 'lang-fr'])
diff --git a/bitbake/lib/codegen.py b/bitbake/lib/codegen.py
new file mode 100644
index 0000000..be772d5
--- /dev/null
+++ b/bitbake/lib/codegen.py
@@ -0,0 +1,570 @@
+# -*- coding: utf-8 -*-
+"""
+ codegen
+ ~~~~~~~
+
+ Extension to ast that allows ast -> python code generation.
+
+ :copyright: Copyright 2008 by Armin Ronacher.
+ :license: BSD.
+"""
+from ast import *
+
+BOOLOP_SYMBOLS = {
+ And: 'and',
+ Or: 'or'
+}
+
+BINOP_SYMBOLS = {
+ Add: '+',
+ Sub: '-',
+ Mult: '*',
+ Div: '/',
+ FloorDiv: '//',
+ Mod: '%',
+ LShift: '<<',
+ RShift: '>>',
+ BitOr: '|',
+ BitAnd: '&',
+ BitXor: '^'
+}
+
+CMPOP_SYMBOLS = {
+ Eq: '==',
+ Gt: '>',
+ GtE: '>=',
+ In: 'in',
+ Is: 'is',
+ IsNot: 'is not',
+ Lt: '<',
+ LtE: '<=',
+ NotEq: '!=',
+ NotIn: 'not in'
+}
+
+UNARYOP_SYMBOLS = {
+ Invert: '~',
+ Not: 'not',
+ UAdd: '+',
+ USub: '-'
+}
+
+ALL_SYMBOLS = {}
+ALL_SYMBOLS.update(BOOLOP_SYMBOLS)
+ALL_SYMBOLS.update(BINOP_SYMBOLS)
+ALL_SYMBOLS.update(CMPOP_SYMBOLS)
+ALL_SYMBOLS.update(UNARYOP_SYMBOLS)
+
+def to_source(node, indent_with=' ' * 4, add_line_information=False):
+ """This function can convert a node tree back into python sourcecode.
+ This is useful for debugging purposes, especially if you're dealing with
+ custom asts not generated by python itself.
+
+ It could be that the sourcecode is evaluable when the AST itself is not
+ compilable / evaluable. The reason for this is that the AST contains some
+ more data than regular sourcecode does, which is dropped during
+ conversion.
+
+ Each level of indentation is replaced with `indent_with`. By default this
+ parameter is equal to four spaces, as suggested by PEP 8, but it might be
+ adjusted to match the application's style guide.
+
+ If `add_line_information` is set to `True` comments for the line numbers
+ of the nodes are added to the output. This can be used to spot wrong line
+ number information of statement nodes.
+ """
+ generator = SourceGenerator(indent_with, add_line_information)
+ generator.visit(node)
+ return ''.join(generator.result)
+
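+# Illustrative round-trip (a sketch, not part of the original module): parse a
+# snippet with ast.parse() and regenerate it with to_source(). Whitespace and
+# parenthesization may differ from the input, but the result should compile
+# back to an equivalent tree.
+#
+#     import ast
+#     tree = ast.parse("x = a + b")
+#     to_source(tree)                        # -> "x = a + b"
+#     compile(to_source(tree), "<gen>", "exec")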
+
+class SourceGenerator(NodeVisitor):
+ """This visitor is able to transform a well formed syntax tree into python
+ sourcecode. For more details have a look at the docstring of the
+ `node_to_source` function.
+ """
+
+ def __init__(self, indent_with, add_line_information=False):
+ self.result = []
+ self.indent_with = indent_with
+ self.add_line_information = add_line_information
+ self.indentation = 0
+ self.new_lines = 0
+
+ def write(self, x):
+ if self.new_lines:
+ if self.result:
+ self.result.append('\n' * self.new_lines)
+ self.result.append(self.indent_with * self.indentation)
+ self.new_lines = 0
+ self.result.append(x)
+
+ def newline(self, node=None, extra=0):
+ self.new_lines = max(self.new_lines, 1 + extra)
+ if node is not None and self.add_line_information:
+ self.write('# line: %s' % node.lineno)
+ self.new_lines = 1
+
+ def body(self, statements):
+ self.new_lines = max(self.new_lines, 1)
+ self.indentation += 1
+ for stmt in statements:
+ self.visit(stmt)
+ self.indentation -= 1
+
+ def body_or_else(self, node):
+ self.body(node.body)
+ if node.orelse:
+ self.newline()
+ self.write('else:')
+ self.body(node.orelse)
+
+ def signature(self, node):
+ want_comma = []
+ def write_comma():
+ if want_comma:
+ self.write(', ')
+ else:
+ want_comma.append(True)
+
+ padding = [None] * (len(node.args) - len(node.defaults))
+ for arg, default in zip(node.args, padding + node.defaults):
+ write_comma()
+ self.visit(arg)
+ if default is not None:
+ self.write('=')
+ self.visit(default)
+ if node.vararg is not None:
+ write_comma()
+ self.write('*' + node.vararg)
+ if node.kwarg is not None:
+ write_comma()
+ self.write('**' + node.kwarg)
+
+ def decorators(self, node):
+ for decorator in node.decorator_list:
+ self.newline(decorator)
+ self.write('@')
+ self.visit(decorator)
+
+ # Statements
+
+ def visit_Assign(self, node):
+ self.newline(node)
+ for idx, target in enumerate(node.targets):
+ if idx:
+ self.write(', ')
+ self.visit(target)
+ self.write(' = ')
+ self.visit(node.value)
+
+ def visit_AugAssign(self, node):
+ self.newline(node)
+ self.visit(node.target)
+ self.write(BINOP_SYMBOLS[type(node.op)] + '=')
+ self.visit(node.value)
+
+ def visit_ImportFrom(self, node):
+ self.newline(node)
+ self.write('from %s%s import ' % ('.' * node.level, node.module))
+ for idx, item in enumerate(node.names):
+ if idx:
+ self.write(', ')
+ self.visit(item)
+
+ def visit_Import(self, node):
+ self.newline(node)
+ for item in node.names:
+ self.write('import ')
+ self.visit(item)
+
+ def visit_Expr(self, node):
+ self.newline(node)
+ self.generic_visit(node)
+
+ def visit_FunctionDef(self, node):
+ self.newline(extra=1)
+ self.decorators(node)
+ self.newline(node)
+ self.write('def %s(' % node.name)
+ self.signature(node.args)
+ self.write('):')
+ self.body(node.body)
+
+ def visit_ClassDef(self, node):
+ have_args = []
+ def paren_or_comma():
+ if have_args:
+ self.write(', ')
+ else:
+ have_args.append(True)
+ self.write('(')
+
+ self.newline(extra=2)
+ self.decorators(node)
+ self.newline(node)
+ self.write('class %s' % node.name)
+ for base in node.bases:
+ paren_or_comma()
+ self.visit(base)
+ # XXX: the if here is used to keep this module compatible
+ # with python 2.6.
+ if hasattr(node, 'keywords'):
+ for keyword in node.keywords:
+ paren_or_comma()
+ self.write(keyword.arg + '=')
+ self.visit(keyword.value)
+ if node.starargs is not None:
+ paren_or_comma()
+ self.write('*')
+ self.visit(node.starargs)
+ if node.kwargs is not None:
+ paren_or_comma()
+ self.write('**')
+ self.visit(node.kwargs)
+ self.write(have_args and '):' or ':')
+ self.body(node.body)
+
+ def visit_If(self, node):
+ self.newline(node)
+ self.write('if ')
+ self.visit(node.test)
+ self.write(':')
+ self.body(node.body)
+ while True:
+ else_ = node.orelse
+ if len(else_) == 1 and isinstance(else_[0], If):
+ node = else_[0]
+ self.newline()
+ self.write('elif ')
+ self.visit(node.test)
+ self.write(':')
+ self.body(node.body)
+ else:
+ self.newline()
+ self.write('else:')
+ self.body(else_)
+ break
+
+ def visit_For(self, node):
+ self.newline(node)
+ self.write('for ')
+ self.visit(node.target)
+ self.write(' in ')
+ self.visit(node.iter)
+ self.write(':')
+ self.body_or_else(node)
+
+ def visit_While(self, node):
+ self.newline(node)
+ self.write('while ')
+ self.visit(node.test)
+ self.write(':')
+ self.body_or_else(node)
+
+ def visit_With(self, node):
+ self.newline(node)
+ self.write('with ')
+ self.visit(node.context_expr)
+ if node.optional_vars is not None:
+ self.write(' as ')
+ self.visit(node.optional_vars)
+ self.write(':')
+ self.body(node.body)
+
+ def visit_Pass(self, node):
+ self.newline(node)
+ self.write('pass')
+
+ def visit_Print(self, node):
+ # XXX: python 2.6 only
+ self.newline(node)
+ self.write('print ')
+ want_comma = False
+ if node.dest is not None:
+ self.write(' >> ')
+ self.visit(node.dest)
+ want_comma = True
+ for value in node.values:
+ if want_comma:
+ self.write(', ')
+ self.visit(value)
+ want_comma = True
+ if not node.nl:
+ self.write(',')
+
+ def visit_Delete(self, node):
+ self.newline(node)
+ self.write('del ')
+ for idx, target in enumerate(node.targets):
+ if idx:
+ self.write(', ')
+ self.visit(target)
+
+ def visit_TryExcept(self, node):
+ self.newline(node)
+ self.write('try:')
+ self.body(node.body)
+ for handler in node.handlers:
+ self.visit(handler)
+
+ def visit_TryFinally(self, node):
+ self.newline(node)
+ self.write('try:')
+ self.body(node.body)
+ self.newline(node)
+ self.write('finally:')
+ self.body(node.finalbody)
+
+ def visit_Global(self, node):
+ self.newline(node)
+ self.write('global ' + ', '.join(node.names))
+
+ def visit_Nonlocal(self, node):
+ self.newline(node)
+ self.write('nonlocal ' + ', '.join(node.names))
+
+ def visit_Return(self, node):
+ self.newline(node)
+ self.write('return ')
+ self.visit(node.value)
+
+ def visit_Break(self, node):
+ self.newline(node)
+ self.write('break')
+
+ def visit_Continue(self, node):
+ self.newline(node)
+ self.write('continue')
+
+ def visit_Raise(self, node):
+ # XXX: Python 2.6 / 3.0 compatibility
+ self.newline(node)
+ self.write('raise')
+ if hasattr(node, 'exc') and node.exc is not None:
+ self.write(' ')
+ self.visit(node.exc)
+ if node.cause is not None:
+ self.write(' from ')
+ self.visit(node.cause)
+ elif hasattr(node, 'type') and node.type is not None:
+ self.visit(node.type)
+ if node.inst is not None:
+ self.write(', ')
+ self.visit(node.inst)
+ if node.tback is not None:
+ self.write(', ')
+ self.visit(node.tback)
+
+ # Expressions
+
+ def visit_Attribute(self, node):
+ self.visit(node.value)
+ self.write('.' + node.attr)
+
+ def visit_Call(self, node):
+ want_comma = []
+ def write_comma():
+ if want_comma:
+ self.write(', ')
+ else:
+ want_comma.append(True)
+
+ self.visit(node.func)
+ self.write('(')
+ for arg in node.args:
+ write_comma()
+ self.visit(arg)
+ for keyword in node.keywords:
+ write_comma()
+ self.write(keyword.arg + '=')
+ self.visit(keyword.value)
+ if node.starargs is not None:
+ write_comma()
+ self.write('*')
+ self.visit(node.starargs)
+ if node.kwargs is not None:
+ write_comma()
+ self.write('**')
+ self.visit(node.kwargs)
+ self.write(')')
+
+ def visit_Name(self, node):
+ self.write(node.id)
+
+ def visit_Str(self, node):
+ self.write(repr(node.s))
+
+ def visit_Bytes(self, node):
+ self.write(repr(node.s))
+
+ def visit_Num(self, node):
+ self.write(repr(node.n))
+
+ def visit_Tuple(self, node):
+ self.write('(')
+ idx = -1
+ for idx, item in enumerate(node.elts):
+ if idx:
+ self.write(', ')
+ self.visit(item)
+ self.write(idx and ')' or ',)')
+
+ def sequence_visit(left, right):
+ def visit(self, node):
+ self.write(left)
+ for idx, item in enumerate(node.elts):
+ if idx:
+ self.write(', ')
+ self.visit(item)
+ self.write(right)
+ return visit
+
+ visit_List = sequence_visit('[', ']')
+ visit_Set = sequence_visit('{', '}')
+ del sequence_visit
+
+ def visit_Dict(self, node):
+ self.write('{')
+ for idx, (key, value) in enumerate(zip(node.keys, node.values)):
+ if idx:
+ self.write(', ')
+ self.visit(key)
+ self.write(': ')
+ self.visit(value)
+ self.write('}')
+
+ def visit_BinOp(self, node):
+ self.visit(node.left)
+ self.write(' %s ' % BINOP_SYMBOLS[type(node.op)])
+ self.visit(node.right)
+
+ def visit_BoolOp(self, node):
+ self.write('(')
+ for idx, value in enumerate(node.values):
+ if idx:
+ self.write(' %s ' % BOOLOP_SYMBOLS[type(node.op)])
+ self.visit(value)
+ self.write(')')
+
+ def visit_Compare(self, node):
+ self.write('(')
+ self.visit(node.left)
+ for op, right in zip(node.ops, node.comparators):
+ self.write(' %s ' % CMPOP_SYMBOLS[type(op)])
+ self.visit(right)
+ self.write(')')
+
+ def visit_UnaryOp(self, node):
+ self.write('(')
+ op = UNARYOP_SYMBOLS[type(node.op)]
+ self.write(op)
+ if op == 'not':
+ self.write(' ')
+ self.visit(node.operand)
+ self.write(')')
+
+ def visit_Subscript(self, node):
+ self.visit(node.value)
+ self.write('[')
+ self.visit(node.slice)
+ self.write(']')
+
+ def visit_Slice(self, node):
+ if node.lower is not None:
+ self.visit(node.lower)
+ self.write(':')
+ if node.upper is not None:
+ self.visit(node.upper)
+ if node.step is not None:
+ self.write(':')
+ if not (isinstance(node.step, Name) and node.step.id == 'None'):
+ self.visit(node.step)
+
+ def visit_ExtSlice(self, node):
+ for idx, item in enumerate(node.dims):
+ if idx:
+ self.write(', ')
+ self.visit(item)
+
+ def visit_Yield(self, node):
+ self.write('yield ')
+ self.visit(node.value)
+
+ def visit_Lambda(self, node):
+ self.write('lambda ')
+ self.signature(node.args)
+ self.write(': ')
+ self.visit(node.body)
+
+ def visit_Ellipsis(self, node):
+ self.write('Ellipsis')
+
+ def generator_visit(left, right):
+ def visit(self, node):
+ self.write(left)
+ self.visit(node.elt)
+ for comprehension in node.generators:
+ self.visit(comprehension)
+ self.write(right)
+ return visit
+
+ visit_ListComp = generator_visit('[', ']')
+ visit_GeneratorExp = generator_visit('(', ')')
+ visit_SetComp = generator_visit('{', '}')
+ del generator_visit
+
+ def visit_DictComp(self, node):
+ self.write('{')
+ self.visit(node.key)
+ self.write(': ')
+ self.visit(node.value)
+ for comprehension in node.generators:
+ self.visit(comprehension)
+ self.write('}')
+
+ def visit_IfExp(self, node):
+ self.visit(node.body)
+ self.write(' if ')
+ self.visit(node.test)
+ self.write(' else ')
+ self.visit(node.orelse)
+
+ def visit_Starred(self, node):
+ self.write('*')
+ self.visit(node.value)
+
+ def visit_Repr(self, node):
+ # XXX: python 2.6 only
+ self.write('`')
+ self.visit(node.value)
+ self.write('`')
+
+ # Helper Nodes
+
+ def visit_alias(self, node):
+ self.write(node.name)
+ if node.asname is not None:
+ self.write(' as ' + node.asname)
+
+ def visit_comprehension(self, node):
+ self.write(' for ')
+ self.visit(node.target)
+ self.write(' in ')
+ self.visit(node.iter)
+ if node.ifs:
+ for if_ in node.ifs:
+ self.write(' if ')
+ self.visit(if_)
+
+ def visit_excepthandler(self, node):
+ self.newline(node)
+ self.write('except')
+ if node.type is not None:
+ self.write(' ')
+ self.visit(node.type)
+ if node.name is not None:
+ self.write(' as ')
+ self.visit(node.name)
+ self.write(':')
+ self.body(node.body)
diff --git a/bitbake/lib/ply/__init__.py b/bitbake/lib/ply/__init__.py
new file mode 100644
index 0000000..853a985
--- /dev/null
+++ b/bitbake/lib/ply/__init__.py
@@ -0,0 +1,4 @@
+# PLY package
+# Author: David Beazley (dave@dabeaz.com)
+
+__all__ = ['lex','yacc']
diff --git a/bitbake/lib/ply/lex.py b/bitbake/lib/ply/lex.py
new file mode 100644
index 0000000..267ec10
--- /dev/null
+++ b/bitbake/lib/ply/lex.py
@@ -0,0 +1,1058 @@
+# -----------------------------------------------------------------------------
+# ply: lex.py
+#
+# Copyright (C) 2001-2009,
+# David M. Beazley (Dabeaz LLC)
+# All rights reserved.
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+#
+# * Redistributions of source code must retain the above copyright notice,
+# this list of conditions and the following disclaimer.
+# * Redistributions in binary form must reproduce the above copyright notice,
+# this list of conditions and the following disclaimer in the documentation
+# and/or other materials provided with the distribution.
+# * Neither the name of the David Beazley or Dabeaz LLC may be used to
+# endorse or promote products derived from this software without
+# specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+# -----------------------------------------------------------------------------
+
+__version__ = "3.3"
+__tabversion__ = "3.2" # Version of table file used
+
+import re, sys, types, copy, os
+
+# This tuple contains known string types
+try:
+ # Python 2.6
+ StringTypes = (types.StringType, types.UnicodeType)
+except AttributeError:
+ # Python 3.0
+ StringTypes = (str, bytes)
+
+# Extract the code attribute of a function. Different implementations
+# are for Python 2/3 compatibility.
+
+if sys.version_info[0] < 3:
+ def func_code(f):
+ return f.func_code
+else:
+ def func_code(f):
+ return f.__code__
+
+# This regular expression is used to match valid token names
+_is_identifier = re.compile(r'^[a-zA-Z0-9_]+$')
+
+# Exception thrown when invalid token encountered and no default error
+# handler is defined.
+
+class LexError(Exception):
+ def __init__(self,message,s):
+ self.args = (message,)
+ self.text = s
+
+# Token class. This class is used to represent the tokens produced.
+class LexToken(object):
+ def __str__(self):
+ return "LexToken(%s,%r,%d,%d)" % (self.type,self.value,self.lineno,self.lexpos)
+ def __repr__(self):
+ return str(self)
+
+# This object is a stand-in for a logging object created by the
+# logging module.
+
+class PlyLogger(object):
+ def __init__(self,f):
+ self.f = f
+ def critical(self,msg,*args,**kwargs):
+ self.f.write((msg % args) + "\n")
+
+ def warning(self,msg,*args,**kwargs):
+ self.f.write("WARNING: "+ (msg % args) + "\n")
+
+ def error(self,msg,*args,**kwargs):
+ self.f.write("ERROR: " + (msg % args) + "\n")
+
+ info = critical
+ debug = critical
+
+# Null logger is used when no output is generated. Does nothing.
+class NullLogger(object):
+ def __getattribute__(self,name):
+ return self
+ def __call__(self,*args,**kwargs):
+ return self
+
+# -----------------------------------------------------------------------------
+# === Lexing Engine ===
+#
+# The following Lexer class implements the lexer runtime. There are only
+# a few public methods and attributes:
+#
+# input() - Store a new string in the lexer
+# token() - Get the next token
+# clone() - Clone the lexer
+#
+# lineno - Current line number
+# lexpos - Current position in the input string
+# -----------------------------------------------------------------------------
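+#
+# Typical use (an illustrative sketch following the usual PLY conventions; the
+# token names and rules below are placeholders, not part of this file): the
+# calling module declares a 'tokens' list and 't_' rules, builds the lexer
+# with lex(), and then feeds it input:
+#
+#     tokens = ('NUMBER', 'PLUS')
+#     t_PLUS = r'\+'
+#     t_ignore = ' \t'
+#
+#     def t_NUMBER(t):
+#         r'\d+'
+#         t.value = int(t.value)
+#         return t
+#
+#     def t_error(t):
+#         t.lexer.skip(1)
+#
+#     lexer = lex()               # build the master regular expressions
+#     lexer.input("1 + 2")
+#     for tok in lexer:           # iteration calls token() until it returns None
+#         print(tok.type, tok.value)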
+
+class Lexer:
+ def __init__(self):
+ self.lexre = None # Master regular expression. This is a list of
+ # tuples (re,findex) where re is a compiled
+ # regular expression and findex is a list
+ # mapping regex group numbers to rules
+ self.lexretext = None # Current regular expression strings
+ self.lexstatere = {} # Dictionary mapping lexer states to master regexs
+ self.lexstateretext = {} # Dictionary mapping lexer states to regex strings
+ self.lexstaterenames = {} # Dictionary mapping lexer states to symbol names
+ self.lexstate = "INITIAL" # Current lexer state
+ self.lexstatestack = [] # Stack of lexer states
+ self.lexstateinfo = None # State information
+ self.lexstateignore = {} # Dictionary of ignored characters for each state
+ self.lexstateerrorf = {} # Dictionary of error functions for each state
+ self.lexreflags = 0 # Optional re compile flags
+ self.lexdata = None # Actual input data (as a string)
+ self.lexpos = 0 # Current position in input text
+ self.lexlen = 0 # Length of the input text
+ self.lexerrorf = None # Error rule (if any)
+ self.lextokens = None # List of valid tokens
+ self.lexignore = "" # Ignored characters
+ self.lexliterals = "" # Literal characters that can be passed through
+ self.lexmodule = None # Module
+ self.lineno = 1 # Current line number
+ self.lexoptimize = 0 # Optimized mode
+
+ def clone(self,object=None):
+ c = copy.copy(self)
+
+ # If the object parameter has been supplied, it means we are attaching the
+ # lexer to a new object. In this case, we have to rebind all methods in
+ # the lexstatere and lexstateerrorf tables.
+
+ if object:
+ newtab = { }
+ for key, ritem in self.lexstatere.items():
+ newre = []
+ for cre, findex in ritem:
+ newfindex = []
+ for f in findex:
+ if not f or not f[0]:
+ newfindex.append(f)
+ continue
+ newfindex.append((getattr(object,f[0].__name__),f[1]))
+ newre.append((cre,newfindex))
+ newtab[key] = newre
+ c.lexstatere = newtab
+ c.lexstateerrorf = { }
+ for key, ef in self.lexstateerrorf.items():
+ c.lexstateerrorf[key] = getattr(object,ef.__name__)
+ c.lexmodule = object
+ return c
+
+ # ------------------------------------------------------------
+ # writetab() - Write lexer information to a table file
+ # ------------------------------------------------------------
+ def writetab(self,tabfile,outputdir=""):
+ if isinstance(tabfile,types.ModuleType):
+ return
+ basetabfilename = tabfile.split(".")[-1]
+ filename = os.path.join(outputdir,basetabfilename)+".py"
+ tf = open(filename,"w")
+ tf.write("# %s.py. This file automatically created by PLY (version %s). Don't edit!\n" % (tabfile,__version__))
+ tf.write("_tabversion = %s\n" % repr(__version__))
+ tf.write("_lextokens = %s\n" % repr(self.lextokens))
+ tf.write("_lexreflags = %s\n" % repr(self.lexreflags))
+ tf.write("_lexliterals = %s\n" % repr(self.lexliterals))
+ tf.write("_lexstateinfo = %s\n" % repr(self.lexstateinfo))
+
+ tabre = { }
+ # Collect all functions in the initial state
+ initial = self.lexstatere["INITIAL"]
+ initialfuncs = []
+ for part in initial:
+ for f in part[1]:
+ if f and f[0]:
+ initialfuncs.append(f)
+
+ for key, lre in self.lexstatere.items():
+ titem = []
+ for i in range(len(lre)):
+ titem.append((self.lexstateretext[key][i],_funcs_to_names(lre[i][1],self.lexstaterenames[key][i])))
+ tabre[key] = titem
+
+ tf.write("_lexstatere = %s\n" % repr(tabre))
+ tf.write("_lexstateignore = %s\n" % repr(self.lexstateignore))
+
+ taberr = { }
+ for key, ef in self.lexstateerrorf.items():
+ if ef:
+ taberr[key] = ef.__name__
+ else:
+ taberr[key] = None
+ tf.write("_lexstateerrorf = %s\n" % repr(taberr))
+ tf.close()
+
+ # ------------------------------------------------------------
+ # readtab() - Read lexer information from a tab file
+ # ------------------------------------------------------------
+ def readtab(self,tabfile,fdict):
+ if isinstance(tabfile,types.ModuleType):
+ lextab = tabfile
+ else:
+ if sys.version_info[0] < 3:
+ exec("import %s as lextab" % tabfile)
+ else:
+ env = { }
+ exec("import %s as lextab" % tabfile, env,env)
+ lextab = env['lextab']
+
+ if getattr(lextab,"_tabversion","0.0") != __version__:
+ raise ImportError("Inconsistent PLY version")
+
+ self.lextokens = lextab._lextokens
+ self.lexreflags = lextab._lexreflags
+ self.lexliterals = lextab._lexliterals
+ self.lexstateinfo = lextab._lexstateinfo
+ self.lexstateignore = lextab._lexstateignore
+ self.lexstatere = { }
+ self.lexstateretext = { }
+ for key,lre in lextab._lexstatere.items():
+ titem = []
+ txtitem = []
+ for i in range(len(lre)):
+ titem.append((re.compile(lre[i][0],lextab._lexreflags | re.VERBOSE),_names_to_funcs(lre[i][1],fdict)))
+ txtitem.append(lre[i][0])
+ self.lexstatere[key] = titem
+ self.lexstateretext[key] = txtitem
+ self.lexstateerrorf = { }
+ for key,ef in lextab._lexstateerrorf.items():
+ self.lexstateerrorf[key] = fdict[ef]
+ self.begin('INITIAL')
+
+ # ------------------------------------------------------------
+ # input() - Push a new string into the lexer
+ # ------------------------------------------------------------
+ def input(self,s):
+ # Pull off the first character to see if s looks like a string
+ c = s[:1]
+ if not isinstance(c,StringTypes):
+ raise ValueError("Expected a string")
+ self.lexdata = s
+ self.lexpos = 0
+ self.lexlen = len(s)
+
+ # ------------------------------------------------------------
+ # begin() - Changes the lexing state
+ # ------------------------------------------------------------
+ def begin(self,state):
+ if not state in self.lexstatere:
+ raise ValueError("Undefined state")
+ self.lexre = self.lexstatere[state]
+ self.lexretext = self.lexstateretext[state]
+ self.lexignore = self.lexstateignore.get(state,"")
+ self.lexerrorf = self.lexstateerrorf.get(state,None)
+ self.lexstate = state
+
+ # ------------------------------------------------------------
+ # push_state() - Changes the lexing state and saves old on stack
+ # ------------------------------------------------------------
+ def push_state(self,state):
+ self.lexstatestack.append(self.lexstate)
+ self.begin(state)
+
+ # ------------------------------------------------------------
+ # pop_state() - Restores the previous state
+ # ------------------------------------------------------------
+ def pop_state(self):
+ self.begin(self.lexstatestack.pop())
+
+ # ------------------------------------------------------------
+ # current_state() - Returns the current lexing state
+ # ------------------------------------------------------------
+ def current_state(self):
+ return self.lexstate
+
+ # ------------------------------------------------------------
+ # skip() - Skip ahead n characters
+ # ------------------------------------------------------------
+ def skip(self,n):
+ self.lexpos += n
+
+ # ------------------------------------------------------------
+ # token() - Return the next token from the Lexer
+ #
+ # Note: This function has been carefully implemented to be as fast
+ # as possible. Don't make changes unless you really know what
+ # you are doing
+ # ------------------------------------------------------------
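+ # A successful call returns a LexToken whose .type, .value, .lineno and
+ # .lexpos attributes describe the match; None is returned once the input
+ # is exhausted.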
+ def token(self):
+ # Make local copies of frequently referenced attributes
+ lexpos = self.lexpos
+ lexlen = self.lexlen
+ lexignore = self.lexignore
+ lexdata = self.lexdata
+
+ while lexpos < lexlen:
+ # This code provides some short-circuit code for whitespace, tabs, and other ignored characters
+ if lexdata[lexpos] in lexignore:
+ lexpos += 1
+ continue
+
+ # Look for a regular expression match
+ for lexre,lexindexfunc in self.lexre:
+ m = lexre.match(lexdata,lexpos)
+ if not m: continue
+
+ # Create a token for return
+ tok = LexToken()
+ tok.value = m.group()
+ tok.lineno = self.lineno
+ tok.lexpos = lexpos
+
+ i = m.lastindex
+ func,tok.type = lexindexfunc[i]
+
+ if not func:
+ # If no token type was set, it's an ignored token
+ if tok.type:
+ self.lexpos = m.end()
+ return tok
+ else:
+ lexpos = m.end()
+ break
+
+ lexpos = m.end()
+
+ # If token is processed by a function, call it
+
+ tok.lexer = self # Set additional attributes useful in token rules
+ self.lexmatch = m
+ self.lexpos = lexpos
+
+ newtok = func(tok)
+
+ # Every function must return a token; if it returns nothing, we just move to the next token
+ if not newtok:
+ lexpos = self.lexpos # This is here in case user has updated lexpos.
+ lexignore = self.lexignore # This is here in case there was a state change
+ break
+
+ # Verify type of the token. If not in the token map, raise an error
+ if not self.lexoptimize:
+ if not newtok.type in self.lextokens:
+ raise LexError("%s:%d: Rule '%s' returned an unknown token type '%s'" % (
+ func_code(func).co_filename, func_code(func).co_firstlineno,
+ func.__name__, newtok.type),lexdata[lexpos:])
+
+ return newtok
+ else:
+ # No match, see if in literals
+ if lexdata[lexpos] in self.lexliterals:
+ tok = LexToken()
+ tok.value = lexdata[lexpos]
+ tok.lineno = self.lineno
+ tok.type = tok.value
+ tok.lexpos = lexpos
+ self.lexpos = lexpos + 1
+ return tok
+
+ # No match. Call t_error() if defined.
+ if self.lexerrorf:
+ tok = LexToken()
+ tok.value = self.lexdata[lexpos:]
+ tok.lineno = self.lineno
+ tok.type = "error"
+ tok.lexer = self
+ tok.lexpos = lexpos
+ self.lexpos = lexpos
+ newtok = self.lexerrorf(tok)
+ if lexpos == self.lexpos:
+ # Error method didn't change text position at all. This is an error.
+ raise LexError("Scanning error. Illegal character '%s'" % (lexdata[lexpos]), lexdata[lexpos:])
+ lexpos = self.lexpos
+ if not newtok: continue
+ return newtok
+
+ self.lexpos = lexpos
+ raise LexError("Illegal character '%s' at index %d" % (lexdata[lexpos],lexpos), lexdata[lexpos:])
+
+ self.lexpos = lexpos + 1
+ if self.lexdata is None:
+ raise RuntimeError("No input string given with input()")
+ return None
+
+ # Iterator interface
+ def __iter__(self):
+ return self
+
+ def next(self):
+ t = self.token()
+ if t is None:
+ raise StopIteration
+ return t
+
+ __next__ = next
+
+# -----------------------------------------------------------------------------
+# === Lex Builder ===
+#
+# The functions and classes below are used to collect lexing information
+# and build a Lexer object from it.
+# -----------------------------------------------------------------------------
+
+# -----------------------------------------------------------------------------
+# get_caller_module_dict()
+#
+# This function returns a dictionary containing all of the symbols defined within
+# a caller further down the call stack. This is used to get the environment
+# associated with the lex() call if none was provided.
+# -----------------------------------------------------------------------------
+
+def get_caller_module_dict(levels):
+ try:
+ raise RuntimeError
+ except RuntimeError:
+ e,b,t = sys.exc_info()
+ f = t.tb_frame
+ while levels > 0:
+ f = f.f_back
+ levels -= 1
+ ldict = f.f_globals.copy()
+ if f.f_globals != f.f_locals:
+ ldict.update(f.f_locals)
+
+ return ldict
+
+# -----------------------------------------------------------------------------
+# _funcs_to_names()
+#
+# Given a list of regular expression functions, this converts it to a list
+# suitable for output to a table file
+# -----------------------------------------------------------------------------
+
+def _funcs_to_names(funclist,namelist):
+ result = []
+ for f,name in zip(funclist,namelist):
+ if f and f[0]:
+ result.append((name, f[1]))
+ else:
+ result.append(f)
+ return result
+
+# -----------------------------------------------------------------------------
+# _names_to_funcs()
+#
+# Given a list of regular expression function names, this converts it back to
+# functions.
+# -----------------------------------------------------------------------------
+
+def _names_to_funcs(namelist,fdict):
+ result = []
+ for n in namelist:
+ if n and n[0]:
+ result.append((fdict[n[0]],n[1]))
+ else:
+ result.append(n)
+ return result
+
+# -----------------------------------------------------------------------------
+# _form_master_re()
+#
+# This function takes a list of all of the regex components and attempts to
+# form the master regular expression. Given limitations in the Python re
+# module, it may be necessary to break the master regex into separate expressions.
+# -----------------------------------------------------------------------------
+
+def _form_master_re(relist,reflags,ldict,toknames):
+ if not relist: return []
+ regex = "|".join(relist)
+ try:
+ lexre = re.compile(regex,re.VERBOSE | reflags)
+
+ # Build the index to function map for the matching engine
+ lexindexfunc = [ None ] * (max(lexre.groupindex.values())+1)
+ lexindexnames = lexindexfunc[:]
+
+ for f,i in lexre.groupindex.items():
+ handle = ldict.get(f,None)
+ if type(handle) in (types.FunctionType, types.MethodType):
+ lexindexfunc[i] = (handle,toknames[f])
+ lexindexnames[i] = f
+ elif handle is not None:
+ lexindexnames[i] = f
+ if f.find("ignore_") > 0:
+ lexindexfunc[i] = (None,None)
+ else:
+ lexindexfunc[i] = (None, toknames[f])
+
+ return [(lexre,lexindexfunc)],[regex],[lexindexnames]
+ except Exception:
+ m = int(len(relist)/2)
+ if m == 0: m = 1
+ llist, lre, lnames = _form_master_re(relist[:m],reflags,ldict,toknames)
+ rlist, rre, rnames = _form_master_re(relist[m:],reflags,ldict,toknames)
+ return llist+rlist, lre+rre, lnames+rnames
+
+# -----------------------------------------------------------------------------
+# def _statetoken(s,names)
+#
+# Given a declaration name s of the form "t_" and a dictionary whose keys are
+# state names, this function returns a tuple (states,tokenname) where states
+# is a tuple of state names and tokenname is the name of the token. For example,
+# calling this with s = "t_foo_bar_SPAM" might return (('foo','bar'),'SPAM')
+# -----------------------------------------------------------------------------
+
+def _statetoken(s,names):
+ nonstate = 1
+ parts = s.split("_")
+ for i in range(1,len(parts)):
+ if not parts[i] in names and parts[i] != 'ANY': break
+ if i > 1:
+ states = tuple(parts[1:i])
+ else:
+ states = ('INITIAL',)
+
+ if 'ANY' in states:
+ states = tuple(names)
+
+ tokenname = "_".join(parts[i:])
+ return (states,tokenname)
+
+
+# -----------------------------------------------------------------------------
+# LexerReflect()
+#
+# This class represents information needed to build a lexer as extracted from a
+# user's input file.
+# -----------------------------------------------------------------------------
+class LexerReflect(object):
+ def __init__(self,ldict,log=None,reflags=0):
+ self.ldict = ldict
+ self.error_func = None
+ self.tokens = []
+ self.reflags = reflags
+ self.stateinfo = { 'INITIAL' : 'inclusive'}
+ self.files = {}
+ self.error = 0
+
+ if log is None:
+ self.log = PlyLogger(sys.stderr)
+ else:
+ self.log = log
+
+ # Get all of the basic information
+ def get_all(self):
+ self.get_tokens()
+ self.get_literals()
+ self.get_states()
+ self.get_rules()
+
+ # Validate all of the information
+ def validate_all(self):
+ self.validate_tokens()
+ self.validate_literals()
+ self.validate_rules()
+ return self.error
+
+ # Get the tokens map
+ def get_tokens(self):
+ tokens = self.ldict.get("tokens",None)
+ if not tokens:
+ self.log.error("No token list is defined")
+ self.error = 1
+ return
+
+ if not isinstance(tokens,(list, tuple)):
+ self.log.error("tokens must be a list or tuple")
+ self.error = 1
+ return
+
+ if not tokens:
+ self.log.error("tokens is empty")
+ self.error = 1
+ return
+
+ self.tokens = tokens
+
+ # Validate the tokens
+ def validate_tokens(self):
+ terminals = {}
+ for n in self.tokens:
+ if not _is_identifier.match(n):
+ self.log.error("Bad token name '%s'",n)
+ self.error = 1
+ if n in terminals:
+ self.log.warning("Token '%s' multiply defined", n)
+ terminals[n] = 1
+
+ # Get the literals specifier
+ def get_literals(self):
+ self.literals = self.ldict.get("literals","")
+
+ # Validate literals
+ def validate_literals(self):
+ try:
+ for c in self.literals:
+ if not isinstance(c,StringTypes) or len(c) > 1:
+ self.log.error("Invalid literal %s. Must be a single character", repr(c))
+ self.error = 1
+ continue
+
+ except TypeError:
+ self.log.error("Invalid literals specification. literals must be a sequence of characters")
+ self.error = 1
+
+ def get_states(self):
+ self.states = self.ldict.get("states",None)
+ # Build statemap
+ if self.states:
+ if not isinstance(self.states,(tuple,list)):
+ self.log.error("states must be defined as a tuple or list")
+ self.error = 1
+ else:
+ for s in self.states:
+ if not isinstance(s,tuple) or len(s) != 2:
+ self.log.error("Invalid state specifier %s. Must be a tuple (statename,'exclusive|inclusive')",repr(s))
+ self.error = 1
+ continue
+ name, statetype = s
+ if not isinstance(name,StringTypes):
+ self.log.error("State name %s must be a string", repr(name))
+ self.error = 1
+ continue
+ if not (statetype == 'inclusive' or statetype == 'exclusive'):
+ self.log.error("State type for state %s must be 'inclusive' or 'exclusive'",name)
+ self.error = 1
+ continue
+ if name in self.stateinfo:
+ self.log.error("State '%s' already defined",name)
+ self.error = 1
+ continue
+ self.stateinfo[name] = statetype
+
+ # Get all of the symbols with a t_ prefix and sort them into various
+ # categories (functions, strings, error functions, and ignore characters)
+
+ def get_rules(self):
+ tsymbols = [f for f in self.ldict if f[:2] == 't_' ]
+
+ # Now build up a list of functions and a list of strings
+
+ self.toknames = { } # Mapping of symbols to token names
+ self.funcsym = { } # Symbols defined as functions
+ self.strsym = { } # Symbols defined as strings
+ self.ignore = { } # Ignore strings by state
+ self.errorf = { } # Error functions by state
+
+ for s in self.stateinfo:
+ self.funcsym[s] = []
+ self.strsym[s] = []
+
+ if len(tsymbols) == 0:
+ self.log.error("No rules of the form t_rulename are defined")
+ self.error = 1
+ return
+
+ for f in tsymbols:
+ t = self.ldict[f]
+ states, tokname = _statetoken(f,self.stateinfo)
+ self.toknames[f] = tokname
+
+ if hasattr(t,"__call__"):
+ if tokname == 'error':
+ for s in states:
+ self.errorf[s] = t
+ elif tokname == 'ignore':
+ line = func_code(t).co_firstlineno
+ file = func_code(t).co_filename
+ self.log.error("%s:%d: Rule '%s' must be defined as a string",file,line,t.__name__)
+ self.error = 1
+ else:
+ for s in states:
+ self.funcsym[s].append((f,t))
+ elif isinstance(t, StringTypes):
+ if tokname == 'ignore':
+ for s in states:
+ self.ignore[s] = t
+ if "\\" in t:
+ self.log.warning("%s contains a literal backslash '\\'",f)
+
+ elif tokname == 'error':
+ self.log.error("Rule '%s' must be defined as a function", f)
+ self.error = 1
+ else:
+ for s in states:
+ self.strsym[s].append((f,t))
+ else:
+ self.log.error("%s not defined as a function or string", f)
+ self.error = 1
+
+ # Sort the functions by line number
+ for f in self.funcsym.values():
+ if sys.version_info[0] < 3:
+ f.sort(lambda x,y: cmp(func_code(x[1]).co_firstlineno,func_code(y[1]).co_firstlineno))
+ else:
+ # Python 3.0
+ f.sort(key=lambda x: func_code(x[1]).co_firstlineno)
+
+ # Sort the strings by regular expression length
+ for s in self.strsym.values():
+ if sys.version_info[0] < 3:
+ s.sort(lambda x,y: (len(x[1]) < len(y[1])) - (len(x[1]) > len(y[1])))
+ else:
+ # Python 3.0
+ s.sort(key=lambda x: len(x[1]),reverse=True)
+
+ # Validate all of the t_rules collected
+ def validate_rules(self):
+ for state in self.stateinfo:
+ # Validate all rules defined by functions
+
+
+
+ for fname, f in self.funcsym[state]:
+ line = func_code(f).co_firstlineno
+ file = func_code(f).co_filename
+ self.files[file] = 1
+
+ tokname = self.toknames[fname]
+ if isinstance(f, types.MethodType):
+ reqargs = 2
+ else:
+ reqargs = 1
+ nargs = func_code(f).co_argcount
+ if nargs > reqargs:
+ self.log.error("%s:%d: Rule '%s' has too many arguments",file,line,f.__name__)
+ self.error = 1
+ continue
+
+ if nargs < reqargs:
+ self.log.error("%s:%d: Rule '%s' requires an argument", file,line,f.__name__)
+ self.error = 1
+ continue
+
+ if not f.__doc__:
+ self.log.error("%s:%d: No regular expression defined for rule '%s'",file,line,f.__name__)
+ self.error = 1
+ continue
+
+ try:
+ c = re.compile("(?P<%s>%s)" % (fname,f.__doc__), re.VERBOSE | self.reflags)
+ if c.match(""):
+ self.log.error("%s:%d: Regular expression for rule '%s' matches empty string", file,line,f.__name__)
+ self.error = 1
+ except re.error:
+ _etype, e, _etrace = sys.exc_info()
+ self.log.error("%s:%d: Invalid regular expression for rule '%s'. %s", file,line,f.__name__,e)
+ if '#' in f.__doc__:
+ self.log.error("%s:%d. Make sure '#' in rule '%s' is escaped with '\\#'",file,line, f.__name__)
+ self.error = 1
+
+ # Validate all rules defined by strings
+ for name,r in self.strsym[state]:
+ tokname = self.toknames[name]
+ if tokname == 'error':
+ self.log.error("Rule '%s' must be defined as a function", name)
+ self.error = 1
+ continue
+
+ if not tokname in self.tokens and tokname.find("ignore_") < 0:
+ self.log.error("Rule '%s' defined for an unspecified token %s",name,tokname)
+ self.error = 1
+ continue
+
+ try:
+ c = re.compile("(?P<%s>%s)" % (name,r),re.VERBOSE | self.reflags)
+ if (c.match("")):
+ self.log.error("Regular expression for rule '%s' matches empty string",name)
+ self.error = 1
+ except re.error:
+ _etype, e, _etrace = sys.exc_info()
+ self.log.error("Invalid regular expression for rule '%s'. %s",name,e)
+ if '#' in r:
+ self.log.error("Make sure '#' in rule '%s' is escaped with '\\#'",name)
+ self.error = 1
+
+ if not self.funcsym[state] and not self.strsym[state]:
+ self.log.error("No rules defined for state '%s'",state)
+ self.error = 1
+
+ # Validate the error function
+ efunc = self.errorf.get(state,None)
+ if efunc:
+ f = efunc
+ line = func_code(f).co_firstlineno
+ file = func_code(f).co_filename
+ self.files[file] = 1
+
+ if isinstance(f, types.MethodType):
+ reqargs = 2
+ else:
+ reqargs = 1
+ nargs = func_code(f).co_argcount
+ if nargs > reqargs:
+ self.log.error("%s:%d: Rule '%s' has too many arguments",file,line,f.__name__)
+ self.error = 1
+
+ if nargs < reqargs:
+ self.log.error("%s:%d: Rule '%s' requires an argument", file,line,f.__name__)
+ self.error = 1
+
+ for f in self.files:
+ self.validate_file(f)
+
+
+ # -----------------------------------------------------------------------------
+ # validate_file()
+ #
+ # This checks to see if there are duplicated t_rulename() functions or strings
+ # in the parser input file. This is done using a simple regular expression
+ # match on each line in the given file.
+ # -----------------------------------------------------------------------------
+
+ def validate_file(self,filename):
+ import os.path
+ base,ext = os.path.splitext(filename)
+ if ext != '.py': return # No idea what the file is. Return OK
+
+ try:
+ f = open(filename)
+ lines = f.readlines()
+ f.close()
+ except IOError:
+ return # Couldn't find the file. Don't worry about it
+
+ fre = re.compile(r'\s*def\s+(t_[a-zA-Z_0-9]*)\(')
+ sre = re.compile(r'\s*(t_[a-zA-Z_0-9]*)\s*=')
+
+ counthash = { }
+ linen = 1
+ for l in lines:
+ m = fre.match(l)
+ if not m:
+ m = sre.match(l)
+ if m:
+ name = m.group(1)
+ prev = counthash.get(name)
+ if not prev:
+ counthash[name] = linen
+ else:
+ self.log.error("%s:%d: Rule %s redefined. Previously defined on line %d",filename,linen,name,prev)
+ self.error = 1
+ linen += 1
+
+# -----------------------------------------------------------------------------
+# lex(module)
+#
+# Build all of the regular expression rules from definitions in the supplied module
+# -----------------------------------------------------------------------------
+def lex(module=None,object=None,debug=0,optimize=0,lextab="lextab",reflags=0,nowarn=0,outputdir="", debuglog=None, errorlog=None):
+ global lexer
+ ldict = None
+ stateinfo = { 'INITIAL' : 'inclusive'}
+ lexobj = Lexer()
+ lexobj.lexoptimize = optimize
+ global token,input
+
+ if errorlog is None:
+ errorlog = PlyLogger(sys.stderr)
+
+ if debug:
+ if debuglog is None:
+ debuglog = PlyLogger(sys.stderr)
+
+ # Get the module dictionary used for the lexer
+ if object: module = object
+
+ if module:
+ _items = [(k,getattr(module,k)) for k in dir(module)]
+ ldict = dict(_items)
+ else:
+ ldict = get_caller_module_dict(2)
+
+ # Collect parser information from the dictionary
+ linfo = LexerReflect(ldict,log=errorlog,reflags=reflags)
+ linfo.get_all()
+ if not optimize:
+ if linfo.validate_all():
+ raise SyntaxError("Can't build lexer")
+
+ if optimize and lextab:
+ try:
+ lexobj.readtab(lextab,ldict)
+ token = lexobj.token
+ input = lexobj.input
+ lexer = lexobj
+ return lexobj
+
+ except ImportError:
+ pass
+
+ # Dump some basic debugging information
+ if debug:
+ debuglog.info("lex: tokens = %r", linfo.tokens)
+ debuglog.info("lex: literals = %r", linfo.literals)
+ debuglog.info("lex: states = %r", linfo.stateinfo)
+
+ # Build a dictionary of valid token names
+ lexobj.lextokens = { }
+ for n in linfo.tokens:
+ lexobj.lextokens[n] = 1
+
+ # Get literals specification
+ if isinstance(linfo.literals,(list,tuple)):
+ lexobj.lexliterals = type(linfo.literals[0])().join(linfo.literals)
+ else:
+ lexobj.lexliterals = linfo.literals
+
+ # Get the stateinfo dictionary
+ stateinfo = linfo.stateinfo
+
+ regexs = { }
+ # Build the master regular expressions
+ for state in stateinfo:
+ regex_list = []
+
+ # Add rules defined by functions first
+ for fname, f in linfo.funcsym[state]:
+ line = func_code(f).co_firstlineno
+ file = func_code(f).co_filename
+ regex_list.append("(?P<%s>%s)" % (fname,f.__doc__))
+ if debug:
+ debuglog.info("lex: Adding rule %s -> '%s' (state '%s')",fname,f.__doc__, state)
+
+ # Now add all of the simple rules
+ for name,r in linfo.strsym[state]:
+ regex_list.append("(?P<%s>%s)" % (name,r))
+ if debug:
+ debuglog.info("lex: Adding rule %s -> '%s' (state '%s')",name,r, state)
+
+ regexs[state] = regex_list
+
+ # Build the master regular expressions
+
+ if debug:
+ debuglog.info("lex: ==== MASTER REGEXS FOLLOW ====")
+
+ for state in regexs:
+ lexre, re_text, re_names = _form_master_re(regexs[state],reflags,ldict,linfo.toknames)
+ lexobj.lexstatere[state] = lexre
+ lexobj.lexstateretext[state] = re_text
+ lexobj.lexstaterenames[state] = re_names
+ if debug:
+ for i in range(len(re_text)):
+ debuglog.info("lex: state '%s' : regex[%d] = '%s'",state, i, re_text[i])
+
+ # For inclusive states, we need to add the regular expressions from the INITIAL state
+ for state,stype in stateinfo.items():
+ if state != "INITIAL" and stype == 'inclusive':
+ lexobj.lexstatere[state].extend(lexobj.lexstatere['INITIAL'])
+ lexobj.lexstateretext[state].extend(lexobj.lexstateretext['INITIAL'])
+ lexobj.lexstaterenames[state].extend(lexobj.lexstaterenames['INITIAL'])
+
+ lexobj.lexstateinfo = stateinfo
+ lexobj.lexre = lexobj.lexstatere["INITIAL"]
+ lexobj.lexretext = lexobj.lexstateretext["INITIAL"]
+ lexobj.lexreflags = reflags
+
+ # Set up ignore variables
+ lexobj.lexstateignore = linfo.ignore
+ lexobj.lexignore = lexobj.lexstateignore.get("INITIAL","")
+
+ # Set up error functions
+ lexobj.lexstateerrorf = linfo.errorf
+ lexobj.lexerrorf = linfo.errorf.get("INITIAL",None)
+ if not lexobj.lexerrorf:
+ errorlog.warning("No t_error rule is defined")
+
+ # Check state information for ignore and error rules
+ for s,stype in stateinfo.items():
+ if stype == 'exclusive':
+ if not s in linfo.errorf:
+ errorlog.warning("No error rule is defined for exclusive state '%s'", s)
+ if not s in linfo.ignore and lexobj.lexignore:
+ errorlog.warning("No ignore rule is defined for exclusive state '%s'", s)
+ elif stype == 'inclusive':
+ if not s in linfo.errorf:
+ linfo.errorf[s] = linfo.errorf.get("INITIAL",None)
+ if not s in linfo.ignore:
+ linfo.ignore[s] = linfo.ignore.get("INITIAL","")
+
+ # Create global versions of the token() and input() functions
+ token = lexobj.token
+ input = lexobj.input
+ lexer = lexobj
+
+ # If in optimize mode, we write the lextab
+ if lextab and optimize:
+ lexobj.writetab(lextab,outputdir)
+
+ return lexobj
+
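+# Example (illustrative sketch; the token names and rules below are made up,
+# not part of this module): a caller normally defines its t_* rules at module
+# level and then calls lex() to collect them into a Lexer instance.
+#
+#     tokens = ('NUMBER', 'PLUS')
+#     t_PLUS   = r'\+'
+#     t_ignore = ' \t'
+#
+#     def t_NUMBER(t):
+#         r'\d+'
+#         t.value = int(t.value)
+#         return t
+#
+#     def t_error(t):
+#         t.lexer.skip(1)
+#
+#     lexer = lex()              # collects the rules from the calling module
+#     lexer.input("1 + 2")
+#     while True:
+#         tok = lexer.token()
+#         if not tok:
+#             break
+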
+# -----------------------------------------------------------------------------
+# runmain()
+#
+# This runs the lexer as a main program
+# -----------------------------------------------------------------------------
+
+def runmain(lexer=None,data=None):
+ if not data:
+ try:
+ filename = sys.argv[1]
+ f = open(filename)
+ data = f.read()
+ f.close()
+ except IndexError:
+ sys.stdout.write("Reading from standard input (type EOF to end):\n")
+ data = sys.stdin.read()
+
+ if lexer:
+ _input = lexer.input
+ else:
+ _input = input
+ _input(data)
+ if lexer:
+ _token = lexer.token
+ else:
+ _token = token
+
+ while 1:
+ tok = _token()
+ if not tok: break
+ sys.stdout.write("(%s,%r,%d,%d)\n" % (tok.type, tok.value, tok.lineno,tok.lexpos))
+
+# -----------------------------------------------------------------------------
+# @TOKEN(regex)
+#
+# This decorator can be used to attach a regular expression to a token rule
+# function when its docstring needs to be set in another way (for example,
+# when the pattern is kept in a variable).  See the usage sketch below.
+# -----------------------------------------------------------------------------
+
+def TOKEN(r):
+ def set_doc(f):
+ if hasattr(r,"__call__"):
+ f.__doc__ = r.__doc__
+ else:
+ f.__doc__ = r
+ return f
+ return set_doc
+
+# Alternative spelling of the TOKEN decorator
+Token = TOKEN
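+
+# Example (illustrative sketch; the token name NUMBER and the pattern variable
+# are made up, and NUMBER is assumed to be listed in the module's tokens):
+# @TOKEN attaches a pattern kept in a variable to a rule whose docstring would
+# otherwise have to hold the regular expression.
+#
+#     digit = r'\d+'
+#
+#     @TOKEN(digit)
+#     def t_NUMBER(t):
+#         t.value = int(t.value)
+#         return t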
+
diff --git a/bitbake/lib/ply/yacc.py b/bitbake/lib/ply/yacc.py
new file mode 100644
index 0000000..6168fd9
--- /dev/null
+++ b/bitbake/lib/ply/yacc.py
@@ -0,0 +1,3276 @@
+# -----------------------------------------------------------------------------
+# ply: yacc.py
+#
+# Copyright (C) 2001-2009,
+# David M. Beazley (Dabeaz LLC)
+# All rights reserved.
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+#
+# * Redistributions of source code must retain the above copyright notice,
+# this list of conditions and the following disclaimer.
+# * Redistributions in binary form must reproduce the above copyright notice,
+# this list of conditions and the following disclaimer in the documentation
+# and/or other materials provided with the distribution.
+# * Neither the name of the David Beazley or Dabeaz LLC may be used to
+# endorse or promote products derived from this software without
+# specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+# -----------------------------------------------------------------------------
+#
+# This implements an LR parser that is constructed from grammar rules defined
+# as Python functions. The grammar is specified by supplying the BNF inside
+# Python documentation strings. The inspiration for this technique was borrowed
+# from John Aycock's Spark parsing system. PLY might be viewed as a cross between
+# Spark and the GNU bison utility.
+#
+# The current implementation is only somewhat object-oriented. The
+# LR parser itself is defined in terms of an object (which allows multiple
+# parsers to co-exist). However, most of the variables used during table
+# construction are defined in terms of global variables. Users shouldn't
+# notice unless they are trying to define multiple parsers at the same
+# time using threads (in which case they should have their head examined).
+#
+# This implementation supports both SLR and LALR(1) parsing. LALR(1)
+# support was originally implemented by Elias Ioup (ezioup@alumni.uchicago.edu),
+# using the algorithm found in Aho, Sethi, and Ullman "Compilers: Principles,
+# Techniques, and Tools" (The Dragon Book). LALR(1) has since been replaced
+# by the more efficient DeRemer and Pennello algorithm.
+#
+# :::::::: WARNING :::::::
+#
+# Construction of LR parsing tables is fairly complicated and expensive.
+# To make this module run fast, a *LOT* of work has been put into
+# optimization---often at the expense of readability and what one might
+# consider to be good Python "coding style."  Modify the code at your
+# own risk!
+# ----------------------------------------------------------------------------
+
+__version__ = "3.3"
+__tabversion__ = "3.2" # Table version
+
+#-----------------------------------------------------------------------------
+# === User configurable parameters ===
+#
+# Change these to modify the default behavior of yacc (if you wish)
+#-----------------------------------------------------------------------------
+
+yaccdebug = 0 # Debugging mode. If set, yacc generates a
+                               # 'parser.out' file in the current directory
+
+debug_file = 'parser.out' # Default name of the debugging file
+tab_module = 'parsetab' # Default name of the table module
+default_lr = 'LALR' # Default LR table generation method
+
+error_count = 3 # Number of symbols that must be shifted to leave recovery mode
+
+yaccdevel = 0 # Set to True if developing yacc. This turns off optimized
+ # implementations of certain functions.
+
+resultlimit = 40 # Size limit of results when running in debug mode.
+
+pickle_protocol = 0 # Protocol to use when writing pickle files
+
+import re, types, sys, os.path
+
+# Compatibility function for python 2.6/3.0
+if sys.version_info[0] < 3:
+ def func_code(f):
+ return f.func_code
+else:
+ def func_code(f):
+ return f.__code__
+
+# Compatibility
+try:
+ MAXINT = sys.maxint
+except AttributeError:
+ MAXINT = sys.maxsize
+
+# Python 2.x/3.0 compatibility.
+def load_ply_lex():
+ if sys.version_info[0] < 3:
+ import lex
+ else:
+ import ply.lex as lex
+ return lex
+
+# This object is a stand-in for a logging object created by the
+# logging module. PLY will use this by default to create things
+# such as the parser.out file. If a user wants more detailed
+# information, they can create their own logging object and pass
+# it into PLY.
+
+class PlyLogger(object):
+ def __init__(self,f):
+ self.f = f
+ def debug(self,msg,*args,**kwargs):
+ self.f.write((msg % args) + "\n")
+ info = debug
+
+ def warning(self,msg,*args,**kwargs):
+ self.f.write("WARNING: "+ (msg % args) + "\n")
+
+ def error(self,msg,*args,**kwargs):
+ self.f.write("ERROR: " + (msg % args) + "\n")
+
+ critical = debug
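+
+# Example (illustrative): a standard library logger can be supplied instead,
+# since it provides the same debug/info/warning/error/critical methods:
+#
+#     import logging
+#     logging.basicConfig(level=logging.DEBUG)
+#     log = logging.getLogger()
+#     # ...later passed as errorlog=log and/or debuglog=log to lex()/yacc()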
+
+# Null logger is used when no output is generated. Does nothing.
+class NullLogger(object):
+ def __getattribute__(self,name):
+ return self
+ def __call__(self,*args,**kwargs):
+ return self
+
+# Exception raised for yacc-related errors
+class YaccError(Exception): pass
+
+# Format the result message that the parser produces when running in debug mode.
+def format_result(r):
+ repr_str = repr(r)
+ if '\n' in repr_str: repr_str = repr(repr_str)
+ if len(repr_str) > resultlimit:
+ repr_str = repr_str[:resultlimit]+" ..."
+ result = "<%s @ 0x%x> (%s)" % (type(r).__name__,id(r),repr_str)
+ return result
+
+
+# Format stack entries when the parser is running in debug mode
+def format_stack_entry(r):
+ repr_str = repr(r)
+ if '\n' in repr_str: repr_str = repr(repr_str)
+ if len(repr_str) < 16:
+ return repr_str
+ else:
+ return "<%s @ 0x%x>" % (type(r).__name__,id(r))
+
+#-----------------------------------------------------------------------------
+# === LR Parsing Engine ===
+#
+# The following classes are used for the LR parser itself. These are not
+# used during table construction and are independent of the actual LR
+# table generation algorithm
+#-----------------------------------------------------------------------------
+
+# This class is used to hold non-terminal grammar symbols during parsing.
+# It normally has the following attributes set:
+# .type = Grammar symbol type
+# .value = Symbol value
+# .lineno = Starting line number
+# .endlineno = Ending line number (optional, set automatically)
+# .lexpos = Starting lex position
+# .endlexpos = Ending lex position (optional, set automatically)
+
+class YaccSymbol:
+ def __str__(self): return self.type
+ def __repr__(self): return str(self)
+
+# This class is a wrapper around the objects actually passed to each
+# grammar rule. Index lookup and assignment actually assign the
+# .value attribute of the underlying YaccSymbol object.
+# The lineno() method returns the line number of a given
+# item (or 0 if not defined). The linespan() method returns
+# a tuple of (startline,endline) representing the range of lines
+# for a symbol. The lexspan() method returns a tuple (lexpos,endlexpos)
+# representing the range of positional information for a symbol.
+
+class YaccProduction:
+ def __init__(self,s,stack=None):
+ self.slice = s
+ self.stack = stack
+ self.lexer = None
+ self.parser= None
+ def __getitem__(self,n):
+ if n >= 0: return self.slice[n].value
+ else: return self.stack[n].value
+
+ def __setitem__(self,n,v):
+ self.slice[n].value = v
+
+ def __getslice__(self,i,j):
+ return [s.value for s in self.slice[i:j]]
+
+ def __len__(self):
+ return len(self.slice)
+
+ def lineno(self,n):
+ return getattr(self.slice[n],"lineno",0)
+
+ def set_lineno(self,n,lineno):
+ self.slice[n].lineno = lineno
+
+ def linespan(self,n):
+ startline = getattr(self.slice[n],"lineno",0)
+ endline = getattr(self.slice[n],"endlineno",startline)
+ return startline,endline
+
+ def lexpos(self,n):
+ return getattr(self.slice[n],"lexpos",0)
+
+ def lexspan(self,n):
+ startpos = getattr(self.slice[n],"lexpos",0)
+ endpos = getattr(self.slice[n],"endlexpos",startpos)
+ return startpos,endpos
+
+ def error(self):
+ raise SyntaxError
+
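+# Example (illustrative sketch; the rule below is made up): inside a grammar
+# rule function, the YaccProduction instance 'p' is indexed by symbol
+# position, so p[0] is the left-hand side and p[1]..p[n] are the values of
+# the right-hand-side symbols.
+#
+#     def p_expr_plus(p):
+#         'expr : expr PLUS term'
+#         p[0] = p[1] + p[3]       # sets the .value of the resulting symbol
+#         line = p.lineno(2)       # line number of the PLUS token (0 if unset)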
+
+# -----------------------------------------------------------------------------
+# == LRParser ==
+#
+# The LR Parsing engine.
+# -----------------------------------------------------------------------------
+
+class LRParser:
+ def __init__(self,lrtab,errorf):
+ self.productions = lrtab.lr_productions
+ self.action = lrtab.lr_action
+ self.goto = lrtab.lr_goto
+ self.errorfunc = errorf
+
+ def errok(self):
+ self.errorok = 1
+
+ def restart(self):
+ del self.statestack[:]
+ del self.symstack[:]
+ sym = YaccSymbol()
+ sym.type = '$end'
+ self.symstack.append(sym)
+ self.statestack.append(0)
+
+ def parse(self,input=None,lexer=None,debug=0,tracking=0,tokenfunc=None):
+ if debug or yaccdevel:
+ if isinstance(debug,int):
+ debug = PlyLogger(sys.stderr)
+ return self.parsedebug(input,lexer,debug,tracking,tokenfunc)
+ elif tracking:
+ return self.parseopt(input,lexer,debug,tracking,tokenfunc)
+ else:
+ return self.parseopt_notrack(input,lexer,debug,tracking,tokenfunc)
+
+
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+ # parsedebug().
+ #
+ # This is the debugging enabled version of parse(). All changes made to the
+ # parsing engine should be made here. For the non-debugging version,
+ # copy this code to a method parseopt() and delete all of the sections
+ # enclosed in:
+ #
+ # #--! DEBUG
+ # statements
+ # #--! DEBUG
+ #
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+
+ def parsedebug(self,input=None,lexer=None,debug=None,tracking=0,tokenfunc=None):
+ lookahead = None # Current lookahead symbol
+ lookaheadstack = [ ] # Stack of lookahead symbols
+ actions = self.action # Local reference to action table (to avoid lookup on self.)
+ goto = self.goto # Local reference to goto table (to avoid lookup on self.)
+ prod = self.productions # Local reference to production list (to avoid lookup on self.)
+ pslice = YaccProduction(None) # Production object passed to grammar rules
+ errorcount = 0 # Used during error recovery
+
+ # --! DEBUG
+ debug.info("PLY: PARSE DEBUG START")
+ # --! DEBUG
+
+ # If no lexer was given, we will try to use the lex module
+ if not lexer:
+ lex = load_ply_lex()
+ lexer = lex.lexer
+
+ # Set up the lexer and parser objects on pslice
+ pslice.lexer = lexer
+ pslice.parser = self
+
+ # If input was supplied, pass to lexer
+ if input is not None:
+ lexer.input(input)
+
+ if tokenfunc is None:
+ # Tokenize function
+ get_token = lexer.token
+ else:
+ get_token = tokenfunc
+
+ # Set up the state and symbol stacks
+
+ statestack = [ ] # Stack of parsing states
+ self.statestack = statestack
+ symstack = [ ] # Stack of grammar symbols
+ self.symstack = symstack
+
+ pslice.stack = symstack # Put in the production
+ errtoken = None # Err token
+
+ # The start state is assumed to be (0,$end)
+
+ statestack.append(0)
+ sym = YaccSymbol()
+ sym.type = "$end"
+ symstack.append(sym)
+ state = 0
+ while 1:
+ # Get the next symbol on the input. If a lookahead symbol
+ # is already set, we just use that. Otherwise, we'll pull
+ # the next token off of the lookaheadstack or from the lexer
+
+ # --! DEBUG
+ debug.debug('')
+ debug.debug('State : %s', state)
+ # --! DEBUG
+
+ if not lookahead:
+ if not lookaheadstack:
+ lookahead = get_token() # Get the next token
+ else:
+ lookahead = lookaheadstack.pop()
+ if not lookahead:
+ lookahead = YaccSymbol()
+ lookahead.type = "$end"
+
+ # --! DEBUG
+ debug.debug('Stack : %s',
+ ("%s . %s" % (" ".join([xx.type for xx in symstack][1:]), str(lookahead))).lstrip())
+ # --! DEBUG
+
+ # Check the action table
+ ltype = lookahead.type
+ t = actions[state].get(ltype)
+
+ if t is not None:
+ if t > 0:
+ # shift a symbol on the stack
+ statestack.append(t)
+ state = t
+
+ # --! DEBUG
+ debug.debug("Action : Shift and goto state %s", t)
+ # --! DEBUG
+
+ symstack.append(lookahead)
+ lookahead = None
+
+ # Decrease error count on successful shift
+ if errorcount: errorcount -=1
+ continue
+
+ if t < 0:
+ # reduce a symbol on the stack, emit a production
+ p = prod[-t]
+ pname = p.name
+ plen = p.len
+
+ # Get production function
+ sym = YaccSymbol()
+ sym.type = pname # Production name
+ sym.value = None
+
+ # --! DEBUG
+ if plen:
+ debug.info("Action : Reduce rule [%s] with %s and goto state %d", p.str, "["+",".join([format_stack_entry(_v.value) for _v in symstack[-plen:]])+"]",-t)
+ else:
+ debug.info("Action : Reduce rule [%s] with %s and goto state %d", p.str, [],-t)
+
+ # --! DEBUG
+
+ if plen:
+ targ = symstack[-plen-1:]
+ targ[0] = sym
+
+ # --! TRACKING
+ if tracking:
+ t1 = targ[1]
+ sym.lineno = t1.lineno
+ sym.lexpos = t1.lexpos
+ t1 = targ[-1]
+ sym.endlineno = getattr(t1,"endlineno",t1.lineno)
+ sym.endlexpos = getattr(t1,"endlexpos",t1.lexpos)
+
+ # --! TRACKING
+
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+ # The code enclosed in this section is duplicated
+ # below as a performance optimization. Make sure
+ # changes get made in both locations.
+
+ pslice.slice = targ
+
+ try:
+ # Call the grammar rule with our special slice object
+ del symstack[-plen:]
+ del statestack[-plen:]
+ p.callable(pslice)
+ # --! DEBUG
+ debug.info("Result : %s", format_result(pslice[0]))
+ # --! DEBUG
+ symstack.append(sym)
+ state = goto[statestack[-1]][pname]
+ statestack.append(state)
+ except SyntaxError:
+ # If an error was set. Enter error recovery state
+ lookaheadstack.append(lookahead)
+ symstack.pop()
+ statestack.pop()
+ state = statestack[-1]
+ sym.type = 'error'
+ lookahead = sym
+ errorcount = error_count
+ self.errorok = 0
+ continue
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+
+ else:
+
+ # --! TRACKING
+ if tracking:
+ sym.lineno = lexer.lineno
+ sym.lexpos = lexer.lexpos
+ # --! TRACKING
+
+ targ = [ sym ]
+
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+ # The code enclosed in this section is duplicated
+ # above as a performance optimization. Make sure
+ # changes get made in both locations.
+
+ pslice.slice = targ
+
+ try:
+ # Call the grammar rule with our special slice object
+ p.callable(pslice)
+ # --! DEBUG
+ debug.info("Result : %s", format_result(pslice[0]))
+ # --! DEBUG
+ symstack.append(sym)
+ state = goto[statestack[-1]][pname]
+ statestack.append(state)
+ except SyntaxError:
+ # If an error was set. Enter error recovery state
+ lookaheadstack.append(lookahead)
+ symstack.pop()
+ statestack.pop()
+ state = statestack[-1]
+ sym.type = 'error'
+ lookahead = sym
+ errorcount = error_count
+ self.errorok = 0
+ continue
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+
+ if t == 0:
+ n = symstack[-1]
+ result = getattr(n,"value",None)
+ # --! DEBUG
+ debug.info("Done : Returning %s", format_result(result))
+ debug.info("PLY: PARSE DEBUG END")
+ # --! DEBUG
+ return result
+
+            if t is None:
+
+ # --! DEBUG
+ debug.error('Error : %s',
+ ("%s . %s" % (" ".join([xx.type for xx in symstack][1:]), str(lookahead))).lstrip())
+ # --! DEBUG
+
+ # We have some kind of parsing error here. To handle
+ # this, we are going to push the current token onto
+ # the tokenstack and replace it with an 'error' token.
+ # If there are any synchronization rules, they may
+ # catch it.
+ #
+                # In addition to pushing the error token, we call
+ # the user defined p_error() function if this is the
+ # first syntax error. This function is only called if
+ # errorcount == 0.
+ if errorcount == 0 or self.errorok:
+ errorcount = error_count
+ self.errorok = 0
+ errtoken = lookahead
+ if errtoken.type == "$end":
+ errtoken = None # End of file!
+ if self.errorfunc:
+ global errok,token,restart
+ errok = self.errok # Set some special functions available in error recovery
+ token = get_token
+ restart = self.restart
+ if errtoken and not hasattr(errtoken,'lexer'):
+ errtoken.lexer = lexer
+ tok = self.errorfunc(errtoken)
+ del errok, token, restart # Delete special functions
+
+ if self.errorok:
+ # User must have done some kind of panic
+ # mode recovery on their own. The
+ # returned token is the next lookahead
+ lookahead = tok
+ errtoken = None
+ continue
+ else:
+ if errtoken:
+ if hasattr(errtoken,"lineno"): lineno = lookahead.lineno
+ else: lineno = 0
+ if lineno:
+ sys.stderr.write("yacc: Syntax error at line %d, token=%s\n" % (lineno, errtoken.type))
+ else:
+ sys.stderr.write("yacc: Syntax error, token=%s" % errtoken.type)
+ else:
+ sys.stderr.write("yacc: Parse error in input. EOF\n")
+ return
+
+ else:
+ errorcount = error_count
+
+ # case 1: the statestack only has 1 entry on it. If we're in this state, the
+ # entire parse has been rolled back and we're completely hosed. The token is
+ # discarded and we just keep going.
+
+ if len(statestack) <= 1 and lookahead.type != "$end":
+ lookahead = None
+ errtoken = None
+ state = 0
+ # Nuke the pushback stack
+ del lookaheadstack[:]
+ continue
+
+ # case 2: the statestack has a couple of entries on it, but we're
+ # at the end of the file. nuke the top entry and generate an error token
+
+ # Start nuking entries on the stack
+ if lookahead.type == "$end":
+ # Whoa. We're really hosed here. Bail out
+ return
+
+ if lookahead.type != 'error':
+ sym = symstack[-1]
+ if sym.type == 'error':
+ # Hmmm. Error is on top of stack, we'll just nuke input
+ # symbol and continue
+ lookahead = None
+ continue
+ t = YaccSymbol()
+ t.type = 'error'
+ if hasattr(lookahead,"lineno"):
+ t.lineno = lookahead.lineno
+ t.value = lookahead
+ lookaheadstack.append(lookahead)
+ lookahead = t
+ else:
+ symstack.pop()
+ statestack.pop()
+ state = statestack[-1] # Potential bug fix
+
+ continue
+
+ # Call an error function here
+ raise RuntimeError("yacc: internal parser error!!!\n")
+
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+ # parseopt().
+ #
+ # Optimized version of parse() method. DO NOT EDIT THIS CODE DIRECTLY.
+ # Edit the debug version above, then copy any modifications to the method
+ # below while removing #--! DEBUG sections.
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+
+
+ def parseopt(self,input=None,lexer=None,debug=0,tracking=0,tokenfunc=None):
+ lookahead = None # Current lookahead symbol
+ lookaheadstack = [ ] # Stack of lookahead symbols
+ actions = self.action # Local reference to action table (to avoid lookup on self.)
+ goto = self.goto # Local reference to goto table (to avoid lookup on self.)
+ prod = self.productions # Local reference to production list (to avoid lookup on self.)
+ pslice = YaccProduction(None) # Production object passed to grammar rules
+ errorcount = 0 # Used during error recovery
+
+ # If no lexer was given, we will try to use the lex module
+ if not lexer:
+ lex = load_ply_lex()
+ lexer = lex.lexer
+
+ # Set up the lexer and parser objects on pslice
+ pslice.lexer = lexer
+ pslice.parser = self
+
+ # If input was supplied, pass to lexer
+ if input is not None:
+ lexer.input(input)
+
+ if tokenfunc is None:
+ # Tokenize function
+ get_token = lexer.token
+ else:
+ get_token = tokenfunc
+
+ # Set up the state and symbol stacks
+
+ statestack = [ ] # Stack of parsing states
+ self.statestack = statestack
+ symstack = [ ] # Stack of grammar symbols
+ self.symstack = symstack
+
+ pslice.stack = symstack # Put in the production
+ errtoken = None # Err token
+
+ # The start state is assumed to be (0,$end)
+
+ statestack.append(0)
+ sym = YaccSymbol()
+ sym.type = '$end'
+ symstack.append(sym)
+ state = 0
+ while 1:
+ # Get the next symbol on the input. If a lookahead symbol
+ # is already set, we just use that. Otherwise, we'll pull
+ # the next token off of the lookaheadstack or from the lexer
+
+ if not lookahead:
+ if not lookaheadstack:
+ lookahead = get_token() # Get the next token
+ else:
+ lookahead = lookaheadstack.pop()
+ if not lookahead:
+ lookahead = YaccSymbol()
+ lookahead.type = '$end'
+
+ # Check the action table
+ ltype = lookahead.type
+ t = actions[state].get(ltype)
+
+ if t is not None:
+ if t > 0:
+ # shift a symbol on the stack
+ statestack.append(t)
+ state = t
+
+ symstack.append(lookahead)
+ lookahead = None
+
+ # Decrease error count on successful shift
+ if errorcount: errorcount -=1
+ continue
+
+ if t < 0:
+ # reduce a symbol on the stack, emit a production
+ p = prod[-t]
+ pname = p.name
+ plen = p.len
+
+ # Get production function
+ sym = YaccSymbol()
+ sym.type = pname # Production name
+ sym.value = None
+
+ if plen:
+ targ = symstack[-plen-1:]
+ targ[0] = sym
+
+ # --! TRACKING
+ if tracking:
+ t1 = targ[1]
+ sym.lineno = t1.lineno
+ sym.lexpos = t1.lexpos
+ t1 = targ[-1]
+ sym.endlineno = getattr(t1,"endlineno",t1.lineno)
+ sym.endlexpos = getattr(t1,"endlexpos",t1.lexpos)
+
+ # --! TRACKING
+
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+ # The code enclosed in this section is duplicated
+ # below as a performance optimization. Make sure
+ # changes get made in both locations.
+
+ pslice.slice = targ
+
+ try:
+ # Call the grammar rule with our special slice object
+ del symstack[-plen:]
+ del statestack[-plen:]
+ p.callable(pslice)
+ symstack.append(sym)
+ state = goto[statestack[-1]][pname]
+ statestack.append(state)
+ except SyntaxError:
+ # If an error was set. Enter error recovery state
+ lookaheadstack.append(lookahead)
+ symstack.pop()
+ statestack.pop()
+ state = statestack[-1]
+ sym.type = 'error'
+ lookahead = sym
+ errorcount = error_count
+ self.errorok = 0
+ continue
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+
+ else:
+
+ # --! TRACKING
+ if tracking:
+ sym.lineno = lexer.lineno
+ sym.lexpos = lexer.lexpos
+ # --! TRACKING
+
+ targ = [ sym ]
+
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+ # The code enclosed in this section is duplicated
+ # above as a performance optimization. Make sure
+ # changes get made in both locations.
+
+ pslice.slice = targ
+
+ try:
+ # Call the grammar rule with our special slice object
+ p.callable(pslice)
+ symstack.append(sym)
+ state = goto[statestack[-1]][pname]
+ statestack.append(state)
+ except SyntaxError:
+ # If an error was set. Enter error recovery state
+ lookaheadstack.append(lookahead)
+ symstack.pop()
+ statestack.pop()
+ state = statestack[-1]
+ sym.type = 'error'
+ lookahead = sym
+ errorcount = error_count
+ self.errorok = 0
+ continue
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+
+ if t == 0:
+ n = symstack[-1]
+ return getattr(n,"value",None)
+
+            if t is None:
+
+ # We have some kind of parsing error here. To handle
+ # this, we are going to push the current token onto
+ # the tokenstack and replace it with an 'error' token.
+ # If there are any synchronization rules, they may
+ # catch it.
+ #
+                # In addition to pushing the error token, we call
+ # the user defined p_error() function if this is the
+ # first syntax error. This function is only called if
+ # errorcount == 0.
+ if errorcount == 0 or self.errorok:
+ errorcount = error_count
+ self.errorok = 0
+ errtoken = lookahead
+ if errtoken.type == '$end':
+ errtoken = None # End of file!
+ if self.errorfunc:
+ global errok,token,restart
+ errok = self.errok # Set some special functions available in error recovery
+ token = get_token
+ restart = self.restart
+ if errtoken and not hasattr(errtoken,'lexer'):
+ errtoken.lexer = lexer
+ tok = self.errorfunc(errtoken)
+ del errok, token, restart # Delete special functions
+
+ if self.errorok:
+ # User must have done some kind of panic
+ # mode recovery on their own. The
+ # returned token is the next lookahead
+ lookahead = tok
+ errtoken = None
+ continue
+ else:
+ if errtoken:
+ if hasattr(errtoken,"lineno"): lineno = lookahead.lineno
+ else: lineno = 0
+ if lineno:
+ sys.stderr.write("yacc: Syntax error at line %d, token=%s\n" % (lineno, errtoken.type))
+ else:
+ sys.stderr.write("yacc: Syntax error, token=%s" % errtoken.type)
+ else:
+ sys.stderr.write("yacc: Parse error in input. EOF\n")
+ return
+
+ else:
+ errorcount = error_count
+
+ # case 1: the statestack only has 1 entry on it. If we're in this state, the
+ # entire parse has been rolled back and we're completely hosed. The token is
+ # discarded and we just keep going.
+
+ if len(statestack) <= 1 and lookahead.type != '$end':
+ lookahead = None
+ errtoken = None
+ state = 0
+ # Nuke the pushback stack
+ del lookaheadstack[:]
+ continue
+
+ # case 2: the statestack has a couple of entries on it, but we're
+ # at the end of the file. nuke the top entry and generate an error token
+
+ # Start nuking entries on the stack
+ if lookahead.type == '$end':
+ # Whoa. We're really hosed here. Bail out
+ return
+
+ if lookahead.type != 'error':
+ sym = symstack[-1]
+ if sym.type == 'error':
+ # Hmmm. Error is on top of stack, we'll just nuke input
+ # symbol and continue
+ lookahead = None
+ continue
+ t = YaccSymbol()
+ t.type = 'error'
+ if hasattr(lookahead,"lineno"):
+ t.lineno = lookahead.lineno
+ t.value = lookahead
+ lookaheadstack.append(lookahead)
+ lookahead = t
+ else:
+ symstack.pop()
+ statestack.pop()
+ state = statestack[-1] # Potential bug fix
+
+ continue
+
+ # Call an error function here
+ raise RuntimeError("yacc: internal parser error!!!\n")
+
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+ # parseopt_notrack().
+ #
+ # Optimized version of parseopt() with line number tracking removed.
+ # DO NOT EDIT THIS CODE DIRECTLY. Copy the optimized version and remove
+ # code in the #--! TRACKING sections
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+
+ def parseopt_notrack(self,input=None,lexer=None,debug=0,tracking=0,tokenfunc=None):
+ lookahead = None # Current lookahead symbol
+ lookaheadstack = [ ] # Stack of lookahead symbols
+ actions = self.action # Local reference to action table (to avoid lookup on self.)
+ goto = self.goto # Local reference to goto table (to avoid lookup on self.)
+ prod = self.productions # Local reference to production list (to avoid lookup on self.)
+ pslice = YaccProduction(None) # Production object passed to grammar rules
+ errorcount = 0 # Used during error recovery
+
+ # If no lexer was given, we will try to use the lex module
+ if not lexer:
+ lex = load_ply_lex()
+ lexer = lex.lexer
+
+ # Set up the lexer and parser objects on pslice
+ pslice.lexer = lexer
+ pslice.parser = self
+
+ # If input was supplied, pass to lexer
+ if input is not None:
+ lexer.input(input)
+
+ if tokenfunc is None:
+ # Tokenize function
+ get_token = lexer.token
+ else:
+ get_token = tokenfunc
+
+ # Set up the state and symbol stacks
+
+ statestack = [ ] # Stack of parsing states
+ self.statestack = statestack
+ symstack = [ ] # Stack of grammar symbols
+ self.symstack = symstack
+
+ pslice.stack = symstack # Put in the production
+ errtoken = None # Err token
+
+ # The start state is assumed to be (0,$end)
+
+ statestack.append(0)
+ sym = YaccSymbol()
+ sym.type = '$end'
+ symstack.append(sym)
+ state = 0
+ while 1:
+ # Get the next symbol on the input. If a lookahead symbol
+ # is already set, we just use that. Otherwise, we'll pull
+ # the next token off of the lookaheadstack or from the lexer
+
+ if not lookahead:
+ if not lookaheadstack:
+ lookahead = get_token() # Get the next token
+ else:
+ lookahead = lookaheadstack.pop()
+ if not lookahead:
+ lookahead = YaccSymbol()
+ lookahead.type = '$end'
+
+ # Check the action table
+ ltype = lookahead.type
+ t = actions[state].get(ltype)
+
+ if t is not None:
+ if t > 0:
+ # shift a symbol on the stack
+ statestack.append(t)
+ state = t
+
+ symstack.append(lookahead)
+ lookahead = None
+
+ # Decrease error count on successful shift
+ if errorcount: errorcount -=1
+ continue
+
+ if t < 0:
+ # reduce a symbol on the stack, emit a production
+ p = prod[-t]
+ pname = p.name
+ plen = p.len
+
+ # Get production function
+ sym = YaccSymbol()
+ sym.type = pname # Production name
+ sym.value = None
+
+ if plen:
+ targ = symstack[-plen-1:]
+ targ[0] = sym
+
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+ # The code enclosed in this section is duplicated
+ # below as a performance optimization. Make sure
+ # changes get made in both locations.
+
+ pslice.slice = targ
+
+ try:
+ # Call the grammar rule with our special slice object
+ del symstack[-plen:]
+ del statestack[-plen:]
+ p.callable(pslice)
+ symstack.append(sym)
+ state = goto[statestack[-1]][pname]
+ statestack.append(state)
+ except SyntaxError:
+ # If an error was set. Enter error recovery state
+ lookaheadstack.append(lookahead)
+ symstack.pop()
+ statestack.pop()
+ state = statestack[-1]
+ sym.type = 'error'
+ lookahead = sym
+ errorcount = error_count
+ self.errorok = 0
+ continue
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+
+ else:
+
+ targ = [ sym ]
+
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+ # The code enclosed in this section is duplicated
+ # above as a performance optimization. Make sure
+ # changes get made in both locations.
+
+ pslice.slice = targ
+
+ try:
+ # Call the grammar rule with our special slice object
+ p.callable(pslice)
+ symstack.append(sym)
+ state = goto[statestack[-1]][pname]
+ statestack.append(state)
+ except SyntaxError:
+ # If an error was set. Enter error recovery state
+ lookaheadstack.append(lookahead)
+ symstack.pop()
+ statestack.pop()
+ state = statestack[-1]
+ sym.type = 'error'
+ lookahead = sym
+ errorcount = error_count
+ self.errorok = 0
+ continue
+ # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+
+ if t == 0:
+ n = symstack[-1]
+ return getattr(n,"value",None)
+
+            if t is None:
+
+ # We have some kind of parsing error here. To handle
+ # this, we are going to push the current token onto
+ # the tokenstack and replace it with an 'error' token.
+ # If there are any synchronization rules, they may
+ # catch it.
+ #
+                # In addition to pushing the error token, we call
+ # the user defined p_error() function if this is the
+ # first syntax error. This function is only called if
+ # errorcount == 0.
+ if errorcount == 0 or self.errorok:
+ errorcount = error_count
+ self.errorok = 0
+ errtoken = lookahead
+ if errtoken.type == '$end':
+ errtoken = None # End of file!
+ if self.errorfunc:
+ global errok,token,restart
+ errok = self.errok # Set some special functions available in error recovery
+ token = get_token
+ restart = self.restart
+ if errtoken and not hasattr(errtoken,'lexer'):
+ errtoken.lexer = lexer
+ tok = self.errorfunc(errtoken)
+ del errok, token, restart # Delete special functions
+
+ if self.errorok:
+ # User must have done some kind of panic
+ # mode recovery on their own. The
+ # returned token is the next lookahead
+ lookahead = tok
+ errtoken = None
+ continue
+ else:
+ if errtoken:
+ if hasattr(errtoken,"lineno"): lineno = lookahead.lineno
+ else: lineno = 0
+ if lineno:
+ sys.stderr.write("yacc: Syntax error at line %d, token=%s\n" % (lineno, errtoken.type))
+ else:
+ sys.stderr.write("yacc: Syntax error, token=%s" % errtoken.type)
+ else:
+ sys.stderr.write("yacc: Parse error in input. EOF\n")
+ return
+
+ else:
+ errorcount = error_count
+
+ # case 1: the statestack only has 1 entry on it. If we're in this state, the
+ # entire parse has been rolled back and we're completely hosed. The token is
+ # discarded and we just keep going.
+
+ if len(statestack) <= 1 and lookahead.type != '$end':
+ lookahead = None
+ errtoken = None
+ state = 0
+ # Nuke the pushback stack
+ del lookaheadstack[:]
+ continue
+
+ # case 2: the statestack has a couple of entries on it, but we're
+ # at the end of the file. nuke the top entry and generate an error token
+
+ # Start nuking entries on the stack
+ if lookahead.type == '$end':
+ # Whoa. We're really hosed here. Bail out
+ return
+
+ if lookahead.type != 'error':
+ sym = symstack[-1]
+ if sym.type == 'error':
+ # Hmmm. Error is on top of stack, we'll just nuke input
+ # symbol and continue
+ lookahead = None
+ continue
+ t = YaccSymbol()
+ t.type = 'error'
+ if hasattr(lookahead,"lineno"):
+ t.lineno = lookahead.lineno
+ t.value = lookahead
+ lookaheadstack.append(lookahead)
+ lookahead = t
+ else:
+ symstack.pop()
+ statestack.pop()
+ state = statestack[-1] # Potential bug fix
+
+ continue
+
+ # Call an error function here
+ raise RuntimeError("yacc: internal parser error!!!\n")
+
+# -----------------------------------------------------------------------------
+# === Grammar Representation ===
+#
+# The following functions, classes, and variables are used to represent and
+# manipulate the rules that make up a grammar.
+# -----------------------------------------------------------------------------
+
+import re
+
+# regex matching identifiers
+_is_identifier = re.compile(r'^[a-zA-Z0-9_-]+$')
+
+# -----------------------------------------------------------------------------
+# class Production:
+#
+# This class stores the raw information about a single production or grammar rule.
+# A grammar rule refers to a specification such as this:
+#
+# expr : expr PLUS term
+#
+# Here are the basic attributes defined on all productions
+#
+# name - Name of the production. For example 'expr'
+# prod - A list of symbols on the right side ['expr','PLUS','term']
+# prec - Production precedence level
+# number - Production number.
+# func - Function that executes on reduce
+# file - File where production function is defined
+# lineno - Line number where production function is defined
+#
+# The following attributes are also defined (computed internally):
+#
+# len - Length of the production (number of symbols on right hand side)
+# usyms - Set of unique symbols found in the production
+# -----------------------------------------------------------------------------
+
+class Production(object):
+ reduced = 0
+ def __init__(self,number,name,prod,precedence=('right',0),func=None,file='',line=0):
+ self.name = name
+ self.prod = tuple(prod)
+ self.number = number
+ self.func = func
+ self.callable = None
+ self.file = file
+ self.line = line
+ self.prec = precedence
+
+ # Internal settings used during table construction
+
+ self.len = len(self.prod) # Length of the production
+
+ # Create a list of unique production symbols used in the production
+ self.usyms = [ ]
+ for s in self.prod:
+ if s not in self.usyms:
+ self.usyms.append(s)
+
+ # List of all LR items for the production
+ self.lr_items = []
+ self.lr_next = None
+
+ # Create a string representation
+ if self.prod:
+ self.str = "%s -> %s" % (self.name," ".join(self.prod))
+ else:
+ self.str = "%s -> <empty>" % self.name
+
+ def __str__(self):
+ return self.str
+
+ def __repr__(self):
+ return "Production("+str(self)+")"
+
+ def __len__(self):
+ return len(self.prod)
+
+ def __nonzero__(self):
+ return 1
+
+ def __getitem__(self,index):
+ return self.prod[index]
+
+ # Return the nth lr_item from the production (or None if at the end)
+ def lr_item(self,n):
+ if n > len(self.prod): return None
+ p = LRItem(self,n)
+
+ # Precompute the list of productions immediately following. Hack. Remove later
+ try:
+ p.lr_after = Prodnames[p.prod[n+1]]
+ except (IndexError,KeyError):
+ p.lr_after = []
+ try:
+ p.lr_before = p.prod[n-1]
+ except IndexError:
+ p.lr_before = None
+
+ return p
+
+ # Bind the production function name to a callable
+ def bind(self,pdict):
+ if self.func:
+ self.callable = pdict[self.func]
+
+# This class serves as a minimal stand-in for Production objects when
+# reading table data from files. It only contains information
+# actually used by the LR parsing engine, plus some additional
+# debugging information.
+class MiniProduction(object):
+ def __init__(self,str,name,len,func,file,line):
+ self.name = name
+ self.len = len
+ self.func = func
+ self.callable = None
+ self.file = file
+ self.line = line
+ self.str = str
+ def __str__(self):
+ return self.str
+ def __repr__(self):
+ return "MiniProduction(%s)" % self.str
+
+ # Bind the production function name to a callable
+ def bind(self,pdict):
+ if self.func:
+ self.callable = pdict[self.func]
+
+
+# -----------------------------------------------------------------------------
+# class LRItem
+#
+# This class represents a specific stage of parsing a production rule. For
+# example:
+#
+# expr : expr . PLUS term
+#
+# In the above, the "." represents the current location of the parse.  Here are
+# the basic attributes:
+#
+# name - Name of the production. For example 'expr'
+# prod - A list of symbols on the right side ['expr','.', 'PLUS','term']
+# number - Production number.
+#
+#       lr_next    - Next LR item.  For example, if we are 'expr -> expr . PLUS term'
+# then lr_next refers to 'expr -> expr PLUS . term'
+# lr_index - LR item index (location of the ".") in the prod list.
+# lookaheads - LALR lookahead symbols for this item
+# len - Length of the production (number of symbols on right hand side)
+# lr_after - List of all productions that immediately follow
+# lr_before - Grammar symbol immediately before
+# -----------------------------------------------------------------------------
+
+class LRItem(object):
+ def __init__(self,p,n):
+ self.name = p.name
+ self.prod = list(p.prod)
+ self.number = p.number
+ self.lr_index = n
+ self.lookaheads = { }
+ self.prod.insert(n,".")
+ self.prod = tuple(self.prod)
+ self.len = len(self.prod)
+ self.usyms = p.usyms
+
+ def __str__(self):
+ if self.prod:
+ s = "%s -> %s" % (self.name," ".join(self.prod))
+ else:
+ s = "%s -> <empty>" % self.name
+ return s
+
+ def __repr__(self):
+ return "LRItem("+str(self)+")"
+
+# -----------------------------------------------------------------------------
+# rightmost_terminal()
+#
+# Return the rightmost terminal from a list of symbols. Used in add_production()
+# -----------------------------------------------------------------------------
+def rightmost_terminal(symbols, terminals):
+ i = len(symbols) - 1
+ while i >= 0:
+ if symbols[i] in terminals:
+ return symbols[i]
+ i -= 1
+ return None
+
+# -----------------------------------------------------------------------------
+# === GRAMMAR CLASS ===
+#
+# The following class represents the contents of the specified grammar along
+# with various computed properties such as first sets, follow sets, LR items, etc.
+# This data is used for critical parts of the table generation process later.
+# -----------------------------------------------------------------------------
+
+class GrammarError(YaccError): pass
+
+class Grammar(object):
+ def __init__(self,terminals):
+ self.Productions = [None] # A list of all of the productions. The first
+ # entry is always reserved for the purpose of
+ # building an augmented grammar
+
+ self.Prodnames = { } # A dictionary mapping the names of nonterminals to a list of all
+ # productions of that nonterminal.
+
+ self.Prodmap = { } # A dictionary that is only used to detect duplicate
+ # productions.
+
+ self.Terminals = { } # A dictionary mapping the names of terminal symbols to a
+ # list of the rules where they are used.
+
+ for term in terminals:
+ self.Terminals[term] = []
+
+ self.Terminals['error'] = []
+
+ self.Nonterminals = { } # A dictionary mapping names of nonterminals to a list
+ # of rule numbers where they are used.
+
+ self.First = { } # A dictionary of precomputed FIRST(x) symbols
+
+ self.Follow = { } # A dictionary of precomputed FOLLOW(x) symbols
+
+ self.Precedence = { } # Precedence rules for each terminal. Contains tuples of the
+ # form ('right',level) or ('nonassoc', level) or ('left',level)
+
+        self.UsedPrecedence = { }     # Precedence rules that were actually used by the grammar.
+ # This is only used to provide error checking and to generate
+ # a warning about unused precedence rules.
+
+ self.Start = None # Starting symbol for the grammar
+
+
+ def __len__(self):
+ return len(self.Productions)
+
+ def __getitem__(self,index):
+ return self.Productions[index]
+
+ # -----------------------------------------------------------------------------
+ # set_precedence()
+ #
+ # Sets the precedence for a given terminal. assoc is the associativity such as
+ # 'left','right', or 'nonassoc'. level is a numeric level.
+ #
+ # -----------------------------------------------------------------------------
+
+ def set_precedence(self,term,assoc,level):
+ assert self.Productions == [None],"Must call set_precedence() before add_production()"
+ if term in self.Precedence:
+ raise GrammarError("Precedence already specified for terminal '%s'" % term)
+ if assoc not in ['left','right','nonassoc']:
+ raise GrammarError("Associativity must be one of 'left','right', or 'nonassoc'")
+ self.Precedence[term] = (assoc,level)
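+
+    # Example (illustrative; the terminal names are made up): precedence is
+    # registered per terminal, before any productions are added.
+    #
+    #     g = Grammar(['PLUS', 'TIMES'])
+    #     g.set_precedence('PLUS',  'left', 1)
+    #     g.set_precedence('TIMES', 'left', 2)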
+
+ # -----------------------------------------------------------------------------
+ # add_production()
+ #
+ # Given an action function, this function assembles a production rule and
+ # computes its precedence level.
+ #
+ # The production rule is supplied as a list of symbols. For example,
+ # a rule such as 'expr : expr PLUS term' has a production name of 'expr' and
+ # symbols ['expr','PLUS','term'].
+ #
+    # Precedence is determined by the precedence of the rightmost terminal symbol
+ # or the precedence of a terminal specified by %prec.
+ #
+ # A variety of error checks are performed to make sure production symbols
+ # are valid and that %prec is used correctly.
+ # -----------------------------------------------------------------------------
+
+ def add_production(self,prodname,syms,func=None,file='',line=0):
+
+ if prodname in self.Terminals:
+ raise GrammarError("%s:%d: Illegal rule name '%s'. Already defined as a token" % (file,line,prodname))
+ if prodname == 'error':
+ raise GrammarError("%s:%d: Illegal rule name '%s'. error is a reserved word" % (file,line,prodname))
+ if not _is_identifier.match(prodname):
+ raise GrammarError("%s:%d: Illegal rule name '%s'" % (file,line,prodname))
+
+ # Look for literal tokens
+ for n,s in enumerate(syms):
+ if s[0] in "'\"":
+ try:
+ c = eval(s)
+ if (len(c) > 1):
+ raise GrammarError("%s:%d: Literal token %s in rule '%s' may only be a single character" % (file,line,s, prodname))
+ if not c in self.Terminals:
+ self.Terminals[c] = []
+ syms[n] = c
+ continue
+ except SyntaxError:
+ pass
+ if not _is_identifier.match(s) and s != '%prec':
+ raise GrammarError("%s:%d: Illegal name '%s' in rule '%s'" % (file,line,s, prodname))
+
+ # Determine the precedence level
+ if '%prec' in syms:
+ if syms[-1] == '%prec':
+ raise GrammarError("%s:%d: Syntax error. Nothing follows %%prec" % (file,line))
+ if syms[-2] != '%prec':
+ raise GrammarError("%s:%d: Syntax error. %%prec can only appear at the end of a grammar rule" % (file,line))
+ precname = syms[-1]
+ prodprec = self.Precedence.get(precname,None)
+ if not prodprec:
+ raise GrammarError("%s:%d: Nothing known about the precedence of '%s'" % (file,line,precname))
+ else:
+ self.UsedPrecedence[precname] = 1
+ del syms[-2:] # Drop %prec from the rule
+ else:
+ # If no %prec, precedence is determined by the rightmost terminal symbol
+ precname = rightmost_terminal(syms,self.Terminals)
+ prodprec = self.Precedence.get(precname,('right',0))
+
+ # See if the rule is already in the rulemap
+ map = "%s -> %s" % (prodname,syms)
+ if map in self.Prodmap:
+ m = self.Prodmap[map]
+ raise GrammarError("%s:%d: Duplicate rule %s. " % (file,line, m) +
+ "Previous definition at %s:%d" % (m.file, m.line))
+
+ # From this point on, everything is valid. Create a new Production instance
+ pnumber = len(self.Productions)
+ if not prodname in self.Nonterminals:
+ self.Nonterminals[prodname] = [ ]
+
+ # Add the production number to Terminals and Nonterminals
+ for t in syms:
+ if t in self.Terminals:
+ self.Terminals[t].append(pnumber)
+ else:
+ if not t in self.Nonterminals:
+ self.Nonterminals[t] = [ ]
+ self.Nonterminals[t].append(pnumber)
+
+ # Create a production and add it to the list of productions
+ p = Production(pnumber,prodname,syms,prodprec,func,file,line)
+ self.Productions.append(p)
+ self.Prodmap[map] = p
+
+ # Add to the global productions list
+ try:
+ self.Prodnames[prodname].append(p)
+ except KeyError:
+ self.Prodnames[prodname] = [ p ]
+ return 0
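+
+    # Example (illustrative; the file name, line number, and rule are made up,
+    # and 'PLUS' is assumed to have been passed to Grammar() as a terminal):
+    # the rule 'expr : expr PLUS term' described above would be added as
+    #
+    #     g.add_production('expr', ['expr', 'PLUS', 'term'],
+    #                      func='p_expr_plus', file='calc.py', line=42)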
+
+ # -----------------------------------------------------------------------------
+ # set_start()
+ #
+ # Sets the starting symbol and creates the augmented grammar. Production
+ # rule 0 is S' -> start where start is the start symbol.
+ # -----------------------------------------------------------------------------
+
+ def set_start(self,start=None):
+ if not start:
+ start = self.Productions[1].name
+ if start not in self.Nonterminals:
+ raise GrammarError("start symbol %s undefined" % start)
+ self.Productions[0] = Production(0,"S'",[start])
+ self.Nonterminals[start].append(0)
+ self.Start = start
+
+ # -----------------------------------------------------------------------------
+ # find_unreachable()
+ #
+ # Find all of the nonterminal symbols that can't be reached from the starting
+ # symbol. Returns a list of nonterminals that can't be reached.
+ # -----------------------------------------------------------------------------
+
+ def find_unreachable(self):
+
+ # Mark all symbols that are reachable from a symbol s
+ def mark_reachable_from(s):
+ if reachable[s]:
+ # We've already reached symbol s.
+ return
+ reachable[s] = 1
+ for p in self.Prodnames.get(s,[]):
+ for r in p.prod:
+ mark_reachable_from(r)
+
+ reachable = { }
+ for s in list(self.Terminals) + list(self.Nonterminals):
+ reachable[s] = 0
+
+ mark_reachable_from( self.Productions[0].prod[0] )
+
+ return [s for s in list(self.Nonterminals)
+ if not reachable[s]]
+
+ # -----------------------------------------------------------------------------
+ # infinite_cycles()
+ #
+ # This function looks at the various parsing rules and tries to detect
+ # infinite recursion cycles (grammar rules where there is no possible way
+ # to derive a string of only terminals).
+ # -----------------------------------------------------------------------------
+
+ def infinite_cycles(self):
+ terminates = {}
+
+ # Terminals:
+ for t in self.Terminals:
+ terminates[t] = 1
+
+ terminates['$end'] = 1
+
+ # Nonterminals:
+
+ # Initialize to false:
+ for n in self.Nonterminals:
+ terminates[n] = 0
+
+ # Then propagate termination until no change:
+ while 1:
+ some_change = 0
+ for (n,pl) in self.Prodnames.items():
+ # Nonterminal n terminates iff any of its productions terminates.
+ for p in pl:
+ # Production p terminates iff all of its rhs symbols terminate.
+ for s in p.prod:
+ if not terminates[s]:
+ # The symbol s does not terminate,
+ # so production p does not terminate.
+ p_terminates = 0
+ break
+ else:
+ # didn't break from the loop,
+ # so every symbol s terminates
+ # so production p terminates.
+ p_terminates = 1
+
+ if p_terminates:
+ # symbol n terminates!
+ if not terminates[n]:
+ terminates[n] = 1
+ some_change = 1
+ # Don't need to consider any more productions for this n.
+ break
+
+ if not some_change:
+ break
+
+ infinite = []
+ for (s,term) in terminates.items():
+ if not term:
+ if not s in self.Prodnames and not s in self.Terminals and s != 'error':
+ # s is used-but-not-defined, and we've already warned of that,
+ # so it would be overkill to say that it's also non-terminating.
+ pass
+ else:
+ infinite.append(s)
+
+ return infinite
+
+
+ # -----------------------------------------------------------------------------
+ # undefined_symbols()
+ #
+    # Find all symbols that were used in the grammar, but not defined as tokens or
+    # grammar rules.  Returns a list of tuples (sym, prod) where sym is the symbol
+ # and prod is the production where the symbol was used.
+ # -----------------------------------------------------------------------------
+ def undefined_symbols(self):
+ result = []
+ for p in self.Productions:
+ if not p: continue
+
+ for s in p.prod:
+ if not s in self.Prodnames and not s in self.Terminals and s != 'error':
+ result.append((s,p))
+ return result
+
+ # -----------------------------------------------------------------------------
+ # unused_terminals()
+ #
+ # Find all terminals that were defined, but not used by the grammar. Returns
+    # a list of the unused terminal names.
+ # -----------------------------------------------------------------------------
+ def unused_terminals(self):
+ unused_tok = []
+ for s,v in self.Terminals.items():
+ if s != 'error' and not v:
+ unused_tok.append(s)
+
+ return unused_tok
+
+ # ------------------------------------------------------------------------------
+ # unused_rules()
+ #
+    # Find all grammar rules that were defined, but not used (e.g., not reachable).
+ # Returns a list of productions.
+ # ------------------------------------------------------------------------------
+
+ def unused_rules(self):
+ unused_prod = []
+ for s,v in self.Nonterminals.items():
+ if not v:
+ p = self.Prodnames[s][0]
+ unused_prod.append(p)
+ return unused_prod
+
+ # -----------------------------------------------------------------------------
+ # unused_precedence()
+ #
+ # Returns a list of tuples (term,precedence) corresponding to precedence
+ # rules that were never used by the grammar. term is the name of the terminal
+ # on which precedence was applied and precedence is a string such as 'left' or
+ # 'right' corresponding to the type of precedence.
+ # -----------------------------------------------------------------------------
+
+ def unused_precedence(self):
+ unused = []
+ for termname in self.Precedence:
+ if not (termname in self.Terminals or termname in self.UsedPrecedence):
+ unused.append((termname,self.Precedence[termname][0]))
+
+ return unused
+
+ # -------------------------------------------------------------------------
+ # _first()
+ #
+ # Compute the value of FIRST1(beta) where beta is a tuple of symbols.
+ #
+    # During execution of compute_first(), the result may be incomplete.
+ # Afterward (e.g., when called from compute_follow()), it will be complete.
+ # -------------------------------------------------------------------------
+ def _first(self,beta):
+
+ # We are computing First(x1,x2,x3,...,xn)
+ result = [ ]
+ for x in beta:
+ x_produces_empty = 0
+
+ # Add all the non-<empty> symbols of First[x] to the result.
+ for f in self.First[x]:
+ if f == '<empty>':
+ x_produces_empty = 1
+ else:
+ if f not in result: result.append(f)
+
+ if x_produces_empty:
+ # We have to consider the next x in beta,
+ # i.e. stay in the loop.
+ pass
+ else:
+ # We don't have to consider any further symbols in beta.
+ break
+ else:
+ # There was no 'break' from the loop,
+ # so x_produces_empty was true for all x in beta,
+ # so beta produces empty as well.
+ result.append('<empty>')
+
+ return result
+
+ # -------------------------------------------------------------------------
+ # compute_first()
+ #
+ # Compute the value of FIRST1(X) for all symbols
+ # -------------------------------------------------------------------------
+ def compute_first(self):
+ if self.First:
+ return self.First
+
+ # Terminals:
+ for t in self.Terminals:
+ self.First[t] = [t]
+
+ self.First['$end'] = ['$end']
+
+ # Nonterminals:
+
+ # Initialize to the empty set:
+ for n in self.Nonterminals:
+ self.First[n] = []
+
+ # Then propagate symbols until no change:
+ while 1:
+ some_change = 0
+ for n in self.Nonterminals:
+ for p in self.Prodnames[n]:
+ for f in self._first(p.prod):
+ if f not in self.First[n]:
+ self.First[n].append( f )
+ some_change = 1
+ if not some_change:
+ break
+
+ return self.First
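+
+ # A small illustration (assumed toy grammar, not defined in this module):
+ # with the rules "E : T PLUS E", "E : T" and "T : NUMBER", compute_first()
+ # would yield First['T'] == ['NUMBER'] and First['E'] == ['NUMBER'], since
+ # every derivation of E begins with a T.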
+
+ # ---------------------------------------------------------------------
+ # compute_follow()
+ #
+ # Computes all of the follow sets for every non-terminal symbol. The
+ # follow set is the set of all symbols that might follow a given
+ # non-terminal. See the Dragon book, 2nd Ed. p. 189.
+ # ---------------------------------------------------------------------
+ def compute_follow(self,start=None):
+ # If already computed, return the result
+ if self.Follow:
+ return self.Follow
+
+ # If first sets not computed yet, do that first.
+ if not self.First:
+ self.compute_first()
+
+ # Add '$end' to the follow list of the start symbol
+ for k in self.Nonterminals:
+ self.Follow[k] = [ ]
+
+ if not start:
+ start = self.Productions[1].name
+
+ self.Follow[start] = [ '$end' ]
+
+ while 1:
+ didadd = 0
+ for p in self.Productions[1:]:
+ # Here is the production set
+ for i in range(len(p.prod)):
+ B = p.prod[i]
+ if B in self.Nonterminals:
+ # Okay. We got a non-terminal in a production
+ fst = self._first(p.prod[i+1:])
+ hasempty = 0
+ for f in fst:
+ if f != '<empty>' and f not in self.Follow[B]:
+ self.Follow[B].append(f)
+ didadd = 1
+ if f == '<empty>':
+ hasempty = 1
+ if hasempty or i == (len(p.prod)-1):
+ # Add elements of follow(a) to follow(b)
+ for f in self.Follow[p.name]:
+ if f not in self.Follow[B]:
+ self.Follow[B].append(f)
+ didadd = 1
+ if not didadd: break
+ return self.Follow
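+
+ # Continuing the assumed toy grammar above ("E : T PLUS E", "E : T",
+ # "T : NUMBER", start symbol E): compute_follow() would leave Follow['E']
+ # as ['$end'] and give Follow['T'] both 'PLUS' and '$end', because a T may
+ # be followed either by a PLUS token or by the end of the input.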
+
+
+ # -----------------------------------------------------------------------------
+ # build_lritems()
+ #
+ # This function walks the list of productions and builds a complete set of the
+ # LR items. The LR items are stored in two ways: First, they are uniquely
+ # numbered and placed in the list _lritems. Second, a linked list of LR items
+ # is built for each production. For example:
+ #
+ # E -> E PLUS E
+ #
+ # Creates the list
+ #
+ # [E -> . E PLUS E, E -> E . PLUS E, E -> E PLUS . E, E -> E PLUS E . ]
+ # -----------------------------------------------------------------------------
+
+ def build_lritems(self):
+ for p in self.Productions:
+ lastlri = p
+ i = 0
+ lr_items = []
+ while 1:
+ if i > len(p):
+ lri = None
+ else:
+ lri = LRItem(p,i)
+ # Precompute the list of productions immediately following
+ try:
+ lri.lr_after = self.Prodnames[lri.prod[i+1]]
+ except (IndexError,KeyError):
+ lri.lr_after = []
+ try:
+ lri.lr_before = lri.prod[i-1]
+ except IndexError:
+ lri.lr_before = None
+
+ lastlri.lr_next = lri
+ if not lri: break
+ lr_items.append(lri)
+ lastlri = lri
+ i += 1
+ p.lr_items = lr_items
+
+# -----------------------------------------------------------------------------
+# == Class LRTable ==
+#
+# This class represents a basic table of LR parsing information.
+# Methods for generating the tables are not defined here. They are defined
+# in the derived class LRGeneratedTable.
+# -----------------------------------------------------------------------------
+
+class VersionError(YaccError): pass
+
+class LRTable(object):
+ def __init__(self):
+ self.lr_action = None
+ self.lr_goto = None
+ self.lr_productions = None
+ self.lr_method = None
+
+ def read_table(self,module):
+ if isinstance(module,types.ModuleType):
+ parsetab = module
+ else:
+ if sys.version_info[0] < 3:
+ exec("import %s as parsetab" % module)
+ else:
+ env = { }
+ exec("import %s as parsetab" % module, env, env)
+ parsetab = env['parsetab']
+
+ if parsetab._tabversion != __tabversion__:
+ raise VersionError("yacc table file version is out of date")
+
+ self.lr_action = parsetab._lr_action
+ self.lr_goto = parsetab._lr_goto
+
+ self.lr_productions = []
+ for p in parsetab._lr_productions:
+ self.lr_productions.append(MiniProduction(*p))
+
+ self.lr_method = parsetab._lr_method
+ return parsetab._lr_signature
+
+ def read_pickle(self,filename):
+ try:
+ import cPickle as pickle
+ except ImportError:
+ import pickle
+
+ in_f = open(filename,"rb")
+
+ tabversion = pickle.load(in_f)
+ if tabversion != __tabversion__:
+ raise VersionError("yacc table file version is out of date")
+ self.lr_method = pickle.load(in_f)
+ signature = pickle.load(in_f)
+ self.lr_action = pickle.load(in_f)
+ self.lr_goto = pickle.load(in_f)
+ productions = pickle.load(in_f)
+
+ self.lr_productions = []
+ for p in productions:
+ self.lr_productions.append(MiniProduction(*p))
+
+ in_f.close()
+ return signature
+
+ # Bind all production function names to callable objects in pdict
+ def bind_callables(self,pdict):
+ for p in self.lr_productions:
+ p.bind(pdict)
+
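+# A hedged usage sketch (not part of PLY itself): loading previously
+# generated tables and binding the semantic actions by name, assuming a
+# "parsetab" module written by LRGeneratedTable.write_table() exists:
+#
+#     lr = LRTable()
+#     signature = lr.read_table("parsetab")   # or lr.read_pickle("parser.p")
+#     lr.bind_callables(globals())            # bind the p_* rule functions
+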
+# -----------------------------------------------------------------------------
+# === LR Generator ===
+#
+# The following classes and functions are used to generate LR parsing tables on
+# a grammar.
+# -----------------------------------------------------------------------------
+
+# -----------------------------------------------------------------------------
+# digraph()
+# traverse()
+#
+# The following two functions are used to compute set valued functions
+# of the form:
+#
+# F(x) = F'(x) U U{F(y) | x R y}
+#
+# This is used to compute the values of Read() sets as well as FOLLOW sets
+# in LALR(1) generation.
+#
+# Inputs: X - An input set
+# R - A relation
+# FP - Set-valued function
+# ------------------------------------------------------------------------------
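+#
+# A tiny worked example (assumed values, not used anywhere in this file):
+#
+#     X  = ['a', 'b']
+#     R  = lambda x: ['b'] if x == 'a' else []
+#     FP = lambda x: [x.upper()]
+#     F  = digraph(X, R, FP)    # F == {'a': ['A', 'B'], 'b': ['B']}
+#
+# i.e. each F[x] starts from FP(x) and then absorbs F(y) for every y
+# related to x.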
+
+def digraph(X,R,FP):
+ N = { }
+ for x in X:
+ N[x] = 0
+ stack = []
+ F = { }
+ for x in X:
+ if N[x] == 0: traverse(x,N,stack,F,X,R,FP)
+ return F
+
+def traverse(x,N,stack,F,X,R,FP):
+ stack.append(x)
+ d = len(stack)
+ N[x] = d
+ F[x] = FP(x) # F(X) <- F'(x)
+
+ rel = R(x) # Get y's related to x
+ for y in rel:
+ if N[y] == 0:
+ traverse(y,N,stack,F,X,R,FP)
+ N[x] = min(N[x],N[y])
+ for a in F.get(y,[]):
+ if a not in F[x]: F[x].append(a)
+ if N[x] == d:
+ N[stack[-1]] = MAXINT
+ F[stack[-1]] = F[x]
+ element = stack.pop()
+ while element != x:
+ N[stack[-1]] = MAXINT
+ F[stack[-1]] = F[x]
+ element = stack.pop()
+
+class LALRError(YaccError): pass
+
+# -----------------------------------------------------------------------------
+# == LRGeneratedTable ==
+#
+# This class implements the LR table generation algorithm. There are no
+# public methods except for write()
+# -----------------------------------------------------------------------------
+
+class LRGeneratedTable(LRTable):
+ def __init__(self,grammar,method='LALR',log=None):
+ if method not in ['SLR','LALR']:
+ raise LALRError("Unsupported method %s" % method)
+
+ self.grammar = grammar
+ self.lr_method = method
+
+ # Set up the logger
+ if not log:
+ log = NullLogger()
+ self.log = log
+
+ # Internal attributes
+ self.lr_action = {} # Action table
+ self.lr_goto = {} # Goto table
+ self.lr_productions = grammar.Productions # Copy of grammar Production array
+ self.lr_goto_cache = {} # Cache of computed gotos
+ self.lr0_cidhash = {} # Cache of closures
+
+ self._add_count = 0 # Internal counter used to detect cycles
+
+ # Diagnostic information filled in by the table generator
+ self.sr_conflict = 0
+ self.rr_conflict = 0
+ self.conflicts = [] # List of conflicts
+
+ self.sr_conflicts = []
+ self.rr_conflicts = []
+
+ # Build the tables
+ self.grammar.build_lritems()
+ self.grammar.compute_first()
+ self.grammar.compute_follow()
+ self.lr_parse_table()
+
+ # Compute the LR(0) closure operation on I, where I is a set of LR(0) items.
+
+ def lr0_closure(self,I):
+ self._add_count += 1
+
+ # Add everything in I to J
+ J = I[:]
+ didadd = 1
+ while didadd:
+ didadd = 0
+ for j in J:
+ for x in j.lr_after:
+ if getattr(x,"lr0_added",0) == self._add_count: continue
+ # Add B --> .G to J
+ J.append(x.lr_next)
+ x.lr0_added = self._add_count
+ didadd = 1
+
+ return J
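+
+ # For example (assumed toy case): the closure of [S' -> . expr] also
+ # contains every item of the form "expr -> . ...", and, transitively,
+ # the dotted rules of any non-terminal that can appear first in an expr.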
+
+ # Compute the LR(0) goto function goto(I,X) where I is a set
+ # of LR(0) items and X is a grammar symbol. This function is written
+ # in a way that guarantees uniqueness of the generated goto sets
+ # (i.e. the same goto set will never be returned as two different Python
+ # objects). With uniqueness, we can later do fast set comparisons using
+ # id(obj) instead of element-wise comparison.
+
+ def lr0_goto(self,I,x):
+ # First we look for a previously cached entry
+ g = self.lr_goto_cache.get((id(I),x),None)
+ if g: return g
+
+ # Now we generate the goto set in a way that guarantees uniqueness
+ # of the result
+
+ s = self.lr_goto_cache.get(x,None)
+ if not s:
+ s = { }
+ self.lr_goto_cache[x] = s
+
+ gs = [ ]
+ for p in I:
+ n = p.lr_next
+ if n and n.lr_before == x:
+ s1 = s.get(id(n),None)
+ if not s1:
+ s1 = { }
+ s[id(n)] = s1
+ gs.append(n)
+ s = s1
+ g = s.get('$end',None)
+ if not g:
+ if gs:
+ g = self.lr0_closure(gs)
+ s['$end'] = g
+ else:
+ s['$end'] = gs
+ self.lr_goto_cache[(id(I),x)] = g
+ return g
+
+ # Compute the LR(0) sets of item function
+ def lr0_items(self):
+
+ C = [ self.lr0_closure([self.grammar.Productions[0].lr_next]) ]
+ i = 0
+ for I in C:
+ self.lr0_cidhash[id(I)] = i
+ i += 1
+
+ # Loop over the items in C and each grammar symbol
+ i = 0
+ while i < len(C):
+ I = C[i]
+ i += 1
+
+ # Collect all of the symbols that could possibly be in the goto(I,X) sets
+ asyms = { }
+ for ii in I:
+ for s in ii.usyms:
+ asyms[s] = None
+
+ for x in asyms:
+ g = self.lr0_goto(I,x)
+ if not g: continue
+ if id(g) in self.lr0_cidhash: continue
+ self.lr0_cidhash[id(g)] = len(C)
+ C.append(g)
+
+ return C
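+
+ # The returned list C is effectively the set of parser states: C[i] holds
+ # the LR(0) items (dotted productions) that make up state i, and
+ # lr0_cidhash maps id(C[i]) back to the state number i.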
+
+ # -----------------------------------------------------------------------------
+ # ==== LALR(1) Parsing ====
+ #
+ # LALR(1) parsing is almost exactly the same as SLR except that instead of
+ # relying upon Follow() sets when performing reductions, a more selective
+ # lookahead set that incorporates the state of the LR(0) machine is utilized.
+ # Thus, we mainly just have to focus on calculating the lookahead sets.
+ #
+ # The method used here is due to DeRemer and Pennello (1982).
+ #
+ # DeRemer, F. L., and T. J. Pennello: "Efficient Computation of LALR(1)
+ # Lookahead Sets", ACM Transactions on Programming Languages and Systems,
+ # Vol. 4, No. 4, Oct. 1982, pp. 615-649
+ #
+ # Further details can also be found in:
+ #
+ # J. Tremblay and P. Sorenson, "The Theory and Practice of Compiler Writing",
+ # McGraw-Hill Book Company, (1985).
+ #
+ # -----------------------------------------------------------------------------
+
+ # -----------------------------------------------------------------------------
+ # compute_nullable_nonterminals()
+ #
+ # Creates a dictionary containing all of the non-terminals that might produce
+ # an empty production.
+ # -----------------------------------------------------------------------------
+
+ def compute_nullable_nonterminals(self):
+ nullable = {}
+ num_nullable = 0
+ while 1:
+ for p in self.grammar.Productions[1:]:
+ if p.len == 0:
+ nullable[p.name] = 1
+ continue
+ for t in p.prod:
+ if not t in nullable: break
+ else:
+ nullable[p.name] = 1
+ if len(nullable) == num_nullable: break
+ num_nullable = len(nullable)
+ return nullable
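+
+ # For instance (assumed toy case): given the rules "opt_semi : SEMI" and
+ # "opt_semi :" (an empty right-hand side), the returned dictionary would
+ # be {'opt_semi': 1}.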
+
+ # -----------------------------------------------------------------------------
+ # find_nonterminal_transitions(C)
+ #
+ # Given a set of LR(0) items, this function finds all of the non-terminal
+ # transitions. These are transitions in which a dot appears immediately before
+ # a non-terminal. Returns a list of tuples of the form (state,N) where state
+ # is the state number and N is the nonterminal symbol.
+ #
+ # The input C is the set of LR(0) items.
+ # -----------------------------------------------------------------------------
+
+ def find_nonterminal_transitions(self,C):
+ trans = []
+ for state in range(len(C)):
+ for p in C[state]:
+ if p.lr_index < p.len - 1:
+ t = (state,p.prod[p.lr_index+1])
+ if t[1] in self.grammar.Nonterminals:
+ if t not in trans: trans.append(t)
+ state = state + 1
+ return trans
+
+ # -----------------------------------------------------------------------------
+ # dr_relation()
+ #
+ # Computes the DR(p,A) relationships for non-terminal transitions. The input
+ # is a tuple (state,N) where state is a number and N is a nonterminal symbol.
+ #
+ # Returns a list of terminals.
+ # -----------------------------------------------------------------------------
+
+ def dr_relation(self,C,trans,nullable):
+ dr_set = { }
+ state,N = trans
+ terms = []
+
+ g = self.lr0_goto(C[state],N)
+ for p in g:
+ if p.lr_index < p.len - 1:
+ a = p.prod[p.lr_index+1]
+ if a in self.grammar.Terminals:
+ if a not in terms: terms.append(a)
+
+ # This extra bit is to handle the start state
+ if state == 0 and N == self.grammar.Productions[0].prod[0]:
+ terms.append('$end')
+
+ return terms
+
+ # -----------------------------------------------------------------------------
+ # reads_relation()
+ #
+ # Computes the READS() relation (p,A) READS (t,C).
+ # -----------------------------------------------------------------------------
+
+ def reads_relation(self,C, trans, empty):
+ # Look for empty transitions
+ rel = []
+ state, N = trans
+
+ g = self.lr0_goto(C[state],N)
+ j = self.lr0_cidhash.get(id(g),-1)
+ for p in g:
+ if p.lr_index < p.len - 1:
+ a = p.prod[p.lr_index + 1]
+ if a in empty:
+ rel.append((j,a))
+
+ return rel
+
+ # -----------------------------------------------------------------------------
+ # compute_lookback_includes()
+ #
+ # Determines the lookback and includes relations
+ #
+ # LOOKBACK:
+ #
+ # This relation is determined by running the LR(0) state machine forward.
+ # For example, starting with a production "N : . A B C", we run it forward
+ # to obtain "N : A B C ." We then build a relationship between this final
+ # state and the starting state. These relationships are stored in a dictionary
+ # lookdict.
+ #
+ # INCLUDES:
+ #
+ # Computes the INCLUDE() relation (p,A) INCLUDES (p',B).
+ #
+ # This relation is used to determine non-terminal transitions that occur
+ # inside of other non-terminal transition states. (p,A) INCLUDES (p', B)
+ # if the following holds:
+ #
+ # B -> LAT, where T -> epsilon and p' -L-> p
+ #
+ # L is essentially a prefix (which may be empty), T is a suffix that must be
+ # able to derive an empty string. State p' must lead to state p with the string L.
+ #
+ # -----------------------------------------------------------------------------
+
+ def compute_lookback_includes(self,C,trans,nullable):
+
+ lookdict = {} # Dictionary of lookback relations
+ includedict = {} # Dictionary of include relations
+
+ # Make a dictionary of non-terminal transitions
+ dtrans = {}
+ for t in trans:
+ dtrans[t] = 1
+
+ # Loop over all transitions and compute lookbacks and includes
+ for state,N in trans:
+ lookb = []
+ includes = []
+ for p in C[state]:
+ if p.name != N: continue
+
+ # Okay, we have a name match. We now follow the production all the way
+ # through the state machine until we get the . on the right hand side
+
+ lr_index = p.lr_index
+ j = state
+ while lr_index < p.len - 1:
+ lr_index = lr_index + 1
+ t = p.prod[lr_index]
+
+ # Check to see if this symbol and state are a non-terminal transition
+ if (j,t) in dtrans:
+ # Yes. Okay, there is some chance that this is an includes relation;
+ # the only way to know for certain is to check whether the rest of
+ # the production derives the empty string
+
+ li = lr_index + 1
+ while li < p.len:
+ if p.prod[li] in self.grammar.Terminals: break # No, forget it
+ if not p.prod[li] in nullable: break
+ li = li + 1
+ else:
+ # Appears to be a relation between (j,t) and (state,N)
+ includes.append((j,t))
+
+ g = self.lr0_goto(C[j],t) # Go to next set
+ j = self.lr0_cidhash.get(id(g),-1) # Go to next state
+
+ # When we get here, j is the final state, now we have to locate the production
+ for r in C[j]:
+ if r.name != p.name: continue
+ if r.len != p.len: continue
+ i = 0
+ # This loop is comparing a production ". A B C" with "A B C ."
+ while i < r.lr_index:
+ if r.prod[i] != p.prod[i+1]: break
+ i = i + 1
+ else:
+ lookb.append((j,r))
+ for i in includes:
+ if not i in includedict: includedict[i] = []
+ includedict[i].append((state,N))
+ lookdict[(state,N)] = lookb
+
+ return lookdict,includedict
+
+ # -----------------------------------------------------------------------------
+ # compute_read_sets()
+ #
+ # Given a set of LR(0) items, this function computes the read sets.
+ #
+ # Inputs: C = Set of LR(0) items
+ # ntrans = Set of nonterminal transitions
+ # nullable = Set of empty transitions
+ #
+ # Returns a set containing the read sets
+ # -----------------------------------------------------------------------------
+
+ def compute_read_sets(self,C, ntrans, nullable):
+ FP = lambda x: self.dr_relation(C,x,nullable)
+ R = lambda x: self.reads_relation(C,x,nullable)
+ F = digraph(ntrans,R,FP)
+ return F
+
+ # -----------------------------------------------------------------------------
+ # compute_follow_sets()
+ #
+ # Given a set of LR(0) items, a set of non-terminal transitions, a readset,
+ # and an include set, this function computes the follow sets
+ #
+ # Follow(p,A) = Read(p,A) U U {Follow(p',B) | (p,A) INCLUDES (p',B)}
+ #
+ # Inputs:
+ # ntrans = Set of nonterminal transitions
+ # readsets = Readset (previously computed)
+ # inclsets = Include sets (previously computed)
+ #
+ # Returns a set containing the follow sets
+ # -----------------------------------------------------------------------------
+
+ def compute_follow_sets(self,ntrans,readsets,inclsets):
+ FP = lambda x: readsets[x]
+ R = lambda x: inclsets.get(x,[])
+ F = digraph(ntrans,R,FP)
+ return F
+
+ # -----------------------------------------------------------------------------
+ # add_lookaheads()
+ #
+ # Attaches the lookahead symbols to grammar rules.
+ #
+ # Inputs: lookbacks - Set of lookback relations
+ # followset - Computed follow set
+ #
+ # This function directly attaches the lookaheads to productions contained
+ # in the lookbacks set
+ # -----------------------------------------------------------------------------
+
+ def add_lookaheads(self,lookbacks,followset):
+ for trans,lb in lookbacks.items():
+ # Loop over productions in lookback
+ for state,p in lb:
+ if not state in p.lookaheads:
+ p.lookaheads[state] = []
+ f = followset.get(trans,[])
+ for a in f:
+ if a not in p.lookaheads[state]: p.lookaheads[state].append(a)
+
+ # -----------------------------------------------------------------------------
+ # add_lalr_lookaheads()
+ #
+ # This function does all of the work of adding lookahead information for use
+ # with LALR parsing
+ # -----------------------------------------------------------------------------
+
+ def add_lalr_lookaheads(self,C):
+ # Determine all of the nullable nonterminals
+ nullable = self.compute_nullable_nonterminals()
+
+ # Find all non-terminal transitions
+ trans = self.find_nonterminal_transitions(C)
+
+ # Compute read sets
+ readsets = self.compute_read_sets(C,trans,nullable)
+
+ # Compute lookback/includes relations
+ lookd, included = self.compute_lookback_includes(C,trans,nullable)
+
+ # Compute LALR FOLLOW sets
+ followsets = self.compute_follow_sets(trans,readsets,included)
+
+ # Add all of the lookaheads
+ self.add_lookaheads(lookd,followsets)
+
+ # -----------------------------------------------------------------------------
+ # lr_parse_table()
+ #
+ # This function constructs the parse tables for SLR or LALR
+ # -----------------------------------------------------------------------------
+ def lr_parse_table(self):
+ Productions = self.grammar.Productions
+ Precedence = self.grammar.Precedence
+ goto = self.lr_goto # Goto array
+ action = self.lr_action # Action array
+ log = self.log # Logger for output
+
+ actionp = { } # Action production array (temporary)
+
+ log.info("Parsing method: %s", self.lr_method)
+
+ # Step 1: Construct C = { I0, I1, ... IN}, collection of LR(0) items
+ # This determines the number of states
+
+ C = self.lr0_items()
+
+ if self.lr_method == 'LALR':
+ self.add_lalr_lookaheads(C)
+
+ # Build the parser table, state by state
+ st = 0
+ for I in C:
+ # Loop over each production in I
+ actlist = [ ] # List of actions
+ st_action = { }
+ st_actionp = { }
+ st_goto = { }
+ log.info("")
+ log.info("state %d", st)
+ log.info("")
+ for p in I:
+ log.info(" (%d) %s", p.number, str(p))
+ log.info("")
+
+ for p in I:
+ if p.len == p.lr_index + 1:
+ if p.name == "S'":
+ # Start symbol. Accept!
+ st_action["$end"] = 0
+ st_actionp["$end"] = p
+ else:
+ # We are at the end of a production. Reduce!
+ if self.lr_method == 'LALR':
+ laheads = p.lookaheads[st]
+ else:
+ laheads = self.grammar.Follow[p.name]
+ for a in laheads:
+ actlist.append((a,p,"reduce using rule %d (%s)" % (p.number,p)))
+ r = st_action.get(a,None)
+ if r is not None:
+ # Whoa. Have a shift/reduce or reduce/reduce conflict
+ if r > 0:
+ # Need to decide on shift or reduce here
+ # By default we favor shifting. Need to add
+ # some precedence rules here.
+ sprec,slevel = Productions[st_actionp[a].number].prec
+ rprec,rlevel = Precedence.get(a,('right',0))
+ if (slevel < rlevel) or ((slevel == rlevel) and (rprec == 'left')):
+ # We really need to reduce here.
+ st_action[a] = -p.number
+ st_actionp[a] = p
+ if not slevel and not rlevel:
+ log.info(" ! shift/reduce conflict for %s resolved as reduce",a)
+ self.sr_conflicts.append((st,a,'reduce'))
+ Productions[p.number].reduced += 1
+ elif (slevel == rlevel) and (rprec == 'nonassoc'):
+ st_action[a] = None
+ else:
+ # Hmmm. Guess we'll keep the shift
+ if not rlevel:
+ log.info(" ! shift/reduce conflict for %s resolved as shift",a)
+ self.sr_conflicts.append((st,a,'shift'))
+ elif r < 0:
+ # Reduce/reduce conflict. In this case, we favor the rule
+ # that was defined first in the grammar file
+ oldp = Productions[-r]
+ pp = Productions[p.number]
+ if oldp.line > pp.line:
+ st_action[a] = -p.number
+ st_actionp[a] = p
+ chosenp,rejectp = pp,oldp
+ Productions[p.number].reduced += 1
+ Productions[oldp.number].reduced -= 1
+ else:
+ chosenp,rejectp = oldp,pp
+ self.rr_conflicts.append((st,chosenp,rejectp))
+ log.info(" ! reduce/reduce conflict for %s resolved using rule %d (%s)", a,st_actionp[a].number, st_actionp[a])
+ else:
+ raise LALRError("Unknown conflict in state %d" % st)
+ else:
+ st_action[a] = -p.number
+ st_actionp[a] = p
+ Productions[p.number].reduced += 1
+ else:
+ i = p.lr_index
+ a = p.prod[i+1] # Get symbol right after the "."
+ if a in self.grammar.Terminals:
+ g = self.lr0_goto(I,a)
+ j = self.lr0_cidhash.get(id(g),-1)
+ if j >= 0:
+ # We are in a shift state
+ actlist.append((a,p,"shift and go to state %d" % j))
+ r = st_action.get(a,None)
+ if r is not None:
+ # Whoa. Have a shift/reduce or shift/shift conflict
+ if r > 0:
+ if r != j:
+ raise LALRError("Shift/shift conflict in state %d" % st)
+ elif r < 0:
+ # Do a precedence check.
+ # - if precedence of reduce rule is higher, we reduce.
+ # - if precedence of reduce is same and left assoc, we reduce.
+ # - otherwise we shift
+ rprec,rlevel = Productions[st_actionp[a].number].prec
+ sprec,slevel = Precedence.get(a,('right',0))
+ if (slevel > rlevel) or ((slevel == rlevel) and (rprec == 'right')):
+ # We decide to shift here... highest precedence to shift
+ Productions[st_actionp[a].number].reduced -= 1
+ st_action[a] = j
+ st_actionp[a] = p
+ if not rlevel:
+ log.info(" ! shift/reduce conflict for %s resolved as shift",a)
+ self.sr_conflicts.append((st,a,'shift'))
+ elif (slevel == rlevel) and (rprec == 'nonassoc'):
+ st_action[a] = None
+ else:
+ # Hmmm. Guess we'll keep the reduce
+ if not slevel and not rlevel:
+ log.info(" ! shift/reduce conflict for %s resolved as reduce",a)
+ self.sr_conflicts.append((st,a,'reduce'))
+
+ else:
+ raise LALRError("Unknown conflict in state %d" % st)
+ else:
+ st_action[a] = j
+ st_actionp[a] = p
+
+ # Print the actions associated with each terminal
+ _actprint = { }
+ for a,p,m in actlist:
+ if a in st_action:
+ if p is st_actionp[a]:
+ log.info(" %-15s %s",a,m)
+ _actprint[(a,m)] = 1
+ log.info("")
+ # Print the actions that were not used. (debugging)
+ not_used = 0
+ for a,p,m in actlist:
+ if a in st_action:
+ if p is not st_actionp[a]:
+ if not (a,m) in _actprint:
+ log.debug(" ! %-15s [ %s ]",a,m)
+ not_used = 1
+ _actprint[(a,m)] = 1
+ if not_used:
+ log.debug("")
+
+ # Construct the goto table for this state
+
+ nkeys = { }
+ for ii in I:
+ for s in ii.usyms:
+ if s in self.grammar.Nonterminals:
+ nkeys[s] = None
+ for n in nkeys:
+ g = self.lr0_goto(I,n)
+ j = self.lr0_cidhash.get(id(g),-1)
+ if j >= 0:
+ st_goto[n] = j
+ log.info(" %-30s shift and go to state %d",n,j)
+
+ action[st] = st_action
+ actionp[st] = st_actionp
+ goto[st] = st_goto
+ st += 1
+
+
+ # -----------------------------------------------------------------------------
+ # write_table()
+ #
+ # This function writes the LR parsing tables to a file
+ # -----------------------------------------------------------------------------
+
+ def write_table(self,modulename,outputdir='',signature=""):
+ basemodulename = modulename.split(".")[-1]
+ filename = os.path.join(outputdir,basemodulename) + ".py"
+ try:
+ f = open(filename,"w")
+
+ f.write("""
+# %s
+# This file is automatically generated. Do not edit.
+_tabversion = %r
+
+_lr_method = %r
+
+_lr_signature = %r
+ """ % (filename, __tabversion__, self.lr_method, signature))
+
+ # Change smaller to 0 to go back to original tables
+ smaller = 1
+
+ # Factor out names to try and make smaller
+ if smaller:
+ items = { }
+
+ for s,nd in self.lr_action.items():
+ for name,v in nd.items():
+ i = items.get(name)
+ if not i:
+ i = ([],[])
+ items[name] = i
+ i[0].append(s)
+ i[1].append(v)
+
+ f.write("\n_lr_action_items = {")
+ for k,v in items.items():
+ f.write("%r:([" % k)
+ for i in v[0]:
+ f.write("%r," % i)
+ f.write("],[")
+ for i in v[1]:
+ f.write("%r," % i)
+
+ f.write("]),")
+ f.write("}\n")
+
+ f.write("""
+_lr_action = { }
+for _k, _v in _lr_action_items.items():
+ for _x,_y in zip(_v[0],_v[1]):
+ if not _x in _lr_action: _lr_action[_x] = { }
+ _lr_action[_x][_k] = _y
+del _lr_action_items
+""")
+
+ else:
+ f.write("\n_lr_action = { ");
+ for k,v in self.lr_action.items():
+ f.write("(%r,%r):%r," % (k[0],k[1],v))
+ f.write("}\n");
+
+ if smaller:
+ # Factor out names to try and make smaller
+ items = { }
+
+ for s,nd in self.lr_goto.items():
+ for name,v in nd.items():
+ i = items.get(name)
+ if not i:
+ i = ([],[])
+ items[name] = i
+ i[0].append(s)
+ i[1].append(v)
+
+ f.write("\n_lr_goto_items = {")
+ for k,v in items.items():
+ f.write("%r:([" % k)
+ for i in v[0]:
+ f.write("%r," % i)
+ f.write("],[")
+ for i in v[1]:
+ f.write("%r," % i)
+
+ f.write("]),")
+ f.write("}\n")
+
+ f.write("""
+_lr_goto = { }
+for _k, _v in _lr_goto_items.items():
+ for _x,_y in zip(_v[0],_v[1]):
+ if not _x in _lr_goto: _lr_goto[_x] = { }
+ _lr_goto[_x][_k] = _y
+del _lr_goto_items
+""")
+ else:
+ f.write("\n_lr_goto = { ");
+ for k,v in self.lr_goto.items():
+ f.write("(%r,%r):%r," % (k[0],k[1],v))
+ f.write("}\n");
+
+ # Write production table
+ f.write("_lr_productions = [\n")
+ for p in self.lr_productions:
+ if p.func:
+ f.write(" (%r,%r,%d,%r,%r,%d),\n" % (p.str,p.name, p.len, p.func,p.file,p.line))
+ else:
+ f.write(" (%r,%r,%d,None,None,None),\n" % (str(p),p.name, p.len))
+ f.write("]\n")
+ f.close()
+
+ except IOError:
+ e = sys.exc_info()[1]
+ sys.stderr.write("Unable to create '%s'\n" % filename)
+ sys.stderr.write(str(e)+"\n")
+ return
+
+
+ # -----------------------------------------------------------------------------
+ # pickle_table()
+ #
+ # This function pickles the LR parsing tables to a file with the supplied name
+ # -----------------------------------------------------------------------------
+
+ def pickle_table(self,filename,signature=""):
+ try:
+ import cPickle as pickle
+ except ImportError:
+ import pickle
+ outf = open(filename,"wb")
+ pickle.dump(__tabversion__,outf,pickle_protocol)
+ pickle.dump(self.lr_method,outf,pickle_protocol)
+ pickle.dump(signature,outf,pickle_protocol)
+ pickle.dump(self.lr_action,outf,pickle_protocol)
+ pickle.dump(self.lr_goto,outf,pickle_protocol)
+
+ outp = []
+ for p in self.lr_productions:
+ if p.func:
+ outp.append((p.str,p.name, p.len, p.func,p.file,p.line))
+ else:
+ outp.append((str(p),p.name,p.len,None,None,None))
+ pickle.dump(outp,outf,pickle_protocol)
+ outf.close()
+
+# -----------------------------------------------------------------------------
+# === INTROSPECTION ===
+#
+# The following functions and classes are used to implement the PLY
+# introspection features followed by the yacc() function itself.
+# -----------------------------------------------------------------------------
+
+# -----------------------------------------------------------------------------
+# get_caller_module_dict()
+#
+# This function returns a dictionary containing all of the symbols defined within
+# a caller further down the call stack. This is used to get the environment
+# associated with the yacc() call if none was provided.
+# -----------------------------------------------------------------------------
+
+def get_caller_module_dict(levels):
+ try:
+ raise RuntimeError
+ except RuntimeError:
+ e,b,t = sys.exc_info()
+ f = t.tb_frame
+ while levels > 0:
+ f = f.f_back
+ levels -= 1
+ ldict = f.f_globals.copy()
+ if f.f_globals != f.f_locals:
+ ldict.update(f.f_locals)
+
+ return ldict
+
+# -----------------------------------------------------------------------------
+# parse_grammar()
+#
+# This takes a raw grammar rule string and parses it into production data
+# -----------------------------------------------------------------------------
+def parse_grammar(doc,file,line):
+ grammar = []
+ # Split the doc string into lines
+ pstrings = doc.splitlines()
+ lastp = None
+ dline = line
+ for ps in pstrings:
+ dline += 1
+ p = ps.split()
+ if not p: continue
+ try:
+ if p[0] == '|':
+ # This is a continuation of a previous rule
+ if not lastp:
+ raise SyntaxError("%s:%d: Misplaced '|'" % (file,dline))
+ prodname = lastp
+ syms = p[1:]
+ else:
+ prodname = p[0]
+ lastp = prodname
+ syms = p[2:]
+ assign = p[1]
+ if assign != ':' and assign != '::=':
+ raise SyntaxError("%s:%d: Syntax error. Expected ':'" % (file,dline))
+
+ grammar.append((file,dline,prodname,syms))
+ except SyntaxError:
+ raise
+ except Exception:
+ raise SyntaxError("%s:%d: Syntax error in rule '%s'" % (file,dline,ps.strip()))
+
+ return grammar
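+
+# A sketch of the docstring format this function accepts (an assumed
+# example, not taken from any real parser module):
+#
+#     expression : expression PLUS term
+#                | term
+#
+# Each line has the form "name : sym sym ...", and a line beginning with
+# '|' continues the previous rule under the same name.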
+
+# -----------------------------------------------------------------------------
+# ParserReflect()
+#
+# This class represents information extracted for building a parser including
+# start symbol, error function, tokens, precedence list, action functions,
+# etc.
+# -----------------------------------------------------------------------------
+class ParserReflect(object):
+ def __init__(self,pdict,log=None):
+ self.pdict = pdict
+ self.start = None
+ self.error_func = None
+ self.tokens = None
+ self.files = {}
+ self.grammar = []
+ self.error = 0
+
+ if log is None:
+ self.log = PlyLogger(sys.stderr)
+ else:
+ self.log = log
+
+ # Get all of the basic information
+ def get_all(self):
+ self.get_start()
+ self.get_error_func()
+ self.get_tokens()
+ self.get_precedence()
+ self.get_pfunctions()
+
+ # Validate all of the information
+ def validate_all(self):
+ self.validate_start()
+ self.validate_error_func()
+ self.validate_tokens()
+ self.validate_precedence()
+ self.validate_pfunctions()
+ self.validate_files()
+ return self.error
+
+ # Compute a signature over the grammar
+ def signature(self):
+ try:
+ from hashlib import md5
+ except ImportError:
+ from md5 import md5
+ try:
+ sig = md5()
+ if self.start:
+ sig.update(self.start.encode('latin-1'))
+ if self.prec:
+ sig.update("".join(["".join(p) for p in self.prec]).encode('latin-1'))
+ if self.tokens:
+ sig.update(" ".join(self.tokens).encode('latin-1'))
+ for f in self.pfuncs:
+ if f[3]:
+ sig.update(f[3].encode('latin-1'))
+ except (TypeError,ValueError):
+ pass
+ return sig.digest()
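+
+ # The digest above is later compared with the signature stored in a
+ # previously written table file (see yacc() below); when the two match,
+ # the cached tables can be reused instead of being regenerated.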
+
+ # -----------------------------------------------------------------------------
+ # validate_files()
+ #
+ # This method checks to see if there are duplicated p_rulename() functions
+ # in the parser module file. Without this function, it is really easy for
+ # users to make mistakes by cutting and pasting code fragments (and it's a real
+ # bugger to try and figure out why the resulting parser doesn't work). Therefore,
+ # we just do a little regular expression pattern matching of def statements
+ # to try and detect duplicates.
+ # -----------------------------------------------------------------------------
+
+ def validate_files(self):
+ # Match def p_funcname(
+ fre = re.compile(r'\s*def\s+(p_[a-zA-Z_0-9]*)\(')
+
+ for filename in self.files.keys():
+ base,ext = os.path.splitext(filename)
+ if ext != '.py': return 1 # No idea. Assume it's okay.
+
+ try:
+ f = open(filename)
+ lines = f.readlines()
+ f.close()
+ except IOError:
+ continue
+
+ counthash = { }
+ for linen,l in enumerate(lines):
+ linen += 1
+ m = fre.match(l)
+ if m:
+ name = m.group(1)
+ prev = counthash.get(name)
+ if not prev:
+ counthash[name] = linen
+ else:
+ self.log.warning("%s:%d: Function %s redefined. Previously defined on line %d", filename,linen,name,prev)
+
+ # Get the start symbol
+ def get_start(self):
+ self.start = self.pdict.get('start')
+
+ # Validate the start symbol
+ def validate_start(self):
+ if self.start is not None:
+ if not isinstance(self.start,str):
+ self.log.error("'start' must be a string")
+
+ # Look for error handler
+ def get_error_func(self):
+ self.error_func = self.pdict.get('p_error')
+
+ # Validate the error function
+ def validate_error_func(self):
+ if self.error_func:
+ if isinstance(self.error_func,types.FunctionType):
+ ismethod = 0
+ elif isinstance(self.error_func, types.MethodType):
+ ismethod = 1
+ else:
+ self.log.error("'p_error' defined, but is not a function or method")
+ self.error = 1
+ return
+
+ eline = func_code(self.error_func).co_firstlineno
+ efile = func_code(self.error_func).co_filename
+ self.files[efile] = 1
+
+ if (func_code(self.error_func).co_argcount != 1+ismethod):
+ self.log.error("%s:%d: p_error() requires 1 argument",efile,eline)
+ self.error = 1
+
+ # Get the tokens map
+ def get_tokens(self):
+ tokens = self.pdict.get("tokens",None)
+ if not tokens:
+ self.log.error("No token list is defined")
+ self.error = 1
+ return
+
+ if not isinstance(tokens,(list, tuple)):
+ self.log.error("tokens must be a list or tuple")
+ self.error = 1
+ return
+
+ if not tokens:
+ self.log.error("tokens is empty")
+ self.error = 1
+ return
+
+ self.tokens = tokens
+
+ # Validate the tokens
+ def validate_tokens(self):
+ # Validate the tokens.
+ if 'error' in self.tokens:
+ self.log.error("Illegal token name 'error'. Is a reserved word")
+ self.error = 1
+ return
+
+ terminals = {}
+ for n in self.tokens:
+ if n in terminals:
+ self.log.warning("Token '%s' multiply defined", n)
+ terminals[n] = 1
+
+ # Get the precedence map (if any)
+ def get_precedence(self):
+ self.prec = self.pdict.get("precedence",None)
+
+ # Validate and parse the precedence map
+ def validate_precedence(self):
+ preclist = []
+ if self.prec:
+ if not isinstance(self.prec,(list,tuple)):
+ self.log.error("precedence must be a list or tuple")
+ self.error = 1
+ return
+ for level,p in enumerate(self.prec):
+ if not isinstance(p,(list,tuple)):
+ self.log.error("Bad precedence table")
+ self.error = 1
+ return
+
+ if len(p) < 2:
+ self.log.error("Malformed precedence entry %s. Must be (assoc, term, ..., term)",p)
+ self.error = 1
+ return
+ assoc = p[0]
+ if not isinstance(assoc,str):
+ self.log.error("precedence associativity must be a string")
+ self.error = 1
+ return
+ for term in p[1:]:
+ if not isinstance(term,str):
+ self.log.error("precedence items must be strings")
+ self.error = 1
+ return
+ preclist.append((term,assoc,level+1))
+ self.preclist = preclist
+
+ # Get all p_functions from the grammar
+ def get_pfunctions(self):
+ p_functions = []
+ for name, item in self.pdict.items():
+ if name[:2] != 'p_': continue
+ if name == 'p_error': continue
+ if isinstance(item,(types.FunctionType,types.MethodType)):
+ line = func_code(item).co_firstlineno
+ file = func_code(item).co_filename
+ p_functions.append((line,file,name,item.__doc__))
+
+ # Sort all of the actions by line number
+ p_functions.sort()
+ self.pfuncs = p_functions
+
+
+ # Validate all of the p_functions
+ def validate_pfunctions(self):
+ grammar = []
+ # Check for non-empty symbols
+ if len(self.pfuncs) == 0:
+ self.log.error("no rules of the form p_rulename are defined")
+ self.error = 1
+ return
+
+ for line, file, name, doc in self.pfuncs:
+ func = self.pdict[name]
+ if isinstance(func, types.MethodType):
+ reqargs = 2
+ else:
+ reqargs = 1
+ if func_code(func).co_argcount > reqargs:
+ self.log.error("%s:%d: Rule '%s' has too many arguments",file,line,func.__name__)
+ self.error = 1
+ elif func_code(func).co_argcount < reqargs:
+ self.log.error("%s:%d: Rule '%s' requires an argument",file,line,func.__name__)
+ self.error = 1
+ elif not func.__doc__:
+ self.log.warning("%s:%d: No documentation string specified in function '%s' (ignored)",file,line,func.__name__)
+ else:
+ try:
+ parsed_g = parse_grammar(doc,file,line)
+ for g in parsed_g:
+ grammar.append((name, g))
+ except SyntaxError:
+ e = sys.exc_info()[1]
+ self.log.error(str(e))
+ self.error = 1
+
+ # Looks like a valid grammar rule
+ # Mark the file in which defined.
+ self.files[file] = 1
+
+ # Secondary validation step that looks for p_ definitions that are not functions
+ # or functions that look like they might be grammar rules.
+
+ for n,v in self.pdict.items():
+ if n[0:2] == 'p_' and isinstance(v, (types.FunctionType, types.MethodType)): continue
+ if n[0:2] == 't_': continue
+ if n[0:2] == 'p_' and n != 'p_error':
+ self.log.warning("'%s' not defined as a function", n)
+ if ((isinstance(v,types.FunctionType) and func_code(v).co_argcount == 1) or
+ (isinstance(v,types.MethodType) and func_code(v).co_argcount == 2)):
+ try:
+ doc = v.__doc__.split(" ")
+ if doc[1] == ':':
+ self.log.warning("%s:%d: Possible grammar rule '%s' defined without p_ prefix",
+ func_code(v).co_filename, func_code(v).co_firstlineno,n)
+ except Exception:
+ pass
+
+ self.grammar = grammar
+
+# -----------------------------------------------------------------------------
+# yacc(module)
+#
+# Build a parser
+# -----------------------------------------------------------------------------
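+#
+# A hedged usage sketch (the token list, rule functions and lexer below are
+# assumed examples, not defined in this file):
+#
+#     tokens = ('NUMBER', 'PLUS')
+#
+#     def p_expr_plus(p):
+#         'expr : expr PLUS term'
+#         p[0] = p[1] + p[3]
+#
+#     def p_expr_term(p):
+#         'expr : term'
+#         p[0] = p[1]
+#
+#     def p_term_number(p):
+#         'term : NUMBER'
+#         p[0] = p[1]
+#
+#     parser = yacc()                       # build (or reload) the tables
+#     result = parser.parse(data, lexer=my_lexer)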
+
+def yacc(method='LALR', debug=yaccdebug, module=None, tabmodule=tab_module, start=None,
+ check_recursion=1, optimize=0, write_tables=1, debugfile=debug_file,outputdir='',
+ debuglog=None, errorlog = None, picklefile=None):
+
+ global parse # Reference to the parsing method of the last built parser
+
+ # If pickling is enabled, table files are not created
+
+ if picklefile:
+ write_tables = 0
+
+ if errorlog is None:
+ errorlog = PlyLogger(sys.stderr)
+
+ # Get the module dictionary used for the parser
+ if module:
+ _items = [(k,getattr(module,k)) for k in dir(module)]
+ pdict = dict(_items)
+ else:
+ pdict = get_caller_module_dict(2)
+
+ # Collect parser information from the dictionary
+ pinfo = ParserReflect(pdict,log=errorlog)
+ pinfo.get_all()
+
+ if pinfo.error:
+ raise YaccError("Unable to build parser")
+
+ # Check signature against table files (if any)
+ signature = pinfo.signature()
+
+ # Read the tables
+ try:
+ lr = LRTable()
+ if picklefile:
+ read_signature = lr.read_pickle(picklefile)
+ else:
+ read_signature = lr.read_table(tabmodule)
+ if optimize or (read_signature == signature):
+ try:
+ lr.bind_callables(pinfo.pdict)
+ parser = LRParser(lr,pinfo.error_func)
+ parse = parser.parse
+ return parser
+ except Exception:
+ e = sys.exc_info()[1]
+ errorlog.warning("There was a problem loading the table file: %s", repr(e))
+ except VersionError:
+ e = sys.exc_info()[1]
+ errorlog.warning(str(e))
+ except Exception:
+ pass
+
+ if debuglog is None:
+ if debug:
+ debuglog = PlyLogger(open(debugfile,"w"))
+ else:
+ debuglog = NullLogger()
+
+ debuglog.info("Created by PLY version %s (http://www.dabeaz.com/ply)", __version__)
+
+
+ errors = 0
+
+ # Validate the parser information
+ if pinfo.validate_all():
+ raise YaccError("Unable to build parser")
+
+ if not pinfo.error_func:
+ errorlog.warning("no p_error() function is defined")
+
+ # Create a grammar object
+ grammar = Grammar(pinfo.tokens)
+
+ # Set precedence level for terminals
+ for term, assoc, level in pinfo.preclist:
+ try:
+ grammar.set_precedence(term,assoc,level)
+ except GrammarError:
+ e = sys.exc_info()[1]
+ errorlog.warning("%s",str(e))
+
+ # Add productions to the grammar
+ for funcname, gram in pinfo.grammar:
+ file, line, prodname, syms = gram
+ try:
+ grammar.add_production(prodname,syms,funcname,file,line)
+ except GrammarError:
+ e = sys.exc_info()[1]
+ errorlog.error("%s",str(e))
+ errors = 1
+
+ # Set the grammar start symbols
+ try:
+ if start is None:
+ grammar.set_start(pinfo.start)
+ else:
+ grammar.set_start(start)
+ except GrammarError:
+ e = sys.exc_info()[1]
+ errorlog.error(str(e))
+ errors = 1
+
+ if errors:
+ raise YaccError("Unable to build parser")
+
+ # Verify the grammar structure
+ undefined_symbols = grammar.undefined_symbols()
+ for sym, prod in undefined_symbols:
+ errorlog.error("%s:%d: Symbol '%s' used, but not defined as a token or a rule",prod.file,prod.line,sym)
+ errors = 1
+
+ unused_terminals = grammar.unused_terminals()
+ if unused_terminals:
+ debuglog.info("")
+ debuglog.info("Unused terminals:")
+ debuglog.info("")
+ for term in unused_terminals:
+ errorlog.warning("Token '%s' defined, but not used", term)
+ debuglog.info(" %s", term)
+
+ # Print out all productions to the debug log
+ if debug:
+ debuglog.info("")
+ debuglog.info("Grammar")
+ debuglog.info("")
+ for n,p in enumerate(grammar.Productions):
+ debuglog.info("Rule %-5d %s", n, p)
+
+ # Find unused non-terminals
+ unused_rules = grammar.unused_rules()
+ for prod in unused_rules:
+ errorlog.warning("%s:%d: Rule '%s' defined, but not used", prod.file, prod.line, prod.name)
+
+ if len(unused_terminals) == 1:
+ errorlog.warning("There is 1 unused token")
+ if len(unused_terminals) > 1:
+ errorlog.warning("There are %d unused tokens", len(unused_terminals))
+
+ if len(unused_rules) == 1:
+ errorlog.warning("There is 1 unused rule")
+ if len(unused_rules) > 1:
+ errorlog.warning("There are %d unused rules", len(unused_rules))
+
+ if debug:
+ debuglog.info("")
+ debuglog.info("Terminals, with rules where they appear")
+ debuglog.info("")
+ terms = list(grammar.Terminals)
+ terms.sort()
+ for term in terms:
+ debuglog.info("%-20s : %s", term, " ".join([str(s) for s in grammar.Terminals[term]]))
+
+ debuglog.info("")
+ debuglog.info("Nonterminals, with rules where they appear")
+ debuglog.info("")
+ nonterms = list(grammar.Nonterminals)
+ nonterms.sort()
+ for nonterm in nonterms:
+ debuglog.info("%-20s : %s", nonterm, " ".join([str(s) for s in grammar.Nonterminals[nonterm]]))
+ debuglog.info("")
+
+ if check_recursion:
+ unreachable = grammar.find_unreachable()
+ for u in unreachable:
+ errorlog.warning("Symbol '%s' is unreachable",u)
+
+ infinite = grammar.infinite_cycles()
+ for inf in infinite:
+ errorlog.error("Infinite recursion detected for symbol '%s'", inf)
+ errors = 1
+
+ unused_prec = grammar.unused_precedence()
+ for term, assoc in unused_prec:
+ errorlog.error("Precedence rule '%s' defined for unknown symbol '%s'", assoc, term)
+ errors = 1
+
+ if errors:
+ raise YaccError("Unable to build parser")
+
+ # Run the LRGeneratedTable on the grammar
+ if debug:
+ errorlog.debug("Generating %s tables", method)
+
+ lr = LRGeneratedTable(grammar,method,debuglog)
+
+ if debug:
+ num_sr = len(lr.sr_conflicts)
+
+ # Report shift/reduce and reduce/reduce conflicts
+ if num_sr == 1:
+ errorlog.warning("1 shift/reduce conflict")
+ elif num_sr > 1:
+ errorlog.warning("%d shift/reduce conflicts", num_sr)
+
+ num_rr = len(lr.rr_conflicts)
+ if num_rr == 1:
+ errorlog.warning("1 reduce/reduce conflict")
+ elif num_rr > 1:
+ errorlog.warning("%d reduce/reduce conflicts", num_rr)
+
+ # Write out conflicts to the output file
+ if debug and (lr.sr_conflicts or lr.rr_conflicts):
+ debuglog.warning("")
+ debuglog.warning("Conflicts:")
+ debuglog.warning("")
+
+ for state, tok, resolution in lr.sr_conflicts:
+ debuglog.warning("shift/reduce conflict for %s in state %d resolved as %s", tok, state, resolution)
+
+ already_reported = {}
+ for state, rule, rejected in lr.rr_conflicts:
+ if (state,id(rule),id(rejected)) in already_reported:
+ continue
+ debuglog.warning("reduce/reduce conflict in state %d resolved using rule (%s)", state, rule)
+ debuglog.warning("rejected rule (%s) in state %d", rejected,state)
+ errorlog.warning("reduce/reduce conflict in state %d resolved using rule (%s)", state, rule)
+ errorlog.warning("rejected rule (%s) in state %d", rejected, state)
+ already_reported[state,id(rule),id(rejected)] = 1
+
+ warned_never = []
+ for state, rule, rejected in lr.rr_conflicts:
+ if not rejected.reduced and (rejected not in warned_never):
+ debuglog.warning("Rule (%s) is never reduced", rejected)
+ errorlog.warning("Rule (%s) is never reduced", rejected)
+ warned_never.append(rejected)
+
+ # Write the table file if requested
+ if write_tables:
+ lr.write_table(tabmodule,outputdir,signature)
+
+ # Write a pickled version of the tables
+ if picklefile:
+ lr.pickle_table(picklefile,signature)
+
+ # Build the parser
+ lr.bind_callables(pinfo.pdict)
+ parser = LRParser(lr,pinfo.error_func)
+
+ parse = parser.parse
+ return parser
diff --git a/bitbake/lib/progressbar.py b/bitbake/lib/progressbar.py
new file mode 100644
index 0000000..114cdc1
--- /dev/null
+++ b/bitbake/lib/progressbar.py
@@ -0,0 +1,384 @@
+#!/usr/bin/env python
+# -*- coding: iso-8859-1 -*-
+#
+# progressbar - Text progressbar library for python.
+# Copyright (c) 2005 Nilton Volpato
+#
+# This library is free software; you can redistribute it and/or
+# modify it under the terms of the GNU Lesser General Public
+# License as published by the Free Software Foundation; either
+# version 2.1 of the License, or (at your option) any later version.
+#
+# This library is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+# Lesser General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public
+# License along with this library; if not, write to the Free Software
+# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
+
+
+"""Text progressbar library for python.
+
+This library provides a text mode progressbar. This is typically used
+to display the progress of a long running operation, providing a
+visual clue that processing is underway.
+
+The ProgressBar class manages the progress, and the format of the line
+is given by a number of widgets. A widget is an object that may
+display differently depending on the state of the progress. There are
+three types of widget:
+- a string, which always shows itself;
+- a ProgressBarWidget, which may return a different value every time
+its update method is called; and
+- a ProgressBarWidgetHFill, which is like ProgressBarWidget, except it
+expands to fill the remaining width of the line.
+
+The progressbar module is very easy to use, yet very powerful, and it
+automatically supports features like auto-resizing when available.
+"""
+
+from __future__ import division
+
+__author__ = "Nilton Volpato"
+__author_email__ = "first-name dot last-name @ gmail.com"
+__date__ = "2006-05-07"
+__version__ = "2.3-dev"
+
+import sys, time, os
+from array import array
+try:
+ from fcntl import ioctl
+ import termios
+except ImportError:
+ pass
+import signal
+try:
+ basestring
+except NameError:
+ basestring = (str,)
+
+class ProgressBarWidget(object):
+ """This is an element of ProgressBar formatting.
+
+ The ProgressBar object will call its update method when an update
+ is needed. Its size may change between calls, but the results will
+ not be good if the size changes drastically and repeatedly.
+ """
+ def update(self, pbar):
+ """Returns the string representing the widget.
+
+ The parameter pbar is a reference to the calling ProgressBar,
+ where one can access attributes of the class for knowing how
+ the update must be made.
+
+ At least this function must be overridden."""
+ pass
+
+class ProgressBarWidgetHFill(object):
+ """This is a variable width element of ProgressBar formatting.
+
+ The ProgressBar object will call its update method, informing it of the
+ width this object must be made. This is like TeX \\hfill: it will
+ expand to fill the line. You can use more than one in the same
+ line, and they will all have the same width, and together will
+ fill the line.
+ """
+ def update(self, pbar, width):
+ """Returns the string representing the widget.
+
+ The parameter pbar is a reference to the calling ProgressBar,
+ where one can access attributes of the class for knowing how
+ the update must be made. The parameter width is the total
+ horizontal width the widget must have.
+
+ At least this function must be overridden."""
+ pass
+
+
+class ETA(ProgressBarWidget):
+ "Widget for the Estimated Time of Arrival"
+ def format_time(self, seconds):
+ return time.strftime('%H:%M:%S', time.gmtime(seconds))
+ def update(self, pbar):
+ if pbar.currval == 0:
+ return 'ETA: --:--:--'
+ elif pbar.finished:
+ return 'Time: %s' % self.format_time(pbar.seconds_elapsed)
+ else:
+ elapsed = pbar.seconds_elapsed
+ eta = elapsed * pbar.maxval / pbar.currval - elapsed
+ return 'ETA: %s' % self.format_time(eta)
+
+class FileTransferSpeed(ProgressBarWidget):
+ "Widget for showing the transfer speed (useful for file transfers)."
+ def __init__(self, unit='B'):
+ self.unit = unit
+ self.fmt = '%6.2f %s'
+ self.prefixes = ['', 'K', 'M', 'G', 'T', 'P']
+ def update(self, pbar):
+ if pbar.seconds_elapsed < 2e-6: # i.e. effectively zero
+ bps = 0.0
+ else:
+ bps = pbar.currval / pbar.seconds_elapsed
+ spd = bps
+ for u in self.prefixes:
+ if spd < 1000:
+ break
+ spd /= 1000
+ return self.fmt % (spd, u + self.unit + '/s')
+
+class RotatingMarker(ProgressBarWidget):
+ "A rotating marker for filling the bar of progress."
+ def __init__(self, markers='|/-\\'):
+ self.markers = markers
+ self.curmark = -1
+ def update(self, pbar):
+ if pbar.finished:
+ return self.markers[0]
+ self.curmark = (self.curmark + 1) % len(self.markers)
+ return self.markers[self.curmark]
+
+class Percentage(ProgressBarWidget):
+ "Just the percentage done."
+ def update(self, pbar):
+ return '%3d%%' % pbar.percentage()
+
+class SimpleProgress(ProgressBarWidget):
+ "Returns what is already done and the total, e.g.: '5 of 47'"
+ def __init__(self, sep=' of '):
+ self.sep = sep
+ def update(self, pbar):
+ return '%d%s%d' % (pbar.currval, self.sep, pbar.maxval)
+
+class Bar(ProgressBarWidgetHFill):
+ "The bar of progress. It will stretch to fill the line."
+ def __init__(self, marker='#', left='|', right='|'):
+ self.marker = marker
+ self.left = left
+ self.right = right
+ def _format_marker(self, pbar):
+ if isinstance(self.marker, basestring):
+ return self.marker
+ else:
+ return self.marker.update(pbar)
+ def update(self, pbar, width):
+ percent = pbar.percentage()
+ cwidth = width - len(self.left) - len(self.right)
+ marked_width = int(percent * cwidth // 100)
+ m = self._format_marker(pbar)
+ bar = (self.left + (m * marked_width).ljust(cwidth) + self.right)
+ return bar
+
+class ReverseBar(Bar):
+ "The reverse bar of progress, or bar of regress. :)"
+ def update(self, pbar, width):
+ percent = pbar.percentage()
+ cwidth = width - len(self.left) - len(self.right)
+ marked_width = int(percent * cwidth // 100)
+ m = self._format_marker(pbar)
+ bar = (self.left + (m*marked_width).rjust(cwidth) + self.right)
+ return bar
+
+default_widgets = [Percentage(), ' ', Bar()]
+class ProgressBar(object):
+ """This is the ProgressBar class, it updates and prints the bar.
+
+ A common way of using it is like:
+ >>> pbar = ProgressBar().start()
+ >>> for i in xrange(100):
+ ... # do something
+ ... pbar.update(i+1)
+ ...
+ >>> pbar.finish()
+
+ You can also use a progressbar as an iterator:
+ >>> progress = ProgressBar()
+ >>> for i in progress(some_iterable):
+ ... # do something
+ ...
+
+ But anything you want to do is possible (well, almost anything).
+ You can supply different widgets of any type in any order. And you
+ can even write your own widgets! There are many widgets already
+ shipped and you should experiment with them.
+
+ The term_width parameter must be an integer or None. In the latter case
+ it will try to guess it; if that fails, it will default to 80 columns.
+
+ When implementing a widget's update method you may access any
+ attribute or function of the ProgressBar object that is calling the
+ widget's update method. The most important attributes you would
+ like to access are:
+ - currval: current value of the progress, 0 <= currval <= maxval
+ - maxval: maximum (and final) value of the progress
+ - finished: True if the bar has finished (reached 100%), False otherwise
+ - start_time: the time when start() method of ProgressBar was called
+ - seconds_elapsed: seconds elapsed since start_time
+ - percentage(): percentage of the progress [0..100]. This is a method.
+
+ The attributes above are unlikely to change between different versions;
+ the other ones may change or cease to exist without notice, so try to rely
+ only on the ones documented above if you are extending the progress bar.
+ """
+
+ __slots__ = ('currval', 'fd', 'finished', 'last_update_time', 'maxval',
+ 'next_update', 'num_intervals', 'seconds_elapsed',
+ 'signal_set', 'start_time', 'term_width', 'update_interval',
+ 'widgets', '_iterable')
+
+ _DEFAULT_MAXVAL = 100
+
+ def __init__(self, maxval=None, widgets=default_widgets, term_width=None,
+ fd=sys.stderr):
+ self.maxval = maxval
+ self.widgets = widgets
+ self.fd = fd
+ self.signal_set = False
+ if term_width is not None:
+ self.term_width = term_width
+ else:
+ try:
+ self._handle_resize(None, None)
+ signal.signal(signal.SIGWINCH, self._handle_resize)
+ self.signal_set = True
+ except (SystemExit, KeyboardInterrupt):
+ raise
+ except:
+ self.term_width = int(os.environ.get('COLUMNS', 80)) - 1
+
+ self.currval = 0
+ self.finished = False
+ self.start_time = None
+ self.last_update_time = None
+ self.seconds_elapsed = 0
+ self._iterable = None
+
+ def __call__(self, iterable):
+ try:
+ self.maxval = len(iterable)
+ except TypeError:
+ # If the iterable has no length, then rely on the value provided
+ # by the user, otherwise fail.
+ if not (isinstance(self.maxval, (int, long)) and self.maxval > 0):
+ raise RuntimeError('Could not determine maxval from iterable. '
+ 'You must explicitly provide a maxval.')
+ self._iterable = iter(iterable)
+ self.start()
+ return self
+
+ def __iter__(self):
+ return self
+
+ def next(self):
+ try:
+ next = self._iterable.next()
+ self.update(self.currval + 1)
+ return next
+ except StopIteration:
+ self.finish()
+ raise
+
+ def _handle_resize(self, signum, frame):
+ h, w = array('h', ioctl(self.fd, termios.TIOCGWINSZ, '\0' * 8))[:2]
+ self.term_width = w
+
+ def percentage(self):
+ "Returns the percentage of the progress."
+ return self.currval * 100.0 / self.maxval
+
+ def _format_widgets(self):
+ r = []
+ hfill_inds = []
+ num_hfill = 0
+ currwidth = 0
+ for i, w in enumerate(self.widgets):
+ if isinstance(w, ProgressBarWidgetHFill):
+ r.append(w)
+ hfill_inds.append(i)
+ num_hfill += 1
+ elif isinstance(w, basestring):
+ r.append(w)
+ currwidth += len(w)
+ else:
+ weval = w.update(self)
+ currwidth += len(weval)
+ r.append(weval)
+ for iw in hfill_inds:
+ widget_width = int((self.term_width - currwidth) // num_hfill)
+ r[iw] = r[iw].update(self, widget_width)
+ return r
+
+ def _format_line(self):
+ return ''.join(self._format_widgets()).ljust(self.term_width)
+
+ def _next_update(self):
+ return int((int(self.num_intervals *
+ (self.currval / self.maxval)) + 1) *
+ self.update_interval)
+
+ def _need_update(self):
+ """Returns true when the progressbar should print an updated line.
+
+ You can override this method if you want finer grained control over
+ updates.
+
+ The current implementation is optimized to be as fast as possible and
+ as economical as possible in the number of updates. However, depending
+ on your usage you may want to do more updates. For instance, if your
+ progressbar stays in the same percentage for a long time, and you want
+ to update other widgets, like ETA, then you could return True after
+ some time has passed with no updates.
+
+ Ideally you could call self._format_line() and see if it's different
+ from the previous _format_line() call, but calling _format_line() takes
+ around 20 times more time than calling this implementation of
+ _need_update().
+ """
+ return self.currval >= self.next_update
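+
+ # Illustrative sketch (not part of the original class): _need_update()
+ # can be overridden for finer-grained refreshes, e.g. a hypothetical
+ # subclass that also redraws roughly once per second so that time-based
+ # widgets such as ETA keep ticking even when the percentage is unchanged:
+ #
+ #   class TimedProgressBar(ProgressBar):
+ #       def _need_update(self):
+ #           if time.time() - (self.last_update_time or 0) >= 1:
+ #               return True
+ #           return ProgressBar._need_update(self)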
+
+ def update(self, value):
+ "Updates the progress bar to a new value."
+ assert 0 <= value <= self.maxval, '0 <= %d <= %d' % (value, self.maxval)
+ self.currval = value
+ if not self._need_update():
+ return
+ if self.start_time is None:
+ raise RuntimeError('You must call start() before calling update()')
+ now = time.time()
+ self.seconds_elapsed = now - self.start_time
+ self.next_update = self._next_update()
+ self.fd.write(self._format_line() + '\r')
+ self.last_update_time = now
+
+ def start(self):
+ """Starts measuring time, and prints the bar at 0%.
+
+ It returns self so you can use it like this:
+ >>> pbar = ProgressBar().start()
+ >>> for i in xrange(100):
+ ... # do something
+ ... pbar.update(i+1)
+ ...
+ >>> pbar.finish()
+ """
+ if self.maxval is None:
+ self.maxval = self._DEFAULT_MAXVAL
+ assert self.maxval > 0
+
+ self.num_intervals = max(100, self.term_width)
+ self.update_interval = self.maxval / self.num_intervals
+ self.next_update = 0
+
+ self.start_time = self.last_update_time = time.time()
+ self.update(0)
+ return self
+
+ def finish(self):
+ """Used to tell the progress is finished."""
+ self.finished = True
+ self.update(self.maxval)
+ self.fd.write('\n')
+ if self.signal_set:
+ signal.signal(signal.SIGWINCH, signal.SIG_DFL)
diff --git a/bitbake/lib/prserv/__init__.py b/bitbake/lib/prserv/__init__.py
new file mode 100644
index 0000000..c3cb73a
--- /dev/null
+++ b/bitbake/lib/prserv/__init__.py
@@ -0,0 +1,14 @@
+__version__ = "1.0.0"
+
+import os, time
+import sys,logging
+
+def init_logger(logfile, loglevel):
+ numeric_level = getattr(logging, loglevel.upper(), None)
+ if not isinstance(numeric_level, int):
+ raise ValueError('Invalid log level: %s' % loglevel)
+ FORMAT = '%(asctime)-15s %(message)s'
+ logging.basicConfig(level=numeric_level, filename=logfile, format=FORMAT)
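+
+# Illustrative example (not part of the original module): the loglevel string
+# is matched case-insensitively against the standard logging level names,
+# e.g. (hypothetical logfile path):
+#
+#   init_logger("/tmp/prserv.log", "debug")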
+
+class NotFoundError(Exception):
+ pass
diff --git a/bitbake/lib/prserv/db.py b/bitbake/lib/prserv/db.py
new file mode 100644
index 0000000..4379580
--- /dev/null
+++ b/bitbake/lib/prserv/db.py
@@ -0,0 +1,268 @@
+import logging
+import os.path
+import errno
+import prserv
+import time
+
+try:
+ import sqlite3
+except ImportError:
+ from pysqlite2 import dbapi2 as sqlite3
+
+logger = logging.getLogger("BitBake.PRserv")
+
+sqlversion = sqlite3.sqlite_version_info
+if sqlversion[0] < 3 or (sqlversion[0] == 3 and sqlversion[1] < 3):
+ raise Exception("sqlite3 version 3.3.0 or later is required.")
+
+#
+# "No History" mode - for a given query tuple (version, pkgarch, checksum),
+# the returned value will be the largest among all the values of the same
+# (version, pkgarch). This means the PR value returned can NOT be decremented.
+#
+# "History" mode - Return a new higher value for previously unseen query
+# tuple (version, pkgarch, checksum), otherwise return historical value.
+# Value can decrement if returning to a previous build.
+#
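+
+# Illustrative sketch (not part of the original module): how the two modes
+# behave for a hypothetical table obtained from the PRData/PRTable API
+# defined below. With nohist=True:
+#
+#   table.getValue("1.0-r0", "qemux86", "aaa")  -> 0  (first value)
+#   table.getValue("1.0-r0", "qemux86", "bbb")  -> 1  (new checksum, bumped)
+#   table.getValue("1.0-r0", "qemux86", "aaa")  -> 2  (bumped again; values
+#                                                      never decrement)
+#
+# With nohist=False the last call would instead return the historical
+# value 0 recorded for that checksum.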
+
+class PRTable(object):
+ def __init__(self, conn, table, nohist):
+ self.conn = conn
+ self.nohist = nohist
+ self.dirty = False
+ if nohist:
+ self.table = "%s_nohist" % table
+ else:
+ self.table = "%s_hist" % table
+
+ self._execute("CREATE TABLE IF NOT EXISTS %s \
+ (version TEXT NOT NULL, \
+ pkgarch TEXT NOT NULL, \
+ checksum TEXT NOT NULL, \
+ value INTEGER, \
+ PRIMARY KEY (version, pkgarch, checksum));" % self.table)
+
+ def _execute(self, *query):
+ """Execute a query, waiting to acquire a lock if necessary"""
+ start = time.time()
+ end = start + 20
+ while True:
+ try:
+ return self.conn.execute(*query)
+ except sqlite3.OperationalError as exc:
+ if 'is locked' in str(exc) and end > time.time():
+ continue
+ raise exc
+
+ def sync(self):
+ self.conn.commit()
+ self._execute("BEGIN EXCLUSIVE TRANSACTION")
+
+ def sync_if_dirty(self):
+ if self.dirty:
+ self.sync()
+ self.dirty = False
+
+ def _getValueHist(self, version, pkgarch, checksum):
+ data=self._execute("SELECT value FROM %s WHERE version=? AND pkgarch=? AND checksum=?;" % self.table,
+ (version, pkgarch, checksum))
+ row=data.fetchone()
+ if row != None:
+ return row[0]
+ else:
+ #no value found, try to insert
+ try:
+ self._execute("INSERT INTO %s VALUES (?, ?, ?, (select ifnull(max(value)+1,0) from %s where version=? AND pkgarch=?));"
+ % (self.table,self.table),
+ (version,pkgarch, checksum,version, pkgarch))
+ except sqlite3.IntegrityError as exc:
+ logger.error(str(exc))
+
+ self.dirty = True
+
+ data=self._execute("SELECT value FROM %s WHERE version=? AND pkgarch=? AND checksum=?;" % self.table,
+ (version, pkgarch, checksum))
+ row=data.fetchone()
+ if row != None:
+ return row[0]
+ else:
+ raise prserv.NotFoundError
+
+ def _getValueNohist(self, version, pkgarch, checksum):
+ data=self._execute("SELECT value FROM %s \
+ WHERE version=? AND pkgarch=? AND checksum=? AND \
+ value >= (select max(value) from %s where version=? AND pkgarch=?);"
+ % (self.table, self.table),
+ (version, pkgarch, checksum, version, pkgarch))
+ row=data.fetchone()
+ if row != None:
+ return row[0]
+ else:
+ #no value found, try to insert
+ try:
+ self._execute("INSERT OR REPLACE INTO %s VALUES (?, ?, ?, (select ifnull(max(value)+1,0) from %s where version=? AND pkgarch=?));"
+ % (self.table,self.table),
+ (version, pkgarch, checksum, version, pkgarch))
+ except sqlite3.IntegrityError as exc:
+ logger.error(str(exc))
+ self.conn.rollback()
+
+ self.dirty = True
+
+ data=self._execute("SELECT value FROM %s WHERE version=? AND pkgarch=? AND checksum=?;" % self.table,
+ (version, pkgarch, checksum))
+ row=data.fetchone()
+ if row != None:
+ return row[0]
+ else:
+ raise prserv.NotFoundError
+
+ def getValue(self, version, pkgarch, checksum):
+ if self.nohist:
+ return self._getValueNohist(version, pkgarch, checksum)
+ else:
+ return self._getValueHist(version, pkgarch, checksum)
+
+ def _importHist(self, version, pkgarch, checksum, value):
+ val = None
+ data = self._execute("SELECT value FROM %s WHERE version=? AND pkgarch=? AND checksum=?;" % self.table,
+ (version, pkgarch, checksum))
+ row = data.fetchone()
+ if row != None:
+ val=row[0]
+ else:
+ #no value found, try to insert
+ try:
+ self._execute("INSERT INTO %s VALUES (?, ?, ?, ?);" % (self.table),
+ (version, pkgarch, checksum, value))
+ except sqlite3.IntegrityError as exc:
+ logger.error(str(exc))
+
+ self.dirty = True
+
+ data = self._execute("SELECT value FROM %s WHERE version=? AND pkgarch=? AND checksum=?;" % self.table,
+ (version, pkgarch, checksum))
+ row = data.fetchone()
+ if row != None:
+ val = row[0]
+ return val
+
+ def _importNohist(self, version, pkgarch, checksum, value):
+ try:
+ #try to insert
+ self._execute("INSERT INTO %s VALUES (?, ?, ?, ?);" % (self.table),
+ (version, pkgarch, checksum,value))
+ except sqlite3.IntegrityError as exc:
+ #already have the record, try to update
+ try:
+ self._execute("UPDATE %s SET value=? WHERE version=? AND pkgarch=? AND checksum=? AND value<?"
+ % (self.table),
+ (value,version,pkgarch,checksum,value))
+ except sqlite3.IntegrityError as exc:
+ logger.error(str(exc))
+
+ self.dirty = True
+
+ data = self._execute("SELECT value FROM %s WHERE version=? AND pkgarch=? AND checksum=? AND value>=?;" % self.table,
+ (version,pkgarch,checksum,value))
+ row=data.fetchone()
+ if row != None:
+ return row[0]
+ else:
+ return None
+
+ def importone(self, version, pkgarch, checksum, value):
+ if self.nohist:
+ return self._importNohist(version, pkgarch, checksum, value)
+ else:
+ return self._importHist(version, pkgarch, checksum, value)
+
+ def export(self, version, pkgarch, checksum, colinfo):
+ metainfo = {}
+ #column info
+ if colinfo:
+ metainfo['tbl_name'] = self.table
+ metainfo['core_ver'] = prserv.__version__
+ metainfo['col_info'] = []
+ data = self._execute("PRAGMA table_info(%s);" % self.table)
+ for row in data:
+ col = {}
+ col['name'] = row['name']
+ col['type'] = row['type']
+ col['notnull'] = row['notnull']
+ col['dflt_value'] = row['dflt_value']
+ col['pk'] = row['pk']
+ metainfo['col_info'].append(col)
+
+ #data info
+ datainfo = []
+
+ if self.nohist:
+ sqlstmt = "SELECT T1.version, T1.pkgarch, T1.checksum, T1.value FROM %s as T1, \
+ (SELECT version,pkgarch,max(value) as maxvalue FROM %s GROUP BY version,pkgarch) as T2 \
+ WHERE T1.version=T2.version AND T1.pkgarch=T2.pkgarch AND T1.value=T2.maxvalue " % (self.table, self.table)
+ else:
+ sqlstmt = "SELECT * FROM %s as T1 WHERE 1=1 " % self.table
+ sqlarg = []
+ where = ""
+ if version:
+ where += "AND T1.version=? "
+ sqlarg.append(str(version))
+ if pkgarch:
+ where += "AND T1.pkgarch=? "
+ sqlarg.append(str(pkgarch))
+ if checksum:
+ where += "AND T1.checksum=? "
+ sqlarg.append(str(checksum))
+
+ sqlstmt += where + ";"
+
+ if len(sqlarg):
+ data = self._execute(sqlstmt, tuple(sqlarg))
+ else:
+ data = self._execute(sqlstmt)
+ for row in data:
+ if row['version']:
+ col = {}
+ col['version'] = row['version']
+ col['pkgarch'] = row['pkgarch']
+ col['checksum'] = row['checksum']
+ col['value'] = row['value']
+ datainfo.append(col)
+ return (metainfo, datainfo)
+
+class PRData(object):
+ """Object representing the PR database"""
+ def __init__(self, filename, nohist=True):
+ self.filename=os.path.abspath(filename)
+ self.nohist=nohist
+ #build directory hierarchy
+ try:
+ os.makedirs(os.path.dirname(self.filename))
+ except OSError as e:
+ if e.errno != errno.EEXIST:
+ raise e
+ self.connection=sqlite3.connect(self.filename, isolation_level="EXCLUSIVE", check_same_thread = False)
+ self.connection.row_factory=sqlite3.Row
+ self.connection.execute("pragma synchronous = off;")
+ self.connection.execute("PRAGMA journal_mode = WAL;")
+ self._tables={}
+
+ def __del__(self):
+ self.connection.close()
+
+ def __getitem__(self,tblname):
+ if not isinstance(tblname, basestring):
+ raise TypeError("tblname argument must be a string, not '%s'" %
+ type(tblname))
+ if tblname in self._tables:
+ return self._tables[tblname]
+ else:
+ tableobj = self._tables[tblname] = PRTable(self.connection, tblname, self.nohist)
+ return tableobj
+
+ def __delitem__(self, tblname):
+ if tblname in self._tables:
+ del self._tables[tblname]
+ logger.info("drop table %s" % (tblname))
+ self.connection.execute("DROP TABLE IF EXISTS %s;" % tblname)
diff --git a/bitbake/lib/prserv/serv.py b/bitbake/lib/prserv/serv.py
new file mode 100644
index 0000000..5c0ffb9
--- /dev/null
+++ b/bitbake/lib/prserv/serv.py
@@ -0,0 +1,418 @@
+import os,sys,logging
+import signal, time
+from SimpleXMLRPCServer import SimpleXMLRPCServer, SimpleXMLRPCRequestHandler
+import threading
+import Queue
+
+try:
+ import sqlite3
+except ImportError:
+ from pysqlite2 import dbapi2 as sqlite3
+
+import bb.server.xmlrpc
+import prserv
+import prserv.db
+import errno
+
+logger = logging.getLogger("BitBake.PRserv")
+
+if sys.hexversion < 0x020600F0:
+ print("Sorry, python 2.6 or later is required.")
+ sys.exit(1)
+
+class Handler(SimpleXMLRPCRequestHandler):
+ def _dispatch(self,method,params):
+ try:
+ value=self.server.funcs[method](*params)
+ except:
+ import traceback
+ traceback.print_exc()
+ raise
+ return value
+
+PIDPREFIX = "/tmp/PRServer_%s_%s.pid"
+singleton = None
+
+
+class PRServer(SimpleXMLRPCServer):
+ def __init__(self, dbfile, logfile, interface, daemon=True):
+ ''' constructor '''
+ import socket
+ try:
+ SimpleXMLRPCServer.__init__(self, interface,
+ logRequests=False, allow_none=True)
+ except socket.error:
+ ip=socket.gethostbyname(interface[0])
+ port=interface[1]
+ msg="PR Server unable to bind to %s:%s\n" % (ip, port)
+ sys.stderr.write(msg)
+ raise PRServiceConfigError
+
+ self.dbfile=dbfile
+ self.daemon=daemon
+ self.logfile=logfile
+ self.working_thread=None
+ self.host, self.port = self.socket.getsockname()
+ self.pidfile=PIDPREFIX % (self.host, self.port)
+
+ self.register_function(self.getPR, "getPR")
+ self.register_function(self.quit, "quit")
+ self.register_function(self.ping, "ping")
+ self.register_function(self.export, "export")
+ self.register_function(self.importone, "importone")
+ self.register_introspection_functions()
+
+ self.db = prserv.db.PRData(self.dbfile)
+ self.table = self.db["PRMAIN"]
+
+ self.requestqueue = Queue.Queue()
+ self.handlerthread = threading.Thread(target = self.process_request_thread)
+ self.handlerthread.daemon = False
+
+ def process_request_thread(self):
+ """Same as in BaseServer but as a thread.
+
+ In addition, exception handling is done here.
+
+ """
+ iter_count = 1
+ # Sync the table after every 60 handled requests, or roughly every 30 seconds if it is dirty and no requests arrive
+ iterations_between_sync = 60
+
+ while not self.quit:
+ try:
+ (request, client_address) = self.requestqueue.get(True, 30)
+ except Queue.Empty:
+ self.table.sync_if_dirty()
+ continue
+ try:
+ self.finish_request(request, client_address)
+ self.shutdown_request(request)
+ iter_count = (iter_count + 1) % iterations_between_sync
+ if iter_count == 0:
+ self.table.sync_if_dirty()
+ except:
+ self.handle_error(request, client_address)
+ self.shutdown_request(request)
+ self.table.sync()
+ self.table.sync_if_dirty()
+
+ def sigint_handler(self, signum, stack):
+ self.table.sync()
+
+ def sigterm_handler(self, signum, stack):
+ self.table.sync()
+ raise SystemExit
+
+ def process_request(self, request, client_address):
+ self.requestqueue.put((request, client_address))
+
+ def export(self, version=None, pkgarch=None, checksum=None, colinfo=True):
+ try:
+ return self.table.export(version, pkgarch, checksum, colinfo)
+ except sqlite3.Error as exc:
+ logger.error(str(exc))
+ return None
+
+ def importone(self, version, pkgarch, checksum, value):
+ return self.table.importone(version, pkgarch, checksum, value)
+
+ def ping(self):
+ return not self.quit
+
+ def getinfo(self):
+ return (self.host, self.port)
+
+ def getPR(self, version, pkgarch, checksum):
+ try:
+ return self.table.getValue(version, pkgarch, checksum)
+ except prserv.NotFoundError:
+ logger.error("can not find value for (%s, %s)",version, checksum)
+ return None
+ except sqlite3.Error as exc:
+ logger.error(str(exc))
+ return None
+
+ def quit(self):
+ self.quit=True
+ return
+
+ def work_forever(self,):
+ self.quit = False
+ self.timeout = 0.5
+
+ logger.info("Started PRServer with DBfile: %s, IP: %s, PORT: %s, PID: %s" %
+ (self.dbfile, self.host, self.port, str(os.getpid())))
+
+ self.handlerthread.start()
+ while not self.quit:
+ self.handle_request()
+ self.handlerthread.join()
+ self.table.sync()
+ logger.info("PRServer: stopping...")
+ self.server_close()
+ return
+
+ def start(self):
+ if self.daemon:
+ pid = self.daemonize()
+ else:
+ pid = self.fork()
+
+ # Log this in the parent too; the child logs the same information from work_forever() above
+ logger.info("Started PRServer with DBfile: %s, IP: %s, PORT: %s, PID: %s" %
+ (self.dbfile, self.host, self.port, str(pid)))
+
+ def delpid(self):
+ os.remove(self.pidfile)
+
+ def daemonize(self):
+ """
+ See Advanced Programming in the UNIX Environment, Section 13.3
+ """
+ try:
+ pid = os.fork()
+ if pid > 0:
+ os.waitpid(pid, 0)
+ #parent returns instead of exiting, to give control back to the caller
+ return pid
+ except OSError as e:
+ raise Exception("%s [%d]" % (e.strerror, e.errno))
+
+ os.setsid()
+ """
+ fork again to make sure the daemon is not a session leader,
+ which prevents it from ever acquiring a controlling terminal
+ """
+ try:
+ pid = os.fork()
+ if pid > 0: #parent
+ os._exit(0)
+ except OSError as e:
+ raise Exception("%s [%d]" % (e.strerror, e.errno))
+
+ self.cleanup_handles()
+ os._exit(0)
+
+ def fork(self):
+ try:
+ pid = os.fork()
+ if pid > 0:
+ return pid
+ except OSError as e:
+ raise Exception("%s [%d]" % (e.strerror, e.errno))
+
+ bb.utils.signal_on_parent_exit("SIGTERM")
+ self.cleanup_handles()
+ os._exit(0)
+
+ def cleanup_handles(self):
+ signal.signal(signal.SIGINT, self.sigint_handler)
+ signal.signal(signal.SIGTERM, self.sigterm_handler)
+ os.umask(0)
+ os.chdir("/")
+
+ sys.stdout.flush()
+ sys.stderr.flush()
+ si = file('/dev/null', 'r')
+ so = file(self.logfile, 'a+')
+ se = so
+ os.dup2(si.fileno(),sys.stdin.fileno())
+ os.dup2(so.fileno(),sys.stdout.fileno())
+ os.dup2(se.fileno(),sys.stderr.fileno())
+
+ # Clear out all log handlers prior to the fork() to avoid calling
+ # event handlers not part of the PRserver
+ for logger_iter in logging.Logger.manager.loggerDict.keys():
+ logging.getLogger(logger_iter).handlers = []
+
+ # Ensure logging makes it to the logfile
+ streamhandler = logging.StreamHandler()
+ streamhandler.setLevel(logging.DEBUG)
+ formatter = bb.msg.BBLogFormatter("%(levelname)s: %(message)s")
+ streamhandler.setFormatter(formatter)
+ logger.addHandler(streamhandler)
+
+ # write pidfile
+ pid = str(os.getpid())
+ pf = file(self.pidfile, 'w')
+ pf.write("%s\n" % pid)
+ pf.close()
+
+ self.work_forever()
+ self.delpid()
+
+class PRServSingleton(object):
+ def __init__(self, dbfile, logfile, interface):
+ self.dbfile = dbfile
+ self.logfile = logfile
+ self.interface = interface
+ self.host = None
+ self.port = None
+
+ def start(self):
+ self.prserv = PRServer(self.dbfile, self.logfile, self.interface, daemon=False)
+ self.prserv.start()
+ self.host, self.port = self.prserv.getinfo()
+
+ def getinfo(self):
+ return (self.host, self.port)
+
+class PRServerConnection(object):
+ def __init__(self, host, port):
+ if is_local_special(host, port):
+ host, port = singleton.getinfo()
+ self.host = host
+ self.port = port
+ self.connection, self.transport = bb.server.xmlrpc._create_server(self.host, self.port)
+
+ def terminate(self):
+ try:
+ logger.info("Terminating PRServer...")
+ self.connection.quit()
+ except Exception as exc:
+ sys.stderr.write("%s\n" % str(exc))
+
+ def getPR(self, version, pkgarch, checksum):
+ return self.connection.getPR(version, pkgarch, checksum)
+
+ def ping(self):
+ return self.connection.ping()
+
+ def export(self,version=None, pkgarch=None, checksum=None, colinfo=True):
+ return self.connection.export(version, pkgarch, checksum, colinfo)
+
+ def importone(self, version, pkgarch, checksum, value):
+ return self.connection.importone(version, pkgarch, checksum, value)
+
+ def getinfo(self):
+ return self.host, self.port
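+
+# Illustrative sketch (not part of the original module): querying a running
+# PR server directly; the host, port and query values are hypothetical:
+#
+#   conn = PRServerConnection("localhost", 8585)
+#   if conn.ping():
+#       pr = conn.getPR("1.0-r0", "qemux86", "deadbeef")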
+
+def start_daemon(dbfile, host, port, logfile):
+ pidfile = PIDPREFIX % (host, port)
+ try:
+ pf = file(pidfile,'r')
+ pid = int(pf.readline().strip())
+ pf.close()
+ except IOError:
+ pid = None
+
+ if pid:
+ sys.stderr.write("pidfile %s already exist. Daemon already running?\n"
+ % pidfile)
+ return 1
+
+ server = PRServer(os.path.abspath(dbfile), os.path.abspath(logfile), (host,port))
+ server.start()
+ return 0
+
+def stop_daemon(host, port):
+ pidfile = PIDPREFIX % (host, port)
+ try:
+ pf = file(pidfile,'r')
+ pid = int(pf.readline().strip())
+ pf.close()
+ except IOError:
+ pid = None
+
+ if not pid:
+ sys.stderr.write("pidfile %s does not exist. Daemon not running?\n"
+ % pidfile)
+
+ try:
+ PRServerConnection(host, port).terminate()
+ except:
+ logger.critical("Stop PRService %s:%d failed" % (host,port))
+
+ try:
+ if pid:
+ wait_timeout = 0
+ print("Waiting for pr-server to exit.")
+ while is_running(pid) and wait_timeout < 50:
+ time.sleep(0.1)
+ wait_timeout += 1
+
+ if is_running(pid):
+ print("Sending SIGTERM to pr-server.")
+ os.kill(pid,signal.SIGTERM)
+ time.sleep(0.1)
+
+ if os.path.exists(pidfile):
+ os.remove(pidfile)
+
+ except OSError as e:
+ err = str(e)
+ if err.find("No such process") <= 0:
+ raise e
+
+ return 0
+
+def is_running(pid):
+ try:
+ os.kill(pid, 0)
+ except OSError as err:
+ if err.errno == errno.ESRCH:
+ return False
+ return True
+
+def is_local_special(host, port):
+ if host.strip().upper() == 'localhost'.upper() and (not port):
+ return True
+ else:
+ return False
+
+class PRServiceConfigError(Exception):
+ pass
+
+def auto_start(d):
+ global singleton
+
+ host_params = filter(None, (d.getVar('PRSERV_HOST', True) or '').split(':'))
+ if not host_params:
+ return None
+
+ if len(host_params) != 2:
+ logger.critical('\n'.join(['PRSERV_HOST: incorrect format',
+ 'Usage: PRSERV_HOST = "<hostname>:<port>"']))
+ raise PRServiceConfigError
+
+ if is_local_special(host_params[0], int(host_params[1])) and not singleton:
+ import bb.utils
+ cachedir = (d.getVar("PERSISTENT_DIR", True) or d.getVar("CACHE", True))
+ if not cachedir:
+ logger.critical("Please set the 'PERSISTENT_DIR' or 'CACHE' variable")
+ raise PRServiceConfigError
+ bb.utils.mkdirhier(cachedir)
+ dbfile = os.path.join(cachedir, "prserv.sqlite3")
+ logfile = os.path.join(cachedir, "prserv.log")
+ singleton = PRServSingleton(os.path.abspath(dbfile), os.path.abspath(logfile), ("localhost",0))
+ singleton.start()
+ if singleton:
+ host, port = singleton.getinfo()
+ else:
+ host = host_params[0]
+ port = int(host_params[1])
+
+ try:
+ connection = PRServerConnection(host,port)
+ connection.ping()
+ realhost, realport = connection.getinfo()
+ return str(realhost) + ":" + str(realport)
+
+ except Exception:
+ logger.critical("PRservice %s:%d not available" % (host, port))
+ raise PRServiceConfigError
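+
+# Illustrative examples (not part of the original module) of PRSERV_HOST
+# values as handled by auto_start() above; the remote hostname/port are
+# hypothetical:
+#
+#   PRSERV_HOST = "localhost:0"               # start and use a local server
+#   PRSERV_HOST = "prserv.example.com:8585"   # use an already running server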
+
+def auto_shutdown(d=None):
+ global singleton
+ if singleton:
+ host, port = singleton.getinfo()
+ try:
+ PRServerConnection(host, port).terminate()
+ except:
+ logger.critical("Stop PRService %s:%d failed" % (host,port))
+ singleton = None
+
+def ping(host, port):
+ conn=PRServerConnection(host, port)
+ return conn.ping()
diff --git a/bitbake/lib/pyinotify.py b/bitbake/lib/pyinotify.py
new file mode 100644
index 0000000..2dae002
--- /dev/null
+++ b/bitbake/lib/pyinotify.py
@@ -0,0 +1,2416 @@
+#!/usr/bin/env python
+
+# pyinotify.py - python interface to inotify
+# Copyright (c) 2005-2015 Sebastien Martini <seb@dbzteam.org>
+#
+# Permission is hereby granted, free of charge, to any person obtaining a copy
+# of this software and associated documentation files (the "Software"), to deal
+# in the Software without restriction, including without limitation the rights
+# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+# copies of the Software, and to permit persons to whom the Software is
+# furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included in
+# all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+# THE SOFTWARE.
+"""
+pyinotify
+
+@author: Sebastien Martini
+@license: MIT License
+@contact: seb@dbzteam.org
+"""
+
+class PyinotifyError(Exception):
+ """Indicates exceptions raised by a Pyinotify class."""
+ pass
+
+
+class UnsupportedPythonVersionError(PyinotifyError):
+ """
+ Raised on unsupported Python versions.
+ """
+ def __init__(self, version):
+ """
+ @param version: Current Python version
+ @type version: string
+ """
+ err = 'Python %s is unsupported, requires at least Python 2.4'
+ PyinotifyError.__init__(self, err % version)
+
+
+# Check Python version
+import sys
+if sys.version_info < (2, 4):
+ raise UnsupportedPythonVersionError(sys.version)
+
+
+# Import directives
+import threading
+import os
+import select
+import struct
+import fcntl
+import errno
+import termios
+import array
+import logging
+import atexit
+from collections import deque
+from datetime import datetime, timedelta
+import time
+import re
+import asyncore
+import subprocess
+
+try:
+ from functools import reduce
+except ImportError:
+ pass # Will fail on Python 2.4 which has reduce() builtin anyway.
+
+try:
+ from glob import iglob as glob
+except ImportError:
+ # Python 2.4 does not have glob.iglob().
+ from glob import glob as glob
+
+try:
+ import ctypes
+ import ctypes.util
+except ImportError:
+ ctypes = None
+
+try:
+ import inotify_syscalls
+except ImportError:
+ inotify_syscalls = None
+
+
+__author__ = "seb@dbzteam.org (Sebastien Martini)"
+
+__version__ = "0.9.5"
+
+__metaclass__ = type # Use new-style classes by default
+
+
+# Compatibility mode: set to True to improve compatibility with
+# Pyinotify 0.7.1. Do not set this variable yourself, call the
+# function compatibility_mode() instead.
+COMPATIBILITY_MODE = False
+
+
+class InotifyBindingNotFoundError(PyinotifyError):
+ """
+ Raised when no inotify binding could be found.
+ """
+ def __init__(self):
+ err = "Couldn't find any inotify binding"
+ PyinotifyError.__init__(self, err)
+
+
+class INotifyWrapper:
+ """
+ Abstract class wrapping access to inotify's functions. This is an
+ internal class.
+ """
+ @staticmethod
+ def create():
+ # First, try to use ctypes.
+ if ctypes:
+ inotify = _CtypesLibcINotifyWrapper()
+ if inotify.init():
+ return inotify
+ # Second, see if C extension is compiled.
+ if inotify_syscalls:
+ inotify = _INotifySyscallsWrapper()
+ if inotify.init():
+ return inotify
+
+ def get_errno(self):
+ """
+ Return None if no errno code is available.
+ """
+ return self._get_errno()
+
+ def str_errno(self):
+ code = self.get_errno()
+ if code is None:
+ return 'Errno: no errno support'
+ return 'Errno=%s (%s)' % (os.strerror(code), errno.errorcode[code])
+
+ def inotify_init(self):
+ return self._inotify_init()
+
+ def inotify_add_watch(self, fd, pathname, mask):
+ # Unicode strings must be encoded to string prior to calling this
+ # method.
+ assert isinstance(pathname, str)
+ return self._inotify_add_watch(fd, pathname, mask)
+
+ def inotify_rm_watch(self, fd, wd):
+ return self._inotify_rm_watch(fd, wd)
+
+
+class _INotifySyscallsWrapper(INotifyWrapper):
+ def __init__(self):
+ # Stores the last errno value.
+ self._last_errno = None
+
+ def init(self):
+ assert inotify_syscalls
+ return True
+
+ def _get_errno(self):
+ return self._last_errno
+
+ def _inotify_init(self):
+ try:
+ fd = inotify_syscalls.inotify_init()
+ except IOError, err:
+ self._last_errno = err.errno
+ return -1
+ return fd
+
+ def _inotify_add_watch(self, fd, pathname, mask):
+ try:
+ wd = inotify_syscalls.inotify_add_watch(fd, pathname, mask)
+ except IOError, err:
+ self._last_errno = err.errno
+ return -1
+ return wd
+
+ def _inotify_rm_watch(self, fd, wd):
+ try:
+ ret = inotify_syscalls.inotify_rm_watch(fd, wd)
+ except IOError, err:
+ self._last_errno = err.errno
+ return -1
+ return ret
+
+
+class _CtypesLibcINotifyWrapper(INotifyWrapper):
+ def __init__(self):
+ self._libc = None
+ self._get_errno_func = None
+
+ def init(self):
+ assert ctypes
+
+ try_libc_name = 'c'
+ if sys.platform.startswith('freebsd'):
+ try_libc_name = 'inotify'
+
+ libc_name = None
+ try:
+ libc_name = ctypes.util.find_library(try_libc_name)
+ except (OSError, IOError):
+ pass # Will attempt to load it with None anyway.
+
+ if sys.version_info >= (2, 6):
+ self._libc = ctypes.CDLL(libc_name, use_errno=True)
+ self._get_errno_func = ctypes.get_errno
+ else:
+ self._libc = ctypes.CDLL(libc_name)
+ try:
+ location = self._libc.__errno_location
+ location.restype = ctypes.POINTER(ctypes.c_int)
+ self._get_errno_func = lambda: location().contents.value
+ except AttributeError:
+ pass
+
+ # Finally, check that libc has the needed inotify bindings.
+ if (not hasattr(self._libc, 'inotify_init') or
+ not hasattr(self._libc, 'inotify_add_watch') or
+ not hasattr(self._libc, 'inotify_rm_watch')):
+ return False
+
+ self._libc.inotify_init.argtypes = []
+ self._libc.inotify_init.restype = ctypes.c_int
+ self._libc.inotify_add_watch.argtypes = [ctypes.c_int, ctypes.c_char_p,
+ ctypes.c_uint32]
+ self._libc.inotify_add_watch.restype = ctypes.c_int
+ self._libc.inotify_rm_watch.argtypes = [ctypes.c_int, ctypes.c_int]
+ self._libc.inotify_rm_watch.restype = ctypes.c_int
+ return True
+
+ def _get_errno(self):
+ if self._get_errno_func is not None:
+ return self._get_errno_func()
+ return None
+
+ def _inotify_init(self):
+ assert self._libc is not None
+ return self._libc.inotify_init()
+
+ def _inotify_add_watch(self, fd, pathname, mask):
+ assert self._libc is not None
+ pathname = ctypes.create_string_buffer(pathname)
+ return self._libc.inotify_add_watch(fd, pathname, mask)
+
+ def _inotify_rm_watch(self, fd, wd):
+ assert self._libc is not None
+ return self._libc.inotify_rm_watch(fd, wd)
+
+ def _sysctl(self, *args):
+ assert self._libc is not None
+ return self._libc.sysctl(*args)
+
+
+# Logging
+def logger_init():
+ """Initialize logger instance."""
+ log = logging.getLogger("pyinotify")
+ console_handler = logging.StreamHandler()
+ console_handler.setFormatter(
+ logging.Formatter("[%(asctime)s %(name)s %(levelname)s] %(message)s"))
+ log.addHandler(console_handler)
+ log.setLevel(20)
+ return log
+
+log = logger_init()
+
+
+# inotify's variables
+class SysCtlINotify:
+ """
+ Access (read, write) inotify's variables through sysctl. Usually it
+ requires administrator rights to update them.
+
+ Examples:
+ - Read max_queued_events attribute: myvar = max_queued_events.value
+ - Update max_queued_events attribute: max_queued_events.value = 42
+ """
+
+ inotify_attrs = {'max_user_instances': 1,
+ 'max_user_watches': 2,
+ 'max_queued_events': 3}
+
+ def __init__(self, attrname, inotify_wrapper):
+ # FIXME: right now only supporting ctypes
+ assert ctypes
+ self._attrname = attrname
+ self._inotify_wrapper = inotify_wrapper
+ sino = ctypes.c_int * 3
+ self._attr = sino(5, 20, SysCtlINotify.inotify_attrs[attrname])
+
+ @staticmethod
+ def create(attrname):
+ """
+ Factory method instantiating and returning the right wrapper.
+ """
+ # FIXME: right now only supporting ctypes
+ if ctypes is None:
+ return None
+ inotify_wrapper = _CtypesLibcINotifyWrapper()
+ if not inotify_wrapper.init():
+ return None
+ return SysCtlINotify(attrname, inotify_wrapper)
+
+ def get_val(self):
+ """
+ Gets attribute's value. Raises OSError if the operation failed.
+
+ @return: stored value.
+ @rtype: int
+ """
+ oldv = ctypes.c_int(0)
+ size = ctypes.c_int(ctypes.sizeof(oldv))
+ sysctl = self._inotify_wrapper._sysctl
+ res = sysctl(self._attr, 3,
+ ctypes.c_voidp(ctypes.addressof(oldv)),
+ ctypes.addressof(size),
+ None, 0)
+ if res == -1:
+ raise OSError(self._inotify_wrapper.get_errno(),
+ self._inotify_wrapper.str_errno())
+ return oldv.value
+
+ def set_val(self, nval):
+ """
+ Sets new attribute's value. Raises OSError if the operation failed.
+
+ @param nval: replaces current value by nval.
+ @type nval: int
+ """
+ oldv = ctypes.c_int(0)
+ sizeo = ctypes.c_int(ctypes.sizeof(oldv))
+ newv = ctypes.c_int(nval)
+ sizen = ctypes.c_int(ctypes.sizeof(newv))
+ sysctl = self._inotify_wrapper._sysctl
+ res = sysctl(self._attr, 3,
+ ctypes.c_voidp(ctypes.addressof(oldv)),
+ ctypes.addressof(sizeo),
+ ctypes.c_voidp(ctypes.addressof(newv)),
+ sizen)
+ if res == -1:
+ raise OSError(self._inotify_wrapper.get_errno(),
+ self._inotify_wrapper.str_errno())
+
+ value = property(get_val, set_val)
+
+ def __repr__(self):
+ return '<%s=%d>' % (self._attrname, self.get_val())
+
+
+# Inotify's variables
+#
+# FIXME: currently these variables are only accessible when ctypes is used,
+# otherwise they are set to None.
+#
+# read: myvar = max_queued_events.value
+# update: max_queued_events.value = 42
+#
+for attrname in ('max_queued_events', 'max_user_instances', 'max_user_watches'):
+ globals()[attrname] = SysCtlINotify.create(attrname)
+
+
+class EventsCodes:
+ """
+ Set of codes corresponding to each kind of events.
+ Some of these flags are used to communicate with inotify, whereas
+ the others are sent to userspace by inotify notifying some events.
+
+ @cvar IN_ACCESS: File was accessed.
+ @type IN_ACCESS: int
+ @cvar IN_MODIFY: File was modified.
+ @type IN_MODIFY: int
+ @cvar IN_ATTRIB: Metadata changed.
+ @type IN_ATTRIB: int
+ @cvar IN_CLOSE_WRITE: Writable file was closed.
+ @type IN_CLOSE_WRITE: int
+ @cvar IN_CLOSE_NOWRITE: Unwritable file closed.
+ @type IN_CLOSE_NOWRITE: int
+ @cvar IN_OPEN: File was opened.
+ @type IN_OPEN: int
+ @cvar IN_MOVED_FROM: File was moved from X.
+ @type IN_MOVED_FROM: int
+ @cvar IN_MOVED_TO: File was moved to Y.
+ @type IN_MOVED_TO: int
+ @cvar IN_CREATE: Subfile was created.
+ @type IN_CREATE: int
+ @cvar IN_DELETE: Subfile was deleted.
+ @type IN_DELETE: int
+ @cvar IN_DELETE_SELF: Self (watched item itself) was deleted.
+ @type IN_DELETE_SELF: int
+ @cvar IN_MOVE_SELF: Self (watched item itself) was moved.
+ @type IN_MOVE_SELF: int
+ @cvar IN_UNMOUNT: Backing fs was unmounted.
+ @type IN_UNMOUNT: int
+ @cvar IN_Q_OVERFLOW: Event queue overflowed.
+ @type IN_Q_OVERFLOW: int
+ @cvar IN_IGNORED: File was ignored.
+ @type IN_IGNORED: int
+ @cvar IN_ONLYDIR: only watch the path if it is a directory (new
+ in kernel 2.6.15).
+ @type IN_ONLYDIR: int
+ @cvar IN_DONT_FOLLOW: don't follow a symlink (new in kernel 2.6.15).
+ Combined with IN_ONLYDIR we can make sure that we don't watch
+ the target of symlinks.
+ @type IN_DONT_FOLLOW: int
+ @cvar IN_EXCL_UNLINK: Events are not generated for children after they
+ have been unlinked from the watched directory.
+ (new in kernel 2.6.36).
+ @type IN_EXCL_UNLINK: int
+ @cvar IN_MASK_ADD: add to the mask of an already existing watch (new
+ in kernel 2.6.14).
+ @type IN_MASK_ADD: int
+ @cvar IN_ISDIR: Event occurred against dir.
+ @type IN_ISDIR: int
+ @cvar IN_ONESHOT: Only send event once.
+ @type IN_ONESHOT: int
+ @cvar ALL_EVENTS: Alias for considering all of the events.
+ @type ALL_EVENTS: int
+ """
+
+ # The idea here is 'configuration-as-code' - this way, we get our nice class
+ # constants, but we also get nice human-friendly text mappings to do lookups
+ # against as well, for free:
+ FLAG_COLLECTIONS = {'OP_FLAGS': {
+ 'IN_ACCESS' : 0x00000001, # File was accessed
+ 'IN_MODIFY' : 0x00000002, # File was modified
+ 'IN_ATTRIB' : 0x00000004, # Metadata changed
+ 'IN_CLOSE_WRITE' : 0x00000008, # Writable file was closed
+ 'IN_CLOSE_NOWRITE' : 0x00000010, # Unwritable file closed
+ 'IN_OPEN' : 0x00000020, # File was opened
+ 'IN_MOVED_FROM' : 0x00000040, # File was moved from X
+ 'IN_MOVED_TO' : 0x00000080, # File was moved to Y
+ 'IN_CREATE' : 0x00000100, # Subfile was created
+ 'IN_DELETE' : 0x00000200, # Subfile was deleted
+ 'IN_DELETE_SELF' : 0x00000400, # Self (watched item itself)
+ # was deleted
+ 'IN_MOVE_SELF' : 0x00000800, # Self (watched item itself) was moved
+ },
+ 'EVENT_FLAGS': {
+ 'IN_UNMOUNT' : 0x00002000, # Backing fs was unmounted
+ 'IN_Q_OVERFLOW' : 0x00004000, # Event queue overflowed
+ 'IN_IGNORED' : 0x00008000, # File was ignored
+ },
+ 'SPECIAL_FLAGS': {
+ 'IN_ONLYDIR' : 0x01000000, # only watch the path if it is a
+ # directory
+ 'IN_DONT_FOLLOW' : 0x02000000, # don't follow a symlink
+ 'IN_EXCL_UNLINK' : 0x04000000, # exclude events on unlinked objects
+ 'IN_MASK_ADD' : 0x20000000, # add to the mask of an already
+ # existing watch
+ 'IN_ISDIR' : 0x40000000, # event occurred against dir
+ 'IN_ONESHOT' : 0x80000000, # only send event once
+ },
+ }
+
+ def maskname(mask):
+ """
+ Returns the event name associated to mask. IN_ISDIR is appended to
+ the result when appropriate. Note: only one event is returned, because
+ only one event can be raised at a given time.
+
+ @param mask: mask.
+ @type mask: int
+ @return: event name.
+ @rtype: str
+ """
+ ms = mask
+ name = '%s'
+ if mask & IN_ISDIR:
+ ms = mask - IN_ISDIR
+ name = '%s|IN_ISDIR'
+ return name % EventsCodes.ALL_VALUES[ms]
+
+ maskname = staticmethod(maskname)
+
+
+# So let's now turn the configuration into code
+EventsCodes.ALL_FLAGS = {}
+EventsCodes.ALL_VALUES = {}
+for flagc, valc in EventsCodes.FLAG_COLLECTIONS.items():
+ # Make the collections' members directly accessible through the
+ # class dictionary
+ setattr(EventsCodes, flagc, valc)
+
+ # Collect all the flags under a common umbrella
+ EventsCodes.ALL_FLAGS.update(valc)
+
+ # Make the individual masks accessible as 'constants' at globals() scope
+ # and masknames accessible by values.
+ for name, val in valc.items():
+ globals()[name] = val
+ EventsCodes.ALL_VALUES[val] = name
+
+
+# all 'normal' events
+ALL_EVENTS = reduce(lambda x, y: x | y, EventsCodes.OP_FLAGS.values())
+EventsCodes.ALL_FLAGS['ALL_EVENTS'] = ALL_EVENTS
+EventsCodes.ALL_VALUES[ALL_EVENTS] = 'ALL_EVENTS'
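+
+# Illustrative examples (not part of the original module): after the loop
+# above the flags exist both as module-level constants and inside
+# EventsCodes, e.g.
+#   mask = IN_CREATE | IN_DELETE                 # watch creations/deletions
+#   EventsCodes.ALL_VALUES[IN_CREATE]            # -> 'IN_CREATE'
+#   EventsCodes.maskname(IN_CREATE | IN_ISDIR)   # -> 'IN_CREATE|IN_ISDIR'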
+
+
+class _Event:
+ """
+ Event structure, represents events raised by the system. This
+ is the base class and should be subclassed.
+
+ """
+ def __init__(self, dict_):
+ """
+ Attach attributes (contained in dict_) to self.
+
+ @param dict_: Set of attributes.
+ @type dict_: dictionary
+ """
+ for tpl in dict_.items():
+ setattr(self, *tpl)
+
+ def __repr__(self):
+ """
+ @return: Generic event string representation.
+ @rtype: str
+ """
+ s = ''
+ for attr, value in sorted(self.__dict__.items(), key=lambda x: x[0]):
+ if attr.startswith('_'):
+ continue
+ if attr == 'mask':
+ value = hex(getattr(self, attr))
+ elif isinstance(value, basestring) and not value:
+ value = "''"
+ s += ' %s%s%s' % (output_format.field_name(attr),
+ output_format.punctuation('='),
+ output_format.field_value(value))
+
+ s = '%s%s%s %s' % (output_format.punctuation('<'),
+ output_format.class_name(self.__class__.__name__),
+ s,
+ output_format.punctuation('>'))
+ return s
+
+ def __str__(self):
+ return repr(self)
+
+
+class _RawEvent(_Event):
+ """
+ Raw event; it contains only the information provided by the system.
+ It doesn't infer anything.
+ """
+ def __init__(self, wd, mask, cookie, name):
+ """
+ @param wd: Watch Descriptor.
+ @type wd: int
+ @param mask: Bitmask of events.
+ @type mask: int
+ @param cookie: Cookie.
+ @type cookie: int
+ @param name: Basename of the file or directory against which the
+ event was raised in case where the watched directory
+ is the parent directory. None if the event was raised
+ on the watched item itself.
+ @type name: string or None
+ """
+ # Use this variable to cache the result of str(self), this object
+ # is immutable.
+ self._str = None
+ # name: remove trailing '\0'
+ d = {'wd': wd,
+ 'mask': mask,
+ 'cookie': cookie,
+ 'name': name.rstrip('\0')}
+ _Event.__init__(self, d)
+ log.debug(str(self))
+
+ def __str__(self):
+ if self._str is None:
+ self._str = _Event.__str__(self)
+ return self._str
+
+
+class Event(_Event):
+ """
+ This class contains all the useful information about the observed
+ event. However, the presence of each field is not guaranteed and
+ depends on the type of event. In effect, some fields are irrelevant
+ for some kinds of events (for example 'cookie' is meaningless for
+ IN_CREATE whereas it is mandatory for IN_MOVED_TO).
+
+ The possible fields are:
+ - wd (int): Watch Descriptor.
+ - mask (int): Mask.
+ - maskname (str): Readable event name.
+ - path (str): path of the file or directory being watched.
+ - name (str): Basename of the file or directory against which the
+ event was raised in case where the watched directory
+ is the parent directory. None if the event was raised
+ on the watched item itself. This field is always provided
+ even if the string is ''.
+ - pathname (str): Concatenation of 'path' and 'name'.
+ - src_pathname (str): Only present for IN_MOVED_TO events and only in
+ the case where IN_MOVED_FROM events are watched too. Holds the
+ source pathname from where pathname was moved from.
+ - cookie (int): Cookie.
+ - dir (bool): True if the event was raised against a directory.
+
+ """
+ def __init__(self, raw):
+ """
+ Concretely, this is the raw event plus inferred information.
+ """
+ _Event.__init__(self, raw)
+ self.maskname = EventsCodes.maskname(self.mask)
+ if COMPATIBILITY_MODE:
+ self.event_name = self.maskname
+ try:
+ if self.name:
+ self.pathname = os.path.abspath(os.path.join(self.path,
+ self.name))
+ else:
+ self.pathname = os.path.abspath(self.path)
+ except AttributeError, err:
+ # Usually it is not an error: some events are perfectly valid
+ # despite the lack of these attributes.
+ log.debug(err)
+
+
+class ProcessEventError(PyinotifyError):
+ """
+ ProcessEventError Exception. Raised on ProcessEvent error.
+ """
+ def __init__(self, err):
+ """
+ @param err: Exception error description.
+ @type err: string
+ """
+ PyinotifyError.__init__(self, err)
+
+
+class _ProcessEvent:
+ """
+ Abstract processing event class.
+ """
+ def __call__(self, event):
+ """
+ To behave like a functor the object must be callable.
+ This method is a dispatch method. Its lookup order is:
+ 1. process_MASKNAME method
+ 2. process_FAMILY_NAME method
+ 3. otherwise calls process_default
+
+ @param event: Event to be processed.
+ @type event: Event object
+ @return: By convention when used from the ProcessEvent class:
+ - Returning False or None (default value) means keep on
+ executing next chained functors (see chain.py example).
+ - Returning True instead means do not execute next
+ processing functions.
+ @rtype: bool
+ @raise ProcessEventError: Event object undispatchable,
+ unknown event.
+ """
+ stripped_mask = event.mask - (event.mask & IN_ISDIR)
+ maskname = EventsCodes.ALL_VALUES.get(stripped_mask)
+ if maskname is None:
+ raise ProcessEventError("Unknown mask 0x%08x" % stripped_mask)
+
+ # 1- look for process_MASKNAME
+ meth = getattr(self, 'process_' + maskname, None)
+ if meth is not None:
+ return meth(event)
+ # 2- look for process_FAMILY_NAME
+ meth = getattr(self, 'process_IN_' + maskname.split('_')[1], None)
+ if meth is not None:
+ return meth(event)
+ # 3- default call method process_default
+ return self.process_default(event)
+
+ def __repr__(self):
+ return '<%s>' % self.__class__.__name__
+
+
+class _SysProcessEvent(_ProcessEvent):
+ """
+ There are three kinds of processing according to each event:
+
+ 1. special handling (deletion from internal container, bug, ...).
+ 2. default treatment: which is applied to the majority of events.
+ 3. IN_ISDIR is never sent alone; it is piggybacked with a standard
+ event and is not processed like the other events, instead its
+ value is captured and appropriately aggregated into the dst event.
+ """
+ def __init__(self, wm, notifier):
+ """
+
+ @param wm: Watch Manager.
+ @type wm: WatchManager instance
+ @param notifier: Notifier.
+ @type notifier: Notifier instance
+ """
+ self._watch_manager = wm # watch manager
+ self._notifier = notifier # notifier
+ self._mv_cookie = {} # {cookie(int): (src_path(str), date), ...}
+ self._mv = {} # {src_path(str): (dst_path(str), date), ...}
+
+ def cleanup(self):
+ """
+ Cleanup (delete) old (> 1 minute) records contained in self._mv_cookie
+ and self._mv.
+ """
+ date_cur_ = datetime.now()
+ for seq in [self._mv_cookie, self._mv]:
+ for k in seq.keys():
+ if (date_cur_ - seq[k][1]) > timedelta(minutes=1):
+ log.debug('Cleanup: deleting entry %s', seq[k][0])
+ del seq[k]
+
+ def process_IN_CREATE(self, raw_event):
+ """
+ If the event affects a directory and the auto_add flag of the
+ targeted watch is set to True, a new watch is added on this
+ new directory, with the same attribute values as those of
+ this watch.
+ """
+ if raw_event.mask & IN_ISDIR:
+ watch_ = self._watch_manager.get_watch(raw_event.wd)
+ created_dir = os.path.join(watch_.path, raw_event.name)
+ if watch_.auto_add and not watch_.exclude_filter(created_dir):
+ addw = self._watch_manager.add_watch
+ # The newly monitored directory inherits attributes from its
+ # parent directory.
+ addw_ret = addw(created_dir, watch_.mask,
+ proc_fun=watch_.proc_fun,
+ rec=False, auto_add=watch_.auto_add,
+ exclude_filter=watch_.exclude_filter)
+
+ # Trick to handle mkdir -p /d1/d2/t3 where d1 is watched and
+ # d2 and t3 (directory or file) are created.
+ # Since the directory d2 is new, then everything inside it must
+ # also be new.
+ created_dir_wd = addw_ret.get(created_dir)
+ if ((created_dir_wd is not None) and (created_dir_wd > 0) and
+ os.path.isdir(created_dir)):
+ try:
+ for name in os.listdir(created_dir):
+ inner = os.path.join(created_dir, name)
+ if self._watch_manager.get_wd(inner) is not None:
+ continue
+ # Generate (simulate) creation events for sub-
+ # directories and files.
+ if os.path.isfile(inner):
+ # symlinks are handled as files.
+ flags = IN_CREATE
+ elif os.path.isdir(inner):
+ flags = IN_CREATE | IN_ISDIR
+ else:
+ # This path should not be taken.
+ continue
+ rawevent = _RawEvent(created_dir_wd, flags, 0, name)
+ self._notifier.append_event(rawevent)
+ except OSError, err:
+ msg = "process_IN_CREATE, invalid directory %s: %s"
+ log.debug(msg % (created_dir, str(err)))
+ return self.process_default(raw_event)
+
+ def process_IN_MOVED_FROM(self, raw_event):
+ """
+ Map the cookie with the source path (+ date for cleaning).
+ """
+ watch_ = self._watch_manager.get_watch(raw_event.wd)
+ path_ = watch_.path
+ src_path = os.path.normpath(os.path.join(path_, raw_event.name))
+ self._mv_cookie[raw_event.cookie] = (src_path, datetime.now())
+ return self.process_default(raw_event, {'cookie': raw_event.cookie})
+
+ def process_IN_MOVED_TO(self, raw_event):
+ """
+ Map the source path with the destination path (+ date for
+ cleaning).
+ """
+ watch_ = self._watch_manager.get_watch(raw_event.wd)
+ path_ = watch_.path
+ dst_path = os.path.normpath(os.path.join(path_, raw_event.name))
+ mv_ = self._mv_cookie.get(raw_event.cookie)
+ to_append = {'cookie': raw_event.cookie}
+ if mv_ is not None:
+ self._mv[mv_[0]] = (dst_path, datetime.now())
+ # Let's assume that IN_MOVED_FROM event is always queued before
+ # that its associated (they share a common cookie) IN_MOVED_TO
+ # event is queued itself. It is then possible in that scenario
+ # to provide as additional information to the IN_MOVED_TO event
+ # the original pathname of the moved file/directory.
+ to_append['src_pathname'] = mv_[0]
+ elif (raw_event.mask & IN_ISDIR and watch_.auto_add and
+ not watch_.exclude_filter(dst_path)):
+ # We got a directory that's "moved in" from an unknown source and
+ # auto_add is enabled. Manually add watches to the inner subtrees.
+ # The newly monitored directory inherits attributes from its
+ # parent directory.
+ self._watch_manager.add_watch(dst_path, watch_.mask,
+ proc_fun=watch_.proc_fun,
+ rec=True, auto_add=True,
+ exclude_filter=watch_.exclude_filter)
+ return self.process_default(raw_event, to_append)
+
+ def process_IN_MOVE_SELF(self, raw_event):
+ """
+ STATUS: the following bug has been fixed in recent kernels (FIXME:
+ which version ?). Now it raises IN_DELETE_SELF instead.
+
+ Old kernels were buggy: this event was raised when the watched item
+ was moved, so we had to update its path, but under some circumstances
+ that was impossible: if its parent directory and its destination
+ directory weren't watched. The kernel (see include/linux/fsnotify.h)
+ doesn't give us enough information, such as the destination path of
+ moved items.
+ """
+ watch_ = self._watch_manager.get_watch(raw_event.wd)
+ src_path = watch_.path
+ mv_ = self._mv.get(src_path)
+ if mv_:
+ dest_path = mv_[0]
+ watch_.path = dest_path
+ # add the separator to the source path to avoid overlapping
+ # path issue when testing with startswith()
+ src_path += os.path.sep
+ src_path_len = len(src_path)
+ # The next loop renames all watches with src_path as base path.
+ # It seems that IN_MOVE_SELF does not provide IN_ISDIR information
+ # therefore the next loop is iterated even if raw_event is a file.
+ for w in self._watch_manager.watches.values():
+ if w.path.startswith(src_path):
+ # Note that dest_path is a normalized path.
+ w.path = os.path.join(dest_path, w.path[src_path_len:])
+ else:
+ log.error("The pathname '%s' of this watch %s has probably changed "
+ "and couldn't be updated, so it cannot be trusted "
+ "anymore. To fix this error move directories/files only "
+ "between watched parents directories, in this case e.g. "
+ "put a watch on '%s'.",
+ watch_.path, watch_,
+ os.path.normpath(os.path.join(watch_.path,
+ os.path.pardir)))
+ if not watch_.path.endswith('-unknown-path'):
+ watch_.path += '-unknown-path'
+ return self.process_default(raw_event)
+
+ def process_IN_Q_OVERFLOW(self, raw_event):
+ """
+ Only signal an overflow, most of the common flags are irrelevant
+ for this event (path, wd, name).
+ """
+ return Event({'mask': raw_event.mask})
+
+ def process_IN_IGNORED(self, raw_event):
+ """
+ The watch descriptor raised by this event is now ignored (forever);
+ it can be safely deleted from the watch manager dictionary.
+ After this event we can be sure that neither the event queue nor
+ the system will raise an event associated to this wd again.
+ """
+ event_ = self.process_default(raw_event)
+ self._watch_manager.del_watch(raw_event.wd)
+ return event_
+
+ def process_default(self, raw_event, to_append=None):
+ """
+ Common handling for the following events:
+
+ IN_ACCESS, IN_MODIFY, IN_ATTRIB, IN_CLOSE_WRITE, IN_CLOSE_NOWRITE,
+ IN_OPEN, IN_DELETE, IN_DELETE_SELF, IN_UNMOUNT.
+ """
+ watch_ = self._watch_manager.get_watch(raw_event.wd)
+ if raw_event.mask & (IN_DELETE_SELF | IN_MOVE_SELF):
+ # Unfortunately this information is not provided by the kernel
+ dir_ = watch_.dir
+ else:
+ dir_ = bool(raw_event.mask & IN_ISDIR)
+ dict_ = {'wd': raw_event.wd,
+ 'mask': raw_event.mask,
+ 'path': watch_.path,
+ 'name': raw_event.name,
+ 'dir': dir_}
+ if COMPATIBILITY_MODE:
+ dict_['is_dir'] = dir_
+ if to_append is not None:
+ dict_.update(to_append)
+ return Event(dict_)
+
+
+class ProcessEvent(_ProcessEvent):
+ """
+ Process event objects; can be specialized via subclassing, thus its
+ behavior can be overridden:
+
+ Note: you should not override __init__ in your subclass; instead define
+ a my_init() method, which will be called automatically from the
+ constructor of this class with its optional parameters.
+
+ 1. Provide specialized individual methods, e.g. process_IN_DELETE for
+ processing a precise type of event (e.g. IN_DELETE in this case).
+ 2. Or/and provide methods for processing events by 'family', e.g.
+ process_IN_CLOSE method will process both IN_CLOSE_WRITE and
+ IN_CLOSE_NOWRITE events (if process_IN_CLOSE_WRITE and
+ process_IN_CLOSE_NOWRITE aren't defined though).
+ 3. Or/and override process_default for catching and processing all
+ the remaining types of events.
+ """
+ pevent = None
+
+ def __init__(self, pevent=None, **kargs):
+ """
+ Enable chaining of ProcessEvent instances.
+
+ @param pevent: Optional callable object, will be called on event
+ processing (before self).
+ @type pevent: callable
+ @param kargs: This constructor is implemented as a template method
+ delegating its optional keyword arguments to the
+ method my_init().
+ @type kargs: dict
+ """
+ self.pevent = pevent
+ self.my_init(**kargs)
+
+ def my_init(self, **kargs):
+ """
+ This method is called from ProcessEvent.__init__(). This method is
+ empty here and must be redefined to be useful. In effect, if you
+ need to specifically initialize your subclass' instance then you
+ just have to override this method in your subclass. Then all the
+ keyword arguments passed to ProcessEvent.__init__() will be
+ transmitted as parameters to this method. Beware: you MUST pass
+ keyword arguments, though.
+
+ @param kargs: optional delegated arguments from __init__().
+ @type kargs: dict
+ """
+ pass
+
+ def __call__(self, event):
+ stop_chaining = False
+ if self.pevent is not None:
+ # By default methods return None, so we set as a guideline
+ # that methods asking to stop the chaining must explicitly
+ # return non-None or non-False values; otherwise the default
+ # behavior will be to accept the chained call to the corresponding
+ # local method.
+ stop_chaining = self.pevent(event)
+ if not stop_chaining:
+ return _ProcessEvent.__call__(self, event)
+
+ def nested_pevent(self):
+ return self.pevent
+
+ def process_IN_Q_OVERFLOW(self, event):
+ """
+ By default this method only reports warning messages; you can override
+ it by subclassing ProcessEvent and implementing your own
+ process_IN_Q_OVERFLOW method. The actions you can take on receiving this
+ event are either to update the variable max_queued_events in order to
+ handle more simultaneous events, or to modify your code in order to
+ achieve better filtering and diminish the number of raised events.
+ Because this method is defined, IN_Q_OVERFLOW will never get
+ transmitted as arguments to process_default calls.
+
+ @param event: IN_Q_OVERFLOW event.
+ @type event: dict
+ """
+ log.warning('Event queue overflowed.')
+
+ def process_default(self, event):
+ """
+ Default processing event method. By default does nothing. Subclass
+ ProcessEvent and redefine this method in order to modify its behavior.
+
+ @param event: Event to be processed. Can be of any type of events but
+ IN_Q_OVERFLOW events (see method process_IN_Q_OVERFLOW).
+ @type event: Event instance
+ """
+ pass
+
+
+class PrintAllEvents(ProcessEvent):
+ """
+ Dummy class used to print events' string representations. For instance this
+ class is used from command line to print all received events to stdout.
+ """
+ def my_init(self, out=None):
+ """
+ @param out: Where events will be written.
+ @type out: Object providing a valid file object interface.
+ """
+ if out is None:
+ out = sys.stdout
+ self._out = out
+
+ def process_default(self, event):
+ """
+ Writes event string representation to file object provided to
+ my_init().
+
+ @param event: Event to be processed. Can be of any type of events but
+ IN_Q_OVERFLOW events (see method process_IN_Q_OVERFLOW).
+ @type event: Event instance
+ """
+ self._out.write(str(event))
+ self._out.write('\n')
+ self._out.flush()
+
+
+class ChainIfTrue(ProcessEvent):
+ """
+ Makes conditional chaining depending on the result of the nested
+ processing instance.
+ """
+ def my_init(self, func):
+ """
+ Method automatically called from base class constructor.
+ """
+ self._func = func
+
+ def process_default(self, event):
+ return not self._func(event)
+
+
+class Stats(ProcessEvent):
+ """
+ Compute and display trivial statistics about processed events.
+ """
+ def my_init(self):
+ """
+ Method automatically called from base class constructor.
+ """
+ self._start_time = time.time()
+ self._stats = {}
+ self._stats_lock = threading.Lock()
+
+ def process_default(self, event):
+ """
+ Processes |event|.
+ """
+ self._stats_lock.acquire()
+ try:
+ events = event.maskname.split('|')
+ for event_name in events:
+ count = self._stats.get(event_name, 0)
+ self._stats[event_name] = count + 1
+ finally:
+ self._stats_lock.release()
+
+ def _stats_copy(self):
+ self._stats_lock.acquire()
+ try:
+ return self._stats.copy()
+ finally:
+ self._stats_lock.release()
+
+ def __repr__(self):
+ stats = self._stats_copy()
+
+ elapsed = int(time.time() - self._start_time)
+ elapsed_str = ''
+ if elapsed < 60:
+ elapsed_str = str(elapsed) + 'sec'
+ elif 60 <= elapsed < 3600:
+ elapsed_str = '%dmn%dsec' % (elapsed / 60, elapsed % 60)
+ elif 3600 <= elapsed < 86400:
+ elapsed_str = '%dh%dmn' % (elapsed / 3600, (elapsed % 3600) / 60)
+ elif elapsed >= 86400:
+ elapsed_str = '%dd%dh' % (elapsed / 86400, (elapsed % 86400) / 3600)
+ stats['ElapsedTime'] = elapsed_str
+
+ l = []
+ for ev, value in sorted(stats.items(), key=lambda x: x[0]):
+ l.append(' %s=%s' % (output_format.field_name(ev),
+ output_format.field_value(value)))
+ s = '<%s%s >' % (output_format.class_name(self.__class__.__name__),
+ ''.join(l))
+ return s
+
+ def dump(self, filename):
+ """
+ Dumps statistics.
+
+ @param filename: filename where stats will be dumped, filename is
+ created and must not exist prior to this call.
+ @type filename: string
+ """
+ flags = os.O_WRONLY|os.O_CREAT|os.O_NOFOLLOW|os.O_EXCL
+ fd = os.open(filename, flags, 0600)
+ os.write(fd, str(self))
+ os.close(fd)
+
+ def __str__(self, scale=45):
+ stats = self._stats_copy()
+ if not stats:
+ return ''
+
+ m = max(stats.values())
+ unity = float(scale) / m
+ fmt = '%%-26s%%-%ds%%s' % (len(output_format.field_value('@' * scale))
+ + 1)
+ def func(x):
+ return fmt % (output_format.field_name(x[0]),
+ output_format.field_value('@' * int(x[1] * unity)),
+ output_format.simple('%d' % x[1], 'yellow'))
+ s = '\n'.join(map(func, sorted(stats.items(), key=lambda x: x[0])))
+ return s
+
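+# Illustrative sketch (not part of the original module): using Stats as the
+# default processing functor, mirroring what command_line() below does for the
+# --stats option. The watched path and read frequency are arbitrary examples.
+#
+#     wm = WatchManager()
+#     stats = Stats()
+#     notifier = Notifier(wm, default_proc_fun=stats, read_freq=5)
+#     wm.add_watch('/tmp', ALL_EVENTS)
+#     notifier.loop(callback=lambda n: sys.stdout.write(repr(stats) + '\n'))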
+
+class NotifierError(PyinotifyError):
+ """
+ Notifier Exception. Raised on Notifier error.
+
+ """
+ def __init__(self, err):
+ """
+ @param err: Exception string's description.
+ @type err: string
+ """
+ PyinotifyError.__init__(self, err)
+
+
+class Notifier:
+ """
+ Read notifications, process events.
+
+ """
+ def __init__(self, watch_manager, default_proc_fun=None, read_freq=0,
+ threshold=0, timeout=None):
+ """
+ Initialization. read_freq, threshold and timeout parameters are used
+ when looping.
+
+ @param watch_manager: Watch Manager.
+ @type watch_manager: WatchManager instance
+ @param default_proc_fun: Default processing method. If None, a new
+ instance of PrintAllEvents will be assigned.
+ @type default_proc_fun: instance of ProcessEvent
+ @param read_freq: if read_freq == 0, events are read asap,
+ if read_freq is > 0, this thread sleeps
+ max(0, read_freq - timeout) seconds. But if
+ timeout is None it may be different because
+ poll is blocking waiting for something to read.
+ @type read_freq: int
+ @param threshold: File descriptor will be read only if the accumulated
+ size to read becomes >= threshold. If != 0, you likely
+ want to use it in combination with an appropriate
+ value for read_freq because without that you would
+ keep looping without really reading anything and that
+ until the amount of events to read is >= threshold.
+ At least with read_freq set you might sleep.
+ @type threshold: int
+ @param timeout:
+ https://docs.python.org/3/library/select.html#polling-objects
+ @type timeout: int
+ """
+ # Watch Manager instance
+ self._watch_manager = watch_manager
+ # File descriptor
+ self._fd = self._watch_manager.get_fd()
+ # Poll object and registration
+ self._pollobj = select.poll()
+ self._pollobj.register(self._fd, select.POLLIN)
+        # This pipe is correctly initialized and used by ThreadedNotifier
+ self._pipe = (-1, -1)
+ # Event queue
+ self._eventq = deque()
+ # System processing functor, common to all events
+ self._sys_proc_fun = _SysProcessEvent(self._watch_manager, self)
+ # Default processing method
+ self._default_proc_fun = default_proc_fun
+ if default_proc_fun is None:
+ self._default_proc_fun = PrintAllEvents()
+ # Loop parameters
+ self._read_freq = read_freq
+ self._threshold = threshold
+ self._timeout = timeout
+ # Coalesce events option
+ self._coalesce = False
+ # set of str(raw_event), only used when coalesce option is True
+ self._eventset = set()
+
+ def append_event(self, event):
+ """
+ Append a raw event to the event queue.
+
+ @param event: An event.
+ @type event: _RawEvent instance.
+ """
+ self._eventq.append(event)
+
+ def proc_fun(self):
+ return self._default_proc_fun
+
+ def coalesce_events(self, coalesce=True):
+ """
+        Coalescing events. Events are usually processed in batches whose size
+        depends on various factors. Thus, before processing them, events received
+        from inotify are aggregated in a FIFO queue. If this coalescing
+        option is enabled, events are filtered on uniqueness: only
+        unique events are enqueued, duplicates are discarded. An event is unique
+        when the combination of its fields (wd, mask, cookie, name) is unique
+        among events of the same batch. After a batch of events is processed any
+        event is accepted again. By default this option is disabled; you have
+        to explicitly call this function to turn it on.
+
+ @param coalesce: Optional new coalescing value. True by default.
+ @type coalesce: Bool
+ """
+ self._coalesce = coalesce
+ if not coalesce:
+ self._eventset.clear()
+
+ def check_events(self, timeout=None):
+ """
+ Check for new events available to read, blocks up to timeout
+ milliseconds.
+
+ @param timeout: If specified it overrides the corresponding instance
+ attribute _timeout.
+ @type timeout: int
+
+ @return: New events to read.
+ @rtype: bool
+ """
+ while True:
+ try:
+ # blocks up to 'timeout' milliseconds
+ if timeout is None:
+ timeout = self._timeout
+ ret = self._pollobj.poll(timeout)
+ except select.error, err:
+ if err[0] == errno.EINTR:
+ continue # interrupted, retry
+ else:
+ raise
+ else:
+ break
+
+ if not ret or (self._pipe[0] == ret[0][0]):
+ return False
+ # only one fd is polled
+ return ret[0][1] & select.POLLIN
+
+ def read_events(self):
+ """
+ Read events from device, build _RawEvents, and enqueue them.
+ """
+ buf_ = array.array('i', [0])
+ # get event queue size
+ if fcntl.ioctl(self._fd, termios.FIONREAD, buf_, 1) == -1:
+ return
+ queue_size = buf_[0]
+ if queue_size < self._threshold:
+ log.debug('(fd: %d) %d bytes available to read but threshold is '
+ 'fixed to %d bytes', self._fd, queue_size,
+ self._threshold)
+ return
+
+ try:
+ # Read content from file
+ r = os.read(self._fd, queue_size)
+ except Exception, msg:
+ raise NotifierError(msg)
+ log.debug('Event queue size: %d', queue_size)
+ rsum = 0 # counter
+ while rsum < queue_size:
+ s_size = 16
+ # Retrieve wd, mask, cookie and fname_len
+ wd, mask, cookie, fname_len = struct.unpack('iIII',
+ r[rsum:rsum+s_size])
+ # Retrieve name
+ fname, = struct.unpack('%ds' % fname_len,
+ r[rsum + s_size:rsum + s_size + fname_len])
+ rawevent = _RawEvent(wd, mask, cookie, fname)
+ if self._coalesce:
+ # Only enqueue new (unique) events.
+ raweventstr = str(rawevent)
+ if raweventstr not in self._eventset:
+ self._eventset.add(raweventstr)
+ self._eventq.append(rawevent)
+ else:
+ self._eventq.append(rawevent)
+ rsum += s_size + fname_len
+
+ def process_events(self):
+ """
+        Routine for processing events from the queue by calling their
+        associated processing method (an instance of ProcessEvent).
+        It also does internal processing to keep the system updated.
+ """
+ while self._eventq:
+ raw_event = self._eventq.popleft() # pop next event
+ if self._watch_manager.ignore_events:
+ log.debug("Event ignored: %s" % repr(raw_event))
+ continue
+ watch_ = self._watch_manager.get_watch(raw_event.wd)
+ if (watch_ is None) and not (raw_event.mask & IN_Q_OVERFLOW):
+ if not (raw_event.mask & IN_IGNORED):
+                    # Not really sure how we ended up here, nor how we should
+                    # handle these types of events and if it is appropriate to
+                    # completely skip them (like we are doing here).
+ log.warning("Unable to retrieve Watch object associated to %s",
+ repr(raw_event))
+ continue
+ revent = self._sys_proc_fun(raw_event) # system processings
+ if watch_ and watch_.proc_fun:
+ watch_.proc_fun(revent) # user processings
+ else:
+ self._default_proc_fun(revent)
+        self._sys_proc_fun.cleanup()  # remove old MOVED_* event records
+ if self._coalesce:
+ self._eventset.clear()
+
+ def __daemonize(self, pid_file=None, stdin=os.devnull, stdout=os.devnull,
+ stderr=os.devnull):
+ """
+ @param pid_file: file where the pid will be written. If pid_file=None
+ the pid is written to
+ /var/run/<sys.argv[0]|pyinotify>.pid, if pid_file=False
+ no pid_file is written.
+ @param stdin:
+ @param stdout:
+ @param stderr: files associated to common streams.
+ """
+ if pid_file is None:
+ dirname = '/var/run/'
+ basename = os.path.basename(sys.argv[0]) or 'pyinotify'
+ pid_file = os.path.join(dirname, basename + '.pid')
+
+ if pid_file != False and os.path.lexists(pid_file):
+ err = 'Cannot daemonize: pid file %s already exists.' % pid_file
+ raise NotifierError(err)
+
+ def fork_daemon():
+ # Adapted from Chad J. Schroeder's recipe
+ # @see http://code.activestate.com/recipes/278731/
+ pid = os.fork()
+ if (pid == 0):
+                # first child
+ os.setsid()
+ pid = os.fork()
+ if (pid == 0):
+ # child
+ os.chdir('/')
+ os.umask(022)
+ else:
+                    # first child exits
+ os._exit(0)
+ else:
+ # parent 1
+ os._exit(0)
+
+ fd_inp = os.open(stdin, os.O_RDONLY)
+ os.dup2(fd_inp, 0)
+ fd_out = os.open(stdout, os.O_WRONLY|os.O_CREAT, 0600)
+ os.dup2(fd_out, 1)
+ fd_err = os.open(stderr, os.O_WRONLY|os.O_CREAT, 0600)
+ os.dup2(fd_err, 2)
+
+ # Detach task
+ fork_daemon()
+
+ # Write pid
+ if pid_file != False:
+ flags = os.O_WRONLY|os.O_CREAT|os.O_NOFOLLOW|os.O_EXCL
+ fd_pid = os.open(pid_file, flags, 0600)
+ os.write(fd_pid, str(os.getpid()) + '\n')
+ os.close(fd_pid)
+ # Register unlink function
+ atexit.register(lambda : os.unlink(pid_file))
+
+ def _sleep(self, ref_time):
+ # Only consider sleeping if read_freq is > 0
+ if self._read_freq > 0:
+ cur_time = time.time()
+ sleep_amount = self._read_freq - (cur_time - ref_time)
+ if sleep_amount > 0:
+ log.debug('Now sleeping %d seconds', sleep_amount)
+ time.sleep(sleep_amount)
+
+ def loop(self, callback=None, daemonize=False, **args):
+ """
+        Events are read at most once every min(read_freq, timeout)
+        seconds, and only if the size to read is >= threshold.
+ After this method returns it must not be called again for the same
+ instance.
+
+ @param callback: Functor called after each event processing iteration.
+ Expects to receive the notifier object (self) as first
+ parameter. If this function returns True the loop is
+ immediately terminated otherwise the loop method keeps
+ looping.
+ @type callback: callable object or function
+ @param daemonize: This thread is daemonized if set to True.
+ @type daemonize: boolean
+ @param args: Optional and relevant only if daemonize is True. Remaining
+                     keyword arguments are passed directly to daemonize; see the
+ __daemonize() method. If pid_file=None or is set to a
+ pathname the caller must ensure the file does not exist
+ before this method is called otherwise an exception
+ pyinotify.NotifierError will be raised. If pid_file=False
+ it is still daemonized but the pid is not written in any
+ file.
+ @type args: various
+ """
+ if daemonize:
+ self.__daemonize(**args)
+
+ # Read and process events forever
+ while 1:
+ try:
+ self.process_events()
+ if (callback is not None) and (callback(self) is True):
+ break
+ ref_time = time.time()
+ # check_events is blocking
+ if self.check_events():
+ self._sleep(ref_time)
+ self.read_events()
+ except KeyboardInterrupt:
+ # Stop monitoring if sigint is caught (Control-C).
+ log.debug('Pyinotify stops monitoring.')
+ break
+ # Close internals
+ self.stop()
+
+ def stop(self):
+ """
+ Close inotify's instance (close its file descriptor).
+ It destroys all existing watches, pending events,...
+ This method is automatically called at the end of loop().
+ """
+ self._pollobj.unregister(self._fd)
+ os.close(self._fd)
+ self._sys_proc_fun = None
+
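+# Illustrative sketch (not part of the original module): the minimal blocking
+# setup built from the classes above. The watched path and mask are arbitrary
+# examples.
+#
+#     wm = WatchManager()
+#     wm.add_watch('/tmp', ALL_EVENTS, rec=False)
+#     notifier = Notifier(wm, default_proc_fun=PrintAllEvents())
+#     notifier.loop()   # blocks; stop() is called automatically on exit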
+
+class ThreadedNotifier(threading.Thread, Notifier):
+ """
+    This notifier inherits from threading.Thread for instantiating a separate
+    thread, and also inherits from Notifier, because it is a threaded notifier.
+
+    Note that every functionality provided by this class is also provided
+    by the Notifier class. Moreover, Notifier should be considered first because
+    it is not threaded and can easily be daemonized.
+ """
+ def __init__(self, watch_manager, default_proc_fun=None, read_freq=0,
+ threshold=0, timeout=None):
+ """
+ Initialization, initialize base classes. read_freq, threshold and
+ timeout parameters are used when looping.
+
+ @param watch_manager: Watch Manager.
+ @type watch_manager: WatchManager instance
+ @param default_proc_fun: Default processing method. See base class.
+ @type default_proc_fun: instance of ProcessEvent
+ @param read_freq: if read_freq == 0, events are read asap,
+ if read_freq is > 0, this thread sleeps
+ max(0, read_freq - timeout) seconds.
+ @type read_freq: int
+ @param threshold: File descriptor will be read only if the accumulated
+ size to read becomes >= threshold. If != 0, you likely
+ want to use it in combination with an appropriate
+ value set for read_freq because without that you would
+ keep looping without really reading anything and that
+ until the amount of events to read is >= threshold. At
+ least with read_freq you might sleep.
+ @type threshold: int
+ @param timeout:
+ https://docs.python.org/3/library/select.html#polling-objects
+ @type timeout: int
+ """
+ # Init threading base class
+ threading.Thread.__init__(self)
+ # Stop condition
+ self._stop_event = threading.Event()
+ # Init Notifier base class
+ Notifier.__init__(self, watch_manager, default_proc_fun, read_freq,
+ threshold, timeout)
+ # Create a new pipe used for thread termination
+ self._pipe = os.pipe()
+ self._pollobj.register(self._pipe[0], select.POLLIN)
+
+ def stop(self):
+ """
+ Stop notifier's loop. Stop notification. Join the thread.
+ """
+ self._stop_event.set()
+ os.write(self._pipe[1], 'stop')
+ threading.Thread.join(self)
+ Notifier.stop(self)
+ self._pollobj.unregister(self._pipe[0])
+ os.close(self._pipe[0])
+ os.close(self._pipe[1])
+
+ def loop(self):
+ """
+        Thread's main loop. Not meant to be called by the user directly.
+        Call the inherited start() method instead.
+
+        Events are read at most once every min(read_freq, timeout)
+        seconds, and only if the size of events to read is >= threshold.
+ """
+        # When the loop must be terminated .stop() is called: 'stop'
+        # is written to the pipe fd so poll() returns and .check_events()
+        # returns False, which makes the while loop's stop condition
+        # ._stop_event.isSet() be evaluated and puts an end to the thread's
+        # execution.
+ while not self._stop_event.isSet():
+ self.process_events()
+ ref_time = time.time()
+ if self.check_events():
+ self._sleep(ref_time)
+ self.read_events()
+
+ def run(self):
+ """
+ Start thread's loop: read and process events until the method
+ stop() is called.
+ Never call this method directly, instead call the start() method
+ inherited from threading.Thread, which then will call run() in
+ its turn.
+ """
+ self.loop()
+
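+# Illustrative sketch (not part of the original module): running the notifier in
+# its own thread so the caller is not blocked. The watched path is an arbitrary
+# example.
+#
+#     wm = WatchManager()
+#     tn = ThreadedNotifier(wm, PrintAllEvents())
+#     tn.start()                      # inherited from threading.Thread
+#     wm.add_watch('/tmp', ALL_EVENTS)
+#     # ... do other work here ...
+#     tn.stop()                       # signals the pipe and joins the thread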
+
+class AsyncNotifier(asyncore.file_dispatcher, Notifier):
+ """
+ This notifier inherits from asyncore.file_dispatcher in order to be able to
+ use pyinotify along with the asyncore framework.
+
+ """
+ def __init__(self, watch_manager, default_proc_fun=None, read_freq=0,
+ threshold=0, timeout=None, channel_map=None):
+ """
+ Initializes the async notifier. The only additional parameter is
+ 'channel_map' which is the optional asyncore private map. See
+        Notifier class for the meaning of the other parameters.
+
+ """
+ Notifier.__init__(self, watch_manager, default_proc_fun, read_freq,
+ threshold, timeout)
+ asyncore.file_dispatcher.__init__(self, self._fd, channel_map)
+
+ def handle_read(self):
+ """
+ When asyncore tells us we can read from the fd, we proceed processing
+ events. This method can be overridden for handling a notification
+ differently.
+
+ """
+ self.read_events()
+ self.process_events()
+
+
+class TornadoAsyncNotifier(Notifier):
+ """
+ Tornado ioloop adapter.
+
+ """
+ def __init__(self, watch_manager, ioloop, callback=None,
+ default_proc_fun=None, read_freq=0, threshold=0, timeout=None,
+ channel_map=None):
+ """
+        Note that if you later call ioloop.close(), be sure to leave the
+        default parameter all_fds=False.
+
+ See example tornado_notifier.py for an example using this notifier.
+
+ @param ioloop: Tornado's IO loop.
+ @type ioloop: tornado.ioloop.IOLoop instance.
+ @param callback: Functor called at the end of each call to handle_read
+ (IOLoop's read handler). Expects to receive the
+ notifier object (self) as single parameter.
+ @type callback: callable object or function
+ """
+ self.io_loop = ioloop
+ self.handle_read_callback = callback
+ Notifier.__init__(self, watch_manager, default_proc_fun, read_freq,
+ threshold, timeout)
+ ioloop.add_handler(self._fd, self.handle_read, ioloop.READ)
+
+ def stop(self):
+ self.io_loop.remove_handler(self._fd)
+ Notifier.stop(self)
+
+ def handle_read(self, *args, **kwargs):
+ """
+ See comment in AsyncNotifier.
+
+ """
+ self.read_events()
+ self.process_events()
+ if self.handle_read_callback is not None:
+ self.handle_read_callback(self)
+
+
+class AsyncioNotifier(Notifier):
+ """
+
+ asyncio/trollius event loop adapter.
+
+ """
+ def __init__(self, watch_manager, loop, callback=None,
+ default_proc_fun=None, read_freq=0, threshold=0, timeout=None):
+ """
+
+ See examples/asyncio_notifier.py for an example usage.
+
+ @param loop: asyncio or trollius event loop instance.
+ @type loop: asyncio.BaseEventLoop or trollius.BaseEventLoop instance.
+ @param callback: Functor called at the end of each call to handle_read.
+ Expects to receive the notifier object (self) as
+ single parameter.
+ @type callback: callable object or function
+
+ """
+ self.loop = loop
+ self.handle_read_callback = callback
+ Notifier.__init__(self, watch_manager, default_proc_fun, read_freq,
+ threshold, timeout)
+ loop.add_reader(self._fd, self.handle_read)
+
+ def stop(self):
+ self.loop.remove_reader(self._fd)
+ Notifier.stop(self)
+
+ def handle_read(self, *args, **kwargs):
+ self.read_events()
+ self.process_events()
+ if self.handle_read_callback is not None:
+ self.handle_read_callback(self)
+
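+# Illustrative sketch (not part of the original module): hooking the notifier
+# into an asyncio event loop as described above, assuming a Python 3 / asyncio
+# environment (trollius exposes the same calls on Python 2). The path and the
+# run duration are arbitrary examples.
+#
+#     import asyncio
+#     loop = asyncio.get_event_loop()
+#     wm = WatchManager()
+#     wm.add_watch('/tmp', ALL_EVENTS)
+#     notifier = AsyncioNotifier(wm, loop, default_proc_fun=PrintAllEvents())
+#     loop.call_later(10, loop.stop)  # stop the loop after 10 seconds
+#     loop.run_forever()
+#     notifier.stop()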
+
+class Watch:
+ """
+ Represent a watch, i.e. a file or directory being watched.
+
+ """
+ __slots__ = ('wd', 'path', 'mask', 'proc_fun', 'auto_add',
+ 'exclude_filter', 'dir')
+
+ def __init__(self, wd, path, mask, proc_fun, auto_add, exclude_filter):
+ """
+ Initializations.
+
+ @param wd: Watch descriptor.
+ @type wd: int
+ @param path: Path of the file or directory being watched.
+ @type path: str
+ @param mask: Mask.
+ @type mask: int
+ @param proc_fun: Processing callable object.
+ @type proc_fun:
+ @param auto_add: Automatically add watches on new directories.
+ @type auto_add: bool
+ @param exclude_filter: Boolean function, used to exclude new
+ directories from being automatically watched.
+ See WatchManager.__init__
+ @type exclude_filter: callable object
+ """
+ self.wd = wd
+ self.path = path
+ self.mask = mask
+ self.proc_fun = proc_fun
+ self.auto_add = auto_add
+ self.exclude_filter = exclude_filter
+ self.dir = os.path.isdir(self.path)
+
+ def __repr__(self):
+ """
+ @return: String representation.
+ @rtype: str
+ """
+ s = ' '.join(['%s%s%s' % (output_format.field_name(attr),
+ output_format.punctuation('='),
+ output_format.field_value(getattr(self,
+ attr))) \
+ for attr in self.__slots__ if not attr.startswith('_')])
+
+ s = '%s%s %s %s' % (output_format.punctuation('<'),
+ output_format.class_name(self.__class__.__name__),
+ s,
+ output_format.punctuation('>'))
+ return s
+
+
+class ExcludeFilter:
+ """
+ ExcludeFilter is an exclusion filter.
+
+ """
+ def __init__(self, arg_lst):
+ """
+ Examples:
+ ef1 = ExcludeFilter(["/etc/rc.*", "/etc/hostname"])
+ ef2 = ExcludeFilter("/my/path/exclude.lst")
+ Where exclude.lst contains:
+ /etc/rc.*
+ /etc/hostname
+
+ Note: it is not possible to exclude a file if its encapsulating
+ directory is itself watched. See this issue for more details
+ https://github.com/seb-m/pyinotify/issues/31
+
+ @param arg_lst: is either a list of patterns or a filename from which
+ patterns will be loaded.
+ @type arg_lst: list of str or str
+ """
+ if isinstance(arg_lst, str):
+ lst = self._load_patterns_from_file(arg_lst)
+ elif isinstance(arg_lst, list):
+ lst = arg_lst
+ else:
+ raise TypeError
+
+ self._lregex = []
+ for regex in lst:
+ self._lregex.append(re.compile(regex, re.UNICODE))
+
+ def _load_patterns_from_file(self, filename):
+ lst = []
+ file_obj = file(filename, 'r')
+ try:
+ for line in file_obj.readlines():
+                # Trim leading and trailing whitespace
+ pattern = line.strip()
+ if not pattern or pattern.startswith('#'):
+ continue
+ lst.append(pattern)
+ finally:
+ file_obj.close()
+ return lst
+
+ def _match(self, regex, path):
+ return regex.match(path) is not None
+
+ def __call__(self, path):
+ """
+ @param path: Path to match against provided regexps.
+ @type path: str
+ @return: Return True if path has been matched and should
+ be excluded, False otherwise.
+ @rtype: bool
+ """
+ for regex in self._lregex:
+ if self._match(regex, path):
+ return True
+ return False
+
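+# Illustrative sketch (not part of the original module): combining ExcludeFilter
+# with WatchManager.add_watch() defined below. The patterns and path are the
+# ones used as examples in the ExcludeFilter docstring.
+#
+#     excl = ExcludeFilter(['/etc/rc.*', '/etc/hostname'])
+#     wm = WatchManager()
+#     wm.add_watch('/etc', ALL_EVENTS, rec=True, exclude_filter=excl)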
+
+class WatchManagerError(Exception):
+ """
+ WatchManager Exception. Raised on error encountered on watches
+ operations.
+
+ """
+ def __init__(self, msg, wmd):
+ """
+ @param msg: Exception string's description.
+ @type msg: string
+ @param wmd: This dictionary contains the wd assigned to paths of the
+ same call for which watches were successfully added.
+ @type wmd: dict
+ """
+ self.wmd = wmd
+ Exception.__init__(self, msg)
+
+
+class WatchManager:
+ """
+ Provide operations for watching files and directories. Its internal
+ dictionary is used to reference watched items. When used inside
+    threaded code, one must instantiate as many WatchManager instances as
+ there are ThreadedNotifier instances.
+
+ """
+ def __init__(self, exclude_filter=lambda path: False):
+ """
+ Initialization: init inotify, init watch manager dictionary.
+ Raise OSError if initialization fails, raise InotifyBindingNotFoundError
+ if no inotify binding was found (through ctypes or from direct access to
+ syscalls).
+
+ @param exclude_filter: boolean function, returns True if current
+ path must be excluded from being watched.
+ Convenient for providing a common exclusion
+ filter for every call to add_watch.
+ @type exclude_filter: callable object
+ """
+ self._ignore_events = False
+ self._exclude_filter = exclude_filter
+ self._wmd = {} # watch dict key: watch descriptor, value: watch
+
+ self._inotify_wrapper = INotifyWrapper.create()
+ if self._inotify_wrapper is None:
+ raise InotifyBindingNotFoundError()
+
+ self._fd = self._inotify_wrapper.inotify_init() # file descriptor
+ if self._fd < 0:
+ err = 'Cannot initialize new instance of inotify, %s'
+ raise OSError(err % self._inotify_wrapper.str_errno())
+
+ def close(self):
+ """
+        Close inotify's file descriptor; this action will also automatically
+        remove (i.e. stop watching) all its associated watch descriptors.
+        After a call to this method the WatchManager instance becomes useless
+        and cannot be reused; a new instance must then be instantiated. It
+        makes sense to call this method in a few situations, for instance if
+        several independent WatchManagers must be instantiated or if all watches
+ must be removed and no other watches need to be added.
+ """
+ os.close(self._fd)
+
+ def get_fd(self):
+ """
+ Return assigned inotify's file descriptor.
+
+ @return: File descriptor.
+ @rtype: int
+ """
+ return self._fd
+
+ def get_watch(self, wd):
+ """
+ Get watch from provided watch descriptor wd.
+
+ @param wd: Watch descriptor.
+ @type wd: int
+ """
+ return self._wmd.get(wd)
+
+ def del_watch(self, wd):
+ """
+ Remove watch entry associated to watch descriptor wd.
+
+ @param wd: Watch descriptor.
+ @type wd: int
+ """
+ try:
+ del self._wmd[wd]
+ except KeyError, err:
+ log.error('Cannot delete unknown watch descriptor %s' % str(err))
+
+ @property
+ def watches(self):
+ """
+ Get a reference on the internal watch manager dictionary.
+
+ @return: Internal watch manager dictionary.
+ @rtype: dict
+ """
+ return self._wmd
+
+ def __format_path(self, path):
+ """
+ Format path to its internal (stored in watch manager) representation.
+ """
+ # Unicode strings are converted back to strings, because it seems
+ # that inotify_add_watch from ctypes does not work well when
+        # it receives a ctypes.create_unicode_buffer instance as argument.
+        # Therefore even wds are indexed with byte strings and not with
+        # unicode paths.
+ if isinstance(path, unicode):
+ path = path.encode(sys.getfilesystemencoding())
+ return os.path.normpath(path)
+
+ def __add_watch(self, path, mask, proc_fun, auto_add, exclude_filter):
+ """
+ Add a watch on path, build a Watch object and insert it in the
+ watch manager dictionary. Return the wd value.
+ """
+ path = self.__format_path(path)
+ if auto_add and not mask & IN_CREATE:
+ mask |= IN_CREATE
+ wd = self._inotify_wrapper.inotify_add_watch(self._fd, path, mask)
+ if wd < 0:
+ return wd
+ watch = Watch(wd=wd, path=path, mask=mask, proc_fun=proc_fun,
+ auto_add=auto_add, exclude_filter=exclude_filter)
+ self._wmd[wd] = watch
+ log.debug('New %s', watch)
+ return wd
+
+ def __glob(self, path, do_glob):
+ if do_glob:
+ return glob(path)
+ else:
+ return [path]
+
+ def add_watch(self, path, mask, proc_fun=None, rec=False,
+ auto_add=False, do_glob=False, quiet=True,
+ exclude_filter=None):
+ """
+        Add watch(es) on the provided |path|(s) with associated |mask| flag
+        value and optionally with a processing |proc_fun| function and
+        recursive flag |rec| set to True.
+        Ideally |path| components should not be unicode objects. Note that
+        although unicode paths are accepted they are converted to byte
+        strings before a watch is put on that path. The encoding used for
+        converting the unicode object is given by sys.getfilesystemencoding().
+        If |path| is already watched it is ignored, but if it is called with
+        option rec=True a watch is put on each of its not-yet-watched
+        subdirectories.
+
+ @param path: Path to watch, the path can either be a file or a
+ directory. Also accepts a sequence (list) of paths.
+ @type path: string or list of strings
+ @param mask: Bitmask of events.
+ @type mask: int
+ @param proc_fun: Processing object.
+ @type proc_fun: function or ProcessEvent instance or instance of
+ one of its subclasses or callable object.
+ @param rec: Recursively add watches from path on all its
+                    subdirectories, set to False by default (symlinks
+                    are never followed).
+ @type rec: bool
+ @param auto_add: Automatically add watches on newly created
+ directories in watched parent |path| directory.
+ If |auto_add| is True, IN_CREATE is ored with |mask|
+ when the watch is added.
+ @type auto_add: bool
+ @param do_glob: Do globbing on pathname (see standard globbing
+                        module for more information).
+ @type do_glob: bool
+ @param quiet: if False raises a WatchManagerError exception on
+ error. See example not_quiet.py.
+ @type quiet: bool
+ @param exclude_filter: predicate (boolean function), which returns
+ True if the current path must be excluded
+ from being watched. This argument has
+ precedence over exclude_filter passed to
+ the class' constructor.
+ @type exclude_filter: callable object
+ @return: dict of paths associated to watch descriptors. A wd value
+                 is positive if the watch was added successfully,
+                 otherwise the value is negative. If the path was invalid
+                 or was already watched it is not included in the returned
+ dictionary.
+ @rtype: dict of {str: int}
+ """
+ ret_ = {} # return {path: wd, ...}
+
+ if exclude_filter is None:
+ exclude_filter = self._exclude_filter
+
+ # normalize args as list elements
+ for npath in self.__format_param(path):
+ # unix pathname pattern expansion
+ for apath in self.__glob(npath, do_glob):
+ # recursively list subdirs according to rec param
+ for rpath in self.__walk_rec(apath, rec):
+ if not exclude_filter(rpath):
+ wd = ret_[rpath] = self.__add_watch(rpath, mask,
+ proc_fun,
+ auto_add,
+ exclude_filter)
+ if wd < 0:
+ err = ('add_watch: cannot watch %s WD=%d, %s' % \
+ (rpath, wd,
+ self._inotify_wrapper.str_errno()))
+ if quiet:
+ log.error(err)
+ else:
+ raise WatchManagerError(err, ret_)
+ else:
+                        # Let's say -2 means 'explicitly excluded
+ # from watching'.
+ ret_[rpath] = -2
+ return ret_
+
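+    # Illustrative sketch (not part of the original module): interpreting the
+    # dict returned by add_watch(), given a WatchManager instance wm. Paths and
+    # mask are arbitrary examples.
+    #
+    #     wdd = wm.add_watch(['/tmp', '/var/log'], IN_CREATE | IN_DELETE,
+    #                        rec=True, auto_add=True)
+    #     for path, wd in wdd.items():
+    #         if wd < 0:
+    #             print('could not watch %s' % path)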
+ def __get_sub_rec(self, lpath):
+ """
+ Get every wd from self._wmd if its path is under the path of
+ one (at least) of those in lpath. Doesn't follow symlinks.
+
+        @param lpath: list of watch descriptors
+        @type lpath: list of int
+        @return: list of watch descriptors
+        @rtype: list of int
+ """
+ for d in lpath:
+ root = self.get_path(d)
+ if root is not None:
+ # always keep root
+ yield d
+ else:
+ # if invalid
+ continue
+
+ # nothing else to expect
+ if not os.path.isdir(root):
+ continue
+
+ # normalization
+ root = os.path.normpath(root)
+ # recursion
+ lend = len(root)
+ for iwd in self._wmd.items():
+ cur = iwd[1].path
+ pref = os.path.commonprefix([root, cur])
+ if root == os.sep or (len(pref) == lend and \
+ len(cur) > lend and \
+ cur[lend] == os.sep):
+ yield iwd[1].wd
+
+ def update_watch(self, wd, mask=None, proc_fun=None, rec=False,
+ auto_add=False, quiet=True):
+ """
+ Update existing watch descriptors |wd|. The |mask| value, the
+ processing object |proc_fun|, the recursive param |rec| and the
+ |auto_add| and |quiet| flags can all be updated.
+
+ @param wd: Watch Descriptor to update. Also accepts a list of
+ watch descriptors.
+ @type wd: int or list of int
+ @param mask: Optional new bitmask of events.
+ @type mask: int
+ @param proc_fun: Optional new processing function.
+ @type proc_fun: function or ProcessEvent instance or instance of
+ one of its subclasses or callable object.
+ @param rec: Optionally adds watches recursively on all
+                         subdirectories contained in the |wd| directory.
+ @type rec: bool
+ @param auto_add: Automatically adds watches on newly created
+ directories in the watch's path corresponding to |wd|.
+ If |auto_add| is True, IN_CREATE is ored with |mask|
+ when the watch is updated.
+ @type auto_add: bool
+ @param quiet: If False raises a WatchManagerError exception on
+ error. See example not_quiet.py
+ @type quiet: bool
+        @return: dict of watch descriptors associated to boolean values.
+ True if the corresponding wd has been successfully
+ updated, False otherwise.
+ @rtype: dict of {int: bool}
+ """
+ lwd = self.__format_param(wd)
+ if rec:
+ lwd = self.__get_sub_rec(lwd)
+
+ ret_ = {} # return {wd: bool, ...}
+ for awd in lwd:
+ apath = self.get_path(awd)
+ if not apath or awd < 0:
+ err = 'update_watch: invalid WD=%d' % awd
+ if quiet:
+ log.error(err)
+ continue
+ raise WatchManagerError(err, ret_)
+
+ if mask:
+ wd_ = self._inotify_wrapper.inotify_add_watch(self._fd, apath,
+ mask)
+ if wd_ < 0:
+ ret_[awd] = False
+ err = ('update_watch: cannot update %s WD=%d, %s' % \
+ (apath, wd_, self._inotify_wrapper.str_errno()))
+ if quiet:
+ log.error(err)
+ continue
+ raise WatchManagerError(err, ret_)
+
+ assert(awd == wd_)
+
+ if proc_fun or auto_add:
+ watch_ = self._wmd[awd]
+
+ if proc_fun:
+ watch_.proc_fun = proc_fun
+
+ if auto_add:
+ watch_.auto_add = auto_add
+
+ ret_[awd] = True
+ log.debug('Updated watch - %s', self._wmd[awd])
+ return ret_
+
+ def __format_param(self, param):
+ """
+ @param param: Parameter.
+ @type param: string or int
+ @return: wrap param.
+ @rtype: list of type(param)
+ """
+ if isinstance(param, list):
+ for p_ in param:
+ yield p_
+ else:
+ yield param
+
+ def get_wd(self, path):
+ """
+ Returns the watch descriptor associated to path. This method
+        has a prohibitive cost; always prefer to keep the WD
+ returned by add_watch(). If the path is unknown it returns None.
+
+ @param path: Path.
+ @type path: str
+ @return: WD or None.
+ @rtype: int or None
+ """
+ path = self.__format_path(path)
+ for iwd in self._wmd.items():
+ if iwd[1].path == path:
+ return iwd[0]
+
+ def get_path(self, wd):
+ """
+ Returns the path associated to WD, if WD is unknown it returns None.
+
+ @param wd: Watch descriptor.
+ @type wd: int
+ @return: Path or None.
+ @rtype: string or None
+ """
+ watch_ = self._wmd.get(wd)
+ if watch_ is not None:
+ return watch_.path
+
+ def __walk_rec(self, top, rec):
+ """
+        Yields each subdirectory of top; doesn't follow symlinks.
+        If rec is False, only yields top.
+
+ @param top: root directory.
+ @type top: string
+ @param rec: recursive flag.
+ @type rec: bool
+ @return: path of one subdirectory.
+ @rtype: string
+ """
+ if not rec or os.path.islink(top) or not os.path.isdir(top):
+ yield top
+ else:
+ for root, dirs, files in os.walk(top):
+ yield root
+
+ def rm_watch(self, wd, rec=False, quiet=True):
+ """
+        Removes watch(es).
+
+ @param wd: Watch Descriptor of the file or directory to unwatch.
+ Also accepts a list of WDs.
+ @type wd: int or list of int.
+        @param rec: Recursively removes watches on all already watched
+                    subdirectories and subfiles.
+ @type rec: bool
+ @param quiet: If False raises a WatchManagerError exception on
+ error. See example not_quiet.py
+ @type quiet: bool
+        @return: dict of watch descriptors associated to boolean values.
+ True if the corresponding wd has been successfully
+ removed, False otherwise.
+ @rtype: dict of {int: bool}
+ """
+ lwd = self.__format_param(wd)
+ if rec:
+ lwd = self.__get_sub_rec(lwd)
+
+ ret_ = {} # return {wd: bool, ...}
+ for awd in lwd:
+ # remove watch
+ wd_ = self._inotify_wrapper.inotify_rm_watch(self._fd, awd)
+ if wd_ < 0:
+ ret_[awd] = False
+ err = ('rm_watch: cannot remove WD=%d, %s' % \
+ (awd, self._inotify_wrapper.str_errno()))
+ if quiet:
+ log.error(err)
+ continue
+ raise WatchManagerError(err, ret_)
+
+ # Remove watch from our dictionary
+ if awd in self._wmd:
+ del self._wmd[awd]
+ ret_[awd] = True
+ log.debug('Watch WD=%d (%s) removed', awd, self.get_path(awd))
+ return ret_
+
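+    # Illustrative sketch (not part of the original module): removing a watch
+    # recursively using a wd kept from add_watch(), given a WatchManager
+    # instance wm. The path is an arbitrary example.
+    #
+    #     wdd = wm.add_watch('/tmp', ALL_EVENTS, rec=True)
+    #     wm.rm_watch(wdd['/tmp'], rec=True)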
+
+ def watch_transient_file(self, filename, mask, proc_class):
+ """
+ Watch a transient file, which will be created and deleted frequently
+ over time (e.g. pid file).
+
+        @attention: Currently, when this function is used it is not
+        possible to correctly watch the events triggered in the same
+        base directory as the one containing the watched
+ transient file. For instance it would be wrong to make these
+ two successive calls: wm.watch_transient_file('/var/run/foo.pid', ...)
+ and wm.add_watch('/var/run/', ...)
+
+ @param filename: Filename.
+ @type filename: string
+ @param mask: Bitmask of events, should contain IN_CREATE and IN_DELETE.
+ @type mask: int
+        @param proc_class: ProcessEvent class (or one of its subclasses); beware
+                           of accepting a ProcessEvent instance as argument to
+                           __init__, see the transient_file.py example for more
+                           details.
+        @type proc_class: ProcessEvent class or one of its subclasses.
+ @return: Same as add_watch().
+ @rtype: Same as add_watch().
+ """
+ dirname = os.path.dirname(filename)
+ if dirname == '':
+ return {} # Maintains coherence with add_watch()
+ basename = os.path.basename(filename)
+ # Assuming we are watching at least for IN_CREATE and IN_DELETE
+ mask |= IN_CREATE | IN_DELETE
+
+ def cmp_name(event):
+ if getattr(event, 'name') is None:
+ return False
+ return basename == event.name
+ return self.add_watch(dirname, mask,
+ proc_fun=proc_class(ChainIfTrue(func=cmp_name)),
+ rec=False,
+ auto_add=False, do_glob=False,
+ exclude_filter=lambda path: False)
+
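+    # Illustrative sketch (not part of the original module): watching a pid
+    # file that is created and deleted repeatedly, given a WatchManager
+    # instance wm. The handler class and path are hypothetical examples.
+    #
+    #     class PidHandler(ProcessEvent):
+    #         def process_default(self, event):
+    #             print('pid file event: %s' % event.maskname)
+    #
+    #     wm.watch_transient_file('/var/run/foo.pid',
+    #                             IN_CREATE | IN_DELETE, PidHandler)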
+ def get_ignore_events(self):
+ return self._ignore_events
+
+ def set_ignore_events(self, nval):
+ self._ignore_events = nval
+
+ ignore_events = property(get_ignore_events, set_ignore_events,
+ "Make watch manager ignoring new events.")
+
+
+
+class RawOutputFormat:
+ """
+ Format string representations.
+ """
+ def __init__(self, format=None):
+ self.format = format or {}
+
+ def simple(self, s, attribute):
+ if not isinstance(s, str):
+ s = str(s)
+ return (self.format.get(attribute, '') + s +
+ self.format.get('normal', ''))
+
+ def punctuation(self, s):
+ """Punctuation color."""
+ return self.simple(s, 'normal')
+
+ def field_value(self, s):
+ """Field value color."""
+ return self.simple(s, 'purple')
+
+ def field_name(self, s):
+ """Field name color."""
+ return self.simple(s, 'blue')
+
+ def class_name(self, s):
+ """Class name color."""
+ return self.format.get('red', '') + self.simple(s, 'bold')
+
+output_format = RawOutputFormat()
+
+class ColoredOutputFormat(RawOutputFormat):
+ """
+ Format colored string representations.
+ """
+ def __init__(self):
+ f = {'normal': '\033[0m',
+ 'black': '\033[30m',
+ 'red': '\033[31m',
+ 'green': '\033[32m',
+ 'yellow': '\033[33m',
+ 'blue': '\033[34m',
+ 'purple': '\033[35m',
+ 'cyan': '\033[36m',
+ 'bold': '\033[1m',
+ 'uline': '\033[4m',
+ 'blink': '\033[5m',
+ 'invert': '\033[7m'}
+ RawOutputFormat.__init__(self, f)
+
+
+def compatibility_mode():
+ """
+ Use this function to turn on the compatibility mode. The compatibility
+ mode is used to improve compatibility with Pyinotify 0.7.1 (or older)
+ programs. The compatibility mode provides additional variables 'is_dir',
+ 'event_name', 'EventsCodes.IN_*' and 'EventsCodes.ALL_EVENTS' as
+ Pyinotify 0.7.1 provided. Do not call this function from new programs!!
+    Especially if they are developed for Pyinotify >= 0.8.x.
+ """
+ setattr(EventsCodes, 'ALL_EVENTS', ALL_EVENTS)
+ for evname in globals():
+ if evname.startswith('IN_'):
+ setattr(EventsCodes, evname, globals()[evname])
+ global COMPATIBILITY_MODE
+ COMPATIBILITY_MODE = True
+
+
+def command_line():
+ """
+ By default the watched path is '/tmp' and all types of events are
+    monitored. Event monitoring runs forever; type ^C to stop it.
+ """
+ from optparse import OptionParser
+
+ usage = "usage: %prog [options] [path1] [path2] [pathn]"
+
+ parser = OptionParser(usage=usage)
+ parser.add_option("-v", "--verbose", action="store_true",
+ dest="verbose", help="Verbose mode")
+ parser.add_option("-r", "--recursive", action="store_true",
+ dest="recursive",
+ help="Add watches recursively on paths")
+ parser.add_option("-a", "--auto_add", action="store_true",
+ dest="auto_add",
+ help="Automatically add watches on new directories")
+ parser.add_option("-g", "--glob", action="store_true",
+ dest="glob",
+ help="Treat paths as globs")
+ parser.add_option("-e", "--events-list", metavar="EVENT[,...]",
+ dest="events_list",
+ help=("A comma-separated list of events to watch for - "
+ "see the documentation for valid options (defaults"
+ " to everything)"))
+ parser.add_option("-s", "--stats", action="store_true",
+ dest="stats",
+ help="Display dummy statistics")
+ parser.add_option("-V", "--version", action="store_true",
+ dest="version", help="Pyinotify version")
+ parser.add_option("-f", "--raw-format", action="store_true",
+ dest="raw_format",
+ help="Disable enhanced output format.")
+ parser.add_option("-c", "--command", action="store",
+ dest="command",
+ help="Shell command to run upon event")
+
+ (options, args) = parser.parse_args()
+
+ if options.verbose:
+ log.setLevel(10)
+
+ if options.version:
+ print(__version__)
+
+ if not options.raw_format:
+ global output_format
+ output_format = ColoredOutputFormat()
+
+ if len(args) < 1:
+ path = '/tmp' # default watched path
+ else:
+ path = args
+
+ # watch manager instance
+ wm = WatchManager()
+ # notifier instance and init
+ if options.stats:
+ notifier = Notifier(wm, default_proc_fun=Stats(), read_freq=5)
+ else:
+ notifier = Notifier(wm, default_proc_fun=PrintAllEvents())
+
+ # What mask to apply
+ mask = 0
+ if options.events_list:
+ events_list = options.events_list.split(',')
+ for ev in events_list:
+ evcode = EventsCodes.ALL_FLAGS.get(ev, 0)
+ if evcode:
+ mask |= evcode
+ else:
+ parser.error("The event '%s' specified with option -e"
+ " is not valid" % ev)
+ else:
+ mask = ALL_EVENTS
+
+ # stats
+ cb_fun = None
+ if options.stats:
+ def cb(s):
+ sys.stdout.write(repr(s.proc_fun()))
+ sys.stdout.write('\n')
+ sys.stdout.write(str(s.proc_fun()))
+ sys.stdout.write('\n')
+ sys.stdout.flush()
+ cb_fun = cb
+
+ # External command
+ if options.command:
+ def cb(s):
+ subprocess.Popen(options.command, shell=True)
+ cb_fun = cb
+
+    log.debug('Start monitoring %s, (press ^C to halt pyinotify)' % path)
+
+ wm.add_watch(path, mask, rec=options.recursive, auto_add=options.auto_add, do_glob=options.glob)
+ # Loop forever (until sigint signal get caught)
+ notifier.loop(callback=cb_fun)
+
+
+if __name__ == '__main__':
+ command_line()
diff --git a/bitbake/lib/toaster/__init__.py b/bitbake/lib/toaster/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/__init__.py
diff --git a/bitbake/lib/toaster/bldcollector/__init__.py b/bitbake/lib/toaster/bldcollector/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/bldcollector/__init__.py
diff --git a/bitbake/lib/toaster/bldcollector/admin.py b/bitbake/lib/toaster/bldcollector/admin.py
new file mode 100644
index 0000000..c1f85d7
--- /dev/null
+++ b/bitbake/lib/toaster/bldcollector/admin.py
@@ -0,0 +1,33 @@
+from django.contrib import admin
+from django.contrib.admin.filters import RelatedFieldListFilter
+from orm.models import BitbakeVersion, Release, LayerSource, ToasterSetting
+from django.forms.widgets import Textarea
+from django import forms
+import django.db.models as models
+
+from django.contrib.admin import widgets, helpers
+
+class LayerSourceAdmin(admin.ModelAdmin):
+ pass
+
+class BitbakeVersionAdmin(admin.ModelAdmin):
+
+ # we override the formfield for db URLField because of broken URL validation
+
+ def formfield_for_dbfield(self, db_field, **kwargs):
+ if isinstance(db_field, models.fields.URLField):
+ return forms.fields.CharField()
+ return super(BitbakeVersionAdmin, self).formfield_for_dbfield(db_field, **kwargs)
+
+
+
+class ReleaseAdmin(admin.ModelAdmin):
+ pass
+
+class ToasterSettingAdmin(admin.ModelAdmin):
+ pass
+
+admin.site.register(LayerSource, LayerSourceAdmin)
+admin.site.register(BitbakeVersion, BitbakeVersionAdmin)
+admin.site.register(Release, ReleaseAdmin)
+admin.site.register(ToasterSetting, ToasterSettingAdmin)
diff --git a/bitbake/lib/toaster/bldcollector/urls.py b/bitbake/lib/toaster/bldcollector/urls.py
new file mode 100644
index 0000000..144387b
--- /dev/null
+++ b/bitbake/lib/toaster/bldcollector/urls.py
@@ -0,0 +1,26 @@
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2014 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+
+from django.conf.urls import patterns, include, url
+from django.views.generic import RedirectView
+
+urlpatterns = patterns('bldcollector.views',
+        # landing point for pushing a bitbake_eventlog.json file to this toaster instance
+ url(r'^eventfile$', 'eventfile', name='eventfile'),
+ )
diff --git a/bitbake/lib/toaster/bldcollector/views.py b/bitbake/lib/toaster/bldcollector/views.py
new file mode 100644
index 0000000..f32fa4d
--- /dev/null
+++ b/bitbake/lib/toaster/bldcollector/views.py
@@ -0,0 +1,62 @@
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2014 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from django.views.decorators.cache import cache_control
+from django.core.urlresolvers import reverse
+from django.core.paginator import Paginator, EmptyPage, PageNotAnInteger
+from django.http import HttpResponseBadRequest, HttpResponse
+from django.utils import timezone
+from django.utils.html import escape
+from datetime import timedelta
+from django.utils import formats
+from toastergui.templatetags.projecttags import json as jsonfilter
+import json
+import os
+import tempfile
+import subprocess
+import toastermain
+from django.views.decorators.csrf import csrf_exempt
+
+
+@csrf_exempt
+def eventfile(request):
+ """ Receives a file by POST, and runs toaster-eventreply on this file """
+ if request.method != "POST":
+ return HttpResponseBadRequest("This API only accepts POST requests. Post a file with:\n\ncurl -F eventlog=@bitbake_eventlog.json %s\n" % request.build_absolute_uri(reverse('eventfile')), content_type="text/plain;utf8")
+
+ # write temporary file
+ (handle, abstemppath) = tempfile.mkstemp(dir="/tmp/")
+ with os.fdopen(handle, "w") as tmpfile:
+ for chunk in request.FILES['eventlog'].chunks():
+ tmpfile.write(chunk)
+ tmpfile.close()
+
+ # compute the path to "bitbake/bin/toaster-eventreplay"
+ from os.path import dirname as DN
+ import_script = os.path.join(DN(DN(DN(DN(os.path.abspath(__file__))))), "bin/toaster-eventreplay")
+ if not os.path.exists(import_script):
+ raise Exception("script missing %s" % import_script)
+ scriptenv = os.environ.copy()
+ scriptenv["DATABASE_URL"] = toastermain.settings.getDATABASE_URL()
+
+ # run the data loading process and return the results
+ importer = subprocess.Popen([import_script, abstemppath], stdout=subprocess.PIPE, stderr=subprocess.PIPE, env=scriptenv)
+ (out, err) = importer.communicate()
+ if importer.returncode == 0:
+ os.remove(abstemppath)
+ return HttpResponse("== Retval %d\n== STDOUT\n%s\n\n== STDERR\n%s" % (importer.returncode, out, err), content_type="text/plain;utf8")
diff --git a/bitbake/lib/toaster/bldcontrol/__init__.py b/bitbake/lib/toaster/bldcontrol/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/__init__.py
diff --git a/bitbake/lib/toaster/bldcontrol/admin.py b/bitbake/lib/toaster/bldcontrol/admin.py
new file mode 100644
index 0000000..fcbe5f5
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/admin.py
@@ -0,0 +1,8 @@
+from django.contrib import admin
+from django.contrib.admin.filters import RelatedFieldListFilter
+from .models import BuildEnvironment
+
+class BuildEnvironmentAdmin(admin.ModelAdmin):
+ pass
+
+admin.site.register(BuildEnvironment, BuildEnvironmentAdmin)
diff --git a/bitbake/lib/toaster/bldcontrol/bbcontroller.py b/bitbake/lib/toaster/bldcontrol/bbcontroller.py
new file mode 100644
index 0000000..ad70ac8
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/bbcontroller.py
@@ -0,0 +1,202 @@
+#
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2014 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+
+import os
+import sys
+import re
+from django.db import transaction
+from django.db.models import Q
+from bldcontrol.models import BuildEnvironment, BRLayer, BRVariable, BRTarget, BRBitbake
+
+# load Bitbake components
+path = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))))
+sys.path.insert(0, path)
+import bb.server.xmlrpc
+
+class BitbakeController(object):
+ """ This is the basic class that controlls a bitbake server.
+ It is outside the scope of this class on how the server is started and aquired
+ """
+
+ def __init__(self, connection):
+ self.connection = connection
+
+ def _runCommand(self, command):
+ result, error = self.connection.connection.runCommand(command)
+ if error:
+ raise Exception(error)
+ return result
+
+ def disconnect(self):
+ return self.connection.removeClient()
+
+ def setVariable(self, name, value):
+ return self._runCommand(["setVariable", name, value])
+
+ def build(self, targets, task = None):
+ if task is None:
+ task = "build"
+ return self._runCommand(["buildTargets", targets, task])
+
+
+
+def getBuildEnvironmentController(**kwargs):
+ """ Gets you a BuildEnvironmentController that encapsulates a build environment,
+ based on the query dictionary sent in.
+
+ This is used to retrieve, for example, the currently running BE from inside
+ the toaster UI, or find a new BE to start a new build in it.
+
+ The return object MUST always be a BuildEnvironmentController.
+ """
+
+ from localhostbecontroller import LocalhostBEController
+ from sshbecontroller import SSHBEController
+
+ be = BuildEnvironment.objects.filter(Q(**kwargs))[0]
+ if be.betype == BuildEnvironment.TYPE_LOCAL:
+ return LocalhostBEController(be)
+ elif be.betype == BuildEnvironment.TYPE_SSH:
+ return SSHBEController(be)
+ else:
+ raise Exception("FIXME: Implement BEC for type %s" % str(be.betype))
+
+
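+# Illustrative sketch (not part of the original module): obtaining a controller
+# for a build environment and talking to its bitbake server. The query (pk=1),
+# the variable value and the build target are arbitrary examples; any Django
+# field lookup valid for BuildEnvironment works.
+#
+#     bec = getBuildEnvironmentController(pk=1)
+#     bbctrl = bec.getBBController()        # starts the server if needed
+#     bbctrl.setVariable('MACHINE', 'qemux86')
+#     bbctrl.build(['core-image-minimal'])
+#     bec.stopBBServer()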
+class BuildEnvironmentController(object):
+ """ BuildEnvironmentController (BEC) is the abstract class that defines the operations that MUST
+ or SHOULD be supported by a Build Environment. It is used to establish the framework, and must
+ not be instantiated directly by the user.
+
+ Use the "getBuildEnvironmentController()" function to get a working BEC for your remote.
+
+ How the BuildEnvironments are discovered is outside the scope of this class.
+
+ You must derive this class to teach Toaster how to operate in your own infrastructure.
+ We provide some specific BuildEnvironmentController classes that can be used either to
+ directly set-up Toaster infrastructure, or as a model for your own infrastructure set:
+
+ * Localhost controller will run the Toaster BE on the same account as the web server
+        (the current user if you are using the Django development web server)
+ on the local machine, with the "build/" directory under the "poky/" source checkout directory.
+ Bash is expected to be available.
+
+ * SSH controller will run the Toaster BE on a remote machine, where the current user
+        can connect without password (set up with either ssh-agent or passphrase-less key authentication)
+
+ """
+ def __init__(self, be):
+ """ Takes a BuildEnvironment object as parameter that points to the settings of the BE.
+ """
+ self.be = be
+ self.connection = None
+
+ @staticmethod
+ def _updateBBLayers(bblayerconf, layerlist):
+ conflines = open(bblayerconf, "r").readlines()
+
+ bblayerconffile = open(bblayerconf, "w")
+ skip = 0
+ for i in xrange(len(conflines)):
+ if skip > 0:
+                skip -= 1
+ continue
+ if conflines[i].startswith("# line added by toaster"):
+ skip = 1
+ else:
+ bblayerconffile.write(conflines[i])
+
+ bblayerconffile.write("# line added by toaster build control\nBBLAYERS = \"" + " ".join(layerlist) + "\"")
+ bblayerconffile.close()
+
+
+ def writeConfFile(self, variable_list = None, raw = None):
+ """ Writes a configuration file in the build directory. Override with buildenv-specific implementation. """
+ raise Exception("FIXME: Must override to actually write a configuration file")
+
+
+ def startBBServer(self):
+ """ Starts a BB server with Toaster toasterui set up to record the builds, an no controlling UI.
+ After this method executes, self.be bbaddress/bbport MUST point to a running and free server,
+ and the bbstate MUST be updated to "started".
+ """
+ raise Exception("FIXME: Must override in order to actually start the BB server")
+
+ def stopBBServer(self):
+ """ Stops the currently running BB server.
+ The bbstate MUST be updated to "stopped".
+ self.connection must be none.
+ """
+ raise Exception("FIXME: Must override stoBBServer")
+
+ def setLayers(self, bbs, ls):
+ """ Checks-out bitbake executor and layers from git repositories.
+ Sets the layer variables in the config file, after validating local layer paths.
+ The bitbakes must be a 1-length list of BRBitbake
+ The layer paths must be in a list of BRLayer object
+
+ a word of attention: by convention, the first layer for any build will be poky!
+ """
+ raise Exception("FIXME: Must override setLayers")
+
+
+ def getBBController(self):
+ """ returns a BitbakeController to an already started server; this is the point where the server
+ starts if needed; or reconnects to the server if we can
+ """
+ if not self.connection:
+ self.startBBServer()
+ self.be.lock = BuildEnvironment.LOCK_RUNNING
+ self.be.save()
+
+ server = bb.server.xmlrpc.BitBakeXMLRPCClient()
+ server.initServer()
+ server.saveConnectionDetails("%s:%s" % (self.be.bbaddress, self.be.bbport))
+ self.connection = server.establishConnection([])
+
+ self.be.bbtoken = self.connection.transport.connection_token
+ self.be.save()
+
+ return BitbakeController(self.connection)
+
+ def getArtifact(self, path):
+ """ This call returns an artifact identified by the 'path'. How 'path' is interpreted as
+ up to the implementing BEC. The return MUST be a REST URL where a GET will actually return
+ the content of the artifact, e.g. for use as a "download link" in a web UI.
+ """
+ raise Exception("Must return the REST URL of the artifact")
+
+ def release(self):
+ """ This stops the server and releases any resources. After this point, all resources
+            are unavailable for further reference.
+ """
+ raise Exception("Must override BE release")
+
+ def triggerBuild(self, bitbake, layers, variables, targets):
+ raise Exception("Must override BE release")
+
+class ShellCmdException(Exception):
+ pass
+
+
+class BuildSetupException(Exception):
+ pass
+
diff --git a/bitbake/lib/toaster/bldcontrol/localhostbecontroller.py b/bitbake/lib/toaster/bldcontrol/localhostbecontroller.py
new file mode 100644
index 0000000..a9909b8
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/localhostbecontroller.py
@@ -0,0 +1,336 @@
+#
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2014 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+
+import os
+import sys
+import re
+from django.db import transaction
+from django.db.models import Q
+from bldcontrol.models import BuildEnvironment, BRLayer, BRVariable, BRTarget, BRBitbake
+import subprocess
+
+from toastermain import settings
+
+from bbcontroller import BuildEnvironmentController, ShellCmdException, BuildSetupException
+
+import logging
+logger = logging.getLogger("toaster")
+
+from pprint import pprint, pformat
+
+class LocalhostBEController(BuildEnvironmentController):
+ """ Implementation of the BuildEnvironmentController for the localhost;
+ this controller manages the default build directory,
+ the server setup and system start and stop for the localhost-type build environment
+
+ """
+
+ def __init__(self, be):
+ super(LocalhostBEController, self).__init__(be)
+ self.dburl = settings.getDATABASE_URL()
+ self.pokydirname = None
+ self.islayerset = False
+
+ def _shellcmd(self, command, cwd = None):
+ if cwd is None:
+ cwd = self.be.sourcedir
+
+ logger.debug("lbc_shellcmmd: (%s) %s" % (cwd, command))
+ p = subprocess.Popen(command, cwd = cwd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+ (out,err) = p.communicate()
+ p.wait()
+ if p.returncode:
+ if len(err) == 0:
+ err = "command: %s \n%s" % (command, out)
+ else:
+ err = "command: %s \n%s" % (command, err)
+ logger.warn("localhostbecontroller: shellcmd error %s" % err)
+ raise ShellCmdException(err)
+ else:
+ logger.debug("localhostbecontroller: shellcmd success")
+ return out
+
+ def _setupBE(self):
+ assert self.pokydirname and os.path.exists(self.pokydirname)
+ path = self.be.builddir
+ if not path:
+ raise Exception("Invalid path creation specified.")
+ if not os.path.exists(path):
+ os.makedirs(path, 0755)
+ self._shellcmd("bash -c \"source %s/oe-init-build-env %s\"" % (self.pokydirname, path))
+ # delete the templateconf.cfg; it may come from an unsupported layer configuration
+ os.remove(os.path.join(path, "conf/templateconf.cfg"))
+
+
+ def writeConfFile(self, file_name, variable_list = None, raw = None):
+ filepath = os.path.join(self.be.builddir, file_name)
+ with open(filepath, "w") as conffile:
+ if variable_list is not None:
+ for i in variable_list:
+ conffile.write("%s=\"%s\"\n" % (i.name, i.value))
+ if raw is not None:
+ conffile.write(raw)
+
+
+ def startBBServer(self):
+ assert self.pokydirname and os.path.exists(self.pokydirname)
+ assert self.islayerset
+
+ # find our own toasterui listener/bitbake
+ from toaster.bldcontrol.management.commands.loadconf import _reduce_canon_path
+
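+ # the toasterui listener is the bitbake executable shipped with this checkout: bin/bitbake, three directories above this module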
+ own_bitbake = _reduce_canon_path(os.path.join(os.path.dirname(os.path.abspath(__file__)), "../../../bin/bitbake"))
+
+ assert os.path.exists(own_bitbake) and os.path.isfile(own_bitbake)
+
+ logger.debug("localhostbecontroller: running the listener at %s" % own_bitbake)
+
+ toaster_ui_log_filepath = os.path.join(self.be.builddir, "toaster_ui.log")
+ # get the file length; we need to detect the _last_ start of the toaster UI, not the first
+ toaster_ui_log_filelength = 0
+ if os.path.exists(toaster_ui_log_filepath):
+ with open(toaster_ui_log_filepath, "w") as f:
+ f.seek(0, 2) # jump to the end
+ toaster_ui_log_filelength = f.tell()
+
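+ # source the build environment and start a bitbake XML-RPC server in server-only mode; "-B 0.0.0.0:0" requests an automatically chosen port, and all output goes to toaster_server.log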
+ cmd = "bash -c \"source %s/oe-init-build-env %s 2>&1 >toaster_server.log && bitbake --read %s/conf/toaster-pre.conf --postread %s/conf/toaster.conf --server-only -t xmlrpc -B 0.0.0.0:0 2>&1 >>toaster_server.log \"" % (self.pokydirname, self.be.builddir, self.be.builddir, self.be.builddir)
+
+ port = "-1"
+ logger.debug("localhostbecontroller: starting builder \n%s\n" % cmd)
+
+ cmdoutput = self._shellcmd(cmd)
+ with open(self.be.builddir + "/toaster_server.log", "r") as f:
+ for i in f.readlines():
+ if i.startswith("Bitbake server address"):
+ port = i.split(" ")[-1]
+ logger.debug("localhostbecontroller: Found bitbake server port %s" % port)
+
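+ # attach our own bitbake as an observe-only toasterui client to the running server; DATABASE_URL tells toasterui which Toaster database to write to, and its output goes to toaster_ui.log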
+ cmd = "bash -c \"source %s/oe-init-build-env-memres -1 %s && DATABASE_URL=%s %s --observe-only -u toasterui --remote-server=0.0.0.0:-1 -t xmlrpc\"" % (self.pokydirname, self.be.builddir, self.dburl, own_bitbake)
+ with open(toaster_ui_log_filepath, "a+") as f:
+ p = subprocess.Popen(cmd, cwd = self.be.builddir, shell=True, stdout=f, stderr=f)
+
+ def _toaster_ui_started(filepath, filepos = 0):
+ if not os.path.exists(filepath):
+ return False
+ with open(filepath, "r") as f:
+ f.seek(filepos)
+ for line in f:
+ if line.startswith("NOTE: ToasterUI waiting for events"):
+ return True
+ return False
+
+ retries = 0
+ started = False
+ while not started and retries < 50:
+ started = _toaster_ui_started(toaster_ui_log_filepath, toaster_ui_log_filelength)
+ import time
+ logger.debug("localhostbecontroller: Waiting bitbake server to start")
+ time.sleep(0.5)
+ retries += 1
+
+ if not started:
+ toaster_ui_log = open(os.path.join(self.be.builddir, "toaster_ui.log"), "r").read()
+ toaster_server_log = open(os.path.join(self.be.builddir, "toaster_server.log"), "r").read()
+ raise BuildSetupException("localhostbecontroller: Bitbake server did not start in 25 seconds, aborting (Error: '%s' '%s')" % (toaster_ui_log, toaster_server_log))
+
+ logger.debug("localhostbecontroller: Started bitbake server")
+
+ while port == "-1":
+ # the port specification is "autodetect"; read the bitbake.lock file
+ with open("%s/bitbake.lock" % self.be.builddir, "r") as f:
+ for line in f.readlines():
+ if ":" in line:
+ port = line.split(":")[1].strip()
+ logger.debug("localhostbecontroller: Autodetected bitbake port %s", port)
+ break
+
+ assert self.be.sourcedir and os.path.exists(self.be.builddir)
+ self.be.bbaddress = "localhost"
+ self.be.bbport = port
+ self.be.bbstate = BuildEnvironment.SERVER_STARTED
+ self.be.save()
+
+ def stopBBServer(self):
+ assert self.pokydirname and os.path.exists(self.pokydirname)
+ assert self.islayerset
+ self._shellcmd("bash -c \"source %s/oe-init-build-env %s && %s source toaster stop\"" %
+ (self.pokydirname, self.be.builddir, ("" if self.be.bbtoken is None else "BBTOKEN=%s" % self.be.bbtoken)))
+ self.be.bbstate = BuildEnvironment.SERVER_STOPPED
+ self.be.save()
+ logger.debug("localhostbecontroller: Stopped bitbake server")
+
+ def getGitCloneDirectory(self, url, branch):
+ """ Utility that returns the last component of a git path as directory
+ """
+ import re
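+ # split the URL on ":", "." and "/" and use the last meaningful component as the repository base name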
+ components = re.split(r'[:\.\/]', url)
+ base = components[-2] if components[-1] == "git" else components[-1]
+
+ if branch != "HEAD":
+ return "_%s_%s.toaster_cloned" % (base, branch)
+
+
+ # a word of caution: this is a localhost-specific convention; only on the localhost do we expect "HEAD" releases,
+ # which _ALWAYS_ means the current poky checkout
+ from os.path import dirname as DN
+ local_checkout_path = DN(DN(DN(DN(DN(os.path.abspath(__file__))))))
+ #logger.debug("localhostbecontroller: using HEAD checkout in %s" % local_checkout_path)
+ return local_checkout_path
+
+
+ def setLayers(self, bitbakes, layers):
+ """ a word of attention: by convention, the first layer for any build will be poky! """
+
+ assert self.be.sourcedir is not None
+ assert len(bitbakes) == 1
+ # set layers in the layersource
+
+ # 1. get a list of repos with branches, and map dirpaths for each layer
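+ # gitrepos maps (giturl, commit) -> [(name, dirpath), ...] for everything that has to be checked out from that repository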
+ gitrepos = {}
+
+ gitrepos[(bitbakes[0].giturl, bitbakes[0].commit)] = []
+ gitrepos[(bitbakes[0].giturl, bitbakes[0].commit)].append( ("bitbake", bitbakes[0].dirpath) )
+
+ for layer in layers:
+ # we don't process local URLs
+ if layer.giturl.startswith("file://"):
+ continue
+ if not (layer.giturl, layer.commit) in gitrepos:
+ gitrepos[(layer.giturl, layer.commit)] = []
+ gitrepos[(layer.giturl, layer.commit)].append( (layer.name, layer.dirpath) )
+
+
+ logger.debug("localhostbecontroller, our git repos are %s" % pformat(gitrepos))
+
+
+ # 2. find checked-out git repos in the sourcedir directory that may help faster cloning
+
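+ # cached_layers maps a remote URL to a local directory that already contains a clone of it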
+ cached_layers = {}
+ for ldir in os.listdir(self.be.sourcedir):
+ fldir = os.path.join(self.be.sourcedir, ldir)
+ if os.path.isdir(fldir):
+ try:
+ for line in self._shellcmd("git remote -v", fldir).split("\n"):
+ try:
+ remote = line.split("\t")[1].split(" ")[0]
+ if remote not in cached_layers:
+ cached_layers[remote] = fldir
+ except IndexError:
+ pass
+ except ShellCmdException:
+ # ignore any errors in collecting git remotes
+ pass
+
+ layerlist = []
+
+
+ # 3. checkout the repositories
+ for giturl, commit in gitrepos.keys():
+ localdirname = os.path.join(self.be.sourcedir, self.getGitCloneDirectory(giturl, commit))
+ logger.debug("localhostbecontroller: giturl %s:%s checking out in current directory %s" % (giturl, commit, localdirname))
+
+ # make sure our directory is a git repository
+ if os.path.exists(localdirname):
+ localremotes = self._shellcmd("git remote -v", localdirname)
+ if not giturl in localremotes:
+ raise BuildSetupException("Existing git repository at %s, but with different remotes ('%s', expected '%s'). Toaster will not continue out of fear of damaging something." % (localdirname, ", ".join(localremotes.split("\n")), giturl))
+ else:
+ if giturl in cached_layers:
+ logger.debug("localhostbecontroller git-copying %s to %s" % (cached_layers[giturl], localdirname))
+ self._shellcmd("git clone \"%s\" \"%s\"" % (cached_layers[giturl], localdirname))
+ self._shellcmd("git remote remove origin", localdirname)
+ self._shellcmd("git remote add origin \"%s\"" % giturl, localdirname)
+ else:
+ logger.debug("localhostbecontroller: cloning %s:%s in %s" % (giturl, commit, localdirname))
+ self._shellcmd("git clone \"%s\" --single-branch --branch \"%s\" \"%s\"" % (giturl, commit, localdirname))
+
+ # branch magic name "HEAD" will inhibit checkout
+ if commit != "HEAD":
+ logger.debug("localhostbecontroller: checking out commit %s to %s " % (commit, localdirname))
+ self._shellcmd("git fetch --all && git checkout \"%s\" && git rebase \"origin/%s\"" % (commit, commit) , localdirname)
+
+ # take the localdirname as poky dir if we can find the oe-init-build-env
+ if self.pokydirname is None and os.path.exists(os.path.join(localdirname, "oe-init-build-env")):
+ logger.debug("localhostbecontroller: selected poky dir name %s" % localdirname)
+ self.pokydirname = localdirname
+
+ # make sure we have a working bitbake
+ if not os.path.exists(os.path.join(self.pokydirname, 'bitbake')):
+ logger.debug("localhostbecontroller: checking bitbake into the poky dirname %s " % self.pokydirname)
+ self._shellcmd("git clone -b \"%s\" \"%s\" \"%s\" " % (bitbakes[0].commit, bitbakes[0].giturl, os.path.join(self.pokydirname, 'bitbake')))
+
+ # verify our repositories
+ for name, dirpath in gitrepos[(giturl, commit)]:
+ localdirpath = os.path.join(localdirname, dirpath)
+ logger.debug("localhostbecontroller: localdirpath expected '%s'" % localdirpath)
+ if not os.path.exists(localdirpath):
+ raise BuildSetupException("Cannot find layer git path '%s' in checked out repository '%s:%s'. Aborting." % (localdirpath, giturl, commit))
+
+ if name != "bitbake":
+ layerlist.append(localdirpath.rstrip("/"))
+
+ logger.debug("localhostbecontroller: current layer list %s " % pformat(layerlist))
+
+ # 4. configure the build environment, so we have a conf/bblayers.conf
+ assert self.pokydirname is not None
+ self._setupBE()
+
+ # 5. update the bblayers.conf
+ bblayerconf = os.path.join(self.be.builddir, "conf/bblayers.conf")
+ if not os.path.exists(bblayerconf):
+ raise BuildSetupException("BE is not consistent: bblayers.conf file missing at %s" % bblayerconf)
+
+ BuildEnvironmentController._updateBBLayers(bblayerconf, layerlist)
+
+ self.islayerset = True
+ return True
+
+ def readServerLogFile(self):
+ return open(os.path.join(self.be.builddir, "toaster_server.log"), "r").read()
+
+ def release(self):
+ assert self.be.sourcedir and os.path.exists(self.be.builddir)
+ import shutil
+ shutil.rmtree(os.path.join(self.be.sourcedir, "build"))
+ assert not os.path.exists(self.be.builddir)
+
+
+ def triggerBuild(self, bitbake, layers, variables, targets):
+ # set up the build environment with the needed layers
+ self.setLayers(bitbake, layers)
+ self.writeConfFile("conf/toaster-pre.conf", variables)
+ self.writeConfFile("conf/toaster.conf", raw = "INHERIT+=\"toaster buildhistory\"")
+
+ # get the bb server running with the build req id and build env id
+ bbctrl = self.getBBController()
+
+ # trigger the build command
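+ # pick the last non-empty task requested across the targets; fall back to None (the default task) below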
+ task = reduce(lambda x, y: x if len(y)== 0 else y, map(lambda y: y.task, targets))
+ if len(task) == 0:
+ task = None
+
+ bbctrl.build(list(map(lambda x:x.target, targets)), task)
+
+ logger.debug("localhostbecontroller: Build launched, exiting. Follow build logs at %s/toaster_ui.log" % self.be.builddir)
+
+ # disconnect from the server
+ bbctrl.disconnect()
diff --git a/bitbake/lib/toaster/bldcontrol/management/__init__.py b/bitbake/lib/toaster/bldcontrol/management/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/management/__init__.py
diff --git a/bitbake/lib/toaster/bldcontrol/management/commands/__init__.py b/bitbake/lib/toaster/bldcontrol/management/commands/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/management/commands/__init__.py
diff --git a/bitbake/lib/toaster/bldcontrol/management/commands/checksettings.py b/bitbake/lib/toaster/bldcontrol/management/commands/checksettings.py
new file mode 100644
index 0000000..3ccc7c6
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/management/commands/checksettings.py
@@ -0,0 +1,247 @@
+from django.core.management.base import NoArgsCommand, CommandError
+from django.db import transaction
+from bldcontrol.bbcontroller import getBuildEnvironmentController, ShellCmdException
+from bldcontrol.models import BuildRequest, BuildEnvironment, BRError
+from orm.models import ToasterSetting, Build
+import os
+import sys, traceback
+
+def DN(path):
+ if path is None:
+ return ""
+ else:
+ return os.path.dirname(path)
+
+
+class Command(NoArgsCommand):
+ args = ""
+ help = "Verifies that the configured settings are valid and usable, or prompts the user to fix the settings."
+
+ def __init__(self, *args, **kwargs):
+ super(Command, self).__init__(*args, **kwargs)
+ self.guesspath = DN(DN(DN(DN(DN(DN(DN(__file__)))))))
+
+ def _find_first_path_for_file(self, startdirectory, filename, level = 0):
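+ # depth-limited recursive search: returns the first directory, at most 'level' levels below startdirectory, that contains 'filename'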
+ if level < 0:
+ return None
+ dirs = []
+ for i in os.listdir(startdirectory):
+ j = os.path.join(startdirectory, i)
+ if os.path.isfile(j):
+ if i == filename:
+ return startdirectory
+ elif os.path.isdir(j):
+ dirs.append(j)
+ for j in dirs:
+ ret = self._find_first_path_for_file(j, filename, level - 1)
+ if ret is not None:
+ return ret
+ return None
+
+ def _recursive_list_directories(self, startdirectory, level = 0):
+ if level < 0:
+ return []
+ dirs = []
+ try:
+ for i in os.listdir(startdirectory):
+ j = os.path.join(startdirectory, i)
+ if os.path.isdir(j):
+ dirs.append(j)
+ except OSError:
+ pass
+ for j in dirs:
+ dirs = dirs + self._recursive_list_directories(j, level - 1)
+ return dirs
+
+
+ def _get_suggested_sourcedir(self, be):
+ if be.betype != BuildEnvironment.TYPE_LOCAL:
+ return ""
+ return DN(DN(DN(self._find_first_path_for_file(self.guesspath, "toasterconf.json", 4))))
+
+ def _get_suggested_builddir(self, be):
+ if be.betype != BuildEnvironment.TYPE_LOCAL:
+ return ""
+ return DN(self._find_first_path_for_file(DN(self.guesspath), "bblayers.conf", 4))
+
+
+ def _verify_artifact_storage_dir(self):
+ # verify that we have a settings for downloading artifacts
+ while ToasterSetting.objects.filter(name="ARTIFACTS_STORAGE_DIR").count() == 0:
+ guessedpath = os.getcwd() + "/toaster_build_artifacts/"
+ print("\nToaster needs to know in which directory it can download build log files and other artifacts.\nToaster suggests \"%s\"." % guessedpath)
+ artifacts_storage_dir = raw_input("Press Enter to select \"%s\" or type the full path to a different directory: " % guessedpath)
+ if len(artifacts_storage_dir) == 0:
+ artifacts_storage_dir = guessedpath
+ if len(artifacts_storage_dir) > 0 and artifacts_storage_dir.startswith("/"):
+ try:
+ os.makedirs(artifacts_storage_dir)
+ except OSError as ose:
+ if "File exists" in str(ose):
+ pass
+ else:
+ raise ose
+ ToasterSetting.objects.create(name="ARTIFACTS_STORAGE_DIR", value=artifacts_storage_dir)
+ return 0
+
+
+ def _verify_build_environment(self):
+ # refuse to start if we have no build environments
+ while BuildEnvironment.objects.count() == 0:
+ print(" !! No build environments found. Toaster needs at least one build environment in order to be able to run builds.\n" +
+ "You can manually define build environments in the database table bldcontrol_buildenvironment.\n" +
+ "Or Toaster can define a simple localhost-based build environment for you.")
+
+ i = raw_input(" -- Do you want to create a basic localhost build environment ? (Y/n) ");
+ if not len(i) or i.startswith("y") or i.startswith("Y"):
+ BuildEnvironment.objects.create(pk = 1, betype = 0)
+ else:
+ raise Exception("Toaster cannot start without build environments. Aborting.")
+
+
+ # make sure we have a builddir and sourcedir for all defined build environments
+ for be in BuildEnvironment.objects.all():
+ be.needs_import = False
+ def _verify_be():
+ is_changed = False
+ print("\nVerifying the build environment. If the local build environment is not properly configured, you will be asked to configure it.")
+
+ def _update_sourcedir():
+ suggesteddir = self._get_suggested_sourcedir(be)
+ if len(suggesteddir) > 0:
+ be.sourcedir = raw_input("This is the directory Toaster uses to check out the source code of the layers you will build. Toaster will create new clones of the layers, so existing content in the chosen directory will not be changed.\nToaster suggests you use \"%s\" as your layers checkout directory. If you select this directory, a layer like \"meta-intel\" will end up in \"%s/meta-intel\".\nPress Enter to select \"%s\" or type the full path to a different directory. If you provide your own directory, it must be a parent of the cloned directory for the sources you are using to run Toaster: " % (suggesteddir, suggesteddir, suggesteddir))
+ else:
+ be.sourcedir = raw_input("Toaster needs to know in which directory it should check out the source code of the layers you will build. The directory should be a parent of the cloned directory for the sources you are using to run Toaster. Toaster will create new clones of the layers, so existing content in the chosen directory will not be changed.\nType the full path to the directory (for example: \"%s\": " % os.environ.get('HOME', '/tmp/'))
+ if len(be.sourcedir) == 0 and len(suggesteddir) > 0:
+ be.sourcedir = suggesteddir
+ return True
+
+ if len(be.sourcedir) == 0:
+ print "\n -- Validation: The layers checkout directory must be set."
+ is_changed = _update_sourcedir()
+
+ if not be.sourcedir.startswith("/"):
+ print "\n -- Validation: The layers checkout directory must be set to an absolute path."
+ is_changed = _update_sourcedir()
+
+ if not be.sourcedir in DN(__file__):
+ print "\n -- Validation: The layers checkout directory must be a parent of the current checkout."
+ is_changed = _update_sourcedir()
+
+ if is_changed:
+ if be.betype == BuildEnvironment.TYPE_LOCAL:
+ be.needs_import = True
+ return True
+
+ def _update_builddir():
+ suggesteddir = self._get_suggested_builddir(be)
+ if len(suggesteddir) > 0:
+ be.builddir = raw_input("Toaster needs to know where your build directory is located.\nThe build directory is where all the artifacts created by your builds will be stored. Toaster suggests \"%s\".\nPress Enter to select \"%s\" or type the full path to a different directory: " % (suggesteddir, suggesteddir))
+ else:
+ be.builddir = raw_input("Toaster needs to know where is your build directory.\nThe build directory is where all the artifacts created by your builds will be stored. Type the full path to the directory (for example: \" %s/build\")" % os.environ.get('HOME','/tmp/'))
+ if len(be.builddir) == 0 and len(suggesteddir) > 0:
+ be.builddir = suggesteddir
+ return True
+
+ if len(be.builddir) == 0:
+ print "\n -- Validation: The build directory must be set."
+ is_changed = _update_builddir()
+
+ if not be.builddir.startswith("/"):
+ print "\n -- Validation: The build directory must to be set to an absolute path."
+ is_changed = _update_builddir()
+
+
+ if is_changed:
+ print "\nBuild configuration saved"
+ be.save()
+ return True
+
+
+ if be.needs_import:
+ print "\nToaster can use a SINGLE predefined configuration file to set up default project settings and layer information sources.\n"
+
+ # find configuration files
+ config_files = []
+ for dirname in self._recursive_list_directories(be.sourcedir,2):
+ if os.path.exists(os.path.join(dirname, ".templateconf")):
+ import subprocess
+ proc = subprocess.Popen('bash -c ". '+os.path.join(dirname, ".templateconf")+'; echo \"\$TEMPLATECONF\""', shell=True, stdout=subprocess.PIPE)
+ conffilepath, stderroroutput = proc.communicate()
+ proc.wait()
+ if proc.returncode != 0:
+ raise Exception("Failed to source TEMPLATECONF: %s" % stderroroutput)
+
+ conffilepath = os.path.join(conffilepath.strip(), "toasterconf.json")
+ candidatefilepath = os.path.join(dirname, conffilepath)
+ if "toaster_cloned" in candidatefilepath:
+ continue
+ if os.path.exists(candidatefilepath):
+ config_files.append(candidatefilepath)
+
+ if len(config_files) > 0:
+ print "Toaster will list now the configuration files that it found. Select the number to use the desired configuration file."
+ for cf in config_files:
+ print " [%d] - %s" % (config_files.index(cf) + 1, cf)
+ print "\n [0] - Exit without importing any file"
+ try:
+ i = raw_input("\nEnter your option: ")
+ if len(i) and (int(i) - 1 >= 0 and int(i) - 1 < len(config_files)):
+ print "\nImporting file: %s" % config_files[int(i)-1]
+ from loadconf import Command as LoadConfigCommand
+
+ LoadConfigCommand()._import_layer_config(config_files[int(i)-1])
+ # we run lsupdates after config update
+ print "\nLayer configuration imported. Updating information from the layer sources, please wait.\nYou can re-update any time later by running bitbake/lib/toaster/manage.py lsupdates"
+ from django.core.management import call_command
+ call_command("lsupdates")
+
+ # we don't look for any other config files
+ return is_changed
+ except Exception as e:
+ print "Failure while trying to import the toaster config file: %s" % e
+ traceback.print_exc(e)
+ else:
+ print "\nToaster could not find a configuration file. You need to configure Toaster manually using the web interface, or create a configuration file and use\n bitbake/lib/toaster/managepy.py loadconf [filename]\n command to load it. You can use https://wiki.yoctoproject.org/wiki/File:Toasterconf.json.txt.patch as a starting point."
+
+
+
+
+ return is_changed
+
+ while (_verify_be()):
+ pass
+ return 0
+
+ def _verify_default_settings(self):
+ # verify that default settings are there
+ if ToasterSetting.objects.filter(name = 'DEFAULT_RELEASE').count() != 1:
+ ToasterSetting.objects.filter(name = 'DEFAULT_RELEASE').delete()
+ ToasterSetting.objects.get_or_create(name = 'DEFAULT_RELEASE', value = '')
+ return 0
+
+ def _verify_builds_in_progress(self):
+ # we are just starting up. we must not have any builds in progress, or build environments taken
+ for b in BuildRequest.objects.filter(state = BuildRequest.REQ_INPROGRESS):
+ BRError.objects.create(req = b, errtype = "toaster", errmsg = "Toaster found this build IN PROGRESS while Toaster started up. This is an inconsistent state, and the build was marked as failed")
+
+ BuildRequest.objects.filter(state = BuildRequest.REQ_INPROGRESS).update(state = BuildRequest.REQ_FAILED)
+
+ BuildEnvironment.objects.update(lock = BuildEnvironment.LOCK_FREE)
+
+ # also mark "In Progress builds as failures"
+ from django.utils import timezone
+ Build.objects.filter(outcome = Build.IN_PROGRESS).update(outcome = Build.FAILED, completed_on = timezone.now())
+
+ return 0
+
+
+
+ def handle_noargs(self, **options):
+ retval = 0
+ retval += self._verify_artifact_storage_dir()
+ retval += self._verify_build_environment()
+ retval += self._verify_default_settings()
+ retval += self._verify_builds_in_progress()
+
+ return retval
diff --git a/bitbake/lib/toaster/bldcontrol/management/commands/loadconf.py b/bitbake/lib/toaster/bldcontrol/management/commands/loadconf.py
new file mode 100644
index 0000000..5022b59
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/management/commands/loadconf.py
@@ -0,0 +1,183 @@
+from django.core.management.base import BaseCommand, CommandError
+from orm.models import LayerSource, ToasterSetting, Branch, Layer, Layer_Version
+from orm.models import BitbakeVersion, Release, ReleaseDefaultLayer, ReleaseLayerSourcePriority
+from django.db import IntegrityError
+import os
+
+from checksettings import DN
+
+import logging
+logger = logging.getLogger("toaster")
+
+def _reduce_canon_path(path):
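+ # collapse "." and ".." path components, e.g. "/a/b/../c" becomes "/a/c"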
+ components = []
+ for c in path.split("/"):
+ if c == "..":
+ del components[-1]
+ elif c == ".":
+ pass
+ else:
+ components.append(c)
+ if len(components) < 2:
+ components.append('')
+ return "/".join(components)
+
+def _get_id_for_sourcetype(s):
+ for i in LayerSource.SOURCE_TYPE:
+ if s == i[1]:
+ return i[0]
+ raise Exception("Could not find definition for sourcetype '%s'. Valid source types are %s" % (str(s), ', '.join(map(lambda x: "'%s'" % x[1], LayerSource.SOURCE_TYPE ))))
+
+class Command(BaseCommand):
+ help = "Loads a toasterconf.json file in the database"
+ args = "filepath"
+
+
+
+ def _import_layer_config(self, filepath):
+ if not os.path.exists(filepath) or not os.path.isfile(filepath):
+ raise Exception("Failed to find toaster config file %s ." % filepath)
+
+ import json
+ data = json.loads(open(filepath, "r").read())
+
+ # verify config file validity before updating settings
+ for i in ['bitbake', 'releases', 'defaultrelease', 'config', 'layersources']:
+ assert i in data
+
+ def _read_git_url_from_local_repository(address):
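+ # resolve "remote:<name>" references by running "git remote -v" in the config file's directory and returning the URL recorded for that remote, or None if it cannot be found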
+ url = None
+ # we detect the remote name at runtime
+ import subprocess
+ (remote, remote_name) = address.split(":", 1)
+ cmd = subprocess.Popen("git remote -v", shell=True, cwd = os.path.dirname(filepath), stdout=subprocess.PIPE, stderr = subprocess.PIPE)
+ (out,err) = cmd.communicate()
+ if cmd.returncode != 0:
+ logging.warning("Error while importing layer vcs_url: git error: %s" % err)
+ for line in out.split("\n"):
+ try:
+ (name, path) = line.split("\t", 1)
+ if name == remote_name:
+ url = path.split(" ")[0]
+ break
+ except ValueError:
+ pass
+ if url == None:
+ logging.warning("Error while looking for remote \"%s\" in \"%s\"" % (remote_name, out))
+ return url
+
+
+ # import bitbake data
+ for bvi in data['bitbake']:
+ bvo, created = BitbakeVersion.objects.get_or_create(name=bvi['name'])
+ if bvi['giturl'].startswith("remote:"):
+ bvo.giturl = _read_git_url_from_local_repository(bvi['giturl'])
+ if bvo.giturl is None:
+ logger.error("The toaster config file references the local git repo, but Toaster cannot detect it.\nYour local configuration for bitbake version %s is invalid. Make sure that the toasterconf.json file is correct." % bvi['name'])
+
+ if bvo.giturl is None:
+ bvo.giturl = bvi['giturl']
+ bvo.branch = bvi['branch']
+ bvo.dirpath = bvi['dirpath']
+ bvo.save()
+
+ # set the layer sources
+ for lsi in data['layersources']:
+ assert 'sourcetype' in lsi
+ assert 'apiurl' in lsi
+ assert 'name' in lsi
+ assert 'branches' in lsi
+
+
+ if _get_id_for_sourcetype(lsi['sourcetype']) == LayerSource.TYPE_LAYERINDEX or lsi['apiurl'].startswith("/"):
+ apiurl = lsi['apiurl']
+ else:
+ apiurl = _reduce_canon_path(os.path.join(DN(os.path.abspath(filepath)), lsi['apiurl']))
+
+ assert ((_get_id_for_sourcetype(lsi['sourcetype']) == LayerSource.TYPE_LAYERINDEX) or apiurl.startswith("/")), (lsi['sourcetype'],apiurl)
+
+ try:
+ ls, created = LayerSource.objects.get_or_create(sourcetype = _get_id_for_sourcetype(lsi['sourcetype']), apiurl = apiurl)
+ ls.name = lsi['name']
+ ls.save()
+ except IntegrityError as e:
+ logger.warning("IntegrityError %s \nWhile setting name %s for layer source %s " % (e, lsi['name'], ls))
+
+
+ layerbranches = []
+ for branchname in lsi['branches']:
+ bo, created = Branch.objects.get_or_create(layer_source = ls, name = branchname)
+ layerbranches.append(bo)
+
+ if 'layers' in lsi:
+ for layerinfo in lsi['layers']:
+ lo, created = Layer.objects.get_or_create(layer_source = ls, name = layerinfo['name'])
+ if layerinfo['local_path'].startswith("/"):
+ lo.local_path = layerinfo['local_path']
+ else:
+ lo.local_path = _reduce_canon_path(os.path.join(ls.apiurl, layerinfo['local_path']))
+
+ if not os.path.exists(lo.local_path):
+ logger.error("Local layer path %s must exists. Are you trying to import a layer that does not exist ? Check your local toasterconf.json" % lo.local_path)
+
+ if layerinfo['vcs_url'].startswith("remote:"):
+ lo.vcs_url = _read_git_url_from_local_repository(layerinfo['vcs_url'])
+ if lo.vcs_url is None:
+ logger.error("The toaster config file references the local git repo, but Toaster cannot detect it.\nYour local configuration for layer %s is invalid. Make sure that the toasterconf.json file is correct." % layerinfo['name'])
+
+ if lo.vcs_url is None:
+ lo.vcs_url = layerinfo['vcs_url']
+
+ if 'layer_index_url' in layerinfo:
+ lo.layer_index_url = layerinfo['layer_index_url']
+ lo.save()
+
+ for branch in layerbranches:
+ lvo, created = Layer_Version.objects.get_or_create(layer_source = ls,
+ up_branch = branch,
+ commit = branch.name,
+ layer = lo)
+ lvo.dirpath = layerinfo['dirpath']
+ lvo.save()
+ # set releases
+ for ri in data['releases']:
+ bvo = BitbakeVersion.objects.get(name = ri['bitbake'])
+ assert bvo is not None
+
+ ro, created = Release.objects.get_or_create(name = ri['name'], bitbake_version = bvo, branch_name = ri['branch'])
+ ro.description = ri['description']
+ ro.helptext = ri['helptext']
+ ro.save()
+
+ # save layer source priority for release
+ for ls_name in ri['layersourcepriority'].keys():
+ rlspo, created = ReleaseLayerSourcePriority.objects.get_or_create(release = ro, layer_source = LayerSource.objects.get(name=ls_name))
+ rlspo.priority = ri['layersourcepriority'][ls_name]
+ rlspo.save()
+
+ for dli in ri['defaultlayers']:
+ # find layers with the same name
+ ReleaseDefaultLayer.objects.get_or_create( release = ro, layer_name = dli)
+
+ # set default release
+ if ToasterSetting.objects.filter(name = "DEFAULT_RELEASE").count() > 0:
+ ToasterSetting.objects.filter(name = "DEFAULT_RELEASE").update(value = data['defaultrelease'])
+ else:
+ ToasterSetting.objects.create(name = "DEFAULT_RELEASE", value = data['defaultrelease'])
+
+ # set default config variables
+ for configname in data['config']:
+ if ToasterSetting.objects.filter(name = "DEFCONF_" + configname).count() > 0:
+ ToasterSetting.objects.filter(name = "DEFCONF_" + configname).update(value = data['config'][configname])
+ else:
+ ToasterSetting.objects.create(name = "DEFCONF_" + configname, value = data['config'][configname])
+
+
+ def handle(self, *args, **options):
+ if len(args) == 0:
+ raise CommandError("Need a path to the toasterconf.json file")
+ filepath = args[0]
+ self._import_layer_config(filepath)
+
+
+
diff --git a/bitbake/lib/toaster/bldcontrol/management/commands/runbuilds.py b/bitbake/lib/toaster/bldcontrol/management/commands/runbuilds.py
new file mode 100644
index 0000000..c3e9b74
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/management/commands/runbuilds.py
@@ -0,0 +1,153 @@
+from django.core.management.base import NoArgsCommand, CommandError
+from django.db import transaction
+from orm.models import Build, ToasterSetting, LogMessage, Target
+from bldcontrol.bbcontroller import getBuildEnvironmentController, ShellCmdException, BuildSetupException
+from bldcontrol.models import BuildRequest, BuildEnvironment, BRError, BRVariable
+import os
+import logging
+
+logger = logging.getLogger("ToasterScheduler")
+
+class Command(NoArgsCommand):
+ args = ""
+ help = "Schedules and executes build requests as possible. Does not return (interrupt with Ctrl-C)"
+
+
+ @transaction.commit_on_success
+ def _selectBuildEnvironment(self):
+ bec = getBuildEnvironmentController(lock = BuildEnvironment.LOCK_FREE)
+ bec.be.lock = BuildEnvironment.LOCK_LOCK
+ bec.be.save()
+ return bec
+
+ @transaction.commit_on_success
+ def _selectBuildRequest(self):
+ br = BuildRequest.objects.filter(state = BuildRequest.REQ_QUEUED).order_by('pk')[0]
+ br.state = BuildRequest.REQ_INPROGRESS
+ br.save()
+ return br
+
+ def schedule(self):
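+ # scheduling flow: pick the oldest queued BuildRequest, lock a free BuildEnvironment, record the TOASTER_BRBE identification variable and trigger an asynchronous build; on any failure the request is marked failed and the environment is unlocked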
+ import traceback
+ try:
+ br = None
+ try:
+ # select the build environment and the request to build
+ br = self._selectBuildRequest()
+ except IndexError as e:
+ #logger.debug("runbuilds: No build request")
+ return
+ try:
+ bec = self._selectBuildEnvironment()
+ except IndexError as e:
+ # we could not find a BEC; postpone the BR
+ br.state = BuildRequest.REQ_QUEUED
+ br.save()
+ logger.debug("runbuilds: No build env")
+ return
+
+ logger.debug("runbuilds: starting build %s, environment %s" % (br, bec.be))
+
+ # write the build identification variable
+ BRVariable.objects.create(req = br, name="TOASTER_BRBE", value="%d:%d" % (br.pk, bec.be.pk))
+
+ # let the build request know where it is being executed
+ br.environment = bec.be
+ br.save()
+
+ # this triggers an async build
+ bec.triggerBuild(br.brbitbake_set.all(), br.brlayer_set.all(), br.brvariable_set.all(), br.brtarget_set.all())
+
+ except Exception as e:
+ logger.error("runbuilds: Error launching build %s" % e)
+ traceback.print_exc(e)
+ if "[Errno 111] Connection refused" in str(e):
+ # Connection refused, read toaster_server.log
+ errmsg = bec.readServerLogFile()
+ else:
+ errmsg = str(e)
+
+ BRError.objects.create(req = br,
+ errtype = str(type(e)),
+ errmsg = errmsg,
+ traceback = traceback.format_exc(e))
+ br.state = BuildRequest.REQ_FAILED
+ br.save()
+ bec.be.lock = BuildEnvironment.LOCK_FREE
+ bec.be.save()
+
+ def archive(self):
+ ''' archives data from the builds '''
+ artifact_storage_dir = ToasterSetting.objects.get(name="ARTIFACTS_STORAGE_DIR").value
+ for br in BuildRequest.objects.filter(state = BuildRequest.REQ_ARCHIVE):
+ # save cooker log
+ if br.build == None:
+ br.state = BuildRequest.REQ_FAILED
+ br.save()
+ continue
+ build_artifact_storage_dir = os.path.join(artifact_storage_dir, "%d" % br.build.pk)
+ try:
+ os.makedirs(build_artifact_storage_dir)
+ except OSError as ose:
+ if "File exists" in str(ose):
+ pass
+ else:
+ raise ose
+
+ file_name = os.path.join(build_artifact_storage_dir, "cooker_log.txt")
+ try:
+ with open(file_name, "w") as f:
+ f.write(br.environment.get_artifact(br.build.cooker_log_path).read())
+ except IOError:
+ os.unlink(file_name)
+
+ br.state = BuildRequest.REQ_COMPLETED
+ br.save()
+
+ def cleanup(self):
+ from django.utils import timezone
+ from datetime import timedelta
+ # environments locked for more than 30 seconds - they should be unlocked
+ BuildEnvironment.objects.filter(buildrequest__state__in=[BuildRequest.REQ_FAILED, BuildRequest.REQ_COMPLETED]).filter(lock=BuildEnvironment.LOCK_LOCK).filter(updated__lt = timezone.now() - timedelta(seconds = 30)).update(lock = BuildEnvironment.LOCK_FREE)
+
+
+ # update all Builds that failed to start
+
+ for br in BuildRequest.objects.filter(state = BuildRequest.REQ_FAILED, build__outcome = Build.IN_PROGRESS):
+ # transpose the launch errors into exception-level LogMessages on the build
+ br.build.outcome = Build.FAILED
+ for brerror in br.brerror_set.all():
+ logger.debug("Saving error %s" % brerror)
+ LogMessage.objects.create(build = br.build, level = LogMessage.EXCEPTION, message = brerror.errmsg)
+ br.build.save()
+
+ # we don't have a true build object here; hence, toasterui didn't have a chance to release the BE lock
+ br.environment.lock = BuildEnvironment.LOCK_FREE
+ br.environment.save()
+
+
+
+ # update all BuildRequests without a build created
+ for br in BuildRequest.objects.filter(build = None):
+ br.build = Build.objects.create(project = br.project, completed_on = br.updated, started_on = br.created)
+ br.build.outcome = Build.FAILED
+ try:
+ br.build.machine = br.brvariable_set.get(name='MACHINE').value
+ except BRVariable.DoesNotExist:
+ pass
+ br.save()
+ # transpose target information
+ for brtarget in br.brtarget_set.all():
+ Target.objects.create(build = br.build, target= brtarget.target)
+ # transpose the launch errors into exception-level LogMessages on the build
+ for brerror in br.brerror_set.all():
+ LogMessage.objects.create(build = br.build, level = LogMessage.EXCEPTION, message = brerror.errmsg)
+
+ br.build.save()
+ pass
+
+
+ def handle_noargs(self, **options):
+ self.cleanup()
+ self.archive()
+ self.schedule()
diff --git a/bitbake/lib/toaster/bldcontrol/migrations/0001_initial.py b/bitbake/lib/toaster/bldcontrol/migrations/0001_initial.py
new file mode 100644
index 0000000..a7e6350
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/migrations/0001_initial.py
@@ -0,0 +1,154 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding model 'BuildEnvironment'
+ db.create_table(u'bldcontrol_buildenvironment', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('address', self.gf('django.db.models.fields.CharField')(max_length=254)),
+ ('betype', self.gf('django.db.models.fields.IntegerField')()),
+ ('bbaddress', self.gf('django.db.models.fields.CharField')(max_length=254, blank=True)),
+ ('bbport', self.gf('django.db.models.fields.IntegerField')(default=-1)),
+ ('bbtoken', self.gf('django.db.models.fields.CharField')(max_length=126, blank=True)),
+ ('bbstate', self.gf('django.db.models.fields.IntegerField')(default=0)),
+ ('lock', self.gf('django.db.models.fields.IntegerField')(default=0)),
+ ('created', self.gf('django.db.models.fields.DateTimeField')(auto_now_add=True, blank=True)),
+ ('updated', self.gf('django.db.models.fields.DateTimeField')(auto_now=True, blank=True)),
+ ))
+ db.send_create_signal(u'bldcontrol', ['BuildEnvironment'])
+
+ # Adding model 'BuildRequest'
+ db.create_table(u'bldcontrol_buildrequest', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('project', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Project'])),
+ ('build', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Build'], null=True)),
+ ('state', self.gf('django.db.models.fields.IntegerField')(default=0)),
+ ('created', self.gf('django.db.models.fields.DateTimeField')(auto_now_add=True, blank=True)),
+ ('updated', self.gf('django.db.models.fields.DateTimeField')(auto_now=True, blank=True)),
+ ))
+ db.send_create_signal(u'bldcontrol', ['BuildRequest'])
+
+ # Adding model 'BRLayer'
+ db.create_table(u'bldcontrol_brlayer', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('req', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['bldcontrol.BuildRequest'])),
+ ('name', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('giturl', self.gf('django.db.models.fields.CharField')(max_length=254)),
+ ('commit', self.gf('django.db.models.fields.CharField')(max_length=254)),
+ ))
+ db.send_create_signal(u'bldcontrol', ['BRLayer'])
+
+ # Adding model 'BRVariable'
+ db.create_table(u'bldcontrol_brvariable', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('req', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['bldcontrol.BuildRequest'])),
+ ('name', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('value', self.gf('django.db.models.fields.TextField')(blank=True)),
+ ))
+ db.send_create_signal(u'bldcontrol', ['BRVariable'])
+
+ # Adding model 'BRTarget'
+ db.create_table(u'bldcontrol_brtarget', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('req', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['bldcontrol.BuildRequest'])),
+ ('target', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('task', self.gf('django.db.models.fields.CharField')(max_length=100, null=True)),
+ ))
+ db.send_create_signal(u'bldcontrol', ['BRTarget'])
+
+
+ def backwards(self, orm):
+ # Deleting model 'BuildEnvironment'
+ db.delete_table(u'bldcontrol_buildenvironment')
+
+ # Deleting model 'BuildRequest'
+ db.delete_table(u'bldcontrol_buildrequest')
+
+ # Deleting model 'BRLayer'
+ db.delete_table(u'bldcontrol_brlayer')
+
+ # Deleting model 'BRVariable'
+ db.delete_table(u'bldcontrol_brvariable')
+
+ # Deleting model 'BRTarget'
+ db.delete_table(u'bldcontrol_brtarget')
+
+
+ models = {
+ u'bldcontrol.brlayer': {
+ 'Meta': {'object_name': 'BRLayer'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"})
+ },
+ u'bldcontrol.brtarget': {
+ 'Meta': {'object_name': 'BRTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'bldcontrol.brvariable': {
+ 'Meta': {'object_name': 'BRVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'bldcontrol.buildenvironment': {
+ 'Meta': {'object_name': 'BuildEnvironment'},
+ 'address': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'bbaddress': ('django.db.models.fields.CharField', [], {'max_length': '254', 'blank': 'True'}),
+ 'bbport': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'bbstate': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'bbtoken': ('django.db.models.fields.CharField', [], {'max_length': '126', 'blank': 'True'}),
+ 'betype': ('django.db.models.fields.IntegerField', [], {}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'lock': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'bldcontrol.buildrequest': {
+ 'Meta': {'object_name': 'BuildRequest'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']", 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'state': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ }
+ }
+
+ complete_apps = ['bldcontrol']
diff --git a/bitbake/lib/toaster/bldcontrol/migrations/0002_auto__add_field_buildenvironment_sourcedir__add_field_buildenvironment.py b/bitbake/lib/toaster/bldcontrol/migrations/0002_auto__add_field_buildenvironment_sourcedir__add_field_buildenvironment.py
new file mode 100644
index 0000000..f522a50
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/migrations/0002_auto__add_field_buildenvironment_sourcedir__add_field_buildenvironment.py
@@ -0,0 +1,106 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding field 'BuildEnvironment.sourcedir'
+ db.add_column(u'bldcontrol_buildenvironment', 'sourcedir',
+ self.gf('django.db.models.fields.CharField')(default='', max_length=512, blank=True),
+ keep_default=False)
+
+ # Adding field 'BuildEnvironment.builddir'
+ db.add_column(u'bldcontrol_buildenvironment', 'builddir',
+ self.gf('django.db.models.fields.CharField')(default='', max_length=512, blank=True),
+ keep_default=False)
+
+
+ def backwards(self, orm):
+ # Deleting field 'BuildEnvironment.sourcedir'
+ db.delete_column(u'bldcontrol_buildenvironment', 'sourcedir')
+
+ # Deleting field 'BuildEnvironment.builddir'
+ db.delete_column(u'bldcontrol_buildenvironment', 'builddir')
+
+
+ models = {
+ u'bldcontrol.brlayer': {
+ 'Meta': {'object_name': 'BRLayer'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"})
+ },
+ u'bldcontrol.brtarget': {
+ 'Meta': {'object_name': 'BRTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'bldcontrol.brvariable': {
+ 'Meta': {'object_name': 'BRVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'bldcontrol.buildenvironment': {
+ 'Meta': {'object_name': 'BuildEnvironment'},
+ 'address': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'bbaddress': ('django.db.models.fields.CharField', [], {'max_length': '254', 'blank': 'True'}),
+ 'bbport': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'bbstate': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'bbtoken': ('django.db.models.fields.CharField', [], {'max_length': '126', 'blank': 'True'}),
+ 'betype': ('django.db.models.fields.IntegerField', [], {}),
+ 'builddir': ('django.db.models.fields.CharField', [], {'max_length': '512', 'blank': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'lock': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'sourcedir': ('django.db.models.fields.CharField', [], {'max_length': '512', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'bldcontrol.buildrequest': {
+ 'Meta': {'object_name': 'BuildRequest'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']", 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'state': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ }
+ }
+
+ complete_apps = ['bldcontrol']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/bldcontrol/migrations/0003_auto__add_field_brlayer_dirpath.py b/bitbake/lib/toaster/bldcontrol/migrations/0003_auto__add_field_brlayer_dirpath.py
new file mode 100644
index 0000000..b9ba838
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/migrations/0003_auto__add_field_brlayer_dirpath.py
@@ -0,0 +1,99 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding field 'BRLayer.dirpath'
+ db.add_column(u'bldcontrol_brlayer', 'dirpath',
+ self.gf('django.db.models.fields.CharField')(default='', max_length=254),
+ keep_default=False)
+
+
+ def backwards(self, orm):
+ # Deleting field 'BRLayer.dirpath'
+ db.delete_column(u'bldcontrol_brlayer', 'dirpath')
+
+
+ models = {
+ u'bldcontrol.brlayer': {
+ 'Meta': {'object_name': 'BRLayer'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"})
+ },
+ u'bldcontrol.brtarget': {
+ 'Meta': {'object_name': 'BRTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'bldcontrol.brvariable': {
+ 'Meta': {'object_name': 'BRVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'bldcontrol.buildenvironment': {
+ 'Meta': {'object_name': 'BuildEnvironment'},
+ 'address': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'bbaddress': ('django.db.models.fields.CharField', [], {'max_length': '254', 'blank': 'True'}),
+ 'bbport': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'bbstate': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'bbtoken': ('django.db.models.fields.CharField', [], {'max_length': '126', 'blank': 'True'}),
+ 'betype': ('django.db.models.fields.IntegerField', [], {}),
+ 'builddir': ('django.db.models.fields.CharField', [], {'max_length': '512', 'blank': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'lock': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'sourcedir': ('django.db.models.fields.CharField', [], {'max_length': '512', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'bldcontrol.buildrequest': {
+ 'Meta': {'object_name': 'BuildRequest'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']", 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'state': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ }
+ }
+
+ complete_apps = ['bldcontrol']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/bldcontrol/migrations/0004_loadinitialdata.py b/bitbake/lib/toaster/bldcontrol/migrations/0004_loadinitialdata.py
new file mode 100644
index 0000000..d908578
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/migrations/0004_loadinitialdata.py
@@ -0,0 +1,104 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import DataMigration
+from django.db import models
+
+class Migration(DataMigration):
+
+ def forwards(self, orm):
+ "Write your forwards methods here."
+ # Note: Don't use "from appname.models import ModelName".
+ # Use orm.ModelName to refer to models in this application,
+ # and orm['appname.ModelName'] for models in other applications.
+ try:
+ orm.BuildEnvironment.objects.get(pk = 1)
+ except:
+ from django.utils import timezone
+ orm.BuildEnvironment.objects.create(pk = 1,
+ created = timezone.now(),
+ updated = timezone.now(),
+ betype = 0)
+
+ def backwards(self, orm):
+ "Write your backwards methods here."
+
+ models = {
+ u'bldcontrol.brlayer': {
+ 'Meta': {'object_name': 'BRLayer'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"})
+ },
+ u'bldcontrol.brtarget': {
+ 'Meta': {'object_name': 'BRTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'bldcontrol.brvariable': {
+ 'Meta': {'object_name': 'BRVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'bldcontrol.buildenvironment': {
+ 'Meta': {'object_name': 'BuildEnvironment'},
+ 'address': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'bbaddress': ('django.db.models.fields.CharField', [], {'max_length': '254', 'blank': 'True'}),
+ 'bbport': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'bbstate': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'bbtoken': ('django.db.models.fields.CharField', [], {'max_length': '126', 'blank': 'True'}),
+ 'betype': ('django.db.models.fields.IntegerField', [], {}),
+ 'builddir': ('django.db.models.fields.CharField', [], {'max_length': '512', 'blank': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'lock': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'sourcedir': ('django.db.models.fields.CharField', [], {'max_length': '512', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'bldcontrol.buildrequest': {
+ 'Meta': {'object_name': 'BuildRequest'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']", 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'state': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ }
+ }
+
+ complete_apps = ['bldcontrol']
+ symmetrical = True
diff --git a/bitbake/lib/toaster/bldcontrol/migrations/0005_auto__add_brerror.py b/bitbake/lib/toaster/bldcontrol/migrations/0005_auto__add_brerror.py
new file mode 100644
index 0000000..98aeb41
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/migrations/0005_auto__add_brerror.py
@@ -0,0 +1,112 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding model 'BRError'
+ db.create_table(u'bldcontrol_brerror', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('req', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['bldcontrol.BuildRequest'])),
+ ('errtype', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('errmsg', self.gf('django.db.models.fields.TextField')()),
+ ('traceback', self.gf('django.db.models.fields.TextField')()),
+ ))
+ db.send_create_signal(u'bldcontrol', ['BRError'])
+
+
+ def backwards(self, orm):
+ # Deleting model 'BRError'
+ db.delete_table(u'bldcontrol_brerror')
+
+
+ models = {
+ u'bldcontrol.brerror': {
+ 'Meta': {'object_name': 'BRError'},
+ 'errmsg': ('django.db.models.fields.TextField', [], {}),
+ 'errtype': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'traceback': ('django.db.models.fields.TextField', [], {})
+ },
+ u'bldcontrol.brlayer': {
+ 'Meta': {'object_name': 'BRLayer'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"})
+ },
+ u'bldcontrol.brtarget': {
+ 'Meta': {'object_name': 'BRTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'bldcontrol.brvariable': {
+ 'Meta': {'object_name': 'BRVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'bldcontrol.buildenvironment': {
+ 'Meta': {'object_name': 'BuildEnvironment'},
+ 'address': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'bbaddress': ('django.db.models.fields.CharField', [], {'max_length': '254', 'blank': 'True'}),
+ 'bbport': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'bbstate': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'bbtoken': ('django.db.models.fields.CharField', [], {'max_length': '126', 'blank': 'True'}),
+ 'betype': ('django.db.models.fields.IntegerField', [], {}),
+ 'builddir': ('django.db.models.fields.CharField', [], {'max_length': '512', 'blank': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'lock': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'sourcedir': ('django.db.models.fields.CharField', [], {'max_length': '512', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'bldcontrol.buildrequest': {
+ 'Meta': {'object_name': 'BuildRequest'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']", 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'state': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ }
+ }
+
+ complete_apps = ['bldcontrol']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/bldcontrol/migrations/0006_auto__add_brbitbake.py b/bitbake/lib/toaster/bldcontrol/migrations/0006_auto__add_brbitbake.py
new file mode 100644
index 0000000..74388f8
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/migrations/0006_auto__add_brbitbake.py
@@ -0,0 +1,128 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding model 'BRBitbake'
+ db.create_table(u'bldcontrol_brbitbake', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('req', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['bldcontrol.BuildRequest'], unique=True)),
+ ('giturl', self.gf('django.db.models.fields.CharField')(max_length=254)),
+ ('commit', self.gf('django.db.models.fields.CharField')(max_length=254)),
+ ('dirpath', self.gf('django.db.models.fields.CharField')(max_length=254)),
+ ))
+ db.send_create_signal(u'bldcontrol', ['BRBitbake'])
+
+
+ def backwards(self, orm):
+ # Deleting model 'BRBitbake'
+ db.delete_table(u'bldcontrol_brbitbake')
+
+
+ models = {
+ u'bldcontrol.brbitbake': {
+ 'Meta': {'object_name': 'BRBitbake'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']", 'unique': 'True'})
+ },
+ u'bldcontrol.brerror': {
+ 'Meta': {'object_name': 'BRError'},
+ 'errmsg': ('django.db.models.fields.TextField', [], {}),
+ 'errtype': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'traceback': ('django.db.models.fields.TextField', [], {})
+ },
+ u'bldcontrol.brlayer': {
+ 'Meta': {'object_name': 'BRLayer'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"})
+ },
+ u'bldcontrol.brtarget': {
+ 'Meta': {'object_name': 'BRTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'bldcontrol.brvariable': {
+ 'Meta': {'object_name': 'BRVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'bldcontrol.buildenvironment': {
+ 'Meta': {'object_name': 'BuildEnvironment'},
+ 'address': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'bbaddress': ('django.db.models.fields.CharField', [], {'max_length': '254', 'blank': 'True'}),
+ 'bbport': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'bbstate': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'bbtoken': ('django.db.models.fields.CharField', [], {'max_length': '126', 'blank': 'True'}),
+ 'betype': ('django.db.models.fields.IntegerField', [], {}),
+ 'builddir': ('django.db.models.fields.CharField', [], {'max_length': '512', 'blank': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'lock': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'sourcedir': ('django.db.models.fields.CharField', [], {'max_length': '512', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'bldcontrol.buildrequest': {
+ 'Meta': {'object_name': 'BuildRequest'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']", 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'state': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ }
+ }
+
+ complete_apps = ['bldcontrol']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/bldcontrol/migrations/0007_auto__add_field_buildrequest_environment__chg_field_buildrequest_build.py b/bitbake/lib/toaster/bldcontrol/migrations/0007_auto__add_field_buildrequest_environment__chg_field_buildrequest_build.py
new file mode 100644
index 0000000..70677a2
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/migrations/0007_auto__add_field_buildrequest_environment__chg_field_buildrequest_build.py
@@ -0,0 +1,145 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding field 'BuildRequest.environment'
+ db.add_column(u'bldcontrol_buildrequest', 'environment',
+ self.gf('django.db.models.fields.related.ForeignKey')(to=orm['bldcontrol.BuildEnvironment'], null=True),
+ keep_default=False)
+
+
+ # Changing field 'BuildRequest.build'
+ db.alter_column(u'bldcontrol_buildrequest', 'build_id', self.gf('django.db.models.fields.related.OneToOneField')(to=orm['orm.Build'], unique=True, null=True))
+ # Adding unique constraint on 'BuildRequest', fields ['build']
+ db.create_unique(u'bldcontrol_buildrequest', ['build_id'])
+
+
+ def backwards(self, orm):
+ # Removing unique constraint on 'BuildRequest', fields ['build']
+ db.delete_unique(u'bldcontrol_buildrequest', ['build_id'])
+
+ # Deleting field 'BuildRequest.environment'
+ db.delete_column(u'bldcontrol_buildrequest', 'environment_id')
+
+
+ # Changing field 'BuildRequest.build'
+ db.alter_column(u'bldcontrol_buildrequest', 'build_id', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Build'], null=True))
+
+ models = {
+ u'bldcontrol.brbitbake': {
+ 'Meta': {'object_name': 'BRBitbake'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']", 'unique': 'True'})
+ },
+ u'bldcontrol.brerror': {
+ 'Meta': {'object_name': 'BRError'},
+ 'errmsg': ('django.db.models.fields.TextField', [], {}),
+ 'errtype': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'traceback': ('django.db.models.fields.TextField', [], {})
+ },
+ u'bldcontrol.brlayer': {
+ 'Meta': {'object_name': 'BRLayer'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"})
+ },
+ u'bldcontrol.brtarget': {
+ 'Meta': {'object_name': 'BRTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'bldcontrol.brvariable': {
+ 'Meta': {'object_name': 'BRVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'bldcontrol.buildenvironment': {
+ 'Meta': {'object_name': 'BuildEnvironment'},
+ 'address': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'bbaddress': ('django.db.models.fields.CharField', [], {'max_length': '254', 'blank': 'True'}),
+ 'bbport': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'bbstate': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'bbtoken': ('django.db.models.fields.CharField', [], {'max_length': '126', 'blank': 'True'}),
+ 'betype': ('django.db.models.fields.IntegerField', [], {}),
+ 'builddir': ('django.db.models.fields.CharField', [], {'max_length': '512', 'blank': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'lock': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'sourcedir': ('django.db.models.fields.CharField', [], {'max_length': '512', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'bldcontrol.buildrequest': {
+ 'Meta': {'object_name': 'BuildRequest'},
+ 'build': ('django.db.models.fields.related.OneToOneField', [], {'to': u"orm['orm.Build']", 'unique': 'True', 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ 'environment': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildEnvironment']", 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'state': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ }
+ }
+
+ complete_apps = ['bldcontrol']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/bldcontrol/migrations/0008_brarchive.py b/bitbake/lib/toaster/bldcontrol/migrations/0008_brarchive.py
new file mode 100644
index 0000000..f546960
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/migrations/0008_brarchive.py
@@ -0,0 +1,138 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import DataMigration
+from django.db import models
+
+class Migration(DataMigration):
+ # state ids duplicated here because model constants cannot be imported from BuildRequest inside a migration
+
+ def forwards(self, orm):
+ REQ_COMPLETED = 3
+ REQ_ARCHIVE = 6
+ "Write your forwards methods here."
+ # Note: Don't use "from appname.models import ModelName".
+ # Use orm.ModelName to refer to models in this application,
+ # and orm['appname.ModelName'] for models in other applications.
+ orm.BuildRequest.objects.filter(state=REQ_COMPLETED).update(state=REQ_ARCHIVE)
+
+ def backwards(self, orm):
+ REQ_COMPLETED = 3
+ REQ_ARCHIVE = 6
+ "Write your backwards methods here."
+ orm.BuildRequest.objects.filter(state=REQ_ARCHIVE).update(state=REQ_COMPLETED)
+
+ models = {
+ u'bldcontrol.brbitbake': {
+ 'Meta': {'object_name': 'BRBitbake'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']", 'unique': 'True'})
+ },
+ u'bldcontrol.brerror': {
+ 'Meta': {'object_name': 'BRError'},
+ 'errmsg': ('django.db.models.fields.TextField', [], {}),
+ 'errtype': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'traceback': ('django.db.models.fields.TextField', [], {})
+ },
+ u'bldcontrol.brlayer': {
+ 'Meta': {'object_name': 'BRLayer'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"})
+ },
+ u'bldcontrol.brtarget': {
+ 'Meta': {'object_name': 'BRTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'bldcontrol.brvariable': {
+ 'Meta': {'object_name': 'BRVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'req': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildRequest']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'bldcontrol.buildenvironment': {
+ 'Meta': {'object_name': 'BuildEnvironment'},
+ 'address': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'bbaddress': ('django.db.models.fields.CharField', [], {'max_length': '254', 'blank': 'True'}),
+ 'bbport': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'bbstate': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'bbtoken': ('django.db.models.fields.CharField', [], {'max_length': '126', 'blank': 'True'}),
+ 'betype': ('django.db.models.fields.IntegerField', [], {}),
+ 'builddir': ('django.db.models.fields.CharField', [], {'max_length': '512', 'blank': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'lock': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'sourcedir': ('django.db.models.fields.CharField', [], {'max_length': '512', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'bldcontrol.buildrequest': {
+ 'Meta': {'object_name': 'BuildRequest'},
+ 'build': ('django.db.models.fields.related.OneToOneField', [], {'to': u"orm['orm.Build']", 'unique': 'True', 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ 'environment': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['bldcontrol.BuildEnvironment']", 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'state': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '50'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'helptext': ('django.db.models.fields.TextField', [], {'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ }
+ }
+
+ complete_apps = ['bldcontrol']
+ symmetrical = True
diff --git a/bitbake/lib/toaster/bldcontrol/migrations/__init__.py b/bitbake/lib/toaster/bldcontrol/migrations/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/migrations/__init__.py
diff --git a/bitbake/lib/toaster/bldcontrol/models.py b/bitbake/lib/toaster/bldcontrol/models.py
new file mode 100644
index 0000000..b61de58
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/models.py
@@ -0,0 +1,164 @@
+from django.db import models
+from django.core.validators import MaxValueValidator, MinValueValidator
+from orm.models import Project, ProjectLayer, ProjectVariable, ProjectTarget, Build
+
+# a BuildEnvironment is the equivalent of the "build/" directory on the localhost
+class BuildEnvironment(models.Model):
+ SERVER_STOPPED = 0
+ SERVER_STARTED = 1
+ SERVER_STATE = (
+ (SERVER_STOPPED, "stopped"),
+ (SERVER_STARTED, "started"),
+ )
+
+ TYPE_LOCAL = 0
+ TYPE_SSH = 1
+ TYPE = (
+ (TYPE_LOCAL, "local"),
+ (TYPE_SSH, "ssh"),
+ )
+
+ LOCK_FREE = 0
+ LOCK_LOCK = 1
+ LOCK_RUNNING = 2
+ LOCK_STATE = (
+ (LOCK_FREE, "free"),
+ (LOCK_LOCK, "lock"),
+ (LOCK_RUNNING, "running"),
+ )
+
+ address = models.CharField(max_length = 254)
+ betype = models.IntegerField(choices = TYPE)
+ bbaddress = models.CharField(max_length = 254, blank = True)
+ bbport = models.IntegerField(default = -1)
+ bbtoken = models.CharField(max_length = 126, blank = True)
+ bbstate = models.IntegerField(choices = SERVER_STATE, default = SERVER_STOPPED)
+ sourcedir = models.CharField(max_length = 512, blank = True)
+ builddir = models.CharField(max_length = 512, blank = True)
+ lock = models.IntegerField(choices = LOCK_STATE, default = LOCK_FREE)
+ created = models.DateTimeField(auto_now_add = True)
+ updated = models.DateTimeField(auto_now = True)
+
+
+ def get_artifact_type(self, path):
+ if self.betype == BuildEnvironment.TYPE_LOCAL:
+ try:
+ import magic
+
+ # fair warning: this is a mess; there are multiple competing and incompatible
+ # magic modules floating around, so we try some of the most common combinations
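+ # each variant is probed in turn; the first API that does not raise an
+ # AttributeError wins, and we fall back to a generic octet-stream type
+ # below if no usable magic module is available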
+
+ try: # we try ubuntu's python-magic 5.4
+ m = magic.open(magic.MAGIC_MIME_TYPE)
+ m.load()
+ return m.file(path)
+ except AttributeError:
+ pass
+
+ try: # we try python-magic 0.4.6
+ m = magic.Magic(magic.MAGIC_MIME)
+ return m.from_file(path)
+ except AttributeError:
+ pass
+
+ try: # we try pip filemagic 1.6
+ m = magic.Magic(flags=magic.MAGIC_MIME_TYPE)
+ return m.id_filename(path)
+ except AttributeError:
+ pass
+
+ return "binary/octet-stream"
+ except ImportError:
+ return "binary/octet-stream"
+ raise Exception("FIXME: artifact type not implemented for build environment type %s" % self.get_betype_display())
+
+
+ def get_artifact(self, path):
+ if self.betype == BuildEnvironment.TYPE_LOCAL:
+ return open(path, "r")
+ raise Exception("FIXME: artifact download not implemented for build environment type %s" % self.get_betype_display())
+
+ def has_artifact(self, path):
+ import os
+ if self.betype == BuildEnvironment.TYPE_LOCAL:
+ return os.path.exists(path)
+ raise Exception("FIXME: has artifact not implemented for build environment type %s" % self.get_betype_display())
+
+# a BuildRequest is a request that the scheduler will build using a BuildEnvironment
+# the build request queue is the table itself, ordered by state
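+# (illustrative sketch, not part of the model: a scheduler can dequeue the oldest
+# queued request with a query along the lines of
+#   BuildRequest.objects.filter(state=BuildRequest.REQ_QUEUED).order_by('created')[0]
+# )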
+
+class BuildRequest(models.Model):
+ REQ_CREATED = 0
+ REQ_QUEUED = 1
+ REQ_INPROGRESS = 2
+ REQ_COMPLETED = 3
+ REQ_FAILED = 4
+ REQ_DELETED = 5
+ REQ_ARCHIVE = 6
+
+ REQUEST_STATE = (
+ (REQ_CREATED, "created"),
+ (REQ_QUEUED, "queued"),
+ (REQ_INPROGRESS, "in progress"),
+ (REQ_COMPLETED, "completed"),
+ (REQ_FAILED, "failed"),
+ (REQ_DELETED, "deleted"),
+ (REQ_ARCHIVE, "archive"),
+ )
+
+ search_allowed_fields = ("brtarget__target", "build__project__name")
+
+ project = models.ForeignKey(Project)
+ build = models.OneToOneField(Build, null = True) # TODO: toasterui should set this when Build is created
+ environment = models.ForeignKey(BuildEnvironment, null = True)
+ state = models.IntegerField(choices = REQUEST_STATE, default = REQ_CREATED)
+ created = models.DateTimeField(auto_now_add = True)
+ updated = models.DateTimeField(auto_now = True)
+
+ def get_duration(self):
+ return (self.updated - self.created).total_seconds()
+
+ def get_sorted_target_list(self):
+ tgts = self.brtarget_set.order_by('target')
+ return tgts
+
+ def get_machine(self):
+ return self.brvariable_set.get(name="MACHINE").value
+
+ def __str__(self):
+ return "%s %s" % (self.project, self.get_state_display())
+
+# These tables specify the settings for running an actual build.
+# They MUST be kept in sync with the tables in orm.models.Project*
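+# (roughly: BRLayer mirrors ProjectLayer, BRVariable mirrors ProjectVariable,
+# BRTarget mirrors ProjectTarget)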
+
+class BRLayer(models.Model):
+ req = models.ForeignKey(BuildRequest)
+ name = models.CharField(max_length = 100)
+ giturl = models.CharField(max_length = 254)
+ commit = models.CharField(max_length = 254)
+ dirpath = models.CharField(max_length = 254)
+
+class BRBitbake(models.Model):
+ req = models.ForeignKey(BuildRequest, unique = True) # only one bitbake for a request
+ giturl = models.CharField(max_length = 254)
+ commit = models.CharField(max_length = 254)
+ dirpath = models.CharField(max_length = 254)
+
+class BRVariable(models.Model):
+ req = models.ForeignKey(BuildRequest)
+ name = models.CharField(max_length=100)
+ value = models.TextField(blank = True)
+
+class BRTarget(models.Model):
+ req = models.ForeignKey(BuildRequest)
+ target = models.CharField(max_length=100)
+ task = models.CharField(max_length=100, null=True)
+
+class BRError(models.Model):
+ req = models.ForeignKey(BuildRequest)
+ errtype = models.CharField(max_length=100)
+ errmsg = models.TextField()
+ traceback = models.TextField()
+
+ def __str__(self):
+ return "%s (%s)" % (self.errmsg, self.req)
diff --git a/bitbake/lib/toaster/bldcontrol/sshbecontroller.py b/bitbake/lib/toaster/bldcontrol/sshbecontroller.py
new file mode 100644
index 0000000..8ef434b
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/sshbecontroller.py
@@ -0,0 +1,179 @@
+#
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2014 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+
+import sys
+import os
+import re
+from django.db import transaction
+from django.db.models import Q
+from bldcontrol.models import BuildEnvironment, BRLayer, BRVariable, BRTarget, BRBitbake
+import subprocess
+
+from toastermain import settings
+
+from bbcontroller import BuildEnvironmentController, ShellCmdException, BuildSetupException
+
+class NotImplementedException(Exception):
+ pass
+
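+# helper returning the parent directory portion of a path (a minimal stand-in for os.path.dirname)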
+def DN(path):
+ return "/".join(path.split("/")[0:-1])
+
+class SSHBEController(BuildEnvironmentController):
+ """ Implementation of the BuildEnvironmentController for the localhost;
+ this controller manages the default build directory,
+ the server setup and system start and stop for the localhost-type build environment
+
+ """
+
+ def __init__(self, be):
+ super(SSHBEController, self).__init__(be)
+ self.dburl = settings.getDATABASE_URL()
+ self.pokydirname = None
+ self.islayerset = False
+
+ def _shellcmd(self, command, cwd = None):
+ if cwd is None:
+ cwd = self.be.sourcedir
+
+ p = subprocess.Popen("ssh %s 'cd %s && %s'" % (self.be.address, cwd, command), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+ (out,err) = p.communicate()
+ if p.returncode:
+ if len(err) == 0:
+ err = "command: %s \n%s" % (command, out)
+ else:
+ err = "command: %s \n%s" % (command, err)
+ raise ShellCmdException(err)
+ else:
+ return out.strip()
+
+ def _pathexists(self, path):
+ try:
+ self._shellcmd("test -e \"%s\"" % path)
+ return True
+ except ShellCmdException as e:
+ return False
+
+ def _pathcreate(self, path):
+ self._shellcmd("mkdir -p \"%s\"" % path)
+
+ def _setupBE(self):
+ assert self.pokydirname and self._pathexists(self.pokydirname)
+ self._pathcreate(self.be.builddir)
+ self._shellcmd("bash -c \"source %s/oe-init-build-env %s\"" % (self.pokydirname, self.be.builddir))
+
+ def startBBServer(self, brbe):
+ assert self.pokydirname and self._pathexists(self.pokydirname)
+ assert self.islayerset
+ cmd = self._shellcmd("bash -c \"source %s/oe-init-build-env %s && DATABASE_URL=%s source toaster start noweb brbe=%s\"" % (self.pokydirname, self.be.builddir, self.dburl, brbe))
+
+ port = "-1"
+ for i in cmd.split("\n"):
+ if i.startswith("Bitbake server address"):
+ port = i.split(" ")[-1]
+ print "Found bitbake server port ", port
+
+
+ assert self.be.sourcedir and self._pathexists(self.be.builddir)
+ self.be.bbaddress = self.be.address.split("@")[-1]
+ self.be.bbport = port
+ self.be.bbstate = BuildEnvironment.SERVER_STARTED
+ self.be.save()
+
+ def stopBBServer(self):
+ assert self.pokydirname and self._pathexists(self.pokydirname)
+ assert self.islayerset
+ print self._shellcmd("bash -c \"source %s/oe-init-build-env %s && %s source toaster stop\"" %
+ (self.pokydirname, self.be.builddir, (lambda: "" if self.be.bbtoken is None else "BBTOKEN=%s" % self.be.bbtoken)()))
+ self.be.bbstate = BuildEnvironment.SERVER_STOPPED
+ self.be.save()
+ print "Stopped server"
+
+
+ def _copyFile(self, filepath1, filepath2):
+ p = subprocess.Popen("scp '%s' '%s'" % (filepath1, filepath2), stdout=subprocess.PIPE, stderr = subprocess.PIPE, shell=True)
+ (out, err) = p.communicate()
+ if p.returncode:
+ raise ShellCmdException(err)
+
+ def pullFile(self, local_filename, remote_filename):
+ self._copyFile(local_filename, "%s:%s" % (self.be.address, remote_filename))
+
+ def pushFile(self, local_filename, remote_filename):
+ _copyFile("%s:%s" % (self.be.address, remote_filename), local_filename)
+
+ def setLayers(self, bitbakes, layers):
+ """ a word of attention: by convention, the first layer for any build will be poky! """
+
+ assert self.be.sourcedir is not None
+ assert len(bitbakes) == 1
+ # set layers in the layersource
+
+
+ raise NotImplementedException("Not implemented: SSH setLayers")
+ # 3. configure the build environment, so we have a conf/bblayers.conf
+ assert self.pokydirname is not None
+ self._setupBE()
+
+ # 4. update the bblayers.conf
+ bblayerconf = os.path.join(self.be.builddir, "conf/bblayers.conf")
+ if not self._pathexists(bblayerconf):
+ raise BuildSetupException("BE is not consistent: bblayers.conf file missing at %s" % bblayerconf)
+
+ import uuid
+ local_bblayerconf = "/tmp/" + str(uuid.uuid4()) + "-bblayer.conf"
+
+ self.pullFile(bblayerconf, local_bblayerconf)
+
+ BuildEnvironmentController._updateBBLayers(local_bblayerconf, layerlist)
+ self.pushFile(local_bblayerconf, bblayerconf)
+
+ os.unlink(local_bblayerconf)
+
+ self.islayerset = True
+ return True
+
+ def release(self):
+ assert self.be.sourcedir and self._pathexists(self.be.builddir)
+ import shutil
+ shutil.rmtree(os.path.join(self.be.sourcedir, "build"))
+ assert not self._pathexists(self.be.builddir)
+
+ def triggerBuild(self, bitbake, layers, variables, targets):
+ # set up the build environment with the needed layers
+ self.setLayers(bitbake, layers)
+ self.writeConfFile("conf/toaster-pre.conf", )
+ self.writeConfFile("conf/toaster.conf", raw = "INHERIT+=\"toaster buildhistory\"")
+
+ # get the bb server running with the build req id and build env id
+ bbctrl = self.getBBController()
+
+ # trigger the build command
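+ # use the last non-empty per-target task as the task for the whole build;
+ # when no target specifies a task, fall through to the default task (None)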
+ task = reduce(lambda x, y: x if len(y)== 0 else y, map(lambda y: y.task, targets))
+ if len(task) == 0:
+ task = None
+
+ bbctrl.build(list(map(lambda x:x.target, targets)), task)
+
+ logger.debug("localhostbecontroller: Build launched, exiting. Follow build logs at %s/toaster_ui.log" % self.be.builddir)
+
+ # disconnect from the server
+ bbctrl.disconnect()
diff --git a/bitbake/lib/toaster/bldcontrol/tests.py b/bitbake/lib/toaster/bldcontrol/tests.py
new file mode 100644
index 0000000..5dbc77f
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/tests.py
@@ -0,0 +1,190 @@
+"""
+This file demonstrates writing tests using the unittest module. These will pass
+when you run "manage.py test".
+
+Replace this with more appropriate tests for your application.
+"""
+
+from django.test import TestCase
+
+from bldcontrol.bbcontroller import BitbakeController, BuildSetupException
+from bldcontrol.localhostbecontroller import LocalhostBEController
+from bldcontrol.sshbecontroller import SSHBEController
+from bldcontrol.models import BuildEnvironment, BuildRequest
+from bldcontrol.management.commands.runbuilds import Command
+
+import socket
+import subprocess
+import os
+
+# standard poky data hardcoded for testing
+BITBAKE_LAYERS = [type('bitbake_info', (object,), { "giturl": "git://git.yoctoproject.org/poky.git", "dirpath": "", "commit": "HEAD"})]
+POKY_LAYERS = [
+ type('poky_info', (object,), { "name": "meta", "giturl": "git://git.yoctoproject.org/poky.git", "dirpath": "meta", "commit": "HEAD"}),
+ type('poky_info', (object,), { "name": "meta-yocto", "giturl": "git://git.yoctoproject.org/poky.git", "dirpath": "meta-yocto", "commit": "HEAD"}),
+ type('poky_info', (object,), { "name": "meta-yocto-bsp", "giturl": "git://git.yoctoproject.org/poky.git", "dirpath": "meta-yocto-bsp", "commit": "HEAD"}),
+ ]
+
+
+
+# we have an abstract test class designed to ensure that the controllers use a single interface
+# specific controller tests only need to override the _getBuildEnvironment() and _getBEController() methods
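+#
+# for example, a hypothetical new controller would only need a test class along the lines of:
+#
+#   class MyBEControllerTests(TestCase, BEControllerTests):
+#       def _getBuildEnvironment(self):
+#           return BuildEnvironment.objects.create(betype = BuildEnvironment.TYPE_LOCAL,
+#                                                  address = test_address,
+#                                                  sourcedir = test_sourcedir,
+#                                                  builddir = test_builddir)
+#       def _getBEController(self, obe):
+#           return MyBEController(obe)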
+
+test_sourcedir = os.getenv("TTS_SOURCE_DIR")
+test_builddir = os.getenv("TTS_BUILD_DIR")
+test_address = os.getenv("TTS_TEST_ADDRESS", "localhost")
+
+if test_sourcedir is None or test_builddir is None or test_address is None:
+ raise Exception("Please set TTS_SOURCE_DIR, TTS_BUILD_DIR and TTS_TEST_ADDRESS")
+
+# The bb server will expect a toaster-pre.conf file to exist. If it doesn't exist then we make
+# an empty one here.
+open(test_builddir + 'conf/toaster-pre.conf', 'a').close()
+
+class BEControllerTests(object):
+
+ def _serverForceStop(self, bc):
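+ # kill any bitbake server still listening on port 8200 on the test machine;
+ # netstat reports the owning pid, which is extracted and passed to kill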
+ err = bc._shellcmd("netstat -tapn 2>/dev/null | grep 8200 | awk '{print $7}' | sort -fu | cut -d \"/\" -f 1 | grep -v -- - | tee /dev/fd/2 | xargs -r kill")
+ self.assertTrue(err == '', "bitbake server pid %s not stopped" % err)
+
+ def test_serverStartAndStop(self):
+ from bldcontrol.sshbecontroller import NotImplementedException
+ obe = self._getBuildEnvironment()
+ bc = self._getBEController(obe)
+ try:
+ # setting layers, skip any layer info
+ bc.setLayers(BITBAKE_LAYERS, POKY_LAYERS)
+ except NotImplementedException, e:
+ print "Test skipped due to command not implemented yet"
+ return True
+ # We are ok with the exception as we're handling the git already exists
+ except BuildSetupException:
+ pass
+
+ bc.pokydirname = test_sourcedir
+ bc.islayerset = True
+
+ hostname = test_address.split("@")[-1]
+
+ # test start server and stop
+ bc.startBBServer()
+
+ self.assertFalse(socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect_ex((hostname, int(bc.be.bbport))), "Server not answering")
+
+ bc.stopBBServer()
+ self.assertTrue(socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect_ex((hostname, int(bc.be.bbport))), "Server not stopped")
+
+ self._serverForceStop(bc)
+
+ def test_getBBController(self):
+ from bldcontrol.sshbecontroller import NotImplementedException
+ obe = self._getBuildEnvironment()
+ bc = self._getBEController(obe)
+ layerSet = False
+ try:
+ # setting layers, skip any layer info
+ layerSet = bc.setLayers(BITBAKE_LAYERS, POKY_LAYERS)
+ except NotImplementedException:
+ print "Test skipped due to command not implemented yet"
+ return True
+ # We are ok with the exception as we're handling the git already exists
+ except BuildSetupException:
+ pass
+
+ bc.pokydirname = test_sourcedir
+ bc.islayerset = True
+
+ bbc = bc.getBBController()
+ self.assertTrue(isinstance(bbc, BitbakeController))
+ bc.stopBBServer()
+
+ self._serverForceStop(bc)
+
+class LocalhostBEControllerTests(TestCase, BEControllerTests):
+ def __init__(self, *args):
+ super(LocalhostBEControllerTests, self).__init__(*args)
+
+
+ def _getBuildEnvironment(self):
+ return BuildEnvironment.objects.create(
+ lock = BuildEnvironment.LOCK_FREE,
+ betype = BuildEnvironment.TYPE_LOCAL,
+ address = test_address,
+ sourcedir = test_sourcedir,
+ builddir = test_builddir )
+
+ def _getBEController(self, obe):
+ return LocalhostBEController(obe)
+
+class SSHBEControllerTests(TestCase, BEControllerTests):
+ def __init__(self, *args):
+ super(SSHBEControllerTests, self).__init__(*args)
+
+ def _getBuildEnvironment(self):
+ return BuildEnvironment.objects.create(
+ lock = BuildEnvironment.LOCK_FREE,
+ betype = BuildEnvironment.TYPE_SSH,
+ address = test_address,
+ sourcedir = test_sourcedir,
+ builddir = test_builddir )
+
+ def _getBEController(self, obe):
+ return SSHBEController(obe)
+
+ def test_pathExists(self):
+ obe = BuildEnvironment.objects.create(betype = BuildEnvironment.TYPE_SSH, address= test_address)
+ sbc = SSHBEController(obe)
+ self.assertTrue(sbc._pathexists("/"))
+ self.assertFalse(sbc._pathexists("/.deadbeef"))
+ self.assertTrue(sbc._pathexists(sbc._shellcmd("pwd")))
+
+
+class RunBuildsCommandTests(TestCase):
+ def test_bec_select(self):
+ """
+ Tests that we can find and lock a build environment
+ """
+
+ obe = BuildEnvironment.objects.create(lock = BuildEnvironment.LOCK_FREE, betype = BuildEnvironment.TYPE_LOCAL)
+ command = Command()
+ bec = command._selectBuildEnvironment()
+
+ # make sure we select the object we've just built
+ self.assertTrue(bec.be.id == obe.id, "Environment is not properly selected")
+ # we have a locked environment
+ self.assertTrue(bec.be.lock == BuildEnvironment.LOCK_LOCK, "Environment is not locked")
+ # no more selections possible here
+ self.assertRaises(IndexError, command._selectBuildEnvironment)
+
+ def test_br_select(self):
+ from orm.models import Project, Release, BitbakeVersion, Branch
+ p = Project.objects.create_project("test", Release.objects.get_or_create(name = "HEAD", bitbake_version = BitbakeVersion.objects.get_or_create(name="HEAD", branch=Branch.objects.get_or_create(name="HEAD"))[0])[0])
+ obr = BuildRequest.objects.create(state = BuildRequest.REQ_QUEUED, project = p)
+ command = Command()
+ br = command._selectBuildRequest()
+
+ # make sure we select the object we've just built
+ self.assertTrue(obr.id == br.id, "Request is not properly selected")
+ # the request has been moved to the in-progress state
+ self.assertTrue(br.state == BuildRequest.REQ_INPROGRESS, "Request is not updated")
+ # no more selections possible here
+ self.assertRaises(IndexError, command._selectBuildRequest)
+
+
+class UtilityTests(TestCase):
+ def test_reduce_path(self):
+ from bldcontrol.management.commands.loadconf import _reduce_canon_path, _get_id_for_sourcetype
+
+ self.assertTrue( _reduce_canon_path("/") == "/")
+ self.assertTrue( _reduce_canon_path("/home/..") == "/")
+ self.assertTrue( _reduce_canon_path("/home/../ana") == "/ana")
+ self.assertTrue( _reduce_canon_path("/home/../ana/..") == "/")
+ self.assertTrue( _reduce_canon_path("/home/ana/mihai/../maria") == "/home/ana/maria")
+
+ def test_get_id_for_sourcetype(self):
+ from bldcontrol.management.commands.loadconf import _reduce_canon_path, _get_id_for_sourcetype
+ self.assertTrue( _get_id_for_sourcetype("layerindex") == 1)
+ self.assertTrue( _get_id_for_sourcetype("local") == 0)
+ self.assertTrue( _get_id_for_sourcetype("imported") == 2)
+ with self.assertRaises(Exception):
+ _get_id_for_sourcetype("unknown")
diff --git a/bitbake/lib/toaster/bldcontrol/views.py b/bitbake/lib/toaster/bldcontrol/views.py
new file mode 100644
index 0000000..60f00ef
--- /dev/null
+++ b/bitbake/lib/toaster/bldcontrol/views.py
@@ -0,0 +1 @@
+# Create your views here.
diff --git a/bitbake/lib/toaster/contrib/README b/bitbake/lib/toaster/contrib/README
new file mode 100644
index 0000000..46d0ff0
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/README
@@ -0,0 +1,6 @@
+contrib directory for toaster
+
+This directory holds code that works with Toaster, without being an integral part of the Toaster project.
+It is intended for testing code, testing fixtures, tools for Toaster, etc.
+
+NOTE: This directory is NOT a Python module.
diff --git a/bitbake/lib/toaster/contrib/django-aggregate-if-master/.gitignore b/bitbake/lib/toaster/contrib/django-aggregate-if-master/.gitignore
new file mode 100644
index 0000000..c45652d
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/django-aggregate-if-master/.gitignore
@@ -0,0 +1,10 @@
+*.pyc
+*.swp
+*.swo
+*.kpf
+*.egg-info/
+.idea
+.tox
+tmp/
+dist/
+.DS_Store
diff --git a/bitbake/lib/toaster/contrib/django-aggregate-if-master/.travis.yml b/bitbake/lib/toaster/contrib/django-aggregate-if-master/.travis.yml
new file mode 100644
index 0000000..a920f39
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/django-aggregate-if-master/.travis.yml
@@ -0,0 +1,50 @@
+language: python
+python:
+ - "2.7"
+ - "3.4"
+services:
+ - mysql
+ - postgresql
+env:
+ - DJANGO=1.4 DB=sqlite
+ - DJANGO=1.4 DB=mysql
+ - DJANGO=1.4 DB=postgres
+ - DJANGO=1.5 DB=sqlite
+ - DJANGO=1.5 DB=mysql
+ - DJANGO=1.5 DB=postgres
+ - DJANGO=1.6 DB=sqlite
+ - DJANGO=1.6 DB=mysql
+ - DJANGO=1.6 DB=postgres
+ - DJANGO=1.7 DB=sqlite
+ - DJANGO=1.7 DB=mysql
+ - DJANGO=1.7 DB=postgres
+
+matrix:
+ exclude:
+ - python: "3.4"
+ env: DJANGO=1.4 DB=sqlite
+ - python: "3.4"
+ env: DJANGO=1.4 DB=mysql
+ - python: "3.4"
+ env: DJANGO=1.4 DB=postgres
+ - python: "3.4"
+ env: DJANGO=1.5 DB=sqlite
+ - python: "3.4"
+ env: DJANGO=1.5 DB=mysql
+ - python: "3.4"
+ env: DJANGO=1.5 DB=postgres
+ - python: "3.4"
+ env: DJANGO=1.6 DB=mysql
+ - python: "3.4"
+ env: DJANGO=1.7 DB=mysql
+
+before_script:
+ - mysql -e 'create database aggregation;'
+ - psql -c 'create database aggregation;' -U postgres
+install:
+ - pip install six
+ - if [ "$DB" == "mysql" ]; then pip install mysql-python; fi
+ - if [ "$DB" == "postgres" ]; then pip install psycopg2; fi
+ - pip install -q Django==$DJANGO --use-mirrors
+script:
+ - ./runtests.py --settings=tests.test_$DB
diff --git a/bitbake/lib/toaster/contrib/django-aggregate-if-master/LICENSE b/bitbake/lib/toaster/contrib/django-aggregate-if-master/LICENSE
new file mode 100644
index 0000000..6b7c3b1
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/django-aggregate-if-master/LICENSE
@@ -0,0 +1,21 @@
+The MIT License
+
+Copyright (c) 2012 Henrique Bastos
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+THE SOFTWARE.
diff --git a/bitbake/lib/toaster/contrib/django-aggregate-if-master/README.rst b/bitbake/lib/toaster/contrib/django-aggregate-if-master/README.rst
new file mode 100644
index 0000000..739d4da
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/django-aggregate-if-master/README.rst
@@ -0,0 +1,156 @@
+Django Aggregate If: Condition aggregates for Django
+====================================================
+
+.. image:: https://travis-ci.org/henriquebastos/django-aggregate-if.png?branch=master
+ :target: https://travis-ci.org/henriquebastos/django-aggregate-if
+ :alt: Test Status
+
+.. image:: https://landscape.io/github/henriquebastos/django-aggregate-if/master/landscape.png
+ :target: https://landscape.io/github/henriquebastos/django-aggregate-if/master
+   :alt: Code Health
+
+.. image:: https://pypip.in/v/django-aggregate-if/badge.png
+ :target: https://crate.io/packages/django-aggregate-if/
+ :alt: Latest PyPI version
+
+.. image:: https://pypip.in/d/django-aggregate-if/badge.png
+ :target: https://crate.io/packages/django-aggregate-if/
+ :alt: Number of PyPI downloads
+
+*Aggregate-if* adds conditional aggregates to Django.
+
+Conditional aggregates can help you reduce the amount of queries needed to
+obtain aggregated information, such as statistics.
+
+Imagine you have a model ``Offer`` like this one:
+
+.. code-block:: python
+
+ class Offer(models.Model):
+ sponsor = models.ForeignKey(User)
+ price = models.DecimalField(max_digits=9, decimal_places=2)
+ status = models.CharField(max_length=30)
+ expire_at = models.DateField(null=True, blank=True)
+ created_at = models.DateTimeField(auto_now_add=True)
+ updated_at = models.DateTimeField(auto_now=True)
+
+ OPEN = "OPEN"
+ REVOKED = "REVOKED"
+ PAID = "PAID"
+
+Let's say you want to know:
+
+#. How many offers exist in total;
+#. How many of them are OPEN, REVOKED or PAID;
+#. How much money was offered in total;
+#. How much money is in OPEN, REVOKED and PAID offers;
+
+To get this information, you could query:
+
+.. code-block:: python
+
+ from django.db.models import Count, Sum
+
+ Offer.objects.count()
+ Offer.objects.filter(status=Offer.OPEN).aggregate(Count('pk'))
+ Offer.objects.filter(status=Offer.REVOKED).aggregate(Count('pk'))
+ Offer.objects.filter(status=Offer.PAID).aggregate(Count('pk'))
+ Offer.objects.aggregate(Sum('price'))
+ Offer.objects.filter(status=Offer.OPEN).aggregate(Sum('price'))
+ Offer.objects.filter(status=Offer.REVOKED).aggregate(Sum('price'))
+ Offer.objects.filter(status=Offer.PAID).aggregate(Sum('price'))
+
+In this case, **8 queries** were needed to retrieve the desired information.
+
+With conditional aggregates you can get it all with only **1 query**:
+
+.. code-block:: python
+
+ from django.db.models import Q
+ from aggregate_if import Count, Sum
+
+ Offer.objects.aggregate(
+ pk__count=Count('pk'),
+ pk__open__count=Count('pk', only=Q(status=Offer.OPEN)),
+ pk__revoked__count=Count('pk', only=Q(status=Offer.REVOKED)),
+ pk__paid__count=Count('pk', only=Q(status=Offer.PAID)),
+ pk__sum=Sum('price'),
+ pk__open__sum=Sum('price', only=Q(status=Offer.OPEN)),
+        pk__revoked__sum=Sum('price', only=Q(status=Offer.REVOKED)),
+        pk__paid__sum=Sum('price', only=Q(status=Offer.PAID))
+ )
+
+Installation
+------------
+
+*Aggregate-if* works with Django 1.4, 1.5, 1.6 and 1.7.
+
+To install it, simply:
+
+.. code-block:: bash
+
+ $ pip install django-aggregate-if
+
+Inspiration
+-----------
+
+There is a 5-year-old `ticket 11305`_ that will (*hopefully*) bring this feature into
+Django 1.8.
+
+Using Django 1.6, I still wanted to avoid creating custom queries for very simple
+conditional aggregations. So I've cherry-picked those ideas and others from the
+internet and built this library.
+
+This library uses the same API and tests proposed on `ticket 11305`_, so when the
+new feature is available you can easily replace ``django-aggregate-if``.
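+
+As a rough sketch (using Django's built-in ``Case``/``When`` expressions, added
+in Django 1.8; the aggregate aliases below are illustrative), the open-offer
+figures from the example above could then be written as:
+
+.. code-block:: python
+
+    from django.db.models import Case, DecimalField, F, IntegerField, Sum, When
+
+    Offer.objects.aggregate(
+        # SUM over 1/NULL only counts the rows matching the condition
+        open_count=Sum(Case(When(status=Offer.OPEN, then=1),
+                            output_field=IntegerField())),
+        # non-matching rows contribute NULL, which SUM ignores
+        open_sum=Sum(Case(When(status=Offer.OPEN, then=F('price')),
+                          output_field=DecimalField(max_digits=9, decimal_places=2))),
+    )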
+
+Limitations
+-----------
+
+Conditions involving joins with aliases are not supported yet. If you want to
+help adding this feature, you're welcome to check the `first issue`_.
+
+Contributors
+------------
+
+* `Henrique Bastos <http://github.com/henriquebastos>`_
+* `Iuri de Silvio <https://github.com/iurisilvio>`_
+* `Hampus Stjernhav <https://github.com/champ>`_
+* `Bradley Martsberger <https://github.com/martsberger>`_
+* `Markus Bertheau <https://github.com/mbertheau>`_
+* `end0 <https://github.com/end0>`_
+* `Scott Sexton <https://github.com/scottsexton>`_
+* `Mauler <https://github.com/mauler>`_
+* `trbs <https://github.com/trbs>`_
+
+Changelog
+---------
+
+0.5
+ - Support for Django 1.7
+
+0.4
+ - Use tox to run tests.
+ - Add support for Django 1.6.
+ - Add support for Python3.
+ - The ``only`` parameter now freely supports joins independent of the main query.
+ - Adds support for alias relabeling, permitting excludes and updates with aggregates filtered on remote foreign key relations.
+
+0.3.1
+ - Fix quotation escaping.
+ - Fix boolean casts on Postgres.
+
+0.2
+ - Fix postgres issue with LIKE conditions.
+
+0.1
+ - Initial release.
+
+
+License
+=======
+
+The MIT License.
+
+.. _ticket 11305: https://code.djangoproject.com/ticket/11305
+.. _first issue: https://github.com/henriquebastos/django-aggregate-if/issues/1
diff --git a/bitbake/lib/toaster/contrib/django-aggregate-if-master/aggregate_if.py b/bitbake/lib/toaster/contrib/django-aggregate-if-master/aggregate_if.py
new file mode 100644
index 0000000..d5f3427
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/django-aggregate-if-master/aggregate_if.py
@@ -0,0 +1,164 @@
+# coding: utf-8
+'''
+Implements conditional aggregates.
+
+This code was based on the work of others found on the internet:
+
+1. http://web.archive.org/web/20101115170804/http://www.voteruniverse.com/Members/jlantz/blog/conditional-aggregates-in-django
+2. https://code.djangoproject.com/ticket/11305
+3. https://groups.google.com/forum/?fromgroups=#!topic/django-users/cjzloTUwmS0
+4. https://groups.google.com/forum/?fromgroups=#!topic/django-users/vVprMpsAnPo
+'''
+from __future__ import unicode_literals
+from django.utils import six
+import django
+from django.db.models.aggregates import Aggregate as DjangoAggregate
+from django.db.models.sql.aggregates import Aggregate as DjangoSqlAggregate
+
+
+VERSION = django.VERSION[:2]
+
+
+class SqlAggregate(DjangoSqlAggregate):
+ conditional_template = '%(function)s(CASE WHEN %(condition)s THEN %(field)s ELSE null END)'
+
+ def __init__(self, col, source=None, is_summary=False, condition=None, **extra):
+ super(SqlAggregate, self).__init__(col, source, is_summary, **extra)
+ self.condition = condition
+
+ def relabel_aliases(self, change_map):
+ if VERSION < (1, 7):
+ super(SqlAggregate, self).relabel_aliases(change_map)
+ if self.has_condition:
+ condition_change_map = dict((k, v) for k, v in \
+ change_map.items() if k in self.condition.query.alias_map
+ )
+ self.condition.query.change_aliases(condition_change_map)
+
+ def relabeled_clone(self, change_map):
+ self.relabel_aliases(change_map)
+ return super(SqlAggregate, self).relabeled_clone(change_map)
+
+ def as_sql(self, qn, connection):
+ if self.has_condition:
+ self.sql_template = self.conditional_template
+ self.extra['condition'] = self._condition_as_sql(qn, connection)
+
+ return super(SqlAggregate, self).as_sql(qn, connection)
+
+ @property
+ def has_condition(self):
+ # Warning: bool(QuerySet) will hit the database
+ return self.condition is not None
+
+ def _condition_as_sql(self, qn, connection):
+ '''
+ Return sql for condition.
+ '''
+ def escape(value):
+ if isinstance(value, bool):
+ value = str(int(value))
+ if isinstance(value, six.string_types):
+ # Escape params used with LIKE
+ if '%' in value:
+ value = value.replace('%', '%%')
+ # Escape single quotes
+ if "'" in value:
+ value = value.replace("'", "''")
+ # Add single quote to text values
+ value = "'" + value + "'"
+ return value
+
+ sql, param = self.condition.query.where.as_sql(qn, connection)
+ param = map(escape, param)
+
+ return sql % tuple(param)
+
+
+class SqlSum(SqlAggregate):
+ sql_function = 'SUM'
+
+
+class SqlCount(SqlAggregate):
+ is_ordinal = True
+ sql_function = 'COUNT'
+ sql_template = '%(function)s(%(distinct)s%(field)s)'
+ conditional_template = '%(function)s(%(distinct)sCASE WHEN %(condition)s THEN %(field)s ELSE null END)'
+
+ def __init__(self, col, distinct=False, **extra):
+ super(SqlCount, self).__init__(col, distinct=distinct and 'DISTINCT ' or '', **extra)
+
+
+class SqlAvg(SqlAggregate):
+ is_computed = True
+ sql_function = 'AVG'
+
+
+class SqlMax(SqlAggregate):
+ sql_function = 'MAX'
+
+
+class SqlMin(SqlAggregate):
+ sql_function = 'MIN'
+
+
+class Aggregate(DjangoAggregate):
+ def __init__(self, lookup, only=None, **extra):
+ super(Aggregate, self).__init__(lookup, **extra)
+ self.only = only
+ self.condition = None
+
+ def _get_fields_from_Q(self, q):
+ fields = []
+ for child in q.children:
+ if hasattr(child, 'children'):
+ fields.extend(self._get_fields_from_Q(child))
+ else:
+ fields.append(child)
+ return fields
+
+ def add_to_query(self, query, alias, col, source, is_summary):
+ if self.only:
+ self.condition = query.model._default_manager.filter(self.only)
+ for child in self._get_fields_from_Q(self.only):
+ field_list = child[0].split('__')
+ # Pop off the last field if it's a query term ('gte', 'contains', 'isnull', etc.)
+ if field_list[-1] in query.query_terms:
+ field_list.pop()
+                # setup_joins has different return values in Django 1.5 and 1.6, but the order of what we need remains the same.
+ result = query.setup_joins(field_list, query.model._meta, query.get_initial_alias(), None)
+ join_list = result[3]
+
+ fname = 'promote_alias_chain' if VERSION < (1, 5) else 'promote_joins'
+ args = (join_list, True) if VERSION < (1, 7) else (join_list,)
+
+ promote = getattr(query, fname)
+ promote(*args)
+
+ aggregate = self.sql_klass(col, source=source, is_summary=is_summary, condition=self.condition, **self.extra)
+ query.aggregates[alias] = aggregate
+
+
+class Sum(Aggregate):
+ name = 'Sum'
+ sql_klass = SqlSum
+
+
+class Count(Aggregate):
+ name = 'Count'
+ sql_klass = SqlCount
+
+
+class Avg(Aggregate):
+ name = 'Avg'
+ sql_klass = SqlAvg
+
+
+class Max(Aggregate):
+ name = 'Max'
+ sql_klass = SqlMax
+
+
+class Min(Aggregate):
+ name = 'Min'
+ sql_klass = SqlMin
diff --git a/bitbake/lib/toaster/contrib/django-aggregate-if-master/runtests.py b/bitbake/lib/toaster/contrib/django-aggregate-if-master/runtests.py
new file mode 100755
index 0000000..2e55864
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/django-aggregate-if-master/runtests.py
@@ -0,0 +1,48 @@
+#!/usr/bin/env python
+
+import os
+import sys
+from optparse import OptionParser
+
+
+def parse_args():
+ parser = OptionParser()
+ parser.add_option('-s', '--settings', help='Define settings.')
+ parser.add_option('-t', '--unittest', help='Define which test to run. Default all.')
+ options, args = parser.parse_args()
+
+ if not options.settings:
+ parser.print_help()
+ sys.exit(1)
+
+ if not options.unittest:
+ options.unittest = ['aggregation']
+
+ return options
+
+
+def get_runner(settings_module):
+ '''
+ Asks Django for the TestRunner defined in settings or the default one.
+ '''
+ os.environ['DJANGO_SETTINGS_MODULE'] = settings_module
+
+ import django
+ from django.test.utils import get_runner
+ from django.conf import settings
+
+ if hasattr(django, 'setup'):
+ django.setup()
+
+ return get_runner(settings)
+
+
+def runtests():
+ options = parse_args()
+ TestRunner = get_runner(options.settings)
+ runner = TestRunner(verbosity=1, interactive=True, failfast=False)
+ sys.exit(runner.run_tests([]))
+
+
+if __name__ == '__main__':
+ runtests()
diff --git a/bitbake/lib/toaster/contrib/django-aggregate-if-master/setup.py b/bitbake/lib/toaster/contrib/django-aggregate-if-master/setup.py
new file mode 100644
index 0000000..aed3db1
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/django-aggregate-if-master/setup.py
@@ -0,0 +1,33 @@
+# coding: utf-8
+from setuptools import setup
+import os
+
+
+setup(name='django-aggregate-if',
+ version='0.5',
+ description='Conditional aggregates for Django, just like the famous SumIf in Excel.',
+ long_description=open(os.path.join(os.path.dirname(__file__), "README.rst")).read(),
+ author="Henrique Bastos", author_email="henrique@bastos.net",
+ license="MIT",
+ py_modules=['aggregate_if'],
+ install_requires=[
+ 'six>=1.6.1',
+ ],
+ zip_safe=False,
+ platforms='any',
+ include_package_data=True,
+ classifiers=[
+ 'Development Status :: 5 - Production/Stable',
+ 'Framework :: Django',
+ 'Intended Audience :: Developers',
+ 'License :: OSI Approved :: MIT License',
+ 'Natural Language :: English',
+ 'Operating System :: OS Independent',
+ 'Programming Language :: Python',
+ 'Programming Language :: Python :: 2.7',
+ 'Programming Language :: Python :: 3',
+ 'Topic :: Database',
+ 'Topic :: Software Development :: Libraries',
+ ],
+ url='http://github.com/henriquebastos/django-aggregate-if/',
+)
diff --git a/bitbake/lib/toaster/contrib/django-aggregate-if-master/tox.ini b/bitbake/lib/toaster/contrib/django-aggregate-if-master/tox.ini
new file mode 100644
index 0000000..78beb14
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/django-aggregate-if-master/tox.ini
@@ -0,0 +1,198 @@
+[tox]
+envlist =
+ py27-django1.4-sqlite,
+ py27-django1.4-postgres,
+ py27-django1.4-mysql,
+
+ py27-django1.5-sqlite,
+ py27-django1.5-postgres,
+ py27-django1.5-mysql,
+
+ py27-django1.6-sqlite,
+ py27-django1.6-postgres,
+ py27-django1.6-mysql,
+
+ py27-django1.7-sqlite,
+ py27-django1.7-postgres,
+ py27-django1.7-mysql,
+
+ py34-django1.6-sqlite,
+ py34-django1.6-postgres,
+ #py34-django1.6-mysql
+
+ py34-django1.7-sqlite,
+ py34-django1.7-postgres,
+ #py34-django1.7-mysql
+
+[testenv]
+whitelist_externals=
+ mysql
+ psql
+
+# Python 2.7
+# Django 1.4
+[testenv:py27-django1.4-sqlite]
+basepython = python2.7
+deps =
+ django==1.4
+commands = python runtests.py --settings tests.test_sqlite
+
+[testenv:py27-django1.4-postgres]
+basepython = python2.7
+deps =
+ django==1.4
+ psycopg2
+commands =
+ psql -c 'create database aggregation;' postgres
+ python runtests.py --settings tests.test_postgres
+ psql -c 'drop database aggregation;' postgres
+
+[testenv:py27-django1.4-mysql]
+basepython = python2.7
+deps =
+ django==1.4
+ mysql-python
+commands =
+ mysql -e 'create database aggregation;'
+ python runtests.py --settings tests.test_mysql
+ mysql -e 'drop database aggregation;'
+
+# Django 1.5
+[testenv:py27-django1.5-sqlite]
+basepython = python2.7
+deps =
+ django==1.5
+commands = python runtests.py --settings tests.test_sqlite
+
+[testenv:py27-django1.5-postgres]
+basepython = python2.7
+deps =
+ django==1.5
+ psycopg2
+commands =
+ psql -c 'create database aggregation;' postgres
+ python runtests.py --settings tests.test_postgres
+ psql -c 'drop database aggregation;' postgres
+
+[testenv:py27-django1.5-mysql]
+basepython = python2.7
+deps =
+ django==1.5
+ mysql-python
+commands =
+ mysql -e 'create database aggregation;'
+ python runtests.py --settings tests.test_mysql
+ mysql -e 'drop database aggregation;'
+
+# Django 1.6
+[testenv:py27-django1.6-sqlite]
+basepython = python2.7
+deps =
+ django==1.6
+commands = python runtests.py --settings tests.test_sqlite
+
+[testenv:py27-django1.6-postgres]
+basepython = python2.7
+deps =
+ django==1.6
+ psycopg2
+commands =
+ psql -c 'create database aggregation;' postgres
+ python runtests.py --settings tests.test_postgres
+ psql -c 'drop database aggregation;' postgres
+
+[testenv:py27-django1.6-mysql]
+basepython = python2.7
+deps =
+ django==1.6
+ mysql-python
+commands =
+ mysql -e 'create database aggregation;'
+ python runtests.py --settings tests.test_mysql
+ mysql -e 'drop database aggregation;'
+
+
+# Python 2.7 and Django 1.7
+[testenv:py27-django1.7-sqlite]
+basepython = python2.7
+deps =
+ django==1.7
+commands = python runtests.py --settings tests.test_sqlite
+
+[testenv:py27-django1.7-postgres]
+basepython = python2.7
+deps =
+ django==1.7
+ psycopg2
+commands =
+ psql -c 'create database aggregation;' postgres
+ python runtests.py --settings tests.test_postgres
+ psql -c 'drop database aggregation;' postgres
+
+[testenv:py27-django1.7-mysql]
+basepython = python2.7
+deps =
+ django==1.7
+ mysql-python
+commands =
+ mysql -e 'create database aggregation;'
+ python runtests.py --settings tests.test_mysql
+ mysql -e 'drop database aggregation;'
+
+
+# Python 3.4
+# Django 1.6
+[testenv:py34-django1.6-sqlite]
+basepython = python3.4
+deps =
+ django==1.6
+commands = python runtests.py --settings tests.test_sqlite
+
+[testenv:py34-django1.6-postgres]
+basepython = python3.4
+deps =
+ django==1.6
+ psycopg2
+commands =
+ psql -c 'create database aggregation;' postgres
+ python runtests.py --settings tests.test_postgres
+ psql -c 'drop database aggregation;' postgres
+
+[testenv:py34-django1.6-mysql]
+basepython = python3.4
+deps =
+ django==1.6
+ mysql-python3
+commands =
+ mysql -e 'create database aggregation;'
+ python runtests.py --settings tests.test_mysql
+ mysql -e 'drop database aggregation;'
+
+
+# Python 3.4
+# Django 1.7
+[testenv:py34-django1.7-sqlite]
+basepython = python3.4
+deps =
+ django==1.7
+commands = python runtests.py --settings tests.test_sqlite
+
+[testenv:py34-django1.7-postgres]
+basepython = python3.4
+deps =
+ django==1.7
+ psycopg2
+commands =
+ psql -c 'create database aggregation;' postgres
+ python runtests.py --settings tests.test_postgres
+ psql -c 'drop database aggregation;' postgres
+
+[testenv:py34-django1.7-mysql]
+basepython = python3.4
+deps =
+ django==1.7
+ mysql-python3
+commands =
+ mysql -e 'create database aggregation;'
+ python runtests.py --settings tests.test_mysql
+ mysql -e 'drop database aggregation;'
diff --git a/bitbake/lib/toaster/contrib/tts/README b/bitbake/lib/toaster/contrib/tts/README
new file mode 100644
index 0000000..22fa567
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/README
@@ -0,0 +1,41 @@
+
+Toaster Testing Framework
+Yocto Project
+
+
+Rationale
+------------
+As Toaster contributions grow with the number of people that contribute code, verifying each patch prior to submitting it upstream becomes a problem that does not scale for humans. We devised this system to run patch-level validation in an automated fashion, trying to eliminate common problems from submitted patches.
+
+The Toaster Testing Framework is a set of Python scripts that provides an extensible way to write smoke and regression tests that will be run on each patch set sent for review on the toaster mailing list.
+
+
+Usage
+------------
+There are three main executable scripts in this directory.
+ * runner.py is designed to be run from the command line. It requires, as a mandatory parameter, the name of a branch on poky-contrib which contains the patches to be tested. The program will auto-discover the available tests residing in this directory by looking for unittest classes, and will run the tests on the branch, dumping the output to standard output. Optionally, it can take parameters inhibiting the branch checkout, or specifying a single test to be run, for debugging purposes (see the example invocation after this list).
+ * launcher.py is designed to be run from a crontab or similar scheduling mechanism. It looks up a backlog file containing branches to test (named tasks in the source code), selects the first one in FIFO manner, and launches runner.py on it. It waits for completion, and emails the standard output and standard error dumps from the runner.py execution.
+ * recv.py is an email receiver, designed to be called as a pipe from a .forward file. It is used to monitor a mailing list, for example, and to add tasks to the backlog based on review requests arriving on that list.
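+
+For reference, an illustrative runner.py invocation (the branch name is a placeholder) looks like this:
+
+ $ ./runner.py -s tests.Test00PyCompilable my-contrib-branch
+
+which checks out the branch "my-contrib-branch" from poky-contrib and runs only the Test00PyCompilable test against it.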
+
+
+Installation
+------------
+As a prerequisite, we expect a functioning email system on a machine with Python 2.
+
+The broad steps to installation are:
+* set up the .forward on the receiving email account to pipe to the recv.py file
+* edit config.py and settings.json to alter for local installation settings
+* when an email is received, check backlog.txt to verify that the tasks have been received and marked for processing (the entry format is shown after this list)
+* execute launcher.py on the command line to verify that a test runs with no problems, and that the outgoing email is delivered
+* add launcher.py to crontab (or a similar scheduler) so that pending tasks are picked up automatically
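+
+Each backlog.txt entry written by recv.py is a single "branch|state" line; an illustrative entry for a freshly received review request (the branch name is a placeholder) looks like:
+
+ my-contrib-branch|PENDING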
+
+
+
+Contribute
+------------
+What we need most are tests. Add your own tests to the existing tests.py file, or to a new file.
+Use "config.logger" to write logs that will make it into the emailed report.
+
+Commonly used code should go into shellutils.py, and configuration into config.py.
+
+Contribute code by emailing patches to the list: toaster@yoctoproject.org (membership required)
diff --git a/bitbake/lib/toaster/contrib/tts/TODO b/bitbake/lib/toaster/contrib/tts/TODO
new file mode 100644
index 0000000..1171921
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/TODO
@@ -0,0 +1,9 @@
+We need to implement tests:
+
+automated link checker; currently
+$ linkchecker -t 1000 -F csv http://localhost:8000/
+
+integrate the w3c-validation service; currently
+$ python urlcheck.py
+
+
diff --git a/bitbake/lib/toaster/contrib/tts/config.py b/bitbake/lib/toaster/contrib/tts/config.py
new file mode 100644
index 0000000..40d45f3
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/config.py
@@ -0,0 +1,98 @@
+#!/usr/bin/python
+
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2015 Alexandru Damian for Intel Corp.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+# This is the single configuration module for tts;
+# everything that would be a global variable goes here.
+
+import os, sys, logging
+import socket
+
+LOGDIR = "log"
+SETTINGS_FILE = os.path.join(os.path.dirname(__file__), "settings.json")
+TEST_DIR_NAME = "tts_testdir"
+
+DEBUG = True
+
+OWN_PID = os.getpid()
+
+W3C_VALIDATOR = "http://icarus.local/w3c-validator/check?doctype=HTML5&uri="
+
+TOASTER_PORT = 56789
+
+TESTDIR = None
+
+# we parse the w3c validator URL to know which local address we connect from
+
+import urlparse
+
+def get_public_ip():
+ temp_socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
+ parsed_url = urlparse.urlparse("http://icarus.local/w3c-validator/check?doctype=HTML5&uri=")
+ temp_socket.connect((parsed_url.netloc, 80 if parsed_url.port is None else parsed_url.port))
+ public_ip = temp_socket.getsockname()[0]
+ temp_socket.close()
+ return public_ip
+
+TOASTER_BASEURL = "http://%s:%d/" % (get_public_ip(), TOASTER_PORT)
+
+
+OWN_EMAIL_ADDRESS = "Toaster Testing Framework <alexandru.damian@intel.com>"
+REPORT_EMAIL_ADDRESS = "alexandru.damian@intel.com"
+
+# make sure we have the basic logging infrastructure
+
+#pylint: disable=invalid-name
+# we disable the invalid name because the module-level "logger" is used throughout bitbake
+logger = logging.getLogger("toastertest")
+__console__ = logging.StreamHandler(sys.stdout)
+__console__.setFormatter(logging.Formatter("%(asctime)s %(levelname)s: %(message)s"))
+logger.addHandler(__console__)
+logger.setLevel(logging.DEBUG)
+
+
+# singleton file names
+LOCKFILE = "/tmp/ttf.lock"
+BACKLOGFILE = os.path.join(os.path.dirname(__file__), "backlog.txt")
+
+# task states
+def enum(*sequential, **named):
+ enums = dict(zip(sequential, range(len(sequential))), **named)
+ reverse = dict((value, key) for key, value in enums.iteritems())
+ enums['reverse_mapping'] = reverse
+ return type('Enum', (), enums)
+
+
+class TASKS(object):
+ #pylint: disable=too-few-public-methods
+ PENDING = "PENDING"
+ INPROGRESS = "INPROGRESS"
+ DONE = "DONE"
+
+ @staticmethod
+ def next_task(task):
+ if task == TASKS.PENDING:
+ return TASKS.INPROGRESS
+ if task == TASKS.INPROGRESS:
+ return TASKS.DONE
+ raise Exception("Invalid next task state for %s" % task)
+
+# TTS specific
+CONTRIB_REPO = "git@git.yoctoproject.org:poky-contrib"
+
diff --git a/bitbake/lib/toaster/contrib/tts/launcher.py b/bitbake/lib/toaster/contrib/tts/launcher.py
new file mode 100755
index 0000000..e5794c1
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/launcher.py
@@ -0,0 +1,101 @@
+#!/usr/bin/python
+
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2015 Alexandru Damian for Intel Corp.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+# Program to run the next task listed in backlog.txt; designed to be
+# run from crontab.
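+#
+# An illustrative crontab entry (the path and schedule are assumptions):
+#
+#   */15 * * * * /full/path/to/launcher.py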
+
+from __future__ import print_function
+import sys, os, config, shellutils
+from shellutils import ShellCmdException
+
+# Import smtplib for the actual sending function
+import smtplib
+
+# Import the email modules we'll need
+from email.mime.text import MIMEText
+
+def _take_lockfile():
+ return shellutils.lockfile(shellutils.mk_lock_filename())
+
+
+def read_next_task_by_state(task_state, task_name=None):
+ if not os.path.exists(os.path.join(os.path.dirname(__file__), config.BACKLOGFILE)):
+ return None
+ os.rename(config.BACKLOGFILE, config.BACKLOGFILE + ".tmp")
+ task = None
+ with open(config.BACKLOGFILE + ".tmp", "r") as f_in:
+ with open(config.BACKLOGFILE, "w") as f_out:
+ for line in f_in.readlines():
+ if task is None:
+ fields = line.strip().split("|", 2)
+ if fields[1] == task_state:
+ if task_name is None or task_name == fields[0]:
+ task = fields[0]
+ print("Updating %s %s to %s" % (task, task_state, config.TASKS.next_task(task_state)))
+ line = "%s|%s\n" % (task, config.TASKS.next_task(task_state))
+ f_out.write(line)
+ os.remove(config.BACKLOGFILE + ".tmp")
+ return task
+
+def send_report(task_name, plaintext, errtext=None):
+ if errtext is None:
+ msg = MIMEText(plaintext)
+ else:
+ if plaintext is None:
+ plaintext = ""
+ msg = MIMEText("--STDOUT dump--\n\n%s\n\n--STDERR dump--\n\n%s" % (plaintext, errtext))
+
+ msg['Subject'] = "[review-request] %s - smoke test results" % task_name
+ msg['From'] = config.OWN_EMAIL_ADDRESS
+ msg['To'] = config.REPORT_EMAIL_ADDRESS
+
+ smtp_connection = smtplib.SMTP("localhost")
+ smtp_connection.sendmail(config.OWN_EMAIL_ADDRESS, [config.REPORT_EMAIL_ADDRESS], msg.as_string())
+ smtp_connection.quit()
+
+def main():
+ # we don't do anything if we have another instance of us running
+ lock_file = _take_lockfile()
+
+ if lock_file is None:
+ if config.DEBUG:
+ print("Concurrent script in progress, exiting")
+ sys.exit(1)
+
+ next_task = read_next_task_by_state(config.TASKS.PENDING)
+ if next_task is not None:
+ print("Next task is", next_task)
+ errtext = None
+ out = None
+ try:
+ out = shellutils.run_shell_cmd("%s %s" % (os.path.join(os.path.dirname(__file__), "runner.py"), next_task))
+ except ShellCmdException as exc:
+            print("Failed while running the test runner: %s" % exc)
+ errtext = exc.__str__()
+ send_report(next_task, out, errtext)
+ read_next_task_by_state(config.TASKS.INPROGRESS, next_task)
+ else:
+ print("No task")
+
+ shellutils.unlockfile(lock_file)
+
+
+if __name__ == "__main__":
+ main()
diff --git a/bitbake/lib/toaster/contrib/tts/log/.create b/bitbake/lib/toaster/contrib/tts/log/.create
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/log/.create
diff --git a/bitbake/lib/toaster/contrib/tts/recv.py b/bitbake/lib/toaster/contrib/tts/recv.py
new file mode 100755
index 0000000..07efdac
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/recv.py
@@ -0,0 +1,56 @@
+#!/usr/bin/python
+
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2015 Alexandru Damian for Intel Corp.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+# Program to receive review requests by email and log tasks to backlog.txt
+# Designed to be run by the email system from a .forward file:
+#
+# cat .forward
+# |[full/path]/recv.py
+
+from __future__ import print_function
+import sys, config, shellutils
+
+from email.parser import Parser
+
+def recv_mail(datastring):
+ headers = Parser().parsestr(datastring)
+ return headers['subject']
+
+def main():
+ lock_file = shellutils.lockfile(shellutils.mk_lock_filename(), retry=True)
+
+ if lock_file is None:
+ if config.DEBUG:
+ print("Concurrent script in progress, exiting")
+ sys.exit(1)
+
+ subject = recv_mail(sys.stdin.read())
+
+ subject_parts = subject.split()
+ if "[review-request]" in subject_parts:
+ task_name = subject_parts[subject_parts.index("[review-request]") + 1]
+ with open(config.BACKLOGFILE, "a") as fout:
+ line = "%s|%s\n" % (task_name, config.TASKS.PENDING)
+ fout.write(line)
+
+ shellutils.unlockfile(lock_file)
+
+if __name__ == "__main__":
+ main()
diff --git a/bitbake/lib/toaster/contrib/tts/runner.py b/bitbake/lib/toaster/contrib/tts/runner.py
new file mode 100755
index 0000000..bed6651
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/runner.py
@@ -0,0 +1,222 @@
+#!/usr/bin/python
+
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2015 Alexandru Damian for Intel Corp.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+
+# This is the main test execution controller. It is designed to be run
+# manually from the command line, or to be called from a different program
+# that schedules test execution.
+#
+# Execute runner.py -h for help.
+
+
+
+from __future__ import print_function
+import sys, os
+import unittest, importlib
+import logging, pprint, json
+import re
+from shellutils import ShellCmdException, mkdirhier, run_shell_cmd
+
+import config
+
+# we also log to a file, in addition to console, because our output is important
+__log_file_name__ = os.path.join(os.path.dirname(__file__), "log/tts_%d.log" % config.OWN_PID)
+mkdirhier(os.path.dirname(__log_file_name__))
+__log_file__ = open(__log_file_name__, "w")
+__file_handler__ = logging.StreamHandler(__log_file__)
+__file_handler__.setFormatter(logging.Formatter("%(asctime)s %(levelname)s: %(message)s"))
+
+config.logger.addHandler(__file_handler__)
+
+# set up log directory
+try:
+ if not os.path.exists(config.LOGDIR):
+ os.mkdir(config.LOGDIR)
+ else:
+ if not os.path.isdir(config.LOGDIR):
+ raise Exception("Expected log dir '%s' is not actually a directory." % config.LOGDIR)
+except OSError as exc:
+ raise exc
+
+# creates the under-test-branch as a separate directory
+def set_up_test_branch(settings, branch_name):
+ testdir = "%s/%s.%d" % (settings['workdir'], config.TEST_DIR_NAME, config.OWN_PID)
+
+ # creates the host dir
+ if os.path.exists(testdir):
+        raise Exception("Test dir '%s' is already there, aborting" % testdir)
+
+ # may raise OSError, is to be handled by the caller
+ os.makedirs(testdir)
+
+
+ # copies over the .git from the localclone
+ run_shell_cmd("cp -a '%s'/.git '%s'" % (settings['localclone'], testdir))
+
+ # add the remote if it doesn't exist
+ crt_remotes = run_shell_cmd("git remote -v", cwd=testdir)
+ remotes = [word for line in crt_remotes.split("\n") for word in line.split()]
+ if not config.CONTRIB_REPO in remotes:
+ remote_name = "tts_contrib"
+ run_shell_cmd("git remote add %s %s" % (remote_name, config.CONTRIB_REPO), cwd=testdir)
+ else:
+ remote_name = remotes[remotes.index(config.CONTRIB_REPO) - 1]
+
+ # do the fetch
+ run_shell_cmd("git fetch %s -p" % remote_name, cwd=testdir)
+
+ # do the checkout
+ run_shell_cmd("git checkout origin/master && git branch -D %s; git checkout %s/%s -b %s && git reset --hard" % (branch_name, remote_name, branch_name, branch_name), cwd=testdir)
+
+ return testdir
+
+
+def __search_for_tests():
+ # we find all classes that can run, and run them
+ tests = []
+ for _, _, files_list in os.walk(os.path.dirname(os.path.abspath(__file__))):
+ for module_file in [f[:-3] for f in files_list if f.endswith(".py") and not f.startswith("__init__")]:
+ config.logger.debug("Inspecting module %s", module_file)
+ current_module = importlib.import_module(module_file)
+ crtclass_names = vars(current_module)
+ for name in crtclass_names:
+ tested_value = crtclass_names[name]
+ if isinstance(tested_value, type(unittest.TestCase)) and issubclass(tested_value, unittest.TestCase):
+ tests.append((module_file, name))
+ break
+ return tests
+
+
+# boilerplate to self discover tests and run them
+def execute_tests(dir_under_test, testname):
+
+ if testname is not None and "." in testname:
+ tests = []
+ tests.append(tuple(testname.split(".", 2)))
+ else:
+ tests = __search_for_tests()
+
+ # let's move to the directory under test
+ crt_dir = os.getcwd()
+ os.chdir(dir_under_test)
+
+ # execute each module
+ # pylint: disable=broad-except
+ # we disable the broad-except because we want to actually catch all possible exceptions
+ try:
+ # sorting the tests by the numeric order in the class name
+ tests = sorted(tests, key=lambda x: int(re.search(r"[0-9]+", x[1]).group(0)))
+        config.logger.debug("Discovered test classes: %s", pprint.pformat(tests))
+ unittest.installHandler()
+ suite = unittest.TestSuite()
+ loader = unittest.TestLoader()
+ result = unittest.TestResult()
+ result.failfast = True
+ for module_file, test_name in tests:
+ suite.addTest(loader.loadTestsFromName("%s.%s" % (module_file, test_name)))
+ config.logger.info("Running %d test(s)", suite.countTestCases())
+ suite.run(result)
+
+ for error in result.errors:
+ config.logger.error("Exception on test: %s\n%s", error[0],
+ "\n".join(["-- %s" % x for x in error[1].split("\n")]))
+
+ for failure in result.failures:
+ config.logger.error("Failed test: %s:\n%s\n", failure[0],
+ "\n".join(["-- %s" % x for x in failure[1].split("\n")]))
+
+ config.logger.info("Test results: %d ran, %d errors, %d failures", result.testsRun, len(result.errors), len(result.failures))
+
+ except Exception as exc:
+ import traceback
+ config.logger.error("Exception while running test. Tracedump: \n%s", traceback.format_exc(exc))
+ finally:
+ os.chdir(crt_dir)
+ return len(result.failures)
+
+# verify that we had a branch-under-test name as parameter
+def validate_args():
+ from optparse import OptionParser
+ parser = OptionParser(usage="usage: %prog [options] branch_under_test")
+
+ parser.add_option("-t", "--test-dir", dest="testdir", default=None, help="Use specified directory to run tests, inhibits the checkout.")
+ parser.add_option("-s", "--single", dest="singletest", default=None, help="Run only the specified test")
+
+ (options, args) = parser.parse_args()
+ if len(args) < 1:
+ raise Exception("Please specify the branch to run on. Use option '-h' when in doubt.")
+ return (options, args)
+
+
+
+
+# load the configuration options
+def read_settings():
+ if not os.path.exists(config.SETTINGS_FILE) or not os.path.isfile(config.SETTINGS_FILE):
+        raise Exception("Config file '%s' cannot be opened" % config.SETTINGS_FILE)
+ return json.loads(open(config.SETTINGS_FILE, "r").read())
+
+
+# cleanup !
+def clean_up(testdir):
+ run_shell_cmd("rm -rf -- '%s'" % testdir)
+
+def dump_info(settings, options, args):
+ """ detailed information about current run configuration, for debugging purposes.
+ """
+ config.logger.debug("Settings:\n%s\nOptions:\n%s\nArguments:\n%s\n", settings, options, args)
+
+def main():
+ (options, args) = validate_args()
+
+ settings = read_settings()
+ need_cleanup = False
+
+ # dump debug info
+ dump_info(settings, options, args)
+
+ testdir = None
+ no_failures = 1
+ try:
+ if options.testdir is not None and os.path.exists(options.testdir):
+ testdir = os.path.abspath(options.testdir)
+ config.logger.info("No checkout, using %s", testdir)
+ else:
+ need_cleanup = True
+ testdir = set_up_test_branch(settings, args[0]) # we expect a branch name as first argument
+
+ config.TESTDIR = testdir # we let tests know where to run
+
+        # ensure that the test dir contains no *.pyc leftovers
+ run_shell_cmd("find '%s' -type f -name *.pyc -exec rm {} \\;" % testdir)
+
+ no_failures = execute_tests(testdir, options.singletest)
+
+ except ShellCmdException as exc:
+ import traceback
+ config.logger.error("Error while setting up testing. Traceback: \n%s", traceback.format_exc(exc))
+ finally:
+ if need_cleanup and testdir is not None:
+ clean_up(testdir)
+
+ sys.exit(no_failures)
+
+if __name__ == "__main__":
+ main()
diff --git a/bitbake/lib/toaster/contrib/tts/settings.json b/bitbake/lib/toaster/contrib/tts/settings.json
new file mode 100644
index 0000000..bb671ea
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/settings.json
@@ -0,0 +1,5 @@
+{
+ "repo": "git@git.yoctoproject.org:poky-contrib",
+ "localclone": "/home/ddalex/ssd/yocto/poky",
+ "workdir": "/home/ddalex/ssd/yocto"
+}
diff --git a/bitbake/lib/toaster/contrib/tts/shellutils.py b/bitbake/lib/toaster/contrib/tts/shellutils.py
new file mode 100644
index 0000000..c2012ed
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/shellutils.py
@@ -0,0 +1,141 @@
+#!/usr/bin/python
+
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2015 Alexandru Damian for Intel Corp.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+# Utilities shared by tests and other common bits of code.
+
+import sys, os, subprocess, fcntl, errno
+import config
+from config import logger
+
+
+# License warning; this code is copied from the BitBake project, file bitbake/lib/bb/utils.py
+# The code is originally licensed GPL-2.0, and we still redistribute it under GPL-2.0
+
+# End of copy is marked with #ENDOFCOPY marker
+
+def mkdirhier(directory):
+ """Create a directory like 'mkdir -p', but does not complain if
+    the directory already exists, unlike os.makedirs
+ """
+
+ try:
+ os.makedirs(directory)
+ except OSError as exc:
+ if exc.errno != errno.EEXIST:
+ raise exc
+
+def lockfile(name, shared=False, retry=True):
+ """
+    Use the file 'name' as a lock file, return when the lock has been acquired.
+ Returns a variable to pass to unlockfile().
+ """
+ config.logger.debug("take lockfile %s", name)
+ dirname = os.path.dirname(name)
+ mkdirhier(dirname)
+
+ if not os.access(dirname, os.W_OK):
+ logger.error("Unable to acquire lock '%s', directory is not writable",
+ name)
+ sys.exit(1)
+
+ operation = fcntl.LOCK_EX
+ if shared:
+ operation = fcntl.LOCK_SH
+ if not retry:
+ operation = operation | fcntl.LOCK_NB
+
+ while True:
+ # If we leave the lockfiles lying around there is no problem
+ # but we should clean up after ourselves. This gives potential
+ # for races though. To work around this, when we acquire the lock
+ # we check the file we locked was still the lock file on disk.
+ # by comparing inode numbers. If they don't match or the lockfile
+ # no longer exists, we start again.
+
+ # This implementation is unfair since the last person to request the
+ # lock is the most likely to win it.
+
+ # pylint: disable=broad-except
+ # we disable the broad-except because we want to actually catch all possible exceptions
+ try:
+ lock_file = open(name, 'a+')
+ fileno = lock_file.fileno()
+ fcntl.flock(fileno, operation)
+ statinfo = os.fstat(fileno)
+ if os.path.exists(lock_file.name):
+ statinfo2 = os.stat(lock_file.name)
+ if statinfo.st_ino == statinfo2.st_ino:
+ return lock_file
+ lock_file.close()
+ except Exception as exc:
+ try:
+ lock_file.close()
+ except Exception as exc2:
+ config.logger.error("Failed to close the lockfile: %s", exc2)
+ config.logger.error("Failed to acquire the lockfile: %s", exc)
+ if not retry:
+ return None
+
+def unlockfile(lock_file):
+ """
+ Unlock a file locked using lockfile()
+ """
+ try:
+ # If we had a shared lock, we need to promote to exclusive before
+ # removing the lockfile. Attempt this, ignore failures.
+ fcntl.flock(lock_file.fileno(), fcntl.LOCK_EX|fcntl.LOCK_NB)
+ os.unlink(lock_file.name)
+ except (IOError, OSError):
+ pass
+ fcntl.flock(lock_file.fileno(), fcntl.LOCK_UN)
+ lock_file.close()
+
+#ENDOFCOPY
+
+
+def mk_lock_filename():
+ our_name = os.path.basename(__file__)
+ our_name = ".%s" % ".".join(reversed(our_name.split(".")))
+ return config.LOCKFILE + our_name
+
+
+
+class ShellCmdException(Exception):
+ pass
+
+def run_shell_cmd(command, cwd=None):
+ if cwd is None:
+ cwd = os.getcwd()
+
+ config.logger.debug("_shellcmd: (%s) %s", cwd, command)
+ process = subprocess.Popen(command, cwd=cwd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+ (out, err) = process.communicate()
+ process.wait()
+ if process.returncode:
+ if len(err) == 0:
+ err = "command: %s \n%s" % (command, out)
+ else:
+ err = "command: %s \n%s" % (command, err)
+ config.logger.warn("_shellcmd: error \n%s\n%s", out, err)
+ raise ShellCmdException(err)
+ else:
+ #config.logger.debug("localhostbecontroller: shellcmd success\n%s" % out)
+ return out
+
diff --git a/bitbake/lib/toaster/contrib/tts/tests.py b/bitbake/lib/toaster/contrib/tts/tests.py
new file mode 100644
index 0000000..c510ebb
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/tests.py
@@ -0,0 +1,115 @@
+#!/usr/bin/python
+
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# Copyright (C) 2015 Alexandru Damian for Intel Corp.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+
+# Test definitions. The runner will look for and auto-discover the tests
+# no matter what file they are in, as long as they are in the same directory
+# as this file.
+
+import unittest
+from shellutils import run_shell_cmd, ShellCmdException
+import config
+
+import pexpect
+import sys, os, signal, time
+
+class Test00PyCompilable(unittest.TestCase):
+ ''' Verifies that all Python files are syntactically correct '''
+ def test_compile_file(self):
+ try:
+ run_shell_cmd("find . -name *py -type f -print0 | xargs -0 -n1 -P20 python -m py_compile", config.TESTDIR)
+ except ShellCmdException as exc:
+ self.fail("Error compiling python files: %s" % (exc))
+
+ def test_pylint_file(self):
+ try:
+ run_shell_cmd(r"find . -iname \"*\.py\" -type f -print0 | PYTHONPATH=${PYTHONPATH}:. xargs -r -0 -n1 pylint --load-plugins pylint_django -E --reports=n 2>&1", cwd=config.TESTDIR + "/bitbake/lib/toaster")
+ except ShellCmdException as exc:
+ self.fail("Pylint fails: %s\n" % exc)
+
+class Test01PySystemStart(unittest.TestCase):
+    ''' Attempts to start Toaster, verify that it is successful, and stop it '''
+ def setUp(self):
+ run_shell_cmd("bash -c 'rm -f build/*log'")
+
+ def test_start_interactive_mode(self):
+ try:
+ run_shell_cmd("bash -c 'source %s/oe-init-build-env && source toaster start webport=%d && source toaster stop'" % (config.TESTDIR, config.TOASTER_PORT), config.TESTDIR)
+ except ShellCmdException as exc:
+ self.fail("Failed starting interactive mode: %s" % (exc))
+
+ def test_start_managed_mode(self):
+ try:
+ run_shell_cmd("%s/bitbake/bin/toaster webport=%d nobrowser & sleep 10 && curl http://localhost:%d/ && kill -2 %%1" % (config.TESTDIR, config.TOASTER_PORT, config.TOASTER_PORT), config.TESTDIR)
+ except ShellCmdException as exc:
+ self.fail("Failed starting managed mode: %s" % (exc))
+
+class Test02HTML5Compliance(unittest.TestCase):
+ def setUp(self):
+ self.origdir = os.getcwd()
+ self.crtdir = os.path.dirname(config.TESTDIR)
+ self.cleanup_database = False
+ os.chdir(self.crtdir)
+ if not os.path.exists(os.path.join(self.crtdir, "toaster.sqlite")):
+ self.cleanup_database = True
+ run_shell_cmd("%s/bitbake/lib/toaster/manage.py syncdb --noinput" % config.TESTDIR)
+ run_shell_cmd("%s/bitbake/lib/toaster/manage.py migrate orm" % config.TESTDIR)
+ run_shell_cmd("%s/bitbake/lib/toaster/manage.py migrate bldcontrol" % config.TESTDIR)
+ run_shell_cmd("%s/bitbake/lib/toaster/manage.py loadconf %s/meta-yocto/conf/toasterconf.json" % (config.TESTDIR, config.TESTDIR))
+ run_shell_cmd("%s/bitbake/lib/toaster/manage.py lsupdates" % config.TESTDIR)
+
+ setup = pexpect.spawn("%s/bitbake/lib/toaster/manage.py checksettings" % config.TESTDIR)
+ setup.logfile = sys.stdout
+ setup.expect(r".*or type the full path to a different directory: ")
+ setup.sendline('')
+ setup.sendline('')
+ setup.expect(r".*or type the full path to a different directory: ")
+ setup.sendline('')
+ setup.expect(r"Enter your option: ")
+ setup.sendline('0')
+
+ self.child = pexpect.spawn("bash", ["%s/bitbake/bin/toaster" % config.TESTDIR, "webport=%d" % config.TOASTER_PORT, "nobrowser"], cwd=self.crtdir)
+ self.child.logfile = sys.stdout
+ self.child.expect("Toaster is now running. You can stop it with Ctrl-C")
+
+ def test_html5_compliance(self):
+ import urllist, urlcheck
+ results = {}
+ for url in urllist.URLS:
+ results[url] = urlcheck.validate_html5(config.TOASTER_BASEURL + url)
+
+ failed = []
+ for url in results:
+ if results[url][1] != 0:
+ failed.append((url, results[url]))
+
+
+ self.assertTrue(len(failed) == 0, "Not all URLs validate: \n%s " % "\n".join(["".join(str(x)) for x in failed]))
+
+ #(config.TOASTER_BASEURL + url, status, errors, warnings))
+
+ def tearDown(self):
+ while self.child.isalive():
+ self.child.kill(signal.SIGINT)
+ time.sleep(1)
+ os.chdir(self.origdir)
+ toaster_sqlite_path = os.path.join(self.crtdir, "toaster.sqlite")
+ if self.cleanup_database and os.path.exists(toaster_sqlite_path):
+ os.remove(toaster_sqlite_path)
diff --git a/bitbake/lib/toaster/contrib/tts/toasteruitest/run_toastertests.py b/bitbake/lib/toaster/contrib/tts/toasteruitest/run_toastertests.py
new file mode 100755
index 0000000..880487c
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/toasteruitest/run_toastertests.py
@@ -0,0 +1,87 @@
+#!/usr/bin/python
+
+# Copyright
+
+# DESCRIPTION
+# This is a script for running all selected toaster cases on
+# the web browsers specified in toaster_test.cfg.
+
+# 1. How to start toaster in yocto:
+# $ source poky/oe-init-build-env
+# $ source toaster start
+# $ bitbake core-image-minimal
+
+# 2. How to install selenium on Ubuntu:
+# $ sudo apt-get install scrot python-pip
+# $ sudo pip install selenium
+
+# 3. How to install selenium addon in firefox:
+# Download the latest firefox addon from http://release.seleniumhq.org/selenium-ide/
+# Then install it. You can also install the firebug and firepath addons
+
+# 4. How to start writing a new case:
+# All you need to do is to implement the function test_xxx() and pile it on.
+
+# 5. How to test with Chrome browser
+# Download/install chrome on host
+# Download chromedriver from https://code.google.com/p/chromedriver/downloads/list according to your host type
+# put chromedriver in PATH (e.g. /usr/bin/; bear in mind to chmod it executable)
+# For windows host, you may put chromedriver.exe in the same directory as chrome.exe
+
+
+import unittest, time, re, sys, getopt, os, logging, platform
+import ConfigParser
+import subprocess
+
+
+class toaster_run_all():
+ def __init__(self):
+ # in case this script is called from other directory
+ os.chdir(os.path.abspath(sys.path[0]))
+ self.starttime = time.strptime(time.ctime())
+ self.parser = ConfigParser.SafeConfigParser()
+ found = self.parser.read('toaster_test.cfg')
+ self.host_os = platform.system().lower()
+ self.run_all_cases()
+ self.collect_log()
+
+ def get_test_cases(self):
+ # we have config groups for different os type in toaster_test.cfg
+ cases_to_run = eval(self.parser.get('toaster_test_' + self.host_os, 'test_cases'))
+ return cases_to_run
+
+
+ def run_all_cases(self):
+ cases_temp = self.get_test_cases()
+ for case in cases_temp:
+ single_case_cmd = "python -m unittest toaster_automation_test.toaster_cases.test_" + str(case)
+ print single_case_cmd
+ subprocess.call(single_case_cmd, shell=True)
+
+ def collect_log(self):
+ """
+        The log files are temporarily stored in ./log/tmp/.
+        After all cases are done, they should be transferred to ./log/$TIMESTAMP/
+ """
+ def comple(number):
+ if number < 10:
+ return str(0) + str(number)
+ else:
+ return str(number)
+ now = self.starttime
+ now_str = comple(now.tm_year) + comple(now.tm_mon) + comple(now.tm_mday) + \
+ comple(now.tm_hour) + comple(now.tm_min) + comple(now.tm_sec)
+ log_dir = os.path.abspath(sys.path[0]) + os.sep + 'log' + os.sep + now_str
+ log_tmp_dir = os.path.abspath(sys.path[0]) + os.sep + 'log' + os.sep + 'tmp'
+ try:
+ os.renames(log_tmp_dir, log_dir)
+ except OSError :
+            logging.error(" Cannot create the timestamped log dir under log/, please check your permissions")
+
+
+if __name__ == "__main__":
+ toaster_run_all()
+
+
+
+
diff --git a/bitbake/lib/toaster/contrib/tts/toasteruitest/toaster_automation_test.py b/bitbake/lib/toaster/contrib/tts/toasteruitest/toaster_automation_test.py
new file mode 100755
index 0000000..2a2078f
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/toasteruitest/toaster_automation_test.py
@@ -0,0 +1,1883 @@
+#!/usr/bin/python
+# Copyright
+
+# DESCRIPTION
+# This is toaster automation base class and test cases file
+
+# History:
+# 2015.03.09 initial version
+# 2015.03.23 adding toaster_test.cfg, run_toastertests.py so we can run case by case from outside
+
+# Briefs:
+# This file is comprised of 3 parts:
+# I: common utils like sorting, getting attributes, etc.
+# II: base class part, which complies with the unittest framework and
+# contains selenium-based class functions
+# III: test cases
+# to add new case: just implement new test_xxx() function in class toaster_cases
+
+# NOTES for cases:
+# case 946:
+# step 6 - 8 needs to be observed using screenshots
+# case 956:
+# step 2 - 3 needs to be run manually
+
+import unittest, time, re, sys, getopt, os, logging, string, errno, exceptions
+import shutil, argparse, ConfigParser, platform
+from selenium import webdriver
+from selenium.common.exceptions import NoSuchElementException
+from selenium import selenium
+from selenium.webdriver.common.by import By
+from selenium.webdriver.common.keys import Keys
+from selenium.webdriver.support.ui import Select
+
+
+###########################################
+# #
+# PART I: utils stuff #
+# #
+###########################################
+
+class Listattr(object):
+ """
+    Set of list attributes. This is used to determine what the list content is.
+ Later on we may add more attributes here.
+ """
+ NULL = "null"
+ NUMBERS = "numbers"
+ STRINGS = "strings"
+ PERCENT = "percentage"
+ SIZE = "size"
+ UNKNOWN = "unknown"
+
+
+def get_log_root_dir():
+ max_depth = 5
+ parent_dir = '../'
+ for number in range(0, max_depth):
+ if os.path.isdir(sys.path[0] + os.sep + (os.pardir + os.sep)*number + 'log'):
+ log_root_dir = os.path.abspath(sys.path[0] + os.sep + (os.pardir + os.sep)*number + 'log')
+ break
+
+ if number == (max_depth - 1):
+ print 'No log dir found. Please check'
+ raise Exception
+
+ return log_root_dir
+
+
+def mkdir_p(dir):
+ try:
+ os.makedirs(dir)
+ except OSError as exc:
+ if exc.errno == errno.EEXIST and os.path.isdir(dir):
+ pass
+ else:
+ raise
+
+
+def get_list_attr(testlist):
+ """
+ To determine the list content
+ """
+ if not testlist:
+ return Listattr.NULL
+ listtest = testlist[:]
+ try:
+ listtest.remove('')
+ except ValueError:
+ pass
+ pattern_percent = re.compile(r"^([0-9])+(\.)?([0-9])*%$")
+ pattern_size = re.compile(r"^([0-9])+(\.)?([0-9])*( )*(K)*(M)*(G)*B$")
+ pattern_number = re.compile(r"^([0-9])+(\.)?([0-9])*$")
+ def get_patterned_number(pattern, tlist):
+ count = 0
+ for item in tlist:
+ if re.search(pattern, item):
+ count += 1
+ return count
+ if get_patterned_number(pattern_percent, listtest) == len(listtest):
+ return Listattr.PERCENT
+ elif get_patterned_number(pattern_size, listtest) == len(listtest):
+ return Listattr.SIZE
+ elif get_patterned_number(pattern_number, listtest) == len(listtest):
+ return Listattr.NUMBERS
+ else:
+ return Listattr.STRINGS
+
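+# Illustrative usage of get_list_attr (not exercised directly by the test cases):
+#   get_list_attr(['10%', '25.5%'])   -> Listattr.PERCENT
+#   get_list_attr(['1 KB', '2.5 MB']) -> Listattr.SIZE
+#   get_list_attr(['3', '1', '2'])    -> Listattr.NUMBERS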
+
+def is_list_sequenced(testlist):
+ """
+    Function to tell if a list is sequenced
+    Currently a list may be made up of: strings, numbers, percentages, time or sizes.
+    Each has its respective way to determine if it's sequenced.
+ """
+ test_list = testlist[:]
+ try:
+ test_list.remove('')
+ except ValueError:
+ pass
+
+ if get_list_attr(testlist) == Listattr.NULL :
+ return True
+
+ elif get_list_attr(testlist) == Listattr.STRINGS :
+ return (sorted(test_list) == test_list)
+
+ elif get_list_attr(testlist) == Listattr.NUMBERS :
+ list_number = []
+ for item in test_list:
+ list_number.append(eval(item))
+ return (sorted(list_number) == list_number)
+
+ elif get_list_attr(testlist) == Listattr.PERCENT :
+ list_number = []
+ for item in test_list:
+ list_number.append(eval(item.strip('%')))
+ return (sorted(list_number) == list_number)
+
+ elif get_list_attr(testlist) == Listattr.SIZE :
+ list_number = []
+        # currently SIZE is split by a space
+ for item in test_list:
+ if item.split()[1].upper() == "KB":
+ list_number.append(1024 * eval(item.split()[0]))
+ elif item.split()[1].upper() == "MB":
+ list_number.append(1024 * 1024 * eval(item.split()[0]))
+ elif item.split()[1].upper() == "GB":
+ list_number.append(1024 * 1024 * 1024 * eval(item.split()[0]))
+ else:
+ list_number.append(eval(item.split()[0]))
+ return (sorted(list_number) == list_number)
+
+ else:
+ print 'Unrecognized list type, please check'
+ return False
+
+
+def is_list_inverted(testlist):
+ """
+ Function to tell if list is inverted
+ Currently we may have list made up of: Strings ; numbers ; percentage ; time; size
+ Each has respective way to determine if it's inverted.
+ """
+ test_list = testlist[:]
+ try:
+ test_list.remove('')
+ except ValueError:
+ pass
+
+ if get_list_attr(testlist) == Listattr.NULL :
+ return True
+
+ elif get_list_attr(testlist) == Listattr.STRINGS :
+ return (sorted(test_list, reverse = True) == test_list)
+
+ elif get_list_attr(testlist) == Listattr.NUMBERS :
+ list_number = []
+ for item in test_list:
+ list_number.append(eval(item))
+ return (sorted(list_number, reverse = True) == list_number)
+
+ elif get_list_attr(testlist) == Listattr.PERCENT :
+ list_number = []
+ for item in test_list:
+ list_number.append(eval(item.strip('%')))
+ return (sorted(list_number, reverse = True) == list_number)
+
+ elif get_list_attr(testlist) == Listattr.SIZE :
+ list_number = []
+        # currently SIZE is split by a space, such as 0 B; 1 KB; 2 MB
+ for item in test_list:
+ if item.split()[1].upper() == "KB":
+ list_number.append(1024 * eval(item.split()[0]))
+ elif item.split()[1].upper() == "MB":
+ list_number.append(1024 * 1024 * eval(item.split()[0]))
+ elif item.split()[1].upper() == "GB":
+ list_number.append(1024 * 1024 * 1024 * eval(item.split()[0]))
+ else:
+ list_number.append(eval(item.split()[0]))
+ return (sorted(list_number, reverse = True) == list_number)
+
+ else:
+ print 'Unrecognized list type, please check'
+ return False
+
+def replace_file_content(filename, item, option):
+ f = open(filename)
+ lines = f.readlines()
+ f.close()
+ output = open(filename, 'w')
+ for line in lines:
+ if line.startswith(item):
+ output.write(item + " = '" + option + "'\n")
+ else:
+ output.write(line)
+ output.close()
+
+def extract_number_from_string(s):
+ """
+    Extract the numbers from a string. The return type is 'list'
+ """
+ return re.findall(r'([0-9]+)', s)
+
+
+
+###########################################
+# #
+# PART II: base class #
+# #
+###########################################
+
+class toaster_cases_base(unittest.TestCase):
+
+ def setUp(self):
+ self.screenshot_sequence = 1
+ self.verificationErrors = []
+ self.accept_next_alert = True
+ self.host_os = platform.system().lower()
+ self.parser = ConfigParser.SafeConfigParser()
+ configs = self.parser.read('toaster_test.cfg')
+ self.base_url = eval(self.parser.get('toaster_test_' + self.host_os, 'toaster_url'))
+
+        # create log dir. Currently, we put log files in log/tmp. After all
+ # test cases are done, move them to log/$datetime dir
+ self.log_tmp_dir = os.path.abspath(sys.path[0]) + os.sep + 'log' + os.sep + 'tmp'
+ try:
+ mkdir_p(self.log_tmp_dir)
+ except OSError :
+            logging.error("Cannot create tmp dir under log, please check your permissions")
+ self.log = self.logger_create()
+ # driver setup
+ self.setup_browser()
+
+ def logger_create(self):
+ """
+        we use the root logger for every test case.
+        The reason why we don't use a TOASTERXXX_logger is to avoid setting the level separately for
+        the root logger and the TOASTERXXX_logger.
+        To be discussed
+ """
+ log_level_dict = {'CRITICAL':logging.CRITICAL, 'ERROR':logging.ERROR, 'WARNING':logging.WARNING, \
+ 'INFO':logging.INFO, 'DEBUG':logging.DEBUG, 'NOTSET':logging.NOTSET}
+ log = logging.getLogger()
+# log = logging.getLogger('TOASTER_' + str(self.case_no))
+ self.logging_level = eval(self.parser.get('toaster_test_' + self.host_os, 'logging_level'))
+ key = self.logging_level.upper()
+ log.setLevel(log_level_dict[key])
+ fh = logging.FileHandler(filename=self.log_tmp_dir + os.sep + 'case_all' + '.log', mode='a')
+ ch = logging.StreamHandler(sys.stdout)
+ formatter = logging.Formatter('%(pathname)s - %(lineno)d - %(asctime)s \n \
+ %(name)s - %(levelname)s - %(message)s')
+ fh.setFormatter(formatter)
+ ch.setFormatter(formatter)
+ log.addHandler(fh)
+ log.addHandler(ch)
+ return log
+
+
+ def setup_browser(self, *browser_path):
+ self.browser = eval(self.parser.get('toaster_test_' + self.host_os, 'test_browser'))
+ print self.browser
+ if self.browser == "firefox":
+ driver = webdriver.Firefox()
+ elif self.browser == "chrome":
+ driver = webdriver.Chrome()
+ elif self.browser == "ie":
+ driver = webdriver.Ie()
+ else:
+ driver = None
+ print "unrecognized browser type, please check"
+ self.driver = driver
+ self.driver.implicitly_wait(30)
+ return self.driver
+
+
+ def save_screenshot(self, **log_args):
+ """
+        This function is used to save the screen, either via the OS interface or the selenium interface.
+ How to use:
+ self.save_screenshot(screenshot_type = 'native'/'selenium', log_sub_dir = 'xxx',
+ append_name = 'stepx')
+        where 'native' means the screenshot function provided by the OS,
+        and 'selenium' means the screenshot function provided by the selenium webdriver
+ """
+ types = [log_args.get('screenshot_type')]
+ # when no screenshot_type is specified
+ if types == [None]:
+ types = ['native', 'selenium']
+ # normally append_name is used to specify which step..
+ add_name = log_args.get('append_name')
+ if not add_name:
+ add_name = '-'
+ # normally there's no need to specify sub_dir
+ sub_dir = log_args.get('log_sub_dir')
+ if not sub_dir:
+ # use casexxx as sub_dir name
+ sub_dir = 'case' + str(self.case_no)
+ for item in types:
+ log_dir = self.log_tmp_dir + os.sep + sub_dir
+ mkdir_p(log_dir)
+ log_path = log_dir + os.sep + self.browser + '-' +\
+ item + '-' + add_name + '-' + str(self.screenshot_sequence) + '.png'
+ if item == 'native':
+ os.system("scrot " + log_path)
+ elif item == 'selenium':
+ self.driver.get_screenshot_as_file(log_path)
+ self.screenshot_sequence += 1
+
+ def browser_delay(self):
+ """
+        currently this is a workaround for some chrome tests.
+ Sometimes we need a delay to accomplish some operation.
+ But for firefox, mostly we don't need this.
+ To be discussed
+ """
+ if self.browser == "chrome":
+ time.sleep(1)
+ return
+
+
+# these functions are not contained in WebDriver class..
+ def find_element_by_text(self, string):
+ return self.driver.find_element_by_xpath("//*[text()='" + string + "']")
+
+
+ def find_elements_by_text(self, string):
+ return self.driver.find_elements_by_xpath("//*[text()='" + string + "']")
+
+
+ def find_element_by_text_in_table(self, table_id, text_string):
+ """
+        This is used to search for a certain 'text' in a certain table
+ """
+ try:
+ table_element = self.get_table_element(table_id)
+ element = table_element.find_element_by_xpath("//*[text()='" + text_string + "']")
+ except NoSuchElementException, e:
+ print 'no element found'
+ raise
+ return element
+
+
+ def find_element_by_link_text_in_table(self, table_id, link_text):
+ """
+        Assume there are multiple matches for "find_element_by_link_text".
+        In this circumstance we need to specify the "table".
+ """
+ try:
+ table_element = self.get_table_element(table_id)
+ element = table_element.find_element_by_link_text(link_text)
+ except NoSuchElementException, e:
+ print 'no element found'
+ raise
+ return element
+
+
+ def find_elements_by_link_text_in_table(self, table_id, link_text):
+ """
+        Search link text in a certain table. This helps to narrow down the search area.
+ """
+ try:
+ table_element = self.get_table_element(table_id)
+ element_list = table_element.find_elements_by_link_text(link_text)
+ except NoSuchElementException, e:
+ print 'no element found'
+ raise
+ return element_list
+
+
+ def find_element_by_partial_link_text_in_table(self, table_id, link_text):
+ """
+        Search an element by partial link text in a certain table.
+ """
+ try:
+ table_element = self.get_table_element(table_id)
+ element = table_element.find_element_by_partial_link_text(link_text)
+ return element
+ except NoSuchElementException, e:
+ print 'no element found'
+ raise
+
+
+ def find_elements_by_partial_link_text_in_table(self, table_id, link_text):
+ """
+        Assume there are multiple elements matching "find_element_by_partial_link_text".
+ """
+ try:
+ table_element = self.get_table_element(table_id)
+ element_list = table_element.find_elements_by_partial_link_text(link_text)
+ return element_list
+ except NoSuchElementException, e:
+ print 'no element found'
+ raise
+
+
+ def find_element_by_xpath_in_table(self, table_id, xpath):
+ """
+        This helps to narrow down the search area. Especially useful when dealing with a pop-up form.
+ """
+ try:
+ table_element = self.get_table_element(table_id)
+ element = table_element.find_element_by_xpath(xpath)
+ except NoSuchElementException, e:
+ print 'no element found'
+ raise
+ return element
+
+
+ def find_elements_by_xpath_in_table(self, table_id, xpath):
+ """
+        This helps to narrow down the search area. Especially useful when dealing with a pop-up form.
+ """
+ try:
+ table_element = self.get_table_element(table_id)
+ element_list = table_element.find_elements_by_xpath(xpath)
+ except NoSuchElementException, e:
+ print 'no elements found'
+ raise
+ return element_list
+
+
+ def shortest_xpath(self, pname, pvalue):
+ return "//*[@" + pname + "='" + pvalue + "']"
+
+
+# usually elements in the same column have the same class name, for instance: class="outcome". TBD
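+# e.g. self.get_table_column_text('class', 'outcome') collects the text of every
+# element matching //*[@class='outcome'] on the current page (illustrative example)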
+ def get_table_column_text(self, attr_name, attr_value):
+ c_xpath = self.shortest_xpath(attr_name, attr_value)
+ elements = self.driver.find_elements_by_xpath(c_xpath)
+ c_list = []
+ for element in elements:
+ c_list.append(element.text)
+ return c_list
+
+
+ def get_table_column_text_by_column_number(self, table_id, column_number):
+ c_xpath = "//*[@id='" + table_id + "']//td[" + str(column_number) + "]"
+ elements = self.driver.find_elements_by_xpath(c_xpath)
+ c_list = []
+ for element in elements:
+ c_list.append(element.text)
+ return c_list
+
+
+ def get_table_head_text(self, *table_id):
+#now table_id is a tuple...
+ if table_id:
+ thead_xpath = "//*[@id='" + table_id[0] + "']//thead//th[text()]"
+ elements = self.driver.find_elements_by_xpath(thead_xpath)
+ c_list = []
+ for element in elements:
+ if element.text:
+ c_list.append(element.text)
+ return c_list
+#default table on page
+ else:
+ return self.driver.find_element_by_xpath("//*/table/thead").text
+
+
+
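+# get_table_element usage (illustrative):
+#   get_table_element('otable')       -> the whole table element
+#   get_table_element('otable', 2)    -> row 2 of the table body
+#   get_table_element('otable', 2, 3) -> the cell at row 2, column 3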
+ def get_table_element(self, table_id, *coordinate):
+ if len(coordinate) == 0:
+#return whole-table element
+ element_xpath = "//*[@id='" + table_id + "']"
+ try:
+ element = self.driver.find_element_by_xpath(element_xpath)
+ except NoSuchElementException, e:
+ raise
+ return element
+ row = coordinate[0]
+
+ if len(coordinate) == 1:
+#return whole-row element
+ element_xpath = "//*[@id='" + table_id + "']/tbody/tr[" + str(row) + "]"
+ try:
+ element = self.driver.find_element_by_xpath(element_xpath)
+ except NoSuchElementException, e:
+ return False
+ return element
+#now we are looking for an element with specified X and Y
+ column = coordinate[1]
+
+ element_xpath = "//*[@id='" + table_id + "']/tbody/tr[" + str(row) + "]/td[" + str(column) + "]"
+ try:
+ element = self.driver.find_element_by_xpath(element_xpath)
+ except NoSuchElementException, e:
+ return False
+ return element
+
+
+ def get_table_data(self, table_id, row_count, column_count):
+ row = 1
+ Lists = []
+ while row <= row_count:
+ column = 1
+ row_content=[]
+ while column <= column_count:
+ s= "//*[@id='" + table_id + "']/tbody/tr[" + str(row) +"]/td[" + str(column) + "]"
+ v = self.driver.find_element_by_xpath(s).text
+ row_content.append(v)
+ column = column + 1
+ print("row_content=",row_content)
+ Lists.extend(row_content)
+ print Lists[row-1][0]
+ row = row + 1
+ return Lists
+
+    # The is_xxx_present functions only return True/False
+    # All the log work is done in the test procedure, so we can easily trace back
+    # using logging
+ def is_text_present (self, patterns):
+ for pattern in patterns:
+ if str(pattern) not in self.driver.page_source:
+ return False
+ return True
+
+
+ def is_element_present(self, how, what):
+ try:
+ self.driver.find_element(how, what)
+ except NoSuchElementException, e:
+ return False
+ return True
+
+
+ def is_alert_present(self):
+ try: self.driver.switch_to_alert()
+ except NoAlertPresentException, e: return False
+ return True
+
+
+ def close_alert_and_get_its_text(self):
+ try:
+ alert = self.driver.switch_to_alert()
+ alert_text = alert.text
+ if self.accept_next_alert:
+ alert.accept()
+ else:
+ alert.dismiss()
+ return alert_text
+ finally: self.accept_next_alert = True
+
+
+ def get_case_number(self):
+ """
+ what case are we running now
+ """
+ funcname = sys._getframe(1).f_code.co_name
+ caseno_str = funcname.strip('test_')
+ try:
+ caseno = int(caseno_str)
+ except ValueError:
+ print "get case number error! please check if func name is test_xxx"
+ return False
+ return caseno
+
+
+ def tearDown(self):
+ self.log.info(' END: CASE %s log \n\n' % str(self.case_no))
+ self.driver.quit()
+ self.assertEqual([], self.verificationErrors)
+
+
+###################################################################
+# #
+# PART III: test cases #
+# please refer to #
+# https://bugzilla.yoctoproject.org/tr_show_case.cgi?case_id=xxx #
+# #
+###################################################################
+
+# Note: to comply with the unittest framework, we call these test_xxx functions
+# from run_toastertest.py to avoid calling setUp() and tearDown() multiple times
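+# For reference, run_toastertest.py invokes each selected case roughly like:
+#   python -m unittest toaster_automation_test.toaster_cases.test_<case_no>
+# so a single case can also be run by hand with the same command.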
+
+
+class toaster_cases(toaster_cases_base):
+ ##############
+ # CASE 901 #
+ ##############
+ def test_901(self):
+ # the reason why get_case_number is not in setUp function is that
+ # otherwise it returns "setUp" instead of "test_xxx"
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ # open all columns
+ self.driver.find_element_by_id("edit-columns-button").click()
+        # add an explicit wait for chromedriver..-_-
+ self.browser_delay()
+ self.driver.find_element_by_id("started_on").click()
+ self.browser_delay()
+ self.driver.find_element_by_id("time").click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+        # dict: {link text name : actual class name}
+ table_head_dict = {'Outcome':'outcome', 'Recipe':'target', 'Machine':'machine', 'Started on':'started_on', 'Completed on':'completed_on', \
+ 'Errors':'errors_no', 'Warnings':'warnings_no', 'Time':'time'}
+ for key in table_head_dict:
+ try:
+ self.driver.find_element_by_link_text(key).click()
+ except Exception, e:
+ self.log.error("%s cannot be found on page" % key)
+ raise
+ column_list = self.get_table_column_text("class", table_head_dict[key])
+ # after 1st click, the list should be either sequenced or inverted, but we don't have a "default order" here
+ # the point is, after another click, it should be another order
+ if is_list_inverted(column_list):
+ self.driver.find_element_by_link_text(key).click()
+ column_list = self.get_table_column_text("class", table_head_dict[key])
+ self.failUnless(is_list_sequenced(column_list))
+ else:
+ self.failUnless(is_list_sequenced(column_list))
+ self.driver.find_element_by_link_text(key).click()
+ column_list = self.get_table_column_text("class", table_head_dict[key])
+ self.failUnless(is_list_inverted(column_list))
+ self.log.info("case passed")
+
+
+ ##############
+ # CASE 902 #
+ ##############
+ def test_902(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+        # Could add more test patterns here in the future. Also, we could search items other than the target column in the future..
+ patterns = ["minimal", "sato"]
+ for pattern in patterns:
+ ori_target_column_texts = self.get_table_column_text("class", "target")
+ print ori_target_column_texts
+ self.driver.find_element_by_id("search").clear()
+ self.driver.find_element_by_id("search").send_keys(pattern)
+ self.driver.find_element_by_id("search-button").click()
+ new_target_column_texts = self.get_table_column_text("class", "target")
+ # if nothing found, we still count it as "pass"
+ if new_target_column_texts:
+ for text in new_target_column_texts:
+ self.failUnless(text.find(pattern))
+ self.driver.find_element_by_css_selector("i.icon-remove").click()
+ target_column_texts = self.get_table_column_text("class", "target")
+ self.failUnless(ori_target_column_texts == target_column_texts)
+
+
+ ##############
+ # CASE 903 #
+ ##############
+ def test_903(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ # when opening a new page, "started_on" is not displayed by default
+ self.driver.find_element_by_id("edit-columns-button").click()
+        # currently all the delays are for the chrome driver -_-
+ self.browser_delay()
+ self.driver.find_element_by_id("started_on").click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+ # step 4
+ items = ["Outcome", "Completed on", "Started on", "Failed tasks", "Errors", "Warnings"]
+ for item in items:
+ try:
+ temp_element = self.find_element_by_text_in_table('otable', item)
+                # this is how we find the "filter icon" at the same level as temp_element (where "a" means clickable, "i" means icon)
+                self.failUnless(temp_element.find_element_by_xpath("..//*/a/i[@class='icon-filter filtered']"))
+            except Exception, e:
+                self.log.error("%s cannot be found! %s" % (item, e))
+ self.failIf(True)
+ raise
+ # step 5-6
+ temp_element = self.find_element_by_link_text_in_table('otable', 'Outcome')
+ temp_element.find_element_by_xpath("..//*/a/i[@class='icon-filter filtered']").click()
+ self.browser_delay()
+ # the 2nd option, whatever it is
+ self.driver.find_element_by_xpath("(//input[@name='filter'])[2]").click()
+ # click "Apply" button
+ self.driver.find_element_by_xpath("//*[@id='filter_outcome']//*[text()='Apply']").click()
+ # save screen here
+ time.sleep(1)
+ self.save_screenshot(screenshot_type='selenium', append_name='step5')
+ temp_element = self.find_element_by_link_text_in_table('otable', 'Completed on')
+ temp_element.find_element_by_xpath("..//*/a/i[@class='icon-filter filtered']").click()
+ self.browser_delay()
+ self.driver.find_element_by_xpath("//*[@id='filter_completed_on']//*[text()='Apply']").click()
+ # save screen here to compare to previous one
+        # please note that for the chrome driver, we need a little break before saving
+        # the screen here -_-
+ self.browser_delay()
+ self.save_screenshot(screenshot_type='selenium', append_name='step6')
+ self.driver.find_element_by_id("search").clear()
+ self.driver.find_element_by_id("search").send_keys("core-image")
+ self.driver.find_element_by_id("search-button").click()
+
+
+ ##############
+ # CASE 904 #
+ ##############
+ def test_904(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_partial_link_text("core-image").click()
+ self.driver.find_element_by_link_text("Tasks").click()
+# self.driver.find_element_by_link_text("All builds").click()
+# self.driver.back()
+ self.table_name = 'otable'
+ # This is how we find the "default" rows-number!
+ rows_displayed = int(Select(self.driver.find_element_by_css_selector("select.pagesize")).first_selected_option.text)
+ print rows_displayed
+ self.failUnless(self.get_table_element(self.table_name, rows_displayed))
+ self.failIf(self.get_table_element(self.table_name, rows_displayed + 1))
+ # Search text box background text is "Search tasks"
+ self.failUnless(self.driver.find_element_by_xpath("//*[@id='searchform']/*[@placeholder='Search tasks']"))
+
+ self.driver.find_element_by_id("search").clear()
+ self.driver.find_element_by_id("search").send_keys("busybox")
+ self.driver.find_element_by_id("search-button").click()
+ self.browser_delay()
+ self.save_screenshot(screenshot_type='selenium', append_name='step5')
+ self.driver.find_element_by_css_selector("i.icon-remove").click()
+ # Save screen here
+ self.save_screenshot(screenshot_type='selenium', append_name='step5_2')
+ self.driver.find_element_by_id("edit-columns-button").click()
+ self.driver.find_element_by_id("cpu_used").click()
+ self.driver.find_element_by_id("disk_io").click()
+ self.driver.find_element_by_id("task_log").click()
+ self.driver.find_element_by_id("recipe_version").click()
+ self.driver.find_element_by_id("time_taken").click()
+ self.driver.find_element_by_css_selector("edit-columns-button").click()
+ # The operation is the same as case901
+        # dict: {link text name : actual class name}
+ table_head_dict = {'Order':'order', 'Recipe':'recipe_name', 'Task':'task_name', 'Executed':'executed', \
+ 'Outcome':'outcome', 'Cache attempt':'cache_attempt', 'Time (secs)':'time_taken', 'CPU usage':'cpu_used', \
+ 'Disk I/O (ms)':'disk_io'}
+ for key in table_head_dict:
+# This is tricky here: we are doing so because there may be more than 1
+# same-name link_text in one page. So we only find element inside the table
+ self.find_element_by_link_text_in_table(self.table_name, key).click()
+ column_list = self.get_table_column_text("class", table_head_dict[key])
+# after 1st click, the list should be either sequenced or inverted, but we don't have a "default order" here
+# the point is, after another click, it should be another order
+# the first case is special: this means every item in column_list is the same, so
+# after one click, either sequenced or inverted will be fine
+ if (is_list_inverted(column_list) and is_list_sequenced(column_list)) \
+ or (not column_list) :
+ self.find_element_by_link_text_in_table(self.table_name, key).click()
+ column_list = self.get_table_column_text("class", table_head_dict[key])
+ self.failUnless(is_list_sequenced(column_list) or is_list_inverted(column_list))
+ elif is_list_inverted(column_list):
+ self.find_element_by_link_text_in_table(self.table_name, key).click()
+ column_list = self.get_table_column_text("class", table_head_dict[key])
+ self.failUnless(is_list_sequenced(column_list))
+ else:
+ self.failUnless(is_list_sequenced(column_list))
+ self.find_element_by_link_text_in_table(self.table_name, key).click()
+ column_list = self.get_table_column_text("class", table_head_dict[key])
+ self.failUnless(is_list_inverted(column_list))
+# step 8-10
+ # filter dict: {link text name : filter table name in xpath}
+ filter_dict = {'Executed':'filter_executed', 'Outcome':'filter_outcome', 'Cache attempt':'filter_cache_attempt'}
+ for key in filter_dict:
+ temp_element = self.find_element_by_link_text_in_table(self.table_name, key)
+            # find the filter icon beside it.
+            # And here we must have a break (1 sec) to get the popup stuff
+ temp_element.find_element_by_xpath("..//*[@class='icon-filter filtered']").click()
+ self.browser_delay()
+ avail_options = self.driver.find_elements_by_xpath("//*[@id='" + filter_dict[key] + "']//*[@name='filter'][not(@disabled)]")
+ for number in range(0, len(avail_options)):
+ avail_options[number].click()
+ self.browser_delay()
+ # click "Apply"
+ self.driver.find_element_by_xpath("//*[@id='" + filter_dict[key] + "']//*[@type='submit']").click()
+ # insert screen capture here
+ self.browser_delay()
+ self.save_screenshot(screenshot_type='selenium', append_name='step8')
+                # after the last option was clicked, we don't need the operations below anymore
+ if number < len(avail_options)-1:
+ temp_element = self.find_element_by_link_text_in_table(self.table_name, key)
+ temp_element.find_element_by_xpath("..//*[@class='icon-filter filtered']").click()
+ avail_options = self.driver.find_elements_by_xpath("//*[@id='" + filter_dict[key] + "']//*[@name='filter'][not(@disabled)]")
+ self.browser_delay()
+# step 11
+ for item in ['order', 'task_name', 'executed', 'outcome', 'recipe_name', 'recipe_version']:
+ try:
+ self.find_element_by_xpath_in_table(self.table_name, "./tbody/tr[1]/*[@class='" + item + "']/a").click()
+ except NoSuchElementException, e:
+ # let it go...
+                print 'no item in the column ' + item
+ # insert screen shot here
+ self.save_screenshot(screenshot_type='selenium', append_name='step11')
+ self.driver.back()
+# step 12-14
+ # about test_dict: please refer to testcase 904 requirement step 12-14
+ test_dict = {
+ 'Time':{
+ 'class':'time_taken',
+ 'check_head_list':['Recipe', 'Task', 'Executed', 'Outcome', 'Time (secs)'],
+ 'check_column_list':['cpu_used', 'cache_attempt', 'disk_io', 'order', 'recipe_version']
+ },
+ 'CPU usage':{
+ 'class':'cpu_used',
+ 'check_head_list':['Recipe', 'Task', 'Executed', 'Outcome', 'CPU usage'],
+ 'check_column_list':['cache_attempt', 'disk_io', 'order', 'recipe_version', 'time_taken']
+ },
+ 'Disk I/O':{
+ 'class':'disk_io',
+ 'check_head_list':['Recipe', 'Task', 'Executed', 'Outcome', 'Disk I/O (ms)'],
+ 'check_column_list':['cpu_used', 'cache_attempt', 'order', 'recipe_version', 'time_taken']
+ }
+ }
+ for key in test_dict:
+ self.find_element_by_partial_link_text_in_table('nav', 'core-image').click()
+ self.find_element_by_link_text_in_table('nav', key).click()
+ head_list = self.get_table_head_text('otable')
+ for item in test_dict[key]['check_head_list']:
+ self.failUnless(item in head_list)
+ column_list = self.get_table_column_text('class', test_dict[key]['class'])
+ self.failUnless(is_list_inverted(column_list))
+
+ self.driver.find_element_by_id("edit-columns-button").click()
+ for item2 in test_dict[key]['check_column_list']:
+ self.driver.find_element_by_id(item2).click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+ # TBD: save screen here
+
+
+ ##############
+ # CASE 906 #
+ ##############
+ def test_906(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ self.find_element_by_link_text_in_table('nav', 'Packages').click()
+ # find "bash" in first column (Packages)
+ self.driver.find_element_by_xpath("//*[@id='otable']//td[1]//*[text()='bash']").click()
+        # save screen here to observe...
+# step 6
+ self.driver.find_element_by_partial_link_text("Generated files").click()
+ head_list = self.get_table_head_text('otable')
+ for item in ['File', 'Size']:
+ self.failUnless(item in head_list)
+ c_list = self.get_table_column_text('class', 'path')
+ self.failUnless(is_list_sequenced(c_list))
+# step 7
+ self.driver.find_element_by_partial_link_text("Runtime dependencies").click()
+        # save screen here to observe...
+ # note that here table name is not 'otable'
+ head_list = self.get_table_head_text('dependencies')
+ for item in ['Package', 'Version', 'Size']:
+ self.failUnless(item in head_list)
+ c_list = self.get_table_column_text_by_column_number('dependencies', 1)
+ self.failUnless(is_list_sequenced(c_list))
+ texts = ['Size', 'License', 'Recipe', 'Recipe version', 'Layer', \
+ 'Layer branch', 'Layer commit', 'Layer directory']
+ self.failUnless(self.is_text_present(texts))
+
+
+ ##############
+ # CASE 910 #
+ ##############
+ def test_910(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ image_type="core-image-minimal"
+ test_package1="busybox"
+ test_package2="lib"
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text(image_type).click()
+ self.driver.find_element_by_link_text("Recipes").click()
+ self.save_screenshot(screenshot_type='selenium', append_name='step3')
+
+ self.table_name = 'otable'
+ # This is how we find the "default" rows-number!
+ rows_displayed = int(Select(self.driver.find_element_by_css_selector("select.pagesize")).first_selected_option.text)
+ print rows_displayed
+ self.failUnless(self.get_table_element(self.table_name, rows_displayed))
+ self.failIf(self.get_table_element(self.table_name, rows_displayed + 1))
+
+ # Check the default table is sorted by Recipe
+ tasks_column_count = len(self.driver.find_elements_by_xpath("/html/body/div[2]/div/div[2]/div[2]/table/tbody/tr/td[1]"))
+ print tasks_column_count
+ default_column_list = self.get_table_column_text_by_column_number(self.table_name, 1)
+ #print default_column_list
+
+ self.failUnless(is_list_sequenced(default_column_list))
+
+ # Search text box background text is "Search recipes"
+ self.failUnless(self.driver.find_element_by_xpath("//*[@id='searchform']/*[@placeholder='Search recipes']"))
+
+ self.driver.find_element_by_id("search").clear()
+ self.driver.find_element_by_id("search").send_keys(test_package1)
+ self.driver.find_element_by_id("search-button").click()
+ # Save screen here
+ self.save_screenshot(screenshot_type='selenium', append_name='step4')
+ self.driver.find_element_by_css_selector("i.icon-remove").click()
+ self.save_screenshot(screenshot_type='selenium', append_name='step4_2')
+
+ self.driver.find_element_by_id("edit-columns-button").click()
+ self.driver.find_element_by_id("depends_on").click()
+ self.driver.find_element_by_id("layer_version__branch").click()
+ self.driver.find_element_by_id("layer_version__layer__commit").click()
+ self.driver.find_element_by_id("depends_by").click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+
+ self.find_element_by_link_text_in_table(self.table_name, 'Recipe').click()
+ # Check the inverted table by Recipe
+ # Recipe doesn't have class name
+ #inverted_tasks_column_count = len(self.driver.find_elements_by_xpath("/html/body/div[2]/div/div[2]/div[2]/table/tbody/tr/td[1]"))
+ #print inverted_tasks_column_count
+ #inverted_column_list = self.get_table_column_text_by_column_number(self.table_name, 1)
+ #print inverted_column_list
+
+ #self.driver.find_element_by_partial_link_text("zlib").click()
+ #self.driver.back()
+ #self.failUnless(is_list_inverted(inverted_column_list))
+ #self.find_element_by_link_text_in_table(self.table_name, 'Recipe').click()
+
+ table_head_dict = {'Recipe':'recipe__name', 'Recipe file':'recipe_file', 'Section':'recipe_section', \
+ 'License':'recipe_license', 'Layer':'layer_version__layer__name', \
+ 'Layer branch':'layer_version__branch'}
+ for key in table_head_dict:
+ self.find_element_by_link_text_in_table(self.table_name, key).click()
+ column_list = self.get_table_column_text("class", table_head_dict[key])
+ if (is_list_inverted(column_list) and is_list_sequenced(column_list)) \
+ or (not column_list) :
+ self.find_element_by_link_text_in_table(self.table_name, key).click()
+ column_list = self.get_table_column_text("class", table_head_dict[key])
+ self.failUnless(is_list_sequenced(column_list) or is_list_inverted(column_list))
+ self.driver.find_element_by_partial_link_text("acl").click()
+ self.driver.back()
+ self.failUnless(is_list_sequenced(column_list) or is_list_inverted(column_list))
+ # Search text box background text is "Search recipes"
+ self.failUnless(self.driver.find_element_by_xpath("//*[@id='searchform']/*[@placeholder='Search recipes']"))
+ self.driver.find_element_by_id("search").clear()
+ self.driver.find_element_by_id("search").send_keys(test_package2)
+ self.driver.find_element_by_id("search-button").click()
+ column_search_list = self.get_table_column_text("class", table_head_dict[key])
+ self.failUnless(is_list_sequenced(column_search_list) or is_list_inverted(column_search_list))
+ self.driver.find_element_by_css_selector("i.icon-remove").click()
+ elif is_list_inverted(column_list):
+ self.find_element_by_link_text_in_table(self.table_name, key).click()
+ column_list = self.get_table_column_text("class", table_head_dict[key])
+ self.failUnless(is_list_sequenced(column_list))
+ self.driver.find_element_by_partial_link_text("acl").click()
+ self.driver.back()
+ self.failUnless(is_list_sequenced(column_list))
+ # Search text box background text is "Search recipes"
+ self.failUnless(self.driver.find_element_by_xpath("//*[@id='searchform']/*[@placeholder='Search recipes']"))
+ self.driver.find_element_by_id("search").clear()
+ self.driver.find_element_by_id("search").send_keys(test_package2)
+ self.driver.find_element_by_id("search-button").click()
+ column_search_list = self.get_table_column_text("class", table_head_dict[key])
+ self.failUnless(is_list_sequenced(column_search_list))
+ self.driver.find_element_by_css_selector("i.icon-remove").click()
+ else:
+ self.failUnless(is_list_sequenced(column_list))
+ self.find_element_by_link_text_in_table(self.table_name, key).click()
+ column_list = self.get_table_column_text("class", table_head_dict[key])
+ self.failUnless(is_list_inverted(column_list))
+ try:
+ self.driver.find_element_by_partial_link_text("acl").click()
+ except:
+ self.driver.find_element_by_partial_link_text("zlib").click()
+ self.driver.back()
+ self.failUnless(is_list_inverted(column_list))
+ # Search text box background text is "Search recipes"
+ self.failUnless(self.driver.find_element_by_xpath("//*[@id='searchform']/*[@placeholder='Search recipes']"))
+ self.driver.find_element_by_id("search").clear()
+ self.driver.find_element_by_id("search").send_keys(test_package2)
+ self.driver.find_element_by_id("search-button").click()
+ column_search_list = self.get_table_column_text("class", table_head_dict[key])
+ #print column_search_list
+ self.failUnless(is_list_inverted(column_search_list))
+ self.driver.find_element_by_css_selector("i.icon-remove").click()
+
+ # Bug 5919
+ for key in table_head_dict:
+ print key
+ self.find_element_by_link_text_in_table(self.table_name, key).click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+ self.driver.find_element_by_id(table_head_dict[key]).click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+ self.browser_delay()
+            # After hiding the column, the default table should be sorted by Recipe
+ tasks_column_count = len(self.driver.find_elements_by_partial_link_text("acl"))
+ #print tasks_column_count
+ default_column_list = self.get_table_column_text_by_column_number(self.table_name, 1)
+ #print default_column_list
+ self.failUnless(is_list_sequenced(default_column_list))
+
+ self.driver.find_element_by_id("edit-columns-button").click()
+ self.driver.find_element_by_id("recipe_file").click()
+ self.driver.find_element_by_id("recipe_section").click()
+ self.driver.find_element_by_id("recipe_license").click()
+ self.driver.find_element_by_id("layer_version__layer__name").click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+
+
+ ##############
+ # CASE 911 #
+ ##############
+ def test_911(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ self.find_element_by_link_text_in_table('nav', 'Recipes').click()
+# step 3-5
+ self.driver.find_element_by_id("search").clear()
+ self.driver.find_element_by_id("search").send_keys("lib")
+ self.driver.find_element_by_id("search-button").click()
+ # save screen here for observation
+ self.save_screenshot(screenshot_type='selenium', append_name='step5')
+# step 6
+ self.driver.find_element_by_css_selector("i.icon-remove").click()
+ self.driver.find_element_by_id("search").clear()
+ # we deliberately want "no result" here
+ self.driver.find_element_by_id("search").send_keys("no such input")
+ self.driver.find_element_by_id("search-button").click()
+ self.find_element_by_text("Show all recipes").click()
+ self.driver.quit()
+
+
+ ##############
+ # CASE 912 #
+ ##############
+ def test_912(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver = self.setup_browser(self)
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ self.find_element_by_link_text_in_table('nav', 'Recipes').click()
+ # step 3
+ head_list = self.get_table_head_text('otable')
+ for item in ['Recipe', 'Recipe version', 'Recipe file', 'Section', 'License', 'Layer']:
+ self.failUnless(item in head_list)
+ self.driver.find_element_by_css_selector("button.btn.dropdown-toggle").click()
+ self.driver.find_element_by_id("depends_on").click()
+ self.driver.find_element_by_id("layer_version__branch").click()
+ self.driver.find_element_by_id("layer_version__layer__commit").click()
+ self.driver.find_element_by_id("depends_by").click()
+ self.driver.find_element_by_css_selector("button.btn.dropdown-toggle").click()
+ # check if columns selected above is shown
+ check_list = ['Dependencies', 'Layer branch', 'Layer commit', 'Layer directory', 'Reverse dependencies']
+ head_list = self.get_table_head_text('otable')
+ time.sleep(2)
+ print head_list
+ for item in check_list:
+ self.failUnless(item in head_list)
+ # un-check 'em all
+ self.driver.find_element_by_css_selector("button.btn.dropdown-toggle").click()
+ self.driver.find_element_by_id("depends_on").click()
+ self.driver.find_element_by_id("layer_version__branch").click()
+ self.driver.find_element_by_id("layer_version__layer__commit").click()
+ self.driver.find_element_by_id("depends_by").click()
+ self.driver.find_element_by_css_selector("button.btn.dropdown-toggle").click()
+ # don't exist any more
+ head_list = self.get_table_head_text('otable')
+ for item in check_list:
+ self.failIf(item in head_list)
+
+
+ ##############
+ # CASE 913 #
+ ##############
+ def test_913(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ self.find_element_by_link_text_in_table('nav', 'Recipes').click()
+ # step 3
+ head_list = self.get_table_head_text('otable')
+ for item in ['Recipe', 'Recipe version', 'Recipe file', 'Section', 'License', 'Layer']:
+ self.failUnless(item in head_list)
+ # step 4
+ self.driver.find_element_by_id("edit-columns-button").click()
+ # save screen
+ self.browser_delay()
+ self.save_screenshot(screenshot_type='selenium', append_name='step4')
+ self.driver.find_element_by_id("edit-columns-button").click()
+
+
+ ##############
+ # CASE 914 #
+ ##############
+ def test_914(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ image_type="core-image-minimal"
+ test_package1="busybox"
+ test_package2="gdbm"
+ test_package3="gettext-native"
+ driver = self.driver
+ driver.maximize_window()
+ driver.get(self.base_url)
+ driver.find_element_by_link_text(image_type).click()
+ driver.find_element_by_link_text("Recipes").click()
+ driver.find_element_by_link_text(test_package1).click()
+
+ self.table_name = 'information'
+
+ tasks_row_count = len(driver.find_elements_by_xpath("/html/body/div[2]/div/div[3]/div/div[1]/table/tbody/tr/td[1]"))
+ tasks_column_count = len(driver.find_elements_by_xpath("/html/body/div[2]/div/div[3]/div/div[1]/table/tbody/tr[1]/td"))
+ print tasks_row_count
+ print tasks_column_count
+
+ Tasks_column = self.get_table_column_text_by_column_number(self.table_name, 2)
+ print ("Tasks_column=", Tasks_column)
+
+ key_tasks=["do_fetch", "do_unpack", "do_patch", "do_configure", "do_compile", "do_install", "do_package", "do_build"]
+ i = 0
+ while i < len(key_tasks):
+ if key_tasks[i] not in Tasks_column:
+ print ("Error! Missing key task: %s" % key_tasks[i])
+ else:
+ print ("%s is in tasks" % key_tasks[i])
+ i = i + 1
+
+ if Tasks_column.index(key_tasks[0]) != 0:
+ print ("Error! %s is not in the right position" % key_tasks[0])
+ else:
+ print ("%s is in right position" % key_tasks[0])
+
+ if Tasks_column[-1] != key_tasks[-1]:
+ print ("Error! %s is not in the right position" % key_tasks[-1])
+ else:
+ print ("%s is in right position" % key_tasks[-1])
+
+ driver.find_element_by_partial_link_text("Packages (").click()
+ packages_name = driver.find_element_by_partial_link_text("Packages (").text
+ print packages_name
+ packages_num = string.atoi(filter(str.isdigit, repr(packages_name)))
+ print packages_num
+
+ packages_row_count = len(driver.find_elements_by_xpath("/html/body/div[2]/div/div[3]/div/div[2]/table/tbody/tr/td[1]"))
+ print packages_row_count
+
+ if packages_num != packages_row_count:
+ print ("Error! The packages number is not correct")
+ else:
+ print ("The pakcages number is correct")
+
+ driver.find_element_by_partial_link_text("Build dependencies (").click()
+ depends_name = driver.find_element_by_partial_link_text("Build dependencies (").text
+ print depends_name
+ depends_num = string.atoi(filter(str.isdigit, repr(depends_name)))
+ print depends_num
+
+ if depends_num == 0:
+ depends_message = repr(driver.find_element_by_css_selector("div.alert.alert-info").text)
+ print depends_message
+ if depends_message.find("has no build dependencies.") < 0:
+ print ("Error! The message isn't expected.")
+ else:
+ print ("The message is expected")
+ else:
+ depends_row_count = len(driver.find_elements_by_xpath("/html/body/div[2]/div/div[3]/div/div[3]/table/tbody/tr/td[1]"))
+ print depends_row_count
+ if depends_num != depends_row_count:
+ print ("Error! The dependent packages number is not correct")
+ else:
+ print ("The dependent packages number is correct")
+
+ driver.find_element_by_partial_link_text("Reverse build dependencies (").click()
+ rdepends_name = driver.find_element_by_partial_link_text("Reverse build dependencies (").text
+ print rdepends_name
+ rdepends_num = string.atoi(filter(str.isdigit, repr(rdepends_name)))
+ print rdepends_num
+
+ if rdepends_num == 0:
+ rdepends_message = repr(driver.find_element_by_css_selector("#brought-in-by > div.alert.alert-info").text)
+ print rdepends_message
+ if rdepends_message.find("has no reverse build dependencies.") < 0:
+ print ("Error! The message isn't expected.")
+ else:
+ print ("The message is expected")
+ else:
+ print ("The reverse dependent packages number is correct")
+
+ driver.find_element_by_link_text("Recipes").click()
+ driver.find_element_by_link_text(test_package2).click()
+ driver.find_element_by_partial_link_text("Packages (").click()
+ driver.find_element_by_partial_link_text("Build dependencies (").click()
+ driver.find_element_by_partial_link_text("Reverse build dependencies (").click()
+
+
+ driver.find_element_by_link_text("Recipes").click()
+ driver.find_element_by_link_text(test_package3).click()
+
+ native_tasks_row_count = len(driver.find_elements_by_xpath("/html/body/div[2]/div/div[3]/div/div[1]/table/tbody/tr/td[1]"))
+ native_tasks_column_count = len(driver.find_elements_by_xpath("/html/body/div[2]/div/div[3]/div/div[1]/table/tbody/tr[1]/td"))
+ print native_tasks_row_count
+ print native_tasks_column_count
+
+ Native_Tasks_column = self.get_table_column_text_by_column_number(self.table_name, 2)
+ print ("Native_Tasks_column=", Native_Tasks_column)
+
+ native_key_tasks=["do_fetch", "do_unpack", "do_patch", "do_configure", "do_compile", "do_install", "do_build"]
+ i = 0
+ while i < len(native_key_tasks):
+ if native_key_tasks[i] not in Native_Tasks_column:
+ print ("Error! Missing key task: %s" % native_key_tasks[i])
+ else:
+ print ("%s is in tasks" % native_key_tasks[i])
+ i = i + 1
+
+ if Native_Tasks_column.index(native_key_tasks[0]) != 0:
+ print ("Error! %s is not in the right position" % native_key_tasks[0])
+ else:
+ print ("%s is in right position" % native_key_tasks[0])
+
+ if Native_Tasks_column[-1] != native_key_tasks[-1]:
+ print ("Error! %s is not in the right position" % native_key_tasks[-1])
+ else:
+ print ("%s is in right position" % native_key_tasks[-1])
+
+ driver.find_element_by_partial_link_text("Packages (").click()
+ native_packages_name = driver.find_element_by_partial_link_text("Packages (").text
+ print native_packages_name
+ native_packages_num = string.atoi(filter(str.isdigit, repr(native_packages_name)))
+ print native_packages_num
+
+ if native_packages_num != 0:
+ print ("Error! Native task shouldn't have any packages.")
+ else:
+ native_package_message = repr(driver.find_element_by_css_selector("div.alert.alert-info").text)
+ print native_package_message
+ if native_package_message.find("does not build any packages.") < 0:
+ print ("Error! The message for native task isn't expected.")
+ else:
+ print ("The message for native task is expected.")
+
+ driver.find_element_by_partial_link_text("Build dependencies (").click()
+ native_depends_name = driver.find_element_by_partial_link_text("Build dependencies (").text
+ print native_depends_name
+ native_depends_num = string.atoi(filter(str.isdigit, repr(native_depends_name)))
+ print native_depends_num
+
+ native_depends_row_count = len(driver.find_elements_by_xpath("/html/body/div[2]/div/div[3]/div/div[3]/table/tbody/tr/td[1]"))
+ print native_depends_row_count
+
+ if native_depends_num != native_depends_row_count:
+ print ("Error! The dependent packages number is not correct")
+ else:
+ print ("The dependent packages number is correct")
+
+ driver.find_element_by_partial_link_text("Reverse build dependencies (").click()
+ native_rdepends_name = driver.find_element_by_partial_link_text("Reverse build dependencies (").text
+ print native_rdepends_name
+ native_rdepends_num = string.atoi(filter(str.isdigit, repr(native_rdepends_name)))
+ print native_rdepends_num
+
+ native_rdepends_row_count = len(driver.find_elements_by_xpath("/html/body/div[2]/div/div[3]/div/div[4]/table/tbody/tr/td[1]"))
+ print native_rdepends_row_count
+
+ if native_rdepends_num != native_rdepends_row_count:
+ print ("Error! The reverse dependent packages number is not correct")
+ else:
+ print ("The reverse dependent packages number is correct")
+
+ driver.find_element_by_link_text("Recipes").click()
+
+
+ ##############
+ # CASE 915 #
+ ##############
+ def test_915(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+# step 3
+ self.find_element_by_link_text_in_table('nav', 'Configuration').click()
+ self.driver.find_element_by_link_text("BitBake variables").click()
+# step 4
+ self.driver.find_element_by_id("search").clear()
+ self.driver.find_element_by_id("search").send_keys("lib")
+ self.driver.find_element_by_id("search-button").click()
+ # save screen to see result
+ self.browser_delay()
+ self.save_screenshot(screenshot_type='selenium', append_name='step4')
+# step 5
+ self.driver.find_element_by_css_selector("i.icon-remove").click()
+ head_list = self.get_table_head_text('otable')
+ print head_list
+ print len(head_list)
+ self.failUnless(head_list == ['Variable', 'Value', 'Set in file', 'Description'])
+# step 8
+ # search other string. and click "Variable" to re-sort, check if table
+ # head is still the same
+ self.driver.find_element_by_id("search").clear()
+ self.driver.find_element_by_id("search").send_keys("poky")
+ self.driver.find_element_by_id("search-button").click()
+ self.find_element_by_link_text_in_table('otable', 'Variable').click()
+ head_list = self.get_table_head_text('otable')
+ self.failUnless(head_list == ['Variable', 'Value', 'Set in file', 'Description'])
+ self.find_element_by_link_text_in_table('otable', 'Variable').click()
+ head_list = self.get_table_head_text('otable')
+ self.failUnless(head_list == ['Variable', 'Value', 'Set in file', 'Description'])
+
+
+ ##############
+ # CASE 916 #
+ ##############
+ def test_916(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+# step 2-3
+ self.find_element_by_link_text_in_table('nav', 'Configuration').click()
+ self.driver.find_element_by_link_text("BitBake variables").click()
+ variable_list = self.get_table_column_text('class', 'variable_name')
+ self.failUnless(is_list_sequenced(variable_list))
+# step 4
+ self.find_element_by_link_text_in_table('otable', 'Variable').click()
+ variable_list = self.get_table_column_text('class', 'variable_name')
+ self.failUnless(is_list_inverted(variable_list))
+ self.find_element_by_link_text_in_table('otable', 'Variable').click()
+# step 5
+ # searching won't change the sequentiality
+ self.driver.find_element_by_id("search").clear()
+ self.driver.find_element_by_id("search").send_keys("lib")
+ self.driver.find_element_by_id("search-button").click()
+ variable_list = self.get_table_column_text('class', 'variable_name')
+ self.failUnless(is_list_sequenced(variable_list))
+
+
+ ##############
+ # CASE 923 #
+ ##############
+ def test_923(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ # Step 2
+ # default sequence in "Completed on" column is inverted
+ c_list = self.get_table_column_text('class', 'completed_on')
+ self.failUnless(is_list_inverted(c_list))
+ # step 3
+ self.driver.find_element_by_id("edit-columns-button").click()
+ self.driver.find_element_by_id("started_on").click()
+ self.driver.find_element_by_id("log").click()
+ self.driver.find_element_by_id("time").click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+ head_list = self.get_table_head_text('otable')
+        for item in ['Outcome', 'Target', 'Machine', 'Started on', 'Completed on', 'Failed tasks', 'Errors', 'Warnings', 'Time']:
+ self.failUnless(item in head_list)
+
+
+ ##############
+ # CASE 924 #
+ ##############
+ def test_924(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ # Please refer to case 924 requirement
+ # default sequence in "Completed on" column is inverted
+ c_list = self.get_table_column_text('class', 'completed_on')
+ self.failUnless(is_list_inverted(c_list))
+ # Step 4
+        # click Errors; the order in "Completed on" should be disturbed. Then hide the
+        # error column to check if the order in "Completed on" can be restored
+ self.find_element_by_link_text_in_table('otable', 'Errors').click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+ self.driver.find_element_by_id("errors_no").click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+        # Note: without time.sleep here, there'll be an unpredictable error. TBD
+ time.sleep(1)
+ c_list = self.get_table_column_text('class', 'completed_on')
+ self.failUnless(is_list_inverted(c_list))
+
+
+ ##############
+ # CASE 940 #
+ ##############
+ def test_940(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+# Step 2-3
+ self.find_element_by_link_text_in_table('nav', 'Packages').click()
+ check_head_list = ['Package', 'Package version', 'Size', 'Recipe']
+ head_list = self.get_table_head_text('otable')
+ self.failUnless(head_list == check_head_list)
+# Step 4
+ # pulldown menu
+ option_ids = ['recipe__layer_version__layer__name', 'recipe__layer_version__branch', \
+ 'recipe__layer_version__layer__commit', 'license', 'recipe__version']
+ self.driver.find_element_by_id("edit-columns-button").click()
+ for item in option_ids:
+ if not self.driver.find_element_by_id(item).is_selected():
+ self.driver.find_element_by_id(item).click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+        # save screen here to observe that 'Package' and 'Package version' are
+        # not selectable
+ self.browser_delay()
+ self.save_screenshot(screenshot_type='selenium', append_name='step4')
+
+
+ ##############
+ # CASE 941 #
+ ##############
+ def test_941(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ # Step 2-3
+ self.find_element_by_link_text_in_table('nav', 'Packages').click()
+ # column -- Package
+ column_list = self.get_table_column_text_by_column_number('otable', 1)
+ self.failUnless(is_list_sequenced(column_list))
+ self.find_element_by_link_text_in_table('otable', 'Size').click()
+
+
+ ##############
+ # CASE 942 #
+ ##############
+ def test_942(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ self.driver.find_element_by_link_text("Packages").click()
+ #get initial table header
+ head_list = self.get_table_head_text('otable')
+ #remove the Recipe column from table header
+ self.driver.find_element_by_id("edit-columns-button").click()
+ self.driver.find_element_by_id("recipe__name").click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+ #get modified table header
+ new_head = self.get_table_head_text('otable')
+ self.failUnless(head_list > new_head)
+
+ ##############
+ # CASE 943 #
+ ##############
+ def test_943(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ self.driver.find_element_by_link_text("Packages").click()
+ #search for the "bash" package -> this should definitely be present
+ self.driver.find_element_by_id("search").clear()
+ self.driver.find_element_by_id("search").send_keys("bash")
+ self.driver.find_element_by_id("search-button").click()
+ #check for the search result message "XX packages found"
+ self.failUnless(self.is_text_present("packages found"))
+
+
+ ##############
+ # CASE 944 #
+ ##############
+ def test_944(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ # step 1: test Recipes page stuff
+ self.driver.find_element_by_link_text("Recipes").click()
+        # for these items, the default status is not-checked
+ self.driver.find_element_by_id("edit-columns-button").click()
+ self.driver.find_element_by_id("layer_version__branch").click()
+ self.driver.find_element_by_id("layer_version__layer__commit").click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+ # otable is the recipes table here
+ otable_head_text = self.get_table_head_text('otable')
+ for item in ["Layer", "Layer branch", "Layer commit", "Layer directory"]:
+ self.failIf(item not in otable_head_text)
+        # click the first recipe, whatever it is
+ self.get_table_element("otable", 1, 1).click()
+ self.failUnless(self.is_text_present(["Layer", "Layer branch", "Layer commit", "Recipe file"]))
+
+        # step 2: test the Packages page; almost the same as above
+ self.driver.back()
+ self.browser_delay()
+ self.driver.find_element_by_link_text("Packages").click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+ self.driver.find_element_by_id("recipe__layer_version__layer__name").click()
+ self.driver.find_element_by_id("recipe__layer_version__branch").click()
+ self.driver.find_element_by_id("recipe__layer_version__layer__commit").click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+ otable_head_text = self.get_table_head_text("otable")
+ for item in ["Layer", "Layer branch", "Layer commit"]:
+ self.failIf(item not in otable_head_text)
+        # click the first package, whatever it is
+ self.get_table_element("otable", 1, 1).click()
+ self.failUnless(self.is_text_present(["Layer", "Layer branch", "Layer commit"]))
+
+        # step 3: test the core-image-minimal (images) Packages page. Almost the same as above; note that the element ids may change in the future.
+ self.driver.back()
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+ self.driver.find_element_by_id("layer_name").click()
+ self.driver.find_element_by_id("layer_branch").click()
+ self.driver.find_element_by_id("layer_commit").click()
+ self.driver.find_element_by_id("edit-columns-button").click()
+ otable_head_text = self.get_table_head_text("otable")
+ for item in ["Layer", "Layer branch", "Layer commit"]:
+ self.failIf(item not in otable_head_text)
+        # click the first package, whatever it is
+ self.get_table_element("otable", 1, 1).click()
+ self.failUnless(self.is_text_present(["Layer", "Layer branch", "Layer commit"]))
+
+ # step 4: check Configuration page
+ self.driver.back()
+ self.driver.find_element_by_link_text("Configuration").click()
+ otable_head_text = self.get_table_head_text()
+ for item in ["Layer", "Layer branch", "Layer commit"]:
+ self.failIf(item not in otable_head_text)
+
+
+ ##############
+ # CASE 945 #
+ ##############
+ def test_945(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ for items in ["Packages", "Recipes", "Tasks"]:
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ self.driver.find_element_by_link_text(items).click()
+
+            # this may be page-specific; if the page content changes in the future, replace it with a new xpath
+ xpath_showrows = "/html/body/div[2]/div/div[2]/div[2]/div[2]/div/div/div[2]/select"
+ xpath_table = "/html/body/div[2]/div/div[2]/div[2]/table/tbody"
+ self.driver.find_element_by_xpath(xpath_showrows).click()
+ rows_displayed = int(self.driver.find_element_by_xpath(xpath_showrows + "/option[2]").text)
+
+            # not sure if this is a Selenium Select bug: if the page is not refreshed here,
+            # the "select by visible text" operation falls back to the 100-row page.
+            # We could refresh with driver.get(url), but since the page varies per tab we click the link text instead.
+ self.driver.find_element_by_link_text(items).click()
+ Select(self.driver.find_element_by_css_selector("select.pagesize")).select_by_visible_text(str(rows_displayed))
+ self.failUnless(self.is_element_present(By.XPATH, xpath_table + "/tr[" + str(rows_displayed) +"]"))
+ self.failIf(self.is_element_present(By.XPATH, xpath_table + "/tr[" + str(rows_displayed+1) +"]"))
+
+            # click the 1st package, then go back and check that the same rows are still shown.
+ self.driver.find_element_by_xpath(xpath_table + "/tr[1]/td[1]").click()
+ self.driver.find_element_by_link_text(items).click()
+ self.failUnless(self.is_element_present(By.XPATH, xpath_table + "/tr[" + str(rows_displayed) +"]"))
+ self.failIf(self.is_element_present(By.XPATH, xpath_table + "/tr[" + str(rows_displayed+1) +"]"))
+
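+    # A minimal sketch (not part of the original test case) of how the page-size
+    # workaround used in test_945 could be factored out; the helper name
+    # _select_page_size is an assumption, everything else reuses the selenium
+    # calls already present above.
+    def _select_page_size(self, tab_link_text, page_size):
+        # re-open the tab so the pagesize <select> starts from a clean state,
+        # then pick the requested number of rows per page
+        self.driver.find_element_by_link_text(tab_link_text).click()
+        Select(self.driver.find_element_by_css_selector("select.pagesize")).select_by_visible_text(str(page_size))
+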
+
+ ##############
+ # CASE 946 #
+ ##############
+ def test_946(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ self.driver.find_element_by_link_text("Configuration").click()
+ # step 3-4
+ check_list = ["Summary", "BitBake variables"]
+ for item in check_list:
+ if not self.is_element_present(how=By.LINK_TEXT, what=item):
+ self.log.error("%s not found" %item)
+ if not self.is_text_present(['Layers', 'Layer', 'Layer branch', 'Layer commit']):
+ self.log.error("text not found")
+ # step 5
+ self.driver.find_element_by_link_text("BitBake variables").click()
+ if not self.is_text_present(['Variable', 'Value', 'Set in file', 'Description']):
+ self.log.error("text not found")
+ # This may be unstable because it's page-specific
+        # step 6: this is how we find the filter icon next to "Set in file"
+ temp_element = self.find_element_by_text_in_table('otable', "Set in file")
+ temp_element.find_element_by_xpath("..//*/a/i[@class='icon-filter filtered']").click()
+ self.browser_delay()
+ self.driver.find_element_by_xpath("(//input[@name='filter'])[3]").click()
+ btns = self.driver.find_elements_by_css_selector("button.btn.btn-primary")
+ for btn in btns:
+ try:
+ btn.click()
+ break
+ except:
+ pass
+ # save screen here
+ self.browser_delay()
+ self.save_screenshot(screenshot_type='selenium', append_name='step6')
+ self.driver.find_element_by_id("edit-columns-button").click()
+ # save screen here
+ # step 7
+ # we should manually check the step 6-8 result using screenshot
+ self.browser_delay()
+ self.save_screenshot(screenshot_type='selenium', append_name='step7')
+ self.driver.find_element_by_id("edit-columns-button").click()
+ # step 9
+ # click the 1st item, no matter what it is
+ self.driver.find_element_by_xpath("//*[@id='otable']/tbody/tr[1]/td[1]/a").click()
+ # give it 1 sec so the pop-up becomes the "active_element"
+ time.sleep(1)
+ element = self.driver.switch_to.active_element
+ check_list = ['Order', 'Configuration file', 'Operation', 'Line number']
+ for item in check_list:
+ if item not in element.text:
+ self.log.error("%s not found" %item)
+ # any better way to close this pop-up? ... TBD
+ element.find_element_by_class_name("close").click()
+ # step 10 : need to manually check "Yocto Manual" in saved screen
+ self.driver.find_element_by_css_selector("i.icon-share.get-info").click()
+ # save screen here
+ time.sleep(5)
+ self.save_screenshot(screenshot_type='native', append_name='step10')
+
+
+ ##############
+ # CASE 947 #
+ ##############
+ def test_947(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ self.find_element_by_link_text_in_table('nav', 'Configuration').click()
+ # step 2
+ self.driver.find_element_by_link_text("BitBake variables").click()
+ # step 3
+ def xpath_option(column_name):
+ # return xpath of options under "Edit columns" button
+ return self.shortest_xpath('id', 'navTab') + self.shortest_xpath('id', 'editcol') \
+ + self.shortest_xpath('id', column_name)
+ self.driver.find_element_by_id('edit-columns-button').click()
+ # by default, option "Description" and "Set in file" were checked
+ self.driver.find_element_by_xpath(xpath_option('description')).click()
+ self.driver.find_element_by_xpath(xpath_option('file')).click()
+ self.driver.find_element_by_id('edit-columns-button').click()
+ check_list = ['Description', 'Set in file']
+ head_list = self.get_table_head_text('otable')
+ for item in check_list:
+ self.failIf(item in head_list)
+ # check these 2 options and verify again
+ self.driver.find_element_by_id('edit-columns-button').click()
+ self.driver.find_element_by_xpath(xpath_option('description')).click()
+ self.driver.find_element_by_xpath(xpath_option('file')).click()
+ self.driver.find_element_by_id('edit-columns-button').click()
+ head_list = self.get_table_head_text('otable')
+ for item in check_list:
+ self.failUnless(item in head_list)
+
+
+ ##############
+ # CASE 948 #
+ ##############
+ def test_948(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ self.find_element_by_link_text_in_table('nav', 'Configuration').click()
+ self.driver.find_element_by_link_text("BitBake variables").click()
+ #get number of variables visible by default
+ number_before_search = self.driver.find_element_by_class_name('page-header').text
+        # search for "BB"
+ self.driver.find_element_by_id("search").clear()
+ self.driver.find_element_by_id("search").send_keys("BB")
+ self.driver.find_element_by_id("search-button").click()
+ #get number of variables visible after search
+ number_after_search = self.driver.find_element_by_class_name('page-header').text
+ self.failUnless(number_before_search > number_after_search)
+
+
+ ##############
+ # CASE 949 #
+ ##############
+ def test_949(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ self.find_element_by_link_text_in_table('nav', 'core-image-minimal').click()
+ # step 3
+ try:
+ self.driver.find_element_by_partial_link_text("Packages included")
+ self.driver.find_element_by_partial_link_text("Directory structure")
+        except Exception as e:
+ self.log.error(e)
+ self.failIf(True)
+ # step 4
+ head_list = self.get_table_head_text('otable')
+ for item in ['Package', 'Package version', 'Size', 'Dependencies', 'Reverse dependencies', 'Recipe']:
+ self.failUnless(item in head_list)
+ # step 5-6
+ self.driver.find_element_by_id("edit-columns-button").click()
+ selectable_class = 'checkbox'
+        # the 'muted' class marks the minimum-table columns, i.e. the unselectable items
+ unselectable_class = 'checkbox muted'
+ selectable_check_list = ['Dependencies', 'Layer', 'Layer branch', 'Layer commit', \
+ 'License', 'Recipe', 'Recipe version', 'Reverse dependencies', \
+ 'Size', 'Size over total (%)']
+ unselectable_check_list = ['Package', 'Package version']
+ selectable_list = list()
+ unselectable_list = list()
+ selectable_elements = self.driver.find_elements_by_xpath("//*[@id='editcol']//*[@class='" + selectable_class + "']")
+ unselectable_elements = self.driver.find_elements_by_xpath("//*[@id='editcol']//*[@class='" + unselectable_class + "']")
+ for element in selectable_elements:
+ selectable_list.append(element.text)
+ for element in unselectable_elements:
+ unselectable_list.append(element.text)
+ # check them
+ for item in selectable_check_list:
+ if item not in selectable_list:
+ self.log.error(" %s not found in dropdown menu \n" % item)
+ self.failIf(True)
+ for item in unselectable_check_list:
+ if item not in unselectable_list:
+ self.log.error(" %s not found in dropdown menu \n" % item)
+ self.failIf(True)
+ self.driver.find_element_by_id("edit-columns-button").click()
+ # step 7
+ self.driver.find_element_by_partial_link_text("Directory structure").click()
+ head_list = self.get_table_head_text('dirtable')
+ for item in ['Directory / File', 'Symbolic link to', 'Source package', 'Size', 'Permissions', 'Owner', 'Group']:
+ if item not in head_list:
+ self.log.error(" %s not found in Directory structure table head \n" % item)
+ self.failIf(True)
+
+
+ ##############
+ # CASE 950 #
+ ##############
+ def test_950(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+        # step 3 & 4: we cannot assume that a "successful build" or a "failed
+        # build" exists; if either is missing, we can still go on with the other steps
+ check_list = ['Configuration', 'Tasks', 'Recipes', 'Packages', 'Time', 'CPU usage', 'Disk I/O']
+ has_successful_build = 1
+ has_failed_build = 1
+ try:
+ pass_icon = self.driver.find_element_by_xpath("//*[@class='icon-ok-sign success']")
+ except Exception:
+ self.log.info("no successful build exists")
+ has_successful_build = 0
+ pass
+ if has_successful_build:
+ pass_icon.click()
+ # save screen here to check if it matches requirement.
+ self.browser_delay()
+ self.save_screenshot(screenshot_type='selenium', append_name='step3_1')
+ for item in check_list:
+ try:
+ self.find_element_by_link_text_in_table('nav', item)
+ except Exception:
+ self.log.error("link %s cannot be found in the page" % item)
+ self.failIf(True)
+ # step 6
+ check_list_2 = ['Packages included', 'Total package size', \
+ 'License manifest', 'Image files']
+ self.failUnless(self.is_text_present(check_list_2))
+ self.driver.back()
+ try:
+ fail_icon = self.driver.find_element_by_xpath("//*[@class='icon-minus-sign error']")
+ except Exception:
+ has_failed_build = 0
+ self.log.info("no failed build exists")
+ pass
+ if has_failed_build:
+ fail_icon.click()
+ # save screen here to check if it matches requirement.
+ self.browser_delay()
+ self.save_screenshot(screenshot_type='selenium', append_name='step3_2')
+ for item in check_list:
+ try:
+ self.find_element_by_link_text_in_table('nav', item)
+ except Exception:
+ self.log.error("link %s cannot be found in the page" % item)
+ self.failIf(True)
+ # step 7 involved
+ check_list_3 = ['Machine', 'Distro', 'Layers', 'Total number of tasks', 'Tasks executed', \
+ 'Tasks not executed', 'Reuse', 'Recipes built', 'Packages built']
+ self.failUnless(self.is_text_present(check_list_3))
+ self.driver.back()
+
+
+ ##############
+ # CASE 951 #
+ ##############
+ def test_951(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+        # currently the test case itself isn't responsible for creating "1 successful and
+        # 1 failed build"
+ has_successful_build = 1
+ has_failed_build = 1
+ try:
+ fail_icon = self.driver.find_element_by_xpath("//*[@class='icon-minus-sign error']")
+ except Exception:
+ has_failed_build = 0
+ self.log.info("no failed build exists")
+ pass
+        # if there's a failed build, we can proceed
+ if has_failed_build:
+ self.driver.find_element_by_partial_link_text("error").click()
+ self.driver.back()
+ # not sure if there "must be" some warnings, so here save a screen
+ self.browser_delay()
+ self.save_screenshot(screenshot_type='selenium', append_name='step4')
+
+
+ ##############
+ # CASE 955 #
+ ##############
+ def test_955(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+ self.log.info(" You should manually create all images before test starts!")
+ # So far the case itself is not responsable for creating all sorts of images.
+ # So assuming they are already there
+ # step 2
+ self.driver.find_element_by_link_text("core-image-minimal").click()
+ # save screen here to see the page component
+
+
+ ##############
+ # CASE 956 #
+ ##############
+ def test_956(self):
+ self.case_no = self.get_case_number()
+ self.log.info(' CASE %s log: ' % str(self.case_no))
+ self.driver.maximize_window()
+ self.driver.get(self.base_url)
+        # steps 2-3 need to be run manually
+        self.log.info("step 2-3: checking the help message shown when you hover over the help icons of target,\
+ tasks, recipes and packages needs to be done manually")
+ self.driver.find_element_by_partial_link_text("Toaster manual").click()
+ if not self.is_text_present("Toaster Manual"):
+ self.log.error("please check [Toaster manual] link on page")
+ self.failIf(True)
+
diff --git a/bitbake/lib/toaster/contrib/tts/toasteruitest/toaster_test.cfg b/bitbake/lib/toaster/contrib/tts/toasteruitest/toaster_test.cfg
new file mode 100644
index 0000000..6405f9a
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/toasteruitest/toaster_test.cfg
@@ -0,0 +1,21 @@
+# Configuration file for toaster_test
+# Sorted by different host type
+
+# test browser can be: firefox; chrome; ie (still under development)
+# logging_level could be: CRITICAL; ERROR; WARNING; INFO; DEBUG; NOTSET
+
+
+[toaster_test_linux]
+toaster_url = 'http://127.0.0.1:8000'
+test_browser = 'firefox'
+test_cases = [946]
+logging_level = 'INFO'
+
+
+[toaster_test_windows]
+toaster_url = 'http://127.0.0.1:8000'
+test_browser = ['ie', 'firefox', 'chrome']
+test_cases = [901, 902, 903]
+logging_level = 'DEBUG'
+
+
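+# A minimal sketch (an assumption, not necessarily how toaster_test.py reads this
+# file) of consuming a section: values are written as Python literals, so they can
+# be recovered with ast.literal_eval.
+#
+#   import ConfigParser, ast
+#   parser = ConfigParser.ConfigParser()
+#   parser.read('toaster_test.cfg')
+#   url = ast.literal_eval(parser.get('toaster_test_linux', 'toaster_url'))
+#   cases = ast.literal_eval(parser.get('toaster_test_linux', 'test_cases'))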
diff --git a/bitbake/lib/toaster/contrib/tts/urlcheck.py b/bitbake/lib/toaster/contrib/tts/urlcheck.py
new file mode 100644
index 0000000..0820f82
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/urlcheck.py
@@ -0,0 +1,53 @@
+from __future__ import print_function
+import sys
+
+import httplib2
+import config
+import urllist
+
+
+config.logger.info("Testing %s with %s", config.TOASTER_BASEURL, config.W3C_VALIDATOR)
+
+def validate_html5(url):
+ http_client = httplib2.Http(None)
+ status = "Failed"
+ errors = -1
+ warnings = -1
+
+ urlrequest = config.W3C_VALIDATOR+url
+
+ # pylint: disable=broad-except
+ # we disable the broad-except because we want to actually catch all possible exceptions
+ try:
+ resp, _ = http_client.request(urlrequest, "HEAD")
+ if resp['x-w3c-validator-status'] != "Abort":
+ status = resp['x-w3c-validator-status']
+ errors = int(resp['x-w3c-validator-errors'])
+ warnings = int(resp['x-w3c-validator-warnings'])
+
+ if status == 'Invalid':
+ config.logger.warn("Failed %s is %s\terrors %s warnings %s (check at %s)", url, status, errors, warnings, urlrequest)
+ else:
+ config.logger.debug("OK! %s", url)
+
+ except Exception as exc:
+ config.logger.warn("Failed validation call: %s", exc)
+ return (status, errors, warnings)
+
+
+def print_validation(url):
+ status, errors, warnings = validate_html5(url)
+ config.logger.error("url %s is %s\terrors %s warnings %s (check at %s)", url, status, errors, warnings, config.W3C_VALIDATOR+url)
+
+
+def main():
+ print("Testing %s with %s" % (config.TOASTER_BASEURL, config.W3C_VALIDATOR))
+
+ if len(sys.argv) > 1:
+ print_validation(sys.argv[1])
+ else:
+ for url in urllist.URLS:
+ print_validation(config.TOASTER_BASEURL+url)
+
+if __name__ == "__main__":
+ main()
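+
+# Example usage (a sketch): pass one full Toaster URL to validate a single page,
+# e.g. "python urlcheck.py http://127.0.0.1:8000/toastergui/landing/", or run it
+# with no arguments to validate config.TOASTER_BASEURL + every entry in urllist.URLS.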
diff --git a/bitbake/lib/toaster/contrib/tts/urllist.py b/bitbake/lib/toaster/contrib/tts/urllist.py
new file mode 100644
index 0000000..6db9ffc
--- /dev/null
+++ b/bitbake/lib/toaster/contrib/tts/urllist.py
@@ -0,0 +1,39 @@
+URLS = [
+ 'toastergui/landing/',
+ 'toastergui/builds/',
+ 'toastergui/build/1',
+ 'toastergui/build/1/tasks/',
+ 'toastergui/build/1/tasks/1/',
+ 'toastergui/build/1/task/1',
+ 'toastergui/build/1/recipes/',
+ 'toastergui/build/1/recipe/1/active_tab/1',
+ 'toastergui/build/1/recipe/1',
+ 'toastergui/build/1/recipe_packages/1',
+ 'toastergui/build/1/packages/',
+ 'toastergui/build/1/package/1',
+ 'toastergui/build/1/package_built_dependencies/1',
+ 'toastergui/build/1/package_included_detail/1/1',
+ 'toastergui/build/1/package_included_dependencies/1/1',
+ 'toastergui/build/1/package_included_reverse_dependencies/1/1',
+ 'toastergui/build/1/target/1',
+ 'toastergui/build/1/target/1/targetpkg',
+ 'toastergui/build/1/target/1/dirinfo',
+ 'toastergui/build/1/target/1/dirinfo_filepath/_/bin/bash',
+ 'toastergui/build/1/configuration',
+ 'toastergui/build/1/configvars',
+ 'toastergui/build/1/buildtime',
+ 'toastergui/build/1/cpuusage',
+ 'toastergui/build/1/diskio',
+ 'toastergui/build/1/target/1/packagefile/1',
+ 'toastergui/newproject/',
+ 'toastergui/projects/',
+ 'toastergui/project/1',
+ 'toastergui/project/1/configuration',
+ 'toastergui/project/1/builds/',
+ 'toastergui/project/1/layers/',
+ 'toastergui/project/1/layer/1',
+ 'toastergui/project/1/importlayer',
+ 'toastergui/project/1/targets/',
+ 'toastergui/project/1/machines/',
+ 'toastergui/',
+]
diff --git a/bitbake/lib/toaster/manage.py b/bitbake/lib/toaster/manage.py
new file mode 100755
index 0000000..ceaa11b
--- /dev/null
+++ b/bitbake/lib/toaster/manage.py
@@ -0,0 +1,10 @@
+#!/usr/bin/env python
+import os
+import sys
+
+if __name__ == "__main__":
+ os.environ.setdefault("DJANGO_SETTINGS_MODULE", "toastermain.settings")
+
+ from django.core.management import execute_from_command_line
+
+ execute_from_command_line(sys.argv)
diff --git a/bitbake/lib/toaster/orm/__init__.py b/bitbake/lib/toaster/orm/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/orm/__init__.py
diff --git a/bitbake/lib/toaster/orm/management/__init__.py b/bitbake/lib/toaster/orm/management/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/orm/management/__init__.py
diff --git a/bitbake/lib/toaster/orm/management/commands/__init__.py b/bitbake/lib/toaster/orm/management/commands/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/orm/management/commands/__init__.py
diff --git a/bitbake/lib/toaster/orm/management/commands/lsupdates.py b/bitbake/lib/toaster/orm/management/commands/lsupdates.py
new file mode 100644
index 0000000..75e9513
--- /dev/null
+++ b/bitbake/lib/toaster/orm/management/commands/lsupdates.py
@@ -0,0 +1,12 @@
+from django.core.management.base import NoArgsCommand, CommandError
+from orm.models import LayerSource
+import os
+
+class Command(NoArgsCommand):
+ args = ""
+ help = "Updates locally cached information from all LayerSources"
+
+
+ def handle_noargs(self, **options):
+ for ls in LayerSource.objects.all():
+ ls.update()
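+
+# A usage sketch: being a standard Django management command, it can be invoked as
+# "python manage.py lsupdates" from the shell, or programmatically:
+#
+#   from django.core.management import call_command
+#   call_command('lsupdates')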
diff --git a/bitbake/lib/toaster/orm/migrations/0001_initial.py b/bitbake/lib/toaster/orm/migrations/0001_initial.py
new file mode 100644
index 0000000..dedeef8
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0001_initial.py
@@ -0,0 +1,400 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding model 'Build'
+ db.create_table(u'orm_build', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('machine', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('image_fstypes', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('distro', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('distro_version', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('started_on', self.gf('django.db.models.fields.DateTimeField')()),
+ ('completed_on', self.gf('django.db.models.fields.DateTimeField')()),
+ ('outcome', self.gf('django.db.models.fields.IntegerField')(default=2)),
+ ('errors_no', self.gf('django.db.models.fields.IntegerField')(default=0)),
+ ('warnings_no', self.gf('django.db.models.fields.IntegerField')(default=0)),
+ ('cooker_log_path', self.gf('django.db.models.fields.CharField')(max_length=500)),
+ ('build_name', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('bitbake_version', self.gf('django.db.models.fields.CharField')(max_length=50)),
+ ))
+ db.send_create_signal(u'orm', ['Build'])
+
+ # Adding model 'Target'
+ db.create_table(u'orm_target', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('build', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Build'])),
+ ('target', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('is_image', self.gf('django.db.models.fields.BooleanField')(default=False)),
+ ('file_name', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('file_size', self.gf('django.db.models.fields.IntegerField')()),
+ ))
+ db.send_create_signal(u'orm', ['Target'])
+
+ # Adding model 'Task'
+ db.create_table(u'orm_task', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('build', self.gf('django.db.models.fields.related.ForeignKey')(related_name='task_build', to=orm['orm.Build'])),
+ ('order', self.gf('django.db.models.fields.IntegerField')(null=True)),
+ ('task_executed', self.gf('django.db.models.fields.BooleanField')(default=False)),
+ ('outcome', self.gf('django.db.models.fields.IntegerField')(default=5)),
+ ('sstate_checksum', self.gf('django.db.models.fields.CharField')(max_length=100, blank=True)),
+ ('path_to_sstate_obj', self.gf('django.db.models.fields.FilePathField')(max_length=500, blank=True)),
+ ('recipe', self.gf('django.db.models.fields.related.ForeignKey')(related_name='build_recipe', to=orm['orm.Recipe'])),
+ ('task_name', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('source_url', self.gf('django.db.models.fields.FilePathField')(max_length=255, blank=True)),
+ ('work_directory', self.gf('django.db.models.fields.FilePathField')(max_length=255, blank=True)),
+ ('script_type', self.gf('django.db.models.fields.IntegerField')(default=0)),
+ ('line_number', self.gf('django.db.models.fields.IntegerField')(default=0)),
+ ('disk_io', self.gf('django.db.models.fields.IntegerField')(null=True)),
+ ('cpu_usage', self.gf('django.db.models.fields.DecimalField')(null=True, max_digits=6, decimal_places=2)),
+ ('elapsed_time', self.gf('django.db.models.fields.CharField')(default=0, max_length=50)),
+ ('sstate_result', self.gf('django.db.models.fields.IntegerField')(default=0)),
+ ('message', self.gf('django.db.models.fields.CharField')(max_length=240)),
+ ('logfile', self.gf('django.db.models.fields.FilePathField')(max_length=255, blank=True)),
+ ))
+ db.send_create_signal(u'orm', ['Task'])
+
+ # Adding model 'Task_Dependency'
+ db.create_table(u'orm_task_dependency', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('task', self.gf('django.db.models.fields.related.ForeignKey')(related_name='task_dependencies_task', to=orm['orm.Task'])),
+ ('depends_on', self.gf('django.db.models.fields.related.ForeignKey')(related_name='task_dependencies_depends', to=orm['orm.Task'])),
+ ))
+ db.send_create_signal(u'orm', ['Task_Dependency'])
+
+ # Adding model 'Package'
+ db.create_table(u'orm_package', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('build', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Build'])),
+ ('recipe', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Recipe'], null=True)),
+ ('name', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('version', self.gf('django.db.models.fields.CharField')(max_length=100, blank=True)),
+ ('revision', self.gf('django.db.models.fields.CharField')(max_length=32, blank=True)),
+ ('summary', self.gf('django.db.models.fields.CharField')(max_length=200, blank=True)),
+ ('description', self.gf('django.db.models.fields.CharField')(max_length=200, blank=True)),
+ ('size', self.gf('django.db.models.fields.IntegerField')(default=0)),
+ ('installed_size', self.gf('django.db.models.fields.IntegerField')(default=0)),
+ ('section', self.gf('django.db.models.fields.CharField')(max_length=80, blank=True)),
+ ('license', self.gf('django.db.models.fields.CharField')(max_length=80, blank=True)),
+ ))
+ db.send_create_signal(u'orm', ['Package'])
+
+ # Adding model 'Package_Dependency'
+ db.create_table(u'orm_package_dependency', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('package', self.gf('django.db.models.fields.related.ForeignKey')(related_name='package_dependencies_source', to=orm['orm.Package'])),
+ ('depends_on', self.gf('django.db.models.fields.related.ForeignKey')(related_name='package_dependencies_target', to=orm['orm.Package'])),
+ ('dep_type', self.gf('django.db.models.fields.IntegerField')()),
+ ('target', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Target'], null=True)),
+ ))
+ db.send_create_signal(u'orm', ['Package_Dependency'])
+
+ # Adding model 'Target_Installed_Package'
+ db.create_table(u'orm_target_installed_package', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('target', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Target'])),
+ ('package', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Package'])),
+ ))
+ db.send_create_signal(u'orm', ['Target_Installed_Package'])
+
+ # Adding model 'Package_File'
+ db.create_table(u'orm_package_file', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('package', self.gf('django.db.models.fields.related.ForeignKey')(related_name='buildfilelist_package', to=orm['orm.Package'])),
+ ('path', self.gf('django.db.models.fields.FilePathField')(max_length=255, blank=True)),
+ ('size', self.gf('django.db.models.fields.IntegerField')()),
+ ))
+ db.send_create_signal(u'orm', ['Package_File'])
+
+ # Adding model 'Recipe'
+ db.create_table(u'orm_recipe', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('name', self.gf('django.db.models.fields.CharField')(max_length=100, blank=True)),
+ ('version', self.gf('django.db.models.fields.CharField')(max_length=100, blank=True)),
+ ('layer_version', self.gf('django.db.models.fields.related.ForeignKey')(related_name='recipe_layer_version', to=orm['orm.Layer_Version'])),
+ ('summary', self.gf('django.db.models.fields.CharField')(max_length=100, blank=True)),
+ ('description', self.gf('django.db.models.fields.CharField')(max_length=100, blank=True)),
+ ('section', self.gf('django.db.models.fields.CharField')(max_length=100, blank=True)),
+ ('license', self.gf('django.db.models.fields.CharField')(max_length=200, blank=True)),
+ ('licensing_info', self.gf('django.db.models.fields.TextField')(blank=True)),
+ ('homepage', self.gf('django.db.models.fields.URLField')(max_length=200, blank=True)),
+ ('bugtracker', self.gf('django.db.models.fields.URLField')(max_length=200, blank=True)),
+ ('file_path', self.gf('django.db.models.fields.FilePathField')(max_length=255)),
+ ))
+ db.send_create_signal(u'orm', ['Recipe'])
+
+ # Adding model 'Recipe_Dependency'
+ db.create_table(u'orm_recipe_dependency', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('recipe', self.gf('django.db.models.fields.related.ForeignKey')(related_name='r_dependencies_recipe', to=orm['orm.Recipe'])),
+ ('depends_on', self.gf('django.db.models.fields.related.ForeignKey')(related_name='r_dependencies_depends', to=orm['orm.Recipe'])),
+ ('dep_type', self.gf('django.db.models.fields.IntegerField')()),
+ ))
+ db.send_create_signal(u'orm', ['Recipe_Dependency'])
+
+ # Adding model 'Layer'
+ db.create_table(u'orm_layer', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('name', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('local_path', self.gf('django.db.models.fields.FilePathField')(max_length=255)),
+ ('layer_index_url', self.gf('django.db.models.fields.URLField')(max_length=200)),
+ ))
+ db.send_create_signal(u'orm', ['Layer'])
+
+ # Adding model 'Layer_Version'
+ db.create_table(u'orm_layer_version', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('build', self.gf('django.db.models.fields.related.ForeignKey')(related_name='layer_version_build', to=orm['orm.Build'])),
+ ('layer', self.gf('django.db.models.fields.related.ForeignKey')(related_name='layer_version_layer', to=orm['orm.Layer'])),
+ ('branch', self.gf('django.db.models.fields.CharField')(max_length=50)),
+ ('commit', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('priority', self.gf('django.db.models.fields.IntegerField')()),
+ ))
+ db.send_create_signal(u'orm', ['Layer_Version'])
+
+ # Adding model 'Variable'
+ db.create_table(u'orm_variable', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('build', self.gf('django.db.models.fields.related.ForeignKey')(related_name='variable_build', to=orm['orm.Build'])),
+ ('variable_name', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('variable_value', self.gf('django.db.models.fields.TextField')(blank=True)),
+ ('changed', self.gf('django.db.models.fields.BooleanField')(default=False)),
+ ('human_readable_name', self.gf('django.db.models.fields.CharField')(max_length=200)),
+ ('description', self.gf('django.db.models.fields.TextField')(blank=True)),
+ ))
+ db.send_create_signal(u'orm', ['Variable'])
+
+ # Adding model 'VariableHistory'
+ db.create_table(u'orm_variablehistory', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('variable', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Variable'])),
+ ('file_name', self.gf('django.db.models.fields.FilePathField')(max_length=255)),
+ ('line_number', self.gf('django.db.models.fields.IntegerField')(null=True)),
+ ('operation', self.gf('django.db.models.fields.CharField')(max_length=16)),
+ ))
+ db.send_create_signal(u'orm', ['VariableHistory'])
+
+ # Adding model 'LogMessage'
+ db.create_table(u'orm_logmessage', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('build', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Build'])),
+ ('level', self.gf('django.db.models.fields.IntegerField')(default=0)),
+ ('message', self.gf('django.db.models.fields.CharField')(max_length=240)),
+ ('pathname', self.gf('django.db.models.fields.FilePathField')(max_length=255, blank=True)),
+ ('lineno', self.gf('django.db.models.fields.IntegerField')(null=True)),
+ ))
+ db.send_create_signal(u'orm', ['LogMessage'])
+
+
+ def backwards(self, orm):
+ # Deleting model 'Build'
+ db.delete_table(u'orm_build')
+
+ # Deleting model 'Target'
+ db.delete_table(u'orm_target')
+
+ # Deleting model 'Task'
+ db.delete_table(u'orm_task')
+
+ # Deleting model 'Task_Dependency'
+ db.delete_table(u'orm_task_dependency')
+
+ # Deleting model 'Package'
+ db.delete_table(u'orm_package')
+
+ # Deleting model 'Package_Dependency'
+ db.delete_table(u'orm_package_dependency')
+
+ # Deleting model 'Target_Installed_Package'
+ db.delete_table(u'orm_target_installed_package')
+
+ # Deleting model 'Package_File'
+ db.delete_table(u'orm_package_file')
+
+ # Deleting model 'Recipe'
+ db.delete_table(u'orm_recipe')
+
+ # Deleting model 'Recipe_Dependency'
+ db.delete_table(u'orm_recipe_dependency')
+
+ # Deleting model 'Layer'
+ db.delete_table(u'orm_layer')
+
+ # Deleting model 'Layer_Version'
+ db.delete_table(u'orm_layer_version')
+
+ # Deleting model 'Variable'
+ db.delete_table(u'orm_variable')
+
+ # Deleting model 'VariableHistory'
+ db.delete_table(u'orm_variablehistory')
+
+ # Deleting model 'LogMessage'
+ db.delete_table(u'orm_logmessage')
+
+
+ models = {
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_fstypes': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.layer': {
+ 'Meta': {'object_name': 'Layer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_build'", 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'licensing_info': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'file_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.CharField', [], {'default': '0', 'max_length': '50'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '5'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
diff --git a/bitbake/lib/toaster/orm/migrations/0002_auto__add_field_build_timespent.py b/bitbake/lib/toaster/orm/migrations/0002_auto__add_field_build_timespent.py
new file mode 100644
index 0000000..61421ca
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0002_auto__add_field_build_timespent.py
@@ -0,0 +1,180 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding field 'Build.timespent'
+ db.add_column(u'orm_build', 'timespent',
+ self.gf('django.db.models.fields.IntegerField')(default=0),
+ keep_default=False)
+
+
+ def backwards(self, orm):
+ # Deleting field 'Build.timespent'
+ db.delete_column(u'orm_build', 'timespent')
+
+
+ models = {
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_fstypes': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.layer': {
+ 'Meta': {'object_name': 'Layer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_build'", 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'licensing_info': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'file_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.CharField', [], {'default': '0', 'max_length': '50'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '5'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
diff --git a/bitbake/lib/toaster/orm/migrations/0003_timespent.py b/bitbake/lib/toaster/orm/migrations/0003_timespent.py
new file mode 100644
index 0000000..9600f9e
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0003_timespent.py
@@ -0,0 +1,182 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import DataMigration
+from django.db import models
+
+class Migration(DataMigration):
+
+ def forwards(self, orm):
+ "Write your forwards methods here."
+ # Note: Don't use "from appname.models import ModelName".
+ # Use orm.ModelName to refer to models in this application,
+ # and orm['appname.ModelName'] for models in other applications.
+
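+ # Backfill the new Build.timespent field (in seconds) for existing rows,
+ # computed from the recorded started_on/completed_on timestamps.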
+ for build in orm.Build.objects.all():
+ build.timespent = int((build.completed_on - build.started_on).total_seconds())
+ build.save()
+
+ def backwards(self, orm):
+ "Write your backwards methods here."
+ raise RuntimeError("Cannot reverse this migration.")
+
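+ # Frozen snapshot of the app's models at this point in history; South uses it
+ # to build the fake 'orm' object passed to forwards()/backwards().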
+ models = {
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_fstypes': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.layer': {
+ 'Meta': {'object_name': 'Layer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_build'", 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'licensing_info': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'file_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.CharField', [], {'default': '0', 'max_length': '50'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '5'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
+ symmetrical = True
diff --git a/bitbake/lib/toaster/orm/migrations/0004_auto__add_field_package_installed_name.py b/bitbake/lib/toaster/orm/migrations/0004_auto__add_field_package_installed_name.py
new file mode 100644
index 0000000..134445b
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0004_auto__add_field_package_installed_name.py
@@ -0,0 +1,181 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding field 'Package.installed_name'
+ db.add_column(u'orm_package', 'installed_name',
+ self.gf('django.db.models.fields.CharField')(default='', max_length=100),
+ keep_default=False)
+
+
+ def backwards(self, orm):
+ # Deleting field 'Package.installed_name'
+ db.delete_column(u'orm_package', 'installed_name')
+
+
+ models = {
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_fstypes': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.layer': {
+ 'Meta': {'object_name': 'Layer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_build'", 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'licensing_info': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'file_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.CharField', [], {'default': '0', 'max_length': '50'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '5'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
diff --git a/bitbake/lib/toaster/orm/migrations/0005_auto__add_target_image_file__add_target_file__add_field_variablehistor.py b/bitbake/lib/toaster/orm/migrations/0005_auto__add_target_image_file__add_target_file__add_field_variablehistor.py
new file mode 100644
index 0000000..7be7ac3
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0005_auto__add_target_image_file__add_target_file__add_field_variablehistor.py
@@ -0,0 +1,281 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding model 'Target_File'
+ db.create_table(u'orm_target_file', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('target', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Target'])),
+ ('path', self.gf('django.db.models.fields.FilePathField')(max_length=100)),
+ ('size', self.gf('django.db.models.fields.IntegerField')()),
+ ('inodetype', self.gf('django.db.models.fields.IntegerField')()),
+ ('permission', self.gf('django.db.models.fields.IntegerField')()),
+ ('owner', self.gf('django.db.models.fields.CharField')(max_length=128)),
+ ('group', self.gf('django.db.models.fields.CharField')(max_length=128)),
+ ('directory', self.gf('django.db.models.fields.related.ForeignKey')(related_name='directory_set', to=orm['orm.Target_File'])),
+ ('sym_target', self.gf('django.db.models.fields.related.ForeignKey')(related_name='symlink_set', blank=True, to=orm['orm.Target_File'])),
+ ))
+ db.send_create_signal(u'orm', ['Target_File'])
+
+ # Adding model 'Target_Image_File'
+ db.create_table(u'orm_target_image_file', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('target', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Target'])),
+ ('file_name', self.gf('django.db.models.fields.FilePathField')(max_length=100)),
+ ('file_size', self.gf('django.db.models.fields.IntegerField')()),
+ ))
+ db.send_create_signal(u'orm', ['Target_Image_File'])
+
+ # Adding field 'VariableHistory.value'
+ db.add_column(u'orm_variablehistory', 'value',
+ self.gf('django.db.models.fields.TextField')(default='', blank=True),
+ keep_default=False)
+
+ # Deleting field 'Recipe.licensing_info'
+ db.delete_column(u'orm_recipe', 'licensing_info')
+
+ # Deleting field 'Target.file_name'
+ db.delete_column(u'orm_target', 'file_name')
+
+ # Deleting field 'Target.file_size'
+ db.delete_column(u'orm_target', 'file_size')
+
+ # Deleting field 'Build.image_fstypes'
+ db.delete_column(u'orm_build', 'image_fstypes')
+
+ # Adding field 'LogMessage.task'
+ db.add_column(u'orm_logmessage', 'task',
+ self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Task'], null=True, blank=True),
+ keep_default=False)
+
+
+ # Changing field 'Task.elapsed_time'
+ db.alter_column(u'orm_task', 'elapsed_time', self.gf('django.db.models.fields.DecimalField')(null=True, max_digits=6, decimal_places=2))
+ # Adding unique constraint on 'Task', fields ['build', 'recipe', 'task_name']
+ db.create_unique(u'orm_task', ['build_id', 'recipe_id', 'task_name'])
+
+
+ def backwards(self, orm):
+ # Removing unique constraint on 'Task', fields ['build', 'recipe', 'task_name']
+ db.delete_unique(u'orm_task', ['build_id', 'recipe_id', 'task_name'])
+
+ # Deleting model 'Target_File'
+ db.delete_table(u'orm_target_file')
+
+ # Deleting model 'Target_Image_File'
+ db.delete_table(u'orm_target_image_file')
+
+ # Deleting field 'VariableHistory.value'
+ db.delete_column(u'orm_variablehistory', 'value')
+
+ # Adding field 'Recipe.licensing_info'
+ db.add_column(u'orm_recipe', 'licensing_info',
+ self.gf('django.db.models.fields.TextField')(default='', blank=True),
+ keep_default=False)
+
+ # Adding field 'Target.file_name'
+ db.add_column(u'orm_target', 'file_name',
+ self.gf('django.db.models.fields.CharField')(default='', max_length=100),
+ keep_default=False)
+
+ # Adding field 'Target.file_size'
+ db.add_column(u'orm_target', 'file_size',
+ self.gf('django.db.models.fields.IntegerField')(default=0),
+ keep_default=False)
+
+ # Adding field 'Build.image_fstypes'
+ db.add_column(u'orm_build', 'image_fstypes',
+ self.gf('django.db.models.fields.CharField')(default='', max_length=100),
+ keep_default=False)
+
+ # Deleting field 'LogMessage.task'
+ db.delete_column(u'orm_logmessage', 'task_id')
+
+
+ # Changing field 'Task.elapsed_time'
+ db.alter_column(u'orm_task', 'elapsed_time', self.gf('django.db.models.fields.CharField')(max_length=50))
+
+ models = {
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.layer': {
+ 'Meta': {'object_name': 'Layer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_build'", 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.IntegerField', [], {}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'blank': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
diff --git a/bitbake/lib/toaster/orm/migrations/0006_auto__add_field_target_image_size__add_field_target_license_manifest_p.py b/bitbake/lib/toaster/orm/migrations/0006_auto__add_field_target_image_size__add_field_target_license_manifest_p.py
new file mode 100644
index 0000000..b2be30a
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0006_auto__add_field_target_image_size__add_field_target_license_manifest_p.py
@@ -0,0 +1,235 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding field 'Target.image_size'
+ db.add_column(u'orm_target', 'image_size',
+ self.gf('django.db.models.fields.IntegerField')(default=0),
+ keep_default=False)
+
+ # Adding field 'Target.license_manifest_path'
+ db.add_column(u'orm_target', 'license_manifest_path',
+ self.gf('django.db.models.fields.CharField')(max_length=500, null=True),
+ keep_default=False)
+
+
+ # Changing field 'Target_File.permission'
+ db.alter_column(u'orm_target_file', 'permission', self.gf('django.db.models.fields.CharField')(max_length=16))
+
+ # Changing field 'Target_File.sym_target'
+ db.alter_column(u'orm_target_file', 'sym_target_id', self.gf('django.db.models.fields.related.ForeignKey')(null=True, to=orm['orm.Target_File']))
+
+ # Changing field 'Target_File.directory'
+ db.alter_column(u'orm_target_file', 'directory_id', self.gf('django.db.models.fields.related.ForeignKey')(null=True, to=orm['orm.Target_File']))
+
+ def backwards(self, orm):
+ # Deleting field 'Target.image_size'
+ db.delete_column(u'orm_target', 'image_size')
+
+ # Deleting field 'Target.license_manifest_path'
+ db.delete_column(u'orm_target', 'license_manifest_path')
+
+
+ # Changing field 'Target_File.permission'
+ db.alter_column(u'orm_target_file', 'permission', self.gf('django.db.models.fields.IntegerField')())
+
+ # User chose to not deal with backwards NULL issues for 'Target_File.sym_target'
+ raise RuntimeError("Cannot reverse this migration. 'Target_File.sym_target' and its values cannot be restored.")
+
+ # The following code is provided here to aid in writing a correct migration
+ # Changing field 'Target_File.sym_target'
+ db.alter_column(u'orm_target_file', 'sym_target_id', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Target_File']))
+
+ # User chose to not deal with backwards NULL issues for 'Target_File.directory'
+ raise RuntimeError("Cannot reverse this migration. 'Target_File.directory' and its values cannot be restored.")
+
+ # The following code is provided here to aid in writing a correct migration
+ # Changing field 'Target_File.directory'
+ db.alter_column(u'orm_target_file', 'directory_id', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Target_File']))
+
+ models = {
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.layer': {
+ 'Meta': {'object_name': 'Layer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_build'", 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
diff --git a/bitbake/lib/toaster/orm/migrations/0007_auto__add_helptext.py b/bitbake/lib/toaster/orm/migrations/0007_auto__add_helptext.py
new file mode 100644
index 0000000..1e4c536
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0007_auto__add_helptext.py
@@ -0,0 +1,214 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding model 'HelpText'
+ db.create_table(u'orm_helptext', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('build', self.gf('django.db.models.fields.related.ForeignKey')(related_name='helptext_build', to=orm['orm.Build'])),
+ ('area', self.gf('django.db.models.fields.IntegerField')()),
+ ('key', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('text', self.gf('django.db.models.fields.TextField')()),
+ ))
+ db.send_create_signal(u'orm', ['HelpText'])
+
+
+ def backwards(self, orm):
+ # Deleting model 'HelpText'
+ db.delete_table(u'orm_helptext')
+
+
+ models = {
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'object_name': 'Layer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_build'", 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
diff --git a/bitbake/lib/toaster/orm/migrations/0008_auto__chg_field_variablehistory_operation__chg_field_recipe_descriptio.py b/bitbake/lib/toaster/orm/migrations/0008_auto__chg_field_variablehistory_operation__chg_field_recipe_descriptio.py
new file mode 100644
index 0000000..ece408a
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0008_auto__chg_field_variablehistory_operation__chg_field_recipe_descriptio.py
@@ -0,0 +1,225 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+
+ # Changing field 'VariableHistory.operation'
+ db.alter_column(u'orm_variablehistory', 'operation', self.gf('django.db.models.fields.CharField')(max_length=64))
+
+ # Changing field 'Recipe.description'
+ db.alter_column(u'orm_recipe', 'description', self.gf('django.db.models.fields.TextField')())
+
+ # Changing field 'Target_Image_File.file_name'
+ db.alter_column(u'orm_target_image_file', 'file_name', self.gf('django.db.models.fields.FilePathField')(max_length=254))
+
+ # Changing field 'Package.description'
+ db.alter_column(u'orm_package', 'description', self.gf('django.db.models.fields.TextField')())
+
+ def backwards(self, orm):
+
+ # Changing field 'VariableHistory.operation'
+ db.alter_column(u'orm_variablehistory', 'operation', self.gf('django.db.models.fields.CharField')(max_length=16))
+
+ # Changing field 'Recipe.description'
+ db.alter_column(u'orm_recipe', 'description', self.gf('django.db.models.fields.CharField')(max_length=100))
+
+ # Changing field 'Target_Image_File.file_name'
+ db.alter_column(u'orm_target_image_file', 'file_name', self.gf('django.db.models.fields.FilePathField')(max_length=100))
+
+ # Changing field 'Package.description'
+ db.alter_column(u'orm_package', 'description', self.gf('django.db.models.fields.CharField')(max_length=200))
+
+ models = {
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'object_name': 'Layer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_build'", 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
diff --git a/bitbake/lib/toaster/orm/migrations/0009_auto__add_projectvariable__add_projectlayer__add_projecttarget__add_pr.py b/bitbake/lib/toaster/orm/migrations/0009_auto__add_projectvariable__add_projectlayer__add_projecttarget__add_pr.py
new file mode 100644
index 0000000..7a58dc2
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0009_auto__add_projectvariable__add_projectlayer__add_projecttarget__add_pr.py
@@ -0,0 +1,286 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding model 'ProjectVariable'
+ db.create_table(u'orm_projectvariable', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('project', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Project'])),
+ ('name', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('value', self.gf('django.db.models.fields.TextField')(blank=True)),
+ ))
+ db.send_create_signal(u'orm', ['ProjectVariable'])
+
+ # Adding model 'ProjectLayer'
+ db.create_table(u'orm_projectlayer', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('project', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Project'])),
+ ('name', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('giturl', self.gf('django.db.models.fields.CharField')(max_length=254)),
+ ('commit', self.gf('django.db.models.fields.CharField')(max_length=254)),
+ ))
+ db.send_create_signal(u'orm', ['ProjectLayer'])
+
+ # Adding model 'ProjectTarget'
+ db.create_table(u'orm_projecttarget', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('project', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Project'])),
+ ('target', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ))
+ db.send_create_signal(u'orm', ['ProjectTarget'])
+
+ # Adding model 'Project'
+ db.create_table(u'orm_project', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('name', self.gf('django.db.models.fields.CharField')(max_length=100)),
+ ('created', self.gf('django.db.models.fields.DateTimeField')(auto_now_add=True, blank=True)),
+ ('updated', self.gf('django.db.models.fields.DateTimeField')(auto_now=True, blank=True)),
+ ))
+ db.send_create_signal(u'orm', ['Project'])
+
+ # Adding field 'Build.project'
+ db.add_column(u'orm_build', 'project',
+ self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Project'], null=True),
+ keep_default=False)
+
+
+ def backwards(self, orm):
+ # Deleting model 'ProjectVariable'
+ db.delete_table(u'orm_projectvariable')
+
+ # Deleting model 'ProjectLayer'
+ db.delete_table(u'orm_projectlayer')
+
+ # Deleting model 'ProjectTarget'
+ db.delete_table(u'orm_projecttarget')
+
+ # Deleting model 'Project'
+ db.delete_table(u'orm_project')
+
+ # Deleting field 'Build.project'
+ db.delete_column(u'orm_build', 'project_id')
+
+
+ models = {
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'object_name': 'Layer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_build'", 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'object_name': 'ProjectLayer'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
diff --git a/bitbake/lib/toaster/orm/migrations/0010_auto__add_field_project_branch__add_field_project_short_description__a.py b/bitbake/lib/toaster/orm/migrations/0010_auto__add_field_project_branch__add_field_project_short_description__a.py
new file mode 100644
index 0000000..aa1ce1f
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0010_auto__add_field_project_branch__add_field_project_short_description__a.py
@@ -0,0 +1,257 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding field 'Project.branch'
+ db.add_column(u'orm_project', 'branch',
+ self.gf('django.db.models.fields.CharField')(default='master', max_length=50),
+ keep_default=False)
+
+ # Adding field 'Project.short_description'
+ db.add_column(u'orm_project', 'short_description',
+ self.gf('django.db.models.fields.CharField')(default='', max_length=50, blank=True),
+ keep_default=False)
+
+ # Adding field 'Project.user_id'
+ db.add_column(u'orm_project', 'user_id',
+ self.gf('django.db.models.fields.IntegerField')(null=True),
+ keep_default=False)
+
+
+ def backwards(self, orm):
+ # Deleting field 'Project.branch'
+ db.delete_column(u'orm_project', 'branch')
+
+ # Deleting field 'Project.short_description'
+ db.delete_column(u'orm_project', 'short_description')
+
+ # Deleting field 'Project.user_id'
+ db.delete_column(u'orm_project', 'user_id')
+
+
+ models = {
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'object_name': 'Layer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_build'", 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'object_name': 'ProjectLayer'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/orm/migrations/0011_auto__add_field_projectlayer_dirpath.py b/bitbake/lib/toaster/orm/migrations/0011_auto__add_field_projectlayer_dirpath.py
new file mode 100644
index 0000000..8a65221
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0011_auto__add_field_projectlayer_dirpath.py
@@ -0,0 +1,242 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding field 'ProjectLayer.dirpath'
+ db.add_column(u'orm_projectlayer', 'dirpath',
+ self.gf('django.db.models.fields.CharField')(default='', max_length=254),
+ keep_default=False)
+
+
+ def backwards(self, orm):
+ # Deleting field 'ProjectLayer.dirpath'
+ db.delete_column(u'orm_projectlayer', 'dirpath')
+
+
+ models = {
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'object_name': 'Layer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_build'", 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'object_name': 'ProjectLayer'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/orm/migrations/0012_auto__add_field_projectlayer_optional__add_field_projecttarget_task.py b/bitbake/lib/toaster/orm/migrations/0012_auto__add_field_projectlayer_optional__add_field_projecttarget_task.py
new file mode 100644
index 0000000..9e483f5
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0012_auto__add_field_projectlayer_optional__add_field_projecttarget_task.py
@@ -0,0 +1,252 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding field 'ProjectLayer.optional'
+ db.add_column(u'orm_projectlayer', 'optional',
+ self.gf('django.db.models.fields.BooleanField')(default=True),
+ keep_default=False)
+
+ # Adding field 'ProjectTarget.task'
+ db.add_column(u'orm_projecttarget', 'task',
+ self.gf('django.db.models.fields.CharField')(max_length=100, null=True),
+ keep_default=False)
+
+
+ def backwards(self, orm):
+ # Deleting field 'ProjectLayer.optional'
+ db.delete_column(u'orm_projectlayer', 'optional')
+
+ # Deleting field 'ProjectTarget.task'
+ db.delete_column(u'orm_projecttarget', 'task')
+
+
+ models = {
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'object_name': 'Layer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_build'", 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'object_name': 'ProjectLayer'},
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ 'giturl': ('django.db.models.fields.CharField', [], {'max_length': '254'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/orm/migrations/0013_auto__add_release__add_layerversiondependency__add_unique_layerversion.py b/bitbake/lib/toaster/orm/migrations/0013_auto__add_release__add_layerversiondependency__add_unique_layerversion.py
new file mode 100644
index 0000000..7c954e6
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0013_auto__add_release__add_layerversiondependency__add_unique_layerversion.py
@@ -0,0 +1,710 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding model 'Release'
+ db.create_table(u'orm_release', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('name', self.gf('django.db.models.fields.CharField')(unique=True, max_length=32)),
+ ('description', self.gf('django.db.models.fields.CharField')(max_length=255)),
+ ('bitbake_version', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.BitbakeVersion'])),
+ ('branch', self.gf('django.db.models.fields.CharField')(max_length=32)),
+ ))
+ db.send_create_signal(u'orm', ['Release'])
+
+ # Adding model 'LayerVersionDependency'
+ db.create_table(u'orm_layerversiondependency', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('layer_source', self.gf('django.db.models.fields.related.ForeignKey')(default=None, to=orm['orm.LayerSource'], null=True)),
+ ('up_id', self.gf('django.db.models.fields.IntegerField')(default=None, null=True)),
+ ('layer_version', self.gf('django.db.models.fields.related.ForeignKey')(related_name='dependencies', to=orm['orm.Layer_Version'])),
+ ('depends_on', self.gf('django.db.models.fields.related.ForeignKey')(related_name='dependees', to=orm['orm.Layer_Version'])),
+ ))
+ db.send_create_signal(u'orm', ['LayerVersionDependency'])
+
+ # Adding unique constraint on 'LayerVersionDependency', fields ['layer_source', 'up_id']
+ db.create_unique(u'orm_layerversiondependency', ['layer_source_id', 'up_id'])
+
+ # Adding model 'ToasterSetting'
+ db.create_table(u'orm_toastersetting', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('name', self.gf('django.db.models.fields.CharField')(max_length=63)),
+ ('helptext', self.gf('django.db.models.fields.TextField')()),
+ ('value', self.gf('django.db.models.fields.CharField')(max_length=255)),
+ ))
+ db.send_create_signal(u'orm', ['ToasterSetting'])
+
+ # Adding model 'Machine'
+ db.create_table(u'orm_machine', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('layer_source', self.gf('django.db.models.fields.related.ForeignKey')(default=None, to=orm['orm.LayerSource'], null=True)),
+ ('up_id', self.gf('django.db.models.fields.IntegerField')(default=None, null=True)),
+ ('up_date', self.gf('django.db.models.fields.DateTimeField')(default=None, null=True)),
+ ('layer_version', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Layer_Version'])),
+ ('name', self.gf('django.db.models.fields.CharField')(max_length=255)),
+ ('description', self.gf('django.db.models.fields.CharField')(max_length=255)),
+ ))
+ db.send_create_signal(u'orm', ['Machine'])
+
+ # Adding unique constraint on 'Machine', fields ['layer_source', 'up_id']
+ db.create_unique(u'orm_machine', ['layer_source_id', 'up_id'])
+
+ # Adding model 'ReleaseDefaultLayer'
+ db.create_table(u'orm_releasedefaultlayer', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('release', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Release'])),
+ ('layer', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Layer'])),
+ ))
+ db.send_create_signal(u'orm', ['ReleaseDefaultLayer'])
+
+ # Adding model 'BitbakeVersion'
+ db.create_table(u'orm_bitbakeversion', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('name', self.gf('django.db.models.fields.CharField')(unique=True, max_length=32)),
+ ('giturl', self.gf('django.db.models.fields.URLField')(max_length=200)),
+ ('branch', self.gf('django.db.models.fields.CharField')(max_length=32)),
+ ('dirpath', self.gf('django.db.models.fields.CharField')(max_length=255)),
+ ))
+ db.send_create_signal(u'orm', ['BitbakeVersion'])
+
+ # Adding model 'Branch'
+ db.create_table(u'orm_branch', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('layer_source', self.gf('django.db.models.fields.related.ForeignKey')(default=True, to=orm['orm.LayerSource'], null=True)),
+ ('up_id', self.gf('django.db.models.fields.IntegerField')(default=None, null=True)),
+ ('up_date', self.gf('django.db.models.fields.DateTimeField')(default=None, null=True)),
+ ('name', self.gf('django.db.models.fields.CharField')(max_length=50)),
+ ('bitbake_branch', self.gf('django.db.models.fields.CharField')(max_length=50, blank=True)),
+ ('short_description', self.gf('django.db.models.fields.CharField')(max_length=50, blank=True)),
+ ))
+ db.send_create_signal(u'orm', ['Branch'])
+
+ # Adding unique constraint on 'Branch', fields ['layer_source', 'name']
+ db.create_unique(u'orm_branch', ['layer_source_id', 'name'])
+
+ # Adding unique constraint on 'Branch', fields ['layer_source', 'up_id']
+ db.create_unique(u'orm_branch', ['layer_source_id', 'up_id'])
+
+ # Adding model 'ToasterSettingDefaultLayer'
+ db.create_table(u'orm_toastersettingdefaultlayer', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('layer_version', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Layer_Version'])),
+ ))
+ db.send_create_signal(u'orm', ['ToasterSettingDefaultLayer'])
+
+ # Adding model 'LayerSource'
+ db.create_table(u'orm_layersource', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('name', self.gf('django.db.models.fields.CharField')(max_length=63)),
+ ('sourcetype', self.gf('django.db.models.fields.IntegerField')()),
+ ('apiurl', self.gf('django.db.models.fields.CharField')(default=None, max_length=255, null=True)),
+ ))
+ db.send_create_signal(u'orm', ['LayerSource'])
+
+ # Adding unique constraint on 'LayerSource', fields ['sourcetype', 'apiurl']
+ db.create_unique(u'orm_layersource', ['sourcetype', 'apiurl'])
+
+ # Deleting field 'ProjectLayer.name'
+ db.delete_column(u'orm_projectlayer', 'name')
+
+ # Deleting field 'ProjectLayer.dirpath'
+ db.delete_column(u'orm_projectlayer', 'dirpath')
+
+ # Deleting field 'ProjectLayer.commit'
+ db.delete_column(u'orm_projectlayer', 'commit')
+
+ # Deleting field 'ProjectLayer.giturl'
+ db.delete_column(u'orm_projectlayer', 'giturl')
+
+ # Adding field 'ProjectLayer.layercommit'
+ db.add_column(u'orm_projectlayer', 'layercommit',
+ self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Layer_Version'], null=True),
+ keep_default=False)
+
+ # Adding field 'Layer_Version.layer_source'
+ db.add_column(u'orm_layer_version', 'layer_source',
+ self.gf('django.db.models.fields.related.ForeignKey')(default=None, to=orm['orm.LayerSource'], null=True),
+ keep_default=False)
+
+ # Adding field 'Layer_Version.up_id'
+ db.add_column(u'orm_layer_version', 'up_id',
+ self.gf('django.db.models.fields.IntegerField')(default=None, null=True),
+ keep_default=False)
+
+ # Adding field 'Layer_Version.up_date'
+ db.add_column(u'orm_layer_version', 'up_date',
+ self.gf('django.db.models.fields.DateTimeField')(default=None, null=True),
+ keep_default=False)
+
+ # Adding field 'Layer_Version.up_branch'
+ db.add_column(u'orm_layer_version', 'up_branch',
+ self.gf('django.db.models.fields.related.ForeignKey')(default=None, to=orm['orm.Branch'], null=True),
+ keep_default=False)
+
+ # Adding field 'Layer_Version.dirpath'
+ db.add_column(u'orm_layer_version', 'dirpath',
+ self.gf('django.db.models.fields.CharField')(default=None, max_length=255, null=True),
+ keep_default=False)
+
+
+ # Changing field 'Layer_Version.build'
+ db.alter_column(u'orm_layer_version', 'build_id', self.gf('django.db.models.fields.related.ForeignKey')(null=True, to=orm['orm.Build']))
+
+ # Changing field 'Layer_Version.branch'
+ db.alter_column(u'orm_layer_version', 'branch', self.gf('django.db.models.fields.CharField')(max_length=80))
+ # Adding unique constraint on 'Layer_Version', fields ['layer_source', 'up_id']
+ db.create_unique(u'orm_layer_version', ['layer_source_id', 'up_id'])
+
+ # Adding field 'Recipe.layer_source'
+ db.add_column(u'orm_recipe', 'layer_source',
+ self.gf('django.db.models.fields.related.ForeignKey')(default=None, to=orm['orm.LayerSource'], null=True),
+ keep_default=False)
+
+ # Adding field 'Recipe.up_id'
+ db.add_column(u'orm_recipe', 'up_id',
+ self.gf('django.db.models.fields.IntegerField')(default=None, null=True),
+ keep_default=False)
+
+ # Adding field 'Recipe.up_date'
+ db.add_column(u'orm_recipe', 'up_date',
+ self.gf('django.db.models.fields.DateTimeField')(default=None, null=True),
+ keep_default=False)
+
+ # Adding field 'Layer.layer_source'
+ db.add_column(u'orm_layer', 'layer_source',
+ self.gf('django.db.models.fields.related.ForeignKey')(default=None, to=orm['orm.LayerSource'], null=True),
+ keep_default=False)
+
+ # Adding field 'Layer.up_id'
+ db.add_column(u'orm_layer', 'up_id',
+ self.gf('django.db.models.fields.IntegerField')(default=None, null=True),
+ keep_default=False)
+
+ # Adding field 'Layer.up_date'
+ db.add_column(u'orm_layer', 'up_date',
+ self.gf('django.db.models.fields.DateTimeField')(default=None, null=True),
+ keep_default=False)
+
+ # Adding field 'Layer.vcs_url'
+ db.add_column(u'orm_layer', 'vcs_url',
+ self.gf('django.db.models.fields.URLField')(default=None, max_length=200, null=True),
+ keep_default=False)
+
+ # Adding field 'Layer.vcs_web_file_base_url'
+ db.add_column(u'orm_layer', 'vcs_web_file_base_url',
+ self.gf('django.db.models.fields.URLField')(default=None, max_length=200, null=True),
+ keep_default=False)
+
+ # Adding field 'Layer.summary'
+ db.add_column(u'orm_layer', 'summary',
+ self.gf('django.db.models.fields.CharField')(default=None, max_length=200, null=True),
+ keep_default=False)
+
+ # Adding field 'Layer.description'
+ db.add_column(u'orm_layer', 'description',
+ self.gf('django.db.models.fields.TextField')(default=None, null=True),
+ keep_default=False)
+
+
+ # Changing field 'Layer.local_path'
+ db.alter_column(u'orm_layer', 'local_path', self.gf('django.db.models.fields.FilePathField')(max_length=255, null=True))
+ # Adding unique constraint on 'Layer', fields ['layer_source', 'up_id']
+ db.create_unique(u'orm_layer', ['layer_source_id', 'up_id'])
+
+ # Adding unique constraint on 'Layer', fields ['layer_source', 'name']
+ db.create_unique(u'orm_layer', ['layer_source_id', 'name'])
+
+ # Deleting field 'Project.branch'
+ db.delete_column(u'orm_project', 'branch')
+
+ # Adding field 'Project.bitbake_version'
+ db.add_column(u'orm_project', 'bitbake_version',
+ self.gf('django.db.models.fields.related.ForeignKey')(default=-1, to=orm['orm.BitbakeVersion']),
+ keep_default=False)
+
+ # Adding field 'Project.release'
+ db.add_column(u'orm_project', 'release',
+ self.gf('django.db.models.fields.related.ForeignKey')(default=-1, to=orm['orm.Release']),
+ keep_default=False)
+
+
+ def backwards(self, orm):
+ # Removing unique constraint on 'Layer', fields ['layer_source', 'name']
+ db.delete_unique(u'orm_layer', ['layer_source_id', 'name'])
+
+ # Removing unique constraint on 'Layer', fields ['layer_source', 'up_id']
+ db.delete_unique(u'orm_layer', ['layer_source_id', 'up_id'])
+
+ # Removing unique constraint on 'Layer_Version', fields ['layer_source', 'up_id']
+ db.delete_unique(u'orm_layer_version', ['layer_source_id', 'up_id'])
+
+ # Removing unique constraint on 'LayerSource', fields ['sourcetype', 'apiurl']
+ db.delete_unique(u'orm_layersource', ['sourcetype', 'apiurl'])
+
+ # Removing unique constraint on 'Branch', fields ['layer_source', 'up_id']
+ db.delete_unique(u'orm_branch', ['layer_source_id', 'up_id'])
+
+ # Removing unique constraint on 'Branch', fields ['layer_source', 'name']
+ db.delete_unique(u'orm_branch', ['layer_source_id', 'name'])
+
+ # Removing unique constraint on 'Machine', fields ['layer_source', 'up_id']
+ db.delete_unique(u'orm_machine', ['layer_source_id', 'up_id'])
+
+ # Removing unique constraint on 'LayerVersionDependency', fields ['layer_source', 'up_id']
+ db.delete_unique(u'orm_layerversiondependency', ['layer_source_id', 'up_id'])
+
+ # Deleting model 'Release'
+ db.delete_table(u'orm_release')
+
+ # Deleting model 'LayerVersionDependency'
+ db.delete_table(u'orm_layerversiondependency')
+
+ # Deleting model 'ToasterSetting'
+ db.delete_table(u'orm_toastersetting')
+
+ # Deleting model 'Machine'
+ db.delete_table(u'orm_machine')
+
+ # Deleting model 'ReleaseDefaultLayer'
+ db.delete_table(u'orm_releasedefaultlayer')
+
+ # Deleting model 'BitbakeVersion'
+ db.delete_table(u'orm_bitbakeversion')
+
+ # Deleting model 'Branch'
+ db.delete_table(u'orm_branch')
+
+ # Deleting model 'ToasterSettingDefaultLayer'
+ db.delete_table(u'orm_toastersettingdefaultlayer')
+
+ # Deleting model 'LayerSource'
+ db.delete_table(u'orm_layersource')
+
+
+ # User chose to not deal with backwards NULL issues for 'ProjectLayer.name'
+ raise RuntimeError("Cannot reverse this migration. 'ProjectLayer.name' and its values cannot be restored.")
+
+ # The following code is provided here to aid in writing a correct migration
+ # Adding field 'ProjectLayer.name'
+ db.add_column(u'orm_projectlayer', 'name',
+ self.gf('django.db.models.fields.CharField')(max_length=100),
+ keep_default=False)
+
+
+ # User chose to not deal with backwards NULL issues for 'ProjectLayer.dirpath'
+ raise RuntimeError("Cannot reverse this migration. 'ProjectLayer.dirpath' and its values cannot be restored.")
+
+ # The following code is provided here to aid in writing a correct migration
+ # Adding field 'ProjectLayer.dirpath'
+ db.add_column(u'orm_projectlayer', 'dirpath',
+ self.gf('django.db.models.fields.CharField')(max_length=254),
+ keep_default=False)
+
+
+ # User chose to not deal with backwards NULL issues for 'ProjectLayer.commit'
+ raise RuntimeError("Cannot reverse this migration. 'ProjectLayer.commit' and its values cannot be restored.")
+
+ # The following code is provided here to aid in writing a correct migration
+ # Adding field 'ProjectLayer.commit'
+ db.add_column(u'orm_projectlayer', 'commit',
+ self.gf('django.db.models.fields.CharField')(max_length=254),
+ keep_default=False)
+
+
+ # User chose to not deal with backwards NULL issues for 'ProjectLayer.giturl'
+ raise RuntimeError("Cannot reverse this migration. 'ProjectLayer.giturl' and its values cannot be restored.")
+
+ # The following code is provided here to aid in writing a correct migration
+ # Adding field 'ProjectLayer.giturl'
+ db.add_column(u'orm_projectlayer', 'giturl',
+ self.gf('django.db.models.fields.CharField')(max_length=254),
+ keep_default=False)
+
+ # Deleting field 'ProjectLayer.layercommit'
+ db.delete_column(u'orm_projectlayer', 'layercommit_id')
+
+ # Deleting field 'Layer_Version.layer_source'
+ db.delete_column(u'orm_layer_version', 'layer_source_id')
+
+ # Deleting field 'Layer_Version.up_id'
+ db.delete_column(u'orm_layer_version', 'up_id')
+
+ # Deleting field 'Layer_Version.up_date'
+ db.delete_column(u'orm_layer_version', 'up_date')
+
+ # Deleting field 'Layer_Version.up_branch'
+ db.delete_column(u'orm_layer_version', 'up_branch_id')
+
+ # Deleting field 'Layer_Version.dirpath'
+ db.delete_column(u'orm_layer_version', 'dirpath')
+
+
+ # User chose to not deal with backwards NULL issues for 'Layer_Version.build'
+ raise RuntimeError("Cannot reverse this migration. 'Layer_Version.build' and its values cannot be restored.")
+
+ # The following code is provided here to aid in writing a correct migration
+ # Changing field 'Layer_Version.build'
+ db.alter_column(u'orm_layer_version', 'build_id', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Build']))
+
+ # Changing field 'Layer_Version.branch'
+ db.alter_column(u'orm_layer_version', 'branch', self.gf('django.db.models.fields.CharField')(max_length=50))
+ # Deleting field 'Recipe.layer_source'
+ db.delete_column(u'orm_recipe', 'layer_source_id')
+
+ # Deleting field 'Recipe.up_id'
+ db.delete_column(u'orm_recipe', 'up_id')
+
+ # Deleting field 'Recipe.up_date'
+ db.delete_column(u'orm_recipe', 'up_date')
+
+ # Deleting field 'Layer.layer_source'
+ db.delete_column(u'orm_layer', 'layer_source_id')
+
+ # Deleting field 'Layer.up_id'
+ db.delete_column(u'orm_layer', 'up_id')
+
+ # Deleting field 'Layer.up_date'
+ db.delete_column(u'orm_layer', 'up_date')
+
+ # Deleting field 'Layer.vcs_url'
+ db.delete_column(u'orm_layer', 'vcs_url')
+
+ # Deleting field 'Layer.vcs_web_file_base_url'
+ db.delete_column(u'orm_layer', 'vcs_web_file_base_url')
+
+ # Deleting field 'Layer.summary'
+ db.delete_column(u'orm_layer', 'summary')
+
+ # Deleting field 'Layer.description'
+ db.delete_column(u'orm_layer', 'description')
+
+
+ # User chose to not deal with backwards NULL issues for 'Layer.local_path'
+ raise RuntimeError("Cannot reverse this migration. 'Layer.local_path' and its values cannot be restored.")
+
+ # The following code is provided here to aid in writing a correct migration
+ # Changing field 'Layer.local_path'
+ db.alter_column(u'orm_layer', 'local_path', self.gf('django.db.models.fields.FilePathField')(max_length=255))
+
+ # User chose to not deal with backwards NULL issues for 'Project.branch'
+ raise RuntimeError("Cannot reverse this migration. 'Project.branch' and its values cannot be restored.")
+
+ # The following code is provided here to aid in writing a correct migration
+ # Adding field 'Project.branch'
+ db.add_column(u'orm_project', 'branch',
+ self.gf('django.db.models.fields.CharField')(max_length=50),
+ keep_default=False)
+
+ # Deleting field 'Project.bitbake_version'
+ db.delete_column(u'orm_project', 'bitbake_version_id')
+
+ # Deleting field 'Project.release'
+ db.delete_column(u'orm_project', 'release_id')
+
+
+ models = {
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.branch': {
+ 'Meta': {'unique_together': "(('layer_source', 'name'), ('layer_source', 'up_id'))", 'object_name': 'Branch'},
+ 'bitbake_branch': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'True', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'), ('layer_source', 'name'))", 'object_name': 'Layer'},
+ 'description': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'vcs_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_file_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '80'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'related_name': "'layer_version_build'", 'null': 'True', 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'up_branch': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Branch']", 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.layersource': {
+ 'Meta': {'unique_together': "(('sourcetype', 'apiurl'),)", 'object_name': 'LayerSource'},
+ 'apiurl': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'sourcetype': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.layerversiondependency': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'LayerVersionDependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependees'", 'to': u"orm['orm.Layer_Version']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependencies'", 'to': u"orm['orm.Layer_Version']"}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.machine': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Machine'},
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'object_name': 'ProjectLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layercommit': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']", 'null': 'True'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.releasedefaultlayer': {
+ 'Meta': {'object_name': 'ReleaseDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer']"}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.toastersetting': {
+ 'Meta': {'object_name': 'ToasterSetting'},
+ 'helptext': ('django.db.models.fields.TextField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ u'orm.toastersettingdefaultlayer': {
+ 'Meta': {'object_name': 'ToasterSettingDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/orm/migrations/0014_auto__chg_field_package_summary__chg_field_layer_summary__chg_field_re.py b/bitbake/lib/toaster/orm/migrations/0014_auto__chg_field_package_summary__chg_field_layer_summary__chg_field_re.py
new file mode 100644
index 0000000..7945f15
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0014_auto__chg_field_package_summary__chg_field_layer_summary__chg_field_re.py
@@ -0,0 +1,336 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+
+ # Changing field 'Package.summary'
+ db.alter_column(u'orm_package', 'summary', self.gf('django.db.models.fields.TextField')())
+
+ # Changing field 'Layer.summary'
+ db.alter_column(u'orm_layer', 'summary', self.gf('django.db.models.fields.TextField')(null=True))
+
+ # Changing field 'Recipe.summary'
+ db.alter_column(u'orm_recipe', 'summary', self.gf('django.db.models.fields.TextField')())
+
+ def backwards(self, orm):
+
+ # Changing field 'Package.summary'
+ db.alter_column(u'orm_package', 'summary', self.gf('django.db.models.fields.CharField')(max_length=200))
+
+ # Changing field 'Layer.summary'
+ db.alter_column(u'orm_layer', 'summary', self.gf('django.db.models.fields.CharField')(max_length=200, null=True))
+
+ # Changing field 'Recipe.summary'
+ db.alter_column(u'orm_recipe', 'summary', self.gf('django.db.models.fields.CharField')(max_length=100))
+
+ models = {
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.branch': {
+ 'Meta': {'unique_together': "(('layer_source', 'name'), ('layer_source', 'up_id'))", 'object_name': 'Branch'},
+ 'bitbake_branch': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'True', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'), ('layer_source', 'name'))", 'object_name': 'Layer'},
+ 'description': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'vcs_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_file_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '80'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'related_name': "'layer_version_build'", 'null': 'True', 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'up_branch': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Branch']", 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.layersource': {
+ 'Meta': {'unique_together': "(('sourcetype', 'apiurl'),)", 'object_name': 'LayerSource'},
+ 'apiurl': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'sourcetype': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.layerversiondependency': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'LayerVersionDependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependees'", 'to': u"orm['orm.Layer_Version']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependencies'", 'to': u"orm['orm.Layer_Version']"}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.machine': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Machine'},
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'object_name': 'ProjectLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layercommit': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']", 'null': 'True'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.releasedefaultlayer': {
+ 'Meta': {'object_name': 'ReleaseDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer']"}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.toastersetting': {
+ 'Meta': {'object_name': 'ToasterSetting'},
+ 'helptext': ('django.db.models.fields.TextField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ u'orm.toastersettingdefaultlayer': {
+ 'Meta': {'object_name': 'ToasterSettingDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/orm/migrations/0015_auto__add_field_layer_vcs_web_url__add_field_layer_vcs_web_tree_base_u.py b/bitbake/lib/toaster/orm/migrations/0015_auto__add_field_layer_vcs_web_url__add_field_layer_vcs_web_tree_base_u.py
new file mode 100644
index 0000000..6e664c9
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0015_auto__add_field_layer_vcs_web_url__add_field_layer_vcs_web_tree_base_u.py
@@ -0,0 +1,336 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding field 'Layer.vcs_web_url'
+ db.add_column(u'orm_layer', 'vcs_web_url',
+ self.gf('django.db.models.fields.URLField')(default=None, max_length=200, null=True),
+ keep_default=False)
+
+ # Adding field 'Layer.vcs_web_tree_base_url'
+ db.add_column(u'orm_layer', 'vcs_web_tree_base_url',
+ self.gf('django.db.models.fields.URLField')(default=None, max_length=200, null=True),
+ keep_default=False)
+
+
+ def backwards(self, orm):
+ # Deleting field 'Layer.vcs_web_url'
+ db.delete_column(u'orm_layer', 'vcs_web_url')
+
+ # Deleting field 'Layer.vcs_web_tree_base_url'
+ db.delete_column(u'orm_layer', 'vcs_web_tree_base_url')
+
+
+ models = {
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.branch': {
+ 'Meta': {'unique_together': "(('layer_source', 'name'), ('layer_source', 'up_id'))", 'object_name': 'Branch'},
+ 'bitbake_branch': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'True', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'), ('layer_source', 'name'))", 'object_name': 'Layer'},
+ 'description': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'vcs_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_file_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_tree_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '80'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'related_name': "'layer_version_build'", 'null': 'True', 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'up_branch': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Branch']", 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.layersource': {
+ 'Meta': {'unique_together': "(('sourcetype', 'apiurl'),)", 'object_name': 'LayerSource'},
+ 'apiurl': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'sourcetype': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.layerversiondependency': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'LayerVersionDependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependees'", 'to': u"orm['orm.Layer_Version']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependencies'", 'to': u"orm['orm.Layer_Version']"}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.machine': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Machine'},
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'object_name': 'ProjectLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layercommit': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']", 'null': 'True'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.releasedefaultlayer': {
+ 'Meta': {'object_name': 'ReleaseDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer']"}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.toastersetting': {
+ 'Meta': {'object_name': 'ToasterSetting'},
+ 'helptext': ('django.db.models.fields.TextField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ u'orm.toastersettingdefaultlayer': {
+ 'Meta': {'object_name': 'ToasterSettingDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/orm/migrations/0016_auto__add_field_release_helptext__chg_field_release_branch__add_index_.py b/bitbake/lib/toaster/orm/migrations/0016_auto__add_field_release_helptext__chg_field_release_branch__add_index_.py
new file mode 100644
index 0000000..545c0ba
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0016_auto__add_field_release_helptext__chg_field_release_branch__add_index_.py
@@ -0,0 +1,359 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding field 'Release.helptext'
+ db.add_column(u'orm_release', 'helptext',
+ self.gf('django.db.models.fields.TextField')(null=True),
+ keep_default=False)
+
+
+ # Renaming column for 'Release.branch' to match new field type.
+ db.delete_column(u'orm_release', 'branch')
+
+ # Changing field 'Release.branch'
+ db.add_column(u'orm_release', 'branch', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Branch'], default=-1))
+
+ # Deleting field 'Branch.bitbake_branch'
+ db.delete_column(u'orm_branch', 'bitbake_branch')
+
+ # Adding unique constraint on 'Recipe', fields ['layer_version', 'file_path']
+ db.create_unique(u'orm_recipe', ['layer_version_id', 'file_path'])
+
+ # Adding unique constraint on 'ProjectLayer', fields ['project', 'layercommit']
+ db.create_unique(u'orm_projectlayer', ['project_id', 'layercommit_id'])
+
+
+ def backwards(self, orm):
+ # Removing unique constraint on 'ProjectLayer', fields ['project', 'layercommit']
+ db.delete_unique(u'orm_projectlayer', ['project_id', 'layercommit_id'])
+
+ # Removing unique constraint on 'Recipe', fields ['layer_version', 'file_path']
+ db.delete_unique(u'orm_recipe', ['layer_version_id', 'file_path'])
+
+ # Deleting field 'Release.helptext'
+ db.delete_column(u'orm_release', 'helptext')
+
+ # Renaming column for 'Release.branch' to match new field type.
+ db.rename_column(u'orm_release', 'branch_id', 'branch')
+ # Changing field 'Release.branch'
+ db.alter_column(u'orm_release', 'branch', self.gf('django.db.models.fields.CharField')(max_length=32))
+ # Adding field 'Branch.bitbake_branch'
+ db.add_column(u'orm_branch', 'bitbake_branch',
+ self.gf('django.db.models.fields.CharField')(default='', max_length=50, blank=True),
+ keep_default=False)
+
+
+ models = {
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.branch': {
+ 'Meta': {'unique_together': "(('layer_source', 'name'), ('layer_source', 'up_id'))", 'object_name': 'Branch'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'True', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'), ('layer_source', 'name'))", 'object_name': 'Layer'},
+ 'description': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'vcs_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_file_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_tree_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '80'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'related_name': "'layer_version_build'", 'null': 'True', 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'up_branch': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Branch']", 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.layersource': {
+ 'Meta': {'unique_together': "(('sourcetype', 'apiurl'),)", 'object_name': 'LayerSource'},
+ 'apiurl': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'sourcetype': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.layerversiondependency': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'LayerVersionDependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependees'", 'to': u"orm['orm.Layer_Version']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependencies'", 'to': u"orm['orm.Layer_Version']"}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.machine': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Machine'},
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'unique_together': "(('project', 'layercommit'),)", 'object_name': 'ProjectLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layercommit': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']", 'null': 'True'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'unique_together': "(('layer_version', 'file_path'),)", 'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Branch']"}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'helptext': ('django.db.models.fields.TextField', [], {'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.releasedefaultlayer': {
+ 'Meta': {'object_name': 'ReleaseDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer']"}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.toastersetting': {
+ 'Meta': {'object_name': 'ToasterSetting'},
+ 'helptext': ('django.db.models.fields.TextField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ u'orm.toastersettingdefaultlayer': {
+ 'Meta': {'object_name': 'ToasterSettingDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
diff --git a/bitbake/lib/toaster/orm/migrations/0017_auto__del_toastersettingdefaultlayer__add_releaselayersourcepriority__.py b/bitbake/lib/toaster/orm/migrations/0017_auto__del_toastersettingdefaultlayer__add_releaselayersourcepriority__.py
new file mode 100644
index 0000000..6685b55
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0017_auto__del_toastersettingdefaultlayer__add_releaselayersourcepriority__.py
@@ -0,0 +1,396 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Deleting model 'ToasterSettingDefaultLayer'
+ db.delete_table(u'orm_toastersettingdefaultlayer')
+
+ # Adding model 'ReleaseLayerSourcePriority'
+ db.create_table(u'orm_releaselayersourcepriority', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('release', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Release'])),
+ ('layer_source', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.LayerSource'])),
+ ('priority', self.gf('django.db.models.fields.IntegerField')(default=0)),
+ ))
+ db.send_create_signal(u'orm', ['ReleaseLayerSourcePriority'])
+
+ # Adding unique constraint on 'ReleaseLayerSourcePriority', fields ['release', 'layer_source']
+ db.create_unique(u'orm_releaselayersourcepriority', ['release_id', 'layer_source_id'])
+
+ # Deleting field 'Release.branch'
+ db.delete_column(u'orm_release', 'branch_id')
+
+ # Adding field 'Release.branch_name'
+ db.add_column(u'orm_release', 'branch_name',
+ self.gf('django.db.models.fields.CharField')(default='', max_length=50),
+ keep_default=False)
+
+ # Adding unique constraint on 'LayerSource', fields ['name']
+ db.create_unique(u'orm_layersource', ['name'])
+
+ # Deleting field 'ReleaseDefaultLayer.layer'
+ db.delete_column(u'orm_releasedefaultlayer', 'layer_id')
+
+ # Adding field 'ReleaseDefaultLayer.layer_name'
+ db.add_column(u'orm_releasedefaultlayer', 'layer_name',
+ self.gf('django.db.models.fields.CharField')(default='', max_length=100),
+ keep_default=False)
+
+
+ def backwards(self, orm):
+ # Removing unique constraint on 'LayerSource', fields ['name']
+ db.delete_unique(u'orm_layersource', ['name'])
+
+ # Removing unique constraint on 'ReleaseLayerSourcePriority', fields ['release', 'layer_source']
+ db.delete_unique(u'orm_releaselayersourcepriority', ['release_id', 'layer_source_id'])
+
+ # Adding model 'ToasterSettingDefaultLayer'
+ db.create_table(u'orm_toastersettingdefaultlayer', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('layer_version', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Layer_Version'])),
+ ))
+ db.send_create_signal(u'orm', ['ToasterSettingDefaultLayer'])
+
+ # Deleting model 'ReleaseLayerSourcePriority'
+ db.delete_table(u'orm_releaselayersourcepriority')
+
+
+ # User chose to not deal with backwards NULL issues for 'Release.branch'
+ raise RuntimeError("Cannot reverse this migration. 'Release.branch' and its values cannot be restored.")
+
+ # The following code is provided here to aid in writing a correct migration # Adding field 'Release.branch'
+ db.add_column(u'orm_release', 'branch',
+ self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Branch']),
+ keep_default=False)
+
+ # Deleting field 'Release.branch_name'
+ db.delete_column(u'orm_release', 'branch_name')
+
+
+ # User chose to not deal with backwards NULL issues for 'ReleaseDefaultLayer.layer'
+ raise RuntimeError("Cannot reverse this migration. 'ReleaseDefaultLayer.layer' and its values cannot be restored.")
+
+ # The following code is provided here to aid in writing a correct migration # Adding field 'ReleaseDefaultLayer.layer'
+ db.add_column(u'orm_releasedefaultlayer', 'layer',
+ self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Layer']),
+ keep_default=False)
+
+ # Deleting field 'ReleaseDefaultLayer.layer_name'
+ db.delete_column(u'orm_releasedefaultlayer', 'layer_name')
+
+
+ models = {
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.branch': {
+ 'Meta': {'unique_together': "(('layer_source', 'name'), ('layer_source', 'up_id'))", 'object_name': 'Branch'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'True', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'), ('layer_source', 'name'))", 'object_name': 'Layer'},
+ 'description': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'vcs_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_file_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_tree_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '80'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'related_name': "'layer_version_build'", 'null': 'True', 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'up_branch': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Branch']", 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.layersource': {
+ 'Meta': {'unique_together': "(('sourcetype', 'apiurl'),)", 'object_name': 'LayerSource'},
+ 'apiurl': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '63'}),
+ 'sourcetype': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.layerversiondependency': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'LayerVersionDependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependees'", 'to': u"orm['orm.Layer_Version']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependencies'", 'to': u"orm['orm.Layer_Version']"}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.machine': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Machine'},
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'unique_together': "(('project', 'layercommit'),)", 'object_name': 'ProjectLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layercommit': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']", 'null': 'True'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'unique_together': "(('layer_version', 'file_path'),)", 'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '50'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'helptext': ('django.db.models.fields.TextField', [], {'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.releasedefaultlayer': {
+ 'Meta': {'object_name': 'ReleaseDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.releaselayersourcepriority': {
+ 'Meta': {'unique_together': "(('release', 'layer_source'),)", 'object_name': 'ReleaseLayerSourcePriority'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.LayerSource']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.toastersetting': {
+ 'Meta': {'object_name': 'ToasterSetting'},
+ 'helptext': ('django.db.models.fields.TextField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/orm/migrations/0018_auto__add_field_layer_version_project.py b/bitbake/lib/toaster/orm/migrations/0018_auto__add_field_layer_version_project.py
new file mode 100644
index 0000000..7284bb8
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0018_auto__add_field_layer_version_project.py
@@ -0,0 +1,331 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding field 'Layer_Version.project'
+ db.add_column(u'orm_layer_version', 'project',
+ self.gf('django.db.models.fields.related.ForeignKey')(default=None, to=orm['orm.Project'], null=True),
+ keep_default=False)
+
+
+ def backwards(self, orm):
+ # Deleting field 'Layer_Version.project'
+ db.delete_column(u'orm_layer_version', 'project_id')
+
+
+ models = {
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.branch': {
+ 'Meta': {'unique_together': "(('layer_source', 'name'), ('layer_source', 'up_id'))", 'object_name': 'Branch'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'True', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'), ('layer_source', 'name'))", 'object_name': 'Layer'},
+ 'description': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'vcs_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_file_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_tree_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '80'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'related_name': "'layer_version_build'", 'null': 'True', 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'up_branch': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Branch']", 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.layersource': {
+ 'Meta': {'unique_together': "(('sourcetype', 'apiurl'),)", 'object_name': 'LayerSource'},
+ 'apiurl': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '63'}),
+ 'sourcetype': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.layerversiondependency': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'LayerVersionDependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependees'", 'to': u"orm['orm.Layer_Version']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependencies'", 'to': u"orm['orm.Layer_Version']"}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.machine': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Machine'},
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'unique_together': "(('project', 'layercommit'),)", 'object_name': 'ProjectLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layercommit': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']", 'null': 'True'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'unique_together': "(('layer_version', 'file_path'),)", 'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '50'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'helptext': ('django.db.models.fields.TextField', [], {'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.releasedefaultlayer': {
+ 'Meta': {'object_name': 'ReleaseDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.releaselayersourcepriority': {
+ 'Meta': {'unique_together': "(('release', 'layer_source'),)", 'object_name': 'ReleaseLayerSourcePriority'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.LayerSource']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.toastersetting': {
+ 'Meta': {'object_name': 'ToasterSetting'},
+ 'helptext': ('django.db.models.fields.TextField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/orm/migrations/0019_auto__add_buildartifact.py b/bitbake/lib/toaster/orm/migrations/0019_auto__add_buildartifact.py
new file mode 100644
index 0000000..0dce9ea
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0019_auto__add_buildartifact.py
@@ -0,0 +1,342 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding model 'BuildArtifact'
+ db.create_table(u'orm_buildartifact', (
+ (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
+ ('build', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Build'])),
+ ('file_name', self.gf('django.db.models.fields.FilePathField')(max_length=100)),
+ ('file_size', self.gf('django.db.models.fields.IntegerField')()),
+ ))
+ db.send_create_signal(u'orm', ['BuildArtifact'])
+
+
+ def backwards(self, orm):
+ # Deleting model 'BuildArtifact'
+ db.delete_table(u'orm_buildartifact')
+
+
+ models = {
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.branch': {
+ 'Meta': {'unique_together': "(('layer_source', 'name'), ('layer_source', 'up_id'))", 'object_name': 'Branch'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'True', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.buildartifact': {
+ 'Meta': {'object_name': 'BuildArtifact'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'), ('layer_source', 'name'))", 'object_name': 'Layer'},
+ 'description': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'vcs_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_file_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_tree_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '80'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'related_name': "'layer_version_build'", 'null': 'True', 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'up_branch': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Branch']", 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.layersource': {
+ 'Meta': {'unique_together': "(('sourcetype', 'apiurl'),)", 'object_name': 'LayerSource'},
+ 'apiurl': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '63'}),
+ 'sourcetype': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.layerversiondependency': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'LayerVersionDependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependees'", 'to': u"orm['orm.Layer_Version']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependencies'", 'to': u"orm['orm.Layer_Version']"}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.machine': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Machine'},
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'unique_together': "(('project', 'layercommit'),)", 'object_name': 'ProjectLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layercommit': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']", 'null': 'True'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'unique_together': "(('layer_version', 'file_path'),)", 'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '50'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'helptext': ('django.db.models.fields.TextField', [], {'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.releasedefaultlayer': {
+ 'Meta': {'object_name': 'ReleaseDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.releaselayersourcepriority': {
+ 'Meta': {'unique_together': "(('release', 'layer_source'),)", 'object_name': 'ReleaseLayerSourcePriority'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.LayerSource']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '6', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'build_recipe'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.toastersetting': {
+ 'Meta': {'object_name': 'ToasterSetting'},
+ 'helptext': ('django.db.models.fields.TextField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/orm/migrations/0020_auto__add_field_layer_version_local_path__add_field_recipe_pathflags__.py b/bitbake/lib/toaster/orm/migrations/0020_auto__add_field_layer_version_local_path__add_field_recipe_pathflags__.py
new file mode 100644
index 0000000..0ec5795
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0020_auto__add_field_layer_version_local_path__add_field_recipe_pathflags__.py
@@ -0,0 +1,361 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Removing unique constraint on 'Recipe', fields ['layer_version', 'file_path']
+ db.delete_unique(u'orm_recipe', ['layer_version_id', 'file_path'])
+
+ # Adding field 'Layer_Version.local_path'
+ db.add_column(u'orm_layer_version', 'local_path',
+ self.gf('django.db.models.fields.FilePathField')(default="/", max_length=1024),
+ keep_default=False)
+
+ # Adding field 'Recipe.pathflags'
+ db.add_column(u'orm_recipe', 'pathflags',
+ self.gf('django.db.models.fields.CharField')(default='', max_length=200, blank=True),
+ keep_default=False)
+
+ # Adding unique constraint on 'Recipe', fields ['layer_version', 'file_path', 'pathflags']
+ db.create_unique(u'orm_recipe', ['layer_version_id', 'file_path', 'pathflags'])
+
+ # Migrate data from Layer.local_path to Layer_Version.local_path
+ if not db.dry_run:
+ for lv in orm.Layer_Version.objects.all():
+ if lv.layer.local_path is not None:
+ lv.local_path = lv.layer.local_path
+ else:
+ lv.local_path = "/"
+ lv.save()
+
+ db.delete_column(u'orm_layer', 'local_path')
+
+
+ def backwards(self, orm):
+ raise RuntimeError("Cannot reverse this migration")
+
+
+ models = {
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.branch': {
+ 'Meta': {'unique_together': "(('layer_source', 'name'), ('layer_source', 'up_id'))", 'object_name': 'Branch'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'True', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.buildartifact': {
+ 'Meta': {'object_name': 'BuildArtifact'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'), ('layer_source', 'name'))", 'object_name': 'Layer'},
+ 'description': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'vcs_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_file_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_tree_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '80'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'related_name': "'layer_version_build'", 'null': 'True', 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'up_branch': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Branch']", 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.layersource': {
+ 'Meta': {'unique_together': "(('sourcetype', 'apiurl'),)", 'object_name': 'LayerSource'},
+ 'apiurl': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '63'}),
+ 'sourcetype': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.layerversiondependency': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'LayerVersionDependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependees'", 'to': u"orm['orm.Layer_Version']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependencies'", 'to': u"orm['orm.Layer_Version']"}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.machine': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Machine'},
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']", 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']", 'null': 'True'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'unique_together': "(('project', 'layercommit'),)", 'object_name': 'ProjectLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layercommit': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']", 'null': 'True'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'unique_together': "(('layer_version', 'file_path', 'pathflags'), ('file_path', 'pathflags'))", 'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'pathflags': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '50'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'helptext': ('django.db.models.fields.TextField', [], {'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.releasedefaultlayer': {
+ 'Meta': {'object_name': 'ReleaseDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.releaselayersourcepriority': {
+ 'Meta': {'unique_together': "(('release', 'layer_source'),)", 'object_name': 'ReleaseLayerSourcePriority'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.LayerSource']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '8', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '8', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'tasks'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.toastersetting': {
+ 'Meta': {'object_name': 'ToasterSetting'},
+ 'helptext': ('django.db.models.fields.TextField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
diff --git a/bitbake/lib/toaster/orm/migrations/0021_auto__chg_field_build_project__chg_field_project_bitbake_version__chg_.py b/bitbake/lib/toaster/orm/migrations/0021_auto__chg_field_build_project__chg_field_project_bitbake_version__chg_.py
new file mode 100644
index 0000000..a62ddb7
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0021_auto__chg_field_build_project__chg_field_project_bitbake_version__chg_.py
@@ -0,0 +1,371 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ no_dry_run = True
+
+ def forwards(self, orm):
+
+ # Changing field 'Build.project'
+ db.alter_column(u'orm_build', 'project_id', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Project']))
+
+ # Changing field 'Project.bitbake_version'
+ db.alter_column(u'orm_project', 'bitbake_version_id', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.BitbakeVersion'], null=True))
+
+ # Changing field 'Project.release'
+ db.alter_column(u'orm_project', 'release_id', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Release'], null=True))
+
+ # Changing field 'Task.cpu_usage'
+ db.alter_column(u'orm_task', 'cpu_usage', self.gf('django.db.models.fields.DecimalField')(null=True, max_digits=8, decimal_places=2))
+
+ # Changing field 'Task.elapsed_time'
+ db.alter_column(u'orm_task', 'elapsed_time', self.gf('django.db.models.fields.DecimalField')(null=True, max_digits=8, decimal_places=2))
+
+ def backwards(self, orm):
+
+ # Changing field 'Build.project'
+ db.alter_column(u'orm_build', 'project_id', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Project'], null=True))
+
+ # User chose to not deal with backwards NULL issues for 'Project.bitbake_version'
+ raise RuntimeError("Cannot reverse this migration. 'Project.bitbake_version' and its values cannot be restored.")
+
+ # The following code is provided here to aid in writing a correct migration
+ # Changing field 'Project.bitbake_version'
+ db.alter_column(u'orm_project', 'bitbake_version_id', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.BitbakeVersion']))
+
+ # User chose to not deal with backwards NULL issues for 'Project.release'
+ raise RuntimeError("Cannot reverse this migration. 'Project.release' and its values cannot be restored.")
+
+ # The following code is provided here to aid in writing a correct migration
+ # Changing field 'Project.release'
+ db.alter_column(u'orm_project', 'release_id', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['orm.Release']))
+
+ # Changing field 'Task.cpu_usage'
+ db.alter_column(u'orm_task', 'cpu_usage', self.gf('django.db.models.fields.DecimalField')(null=True, max_digits=6, decimal_places=2))
+
+ # Changing field 'Task.elapsed_time'
+ db.alter_column(u'orm_task', 'elapsed_time', self.gf('django.db.models.fields.DecimalField')(null=True, max_digits=6, decimal_places=2))
+
+ models = {
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.branch': {
+ 'Meta': {'unique_together': "(('layer_source', 'name'), ('layer_source', 'up_id'))", 'object_name': 'Branch'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'True', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.buildartifact': {
+ 'Meta': {'object_name': 'BuildArtifact'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'), ('layer_source', 'name'))", 'object_name': 'Layer'},
+ 'description': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'vcs_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_file_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_tree_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '80'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'related_name': "'layer_version_build'", 'null': 'True', 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': "'/'", 'max_length': '1024'}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'up_branch': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Branch']", 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.layersource': {
+ 'Meta': {'unique_together': "(('sourcetype', 'apiurl'),)", 'object_name': 'LayerSource'},
+ 'apiurl': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '63'}),
+ 'sourcetype': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.layerversiondependency': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'LayerVersionDependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependees'", 'to': u"orm['orm.Layer_Version']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependencies'", 'to': u"orm['orm.Layer_Version']"}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.machine': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Machine'},
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']", 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']", 'null': 'True'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'unique_together': "(('project', 'layercommit'),)", 'object_name': 'ProjectLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layercommit': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']", 'null': 'True'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'unique_together': "(('layer_version', 'file_path', 'pathflags'),)", 'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'pathflags': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '50'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'helptext': ('django.db.models.fields.TextField', [], {'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.releasedefaultlayer': {
+ 'Meta': {'object_name': 'ReleaseDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.releaselayersourcepriority': {
+ 'Meta': {'unique_together': "(('release', 'layer_source'),)", 'object_name': 'ReleaseLayerSourcePriority'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.LayerSource']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '8', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '8', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'tasks'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.toastersetting': {
+ 'Meta': {'object_name': 'ToasterSetting'},
+ 'helptext': ('django.db.models.fields.TextField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/orm/migrations/0022_auto__add_field_target_task__add_field_layer_version_local_path__del_f.py b/bitbake/lib/toaster/orm/migrations/0022_auto__add_field_target_task__add_field_layer_version_local_path__del_f.py
new file mode 100644
index 0000000..3dec391
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0022_auto__add_field_target_task__add_field_layer_version_local_path__del_f.py
@@ -0,0 +1,343 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding field 'Target.task'
+ db.add_column(u'orm_target', 'task',
+ self.gf('django.db.models.fields.CharField')(max_length=100, null=True),
+ keep_default=False)
+
+
+
+
+
+ def backwards(self, orm):
+ # Deleting field 'Target.task'
+ db.delete_column(u'orm_target', 'task')
+
+
+ models = {
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.branch': {
+ 'Meta': {'unique_together': "(('layer_source', 'name'), ('layer_source', 'up_id'))", 'object_name': 'Branch'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'True', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'errors_no': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'timespent': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'warnings_no': ('django.db.models.fields.IntegerField', [], {'default': '0'})
+ },
+ u'orm.buildartifact': {
+ 'Meta': {'object_name': 'BuildArtifact'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'), ('layer_source', 'name'))", 'object_name': 'Layer'},
+ 'description': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'vcs_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_file_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_tree_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '80'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'related_name': "'layer_version_build'", 'null': 'True', 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': "'/'", 'max_length': '1024'}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'up_branch': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Branch']", 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.layersource': {
+ 'Meta': {'unique_together': "(('sourcetype', 'apiurl'),)", 'object_name': 'LayerSource'},
+ 'apiurl': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '63'}),
+ 'sourcetype': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.layerversiondependency': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'LayerVersionDependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependees'", 'to': u"orm['orm.Layer_Version']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependencies'", 'to': u"orm['orm.Layer_Version']"}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.machine': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Machine'},
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']", 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']", 'null': 'True'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'unique_together': "(('project', 'layercommit'),)", 'object_name': 'ProjectLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layercommit': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']", 'null': 'True'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'unique_together': "(('layer_version', 'file_path', 'pathflags'),)", 'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'pathflags': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '50'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'helptext': ('django.db.models.fields.TextField', [], {'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.releasedefaultlayer': {
+ 'Meta': {'object_name': 'ReleaseDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.releaselayersourcepriority': {
+ 'Meta': {'unique_together': "(('release', 'layer_source'),)", 'object_name': 'ReleaseLayerSourcePriority'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.LayerSource']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '8', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '8', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'tasks'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.toastersetting': {
+ 'Meta': {'object_name': 'ToasterSetting'},
+ 'helptext': ('django.db.models.fields.TextField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
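+    # complete_apps marks the 'orm' app as fully frozen in this migration.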
+ complete_apps = ['orm']
diff --git a/bitbake/lib/toaster/orm/migrations/0023_auto__del_field_build_warnings_no__del_field_build_errors_no__del_fiel.py b/bitbake/lib/toaster/orm/migrations/0023_auto__del_field_build_warnings_no__del_field_build_errors_no__del_fiel.py
new file mode 100644
index 0000000..b5b200c
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0023_auto__del_field_build_warnings_no__del_field_build_errors_no__del_fiel.py
@@ -0,0 +1,353 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Deleting field 'Build.warnings_no'
+ db.delete_column(u'orm_build', 'warnings_no')
+
+ # Deleting field 'Build.errors_no'
+ db.delete_column(u'orm_build', 'errors_no')
+
+ # Deleting field 'Build.timespent'
+ db.delete_column(u'orm_build', 'timespent')
+
+
+ def backwards(self, orm):
+ # Adding field 'Build.warnings_no'
+ db.add_column(u'orm_build', 'warnings_no',
+ self.gf('django.db.models.fields.IntegerField')(default=0),
+ keep_default=False)
+
+ # Adding field 'Build.errors_no'
+ db.add_column(u'orm_build', 'errors_no',
+ self.gf('django.db.models.fields.IntegerField')(default=0),
+ keep_default=False)
+
+ # Adding field 'Build.timespent'
+ db.add_column(u'orm_build', 'timespent',
+ self.gf('django.db.models.fields.IntegerField')(default=0),
+ keep_default=False)
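+        # Note: the values dropped from warnings_no, errors_no and timespent by
+        # forwards() cannot be recovered; backwards() only re-creates the columns
+        # with a default of 0.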
+
+
+ models = {
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.branch': {
+ 'Meta': {'unique_together': "(('layer_source', 'name'), ('layer_source', 'up_id'))", 'object_name': 'Branch'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'True', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {})
+ },
+ u'orm.buildartifact': {
+ 'Meta': {'object_name': 'BuildArtifact'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'), ('layer_source', 'name'))", 'object_name': 'Layer'},
+ 'description': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'vcs_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_file_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_tree_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '80'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'related_name': "'layer_version_build'", 'null': 'True', 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': "'/'", 'max_length': '1024'}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'up_branch': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Branch']", 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.layersource': {
+ 'Meta': {'unique_together': "(('sourcetype', 'apiurl'),)", 'object_name': 'LayerSource'},
+ 'apiurl': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '63'}),
+ 'sourcetype': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.layerversiondependency': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'LayerVersionDependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependees'", 'to': u"orm['orm.Layer_Version']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependencies'", 'to': u"orm['orm.Layer_Version']"}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.machine': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Machine'},
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']", 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']", 'null': 'True'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'unique_together': "(('project', 'layercommit'),)", 'object_name': 'ProjectLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layercommit': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']", 'null': 'True'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'unique_together': "(('layer_version', 'file_path', 'pathflags'),)", 'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'pathflags': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '50'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'helptext': ('django.db.models.fields.TextField', [], {'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.releasedefaultlayer': {
+ 'Meta': {'object_name': 'ReleaseDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.releaselayersourcepriority': {
+ 'Meta': {'unique_together': "(('release', 'layer_source'),)", 'object_name': 'ReleaseLayerSourcePriority'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.LayerSource']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '8', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '8', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'tasks'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.toastersetting': {
+ 'Meta': {'object_name': 'ToasterSetting'},
+ 'helptext': ('django.db.models.fields.TextField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/orm/migrations/0024_auto__add_field_recipe_is_image.py b/bitbake/lib/toaster/orm/migrations/0024_auto__add_field_recipe_is_image.py
new file mode 100644
index 0000000..88f60a9
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0024_auto__add_field_recipe_is_image.py
@@ -0,0 +1,338 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # Adding field 'Recipe.is_image'
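+        # default=False with keep_default=False: existing Recipe rows are
+        # backfilled with False, and the default is not kept in the database schema.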
+ db.add_column(u'orm_recipe', 'is_image',
+ self.gf('django.db.models.fields.BooleanField')(default=False),
+ keep_default=False)
+
+
+ def backwards(self, orm):
+ # Deleting field 'Recipe.is_image'
+ db.delete_column(u'orm_recipe', 'is_image')
+
+
+ models = {
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.branch': {
+ 'Meta': {'unique_together': "(('layer_source', 'name'), ('layer_source', 'up_id'))", 'object_name': 'Branch'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'True', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {})
+ },
+ u'orm.buildartifact': {
+ 'Meta': {'object_name': 'BuildArtifact'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'), ('layer_source', 'name'))", 'object_name': 'Layer'},
+ 'description': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'vcs_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_file_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_tree_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '80'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'related_name': "'layer_version_build'", 'null': 'True', 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': "'/'", 'max_length': '1024'}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'up_branch': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Branch']", 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.layersource': {
+ 'Meta': {'unique_together': "(('sourcetype', 'apiurl'),)", 'object_name': 'LayerSource'},
+ 'apiurl': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '63'}),
+ 'sourcetype': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.layerversiondependency': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'LayerVersionDependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependees'", 'to': u"orm['orm.Layer_Version']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependencies'", 'to': u"orm['orm.Layer_Version']"}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.machine': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Machine'},
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']", 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']", 'null': 'True'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'unique_together': "(('project', 'layercommit'),)", 'object_name': 'ProjectLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layercommit': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']", 'null': 'True'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'unique_together': "(('layer_version', 'file_path', 'pathflags'),)", 'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'pathflags': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '50'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'helptext': ('django.db.models.fields.TextField', [], {'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.releasedefaultlayer': {
+ 'Meta': {'object_name': 'ReleaseDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.releaselayersourcepriority': {
+ 'Meta': {'unique_together': "(('release', 'layer_source'),)", 'object_name': 'ReleaseLayerSourcePriority'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.LayerSource']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '8', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '8', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'tasks'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.toastersetting': {
+ 'Meta': {'object_name': 'ToasterSetting'},
+ 'helptext': ('django.db.models.fields.TextField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
\ No newline at end of file
diff --git a/bitbake/lib/toaster/orm/migrations/0025_auto__add_field_project_is_default.py b/bitbake/lib/toaster/orm/migrations/0025_auto__add_field_project_is_default.py
new file mode 100644
index 0000000..e76990d
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0025_auto__add_field_project_is_default.py
@@ -0,0 +1,346 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+
+class Migration(SchemaMigration):
+
+ def forwards(self, orm):
+ # work-around for http://south.aeracode.org/ticket/578:
+ # SQLite boolean fields aren't set to the correct default value
+ # (needs to be 0 or 1, rather than True or False)
+ default = False
+ if db.backend_name == 'sqlite3':
+ default = 0
+
+ # Adding field 'Project.is_default'
+ db.add_column(u'orm_project', 'is_default',
+ self.gf('django.db.models.fields.BooleanField')(default=default),
+ keep_default=False)
+
+
+ def backwards(self, orm):
+ # Deleting field 'Project.is_default'
+ db.delete_column(u'orm_project', 'is_default')
+
+
+ models = {
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.branch': {
+ 'Meta': {'unique_together': "(('layer_source', 'name'), ('layer_source', 'up_id'))", 'object_name': 'Branch'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'True', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {})
+ },
+ u'orm.buildartifact': {
+ 'Meta': {'object_name': 'BuildArtifact'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'), ('layer_source', 'name'))", 'object_name': 'Layer'},
+ 'description': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'vcs_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_file_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_tree_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '80'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'related_name': "'layer_version_build'", 'null': 'True', 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': "'/'", 'max_length': '1024'}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'up_branch': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Branch']", 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.layersource': {
+ 'Meta': {'unique_together': "(('sourcetype', 'apiurl'),)", 'object_name': 'LayerSource'},
+ 'apiurl': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '63'}),
+ 'sourcetype': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.layerversiondependency': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'LayerVersionDependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependees'", 'to': u"orm['orm.Layer_Version']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependencies'", 'to': u"orm['orm.Layer_Version']"}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.machine': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Machine'},
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']", 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'is_default': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']", 'null': 'True'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'unique_together': "(('project', 'layercommit'),)", 'object_name': 'ProjectLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layercommit': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']", 'null': 'True'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'unique_together': "(('layer_version', 'file_path', 'pathflags'),)", 'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'pathflags': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '50'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'helptext': ('django.db.models.fields.TextField', [], {'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.releasedefaultlayer': {
+ 'Meta': {'object_name': 'ReleaseDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.releaselayersourcepriority': {
+ 'Meta': {'unique_together': "(('release', 'layer_source'),)", 'object_name': 'ReleaseLayerSourcePriority'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.LayerSource']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '8', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '8', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'tasks'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.toastersetting': {
+ 'Meta': {'object_name': 'ToasterSetting'},
+ 'helptext': ('django.db.models.fields.TextField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
diff --git a/bitbake/lib/toaster/orm/migrations/0026_set_default_project.py b/bitbake/lib/toaster/orm/migrations/0026_set_default_project.py
new file mode 100644
index 0000000..6240abd
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/0026_set_default_project.py
@@ -0,0 +1,374 @@
+# -*- coding: utf-8 -*-
+from south.utils import datetime_utils as datetime
+from south.db import db
+from south.v2 import SchemaMigration
+from django.db import models
+
+# data-only migration to set the is_default field correctly
+# across all projects, so that only a single record has it set
+# to True; this will add the default project if it is missing,
+# or amend the existing one (the default project is marked with
+# is_default = True)
+class Migration(SchemaMigration):
+
+ no_dry_run = True
+
+ # work-around for http://south.aeracode.org/ticket/578:
+ # SQLite boolean fields aren't set to the correct default value
+ # when added to existing records (value needs to be 0 or 1, rather
+ # than True or False), so manually update that field for all
+ # existing records
+ def _sqlite_update_all_projects_is_default(self, orm):
+ if db.backend_name == 'sqlite3':
+ for project in orm.Project.objects.all():
+ project.is_default = 0
+ project.save()
+
+ def forwards(self, orm):
+ # fix is_default field
+ self._sqlite_update_all_projects_is_default(orm)
+
+ # now create or modify the default project
+ project = None
+
+ # check for existing default project with ID 0 which has
+ # already been added in code
+ projects = orm.Project.objects.filter(pk = 0)
+
+ if len(projects) == 1:
+ project = projects[0]
+ else:
+ # create default project
+ options = {
+ "name": "Command line builds",
+ "short_description": "Project for builds started outside Toaster"
+ }
+ project = orm.Project.objects.create(**options)
+
+ project.is_default = True
+ project.save()
+
+ def backwards(self, orm):
+ # don't do anything when reversing this migration, as we can safely
+ # keep any generated default project which has builds attached;
+ # it's just that the old code won't use that project as the
+ # container for any new builds, as it doesn't have an ID of 0
+ pass
+
+ models = {
+ u'orm.bitbakeversion': {
+ 'Meta': {'object_name': 'BitbakeVersion'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '32'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'giturl': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.branch': {
+ 'Meta': {'unique_together': "(('layer_source', 'name'), ('layer_source', 'up_id'))", 'object_name': 'Branch'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'True', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.build': {
+ 'Meta': {'object_name': 'Build'},
+ 'bitbake_version': ('django.db.models.fields.CharField', [], {'max_length': '50'}),
+ 'build_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'completed_on': ('django.db.models.fields.DateTimeField', [], {}),
+ 'cooker_log_path': ('django.db.models.fields.CharField', [], {'max_length': '500'}),
+ 'distro': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'distro_version': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'machine': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '2'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'started_on': ('django.db.models.fields.DateTimeField', [], {})
+ },
+ u'orm.buildartifact': {
+ 'Meta': {'object_name': 'BuildArtifact'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'})
+ },
+ u'orm.helptext': {
+ 'Meta': {'object_name': 'HelpText'},
+ 'area': ('django.db.models.fields.IntegerField', [], {}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'helptext_build'", 'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'text': ('django.db.models.fields.TextField', [], {})
+ },
+ u'orm.layer': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'), ('layer_source', 'name'))", 'object_name': 'Layer'},
+ 'description': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_index_url': ('django.db.models.fields.URLField', [], {'max_length': '200'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'default': 'None', 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'vcs_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_file_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_tree_base_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'}),
+ 'vcs_web_url': ('django.db.models.fields.URLField', [], {'default': 'None', 'max_length': '200', 'null': 'True'})
+ },
+ u'orm.layer_version': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Layer_Version'},
+ 'branch': ('django.db.models.fields.CharField', [], {'max_length': '80'}),
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'related_name': "'layer_version_build'", 'null': 'True', 'to': u"orm['orm.Build']"}),
+ 'commit': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'dirpath': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'layer_version_layer'", 'to': u"orm['orm.Layer']"}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'local_path': ('django.db.models.fields.FilePathField', [], {'default': "'/'", 'max_length': '1024'}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Project']", 'null': 'True'}),
+ 'up_branch': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.Branch']", 'null': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.layersource': {
+ 'Meta': {'unique_together': "(('sourcetype', 'apiurl'),)", 'object_name': 'LayerSource'},
+ 'apiurl': ('django.db.models.fields.CharField', [], {'default': 'None', 'max_length': '255', 'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '63'}),
+ 'sourcetype': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.layerversiondependency': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'LayerVersionDependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependees'", 'to': u"orm['orm.Layer_Version']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'dependencies'", 'to': u"orm['orm.Layer_Version']"}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.logmessage': {
+ 'Meta': {'object_name': 'LogMessage'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'level': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'lineno': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'pathname': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Task']", 'null': 'True', 'blank': 'True'})
+ },
+ u'orm.machine': {
+ 'Meta': {'unique_together': "(('layer_source', 'up_id'),)", 'object_name': 'Machine'},
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']"}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'})
+ },
+ u'orm.package': {
+ 'Meta': {'object_name': 'Package'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'installed_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'installed_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Recipe']", 'null': 'True'}),
+ 'revision': ('django.db.models.fields.CharField', [], {'max_length': '32', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '80', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.package_dependency': {
+ 'Meta': {'object_name': 'Package_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_target'", 'to': u"orm['orm.Package']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'package_dependencies_source'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']", 'null': 'True'})
+ },
+ u'orm.package_file': {
+ 'Meta': {'object_name': 'Package_File'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildfilelist_package'", 'to': u"orm['orm.Package']"}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {})
+ },
+ u'orm.project': {
+ 'Meta': {'object_name': 'Project'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']", 'null': 'True'}),
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'is_default': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']", 'null': 'True'}),
+ 'short_description': ('django.db.models.fields.CharField', [], {'max_length': '50', 'blank': 'True'}),
+ 'updated': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'user_id': ('django.db.models.fields.IntegerField', [], {'null': 'True'})
+ },
+ u'orm.projectlayer': {
+ 'Meta': {'unique_together': "(('project', 'layercommit'),)", 'object_name': 'ProjectLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layercommit': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Layer_Version']", 'null': 'True'}),
+ 'optional': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"})
+ },
+ u'orm.projecttarget': {
+ 'Meta': {'object_name': 'ProjectTarget'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.projectvariable': {
+ 'Meta': {'object_name': 'ProjectVariable'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'project': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Project']"}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.recipe': {
+ 'Meta': {'unique_together': "(('layer_version', 'file_path', 'pathflags'),)", 'object_name': 'Recipe'},
+ 'bugtracker': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'file_path': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ 'homepage': ('django.db.models.fields.URLField', [], {'max_length': '200', 'blank': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'default': 'None', 'to': u"orm['orm.LayerSource']", 'null': 'True'}),
+ 'layer_version': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'recipe_layer_version'", 'to': u"orm['orm.Layer_Version']"}),
+ 'license': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'pathflags': ('django.db.models.fields.CharField', [], {'max_length': '200', 'blank': 'True'}),
+ 'section': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'summary': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'up_date': ('django.db.models.fields.DateTimeField', [], {'default': 'None', 'null': 'True'}),
+ 'up_id': ('django.db.models.fields.IntegerField', [], {'default': 'None', 'null': 'True'}),
+ 'version': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'})
+ },
+ u'orm.recipe_dependency': {
+ 'Meta': {'object_name': 'Recipe_Dependency'},
+ 'dep_type': ('django.db.models.fields.IntegerField', [], {}),
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_depends'", 'to': u"orm['orm.Recipe']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'r_dependencies_recipe'", 'to': u"orm['orm.Recipe']"})
+ },
+ u'orm.release': {
+ 'Meta': {'object_name': 'Release'},
+ 'bitbake_version': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.BitbakeVersion']"}),
+ 'branch_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '50'}),
+ 'description': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'helptext': ('django.db.models.fields.TextField', [], {'null': 'True'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '32'})
+ },
+ u'orm.releasedefaultlayer': {
+ 'Meta': {'object_name': 'ReleaseDefaultLayer'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_name': ('django.db.models.fields.CharField', [], {'default': "''", 'max_length': '100'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.releaselayersourcepriority': {
+ 'Meta': {'unique_together': "(('release', 'layer_source'),)", 'object_name': 'ReleaseLayerSourcePriority'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'layer_source': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.LayerSource']"}),
+ 'priority': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'release': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Release']"})
+ },
+ u'orm.target': {
+ 'Meta': {'object_name': 'Target'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Build']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'image_size': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'is_image': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'license_manifest_path': ('django.db.models.fields.CharField', [], {'max_length': '500', 'null': 'True'}),
+ 'target': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'task': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True'})
+ },
+ u'orm.target_file': {
+ 'Meta': {'object_name': 'Target_File'},
+ 'directory': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'directory_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'group': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'inodetype': ('django.db.models.fields.IntegerField', [], {}),
+ 'owner': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
+ 'path': ('django.db.models.fields.FilePathField', [], {'max_length': '100'}),
+ 'permission': ('django.db.models.fields.CharField', [], {'max_length': '16'}),
+ 'size': ('django.db.models.fields.IntegerField', [], {}),
+ 'sym_target': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'symlink_set'", 'null': 'True', 'to': u"orm['orm.Target_File']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_image_file': {
+ 'Meta': {'object_name': 'Target_Image_File'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '254'}),
+ 'file_size': ('django.db.models.fields.IntegerField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.target_installed_package': {
+ 'Meta': {'object_name': 'Target_Installed_Package'},
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'package': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'buildtargetlist_package'", 'to': u"orm['orm.Package']"}),
+ 'target': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['orm.Target']"})
+ },
+ u'orm.task': {
+ 'Meta': {'ordering': "('order', 'recipe')", 'unique_together': "(('build', 'recipe', 'task_name'),)", 'object_name': 'Task'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_build'", 'to': u"orm['orm.Build']"}),
+ 'cpu_usage': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '8', 'decimal_places': '2'}),
+ 'disk_io': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'elapsed_time': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '8', 'decimal_places': '2'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'logfile': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'message': ('django.db.models.fields.CharField', [], {'max_length': '240'}),
+ 'order': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'outcome': ('django.db.models.fields.IntegerField', [], {'default': '-1'}),
+ 'path_to_sstate_obj': ('django.db.models.fields.FilePathField', [], {'max_length': '500', 'blank': 'True'}),
+ 'recipe': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'tasks'", 'to': u"orm['orm.Recipe']"}),
+ 'script_type': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'source_url': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'}),
+ 'sstate_checksum': ('django.db.models.fields.CharField', [], {'max_length': '100', 'blank': 'True'}),
+ 'sstate_result': ('django.db.models.fields.IntegerField', [], {'default': '0'}),
+ 'task_executed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'task_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'work_directory': ('django.db.models.fields.FilePathField', [], {'max_length': '255', 'blank': 'True'})
+ },
+ u'orm.task_dependency': {
+ 'Meta': {'object_name': 'Task_Dependency'},
+ 'depends_on': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_depends'", 'to': u"orm['orm.Task']"}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'task': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'task_dependencies_task'", 'to': u"orm['orm.Task']"})
+ },
+ u'orm.toastersetting': {
+ 'Meta': {'object_name': 'ToasterSetting'},
+ 'helptext': ('django.db.models.fields.TextField', [], {}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '63'}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ u'orm.variable': {
+ 'Meta': {'object_name': 'Variable'},
+ 'build': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'variable_build'", 'to': u"orm['orm.Build']"}),
+ 'changed': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'description': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'human_readable_name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'variable_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
+ 'variable_value': ('django.db.models.fields.TextField', [], {'blank': 'True'})
+ },
+ u'orm.variablehistory': {
+ 'Meta': {'object_name': 'VariableHistory'},
+ 'file_name': ('django.db.models.fields.FilePathField', [], {'max_length': '255'}),
+ u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'line_number': ('django.db.models.fields.IntegerField', [], {'null': 'True'}),
+ 'operation': ('django.db.models.fields.CharField', [], {'max_length': '64'}),
+ 'value': ('django.db.models.fields.TextField', [], {'blank': 'True'}),
+ 'variable': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'vhistory'", 'to': u"orm['orm.Variable']"})
+ }
+ }
+
+ complete_apps = ['orm']
diff --git a/bitbake/lib/toaster/orm/migrations/__init__.py b/bitbake/lib/toaster/orm/migrations/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/orm/migrations/__init__.py
diff --git a/bitbake/lib/toaster/orm/models.py b/bitbake/lib/toaster/orm/models.py
new file mode 100644
index 0000000..e4d2e87
--- /dev/null
+++ b/bitbake/lib/toaster/orm/models.py
@@ -0,0 +1,1233 @@
+#
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2013 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from django.db import models, IntegrityError
+from django.db.models import F, Q, Avg, Max
+from django.utils import timezone
+
+from django.core.urlresolvers import reverse
+
+from django.core import validators
+from django.conf import settings
+import django.db.models.signals
+
+
+import logging
+logger = logging.getLogger("toaster")
+
+
+class GitURLValidator(validators.URLValidator):
+ import re
+ regex = re.compile(
+        r'^(?:ssh|git|http|ftp)s?://' # ssh://, git://, http(s):// or ftp(s)://
+ r'(?:(?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+(?:[A-Z]{2,6}\.?|[A-Z0-9-]{2,}\.?)|' # domain...
+ r'localhost|' # localhost...
+ r'\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}|' # ...or ipv4
+ r'\[?[A-F0-9]*:[A-F0-9:]+\]?)' # ...or ipv6
+ r'(?::\d+)?' # optional port
+ r'(?:/?|[/?]\S+)$', re.IGNORECASE)
+
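+# Helper that builds a standard models.URLField but swaps Django's stock
+# URLValidator for the GitURLValidator above, so git://, ssh:// and ftp://
+# remote URLs (as commonly used for layer repositories) are accepted.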
+def GitURLField(**kwargs):
+ r = models.URLField(**kwargs)
+ for i in xrange(len(r.validators)):
+ if isinstance(r.validators[i], validators.URLValidator):
+ r.validators[i] = GitURLValidator()
+ return r
+
+
+class ToasterSetting(models.Model):
+ name = models.CharField(max_length=63)
+ helptext = models.TextField()
+ value = models.CharField(max_length=255)
+
+ def __unicode__(self):
+ return "Setting %s = %s" % (self.name, self.value)
+
+class ProjectManager(models.Manager):
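+    # Projects are meant to be created through create_project() (create() is
+    # blocked below) so that the DEFCONF_* default variables and the release's
+    # default layers are attached to the new project as well.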
+ def create_project(self, name, release):
+ if release is not None:
+ prj = self.model(name = name, bitbake_version = release.bitbake_version, release = release)
+ else:
+ prj = self.model(name = name, bitbake_version = None, release = None)
+
+ prj.save()
+
+ for defaultconf in ToasterSetting.objects.filter(name__startswith="DEFCONF_"):
+ name = defaultconf.name[8:]
+ ProjectVariable.objects.create( project = prj,
+ name = name,
+ value = defaultconf.value)
+
+ if release is None:
+ return prj
+
+ for rdl in release.releasedefaultlayer_set.all():
+ try:
+ lv = Layer_Version.objects.filter(layer__name = rdl.layer_name, up_branch__name = release.branch_name)[0].get_equivalents_wpriority(prj)[0]
+ ProjectLayer.objects.create( project = prj,
+ layercommit = lv,
+ optional = False )
+ except IndexError:
+ # we may have no valid layer version objects, and that's ok
+ pass
+
+ return prj
+
+ def create(self, *args, **kwargs):
+ raise Exception("Invalid call to Project.objects.create. Use Project.objects.create_project() to create a project")
+
+ # return single object with is_default = True
+ def get_default_project(self):
+ projects = super(ProjectManager, self).filter(is_default = True)
+ if len(projects) > 1:
+ raise Exception("Inconsistent project data: multiple " +
+ "default projects (i.e. with is_default=True)")
+ elif len(projects) < 1:
+ raise Exception("Inconsistent project data: no default project found")
+ return projects[0]
+
+class Project(models.Model):
+ search_allowed_fields = ['name', 'short_description', 'release__name', 'release__branch_name']
+ name = models.CharField(max_length=100)
+ short_description = models.CharField(max_length=50, blank=True)
+ bitbake_version = models.ForeignKey('BitbakeVersion', null=True)
+ release = models.ForeignKey("Release", null=True)
+ created = models.DateTimeField(auto_now_add = True)
+ updated = models.DateTimeField(auto_now = True)
+    # This is a horrible hack; since Toaster has no "User" model available when
+    # running in interactive mode, we can't reference the field here directly.
+    # Instead, we keep a possibly-null reference to the User id, so as not to
+    # force hard links to possibly missing models.
+ user_id = models.IntegerField(null = True)
+ objects = ProjectManager()
+
+ # set to True for the project which is the default container
+ # for builds initiated by the command line etc.
+ is_default = models.BooleanField(default = False)
+
+ def __unicode__(self):
+ return "%s (Release %s, BBV %s)" % (self.name, self.release, self.bitbake_version)
+
+    def get_current_machine_name(self):
+        try:
+            return self.projectvariable_set.get(name="MACHINE").value
+        except (ProjectVariable.DoesNotExist, IndexError):
+            return "None"
+
+    def get_number_of_builds(self):
+        try:
+            return len(Build.objects.filter(project = self.id))
+        except (Build.DoesNotExist, IndexError):
+            return 0
+
+    def get_last_build_id(self):
+        try:
+            return Build.objects.filter(project = self.id).order_by('-completed_on')[0].id
+        except (Build.DoesNotExist, IndexError):
+            return -1
+
+    def get_last_outcome(self):
+        build_id = self.get_last_build_id()
+        if build_id == -1:
+            return ""
+        try:
+            return Build.objects.filter(id = build_id)[0].outcome
+        except (Build.DoesNotExist, IndexError):
+            return "not_found"
+
+    def get_last_target(self):
+        build_id = self.get_last_build_id()
+        if build_id == -1:
+            return ""
+        try:
+            return Target.objects.filter(build = build_id)[0].target
+        except (Target.DoesNotExist, IndexError):
+            return "not_found"
+
+    def get_last_errors(self):
+        build_id = self.get_last_build_id()
+        if build_id == -1:
+            return 0
+        try:
+            return Build.objects.filter(id = build_id)[0].errors.count()
+        except (Build.DoesNotExist, IndexError):
+            return "not_found"
+
+    def get_last_warnings(self):
+        build_id = self.get_last_build_id()
+        if build_id == -1:
+            return 0
+        try:
+            return Build.objects.filter(id = build_id)[0].warnings.count()
+        except (Build.DoesNotExist, IndexError):
+            return "not_found"
+
+    def get_last_imgfiles(self):
+        build_id = self.get_last_build_id()
+        if build_id == -1:
+            return ""
+        try:
+            return Variable.objects.filter(build = build_id, variable_name = "IMAGE_FSTYPES")[0].variable_value
+        except (Variable.DoesNotExist, IndexError):
+            return "not_found"
+
+ # returns a queryset of compatible layers for a project
+ def compatible_layerversions(self, release = None, layer_name = None):
+ if release == None:
+ release = self.release
+ # layers on the same branch or layers specifically set for this project
+ queryset = Layer_Version.objects.filter((Q(up_branch__name = release.branch_name) & Q(project = None)) | Q(project = self) | Q(build__project = self))
+
+ if layer_name is not None:
+ # we select only a layer name
+ queryset = queryset.filter(layer__name = layer_name)
+
+ # order by layer version priority
+ queryset = queryset.filter(Q(layer_source=None) | Q(layer_source__releaselayersourcepriority__release = release)).select_related('layer_source', 'layer', 'up_branch', "layer_source__releaselayersourcepriority__priority").order_by("-layer_source__releaselayersourcepriority__priority")
+
+ return queryset
+
+ def projectlayer_equivalent_set(self):
+ return self.compatible_layerversions().filter(layer__name__in = [x.layercommit.layer.name for x in self.projectlayer_set.all()]).select_related("up_branch")
+
+ def get_available_machines(self):
+ """ Returns QuerySet of all Machines which are provided by the
+ Layers currently added to the Project """
+ queryset = Machine.objects.filter(layer_version__in=self.projectlayer_equivalent_set)
+ return queryset
+
+ def get_all_compatible_machines(self):
+ """ Returns QuerySet of all the compatible machines available to the
+ project including ones from Layers not currently added """
+ compatible_layers = self.compatible_layerversions()
+
+ queryset = Machine.objects.filter(layer_version__in=compatible_layers)
+ return queryset
+
+ def get_available_recipes(self):
+ """ Returns QuerySet of all Recipes which are provided by the Layers
+ currently added to the Project """
+ project_layers = self.projectlayer_equivalent_set()
+ queryset = Recipe.objects.filter(layer_version__in = project_layers)
+
+ # Copied from get_all_compatible_recipes
+ search_maxids = map(lambda i: i[0], list(queryset.values('name').distinct().annotate(max_id=Max('id')).values_list('max_id')))
+ queryset = queryset.filter(id__in=search_maxids).select_related('layer_version', 'layer_version__layer', 'layer_version__up_branch', 'layer_source')
+ # End copy
+
+ return queryset
+
+ def get_all_compatible_recipes(self):
+ """ Returns QuerySet of all the compatible Recipes available to the
+ project including ones from Layers not currently added """
+ compatible_layerversions = self.compatible_layerversions()
+ queryset = Recipe.objects.filter(layer_version__in = compatible_layerversions)
+
+ search_maxids = map(lambda i: i[0], list(queryset.values('name').distinct().annotate(max_id=Max('id')).values_list('max_id')))
+
+ queryset = queryset.filter(id__in=search_maxids).select_related('layer_version', 'layer_version__layer', 'layer_version__up_branch', 'layer_source')
+ return queryset
+
+
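+    # Snapshots the current project configuration into a BuildRequest (with its
+    # BRBitbake, BRLayer, BRTarget and BRVariable rows) plus a placeholder Build;
+    # the BuildRequest is deleted again if anything fails part-way through.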
+ def schedule_build(self):
+ from bldcontrol.models import BuildRequest, BRTarget, BRLayer, BRVariable, BRBitbake
+ br = BuildRequest.objects.create(project = self)
+ try:
+
+ BRBitbake.objects.create(req = br,
+ giturl = self.bitbake_version.giturl,
+ commit = self.bitbake_version.branch,
+ dirpath = self.bitbake_version.dirpath)
+
+ for l in self.projectlayer_set.all().order_by("pk"):
+ commit = l.layercommit.get_vcs_reference()
+ print("ii Building layer ", l.layercommit.layer.name, " at vcs point ", commit)
+ BRLayer.objects.create(req = br, name = l.layercommit.layer.name, giturl = l.layercommit.layer.vcs_url, commit = commit, dirpath = l.layercommit.dirpath)
+
+ br.state = BuildRequest.REQ_QUEUED
+ now = timezone.now()
+ br.build = Build.objects.create(project = self,
+ completed_on=now,
+ started_on=now,
+ )
+ for t in self.projecttarget_set.all():
+ BRTarget.objects.create(req = br, target = t.target, task = t.task)
+ Target.objects.create(build = br.build, target = t.target)
+
+ for v in self.projectvariable_set.all():
+ BRVariable.objects.create(req = br, name = v.name, value = v.value)
+
+
+ try:
+ br.build.machine = self.projectvariable_set.get(name = 'MACHINE').value
+ br.build.save()
+ except ProjectVariable.DoesNotExist:
+ pass
+ br.save()
+ except Exception:
+ # revert the build request creation since we're not done cleanly
+ br.delete()
+ raise
+ return br
+
+class Build(models.Model):
+ SUCCEEDED = 0
+ FAILED = 1
+ IN_PROGRESS = 2
+
+ BUILD_OUTCOME = (
+ (SUCCEEDED, 'Succeeded'),
+ (FAILED, 'Failed'),
+ (IN_PROGRESS, 'In Progress'),
+ )
+
+ search_allowed_fields = ['machine', 'cooker_log_path', "target__target", "target__target_image_file__file_name"]
+
+ project = models.ForeignKey(Project) # must have a project
+ machine = models.CharField(max_length=100)
+ distro = models.CharField(max_length=100)
+ distro_version = models.CharField(max_length=100)
+ started_on = models.DateTimeField()
+ completed_on = models.DateTimeField()
+ outcome = models.IntegerField(choices=BUILD_OUTCOME, default=IN_PROGRESS)
+ cooker_log_path = models.CharField(max_length=500)
+ build_name = models.CharField(max_length=100)
+ bitbake_version = models.CharField(max_length=50)
+
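+    # Rough progress indicator: the percentage of this build's tasks that have
+    # already been assigned an execution order.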
+ def completeper(self):
+ tf = Task.objects.filter(build = self)
+ tfc = tf.count()
+ if tfc > 0:
+ completeper = tf.exclude(order__isnull=True).count()*100/tf.count()
+ else:
+ completeper = 0
+ return completeper
+
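+    # Estimated completion time, linearly extrapolated from the time elapsed so
+    # far and the completion percentage.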
+    def eta(self):
+        eta = timezone.now()
+        completeper = self.completeper()
+        if completeper > 0:
+            eta += ((eta - self.started_on)*(100-completeper))/completeper
+        return eta
+
+
+    def get_sorted_target_list(self):
+        tgts = Target.objects.filter(build_id = self.id).order_by('target')
+        return tgts
+
+ @property
+ def toaster_exceptions(self):
+ return self.logmessage_set.filter(level=LogMessage.EXCEPTION)
+
+ @property
+ def errors(self):
+ return (self.logmessage_set.filter(level=LogMessage.ERROR)|self.logmessage_set.filter(level=LogMessage.EXCEPTION))
+
+ @property
+ def warnings(self):
+ return self.logmessage_set.filter(level=LogMessage.WARNING)
+
+ @property
+ def timespent_seconds(self):
+ return (self.completed_on - self.started_on).total_seconds()
+
+ def get_current_status(self):
+ from bldcontrol.models import BuildRequest
+ if self.outcome == Build.IN_PROGRESS and self.buildrequest.state != BuildRequest.REQ_INPROGRESS:
+ return self.buildrequest.get_state_display()
+ return self.get_outcome_display()
+
+ def __str__(self):
+ return "%d %s %s" % (self.id, self.project, ",".join([t.target for t in self.target_set.all()]))
+
+
+# A BuildArtifact is any file produced by a Build that may be of interest to the user and is not stored elsewhere.
+class BuildArtifact(models.Model):
+ build = models.ForeignKey(Build)
+ file_name = models.FilePathField()
+ file_size = models.IntegerField()
+
+    def get_local_file_name(self):
+        deploydir = Variable.objects.get(build = self.build, variable_name="DEPLOY_DIR").variable_value
+        return self.file_name[len(deploydir)+1:]
+
+
+ def is_available(self):
+ return self.build.buildrequest.environment.has_artifact(self.file_name)
+
+class ProjectTarget(models.Model):
+ project = models.ForeignKey(Project)
+ target = models.CharField(max_length=100)
+ task = models.CharField(max_length=100, null=True)
+
+class Target(models.Model):
+ search_allowed_fields = ['target', 'file_name']
+ build = models.ForeignKey(Build)
+ target = models.CharField(max_length=100)
+ task = models.CharField(max_length=100, null=True)
+ is_image = models.BooleanField(default = False)
+ image_size = models.IntegerField(default=0)
+ license_manifest_path = models.CharField(max_length=500, null=True)
+
+ def package_count(self):
+ return Target_Installed_Package.objects.filter(target_id__exact=self.id).count()
+
+ def __unicode__(self):
+ return self.target
+
+class Target_Image_File(models.Model):
+ target = models.ForeignKey(Target)
+ file_name = models.FilePathField(max_length=254)
+ file_size = models.IntegerField()
+
+class Target_File(models.Model):
+ ITYPE_REGULAR = 1
+ ITYPE_DIRECTORY = 2
+ ITYPE_SYMLINK = 3
+ ITYPE_SOCKET = 4
+ ITYPE_FIFO = 5
+ ITYPE_CHARACTER = 6
+ ITYPE_BLOCK = 7
+ ITYPES = ( (ITYPE_REGULAR ,'regular'),
+ ( ITYPE_DIRECTORY ,'directory'),
+ ( ITYPE_SYMLINK ,'symlink'),
+ ( ITYPE_SOCKET ,'socket'),
+ ( ITYPE_FIFO ,'fifo'),
+ ( ITYPE_CHARACTER ,'character'),
+ ( ITYPE_BLOCK ,'block'),
+ )
+
+ target = models.ForeignKey(Target)
+ path = models.FilePathField()
+ size = models.IntegerField()
+ inodetype = models.IntegerField(choices = ITYPES)
+ permission = models.CharField(max_length=16)
+ owner = models.CharField(max_length=128)
+ group = models.CharField(max_length=128)
+ directory = models.ForeignKey('Target_File', related_name="directory_set", null=True)
+ sym_target = models.ForeignKey('Target_File', related_name="symlink_set", null=True)
+
+
+class Task(models.Model):
+
+ SSTATE_NA = 0
+ SSTATE_MISS = 1
+ SSTATE_FAILED = 2
+ SSTATE_RESTORED = 3
+
+ SSTATE_RESULT = (
+ (SSTATE_NA, 'Not Applicable'), # For rest of tasks, but they still need checking.
+ (SSTATE_MISS, 'File not in cache'), # the sstate object was not found
+ (SSTATE_FAILED, 'Failed'), # there was a pkg, but the script failed
+ (SSTATE_RESTORED, 'Succeeded'), # successfully restored
+ )
+
+ CODING_NA = 0
+ CODING_PYTHON = 2
+ CODING_SHELL = 3
+
+ TASK_CODING = (
+ (CODING_NA, 'N/A'),
+ (CODING_PYTHON, 'Python'),
+ (CODING_SHELL, 'Shell'),
+ )
+
+ OUTCOME_NA = -1
+ OUTCOME_SUCCESS = 0
+ OUTCOME_COVERED = 1
+ OUTCOME_CACHED = 2
+ OUTCOME_PREBUILT = 3
+ OUTCOME_FAILED = 4
+ OUTCOME_EMPTY = 5
+
+ TASK_OUTCOME = (
+ (OUTCOME_NA, 'Not Available'),
+ (OUTCOME_SUCCESS, 'Succeeded'),
+ (OUTCOME_COVERED, 'Covered'),
+ (OUTCOME_CACHED, 'Cached'),
+ (OUTCOME_PREBUILT, 'Prebuilt'),
+ (OUTCOME_FAILED, 'Failed'),
+ (OUTCOME_EMPTY, 'Empty'),
+ )
+
+ TASK_OUTCOME_HELP = (
+ (OUTCOME_SUCCESS, 'This task successfully completed'),
+ (OUTCOME_COVERED, 'This task did not run because its output is provided by another task'),
+ (OUTCOME_CACHED, 'This task restored output from the sstate-cache directory or mirrors'),
+ (OUTCOME_PREBUILT, 'This task did not run because its outcome was reused from a previous build'),
+ (OUTCOME_FAILED, 'This task did not complete'),
+ (OUTCOME_EMPTY, 'This task has no executable content'),
+ (OUTCOME_NA, ''),
+ )
+
+ search_allowed_fields = [ "recipe__name", "recipe__version", "task_name", "logfile" ]
+
+ def __init__(self, *args, **kwargs):
+ super(Task, self).__init__(*args, **kwargs)
+ try:
+ self._helptext = HelpText.objects.get(key=self.task_name, area=HelpText.VARIABLE, build=self.build).text
+ except HelpText.DoesNotExist:
+ self._helptext = None
+
+ def get_related_setscene(self):
+ return Task.objects.filter(task_executed=True, build = self.build, recipe = self.recipe, task_name=self.task_name+"_setscene")
+
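+    # TASK_OUTCOME lists OUTCOME_NA (-1) first, so adding 1 to the outcome value
+    # gives the index of the matching (value, label) pair.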
+ def get_outcome_text(self):
+ return Task.TASK_OUTCOME[int(self.outcome) + 1][1]
+
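+    # TASK_OUTCOME_HELP lists OUTCOME_NA (-1) last, so indexing directly with the
+    # outcome value works: -1 wraps around to the final entry.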
+ def get_outcome_help(self):
+ return Task.TASK_OUTCOME_HELP[int(self.outcome)][1]
+
+ def get_sstate_text(self):
+ if self.sstate_result==Task.SSTATE_NA:
+ return ''
+ else:
+ return Task.SSTATE_RESULT[int(self.sstate_result)][1]
+
+ def get_executed_display(self):
+ if self.task_executed:
+ return "Executed"
+ return "Not Executed"
+
+ def get_description(self):
+ return self._helptext
+
+ build = models.ForeignKey(Build, related_name='task_build')
+ order = models.IntegerField(null=True)
+    task_executed = models.BooleanField(default=False) # True means Executed, False means Not Executed
+ outcome = models.IntegerField(choices=TASK_OUTCOME, default=OUTCOME_NA)
+ sstate_checksum = models.CharField(max_length=100, blank=True)
+ path_to_sstate_obj = models.FilePathField(max_length=500, blank=True)
+ recipe = models.ForeignKey('Recipe', related_name='tasks')
+ task_name = models.CharField(max_length=100)
+ source_url = models.FilePathField(max_length=255, blank=True)
+ work_directory = models.FilePathField(max_length=255, blank=True)
+ script_type = models.IntegerField(choices=TASK_CODING, default=CODING_NA)
+ line_number = models.IntegerField(default=0)
+ disk_io = models.IntegerField(null=True)
+ cpu_usage = models.DecimalField(max_digits=8, decimal_places=2, null=True)
+ elapsed_time = models.DecimalField(max_digits=8, decimal_places=2, null=True)
+ sstate_result = models.IntegerField(choices=SSTATE_RESULT, default=SSTATE_NA)
+ message = models.CharField(max_length=240)
+ logfile = models.FilePathField(max_length=255, blank=True)
+
+ outcome_text = property(get_outcome_text)
+ sstate_text = property(get_sstate_text)
+
+ def __unicode__(self):
+ return "%d(%d) %s:%s" % (self.pk, self.build.pk, self.recipe.name, self.task_name)
+
+ class Meta:
+ ordering = ('order', 'recipe' ,)
+ unique_together = ('build', 'recipe', 'task_name', )
+
+
+class Task_Dependency(models.Model):
+ task = models.ForeignKey(Task, related_name='task_dependencies_task')
+ depends_on = models.ForeignKey(Task, related_name='task_dependencies_depends')
+
+class Package(models.Model):
+ search_allowed_fields = ['name', 'version', 'revision', 'recipe__name', 'recipe__version', 'recipe__license', 'recipe__layer_version__layer__name', 'recipe__layer_version__branch', 'recipe__layer_version__commit', 'recipe__layer_version__local_path', 'installed_name']
+ build = models.ForeignKey('Build')
+ recipe = models.ForeignKey('Recipe', null=True)
+ name = models.CharField(max_length=100)
+ installed_name = models.CharField(max_length=100, default='')
+ version = models.CharField(max_length=100, blank=True)
+ revision = models.CharField(max_length=32, blank=True)
+ summary = models.TextField(blank=True)
+ description = models.TextField(blank=True)
+ size = models.IntegerField(default=0)
+ installed_size = models.IntegerField(default=0)
+ section = models.CharField(max_length=80, blank=True)
+ license = models.CharField(max_length=80, blank=True)
+
+class Package_DependencyManager(models.Manager):
+ use_for_related_fields = True
+
+ def get_query_set(self):
+ return super(Package_DependencyManager, self).get_query_set().exclude(package_id = F('depends_on__id'))
+
+class Package_Dependency(models.Model):
+ TYPE_RDEPENDS = 0
+ TYPE_TRDEPENDS = 1
+ TYPE_RRECOMMENDS = 2
+ TYPE_TRECOMMENDS = 3
+ TYPE_RSUGGESTS = 4
+ TYPE_RPROVIDES = 5
+ TYPE_RREPLACES = 6
+ TYPE_RCONFLICTS = 7
+    # TODO: bpackage should be changed to remove the DEPENDS_TYPE access
+ DEPENDS_TYPE = (
+ (TYPE_RDEPENDS, "depends"),
+ (TYPE_TRDEPENDS, "depends"),
+ (TYPE_TRECOMMENDS, "recommends"),
+ (TYPE_RRECOMMENDS, "recommends"),
+ (TYPE_RSUGGESTS, "suggests"),
+ (TYPE_RPROVIDES, "provides"),
+ (TYPE_RREPLACES, "replaces"),
+ (TYPE_RCONFLICTS, "conflicts"),
+ )
+ """ Indexed by dep_type, in view order, key for short name and help
+ description which when viewed will be printf'd with the
+ package name.
+ """
+ DEPENDS_DICT = {
+ TYPE_RDEPENDS : ("depends", "%s is required to run %s"),
+ TYPE_TRDEPENDS : ("depends", "%s is required to run %s"),
+ TYPE_TRECOMMENDS : ("recommends", "%s extends the usability of %s"),
+ TYPE_RRECOMMENDS : ("recommends", "%s extends the usability of %s"),
+ TYPE_RSUGGESTS : ("suggests", "%s is suggested for installation with %s"),
+ TYPE_RPROVIDES : ("provides", "%s is provided by %s"),
+ TYPE_RREPLACES : ("replaces", "%s is replaced by %s"),
+ TYPE_RCONFLICTS : ("conflicts", "%s conflicts with %s, which will not be installed if this package is not first removed"),
+ }
+
+ package = models.ForeignKey(Package, related_name='package_dependencies_source')
+ depends_on = models.ForeignKey(Package, related_name='package_dependencies_target') # soft dependency
+ dep_type = models.IntegerField(choices=DEPENDS_TYPE)
+ target = models.ForeignKey(Target, null=True)
+ objects = Package_DependencyManager()
+
+class Target_Installed_Package(models.Model):
+ target = models.ForeignKey(Target)
+ package = models.ForeignKey(Package, related_name='buildtargetlist_package')
+
+class Package_File(models.Model):
+ package = models.ForeignKey(Package, related_name='buildfilelist_package')
+ path = models.FilePathField(max_length=255, blank=True)
+ size = models.IntegerField()
+
+class Recipe(models.Model):
+ search_allowed_fields = ['name', 'version', 'file_path', 'section', 'summary', 'description', 'license', 'layer_version__layer__name', 'layer_version__branch', 'layer_version__commit', 'layer_version__local_path', 'layer_version__layer_source__name']
+
+ layer_source = models.ForeignKey('LayerSource', default = None, null = True) # from where did we get this recipe
+ up_id = models.IntegerField(null = True, default = None) # id of entry in the source
+ up_date = models.DateTimeField(null = True, default = None)
+
+ name = models.CharField(max_length=100, blank=True) # pn
+ version = models.CharField(max_length=100, blank=True) # pv
+ layer_version = models.ForeignKey('Layer_Version', related_name='recipe_layer_version')
+ summary = models.TextField(blank=True)
+ description = models.TextField(blank=True)
+ section = models.CharField(max_length=100, blank=True)
+ license = models.CharField(max_length=200, blank=True)
+ homepage = models.URLField(blank=True)
+ bugtracker = models.URLField(blank=True)
+ file_path = models.FilePathField(max_length=255)
+ pathflags = models.CharField(max_length=200, blank=True)
+ is_image = models.BooleanField(default=False)
+
+ def get_layersource_view_url(self):
+ if self.layer_source is None:
+ return ""
+
+ url = self.layer_source.get_object_view(self.layer_version.up_branch, "recipes", self.name)
+ return url
+
+ def __unicode__(self):
+ return "Recipe " + self.name + ":" + self.version
+
+ def get_vcs_recipe_file_link_url(self):
+ return self.layer_version.get_vcs_file_link_url(self.file_path)
+
+ def get_description_or_summary(self):
+ if self.description:
+ return self.description
+ elif self.summary:
+ return self.summary
+ else:
+ return ""
+
+ class Meta:
+ unique_together = (("layer_version", "file_path", "pathflags"), )
+
+
+class Recipe_DependencyManager(models.Manager):
+ use_for_related_fields = True
+
+ def get_query_set(self):
+ return super(Recipe_DependencyManager, self).get_query_set().exclude(recipe_id = F('depends_on__id'))
+
+class Recipe_Dependency(models.Model):
+ TYPE_DEPENDS = 0
+ TYPE_RDEPENDS = 1
+
+ DEPENDS_TYPE = (
+ (TYPE_DEPENDS, "depends"),
+ (TYPE_RDEPENDS, "rdepends"),
+ )
+ recipe = models.ForeignKey(Recipe, related_name='r_dependencies_recipe')
+ depends_on = models.ForeignKey(Recipe, related_name='r_dependencies_depends')
+ dep_type = models.IntegerField(choices=DEPENDS_TYPE)
+ objects = Recipe_DependencyManager()
+
+
+class Machine(models.Model):
+ search_allowed_fields = ["name", "description", "layer_version__layer__name"]
+ layer_source = models.ForeignKey('LayerSource', default = None, null = True) # from where did we get this machine
+ up_id = models.IntegerField(null = True, default = None) # id of entry in the source
+ up_date = models.DateTimeField(null = True, default = None)
+
+ layer_version = models.ForeignKey('Layer_Version')
+ name = models.CharField(max_length=255)
+ description = models.CharField(max_length=255)
+
+ def get_vcs_machine_file_link_url(self):
+ path = 'conf/machine/'+self.name+'.conf'
+
+ return self.layer_version.get_vcs_file_link_url(path)
+
+ def __unicode__(self):
+ return "Machine " + self.name + "(" + self.description + ")"
+
+ class Meta:
+ unique_together = ("layer_source", "up_id")
+
+
+from django.db.models.base import ModelBase
+
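+# Metaclass that makes instantiating LayerSource (e.g. via the ORM) return an
+# instance of the concrete proxy subclass matching its sourcetype, by delegating
+# to LayerSource.get_object() after construction.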
+class InheritanceMetaclass(ModelBase):
+ def __call__(cls, *args, **kwargs):
+ obj = super(InheritanceMetaclass, cls).__call__(*args, **kwargs)
+ return obj.get_object()
+
+
+class LayerSource(models.Model):
+ __metaclass__ = InheritanceMetaclass
+
+ class Meta:
+ unique_together = (('sourcetype', 'apiurl'), )
+
+ TYPE_LOCAL = 0
+ TYPE_LAYERINDEX = 1
+ TYPE_IMPORTED = 2
+ SOURCE_TYPE = (
+ (TYPE_LOCAL, "local"),
+ (TYPE_LAYERINDEX, "layerindex"),
+ (TYPE_IMPORTED, "imported"),
+ )
+
+ name = models.CharField(max_length=63, unique = True)
+ sourcetype = models.IntegerField(choices=SOURCE_TYPE)
+ apiurl = models.CharField(max_length=255, null=True, default=None)
+
+ def __init__(self, *args, **kwargs):
+ super(LayerSource, self).__init__(*args, **kwargs)
+ if self.sourcetype == LayerSource.TYPE_LOCAL:
+ self.__class__ = LocalLayerSource
+ elif self.sourcetype == LayerSource.TYPE_LAYERINDEX:
+ self.__class__ = LayerIndexLayerSource
+ elif self.sourcetype == LayerSource.TYPE_IMPORTED:
+ self.__class__ = ImportedLayerSource
+        elif self.sourcetype is None:
+ raise Exception("Unknown LayerSource-derived class. If you added a new layer source type, fill out all code stubs.")
+
+
+ def update(self):
+ """
+ Updates the local database information from the upstream layer source
+ """
+ raise Exception("Abstract, update() must be implemented by all LayerSource-derived classes (object is %s)" % str(vars(self)))
+
+ def save(self, *args, **kwargs):
+ return super(LayerSource, self).save(*args, **kwargs)
+
+    def get_object(self):
+        # preset an un-initialized object
+        if self.name is None:
+            self.name = ""
+        if self.apiurl is None:
+            self.apiurl = ""
+        if self.sourcetype is None:
+            self.sourcetype = LayerSource.TYPE_LOCAL
+
+ if self.sourcetype == LayerSource.TYPE_LOCAL:
+ self.__class__ = LocalLayerSource
+ elif self.sourcetype == LayerSource.TYPE_LAYERINDEX:
+ self.__class__ = LayerIndexLayerSource
+ elif self.sourcetype == LayerSource.TYPE_IMPORTED:
+ self.__class__ = ImportedLayerSource
+ else:
+ raise Exception("Unknown LayerSource type. If you added a new layer source type, fill out all code stubs.")
+ return self
+
+ def __unicode__(self):
+ return "%s (%s)" % (self.name, self.sourcetype)
+
+
+class LocalLayerSource(LayerSource):
+ class Meta(LayerSource._meta.__class__):
+ proxy = True
+
+ def __init__(self, *args, **kwargs):
+        super(LocalLayerSource, self).__init__(*args, **kwargs)
+ self.sourcetype = LayerSource.TYPE_LOCAL
+
+ def update(self):
+ """
+ Fetches layer, recipe and machine information from local repository
+ """
+ pass
+
+class ImportedLayerSource(LayerSource):
+ class Meta(LayerSource._meta.__class__):
+ proxy = True
+
+ def __init__(self, *args, **kwargs):
+        super(ImportedLayerSource, self).__init__(*args, **kwargs)
+ self.sourcetype = LayerSource.TYPE_IMPORTED
+
+ def update(self):
+ """
+ Fetches layer, recipe and machine information from local repository
+ """
+ pass
+
+
+class LayerIndexLayerSource(LayerSource):
+ class Meta(LayerSource._meta.__class__):
+ proxy = True
+
+ def __init__(self, *args, **kwargs):
+        super(LayerIndexLayerSource, self).__init__(*args, **kwargs)
+ self.sourcetype = LayerSource.TYPE_LAYERINDEX
+
+ def get_object_view(self, branch, objectype, upid):
+ return self.apiurl + "../branch/" + branch.name + "/" + objectype + "/?q=" + str(upid)
+
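+    # Mirrors branches, layers, layer branches (Layer_Version), layer
+    # dependencies, machines and recipes from a remote layerindex-web API into
+    # the local database, turning autocommit off around the bulk writes where
+    # the database backend allows it.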
+ def update(self):
+ """
+ Fetches layer, recipe and machine information from remote repository
+ """
+ assert self.apiurl is not None
+ from django.db import transaction, connection
+
+ import urllib2, urlparse, json
+ import os
+ proxy_settings = os.environ.get("http_proxy", None)
+
+ def _get_json_response(apiurl = self.apiurl):
+ _parsedurl = urlparse.urlparse(apiurl)
+ path = _parsedurl.path
+
+ try:
+ res = urllib2.urlopen(apiurl)
+ except urllib2.URLError as e:
+ raise Exception("Failed to read %s: %s" % (path, e.reason))
+
+ return json.loads(res.read())
+
+ # verify we can get the basic api
+ try:
+ apilinks = _get_json_response()
+ except Exception as e:
+ import traceback
+ if proxy_settings is not None:
+ logger.info("EE: Using proxy %s" % proxy_settings)
+ logger.warning("EE: could not connect to %s, skipping update: %s\n%s" % (self.apiurl, e, traceback.format_exc(e)))
+ return
+
+ # update branches; only those that we already have names listed in the
+ # Releases table
+ whitelist_branch_names = map(lambda x: x.branch_name, Release.objects.all())
+ if len(whitelist_branch_names) == 0:
+ raise Exception("Failed to make list of branches to fetch")
+
+ logger.debug("Fetching branches")
+ branches_info = _get_json_response(apilinks['branches']
+ + "?filter=name:%s" % "OR".join(whitelist_branch_names))
+ for bi in branches_info:
+ b, created = Branch.objects.get_or_create(layer_source = self, name = bi['name'])
+ b.up_id = bi['id']
+ b.up_date = bi['updated']
+ b.name = bi['name']
+ b.short_description = bi['short_description']
+ b.save()
+
+ # update layers
+ layers_info = _get_json_response(apilinks['layerItems'])
+ if not connection.features.autocommits_when_autocommit_is_off:
+ transaction.set_autocommit(False)
+ for li in layers_info:
+ l, created = Layer.objects.get_or_create(layer_source = self, name = li['name'])
+ l.up_id = li['id']
+ l.up_date = li['updated']
+ l.vcs_url = li['vcs_url']
+ l.vcs_web_url = li['vcs_web_url']
+ l.vcs_web_tree_base_url = li['vcs_web_tree_base_url']
+ l.vcs_web_file_base_url = li['vcs_web_file_base_url']
+ l.summary = li['summary']
+ l.description = li['description']
+ l.save()
+ if not connection.features.autocommits_when_autocommit_is_off:
+ transaction.set_autocommit(True)
+
+ # update layerbranches/layer_versions
+ logger.debug("Fetching layer information")
+ layerbranches_info = _get_json_response(apilinks['layerBranches']
+ + "?filter=branch:%s" % "OR".join(map(lambda x: str(x.up_id), [i for i in Branch.objects.filter(layer_source = self) if i.up_id is not None] ))
+ )
+
+ if not connection.features.autocommits_when_autocommit_is_off:
+ transaction.set_autocommit(False)
+ for lbi in layerbranches_info:
+ lv, created = Layer_Version.objects.get_or_create(layer_source = self,
+ up_id = lbi['id'],
+ layer=Layer.objects.get(layer_source = self, up_id = lbi['layer'])
+ )
+
+ lv.up_date = lbi['updated']
+ lv.up_branch = Branch.objects.get(layer_source = self, up_id = lbi['branch'])
+ lv.branch = lbi['actual_branch']
+ lv.commit = lbi['actual_branch']
+ lv.dirpath = lbi['vcs_subdir']
+ lv.save()
+ if not connection.features.autocommits_when_autocommit_is_off:
+ transaction.set_autocommit(True)
+
+ # update layer dependencies
+ layerdependencies_info = _get_json_response(apilinks['layerDependencies'])
+ dependlist = {}
+ if not connection.features.autocommits_when_autocommit_is_off:
+ transaction.set_autocommit(False)
+ for ldi in layerdependencies_info:
+ try:
+ lv = Layer_Version.objects.get(layer_source = self, up_id = ldi['layerbranch'])
+ except Layer_Version.DoesNotExist as e:
+ continue
+
+ if lv not in dependlist:
+ dependlist[lv] = []
+ try:
+ dependlist[lv].append(Layer_Version.objects.get(layer_source = self, layer__up_id = ldi['dependency'], up_branch = lv.up_branch))
+ except Layer_Version.DoesNotExist:
+ logger.warning("Cannot find layer version (ls:%s), up_id:%s lv:%s" % (self, ldi['dependency'], lv))
+
+ for lv in dependlist:
+ LayerVersionDependency.objects.filter(layer_version = lv).delete()
+ for lvd in dependlist[lv]:
+ LayerVersionDependency.objects.get_or_create(layer_version = lv, depends_on = lvd)
+ if not connection.features.autocommits_when_autocommit_is_off:
+ transaction.set_autocommit(True)
+
+
+ # update machines
+ logger.debug("Fetching machine information")
+ machines_info = _get_json_response(apilinks['machines']
+ + "?filter=layerbranch:%s" % "OR".join(map(lambda x: str(x.up_id), Layer_Version.objects.filter(layer_source = self)))
+ )
+
+ if not connection.features.autocommits_when_autocommit_is_off:
+ transaction.set_autocommit(False)
+ for mi in machines_info:
+ mo, created = Machine.objects.get_or_create(layer_source = self, up_id = mi['id'], layer_version = Layer_Version.objects.get(layer_source = self, up_id = mi['layerbranch']))
+ mo.up_date = mi['updated']
+ mo.name = mi['name']
+ mo.description = mi['description']
+ mo.save()
+
+ if not connection.features.autocommits_when_autocommit_is_off:
+ transaction.set_autocommit(True)
+
+ # update recipes; paginate by layer version / layer branch
+ logger.debug("Fetching target information")
+ recipes_info = _get_json_response(apilinks['recipes']
+ + "?filter=layerbranch:%s" % "OR".join(map(lambda x: str(x.up_id), Layer_Version.objects.filter(layer_source = self)))
+ )
+ if not connection.features.autocommits_when_autocommit_is_off:
+ transaction.set_autocommit(False)
+ for ri in recipes_info:
+ try:
+ ro, created = Recipe.objects.get_or_create(layer_source = self, up_id = ri['id'], layer_version = Layer_Version.objects.get(layer_source = self, up_id = ri['layerbranch']))
+ ro.up_date = ri['updated']
+ ro.name = ri['pn']
+ ro.version = ri['pv']
+ ro.summary = ri['summary']
+ ro.description = ri['description']
+ ro.section = ri['section']
+ ro.license = ri['license']
+ ro.homepage = ri['homepage']
+ ro.bugtracker = ri['bugtracker']
+ ro.file_path = ri['filepath'] + "/" + ri['filename']
+ if 'inherits' in ri:
+ ro.is_image = 'image' in ri['inherits'].split()
+ ro.save()
+ except IntegrityError as e:
+ logger.debug("Failed saving recipe, ignoring: %s (%s:%s)" % (e, ro.layer_version, ri['filepath']+"/"+ri['filename']))
+ if not connection.features.autocommits_when_autocommit_is_off:
+ transaction.set_autocommit(True)
+
+class BitbakeVersion(models.Model):
+
+ name = models.CharField(max_length=32, unique = True)
+ giturl = GitURLField()
+ branch = models.CharField(max_length=32)
+ dirpath = models.CharField(max_length=255)
+
+ def __unicode__(self):
+ return "%s (Branch: %s)" % (self.name, self.branch)
+
+
+class Release(models.Model):
+ """ A release is a project template, used to pre-populate Project settings with a configuration set """
+ name = models.CharField(max_length=32, unique = True)
+ description = models.CharField(max_length=255)
+ bitbake_version = models.ForeignKey(BitbakeVersion)
+ branch_name = models.CharField(max_length=50, default = "")
+ helptext = models.TextField(null=True)
+
+ def __unicode__(self):
+ return "%s (%s)" % (self.name, self.branch_name)
+
+class ReleaseLayerSourcePriority(models.Model):
+ """ Each release selects layers from the set up layer sources, ordered by priority """
+ release = models.ForeignKey("Release")
+ layer_source = models.ForeignKey("LayerSource")
+ priority = models.IntegerField(default = 0)
+
+ def __unicode__(self):
+ return "%s-%s:%d" % (self.release.name, self.layer_source.name, self.priority)
+ class Meta:
+ unique_together = (('release', 'layer_source'),)
+
+
+class ReleaseDefaultLayer(models.Model):
+ release = models.ForeignKey(Release)
+ layer_name = models.CharField(max_length=100, default="")
+
+
+# Branch class is synced with layerindex.Branch, branches can only come from remote layer indexes
+class Branch(models.Model):
+    layer_source = models.ForeignKey('LayerSource', null = True, default = None)
+ up_id = models.IntegerField(null = True, default = None) # id of branch in the source
+ up_date = models.DateTimeField(null = True, default = None)
+
+ name = models.CharField(max_length=50)
+ short_description = models.CharField(max_length=50, blank=True)
+
+ class Meta:
+ verbose_name_plural = "Branches"
+ unique_together = (('layer_source', 'name'),('layer_source', 'up_id'))
+
+ def __unicode__(self):
+ return self.name
+
+
+# Layer class synced with layerindex.LayerItem
+class Layer(models.Model):
+    layer_source = models.ForeignKey(LayerSource, null = True, default = None) # from where did we get this layer
+ up_id = models.IntegerField(null = True, default = None) # id of layer in the remote source
+ up_date = models.DateTimeField(null = True, default = None)
+
+ name = models.CharField(max_length=100)
+ layer_index_url = models.URLField()
+ vcs_url = GitURLField(default = None, null = True)
+ vcs_web_url = models.URLField(null = True, default = None)
+ vcs_web_tree_base_url = models.URLField(null = True, default = None)
+ vcs_web_file_base_url = models.URLField(null = True, default = None)
+
+ summary = models.TextField(help_text='One-line description of the layer', null = True, default = None)
+ description = models.TextField(null = True, default = None)
+
+ def __unicode__(self):
+ return "%s / %s " % (self.name, self.layer_source)
+
+ class Meta:
+ unique_together = (("layer_source", "up_id"), ("layer_source", "name"))
+
+
+# LayerCommit class is synced with layerindex.LayerBranch
+class Layer_Version(models.Model):
+ search_allowed_fields = ["layer__name", "layer__summary", "layer__description", "layer__vcs_url", "dirpath", "up_branch__name", "commit", "branch"]
+ build = models.ForeignKey(Build, related_name='layer_version_build', default = None, null = True)
+ layer = models.ForeignKey(Layer, related_name='layer_version_layer')
+
+ layer_source = models.ForeignKey(LayerSource, null = True, default = None) # from where did we get this Layer Version
+ up_id = models.IntegerField(null = True, default = None) # id of layerbranch in the remote source
+ up_date = models.DateTimeField(null = True, default = None)
+ up_branch = models.ForeignKey(Branch, null = True, default = None)
+
+ branch = models.CharField(max_length=80) # LayerBranch.actual_branch
+ commit = models.CharField(max_length=100) # LayerBranch.vcs_last_rev
+ dirpath = models.CharField(max_length=255, null = True, default = None) # LayerBranch.vcs_subdir
+ priority = models.IntegerField(default = 0) # if -1, this is a default layer
+
+ local_path = models.FilePathField(max_length=1024, default = "/") # where this layer was checked-out
+
+ project = models.ForeignKey('Project', null = True, default = None) # Set if this layer is project-specific; always set for imported layers, and project-set branches
+
+ # code lifted, with adaptations, from the layerindex-web application https://git.yoctoproject.org/cgit/cgit.cgi/layerindex-web/
+ def _handle_url_path(self, base_url, path):
+ import re, posixpath
+ if base_url:
+ if self.dirpath:
+ if path:
+ extra_path = self.dirpath + '/' + path
+ # Normalise out ../ in path for usage URL
+ extra_path = posixpath.normpath(extra_path)
+ # Minor workaround to handle case where subdirectory has been added between branches
+ # (should probably support usage URL per branch to handle this... sigh...)
+ if extra_path.startswith('../'):
+ extra_path = extra_path[3:]
+ else:
+ extra_path = self.dirpath
+ else:
+ extra_path = path
+ branchname = self.up_branch.name
+ url = base_url.replace('%branch%', branchname)
+
+ # If there's a % in the path (e.g. a wildcard bbappend) we need to encode it
+ if extra_path:
+ extra_path = extra_path.replace('%', '%25')
+
+ if '%path%' in base_url:
+ if extra_path:
+ url = re.sub(r'\[([^\]]*%path%[^\]]*)\]', '\\1', url)
+ else:
+ url = re.sub(r'\[([^\]]*%path%[^\]]*)\]', '', url)
+ return url.replace('%path%', extra_path)
+ else:
+ return url + extra_path
+ return None
+
+ def get_vcs_link_url(self):
+ if self.layer.vcs_web_url is None:
+ return None
+ return self.layer.vcs_web_url
+
+ def get_vcs_file_link_url(self, file_path=""):
+ if self.layer.vcs_web_file_base_url is None:
+ return None
+ return self._handle_url_path(self.layer.vcs_web_file_base_url, file_path)
+
+ def get_vcs_dirpath_link_url(self):
+ if self.layer.vcs_web_tree_base_url is None:
+ return None
+ return self._handle_url_path(self.layer.vcs_web_tree_base_url, '')
+
+ def get_equivalents_wpriority(self, project):
+ return project.compatible_layerversions(layer_name = self.layer.name)
+
+ def get_vcs_reference(self):
+ if self.commit is not None and len(self.commit) > 0:
+ return self.commit
+ if self.branch is not None and len(self.branch) > 0:
+ return self.branch
+ if self.up_branch is not None:
+ return self.up_branch.name
+ return ("Cannot determine the vcs_reference for layer version %s" % vars(self))
+
+ def get_detailspage_url(self, project_id):
+ return reverse('layerdetails', args=(project_id, self.pk))
+
+ def __unicode__(self):
+ return "%d %s (VCS %s, Project %s)" % (self.pk, str(self.layer), self.get_vcs_reference(), self.build.project if self.build is not None else "No project")
+
+ class Meta:
+ unique_together = ("layer_source", "up_id")
+
+class LayerVersionDependency(models.Model):
+    layer_source = models.ForeignKey(LayerSource, null = True, default = None) # from where did we get this layer
+ up_id = models.IntegerField(null = True, default = None) # id of layerbranch in the remote source
+
+ layer_version = models.ForeignKey(Layer_Version, related_name="dependencies")
+ depends_on = models.ForeignKey(Layer_Version, related_name="dependees")
+
+ class Meta:
+ unique_together = ("layer_source", "up_id")
+
+class ProjectLayer(models.Model):
+ project = models.ForeignKey(Project)
+ layercommit = models.ForeignKey(Layer_Version, null=True)
+ optional = models.BooleanField(default = True)
+
+ def __unicode__(self):
+ return "%s, %s" % (self.project.name, self.layercommit)
+
+ class Meta:
+ unique_together = (("project", "layercommit"),)
+
+class ProjectVariable(models.Model):
+ project = models.ForeignKey(Project)
+ name = models.CharField(max_length=100)
+ value = models.TextField(blank = True)
+
+class Variable(models.Model):
+ search_allowed_fields = ['variable_name', 'variable_value',
+ 'vhistory__file_name', "description"]
+ build = models.ForeignKey(Build, related_name='variable_build')
+ variable_name = models.CharField(max_length=100)
+ variable_value = models.TextField(blank=True)
+ changed = models.BooleanField(default=False)
+ human_readable_name = models.CharField(max_length=200)
+ description = models.TextField(blank=True)
+
+class VariableHistory(models.Model):
+ variable = models.ForeignKey(Variable, related_name='vhistory')
+ value = models.TextField(blank=True)
+ file_name = models.FilePathField(max_length=255)
+ line_number = models.IntegerField(null=True)
+ operation = models.CharField(max_length=64)
+
+class HelpText(models.Model):
+ VARIABLE = 0
+ HELPTEXT_AREA = ((VARIABLE, 'variable'), )
+
+ build = models.ForeignKey(Build, related_name='helptext_build')
+ area = models.IntegerField(choices=HELPTEXT_AREA)
+ key = models.CharField(max_length=100)
+ text = models.TextField()
+
+class LogMessage(models.Model):
+ EXCEPTION = -1 # used to signal self-toaster-exceptions
+ INFO = 0
+ WARNING = 1
+ ERROR = 2
+
+ LOG_LEVEL = ( (INFO, "info"),
+ (WARNING, "warn"),
+ (ERROR, "error"),
+ (EXCEPTION, "toaster exception"))
+
+ build = models.ForeignKey(Build)
+ task = models.ForeignKey(Task, blank = True, null=True)
+ level = models.IntegerField(choices=LOG_LEVEL, default=INFO)
+ message=models.CharField(max_length=240)
+ pathname = models.FilePathField(max_length=255, blank=True)
+ lineno = models.IntegerField(null=True)
+
+ def __str__(self):
+ return "%s %s %s" % (self.get_level_display(), self.message, self.build)
+
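+# Signal handler: clear the whole Django cache whenever any model instance is
+# saved or deleted, so cached pages do not serve stale build data.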
+def invalidate_cache(**kwargs):
+ from django.core.cache import cache
+ try:
+ cache.clear()
+ except Exception as e:
+ logger.warning("Problem with cache backend: Failed to clear cache: %s" % e)
+
+django.db.models.signals.post_save.connect(invalidate_cache)
+django.db.models.signals.post_delete.connect(invalidate_cache)
diff --git a/bitbake/lib/toaster/orm/tests.py b/bitbake/lib/toaster/orm/tests.py
new file mode 100644
index 0000000..783aea8
--- /dev/null
+++ b/bitbake/lib/toaster/orm/tests.py
@@ -0,0 +1,187 @@
+#! /usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2013-2015 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+"""Test cases for Toaster ORM."""
+
+from django.test import TestCase, TransactionTestCase
+from orm.models import LocalLayerSource, LayerIndexLayerSource, ImportedLayerSource, LayerSource
+from orm.models import Branch
+
+from orm.models import Project, Build, Layer, Layer_Version, Branch, ProjectLayer
+from orm.models import Release, ReleaseLayerSourcePriority, BitbakeVersion
+
+from django.utils import timezone
+from django.db import IntegrityError
+
+import os
+
+# set TTS_LAYER_INDEX to the base url to use a different instance of the layer index
+
+class LayerSourceVerifyInheritanceSaveLoad(TestCase):
+ """
+ Tests to verify inheritance for the LayerSource proxy-inheritance classes.
+ """
+ def test_object_creation(self):
+ """Test LayerSource object creation."""
+ for name, sourcetype in [("a1", LayerSource.TYPE_LOCAL),
+ ("a2", LayerSource.TYPE_LAYERINDEX),
+ ("a3", LayerSource.TYPE_IMPORTED)]:
+ LayerSource.objects.create(name=name, sourcetype=sourcetype)
+
+ objects = LayerSource.objects.all()
+ self.assertTrue(isinstance(objects[0], LocalLayerSource))
+ self.assertTrue(isinstance(objects[1], LayerIndexLayerSource))
+ self.assertTrue(isinstance(objects[2], ImportedLayerSource))
+
+ def test_duplicate_error(self):
+ """Test creation of duplicate LayerSource objects."""
+ stype = LayerSource.TYPE_LOCAL
+ LayerSource.objects.create(name="a1", sourcetype=stype)
+ with self.assertRaises(IntegrityError):
+ LayerSource.objects.create(name="a1", sourcetype=stype)
+
+
+class LILSUpdateTestCase(TransactionTestCase):
+ """Test Layer Source update."""
+
+ def setUp(self):
+ """Create release."""
+ bbv = BitbakeVersion.objects.create(\
+ name="master", giturl="git://git.openembedded.org/bitbake")
+ Release.objects.create(name="default-release", bitbake_version=bbv,
+ branch_name="master")
+
+ def test_update(self):
+ """Check if LayerSource.update can fetch branches."""
+ url = os.getenv("TTS_LAYER_INDEX",
+ default="http://layers.openembedded.org/")
+
+ lsobj = LayerSource.objects.create(\
+ name="b1", sourcetype=LayerSource.TYPE_LAYERINDEX,
+ apiurl=url + "layerindex/api/")
+ lsobj.update()
+ self.assertTrue(lsobj.branch_set.all().count() > 0,
+ "no branches fetched")
+
+class LayerVersionEquivalenceTestCase(TestCase):
+ """Verify Layer_Version priority selection."""
+
+ def setUp(self):
+ """Create required objects."""
+ # create layer source
+ self.lsrc = LayerSource.objects.create(name="dummy-layersource",
+ sourcetype=LayerSource.TYPE_LOCAL)
+ # create release
+ bbv = BitbakeVersion.objects.create(\
+ name="master", giturl="git://git.openembedded.org/bitbake")
+ self.release = Release.objects.create(name="default-release",
+ bitbake_version=bbv,
+ branch_name="master")
+ # attach layer source to release
+ ReleaseLayerSourcePriority.objects.create(\
+ release=self.release, layer_source=self.lsrc, priority=1)
+
+ # create a layer version for the layer on the specified branch
+ self.layer = Layer.objects.create(name="meta-testlayer",
+ layer_source=self.lsrc)
+ self.branch = Branch.objects.create(name="master", layer_source=self.lsrc)
+ self.lver = Layer_Version.objects.create(\
+ layer=self.layer, layer_source=self.lsrc, up_branch=self.branch)
+
+ # create project and project layer
+ self.project = Project.objects.create_project(name="test-project",
+ release=self.release)
+ ProjectLayer.objects.create(project=self.project,
+ layercommit=self.lver)
+
+ # create spoof layer that should not appear in the search results
+ layer = Layer.objects.create(name="meta-notvalid",
+ layer_source=self.lsrc)
+ self.lver2 = Layer_Version.objects.create(layer=layer,
+ layer_source=self.lsrc,
+ up_branch=self.branch)
+
+ def test_single_layersource(self):
+ """
+ When we have a single layer version,
+ get_equivalents_wpriority() should return a list with
+ just this layer_version.
+ """
+ equivqs = self.lver.get_equivalents_wpriority(self.project)
+ self.assertEqual(list(equivqs), [self.lver])
+
+ def test_dual_layersource(self):
+ """
+ If we have two layers with the same name, coming from different layer
+ sources, we expect both layer versions to be returned, ordered by
+ increasing layer source priority.
+ """
+ lsrc2 = LayerSource.objects.create(\
+ name="dummy-layersource2",
+ sourcetype=LayerSource.TYPE_LOCAL,
+ apiurl="test")
+
+ # assign a lower priority for the second layer source
+ self.release.releaselayersourcepriority_set.create(layer_source=lsrc2,
+ priority=2)
+
+ # create a new layer_version for a layer with the same name
+ # coming from the second layer source
+ layer2 = Layer.objects.create(name="meta-testlayer",
+ layer_source=lsrc2)
+ lver2 = Layer_Version.objects.create(layer=layer2, layer_source=lsrc2,
+ up_branch=self.branch)
+
+ # expect two layer versions, in the priority order
+ equivqs = self.lver.get_equivalents_wpriority(self.project)
+ self.assertEqual(list(equivqs), [lver2, self.lver])
+
+ def test_build_layerversion(self):
+ """
+ Any layer version coming from a build must appear in the
+ equivalence list of the original layer version.
+ """
+ build = Build.objects.create(project=self.project,
+ started_on=timezone.now(),
+ completed_on=timezone.now())
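+ # create a layer version attached to the build (no layer source)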
+ lvb = Layer_Version.objects.create(layer=self.layer, build=build,
+ commit="deadbeef")
+
+ # a build layerversion must be in the equivalence
+ # list for the original layerversion
+ equivqs = self.lver.get_equivalents_wpriority(self.project)
+ self.assertEqual(len(equivqs), 2)
+ self.assertEqual(equivqs[0], self.lver)
+ self.assertEqual(equivqs[1], lvb)
+
+ # getting the build layerversion equivalent list must
+ # return the same list as the original layer
+ bequivqs = lvb.get_equivalents_wpriority(self.project)
+
+ self.assertEqual(list(equivqs), list(bequivqs))
+
+ def test_compatible_layerversions(self):
+ """
+ When we have two layer versions, compatible_layerversions()
+ should return a queryset with both.
+ """
+ compat_lv = self.project.compatible_layerversions()
+ self.assertEqual(list(compat_lv), [self.lver, self.lver2])
+
diff --git a/bitbake/lib/toaster/toastergui/__init__.py b/bitbake/lib/toaster/toastergui/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/__init__.py
diff --git a/bitbake/lib/toaster/toastergui/static/css/bootstrap-responsive.min.css b/bitbake/lib/toaster/toastergui/static/css/bootstrap-responsive.min.css
new file mode 100755
index 0000000..0597860
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/bootstrap-responsive.min.css
@@ -0,0 +1,9 @@
+/*!
+ * Bootstrap Responsive v2.3.0
+ *
+ * Copyright 2012 Twitter, Inc
+ * Licensed under the Apache License v2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Designed and built with all the love in the world @twitter by @mdo and @fat.
+ */.clearfix{*zoom:1}.clearfix:before,.clearfix:after{display:table;line-height:0;content:""}.clearfix:after{clear:both}.hide-text{font:0/0 a;color:transparent;text-shadow:none;background-color:transparent;border:0}.input-block-level{display:block;width:100%;min-height:30px;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}@-ms-viewport{width:device-width}.hidden{display:none;visibility:hidden}.visible-phone{display:none!important}.visible-tablet{display:none!important}.hidden-desktop{display:none!important}.visible-desktop{display:inherit!important}@media(min-width:768px) and (max-width:979px){.hidden-desktop{display:inherit!important}.visible-desktop{display:none!important}.visible-tablet{display:inherit!important}.hidden-tablet{display:none!important}}@media(max-width:767px){.hidden-desktop{display:inherit!important}.visible-desktop{display:none!important}.visible-phone{display:inherit!important}.hidden-phone{display:none!important}}.visible-print{display:none!important}@media print{.visible-print{display:inherit!important}.hidden-print{display:none!important}}@media(min-width:1200px){.row{margin-left:-30px;*zoom:1}.row:before,.row:after{display:table;line-height:0;content:""}.row:after{clear:both}[class*="span"]{float:left;min-height:1px;margin-left:30px}.container,.navbar-static-top .container,.navbar-fixed-top .container,.navbar-fixed-bottom .container{width:1170px}.span12{width:1170px}.span11{width:1070px}.span10{width:970px}.span9{width:870px}.span8{width:770px}.span7{width:670px}.span6{width:570px}.span5{width:470px}.span4{width:370px}.span3{width:270px}.span2{width:170px}.span1{width:70px}.offset12{margin-left:1230px}.offset11{margin-left:1130px}.offset10{margin-left:1030px}.offset9{margin-left:930px}.offset8{margin-left:830px}.offset7{margin-left:730px}.offset6{margin-left:630px}.offset5{margin-left:530px}.offset4{margin-left:430px}.offset3{margin-left:330px}.offset2{margin-left:230px}.offset1{margin-left:130px}.row-fluid{width:100%;*zoom:1}.row-fluid:before,.row-fluid:after{display:table;line-height:0;content:""}.row-fluid:after{clear:both}.row-fluid [class*="span"]{display:block;float:left;width:100%;min-height:30px;margin-left:2.564102564102564%;*margin-left:2.5109110747408616%;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}.row-fluid [class*="span"]:first-child{margin-left:0}.row-fluid .controls-row [class*="span"]+[class*="span"]{margin-left:2.564102564102564%}.row-fluid .span12{width:100%;*width:99.94680851063829%}.row-fluid .span11{width:91.45299145299145%;*width:91.39979996362975%}.row-fluid .span10{width:82.90598290598291%;*width:82.8527914166212%}.row-fluid .span9{width:74.35897435897436%;*width:74.30578286961266%}.row-fluid .span8{width:65.81196581196582%;*width:65.75877432260411%}.row-fluid .span7{width:57.26495726495726%;*width:57.21176577559556%}.row-fluid .span6{width:48.717948717948715%;*width:48.664757228587014%}.row-fluid .span5{width:40.17094017094017%;*width:40.11774868157847%}.row-fluid .span4{width:31.623931623931625%;*width:31.570740134569924%}.row-fluid .span3{width:23.076923076923077%;*width:23.023731587561375%}.row-fluid .span2{width:14.52991452991453%;*width:14.476723040552828%}.row-fluid .span1{width:5.982905982905983%;*width:5.929714493544281%}.row-fluid .offset12{margin-left:105.12820512820512%;*margin-left:105.02182214948171%}.row-fluid .offset12:first-child{margin-left:102.56410256410257%;*margin-left:102.45771958537915%}.row-fluid 
.offset11{margin-left:96.58119658119658%;*margin-left:96.47481360247316%}.row-fluid .offset11:first-child{margin-left:94.01709401709402%;*margin-left:93.91071103837061%}.row-fluid .offset10{margin-left:88.03418803418803%;*margin-left:87.92780505546462%}.row-fluid .offset10:first-child{margin-left:85.47008547008548%;*margin-left:85.36370249136206%}.row-fluid .offset9{margin-left:79.48717948717949%;*margin-left:79.38079650845607%}.row-fluid .offset9:first-child{margin-left:76.92307692307693%;*margin-left:76.81669394435352%}.row-fluid .offset8{margin-left:70.94017094017094%;*margin-left:70.83378796144753%}.row-fluid .offset8:first-child{margin-left:68.37606837606839%;*margin-left:68.26968539734497%}.row-fluid .offset7{margin-left:62.393162393162385%;*margin-left:62.28677941443899%}.row-fluid .offset7:first-child{margin-left:59.82905982905982%;*margin-left:59.72267685033642%}.row-fluid .offset6{margin-left:53.84615384615384%;*margin-left:53.739770867430444%}.row-fluid .offset6:first-child{margin-left:51.28205128205128%;*margin-left:51.175668303327875%}.row-fluid .offset5{margin-left:45.299145299145295%;*margin-left:45.1927623204219%}.row-fluid .offset5:first-child{margin-left:42.73504273504273%;*margin-left:42.62865975631933%}.row-fluid .offset4{margin-left:36.75213675213675%;*margin-left:36.645753773413354%}.row-fluid .offset4:first-child{margin-left:34.18803418803419%;*margin-left:34.081651209310785%}.row-fluid .offset3{margin-left:28.205128205128204%;*margin-left:28.0987452264048%}.row-fluid .offset3:first-child{margin-left:25.641025641025642%;*margin-left:25.53464266230224%}.row-fluid .offset2{margin-left:19.65811965811966%;*margin-left:19.551736679396257%}.row-fluid .offset2:first-child{margin-left:17.094017094017094%;*margin-left:16.98763411529369%}.row-fluid .offset1{margin-left:11.11111111111111%;*margin-left:11.004728132387708%}.row-fluid .offset1:first-child{margin-left:8.547008547008547%;*margin-left:8.440625568285142%}input,textarea,.uneditable-input{margin-left:0}.controls-row [class*="span"]+[class*="span"]{margin-left:30px}input.span12,textarea.span12,.uneditable-input.span12{width:1156px}input.span11,textarea.span11,.uneditable-input.span11{width:1056px}input.span10,textarea.span10,.uneditable-input.span10{width:956px}input.span9,textarea.span9,.uneditable-input.span9{width:856px}input.span8,textarea.span8,.uneditable-input.span8{width:756px}input.span7,textarea.span7,.uneditable-input.span7{width:656px}input.span6,textarea.span6,.uneditable-input.span6{width:556px}input.span5,textarea.span5,.uneditable-input.span5{width:456px}input.span4,textarea.span4,.uneditable-input.span4{width:356px}input.span3,textarea.span3,.uneditable-input.span3{width:256px}input.span2,textarea.span2,.uneditable-input.span2{width:156px}input.span1,textarea.span1,.uneditable-input.span1{width:56px}.thumbnails{margin-left:-30px}.thumbnails>li{margin-left:30px}.row-fluid .thumbnails{margin-left:0}}@media(min-width:768px) and (max-width:979px){.row{margin-left:-20px;*zoom:1}.row:before,.row:after{display:table;line-height:0;content:""}.row:after{clear:both}[class*="span"]{float:left;min-height:1px;margin-left:20px}.container,.navbar-static-top .container,.navbar-fixed-top .container,.navbar-fixed-bottom 
.container{width:724px}.span12{width:724px}.span11{width:662px}.span10{width:600px}.span9{width:538px}.span8{width:476px}.span7{width:414px}.span6{width:352px}.span5{width:290px}.span4{width:228px}.span3{width:166px}.span2{width:104px}.span1{width:42px}.offset12{margin-left:764px}.offset11{margin-left:702px}.offset10{margin-left:640px}.offset9{margin-left:578px}.offset8{margin-left:516px}.offset7{margin-left:454px}.offset6{margin-left:392px}.offset5{margin-left:330px}.offset4{margin-left:268px}.offset3{margin-left:206px}.offset2{margin-left:144px}.offset1{margin-left:82px}.row-fluid{width:100%;*zoom:1}.row-fluid:before,.row-fluid:after{display:table;line-height:0;content:""}.row-fluid:after{clear:both}.row-fluid [class*="span"]{display:block;float:left;width:100%;min-height:30px;margin-left:2.7624309392265194%;*margin-left:2.709239449864817%;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}.row-fluid [class*="span"]:first-child{margin-left:0}.row-fluid .controls-row [class*="span"]+[class*="span"]{margin-left:2.7624309392265194%}.row-fluid .span12{width:100%;*width:99.94680851063829%}.row-fluid .span11{width:91.43646408839778%;*width:91.38327259903608%}.row-fluid .span10{width:82.87292817679558%;*width:82.81973668743387%}.row-fluid .span9{width:74.30939226519337%;*width:74.25620077583166%}.row-fluid .span8{width:65.74585635359117%;*width:65.69266486422946%}.row-fluid .span7{width:57.18232044198895%;*width:57.12912895262725%}.row-fluid .span6{width:48.61878453038674%;*width:48.56559304102504%}.row-fluid .span5{width:40.05524861878453%;*width:40.00205712942283%}.row-fluid .span4{width:31.491712707182323%;*width:31.43852121782062%}.row-fluid .span3{width:22.92817679558011%;*width:22.87498530621841%}.row-fluid .span2{width:14.3646408839779%;*width:14.311449394616199%}.row-fluid .span1{width:5.801104972375691%;*width:5.747913483013988%}.row-fluid .offset12{margin-left:105.52486187845304%;*margin-left:105.41847889972962%}.row-fluid .offset12:first-child{margin-left:102.76243093922652%;*margin-left:102.6560479605031%}.row-fluid .offset11{margin-left:96.96132596685082%;*margin-left:96.8549429881274%}.row-fluid .offset11:first-child{margin-left:94.1988950276243%;*margin-left:94.09251204890089%}.row-fluid .offset10{margin-left:88.39779005524862%;*margin-left:88.2914070765252%}.row-fluid .offset10:first-child{margin-left:85.6353591160221%;*margin-left:85.52897613729868%}.row-fluid .offset9{margin-left:79.8342541436464%;*margin-left:79.72787116492299%}.row-fluid .offset9:first-child{margin-left:77.07182320441989%;*margin-left:76.96544022569647%}.row-fluid .offset8{margin-left:71.2707182320442%;*margin-left:71.16433525332079%}.row-fluid .offset8:first-child{margin-left:68.50828729281768%;*margin-left:68.40190431409427%}.row-fluid .offset7{margin-left:62.70718232044199%;*margin-left:62.600799341718584%}.row-fluid .offset7:first-child{margin-left:59.94475138121547%;*margin-left:59.838368402492065%}.row-fluid .offset6{margin-left:54.14364640883978%;*margin-left:54.037263430116376%}.row-fluid .offset6:first-child{margin-left:51.38121546961326%;*margin-left:51.27483249088986%}.row-fluid .offset5{margin-left:45.58011049723757%;*margin-left:45.47372751851417%}.row-fluid .offset5:first-child{margin-left:42.81767955801105%;*margin-left:42.71129657928765%}.row-fluid .offset4{margin-left:37.01657458563536%;*margin-left:36.91019160691196%}.row-fluid .offset4:first-child{margin-left:34.25414364640884%;*margin-left:34.14776066768544%}.row-fluid 
.offset3{margin-left:28.45303867403315%;*margin-left:28.346655695309746%}.row-fluid .offset3:first-child{margin-left:25.69060773480663%;*margin-left:25.584224756083227%}.row-fluid .offset2{margin-left:19.88950276243094%;*margin-left:19.783119783707537%}.row-fluid .offset2:first-child{margin-left:17.12707182320442%;*margin-left:17.02068884448102%}.row-fluid .offset1{margin-left:11.32596685082873%;*margin-left:11.219583872105325%}.row-fluid .offset1:first-child{margin-left:8.56353591160221%;*margin-left:8.457152932878806%}input,textarea,.uneditable-input{margin-left:0}.controls-row [class*="span"]+[class*="span"]{margin-left:20px}input.span12,textarea.span12,.uneditable-input.span12{width:710px}input.span11,textarea.span11,.uneditable-input.span11{width:648px}input.span10,textarea.span10,.uneditable-input.span10{width:586px}input.span9,textarea.span9,.uneditable-input.span9{width:524px}input.span8,textarea.span8,.uneditable-input.span8{width:462px}input.span7,textarea.span7,.uneditable-input.span7{width:400px}input.span6,textarea.span6,.uneditable-input.span6{width:338px}input.span5,textarea.span5,.uneditable-input.span5{width:276px}input.span4,textarea.span4,.uneditable-input.span4{width:214px}input.span3,textarea.span3,.uneditable-input.span3{width:152px}input.span2,textarea.span2,.uneditable-input.span2{width:90px}input.span1,textarea.span1,.uneditable-input.span1{width:28px}}@media(max-width:767px){body{padding-right:20px;padding-left:20px}.navbar-fixed-top,.navbar-fixed-bottom,.navbar-static-top{margin-right:-20px;margin-left:-20px}.container-fluid{padding:0}.dl-horizontal dt{float:none;width:auto;clear:none;text-align:left}.dl-horizontal dd{margin-left:0}.container{width:auto}.row-fluid{width:100%}.row,.thumbnails{margin-left:0}.thumbnails>li{float:none;margin-left:0}[class*="span"],.uneditable-input[class*="span"],.row-fluid [class*="span"]{display:block;float:none;width:100%;margin-left:0;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}.span12,.row-fluid .span12{width:100%;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}.row-fluid [class*="offset"]:first-child{margin-left:0}.input-large,.input-xlarge,.input-xxlarge,input[class*="span"],select[class*="span"],textarea[class*="span"],.uneditable-input{display:block;width:100%;min-height:30px;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}.input-prepend input,.input-append input,.input-prepend input[class*="span"],.input-append input[class*="span"]{display:inline-block;width:auto}.controls-row [class*="span"]+[class*="span"]{margin-left:0}.modal{position:fixed;top:20px;right:20px;left:20px;width:auto;margin:0}.modal.fade{top:-100px}.modal.fade.in{top:20px}}@media(max-width:480px){.nav-collapse{-webkit-transform:translate3d(0,0,0)}.page-header h1 small{display:block;line-height:20px}input[type="checkbox"],input[type="radio"]{border:1px solid #ccc}.form-horizontal .control-label{float:none;width:auto;padding-top:0;text-align:left}.form-horizontal .controls{margin-left:0}.form-horizontal .control-list{padding-top:0}.form-horizontal .form-actions{padding-right:10px;padding-left:10px}.media .pull-left,.media .pull-right{display:block;float:none;margin-bottom:10px}.media-object{margin-right:0;margin-left:0}.modal{top:10px;right:10px;left:10px}.modal-header 
.close{padding:10px;margin:-10px}.carousel-caption{position:static}}@media(max-width:979px){body{padding-top:0}.navbar-fixed-top,.navbar-fixed-bottom{position:static}.navbar-fixed-top{margin-bottom:20px}.navbar-fixed-bottom{margin-top:20px}.navbar-fixed-top .navbar-inner,.navbar-fixed-bottom .navbar-inner{padding:5px}.navbar .container{width:auto;padding:0}.navbar .brand{padding-right:10px;padding-left:10px;margin:0 0 0 -5px}.nav-collapse{clear:both}.nav-collapse .nav{float:none;margin:0 0 10px}.nav-collapse .nav>li{float:none}.nav-collapse .nav>li>a{margin-bottom:2px}.nav-collapse .nav>.divider-vertical{display:none}.nav-collapse .nav .nav-header{color:#777;text-shadow:none}.nav-collapse .nav>li>a,.nav-collapse .dropdown-menu a{padding:9px 15px;font-weight:bold;color:#777;-webkit-border-radius:3px;-moz-border-radius:3px;border-radius:3px}.nav-collapse .btn{padding:4px 10px 4px;font-weight:normal;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px}.nav-collapse .dropdown-menu li+li a{margin-bottom:2px}.nav-collapse .nav>li>a:hover,.nav-collapse .nav>li>a:focus,.nav-collapse .dropdown-menu a:hover,.nav-collapse .dropdown-menu a:focus{background-color:#f2f2f2}.navbar-inverse .nav-collapse .nav>li>a,.navbar-inverse .nav-collapse .dropdown-menu a{color:#999}.navbar-inverse .nav-collapse .nav>li>a:hover,.navbar-inverse .nav-collapse .nav>li>a:focus,.navbar-inverse .nav-collapse .dropdown-menu a:hover,.navbar-inverse .nav-collapse .dropdown-menu a:focus{background-color:#111}.nav-collapse.in .btn-group{padding:0;margin-top:5px}.nav-collapse .dropdown-menu{position:static;top:auto;left:auto;display:none;float:none;max-width:none;padding:0;margin:0 15px;background-color:transparent;border:0;-webkit-border-radius:0;-moz-border-radius:0;border-radius:0;-webkit-box-shadow:none;-moz-box-shadow:none;box-shadow:none}.nav-collapse .open>.dropdown-menu{display:block}.nav-collapse .dropdown-menu:before,.nav-collapse .dropdown-menu:after{display:none}.nav-collapse .dropdown-menu .divider{display:none}.nav-collapse .nav>li>.dropdown-menu:before,.nav-collapse .nav>li>.dropdown-menu:after{display:none}.nav-collapse .navbar-form,.nav-collapse .navbar-search{float:none;padding:10px 15px;margin:10px 0;border-top:1px solid #f2f2f2;border-bottom:1px solid #f2f2f2;-webkit-box-shadow:inset 0 1px 0 rgba(255,255,255,0.1),0 1px 0 rgba(255,255,255,0.1);-moz-box-shadow:inset 0 1px 0 rgba(255,255,255,0.1),0 1px 0 rgba(255,255,255,0.1);box-shadow:inset 0 1px 0 rgba(255,255,255,0.1),0 1px 0 rgba(255,255,255,0.1)}.navbar-inverse .nav-collapse .navbar-form,.navbar-inverse .nav-collapse .navbar-search{border-top-color:#111;border-bottom-color:#111}.navbar .nav-collapse .nav.pull-right{float:none;margin-left:0}.nav-collapse,.nav-collapse.collapse{height:0;overflow:hidden}.navbar .btn-navbar{display:block}.navbar-static .navbar-inner{padding-right:10px;padding-left:10px}}@media(min-width:980px){.nav-collapse.collapse{height:auto!important;overflow:visible!important}}
diff --git a/bitbake/lib/toaster/toastergui/static/css/bootstrap.min.css b/bitbake/lib/toaster/toastergui/static/css/bootstrap.min.css
new file mode 100755
index 0000000..2b927f8
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/bootstrap.min.css
@@ -0,0 +1,9 @@
+/*!
+ * Bootstrap v2.3.0
+ *
+ * Copyright 2012 Twitter, Inc
+ * Licensed under the Apache License v2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Designed and built with all the love in the world @twitter by @mdo and @fat.
+ */.clearfix{*zoom:1}.clearfix:before,.clearfix:after{display:table;line-height:0;content:""}.clearfix:after{clear:both}.hide-text{font:0/0 a;color:transparent;text-shadow:none;background-color:transparent;border:0}.input-block-level{display:block;width:100%;min-height:30px;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}article,aside,details,figcaption,figure,footer,header,hgroup,nav,section{display:block}audio,canvas,video{display:inline-block;*display:inline;*zoom:1}audio:not([controls]){display:none}html{font-size:100%;-webkit-text-size-adjust:100%;-ms-text-size-adjust:100%}a:focus{outline:thin dotted #333;outline:5px auto -webkit-focus-ring-color;outline-offset:-2px}a:hover,a:active{outline:0}sub,sup{position:relative;font-size:75%;line-height:0;vertical-align:baseline}sup{top:-0.5em}sub{bottom:-0.25em}img{width:auto\9;height:auto;max-width:100%;vertical-align:middle;border:0;-ms-interpolation-mode:bicubic}#map_canvas img,.google-maps img{max-width:none}button,input,select,textarea{margin:0;font-size:100%;vertical-align:middle}button,input{*overflow:visible;line-height:normal}button::-moz-focus-inner,input::-moz-focus-inner{padding:0;border:0}button,html input[type="button"],input[type="reset"],input[type="submit"]{cursor:pointer;-webkit-appearance:button}label,select,button,input[type="button"],input[type="reset"],input[type="submit"],input[type="radio"],input[type="checkbox"]{cursor:pointer}input[type="search"]{-webkit-box-sizing:content-box;-moz-box-sizing:content-box;box-sizing:content-box;-webkit-appearance:textfield}input[type="search"]::-webkit-search-decoration,input[type="search"]::-webkit-search-cancel-button{-webkit-appearance:none}textarea{overflow:auto;vertical-align:top}@media print{*{color:#000!important;text-shadow:none!important;background:transparent!important;box-shadow:none!important}a,a:visited{text-decoration:underline}a[href]:after{content:" (" attr(href) ")"}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100%!important}@page{margin:.5cm}p,h2,h3{orphans:3;widows:3}h2,h3{page-break-after:avoid}}body{margin:0;font-family:"Helvetica Neue",Helvetica,Arial,sans-serif;font-size:14px;line-height:20px;color:#333;background-color:#fff}a{color:#08c;text-decoration:none}a:hover,a:focus{color:#005580;text-decoration:underline}.img-rounded{-webkit-border-radius:6px;-moz-border-radius:6px;border-radius:6px}.img-polaroid{padding:4px;background-color:#fff;border:1px solid #ccc;border:1px solid rgba(0,0,0,0.2);-webkit-box-shadow:0 1px 3px rgba(0,0,0,0.1);-moz-box-shadow:0 1px 3px rgba(0,0,0,0.1);box-shadow:0 1px 3px rgba(0,0,0,0.1)}.img-circle{-webkit-border-radius:500px;-moz-border-radius:500px;border-radius:500px}.row{margin-left:-20px;*zoom:1}.row:before,.row:after{display:table;line-height:0;content:""}.row:after{clear:both}[class*="span"]{float:left;min-height:1px;margin-left:20px}.container,.navbar-static-top .container,.navbar-fixed-top .container,.navbar-fixed-bottom 
.container{width:940px}.span12{width:940px}.span11{width:860px}.span10{width:780px}.span9{width:700px}.span8{width:620px}.span7{width:540px}.span6{width:460px}.span5{width:380px}.span4{width:300px}.span3{width:220px}.span2{width:140px}.span1{width:60px}.offset12{margin-left:980px}.offset11{margin-left:900px}.offset10{margin-left:820px}.offset9{margin-left:740px}.offset8{margin-left:660px}.offset7{margin-left:580px}.offset6{margin-left:500px}.offset5{margin-left:420px}.offset4{margin-left:340px}.offset3{margin-left:260px}.offset2{margin-left:180px}.offset1{margin-left:100px}.row-fluid{width:100%;*zoom:1}.row-fluid:before,.row-fluid:after{display:table;line-height:0;content:""}.row-fluid:after{clear:both}.row-fluid [class*="span"]{display:block;float:left;width:100%;min-height:30px;margin-left:2.127659574468085%;*margin-left:2.074468085106383%;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}.row-fluid [class*="span"]:first-child{margin-left:0}.row-fluid .controls-row [class*="span"]+[class*="span"]{margin-left:2.127659574468085%}.row-fluid .span12{width:100%;*width:99.94680851063829%}.row-fluid .span11{width:91.48936170212765%;*width:91.43617021276594%}.row-fluid .span10{width:82.97872340425532%;*width:82.92553191489361%}.row-fluid .span9{width:74.46808510638297%;*width:74.41489361702126%}.row-fluid .span8{width:65.95744680851064%;*width:65.90425531914893%}.row-fluid .span7{width:57.44680851063829%;*width:57.39361702127659%}.row-fluid .span6{width:48.93617021276595%;*width:48.88297872340425%}.row-fluid .span5{width:40.42553191489362%;*width:40.37234042553192%}.row-fluid .span4{width:31.914893617021278%;*width:31.861702127659576%}.row-fluid .span3{width:23.404255319148934%;*width:23.351063829787233%}.row-fluid .span2{width:14.893617021276595%;*width:14.840425531914894%}.row-fluid .span1{width:6.382978723404255%;*width:6.329787234042553%}.row-fluid .offset12{margin-left:104.25531914893617%;*margin-left:104.14893617021275%}.row-fluid .offset12:first-child{margin-left:102.12765957446808%;*margin-left:102.02127659574467%}.row-fluid .offset11{margin-left:95.74468085106382%;*margin-left:95.6382978723404%}.row-fluid .offset11:first-child{margin-left:93.61702127659574%;*margin-left:93.51063829787232%}.row-fluid .offset10{margin-left:87.23404255319149%;*margin-left:87.12765957446807%}.row-fluid .offset10:first-child{margin-left:85.1063829787234%;*margin-left:84.99999999999999%}.row-fluid .offset9{margin-left:78.72340425531914%;*margin-left:78.61702127659572%}.row-fluid .offset9:first-child{margin-left:76.59574468085106%;*margin-left:76.48936170212764%}.row-fluid .offset8{margin-left:70.2127659574468%;*margin-left:70.10638297872339%}.row-fluid .offset8:first-child{margin-left:68.08510638297872%;*margin-left:67.9787234042553%}.row-fluid .offset7{margin-left:61.70212765957446%;*margin-left:61.59574468085106%}.row-fluid .offset7:first-child{margin-left:59.574468085106375%;*margin-left:59.46808510638297%}.row-fluid .offset6{margin-left:53.191489361702125%;*margin-left:53.085106382978715%}.row-fluid .offset6:first-child{margin-left:51.063829787234035%;*margin-left:50.95744680851063%}.row-fluid .offset5{margin-left:44.68085106382979%;*margin-left:44.57446808510638%}.row-fluid .offset5:first-child{margin-left:42.5531914893617%;*margin-left:42.4468085106383%}.row-fluid .offset4{margin-left:36.170212765957444%;*margin-left:36.06382978723405%}.row-fluid .offset4:first-child{margin-left:34.04255319148936%;*margin-left:33.93617021276596%}.row-fluid 
.offset3{margin-left:27.659574468085104%;*margin-left:27.5531914893617%}.row-fluid .offset3:first-child{margin-left:25.53191489361702%;*margin-left:25.425531914893618%}.row-fluid .offset2{margin-left:19.148936170212764%;*margin-left:19.04255319148936%}.row-fluid .offset2:first-child{margin-left:17.02127659574468%;*margin-left:16.914893617021278%}.row-fluid .offset1{margin-left:10.638297872340425%;*margin-left:10.53191489361702%}.row-fluid .offset1:first-child{margin-left:8.51063829787234%;*margin-left:8.404255319148938%}[class*="span"].hide,.row-fluid [class*="span"].hide{display:none}[class*="span"].pull-right,.row-fluid [class*="span"].pull-right{float:right}.container{margin-right:auto;margin-left:auto;*zoom:1}.container:before,.container:after{display:table;line-height:0;content:""}.container:after{clear:both}.container-fluid{padding-right:20px;padding-left:20px;*zoom:1}.container-fluid:before,.container-fluid:after{display:table;line-height:0;content:""}.container-fluid:after{clear:both}p{margin:0 0 10px}.lead{margin-bottom:20px;font-size:21px;font-weight:200;line-height:30px}small{font-size:85%}strong{font-weight:bold}em{font-style:italic}cite{font-style:normal}.muted{color:#999}a.muted:hover,a.muted:focus{color:#808080}.text-warning{color:#c09853}a.text-warning:hover,a.text-warning:focus{color:#a47e3c}.text-error{color:#b94a48}a.text-error:hover,a.text-error:focus{color:#953b39}.text-info{color:#3a87ad}a.text-info:hover,a.text-info:focus{color:#2d6987}.text-success{color:#468847}a.text-success:hover,a.text-success:focus{color:#356635}.text-left{text-align:left}.text-right{text-align:right}.text-center{text-align:center}h1,h2,h3,h4,h5,h6{margin:10px 0;font-family:inherit;font-weight:bold;line-height:20px;color:inherit;text-rendering:optimizelegibility}h1 small,h2 small,h3 small,h4 small,h5 small,h6 small{font-weight:normal;line-height:1;color:#999}h1,h2,h3{line-height:40px}h1{font-size:38.5px}h2{font-size:31.5px}h3{font-size:24.5px}h4{font-size:17.5px}h5{font-size:14px}h6{font-size:11.9px}h1 small{font-size:24.5px}h2 small{font-size:17.5px}h3 small{font-size:14px}h4 small{font-size:14px}.page-header{padding-bottom:9px;margin:20px 0 30px;border-bottom:1px solid #eee}ul,ol{padding:0;margin:0 0 10px 25px}ul ul,ul ol,ol ol,ol ul{margin-bottom:0}li{line-height:20px}ul.unstyled,ol.unstyled{margin-left:0;list-style:none}ul.inline,ol.inline{margin-left:0;list-style:none}ul.inline>li,ol.inline>li{display:inline-block;*display:inline;padding-right:5px;padding-left:5px;*zoom:1}dl{margin-bottom:20px}dt,dd{line-height:20px}dt{font-weight:bold}dd{margin-left:10px}.dl-horizontal{*zoom:1}.dl-horizontal:before,.dl-horizontal:after{display:table;line-height:0;content:""}.dl-horizontal:after{clear:both}.dl-horizontal dt{float:left;width:200px;overflow:hidden;clear:left;text-align:right;text-overflow:ellipsis;white-space:nowrap}.dl-horizontal dd{margin-left:220px}hr{margin:20px 0;border:0;border-top:1px solid #eee;border-bottom:1px solid #fff}abbr[title],abbr[data-original-title]{cursor:help;border-bottom:1px dotted #999}abbr.initialism{font-size:90%;text-transform:uppercase}blockquote{padding:0 0 0 15px;margin:0 0 20px;border-left:5px solid #eee}blockquote p{margin-bottom:0;font-size:17.5px;font-weight:300;line-height:1.25}blockquote small{display:block;line-height:20px;color:#999}blockquote small:before{content:'\2014 \00A0'}blockquote.pull-right{float:right;padding-right:15px;padding-left:0;border-right:5px solid #eee;border-left:0}blockquote.pull-right p,blockquote.pull-right 
small{text-align:right}blockquote.pull-right small:before{content:''}blockquote.pull-right small:after{content:'\00A0 \2014'}q:before,q:after,blockquote:before,blockquote:after{content:""}address{display:block;margin-bottom:20px;font-style:normal;line-height:20px}code,pre{padding:0 3px 2px;font-family:Monaco,Menlo,Consolas,"Courier New",monospace;font-size:12px;color:#333;-webkit-border-radius:3px;-moz-border-radius:3px;border-radius:3px}code{padding:2px 4px;color:#d14;white-space:nowrap;background-color:#f7f7f9;border:1px solid #e1e1e8}pre{display:block;padding:9.5px;margin:0 0 10px;font-size:13px;line-height:20px;word-break:break-all;word-wrap:break-word;white-space:pre;white-space:pre-wrap;background-color:#f5f5f5;border:1px solid #ccc;border:1px solid rgba(0,0,0,0.15);-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px}pre.prettyprint{margin-bottom:20px}pre code{padding:0;color:inherit;white-space:pre;white-space:pre-wrap;background-color:transparent;border:0}.pre-scrollable{max-height:340px;overflow-y:scroll}form{margin:0 0 20px}fieldset{padding:0;margin:0;border:0}legend{display:block;width:100%;padding:0;margin-bottom:20px;font-size:21px;line-height:40px;color:#333;border:0;border-bottom:1px solid #e5e5e5}legend small{font-size:15px;color:#999}label,input,button,select,textarea{font-size:14px;font-weight:normal;line-height:20px}input,button,select,textarea{font-family:"Helvetica Neue",Helvetica,Arial,sans-serif}label{display:block;margin-bottom:5px}select,textarea,input[type="text"],input[type="password"],input[type="datetime"],input[type="datetime-local"],input[type="date"],input[type="month"],input[type="time"],input[type="week"],input[type="number"],input[type="email"],input[type="url"],input[type="search"],input[type="tel"],input[type="color"],.uneditable-input{display:inline-block;height:20px;padding:4px 6px;margin-bottom:10px;font-size:14px;line-height:20px;color:#555;vertical-align:middle;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px}input,textarea,.uneditable-input{width:206px}textarea{height:auto}textarea,input[type="text"],input[type="password"],input[type="datetime"],input[type="datetime-local"],input[type="date"],input[type="month"],input[type="time"],input[type="week"],input[type="number"],input[type="email"],input[type="url"],input[type="search"],input[type="tel"],input[type="color"],.uneditable-input{background-color:#fff;border:1px solid #ccc;-webkit-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075);-moz-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075);box-shadow:inset 0 1px 1px rgba(0,0,0,0.075);-webkit-transition:border linear .2s,box-shadow linear .2s;-moz-transition:border linear .2s,box-shadow linear .2s;-o-transition:border linear .2s,box-shadow linear .2s;transition:border linear .2s,box-shadow linear .2s}textarea:focus,input[type="text"]:focus,input[type="password"]:focus,input[type="datetime"]:focus,input[type="datetime-local"]:focus,input[type="date"]:focus,input[type="month"]:focus,input[type="time"]:focus,input[type="week"]:focus,input[type="number"]:focus,input[type="email"]:focus,input[type="url"]:focus,input[type="search"]:focus,input[type="tel"]:focus,input[type="color"]:focus,.uneditable-input:focus{border-color:rgba(82,168,236,0.8);outline:0;outline:thin dotted \9;-webkit-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 8px rgba(82,168,236,0.6);-moz-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 8px rgba(82,168,236,0.6);box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 8px 
rgba(82,168,236,0.6)}input[type="radio"],input[type="checkbox"]{margin:4px 0 0;margin-top:1px \9;*margin-top:0;line-height:normal}input[type="file"],input[type="image"],input[type="submit"],input[type="reset"],input[type="button"],input[type="radio"],input[type="checkbox"]{width:auto}select,input[type="file"]{height:30px;*margin-top:4px;line-height:30px}select{width:220px;background-color:#fff;border:1px solid #ccc}select[multiple],select[size]{height:auto}select:focus,input[type="file"]:focus,input[type="radio"]:focus,input[type="checkbox"]:focus{outline:thin dotted #333;outline:5px auto -webkit-focus-ring-color;outline-offset:-2px}.uneditable-input,.uneditable-textarea{color:#999;cursor:not-allowed;background-color:#fcfcfc;border-color:#ccc;-webkit-box-shadow:inset 0 1px 2px rgba(0,0,0,0.025);-moz-box-shadow:inset 0 1px 2px rgba(0,0,0,0.025);box-shadow:inset 0 1px 2px rgba(0,0,0,0.025)}.uneditable-input{overflow:hidden;white-space:nowrap}.uneditable-textarea{width:auto;height:auto}input:-moz-placeholder,textarea:-moz-placeholder{color:#999}input:-ms-input-placeholder,textarea:-ms-input-placeholder{color:#999}input::-webkit-input-placeholder,textarea::-webkit-input-placeholder{color:#999}.radio,.checkbox{min-height:20px;padding-left:20px}.radio input[type="radio"],.checkbox input[type="checkbox"]{float:left;margin-left:-20px}.controls>.radio:first-child,.controls>.checkbox:first-child{padding-top:5px}.radio.inline,.checkbox.inline{display:inline-block;padding-top:5px;margin-bottom:0;vertical-align:middle}.radio.inline+.radio.inline,.checkbox.inline+.checkbox.inline{margin-left:10px}.input-mini{width:60px}.input-small{width:90px}.input-medium{width:150px}.input-large{width:210px}.input-xlarge{width:270px}.input-xxlarge{width:530px}input[class*="span"],select[class*="span"],textarea[class*="span"],.uneditable-input[class*="span"],.row-fluid input[class*="span"],.row-fluid select[class*="span"],.row-fluid textarea[class*="span"],.row-fluid .uneditable-input[class*="span"]{float:none;margin-left:0}.input-append input[class*="span"],.input-append .uneditable-input[class*="span"],.input-prepend input[class*="span"],.input-prepend .uneditable-input[class*="span"],.row-fluid input[class*="span"],.row-fluid select[class*="span"],.row-fluid textarea[class*="span"],.row-fluid .uneditable-input[class*="span"],.row-fluid .input-prepend [class*="span"],.row-fluid .input-append [class*="span"]{display:inline-block}input,textarea,.uneditable-input{margin-left:0}.controls-row [class*="span"]+[class*="span"]{margin-left:20px}input.span12,textarea.span12,.uneditable-input.span12{width:926px}input.span11,textarea.span11,.uneditable-input.span11{width:846px}input.span10,textarea.span10,.uneditable-input.span10{width:766px}input.span9,textarea.span9,.uneditable-input.span9{width:686px}input.span8,textarea.span8,.uneditable-input.span8{width:606px}input.span7,textarea.span7,.uneditable-input.span7{width:526px}input.span6,textarea.span6,.uneditable-input.span6{width:446px}input.span5,textarea.span5,.uneditable-input.span5{width:366px}input.span4,textarea.span4,.uneditable-input.span4{width:286px}input.span3,textarea.span3,.uneditable-input.span3{width:206px}input.span2,textarea.span2,.uneditable-input.span2{width:126px}input.span1,textarea.span1,.uneditable-input.span1{width:46px}.controls-row{*zoom:1}.controls-row:before,.controls-row:after{display:table;line-height:0;content:""}.controls-row:after{clear:both}.controls-row [class*="span"],.row-fluid .controls-row [class*="span"]{float:left}.controls-row 
.checkbox[class*="span"],.controls-row .radio[class*="span"]{padding-top:5px}input[disabled],select[disabled],textarea[disabled],input[readonly],select[readonly],textarea[readonly]{cursor:not-allowed;background-color:#eee}input[type="radio"][disabled],input[type="checkbox"][disabled],input[type="radio"][readonly],input[type="checkbox"][readonly]{background-color:transparent}.control-group.warning .control-label,.control-group.warning .help-block,.control-group.warning .help-inline{color:#c09853}.control-group.warning .checkbox,.control-group.warning .radio,.control-group.warning input,.control-group.warning select,.control-group.warning textarea{color:#c09853}.control-group.warning input,.control-group.warning select,.control-group.warning textarea{border-color:#c09853;-webkit-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075);-moz-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075);box-shadow:inset 0 1px 1px rgba(0,0,0,0.075)}.control-group.warning input:focus,.control-group.warning select:focus,.control-group.warning textarea:focus{border-color:#a47e3c;-webkit-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 6px #dbc59e;-moz-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 6px #dbc59e;box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 6px #dbc59e}.control-group.warning .input-prepend .add-on,.control-group.warning .input-append .add-on{color:#c09853;background-color:#fcf8e3;border-color:#c09853}.control-group.error .control-label,.control-group.error .help-block,.control-group.error .help-inline{color:#b94a48}.control-group.error .checkbox,.control-group.error .radio,.control-group.error input,.control-group.error select,.control-group.error textarea{color:#b94a48}.control-group.error input,.control-group.error select,.control-group.error textarea{border-color:#b94a48;-webkit-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075);-moz-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075);box-shadow:inset 0 1px 1px rgba(0,0,0,0.075)}.control-group.error input:focus,.control-group.error select:focus,.control-group.error textarea:focus{border-color:#953b39;-webkit-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 6px #d59392;-moz-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 6px #d59392;box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 6px #d59392}.control-group.error .input-prepend .add-on,.control-group.error .input-append .add-on{color:#b94a48;background-color:#f2dede;border-color:#b94a48}.control-group.success .control-label,.control-group.success .help-block,.control-group.success .help-inline{color:#468847}.control-group.success .checkbox,.control-group.success .radio,.control-group.success input,.control-group.success select,.control-group.success textarea{color:#468847}.control-group.success input,.control-group.success select,.control-group.success textarea{border-color:#468847;-webkit-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075);-moz-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075);box-shadow:inset 0 1px 1px rgba(0,0,0,0.075)}.control-group.success input:focus,.control-group.success select:focus,.control-group.success textarea:focus{border-color:#356635;-webkit-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 6px #7aba7b;-moz-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 6px #7aba7b;box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 6px #7aba7b}.control-group.success .input-prepend .add-on,.control-group.success .input-append .add-on{color:#468847;background-color:#dff0d8;border-color:#468847}.control-group.info .control-label,.control-group.info .help-block,.control-group.info 
.help-inline{color:#3a87ad}.control-group.info .checkbox,.control-group.info .radio,.control-group.info input,.control-group.info select,.control-group.info textarea{color:#3a87ad}.control-group.info input,.control-group.info select,.control-group.info textarea{border-color:#3a87ad;-webkit-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075);-moz-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075);box-shadow:inset 0 1px 1px rgba(0,0,0,0.075)}.control-group.info input:focus,.control-group.info select:focus,.control-group.info textarea:focus{border-color:#2d6987;-webkit-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 6px #7ab5d3;-moz-box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 6px #7ab5d3;box-shadow:inset 0 1px 1px rgba(0,0,0,0.075),0 0 6px #7ab5d3}.control-group.info .input-prepend .add-on,.control-group.info .input-append .add-on{color:#3a87ad;background-color:#d9edf7;border-color:#3a87ad}input:focus:invalid,textarea:focus:invalid,select:focus:invalid{color:#b94a48;border-color:#ee5f5b}input:focus:invalid:focus,textarea:focus:invalid:focus,select:focus:invalid:focus{border-color:#e9322d;-webkit-box-shadow:0 0 6px #f8b9b7;-moz-box-shadow:0 0 6px #f8b9b7;box-shadow:0 0 6px #f8b9b7}.form-actions{padding:19px 20px 20px;margin-top:20px;margin-bottom:20px;background-color:#f5f5f5;border-top:1px solid #e5e5e5;*zoom:1}.form-actions:before,.form-actions:after{display:table;line-height:0;content:""}.form-actions:after{clear:both}.help-block,.help-inline{color:#595959}.help-block{display:block;margin-bottom:10px}.help-inline{display:inline-block;*display:inline;padding-left:5px;vertical-align:middle;*zoom:1}.input-append,.input-prepend{display:inline-block;margin-bottom:10px;font-size:0;white-space:nowrap;vertical-align:middle}.input-append input,.input-prepend input,.input-append select,.input-prepend select,.input-append .uneditable-input,.input-prepend .uneditable-input,.input-append .dropdown-menu,.input-prepend .dropdown-menu,.input-append .popover,.input-prepend .popover{font-size:14px}.input-append input,.input-prepend input,.input-append select,.input-prepend select,.input-append .uneditable-input,.input-prepend .uneditable-input{position:relative;margin-bottom:0;*margin-left:0;vertical-align:top;-webkit-border-radius:0 4px 4px 0;-moz-border-radius:0 4px 4px 0;border-radius:0 4px 4px 0}.input-append input:focus,.input-prepend input:focus,.input-append select:focus,.input-prepend select:focus,.input-append .uneditable-input:focus,.input-prepend .uneditable-input:focus{z-index:2}.input-append .add-on,.input-prepend .add-on{display:inline-block;width:auto;height:20px;min-width:16px;padding:4px 5px;font-size:14px;font-weight:normal;line-height:20px;text-align:center;text-shadow:0 1px 0 #fff;background-color:#eee;border:1px solid #ccc}.input-append .add-on,.input-prepend .add-on,.input-append .btn,.input-prepend .btn,.input-append .btn-group>.dropdown-toggle,.input-prepend .btn-group>.dropdown-toggle{vertical-align:top;-webkit-border-radius:0;-moz-border-radius:0;border-radius:0}.input-append .active,.input-prepend .active{background-color:#a9dba9;border-color:#46a546}.input-prepend .add-on,.input-prepend .btn{margin-right:-1px}.input-prepend .add-on:first-child,.input-prepend .btn:first-child{-webkit-border-radius:4px 0 0 4px;-moz-border-radius:4px 0 0 4px;border-radius:4px 0 0 4px}.input-append input,.input-append select,.input-append .uneditable-input{-webkit-border-radius:4px 0 0 4px;-moz-border-radius:4px 0 0 4px;border-radius:4px 0 0 4px}.input-append input+.btn-group .btn:last-child,.input-append 
select+.btn-group .btn:last-child,.input-append .uneditable-input+.btn-group .btn:last-child{-webkit-border-radius:0 4px 4px 0;-moz-border-radius:0 4px 4px 0;border-radius:0 4px 4px 0}.input-append .add-on,.input-append .btn,.input-append .btn-group{margin-left:-1px}.input-append .add-on:last-child,.input-append .btn:last-child,.input-append .btn-group:last-child>.dropdown-toggle{-webkit-border-radius:0 4px 4px 0;-moz-border-radius:0 4px 4px 0;border-radius:0 4px 4px 0}.input-prepend.input-append input,.input-prepend.input-append select,.input-prepend.input-append .uneditable-input{-webkit-border-radius:0;-moz-border-radius:0;border-radius:0}.input-prepend.input-append input+.btn-group .btn,.input-prepend.input-append select+.btn-group .btn,.input-prepend.input-append .uneditable-input+.btn-group .btn{-webkit-border-radius:0 4px 4px 0;-moz-border-radius:0 4px 4px 0;border-radius:0 4px 4px 0}.input-prepend.input-append .add-on:first-child,.input-prepend.input-append .btn:first-child{margin-right:-1px;-webkit-border-radius:4px 0 0 4px;-moz-border-radius:4px 0 0 4px;border-radius:4px 0 0 4px}.input-prepend.input-append .add-on:last-child,.input-prepend.input-append .btn:last-child{margin-left:-1px;-webkit-border-radius:0 4px 4px 0;-moz-border-radius:0 4px 4px 0;border-radius:0 4px 4px 0}.input-prepend.input-append .btn-group:first-child{margin-left:0}input.search-query{padding-right:14px;padding-right:4px \9;padding-left:14px;padding-left:4px \9;margin-bottom:0;-webkit-border-radius:15px;-moz-border-radius:15px;border-radius:15px}.form-search .input-append .search-query,.form-search .input-prepend .search-query{-webkit-border-radius:0;-moz-border-radius:0;border-radius:0}.form-search .input-append .search-query{-webkit-border-radius:14px 0 0 14px;-moz-border-radius:14px 0 0 14px;border-radius:14px 0 0 14px}.form-search .input-append .btn{-webkit-border-radius:0 14px 14px 0;-moz-border-radius:0 14px 14px 0;border-radius:0 14px 14px 0}.form-search .input-prepend .search-query{-webkit-border-radius:0 14px 14px 0;-moz-border-radius:0 14px 14px 0;border-radius:0 14px 14px 0}.form-search .input-prepend .btn{-webkit-border-radius:14px 0 0 14px;-moz-border-radius:14px 0 0 14px;border-radius:14px 0 0 14px}.form-search input,.form-inline input,.form-horizontal input,.form-search textarea,.form-inline textarea,.form-horizontal textarea,.form-search select,.form-inline select,.form-horizontal select,.form-search .help-inline,.form-inline .help-inline,.form-horizontal .help-inline,.form-search .uneditable-input,.form-inline .uneditable-input,.form-horizontal .uneditable-input,.form-search .input-prepend,.form-inline .input-prepend,.form-horizontal .input-prepend,.form-search .input-append,.form-inline .input-append,.form-horizontal .input-append{display:inline-block;*display:inline;margin-bottom:0;vertical-align:middle;*zoom:1}.form-search .hide,.form-inline .hide,.form-horizontal .hide{display:none}.form-search label,.form-inline label,.form-search .btn-group,.form-inline .btn-group{display:inline-block}.form-search .input-append,.form-inline .input-append,.form-search .input-prepend,.form-inline .input-prepend{margin-bottom:0}.form-search .radio,.form-search .checkbox,.form-inline .radio,.form-inline .checkbox{padding-left:0;margin-bottom:0;vertical-align:middle}.form-search .radio input[type="radio"],.form-search .checkbox input[type="checkbox"],.form-inline .radio input[type="radio"],.form-inline .checkbox 
input[type="checkbox"]{float:left;margin-right:3px;margin-left:0}.control-group{margin-bottom:10px}legend+.control-group{margin-top:20px;-webkit-margin-top-collapse:separate}.form-horizontal .control-group{margin-bottom:20px;*zoom:1}.form-horizontal .control-group:before,.form-horizontal .control-group:after{display:table;line-height:0;content:""}.form-horizontal .control-group:after{clear:both}.form-horizontal .control-label{float:left;width:160px;padding-top:5px;text-align:right}.form-horizontal .controls{*display:inline-block;*padding-left:20px;margin-left:180px;*margin-left:0}.form-horizontal .controls:first-child{*padding-left:180px}.form-horizontal .help-block{margin-bottom:0}.form-horizontal input+.help-block,.form-horizontal select+.help-block,.form-horizontal textarea+.help-block,.form-horizontal .uneditable-input+.help-block,.form-horizontal .input-prepend+.help-block,.form-horizontal .input-append+.help-block{margin-top:10px}.form-horizontal .form-actions{padding-left:180px}table{max-width:100%;background-color:transparent;border-collapse:collapse;border-spacing:0}.table{width:100%;margin-bottom:20px}.table th,.table td{padding:8px;line-height:20px;text-align:left;vertical-align:top;border-top:1px solid #ddd}.table th{font-weight:bold}.table thead th{vertical-align:bottom}.table caption+thead tr:first-child th,.table caption+thead tr:first-child td,.table colgroup+thead tr:first-child th,.table colgroup+thead tr:first-child td,.table thead:first-child tr:first-child th,.table thead:first-child tr:first-child td{border-top:0}.table tbody+tbody{border-top:2px solid #ddd}.table .table{background-color:#fff}.table-condensed th,.table-condensed td{padding:4px 5px}.table-bordered{border:1px solid #ddd;border-collapse:separate;*border-collapse:collapse;border-left:0;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px}.table-bordered th,.table-bordered td{border-left:1px solid #ddd}.table-bordered caption+thead tr:first-child th,.table-bordered caption+tbody tr:first-child th,.table-bordered caption+tbody tr:first-child td,.table-bordered colgroup+thead tr:first-child th,.table-bordered colgroup+tbody tr:first-child th,.table-bordered colgroup+tbody tr:first-child td,.table-bordered thead:first-child tr:first-child th,.table-bordered tbody:first-child tr:first-child th,.table-bordered tbody:first-child tr:first-child td{border-top:0}.table-bordered thead:first-child tr:first-child>th:first-child,.table-bordered tbody:first-child tr:first-child>td:first-child,.table-bordered tbody:first-child tr:first-child>th:first-child{-webkit-border-top-left-radius:4px;border-top-left-radius:4px;-moz-border-radius-topleft:4px}.table-bordered thead:first-child tr:first-child>th:last-child,.table-bordered tbody:first-child tr:first-child>td:last-child,.table-bordered tbody:first-child tr:first-child>th:last-child{-webkit-border-top-right-radius:4px;border-top-right-radius:4px;-moz-border-radius-topright:4px}.table-bordered thead:last-child tr:last-child>th:first-child,.table-bordered tbody:last-child tr:last-child>td:first-child,.table-bordered tbody:last-child tr:last-child>th:first-child,.table-bordered tfoot:last-child tr:last-child>td:first-child,.table-bordered tfoot:last-child tr:last-child>th:first-child{-webkit-border-bottom-left-radius:4px;border-bottom-left-radius:4px;-moz-border-radius-bottomleft:4px}.table-bordered thead:last-child tr:last-child>th:last-child,.table-bordered tbody:last-child tr:last-child>td:last-child,.table-bordered tbody:last-child 
tr:last-child>th:last-child,.table-bordered tfoot:last-child tr:last-child>td:last-child,.table-bordered tfoot:last-child tr:last-child>th:last-child{-webkit-border-bottom-right-radius:4px;border-bottom-right-radius:4px;-moz-border-radius-bottomright:4px}.table-bordered tfoot+tbody:last-child tr:last-child td:first-child{-webkit-border-bottom-left-radius:0;border-bottom-left-radius:0;-moz-border-radius-bottomleft:0}.table-bordered tfoot+tbody:last-child tr:last-child td:last-child{-webkit-border-bottom-right-radius:0;border-bottom-right-radius:0;-moz-border-radius-bottomright:0}.table-bordered caption+thead tr:first-child th:first-child,.table-bordered caption+tbody tr:first-child td:first-child,.table-bordered colgroup+thead tr:first-child th:first-child,.table-bordered colgroup+tbody tr:first-child td:first-child{-webkit-border-top-left-radius:4px;border-top-left-radius:4px;-moz-border-radius-topleft:4px}.table-bordered caption+thead tr:first-child th:last-child,.table-bordered caption+tbody tr:first-child td:last-child,.table-bordered colgroup+thead tr:first-child th:last-child,.table-bordered colgroup+tbody tr:first-child td:last-child{-webkit-border-top-right-radius:4px;border-top-right-radius:4px;-moz-border-radius-topright:4px}.table-striped tbody>tr:nth-child(odd)>td,.table-striped tbody>tr:nth-child(odd)>th{background-color:#f9f9f9}.table-hover tbody tr:hover>td,.table-hover tbody tr:hover>th{background-color:#f5f5f5}table td[class*="span"],table th[class*="span"],.row-fluid table td[class*="span"],.row-fluid table th[class*="span"]{display:table-cell;float:none;margin-left:0}.table td.span1,.table th.span1{float:none;width:44px;margin-left:0}.table td.span2,.table th.span2{float:none;width:124px;margin-left:0}.table td.span3,.table th.span3{float:none;width:204px;margin-left:0}.table td.span4,.table th.span4{float:none;width:284px;margin-left:0}.table td.span5,.table th.span5{float:none;width:364px;margin-left:0}.table td.span6,.table th.span6{float:none;width:444px;margin-left:0}.table td.span7,.table th.span7{float:none;width:524px;margin-left:0}.table td.span8,.table th.span8{float:none;width:604px;margin-left:0}.table td.span9,.table th.span9{float:none;width:684px;margin-left:0}.table td.span10,.table th.span10{float:none;width:764px;margin-left:0}.table td.span11,.table th.span11{float:none;width:844px;margin-left:0}.table td.span12,.table th.span12{float:none;width:924px;margin-left:0}.table tbody tr.success>td{background-color:#dff0d8}.table tbody tr.error>td{background-color:#f2dede}.table tbody tr.warning>td{background-color:#fcf8e3}.table tbody tr.info>td{background-color:#d9edf7}.table-hover tbody tr.success:hover>td{background-color:#d0e9c6}.table-hover tbody tr.error:hover>td{background-color:#ebcccc}.table-hover tbody tr.warning:hover>td{background-color:#faf2cc}.table-hover tbody tr.info:hover>td{background-color:#c4e3f3}[class^="icon-"],[class*=" icon-"]{display:inline-block;width:14px;height:14px;margin-top:1px;*margin-right:.3em;line-height:14px;vertical-align:text-top;background-image:url("../img/glyphicons-halflings.png");background-position:14px 14px;background-repeat:no-repeat}.icon-white,.nav-pills>.active>a>[class^="icon-"],.nav-pills>.active>a>[class*=" icon-"],.nav-list>.active>a>[class^="icon-"],.nav-list>.active>a>[class*=" icon-"],.navbar-inverse .nav>.active>a>[class^="icon-"],.navbar-inverse .nav>.active>a>[class*=" icon-"],.dropdown-menu>li>a:hover>[class^="icon-"],.dropdown-menu>li>a:focus>[class^="icon-"],.dropdown-menu>li>a:hover>[class*=" 
icon-"],.dropdown-menu>li>a:focus>[class*=" icon-"],.dropdown-menu>.active>a>[class^="icon-"],.dropdown-menu>.active>a>[class*=" icon-"],.dropdown-submenu:hover>a>[class^="icon-"],.dropdown-submenu:focus>a>[class^="icon-"],.dropdown-submenu:hover>a>[class*=" icon-"],.dropdown-submenu:focus>a>[class*=" icon-"]{background-image:url("../img/glyphicons-halflings-white.png")}.icon-glass{background-position:0 0}.icon-music{background-position:-24px 0}.icon-search{background-position:-48px 0}.icon-envelope{background-position:-72px 0}.icon-heart{background-position:-96px 0}.icon-star{background-position:-120px 0}.icon-star-empty{background-position:-144px 0}.icon-user{background-position:-168px 0}.icon-film{background-position:-192px 0}.icon-th-large{background-position:-216px 0}.icon-th{background-position:-240px 0}.icon-th-list{background-position:-264px 0}.icon-ok{background-position:-288px 0}.icon-remove{background-position:-312px 0}.icon-zoom-in{background-position:-336px 0}.icon-zoom-out{background-position:-360px 0}.icon-off{background-position:-384px 0}.icon-signal{background-position:-408px 0}.icon-cog{background-position:-432px 0}.icon-trash{background-position:-456px 0}.icon-home{background-position:0 -24px}.icon-file{background-position:-24px -24px}.icon-time{background-position:-48px -24px}.icon-road{background-position:-72px -24px}.icon-download-alt{background-position:-96px -24px}.icon-download{background-position:-120px -24px}.icon-upload{background-position:-144px -24px}.icon-inbox{background-position:-168px -24px}.icon-play-circle{background-position:-192px -24px}.icon-repeat{background-position:-216px -24px}.icon-refresh{background-position:-240px -24px}.icon-list-alt{background-position:-264px -24px}.icon-lock{background-position:-287px -24px}.icon-flag{background-position:-312px -24px}.icon-headphones{background-position:-336px -24px}.icon-volume-off{background-position:-360px -24px}.icon-volume-down{background-position:-384px -24px}.icon-volume-up{background-position:-408px -24px}.icon-qrcode{background-position:-432px -24px}.icon-barcode{background-position:-456px -24px}.icon-tag{background-position:0 -48px}.icon-tags{background-position:-25px -48px}.icon-book{background-position:-48px -48px}.icon-bookmark{background-position:-72px -48px}.icon-print{background-position:-96px -48px}.icon-camera{background-position:-120px -48px}.icon-font{background-position:-144px -48px}.icon-bold{background-position:-167px -48px}.icon-italic{background-position:-192px -48px}.icon-text-height{background-position:-216px -48px}.icon-text-width{background-position:-240px -48px}.icon-align-left{background-position:-264px -48px}.icon-align-center{background-position:-288px -48px}.icon-align-right{background-position:-312px -48px}.icon-align-justify{background-position:-336px -48px}.icon-list{background-position:-360px -48px}.icon-indent-left{background-position:-384px -48px}.icon-indent-right{background-position:-408px -48px}.icon-facetime-video{background-position:-432px -48px}.icon-picture{background-position:-456px -48px}.icon-pencil{background-position:0 -72px}.icon-map-marker{background-position:-24px -72px}.icon-adjust{background-position:-48px -72px}.icon-tint{background-position:-72px -72px}.icon-edit{background-position:-96px -72px}.icon-share{background-position:-120px -72px}.icon-check{background-position:-144px -72px}.icon-move{background-position:-168px -72px}.icon-step-backward{background-position:-192px -72px}.icon-fast-backward{background-position:-216px 
-72px}.icon-backward{background-position:-240px -72px}.icon-play{background-position:-264px -72px}.icon-pause{background-position:-288px -72px}.icon-stop{background-position:-312px -72px}.icon-forward{background-position:-336px -72px}.icon-fast-forward{background-position:-360px -72px}.icon-step-forward{background-position:-384px -72px}.icon-eject{background-position:-408px -72px}.icon-chevron-left{background-position:-432px -72px}.icon-chevron-right{background-position:-456px -72px}.icon-plus-sign{background-position:0 -96px}.icon-minus-sign{background-position:-24px -96px}.icon-remove-sign{background-position:-48px -96px}.icon-ok-sign{background-position:-72px -96px}.icon-question-sign{background-position:-96px -96px}.icon-info-sign{background-position:-120px -96px}.icon-screenshot{background-position:-144px -96px}.icon-remove-circle{background-position:-168px -96px}.icon-ok-circle{background-position:-192px -96px}.icon-ban-circle{background-position:-216px -96px}.icon-arrow-left{background-position:-240px -96px}.icon-arrow-right{background-position:-264px -96px}.icon-arrow-up{background-position:-289px -96px}.icon-arrow-down{background-position:-312px -96px}.icon-share-alt{background-position:-336px -96px}.icon-resize-full{background-position:-360px -96px}.icon-resize-small{background-position:-384px -96px}.icon-plus{background-position:-408px -96px}.icon-minus{background-position:-433px -96px}.icon-asterisk{background-position:-456px -96px}.icon-exclamation-sign{background-position:0 -120px}.icon-gift{background-position:-24px -120px}.icon-leaf{background-position:-48px -120px}.icon-fire{background-position:-72px -120px}.icon-eye-open{background-position:-96px -120px}.icon-eye-close{background-position:-120px -120px}.icon-warning-sign{background-position:-144px -120px}.icon-plane{background-position:-168px -120px}.icon-calendar{background-position:-192px -120px}.icon-random{width:16px;background-position:-216px -120px}.icon-comment{background-position:-240px -120px}.icon-magnet{background-position:-264px -120px}.icon-chevron-up{background-position:-288px -120px}.icon-chevron-down{background-position:-313px -119px}.icon-retweet{background-position:-336px -120px}.icon-shopping-cart{background-position:-360px -120px}.icon-folder-close{width:16px;background-position:-384px -120px}.icon-folder-open{width:16px;background-position:-408px -120px}.icon-resize-vertical{background-position:-432px -119px}.icon-resize-horizontal{background-position:-456px -118px}.icon-hdd{background-position:0 -144px}.icon-bullhorn{background-position:-24px -144px}.icon-bell{background-position:-48px -144px}.icon-certificate{background-position:-72px -144px}.icon-thumbs-up{background-position:-96px -144px}.icon-thumbs-down{background-position:-120px -144px}.icon-hand-right{background-position:-144px -144px}.icon-hand-left{background-position:-168px -144px}.icon-hand-up{background-position:-192px -144px}.icon-hand-down{background-position:-216px -144px}.icon-circle-arrow-right{background-position:-240px -144px}.icon-circle-arrow-left{background-position:-264px -144px}.icon-circle-arrow-up{background-position:-288px -144px}.icon-circle-arrow-down{background-position:-312px -144px}.icon-globe{background-position:-336px -144px}.icon-wrench{background-position:-360px -144px}.icon-tasks{background-position:-384px -144px}.icon-filter{background-position:-408px -144px}.icon-briefcase{background-position:-432px -144px}.icon-fullscreen{background-position:-456px 
-144px}.dropup,.dropdown{position:relative}.dropdown-toggle{*margin-bottom:-3px}.dropdown-toggle:active,.open .dropdown-toggle{outline:0}.caret{display:inline-block;width:0;height:0;vertical-align:top;border-top:4px solid #000;border-right:4px solid transparent;border-left:4px solid transparent;content:""}.dropdown .caret{margin-top:8px;margin-left:2px}.dropdown-menu{position:absolute;top:100%;left:0;z-index:1000;display:none;float:left;min-width:160px;padding:5px 0;margin:2px 0 0;list-style:none;background-color:#fff;border:1px solid #ccc;border:1px solid rgba(0,0,0,0.2);*border-right-width:2px;*border-bottom-width:2px;-webkit-border-radius:6px;-moz-border-radius:6px;border-radius:6px;-webkit-box-shadow:0 5px 10px rgba(0,0,0,0.2);-moz-box-shadow:0 5px 10px rgba(0,0,0,0.2);box-shadow:0 5px 10px rgba(0,0,0,0.2);-webkit-background-clip:padding-box;-moz-background-clip:padding;background-clip:padding-box}.dropdown-menu.pull-right{right:0;left:auto}.dropdown-menu .divider{*width:100%;height:1px;margin:9px 1px;*margin:-5px 0 5px;overflow:hidden;background-color:#e5e5e5;border-bottom:1px solid #fff}.dropdown-menu>li>a{display:block;padding:3px 20px;clear:both;font-weight:normal;line-height:20px;color:#333;white-space:nowrap}.dropdown-menu>li>a:hover,.dropdown-menu>li>a:focus,.dropdown-submenu:hover>a,.dropdown-submenu:focus>a{color:#fff;text-decoration:none;background-color:#0081c2;background-image:-moz-linear-gradient(top,#08c,#0077b3);background-image:-webkit-gradient(linear,0 0,0 100%,from(#08c),to(#0077b3));background-image:-webkit-linear-gradient(top,#08c,#0077b3);background-image:-o-linear-gradient(top,#08c,#0077b3);background-image:linear-gradient(to bottom,#08c,#0077b3);background-repeat:repeat-x;filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ff0088cc',endColorstr='#ff0077b3',GradientType=0)}.dropdown-menu>.active>a,.dropdown-menu>.active>a:hover,.dropdown-menu>.active>a:focus{color:#fff;text-decoration:none;background-color:#0081c2;background-image:-moz-linear-gradient(top,#08c,#0077b3);background-image:-webkit-gradient(linear,0 0,0 100%,from(#08c),to(#0077b3));background-image:-webkit-linear-gradient(top,#08c,#0077b3);background-image:-o-linear-gradient(top,#08c,#0077b3);background-image:linear-gradient(to bottom,#08c,#0077b3);background-repeat:repeat-x;outline:0;filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ff0088cc',endColorstr='#ff0077b3',GradientType=0)}.dropdown-menu>.disabled>a,.dropdown-menu>.disabled>a:hover,.dropdown-menu>.disabled>a:focus{color:#999}.dropdown-menu>.disabled>a:hover,.dropdown-menu>.disabled>a:focus{text-decoration:none;cursor:default;background-color:transparent;background-image:none;filter:progid:DXImageTransform.Microsoft.gradient(enabled=false)}.open{*z-index:1000}.open>.dropdown-menu{display:block}.pull-right>.dropdown-menu{right:0;left:auto}.dropup .caret,.navbar-fixed-bottom .dropdown .caret{border-top:0;border-bottom:4px solid #000;content:""}.dropup .dropdown-menu,.navbar-fixed-bottom .dropdown .dropdown-menu{top:auto;bottom:100%;margin-bottom:1px}.dropdown-submenu{position:relative}.dropdown-submenu>.dropdown-menu{top:0;left:100%;margin-top:-6px;margin-left:-1px;-webkit-border-radius:0 6px 6px 6px;-moz-border-radius:0 6px 6px 6px;border-radius:0 6px 6px 6px}.dropdown-submenu:hover>.dropdown-menu{display:block}.dropup .dropdown-submenu>.dropdown-menu{top:auto;bottom:0;margin-top:0;margin-bottom:-2px;-webkit-border-radius:5px 5px 5px 0;-moz-border-radius:5px 5px 5px 0;border-radius:5px 5px 5px 
0}.dropdown-submenu>a:after{display:block;float:right;width:0;height:0;margin-top:5px;margin-right:-10px;border-color:transparent;border-left-color:#ccc;border-style:solid;border-width:5px 0 5px 5px;content:" "}.dropdown-submenu:hover>a:after{border-left-color:#fff}.dropdown-submenu.pull-left{float:none}.dropdown-submenu.pull-left>.dropdown-menu{left:-100%;margin-left:10px;-webkit-border-radius:6px 0 6px 6px;-moz-border-radius:6px 0 6px 6px;border-radius:6px 0 6px 6px}.dropdown .dropdown-menu .nav-header{padding-right:20px;padding-left:20px}.typeahead{z-index:1051;margin-top:2px;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px}.well{min-height:20px;padding:19px;margin-bottom:20px;background-color:#f5f5f5;border:1px solid #e3e3e3;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px;-webkit-box-shadow:inset 0 1px 1px rgba(0,0,0,0.05);-moz-box-shadow:inset 0 1px 1px rgba(0,0,0,0.05);box-shadow:inset 0 1px 1px rgba(0,0,0,0.05)}.well blockquote{border-color:#ddd;border-color:rgba(0,0,0,0.15)}.well-large{padding:24px;-webkit-border-radius:6px;-moz-border-radius:6px;border-radius:6px}.well-small{padding:9px;-webkit-border-radius:3px;-moz-border-radius:3px;border-radius:3px}.fade{opacity:0;-webkit-transition:opacity .15s linear;-moz-transition:opacity .15s linear;-o-transition:opacity .15s linear;transition:opacity .15s linear}.fade.in{opacity:1}.collapse{position:relative;height:0;overflow:hidden;-webkit-transition:height .35s ease;-moz-transition:height .35s ease;-o-transition:height .35s ease;transition:height .35s ease}.collapse.in{height:auto}.close{float:right;font-size:20px;font-weight:bold;line-height:20px;color:#000;text-shadow:0 1px 0 #fff;opacity:.2;filter:alpha(opacity=20)}.close:hover,.close:focus{color:#000;text-decoration:none;cursor:pointer;opacity:.4;filter:alpha(opacity=40)}button.close{padding:0;cursor:pointer;background:transparent;border:0;-webkit-appearance:none}.btn{display:inline-block;*display:inline;padding:4px 12px;margin-bottom:0;*margin-left:.3em;font-size:14px;line-height:20px;color:#333;text-align:center;text-shadow:0 1px 1px rgba(255,255,255,0.75);vertical-align:middle;cursor:pointer;background-color:#f5f5f5;*background-color:#e6e6e6;background-image:-moz-linear-gradient(top,#fff,#e6e6e6);background-image:-webkit-gradient(linear,0 0,0 100%,from(#fff),to(#e6e6e6));background-image:-webkit-linear-gradient(top,#fff,#e6e6e6);background-image:-o-linear-gradient(top,#fff,#e6e6e6);background-image:linear-gradient(to bottom,#fff,#e6e6e6);background-repeat:repeat-x;border:1px solid #ccc;*border:0;border-color:#e6e6e6 #e6e6e6 #bfbfbf;border-color:rgba(0,0,0,0.1) rgba(0,0,0,0.1) rgba(0,0,0,0.25);border-bottom-color:#b3b3b3;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px;filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ffffffff',endColorstr='#ffe6e6e6',GradientType=0);filter:progid:DXImageTransform.Microsoft.gradient(enabled=false);*zoom:1;-webkit-box-shadow:inset 0 1px 0 rgba(255,255,255,0.2),0 1px 2px rgba(0,0,0,0.05);-moz-box-shadow:inset 0 1px 0 rgba(255,255,255,0.2),0 1px 2px rgba(0,0,0,0.05);box-shadow:inset 0 1px 0 rgba(255,255,255,0.2),0 1px 2px rgba(0,0,0,0.05)}.btn:hover,.btn:focus,.btn:active,.btn.active,.btn.disabled,.btn[disabled]{color:#333;background-color:#e6e6e6;*background-color:#d9d9d9}.btn:active,.btn.active{background-color:#ccc \9}.btn:first-child{*margin-left:0}.btn:hover,.btn:focus{color:#333;text-decoration:none;background-position:0 -15px;-webkit-transition:background-position .1s 
linear;-moz-transition:background-position .1s linear;-o-transition:background-position .1s linear;transition:background-position .1s linear}.btn:focus{outline:thin dotted #333;outline:5px auto -webkit-focus-ring-color;outline-offset:-2px}.btn.active,.btn:active{background-image:none;outline:0;-webkit-box-shadow:inset 0 2px 4px rgba(0,0,0,0.15),0 1px 2px rgba(0,0,0,0.05);-moz-box-shadow:inset 0 2px 4px rgba(0,0,0,0.15),0 1px 2px rgba(0,0,0,0.05);box-shadow:inset 0 2px 4px rgba(0,0,0,0.15),0 1px 2px rgba(0,0,0,0.05)}.btn.disabled,.btn[disabled]{cursor:default;background-image:none;opacity:.65;filter:alpha(opacity=65);-webkit-box-shadow:none;-moz-box-shadow:none;box-shadow:none}.btn-large{padding:11px 19px;font-size:17.5px;-webkit-border-radius:6px;-moz-border-radius:6px;border-radius:6px}.btn-large [class^="icon-"],.btn-large [class*=" icon-"]{margin-top:4px}.btn-small{padding:2px 10px;font-size:11.9px;-webkit-border-radius:3px;-moz-border-radius:3px;border-radius:3px}.btn-small [class^="icon-"],.btn-small [class*=" icon-"]{margin-top:0}.btn-mini [class^="icon-"],.btn-mini [class*=" icon-"]{margin-top:-1px}.btn-mini{padding:0 6px;font-size:10.5px;-webkit-border-radius:3px;-moz-border-radius:3px;border-radius:3px}.btn-block{display:block;width:100%;padding-right:0;padding-left:0;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}.btn-block+.btn-block{margin-top:5px}input[type="submit"].btn-block,input[type="reset"].btn-block,input[type="button"].btn-block{width:100%}.btn-primary.active,.btn-warning.active,.btn-danger.active,.btn-success.active,.btn-info.active,.btn-inverse.active{color:rgba(255,255,255,0.75)}.btn-primary{color:#fff;text-shadow:0 -1px 0 rgba(0,0,0,0.25);background-color:#006dcc;*background-color:#04c;background-image:-moz-linear-gradient(top,#08c,#04c);background-image:-webkit-gradient(linear,0 0,0 100%,from(#08c),to(#04c));background-image:-webkit-linear-gradient(top,#08c,#04c);background-image:-o-linear-gradient(top,#08c,#04c);background-image:linear-gradient(to bottom,#08c,#04c);background-repeat:repeat-x;border-color:#04c #04c #002a80;border-color:rgba(0,0,0,0.1) rgba(0,0,0,0.1) rgba(0,0,0,0.25);filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ff0088cc',endColorstr='#ff0044cc',GradientType=0);filter:progid:DXImageTransform.Microsoft.gradient(enabled=false)}.btn-primary:hover,.btn-primary:focus,.btn-primary:active,.btn-primary.active,.btn-primary.disabled,.btn-primary[disabled]{color:#fff;background-color:#04c;*background-color:#003bb3}.btn-primary:active,.btn-primary.active{background-color:#039 \9}.btn-warning{color:#fff;text-shadow:0 -1px 0 rgba(0,0,0,0.25);background-color:#faa732;*background-color:#f89406;background-image:-moz-linear-gradient(top,#fbb450,#f89406);background-image:-webkit-gradient(linear,0 0,0 100%,from(#fbb450),to(#f89406));background-image:-webkit-linear-gradient(top,#fbb450,#f89406);background-image:-o-linear-gradient(top,#fbb450,#f89406);background-image:linear-gradient(to bottom,#fbb450,#f89406);background-repeat:repeat-x;border-color:#f89406 #f89406 #ad6704;border-color:rgba(0,0,0,0.1) rgba(0,0,0,0.1) 
rgba(0,0,0,0.25);filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#fffbb450',endColorstr='#fff89406',GradientType=0);filter:progid:DXImageTransform.Microsoft.gradient(enabled=false)}.btn-warning:hover,.btn-warning:focus,.btn-warning:active,.btn-warning.active,.btn-warning.disabled,.btn-warning[disabled]{color:#fff;background-color:#f89406;*background-color:#df8505}.btn-warning:active,.btn-warning.active{background-color:#c67605 \9}.btn-danger{color:#fff;text-shadow:0 -1px 0 rgba(0,0,0,0.25);background-color:#da4f49;*background-color:#bd362f;background-image:-moz-linear-gradient(top,#ee5f5b,#bd362f);background-image:-webkit-gradient(linear,0 0,0 100%,from(#ee5f5b),to(#bd362f));background-image:-webkit-linear-gradient(top,#ee5f5b,#bd362f);background-image:-o-linear-gradient(top,#ee5f5b,#bd362f);background-image:linear-gradient(to bottom,#ee5f5b,#bd362f);background-repeat:repeat-x;border-color:#bd362f #bd362f #802420;border-color:rgba(0,0,0,0.1) rgba(0,0,0,0.1) rgba(0,0,0,0.25);filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ffee5f5b',endColorstr='#ffbd362f',GradientType=0);filter:progid:DXImageTransform.Microsoft.gradient(enabled=false)}.btn-danger:hover,.btn-danger:focus,.btn-danger:active,.btn-danger.active,.btn-danger.disabled,.btn-danger[disabled]{color:#fff;background-color:#bd362f;*background-color:#a9302a}.btn-danger:active,.btn-danger.active{background-color:#942a25 \9}.btn-success{color:#fff;text-shadow:0 -1px 0 rgba(0,0,0,0.25);background-color:#5bb75b;*background-color:#51a351;background-image:-moz-linear-gradient(top,#62c462,#51a351);background-image:-webkit-gradient(linear,0 0,0 100%,from(#62c462),to(#51a351));background-image:-webkit-linear-gradient(top,#62c462,#51a351);background-image:-o-linear-gradient(top,#62c462,#51a351);background-image:linear-gradient(to bottom,#62c462,#51a351);background-repeat:repeat-x;border-color:#51a351 #51a351 #387038;border-color:rgba(0,0,0,0.1) rgba(0,0,0,0.1) rgba(0,0,0,0.25);filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ff62c462',endColorstr='#ff51a351',GradientType=0);filter:progid:DXImageTransform.Microsoft.gradient(enabled=false)}.btn-success:hover,.btn-success:focus,.btn-success:active,.btn-success.active,.btn-success.disabled,.btn-success[disabled]{color:#fff;background-color:#51a351;*background-color:#499249}.btn-success:active,.btn-success.active{background-color:#408140 \9}.btn-info{color:#fff;text-shadow:0 -1px 0 rgba(0,0,0,0.25);background-color:#49afcd;*background-color:#2f96b4;background-image:-moz-linear-gradient(top,#5bc0de,#2f96b4);background-image:-webkit-gradient(linear,0 0,0 100%,from(#5bc0de),to(#2f96b4));background-image:-webkit-linear-gradient(top,#5bc0de,#2f96b4);background-image:-o-linear-gradient(top,#5bc0de,#2f96b4);background-image:linear-gradient(to bottom,#5bc0de,#2f96b4);background-repeat:repeat-x;border-color:#2f96b4 #2f96b4 #1f6377;border-color:rgba(0,0,0,0.1) rgba(0,0,0,0.1) rgba(0,0,0,0.25);filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ff5bc0de',endColorstr='#ff2f96b4',GradientType=0);filter:progid:DXImageTransform.Microsoft.gradient(enabled=false)}.btn-info:hover,.btn-info:focus,.btn-info:active,.btn-info.active,.btn-info.disabled,.btn-info[disabled]{color:#fff;background-color:#2f96b4;*background-color:#2a85a0}.btn-info:active,.btn-info.active{background-color:#24748c \9}.btn-inverse{color:#fff;text-shadow:0 -1px 0 
rgba(0,0,0,0.25);background-color:#363636;*background-color:#222;background-image:-moz-linear-gradient(top,#444,#222);background-image:-webkit-gradient(linear,0 0,0 100%,from(#444),to(#222));background-image:-webkit-linear-gradient(top,#444,#222);background-image:-o-linear-gradient(top,#444,#222);background-image:linear-gradient(to bottom,#444,#222);background-repeat:repeat-x;border-color:#222 #222 #000;border-color:rgba(0,0,0,0.1) rgba(0,0,0,0.1) rgba(0,0,0,0.25);filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ff444444',endColorstr='#ff222222',GradientType=0);filter:progid:DXImageTransform.Microsoft.gradient(enabled=false)}.btn-inverse:hover,.btn-inverse:focus,.btn-inverse:active,.btn-inverse.active,.btn-inverse.disabled,.btn-inverse[disabled]{color:#fff;background-color:#222;*background-color:#151515}.btn-inverse:active,.btn-inverse.active{background-color:#080808 \9}button.btn,input[type="submit"].btn{*padding-top:3px;*padding-bottom:3px}button.btn::-moz-focus-inner,input[type="submit"].btn::-moz-focus-inner{padding:0;border:0}button.btn.btn-large,input[type="submit"].btn.btn-large{*padding-top:7px;*padding-bottom:7px}button.btn.btn-small,input[type="submit"].btn.btn-small{*padding-top:3px;*padding-bottom:3px}button.btn.btn-mini,input[type="submit"].btn.btn-mini{*padding-top:1px;*padding-bottom:1px}.btn-link,.btn-link:active,.btn-link[disabled]{background-color:transparent;background-image:none;-webkit-box-shadow:none;-moz-box-shadow:none;box-shadow:none}.btn-link{color:#08c;cursor:pointer;border-color:transparent;-webkit-border-radius:0;-moz-border-radius:0;border-radius:0}.btn-link:hover,.btn-link:focus{color:#005580;text-decoration:underline;background-color:transparent}.btn-link[disabled]:hover,.btn-link[disabled]:focus{color:#333;text-decoration:none}.btn-group{position:relative;display:inline-block;*display:inline;*margin-left:.3em;font-size:0;white-space:nowrap;vertical-align:middle;*zoom:1}.btn-group:first-child{*margin-left:0}.btn-group+.btn-group{margin-left:5px}.btn-toolbar{margin-top:10px;margin-bottom:10px;font-size:0}.btn-toolbar>.btn+.btn,.btn-toolbar>.btn-group+.btn,.btn-toolbar>.btn+.btn-group{margin-left:5px}.btn-group>.btn{position:relative;-webkit-border-radius:0;-moz-border-radius:0;border-radius:0}.btn-group>.btn+.btn{margin-left:-1px}.btn-group>.btn,.btn-group>.dropdown-menu,.btn-group>.popover{font-size:14px}.btn-group>.btn-mini{font-size:10.5px}.btn-group>.btn-small{font-size:11.9px}.btn-group>.btn-large{font-size:17.5px}.btn-group>.btn:first-child{margin-left:0;-webkit-border-bottom-left-radius:4px;border-bottom-left-radius:4px;-webkit-border-top-left-radius:4px;border-top-left-radius:4px;-moz-border-radius-bottomleft:4px;-moz-border-radius-topleft:4px}.btn-group>.btn:last-child,.btn-group>.dropdown-toggle{-webkit-border-top-right-radius:4px;border-top-right-radius:4px;-webkit-border-bottom-right-radius:4px;border-bottom-right-radius:4px;-moz-border-radius-topright:4px;-moz-border-radius-bottomright:4px}.btn-group>.btn.large:first-child{margin-left:0;-webkit-border-bottom-left-radius:6px;border-bottom-left-radius:6px;-webkit-border-top-left-radius:6px;border-top-left-radius:6px;-moz-border-radius-bottomleft:6px;-moz-border-radius-topleft:6px}.btn-group>.btn.large:last-child,.btn-group>.large.dropdown-toggle{-webkit-border-top-right-radius:6px;border-top-right-radius:6px;-webkit-border-bottom-right-radius:6px;border-bottom-right-radius:6px;-moz-border-radius-topright:6px;-moz-border-radius-bottomright:6px}.btn-group>.btn:hover,.btn-group>.btn:fo
cus,.btn-group>.btn:active,.btn-group>.btn.active{z-index:2}.btn-group .dropdown-toggle:active,.btn-group.open .dropdown-toggle{outline:0}.btn-group>.btn+.dropdown-toggle{*padding-top:5px;padding-right:8px;*padding-bottom:5px;padding-left:8px;-webkit-box-shadow:inset 1px 0 0 rgba(255,255,255,0.125),inset 0 1px 0 rgba(255,255,255,0.2),0 1px 2px rgba(0,0,0,0.05);-moz-box-shadow:inset 1px 0 0 rgba(255,255,255,0.125),inset 0 1px 0 rgba(255,255,255,0.2),0 1px 2px rgba(0,0,0,0.05);box-shadow:inset 1px 0 0 rgba(255,255,255,0.125),inset 0 1px 0 rgba(255,255,255,0.2),0 1px 2px rgba(0,0,0,0.05)}.btn-group>.btn-mini+.dropdown-toggle{*padding-top:2px;padding-right:5px;*padding-bottom:2px;padding-left:5px}.btn-group>.btn-small+.dropdown-toggle{*padding-top:5px;*padding-bottom:4px}.btn-group>.btn-large+.dropdown-toggle{*padding-top:7px;padding-right:12px;*padding-bottom:7px;padding-left:12px}.btn-group.open .dropdown-toggle{background-image:none;-webkit-box-shadow:inset 0 2px 4px rgba(0,0,0,0.15),0 1px 2px rgba(0,0,0,0.05);-moz-box-shadow:inset 0 2px 4px rgba(0,0,0,0.15),0 1px 2px rgba(0,0,0,0.05);box-shadow:inset 0 2px 4px rgba(0,0,0,0.15),0 1px 2px rgba(0,0,0,0.05)}.btn-group.open .btn.dropdown-toggle{background-color:#e6e6e6}.btn-group.open .btn-primary.dropdown-toggle{background-color:#04c}.btn-group.open .btn-warning.dropdown-toggle{background-color:#f89406}.btn-group.open .btn-danger.dropdown-toggle{background-color:#bd362f}.btn-group.open .btn-success.dropdown-toggle{background-color:#51a351}.btn-group.open .btn-info.dropdown-toggle{background-color:#2f96b4}.btn-group.open .btn-inverse.dropdown-toggle{background-color:#222}.btn .caret{margin-top:8px;margin-left:0}.btn-large .caret{margin-top:6px}.btn-large .caret{border-top-width:5px;border-right-width:5px;border-left-width:5px}.btn-mini .caret,.btn-small .caret{margin-top:8px}.dropup .btn-large .caret{border-bottom-width:5px}.btn-primary .caret,.btn-warning .caret,.btn-danger .caret,.btn-info .caret,.btn-success .caret,.btn-inverse .caret{border-top-color:#fff;border-bottom-color:#fff}.btn-group-vertical{display:inline-block;*display:inline;*zoom:1}.btn-group-vertical>.btn{display:block;float:none;max-width:100%;-webkit-border-radius:0;-moz-border-radius:0;border-radius:0}.btn-group-vertical>.btn+.btn{margin-top:-1px;margin-left:0}.btn-group-vertical>.btn:first-child{-webkit-border-radius:4px 4px 0 0;-moz-border-radius:4px 4px 0 0;border-radius:4px 4px 0 0}.btn-group-vertical>.btn:last-child{-webkit-border-radius:0 0 4px 4px;-moz-border-radius:0 0 4px 4px;border-radius:0 0 4px 4px}.btn-group-vertical>.btn-large:first-child{-webkit-border-radius:6px 6px 0 0;-moz-border-radius:6px 6px 0 0;border-radius:6px 6px 0 0}.btn-group-vertical>.btn-large:last-child{-webkit-border-radius:0 0 6px 6px;-moz-border-radius:0 0 6px 6px;border-radius:0 0 6px 6px}.alert{padding:8px 35px 8px 14px;margin-bottom:20px;text-shadow:0 1px 0 rgba(255,255,255,0.5);background-color:#fcf8e3;border:1px solid #fbeed5;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px}.alert,.alert h4{color:#c09853}.alert h4{margin:0}.alert .close{position:relative;top:-2px;right:-21px;line-height:20px}.alert-success{color:#468847;background-color:#dff0d8;border-color:#d6e9c6}.alert-success h4{color:#468847}.alert-danger,.alert-error{color:#b94a48;background-color:#f2dede;border-color:#eed3d7}.alert-danger h4,.alert-error h4{color:#b94a48}.alert-info{color:#3a87ad;background-color:#d9edf7;border-color:#bce8f1}.alert-info 
h4{color:#3a87ad}.alert-block{padding-top:14px;padding-bottom:14px}.alert-block>p,.alert-block>ul{margin-bottom:0}.alert-block p+p{margin-top:5px}.nav{margin-bottom:20px;margin-left:0;list-style:none}.nav>li>a{display:block}.nav>li>a:hover,.nav>li>a:focus{text-decoration:none;background-color:#eee}.nav>li>a>img{max-width:none}.nav>.pull-right{float:right}.nav-header{display:block;padding:3px 15px;font-size:11px;font-weight:bold;line-height:20px;color:#999;text-shadow:0 1px 0 rgba(255,255,255,0.5);text-transform:uppercase}.nav li+.nav-header{margin-top:9px}.nav-list{padding-right:15px;padding-left:15px;margin-bottom:0}.nav-list>li>a,.nav-list .nav-header{margin-right:-15px;margin-left:-15px;text-shadow:0 1px 0 rgba(255,255,255,0.5)}.nav-list>li>a{padding:3px 15px}.nav-list>.active>a,.nav-list>.active>a:hover,.nav-list>.active>a:focus{color:#fff;text-shadow:0 -1px 0 rgba(0,0,0,0.2);background-color:#08c}.nav-list [class^="icon-"],.nav-list [class*=" icon-"]{margin-right:2px}.nav-list .divider{*width:100%;height:1px;margin:9px 1px;*margin:-5px 0 5px;overflow:hidden;background-color:#e5e5e5;border-bottom:1px solid #fff}.nav-tabs,.nav-pills{*zoom:1}.nav-tabs:before,.nav-pills:before,.nav-tabs:after,.nav-pills:after{display:table;line-height:0;content:""}.nav-tabs:after,.nav-pills:after{clear:both}.nav-tabs>li,.nav-pills>li{float:left}.nav-tabs>li>a,.nav-pills>li>a{padding-right:12px;padding-left:12px;margin-right:2px;line-height:14px}.nav-tabs{border-bottom:1px solid #ddd}.nav-tabs>li{margin-bottom:-1px}.nav-tabs>li>a{padding-top:8px;padding-bottom:8px;line-height:20px;border:1px solid transparent;-webkit-border-radius:4px 4px 0 0;-moz-border-radius:4px 4px 0 0;border-radius:4px 4px 0 0}.nav-tabs>li>a:hover,.nav-tabs>li>a:focus{border-color:#eee #eee #ddd}.nav-tabs>.active>a,.nav-tabs>.active>a:hover,.nav-tabs>.active>a:focus{color:#555;cursor:default;background-color:#fff;border:1px solid #ddd;border-bottom-color:transparent}.nav-pills>li>a{padding-top:8px;padding-bottom:8px;margin-top:2px;margin-bottom:2px;-webkit-border-radius:5px;-moz-border-radius:5px;border-radius:5px}.nav-pills>.active>a,.nav-pills>.active>a:hover,.nav-pills>.active>a:focus{color:#fff;background-color:#08c}.nav-stacked>li{float:none}.nav-stacked>li>a{margin-right:0}.nav-tabs.nav-stacked{border-bottom:0}.nav-tabs.nav-stacked>li>a{border:1px solid #ddd;-webkit-border-radius:0;-moz-border-radius:0;border-radius:0}.nav-tabs.nav-stacked>li:first-child>a{-webkit-border-top-right-radius:4px;border-top-right-radius:4px;-webkit-border-top-left-radius:4px;border-top-left-radius:4px;-moz-border-radius-topright:4px;-moz-border-radius-topleft:4px}.nav-tabs.nav-stacked>li:last-child>a{-webkit-border-bottom-right-radius:4px;border-bottom-right-radius:4px;-webkit-border-bottom-left-radius:4px;border-bottom-left-radius:4px;-moz-border-radius-bottomright:4px;-moz-border-radius-bottomleft:4px}.nav-tabs.nav-stacked>li>a:hover,.nav-tabs.nav-stacked>li>a:focus{z-index:2;border-color:#ddd}.nav-pills.nav-stacked>li>a{margin-bottom:3px}.nav-pills.nav-stacked>li:last-child>a{margin-bottom:1px}.nav-tabs .dropdown-menu{-webkit-border-radius:0 0 6px 6px;-moz-border-radius:0 0 6px 6px;border-radius:0 0 6px 6px}.nav-pills .dropdown-menu{-webkit-border-radius:6px;-moz-border-radius:6px;border-radius:6px}.nav .dropdown-toggle .caret{margin-top:6px;border-top-color:#08c;border-bottom-color:#08c}.nav .dropdown-toggle:hover .caret,.nav .dropdown-toggle:focus .caret{border-top-color:#005580;border-bottom-color:#005580}.nav-tabs .dropdown-toggle 
.caret{margin-top:8px}.nav .active .dropdown-toggle .caret{border-top-color:#fff;border-bottom-color:#fff}.nav-tabs .active .dropdown-toggle .caret{border-top-color:#555;border-bottom-color:#555}.nav>.dropdown.active>a:hover,.nav>.dropdown.active>a:focus{cursor:pointer}.nav-tabs .open .dropdown-toggle,.nav-pills .open .dropdown-toggle,.nav>li.dropdown.open.active>a:hover,.nav>li.dropdown.open.active>a:focus{color:#fff;background-color:#999;border-color:#999}.nav li.dropdown.open .caret,.nav li.dropdown.open.active .caret,.nav li.dropdown.open a:hover .caret,.nav li.dropdown.open a:focus .caret{border-top-color:#fff;border-bottom-color:#fff;opacity:1;filter:alpha(opacity=100)}.tabs-stacked .open>a:hover,.tabs-stacked .open>a:focus{border-color:#999}.tabbable{*zoom:1}.tabbable:before,.tabbable:after{display:table;line-height:0;content:""}.tabbable:after{clear:both}.tab-content{overflow:auto}.tabs-below>.nav-tabs,.tabs-right>.nav-tabs,.tabs-left>.nav-tabs{border-bottom:0}.tab-content>.tab-pane,.pill-content>.pill-pane{display:none}.tab-content>.active,.pill-content>.active{display:block}.tabs-below>.nav-tabs{border-top:1px solid #ddd}.tabs-below>.nav-tabs>li{margin-top:-1px;margin-bottom:0}.tabs-below>.nav-tabs>li>a{-webkit-border-radius:0 0 4px 4px;-moz-border-radius:0 0 4px 4px;border-radius:0 0 4px 4px}.tabs-below>.nav-tabs>li>a:hover,.tabs-below>.nav-tabs>li>a:focus{border-top-color:#ddd;border-bottom-color:transparent}.tabs-below>.nav-tabs>.active>a,.tabs-below>.nav-tabs>.active>a:hover,.tabs-below>.nav-tabs>.active>a:focus{border-color:transparent #ddd #ddd #ddd}.tabs-left>.nav-tabs>li,.tabs-right>.nav-tabs>li{float:none}.tabs-left>.nav-tabs>li>a,.tabs-right>.nav-tabs>li>a{min-width:74px;margin-right:0;margin-bottom:3px}.tabs-left>.nav-tabs{float:left;margin-right:19px;border-right:1px solid #ddd}.tabs-left>.nav-tabs>li>a{margin-right:-1px;-webkit-border-radius:4px 0 0 4px;-moz-border-radius:4px 0 0 4px;border-radius:4px 0 0 4px}.tabs-left>.nav-tabs>li>a:hover,.tabs-left>.nav-tabs>li>a:focus{border-color:#eee #ddd #eee #eee}.tabs-left>.nav-tabs .active>a,.tabs-left>.nav-tabs .active>a:hover,.tabs-left>.nav-tabs .active>a:focus{border-color:#ddd transparent #ddd #ddd;*border-right-color:#fff}.tabs-right>.nav-tabs{float:right;margin-left:19px;border-left:1px solid #ddd}.tabs-right>.nav-tabs>li>a{margin-left:-1px;-webkit-border-radius:0 4px 4px 0;-moz-border-radius:0 4px 4px 0;border-radius:0 4px 4px 0}.tabs-right>.nav-tabs>li>a:hover,.tabs-right>.nav-tabs>li>a:focus{border-color:#eee #eee #eee #ddd}.tabs-right>.nav-tabs .active>a,.tabs-right>.nav-tabs .active>a:hover,.tabs-right>.nav-tabs .active>a:focus{border-color:#ddd #ddd #ddd transparent;*border-left-color:#fff}.nav>.disabled>a{color:#999}.nav>.disabled>a:hover,.nav>.disabled>a:focus{text-decoration:none;cursor:default;background-color:transparent}.navbar{*position:relative;*z-index:2;margin-bottom:20px;overflow:visible}.navbar-inner{min-height:40px;padding-right:20px;padding-left:20px;background-color:#fafafa;background-image:-moz-linear-gradient(top,#fff,#f2f2f2);background-image:-webkit-gradient(linear,0 0,0 100%,from(#fff),to(#f2f2f2));background-image:-webkit-linear-gradient(top,#fff,#f2f2f2);background-image:-o-linear-gradient(top,#fff,#f2f2f2);background-image:linear-gradient(to bottom,#fff,#f2f2f2);background-repeat:repeat-x;border:1px solid 
#d4d4d4;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px;filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ffffffff',endColorstr='#fff2f2f2',GradientType=0);*zoom:1;-webkit-box-shadow:0 1px 4px rgba(0,0,0,0.065);-moz-box-shadow:0 1px 4px rgba(0,0,0,0.065);box-shadow:0 1px 4px rgba(0,0,0,0.065)}.navbar-inner:before,.navbar-inner:after{display:table;line-height:0;content:""}.navbar-inner:after{clear:both}.navbar .container{width:auto}.nav-collapse.collapse{height:auto;overflow:visible}.navbar .brand{display:block;float:left;padding:10px 20px 10px;margin-left:-20px;font-size:20px;font-weight:200;color:#777;text-shadow:0 1px 0 #fff}.navbar .brand:hover,.navbar .brand:focus{text-decoration:none}.navbar-text{margin-bottom:0;line-height:40px;color:#777}.navbar-link{color:#777}.navbar-link:hover,.navbar-link:focus{color:#333}.navbar .divider-vertical{height:40px;margin:0 9px;border-right:1px solid #fff;border-left:1px solid #f2f2f2}.navbar .btn,.navbar .btn-group{margin-top:5px}.navbar .btn-group .btn,.navbar .input-prepend .btn,.navbar .input-append .btn,.navbar .input-prepend .btn-group,.navbar .input-append .btn-group{margin-top:0}.navbar-form{margin-bottom:0;*zoom:1}.navbar-form:before,.navbar-form:after{display:table;line-height:0;content:""}.navbar-form:after{clear:both}.navbar-form input,.navbar-form select,.navbar-form .radio,.navbar-form .checkbox{margin-top:5px}.navbar-form input,.navbar-form select,.navbar-form .btn{display:inline-block;margin-bottom:0}.navbar-form input[type="image"],.navbar-form input[type="checkbox"],.navbar-form input[type="radio"]{margin-top:3px}.navbar-form .input-append,.navbar-form .input-prepend{margin-top:5px;white-space:nowrap}.navbar-form .input-append input,.navbar-form .input-prepend input{margin-top:0}.navbar-search{position:relative;float:left;margin-top:5px;margin-bottom:0}.navbar-search .search-query{padding:4px 14px;margin-bottom:0;font-family:"Helvetica Neue",Helvetica,Arial,sans-serif;font-size:13px;font-weight:normal;line-height:1;-webkit-border-radius:15px;-moz-border-radius:15px;border-radius:15px}.navbar-static-top{position:static;margin-bottom:0}.navbar-static-top .navbar-inner{-webkit-border-radius:0;-moz-border-radius:0;border-radius:0}.navbar-fixed-top,.navbar-fixed-bottom{position:fixed;right:0;left:0;z-index:1030;margin-bottom:0}.navbar-fixed-top .navbar-inner,.navbar-static-top .navbar-inner{border-width:0 0 1px}.navbar-fixed-bottom .navbar-inner{border-width:1px 0 0}.navbar-fixed-top .navbar-inner,.navbar-fixed-bottom .navbar-inner{padding-right:0;padding-left:0;-webkit-border-radius:0;-moz-border-radius:0;border-radius:0}.navbar-static-top .container,.navbar-fixed-top .container,.navbar-fixed-bottom .container{width:940px}.navbar-fixed-top{top:0}.navbar-fixed-top .navbar-inner,.navbar-static-top .navbar-inner{-webkit-box-shadow:0 1px 10px rgba(0,0,0,0.1);-moz-box-shadow:0 1px 10px rgba(0,0,0,0.1);box-shadow:0 1px 10px rgba(0,0,0,0.1)}.navbar-fixed-bottom{bottom:0}.navbar-fixed-bottom .navbar-inner{-webkit-box-shadow:0 -1px 10px rgba(0,0,0,0.1);-moz-box-shadow:0 -1px 10px rgba(0,0,0,0.1);box-shadow:0 -1px 10px rgba(0,0,0,0.1)}.navbar .nav{position:relative;left:0;display:block;float:left;margin:0 10px 0 0}.navbar .nav.pull-right{float:right;margin-right:0}.navbar .nav>li{float:left}.navbar .nav>li>a{float:none;padding:10px 15px 10px;color:#777;text-decoration:none;text-shadow:0 1px 0 #fff}.navbar .nav .dropdown-toggle .caret{margin-top:8px}.navbar .nav>li>a:focus,.navbar 
.nav>li>a:hover{color:#333;text-decoration:none;background-color:transparent}.navbar .nav>.active>a,.navbar .nav>.active>a:hover,.navbar .nav>.active>a:focus{color:#555;text-decoration:none;background-color:#e5e5e5;-webkit-box-shadow:inset 0 3px 8px rgba(0,0,0,0.125);-moz-box-shadow:inset 0 3px 8px rgba(0,0,0,0.125);box-shadow:inset 0 3px 8px rgba(0,0,0,0.125)}.navbar .btn-navbar{display:none;float:right;padding:7px 10px;margin-right:5px;margin-left:5px;color:#fff;text-shadow:0 -1px 0 rgba(0,0,0,0.25);background-color:#ededed;*background-color:#e5e5e5;background-image:-moz-linear-gradient(top,#f2f2f2,#e5e5e5);background-image:-webkit-gradient(linear,0 0,0 100%,from(#f2f2f2),to(#e5e5e5));background-image:-webkit-linear-gradient(top,#f2f2f2,#e5e5e5);background-image:-o-linear-gradient(top,#f2f2f2,#e5e5e5);background-image:linear-gradient(to bottom,#f2f2f2,#e5e5e5);background-repeat:repeat-x;border-color:#e5e5e5 #e5e5e5 #bfbfbf;border-color:rgba(0,0,0,0.1) rgba(0,0,0,0.1) rgba(0,0,0,0.25);filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#fff2f2f2',endColorstr='#ffe5e5e5',GradientType=0);filter:progid:DXImageTransform.Microsoft.gradient(enabled=false);-webkit-box-shadow:inset 0 1px 0 rgba(255,255,255,0.1),0 1px 0 rgba(255,255,255,0.075);-moz-box-shadow:inset 0 1px 0 rgba(255,255,255,0.1),0 1px 0 rgba(255,255,255,0.075);box-shadow:inset 0 1px 0 rgba(255,255,255,0.1),0 1px 0 rgba(255,255,255,0.075)}.navbar .btn-navbar:hover,.navbar .btn-navbar:focus,.navbar .btn-navbar:active,.navbar .btn-navbar.active,.navbar .btn-navbar.disabled,.navbar .btn-navbar[disabled]{color:#fff;background-color:#e5e5e5;*background-color:#d9d9d9}.navbar .btn-navbar:active,.navbar .btn-navbar.active{background-color:#ccc \9}.navbar .btn-navbar .icon-bar{display:block;width:18px;height:2px;background-color:#f5f5f5;-webkit-border-radius:1px;-moz-border-radius:1px;border-radius:1px;-webkit-box-shadow:0 1px 0 rgba(0,0,0,0.25);-moz-box-shadow:0 1px 0 rgba(0,0,0,0.25);box-shadow:0 1px 0 rgba(0,0,0,0.25)}.btn-navbar .icon-bar+.icon-bar{margin-top:3px}.navbar .nav>li>.dropdown-menu:before{position:absolute;top:-7px;left:9px;display:inline-block;border-right:7px solid transparent;border-bottom:7px solid #ccc;border-left:7px solid transparent;border-bottom-color:rgba(0,0,0,0.2);content:''}.navbar .nav>li>.dropdown-menu:after{position:absolute;top:-6px;left:10px;display:inline-block;border-right:6px solid transparent;border-bottom:6px solid #fff;border-left:6px solid transparent;content:''}.navbar-fixed-bottom .nav>li>.dropdown-menu:before{top:auto;bottom:-7px;border-top:7px solid #ccc;border-bottom:0;border-top-color:rgba(0,0,0,0.2)}.navbar-fixed-bottom .nav>li>.dropdown-menu:after{top:auto;bottom:-6px;border-top:6px solid #fff;border-bottom:0}.navbar .nav li.dropdown>a:hover .caret,.navbar .nav li.dropdown>a:focus .caret{border-top-color:#333;border-bottom-color:#333}.navbar .nav li.dropdown.open>.dropdown-toggle,.navbar .nav li.dropdown.active>.dropdown-toggle,.navbar .nav li.dropdown.open.active>.dropdown-toggle{color:#555;background-color:#e5e5e5}.navbar .nav li.dropdown>.dropdown-toggle .caret{border-top-color:#777;border-bottom-color:#777}.navbar .nav li.dropdown.open>.dropdown-toggle .caret,.navbar .nav li.dropdown.active>.dropdown-toggle .caret,.navbar .nav li.dropdown.open.active>.dropdown-toggle .caret{border-top-color:#555;border-bottom-color:#555}.navbar .pull-right>li>.dropdown-menu,.navbar .nav>li>.dropdown-menu.pull-right{right:0;left:auto}.navbar .pull-right>li>.dropdown-menu:before,.navbar 
.nav>li>.dropdown-menu.pull-right:before{right:12px;left:auto}.navbar .pull-right>li>.dropdown-menu:after,.navbar .nav>li>.dropdown-menu.pull-right:after{right:13px;left:auto}.navbar .pull-right>li>.dropdown-menu .dropdown-menu,.navbar .nav>li>.dropdown-menu.pull-right .dropdown-menu{right:100%;left:auto;margin-right:-1px;margin-left:0;-webkit-border-radius:6px 0 6px 6px;-moz-border-radius:6px 0 6px 6px;border-radius:6px 0 6px 6px}.navbar-inverse .navbar-inner{background-color:#1b1b1b;background-image:-moz-linear-gradient(top,#222,#111);background-image:-webkit-gradient(linear,0 0,0 100%,from(#222),to(#111));background-image:-webkit-linear-gradient(top,#222,#111);background-image:-o-linear-gradient(top,#222,#111);background-image:linear-gradient(to bottom,#222,#111);background-repeat:repeat-x;border-color:#252525;filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ff222222',endColorstr='#ff111111',GradientType=0)}.navbar-inverse .brand,.navbar-inverse .nav>li>a{color:#999;text-shadow:0 -1px 0 rgba(0,0,0,0.25)}.navbar-inverse .brand:hover,.navbar-inverse .nav>li>a:hover,.navbar-inverse .brand:focus,.navbar-inverse .nav>li>a:focus{color:#fff}.navbar-inverse .brand{color:#999}.navbar-inverse .navbar-text{color:#999}.navbar-inverse .nav>li>a:focus,.navbar-inverse .nav>li>a:hover{color:#fff;background-color:transparent}.navbar-inverse .nav .active>a,.navbar-inverse .nav .active>a:hover,.navbar-inverse .nav .active>a:focus{color:#fff;background-color:#111}.navbar-inverse .navbar-link{color:#999}.navbar-inverse .navbar-link:hover,.navbar-inverse .navbar-link:focus{color:#fff}.navbar-inverse .divider-vertical{border-right-color:#222;border-left-color:#111}.navbar-inverse .nav li.dropdown.open>.dropdown-toggle,.navbar-inverse .nav li.dropdown.active>.dropdown-toggle,.navbar-inverse .nav li.dropdown.open.active>.dropdown-toggle{color:#fff;background-color:#111}.navbar-inverse .nav li.dropdown>a:hover .caret,.navbar-inverse .nav li.dropdown>a:focus .caret{border-top-color:#fff;border-bottom-color:#fff}.navbar-inverse .nav li.dropdown>.dropdown-toggle .caret{border-top-color:#999;border-bottom-color:#999}.navbar-inverse .nav li.dropdown.open>.dropdown-toggle .caret,.navbar-inverse .nav li.dropdown.active>.dropdown-toggle .caret,.navbar-inverse .nav li.dropdown.open.active>.dropdown-toggle .caret{border-top-color:#fff;border-bottom-color:#fff}.navbar-inverse .navbar-search .search-query{color:#fff;background-color:#515151;border-color:#111;-webkit-box-shadow:inset 0 1px 2px rgba(0,0,0,0.1),0 1px 0 rgba(255,255,255,0.15);-moz-box-shadow:inset 0 1px 2px rgba(0,0,0,0.1),0 1px 0 rgba(255,255,255,0.15);box-shadow:inset 0 1px 2px rgba(0,0,0,0.1),0 1px 0 rgba(255,255,255,0.15);-webkit-transition:none;-moz-transition:none;-o-transition:none;transition:none}.navbar-inverse .navbar-search .search-query:-moz-placeholder{color:#ccc}.navbar-inverse .navbar-search .search-query:-ms-input-placeholder{color:#ccc}.navbar-inverse .navbar-search .search-query::-webkit-input-placeholder{color:#ccc}.navbar-inverse .navbar-search .search-query:focus,.navbar-inverse .navbar-search .search-query.focused{padding:5px 15px;color:#333;text-shadow:0 1px 0 #fff;background-color:#fff;border:0;outline:0;-webkit-box-shadow:0 0 3px rgba(0,0,0,0.15);-moz-box-shadow:0 0 3px rgba(0,0,0,0.15);box-shadow:0 0 3px rgba(0,0,0,0.15)}.navbar-inverse .btn-navbar{color:#fff;text-shadow:0 -1px 0 
rgba(0,0,0,0.25);background-color:#0e0e0e;*background-color:#040404;background-image:-moz-linear-gradient(top,#151515,#040404);background-image:-webkit-gradient(linear,0 0,0 100%,from(#151515),to(#040404));background-image:-webkit-linear-gradient(top,#151515,#040404);background-image:-o-linear-gradient(top,#151515,#040404);background-image:linear-gradient(to bottom,#151515,#040404);background-repeat:repeat-x;border-color:#040404 #040404 #000;border-color:rgba(0,0,0,0.1) rgba(0,0,0,0.1) rgba(0,0,0,0.25);filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ff151515',endColorstr='#ff040404',GradientType=0);filter:progid:DXImageTransform.Microsoft.gradient(enabled=false)}.navbar-inverse .btn-navbar:hover,.navbar-inverse .btn-navbar:focus,.navbar-inverse .btn-navbar:active,.navbar-inverse .btn-navbar.active,.navbar-inverse .btn-navbar.disabled,.navbar-inverse .btn-navbar[disabled]{color:#fff;background-color:#040404;*background-color:#000}.navbar-inverse .btn-navbar:active,.navbar-inverse .btn-navbar.active{background-color:#000 \9}.breadcrumb{padding:8px 15px;margin:0 0 20px;list-style:none;background-color:#f5f5f5;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px}.breadcrumb>li{display:inline-block;*display:inline;text-shadow:0 1px 0 #fff;*zoom:1}.breadcrumb>li>.divider{padding:0 5px;color:#ccc}.breadcrumb>.active{color:#999}.pagination{margin:20px 0}.pagination ul{display:inline-block;*display:inline;margin-bottom:0;margin-left:0;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px;*zoom:1;-webkit-box-shadow:0 1px 2px rgba(0,0,0,0.05);-moz-box-shadow:0 1px 2px rgba(0,0,0,0.05);box-shadow:0 1px 2px rgba(0,0,0,0.05)}.pagination ul>li{display:inline}.pagination ul>li>a,.pagination ul>li>span{float:left;padding:4px 12px;line-height:20px;text-decoration:none;background-color:#fff;border:1px solid #ddd;border-left-width:0}.pagination ul>li>a:hover,.pagination ul>li>a:focus,.pagination ul>.active>a,.pagination ul>.active>span{background-color:#f5f5f5}.pagination ul>.active>a,.pagination ul>.active>span{color:#999;cursor:default}.pagination ul>.disabled>span,.pagination ul>.disabled>a,.pagination ul>.disabled>a:hover,.pagination ul>.disabled>a:focus{color:#999;cursor:default;background-color:transparent}.pagination ul>li:first-child>a,.pagination ul>li:first-child>span{border-left-width:1px;-webkit-border-bottom-left-radius:4px;border-bottom-left-radius:4px;-webkit-border-top-left-radius:4px;border-top-left-radius:4px;-moz-border-radius-bottomleft:4px;-moz-border-radius-topleft:4px}.pagination ul>li:last-child>a,.pagination ul>li:last-child>span{-webkit-border-top-right-radius:4px;border-top-right-radius:4px;-webkit-border-bottom-right-radius:4px;border-bottom-right-radius:4px;-moz-border-radius-topright:4px;-moz-border-radius-bottomright:4px}.pagination-centered{text-align:center}.pagination-right{text-align:right}.pagination-large ul>li>a,.pagination-large ul>li>span{padding:11px 19px;font-size:17.5px}.pagination-large ul>li:first-child>a,.pagination-large ul>li:first-child>span{-webkit-border-bottom-left-radius:6px;border-bottom-left-radius:6px;-webkit-border-top-left-radius:6px;border-top-left-radius:6px;-moz-border-radius-bottomleft:6px;-moz-border-radius-topleft:6px}.pagination-large ul>li:last-child>a,.pagination-large 
ul>li:last-child>span{-webkit-border-top-right-radius:6px;border-top-right-radius:6px;-webkit-border-bottom-right-radius:6px;border-bottom-right-radius:6px;-moz-border-radius-topright:6px;-moz-border-radius-bottomright:6px}.pagination-mini ul>li:first-child>a,.pagination-small ul>li:first-child>a,.pagination-mini ul>li:first-child>span,.pagination-small ul>li:first-child>span{-webkit-border-bottom-left-radius:3px;border-bottom-left-radius:3px;-webkit-border-top-left-radius:3px;border-top-left-radius:3px;-moz-border-radius-bottomleft:3px;-moz-border-radius-topleft:3px}.pagination-mini ul>li:last-child>a,.pagination-small ul>li:last-child>a,.pagination-mini ul>li:last-child>span,.pagination-small ul>li:last-child>span{-webkit-border-top-right-radius:3px;border-top-right-radius:3px;-webkit-border-bottom-right-radius:3px;border-bottom-right-radius:3px;-moz-border-radius-topright:3px;-moz-border-radius-bottomright:3px}.pagination-small ul>li>a,.pagination-small ul>li>span{padding:2px 10px;font-size:11.9px}.pagination-mini ul>li>a,.pagination-mini ul>li>span{padding:0 6px;font-size:10.5px}.pager{margin:20px 0;text-align:center;list-style:none;*zoom:1}.pager:before,.pager:after{display:table;line-height:0;content:""}.pager:after{clear:both}.pager li{display:inline}.pager li>a,.pager li>span{display:inline-block;padding:5px 14px;background-color:#fff;border:1px solid #ddd;-webkit-border-radius:15px;-moz-border-radius:15px;border-radius:15px}.pager li>a:hover,.pager li>a:focus{text-decoration:none;background-color:#f5f5f5}.pager .next>a,.pager .next>span{float:right}.pager .previous>a,.pager .previous>span{float:left}.pager .disabled>a,.pager .disabled>a:hover,.pager .disabled>a:focus,.pager .disabled>span{color:#999;cursor:default;background-color:#fff}.modal-backdrop{position:fixed;top:0;right:0;bottom:0;left:0;z-index:1040;background-color:#000}.modal-backdrop.fade{opacity:0}.modal-backdrop,.modal-backdrop.fade.in{opacity:.8;filter:alpha(opacity=80)}.modal{position:fixed;top:10%;left:50%;z-index:1050;width:560px;margin-left:-280px;background-color:#fff;border:1px solid #999;border:1px solid rgba(0,0,0,0.3);*border:1px solid #999;-webkit-border-radius:6px;-moz-border-radius:6px;border-radius:6px;outline:0;-webkit-box-shadow:0 3px 7px rgba(0,0,0,0.3);-moz-box-shadow:0 3px 7px rgba(0,0,0,0.3);box-shadow:0 3px 7px rgba(0,0,0,0.3);-webkit-background-clip:padding-box;-moz-background-clip:padding-box;background-clip:padding-box}.modal.fade{top:-25%;-webkit-transition:opacity .3s linear,top .3s ease-out;-moz-transition:opacity .3s linear,top .3s ease-out;-o-transition:opacity .3s linear,top .3s ease-out;transition:opacity .3s linear,top .3s ease-out}.modal.fade.in{top:10%}.modal-header{padding:9px 15px;border-bottom:1px solid #eee}.modal-header .close{margin-top:2px}.modal-header h3{margin:0;line-height:30px}.modal-body{position:relative;max-height:400px;padding:15px;overflow-y:auto}.modal-form{margin-bottom:0}.modal-footer{padding:14px 15px 15px;margin-bottom:0;text-align:right;background-color:#f5f5f5;border-top:1px solid #ddd;-webkit-border-radius:0 0 6px 6px;-moz-border-radius:0 0 6px 6px;border-radius:0 0 6px 6px;*zoom:1;-webkit-box-shadow:inset 0 1px 0 #fff;-moz-box-shadow:inset 0 1px 0 #fff;box-shadow:inset 0 1px 0 #fff}.modal-footer:before,.modal-footer:after{display:table;line-height:0;content:""}.modal-footer:after{clear:both}.modal-footer .btn+.btn{margin-bottom:0;margin-left:5px}.modal-footer .btn-group .btn+.btn{margin-left:-1px}.modal-footer 
.btn-block+.btn-block{margin-left:0}.tooltip{position:absolute;z-index:1030;display:block;font-size:11px;line-height:1.4;opacity:0;filter:alpha(opacity=0);visibility:visible}.tooltip.in{opacity:.8;filter:alpha(opacity=80)}.tooltip.top{padding:5px 0;margin-top:-3px}.tooltip.right{padding:0 5px;margin-left:3px}.tooltip.bottom{padding:5px 0;margin-top:3px}.tooltip.left{padding:0 5px;margin-left:-3px}.tooltip-inner{max-width:200px;padding:8px;color:#fff;text-align:center;text-decoration:none;background-color:#000;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px}.tooltip-arrow{position:absolute;width:0;height:0;border-color:transparent;border-style:solid}.tooltip.top .tooltip-arrow{bottom:0;left:50%;margin-left:-5px;border-top-color:#000;border-width:5px 5px 0}.tooltip.right .tooltip-arrow{top:50%;left:0;margin-top:-5px;border-right-color:#000;border-width:5px 5px 5px 0}.tooltip.left .tooltip-arrow{top:50%;right:0;margin-top:-5px;border-left-color:#000;border-width:5px 0 5px 5px}.tooltip.bottom .tooltip-arrow{top:0;left:50%;margin-left:-5px;border-bottom-color:#000;border-width:0 5px 5px}.popover{position:absolute;top:0;left:0;z-index:1010;display:none;max-width:400px;padding:1px;text-align:left;white-space:normal;background-color:#fff;border:1px solid #ccc;border:1px solid rgba(0,0,0,0.2);-webkit-border-radius:6px;-moz-border-radius:6px;border-radius:6px;-webkit-box-shadow:0 5px 10px rgba(0,0,0,0.2);-moz-box-shadow:0 5px 10px rgba(0,0,0,0.2);box-shadow:0 5px 10px rgba(0,0,0,0.2);-webkit-background-clip:padding-box;-moz-background-clip:padding;background-clip:padding-box}.popover.top{margin-top:-10px}.popover.right{margin-left:10px}.popover.bottom{margin-top:10px}.popover.left{margin-left:-10px}.popover-title{padding:8px 14px;margin:0;font-size:14px;font-weight:normal;line-height:18px;background-color:#f7f7f7;border-bottom:1px solid #ebebeb;-webkit-border-radius:5px 5px 0 0;-moz-border-radius:5px 5px 0 0;border-radius:5px 5px 0 0}.popover-title:empty{display:none}.popover-content{padding:9px 14px}.popover .arrow,.popover .arrow:after{position:absolute;display:block;width:0;height:0;border-color:transparent;border-style:solid}.popover .arrow{border-width:11px}.popover .arrow:after{border-width:10px;content:""}.popover.top .arrow{bottom:-11px;left:50%;margin-left:-11px;border-top-color:#999;border-top-color:rgba(0,0,0,0.25);border-bottom-width:0}.popover.top .arrow:after{bottom:1px;margin-left:-10px;border-top-color:#fff;border-bottom-width:0}.popover.right .arrow{top:50%;left:-11px;margin-top:-11px;border-right-color:#999;border-right-color:rgba(0,0,0,0.25);border-left-width:0}.popover.right .arrow:after{bottom:-10px;left:1px;border-right-color:#fff;border-left-width:0}.popover.bottom .arrow{top:-11px;left:50%;margin-left:-11px;border-bottom-color:#999;border-bottom-color:rgba(0,0,0,0.25);border-top-width:0}.popover.bottom .arrow:after{top:1px;margin-left:-10px;border-bottom-color:#fff;border-top-width:0}.popover.left .arrow{top:50%;right:-11px;margin-top:-11px;border-left-color:#999;border-left-color:rgba(0,0,0,0.25);border-right-width:0}.popover.left .arrow:after{right:1px;bottom:-10px;border-left-color:#fff;border-right-width:0}.thumbnails{margin-left:-20px;list-style:none;*zoom:1}.thumbnails:before,.thumbnails:after{display:table;line-height:0;content:""}.thumbnails:after{clear:both}.row-fluid .thumbnails{margin-left:0}.thumbnails>li{float:left;margin-bottom:20px;margin-left:20px}.thumbnail{display:block;padding:4px;line-height:20px;border:1px solid 
#ddd;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px;-webkit-box-shadow:0 1px 3px rgba(0,0,0,0.055);-moz-box-shadow:0 1px 3px rgba(0,0,0,0.055);box-shadow:0 1px 3px rgba(0,0,0,0.055);-webkit-transition:all .2s ease-in-out;-moz-transition:all .2s ease-in-out;-o-transition:all .2s ease-in-out;transition:all .2s ease-in-out}a.thumbnail:hover,a.thumbnail:focus{border-color:#08c;-webkit-box-shadow:0 1px 4px rgba(0,105,214,0.25);-moz-box-shadow:0 1px 4px rgba(0,105,214,0.25);box-shadow:0 1px 4px rgba(0,105,214,0.25)}.thumbnail>img{display:block;max-width:100%;margin-right:auto;margin-left:auto}.thumbnail .caption{padding:9px;color:#555}.media,.media-body{overflow:hidden;*overflow:visible;zoom:1}.media,.media .media{margin-top:15px}.media:first-child{margin-top:0}.media-object{display:block}.media-heading{margin:0 0 5px}.media>.pull-left{margin-right:10px}.media>.pull-right{margin-left:10px}.media-list{margin-left:0;list-style:none}.label,.badge{display:inline-block;padding:2px 4px;font-size:11.844px;font-weight:bold;line-height:14px;color:#fff;text-shadow:0 -1px 0 rgba(0,0,0,0.25);white-space:nowrap;vertical-align:baseline;background-color:#999}.label{-webkit-border-radius:3px;-moz-border-radius:3px;border-radius:3px}.badge{padding-right:9px;padding-left:9px;-webkit-border-radius:9px;-moz-border-radius:9px;border-radius:9px}.label:empty,.badge:empty{display:none}a.label:hover,a.label:focus,a.badge:hover,a.badge:focus{color:#fff;text-decoration:none;cursor:pointer}.label-important,.badge-important{background-color:#b94a48}.label-important[href],.badge-important[href]{background-color:#953b39}.label-warning,.badge-warning{background-color:#f89406}.label-warning[href],.badge-warning[href]{background-color:#c67605}.label-success,.badge-success{background-color:#468847}.label-success[href],.badge-success[href]{background-color:#356635}.label-info,.badge-info{background-color:#3a87ad}.label-info[href],.badge-info[href]{background-color:#2d6987}.label-inverse,.badge-inverse{background-color:#333}.label-inverse[href],.badge-inverse[href]{background-color:#1a1a1a}.btn .label,.btn .badge{position:relative;top:-1px}.btn-mini .label,.btn-mini .badge{top:0}@-webkit-keyframes progress-bar-stripes{from{background-position:40px 0}to{background-position:0 0}}@-moz-keyframes progress-bar-stripes{from{background-position:40px 0}to{background-position:0 0}}@-ms-keyframes progress-bar-stripes{from{background-position:40px 0}to{background-position:0 0}}@-o-keyframes progress-bar-stripes{from{background-position:0 0}to{background-position:40px 0}}@keyframes progress-bar-stripes{from{background-position:40px 0}to{background-position:0 0}}.progress{height:20px;margin-bottom:20px;overflow:hidden;background-color:#f7f7f7;background-image:-moz-linear-gradient(top,#f5f5f5,#f9f9f9);background-image:-webkit-gradient(linear,0 0,0 100%,from(#f5f5f5),to(#f9f9f9));background-image:-webkit-linear-gradient(top,#f5f5f5,#f9f9f9);background-image:-o-linear-gradient(top,#f5f5f5,#f9f9f9);background-image:linear-gradient(to bottom,#f5f5f5,#f9f9f9);background-repeat:repeat-x;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px;filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#fff5f5f5',endColorstr='#fff9f9f9',GradientType=0);-webkit-box-shadow:inset 0 1px 2px rgba(0,0,0,0.1);-moz-box-shadow:inset 0 1px 2px rgba(0,0,0,0.1);box-shadow:inset 0 1px 2px rgba(0,0,0,0.1)}.progress .bar{float:left;width:0;height:100%;font-size:12px;color:#fff;text-align:center;text-shadow:0 -1px 0 
rgba(0,0,0,0.25);background-color:#0e90d2;background-image:-moz-linear-gradient(top,#149bdf,#0480be);background-image:-webkit-gradient(linear,0 0,0 100%,from(#149bdf),to(#0480be));background-image:-webkit-linear-gradient(top,#149bdf,#0480be);background-image:-o-linear-gradient(top,#149bdf,#0480be);background-image:linear-gradient(to bottom,#149bdf,#0480be);background-repeat:repeat-x;filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ff149bdf',endColorstr='#ff0480be',GradientType=0);-webkit-box-shadow:inset 0 -1px 0 rgba(0,0,0,0.15);-moz-box-shadow:inset 0 -1px 0 rgba(0,0,0,0.15);box-shadow:inset 0 -1px 0 rgba(0,0,0,0.15);-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box;-webkit-transition:width .6s ease;-moz-transition:width .6s ease;-o-transition:width .6s ease;transition:width .6s ease}.progress .bar+.bar{-webkit-box-shadow:inset 1px 0 0 rgba(0,0,0,0.15),inset 0 -1px 0 rgba(0,0,0,0.15);-moz-box-shadow:inset 1px 0 0 rgba(0,0,0,0.15),inset 0 -1px 0 rgba(0,0,0,0.15);box-shadow:inset 1px 0 0 rgba(0,0,0,0.15),inset 0 -1px 0 rgba(0,0,0,0.15)}.progress-striped .bar{background-color:#149bdf;background-image:-webkit-gradient(linear,0 100%,100% 0,color-stop(0.25,rgba(255,255,255,0.15)),color-stop(0.25,transparent),color-stop(0.5,transparent),color-stop(0.5,rgba(255,255,255,0.15)),color-stop(0.75,rgba(255,255,255,0.15)),color-stop(0.75,transparent),to(transparent));background-image:-webkit-linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);background-image:-moz-linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);background-image:-o-linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);background-image:linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);-webkit-background-size:40px 40px;-moz-background-size:40px 40px;-o-background-size:40px 40px;background-size:40px 40px}.progress.active .bar{-webkit-animation:progress-bar-stripes 2s linear infinite;-moz-animation:progress-bar-stripes 2s linear infinite;-ms-animation:progress-bar-stripes 2s linear infinite;-o-animation:progress-bar-stripes 2s linear infinite;animation:progress-bar-stripes 2s linear infinite}.progress-danger .bar,.progress .bar-danger{background-color:#dd514c;background-image:-moz-linear-gradient(top,#ee5f5b,#c43c35);background-image:-webkit-gradient(linear,0 0,0 100%,from(#ee5f5b),to(#c43c35));background-image:-webkit-linear-gradient(top,#ee5f5b,#c43c35);background-image:-o-linear-gradient(top,#ee5f5b,#c43c35);background-image:linear-gradient(to bottom,#ee5f5b,#c43c35);background-repeat:repeat-x;filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ffee5f5b',endColorstr='#ffc43c35',GradientType=0)}.progress-danger.progress-striped .bar,.progress-striped .bar-danger{background-color:#ee5f5b;background-image:-webkit-gradient(linear,0 100%,100% 0,color-stop(0.25,rgba(255,255,255,0.15)),color-stop(0.25,transparent),color-stop(0.5,transparent),color-stop(0.5,rgba(255,255,255,0.15)),color-stop(0.75,rgba(255,255,255,0.15)),color-stop(0.75,transparent),to(transparent));background-image:-webkit-linear-gradient(45deg,rgba(255,255,255,0.15) 
25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);background-image:-moz-linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);background-image:-o-linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);background-image:linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent)}.progress-success .bar,.progress .bar-success{background-color:#5eb95e;background-image:-moz-linear-gradient(top,#62c462,#57a957);background-image:-webkit-gradient(linear,0 0,0 100%,from(#62c462),to(#57a957));background-image:-webkit-linear-gradient(top,#62c462,#57a957);background-image:-o-linear-gradient(top,#62c462,#57a957);background-image:linear-gradient(to bottom,#62c462,#57a957);background-repeat:repeat-x;filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ff62c462',endColorstr='#ff57a957',GradientType=0)}.progress-success.progress-striped .bar,.progress-striped .bar-success{background-color:#62c462;background-image:-webkit-gradient(linear,0 100%,100% 0,color-stop(0.25,rgba(255,255,255,0.15)),color-stop(0.25,transparent),color-stop(0.5,transparent),color-stop(0.5,rgba(255,255,255,0.15)),color-stop(0.75,rgba(255,255,255,0.15)),color-stop(0.75,transparent),to(transparent));background-image:-webkit-linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);background-image:-moz-linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);background-image:-o-linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);background-image:linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent)}.progress-info .bar,.progress .bar-info{background-color:#4bb1cf;background-image:-moz-linear-gradient(top,#5bc0de,#339bb9);background-image:-webkit-gradient(linear,0 0,0 100%,from(#5bc0de),to(#339bb9));background-image:-webkit-linear-gradient(top,#5bc0de,#339bb9);background-image:-o-linear-gradient(top,#5bc0de,#339bb9);background-image:linear-gradient(to bottom,#5bc0de,#339bb9);background-repeat:repeat-x;filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#ff5bc0de',endColorstr='#ff339bb9',GradientType=0)}.progress-info.progress-striped .bar,.progress-striped .bar-info{background-color:#5bc0de;background-image:-webkit-gradient(linear,0 100%,100% 0,color-stop(0.25,rgba(255,255,255,0.15)),color-stop(0.25,transparent),color-stop(0.5,transparent),color-stop(0.5,rgba(255,255,255,0.15)),color-stop(0.75,rgba(255,255,255,0.15)),color-stop(0.75,transparent),to(transparent));background-image:-webkit-linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);background-image:-moz-linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 
75%,transparent 75%,transparent);background-image:-o-linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);background-image:linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent)}.progress-warning .bar,.progress .bar-warning{background-color:#faa732;background-image:-moz-linear-gradient(top,#fbb450,#f89406);background-image:-webkit-gradient(linear,0 0,0 100%,from(#fbb450),to(#f89406));background-image:-webkit-linear-gradient(top,#fbb450,#f89406);background-image:-o-linear-gradient(top,#fbb450,#f89406);background-image:linear-gradient(to bottom,#fbb450,#f89406);background-repeat:repeat-x;filter:progid:DXImageTransform.Microsoft.gradient(startColorstr='#fffbb450',endColorstr='#fff89406',GradientType=0)}.progress-warning.progress-striped .bar,.progress-striped .bar-warning{background-color:#fbb450;background-image:-webkit-gradient(linear,0 100%,100% 0,color-stop(0.25,rgba(255,255,255,0.15)),color-stop(0.25,transparent),color-stop(0.5,transparent),color-stop(0.5,rgba(255,255,255,0.15)),color-stop(0.75,rgba(255,255,255,0.15)),color-stop(0.75,transparent),to(transparent));background-image:-webkit-linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);background-image:-moz-linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);background-image:-o-linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent);background-image:linear-gradient(45deg,rgba(255,255,255,0.15) 25%,transparent 25%,transparent 50%,rgba(255,255,255,0.15) 50%,rgba(255,255,255,0.15) 75%,transparent 75%,transparent)}.accordion{margin-bottom:20px}.accordion-group{margin-bottom:2px;border:1px solid #e5e5e5;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px}.accordion-heading{border-bottom:0}.accordion-heading .accordion-toggle{display:block;padding:8px 15px}.accordion-toggle{cursor:pointer}.accordion-inner{padding:9px 15px;border-top:1px solid #e5e5e5}.carousel{position:relative;margin-bottom:20px;line-height:1}.carousel-inner{position:relative;width:100%;overflow:hidden}.carousel-inner>.item{position:relative;display:none;-webkit-transition:.6s ease-in-out left;-moz-transition:.6s ease-in-out left;-o-transition:.6s ease-in-out left;transition:.6s ease-in-out left}.carousel-inner>.item>img,.carousel-inner>.item>a>img{display:block;line-height:1}.carousel-inner>.active,.carousel-inner>.next,.carousel-inner>.prev{display:block}.carousel-inner>.active{left:0}.carousel-inner>.next,.carousel-inner>.prev{position:absolute;top:0;width:100%}.carousel-inner>.next{left:100%}.carousel-inner>.prev{left:-100%}.carousel-inner>.next.left,.carousel-inner>.prev.right{left:0}.carousel-inner>.active.left{left:-100%}.carousel-inner>.active.right{left:100%}.carousel-control{position:absolute;top:40%;left:15px;width:40px;height:40px;margin-top:-20px;font-size:60px;font-weight:100;line-height:30px;color:#fff;text-align:center;background:#222;border:3px solid 
#fff;-webkit-border-radius:23px;-moz-border-radius:23px;border-radius:23px;opacity:.5;filter:alpha(opacity=50)}.carousel-control.right{right:15px;left:auto}.carousel-control:hover,.carousel-control:focus{color:#fff;text-decoration:none;opacity:.9;filter:alpha(opacity=90)}.carousel-indicators{position:absolute;top:15px;right:15px;z-index:5;margin:0;list-style:none}.carousel-indicators li{display:block;float:left;width:10px;height:10px;margin-left:5px;text-indent:-999px;background-color:#ccc;background-color:rgba(255,255,255,0.25);border-radius:5px}.carousel-indicators .active{background-color:#fff}.carousel-caption{position:absolute;right:0;bottom:0;left:0;padding:15px;background:#333;background:rgba(0,0,0,0.75)}.carousel-caption h4,.carousel-caption p{line-height:20px;color:#fff}.carousel-caption h4{margin:0 0 5px}.carousel-caption p{margin-bottom:0}.hero-unit{padding:60px;margin-bottom:30px;font-size:18px;font-weight:200;line-height:30px;color:inherit;background-color:#eee;-webkit-border-radius:6px;-moz-border-radius:6px;border-radius:6px}.hero-unit h1{margin-bottom:0;font-size:60px;line-height:1;letter-spacing:-1px;color:inherit}.hero-unit li{line-height:30px}.pull-right{float:right}.pull-left{float:left}.hide{display:none}.show{display:block}.invisible{visibility:hidden}.affix{position:fixed}
diff --git a/bitbake/lib/toaster/toastergui/static/css/default.css b/bitbake/lib/toaster/toastergui/static/css/default.css
new file mode 100644
index 0000000..cce3e31
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/default.css
@@ -0,0 +1,330 @@
+/* Style the Yocto Project logo */
+.logo img { height: 30px; width: auto !important; }
+.logo { padding-top: 4px !important; padding-bottom:0px !important; }
+
+/* Style the version information */
+.brand > a { color: #777; }
+.brand > a:hover { color: #999; text-decoration: none; }
+.icon-info-sign { color: #777; font-size: 16px; margin-left: 5px;}
+.icon-info-sign:hover { color: #999; cursor: pointer; }
+
+/* Style the breadcrumb */
+.breadcrumb { display: inline-block; background-color: transparent; }
+.breadcrumb li:first-child { padding-right: 10px; }
+
+/* Styles for the help information */
+.get-help { color: #CCCCCC; }
+.get-help:hover, .icon-plus-sign:hover { color: #999999; cursor: pointer; }
+.get-help-blue { color: #3A87AD; }
+.get-help-blue:hover { color: #005580; cursor: pointer; }
+.get-help-yellow { color: #C09853; }
+.get-help-yellow:hover { color: #B38942; cursor: pointer; }
+.get-help-red { color: #B94A48; font-size: 16px; padding-left: 2px; }
+.get-help-red:hover { color: #943A38; cursor: pointer; }
+.build-form>i:first-of-type { margin-left: 5px; }
+.manual { margin: 11px 15px 0 11px;}
+.heading-help { font-size: 14px; }
+
+/* Styles for the external link */
+.get-info { color: #0088CC; }
+.get-info:hover { color: #005580; cursor: pointer; text-decoration: none; }
+
+/* Styles for code and pre tags */
+code { background-color: transparent; border: none; color: #333333; }
+dd code, .alert code { white-space: pre-wrap; word-break: break-all; word-wrap: break-word; }
+.alert-warning code, .alert-warning pre { background-color: transparent; border: none; color: #C09853; margin-bottom: 0px; }
+.alert-error code { background-color: transparent; border: none; color: #B94A48; margin-bottom:0px; }
+.alert-error pre { background-color: transparent; border: none; color: #B94A48; word-break: normal; margin-bottom: 0px; }
+.alert-warning pre { word-break: normal; }
+.alert-info a { font-weight: 300; }
+.alert-info code { color: #3A87AD; }
+.tooltip code { background-color: transparent; color: #FFFFFF; font-weight: normal; border: none; font-size: 1em; }
+
+/* Style for definition lists */
+dd ul { list-style-type: none; margin: 0px; }
+dt, dd {line-height: 25px; }
+dd li { line-height: 25px; }
+.item-info dd { line-height: 20px; margin-bottom: 10px; }
+
+/* Style the filter modal dialogs */
+.modal { width: 800px; margin-left: -400px; }
+.modal-footer .btn { float: left; }
+
+/* Hover style for the clear search icon */
+.icon-remove-sign:hover { color: #999999; cursor: pointer; }
+
+/* Some extra space before headings when needed */
+.details { margin-top: 30px; }
+.air { margin-top: 30px; }
+
+/* Required classes for the highlight behaviour in tables */
+.highlight { -webkit-animation: target-fade 10s 1; -moz-animation: target-fade 10s 1; animation: target-fade 10s 1; }
+@-webkit-keyframes target-fade { 0% { background-color: #D9EDF7; } 25% { background-color: #D9EDF7; } 100% { background-color: white; } }
+@-moz-keyframes target-fade { 0% { background-color: #D9EDF7; } 25% { background-color: #D9EDF7; } 100% { background-color: white; } }
+@keyframes target-fade { 0% { background-color: #D9EDF7; } 25% { background-color: #D9EDF7; } 100% { background-color: white; } }
+
+/* This makes tooltips work inside modal dialogs */
+.tooltip { z-index: 2000 !important; }
+
+/* Override default Twitter Bootstrap styles for anchor tags inside tables */
+td a, td a > code { color: #333333; }
+td code { white-space: normal; }
+td a:hover, td a > code:hover { color: #000000; text-decoration: underline; }
+
+/* Override default Twitter Bootstrap styles for tr.error */
+.table tbody tr.error > td { background-color: transparent; } /* override default Bootstrap behaviour */
+.table-hover tbody tr.error:hover > td { background-color: #F5F5F5;} /* override default Bootstrap behaviour */
+
+/* Right justify Bootstrap table columns for size fields */
+.table .sizecol { text-align: right; }
+
+/* Set error, warning, success and muted styles */
+.error, .red, td.error a, tr.error a { color: #b94a48; }
+a.error:hover, a.error:focus, tr.error a:hover { color: #943A38; text-decoration: underline; }
+.warning, .yellow { color: #c09853;}
+a.warning { background-color: transparent; }
+a.warning:hover, a.warning:focus { color: #B38942; text-decoration: underline; }
+.success, .green { color: #468847;}
+.success:hover { color: #347132; text-decoration: underline; }
+td > .success:hover { text-decoration: underline; }
+.muted a { color:#999999; }
+.muted a:hover { color:#999999; }
+
+/* Sorting functionality styles for table headings */
+.sorted { color: #333333; font-weight: bold; }
+.sorted:hover { color: #000000; text-decoration: underline; }
+th > a, th > span { font-weight: normal; }
+
+/* Force long strings like commit hashes to wrap */
+.iscommit { white-space: pre-wrap; word-break: break-all; word-wrap: break-word;}
+
+/* Make the popovers scrollable if they are too long */
+.popover-content { max-height: 30em; overflow-y: scroll; }
+
+/* Styles for the directory structure table. We probably won't use these in production */
+.one { padding-left: 18px !important; }
+.two { padding-left: 36px !important; }
+.three { padding-left: 54px !important; }
+.content-directory a { color: #0088CC; }
+.content-directory a:hover { color: #005580; text-decoration: underline; }
+.symlink { color: #CCCCCC; }
+
+/* Styles for the navbar actions */
+.btn-group + .btn-group { margin-right: 10px; }
+.navbar-inner > .btn-group { margin-top: 6px; }
+
+/* Styles for the parent item in the left navigation */
+
+.nav > li > a.nav-parent { font-size: 18px; line-height: 25px; }
+
+/* Other styles */
+.dropdown-menu { padding: 10px; }
+select { width: auto; }
+.page-header { color: #5A5A5A; }
+.top-air { margin-top: 40px;}
+.progress { margin-bottom: 0px; }
+.lead .badge { font-size: 18px; font-weight: normal; border-radius: 15px; padding: 9px; }
+.lead ol, .lead ul { padding: 10px 0 0 20px; }
+.lead ol > li, .lead ul > li {
+ line-height: 35px;
+}
+.well > .lead, .alert .lead { margin-bottom: 0px; }
+.well-transparent { background-color: transparent; }
+.no-results { margin: 10px 0; }
+.task-name { margin-left: 7px; }
+.icon-hand-right {color: #CCCCCC; }
+.help-inline { margin: 5px; }
+.dashboard-section { background-color: transparent; }
+
+/* styles for landing page - analysis mode */
+.hero-unit { margin: 20px 0 30px; }
+.hero-unit > .close { font-size:40px; }
+.hero-actions { margin-top: 30px; }
+
+/* styles for landing page - build mode */
+.hero-unit p { line-height: 25px; }
+.hero-unit p, .hero-unit .btn-large { margin-top: 15px; }
+.hero-unit ul { margin-top: 20px; }
+.hero-unit li { line-height: 30px; }
+.hero-unit img { background-color: #eee; margin-top: 15px; }
+
+/* make tables Chrome-happy (me, not so much) */
+table { table-layout: fixed; word-wrap: break-word; }
+
+/* styles for the new build button */
+.new-build .btn-primary { padding: 4px 30px; }
+.new-build .alert { margin-top: 10px; }
+.new-build .alert p { margin-top: 10px; }
+
+/* styles for showing the project name in build mode */
+.project-name { padding-top: 0; }
+.project-name .label { font-weight: normal; margin-bottom: 5px; margin-left: -15px; padding: 5px; }
+.project-name .label > a { color: #fff; font-weight: normal; }
+
+/* Remove bottom margin for forms inside modal dialogs */
+#dependencies-modal-form { margin-bottom: 0px; }
+
+/* Configuration styles */
+.icon-trash { color: #B94A48; font-size: 16px; padding-left: 5px; }
+.icon-trash:hover { color: #943A38; text-decoration: none; cursor: pointer; }
+.icon-pencil, .icon-download-alt, .icon-refresh, .icon-star-empty, .icon-star { font-size: 16px; color: #0088CC; padding-left: 2px; }
+.icon-pencil:hover, .icon-download-alt:hover, .icon-refresh:hover, .icon-star-empty:hover, .icon-star:hover, .icon-tasks:hover { color: #005580; text-decoration: none; cursor: pointer; }
+.icon-share { padding-left: 2px; }
+.alert-success .icon-refresh, .alert-success .icon-tasks { color: #468847; }
+.alert-success .icon-refresh:hover, .alert-success .icon-tasks:hover { color: #347132; }
+.alert-error .icon-refresh, .alert-error .icon-tasks { color: #b94a48; }
+.alert-error .icon-refresh:hover, .alert-error .icon-tasks:hover { color: #943A38; }
+.configuration-list li, .configuration-list label { line-height: 35px; font-size: 21px; font-weight: 200; margin-bottom: 0px;}
+.configuration-list { font-size: 16px; margin-bottom: 1.5em; }
+.configuration-list i { font-size: 16px; }
+/*.configuration-layers { height: 135px; overflow: scroll; }*/
+.counter { font-weight: normal; }
+.well-alert { background-color: #FCF8E3; border: 1px solid #FBEED5; border-radius: 4px; }
+.well-alert > .lead { color: #C09853; padding-bottom: .75em; }
+.configuration-alert { margin-bottom: 0px; padding: 8px 14px; }
+.configuration-alert p { margin-bottom: 0px; }
+.project-form { margin-top: 10px; }
+.add-layers .btn-block + .btn-block, .build .btn-block + .btn-block { margin-top: 0px; }
+input.huge { font-size: 17.5px; padding: 11px 19px; }
+.build-form { margin-bottom: 0px; }
+.build-form .input-append { margin-bottom: 0px; }
+.build-form .btn-large { padding: 11px 35px; }
+.build-form p { font-size:17.5px ;margin:12px 0 0 10px;}
+#layer-container form, #target-container form { margin-bottom: 0px; }
+.btn-primary .icon-question-sign, .btn-danger .icon-question-sign { color: #fff; }
+.btn-primary .icon-question-sign:hover, .btn-danger .icon-question-sign:hover { color: #999; }
+a code { color: #0088CC; }
+a code:hover { color: #005580; }
+.localconf { font-size: 17.5px; margin-top: 40px; }
+.localconf code { font-size: 17.5px; }
+#add-layer-dependencies { margin-top: 5px; }
+.link-action { font-size: 17.5px; margin-top: 40px; }
+.link-action code { font-size: 17.5px; }
+.artifact { width: 9em; }
+.control-group { margin-bottom: 0px; }
+#project-details form { margin: 0px; }
+dd form { margin: 10px 0 0 0; }
+dl textarea { resize: vertical; }
+.navbar-fixed-top { z-index: 1; }
+.popover { z-index: 2; }
+.btn-danger .icon-trash { color: #fff; }
+.bbappends { list-style-type: none; margin-left: 0; }
+.bbappends li { line-height: 25px; }
+.configuration-list input[type="checkbox"] { margin-top:13px;margin-right:10px; }
+.alert input[type="checkbox"] { margin-top: 0px; margin-right: 3px; }
+.alert ol { padding: 10px 0px 0px 20px; }
+.alert ol > li { line-height: 35px; }
+.dl-vertical form { margin-top: 10px; }
+.scrolling { border: 1px solid #dddddd; height: 154px; overflow: auto; padding: 8px; width: 27.5%; margin-bottom: 10px; }
+.lead .help-block { font-size: 14px; line-height: 20px; font-weight: normal; }
+.button-place .btn { margin: 0 0 20px 0; }
+.tooltip-inner { max-width: 250px; }
+.new-build { padding: 20px; }
+.new-build li { line-height: 30px; }
+.new-build li .alert { line-height: 20px; width: 200px; white-space: normal; }
+.new-build h6 { margin: 10px 0 0 0; color: #5a5a5a; }
+.new-build h3 { margin: 0; color: #5a5a5a; }
+.new-build form { margin: 5px 0 0; }
+.new-build .input-append { margin-bottom: 0; }
+#build-selected { margin-top: 15px; }
+div.add-deps { margin-top: 15px; }
+.btn.log { margin-left: 20px; }
+
+
+.animate-repeat {
+ list-style:none;
+ box-sizing:border-box;
+}
+
+.animate-repeat.ng-move,
+.animate-repeat.ng-enter,
+.animate-repeat.ng-leave {
+ -webkit-transition:all linear 0.5s;
+ transition:all linear 0.5s;
+}
+
+.animate-repeat.ng-leave.ng-leave-active,
+.animate-repeat.ng-move,
+.animate-repeat.ng-enter {
+ opacity:0;
+}
+
+.animate-repeat.ng-leave,
+.animate-repeat.ng-enter.ng-enter-active {
+ opacity:1;
+}
+
+.tab-pane table { margin-top: 10px; }
+
+thead .description, .get_description_or_summary { width: 364px; }
+thead .add-del-layers { width: 124px; }
+
+#loading-notification {
+ position: fixed;
+ z-index: 101;
+ top: 3%;
+ left: 40%;
+ right: 40%;
+ -webkit-box-shadow: 0 0 10px #c09853;
+ -moz-box-shadow: 0 0 10px #c09853;
+ box-shadow: 0 0 10px #c09853;
+}
+
+#change-notification {
+ position: fixed;
+ z-index: 101;
+ top: 3%;
+ left: 20%;
+ right: 20%;
+ -webkit-box-shadow: 0 0 10px #3a87ad;
+ -moz-box-shadow: 0 0 10px #3a87ad;
+ box-shadow: 0 0 10px #3a87ad;
+}
+
+/* Copied in from newer version of Font-Awesome 4.3.0 */
+.fa-spin {
+ -webkit-animation: fa-spin 2s infinite linear;
+ animation: fa-spin 2s infinite linear;
+ display: inline-block;
+}
+.fa-pulse {
+ -webkit-animation: fa-spin 1s infinite steps(8);
+ animation: fa-spin 1s infinite steps(8);
+ display: inline-block;
+}
+
+@-webkit-keyframes fa-spin {
+ 0% {
+ -webkit-transform: rotate(0deg);
+ transform: rotate(0deg);
+ }
+ 100% {
+ -webkit-transform: rotate(359deg);
+ transform: rotate(359deg);
+ }
+}
+
+@keyframes fa-spin {
+ 0% {
+ -webkit-transform: rotate(0deg);
+ -moz-transform: rotate(0deg);
+ transform: rotate(0deg);
+ }
+ 100% {
+ -webkit-transform: rotate(359deg);
+ -moz-transform: rotate(359deg);
+ transform: rotate(359deg);
+ }
+}
+/* End copied in from newer version of Font-Awesome 4.3.0 */
+
+.top-padded {
+ padding-top: 60px;
+}
+
+input.input-lg {
+ font-size: 18px;
+ height: 22px;
+ line-height: 1.33333;
+ padding: 10px 16px;
+}
diff --git a/bitbake/lib/toaster/toastergui/static/css/font-awesome.min.css b/bitbake/lib/toaster/toastergui/static/css/font-awesome.min.css
new file mode 100755
index 0000000..fa15fd5
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/font-awesome.min.css
@@ -0,0 +1,33 @@
+/*!
+ * Font Awesome 3.0.2
+ * the iconic font designed for use with Twitter Bootstrap
+ * -------------------------------------------------------
+ * The full suite of pictographic icons, examples, and documentation
+ * can be found at: http://fortawesome.github.com/Font-Awesome/
+ *
+ * License
+ * -------------------------------------------------------
+ * - The Font Awesome font is licensed under the SIL Open Font License - http://scripts.sil.org/OFL
+ * - Font Awesome CSS, LESS, and SASS files are licensed under the MIT License -
+ * http://opensource.org/licenses/mit-license.html
+ * - The Font Awesome pictograms are licensed under the CC BY 3.0 License - http://creativecommons.org/licenses/by/3.0/
+ * - Attribution is no longer required in Font Awesome 3.0, but much appreciated:
+ * "Font Awesome by Dave Gandy - http://fortawesome.github.com/Font-Awesome"
+
+ * Contact
+ * -------------------------------------------------------
+ * Email: dave@davegandy.com
+ * Twitter: http://twitter.com/fortaweso_me
+ * Work: Lead Product Designer @ http://kyruus.com
+ */
+
+@font-face{
+ font-family:'FontAwesome';
+ src:url('../fonts/fontawesome-webfont.eot?v=3.0.1');
+ src:url('../fonts/fontawesome-webfont.eot?#iefix&v=3.0.1') format('embedded-opentype'),
+ url('../fonts/fontawesome-webfont.woff?v=3.0.1') format('woff'),
+ url('../fonts/fontawesome-webfont.ttf?v=3.0.1') format('truetype');
+ font-weight:normal;
+ font-style:normal }
+
+[class^="icon-"],[class*=" icon-"]{font-family:FontAwesome;font-weight:normal;font-style:normal;text-decoration:inherit;-webkit-font-smoothing:antialiased;display:inline;width:auto;height:auto;line-height:normal;vertical-align:baseline;background-image:none;background-position:0 0;background-repeat:repeat;margin-top:0}.icon-white,.nav-pills>.active>a>[class^="icon-"],.nav-pills>.active>a>[class*=" icon-"],.nav-list>.active>a>[class^="icon-"],.nav-list>.active>a>[class*=" icon-"],.navbar-inverse .nav>.active>a>[class^="icon-"],.navbar-inverse .nav>.active>a>[class*=" icon-"],.dropdown-menu>li>a:hover>[class^="icon-"],.dropdown-menu>li>a:hover>[class*=" icon-"],.dropdown-menu>.active>a>[class^="icon-"],.dropdown-menu>.active>a>[class*=" icon-"],.dropdown-submenu:hover>a>[class^="icon-"],.dropdown-submenu:hover>a>[class*=" icon-"]{background-image:none}[class^="icon-"]:before,[class*=" icon-"]:before{text-decoration:inherit;display:inline-block;speak:none}a [class^="icon-"],a [class*=" icon-"]{display:inline-block}.icon-large:before{vertical-align:-10%;font-size:1.3333333333333333em}.btn [class^="icon-"],.nav [class^="icon-"],.btn [class*=" icon-"],.nav [class*=" icon-"]{display:inline}.btn [class^="icon-"].icon-large,.nav [class^="icon-"].icon-large,.btn [class*=" icon-"].icon-large,.nav [class*=" icon-"].icon-large{line-height:.9em}.btn [class^="icon-"].icon-spin,.nav [class^="icon-"].icon-spin,.btn [class*=" icon-"].icon-spin,.nav [class*=" icon-"].icon-spin{display:inline-block}.nav-tabs [class^="icon-"],.nav-pills [class^="icon-"],.nav-tabs [class*=" icon-"],.nav-pills [class*=" icon-"],.nav-tabs [class^="icon-"].icon-large,.nav-pills [class^="icon-"].icon-large,.nav-tabs [class*=" icon-"].icon-large,.nav-pills [class*=" icon-"].icon-large{line-height:.9em}li [class^="icon-"],.nav li [class^="icon-"],li [class*=" icon-"],.nav li [class*=" icon-"]{display:inline-block;width:1.25em;text-align:center}li [class^="icon-"].icon-large,.nav li [class^="icon-"].icon-large,li [class*=" icon-"].icon-large,.nav li [class*=" icon-"].icon-large{width:1.5625em}ul.icons{list-style-type:none;text-indent:-0.75em}ul.icons li [class^="icon-"],ul.icons li [class*=" icon-"]{width:.75em}.icon-muted{color:#eee}.icon-border{border:solid 1px #eee;padding:.2em .25em .15em;-webkit-border-radius:3px;-moz-border-radius:3px;border-radius:3px}.icon-2x{font-size:2em}.icon-2x.icon-border{border-width:2px;-webkit-border-radius:4px;-moz-border-radius:4px;border-radius:4px}.icon-3x{font-size:3em}.icon-3x.icon-border{border-width:3px;-webkit-border-radius:5px;-moz-border-radius:5px;border-radius:5px}.icon-4x{font-size:4em}.icon-4x.icon-border{border-width:4px;-webkit-border-radius:6px;-moz-border-radius:6px;border-radius:6px}.pull-right{float:right}.pull-left{float:left}[class^="icon-"].pull-left,[class*=" icon-"].pull-left{margin-right:.3em}[class^="icon-"].pull-right,[class*=" icon-"].pull-right{margin-left:.3em}.btn [class^="icon-"].pull-left.icon-2x,.btn [class*=" icon-"].pull-left.icon-2x,.btn [class^="icon-"].pull-right.icon-2x,.btn [class*=" icon-"].pull-right.icon-2x{margin-top:.18em}.btn [class^="icon-"].icon-spin.icon-large,.btn [class*=" icon-"].icon-spin.icon-large{line-height:.8em}.btn.btn-small [class^="icon-"].pull-left.icon-2x,.btn.btn-small [class*=" icon-"].pull-left.icon-2x,.btn.btn-small [class^="icon-"].pull-right.icon-2x,.btn.btn-small [class*=" icon-"].pull-right.icon-2x{margin-top:.25em}.btn.btn-large [class^="icon-"],.btn.btn-large [class*=" icon-"]{margin-top:0}.btn.btn-large 
[class^="icon-"].pull-left.icon-2x,.btn.btn-large [class*=" icon-"].pull-left.icon-2x,.btn.btn-large [class^="icon-"].pull-right.icon-2x,.btn.btn-large [class*=" icon-"].pull-right.icon-2x{margin-top:.05em}.btn.btn-large [class^="icon-"].pull-left.icon-2x,.btn.btn-large [class*=" icon-"].pull-left.icon-2x{margin-right:.2em}.btn.btn-large [class^="icon-"].pull-right.icon-2x,.btn.btn-large [class*=" icon-"].pull-right.icon-2x{margin-left:.2em}.icon-spin{display:inline-block;-moz-animation:spin 2s infinite linear;-o-animation:spin 2s infinite linear;-webkit-animation:spin 2s infinite linear;animation:spin 2s infinite linear}@-moz-keyframes spin{0%{-moz-transform:rotate(0deg)}100%{-moz-transform:rotate(359deg)}}@-webkit-keyframes spin{0%{-webkit-transform:rotate(0deg)}100%{-webkit-transform:rotate(359deg)}}@-o-keyframes spin{0%{-o-transform:rotate(0deg)}100%{-o-transform:rotate(359deg)}}@-ms-keyframes spin{0%{-ms-transform:rotate(0deg)}100%{-ms-transform:rotate(359deg)}}@keyframes spin{0%{transform:rotate(0deg)}100%{transform:rotate(359deg)}}@-moz-document url-prefix(){.icon-spin{height:.9em}.btn .icon-spin{height:auto}.icon-spin.icon-large{height:1.25em}.btn .icon-spin.icon-large{height:.75em}}.icon-glass:before{content:"\f000"}.icon-music:before{content:"\f001"}.icon-search:before{content:"\f002"}.icon-envelope:before{content:"\f003"}.icon-heart:before{content:"\f004"}.icon-star:before{content:"\f005"}.icon-star-empty:before{content:"\f006"}.icon-user:before{content:"\f007"}.icon-film:before{content:"\f008"}.icon-th-large:before{content:"\f009"}.icon-th:before{content:"\f00a"}.icon-th-list:before{content:"\f00b"}.icon-ok:before{content:"\f00c"}.icon-remove:before{content:"\f00d"}.icon-zoom-in:before{content:"\f00e"}.icon-zoom-out:before{content:"\f010"}.icon-off:before{content:"\f011"}.icon-signal:before{content:"\f012"}.icon-cog:before{content:"\f013"}.icon-trash:before{content:"\f014"}.icon-home:before{content:"\f015"}.icon-file:before{content:"\f016"}.icon-time:before{content:"\f017"}.icon-road:before{content:"\f018"}.icon-download-alt:before{content:"\f019"}.icon-download:before{content:"\f01a"}.icon-upload:before{content:"\f01b"}.icon-inbox:before{content:"\f01c"}.icon-play-circle:before{content:"\f01d"}.icon-repeat:before{content:"\f01e"}.icon-refresh:before{content:"\f021"}.icon-list-alt:before{content:"\f022"}.icon-lock:before{content:"\f023"}.icon-flag:before{content:"\f024"}.icon-headphones:before{content:"\f025"}.icon-volume-off:before{content:"\f026"}.icon-volume-down:before{content:"\f027"}.icon-volume-up:before{content:"\f028"}.icon-qrcode:before{content:"\f029"}.icon-barcode:before{content:"\f02a"}.icon-tag:before{content:"\f02b"}.icon-tags:before{content:"\f02c"}.icon-book:before{content:"\f02d"}.icon-bookmark:before{content:"\f02e"}.icon-print:before{content:"\f02f"}.icon-camera:before{content:"\f030"}.icon-font:before{content:"\f031"}.icon-bold:before{content:"\f032"}.icon-italic:before{content:"\f033"}.icon-text-height:before{content:"\f034"}.icon-text-width:before{content:"\f035"}.icon-align-left:before{content:"\f036"}.icon-align-center:before{content:"\f037"}.icon-align-right:before{content:"\f038"}.icon-align-justify:before{content:"\f039"}.icon-list:before{content:"\f03a"}.icon-indent-left:before{content:"\f03b"}.icon-indent-right:before{content:"\f03c"}.icon-facetime-video:before{content:"\f03d"}.icon-picture:before{content:"\f03e"}.icon-pencil:before{content:"\f040"}.icon-map-marker:before{content:"\f041"}.icon-adjust:before{content:"\f042"}.icon-tint:before{content:
"\f043"}.icon-edit:before{content:"\f044"}.icon-share:before{content:"\f045"}.icon-check:before{content:"\f046"}.icon-move:before{content:"\f047"}.icon-step-backward:before{content:"\f048"}.icon-fast-backward:before{content:"\f049"}.icon-backward:before{content:"\f04a"}.icon-play:before{content:"\f04b"}.icon-pause:before{content:"\f04c"}.icon-stop:before{content:"\f04d"}.icon-forward:before{content:"\f04e"}.icon-fast-forward:before{content:"\f050"}.icon-step-forward:before{content:"\f051"}.icon-eject:before{content:"\f052"}.icon-chevron-left:before{content:"\f053"}.icon-chevron-right:before{content:"\f054"}.icon-plus-sign:before{content:"\f055"}.icon-minus-sign:before{content:"\f056"}.icon-remove-sign:before{content:"\f057"}.icon-ok-sign:before{content:"\f058"}.icon-question-sign:before{content:"\f059"}.icon-info-sign:before{content:"\f05a"}.icon-screenshot:before{content:"\f05b"}.icon-remove-circle:before{content:"\f05c"}.icon-ok-circle:before{content:"\f05d"}.icon-ban-circle:before{content:"\f05e"}.icon-arrow-left:before{content:"\f060"}.icon-arrow-right:before{content:"\f061"}.icon-arrow-up:before{content:"\f062"}.icon-arrow-down:before{content:"\f063"}.icon-share-alt:before{content:"\f064"}.icon-resize-full:before{content:"\f065"}.icon-resize-small:before{content:"\f066"}.icon-plus:before{content:"\f067"}.icon-minus:before{content:"\f068"}.icon-asterisk:before{content:"\f069"}.icon-exclamation-sign:before{content:"\f06a"}.icon-gift:before{content:"\f06b"}.icon-leaf:before{content:"\f06c"}.icon-fire:before{content:"\f06d"}.icon-eye-open:before{content:"\f06e"}.icon-eye-close:before{content:"\f070"}.icon-warning-sign:before{content:"\f071"}.icon-plane:before{content:"\f072"}.icon-calendar:before{content:"\f073"}.icon-random:before{content:"\f074"}.icon-comment:before{content:"\f075"}.icon-magnet:before{content:"\f076"}.icon-chevron-up:before{content:"\f077"}.icon-chevron-down:before{content:"\f078"}.icon-retweet:before{content:"\f079"}.icon-shopping-cart:before{content:"\f07a"}.icon-folder-close:before{content:"\f07b"}.icon-folder-open:before{content:"\f07c"}.icon-resize-vertical:before{content:"\f07d"}.icon-resize-horizontal:before{content:"\f07e"}.icon-bar-chart:before{content:"\f080"}.icon-twitter-sign:before{content:"\f081"}.icon-facebook-sign:before{content:"\f082"}.icon-camera-retro:before{content:"\f083"}.icon-key:before{content:"\f084"}.icon-cogs:before{content:"\f085"}.icon-comments:before{content:"\f086"}.icon-thumbs-up:before{content:"\f087"}.icon-thumbs-down:before{content:"\f088"}.icon-star-half:before{content:"\f089"}.icon-heart-empty:before{content:"\f08a"}.icon-signout:before{content:"\f08b"}.icon-linkedin-sign:before{content:"\f08c"}.icon-pushpin:before{content:"\f08d"}.icon-external-link:before{content:"\f08e"}.icon-signin:before{content:"\f090"}.icon-trophy:before{content:"\f091"}.icon-github-sign:before{content:"\f092"}.icon-upload-alt:before{content:"\f093"}.icon-lemon:before{content:"\f094"}.icon-phone:before{content:"\f095"}.icon-check-empty:before{content:"\f096"}.icon-bookmark-empty:before{content:"\f097"}.icon-phone-sign:before{content:"\f098"}.icon-twitter:before{content:"\f099"}.icon-facebook:before{content:"\f09a"}.icon-github:before{content:"\f09b"}.icon-unlock:before{content:"\f09c"}.icon-credit-card:before{content:"\f09d"}.icon-rss:before{content:"\f09e"}.icon-hdd:before{content:"\f0a0"}.icon-bullhorn:before{content:"\f0a1"}.icon-bell:before{content:"\f0a2"}.icon-certificate:before{content:"\f0a3"}.icon-hand-right:before{content:"\f0a4"}.icon-hand-left:befo
re{content:"\f0a5"}.icon-hand-up:before{content:"\f0a6"}.icon-hand-down:before{content:"\f0a7"}.icon-circle-arrow-left:before{content:"\f0a8"}.icon-circle-arrow-right:before{content:"\f0a9"}.icon-circle-arrow-up:before{content:"\f0aa"}.icon-circle-arrow-down:before{content:"\f0ab"}.icon-globe:before{content:"\f0ac"}.icon-wrench:before{content:"\f0ad"}.icon-tasks:before{content:"\f0ae"}.icon-filter:before{content:"\f0b0"}.icon-briefcase:before{content:"\f0b1"}.icon-fullscreen:before{content:"\f0b2"}.icon-group:before{content:"\f0c0"}.icon-link:before{content:"\f0c1"}.icon-cloud:before{content:"\f0c2"}.icon-beaker:before{content:"\f0c3"}.icon-cut:before{content:"\f0c4"}.icon-copy:before{content:"\f0c5"}.icon-paper-clip:before{content:"\f0c6"}.icon-save:before{content:"\f0c7"}.icon-sign-blank:before{content:"\f0c8"}.icon-reorder:before{content:"\f0c9"}.icon-list-ul:before{content:"\f0ca"}.icon-list-ol:before{content:"\f0cb"}.icon-strikethrough:before{content:"\f0cc"}.icon-underline:before{content:"\f0cd"}.icon-table:before{content:"\f0ce"}.icon-magic:before{content:"\f0d0"}.icon-truck:before{content:"\f0d1"}.icon-pinterest:before{content:"\f0d2"}.icon-pinterest-sign:before{content:"\f0d3"}.icon-google-plus-sign:before{content:"\f0d4"}.icon-google-plus:before{content:"\f0d5"}.icon-money:before{content:"\f0d6"}.icon-caret-down:before{content:"\f0d7"}.icon-caret-up:before{content:"\f0d8"}.icon-caret-left:before{content:"\f0d9"}.icon-caret-right:before{content:"\f0da"}.icon-columns:before{content:"\f0db"}.icon-sort:before{content:"\f0dc"}.icon-sort-down:before{content:"\f0dd"}.icon-sort-up:before{content:"\f0de"}.icon-envelope-alt:before{content:"\f0e0"}.icon-linkedin:before{content:"\f0e1"}.icon-undo:before{content:"\f0e2"}.icon-legal:before{content:"\f0e3"}.icon-dashboard:before{content:"\f0e4"}.icon-comment-alt:before{content:"\f0e5"}.icon-comments-alt:before{content:"\f0e6"}.icon-bolt:before{content:"\f0e7"}.icon-sitemap:before{content:"\f0e8"}.icon-umbrella:before{content:"\f0e9"}.icon-paste:before{content:"\f0ea"}.icon-lightbulb:before{content:"\f0eb"}.icon-exchange:before{content:"\f0ec"}.icon-cloud-download:before{content:"\f0ed"}.icon-cloud-upload:before{content:"\f0ee"}.icon-user-md:before{content:"\f0f0"}.icon-stethoscope:before{content:"\f0f1"}.icon-suitcase:before{content:"\f0f2"}.icon-bell-alt:before{content:"\f0f3"}.icon-coffee:before{content:"\f0f4"}.icon-food:before{content:"\f0f5"}.icon-file-alt:before{content:"\f0f6"}.icon-building:before{content:"\f0f7"}.icon-hospital:before{content:"\f0f8"}.icon-ambulance:before{content:"\f0f9"}.icon-medkit:before{content:"\f0fa"}.icon-fighter-jet:before{content:"\f0fb"}.icon-beer:before{content:"\f0fc"}.icon-h-sign:before{content:"\f0fd"}.icon-plus-sign-alt:before{content:"\f0fe"}.icon-double-angle-left:before{content:"\f100"}.icon-double-angle-right:before{content:"\f101"}.icon-double-angle-up:before{content:"\f102"}.icon-double-angle-down:before{content:"\f103"}.icon-angle-left:before{content:"\f104"}.icon-angle-right:before{content:"\f105"}.icon-angle-up:before{content:"\f106"}.icon-angle-down:before{content:"\f107"}.icon-desktop:before{content:"\f108"}.icon-laptop:before{content:"\f109"}.icon-tablet:before{content:"\f10a"}.icon-mobile-phone:before{content:"\f10b"}.icon-circle-blank:before{content:"\f10c"}.icon-quote-left:before{content:"\f10d"}.icon-quote-right:before{content:"\f10e"}.icon-spinner:before{content:"\f110"}.icon-circle:before{content:"\f111"}.icon-reply:before{content:"\f112"}.icon-github-alt:before{content:"\f113"}.icon-fol
der-close-alt:before{content:"\f114"}.icon-folder-open-alt:before{content:"\f115"}
diff --git a/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_diagonals-thick_18_b81900_40x40.png b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_diagonals-thick_18_b81900_40x40.png
new file mode 100755
index 0000000..d5d78a3
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_diagonals-thick_18_b81900_40x40.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_diagonals-thick_20_666666_40x40.png b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_diagonals-thick_20_666666_40x40.png
new file mode 100755
index 0000000..1b0a045
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_diagonals-thick_20_666666_40x40.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_flat_10_000000_40x100.png b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_flat_10_000000_40x100.png
new file mode 100755
index 0000000..c55f8ef
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_flat_10_000000_40x100.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_glass_100_f6f6f6_1x400.png b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_glass_100_f6f6f6_1x400.png
new file mode 100755
index 0000000..ea4fa0f
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_glass_100_f6f6f6_1x400.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_glass_100_fdf5ce_1x400.png b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_glass_100_fdf5ce_1x400.png
new file mode 100755
index 0000000..2f59e2d
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_glass_100_fdf5ce_1x400.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_glass_65_ffffff_1x400.png b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_glass_65_ffffff_1x400.png
new file mode 100755
index 0000000..f172b30
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_glass_65_ffffff_1x400.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_gloss-wave_35_f6a828_500x100.png b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_gloss-wave_35_f6a828_500x100.png
new file mode 100644
index 0000000..1039d54
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_gloss-wave_35_f6a828_500x100.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_highlight-soft_100_eeeeee_1x100.png b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_highlight-soft_100_eeeeee_1x100.png
new file mode 100755
index 0000000..1089b70
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_highlight-soft_100_eeeeee_1x100.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_highlight-soft_75_ffe45c_1x100.png b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_highlight-soft_75_ffe45c_1x100.png
new file mode 100755
index 0000000..561b7f8
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/images/ui-bg_highlight-soft_75_ffe45c_1x100.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_222222_256x240.png b/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_222222_256x240.png
new file mode 100644
index 0000000..e9c8e16
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_222222_256x240.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_228ef1_256x240.png b/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_228ef1_256x240.png
new file mode 100644
index 0000000..8d68c54
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_228ef1_256x240.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_ef8c08_256x240.png b/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_ef8c08_256x240.png
new file mode 100644
index 0000000..18bbfe8
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_ef8c08_256x240.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_ffd27a_256x240.png b/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_ffd27a_256x240.png
new file mode 100644
index 0000000..4435b49
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_ffd27a_256x240.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_ffffff_256x240.png b/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_ffffff_256x240.png
new file mode 100644
index 0000000..4d66f59
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/images/ui-icons_ffffff_256x240.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/css/jquery-ui.min.css b/bitbake/lib/toaster/toastergui/static/css/jquery-ui.min.css
new file mode 100755
index 0000000..c486ec0
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/jquery-ui.min.css
@@ -0,0 +1,7 @@
+/*! jQuery UI - v1.11.4 - 2015-03-17
+* http://jqueryui.com
+* Includes: core.css, datepicker.css, theme.css
+* To view and modify this theme, visit http://jqueryui.com/themeroller/?ffDefault=Trebuchet%20MS%2CTahoma%2CVerdana%2CArial%2Csans-serif&fwDefault=bold&fsDefault=1.1em&cornerRadius=4px&bgColorHeader=f6a828&bgTextureHeader=gloss_wave&bgImgOpacityHeader=35&borderColorHeader=e78f08&fcHeader=ffffff&iconColorHeader=ffffff&bgColorContent=eeeeee&bgTextureContent=highlight_soft&bgImgOpacityContent=100&borderColorContent=dddddd&fcContent=333333&iconColorContent=222222&bgColorDefault=f6f6f6&bgTextureDefault=glass&bgImgOpacityDefault=100&borderColorDefault=cccccc&fcDefault=1c94c4&iconColorDefault=ef8c08&bgColorHover=fdf5ce&bgTextureHover=glass&bgImgOpacityHover=100&borderColorHover=fbcb09&fcHover=c77405&iconColorHover=ef8c08&bgColorActive=ffffff&bgTextureActive=glass&bgImgOpacityActive=65&borderColorActive=fbd850&fcActive=eb8f00&iconColorActive=ef8c08&bgColorHighlight=ffe45c&bgTextureHighlight=highlight_soft&bgImgOpacityHighlight=75&borderColorHighlight=fed22f&fcHighlight=363636&iconColorHighlight=228ef1&bgColorError=b81900&bgTextureError=diagonals_thick&bgImgOpacityError=18&borderColorError=cd0a0a&fcError=ffffff&iconColorError=ffd27a&bgColorOverlay=666666&bgTextureOverlay=diagonals_thick&bgImgOpacityOverlay=20&opacityOverlay=50&bgColorShadow=000000&bgTextureShadow=flat&bgImgOpacityShadow=10&opacityShadow=20&thicknessShadow=5px&offsetTopShadow=-5px&offsetLeftShadow=-5px&cornerRadiusShadow=5px
+* Copyright 2015 jQuery Foundation and other contributors; Licensed MIT */
+
+.ui-helper-hidden{display:none}.ui-helper-hidden-accessible{border:0;clip:rect(0 0 0 0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.ui-helper-reset{margin:0;padding:0;border:0;outline:0;line-height:1.3;text-decoration:none;font-size:100%;list-style:none}.ui-helper-clearfix:before,.ui-helper-clearfix:after{content:"";display:table;border-collapse:collapse}.ui-helper-clearfix:after{clear:both}.ui-helper-clearfix{min-height:0}.ui-helper-zfix{width:100%;height:100%;top:0;left:0;position:absolute;opacity:0;filter:Alpha(Opacity=0)}.ui-front{z-index:100}.ui-state-disabled{cursor:default!important}.ui-icon{display:block;text-indent:-99999px;overflow:hidden;background-repeat:no-repeat}.ui-widget-overlay{position:fixed;top:0;left:0;width:100%;height:100%}.ui-datepicker{width:17em;padding:.2em .2em 0;display:none}.ui-datepicker .ui-datepicker-header{position:relative;padding:.2em 0}.ui-datepicker .ui-datepicker-prev,.ui-datepicker .ui-datepicker-next{position:absolute;top:2px;width:1.8em;height:1.8em}.ui-datepicker .ui-datepicker-prev-hover,.ui-datepicker .ui-datepicker-next-hover{top:1px}.ui-datepicker .ui-datepicker-prev{left:2px}.ui-datepicker .ui-datepicker-next{right:2px}.ui-datepicker .ui-datepicker-prev-hover{left:1px}.ui-datepicker .ui-datepicker-next-hover{right:1px}.ui-datepicker .ui-datepicker-prev span,.ui-datepicker .ui-datepicker-next span{display:block;position:absolute;left:50%;margin-left:-8px;top:50%;margin-top:-8px}.ui-datepicker .ui-datepicker-title{margin:0 2.3em;line-height:1.8em;text-align:center}.ui-datepicker .ui-datepicker-title select{font-size:1em;margin:1px 0}.ui-datepicker select.ui-datepicker-month,.ui-datepicker select.ui-datepicker-year{width:45%}.ui-datepicker table{width:100%;font-size:.9em;border-collapse:collapse;margin:0 0 .4em}.ui-datepicker th{padding:.7em .3em;text-align:center;font-weight:bold;border:0}.ui-datepicker td{border:0;padding:1px}.ui-datepicker td span,.ui-datepicker td a{display:block;padding:.2em;text-align:right;text-decoration:none}.ui-datepicker .ui-datepicker-buttonpane{background-image:none;margin:.7em 0 0 0;padding:0 .2em;border-left:0;border-right:0;border-bottom:0}.ui-datepicker .ui-datepicker-buttonpane button{float:right;margin:.5em .2em .4em;cursor:pointer;padding:.2em .6em .3em .6em;width:auto;overflow:visible}.ui-datepicker .ui-datepicker-buttonpane button.ui-datepicker-current{float:left}.ui-datepicker.ui-datepicker-multi{width:auto}.ui-datepicker-multi .ui-datepicker-group{float:left}.ui-datepicker-multi .ui-datepicker-group table{width:95%;margin:0 auto .4em}.ui-datepicker-multi-2 .ui-datepicker-group{width:50%}.ui-datepicker-multi-3 .ui-datepicker-group{width:33.3%}.ui-datepicker-multi-4 .ui-datepicker-group{width:25%}.ui-datepicker-multi .ui-datepicker-group-last .ui-datepicker-header,.ui-datepicker-multi .ui-datepicker-group-middle .ui-datepicker-header{border-left-width:0}.ui-datepicker-multi .ui-datepicker-buttonpane{clear:left}.ui-datepicker-row-break{clear:both;width:100%;font-size:0}.ui-datepicker-rtl{direction:rtl}.ui-datepicker-rtl .ui-datepicker-prev{right:2px;left:auto}.ui-datepicker-rtl .ui-datepicker-next{left:2px;right:auto}.ui-datepicker-rtl .ui-datepicker-prev:hover{right:1px;left:auto}.ui-datepicker-rtl .ui-datepicker-next:hover{left:1px;right:auto}.ui-datepicker-rtl .ui-datepicker-buttonpane{clear:right}.ui-datepicker-rtl .ui-datepicker-buttonpane button{float:left}.ui-datepicker-rtl .ui-datepicker-buttonpane button.ui-datepicker-current,.ui-datepicker-rtl 
.ui-datepicker-group{float:right}.ui-datepicker-rtl .ui-datepicker-group-last .ui-datepicker-header,.ui-datepicker-rtl .ui-datepicker-group-middle .ui-datepicker-header{border-right-width:0;border-left-width:1px}.ui-widget{font-family:Trebuchet MS,Tahoma,Verdana,Arial,sans-serif;font-size:1.1em}.ui-widget .ui-widget{font-size:1em}.ui-widget input,.ui-widget select,.ui-widget textarea,.ui-widget button{font-family:Trebuchet MS,Tahoma,Verdana,Arial,sans-serif;font-size:1em}.ui-widget-content{border:1px solid #ddd;background:#eee url("images/ui-bg_highlight-soft_100_eeeeee_1x100.png") 50% top repeat-x;color:#333}.ui-widget-content a{color:#333}.ui-widget-header{border:1px solid #e78f08;background:#f6a828 url("images/ui-bg_gloss-wave_35_f6a828_500x100.png") 50% 50% repeat-x;color:#fff;font-weight:bold}.ui-widget-header a{color:#fff}.ui-state-default,.ui-widget-content .ui-state-default,.ui-widget-header .ui-state-default{border:1px solid #ccc;background:#f6f6f6 url("images/ui-bg_glass_100_f6f6f6_1x400.png") 50% 50% repeat-x;font-weight:bold;color:#1c94c4}.ui-state-default a,.ui-state-default a:link,.ui-state-default a:visited{color:#1c94c4;text-decoration:none}.ui-state-hover,.ui-widget-content .ui-state-hover,.ui-widget-header .ui-state-hover,.ui-state-focus,.ui-widget-content .ui-state-focus,.ui-widget-header .ui-state-focus{border:1px solid #fbcb09;background:#fdf5ce url("images/ui-bg_glass_100_fdf5ce_1x400.png") 50% 50% repeat-x;font-weight:bold;color:#c77405}.ui-state-hover a,.ui-state-hover a:hover,.ui-state-hover a:link,.ui-state-hover a:visited,.ui-state-focus a,.ui-state-focus a:hover,.ui-state-focus a:link,.ui-state-focus a:visited{color:#c77405;text-decoration:none}.ui-state-active,.ui-widget-content .ui-state-active,.ui-widget-header .ui-state-active{border:1px solid #fbd850;background:#fff url("images/ui-bg_glass_65_ffffff_1x400.png") 50% 50% repeat-x;font-weight:bold;color:#eb8f00}.ui-state-active a,.ui-state-active a:link,.ui-state-active a:visited{color:#eb8f00;text-decoration:none}.ui-state-highlight,.ui-widget-content .ui-state-highlight,.ui-widget-header .ui-state-highlight{border:1px solid #fed22f;background:#ffe45c url("images/ui-bg_highlight-soft_75_ffe45c_1x100.png") 50% top repeat-x;color:#363636}.ui-state-highlight a,.ui-widget-content .ui-state-highlight a,.ui-widget-header .ui-state-highlight a{color:#363636}.ui-state-error,.ui-widget-content .ui-state-error,.ui-widget-header .ui-state-error{border:1px solid #cd0a0a;background:#b81900 url("images/ui-bg_diagonals-thick_18_b81900_40x40.png") 50% 50% repeat;color:#fff}.ui-state-error a,.ui-widget-content .ui-state-error a,.ui-widget-header .ui-state-error a{color:#fff}.ui-state-error-text,.ui-widget-content .ui-state-error-text,.ui-widget-header .ui-state-error-text{color:#fff}.ui-priority-primary,.ui-widget-content .ui-priority-primary,.ui-widget-header .ui-priority-primary{font-weight:bold}.ui-priority-secondary,.ui-widget-content .ui-priority-secondary,.ui-widget-header .ui-priority-secondary{opacity:.7;filter:Alpha(Opacity=70);font-weight:normal}.ui-state-disabled,.ui-widget-content .ui-state-disabled,.ui-widget-header .ui-state-disabled{opacity:.35;filter:Alpha(Opacity=35);background-image:none}.ui-state-disabled .ui-icon{filter:Alpha(Opacity=35)}.ui-icon{width:16px;height:16px}.ui-icon,.ui-widget-content .ui-icon{background-image:url("images/ui-icons_222222_256x240.png")}.ui-widget-header .ui-icon{background-image:url("images/ui-icons_ffffff_256x240.png")}.ui-state-default 
.ui-icon{background-image:url("images/ui-icons_ef8c08_256x240.png")}.ui-state-hover .ui-icon,.ui-state-focus .ui-icon{background-image:url("images/ui-icons_ef8c08_256x240.png")}.ui-state-active .ui-icon{background-image:url("images/ui-icons_ef8c08_256x240.png")}.ui-state-highlight .ui-icon{background-image:url("images/ui-icons_228ef1_256x240.png")}.ui-state-error .ui-icon,.ui-state-error-text .ui-icon{background-image:url("images/ui-icons_ffd27a_256x240.png")}.ui-icon-blank{background-position:16px 16px}.ui-icon-carat-1-n{background-position:0 0}.ui-icon-carat-1-ne{background-position:-16px 0}.ui-icon-carat-1-e{background-position:-32px 0}.ui-icon-carat-1-se{background-position:-48px 0}.ui-icon-carat-1-s{background-position:-64px 0}.ui-icon-carat-1-sw{background-position:-80px 0}.ui-icon-carat-1-w{background-position:-96px 0}.ui-icon-carat-1-nw{background-position:-112px 0}.ui-icon-carat-2-n-s{background-position:-128px 0}.ui-icon-carat-2-e-w{background-position:-144px 0}.ui-icon-triangle-1-n{background-position:0 -16px}.ui-icon-triangle-1-ne{background-position:-16px -16px}.ui-icon-triangle-1-e{background-position:-32px -16px}.ui-icon-triangle-1-se{background-position:-48px -16px}.ui-icon-triangle-1-s{background-position:-64px -16px}.ui-icon-triangle-1-sw{background-position:-80px -16px}.ui-icon-triangle-1-w{background-position:-96px -16px}.ui-icon-triangle-1-nw{background-position:-112px -16px}.ui-icon-triangle-2-n-s{background-position:-128px -16px}.ui-icon-triangle-2-e-w{background-position:-144px -16px}.ui-icon-arrow-1-n{background-position:0 -32px}.ui-icon-arrow-1-ne{background-position:-16px -32px}.ui-icon-arrow-1-e{background-position:-32px -32px}.ui-icon-arrow-1-se{background-position:-48px -32px}.ui-icon-arrow-1-s{background-position:-64px -32px}.ui-icon-arrow-1-sw{background-position:-80px -32px}.ui-icon-arrow-1-w{background-position:-96px -32px}.ui-icon-arrow-1-nw{background-position:-112px -32px}.ui-icon-arrow-2-n-s{background-position:-128px -32px}.ui-icon-arrow-2-ne-sw{background-position:-144px -32px}.ui-icon-arrow-2-e-w{background-position:-160px -32px}.ui-icon-arrow-2-se-nw{background-position:-176px -32px}.ui-icon-arrowstop-1-n{background-position:-192px -32px}.ui-icon-arrowstop-1-e{background-position:-208px -32px}.ui-icon-arrowstop-1-s{background-position:-224px -32px}.ui-icon-arrowstop-1-w{background-position:-240px -32px}.ui-icon-arrowthick-1-n{background-position:0 -48px}.ui-icon-arrowthick-1-ne{background-position:-16px -48px}.ui-icon-arrowthick-1-e{background-position:-32px -48px}.ui-icon-arrowthick-1-se{background-position:-48px -48px}.ui-icon-arrowthick-1-s{background-position:-64px -48px}.ui-icon-arrowthick-1-sw{background-position:-80px -48px}.ui-icon-arrowthick-1-w{background-position:-96px -48px}.ui-icon-arrowthick-1-nw{background-position:-112px -48px}.ui-icon-arrowthick-2-n-s{background-position:-128px -48px}.ui-icon-arrowthick-2-ne-sw{background-position:-144px -48px}.ui-icon-arrowthick-2-e-w{background-position:-160px -48px}.ui-icon-arrowthick-2-se-nw{background-position:-176px -48px}.ui-icon-arrowthickstop-1-n{background-position:-192px -48px}.ui-icon-arrowthickstop-1-e{background-position:-208px -48px}.ui-icon-arrowthickstop-1-s{background-position:-224px -48px}.ui-icon-arrowthickstop-1-w{background-position:-240px -48px}.ui-icon-arrowreturnthick-1-w{background-position:0 -64px}.ui-icon-arrowreturnthick-1-n{background-position:-16px -64px}.ui-icon-arrowreturnthick-1-e{background-position:-32px 
-64px}.ui-icon-arrowreturnthick-1-s{background-position:-48px -64px}.ui-icon-arrowreturn-1-w{background-position:-64px -64px}.ui-icon-arrowreturn-1-n{background-position:-80px -64px}.ui-icon-arrowreturn-1-e{background-position:-96px -64px}.ui-icon-arrowreturn-1-s{background-position:-112px -64px}.ui-icon-arrowrefresh-1-w{background-position:-128px -64px}.ui-icon-arrowrefresh-1-n{background-position:-144px -64px}.ui-icon-arrowrefresh-1-e{background-position:-160px -64px}.ui-icon-arrowrefresh-1-s{background-position:-176px -64px}.ui-icon-arrow-4{background-position:0 -80px}.ui-icon-arrow-4-diag{background-position:-16px -80px}.ui-icon-extlink{background-position:-32px -80px}.ui-icon-newwin{background-position:-48px -80px}.ui-icon-refresh{background-position:-64px -80px}.ui-icon-shuffle{background-position:-80px -80px}.ui-icon-transfer-e-w{background-position:-96px -80px}.ui-icon-transferthick-e-w{background-position:-112px -80px}.ui-icon-folder-collapsed{background-position:0 -96px}.ui-icon-folder-open{background-position:-16px -96px}.ui-icon-document{background-position:-32px -96px}.ui-icon-document-b{background-position:-48px -96px}.ui-icon-note{background-position:-64px -96px}.ui-icon-mail-closed{background-position:-80px -96px}.ui-icon-mail-open{background-position:-96px -96px}.ui-icon-suitcase{background-position:-112px -96px}.ui-icon-comment{background-position:-128px -96px}.ui-icon-person{background-position:-144px -96px}.ui-icon-print{background-position:-160px -96px}.ui-icon-trash{background-position:-176px -96px}.ui-icon-locked{background-position:-192px -96px}.ui-icon-unlocked{background-position:-208px -96px}.ui-icon-bookmark{background-position:-224px -96px}.ui-icon-tag{background-position:-240px -96px}.ui-icon-home{background-position:0 -112px}.ui-icon-flag{background-position:-16px -112px}.ui-icon-calendar{background-position:-32px -112px}.ui-icon-cart{background-position:-48px -112px}.ui-icon-pencil{background-position:-64px -112px}.ui-icon-clock{background-position:-80px -112px}.ui-icon-disk{background-position:-96px -112px}.ui-icon-calculator{background-position:-112px -112px}.ui-icon-zoomin{background-position:-128px -112px}.ui-icon-zoomout{background-position:-144px -112px}.ui-icon-search{background-position:-160px -112px}.ui-icon-wrench{background-position:-176px -112px}.ui-icon-gear{background-position:-192px -112px}.ui-icon-heart{background-position:-208px -112px}.ui-icon-star{background-position:-224px -112px}.ui-icon-link{background-position:-240px -112px}.ui-icon-cancel{background-position:0 -128px}.ui-icon-plus{background-position:-16px -128px}.ui-icon-plusthick{background-position:-32px -128px}.ui-icon-minus{background-position:-48px -128px}.ui-icon-minusthick{background-position:-64px -128px}.ui-icon-close{background-position:-80px -128px}.ui-icon-closethick{background-position:-96px -128px}.ui-icon-key{background-position:-112px -128px}.ui-icon-lightbulb{background-position:-128px -128px}.ui-icon-scissors{background-position:-144px -128px}.ui-icon-clipboard{background-position:-160px -128px}.ui-icon-copy{background-position:-176px -128px}.ui-icon-contact{background-position:-192px -128px}.ui-icon-image{background-position:-208px -128px}.ui-icon-video{background-position:-224px -128px}.ui-icon-script{background-position:-240px -128px}.ui-icon-alert{background-position:0 -144px}.ui-icon-info{background-position:-16px -144px}.ui-icon-notice{background-position:-32px -144px}.ui-icon-help{background-position:-48px -144px}.ui-icon-check{background-position:-64px 
-144px}.ui-icon-bullet{background-position:-80px -144px}.ui-icon-radio-on{background-position:-96px -144px}.ui-icon-radio-off{background-position:-112px -144px}.ui-icon-pin-w{background-position:-128px -144px}.ui-icon-pin-s{background-position:-144px -144px}.ui-icon-play{background-position:0 -160px}.ui-icon-pause{background-position:-16px -160px}.ui-icon-seek-next{background-position:-32px -160px}.ui-icon-seek-prev{background-position:-48px -160px}.ui-icon-seek-end{background-position:-64px -160px}.ui-icon-seek-start{background-position:-80px -160px}.ui-icon-seek-first{background-position:-80px -160px}.ui-icon-stop{background-position:-96px -160px}.ui-icon-eject{background-position:-112px -160px}.ui-icon-volume-off{background-position:-128px -160px}.ui-icon-volume-on{background-position:-144px -160px}.ui-icon-power{background-position:0 -176px}.ui-icon-signal-diag{background-position:-16px -176px}.ui-icon-signal{background-position:-32px -176px}.ui-icon-battery-0{background-position:-48px -176px}.ui-icon-battery-1{background-position:-64px -176px}.ui-icon-battery-2{background-position:-80px -176px}.ui-icon-battery-3{background-position:-96px -176px}.ui-icon-circle-plus{background-position:0 -192px}.ui-icon-circle-minus{background-position:-16px -192px}.ui-icon-circle-close{background-position:-32px -192px}.ui-icon-circle-triangle-e{background-position:-48px -192px}.ui-icon-circle-triangle-s{background-position:-64px -192px}.ui-icon-circle-triangle-w{background-position:-80px -192px}.ui-icon-circle-triangle-n{background-position:-96px -192px}.ui-icon-circle-arrow-e{background-position:-112px -192px}.ui-icon-circle-arrow-s{background-position:-128px -192px}.ui-icon-circle-arrow-w{background-position:-144px -192px}.ui-icon-circle-arrow-n{background-position:-160px -192px}.ui-icon-circle-zoomin{background-position:-176px -192px}.ui-icon-circle-zoomout{background-position:-192px -192px}.ui-icon-circle-check{background-position:-208px -192px}.ui-icon-circlesmall-plus{background-position:0 -208px}.ui-icon-circlesmall-minus{background-position:-16px -208px}.ui-icon-circlesmall-close{background-position:-32px -208px}.ui-icon-squaresmall-plus{background-position:-48px -208px}.ui-icon-squaresmall-minus{background-position:-64px -208px}.ui-icon-squaresmall-close{background-position:-80px -208px}.ui-icon-grip-dotted-vertical{background-position:0 -224px}.ui-icon-grip-dotted-horizontal{background-position:-16px -224px}.ui-icon-grip-solid-vertical{background-position:-32px -224px}.ui-icon-grip-solid-horizontal{background-position:-48px -224px}.ui-icon-gripsmall-diagonal-se{background-position:-64px -224px}.ui-icon-grip-diagonal-se{background-position:-80px -224px}.ui-corner-all,.ui-corner-top,.ui-corner-left,.ui-corner-tl{border-top-left-radius:4px}.ui-corner-all,.ui-corner-top,.ui-corner-right,.ui-corner-tr{border-top-right-radius:4px}.ui-corner-all,.ui-corner-bottom,.ui-corner-left,.ui-corner-bl{border-bottom-left-radius:4px}.ui-corner-all,.ui-corner-bottom,.ui-corner-right,.ui-corner-br{border-bottom-right-radius:4px}.ui-widget-overlay{background:#666 url("images/ui-bg_diagonals-thick_20_666666_40x40.png") 50% 50% repeat;opacity:.5;filter:Alpha(Opacity=50)}.ui-widget-shadow{margin:-5px 0 0 -5px;padding:5px;background:#000 url("images/ui-bg_flat_10_000000_40x100.png") 50% 50% repeat-x;opacity:.2;filter:Alpha(Opacity=20);border-radius:5px}
\ No newline at end of file
diff --git a/bitbake/lib/toaster/toastergui/static/css/jquery-ui.structure.min.css b/bitbake/lib/toaster/toastergui/static/css/jquery-ui.structure.min.css
new file mode 100755
index 0000000..d1578a4
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/jquery-ui.structure.min.css
@@ -0,0 +1,5 @@
+/*! jQuery UI - v1.11.4 - 2015-03-15
+* http://jqueryui.com
+* Copyright 2015 jQuery Foundation and other contributors; Licensed MIT */
+
+.ui-helper-hidden{display:none}.ui-helper-hidden-accessible{border:0;clip:rect(0 0 0 0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.ui-helper-reset{margin:0;padding:0;border:0;outline:0;line-height:1.3;text-decoration:none;font-size:100%;list-style:none}.ui-helper-clearfix:before,.ui-helper-clearfix:after{content:"";display:table;border-collapse:collapse}.ui-helper-clearfix:after{clear:both}.ui-helper-clearfix{min-height:0}.ui-helper-zfix{width:100%;height:100%;top:0;left:0;position:absolute;opacity:0;filter:Alpha(Opacity=0)}.ui-front{z-index:100}.ui-state-disabled{cursor:default!important}.ui-icon{display:block;text-indent:-99999px;overflow:hidden;background-repeat:no-repeat}.ui-widget-overlay{position:fixed;top:0;left:0;width:100%;height:100%}.ui-datepicker{width:17em;padding:.2em .2em 0;display:none}.ui-datepicker .ui-datepicker-header{position:relative;padding:.2em 0}.ui-datepicker .ui-datepicker-prev,.ui-datepicker .ui-datepicker-next{position:absolute;top:2px;width:1.8em;height:1.8em}.ui-datepicker .ui-datepicker-prev-hover,.ui-datepicker .ui-datepicker-next-hover{top:1px}.ui-datepicker .ui-datepicker-prev{left:2px}.ui-datepicker .ui-datepicker-next{right:2px}.ui-datepicker .ui-datepicker-prev-hover{left:1px}.ui-datepicker .ui-datepicker-next-hover{right:1px}.ui-datepicker .ui-datepicker-prev span,.ui-datepicker .ui-datepicker-next span{display:block;position:absolute;left:50%;margin-left:-8px;top:50%;margin-top:-8px}.ui-datepicker .ui-datepicker-title{margin:0 2.3em;line-height:1.8em;text-align:center}.ui-datepicker .ui-datepicker-title select{font-size:1em;margin:1px 0}.ui-datepicker select.ui-datepicker-month,.ui-datepicker select.ui-datepicker-year{width:45%}.ui-datepicker table{width:100%;font-size:.9em;border-collapse:collapse;margin:0 0 .4em}.ui-datepicker th{padding:.7em .3em;text-align:center;font-weight:bold;border:0}.ui-datepicker td{border:0;padding:1px}.ui-datepicker td span,.ui-datepicker td a{display:block;padding:.2em;text-align:right;text-decoration:none}.ui-datepicker .ui-datepicker-buttonpane{background-image:none;margin:.7em 0 0 0;padding:0 .2em;border-left:0;border-right:0;border-bottom:0}.ui-datepicker .ui-datepicker-buttonpane button{float:right;margin:.5em .2em .4em;cursor:pointer;padding:.2em .6em .3em .6em;width:auto;overflow:visible}.ui-datepicker .ui-datepicker-buttonpane button.ui-datepicker-current{float:left}.ui-datepicker.ui-datepicker-multi{width:auto}.ui-datepicker-multi .ui-datepicker-group{float:left}.ui-datepicker-multi .ui-datepicker-group table{width:95%;margin:0 auto .4em}.ui-datepicker-multi-2 .ui-datepicker-group{width:50%}.ui-datepicker-multi-3 .ui-datepicker-group{width:33.3%}.ui-datepicker-multi-4 .ui-datepicker-group{width:25%}.ui-datepicker-multi .ui-datepicker-group-last .ui-datepicker-header,.ui-datepicker-multi .ui-datepicker-group-middle .ui-datepicker-header{border-left-width:0}.ui-datepicker-multi .ui-datepicker-buttonpane{clear:left}.ui-datepicker-row-break{clear:both;width:100%;font-size:0}.ui-datepicker-rtl{direction:rtl}.ui-datepicker-rtl .ui-datepicker-prev{right:2px;left:auto}.ui-datepicker-rtl .ui-datepicker-next{left:2px;right:auto}.ui-datepicker-rtl .ui-datepicker-prev:hover{right:1px;left:auto}.ui-datepicker-rtl .ui-datepicker-next:hover{left:1px;right:auto}.ui-datepicker-rtl .ui-datepicker-buttonpane{clear:right}.ui-datepicker-rtl .ui-datepicker-buttonpane button{float:left}.ui-datepicker-rtl .ui-datepicker-buttonpane button.ui-datepicker-current,.ui-datepicker-rtl 
.ui-datepicker-group{float:right}.ui-datepicker-rtl .ui-datepicker-group-last .ui-datepicker-header,.ui-datepicker-rtl .ui-datepicker-group-middle .ui-datepicker-header{border-right-width:0;border-left-width:1px}
\ No newline at end of file
diff --git a/bitbake/lib/toaster/toastergui/static/css/jquery-ui.theme.min.css b/bitbake/lib/toaster/toastergui/static/css/jquery-ui.theme.min.css
new file mode 100755
index 0000000..3454311
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/jquery-ui.theme.min.css
@@ -0,0 +1,5 @@
+/*! jQuery UI - v1.11.4 - 2015-03-17
+* http://jqueryui.com
+* Copyright 2015 jQuery Foundation and other contributors; Licensed MIT */
+
+.ui-widget{font-family:Trebuchet MS,Tahoma,Verdana,Arial,sans-serif;font-size:1.1em}.ui-widget .ui-widget{font-size:1em}.ui-widget input,.ui-widget select,.ui-widget textarea,.ui-widget button{font-family:Trebuchet MS,Tahoma,Verdana,Arial,sans-serif;font-size:1em}.ui-widget-content{border:1px solid #ddd;background:#eee url("images/ui-bg_highlight-soft_100_eeeeee_1x100.png") 50% top repeat-x;color:#333}.ui-widget-content a{color:#333}.ui-widget-header{border:1px solid #e78f08;background:#f6a828 url("images/ui-bg_gloss-wave_35_f6a828_500x100.png") 50% 50% repeat-x;color:#fff;font-weight:bold}.ui-widget-header a{color:#fff}.ui-state-default,.ui-widget-content .ui-state-default,.ui-widget-header .ui-state-default{border:1px solid #ccc;background:#f6f6f6 url("images/ui-bg_glass_100_f6f6f6_1x400.png") 50% 50% repeat-x;font-weight:bold;color:#1c94c4}.ui-state-default a,.ui-state-default a:link,.ui-state-default a:visited{color:#1c94c4;text-decoration:none}.ui-state-hover,.ui-widget-content .ui-state-hover,.ui-widget-header .ui-state-hover,.ui-state-focus,.ui-widget-content .ui-state-focus,.ui-widget-header .ui-state-focus{border:1px solid #fbcb09;background:#fdf5ce url("images/ui-bg_glass_100_fdf5ce_1x400.png") 50% 50% repeat-x;font-weight:bold;color:#c77405}.ui-state-hover a,.ui-state-hover a:hover,.ui-state-hover a:link,.ui-state-hover a:visited,.ui-state-focus a,.ui-state-focus a:hover,.ui-state-focus a:link,.ui-state-focus a:visited{color:#c77405;text-decoration:none}.ui-state-active,.ui-widget-content .ui-state-active,.ui-widget-header .ui-state-active{border:1px solid #fbd850;background:#fff url("images/ui-bg_glass_65_ffffff_1x400.png") 50% 50% repeat-x;font-weight:bold;color:#eb8f00}.ui-state-active a,.ui-state-active a:link,.ui-state-active a:visited{color:#eb8f00;text-decoration:none}.ui-state-highlight,.ui-widget-content .ui-state-highlight,.ui-widget-header .ui-state-highlight{border:1px solid #fed22f;background:#ffe45c url("images/ui-bg_highlight-soft_75_ffe45c_1x100.png") 50% top repeat-x;color:#363636}.ui-state-highlight a,.ui-widget-content .ui-state-highlight a,.ui-widget-header .ui-state-highlight a{color:#363636}.ui-state-error,.ui-widget-content .ui-state-error,.ui-widget-header .ui-state-error{border:1px solid #cd0a0a;background:#b81900 url("images/ui-bg_diagonals-thick_18_b81900_40x40.png") 50% 50% repeat;color:#fff}.ui-state-error a,.ui-widget-content .ui-state-error a,.ui-widget-header .ui-state-error a{color:#fff}.ui-state-error-text,.ui-widget-content .ui-state-error-text,.ui-widget-header .ui-state-error-text{color:#fff}.ui-priority-primary,.ui-widget-content .ui-priority-primary,.ui-widget-header .ui-priority-primary{font-weight:bold}.ui-priority-secondary,.ui-widget-content .ui-priority-secondary,.ui-widget-header .ui-priority-secondary{opacity:.7;filter:Alpha(Opacity=70);font-weight:normal}.ui-state-disabled,.ui-widget-content .ui-state-disabled,.ui-widget-header .ui-state-disabled{opacity:.35;filter:Alpha(Opacity=35);background-image:none}.ui-state-disabled .ui-icon{filter:Alpha(Opacity=35)}.ui-icon{width:16px;height:16px}.ui-icon,.ui-widget-content .ui-icon{background-image:url("images/ui-icons_222222_256x240.png")}.ui-widget-header .ui-icon{background-image:url("images/ui-icons_ffffff_256x240.png")}.ui-state-default .ui-icon{background-image:url("images/ui-icons_ef8c08_256x240.png")}.ui-state-hover .ui-icon,.ui-state-focus .ui-icon{background-image:url("images/ui-icons_ef8c08_256x240.png")}.ui-state-active 
.ui-icon{background-image:url("images/ui-icons_ef8c08_256x240.png")}.ui-state-highlight .ui-icon{background-image:url("images/ui-icons_228ef1_256x240.png")}.ui-state-error .ui-icon,.ui-state-error-text .ui-icon{background-image:url("images/ui-icons_ffd27a_256x240.png")}.ui-icon-blank{background-position:16px 16px}.ui-icon-carat-1-n{background-position:0 0}.ui-icon-carat-1-ne{background-position:-16px 0}.ui-icon-carat-1-e{background-position:-32px 0}.ui-icon-carat-1-se{background-position:-48px 0}.ui-icon-carat-1-s{background-position:-64px 0}.ui-icon-carat-1-sw{background-position:-80px 0}.ui-icon-carat-1-w{background-position:-96px 0}.ui-icon-carat-1-nw{background-position:-112px 0}.ui-icon-carat-2-n-s{background-position:-128px 0}.ui-icon-carat-2-e-w{background-position:-144px 0}.ui-icon-triangle-1-n{background-position:0 -16px}.ui-icon-triangle-1-ne{background-position:-16px -16px}.ui-icon-triangle-1-e{background-position:-32px -16px}.ui-icon-triangle-1-se{background-position:-48px -16px}.ui-icon-triangle-1-s{background-position:-64px -16px}.ui-icon-triangle-1-sw{background-position:-80px -16px}.ui-icon-triangle-1-w{background-position:-96px -16px}.ui-icon-triangle-1-nw{background-position:-112px -16px}.ui-icon-triangle-2-n-s{background-position:-128px -16px}.ui-icon-triangle-2-e-w{background-position:-144px -16px}.ui-icon-arrow-1-n{background-position:0 -32px}.ui-icon-arrow-1-ne{background-position:-16px -32px}.ui-icon-arrow-1-e{background-position:-32px -32px}.ui-icon-arrow-1-se{background-position:-48px -32px}.ui-icon-arrow-1-s{background-position:-64px -32px}.ui-icon-arrow-1-sw{background-position:-80px -32px}.ui-icon-arrow-1-w{background-position:-96px -32px}.ui-icon-arrow-1-nw{background-position:-112px -32px}.ui-icon-arrow-2-n-s{background-position:-128px -32px}.ui-icon-arrow-2-ne-sw{background-position:-144px -32px}.ui-icon-arrow-2-e-w{background-position:-160px -32px}.ui-icon-arrow-2-se-nw{background-position:-176px -32px}.ui-icon-arrowstop-1-n{background-position:-192px -32px}.ui-icon-arrowstop-1-e{background-position:-208px -32px}.ui-icon-arrowstop-1-s{background-position:-224px -32px}.ui-icon-arrowstop-1-w{background-position:-240px -32px}.ui-icon-arrowthick-1-n{background-position:0 -48px}.ui-icon-arrowthick-1-ne{background-position:-16px -48px}.ui-icon-arrowthick-1-e{background-position:-32px -48px}.ui-icon-arrowthick-1-se{background-position:-48px -48px}.ui-icon-arrowthick-1-s{background-position:-64px -48px}.ui-icon-arrowthick-1-sw{background-position:-80px -48px}.ui-icon-arrowthick-1-w{background-position:-96px -48px}.ui-icon-arrowthick-1-nw{background-position:-112px -48px}.ui-icon-arrowthick-2-n-s{background-position:-128px -48px}.ui-icon-arrowthick-2-ne-sw{background-position:-144px -48px}.ui-icon-arrowthick-2-e-w{background-position:-160px -48px}.ui-icon-arrowthick-2-se-nw{background-position:-176px -48px}.ui-icon-arrowthickstop-1-n{background-position:-192px -48px}.ui-icon-arrowthickstop-1-e{background-position:-208px -48px}.ui-icon-arrowthickstop-1-s{background-position:-224px -48px}.ui-icon-arrowthickstop-1-w{background-position:-240px -48px}.ui-icon-arrowreturnthick-1-w{background-position:0 -64px}.ui-icon-arrowreturnthick-1-n{background-position:-16px -64px}.ui-icon-arrowreturnthick-1-e{background-position:-32px -64px}.ui-icon-arrowreturnthick-1-s{background-position:-48px -64px}.ui-icon-arrowreturn-1-w{background-position:-64px -64px}.ui-icon-arrowreturn-1-n{background-position:-80px -64px}.ui-icon-arrowreturn-1-e{background-position:-96px 
-64px}.ui-icon-arrowreturn-1-s{background-position:-112px -64px}.ui-icon-arrowrefresh-1-w{background-position:-128px -64px}.ui-icon-arrowrefresh-1-n{background-position:-144px -64px}.ui-icon-arrowrefresh-1-e{background-position:-160px -64px}.ui-icon-arrowrefresh-1-s{background-position:-176px -64px}.ui-icon-arrow-4{background-position:0 -80px}.ui-icon-arrow-4-diag{background-position:-16px -80px}.ui-icon-extlink{background-position:-32px -80px}.ui-icon-newwin{background-position:-48px -80px}.ui-icon-refresh{background-position:-64px -80px}.ui-icon-shuffle{background-position:-80px -80px}.ui-icon-transfer-e-w{background-position:-96px -80px}.ui-icon-transferthick-e-w{background-position:-112px -80px}.ui-icon-folder-collapsed{background-position:0 -96px}.ui-icon-folder-open{background-position:-16px -96px}.ui-icon-document{background-position:-32px -96px}.ui-icon-document-b{background-position:-48px -96px}.ui-icon-note{background-position:-64px -96px}.ui-icon-mail-closed{background-position:-80px -96px}.ui-icon-mail-open{background-position:-96px -96px}.ui-icon-suitcase{background-position:-112px -96px}.ui-icon-comment{background-position:-128px -96px}.ui-icon-person{background-position:-144px -96px}.ui-icon-print{background-position:-160px -96px}.ui-icon-trash{background-position:-176px -96px}.ui-icon-locked{background-position:-192px -96px}.ui-icon-unlocked{background-position:-208px -96px}.ui-icon-bookmark{background-position:-224px -96px}.ui-icon-tag{background-position:-240px -96px}.ui-icon-home{background-position:0 -112px}.ui-icon-flag{background-position:-16px -112px}.ui-icon-calendar{background-position:-32px -112px}.ui-icon-cart{background-position:-48px -112px}.ui-icon-pencil{background-position:-64px -112px}.ui-icon-clock{background-position:-80px -112px}.ui-icon-disk{background-position:-96px -112px}.ui-icon-calculator{background-position:-112px -112px}.ui-icon-zoomin{background-position:-128px -112px}.ui-icon-zoomout{background-position:-144px -112px}.ui-icon-search{background-position:-160px -112px}.ui-icon-wrench{background-position:-176px -112px}.ui-icon-gear{background-position:-192px -112px}.ui-icon-heart{background-position:-208px -112px}.ui-icon-star{background-position:-224px -112px}.ui-icon-link{background-position:-240px -112px}.ui-icon-cancel{background-position:0 -128px}.ui-icon-plus{background-position:-16px -128px}.ui-icon-plusthick{background-position:-32px -128px}.ui-icon-minus{background-position:-48px -128px}.ui-icon-minusthick{background-position:-64px -128px}.ui-icon-close{background-position:-80px -128px}.ui-icon-closethick{background-position:-96px -128px}.ui-icon-key{background-position:-112px -128px}.ui-icon-lightbulb{background-position:-128px -128px}.ui-icon-scissors{background-position:-144px -128px}.ui-icon-clipboard{background-position:-160px -128px}.ui-icon-copy{background-position:-176px -128px}.ui-icon-contact{background-position:-192px -128px}.ui-icon-image{background-position:-208px -128px}.ui-icon-video{background-position:-224px -128px}.ui-icon-script{background-position:-240px -128px}.ui-icon-alert{background-position:0 -144px}.ui-icon-info{background-position:-16px -144px}.ui-icon-notice{background-position:-32px -144px}.ui-icon-help{background-position:-48px -144px}.ui-icon-check{background-position:-64px -144px}.ui-icon-bullet{background-position:-80px -144px}.ui-icon-radio-on{background-position:-96px -144px}.ui-icon-radio-off{background-position:-112px -144px}.ui-icon-pin-w{background-position:-128px 
-144px}.ui-icon-pin-s{background-position:-144px -144px}.ui-icon-play{background-position:0 -160px}.ui-icon-pause{background-position:-16px -160px}.ui-icon-seek-next{background-position:-32px -160px}.ui-icon-seek-prev{background-position:-48px -160px}.ui-icon-seek-end{background-position:-64px -160px}.ui-icon-seek-start{background-position:-80px -160px}.ui-icon-seek-first{background-position:-80px -160px}.ui-icon-stop{background-position:-96px -160px}.ui-icon-eject{background-position:-112px -160px}.ui-icon-volume-off{background-position:-128px -160px}.ui-icon-volume-on{background-position:-144px -160px}.ui-icon-power{background-position:0 -176px}.ui-icon-signal-diag{background-position:-16px -176px}.ui-icon-signal{background-position:-32px -176px}.ui-icon-battery-0{background-position:-48px -176px}.ui-icon-battery-1{background-position:-64px -176px}.ui-icon-battery-2{background-position:-80px -176px}.ui-icon-battery-3{background-position:-96px -176px}.ui-icon-circle-plus{background-position:0 -192px}.ui-icon-circle-minus{background-position:-16px -192px}.ui-icon-circle-close{background-position:-32px -192px}.ui-icon-circle-triangle-e{background-position:-48px -192px}.ui-icon-circle-triangle-s{background-position:-64px -192px}.ui-icon-circle-triangle-w{background-position:-80px -192px}.ui-icon-circle-triangle-n{background-position:-96px -192px}.ui-icon-circle-arrow-e{background-position:-112px -192px}.ui-icon-circle-arrow-s{background-position:-128px -192px}.ui-icon-circle-arrow-w{background-position:-144px -192px}.ui-icon-circle-arrow-n{background-position:-160px -192px}.ui-icon-circle-zoomin{background-position:-176px -192px}.ui-icon-circle-zoomout{background-position:-192px -192px}.ui-icon-circle-check{background-position:-208px -192px}.ui-icon-circlesmall-plus{background-position:0 -208px}.ui-icon-circlesmall-minus{background-position:-16px -208px}.ui-icon-circlesmall-close{background-position:-32px -208px}.ui-icon-squaresmall-plus{background-position:-48px -208px}.ui-icon-squaresmall-minus{background-position:-64px -208px}.ui-icon-squaresmall-close{background-position:-80px -208px}.ui-icon-grip-dotted-vertical{background-position:0 -224px}.ui-icon-grip-dotted-horizontal{background-position:-16px -224px}.ui-icon-grip-solid-vertical{background-position:-32px -224px}.ui-icon-grip-solid-horizontal{background-position:-48px -224px}.ui-icon-gripsmall-diagonal-se{background-position:-64px -224px}.ui-icon-grip-diagonal-se{background-position:-80px -224px}.ui-corner-all,.ui-corner-top,.ui-corner-left,.ui-corner-tl{border-top-left-radius:4px}.ui-corner-all,.ui-corner-top,.ui-corner-right,.ui-corner-tr{border-top-right-radius:4px}.ui-corner-all,.ui-corner-bottom,.ui-corner-left,.ui-corner-bl{border-bottom-left-radius:4px}.ui-corner-all,.ui-corner-bottom,.ui-corner-right,.ui-corner-br{border-bottom-right-radius:4px}.ui-widget-overlay{background:#666 url("images/ui-bg_diagonals-thick_20_666666_40x40.png") 50% 50% repeat;opacity:.5;filter:Alpha(Opacity=50)}.ui-widget-shadow{margin:-5px 0 0 -5px;padding:5px;background:#000 url("images/ui-bg_flat_10_000000_40x100.png") 50% 50% repeat-x;opacity:.2;filter:Alpha(Opacity=20);border-radius:5px}
\ No newline at end of file
diff --git a/bitbake/lib/toaster/toastergui/static/css/jquery.treetable.css b/bitbake/lib/toaster/toastergui/static/css/jquery.treetable.css
new file mode 100644
index 0000000..4e95bfd
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/jquery.treetable.css
@@ -0,0 +1,28 @@
+table.treetable span.indenter {
+ display: inline-block;
+ margin: 0;
+ padding: 0;
+ text-align: right;
+
+ /* Disable text selection of nodes (for better D&D UX) */
+ user-select: none;
+ -khtml-user-select: none;
+ -moz-user-select: none;
+ -o-user-select: none;
+ -webkit-user-select: none;
+
+ /* Force content-box box model for indenter (Bootstrap compatibility) */
+ -webkit-box-sizing: content-box;
+ -moz-box-sizing: content-box;
+ box-sizing: content-box;
+
+ width: 19px;
+}
+
+table.treetable span.indenter a {
+ background-position: left center;
+ background-repeat: no-repeat;
+ display: inline-block;
+ text-decoration: none;
+ width: 19px;
+}
diff --git a/bitbake/lib/toaster/toastergui/static/css/jquery.treetable.theme.default.css b/bitbake/lib/toaster/toastergui/static/css/jquery.treetable.theme.default.css
new file mode 100644
index 0000000..48289ba
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/jquery.treetable.theme.default.css
@@ -0,0 +1,64 @@
+table.treetable {
+ border: 1px solid #888;
+ border-collapse: collapse;
+ font-size: .8em;
+ line-height: 1;
+ margin: .6em 0 1.8em 0;
+ width: 100%;
+}
+
+table.treetable caption {
+ font-size: .9em;
+ font-weight: bold;
+ margin-bottom: .2em;
+}
+
+table.treetable tbody tr td {
+ cursor: default;
+ padding: .3em 1em;
+}
+
+table.treetable span {
+ background-position: center left;
+ background-repeat: no-repeat;
+ padding: .2em 0 .2em 1.5em;
+}
+
+table.treetable span.file {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAQAAAC1+jfqAAAABGdBTUEAAK/INwWK6QAAABl0RVh0U29mdHdhcmUAQWRvYmUgSW1hZ2VSZWFkeXHJZTwAAADoSURBVBgZBcExblNBGAbA2ceegTRBuIKOgiihSZNTcC5LUHAihNJR0kGKCDcYJY6D3/77MdOinTvzAgCw8ysThIvn/VojIyMjIyPP+bS1sUQIV2s95pBDDvmbP/mdkft83tpYguZq5Jh/OeaYh+yzy8hTHvNlaxNNczm+la9OTlar1UdA/+C2A4trRCnD3jS8BB1obq2Gk6GU6QbQAS4BUaYSQAf4bhhKKTFdAzrAOwAxEUAH+KEM01SY3gM6wBsEAQB0gJ+maZoC3gI6iPYaAIBJsiRmHU0AALOeFC3aK2cWAACUXe7+AwO0lc9eTHYTAAAAAElFTkSuQmCC);
+}
+
+table.treetable span.folder {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAABGdBTUEAAK/INwWK6QAAABl0RVh0U29mdHdhcmUAQWRvYmUgSW1hZ2VSZWFkeXHJZTwAAAGrSURBVDjLxZO7ihRBFIa/6u0ZW7GHBUV0UQQTZzd3QdhMQxOfwMRXEANBMNQX0MzAzFAwEzHwARbNFDdwEd31Mj3X7a6uOr9BtzNjYjKBJ6nicP7v3KqcJFaxhBVtZUAK8OHlld2st7Xl3DJPVONP+zEUV4HqL5UDYHr5xvuQAjgl/Qs7TzvOOVAjxjlC+ePSwe6DfbVegLVuT4r14eTr6zvA8xSAoBLzx6pvj4l+DZIezuVkG9fY2H7YRQIMZIBwycmzH1/s3F8AapfIPNF3kQk7+kw9PWBy+IZOdg5Ug3mkAATy/t0usovzGeCUWTjCz0B+Sj0ekfdvkZ3abBv+U4GaCtJ1iEm6ANQJ6fEzrG/engcKw/wXQvEKxSEKQxRGKE7Izt+DSiwBJMUSm71rguMYhQKrBygOIRStf4TiFFRBvbRGKiQLWP29yRSHKBTtfdBmHs0BUpgvtgF4yRFR+NUKi0XZcYjCeCG2smkzLAHkbRBmP0/Uk26O5YnUActBp1GsAI+S5nRJJJal5K1aAMrq0d6Tm9uI6zjyf75dAe6tx/SsWeD//o2/Ab6IH3/h25pOAAAAAElFTkSuQmCC);
+}
+
+table.treetable tr.collapsed span.indenter a {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvjV6
9fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAHlJREFUeNrcU1sNgDAQ6wgmcAM2MICGGlg1gJnNzWQcvwQGy1j4oUl/7tH0mpwzM7SgQyO+EZAUWh2MkkzSWhJwuRAlHYsJwEwyvs1gABDuzqoJcTw5qxaIJN0bgQRgIjnlmn1heSO5PE6Y2YXe+5Cr5+h++gs12AcAS6FS+7YOsj4AAAAASUVORK5CYII=);
+}
+
+table.treetable tr.expanded span.indenter a {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvjV6
9fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAHFJREFUeNpi/P//PwMlgImBQsA44C6gvhfa29v3MzAwOODRc6CystIRbxi0t7fjDJjKykpGYrwwi1hxnLHQ3t7+jIGBQRJJ6HllZaUUKYEYRYBPOB0gBShKwKGA////48VtbW3/8clTnBIH3gCKkzJgAGvBX0dDm0sCAAAAAElFTkSuQmCC);
+}
+
+
+
+table.treetable tr.collapsed.selected span.indenter a {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvjV6
9fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAFpJREFUeNpi/P//PwMlgHHADWD4//8/NtyAQxwD45KAAQdKDfj//////fgMIsYAZIMw1DKREFwODAwM/4kNRKq64AADA4MjFDOQ6gKyY4HodMA49PMCxQYABgAVYHsjyZ1x7QAAAABJRU5ErkJggg==);
+}
+
+table.treetable tr.expanded.selected span.indenter a {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvjV6
9fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAFtJREFUeNpi/P//PwMlgImBQsA44C6giQENDAwM//HgBmLCAF/AMBLjBUeixf///48L7/+PCvZjU4fPAAc0AxywqcMXCwegGJ1NckL6jx5wpKYDxqGXEkkCgAEAmrqBIejdgngAAAAASUVORK5CYII=);
+}
+
+table.treetable tr.accept {
+ background-color: #a3bce4;
+ color: #fff
+}
+
+table.treetable tr.collapsed.accept td span.indenter a {
+ background-image: url(data:image/x-png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvj
V69fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAFpJREFUeNpi/P//PwMlgHHADWD4//8/NtyAQxwD45KAAQdKDfj//////fgMIsYAZIMw1DKREFwODAwM/4kNRKq64AADA4MjFDOQ6gKyY4HodMA49PMCxQYABgAVYHsjyZ1x7QAAAABJRU5ErkJggg==);
+}
+
+table.treetable tr.expanded.accept td span.indenter a {
+ background-image: url(data:image/x-png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvj
V69fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAFtJREFUeNpi/P//PwMlgImBQsA44C6giQENDAwM//HgBmLCAF/AMBLjBUeixf///48L7/+PCvZjU4fPAAc0AxywqcMXCwegGJ1NckL6jx5wpKYDxqGXEkkCgAEAmrqBIejdgngAAAAASUVORK5CYII=);
+}
diff --git a/bitbake/lib/toaster/toastergui/static/css/jquery.treetable.theme.toaster.css b/bitbake/lib/toaster/toastergui/static/css/jquery.treetable.theme.toaster.css
new file mode 100644
index 0000000..d8552e5
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/jquery.treetable.theme.toaster.css
@@ -0,0 +1,38 @@
+table.treetable span.file {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAQAAAC1+jfqAAAABGdBTUEAAK/INwWK6QAAABl0RVh0U29mdHdhcmUAQWRvYmUgSW1hZ2VSZWFkeXHJZTwAAADoSURBVBgZBcExblNBGAbA2ceegTRBuIKOgiihSZNTcC5LUHAihNJR0kGKCDcYJY6D3/77MdOinTvzAgCw8ysThIvn/VojIyMjIyPP+bS1sUQIV2s95pBDDvmbP/mdkft83tpYguZq5Jh/OeaYh+yzy8hTHvNlaxNNczm+la9OTlar1UdA/+C2A4trRCnD3jS8BB1obq2Gk6GU6QbQAS4BUaYSQAf4bhhKKTFdAzrAOwAxEUAH+KEM01SY3gM6wBsEAQB0gJ+maZoC3gI6iPYaAIBJsiRmHU0AALOeFC3aK2cWAACUXe7+AwO0lc9eTHYTAAAAAElFTkSuQmCC);
+}
+
+table.treetable span.folder {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAABGdBTUEAAK/INwWK6QAAABl0RVh0U29mdHdhcmUAQWRvYmUgSW1hZ2VSZWFkeXHJZTwAAAGrSURBVDjLxZO7ihRBFIa/6u0ZW7GHBUV0UQQTZzd3QdhMQxOfwMRXEANBMNQX0MzAzFAwEzHwARbNFDdwEd31Mj3X7a6uOr9BtzNjYjKBJ6nicP7v3KqcJFaxhBVtZUAK8OHlld2st7Xl3DJPVONP+zEUV4HqL5UDYHr5xvuQAjgl/Qs7TzvOOVAjxjlC+ePSwe6DfbVegLVuT4r14eTr6zvA8xSAoBLzx6pvj4l+DZIezuVkG9fY2H7YRQIMZIBwycmzH1/s3F8AapfIPNF3kQk7+kw9PWBy+IZOdg5Ug3mkAATy/t0usovzGeCUWTjCz0B+Sj0ekfdvkZ3abBv+U4GaCtJ1iEm6ANQJ6fEzrG/engcKw/wXQvEKxSEKQxRGKE7Izt+DSiwBJMUSm71rguMYhQKrBygOIRStf4TiFFRBvbRGKiQLWP29yRSHKBTtfdBmHs0BUpgvtgF4yRFR+NUKi0XZcYjCeCG2smkzLAHkbRBmP0/Uk26O5YnUActBp1GsAI+S5nRJJJal5K1aAMrq0d6Tm9uI6zjyf75dAe6tx/SsWeD//o2/Ab6IH3/h25pOAAAAAElFTkSuQmCC);
+}
+
+table.treetable tr.collapsed span.indenter a {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvjV6
9fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAHlJREFUeNrcU1sNgDAQ6wgmcAM2MICGGlg1gJnNzWQcvwQGy1j4oUl/7tH0mpwzM7SgQyO+EZAUWh2MkkzSWhJwuRAlHYsJwEwyvs1gABDuzqoJcTw5qxaIJN0bgQRgIjnlmn1heSO5PE6Y2YXe+5Cr5+h++gs12AcAS6FS+7YOsj4AAAAASUVORK5CYII=);
+}
+
+table.treetable tr.expanded span.indenter a {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvjV6
9fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAHFJREFUeNpi/P//PwMlgImBQsA44C6gvhfa29v3MzAwOODRc6CystIRbxi0t7fjDJjKykpGYrwwi1hxnLHQ3t7+jIGBQRJJ6HllZaUUKYEYRYBPOB0gBShKwKGA////48VtbW3/8clTnBIH3gCKkzJgAGvBX0dDm0sCAAAAAElFTkSuQmCC);
+}
+
+
+
+table.treetable tr.collapsed.selected span.indenter a {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvjV6
9fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAFpJREFUeNpi/P//PwMlgHHADWD4//8/NtyAQxwD45KAAQdKDfj//////fgMIsYAZIMw1DKREFwODAwM/4kNRKq64AADA4MjFDOQ6gKyY4HodMA49PMCxQYABgAVYHsjyZ1x7QAAAABJRU5ErkJggg==);
+}
+
+table.treetable tr.expanded.selected span.indenter a {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvjV6
9fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAFtJREFUeNpi/P//PwMlgImBQsA44C6giQENDAwM//HgBmLCAF/AMBLjBUeixf///48L7/+PCvZjU4fPAAc0AxywqcMXCwegGJ1NckL6jx5wpKYDxqGXEkkCgAEAmrqBIejdgngAAAAASUVORK5CYII=);
+}
+
+table.treetable tr.accept {
+ background-color: #a3bce4;
+ color: #fff
+}
+
+table.treetable tr.collapsed.accept td span.indenter a {
+ background-image: url(data:image/x-png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvj
V69fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAFpJREFUeNpi/P//PwMlgHHADWD4//8/NtyAQxwD45KAAQdKDfj//////fgMIsYAZIMw1DKREFwODAwM/4kNRKq64AADA4MjFDOQ6gKyY4HodMA49PMCxQYABgAVYHsjyZ1x7QAAAABJRU5ErkJggg==);
+}
+
+table.treetable tr.expanded.accept td span.indenter a {
+ background-image: url(data:image/x-png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvj
V69fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAFtJREFUeNpi/P//PwMlgImBQsA44C6giQENDAwM//HgBmLCAF/AMBLjBUeixf///48L7/+PCvZjU4fPAAc0AxywqcMXCwegGJ1NckL6jx5wpKYDxqGXEkkCgAEAmrqBIejdgngAAAAASUVORK5CYII=);
+}
diff --git a/bitbake/lib/toaster/toastergui/static/css/prettify.css b/bitbake/lib/toaster/toastergui/static/css/prettify.css
new file mode 100755
index 0000000..b317a7c
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/prettify.css
@@ -0,0 +1 @@
+.pln{color:#000}@media screen{.str{color:#080}.kwd{color:#008}.com{color:#800}.typ{color:#606}.lit{color:#066}.pun,.opn,.clo{color:#660}.tag{color:#008}.atn{color:#606}.atv{color:#080}.dec,.var{color:#606}.fun{color:red}}@media print,projection{.str{color:#060}.kwd{color:#006;font-weight:bold}.com{color:#600;font-style:italic}.typ{color:#404;font-weight:bold}.lit{color:#044}.pun,.opn,.clo{color:#440}.tag{color:#006;font-weight:bold}.atn{color:#404}.atv{color:#060}}pre.prettyprint{padding:2px;border:1px solid #888}ol.linenums{margin-top:0;margin-bottom:0}li.L0,li.L1,li.L2,li.L3,li.L5,li.L6,li.L7,li.L8{list-style-type:none}li.L1,li.L3,li.L5,li.L7,li.L9{background:#eee}
diff --git a/bitbake/lib/toaster/toastergui/static/css/screen.css b/bitbake/lib/toaster/toastergui/static/css/screen.css
new file mode 100644
index 0000000..e233ef6
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/css/screen.css
@@ -0,0 +1,28 @@
+body {
+ background: #ddd;
+ color: #000;
+ font-family: Helvetica, Arial, sans-serif;
+ line-height: 1.5;
+ margin: 0;
+ padding: 0;
+}
+
+#main {
+ background: #fff;
+ border-left: 20px solid #eee;
+ border-right: 20px solid #eee;
+ margin: 0 auto;
+ max-width: 800px;
+ padding: 20px;
+}
+
+pre.listing {
+ background: #eee;
+ border: 1px solid #ccc;
+ margin: .6em 0 .3em 0;
+ padding: .1em .3em;
+}
+
+pre.listing b {
+ color: #f00;
+}
diff --git a/bitbake/lib/toaster/toastergui/static/fonts/FontAwesome.otf b/bitbake/lib/toaster/toastergui/static/fonts/FontAwesome.otf
new file mode 100644
index 0000000..64049bf
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/fonts/FontAwesome.otf
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/fonts/fontawesome-webfont.eot b/bitbake/lib/toaster/toastergui/static/fonts/fontawesome-webfont.eot
new file mode 100644
index 0000000..7d81019
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/fonts/fontawesome-webfont.eot
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/fonts/fontawesome-webfont.svg b/bitbake/lib/toaster/toastergui/static/fonts/fontawesome-webfont.svg
new file mode 100644
index 0000000..ba0afe5
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/fonts/fontawesome-webfont.svg
@@ -0,0 +1,284 @@
+<?xml version="1.0" standalone="no"?>
+<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd" >
+<svg xmlns="http://www.w3.org/2000/svg">
+<metadata></metadata>
+<defs>
+<font id="fontawesomeregular" horiz-adv-x="1536" >
+<font-face units-per-em="1792" ascent="1536" descent="-256" />
+<missing-glyph horiz-adv-x="448" />
+<glyph unicode=" " horiz-adv-x="448" />
+<glyph unicode="	" horiz-adv-x="448" />
+<glyph unicode=" " horiz-adv-x="448" />
+<glyph unicode="¨" horiz-adv-x="1792" />
+<glyph unicode="©" horiz-adv-x="1792" />
+<glyph unicode="®" horiz-adv-x="1792" />
+<glyph unicode="´" horiz-adv-x="1792" />
+<glyph unicode="Æ" horiz-adv-x="1792" />
+<glyph unicode=" " horiz-adv-x="768" />
+<glyph unicode=" " />
+<glyph unicode=" " horiz-adv-x="768" />
+<glyph unicode=" " />
+<glyph unicode=" " horiz-adv-x="512" />
+<glyph unicode=" " horiz-adv-x="384" />
+<glyph unicode=" " horiz-adv-x="256" />
+<glyph unicode=" " horiz-adv-x="256" />
+<glyph unicode=" " horiz-adv-x="192" />
+<glyph unicode=" " horiz-adv-x="307" />
+<glyph unicode=" " horiz-adv-x="85" />
+<glyph unicode=" " horiz-adv-x="307" />
+<glyph unicode=" " horiz-adv-x="384" />
+<glyph unicode="™" horiz-adv-x="1792" />
+<glyph unicode="∞" horiz-adv-x="1792" />
+<glyph unicode="≠" horiz-adv-x="1792" />
+<glyph unicode="" horiz-adv-x="500" d="M0 0z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1699 1350q0 -35 -43 -78l-632 -632v-768h320q26 0 45 -19t19 -45t-19 -45t-45 -19h-896q-26 0 -45 19t-19 45t19 45t45 19h320v768l-632 632q-43 43 -43 78q0 23 18 36.5t38 17.5t43 4h1408q23 0 43 -4t38 -17.5t18 -36.5z" />
+<glyph unicode="" d="M1536 1312v-1120q0 -50 -34 -89t-86 -60.5t-103.5 -32t-96.5 -10.5t-96.5 10.5t-103.5 32t-86 60.5t-34 89t34 89t86 60.5t103.5 32t96.5 10.5q105 0 192 -39v537l-768 -237v-709q0 -50 -34 -89t-86 -60.5t-103.5 -32t-96.5 -10.5t-96.5 10.5t-103.5 32t-86 60.5t-34 89 t34 89t86 60.5t103.5 32t96.5 10.5q105 0 192 -39v967q0 31 19 56.5t49 35.5l832 256q12 4 28 4q40 0 68 -28t28 -68z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1152 704q0 185 -131.5 316.5t-316.5 131.5t-316.5 -131.5t-131.5 -316.5t131.5 -316.5t316.5 -131.5t316.5 131.5t131.5 316.5zM1664 -128q0 -52 -38 -90t-90 -38q-54 0 -90 38l-343 342q-179 -124 -399 -124q-143 0 -273.5 55.5t-225 150t-150 225t-55.5 273.5 t55.5 273.5t150 225t225 150t273.5 55.5t273.5 -55.5t225 -150t150 -225t55.5 -273.5q0 -220 -124 -399l343 -343q37 -37 37 -90z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1664 32v768q-32 -36 -69 -66q-268 -206 -426 -338q-51 -43 -83 -67t-86.5 -48.5t-102.5 -24.5h-1h-1q-48 0 -102.5 24.5t-86.5 48.5t-83 67q-158 132 -426 338q-37 30 -69 66v-768q0 -13 9.5 -22.5t22.5 -9.5h1472q13 0 22.5 9.5t9.5 22.5zM1664 1083v11v13.5t-0.5 13 t-3 12.5t-5.5 9t-9 7.5t-14 2.5h-1472q-13 0 -22.5 -9.5t-9.5 -22.5q0 -168 147 -284q193 -152 401 -317q6 -5 35 -29.5t46 -37.5t44.5 -31.5t50.5 -27.5t43 -9h1h1q20 0 43 9t50.5 27.5t44.5 31.5t46 37.5t35 29.5q208 165 401 317q54 43 100.5 115.5t46.5 131.5z M1792 1120v-1088q0 -66 -47 -113t-113 -47h-1472q-66 0 -113 47t-47 113v1088q0 66 47 113t113 47h1472q66 0 113 -47t47 -113z" />
+<glyph unicode="" horiz-adv-x="1792" d="M896 -128q-26 0 -44 18l-624 602q-10 8 -27.5 26t-55.5 65.5t-68 97.5t-53.5 121t-23.5 138q0 220 127 344t351 124q62 0 126.5 -21.5t120 -58t95.5 -68.5t76 -68q36 36 76 68t95.5 68.5t120 58t126.5 21.5q224 0 351 -124t127 -344q0 -221 -229 -450l-623 -600 q-18 -18 -44 -18z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1664 889q0 -22 -26 -48l-363 -354l86 -500q1 -7 1 -20q0 -21 -10.5 -35.5t-30.5 -14.5q-19 0 -40 12l-449 236l-449 -236q-22 -12 -40 -12q-21 0 -31.5 14.5t-10.5 35.5q0 6 2 20l86 500l-364 354q-25 27 -25 48q0 37 56 46l502 73l225 455q19 41 49 41t49 -41l225 -455 l502 -73q56 -9 56 -46z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1137 532l306 297l-422 62l-189 382l-189 -382l-422 -62l306 -297l-73 -421l378 199l377 -199zM1664 889q0 -22 -26 -48l-363 -354l86 -500q1 -7 1 -20q0 -50 -41 -50q-19 0 -40 12l-449 236l-449 -236q-22 -12 -40 -12q-21 0 -31.5 14.5t-10.5 35.5q0 6 2 20l86 500 l-364 354q-25 27 -25 48q0 37 56 46l502 73l225 455q19 41 49 41t49 -41l225 -455l502 -73q56 -9 56 -46z" />
+<glyph unicode="" horiz-adv-x="1408" d="M1408 131q0 -120 -73 -189.5t-194 -69.5h-874q-121 0 -194 69.5t-73 189.5q0 53 3.5 103.5t14 109t26.5 108.5t43 97.5t62 81t85.5 53.5t111.5 20q9 0 42 -21.5t74.5 -48t108 -48t133.5 -21.5t133.5 21.5t108 48t74.5 48t42 21.5q61 0 111.5 -20t85.5 -53.5t62 -81 t43 -97.5t26.5 -108.5t14 -109t3.5 -103.5zM1088 1024q0 -159 -112.5 -271.5t-271.5 -112.5t-271.5 112.5t-112.5 271.5t112.5 271.5t271.5 112.5t271.5 -112.5t112.5 -271.5z" />
+<glyph unicode="" horiz-adv-x="1920" d="M384 -64v128q0 26 -19 45t-45 19h-128q-26 0 -45 -19t-19 -45v-128q0 -26 19 -45t45 -19h128q26 0 45 19t19 45zM384 320v128q0 26 -19 45t-45 19h-128q-26 0 -45 -19t-19 -45v-128q0 -26 19 -45t45 -19h128q26 0 45 19t19 45zM384 704v128q0 26 -19 45t-45 19h-128 q-26 0 -45 -19t-19 -45v-128q0 -26 19 -45t45 -19h128q26 0 45 19t19 45zM1408 -64v512q0 26 -19 45t-45 19h-768q-26 0 -45 -19t-19 -45v-512q0 -26 19 -45t45 -19h768q26 0 45 19t19 45zM384 1088v128q0 26 -19 45t-45 19h-128q-26 0 -45 -19t-19 -45v-128q0 -26 19 -45 t45 -19h128q26 0 45 19t19 45zM1792 -64v128q0 26 -19 45t-45 19h-128q-26 0 -45 -19t-19 -45v-128q0 -26 19 -45t45 -19h128q26 0 45 19t19 45zM1408 704v512q0 26 -19 45t-45 19h-768q-26 0 -45 -19t-19 -45v-512q0 -26 19 -45t45 -19h768q26 0 45 19t19 45zM1792 320v128 q0 26 -19 45t-45 19h-128q-26 0 -45 -19t-19 -45v-128q0 -26 19 -45t45 -19h128q26 0 45 19t19 45zM1792 704v128q0 26 -19 45t-45 19h-128q-26 0 -45 -19t-19 -45v-128q0 -26 19 -45t45 -19h128q26 0 45 19t19 45zM1792 1088v128q0 26 -19 45t-45 19h-128q-26 0 -45 -19 t-19 -45v-128q0 -26 19 -45t45 -19h128q26 0 45 19t19 45zM1920 1248v-1344q0 -66 -47 -113t-113 -47h-1600q-66 0 -113 47t-47 113v1344q0 66 47 113t113 47h1600q66 0 113 -47t47 -113z" />
+<glyph unicode="" horiz-adv-x="1664" d="M768 512v-384q0 -52 -38 -90t-90 -38h-512q-52 0 -90 38t-38 90v384q0 52 38 90t90 38h512q52 0 90 -38t38 -90zM768 1280v-384q0 -52 -38 -90t-90 -38h-512q-52 0 -90 38t-38 90v384q0 52 38 90t90 38h512q52 0 90 -38t38 -90zM1664 512v-384q0 -52 -38 -90t-90 -38 h-512q-52 0 -90 38t-38 90v384q0 52 38 90t90 38h512q52 0 90 -38t38 -90zM1664 1280v-384q0 -52 -38 -90t-90 -38h-512q-52 0 -90 38t-38 90v384q0 52 38 90t90 38h512q52 0 90 -38t38 -90z" />
+<glyph unicode="" horiz-adv-x="1792" d="M512 288v-192q0 -40 -28 -68t-68 -28h-320q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h320q40 0 68 -28t28 -68zM512 800v-192q0 -40 -28 -68t-68 -28h-320q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h320q40 0 68 -28t28 -68zM1152 288v-192q0 -40 -28 -68t-68 -28h-320 q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h320q40 0 68 -28t28 -68zM512 1312v-192q0 -40 -28 -68t-68 -28h-320q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h320q40 0 68 -28t28 -68zM1152 800v-192q0 -40 -28 -68t-68 -28h-320q-40 0 -68 28t-28 68v192q0 40 28 68t68 28 h320q40 0 68 -28t28 -68zM1792 288v-192q0 -40 -28 -68t-68 -28h-320q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h320q40 0 68 -28t28 -68zM1152 1312v-192q0 -40 -28 -68t-68 -28h-320q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h320q40 0 68 -28t28 -68zM1792 800v-192 q0 -40 -28 -68t-68 -28h-320q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h320q40 0 68 -28t28 -68zM1792 1312v-192q0 -40 -28 -68t-68 -28h-320q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h320q40 0 68 -28t28 -68z" />
+<glyph unicode="" horiz-adv-x="1792" d="M512 288v-192q0 -40 -28 -68t-68 -28h-320q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h320q40 0 68 -28t28 -68zM512 800v-192q0 -40 -28 -68t-68 -28h-320q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h320q40 0 68 -28t28 -68zM1792 288v-192q0 -40 -28 -68t-68 -28h-960 q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h960q40 0 68 -28t28 -68zM512 1312v-192q0 -40 -28 -68t-68 -28h-320q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h320q40 0 68 -28t28 -68zM1792 800v-192q0 -40 -28 -68t-68 -28h-960q-40 0 -68 28t-28 68v192q0 40 28 68t68 28 h960q40 0 68 -28t28 -68zM1792 1312v-192q0 -40 -28 -68t-68 -28h-960q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h960q40 0 68 -28t28 -68z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1671 970q0 -40 -28 -68l-724 -724l-136 -136q-28 -28 -68 -28t-68 28l-136 136l-362 362q-28 28 -28 68t28 68l136 136q28 28 68 28t68 -28l294 -295l656 657q28 28 68 28t68 -28l136 -136q28 -28 28 -68z" />
+<glyph unicode="" horiz-adv-x="1408" d="M1298 214q0 -40 -28 -68l-136 -136q-28 -28 -68 -28t-68 28l-294 294l-294 -294q-28 -28 -68 -28t-68 28l-136 136q-28 28 -28 68t28 68l294 294l-294 294q-28 28 -28 68t28 68l136 136q28 28 68 28t68 -28l294 -294l294 294q28 28 68 28t68 -28l136 -136q28 -28 28 -68 t-28 -68l-294 -294l294 -294q28 -28 28 -68z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1024 736v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-224v-224q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v224h-224q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h224v224q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5v-224h224 q13 0 22.5 -9.5t9.5 -22.5zM1152 704q0 185 -131.5 316.5t-316.5 131.5t-316.5 -131.5t-131.5 -316.5t131.5 -316.5t316.5 -131.5t316.5 131.5t131.5 316.5zM1664 -128q0 -53 -37.5 -90.5t-90.5 -37.5q-54 0 -90 38l-343 342q-179 -124 -399 -124q-143 0 -273.5 55.5 t-225 150t-150 225t-55.5 273.5t55.5 273.5t150 225t225 150t273.5 55.5t273.5 -55.5t225 -150t150 -225t55.5 -273.5q0 -220 -124 -399l343 -343q37 -37 37 -90z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1024 736v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-576q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h576q13 0 22.5 -9.5t9.5 -22.5zM1152 704q0 185 -131.5 316.5t-316.5 131.5t-316.5 -131.5t-131.5 -316.5t131.5 -316.5t316.5 -131.5t316.5 131.5t131.5 316.5z M1664 -128q0 -53 -37.5 -90.5t-90.5 -37.5q-54 0 -90 38l-343 342q-179 -124 -399 -124q-143 0 -273.5 55.5t-225 150t-150 225t-55.5 273.5t55.5 273.5t150 225t225 150t273.5 55.5t273.5 -55.5t225 -150t150 -225t55.5 -273.5q0 -220 -124 -399l343 -343q37 -37 37 -90z " />
+<glyph unicode="" d="M1536 640q0 -156 -61 -298t-164 -245t-245 -164t-298 -61t-298 61t-245 164t-164 245t-61 298q0 182 80.5 343t226.5 270q43 32 95.5 25t83.5 -50q32 -42 24.5 -94.5t-49.5 -84.5q-98 -74 -151.5 -181t-53.5 -228q0 -104 40.5 -198.5t109.5 -163.5t163.5 -109.5 t198.5 -40.5t198.5 40.5t163.5 109.5t109.5 163.5t40.5 198.5q0 121 -53.5 228t-151.5 181q-42 32 -49.5 84.5t24.5 94.5q31 43 84 50t95 -25q146 -109 226.5 -270t80.5 -343zM896 1408v-640q0 -52 -38 -90t-90 -38t-90 38t-38 90v640q0 52 38 90t90 38t90 -38t38 -90z" />
+<glyph unicode="" horiz-adv-x="1792" d="M256 96v-192q0 -14 -9 -23t-23 -9h-192q-14 0 -23 9t-9 23v192q0 14 9 23t23 9h192q14 0 23 -9t9 -23zM640 224v-320q0 -14 -9 -23t-23 -9h-192q-14 0 -23 9t-9 23v320q0 14 9 23t23 9h192q14 0 23 -9t9 -23zM1024 480v-576q0 -14 -9 -23t-23 -9h-192q-14 0 -23 9t-9 23 v576q0 14 9 23t23 9h192q14 0 23 -9t9 -23zM1408 864v-960q0 -14 -9 -23t-23 -9h-192q-14 0 -23 9t-9 23v960q0 14 9 23t23 9h192q14 0 23 -9t9 -23zM1792 1376v-1472q0 -14 -9 -23t-23 -9h-192q-14 0 -23 9t-9 23v1472q0 14 9 23t23 9h192q14 0 23 -9t9 -23z" />
+<glyph unicode="" d="M1024 640q0 106 -75 181t-181 75t-181 -75t-75 -181t75 -181t181 -75t181 75t75 181zM1536 749v-222q0 -12 -8 -23t-20 -13l-185 -28q-19 -54 -39 -91q35 -50 107 -138q10 -12 10 -25t-9 -23q-27 -37 -99 -108t-94 -71q-12 0 -26 9l-138 108q-44 -23 -91 -38 q-16 -136 -29 -186q-7 -28 -36 -28h-222q-14 0 -24.5 8.5t-11.5 21.5l-28 184q-49 16 -90 37l-141 -107q-10 -9 -25 -9q-14 0 -25 11q-126 114 -165 168q-7 10 -7 23q0 12 8 23q15 21 51 66.5t54 70.5q-27 50 -41 99l-183 27q-13 2 -21 12.5t-8 23.5v222q0 12 8 23t19 13 l186 28q14 46 39 92q-40 57 -107 138q-10 12 -10 24q0 10 9 23q26 36 98.5 107.5t94.5 71.5q13 0 26 -10l138 -107q44 23 91 38q16 136 29 186q7 28 36 28h222q14 0 24.5 -8.5t11.5 -21.5l28 -184q49 -16 90 -37l142 107q9 9 24 9q13 0 25 -10q129 -119 165 -170q7 -8 7 -22 q0 -12 -8 -23q-15 -21 -51 -66.5t-54 -70.5q26 -50 41 -98l183 -28q13 -2 21 -12.5t8 -23.5z" />
+<glyph unicode="" horiz-adv-x="1408" d="M512 800v-576q0 -14 -9 -23t-23 -9h-64q-14 0 -23 9t-9 23v576q0 14 9 23t23 9h64q14 0 23 -9t9 -23zM768 800v-576q0 -14 -9 -23t-23 -9h-64q-14 0 -23 9t-9 23v576q0 14 9 23t23 9h64q14 0 23 -9t9 -23zM1024 800v-576q0 -14 -9 -23t-23 -9h-64q-14 0 -23 9t-9 23v576 q0 14 9 23t23 9h64q14 0 23 -9t9 -23zM1152 76v948h-896v-948q0 -22 7 -40.5t14.5 -27t10.5 -8.5h832q3 0 10.5 8.5t14.5 27t7 40.5zM480 1152h448l-48 117q-7 9 -17 11h-317q-10 -2 -17 -11zM1408 1120v-64q0 -14 -9 -23t-23 -9h-96v-948q0 -83 -47 -143.5t-113 -60.5h-832 q-66 0 -113 58.5t-47 141.5v952h-96q-14 0 -23 9t-9 23v64q0 14 9 23t23 9h309l70 167q15 37 54 63t79 26h320q40 0 79 -26t54 -63l70 -167h309q14 0 23 -9t9 -23z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1408 544v-480q0 -26 -19 -45t-45 -19h-384v384h-256v-384h-384q-26 0 -45 19t-19 45v480q0 1 0.5 3t0.5 3l575 474l575 -474q1 -2 1 -6zM1631 613l-62 -74q-8 -9 -21 -11h-3q-13 0 -21 7l-692 577l-692 -577q-12 -8 -24 -7q-13 2 -21 11l-62 74q-8 10 -7 23.5t11 21.5 l719 599q32 26 76 26t76 -26l244 -204v195q0 14 9 23t23 9h192q14 0 23 -9t9 -23v-408l219 -182q10 -8 11 -21.5t-7 -23.5z" />
+<glyph unicode="" horiz-adv-x="1280" d="M128 0h1024v768h-416q-40 0 -68 28t-28 68v416h-512v-1280zM768 896h299l-299 299v-299zM1280 768v-800q0 -40 -28 -68t-68 -28h-1088q-40 0 -68 28t-28 68v1344q0 40 28 68t68 28h544q40 0 88 -20t76 -48l408 -408q28 -28 48 -76t20 -88z" />
+<glyph unicode="" d="M1088 608v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-384q-13 0 -22.5 9.5t-9.5 22.5v448q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5v-352h288q13 0 22.5 -9.5t9.5 -22.5zM1280 640q0 104 -40.5 198.5t-109.5 163.5t-163.5 109.5t-198.5 40.5t-198.5 -40.5 t-163.5 -109.5t-109.5 -163.5t-40.5 -198.5t40.5 -198.5t109.5 -163.5t163.5 -109.5t198.5 -40.5t198.5 40.5t163.5 109.5t109.5 163.5t40.5 198.5zM1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5 t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" horiz-adv-x="1920" d="M1111 540v4l-24 320q-1 13 -11 22.5t-23 9.5h-186q-13 0 -23 -9.5t-11 -22.5l-24 -320v-4q-1 -12 8 -20t21 -8h244q12 0 21 8t8 20zM1870 73q0 -73 -46 -73h-704q13 0 22 9.5t8 22.5l-20 256q-1 13 -11 22.5t-23 9.5h-272q-13 0 -23 -9.5t-11 -22.5l-20 -256 q-1 -13 8 -22.5t22 -9.5h-704q-46 0 -46 73q0 54 26 116l417 1044q8 19 26 33t38 14h339q-13 0 -23 -9.5t-11 -22.5l-15 -192q-1 -14 8 -23t22 -9h166q13 0 22 9t8 23l-15 192q-1 13 -11 22.5t-23 9.5h339q20 0 38 -14t26 -33l417 -1044q26 -62 26 -116z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1339 729q17 -41 -14 -70l-448 -448q-18 -19 -45 -19t-45 19l-448 448q-31 29 -14 70q17 39 59 39h256v448q0 26 19 45t45 19h256q26 0 45 -19t19 -45v-448h256q42 0 59 -39zM1632 512q14 0 23 -9t9 -23v-576q0 -14 -9 -23t-23 -9h-1600q-14 0 -23 9t-9 23v576q0 14 9 23 t23 9h192q14 0 23 -9t9 -23v-352h1152v352q0 14 9 23t23 9h192z" />
+<glyph unicode="" d="M1120 608q0 -12 -10 -24l-319 -319q-9 -9 -23 -9t-23 9l-320 320q-9 9 -9 23q0 13 9.5 22.5t22.5 9.5h192v352q0 13 9.5 22.5t22.5 9.5h192q13 0 22.5 -9.5t9.5 -22.5v-352h192q14 0 23 -9t9 -23zM1280 640q0 104 -40.5 198.5t-109.5 163.5t-163.5 109.5t-198.5 40.5 t-198.5 -40.5t-163.5 -109.5t-109.5 -163.5t-40.5 -198.5t40.5 -198.5t109.5 -163.5t163.5 -109.5t198.5 -40.5t198.5 40.5t163.5 109.5t109.5 163.5t40.5 198.5zM1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5 t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1120 672q0 -13 -9.5 -22.5t-22.5 -9.5h-192v-352q0 -13 -9.5 -22.5t-22.5 -9.5h-192q-13 0 -22.5 9.5t-9.5 22.5v352h-192q-14 0 -23 9t-9 23q0 12 10 24l319 319q9 9 23 9t23 -9l320 -320q9 -9 9 -23zM1280 640q0 104 -40.5 198.5t-109.5 163.5t-163.5 109.5 t-198.5 40.5t-198.5 -40.5t-163.5 -109.5t-109.5 -163.5t-40.5 -198.5t40.5 -198.5t109.5 -163.5t163.5 -109.5t198.5 -40.5t198.5 40.5t163.5 109.5t109.5 163.5t40.5 198.5zM1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5 t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1023 576h316q-1 3 -2.5 8t-2.5 8l-212 496h-708l-212 -496q-1 -2 -2.5 -8t-2.5 -8h316l95 -192h320zM1536 546v-482q0 -26 -19 -45t-45 -19h-1408q-26 0 -45 19t-19 45v482q0 62 25 123l238 552q10 25 36.5 42t52.5 17h832q26 0 52.5 -17t36.5 -42l238 -552 q25 -61 25 -123z" />
+<glyph unicode="" d="M1152 640q0 -37 -33 -56l-512 -288q-14 -8 -31 -8t-32 9q-32 18 -32 55v576q0 37 32 55q31 20 63 1l512 -288q33 -19 33 -56zM1280 640q0 104 -40.5 198.5t-109.5 163.5t-163.5 109.5t-198.5 40.5t-198.5 -40.5t-163.5 -109.5t-109.5 -163.5t-40.5 -198.5t40.5 -198.5 t109.5 -163.5t163.5 -109.5t198.5 -40.5t198.5 40.5t163.5 109.5t109.5 163.5t40.5 198.5zM1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1536 1280v-448q0 -26 -19 -45t-45 -19h-448q-42 0 -59 40q-17 39 14 69l138 138q-148 137 -349 137q-104 0 -198.5 -40.5t-163.5 -109.5t-109.5 -163.5t-40.5 -198.5t40.5 -198.5t109.5 -163.5t163.5 -109.5t198.5 -40.5q169 0 304 99.5t185 261.5q7 23 30 23h199 q16 0 25 -12q10 -13 7 -27q-39 -175 -147.5 -312t-266 -213t-336.5 -76q-156 0 -298 61t-245 164t-164 245t-61 298t61 298t164 245t245 164t298 61q147 0 284.5 -55.5t244.5 -156.5l130 129q29 31 70 14q39 -17 39 -59z" />
+<glyph unicode="" d="M1511 480q0 -5 -1 -7q-64 -268 -268 -434.5t-478 -166.5q-146 0 -282.5 55t-243.5 157l-129 -129q-19 -19 -45 -19t-45 19t-19 45v448q0 26 19 45t45 19h448q26 0 45 -19t19 -45t-19 -45l-137 -137q71 -66 161 -102t187 -36q134 0 250 65t186 179q11 17 53 117 q8 23 30 23h192q13 0 22.5 -9.5t9.5 -22.5zM1536 1280v-448q0 -26 -19 -45t-45 -19h-448q-26 0 -45 19t-19 45t19 45l138 138q-148 137 -349 137q-134 0 -250 -65t-186 -179q-11 -17 -53 -117q-8 -23 -30 -23h-199q-13 0 -22.5 9.5t-9.5 22.5v7q65 268 270 434.5t480 166.5 q146 0 284 -55.5t245 -156.5l130 129q19 19 45 19t45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1792" d="M384 352v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM384 608v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M384 864v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM1536 352v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-960q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h960q13 0 22.5 -9.5t9.5 -22.5z M1536 608v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-960q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h960q13 0 22.5 -9.5t9.5 -22.5zM1536 864v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-960q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h960q13 0 22.5 -9.5 t9.5 -22.5zM1664 160v832q0 13 -9.5 22.5t-22.5 9.5h-1472q-13 0 -22.5 -9.5t-9.5 -22.5v-832q0 -13 9.5 -22.5t22.5 -9.5h1472q13 0 22.5 9.5t9.5 22.5zM1792 1248v-1088q0 -66 -47 -113t-113 -47h-1472q-66 0 -113 47t-47 113v1088q0 66 47 113t113 47h1472q66 0 113 -47 t47 -113z" />
+<glyph unicode="" horiz-adv-x="1152" d="M704 512q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5q0 -37 19 -67t51 -47l-69 -229q-5 -15 5 -28t26 -13h192q16 0 26 13t5 28l-69 229q32 17 51 47t19 67zM320 768h512v192q0 106 -75 181t-181 75t-181 -75t-75 -181v-192zM1152 672v-576q0 -40 -28 -68 t-68 -28h-960q-40 0 -68 28t-28 68v576q0 40 28 68t68 28h32v192q0 184 132 316t316 132t316 -132t132 -316v-192h32q40 0 68 -28t28 -68z" />
+<glyph unicode="" horiz-adv-x="1792" d="M320 1280q0 -72 -64 -110v-1266q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v1266q-64 38 -64 110q0 53 37.5 90.5t90.5 37.5t90.5 -37.5t37.5 -90.5zM1792 1216v-763q0 -25 -12.5 -38.5t-39.5 -27.5q-215 -116 -369 -116q-61 0 -123.5 22t-108.5 48 t-115.5 48t-142.5 22q-192 0 -464 -146q-17 -9 -33 -9q-26 0 -45 19t-19 45v742q0 32 31 55q21 14 79 43q236 120 421 120q107 0 200 -29t219 -88q38 -19 88 -19q54 0 117.5 21t110 47t88 47t54.5 21q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1664 650q0 -166 -60 -314l-20 -49l-185 -33q-22 -83 -90.5 -136.5t-156.5 -53.5v-32q0 -14 -9 -23t-23 -9h-64q-14 0 -23 9t-9 23v576q0 14 9 23t23 9h64q14 0 23 -9t9 -23v-32q71 0 130 -35.5t93 -95.5l68 12q29 95 29 193q0 148 -88 279t-236.5 209t-315.5 78 t-315.5 -78t-236.5 -209t-88 -279q0 -98 29 -193l68 -12q34 60 93 95.5t130 35.5v32q0 14 9 23t23 9h64q14 0 23 -9t9 -23v-576q0 -14 -9 -23t-23 -9h-64q-14 0 -23 9t-9 23v32q-88 0 -156.5 53.5t-90.5 136.5l-185 33l-20 49q-60 148 -60 314q0 151 67 291t179 242.5 t266 163.5t320 61t320 -61t266 -163.5t179 -242.5t67 -291z" />
+<glyph unicode="" horiz-adv-x="768" d="M768 1184v-1088q0 -26 -19 -45t-45 -19t-45 19l-333 333h-262q-26 0 -45 19t-19 45v384q0 26 19 45t45 19h262l333 333q19 19 45 19t45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1152" d="M768 1184v-1088q0 -26 -19 -45t-45 -19t-45 19l-333 333h-262q-26 0 -45 19t-19 45v384q0 26 19 45t45 19h262l333 333q19 19 45 19t45 -19t19 -45zM1152 640q0 -76 -42.5 -141.5t-112.5 -93.5q-10 -5 -25 -5q-26 0 -45 18.5t-19 45.5q0 21 12 35.5t29 25t34 23t29 35.5 t12 57t-12 57t-29 35.5t-34 23t-29 25t-12 35.5q0 27 19 45.5t45 18.5q15 0 25 -5q70 -27 112.5 -93t42.5 -142z" />
+<glyph unicode="" horiz-adv-x="1664" d="M768 1184v-1088q0 -26 -19 -45t-45 -19t-45 19l-333 333h-262q-26 0 -45 19t-19 45v384q0 26 19 45t45 19h262l333 333q19 19 45 19t45 -19t19 -45zM1152 640q0 -76 -42.5 -141.5t-112.5 -93.5q-10 -5 -25 -5q-26 0 -45 18.5t-19 45.5q0 21 12 35.5t29 25t34 23t29 35.5 t12 57t-12 57t-29 35.5t-34 23t-29 25t-12 35.5q0 27 19 45.5t45 18.5q15 0 25 -5q70 -27 112.5 -93t42.5 -142zM1408 640q0 -153 -85 -282.5t-225 -188.5q-13 -5 -25 -5q-27 0 -46 19t-19 45q0 39 39 59q56 29 76 44q74 54 115.5 135.5t41.5 173.5t-41.5 173.5 t-115.5 135.5q-20 15 -76 44q-39 20 -39 59q0 26 19 45t45 19q13 0 26 -5q140 -59 225 -188.5t85 -282.5zM1664 640q0 -230 -127 -422.5t-338 -283.5q-13 -5 -26 -5q-26 0 -45 19t-19 45q0 36 39 59q7 4 22.5 10.5t22.5 10.5q46 25 82 51q123 91 192 227t69 289t-69 289 t-192 227q-36 26 -82 51q-7 4 -22.5 10.5t-22.5 10.5q-39 23 -39 59q0 26 19 45t45 19q13 0 26 -5q211 -91 338 -283.5t127 -422.5z" />
+<glyph unicode="" horiz-adv-x="1408" d="M384 384v-128h-128v128h128zM384 1152v-128h-128v128h128zM1152 1152v-128h-128v128h128zM128 129h384v383h-384v-383zM128 896h384v384h-384v-384zM896 896h384v384h-384v-384zM640 640v-640h-640v640h640zM1152 128v-128h-128v128h128zM1408 128v-128h-128v128h128z M1408 640v-384h-384v128h-128v-384h-128v640h384v-128h128v128h128zM640 1408v-640h-640v640h640zM1408 1408v-640h-640v640h640z" />
+<glyph unicode="" horiz-adv-x="1792" d="M672 1408v-1536h-64v1536h64zM1408 1408v-1536h-64v1536h64zM1568 1408v-1536h-64v1536h64zM576 1408v-1536h-64v1536h64zM1280 1408v-1536h-256v1536h256zM896 1408v-1536h-128v1536h128zM448 1408v-1536h-128v1536h128zM1792 1408v-1536h-128v1536h128zM256 1408v-1536 h-256v1536h256z" />
+<glyph unicode="" d="M448 1088q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM1515 512q0 -53 -37 -90l-491 -492q-39 -37 -91 -37q-53 0 -90 37l-715 716q-38 37 -64.5 101t-26.5 117v416q0 52 38 90t90 38h416q53 0 117 -26.5t102 -64.5 l715 -714q37 -39 37 -91z" />
+<glyph unicode="" horiz-adv-x="1920" d="M448 1088q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM1515 512q0 -53 -37 -90l-491 -492q-39 -37 -91 -37q-53 0 -90 37l-715 716q-38 37 -64.5 101t-26.5 117v416q0 52 38 90t90 38h416q53 0 117 -26.5t102 -64.5 l715 -714q37 -39 37 -91zM1899 512q0 -53 -37 -90l-491 -492q-39 -37 -91 -37q-36 0 -59 14t-53 45l470 470q37 37 37 90q0 52 -37 91l-715 714q-38 38 -102 64.5t-117 26.5h224q53 0 117 -26.5t102 -64.5l715 -714q37 -39 37 -91z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1639 1058q40 -57 18 -129l-275 -906q-19 -64 -76.5 -107.5t-122.5 -43.5h-923q-77 0 -148.5 53.5t-99.5 131.5q-24 67 -2 127q0 4 3 27t4 37q1 8 -3 21.5t-3 19.5q2 11 8 21t16.5 23.5t16.5 23.5q23 38 45 91.5t30 91.5q3 10 0.5 30t-0.5 28q3 11 17 28t17 23 q21 36 42 92t25 90q1 9 -2.5 32t0.5 28q4 13 22 30.5t22 22.5q19 26 42.5 84.5t27.5 96.5q1 8 -3 25.5t-2 26.5q2 8 9 18t18 23t17 21q8 12 16.5 30.5t15 35t16 36t19.5 32t26.5 23.5t36 11.5t47.5 -5.5l-1 -3q38 9 51 9h761q74 0 114 -56t18 -130l-274 -906 q-36 -119 -71.5 -153.5t-128.5 -34.5h-869q-27 0 -38 -15q-11 -16 -1 -43q24 -70 144 -70h923q29 0 56 15.5t35 41.5l300 987q7 22 5 57q38 -15 59 -43zM575 1056q-4 -13 2 -22.5t20 -9.5h608q13 0 25.5 9.5t16.5 22.5l21 64q4 13 -2 22.5t-20 9.5h-608q-13 0 -25.5 -9.5 t-16.5 -22.5zM492 800q-4 -13 2 -22.5t20 -9.5h608q13 0 25.5 9.5t16.5 22.5l21 64q4 13 -2 22.5t-20 9.5h-608q-13 0 -25.5 -9.5t-16.5 -22.5z" />
+<glyph unicode="" horiz-adv-x="1280" d="M1164 1408q23 0 44 -9q33 -13 52.5 -41t19.5 -62v-1289q0 -34 -19.5 -62t-52.5 -41q-19 -8 -44 -8q-48 0 -83 32l-441 424l-441 -424q-36 -33 -83 -33q-23 0 -44 9q-33 13 -52.5 41t-19.5 62v1289q0 34 19.5 62t52.5 41q21 9 44 9h1048z" />
+<glyph unicode="" horiz-adv-x="1664" d="M384 0h896v256h-896v-256zM384 640h896v384h-160q-40 0 -68 28t-28 68v160h-640v-640zM1536 576q0 26 -19 45t-45 19t-45 -19t-19 -45t19 -45t45 -19t45 19t19 45zM1664 576v-416q0 -13 -9.5 -22.5t-22.5 -9.5h-224v-160q0 -40 -28 -68t-68 -28h-960q-40 0 -68 28t-28 68 v160h-224q-13 0 -22.5 9.5t-9.5 22.5v416q0 79 56.5 135.5t135.5 56.5h64v544q0 40 28 68t68 28h672q40 0 88 -20t76 -48l152 -152q28 -28 48 -76t20 -88v-256h64q79 0 135.5 -56.5t56.5 -135.5z" />
+<glyph unicode="" horiz-adv-x="1920" d="M960 864q119 0 203.5 -84.5t84.5 -203.5t-84.5 -203.5t-203.5 -84.5t-203.5 84.5t-84.5 203.5t84.5 203.5t203.5 84.5zM1664 1280q106 0 181 -75t75 -181v-896q0 -106 -75 -181t-181 -75h-1408q-106 0 -181 75t-75 181v896q0 106 75 181t181 75h224l51 136 q19 49 69.5 84.5t103.5 35.5h512q53 0 103.5 -35.5t69.5 -84.5l51 -136h224zM960 128q185 0 316.5 131.5t131.5 316.5t-131.5 316.5t-316.5 131.5t-316.5 -131.5t-131.5 -316.5t131.5 -316.5t316.5 -131.5z" />
+<glyph unicode="" horiz-adv-x="1664" d="M725 977l-170 -450q73 -1 153.5 -2t119 -1.5t52.5 -0.5l29 2q-32 95 -92 241q-53 132 -92 211zM21 -128h-21l2 79q22 7 80 18q89 16 110 31q20 16 48 68l237 616l280 724h75h53l11 -21l205 -480q103 -242 124 -297q39 -102 96 -235q26 -58 65 -164q24 -67 65 -149 q22 -49 35 -57q22 -19 69 -23q47 -6 103 -27q6 -39 6 -57q0 -14 -1 -26q-80 0 -192 8q-93 8 -189 8q-79 0 -135 -2l-200 -11l-58 -2q0 45 4 78l131 28q56 13 68 23q12 12 12 27t-6 32l-47 114l-92 228l-450 2q-29 -65 -104 -274q-23 -64 -23 -84q0 -31 17 -43 q26 -21 103 -32q3 0 13.5 -2t30 -5t40.5 -6q1 -28 1 -58q0 -17 -2 -27q-66 0 -349 20l-48 -8q-81 -14 -167 -14z" />
+<glyph unicode="" horiz-adv-x="1408" d="M555 15q76 -32 140 -32q131 0 216 41t122 113q38 70 38 181q0 114 -41 180q-58 94 -141 126q-80 32 -247 32q-74 0 -101 -10v-144l-1 -173l3 -270q0 -15 12 -44zM541 761q43 -7 109 -7q175 0 264 65t89 224q0 112 -85 187q-84 75 -255 75q-52 0 -130 -13q0 -44 2 -77 q7 -122 6 -279l-1 -98q0 -43 1 -77zM0 -128l2 94q45 9 68 12q77 12 123 31q17 27 21 51q9 66 9 194l-2 497q-5 256 -9 404q-1 87 -11 109q-1 4 -12 12q-18 12 -69 15q-30 2 -114 13l-4 83l260 6l380 13l45 1q5 0 14 0.5t14 0.5q1 0 21.5 -0.5t40.5 -0.5h74q88 0 191 -27 q43 -13 96 -39q57 -29 102 -76q44 -47 65 -104t21 -122q0 -70 -32 -128t-95 -105q-26 -20 -150 -77q177 -41 267 -146q92 -106 92 -236q0 -76 -29 -161q-21 -62 -71 -117q-66 -72 -140 -108q-73 -36 -203 -60q-82 -15 -198 -11l-197 4q-84 2 -298 -11q-33 -3 -272 -11z" />
+<glyph unicode="" horiz-adv-x="1024" d="M0 -126l17 85q4 1 77 20q76 19 116 39q29 37 41 101l27 139l56 268l12 64q8 44 17 84.5t16 67t12.5 46.5t9 30.5t3.5 11.5l29 157l16 63l22 135l8 50v38q-41 22 -144 28q-28 2 -38 4l19 103l317 -14q39 -2 73 -2q66 0 214 9q33 2 68 4.5t36 2.5q-2 -19 -6 -38 q-7 -29 -13 -51q-55 -19 -109 -31q-64 -16 -101 -31q-12 -31 -24 -88q-9 -44 -13 -82q-44 -199 -66 -306l-61 -311l-38 -158l-43 -235l-12 -45q-2 -7 1 -27q64 -15 119 -21q36 -5 66 -10q-1 -29 -7 -58q-7 -31 -9 -41q-18 0 -23 -1q-24 -2 -42 -2q-9 0 -28 3q-19 4 -145 17 l-198 2q-41 1 -174 -11q-74 -7 -98 -9z" />
+<glyph unicode="" horiz-adv-x="1792" d="M81 1407l54 -27q20 -5 211 -5h130l19 3l115 1l215 -1h293l34 -2q14 -1 28 7t21 16l7 8l42 1q15 0 28 -1v-104.5t1 -131.5l1 -100l-1 -58q0 -32 -4 -51q-39 -15 -68 -18q-25 43 -54 128q-8 24 -15.5 62.5t-11.5 65.5t-6 29q-13 15 -27 19q-7 2 -42.5 2t-103.5 -1t-111 -1 q-34 0 -67 -5q-10 -97 -8 -136l1 -152v-332l3 -359l-1 -147q-1 -46 11 -85q49 -25 89 -32q2 0 18 -5t44 -13t43 -12q30 -8 50 -18q5 -45 5 -50q0 -10 -3 -29q-14 -1 -34 -1q-110 0 -187 10q-72 8 -238 8q-88 0 -233 -14q-48 -4 -70 -4q-2 22 -2 26l-1 26v9q21 33 79 49 q139 38 159 50q9 21 12 56q8 192 6 433l-5 428q-1 62 -0.5 118.5t0.5 102.5t-2 57t-6 15q-6 5 -14 6q-38 6 -148 6q-43 0 -100 -13.5t-73 -24.5q-13 -9 -22 -33t-22 -75t-24 -84q-6 -19 -19.5 -32t-20.5 -13q-44 27 -56 44v297v86zM1744 128q33 0 42 -18.5t-11 -44.5 l-126 -162q-20 -26 -49 -26t-49 26l-126 162q-20 26 -11 44.5t42 18.5h80v1024h-80q-33 0 -42 18.5t11 44.5l126 162q20 26 49 26t49 -26l126 -162q20 -26 11 -44.5t-42 -18.5h-80v-1024h80z" />
+<glyph unicode="" d="M81 1407l54 -27q20 -5 211 -5h130l19 3l115 1l446 -1h318l34 -2q14 -1 28 7t21 16l7 8l42 1q15 0 28 -1v-104.5t1 -131.5l1 -100l-1 -58q0 -32 -4 -51q-39 -15 -68 -18q-25 43 -54 128q-8 24 -15.5 62.5t-11.5 65.5t-6 29q-13 15 -27 19q-7 2 -58.5 2t-138.5 -1t-128 -1 q-94 0 -127 -5q-10 -97 -8 -136l1 -152v52l3 -359l-1 -147q-1 -46 11 -85q49 -25 89 -32q2 0 18 -5t44 -13t43 -12q30 -8 50 -18q5 -45 5 -50q0 -10 -3 -29q-14 -1 -34 -1q-110 0 -187 10q-72 8 -238 8q-82 0 -233 -13q-45 -5 -70 -5q-2 22 -2 26l-1 26v9q21 33 79 49 q139 38 159 50q9 21 12 56q6 137 6 433l-5 44q0 265 -2 278q-2 11 -6 15q-6 5 -14 6q-38 6 -148 6q-50 0 -168.5 -14t-132.5 -24q-13 -9 -22 -33t-22 -75t-24 -84q-6 -19 -19.5 -32t-20.5 -13q-44 27 -56 44v297v86zM1505 113q26 -20 26 -49t-26 -49l-162 -126 q-26 -20 -44.5 -11t-18.5 42v80h-1024v-80q0 -33 -18.5 -42t-44.5 11l-162 126q-26 20 -26 49t26 49l162 126q26 20 44.5 11t18.5 -42v-80h1024v80q0 33 18.5 42t44.5 -11z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1792 192v-128q0 -26 -19 -45t-45 -19h-1664q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1664q26 0 45 -19t19 -45zM1408 576v-128q0 -26 -19 -45t-45 -19h-1280q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1280q26 0 45 -19t19 -45zM1664 960v-128q0 -26 -19 -45 t-45 -19h-1536q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1536q26 0 45 -19t19 -45zM1280 1344v-128q0 -26 -19 -45t-45 -19h-1152q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1152q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1792 192v-128q0 -26 -19 -45t-45 -19h-1664q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1664q26 0 45 -19t19 -45zM1408 576v-128q0 -26 -19 -45t-45 -19h-896q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h896q26 0 45 -19t19 -45zM1664 960v-128q0 -26 -19 -45t-45 -19 h-1408q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1408q26 0 45 -19t19 -45zM1280 1344v-128q0 -26 -19 -45t-45 -19h-640q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h640q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1792 192v-128q0 -26 -19 -45t-45 -19h-1664q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1664q26 0 45 -19t19 -45zM1792 576v-128q0 -26 -19 -45t-45 -19h-1280q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1280q26 0 45 -19t19 -45zM1792 960v-128q0 -26 -19 -45 t-45 -19h-1536q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1536q26 0 45 -19t19 -45zM1792 1344v-128q0 -26 -19 -45t-45 -19h-1152q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1152q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1792 192v-128q0 -26 -19 -45t-45 -19h-1664q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1664q26 0 45 -19t19 -45zM1792 576v-128q0 -26 -19 -45t-45 -19h-1664q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1664q26 0 45 -19t19 -45zM1792 960v-128q0 -26 -19 -45 t-45 -19h-1664q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1664q26 0 45 -19t19 -45zM1792 1344v-128q0 -26 -19 -45t-45 -19h-1664q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1664q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1792" d="M256 224v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-192q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h192q13 0 22.5 -9.5t9.5 -22.5zM256 608v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-192q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h192q13 0 22.5 -9.5 t9.5 -22.5zM256 992v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-192q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h192q13 0 22.5 -9.5t9.5 -22.5zM1792 224v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1344q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h1344 q13 0 22.5 -9.5t9.5 -22.5zM256 1376v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-192q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h192q13 0 22.5 -9.5t9.5 -22.5zM1792 608v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1344q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5 t22.5 9.5h1344q13 0 22.5 -9.5t9.5 -22.5zM1792 992v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1344q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h1344q13 0 22.5 -9.5t9.5 -22.5zM1792 1376v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1344q-13 0 -22.5 9.5t-9.5 22.5v192 q0 13 9.5 22.5t22.5 9.5h1344q13 0 22.5 -9.5t9.5 -22.5z" />
+<glyph unicode="" horiz-adv-x="1792" d="M384 992v-576q0 -13 -9.5 -22.5t-22.5 -9.5q-14 0 -23 9l-288 288q-9 9 -9 23t9 23l288 288q9 9 23 9q13 0 22.5 -9.5t9.5 -22.5zM1792 224v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1728q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h1728q13 0 22.5 -9.5 t9.5 -22.5zM1792 608v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1088q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h1088q13 0 22.5 -9.5t9.5 -22.5zM1792 992v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1088q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h1088 q13 0 22.5 -9.5t9.5 -22.5zM1792 1376v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1728q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h1728q13 0 22.5 -9.5t9.5 -22.5z" />
+<glyph unicode="" horiz-adv-x="1792" d="M352 704q0 -14 -9 -23l-288 -288q-9 -9 -23 -9q-13 0 -22.5 9.5t-9.5 22.5v576q0 13 9.5 22.5t22.5 9.5q14 0 23 -9l288 -288q9 -9 9 -23zM1792 224v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1728q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h1728q13 0 22.5 -9.5 t9.5 -22.5zM1792 608v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1088q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h1088q13 0 22.5 -9.5t9.5 -22.5zM1792 992v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1088q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h1088 q13 0 22.5 -9.5t9.5 -22.5zM1792 1376v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1728q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h1728q13 0 22.5 -9.5t9.5 -22.5z" />
+<glyph unicode="" horiz-adv-x="1920" d="M1900 1278q20 -8 20 -30v-1216q0 -22 -20 -30q-8 -2 -12 -2q-12 0 -23 9l-585 586v-307q0 -119 -84.5 -203.5t-203.5 -84.5h-704q-119 0 -203.5 84.5t-84.5 203.5v704q0 119 84.5 203.5t203.5 84.5h704q119 0 203.5 -84.5t84.5 -203.5v-307l585 586q16 15 35 7z" />
+<glyph unicode="" horiz-adv-x="1920" d="M640 960q0 -80 -56 -136t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM1664 576v-448h-1408v192l320 320l160 -160l512 512zM1760 1280h-1600q-13 0 -22.5 -9.5t-9.5 -22.5v-1216q0 -13 9.5 -22.5t22.5 -9.5h1600q13 0 22.5 9.5t9.5 22.5v1216 q0 13 -9.5 22.5t-22.5 9.5zM1920 1248v-1216q0 -66 -47 -113t-113 -47h-1600q-66 0 -113 47t-47 113v1216q0 66 47 113t113 47h1600q66 0 113 -47t47 -113z" />
+<glyph unicode="" d="M363 0l91 91l-235 235l-91 -91v-107h128v-128h107zM886 928q0 22 -22 22q-10 0 -17 -7l-542 -542q-7 -7 -7 -17q0 -22 22 -22q10 0 17 7l542 542q7 7 7 17zM832 1120l416 -416l-832 -832h-416v416zM1515 1024q0 -53 -37 -90l-166 -166l-416 416l166 165q36 38 90 38 q53 0 91 -38l235 -234q37 -39 37 -91z" />
+<glyph unicode="" horiz-adv-x="1024" d="M768 896q0 106 -75 181t-181 75t-181 -75t-75 -181t75 -181t181 -75t181 75t75 181zM1024 896q0 -109 -33 -179l-364 -774q-16 -33 -47.5 -52t-67.5 -19t-67.5 19t-46.5 52l-365 774q-33 70 -33 179q0 212 150 362t362 150t362 -150t150 -362z" />
+<glyph unicode="" d="M768 1408q209 0 385.5 -103t279.5 -279.5t103 -385.5t-103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103zM256 640q0 -104 40.5 -198.5t109.5 -163.5t163.5 -109.5t198.5 -40.5v1024q-104 0 -198.5 -40.5 t-163.5 -109.5t-109.5 -163.5t-40.5 -198.5z" />
+<glyph unicode="" horiz-adv-x="1024" d="M512 384q0 36 -20 69q-1 1 -15.5 22.5t-25.5 38t-25 44t-21 50.5q-4 16 -21 16t-21 -16q-7 -23 -21 -50.5t-25 -44t-25.5 -38t-15.5 -22.5q-20 -33 -20 -69q0 -53 37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM1024 512q0 -212 -150 -362t-362 -150t-362 150t-150 362 q0 145 81 275q6 9 62.5 90.5t101 151t99.5 178t83 201.5q9 30 34 47t51 17t51.5 -17t33.5 -47q28 -93 83 -201.5t99.5 -178t101 -151t62.5 -90.5q81 -127 81 -275z" />
+<glyph unicode="" horiz-adv-x="1792" d="M888 352l116 116l-152 152l-116 -116v-56h96v-96h56zM1328 1072q-16 16 -33 -1l-350 -350q-17 -17 -1 -33t33 1l350 350q17 17 1 33zM1408 478v-190q0 -119 -84.5 -203.5t-203.5 -84.5h-832q-119 0 -203.5 84.5t-84.5 203.5v832q0 119 84.5 203.5t203.5 84.5h832 q63 0 117 -25q15 -7 18 -23q3 -17 -9 -29l-49 -49q-14 -14 -32 -8q-23 6 -45 6h-832q-66 0 -113 -47t-47 -113v-832q0 -66 47 -113t113 -47h832q66 0 113 47t47 113v126q0 13 9 22l64 64q15 15 35 7t20 -29zM1312 1216l288 -288l-672 -672h-288v288zM1756 1084l-92 -92 l-288 288l92 92q28 28 68 28t68 -28l152 -152q28 -28 28 -68t-28 -68z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1408 547v-259q0 -119 -84.5 -203.5t-203.5 -84.5h-832q-119 0 -203.5 84.5t-84.5 203.5v832q0 119 84.5 203.5t203.5 84.5h255v0q13 0 22.5 -9.5t9.5 -22.5q0 -27 -26 -32q-77 -26 -133 -60q-10 -4 -16 -4h-112q-66 0 -113 -47t-47 -113v-832q0 -66 47 -113t113 -47h832 q66 0 113 47t47 113v214q0 19 18 29q28 13 54 37q16 16 35 8q21 -9 21 -29zM1645 1043l-384 -384q-18 -19 -45 -19q-12 0 -25 5q-39 17 -39 59v192h-160q-323 0 -438 -131q-119 -137 -74 -473q3 -23 -20 -34q-8 -2 -12 -2q-16 0 -26 13q-10 14 -21 31t-39.5 68.5t-49.5 99.5 t-38.5 114t-17.5 122q0 49 3.5 91t14 90t28 88t47 81.5t68.5 74t94.5 61.5t124.5 48.5t159.5 30.5t196.5 11h160v192q0 42 39 59q13 5 25 5q26 0 45 -19l384 -384q19 -19 19 -45t-19 -45z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1408 606v-318q0 -119 -84.5 -203.5t-203.5 -84.5h-832q-119 0 -203.5 84.5t-84.5 203.5v832q0 119 84.5 203.5t203.5 84.5h832q63 0 117 -25q15 -7 18 -23q3 -17 -9 -29l-49 -49q-10 -10 -23 -10q-3 0 -9 2q-23 6 -45 6h-832q-66 0 -113 -47t-47 -113v-832 q0 -66 47 -113t113 -47h832q66 0 113 47t47 113v254q0 13 9 22l64 64q10 10 23 10q6 0 12 -3q20 -8 20 -29zM1639 1095l-814 -814q-24 -24 -57 -24t-57 24l-430 430q-24 24 -24 57t24 57l110 110q24 24 57 24t57 -24l263 -263l647 647q24 24 57 24t57 -24l110 -110 q24 -24 24 -57t-24 -57z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1792 640q0 -26 -19 -45l-256 -256q-19 -19 -45 -19t-45 19t-19 45v128h-384v-384h128q26 0 45 -19t19 -45t-19 -45l-256 -256q-19 -19 -45 -19t-45 19l-256 256q-19 19 -19 45t19 45t45 19h128v384h-384v-128q0 -26 -19 -45t-45 -19t-45 19l-256 256q-19 19 -19 45 t19 45l256 256q19 19 45 19t45 -19t19 -45v-128h384v384h-128q-26 0 -45 19t-19 45t19 45l256 256q19 19 45 19t45 -19l256 -256q19 -19 19 -45t-19 -45t-45 -19h-128v-384h384v128q0 26 19 45t45 19t45 -19l256 -256q19 -19 19 -45z" />
+<glyph unicode="" horiz-adv-x="1024" d="M979 1395q19 19 32 13t13 -32v-1472q0 -26 -13 -32t-32 13l-710 710q-9 9 -13 19v-678q0 -26 -19 -45t-45 -19h-128q-26 0 -45 19t-19 45v1408q0 26 19 45t45 19h128q26 0 45 -19t19 -45v-678q4 11 13 19z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1747 1395q19 19 32 13t13 -32v-1472q0 -26 -13 -32t-32 13l-710 710q-9 9 -13 19v-710q0 -26 -13 -32t-32 13l-710 710q-9 9 -13 19v-678q0 -26 -19 -45t-45 -19h-128q-26 0 -45 19t-19 45v1408q0 26 19 45t45 19h128q26 0 45 -19t19 -45v-678q4 11 13 19l710 710 q19 19 32 13t13 -32v-710q4 11 13 19z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1619 1395q19 19 32 13t13 -32v-1472q0 -26 -13 -32t-32 13l-710 710q-8 9 -13 19v-710q0 -26 -13 -32t-32 13l-710 710q-19 19 -19 45t19 45l710 710q19 19 32 13t13 -32v-710q5 11 13 19z" />
+<glyph unicode="" horiz-adv-x="1408" d="M1384 609l-1328 -738q-23 -13 -39.5 -3t-16.5 36v1472q0 26 16.5 36t39.5 -3l1328 -738q23 -13 23 -31t-23 -31z" />
+<glyph unicode="" d="M1536 1344v-1408q0 -26 -19 -45t-45 -19h-512q-26 0 -45 19t-19 45v1408q0 26 19 45t45 19h512q26 0 45 -19t19 -45zM640 1344v-1408q0 -26 -19 -45t-45 -19h-512q-26 0 -45 19t-19 45v1408q0 26 19 45t45 19h512q26 0 45 -19t19 -45z" />
+<glyph unicode="" d="M1536 1344v-1408q0 -26 -19 -45t-45 -19h-1408q-26 0 -45 19t-19 45v1408q0 26 19 45t45 19h1408q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1664" d="M45 -115q-19 -19 -32 -13t-13 32v1472q0 26 13 32t32 -13l710 -710q8 -8 13 -19v710q0 26 13 32t32 -13l710 -710q19 -19 19 -45t-19 -45l-710 -710q-19 -19 -32 -13t-13 32v710q-5 -10 -13 -19z" />
+<glyph unicode="" horiz-adv-x="1792" d="M45 -115q-19 -19 -32 -13t-13 32v1472q0 26 13 32t32 -13l710 -710q8 -8 13 -19v710q0 26 13 32t32 -13l710 -710q8 -8 13 -19v678q0 26 19 45t45 19h128q26 0 45 -19t19 -45v-1408q0 -26 -19 -45t-45 -19h-128q-26 0 -45 19t-19 45v678q-5 -10 -13 -19l-710 -710 q-19 -19 -32 -13t-13 32v710q-5 -10 -13 -19z" />
+<glyph unicode="" horiz-adv-x="1024" d="M45 -115q-19 -19 -32 -13t-13 32v1472q0 26 13 32t32 -13l710 -710q8 -8 13 -19v678q0 26 19 45t45 19h128q26 0 45 -19t19 -45v-1408q0 -26 -19 -45t-45 -19h-128q-26 0 -45 19t-19 45v678q-5 -10 -13 -19z" />
+<glyph unicode="" horiz-adv-x="1538" d="M14 557l710 710q19 19 45 19t45 -19l710 -710q19 -19 13 -32t-32 -13h-1472q-26 0 -32 13t13 32zM1473 0h-1408q-26 0 -45 19t-19 45v256q0 26 19 45t45 19h1408q26 0 45 -19t19 -45v-256q0 -26 -19 -45t-45 -19z" />
+<glyph unicode="" horiz-adv-x="1152" d="M742 -37l-652 651q-37 37 -37 90.5t37 90.5l652 651q37 37 90.5 37t90.5 -37l75 -75q37 -37 37 -90.5t-37 -90.5l-486 -486l486 -485q37 -38 37 -91t-37 -90l-75 -75q-37 -37 -90.5 -37t-90.5 37z" />
+<glyph unicode="" horiz-adv-x="1152" d="M1099 704q0 -52 -37 -91l-652 -651q-37 -37 -90 -37t-90 37l-76 75q-37 39 -37 91q0 53 37 90l486 486l-486 485q-37 39 -37 91q0 53 37 90l76 75q36 38 90 38t90 -38l652 -651q37 -37 37 -90z" />
+<glyph unicode="" d="M1216 576v128q0 26 -19 45t-45 19h-256v256q0 26 -19 45t-45 19h-128q-26 0 -45 -19t-19 -45v-256h-256q-26 0 -45 -19t-19 -45v-128q0 -26 19 -45t45 -19h256v-256q0 -26 19 -45t45 -19h128q26 0 45 19t19 45v256h256q26 0 45 19t19 45zM1536 640q0 -209 -103 -385.5 t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1216 576v128q0 26 -19 45t-45 19h-768q-26 0 -45 -19t-19 -45v-128q0 -26 19 -45t45 -19h768q26 0 45 19t19 45zM1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5 t103 -385.5z" />
+<glyph unicode="" d="M1149 414q0 26 -19 45l-181 181l181 181q19 19 19 45q0 27 -19 46l-90 90q-19 19 -46 19q-26 0 -45 -19l-181 -181l-181 181q-19 19 -45 19q-27 0 -46 -19l-90 -90q-19 -19 -19 -46q0 -26 19 -45l181 -181l-181 -181q-19 -19 -19 -45q0 -27 19 -46l90 -90q19 -19 46 -19 q26 0 45 19l181 181l181 -181q19 -19 45 -19q27 0 46 19l90 90q19 19 19 46zM1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1284 802q0 28 -18 46l-91 90q-19 19 -45 19t-45 -19l-408 -407l-226 226q-19 19 -45 19t-45 -19l-91 -90q-18 -18 -18 -46q0 -27 18 -45l362 -362q19 -19 45 -19q27 0 46 19l543 543q18 18 18 45zM1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103 t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M896 160v192q0 13 -9.5 22.5t-22.5 9.5h-192q-13 0 -22.5 -9.5t-9.5 -22.5v-192q0 -13 9.5 -22.5t22.5 -9.5h192q13 0 22.5 9.5t9.5 22.5zM1152 832q0 97 -58.5 172t-144.5 111.5t-181 36.5t-181 -36.5t-144.5 -111.5t-58.5 -172v-11v-13t1 -11.5t3 -11.5t5.5 -8t9 -7 t13.5 -2h192q14 0 23 9t9 23q0 12 11 27q19 31 50.5 50t66.5 19q39 0 83 -21.5t44 -57.5q0 -33 -26.5 -58t-63.5 -44t-74.5 -41.5t-64 -63.5t-26.5 -98v-11v-13t1 -11.5t3 -11.5t5.5 -8t9 -7t13.5 -2h192q17 0 24 10.5t8 24.5t13.5 33t37.5 32q60 33 70 39q62 44 98.5 108 t36.5 137zM1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1024 160v64q0 14 -9 23t-23 9h-96v480q0 14 -9 23t-23 9h-320q-14 0 -23 -9t-9 -23v-64q0 -14 9 -23t23 -9h96v-384h-96q-14 0 -23 -9t-9 -23v-64q0 -14 9 -23t23 -9h448q14 0 23 9t9 23zM896 928v192q0 14 -9 23t-23 9h-192q-14 0 -23 -9t-9 -23v-192q0 -14 9 -23 t23 -9h192q14 0 23 9t9 23zM1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1197 512h-109q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h109q-32 108 -112.5 188.5t-188.5 112.5v-109q0 -26 -19 -45t-45 -19h-128q-26 0 -45 19t-19 45v109q-108 -32 -188.5 -112.5t-112.5 -188.5h109q26 0 45 -19t19 -45v-128q0 -26 -19 -45t-45 -19h-109 q32 -108 112.5 -188.5t188.5 -112.5v109q0 26 19 45t45 19h128q26 0 45 -19t19 -45v-109q108 32 188.5 112.5t112.5 188.5zM1536 704v-128q0 -26 -19 -45t-45 -19h-143q-37 -161 -154.5 -278.5t-278.5 -154.5v-143q0 -26 -19 -45t-45 -19h-128q-26 0 -45 19t-19 45v143 q-161 37 -278.5 154.5t-154.5 278.5h-143q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h143q37 161 154.5 278.5t278.5 154.5v143q0 26 19 45t45 19h128q26 0 45 -19t19 -45v-143q161 -37 278.5 -154.5t154.5 -278.5h143q26 0 45 -19t19 -45z" />
+<glyph unicode="" d="M1125 448q0 -27 -18 -45l-102 -102q-18 -18 -45 -18t-45 18l-147 147l-147 -147q-18 -18 -45 -18t-45 18l-102 102q-18 18 -18 45t18 45l147 147l-147 147q-18 18 -18 45t18 45l102 102q18 18 45 18t45 -18l147 -147l147 147q18 18 45 18t45 -18l102 -102q18 -18 18 -45 t-18 -45l-147 -147l147 -147q18 -18 18 -45zM1280 640q0 104 -40.5 198.5t-109.5 163.5t-163.5 109.5t-198.5 40.5t-198.5 -40.5t-163.5 -109.5t-109.5 -163.5t-40.5 -198.5t40.5 -198.5t109.5 -163.5t163.5 -109.5t198.5 -40.5t198.5 40.5t163.5 109.5t109.5 163.5 t40.5 198.5zM1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1189 768q0 -27 -18 -45l-320 -320l-102 -102q-18 -18 -45 -18t-45 18l-102 102l-192 192q-18 18 -18 45t18 45l102 102q18 18 45 18t45 -18l147 -147l275 275q18 18 45 18t45 -18l102 -102q18 -18 18 -45zM1280 640q0 104 -40.5 198.5t-109.5 163.5t-163.5 109.5 t-198.5 40.5t-198.5 -40.5t-163.5 -109.5t-109.5 -163.5t-40.5 -198.5t40.5 -198.5t109.5 -163.5t163.5 -109.5t198.5 -40.5t198.5 40.5t163.5 109.5t109.5 163.5t40.5 198.5zM1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5 t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1280 640q0 139 -71 260l-701 -701q121 -71 260 -71q104 0 198.5 40.5t163.5 109.5t109.5 163.5t40.5 198.5zM327 380l701 701q-121 71 -260 71q-104 0 -198.5 -40.5t-163.5 -109.5t-109.5 -163.5t-40.5 -198.5q0 -139 71 -260zM1536 640q0 -209 -103 -385.5 t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1536 640v-128q0 -53 -32.5 -90.5t-84.5 -37.5h-704l293 -294q38 -36 38 -90t-38 -90l-75 -76q-37 -37 -90 -37q-52 0 -91 37l-651 652q-37 37 -37 90q0 52 37 91l651 650q38 38 91 38q52 0 90 -38l75 -74q38 -38 38 -91t-38 -91l-293 -293h704q52 0 84.5 -37.5 t32.5 -90.5z" />
+<glyph unicode="" d="M1472 576q0 -54 -37 -91l-651 -651q-39 -37 -91 -37q-51 0 -90 37l-75 75q-38 38 -38 91t38 91l293 293h-704q-52 0 -84.5 37.5t-32.5 90.5v128q0 53 32.5 90.5t84.5 37.5h704l-293 294q-38 36 -38 90t38 90l75 75q38 38 90 38q53 0 91 -38l651 -651q37 -35 37 -90z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1611 565q0 -51 -37 -90l-75 -75q-38 -38 -91 -38q-54 0 -90 38l-294 293v-704q0 -52 -37.5 -84.5t-90.5 -32.5h-128q-53 0 -90.5 32.5t-37.5 84.5v704l-294 -293q-36 -38 -90 -38t-90 38l-75 75q-38 38 -38 90q0 53 38 91l651 651q35 37 90 37q54 0 91 -37l651 -651 q37 -39 37 -91z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1611 704q0 -53 -37 -90l-651 -652q-39 -37 -91 -37q-53 0 -90 37l-651 652q-38 36 -38 90q0 53 38 91l74 75q39 37 91 37q53 0 90 -37l294 -294v704q0 52 38 90t90 38h128q52 0 90 -38t38 -90v-704l294 294q37 37 90 37q52 0 91 -37l75 -75q37 -39 37 -91z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1792 896q0 -26 -19 -45l-512 -512q-19 -19 -45 -19t-45 19t-19 45v256h-224q-98 0 -175.5 -6t-154 -21.5t-133 -42.5t-105.5 -69.5t-80 -101t-48.5 -138.5t-17.5 -181q0 -55 5 -123q0 -6 2.5 -23.5t2.5 -26.5q0 -15 -8.5 -25t-23.5 -10q-16 0 -28 17q-7 9 -13 22 t-13.5 30t-10.5 24q-127 285 -127 451q0 199 53 333q162 403 875 403h224v256q0 26 19 45t45 19t45 -19l512 -512q19 -19 19 -45z" />
+<glyph unicode="" d="M755 480q0 -13 -10 -23l-332 -332l144 -144q19 -19 19 -45t-19 -45t-45 -19h-448q-26 0 -45 19t-19 45v448q0 26 19 45t45 19t45 -19l144 -144l332 332q10 10 23 10t23 -10l114 -114q10 -10 10 -23zM1536 1344v-448q0 -26 -19 -45t-45 -19t-45 19l-144 144l-332 -332 q-10 -10 -23 -10t-23 10l-114 114q-10 10 -10 23t10 23l332 332l-144 144q-19 19 -19 45t19 45t45 19h448q26 0 45 -19t19 -45z" />
+<glyph unicode="" d="M768 576v-448q0 -26 -19 -45t-45 -19t-45 19l-144 144l-332 -332q-10 -10 -23 -10t-23 10l-114 114q-10 10 -10 23t10 23l332 332l-144 144q-19 19 -19 45t19 45t45 19h448q26 0 45 -19t19 -45zM1523 1248q0 -13 -10 -23l-332 -332l144 -144q19 -19 19 -45t-19 -45 t-45 -19h-448q-26 0 -45 19t-19 45v448q0 26 19 45t45 19t45 -19l144 -144l332 332q10 10 23 10t23 -10l114 -114q10 -10 10 -23z" />
+<glyph unicode="" horiz-adv-x="1408" d="M1408 800v-192q0 -40 -28 -68t-68 -28h-416v-416q0 -40 -28 -68t-68 -28h-192q-40 0 -68 28t-28 68v416h-416q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h416v416q0 40 28 68t68 28h192q40 0 68 -28t28 -68v-416h416q40 0 68 -28t28 -68z" />
+<glyph unicode="" horiz-adv-x="1408" d="M1408 800v-192q0 -40 -28 -68t-68 -28h-1216q-40 0 -68 28t-28 68v192q0 40 28 68t68 28h1216q40 0 68 -28t28 -68z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1482 486q46 -26 59.5 -77.5t-12.5 -97.5l-64 -110q-26 -46 -77.5 -59.5t-97.5 12.5l-266 153v-307q0 -52 -38 -90t-90 -38h-128q-52 0 -90 38t-38 90v307l-266 -153q-46 -26 -97.5 -12.5t-77.5 59.5l-64 110q-26 46 -12.5 97.5t59.5 77.5l266 154l-266 154 q-46 26 -59.5 77.5t12.5 97.5l64 110q26 46 77.5 59.5t97.5 -12.5l266 -153v307q0 52 38 90t90 38h128q52 0 90 -38t38 -90v-307l266 153q46 26 97.5 12.5t77.5 -59.5l64 -110q26 -46 12.5 -97.5t-59.5 -77.5l-266 -154z" />
+<glyph unicode="" d="M768 1408q209 0 385.5 -103t279.5 -279.5t103 -385.5t-103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103zM896 161v190q0 14 -9 23.5t-22 9.5h-192q-13 0 -23 -10t-10 -23v-190q0 -13 10 -23t23 -10h192 q13 0 22 9.5t9 23.5zM894 505l18 621q0 12 -10 18q-10 8 -24 8h-220q-14 0 -24 -8q-10 -6 -10 -18l17 -621q0 -10 10 -17.5t24 -7.5h185q14 0 23.5 7.5t10.5 17.5z" />
+<glyph unicode="" d="M928 180v716h-320v-716q0 -25 18.5 -38.5t45.5 -13.5h192q27 0 45.5 13.5t18.5 38.5zM472 1024h195l-126 161q-24 31 -69 31q-40 0 -68 -28t-28 -68t28 -68t68 -28zM1160 1120q0 40 -28 68t-68 28q-45 0 -69 -31l-125 -161h194q40 0 68 28t28 68zM1536 864v-320 q0 -14 -10 -22t-27 -10.5t-32 -2.5t-34.5 1.5t-24.5 1.5v-416q0 -40 -28 -68t-68 -28h-1088q-40 0 -68 28t-28 68v416q-5 0 -24.5 -1.5t-34.5 -1.5t-32 2.5t-27 10.5t-10 22v320q0 13 9.5 22.5t22.5 9.5h440q-93 0 -158.5 65.5t-65.5 158.5t65.5 158.5t158.5 65.5 q108 0 168 -77l128 -165l128 165q60 77 168 77q93 0 158.5 -65.5t65.5 -158.5t-65.5 -158.5t-158.5 -65.5h440q13 0 22.5 -9.5t9.5 -22.5z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1280 832q0 26 -19 45t-45 19q-172 0 -318 -49.5t-259.5 -134t-235.5 -219.5q-19 -21 -19 -45q0 -26 19 -45t45 -19q24 0 45 19q27 24 74 71t67 66q137 124 268.5 176t313.5 52q26 0 45 19t19 45zM1792 1030q0 -95 -20 -193q-46 -224 -184.5 -383t-357.5 -268 q-214 -108 -438 -108q-148 0 -286 47q-15 5 -88 42t-96 37q-16 0 -39.5 -32t-45 -70t-52.5 -70t-60 -32q-30 0 -51 11t-31 24t-27 42q-2 4 -6 11t-5.5 10t-3 9.5t-1.5 13.5q0 35 31 73.5t68 65.5t68 56t31 48q0 4 -14 38t-16 44q-9 51 -9 104q0 115 43.5 220t119 184.5 t170.5 139t204 95.5q55 18 145 25.5t179.5 9t178.5 6t163.5 24t113.5 56.5l29.5 29.5t29.5 28t27 20t36.5 16t43.5 4.5q39 0 70.5 -46t47.5 -112t24 -124t8 -96z" />
+<glyph unicode="" horiz-adv-x="1408" d="M1408 -160v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-1344q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h1344q13 0 22.5 -9.5t9.5 -22.5zM1152 896q0 -78 -24.5 -144t-64 -112.5t-87.5 -88t-96 -77.5t-87.5 -72t-64 -81.5t-24.5 -96.5q0 -96 67 -224l-4 1l1 -1 q-90 41 -160 83t-138.5 100t-113.5 122.5t-72.5 150.5t-27.5 184q0 78 24.5 144t64 112.5t87.5 88t96 77.5t87.5 72t64 81.5t24.5 96.5q0 94 -66 224l3 -1l-1 1q90 -41 160 -83t138.5 -100t113.5 -122.5t72.5 -150.5t27.5 -184z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1664 576q-152 236 -381 353q61 -104 61 -225q0 -185 -131.5 -316.5t-316.5 -131.5t-316.5 131.5t-131.5 316.5q0 121 61 225q-229 -117 -381 -353q133 -205 333.5 -326.5t434.5 -121.5t434.5 121.5t333.5 326.5zM944 960q0 20 -14 34t-34 14q-125 0 -214.5 -89.5 t-89.5 -214.5q0 -20 14 -34t34 -14t34 14t14 34q0 86 61 147t147 61q20 0 34 14t14 34zM1792 576q0 -34 -20 -69q-140 -230 -376.5 -368.5t-499.5 -138.5t-499.5 139t-376.5 368q-20 35 -20 69t20 69q140 229 376.5 368t499.5 139t499.5 -139t376.5 -368q20 -35 20 -69z" />
+<glyph unicode="" horiz-adv-x="1792" d="M555 201l78 141q-87 63 -136 159t-49 203q0 121 61 225q-229 -117 -381 -353q167 -258 427 -375zM944 960q0 20 -14 34t-34 14q-125 0 -214.5 -89.5t-89.5 -214.5q0 -20 14 -34t34 -14t34 14t14 34q0 86 61 147t147 61q20 0 34 14t14 34zM1307 1151q0 -7 -1 -9 q-105 -188 -315 -566t-316 -567l-49 -89q-10 -16 -28 -16q-12 0 -134 70q-16 10 -16 28q0 12 44 87q-143 65 -263.5 173t-208.5 245q-20 31 -20 69t20 69q153 235 380 371t496 136q89 0 180 -17l54 97q10 16 28 16q5 0 18 -6t31 -15.5t33 -18.5t31.5 -18.5t19.5 -11.5 q16 -10 16 -27zM1344 704q0 -139 -79 -253.5t-209 -164.5l280 502q8 -45 8 -84zM1792 576q0 -35 -20 -69q-39 -64 -109 -145q-150 -172 -347.5 -267t-419.5 -95l74 132q212 18 392.5 137t301.5 307q-115 179 -282 294l63 112q95 -64 182.5 -153t144.5 -184q20 -34 20 -69z " />
+<glyph unicode="" horiz-adv-x="1792" d="M1024 161v190q0 14 -9.5 23.5t-22.5 9.5h-192q-13 0 -22.5 -9.5t-9.5 -23.5v-190q0 -14 9.5 -23.5t22.5 -9.5h192q13 0 22.5 9.5t9.5 23.5zM1022 535l18 459q0 12 -10 19q-13 11 -24 11h-220q-11 0 -24 -11q-10 -7 -10 -21l17 -457q0 -10 10 -16.5t24 -6.5h185 q14 0 23.5 6.5t10.5 16.5zM1008 1469l768 -1408q35 -63 -2 -126q-17 -29 -46.5 -46t-63.5 -17h-1536q-34 0 -63.5 17t-46.5 46q-37 63 -2 126l768 1408q17 31 47 49t65 18t65 -18t47 -49z" />
+<glyph unicode="" horiz-adv-x="1408" d="M1397 1324q0 -87 -149 -236l-240 -240l143 -746l1 -6q0 -14 -9 -23l-64 -64q-9 -9 -23 -9q-21 0 -29 18l-274 575l-245 -245q68 -238 68 -252t-9 -23l-64 -64q-9 -9 -23 -9q-18 0 -28 16l-155 280l-280 155q-17 9 -17 28q0 14 9 23l64 65q9 9 23 9t252 -68l245 245 l-575 274q-18 8 -18 29q0 14 9 23l64 64q9 9 23 9q4 0 6 -1l746 -143l240 240q149 149 236 149q32 0 52.5 -20.5t20.5 -52.5z" />
+<glyph unicode="" horiz-adv-x="1664" d="M128 -128h288v288h-288v-288zM480 -128h320v288h-320v-288zM128 224h288v320h-288v-320zM480 224h320v320h-320v-320zM128 608h288v288h-288v-288zM864 -128h320v288h-320v-288zM480 608h320v288h-320v-288zM1248 -128h288v288h-288v-288zM864 224h320v320h-320v-320z M512 1088v288q0 13 -9.5 22.5t-22.5 9.5h-64q-13 0 -22.5 -9.5t-9.5 -22.5v-288q0 -13 9.5 -22.5t22.5 -9.5h64q13 0 22.5 9.5t9.5 22.5zM1248 224h288v320h-288v-320zM864 608h320v288h-320v-288zM1248 608h288v288h-288v-288zM1280 1088v288q0 13 -9.5 22.5t-22.5 9.5h-64 q-13 0 -22.5 -9.5t-9.5 -22.5v-288q0 -13 9.5 -22.5t22.5 -9.5h64q13 0 22.5 9.5t9.5 22.5zM1664 1152v-1280q0 -52 -38 -90t-90 -38h-1408q-52 0 -90 38t-38 90v1280q0 52 38 90t90 38h128v96q0 66 47 113t113 47h64q66 0 113 -47t47 -113v-96h384v96q0 66 47 113t113 47 h64q66 0 113 -47t47 -113v-96h128q52 0 90 -38t38 -90z" />
+<glyph unicode="" horiz-adv-x="1792" d="M666 1055q-60 -92 -137 -273q-22 45 -37 72.5t-40.5 63.5t-51 56.5t-63 35t-81.5 14.5h-224q-14 0 -23 9t-9 23v192q0 14 9 23t23 9h224q250 0 410 -225zM1792 256q0 -14 -9 -23l-320 -320q-9 -9 -23 -9q-13 0 -22.5 9.5t-9.5 22.5v192q-32 0 -85 -0.5t-81 -1t-73 1 t-71 5t-64 10.5t-63 18.5t-58 28.5t-59 40t-55 53.5t-56 69.5q59 93 136 273q22 -45 37 -72.5t40.5 -63.5t51 -56.5t63 -35t81.5 -14.5h256v192q0 14 9 23t23 9q12 0 24 -10l319 -319q9 -9 9 -23zM1792 1152q0 -14 -9 -23l-320 -320q-9 -9 -23 -9q-13 0 -22.5 9.5t-9.5 22.5 v192h-256q-48 0 -87 -15t-69 -45t-51 -61.5t-45 -77.5q-32 -62 -78 -171q-29 -66 -49.5 -111t-54 -105t-64 -100t-74 -83t-90 -68.5t-106.5 -42t-128 -16.5h-224q-14 0 -23 9t-9 23v192q0 14 9 23t23 9h224q48 0 87 15t69 45t51 61.5t45 77.5q32 62 78 171q29 66 49.5 111 t54 105t64 100t74 83t90 68.5t106.5 42t128 16.5h256v192q0 14 9 23t23 9q12 0 24 -10l319 -319q9 -9 9 -23z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1792 640q0 -174 -120 -321.5t-326 -233t-450 -85.5q-70 0 -145 8q-198 -175 -460 -242q-49 -14 -114 -22q-17 -2 -30.5 9t-17.5 29v1q-3 4 -0.5 12t2 10t4.5 9.5l6 9t7 8.5t8 9q7 8 31 34.5t34.5 38t31 39.5t32.5 51t27 59t26 76q-157 89 -247.5 220t-90.5 281 q0 130 71 248.5t191 204.5t286 136.5t348 50.5q244 0 450 -85.5t326 -233t120 -321.5z" />
+<glyph unicode="" d="M1536 704v-128q0 -201 -98.5 -362t-274 -251.5t-395.5 -90.5t-395.5 90.5t-274 251.5t-98.5 362v128q0 26 19 45t45 19h384q26 0 45 -19t19 -45v-128q0 -52 23.5 -90t53.5 -57t71 -30t64 -13t44 -2t44 2t64 13t71 30t53.5 57t23.5 90v128q0 26 19 45t45 19h384 q26 0 45 -19t19 -45zM512 1344v-384q0 -26 -19 -45t-45 -19h-384q-26 0 -45 19t-19 45v384q0 26 19 45t45 19h384q26 0 45 -19t19 -45zM1536 1344v-384q0 -26 -19 -45t-45 -19h-384q-26 0 -45 19t-19 45v384q0 26 19 45t45 19h384q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1611 320q0 -53 -37 -90l-75 -75q-38 -38 -91 -38q-54 0 -90 38l-486 485l-486 -485q-36 -38 -90 -38t-90 38l-75 75q-38 36 -38 90q0 53 38 91l651 651q37 37 90 37q52 0 91 -37l650 -651q38 -38 38 -91z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1611 832q0 -53 -37 -90l-651 -651q-38 -38 -91 -38q-54 0 -90 38l-651 651q-38 36 -38 90q0 53 38 91l74 75q39 37 91 37q53 0 90 -37l486 -486l486 486q37 37 90 37q52 0 91 -37l75 -75q37 -39 37 -91z" />
+<glyph unicode="" horiz-adv-x="1920" d="M1280 32q0 -13 -9.5 -22.5t-22.5 -9.5h-960q-8 0 -13.5 2t-9 7t-5.5 8t-3 11.5t-1 11.5v13v11v160v416h-192q-26 0 -45 19t-19 45q0 24 15 41l320 384q19 22 49 22t49 -22l320 -384q15 -17 15 -41q0 -26 -19 -45t-45 -19h-192v-384h576q16 0 25 -11l160 -192q7 -11 7 -21 zM1920 448q0 -24 -15 -41l-320 -384q-20 -23 -49 -23t-49 23l-320 384q-15 17 -15 41q0 26 19 45t45 19h192v384h-576q-16 0 -25 12l-160 192q-7 9 -7 20q0 13 9.5 22.5t22.5 9.5h960q8 0 13.5 -2t9 -7t5.5 -8t3 -11.5t1 -11.5v-13v-11v-160v-416h192q26 0 45 -19t19 -45z " />
+<glyph unicode="" horiz-adv-x="1664" d="M640 0q0 -53 -37.5 -90.5t-90.5 -37.5t-90.5 37.5t-37.5 90.5t37.5 90.5t90.5 37.5t90.5 -37.5t37.5 -90.5zM1536 0q0 -53 -37.5 -90.5t-90.5 -37.5t-90.5 37.5t-37.5 90.5t37.5 90.5t90.5 37.5t90.5 -37.5t37.5 -90.5zM1664 1088v-512q0 -24 -16 -42.5t-41 -21.5 l-1044 -122q1 -7 4.5 -21.5t6 -26.5t2.5 -22q0 -16 -24 -64h920q26 0 45 -19t19 -45t-19 -45t-45 -19h-1024q-26 0 -45 19t-19 45q0 14 11 39.5t29.5 59.5t20.5 38l-177 823h-204q-26 0 -45 19t-19 45t19 45t45 19h256q16 0 28.5 -6.5t20 -15.5t13 -24.5t7.5 -26.5 t5.5 -29.5t4.5 -25.5h1201q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1664 928v-704q0 -92 -66 -158t-158 -66h-1216q-92 0 -158 66t-66 158v960q0 92 66 158t158 66h320q92 0 158 -66t66 -158v-32h672q92 0 158 -66t66 -158z" />
+<glyph unicode="" horiz-adv-x="1920" d="M1879 584q0 -31 -31 -66l-336 -396q-43 -51 -120.5 -86.5t-143.5 -35.5h-1088q-34 0 -60.5 13t-26.5 43q0 31 31 66l336 396q43 51 120.5 86.5t143.5 35.5h1088q34 0 60.5 -13t26.5 -43zM1536 928v-160h-832q-94 0 -197 -47.5t-164 -119.5l-337 -396l-5 -6q0 4 -0.5 12.5 t-0.5 12.5v960q0 92 66 158t158 66h320q92 0 158 -66t66 -158v-32h544q92 0 158 -66t66 -158z" />
+<glyph unicode="" horiz-adv-x="768" d="M704 1216q0 -26 -19 -45t-45 -19h-128v-1024h128q26 0 45 -19t19 -45t-19 -45l-256 -256q-19 -19 -45 -19t-45 19l-256 256q-19 19 -19 45t19 45t45 19h128v1024h-128q-26 0 -45 19t-19 45t19 45l256 256q19 19 45 19t45 -19l256 -256q19 -19 19 -45z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1792 640q0 -26 -19 -45l-256 -256q-19 -19 -45 -19t-45 19t-19 45v128h-1024v-128q0 -26 -19 -45t-45 -19t-45 19l-256 256q-19 19 -19 45t19 45l256 256q19 19 45 19t45 -19t19 -45v-128h1024v128q0 26 19 45t45 19t45 -19l256 -256q19 -19 19 -45z" />
+<glyph unicode="" horiz-adv-x="1920" d="M512 512v-384h-256v384h256zM896 1024v-896h-256v896h256zM1280 768v-640h-256v640h256zM1664 1152v-1024h-256v1024h256zM1792 32v1216q0 13 -9.5 22.5t-22.5 9.5h-1600q-13 0 -22.5 -9.5t-9.5 -22.5v-1216q0 -13 9.5 -22.5t22.5 -9.5h1600q13 0 22.5 9.5t9.5 22.5z M1920 1248v-1216q0 -66 -47 -113t-113 -47h-1600q-66 0 -113 47t-47 113v1216q0 66 47 113t113 47h1600q66 0 113 -47t47 -113z" />
+<glyph unicode="" d="M1280 958q0 13 -9.5 22.5t-22.5 9.5q-5 0 -15 -4q20 34 20 55q0 13 -9.5 22.5t-22.5 9.5q-7 0 -17 -5q-60 -34 -97 -43q-65 63 -154 63q-98 0 -164.5 -72.5t-64.5 -169.5v-12q-107 14 -187.5 64t-156.5 139q-10 12 -28 12q-26 0 -41 -50.5t-15 -86.5q0 -62 29 -117 q-13 -2 -21.5 -11.5t-8.5 -22.5q0 -112 81 -185q-12 -8 -12 -25q0 -6 1 -9q15 -51 50.5 -91.5t84.5 -60.5q-77 -43 -165 -43q-8 0 -24 1.5t-23 1.5q-13 0 -22.5 -9.5t-9.5 -22.5q0 -17 14 -26q63 -47 150 -73.5t170 -26.5q130 0 248 58q166 79 256 232.5t88 339.5v12 q27 22 62.5 63t35.5 61zM1536 1120v-960q0 -119 -84.5 -203.5t-203.5 -84.5h-960q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960q119 0 203.5 -84.5t84.5 -203.5z" />
+<glyph unicode="" d="M1248 1408q119 0 203.5 -84.5t84.5 -203.5v-960q0 -119 -84.5 -203.5t-203.5 -84.5h-350q-2 0 -2 1v671h177q31 0 32 23l12 164q2 15 -8 25q-10 12 -24 12h-189v72q0 44 11.5 57t54.5 13q57 0 117 -13q13 -3 26 5q11 8 13 22l23 166q2 12 -5.5 22.5t-19.5 13.5 q-93 26 -197 26q-311 0 -311 -299v-85h-95q-13 0 -23 -10.5t-10 -24.5v-172q0 -8 5.5 -12t10 -4.5t17.5 -0.5h95v-671l10 -1h-330q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960z" />
+<glyph unicode="" horiz-adv-x="1792" d="M928 704q0 14 -9 23t-23 9q-66 0 -113 -47t-47 -113q0 -14 9 -23t23 -9t23 9t9 23q0 40 28 68t68 28q14 0 23 9t9 23zM1152 574q0 -106 -75 -181t-181 -75t-181 75t-75 181t75 181t181 75t181 -75t75 -181zM128 0h1536v128h-1536v-128zM1280 574q0 159 -112.5 271.5 t-271.5 112.5t-271.5 -112.5t-112.5 -271.5t112.5 -271.5t271.5 -112.5t271.5 112.5t112.5 271.5zM256 1216h384v128h-384v-128zM128 1024h1536v118v138h-828l-64 -128h-644v-128zM1792 1280v-1280q0 -53 -37.5 -90.5t-90.5 -37.5h-1536q-53 0 -90.5 37.5t-37.5 90.5v1280 q0 53 37.5 90.5t90.5 37.5h1536q53 0 90.5 -37.5t37.5 -90.5z" />
+<glyph unicode="" horiz-adv-x="1792" d="M832 1024q0 80 -56 136t-136 56t-136 -56t-56 -136q0 -42 19 -83q-41 19 -83 19q-80 0 -136 -56t-56 -136t56 -136t136 -56t136 56t56 136q0 42 -19 83q41 -19 83 -19q80 0 136 56t56 136zM1683 320q0 -17 -49 -66t-66 -49q-9 0 -28.5 16t-36.5 33t-38.5 40t-24.5 26 l-96 -96l220 -220q28 -28 28 -68q0 -42 -39 -81t-81 -39q-40 0 -68 28l-671 671q-176 -131 -365 -131q-163 0 -265.5 102.5t-102.5 265.5q0 160 95 313t248 248t313 95q163 0 265.5 -102.5t102.5 -265.5q0 -189 -131 -365l355 -355l96 96q-3 3 -26 24.5t-40 38.5t-33 36.5 t-16 28.5q0 17 49 66t66 49q13 0 23 -10q6 -6 46 -44.5t82 -79.5t86.5 -86t73 -78t28.5 -41z" />
+<glyph unicode="" horiz-adv-x="1920" d="M896 640q0 106 -75 181t-181 75t-181 -75t-75 -181t75 -181t181 -75t181 75t75 181zM1664 128q0 52 -38 90t-90 38t-90 -38t-38 -90q0 -53 37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM1664 1152q0 52 -38 90t-90 38t-90 -38t-38 -90q0 -53 37.5 -90.5t90.5 -37.5 t90.5 37.5t37.5 90.5zM1280 731v-185q0 -10 -7 -19.5t-16 -10.5l-155 -24q-11 -35 -32 -76q34 -48 90 -115q7 -10 7 -20q0 -12 -7 -19q-23 -30 -82.5 -89.5t-78.5 -59.5q-11 0 -21 7l-115 90q-37 -19 -77 -31q-11 -108 -23 -155q-7 -24 -30 -24h-186q-11 0 -20 7.5t-10 17.5 l-23 153q-34 10 -75 31l-118 -89q-7 -7 -20 -7q-11 0 -21 8q-144 133 -144 160q0 9 7 19q10 14 41 53t47 61q-23 44 -35 82l-152 24q-10 1 -17 9.5t-7 19.5v185q0 10 7 19.5t16 10.5l155 24q11 35 32 76q-34 48 -90 115q-7 11 -7 20q0 12 7 20q22 30 82 89t79 59q11 0 21 -7 l115 -90q34 18 77 32q11 108 23 154q7 24 30 24h186q11 0 20 -7.5t10 -17.5l23 -153q34 -10 75 -31l118 89q8 7 20 7q11 0 21 -8q144 -133 144 -160q0 -9 -7 -19q-12 -16 -42 -54t-45 -60q23 -48 34 -82l152 -23q10 -2 17 -10.5t7 -19.5zM1920 198v-140q0 -16 -149 -31 q-12 -27 -30 -52q51 -113 51 -138q0 -4 -4 -7q-122 -71 -124 -71q-8 0 -46 47t-52 68q-20 -2 -30 -2t-30 2q-14 -21 -52 -68t-46 -47q-2 0 -124 71q-4 3 -4 7q0 25 51 138q-18 25 -30 52q-149 15 -149 31v140q0 16 149 31q13 29 30 52q-51 113 -51 138q0 4 4 7q4 2 35 20 t59 34t30 16q8 0 46 -46.5t52 -67.5q20 2 30 2t30 -2q51 71 92 112l6 2q4 0 124 -70q4 -3 4 -7q0 -25 -51 -138q17 -23 30 -52q149 -15 149 -31zM1920 1222v-140q0 -16 -149 -31q-12 -27 -30 -52q51 -113 51 -138q0 -4 -4 -7q-122 -71 -124 -71q-8 0 -46 47t-52 68 q-20 -2 -30 -2t-30 2q-14 -21 -52 -68t-46 -47q-2 0 -124 71q-4 3 -4 7q0 25 51 138q-18 25 -30 52q-149 15 -149 31v140q0 16 149 31q13 29 30 52q-51 113 -51 138q0 4 4 7q4 2 35 20t59 34t30 16q8 0 46 -46.5t52 -67.5q20 2 30 2t30 -2q51 71 92 112l6 2q4 0 124 -70 q4 -3 4 -7q0 -25 -51 -138q17 -23 30 -52q149 -15 149 -31z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1408 768q0 -139 -94 -257t-256.5 -186.5t-353.5 -68.5q-86 0 -176 16q-124 -88 -278 -128q-36 -9 -86 -16h-3q-11 0 -20.5 8t-11.5 21q-1 3 -1 6.5t0.5 6.5t2 6l2.5 5t3.5 5.5t4 5t4.5 5t4 4.5q5 6 23 25t26 29.5t22.5 29t25 38.5t20.5 44q-124 72 -195 177t-71 224 q0 139 94 257t256.5 186.5t353.5 68.5t353.5 -68.5t256.5 -186.5t94 -257zM1792 512q0 -120 -71 -224.5t-195 -176.5q10 -24 20.5 -44t25 -38.5t22.5 -29t26 -29.5t23 -25q1 -1 4 -4.5t4.5 -5t4 -5t3.5 -5.5l2.5 -5t2 -6t0.5 -6.5t-1 -6.5q-3 -14 -13 -22t-22 -7 q-50 7 -86 16q-154 40 -278 128q-90 -16 -176 -16q-271 0 -472 132q58 -4 88 -4q161 0 309 45t264 129q125 92 192 212t67 254q0 77 -23 152q129 -71 204 -178t75 -230z" />
+<glyph unicode="" d="M256 192q0 26 -19 45t-45 19t-45 -19t-19 -45t19 -45t45 -19t45 19t19 45zM1408 768q0 51 -39 89.5t-89 38.5h-352q0 58 48 159.5t48 160.5q0 98 -32 145t-128 47q-26 -26 -38 -85t-30.5 -125.5t-59.5 -109.5q-22 -23 -77 -91q-4 -5 -23 -30t-31.5 -41t-34.5 -42.5 t-40 -44t-38.5 -35.5t-40 -27t-35.5 -9h-32v-640h32q13 0 31.5 -3t33 -6.5t38 -11t35 -11.5t35.5 -12.5t29 -10.5q211 -73 342 -73h121q192 0 192 167q0 26 -5 56q30 16 47.5 52.5t17.5 73.5t-18 69q53 50 53 119q0 25 -10 55.5t-25 47.5q32 1 53.5 47t21.5 81zM1536 769 q0 -89 -49 -163q9 -33 9 -69q0 -77 -38 -144q3 -21 3 -43q0 -101 -60 -178q1 -139 -85 -219.5t-227 -80.5h-36h-93q-96 0 -189.5 22.5t-216.5 65.5q-116 40 -138 40h-288q-53 0 -90.5 37.5t-37.5 90.5v640q0 53 37.5 90.5t90.5 37.5h274q36 24 137 155q58 75 107 128 q24 25 35.5 85.5t30.5 126.5t62 108q39 37 90 37q84 0 151 -32.5t102 -101.5t35 -186q0 -93 -48 -192h176q104 0 180 -76t76 -179z" />
+<glyph unicode="" d="M256 1088q0 26 -19 45t-45 19t-45 -19t-19 -45t19 -45t45 -19t45 19t19 45zM1408 512q0 35 -21.5 81t-53.5 47q15 17 25 47.5t10 55.5q0 69 -53 119q18 32 18 69t-17.5 73.5t-47.5 52.5q5 30 5 56q0 85 -49 126t-136 41h-128q-131 0 -342 -73q-5 -2 -29 -10.5 t-35.5 -12.5t-35 -11.5t-38 -11t-33 -6.5t-31.5 -3h-32v-640h32q16 0 35.5 -9t40 -27t38.5 -35.5t40 -44t34.5 -42.5t31.5 -41t23 -30q55 -68 77 -91q41 -43 59.5 -109.5t30.5 -125.5t38 -85q96 0 128 47t32 145q0 59 -48 160.5t-48 159.5h352q50 0 89 38.5t39 89.5z M1536 511q0 -103 -76 -179t-180 -76h-176q48 -99 48 -192q0 -118 -35 -186q-35 -69 -102 -101.5t-151 -32.5q-51 0 -90 37q-34 33 -54 82t-25.5 90.5t-17.5 84.5t-31 64q-48 50 -107 127q-101 131 -137 155h-274q-53 0 -90.5 37.5t-37.5 90.5v640q0 53 37.5 90.5t90.5 37.5 h288q22 0 138 40q128 44 223 66t200 22h112q140 0 226.5 -79t85.5 -216v-5q60 -77 60 -178q0 -22 -3 -43q38 -67 38 -144q0 -36 -9 -69q49 -74 49 -163z" />
+<glyph unicode="" horiz-adv-x="896" d="M832 1504v-1339l-449 -236q-22 -12 -40 -12q-21 0 -31.5 14.5t-10.5 35.5q0 6 2 20l86 500l-364 354q-25 27 -25 48q0 37 56 46l502 73l225 455q19 41 49 41z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1664 940q0 81 -21.5 143t-55 98.5t-81.5 59.5t-94 31t-98 8t-112 -25.5t-110.5 -64t-86.5 -72t-60 -61.5q-18 -22 -49 -22t-49 22q-24 28 -60 61.5t-86.5 72t-110.5 64t-112 25.5t-98 -8t-94 -31t-81.5 -59.5t-55 -98.5t-21.5 -143q0 -168 187 -355l581 -560l580 559 q188 188 188 356zM1792 940q0 -221 -229 -450l-623 -600q-18 -18 -44 -18t-44 18l-624 602q-10 8 -27.5 26t-55.5 65.5t-68 97.5t-53.5 121t-23.5 138q0 220 127 344t351 124q62 0 126.5 -21.5t120 -58t95.5 -68.5t76 -68q36 36 76 68t95.5 68.5t120 58t126.5 21.5 q224 0 351 -124t127 -344z" />
+<glyph unicode="" horiz-adv-x="1664" d="M640 96q0 -4 1 -20t0.5 -26.5t-3 -23.5t-10 -19.5t-20.5 -6.5h-320q-119 0 -203.5 84.5t-84.5 203.5v704q0 119 84.5 203.5t203.5 84.5h320q13 0 22.5 -9.5t9.5 -22.5q0 -4 1 -20t0.5 -26.5t-3 -23.5t-10 -19.5t-20.5 -6.5h-320q-66 0 -113 -47t-47 -113v-704 q0 -66 47 -113t113 -47h288h11h13t11.5 -1t11.5 -3t8 -5.5t7 -9t2 -13.5zM1568 640q0 -26 -19 -45l-544 -544q-19 -19 -45 -19t-45 19t-19 45v288h-448q-26 0 -45 19t-19 45v384q0 26 19 45t45 19h448v288q0 26 19 45t45 19t45 -19l544 -544q19 -19 19 -45z" />
+<glyph unicode="" d="M512 160v640q0 13 -9.5 22.5t-22.5 9.5h-192q-13 0 -22.5 -9.5t-9.5 -22.5v-640q0 -13 9.5 -22.5t22.5 -9.5h192q13 0 22.5 9.5t9.5 22.5zM503 1028q0 51 -36 87.5t-88 36.5q-51 0 -87 -36.5t-36 -87.5t36 -87.5t87 -36.5q52 0 88 36.5t36 87.5zM1280 160v435 q0 127 -73.5 192.5t-202.5 65.5q-90 0 -158 -45q-12 -8 -14 -12q0 36 -35 36h-176q-14 0 -29.5 -7.5t-15.5 -20.5v-644q0 -13 15.5 -22.5t29.5 -9.5h182q12 0 20.5 9.5t8.5 22.5v349q0 140 114 140q49 0 63.5 -22.5t14.5 -73.5v-393q0 -13 12 -22.5t26 -9.5h186 q13 0 22.5 9.5t9.5 22.5zM1536 1120v-960q0 -119 -84.5 -203.5t-203.5 -84.5h-960q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960q119 0 203.5 -84.5t84.5 -203.5z" />
+<glyph unicode="" horiz-adv-x="1152" d="M480 672v448q0 14 -9 23t-23 9t-23 -9t-9 -23v-448q0 -14 9 -23t23 -9t23 9t9 23zM1152 320q0 -26 -19 -45t-45 -19h-429l-51 -483q-2 -12 -10.5 -20.5t-20.5 -8.5h-1q-27 0 -32 27l-76 485h-404q-26 0 -45 19t-19 45q0 123 78.5 221.5t177.5 98.5v512q-52 0 -90 38 t-38 90t38 90t90 38h640q52 0 90 -38t38 -90t-38 -90t-90 -38v-512q99 0 177.5 -98.5t78.5 -221.5z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1408 608v-320q0 -119 -84.5 -203.5t-203.5 -84.5h-832q-119 0 -203.5 84.5t-84.5 203.5v832q0 119 84.5 203.5t203.5 84.5h704q14 0 23 -9t9 -23v-64q0 -14 -9 -23t-23 -9h-704q-66 0 -113 -47t-47 -113v-832q0 -66 47 -113t113 -47h832q66 0 113 47t47 113v320 q0 14 9 23t23 9h64q14 0 23 -9t9 -23zM1792 1472v-512q0 -26 -19 -45t-45 -19t-45 19l-176 176l-652 -652q-10 -10 -23 -10t-23 10l-114 114q-10 10 -10 23t10 23l652 652l-176 176q-19 19 -19 45t19 45t45 19h512q26 0 45 -19t19 -45z" />
+<glyph unicode="" d="M1184 640q0 -26 -19 -45l-544 -544q-19 -19 -45 -19t-45 19t-19 45v288h-448q-26 0 -45 19t-19 45v384q0 26 19 45t45 19h448v288q0 26 19 45t45 19t45 -19l544 -544q19 -19 19 -45zM1536 992v-704q0 -119 -84.5 -203.5t-203.5 -84.5h-320q-13 0 -22.5 9.5t-9.5 22.5 q0 4 -1 20t-0.5 26.5t3 23.5t10 19.5t20.5 6.5h320q66 0 113 47t47 113v704q0 66 -47 113t-113 47h-288h-11h-13t-11.5 1t-11.5 3t-8 5.5t-7 9t-2 13.5q0 4 -1 20t-0.5 26.5t3 23.5t10 19.5t20.5 6.5h320q119 0 203.5 -84.5t84.5 -203.5z" />
+<glyph unicode="" horiz-adv-x="1664" d="M458 653q-74 162 -74 371h-256v-96q0 -78 94.5 -162t235.5 -113zM1536 928v96h-256q0 -209 -74 -371q141 29 235.5 113t94.5 162zM1664 1056v-128q0 -71 -41.5 -143t-112 -130t-173 -97.5t-215.5 -44.5q-42 -54 -95 -95q-38 -34 -52.5 -72.5t-14.5 -89.5q0 -54 30.5 -91 t97.5 -37q75 0 133.5 -45.5t58.5 -114.5v-64q0 -14 -9 -23t-23 -9h-832q-14 0 -23 9t-9 23v64q0 69 58.5 114.5t133.5 45.5q67 0 97.5 37t30.5 91q0 51 -14.5 89.5t-52.5 72.5q-53 41 -95 95q-113 5 -215.5 44.5t-173 97.5t-112 130t-41.5 143v128q0 40 28 68t68 28h288v96 q0 66 47 113t113 47h576q66 0 113 -47t47 -113v-96h288q40 0 68 -28t28 -68z" />
+<glyph unicode="" d="M582 228q0 -66 -93 -66q-107 0 -107 63q0 64 98 64q102 0 102 -61zM546 694q0 -85 -74 -85q-77 0 -77 84q0 90 77 90q36 0 55 -26t19 -63zM712 769v125q-78 -29 -135 -29q-50 29 -110 29q-86 0 -145 -57t-59 -143q0 -50 29.5 -102t73.5 -67v-3q-38 -17 -38 -85 q0 -52 41 -77v-3q-113 -37 -113 -139q0 -60 36 -98t84 -51t107 -13q224 0 224 187q0 48 -25.5 78t-62.5 42.5t-74 21.5t-62.5 23.5t-25.5 39.5q0 44 49 52q77 15 122 70t45 134q0 24 -10 52q30 7 49 13zM771 350h137q-2 20 -2 90v372q0 59 2 76h-137q3 -26 3 -79v-377 q0 -55 -3 -82zM1280 366v121q-30 -21 -68 -21q-53 0 -53 82v225h52q9 0 26.5 -1t26.5 -1v117h-105q0 82 3 102h-140q4 -24 4 -55v-47h-60v-117q36 3 37 3q4 0 11.5 -0.5t11.5 -0.5v-2h-2v-217q0 -37 2.5 -64t11.5 -56.5t24.5 -48.5t43.5 -31t66 -12q64 0 108 24zM924 1072 q0 36 -24 63.5t-60 27.5t-60.5 -27t-24.5 -64q0 -36 25 -62.5t60 -26.5t59.5 27t24.5 62zM1536 1120v-960q0 -119 -84.5 -203.5t-203.5 -84.5h-960q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960q119 0 203.5 -84.5t84.5 -203.5z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1664 480v-576q0 -13 -9.5 -22.5t-22.5 -9.5h-1600q-13 0 -22.5 9.5t-9.5 22.5v576q0 13 9.5 22.5t22.5 9.5h192q13 0 22.5 -9.5t9.5 -22.5v-352h1152v352q0 13 9.5 22.5t22.5 9.5h192q13 0 22.5 -9.5t9.5 -22.5zM1344 832q0 -26 -19 -45t-45 -19h-256v-448 q0 -26 -19 -45t-45 -19h-256q-26 0 -45 19t-19 45v448h-256q-26 0 -45 19t-19 45t19 45l448 448q19 19 45 19t45 -19l448 -448q19 -19 19 -45z" />
+<glyph unicode="" d="M1407 710q0 44 -7 113.5t-18 96.5q-12 30 -17 44t-9 36.5t-4 48.5q0 23 5 68.5t5 67.5q0 37 -10 55q-4 1 -13 1q-19 0 -58 -4.5t-59 -4.5q-60 0 -176 24t-175 24q-43 0 -94.5 -11.5t-85 -23.5t-89.5 -34q-137 -54 -202 -103q-96 -73 -159.5 -189.5t-88 -236t-24.5 -248.5 q0 -40 12.5 -120t12.5 -121q0 -23 -11 -66.5t-11 -65.5t12 -36.5t34 -14.5q24 0 72.5 11t73.5 11q57 0 169.5 -15.5t169.5 -15.5q181 0 284 36q129 45 235.5 152.5t166 245.5t59.5 275zM1535 712q0 -165 -70 -327.5t-196 -288t-281 -180.5q-124 -44 -326 -44 q-57 0 -170 14.5t-169 14.5q-24 0 -72.5 -14.5t-73.5 -14.5q-73 0 -123.5 55.5t-50.5 128.5q0 24 11 68t11 67q0 40 -12.5 120.5t-12.5 121.5q0 111 18 217.5t54.5 209.5t100.5 194t150 156q78 59 232 120q194 78 316 78q60 0 175.5 -24t173.5 -24q19 0 57 5t58 5 q81 0 118 -50.5t37 -134.5q0 -23 -5 -68t-5 -68q0 -10 1 -18.5t3 -17t4 -13.5t6.5 -16t6.5 -17q16 -40 25 -118.5t9 -136.5z" />
+<glyph unicode="" horiz-adv-x="1408" d="M1408 296q0 -27 -10 -70.5t-21 -68.5q-21 -50 -122 -106q-94 -51 -186 -51q-27 0 -52.5 3.5t-57.5 12.5t-47.5 14.5t-55.5 20.5t-49 18q-98 35 -175 83q-128 79 -264.5 215.5t-215.5 264.5q-48 77 -83 175q-3 9 -18 49t-20.5 55.5t-14.5 47.5t-12.5 57.5t-3.5 52.5 q0 92 51 186q56 101 106 122q25 11 68.5 21t70.5 10q14 0 21 -3q18 -6 53 -76q11 -19 30 -54t35 -63.5t31 -53.5q3 -4 17.5 -25t21.5 -35.5t7 -28.5q0 -20 -28.5 -50t-62 -55t-62 -53t-28.5 -46q0 -9 5 -22.5t8.5 -20.5t14 -24t11.5 -19q76 -137 174 -235t235 -174 q2 -1 19 -11.5t24 -14t20.5 -8.5t22.5 -5q18 0 46 28.5t53 62t55 62t50 28.5q14 0 28.5 -7t35.5 -21.5t25 -17.5q25 -15 53.5 -31t63.5 -35t54 -30q70 -35 76 -53q3 -7 3 -21z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1120 1280h-832q-66 0 -113 -47t-47 -113v-832q0 -66 47 -113t113 -47h832q66 0 113 47t47 113v832q0 66 -47 113t-113 47zM1408 1120v-832q0 -119 -84.5 -203.5t-203.5 -84.5h-832q-119 0 -203.5 84.5t-84.5 203.5v832q0 119 84.5 203.5t203.5 84.5h832 q119 0 203.5 -84.5t84.5 -203.5z" />
+<glyph unicode="" horiz-adv-x="1280" d="M1152 1280h-1024v-1242l423 406l89 85l89 -85l423 -406v1242zM1164 1408q23 0 44 -9q33 -13 52.5 -41t19.5 -62v-1289q0 -34 -19.5 -62t-52.5 -41q-19 -8 -44 -8q-48 0 -83 32l-441 424l-441 -424q-36 -33 -83 -33q-23 0 -44 9q-33 13 -52.5 41t-19.5 62v1289 q0 34 19.5 62t52.5 41q21 9 44 9h1048z" />
+<glyph unicode="" d="M1280 343q0 11 -2 16q-3 8 -38.5 29.5t-88.5 49.5l-53 29q-5 3 -19 13t-25 15t-21 5q-18 0 -47 -32.5t-57 -65.5t-44 -33q-7 0 -16.5 3.5t-15.5 6.5t-17 9.5t-14 8.5q-99 55 -170.5 126.5t-126.5 170.5q-2 3 -8.5 14t-9.5 17t-6.5 15.5t-3.5 16.5q0 13 20.5 33.5t45 38.5 t45 39.5t20.5 36.5q0 10 -5 21t-15 25t-13 19q-3 6 -15 28.5t-25 45.5t-26.5 47.5t-25 40.5t-16.5 18t-16 2q-48 0 -101 -22q-46 -21 -80 -94.5t-34 -130.5q0 -16 2.5 -34t5 -30.5t9 -33t10 -29.5t12.5 -33t11 -30q60 -164 216.5 -320.5t320.5 -216.5q6 -2 30 -11t33 -12.5 t29.5 -10t33 -9t30.5 -5t34 -2.5q57 0 130.5 34t94.5 80q22 53 22 101zM1536 1120v-960q0 -119 -84.5 -203.5t-203.5 -84.5h-960q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960q119 0 203.5 -84.5t84.5 -203.5z" />
+<glyph unicode="" horiz-adv-x="1920" d="M1875 1202q0 -10 -5 -18q-64 -104 -179 -190v-33q4 -227 -100 -457q-134 -297 -397.5 -464.5t-591.5 -167.5q-265 0 -500 122q-64 33 -87 50q-15 12 -15 27q0 13 9.5 22.5t22.5 9.5q14 0 44 -2.5t45 -2.5q204 0 375 106q-103 24 -181 96t-111 173q-2 8 -2 11q0 12 9 21.5 t22 9.5q5 0 14 -2t12 -2q-89 55 -142 147t-53 196q0 15 11.5 25.5t27.5 10.5q10 0 35 -11.5t30 -13.5q-92 110 -92 256q0 51 14.5 108t40.5 95q10 16 25 16q16 0 27 -12q76 -84 110 -115q123 -111 276 -177.5t317 -80.5q-4 21 -4 49q0 167 118.5 285.5t285.5 118.5 q163 0 282 -114q95 20 209 82q8 5 16 5q13 0 22.5 -9.5t9.5 -22.5q0 -24 -28 -73t-51 -76q7 2 30 10.5t43 16t24 7.5q13 0 22.5 -9.5t9.5 -22.5z" />
+<glyph unicode="" horiz-adv-x="768" d="M560 1125q-49 0 -62 -15.5t-13 -66.5v-88h217q16 0 27 -12q11 -13 10 -29l-14 -200q-2 -15 -12.5 -25.5t-25.5 -10.5h-202v-768q0 -16 -11 -27t-26 -11h-250q-16 0 -27 11t-11 27v768h-122q-16 0 -27 11.5t-11 27.5v200q0 16 11 27t27 11h122v103q0 177 88 263.5 t267 86.5q120 0 225 -30q14 -4 22 -16t6 -26l-27 -195q-2 -16 -16 -26q-14 -9 -30 -6q-76 16 -135 16z" />
+<glyph unicode="" d="M1408 640q0 130 -51 248.5t-136.5 204t-204 136.5t-248.5 51t-248.5 -51t-204 -136.5t-136.5 -204t-51 -248.5q0 -209 124.5 -378.5t323.5 -231.5v169q-54 -7 -69 -7q-110 0 -153 100q-15 38 -36 63q-5 6 -21 19t-28.5 24t-12.5 16q0 12 28 12q29 0 51.5 -14.5t38 -35 t31.5 -41.5t40.5 -35.5t56.5 -14.5q42 0 81 14q16 57 63 89q-166 16 -246 83.5t-80 224.5q0 118 73 198q-14 42 -14 84q0 58 27 109q57 0 101 -19.5t101 -60.5q76 18 169 18q80 0 153 -16q57 40 100.5 59t99.5 19q27 -51 27 -109q0 -43 -14 -83q73 -82 73 -199 q0 -157 -80 -225.5t-245 -83.5q69 -47 69 -131v-226q199 62 323.5 231.5t124.5 378.5zM1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" horiz-adv-x="1664" d="M704 160q0 6 -15 57t-35 115.5t-20 65.5q32 16 51 47t19 67q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5q0 -36 19 -66.5t51 -47.5q0 -2 -20 -66t-35 -115t-15 -57q0 -13 9.5 -22.5t22.5 -9.5h192q13 0 22.5 9.5t9.5 22.5zM1664 960v-256q0 -26 -19 -45t-45 -19 h-64q-26 0 -45 19t-19 45v256q0 106 -75 181t-181 75t-181 -75t-75 -181v-192h96q40 0 68 -28t28 -68v-576q0 -40 -28 -68t-68 -28h-960q-40 0 -68 28t-28 68v576q0 40 28 68t68 28h672v192q0 185 131.5 316.5t316.5 131.5t316.5 -131.5t131.5 -316.5z" />
+<glyph unicode="" horiz-adv-x="1920" d="M1760 1408q66 0 113 -47t47 -113v-1216q0 -66 -47 -113t-113 -47h-1600q-66 0 -113 47t-47 113v1216q0 66 47 113t113 47h1600zM160 1280q-13 0 -22.5 -9.5t-9.5 -22.5v-224h1664v224q0 13 -9.5 22.5t-22.5 9.5h-1600zM1760 0q13 0 22.5 9.5t9.5 22.5v608h-1664v-608 q0 -13 9.5 -22.5t22.5 -9.5h1600zM256 128v128h256v-128h-256zM640 128v128h384v-128h-384z" />
+<glyph unicode="" horiz-adv-x="1408" d="M384 192q0 -80 -56 -136t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM896 69q2 -28 -17 -48q-18 -21 -47 -21h-135q-25 0 -43 16.5t-20 41.5q-22 229 -184.5 391.5t-391.5 184.5q-25 2 -41.5 20t-16.5 43v135q0 29 21 47q17 17 43 17h5q160 -13 306 -80.5 t259 -181.5q114 -113 181.5 -259t80.5 -306zM1408 67q2 -27 -18 -47q-18 -20 -46 -20h-143q-26 0 -44.5 17.5t-19.5 42.5q-12 215 -101 408.5t-231.5 336t-336 231.5t-408.5 102q-25 1 -42.5 19.5t-17.5 43.5v143q0 28 20 46q18 18 44 18h3q262 -13 501.5 -120t425.5 -294 q187 -186 294 -425.5t120 -501.5z" />
+<glyph unicode="" d="M1040 320q0 -33 -23.5 -56.5t-56.5 -23.5t-56.5 23.5t-23.5 56.5t23.5 56.5t56.5 23.5t56.5 -23.5t23.5 -56.5zM1296 320q0 -33 -23.5 -56.5t-56.5 -23.5t-56.5 23.5t-23.5 56.5t23.5 56.5t56.5 23.5t56.5 -23.5t23.5 -56.5zM1408 160v320q0 13 -9.5 22.5t-22.5 9.5 h-1216q-13 0 -22.5 -9.5t-9.5 -22.5v-320q0 -13 9.5 -22.5t22.5 -9.5h1216q13 0 22.5 9.5t9.5 22.5zM178 640h1180l-157 482q-4 13 -16 21.5t-26 8.5h-782q-14 0 -26 -8.5t-16 -21.5zM1536 480v-320q0 -66 -47 -113t-113 -47h-1216q-66 0 -113 47t-47 113v320q0 25 16 75 l197 606q17 53 63 86t101 33h782q55 0 101 -33t63 -86l197 -606q16 -50 16 -75z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1664 896q53 0 90.5 -37.5t37.5 -90.5t-37.5 -90.5t-90.5 -37.5v-384q0 -52 -38 -90t-90 -38q-417 347 -812 380q-58 -19 -91 -66t-31 -100.5t40 -92.5q-20 -33 -23 -65.5t6 -58t33.5 -55t48 -50t61.5 -50.5q-29 -58 -111.5 -83t-168.5 -11.5t-132 55.5q-7 23 -29.5 87.5 t-32 94.5t-23 89t-15 101t3.5 98.5t22 110.5h-122q-66 0 -113 47t-47 113v192q0 66 47 113t113 47h480q435 0 896 384q52 0 90 -38t38 -90v-384zM1536 292v954q-394 -302 -768 -343v-270q377 -42 768 -341z" />
+<glyph unicode="" horiz-adv-x="1664" d="M848 -160q0 16 -16 16q-59 0 -101.5 42.5t-42.5 101.5q0 16 -16 16t-16 -16q0 -73 51.5 -124.5t124.5 -51.5q16 0 16 16zM183 128h1298q-164 181 -246.5 411.5t-82.5 484.5q0 256 -320 256t-320 -256q0 -254 -82.5 -484.5t-246.5 -411.5zM1664 128q0 -52 -38 -90t-90 -38 h-448q0 -106 -75 -181t-181 -75t-181 75t-75 181h-448q-52 0 -90 38t-38 90q190 161 287 397.5t97 498.5q0 165 96 262t264 117q-8 18 -8 37q0 40 28 68t68 28t68 -28t28 -68q0 -19 -8 -37q168 -20 264 -117t96 -262q0 -262 97 -498.5t287 -397.5z" />
+<glyph unicode="" d="M1376 640l138 -135q30 -28 20 -70q-12 -41 -52 -51l-188 -48l53 -186q12 -41 -19 -70q-29 -31 -70 -19l-186 53l-48 -188q-10 -40 -51 -52q-12 -2 -19 -2q-31 0 -51 22l-135 138l-135 -138q-28 -30 -70 -20q-41 11 -51 52l-48 188l-186 -53q-41 -12 -70 19q-31 29 -19 70 l53 186l-188 48q-40 10 -52 51q-10 42 20 70l138 135l-138 135q-30 28 -20 70q12 41 52 51l188 48l-53 186q-12 41 19 70q29 31 70 19l186 -53l48 188q10 41 51 51q41 12 70 -19l135 -139l135 139q29 30 70 19q41 -10 51 -51l48 -188l186 53q41 12 70 -19q31 -29 19 -70 l-53 -186l188 -48q40 -10 52 -51q10 -42 -20 -70z" />
+<glyph unicode="" horiz-adv-x="1792" d="M256 192q0 26 -19 45t-45 19t-45 -19t-19 -45t19 -45t45 -19t45 19t19 45zM1664 768q0 51 -39 89.5t-89 38.5h-576q0 20 15 48.5t33 55t33 68t15 84.5q0 67 -44.5 97.5t-115.5 30.5q-24 0 -90 -139q-24 -44 -37 -65q-40 -64 -112 -145q-71 -81 -101 -106 q-69 -57 -140 -57h-32v-640h32q72 0 167 -32t193.5 -64t179.5 -32q189 0 189 167q0 26 -5 56q30 16 47.5 52.5t17.5 73.5t-18 69q53 50 53 119q0 25 -10 55.5t-25 47.5h331q52 0 90 38t38 90zM1792 769q0 -105 -75.5 -181t-180.5 -76h-169q-4 -62 -37 -119q3 -21 3 -43 q0 -101 -60 -178q1 -139 -85 -219.5t-227 -80.5q-133 0 -322 69q-164 59 -223 59h-288q-53 0 -90.5 37.5t-37.5 90.5v640q0 53 37.5 90.5t90.5 37.5h288q10 0 21.5 4.5t23.5 14t22.5 18t24 22.5t20.5 21.5t19 21.5t14 17q65 74 100 129q13 21 33 62t37 72t40.5 63t55 49.5 t69.5 17.5q125 0 206.5 -67t81.5 -189q0 -68 -22 -128h374q104 0 180 -76t76 -179z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1376 128h32v640h-32q-35 0 -67 11.5t-64 38.5t-48 44t-50 55q-2 3 -3.5 4.5t-4 4.5t-4.5 5q-72 81 -112 145q-14 22 -38 68q-1 3 -10.5 22.5t-18.5 36t-20 35.5t-21.5 30.5t-18.5 11.5q-71 0 -115.5 -30.5t-44.5 -97.5q0 -43 15 -84.5t33 -68t33 -55t15 -48.5h-576 q-50 0 -89 -38.5t-39 -89.5q0 -52 38 -90t90 -38h331q-15 -17 -25 -47.5t-10 -55.5q0 -69 53 -119q-18 -32 -18 -69t17.5 -73.5t47.5 -52.5q-4 -24 -4 -56q0 -85 48.5 -126t135.5 -41q84 0 183 32t194 64t167 32zM1664 192q0 26 -19 45t-45 19t-45 -19t-19 -45t19 -45 t45 -19t45 19t19 45zM1792 768v-640q0 -53 -37.5 -90.5t-90.5 -37.5h-288q-59 0 -223 -59q-190 -69 -317 -69q-142 0 -230 77.5t-87 217.5l1 5q-61 76 -61 178q0 22 3 43q-33 57 -37 119h-169q-105 0 -180.5 76t-75.5 181q0 103 76 179t180 76h374q-22 60 -22 128 q0 122 81.5 189t206.5 67q38 0 69.5 -17.5t55 -49.5t40.5 -63t37 -72t33 -62q35 -55 100 -129q2 -3 14 -17t19 -21.5t20.5 -21.5t24 -22.5t22.5 -18t23.5 -14t21.5 -4.5h288q53 0 90.5 -37.5t37.5 -90.5z" />
+<glyph unicode="" d="M1280 -64q0 26 -19 45t-45 19t-45 -19t-19 -45t19 -45t45 -19t45 19t19 45zM1408 700q0 189 -167 189q-26 0 -56 -5q-16 30 -52.5 47.5t-73.5 17.5t-69 -18q-50 53 -119 53q-25 0 -55.5 -10t-47.5 -25v331q0 52 -38 90t-90 38q-51 0 -89.5 -39t-38.5 -89v-576 q-20 0 -48.5 15t-55 33t-68 33t-84.5 15q-67 0 -97.5 -44.5t-30.5 -115.5q0 -24 139 -90q44 -24 65 -37q64 -40 145 -112q81 -71 106 -101q57 -69 57 -140v-32h640v32q0 72 32 167t64 193.5t32 179.5zM1536 705q0 -133 -69 -322q-59 -164 -59 -223v-288q0 -53 -37.5 -90.5 t-90.5 -37.5h-640q-53 0 -90.5 37.5t-37.5 90.5v288q0 10 -4.5 21.5t-14 23.5t-18 22.5t-22.5 24t-21.5 20.5t-21.5 19t-17 14q-74 65 -129 100q-21 13 -62 33t-72 37t-63 40.5t-49.5 55t-17.5 69.5q0 125 67 206.5t189 81.5q68 0 128 -22v374q0 104 76 180t179 76 q105 0 181 -75.5t76 -180.5v-169q62 -4 119 -37q21 3 43 3q101 0 178 -60q139 1 219.5 -85t80.5 -227z" />
+<glyph unicode="" d="M1408 576q0 84 -32 183t-64 194t-32 167v32h-640v-32q0 -46 -25 -91t-52 -72t-72 -66q-9 -8 -14 -12q-81 -72 -145 -112q-22 -14 -68 -38q-3 -1 -22.5 -10.5t-36 -18.5t-35.5 -20t-30.5 -21.5t-11.5 -18.5q0 -71 30.5 -115.5t97.5 -44.5q43 0 84.5 15t68 33t55 33 t48.5 15v-576q0 -50 38.5 -89t89.5 -39q52 0 90 38t38 90v331q46 -35 103 -35q69 0 119 53q32 -18 69 -18t73.5 17.5t52.5 47.5q24 -4 56 -4q85 0 126 48.5t41 135.5zM1280 1344q0 26 -19 45t-45 19t-45 -19t-19 -45t19 -45t45 -19t45 19t19 45zM1536 580q0 -142 -77.5 -230 t-217.5 -87l-5 1q-76 -61 -178 -61q-22 0 -43 3q-54 -30 -119 -37v-169q0 -105 -76 -180.5t-181 -75.5q-103 0 -179 76t-76 180v374q-54 -22 -128 -22q-121 0 -188.5 81.5t-67.5 206.5q0 38 17.5 69.5t49.5 55t63 40.5t72 37t62 33q55 35 129 100q3 2 17 14t21.5 19 t21.5 20.5t22.5 24t18 22.5t14 23.5t4.5 21.5v288q0 53 37.5 90.5t90.5 37.5h640q53 0 90.5 -37.5t37.5 -90.5v-288q0 -59 59 -223q69 -190 69 -317z" />
+<glyph unicode="" d="M1280 576v128q0 26 -19 45t-45 19h-502l189 189q19 19 19 45t-19 45l-91 91q-18 18 -45 18t-45 -18l-362 -362l-91 -91q-18 -18 -18 -45t18 -45l91 -91l362 -362q18 -18 45 -18t45 18l91 91q18 18 18 45t-18 45l-189 189h502q26 0 45 19t19 45zM1536 640 q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1285 640q0 27 -18 45l-91 91l-362 362q-18 18 -45 18t-45 -18l-91 -91q-18 -18 -18 -45t18 -45l189 -189h-502q-26 0 -45 -19t-19 -45v-128q0 -26 19 -45t45 -19h502l-189 -189q-19 -19 -19 -45t19 -45l91 -91q18 -18 45 -18t45 18l362 362l91 91q18 18 18 45zM1536 640 q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1284 641q0 27 -18 45l-362 362l-91 91q-18 18 -45 18t-45 -18l-91 -91l-362 -362q-18 -18 -18 -45t18 -45l91 -91q18 -18 45 -18t45 18l189 189v-502q0 -26 19 -45t45 -19h128q26 0 45 19t19 45v502l189 -189q19 -19 45 -19t45 19l91 91q18 18 18 45zM1536 640 q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1284 639q0 27 -18 45l-91 91q-18 18 -45 18t-45 -18l-189 -189v502q0 26 -19 45t-45 19h-128q-26 0 -45 -19t-19 -45v-502l-189 189q-19 19 -45 19t-45 -19l-91 -91q-18 -18 -18 -45t18 -45l362 -362l91 -91q18 -18 45 -18t45 18l91 91l362 362q18 18 18 45zM1536 640 q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1193 993q11 7 25 22v-1q0 -2 -9.5 -10t-11.5 -12q-1 1 -4 1zM1187 992q-1 1 -2.5 3t-1.5 3q3 -2 10 -5q-6 -4 -6 -1zM728 1175q-16 2 -26 5q1 0 6.5 -1t10.5 -2t9 -2zM773 1212q7 4 13.5 2.5t7.5 -7.5q-5 3 -21 5zM765 1206l-3 2q-2 3 -5.5 5t-4.5 2q2 -1 21 -3 q-6 -4 -8 -6zM663 1290v2q1 -2 3 -5.5t3 -5.5zM558 1250q0 -2 -1 -2l-1 2h2zM933 206v-1v1zM768 1408q209 0 385.5 -103t279.5 -279.5t103 -385.5t-103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103zM1240 162 l5 5q-7 10 -29 12q1 12 -14 26.5t-27 15.5q0 4 -10.5 11t-17.5 8q-9 2 -27 -9q-7 -3 -4 -5q-3 3 -12 11t-16 11q-2 1 -7.5 1t-8.5 2q-1 1 -6 4.5t-7 4.5t-6.5 3t-7.5 1.5t-7.5 -2.5t-8.5 -6t-4.5 -15.5t-2.5 -14.5q-8 6 -0.5 20t1.5 20q-7 7 -21 0.5t-21 -15.5 q-1 -1 -9.5 -5.5t-11.5 -7.5q-4 -6 -9 -17.5t-6 -13.5q0 2 -2.5 6.5t-2.5 6.5q-12 -2 -16 3q5 -16 8 -17l-4 2q-1 -6 3 -15t4 -11q1 -5 -1.5 -13t-2.5 -11q0 -2 5 -11q4 -19 -2 -32q0 -1 -3.5 -7t-6.5 -11l-2 -5l-2 1q-1 1 -2 0q-1 -6 -9 -13t-10 -11q-15 -23 -9 -38 q3 -8 10 -10q3 -1 3 2q1 -9 -11 -27q1 -1 4 -3q-17 0 -10 -14q202 36 352 181h-3zM680 347q16 3 30.5 -16t22.5 -23q41 -20 59 -11q0 -9 14 -28q3 -4 6.5 -11.5t5.5 -10.5q5 -7 19 -16t19 -16q6 3 9 9q13 -35 24 -34q5 0 8 8q0 -1 -0.5 -3t-1.5 -3q7 15 5 26l6 4q5 4 5 5 q-6 6 -9 -3q-30 -14 -48 22q-2 3 -4.5 8t-5 12t-1.5 11.5t6 4.5q11 0 12.5 1.5t-2.5 6t-4 7.5q-1 4 -1.5 12.5t-1.5 12.5l-5 6q-5 6 -11.5 13.5t-7.5 9.5q-4 -10 -16.5 -8.5t-18.5 9.5q1 -2 -0.5 -6.5t-1.5 -6.5q-14 0 -17 1q1 6 3 21t4 22q1 5 5.5 13.5t8 15.5t4.5 14 t-4.5 10.5t-18.5 2.5q-20 -1 -29 -22q-1 -3 -3 -11.5t-5 -12.5t-9 -7q-8 -3 -27 -2t-26 5q-14 8 -24 30.5t-11 41.5q0 10 3 27.5t3 27t-6 26.5q3 2 10 10.5t11 11.5q2 2 5 2h5t4 2t3 6q-1 1 -4 3q-3 3 -4 3q4 -3 19 -1t19 2q0 1 22 0q17 -13 24 2q0 1 -2.5 10.5t-0.5 14.5 q5 -29 32 -10q3 -4 16.5 -6t18.5 -5q3 -2 7 -5.5t6 -5t6 -0.5t9 7q11 -17 13 -25q11 -43 20 -48q8 -2 12.5 -2t5 10.5t0 15.5t-1.5 13l-2 37q-16 3 -20 12.5t1.5 20t16.5 19.5q1 1 16.5 8t21.5 12q24 19 17 39q9 -2 11 9l-5 3q-4 3 -8 5.5t-5 1.5q11 7 2 18q5 3 8 11.5 t9 11.5q9 -14 22 -3q8 9 2 18q5 8 22 11.5t20 9.5q5 -1 7 0t2 4.5v7.5t1 8.5t3 7.5q4 6 16 10.5t14 5.5l19 12q4 4 0 4q18 -2 32 11q13 12 -5 23q2 7 -4 10.5t-16 5.5q3 1 12 0.5t12 1.5q15 11 -7 17q-20 5 -47 -13q-3 -2 -13 -12t-17 -11q15 18 5 22q8 -1 22.5 9t15.5 11 q4 2 10.5 2.5t8.5 1.5q71 25 92 -1q8 11 11 15t9.5 9t15.5 8q21 7 23 9l1 23q-12 -1 -18 8t-7 22l-6 -8q0 6 -3.5 7.5t-7.5 0.5t-9.5 -2t-7.5 0q-9 2 -19.5 15.5t-14.5 16.5q9 0 9 5q-2 5 -10 8q1 6 -2 8t-9 0q-2 12 -1 13q-6 1 -11 11t-8 10q-2 0 -4.5 -2t-5 -5.5l-5 -7 t-3.5 -5.5l-2 -2q-12 6 -24 -10q-9 1 -17 -2q15 6 2 13q-11 5 -21 2q12 5 10 14t-12 16q1 0 4 -1t4 -1q-1 5 -9.5 9.5t-19.5 9t-14 6.5q-7 5 -36 10.5t-36 1.5q-5 -3 -6 -6t1.5 -8.5t3.5 -8.5q6 -23 5 -27q-1 -3 -8.5 -8t-5.5 -12q1 -4 11.5 -10t12.5 -12q5 -13 -4 -25 q-4 -5 -15 -11t-14 -10q-5 -5 -3.5 -11.5t0.5 -9.5q1 1 1 2.5t1 2.5q0 -13 11 -22q8 -6 -16 -18q-20 -11 -20 -4q1 8 -7.5 16t-10.5 12t-3.5 19t-9.5 21q-6 4 -19 4t-18 -5q0 10 -49 30q-17 8 -58 4q7 1 0 17q-8 16 -21 12q-8 25 -4 35q2 5 9 14t9 15q1 3 15.5 6t16.5 8 q1 4 -2.5 6.5t-9.5 4.5q53 -6 63 18q5 9 3 14q0 -1 2 -1t2 -1q12 3 7 17q19 8 26 8q5 -1 11 -6t10 -5q17 -3 21.5 10t-9.5 23q7 -4 7 6q-1 13 -7 19q-3 2 -6.5 2.5t-6.5 0t-7 0.5q-1 0 -8 2q-1 -1 -2 -1h-8q-4 -2 -4 -5v-1q-1 -3 4 -6l5 -1l3 -2q-1 0 -2.5 -2.5t-2.5 -2.5 q0 -3 3 -5q-2 -1 -14 -7.5t-17 -10.5q-1 -1 -4 -2.5t-4 -2.5q-2 -1 -4 2t-4 9t-4 11.5t-4.5 10t-5.5 4.5q-12 0 -18 -17q3 10 -13 17.5t-25 7.5q20 15 -9 30l-1 1q-30 -4 -45 -7q-2 -6 3 -12q-1 -7 6 -9q0 -1 0.5 -1t0.5 -1q0 1 -0.5 1t-0.5 1q3 -1 10.5 -1.5t9.5 -1.5 q3 -1 4.5 -2l7.5 -5t5.5 -6t-2.5 
+-5q-2 -1 -9 -4t-12.5 -5.5t-6.5 -3.5q-3 -5 0 -16t-2 -15q-5 5 -10 18.5t-8 17.5q8 -9 -30 -6l-8 1q-4 0 -15 -2t-16 -1q-7 0 -29 6q7 17 5 25q5 0 7 2l-6 3q-3 -1 -25 -9q2 -3 8 -9.5t9 -11.5q-22 6 -27 -2q0 -1 -9 0q-25 1 -24 -7 q1 -4 9 -12q0 -9 -1 -9q-27 22 -30 23q-172 -83 -276 -248q1 -2 2.5 -11t3.5 -8.5t11 4.5q9 -9 3 -21q2 2 36 -21q56 -40 22 -53v5.5t1 6.5q-9 -1 -19 5q-3 -6 0.5 -20t11.5 -14q-8 0 -10.5 -17t-2.5 -38.5t-1 -25.5l2 -1q-3 -13 6 -37.5t24 -20.5q-4 -18 5 -21q-1 -4 0 -8 t4.5 -8.5t6 -7l7.5 -7.5l6 -6q28 -11 41 -29q4 -6 10.5 -24.5t15.5 -25.5q-2 -6 10 -21.5t11 -25.5q-1 0 -2.5 -0.5t-2.5 -0.5q3 -8 16.5 -16t16.5 -14q2 -3 2.5 -10.5t3 -12t8.5 -2.5q3 24 -26 68q-16 27 -18 31q-3 5 -5.5 16.5t-4.5 15.5q27 -9 26 -13q-5 -10 26 -52 q2 -3 10 -10t11 -12q3 -4 9.5 -14.5t10.5 -15.5q-1 0 -3 -2l-3 -3q4 -2 9 -5t8 -4.5t7.5 -5t7.5 -7.5q16 -18 20 -33q1 -4 0.5 -15.5t1.5 -16.5q2 -6 6 -11t11.5 -10t11.5 -7t14.5 -6.5t11.5 -5.5q2 -1 18 -11t25 -14q10 -4 16.5 -4.5t16 2.5t15.5 4z" />
+<glyph unicode="" horiz-adv-x="1664" d="M384 64q0 26 -19 45t-45 19t-45 -19t-19 -45t19 -45t45 -19t45 19t19 45zM1028 484l-682 -682q-37 -37 -90 -37q-52 0 -91 37l-106 108q-38 36 -38 90q0 53 38 91l681 681q39 -98 114.5 -173.5t173.5 -114.5zM1662 919q0 -39 -23 -106q-47 -134 -164.5 -217.5 t-258.5 -83.5q-185 0 -316.5 131.5t-131.5 316.5t131.5 316.5t316.5 131.5q58 0 121.5 -16.5t107.5 -46.5q16 -11 16 -28t-16 -28l-293 -169v-224l193 -107q5 3 79 48.5t135.5 81t70.5 35.5q15 0 23.5 -10t8.5 -25z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1024 128h640v128h-640v-128zM640 640h1024v128h-1024v-128zM1280 1152h384v128h-384v-128zM1792 320v-256q0 -26 -19 -45t-45 -19h-1664q-26 0 -45 19t-19 45v256q0 26 19 45t45 19h1664q26 0 45 -19t19 -45zM1792 832v-256q0 -26 -19 -45t-45 -19h-1664q-26 0 -45 19 t-19 45v256q0 26 19 45t45 19h1664q26 0 45 -19t19 -45zM1792 1344v-256q0 -26 -19 -45t-45 -19h-1664q-26 0 -45 19t-19 45v256q0 26 19 45t45 19h1664q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1408" d="M1403 1241q17 -41 -14 -70l-493 -493v-742q0 -42 -39 -59q-13 -5 -25 -5q-27 0 -45 19l-256 256q-19 19 -19 45v486l-493 493q-31 29 -14 70q17 39 59 39h1280q42 0 59 -39z" />
+<glyph unicode="" horiz-adv-x="1792" d="M640 1152h512v128h-512v-128zM1792 512v-480q0 -66 -47 -113t-113 -47h-1472q-66 0 -113 47t-47 113v480h672v-160q0 -26 19 -45t45 -19h320q26 0 45 19t19 45v160h672zM1024 512v-128h-256v128h256zM1792 992v-384h-1792v384q0 66 47 113t113 47h352v160q0 40 28 68 t68 28h576q40 0 68 -28t28 -68v-160h352q66 0 113 -47t47 -113z" />
+<glyph unicode="" d="M1283 995l-355 -355l355 -355l144 144q29 31 70 14q39 -17 39 -59v-448q0 -26 -19 -45t-45 -19h-448q-42 0 -59 40q-17 39 14 69l144 144l-355 355l-355 -355l144 -144q31 -30 14 -69q-17 -40 -59 -40h-448q-26 0 -45 19t-19 45v448q0 42 40 59q39 17 69 -14l144 -144 l355 355l-355 355l-144 -144q-19 -19 -45 -19q-12 0 -24 5q-40 17 -40 59v448q0 26 19 45t45 19h448q42 0 59 -40q17 -39 -14 -69l-144 -144l355 -355l355 355l-144 144q-31 30 -14 69q17 40 59 40h448q26 0 45 -19t19 -45v-448q0 -42 -39 -59q-13 -5 -25 -5q-26 0 -45 19z " />
+<glyph unicode="" horiz-adv-x="1920" d="M593 640q-162 -5 -265 -128h-134q-82 0 -138 40.5t-56 118.5q0 353 124 353q6 0 43.5 -21t97.5 -42.5t119 -21.5q67 0 133 23q-5 -37 -5 -66q0 -139 81 -256zM1664 3q0 -120 -73 -189.5t-194 -69.5h-874q-121 0 -194 69.5t-73 189.5q0 53 3.5 103.5t14 109t26.5 108.5 t43 97.5t62 81t85.5 53.5t111.5 20q10 0 43 -21.5t73 -48t107 -48t135 -21.5t135 21.5t107 48t73 48t43 21.5q61 0 111.5 -20t85.5 -53.5t62 -81t43 -97.5t26.5 -108.5t14 -109t3.5 -103.5zM640 1280q0 -106 -75 -181t-181 -75t-181 75t-75 181t75 181t181 75t181 -75 t75 -181zM1344 896q0 -159 -112.5 -271.5t-271.5 -112.5t-271.5 112.5t-112.5 271.5t112.5 271.5t271.5 112.5t271.5 -112.5t112.5 -271.5zM1920 671q0 -78 -56 -118.5t-138 -40.5h-134q-103 123 -265 128q81 117 81 256q0 29 -5 66q66 -23 133 -23q59 0 119 21.5t97.5 42.5 t43.5 21q124 0 124 -353zM1792 1280q0 -106 -75 -181t-181 -75t-181 75t-75 181t75 181t181 75t181 -75t75 -181z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1456 320q0 40 -28 68l-208 208q-28 28 -68 28q-42 0 -72 -32q3 -3 19 -18.5t21.5 -21.5t15 -19t13 -25.5t3.5 -27.5q0 -40 -28 -68t-68 -28q-15 0 -27.5 3.5t-25.5 13t-19 15t-21.5 21.5t-18.5 19q-33 -31 -33 -73q0 -40 28 -68l206 -207q27 -27 68 -27q40 0 68 26 l147 146q28 28 28 67zM753 1025q0 40 -28 68l-206 207q-28 28 -68 28q-39 0 -68 -27l-147 -146q-28 -28 -28 -67q0 -40 28 -68l208 -208q27 -27 68 -27q42 0 72 31q-3 3 -19 18.5t-21.5 21.5t-15 19t-13 25.5t-3.5 27.5q0 40 28 68t68 28q15 0 27.5 -3.5t25.5 -13t19 -15 t21.5 -21.5t18.5 -19q33 31 33 73zM1648 320q0 -120 -85 -203l-147 -146q-83 -83 -203 -83q-121 0 -204 85l-206 207q-83 83 -83 203q0 123 88 209l-88 88q-86 -88 -208 -88q-120 0 -204 84l-208 208q-84 84 -84 204t85 203l147 146q83 83 203 83q121 0 204 -85l206 -207 q83 -83 83 -203q0 -123 -88 -209l88 -88q86 88 208 88q120 0 204 -84l208 -208q84 -84 84 -204z" />
+<glyph unicode="" horiz-adv-x="1920" d="M1920 384q0 -159 -112.5 -271.5t-271.5 -112.5h-1088q-185 0 -316.5 131.5t-131.5 316.5q0 132 71 241.5t187 163.5q-2 28 -2 43q0 212 150 362t362 150q158 0 286.5 -88t187.5 -230q70 62 166 62q106 0 181 -75t75 -181q0 -75 -41 -138q129 -30 213 -134.5t84 -239.5z " />
+<glyph unicode="" horiz-adv-x="1664" d="M1527 88q56 -89 21.5 -152.5t-140.5 -63.5h-1152q-106 0 -140.5 63.5t21.5 152.5l503 793v399h-64q-26 0 -45 19t-19 45t19 45t45 19h512q26 0 45 -19t19 -45t-19 -45t-45 -19h-64v-399zM748 813l-272 -429h712l-272 429l-20 31v37v399h-128v-399v-37z" />
+<glyph unicode="" horiz-adv-x="1792" d="M960 640q26 0 45 -19t19 -45t-19 -45t-45 -19t-45 19t-19 45t19 45t45 19zM1260 576l507 -398q28 -20 25 -56q-5 -35 -35 -51l-128 -64q-13 -7 -29 -7q-17 0 -31 8l-690 387l-110 -66q-8 -4 -12 -5q14 -49 10 -97q-7 -77 -56 -147.5t-132 -123.5q-132 -84 -277 -84 q-136 0 -222 78q-90 84 -79 207q7 76 56 147t131 124q132 84 278 84q83 0 151 -31q9 13 22 22l122 73l-122 73q-13 9 -22 22q-68 -31 -151 -31q-146 0 -278 84q-82 53 -131 124t-56 147q-5 59 15.5 113t63.5 93q85 79 222 79q145 0 277 -84q83 -52 132 -123t56 -148 q4 -48 -10 -97q4 -1 12 -5l110 -66l690 387q14 8 31 8q16 0 29 -7l128 -64q30 -16 35 -51q3 -36 -25 -56zM579 836q46 42 21 108t-106 117q-92 59 -192 59q-74 0 -113 -36q-46 -42 -21 -108t106 -117q92 -59 192 -59q74 0 113 36zM494 91q81 51 106 117t-21 108 q-39 36 -113 36q-100 0 -192 -59q-81 -51 -106 -117t21 -108q39 -36 113 -36q100 0 192 59zM672 704l96 -58v11q0 36 33 56l14 8l-79 47l-26 -26q-3 -3 -10 -11t-12 -12q-2 -2 -4 -3.5t-3 -2.5zM896 480l96 -32l736 576l-128 64l-768 -431v-113l-160 -96l9 -8q2 -2 7 -6 q4 -4 11 -12t11 -12l26 -26zM1600 64l128 64l-520 408l-177 -138q-2 -3 -13 -7z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1696 1152q40 0 68 -28t28 -68v-1216q0 -40 -28 -68t-68 -28h-960q-40 0 -68 28t-28 68v288h-544q-40 0 -68 28t-28 68v672q0 40 20 88t48 76l408 408q28 28 76 48t88 20h416q40 0 68 -28t28 -68v-328q68 40 128 40h416zM1152 939l-299 -299h299v299zM512 1323l-299 -299 h299v299zM708 676l316 316v416h-384v-416q0 -40 -28 -68t-68 -28h-416v-640h512v256q0 40 20 88t48 76zM1664 -128v1152h-384v-416q0 -40 -28 -68t-68 -28h-416v-640h896z" />
+<glyph unicode="" horiz-adv-x="1408" d="M1404 151q0 -117 -79 -196t-196 -79q-135 0 -235 100l-777 776q-113 115 -113 271q0 159 110 270t269 111q158 0 273 -113l605 -606q10 -10 10 -22q0 -16 -30.5 -46.5t-46.5 -30.5q-13 0 -23 10l-606 607q-79 77 -181 77q-106 0 -179 -75t-73 -181q0 -105 76 -181 l776 -777q63 -63 145 -63q64 0 106 42t42 106q0 82 -63 145l-581 581q-26 24 -60 24q-29 0 -48 -19t-19 -48q0 -32 25 -59l410 -410q10 -10 10 -22q0 -16 -31 -47t-47 -31q-12 0 -22 10l-410 410q-63 61 -63 149q0 82 57 139t139 57q88 0 149 -63l581 -581q100 -98 100 -235 z" />
+<glyph unicode="" d="M384 0h768v384h-768v-384zM1280 0h128v896q0 14 -10 38.5t-20 34.5l-281 281q-10 10 -34 20t-39 10v-416q0 -40 -28 -68t-68 -28h-576q-40 0 -68 28t-28 68v416h-128v-1280h128v416q0 40 28 68t68 28h832q40 0 68 -28t28 -68v-416zM896 928v320q0 13 -9.5 22.5t-22.5 9.5 h-192q-13 0 -22.5 -9.5t-9.5 -22.5v-320q0 -13 9.5 -22.5t22.5 -9.5h192q13 0 22.5 9.5t9.5 22.5zM1536 896v-928q0 -40 -28 -68t-68 -28h-1344q-40 0 -68 28t-28 68v1344q0 40 28 68t68 28h928q40 0 88 -20t76 -48l280 -280q28 -28 48 -76t20 -88z" />
+<glyph unicode="" d="M1536 1120v-960q0 -119 -84.5 -203.5t-203.5 -84.5h-960q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960q119 0 203.5 -84.5t84.5 -203.5z" />
+<glyph unicode="" d="M1536 192v-128q0 -26 -19 -45t-45 -19h-1408q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1408q26 0 45 -19t19 -45zM1536 704v-128q0 -26 -19 -45t-45 -19h-1408q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1408q26 0 45 -19t19 -45zM1536 1216v-128q0 -26 -19 -45 t-45 -19h-1408q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h1408q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1792" d="M384 128q0 -80 -56 -136t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM384 640q0 -80 -56 -136t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM1792 224v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1216q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5 t22.5 9.5h1216q13 0 22.5 -9.5t9.5 -22.5zM384 1152q0 -80 -56 -136t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM1792 736v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1216q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h1216q13 0 22.5 -9.5t9.5 -22.5z M1792 1248v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1216q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h1216q13 0 22.5 -9.5t9.5 -22.5z" />
+<glyph unicode="" horiz-adv-x="1792" d="M381 -84q0 -80 -54.5 -126t-135.5 -46q-106 0 -172 66l57 88q49 -45 106 -45q29 0 50.5 14.5t21.5 42.5q0 64 -105 56l-26 56q8 10 32.5 43.5t42.5 54t37 38.5v1q-16 0 -48.5 -1t-48.5 -1v-53h-106v152h333v-88l-95 -115q51 -12 81 -49t30 -88zM383 543v-159h-362 q-6 36 -6 54q0 51 23.5 93t56.5 68t66 47.5t56.5 43.5t23.5 45q0 25 -14.5 38.5t-39.5 13.5q-46 0 -81 -58l-85 59q24 51 71.5 79.5t105.5 28.5q73 0 123 -41.5t50 -112.5q0 -50 -34 -91.5t-75 -64.5t-75.5 -50.5t-35.5 -52.5h127v60h105zM1792 224v-192q0 -13 -9.5 -22.5 t-22.5 -9.5h-1216q-13 0 -22.5 9.5t-9.5 22.5v192q0 14 9 23t23 9h1216q13 0 22.5 -9.5t9.5 -22.5zM384 1123v-99h-335v99h107q0 41 0.5 122t0.5 121v12h-2q-8 -17 -50 -54l-71 76l136 127h106v-404h108zM1792 736v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1216q-13 0 -22.5 9.5 t-9.5 22.5v192q0 14 9 23t23 9h1216q13 0 22.5 -9.5t9.5 -22.5zM1792 1248v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1216q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h1216q13 0 22.5 -9.5t9.5 -22.5z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1760 640q14 0 23 -9t9 -23v-64q0 -14 -9 -23t-23 -9h-1728q-14 0 -23 9t-9 23v64q0 14 9 23t23 9h1728zM483 704q-28 35 -51 80q-48 97 -48 188q0 181 134 309q133 127 393 127q50 0 167 -19q66 -12 177 -48q10 -38 21 -118q14 -123 14 -183q0 -18 -5 -45l-12 -3l-84 6 l-14 2q-50 149 -103 205q-88 91 -210 91q-114 0 -182 -59q-67 -58 -67 -146q0 -73 66 -140t279 -129q69 -20 173 -66q58 -28 95 -52h-743zM990 448h411q7 -39 7 -92q0 -111 -41 -212q-23 -55 -71 -104q-37 -35 -109 -81q-80 -48 -153 -66q-80 -21 -203 -21q-114 0 -195 23 l-140 40q-57 16 -72 28q-8 8 -8 22v13q0 108 -2 156q-1 30 0 68l2 37v44l102 2q15 -34 30 -71t22.5 -56t12.5 -27q35 -57 80 -94q43 -36 105 -57q59 -22 132 -22q64 0 139 27q77 26 122 86q47 61 47 129q0 84 -81 157q-34 29 -137 71z" />
+<glyph unicode="" d="M48 1313q-37 2 -45 4l-3 88q13 1 40 1q60 0 112 -4q132 -7 166 -7q86 0 168 3q116 4 146 5q56 0 86 2l-1 -14l2 -64v-9q-60 -9 -124 -9q-60 0 -79 -25q-13 -14 -13 -132q0 -13 0.5 -32.5t0.5 -25.5l1 -229l14 -280q6 -124 51 -202q35 -59 96 -92q88 -47 177 -47 q104 0 191 28q56 18 99 51q48 36 65 64q36 56 53 114q21 73 21 229q0 79 -3.5 128t-11 122.5t-13.5 159.5l-4 59q-5 67 -24 88q-34 35 -77 34l-100 -2l-14 3l2 86h84l205 -10q76 -3 196 10l18 -2q6 -38 6 -51q0 -7 -4 -31q-45 -12 -84 -13q-73 -11 -79 -17q-15 -15 -15 -41 q0 -7 1.5 -27t1.5 -31q8 -19 22 -396q6 -195 -15 -304q-15 -76 -41 -122q-38 -65 -112 -123q-75 -57 -182 -89q-109 -33 -255 -33q-167 0 -284 46q-119 47 -179 122q-61 76 -83 195q-16 80 -16 237v333q0 188 -17 213q-25 36 -147 39zM1536 -96v64q0 14 -9 23t-23 9h-1472 q-14 0 -23 -9t-9 -23v-64q0 -14 9 -23t23 -9h1472q14 0 23 9t9 23z" />
+<glyph unicode="" horiz-adv-x="1664" d="M512 160v192q0 14 -9 23t-23 9h-320q-14 0 -23 -9t-9 -23v-192q0 -14 9 -23t23 -9h320q14 0 23 9t9 23zM512 544v192q0 14 -9 23t-23 9h-320q-14 0 -23 -9t-9 -23v-192q0 -14 9 -23t23 -9h320q14 0 23 9t9 23zM1024 160v192q0 14 -9 23t-23 9h-320q-14 0 -23 -9t-9 -23 v-192q0 -14 9 -23t23 -9h320q14 0 23 9t9 23zM512 928v192q0 14 -9 23t-23 9h-320q-14 0 -23 -9t-9 -23v-192q0 -14 9 -23t23 -9h320q14 0 23 9t9 23zM1024 544v192q0 14 -9 23t-23 9h-320q-14 0 -23 -9t-9 -23v-192q0 -14 9 -23t23 -9h320q14 0 23 9t9 23zM1536 160v192 q0 14 -9 23t-23 9h-320q-14 0 -23 -9t-9 -23v-192q0 -14 9 -23t23 -9h320q14 0 23 9t9 23zM1024 928v192q0 14 -9 23t-23 9h-320q-14 0 -23 -9t-9 -23v-192q0 -14 9 -23t23 -9h320q14 0 23 9t9 23zM1536 544v192q0 14 -9 23t-23 9h-320q-14 0 -23 -9t-9 -23v-192 q0 -14 9 -23t23 -9h320q14 0 23 9t9 23zM1536 928v192q0 14 -9 23t-23 9h-320q-14 0 -23 -9t-9 -23v-192q0 -14 9 -23t23 -9h320q14 0 23 9t9 23zM1664 1248v-1088q0 -66 -47 -113t-113 -47h-1344q-66 0 -113 47t-47 113v1088q0 66 47 113t113 47h1344q66 0 113 -47t47 -113 z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1190 955l293 293l-107 107l-293 -293zM1637 1248q0 -27 -18 -45l-1286 -1286q-18 -18 -45 -18t-45 18l-198 198q-18 18 -18 45t18 45l1286 1286q18 18 45 18t45 -18l198 -198q18 -18 18 -45zM286 1438l98 -30l-98 -30l-30 -98l-30 98l-98 30l98 30l30 98zM636 1276 l196 -60l-196 -60l-60 -196l-60 196l-196 60l196 60l60 196zM1566 798l98 -30l-98 -30l-30 -98l-30 98l-98 30l98 30l30 98zM926 1438l98 -30l-98 -30l-30 -98l-30 98l-98 30l98 30l30 98z" />
+<glyph unicode="" horiz-adv-x="1792" d="M640 128q0 52 -38 90t-90 38t-90 -38t-38 -90t38 -90t90 -38t90 38t38 90zM256 640h384v256h-158q-13 0 -22 -9l-195 -195q-9 -9 -9 -22v-30zM1536 128q0 52 -38 90t-90 38t-90 -38t-38 -90t38 -90t90 -38t90 38t38 90zM1792 1216v-1024q0 -15 -4 -26.5t-13.5 -18.5 t-16.5 -11.5t-23.5 -6t-22.5 -2t-25.5 0t-22.5 0.5q0 -106 -75 -181t-181 -75t-181 75t-75 181h-384q0 -106 -75 -181t-181 -75t-181 75t-75 181h-64q-3 0 -22.5 -0.5t-25.5 0t-22.5 2t-23.5 6t-16.5 11.5t-13.5 18.5t-4 26.5q0 26 19 45t45 19v320q0 8 -0.5 35t0 38 t2.5 34.5t6.5 37t14 30.5t22.5 30l198 198q19 19 50.5 32t58.5 13h160v192q0 26 19 45t45 19h1024q26 0 45 -19t19 -45z" />
+<glyph unicode="" d="M1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103q-111 0 -218 32q59 93 78 164q9 34 54 211q20 -39 73 -67.5t114 -28.5q121 0 216 68.5t147 188.5t52 270q0 114 -59.5 214t-172.5 163t-255 63q-105 0 -196 -29t-154.5 -77t-109 -110.5t-67 -129.5t-21.5 -134 q0 -104 40 -183t117 -111q30 -12 38 20q2 7 8 31t8 30q6 23 -11 43q-51 61 -51 151q0 151 104.5 259.5t273.5 108.5q151 0 235.5 -82t84.5 -213q0 -170 -68.5 -289t-175.5 -119q-61 0 -98 43.5t-23 104.5q8 35 26.5 93.5t30 103t11.5 75.5q0 50 -27 83t-77 33 q-62 0 -105 -57t-43 -142q0 -73 25 -122l-99 -418q-17 -70 -13 -177q-206 91 -333 281t-127 423q0 209 103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" d="M1248 1408q119 0 203.5 -84.5t84.5 -203.5v-960q0 -119 -84.5 -203.5t-203.5 -84.5h-725q85 122 108 210q9 34 53 209q21 -39 73.5 -67t112.5 -28q181 0 295.5 147.5t114.5 373.5q0 84 -35 162.5t-96.5 139t-152.5 97t-197 36.5q-104 0 -194.5 -28.5t-153 -76.5 t-107.5 -109.5t-66.5 -128t-21.5 -132.5q0 -102 39.5 -180t116.5 -110q13 -5 23.5 0t14.5 19q10 44 15 61q6 23 -11 42q-50 62 -50 150q0 150 103.5 256.5t270.5 106.5q149 0 232.5 -81t83.5 -210q0 -168 -67.5 -286t-173.5 -118q-60 0 -97 43.5t-23 103.5q8 34 26.5 92.5 t29.5 102t11 74.5q0 49 -26.5 81.5t-75.5 32.5q-61 0 -103.5 -56.5t-42.5 -139.5q0 -72 24 -121l-98 -414q-24 -100 -7 -254h-183q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960z" />
+<glyph unicode="" d="M678 -57q0 -38 -10 -71h-380q-95 0 -171.5 56.5t-103.5 147.5q24 45 69 77.5t100 49.5t107 24t107 7q32 0 49 -2q6 -4 30.5 -21t33 -23t31 -23t32 -25.5t27.5 -25.5t26.5 -29.5t21 -30.5t17.5 -34.5t9.5 -36t4.5 -40.5zM385 294q-234 -7 -385 -85v433q103 -118 273 -118 q32 0 70 5q-21 -61 -21 -86q0 -67 63 -149zM558 805q0 -100 -43.5 -160.5t-140.5 -60.5q-51 0 -97 26t-78 67.5t-56 93.5t-35.5 104t-11.5 99q0 96 51.5 165t144.5 69q66 0 119 -41t84 -104t47 -130t16 -128zM1536 896v-736q0 -119 -84.5 -203.5t-203.5 -84.5h-468 q39 73 39 157q0 66 -22 122.5t-55.5 93t-72 71t-72 59.5t-55.5 54.5t-22 59.5q0 36 23 68t56 61.5t65.5 64.5t55.5 93t23 131t-26.5 145.5t-75.5 118.5q-6 6 -14 11t-12.5 7.5t-10 9.5t-10.5 17h135l135 64h-437q-138 0 -244.5 -38.5t-182.5 -133.5q0 126 81 213t207 87h960 q119 0 203.5 -84.5t84.5 -203.5v-96h-256v256h-128v-256h-256v-128h256v-256h128v256h256z" />
+<glyph unicode="" horiz-adv-x="1664" d="M876 71q0 21 -4.5 40.5t-9.5 36t-17.5 34.5t-21 30.5t-26.5 29.5t-27.5 25.5t-32 25.5t-31 23t-33 23t-30.5 21q-17 2 -50 2q-54 0 -106 -7t-108 -25t-98 -46t-69 -75t-27 -107q0 -68 35.5 -121.5t93 -84t120.5 -45.5t127 -15q59 0 112.5 12.5t100.5 39t74.5 73.5 t27.5 110zM756 933q0 60 -16.5 127.5t-47 130.5t-84 104t-119.5 41q-93 0 -144 -69t-51 -165q0 -47 11.5 -99t35.5 -104t56 -93.5t78 -67.5t97 -26q97 0 140.5 60.5t43.5 160.5zM625 1408h437l-135 -79h-135q71 -45 110 -126t39 -169q0 -74 -23 -131.5t-56 -92.5t-66 -64.5 t-56 -61t-23 -67.5q0 -26 16.5 -51t43 -48t58.5 -48t64 -55.5t58.5 -66t43 -85t16.5 -106.5q0 -160 -140 -282q-152 -131 -420 -131q-59 0 -119.5 10t-122 33.5t-108.5 58t-77 89t-30 121.5q0 61 37 135q32 64 96 110.5t145 71t155 36t150 13.5q-64 83 -64 149q0 12 2 23.5 t5 19.5t8 21.5t7 21.5q-40 -5 -70 -5q-149 0 -255.5 98t-106.5 246q0 140 95 250.5t234 141.5q94 20 187 20zM1664 1152v-128h-256v-256h-128v256h-256v128h256v256h128v-256h256z" />
+<glyph unicode="" horiz-adv-x="1920" d="M768 384h384v96h-128v448h-114l-148 -137l77 -80q42 37 55 57h2v-288h-128v-96zM1280 640q0 -70 -21 -142t-59.5 -134t-101.5 -101t-138 -39t-138 39t-101.5 101t-59.5 134t-21 142t21 142t59.5 134t101.5 101t138 39t138 -39t101.5 -101t59.5 -134t21 -142zM1792 384 v512q-106 0 -181 75t-75 181h-1152q0 -106 -75 -181t-181 -75v-512q106 0 181 -75t75 -181h1152q0 106 75 181t181 75zM1920 1216v-1152q0 -26 -19 -45t-45 -19h-1792q-26 0 -45 19t-19 45v1152q0 26 19 45t45 19h1792q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1024" d="M1024 832q0 -26 -19 -45l-448 -448q-19 -19 -45 -19t-45 19l-448 448q-19 19 -19 45t19 45t45 19h896q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1024" d="M1024 320q0 -26 -19 -45t-45 -19h-896q-26 0 -45 19t-19 45t19 45l448 448q19 19 45 19t45 -19l448 -448q19 -19 19 -45z" />
+<glyph unicode="" horiz-adv-x="640" d="M640 1088v-896q0 -26 -19 -45t-45 -19t-45 19l-448 448q-19 19 -19 45t19 45l448 448q19 19 45 19t45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="640" d="M576 640q0 -26 -19 -45l-448 -448q-19 -19 -45 -19t-45 19t-19 45v896q0 26 19 45t45 19t45 -19l448 -448q19 -19 19 -45z" />
+<glyph unicode="" horiz-adv-x="1664" d="M160 0h608v1152h-640v-1120q0 -13 9.5 -22.5t22.5 -9.5zM1536 32v1120h-640v-1152h608q13 0 22.5 9.5t9.5 22.5zM1664 1248v-1216q0 -66 -47 -113t-113 -47h-1344q-66 0 -113 47t-47 113v1216q0 66 47 113t113 47h1344q66 0 113 -47t47 -113z" />
+<glyph unicode="" horiz-adv-x="1024" d="M1024 448q0 -26 -19 -45l-448 -448q-19 -19 -45 -19t-45 19l-448 448q-19 19 -19 45t19 45t45 19h896q26 0 45 -19t19 -45zM1024 832q0 -26 -19 -45t-45 -19h-896q-26 0 -45 19t-19 45t19 45l448 448q19 19 45 19t45 -19l448 -448q19 -19 19 -45z" />
+<glyph unicode="" horiz-adv-x="1024" d="M1024 448q0 -26 -19 -45l-448 -448q-19 -19 -45 -19t-45 19l-448 448q-19 19 -19 45t19 45t45 19h896q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1024" d="M1024 832q0 -26 -19 -45t-45 -19h-896q-26 0 -45 19t-19 45t19 45l448 448q19 19 45 19t45 -19l448 -448q19 -19 19 -45z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1792 826v-794q0 -66 -47 -113t-113 -47h-1472q-66 0 -113 47t-47 113v794q44 -49 101 -87q362 -246 497 -345q57 -42 92.5 -65.5t94.5 -48t110 -24.5h1h1q51 0 110 24.5t94.5 48t92.5 65.5q170 123 498 345q57 39 100 87zM1792 1120q0 -79 -49 -151t-122 -123 q-376 -261 -468 -325q-10 -7 -42.5 -30.5t-54 -38t-52 -32.5t-57.5 -27t-50 -9h-1h-1q-23 0 -50 9t-57.5 27t-52 32.5t-54 38t-42.5 30.5q-91 64 -262 182.5t-205 142.5q-62 42 -117 115.5t-55 136.5q0 78 41.5 130t118.5 52h1472q65 0 112.5 -47t47.5 -113z" />
+<glyph unicode="" horiz-adv-x="1379" d="M1014 961q171 0 268 -85.5t97 -254.5v-586q0 -14 -10.5 -24.5t-24.5 -10.5h-252q-14 0 -24.5 10.5t-10.5 24.5v529q0 71 -26.5 104t-95.5 33q-88 0 -123.5 -51.5t-35.5 -143.5v-471q0 -14 -10.5 -24.5t-25.5 -10.5h-246q-14 0 -24.5 10.5t-10.5 24.5v868q0 14 10.5 24.5 t24.5 10.5h239q13 0 21 -5t10.5 -18.5t3 -18t0.5 -22.5q93 87 246 87zM290 938q14 0 24.5 -10.5t10.5 -24.5v-868q0 -14 -10.5 -24.5t-24.5 -10.5h-246q-14 0 -24.5 10.5t-10.5 24.5v868q0 14 10.5 24.5t24.5 10.5h246zM167 1371q69 0 118 -49t49 -118t-49 -118t-118 -49 t-118 49t-49 118t49 118t118 49z" />
+<glyph unicode="" d="M1536 640q0 -156 -61 -298t-164 -245t-245 -164t-298 -61q-179 0 -336.5 76t-266 213t-147.5 312q-3 14 7 27q9 12 25 12h199q23 0 30 -23q50 -162 185 -261.5t304 -99.5q104 0 198.5 40.5t163.5 109.5t109.5 163.5t40.5 198.5t-40.5 198.5t-109.5 163.5t-163.5 109.5 t-198.5 40.5q-98 0 -188 -35.5t-160 -101.5l137 -138q31 -30 14 -69q-17 -40 -59 -40h-448q-26 0 -45 19t-19 45v448q0 42 40 59q39 17 69 -14l130 -129q107 101 244.5 156.5t284.5 55.5q156 0 298 -61t245 -164t164 -245t61 -298z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1771 0q0 -53 -37 -90l-107 -108q-39 -37 -91 -37q-53 0 -90 37l-363 364q-38 36 -38 90q0 53 43 96l-256 256l-126 -126q-14 -14 -34 -14t-34 14q2 -2 12.5 -12t12.5 -13t10 -11.5t10 -13.5t6 -13.5t5.5 -16.5t1.5 -18q0 -38 -28 -68q-3 -3 -16.5 -18t-19 -20.5 t-18.5 -16.5t-22 -15.5t-22 -9t-26 -4.5q-40 0 -68 28l-408 408q-28 28 -28 68q0 13 4.5 26t9 22t15.5 22t16.5 18.5t20.5 19t18 16.5q30 28 68 28q10 0 18 -1.5t16.5 -5.5t13.5 -6t13.5 -10t11.5 -10t13 -12.5t12 -12.5q-14 14 -14 34t14 34l348 348q14 14 34 14t34 -14 q-2 2 -12.5 12t-12.5 13t-10 11.5t-10 13.5t-6 13.5t-5.5 16.5t-1.5 18q0 38 28 68q3 3 16.5 18t19 20.5t18.5 16.5t22 15.5t22 9t26 4.5q40 0 68 -28l408 -408q28 -28 28 -68q0 -13 -4.5 -26t-9 -22t-15.5 -22t-16.5 -18.5t-20.5 -19t-18 -16.5q-30 -28 -68 -28 q-10 0 -18 1.5t-16.5 5.5t-13.5 6t-13.5 10t-11.5 10t-13 12.5t-12 12.5q14 -14 14 -34t-14 -34l-126 -126l256 -256q43 43 96 43q52 0 91 -37l363 -363q37 -39 37 -91z" />
+<glyph unicode="" horiz-adv-x="1792" d="M384 384q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM576 832q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM1004 351l101 382q6 26 -7.5 48.5t-38.5 29.5 t-48 -6.5t-30 -39.5l-101 -382q-60 -5 -107 -43.5t-63 -98.5q-20 -77 20 -146t117 -89t146 20t89 117q16 60 -6 117t-72 91zM1664 384q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM1024 1024q0 53 -37.5 90.5 t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM1472 832q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM1792 384q0 -261 -141 -483q-19 -29 -54 -29h-1402q-35 0 -54 29 q-141 221 -141 483q0 182 71 348t191 286t286 191t348 71t348 -71t286 -191t191 -286t71 -348z" />
+<glyph unicode="" horiz-adv-x="1792" d="M896 1152q-204 0 -381.5 -69.5t-282 -187.5t-104.5 -255q0 -112 71.5 -213.5t201.5 -175.5l87 -50l-27 -96q-24 -91 -70 -172q152 63 275 171l43 38l57 -6q69 -8 130 -8q204 0 381.5 69.5t282 187.5t104.5 255t-104.5 255t-282 187.5t-381.5 69.5zM1792 640 q0 -174 -120 -321.5t-326 -233t-450 -85.5q-70 0 -145 8q-198 -175 -460 -242q-49 -14 -114 -22h-5q-15 0 -27 10.5t-16 27.5v1q-3 4 -0.5 12t2 10t4.5 9.5l6 9t7 8.5t8 9q7 8 31 34.5t34.5 38t31 39.5t32.5 51t27 59t26 76q-157 89 -247.5 220t-90.5 281q0 174 120 321.5 t326 233t450 85.5t450 -85.5t326 -233t120 -321.5z" />
+<glyph unicode="" horiz-adv-x="1792" d="M704 1152q-153 0 -286 -52t-211.5 -141t-78.5 -191q0 -82 53 -158t149 -132l97 -56l-35 -84q34 20 62 39l44 31l53 -10q78 -14 153 -14q153 0 286 52t211.5 141t78.5 191t-78.5 191t-211.5 141t-286 52zM704 1280q191 0 353.5 -68.5t256.5 -186.5t94 -257t-94 -257 t-256.5 -186.5t-353.5 -68.5q-86 0 -176 16q-124 -88 -278 -128q-36 -9 -86 -16h-3q-11 0 -20.5 8t-11.5 21q-1 3 -1 6.5t0.5 6.5t2 6l2.5 5t3.5 5.5t4 5t4.5 5t4 4.5q5 6 23 25t26 29.5t22.5 29t25 38.5t20.5 44q-124 72 -195 177t-71 224q0 139 94 257t256.5 186.5 t353.5 68.5zM1526 111q10 -24 20.5 -44t25 -38.5t22.5 -29t26 -29.5t23 -25q1 -1 4 -4.5t4.5 -5t4 -5t3.5 -5.5l2.5 -5t2 -6t0.5 -6.5t-1 -6.5q-3 -14 -13 -22t-22 -7q-50 7 -86 16q-154 40 -278 128q-90 -16 -176 -16q-271 0 -472 132q58 -4 88 -4q161 0 309 45t264 129 q125 92 192 212t67 254q0 77 -23 152q129 -71 204 -178t75 -230q0 -120 -71 -224.5t-195 -176.5z" />
+<glyph unicode="" horiz-adv-x="896" d="M885 970q18 -20 7 -44l-540 -1157q-13 -25 -42 -25q-4 0 -14 2q-17 5 -25.5 19t-4.5 30l197 808l-406 -101q-4 -1 -12 -1q-18 0 -31 11q-18 15 -13 39l201 825q4 14 16 23t28 9h328q19 0 32 -12.5t13 -29.5q0 -8 -5 -18l-171 -463l396 98q8 2 12 2q19 0 34 -15z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1792 288v-320q0 -40 -28 -68t-68 -28h-320q-40 0 -68 28t-28 68v320q0 40 28 68t68 28h96v192h-512v-192h96q40 0 68 -28t28 -68v-320q0 -40 -28 -68t-68 -28h-320q-40 0 -68 28t-28 68v320q0 40 28 68t68 28h96v192h-512v-192h96q40 0 68 -28t28 -68v-320 q0 -40 -28 -68t-68 -28h-320q-40 0 -68 28t-28 68v320q0 40 28 68t68 28h96v192q0 52 38 90t90 38h512v192h-96q-40 0 -68 28t-28 68v320q0 40 28 68t68 28h320q40 0 68 -28t28 -68v-320q0 -40 -28 -68t-68 -28h-96v-192h512q52 0 90 -38t38 -90v-192h96q40 0 68 -28t28 -68 z" />
+<glyph unicode="" horiz-adv-x="1664" d="M896 708v-580q0 -104 -76 -180t-180 -76t-180 76t-76 180q0 26 19 45t45 19t45 -19t19 -45q0 -50 39 -89t89 -39t89 39t39 89v580q33 11 64 11t64 -11zM1664 681q0 -13 -9.5 -22.5t-22.5 -9.5q-11 0 -23 10q-49 46 -93 69t-102 23q-68 0 -128 -37t-103 -97 q-7 -10 -17.5 -28t-14.5 -24q-11 -17 -28 -17q-18 0 -29 17q-4 6 -14.5 24t-17.5 28q-43 60 -102.5 97t-127.5 37t-127.5 -37t-102.5 -97q-7 -10 -17.5 -28t-14.5 -24q-11 -17 -29 -17q-17 0 -28 17q-4 6 -14.5 24t-17.5 28q-43 60 -103 97t-128 37q-58 0 -102 -23t-93 -69 q-12 -10 -23 -10q-13 0 -22.5 9.5t-9.5 22.5q0 5 1 7q45 183 172.5 319.5t298 204.5t360.5 68q140 0 274.5 -40t246.5 -113.5t194.5 -187t115.5 -251.5q1 -2 1 -7zM896 1408v-98q-42 2 -64 2t-64 -2v98q0 26 19 45t45 19t45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1792" d="M768 -128h896v640h-416q-40 0 -68 28t-28 68v416h-384v-1152zM1024 1312v64q0 13 -9.5 22.5t-22.5 9.5h-704q-13 0 -22.5 -9.5t-9.5 -22.5v-64q0 -13 9.5 -22.5t22.5 -9.5h704q13 0 22.5 9.5t9.5 22.5zM1280 640h299l-299 299v-299zM1792 512v-672q0 -40 -28 -68t-68 -28 h-960q-40 0 -68 28t-28 68v160h-544q-40 0 -68 28t-28 68v1344q0 40 28 68t68 28h1088q40 0 68 -28t28 -68v-328q21 -13 36 -28l408 -408q28 -28 48 -76t20 -88z" />
+<glyph unicode="" horiz-adv-x="1024" d="M736 960q0 -13 -9.5 -22.5t-22.5 -9.5t-22.5 9.5t-9.5 22.5q0 46 -54 71t-106 25q-13 0 -22.5 9.5t-9.5 22.5t9.5 22.5t22.5 9.5q50 0 99.5 -16t87 -54t37.5 -90zM896 960q0 72 -34.5 134t-90 101.5t-123 62t-136.5 22.5t-136.5 -22.5t-123 -62t-90 -101.5t-34.5 -134 q0 -101 68 -180q10 -11 30.5 -33t30.5 -33q128 -153 141 -298h228q13 145 141 298q10 11 30.5 33t30.5 33q68 79 68 180zM1024 960q0 -155 -103 -268q-45 -49 -74.5 -87t-59.5 -95.5t-34 -107.5q47 -28 47 -82q0 -37 -25 -64q25 -27 25 -64q0 -52 -45 -81q13 -23 13 -47 q0 -46 -31.5 -71t-77.5 -25q-20 -44 -60 -70t-87 -26t-87 26t-60 70q-46 0 -77.5 25t-31.5 71q0 24 13 47q-45 29 -45 81q0 37 25 64q-25 27 -25 64q0 54 47 82q-4 50 -34 107.5t-59.5 95.5t-74.5 87q-103 113 -103 268q0 99 44.5 184.5t117 142t164 89t186.5 32.5 t186.5 -32.5t164 -89t117 -142t44.5 -184.5z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1792 352v-192q0 -13 -9.5 -22.5t-22.5 -9.5h-1376v-192q0 -13 -9.5 -22.5t-22.5 -9.5q-12 0 -24 10l-319 320q-9 9 -9 22q0 14 9 23l320 320q9 9 23 9q13 0 22.5 -9.5t9.5 -22.5v-192h1376q13 0 22.5 -9.5t9.5 -22.5zM1792 896q0 -14 -9 -23l-320 -320q-9 -9 -23 -9 q-13 0 -22.5 9.5t-9.5 22.5v192h-1376q-13 0 -22.5 9.5t-9.5 22.5v192q0 13 9.5 22.5t22.5 9.5h1376v192q0 14 9 23t23 9q12 0 24 -10l319 -319q9 -9 9 -23z" />
+<glyph unicode="" horiz-adv-x="1920" d="M1280 608q0 14 -9 23t-23 9h-224v352q0 13 -9.5 22.5t-22.5 9.5h-192q-13 0 -22.5 -9.5t-9.5 -22.5v-352h-224q-13 0 -22.5 -9.5t-9.5 -22.5q0 -14 9 -23l352 -352q9 -9 23 -9t23 9l351 351q10 12 10 24zM1920 384q0 -159 -112.5 -271.5t-271.5 -112.5h-1088 q-185 0 -316.5 131.5t-131.5 316.5q0 130 70 240t188 165q-2 30 -2 43q0 212 150 362t362 150q156 0 285.5 -87t188.5 -231q71 62 166 62q106 0 181 -75t75 -181q0 -76 -41 -138q130 -31 213.5 -135.5t83.5 -238.5z" />
+<glyph unicode="" horiz-adv-x="1920" d="M1280 672q0 14 -9 23l-352 352q-9 9 -23 9t-23 -9l-351 -351q-10 -12 -10 -24q0 -14 9 -23t23 -9h224v-352q0 -13 9.5 -22.5t22.5 -9.5h192q13 0 22.5 9.5t9.5 22.5v352h224q13 0 22.5 9.5t9.5 22.5zM1920 384q0 -159 -112.5 -271.5t-271.5 -112.5h-1088 q-185 0 -316.5 131.5t-131.5 316.5q0 130 70 240t188 165q-2 30 -2 43q0 212 150 362t362 150q156 0 285.5 -87t188.5 -231q71 62 166 62q106 0 181 -75t75 -181q0 -76 -41 -138q130 -31 213.5 -135.5t83.5 -238.5z" />
+<glyph unicode="" horiz-adv-x="1408" d="M384 192q0 -26 -19 -45t-45 -19t-45 19t-19 45t19 45t45 19t45 -19t19 -45zM1408 131q0 -121 -73 -190t-194 -69h-874q-121 0 -194 69t-73 190q0 68 5.5 131t24 138t47.5 132.5t81 103t120 60.5q-22 -52 -22 -120v-203q-58 -20 -93 -70t-35 -111q0 -80 56 -136t136 -56 t136 56t56 136q0 61 -35.5 111t-92.5 70v203q0 62 25 93q132 -104 295 -104t295 104q25 -31 25 -93v-64q-106 0 -181 -75t-75 -181v-89q-32 -29 -32 -71q0 -40 28 -68t68 -28t68 28t28 68q0 42 -32 71v89q0 52 38 90t90 38t90 -38t38 -90v-89q-32 -29 -32 -71q0 -40 28 -68 t68 -28t68 28t28 68q0 42 -32 71v89q0 68 -34.5 127.5t-93.5 93.5q0 10 0.5 42.5t0 48t-2.5 41.5t-7 47t-13 40q68 -15 120 -60.5t81 -103t47.5 -132.5t24 -138t5.5 -131zM1088 1024q0 -159 -112.5 -271.5t-271.5 -112.5t-271.5 112.5t-112.5 271.5t112.5 271.5t271.5 112.5 t271.5 -112.5t112.5 -271.5z" />
+<glyph unicode="" horiz-adv-x="1408" d="M1280 832q0 26 -19 45t-45 19t-45 -19t-19 -45t19 -45t45 -19t45 19t19 45zM1408 832q0 -62 -35.5 -111t-92.5 -70v-395q0 -159 -131.5 -271.5t-316.5 -112.5t-316.5 112.5t-131.5 271.5v132q-164 20 -274 128t-110 252v512q0 26 19 45t45 19q6 0 16 -2q17 30 47 48 t65 18q53 0 90.5 -37.5t37.5 -90.5t-37.5 -90.5t-90.5 -37.5q-33 0 -64 18v-402q0 -106 94 -181t226 -75t226 75t94 181v402q-31 -18 -64 -18q-53 0 -90.5 37.5t-37.5 90.5t37.5 90.5t90.5 37.5q35 0 65 -18t47 -48q10 2 16 2q26 0 45 -19t19 -45v-512q0 -144 -110 -252 t-274 -128v-132q0 -106 94 -181t226 -75t226 75t94 181v395q-57 21 -92.5 70t-35.5 111q0 80 56 136t136 56t136 -56t56 -136z" />
+<glyph unicode="" horiz-adv-x="1792" d="M640 1152h512v128h-512v-128zM288 1152v-1280h-64q-92 0 -158 66t-66 158v832q0 92 66 158t158 66h64zM1408 1152v-1280h-1024v1280h128v160q0 40 28 68t68 28h576q40 0 68 -28t28 -68v-160h128zM1792 928v-832q0 -92 -66 -158t-158 -66h-64v1280h64q92 0 158 -66 t66 -158z" />
+<glyph unicode="" horiz-adv-x="1664" d="M848 -160q0 16 -16 16q-59 0 -101.5 42.5t-42.5 101.5q0 16 -16 16t-16 -16q0 -73 51.5 -124.5t124.5 -51.5q16 0 16 16zM1664 128q0 -52 -38 -90t-90 -38h-448q0 -106 -75 -181t-181 -75t-181 75t-75 181h-448q-52 0 -90 38t-38 90q190 161 287 397.5t97 498.5 q0 165 96 262t264 117q-8 18 -8 37q0 40 28 68t68 28t68 -28t28 -68q0 -19 -8 -37q168 -20 264 -117t96 -262q0 -262 97 -498.5t287 -397.5z" />
+<glyph unicode="" horiz-adv-x="1920" d="M1664 896q0 80 -56 136t-136 56h-64v-384h64q80 0 136 56t56 136zM0 128h1792q0 -106 -75 -181t-181 -75h-1280q-106 0 -181 75t-75 181zM1856 896q0 -159 -112.5 -271.5t-271.5 -112.5h-64v-32q0 -92 -66 -158t-158 -66h-704q-92 0 -158 66t-66 158v736q0 26 19 45 t45 19h1152q159 0 271.5 -112.5t112.5 -271.5z" />
+<glyph unicode="" horiz-adv-x="1408" d="M640 1472v-640q0 -61 -35.5 -111t-92.5 -70v-779q0 -52 -38 -90t-90 -38h-128q-52 0 -90 38t-38 90v779q-57 20 -92.5 70t-35.5 111v640q0 26 19 45t45 19t45 -19t19 -45v-416q0 -26 19 -45t45 -19t45 19t19 45v416q0 26 19 45t45 19t45 -19t19 -45v-416q0 -26 19 -45 t45 -19t45 19t19 45v416q0 26 19 45t45 19t45 -19t19 -45zM1408 1472v-1600q0 -52 -38 -90t-90 -38h-128q-52 0 -90 38t-38 90v512h-224q-13 0 -22.5 9.5t-9.5 22.5v800q0 132 94 226t226 94h256q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1280" d="M1024 352v-64q0 -14 -9 -23t-23 -9h-704q-14 0 -23 9t-9 23v64q0 14 9 23t23 9h704q14 0 23 -9t9 -23zM1024 608v-64q0 -14 -9 -23t-23 -9h-704q-14 0 -23 9t-9 23v64q0 14 9 23t23 9h704q14 0 23 -9t9 -23zM128 0h1024v768h-416q-40 0 -68 28t-28 68v416h-512v-1280z M768 896h299l-299 299v-299zM1280 768v-800q0 -40 -28 -68t-68 -28h-1088q-40 0 -68 28t-28 68v1344q0 40 28 68t68 28h544q40 0 88 -20t76 -48l408 -408q28 -28 48 -76t20 -88z" />
+<glyph unicode="" horiz-adv-x="1408" d="M384 224v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM384 480v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M640 480v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM384 736v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M1152 224v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM896 480v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M640 736v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM384 992v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M1152 480v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM896 736v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M640 992v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM384 1248v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M1152 736v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM896 992v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M640 1248v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM1152 992v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M896 1248v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM1152 1248v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M896 -128h384v1536h-1152v-1536h384v224q0 13 9.5 22.5t22.5 9.5h320q13 0 22.5 -9.5t9.5 -22.5v-224zM1408 1472v-1664q0 -26 -19 -45t-45 -19h-1280q-26 0 -45 19t-19 45v1664q0 26 19 45t45 19h1280q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1408" d="M384 224v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM384 480v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M640 480v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM384 736v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M1152 224v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM896 480v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M640 736v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM1152 480v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M896 736v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5zM1152 736v-64q0 -13 -9.5 -22.5t-22.5 -9.5h-64q-13 0 -22.5 9.5t-9.5 22.5v64q0 13 9.5 22.5t22.5 9.5h64q13 0 22.5 -9.5t9.5 -22.5z M896 -128h384v1152h-256v-32q0 -40 -28 -68t-68 -28h-448q-40 0 -68 28t-28 68v32h-256v-1152h384v224q0 13 9.5 22.5t22.5 9.5h320q13 0 22.5 -9.5t9.5 -22.5v-224zM896 1056v320q0 13 -9.5 22.5t-22.5 9.5h-64q-13 0 -22.5 -9.5t-9.5 -22.5v-96h-128v96q0 13 -9.5 22.5 t-22.5 9.5h-64q-13 0 -22.5 -9.5t-9.5 -22.5v-320q0 -13 9.5 -22.5t22.5 -9.5h64q13 0 22.5 9.5t9.5 22.5v96h128v-96q0 -13 9.5 -22.5t22.5 -9.5h64q13 0 22.5 9.5t9.5 22.5zM1408 1088v-1280q0 -26 -19 -45t-45 -19h-1280q-26 0 -45 19t-19 45v1280q0 26 19 45t45 19h320 v288q0 40 28 68t68 28h448q40 0 68 -28t28 -68v-288h320q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1920" d="M640 128q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM256 640h384v256h-158q-14 -2 -22 -9l-195 -195q-7 -12 -9 -22v-30zM1536 128q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5 t90.5 37.5t37.5 90.5zM1664 800v192q0 14 -9 23t-23 9h-224v224q0 14 -9 23t-23 9h-192q-14 0 -23 -9t-9 -23v-224h-224q-14 0 -23 -9t-9 -23v-192q0 -14 9 -23t23 -9h224v-224q0 -14 9 -23t23 -9h192q14 0 23 9t9 23v224h224q14 0 23 9t9 23zM1920 1344v-1152 q0 -26 -19 -45t-45 -19h-192q0 -106 -75 -181t-181 -75t-181 75t-75 181h-384q0 -106 -75 -181t-181 -75t-181 75t-75 181h-128q-26 0 -45 19t-19 45t19 45t45 19v416q0 26 13 58t32 51l198 198q19 19 51 32t58 13h160v320q0 26 19 45t45 19h1152q26 0 45 -19t19 -45z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1280 416v192q0 14 -9 23t-23 9h-224v224q0 14 -9 23t-23 9h-192q-14 0 -23 -9t-9 -23v-224h-224q-14 0 -23 -9t-9 -23v-192q0 -14 9 -23t23 -9h224v-224q0 -14 9 -23t23 -9h192q14 0 23 9t9 23v224h224q14 0 23 9t9 23zM640 1152h512v128h-512v-128zM256 1152v-1280h-32 q-92 0 -158 66t-66 158v832q0 92 66 158t158 66h32zM1440 1152v-1280h-1088v1280h160v160q0 40 28 68t68 28h576q40 0 68 -28t28 -68v-160h160zM1792 928v-832q0 -92 -66 -158t-158 -66h-32v1280h32q92 0 158 -66t66 -158z" />
+<glyph unicode="" horiz-adv-x="1920" d="M1632 800q261 -58 287 -93l1 -3q-1 -32 -288 -96l-352 -32l-224 -64h-64l-293 -352h69q26 0 45 -4.5t19 -11.5t-19 -11.5t-45 -4.5h-96h-160h-64v32h64v416h-160l-192 -224h-96l-32 32v192h32v32h128v8l-192 24v128l192 24v8h-128v32h-32v192l32 32h96l192 -224h160v416 h-64v32h64h160h96q26 0 45 -4.5t19 -11.5t-19 -11.5t-45 -4.5h-69l293 -352h64l224 -64z" />
+<glyph unicode="" horiz-adv-x="1664" d="M640 640v384h-256v-160q0 -45 2 -76t7.5 -56.5t14.5 -40t23 -26.5t33.5 -15.5t45 -7.5t58 -2.5t72.5 0.5zM1664 192v-192h-1152v192l128 192h-97q-211 0 -313 102.5t-102 314.5v287l-64 64l32 128h480l32 128h960l32 -192l-64 -32v-800z" />
+<glyph unicode="" d="M1280 192v896q0 26 -19 45t-45 19h-128q-26 0 -45 -19t-19 -45v-320h-512v320q0 26 -19 45t-45 19h-128q-26 0 -45 -19t-19 -45v-896q0 -26 19 -45t45 -19h128q26 0 45 19t19 45v320h512v-320q0 -26 19 -45t45 -19h128q26 0 45 19t19 45zM1536 1120v-960 q0 -119 -84.5 -203.5t-203.5 -84.5h-960q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960q119 0 203.5 -84.5t84.5 -203.5z" />
+<glyph unicode="" d="M1280 576v128q0 26 -19 45t-45 19h-320v320q0 26 -19 45t-45 19h-128q-26 0 -45 -19t-19 -45v-320h-320q-26 0 -45 -19t-19 -45v-128q0 -26 19 -45t45 -19h320v-320q0 -26 19 -45t45 -19h128q26 0 45 19t19 45v320h320q26 0 45 19t19 45zM1536 1120v-960 q0 -119 -84.5 -203.5t-203.5 -84.5h-960q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960q119 0 203.5 -84.5t84.5 -203.5z" />
+<glyph unicode="" horiz-adv-x="1024" d="M627 160q0 -13 -10 -23l-50 -50q-10 -10 -23 -10t-23 10l-466 466q-10 10 -10 23t10 23l466 466q10 10 23 10t23 -10l50 -50q10 -10 10 -23t-10 -23l-393 -393l393 -393q10 -10 10 -23zM1011 160q0 -13 -10 -23l-50 -50q-10 -10 -23 -10t-23 10l-466 466q-10 10 -10 23 t10 23l466 466q10 10 23 10t23 -10l50 -50q10 -10 10 -23t-10 -23l-393 -393l393 -393q10 -10 10 -23z" />
+<glyph unicode="" horiz-adv-x="1024" d="M595 576q0 -13 -10 -23l-466 -466q-10 -10 -23 -10t-23 10l-50 50q-10 10 -10 23t10 23l393 393l-393 393q-10 10 -10 23t10 23l50 50q10 10 23 10t23 -10l466 -466q10 -10 10 -23zM979 576q0 -13 -10 -23l-466 -466q-10 -10 -23 -10t-23 10l-50 50q-10 10 -10 23t10 23 l393 393l-393 393q-10 10 -10 23t10 23l50 50q10 10 23 10t23 -10l466 -466q10 -10 10 -23z" />
+<glyph unicode="" horiz-adv-x="1152" d="M1075 224q0 -13 -10 -23l-50 -50q-10 -10 -23 -10t-23 10l-393 393l-393 -393q-10 -10 -23 -10t-23 10l-50 50q-10 10 -10 23t10 23l466 466q10 10 23 10t23 -10l466 -466q10 -10 10 -23zM1075 608q0 -13 -10 -23l-50 -50q-10 -10 -23 -10t-23 10l-393 393l-393 -393 q-10 -10 -23 -10t-23 10l-50 50q-10 10 -10 23t10 23l466 466q10 10 23 10t23 -10l466 -466q10 -10 10 -23z" />
+<glyph unicode="" horiz-adv-x="1152" d="M1075 672q0 -13 -10 -23l-466 -466q-10 -10 -23 -10t-23 10l-466 466q-10 10 -10 23t10 23l50 50q10 10 23 10t23 -10l393 -393l393 393q10 10 23 10t23 -10l50 -50q10 -10 10 -23zM1075 1056q0 -13 -10 -23l-466 -466q-10 -10 -23 -10t-23 10l-466 466q-10 10 -10 23 t10 23l50 50q10 10 23 10t23 -10l393 -393l393 393q10 10 23 10t23 -10l50 -50q10 -10 10 -23z" />
+<glyph unicode="" horiz-adv-x="640" d="M627 992q0 -13 -10 -23l-393 -393l393 -393q10 -10 10 -23t-10 -23l-50 -50q-10 -10 -23 -10t-23 10l-466 466q-10 10 -10 23t10 23l466 466q10 10 23 10t23 -10l50 -50q10 -10 10 -23z" />
+<glyph unicode="" horiz-adv-x="640" d="M595 576q0 -13 -10 -23l-466 -466q-10 -10 -23 -10t-23 10l-50 50q-10 10 -10 23t10 23l393 393l-393 393q-10 10 -10 23t10 23l50 50q10 10 23 10t23 -10l466 -466q10 -10 10 -23z" />
+<glyph unicode="" horiz-adv-x="1152" d="M1075 352q0 -13 -10 -23l-50 -50q-10 -10 -23 -10t-23 10l-393 393l-393 -393q-10 -10 -23 -10t-23 10l-50 50q-10 10 -10 23t10 23l466 466q10 10 23 10t23 -10l466 -466q10 -10 10 -23z" />
+<glyph unicode="" horiz-adv-x="1152" d="M1075 800q0 -13 -10 -23l-466 -466q-10 -10 -23 -10t-23 10l-466 466q-10 10 -10 23t10 23l50 50q10 10 23 10t23 -10l393 -393l393 393q10 10 23 10t23 -10l50 -50q10 -10 10 -23z" />
+<glyph unicode="" horiz-adv-x="1920" d="M1792 544v832q0 13 -9.5 22.5t-22.5 9.5h-1600q-13 0 -22.5 -9.5t-9.5 -22.5v-832q0 -13 9.5 -22.5t22.5 -9.5h1600q13 0 22.5 9.5t9.5 22.5zM1920 1376v-1088q0 -66 -47 -113t-113 -47h-544q0 -37 16 -77.5t32 -71t16 -43.5q0 -26 -19 -45t-45 -19h-512q-26 0 -45 19 t-19 45q0 14 16 44t32 70t16 78h-544q-66 0 -113 47t-47 113v1088q0 66 47 113t113 47h1600q66 0 113 -47t47 -113z" />
+<glyph unicode="" horiz-adv-x="1920" d="M416 256q-66 0 -113 47t-47 113v704q0 66 47 113t113 47h1088q66 0 113 -47t47 -113v-704q0 -66 -47 -113t-113 -47h-1088zM384 1120v-704q0 -13 9.5 -22.5t22.5 -9.5h1088q13 0 22.5 9.5t9.5 22.5v704q0 13 -9.5 22.5t-22.5 9.5h-1088q-13 0 -22.5 -9.5t-9.5 -22.5z M1760 192h160v-96q0 -40 -47 -68t-113 -28h-1600q-66 0 -113 28t-47 68v96h160h1600zM1040 96q16 0 16 16t-16 16h-160q-16 0 -16 -16t16 -16h160z" />
+<glyph unicode="" horiz-adv-x="1152" d="M640 128q0 26 -19 45t-45 19t-45 -19t-19 -45t19 -45t45 -19t45 19t19 45zM1024 288v960q0 13 -9.5 22.5t-22.5 9.5h-832q-13 0 -22.5 -9.5t-9.5 -22.5v-960q0 -13 9.5 -22.5t22.5 -9.5h832q13 0 22.5 9.5t9.5 22.5zM1152 1248v-1088q0 -66 -47 -113t-113 -47h-832 q-66 0 -113 47t-47 113v1088q0 66 47 113t113 47h832q66 0 113 -47t47 -113z" />
+<glyph unicode="" horiz-adv-x="768" d="M464 128q0 33 -23.5 56.5t-56.5 23.5t-56.5 -23.5t-23.5 -56.5t23.5 -56.5t56.5 -23.5t56.5 23.5t23.5 56.5zM672 288v704q0 13 -9.5 22.5t-22.5 9.5h-512q-13 0 -22.5 -9.5t-9.5 -22.5v-704q0 -13 9.5 -22.5t22.5 -9.5h512q13 0 22.5 9.5t9.5 22.5zM480 1136 q0 16 -16 16h-160q-16 0 -16 -16t16 -16h160q16 0 16 16zM768 1152v-1024q0 -52 -38 -90t-90 -38h-512q-52 0 -90 38t-38 90v1024q0 52 38 90t90 38h512q52 0 90 -38t38 -90z" />
+<glyph unicode="" d="M1280 640q0 104 -40.5 198.5t-109.5 163.5t-163.5 109.5t-198.5 40.5t-198.5 -40.5t-163.5 -109.5t-109.5 -163.5t-40.5 -198.5t40.5 -198.5t109.5 -163.5t163.5 -109.5t198.5 -40.5t198.5 40.5t163.5 109.5t109.5 163.5t40.5 198.5zM1536 640q0 -209 -103 -385.5 t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" horiz-adv-x="1664" d="M768 576v-384q0 -80 -56 -136t-136 -56h-384q-80 0 -136 56t-56 136v704q0 104 40.5 198.5t109.5 163.5t163.5 109.5t198.5 40.5h64q26 0 45 -19t19 -45v-128q0 -26 -19 -45t-45 -19h-64q-106 0 -181 -75t-75 -181v-32q0 -40 28 -68t68 -28h224q80 0 136 -56t56 -136z M1664 576v-384q0 -80 -56 -136t-136 -56h-384q-80 0 -136 56t-56 136v704q0 104 40.5 198.5t109.5 163.5t163.5 109.5t198.5 40.5h64q26 0 45 -19t19 -45v-128q0 -26 -19 -45t-45 -19h-64q-106 0 -181 -75t-75 -181v-32q0 -40 28 -68t68 -28h224q80 0 136 -56t56 -136z" />
+<glyph unicode="" horiz-adv-x="1664" d="M768 1216v-704q0 -104 -40.5 -198.5t-109.5 -163.5t-163.5 -109.5t-198.5 -40.5h-64q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h64q106 0 181 75t75 181v32q0 40 -28 68t-68 28h-224q-80 0 -136 56t-56 136v384q0 80 56 136t136 56h384q80 0 136 -56t56 -136zM1664 1216 v-704q0 -104 -40.5 -198.5t-109.5 -163.5t-163.5 -109.5t-198.5 -40.5h-64q-26 0 -45 19t-19 45v128q0 26 19 45t45 19h64q106 0 181 75t75 181v32q0 40 -28 68t-68 28h-224q-80 0 -136 56t-56 136v384q0 80 56 136t136 56h384q80 0 136 -56t56 -136z" />
+<glyph unicode="" horiz-adv-x="1568" d="M496 192q0 -60 -42.5 -102t-101.5 -42q-60 0 -102 42t-42 102t42 102t102 42q59 0 101.5 -42t42.5 -102zM928 0q0 -53 -37.5 -90.5t-90.5 -37.5t-90.5 37.5t-37.5 90.5t37.5 90.5t90.5 37.5t90.5 -37.5t37.5 -90.5zM320 640q0 -66 -47 -113t-113 -47t-113 47t-47 113 t47 113t113 47t113 -47t47 -113zM1360 192q0 -46 -33 -79t-79 -33t-79 33t-33 79t33 79t79 33t79 -33t33 -79zM528 1088q0 -73 -51.5 -124.5t-124.5 -51.5t-124.5 51.5t-51.5 124.5t51.5 124.5t124.5 51.5t124.5 -51.5t51.5 -124.5zM992 1280q0 -80 -56 -136t-136 -56 t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM1536 640q0 -40 -28 -68t-68 -28t-68 28t-28 68t28 68t68 28t68 -28t28 -68zM1328 1088q0 -33 -23.5 -56.5t-56.5 -23.5t-56.5 23.5t-23.5 56.5t23.5 56.5t56.5 23.5t56.5 -23.5t23.5 -56.5z" />
+<glyph unicode="" d="M1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
+<glyph unicode="" horiz-adv-x="1792" d="M1792 416q0 -166 -127 -451q-3 -7 -10.5 -24t-13.5 -30t-13 -22q-12 -17 -28 -17q-15 0 -23.5 10t-8.5 25q0 9 2.5 26.5t2.5 23.5q5 68 5 123q0 101 -17.5 181t-48.5 138.5t-80 101t-105.5 69.5t-133 42.5t-154 21.5t-175.5 6h-224v-256q0 -26 -19 -45t-45 -19t-45 19 l-512 512q-19 19 -19 45t19 45l512 512q19 19 45 19t45 -19t19 -45v-256h224q713 0 875 -403q53 -134 53 -333z" />
+<glyph unicode="" horiz-adv-x="1664" d="M640 320q0 -40 -12.5 -82t-43 -76t-72.5 -34t-72.5 34t-43 76t-12.5 82t12.5 82t43 76t72.5 34t72.5 -34t43 -76t12.5 -82zM1280 320q0 -40 -12.5 -82t-43 -76t-72.5 -34t-72.5 34t-43 76t-12.5 82t12.5 82t43 76t72.5 34t72.5 -34t43 -76t12.5 -82zM1440 320 q0 120 -69 204t-187 84q-41 0 -195 -21q-71 -11 -157 -11t-157 11q-152 21 -195 21q-118 0 -187 -84t-69 -204q0 -88 32 -153.5t81 -103t122 -60t140 -29.5t149 -7h168q82 0 149 7t140 29.5t122 60t81 103t32 153.5zM1664 496q0 -207 -61 -331q-38 -77 -105.5 -133t-141 -86 t-170 -47.5t-171.5 -22t-167 -4.5q-78 0 -142 3t-147.5 12.5t-152.5 30t-137 51.5t-121 81t-86 115q-62 123 -62 331q0 237 136 396q-27 82 -27 170q0 116 51 218q108 0 190 -39.5t189 -123.5q147 35 309 35q148 0 280 -32q105 82 187 121t189 39q51 -102 51 -218 q0 -87 -27 -168q136 -160 136 -398z" />
+<glyph unicode="" horiz-adv-x="1664" d="M1536 224v704q0 40 -28 68t-68 28h-704q-40 0 -68 28t-28 68v64q0 40 -28 68t-68 28h-320q-40 0 -68 -28t-28 -68v-960q0 -40 28 -68t68 -28h1216q40 0 68 28t28 68zM1664 928v-704q0 -92 -66 -158t-158 -66h-1216q-92 0 -158 66t-66 158v960q0 92 66 158t158 66h320 q92 0 158 -66t66 -158v-32h672q92 0 158 -66t66 -158z" />
+<glyph unicode="" horiz-adv-x="1920" d="M1781 605q0 35 -53 35h-1088q-40 0 -85.5 -21.5t-71.5 -52.5l-294 -363q-18 -24 -18 -40q0 -35 53 -35h1088q40 0 86 22t71 53l294 363q18 22 18 39zM640 768h768v160q0 40 -28 68t-68 28h-576q-40 0 -68 28t-28 68v64q0 40 -28 68t-68 28h-320q-40 0 -68 -28t-28 -68 v-853l256 315q44 53 116 87.5t140 34.5zM1909 605q0 -62 -46 -120l-295 -363q-43 -53 -116 -87.5t-140 -34.5h-1088q-92 0 -158 66t-66 158v960q0 92 66 158t158 66h320q92 0 158 -66t66 -158v-32h544q92 0 158 -66t66 -158v-160h192q54 0 99 -24.5t67 -70.5q15 -32 15 -68z " />
+</font>
+</defs></svg>
\ No newline at end of file
diff --git a/bitbake/lib/toaster/toastergui/static/fonts/fontawesome-webfont.ttf b/bitbake/lib/toaster/toastergui/static/fonts/fontawesome-webfont.ttf
new file mode 100644
index 0000000..d461724
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/fonts/fontawesome-webfont.ttf
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/fonts/fontawesome-webfont.woff b/bitbake/lib/toaster/toastergui/static/fonts/fontawesome-webfont.woff
new file mode 100644
index 0000000..3c89ae0
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/fonts/fontawesome-webfont.woff
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/fonts/glyphicons-halflings-regular.eot b/bitbake/lib/toaster/toastergui/static/fonts/glyphicons-halflings-regular.eot
new file mode 100644
index 0000000..423bd5d
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/fonts/glyphicons-halflings-regular.eot
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/fonts/glyphicons-halflings-regular.svg b/bitbake/lib/toaster/toastergui/static/fonts/glyphicons-halflings-regular.svg
new file mode 100644
index 0000000..4469488
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/fonts/glyphicons-halflings-regular.svg
@@ -0,0 +1,229 @@
+<?xml version="1.0" standalone="no"?>
+<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd" >
+<svg xmlns="http://www.w3.org/2000/svg">
+<metadata></metadata>
+<defs>
+<font id="glyphicons_halflingsregular" horiz-adv-x="1200" >
+<font-face units-per-em="1200" ascent="960" descent="-240" />
+<missing-glyph horiz-adv-x="500" />
+<glyph />
+<glyph />
+<glyph unicode="
" />
+<glyph unicode=" " />
+<glyph unicode="*" d="M100 500v200h259l-183 183l141 141l183 -183v259h200v-259l183 183l141 -141l-183 -183h259v-200h-259l183 -183l-141 -141l-183 183v-259h-200v259l-183 -183l-141 141l183 183h-259z" />
+<glyph unicode="+" d="M0 400v300h400v400h300v-400h400v-300h-400v-400h-300v400h-400z" />
+<glyph unicode=" " />
+<glyph unicode=" " horiz-adv-x="652" />
+<glyph unicode=" " horiz-adv-x="1304" />
+<glyph unicode=" " horiz-adv-x="652" />
+<glyph unicode=" " horiz-adv-x="1304" />
+<glyph unicode=" " horiz-adv-x="434" />
+<glyph unicode=" " horiz-adv-x="326" />
+<glyph unicode=" " horiz-adv-x="217" />
+<glyph unicode=" " horiz-adv-x="217" />
+<glyph unicode=" " horiz-adv-x="163" />
+<glyph unicode=" " horiz-adv-x="260" />
+<glyph unicode=" " horiz-adv-x="72" />
+<glyph unicode=" " horiz-adv-x="260" />
+<glyph unicode=" " horiz-adv-x="326" />
+<glyph unicode="€" d="M100 500l100 100h113q0 47 5 100h-218l100 100h135q37 167 112 257q117 141 297 141q242 0 354 -189q60 -103 66 -209h-181q0 55 -25.5 99t-63.5 68t-75 36.5t-67 12.5q-24 0 -52.5 -10t-62.5 -32t-65.5 -67t-50.5 -107h379l-100 -100h-300q-6 -46 -6 -100h406l-100 -100 h-300q9 -74 33 -132t52.5 -91t62 -54.5t59 -29t46.5 -7.5q29 0 66 13t75 37t63.5 67.5t25.5 96.5h174q-31 -172 -128 -278q-107 -117 -274 -117q-205 0 -324 158q-36 46 -69 131.5t-45 205.5h-217z" />
+<glyph unicode="−" d="M200 400h900v300h-900v-300z" />
+<glyph unicode="☁" d="M-14 494q0 -80 56.5 -137t135.5 -57h750q120 0 205 86t85 208q0 120 -85 206.5t-205 86.5q-46 0 -90 -14q-44 97 -134.5 156.5t-200.5 59.5q-152 0 -260 -107.5t-108 -260.5q0 -25 2 -37q-66 -14 -108.5 -67.5t-42.5 -122.5z" />
+<glyph unicode="✉" d="M0 100l400 400l200 -200l200 200l400 -400h-1200zM0 300v600l300 -300zM0 1100l600 -603l600 603h-1200zM900 600l300 300v-600z" />
+<glyph unicode="✏" d="M-13 -13l333 112l-223 223zM187 403l214 -214l614 614l-214 214zM887 1103l214 -214l99 92q13 13 13 32.5t-13 33.5l-153 153q-15 13 -33 13t-33 -13z" />
+<glyph unicode="" horiz-adv-x="500" d="M0 0z" />
+<glyph unicode="" d="M0 1200h1200l-500 -550v-550h300v-100h-800v100h300v550z" />
+<glyph unicode="" d="M14 84q18 -55 86 -75.5t147 5.5q65 21 109 69t44 90v606l600 155v-521q-64 16 -138 -7q-79 -26 -122.5 -83t-25.5 -111q17 -55 85.5 -75.5t147.5 4.5q70 23 111.5 63.5t41.5 95.5v881q0 10 -7 15.5t-17 2.5l-752 -193q-10 -3 -17 -12.5t-7 -19.5v-689q-64 17 -138 -7 q-79 -25 -122.5 -82t-25.5 -112z" />
+<glyph unicode="" d="M23 693q0 200 142 342t342 142t342 -142t142 -342q0 -142 -78 -261l300 -300q7 -8 7 -18t-7 -18l-109 -109q-8 -7 -18 -7t-18 7l-300 300q-119 -78 -261 -78q-200 0 -342 142t-142 342zM176 693q0 -136 97 -233t234 -97t233.5 96.5t96.5 233.5t-96.5 233.5t-233.5 96.5 t-234 -97t-97 -233z" />
+<glyph unicode="" d="M100 784q0 64 28 123t73 100.5t104.5 64t119 20.5t120 -38.5t104.5 -104.5q48 69 109.5 105t121.5 38t118.5 -20.5t102.5 -64t71 -100.5t27 -123q0 -57 -33.5 -117.5t-94 -124.5t-126.5 -127.5t-150 -152.5t-146 -174q-62 85 -145.5 174t-149.5 152.5t-126.5 127.5 t-94 124.5t-33.5 117.5z" />
+<glyph unicode="" d="M-72 800h479l146 400h2l146 -400h472l-382 -278l145 -449l-384 275l-382 -275l146 447zM168 71l2 1z" />
+<glyph unicode="" d="M-72 800h479l146 400h2l146 -400h472l-382 -278l145 -449l-384 275l-382 -275l146 447zM168 71l2 1zM237 700l196 -142l-73 -226l192 140l195 -141l-74 229l193 140h-235l-77 211l-78 -211h-239z" />
+<glyph unicode="" d="M0 0v143l400 257v100q-37 0 -68.5 74.5t-31.5 125.5v200q0 124 88 212t212 88t212 -88t88 -212v-200q0 -51 -31.5 -125.5t-68.5 -74.5v-100l400 -257v-143h-1200z" />
+<glyph unicode="" d="M0 0v1100h1200v-1100h-1200zM100 100h100v100h-100v-100zM100 300h100v100h-100v-100zM100 500h100v100h-100v-100zM100 700h100v100h-100v-100zM100 900h100v100h-100v-100zM300 100h600v400h-600v-400zM300 600h600v400h-600v-400zM1000 100h100v100h-100v-100z M1000 300h100v100h-100v-100zM1000 500h100v100h-100v-100zM1000 700h100v100h-100v-100zM1000 900h100v100h-100v-100z" />
+<glyph unicode="" d="M0 50v400q0 21 14.5 35.5t35.5 14.5h400q21 0 35.5 -14.5t14.5 -35.5v-400q0 -21 -14.5 -35.5t-35.5 -14.5h-400q-21 0 -35.5 14.5t-14.5 35.5zM0 650v400q0 21 14.5 35.5t35.5 14.5h400q21 0 35.5 -14.5t14.5 -35.5v-400q0 -21 -14.5 -35.5t-35.5 -14.5h-400 q-21 0 -35.5 14.5t-14.5 35.5zM600 50v400q0 21 14.5 35.5t35.5 14.5h400q21 0 35.5 -14.5t14.5 -35.5v-400q0 -21 -14.5 -35.5t-35.5 -14.5h-400q-21 0 -35.5 14.5t-14.5 35.5zM600 650v400q0 21 14.5 35.5t35.5 14.5h400q21 0 35.5 -14.5t14.5 -35.5v-400 q0 -21 -14.5 -35.5t-35.5 -14.5h-400q-21 0 -35.5 14.5t-14.5 35.5z" />
+<glyph unicode="" d="M0 50v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM0 450v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200 q-21 0 -35.5 14.5t-14.5 35.5zM0 850v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM400 50v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5 t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM400 450v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM400 850v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5 v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM800 50v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM800 450v200q0 21 14.5 35.5t35.5 14.5h200 q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM800 850v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5z" />
+<glyph unicode="" d="M0 50v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM0 450q0 -21 14.5 -35.5t35.5 -14.5h200q21 0 35.5 14.5t14.5 35.5v200q0 21 -14.5 35.5t-35.5 14.5h-200q-21 0 -35.5 -14.5 t-14.5 -35.5v-200zM0 850v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM400 50v200q0 21 14.5 35.5t35.5 14.5h700q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5 t-35.5 -14.5h-700q-21 0 -35.5 14.5t-14.5 35.5zM400 450v200q0 21 14.5 35.5t35.5 14.5h700q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-700q-21 0 -35.5 14.5t-14.5 35.5zM400 850v200q0 21 14.5 35.5t35.5 14.5h700q21 0 35.5 -14.5t14.5 -35.5 v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-700q-21 0 -35.5 14.5t-14.5 35.5z" />
+<glyph unicode="" d="M29 454l419 -420l818 820l-212 212l-607 -607l-206 207z" />
+<glyph unicode="" d="M106 318l282 282l-282 282l212 212l282 -282l282 282l212 -212l-282 -282l282 -282l-212 -212l-282 282l-282 -282z" />
+<glyph unicode="" d="M23 693q0 200 142 342t342 142t342 -142t142 -342q0 -142 -78 -261l300 -300q7 -8 7 -18t-7 -18l-109 -109q-8 -7 -18 -7t-18 7l-300 300q-119 -78 -261 -78q-200 0 -342 142t-142 342zM176 693q0 -136 97 -233t234 -97t233.5 96.5t96.5 233.5t-96.5 233.5t-233.5 96.5 t-234 -97t-97 -233zM300 600v200h100v100h200v-100h100v-200h-100v-100h-200v100h-100z" />
+<glyph unicode="" d="M23 694q0 200 142 342t342 142t342 -142t142 -342q0 -141 -78 -262l300 -299q7 -7 7 -18t-7 -18l-109 -109q-8 -8 -18 -8t-18 8l-300 299q-120 -77 -261 -77q-200 0 -342 142t-142 342zM176 694q0 -136 97 -233t234 -97t233.5 97t96.5 233t-96.5 233t-233.5 97t-234 -97 t-97 -233zM300 601h400v200h-400v-200z" />
+<glyph unicode="" d="M23 600q0 183 105 331t272 210v-166q-103 -55 -165 -155t-62 -220q0 -177 125 -302t302 -125t302 125t125 302q0 120 -62 220t-165 155v166q167 -62 272 -210t105 -331q0 -118 -45.5 -224.5t-123 -184t-184 -123t-224.5 -45.5t-224.5 45.5t-184 123t-123 184t-45.5 224.5 zM500 750q0 -21 14.5 -35.5t35.5 -14.5h100q21 0 35.5 14.5t14.5 35.5v400q0 21 -14.5 35.5t-35.5 14.5h-100q-21 0 -35.5 -14.5t-14.5 -35.5v-400z" />
+<glyph unicode="" d="M100 1h200v300h-200v-300zM400 1v500h200v-500h-200zM700 1v800h200v-800h-200zM1000 1v1200h200v-1200h-200z" />
+<glyph unicode="" d="M26 601q0 -33 6 -74l151 -38l2 -6q14 -49 38 -93l3 -5l-80 -134q45 -59 105 -105l133 81l5 -3q45 -26 94 -39l5 -2l38 -151q40 -5 74 -5q27 0 74 5l38 151l6 2q46 13 93 39l5 3l134 -81q56 44 104 105l-80 134l3 5q24 44 39 93l1 6l152 38q5 40 5 74q0 28 -5 73l-152 38 l-1 6q-16 51 -39 93l-3 5l80 134q-44 58 -104 105l-134 -81l-5 3q-45 25 -93 39l-6 1l-38 152q-40 5 -74 5q-27 0 -74 -5l-38 -152l-5 -1q-50 -14 -94 -39l-5 -3l-133 81q-59 -47 -105 -105l80 -134l-3 -5q-25 -47 -38 -93l-2 -6l-151 -38q-6 -48 -6 -73zM385 601 q0 88 63 151t152 63t152 -63t63 -151q0 -89 -63 -152t-152 -63t-152 63t-63 152z" />
+<glyph unicode="" d="M100 1025v50q0 10 7.5 17.5t17.5 7.5h275v100q0 41 29.5 70.5t70.5 29.5h300q41 0 70.5 -29.5t29.5 -70.5v-100h275q10 0 17.5 -7.5t7.5 -17.5v-50q0 -11 -7 -18t-18 -7h-1050q-11 0 -18 7t-7 18zM200 100v800h900v-800q0 -41 -29.5 -71t-70.5 -30h-700q-41 0 -70.5 30 t-29.5 71zM300 100h100v700h-100v-700zM500 100h100v700h-100v-700zM500 1100h300v100h-300v-100zM700 100h100v700h-100v-700zM900 100h100v700h-100v-700z" />
+<glyph unicode="" d="M1 601l656 644l644 -644h-200v-600h-300v400h-300v-400h-300v600h-200z" />
+<glyph unicode="" d="M100 25v1150q0 11 7 18t18 7h475v-500h400v-675q0 -11 -7 -18t-18 -7h-850q-11 0 -18 7t-7 18zM700 800v300l300 -300h-300z" />
+<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -171 121.5 -292.5t292.5 -121.5t292.5 121.5t121.5 292.5t-121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM500 500v400h100 v-300h200v-100h-300z" />
+<glyph unicode="" d="M-100 0l431 1200h209l-21 -300h162l-20 300h208l431 -1200h-538l-41 400h-242l-40 -400h-539zM488 500h224l-27 300h-170z" />
+<glyph unicode="" d="M0 0v400h490l-290 300h200v500h300v-500h200l-290 -300h490v-400h-1100zM813 200h175v100h-175v-100z" />
+<glyph unicode="" d="M1 600q0 122 47.5 233t127.5 191t191 127.5t233 47.5t233 -47.5t191 -127.5t127.5 -191t47.5 -233t-47.5 -233t-127.5 -191t-191 -127.5t-233 -47.5t-233 47.5t-191 127.5t-127.5 191t-47.5 233zM188 600q0 -170 121 -291t291 -121t291 121t121 291t-121 291t-291 121 t-291 -121t-121 -291zM350 600h150v300h200v-300h150l-250 -300z" />
+<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -171 121.5 -292.5t292.5 -121.5t292.5 121.5t121.5 292.5t-121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM350 600l250 300 l250 -300h-150v-300h-200v300h-150z" />
+<glyph unicode="" d="M0 25v475l200 700h800q199 -700 200 -700v-475q0 -11 -7 -18t-18 -7h-1150q-11 0 -18 7t-7 18zM200 500h200l50 -200h300l50 200h200l-97 500h-606z" />
+<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -172 121.5 -293t292.5 -121t292.5 121t121.5 293q0 171 -121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM500 397v401 l297 -200z" />
+<glyph unicode="" d="M23 600q0 -118 45.5 -224.5t123 -184t184 -123t224.5 -45.5t224.5 45.5t184 123t123 184t45.5 224.5h-150q0 -177 -125 -302t-302 -125t-302 125t-125 302t125 302t302 125q136 0 246 -81l-146 -146h400v400l-145 -145q-157 122 -355 122q-118 0 -224.5 -45.5t-184 -123 t-123 -184t-45.5 -224.5z" />
+<glyph unicode="" d="M23 600q0 118 45.5 224.5t123 184t184 123t224.5 45.5q198 0 355 -122l145 145v-400h-400l147 147q-112 80 -247 80q-177 0 -302 -125t-125 -302h-150zM100 0v400h400l-147 -147q112 -80 247 -80q177 0 302 125t125 302h150q0 -118 -45.5 -224.5t-123 -184t-184 -123 t-224.5 -45.5q-198 0 -355 122z" />
+<glyph unicode="" d="M100 0h1100v1200h-1100v-1200zM200 100v900h900v-900h-900zM300 200v100h100v-100h-100zM300 400v100h100v-100h-100zM300 600v100h100v-100h-100zM300 800v100h100v-100h-100zM500 200h500v100h-500v-100zM500 400v100h500v-100h-500zM500 600v100h500v-100h-500z M500 800v100h500v-100h-500z" />
+<glyph unicode="" d="M0 100v600q0 41 29.5 70.5t70.5 29.5h100v200q0 82 59 141t141 59h300q82 0 141 -59t59 -141v-200h100q41 0 70.5 -29.5t29.5 -70.5v-600q0 -41 -29.5 -70.5t-70.5 -29.5h-900q-41 0 -70.5 29.5t-29.5 70.5zM400 800h300v150q0 21 -14.5 35.5t-35.5 14.5h-200 q-21 0 -35.5 -14.5t-14.5 -35.5v-150z" />
+<glyph unicode="" d="M100 0v1100h100v-1100h-100zM300 400q60 60 127.5 84t127.5 17.5t122 -23t119 -30t110 -11t103 42t91 120.5v500q-40 -81 -101.5 -115.5t-127.5 -29.5t-138 25t-139.5 40t-125.5 25t-103 -29.5t-65 -115.5v-500z" />
+<glyph unicode="" d="M0 275q0 -11 7 -18t18 -7h50q11 0 18 7t7 18v300q0 127 70.5 231.5t184.5 161.5t245 57t245 -57t184.5 -161.5t70.5 -231.5v-300q0 -11 7 -18t18 -7h50q11 0 18 7t7 18v300q0 116 -49.5 227t-131 192.5t-192.5 131t-227 49.5t-227 -49.5t-192.5 -131t-131 -192.5 t-49.5 -227v-300zM200 20v460q0 8 6 14t14 6h160q8 0 14 -6t6 -14v-460q0 -8 -6 -14t-14 -6h-160q-8 0 -14 6t-6 14zM800 20v460q0 8 6 14t14 6h160q8 0 14 -6t6 -14v-460q0 -8 -6 -14t-14 -6h-160q-8 0 -14 6t-6 14z" />
+<glyph unicode="" d="M0 400h300l300 -200v800l-300 -200h-300v-400zM688 459l141 141l-141 141l71 71l141 -141l141 141l71 -71l-141 -141l141 -141l-71 -71l-141 141l-141 -141z" />
+<glyph unicode="" d="M0 400h300l300 -200v800l-300 -200h-300v-400zM700 857l69 53q111 -135 111 -310q0 -169 -106 -302l-67 54q86 110 86 248q0 146 -93 257z" />
+<glyph unicode="" d="M0 401v400h300l300 200v-800l-300 200h-300zM702 858l69 53q111 -135 111 -310q0 -170 -106 -303l-67 55q86 110 86 248q0 145 -93 257zM889 951l7 -8q123 -151 123 -344q0 -189 -119 -339l-7 -8l81 -66l6 8q142 178 142 405q0 230 -144 408l-6 8z" />
+<glyph unicode="" d="M0 0h500v500h-200v100h-100v-100h-200v-500zM0 600h100v100h400v100h100v100h-100v300h-500v-600zM100 100v300h300v-300h-300zM100 800v300h300v-300h-300zM200 200v100h100v-100h-100zM200 900h100v100h-100v-100zM500 500v100h300v-300h200v-100h-100v-100h-200v100 h-100v100h100v200h-200zM600 0v100h100v-100h-100zM600 1000h100v-300h200v-300h300v200h-200v100h200v500h-600v-200zM800 800v300h300v-300h-300zM900 0v100h300v-100h-300zM900 900v100h100v-100h-100zM1100 200v100h100v-100h-100z" />
+<glyph unicode="" d="M0 200h100v1000h-100v-1000zM100 0v100h300v-100h-300zM200 200v1000h100v-1000h-100zM500 0v91h100v-91h-100zM500 200v1000h200v-1000h-200zM700 0v91h100v-91h-100zM800 200v1000h100v-1000h-100zM900 0v91h200v-91h-200zM1000 200v1000h200v-1000h-200z" />
+<glyph unicode="" d="M1 700v475q0 10 7.5 17.5t17.5 7.5h474l700 -700l-500 -500zM148 953q0 -42 29 -71q30 -30 71.5 -30t71.5 30q29 29 29 71t-29 71q-30 30 -71.5 30t-71.5 -30q-29 -29 -29 -71z" />
+<glyph unicode="" d="M2 700v475q0 11 7 18t18 7h474l700 -700l-500 -500zM148 953q0 -42 30 -71q29 -30 71 -30t71 30q30 29 30 71t-30 71q-29 30 -71 30t-71 -30q-30 -29 -30 -71zM701 1200h100l700 -700l-500 -500l-50 50l450 450z" />
+<glyph unicode="" d="M100 0v1025l175 175h925v-1000l-100 -100v1000h-750l-100 -100h750v-1000h-900z" />
+<glyph unicode="" d="M200 0l450 444l450 -443v1150q0 20 -14.5 35t-35.5 15h-800q-21 0 -35.5 -15t-14.5 -35v-1151z" />
+<glyph unicode="" d="M0 100v700h200l100 -200h600l100 200h200v-700h-200v200h-800v-200h-200zM253 829l40 -124h592l62 124l-94 346q-2 11 -10 18t-18 7h-450q-10 0 -18 -7t-10 -18zM281 24l38 152q2 10 11.5 17t19.5 7h500q10 0 19.5 -7t11.5 -17l38 -152q2 -10 -3.5 -17t-15.5 -7h-600 q-10 0 -15.5 7t-3.5 17z" />
+<glyph unicode="" d="M0 200q0 -41 29.5 -70.5t70.5 -29.5h1000q41 0 70.5 29.5t29.5 70.5v600q0 41 -29.5 70.5t-70.5 29.5h-150q-4 8 -11.5 21.5t-33 48t-53 61t-69 48t-83.5 21.5h-200q-41 0 -82 -20.5t-70 -50t-52 -59t-34 -50.5l-12 -20h-150q-41 0 -70.5 -29.5t-29.5 -70.5v-600z M356 500q0 100 72 172t172 72t172 -72t72 -172t-72 -172t-172 -72t-172 72t-72 172zM494 500q0 -44 31 -75t75 -31t75 31t31 75t-31 75t-75 31t-75 -31t-31 -75zM900 700v100h100v-100h-100z" />
+<glyph unicode="" d="M53 0h365v66q-41 0 -72 11t-49 38t1 71l92 234h391l82 -222q16 -45 -5.5 -88.5t-74.5 -43.5v-66h417v66q-34 1 -74 43q-18 19 -33 42t-21 37l-6 13l-385 998h-93l-399 -1006q-24 -48 -52 -75q-12 -12 -33 -25t-36 -20l-15 -7v-66zM416 521l178 457l46 -140l116 -317h-340 z" />
+<glyph unicode="" d="M100 0v89q41 7 70.5 32.5t29.5 65.5v827q0 28 -1 39.5t-5.5 26t-15.5 21t-29 14t-49 14.5v70h471q120 0 213 -88t93 -228q0 -55 -11.5 -101.5t-28 -74t-33.5 -47.5t-28 -28l-12 -7q8 -3 21.5 -9t48 -31.5t60.5 -58t47.5 -91.5t21.5 -129q0 -84 -59 -156.5t-142 -111 t-162 -38.5h-500zM400 200h161q89 0 153 48.5t64 132.5q0 90 -62.5 154.5t-156.5 64.5h-159v-400zM400 700h139q76 0 130 61.5t54 138.5q0 82 -84 130.5t-239 48.5v-379z" />
+<glyph unicode="" d="M200 0v57q77 7 134.5 40.5t65.5 80.5l173 849q10 56 -10 74t-91 37q-6 1 -10.5 2.5t-9.5 2.5v57h425l2 -57q-33 -8 -62 -25.5t-46 -37t-29.5 -38t-17.5 -30.5l-5 -12l-128 -825q-10 -52 14 -82t95 -36v-57h-500z" />
+<glyph unicode="" d="M-75 200h75v800h-75l125 167l125 -167h-75v-800h75l-125 -167zM300 900v300h150h700h150v-300h-50q0 29 -8 48.5t-18.5 30t-33.5 15t-39.5 5.5t-50.5 1h-200v-850l100 -50v-100h-400v100l100 50v850h-200q-34 0 -50.5 -1t-40 -5.5t-33.5 -15t-18.5 -30t-8.5 -48.5h-49z " />
+<glyph unicode="" d="M33 51l167 125v-75h800v75l167 -125l-167 -125v75h-800v-75zM100 901v300h150h700h150v-300h-50q0 29 -8 48.5t-18 30t-33.5 15t-40 5.5t-50.5 1h-200v-650l100 -50v-100h-400v100l100 50v650h-200q-34 0 -50.5 -1t-39.5 -5.5t-33.5 -15t-18.5 -30t-8 -48.5h-50z" />
+<glyph unicode="" d="M0 50q0 -20 14.5 -35t35.5 -15h1100q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-1100q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM0 350q0 -20 14.5 -35t35.5 -15h800q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-800q-21 0 -35.5 -14.5t-14.5 -35.5 v-100zM0 650q0 -20 14.5 -35t35.5 -15h1000q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-1000q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM0 950q0 -20 14.5 -35t35.5 -15h600q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-600q-21 0 -35.5 -14.5 t-14.5 -35.5v-100z" />
+<glyph unicode="" d="M0 50q0 -20 14.5 -35t35.5 -15h1100q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-1100q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM0 650q0 -20 14.5 -35t35.5 -15h1100q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-1100q-21 0 -35.5 -14.5t-14.5 -35.5 v-100zM200 350q0 -20 14.5 -35t35.5 -15h700q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-700q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM200 950q0 -20 14.5 -35t35.5 -15h700q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-700q-21 0 -35.5 -14.5 t-14.5 -35.5v-100z" />
+<glyph unicode="" d="M0 50v100q0 21 14.5 35.5t35.5 14.5h1100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-1100q-21 0 -35.5 15t-14.5 35zM100 650v100q0 21 14.5 35.5t35.5 14.5h1000q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-1000q-21 0 -35.5 15 t-14.5 35zM300 350v100q0 21 14.5 35.5t35.5 14.5h800q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-800q-21 0 -35.5 15t-14.5 35zM500 950v100q0 21 14.5 35.5t35.5 14.5h600q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-600 q-21 0 -35.5 15t-14.5 35z" />
+<glyph unicode="" d="M0 50v100q0 21 14.5 35.5t35.5 14.5h1100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-1100q-21 0 -35.5 15t-14.5 35zM0 350v100q0 21 14.5 35.5t35.5 14.5h1100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-1100q-21 0 -35.5 15 t-14.5 35zM0 650v100q0 21 14.5 35.5t35.5 14.5h1100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-1100q-21 0 -35.5 15t-14.5 35zM0 950v100q0 21 14.5 35.5t35.5 14.5h1100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-1100 q-21 0 -35.5 15t-14.5 35z" />
+<glyph unicode="" d="M0 50v100q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-100q-21 0 -35.5 15t-14.5 35zM0 350v100q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-100q-21 0 -35.5 15 t-14.5 35zM0 650v100q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-100q-21 0 -35.5 15t-14.5 35zM0 950v100q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-100q-21 0 -35.5 15 t-14.5 35zM300 50v100q0 21 14.5 35.5t35.5 14.5h800q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-800q-21 0 -35.5 15t-14.5 35zM300 350v100q0 21 14.5 35.5t35.5 14.5h800q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-800 q-21 0 -35.5 15t-14.5 35zM300 650v100q0 21 14.5 35.5t35.5 14.5h800q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-800q-21 0 -35.5 15t-14.5 35zM300 950v100q0 21 14.5 35.5t35.5 14.5h800q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15 h-800q-21 0 -35.5 15t-14.5 35z" />
+<glyph unicode="" d="M-101 500v100h201v75l166 -125l-166 -125v75h-201zM300 0h100v1100h-100v-1100zM500 50q0 -20 14.5 -35t35.5 -15h600q20 0 35 15t15 35v100q0 21 -15 35.5t-35 14.5h-600q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM500 350q0 -20 14.5 -35t35.5 -15h300q20 0 35 15t15 35 v100q0 21 -15 35.5t-35 14.5h-300q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM500 650q0 -20 14.5 -35t35.5 -15h500q20 0 35 15t15 35v100q0 21 -15 35.5t-35 14.5h-500q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM500 950q0 -20 14.5 -35t35.5 -15h100q20 0 35 15t15 35v100 q0 21 -15 35.5t-35 14.5h-100q-21 0 -35.5 -14.5t-14.5 -35.5v-100z" />
+<glyph unicode="" d="M1 50q0 -20 14.5 -35t35.5 -15h600q20 0 35 15t15 35v100q0 21 -15 35.5t-35 14.5h-600q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM1 350q0 -20 14.5 -35t35.5 -15h300q20 0 35 15t15 35v100q0 21 -15 35.5t-35 14.5h-300q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM1 650 q0 -20 14.5 -35t35.5 -15h500q20 0 35 15t15 35v100q0 21 -15 35.5t-35 14.5h-500q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM1 950q0 -20 14.5 -35t35.5 -15h100q20 0 35 15t15 35v100q0 21 -15 35.5t-35 14.5h-100q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM801 0v1100h100v-1100 h-100zM934 550l167 -125v75h200v100h-200v75z" />
+<glyph unicode="" d="M0 275v650q0 31 22 53t53 22h750q31 0 53 -22t22 -53v-650q0 -31 -22 -53t-53 -22h-750q-31 0 -53 22t-22 53zM900 600l300 300v-600z" />
+<glyph unicode="" d="M0 44v1012q0 18 13 31t31 13h1112q19 0 31.5 -13t12.5 -31v-1012q0 -18 -12.5 -31t-31.5 -13h-1112q-18 0 -31 13t-13 31zM100 263l247 182l298 -131l-74 156l293 318l236 -288v500h-1000v-737zM208 750q0 56 39 95t95 39t95 -39t39 -95t-39 -95t-95 -39t-95 39t-39 95z " />
+<glyph unicode="" d="M148 745q0 124 60.5 231.5t165 172t226.5 64.5q123 0 227 -63t164.5 -169.5t60.5 -229.5t-73 -272q-73 -114 -166.5 -237t-150.5 -189l-57 -66q-10 9 -27 26t-66.5 70.5t-96 109t-104 135.5t-100.5 155q-63 139 -63 262zM342 772q0 -107 75.5 -182.5t181.5 -75.5 q107 0 182.5 75.5t75.5 182.5t-75.5 182t-182.5 75t-182 -75.5t-75 -181.5z" />
+<glyph unicode="" d="M1 600q0 122 47.5 233t127.5 191t191 127.5t233 47.5t233 -47.5t191 -127.5t127.5 -191t47.5 -233t-47.5 -233t-127.5 -191t-191 -127.5t-233 -47.5t-233 47.5t-191 127.5t-127.5 191t-47.5 233zM173 600q0 -177 125.5 -302t301.5 -125v854q-176 0 -301.5 -125 t-125.5 -302z" />
+<glyph unicode="" d="M117 406q0 94 34 186t88.5 172.5t112 159t115 177t87.5 194.5q21 -71 57.5 -142.5t76 -130.5t83 -118.5t82 -117t70 -116t50 -125.5t18.5 -136q0 -89 -39 -165.5t-102 -126.5t-140 -79.5t-156 -33.5q-114 6 -211.5 53t-161.5 138.5t-64 210.5zM243 414q14 -82 59.5 -136 t136.5 -80l16 98q-7 6 -18 17t-34 48t-33 77q-15 73 -14 143.5t10 122.5l9 51q-92 -110 -119.5 -185t-12.5 -156z" />
+<glyph unicode="" d="M0 400v300q0 165 117.5 282.5t282.5 117.5q366 -6 397 -14l-186 -186h-311q-41 0 -70.5 -29.5t-29.5 -70.5v-500q0 -41 29.5 -70.5t70.5 -29.5h500q41 0 70.5 29.5t29.5 70.5v125l200 200v-225q0 -165 -117.5 -282.5t-282.5 -117.5h-300q-165 0 -282.5 117.5 t-117.5 282.5zM436 341l161 50l412 412l-114 113l-405 -405zM995 1015l113 -113l113 113l-21 85l-92 28z" />
+<glyph unicode="" d="M0 400v300q0 165 117.5 282.5t282.5 117.5h261l2 -80q-133 -32 -218 -120h-145q-41 0 -70.5 -29.5t-29.5 -70.5v-500q0 -41 29.5 -70.5t70.5 -29.5h500q41 0 70.5 29.5t29.5 70.5l200 153v-53q0 -165 -117.5 -282.5t-282.5 -117.5h-300q-165 0 -282.5 117.5t-117.5 282.5 zM423 524q30 38 81.5 64t103 35.5t99 14t77.5 3.5l29 -1v-209l360 324l-359 318v-216q-7 0 -19 -1t-48 -8t-69.5 -18.5t-76.5 -37t-76.5 -59t-62 -88t-39.5 -121.5z" />
+<glyph unicode="" d="M0 400v300q0 165 117.5 282.5t282.5 117.5h300q60 0 127 -23l-178 -177h-349q-41 0 -70.5 -29.5t-29.5 -70.5v-500q0 -41 29.5 -70.5t70.5 -29.5h500q41 0 70.5 29.5t29.5 70.5v69l200 200v-169q0 -165 -117.5 -282.5t-282.5 -117.5h-300q-165 0 -282.5 117.5 t-117.5 282.5zM342 632l283 -284l566 567l-136 137l-430 -431l-147 147z" />
+<glyph unicode="" d="M0 603l300 296v-198h200v200h-200l300 300l295 -300h-195v-200h200v198l300 -296l-300 -300v198h-200v-200h195l-295 -300l-300 300h200v200h-200v-198z" />
+<glyph unicode="" d="M200 50v1000q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-437l500 487v-1100l-500 488v-438q0 -21 -14.5 -35.5t-35.5 -14.5h-100q-21 0 -35.5 14.5t-14.5 35.5z" />
+<glyph unicode="" d="M0 50v1000q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-437l500 487v-487l500 487v-1100l-500 488v-488l-500 488v-438q0 -21 -14.5 -35.5t-35.5 -14.5h-100q-21 0 -35.5 14.5t-14.5 35.5z" />
+<glyph unicode="" d="M136 550l564 550v-487l500 487v-1100l-500 488v-488z" />
+<glyph unicode="" d="M200 0l900 550l-900 550v-1100z" />
+<glyph unicode="" d="M200 150q0 -21 14.5 -35.5t35.5 -14.5h200q21 0 35.5 14.5t14.5 35.5v800q0 21 -14.5 35.5t-35.5 14.5h-200q-21 0 -35.5 -14.5t-14.5 -35.5v-800zM600 150q0 -21 14.5 -35.5t35.5 -14.5h200q21 0 35.5 14.5t14.5 35.5v800q0 21 -14.5 35.5t-35.5 14.5h-200 q-21 0 -35.5 -14.5t-14.5 -35.5v-800z" />
+<glyph unicode="" d="M200 150q0 -20 14.5 -35t35.5 -15h800q21 0 35.5 15t14.5 35v800q0 21 -14.5 35.5t-35.5 14.5h-800q-21 0 -35.5 -14.5t-14.5 -35.5v-800z" />
+<glyph unicode="" d="M0 0v1100l500 -487v487l564 -550l-564 -550v488z" />
+<glyph unicode="" d="M0 0v1100l500 -487v487l500 -487v437q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-1000q0 -21 -14.5 -35.5t-35.5 -14.5h-100q-21 0 -35.5 14.5t-14.5 35.5v438l-500 -488v488z" />
+<glyph unicode="" d="M300 0v1100l500 -487v437q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-1000q0 -21 -14.5 -35.5t-35.5 -14.5h-100q-21 0 -35.5 14.5t-14.5 35.5v438z" />
+<glyph unicode="" d="M100 250v100q0 21 14.5 35.5t35.5 14.5h1000q21 0 35.5 -14.5t14.5 -35.5v-100q0 -21 -14.5 -35.5t-35.5 -14.5h-1000q-21 0 -35.5 14.5t-14.5 35.5zM100 500h1100l-550 564z" />
+<glyph unicode="" d="M185 599l592 -592l240 240l-353 353l353 353l-240 240z" />
+<glyph unicode="" d="M272 194l353 353l-353 353l241 240l572 -571l21 -22l-1 -1v-1l-592 -591z" />
+<glyph unicode="" d="M3 600q0 162 80 299.5t217.5 217.5t299.5 80t299.5 -80t217.5 -217.5t80 -299.5t-80 -300t-217.5 -218t-299.5 -80t-299.5 80t-217.5 218t-80 300zM300 500h200v-200h200v200h200v200h-200v200h-200v-200h-200v-200z" />
+<glyph unicode="" d="M3 600q0 162 80 299.5t217.5 217.5t299.5 80t299.5 -80t217.5 -217.5t80 -299.5t-80 -300t-217.5 -218t-299.5 -80t-299.5 80t-217.5 218t-80 300zM300 500h600v200h-600v-200z" />
+<glyph unicode="" d="M3 600q0 162 80 299.5t217.5 217.5t299.5 80t299.5 -80t217.5 -217.5t80 -299.5t-80 -300t-217.5 -218t-299.5 -80t-299.5 80t-217.5 218t-80 300zM246 459l213 -213l141 142l141 -142l213 213l-142 141l142 141l-213 212l-141 -141l-141 142l-212 -213l141 -141z" />
+<glyph unicode="" d="M3 600q0 162 80 299.5t217.5 217.5t299.5 80t299.5 -80t217.5 -217.5t80 -299.5t-80 -299.5t-217.5 -217.5t-299.5 -80t-299.5 80t-217.5 217.5t-80 299.5zM270 551l276 -277l411 411l-175 174l-236 -236l-102 102z" />
+<glyph unicode="" d="M3 600q0 162 80 299.5t217.5 217.5t299.5 80t299.5 -80t217.5 -217.5t80 -299.5t-80 -300t-217.5 -218t-299.5 -80t-299.5 80t-217.5 218t-80 300zM363 700h144q4 0 11.5 -1t11 -1t6.5 3t3 9t1 11t3.5 8.5t3.5 6t5.5 4t6.5 2.5t9 1.5t9 0.5h11.5h12.5q19 0 30 -10t11 -26 q0 -22 -4 -28t-27 -22q-5 -1 -12.5 -3t-27 -13.5t-34 -27t-26.5 -46t-11 -68.5h200q5 3 14 8t31.5 25.5t39.5 45.5t31 69t14 94q0 51 -17.5 89t-42 58t-58.5 32t-58.5 15t-51.5 3q-105 0 -172 -56t-67 -183zM500 300h200v100h-200v-100z" />
+<glyph unicode="" d="M3 600q0 162 80 299.5t217.5 217.5t299.5 80t299.5 -80t217.5 -217.5t80 -299.5t-80 -300t-217.5 -218t-299.5 -80t-299.5 80t-217.5 218t-80 300zM400 300h400v100h-100v300h-300v-100h100v-200h-100v-100zM500 800h200v100h-200v-100z" />
+<glyph unicode="" d="M0 500v200h194q15 60 36 104.5t55.5 86t88 69t126.5 40.5v200h200v-200q54 -20 113 -60t112.5 -105.5t71.5 -134.5h203v-200h-203q-25 -102 -116.5 -186t-180.5 -117v-197h-200v197q-140 27 -208 102.5t-98 200.5h-194zM290 500q24 -73 79.5 -127.5t130.5 -78.5v206h200 v-206q149 48 201 206h-201v200h200q-25 74 -76 127.5t-124 76.5v-204h-200v203q-75 -24 -130 -77.5t-79 -125.5h209v-200h-210z" />
+<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -171 121.5 -292.5t292.5 -121.5t292.5 121.5t121.5 292.5t-121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM356 465l135 135 l-135 135l109 109l135 -135l135 135l109 -109l-135 -135l135 -135l-109 -109l-135 135l-135 -135z" />
+<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -171 121.5 -292.5t292.5 -121.5t292.5 121.5t121.5 292.5t-121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM322 537l141 141 l87 -87l204 205l142 -142l-346 -345z" />
+<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -115 62 -215l568 567q-100 62 -216 62q-171 0 -292.5 -121.5t-121.5 -292.5zM391 245q97 -59 209 -59q171 0 292.5 121.5t121.5 292.5 q0 112 -59 209z" />
+<glyph unicode="" d="M0 547l600 453v-300h600v-300h-600v-301z" />
+<glyph unicode="" d="M0 400v300h600v300l600 -453l-600 -448v301h-600z" />
+<glyph unicode="" d="M204 600l450 600l444 -600h-298v-600h-300v600h-296z" />
+<glyph unicode="" d="M104 600h296v600h300v-600h298l-449 -600z" />
+<glyph unicode="" d="M0 200q6 132 41 238.5t103.5 193t184 138t271.5 59.5v271l600 -453l-600 -448v301q-95 -2 -183 -20t-170 -52t-147 -92.5t-100 -135.5z" />
+<glyph unicode="" d="M0 0v400l129 -129l294 294l142 -142l-294 -294l129 -129h-400zM635 777l142 -142l294 294l129 -129v400h-400l129 -129z" />
+<glyph unicode="" d="M34 176l295 295l-129 129h400v-400l-129 130l-295 -295zM600 600v400l129 -129l295 295l142 -141l-295 -295l129 -130h-400z" />
+<glyph unicode="" d="M23 600q0 118 45.5 224.5t123 184t184 123t224.5 45.5t224.5 -45.5t184 -123t123 -184t45.5 -224.5t-45.5 -224.5t-123 -184t-184 -123t-224.5 -45.5t-224.5 45.5t-184 123t-123 184t-45.5 224.5zM456 851l58 -302q4 -20 21.5 -34.5t37.5 -14.5h54q20 0 37.5 14.5 t21.5 34.5l58 302q4 20 -8 34.5t-33 14.5h-207q-20 0 -32 -14.5t-8 -34.5zM500 300h200v100h-200v-100z" />
+<glyph unicode="" d="M0 800h100v-200h400v300h200v-300h400v200h100v100h-111v6t-1 15t-3 18l-34 172q-11 39 -41.5 63t-69.5 24q-32 0 -61 -17l-239 -144q-22 -13 -40 -35q-19 24 -40 36l-238 144q-33 18 -62 18q-39 0 -69.5 -23t-40.5 -61l-35 -177q-2 -8 -3 -18t-1 -15v-6h-111v-100z M100 0h400v400h-400v-400zM200 900q-3 0 14 48t35 96l18 47l214 -191h-281zM700 0v400h400v-400h-400zM731 900l202 197q5 -12 12 -32.5t23 -64t25 -72t7 -28.5h-269z" />
+<glyph unicode="" d="M0 -22v143l216 193q-9 53 -13 83t-5.5 94t9 113t38.5 114t74 124q47 60 99.5 102.5t103 68t127.5 48t145.5 37.5t184.5 43.5t220 58.5q0 -189 -22 -343t-59 -258t-89 -181.5t-108.5 -120t-122 -68t-125.5 -30t-121.5 -1.5t-107.5 12.5t-87.5 17t-56.5 7.5l-99 -55z M238.5 300.5q19.5 -6.5 86.5 76.5q55 66 367 234q70 38 118.5 69.5t102 79t99 111.5t86.5 148q22 50 24 60t-6 19q-7 5 -17 5t-26.5 -14.5t-33.5 -39.5q-35 -51 -113.5 -108.5t-139.5 -89.5l-61 -32q-369 -197 -458 -401q-48 -111 -28.5 -117.5z" />
+<glyph unicode="" d="M111 408q0 -33 5 -63q9 -56 44 -119.5t105 -108.5q31 -21 64 -16t62 23.5t57 49.5t48 61.5t35 60.5q32 66 39 184.5t-13 157.5q79 -80 122 -164t26 -184q-5 -33 -20.5 -69.5t-37.5 -80.5q-10 -19 -14.5 -29t-12 -26t-9 -23.5t-3 -19t2.5 -15.5t11 -9.5t19.5 -5t30.5 2.5 t42 8q57 20 91 34t87.5 44.5t87 64t65.5 88.5t47 122q38 172 -44.5 341.5t-246.5 278.5q22 -44 43 -129q39 -159 -32 -154q-15 2 -33 9q-79 33 -120.5 100t-44 175.5t48.5 257.5q-13 -8 -34 -23.5t-72.5 -66.5t-88.5 -105.5t-60 -138t-8 -166.5q2 -12 8 -41.5t8 -43t6 -39.5 t3.5 -39.5t-1 -33.5t-6 -31.5t-13.5 -24t-21 -20.5t-31 -12q-38 -10 -67 13t-40.5 61.5t-15 81.5t10.5 75q-52 -46 -83.5 -101t-39 -107t-7.5 -85z" />
+<glyph unicode="" d="M-61 600l26 40q6 10 20 30t49 63.5t74.5 85.5t97 90t116.5 83.5t132.5 59t145.5 23.5t145.5 -23.5t132.5 -59t116.5 -83.5t97 -90t74.5 -85.5t49 -63.5t20 -30l26 -40l-26 -40q-6 -10 -20 -30t-49 -63.5t-74.5 -85.5t-97 -90t-116.5 -83.5t-132.5 -59t-145.5 -23.5 t-145.5 23.5t-132.5 59t-116.5 83.5t-97 90t-74.5 85.5t-49 63.5t-20 30zM120 600q7 -10 40.5 -58t56 -78.5t68 -77.5t87.5 -75t103 -49.5t125 -21.5t123.5 20t100.5 45.5t85.5 71.5t66.5 75.5t58 81.5t47 66q-1 1 -28.5 37.5t-42 55t-43.5 53t-57.5 63.5t-58.5 54 q49 -74 49 -163q0 -124 -88 -212t-212 -88t-212 88t-88 212q0 85 46 158q-102 -87 -226 -258zM377 656q49 -124 154 -191l105 105q-37 24 -75 72t-57 84l-20 36z" />
+<glyph unicode="" d="M-61 600l26 40q6 10 20 30t49 63.5t74.5 85.5t97 90t116.5 83.5t132.5 59t145.5 23.5q61 0 121 -17l37 142h148l-314 -1200h-148l37 143q-82 21 -165 71.5t-140 102t-109.5 112t-72 88.5t-29.5 43zM120 600q210 -282 393 -336l37 141q-107 18 -178.5 101.5t-71.5 193.5 q0 85 46 158q-102 -87 -226 -258zM377 656q49 -124 154 -191l47 47l23 87q-30 28 -59 69t-44 68l-14 26zM780 161l38 145q22 15 44.5 34t46 44t40.5 44t41 50.5t33.5 43.5t33 44t24.5 34q-97 127 -140 175l39 146q67 -54 131.5 -125.5t87.5 -103.5t36 -52l26 -40l-26 -40 q-7 -12 -25.5 -38t-63.5 -79.5t-95.5 -102.5t-124 -100t-146.5 -79z" />
+<glyph unicode="" d="M-97.5 34q13.5 -34 50.5 -34h1294q37 0 50.5 35.5t-7.5 67.5l-642 1056q-20 33 -48 36t-48 -29l-642 -1066q-21 -32 -7.5 -66zM155 200l445 723l445 -723h-345v100h-200v-100h-345zM500 600l100 -300l100 300v100h-200v-100z" />
+<glyph unicode="" d="M100 262v41q0 20 11 44.5t26 38.5l363 325v339q0 62 44 106t106 44t106 -44t44 -106v-339l363 -325q15 -14 26 -38.5t11 -44.5v-41q0 -20 -12 -26.5t-29 5.5l-359 249v-263q100 -91 100 -113v-64q0 -21 -13 -29t-32 1l-94 78h-222l-94 -78q-19 -9 -32 -1t-13 29v64 q0 22 100 113v263l-359 -249q-17 -12 -29 -5.5t-12 26.5z" />
+<glyph unicode="" d="M0 50q0 -20 14.5 -35t35.5 -15h1000q21 0 35.5 15t14.5 35v750h-1100v-750zM0 900h1100v150q0 21 -14.5 35.5t-35.5 14.5h-150v100h-100v-100h-500v100h-100v-100h-150q-21 0 -35.5 -14.5t-14.5 -35.5v-150zM100 100v100h100v-100h-100zM100 300v100h100v-100h-100z M100 500v100h100v-100h-100zM300 100v100h100v-100h-100zM300 300v100h100v-100h-100zM300 500v100h100v-100h-100zM500 100v100h100v-100h-100zM500 300v100h100v-100h-100zM500 500v100h100v-100h-100zM700 100v100h100v-100h-100zM700 300v100h100v-100h-100zM700 500 v100h100v-100h-100zM900 100v100h100v-100h-100zM900 300v100h100v-100h-100zM900 500v100h100v-100h-100z" />
+<glyph unicode="" d="M0 200v200h259l600 600h241v198l300 -295l-300 -300v197h-159l-600 -600h-341zM0 800h259l122 -122l141 142l-181 180h-341v-200zM678 381l141 142l122 -123h159v198l300 -295l-300 -300v197h-241z" />
+<glyph unicode="" d="M0 400v600q0 41 29.5 70.5t70.5 29.5h1000q41 0 70.5 -29.5t29.5 -70.5v-600q0 -41 -29.5 -70.5t-70.5 -29.5h-596l-304 -300v300h-100q-41 0 -70.5 29.5t-29.5 70.5z" />
+<glyph unicode="" d="M100 600v200h300v-250q0 -113 6 -145q17 -92 102 -117q39 -11 92 -11q37 0 66.5 5.5t50 15.5t36 24t24 31.5t14 37.5t7 42t2.5 45t0 47v25v250h300v-200q0 -42 -3 -83t-15 -104t-31.5 -116t-58 -109.5t-89 -96.5t-129 -65.5t-174.5 -25.5t-174.5 25.5t-129 65.5t-89 96.5 t-58 109.5t-31.5 116t-15 104t-3 83zM100 900v300h300v-300h-300zM800 900v300h300v-300h-300z" />
+<glyph unicode="" d="M-30 411l227 -227l352 353l353 -353l226 227l-578 579z" />
+<glyph unicode="" d="M70 797l580 -579l578 579l-226 227l-353 -353l-352 353z" />
+<glyph unicode="" d="M-198 700l299 283l300 -283h-203v-400h385l215 -200h-800v600h-196zM402 1000l215 -200h381v-400h-198l299 -283l299 283h-200v600h-796z" />
+<glyph unicode="" d="M18 939q-5 24 10 42q14 19 39 19h896l38 162q5 17 18.5 27.5t30.5 10.5h94q20 0 35 -14.5t15 -35.5t-15 -35.5t-35 -14.5h-54l-201 -961q-2 -4 -6 -10.5t-19 -17.5t-33 -11h-31v-50q0 -20 -14.5 -35t-35.5 -15t-35.5 15t-14.5 35v50h-300v-50q0 -20 -14.5 -35t-35.5 -15 t-35.5 15t-14.5 35v50h-50q-21 0 -35.5 15t-14.5 35q0 21 14.5 35.5t35.5 14.5h535l48 200h-633q-32 0 -54.5 21t-27.5 43z" />
+<glyph unicode="" d="M0 0v800h1200v-800h-1200zM0 900v100h200q0 41 29.5 70.5t70.5 29.5h300q41 0 70.5 -29.5t29.5 -70.5h500v-100h-1200z" />
+<glyph unicode="" d="M1 0l300 700h1200l-300 -700h-1200zM1 400v600h200q0 41 29.5 70.5t70.5 29.5h300q41 0 70.5 -29.5t29.5 -70.5h500v-200h-1000z" />
+<glyph unicode="" d="M302 300h198v600h-198l298 300l298 -300h-198v-600h198l-298 -300z" />
+<glyph unicode="" d="M0 600l300 298v-198h600v198l300 -298l-300 -297v197h-600v-197z" />
+<glyph unicode="" d="M0 100v100q0 41 29.5 70.5t70.5 29.5h1000q41 0 70.5 -29.5t29.5 -70.5v-100q0 -41 -29.5 -70.5t-70.5 -29.5h-1000q-41 0 -70.5 29.5t-29.5 70.5zM31 400l172 739q5 22 23 41.5t38 19.5h672q19 0 37.5 -22.5t23.5 -45.5l172 -732h-1138zM800 100h100v100h-100v-100z M1000 100h100v100h-100v-100z" />
+<glyph unicode="" d="M-101 600v50q0 24 25 49t50 38l25 13v-250l-11 5.5t-24 14t-30 21.5t-24 27.5t-11 31.5zM99 500v250v5q0 13 0.5 18.5t2.5 13t8 10.5t15 3h200l675 250v-850l-675 200h-38l47 -276q2 -12 -3 -17.5t-11 -6t-21 -0.5h-8h-83q-20 0 -34.5 14t-18.5 35q-56 337 -56 351z M1100 200v850q0 21 14.5 35.5t35.5 14.5q20 0 35 -14.5t15 -35.5v-850q0 -20 -15 -35t-35 -15q-21 0 -35.5 15t-14.5 35z" />
+<glyph unicode="" d="M74 350q0 21 13.5 35.5t33.5 14.5h17l118 173l63 327q15 77 76 140t144 83l-18 32q-6 19 3 32t29 13h94q20 0 29 -10.5t3 -29.5l-18 -37q83 -19 144 -82.5t76 -140.5l63 -327l118 -173h17q20 0 33.5 -14.5t13.5 -35.5q0 -20 -13 -40t-31 -27q-22 -9 -63 -23t-167.5 -37 t-251.5 -23t-245.5 20.5t-178.5 41.5l-58 20q-18 7 -31 27.5t-13 40.5zM497 110q12 -49 40 -79.5t63 -30.5t63 30.5t39 79.5q-48 -6 -102 -6t-103 6z" />
+<glyph unicode="" d="M21 445l233 -45l-78 -224l224 78l45 -233l155 179l155 -179l45 233l224 -78l-78 224l234 45l-180 155l180 156l-234 44l78 225l-224 -78l-45 233l-155 -180l-155 180l-45 -233l-224 78l78 -225l-233 -44l179 -156z" />
+<glyph unicode="" d="M0 200h200v600h-200v-600zM300 275q0 -75 100 -75h61q123 -100 139 -100h250q46 0 83 57l238 344q29 31 29 74v100q0 44 -30.5 84.5t-69.5 40.5h-328q28 118 28 125v150q0 44 -30.5 84.5t-69.5 40.5h-50q-27 0 -51 -20t-38 -48l-96 -198l-145 -196q-20 -26 -20 -63v-400z M400 300v375l150 212l100 213h50v-175l-50 -225h450v-125l-250 -375h-214l-136 100h-100z" />
+<glyph unicode="" d="M0 400v600h200v-600h-200zM300 525v400q0 75 100 75h61q123 100 139 100h250q46 0 83 -57l238 -344q29 -31 29 -74v-100q0 -44 -30.5 -84.5t-69.5 -40.5h-328q28 -118 28 -125v-150q0 -44 -30.5 -84.5t-69.5 -40.5h-50q-27 0 -51 20t-38 48l-96 198l-145 196 q-20 26 -20 63zM400 525l150 -212l100 -213h50v175l-50 225h450v125l-250 375h-214l-136 -100h-100v-375z" />
+<glyph unicode="" d="M8 200v600h200v-600h-200zM308 275v525q0 17 14 35.5t28 28.5l14 9l362 230q14 6 25 6q17 0 29 -12l109 -112q14 -14 14 -34q0 -18 -11 -32l-85 -121h302q85 0 138.5 -38t53.5 -110t-54.5 -111t-138.5 -39h-107l-130 -339q-7 -22 -20.5 -41.5t-28.5 -19.5h-341 q-7 0 -90 81t-83 94zM408 289l100 -89h293l131 339q6 21 19.5 41t28.5 20h203q16 0 25 15t9 36q0 20 -9 34.5t-25 14.5h-457h-6.5h-7.5t-6.5 0.5t-6 1t-5 1.5t-5.5 2.5t-4 4t-4 5.5q-5 12 -5 20q0 14 10 27l147 183l-86 83l-339 -236v-503z" />
+<glyph unicode="" d="M-101 651q0 72 54 110t139 37h302l-85 121q-11 16 -11 32q0 21 14 34l109 113q13 12 29 12q11 0 25 -6l365 -230q7 -4 16.5 -10.5t26 -26t16.5 -36.5v-526q0 -13 -85.5 -93.5t-93.5 -80.5h-342q-15 0 -28.5 20t-19.5 41l-131 339h-106q-84 0 -139 39t-55 111zM-1 601h222 q15 0 28.5 -20.5t19.5 -40.5l131 -339h293l106 89v502l-342 237l-87 -83l145 -184q10 -11 10 -26q0 -11 -5 -20q-1 -3 -3.5 -5.5l-4 -4t-5 -2.5t-5.5 -1.5t-6.5 -1t-6.5 -0.5h-7.5h-6.5h-476v-100zM999 201v600h200v-600h-200z" />
+<glyph unicode="" d="M97 719l230 -363q4 -6 10.5 -15.5t26 -25t36.5 -15.5h525q13 0 94 83t81 90v342q0 15 -20 28.5t-41 19.5l-339 131v106q0 84 -39 139t-111 55t-110 -53.5t-38 -138.5v-302l-121 84q-15 12 -33.5 11.5t-32.5 -13.5l-112 -110q-22 -22 -6 -53zM172 739l83 86l183 -146 q22 -18 47 -5q3 1 5.5 3.5l4 4t2.5 5t1.5 5.5t1 6.5t0.5 6v7.5v7v456q0 22 25 31t50 -0.5t25 -30.5v-202q0 -16 20 -29.5t41 -19.5l339 -130v-294l-89 -100h-503zM400 0v200h600v-200h-600z" />
+<glyph unicode="" d="M1 585q-15 -31 7 -53l112 -110q13 -13 32 -13.5t34 10.5l121 85l-1 -302q0 -84 38.5 -138t110.5 -54t111 55t39 139v106l339 131q20 6 40.5 19.5t20.5 28.5v342q0 7 -81 90t-94 83h-525q-17 0 -35.5 -14t-28.5 -28l-10 -15zM76 565l237 339h503l89 -100v-294l-340 -130 q-20 -6 -40 -20t-20 -29v-202q0 -22 -25 -31t-50 0t-25 31v456v14.5t-1.5 11.5t-5 12t-9.5 7q-24 13 -46 -5l-184 -146zM305 1104v200h600v-200h-600z" />
+<glyph unicode="" d="M5 597q0 122 47.5 232.5t127.5 190.5t190.5 127.5t232.5 47.5q162 0 299.5 -80t217.5 -218t80 -300t-80 -299.5t-217.5 -217.5t-299.5 -80t-300 80t-218 217.5t-80 299.5zM300 500h300l-2 -194l402 294l-402 298v-197h-298v-201z" />
+<glyph unicode="" d="M0 597q0 122 47.5 232.5t127.5 190.5t190.5 127.5t231.5 47.5q122 0 232.5 -47.5t190.5 -127.5t127.5 -190.5t47.5 -232.5q0 -162 -80 -299.5t-218 -217.5t-300 -80t-299.5 80t-217.5 217.5t-80 299.5zM200 600l400 -294v194h302v201h-300v197z" />
+<glyph unicode="" d="M5 597q0 122 47.5 232.5t127.5 190.5t190.5 127.5t232.5 47.5q121 0 231.5 -47.5t190.5 -127.5t127.5 -190.5t47.5 -232.5q0 -162 -80 -299.5t-217.5 -217.5t-299.5 -80t-300 80t-218 217.5t-80 299.5zM300 600h200v-300h200v300h200l-300 400z" />
+<glyph unicode="" d="M5 597q0 122 47.5 232.5t127.5 190.5t190.5 127.5t232.5 47.5q121 0 231.5 -47.5t190.5 -127.5t127.5 -190.5t47.5 -232.5q0 -162 -80 -299.5t-217.5 -217.5t-299.5 -80t-300 80t-218 217.5t-80 299.5zM300 600l300 -400l300 400h-200v300h-200v-300h-200z" />
+<glyph unicode="" d="M5 597q0 122 47.5 232.5t127.5 190.5t190.5 127.5t232.5 47.5q121 0 231.5 -47.5t190.5 -127.5t127.5 -190.5t47.5 -232.5q0 -162 -80 -299.5t-217.5 -217.5t-299.5 -80t-300 80t-218 217.5t-80 299.5zM254 780q-8 -34 5.5 -93t7.5 -87q0 -9 17 -44t16 -60q12 0 23 -5.5 t23 -15t20 -13.5q20 -10 108 -42q22 -8 53 -31.5t59.5 -38.5t57.5 -11q8 -18 -15 -55.5t-20 -57.5q12 -21 22.5 -34.5t28 -27t36.5 -17.5q0 -6 -3 -15.5t-3.5 -14.5t4.5 -17q101 -2 221 111q31 30 47 48t34 49t21 62q-14 9 -37.5 9.5t-35.5 7.5q-14 7 -49 15t-52 19 q-9 0 -39.5 -0.5t-46.5 -1.5t-39 -6.5t-39 -16.5q-50 -35 -66 -12q-4 2 -3.5 25.5t0.5 25.5q-6 13 -26.5 17t-24.5 7q2 22 -2 41t-16.5 28t-38.5 -20q-23 -25 -42 4q-19 28 -8 58q8 16 22 22q6 -1 26 -1.5t33.5 -4.5t19.5 -13q12 -19 32 -37.5t34 -27.5l14 -8q0 3 9.5 39.5 t5.5 57.5q-4 23 14.5 44.5t22.5 31.5q5 14 10 35t8.5 31t15.5 22.5t34 21.5q-6 18 10 37q8 0 23.5 -1.5t24.5 -1.5t20.5 4.5t20.5 15.5q-10 23 -30.5 42.5t-38 30t-49 26.5t-43.5 23q11 41 1 44q31 -13 58.5 -14.5t39.5 3.5l11 4q6 36 -17 53.5t-64 28.5t-56 23 q-19 -3 -37 0q-15 -12 -36.5 -21t-34.5 -12t-44 -8t-39 -6q-15 -3 -46 0t-45 -3q-20 -6 -51.5 -25.5t-34.5 -34.5q-3 -11 6.5 -22.5t8.5 -18.5q-3 -34 -27.5 -91t-29.5 -79zM518 915q3 12 16 30.5t16 25.5q10 -10 18.5 -10t14 6t14.5 14.5t16 12.5q0 -18 8 -42.5t16.5 -44 t9.5 -23.5q-6 1 -39 5t-53.5 10t-36.5 16z" />
+<glyph unicode="" d="M0 164.5q0 21.5 15 37.5l600 599q-33 101 6 201.5t135 154.5q164 92 306 -9l-259 -138l145 -232l251 126q13 -175 -151 -267q-123 -70 -253 -23l-596 -596q-15 -16 -36.5 -16t-36.5 16l-111 110q-15 15 -15 36.5z" />
+<glyph unicode="" horiz-adv-x="1220" d="M0 196v100q0 41 29.5 70.5t70.5 29.5h1000q41 0 70.5 -29.5t29.5 -70.5v-100q0 -41 -29.5 -70.5t-70.5 -29.5h-1000q-41 0 -70.5 29.5t-29.5 70.5zM0 596v100q0 41 29.5 70.5t70.5 29.5h1000q41 0 70.5 -29.5t29.5 -70.5v-100q0 -41 -29.5 -70.5t-70.5 -29.5h-1000 q-41 0 -70.5 29.5t-29.5 70.5zM0 996v100q0 41 29.5 70.5t70.5 29.5h1000q41 0 70.5 -29.5t29.5 -70.5v-100q0 -41 -29.5 -70.5t-70.5 -29.5h-1000q-41 0 -70.5 29.5t-29.5 70.5zM600 596h500v100h-500v-100zM800 196h300v100h-300v-100zM900 996h200v100h-200v-100z" />
+<glyph unicode="" d="M100 1100v100h1000v-100h-1000zM150 1000h900l-350 -500v-300l-200 -200v500z" />
+<glyph unicode="" d="M0 200v200h1200v-200q0 -41 -29.5 -70.5t-70.5 -29.5h-1000q-41 0 -70.5 29.5t-29.5 70.5zM0 500v400q0 41 29.5 70.5t70.5 29.5h300v100q0 41 29.5 70.5t70.5 29.5h200q41 0 70.5 -29.5t29.5 -70.5v-100h300q41 0 70.5 -29.5t29.5 -70.5v-400h-500v100h-200v-100h-500z M500 1000h200v100h-200v-100z" />
+<glyph unicode="" d="M0 0v400l129 -129l200 200l142 -142l-200 -200l129 -129h-400zM0 800l129 129l200 -200l142 142l-200 200l129 129h-400v-400zM729 329l142 142l200 -200l129 129v-400h-400l129 129zM729 871l200 200l-129 129h400v-400l-129 129l-200 -200z" />
+<glyph unicode="" d="M0 596q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM182 596q0 -172 121.5 -293t292.5 -121t292.5 121t121.5 293q0 171 -121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM291 655 q0 23 15.5 38.5t38.5 15.5t39 -16t16 -38q0 -23 -16 -39t-39 -16q-22 0 -38 16t-16 39zM400 850q0 22 16 38.5t39 16.5q22 0 38 -16t16 -39t-16 -39t-38 -16q-23 0 -39 16.5t-16 38.5zM513 609q0 32 21 56.5t52 29.5l122 126l1 1q-9 14 -9 28q0 22 16 38.5t39 16.5 q22 0 38 -16t16 -39t-16 -39t-38 -16q-16 0 -29 10l-55 -145q17 -22 17 -51q0 -36 -25.5 -61.5t-61.5 -25.5q-37 0 -62.5 25.5t-25.5 61.5zM800 655q0 22 16 38t39 16t38.5 -15.5t15.5 -38.5t-16 -39t-38 -16q-23 0 -39 16t-16 39z" />
+<glyph unicode="" d="M-40 375q-13 -95 35 -173q35 -57 94 -89t129 -32q63 0 119 28q33 16 65 40.5t52.5 45.5t59.5 64q40 44 57 61l394 394q35 35 47 84t-3 96q-27 87 -117 104q-20 2 -29 2q-46 0 -79.5 -17t-67.5 -51l-388 -396l-7 -7l69 -67l377 373q20 22 39 38q23 23 50 23q38 0 53 -36 q16 -39 -20 -75l-547 -547q-52 -52 -125 -52q-55 0 -100 33t-54 96q-5 35 2.5 66t31.5 63t42 50t56 54q24 21 44 41l348 348q52 52 82.5 79.5t84 54t107.5 26.5q25 0 48 -4q95 -17 154 -94.5t51 -175.5q-7 -101 -98 -192l-252 -249l-253 -256l7 -7l69 -60l517 511 q67 67 95 157t11 183q-16 87 -67 154t-130 103q-69 33 -152 33q-107 0 -197 -55q-40 -24 -111 -95l-512 -512q-68 -68 -81 -163z" />
+<glyph unicode="" d="M79 784q0 131 99 229.5t230 98.5q144 0 242 -129q103 129 245 129q130 0 227 -98.5t97 -229.5q0 -46 -17.5 -91t-61 -99t-77 -89.5t-104.5 -105.5q-197 -191 -293 -322l-17 -23l-16 23q-43 58 -100 122.5t-92 99.5t-101 100l-84.5 84.5t-68 74t-60 78t-33.5 70.5t-15 78z M250 784q0 -27 30.5 -70t61.5 -75.5t95 -94.5l22 -22q93 -90 190 -201q82 92 195 203l12 12q64 62 97.5 97t64.5 79t31 72q0 71 -48 119.5t-106 48.5q-73 0 -131 -83l-118 -171l-114 174q-51 80 -124 80q-59 0 -108.5 -49.5t-49.5 -118.5z" />
+<glyph unicode="" d="M57 353q0 -94 66 -160l141 -141q66 -66 159 -66q95 0 159 66l283 283q66 66 66 159t-66 159l-141 141q-12 12 -19 17l-105 -105l212 -212l-389 -389l-247 248l95 95l-18 18q-46 45 -75 101l-55 -55q-66 -66 -66 -159zM269 706q0 -93 66 -159l141 -141l19 -17l105 105 l-212 212l389 389l247 -247l-95 -96l18 -18q46 -46 77 -99l29 29q35 35 62.5 88t27.5 96q0 93 -66 159l-141 141q-66 66 -159 66q-95 0 -159 -66l-283 -283q-66 -64 -66 -159z" />
+<glyph unicode="" d="M200 100v953q0 21 30 46t81 48t129 38t163 15t162 -15t127 -38t79 -48t29 -46v-953q0 -41 -29.5 -70.5t-70.5 -29.5h-600q-41 0 -70.5 29.5t-29.5 70.5zM300 300h600v700h-600v-700zM496 150q0 -43 30.5 -73.5t73.5 -30.5t73.5 30.5t30.5 73.5t-30.5 73.5t-73.5 30.5 t-73.5 -30.5t-30.5 -73.5z" />
+<glyph unicode="" d="M0 0l303 380l207 208l-210 212h300l267 279l-35 36q-15 14 -15 35t15 35q14 15 35 15t35 -15l283 -282q15 -15 15 -36t-15 -35q-14 -15 -35 -15t-35 15l-36 35l-279 -267v-300l-212 210l-208 -207z" />
+<glyph unicode="" d="M295 433h139q5 -77 48.5 -126.5t117.5 -64.5v335l-27 7q-46 14 -79 26.5t-72 36t-62.5 52t-40 72.5t-16.5 99q0 92 44 159.5t109 101t144 40.5v78h100v-79q38 -4 72.5 -13.5t75.5 -31.5t71 -53.5t51.5 -84t24.5 -118.5h-159q-8 72 -35 109.5t-101 50.5v-307l64 -14 q34 -7 64 -16.5t70 -31.5t67.5 -52t47.5 -80.5t20 -112.5q0 -139 -89 -224t-244 -96v-77h-100v78q-152 17 -237 104q-40 40 -52.5 93.5t-15.5 139.5zM466 889q0 -29 8 -51t16.5 -34t29.5 -22.5t31 -13.5t38 -10q7 -2 11 -3v274q-61 -8 -97.5 -37.5t-36.5 -102.5zM700 237 q170 18 170 151q0 64 -44 99.5t-126 60.5v-311z" />
+<glyph unicode="" d="M100 600v100h166q-24 49 -44 104q-10 26 -14.5 55.5t-3 72.5t25 90t68.5 87q97 88 263 88q129 0 230 -89t101 -208h-153q0 52 -34 89.5t-74 51.5t-76 14q-37 0 -79 -14.5t-62 -35.5q-41 -44 -41 -101q0 -11 2.5 -24.5t5.5 -24t9.5 -26.5t10.5 -25t14 -27.5t14 -25.5 t15.5 -27t13.5 -24h242v-100h-197q8 -50 -2.5 -115t-31.5 -94q-41 -59 -99 -113q35 11 84 18t70 7q32 1 102 -16t104 -17q76 0 136 30l50 -147q-41 -25 -80.5 -36.5t-59 -13t-61.5 -1.5q-23 0 -128 33t-155 29q-39 -4 -82 -17t-66 -25l-24 -11l-55 145l16.5 11t15.5 10 t13.5 9.5t14.5 12t14.5 14t17.5 18.5q48 55 54 126.5t-30 142.5h-221z" />
+<glyph unicode="" d="M2 300l298 -300l298 300h-198v900h-200v-900h-198zM602 900l298 300l298 -300h-198v-900h-200v900h-198z" />
+<glyph unicode="" d="M2 300h198v900h200v-900h198l-298 -300zM700 0v200h100v-100h200v-100h-300zM700 400v100h300v-200h-99v-100h-100v100h99v100h-200zM700 700v500h300v-500h-100v100h-100v-100h-100zM801 900h100v200h-100v-200z" />
+<glyph unicode="" d="M2 300h198v900h200v-900h198l-298 -300zM700 0v500h300v-500h-100v100h-100v-100h-100zM700 700v200h100v-100h200v-100h-300zM700 1100v100h300v-200h-99v-100h-100v100h99v100h-200zM801 200h100v200h-100v-200z" />
+<glyph unicode="" d="M2 300l298 -300l298 300h-198v900h-200v-900h-198zM800 100v400h300v-500h-100v100h-200zM800 1100v100h200v-500h-100v400h-100zM901 200h100v200h-100v-200z" />
+<glyph unicode="" d="M2 300l298 -300l298 300h-198v900h-200v-900h-198zM800 400v100h200v-500h-100v400h-100zM800 800v400h300v-500h-100v100h-200zM901 900h100v200h-100v-200z" />
+<glyph unicode="" d="M2 300l298 -300l298 300h-198v900h-200v-900h-198zM700 100v200h500v-200h-500zM700 400v200h400v-200h-400zM700 700v200h300v-200h-300zM700 1000v200h200v-200h-200z" />
+<glyph unicode="" d="M2 300l298 -300l298 300h-198v900h-200v-900h-198zM700 100v200h200v-200h-200zM700 400v200h300v-200h-300zM700 700v200h400v-200h-400zM700 1000v200h500v-200h-500z" />
+<glyph unicode="" d="M0 400v300q0 165 117.5 282.5t282.5 117.5h300q162 0 281 -118.5t119 -281.5v-300q0 -165 -118.5 -282.5t-281.5 -117.5h-300q-165 0 -282.5 117.5t-117.5 282.5zM200 300q0 -41 29.5 -70.5t70.5 -29.5h500q41 0 70.5 29.5t29.5 70.5v500q0 41 -29.5 70.5t-70.5 29.5 h-500q-41 0 -70.5 -29.5t-29.5 -70.5v-500z" />
+<glyph unicode="" d="M0 400v300q0 163 119 281.5t281 118.5h300q165 0 282.5 -117.5t117.5 -282.5v-300q0 -165 -117.5 -282.5t-282.5 -117.5h-300q-163 0 -281.5 117.5t-118.5 282.5zM200 300q0 -41 29.5 -70.5t70.5 -29.5h500q41 0 70.5 29.5t29.5 70.5v500q0 41 -29.5 70.5t-70.5 29.5 h-500q-41 0 -70.5 -29.5t-29.5 -70.5v-500zM400 300l333 250l-333 250v-500z" />
+<glyph unicode="" d="M0 400v300q0 163 117.5 281.5t282.5 118.5h300q163 0 281.5 -119t118.5 -281v-300q0 -165 -117.5 -282.5t-282.5 -117.5h-300q-165 0 -282.5 117.5t-117.5 282.5zM200 300q0 -41 29.5 -70.5t70.5 -29.5h500q41 0 70.5 29.5t29.5 70.5v500q0 41 -29.5 70.5t-70.5 29.5 h-500q-41 0 -70.5 -29.5t-29.5 -70.5v-500zM300 700l250 -333l250 333h-500z" />
+<glyph unicode="" d="M0 400v300q0 165 117.5 282.5t282.5 117.5h300q165 0 282.5 -117.5t117.5 -282.5v-300q0 -162 -118.5 -281t-281.5 -119h-300q-165 0 -282.5 118.5t-117.5 281.5zM200 300q0 -41 29.5 -70.5t70.5 -29.5h500q41 0 70.5 29.5t29.5 70.5v500q0 41 -29.5 70.5t-70.5 29.5 h-500q-41 0 -70.5 -29.5t-29.5 -70.5v-500zM300 400h500l-250 333z" />
+<glyph unicode="" d="M0 400v300h300v200l400 -350l-400 -350v200h-300zM500 0v200h500q41 0 70.5 29.5t29.5 70.5v500q0 41 -29.5 70.5t-70.5 29.5h-500v200h400q165 0 282.5 -117.5t117.5 -282.5v-300q0 -165 -117.5 -282.5t-282.5 -117.5h-400z" />
+<glyph unicode="" d="M216 519q10 -19 32 -19h302q-155 -438 -160 -458q-5 -21 4 -32l9 -8l9 -1q13 0 26 16l538 630q15 19 6 36q-8 18 -32 16h-300q1 4 78 219.5t79 227.5q2 17 -6 27l-8 8h-9q-16 0 -25 -15q-4 -5 -98.5 -111.5t-228 -257t-209.5 -238.5q-17 -19 -7 -40z" />
+<glyph unicode="" d="M0 400q0 -165 117.5 -282.5t282.5 -117.5h300q47 0 100 15v185h-500q-41 0 -70.5 29.5t-29.5 70.5v500q0 41 29.5 70.5t70.5 29.5h500v185q-14 4 -114 7.5t-193 5.5l-93 2q-165 0 -282.5 -117.5t-117.5 -282.5v-300zM600 400v300h300v200l400 -350l-400 -350v200h-300z " />
+<glyph unicode="" d="M0 400q0 -165 117.5 -282.5t282.5 -117.5h300q163 0 281.5 117.5t118.5 282.5v98l-78 73l-122 -123v-148q0 -41 -29.5 -70.5t-70.5 -29.5h-500q-41 0 -70.5 29.5t-29.5 70.5v500q0 41 29.5 70.5t70.5 29.5h156l118 122l-74 78h-100q-165 0 -282.5 -117.5t-117.5 -282.5 v-300zM496 709l353 342l-149 149h500v-500l-149 149l-342 -353z" />
+<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -171 121.5 -292.5t292.5 -121.5t292.5 121.5t121.5 292.5t-121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM406 600 q0 80 57 137t137 57t137 -57t57 -137t-57 -137t-137 -57t-137 57t-57 137z" />
+<glyph unicode="" d="M0 0v275q0 11 7 18t18 7h1048q11 0 19 -7.5t8 -17.5v-275h-1100zM100 800l445 -500l450 500h-295v400h-300v-400h-300zM900 150h100v50h-100v-50z" />
+<glyph unicode="" d="M0 0v275q0 11 7 18t18 7h1048q11 0 19 -7.5t8 -17.5v-275h-1100zM100 700h300v-300h300v300h295l-445 500zM900 150h100v50h-100v-50z" />
+<glyph unicode="" d="M0 0v275q0 11 7 18t18 7h1048q11 0 19 -7.5t8 -17.5v-275h-1100zM100 705l305 -305l596 596l-154 155l-442 -442l-150 151zM900 150h100v50h-100v-50z" />
+<glyph unicode="" d="M0 0v275q0 11 7 18t18 7h1048q11 0 19 -7.5t8 -17.5v-275h-1100zM100 988l97 -98l212 213l-97 97zM200 401h700v699l-250 -239l-149 149l-212 -212l149 -149zM900 150h100v50h-100v-50z" />
+<glyph unicode="" d="M0 0v275q0 11 7 18t18 7h1048q11 0 19 -7.5t8 -17.5v-275h-1100zM200 612l212 -212l98 97l-213 212zM300 1200l239 -250l-149 -149l212 -212l149 148l248 -237v700h-699zM900 150h100v50h-100v-50z" />
+<glyph unicode="" d="M23 415l1177 784v-1079l-475 272l-310 -393v416h-392zM494 210l672 938l-672 -712v-226z" />
+<glyph unicode="" d="M0 150v1000q0 20 14.5 35t35.5 15h250v-300h500v300h100l200 -200v-850q0 -21 -15 -35.5t-35 -14.5h-150v400h-700v-400h-150q-21 0 -35.5 14.5t-14.5 35.5zM600 1000h100v200h-100v-200z" />
+<glyph unicode="" d="M0 150v1000q0 20 14.5 35t35.5 15h250v-300h500v300h100l200 -200v-218l-276 -275l-120 120l-126 -127h-378v-400h-150q-21 0 -35.5 14.5t-14.5 35.5zM581 306l123 123l120 -120l353 352l123 -123l-475 -476zM600 1000h100v200h-100v-200z" />
+<glyph unicode="" d="M0 150v1000q0 20 14.5 35t35.5 15h250v-300h500v300h100l200 -200v-269l-103 -103l-170 170l-298 -298h-329v-400h-150q-21 0 -35.5 14.5t-14.5 35.5zM600 1000h100v200h-100v-200zM700 133l170 170l-170 170l127 127l170 -170l170 170l127 -128l-170 -169l170 -170 l-127 -127l-170 170l-170 -170z" />
+<glyph unicode="" d="M0 150v1000q0 20 14.5 35t35.5 15h250v-300h500v300h100l200 -200v-300h-400v-200h-500v-400h-150q-21 0 -35.5 14.5t-14.5 35.5zM600 300l300 -300l300 300h-200v300h-200v-300h-200zM600 1000v200h100v-200h-100z" />
+<glyph unicode="" d="M0 150v1000q0 20 14.5 35t35.5 15h250v-300h500v300h100l200 -200v-402l-200 200l-298 -298h-402v-400h-150q-21 0 -35.5 14.5t-14.5 35.5zM600 300h200v-300h200v300h200l-300 300zM600 1000v200h100v-200h-100z" />
+<glyph unicode="" d="M0 250q0 -21 14.5 -35.5t35.5 -14.5h1100q21 0 35.5 14.5t14.5 35.5v550h-1200v-550zM0 900h1200v150q0 21 -14.5 35.5t-35.5 14.5h-1100q-21 0 -35.5 -14.5t-14.5 -35.5v-150zM100 300v200h400v-200h-400z" />
+<glyph unicode="" d="M0 400l300 298v-198h400v-200h-400v-198zM100 800v200h100v-200h-100zM300 800v200h100v-200h-100zM500 800v200h400v198l300 -298l-300 -298v198h-400zM800 300v200h100v-200h-100zM1000 300h100v200h-100v-200z" />
+<glyph unicode="" d="M100 700v400l50 100l50 -100v-300h100v300l50 100l50 -100v-300h100v300l50 100l50 -100v-400l-100 -203v-447q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5v447zM800 597q0 -29 10.5 -55.5t25 -43t29 -28.5t25.5 -18l10 -5v-397q0 -21 14.5 -35.5 t35.5 -14.5h200q21 0 35.5 14.5t14.5 35.5v1106q0 31 -18 40.5t-44 -7.5l-276 -117q-25 -16 -43.5 -50.5t-18.5 -65.5v-359z" />
+<glyph unicode="" d="M100 0h400v56q-75 0 -87.5 6t-12.5 44v394h500v-394q0 -38 -12.5 -44t-87.5 -6v-56h400v56q-4 0 -11 0.5t-24 3t-30 7t-24 15t-11 24.5v888q0 22 25 34.5t50 13.5l25 2v56h-400v-56q75 0 87.5 -6t12.5 -44v-394h-500v394q0 38 12.5 44t87.5 6v56h-400v-56q4 0 11 -0.5 t24 -3t30 -7t24 -15t11 -24.5v-888q0 -22 -25 -34.5t-50 -13.5l-25 -2v-56z" />
+<glyph unicode="" d="M0 300q0 -41 29.5 -70.5t70.5 -29.5h300q41 0 70.5 29.5t29.5 70.5v500q0 41 -29.5 70.5t-70.5 29.5h-300q-41 0 -70.5 -29.5t-29.5 -70.5v-500zM100 100h400l200 200h105l295 98v-298h-425l-100 -100h-375zM100 300v200h300v-200h-300zM100 600v200h300v-200h-300z M100 1000h400l200 -200v-98l295 98h105v200h-425l-100 100h-375zM700 402v163l400 133v-163z" />
+<glyph unicode="" d="M16.5 974.5q0.5 -21.5 16 -90t46.5 -140t104 -177.5t175 -208q103 -103 207.5 -176t180 -103.5t137 -47t92.5 -16.5l31 1l163 162q16 17 13 40.5t-22 37.5l-192 136q-19 14 -45 12t-42 -19l-119 -118q-143 103 -267 227q-126 126 -227 268l118 118q17 17 20 41.5 t-11 44.5l-139 194q-14 19 -36.5 22t-40.5 -14l-162 -162q-1 -11 -0.5 -32.5z" />
+<glyph unicode="" d="M0 50v212q0 20 10.5 45.5t24.5 39.5l365 303v50q0 4 1 10.5t12 22.5t30 28.5t60 23t97 10.5t97 -10t60 -23.5t30 -27.5t12 -24l1 -10v-50l365 -303q14 -14 24.5 -39.5t10.5 -45.5v-212q0 -21 -15 -35.5t-35 -14.5h-1100q-21 0 -35.5 14.5t-14.5 35.5zM0 712 q0 -21 14.5 -33.5t34.5 -8.5l202 33q20 4 34.5 21t14.5 38v146q141 24 300 24t300 -24v-146q0 -21 14.5 -38t34.5 -21l202 -33q20 -4 34.5 8.5t14.5 33.5v200q-6 8 -19 20.5t-63 45t-112 57t-171 45t-235 20.5q-92 0 -175 -10.5t-141.5 -27t-108.5 -36.5t-81.5 -40 t-53.5 -36.5t-31 -27.5l-9 -10v-200z" />
+<glyph unicode="" d="M100 0v100h1100v-100h-1100zM175 200h950l-125 150v250l100 100v400h-100v-200h-100v200h-200v-200h-100v200h-200v-200h-100v200h-100v-400l100 -100v-250z" />
+<glyph unicode="" d="M100 0h300v400q0 41 -29.5 70.5t-70.5 29.5h-100q-41 0 -70.5 -29.5t-29.5 -70.5v-400zM500 0v1000q0 41 29.5 70.5t70.5 29.5h100q41 0 70.5 -29.5t29.5 -70.5v-1000h-300zM900 0v700q0 41 29.5 70.5t70.5 29.5h100q41 0 70.5 -29.5t29.5 -70.5v-700h-300z" />
+<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 300h300v300h-200v100h200v100h-300v-300h200v-100h-200v-100zM600 300h200v100h100v300h-100v100h-200v-500 zM700 400v300h100v-300h-100z" />
+<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 300h100v200h100v-200h100v500h-100v-200h-100v200h-100v-500zM600 300h200v100h100v300h-100v100h-200v-500 zM700 400v300h100v-300h-100z" />
+<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 300h300v100h-200v300h200v100h-300v-500zM600 300h300v100h-200v300h200v100h-300v-500z" />
+<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 550l300 -150v300zM600 400l300 150l-300 150v-300z" />
+<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 300v500h700v-500h-700zM300 400h130q41 0 68 42t27 107t-28.5 108t-66.5 43h-130v-300zM575 549 q0 -65 27 -107t68 -42h130v300h-130q-38 0 -66.5 -43t-28.5 -108z" />
+<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 300h300v300h-200v100h200v100h-300v-300h200v-100h-200v-100zM601 300h100v100h-100v-100zM700 700h100 v-400h100v500h-200v-100z" />
+<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 300h300v400h-200v100h-100v-500zM301 400v200h100v-200h-100zM601 300h100v100h-100v-100zM700 700h100 v-400h100v500h-200v-100z" />
+<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 700v100h300v-300h-99v-100h-100v100h99v200h-200zM201 300v100h100v-100h-100zM601 300v100h100v-100h-100z M700 700v100h200v-500h-100v400h-100z" />
+<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -171 121.5 -292.5t292.5 -121.5t292.5 121.5t121.5 292.5t-121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM400 500v200 l100 100h300v-100h-300v-200h300v-100h-300z" />
+<glyph unicode="" d="M0 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM182 600q0 -171 121.5 -292.5t292.5 -121.5t292.5 121.5t121.5 292.5t-121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM400 400v400h300 l100 -100v-100h-100v100h-200v-100h200v-100h-200v-100h-100zM700 400v100h100v-100h-100z" />
+<glyph unicode="" d="M-14 494q0 -80 56.5 -137t135.5 -57h222v300h400v-300h128q120 0 205 86t85 208q0 120 -85 206.5t-205 86.5q-46 0 -90 -14q-44 97 -134.5 156.5t-200.5 59.5q-152 0 -260 -107.5t-108 -260.5q0 -25 2 -37q-66 -14 -108.5 -67.5t-42.5 -122.5zM300 200h200v300h200v-300 h200l-300 -300z" />
+<glyph unicode="" d="M-14 494q0 -80 56.5 -137t135.5 -57h8l414 414l403 -403q94 26 154.5 104t60.5 178q0 121 -85 207.5t-205 86.5q-46 0 -90 -14q-44 97 -134.5 156.5t-200.5 59.5q-152 0 -260 -107.5t-108 -260.5q0 -25 2 -37q-66 -14 -108.5 -67.5t-42.5 -122.5zM300 200l300 300 l300 -300h-200v-300h-200v300h-200z" />
+<glyph unicode="" d="M100 200h400v-155l-75 -45h350l-75 45v155h400l-270 300h170l-270 300h170l-300 333l-300 -333h170l-270 -300h170z" />
+<glyph unicode="" d="M121 700q0 -53 28.5 -97t75.5 -65q-4 -16 -4 -38q0 -74 52.5 -126.5t126.5 -52.5q56 0 100 30v-306l-75 -45h350l-75 45v306q46 -30 100 -30q74 0 126.5 52.5t52.5 126.5q0 24 -9 55q50 32 79.5 83t29.5 112q0 90 -61.5 155.5t-150.5 71.5q-26 89 -99.5 145.5 t-167.5 56.5q-116 0 -197.5 -81.5t-81.5 -197.5q0 -4 1 -12t1 -11q-14 2 -23 2q-74 0 -126.5 -52.5t-52.5 -126.5z" />
+</font>
+</defs></svg>
\ No newline at end of file
diff --git a/bitbake/lib/toaster/toastergui/static/fonts/glyphicons-halflings-regular.ttf b/bitbake/lib/toaster/toastergui/static/fonts/glyphicons-halflings-regular.ttf
new file mode 100644
index 0000000..a498ef4
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/fonts/glyphicons-halflings-regular.ttf
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/fonts/glyphicons-halflings-regular.woff b/bitbake/lib/toaster/toastergui/static/fonts/glyphicons-halflings-regular.woff
new file mode 100644
index 0000000..d83c539
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/fonts/glyphicons-halflings-regular.woff
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/html/layer_deps_modal.html b/bitbake/lib/toaster/toastergui/static/html/layer_deps_modal.html
new file mode 100644
index 0000000..e1dba43
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/html/layer_deps_modal.html
@@ -0,0 +1,17 @@
+<div id="dependencies-modal" class="modal hide fade" tabindex="-1" role="dialog" aria-hidden="false">
+ <form id="dependencies-modal-form">
+ <div class="modal-header">
+ <button type="button" class="close" data-dismiss="modal" aria-hidden="true">x</button>
+ <h3><span id="title"></span> dependencies</h3>
+ </div>
+ <div class="modal-body">
+ <p id="body-text"> <strong id="layer-name"></strong> depends on some layers that are not added to your project. Select the ones you want to add:</p>
+ <ul class="unstyled" id="dependencies-list">
+ </ul>
+ </div>
+ <div class="modal-footer">
+ <button class="btn btn-primary" type="submit">Add layers</button>
+ <button class="btn" type="reset" data-dismiss="modal">Cancel</button>
+ </div>
+ </form>
+</div>
diff --git a/bitbake/lib/toaster/toastergui/static/img/glyphicons-halflings-white.png b/bitbake/lib/toaster/toastergui/static/img/glyphicons-halflings-white.png
new file mode 100644
index 0000000..3bf6484
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/img/glyphicons-halflings-white.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/img/glyphicons-halflings.png b/bitbake/lib/toaster/toastergui/static/img/glyphicons-halflings.png
new file mode 100644
index 0000000..a996999
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/img/glyphicons-halflings.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/img/logo.png b/bitbake/lib/toaster/toastergui/static/img/logo.png
new file mode 100644
index 0000000..35ad733
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/img/logo.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/img/toaster_bw.png b/bitbake/lib/toaster/toastergui/static/img/toaster_bw.png
new file mode 100644
index 0000000..b465907
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/img/toaster_bw.png
Binary files differ
diff --git a/bitbake/lib/toaster/toastergui/static/jquery-treetable-license/GPL-LICENSE.txt b/bitbake/lib/toaster/toastergui/static/jquery-treetable-license/GPL-LICENSE.txt
new file mode 100644
index 0000000..76927f5
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/jquery-treetable-license/GPL-LICENSE.txt
@@ -0,0 +1,278 @@
+ GNU GENERAL PUBLIC LICENSE
+ Version 2, June 1991
+
+ Copyright (C) 1989, 1991 Free Software Foundation, Inc.
+ 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ Everyone is permitted to copy and distribute verbatim copies
+ of this license document, but changing it is not allowed.
+
+ Preamble
+
+ The licenses for most software are designed to take away your
+freedom to share and change it. By contrast, the GNU General Public
+License is intended to guarantee your freedom to share and change free
+software--to make sure the software is free for all its users. This
+General Public License applies to most of the Free Software
+Foundation's software and to any other program whose authors commit to
+using it. (Some other Free Software Foundation software is covered by
+the GNU Lesser General Public License instead.) You can apply it to
+your programs, too.
+
+ When we speak of free software, we are referring to freedom, not
+price. Our General Public Licenses are designed to make sure that you
+have the freedom to distribute copies of free software (and charge for
+this service if you wish), that you receive source code or can get it
+if you want it, that you can change the software or use pieces of it
+in new free programs; and that you know you can do these things.
+
+ To protect your rights, we need to make restrictions that forbid
+anyone to deny you these rights or to ask you to surrender the rights.
+These restrictions translate to certain responsibilities for you if you
+distribute copies of the software, or if you modify it.
+
+ For example, if you distribute copies of such a program, whether
+gratis or for a fee, you must give the recipients all the rights that
+you have. You must make sure that they, too, receive or can get the
+source code. And you must show them these terms so they know their
+rights.
+
+ We protect your rights with two steps: (1) copyright the software, and
+(2) offer you this license which gives you legal permission to copy,
+distribute and/or modify the software.
+
+ Also, for each author's protection and ours, we want to make certain
+that everyone understands that there is no warranty for this free
+software. If the software is modified by someone else and passed on, we
+want its recipients to know that what they have is not the original, so
+that any problems introduced by others will not reflect on the original
+authors' reputations.
+
+ Finally, any free program is threatened constantly by software
+patents. We wish to avoid the danger that redistributors of a free
+program will individually obtain patent licenses, in effect making the
+program proprietary. To prevent this, we have made it clear that any
+patent must be licensed for everyone's free use or not licensed at all.
+
+ The precise terms and conditions for copying, distribution and
+modification follow.
+
+ GNU GENERAL PUBLIC LICENSE
+ TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
+
+ 0. This License applies to any program or other work which contains
+a notice placed by the copyright holder saying it may be distributed
+under the terms of this General Public License. The "Program", below,
+refers to any such program or work, and a "work based on the Program"
+means either the Program or any derivative work under copyright law:
+that is to say, a work containing the Program or a portion of it,
+either verbatim or with modifications and/or translated into another
+language. (Hereinafter, translation is included without limitation in
+the term "modification".) Each licensee is addressed as "you".
+
+Activities other than copying, distribution and modification are not
+covered by this License; they are outside its scope. The act of
+running the Program is not restricted, and the output from the Program
+is covered only if its contents constitute a work based on the
+Program (independent of having been made by running the Program).
+Whether that is true depends on what the Program does.
+
+ 1. You may copy and distribute verbatim copies of the Program's
+source code as you receive it, in any medium, provided that you
+conspicuously and appropriately publish on each copy an appropriate
+copyright notice and disclaimer of warranty; keep intact all the
+notices that refer to this License and to the absence of any warranty;
+and give any other recipients of the Program a copy of this License
+along with the Program.
+
+You may charge a fee for the physical act of transferring a copy, and
+you may at your option offer warranty protection in exchange for a fee.
+
+ 2. You may modify your copy or copies of the Program or any portion
+of it, thus forming a work based on the Program, and copy and
+distribute such modifications or work under the terms of Section 1
+above, provided that you also meet all of these conditions:
+
+ a) You must cause the modified files to carry prominent notices
+ stating that you changed the files and the date of any change.
+
+ b) You must cause any work that you distribute or publish, that in
+ whole or in part contains or is derived from the Program or any
+ part thereof, to be licensed as a whole at no charge to all third
+ parties under the terms of this License.
+
+ c) If the modified program normally reads commands interactively
+ when run, you must cause it, when started running for such
+ interactive use in the most ordinary way, to print or display an
+ announcement including an appropriate copyright notice and a
+ notice that there is no warranty (or else, saying that you provide
+ a warranty) and that users may redistribute the program under
+ these conditions, and telling the user how to view a copy of this
+ License. (Exception: if the Program itself is interactive but
+ does not normally print such an announcement, your work based on
+ the Program is not required to print an announcement.)
+
+These requirements apply to the modified work as a whole. If
+identifiable sections of that work are not derived from the Program,
+and can be reasonably considered independent and separate works in
+themselves, then this License, and its terms, do not apply to those
+sections when you distribute them as separate works. But when you
+distribute the same sections as part of a whole which is a work based
+on the Program, the distribution of the whole must be on the terms of
+this License, whose permissions for other licensees extend to the
+entire whole, and thus to each and every part regardless of who wrote it.
+
+Thus, it is not the intent of this section to claim rights or contest
+your rights to work written entirely by you; rather, the intent is to
+exercise the right to control the distribution of derivative or
+collective works based on the Program.
+
+In addition, mere aggregation of another work not based on the Program
+with the Program (or with a work based on the Program) on a volume of
+a storage or distribution medium does not bring the other work under
+the scope of this License.
+
+ 3. You may copy and distribute the Program (or a work based on it,
+under Section 2) in object code or executable form under the terms of
+Sections 1 and 2 above provided that you also do one of the following:
+
+ a) Accompany it with the complete corresponding machine-readable
+ source code, which must be distributed under the terms of Sections
+ 1 and 2 above on a medium customarily used for software interchange; or,
+
+ b) Accompany it with a written offer, valid for at least three
+ years, to give any third party, for a charge no more than your
+ cost of physically performing source distribution, a complete
+ machine-readable copy of the corresponding source code, to be
+ distributed under the terms of Sections 1 and 2 above on a medium
+ customarily used for software interchange; or,
+
+ c) Accompany it with the information you received as to the offer
+ to distribute corresponding source code. (This alternative is
+ allowed only for noncommercial distribution and only if you
+ received the program in object code or executable form with such
+ an offer, in accord with Subsection b above.)
+
+The source code for a work means the preferred form of the work for
+making modifications to it. For an executable work, complete source
+code means all the source code for all modules it contains, plus any
+associated interface definition files, plus the scripts used to
+control compilation and installation of the executable. However, as a
+special exception, the source code distributed need not include
+anything that is normally distributed (in either source or binary
+form) with the major components (compiler, kernel, and so on) of the
+operating system on which the executable runs, unless that component
+itself accompanies the executable.
+
+If distribution of executable or object code is made by offering
+access to copy from a designated place, then offering equivalent
+access to copy the source code from the same place counts as
+distribution of the source code, even though third parties are not
+compelled to copy the source along with the object code.
+
+ 4. You may not copy, modify, sublicense, or distribute the Program
+except as expressly provided under this License. Any attempt
+otherwise to copy, modify, sublicense or distribute the Program is
+void, and will automatically terminate your rights under this License.
+However, parties who have received copies, or rights, from you under
+this License will not have their licenses terminated so long as such
+parties remain in full compliance.
+
+ 5. You are not required to accept this License, since you have not
+signed it. However, nothing else grants you permission to modify or
+distribute the Program or its derivative works. These actions are
+prohibited by law if you do not accept this License. Therefore, by
+modifying or distributing the Program (or any work based on the
+Program), you indicate your acceptance of this License to do so, and
+all its terms and conditions for copying, distributing or modifying
+the Program or works based on it.
+
+ 6. Each time you redistribute the Program (or any work based on the
+Program), the recipient automatically receives a license from the
+original licensor to copy, distribute or modify the Program subject to
+these terms and conditions. You may not impose any further
+restrictions on the recipients' exercise of the rights granted herein.
+You are not responsible for enforcing compliance by third parties to
+this License.
+
+ 7. If, as a consequence of a court judgment or allegation of patent
+infringement or for any other reason (not limited to patent issues),
+conditions are imposed on you (whether by court order, agreement or
+otherwise) that contradict the conditions of this License, they do not
+excuse you from the conditions of this License. If you cannot
+distribute so as to satisfy simultaneously your obligations under this
+License and any other pertinent obligations, then as a consequence you
+may not distribute the Program at all. For example, if a patent
+license would not permit royalty-free redistribution of the Program by
+all those who receive copies directly or indirectly through you, then
+the only way you could satisfy both it and this License would be to
+refrain entirely from distribution of the Program.
+
+If any portion of this section is held invalid or unenforceable under
+any particular circumstance, the balance of the section is intended to
+apply and the section as a whole is intended to apply in other
+circumstances.
+
+It is not the purpose of this section to induce you to infringe any
+patents or other property right claims or to contest validity of any
+such claims; this section has the sole purpose of protecting the
+integrity of the free software distribution system, which is
+implemented by public license practices. Many people have made
+generous contributions to the wide range of software distributed
+through that system in reliance on consistent application of that
+system; it is up to the author/donor to decide if he or she is willing
+to distribute software through any other system and a licensee cannot
+impose that choice.
+
+This section is intended to make thoroughly clear what is believed to
+be a consequence of the rest of this License.
+
+ 8. If the distribution and/or use of the Program is restricted in
+certain countries either by patents or by copyrighted interfaces, the
+original copyright holder who places the Program under this License
+may add an explicit geographical distribution limitation excluding
+those countries, so that distribution is permitted only in or among
+countries not thus excluded. In such case, this License incorporates
+the limitation as if written in the body of this License.
+
+ 9. The Free Software Foundation may publish revised and/or new versions
+of the General Public License from time to time. Such new versions will
+be similar in spirit to the present version, but may differ in detail to
+address new problems or concerns.
+
+Each version is given a distinguishing version number. If the Program
+specifies a version number of this License which applies to it and "any
+later version", you have the option of following the terms and conditions
+either of that version or of any later version published by the Free
+Software Foundation. If the Program does not specify a version number of
+this License, you may choose any version ever published by the Free Software
+Foundation.
+
+ 10. If you wish to incorporate parts of the Program into other free
+programs whose distribution conditions are different, write to the author
+to ask for permission. For software which is copyrighted by the Free
+Software Foundation, write to the Free Software Foundation; we sometimes
+make exceptions for this. Our decision will be guided by the two goals
+of preserving the free status of all derivatives of our free software and
+of promoting the sharing and reuse of software generally.
+
+ NO WARRANTY
+
+ 11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
+FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
+OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
+PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
+OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
+MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
+TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
+PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
+REPAIR OR CORRECTION.
+
+ 12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
+WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
+REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
+INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
+OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
+TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
+YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
+PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
+POSSIBILITY OF SUCH DAMAGES.
\ No newline at end of file
diff --git a/bitbake/lib/toaster/toastergui/static/jquery-treetable-license/MIT-LICENSE.txt b/bitbake/lib/toaster/toastergui/static/jquery-treetable-license/MIT-LICENSE.txt
new file mode 100644
index 0000000..c2e824f
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/jquery-treetable-license/MIT-LICENSE.txt
@@ -0,0 +1,20 @@
+Copyright (c) 2013 Ludo van den Boom, http://ludovandenboom.com
+
+Permission is hereby granted, free of charge, to any person obtaining
+a copy of this software and associated documentation files (the
+"Software"), to deal in the Software without restriction, including
+without limitation the rights to use, copy, modify, merge, publish,
+distribute, sublicense, and/or sell copies of the Software, and to
+permit persons to whom the Software is furnished to do so, subject to
+the following conditions:
+
+The above copyright notice and this permission notice shall be
+included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
+LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
+WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
diff --git a/bitbake/lib/toaster/toastergui/static/jquery-treetable-license/README.md b/bitbake/lib/toaster/toastergui/static/jquery-treetable-license/README.md
new file mode 100644
index 0000000..ece7afb
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/jquery-treetable-license/README.md
@@ -0,0 +1,20 @@
+# jQuery treetable
+
+jQuery treetable is a plugin for jQuery, the 'Write Less, Do More, JavaScript
+Library'. With this plugin you can display a tree in an HTML table, e.g. a
+directory structure or a nested list. Why not use a list, you say? Because lists
+are great for displaying a tree, and tables are not. Oh wait, but this plugin
+uses tables, doesn't it? Yes. Why do I use a table to display a list? Because I
+need multiple columns to display additional data besides the tree.
+
+Download the latest release from the jQuery Plugin Registry or grab the source
+code from Github. Please report issues through Github issues. This plugin is
+released under both the MIT and the GPLv2 license by Ludo van den Boom.
+
+## Documentation and Examples
+
+See index.html for technical documentation and examples. The most recent version
+of this document is also available online at
+http://ludo.cubicphuse.nl/jquery-treetable. An AJAX enabled example built with
+Ruby on Rails can be found at
+https://github.com/ludo/jquery-treetable-ajax-example.
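For context, the plugin described in the README above is normally driven by data attributes on the table rows plus a single initialisation call. The sketch below is illustrative only (the table id, row data and option value are hypothetical, not taken from this tree); it follows the conventions the jquery-treetable documentation describes, and the span.file / span.folder / indenter classes are the ones themed by the toaster stylesheet added next in this patch.

    // Hypothetical markup the plugin expects; rows are linked into a tree
    // via data-tt-id / data-tt-parent-id attributes:
    //   <table id="file-tree">
    //     <tr data-tt-id="1"><td><span class="folder">src</span></td><td>4096</td></tr>
    //     <tr data-tt-id="2" data-tt-parent-id="1"><td><span class="file">main.c</span></td><td>512</td></tr>
    //   </table>

    // Initialise the tree once the DOM is ready; "expandable: true" adds the
    // collapse/expand indenter links that the theme stylesheet targets.
    $(document).ready(function () {
      $("#file-tree").treetable({ expandable: true });
    });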
diff --git a/bitbake/lib/toaster/toastergui/static/jquery.treetable.theme.toaster.css b/bitbake/lib/toaster/toastergui/static/jquery.treetable.theme.toaster.css
new file mode 100644
index 0000000..5194b23
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/jquery.treetable.theme.toaster.css
@@ -0,0 +1,66 @@
+/*
+table.treetable {
+ border: 1px solid #888;
+ border-collapse: collapse;
+ font-size: .8em;
+ line-height: 1;
+ margin: .6em 0 1.8em 0;
+ width: 100%;
+}
+
+table.treetable caption {
+ font-size: .9em;
+ font-weight: bold;
+ margin-bottom: .2em;
+}
+
+table.treetable tbody tr td {
+ cursor: default;
+ padding: .3em 1em;
+}
+
+table.treetable span {
+ background-position: center left;
+ background-repeat: no-repeat;
+ padding: .2em 0 .2em 1.5em;
+}
+*/
+
+table.treetable span.file {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAQAAAC1+jfqAAAABGdBTUEAAK/INwWK6QAAABl0RVh0U29mdHdhcmUAQWRvYmUgSW1hZ2VSZWFkeXHJZTwAAADoSURBVBgZBcExblNBGAbA2ceegTRBuIKOgiihSZNTcC5LUHAihNJR0kGKCDcYJY6D3/77MdOinTvzAgCw8ysThIvn/VojIyMjIyPP+bS1sUQIV2s95pBDDvmbP/mdkft83tpYguZq5Jh/OeaYh+yzy8hTHvNlaxNNczm+la9OTlar1UdA/+C2A4trRCnD3jS8BB1obq2Gk6GU6QbQAS4BUaYSQAf4bhhKKTFdAzrAOwAxEUAH+KEM01SY3gM6wBsEAQB0gJ+maZoC3gI6iPYaAIBJsiRmHU0AALOeFC3aK2cWAACUXe7+AwO0lc9eTHYTAAAAAElFTkSuQmCC);
+}
+
+table.treetable span.folder {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAABGdBTUEAAK/INwWK6QAAABl0RVh0U29mdHdhcmUAQWRvYmUgSW1hZ2VSZWFkeXHJZTwAAAGrSURBVDjLxZO7ihRBFIa/6u0ZW7GHBUV0UQQTZzd3QdhMQxOfwMRXEANBMNQX0MzAzFAwEzHwARbNFDdwEd31Mj3X7a6uOr9BtzNjYjKBJ6nicP7v3KqcJFaxhBVtZUAK8OHlld2st7Xl3DJPVONP+zEUV4HqL5UDYHr5xvuQAjgl/Qs7TzvOOVAjxjlC+ePSwe6DfbVegLVuT4r14eTr6zvA8xSAoBLzx6pvj4l+DZIezuVkG9fY2H7YRQIMZIBwycmzH1/s3F8AapfIPNF3kQk7+kw9PWBy+IZOdg5Ug3mkAATy/t0usovzGeCUWTjCz0B+Sj0ekfdvkZ3abBv+U4GaCtJ1iEm6ANQJ6fEzrG/engcKw/wXQvEKxSEKQxRGKE7Izt+DSiwBJMUSm71rguMYhQKrBygOIRStf4TiFFRBvbRGKiQLWP29yRSHKBTtfdBmHs0BUpgvtgF4yRFR+NUKi0XZcYjCeCG2smkzLAHkbRBmP0/Uk26O5YnUActBp1GsAI+S5nRJJJal5K1aAMrq0d6Tm9uI6zjyf75dAe6tx/SsWeD//o2/Ab6IH3/h25pOAAAAAElFTkSuQmCC);
+}
+
+table.treetable tr.collapsed span.indenter a {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvjV6
9fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAHlJREFUeNrcU1sNgDAQ6wgmcAM2MICGGlg1gJnNzWQcvwQGy1j4oUl/7tH0mpwzM7SgQyO+EZAUWh2MkkzSWhJwuRAlHYsJwEwyvs1gABDuzqoJcTw5qxaIJN0bgQRgIjnlmn1heSO5PE6Y2YXe+5Cr5+h++gs12AcAS6FS+7YOsj4AAAAASUVORK5CYII=);
+}
+
+table.treetable tr.expanded span.indenter a {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvjV6
9fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAHFJREFUeNpi/P//PwMlgImBQsA44C6gvhfa29v3MzAwOODRc6CystIRbxi0t7fjDJjKykpGYrwwi1hxnLHQ3t7+jIGBQRJJ6HllZaUUKYEYRYBPOB0gBShKwKGA////48VtbW3/8clTnBIH3gCKkzJgAGvBX0dDm0sCAAAAAElFTkSuQmCC);
+}
+
+
+
+table.treetable tr.collapsed.selected span.indenter a {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvjV6
9fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAFpJREFUeNpi/P//PwMlgHHADWD4//8/NtyAQxwD45KAAQdKDfj//////fgMIsYAZIMw1DKREFwODAwM/4kNRKq64AADA4MjFDOQ6gKyY4HodMA49PMCxQYABgAVYHsjyZ1x7QAAAABJRU5ErkJggg==);
+}
+
+table.treetable tr.expanded.selected span.indenter a {
+ background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvjV6
9fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAFtJREFUeNpi/P//PwMlgImBQsA44C6giQENDAwM//HgBmLCAF/AMBLjBUeixf///48L7/+PCvZjU4fPAAc0AxywqcMXCwegGJ1NckL6jx5wpKYDxqGXEkkCgAEAmrqBIejdgngAAAAASUVORK5CYII=);
+}
+
+table.treetable tr.accept {
+ background-color: #a3bce4;
+ color: #fff
+}
+
+table.treetable tr.collapsed.accept td span.indenter a {
+ background-image: url(data:image/x-png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvj
V69fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAFpJREFUeNpi/P//PwMlgHHADWD4//8/NtyAQxwD45KAAQdKDfj//////fgMIsYAZIMw1DKREFwODAwM/4kNRKq64AADA4MjFDOQ6gKyY4HodMA49PMCxQYABgAVYHsjyZ1x7QAAAABJRU5ErkJggg==);
+}
+
+table.treetable tr.expanded.accept td span.indenter a {
+ background-image: url(data:image/x-png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAACXBIWXMAAAsTAAALEwEAmpwYAAAKT2lDQ1BQaG90b3Nob3AgSUNDIHByb2ZpbGUAAHjanVNnVFPpFj333vRCS4iAlEtvUhUIIFJCi4AUkSYqIQkQSoghodkVUcERRUUEG8igiAOOjoCMFVEsDIoK2AfkIaKOg6OIisr74Xuja9a89+bN/rXXPues852zzwfACAyWSDNRNYAMqUIeEeCDx8TG4eQuQIEKJHAAEAizZCFz/SMBAPh+PDwrIsAHvgABeNMLCADATZvAMByH/w/qQplcAYCEAcB0kThLCIAUAEB6jkKmAEBGAYCdmCZTAKAEAGDLY2LjAFAtAGAnf+bTAICd+Jl7AQBblCEVAaCRACATZYhEAGg7AKzPVopFAFgwABRmS8Q5ANgtADBJV2ZIALC3AMDOEAuyAAgMADBRiIUpAAR7AGDIIyN4AISZABRG8lc88SuuEOcqAAB4mbI8uSQ5RYFbCC1xB1dXLh4ozkkXKxQ2YQJhmkAuwnmZGTKBNA/g88wAAKCRFRHgg/P9eM4Ors7ONo62Dl8t6r8G/yJiYuP+5c+rcEAAAOF0ftH+LC+zGoA7BoBt/qIl7gRoXgugdfeLZrIPQLUAoOnaV/Nw+H48PEWhkLnZ2eXk5NhKxEJbYcpXff5nwl/AV/1s+X48/Pf14L7iJIEyXYFHBPjgwsz0TKUcz5IJhGLc5o9H/LcL//wd0yLESWK5WCoU41EScY5EmozzMqUiiUKSKcUl0v9k4t8s+wM+3zUAsGo+AXuRLahdYwP2SycQWHTA4vcAAPK7b8HUKAgDgGiD4c93/+8//UegJQCAZkmScQAAXkQkLlTKsz/HCAAARKCBKrBBG/TBGCzABhzBBdzBC/xgNoRCJMTCQhBCCmSAHHJgKayCQiiGzbAdKmAv1EAdNMBRaIaTcA4uwlW4Dj1wD/phCJ7BKLyBCQRByAgTYSHaiAFiilgjjggXmYX4IcFIBBKLJCDJiBRRIkuRNUgxUopUIFVIHfI9cgI5h1xGupE7yAAygvyGvEcxlIGyUT3UDLVDuag3GoRGogvQZHQxmo8WoJvQcrQaPYw2oefQq2gP2o8+Q8cwwOgYBzPEbDAuxsNCsTgsCZNjy7EirAyrxhqwVqwDu4n1Y8+xdwQSgUXACTYEd0IgYR5BSFhMWE7YSKggHCQ0EdoJNwkDhFHCJyKTqEu0JroR+cQYYjIxh1hILCPWEo8TLxB7iEPENyQSiUMyJ7mQAkmxpFTSEtJG0m5SI+ksqZs0SBojk8naZGuyBzmULCAryIXkneTD5DPkG+Qh8lsKnWJAcaT4U+IoUspqShnlEOU05QZlmDJBVaOaUt2ooVQRNY9aQq2htlKvUYeoEzR1mjnNgxZJS6WtopXTGmgXaPdpr+h0uhHdlR5Ol9BX0svpR+iX6AP0dwwNhhWDx4hnKBmbGAcYZxl3GK+YTKYZ04sZx1QwNzHrmOeZD5lvVVgqtip8FZHKCpVKlSaVGyovVKmqpqreqgtV81XLVI+pXlN9rkZVM1PjqQnUlqtVqp1Q61MbU2epO6iHqmeob1Q/pH5Z/YkGWcNMw09DpFGgsV/jvMYgC2MZs3gsIWsNq4Z1gTXEJrHN2Xx2KruY/R27iz2qqaE5QzNKM1ezUvOUZj8H45hx+Jx0TgnnKKeX836K3hTvKeIpG6Y0TLkxZVxrqpaXllirSKtRq0frvTau7aedpr1Fu1n7gQ5Bx0onXCdHZ4/OBZ3nU9lT3acKpxZNPTr1ri6qa6UbobtEd79up+6Ynr5egJ5Mb6feeb3n+hx9L/1U/W36p/VHDFgGswwkBtsMzhg8xTVxbzwdL8fb8VFDXcNAQ6VhlWGX4YSRudE8o9VGjUYPjGnGXOMk423GbcajJgYmISZLTepN7ppSTbmmKaY7TDtMx83MzaLN1pk1mz0x1zLnm+eb15vft2BaeFostqi2uGVJsuRaplnutrxuhVo5WaVYVVpds0atna0l1rutu6cRp7lOk06rntZnw7Dxtsm2qbcZsOXYBtuutm22fWFnYhdnt8Wuw+6TvZN9un2N/T0HDYfZDqsdWh1+c7RyFDpWOt6azpzuP33F9JbpL2dYzxDP2DPjthPLKcRpnVOb00dnF2e5c4PziIuJS4LLLpc+Lpsbxt3IveRKdPVxXeF60vWdm7Obwu2o26/uNu5p7ofcn8w0nymeWTNz0MPIQ+BR5dE/C5+VMGvfrH5PQ0+BZ7XnIy9jL5FXrdewt6V3qvdh7xc+9j5yn+M+4zw33jLeWV/MN8C3yLfLT8Nvnl+F30N/I/9k/3r/0QCngCUBZwOJgUGBWwL7+Hp8Ib+OPzrbZfay2e1BjKC5QRVBj4KtguXBrSFoyOyQrSH355jOkc5pDoVQfujW0Adh5mGLw34MJ4WHhVeGP45wiFga0TGXNXfR3ENz30T6RJZE3ptnMU85ry1KNSo+qi5qPNo3ujS6P8YuZlnM1VidWElsSxw5LiquNm5svt/87fOH4p3iC+N7F5gvyF1weaHOwvSFpxapLhIsOpZATIhOOJTwQRAqqBaMJfITdyWOCnnCHcJnIi/RNtGI2ENcKh5O8kgqTXqS7JG8NXkkxTOlLOW5hCepkLxMDUzdmzqeFpp2IG0yPTq9MYOSkZBxQqohTZO2Z+pn5mZ2y6xlhbL+xW6Lty8elQfJa7OQrAVZLQq2QqboVFoo1yoHsmdlV2a/zYnKOZarnivN7cyzytuQN5zvn//tEsIS4ZK2pYZLVy0dWOa9rGo5sjxxedsK4xUFK4ZWBqw8uIq2Km3VT6vtV5eufr0mek1rgV7ByoLBtQFr6wtVCuWFfevc1+1dT1gvWd+1YfqGnRs+FYmKrhTbF5cVf9go3HjlG4dvyr+Z3JS0qavEuWTPZtJm6ebeLZ5bDpaql+aXDm4N2dq0Dd9WtO319kXbL5fNKNu7g7ZDuaO/PLi8ZafJzs07P1SkVPRU+lQ27tLdtWHX+G7R7ht7vPY07NXbW7z3/T7JvttVAVVN1WbVZftJ+7P3P66Jqun4lvttXa1ObXHtxwPSA/0HIw6217nU1R3SPVRSj9Yr60cOxx++/p3vdy0NNg1VjZzG4iNwRHnk6fcJ3/ceDTradox7rOEH0x92HWcdL2pCmvKaRptTmvtbYlu6T8w+0dbq3nr8R9sfD5w0PFl5SvNUyWna6YLTk2fyz4ydlZ19fi753GDborZ752PO32oPb++6EHTh0kX/i+c7vDvOXPK4dPKy2+UTV7hXmq86X23qdOo8/pPTT8e7nLuarrlca7nuer21e2b36RueN87d9L158Rb/1tWeOT3dvfN6b/fF9/XfFt1+cif9zsu72Xcn7q28T7xf9EDtQdlD3YfVP1v+3Njv3H9qwHeg89HcR/cGhYPP/pH1jw9DBY+Zj8uGDYbrnjg+OTniP3L96fynQ89kzyaeF/6i/suuFxYvfvj
V69fO0ZjRoZfyl5O/bXyl/erA6xmv28bCxh6+yXgzMV70VvvtwXfcdx3vo98PT+R8IH8o/2j5sfVT0Kf7kxmTk/8EA5jz/GMzLdsAAAAgY0hSTQAAeiUAAICDAAD5/wAAgOkAAHUwAADqYAAAOpgAABdvkl/FRgAAAFtJREFUeNpi/P//PwMlgImBQsA44C6giQENDAwM//HgBmLCAF/AMBLjBUeixf///48L7/+PCvZjU4fPAAc0AxywqcMXCwegGJ1NckL6jx5wpKYDxqGXEkkCgAEAmrqBIejdgngAAAAASUVORK5CYII=);
+}
diff --git a/bitbake/lib/toaster/toastergui/static/js/.jshintrc b/bitbake/lib/toaster/toastergui/static/js/.jshintrc
new file mode 100644
index 0000000..b02f3ef
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/.jshintrc
@@ -0,0 +1,11 @@
+{
+ "curly" : false,
+ "predef" : [ "$","libtoaster", "prettyPrint" ],
+ "eqnull": true,
+ "plusplus" : false,
+ "browser" : true,
+ "jquery" : true,
+ "devel" : true,
+ "unused" : true,
+ "maxerr" : 60
+}
diff --git a/bitbake/lib/toaster/toastergui/static/js/base.js b/bitbake/lib/toaster/toastergui/static/js/base.js
new file mode 100644
index 0000000..e0df463
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/base.js
@@ -0,0 +1,231 @@
+'use strict';
+
+function basePageInit(ctx) {
+
+ var newBuildButton = $("#new-build-button");
+ var newBuildTargetInput;
+ var newBuildTargetBuildBtn;
+ var projectNameForm = $("#project-name-change-form");
+ var projectName = $("#project-name");
+ var projectNameFormToggle = $("#project-change-form-toggle");
+ var projectNameChangeCancel = $("#project-name-change-cancel");
+
+ /* initially the current project is used unless overridden by the new build
+ * button in top right nav
+ */
+ var selectedProject = libtoaster.ctx;
+
+ var selectedTarget;
+
+ var newBuildProjectInput = $("#new-build-button #project-name-input");
+ var newBuildProjectSaveBtn = $("#new-build-button #save-project-button");
+
+ /* Project name change functionality */
+ projectNameFormToggle.click(function(e){
+ e.preventDefault();
+
+ $(this).add(projectName).hide();
+ projectNameForm.fadeIn();
+ });
+
+ projectNameChangeCancel.click(function(e){
+ e.preventDefault();
+
+ projectNameForm.hide();
+ projectName.add(projectNameFormToggle).fadeIn();
+ });
+
+ $("#project-name-change-btn").click(function(e){
+ var newProjectName = $("#project-name-change-input").val();
+
+ libtoaster.editCurrentProject({ projectName: newProjectName },function (){
+
+ projectName.text(newProjectName);
+ libtoaster.ctx.projectName = newProjectName;
+ projectNameChangeCancel.click();
+ });
+ });
+
+ _checkProjectBuildable();
+
+ $("#project-topbar .nav li a").each(function(){
+ if (window.location.pathname === $(this).attr('href'))
+ $(this).parent().addClass('active');
+ else
+ $(this).parent().removeClass('active');
+ });
+
+ if ($(".total-builds").length !== 0){
+ libtoaster.getProjectInfo(libtoaster.ctx.projectPageUrl, function(prjInfo){
+ if (prjInfo.builds)
+ $(".total-builds").text(prjInfo.builds.length);
+ });
+ }
+
+ /* Hide the button if we're on the project, newproject or importlayer page,
+ * or if there are no projects yet defined;
+ * only show if there isn't already a build-target-input
+ */
+ if (ctx.numProjects > 0 &&
+ ctx.currentUrl.search('newproject') < 0 &&
+ $(".build-target-input").length === 1) {
+
+ newBuildTargetInput = $("#new-build-button .build-target-input");
+ newBuildTargetBuildBtn = $("#new-build-button").find(".build-button");
+
+ _setupNewBuildButton();
+ newBuildButton.show();
+ } else if ($(".build-target-input").length > 0) {
+ newBuildTargetInput = $("#project-topbar .build-target-input");
+ newBuildTargetBuildBtn = $("#project-topbar .build-button");
+ } else {
+ return;
+ }
+
+ /* Hide the change project icon when there is only one project */
+ if (ctx.numProjects === 1) {
+ $('#project .icon-pencil').hide();
+ }
+
+ /* If we have a project, set up the typeahead */
+ if (selectedProject.recipesTypeAheadUrl){
+ libtoaster.makeTypeahead(newBuildTargetInput, selectedProject.recipesTypeAheadUrl, { format: "json" }, function (item) {
+ selectedTarget = item;
+ newBuildTargetBuildBtn.removeAttr("disabled");
+ });
+ }
+
+ newBuildTargetInput.on('input', function () {
+ if ($(this).val().length === 0) {
+ newBuildTargetBuildBtn.attr("disabled", "disabled");
+ } else {
+ newBuildTargetBuildBtn.removeAttr("disabled");
+ }
+ });
+
+ newBuildTargetBuildBtn.click(function (e) {
+ e.preventDefault();
+
+ if (!newBuildTargetInput.val()) {
+ return;
+ }
+
+ /* We use the value of the input field so as to preserve any command the
+ * user appended, e.g. core-image-minimal:clean
+ */
+ selectedTarget = { name: newBuildTargetInput.val() };
+
+ /* Fire off the build */
+ libtoaster.startABuild(selectedProject.projectBuildsUrl,
+ selectedProject.projectId, selectedTarget.name, function(){
+ window.location.replace(selectedProject.projectBuildsUrl);
+ }, null);
+ });
+
+ function _checkProjectBuildable() {
+ if (selectedProject.projectId === undefined) {
+ return;
+ }
+
+ libtoaster.getProjectInfo(selectedProject.projectPageUrl,
+ function (data) {
+ if (data.machine === null || data.machine.name === undefined || data.layers.length === 0) {
+ /* we can't build anything without a machine and some layers */
+ $("#new-build-button #targets-form").hide();
+ $("#new-build-button .alert").show();
+ } else {
+ $("#new-build-button #targets-form").show();
+ $("#new-build-button .alert").hide();
+
+ /* we can build this project; enable input fields */
+ newBuildTargetInput.removeAttr("disabled");
+ }
+ }, null);
+ }
+
+ /* Set up the New build button in the top nav bar */
+ function _setupNewBuildButton() {
+
+ /* If we don't have a current project then present the set project
+ * form.
+ */
+ if (selectedProject.projectId === undefined) {
+ $('#change-project-form').show();
+ $('#project .icon-pencil').hide();
+ }
+
+ libtoaster.makeTypeahead(newBuildProjectInput, selectedProject.projectsTypeAheadUrl, { format : "json" }, function (item) {
+ /* successfully selected a project */
+ newBuildProjectSaveBtn.removeAttr("disabled");
+ selectedProject = item;
+ });
+
+ /* Any typing in the input apart from enter key is going to invalidate
+ * the value that has been set by selecting a suggestion from the typeahead
+ */
+ newBuildProjectInput.on('input', function (event) {
+ if (event.keyCode === 13) {
+ return;
+ }
+ newBuildProjectSaveBtn.attr("disabled", "disabled");
+ });
+
+
+ newBuildProjectSaveBtn.click(function () {
+ selectedProject.projectId = selectedProject.id;
+ /* Update the typeahead project_id parameter */
+ _checkProjectBuildable();
+
+ newBuildTargetInput.removeAttr("disabled");
+
+ /* We've got a new project so now we need to update the
+ * target urls. We can get this from the new project's info
+ */
+ $.getJSON(selectedProject.projectPageUrl, { format: "json" },
+ function(projectInfo){
+ /* Update the typeahead to use the new selectedProject */
+ selectedProject = projectInfo;
+
+ libtoaster.makeTypeahead(newBuildTargetInput, selectedProject.recipesTypeAheadUrl, { format: "json" }, function (item) {
+ /* successfully selected a target */
+ selectedTarget = item;
+ newBuildTargetBuildBtn.removeAttr("disabled");
+ });
+
+ });
+ newBuildTargetInput.val("");
+
+ /* set up new form aspect */
+ $("#new-build-button #project a").text(selectedProject.name).attr('href', selectedProject.projectPageUrl);
+ $("#new-build-button .alert a").attr('href', selectedProject.projectPageUrl);
+ $("#project .icon-pencil").show();
+
+ $("#change-project-form").slideUp({ 'complete' : function () {
+ $("#new-build-button #project").show();
+ }});
+ });
+
+ $('#new-build-button #project .icon-pencil').click(function () {
+ newBuildProjectSaveBtn.attr("disabled", "disabled");
+ newBuildProjectInput.val($("#new-build-button #project a").text());
+ $("#cancel-change-project").show();
+ $(this).parent().hide();
+ $("#change-project-form").slideDown();
+ });
+
+ $("#new-build-button #cancel-change-project").click(function () {
+ $("#change-project-form").hide(function () {
+ $('#new-build-button #project').show();
+ });
+
+ newBuildProjectInput.val("");
+ newBuildProjectSaveBtn.attr("disabled", "disabled");
+ });
+
+ /* Keep the dropdown open unless we click outside the dropdown area */
+ $(".new-build").click(function (event) {
+ event.stopPropagation();
+ });
+ }
+
+}
diff --git a/bitbake/lib/toaster/toastergui/static/js/bootstrap.min.js b/bitbake/lib/toaster/toastergui/static/js/bootstrap.min.js
new file mode 100644
index 0000000..848258d
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/bootstrap.min.js
@@ -0,0 +1,6 @@
+/*!
+* Bootstrap.js by @fat & @mdo
+* Copyright 2013 Twitter, Inc.
+* http://www.apache.org/licenses/LICENSE-2.0.txt
+*/
+!function(e){"use strict";e(function(){e.support.transition=function(){var e=function(){var e=document.createElement("bootstrap"),t={WebkitTransition:"webkitTransitionEnd",MozTransition:"transitionend",OTransition:"oTransitionEnd otransitionend",transition:"transitionend"},n;for(n in t)if(e.style[n]!==undefined)return t[n]}();return e&&{end:e}}()})}(window.jQuery),!function(e){"use strict";var t='[data-dismiss="alert"]',n=function(n){e(n).on("click",t,this.close)};n.prototype.close=function(t){function s(){i.trigger("closed").remove()}var n=e(this),r=n.attr("data-target"),i;r||(r=n.attr("href"),r=r&&r.replace(/.*(?=#[^\s]*$)/,"")),i=e(r),t&&t.preventDefault(),i.length||(i=n.hasClass("alert")?n:n.parent()),i.trigger(t=e.Event("close"));if(t.isDefaultPrevented())return;i.removeClass("in"),e.support.transition&&i.hasClass("fade")?i.on(e.support.transition.end,s):s()};var r=e.fn.alert;e.fn.alert=function(t){return this.each(function(){var r=e(this),i=r.data("alert");i||r.data("alert",i=new n(this)),typeof t=="string"&&i[t].call(r)})},e.fn.alert.Constructor=n,e.fn.alert.noConflict=function(){return e.fn.alert=r,this},e(document).on("click.alert.data-api",t,n.prototype.close)}(window.jQuery),!function(e){"use strict";var t=function(t,n){this.$element=e(t),this.options=e.extend({},e.fn.button.defaults,n)};t.prototype.setState=function(e){var t="disabled",n=this.$element,r=n.data(),i=n.is("input")?"val":"html";e+="Text",r.resetText||n.data("resetText",n[i]()),n[i](r[e]||this.options[e]),setTimeout(function(){e=="loadingText"?n.addClass(t).attr(t,t):n.removeClass(t).removeAttr(t)},0)},t.prototype.toggle=function(){var e=this.$element.closest('[data-toggle="buttons-radio"]');e&&e.find(".active").removeClass("active"),this.$element.toggleClass("active")};var n=e.fn.button;e.fn.button=function(n){return this.each(function(){var r=e(this),i=r.data("button"),s=typeof n=="object"&&n;i||r.data("button",i=new t(this,s)),n=="toggle"?i.toggle():n&&i.setState(n)})},e.fn.button.defaults={loadingText:"loading..."},e.fn.button.Constructor=t,e.fn.button.noConflict=function(){return e.fn.button=n,this},e(document).on("click.button.data-api","[data-toggle^=button]",function(t){var n=e(t.target);n.hasClass("btn")||(n=n.closest(".btn")),n.button("toggle")})}(window.jQuery),!function(e){"use strict";var t=function(t,n){this.$element=e(t),this.$indicators=this.$element.find(".carousel-indicators"),this.options=n,this.options.pause=="hover"&&this.$element.on("mouseenter",e.proxy(this.pause,this)).on("mouseleave",e.proxy(this.cycle,this))};t.prototype={cycle:function(t){return t||(this.paused=!1),this.interval&&clearInterval(this.interval),this.options.interval&&!this.paused&&(this.interval=setInterval(e.proxy(this.next,this),this.options.interval)),this},getActiveIndex:function(){return this.$active=this.$element.find(".item.active"),this.$items=this.$active.parent().children(),this.$items.index(this.$active)},to:function(t){var n=this.getActiveIndex(),r=this;if(t>this.$items.length-1||t<0)return;return this.sliding?this.$element.one("slid",function(){r.to(t)}):n==t?this.pause().cycle():this.slide(t>n?"next":"prev",e(this.$items[t]))},pause:function(t){return t||(this.paused=!0),this.$element.find(".next, .prev").length&&e.support.transition.end&&(this.$element.trigger(e.support.transition.end),this.cycle(!0)),clearInterval(this.interval),this.interval=null,this},next:function(){if(this.sliding)return;return this.slide("next")},prev:function(){if(this.sliding)return;return this.slide("prev")},slide:function(t,n){var 
r=this.$element.find(".item.active"),i=n||r[t](),s=this.interval,o=t=="next"?"left":"right",u=t=="next"?"first":"last",a=this,f;this.sliding=!0,s&&this.pause(),i=i.length?i:this.$element.find(".item")[u](),f=e.Event("slide",{relatedTarget:i[0],direction:o});if(i.hasClass("active"))return;this.$indicators.length&&(this.$indicators.find(".active").removeClass("active"),this.$element.one("slid",function(){var t=e(a.$indicators.children()[a.getActiveIndex()]);t&&t.addClass("active")}));if(e.support.transition&&this.$element.hasClass("slide")){this.$element.trigger(f);if(f.isDefaultPrevented())return;i.addClass(t),i[0].offsetWidth,r.addClass(o),i.addClass(o),this.$element.one(e.support.transition.end,function(){i.removeClass([t,o].join(" ")).addClass("active"),r.removeClass(["active",o].join(" ")),a.sliding=!1,setTimeout(function(){a.$element.trigger("slid")},0)})}else{this.$element.trigger(f);if(f.isDefaultPrevented())return;r.removeClass("active"),i.addClass("active"),this.sliding=!1,this.$element.trigger("slid")}return s&&this.cycle(),this}};var n=e.fn.carousel;e.fn.carousel=function(n){return this.each(function(){var r=e(this),i=r.data("carousel"),s=e.extend({},e.fn.carousel.defaults,typeof n=="object"&&n),o=typeof n=="string"?n:s.slide;i||r.data("carousel",i=new t(this,s)),typeof n=="number"?i.to(n):o?i[o]():s.interval&&i.pause().cycle()})},e.fn.carousel.defaults={interval:5e3,pause:"hover"},e.fn.carousel.Constructor=t,e.fn.carousel.noConflict=function(){return e.fn.carousel=n,this},e(document).on("click.carousel.data-api","[data-slide], [data-slide-to]",function(t){var n=e(this),r,i=e(n.attr("data-target")||(r=n.attr("href"))&&r.replace(/.*(?=#[^\s]+$)/,"")),s=e.extend({},i.data(),n.data()),o;i.carousel(s),(o=n.attr("data-slide-to"))&&i.data("carousel").pause().to(o).cycle(),t.preventDefault()})}(window.jQuery),!function(e){"use strict";var t=function(t,n){this.$element=e(t),this.options=e.extend({},e.fn.collapse.defaults,n),this.options.parent&&(this.$parent=e(this.options.parent)),this.options.toggle&&this.toggle()};t.prototype={constructor:t,dimension:function(){var e=this.$element.hasClass("width");return e?"width":"height"},show:function(){var t,n,r,i;if(this.transitioning||this.$element.hasClass("in"))return;t=this.dimension(),n=e.camelCase(["scroll",t].join("-")),r=this.$parent&&this.$parent.find("> .accordion-group > .in");if(r&&r.length){i=r.data("collapse");if(i&&i.transitioning)return;r.collapse("hide"),i||r.data("collapse",null)}this.$element[t](0),this.transition("addClass",e.Event("show"),"shown"),e.support.transition&&this.$element[t](this.$element[0][n])},hide:function(){var t;if(this.transitioning||!this.$element.hasClass("in"))return;t=this.dimension(),this.reset(this.$element[t]()),this.transition("removeClass",e.Event("hide"),"hidden"),this.$element[t](0)},reset:function(e){var t=this.dimension();return this.$element.removeClass("collapse")[t](e||"auto")[0].offsetWidth,this.$element[e!==null?"addClass":"removeClass"]("collapse"),this},transition:function(t,n,r){var i=this,s=function(){n.type=="show"&&i.reset(),i.transitioning=0,i.$element.trigger(r)};this.$element.trigger(n);if(n.isDefaultPrevented())return;this.transitioning=1,this.$element[t]("in"),e.support.transition&&this.$element.hasClass("collapse")?this.$element.one(e.support.transition.end,s):s()},toggle:function(){this[this.$element.hasClass("in")?"hide":"show"]()}};var n=e.fn.collapse;e.fn.collapse=function(n){return this.each(function(){var 
r=e(this),i=r.data("collapse"),s=e.extend({},e.fn.collapse.defaults,r.data(),typeof n=="object"&&n);i||r.data("collapse",i=new t(this,s)),typeof n=="string"&&i[n]()})},e.fn.collapse.defaults={toggle:!0},e.fn.collapse.Constructor=t,e.fn.collapse.noConflict=function(){return e.fn.collapse=n,this},e(document).on("click.collapse.data-api","[data-toggle=collapse]",function(t){var n=e(this),r,i=n.attr("data-target")||t.preventDefault()||(r=n.attr("href"))&&r.replace(/.*(?=#[^\s]+$)/,""),s=e(i).data("collapse")?"toggle":n.data();n[e(i).hasClass("in")?"addClass":"removeClass"]("collapsed"),e(i).collapse(s)})}(window.jQuery),!function(e){"use strict";function r(){e(".dropdown-backdrop").remove(),e(t).each(function(){i(e(this)).removeClass("open")})}function i(t){var n=t.attr("data-target"),r;n||(n=t.attr("href"),n=n&&/#/.test(n)&&n.replace(/.*(?=#[^\s]*$)/,"")),r=n&&e(n);if(!r||!r.length)r=t.parent();return r}var t="[data-toggle=dropdown]",n=function(t){var n=e(t).on("click.dropdown.data-api",this.toggle);e("html").on("click.dropdown.data-api",function(){n.parent().removeClass("open")})};n.prototype={constructor:n,toggle:function(t){var n=e(this),s,o;if(n.is(".disabled, :disabled"))return;return s=i(n),o=s.hasClass("open"),r(),o||("ontouchstart"in document.documentElement&&e('<div class="dropdown-backdrop"/>').insertBefore(e(this)).on("click",r),s.toggleClass("open")),n.focus(),!1},keydown:function(n){var r,s,o,u,a,f;if(!/(38|40|27)/.test(n.keyCode))return;r=e(this),n.preventDefault(),n.stopPropagation();if(r.is(".disabled, :disabled"))return;u=i(r),a=u.hasClass("open");if(!a||a&&n.keyCode==27)return n.which==27&&u.find(t).focus(),r.click();s=e("[role=menu] li:not(.divider):visible a",u);if(!s.length)return;f=s.index(s.filter(":focus")),n.keyCode==38&&f>0&&f--,n.keyCode==40&&f<s.length-1&&f++,~f||(f=0),s.eq(f).focus()}};var s=e.fn.dropdown;e.fn.dropdown=function(t){return this.each(function(){var r=e(this),i=r.data("dropdown");i||r.data("dropdown",i=new n(this)),typeof t=="string"&&i[t].call(r)})},e.fn.dropdown.Constructor=n,e.fn.dropdown.noConflict=function(){return e.fn.dropdown=s,this},e(document).on("click.dropdown.data-api",r).on("click.dropdown.data-api",".dropdown form",function(e){e.stopPropagation()}).on("click.dropdown.data-api",t,n.prototype.toggle).on("keydown.dropdown.data-api",t+", [role=menu]",n.prototype.keydown)}(window.jQuery),!function(e){"use strict";var t=function(t,n){this.options=n,this.$element=e(t).delegate('[data-dismiss="modal"]',"click.dismiss.modal",e.proxy(this.hide,this)),this.options.remote&&this.$element.find(".modal-body").load(this.options.remote)};t.prototype={constructor:t,toggle:function(){return this[this.isShown?"hide":"show"]()},show:function(){var t=this,n=e.Event("show");this.$element.trigger(n);if(this.isShown||n.isDefaultPrevented())return;this.isShown=!0,this.escape(),this.backdrop(function(){var n=e.support.transition&&t.$element.hasClass("fade");t.$element.parent().length||t.$element.appendTo(document.body),t.$element.show(),n&&t.$element[0].offsetWidth,t.$element.addClass("in").attr("aria-hidden",!1),t.enforceFocus(),n?t.$element.one(e.support.transition.end,function(){t.$element.focus().trigger("shown")}):t.$element.focus().trigger("shown")})},hide:function(t){t&&t.preventDefault();var 
n=this;t=e.Event("hide"),this.$element.trigger(t);if(!this.isShown||t.isDefaultPrevented())return;this.isShown=!1,this.escape(),e(document).off("focusin.modal"),this.$element.removeClass("in").attr("aria-hidden",!0),e.support.transition&&this.$element.hasClass("fade")?this.hideWithTransition():this.hideModal()},enforceFocus:function(){var t=this;e(document).on("focusin.modal",function(e){t.$element[0]!==e.target&&!t.$element.has(e.target).length&&t.$element.focus()})},escape:function(){var e=this;this.isShown&&this.options.keyboard?this.$element.on("keyup.dismiss.modal",function(t){t.which==27&&e.hide()}):this.isShown||this.$element.off("keyup.dismiss.modal")},hideWithTransition:function(){var t=this,n=setTimeout(function(){t.$element.off(e.support.transition.end),t.hideModal()},500);this.$element.one(e.support.transition.end,function(){clearTimeout(n),t.hideModal()})},hideModal:function(){var e=this;this.$element.hide(),this.backdrop(function(){e.removeBackdrop(),e.$element.trigger("hidden")})},removeBackdrop:function(){this.$backdrop&&this.$backdrop.remove(),this.$backdrop=null},backdrop:function(t){var n=this,r=this.$element.hasClass("fade")?"fade":"";if(this.isShown&&this.options.backdrop){var i=e.support.transition&&r;this.$backdrop=e('<div class="modal-backdrop '+r+'" />').appendTo(document.body),this.$backdrop.click(this.options.backdrop=="static"?e.proxy(this.$element[0].focus,this.$element[0]):e.proxy(this.hide,this)),i&&this.$backdrop[0].offsetWidth,this.$backdrop.addClass("in");if(!t)return;i?this.$backdrop.one(e.support.transition.end,t):t()}else!this.isShown&&this.$backdrop?(this.$backdrop.removeClass("in"),e.support.transition&&this.$element.hasClass("fade")?this.$backdrop.one(e.support.transition.end,t):t()):t&&t()}};var n=e.fn.modal;e.fn.modal=function(n){return this.each(function(){var r=e(this),i=r.data("modal"),s=e.extend({},e.fn.modal.defaults,r.data(),typeof n=="object"&&n);i||r.data("modal",i=new t(this,s)),typeof n=="string"?i[n]():s.show&&i.show()})},e.fn.modal.defaults={backdrop:!0,keyboard:!0,show:!0},e.fn.modal.Constructor=t,e.fn.modal.noConflict=function(){return e.fn.modal=n,this},e(document).on("click.modal.data-api",'[data-toggle="modal"]',function(t){var n=e(this),r=n.attr("href"),i=e(n.attr("data-target")||r&&r.replace(/.*(?=#[^\s]+$)/,"")),s=i.data("modal")?"toggle":e.extend({remote:!/#/.test(r)&&r},i.data(),n.data());t.preventDefault(),i.modal(s).one("hide",function(){n.focus()})})}(window.jQuery),!function(e){"use strict";var t=function(e,t){this.init("tooltip",e,t)};t.prototype={constructor:t,init:function(t,n,r){var i,s,o,u,a;this.type=t,this.$element=e(n),this.options=this.getOptions(r),this.enabled=!0,o=this.options.trigger.split(" ");for(a=o.length;a--;)u=o[a],u=="click"?this.$element.on("click."+this.type,this.options.selector,e.proxy(this.toggle,this)):u!="manual"&&(i=u=="hover"?"mouseenter":"focus",s=u=="hover"?"mouseleave":"blur",this.$element.on(i+"."+this.type,this.options.selector,e.proxy(this.enter,this)),this.$element.on(s+"."+this.type,this.options.selector,e.proxy(this.leave,this)));this.options.selector?this._options=e.extend({},this.options,{trigger:"manual",selector:""}):this.fixTitle()},getOptions:function(t){return t=e.extend({},e.fn[this.type].defaults,this.$element.data(),t),t.delay&&typeof t.delay=="number"&&(t.delay={show:t.delay,hide:t.delay}),t},enter:function(t){var 
n=e.fn[this.type].defaults,r={},i;this._options&&e.each(this._options,function(e,t){n[e]!=t&&(r[e]=t)},this),i=e(t.currentTarget)[this.type](r).data(this.type);if(!i.options.delay||!i.options.delay.show)return i.show();clearTimeout(this.timeout),i.hoverState="in",this.timeout=setTimeout(function(){i.hoverState=="in"&&i.show()},i.options.delay.show)},leave:function(t){var n=e(t.currentTarget)[this.type](this._options).data(this.type);this.timeout&&clearTimeout(this.timeout);if(!n.options.delay||!n.options.delay.hide)return n.hide();n.hoverState="out",this.timeout=setTimeout(function(){n.hoverState=="out"&&n.hide()},n.options.delay.hide)},show:function(){var t,n,r,i,s,o,u=e.Event("show");if(this.hasContent()&&this.enabled){this.$element.trigger(u);if(u.isDefaultPrevented())return;t=this.tip(),this.setContent(),this.options.animation&&t.addClass("fade"),s=typeof this.options.placement=="function"?this.options.placement.call(this,t[0],this.$element[0]):this.options.placement,t.detach().css({top:0,left:0,display:"block"}),this.options.container?t.appendTo(this.options.container):t.insertAfter(this.$element),n=this.getPosition(),r=t[0].offsetWidth,i=t[0].offsetHeight;switch(s){case"bottom":o={top:n.top+n.height,left:n.left+n.width/2-r/2};break;case"top":o={top:n.top-i,left:n.left+n.width/2-r/2};break;case"left":o={top:n.top+n.height/2-i/2,left:n.left-r};break;case"right":o={top:n.top+n.height/2-i/2,left:n.left+n.width}}this.applyPlacement(o,s),this.$element.trigger("shown")}},applyPlacement:function(e,t){var n=this.tip(),r=n[0].offsetWidth,i=n[0].offsetHeight,s,o,u,a;n.offset(e).addClass(t).addClass("in"),s=n[0].offsetWidth,o=n[0].offsetHeight,t=="top"&&o!=i&&(e.top=e.top+i-o,a=!0),t=="bottom"||t=="top"?(u=0,e.left<0&&(u=e.left*-2,e.left=0,n.offset(e),s=n[0].offsetWidth,o=n[0].offsetHeight),this.replaceArrow(u-r+s,s,"left")):this.replaceArrow(o-i,o,"top"),a&&n.offset(e)},replaceArrow:function(e,t,n){this.arrow().css(n,e?50*(1-e/t)+"%":"")},setContent:function(){var e=this.tip(),t=this.getTitle();e.find(".tooltip-inner")[this.options.html?"html":"text"](t),e.removeClass("fade in top bottom left right")},hide:function(){function i(){var t=setTimeout(function(){n.off(e.support.transition.end).detach()},500);n.one(e.support.transition.end,function(){clearTimeout(t),n.detach()})}var t=this,n=this.tip(),r=e.Event("hide");this.$element.trigger(r);if(r.isDefaultPrevented())return;return n.removeClass("in"),e.support.transition&&this.$tip.hasClass("fade")?i():n.detach(),this.$element.trigger("hidden"),this},fixTitle:function(){var e=this.$element;(e.attr("title")||typeof e.attr("data-original-title")!="string")&&e.attr("data-original-title",e.attr("title")||"").attr("title","")},hasContent:function(){return this.getTitle()},getPosition:function(){var t=this.$element[0];return e.extend({},typeof t.getBoundingClientRect=="function"?t.getBoundingClientRect():{width:t.offsetWidth,height:t.offsetHeight},this.$element.offset())},getTitle:function(){var e,t=this.$element,n=this.options;return e=t.attr("data-original-title")||(typeof n.title=="function"?n.title.call(t[0]):n.title),e},tip:function(){return this.$tip=this.$tip||e(this.options.template)},arrow:function(){return this.$arrow=this.$arrow||this.tip().find(".tooltip-arrow")},validate:function(){this.$element[0].parentNode||(this.hide(),this.$element=null,this.options=null)},enable:function(){this.enabled=!0},disable:function(){this.enabled=!1},toggleEnabled:function(){this.enabled=!this.enabled},toggle:function(t){var 
n=t?e(t.currentTarget)[this.type](this._options).data(this.type):this;n.tip().hasClass("in")?n.hide():n.show()},destroy:function(){this.hide().$element.off("."+this.type).removeData(this.type)}};var n=e.fn.tooltip;e.fn.tooltip=function(n){return this.each(function(){var r=e(this),i=r.data("tooltip"),s=typeof n=="object"&&n;i||r.data("tooltip",i=new t(this,s)),typeof n=="string"&&i[n]()})},e.fn.tooltip.Constructor=t,e.fn.tooltip.defaults={animation:!0,placement:"top",selector:!1,template:'<div class="tooltip"><div class="tooltip-arrow"></div><div class="tooltip-inner"></div></div>',trigger:"hover focus",title:"",delay:0,html:!1,container:!1},e.fn.tooltip.noConflict=function(){return e.fn.tooltip=n,this}}(window.jQuery),!function(e){"use strict";var t=function(e,t){this.init("popover",e,t)};t.prototype=e.extend({},e.fn.tooltip.Constructor.prototype,{constructor:t,setContent:function(){var e=this.tip(),t=this.getTitle(),n=this.getContent();e.find(".popover-title")[this.options.html?"html":"text"](t),e.find(".popover-content")[this.options.html?"html":"text"](n),e.removeClass("fade top bottom left right in")},hasContent:function(){return this.getTitle()||this.getContent()},getContent:function(){var e,t=this.$element,n=this.options;return e=(typeof n.content=="function"?n.content.call(t[0]):n.content)||t.attr("data-content"),e},tip:function(){return this.$tip||(this.$tip=e(this.options.template)),this.$tip},destroy:function(){this.hide().$element.off("."+this.type).removeData(this.type)}});var n=e.fn.popover;e.fn.popover=function(n){return this.each(function(){var r=e(this),i=r.data("popover"),s=typeof n=="object"&&n;i||r.data("popover",i=new t(this,s)),typeof n=="string"&&i[n]()})},e.fn.popover.Constructor=t,e.fn.popover.defaults=e.extend({},e.fn.tooltip.defaults,{placement:"right",trigger:"click",content:"",template:'<div class="popover"><div class="arrow"></div><h3 class="popover-title"></h3><div class="popover-content"></div></div>'}),e.fn.popover.noConflict=function(){return e.fn.popover=n,this}}(window.jQuery),!function(e){"use strict";function t(t,n){var r=e.proxy(this.process,this),i=e(t).is("body")?e(window):e(t),s;this.options=e.extend({},e.fn.scrollspy.defaults,n),this.$scrollElement=i.on("scroll.scroll-spy.data-api",r),this.selector=(this.options.target||(s=e(t).attr("href"))&&s.replace(/.*(?=#[^\s]+$)/,"")||"")+" .nav li > a",this.$body=e("body"),this.refresh(),this.process()}t.prototype={constructor:t,refresh:function(){var t=this,n;this.offsets=e([]),this.targets=e([]),n=this.$body.find(this.selector).map(function(){var n=e(this),r=n.data("target")||n.attr("href"),i=/^#\w/.test(r)&&e(r);return i&&i.length&&[[i.position().top+(!e.isWindow(t.$scrollElement.get(0))&&t.$scrollElement.scrollTop()),r]]||null}).sort(function(e,t){return e[0]-t[0]}).each(function(){t.offsets.push(this[0]),t.targets.push(this[1])})},process:function(){var e=this.$scrollElement.scrollTop()+this.options.offset,t=this.$scrollElement[0].scrollHeight||this.$body[0].scrollHeight,n=t-this.$scrollElement.height(),r=this.offsets,i=this.targets,s=this.activeTarget,o;if(e>=n)return s!=(o=i.last()[0])&&this.activate(o);for(o=r.length;o--;)s!=i[o]&&e>=r[o]&&(!r[o+1]||e<=r[o+1])&&this.activate(i[o])},activate:function(t){var 
n,r;this.activeTarget=t,e(this.selector).parent(".active").removeClass("active"),r=this.selector+'[data-target="'+t+'"],'+this.selector+'[href="'+t+'"]',n=e(r).parent("li").addClass("active"),n.parent(".dropdown-menu").length&&(n=n.closest("li.dropdown").addClass("active")),n.trigger("activate")}};var n=e.fn.scrollspy;e.fn.scrollspy=function(n){return this.each(function(){var r=e(this),i=r.data("scrollspy"),s=typeof n=="object"&&n;i||r.data("scrollspy",i=new t(this,s)),typeof n=="string"&&i[n]()})},e.fn.scrollspy.Constructor=t,e.fn.scrollspy.defaults={offset:10},e.fn.scrollspy.noConflict=function(){return e.fn.scrollspy=n,this},e(window).on("load",function(){e('[data-spy="scroll"]').each(function(){var t=e(this);t.scrollspy(t.data())})})}(window.jQuery),!function(e){"use strict";var t=function(t){this.element=e(t)};t.prototype={constructor:t,show:function(){var t=this.element,n=t.closest("ul:not(.dropdown-menu)"),r=t.attr("data-target"),i,s,o;r||(r=t.attr("href"),r=r&&r.replace(/.*(?=#[^\s]*$)/,""));if(t.parent("li").hasClass("active"))return;i=n.find(".active:last a")[0],o=e.Event("show",{relatedTarget:i}),t.trigger(o);if(o.isDefaultPrevented())return;s=e(r),this.activate(t.parent("li"),n),this.activate(s,s.parent(),function(){t.trigger({type:"shown",relatedTarget:i})})},activate:function(t,n,r){function o(){i.removeClass("active").find("> .dropdown-menu > .active").removeClass("active"),t.addClass("active"),s?(t[0].offsetWidth,t.addClass("in")):t.removeClass("fade"),t.parent(".dropdown-menu")&&t.closest("li.dropdown").addClass("active"),r&&r()}var i=n.find("> .active"),s=r&&e.support.transition&&i.hasClass("fade");s?i.one(e.support.transition.end,o):o(),i.removeClass("in")}};var n=e.fn.tab;e.fn.tab=function(n){return this.each(function(){var r=e(this),i=r.data("tab");i||r.data("tab",i=new t(this)),typeof n=="string"&&i[n]()})},e.fn.tab.Constructor=t,e.fn.tab.noConflict=function(){return e.fn.tab=n,this},e(document).on("click.tab.data-api",'[data-toggle="tab"], [data-toggle="pill"]',function(t){t.preventDefault(),e(this).tab("show")})}(window.jQuery),!function(e){"use strict";var t=function(t,n){this.$element=e(t),this.options=e.extend({},e.fn.typeahead.defaults,n),this.matcher=this.options.matcher||this.matcher,this.sorter=this.options.sorter||this.sorter,this.highlighter=this.options.highlighter||this.highlighter,this.updater=this.options.updater||this.updater,this.source=this.options.source,this.$menu=e(this.options.menu),this.shown=!1,this.listen()};t.prototype={constructor:t,select:function(){var e=this.$menu.find(".active").attr("data-value");return this.$element.val(this.updater(e)).change(),this.hide()},updater:function(e){return e},show:function(){var t=e.extend({},this.$element.position(),{height:this.$element[0].offsetHeight});return this.$menu.insertAfter(this.$element).css({top:t.top+t.height,left:t.left}).show(),this.shown=!0,this},hide:function(){return this.$menu.hide(),this.shown=!1,this},lookup:function(t){var n;return this.query=this.$element.val(),!this.query||this.query.length<this.options.minLength?this.shown?this.hide():this:(n=e.isFunction(this.source)?this.source(this.query,e.proxy(this.process,this)):this.source,n?this.process(n):this)},process:function(t){var n=this;return t=e.grep(t,function(e){return n.matcher(e)}),t=this.sorter(t),t.length?this.render(t.slice(0,this.options.items)).show():this.shown?this.hide():this},matcher:function(e){return~e.toLowerCase().indexOf(this.query.toLowerCase())},sorter:function(e){var 
t=[],n=[],r=[],i;while(i=e.shift())i.toLowerCase().indexOf(this.query.toLowerCase())?~i.indexOf(this.query)?n.push(i):r.push(i):t.push(i);return t.concat(n,r)},highlighter:function(e){var t=this.query.replace(/[\-\[\]{}()*+?.,\\\^$|#\s]/g,"\\$&");return e.replace(new RegExp("("+t+")","ig"),function(e,t){return"<strong>"+t+"</strong>"})},render:function(t){var n=this;return t=e(t).map(function(t,r){return t=e(n.options.item).attr("data-value",r),t.find("a").html(n.highlighter(r)),t[0]}),t.first().addClass("active"),this.$menu.html(t),this},next:function(t){var n=this.$menu.find(".active").removeClass("active"),r=n.next();r.length||(r=e(this.$menu.find("li")[0])),r.addClass("active")},prev:function(e){var t=this.$menu.find(".active").removeClass("active"),n=t.prev();n.length||(n=this.$menu.find("li").last()),n.addClass("active")},listen:function(){this.$element.on("focus",e.proxy(this.focus,this)).on("blur",e.proxy(this.blur,this)).on("keypress",e.proxy(this.keypress,this)).on("keyup",e.proxy(this.keyup,this)),this.eventSupported("keydown")&&this.$element.on("keydown",e.proxy(this.keydown,this)),this.$menu.on("click",e.proxy(this.click,this)).on("mouseenter","li",e.proxy(this.mouseenter,this)).on("mouseleave","li",e.proxy(this.mouseleave,this))},eventSupported:function(e){var t=e in this.$element;return t||(this.$element.setAttribute(e,"return;"),t=typeof this.$element[e]=="function"),t},move:function(e){if(!this.shown)return;switch(e.keyCode){case 9:case 13:case 27:e.preventDefault();break;case 38:e.preventDefault(),this.prev();break;case 40:e.preventDefault(),this.next()}e.stopPropagation()},keydown:function(t){this.suppressKeyPressRepeat=~e.inArray(t.keyCode,[40,38,9,13,27]),this.move(t)},keypress:function(e){if(this.suppressKeyPressRepeat)return;this.move(e)},keyup:function(e){switch(e.keyCode){case 40:case 38:case 16:case 17:case 18:break;case 9:case 13:if(!this.shown)return;this.select();break;case 27:if(!this.shown)return;this.hide();break;default:this.lookup()}e.stopPropagation(),e.preventDefault()},focus:function(e){this.focused=!0},blur:function(e){this.focused=!1,!this.mousedover&&this.shown&&this.hide()},click:function(e){e.stopPropagation(),e.preventDefault(),this.select(),this.$element.focus()},mouseenter:function(t){this.mousedover=!0,this.$menu.find(".active").removeClass("active"),e(t.currentTarget).addClass("active")},mouseleave:function(e){this.mousedover=!1,!this.focused&&this.shown&&this.hide()}};var n=e.fn.typeahead;e.fn.typeahead=function(n){return this.each(function(){var r=e(this),i=r.data("typeahead"),s=typeof n=="object"&&n;i||r.data("typeahead",i=new t(this,s)),typeof n=="string"&&i[n]()})},e.fn.typeahead.defaults={source:[],items:8,menu:'<ul class="typeahead dropdown-menu"></ul>',item:'<li><a href="#"></a></li>',minLength:1},e.fn.typeahead.Constructor=t,e.fn.typeahead.noConflict=function(){return e.fn.typeahead=n,this},e(document).on("focus.typeahead.data-api",'[data-provide="typeahead"]',function(t){var n=e(this);if(n.data("typeahead"))return;n.typeahead(n.data())})}(window.jQuery),!function(e){"use strict";var t=function(t,n){this.options=e.extend({},e.fn.affix.defaults,n),this.$window=e(window).on("scroll.affix.data-api",e.proxy(this.checkPosition,this)).on("click.affix.data-api",e.proxy(function(){setTimeout(e.proxy(this.checkPosition,this),1)},this)),this.$element=e(t),this.checkPosition()};t.prototype.checkPosition=function(){if(!this.$element.is(":visible"))return;var 
t=e(document).height(),n=this.$window.scrollTop(),r=this.$element.offset(),i=this.options.offset,s=i.bottom,o=i.top,u="affix affix-top affix-bottom",a;typeof i!="object"&&(s=o=i),typeof o=="function"&&(o=i.top()),typeof s=="function"&&(s=i.bottom()),a=this.unpin!=null&&n+this.unpin<=r.top?!1:s!=null&&r.top+this.$element.height()>=t-s?"bottom":o!=null&&n<=o?"top":!1;if(this.affixed===a)return;this.affixed=a,this.unpin=a=="bottom"?r.top-n:null,this.$element.removeClass(u).addClass("affix"+(a?"-"+a:""))};var n=e.fn.affix;e.fn.affix=function(n){return this.each(function(){var r=e(this),i=r.data("affix"),s=typeof n=="object"&&n;i||r.data("affix",i=new t(this,s)),typeof n=="string"&&i[n]()})},e.fn.affix.Constructor=t,e.fn.affix.defaults={offset:0},e.fn.affix.noConflict=function(){return e.fn.affix=n,this},e(window).on("load",function(){e('[data-spy="affix"]').each(function(){var t=e(this),n=t.data();n.offset=n.offset||{},n.offsetBottom&&(n.offset.bottom=n.offsetBottom),n.offsetTop&&(n.offset.top=n.offsetTop),t.affix(n)})})}(window.jQuery);
\ No newline at end of file
diff --git a/bitbake/lib/toaster/toastergui/static/js/filtersnippet.js b/bitbake/lib/toaster/toastergui/static/js/filtersnippet.js
new file mode 100755
index 0000000..2b84c54
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/filtersnippet.js
@@ -0,0 +1,94 @@
+"use strict"
+
+// Disabling removes the 'datepicker' attribute and its
+// settings, so the datepicker has to be re-initialized each
+// time the date range is selected and enabled.
+// Plain DOM is used instead of jQuery to find the elements
+// in all contexts
+function date_enable (key, action) {
+
+ var elemFrom=document.getElementById("date_from_"+key);
+ var elemTo=document.getElementById("date_to_"+key);
+
+ if ('enable' == action) {
+ elemFrom.removeAttribute("disabled");
+ elemTo.removeAttribute("disabled");
+
+ $(elemFrom).datepicker();
+ $(elemTo).datepicker();
+
+ $(elemFrom).datepicker( "option", "dateFormat", "dd/mm/yy" );
+ $(elemTo).datepicker( "option", "dateFormat", "dd/mm/yy" );
+
+ $(elemFrom).datepicker( "setDate", elemFrom.getAttribute( "data-setDate") );
+ $(elemTo).datepicker( "setDate", elemTo.getAttribute( "data-setDate") );
+ $(elemFrom).datepicker( "option", "minDate", elemFrom.getAttribute( "data-minDate"));
+ $(elemTo).datepicker( "option", "minDate", elemTo.getAttribute( "data-minDate"));
+ $(elemFrom).datepicker( "option", "maxDate", elemFrom.getAttribute( "data-maxDate"));
+ $(elemTo).datepicker( "option", "maxDate", elemTo.getAttribute( "data-maxDate"));
+ } else {
+ elemFrom.setAttribute("disabled","disabled");
+ elemTo.setAttribute("disabled","disabled");
+ }
+}
+
+// Initialize the date picker elements with their default state variables, and
+// register the radio button and form actions
+function date_init (key, from_date, to_date, min_date, max_date, initial_enable) {
+
+ var elemFrom=document.getElementById("date_from_"+key);
+ var elemTo=document.getElementById("date_to_"+key);
+
+ // Bail out if no daterange filter elements were instantiated (e.g. no builds found)
+ if (null == elemFrom) {
+ return;
+ }
+
+ // init the datepicker context data
+ elemFrom.setAttribute( "data-setDate", from_date );
+ elemTo.setAttribute( "data-setDate", to_date );
+ elemFrom.setAttribute( "data-minDate", min_date);
+ elemTo.setAttribute( "data-minDate", min_date);
+ elemFrom.setAttribute( "data-maxDate", max_date);
+ elemTo.setAttribute( "data-maxDate", max_date);
+
+ // does the date set start enabled?
+ if (key == initial_enable) {
+ date_enable (key, "enable");
+ } else {
+ date_enable (key, "disable");
+ }
+
+ // catch the radio button selects for enable/disable
+ $('input:radio[name="filter"]').change(function(){
+ if ($(this).val() == 'daterange') {
+ key=$(this).attr("data-key");
+ date_enable (key, 'enable');
+ } else {
+ key=$(this).attr("data-key");
+ date_enable (key, 'disable');
+ }
+ });
+
+ // catch any new 'from' date as minDate for 'to' date
+ $("#date_from_"+key).change(function(){
+ from_date = $("#date_from_"+key).val();
+ $("#date_to_"+key).datepicker( "option", "minDate", from_date );
+ });
+
+ // catch the submit (just once)
+ $("form").unbind('submit');
+ $("form").submit(function(e) {
+ // format a composite daterange filter value so that it can be parsed and post-processed in the view
+ if (key !== undefined) {
+ if ($("#date_from_"+key).length) {
+ var filter=key+"__gte!"+key+"__lt:"+$("#date_from_"+key).val()+"!"+$("#date_to_"+key).val()+"_daterange";
+ $("#last_date_from_"+key).val($("#date_from_"+key).val());
+ $("#last_date_to_"+key).val($("#date_to_"+key).val());
+ $("#filter_value_"+key).val(filter);
+ }
+ }
+ return true;
+ });
+
+};
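For orientation only (this is not part of the patch), the composite daterange filter value built in the submit handler above follows the pattern <key>__gte!<key>__lt:<from>!<to>_daterange. A minimal sketch, assuming a hypothetical filter key of "created" and made-up dates in the dd/mm/yy format the datepickers are configured with:

    // Hypothetical key and dates; only the string format comes from the handler above
    var key = "created";
    var from = "01/06/2015";
    var to = "30/06/2015";
    var filter = key + "__gte!" + key + "__lt:" + from + "!" + to + "_daterange";
    // -> "created__gte!created__lt:01/06/2015!30/06/2015_daterange"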
diff --git a/bitbake/lib/toaster/toastergui/static/js/importlayer.js b/bitbake/lib/toaster/toastergui/static/js/importlayer.js
new file mode 100644
index 0000000..2fadbc0
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/importlayer.js
@@ -0,0 +1,276 @@
+"use strict"
+
+function importLayerPageInit (ctx) {
+
+ var layerDepBtn = $("#add-layer-dependency-btn");
+ var importAndAddBtn = $("#import-and-add-btn");
+ var layerNameInput = $("#import-layer-name");
+ var vcsURLInput = $("#layer-git-repo-url");
+ var gitRefInput = $("#layer-git-ref");
+ var layerDepInput = $("#layer-dependency");
+ var layerNameCtrl = $("#layer-name-ctrl");
+ var duplicatedLayerName = $("#duplicated-layer-name-hint");
+
+ var layerDeps = {};
+ var layerDepsDeps = {};
+ var currentLayerDepSelection;
+ var validLayerName = /^(\w|-)+$/;
+
+ libtoaster.makeTypeahead(layerDepInput, libtoaster.ctx.layersTypeAheadUrl, { include_added: "true" }, function(item){
+ currentLayerDepSelection = item;
+
+ layerDepBtn.removeAttr("disabled");
+ });
+
+
+ /* We automatically add the "openembedded-core" layer as a dependency
+ * for convenience, since pretty much all layers depend on it
+ */
+ $.getJSON(libtoaster.ctx.layersTypeAheadUrl,
+ { include_added: "true" , search: "openembedded-core" },
+ function(layer) {
+ if (layer.results.length > 0) {
+ currentLayerDepSelection = layer.results[0];
+ layerDepBtn.click();
+ }
+ });
+
+ layerDepBtn.click(function(){
+ if (currentLayerDepSelection == undefined)
+ return;
+
+ layerDeps[currentLayerDepSelection.id] = currentLayerDepSelection;
+
+ /* Make a list item for the new layer dependency */
+ var newLayerDep = $("<li><a></a><span class=\"icon-trash\" data-toggle=\"tooltip\" title=\"Delete\"></span></li>");
+
+ newLayerDep.data('layer-id', currentLayerDepSelection.id);
+ newLayerDep.children("span").tooltip();
+
+ var link = newLayerDep.children("a");
+ link.attr("href", currentLayerDepSelection.layerdetailurl);
+ link.text(currentLayerDepSelection.name);
+ link.tooltip({title: currentLayerDepSelection.tooltip, placement: "right"});
+
+ var trashItem = newLayerDep.children("span");
+ trashItem.click(function () {
+ var toRemove = $(this).parent().data('layer-id');
+ delete layerDeps[toRemove];
+ $(this).parent().fadeOut(function (){
+ $(this).remove();
+ });
+ });
+
+ $("#layer-deps-list").append(newLayerDep);
+
+ libtoaster.getLayerDepsForProject(currentLayerDepSelection.layerdetailurl, function (data){
+ /* These are the dependencies of the layer added as a dependency */
+ if (data.list.length > 0) {
+ currentLayerDepSelection.url = currentLayerDepSelection.layerdetailurl;
+ layerDeps[currentLayerDepSelection.id].deps = data.list;
+ }
+
+ /* Clear the current selection */
+ layerDepInput.val("");
+ currentLayerDepSelection = undefined;
+ layerDepBtn.attr("disabled","disabled");
+ }, null);
+ });
+
+ importAndAddBtn.click(function(){
+ /* This is a list of the names from layerDeps for the layer deps
+ * modal dialog body
+ */
+ var depNames = [];
+
+ /* array of all layer dep ids; includes parent and child deps */
+ var allDeps = [];
+
+ /* temporary object used to reduce the dependencies of each layer
+ * dependency added
+ */
+ var depDeps = {};
+
+ /* layers that have dependencies carry an extra "deps" property;
+ * look at this for each layer and reduce it to a unique object
+ * of deps.
+ */
+ for (var key in layerDeps){
+ if (layerDeps[key].hasOwnProperty('deps')){
+ for (var dep in layerDeps[key].deps){
+ var layer = layerDeps[key].deps[dep];
+ depDeps[layer.id] = layer;
+ }
+ }
+ depNames.push(layerDeps[key].name);
+ allDeps.push(layerDeps[key].id);
+ }
+
+ /* we actually want it as an array so convert it now */
+ var depDepsArray = [];
+ for (var key in depDeps)
+ depDepsArray.push (depDeps[key]);
+
+ if (depDepsArray.length > 0) {
+ var layer = { name: layerNameInput.val(), url: "#", id: -1 };
+ var title = "Layer";
+ var body = "<strong>"+layer.name+"</strong>'s dependencies ("+
+ depNames.join(", ")+"</span>) require some layers that are not added to your project. Select the ones you want to add:</p>";
+
+ showLayerDepsModal(layer, depDepsArray, title, body, false, function(layerObsList){
+ /* Add the accepted layer dependencies' ids to the allDeps array */
+ for (var key in layerObsList){
+ allDeps.push(layerObsList[key].id);
+ }
+ import_and_add ();
+ });
+ } else {
+ import_and_add ();
+ }
+
+ function import_and_add () {
+ /* convert to a csv of all the deps to be added */
+ var layerDepsCsv = allDeps.join(",");
+
+ var layerData = {
+ name: layerNameInput.val(),
+ vcs_url: vcsURLInput.val(),
+ git_ref: gitRefInput.val(),
+ dir_path: $("#layer-subdir").val(),
+ project_id: libtoaster.ctx.projectId,
+ layer_deps: layerDepsCsv,
+ };
+
+ $.ajax({
+ type: "POST",
+ url: ctx.xhrImportLayerUrl,
+ data: layerData,
+ headers: { 'X-CSRFToken' : $.cookie('csrftoken')},
+ success: function (data) {
+ if (data.error != "ok") {
+ console.log(data.error);
+ } else {
+ /* Layer import succeeded, now go to the project page */
+ $.cookie('layer-imported-alert', JSON.stringify(data), { path: '/'});
+ window.location.replace(libtoaster.ctx.projectPageUrl+'?notify=layer-imported');
+ }
+ },
+ error: function (data) {
+ console.log("Call failed");
+ console.log(data);
+ }
+ });
+ }
+ });
+
+ function enable_import_btn(enabled) {
+ var importAndAddHint = $("#import-and-add-hint");
+
+ if (enabled) {
+ importAndAddBtn.removeAttr("disabled");
+ importAndAddHint.hide();
+ return;
+ }
+
+ importAndAddBtn.attr("disabled", "disabled");
+ importAndAddHint.show();
+ }
+
+ function check_form() {
+ var valid = false;
+ var inputs = $("input:required");
+
+ for (var i=0; i<inputs.length; i++){
+ if (!(valid = inputs[i].value)){
+ enable_import_btn(false);
+ break;
+ }
+ }
+
+ if (valid)
+ enable_import_btn(true);
+ }
+
+ function layerExistsError(layer){
+ var dupLayerInfo = $("#duplicate-layer-info");
+ dupLayerInfo.find(".dup-layer-name").text(layer.name);
+ dupLayerInfo.find(".dup-layer-link").attr("href", layer.layerdetailurl);
+ dupLayerInfo.find("#dup-layer-vcs-url").text(layer.layer__vcs_url);
+ dupLayerInfo.find("#dup-layer-revision").text(layer.revision.commit);
+
+ $(".fields-apart-from-layer-name").fadeOut(function(){
+
+ dupLayerInfo.fadeIn();
+ });
+ }
+
+ layerNameInput.on('blur', function() {
+ if (!$(this).val()){
+ return;
+ }
+ var name = $(this).val();
+
+ /* Check if the layer name exists */
+ $.getJSON(libtoaster.ctx.layersTypeAheadUrl,
+ { include_added: "true" , search: name, format: "json" },
+ function(layer) {
+ if (layer.rows.length > 0) {
+ for (var i in layer.rows){
+ if (layer.rows[i].name == name) {
+ console.log(layer.rows[i])
+ layerExistsError(layer.rows[i]);
+ }
+ }
+ }
+ });
+ });
+
+ vcsURLInput.on('input', function() {
+ check_form();
+ });
+
+ gitRefInput.on('input', function() {
+ check_form();
+ });
+
+ layerNameInput.on('input', function() {
+ if ($(this).val() && !validLayerName.test($(this).val())){
+ layerNameCtrl.addClass("error")
+ $("#invalid-layer-name-hint").show();
+ enable_import_btn(false);
+ return;
+ }
+
+ if ($("#duplicate-layer-info").css("display") != "None"){
+ $("#duplicate-layer-info").fadeOut(function(){
+ $(".fields-apart-from-layer-name").show();
+ });
+
+ }
+
+ /* Don't remove the error class if we're displaying the error for another
+ * reason.
+ */
+ if (!duplicatedLayerName.is(":visible"))
+ layerNameCtrl.removeClass("error")
+
+ $("#invalid-layer-name-hint").hide();
+ check_form();
+ });
+
+ /* Have a guess at the layer name */
+ vcsURLInput.focusout(function (){
+ /* If a layer name is already specified don't overwrite it, and if there
+ * isn't a url typed in yet just return
+ */
+ if (layerNameInput.val() || !$(this).val())
+ return;
+
+ if ($(this).val().search("/")){
+ var urlPts = $(this).val().split("/");
+ var suggestion = urlPts[urlPts.length-1].replace(".git","");
+ layerNameInput.val(suggestion);
+ }
+ });
+
+}
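Again for orientation only (not part of the patch), the import-and-add handler above reduces the dependencies of every selected layer into an object keyed by layer id, which drops duplicates, and then joins the ids into the layer_deps CSV that is POSTed to ctx.xhrImportLayerUrl. A minimal sketch with made-up layer data; in the real handler the child dependency ids are only pushed onto allDeps after the user accepts them in the deps modal:

    // Made-up data shaped like layerDeps in the handler above
    var layerDeps = {
      1: { id: 1, name: "meta-a", deps: [ { id: 3, name: "meta-common" } ] },
      2: { id: 2, name: "meta-b", deps: [ { id: 3, name: "meta-common" } ] }
    };
    var depDeps = {};   // keyed by id, so the shared dependency collapses to one entry
    var allDeps = [];
    for (var key in layerDeps) {
      if (layerDeps[key].hasOwnProperty('deps')) {
        for (var dep in layerDeps[key].deps) {
          depDeps[layerDeps[key].deps[dep].id] = layerDeps[key].deps[dep];
        }
      }
      allDeps.push(layerDeps[key].id);
    }
    for (var id in depDeps)
      allDeps.push(depDeps[id].id);
    var layerDepsCsv = allDeps.join(",");   // -> "1,2,3"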
diff --git a/bitbake/lib/toaster/toastergui/static/js/jquery-2.0.3.min.js b/bitbake/lib/toaster/toastergui/static/js/jquery-2.0.3.min.js
new file mode 100644
index 0000000..2be209d
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/jquery-2.0.3.min.js
@@ -0,0 +1,6 @@
+/*! jQuery v2.0.3 | (c) 2005, 2013 jQuery Foundation, Inc. | jquery.org/license
+//@ sourceMappingURL=jquery-2.0.3.min.map
+*/
+(function(e,undefined){var t,n,r=typeof undefined,i=e.location,o=e.document,s=o.documentElement,a=e.jQuery,u=e.$,l={},c=[],p="2.0.3",f=c.concat,h=c.push,d=c.slice,g=c.indexOf,m=l.toString,y=l.hasOwnProperty,v=p.trim,x=function(e,n){return new x.fn.init(e,n,t)},b=/[+-]?(?:\d*\.|)\d+(?:[eE][+-]?\d+|)/.source,w=/\S+/g,T=/^(?:\s*(<[\w\W]+>)[^>]*|#([\w-]*))$/,C=/^<(\w+)\s*\/?>(?:<\/\1>|)$/,k=/^-ms-/,N=/-([\da-z])/gi,E=function(e,t){return t.toUpperCase()},S=function(){o.removeEventListener("DOMContentLoaded",S,!1),e.removeEventListener("load",S,!1),x.ready()};x.fn=x.prototype={jquery:p,constructor:x,init:function(e,t,n){var r,i;if(!e)return this;if("string"==typeof e){if(r="<"===e.charAt(0)&&">"===e.charAt(e.length-1)&&e.length>=3?[null,e,null]:T.exec(e),!r||!r[1]&&t)return!t||t.jquery?(t||n).find(e):this.constructor(t).find(e);if(r[1]){if(t=t instanceof x?t[0]:t,x.merge(this,x.parseHTML(r[1],t&&t.nodeType?t.ownerDocument||t:o,!0)),C.test(r[1])&&x.isPlainObject(t))for(r in t)x.isFunction(this[r])?this[r](t[r]):this.attr(r,t[r]);return this}return i=o.getElementById(r[2]),i&&i.parentNode&&(this.length=1,this[0]=i),this.context=o,this.selector=e,this}return e.nodeType?(this.context=this[0]=e,this.length=1,this):x.isFunction(e)?n.ready(e):(e.selector!==undefined&&(this.selector=e.selector,this.context=e.context),x.makeArray(e,this))},selector:"",length:0,toArray:function(){return d.call(this)},get:function(e){return null==e?this.toArray():0>e?this[this.length+e]:this[e]},pushStack:function(e){var t=x.merge(this.constructor(),e);return t.prevObject=this,t.context=this.context,t},each:function(e,t){return x.each(this,e,t)},ready:function(e){return x.ready.promise().done(e),this},slice:function(){return this.pushStack(d.apply(this,arguments))},first:function(){return this.eq(0)},last:function(){return this.eq(-1)},eq:function(e){var t=this.length,n=+e+(0>e?t:0);return this.pushStack(n>=0&&t>n?[this[n]]:[])},map:function(e){return this.pushStack(x.map(this,function(t,n){return e.call(t,n,t)}))},end:function(){return this.prevObject||this.constructor(null)},push:h,sort:[].sort,splice:[].splice},x.fn.init.prototype=x.fn,x.extend=x.fn.extend=function(){var e,t,n,r,i,o,s=arguments[0]||{},a=1,u=arguments.length,l=!1;for("boolean"==typeof s&&(l=s,s=arguments[1]||{},a=2),"object"==typeof s||x.isFunction(s)||(s={}),u===a&&(s=this,--a);u>a;a++)if(null!=(e=arguments[a]))for(t in e)n=s[t],r=e[t],s!==r&&(l&&r&&(x.isPlainObject(r)||(i=x.isArray(r)))?(i?(i=!1,o=n&&x.isArray(n)?n:[]):o=n&&x.isPlainObject(n)?n:{},s[t]=x.extend(l,o,r)):r!==undefined&&(s[t]=r));return s},x.extend({expando:"jQuery"+(p+Math.random()).replace(/\D/g,""),noConflict:function(t){return e.$===x&&(e.$=u),t&&e.jQuery===x&&(e.jQuery=a),x},isReady:!1,readyWait:1,holdReady:function(e){e?x.readyWait++:x.ready(!0)},ready:function(e){(e===!0?--x.readyWait:x.isReady)||(x.isReady=!0,e!==!0&&--x.readyWait>0||(n.resolveWith(o,[x]),x.fn.trigger&&x(o).trigger("ready").off("ready")))},isFunction:function(e){return"function"===x.type(e)},isArray:Array.isArray,isWindow:function(e){return null!=e&&e===e.window},isNumeric:function(e){return!isNaN(parseFloat(e))&&isFinite(e)},type:function(e){return null==e?e+"":"object"==typeof e||"function"==typeof e?l[m.call(e)]||"object":typeof e},isPlainObject:function(e){if("object"!==x.type(e)||e.nodeType||x.isWindow(e))return!1;try{if(e.constructor&&!y.call(e.constructor.prototype,"isPrototypeOf"))return!1}catch(t){return!1}return!0},isEmptyObject:function(e){var t;for(t in e)return!1;return!0},error:function(e){throw 
Error(e)},parseHTML:function(e,t,n){if(!e||"string"!=typeof e)return null;"boolean"==typeof t&&(n=t,t=!1),t=t||o;var r=C.exec(e),i=!n&&[];return r?[t.createElement(r[1])]:(r=x.buildFragment([e],t,i),i&&x(i).remove(),x.merge([],r.childNodes))},parseJSON:JSON.parse,parseXML:function(e){var t,n;if(!e||"string"!=typeof e)return null;try{n=new DOMParser,t=n.parseFromString(e,"text/xml")}catch(r){t=undefined}return(!t||t.getElementsByTagName("parsererror").length)&&x.error("Invalid XML: "+e),t},noop:function(){},globalEval:function(e){var t,n=eval;e=x.trim(e),e&&(1===e.indexOf("use strict")?(t=o.createElement("script"),t.text=e,o.head.appendChild(t).parentNode.removeChild(t)):n(e))},camelCase:function(e){return e.replace(k,"ms-").replace(N,E)},nodeName:function(e,t){return e.nodeName&&e.nodeName.toLowerCase()===t.toLowerCase()},each:function(e,t,n){var r,i=0,o=e.length,s=j(e);if(n){if(s){for(;o>i;i++)if(r=t.apply(e[i],n),r===!1)break}else for(i in e)if(r=t.apply(e[i],n),r===!1)break}else if(s){for(;o>i;i++)if(r=t.call(e[i],i,e[i]),r===!1)break}else for(i in e)if(r=t.call(e[i],i,e[i]),r===!1)break;return e},trim:function(e){return null==e?"":v.call(e)},makeArray:function(e,t){var n=t||[];return null!=e&&(j(Object(e))?x.merge(n,"string"==typeof e?[e]:e):h.call(n,e)),n},inArray:function(e,t,n){return null==t?-1:g.call(t,e,n)},merge:function(e,t){var n=t.length,r=e.length,i=0;if("number"==typeof n)for(;n>i;i++)e[r++]=t[i];else while(t[i]!==undefined)e[r++]=t[i++];return e.length=r,e},grep:function(e,t,n){var r,i=[],o=0,s=e.length;for(n=!!n;s>o;o++)r=!!t(e[o],o),n!==r&&i.push(e[o]);return i},map:function(e,t,n){var r,i=0,o=e.length,s=j(e),a=[];if(s)for(;o>i;i++)r=t(e[i],i,n),null!=r&&(a[a.length]=r);else for(i in e)r=t(e[i],i,n),null!=r&&(a[a.length]=r);return f.apply([],a)},guid:1,proxy:function(e,t){var n,r,i;return"string"==typeof t&&(n=e[t],t=e,e=n),x.isFunction(e)?(r=d.call(arguments,2),i=function(){return e.apply(t||this,r.concat(d.call(arguments)))},i.guid=e.guid=e.guid||x.guid++,i):undefined},access:function(e,t,n,r,i,o,s){var a=0,u=e.length,l=null==n;if("object"===x.type(n)){i=!0;for(a in n)x.access(e,t,a,n[a],!0,o,s)}else if(r!==undefined&&(i=!0,x.isFunction(r)||(s=!0),l&&(s?(t.call(e,r),t=null):(l=t,t=function(e,t,n){return l.call(x(e),n)})),t))for(;u>a;a++)t(e[a],n,s?r:r.call(e[a],a,t(e[a],n)));return i?e:l?t.call(e):u?t(e[0],n):o},now:Date.now,swap:function(e,t,n,r){var i,o,s={};for(o in t)s[o]=e.style[o],e.style[o]=t[o];i=n.apply(e,r||[]);for(o in t)e.style[o]=s[o];return i}}),x.ready.promise=function(t){return n||(n=x.Deferred(),"complete"===o.readyState?setTimeout(x.ready):(o.addEventListener("DOMContentLoaded",S,!1),e.addEventListener("load",S,!1))),n.promise(t)},x.each("Boolean Number String Function Array Date RegExp Object Error".split(" "),function(e,t){l["[object "+t+"]"]=t.toLowerCase()});function j(e){var t=e.length,n=x.type(e);return x.isWindow(e)?!1:1===e.nodeType&&t?!0:"array"===n||"function"!==n&&(0===t||"number"==typeof t&&t>0&&t-1 in e)}t=x(o),function(e,undefined){var t,n,r,i,o,s,a,u,l,c,p,f,h,d,g,m,y,v="sizzle"+-new Date,b=e.document,w=0,T=0,C=st(),k=st(),N=st(),E=!1,S=function(e,t){return e===t?(E=!0,0):0},j=typeof undefined,D=1<<31,A={}.hasOwnProperty,L=[],q=L.pop,H=L.push,O=L.push,F=L.slice,P=L.indexOf||function(e){var t=0,n=this.length;for(;n>t;t++)if(this[t]===e)return 
t;return-1},R="checked|selected|async|autofocus|autoplay|controls|defer|disabled|hidden|ismap|loop|multiple|open|readonly|required|scoped",M="[\\x20\\t\\r\\n\\f]",W="(?:\\\\.|[\\w-]|[^\\x00-\\xa0])+",$=W.replace("w","w#"),B="\\["+M+"*("+W+")"+M+"*(?:([*^$|!~]?=)"+M+"*(?:(['\"])((?:\\\\.|[^\\\\])*?)\\3|("+$+")|)|)"+M+"*\\]",I=":("+W+")(?:\\(((['\"])((?:\\\\.|[^\\\\])*?)\\3|((?:\\\\.|[^\\\\()[\\]]|"+B.replace(3,8)+")*)|.*)\\)|)",z=RegExp("^"+M+"+|((?:^|[^\\\\])(?:\\\\.)*)"+M+"+$","g"),_=RegExp("^"+M+"*,"+M+"*"),X=RegExp("^"+M+"*([>+~]|"+M+")"+M+"*"),U=RegExp(M+"*[+~]"),Y=RegExp("="+M+"*([^\\]'\"]*)"+M+"*\\]","g"),V=RegExp(I),G=RegExp("^"+$+"$"),J={ID:RegExp("^#("+W+")"),CLASS:RegExp("^\\.("+W+")"),TAG:RegExp("^("+W.replace("w","w*")+")"),ATTR:RegExp("^"+B),PSEUDO:RegExp("^"+I),CHILD:RegExp("^:(only|first|last|nth|nth-last)-(child|of-type)(?:\\("+M+"*(even|odd|(([+-]|)(\\d*)n|)"+M+"*(?:([+-]|)"+M+"*(\\d+)|))"+M+"*\\)|)","i"),bool:RegExp("^(?:"+R+")$","i"),needsContext:RegExp("^"+M+"*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\\("+M+"*((?:-\\d)?\\d*)"+M+"*\\)|)(?=[^-]|$)","i")},Q=/^[^{]+\{\s*\[native \w/,K=/^(?:#([\w-]+)|(\w+)|\.([\w-]+))$/,Z=/^(?:input|select|textarea|button)$/i,et=/^h\d$/i,tt=/'|\\/g,nt=RegExp("\\\\([\\da-f]{1,6}"+M+"?|("+M+")|.)","ig"),rt=function(e,t,n){var r="0x"+t-65536;return r!==r||n?t:0>r?String.fromCharCode(r+65536):String.fromCharCode(55296|r>>10,56320|1023&r)};try{O.apply(L=F.call(b.childNodes),b.childNodes),L[b.childNodes.length].nodeType}catch(it){O={apply:L.length?function(e,t){H.apply(e,F.call(t))}:function(e,t){var n=e.length,r=0;while(e[n++]=t[r++]);e.length=n-1}}}function ot(e,t,r,i){var o,s,a,u,l,f,g,m,x,w;if((t?t.ownerDocument||t:b)!==p&&c(t),t=t||p,r=r||[],!e||"string"!=typeof e)return r;if(1!==(u=t.nodeType)&&9!==u)return[];if(h&&!i){if(o=K.exec(e))if(a=o[1]){if(9===u){if(s=t.getElementById(a),!s||!s.parentNode)return r;if(s.id===a)return r.push(s),r}else if(t.ownerDocument&&(s=t.ownerDocument.getElementById(a))&&y(t,s)&&s.id===a)return r.push(s),r}else{if(o[2])return O.apply(r,t.getElementsByTagName(e)),r;if((a=o[3])&&n.getElementsByClassName&&t.getElementsByClassName)return O.apply(r,t.getElementsByClassName(a)),r}if(n.qsa&&(!d||!d.test(e))){if(m=g=v,x=t,w=9===u&&e,1===u&&"object"!==t.nodeName.toLowerCase()){f=gt(e),(g=t.getAttribute("id"))?m=g.replace(tt,"\\$&"):t.setAttribute("id",m),m="[id='"+m+"'] ",l=f.length;while(l--)f[l]=m+mt(f[l]);x=U.test(e)&&t.parentNode||t,w=f.join(",")}if(w)try{return O.apply(r,x.querySelectorAll(w)),r}catch(T){}finally{g||t.removeAttribute("id")}}}return kt(e.replace(z,"$1"),t,r,i)}function st(){var e=[];function t(n,r){return e.push(n+=" ")>i.cacheLength&&delete t[e.shift()],t[n]=r}return t}function at(e){return e[v]=!0,e}function ut(e){var t=p.createElement("div");try{return!!e(t)}catch(n){return!1}finally{t.parentNode&&t.parentNode.removeChild(t),t=null}}function lt(e,t){var n=e.split("|"),r=e.length;while(r--)i.attrHandle[n[r]]=t}function ct(e,t){var n=t&&e,r=n&&1===e.nodeType&&1===t.nodeType&&(~t.sourceIndex||D)-(~e.sourceIndex||D);if(r)return r;if(n)while(n=n.nextSibling)if(n===t)return-1;return e?1:-1}function pt(e){return function(t){var n=t.nodeName.toLowerCase();return"input"===n&&t.type===e}}function ft(e){return function(t){var n=t.nodeName.toLowerCase();return("input"===n||"button"===n)&&t.type===e}}function ht(e){return at(function(t){return t=+t,at(function(n,r){var i,o=e([],n.length,t),s=o.length;while(s--)n[i=o[s]]&&(n[i]=!(r[i]=n[i]))})})}s=ot.isXML=function(e){var 
t=e&&(e.ownerDocument||e).documentElement;return t?"HTML"!==t.nodeName:!1},n=ot.support={},c=ot.setDocument=function(e){var t=e?e.ownerDocument||e:b,r=t.defaultView;return t!==p&&9===t.nodeType&&t.documentElement?(p=t,f=t.documentElement,h=!s(t),r&&r.attachEvent&&r!==r.top&&r.attachEvent("onbeforeunload",function(){c()}),n.attributes=ut(function(e){return e.className="i",!e.getAttribute("className")}),n.getElementsByTagName=ut(function(e){return e.appendChild(t.createComment("")),!e.getElementsByTagName("*").length}),n.getElementsByClassName=ut(function(e){return e.innerHTML="<div class='a'></div><div class='a i'></div>",e.firstChild.className="i",2===e.getElementsByClassName("i").length}),n.getById=ut(function(e){return f.appendChild(e).id=v,!t.getElementsByName||!t.getElementsByName(v).length}),n.getById?(i.find.ID=function(e,t){if(typeof t.getElementById!==j&&h){var n=t.getElementById(e);return n&&n.parentNode?[n]:[]}},i.filter.ID=function(e){var t=e.replace(nt,rt);return function(e){return e.getAttribute("id")===t}}):(delete i.find.ID,i.filter.ID=function(e){var t=e.replace(nt,rt);return function(e){var n=typeof e.getAttributeNode!==j&&e.getAttributeNode("id");return n&&n.value===t}}),i.find.TAG=n.getElementsByTagName?function(e,t){return typeof t.getElementsByTagName!==j?t.getElementsByTagName(e):undefined}:function(e,t){var n,r=[],i=0,o=t.getElementsByTagName(e);if("*"===e){while(n=o[i++])1===n.nodeType&&r.push(n);return r}return o},i.find.CLASS=n.getElementsByClassName&&function(e,t){return typeof t.getElementsByClassName!==j&&h?t.getElementsByClassName(e):undefined},g=[],d=[],(n.qsa=Q.test(t.querySelectorAll))&&(ut(function(e){e.innerHTML="<select><option selected=''></option></select>",e.querySelectorAll("[selected]").length||d.push("\\["+M+"*(?:value|"+R+")"),e.querySelectorAll(":checked").length||d.push(":checked")}),ut(function(e){var n=t.createElement("input");n.setAttribute("type","hidden"),e.appendChild(n).setAttribute("t",""),e.querySelectorAll("[t^='']").length&&d.push("[*^$]="+M+"*(?:''|\"\")"),e.querySelectorAll(":enabled").length||d.push(":enabled",":disabled"),e.querySelectorAll("*,:x"),d.push(",.*:")})),(n.matchesSelector=Q.test(m=f.webkitMatchesSelector||f.mozMatchesSelector||f.oMatchesSelector||f.msMatchesSelector))&&ut(function(e){n.disconnectedMatch=m.call(e,"div"),m.call(e,"[s!='']:x"),g.push("!=",I)}),d=d.length&&RegExp(d.join("|")),g=g.length&&RegExp(g.join("|")),y=Q.test(f.contains)||f.compareDocumentPosition?function(e,t){var n=9===e.nodeType?e.documentElement:e,r=t&&t.parentNode;return e===r||!(!r||1!==r.nodeType||!(n.contains?n.contains(r):e.compareDocumentPosition&&16&e.compareDocumentPosition(r)))}:function(e,t){if(t)while(t=t.parentNode)if(t===e)return!0;return!1},S=f.compareDocumentPosition?function(e,r){if(e===r)return E=!0,0;var i=r.compareDocumentPosition&&e.compareDocumentPosition&&e.compareDocumentPosition(r);return i?1&i||!n.sortDetached&&r.compareDocumentPosition(e)===i?e===t||y(b,e)?-1:r===t||y(b,r)?1:l?P.call(l,e)-P.call(l,r):0:4&i?-1:1:e.compareDocumentPosition?-1:1}:function(e,n){var r,i=0,o=e.parentNode,s=n.parentNode,a=[e],u=[n];if(e===n)return E=!0,0;if(!o||!s)return e===t?-1:n===t?1:o?-1:s?1:l?P.call(l,e)-P.call(l,n):0;if(o===s)return ct(e,n);r=e;while(r=r.parentNode)a.unshift(r);r=n;while(r=r.parentNode)u.unshift(r);while(a[i]===u[i])i++;return i?ct(a[i],u[i]):a[i]===b?-1:u[i]===b?1:0},t):p},ot.matches=function(e,t){return 
ot(e,null,null,t)},ot.matchesSelector=function(e,t){if((e.ownerDocument||e)!==p&&c(e),t=t.replace(Y,"='$1']"),!(!n.matchesSelector||!h||g&&g.test(t)||d&&d.test(t)))try{var r=m.call(e,t);if(r||n.disconnectedMatch||e.document&&11!==e.document.nodeType)return r}catch(i){}return ot(t,p,null,[e]).length>0},ot.contains=function(e,t){return(e.ownerDocument||e)!==p&&c(e),y(e,t)},ot.attr=function(e,t){(e.ownerDocument||e)!==p&&c(e);var r=i.attrHandle[t.toLowerCase()],o=r&&A.call(i.attrHandle,t.toLowerCase())?r(e,t,!h):undefined;return o===undefined?n.attributes||!h?e.getAttribute(t):(o=e.getAttributeNode(t))&&o.specified?o.value:null:o},ot.error=function(e){throw Error("Syntax error, unrecognized expression: "+e)},ot.uniqueSort=function(e){var t,r=[],i=0,o=0;if(E=!n.detectDuplicates,l=!n.sortStable&&e.slice(0),e.sort(S),E){while(t=e[o++])t===e[o]&&(i=r.push(o));while(i--)e.splice(r[i],1)}return e},o=ot.getText=function(e){var t,n="",r=0,i=e.nodeType;if(i){if(1===i||9===i||11===i){if("string"==typeof e.textContent)return e.textContent;for(e=e.firstChild;e;e=e.nextSibling)n+=o(e)}else if(3===i||4===i)return e.nodeValue}else for(;t=e[r];r++)n+=o(t);return n},i=ot.selectors={cacheLength:50,createPseudo:at,match:J,attrHandle:{},find:{},relative:{">":{dir:"parentNode",first:!0}," ":{dir:"parentNode"},"+":{dir:"previousSibling",first:!0},"~":{dir:"previousSibling"}},preFilter:{ATTR:function(e){return e[1]=e[1].replace(nt,rt),e[3]=(e[4]||e[5]||"").replace(nt,rt),"~="===e[2]&&(e[3]=" "+e[3]+" "),e.slice(0,4)},CHILD:function(e){return e[1]=e[1].toLowerCase(),"nth"===e[1].slice(0,3)?(e[3]||ot.error(e[0]),e[4]=+(e[4]?e[5]+(e[6]||1):2*("even"===e[3]||"odd"===e[3])),e[5]=+(e[7]+e[8]||"odd"===e[3])):e[3]&&ot.error(e[0]),e},PSEUDO:function(e){var t,n=!e[5]&&e[2];return J.CHILD.test(e[0])?null:(e[3]&&e[4]!==undefined?e[2]=e[4]:n&&V.test(n)&&(t=gt(n,!0))&&(t=n.indexOf(")",n.length-t)-n.length)&&(e[0]=e[0].slice(0,t),e[2]=n.slice(0,t)),e.slice(0,3))}},filter:{TAG:function(e){var t=e.replace(nt,rt).toLowerCase();return"*"===e?function(){return!0}:function(e){return e.nodeName&&e.nodeName.toLowerCase()===t}},CLASS:function(e){var t=C[e+" "];return t||(t=RegExp("(^|"+M+")"+e+"("+M+"|$)"))&&C(e,function(e){return t.test("string"==typeof e.className&&e.className||typeof e.getAttribute!==j&&e.getAttribute("class")||"")})},ATTR:function(e,t,n){return function(r){var i=ot.attr(r,e);return null==i?"!="===t:t?(i+="","="===t?i===n:"!="===t?i!==n:"^="===t?n&&0===i.indexOf(n):"*="===t?n&&i.indexOf(n)>-1:"$="===t?n&&i.slice(-n.length)===n:"~="===t?(" "+i+" ").indexOf(n)>-1:"|="===t?i===n||i.slice(0,n.length+1)===n+"-":!1):!0}},CHILD:function(e,t,n,r,i){var o="nth"!==e.slice(0,3),s="last"!==e.slice(-4),a="of-type"===t;return 1===r&&0===i?function(e){return!!e.parentNode}:function(t,n,u){var l,c,p,f,h,d,g=o!==s?"nextSibling":"previousSibling",m=t.parentNode,y=a&&t.nodeName.toLowerCase(),x=!u&&!a;if(m){if(o){while(g){p=t;while(p=p[g])if(a?p.nodeName.toLowerCase()===y:1===p.nodeType)return!1;d=g="only"===e&&!d&&"nextSibling"}return!0}if(d=[s?m.firstChild:m.lastChild],s&&x){c=m[v]||(m[v]={}),l=c[e]||[],h=l[0]===w&&l[1],f=l[0]===w&&l[2],p=h&&m.childNodes[h];while(p=++h&&p&&p[g]||(f=h=0)||d.pop())if(1===p.nodeType&&++f&&p===t){c[e]=[w,h,f];break}}else if(x&&(l=(t[v]||(t[v]={}))[e])&&l[0]===w)f=l[1];else while(p=++h&&p&&p[g]||(f=h=0)||d.pop())if((a?p.nodeName.toLowerCase()===y:1===p.nodeType)&&++f&&(x&&((p[v]||(p[v]={}))[e]=[w,f]),p===t))break;return f-=i,f===r||0===f%r&&f/r>=0}}},PSEUDO:function(e,t){var 
n,r=i.pseudos[e]||i.setFilters[e.toLowerCase()]||ot.error("unsupported pseudo: "+e);return r[v]?r(t):r.length>1?(n=[e,e,"",t],i.setFilters.hasOwnProperty(e.toLowerCase())?at(function(e,n){var i,o=r(e,t),s=o.length;while(s--)i=P.call(e,o[s]),e[i]=!(n[i]=o[s])}):function(e){return r(e,0,n)}):r}},pseudos:{not:at(function(e){var t=[],n=[],r=a(e.replace(z,"$1"));return r[v]?at(function(e,t,n,i){var o,s=r(e,null,i,[]),a=e.length;while(a--)(o=s[a])&&(e[a]=!(t[a]=o))}):function(e,i,o){return t[0]=e,r(t,null,o,n),!n.pop()}}),has:at(function(e){return function(t){return ot(e,t).length>0}}),contains:at(function(e){return function(t){return(t.textContent||t.innerText||o(t)).indexOf(e)>-1}}),lang:at(function(e){return G.test(e||"")||ot.error("unsupported lang: "+e),e=e.replace(nt,rt).toLowerCase(),function(t){var n;do if(n=h?t.lang:t.getAttribute("xml:lang")||t.getAttribute("lang"))return n=n.toLowerCase(),n===e||0===n.indexOf(e+"-");while((t=t.parentNode)&&1===t.nodeType);return!1}}),target:function(t){var n=e.location&&e.location.hash;return n&&n.slice(1)===t.id},root:function(e){return e===f},focus:function(e){return e===p.activeElement&&(!p.hasFocus||p.hasFocus())&&!!(e.type||e.href||~e.tabIndex)},enabled:function(e){return e.disabled===!1},disabled:function(e){return e.disabled===!0},checked:function(e){var t=e.nodeName.toLowerCase();return"input"===t&&!!e.checked||"option"===t&&!!e.selected},selected:function(e){return e.parentNode&&e.parentNode.selectedIndex,e.selected===!0},empty:function(e){for(e=e.firstChild;e;e=e.nextSibling)if(e.nodeName>"@"||3===e.nodeType||4===e.nodeType)return!1;return!0},parent:function(e){return!i.pseudos.empty(e)},header:function(e){return et.test(e.nodeName)},input:function(e){return Z.test(e.nodeName)},button:function(e){var t=e.nodeName.toLowerCase();return"input"===t&&"button"===e.type||"button"===t},text:function(e){var t;return"input"===e.nodeName.toLowerCase()&&"text"===e.type&&(null==(t=e.getAttribute("type"))||t.toLowerCase()===e.type)},first:ht(function(){return[0]}),last:ht(function(e,t){return[t-1]}),eq:ht(function(e,t,n){return[0>n?n+t:n]}),even:ht(function(e,t){var n=0;for(;t>n;n+=2)e.push(n);return e}),odd:ht(function(e,t){var n=1;for(;t>n;n+=2)e.push(n);return e}),lt:ht(function(e,t,n){var r=0>n?n+t:n;for(;--r>=0;)e.push(r);return e}),gt:ht(function(e,t,n){var r=0>n?n+t:n;for(;t>++r;)e.push(r);return e})}},i.pseudos.nth=i.pseudos.eq;for(t in{radio:!0,checkbox:!0,file:!0,password:!0,image:!0})i.pseudos[t]=pt(t);for(t in{submit:!0,reset:!0})i.pseudos[t]=ft(t);function dt(){}dt.prototype=i.filters=i.pseudos,i.setFilters=new dt;function gt(e,t){var n,r,o,s,a,u,l,c=k[e+" "];if(c)return t?0:c.slice(0);a=e,u=[],l=i.preFilter;while(a){(!n||(r=_.exec(a)))&&(r&&(a=a.slice(r[0].length)||a),u.push(o=[])),n=!1,(r=X.exec(a))&&(n=r.shift(),o.push({value:n,type:r[0].replace(z," ")}),a=a.slice(n.length));for(s in i.filter)!(r=J[s].exec(a))||l[s]&&!(r=l[s](r))||(n=r.shift(),o.push({value:n,type:s,matches:r}),a=a.slice(n.length));if(!n)break}return t?a.length:a?ot.error(e):k(e,u).slice(0)}function mt(e){var t=0,n=e.length,r="";for(;n>t;t++)r+=e[t].value;return r}function yt(e,t,n){var i=t.dir,o=n&&"parentNode"===i,s=T++;return t.first?function(t,n,r){while(t=t[i])if(1===t.nodeType||o)return e(t,n,r)}:function(t,n,a){var u,l,c,p=w+" "+s;if(a){while(t=t[i])if((1===t.nodeType||o)&&e(t,n,a))return!0}else while(t=t[i])if(1===t.nodeType||o)if(c=t[v]||(t[v]={}),(l=c[i])&&l[0]===p){if((u=l[1])===!0||u===r)return u===!0}else 
if(l=c[i]=[p],l[1]=e(t,n,a)||r,l[1]===!0)return!0}}function vt(e){return e.length>1?function(t,n,r){var i=e.length;while(i--)if(!e[i](t,n,r))return!1;return!0}:e[0]}function xt(e,t,n,r,i){var o,s=[],a=0,u=e.length,l=null!=t;for(;u>a;a++)(o=e[a])&&(!n||n(o,r,i))&&(s.push(o),l&&t.push(a));return s}function bt(e,t,n,r,i,o){return r&&!r[v]&&(r=bt(r)),i&&!i[v]&&(i=bt(i,o)),at(function(o,s,a,u){var l,c,p,f=[],h=[],d=s.length,g=o||Ct(t||"*",a.nodeType?[a]:a,[]),m=!e||!o&&t?g:xt(g,f,e,a,u),y=n?i||(o?e:d||r)?[]:s:m;if(n&&n(m,y,a,u),r){l=xt(y,h),r(l,[],a,u),c=l.length;while(c--)(p=l[c])&&(y[h[c]]=!(m[h[c]]=p))}if(o){if(i||e){if(i){l=[],c=y.length;while(c--)(p=y[c])&&l.push(m[c]=p);i(null,y=[],l,u)}c=y.length;while(c--)(p=y[c])&&(l=i?P.call(o,p):f[c])>-1&&(o[l]=!(s[l]=p))}}else y=xt(y===s?y.splice(d,y.length):y),i?i(null,s,y,u):O.apply(s,y)})}function wt(e){var t,n,r,o=e.length,s=i.relative[e[0].type],a=s||i.relative[" "],l=s?1:0,c=yt(function(e){return e===t},a,!0),p=yt(function(e){return P.call(t,e)>-1},a,!0),f=[function(e,n,r){return!s&&(r||n!==u)||((t=n).nodeType?c(e,n,r):p(e,n,r))}];for(;o>l;l++)if(n=i.relative[e[l].type])f=[yt(vt(f),n)];else{if(n=i.filter[e[l].type].apply(null,e[l].matches),n[v]){for(r=++l;o>r;r++)if(i.relative[e[r].type])break;return bt(l>1&&vt(f),l>1&&mt(e.slice(0,l-1).concat({value:" "===e[l-2].type?"*":""})).replace(z,"$1"),n,r>l&&wt(e.slice(l,r)),o>r&&wt(e=e.slice(r)),o>r&&mt(e))}f.push(n)}return vt(f)}function Tt(e,t){var n=0,o=t.length>0,s=e.length>0,a=function(a,l,c,f,h){var d,g,m,y=[],v=0,x="0",b=a&&[],T=null!=h,C=u,k=a||s&&i.find.TAG("*",h&&l.parentNode||l),N=w+=null==C?1:Math.random()||.1;for(T&&(u=l!==p&&l,r=n);null!=(d=k[x]);x++){if(s&&d){g=0;while(m=e[g++])if(m(d,l,c)){f.push(d);break}T&&(w=N,r=++n)}o&&((d=!m&&d)&&v--,a&&b.push(d))}if(v+=x,o&&x!==v){g=0;while(m=t[g++])m(b,y,l,c);if(a){if(v>0)while(x--)b[x]||y[x]||(y[x]=q.call(f));y=xt(y)}O.apply(f,y),T&&!a&&y.length>0&&v+t.length>1&&ot.uniqueSort(f)}return T&&(w=N,u=C),b};return o?at(a):a}a=ot.compile=function(e,t){var n,r=[],i=[],o=N[e+" "];if(!o){t||(t=gt(e)),n=t.length;while(n--)o=wt(t[n]),o[v]?r.push(o):i.push(o);o=N(e,Tt(i,r))}return o};function Ct(e,t,n){var r=0,i=t.length;for(;i>r;r++)ot(e,t[r],n);return n}function kt(e,t,r,o){var s,u,l,c,p,f=gt(e);if(!o&&1===f.length){if(u=f[0]=f[0].slice(0),u.length>2&&"ID"===(l=u[0]).type&&n.getById&&9===t.nodeType&&h&&i.relative[u[1].type]){if(t=(i.find.ID(l.matches[0].replace(nt,rt),t)||[])[0],!t)return r;e=e.slice(u.shift().value.length)}s=J.needsContext.test(e)?0:u.length;while(s--){if(l=u[s],i.relative[c=l.type])break;if((p=i.find[c])&&(o=p(l.matches[0].replace(nt,rt),U.test(u[0].type)&&t.parentNode||t))){if(u.splice(s,1),e=o.length&&mt(u),!e)return O.apply(r,o),r;break}}}return a(e,f)(o,t,!h,r,U.test(e)),r}n.sortStable=v.split("").sort(S).join("")===v,n.detectDuplicates=E,c(),n.sortDetached=ut(function(e){return 1&e.compareDocumentPosition(p.createElement("div"))}),ut(function(e){return e.innerHTML="<a href='#'></a>","#"===e.firstChild.getAttribute("href")})||lt("type|href|height|width",function(e,t,n){return n?undefined:e.getAttribute(t,"type"===t.toLowerCase()?1:2)}),n.attributes&&ut(function(e){return e.innerHTML="<input/>",e.firstChild.setAttribute("value",""),""===e.firstChild.getAttribute("value")})||lt("value",function(e,t,n){return n||"input"!==e.nodeName.toLowerCase()?undefined:e.defaultValue}),ut(function(e){return null==e.getAttribute("disabled")})||lt(R,function(e,t,n){var r;return 
n?undefined:(r=e.getAttributeNode(t))&&r.specified?r.value:e[t]===!0?t.toLowerCase():null}),x.find=ot,x.expr=ot.selectors,x.expr[":"]=x.expr.pseudos,x.unique=ot.uniqueSort,x.text=ot.getText,x.isXMLDoc=ot.isXML,x.contains=ot.contains}(e);var D={};function A(e){var t=D[e]={};return x.each(e.match(w)||[],function(e,n){t[n]=!0}),t}x.Callbacks=function(e){e="string"==typeof e?D[e]||A(e):x.extend({},e);var t,n,r,i,o,s,a=[],u=!e.once&&[],l=function(p){for(t=e.memory&&p,n=!0,s=i||0,i=0,o=a.length,r=!0;a&&o>s;s++)if(a[s].apply(p[0],p[1])===!1&&e.stopOnFalse){t=!1;break}r=!1,a&&(u?u.length&&l(u.shift()):t?a=[]:c.disable())},c={add:function(){if(a){var n=a.length;(function s(t){x.each(t,function(t,n){var r=x.type(n);"function"===r?e.unique&&c.has(n)||a.push(n):n&&n.length&&"string"!==r&&s(n)})})(arguments),r?o=a.length:t&&(i=n,l(t))}return this},remove:function(){return a&&x.each(arguments,function(e,t){var n;while((n=x.inArray(t,a,n))>-1)a.splice(n,1),r&&(o>=n&&o--,s>=n&&s--)}),this},has:function(e){return e?x.inArray(e,a)>-1:!(!a||!a.length)},empty:function(){return a=[],o=0,this},disable:function(){return a=u=t=undefined,this},disabled:function(){return!a},lock:function(){return u=undefined,t||c.disable(),this},locked:function(){return!u},fireWith:function(e,t){return!a||n&&!u||(t=t||[],t=[e,t.slice?t.slice():t],r?u.push(t):l(t)),this},fire:function(){return c.fireWith(this,arguments),this},fired:function(){return!!n}};return c},x.extend({Deferred:function(e){var t=[["resolve","done",x.Callbacks("once memory"),"resolved"],["reject","fail",x.Callbacks("once memory"),"rejected"],["notify","progress",x.Callbacks("memory")]],n="pending",r={state:function(){return n},always:function(){return i.done(arguments).fail(arguments),this},then:function(){var e=arguments;return x.Deferred(function(n){x.each(t,function(t,o){var s=o[0],a=x.isFunction(e[t])&&e[t];i[o[1]](function(){var e=a&&a.apply(this,arguments);e&&x.isFunction(e.promise)?e.promise().done(n.resolve).fail(n.reject).progress(n.notify):n[s+"With"](this===r?n.promise():this,a?[e]:arguments)})}),e=null}).promise()},promise:function(e){return null!=e?x.extend(e,r):r}},i={};return r.pipe=r.then,x.each(t,function(e,o){var s=o[2],a=o[3];r[o[1]]=s.add,a&&s.add(function(){n=a},t[1^e][2].disable,t[2][2].lock),i[o[0]]=function(){return i[o[0]+"With"](this===i?r:this,arguments),this},i[o[0]+"With"]=s.fireWith}),r.promise(i),e&&e.call(i,i),i},when:function(e){var t=0,n=d.call(arguments),r=n.length,i=1!==r||e&&x.isFunction(e.promise)?r:0,o=1===i?e:x.Deferred(),s=function(e,t,n){return function(r){t[e]=this,n[e]=arguments.length>1?d.call(arguments):r,n===a?o.notifyWith(t,n):--i||o.resolveWith(t,n)}},a,u,l;if(r>1)for(a=Array(r),u=Array(r),l=Array(r);r>t;t++)n[t]&&x.isFunction(n[t].promise)?n[t].promise().done(s(t,l,n)).fail(o.reject).progress(s(t,u,a)):--i;return i||o.resolveWith(l,n),o.promise()}}),x.support=function(t){var n=o.createElement("input"),r=o.createDocumentFragment(),i=o.createElement("div"),s=o.createElement("select"),a=s.appendChild(o.createElement("option"));return n.type?(n.type="checkbox",t.checkOn=""!==n.value,t.optSelected=a.selected,t.reliableMarginRight=!0,t.boxSizingReliable=!0,t.pixelPosition=!1,n.checked=!0,t.noCloneChecked=n.cloneNode(!0).checked,s.disabled=!0,t.optDisabled=!a.disabled,n=o.createElement("input"),n.value="t",n.type="radio",t.radioValue="t"===n.value,n.setAttribute("checked","t"),n.setAttribute("name","t"),r.appendChild(n),t.checkClone=r.cloneNode(!0).cloneNode(!0).lastChild.checked,t.focusinBubbles="onfocusin"in 
e,i.style.backgroundClip="content-box",i.cloneNode(!0).style.backgroundClip="",t.clearCloneStyle="content-box"===i.style.backgroundClip,x(function(){var n,r,s="padding:0;margin:0;border:0;display:block;-webkit-box-sizing:content-box;-moz-box-sizing:content-box;box-sizing:content-box",a=o.getElementsByTagName("body")[0];a&&(n=o.createElement("div"),n.style.cssText="border:0;width:0;height:0;position:absolute;top:0;left:-9999px;margin-top:1px",a.appendChild(n).appendChild(i),i.innerHTML="",i.style.cssText="-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box;padding:1px;border:1px;display:block;width:4px;margin-top:1%;position:absolute;top:1%",x.swap(a,null!=a.style.zoom?{zoom:1}:{},function(){t.boxSizing=4===i.offsetWidth}),e.getComputedStyle&&(t.pixelPosition="1%"!==(e.getComputedStyle(i,null)||{}).top,t.boxSizingReliable="4px"===(e.getComputedStyle(i,null)||{width:"4px"}).width,r=i.appendChild(o.createElement("div")),r.style.cssText=i.style.cssText=s,r.style.marginRight=r.style.width="0",i.style.width="1px",t.reliableMarginRight=!parseFloat((e.getComputedStyle(r,null)||{}).marginRight)),a.removeChild(n))}),t):t}({});var L,q,H=/(?:\{[\s\S]*\}|\[[\s\S]*\])$/,O=/([A-Z])/g;function F(){Object.defineProperty(this.cache={},0,{get:function(){return{}}}),this.expando=x.expando+Math.random()}F.uid=1,F.accepts=function(e){return e.nodeType?1===e.nodeType||9===e.nodeType:!0},F.prototype={key:function(e){if(!F.accepts(e))return 0;var t={},n=e[this.expando];if(!n){n=F.uid++;try{t[this.expando]={value:n},Object.defineProperties(e,t)}catch(r){t[this.expando]=n,x.extend(e,t)}}return this.cache[n]||(this.cache[n]={}),n},set:function(e,t,n){var r,i=this.key(e),o=this.cache[i];if("string"==typeof t)o[t]=n;else if(x.isEmptyObject(o))x.extend(this.cache[i],t);else for(r in t)o[r]=t[r];return o},get:function(e,t){var n=this.cache[this.key(e)];return t===undefined?n:n[t]},access:function(e,t,n){var r;return t===undefined||t&&"string"==typeof t&&n===undefined?(r=this.get(e,t),r!==undefined?r:this.get(e,x.camelCase(t))):(this.set(e,t,n),n!==undefined?n:t)},remove:function(e,t){var n,r,i,o=this.key(e),s=this.cache[o];if(t===undefined)this.cache[o]={};else{x.isArray(t)?r=t.concat(t.map(x.camelCase)):(i=x.camelCase(t),t in s?r=[t,i]:(r=i,r=r in s?[r]:r.match(w)||[])),n=r.length;while(n--)delete s[r[n]]}},hasData:function(e){return!x.isEmptyObject(this.cache[e[this.expando]]||{})},discard:function(e){e[this.expando]&&delete this.cache[e[this.expando]]}},L=new F,q=new F,x.extend({acceptData:F.accepts,hasData:function(e){return L.hasData(e)||q.hasData(e)},data:function(e,t,n){return L.access(e,t,n)},removeData:function(e,t){L.remove(e,t)},_data:function(e,t,n){return q.access(e,t,n)},_removeData:function(e,t){q.remove(e,t)}}),x.fn.extend({data:function(e,t){var n,r,i=this[0],o=0,s=null;if(e===undefined){if(this.length&&(s=L.get(i),1===i.nodeType&&!q.get(i,"hasDataAttrs"))){for(n=i.attributes;n.length>o;o++)r=n[o].name,0===r.indexOf("data-")&&(r=x.camelCase(r.slice(5)),P(i,r,s[r]));q.set(i,"hasDataAttrs",!0)}return s}return"object"==typeof e?this.each(function(){L.set(this,e)}):x.access(this,function(t){var n,r=x.camelCase(e);if(i&&t===undefined){if(n=L.get(i,e),n!==undefined)return n;if(n=L.get(i,r),n!==undefined)return n;if(n=P(i,r,undefined),n!==undefined)return n}else this.each(function(){var n=L.get(this,r);L.set(this,r,t),-1!==e.indexOf("-")&&n!==undefined&&L.set(this,e,t)})},null,t,arguments.length>1,null,!0)},removeData:function(e){return 
this.each(function(){L.remove(this,e)})}});function P(e,t,n){var r;if(n===undefined&&1===e.nodeType)if(r="data-"+t.replace(O,"-$1").toLowerCase(),n=e.getAttribute(r),"string"==typeof n){try{n="true"===n?!0:"false"===n?!1:"null"===n?null:+n+""===n?+n:H.test(n)?JSON.parse(n):n}catch(i){}L.set(e,t,n)}else n=undefined;return n}x.extend({queue:function(e,t,n){var r;return e?(t=(t||"fx")+"queue",r=q.get(e,t),n&&(!r||x.isArray(n)?r=q.access(e,t,x.makeArray(n)):r.push(n)),r||[]):undefined},dequeue:function(e,t){t=t||"fx";var n=x.queue(e,t),r=n.length,i=n.shift(),o=x._queueHooks(e,t),s=function(){x.dequeue(e,t)
+};"inprogress"===i&&(i=n.shift(),r--),i&&("fx"===t&&n.unshift("inprogress"),delete o.stop,i.call(e,s,o)),!r&&o&&o.empty.fire()},_queueHooks:function(e,t){var n=t+"queueHooks";return q.get(e,n)||q.access(e,n,{empty:x.Callbacks("once memory").add(function(){q.remove(e,[t+"queue",n])})})}}),x.fn.extend({queue:function(e,t){var n=2;return"string"!=typeof e&&(t=e,e="fx",n--),n>arguments.length?x.queue(this[0],e):t===undefined?this:this.each(function(){var n=x.queue(this,e,t);x._queueHooks(this,e),"fx"===e&&"inprogress"!==n[0]&&x.dequeue(this,e)})},dequeue:function(e){return this.each(function(){x.dequeue(this,e)})},delay:function(e,t){return e=x.fx?x.fx.speeds[e]||e:e,t=t||"fx",this.queue(t,function(t,n){var r=setTimeout(t,e);n.stop=function(){clearTimeout(r)}})},clearQueue:function(e){return this.queue(e||"fx",[])},promise:function(e,t){var n,r=1,i=x.Deferred(),o=this,s=this.length,a=function(){--r||i.resolveWith(o,[o])};"string"!=typeof e&&(t=e,e=undefined),e=e||"fx";while(s--)n=q.get(o[s],e+"queueHooks"),n&&n.empty&&(r++,n.empty.add(a));return a(),i.promise(t)}});var R,M,W=/[\t\r\n\f]/g,$=/\r/g,B=/^(?:input|select|textarea|button)$/i;x.fn.extend({attr:function(e,t){return x.access(this,x.attr,e,t,arguments.length>1)},removeAttr:function(e){return this.each(function(){x.removeAttr(this,e)})},prop:function(e,t){return x.access(this,x.prop,e,t,arguments.length>1)},removeProp:function(e){return this.each(function(){delete this[x.propFix[e]||e]})},addClass:function(e){var t,n,r,i,o,s=0,a=this.length,u="string"==typeof e&&e;if(x.isFunction(e))return this.each(function(t){x(this).addClass(e.call(this,t,this.className))});if(u)for(t=(e||"").match(w)||[];a>s;s++)if(n=this[s],r=1===n.nodeType&&(n.className?(" "+n.className+" ").replace(W," "):" ")){o=0;while(i=t[o++])0>r.indexOf(" "+i+" ")&&(r+=i+" ");n.className=x.trim(r)}return this},removeClass:function(e){var t,n,r,i,o,s=0,a=this.length,u=0===arguments.length||"string"==typeof e&&e;if(x.isFunction(e))return this.each(function(t){x(this).removeClass(e.call(this,t,this.className))});if(u)for(t=(e||"").match(w)||[];a>s;s++)if(n=this[s],r=1===n.nodeType&&(n.className?(" "+n.className+" ").replace(W," "):"")){o=0;while(i=t[o++])while(r.indexOf(" "+i+" ")>=0)r=r.replace(" "+i+" "," ");n.className=e?x.trim(r):""}return this},toggleClass:function(e,t){var n=typeof e;return"boolean"==typeof t&&"string"===n?t?this.addClass(e):this.removeClass(e):x.isFunction(e)?this.each(function(n){x(this).toggleClass(e.call(this,n,this.className,t),t)}):this.each(function(){if("string"===n){var t,i=0,o=x(this),s=e.match(w)||[];while(t=s[i++])o.hasClass(t)?o.removeClass(t):o.addClass(t)}else(n===r||"boolean"===n)&&(this.className&&q.set(this,"__className__",this.className),this.className=this.className||e===!1?"":q.get(this,"__className__")||"")})},hasClass:function(e){var t=" "+e+" ",n=0,r=this.length;for(;r>n;n++)if(1===this[n].nodeType&&(" "+this[n].className+" ").replace(W," ").indexOf(t)>=0)return!0;return!1},val:function(e){var t,n,r,i=this[0];{if(arguments.length)return r=x.isFunction(e),this.each(function(n){var i;1===this.nodeType&&(i=r?e.call(this,n,x(this).val()):e,null==i?i="":"number"==typeof i?i+="":x.isArray(i)&&(i=x.map(i,function(e){return null==e?"":e+""})),t=x.valHooks[this.type]||x.valHooks[this.nodeName.toLowerCase()],t&&"set"in t&&t.set(this,i,"value")!==undefined||(this.value=i))});if(i)return t=x.valHooks[i.type]||x.valHooks[i.nodeName.toLowerCase()],t&&"get"in t&&(n=t.get(i,"value"))!==undefined?n:(n=i.value,"string"==typeof 
n?n.replace($,""):null==n?"":n)}}}),x.extend({valHooks:{option:{get:function(e){var t=e.attributes.value;return!t||t.specified?e.value:e.text}},select:{get:function(e){var t,n,r=e.options,i=e.selectedIndex,o="select-one"===e.type||0>i,s=o?null:[],a=o?i+1:r.length,u=0>i?a:o?i:0;for(;a>u;u++)if(n=r[u],!(!n.selected&&u!==i||(x.support.optDisabled?n.disabled:null!==n.getAttribute("disabled"))||n.parentNode.disabled&&x.nodeName(n.parentNode,"optgroup"))){if(t=x(n).val(),o)return t;s.push(t)}return s},set:function(e,t){var n,r,i=e.options,o=x.makeArray(t),s=i.length;while(s--)r=i[s],(r.selected=x.inArray(x(r).val(),o)>=0)&&(n=!0);return n||(e.selectedIndex=-1),o}}},attr:function(e,t,n){var i,o,s=e.nodeType;if(e&&3!==s&&8!==s&&2!==s)return typeof e.getAttribute===r?x.prop(e,t,n):(1===s&&x.isXMLDoc(e)||(t=t.toLowerCase(),i=x.attrHooks[t]||(x.expr.match.bool.test(t)?M:R)),n===undefined?i&&"get"in i&&null!==(o=i.get(e,t))?o:(o=x.find.attr(e,t),null==o?undefined:o):null!==n?i&&"set"in i&&(o=i.set(e,n,t))!==undefined?o:(e.setAttribute(t,n+""),n):(x.removeAttr(e,t),undefined))},removeAttr:function(e,t){var n,r,i=0,o=t&&t.match(w);if(o&&1===e.nodeType)while(n=o[i++])r=x.propFix[n]||n,x.expr.match.bool.test(n)&&(e[r]=!1),e.removeAttribute(n)},attrHooks:{type:{set:function(e,t){if(!x.support.radioValue&&"radio"===t&&x.nodeName(e,"input")){var n=e.value;return e.setAttribute("type",t),n&&(e.value=n),t}}}},propFix:{"for":"htmlFor","class":"className"},prop:function(e,t,n){var r,i,o,s=e.nodeType;if(e&&3!==s&&8!==s&&2!==s)return o=1!==s||!x.isXMLDoc(e),o&&(t=x.propFix[t]||t,i=x.propHooks[t]),n!==undefined?i&&"set"in i&&(r=i.set(e,n,t))!==undefined?r:e[t]=n:i&&"get"in i&&null!==(r=i.get(e,t))?r:e[t]},propHooks:{tabIndex:{get:function(e){return e.hasAttribute("tabindex")||B.test(e.nodeName)||e.href?e.tabIndex:-1}}}}),M={set:function(e,t,n){return t===!1?x.removeAttr(e,n):e.setAttribute(n,n),n}},x.each(x.expr.match.bool.source.match(/\w+/g),function(e,t){var n=x.expr.attrHandle[t]||x.find.attr;x.expr.attrHandle[t]=function(e,t,r){var i=x.expr.attrHandle[t],o=r?undefined:(x.expr.attrHandle[t]=undefined)!=n(e,t,r)?t.toLowerCase():null;return x.expr.attrHandle[t]=i,o}}),x.support.optSelected||(x.propHooks.selected={get:function(e){var t=e.parentNode;return t&&t.parentNode&&t.parentNode.selectedIndex,null}}),x.each(["tabIndex","readOnly","maxLength","cellSpacing","cellPadding","rowSpan","colSpan","useMap","frameBorder","contentEditable"],function(){x.propFix[this.toLowerCase()]=this}),x.each(["radio","checkbox"],function(){x.valHooks[this]={set:function(e,t){return x.isArray(t)?e.checked=x.inArray(x(e).val(),t)>=0:undefined}},x.support.checkOn||(x.valHooks[this].get=function(e){return null===e.getAttribute("value")?"on":e.value})});var I=/^key/,z=/^(?:mouse|contextmenu)|click/,_=/^(?:focusinfocus|focusoutblur)$/,X=/^([^.]*)(?:\.(.+)|)$/;function U(){return!0}function Y(){return!1}function V(){try{return o.activeElement}catch(e){}}x.event={global:{},add:function(e,t,n,i,o){var s,a,u,l,c,p,f,h,d,g,m,y=q.get(e);if(y){n.handler&&(s=n,n=s.handler,o=s.selector),n.guid||(n.guid=x.guid++),(l=y.events)||(l=y.events={}),(a=y.handle)||(a=y.handle=function(e){return typeof 
x===r||e&&x.event.triggered===e.type?undefined:x.event.dispatch.apply(a.elem,arguments)},a.elem=e),t=(t||"").match(w)||[""],c=t.length;while(c--)u=X.exec(t[c])||[],d=m=u[1],g=(u[2]||"").split(".").sort(),d&&(f=x.event.special[d]||{},d=(o?f.delegateType:f.bindType)||d,f=x.event.special[d]||{},p=x.extend({type:d,origType:m,data:i,handler:n,guid:n.guid,selector:o,needsContext:o&&x.expr.match.needsContext.test(o),namespace:g.join(".")},s),(h=l[d])||(h=l[d]=[],h.delegateCount=0,f.setup&&f.setup.call(e,i,g,a)!==!1||e.addEventListener&&e.addEventListener(d,a,!1)),f.add&&(f.add.call(e,p),p.handler.guid||(p.handler.guid=n.guid)),o?h.splice(h.delegateCount++,0,p):h.push(p),x.event.global[d]=!0);e=null}},remove:function(e,t,n,r,i){var o,s,a,u,l,c,p,f,h,d,g,m=q.hasData(e)&&q.get(e);if(m&&(u=m.events)){t=(t||"").match(w)||[""],l=t.length;while(l--)if(a=X.exec(t[l])||[],h=g=a[1],d=(a[2]||"").split(".").sort(),h){p=x.event.special[h]||{},h=(r?p.delegateType:p.bindType)||h,f=u[h]||[],a=a[2]&&RegExp("(^|\\.)"+d.join("\\.(?:.*\\.|)")+"(\\.|$)"),s=o=f.length;while(o--)c=f[o],!i&&g!==c.origType||n&&n.guid!==c.guid||a&&!a.test(c.namespace)||r&&r!==c.selector&&("**"!==r||!c.selector)||(f.splice(o,1),c.selector&&f.delegateCount--,p.remove&&p.remove.call(e,c));s&&!f.length&&(p.teardown&&p.teardown.call(e,d,m.handle)!==!1||x.removeEvent(e,h,m.handle),delete u[h])}else for(h in u)x.event.remove(e,h+t[l],n,r,!0);x.isEmptyObject(u)&&(delete m.handle,q.remove(e,"events"))}},trigger:function(t,n,r,i){var s,a,u,l,c,p,f,h=[r||o],d=y.call(t,"type")?t.type:t,g=y.call(t,"namespace")?t.namespace.split("."):[];if(a=u=r=r||o,3!==r.nodeType&&8!==r.nodeType&&!_.test(d+x.event.triggered)&&(d.indexOf(".")>=0&&(g=d.split("."),d=g.shift(),g.sort()),c=0>d.indexOf(":")&&"on"+d,t=t[x.expando]?t:new x.Event(d,"object"==typeof t&&t),t.isTrigger=i?2:3,t.namespace=g.join("."),t.namespace_re=t.namespace?RegExp("(^|\\.)"+g.join("\\.(?:.*\\.|)")+"(\\.|$)"):null,t.result=undefined,t.target||(t.target=r),n=null==n?[t]:x.makeArray(n,[t]),f=x.event.special[d]||{},i||!f.trigger||f.trigger.apply(r,n)!==!1)){if(!i&&!f.noBubble&&!x.isWindow(r)){for(l=f.delegateType||d,_.test(l+d)||(a=a.parentNode);a;a=a.parentNode)h.push(a),u=a;u===(r.ownerDocument||o)&&h.push(u.defaultView||u.parentWindow||e)}s=0;while((a=h[s++])&&!t.isPropagationStopped())t.type=s>1?l:f.bindType||d,p=(q.get(a,"events")||{})[t.type]&&q.get(a,"handle"),p&&p.apply(a,n),p=c&&a[c],p&&x.acceptData(a)&&p.apply&&p.apply(a,n)===!1&&t.preventDefault();return t.type=d,i||t.isDefaultPrevented()||f._default&&f._default.apply(h.pop(),n)!==!1||!x.acceptData(r)||c&&x.isFunction(r[d])&&!x.isWindow(r)&&(u=r[c],u&&(r[c]=null),x.event.triggered=d,r[d](),x.event.triggered=undefined,u&&(r[c]=u)),t.result}},dispatch:function(e){e=x.event.fix(e);var t,n,r,i,o,s=[],a=d.call(arguments),u=(q.get(this,"events")||{})[e.type]||[],l=x.event.special[e.type]||{};if(a[0]=e,e.delegateTarget=this,!l.preDispatch||l.preDispatch.call(this,e)!==!1){s=x.event.handlers.call(this,e,u),t=0;while((i=s[t++])&&!e.isPropagationStopped()){e.currentTarget=i.elem,n=0;while((o=i.handlers[n++])&&!e.isImmediatePropagationStopped())(!e.namespace_re||e.namespace_re.test(o.namespace))&&(e.handleObj=o,e.data=o.data,r=((x.event.special[o.origType]||{}).handle||o.handler).apply(i.elem,a),r!==undefined&&(e.result=r)===!1&&(e.preventDefault(),e.stopPropagation()))}return l.postDispatch&&l.postDispatch.call(this,e),e.result}},handlers:function(e,t){var 
n,r,i,o,s=[],a=t.delegateCount,u=e.target;if(a&&u.nodeType&&(!e.button||"click"!==e.type))for(;u!==this;u=u.parentNode||this)if(u.disabled!==!0||"click"!==e.type){for(r=[],n=0;a>n;n++)o=t[n],i=o.selector+" ",r[i]===undefined&&(r[i]=o.needsContext?x(i,this).index(u)>=0:x.find(i,this,null,[u]).length),r[i]&&r.push(o);r.length&&s.push({elem:u,handlers:r})}return t.length>a&&s.push({elem:this,handlers:t.slice(a)}),s},props:"altKey bubbles cancelable ctrlKey currentTarget eventPhase metaKey relatedTarget shiftKey target timeStamp view which".split(" "),fixHooks:{},keyHooks:{props:"char charCode key keyCode".split(" "),filter:function(e,t){return null==e.which&&(e.which=null!=t.charCode?t.charCode:t.keyCode),e}},mouseHooks:{props:"button buttons clientX clientY offsetX offsetY pageX pageY screenX screenY toElement".split(" "),filter:function(e,t){var n,r,i,s=t.button;return null==e.pageX&&null!=t.clientX&&(n=e.target.ownerDocument||o,r=n.documentElement,i=n.body,e.pageX=t.clientX+(r&&r.scrollLeft||i&&i.scrollLeft||0)-(r&&r.clientLeft||i&&i.clientLeft||0),e.pageY=t.clientY+(r&&r.scrollTop||i&&i.scrollTop||0)-(r&&r.clientTop||i&&i.clientTop||0)),e.which||s===undefined||(e.which=1&s?1:2&s?3:4&s?2:0),e}},fix:function(e){if(e[x.expando])return e;var t,n,r,i=e.type,s=e,a=this.fixHooks[i];a||(this.fixHooks[i]=a=z.test(i)?this.mouseHooks:I.test(i)?this.keyHooks:{}),r=a.props?this.props.concat(a.props):this.props,e=new x.Event(s),t=r.length;while(t--)n=r[t],e[n]=s[n];return e.target||(e.target=o),3===e.target.nodeType&&(e.target=e.target.parentNode),a.filter?a.filter(e,s):e},special:{load:{noBubble:!0},focus:{trigger:function(){return this!==V()&&this.focus?(this.focus(),!1):undefined},delegateType:"focusin"},blur:{trigger:function(){return this===V()&&this.blur?(this.blur(),!1):undefined},delegateType:"focusout"},click:{trigger:function(){return"checkbox"===this.type&&this.click&&x.nodeName(this,"input")?(this.click(),!1):undefined},_default:function(e){return x.nodeName(e.target,"a")}},beforeunload:{postDispatch:function(e){e.result!==undefined&&(e.originalEvent.returnValue=e.result)}}},simulate:function(e,t,n,r){var i=x.extend(new x.Event,n,{type:e,isSimulated:!0,originalEvent:{}});r?x.event.trigger(i,null,t):x.event.dispatch.call(t,i),i.isDefaultPrevented()&&n.preventDefault()}},x.removeEvent=function(e,t,n){e.removeEventListener&&e.removeEventListener(t,n,!1)},x.Event=function(e,t){return this instanceof x.Event?(e&&e.type?(this.originalEvent=e,this.type=e.type,this.isDefaultPrevented=e.defaultPrevented||e.getPreventDefault&&e.getPreventDefault()?U:Y):this.type=e,t&&x.extend(this,t),this.timeStamp=e&&e.timeStamp||x.now(),this[x.expando]=!0,undefined):new x.Event(e,t)},x.Event.prototype={isDefaultPrevented:Y,isPropagationStopped:Y,isImmediatePropagationStopped:Y,preventDefault:function(){var e=this.originalEvent;this.isDefaultPrevented=U,e&&e.preventDefault&&e.preventDefault()},stopPropagation:function(){var e=this.originalEvent;this.isPropagationStopped=U,e&&e.stopPropagation&&e.stopPropagation()},stopImmediatePropagation:function(){this.isImmediatePropagationStopped=U,this.stopPropagation()}},x.each({mouseenter:"mouseover",mouseleave:"mouseout"},function(e,t){x.event.special[e]={delegateType:t,bindType:t,handle:function(e){var n,r=this,i=e.relatedTarget,o=e.handleObj;return(!i||i!==r&&!x.contains(r,i))&&(e.type=o.origType,n=o.handler.apply(this,arguments),e.type=t),n}}}),x.support.focusinBubbles||x.each({focus:"focusin",blur:"focusout"},function(e,t){var 
n=0,r=function(e){x.event.simulate(t,e.target,x.event.fix(e),!0)};x.event.special[t]={setup:function(){0===n++&&o.addEventListener(e,r,!0)},teardown:function(){0===--n&&o.removeEventListener(e,r,!0)}}}),x.fn.extend({on:function(e,t,n,r,i){var o,s;if("object"==typeof e){"string"!=typeof t&&(n=n||t,t=undefined);for(s in e)this.on(s,t,n,e[s],i);return this}if(null==n&&null==r?(r=t,n=t=undefined):null==r&&("string"==typeof t?(r=n,n=undefined):(r=n,n=t,t=undefined)),r===!1)r=Y;else if(!r)return this;return 1===i&&(o=r,r=function(e){return x().off(e),o.apply(this,arguments)},r.guid=o.guid||(o.guid=x.guid++)),this.each(function(){x.event.add(this,e,r,n,t)})},one:function(e,t,n,r){return this.on(e,t,n,r,1)},off:function(e,t,n){var r,i;if(e&&e.preventDefault&&e.handleObj)return r=e.handleObj,x(e.delegateTarget).off(r.namespace?r.origType+"."+r.namespace:r.origType,r.selector,r.handler),this;if("object"==typeof e){for(i in e)this.off(i,t,e[i]);return this}return(t===!1||"function"==typeof t)&&(n=t,t=undefined),n===!1&&(n=Y),this.each(function(){x.event.remove(this,e,n,t)})},trigger:function(e,t){return this.each(function(){x.event.trigger(e,t,this)})},triggerHandler:function(e,t){var n=this[0];return n?x.event.trigger(e,t,n,!0):undefined}});var G=/^.[^:#\[\.,]*$/,J=/^(?:parents|prev(?:Until|All))/,Q=x.expr.match.needsContext,K={children:!0,contents:!0,next:!0,prev:!0};x.fn.extend({find:function(e){var t,n=[],r=this,i=r.length;if("string"!=typeof e)return this.pushStack(x(e).filter(function(){for(t=0;i>t;t++)if(x.contains(r[t],this))return!0}));for(t=0;i>t;t++)x.find(e,r[t],n);return n=this.pushStack(i>1?x.unique(n):n),n.selector=this.selector?this.selector+" "+e:e,n},has:function(e){var t=x(e,this),n=t.length;return this.filter(function(){var e=0;for(;n>e;e++)if(x.contains(this,t[e]))return!0})},not:function(e){return this.pushStack(et(this,e||[],!0))},filter:function(e){return this.pushStack(et(this,e||[],!1))},is:function(e){return!!et(this,"string"==typeof e&&Q.test(e)?x(e):e||[],!1).length},closest:function(e,t){var n,r=0,i=this.length,o=[],s=Q.test(e)||"string"!=typeof e?x(e,t||this.context):0;for(;i>r;r++)for(n=this[r];n&&n!==t;n=n.parentNode)if(11>n.nodeType&&(s?s.index(n)>-1:1===n.nodeType&&x.find.matchesSelector(n,e))){n=o.push(n);break}return this.pushStack(o.length>1?x.unique(o):o)},index:function(e){return e?"string"==typeof e?g.call(x(e),this[0]):g.call(this,e.jquery?e[0]:e):this[0]&&this[0].parentNode?this.first().prevAll().length:-1},add:function(e,t){var n="string"==typeof e?x(e,t):x.makeArray(e&&e.nodeType?[e]:e),r=x.merge(this.get(),n);return this.pushStack(x.unique(r))},addBack:function(e){return this.add(null==e?this.prevObject:this.prevObject.filter(e))}});function Z(e,t){while((e=e[t])&&1!==e.nodeType);return e}x.each({parent:function(e){var t=e.parentNode;return t&&11!==t.nodeType?t:null},parents:function(e){return x.dir(e,"parentNode")},parentsUntil:function(e,t,n){return x.dir(e,"parentNode",n)},next:function(e){return Z(e,"nextSibling")},prev:function(e){return Z(e,"previousSibling")},nextAll:function(e){return x.dir(e,"nextSibling")},prevAll:function(e){return x.dir(e,"previousSibling")},nextUntil:function(e,t,n){return x.dir(e,"nextSibling",n)},prevUntil:function(e,t,n){return x.dir(e,"previousSibling",n)},siblings:function(e){return x.sibling((e.parentNode||{}).firstChild,e)},children:function(e){return x.sibling(e.firstChild)},contents:function(e){return e.contentDocument||x.merge([],e.childNodes)}},function(e,t){x.fn[e]=function(n,r){var 
i=x.map(this,t,n);return"Until"!==e.slice(-5)&&(r=n),r&&"string"==typeof r&&(i=x.filter(r,i)),this.length>1&&(K[e]||x.unique(i),J.test(e)&&i.reverse()),this.pushStack(i)}}),x.extend({filter:function(e,t,n){var r=t[0];return n&&(e=":not("+e+")"),1===t.length&&1===r.nodeType?x.find.matchesSelector(r,e)?[r]:[]:x.find.matches(e,x.grep(t,function(e){return 1===e.nodeType}))},dir:function(e,t,n){var r=[],i=n!==undefined;while((e=e[t])&&9!==e.nodeType)if(1===e.nodeType){if(i&&x(e).is(n))break;r.push(e)}return r},sibling:function(e,t){var n=[];for(;e;e=e.nextSibling)1===e.nodeType&&e!==t&&n.push(e);return n}});function et(e,t,n){if(x.isFunction(t))return x.grep(e,function(e,r){return!!t.call(e,r,e)!==n});if(t.nodeType)return x.grep(e,function(e){return e===t!==n});if("string"==typeof t){if(G.test(t))return x.filter(t,e,n);t=x.filter(t,e)}return x.grep(e,function(e){return g.call(t,e)>=0!==n})}var tt=/<(?!area|br|col|embed|hr|img|input|link|meta|param)(([\w:]+)[^>]*)\/>/gi,nt=/<([\w:]+)/,rt=/<|&#?\w+;/,it=/<(?:script|style|link)/i,ot=/^(?:checkbox|radio)$/i,st=/checked\s*(?:[^=]|=\s*.checked.)/i,at=/^$|\/(?:java|ecma)script/i,ut=/^true\/(.*)/,lt=/^\s*<!(?:\[CDATA\[|--)|(?:\]\]|--)>\s*$/g,ct={option:[1,"<select multiple='multiple'>","</select>"],thead:[1,"<table>","</table>"],col:[2,"<table><colgroup>","</colgroup></table>"],tr:[2,"<table><tbody>","</tbody></table>"],td:[3,"<table><tbody><tr>","</tr></tbody></table>"],_default:[0,"",""]};ct.optgroup=ct.option,ct.tbody=ct.tfoot=ct.colgroup=ct.caption=ct.thead,ct.th=ct.td,x.fn.extend({text:function(e){return x.access(this,function(e){return e===undefined?x.text(this):this.empty().append((this[0]&&this[0].ownerDocument||o).createTextNode(e))},null,e,arguments.length)},append:function(){return this.domManip(arguments,function(e){if(1===this.nodeType||11===this.nodeType||9===this.nodeType){var t=pt(this,e);t.appendChild(e)}})},prepend:function(){return this.domManip(arguments,function(e){if(1===this.nodeType||11===this.nodeType||9===this.nodeType){var t=pt(this,e);t.insertBefore(e,t.firstChild)}})},before:function(){return this.domManip(arguments,function(e){this.parentNode&&this.parentNode.insertBefore(e,this)})},after:function(){return this.domManip(arguments,function(e){this.parentNode&&this.parentNode.insertBefore(e,this.nextSibling)})},remove:function(e,t){var n,r=e?x.filter(e,this):this,i=0;for(;null!=(n=r[i]);i++)t||1!==n.nodeType||x.cleanData(mt(n)),n.parentNode&&(t&&x.contains(n.ownerDocument,n)&&dt(mt(n,"script")),n.parentNode.removeChild(n));return this},empty:function(){var e,t=0;for(;null!=(e=this[t]);t++)1===e.nodeType&&(x.cleanData(mt(e,!1)),e.textContent="");return this},clone:function(e,t){return e=null==e?!1:e,t=null==t?e:t,this.map(function(){return x.clone(this,e,t)})},html:function(e){return x.access(this,function(e){var t=this[0]||{},n=0,r=this.length;if(e===undefined&&1===t.nodeType)return t.innerHTML;if("string"==typeof e&&!it.test(e)&&!ct[(nt.exec(e)||["",""])[1].toLowerCase()]){e=e.replace(tt,"<$1></$2>");try{for(;r>n;n++)t=this[n]||{},1===t.nodeType&&(x.cleanData(mt(t,!1)),t.innerHTML=e);t=0}catch(i){}}t&&this.empty().append(e)},null,e,arguments.length)},replaceWith:function(){var e=x.map(this,function(e){return[e.nextSibling,e.parentNode]}),t=0;return this.domManip(arguments,function(n){var r=e[t++],i=e[t++];i&&(r&&r.parentNode!==i&&(r=this.nextSibling),x(this).remove(),i.insertBefore(n,r))},!0),t?this:this.remove()},detach:function(e){return this.remove(e,!0)},domManip:function(e,t,n){e=f.apply([],e);var 
r,i,o,s,a,u,l=0,c=this.length,p=this,h=c-1,d=e[0],g=x.isFunction(d);if(g||!(1>=c||"string"!=typeof d||x.support.checkClone)&&st.test(d))return this.each(function(r){var i=p.eq(r);g&&(e[0]=d.call(this,r,i.html())),i.domManip(e,t,n)});if(c&&(r=x.buildFragment(e,this[0].ownerDocument,!1,!n&&this),i=r.firstChild,1===r.childNodes.length&&(r=i),i)){for(o=x.map(mt(r,"script"),ft),s=o.length;c>l;l++)a=r,l!==h&&(a=x.clone(a,!0,!0),s&&x.merge(o,mt(a,"script"))),t.call(this[l],a,l);if(s)for(u=o[o.length-1].ownerDocument,x.map(o,ht),l=0;s>l;l++)a=o[l],at.test(a.type||"")&&!q.access(a,"globalEval")&&x.contains(u,a)&&(a.src?x._evalUrl(a.src):x.globalEval(a.textContent.replace(lt,"")))}return this}}),x.each({appendTo:"append",prependTo:"prepend",insertBefore:"before",insertAfter:"after",replaceAll:"replaceWith"},function(e,t){x.fn[e]=function(e){var n,r=[],i=x(e),o=i.length-1,s=0;for(;o>=s;s++)n=s===o?this:this.clone(!0),x(i[s])[t](n),h.apply(r,n.get());return this.pushStack(r)}}),x.extend({clone:function(e,t,n){var r,i,o,s,a=e.cloneNode(!0),u=x.contains(e.ownerDocument,e);if(!(x.support.noCloneChecked||1!==e.nodeType&&11!==e.nodeType||x.isXMLDoc(e)))for(s=mt(a),o=mt(e),r=0,i=o.length;i>r;r++)yt(o[r],s[r]);if(t)if(n)for(o=o||mt(e),s=s||mt(a),r=0,i=o.length;i>r;r++)gt(o[r],s[r]);else gt(e,a);return s=mt(a,"script"),s.length>0&&dt(s,!u&&mt(e,"script")),a},buildFragment:function(e,t,n,r){var i,o,s,a,u,l,c=0,p=e.length,f=t.createDocumentFragment(),h=[];for(;p>c;c++)if(i=e[c],i||0===i)if("object"===x.type(i))x.merge(h,i.nodeType?[i]:i);else if(rt.test(i)){o=o||f.appendChild(t.createElement("div")),s=(nt.exec(i)||["",""])[1].toLowerCase(),a=ct[s]||ct._default,o.innerHTML=a[1]+i.replace(tt,"<$1></$2>")+a[2],l=a[0];while(l--)o=o.lastChild;x.merge(h,o.childNodes),o=f.firstChild,o.textContent=""}else h.push(t.createTextNode(i));f.textContent="",c=0;while(i=h[c++])if((!r||-1===x.inArray(i,r))&&(u=x.contains(i.ownerDocument,i),o=mt(f.appendChild(i),"script"),u&&dt(o),n)){l=0;while(i=o[l++])at.test(i.type||"")&&n.push(i)}return f},cleanData:function(e){var t,n,r,i,o,s,a=x.event.special,u=0;for(;(n=e[u])!==undefined;u++){if(F.accepts(n)&&(o=n[q.expando],o&&(t=q.cache[o]))){if(r=Object.keys(t.events||{}),r.length)for(s=0;(i=r[s])!==undefined;s++)a[i]?x.event.remove(n,i):x.removeEvent(n,i,t.handle);q.cache[o]&&delete q.cache[o]}delete L.cache[n[L.expando]]}},_evalUrl:function(e){return x.ajax({url:e,type:"GET",dataType:"script",async:!1,global:!1,"throws":!0})}});function pt(e,t){return x.nodeName(e,"table")&&x.nodeName(1===t.nodeType?t:t.firstChild,"tr")?e.getElementsByTagName("tbody")[0]||e.appendChild(e.ownerDocument.createElement("tbody")):e}function ft(e){return e.type=(null!==e.getAttribute("type"))+"/"+e.type,e}function ht(e){var t=ut.exec(e.type);return t?e.type=t[1]:e.removeAttribute("type"),e}function dt(e,t){var n=e.length,r=0;for(;n>r;r++)q.set(e[r],"globalEval",!t||q.get(t[r],"globalEval"))}function gt(e,t){var n,r,i,o,s,a,u,l;if(1===t.nodeType){if(q.hasData(e)&&(o=q.access(e),s=q.set(t,o),l=o.events)){delete s.handle,s.events={};for(i in l)for(n=0,r=l[i].length;r>n;n++)x.event.add(t,i,l[i][n])}L.hasData(e)&&(a=L.access(e),u=x.extend({},a),L.set(t,u))}}function mt(e,t){var n=e.getElementsByTagName?e.getElementsByTagName(t||"*"):e.querySelectorAll?e.querySelectorAll(t||"*"):[];return t===undefined||t&&x.nodeName(e,t)?x.merge([e],n):n}function yt(e,t){var 
n=t.nodeName.toLowerCase();"input"===n&&ot.test(e.type)?t.checked=e.checked:("input"===n||"textarea"===n)&&(t.defaultValue=e.defaultValue)}x.fn.extend({wrapAll:function(e){var t;return x.isFunction(e)?this.each(function(t){x(this).wrapAll(e.call(this,t))}):(this[0]&&(t=x(e,this[0].ownerDocument).eq(0).clone(!0),this[0].parentNode&&t.insertBefore(this[0]),t.map(function(){var e=this;while(e.firstElementChild)e=e.firstElementChild;return e}).append(this)),this)},wrapInner:function(e){return x.isFunction(e)?this.each(function(t){x(this).wrapInner(e.call(this,t))}):this.each(function(){var t=x(this),n=t.contents();n.length?n.wrapAll(e):t.append(e)})},wrap:function(e){var t=x.isFunction(e);return this.each(function(n){x(this).wrapAll(t?e.call(this,n):e)})},unwrap:function(){return this.parent().each(function(){x.nodeName(this,"body")||x(this).replaceWith(this.childNodes)}).end()}});var vt,xt,bt=/^(none|table(?!-c[ea]).+)/,wt=/^margin/,Tt=RegExp("^("+b+")(.*)$","i"),Ct=RegExp("^("+b+")(?!px)[a-z%]+$","i"),kt=RegExp("^([+-])=("+b+")","i"),Nt={BODY:"block"},Et={position:"absolute",visibility:"hidden",display:"block"},St={letterSpacing:0,fontWeight:400},jt=["Top","Right","Bottom","Left"],Dt=["Webkit","O","Moz","ms"];function At(e,t){if(t in e)return t;var n=t.charAt(0).toUpperCase()+t.slice(1),r=t,i=Dt.length;while(i--)if(t=Dt[i]+n,t in e)return t;return r}function Lt(e,t){return e=t||e,"none"===x.css(e,"display")||!x.contains(e.ownerDocument,e)}function qt(t){return e.getComputedStyle(t,null)}function Ht(e,t){var n,r,i,o=[],s=0,a=e.length;for(;a>s;s++)r=e[s],r.style&&(o[s]=q.get(r,"olddisplay"),n=r.style.display,t?(o[s]||"none"!==n||(r.style.display=""),""===r.style.display&&Lt(r)&&(o[s]=q.access(r,"olddisplay",Rt(r.nodeName)))):o[s]||(i=Lt(r),(n&&"none"!==n||!i)&&q.set(r,"olddisplay",i?n:x.css(r,"display"))));for(s=0;a>s;s++)r=e[s],r.style&&(t&&"none"!==r.style.display&&""!==r.style.display||(r.style.display=t?o[s]||"":"none"));return e}x.fn.extend({css:function(e,t){return x.access(this,function(e,t,n){var r,i,o={},s=0;if(x.isArray(t)){for(r=qt(e),i=t.length;i>s;s++)o[t[s]]=x.css(e,t[s],!1,r);return o}return n!==undefined?x.style(e,t,n):x.css(e,t)},e,t,arguments.length>1)},show:function(){return Ht(this,!0)},hide:function(){return Ht(this)},toggle:function(e){return"boolean"==typeof e?e?this.show():this.hide():this.each(function(){Lt(this)?x(this).show():x(this).hide()})}}),x.extend({cssHooks:{opacity:{get:function(e,t){if(t){var n=vt(e,"opacity");return""===n?"1":n}}}},cssNumber:{columnCount:!0,fillOpacity:!0,fontWeight:!0,lineHeight:!0,opacity:!0,order:!0,orphans:!0,widows:!0,zIndex:!0,zoom:!0},cssProps:{"float":"cssFloat"},style:function(e,t,n,r){if(e&&3!==e.nodeType&&8!==e.nodeType&&e.style){var i,o,s,a=x.camelCase(t),u=e.style;return t=x.cssProps[a]||(x.cssProps[a]=At(u,a)),s=x.cssHooks[t]||x.cssHooks[a],n===undefined?s&&"get"in s&&(i=s.get(e,!1,r))!==undefined?i:u[t]:(o=typeof n,"string"===o&&(i=kt.exec(n))&&(n=(i[1]+1)*i[2]+parseFloat(x.css(e,t)),o="number"),null==n||"number"===o&&isNaN(n)||("number"!==o||x.cssNumber[a]||(n+="px"),x.support.clearCloneStyle||""!==n||0!==t.indexOf("background")||(u[t]="inherit"),s&&"set"in s&&(n=s.set(e,n,r))===undefined||(u[t]=n)),undefined)}},css:function(e,t,n,r){var i,o,s,a=x.camelCase(t);return t=x.cssProps[a]||(x.cssProps[a]=At(e.style,a)),s=x.cssHooks[t]||x.cssHooks[a],s&&"get"in s&&(i=s.get(e,!0,n)),i===undefined&&(i=vt(e,t,r)),"normal"===i&&t in St&&(i=St[t]),""===n||n?(o=parseFloat(i),n===!0||x.isNumeric(o)?o||0:i):i}}),vt=function(e,t,n){var 
r,i,o,s=n||qt(e),a=s?s.getPropertyValue(t)||s[t]:undefined,u=e.style;return s&&(""!==a||x.contains(e.ownerDocument,e)||(a=x.style(e,t)),Ct.test(a)&&wt.test(t)&&(r=u.width,i=u.minWidth,o=u.maxWidth,u.minWidth=u.maxWidth=u.width=a,a=s.width,u.width=r,u.minWidth=i,u.maxWidth=o)),a};function Ot(e,t,n){var r=Tt.exec(t);return r?Math.max(0,r[1]-(n||0))+(r[2]||"px"):t}function Ft(e,t,n,r,i){var o=n===(r?"border":"content")?4:"width"===t?1:0,s=0;for(;4>o;o+=2)"margin"===n&&(s+=x.css(e,n+jt[o],!0,i)),r?("content"===n&&(s-=x.css(e,"padding"+jt[o],!0,i)),"margin"!==n&&(s-=x.css(e,"border"+jt[o]+"Width",!0,i))):(s+=x.css(e,"padding"+jt[o],!0,i),"padding"!==n&&(s+=x.css(e,"border"+jt[o]+"Width",!0,i)));return s}function Pt(e,t,n){var r=!0,i="width"===t?e.offsetWidth:e.offsetHeight,o=qt(e),s=x.support.boxSizing&&"border-box"===x.css(e,"boxSizing",!1,o);if(0>=i||null==i){if(i=vt(e,t,o),(0>i||null==i)&&(i=e.style[t]),Ct.test(i))return i;r=s&&(x.support.boxSizingReliable||i===e.style[t]),i=parseFloat(i)||0}return i+Ft(e,t,n||(s?"border":"content"),r,o)+"px"}function Rt(e){var t=o,n=Nt[e];return n||(n=Mt(e,t),"none"!==n&&n||(xt=(xt||x("<iframe frameborder='0' width='0' height='0'/>").css("cssText","display:block !important")).appendTo(t.documentElement),t=(xt[0].contentWindow||xt[0].contentDocument).document,t.write("<!doctype html><html><body>"),t.close(),n=Mt(e,t),xt.detach()),Nt[e]=n),n}function Mt(e,t){var n=x(t.createElement(e)).appendTo(t.body),r=x.css(n[0],"display");return n.remove(),r}x.each(["height","width"],function(e,t){x.cssHooks[t]={get:function(e,n,r){return n?0===e.offsetWidth&&bt.test(x.css(e,"display"))?x.swap(e,Et,function(){return Pt(e,t,r)}):Pt(e,t,r):undefined},set:function(e,n,r){var i=r&&qt(e);return Ot(e,n,r?Ft(e,t,r,x.support.boxSizing&&"border-box"===x.css(e,"boxSizing",!1,i),i):0)}}}),x(function(){x.support.reliableMarginRight||(x.cssHooks.marginRight={get:function(e,t){return t?x.swap(e,{display:"inline-block"},vt,[e,"marginRight"]):undefined}}),!x.support.pixelPosition&&x.fn.position&&x.each(["top","left"],function(e,t){x.cssHooks[t]={get:function(e,n){return n?(n=vt(e,t),Ct.test(n)?x(e).position()[t]+"px":n):undefined}}})}),x.expr&&x.expr.filters&&(x.expr.filters.hidden=function(e){return 0>=e.offsetWidth&&0>=e.offsetHeight},x.expr.filters.visible=function(e){return!x.expr.filters.hidden(e)}),x.each({margin:"",padding:"",border:"Width"},function(e,t){x.cssHooks[e+t]={expand:function(n){var r=0,i={},o="string"==typeof n?n.split(" "):[n];for(;4>r;r++)i[e+jt[r]+t]=o[r]||o[r-2]||o[0];return i}},wt.test(e)||(x.cssHooks[e+t].set=Ot)});var Wt=/%20/g,$t=/\[\]$/,Bt=/\r?\n/g,It=/^(?:submit|button|image|reset|file)$/i,zt=/^(?:input|select|textarea|keygen)/i;x.fn.extend({serialize:function(){return x.param(this.serializeArray())},serializeArray:function(){return this.map(function(){var e=x.prop(this,"elements");return e?x.makeArray(e):this}).filter(function(){var e=this.type;return this.name&&!x(this).is(":disabled")&&zt.test(this.nodeName)&&!It.test(e)&&(this.checked||!ot.test(e))}).map(function(e,t){var n=x(this).val();return null==n?null:x.isArray(n)?x.map(n,function(e){return{name:t.name,value:e.replace(Bt,"\r\n")}}):{name:t.name,value:n.replace(Bt,"\r\n")}}).get()}}),x.param=function(e,t){var n,r=[],i=function(e,t){t=x.isFunction(t)?t():null==t?"":t,r[r.length]=encodeURIComponent(e)+"="+encodeURIComponent(t)};if(t===undefined&&(t=x.ajaxSettings&&x.ajaxSettings.traditional),x.isArray(e)||e.jquery&&!x.isPlainObject(e))x.each(e,function(){i(this.name,this.value)});else for(n in 
e)_t(n,e[n],t,i);return r.join("&").replace(Wt,"+")};function _t(e,t,n,r){var i;if(x.isArray(t))x.each(t,function(t,i){n||$t.test(e)?r(e,i):_t(e+"["+("object"==typeof i?t:"")+"]",i,n,r)});else if(n||"object"!==x.type(t))r(e,t);else for(i in t)_t(e+"["+i+"]",t[i],n,r)}x.each("blur focus focusin focusout load resize scroll unload click dblclick mousedown mouseup mousemove mouseover mouseout mouseenter mouseleave change select submit keydown keypress keyup error contextmenu".split(" "),function(e,t){x.fn[t]=function(e,n){return arguments.length>0?this.on(t,null,e,n):this.trigger(t)}}),x.fn.extend({hover:function(e,t){return this.mouseenter(e).mouseleave(t||e)},bind:function(e,t,n){return this.on(e,null,t,n)},unbind:function(e,t){return this.off(e,null,t)
+},delegate:function(e,t,n,r){return this.on(t,e,n,r)},undelegate:function(e,t,n){return 1===arguments.length?this.off(e,"**"):this.off(t,e||"**",n)}});var Xt,Ut,Yt=x.now(),Vt=/\?/,Gt=/#.*$/,Jt=/([?&])_=[^&]*/,Qt=/^(.*?):[ \t]*([^\r\n]*)$/gm,Kt=/^(?:about|app|app-storage|.+-extension|file|res|widget):$/,Zt=/^(?:GET|HEAD)$/,en=/^\/\//,tn=/^([\w.+-]+:)(?:\/\/([^\/?#:]*)(?::(\d+)|)|)/,nn=x.fn.load,rn={},on={},sn="*/".concat("*");try{Ut=i.href}catch(an){Ut=o.createElement("a"),Ut.href="",Ut=Ut.href}Xt=tn.exec(Ut.toLowerCase())||[];function un(e){return function(t,n){"string"!=typeof t&&(n=t,t="*");var r,i=0,o=t.toLowerCase().match(w)||[];if(x.isFunction(n))while(r=o[i++])"+"===r[0]?(r=r.slice(1)||"*",(e[r]=e[r]||[]).unshift(n)):(e[r]=e[r]||[]).push(n)}}function ln(e,t,n,r){var i={},o=e===on;function s(a){var u;return i[a]=!0,x.each(e[a]||[],function(e,a){var l=a(t,n,r);return"string"!=typeof l||o||i[l]?o?!(u=l):undefined:(t.dataTypes.unshift(l),s(l),!1)}),u}return s(t.dataTypes[0])||!i["*"]&&s("*")}function cn(e,t){var n,r,i=x.ajaxSettings.flatOptions||{};for(n in t)t[n]!==undefined&&((i[n]?e:r||(r={}))[n]=t[n]);return r&&x.extend(!0,e,r),e}x.fn.load=function(e,t,n){if("string"!=typeof e&&nn)return nn.apply(this,arguments);var r,i,o,s=this,a=e.indexOf(" ");return a>=0&&(r=e.slice(a),e=e.slice(0,a)),x.isFunction(t)?(n=t,t=undefined):t&&"object"==typeof t&&(i="POST"),s.length>0&&x.ajax({url:e,type:i,dataType:"html",data:t}).done(function(e){o=arguments,s.html(r?x("<div>").append(x.parseHTML(e)).find(r):e)}).complete(n&&function(e,t){s.each(n,o||[e.responseText,t,e])}),this},x.each(["ajaxStart","ajaxStop","ajaxComplete","ajaxError","ajaxSuccess","ajaxSend"],function(e,t){x.fn[t]=function(e){return this.on(t,e)}}),x.extend({active:0,lastModified:{},etag:{},ajaxSettings:{url:Ut,type:"GET",isLocal:Kt.test(Xt[1]),global:!0,processData:!0,async:!0,contentType:"application/x-www-form-urlencoded; charset=UTF-8",accepts:{"*":sn,text:"text/plain",html:"text/html",xml:"application/xml, text/xml",json:"application/json, text/javascript"},contents:{xml:/xml/,html:/html/,json:/json/},responseFields:{xml:"responseXML",text:"responseText",json:"responseJSON"},converters:{"* text":String,"text html":!0,"text json":x.parseJSON,"text xml":x.parseXML},flatOptions:{url:!0,context:!0}},ajaxSetup:function(e,t){return t?cn(cn(e,x.ajaxSettings),t):cn(x.ajaxSettings,e)},ajaxPrefilter:un(rn),ajaxTransport:un(on),ajax:function(e,t){"object"==typeof e&&(t=e,e=undefined),t=t||{};var n,r,i,o,s,a,u,l,c=x.ajaxSetup({},t),p=c.context||c,f=c.context&&(p.nodeType||p.jquery)?x(p):x.event,h=x.Deferred(),d=x.Callbacks("once memory"),g=c.statusCode||{},m={},y={},v=0,b="canceled",T={readyState:0,getResponseHeader:function(e){var t;if(2===v){if(!o){o={};while(t=Qt.exec(i))o[t[1].toLowerCase()]=t[2]}t=o[e.toLowerCase()]}return null==t?null:t},getAllResponseHeaders:function(){return 2===v?i:null},setRequestHeader:function(e,t){var n=e.toLowerCase();return v||(e=y[n]=y[n]||e,m[e]=t),this},overrideMimeType:function(e){return v||(c.mimeType=e),this},statusCode:function(e){var t;if(e)if(2>v)for(t in e)g[t]=[g[t],e[t]];else T.always(e[T.status]);return this},abort:function(e){var t=e||b;return 
n&&n.abort(t),k(0,t),this}};if(h.promise(T).complete=d.add,T.success=T.done,T.error=T.fail,c.url=((e||c.url||Ut)+"").replace(Gt,"").replace(en,Xt[1]+"//"),c.type=t.method||t.type||c.method||c.type,c.dataTypes=x.trim(c.dataType||"*").toLowerCase().match(w)||[""],null==c.crossDomain&&(a=tn.exec(c.url.toLowerCase()),c.crossDomain=!(!a||a[1]===Xt[1]&&a[2]===Xt[2]&&(a[3]||("http:"===a[1]?"80":"443"))===(Xt[3]||("http:"===Xt[1]?"80":"443")))),c.data&&c.processData&&"string"!=typeof c.data&&(c.data=x.param(c.data,c.traditional)),ln(rn,c,t,T),2===v)return T;u=c.global,u&&0===x.active++&&x.event.trigger("ajaxStart"),c.type=c.type.toUpperCase(),c.hasContent=!Zt.test(c.type),r=c.url,c.hasContent||(c.data&&(r=c.url+=(Vt.test(r)?"&":"?")+c.data,delete c.data),c.cache===!1&&(c.url=Jt.test(r)?r.replace(Jt,"$1_="+Yt++):r+(Vt.test(r)?"&":"?")+"_="+Yt++)),c.ifModified&&(x.lastModified[r]&&T.setRequestHeader("If-Modified-Since",x.lastModified[r]),x.etag[r]&&T.setRequestHeader("If-None-Match",x.etag[r])),(c.data&&c.hasContent&&c.contentType!==!1||t.contentType)&&T.setRequestHeader("Content-Type",c.contentType),T.setRequestHeader("Accept",c.dataTypes[0]&&c.accepts[c.dataTypes[0]]?c.accepts[c.dataTypes[0]]+("*"!==c.dataTypes[0]?", "+sn+"; q=0.01":""):c.accepts["*"]);for(l in c.headers)T.setRequestHeader(l,c.headers[l]);if(c.beforeSend&&(c.beforeSend.call(p,T,c)===!1||2===v))return T.abort();b="abort";for(l in{success:1,error:1,complete:1})T[l](c[l]);if(n=ln(on,c,t,T)){T.readyState=1,u&&f.trigger("ajaxSend",[T,c]),c.async&&c.timeout>0&&(s=setTimeout(function(){T.abort("timeout")},c.timeout));try{v=1,n.send(m,k)}catch(C){if(!(2>v))throw C;k(-1,C)}}else k(-1,"No Transport");function k(e,t,o,a){var l,m,y,b,w,C=t;2!==v&&(v=2,s&&clearTimeout(s),n=undefined,i=a||"",T.readyState=e>0?4:0,l=e>=200&&300>e||304===e,o&&(b=pn(c,T,o)),b=fn(c,b,T,l),l?(c.ifModified&&(w=T.getResponseHeader("Last-Modified"),w&&(x.lastModified[r]=w),w=T.getResponseHeader("etag"),w&&(x.etag[r]=w)),204===e||"HEAD"===c.type?C="nocontent":304===e?C="notmodified":(C=b.state,m=b.data,y=b.error,l=!y)):(y=C,(e||!C)&&(C="error",0>e&&(e=0))),T.status=e,T.statusText=(t||C)+"",l?h.resolveWith(p,[m,C,T]):h.rejectWith(p,[T,C,y]),T.statusCode(g),g=undefined,u&&f.trigger(l?"ajaxSuccess":"ajaxError",[T,c,l?m:y]),d.fireWith(p,[T,C]),u&&(f.trigger("ajaxComplete",[T,c]),--x.active||x.event.trigger("ajaxStop")))}return T},getJSON:function(e,t,n){return x.get(e,t,n,"json")},getScript:function(e,t){return x.get(e,undefined,t,"script")}}),x.each(["get","post"],function(e,t){x[t]=function(e,n,r,i){return x.isFunction(n)&&(i=i||r,r=n,n=undefined),x.ajax({url:e,type:t,dataType:i,data:n,success:r})}});function pn(e,t,n){var r,i,o,s,a=e.contents,u=e.dataTypes;while("*"===u[0])u.shift(),r===undefined&&(r=e.mimeType||t.getResponseHeader("Content-Type"));if(r)for(i in a)if(a[i]&&a[i].test(r)){u.unshift(i);break}if(u[0]in n)o=u[0];else{for(i in n){if(!u[0]||e.converters[i+" "+u[0]]){o=i;break}s||(s=i)}o=o||s}return o?(o!==u[0]&&u.unshift(o),n[o]):undefined}function fn(e,t,n,r){var i,o,s,a,u,l={},c=e.dataTypes.slice();if(c[1])for(s in e.converters)l[s.toLowerCase()]=e.converters[s];o=c.shift();while(o)if(e.responseFields[o]&&(n[e.responseFields[o]]=t),!u&&r&&e.dataFilter&&(t=e.dataFilter(t,e.dataType)),u=o,o=c.shift())if("*"===o)o=u;else if("*"!==u&&u!==o){if(s=l[u+" "+o]||l["* "+o],!s)for(i in l)if(a=i.split(" "),a[1]===o&&(s=l[u+" "+a[0]]||l["* "+a[0]])){s===!0?s=l[i]:l[i]!==!0&&(o=a[0],c.unshift(a[1]));break}if(s!==!0)if(s&&e["throws"])t=s(t);else 
try{t=s(t)}catch(p){return{state:"parsererror",error:s?p:"No conversion from "+u+" to "+o}}}return{state:"success",data:t}}x.ajaxSetup({accepts:{script:"text/javascript, application/javascript, application/ecmascript, application/x-ecmascript"},contents:{script:/(?:java|ecma)script/},converters:{"text script":function(e){return x.globalEval(e),e}}}),x.ajaxPrefilter("script",function(e){e.cache===undefined&&(e.cache=!1),e.crossDomain&&(e.type="GET")}),x.ajaxTransport("script",function(e){if(e.crossDomain){var t,n;return{send:function(r,i){t=x("<script>").prop({async:!0,charset:e.scriptCharset,src:e.url}).on("load error",n=function(e){t.remove(),n=null,e&&i("error"===e.type?404:200,e.type)}),o.head.appendChild(t[0])},abort:function(){n&&n()}}}});var hn=[],dn=/(=)\?(?=&|$)|\?\?/;x.ajaxSetup({jsonp:"callback",jsonpCallback:function(){var e=hn.pop()||x.expando+"_"+Yt++;return this[e]=!0,e}}),x.ajaxPrefilter("json jsonp",function(t,n,r){var i,o,s,a=t.jsonp!==!1&&(dn.test(t.url)?"url":"string"==typeof t.data&&!(t.contentType||"").indexOf("application/x-www-form-urlencoded")&&dn.test(t.data)&&"data");return a||"jsonp"===t.dataTypes[0]?(i=t.jsonpCallback=x.isFunction(t.jsonpCallback)?t.jsonpCallback():t.jsonpCallback,a?t[a]=t[a].replace(dn,"$1"+i):t.jsonp!==!1&&(t.url+=(Vt.test(t.url)?"&":"?")+t.jsonp+"="+i),t.converters["script json"]=function(){return s||x.error(i+" was not called"),s[0]},t.dataTypes[0]="json",o=e[i],e[i]=function(){s=arguments},r.always(function(){e[i]=o,t[i]&&(t.jsonpCallback=n.jsonpCallback,hn.push(i)),s&&x.isFunction(o)&&o(s[0]),s=o=undefined}),"script"):undefined}),x.ajaxSettings.xhr=function(){try{return new XMLHttpRequest}catch(e){}};var gn=x.ajaxSettings.xhr(),mn={0:200,1223:204},yn=0,vn={};e.ActiveXObject&&x(e).on("unload",function(){for(var e in vn)vn[e]();vn=undefined}),x.support.cors=!!gn&&"withCredentials"in gn,x.support.ajax=gn=!!gn,x.ajaxTransport(function(e){var t;return x.support.cors||gn&&!e.crossDomain?{send:function(n,r){var i,o,s=e.xhr();if(s.open(e.type,e.url,e.async,e.username,e.password),e.xhrFields)for(i in e.xhrFields)s[i]=e.xhrFields[i];e.mimeType&&s.overrideMimeType&&s.overrideMimeType(e.mimeType),e.crossDomain||n["X-Requested-With"]||(n["X-Requested-With"]="XMLHttpRequest");for(i in n)s.setRequestHeader(i,n[i]);t=function(e){return function(){t&&(delete vn[o],t=s.onload=s.onerror=null,"abort"===e?s.abort():"error"===e?r(s.status||404,s.statusText):r(mn[s.status]||s.status,s.statusText,"string"==typeof s.responseText?{text:s.responseText}:undefined,s.getAllResponseHeaders()))}},s.onload=t(),s.onerror=t("error"),t=vn[o=yn++]=t("abort"),s.send(e.hasContent&&e.data||null)},abort:function(){t&&t()}}:undefined});var xn,bn,wn=/^(?:toggle|show|hide)$/,Tn=RegExp("^(?:([+-])=|)("+b+")([a-z%]*)$","i"),Cn=/queueHooks$/,kn=[An],Nn={"*":[function(e,t){var n=this.createTween(e,t),r=n.cur(),i=Tn.exec(t),o=i&&i[3]||(x.cssNumber[e]?"":"px"),s=(x.cssNumber[e]||"px"!==o&&+r)&&Tn.exec(x.css(n.elem,e)),a=1,u=20;if(s&&s[3]!==o){o=o||s[3],i=i||[],s=+r||1;do a=a||".5",s/=a,x.style(n.elem,e,s+o);while(a!==(a=n.cur()/r)&&1!==a&&--u)}return i&&(s=n.start=+s||+r||0,n.unit=o,n.end=i[1]?s+(i[1]+1)*i[2]:+i[2]),n}]};function En(){return setTimeout(function(){xn=undefined}),xn=x.now()}function Sn(e,t,n){var r,i=(Nn[t]||[]).concat(Nn["*"]),o=0,s=i.length;for(;s>o;o++)if(r=i[o].call(n,t,e))return r}function jn(e,t,n){var r,i,o=0,s=kn.length,a=x.Deferred().always(function(){delete u.elem}),u=function(){if(i)return!1;var 
t=xn||En(),n=Math.max(0,l.startTime+l.duration-t),r=n/l.duration||0,o=1-r,s=0,u=l.tweens.length;for(;u>s;s++)l.tweens[s].run(o);return a.notifyWith(e,[l,o,n]),1>o&&u?n:(a.resolveWith(e,[l]),!1)},l=a.promise({elem:e,props:x.extend({},t),opts:x.extend(!0,{specialEasing:{}},n),originalProperties:t,originalOptions:n,startTime:xn||En(),duration:n.duration,tweens:[],createTween:function(t,n){var r=x.Tween(e,l.opts,t,n,l.opts.specialEasing[t]||l.opts.easing);return l.tweens.push(r),r},stop:function(t){var n=0,r=t?l.tweens.length:0;if(i)return this;for(i=!0;r>n;n++)l.tweens[n].run(1);return t?a.resolveWith(e,[l,t]):a.rejectWith(e,[l,t]),this}}),c=l.props;for(Dn(c,l.opts.specialEasing);s>o;o++)if(r=kn[o].call(l,e,c,l.opts))return r;return x.map(c,Sn,l),x.isFunction(l.opts.start)&&l.opts.start.call(e,l),x.fx.timer(x.extend(u,{elem:e,anim:l,queue:l.opts.queue})),l.progress(l.opts.progress).done(l.opts.done,l.opts.complete).fail(l.opts.fail).always(l.opts.always)}function Dn(e,t){var n,r,i,o,s;for(n in e)if(r=x.camelCase(n),i=t[r],o=e[n],x.isArray(o)&&(i=o[1],o=e[n]=o[0]),n!==r&&(e[r]=o,delete e[n]),s=x.cssHooks[r],s&&"expand"in s){o=s.expand(o),delete e[r];for(n in o)n in e||(e[n]=o[n],t[n]=i)}else t[r]=i}x.Animation=x.extend(jn,{tweener:function(e,t){x.isFunction(e)?(t=e,e=["*"]):e=e.split(" ");var n,r=0,i=e.length;for(;i>r;r++)n=e[r],Nn[n]=Nn[n]||[],Nn[n].unshift(t)},prefilter:function(e,t){t?kn.unshift(e):kn.push(e)}});function An(e,t,n){var r,i,o,s,a,u,l=this,c={},p=e.style,f=e.nodeType&&Lt(e),h=q.get(e,"fxshow");n.queue||(a=x._queueHooks(e,"fx"),null==a.unqueued&&(a.unqueued=0,u=a.empty.fire,a.empty.fire=function(){a.unqueued||u()}),a.unqueued++,l.always(function(){l.always(function(){a.unqueued--,x.queue(e,"fx").length||a.empty.fire()})})),1===e.nodeType&&("height"in t||"width"in t)&&(n.overflow=[p.overflow,p.overflowX,p.overflowY],"inline"===x.css(e,"display")&&"none"===x.css(e,"float")&&(p.display="inline-block")),n.overflow&&(p.overflow="hidden",l.always(function(){p.overflow=n.overflow[0],p.overflowX=n.overflow[1],p.overflowY=n.overflow[2]}));for(r in t)if(i=t[r],wn.exec(i)){if(delete t[r],o=o||"toggle"===i,i===(f?"hide":"show")){if("show"!==i||!h||h[r]===undefined)continue;f=!0}c[r]=h&&h[r]||x.style(e,r)}if(!x.isEmptyObject(c)){h?"hidden"in h&&(f=h.hidden):h=q.access(e,"fxshow",{}),o&&(h.hidden=!f),f?x(e).show():l.done(function(){x(e).hide()}),l.done(function(){var t;q.remove(e,"fxshow");for(t in c)x.style(e,t,c[t])});for(r in c)s=Sn(f?h[r]:0,r,l),r in h||(h[r]=s.start,f&&(s.end=s.start,s.start="width"===r||"height"===r?1:0))}}function Ln(e,t,n,r,i){return new Ln.prototype.init(e,t,n,r,i)}x.Tween=Ln,Ln.prototype={constructor:Ln,init:function(e,t,n,r,i,o){this.elem=e,this.prop=n,this.easing=i||"swing",this.options=t,this.start=this.now=this.cur(),this.end=r,this.unit=o||(x.cssNumber[n]?"":"px")},cur:function(){var e=Ln.propHooks[this.prop];return e&&e.get?e.get(this):Ln.propHooks._default.get(this)},run:function(e){var t,n=Ln.propHooks[this.prop];return this.pos=t=this.options.duration?x.easing[this.easing](e,this.options.duration*e,0,1,this.options.duration):e,this.now=(this.end-this.start)*t+this.start,this.options.step&&this.options.step.call(this.elem,this.now,this),n&&n.set?n.set(this):Ln.propHooks._default.set(this),this}},Ln.prototype.init.prototype=Ln.prototype,Ln.propHooks={_default:{get:function(e){var t;return 
null==e.elem[e.prop]||e.elem.style&&null!=e.elem.style[e.prop]?(t=x.css(e.elem,e.prop,""),t&&"auto"!==t?t:0):e.elem[e.prop]},set:function(e){x.fx.step[e.prop]?x.fx.step[e.prop](e):e.elem.style&&(null!=e.elem.style[x.cssProps[e.prop]]||x.cssHooks[e.prop])?x.style(e.elem,e.prop,e.now+e.unit):e.elem[e.prop]=e.now}}},Ln.propHooks.scrollTop=Ln.propHooks.scrollLeft={set:function(e){e.elem.nodeType&&e.elem.parentNode&&(e.elem[e.prop]=e.now)}},x.each(["toggle","show","hide"],function(e,t){var n=x.fn[t];x.fn[t]=function(e,r,i){return null==e||"boolean"==typeof e?n.apply(this,arguments):this.animate(qn(t,!0),e,r,i)}}),x.fn.extend({fadeTo:function(e,t,n,r){return this.filter(Lt).css("opacity",0).show().end().animate({opacity:t},e,n,r)},animate:function(e,t,n,r){var i=x.isEmptyObject(e),o=x.speed(t,n,r),s=function(){var t=jn(this,x.extend({},e),o);(i||q.get(this,"finish"))&&t.stop(!0)};return s.finish=s,i||o.queue===!1?this.each(s):this.queue(o.queue,s)},stop:function(e,t,n){var r=function(e){var t=e.stop;delete e.stop,t(n)};return"string"!=typeof e&&(n=t,t=e,e=undefined),t&&e!==!1&&this.queue(e||"fx",[]),this.each(function(){var t=!0,i=null!=e&&e+"queueHooks",o=x.timers,s=q.get(this);if(i)s[i]&&s[i].stop&&r(s[i]);else for(i in s)s[i]&&s[i].stop&&Cn.test(i)&&r(s[i]);for(i=o.length;i--;)o[i].elem!==this||null!=e&&o[i].queue!==e||(o[i].anim.stop(n),t=!1,o.splice(i,1));(t||!n)&&x.dequeue(this,e)})},finish:function(e){return e!==!1&&(e=e||"fx"),this.each(function(){var t,n=q.get(this),r=n[e+"queue"],i=n[e+"queueHooks"],o=x.timers,s=r?r.length:0;for(n.finish=!0,x.queue(this,e,[]),i&&i.stop&&i.stop.call(this,!0),t=o.length;t--;)o[t].elem===this&&o[t].queue===e&&(o[t].anim.stop(!0),o.splice(t,1));for(t=0;s>t;t++)r[t]&&r[t].finish&&r[t].finish.call(this);delete n.finish})}});function qn(e,t){var n,r={height:e},i=0;for(t=t?1:0;4>i;i+=2-t)n=jt[i],r["margin"+n]=r["padding"+n]=e;return t&&(r.opacity=r.width=e),r}x.each({slideDown:qn("show"),slideUp:qn("hide"),slideToggle:qn("toggle"),fadeIn:{opacity:"show"},fadeOut:{opacity:"hide"},fadeToggle:{opacity:"toggle"}},function(e,t){x.fn[e]=function(e,n,r){return this.animate(t,e,n,r)}}),x.speed=function(e,t,n){var r=e&&"object"==typeof e?x.extend({},e):{complete:n||!n&&t||x.isFunction(e)&&e,duration:e,easing:n&&t||t&&!x.isFunction(t)&&t};return r.duration=x.fx.off?0:"number"==typeof r.duration?r.duration:r.duration in x.fx.speeds?x.fx.speeds[r.duration]:x.fx.speeds._default,(null==r.queue||r.queue===!0)&&(r.queue="fx"),r.old=r.complete,r.complete=function(){x.isFunction(r.old)&&r.old.call(this),r.queue&&x.dequeue(this,r.queue)},r},x.easing={linear:function(e){return e},swing:function(e){return.5-Math.cos(e*Math.PI)/2}},x.timers=[],x.fx=Ln.prototype.init,x.fx.tick=function(){var e,t=x.timers,n=0;for(xn=x.now();t.length>n;n++)e=t[n],e()||t[n]!==e||t.splice(n--,1);t.length||x.fx.stop(),xn=undefined},x.fx.timer=function(e){e()&&x.timers.push(e)&&x.fx.start()},x.fx.interval=13,x.fx.start=function(){bn||(bn=setInterval(x.fx.tick,x.fx.interval))},x.fx.stop=function(){clearInterval(bn),bn=null},x.fx.speeds={slow:600,fast:200,_default:400},x.fx.step={},x.expr&&x.expr.filters&&(x.expr.filters.animated=function(e){return x.grep(x.timers,function(t){return e===t.elem}).length}),x.fn.offset=function(e){if(arguments.length)return e===undefined?this:this.each(function(t){x.offset.setOffset(this,e,t)});var t,n,i=this[0],o={top:0,left:0},s=i&&i.ownerDocument;if(s)return t=s.documentElement,x.contains(t,i)?(typeof 
i.getBoundingClientRect!==r&&(o=i.getBoundingClientRect()),n=Hn(s),{top:o.top+n.pageYOffset-t.clientTop,left:o.left+n.pageXOffset-t.clientLeft}):o},x.offset={setOffset:function(e,t,n){var r,i,o,s,a,u,l,c=x.css(e,"position"),p=x(e),f={};"static"===c&&(e.style.position="relative"),a=p.offset(),o=x.css(e,"top"),u=x.css(e,"left"),l=("absolute"===c||"fixed"===c)&&(o+u).indexOf("auto")>-1,l?(r=p.position(),s=r.top,i=r.left):(s=parseFloat(o)||0,i=parseFloat(u)||0),x.isFunction(t)&&(t=t.call(e,n,a)),null!=t.top&&(f.top=t.top-a.top+s),null!=t.left&&(f.left=t.left-a.left+i),"using"in t?t.using.call(e,f):p.css(f)}},x.fn.extend({position:function(){if(this[0]){var e,t,n=this[0],r={top:0,left:0};return"fixed"===x.css(n,"position")?t=n.getBoundingClientRect():(e=this.offsetParent(),t=this.offset(),x.nodeName(e[0],"html")||(r=e.offset()),r.top+=x.css(e[0],"borderTopWidth",!0),r.left+=x.css(e[0],"borderLeftWidth",!0)),{top:t.top-r.top-x.css(n,"marginTop",!0),left:t.left-r.left-x.css(n,"marginLeft",!0)}}},offsetParent:function(){return this.map(function(){var e=this.offsetParent||s;while(e&&!x.nodeName(e,"html")&&"static"===x.css(e,"position"))e=e.offsetParent;return e||s})}}),x.each({scrollLeft:"pageXOffset",scrollTop:"pageYOffset"},function(t,n){var r="pageYOffset"===n;x.fn[t]=function(i){return x.access(this,function(t,i,o){var s=Hn(t);return o===undefined?s?s[n]:t[i]:(s?s.scrollTo(r?e.pageXOffset:o,r?o:e.pageYOffset):t[i]=o,undefined)},t,i,arguments.length,null)}});function Hn(e){return x.isWindow(e)?e:9===e.nodeType&&e.defaultView}x.each({Height:"height",Width:"width"},function(e,t){x.each({padding:"inner"+e,content:t,"":"outer"+e},function(n,r){x.fn[r]=function(r,i){var o=arguments.length&&(n||"boolean"!=typeof r),s=n||(r===!0||i===!0?"margin":"border");return x.access(this,function(t,n,r){var i;return x.isWindow(t)?t.document.documentElement["client"+e]:9===t.nodeType?(i=t.documentElement,Math.max(t.body["scroll"+e],i["scroll"+e],t.body["offset"+e],i["offset"+e],i["client"+e])):r===undefined?x.css(t,n,s):x.style(t,n,r,s)},t,o?r:undefined,o,null)}})}),x.fn.size=function(){return this.length},x.fn.andSelf=x.fn.addBack,"object"==typeof module&&module&&"object"==typeof module.exports?module.exports=x:"function"==typeof define&&define.amd&&define("jquery",[],function(){return x}),"object"==typeof e&&"object"==typeof e.document&&(e.jQuery=e.$=x)})(window);
diff --git a/bitbake/lib/toaster/toastergui/static/js/jquery-2.0.3.min.map b/bitbake/lib/toaster/toastergui/static/js/jquery-2.0.3.min.map
new file mode 100644
index 0000000..1edecd1
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/jquery-2.0.3.min.map
@@ -0,0 +1 @@
+{"version":3,"file":"jquery-2.0.3.min.js","sources":["jquery-2.0.3.js"],"names":["window","undefined","rootjQuery","readyList","core_strundefined","location","document","docElem","documentElement","_jQuery","jQuery","_$","$","class2type","core_deletedIds","core_version","core_concat","concat","core_push","push","core_slice","slice","core_indexOf","indexOf","core_toString","toString","core_hasOwn","hasOwnProperty","core_trim","trim","selector","context","fn","init","core_pnum","source","core_rnotwhite","rquickExpr","rsingleTag","rmsPrefix","rdashAlpha","fcamelCase","all","letter","toUpperCase","completed","removeEventListener","ready","prototype","jquery","constructor","match","elem","this","charAt","length","exec","find","merge","parseHTML","nodeType","ownerDocument","test","isPlainObject","isFunction","attr","getElementById","parentNode","makeArray","toArray","call","get","num","pushStack","elems","ret","prevObject","each","callback","args","promise","done","apply","arguments","first","eq","last","i","len","j","map","end","sort","splice","extend","options","name","src","copy","copyIsArray","clone","target","deep","isArray","expando","Math","random","replace","noConflict","isReady","readyWait","holdReady","hold","wait","resolveWith","trigger","off","obj","type","Array","isWindow","isNumeric","isNaN","parseFloat","isFinite","String","e","isEmptyObject","error","msg","Error","data","keepScripts","parsed","scripts","createElement","buildFragment","remove","childNodes","parseJSON","JSON","parse","parseXML","xml","tmp","DOMParser","parseFromString","getElementsByTagName","noop","globalEval","code","script","indirect","eval","text","head","appendChild","removeChild","camelCase","string","nodeName","toLowerCase","value","isArraylike","arr","results","Object","inArray","second","l","grep","inv","retVal","arg","guid","proxy","access","key","chainable","emptyGet","raw","bulk","now","Date","swap","old","style","Deferred","readyState","setTimeout","addEventListener","split","support","cachedruns","Expr","getText","isXML","compile","outermostContext","sortInput","setDocument","documentIsHTML","rbuggyQSA","rbuggyMatches","matches","contains","preferredDoc","dirruns","classCache","createCache","tokenCache","compilerCache","hasDuplicate","sortOrder","a","b","strundefined","MAX_NEGATIVE","hasOwn","pop","push_native","booleans","whitespace","characterEncoding","identifier","attributes","pseudos","rtrim","RegExp","rcomma","rcombinators","rsibling","rattributeQuotes","rpseudo","ridentifier","matchExpr","ID","CLASS","TAG","ATTR","PSEUDO","CHILD","bool","needsContext","rnative","rinputs","rheader","rescape","runescape","funescape","_","escaped","escapedWhitespace","high","fromCharCode","els","Sizzle","seed","m","groups","nid","newContext","newSelector","id","getElementsByClassName","qsa","tokenize","getAttribute","setAttribute","toSelector","join","querySelectorAll","qsaError","removeAttribute","select","keys","cache","cacheLength","shift","markFunction","assert","div","addHandle","attrs","handler","attrHandle","siblingCheck","cur","diff","sourceIndex","nextSibling","createInputPseudo","createButtonPseudo","createPositionalPseudo","argument","matchIndexes","node","doc","parent","defaultView","attachEvent","top","className","createComment","innerHTML","firstChild","getById","getElementsByName","filter","attrId","getAttributeNode","tag","input","matchesSelector","webkitMatchesSelector","mozMatchesSelector","oMatchesSelector","msMatchesSelector","disconnectedMatch","compareDocumentPosition","adown","bup","compare"
,"sortDetached","aup","ap","bp","unshift","expr","elements","val","specified","uniqueSort","duplicates","detectDuplicates","sortStable","textContent","nodeValue","selectors","createPseudo","relative",">","dir"," ","+","~","preFilter","excess","unquoted","nodeNameSelector","pattern","operator","check","result","what","simple","forward","ofType","outerCache","nodeIndex","start","useCache","lastChild","pseudo","setFilters","idx","matched","not","matcher","unmatched","has","innerText","lang","elemLang","hash","root","focus","activeElement","hasFocus","href","tabIndex","enabled","disabled","checked","selected","selectedIndex","empty","header","button","even","odd","lt","gt","radio","checkbox","file","password","image","submit","reset","filters","parseOnly","tokens","soFar","preFilters","cached","addCombinator","combinator","base","checkNonElements","doneName","dirkey","elementMatcher","matchers","condense","newUnmatched","mapped","setMatcher","postFilter","postFinder","postSelector","temp","preMap","postMap","preexisting","multipleContexts","matcherIn","matcherOut","matcherFromTokens","checkContext","leadingRelative","implicitRelative","matchContext","matchAnyContext","matcherFromGroupMatchers","elementMatchers","setMatchers","matcherCachedRuns","bySet","byElement","superMatcher","expandContext","setMatched","matchedCount","outermost","contextBackup","dirrunsUnique","group","contexts","token","div1","defaultValue","unique","isXMLDoc","optionsCache","createOptions","object","flag","Callbacks","memory","fired","firing","firingStart","firingLength","firingIndex","list","stack","once","fire","stopOnFalse","self","disable","add","index","lock","locked","fireWith","func","tuples","state","always","deferred","fail","then","fns","newDefer","tuple","action","returned","resolve","reject","progress","notify","pipe","stateString","when","subordinate","resolveValues","remaining","updateFunc","values","progressValues","notifyWith","progressContexts","resolveContexts","fragment","createDocumentFragment","opt","checkOn","optSelected","reliableMarginRight","boxSizingReliable","pixelPosition","noCloneChecked","cloneNode","optDisabled","radioValue","checkClone","focusinBubbles","backgroundClip","clearCloneStyle","container","marginDiv","divReset","body","cssText","zoom","boxSizing","offsetWidth","getComputedStyle","width","marginRight","data_user","data_priv","rbrace","rmultiDash","Data","defineProperty","uid","accepts","owner","descriptor","unlock","defineProperties","set","prop","stored","camel","hasData","discard","acceptData","removeData","_data","_removeData","dataAttr","camelKey","queue","dequeue","startLength","hooks","_queueHooks","next","stop","setter","delay","time","fx","speeds","timeout","clearTimeout","clearQueue","count","defer","nodeHook","boolHook","rclass","rreturn","rfocusable","removeAttr","removeProp","propFix","addClass","classes","clazz","proceed","removeClass","toggleClass","stateVal","classNames","hasClass","valHooks","option","one","max","optionSet","nType","attrHooks","propName","attrNames","for","class","notxml","propHooks","hasAttribute","getter","rkeyEvent","rmouseEvent","rfocusMorph","rtypenamespace","returnTrue","returnFalse","safeActiveElement","err","event","global","types","handleObjIn","eventHandle","events","t","handleObj","special","handlers","namespaces","origType","elemData","handle","triggered","dispatch","delegateType","bindType","namespace","delegateCount","setup","mappedTypes","origCount","teardown","removeEvent","onlyHandlers","bubbleType","ontype","eventPath","Event","i
sTrigger","namespace_re","noBubble","parentWindow","isPropagationStopped","preventDefault","isDefaultPrevented","_default","fix","handlerQueue","delegateTarget","preDispatch","currentTarget","isImmediatePropagationStopped","stopPropagation","postDispatch","sel","props","fixHooks","keyHooks","original","which","charCode","keyCode","mouseHooks","eventDoc","pageX","clientX","scrollLeft","clientLeft","pageY","clientY","scrollTop","clientTop","originalEvent","fixHook","load","blur","click","beforeunload","returnValue","simulate","bubble","isSimulated","defaultPrevented","getPreventDefault","timeStamp","stopImmediatePropagation","mouseenter","mouseleave","orig","related","relatedTarget","attaches","on","origFn","triggerHandler","isSimple","rparentsprev","rneedsContext","guaranteedUnique","children","contents","prev","targets","winnow","is","closest","pos","prevAll","addBack","sibling","parents","parentsUntil","until","nextAll","nextUntil","prevUntil","siblings","contentDocument","reverse","truncate","n","qualifier","rxhtmlTag","rtagName","rhtml","rnoInnerhtml","manipulation_rcheckableType","rchecked","rscriptType","rscriptTypeMasked","rcleanScript","wrapMap","thead","col","tr","td","optgroup","tbody","tfoot","colgroup","caption","th","append","createTextNode","domManip","manipulationTarget","prepend","insertBefore","before","after","keepData","cleanData","getAll","setGlobalEval","dataAndEvents","deepDataAndEvents","html","replaceWith","detach","allowIntersection","hasScripts","iNoClone","disableScript","restoreScript","_evalUrl","appendTo","prependTo","insertAfter","replaceAll","insert","srcElements","destElements","inPage","fixInput","cloneCopyEvent","selection","wrap","nodes","url","ajax","dataType","async","throws","content","refElements","dest","pdataOld","pdataCur","udataOld","udataCur","wrapAll","firstElementChild","wrapInner","unwrap","curCSS","iframe","rdisplayswap","rmargin","rnumsplit","rnumnonpx","rrelNum","elemdisplay","BODY","cssShow","position","visibility","display","cssNormalTransform","letterSpacing","fontWeight","cssExpand","cssPrefixes","vendorPropName","capName","origName","isHidden","el","css","getStyles","showHide","show","hidden","css_defaultDisplay","styles","hide","toggle","cssHooks","opacity","computed","cssNumber","columnCount","fillOpacity","lineHeight","order","orphans","widows","zIndex","cssProps","float","extra","_computed","minWidth","maxWidth","getPropertyValue","setPositiveNumber","subtract","augmentWidthOrHeight","isBorderBox","getWidthOrHeight","valueIsBorderBox","offsetHeight","actualDisplay","contentWindow","write","close","visible","margin","padding","border","prefix","suffix","expand","expanded","parts","r20","rbracket","rCRLF","rsubmitterTypes","rsubmittable","serialize","param","serializeArray","traditional","s","encodeURIComponent","ajaxSettings","buildParams","v","hover","fnOver","fnOut","bind","unbind","delegate","undelegate","ajaxLocParts","ajaxLocation","ajax_nonce","ajax_rquery","rhash","rts","rheaders","rlocalProtocol","rnoContent","rprotocol","rurl","_load","prefilters","transports","allTypes","addToPrefiltersOrTransports","structure","dataTypeExpression","dataTypes","inspectPrefiltersOrTransports","originalOptions","jqXHR","inspected","seekingTransport","inspect","prefilterOrFactory","dataTypeOrTransport","ajaxExtend","flatOptions","params","response","responseText","complete","status","active","lastModified","etag","isLocal","processData","contentType","*","json","responseFields","converters","* text","text html","text json","text 
xml","ajaxSetup","settings","ajaxPrefilter","ajaxTransport","transport","cacheURL","responseHeadersString","responseHeaders","timeoutTimer","fireGlobals","callbackContext","globalEventContext","completeDeferred","statusCode","requestHeaders","requestHeadersNames","strAbort","getResponseHeader","getAllResponseHeaders","setRequestHeader","lname","overrideMimeType","mimeType","abort","statusText","finalText","success","method","crossDomain","hasContent","ifModified","headers","beforeSend","send","nativeStatusText","responses","isSuccess","modified","ajaxHandleResponses","ajaxConvert","rejectWith","getJSON","getScript","ct","finalDataType","firstDataType","conv2","current","conv","dataFilter","text script","charset","scriptCharset","evt","oldCallbacks","rjsonp","jsonp","jsonpCallback","originalSettings","callbackName","overwritten","responseContainer","jsonProp","xhr","XMLHttpRequest","xhrSupported","xhrSuccessStatus",1223,"xhrId","xhrCallbacks","ActiveXObject","cors","open","username","xhrFields","onload","onerror","fxNow","timerId","rfxtypes","rfxnum","rrun","animationPrefilters","defaultPrefilter","tweeners","tween","createTween","unit","scale","maxIterations","createFxNow","animation","collection","Animation","properties","stopped","tick","currentTime","startTime","duration","percent","tweens","run","opts","specialEasing","originalProperties","Tween","easing","gotoEnd","propFilter","timer","anim","tweener","prefilter","oldfire","dataShow","unqueued","overflow","overflowX","overflowY","eased","step","cssFn","speed","animate","genFx","fadeTo","to","optall","doAnimation","finish","stopQueue","timers","includeWidth","height","slideDown","slideUp","slideToggle","fadeIn","fadeOut","fadeToggle","linear","p","swing","cos","PI","interval","setInterval","clearInterval","slow","fast","animated","offset","setOffset","win","box","left","getBoundingClientRect","getWindow","pageYOffset","pageXOffset","curPosition","curLeft","curCSSTop","curTop","curOffset","curCSSLeft","calculatePosition","curElem","using","offsetParent","parentOffset","scrollTo","Height","Width","defaultExtra","funcName","size","andSelf","module","exports","define","amd"],"mappings":";;;CAaA,SAAWA,EAAQC,WAOnB,GAECC,GAGAC,EAIAC,QAA2BH,WAG3BI,EAAWL,EAAOK,SAClBC,EAAWN,EAAOM,SAClBC,EAAUD,EAASE,gBAGnBC,EAAUT,EAAOU,OAGjBC,EAAKX,EAAOY,EAGZC,KAGAC,KAEAC,EAAe,QAGfC,EAAcF,EAAgBG,OAC9BC,EAAYJ,EAAgBK,KAC5BC,EAAaN,EAAgBO,MAC7BC,EAAeR,EAAgBS,QAC/BC,EAAgBX,EAAWY,SAC3BC,EAAcb,EAAWc,eACzBC,EAAYb,EAAac,KAGzBnB,EAAS,SAAUoB,EAAUC,GAE5B,MAAO,IAAIrB,GAAOsB,GAAGC,KAAMH,EAAUC,EAAS7B,IAI/CgC,EAAY,sCAAsCC,OAGlDC,EAAiB,OAKjBC,EAAa,sCAGbC,EAAa,6BAGbC,EAAY,QACZC,EAAa,eAGbC,EAAa,SAAUC,EAAKC,GAC3B,MAAOA,GAAOC,eAIfC,EAAY,WACXvC,EAASwC,oBAAqB,mBAAoBD,GAAW,GAC7D7C,EAAO8C,oBAAqB,OAAQD,GAAW,GAC/CnC,EAAOqC,QAGTrC,GAAOsB,GAAKtB,EAAOsC,WAElBC,OAAQlC,EAERmC,YAAaxC,EACbuB,KAAM,SAAUH,EAAUC,EAAS7B,GAClC,GAAIiD,GAAOC,CAGX,KAAMtB,EACL,MAAOuB,KAIR,IAAyB,gBAAbvB,GAAwB,CAUnC,GAPCqB,EAF2B,MAAvBrB,EAASwB,OAAO,IAAyD,MAA3CxB,EAASwB,OAAQxB,EAASyB,OAAS,IAAezB,EAASyB,QAAU,GAE7F,KAAMzB,EAAU,MAGlBO,EAAWmB,KAAM1B,IAIrBqB,IAAUA,EAAM,IAAOpB,EA+CrB,OAAMA,GAAWA,EAAQkB,QACtBlB,GAAW7B,GAAauD,KAAM3B,GAKhCuB,KAAKH,YAAanB,GAAU0B,KAAM3B,EAlDzC,IAAKqB,EAAM,GAAK,CAWf,GAVApB,EAAUA,YAAmBrB,GAASqB,EAAQ,GAAKA,EAGnDrB,EAAOgD,MAAOL,KAAM3C,EAAOiD,UAC1BR,EAAM,GACNpB,GAAWA,EAAQ6B,SAAW7B,EAAQ8B,eAAiB9B,EAAUzB,GACjE,IAIIgC,EAAWwB,KAAMX,EAAM,KAAQzC,EAAOqD,cAAehC,GACzD,IAAMoB,IAASpB,GAETrB,EAAOsD,WAAYX,KAAMF,IAC7BE,KAAMF,GAASpB,EAASoB,IAIxBE,KAAKY,KAAMd,EAAOpB,EAASoB,GAK9B,OAAOE,MAgBP,MAZAD,GAAO9C,EAAS4D,eAAgBf,EAAM,IA
IjCC,GAAQA,EAAKe,aAEjBd,KAAKE,OAAS,EACdF,KAAK,GAAKD,GAGXC,KAAKtB,QAAUzB,EACf+C,KAAKvB,SAAWA,EACTuB,KAcH,MAAKvB,GAAS8B,UACpBP,KAAKtB,QAAUsB,KAAK,GAAKvB,EACzBuB,KAAKE,OAAS,EACPF,MAII3C,EAAOsD,WAAYlC,GACvB5B,EAAW6C,MAAOjB,IAGrBA,EAASA,WAAa7B,YAC1BoD,KAAKvB,SAAWA,EAASA,SACzBuB,KAAKtB,QAAUD,EAASC,SAGlBrB,EAAO0D,UAAWtC,EAAUuB,QAIpCvB,SAAU,GAGVyB,OAAQ,EAERc,QAAS,WACR,MAAOjD,GAAWkD,KAAMjB,OAKzBkB,IAAK,SAAUC,GACd,MAAc,OAAPA,EAGNnB,KAAKgB,UAGG,EAANG,EAAUnB,KAAMA,KAAKE,OAASiB,GAAQnB,KAAMmB,IAKhDC,UAAW,SAAUC,GAGpB,GAAIC,GAAMjE,EAAOgD,MAAOL,KAAKH,cAAewB,EAO5C,OAJAC,GAAIC,WAAavB,KACjBsB,EAAI5C,QAAUsB,KAAKtB,QAGZ4C,GAMRE,KAAM,SAAUC,EAAUC,GACzB,MAAOrE,GAAOmE,KAAMxB,KAAMyB,EAAUC,IAGrChC,MAAO,SAAUf,GAIhB,MAFAtB,GAAOqC,MAAMiC,UAAUC,KAAMjD,GAEtBqB,MAGRhC,MAAO,WACN,MAAOgC,MAAKoB,UAAWrD,EAAW8D,MAAO7B,KAAM8B,aAGhDC,MAAO,WACN,MAAO/B,MAAKgC,GAAI,IAGjBC,KAAM,WACL,MAAOjC,MAAKgC,GAAI,KAGjBA,GAAI,SAAUE,GACb,GAAIC,GAAMnC,KAAKE,OACdkC,GAAKF,GAAU,EAAJA,EAAQC,EAAM,EAC1B,OAAOnC,MAAKoB,UAAWgB,GAAK,GAASD,EAAJC,GAAYpC,KAAKoC,SAGnDC,IAAK,SAAUZ,GACd,MAAOzB,MAAKoB,UAAW/D,EAAOgF,IAAIrC,KAAM,SAAUD,EAAMmC,GACvD,MAAOT,GAASR,KAAMlB,EAAMmC,EAAGnC,OAIjCuC,IAAK,WACJ,MAAOtC,MAAKuB,YAAcvB,KAAKH,YAAY,OAK5C/B,KAAMD,EACN0E,QAASA,KACTC,UAAWA,QAIZnF,EAAOsB,GAAGC,KAAKe,UAAYtC,EAAOsB,GAElCtB,EAAOoF,OAASpF,EAAOsB,GAAG8D,OAAS,WAClC,GAAIC,GAASC,EAAMC,EAAKC,EAAMC,EAAaC,EAC1CC,EAASlB,UAAU,OACnBI,EAAI,EACJhC,EAAS4B,UAAU5B,OACnB+C,GAAO,CAqBR,KAlBuB,iBAAXD,KACXC,EAAOD,EACPA,EAASlB,UAAU,OAEnBI,EAAI,GAIkB,gBAAXc,IAAwB3F,EAAOsD,WAAWqC,KACrDA,MAII9C,IAAWgC,IACfc,EAAShD,OACPkC,GAGShC,EAAJgC,EAAYA,IAEnB,GAAmC,OAA7BQ,EAAUZ,UAAWI,IAE1B,IAAMS,IAAQD,GACbE,EAAMI,EAAQL,GACdE,EAAOH,EAASC,GAGXK,IAAWH,IAKXI,GAAQJ,IAAUxF,EAAOqD,cAAcmC,KAAUC,EAAczF,EAAO6F,QAAQL,MAC7EC,GACJA,GAAc,EACdC,EAAQH,GAAOvF,EAAO6F,QAAQN,GAAOA,MAGrCG,EAAQH,GAAOvF,EAAOqD,cAAckC,GAAOA,KAI5CI,EAAQL,GAAStF,EAAOoF,OAAQQ,EAAMF,EAAOF,IAGlCA,IAASjG,YACpBoG,EAAQL,GAASE,GAOrB,OAAOG,IAGR3F,EAAOoF,QAENU,QAAS,UAAazF,EAAe0F,KAAKC,UAAWC,QAAS,MAAO,IAErEC,WAAY,SAAUN,GASrB,MARKtG,GAAOY,IAAMF,IACjBV,EAAOY,EAAID,GAGP2F,GAAQtG,EAAOU,SAAWA,IAC9BV,EAAOU,OAASD,GAGVC,GAIRmG,SAAS,EAITC,UAAW,EAGXC,UAAW,SAAUC,GACfA,EACJtG,EAAOoG,YAEPpG,EAAOqC,OAAO,IAKhBA,MAAO,SAAUkE,IAGXA,KAAS,IAASvG,EAAOoG,UAAYpG,EAAOmG,WAKjDnG,EAAOmG,SAAU,EAGZI,KAAS,KAAUvG,EAAOoG,UAAY,IAK3C3G,EAAU+G,YAAa5G,GAAYI,IAG9BA,EAAOsB,GAAGmF,SACdzG,EAAQJ,GAAW6G,QAAQ,SAASC,IAAI,YAO1CpD,WAAY,SAAUqD,GACrB,MAA4B,aAArB3G,EAAO4G,KAAKD,IAGpBd,QAASgB,MAAMhB,QAEfiB,SAAU,SAAUH,GACnB,MAAc,OAAPA,GAAeA,IAAQA,EAAIrH,QAGnCyH,UAAW,SAAUJ,GACpB,OAAQK,MAAOC,WAAWN,KAAUO,SAAUP,IAG/CC,KAAM,SAAUD,GACf,MAAY,OAAPA,EACWA,EAARQ,GAGc,gBAARR,IAAmC,kBAARA,GACxCxG,EAAYW,EAAc8C,KAAK+C,KAAU,eAClCA,IAGTtD,cAAe,SAAUsD,GAKxB,GAA4B,WAAvB3G,EAAO4G,KAAMD,IAAsBA,EAAIzD,UAAYlD,EAAO8G,SAAUH,GACxE,OAAO,CAOR,KACC,GAAKA,EAAInE,cACNxB,EAAY4C,KAAM+C,EAAInE,YAAYF,UAAW,iBAC/C,OAAO,EAEP,MAAQ8E,GACT,OAAO,EAKR,OAAO,GAGRC,cAAe,SAAUV,GACxB,GAAIrB,EACJ,KAAMA,IAAQqB,GACb,OAAO,CAER,QAAO,GAGRW,MAAO,SAAUC,GAChB,KAAUC,OAAOD,IAMlBtE,UAAW,SAAUwE,EAAMpG,EAASqG,GACnC,IAAMD,GAAwB,gBAATA,GACpB,MAAO,KAEgB,kBAAZpG,KACXqG,EAAcrG,EACdA,GAAU,GAEXA,EAAUA,GAAWzB,CAErB,IAAI+H,GAAS/F,EAAWkB,KAAM2E,GAC7BG,GAAWF,KAGZ,OAAKC,IACKtG,EAAQwG,cAAeF,EAAO,MAGxCA,EAAS3H,EAAO8H,eAAiBL,GAAQpG,EAASuG,GAE7CA,GACJ5H,EAAQ4H,GAAUG,SAGZ/H,EAAOgD,SAAW2E,EAAOK,cAGjCC,UAAWC,KAAKC,MAGhBC,SAAU,SAAUX,GACnB,GAAIY,GAAKC,CACT,KAAMb,GAAwB,gBAATA,GACpB,MAAO,KAIR,KACCa,EAAM,GAAIC,WACVF,EAAMC,EAAIE,gBAAiBf,EAAO,YACjC,MAAQL,GACTiB,EAAM9I,UAMP,QAHM8I,GAAOA,EAAII,qBAAsB,eAAgB5F,SACtD7C,EAAOsH,MAAO,gBAAkBG,GAE1BY,GAGRK,KAAM,aAGNC,WAAY,SAAUC,GACrB,GAAIC,GACFC,EAAWC,IAEbH,GAAO
5I,EAAOmB,KAAMyH,GAEfA,IAIgC,IAA/BA,EAAK/H,QAAQ,eACjBgI,EAASjJ,EAASiI,cAAc,UAChCgB,EAAOG,KAAOJ,EACdhJ,EAASqJ,KAAKC,YAAaL,GAASpF,WAAW0F,YAAaN,IAI5DC,EAAUF,KAObQ,UAAW,SAAUC,GACpB,MAAOA,GAAOpD,QAASpE,EAAW,OAAQoE,QAASnE,EAAYC,IAGhEuH,SAAU,SAAU5G,EAAM4C,GACzB,MAAO5C,GAAK4G,UAAY5G,EAAK4G,SAASC,gBAAkBjE,EAAKiE,eAI9DpF,KAAM,SAAUwC,EAAKvC,EAAUC,GAC9B,GAAImF,GACH3E,EAAI,EACJhC,EAAS8D,EAAI9D,OACbgD,EAAU4D,EAAa9C,EAExB,IAAKtC,GACJ,GAAKwB,GACJ,KAAYhD,EAAJgC,EAAYA,IAGnB,GAFA2E,EAAQpF,EAASI,MAAOmC,EAAK9B,GAAKR,GAE7BmF,KAAU,EACd,UAIF,KAAM3E,IAAK8B,GAGV,GAFA6C,EAAQpF,EAASI,MAAOmC,EAAK9B,GAAKR,GAE7BmF,KAAU,EACd,UAOH,IAAK3D,GACJ,KAAYhD,EAAJgC,EAAYA,IAGnB,GAFA2E,EAAQpF,EAASR,KAAM+C,EAAK9B,GAAKA,EAAG8B,EAAK9B,IAEpC2E,KAAU,EACd,UAIF,KAAM3E,IAAK8B,GAGV,GAFA6C,EAAQpF,EAASR,KAAM+C,EAAK9B,GAAKA,EAAG8B,EAAK9B,IAEpC2E,KAAU,EACd,KAMJ,OAAO7C,IAGRxF,KAAM,SAAU6H,GACf,MAAe,OAARA,EAAe,GAAK9H,EAAU0C,KAAMoF,IAI5CtF,UAAW,SAAUgG,EAAKC,GACzB,GAAI1F,GAAM0F,KAaV,OAXY,OAAPD,IACCD,EAAaG,OAAOF,IACxB1J,EAAOgD,MAAOiB,EACE,gBAARyF,IACLA,GAAQA,GAGXlJ,EAAUoD,KAAMK,EAAKyF,IAIhBzF,GAGR4F,QAAS,SAAUnH,EAAMgH,EAAK7E,GAC7B,MAAc,OAAP6E,EAAc,GAAK9I,EAAagD,KAAM8F,EAAKhH,EAAMmC,IAGzD7B,MAAO,SAAU0B,EAAOoF,GACvB,GAAIC,GAAID,EAAOjH,OACdgC,EAAIH,EAAM7B,OACVkC,EAAI,CAEL,IAAkB,gBAANgF,GACX,KAAYA,EAAJhF,EAAOA,IACdL,EAAOG,KAAQiF,EAAQ/E,OAGxB,OAAQ+E,EAAO/E,KAAOxF,UACrBmF,EAAOG,KAAQiF,EAAQ/E,IAMzB,OAFAL,GAAM7B,OAASgC,EAERH,GAGRsF,KAAM,SAAUhG,EAAOI,EAAU6F,GAChC,GAAIC,GACHjG,KACAY,EAAI,EACJhC,EAASmB,EAAMnB,MAKhB,KAJAoH,IAAQA,EAIIpH,EAAJgC,EAAYA,IACnBqF,IAAW9F,EAAUJ,EAAOa,GAAKA,GAC5BoF,IAAQC,GACZjG,EAAIxD,KAAMuD,EAAOa,GAInB,OAAOZ,IAIRe,IAAK,SAAUhB,EAAOI,EAAU+F,GAC/B,GAAIX,GACH3E,EAAI,EACJhC,EAASmB,EAAMnB,OACfgD,EAAU4D,EAAazF,GACvBC,IAGD,IAAK4B,EACJ,KAAYhD,EAAJgC,EAAYA,IACnB2E,EAAQpF,EAAUJ,EAAOa,GAAKA,EAAGsF,GAEnB,MAATX,IACJvF,EAAKA,EAAIpB,QAAW2G,OAMtB,KAAM3E,IAAKb,GACVwF,EAAQpF,EAAUJ,EAAOa,GAAKA,EAAGsF,GAEnB,MAATX,IACJvF,EAAKA,EAAIpB,QAAW2G,EAMvB,OAAOlJ,GAAYkE,SAAWP,IAI/BmG,KAAM,EAINC,MAAO,SAAU/I,EAAID,GACpB,GAAIiH,GAAKjE,EAAMgG,CAUf,OARwB,gBAAZhJ,KACXiH,EAAMhH,EAAID,GACVA,EAAUC,EACVA,EAAKgH,GAKAtI,EAAOsD,WAAYhC,IAKzB+C,EAAO3D,EAAWkD,KAAMa,UAAW,GACnC4F,EAAQ,WACP,MAAO/I,GAAGkD,MAAOnD,GAAWsB,KAAM0B,EAAK9D,OAAQG,EAAWkD,KAAMa,cAIjE4F,EAAMD,KAAO9I,EAAG8I,KAAO9I,EAAG8I,MAAQpK,EAAOoK,OAElCC,GAZC9K,WAiBT+K,OAAQ,SAAUtG,EAAO1C,EAAIiJ,EAAKf,EAAOgB,EAAWC,EAAUC,GAC7D,GAAI7F,GAAI,EACPhC,EAASmB,EAAMnB,OACf8H,EAAc,MAAPJ,CAGR,IAA4B,WAAvBvK,EAAO4G,KAAM2D,GAAqB,CACtCC,GAAY,CACZ,KAAM3F,IAAK0F,GACVvK,EAAOsK,OAAQtG,EAAO1C,EAAIuD,EAAG0F,EAAI1F,IAAI,EAAM4F,EAAUC,OAIhD,IAAKlB,IAAUjK,YACrBiL,GAAY,EAENxK,EAAOsD,WAAYkG,KACxBkB,GAAM,GAGFC,IAECD,GACJpJ,EAAGsC,KAAMI,EAAOwF,GAChBlI,EAAK,OAILqJ,EAAOrJ,EACPA,EAAK,SAAUoB,EAAM6H,EAAKf,GACzB,MAAOmB,GAAK/G,KAAM5D,EAAQ0C,GAAQ8G,MAKhClI,GACJ,KAAYuB,EAAJgC,EAAYA,IACnBvD,EAAI0C,EAAMa,GAAI0F,EAAKG,EAAMlB,EAAQA,EAAM5F,KAAMI,EAAMa,GAAIA,EAAGvD,EAAI0C,EAAMa,GAAI0F,IAK3E,OAAOC,GACNxG,EAGA2G,EACCrJ,EAAGsC,KAAMI,GACTnB,EAASvB,EAAI0C,EAAM,GAAIuG,GAAQE,GAGlCG,IAAKC,KAAKD,IAKVE,KAAM,SAAUpI,EAAM2C,EAASjB,EAAUC,GACxC,GAAIJ,GAAKqB,EACRyF,IAGD,KAAMzF,IAAQD,GACb0F,EAAKzF,GAAS5C,EAAKsI,MAAO1F,GAC1B5C,EAAKsI,MAAO1F,GAASD,EAASC,EAG/BrB,GAAMG,EAASI,MAAO9B,EAAM2B,MAG5B,KAAMiB,IAAQD,GACb3C,EAAKsI,MAAO1F,GAASyF,EAAKzF,EAG3B,OAAOrB,MAITjE,EAAOqC,MAAMiC,QAAU,SAAUqC,GAqBhC,MApBMlH,KAELA,EAAYO,EAAOiL,WAKU,aAAxBrL,EAASsL,WAEbC,WAAYnL,EAAOqC,QAKnBzC,EAASwL,iBAAkB,mBAAoBjJ,GAAW,GAG1D7C,EAAO8L,iBAAkB,OAAQjJ,GAAW,KAGvC1C,EAAU6E,QAASqC,IAI3B3G,EAAOmE,KAAK,gEAAgEkH,MAAM,KAAM,SAASxG,EAAGS,GACnGnF,EAAY,WAAamF,EAAO,KAAQA,EAAKiE,eAG9C,SAASE,GAAa9C,GACrB,GAAI9D,GAAS8D,EAAI9D,OAChB+D,EAAO5G
,EAAO4G,KAAMD,EAErB,OAAK3G,GAAO8G,SAAUH,IACd,EAGc,IAAjBA,EAAIzD,UAAkBL,GACnB,EAGQ,UAAT+D,GAA6B,aAATA,IACb,IAAX/D,GACgB,gBAAXA,IAAuBA,EAAS,GAAOA,EAAS,IAAO8D,IAIhEnH,EAAaQ,EAAOJ,GAWpB,SAAWN,EAAQC,WAEnB,GAAIsF,GACHyG,EACAC,EACAC,EACAC,EACAC,EACAC,EACAC,EACAC,EAGAC,EACAlM,EACAC,EACAkM,EACAC,EACAC,EACAC,EACAC,EAGArG,EAAU,UAAY,GAAK+E,MAC3BuB,EAAe9M,EAAOM,SACtByM,EAAU,EACV9H,EAAO,EACP+H,EAAaC,KACbC,EAAaD,KACbE,EAAgBF,KAChBG,GAAe,EACfC,EAAY,SAAUC,EAAGC,GACxB,MAAKD,KAAMC,GACVH,GAAe,EACR,GAED,GAIRI,QAAsBvN,WACtBwN,EAAe,GAAK,GAGpBC,KAAc/L,eACdyI,KACAuD,EAAMvD,EAAIuD,IACVC,EAAcxD,EAAIjJ,KAClBA,EAAOiJ,EAAIjJ,KACXE,EAAQ+I,EAAI/I,MAEZE,EAAU6I,EAAI7I,SAAW,SAAU6B,GAClC,GAAImC,GAAI,EACPC,EAAMnC,KAAKE,MACZ,MAAYiC,EAAJD,EAASA,IAChB,GAAKlC,KAAKkC,KAAOnC,EAChB,MAAOmC,EAGT,OAAO,IAGRsI,EAAW,6HAKXC,EAAa,sBAEbC,EAAoB,mCAKpBC,EAAaD,EAAkBpH,QAAS,IAAK,MAG7CsH,EAAa,MAAQH,EAAa,KAAOC,EAAoB,IAAMD,EAClE,mBAAqBA,EAAa,wCAA0CE,EAAa,QAAUF,EAAa,OAQjHI,EAAU,KAAOH,EAAoB,mEAAqEE,EAAWtH,QAAS,EAAG,GAAM,eAGvIwH,EAAYC,OAAQ,IAAMN,EAAa,8BAAgCA,EAAa,KAAM,KAE1FO,EAAaD,OAAQ,IAAMN,EAAa,KAAOA,EAAa,KAC5DQ,EAAmBF,OAAQ,IAAMN,EAAa,WAAaA,EAAa,IAAMA,EAAa,KAE3FS,EAAeH,OAAQN,EAAa,SACpCU,EAAuBJ,OAAQ,IAAMN,EAAa,gBAAkBA,EAAa,OAAQ,KAEzFW,EAAcL,OAAQF,GACtBQ,EAAkBN,OAAQ,IAAMJ,EAAa,KAE7CW,GACCC,GAAUR,OAAQ,MAAQL,EAAoB,KAC9Cc,MAAaT,OAAQ,QAAUL,EAAoB,KACnDe,IAAWV,OAAQ,KAAOL,EAAkBpH,QAAS,IAAK,MAAS,KACnEoI,KAAYX,OAAQ,IAAMH,GAC1Be,OAAcZ,OAAQ,IAAMF,GAC5Be,MAAab,OAAQ,yDAA2DN,EAC/E,+BAAiCA,EAAa,cAAgBA,EAC9D,aAAeA,EAAa,SAAU,KACvCoB,KAAYd,OAAQ,OAASP,EAAW,KAAM,KAG9CsB,aAAoBf,OAAQ,IAAMN,EAAa,mDAC9CA,EAAa,mBAAqBA,EAAa,mBAAoB,MAGrEsB,EAAU,yBAGV/M,EAAa,mCAEbgN,EAAU,sCACVC,GAAU,SAEVC,GAAU,QAGVC,GAAgBpB,OAAQ,qBAAuBN,EAAa,MAAQA,EAAa,OAAQ,MACzF2B,GAAY,SAAUC,EAAGC,EAASC,GACjC,GAAIC,GAAO,KAAOF,EAAU,KAI5B,OAAOE,KAASA,GAAQD,EACvBD,EAEO,EAAPE,EACChI,OAAOiI,aAAcD,EAAO,OAE5BhI,OAAOiI,aAA2B,MAAbD,GAAQ,GAA4B,MAAR,KAAPA,GAI9C,KACC1O,EAAK+D,MACHkF,EAAM/I,EAAMiD,KAAMwI,EAAapE,YAChCoE,EAAapE,YAId0B,EAAK0C,EAAapE,WAAWnF,QAASK,SACrC,MAAQkE,IACT3G,GAAS+D,MAAOkF,EAAI7G,OAGnB,SAAU8C,EAAQ0J,GACjBnC,EAAY1I,MAAOmB,EAAQhF,EAAMiD,KAAKyL,KAKvC,SAAU1J,EAAQ0J,GACjB,GAAItK,GAAIY,EAAO9C,OACdgC,EAAI,CAEL,OAASc,EAAOZ,KAAOsK,EAAIxK,MAC3Bc,EAAO9C,OAASkC,EAAI,IAKvB,QAASuK,IAAQlO,EAAUC,EAASsI,EAAS4F,GAC5C,GAAI9M,GAAOC,EAAM8M,EAAGtM,EAEnB2B,EAAG4K,EAAQ1E,EAAK2E,EAAKC,EAAYC,CASlC,KAPOvO,EAAUA,EAAQ8B,eAAiB9B,EAAU+K,KAAmBxM,GACtEkM,EAAazK,GAGdA,EAAUA,GAAWzB,EACrB+J,EAAUA,OAEJvI,GAAgC,gBAAbA,GACxB,MAAOuI,EAGR,IAAuC,KAAjCzG,EAAW7B,EAAQ6B,WAAgC,IAAbA,EAC3C,QAGD,IAAK6I,IAAmBwD,EAAO,CAG9B,GAAM9M,EAAQd,EAAWmB,KAAM1B,GAE9B,GAAMoO,EAAI/M,EAAM,IACf,GAAkB,IAAbS,EAAiB,CAIrB,GAHAR,EAAOrB,EAAQmC,eAAgBgM,IAG1B9M,IAAQA,EAAKe,WAQjB,MAAOkG,EALP,IAAKjH,EAAKmN,KAAOL,EAEhB,MADA7F,GAAQlJ,KAAMiC,GACPiH,MAOT,IAAKtI,EAAQ8B,gBAAkBT,EAAOrB,EAAQ8B,cAAcK,eAAgBgM,KAC3ErD,EAAU9K,EAASqB,IAAUA,EAAKmN,KAAOL,EAEzC,MADA7F,GAAQlJ,KAAMiC,GACPiH,MAKH,CAAA,GAAKlH,EAAM,GAEjB,MADAhC,GAAK+D,MAAOmF,EAAStI,EAAQoH,qBAAsBrH,IAC5CuI,CAGD,KAAM6F,EAAI/M,EAAM,KAAO6I,EAAQwE,wBAA0BzO,EAAQyO,uBAEvE,MADArP,GAAK+D,MAAOmF,EAAStI,EAAQyO,uBAAwBN,IAC9C7F,EAKT,GAAK2B,EAAQyE,OAAS/D,IAAcA,EAAU5I,KAAMhC,IAAc,CASjE,GARAsO,EAAM3E,EAAMjF,EACZ6J,EAAatO,EACbuO,EAA2B,IAAb1M,GAAkB9B,EAMd,IAAb8B,GAAqD,WAAnC7B,EAAQiI,SAASC,cAA6B,CACpEkG,EAASO,GAAU5O,IAEb2J,EAAM1J,EAAQ4O,aAAa,OAChCP,EAAM3E,EAAI9E,QAAS4I,GAAS,QAE5BxN,EAAQ6O,aAAc,KAAMR,GAE7BA,EAAM,QAAUA,EAAM,MAEtB7K,EAAI4K,EAAO5M,MACX,OAAQgC,IACP4K,EAAO5K,GAAK6K,EAAMS,GAAYV,EAAO5K,GAEtC8K,GAAa9B,EAASzK,KAAMhC,IAAcC,EAAQoC,YAAcpC,EAChEuO,EAAcH,EAAOW,KAAK,KAG3B,GAAKR,EACJ,IAIC,MAHAnP,GAAK+D,MAAOmF,EACXgG,EAAWU,iBAAkBT,IAEvBjG,EACN,MAAM2G,IA
CN,QACKvF,GACL1J,EAAQkP,gBAAgB,QAQ7B,MAAOC,IAAQpP,EAAS6E,QAASwH,EAAO,MAAQpM,EAASsI,EAAS4F,GASnE,QAAShD,MACR,GAAIkE,KAEJ,SAASC,GAAOnG,EAAKf,GAMpB,MAJKiH,GAAKhQ,KAAM8J,GAAO,KAAQiB,EAAKmF,mBAE5BD,GAAOD,EAAKG,SAEZF,EAAOnG,GAAQf,EAExB,MAAOkH,GAOR,QAASG,IAAcvP,GAEtB,MADAA,GAAIwE,IAAY,EACTxE,EAOR,QAASwP,IAAQxP,GAChB,GAAIyP,GAAMnR,EAASiI,cAAc,MAEjC,KACC,QAASvG,EAAIyP,GACZ,MAAO3J,GACR,OAAO,EACN,QAEI2J,EAAItN,YACRsN,EAAItN,WAAW0F,YAAa4H,GAG7BA,EAAM,MASR,QAASC,IAAWC,EAAOC,GAC1B,GAAIxH,GAAMuH,EAAM5F,MAAM,KACrBxG,EAAIoM,EAAMpO,MAEX,OAAQgC,IACP2G,EAAK2F,WAAYzH,EAAI7E,IAAOqM,EAU9B,QAASE,IAAcxE,EAAGC,GACzB,GAAIwE,GAAMxE,GAAKD,EACd0E,EAAOD,GAAsB,IAAfzE,EAAE1J,UAAiC,IAAf2J,EAAE3J,YAChC2J,EAAE0E,aAAexE,KACjBH,EAAE2E,aAAexE,EAGtB,IAAKuE,EACJ,MAAOA,EAIR,IAAKD,EACJ,MAASA,EAAMA,EAAIG,YAClB,GAAKH,IAAQxE,EACZ,MAAO,EAKV,OAAOD,GAAI,EAAI,GAOhB,QAAS6E,IAAmB7K,GAC3B,MAAO,UAAUlE,GAChB,GAAI4C,GAAO5C,EAAK4G,SAASC,aACzB,OAAgB,UAATjE,GAAoB5C,EAAKkE,OAASA,GAQ3C,QAAS8K,IAAoB9K,GAC5B,MAAO,UAAUlE,GAChB,GAAI4C,GAAO5C,EAAK4G,SAASC,aACzB,QAAiB,UAATjE,GAA6B,WAATA,IAAsB5C,EAAKkE,OAASA,GAQlE,QAAS+K,IAAwBrQ,GAChC,MAAOuP,IAAa,SAAUe,GAE7B,MADAA,IAAYA,EACLf,GAAa,SAAUtB,EAAMrD,GACnC,GAAInH,GACH8M,EAAevQ,KAAQiO,EAAK1M,OAAQ+O,GACpC/M,EAAIgN,EAAahP,MAGlB,OAAQgC,IACF0K,EAAOxK,EAAI8M,EAAahN,MAC5B0K,EAAKxK,KAAOmH,EAAQnH,GAAKwK,EAAKxK,SAWnC2G,EAAQ4D,GAAO5D,MAAQ,SAAUhJ,GAGhC,GAAI5C,GAAkB4C,IAASA,EAAKS,eAAiBT,GAAM5C,eAC3D,OAAOA,GAA+C,SAA7BA,EAAgBwJ,UAAsB,GAIhEgC,EAAUgE,GAAOhE,WAOjBQ,EAAcwD,GAAOxD,YAAc,SAAUgG,GAC5C,GAAIC,GAAMD,EAAOA,EAAK3O,eAAiB2O,EAAO1F,EAC7C4F,EAASD,EAAIE,WAGd,OAAKF,KAAQnS,GAA6B,IAAjBmS,EAAI7O,UAAmB6O,EAAIjS,iBAKpDF,EAAWmS,EACXlS,EAAUkS,EAAIjS,gBAGdiM,GAAkBL,EAAOqG,GAMpBC,GAAUA,EAAOE,aAAeF,IAAWA,EAAOG,KACtDH,EAAOE,YAAa,iBAAkB,WACrCpG,MASFR,EAAQiC,WAAauD,GAAO,SAAUC,GAErC,MADAA,GAAIqB,UAAY,KACRrB,EAAId,aAAa,eAO1B3E,EAAQ7C,qBAAuBqI,GAAO,SAAUC,GAE/C,MADAA,GAAI7H,YAAa6I,EAAIM,cAAc,MAC3BtB,EAAItI,qBAAqB,KAAK5F,SAIvCyI,EAAQwE,uBAAyBgB,GAAO,SAAUC,GAQjD,MAPAA,GAAIuB,UAAY,+CAIhBvB,EAAIwB,WAAWH,UAAY,IAGuB,IAA3CrB,EAAIjB,uBAAuB,KAAKjN,SAOxCyI,EAAQkH,QAAU1B,GAAO,SAAUC,GAElC,MADAlR,GAAQqJ,YAAa6H,GAAMlB,GAAK/J,GACxBiM,EAAIU,oBAAsBV,EAAIU,kBAAmB3M,GAAUjD,SAI/DyI,EAAQkH,SACZhH,EAAKzI,KAAS,GAAI,SAAU8M,EAAIxO,GAC/B,SAAYA,GAAQmC,iBAAmBsJ,GAAgBf,EAAiB,CACvE,GAAIyD,GAAInO,EAAQmC,eAAgBqM,EAGhC,OAAOL,IAAKA,EAAE/L,YAAc+L,QAG9BhE,EAAKkH,OAAW,GAAI,SAAU7C,GAC7B,GAAI8C,GAAS9C,EAAG5J,QAAS6I,GAAWC,GACpC,OAAO,UAAUrM,GAChB,MAAOA,GAAKuN,aAAa,QAAU0C,YAM9BnH,GAAKzI,KAAS,GAErByI,EAAKkH,OAAW,GAAK,SAAU7C,GAC9B,GAAI8C,GAAS9C,EAAG5J,QAAS6I,GAAWC,GACpC,OAAO,UAAUrM,GAChB,GAAIoP,SAAcpP,GAAKkQ,mBAAqB9F,GAAgBpK,EAAKkQ,iBAAiB,KAClF,OAAOd,IAAQA,EAAKtI,QAAUmJ,KAMjCnH,EAAKzI,KAAU,IAAIuI,EAAQ7C,qBAC1B,SAAUoK,EAAKxR,GACd,aAAYA,GAAQoH,uBAAyBqE,EACrCzL,EAAQoH,qBAAsBoK,GADtC,WAID,SAAUA,EAAKxR,GACd,GAAIqB,GACH4F,KACAzD,EAAI,EACJ8E,EAAUtI,EAAQoH,qBAAsBoK,EAGzC,IAAa,MAARA,EAAc,CAClB,MAASnQ,EAAOiH,EAAQ9E,KACA,IAAlBnC,EAAKQ,UACToF,EAAI7H,KAAMiC,EAIZ,OAAO4F,GAER,MAAOqB,IAIT6B,EAAKzI,KAAY,MAAIuI,EAAQwE,wBAA0B,SAAUsC,EAAW/Q,GAC3E,aAAYA,GAAQyO,yBAA2BhD,GAAgBf,EACvD1K,EAAQyO,uBAAwBsC,GADxC,WAWDnG,KAOAD,MAEMV,EAAQyE,IAAMrB,EAAQtL,KAAM2O,EAAI1B,qBAGrCS,GAAO,SAAUC,GAMhBA,EAAIuB,UAAY,iDAIVvB,EAAIV,iBAAiB,cAAcxN,QACxCmJ,EAAUvL,KAAM,MAAQ2M,EAAa,aAAeD,EAAW,KAM1D4D,EAAIV,iBAAiB,YAAYxN,QACtCmJ,EAAUvL,KAAK,cAIjBqQ,GAAO,SAAUC,GAOhB,GAAI+B,GAAQf,EAAIlK,cAAc,QAC9BiL,GAAM5C,aAAc,OAAQ,UAC5Ba,EAAI7H,YAAa4J,GAAQ5C,aAAc,IAAK,IAEvCa,EAAIV,iBAAiB,WAAWxN,QACpCmJ,EAAUvL,KAAM,SAAW2M,EAAa,gBAKnC2D,EAAIV,iBAAiB,YAAYxN,QACtCmJ,EAAUvL,KAAM,WAAY,aAI7BsQ,EAAIV,iBAAiB,QACrBrE,EAAUvL,KAAK,YAIX6K,EAAQyH,gBAAkBrE,EAAQtL,KAAO8I
,EAAUrM,EAAQmT,uBAChEnT,EAAQoT,oBACRpT,EAAQqT,kBACRrT,EAAQsT,qBAERrC,GAAO,SAAUC,GAGhBzF,EAAQ8H,kBAAoBlH,EAAQtI,KAAMmN,EAAK,OAI/C7E,EAAQtI,KAAMmN,EAAK,aACnB9E,EAAcxL,KAAM,KAAM+M,KAI5BxB,EAAYA,EAAUnJ,QAAc6K,OAAQ1B,EAAUoE,KAAK,MAC3DnE,EAAgBA,EAAcpJ,QAAc6K,OAAQzB,EAAcmE,KAAK,MAQvEjE,EAAWuC,EAAQtL,KAAMvD,EAAQsM,WAActM,EAAQwT,wBACtD,SAAUzG,EAAGC,GACZ,GAAIyG,GAAuB,IAAf1G,EAAE1J,SAAiB0J,EAAE9M,gBAAkB8M,EAClD2G,EAAM1G,GAAKA,EAAEpJ,UACd,OAAOmJ,KAAM2G,MAAWA,GAAwB,IAAjBA,EAAIrQ,YAClCoQ,EAAMnH,SACLmH,EAAMnH,SAAUoH,GAChB3G,EAAEyG,yBAA8D,GAAnCzG,EAAEyG,wBAAyBE,MAG3D,SAAU3G,EAAGC,GACZ,GAAKA,EACJ,MAASA,EAAIA,EAAEpJ,WACd,GAAKoJ,IAAMD,EACV,OAAO,CAIV,QAAO,GAOTD,EAAY9M,EAAQwT,wBACpB,SAAUzG,EAAGC,GAGZ,GAAKD,IAAMC,EAEV,MADAH,IAAe,EACR,CAGR,IAAI8G,GAAU3G,EAAEwG,yBAA2BzG,EAAEyG,yBAA2BzG,EAAEyG,wBAAyBxG,EAEnG,OAAK2G,GAEW,EAAVA,IACFlI,EAAQmI,cAAgB5G,EAAEwG,wBAAyBzG,KAAQ4G,EAGxD5G,IAAMmF,GAAO5F,EAASC,EAAcQ,GACjC,GAEHC,IAAMkF,GAAO5F,EAASC,EAAcS,GACjC,EAIDhB,EACJhL,EAAQ+C,KAAMiI,EAAWe,GAAM/L,EAAQ+C,KAAMiI,EAAWgB,GAC1D,EAGe,EAAV2G,EAAc,GAAK,EAIpB5G,EAAEyG,wBAA0B,GAAK,GAEzC,SAAUzG,EAAGC,GACZ,GAAIwE,GACHxM,EAAI,EACJ6O,EAAM9G,EAAEnJ,WACR8P,EAAM1G,EAAEpJ,WACRkQ,GAAO/G,GACPgH,GAAO/G,EAGR,IAAKD,IAAMC,EAEV,MADAH,IAAe,EACR,CAGD,KAAMgH,IAAQH,EACpB,MAAO3G,KAAMmF,EAAM,GAClBlF,IAAMkF,EAAM,EACZ2B,EAAM,GACNH,EAAM,EACN1H,EACEhL,EAAQ+C,KAAMiI,EAAWe,GAAM/L,EAAQ+C,KAAMiI,EAAWgB,GAC1D,CAGK,IAAK6G,IAAQH,EACnB,MAAOnC,IAAcxE,EAAGC,EAIzBwE,GAAMzE,CACN,OAASyE,EAAMA,EAAI5N,WAClBkQ,EAAGE,QAASxC,EAEbA,GAAMxE,CACN,OAASwE,EAAMA,EAAI5N,WAClBmQ,EAAGC,QAASxC,EAIb,OAAQsC,EAAG9O,KAAO+O,EAAG/O,GACpBA,GAGD,OAAOA,GAENuM,GAAcuC,EAAG9O,GAAI+O,EAAG/O,IAGxB8O,EAAG9O,KAAOuH,EAAe,GACzBwH,EAAG/O,KAAOuH,EAAe,EACzB,GAGK2F,GA1UCnS,GA6UT0P,GAAOpD,QAAU,SAAU4H,EAAMC,GAChC,MAAOzE,IAAQwE,EAAM,KAAM,KAAMC,IAGlCzE,GAAOyD,gBAAkB,SAAUrQ,EAAMoR,GASxC,IAPOpR,EAAKS,eAAiBT,KAAW9C,GACvCkM,EAAapJ,GAIdoR,EAAOA,EAAK7N,QAAS6H,EAAkB,aAElCxC,EAAQyH,kBAAmBhH,GAC5BE,GAAkBA,EAAc7I,KAAM0Q,IACtC9H,GAAkBA,EAAU5I,KAAM0Q,IAErC,IACC,GAAI7P,GAAMiI,EAAQtI,KAAMlB,EAAMoR,EAG9B,IAAK7P,GAAOqH,EAAQ8H,mBAGlB1Q,EAAK9C,UAAuC,KAA3B8C,EAAK9C,SAASsD,SAChC,MAAOe,GAEP,MAAMmD,IAGT,MAAOkI,IAAQwE,EAAMlU,EAAU,MAAO8C,IAAQG,OAAS,GAGxDyM,GAAOnD,SAAW,SAAU9K,EAASqB,GAKpC,OAHOrB,EAAQ8B,eAAiB9B,KAAczB,GAC7CkM,EAAazK,GAEP8K,EAAU9K,EAASqB,IAG3B4M,GAAO/L,KAAO,SAAUb,EAAM4C,IAEtB5C,EAAKS,eAAiBT,KAAW9C,GACvCkM,EAAapJ,EAGd,IAAIpB,GAAKkK,EAAK2F,WAAY7L,EAAKiE,eAE9ByK,EAAM1S,GAAM0L,EAAOpJ,KAAM4H,EAAK2F,WAAY7L,EAAKiE,eAC9CjI,EAAIoB,EAAM4C,GAAOyG,GACjBxM,SAEF,OAAOyU,KAAQzU,UACd+L,EAAQiC,aAAexB,EACtBrJ,EAAKuN,aAAc3K,IAClB0O,EAAMtR,EAAKkQ,iBAAiBtN,KAAU0O,EAAIC,UAC1CD,EAAIxK,MACJ,KACFwK,GAGF1E,GAAOhI,MAAQ,SAAUC,GACxB,KAAUC,OAAO,0CAA4CD,IAO9D+H,GAAO4E,WAAa,SAAUvK,GAC7B,GAAIjH,GACHyR,KACApP,EAAI,EACJF,EAAI,CAOL,IAJA6H,GAAgBpB,EAAQ8I,iBACxBvI,GAAaP,EAAQ+I,YAAc1K,EAAQhJ,MAAO,GAClDgJ,EAAQzE,KAAMyH,GAETD,EAAe,CACnB,MAAShK,EAAOiH,EAAQ9E,KAClBnC,IAASiH,EAAS9E,KACtBE,EAAIoP,EAAW1T,KAAMoE,GAGvB,OAAQE,IACP4E,EAAQxE,OAAQgP,EAAYpP,GAAK,GAInC,MAAO4E,IAOR8B,EAAU6D,GAAO7D,QAAU,SAAU/I,GACpC,GAAIoP,GACH7N,EAAM,GACNY,EAAI,EACJ3B,EAAWR,EAAKQ,QAEjB,IAAMA,GAMC,GAAkB,IAAbA,GAA+B,IAAbA,GAA+B,KAAbA,EAAkB,CAGjE,GAAiC,gBAArBR,GAAK4R,YAChB,MAAO5R,GAAK4R,WAGZ,KAAM5R,EAAOA,EAAK6P,WAAY7P,EAAMA,EAAOA,EAAK8O,YAC/CvN,GAAOwH,EAAS/I,OAGZ,IAAkB,IAAbQ,GAA+B,IAAbA,EAC7B,MAAOR,GAAK6R,cAhBZ,MAASzC,EAAOpP,EAAKmC,GAAKA,IAEzBZ,GAAOwH,EAASqG,EAkBlB,OAAO7N,IAGRuH,EAAO8D,GAAOkF,WAGb7D,YAAa,GAEb8D,aAAc5D,GAEdpO,MAAOwL,EAEPkD,cAEApO,QAEA2R,UACCC,KAAOC,IAAK,aAAclQ,OAAO,GACjCmQ,KAAOD,IAAK,cACZE,KAAOF,IAAK,kBAAmBlQ,OAAO,GACtCqQ,KAAOH,IAAK,oBAGbI,WACC3G,KAAQ,SAAU5L,GAUjB,MATAA,GAAM,GAAKA,
EAAM,GAAGwD,QAAS6I,GAAWC,IAGxCtM,EAAM,IAAOA,EAAM,IAAMA,EAAM,IAAM,IAAKwD,QAAS6I,GAAWC,IAE5C,OAAbtM,EAAM,KACVA,EAAM,GAAK,IAAMA,EAAM,GAAK,KAGtBA,EAAM9B,MAAO,EAAG,IAGxB4N,MAAS,SAAU9L,GA6BlB,MAlBAA,GAAM,GAAKA,EAAM,GAAG8G,cAEY,QAA3B9G,EAAM,GAAG9B,MAAO,EAAG,IAEjB8B,EAAM,IACX6M,GAAOhI,MAAO7E,EAAM,IAKrBA,EAAM,KAAQA,EAAM,GAAKA,EAAM,IAAMA,EAAM,IAAM,GAAK,GAAmB,SAAbA,EAAM,IAA8B,QAAbA,EAAM,KACzFA,EAAM,KAAUA,EAAM,GAAKA,EAAM,IAAqB,QAAbA,EAAM,KAGpCA,EAAM,IACjB6M,GAAOhI,MAAO7E,EAAM,IAGdA,GAGR6L,OAAU,SAAU7L,GACnB,GAAIwS,GACHC,GAAYzS,EAAM,IAAMA,EAAM,EAE/B,OAAKwL,GAAiB,MAAE7K,KAAMX,EAAM,IAC5B,MAIHA,EAAM,IAAMA,EAAM,KAAOlD,UAC7BkD,EAAM,GAAKA,EAAM,GAGNyS,GAAYnH,EAAQ3K,KAAM8R,KAEpCD,EAASjF,GAAUkF,GAAU,MAE7BD,EAASC,EAASrU,QAAS,IAAKqU,EAASrS,OAASoS,GAAWC,EAASrS,UAGvEJ,EAAM,GAAKA,EAAM,GAAG9B,MAAO,EAAGsU,GAC9BxS,EAAM,GAAKyS,EAASvU,MAAO,EAAGsU,IAIxBxS,EAAM9B,MAAO,EAAG,MAIzB+R,QAECtE,IAAO,SAAU+G,GAChB,GAAI7L,GAAW6L,EAAiBlP,QAAS6I,GAAWC,IAAYxF,aAChE,OAA4B,MAArB4L,EACN,WAAa,OAAO,GACpB,SAAUzS,GACT,MAAOA,GAAK4G,UAAY5G,EAAK4G,SAASC,gBAAkBD,IAI3D6E,MAAS,SAAUiE,GAClB,GAAIgD,GAAU9I,EAAY8F,EAAY,IAEtC,OAAOgD,KACLA,EAAc1H,OAAQ,MAAQN,EAAa,IAAMgF,EAAY,IAAMhF,EAAa,SACjFd,EAAY8F,EAAW,SAAU1P,GAChC,MAAO0S,GAAQhS,KAAgC,gBAAnBV,GAAK0P,WAA0B1P,EAAK0P,iBAAoB1P,GAAKuN,eAAiBnD,GAAgBpK,EAAKuN,aAAa,UAAY,OAI3J5B,KAAQ,SAAU/I,EAAM+P,EAAUC,GACjC,MAAO,UAAU5S,GAChB,GAAI6S,GAASjG,GAAO/L,KAAMb,EAAM4C,EAEhC,OAAe,OAAViQ,EACgB,OAAbF,EAEFA,GAINE,GAAU,GAEU,MAAbF,EAAmBE,IAAWD,EACvB,OAAbD,EAAoBE,IAAWD,EAClB,OAAbD,EAAoBC,GAAqC,IAA5BC,EAAO1U,QAASyU,GAChC,OAAbD,EAAoBC,GAASC,EAAO1U,QAASyU,GAAU,GAC1C,OAAbD,EAAoBC,GAASC,EAAO5U,OAAQ2U,EAAMzS,UAAayS,EAClD,OAAbD,GAAsB,IAAME,EAAS,KAAM1U,QAASyU,GAAU,GACjD,OAAbD,EAAoBE,IAAWD,GAASC,EAAO5U,MAAO,EAAG2U,EAAMzS,OAAS,KAAQyS,EAAQ,KACxF,IAZO,IAgBV/G,MAAS,SAAU3H,EAAM4O,EAAM5D,EAAUlN,EAAOE,GAC/C,GAAI6Q,GAAgC,QAAvB7O,EAAKjG,MAAO,EAAG,GAC3B+U,EAA+B,SAArB9O,EAAKjG,MAAO,IACtBgV,EAAkB,YAATH,CAEV,OAAiB,KAAV9Q,GAAwB,IAATE,EAGrB,SAAUlC,GACT,QAASA,EAAKe,YAGf,SAAUf,EAAMrB,EAASgH,GACxB,GAAIqI,GAAOkF,EAAY9D,EAAMR,EAAMuE,EAAWC,EAC7ClB,EAAMa,IAAWC,EAAU,cAAgB,kBAC3C1D,EAAStP,EAAKe,WACd6B,EAAOqQ,GAAUjT,EAAK4G,SAASC,cAC/BwM,GAAY1N,IAAQsN,CAErB,IAAK3D,EAAS,CAGb,GAAKyD,EAAS,CACb,MAAQb,EAAM,CACb9C,EAAOpP,CACP,OAASoP,EAAOA,EAAM8C,GACrB,GAAKe,EAAS7D,EAAKxI,SAASC,gBAAkBjE,EAAyB,IAAlBwM,EAAK5O,SACzD,OAAO,CAIT4S,GAAQlB,EAAe,SAAThO,IAAoBkP,GAAS,cAE5C,OAAO,EAMR,GAHAA,GAAUJ,EAAU1D,EAAOO,WAAaP,EAAOgE,WAG1CN,GAAWK,EAAW,CAE1BH,EAAa5D,EAAQlM,KAAckM,EAAQlM,OAC3C4K,EAAQkF,EAAYhP,OACpBiP,EAAYnF,EAAM,KAAOrE,GAAWqE,EAAM,GAC1CY,EAAOZ,EAAM,KAAOrE,GAAWqE,EAAM,GACrCoB,EAAO+D,GAAa7D,EAAOhK,WAAY6N,EAEvC,OAAS/D,IAAS+D,GAAa/D,GAAQA,EAAM8C,KAG3CtD,EAAOuE,EAAY,IAAMC,EAAM7I,MAGhC,GAAuB,IAAlB6E,EAAK5O,YAAoBoO,GAAQQ,IAASpP,EAAO,CACrDkT,EAAYhP,IAAWyF,EAASwJ,EAAWvE,EAC3C,YAKI,IAAKyE,IAAarF,GAAShO,EAAMoD,KAAcpD,EAAMoD,QAAkBc,KAAW8J,EAAM,KAAOrE,EACrGiF,EAAOZ,EAAM,OAKb,OAASoB,IAAS+D,GAAa/D,GAAQA,EAAM8C,KAC3CtD,EAAOuE,EAAY,IAAMC,EAAM7I,MAEhC,IAAO0I,EAAS7D,EAAKxI,SAASC,gBAAkBjE,EAAyB,IAAlBwM,EAAK5O,aAAsBoO,IAE5EyE,KACHjE,EAAMhM,KAAcgM,EAAMhM,QAAkBc,IAAWyF,EAASiF,IAG7DQ,IAASpP,GACb,KAQJ,OADA4O,IAAQ1M,EACD0M,IAAS5M,GAA4B,IAAjB4M,EAAO5M,GAAe4M,EAAO5M,GAAS,KAKrE4J,OAAU,SAAU2H,EAAQrE,GAK3B,GAAIvN,GACH/C,EAAKkK,EAAKgC,QAASyI,IAAYzK,EAAK0K,WAAYD,EAAO1M,gBACtD+F,GAAOhI,MAAO,uBAAyB2O,EAKzC,OAAK3U,GAAIwE,GACDxE,EAAIsQ,GAIPtQ,EAAGuB,OAAS,GAChBwB,GAAS4R,EAAQA,EAAQ,GAAIrE,GACtBpG,EAAK0K,WAAWjV,eAAgBgV,EAAO1M,eAC7CsH,GAAa,SAAUtB,EAAMrD,GAC5B,GAAIiK,GACHC,EAAU9U,EAAIiO,EAAMqC,GACpB/M,EAAIuR,EAAQvT,MACb,OAAQgC,IACPsR,EAAMtV,EAAQ+C,KAAM2L,EAAM6G,EAAQvR,IAClC0K,EAAM4G,KAAWjK,EAASiK,GAAQC,EAAQvR,MAG5C,SAAU
nC,GACT,MAAOpB,GAAIoB,EAAM,EAAG2B,KAIhB/C,IAITkM,SAEC6I,IAAOxF,GAAa,SAAUzP,GAI7B,GAAI0R,MACHnJ,KACA2M,EAAU3K,EAASvK,EAAS6E,QAASwH,EAAO,MAE7C,OAAO6I,GAASxQ,GACf+K,GAAa,SAAUtB,EAAMrD,EAAS7K,EAASgH,GAC9C,GAAI3F,GACH6T,EAAYD,EAAS/G,EAAM,KAAMlH,MACjCxD,EAAI0K,EAAK1M,MAGV,OAAQgC,KACDnC,EAAO6T,EAAU1R,MACtB0K,EAAK1K,KAAOqH,EAAQrH,GAAKnC,MAI5B,SAAUA,EAAMrB,EAASgH,GAGxB,MAFAyK,GAAM,GAAKpQ,EACX4T,EAASxD,EAAO,KAAMzK,EAAKsB,IACnBA,EAAQsD,SAInBuJ,IAAO3F,GAAa,SAAUzP,GAC7B,MAAO,UAAUsB,GAChB,MAAO4M,IAAQlO,EAAUsB,GAAOG,OAAS,KAI3CsJ,SAAY0E,GAAa,SAAU7H,GAClC,MAAO,UAAUtG,GAChB,OAASA,EAAK4R,aAAe5R,EAAK+T,WAAahL,EAAS/I,IAAS7B,QAASmI,GAAS,MAWrF0N,KAAQ7F,GAAc,SAAU6F,GAM/B,MAJM1I,GAAY5K,KAAKsT,GAAQ,KAC9BpH,GAAOhI,MAAO,qBAAuBoP,GAEtCA,EAAOA,EAAKzQ,QAAS6I,GAAWC,IAAYxF,cACrC,SAAU7G,GAChB,GAAIiU,EACJ,GACC,IAAMA,EAAW5K,EAChBrJ,EAAKgU,KACLhU,EAAKuN,aAAa,aAAevN,EAAKuN,aAAa,QAGnD,MADA0G,GAAWA,EAASpN,cACboN,IAAaD,GAA2C,IAAnCC,EAAS9V,QAAS6V,EAAO,YAE5ChU,EAAOA,EAAKe,aAAiC,IAAlBf,EAAKQ,SAC3C,QAAO,KAKTyC,OAAU,SAAUjD,GACnB,GAAIkU,GAAOtX,EAAOK,UAAYL,EAAOK,SAASiX,IAC9C,OAAOA,IAAQA,EAAKjW,MAAO,KAAQ+B,EAAKmN,IAGzCgH,KAAQ,SAAUnU,GACjB,MAAOA,KAAS7C,GAGjBiX,MAAS,SAAUpU,GAClB,MAAOA,KAAS9C,EAASmX,iBAAmBnX,EAASoX,UAAYpX,EAASoX,gBAAkBtU,EAAKkE,MAAQlE,EAAKuU,OAASvU,EAAKwU,WAI7HC,QAAW,SAAUzU,GACpB,MAAOA,GAAK0U,YAAa,GAG1BA,SAAY,SAAU1U,GACrB,MAAOA,GAAK0U,YAAa,GAG1BC,QAAW,SAAU3U,GAGpB,GAAI4G,GAAW5G,EAAK4G,SAASC,aAC7B,OAAqB,UAAbD,KAA0B5G,EAAK2U,SAA0B,WAAb/N,KAA2B5G,EAAK4U,UAGrFA,SAAY,SAAU5U,GAOrB,MAJKA,GAAKe,YACTf,EAAKe,WAAW8T,cAGV7U,EAAK4U,YAAa,GAI1BE,MAAS,SAAU9U,GAMlB,IAAMA,EAAOA,EAAK6P,WAAY7P,EAAMA,EAAOA,EAAK8O,YAC/C,GAAK9O,EAAK4G,SAAW,KAAyB,IAAlB5G,EAAKQ,UAAoC,IAAlBR,EAAKQ,SACvD,OAAO,CAGT,QAAO,GAGR8O,OAAU,SAAUtP,GACnB,OAAQ8I,EAAKgC,QAAe,MAAG9K,IAIhC+U,OAAU,SAAU/U,GACnB,MAAOkM,IAAQxL,KAAMV,EAAK4G,WAG3BwJ,MAAS,SAAUpQ,GAClB,MAAOiM,GAAQvL,KAAMV,EAAK4G,WAG3BoO,OAAU,SAAUhV,GACnB,GAAI4C,GAAO5C,EAAK4G,SAASC,aACzB,OAAgB,UAATjE,GAAkC,WAAd5C,EAAKkE,MAA8B,WAATtB,GAGtD0D,KAAQ,SAAUtG,GACjB,GAAIa,EAGJ,OAAuC,UAAhCb,EAAK4G,SAASC,eACN,SAAd7G,EAAKkE,OACmC,OAArCrD,EAAOb,EAAKuN,aAAa,UAAoB1M,EAAKgG,gBAAkB7G,EAAKkE,OAI9ElC,MAASiN,GAAuB,WAC/B,OAAS,KAGV/M,KAAQ+M,GAAuB,SAAUE,EAAchP,GACtD,OAASA,EAAS,KAGnB8B,GAAMgN,GAAuB,SAAUE,EAAchP,EAAQ+O,GAC5D,OAAoB,EAAXA,EAAeA,EAAW/O,EAAS+O,KAG7C+F,KAAQhG,GAAuB,SAAUE,EAAchP,GACtD,GAAIgC,GAAI,CACR,MAAYhC,EAAJgC,EAAYA,GAAK,EACxBgN,EAAapR,KAAMoE,EAEpB,OAAOgN,KAGR+F,IAAOjG,GAAuB,SAAUE,EAAchP,GACrD,GAAIgC,GAAI,CACR,MAAYhC,EAAJgC,EAAYA,GAAK,EACxBgN,EAAapR,KAAMoE,EAEpB,OAAOgN,KAGRgG,GAAMlG,GAAuB,SAAUE,EAAchP,EAAQ+O,GAC5D,GAAI/M,GAAe,EAAX+M,EAAeA,EAAW/O,EAAS+O,CAC3C,QAAU/M,GAAK,GACdgN,EAAapR,KAAMoE,EAEpB,OAAOgN,KAGRiG,GAAMnG,GAAuB,SAAUE,EAAchP,EAAQ+O,GAC5D,GAAI/M,GAAe,EAAX+M,EAAeA,EAAW/O,EAAS+O,CAC3C,MAAc/O,IAAJgC,GACTgN,EAAapR,KAAMoE,EAEpB,OAAOgN,OAKVrG,EAAKgC,QAAa,IAAIhC,EAAKgC,QAAY,EAGvC,KAAM3I,KAAOkT,OAAO,EAAMC,UAAU,EAAMC,MAAM,EAAMC,UAAU,EAAMC,OAAO,GAC5E3M,EAAKgC,QAAS3I,GAAM4M,GAAmB5M,EAExC,KAAMA,KAAOuT,QAAQ,EAAMC,OAAO,GACjC7M,EAAKgC,QAAS3I,GAAM6M,GAAoB7M,EAIzC,SAASqR,OACTA,GAAW5T,UAAYkJ,EAAK8M,QAAU9M,EAAKgC,QAC3ChC,EAAK0K,WAAa,GAAIA,GAEtB,SAASlG,IAAU5O,EAAUmX,GAC5B,GAAInC,GAAS3T,EAAO+V,EAAQ5R,EAC3B6R,EAAOhJ,EAAQiJ,EACfC,EAASnM,EAAYpL,EAAW,IAEjC,IAAKuX,EACJ,MAAOJ,GAAY,EAAII,EAAOhY,MAAO,EAGtC8X,GAAQrX,EACRqO,KACAiJ,EAAalN,EAAKwJ,SAElB,OAAQyD,EAAQ,GAGTrC,IAAY3T,EAAQkL,EAAO7K,KAAM2V,OACjChW,IAEJgW,EAAQA,EAAM9X,MAAO8B,EAAM,GAAGI,SAAY4V,GAE3ChJ,EAAOhP,KAAM+X,OAGdpC,GAAU,GAGJ3T,EAAQmL,EAAa9K,KAAM2V,MAChCrC,EAAU3T,EAAMmO,QAChB4H,EAAO/X,MACN+I,MAAO4M,EAEPxP,KAAMnE,EAAM,GAAGwD,QAASwH,EAAO,OAEhCgL,EAAQA,EAAM9X,MAAOyV,EAAQvT,QAI9B,KAAM+D,IAAQ4E,GA
AKkH,SACZjQ,EAAQwL,EAAWrH,GAAO9D,KAAM2V,KAAcC,EAAY9R,MAC9DnE,EAAQiW,EAAY9R,GAAQnE,MAC7B2T,EAAU3T,EAAMmO,QAChB4H,EAAO/X,MACN+I,MAAO4M,EACPxP,KAAMA,EACNsF,QAASzJ,IAEVgW,EAAQA,EAAM9X,MAAOyV,EAAQvT,QAI/B,KAAMuT,EACL,MAOF,MAAOmC,GACNE,EAAM5V,OACN4V,EACCnJ,GAAOhI,MAAOlG,GAEdoL,EAAYpL,EAAUqO,GAAS9O,MAAO,GAGzC,QAASwP,IAAYqI,GACpB,GAAI3T,GAAI,EACPC,EAAM0T,EAAO3V,OACbzB,EAAW,EACZ,MAAY0D,EAAJD,EAASA,IAChBzD,GAAYoX,EAAO3T,GAAG2E,KAEvB,OAAOpI,GAGR,QAASwX,IAAetC,EAASuC,EAAYC,GAC5C,GAAIlE,GAAMiE,EAAWjE,IACpBmE,EAAmBD,GAAgB,eAARlE,EAC3BoE,EAAWzU,GAEZ,OAAOsU,GAAWnU,MAEjB,SAAUhC,EAAMrB,EAASgH,GACxB,MAAS3F,EAAOA,EAAMkS,GACrB,GAAuB,IAAlBlS,EAAKQ,UAAkB6V,EAC3B,MAAOzC,GAAS5T,EAAMrB,EAASgH,IAMlC,SAAU3F,EAAMrB,EAASgH,GACxB,GAAIZ,GAAMiJ,EAAOkF,EAChBqD,EAAS5M,EAAU,IAAM2M,CAG1B,IAAK3Q,GACJ,MAAS3F,EAAOA,EAAMkS,GACrB,IAAuB,IAAlBlS,EAAKQ,UAAkB6V,IACtBzC,EAAS5T,EAAMrB,EAASgH,GAC5B,OAAO,MAKV,OAAS3F,EAAOA,EAAMkS,GACrB,GAAuB,IAAlBlS,EAAKQ,UAAkB6V,EAE3B,GADAnD,EAAalT,EAAMoD,KAAcpD,EAAMoD,QACjC4K,EAAQkF,EAAYhB,KAAUlE,EAAM,KAAOuI,GAChD,IAAMxR,EAAOiJ,EAAM,OAAQ,GAAQjJ,IAAS8D,EAC3C,MAAO9D,MAAS,MAKjB,IAFAiJ,EAAQkF,EAAYhB,IAAUqE,GAC9BvI,EAAM,GAAK4F,EAAS5T,EAAMrB,EAASgH,IAASkD,EACvCmF,EAAM,MAAO,EACjB,OAAO,GASf,QAASwI,IAAgBC,GACxB,MAAOA,GAAStW,OAAS,EACxB,SAAUH,EAAMrB,EAASgH,GACxB,GAAIxD,GAAIsU,EAAStW,MACjB,OAAQgC,IACP,IAAMsU,EAAStU,GAAInC,EAAMrB,EAASgH,GACjC,OAAO,CAGT,QAAO,GAER8Q,EAAS,GAGX,QAASC,IAAU7C,EAAWvR,EAAK0N,EAAQrR,EAASgH,GACnD,GAAI3F,GACH2W,KACAxU,EAAI,EACJC,EAAMyR,EAAU1T,OAChByW,EAAgB,MAAPtU,CAEV,MAAYF,EAAJD,EAASA,KACVnC,EAAO6T,EAAU1R,OAChB6N,GAAUA,EAAQhQ,EAAMrB,EAASgH,MACtCgR,EAAa5Y,KAAMiC,GACd4W,GACJtU,EAAIvE,KAAMoE,GAMd,OAAOwU,GAGR,QAASE,IAAYvE,EAAW5T,EAAUkV,EAASkD,EAAYC,EAAYC,GAO1E,MANKF,KAAeA,EAAY1T,KAC/B0T,EAAaD,GAAYC,IAErBC,IAAeA,EAAY3T,KAC/B2T,EAAaF,GAAYE,EAAYC,IAE/B7I,GAAa,SAAUtB,EAAM5F,EAAStI,EAASgH,GACrD,GAAIsR,GAAM9U,EAAGnC,EACZkX,KACAC,KACAC,EAAcnQ,EAAQ9G,OAGtBmB,EAAQuL,GAAQwK,GAAkB3Y,GAAY,IAAKC,EAAQ6B,UAAa7B,GAAYA,MAGpF2Y,GAAYhF,IAAezF,GAASnO,EAEnC4C,EADAoV,GAAUpV,EAAO4V,EAAQ5E,EAAW3T,EAASgH,GAG9C4R,EAAa3D,EAEZmD,IAAgBlK,EAAOyF,EAAY8E,GAAeN,MAMjD7P,EACDqQ,CAQF,IALK1D,GACJA,EAAS0D,EAAWC,EAAY5Y,EAASgH,GAIrCmR,EAAa,CACjBG,EAAOP,GAAUa,EAAYJ,GAC7BL,EAAYG,KAAUtY,EAASgH,GAG/BxD,EAAI8U,EAAK9W,MACT,OAAQgC,KACDnC,EAAOiX,EAAK9U,MACjBoV,EAAYJ,EAAQhV,MAASmV,EAAWH,EAAQhV,IAAOnC,IAK1D,GAAK6M,GACJ,GAAKkK,GAAczE,EAAY,CAC9B,GAAKyE,EAAa,CAEjBE,KACA9U,EAAIoV,EAAWpX,MACf,OAAQgC,KACDnC,EAAOuX,EAAWpV,KAEvB8U,EAAKlZ,KAAOuZ,EAAUnV,GAAKnC,EAG7B+W,GAAY,KAAOQ,KAAkBN,EAAMtR,GAI5CxD,EAAIoV,EAAWpX,MACf,OAAQgC,KACDnC,EAAOuX,EAAWpV,MACtB8U,EAAOF,EAAa5Y,EAAQ+C,KAAM2L,EAAM7M,GAASkX,EAAO/U,IAAM,KAE/D0K,EAAKoK,KAAUhQ,EAAQgQ,GAAQjX,SAOlCuX,GAAab,GACZa,IAAetQ,EACdsQ,EAAW9U,OAAQ2U,EAAaG,EAAWpX,QAC3CoX,GAEGR,EACJA,EAAY,KAAM9P,EAASsQ,EAAY5R,GAEvC5H,EAAK+D,MAAOmF,EAASsQ,KAMzB,QAASC,IAAmB1B,GAC3B,GAAI2B,GAAc7D,EAASvR,EAC1BD,EAAM0T,EAAO3V,OACbuX,EAAkB5O,EAAKkJ,SAAU8D,EAAO,GAAG5R,MAC3CyT,EAAmBD,GAAmB5O,EAAKkJ,SAAS,KACpD7P,EAAIuV,EAAkB,EAAI,EAG1BE,EAAe1B,GAAe,SAAUlW,GACvC,MAAOA,KAASyX,GACdE,GAAkB,GACrBE,EAAkB3B,GAAe,SAAUlW,GAC1C,MAAO7B,GAAQ+C,KAAMuW,EAAczX,GAAS,IAC1C2X,GAAkB,GACrBlB,GAAa,SAAUzW,EAAMrB,EAASgH,GACrC,OAAU+R,IAAqB/R,GAAOhH,IAAYuK,MAChDuO,EAAe9Y,GAAS6B,SACxBoX,EAAc5X,EAAMrB,EAASgH,GAC7BkS,EAAiB7X,EAAMrB,EAASgH,KAGpC,MAAYvD,EAAJD,EAASA,IAChB,GAAMyR,EAAU9K,EAAKkJ,SAAU8D,EAAO3T,GAAG+B,MACxCuS,GAAaP,GAAcM,GAAgBC,GAAY7C,QACjD,CAIN,GAHAA,EAAU9K,EAAKkH,OAAQ8F,EAAO3T,GAAG+B,MAAOpC,MAAO,KAAMgU,EAAO3T,GAAGqH,SAG1DoK,EAASxQ,GAAY,CAGzB,IADAf,IAAMF,EACMC,EAAJC,EAASA,IAChB,GAAKyG,EAAKkJ,SAAU8D,EAAOzT,GAAG6B,MAC7B,KAGF,OAAO2S,IACN1U,EAAI,GAAKqU,GAAgBC,GACzBtU
,EAAI,GAAKsL,GAERqI,EAAO7X,MAAO,EAAGkE,EAAI,GAAItE,QAASiJ,MAAgC,MAAzBgP,EAAQ3T,EAAI,GAAI+B,KAAe,IAAM,MAC7EX,QAASwH,EAAO,MAClB6I,EACIvR,EAAJF,GAASqV,GAAmB1B,EAAO7X,MAAOkE,EAAGE,IACzCD,EAAJC,GAAWmV,GAAoB1B,EAASA,EAAO7X,MAAOoE,IAClDD,EAAJC,GAAWoL,GAAYqI,IAGzBW,EAAS1Y,KAAM6V,GAIjB,MAAO4C,IAAgBC,GAGxB,QAASqB,IAA0BC,EAAiBC,GAEnD,GAAIC,GAAoB,EACvBC,EAAQF,EAAY7X,OAAS,EAC7BgY,EAAYJ,EAAgB5X,OAAS,EACrCiY,EAAe,SAAUvL,EAAMlO,EAASgH,EAAKsB,EAASoR,GACrD,GAAIrY,GAAMqC,EAAGuR,EACZ0E,KACAC,EAAe,EACfpW,EAAI,IACJ0R,EAAYhH,MACZ2L,EAA6B,MAAjBH,EACZI,EAAgBvP,EAEhB5H,EAAQuL,GAAQsL,GAAarP,EAAKzI,KAAU,IAAG,IAAKgY,GAAiB1Z,EAAQoC,YAAcpC,GAE3F+Z,EAAiB/O,GAA4B,MAAjB8O,EAAwB,EAAIpV,KAAKC,UAAY,EAS1E,KAPKkV,IACJtP,EAAmBvK,IAAYzB,GAAYyB,EAC3CkK,EAAaoP,GAKe,OAApBjY,EAAOsB,EAAMa,IAAaA,IAAM,CACxC,GAAKgW,GAAanY,EAAO,CACxBqC,EAAI,CACJ,OAASuR,EAAUmE,EAAgB1V,KAClC,GAAKuR,EAAS5T,EAAMrB,EAASgH,GAAQ,CACpCsB,EAAQlJ,KAAMiC,EACd,OAGGwY,IACJ7O,EAAU+O,EACV7P,IAAeoP,GAKZC,KAEElY,GAAQ4T,GAAW5T,IACxBuY,IAII1L,GACJgH,EAAU9V,KAAMiC,IAOnB,GADAuY,GAAgBpW,EACX+V,GAAS/V,IAAMoW,EAAe,CAClClW,EAAI,CACJ,OAASuR,EAAUoE,EAAY3V,KAC9BuR,EAASC,EAAWyE,EAAY3Z,EAASgH,EAG1C,IAAKkH,EAAO,CAEX,GAAK0L,EAAe,EACnB,MAAQpW,IACA0R,EAAU1R,IAAMmW,EAAWnW,KACjCmW,EAAWnW,GAAKoI,EAAIrJ,KAAM+F,GAM7BqR,GAAa5B,GAAU4B,GAIxBva,EAAK+D,MAAOmF,EAASqR,GAGhBE,IAAc3L,GAAQyL,EAAWnY,OAAS,GAC5CoY,EAAeP,EAAY7X,OAAW,GAExCyM,GAAO4E,WAAYvK,GAUrB,MALKuR,KACJ7O,EAAU+O,EACVxP,EAAmBuP,GAGb5E,EAGT,OAAOqE,GACN/J,GAAciK,GACdA,EAGFnP,EAAU2D,GAAO3D,QAAU,SAAUvK,EAAUia,GAC9C,GAAIxW,GACH6V,KACAD,KACA9B,EAASlM,EAAerL,EAAW,IAEpC,KAAMuX,EAAS,CAER0C,IACLA,EAAQrL,GAAU5O,IAEnByD,EAAIwW,EAAMxY,MACV,OAAQgC,IACP8T,EAASuB,GAAmBmB,EAAMxW,IAC7B8T,EAAQ7S,GACZ4U,EAAYja,KAAMkY,GAElB8B,EAAgBha,KAAMkY,EAKxBA,GAASlM,EAAerL,EAAUoZ,GAA0BC,EAAiBC,IAE9E,MAAO/B,GAGR,SAASoB,IAAkB3Y,EAAUka,EAAU3R,GAC9C,GAAI9E,GAAI,EACPC,EAAMwW,EAASzY,MAChB,MAAYiC,EAAJD,EAASA,IAChByK,GAAQlO,EAAUka,EAASzW,GAAI8E,EAEhC,OAAOA,GAGR,QAAS6G,IAAQpP,EAAUC,EAASsI,EAAS4F,GAC5C,GAAI1K,GAAG2T,EAAQ+C,EAAO3U,EAAM7D,EAC3BN,EAAQuN,GAAU5O,EAEnB,KAAMmO,GAEiB,IAAjB9M,EAAMI,OAAe,CAIzB,GADA2V,EAAS/V,EAAM,GAAKA,EAAM,GAAG9B,MAAO,GAC/B6X,EAAO3V,OAAS,GAAkC,QAA5B0Y,EAAQ/C,EAAO,IAAI5R,MAC5C0E,EAAQkH,SAAgC,IAArBnR,EAAQ6B,UAAkB6I,GAC7CP,EAAKkJ,SAAU8D,EAAO,GAAG5R,MAAS,CAGnC,GADAvF,GAAYmK,EAAKzI,KAAS,GAAGwY,EAAMrP,QAAQ,GAAGjG,QAAQ6I,GAAWC,IAAY1N,QAAkB,IACzFA,EACL,MAAOsI,EAERvI,GAAWA,EAAST,MAAO6X,EAAO5H,QAAQpH,MAAM3G,QAIjDgC,EAAIoJ,EAAwB,aAAE7K,KAAMhC,GAAa,EAAIoX,EAAO3V,MAC5D,OAAQgC,IAAM,CAIb,GAHA0W,EAAQ/C,EAAO3T,GAGV2G,EAAKkJ,SAAW9N,EAAO2U,EAAM3U,MACjC,KAED,KAAM7D,EAAOyI,EAAKzI,KAAM6D,MAEjB2I,EAAOxM,EACZwY,EAAMrP,QAAQ,GAAGjG,QAAS6I,GAAWC,IACrClB,EAASzK,KAAMoV,EAAO,GAAG5R,OAAUvF,EAAQoC,YAAcpC,IACrD,CAKJ,GAFAmX,EAAOrT,OAAQN,EAAG,GAClBzD,EAAWmO,EAAK1M,QAAUsN,GAAYqI,IAChCpX,EAEL,MADAX,GAAK+D,MAAOmF,EAAS4F,GACd5F,CAGR,SAgBL,MAPAgC,GAASvK,EAAUqB,GAClB8M,EACAlO,GACC0K,EACDpC,EACAkE,EAASzK,KAAMhC,IAETuI,EAMR2B,EAAQ+I,WAAavO,EAAQuF,MAAM,IAAInG,KAAMyH,GAAYyD,KAAK,MAAQtK,EAItEwF,EAAQ8I,iBAAmB1H,EAG3BZ,IAIAR,EAAQmI,aAAe3C,GAAO,SAAU0K,GAEvC,MAAuE,GAAhEA,EAAKnI,wBAAyBzT,EAASiI,cAAc,UAMvDiJ,GAAO,SAAUC,GAEtB,MADAA,GAAIuB,UAAY,mBAC+B,MAAxCvB,EAAIwB,WAAWtC,aAAa,WAEnCe,GAAW,yBAA0B,SAAUtO,EAAM4C,EAAMoG,GAC1D,MAAMA,GAAN,UACQhJ,EAAKuN,aAAc3K,EAA6B,SAAvBA,EAAKiE,cAA2B,EAAI,KAOjE+B,EAAQiC,YAAeuD,GAAO,SAAUC,GAG7C,MAFAA,GAAIuB,UAAY,WAChBvB,EAAIwB,WAAWrC,aAAc,QAAS,IACY,KAA3Ca,EAAIwB,WAAWtC,aAAc,YAEpCe,GAAW,QAAS,SAAUtO,EAAM4C,EAAMoG,GACzC,MAAMA,IAAyC,UAAhChJ,EAAK4G,SAASC,cAA7B,UACQ7G,EAAK+Y,eAOT3K,GAAO,SAAUC,GACtB,MAAuC,OAAhCA,EAAId,aAAa,eAExBe,GAAW7D,EAAU,SAAUzK,EAAM4C,EAAMoG,GAC1C,GAAIsI,EACJ,OAAMtI,GAA
N,WACSsI,EAAMtR,EAAKkQ,iBAAkBtN,KAAW0O,EAAIC,UACnDD,EAAIxK,MACJ9G,EAAM4C,MAAW,EAAOA,EAAKiE,cAAgB,OAKjDvJ,EAAO+C,KAAOuM,GACdtP,EAAO8T,KAAOxE,GAAOkF,UACrBxU,EAAO8T,KAAK,KAAO9T,EAAO8T,KAAKtG,QAC/BxN,EAAO0b,OAASpM,GAAO4E,WACvBlU,EAAOgJ,KAAOsG,GAAO7D,QACrBzL,EAAO2b,SAAWrM,GAAO5D,MACzB1L,EAAOmM,SAAWmD,GAAOnD,UAGrB7M,EAEJ,IAAIsc,KAGJ,SAASC,GAAexW,GACvB,GAAIyW,GAASF,EAAcvW,KAI3B,OAHArF,GAAOmE,KAAMkB,EAAQ5C,MAAOf,OAAwB,SAAUsN,EAAG+M,GAChED,EAAQC,IAAS,IAEXD,EAyBR9b,EAAOgc,UAAY,SAAU3W,GAI5BA,EAA6B,gBAAZA,GACduW,EAAcvW,IAAawW,EAAexW,GAC5CrF,EAAOoF,UAAYC,EAEpB,IACC4W,GAEAC,EAEAC,EAEAC,EAEAC,EAEAC,EAEAC,KAEAC,GAASnX,EAAQoX,SAEjBC,EAAO,SAAUjV,GAOhB,IANAwU,EAAS5W,EAAQ4W,QAAUxU,EAC3ByU,GAAQ,EACRI,EAAcF,GAAe,EAC7BA,EAAc,EACdC,EAAeE,EAAK1Z,OACpBsZ,GAAS,EACDI,GAAsBF,EAAdC,EAA4BA,IAC3C,GAAKC,EAAMD,GAAc9X,MAAOiD,EAAM,GAAKA,EAAM,OAAU,GAASpC,EAAQsX,YAAc,CACzFV,GAAS,CACT,OAGFE,GAAS,EACJI,IACCC,EACCA,EAAM3Z,QACV6Z,EAAMF,EAAM5L,SAEFqL,EACXM,KAEAK,EAAKC,YAKRD,GAECE,IAAK,WACJ,GAAKP,EAAO,CAEX,GAAIzG,GAAQyG,EAAK1Z,QACjB,QAAUia,GAAKzY,GACdrE,EAAOmE,KAAME,EAAM,SAAU2K,EAAG7E,GAC/B,GAAIvD,GAAO5G,EAAO4G,KAAMuD,EACV,cAATvD,EACEvB,EAAQqW,QAAWkB,EAAKpG,IAAKrM,IAClCoS,EAAK9b,KAAM0J,GAEDA,GAAOA,EAAItH,QAAmB,WAAT+D,GAEhCkW,EAAK3S,OAGJ1F,WAGC0X,EACJE,EAAeE,EAAK1Z,OAGToZ,IACXG,EAActG,EACd4G,EAAMT,IAGR,MAAOtZ,OAGRoF,OAAQ,WAkBP,MAjBKwU,IACJvc,EAAOmE,KAAMM,UAAW,SAAUuK,EAAG7E,GACpC,GAAI4S,EACJ,QAASA,EAAQ/c,EAAO6J,QAASM,EAAKoS,EAAMQ,IAAY,GACvDR,EAAKpX,OAAQ4X,EAAO,GAEfZ,IACUE,GAATU,GACJV,IAEaC,GAATS,GACJT,OAME3Z,MAIR6T,IAAK,SAAUlV,GACd,MAAOA,GAAKtB,EAAO6J,QAASvI,EAAIib,GAAS,MAASA,IAAQA,EAAK1Z,SAGhE2U,MAAO,WAGN,MAFA+E,MACAF,EAAe,EACR1Z,MAGRka,QAAS,WAER,MADAN,GAAOC,EAAQP,EAAS1c,UACjBoD,MAGRyU,SAAU,WACT,OAAQmF,GAGTS,KAAM,WAKL,MAJAR,GAAQjd,UACF0c,GACLW,EAAKC,UAECla,MAGRsa,OAAQ,WACP,OAAQT,GAGTU,SAAU,SAAU7b,EAASgD,GAU5B,OATKkY,GAAWL,IAASM,IACxBnY,EAAOA,MACPA,GAAShD,EAASgD,EAAK1D,MAAQ0D,EAAK1D,QAAU0D,GACzC8X,EACJK,EAAM/b,KAAM4D,GAEZqY,EAAMrY,IAGD1B,MAGR+Z,KAAM,WAEL,MADAE,GAAKM,SAAUva,KAAM8B,WACd9B,MAGRuZ,MAAO,WACN,QAASA,GAIZ,OAAOU,IAER5c,EAAOoF,QAEN6F,SAAU,SAAUkS,GACnB,GAAIC,KAEA,UAAW,OAAQpd,EAAOgc,UAAU,eAAgB,aACpD,SAAU,OAAQhc,EAAOgc,UAAU,eAAgB,aACnD,SAAU,WAAYhc,EAAOgc,UAAU,YAE1CqB,EAAQ,UACR/Y,GACC+Y,MAAO,WACN,MAAOA,IAERC,OAAQ,WAEP,MADAC,GAAShZ,KAAME,WAAY+Y,KAAM/Y,WAC1B9B,MAER8a,KAAM,WACL,GAAIC,GAAMjZ,SACV,OAAOzE,GAAOiL,SAAS,SAAU0S,GAChC3d,EAAOmE,KAAMiZ,EAAQ,SAAUvY,EAAG+Y,GACjC,GAAIC,GAASD,EAAO,GACnBtc,EAAKtB,EAAOsD,WAAYoa,EAAK7Y,KAAS6Y,EAAK7Y,EAE5C0Y,GAAUK,EAAM,IAAK,WACpB,GAAIE,GAAWxc,GAAMA,EAAGkD,MAAO7B,KAAM8B,UAChCqZ,IAAY9d,EAAOsD,WAAYwa,EAASxZ,SAC5CwZ,EAASxZ,UACPC,KAAMoZ,EAASI,SACfP,KAAMG,EAASK,QACfC,SAAUN,EAASO,QAErBP,EAAUE,EAAS,QAAUlb,OAAS2B,EAAUqZ,EAASrZ,UAAY3B,KAAMrB,GAAOwc,GAAarZ,eAIlGiZ,EAAM,OACJpZ,WAIJA,QAAS,SAAUqC,GAClB,MAAc,OAAPA,EAAc3G,EAAOoF,OAAQuB,EAAKrC,GAAYA,IAGvDiZ,IAwCD,OArCAjZ,GAAQ6Z,KAAO7Z,EAAQmZ,KAGvBzd,EAAOmE,KAAMiZ,EAAQ,SAAUvY,EAAG+Y,GACjC,GAAIrB,GAAOqB,EAAO,GACjBQ,EAAcR,EAAO,EAGtBtZ,GAASsZ,EAAM,IAAOrB,EAAKO,IAGtBsB,GACJ7B,EAAKO,IAAI,WAERO,EAAQe,GAGNhB,EAAY,EAAJvY,GAAS,GAAIgY,QAASO,EAAQ,GAAK,GAAIJ,MAInDO,EAAUK,EAAM,IAAO,WAEtB,MADAL,GAAUK,EAAM,GAAK,QAAUjb,OAAS4a,EAAWjZ,EAAU3B,KAAM8B,WAC5D9B,MAER4a,EAAUK,EAAM,GAAK,QAAWrB,EAAKW,WAItC5Y,EAAQA,QAASiZ,GAGZJ,GACJA,EAAKvZ,KAAM2Z,EAAUA,GAIfA,GAIRc,KAAM,SAAUC,GACf,GAAIzZ,GAAI,EACP0Z,EAAgB7d,EAAWkD,KAAMa,WACjC5B,EAAS0b,EAAc1b,OAGvB2b,EAAuB,IAAX3b,GAAkByb,GAAete,EAAOsD,WAAYgb,EAAYha,SAAczB,EAAS,EAGnG0a,EAAyB,IAAdiB,EAAkBF,EAActe,EAAOiL,WAGlDwT,EAAa,SAAU5Z,EAAGyW,EAAUoD,GACnC,MAAO,UAAUlV,GAChB8R,EAAUzW,GAAMlC,KAChB+b,EAAQ7Z,GAAMJ,UAAU5B,OAAS,EAAInC,EAAWkD,KAAMa,WAAc+E,EAChEkV,IAAWC,EA
CdpB,EAASqB,WAAYtD,EAAUoD,KACfF,GAChBjB,EAAS/W,YAAa8U,EAAUoD,KAKnCC,EAAgBE,EAAkBC,CAGnC,IAAKjc,EAAS,EAIb,IAHA8b,EAAqB9X,MAAOhE,GAC5Bgc,EAAuBhY,MAAOhE,GAC9Bic,EAAsBjY,MAAOhE,GACjBA,EAAJgC,EAAYA,IACd0Z,EAAe1Z,IAAO7E,EAAOsD,WAAYib,EAAe1Z,GAAIP,SAChEia,EAAe1Z,GAAIP,UACjBC,KAAMka,EAAY5Z,EAAGia,EAAiBP,IACtCf,KAAMD,EAASS,QACfC,SAAUQ,EAAY5Z,EAAGga,EAAkBF,MAE3CH,CAUL,OAJMA,IACLjB,EAAS/W,YAAasY,EAAiBP,GAGjChB,EAASjZ,aAGlBtE,EAAOsL,QAAU,SAAWA,GAC3B,GAAIwH,GAAQlT,EAASiI,cAAc,SAClCkX,EAAWnf,EAASof,yBACpBjO,EAAMnR,EAASiI,cAAc,OAC7B2I,EAAS5Q,EAASiI,cAAc,UAChCoX,EAAMzO,EAAOtH,YAAatJ,EAASiI,cAAc,UAGlD,OAAMiL,GAAMlM,MAIZkM,EAAMlM,KAAO,WAIb0E,EAAQ4T,QAA0B,KAAhBpM,EAAMtJ,MAIxB8B,EAAQ6T,YAAcF,EAAI3H,SAG1BhM,EAAQ8T,qBAAsB,EAC9B9T,EAAQ+T,mBAAoB,EAC5B/T,EAAQgU,eAAgB,EAIxBxM,EAAMuE,SAAU,EAChB/L,EAAQiU,eAAiBzM,EAAM0M,WAAW,GAAOnI,QAIjD7G,EAAO4G,UAAW,EAClB9L,EAAQmU,aAAeR,EAAI7H,SAI3BtE,EAAQlT,EAASiI,cAAc,SAC/BiL,EAAMtJ,MAAQ,IACdsJ,EAAMlM,KAAO,QACb0E,EAAQoU,WAA6B,MAAhB5M,EAAMtJ,MAG3BsJ,EAAM5C,aAAc,UAAW,KAC/B4C,EAAM5C,aAAc,OAAQ,KAE5B6O,EAAS7V,YAAa4J,GAItBxH,EAAQqU,WAAaZ,EAASS,WAAW,GAAOA,WAAW,GAAOxJ,UAAUqB,QAI5E/L,EAAQsU,eAAiB,aAAetgB,GAExCyR,EAAI/F,MAAM6U,eAAiB,cAC3B9O,EAAIyO,WAAW,GAAOxU,MAAM6U,eAAiB,GAC7CvU,EAAQwU,gBAA+C,gBAA7B/O,EAAI/F,MAAM6U,eAGpC7f,EAAO,WACN,GAAI+f,GAAWC,EAEdC,EAAW,8HACXC,EAAOtgB,EAAS6I,qBAAqB,QAAS,EAEzCyX,KAKNH,EAAYngB,EAASiI,cAAc,OACnCkY,EAAU/U,MAAMmV,QAAU,gFAG1BD,EAAKhX,YAAa6W,GAAY7W,YAAa6H,GAC3CA,EAAIuB,UAAY,GAEhBvB,EAAI/F,MAAMmV,QAAU,uKAIpBngB,EAAO8K,KAAMoV,EAAyB,MAAnBA,EAAKlV,MAAMoV,MAAiBA,KAAM,MAAU,WAC9D9U,EAAQ+U,UAAgC,IAApBtP,EAAIuP,cAIpBhhB,EAAOihB,mBACXjV,EAAQgU,cAAuE,QAArDhgB,EAAOihB,iBAAkBxP,EAAK,WAAeoB,IACvE7G,EAAQ+T,kBAA2F,SAArE/f,EAAOihB,iBAAkBxP,EAAK,QAAYyP,MAAO,QAAUA,MAMzFR,EAAYjP,EAAI7H,YAAatJ,EAASiI,cAAc,QACpDmY,EAAUhV,MAAMmV,QAAUpP,EAAI/F,MAAMmV,QAAUF,EAC9CD,EAAUhV,MAAMyV,YAAcT,EAAUhV,MAAMwV,MAAQ,IACtDzP,EAAI/F,MAAMwV,MAAQ,MAElBlV,EAAQ8T,qBACNnY,YAAc3H,EAAOihB,iBAAkBP,EAAW,WAAeS,cAGpEP,EAAK/W,YAAa4W,MAGZzU,GArGCA,MAmHT,IAAIoV,GAAWC,EACdC,EAAS,+BACTC,EAAa,UAEd,SAASC,KAIRlX,OAAOmX,eAAgBpe,KAAK+N,SAAY,GACvC7M,IAAK,WACJ,YAIFlB,KAAKmD,QAAU9F,EAAO8F,QAAUC,KAAKC,SAGtC8a,EAAKE,IAAM,EAEXF,EAAKG,QAAU,SAAUC,GAOxB,MAAOA,GAAMhe,SACO,IAAnBge,EAAMhe,UAAqC,IAAnBge,EAAMhe,UAAiB,GAGjD4d,EAAKxe,WACJiI,IAAK,SAAU2W,GAId,IAAMJ,EAAKG,QAASC,GACnB,MAAO,EAGR,IAAIC,MAEHC,EAASF,EAAOve,KAAKmD,QAGtB,KAAMsb,EAAS,CACdA,EAASN,EAAKE,KAGd,KACCG,EAAYxe,KAAKmD,UAAc0D,MAAO4X,GACtCxX,OAAOyX,iBAAkBH,EAAOC,GAI/B,MAAQ/Z,GACT+Z,EAAYxe,KAAKmD,SAAYsb,EAC7BphB,EAAOoF,OAAQ8b,EAAOC,IASxB,MAJMxe,MAAK+N,MAAO0Q,KACjBze,KAAK+N,MAAO0Q,OAGNA,GAERE,IAAK,SAAUJ,EAAOzZ,EAAM+B,GAC3B,GAAI+X,GAIHH,EAASze,KAAK4H,IAAK2W,GACnBxQ,EAAQ/N,KAAK+N,MAAO0Q,EAGrB,IAAqB,gBAAT3Z,GACXiJ,EAAOjJ,GAAS+B,MAKhB,IAAKxJ,EAAOqH,cAAeqJ,GAC1B1Q,EAAOoF,OAAQzC,KAAK+N,MAAO0Q,GAAU3Z,OAGrC,KAAM8Z,IAAQ9Z,GACbiJ,EAAO6Q,GAAS9Z,EAAM8Z,EAIzB,OAAO7Q,IAER7M,IAAK,SAAUqd,EAAO3W,GAKrB,GAAImG,GAAQ/N,KAAK+N,MAAO/N,KAAK4H,IAAK2W,GAElC,OAAO3W,KAAQhL,UACdmR,EAAQA,EAAOnG,IAEjBD,OAAQ,SAAU4W,EAAO3W,EAAKf,GAC7B,GAAIgY,EAYJ,OAAKjX,KAAQhL,WACTgL,GAAsB,gBAARA,IAAqBf,IAAUjK,WAEhDiiB,EAAS7e,KAAKkB,IAAKqd,EAAO3W,GAEnBiX,IAAWjiB,UACjBiiB,EAAS7e,KAAKkB,IAAKqd,EAAOlhB,EAAOoJ,UAAUmB,MAS7C5H,KAAK2e,IAAKJ,EAAO3W,EAAKf,GAIfA,IAAUjK,UAAYiK,EAAQe,IAEtCxC,OAAQ,SAAUmZ,EAAO3W,GACxB,GAAI1F,GAAGS,EAAMmc,EACZL,EAASze,KAAK4H,IAAK2W,GACnBxQ,EAAQ/N,KAAK+N,MAAO0Q,EAErB,IAAK7W,IAAQhL,UACZoD,KAAK+N,MAAO0Q,UAEN,CAEDphB,EAAO6F,QAAS0E,GAOpBjF,EAAOiF,EAAIhK,OAAQgK,EAAIvF,IAAKhF,EAAOoJ,aAEnCqY,EAAQzhB,EAAOoJ,UAAWmB,GAErBA,IAAOmG,GACXpL,GAASiF,EAAKkX,IAIdnc,EAAOmc,EACPnc,EAAOA,IAAQoL,IACZpL,GAAWA,EAAK7C,MAAOf,SAI5BmD,E
AAIS,EAAKzC,MACT,OAAQgC,UACA6L,GAAOpL,EAAMT,MAIvB6c,QAAS,SAAUR,GAClB,OAAQlhB,EAAOqH,cACd1E,KAAK+N,MAAOwQ,EAAOve,KAAKmD,gBAG1B6b,QAAS,SAAUT,GACbA,EAAOve,KAAKmD,gBACTnD,MAAK+N,MAAOwQ,EAAOve,KAAKmD,YAMlC4a,EAAY,GAAII,GAChBH,EAAY,GAAIG,GAGhB9gB,EAAOoF,QACNwc,WAAYd,EAAKG,QAEjBS,QAAS,SAAUhf,GAClB,MAAOge,GAAUgB,QAAShf,IAAUie,EAAUe,QAAShf,IAGxD+E,KAAM,SAAU/E,EAAM4C,EAAMmC,GAC3B,MAAOiZ,GAAUpW,OAAQ5H,EAAM4C,EAAMmC,IAGtCoa,WAAY,SAAUnf,EAAM4C,GAC3Bob,EAAU3Y,OAAQrF,EAAM4C,IAKzBwc,MAAO,SAAUpf,EAAM4C,EAAMmC,GAC5B,MAAOkZ,GAAUrW,OAAQ5H,EAAM4C,EAAMmC,IAGtCsa,YAAa,SAAUrf,EAAM4C,GAC5Bqb,EAAU5Y,OAAQrF,EAAM4C,MAI1BtF,EAAOsB,GAAG8D,QACTqC,KAAM,SAAU8C,EAAKf,GACpB,GAAIyH,GAAO3L,EACV5C,EAAOC,KAAM,GACbkC,EAAI,EACJ4C,EAAO,IAGR,IAAK8C,IAAQhL,UAAY,CACxB,GAAKoD,KAAKE,SACT4E,EAAOiZ,EAAU7c,IAAKnB,GAEC,IAAlBA,EAAKQ,WAAmByd,EAAU9c,IAAKnB,EAAM,iBAAmB,CAEpE,IADAuO,EAAQvO,EAAK6K,WACD0D,EAAMpO,OAAVgC,EAAkBA,IACzBS,EAAO2L,EAAOpM,GAAIS,KAEe,IAA5BA,EAAKzE,QAAS,WAClByE,EAAOtF,EAAOoJ,UAAW9D,EAAK3E,MAAM,IACpCqhB,EAAUtf,EAAM4C,EAAMmC,EAAMnC,IAG9Bqb,GAAUW,IAAK5e,EAAM,gBAAgB,GAIvC,MAAO+E,GAIR,MAAoB,gBAAR8C,GACJ5H,KAAKwB,KAAK,WAChBuc,EAAUY,IAAK3e,KAAM4H,KAIhBvK,EAAOsK,OAAQ3H,KAAM,SAAU6G,GACrC,GAAI/B,GACHwa,EAAWjiB,EAAOoJ,UAAWmB,EAO9B,IAAK7H,GAAQ8G,IAAUjK,UAAvB,CAIC,GADAkI,EAAOiZ,EAAU7c,IAAKnB,EAAM6H,GACvB9C,IAASlI,UACb,MAAOkI,EAMR,IADAA,EAAOiZ,EAAU7c,IAAKnB,EAAMuf,GACvBxa,IAASlI,UACb,MAAOkI,EAMR,IADAA,EAAOua,EAAUtf,EAAMuf,EAAU1iB,WAC5BkI,IAASlI,UACb,MAAOkI,OAQT9E,MAAKwB,KAAK,WAGT,GAAIsD,GAAOiZ,EAAU7c,IAAKlB,KAAMsf,EAKhCvB,GAAUY,IAAK3e,KAAMsf,EAAUzY,GAKL,KAArBe,EAAI1J,QAAQ,MAAe4G,IAASlI,WACxCmhB,EAAUY,IAAK3e,KAAM4H,EAAKf,MAG1B,KAAMA,EAAO/E,UAAU5B,OAAS,EAAG,MAAM,IAG7Cgf,WAAY,SAAUtX,GACrB,MAAO5H,MAAKwB,KAAK,WAChBuc,EAAU3Y,OAAQpF,KAAM4H,OAK3B,SAASyX,GAAUtf,EAAM6H,EAAK9C,GAC7B,GAAInC,EAIJ,IAAKmC,IAASlI,WAA+B,IAAlBmD,EAAKQ,SAI/B,GAHAoC,EAAO,QAAUiF,EAAItE,QAAS4a,EAAY,OAAQtX,cAClD9B,EAAO/E,EAAKuN,aAAc3K,GAEL,gBAATmC,GAAoB,CAC/B,IACCA,EAAgB,SAATA,GAAkB,EACf,UAATA,GAAmB,EACV,SAATA,EAAkB,MAEjBA,EAAO,KAAOA,GAAQA,EACvBmZ,EAAOxd,KAAMqE,GAASS,KAAKC,MAAOV,GAClCA,EACA,MAAOL,IAGTsZ,EAAUY,IAAK5e,EAAM6H,EAAK9C,OAE1BA,GAAOlI,SAGT,OAAOkI,GAERzH,EAAOoF,QACN8c,MAAO,SAAUxf,EAAMkE,EAAMa,GAC5B,GAAIya,EAEJ,OAAKxf,IACJkE,GAASA,GAAQ,MAAS,QAC1Bsb,EAAQvB,EAAU9c,IAAKnB,EAAMkE,GAGxBa,KACEya,GAASliB,EAAO6F,QAAS4B,GAC9Bya,EAAQvB,EAAUrW,OAAQ5H,EAAMkE,EAAM5G,EAAO0D,UAAU+D,IAEvDya,EAAMzhB,KAAMgH,IAGPya,OAZR,WAgBDC,QAAS,SAAUzf,EAAMkE,GACxBA,EAAOA,GAAQ,IAEf,IAAIsb,GAAQliB,EAAOkiB,MAAOxf,EAAMkE,GAC/Bwb,EAAcF,EAAMrf,OACpBvB,EAAK4gB,EAAMtR,QACXyR,EAAQriB,EAAOsiB,YAAa5f,EAAMkE,GAClC2b,EAAO,WACNviB,EAAOmiB,QAASzf,EAAMkE;CAIZ,gBAAPtF,IACJA,EAAK4gB,EAAMtR,QACXwR,KAGI9gB,IAIU,OAATsF,GACJsb,EAAMrO,QAAS,oBAITwO,GAAMG,KACblhB,EAAGsC,KAAMlB,EAAM6f,EAAMF,KAGhBD,GAAeC,GACpBA,EAAM7K,MAAMkF,QAKd4F,YAAa,SAAU5f,EAAMkE,GAC5B,GAAI2D,GAAM3D,EAAO,YACjB,OAAO+Z,GAAU9c,IAAKnB,EAAM6H,IAASoW,EAAUrW,OAAQ5H,EAAM6H,GAC5DiN,MAAOxX,EAAOgc,UAAU,eAAec,IAAI,WAC1C6D,EAAU5Y,OAAQrF,GAAQkE,EAAO,QAAS2D,WAM9CvK,EAAOsB,GAAG8D,QACT8c,MAAO,SAAUtb,EAAMa,GACtB,GAAIgb,GAAS,CAQb,OANqB,gBAAT7b,KACXa,EAAOb,EACPA,EAAO,KACP6b,KAGuBA,EAAnBhe,UAAU5B,OACP7C,EAAOkiB,MAAOvf,KAAK,GAAIiE,GAGxBa,IAASlI,UACfoD,KACAA,KAAKwB,KAAK,WACT,GAAI+d,GAAQliB,EAAOkiB,MAAOvf,KAAMiE,EAAMa,EAGtCzH,GAAOsiB,YAAa3f,KAAMiE,GAEZ,OAATA,GAA8B,eAAbsb,EAAM,IAC3BliB,EAAOmiB,QAASxf,KAAMiE,MAI1Bub,QAAS,SAAUvb,GAClB,MAAOjE,MAAKwB,KAAK,WAChBnE,EAAOmiB,QAASxf,KAAMiE,MAKxB8b,MAAO,SAAUC,EAAM/b,GAItB,MAHA+b,GAAO3iB,EAAO4iB,GAAK5iB,EAAO4iB,GAAGC,OAAQF,IAAUA,EAAOA,EACtD/b,EAAOA,GAAQ,KAERjE,KAAKuf,MAAOtb,EAAM,SAAU2b,EAAMF,GACxC,GAAIS,GAAU3X,WAAYoX,EAAMI,EAChCN,GAAMG,KAAO,WA
CZO,aAAcD,OAIjBE,WAAY,SAAUpc,GACrB,MAAOjE,MAAKuf,MAAOtb,GAAQ,UAI5BtC,QAAS,SAAUsC,EAAMD,GACxB,GAAI2B,GACH2a,EAAQ,EACRC,EAAQljB,EAAOiL,WACf8I,EAAWpR,KACXkC,EAAIlC,KAAKE,OACTkb,EAAU,aACCkF,GACTC,EAAM1c,YAAauN,GAAYA,IAIb,iBAATnN,KACXD,EAAMC,EACNA,EAAOrH,WAERqH,EAAOA,GAAQ,IAEf,OAAO/B,IACNyD,EAAMqY,EAAU9c,IAAKkQ,EAAUlP,GAAK+B,EAAO,cACtC0B,GAAOA,EAAIkP,QACfyL,IACA3a,EAAIkP,MAAMsF,IAAKiB,GAIjB,OADAA,KACOmF,EAAM5e,QAASqC,KAGxB,IAAIwc,GAAUC,EACbC,EAAS,cACTC,EAAU,MACVC,EAAa,qCAEdvjB,GAAOsB,GAAG8D,QACT7B,KAAM,SAAU+B,EAAMkE,GACrB,MAAOxJ,GAAOsK,OAAQ3H,KAAM3C,EAAOuD,KAAM+B,EAAMkE,EAAO/E,UAAU5B,OAAS,IAG1E2gB,WAAY,SAAUle,GACrB,MAAO3C,MAAKwB,KAAK,WAChBnE,EAAOwjB,WAAY7gB,KAAM2C,MAI3Bic,KAAM,SAAUjc,EAAMkE,GACrB,MAAOxJ,GAAOsK,OAAQ3H,KAAM3C,EAAOuhB,KAAMjc,EAAMkE,EAAO/E,UAAU5B,OAAS,IAG1E4gB,WAAY,SAAUne,GACrB,MAAO3C,MAAKwB,KAAK,iBACTxB,MAAM3C,EAAO0jB,QAASpe,IAAUA,MAIzCqe,SAAU,SAAUna,GACnB,GAAIoa,GAASlhB,EAAM2O,EAAKwS,EAAO9e,EAC9BF,EAAI,EACJC,EAAMnC,KAAKE,OACXihB,EAA2B,gBAAVta,IAAsBA,CAExC,IAAKxJ,EAAOsD,WAAYkG,GACvB,MAAO7G,MAAKwB,KAAK,SAAUY,GAC1B/E,EAAQ2C,MAAOghB,SAAUna,EAAM5F,KAAMjB,KAAMoC,EAAGpC,KAAKyP,aAIrD,IAAK0R,EAIJ,IAFAF,GAAYpa,GAAS,IAAK/G,MAAOf,OAErBoD,EAAJD,EAASA,IAOhB,GANAnC,EAAOC,KAAMkC,GACbwM,EAAwB,IAAlB3O,EAAKQ,WAAoBR,EAAK0P,WACjC,IAAM1P,EAAK0P,UAAY,KAAMnM,QAASod,EAAQ,KAChD,KAGU,CACVte,EAAI,CACJ,OAAS8e,EAAQD,EAAQ7e,KACgB,EAAnCsM,EAAIxQ,QAAS,IAAMgjB,EAAQ,OAC/BxS,GAAOwS,EAAQ,IAGjBnhB,GAAK0P,UAAYpS,EAAOmB,KAAMkQ,GAMjC,MAAO1O,OAGRohB,YAAa,SAAUva,GACtB,GAAIoa,GAASlhB,EAAM2O,EAAKwS,EAAO9e,EAC9BF,EAAI,EACJC,EAAMnC,KAAKE,OACXihB,EAA+B,IAArBrf,UAAU5B,QAAiC,gBAAV2G,IAAsBA,CAElE,IAAKxJ,EAAOsD,WAAYkG,GACvB,MAAO7G,MAAKwB,KAAK,SAAUY,GAC1B/E,EAAQ2C,MAAOohB,YAAava,EAAM5F,KAAMjB,KAAMoC,EAAGpC,KAAKyP,aAGxD,IAAK0R,EAGJ,IAFAF,GAAYpa,GAAS,IAAK/G,MAAOf,OAErBoD,EAAJD,EAASA,IAQhB,GAPAnC,EAAOC,KAAMkC,GAEbwM,EAAwB,IAAlB3O,EAAKQ,WAAoBR,EAAK0P,WACjC,IAAM1P,EAAK0P,UAAY,KAAMnM,QAASod,EAAQ,KAChD,IAGU,CACVte,EAAI,CACJ,OAAS8e,EAAQD,EAAQ7e,KAExB,MAAQsM,EAAIxQ,QAAS,IAAMgjB,EAAQ,MAAS,EAC3CxS,EAAMA,EAAIpL,QAAS,IAAM4d,EAAQ,IAAK,IAGxCnhB,GAAK0P,UAAY5I,EAAQxJ,EAAOmB,KAAMkQ,GAAQ,GAKjD,MAAO1O,OAGRqhB,YAAa,SAAUxa,EAAOya,GAC7B,GAAIrd,SAAc4C,EAElB,OAAyB,iBAAbya,IAAmC,WAATrd,EAC9Bqd,EAAWthB,KAAKghB,SAAUna,GAAU7G,KAAKohB,YAAava,GAGzDxJ,EAAOsD,WAAYkG,GAChB7G,KAAKwB,KAAK,SAAUU,GAC1B7E,EAAQ2C,MAAOqhB,YAAaxa,EAAM5F,KAAKjB,KAAMkC,EAAGlC,KAAKyP,UAAW6R,GAAWA,KAItEthB,KAAKwB,KAAK,WAChB,GAAc,WAATyC,EAAoB,CAExB,GAAIwL,GACHvN,EAAI,EACJ+X,EAAO5c,EAAQ2C,MACfuhB,EAAa1a,EAAM/G,MAAOf,MAE3B,OAAS0Q,EAAY8R,EAAYrf,KAE3B+X,EAAKuH,SAAU/R,GACnBwK,EAAKmH,YAAa3R,GAElBwK,EAAK+G,SAAUvR,QAKNxL,IAASlH,GAA8B,YAATkH,KACpCjE,KAAKyP,WAETuO,EAAUW,IAAK3e,KAAM,gBAAiBA,KAAKyP,WAO5CzP,KAAKyP,UAAYzP,KAAKyP,WAAa5I,KAAU,EAAQ,GAAKmX,EAAU9c,IAAKlB,KAAM,kBAAqB,OAKvGwhB,SAAU,SAAU/iB,GACnB,GAAIgR,GAAY,IAAMhR,EAAW,IAChCyD,EAAI,EACJkF,EAAIpH,KAAKE,MACV,MAAYkH,EAAJlF,EAAOA,IACd,GAA0B,IAArBlC,KAAKkC,GAAG3B,WAAmB,IAAMP,KAAKkC,GAAGuN,UAAY,KAAKnM,QAAQod,EAAQ,KAAKxiB,QAASuR,IAAe,EAC3G,OAAO,CAIT,QAAO,GAGR4B,IAAK,SAAUxK,GACd,GAAI6Y,GAAOpe,EAAKX,EACfZ,EAAOC,KAAK,EAEb,EAAA,GAAM8B,UAAU5B,OAsBhB,MAFAS,GAAatD,EAAOsD,WAAYkG,GAEzB7G,KAAKwB,KAAK,SAAUU,GAC1B,GAAImP,EAEmB,KAAlBrR,KAAKO,WAKT8Q,EADI1Q,EACEkG,EAAM5F,KAAMjB,KAAMkC,EAAG7E,EAAQ2C,MAAOqR,OAEpCxK,EAIK,MAAPwK,EACJA,EAAM,GACoB,gBAARA,GAClBA,GAAO,GACIhU,EAAO6F,QAASmO,KAC3BA,EAAMhU,EAAOgF,IAAIgP,EAAK,SAAWxK,GAChC,MAAgB,OAATA,EAAgB,GAAKA,EAAQ,MAItC6Y,EAAQriB,EAAOokB,SAAUzhB,KAAKiE,OAAU5G,EAAOokB,SAAUzhB,KAAK2G,SAASC,eAGjE8Y,GAAW,OAASA,IAAUA,EAAMf,IAAK3e,KAAMqR,EAAK,WAAczU,YACvEoD,KAAK6G,MAAQwK,KAjDd,IAAKtR,EAGJ,MAFA2f,GAAQriB,EAAOokB,SAAU1hB,EAAKkE,OAAU5G,EAA
OokB,SAAU1hB,EAAK4G,SAASC,eAElE8Y,GAAS,OAASA,KAAUpe,EAAMoe,EAAMxe,IAAKnB,EAAM,YAAenD,UAC/D0E,GAGRA,EAAMvB,EAAK8G,MAEW,gBAARvF,GAEbA,EAAIgC,QAAQqd,EAAS,IAEd,MAAPrf,EAAc,GAAKA,OA0CxBjE,EAAOoF,QACNgf,UACCC,QACCxgB,IAAK,SAAUnB,GAGd,GAAIsR,GAAMtR,EAAK6K,WAAW/D,KAC1B,QAAQwK,GAAOA,EAAIC,UAAYvR,EAAK8G,MAAQ9G,EAAKsG,OAGnDwH,QACC3M,IAAK,SAAUnB,GACd,GAAI8G,GAAO6a,EACVhf,EAAU3C,EAAK2C,QACf0X,EAAQra,EAAK6U,cACb+M,EAAoB,eAAd5hB,EAAKkE,MAAiC,EAARmW,EACpC2B,EAAS4F,EAAM,QACfC,EAAMD,EAAMvH,EAAQ,EAAI1X,EAAQxC,OAChCgC,EAAY,EAARkY,EACHwH,EACAD,EAAMvH,EAAQ,CAGhB,MAAYwH,EAAJ1f,EAASA,IAIhB,GAHAwf,EAAShf,EAASR,MAGXwf,EAAO/M,UAAYzS,IAAMkY,IAE5B/c,EAAOsL,QAAQmU,YAAe4E,EAAOjN,SAA+C,OAApCiN,EAAOpU,aAAa,cACnEoU,EAAO5gB,WAAW2T,UAAapX,EAAOsJ,SAAU+a,EAAO5gB,WAAY,aAAiB,CAMxF,GAHA+F,EAAQxJ,EAAQqkB,GAASrQ,MAGpBsQ,EACJ,MAAO9a,EAIRkV,GAAOje,KAAM+I,GAIf,MAAOkV,IAGR4C,IAAK,SAAU5e,EAAM8G,GACpB,GAAIgb,GAAWH,EACdhf,EAAU3C,EAAK2C,QACfqZ,EAAS1e,EAAO0D,UAAW8F,GAC3B3E,EAAIQ,EAAQxC,MAEb,OAAQgC,IACPwf,EAAShf,EAASR,IACZwf,EAAO/M,SAAWtX,EAAO6J,QAAS7J,EAAOqkB,GAAQrQ,MAAO0K,IAAY,KACzE8F,GAAY,EAQd,OAHMA,KACL9hB,EAAK6U,cAAgB,IAEfmH,KAKVnb,KAAM,SAAUb,EAAM4C,EAAMkE,GAC3B,GAAI6Y,GAAOpe,EACVwgB,EAAQ/hB,EAAKQ,QAGd,IAAMR,GAAkB,IAAV+hB,GAAyB,IAAVA,GAAyB,IAAVA,EAK5C,aAAY/hB,GAAKuN,eAAiBvQ,EAC1BM,EAAOuhB,KAAM7e,EAAM4C,EAAMkE,IAKlB,IAAVib,GAAgBzkB,EAAO2b,SAAUjZ,KACrC4C,EAAOA,EAAKiE,cACZ8Y,EAAQriB,EAAO0kB,UAAWpf,KACvBtF,EAAO8T,KAAKrR,MAAM+L,KAAKpL,KAAMkC,GAAS8d,EAAWD,IAGhD3Z,IAAUjK,UAaH8iB,GAAS,OAASA,IAA6C,QAAnCpe,EAAMoe,EAAMxe,IAAKnB,EAAM4C,IACvDrB,GAGPA,EAAMjE,EAAO+C,KAAKQ,KAAMb,EAAM4C,GAGhB,MAAPrB,EACN1E,UACA0E,GApBc,OAAVuF,EAGO6Y,GAAS,OAASA,KAAUpe,EAAMoe,EAAMf,IAAK5e,EAAM8G,EAAOlE,MAAY/F,UAC1E0E,GAGPvB,EAAKwN,aAAc5K,EAAMkE,EAAQ,IAC1BA,IAPPxJ,EAAOwjB,WAAY9gB,EAAM4C,GAAzBtF,aAuBHwjB,WAAY,SAAU9gB,EAAM8G,GAC3B,GAAIlE,GAAMqf,EACT9f,EAAI,EACJ+f,EAAYpb,GAASA,EAAM/G,MAAOf,EAEnC,IAAKkjB,GAA+B,IAAlBliB,EAAKQ,SACtB,MAASoC,EAAOsf,EAAU/f,KACzB8f,EAAW3kB,EAAO0jB,QAASpe,IAAUA,EAGhCtF,EAAO8T,KAAKrR,MAAM+L,KAAKpL,KAAMkC,KAEjC5C,EAAMiiB,IAAa,GAGpBjiB,EAAK6N,gBAAiBjL,IAKzBof,WACC9d,MACC0a,IAAK,SAAU5e,EAAM8G,GACpB,IAAMxJ,EAAOsL,QAAQoU,YAAwB,UAAVlW,GAAqBxJ,EAAOsJ,SAAS5G,EAAM,SAAW,CAGxF,GAAIsR,GAAMtR,EAAK8G,KAKf,OAJA9G,GAAKwN,aAAc,OAAQ1G,GACtBwK,IACJtR,EAAK8G,MAAQwK,GAEPxK,MAMXka,SACCmB,MAAO,UACPC,QAAS,aAGVvD,KAAM,SAAU7e,EAAM4C,EAAMkE,GAC3B,GAAIvF,GAAKoe,EAAO0C,EACfN,EAAQ/hB,EAAKQ,QAGd,IAAMR,GAAkB,IAAV+hB,GAAyB,IAAVA,GAAyB,IAAVA,EAY5C,MARAM,GAAmB,IAAVN,IAAgBzkB,EAAO2b,SAAUjZ,GAErCqiB,IAEJzf,EAAOtF,EAAO0jB,QAASpe,IAAUA,EACjC+c,EAAQriB,EAAOglB,UAAW1f,IAGtBkE,IAAUjK,UACP8iB,GAAS,OAASA,KAAUpe,EAAMoe,EAAMf,IAAK5e,EAAM8G,EAAOlE,MAAY/F,UAC5E0E,EACEvB,EAAM4C,GAASkE,EAGX6Y,GAAS,OAASA,IAA6C,QAAnCpe,EAAMoe,EAAMxe,IAAKnB,EAAM4C,IACzDrB,EACAvB,EAAM4C,IAIT0f,WACC9N,UACCrT,IAAK,SAAUnB,GACd,MAAOA,GAAKuiB,aAAc,aAAgB1B,EAAWngB,KAAMV,EAAK4G,WAAc5G,EAAKuU,KAClFvU,EAAKwU,SACL,QAOLkM,GACC9B,IAAK,SAAU5e,EAAM8G,EAAOlE,GAO3B,MANKkE,MAAU,EAEdxJ,EAAOwjB,WAAY9gB,EAAM4C,GAEzB5C,EAAKwN,aAAc5K,EAAMA,GAEnBA,IAGTtF,EAAOmE,KAAMnE,EAAO8T,KAAKrR,MAAM+L,KAAK/M,OAAOgB,MAAO,QAAU,SAAUoC,EAAGS,GACxE,GAAI4f,GAASllB,EAAO8T,KAAK3C,WAAY7L,IAAUtF,EAAO+C,KAAKQ,IAE3DvD,GAAO8T,KAAK3C,WAAY7L,GAAS,SAAU5C,EAAM4C,EAAMoG,GACtD,GAAIpK,GAAKtB,EAAO8T,KAAK3C,WAAY7L,GAChCrB,EAAMyH,EACLnM,WAGCS,EAAO8T,KAAK3C,WAAY7L,GAAS/F,YACjC2lB,EAAQxiB,EAAM4C,EAAMoG,GAEpBpG,EAAKiE,cACL,IAKH,OAFAvJ,GAAO8T,KAAK3C,WAAY7L,GAAShE,EAE1B2C,KAMHjE,EAAOsL,QAAQ6T,cACpBnf,EAAOglB,UAAU1N,UAChBzT,IAAK,SAAUnB,GACd,GAAIsP,GAAStP,EAAKe,UAIlB,OAHKuO,IAAUA,EAAOvO,YACrBuO,EAAOvO,WAAW8T,cAEZ,QAKVvX,EAAOmE,MACN,WACA,WACA,YACA,cACA,cACA,UACA,UACA,SACA,cACA,mBACE,WACF
nE,EAAO0jB,QAAS/gB,KAAK4G,eAAkB5G,OAIxC3C,EAAOmE,MAAO,QAAS,YAAc,WACpCnE,EAAOokB,SAAUzhB,OAChB2e,IAAK,SAAU5e,EAAM8G,GACpB,MAAKxJ,GAAO6F,QAAS2D,GACX9G,EAAK2U,QAAUrX,EAAO6J,QAAS7J,EAAO0C,GAAMsR,MAAOxK,IAAW,EADxE,YAKIxJ,EAAOsL,QAAQ4T,UACpBlf,EAAOokB,SAAUzhB,MAAOkB,IAAM,SAAUnB,GAGvC,MAAsC,QAA/BA,EAAKuN,aAAa,SAAoB,KAAOvN,EAAK8G,SAI5D,IAAI2b,GAAY,OACfC,EAAc,+BACdC,EAAc,kCACdC,EAAiB,sBAElB,SAASC,KACR,OAAO,EAGR,QAASC,KACR,OAAO,EAGR,QAASC,KACR,IACC,MAAO7lB,GAASmX,cACf,MAAQ2O,KAOX1lB,EAAO2lB,OAENC,UAEA9I,IAAK,SAAUpa,EAAMmjB,EAAO3U,EAASzJ,EAAMrG,GAE1C,GAAI0kB,GAAaC,EAAazd,EAC7B0d,EAAQC,EAAGC,EACXC,EAASC,EAAUxf,EAAMyf,EAAYC,EACrCC,EAAW5F,EAAU9c,IAAKnB,EAG3B,IAAM6jB,EAAN,CAKKrV,EAAQA,UACZ4U,EAAc5U,EACdA,EAAU4U,EAAY5U,QACtB9P,EAAW0kB,EAAY1kB,UAIlB8P,EAAQ9G,OACb8G,EAAQ9G,KAAOpK,EAAOoK,SAIhB4b,EAASO,EAASP,UACxBA,EAASO,EAASP,YAEZD,EAAcQ,EAASC,UAC7BT,EAAcQ,EAASC,OAAS,SAAUpf,GAGzC,aAAcpH,KAAWN,GAAuB0H,GAAKpH,EAAO2lB,MAAMc,YAAcrf,EAAER,KAEjFrH,UADAS,EAAO2lB,MAAMe,SAASliB,MAAOuhB,EAAYrjB,KAAM+B,YAIjDshB,EAAYrjB,KAAOA,GAIpBmjB,GAAUA,GAAS,IAAKpjB,MAAOf,KAAqB,IACpDukB,EAAIJ,EAAMhjB,MACV,OAAQojB,IACP3d,EAAMgd,EAAexiB,KAAM+iB,EAAMI,QACjCrf,EAAO0f,EAAWhe,EAAI,GACtB+d,GAAe/d,EAAI,IAAM,IAAK+C,MAAO,KAAMnG,OAGrC0B,IAKNuf,EAAUnmB,EAAO2lB,MAAMQ,QAASvf,OAGhCA,GAASxF,EAAW+kB,EAAQQ,aAAeR,EAAQS,WAAchgB,EAGjEuf,EAAUnmB,EAAO2lB,MAAMQ,QAASvf,OAGhCsf,EAAYlmB,EAAOoF,QAClBwB,KAAMA,EACN0f,SAAUA,EACV7e,KAAMA,EACNyJ,QAASA,EACT9G,KAAM8G,EAAQ9G,KACdhJ,SAAUA,EACVqN,aAAcrN,GAAYpB,EAAO8T,KAAKrR,MAAMgM,aAAarL,KAAMhC,GAC/DylB,UAAWR,EAAWjW,KAAK,MACzB0V,IAGIM,EAAWJ,EAAQpf,MACzBwf,EAAWJ,EAAQpf,MACnBwf,EAASU,cAAgB,EAGnBX,EAAQY,OAASZ,EAAQY,MAAMnjB,KAAMlB,EAAM+E,EAAM4e,EAAYN,MAAkB,GAC/ErjB,EAAK0I,kBACT1I,EAAK0I,iBAAkBxE,EAAMmf,GAAa,IAKxCI,EAAQrJ,MACZqJ,EAAQrJ,IAAIlZ,KAAMlB,EAAMwjB,GAElBA,EAAUhV,QAAQ9G,OACvB8b,EAAUhV,QAAQ9G,KAAO8G,EAAQ9G,OAK9BhJ,EACJglB,EAASjhB,OAAQihB,EAASU,gBAAiB,EAAGZ,GAE9CE,EAAS3lB,KAAMylB,GAIhBlmB,EAAO2lB,MAAMC,OAAQhf,IAAS,EAI/BlE,GAAO,OAIRqF,OAAQ,SAAUrF,EAAMmjB,EAAO3U,EAAS9P,EAAU4lB,GAEjD,GAAIjiB,GAAGkiB,EAAW3e,EACjB0d,EAAQC,EAAGC,EACXC,EAASC,EAAUxf,EAAMyf,EAAYC,EACrCC,EAAW5F,EAAUe,QAAShf,IAAUie,EAAU9c,IAAKnB,EAExD,IAAM6jB,IAAcP,EAASO,EAASP,QAAtC,CAKAH,GAAUA,GAAS,IAAKpjB,MAAOf,KAAqB,IACpDukB,EAAIJ,EAAMhjB,MACV,OAAQojB,IAMP,GALA3d,EAAMgd,EAAexiB,KAAM+iB,EAAMI,QACjCrf,EAAO0f,EAAWhe,EAAI,GACtB+d,GAAe/d,EAAI,IAAM,IAAK+C,MAAO,KAAMnG,OAGrC0B,EAAN,CAOAuf,EAAUnmB,EAAO2lB,MAAMQ,QAASvf,OAChCA,GAASxF,EAAW+kB,EAAQQ,aAAeR,EAAQS,WAAchgB,EACjEwf,EAAWJ,EAAQpf,OACnB0B,EAAMA,EAAI,IAAUoF,OAAQ,UAAY2Y,EAAWjW,KAAK,iBAAmB,WAG3E6W,EAAYliB,EAAIqhB,EAASvjB,MACzB,OAAQkC,IACPmhB,EAAYE,EAAUrhB,IAEfiiB,GAAeV,IAAaJ,EAAUI,UACzCpV,GAAWA,EAAQ9G,OAAS8b,EAAU9b,MACtC9B,IAAOA,EAAIlF,KAAM8iB,EAAUW,YAC3BzlB,GAAYA,IAAa8kB,EAAU9kB,WAAyB,OAAbA,IAAqB8kB,EAAU9kB,YACjFglB,EAASjhB,OAAQJ,EAAG,GAEfmhB,EAAU9kB,UACdglB,EAASU,gBAELX,EAAQpe,QACZoe,EAAQpe,OAAOnE,KAAMlB,EAAMwjB,GAOzBe,KAAcb,EAASvjB,SACrBsjB,EAAQe,UAAYf,EAAQe,SAAStjB,KAAMlB,EAAM2jB,EAAYE,EAASC,WAAa,GACxFxmB,EAAOmnB,YAAazkB,EAAMkE,EAAM2f,EAASC,cAGnCR,GAAQpf,QAtCf,KAAMA,IAAQof,GACbhmB,EAAO2lB,MAAM5d,OAAQrF,EAAMkE,EAAOif,EAAOI,GAAK/U,EAAS9P,GAAU,EA0C/DpB,GAAOqH,cAAe2e,WACnBO,GAASC,OAChB7F,EAAU5Y,OAAQrF,EAAM,aAI1B+D,QAAS,SAAUkf,EAAOle,EAAM/E,EAAM0kB,GAErC,GAAIviB,GAAGwM,EAAK/I,EAAK+e,EAAYC,EAAQd,EAAQL,EAC5CoB,GAAc7kB,GAAQ9C,GACtBgH,EAAO5F,EAAY4C,KAAM+hB,EAAO,QAAWA,EAAM/e,KAAO+e,EACxDU,EAAarlB,EAAY4C,KAAM+hB,EAAO,aAAgBA,EAAMkB,UAAUxb,MAAM,OAK7E,IAHAgG,EAAM/I,EAAM5F,EAAOA,GAAQ9C,EAGJ,IAAlB8C,EAAKQ,UAAoC,IAAlBR,EAAKQ,WAK5BmiB,EAAYjiB,KAAMwD,EAAO5G,EAAO2lB,MAAMc,aAItC7f,EAAK/F,QAAQ,MAAQ,IAEzBwlB,EAAazf,EAAKyE,MAAM,KACxBzE,E
AAOyf,EAAWzV,QAClByV,EAAWnhB,QAEZoiB,EAA6B,EAApB1gB,EAAK/F,QAAQ,MAAY,KAAO+F,EAGzC+e,EAAQA,EAAO3lB,EAAO8F,SACrB6f,EACA,GAAI3lB,GAAOwnB,MAAO5gB,EAAuB,gBAAV+e,IAAsBA,GAGtDA,EAAM8B,UAAYL,EAAe,EAAI,EACrCzB,EAAMkB,UAAYR,EAAWjW,KAAK,KAClCuV,EAAM+B,aAAe/B,EAAMkB,UACtBnZ,OAAQ,UAAY2Y,EAAWjW,KAAK,iBAAmB,WAC3D,KAGDuV,EAAMpQ,OAAShW,UACTomB,EAAMhgB,SACXggB,EAAMhgB,OAASjD,GAIhB+E,EAAe,MAARA,GACJke,GACF3lB,EAAO0D,UAAW+D,GAAQke,IAG3BQ,EAAUnmB,EAAO2lB,MAAMQ,QAASvf,OAC1BwgB,IAAgBjB,EAAQ1f,SAAW0f,EAAQ1f,QAAQjC,MAAO9B,EAAM+E,MAAW,GAAjF,CAMA,IAAM2f,IAAiBjB,EAAQwB,WAAa3nB,EAAO8G,SAAUpE,GAAS,CAMrE,IAJA2kB,EAAalB,EAAQQ,cAAgB/f,EAC/Bye,EAAYjiB,KAAMikB,EAAazgB,KACpCyK,EAAMA,EAAI5N,YAEH4N,EAAKA,EAAMA,EAAI5N,WACtB8jB,EAAU9mB,KAAM4Q,GAChB/I,EAAM+I,CAIF/I,MAAS5F,EAAKS,eAAiBvD,IACnC2nB,EAAU9mB,KAAM6H,EAAI2J,aAAe3J,EAAIsf,cAAgBtoB,GAKzDuF,EAAI,CACJ,QAASwM,EAAMkW,EAAU1iB,QAAU8gB,EAAMkC,uBAExClC,EAAM/e,KAAO/B,EAAI,EAChBwiB,EACAlB,EAAQS,UAAYhgB,EAGrB4f,GAAW7F,EAAU9c,IAAKwN,EAAK,eAAoBsU,EAAM/e,OAAU+Z,EAAU9c,IAAKwN,EAAK,UAClFmV,GACJA,EAAOhiB,MAAO6M,EAAK5J,GAIpB+e,EAASc,GAAUjW,EAAKiW,GACnBd,GAAUxmB,EAAO4hB,WAAYvQ,IAASmV,EAAOhiB,OAASgiB,EAAOhiB,MAAO6M,EAAK5J,MAAW,GACxFke,EAAMmC,gBAkCR,OA/BAnC,GAAM/e,KAAOA,EAGPwgB,GAAiBzB,EAAMoC,sBAErB5B,EAAQ6B,UAAY7B,EAAQ6B,SAASxjB,MAAO+iB,EAAUta,MAAOxF,MAAW,IAC9EzH,EAAO4hB,WAAYlf,IAId4kB,GAAUtnB,EAAOsD,WAAYZ,EAAMkE,MAAa5G,EAAO8G,SAAUpE,KAGrE4F,EAAM5F,EAAM4kB,GAEPhf,IACJ5F,EAAM4kB,GAAW,MAIlBtnB,EAAO2lB,MAAMc,UAAY7f,EACzBlE,EAAMkE,KACN5G,EAAO2lB,MAAMc,UAAYlnB,UAEpB+I,IACJ5F,EAAM4kB,GAAWhf,IAMdqd,EAAMpQ,SAGdmR,SAAU,SAAUf,GAGnBA,EAAQ3lB,EAAO2lB,MAAMsC,IAAKtC,EAE1B,IAAI9gB,GAAGE,EAAGd,EAAKmS,EAAS8P,EACvBgC,KACA7jB,EAAO3D,EAAWkD,KAAMa,WACxB2hB,GAAazF,EAAU9c,IAAKlB,KAAM,eAAoBgjB,EAAM/e,UAC5Duf,EAAUnmB,EAAO2lB,MAAMQ,QAASR,EAAM/e,SAOvC,IAJAvC,EAAK,GAAKshB,EACVA,EAAMwC,eAAiBxlB,MAGlBwjB,EAAQiC,aAAejC,EAAQiC,YAAYxkB,KAAMjB,KAAMgjB,MAAY,EAAxE,CAKAuC,EAAeloB,EAAO2lB,MAAMS,SAASxiB,KAAMjB,KAAMgjB,EAAOS,GAGxDvhB,EAAI,CACJ,QAASuR,EAAU8R,EAAcrjB,QAAW8gB,EAAMkC,uBAAyB,CAC1ElC,EAAM0C,cAAgBjS,EAAQ1T,KAE9BqC,EAAI,CACJ,QAASmhB,EAAY9P,EAAQgQ,SAAUrhB,QAAW4gB,EAAM2C,kCAIjD3C,EAAM+B,cAAgB/B,EAAM+B,aAAatkB,KAAM8iB,EAAUW,cAE9DlB,EAAMO,UAAYA,EAClBP,EAAMle,KAAOye,EAAUze,KAEvBxD,IAASjE,EAAO2lB,MAAMQ,QAASD,EAAUI,eAAkBE,QAAUN,EAAUhV,SAC5E1M,MAAO4R,EAAQ1T,KAAM2B,GAEnBJ,IAAQ1E,YACNomB,EAAMpQ,OAAStR,MAAS,IAC7B0hB,EAAMmC,iBACNnC,EAAM4C,oBAYX,MAJKpC,GAAQqC,cACZrC,EAAQqC,aAAa5kB,KAAMjB,KAAMgjB,GAG3BA,EAAMpQ,SAGd6Q,SAAU,SAAUT,EAAOS,GAC1B,GAAIvhB,GAAGqH,EAASuc,EAAKvC,EACpBgC,KACApB,EAAgBV,EAASU,cACzBzV,EAAMsU,EAAMhgB,MAKb,IAAKmhB,GAAiBzV,EAAInO,YAAcyiB,EAAMjO,QAAyB,UAAfiO,EAAM/e,MAE7D,KAAQyK,IAAQ1O,KAAM0O,EAAMA,EAAI5N,YAAcd,KAG7C,GAAK0O,EAAI+F,YAAa,GAAuB,UAAfuO,EAAM/e,KAAmB,CAEtD,IADAsF,KACMrH,EAAI,EAAOiiB,EAAJjiB,EAAmBA,IAC/BqhB,EAAYE,EAAUvhB,GAGtB4jB,EAAMvC,EAAU9kB,SAAW,IAEtB8K,EAASuc,KAAUlpB,YACvB2M,EAASuc,GAAQvC,EAAUzX,aAC1BzO,EAAQyoB,EAAK9lB,MAAOoa,MAAO1L,IAAS,EACpCrR,EAAO+C,KAAM0lB,EAAK9lB,KAAM,MAAQ0O,IAAQxO,QAErCqJ,EAASuc,IACbvc,EAAQzL,KAAMylB,EAGXha,GAAQrJ,QACZqlB,EAAaznB,MAAOiC,KAAM2O,EAAK+U,SAAUla,IAW7C,MAJqBka,GAASvjB,OAAzBikB,GACJoB,EAAaznB,MAAOiC,KAAMC,KAAMyjB,SAAUA,EAASzlB,MAAOmmB,KAGpDoB,GAIRQ,MAAO,wHAAwHrd,MAAM,KAErIsd,YAEAC,UACCF,MAAO,4BAA4Brd,MAAM,KACzCqH,OAAQ,SAAUiT,EAAOkD,GAOxB,MAJoB,OAAflD,EAAMmD,QACVnD,EAAMmD,MAA6B,MAArBD,EAASE,SAAmBF,EAASE,SAAWF,EAASG,SAGjErD,IAITsD,YACCP,MAAO,uFAAuFrd,MAAM,KACpGqH,OAAQ,SAAUiT,EAAOkD,GACxB,GAAIK,GAAUnX,EAAKmO,EAClBxI,EAASmR,EAASnR,MAkBnB,OAfoB,OAAfiO,EAAMwD,OAAqC,MAApBN,EAASO,UACpCF,EAAWvD,EAAMhgB,OAAOxC,eAAiBvD,EACzCmS,EAAMmX,EAASppB,gBACfogB,EAAOgJ,EAAShJ,KAEhByF,EAAMwD,MAAQN,EA
ASO,SAAYrX,GAAOA,EAAIsX,YAAcnJ,GAAQA,EAAKmJ,YAAc,IAAQtX,GAAOA,EAAIuX,YAAcpJ,GAAQA,EAAKoJ,YAAc,GACnJ3D,EAAM4D,MAAQV,EAASW,SAAYzX,GAAOA,EAAI0X,WAAcvJ,GAAQA,EAAKuJ,WAAc,IAAQ1X,GAAOA,EAAI2X,WAAcxJ,GAAQA,EAAKwJ,WAAc,IAK9I/D,EAAMmD,OAASpR,IAAWnY,YAC/BomB,EAAMmD,MAAmB,EAATpR,EAAa,EAAe,EAATA,EAAa,EAAe,EAATA,EAAa,EAAI,GAGjEiO,IAITsC,IAAK,SAAUtC,GACd,GAAKA,EAAO3lB,EAAO8F,SAClB,MAAO6f,EAIR,IAAI9gB,GAAG0c,EAAM/b,EACZoB,EAAO+e,EAAM/e,KACb+iB,EAAgBhE,EAChBiE,EAAUjnB,KAAKgmB,SAAU/hB,EAEpBgjB,KACLjnB,KAAKgmB,SAAU/hB,GAASgjB,EACvBxE,EAAYhiB,KAAMwD,GAASjE,KAAKsmB,WAChC9D,EAAU/hB,KAAMwD,GAASjE,KAAKimB,aAGhCpjB,EAAOokB,EAAQlB,MAAQ/lB,KAAK+lB,MAAMnoB,OAAQqpB,EAAQlB,OAAU/lB,KAAK+lB,MAEjE/C,EAAQ,GAAI3lB,GAAOwnB,MAAOmC,GAE1B9kB,EAAIW,EAAK3C,MACT,OAAQgC,IACP0c,EAAO/b,EAAMX,GACb8gB,EAAOpE,GAASoI,EAAepI,EAehC,OAVMoE,GAAMhgB,SACXggB,EAAMhgB,OAAS/F,GAKe,IAA1B+lB,EAAMhgB,OAAOzC,WACjByiB,EAAMhgB,OAASggB,EAAMhgB,OAAOlC,YAGtBmmB,EAAQlX,OAAQkX,EAAQlX,OAAQiT,EAAOgE,GAAkBhE,GAGjEQ,SACC0D,MAEClC,UAAU,GAEX7Q,OAECrQ,QAAS,WACR,MAAK9D,QAAS8iB,KAAuB9iB,KAAKmU,OACzCnU,KAAKmU,SACE,GAFR,WAKD6P,aAAc,WAEfmD,MACCrjB,QAAS,WACR,MAAK9D,QAAS8iB,KAAuB9iB,KAAKmnB,MACzCnnB,KAAKmnB,QACE,GAFR,WAKDnD,aAAc,YAEfoD,OAECtjB,QAAS,WACR,MAAmB,aAAd9D,KAAKiE,MAAuBjE,KAAKonB,OAAS/pB,EAAOsJ,SAAU3G,KAAM,UACrEA,KAAKonB,SACE,GAFR,WAOD/B,SAAU,SAAUrC,GACnB,MAAO3lB,GAAOsJ,SAAUqc,EAAMhgB,OAAQ,OAIxCqkB,cACCxB,aAAc,SAAU7C,GAIlBA,EAAMpQ,SAAWhW,YACrBomB,EAAMgE,cAAcM,YAActE,EAAMpQ,WAM5C2U,SAAU,SAAUtjB,EAAMlE,EAAMijB,EAAOwE,GAItC,GAAI/iB,GAAIpH,EAAOoF,OACd,GAAIpF,GAAOwnB,MACX7B,GAEC/e,KAAMA,EACNwjB,aAAa,EACbT,kBAGGQ,GACJnqB,EAAO2lB,MAAMlf,QAASW,EAAG,KAAM1E,GAE/B1C,EAAO2lB,MAAMe,SAAS9iB,KAAMlB,EAAM0E,GAE9BA,EAAE2gB,sBACNpC,EAAMmC,mBAKT9nB,EAAOmnB,YAAc,SAAUzkB,EAAMkE,EAAM4f,GACrC9jB,EAAKN,qBACTM,EAAKN,oBAAqBwE,EAAM4f,GAAQ,IAI1CxmB,EAAOwnB,MAAQ,SAAUjiB,EAAKmjB,GAE7B,MAAO/lB,gBAAgB3C,GAAOwnB,OAKzBjiB,GAAOA,EAAIqB,MACfjE,KAAKgnB,cAAgBpkB,EACrB5C,KAAKiE,KAAOrB,EAAIqB,KAIhBjE,KAAKolB,mBAAuBxiB,EAAI8kB,kBAC/B9kB,EAAI+kB,mBAAqB/kB,EAAI+kB,oBAAwB/E,EAAaC,GAInE7iB,KAAKiE,KAAOrB,EAIRmjB,GACJ1oB,EAAOoF,OAAQzC,KAAM+lB,GAItB/lB,KAAK4nB,UAAYhlB,GAAOA,EAAIglB,WAAavqB,EAAO4K,MAGhDjI,KAAM3C,EAAO8F,UAAY,EAvBzB,WAJQ,GAAI9F,GAAOwnB,MAAOjiB,EAAKmjB,IAgChC1oB,EAAOwnB,MAAMllB,WACZylB,mBAAoBvC,EACpBqC,qBAAsBrC,EACtB8C,8BAA+B9C,EAE/BsC,eAAgB,WACf,GAAI1gB,GAAIzE,KAAKgnB,aAEbhnB,MAAKolB,mBAAqBxC,EAErBne,GAAKA,EAAE0gB,gBACX1gB,EAAE0gB,kBAGJS,gBAAiB,WAChB,GAAInhB,GAAIzE,KAAKgnB,aAEbhnB,MAAKklB,qBAAuBtC,EAEvBne,GAAKA,EAAEmhB,iBACXnhB,EAAEmhB,mBAGJiC,yBAA0B,WACzB7nB,KAAK2lB,8BAAgC/C,EACrC5iB,KAAK4lB,oBAMPvoB,EAAOmE,MACNsmB,WAAY,YACZC,WAAY,YACV,SAAUC,EAAM1C,GAClBjoB,EAAO2lB,MAAMQ,QAASwE,IACrBhE,aAAcsB,EACdrB,SAAUqB,EAEVzB,OAAQ,SAAUb,GACjB,GAAI1hB,GACH0B,EAAShD,KACTioB,EAAUjF,EAAMkF,cAChB3E,EAAYP,EAAMO,SASnB,SALM0E,GAAYA,IAAYjlB,IAAW3F,EAAOmM,SAAUxG,EAAQilB,MACjEjF,EAAM/e,KAAOsf,EAAUI,SACvBriB,EAAMiiB,EAAUhV,QAAQ1M,MAAO7B,KAAM8B,WACrCkhB,EAAM/e,KAAOqhB,GAEPhkB,MAOJjE,EAAOsL,QAAQsU,gBACpB5f,EAAOmE,MAAO2S,MAAO,UAAWgT,KAAM,YAAc,SAAUa,EAAM1C,GAGnE,GAAI6C,GAAW,EACd5Z,EAAU,SAAUyU,GACnB3lB,EAAO2lB,MAAMuE,SAAUjC,EAAKtC,EAAMhgB,OAAQ3F,EAAO2lB,MAAMsC,IAAKtC,IAAS,GAGvE3lB,GAAO2lB,MAAMQ,QAAS8B,IACrBlB,MAAO,WACc,IAAf+D,KACJlrB,EAASwL,iBAAkBuf,EAAMzZ,GAAS,IAG5CgW,SAAU,WACW,MAAb4D,GACNlrB,EAASwC,oBAAqBuoB,EAAMzZ,GAAS,OAOlDlR,EAAOsB,GAAG8D,QAET2lB,GAAI,SAAUlF,EAAOzkB,EAAUqG,EAAMnG,EAAiBgjB,GACrD,GAAI0G,GAAQpkB,CAGZ,IAAsB,gBAAVif,GAAqB,CAEP,gBAAbzkB,KAEXqG,EAAOA,GAAQrG,EACfA,EAAW7B,UAEZ,KAAMqH,IAAQif,GACbljB,KAAKooB,GAAInkB,EAAMxF,EAAUqG,EAAMoe,EAAOjf,GAAQ0d,EAE/C,OAAO3hB,MAmBR,GAhBa,MAAR8E,GAAsB,MAANnG,GAEpBA,EAAKF,EACLq
G,EAAOrG,EAAW7B,WACD,MAAN+B,IACc,gBAAbF,IAEXE,EAAKmG,EACLA,EAAOlI,YAGP+B,EAAKmG,EACLA,EAAOrG,EACPA,EAAW7B,YAGR+B,KAAO,EACXA,EAAKkkB,MACC,KAAMlkB,EACZ,MAAOqB,KAaR,OAVa,KAAR2hB,IACJ0G,EAAS1pB,EACTA,EAAK,SAAUqkB,GAGd,MADA3lB,KAAS0G,IAAKif,GACPqF,EAAOxmB,MAAO7B,KAAM8B,YAG5BnD,EAAG8I,KAAO4gB,EAAO5gB,OAAU4gB,EAAO5gB,KAAOpK,EAAOoK,SAE1CzH,KAAKwB,KAAM,WACjBnE,EAAO2lB,MAAM7I,IAAKna,KAAMkjB,EAAOvkB,EAAImG,EAAMrG,MAG3CkjB,IAAK,SAAUuB,EAAOzkB,EAAUqG,EAAMnG,GACrC,MAAOqB,MAAKooB,GAAIlF,EAAOzkB,EAAUqG,EAAMnG,EAAI,IAE5CoF,IAAK,SAAUmf,EAAOzkB,EAAUE,GAC/B,GAAI4kB,GAAWtf,CACf,IAAKif,GAASA,EAAMiC,gBAAkBjC,EAAMK,UAQ3C,MANAA,GAAYL,EAAMK,UAClBlmB,EAAQ6lB,EAAMsC,gBAAiBzhB,IAC9Bwf,EAAUW,UAAYX,EAAUI,SAAW,IAAMJ,EAAUW,UAAYX,EAAUI,SACjFJ,EAAU9kB,SACV8kB,EAAUhV,SAEJvO,IAER,IAAsB,gBAAVkjB,GAAqB,CAEhC,IAAMjf,IAAQif,GACbljB,KAAK+D,IAAKE,EAAMxF,EAAUykB,EAAOjf,GAElC,OAAOjE,MAUR,OARKvB,KAAa,GAA6B,kBAAbA,MAEjCE,EAAKF,EACLA,EAAW7B,WAEP+B,KAAO,IACXA,EAAKkkB,GAEC7iB,KAAKwB,KAAK,WAChBnE,EAAO2lB,MAAM5d,OAAQpF,KAAMkjB,EAAOvkB,EAAIF,MAIxCqF,QAAS,SAAUG,EAAMa,GACxB,MAAO9E,MAAKwB,KAAK,WAChBnE,EAAO2lB,MAAMlf,QAASG,EAAMa,EAAM9E,SAGpCsoB,eAAgB,SAAUrkB,EAAMa,GAC/B,GAAI/E,GAAOC,KAAK,EAChB,OAAKD,GACG1C,EAAO2lB,MAAMlf,QAASG,EAAMa,EAAM/E,GAAM,GADhD,YAKF,IAAIwoB,GAAW,iBACdC,EAAe,iCACfC,EAAgBprB,EAAO8T,KAAKrR,MAAMgM,aAElC4c,GACCC,UAAU,EACVC,UAAU,EACVhJ,MAAM,EACNiJ,MAAM,EAGRxrB,GAAOsB,GAAG8D,QACTrC,KAAM,SAAU3B,GACf,GAAIyD,GACHZ,KACA2Y,EAAOja,KACPmC,EAAM8X,EAAK/Z,MAEZ,IAAyB,gBAAbzB,GACX,MAAOuB,MAAKoB,UAAW/D,EAAQoB,GAAWsR,OAAO,WAChD,IAAM7N,EAAI,EAAOC,EAAJD,EAASA,IACrB,GAAK7E,EAAOmM,SAAUyQ,EAAM/X,GAAKlC,MAChC,OAAO,IAMX,KAAMkC,EAAI,EAAOC,EAAJD,EAASA,IACrB7E,EAAO+C,KAAM3B,EAAUwb,EAAM/X,GAAKZ,EAMnC,OAFAA,GAAMtB,KAAKoB,UAAWe,EAAM,EAAI9E,EAAO0b,OAAQzX,GAAQA,GACvDA,EAAI7C,SAAWuB,KAAKvB,SAAWuB,KAAKvB,SAAW,IAAMA,EAAWA,EACzD6C,GAGRuS,IAAK,SAAU7Q,GACd,GAAI8lB,GAAUzrB,EAAQ2F,EAAQhD,MAC7BoH,EAAI0hB,EAAQ5oB,MAEb,OAAOF,MAAK+P,OAAO,WAClB,GAAI7N,GAAI,CACR,MAAYkF,EAAJlF,EAAOA,IACd,GAAK7E,EAAOmM,SAAUxJ,KAAM8oB,EAAQ5mB,IACnC,OAAO,KAMXwR,IAAK,SAAUjV,GACd,MAAOuB,MAAKoB,UAAW2nB,GAAO/oB,KAAMvB,OAAgB,KAGrDsR,OAAQ,SAAUtR,GACjB,MAAOuB,MAAKoB,UAAW2nB,GAAO/oB,KAAMvB,OAAgB,KAGrDuqB,GAAI,SAAUvqB,GACb,QAASsqB,GACR/oB,KAIoB,gBAAbvB,IAAyBgqB,EAAchoB,KAAMhC,GACnDpB,EAAQoB,GACRA,OACD,GACCyB,QAGH+oB,QAAS,SAAUpX,EAAWnT,GAC7B,GAAIgQ,GACHxM,EAAI,EACJkF,EAAIpH,KAAKE,OACTuT,KACAyV,EAAQT,EAAchoB,KAAMoR,IAAoC,gBAAdA,GACjDxU,EAAQwU,EAAWnT,GAAWsB,KAAKtB,SACnC,CAEF,MAAY0I,EAAJlF,EAAOA,IACd,IAAMwM,EAAM1O,KAAKkC,GAAIwM,GAAOA,IAAQhQ,EAASgQ,EAAMA,EAAI5N,WAEtD,GAAoB,GAAf4N,EAAInO,WAAkB2oB,EAC1BA,EAAI9O,MAAM1L,GAAO,GAGA,IAAjBA,EAAInO,UACHlD,EAAO+C,KAAKgQ,gBAAgB1B,EAAKmD,IAAc,CAEhDnD,EAAM+E,EAAQ3V,KAAM4Q,EACpB,OAKH,MAAO1O,MAAKoB,UAAWqS,EAAQvT,OAAS,EAAI7C,EAAO0b,OAAQtF,GAAYA,IAKxE2G,MAAO,SAAUra,GAGhB,MAAMA,GAKe,gBAATA,GACJ9B,EAAagD,KAAM5D,EAAQ0C,GAAQC,KAAM,IAI1C/B,EAAagD,KAAMjB,KAGzBD,EAAKH,OAASG,EAAM,GAAMA,GAZjBC,KAAM,IAAOA,KAAM,GAAIc,WAAed,KAAK+B,QAAQonB,UAAUjpB,OAAS,IAgBjFia,IAAK,SAAU1b,EAAUC,GACxB,GAAIigB,GAA0B,gBAAblgB,GACfpB,EAAQoB,EAAUC,GAClBrB,EAAO0D,UAAWtC,GAAYA,EAAS8B,UAAa9B,GAAaA,GAClEY,EAAMhC,EAAOgD,MAAOL,KAAKkB,MAAOyd,EAEjC,OAAO3e,MAAKoB,UAAW/D,EAAO0b,OAAO1Z,KAGtC+pB,QAAS,SAAU3qB,GAClB,MAAOuB,MAAKma,IAAiB,MAAZ1b,EAChBuB,KAAKuB,WAAavB,KAAKuB,WAAWwO,OAAOtR,MAK5C,SAAS4qB,GAAS3a,EAAKuD,GACtB,OAASvD,EAAMA,EAAIuD,KAA0B,IAAjBvD,EAAInO,UAEhC,MAAOmO,GAGRrR,EAAOmE,MACN6N,OAAQ,SAAUtP,GACjB,GAAIsP,GAAStP,EAAKe,UAClB,OAAOuO,IAA8B,KAApBA,EAAO9O,SAAkB8O,EAAS,MAEpDia,QAAS,SAAUvpB,GAClB,MAAO1C,GAAO4U,IAAKlS,EAAM,eAE1BwpB,aAAc,SAAUxpB,EAAMmC,EAAGsnB,GAChC,MAAOnsB,GAAO4U,IAAKlS,EAAM,aAAcypB,IAExC5J,KAAM,SAAU7
f,GACf,MAAOspB,GAAStpB,EAAM,gBAEvB8oB,KAAM,SAAU9oB,GACf,MAAOspB,GAAStpB,EAAM,oBAEvB0pB,QAAS,SAAU1pB,GAClB,MAAO1C,GAAO4U,IAAKlS,EAAM,gBAE1BopB,QAAS,SAAUppB,GAClB,MAAO1C,GAAO4U,IAAKlS,EAAM,oBAE1B2pB,UAAW,SAAU3pB,EAAMmC,EAAGsnB,GAC7B,MAAOnsB,GAAO4U,IAAKlS,EAAM,cAAeypB,IAEzCG,UAAW,SAAU5pB,EAAMmC,EAAGsnB,GAC7B,MAAOnsB,GAAO4U,IAAKlS,EAAM,kBAAmBypB,IAE7CI,SAAU,SAAU7pB,GACnB,MAAO1C,GAAOgsB,SAAWtpB,EAAKe,gBAAmB8O,WAAY7P,IAE9D4oB,SAAU,SAAU5oB,GACnB,MAAO1C,GAAOgsB,QAAStpB,EAAK6P,aAE7BgZ,SAAU,SAAU7oB,GACnB,MAAOA,GAAK8pB,iBAAmBxsB,EAAOgD,SAAWN,EAAKsF,cAErD,SAAU1C,EAAMhE,GAClBtB,EAAOsB,GAAIgE,GAAS,SAAU6mB,EAAO/qB,GACpC,GAAIgV,GAAUpW,EAAOgF,IAAKrC,KAAMrB,EAAI6qB,EAsBpC,OApB0B,UAArB7mB,EAAK3E,MAAO,MAChBS,EAAW+qB,GAGP/qB,GAAgC,gBAAbA,KACvBgV,EAAUpW,EAAO0S,OAAQtR,EAAUgV,IAG/BzT,KAAKE,OAAS,IAEZwoB,EAAkB/lB,IACvBtF,EAAO0b,OAAQtF,GAIX+U,EAAa/nB,KAAMkC,IACvB8Q,EAAQqW,WAIH9pB,KAAKoB,UAAWqS,MAIzBpW,EAAOoF,QACNsN,OAAQ,SAAUoB,EAAM9P,EAAOqS,GAC9B,GAAI3T,GAAOsB,EAAO,EAMlB,OAJKqS,KACJvC,EAAO,QAAUA,EAAO,KAGD,IAAjB9P,EAAMnB,QAAkC,IAAlBH,EAAKQ,SACjClD,EAAO+C,KAAKgQ,gBAAiBrQ,EAAMoR,IAAWpR,MAC9C1C,EAAO+C,KAAKmJ,QAAS4H,EAAM9T,EAAOgK,KAAMhG,EAAO,SAAUtB,GACxD,MAAyB,KAAlBA,EAAKQ,aAIf0R,IAAK,SAAUlS,EAAMkS,EAAKuX,GACzB,GAAI/V,MACHsW,EAAWP,IAAU5sB,SAEtB,QAASmD,EAAOA,EAAMkS,KAA4B,IAAlBlS,EAAKQ,SACpC,GAAuB,IAAlBR,EAAKQ,SAAiB,CAC1B,GAAKwpB,GAAY1sB,EAAQ0C,GAAOipB,GAAIQ,GACnC,KAED/V,GAAQ3V,KAAMiC,GAGhB,MAAO0T,IAGR4V,QAAS,SAAUW,EAAGjqB,GACrB,GAAI0T,KAEJ,MAAQuW,EAAGA,EAAIA,EAAEnb,YACI,IAAfmb,EAAEzpB,UAAkBypB,IAAMjqB,GAC9B0T,EAAQ3V,KAAMksB,EAIhB,OAAOvW,KAKT,SAASsV,IAAQ3X,EAAU6Y,EAAWvW,GACrC,GAAKrW,EAAOsD,WAAYspB,GACvB,MAAO5sB,GAAOgK,KAAM+J,EAAU,SAAUrR,EAAMmC,GAE7C,QAAS+nB,EAAUhpB,KAAMlB,EAAMmC,EAAGnC,KAAW2T,GAK/C,IAAKuW,EAAU1pB,SACd,MAAOlD,GAAOgK,KAAM+J,EAAU,SAAUrR,GACvC,MAASA,KAASkqB,IAAgBvW,GAKpC,IAA0B,gBAAduW,GAAyB,CACpC,GAAK1B,EAAS9nB,KAAMwpB,GACnB,MAAO5sB,GAAO0S,OAAQka,EAAW7Y,EAAUsC,EAG5CuW,GAAY5sB,EAAO0S,OAAQka,EAAW7Y,GAGvC,MAAO/T,GAAOgK,KAAM+J,EAAU,SAAUrR,GACvC,MAAS9B,GAAagD,KAAMgpB,EAAWlqB,IAAU,IAAQ2T,IAG3D,GAAIwW,IAAY,0EACfC,GAAW,YACXC,GAAQ,YACRC,GAAe,0BACfC,GAA8B,wBAE9BC,GAAW,oCACXC,GAAc,4BACdC,GAAoB,cACpBC,GAAe,2CAGfC,IAGCjJ,QAAU,EAAG,+BAAgC,aAE7CkJ,OAAS,EAAG,UAAW,YACvBC,KAAO,EAAG,oBAAqB,uBAC/BC,IAAM,EAAG,iBAAkB,oBAC3BC,IAAM,EAAG,qBAAsB,yBAE/B1F,UAAY,EAAG,GAAI,IAIrBsF,IAAQK,SAAWL,GAAQjJ,OAE3BiJ,GAAQM,MAAQN,GAAQO,MAAQP,GAAQQ,SAAWR,GAAQS,QAAUT,GAAQC,MAC7ED,GAAQU,GAAKV,GAAQI,GAErB1tB,EAAOsB,GAAG8D,QACT4D,KAAM,SAAUQ,GACf,MAAOxJ,GAAOsK,OAAQ3H,KAAM,SAAU6G,GACrC,MAAOA,KAAUjK,UAChBS,EAAOgJ,KAAMrG,MACbA,KAAK6U,QAAQyW,QAAUtrB,KAAM,IAAOA,KAAM,GAAIQ,eAAiBvD,GAAWsuB,eAAgB1kB,KACzF,KAAMA,EAAO/E,UAAU5B,SAG3BorB,OAAQ,WACP,MAAOtrB,MAAKwrB,SAAU1pB,UAAW,SAAU/B,GAC1C,GAAuB,IAAlBC,KAAKO,UAAoC,KAAlBP,KAAKO,UAAqC,IAAlBP,KAAKO,SAAiB,CACzE,GAAIyC,GAASyoB,GAAoBzrB,KAAMD,EACvCiD,GAAOuD,YAAaxG,OAKvB2rB,QAAS,WACR,MAAO1rB,MAAKwrB,SAAU1pB,UAAW,SAAU/B,GAC1C,GAAuB,IAAlBC,KAAKO,UAAoC,KAAlBP,KAAKO,UAAqC,IAAlBP,KAAKO,SAAiB,CACzE,GAAIyC,GAASyoB,GAAoBzrB,KAAMD,EACvCiD,GAAO2oB,aAAc5rB,EAAMiD,EAAO4M,gBAKrCgc,OAAQ,WACP,MAAO5rB,MAAKwrB,SAAU1pB,UAAW,SAAU/B,GACrCC,KAAKc,YACTd,KAAKc,WAAW6qB,aAAc5rB,EAAMC,SAKvC6rB,MAAO,WACN,MAAO7rB,MAAKwrB,SAAU1pB,UAAW,SAAU/B,GACrCC,KAAKc,YACTd,KAAKc,WAAW6qB,aAAc5rB,EAAMC,KAAK6O,gBAM5CzJ,OAAQ,SAAU3G,EAAUqtB,GAC3B,GAAI/rB,GACHsB,EAAQ5C,EAAWpB,EAAO0S,OAAQtR,EAAUuB,MAASA,KACrDkC,EAAI,CAEL,MAA6B,OAApBnC,EAAOsB,EAAMa,IAAaA,IAC5B4pB,GAA8B,IAAlB/rB,EAAKQ,UACtBlD,EAAO0uB,UAAWC,GAAQjsB,IAGtBA,EAAKe,aACJgrB,GAAYzuB,EAAOmM,SAAUzJ,EAAKS,cAAeT,IACrDksB,GAAeD,GAAQjsB,EAAM,WAE9BA,EAAKe,WAAW0F,YAAazG,GAI/B,OAAOC,OAGR6U,MAAO,WACN,GAAI9U,GACHmC,EAAI,CAEL,MA
A4B,OAAnBnC,EAAOC,KAAKkC,IAAaA,IACV,IAAlBnC,EAAKQ,WAGTlD,EAAO0uB,UAAWC,GAAQjsB,GAAM,IAGhCA,EAAK4R,YAAc,GAIrB,OAAO3R,OAGR+C,MAAO,SAAUmpB,EAAeC,GAI/B,MAHAD,GAAiC,MAAjBA,GAAwB,EAAQA,EAChDC,EAAyC,MAArBA,EAA4BD,EAAgBC,EAEzDnsB,KAAKqC,IAAK,WAChB,MAAOhF,GAAO0F,MAAO/C,KAAMksB,EAAeC,MAI5CC,KAAM,SAAUvlB,GACf,MAAOxJ,GAAOsK,OAAQ3H,KAAM,SAAU6G,GACrC,GAAI9G,GAAOC,KAAM,OAChBkC,EAAI,EACJkF,EAAIpH,KAAKE,MAEV,IAAK2G,IAAUjK,WAA+B,IAAlBmD,EAAKQ,SAChC,MAAOR,GAAK4P,SAIb,IAAsB,gBAAV9I,KAAuBwjB,GAAa5pB,KAAMoG,KACpD8jB,IAAWR,GAAShqB,KAAM0G,KAAa,GAAI,KAAQ,GAAID,eAAkB,CAE1EC,EAAQA,EAAMvD,QAAS4mB,GAAW,YAElC,KACC,KAAY9iB,EAAJlF,EAAOA,IACdnC,EAAOC,KAAMkC,OAGU,IAAlBnC,EAAKQ,WACTlD,EAAO0uB,UAAWC,GAAQjsB,GAAM,IAChCA,EAAK4P,UAAY9I,EAInB9G,GAAO,EAGN,MAAO0E,KAGL1E,GACJC,KAAK6U,QAAQyW,OAAQzkB,IAEpB,KAAMA,EAAO/E,UAAU5B,SAG3BmsB,YAAa,WACZ,GAEC3qB,GAAOrE,EAAOgF,IAAKrC,KAAM,SAAUD,GAClC,OAASA,EAAK8O,YAAa9O,EAAKe,cAEjCoB,EAAI,CAmBL,OAhBAlC,MAAKwrB,SAAU1pB,UAAW,SAAU/B,GACnC,GAAI6f,GAAOle,EAAMQ,KAChBmN,EAAS3N,EAAMQ,IAEXmN,KAECuQ,GAAQA,EAAK9e,aAAeuO,IAChCuQ,EAAO5f,KAAK6O,aAEbxR,EAAQ2C,MAAOoF,SACfiK,EAAOsc,aAAc5rB,EAAM6f,MAG1B,GAGI1d,EAAIlC,KAAOA,KAAKoF,UAGxBknB,OAAQ,SAAU7tB,GACjB,MAAOuB,MAAKoF,OAAQ3G,GAAU,IAG/B+sB,SAAU,SAAU9pB,EAAMD,EAAU8qB,GAGnC7qB,EAAO/D,EAAYkE,SAAWH,EAE9B,IAAI0a,GAAUra,EAAOkD,EAASunB,EAAYrd,EAAMC,EAC/ClN,EAAI,EACJkF,EAAIpH,KAAKE,OACTye,EAAM3e,KACNysB,EAAWrlB,EAAI,EACfP,EAAQnF,EAAM,GACdf,EAAatD,EAAOsD,WAAYkG,EAGjC,IAAKlG,KAAsB,GAALyG,GAA2B,gBAAVP,IAAsBxJ,EAAOsL,QAAQqU,aAAeuN,GAAS9pB,KAAMoG,GACzG,MAAO7G,MAAKwB,KAAK,SAAU4Y,GAC1B,GAAIH,GAAO0E,EAAI3c,GAAIoY,EACdzZ,KACJe,EAAM,GAAMmF,EAAM5F,KAAMjB,KAAMoa,EAAOH,EAAKmS,SAE3CnS,EAAKuR,SAAU9pB,EAAMD,EAAU8qB,IAIjC,IAAKnlB,IACJgV,EAAW/e,EAAO8H,cAAezD,EAAM1B,KAAM,GAAIQ,eAAe,GAAQ+rB,GAAqBvsB,MAC7F+B,EAAQqa,EAASxM,WAEmB,IAA/BwM,EAAS/W,WAAWnF,SACxBkc,EAAWra,GAGPA,GAAQ,CAMZ,IALAkD,EAAU5H,EAAOgF,IAAK2pB,GAAQ5P,EAAU,UAAYsQ,IACpDF,EAAavnB,EAAQ/E,OAITkH,EAAJlF,EAAOA,IACdiN,EAAOiN,EAEFla,IAAMuqB,IACVtd,EAAO9R,EAAO0F,MAAOoM,GAAM,GAAM,GAG5Bqd,GAGJnvB,EAAOgD,MAAO4E,EAAS+mB,GAAQ7c,EAAM,YAIvC1N,EAASR,KAAMjB,KAAMkC,GAAKiN,EAAMjN,EAGjC,IAAKsqB,EAOJ,IANApd,EAAMnK,EAASA,EAAQ/E,OAAS,GAAIM,cAGpCnD,EAAOgF,IAAK4C,EAAS0nB,IAGfzqB,EAAI,EAAOsqB,EAAJtqB,EAAgBA,IAC5BiN,EAAOlK,EAAS/C,GACXsoB,GAAY/pB,KAAM0O,EAAKlL,MAAQ,MAClC+Z,EAAUrW,OAAQwH,EAAM,eAAkB9R,EAAOmM,SAAU4F,EAAKD,KAE5DA,EAAKvM,IAETvF,EAAOuvB,SAAUzd,EAAKvM,KAEtBvF,EAAO2I,WAAYmJ,EAAKwC,YAAYrO,QAASonB,GAAc,MAQjE,MAAO1qB,SAIT3C,EAAOmE,MACNqrB,SAAU,SACVC,UAAW,UACXnB,aAAc,SACdoB,YAAa,QACbC,WAAY,eACV,SAAUrqB,EAAMujB,GAClB7oB,EAAOsB,GAAIgE,GAAS,SAAUlE,GAC7B,GAAI4C,GACHC,KACA2rB,EAAS5vB,EAAQoB,GACjBwD,EAAOgrB,EAAO/sB,OAAS,EACvBgC,EAAI,CAEL,MAAaD,GAALC,EAAWA,IAClBb,EAAQa,IAAMD,EAAOjC,KAAOA,KAAK+C,OAAO,GACxC1F,EAAQ4vB,EAAQ/qB,IAAOgkB,GAAY7kB,GAInCxD,EAAUgE,MAAOP,EAAKD,EAAMH,MAG7B,OAAOlB,MAAKoB,UAAWE,MAIzBjE,EAAOoF,QACNM,MAAO,SAAUhD,EAAMmsB,EAAeC,GACrC,GAAIjqB,GAAGkF,EAAG8lB,EAAaC,EACtBpqB,EAAQhD,EAAK8c,WAAW,GACxBuQ,EAAS/vB,EAAOmM,SAAUzJ,EAAKS,cAAeT,EAI/C,MAAM1C,EAAOsL,QAAQiU,gBAAsC,IAAlB7c,EAAKQ,UAAoC,KAAlBR,EAAKQ,UAAsBlD,EAAO2b,SAAUjZ,IAM3G,IAHAotB,EAAenB,GAAQjpB,GACvBmqB,EAAclB,GAAQjsB,GAEhBmC,EAAI,EAAGkF,EAAI8lB,EAAYhtB,OAAYkH,EAAJlF,EAAOA,IAC3CmrB,GAAUH,EAAahrB,GAAKirB,EAAcjrB,GAK5C,IAAKgqB,EACJ,GAAKC,EAIJ,IAHAe,EAAcA,GAAelB,GAAQjsB,GACrCotB,EAAeA,GAAgBnB,GAAQjpB,GAEjCb,EAAI,EAAGkF,EAAI8lB,EAAYhtB,OAAYkH,EAAJlF,EAAOA,IAC3CorB,GAAgBJ,EAAahrB,GAAKirB,EAAcjrB,QAGjDorB,IAAgBvtB,EAAMgD,EAWxB,OANAoqB,GAAenB,GAAQjpB,EAAO,UACzBoqB,EAAajtB,OAAS,GAC1B+rB,GAAekB,GAAeC,GAAUpB,GAAQjsB,EAAM,WAIhDgD,GAGRoC,cAAe,SAAU9D,EAAO3C,EAASuG,EAASsoB,GACjD,GAAIxtB,GAAM4F,EAAKuK,EA
AKsd,EAAMhkB,EAAUpH,EACnCF,EAAI,EACJkF,EAAI/F,EAAMnB,OACVkc,EAAW1d,EAAQ2d,yBACnBoR,IAED,MAAYrmB,EAAJlF,EAAOA,IAGd,GAFAnC,EAAOsB,EAAOa,GAETnC,GAAiB,IAATA,EAGZ,GAA6B,WAAxB1C,EAAO4G,KAAMlE,GAGjB1C,EAAOgD,MAAOotB,EAAO1tB,EAAKQ,UAAaR,GAASA,OAG1C,IAAMqqB,GAAM3pB,KAAMV,GAIlB,CACN4F,EAAMA,GAAOyW,EAAS7V,YAAa7H,EAAQwG,cAAc,QAGzDgL,GAAQia,GAAShqB,KAAMJ,KAAW,GAAI,KAAO,GAAI6G,cACjD4mB,EAAO7C,GAASza,IAASya,GAAQtF,SACjC1f,EAAIgK,UAAY6d,EAAM,GAAMztB,EAAKuD,QAAS4mB,GAAW,aAAgBsD,EAAM,GAG3EprB,EAAIorB,EAAM,EACV,OAAQprB,IACPuD,EAAMA,EAAI0N,SAKXhW,GAAOgD,MAAOotB,EAAO9nB,EAAIN,YAGzBM,EAAMyW,EAASxM,WAIfjK,EAAIgM,YAAc,OA1BlB8b,GAAM3vB,KAAMY,EAAQ6sB,eAAgBxrB,GAgCvCqc,GAASzK,YAAc,GAEvBzP,EAAI,CACJ,OAASnC,EAAO0tB,EAAOvrB,KAItB,KAAKqrB,GAAmD,KAAtClwB,EAAO6J,QAASnH,EAAMwtB,MAIxC/jB,EAAWnM,EAAOmM,SAAUzJ,EAAKS,cAAeT,GAGhD4F,EAAMqmB,GAAQ5P,EAAS7V,YAAaxG,GAAQ,UAGvCyJ,GACJyiB,GAAetmB,GAIXV,GAAU,CACd7C,EAAI,CACJ,OAASrC,EAAO4F,EAAKvD,KACfooB,GAAY/pB,KAAMV,EAAKkE,MAAQ,KACnCgB,EAAQnH,KAAMiC,GAMlB,MAAOqc,IAGR2P,UAAW,SAAU1qB,GACpB,GAAIyD,GAAM/E,EAAMsjB,EAAQpf,EAAM2D,EAAKxF,EAClCohB,EAAUnmB,EAAO2lB,MAAMQ,QACvBthB,EAAI,CAEL,OAASnC,EAAOsB,EAAOa,MAAStF,UAAWsF,IAAM,CAChD,GAAKic,EAAKG,QAASve,KAClB6H,EAAM7H,EAAMie,EAAU7a,SAEjByE,IAAQ9C,EAAOkZ,EAAUjQ,MAAOnG,KAAS,CAE7C,GADAyb,EAASpc,OAAO6G,KAAMhJ,EAAKue,YACtBA,EAAOnjB,OACX,IAAMkC,EAAI,GAAI6B,EAAOof,EAAOjhB,MAAQxF,UAAWwF,IACzCohB,EAASvf,GACb5G,EAAO2lB,MAAM5d,OAAQrF,EAAMkE,GAI3B5G,EAAOmnB,YAAazkB,EAAMkE,EAAMa,EAAK+e,OAInC7F,GAAUjQ,MAAOnG,UAEdoW,GAAUjQ,MAAOnG,SAKpBmW,GAAUhQ,MAAOhO,EAAMge,EAAU5a,YAI1CypB,SAAU,SAAUc,GACnB,MAAOrwB,GAAOswB,MACbD,IAAKA,EACLzpB,KAAM,MACN2pB,SAAU,SACVC,OAAO,EACP5K,QAAQ,EACR6K,UAAU,MAOb,SAASrC,IAAoB1rB,EAAMguB,GAClC,MAAO1wB,GAAOsJ,SAAU5G,EAAM,UAC7B1C,EAAOsJ,SAA+B,IAArBonB,EAAQxtB,SAAiBwtB,EAAUA,EAAQne,WAAY,MAExE7P,EAAK+F,qBAAqB,SAAS,IAClC/F,EAAKwG,YAAaxG,EAAKS,cAAc0E,cAAc,UACpDnF,EAIF,QAAS2sB,IAAe3sB,GAEvB,MADAA,GAAKkE,MAAsC,OAA9BlE,EAAKuN,aAAa,SAAoB,IAAMvN,EAAKkE,KACvDlE,EAER,QAAS4sB,IAAe5sB,GACvB,GAAID,GAAQ2qB,GAAkBtqB,KAAMJ,EAAKkE,KAQzC,OANKnE,GACJC,EAAKkE,KAAOnE,EAAO,GAEnBC,EAAK6N,gBAAgB,QAGf7N,EAIR,QAASksB,IAAe5qB,EAAO2sB,GAC9B,GAAI5mB,GAAI/F,EAAMnB,OACbgC,EAAI,CAEL,MAAYkF,EAAJlF,EAAOA,IACd8b,EAAUW,IACTtd,EAAOa,GAAK,cAAe8rB,GAAehQ,EAAU9c,IAAK8sB,EAAa9rB,GAAK,eAK9E,QAASorB,IAAgB1qB,EAAKqrB,GAC7B,GAAI/rB,GAAGkF,EAAGnD,EAAMiqB,EAAUC,EAAUC,EAAUC,EAAUhL,CAExD,IAAuB,IAAlB4K,EAAK1tB,SAAV,CAKA,GAAKyd,EAAUe,QAASnc,KACvBsrB,EAAWlQ,EAAUrW,OAAQ/E,GAC7BurB,EAAWnQ,EAAUW,IAAKsP,EAAMC,GAChC7K,EAAS6K,EAAS7K,QAEJ,OACN8K,GAAStK,OAChBsK,EAAS9K,SAET,KAAMpf,IAAQof,GACb,IAAMnhB,EAAI,EAAGkF,EAAIic,EAAQpf,GAAO/D,OAAYkH,EAAJlF,EAAOA,IAC9C7E,EAAO2lB,MAAM7I,IAAK8T,EAAMhqB,EAAMof,EAAQpf,GAAQ/B,IAO7C6b,EAAUgB,QAASnc,KACvBwrB,EAAWrQ,EAAUpW,OAAQ/E,GAC7ByrB,EAAWhxB,EAAOoF,UAAY2rB,GAE9BrQ,EAAUY,IAAKsP,EAAMI,KAKvB,QAASrC,IAAQttB,EAASwR,GACzB,GAAI5O,GAAM5C,EAAQoH,qBAAuBpH,EAAQoH,qBAAsBoK,GAAO,KAC5ExR,EAAQgP,iBAAmBhP,EAAQgP,iBAAkBwC,GAAO,OAG9D,OAAOA,KAAQtT,WAAasT,GAAO7S,EAAOsJ,SAAUjI,EAASwR,GAC5D7S,EAAOgD,OAAS3B,GAAW4C,GAC3BA,EAIF,QAAS+rB,IAAUzqB,EAAKqrB,GACvB,GAAItnB,GAAWsnB,EAAKtnB,SAASC,aAGX,WAAbD,GAAwB2jB,GAA4B7pB,KAAMmC,EAAIqB,MAClEgqB,EAAKvZ,QAAU9R,EAAI8R,SAGK,UAAb/N,GAAqC,aAAbA,KACnCsnB,EAAKnV,aAAelW,EAAIkW,cAG1Bzb,EAAOsB,GAAG8D,QACT6rB,QAAS,SAAUlC,GAClB,GAAIoB,EAEJ,OAAKnwB,GAAOsD,WAAYyrB,GAChBpsB,KAAKwB,KAAK,SAAUU,GAC1B7E,EAAQ2C,MAAOsuB,QAASlC,EAAKnrB,KAAKjB,KAAMkC,OAIrClC,KAAM,KAGVwtB,EAAOnwB,EAAQ+uB,EAAMpsB,KAAM,GAAIQ,eAAgBwB,GAAI,GAAIe,OAAO,GAEzD/C,KAAM,GAAIc,YACd0sB,EAAK7B,aAAc3rB,KAAM,IAG1BwtB,EAAKnrB,IAAI,WACR,GAAItC,GAAOC,IAEX,OAAQD,EAAKwuB,kBACZxuB,EAAOA,EAAKwuB,iBAGb,OAAOxuB,KACLu
rB,OAAQtrB,OAGLA,OAGRwuB,UAAW,SAAUpC,GACpB,MAAK/uB,GAAOsD,WAAYyrB,GAChBpsB,KAAKwB,KAAK,SAAUU,GAC1B7E,EAAQ2C,MAAOwuB,UAAWpC,EAAKnrB,KAAKjB,KAAMkC,MAIrClC,KAAKwB,KAAK,WAChB,GAAIyY,GAAO5c,EAAQ2C,MAClB4oB,EAAW3O,EAAK2O,UAEZA,GAAS1oB,OACb0oB,EAAS0F,QAASlC,GAGlBnS,EAAKqR,OAAQc,MAKhBoB,KAAM,SAAUpB,GACf,GAAIzrB,GAAatD,EAAOsD,WAAYyrB,EAEpC,OAAOpsB,MAAKwB,KAAK,SAAUU,GAC1B7E,EAAQ2C,MAAOsuB,QAAS3tB,EAAayrB,EAAKnrB,KAAKjB,KAAMkC,GAAKkqB,MAI5DqC,OAAQ,WACP,MAAOzuB,MAAKqP,SAAS7N,KAAK,WACnBnE,EAAOsJ,SAAU3G,KAAM,SAC5B3C,EAAQ2C,MAAOqsB,YAAarsB,KAAKqF,cAEhC/C,QAGL,IAAIosB,IAAQC,GAGXC,GAAe,4BACfC,GAAU,UACVC,GAAgB/jB,OAAQ,KAAOlM,EAAY,SAAU,KACrDkwB,GAAgBhkB,OAAQ,KAAOlM,EAAY,kBAAmB,KAC9DmwB,GAAcjkB,OAAQ,YAAclM,EAAY,IAAK,KACrDowB,IAAgBC,KAAM,SAEtBC,IAAYC,SAAU,WAAYC,WAAY,SAAUC,QAAS,SACjEC,IACCC,cAAe,EACfC,WAAY,KAGbC,IAAc,MAAO,QAAS,SAAU,QACxCC,IAAgB,SAAU,IAAK,MAAO,KAGvC,SAASC,IAAgBvnB,EAAO1F,GAG/B,GAAKA,IAAQ0F,GACZ,MAAO1F,EAIR,IAAIktB,GAAUltB,EAAK1C,OAAO,GAAGV,cAAgBoD,EAAK3E,MAAM,GACvD8xB,EAAWntB,EACXT,EAAIytB,GAAYzvB,MAEjB,OAAQgC,IAEP,GADAS,EAAOgtB,GAAaztB,GAAM2tB,EACrBltB,IAAQ0F,GACZ,MAAO1F,EAIT,OAAOmtB,GAGR,QAASC,IAAUhwB,EAAMiwB,GAIxB,MADAjwB,GAAOiwB,GAAMjwB,EAC4B,SAAlC1C,EAAO4yB,IAAKlwB,EAAM,aAA2B1C,EAAOmM,SAAUzJ,EAAKS,cAAeT,GAK1F,QAASmwB,IAAWnwB,GACnB,MAAOpD,GAAOihB,iBAAkB7d,EAAM,MAGvC,QAASowB,IAAU/e,EAAUgf,GAC5B,GAAId,GAASvvB,EAAMswB,EAClBtU,KACA3B,EAAQ,EACRla,EAASkR,EAASlR,MAEnB,MAAgBA,EAARka,EAAgBA,IACvBra,EAAOqR,EAAUgJ,GACXra,EAAKsI,QAIX0T,EAAQ3B,GAAU4D,EAAU9c,IAAKnB,EAAM,cACvCuvB,EAAUvvB,EAAKsI,MAAMinB,QAChBc,GAGErU,EAAQ3B,IAAuB,SAAZkV,IACxBvvB,EAAKsI,MAAMinB,QAAU,IAMM,KAAvBvvB,EAAKsI,MAAMinB,SAAkBS,GAAUhwB,KAC3Cgc,EAAQ3B,GAAU4D,EAAUrW,OAAQ5H,EAAM,aAAcuwB,GAAmBvwB,EAAK4G,aAI3EoV,EAAQ3B,KACbiW,EAASN,GAAUhwB,IAEduvB,GAAuB,SAAZA,IAAuBe,IACtCrS,EAAUW,IAAK5e,EAAM,aAAcswB,EAASf,EAAUjyB,EAAO4yB,IAAIlwB,EAAM,aAQ3E,KAAMqa,EAAQ,EAAWla,EAARka,EAAgBA,IAChCra,EAAOqR,EAAUgJ,GACXra,EAAKsI,QAGL+nB,GAA+B,SAAvBrwB,EAAKsI,MAAMinB,SAA6C,KAAvBvvB,EAAKsI,MAAMinB,UACzDvvB,EAAKsI,MAAMinB,QAAUc,EAAOrU,EAAQ3B,IAAW,GAAK,QAItD,OAAOhJ,GAGR/T,EAAOsB,GAAG8D,QACTwtB,IAAK,SAAUttB,EAAMkE,GACpB,MAAOxJ,GAAOsK,OAAQ3H,KAAM,SAAUD,EAAM4C,EAAMkE,GACjD,GAAI0pB,GAAQpuB,EACXE,KACAH,EAAI,CAEL,IAAK7E,EAAO6F,QAASP,GAAS,CAI7B,IAHA4tB,EAASL,GAAWnwB,GACpBoC,EAAMQ,EAAKzC,OAECiC,EAAJD,EAASA,IAChBG,EAAKM,EAAMT,IAAQ7E,EAAO4yB,IAAKlwB,EAAM4C,EAAMT,IAAK,EAAOquB,EAGxD,OAAOluB,GAGR,MAAOwE,KAAUjK,UAChBS,EAAOgL,MAAOtI,EAAM4C,EAAMkE,GAC1BxJ,EAAO4yB,IAAKlwB,EAAM4C,IACjBA,EAAMkE,EAAO/E,UAAU5B,OAAS,IAEpCkwB,KAAM,WACL,MAAOD,IAAUnwB,MAAM,IAExBwwB,KAAM,WACL,MAAOL,IAAUnwB,OAElBywB,OAAQ,SAAU/V,GACjB,MAAsB,iBAAVA,GACJA,EAAQ1a,KAAKowB,OAASpwB,KAAKwwB,OAG5BxwB,KAAKwB,KAAK,WACXuuB,GAAU/vB,MACd3C,EAAQ2C,MAAOowB,OAEf/yB,EAAQ2C,MAAOwwB,YAMnBnzB,EAAOoF,QAGNiuB,UACCC,SACCzvB,IAAK,SAAUnB,EAAM6wB,GACpB,GAAKA,EAAW,CAEf,GAAItvB,GAAMotB,GAAQ3uB,EAAM,UACxB,OAAe,KAARuB,EAAa,IAAMA,MAO9BuvB,WACCC,aAAe,EACfC,aAAe,EACftB,YAAc,EACduB,YAAc,EACdL,SAAW,EACXM,OAAS,EACTC,SAAW,EACXC,QAAU,EACVC,QAAU,EACV3T,MAAQ,GAKT4T,UAECC,QAAS,YAIVjpB,MAAO,SAAUtI,EAAM4C,EAAMkE,EAAO0qB,GAEnC,GAAMxxB,GAA0B,IAAlBA,EAAKQ,UAAoC,IAAlBR,EAAKQ,UAAmBR,EAAKsI,MAAlE,CAKA,GAAI/G,GAAK2C,EAAMyb,EACdoQ,EAAWzyB,EAAOoJ,UAAW9D,GAC7B0F,EAAQtI,EAAKsI,KASd,OAPA1F,GAAOtF,EAAOg0B,SAAUvB,KAAgBzyB,EAAOg0B,SAAUvB,GAAaF,GAAgBvnB,EAAOynB,IAI7FpQ,EAAQriB,EAAOqzB,SAAU/tB,IAAUtF,EAAOqzB,SAAUZ,GAG/CjpB,IAAUjK,UAiCT8iB,GAAS,OAASA,KAAUpe,EAAMoe,EAAMxe,IAAKnB,GAAM,EAAOwxB,MAAa30B,UACpE0E,EAID+G,EAAO1F,IArCdsB,QAAc4C,GAGA,WAAT5C,IAAsB3C,EAAM0tB,GAAQ7uB,KAAM0G,MAC9CA,GAAUvF,EAAI,GAAK,GAAMA,EAAI,GAAKgD,WAAYjH,EAAO4yB,IAAKlwB,EAAM4C,IAEhEsB,EAAO,UAIM,MAAT
4C,GAA0B,WAAT5C,GAAqBI,MAAOwC,KAKpC,WAAT5C,GAAsB5G,EAAOwzB,UAAWf,KAC5CjpB,GAAS,MAKJxJ,EAAOsL,QAAQwU,iBAA6B,KAAVtW,GAA+C,IAA/BlE,EAAKzE,QAAQ,gBACpEmK,EAAO1F,GAAS,WAIX+c,GAAW,OAASA,KAAW7Y,EAAQ6Y,EAAMf,IAAK5e,EAAM8G,EAAO0qB,MAAa30B,YACjFyL,EAAO1F,GAASkE,IAjBjB,aA+BFopB,IAAK,SAAUlwB,EAAM4C,EAAM4uB,EAAOhB,GACjC,GAAIlf,GAAKlQ,EAAKue,EACboQ,EAAWzyB,EAAOoJ,UAAW9D,EAyB9B,OAtBAA,GAAOtF,EAAOg0B,SAAUvB,KAAgBzyB,EAAOg0B,SAAUvB,GAAaF,GAAgB7vB,EAAKsI,MAAOynB,IAIlGpQ,EAAQriB,EAAOqzB,SAAU/tB,IAAUtF,EAAOqzB,SAAUZ,GAG/CpQ,GAAS,OAASA,KACtBrO,EAAMqO,EAAMxe,IAAKnB,GAAM,EAAMwxB,IAIzBlgB,IAAQzU,YACZyU,EAAMqd,GAAQ3uB,EAAM4C,EAAM4tB,IAId,WAARlf,GAAoB1O,IAAQ4sB,MAChCle,EAAMke,GAAoB5sB,IAIZ,KAAV4uB,GAAgBA,GACpBpwB,EAAMmD,WAAY+M,GACXkgB,KAAU,GAAQl0B,EAAO+G,UAAWjD,GAAQA,GAAO,EAAIkQ,GAExDA,KAITqd,GAAS,SAAU3uB,EAAM4C,EAAM6uB,GAC9B,GAAI3T,GAAO4T,EAAUC,EACpBd,EAAWY,GAAatB,GAAWnwB,GAInCuB,EAAMsvB,EAAWA,EAASe,iBAAkBhvB,IAAUiuB,EAAUjuB,GAAS/F,UACzEyL,EAAQtI,EAAKsI,KA8Bd,OA5BKuoB,KAES,KAARtvB,GAAejE,EAAOmM,SAAUzJ,EAAKS,cAAeT,KACxDuB,EAAMjE,EAAOgL,MAAOtI,EAAM4C,IAOtBosB,GAAUtuB,KAAMa,IAASutB,GAAQpuB,KAAMkC,KAG3Ckb,EAAQxV,EAAMwV,MACd4T,EAAWppB,EAAMopB,SACjBC,EAAWrpB,EAAMqpB,SAGjBrpB,EAAMopB,SAAWppB,EAAMqpB,SAAWrpB,EAAMwV,MAAQvc,EAChDA,EAAMsvB,EAAS/S,MAGfxV,EAAMwV,MAAQA,EACdxV,EAAMopB,SAAWA,EACjBppB,EAAMqpB,SAAWA,IAIZpwB,EAIR,SAASswB,IAAmB7xB,EAAM8G,EAAOgrB,GACxC,GAAItoB,GAAUulB,GAAU3uB,KAAM0G,EAC9B,OAAO0C,GAENnG,KAAKwe,IAAK,EAAGrY,EAAS,IAAQsoB,GAAY,KAAUtoB,EAAS,IAAO,MACpE1C,EAGF,QAASirB,IAAsB/xB,EAAM4C,EAAM4uB,EAAOQ,EAAaxB,GAC9D,GAAIruB,GAAIqvB,KAAYQ,EAAc,SAAW,WAE5C,EAES,UAATpvB,EAAmB,EAAI,EAEvB0O,EAAM,CAEP,MAAY,EAAJnP,EAAOA,GAAK,EAEJ,WAAVqvB,IACJlgB,GAAOhU,EAAO4yB,IAAKlwB,EAAMwxB,EAAQ7B,GAAWxtB,IAAK,EAAMquB,IAGnDwB,GAEW,YAAVR,IACJlgB,GAAOhU,EAAO4yB,IAAKlwB,EAAM,UAAY2vB,GAAWxtB,IAAK,EAAMquB,IAI7C,WAAVgB,IACJlgB,GAAOhU,EAAO4yB,IAAKlwB,EAAM,SAAW2vB,GAAWxtB,GAAM,SAAS,EAAMquB,MAIrElf,GAAOhU,EAAO4yB,IAAKlwB,EAAM,UAAY2vB,GAAWxtB,IAAK,EAAMquB,GAG5C,YAAVgB,IACJlgB,GAAOhU,EAAO4yB,IAAKlwB,EAAM,SAAW2vB,GAAWxtB,GAAM,SAAS,EAAMquB,IAKvE,OAAOlf,GAGR,QAAS2gB,IAAkBjyB,EAAM4C,EAAM4uB,GAGtC,GAAIU,IAAmB,EACtB5gB,EAAe,UAAT1O,EAAmB5C,EAAK4d,YAAc5d,EAAKmyB,aACjD3B,EAASL,GAAWnwB,GACpBgyB,EAAc10B,EAAOsL,QAAQ+U,WAAgE,eAAnDrgB,EAAO4yB,IAAKlwB,EAAM,aAAa,EAAOwwB,EAKjF,IAAY,GAAPlf,GAAmB,MAAPA,EAAc,CAQ9B,GANAA,EAAMqd,GAAQ3uB,EAAM4C,EAAM4tB,IACf,EAANlf,GAAkB,MAAPA,KACfA,EAAMtR,EAAKsI,MAAO1F,IAIdosB,GAAUtuB,KAAK4Q,GACnB,MAAOA,EAKR4gB,GAAmBF,IAAiB10B,EAAOsL,QAAQ+T,mBAAqBrL,IAAQtR,EAAKsI,MAAO1F,IAG5F0O,EAAM/M,WAAY+M,IAAS,EAI5B,MAASA,GACRygB,GACC/xB,EACA4C,EACA4uB,IAAWQ,EAAc,SAAW,WACpCE,EACA1B,GAEE,KAIL,QAASD,IAAoB3pB,GAC5B,GAAIyI,GAAMnS,EACTqyB,EAAUL,GAAatoB,EA0BxB,OAxBM2oB,KACLA,EAAU6C,GAAexrB,EAAUyI,GAGlB,SAAZkgB,GAAuBA,IAE3BX,IAAWA,IACVtxB,EAAO,kDACN4yB,IAAK,UAAW,6BAChBpD,SAAUzd,EAAIjS,iBAGhBiS,GAAQuf,GAAO,GAAGyD,eAAiBzD,GAAO,GAAG9E,iBAAkB5sB,SAC/DmS,EAAIijB,MAAM,+BACVjjB,EAAIkjB,QAEJhD,EAAU6C,GAAexrB,EAAUyI,GACnCuf,GAAOrC,UAIR2C,GAAatoB,GAAa2oB,GAGpBA,EAIR,QAAS6C,IAAexvB,EAAMyM,GAC7B,GAAIrP,GAAO1C,EAAQ+R,EAAIlK,cAAevC,IAASkqB,SAAUzd,EAAImO,MAC5D+R,EAAUjyB,EAAO4yB,IAAKlwB,EAAK,GAAI,UAEhC,OADAA,GAAKqF,SACEkqB,EAGRjyB,EAAOmE,MAAO,SAAU,SAAW,SAAUU,EAAGS,GAC/CtF,EAAOqzB,SAAU/tB,IAChBzB,IAAK,SAAUnB,EAAM6wB,EAAUW,GAC9B,MAAKX,GAGwB,IAArB7wB,EAAK4d,aAAqBiR,GAAanuB,KAAMpD,EAAO4yB,IAAKlwB,EAAM,YACrE1C,EAAO8K,KAAMpI,EAAMovB,GAAS,WAC3B,MAAO6C,IAAkBjyB,EAAM4C,EAAM4uB,KAEtCS,GAAkBjyB,EAAM4C,EAAM4uB,GAPhC,WAWD5S,IAAK,SAAU5e,EAAM8G,EAAO0qB,GAC3B,GAAIhB,GAASgB,GAASrB,GAAWnwB,EACjC,OAAO6xB,IAAmB7xB,EAAM8G,EAAO0qB,EACtCO,GACC/xB,EACA4C,EACA4uB,EACAl0B,EAAOsL,QAAQ+U,WAAgE,eAAnDrgB,
EAAO4yB,IAAKlwB,EAAM,aAAa,EAAOwwB,GAClEA,GACG,OAQRlzB,EAAO,WAEAA,EAAOsL,QAAQ8T,sBACpBpf,EAAOqzB,SAAS5S,aACf5c,IAAK,SAAUnB,EAAM6wB,GACpB,MAAKA,GAIGvzB,EAAO8K,KAAMpI,GAAQuvB,QAAW,gBACtCZ,IAAU3uB,EAAM,gBALlB,cAcG1C,EAAOsL,QAAQgU,eAAiBtf,EAAOsB,GAAGywB,UAC/C/xB,EAAOmE,MAAQ,MAAO,QAAU,SAAUU,EAAG0c,GAC5CvhB,EAAOqzB,SAAU9R,IAChB1d,IAAK,SAAUnB,EAAM6wB,GACpB,MAAKA,IACJA,EAAWlC,GAAQ3uB,EAAM6e,GAElBmQ,GAAUtuB,KAAMmwB,GACtBvzB,EAAQ0C,GAAOqvB,WAAYxQ,GAAS,KACpCgS,GALF,gBAcAvzB,EAAO8T,MAAQ9T,EAAO8T,KAAKwE,UAC/BtY,EAAO8T,KAAKwE,QAAQ0a,OAAS,SAAUtwB,GAGtC,MAA2B,IAApBA,EAAK4d,aAAyC,GAArB5d,EAAKmyB,cAGtC70B,EAAO8T,KAAKwE,QAAQ4c,QAAU,SAAUxyB,GACvC,OAAQ1C,EAAO8T,KAAKwE,QAAQ0a,OAAQtwB,KAKtC1C,EAAOmE,MACNgxB,OAAQ,GACRC,QAAS,GACTC,OAAQ,SACN,SAAUC,EAAQC,GACpBv1B,EAAOqzB,SAAUiC,EAASC,IACzBC,OAAQ,SAAUhsB,GACjB,GAAI3E,GAAI,EACP4wB,KAGAC,EAAyB,gBAAVlsB,GAAqBA,EAAM6B,MAAM,MAAS7B,EAE1D,MAAY,EAAJ3E,EAAOA,IACd4wB,EAAUH,EAASjD,GAAWxtB,GAAM0wB,GACnCG,EAAO7wB,IAAO6wB,EAAO7wB,EAAI,IAAO6wB,EAAO,EAGzC,OAAOD,KAIHjE,GAAQpuB,KAAMkyB,KACnBt1B,EAAOqzB,SAAUiC,EAASC,GAASjU,IAAMiT,KAG3C,IAAIoB,IAAM,OACTC,GAAW,QACXC,GAAQ,SACRC,GAAkB,wCAClBC,GAAe,oCAEhB/1B,GAAOsB,GAAG8D,QACT4wB,UAAW,WACV,MAAOh2B,GAAOi2B,MAAOtzB,KAAKuzB,mBAE3BA,eAAgB,WACf,MAAOvzB,MAAKqC,IAAI,WAEf,GAAI+O,GAAW/T,EAAOuhB,KAAM5e,KAAM,WAClC,OAAOoR,GAAW/T,EAAO0D,UAAWqQ,GAAapR,OAEjD+P,OAAO,WACP,GAAI9L,GAAOjE,KAAKiE,IAEhB,OAAOjE,MAAK2C,OAAStF,EAAQ2C,MAAOgpB,GAAI,cACvCoK,GAAa3yB,KAAMT,KAAK2G,YAAewsB,GAAgB1yB,KAAMwD,KAC3DjE,KAAK0U,UAAY4V,GAA4B7pB,KAAMwD,MAEtD5B,IAAI,SAAUH,EAAGnC,GACjB,GAAIsR,GAAMhU,EAAQ2C,MAAOqR,KAEzB,OAAc,OAAPA,EACN,KACAhU,EAAO6F,QAASmO,GACfhU,EAAOgF,IAAKgP,EAAK,SAAUA,GAC1B,OAAS1O,KAAM5C,EAAK4C,KAAMkE,MAAOwK,EAAI/N,QAAS4vB,GAAO,YAEpDvwB,KAAM5C,EAAK4C,KAAMkE,MAAOwK,EAAI/N,QAAS4vB,GAAO,WAC9ChyB,SAML7D,EAAOi2B,MAAQ,SAAUrpB,EAAGupB,GAC3B,GAAIb,GACHc,KACAtZ,EAAM,SAAUvS,EAAKf,GAEpBA,EAAQxJ,EAAOsD,WAAYkG,GAAUA,IAAqB,MAATA,EAAgB,GAAKA,EACtE4sB,EAAGA,EAAEvzB,QAAWwzB,mBAAoB9rB,GAAQ,IAAM8rB,mBAAoB7sB,GASxE,IALK2sB,IAAgB52B,YACpB42B,EAAcn2B,EAAOs2B,cAAgBt2B,EAAOs2B,aAAaH,aAIrDn2B,EAAO6F,QAAS+G,IAASA,EAAErK,SAAWvC,EAAOqD,cAAeuJ,GAEhE5M,EAAOmE,KAAMyI,EAAG,WACfkQ,EAAKna,KAAK2C,KAAM3C,KAAK6G,aAMtB,KAAM8rB,IAAU1oB,GACf2pB,GAAajB,EAAQ1oB,EAAG0oB,GAAUa,EAAarZ,EAKjD,OAAOsZ,GAAEhmB,KAAM,KAAMnK,QAAS0vB,GAAK,KAGpC,SAASY,IAAajB,EAAQ3uB,EAAKwvB,EAAarZ,GAC/C,GAAIxX,EAEJ,IAAKtF,EAAO6F,QAASc,GAEpB3G,EAAOmE,KAAMwC,EAAK,SAAU9B,EAAG2xB,GACzBL,GAAeP,GAASxyB,KAAMkyB,GAElCxY,EAAKwY,EAAQkB,GAIbD,GAAajB,EAAS,KAAqB,gBAANkB,GAAiB3xB,EAAI,IAAO,IAAK2xB,EAAGL,EAAarZ,SAIlF,IAAMqZ,GAAsC,WAAvBn2B,EAAO4G,KAAMD,GAQxCmW,EAAKwY,EAAQ3uB,OANb,KAAMrB,IAAQqB,GACb4vB,GAAajB,EAAS,IAAMhwB,EAAO,IAAKqB,EAAKrB,GAAQ6wB,EAAarZ,GAQrE9c,EAAOmE,KAAM,0MAEqDkH,MAAM,KAAM,SAAUxG,EAAGS,GAG1FtF,EAAOsB,GAAIgE,GAAS,SAAUmC,EAAMnG,GACnC,MAAOmD,WAAU5B,OAAS,EACzBF,KAAKooB,GAAIzlB,EAAM,KAAMmC,EAAMnG,GAC3BqB,KAAK8D,QAASnB,MAIjBtF,EAAOsB,GAAG8D,QACTqxB,MAAO,SAAUC,EAAQC,GACxB,MAAOh0B,MAAK8nB,WAAYiM,GAAShM,WAAYiM,GAASD,IAGvDE,KAAM,SAAU/Q,EAAOpe,EAAMnG,GAC5B,MAAOqB,MAAKooB,GAAIlF,EAAO,KAAMpe,EAAMnG,IAEpCu1B,OAAQ,SAAUhR,EAAOvkB,GACxB,MAAOqB,MAAK+D,IAAKmf,EAAO,KAAMvkB;EAG/Bw1B,SAAU,SAAU11B,EAAUykB,EAAOpe,EAAMnG,GAC1C,MAAOqB,MAAKooB,GAAIlF,EAAOzkB,EAAUqG,EAAMnG,IAExCy1B,WAAY,SAAU31B,EAAUykB,EAAOvkB,GAEtC,MAA4B,KAArBmD,UAAU5B,OAAeF,KAAK+D,IAAKtF,EAAU,MAASuB,KAAK+D,IAAKmf,EAAOzkB,GAAY,KAAME,KAGlG,IAEC01B,IACAC,GAEAC,GAAal3B,EAAO4K,MAEpBusB,GAAc,KACdC,GAAQ,OACRC,GAAM,gBACNC,GAAW,6BAEXC,GAAiB,4DACjBC,GAAa,iBACbC,GAAY,QACZC,GAAO,8CAGPC,GAAQ33B,EAAOsB,GAAGuoB,KAWlB+N,MAOAC,MAGAC,GAAW,KAAKv3B,OAAO,IAIxB,KACC02B,GAAet3B,EAASsX,KACvB,MAAO7P,IAGR6vB,GAAer3B,E
AASiI,cAAe,KACvCovB,GAAahgB,KAAO,GACpBggB,GAAeA,GAAahgB,KAI7B+f,GAAeU,GAAK50B,KAAMm0B,GAAa1tB,kBAGvC,SAASwuB,IAA6BC,GAGrC,MAAO,UAAUC,EAAoB9a,GAED,gBAAvB8a,KACX9a,EAAO8a,EACPA,EAAqB,IAGtB,IAAI1H,GACH1rB,EAAI,EACJqzB,EAAYD,EAAmB1uB,cAAc9G,MAAOf,MAErD,IAAK1B,EAAOsD,WAAY6Z,GAEvB,MAASoT,EAAW2H,EAAUrzB,KAER,MAAhB0rB,EAAS,IACbA,EAAWA,EAAS5vB,MAAO,IAAO,KACjCq3B,EAAWzH,GAAayH,EAAWzH,QAAkB1c,QAASsJ,KAI9D6a,EAAWzH,GAAayH,EAAWzH,QAAkB9vB,KAAM0c,IAQjE,QAASgb,IAA+BH,EAAW3yB,EAAS+yB,EAAiBC,GAE5E,GAAIC,MACHC,EAAqBP,IAAcH,EAEpC,SAASW,GAASjI,GACjB,GAAIjZ,EAYJ,OAXAghB,GAAW/H,IAAa,EACxBvwB,EAAOmE,KAAM6zB,EAAWzH,OAAkB,SAAUvhB,EAAGypB,GACtD,GAAIC,GAAsBD,EAAoBpzB,EAAS+yB,EAAiBC,EACxE,OAAmC,gBAAxBK,IAAqCH,GAAqBD,EAAWI,GAIpEH,IACDjhB,EAAWohB,GADf,WAHNrzB,EAAQ6yB,UAAUrkB,QAAS6kB,GAC3BF,EAASE,IACF,KAKFphB,EAGR,MAAOkhB,GAASnzB,EAAQ6yB,UAAW,MAAUI,EAAW,MAASE,EAAS,KAM3E,QAASG,IAAYhzB,EAAQJ,GAC5B,GAAIgF,GAAK3E,EACRgzB,EAAc54B,EAAOs2B,aAAasC,eAEnC,KAAMruB,IAAOhF,GACPA,EAAKgF,KAAUhL,aACjBq5B,EAAaruB,GAAQ5E,EAAWC,IAASA,OAAgB2E,GAAQhF,EAAKgF,GAO1E,OAJK3E,IACJ5F,EAAOoF,QAAQ,EAAMO,EAAQC,GAGvBD,EAGR3F,EAAOsB,GAAGuoB,KAAO,SAAUwG,EAAKwI,EAAQz0B,GACvC,GAAoB,gBAARisB,IAAoBsH,GAC/B,MAAOA,IAAMnzB,MAAO7B,KAAM8B,UAG3B,IAAIrD,GAAUwF,EAAMkyB,EACnBlc,EAAOja,KACP+D,EAAM2pB,EAAIxvB,QAAQ,IA+CnB,OA7CK6F,IAAO,IACXtF,EAAWivB,EAAI1vB,MAAO+F,GACtB2pB,EAAMA,EAAI1vB,MAAO,EAAG+F,IAIhB1G,EAAOsD,WAAYu1B,IAGvBz0B,EAAWy0B,EACXA,EAASt5B,WAGEs5B,GAA4B,gBAAXA,KAC5BjyB,EAAO,QAIHgW,EAAK/Z,OAAS,GAClB7C,EAAOswB,MACND,IAAKA,EAGLzpB,KAAMA,EACN2pB,SAAU,OACV9oB,KAAMoxB,IACJt0B,KAAK,SAAUw0B,GAGjBD,EAAWr0B,UAEXmY,EAAKmS,KAAM3tB,EAIVpB,EAAO,SAASiuB,OAAQjuB,EAAOiD,UAAW81B,IAAiBh2B,KAAM3B,GAGjE23B,KAECC,SAAU50B,GAAY,SAAUi0B,EAAOY,GACzCrc,EAAKzY,KAAMC,EAAU00B,IAAcT,EAAMU,aAAcE,EAAQZ,MAI1D11B,MAIR3C,EAAOmE,MAAQ,YAAa,WAAY,eAAgB,YAAa,cAAe,YAAc,SAAUU,EAAG+B,GAC9G5G,EAAOsB,GAAIsF,GAAS,SAAUtF,GAC7B,MAAOqB,MAAKooB,GAAInkB,EAAMtF,MAIxBtB,EAAOoF,QAGN8zB,OAAQ,EAGRC,gBACAC,QAEA9C,cACCjG,IAAK4G,GACLrwB,KAAM,MACNyyB,QAAS9B,GAAen0B,KAAM4zB,GAAc,IAC5CpR,QAAQ,EACR0T,aAAa,EACb9I,OAAO,EACP+I,YAAa,mDAabtY,SACCuY,IAAK1B,GACL9uB,KAAM,aACN+lB,KAAM,YACN1mB,IAAK,4BACLoxB,KAAM,qCAGPlO,UACCljB,IAAK,MACL0mB,KAAM,OACN0K,KAAM,QAGPC,gBACCrxB,IAAK,cACLW,KAAM,eACNywB,KAAM,gBAKPE,YAGCC,SAAUzyB,OAGV0yB,aAAa,EAGbC,YAAa95B,EAAOiI,UAGpB8xB,WAAY/5B,EAAOoI,UAOpBwwB,aACCvI,KAAK,EACLhvB,SAAS,IAOX24B,UAAW,SAAUr0B,EAAQs0B,GAC5B,MAAOA,GAGNtB,GAAYA,GAAYhzB,EAAQ3F,EAAOs2B,cAAgB2D,GAGvDtB,GAAY34B,EAAOs2B,aAAc3wB,IAGnCu0B,cAAenC,GAA6BH,IAC5CuC,cAAepC,GAA6BF,IAG5CvH,KAAM,SAAUD,EAAKhrB,GAGA,gBAARgrB,KACXhrB,EAAUgrB,EACVA,EAAM9wB,WAIP8F,EAAUA,KAEV,IAAI+0B,GAEHC,EAEAC,EACAC,EAEAC,EAEA9E,EAEA+E,EAEA51B,EAEAuxB,EAAIp2B,EAAOg6B,aAAe30B,GAE1Bq1B,EAAkBtE,EAAE/0B,SAAW+0B,EAE/BuE,EAAqBvE,EAAE/0B,UAAaq5B,EAAgBx3B,UAAYw3B,EAAgBn4B,QAC/EvC,EAAQ06B,GACR16B,EAAO2lB,MAERpI,EAAWvd,EAAOiL,WAClB2vB,EAAmB56B,EAAOgc,UAAU,eAEpC6e,EAAazE,EAAEyE,eAEfC,KACAC,KAEA1d,EAAQ,EAER2d,EAAW,WAEX3C,GACCntB,WAAY,EAGZ+vB,kBAAmB,SAAU1wB,GAC5B,GAAI9H,EACJ,IAAe,IAAV4a,EAAc,CAClB,IAAMkd,EAAkB,CACvBA,IACA,OAAS93B,EAAQ60B,GAASx0B,KAAMw3B,GAC/BC,EAAiB93B,EAAM,GAAG8G,eAAkB9G,EAAO,GAGrDA,EAAQ83B,EAAiBhwB,EAAIhB,eAE9B,MAAgB,OAAT9G,EAAgB,KAAOA,GAI/By4B,sBAAuB,WACtB,MAAiB,KAAV7d,EAAcid,EAAwB,MAI9Ca,iBAAkB,SAAU71B,EAAMkE,GACjC,GAAI4xB,GAAQ91B,EAAKiE,aAKjB,OAJM8T,KACL/X,EAAOy1B,EAAqBK,GAAUL,EAAqBK,IAAW91B,EACtEw1B,EAAgBx1B,GAASkE,GAEnB7G,MAIR04B,iBAAkB,SAAUz0B,GAI3B,MAHMyW,KACL+Y,EAAEkF,SAAW10B,GAEPjE,MAIRk4B,WAAY,SAAU71B,GACrB,GAAI4D,EACJ,IAAK5D,EACJ,GAAa,EAARqY,EACJ,IAAMzU,IAAQ5D,GAEb61B,EAAYjyB,IAAWiyB,EAAYjyB,GAAQ5D,EAAK4D,QAIjDyvB,GAAM/a,OAAQtY,EAAKqzB,EAAMY,QAG3B
,OAAOt2B,OAIR44B,MAAO,SAAUC,GAChB,GAAIC,GAAYD,GAAcR,CAK9B,OAJKZ,IACJA,EAAUmB,MAAOE,GAElBl3B,EAAM,EAAGk3B,GACF94B,MAyCV,IApCA4a,EAASjZ,QAAS+zB,GAAQW,SAAW4B,EAAiB9d,IACtDub,EAAMqD,QAAUrD,EAAM9zB,KACtB8zB,EAAM/wB,MAAQ+wB,EAAM7a,KAMpB4Y,EAAE/F,MAAUA,GAAO+F,EAAE/F,KAAO4G,IAAiB,IAAKhxB,QAASmxB,GAAO,IAChEnxB,QAASwxB,GAAWT,GAAc,GAAM,MAG1CZ,EAAExvB,KAAOvB,EAAQs2B,QAAUt2B,EAAQuB,MAAQwvB,EAAEuF,QAAUvF,EAAExvB,KAGzDwvB,EAAE8B,UAAYl4B,EAAOmB,KAAMi1B,EAAE7F,UAAY,KAAMhnB,cAAc9G,MAAOf,KAAqB,IAGnE,MAAjB00B,EAAEwF,cACNlG,EAAQgC,GAAK50B,KAAMszB,EAAE/F,IAAI9mB,eACzB6sB,EAAEwF,eAAkBlG,GACjBA,EAAO,KAAQsB,GAAc,IAAOtB,EAAO,KAAQsB,GAAc,KAChEtB,EAAO,KAAwB,UAAfA,EAAO,GAAkB,KAAO,WAC/CsB,GAAc,KAA+B,UAAtBA,GAAc,GAAkB,KAAO,UAK/DZ,EAAE3uB,MAAQ2uB,EAAEkD,aAAiC,gBAAXlD,GAAE3uB,OACxC2uB,EAAE3uB,KAAOzH,EAAOi2B,MAAOG,EAAE3uB,KAAM2uB,EAAED,cAIlCgC,GAA+BP,GAAYxB,EAAG/wB,EAASgzB,GAGxC,IAAVhb,EACJ,MAAOgb,EAIRoC,GAAcrE,EAAExQ,OAGX6U,GAAmC,IAApBz6B,EAAOk5B,UAC1Bl5B,EAAO2lB,MAAMlf,QAAQ,aAItB2vB,EAAExvB,KAAOwvB,EAAExvB,KAAK1E,cAGhBk0B,EAAEyF,YAAcrE,GAAWp0B,KAAMgzB,EAAExvB,MAInCyzB,EAAWjE,EAAE/F,IAGP+F,EAAEyF,aAGFzF,EAAE3uB,OACN4yB,EAAajE,EAAE/F,MAAS8G,GAAY/zB,KAAMi3B,GAAa,IAAM,KAAQjE,EAAE3uB,WAEhE2uB,GAAE3uB,MAIL2uB,EAAE1lB,SAAU,IAChB0lB,EAAE/F,IAAMgH,GAAIj0B,KAAMi3B,GAGjBA,EAASp0B,QAASoxB,GAAK,OAASH,MAGhCmD,GAAalD,GAAY/zB,KAAMi3B,GAAa,IAAM,KAAQ,KAAOnD,OAK/Dd,EAAE0F,aACD97B,EAAOm5B,aAAckB,IACzBhC,EAAM8C,iBAAkB,oBAAqBn7B,EAAOm5B,aAAckB,IAE9Dr6B,EAAOo5B,KAAMiB,IACjBhC,EAAM8C,iBAAkB,gBAAiBn7B,EAAOo5B,KAAMiB,MAKnDjE,EAAE3uB,MAAQ2uB,EAAEyF,YAAczF,EAAEmD,eAAgB,GAASl0B,EAAQk0B,cACjElB,EAAM8C,iBAAkB,eAAgB/E,EAAEmD,aAI3ClB,EAAM8C,iBACL,SACA/E,EAAE8B,UAAW,IAAO9B,EAAEnV,QAASmV,EAAE8B,UAAU,IAC1C9B,EAAEnV,QAASmV,EAAE8B,UAAU,KAA8B,MAArB9B,EAAE8B,UAAW,GAAc,KAAOJ,GAAW,WAAa,IAC1F1B,EAAEnV,QAAS,KAIb,KAAMpc,IAAKuxB,GAAE2F,QACZ1D,EAAM8C,iBAAkBt2B,EAAGuxB,EAAE2F,QAASl3B,GAIvC,IAAKuxB,EAAE4F,aAAgB5F,EAAE4F,WAAWp4B,KAAM82B,EAAiBrC,EAAOjC,MAAQ,GAAmB,IAAV/Y,GAElF,MAAOgb,GAAMkD,OAIdP,GAAW,OAGX,KAAMn2B,KAAO62B,QAAS,EAAGp0B,MAAO,EAAG0xB,SAAU,GAC5CX,EAAOxzB,GAAKuxB,EAAGvxB,GAOhB,IAHAu1B,EAAYjC,GAA+BN,GAAYzB,EAAG/wB,EAASgzB,GAK5D,CACNA,EAAMntB,WAAa,EAGduvB,GACJE,EAAmBl0B,QAAS,YAAc4xB,EAAOjC,IAG7CA,EAAE5F,OAAS4F,EAAEtT,QAAU,IAC3B0X,EAAervB,WAAW,WACzBktB,EAAMkD,MAAM,YACVnF,EAAEtT,SAGN,KACCzF,EAAQ,EACR+c,EAAU6B,KAAMnB,EAAgBv2B,GAC/B,MAAQ6C,GAET,KAAa,EAARiW,GAIJ,KAAMjW,EAHN7C,GAAM,GAAI6C,QArBZ7C,GAAM,GAAI,eA8BX,SAASA,GAAM00B,EAAQiD,EAAkBC,EAAWJ,GACnD,GAAIK,GAAWV,EAASp0B,EAAOwxB,EAAUuD,EACxCb,EAAaU,CAGC,KAAV7e,IAKLA,EAAQ,EAGHmd,GACJzX,aAAcyX,GAKfJ,EAAY76B,UAGZ+6B,EAAwByB,GAAW,GAGnC1D,EAAMntB,WAAa+tB,EAAS,EAAI,EAAI,EAGpCmD,EAAYnD,GAAU,KAAgB,IAATA,GAA2B,MAAXA,EAGxCkD,IACJrD,EAAWwD,GAAqBlG,EAAGiC,EAAO8D,IAI3CrD,EAAWyD,GAAanG,EAAG0C,EAAUT,EAAO+D,GAGvCA,GAGChG,EAAE0F,aACNO,EAAWhE,EAAM4C,kBAAkB,iBAC9BoB,IACJr8B,EAAOm5B,aAAckB,GAAagC,GAEnCA,EAAWhE,EAAM4C,kBAAkB,QAC9BoB,IACJr8B,EAAOo5B,KAAMiB,GAAagC,IAKZ,MAAXpD,GAA6B,SAAX7C,EAAExvB,KACxB40B,EAAa,YAGS,MAAXvC,EACXuC,EAAa,eAIbA,EAAa1C,EAASzb,MACtBqe,EAAU5C,EAASrxB,KACnBH,EAAQwxB,EAASxxB,MACjB80B,GAAa90B,KAKdA,EAAQk0B,GACHvC,IAAWuC,KACfA,EAAa,QACC,EAATvC,IACJA,EAAS,KAMZZ,EAAMY,OAASA,EACfZ,EAAMmD,YAAeU,GAAoBV,GAAe,GAGnDY,EACJ7e,EAAS/W,YAAak0B,GAAmBgB,EAASF,EAAYnD,IAE9D9a,EAASif,WAAY9B,GAAmBrC,EAAOmD,EAAYl0B,IAI5D+wB,EAAMwC,WAAYA,GAClBA,EAAat7B,UAERk7B,GACJE,EAAmBl0B,QAAS21B,EAAY,cAAgB,aACrD/D,EAAOjC,EAAGgG,EAAYV,EAAUp0B,IAIpCszB,EAAiB1d,SAAUwd,GAAmBrC,EAAOmD,IAEhDf,IACJE,EAAmBl0B,QAAS,gBAAkB4xB,EAAOjC,MAE3Cp2B,EAAOk5B,QAChBl5B,EAAO2lB,MAAMlf,QAAQ,cAKxB,MAAO4xB,IAGRoE,QAAS,SAAUpM,EAAK5oB,EAAMrD,GAC7B,MAAOpE,GAAO6D,IAAKwsB,EAAK5oB,EA
AMrD,EAAU,SAGzCs4B,UAAW,SAAUrM,EAAKjsB,GACzB,MAAOpE,GAAO6D,IAAKwsB,EAAK9wB,UAAW6E,EAAU,aAI/CpE,EAAOmE,MAAQ,MAAO,QAAU,SAAUU,EAAG82B,GAC5C37B,EAAQ27B,GAAW,SAAUtL,EAAK5oB,EAAMrD,EAAUwC,GAQjD,MANK5G,GAAOsD,WAAYmE,KACvBb,EAAOA,GAAQxC,EACfA,EAAWqD,EACXA,EAAOlI,WAGDS,EAAOswB,MACbD,IAAKA,EACLzpB,KAAM+0B,EACNpL,SAAU3pB,EACVa,KAAMA,EACNi0B,QAASt3B,MASZ,SAASk4B,IAAqBlG,EAAGiC,EAAO8D,GAEvC,GAAIQ,GAAI/1B,EAAMg2B,EAAeC,EAC5BtR,EAAW6K,EAAE7K,SACb2M,EAAY9B,EAAE8B,SAGf,OAA0B,MAAnBA,EAAW,GACjBA,EAAUtnB,QACL+rB,IAAOp9B,YACXo9B,EAAKvG,EAAEkF,UAAYjD,EAAM4C,kBAAkB,gBAK7C,IAAK0B,EACJ,IAAM/1B,IAAQ2kB,GACb,GAAKA,EAAU3kB,IAAU2kB,EAAU3kB,GAAOxD,KAAMu5B,GAAO,CACtDzE,EAAUrkB,QAASjN,EACnB,OAMH,GAAKsxB,EAAW,IAAOiE,GACtBS,EAAgB1E,EAAW,OACrB,CAEN,IAAMtxB,IAAQu1B,GAAY,CACzB,IAAMjE,EAAW,IAAO9B,EAAEuD,WAAY/yB,EAAO,IAAMsxB,EAAU,IAAO,CACnE0E,EAAgBh2B,CAChB,OAEKi2B,IACLA,EAAgBj2B,GAIlBg2B,EAAgBA,GAAiBC,EAMlC,MAAKD,IACCA,IAAkB1E,EAAW,IACjCA,EAAUrkB,QAAS+oB,GAEbT,EAAWS,IAJnB,UAWD,QAASL,IAAanG,EAAG0C,EAAUT,EAAO+D,GACzC,GAAIU,GAAOC,EAASC,EAAM10B,EAAKkjB,EAC9BmO,KAEAzB,EAAY9B,EAAE8B,UAAUv3B,OAGzB,IAAKu3B,EAAW,GACf,IAAM8E,IAAQ5G,GAAEuD,WACfA,EAAYqD,EAAKzzB,eAAkB6sB,EAAEuD,WAAYqD,EAInDD,GAAU7E,EAAUtnB,OAGpB,OAAQmsB,EAcP,GAZK3G,EAAEsD,eAAgBqD,KACtB1E,EAAOjC,EAAEsD,eAAgBqD,IAAcjE,IAIlCtN,GAAQ4Q,GAAahG,EAAE6G,aAC5BnE,EAAW1C,EAAE6G,WAAYnE,EAAU1C,EAAE7F,WAGtC/E,EAAOuR,EACPA,EAAU7E,EAAUtnB,QAKnB,GAAiB,MAAZmsB,EAEJA,EAAUvR,MAGJ,IAAc,MAATA,GAAgBA,IAASuR,EAAU,CAM9C,GAHAC,EAAOrD,EAAYnO,EAAO,IAAMuR,IAAapD,EAAY,KAAOoD,IAG1DC,EACL,IAAMF,IAASnD,GAId,GADArxB,EAAMw0B,EAAMzxB,MAAO,KACd/C,EAAK,KAAQy0B,IAGjBC,EAAOrD,EAAYnO,EAAO,IAAMljB,EAAK,KACpCqxB,EAAY,KAAOrxB,EAAK,KACb,CAEN00B,KAAS,EACbA,EAAOrD,EAAYmD,GAGRnD,EAAYmD,MAAY,IACnCC,EAAUz0B,EAAK,GACf4vB,EAAUrkB,QAASvL,EAAK,IAEzB,OAOJ,GAAK00B,KAAS,EAGb,GAAKA,GAAQ5G,EAAG,UACf0C,EAAWkE,EAAMlE,OAEjB,KACCA,EAAWkE,EAAMlE,GAChB,MAAQ1xB,GACT,OAASiW,MAAO,cAAe/V,MAAO01B,EAAO51B,EAAI,sBAAwBokB,EAAO,OAASuR,IAQ/F,OAAS1f,MAAO,UAAW5V,KAAMqxB,GAGlC94B,EAAOg6B,WACN/Y,SACCpY,OAAQ,6FAET0iB,UACC1iB,OAAQ,uBAET8wB,YACCuD,cAAe,SAAUl0B,GAExB,MADAhJ,GAAO2I,WAAYK,GACZA,MAMVhJ,EAAOk6B,cAAe,SAAU,SAAU9D,GACpCA,EAAE1lB,QAAUnR,YAChB62B,EAAE1lB,OAAQ,GAEN0lB,EAAEwF,cACNxF,EAAExvB,KAAO,SAKX5G,EAAOm6B,cAAe,SAAU,SAAU/D,GAEzC,GAAKA,EAAEwF,YAAc,CACpB,GAAI/yB,GAAQzE,CACZ,QACC63B,KAAM,SAAUjtB,EAAGgqB,GAClBnwB,EAAS7I,EAAO,YAAYuhB,MAC3BiP,OAAO,EACP2M,QAAS/G,EAAEgH,cACX73B,IAAK6wB,EAAE/F,MACLtF,GACF,aACA3mB,EAAW,SAAUi5B,GACpBx0B,EAAOd,SACP3D,EAAW,KACNi5B,GACJrE,EAAuB,UAAbqE,EAAIz2B,KAAmB,IAAM,IAAKy2B,EAAIz2B,QAInDhH,EAASqJ,KAAKC,YAAaL,EAAQ,KAEpC0yB,MAAO,WACDn3B,GACJA,QAML,IAAIk5B,OACHC,GAAS,mBAGVv9B,GAAOg6B,WACNwD,MAAO,WACPC,cAAe,WACd,GAAIr5B,GAAWk5B,GAAarwB,OAAWjN,EAAO8F,QAAU,IAAQoxB,IAEhE,OADAv0B,MAAMyB,IAAa,EACZA,KAKTpE,EAAOk6B,cAAe,aAAc,SAAU9D,EAAGsH,EAAkBrF,GAElE,GAAIsF,GAAcC,EAAaC,EAC9BC,EAAW1H,EAAEoH,SAAU,IAAWD,GAAOn6B,KAAMgzB,EAAE/F,KAChD,MACkB,gBAAX+F,GAAE3uB,QAAwB2uB,EAAEmD,aAAe,IAAK14B,QAAQ,sCAAwC08B,GAAOn6B,KAAMgzB,EAAE3uB,OAAU,OAIlI,OAAKq2B,IAAiC,UAArB1H,EAAE8B,UAAW,IAG7ByF,EAAevH,EAAEqH,cAAgBz9B,EAAOsD,WAAY8yB,EAAEqH,eACrDrH,EAAEqH,gBACFrH,EAAEqH,cAGEK,EACJ1H,EAAG0H,GAAa1H,EAAG0H,GAAW73B,QAASs3B,GAAQ,KAAOI,GAC3CvH,EAAEoH,SAAU,IACvBpH,EAAE/F,MAAS8G,GAAY/zB,KAAMgzB,EAAE/F,KAAQ,IAAM,KAAQ+F,EAAEoH,MAAQ,IAAMG,GAItEvH,EAAEuD,WAAW,eAAiB,WAI7B,MAHMkE,IACL79B,EAAOsH,MAAOq2B,EAAe,mBAEvBE,EAAmB,IAI3BzH,EAAE8B,UAAW,GAAM,OAGnB0F,EAAct+B,EAAQq+B,GACtBr+B,EAAQq+B,GAAiB,WACxBE,EAAoBp5B,WAIrB4zB,EAAM/a,OAAO,WAEZhe,EAAQq+B,GAAiBC,EAGpBxH,EAAGuH,KAEPvH,EAAEqH,cAAgBC,EAAiBD,cAGnCH,GAAa78B,KAAMk9B,IAIfE,GAAqB79B,EAAOsD,WAAYs6B,IAC5CA,EAAaC,EAAmB,IAGjCA,EAA
oBD,EAAcr+B,YAI5B,UAtDR,YAyDDS,EAAOs2B,aAAayH,IAAM,WACzB,IACC,MAAO,IAAIC,gBACV,MAAO52B,KAGV,IAAI62B,IAAej+B,EAAOs2B,aAAayH,MACtCG,IAEC,EAAG,IAGHC,KAAM,KAKPC,GAAQ,EACRC,KAEI/+B,GAAOg/B,eACXt+B,EAAQV,GAASyrB,GAAI,SAAU,WAC9B,IAAK,GAAIxgB,KAAO8zB,IACfA,GAAc9zB,IAEf8zB,IAAe9+B,YAIjBS,EAAOsL,QAAQizB,OAASN,IAAkB,mBAAqBA,IAC/Dj+B,EAAOsL,QAAQglB,KAAO2N,KAAiBA,GAEvCj+B,EAAOm6B,cAAc,SAAU90B,GAC9B,GAAIjB,EAEJ,OAAKpE,GAAOsL,QAAQizB,MAAQN,KAAiB54B,EAAQu2B,aAEnDK,KAAM,SAAUF,EAAS/C,GACxB,GAAIn0B,GAAGgL,EACNkuB,EAAM14B,EAAQ04B,KAGf,IAFAA,EAAIS,KAAMn5B,EAAQuB,KAAMvB,EAAQgrB,IAAKhrB,EAAQmrB,MAAOnrB,EAAQo5B,SAAUp5B,EAAQ6S,UAEzE7S,EAAQq5B,UACZ,IAAM75B,IAAKQ,GAAQq5B,UAClBX,EAAKl5B,GAAMQ,EAAQq5B,UAAW75B,EAI3BQ,GAAQi2B,UAAYyC,EAAI1C,kBAC5B0C,EAAI1C,iBAAkBh2B,EAAQi2B,UAOzBj2B,EAAQu2B,aAAgBG,EAAQ,sBACrCA,EAAQ,oBAAsB,iBAG/B,KAAMl3B,IAAKk3B,GACVgC,EAAI5C,iBAAkBt2B,EAAGk3B,EAASl3B,GAGnCT,GAAW,SAAUwC,GACpB,MAAO,YACDxC,UACGi6B,IAAcxuB,GACrBzL,EAAW25B,EAAIY,OAASZ,EAAIa,QAAU,KACxB,UAATh4B,EACJm3B,EAAIxC,QACgB,UAAT30B,EACXoyB,EAEC+E,EAAI9E,QAAU,IACd8E,EAAIvC,YAGLxC,EACCkF,GAAkBH,EAAI9E,SAAY8E,EAAI9E,OACtC8E,EAAIvC,WAIwB,gBAArBuC,GAAIhF,cACV/vB,KAAM+0B,EAAIhF,cACPx5B,UACJw+B,EAAI7C,4BAOT6C,EAAIY,OAASv6B,IACb25B,EAAIa,QAAUx6B,EAAS,SAEvBA,EAAWi6B,GAAexuB,EAAKuuB,MAAah6B,EAAS,SAIrD25B,EAAI9B,KAAM52B,EAAQw2B,YAAcx2B,EAAQoC,MAAQ,OAEjD8zB,MAAO,WACDn3B,GACJA,MAtEJ,WA4ED,IAAIy6B,IAAOC,GACVC,GAAW,yBACXC,GAAatxB,OAAQ,iBAAmBlM,EAAY,cAAe,KACnEy9B,GAAO,cACPC,IAAwBC,IACxBC,IACC5F,KAAM,SAAUjY,EAAM/X,GACrB,GAAI61B,GAAQ18B,KAAK28B,YAAa/d,EAAM/X,GACnC7D,EAAS05B,EAAMhuB,MACfqkB,EAAQsJ,GAAOl8B,KAAM0G,GACrB+1B,EAAO7J,GAASA,EAAO,KAAS11B,EAAOwzB,UAAWjS,GAAS,GAAK,MAGhEzL,GAAU9V,EAAOwzB,UAAWjS,IAAmB,OAATge,IAAkB55B,IACvDq5B,GAAOl8B,KAAM9C,EAAO4yB,IAAKyM,EAAM38B,KAAM6e,IACtCie,EAAQ,EACRC,EAAgB,EAEjB,IAAK3pB,GAASA,EAAO,KAAQypB,EAAO,CAEnCA,EAAOA,GAAQzpB,EAAO,GAGtB4f,EAAQA,MAGR5f,GAASnQ,GAAU,CAEnB,GAGC65B,GAAQA,GAAS,KAGjB1pB,GAAgB0pB,EAChBx/B,EAAOgL,MAAOq0B,EAAM38B,KAAM6e,EAAMzL,EAAQypB,SAI/BC,KAAWA,EAAQH,EAAMhuB,MAAQ1L,IAAqB,IAAV65B,KAAiBC,GAaxE,MATK/J,KACJ5f,EAAQupB,EAAMvpB,OAASA,IAAUnQ,GAAU,EAC3C05B,EAAME,KAAOA,EAEbF,EAAMp6B,IAAMywB,EAAO,GAClB5f,GAAU4f,EAAO,GAAM,GAAMA,EAAO,IACnCA,EAAO,IAGH2J,IAKV,SAASK,MAIR,MAHAv0B,YAAW,WACV0zB,GAAQt/B,YAEAs/B,GAAQ7+B,EAAO4K,MAGzB,QAAS00B,IAAa91B,EAAO+X,EAAMoe,GAClC,GAAIN,GACHO,GAAeR,GAAU7d,QAAehhB,OAAQ6+B,GAAU,MAC1DriB,EAAQ,EACRla,EAAS+8B,EAAW/8B,MACrB,MAAgBA,EAARka,EAAgBA,IACvB,GAAMsiB,EAAQO,EAAY7iB,GAAQnZ,KAAM+7B,EAAWpe,EAAM/X,GAGxD,MAAO61B,GAKV,QAASQ,IAAWn9B,EAAMo9B,EAAYz6B,GACrC,GAAIkQ,GACHwqB,EACAhjB,EAAQ,EACRla,EAASq8B,GAAoBr8B,OAC7B0a,EAAWvd,EAAOiL,WAAWqS,OAAQ,iBAE7B0iB,GAAKt9B,OAEbs9B,EAAO,WACN,GAAKD,EACJ,OAAO,CAER,IAAIE,GAAcpB,IAASa,KAC1BlhB,EAAYzY,KAAKwe,IAAK,EAAGob,EAAUO,UAAYP,EAAUQ,SAAWF,GAEpEtmB,EAAO6E,EAAYmhB,EAAUQ,UAAY,EACzCC,EAAU,EAAIzmB,EACdoD,EAAQ,EACRla,EAAS88B,EAAUU,OAAOx9B,MAE3B,MAAgBA,EAARka,EAAiBA,IACxB4iB,EAAUU,OAAQtjB,GAAQujB,IAAKF,EAKhC,OAFA7iB,GAASqB,WAAYlc,GAAQi9B,EAAWS,EAAS5hB,IAElC,EAAV4hB,GAAev9B,EACZ2b,GAEPjB,EAAS/W,YAAa9D,GAAQi9B,KACvB,IAGTA,EAAYpiB,EAASjZ,SACpB5B,KAAMA,EACNgmB,MAAO1oB,EAAOoF,UAAY06B,GAC1BS,KAAMvgC,EAAOoF,QAAQ,GAAQo7B,kBAAqBn7B,GAClDo7B,mBAAoBX,EACpB1H,gBAAiB/yB,EACjB66B,UAAWrB,IAASa,KACpBS,SAAU96B,EAAQ86B,SAClBE,UACAf,YAAa,SAAU/d,EAAMtc,GAC5B,GAAIo6B,GAAQr/B,EAAO0gC,MAAOh+B,EAAMi9B,EAAUY,KAAMhf,EAAMtc,EACpD06B,EAAUY,KAAKC,cAAejf,IAAUoe,EAAUY,KAAKI,OAEzD,OADAhB,GAAUU,OAAO5/B,KAAM4+B,GAChBA,GAER7c,KAAM,SAAUoe,GACf,GAAI7jB,GAAQ,EAGXla,EAAS+9B,EAAUjB,EAAUU,OAAOx9B,OAAS,CAC9C,IAAKk9B,EACJ,MAAOp9B,KAGR,KADAo9B,GAAU,EACMl9B,EAARka,EAAiBA,IACxB4iB,EAAUU,OAAQtjB,GAAQujB,IAAK
,EAUhC,OALKM,GACJrjB,EAAS/W,YAAa9D,GAAQi9B,EAAWiB,IAEzCrjB,EAASif,WAAY95B,GAAQi9B,EAAWiB,IAElCj+B,QAGT+lB,EAAQiX,EAAUjX,KAInB,KAFAmY,GAAYnY,EAAOiX,EAAUY,KAAKC,eAElB39B,EAARka,EAAiBA,IAExB,GADAxH,EAAS2pB,GAAqBniB,GAAQnZ,KAAM+7B,EAAWj9B,EAAMgmB,EAAOiX,EAAUY,MAE7E,MAAOhrB,EAmBT,OAfAvV,GAAOgF,IAAK0jB,EAAO4W,GAAaK,GAE3B3/B,EAAOsD,WAAYq8B,EAAUY,KAAKzqB,QACtC6pB,EAAUY,KAAKzqB,MAAMlS,KAAMlB,EAAMi9B,GAGlC3/B,EAAO4iB,GAAGke,MACT9gC,EAAOoF,OAAQ46B,GACdt9B,KAAMA,EACNq+B,KAAMpB,EACNzd,MAAOyd,EAAUY,KAAKre,SAKjByd,EAAU1hB,SAAU0hB,EAAUY,KAAKtiB,UACxC1Z,KAAMo7B,EAAUY,KAAKh8B,KAAMo7B,EAAUY,KAAKvH,UAC1Cxb,KAAMmiB,EAAUY,KAAK/iB,MACrBF,OAAQqiB,EAAUY,KAAKjjB,QAG1B,QAASujB,IAAYnY,EAAO8X,GAC3B,GAAIzjB,GAAOzX,EAAMq7B,EAAQn3B,EAAO6Y,CAGhC,KAAMtF,IAAS2L,GAed,GAdApjB,EAAOtF,EAAOoJ,UAAW2T,GACzB4jB,EAASH,EAAel7B,GACxBkE,EAAQkf,EAAO3L,GACV/c,EAAO6F,QAAS2D,KACpBm3B,EAASn3B,EAAO,GAChBA,EAAQkf,EAAO3L,GAAUvT,EAAO,IAG5BuT,IAAUzX,IACdojB,EAAOpjB,GAASkE,QACTkf,GAAO3L,IAGfsF,EAAQriB,EAAOqzB,SAAU/tB,GACpB+c,GAAS,UAAYA,GAAQ,CACjC7Y,EAAQ6Y,EAAMmT,OAAQhsB,SACfkf,GAAOpjB,EAId,KAAMyX,IAASvT,GACNuT,IAAS2L,KAChBA,EAAO3L,GAAUvT,EAAOuT,GACxByjB,EAAezjB,GAAU4jB,OAI3BH,GAAel7B,GAASq7B,EAK3B3gC,EAAO6/B,UAAY7/B,EAAOoF,OAAQy6B,IAEjCmB,QAAS,SAAUtY,EAAOtkB,GACpBpE,EAAOsD,WAAYolB,IACvBtkB,EAAWskB,EACXA,GAAU,MAEVA,EAAQA,EAAMrd,MAAM,IAGrB,IAAIkW,GACHxE,EAAQ,EACRla,EAAS6lB,EAAM7lB,MAEhB,MAAgBA,EAARka,EAAiBA,IACxBwE,EAAOmH,EAAO3L,GACdqiB,GAAU7d,GAAS6d,GAAU7d,OAC7B6d,GAAU7d,GAAO1N,QAASzP,IAI5B68B,UAAW,SAAU78B,EAAUiqB,GACzBA,EACJ6Q,GAAoBrrB,QAASzP,GAE7B86B,GAAoBz+B,KAAM2D,KAK7B,SAAS+6B,IAAkBz8B,EAAMgmB,EAAO6X,GAEvC,GAAIhf,GAAM/X,EAAO4pB,EAAQiM,EAAOhd,EAAO6e,EACtCH,EAAOp+B,KACPgoB,KACA3f,EAAQtI,EAAKsI,MACbgoB,EAAStwB,EAAKQ,UAAYwvB,GAAUhwB,GACpCy+B,EAAWxgB,EAAU9c,IAAKnB,EAAM,SAG3B69B,GAAKre,QACVG,EAAQriB,EAAOsiB,YAAa5f,EAAM,MACX,MAAlB2f,EAAM+e,WACV/e,EAAM+e,SAAW,EACjBF,EAAU7e,EAAM7K,MAAMkF,KACtB2F,EAAM7K,MAAMkF,KAAO,WACZ2F,EAAM+e,UACXF,MAIH7e,EAAM+e,WAENL,EAAKzjB,OAAO,WAGXyjB,EAAKzjB,OAAO,WACX+E,EAAM+e,WACAphC,EAAOkiB,MAAOxf,EAAM,MAAOG,QAChCwf,EAAM7K,MAAMkF,YAOO,IAAlBha,EAAKQ,WAAoB,UAAYwlB,IAAS,SAAWA,MAK7D6X,EAAKc,UAAar2B,EAAMq2B,SAAUr2B,EAAMs2B,UAAWt2B,EAAMu2B,WAIlB,WAAlCvhC,EAAO4yB,IAAKlwB,EAAM,YACW,SAAhC1C,EAAO4yB,IAAKlwB,EAAM,WAEnBsI,EAAMinB,QAAU,iBAIbsO,EAAKc,WACTr2B,EAAMq2B,SAAW,SACjBN,EAAKzjB,OAAO,WACXtS,EAAMq2B,SAAWd,EAAKc,SAAU,GAChCr2B,EAAMs2B,UAAYf,EAAKc,SAAU,GACjCr2B,EAAMu2B,UAAYhB,EAAKc,SAAU,KAMnC,KAAM9f,IAAQmH,GAEb,GADAlf,EAAQkf,EAAOnH,GACVwd,GAASj8B,KAAM0G,GAAU,CAG7B,SAFOkf,GAAOnH,GACd6R,EAASA,GAAoB,WAAV5pB,EACdA,KAAYwpB,EAAS,OAAS,QAAW,CAG7C,GAAe,SAAVxpB,IAAoB23B,GAAYA,EAAU5f,KAAWhiB,UAGzD,QAFAyzB,IAAS,EAKXrI,EAAMpJ,GAAS4f,GAAYA,EAAU5f,IAAUvhB,EAAOgL,MAAOtI,EAAM6e,GAIrE,IAAMvhB,EAAOqH,cAAesjB,GAAS,CAC/BwW,EACC,UAAYA,KAChBnO,EAASmO,EAASnO,QAGnBmO,EAAWxgB,EAAUrW,OAAQ5H,EAAM,aAI/B0wB,IACJ+N,EAASnO,QAAUA,GAEfA,EACJhzB,EAAQ0C,GAAOqwB,OAEfgO,EAAKx8B,KAAK,WACTvE,EAAQ0C,GAAOywB,SAGjB4N,EAAKx8B,KAAK,WACT,GAAIgd,EAEJZ,GAAU5Y,OAAQrF,EAAM,SACxB,KAAM6e,IAAQoJ,GACb3qB,EAAOgL,MAAOtI,EAAM6e,EAAMoJ,EAAMpJ,KAGlC,KAAMA,IAAQoJ,GACb0U,EAAQC,GAAatM,EAASmO,EAAU5f,GAAS,EAAGA,EAAMwf,GAElDxf,IAAQ4f,KACfA,EAAU5f,GAAS8d,EAAMvpB,MACpBkd,IACJqM,EAAMp6B,IAAMo6B,EAAMvpB,MAClBupB,EAAMvpB,MAAiB,UAATyL,GAA6B,WAATA,EAAoB,EAAI,KAO/D,QAASmf,IAAOh+B,EAAM2C,EAASkc,EAAMtc,EAAK07B,GACzC,MAAO,IAAID,IAAMp+B,UAAUf,KAAMmB,EAAM2C,EAASkc,EAAMtc,EAAK07B,GAE5D3gC,EAAO0gC,MAAQA,GAEfA,GAAMp+B,WACLE,YAAak+B,GACbn/B,KAAM,SAAUmB,EAAM2C,EAASkc,EAAMtc,EAAK07B,EAAQpB,GACjD58B,KAAKD,KAAOA,EACZC,KAAK4e,KAAOA,EACZ5e,KAAKg+B,OAASA,GAAU,QACxBh+B,KAAK0C,QAAUA,EACf1C,KAAKmT,MAAQnT,KAAKiI,IAAMjI,KAAK0O,MA
C7B1O,KAAKsC,IAAMA,EACXtC,KAAK48B,KAAOA,IAAUv/B,EAAOwzB,UAAWjS,GAAS,GAAK,OAEvDlQ,IAAK,WACJ,GAAIgR,GAAQqe,GAAM1b,UAAWriB,KAAK4e,KAElC,OAAOc,IAASA,EAAMxe,IACrBwe,EAAMxe,IAAKlB,MACX+9B,GAAM1b,UAAUgD,SAASnkB,IAAKlB,OAEhC29B,IAAK,SAAUF,GACd,GAAIoB,GACHnf,EAAQqe,GAAM1b,UAAWriB,KAAK4e,KAoB/B,OAjBC5e,MAAKkpB,IAAM2V,EADP7+B,KAAK0C,QAAQ86B,SACEngC,EAAO2gC,OAAQh+B,KAAKg+B,QACtCP,EAASz9B,KAAK0C,QAAQ86B,SAAWC,EAAS,EAAG,EAAGz9B,KAAK0C,QAAQ86B,UAG3CC,EAEpBz9B,KAAKiI,KAAQjI,KAAKsC,IAAMtC,KAAKmT,OAAU0rB,EAAQ7+B,KAAKmT,MAE/CnT,KAAK0C,QAAQo8B,MACjB9+B,KAAK0C,QAAQo8B,KAAK79B,KAAMjB,KAAKD,KAAMC,KAAKiI,IAAKjI,MAGzC0f,GAASA,EAAMf,IACnBe,EAAMf,IAAK3e,MAEX+9B,GAAM1b,UAAUgD,SAAS1G,IAAK3e,MAExBA,OAIT+9B,GAAMp+B,UAAUf,KAAKe,UAAYo+B,GAAMp+B,UAEvCo+B,GAAM1b,WACLgD,UACCnkB,IAAK,SAAUw7B,GACd,GAAI9pB,EAEJ,OAAiC,OAA5B8pB,EAAM38B,KAAM28B,EAAM9d,OACpB8d,EAAM38B,KAAKsI,OAA2C,MAAlCq0B,EAAM38B,KAAKsI,MAAOq0B,EAAM9d,OAQ/ChM,EAASvV,EAAO4yB,IAAKyM,EAAM38B,KAAM28B,EAAM9d,KAAM,IAErChM,GAAqB,SAAXA,EAAwBA,EAAJ,GAT9B8pB,EAAM38B,KAAM28B,EAAM9d,OAW3BD,IAAK,SAAU+d,GAGTr/B,EAAO4iB,GAAG6e,KAAMpC,EAAM9d,MAC1BvhB,EAAO4iB,GAAG6e,KAAMpC,EAAM9d,MAAQ8d,GACnBA,EAAM38B,KAAKsI,QAAgE,MAArDq0B,EAAM38B,KAAKsI,MAAOhL,EAAOg0B,SAAUqL,EAAM9d,QAAoBvhB,EAAOqzB,SAAUgM,EAAM9d,OACrHvhB,EAAOgL,MAAOq0B,EAAM38B,KAAM28B,EAAM9d,KAAM8d,EAAMz0B,IAAMy0B,EAAME,MAExDF,EAAM38B,KAAM28B,EAAM9d,MAAS8d,EAAMz0B,OASrC81B,GAAM1b,UAAUyE,UAAYiX,GAAM1b,UAAUqE,YAC3C/H,IAAK,SAAU+d,GACTA,EAAM38B,KAAKQ,UAAYm8B,EAAM38B,KAAKe,aACtC47B,EAAM38B,KAAM28B,EAAM9d,MAAS8d,EAAMz0B,OAKpC5K,EAAOmE,MAAO,SAAU,OAAQ,QAAU,SAAUU,EAAGS,GACtD,GAAIo8B,GAAQ1hC,EAAOsB,GAAIgE,EACvBtF,GAAOsB,GAAIgE,GAAS,SAAUq8B,EAAOhB,EAAQv8B,GAC5C,MAAgB,OAATu9B,GAAkC,iBAAVA,GAC9BD,EAAMl9B,MAAO7B,KAAM8B,WACnB9B,KAAKi/B,QAASC,GAAOv8B,GAAM,GAAQq8B,EAAOhB,EAAQv8B,MAIrDpE,EAAOsB,GAAG8D,QACT08B,OAAQ,SAAUH,EAAOI,EAAIpB,EAAQv8B,GAGpC,MAAOzB,MAAK+P,OAAQggB,IAAWE,IAAK,UAAW,GAAIG,OAGjD9tB,MAAM28B,SAAUtO,QAASyO,GAAMJ,EAAOhB,EAAQv8B,IAEjDw9B,QAAS,SAAUrgB,EAAMogB,EAAOhB,EAAQv8B,GACvC,GAAIoT,GAAQxX,EAAOqH,cAAeka,GACjCygB,EAAShiC,EAAO2hC,MAAOA,EAAOhB,EAAQv8B,GACtC69B,EAAc,WAEb,GAAIlB,GAAOlB,GAAWl9B,KAAM3C,EAAOoF,UAAYmc,GAAQygB,IAGlDxqB,GAASmJ,EAAU9c,IAAKlB,KAAM,YAClCo+B,EAAKve,MAAM,GAKd,OAFCyf,GAAYC,OAASD,EAEfzqB,GAASwqB,EAAO9f,SAAU,EAChCvf,KAAKwB,KAAM89B,GACXt/B,KAAKuf,MAAO8f,EAAO9f,MAAO+f,IAE5Bzf,KAAM,SAAU5b,EAAMoc,EAAY4d,GACjC,GAAIuB,GAAY,SAAU9f,GACzB,GAAIG,GAAOH,EAAMG,WACVH,GAAMG,KACbA,EAAMoe,GAYP,OATqB,gBAATh6B,KACXg6B,EAAU5d,EACVA,EAAapc,EACbA,EAAOrH,WAEHyjB,GAAcpc,KAAS,GAC3BjE,KAAKuf,MAAOtb,GAAQ,SAGdjE,KAAKwB,KAAK,WAChB,GAAIge,IAAU,EACbpF,EAAgB,MAARnW,GAAgBA,EAAO,aAC/Bw7B,EAASpiC,EAAOoiC,OAChB36B,EAAOkZ,EAAU9c,IAAKlB,KAEvB,IAAKoa,EACCtV,EAAMsV,IAAWtV,EAAMsV,GAAQyF,MACnC2f,EAAW16B,EAAMsV,QAGlB,KAAMA,IAAStV,GACTA,EAAMsV,IAAWtV,EAAMsV,GAAQyF,MAAQyc,GAAK77B,KAAM2Z,IACtDolB,EAAW16B,EAAMsV,GAKpB,KAAMA,EAAQqlB,EAAOv/B,OAAQka,KACvBqlB,EAAQrlB,GAAQra,OAASC,MAAiB,MAARiE,GAAgBw7B,EAAQrlB,GAAQmF,QAAUtb,IAChFw7B,EAAQrlB,GAAQgkB,KAAKve,KAAMoe,GAC3Bze,GAAU,EACVigB,EAAOj9B,OAAQ4X,EAAO,KAOnBoF,IAAYye,IAChB5gC,EAAOmiB,QAASxf,KAAMiE,MAIzBs7B,OAAQ,SAAUt7B,GAIjB,MAHKA,MAAS,IACbA,EAAOA,GAAQ,MAETjE,KAAKwB,KAAK,WAChB,GAAI4Y,GACHtV,EAAOkZ,EAAU9c,IAAKlB,MACtBuf,EAAQza,EAAMb,EAAO,SACrByb,EAAQ5a,EAAMb,EAAO,cACrBw7B,EAASpiC,EAAOoiC,OAChBv/B,EAASqf,EAAQA,EAAMrf,OAAS,CAajC,KAVA4E,EAAKy6B,QAAS,EAGdliC,EAAOkiB,MAAOvf,KAAMiE,MAEfyb,GAASA,EAAMG,MACnBH,EAAMG,KAAK5e,KAAMjB,MAAM,GAIlBoa,EAAQqlB,EAAOv/B,OAAQka,KACvBqlB,EAAQrlB,GAAQra,OAASC,MAAQy/B,EAAQrlB,GAAQmF,QAAUtb,IAC/Dw7B,EAAQrlB,GAAQgkB,KAAKve,MAAM,GAC3B4f,EAAOj9B,OAAQ4X,EAAO,GAKxB,KAAMA,EAAQ,EAAWla,EAARka,EAAgBA,IAC3
BmF,EAAOnF,IAAWmF,EAAOnF,GAAQmlB,QACrChgB,EAAOnF,GAAQmlB,OAAOt+B,KAAMjB,YAKvB8E,GAAKy6B,WAMf,SAASL,IAAOj7B,EAAMy7B,GACrB,GAAIvZ,GACH7X,GAAUqxB,OAAQ17B,GAClB/B,EAAI,CAKL,KADAw9B,EAAeA,EAAc,EAAI,EACtB,EAAJx9B,EAAQA,GAAK,EAAIw9B,EACvBvZ,EAAQuJ,GAAWxtB,GACnBoM,EAAO,SAAW6X,GAAU7X,EAAO,UAAY6X,GAAUliB,CAO1D,OAJKy7B,KACJpxB,EAAMqiB,QAAUriB,EAAMuP,MAAQ5Z,GAGxBqK,EAIRjR,EAAOmE,MACNo+B,UAAWV,GAAM,QACjBW,QAASX,GAAM,QACfY,YAAaZ,GAAM,UACnBa,QAAUpP,QAAS,QACnBqP,SAAWrP,QAAS,QACpBsP,YAActP,QAAS,WACrB,SAAUhuB,EAAMojB,GAClB1oB,EAAOsB,GAAIgE,GAAS,SAAUq8B,EAAOhB,EAAQv8B,GAC5C,MAAOzB,MAAKi/B,QAASlZ,EAAOiZ,EAAOhB,EAAQv8B,MAI7CpE,EAAO2hC,MAAQ,SAAUA,EAAOhB,EAAQr/B,GACvC,GAAI2d,GAAM0iB,GAA0B,gBAAVA,GAAqB3hC,EAAOoF,UAAYu8B,IACjE3I,SAAU13B,IAAOA,GAAMq/B,GACtB3gC,EAAOsD,WAAYq+B,IAAWA,EAC/BxB,SAAUwB,EACVhB,OAAQr/B,GAAMq/B,GAAUA,IAAW3gC,EAAOsD,WAAYq9B,IAAYA,EAwBnE,OArBA1hB,GAAIkhB,SAAWngC,EAAO4iB,GAAGlc,IAAM,EAA4B,gBAAjBuY,GAAIkhB,SAAwBlhB,EAAIkhB,SACzElhB,EAAIkhB,WAAYngC,GAAO4iB,GAAGC,OAAS7iB,EAAO4iB,GAAGC,OAAQ5D,EAAIkhB,UAAangC,EAAO4iB,GAAGC,OAAOmF,UAGtE,MAAb/I,EAAIiD,OAAiBjD,EAAIiD,SAAU,KACvCjD,EAAIiD,MAAQ,MAIbjD,EAAIlU,IAAMkU,EAAI+Z,SAEd/Z,EAAI+Z,SAAW,WACTh5B,EAAOsD,WAAY2b,EAAIlU,MAC3BkU,EAAIlU,IAAInH,KAAMjB,MAGVsc,EAAIiD,OACRliB,EAAOmiB,QAASxf,KAAMsc,EAAIiD,QAIrBjD,GAGRjf,EAAO2gC,QACNkC,OAAQ,SAAUC,GACjB,MAAOA,IAERC,MAAO,SAAUD,GAChB,MAAO,GAAM/8B,KAAKi9B,IAAKF,EAAE/8B,KAAKk9B,IAAO,IAIvCjjC,EAAOoiC,UACPpiC,EAAO4iB,GAAK8d,GAAMp+B,UAAUf,KAC5BvB,EAAO4iB,GAAGod,KAAO,WAChB,GAAIc,GACHsB,EAASpiC,EAAOoiC,OAChBv9B,EAAI,CAIL,KAFAg6B,GAAQ7+B,EAAO4K,MAEHw3B,EAAOv/B,OAAXgC,EAAmBA,IAC1Bi8B,EAAQsB,EAAQv9B,GAEVi8B,KAAWsB,EAAQv9B,KAAQi8B,GAChCsB,EAAOj9B,OAAQN,IAAK,EAIhBu9B,GAAOv/B,QACZ7C,EAAO4iB,GAAGJ,OAEXqc,GAAQt/B,WAGTS,EAAO4iB,GAAGke,MAAQ,SAAUA,GACtBA,KAAW9gC,EAAOoiC,OAAO3hC,KAAMqgC,IACnC9gC,EAAO4iB,GAAG9M,SAIZ9V,EAAO4iB,GAAGsgB,SAAW,GAErBljC,EAAO4iB,GAAG9M,MAAQ,WACXgpB,KACLA,GAAUqE,YAAanjC,EAAO4iB,GAAGod,KAAMhgC,EAAO4iB,GAAGsgB,YAInDljC,EAAO4iB,GAAGJ,KAAO,WAChB4gB,cAAetE,IACfA,GAAU,MAGX9+B,EAAO4iB,GAAGC,QACTwgB,KAAM,IACNC,KAAM,IAENtb,SAAU,KAIXhoB,EAAO4iB,GAAG6e,QAELzhC,EAAO8T,MAAQ9T,EAAO8T,KAAKwE,UAC/BtY,EAAO8T,KAAKwE,QAAQirB,SAAW,SAAU7gC,GACxC,MAAO1C,GAAOgK,KAAKhK,EAAOoiC,OAAQ,SAAU9gC,GAC3C,MAAOoB,KAASpB,EAAGoB,OACjBG,SAGL7C,EAAOsB,GAAGkiC,OAAS,SAAUn+B,GAC5B,GAAKZ,UAAU5B,OACd,MAAOwC,KAAY9F,UAClBoD,KACAA,KAAKwB,KAAK,SAAUU,GACnB7E,EAAOwjC,OAAOC,UAAW9gC,KAAM0C,EAASR,IAI3C,IAAIhF,GAAS6jC,EACZhhC,EAAOC,KAAM,GACbghC,GAAQxxB,IAAK,EAAGyxB,KAAM,GACtB7xB,EAAMrP,GAAQA,EAAKS,aAEpB,IAAM4O,EAON,MAHAlS,GAAUkS,EAAIjS,gBAGRE,EAAOmM,SAAUtM,EAAS6C,UAMpBA,GAAKmhC,wBAA0BnkC,IAC1CikC,EAAMjhC,EAAKmhC,yBAEZH,EAAMI,GAAW/xB,IAEhBI,IAAKwxB,EAAIxxB,IAAMuxB,EAAIK,YAAclkC,EAAQ6pB,UACzCka,KAAMD,EAAIC,KAAOF,EAAIM,YAAcnkC,EAAQypB,aAXpCqa,GAeT3jC,EAAOwjC,QAENC,UAAW,SAAU/gC,EAAM2C,EAASR,GACnC,GAAIo/B,GAAaC,EAASC,EAAWC,EAAQC,EAAWC,EAAYC,EACnExS,EAAW/xB,EAAO4yB,IAAKlwB,EAAM,YAC7B8hC,EAAUxkC,EAAQ0C,GAClBgmB,IAGiB,YAAbqJ,IACJrvB,EAAKsI,MAAM+mB,SAAW,YAGvBsS,EAAYG,EAAQhB,SACpBW,EAAYnkC,EAAO4yB,IAAKlwB,EAAM,OAC9B4hC,EAAatkC,EAAO4yB,IAAKlwB,EAAM,QAC/B6hC,GAAmC,aAAbxS,GAAwC,UAAbA,KAA4BoS,EAAYG,GAAazjC,QAAQ,QAAU,GAGnH0jC,GACJN,EAAcO,EAAQzS,WACtBqS,EAASH,EAAY9xB,IACrB+xB,EAAUD,EAAYL,OAGtBQ,EAASn9B,WAAYk9B,IAAe,EACpCD,EAAUj9B,WAAYq9B,IAAgB,GAGlCtkC,EAAOsD,WAAY+B,KACvBA,EAAUA,EAAQzB,KAAMlB,EAAMmC,EAAGw/B,IAGd,MAAfh/B,EAAQ8M,MACZuW,EAAMvW,IAAQ9M,EAAQ8M,IAAMkyB,EAAUlyB,IAAQiyB,GAE1B,MAAhB/+B,EAAQu+B,OACZlb,EAAMkb,KAASv+B,EAAQu+B,KAAOS,EAAUT,KAASM,GAG7C,SAAW7+B,GACfA,EAAQo/B,MAAM7gC,KAAMlB,EAAMgmB,GAG1B8b,EAAQ5R,IAAKlK,KAMhB1oB,EAAOsB,GAAG8D,QAET2sB,SAAU,WACT,GAAMpvB,KAAM,GAAZ,CAI
A,GAAI+hC,GAAclB,EACjB9gC,EAAOC,KAAM,GACbgiC,GAAiBxyB,IAAK,EAAGyxB,KAAM,EAuBhC,OApBwC,UAAnC5jC,EAAO4yB,IAAKlwB,EAAM,YAEtB8gC,EAAS9gC,EAAKmhC,yBAIda,EAAe/hC,KAAK+hC,eAGpBlB,EAAS7gC,KAAK6gC,SACRxjC,EAAOsJ,SAAUo7B,EAAc,GAAK,UACzCC,EAAeD,EAAalB,UAI7BmB,EAAaxyB,KAAOnS,EAAO4yB,IAAK8R,EAAc,GAAK,kBAAkB,GACrEC,EAAaf,MAAQ5jC,EAAO4yB,IAAK8R,EAAc,GAAK,mBAAmB,KAKvEvyB,IAAKqxB,EAAOrxB,IAAMwyB,EAAaxyB,IAAMnS,EAAO4yB,IAAKlwB,EAAM,aAAa,GACpEkhC,KAAMJ,EAAOI,KAAOe,EAAaf,KAAO5jC,EAAO4yB,IAAKlwB,EAAM,cAAc,MAI1EgiC,aAAc,WACb,MAAO/hC,MAAKqC,IAAI,WACf,GAAI0/B,GAAe/hC,KAAK+hC,cAAgB7kC,CAExC,OAAQ6kC,IAAmB1kC,EAAOsJ,SAAUo7B,EAAc,SAAsD,WAA1C1kC,EAAO4yB,IAAK8R,EAAc,YAC/FA,EAAeA,EAAaA,YAG7B,OAAOA,IAAgB7kC,OAO1BG,EAAOmE,MAAOklB,WAAY,cAAeI,UAAW,eAAgB,SAAUkS,EAAQpa,GACrF,GAAIpP,GAAM,gBAAkBoP,CAE5BvhB,GAAOsB,GAAIq6B,GAAW,SAAU3nB,GAC/B,MAAOhU,GAAOsK,OAAQ3H,KAAM,SAAUD,EAAMi5B,EAAQ3nB,GACnD,GAAI0vB,GAAMI,GAAWphC,EAErB,OAAKsR,KAAQzU,UACLmkC,EAAMA,EAAKniB,GAAS7e,EAAMi5B,IAG7B+H,EACJA,EAAIkB,SACFzyB,EAAY7S,EAAO0kC,YAAbhwB,EACP7B,EAAM6B,EAAM1U,EAAOykC,aAIpBrhC,EAAMi5B,GAAW3nB,EAPlB,YASE2nB,EAAQ3nB,EAAKvP,UAAU5B,OAAQ,QAIpC,SAASihC,IAAWphC,GACnB,MAAO1C,GAAO8G,SAAUpE,GAASA,EAAyB,IAAlBA,EAAKQ,UAAkBR,EAAKuP,YAGrEjS,EAAOmE,MAAQ0gC,OAAQ,SAAUC,MAAO,SAAW,SAAUx/B,EAAMsB,GAClE5G,EAAOmE,MAAQixB,QAAS,QAAU9vB,EAAMorB,QAAS9pB,EAAM,GAAI,QAAUtB,GAAQ,SAAUy/B,EAAcC,GAEpGhlC,EAAOsB,GAAI0jC,GAAa,SAAU7P,EAAQ3rB,GACzC,GAAIgB,GAAY/F,UAAU5B,SAAYkiC,GAAkC,iBAAX5P,IAC5DjB,EAAQ6Q,IAAkB5P,KAAW,GAAQ3rB,KAAU,EAAO,SAAW,SAE1E,OAAOxJ,GAAOsK,OAAQ3H,KAAM,SAAUD,EAAMkE,EAAM4C,GACjD,GAAIuI,EAEJ,OAAK/R,GAAO8G,SAAUpE,GAIdA,EAAK9C,SAASE,gBAAiB,SAAWwF,GAI3B,IAAlB5C,EAAKQ,UACT6O,EAAMrP,EAAK5C,gBAIJiG,KAAKwe,IACX7hB,EAAKwd,KAAM,SAAW5a,GAAQyM,EAAK,SAAWzM,GAC9C5C,EAAKwd,KAAM,SAAW5a,GAAQyM,EAAK,SAAWzM,GAC9CyM,EAAK,SAAWzM,KAIXkE,IAAUjK,UAEhBS,EAAO4yB,IAAKlwB,EAAMkE,EAAMstB,GAGxBl0B,EAAOgL,MAAOtI,EAAMkE,EAAM4C,EAAO0qB,IAChCttB,EAAM4D,EAAY2qB,EAAS51B,UAAWiL,EAAW,WAQvDxK,EAAOsB,GAAG2jC,KAAO,WAChB,MAAOtiC,MAAKE,QAGb7C,EAAOsB,GAAG4jC,QAAUllC,EAAOsB,GAAGyqB,QAGP,gBAAXoZ,SAAuBA,QAAoC,gBAAnBA,QAAOC,QAK1DD,OAAOC,QAAUplC,EASM,kBAAXqlC,SAAyBA,OAAOC,KAC3CD,OAAQ,YAAc,WAAc,MAAOrlC,KAMtB,gBAAXV,IAAkD,gBAApBA,GAAOM,WAChDN,EAAOU,OAASV,EAAOY,EAAIF,KAGxBV"}
diff --git a/bitbake/lib/toaster/toastergui/static/js/jquery-ui.min.js b/bitbake/lib/toaster/toastergui/static/js/jquery-ui.min.js
new file mode 100755
index 0000000..4e6396b
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/jquery-ui.min.js
@@ -0,0 +1,7 @@
+/*! jQuery UI - v1.11.4 - 2015-03-15
+* http://jqueryui.com
+* Includes: core.js, datepicker.js
+* Copyright 2015 jQuery Foundation and other contributors; Licensed MIT */
+
+(function(e){"function"==typeof define&&define.amd?define(["jquery"],e):e(jQuery)})(function(e){function t(t,s){var a,n,r,o=t.nodeName.toLowerCase();return"area"===o?(a=t.parentNode,n=a.name,t.href&&n&&"map"===a.nodeName.toLowerCase()?(r=e("img[usemap='#"+n+"']")[0],!!r&&i(r)):!1):(/^(input|select|textarea|button|object)$/.test(o)?!t.disabled:"a"===o?t.href||s:s)&&i(t)}function i(t){return e.expr.filters.visible(t)&&!e(t).parents().addBack().filter(function(){return"hidden"===e.css(this,"visibility")}).length}function s(e){for(var t,i;e.length&&e[0]!==document;){if(t=e.css("position"),("absolute"===t||"relative"===t||"fixed"===t)&&(i=parseInt(e.css("zIndex"),10),!isNaN(i)&&0!==i))return i;e=e.parent()}return 0}function a(){this._curInst=null,this._keyEvent=!1,this._disabledInputs=[],this._datepickerShowing=!1,this._inDialog=!1,this._mainDivId="ui-datepicker-div",this._inlineClass="ui-datepicker-inline",this._appendClass="ui-datepicker-append",this._triggerClass="ui-datepicker-trigger",this._dialogClass="ui-datepicker-dialog",this._disableClass="ui-datepicker-disabled",this._unselectableClass="ui-datepicker-unselectable",this._currentClass="ui-datepicker-current-day",this._dayOverClass="ui-datepicker-days-cell-over",this.regional=[],this.regional[""]={closeText:"Done",prevText:"Prev",nextText:"Next",currentText:"Today",monthNames:["January","February","March","April","May","June","July","August","September","October","November","December"],monthNamesShort:["Jan","Feb","Mar","Apr","May","Jun","Jul","Aug","Sep","Oct","Nov","Dec"],dayNames:["Sunday","Monday","Tuesday","Wednesday","Thursday","Friday","Saturday"],dayNamesShort:["Sun","Mon","Tue","Wed","Thu","Fri","Sat"],dayNamesMin:["Su","Mo","Tu","We","Th","Fr","Sa"],weekHeader:"Wk",dateFormat:"mm/dd/yy",firstDay:0,isRTL:!1,showMonthAfterYear:!1,yearSuffix:""},this._defaults={showOn:"focus",showAnim:"fadeIn",showOptions:{},defaultDate:null,appendText:"",buttonText:"...",buttonImage:"",buttonImageOnly:!1,hideIfNoPrevNext:!1,navigationAsDateFormat:!1,gotoCurrent:!1,changeMonth:!1,changeYear:!1,yearRange:"c-10:c+10",showOtherMonths:!1,selectOtherMonths:!1,showWeek:!1,calculateWeek:this.iso8601Week,shortYearCutoff:"+10",minDate:null,maxDate:null,duration:"fast",beforeShowDay:null,beforeShow:null,onSelect:null,onChangeMonthYear:null,onClose:null,numberOfMonths:1,showCurrentAtPos:0,stepMonths:1,stepBigMonths:12,altField:"",altFormat:"",constrainInput:!0,showButtonPanel:!1,autoSize:!1,disabled:!1},e.extend(this._defaults,this.regional[""]),this.regional.en=e.extend(!0,{},this.regional[""]),this.regional["en-US"]=e.extend(!0,{},this.regional.en),this.dpDiv=n(e("<div id='"+this._mainDivId+"' class='ui-datepicker ui-widget ui-widget-content ui-helper-clearfix ui-corner-all'></div>"))}function n(t){var i="button, .ui-datepicker-prev, .ui-datepicker-next, .ui-datepicker-calendar td a";return t.delegate(i,"mouseout",function(){e(this).removeClass("ui-state-hover"),-1!==this.className.indexOf("ui-datepicker-prev")&&e(this).removeClass("ui-datepicker-prev-hover"),-1!==this.className.indexOf("ui-datepicker-next")&&e(this).removeClass("ui-datepicker-next-hover")}).delegate(i,"mouseover",r)}function 
r(){e.datepicker._isDisabledDatepicker(h.inline?h.dpDiv.parent()[0]:h.input[0])||(e(this).parents(".ui-datepicker-calendar").find("a").removeClass("ui-state-hover"),e(this).addClass("ui-state-hover"),-1!==this.className.indexOf("ui-datepicker-prev")&&e(this).addClass("ui-datepicker-prev-hover"),-1!==this.className.indexOf("ui-datepicker-next")&&e(this).addClass("ui-datepicker-next-hover"))}function o(t,i){e.extend(t,i);for(var s in i)null==i[s]&&(t[s]=i[s]);return t}e.ui=e.ui||{},e.extend(e.ui,{version:"1.11.4",keyCode:{BACKSPACE:8,COMMA:188,DELETE:46,DOWN:40,END:35,ENTER:13,ESCAPE:27,HOME:36,LEFT:37,PAGE_DOWN:34,PAGE_UP:33,PERIOD:190,RIGHT:39,SPACE:32,TAB:9,UP:38}}),e.fn.extend({scrollParent:function(t){var i=this.css("position"),s="absolute"===i,a=t?/(auto|scroll|hidden)/:/(auto|scroll)/,n=this.parents().filter(function(){var t=e(this);return s&&"static"===t.css("position")?!1:a.test(t.css("overflow")+t.css("overflow-y")+t.css("overflow-x"))}).eq(0);return"fixed"!==i&&n.length?n:e(this[0].ownerDocument||document)},uniqueId:function(){var e=0;return function(){return this.each(function(){this.id||(this.id="ui-id-"+ ++e)})}}(),removeUniqueId:function(){return this.each(function(){/^ui-id-\d+$/.test(this.id)&&e(this).removeAttr("id")})}}),e.extend(e.expr[":"],{data:e.expr.createPseudo?e.expr.createPseudo(function(t){return function(i){return!!e.data(i,t)}}):function(t,i,s){return!!e.data(t,s[3])},focusable:function(i){return t(i,!isNaN(e.attr(i,"tabindex")))},tabbable:function(i){var s=e.attr(i,"tabindex"),a=isNaN(s);return(a||s>=0)&&t(i,!a)}}),e("<a>").outerWidth(1).jquery||e.each(["Width","Height"],function(t,i){function s(t,i,s,n){return e.each(a,function(){i-=parseFloat(e.css(t,"padding"+this))||0,s&&(i-=parseFloat(e.css(t,"border"+this+"Width"))||0),n&&(i-=parseFloat(e.css(t,"margin"+this))||0)}),i}var a="Width"===i?["Left","Right"]:["Top","Bottom"],n=i.toLowerCase(),r={innerWidth:e.fn.innerWidth,innerHeight:e.fn.innerHeight,outerWidth:e.fn.outerWidth,outerHeight:e.fn.outerHeight};e.fn["inner"+i]=function(t){return void 0===t?r["inner"+i].call(this):this.each(function(){e(this).css(n,s(this,t)+"px")})},e.fn["outer"+i]=function(t,a){return"number"!=typeof t?r["outer"+i].call(this,t):this.each(function(){e(this).css(n,s(this,t,!0,a)+"px")})}}),e.fn.addBack||(e.fn.addBack=function(e){return this.add(null==e?this.prevObject:this.prevObject.filter(e))}),e("<a>").data("a-b","a").removeData("a-b").data("a-b")&&(e.fn.removeData=function(t){return function(i){return arguments.length?t.call(this,e.camelCase(i)):t.call(this)}}(e.fn.removeData)),e.ui.ie=!!/msie [\w.]+/.exec(navigator.userAgent.toLowerCase()),e.fn.extend({focus:function(t){return function(i,s){return"number"==typeof i?this.each(function(){var t=this;setTimeout(function(){e(t).focus(),s&&s.call(t)},i)}):t.apply(this,arguments)}}(e.fn.focus),disableSelection:function(){var e="onselectstart"in document.createElement("div")?"selectstart":"mousedown";return function(){return this.bind(e+".ui-disableSelection",function(e){e.preventDefault()})}}(),enableSelection:function(){return this.unbind(".ui-disableSelection")},zIndex:function(t){if(void 0!==t)return this.css("zIndex",t);if(this.length)for(var i,s,a=e(this[0]);a.length&&a[0]!==document;){if(i=a.css("position"),("absolute"===i||"relative"===i||"fixed"===i)&&(s=parseInt(a.css("zIndex"),10),!isNaN(s)&&0!==s))return s;a=a.parent()}return 0}}),e.ui.plugin={add:function(t,i,s){var a,n=e.ui[t].prototype;for(a in 
s)n.plugins[a]=n.plugins[a]||[],n.plugins[a].push([i,s[a]])},call:function(e,t,i,s){var a,n=e.plugins[t];if(n&&(s||e.element[0].parentNode&&11!==e.element[0].parentNode.nodeType))for(a=0;n.length>a;a++)e.options[n[a][0]]&&n[a][1].apply(e.element,i)}},e.extend(e.ui,{datepicker:{version:"1.11.4"}});var h;e.extend(a.prototype,{markerClassName:"hasDatepicker",maxRows:4,_widgetDatepicker:function(){return this.dpDiv},setDefaults:function(e){return o(this._defaults,e||{}),this},_attachDatepicker:function(t,i){var s,a,n;s=t.nodeName.toLowerCase(),a="div"===s||"span"===s,t.id||(this.uuid+=1,t.id="dp"+this.uuid),n=this._newInst(e(t),a),n.settings=e.extend({},i||{}),"input"===s?this._connectDatepicker(t,n):a&&this._inlineDatepicker(t,n)},_newInst:function(t,i){var s=t[0].id.replace(/([^A-Za-z0-9_\-])/g,"\\\\$1");return{id:s,input:t,selectedDay:0,selectedMonth:0,selectedYear:0,drawMonth:0,drawYear:0,inline:i,dpDiv:i?n(e("<div class='"+this._inlineClass+" ui-datepicker ui-widget ui-widget-content ui-helper-clearfix ui-corner-all'></div>")):this.dpDiv}},_connectDatepicker:function(t,i){var s=e(t);i.append=e([]),i.trigger=e([]),s.hasClass(this.markerClassName)||(this._attachments(s,i),s.addClass(this.markerClassName).keydown(this._doKeyDown).keypress(this._doKeyPress).keyup(this._doKeyUp),this._autoSize(i),e.data(t,"datepicker",i),i.settings.disabled&&this._disableDatepicker(t))},_attachments:function(t,i){var s,a,n,r=this._get(i,"appendText"),o=this._get(i,"isRTL");i.append&&i.append.remove(),r&&(i.append=e("<span class='"+this._appendClass+"'>"+r+"</span>"),t[o?"before":"after"](i.append)),t.unbind("focus",this._showDatepicker),i.trigger&&i.trigger.remove(),s=this._get(i,"showOn"),("focus"===s||"both"===s)&&t.focus(this._showDatepicker),("button"===s||"both"===s)&&(a=this._get(i,"buttonText"),n=this._get(i,"buttonImage"),i.trigger=e(this._get(i,"buttonImageOnly")?e("<img/>").addClass(this._triggerClass).attr({src:n,alt:a,title:a}):e("<button type='button'></button>").addClass(this._triggerClass).html(n?e("<img/>").attr({src:n,alt:a,title:a}):a)),t[o?"before":"after"](i.trigger),i.trigger.click(function(){return e.datepicker._datepickerShowing&&e.datepicker._lastInput===t[0]?e.datepicker._hideDatepicker():e.datepicker._datepickerShowing&&e.datepicker._lastInput!==t[0]?(e.datepicker._hideDatepicker(),e.datepicker._showDatepicker(t[0])):e.datepicker._showDatepicker(t[0]),!1}))},_autoSize:function(e){if(this._get(e,"autoSize")&&!e.inline){var t,i,s,a,n=new Date(2009,11,20),r=this._get(e,"dateFormat");r.match(/[DM]/)&&(t=function(e){for(i=0,s=0,a=0;e.length>a;a++)e[a].length>i&&(i=e[a].length,s=a);return s},n.setMonth(t(this._get(e,r.match(/MM/)?"monthNames":"monthNamesShort"))),n.setDate(t(this._get(e,r.match(/DD/)?"dayNames":"dayNamesShort"))+20-n.getDay())),e.input.attr("size",this._formatDate(e,n).length)}},_inlineDatepicker:function(t,i){var s=e(t);s.hasClass(this.markerClassName)||(s.addClass(this.markerClassName).append(i.dpDiv),e.data(t,"datepicker",i),this._setDate(i,this._getDefaultDate(i),!0),this._updateDatepicker(i),this._updateAlternate(i),i.settings.disabled&&this._disableDatepicker(t),i.dpDiv.css("display","block"))},_dialogDatepicker:function(t,i,s,a,n){var r,h,l,u,d,c=this._dialogInst;return c||(this.uuid+=1,r="dp"+this.uuid,this._dialogInput=e("<input type='text' id='"+r+"' style='position: absolute; top: -100px; width: 
0px;'/>"),this._dialogInput.keydown(this._doKeyDown),e("body").append(this._dialogInput),c=this._dialogInst=this._newInst(this._dialogInput,!1),c.settings={},e.data(this._dialogInput[0],"datepicker",c)),o(c.settings,a||{}),i=i&&i.constructor===Date?this._formatDate(c,i):i,this._dialogInput.val(i),this._pos=n?n.length?n:[n.pageX,n.pageY]:null,this._pos||(h=document.documentElement.clientWidth,l=document.documentElement.clientHeight,u=document.documentElement.scrollLeft||document.body.scrollLeft,d=document.documentElement.scrollTop||document.body.scrollTop,this._pos=[h/2-100+u,l/2-150+d]),this._dialogInput.css("left",this._pos[0]+20+"px").css("top",this._pos[1]+"px"),c.settings.onSelect=s,this._inDialog=!0,this.dpDiv.addClass(this._dialogClass),this._showDatepicker(this._dialogInput[0]),e.blockUI&&e.blockUI(this.dpDiv),e.data(this._dialogInput[0],"datepicker",c),this},_destroyDatepicker:function(t){var i,s=e(t),a=e.data(t,"datepicker");s.hasClass(this.markerClassName)&&(i=t.nodeName.toLowerCase(),e.removeData(t,"datepicker"),"input"===i?(a.append.remove(),a.trigger.remove(),s.removeClass(this.markerClassName).unbind("focus",this._showDatepicker).unbind("keydown",this._doKeyDown).unbind("keypress",this._doKeyPress).unbind("keyup",this._doKeyUp)):("div"===i||"span"===i)&&s.removeClass(this.markerClassName).empty(),h===a&&(h=null))},_enableDatepicker:function(t){var i,s,a=e(t),n=e.data(t,"datepicker");a.hasClass(this.markerClassName)&&(i=t.nodeName.toLowerCase(),"input"===i?(t.disabled=!1,n.trigger.filter("button").each(function(){this.disabled=!1}).end().filter("img").css({opacity:"1.0",cursor:""})):("div"===i||"span"===i)&&(s=a.children("."+this._inlineClass),s.children().removeClass("ui-state-disabled"),s.find("select.ui-datepicker-month, select.ui-datepicker-year").prop("disabled",!1)),this._disabledInputs=e.map(this._disabledInputs,function(e){return e===t?null:e}))},_disableDatepicker:function(t){var i,s,a=e(t),n=e.data(t,"datepicker");a.hasClass(this.markerClassName)&&(i=t.nodeName.toLowerCase(),"input"===i?(t.disabled=!0,n.trigger.filter("button").each(function(){this.disabled=!0}).end().filter("img").css({opacity:"0.5",cursor:"default"})):("div"===i||"span"===i)&&(s=a.children("."+this._inlineClass),s.children().addClass("ui-state-disabled"),s.find("select.ui-datepicker-month, select.ui-datepicker-year").prop("disabled",!0)),this._disabledInputs=e.map(this._disabledInputs,function(e){return e===t?null:e}),this._disabledInputs[this._disabledInputs.length]=t)},_isDisabledDatepicker:function(e){if(!e)return!1;for(var t=0;this._disabledInputs.length>t;t++)if(this._disabledInputs[t]===e)return!0;return!1},_getInst:function(t){try{return e.data(t,"datepicker")}catch(i){throw"Missing instance data for this datepicker"}},_optionDatepicker:function(t,i,s){var a,n,r,h,l=this._getInst(t);return 2===arguments.length&&"string"==typeof i?"defaults"===i?e.extend({},e.datepicker._defaults):l?"all"===i?e.extend({},l.settings):this._get(l,i):null:(a=i||{},"string"==typeof i&&(a={},a[i]=s),l&&(this._curInst===l&&this._hideDatepicker(),n=this._getDateDatepicker(t,!0),r=this._getMinMaxDate(l,"min"),h=this._getMinMaxDate(l,"max"),o(l.settings,a),null!==r&&void 0!==a.dateFormat&&void 0===a.minDate&&(l.settings.minDate=this._formatDate(l,r)),null!==h&&void 0!==a.dateFormat&&void 0===a.maxDate&&(l.settings.maxDate=this._formatDate(l,h)),"disabled"in 
a&&(a.disabled?this._disableDatepicker(t):this._enableDatepicker(t)),this._attachments(e(t),l),this._autoSize(l),this._setDate(l,n),this._updateAlternate(l),this._updateDatepicker(l)),void 0)},_changeDatepicker:function(e,t,i){this._optionDatepicker(e,t,i)},_refreshDatepicker:function(e){var t=this._getInst(e);t&&this._updateDatepicker(t)},_setDateDatepicker:function(e,t){var i=this._getInst(e);i&&(this._setDate(i,t),this._updateDatepicker(i),this._updateAlternate(i))},_getDateDatepicker:function(e,t){var i=this._getInst(e);return i&&!i.inline&&this._setDateFromField(i,t),i?this._getDate(i):null},_doKeyDown:function(t){var i,s,a,n=e.datepicker._getInst(t.target),r=!0,o=n.dpDiv.is(".ui-datepicker-rtl");if(n._keyEvent=!0,e.datepicker._datepickerShowing)switch(t.keyCode){case 9:e.datepicker._hideDatepicker(),r=!1;break;case 13:return a=e("td."+e.datepicker._dayOverClass+":not(."+e.datepicker._currentClass+")",n.dpDiv),a[0]&&e.datepicker._selectDay(t.target,n.selectedMonth,n.selectedYear,a[0]),i=e.datepicker._get(n,"onSelect"),i?(s=e.datepicker._formatDate(n),i.apply(n.input?n.input[0]:null,[s,n])):e.datepicker._hideDatepicker(),!1;case 27:e.datepicker._hideDatepicker();break;case 33:e.datepicker._adjustDate(t.target,t.ctrlKey?-e.datepicker._get(n,"stepBigMonths"):-e.datepicker._get(n,"stepMonths"),"M");break;case 34:e.datepicker._adjustDate(t.target,t.ctrlKey?+e.datepicker._get(n,"stepBigMonths"):+e.datepicker._get(n,"stepMonths"),"M");break;case 35:(t.ctrlKey||t.metaKey)&&e.datepicker._clearDate(t.target),r=t.ctrlKey||t.metaKey;break;case 36:(t.ctrlKey||t.metaKey)&&e.datepicker._gotoToday(t.target),r=t.ctrlKey||t.metaKey;break;case 37:(t.ctrlKey||t.metaKey)&&e.datepicker._adjustDate(t.target,o?1:-1,"D"),r=t.ctrlKey||t.metaKey,t.originalEvent.altKey&&e.datepicker._adjustDate(t.target,t.ctrlKey?-e.datepicker._get(n,"stepBigMonths"):-e.datepicker._get(n,"stepMonths"),"M");break;case 38:(t.ctrlKey||t.metaKey)&&e.datepicker._adjustDate(t.target,-7,"D"),r=t.ctrlKey||t.metaKey;break;case 39:(t.ctrlKey||t.metaKey)&&e.datepicker._adjustDate(t.target,o?-1:1,"D"),r=t.ctrlKey||t.metaKey,t.originalEvent.altKey&&e.datepicker._adjustDate(t.target,t.ctrlKey?+e.datepicker._get(n,"stepBigMonths"):+e.datepicker._get(n,"stepMonths"),"M");break;case 40:(t.ctrlKey||t.metaKey)&&e.datepicker._adjustDate(t.target,7,"D"),r=t.ctrlKey||t.metaKey;break;default:r=!1}else 36===t.keyCode&&t.ctrlKey?e.datepicker._showDatepicker(this):r=!1;r&&(t.preventDefault(),t.stopPropagation())},_doKeyPress:function(t){var i,s,a=e.datepicker._getInst(t.target);return e.datepicker._get(a,"constrainInput")?(i=e.datepicker._possibleChars(e.datepicker._get(a,"dateFormat")),s=String.fromCharCode(null==t.charCode?t.keyCode:t.charCode),t.ctrlKey||t.metaKey||" ">s||!i||i.indexOf(s)>-1):void 0},_doKeyUp:function(t){var i,s=e.datepicker._getInst(t.target);if(s.input.val()!==s.lastVal)try{i=e.datepicker.parseDate(e.datepicker._get(s,"dateFormat"),s.input?s.input.val():null,e.datepicker._getFormatConfig(s)),i&&(e.datepicker._setDateFromField(s),e.datepicker._updateAlternate(s),e.datepicker._updateDatepicker(s))}catch(a){}return!0},_showDatepicker:function(t){if(t=t.target||t,"input"!==t.nodeName.toLowerCase()&&(t=e("input",t.parentNode)[0]),!e.datepicker._isDisabledDatepicker(t)&&e.datepicker._lastInput!==t){var 
i,a,n,r,h,l,u;i=e.datepicker._getInst(t),e.datepicker._curInst&&e.datepicker._curInst!==i&&(e.datepicker._curInst.dpDiv.stop(!0,!0),i&&e.datepicker._datepickerShowing&&e.datepicker._hideDatepicker(e.datepicker._curInst.input[0])),a=e.datepicker._get(i,"beforeShow"),n=a?a.apply(t,[t,i]):{},n!==!1&&(o(i.settings,n),i.lastVal=null,e.datepicker._lastInput=t,e.datepicker._setDateFromField(i),e.datepicker._inDialog&&(t.value=""),e.datepicker._pos||(e.datepicker._pos=e.datepicker._findPos(t),e.datepicker._pos[1]+=t.offsetHeight),r=!1,e(t).parents().each(function(){return r|="fixed"===e(this).css("position"),!r}),h={left:e.datepicker._pos[0],top:e.datepicker._pos[1]},e.datepicker._pos=null,i.dpDiv.empty(),i.dpDiv.css({position:"absolute",display:"block",top:"-1000px"}),e.datepicker._updateDatepicker(i),h=e.datepicker._checkOffset(i,h,r),i.dpDiv.css({position:e.datepicker._inDialog&&e.blockUI?"static":r?"fixed":"absolute",display:"none",left:h.left+"px",top:h.top+"px"}),i.inline||(l=e.datepicker._get(i,"showAnim"),u=e.datepicker._get(i,"duration"),i.dpDiv.css("z-index",s(e(t))+1),e.datepicker._datepickerShowing=!0,e.effects&&e.effects.effect[l]?i.dpDiv.show(l,e.datepicker._get(i,"showOptions"),u):i.dpDiv[l||"show"](l?u:null),e.datepicker._shouldFocusInput(i)&&i.input.focus(),e.datepicker._curInst=i))}},_updateDatepicker:function(t){this.maxRows=4,h=t,t.dpDiv.empty().append(this._generateHTML(t)),this._attachHandlers(t);var i,s=this._getNumberOfMonths(t),a=s[1],n=17,o=t.dpDiv.find("."+this._dayOverClass+" a");o.length>0&&r.apply(o.get(0)),t.dpDiv.removeClass("ui-datepicker-multi-2 ui-datepicker-multi-3 ui-datepicker-multi-4").width(""),a>1&&t.dpDiv.addClass("ui-datepicker-multi-"+a).css("width",n*a+"em"),t.dpDiv[(1!==s[0]||1!==s[1]?"add":"remove")+"Class"]("ui-datepicker-multi"),t.dpDiv[(this._get(t,"isRTL")?"add":"remove")+"Class"]("ui-datepicker-rtl"),t===e.datepicker._curInst&&e.datepicker._datepickerShowing&&e.datepicker._shouldFocusInput(t)&&t.input.focus(),t.yearshtml&&(i=t.yearshtml,setTimeout(function(){i===t.yearshtml&&t.yearshtml&&t.dpDiv.find("select.ui-datepicker-year:first").replaceWith(t.yearshtml),i=t.yearshtml=null},0))},_shouldFocusInput:function(e){return e.input&&e.input.is(":visible")&&!e.input.is(":disabled")&&!e.input.is(":focus")},_checkOffset:function(t,i,s){var a=t.dpDiv.outerWidth(),n=t.dpDiv.outerHeight(),r=t.input?t.input.outerWidth():0,o=t.input?t.input.outerHeight():0,h=document.documentElement.clientWidth+(s?0:e(document).scrollLeft()),l=document.documentElement.clientHeight+(s?0:e(document).scrollTop());return i.left-=this._get(t,"isRTL")?a-r:0,i.left-=s&&i.left===t.input.offset().left?e(document).scrollLeft():0,i.top-=s&&i.top===t.input.offset().top+o?e(document).scrollTop():0,i.left-=Math.min(i.left,i.left+a>h&&h>a?Math.abs(i.left+a-h):0),i.top-=Math.min(i.top,i.top+n>l&&l>n?Math.abs(n+o):0),i},_findPos:function(t){for(var i,s=this._getInst(t),a=this._get(s,"isRTL");t&&("hidden"===t.type||1!==t.nodeType||e.expr.filters.hidden(t));)t=t[a?"previousSibling":"nextSibling"];return i=e(t).offset(),[i.left,i.top]},_hideDatepicker:function(t){var 
i,s,a,n,r=this._curInst;!r||t&&r!==e.data(t,"datepicker")||this._datepickerShowing&&(i=this._get(r,"showAnim"),s=this._get(r,"duration"),a=function(){e.datepicker._tidyDialog(r)},e.effects&&(e.effects.effect[i]||e.effects[i])?r.dpDiv.hide(i,e.datepicker._get(r,"showOptions"),s,a):r.dpDiv["slideDown"===i?"slideUp":"fadeIn"===i?"fadeOut":"hide"](i?s:null,a),i||a(),this._datepickerShowing=!1,n=this._get(r,"onClose"),n&&n.apply(r.input?r.input[0]:null,[r.input?r.input.val():"",r]),this._lastInput=null,this._inDialog&&(this._dialogInput.css({position:"absolute",left:"0",top:"-100px"}),e.blockUI&&(e.unblockUI(),e("body").append(this.dpDiv))),this._inDialog=!1)},_tidyDialog:function(e){e.dpDiv.removeClass(this._dialogClass).unbind(".ui-datepicker-calendar")},_checkExternalClick:function(t){if(e.datepicker._curInst){var i=e(t.target),s=e.datepicker._getInst(i[0]);(i[0].id!==e.datepicker._mainDivId&&0===i.parents("#"+e.datepicker._mainDivId).length&&!i.hasClass(e.datepicker.markerClassName)&&!i.closest("."+e.datepicker._triggerClass).length&&e.datepicker._datepickerShowing&&(!e.datepicker._inDialog||!e.blockUI)||i.hasClass(e.datepicker.markerClassName)&&e.datepicker._curInst!==s)&&e.datepicker._hideDatepicker()}},_adjustDate:function(t,i,s){var a=e(t),n=this._getInst(a[0]);this._isDisabledDatepicker(a[0])||(this._adjustInstDate(n,i+("M"===s?this._get(n,"showCurrentAtPos"):0),s),this._updateDatepicker(n))},_gotoToday:function(t){var i,s=e(t),a=this._getInst(s[0]);this._get(a,"gotoCurrent")&&a.currentDay?(a.selectedDay=a.currentDay,a.drawMonth=a.selectedMonth=a.currentMonth,a.drawYear=a.selectedYear=a.currentYear):(i=new Date,a.selectedDay=i.getDate(),a.drawMonth=a.selectedMonth=i.getMonth(),a.drawYear=a.selectedYear=i.getFullYear()),this._notifyChange(a),this._adjustDate(s)},_selectMonthYear:function(t,i,s){var a=e(t),n=this._getInst(a[0]);n["selected"+("M"===s?"Month":"Year")]=n["draw"+("M"===s?"Month":"Year")]=parseInt(i.options[i.selectedIndex].value,10),this._notifyChange(n),this._adjustDate(a)},_selectDay:function(t,i,s,a){var n,r=e(t);e(a).hasClass(this._unselectableClass)||this._isDisabledDatepicker(r[0])||(n=this._getInst(r[0]),n.selectedDay=n.currentDay=e("a",a).html(),n.selectedMonth=n.currentMonth=i,n.selectedYear=n.currentYear=s,this._selectDate(t,this._formatDate(n,n.currentDay,n.currentMonth,n.currentYear)))},_clearDate:function(t){var i=e(t);this._selectDate(i,"")},_selectDate:function(t,i){var s,a=e(t),n=this._getInst(a[0]);i=null!=i?i:this._formatDate(n),n.input&&n.input.val(i),this._updateAlternate(n),s=this._get(n,"onSelect"),s?s.apply(n.input?n.input[0]:null,[i,n]):n.input&&n.input.trigger("change"),n.inline?this._updateDatepicker(n):(this._hideDatepicker(),this._lastInput=n.input[0],"object"!=typeof n.input[0]&&n.input.focus(),this._lastInput=null)},_updateAlternate:function(t){var i,s,a,n=this._get(t,"altField");n&&(i=this._get(t,"altFormat")||this._get(t,"dateFormat"),s=this._getDate(t),a=this.formatDate(i,s,this._getFormatConfig(t)),e(n).each(function(){e(this).val(a)}))},noWeekends:function(e){var t=e.getDay();return[t>0&&6>t,""]},iso8601Week:function(e){var t,i=new Date(e.getTime());return i.setDate(i.getDate()+4-(i.getDay()||7)),t=i.getTime(),i.setMonth(0),i.setDate(1),Math.floor(Math.round((t-i)/864e5)/7)+1},parseDate:function(t,i,s){if(null==t||null==i)throw"Invalid arguments";if(i="object"==typeof i?""+i:i+"",""===i)return null;var a,n,r,o,h=0,l=(s?s.shortYearCutoff:null)||this._defaults.shortYearCutoff,u="string"!=typeof l?l:(new 
Date).getFullYear()%100+parseInt(l,10),d=(s?s.dayNamesShort:null)||this._defaults.dayNamesShort,c=(s?s.dayNames:null)||this._defaults.dayNames,p=(s?s.monthNamesShort:null)||this._defaults.monthNamesShort,f=(s?s.monthNames:null)||this._defaults.monthNames,m=-1,g=-1,v=-1,y=-1,b=!1,_=function(e){var i=t.length>a+1&&t.charAt(a+1)===e;return i&&a++,i},x=function(e){var t=_(e),s="@"===e?14:"!"===e?20:"y"===e&&t?4:"o"===e?3:2,a="y"===e?s:1,n=RegExp("^\\d{"+a+","+s+"}"),r=i.substring(h).match(n);if(!r)throw"Missing number at position "+h;return h+=r[0].length,parseInt(r[0],10)},k=function(t,s,a){var n=-1,r=e.map(_(t)?a:s,function(e,t){return[[t,e]]}).sort(function(e,t){return-(e[1].length-t[1].length)});if(e.each(r,function(e,t){var s=t[1];return i.substr(h,s.length).toLowerCase()===s.toLowerCase()?(n=t[0],h+=s.length,!1):void 0}),-1!==n)return n+1;throw"Unknown name at position "+h},w=function(){if(i.charAt(h)!==t.charAt(a))throw"Unexpected literal at position "+h;h++};for(a=0;t.length>a;a++)if(b)"'"!==t.charAt(a)||_("'")?w():b=!1;else switch(t.charAt(a)){case"d":v=x("d");break;case"D":k("D",d,c);break;case"o":y=x("o");break;case"m":g=x("m");break;case"M":g=k("M",p,f);break;case"y":m=x("y");break;case"@":o=new Date(x("@")),m=o.getFullYear(),g=o.getMonth()+1,v=o.getDate();break;case"!":o=new Date((x("!")-this._ticksTo1970)/1e4),m=o.getFullYear(),g=o.getMonth()+1,v=o.getDate();break;case"'":_("'")?w():b=!0;break;default:w()}if(i.length>h&&(r=i.substr(h),!/^\s+/.test(r)))throw"Extra/unparsed characters found in date: "+r;if(-1===m?m=(new Date).getFullYear():100>m&&(m+=(new Date).getFullYear()-(new Date).getFullYear()%100+(u>=m?0:-100)),y>-1)for(g=1,v=y;;){if(n=this._getDaysInMonth(m,g-1),n>=v)break;g++,v-=n}if(o=this._daylightSavingAdjust(new Date(m,g-1,v)),o.getFullYear()!==m||o.getMonth()+1!==g||o.getDate()!==v)throw"Invalid date";return o},ATOM:"yy-mm-dd",COOKIE:"D, dd M yy",ISO_8601:"yy-mm-dd",RFC_822:"D, d M y",RFC_850:"DD, dd-M-y",RFC_1036:"D, d M y",RFC_1123:"D, d M yy",RFC_2822:"D, d M yy",RSS:"D, d M y",TICKS:"!",TIMESTAMP:"@",W3C:"yy-mm-dd",_ticksTo1970:1e7*60*60*24*(718685+Math.floor(492.5)-Math.floor(19.7)+Math.floor(4.925)),formatDate:function(e,t,i){if(!t)return"";var s,a=(i?i.dayNamesShort:null)||this._defaults.dayNamesShort,n=(i?i.dayNames:null)||this._defaults.dayNames,r=(i?i.monthNamesShort:null)||this._defaults.monthNamesShort,o=(i?i.monthNames:null)||this._defaults.monthNames,h=function(t){var i=e.length>s+1&&e.charAt(s+1)===t;return i&&s++,i},l=function(e,t,i){var s=""+t;if(h(e))for(;i>s.length;)s="0"+s;return s},u=function(e,t,i,s){return h(e)?s[t]:i[t]},d="",c=!1;if(t)for(s=0;e.length>s;s++)if(c)"'"!==e.charAt(s)||h("'")?d+=e.charAt(s):c=!1;else switch(e.charAt(s)){case"d":d+=l("d",t.getDate(),2);break;case"D":d+=u("D",t.getDay(),a,n);break;case"o":d+=l("o",Math.round((new Date(t.getFullYear(),t.getMonth(),t.getDate()).getTime()-new Date(t.getFullYear(),0,0).getTime())/864e5),3);break;case"m":d+=l("m",t.getMonth()+1,2);break;case"M":d+=u("M",t.getMonth(),r,o);break;case"y":d+=h("y")?t.getFullYear():(10>t.getYear()%100?"0":"")+t.getYear()%100;break;case"@":d+=t.getTime();break;case"!":d+=1e4*t.getTime()+this._ticksTo1970;break;case"'":h("'")?d+="'":c=!0;break;default:d+=e.charAt(s)}return d},_possibleChars:function(e){var t,i="",s=!1,a=function(i){var s=e.length>t+1&&e.charAt(t+1)===i;return s&&t++,s};for(t=0;e.length>t;t++)if(s)"'"!==e.charAt(t)||a("'")?i+=e.charAt(t):s=!1;else switch(e.charAt(t)){case"d":case"m":case"y":case"@":i+="0123456789";break;case"D":case"M":return 
null;case"'":a("'")?i+="'":s=!0;break;default:i+=e.charAt(t)}return i},_get:function(e,t){return void 0!==e.settings[t]?e.settings[t]:this._defaults[t]},_setDateFromField:function(e,t){if(e.input.val()!==e.lastVal){var i=this._get(e,"dateFormat"),s=e.lastVal=e.input?e.input.val():null,a=this._getDefaultDate(e),n=a,r=this._getFormatConfig(e);try{n=this.parseDate(i,s,r)||a}catch(o){s=t?"":s}e.selectedDay=n.getDate(),e.drawMonth=e.selectedMonth=n.getMonth(),e.drawYear=e.selectedYear=n.getFullYear(),e.currentDay=s?n.getDate():0,e.currentMonth=s?n.getMonth():0,e.currentYear=s?n.getFullYear():0,this._adjustInstDate(e)}},_getDefaultDate:function(e){return this._restrictMinMax(e,this._determineDate(e,this._get(e,"defaultDate"),new Date))},_determineDate:function(t,i,s){var a=function(e){var t=new Date;return t.setDate(t.getDate()+e),t},n=function(i){try{return e.datepicker.parseDate(e.datepicker._get(t,"dateFormat"),i,e.datepicker._getFormatConfig(t))}catch(s){}for(var a=(i.toLowerCase().match(/^c/)?e.datepicker._getDate(t):null)||new Date,n=a.getFullYear(),r=a.getMonth(),o=a.getDate(),h=/([+\-]?[0-9]+)\s*(d|D|w|W|m|M|y|Y)?/g,l=h.exec(i);l;){switch(l[2]||"d"){case"d":case"D":o+=parseInt(l[1],10);break;case"w":case"W":o+=7*parseInt(l[1],10);break;case"m":case"M":r+=parseInt(l[1],10),o=Math.min(o,e.datepicker._getDaysInMonth(n,r));break;case"y":case"Y":n+=parseInt(l[1],10),o=Math.min(o,e.datepicker._getDaysInMonth(n,r))}l=h.exec(i)}return new Date(n,r,o)},r=null==i||""===i?s:"string"==typeof i?n(i):"number"==typeof i?isNaN(i)?s:a(i):new Date(i.getTime());return r=r&&"Invalid Date"==""+r?s:r,r&&(r.setHours(0),r.setMinutes(0),r.setSeconds(0),r.setMilliseconds(0)),this._daylightSavingAdjust(r)},_daylightSavingAdjust:function(e){return e?(e.setHours(e.getHours()>12?e.getHours()+2:0),e):null},_setDate:function(e,t,i){var s=!t,a=e.selectedMonth,n=e.selectedYear,r=this._restrictMinMax(e,this._determineDate(e,t,new Date));e.selectedDay=e.currentDay=r.getDate(),e.drawMonth=e.selectedMonth=e.currentMonth=r.getMonth(),e.drawYear=e.selectedYear=e.currentYear=r.getFullYear(),a===e.selectedMonth&&n===e.selectedYear||i||this._notifyChange(e),this._adjustInstDate(e),e.input&&e.input.val(s?"":this._formatDate(e))},_getDate:function(e){var t=!e.currentYear||e.input&&""===e.input.val()?null:this._daylightSavingAdjust(new Date(e.currentYear,e.currentMonth,e.currentDay));return t},_attachHandlers:function(t){var i=this._get(t,"stepMonths"),s="#"+t.id.replace(/\\\\/g,"\\");t.dpDiv.find("[data-handler]").map(function(){var t={prev:function(){e.datepicker._adjustDate(s,-i,"M")},next:function(){e.datepicker._adjustDate(s,+i,"M")},hide:function(){e.datepicker._hideDatepicker()},today:function(){e.datepicker._gotoToday(s)},selectDay:function(){return e.datepicker._selectDay(s,+this.getAttribute("data-month"),+this.getAttribute("data-year"),this),!1},selectMonth:function(){return e.datepicker._selectMonthYear(s,this,"M"),!1},selectYear:function(){return e.datepicker._selectMonthYear(s,this,"Y"),!1}};e(this).bind(this.getAttribute("data-event"),t[this.getAttribute("data-handler")])})},_generateHTML:function(e){var t,i,s,a,n,r,o,h,l,u,d,c,p,f,m,g,v,y,b,_,x,k,w,T,S,D,N,M,C,A,P,I,H,F,j,z,E,O,L,W=new Date,R=this._daylightSavingAdjust(new 
Date(W.getFullYear(),W.getMonth(),W.getDate())),Y=this._get(e,"isRTL"),J=this._get(e,"showButtonPanel"),B=this._get(e,"hideIfNoPrevNext"),K=this._get(e,"navigationAsDateFormat"),V=this._getNumberOfMonths(e),q=this._get(e,"showCurrentAtPos"),Q=this._get(e,"stepMonths"),U=1!==V[0]||1!==V[1],G=this._daylightSavingAdjust(e.currentDay?new Date(e.currentYear,e.currentMonth,e.currentDay):new Date(9999,9,9)),X=this._getMinMaxDate(e,"min"),$=this._getMinMaxDate(e,"max"),Z=e.drawMonth-q,et=e.drawYear;if(0>Z&&(Z+=12,et--),$)for(t=this._daylightSavingAdjust(new Date($.getFullYear(),$.getMonth()-V[0]*V[1]+1,$.getDate())),t=X&&X>t?X:t;this._daylightSavingAdjust(new Date(et,Z,1))>t;)Z--,0>Z&&(Z=11,et--);for(e.drawMonth=Z,e.drawYear=et,i=this._get(e,"prevText"),i=K?this.formatDate(i,this._daylightSavingAdjust(new Date(et,Z-Q,1)),this._getFormatConfig(e)):i,s=this._canAdjustMonth(e,-1,et,Z)?"<a class='ui-datepicker-prev ui-corner-all' data-handler='prev' data-event='click' title='"+i+"'><span class='ui-icon ui-icon-circle-triangle-"+(Y?"e":"w")+"'>"+i+"</span></a>":B?"":"<a class='ui-datepicker-prev ui-corner-all ui-state-disabled' title='"+i+"'><span class='ui-icon ui-icon-circle-triangle-"+(Y?"e":"w")+"'>"+i+"</span></a>",a=this._get(e,"nextText"),a=K?this.formatDate(a,this._daylightSavingAdjust(new Date(et,Z+Q,1)),this._getFormatConfig(e)):a,n=this._canAdjustMonth(e,1,et,Z)?"<a class='ui-datepicker-next ui-corner-all' data-handler='next' data-event='click' title='"+a+"'><span class='ui-icon ui-icon-circle-triangle-"+(Y?"w":"e")+"'>"+a+"</span></a>":B?"":"<a class='ui-datepicker-next ui-corner-all ui-state-disabled' title='"+a+"'><span class='ui-icon ui-icon-circle-triangle-"+(Y?"w":"e")+"'>"+a+"</span></a>",r=this._get(e,"currentText"),o=this._get(e,"gotoCurrent")&&e.currentDay?G:R,r=K?this.formatDate(r,o,this._getFormatConfig(e)):r,h=e.inline?"":"<button type='button' class='ui-datepicker-close ui-state-default ui-priority-primary ui-corner-all' data-handler='hide' data-event='click'>"+this._get(e,"closeText")+"</button>",l=J?"<div class='ui-datepicker-buttonpane ui-widget-content'>"+(Y?h:"")+(this._isInRange(e,o)?"<button type='button' class='ui-datepicker-current ui-state-default ui-priority-secondary ui-corner-all' data-handler='today' data-event='click'>"+r+"</button>":"")+(Y?"":h)+"</div>":"",u=parseInt(this._get(e,"firstDay"),10),u=isNaN(u)?0:u,d=this._get(e,"showWeek"),c=this._get(e,"dayNames"),p=this._get(e,"dayNamesMin"),f=this._get(e,"monthNames"),m=this._get(e,"monthNamesShort"),g=this._get(e,"beforeShowDay"),v=this._get(e,"showOtherMonths"),y=this._get(e,"selectOtherMonths"),b=this._getDefaultDate(e),_="",k=0;V[0]>k;k++){for(w="",this.maxRows=4,T=0;V[1]>T;T++){if(S=this._daylightSavingAdjust(new Date(et,Z,e.selectedDay)),D=" ui-corner-all",N="",U){if(N+="<div class='ui-datepicker-group",V[1]>1)switch(T){case 0:N+=" ui-datepicker-group-first",D=" ui-corner-"+(Y?"right":"left");
+break;case V[1]-1:N+=" ui-datepicker-group-last",D=" ui-corner-"+(Y?"left":"right");break;default:N+=" ui-datepicker-group-middle",D=""}N+="'>"}for(N+="<div class='ui-datepicker-header ui-widget-header ui-helper-clearfix"+D+"'>"+(/all|left/.test(D)&&0===k?Y?n:s:"")+(/all|right/.test(D)&&0===k?Y?s:n:"")+this._generateMonthYearHeader(e,Z,et,X,$,k>0||T>0,f,m)+"</div><table class='ui-datepicker-calendar'><thead>"+"<tr>",M=d?"<th class='ui-datepicker-week-col'>"+this._get(e,"weekHeader")+"</th>":"",x=0;7>x;x++)C=(x+u)%7,M+="<th scope='col'"+((x+u+6)%7>=5?" class='ui-datepicker-week-end'":"")+">"+"<span title='"+c[C]+"'>"+p[C]+"</span></th>";for(N+=M+"</tr></thead><tbody>",A=this._getDaysInMonth(et,Z),et===e.selectedYear&&Z===e.selectedMonth&&(e.selectedDay=Math.min(e.selectedDay,A)),P=(this._getFirstDayOfMonth(et,Z)-u+7)%7,I=Math.ceil((P+A)/7),H=U?this.maxRows>I?this.maxRows:I:I,this.maxRows=H,F=this._daylightSavingAdjust(new Date(et,Z,1-P)),j=0;H>j;j++){for(N+="<tr>",z=d?"<td class='ui-datepicker-week-col'>"+this._get(e,"calculateWeek")(F)+"</td>":"",x=0;7>x;x++)E=g?g.apply(e.input?e.input[0]:null,[F]):[!0,""],O=F.getMonth()!==Z,L=O&&!y||!E[0]||X&&X>F||$&&F>$,z+="<td class='"+((x+u+6)%7>=5?" ui-datepicker-week-end":"")+(O?" ui-datepicker-other-month":"")+(F.getTime()===S.getTime()&&Z===e.selectedMonth&&e._keyEvent||b.getTime()===F.getTime()&&b.getTime()===S.getTime()?" "+this._dayOverClass:"")+(L?" "+this._unselectableClass+" ui-state-disabled":"")+(O&&!v?"":" "+E[1]+(F.getTime()===G.getTime()?" "+this._currentClass:"")+(F.getTime()===R.getTime()?" ui-datepicker-today":""))+"'"+(O&&!v||!E[2]?"":" title='"+E[2].replace(/'/g,"'")+"'")+(L?"":" data-handler='selectDay' data-event='click' data-month='"+F.getMonth()+"' data-year='"+F.getFullYear()+"'")+">"+(O&&!v?" ":L?"<span class='ui-state-default'>"+F.getDate()+"</span>":"<a class='ui-state-default"+(F.getTime()===R.getTime()?" ui-state-highlight":"")+(F.getTime()===G.getTime()?" ui-state-active":"")+(O?" ui-priority-secondary":"")+"' href='#'>"+F.getDate()+"</a>")+"</td>",F.setDate(F.getDate()+1),F=this._daylightSavingAdjust(F);N+=z+"</tr>"}Z++,Z>11&&(Z=0,et++),N+="</tbody></table>"+(U?"</div>"+(V[0]>0&&T===V[1]-1?"<div class='ui-datepicker-row-break'></div>":""):""),w+=N}_+=w}return _+=l,e._keyEvent=!1,_},_generateMonthYearHeader:function(e,t,i,s,a,n,r,o){var h,l,u,d,c,p,f,m,g=this._get(e,"changeMonth"),v=this._get(e,"changeYear"),y=this._get(e,"showMonthAfterYear"),b="<div class='ui-datepicker-title'>",_="";if(n||!g)_+="<span class='ui-datepicker-month'>"+r[t]+"</span>";else{for(h=s&&s.getFullYear()===i,l=a&&a.getFullYear()===i,_+="<select class='ui-datepicker-month' data-handler='selectMonth' data-event='change'>",u=0;12>u;u++)(!h||u>=s.getMonth())&&(!l||a.getMonth()>=u)&&(_+="<option value='"+u+"'"+(u===t?" selected='selected'":"")+">"+o[u]+"</option>");_+="</select>"}if(y||(b+=_+(!n&&g&&v?"":" ")),!e.yearshtml)if(e.yearshtml="",n||!v)b+="<span class='ui-datepicker-year'>"+i+"</span>";else{for(d=this._get(e,"yearRange").split(":"),c=(new Date).getFullYear(),p=function(e){var t=e.match(/c[+\-].*/)?i+parseInt(e.substring(1),10):e.match(/[+\-].*/)?c+parseInt(e,10):parseInt(e,10);return isNaN(t)?c:t},f=p(d[0]),m=Math.max(f,p(d[1]||"")),f=s?Math.max(f,s.getFullYear()):f,m=a?Math.min(m,a.getFullYear()):m,e.yearshtml+="<select class='ui-datepicker-year' data-handler='selectYear' data-event='change'>";m>=f;f++)e.yearshtml+="<option value='"+f+"'"+(f===i?" 
selected='selected'":"")+">"+f+"</option>";e.yearshtml+="</select>",b+=e.yearshtml,e.yearshtml=null}return b+=this._get(e,"yearSuffix"),y&&(b+=(!n&&g&&v?"":" ")+_),b+="</div>"},_adjustInstDate:function(e,t,i){var s=e.drawYear+("Y"===i?t:0),a=e.drawMonth+("M"===i?t:0),n=Math.min(e.selectedDay,this._getDaysInMonth(s,a))+("D"===i?t:0),r=this._restrictMinMax(e,this._daylightSavingAdjust(new Date(s,a,n)));e.selectedDay=r.getDate(),e.drawMonth=e.selectedMonth=r.getMonth(),e.drawYear=e.selectedYear=r.getFullYear(),("M"===i||"Y"===i)&&this._notifyChange(e)},_restrictMinMax:function(e,t){var i=this._getMinMaxDate(e,"min"),s=this._getMinMaxDate(e,"max"),a=i&&i>t?i:t;return s&&a>s?s:a},_notifyChange:function(e){var t=this._get(e,"onChangeMonthYear");t&&t.apply(e.input?e.input[0]:null,[e.selectedYear,e.selectedMonth+1,e])},_getNumberOfMonths:function(e){var t=this._get(e,"numberOfMonths");return null==t?[1,1]:"number"==typeof t?[1,t]:t},_getMinMaxDate:function(e,t){return this._determineDate(e,this._get(e,t+"Date"),null)},_getDaysInMonth:function(e,t){return 32-this._daylightSavingAdjust(new Date(e,t,32)).getDate()},_getFirstDayOfMonth:function(e,t){return new Date(e,t,1).getDay()},_canAdjustMonth:function(e,t,i,s){var a=this._getNumberOfMonths(e),n=this._daylightSavingAdjust(new Date(i,s+(0>t?t:a[0]*a[1]),1));return 0>t&&n.setDate(this._getDaysInMonth(n.getFullYear(),n.getMonth())),this._isInRange(e,n)},_isInRange:function(e,t){var i,s,a=this._getMinMaxDate(e,"min"),n=this._getMinMaxDate(e,"max"),r=null,o=null,h=this._get(e,"yearRange");return h&&(i=h.split(":"),s=(new Date).getFullYear(),r=parseInt(i[0],10),o=parseInt(i[1],10),i[0].match(/[+\-].*/)&&(r+=s),i[1].match(/[+\-].*/)&&(o+=s)),(!a||t.getTime()>=a.getTime())&&(!n||t.getTime()<=n.getTime())&&(!r||t.getFullYear()>=r)&&(!o||o>=t.getFullYear())},_getFormatConfig:function(e){var t=this._get(e,"shortYearCutoff");return t="string"!=typeof t?t:(new Date).getFullYear()%100+parseInt(t,10),{shortYearCutoff:t,dayNamesShort:this._get(e,"dayNamesShort"),dayNames:this._get(e,"dayNames"),monthNamesShort:this._get(e,"monthNamesShort"),monthNames:this._get(e,"monthNames")}},_formatDate:function(e,t,i,s){t||(e.currentDay=e.selectedDay,e.currentMonth=e.selectedMonth,e.currentYear=e.selectedYear);var a=t?"object"==typeof t?t:this._daylightSavingAdjust(new Date(s,i,t)):this._daylightSavingAdjust(new Date(e.currentYear,e.currentMonth,e.currentDay));return this.formatDate(this._get(e,"dateFormat"),a,this._getFormatConfig(e))}}),e.fn.datepicker=function(t){if(!this.length)return this;e.datepicker.initialized||(e(document).mousedown(e.datepicker._checkExternalClick),e.datepicker.initialized=!0),0===e("#"+e.datepicker._mainDivId).length&&e("body").append(e.datepicker.dpDiv);var i=Array.prototype.slice.call(arguments,1);return"string"!=typeof t||"isDisabled"!==t&&"getDate"!==t&&"widget"!==t?"option"===t&&2===arguments.length&&"string"==typeof arguments[1]?e.datepicker["_"+t+"Datepicker"].apply(e.datepicker,[this[0]].concat(i)):this.each(function(){"string"==typeof t?e.datepicker["_"+t+"Datepicker"].apply(e.datepicker,[this].concat(i)):e.datepicker._attachDatepicker(this,t)}):e.datepicker["_"+t+"Datepicker"].apply(e.datepicker,[this[0]].concat(i))},e.datepicker=new a,e.datepicker.initialized=!1,e.datepicker.uuid=(new Date).getTime(),e.datepicker.version="1.11.4",e.datepicker});
\ No newline at end of file
diff --git a/bitbake/lib/toaster/toastergui/static/js/jquery.cookie.js b/bitbake/lib/toaster/toastergui/static/js/jquery.cookie.js
new file mode 100644
index 0000000..9271900
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/jquery.cookie.js
@@ -0,0 +1,117 @@
+/*!
+ * jQuery Cookie Plugin v1.4.0
+ * https://github.com/carhartl/jquery-cookie
+ *
+ * Copyright 2013 Klaus Hartl
+ * Released under the MIT license
+ */
+(function (factory) {
+ if (typeof define === 'function' && define.amd) {
+ // AMD. Register as an anonymous module.
+ define(['jquery'], factory);
+ } else {
+ // Browser globals.
+ factory(jQuery);
+ }
+}(function ($) {
+
+ var pluses = /\+/g;
+
+ function encode(s) {
+ return config.raw ? s : encodeURIComponent(s);
+ }
+
+ function decode(s) {
+ return config.raw ? s : decodeURIComponent(s);
+ }
+
+ function stringifyCookieValue(value) {
+ return encode(config.json ? JSON.stringify(value) : String(value));
+ }
+
+ function parseCookieValue(s) {
+ if (s.indexOf('"') === 0) {
+ // This is a quoted cookie as described in RFC 2068; unescape it...
+ s = s.slice(1, -1).replace(/\\"/g, '"').replace(/\\\\/g, '\\');
+ }
+
+ try {
+ // Replace server-side written pluses with spaces.
+ // If we can't decode the cookie, ignore it, it's unusable.
+ s = decodeURIComponent(s.replace(pluses, ' '));
+ } catch(e) {
+ return;
+ }
+
+ try {
+ // If we can't parse the cookie, ignore it, it's unusable.
+ return config.json ? JSON.parse(s) : s;
+ } catch(e) {}
+ }
+
+ function read(s, converter) {
+ var value = config.raw ? s : parseCookieValue(s);
+ return $.isFunction(converter) ? converter(value) : value;
+ }
+
+ var config = $.cookie = function (key, value, options) {
+
+ // Write
+ if (value !== undefined && !$.isFunction(value)) {
+ options = $.extend({}, config.defaults, options);
+
+ if (typeof options.expires === 'number') {
+ var days = options.expires, t = options.expires = new Date();
+ t.setDate(t.getDate() + days);
+ }
+
+ return (document.cookie = [
+ encode(key), '=', stringifyCookieValue(value),
+ options.expires ? '; expires=' + options.expires.toUTCString() : '', // use expires attribute, max-age is not supported by IE
+ options.path ? '; path=' + options.path : '',
+ options.domain ? '; domain=' + options.domain : '',
+ options.secure ? '; secure' : ''
+ ].join(''));
+ }
+
+ // Read
+
+ var result = key ? undefined : {};
+
+ // Assign an empty array when there are no cookies at all, so the for
+ // loop below is skipped. This also prevents an odd result when
+ // calling $.cookie().
+ var cookies = document.cookie ? document.cookie.split('; ') : [];
+
+ for (var i = 0, l = cookies.length; i < l; i++) {
+ var parts = cookies[i].split('=');
+ var name = decode(parts.shift());
+ var cookie = parts.join('=');
+
+ if (key && key === name) {
+ // If the second argument (value) is a function, it's a converter...
+ result = read(cookie, value);
+ break;
+ }
+
+ // Prevent storing a cookie that we couldn't decode.
+ if (!key && (cookie = read(cookie)) !== undefined) {
+ result[name] = cookie;
+ }
+ }
+
+ return result;
+ };
+
+ config.defaults = {};
+
+ $.removeCookie = function (key, options) {
+ if ($.cookie(key) !== undefined) {
+ // Must not alter options, thus extending a fresh object...
+ $.cookie(key, '', $.extend({}, options, { expires: -1 }));
+ return true;
+ }
+ return false;
+ };
+
+}));
diff --git a/bitbake/lib/toaster/toastergui/static/js/jquery.treetable.js b/bitbake/lib/toaster/toastergui/static/js/jquery.treetable.js
new file mode 100644
index 0000000..42e7427
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/jquery.treetable.js
@@ -0,0 +1,620 @@
+/*
+ * jQuery treetable Plugin 3.1.0
+ * http://ludo.cubicphuse.nl/jquery-treetable
+ *
+ * Copyright 2013, Ludo van den Boom
+ * Dual licensed under the MIT or GPL Version 2 licenses.
+ */
+(function() {
+ var $, Node, Tree, methods;
+
+ $ = jQuery;
+
+ Node = (function() {
+ function Node(row, tree, settings) {
+ var parentId;
+
+ this.row = row;
+ this.tree = tree;
+ this.settings = settings;
+
+ // TODO Ensure id/parentId is always a string (not int)
+ this.id = this.row.data(this.settings.nodeIdAttr);
+
+ // TODO Move this to a setParentId function?
+ parentId = this.row.data(this.settings.parentIdAttr);
+ if (parentId != null && parentId !== "") {
+ this.parentId = parentId;
+ }
+
+ this.treeCell = $(this.row.children(this.settings.columnElType)[this.settings.column]);
+ this.expander = $(this.settings.expanderTemplate);
+ this.indenter = $(this.settings.indenterTemplate);
+ this.children = [];
+ this.initialized = false;
+ this.treeCell.prepend(this.indenter);
+ }
+
+ Node.prototype.addChild = function(child) {
+ return this.children.push(child);
+ };
+
+ Node.prototype.ancestors = function() {
+ var ancestors, node;
+ node = this;
+ ancestors = [];
+ while (node = node.parentNode()) {
+ ancestors.push(node);
+ }
+ return ancestors;
+ };
+
+ Node.prototype.collapse = function() {
+ if (this.collapsed()) {
+ return this;
+ }
+
+ this.row.removeClass("expanded").addClass("collapsed");
+
+ this._hideChildren();
+ this.expander.attr("title", this.settings.stringExpand);
+
+ if (this.initialized && this.settings.onNodeCollapse != null) {
+ this.settings.onNodeCollapse.apply(this);
+ }
+
+ return this;
+ };
+
+ Node.prototype.collapsed = function() {
+ return this.row.hasClass("collapsed");
+ };
+
+ // TODO destroy: remove event handlers, expander, indenter, etc.
+
+ Node.prototype.expand = function() {
+ if (this.expanded()) {
+ return this;
+ }
+
+ this.row.removeClass("collapsed").addClass("expanded");
+
+ if (this.initialized && this.settings.onNodeExpand != null) {
+ this.settings.onNodeExpand.apply(this);
+ }
+
+ if ($(this.row).is(":visible")) {
+ this._showChildren();
+ }
+
+ this.expander.attr("title", this.settings.stringCollapse);
+
+ return this;
+ };
+
+ Node.prototype.expanded = function() {
+ return this.row.hasClass("expanded");
+ };
+
+ Node.prototype.hide = function() {
+ this._hideChildren();
+ this.row.hide();
+ return this;
+ };
+
+ Node.prototype.isBranchNode = function() {
+ if(this.children.length > 0 || this.row.data(this.settings.branchAttr) === true) {
+ return true;
+ } else {
+ return false;
+ }
+ };
+
+ Node.prototype.updateBranchLeafClass = function(){
+ this.row.removeClass('branch');
+ this.row.removeClass('leaf');
+ this.row.addClass(this.isBranchNode() ? 'branch' : 'leaf');
+ };
+
+ Node.prototype.level = function() {
+ return this.ancestors().length;
+ };
+
+ Node.prototype.parentNode = function() {
+ if (this.parentId != null) {
+ return this.tree[this.parentId];
+ } else {
+ return null;
+ }
+ };
+
+ Node.prototype.removeChild = function(child) {
+ var i = $.inArray(child, this.children);
+ return this.children.splice(i, 1)
+ };
+
+ Node.prototype.render = function() {
+ var handler,
+ settings = this.settings,
+ target;
+
+ if (settings.expandable === true && this.isBranchNode()) {
+ handler = function(e) {
+ $(this).parents("table").treetable("node", $(this).parents("tr").data(settings.nodeIdAttr)).toggle();
+ return e.preventDefault();
+ };
+
+ this.indenter.html(this.expander);
+ target = settings.clickableNodeNames === true ? this.treeCell : this.expander;
+
+ target.off("click.treetable").on("click.treetable", handler);
+ target.off("keydown.treetable").on("keydown.treetable", function(e) {
+ if (e.keyCode == 13) {
+ handler.apply(this, [e]);
+ }
+ });
+ }
+
+ this.indenter[0].style.paddingLeft = "" + (this.level() * settings.indent) + "px";
+
+ return this;
+ };
+
+ Node.prototype.reveal = function() {
+ if (this.parentId != null) {
+ this.parentNode().reveal();
+ }
+ return this.expand();
+ };
+
+ Node.prototype.setParent = function(node) {
+ if (this.parentId != null) {
+ this.tree[this.parentId].removeChild(this);
+ }
+ this.parentId = node.id;
+ this.row.data(this.settings.parentIdAttr, node.id);
+ return node.addChild(this);
+ };
+
+ Node.prototype.show = function() {
+ if (!this.initialized) {
+ this._initialize();
+ }
+ this.row.show();
+ if (this.expanded()) {
+ this._showChildren();
+ }
+ return this;
+ };
+
+ Node.prototype.toggle = function() {
+ if (this.expanded()) {
+ this.collapse();
+ } else {
+ this.expand();
+ }
+ return this;
+ };
+
+ Node.prototype._hideChildren = function() {
+ var child, _i, _len, _ref, _results;
+ _ref = this.children;
+ _results = [];
+ for (_i = 0, _len = _ref.length; _i < _len; _i++) {
+ child = _ref[_i];
+ _results.push(child.hide());
+ }
+ return _results;
+ };
+
+ Node.prototype._initialize = function() {
+ var settings = this.settings;
+
+ this.render();
+
+ if (settings.expandable === true && settings.initialState === "collapsed") {
+ this.collapse();
+ } else {
+ this.expand();
+ }
+
+ if (settings.onNodeInitialized != null) {
+ settings.onNodeInitialized.apply(this);
+ }
+
+ return this.initialized = true;
+ };
+
+ Node.prototype._showChildren = function() {
+ var child, _i, _len, _ref, _results;
+ _ref = this.children;
+ _results = [];
+ for (_i = 0, _len = _ref.length; _i < _len; _i++) {
+ child = _ref[_i];
+ _results.push(child.show());
+ }
+ return _results;
+ };
+
+ return Node;
+ })();
+
+ Tree = (function() {
+ function Tree(table, settings) {
+ this.table = table;
+ this.settings = settings;
+ this.tree = {};
+
+ // Cache the nodes and roots in simple arrays for quick access/iteration
+ this.nodes = [];
+ this.roots = [];
+ }
+
+ Tree.prototype.collapseAll = function() {
+ var node, _i, _len, _ref, _results;
+ _ref = this.nodes;
+ _results = [];
+ for (_i = 0, _len = _ref.length; _i < _len; _i++) {
+ node = _ref[_i];
+ _results.push(node.collapse());
+ }
+ return _results;
+ };
+
+ Tree.prototype.expandAll = function() {
+ var node, _i, _len, _ref, _results;
+ _ref = this.nodes;
+ _results = [];
+ for (_i = 0, _len = _ref.length; _i < _len; _i++) {
+ node = _ref[_i];
+ _results.push(node.expand());
+ }
+ return _results;
+ };
+
+ Tree.prototype.findLastNode = function (node) {
+ if (node.children.length > 0) {
+ return this.findLastNode(node.children[node.children.length - 1]);
+ } else {
+ return node;
+ }
+ };
+
+ Tree.prototype.loadRows = function(rows) {
+ var node, row, i;
+
+ if (rows != null) {
+ for (i = 0; i < rows.length; i++) {
+ row = $(rows[i]);
+
+ if (row.data(this.settings.nodeIdAttr) != null) {
+ node = new Node(row, this.tree, this.settings);
+ this.nodes.push(node);
+ this.tree[node.id] = node;
+
+ if (node.parentId != null) {
+ this.tree[node.parentId].addChild(node);
+ } else {
+ this.roots.push(node);
+ }
+ }
+ }
+ }
+
+ for (i = 0; i < this.nodes.length; i++) {
+ node = this.nodes[i].updateBranchLeafClass();
+ }
+
+ return this;
+ };
+
+ Tree.prototype.move = function(node, destination) {
+ // Conditions:
+ // 1: +node+ should not be inserted as a child of +node+ itself.
+ // 2: +destination+ should not be the same as +node+'s current parent (this
+ // prevents +node+ from being moved to the same location where it already
+ // is).
+ // 3: +node+ should not be inserted in a location in a branch if this would
+ // result in +node+ being an ancestor of itself.
+ var nodeParent = node.parentNode();
+ if (node !== destination && destination.id !== node.parentId && $.inArray(node, destination.ancestors()) === -1) {
+ node.setParent(destination);
+ this._moveRows(node, destination);
+
+ // Re-render the parent node if this is its first child, since it does
+ // not have the expander yet.
+ if (node.parentNode().children.length === 1) {
+ node.parentNode().render();
+ }
+ }
+
+ if(nodeParent){
+ nodeParent.updateBranchLeafClass();
+ }
+ if(node.parentNode()){
+ node.parentNode().updateBranchLeafClass();
+ }
+ node.updateBranchLeafClass();
+ return this;
+ };
+
+ Tree.prototype.removeNode = function(node) {
+ // Recursively remove all descendants of +node+
+ this.unloadBranch(node);
+
+ // Remove node from DOM (<tr>)
+ node.row.remove();
+
+ // Clean up Tree object (so Node objects are GC-ed)
+ delete this.tree[node.id];
+ this.nodes.splice($.inArray(node, this.nodes), 1);
+ }
+
+ Tree.prototype.render = function() {
+ var root, _i, _len, _ref;
+ _ref = this.roots;
+ for (_i = 0, _len = _ref.length; _i < _len; _i++) {
+ root = _ref[_i];
+
+ // Naming is confusing (show vs. render); render is not called on the
+ // node from here.
+ root.show();
+ }
+ return this;
+ };
+
+ Tree.prototype.sortBranch = function(node, sortFun) {
+ // First sort internal array of children
+ node.children.sort(sortFun);
+
+ // Next render rows in correct order on page
+ this._sortChildRows(node);
+
+ return this;
+ };
+
+ Tree.prototype.unloadBranch = function(node) {
+ var children, i;
+
+ for (i = 0; i < node.children.length; i++) {
+ this.removeNode(node.children[i]);
+ }
+
+ // Reset node's collection of children
+ node.children = [];
+
+ node.updateBranchLeafClass();
+
+ return this;
+ };
+
+ Tree.prototype._moveRows = function(node, destination) {
+ var children = node.children, i;
+
+ node.row.insertAfter(destination.row);
+ node.render();
+
+ // Loop backwards through the children so they end up in the UI in the
+ // correct order (see #112)
+ for (i = children.length - 1; i >= 0; i--) {
+ this._moveRows(children[i], node);
+ }
+ };
+
+ // Special _moveRows case, move children to itself to force sorting
+ Tree.prototype._sortChildRows = function(parentNode) {
+ return this._moveRows(parentNode, parentNode);
+ };
+
+ return Tree;
+ })();
+
+ // jQuery Plugin
+ methods = {
+ init: function(options, force) {
+ var settings;
+
+ settings = $.extend({
+ branchAttr: "ttBranch",
+ clickableNodeNames: false,
+ column: 0,
+ columnElType: "td", // i.e. 'td', 'th' or 'td,th'
+ expandable: false,
+ expanderTemplate: "<a href='#'> </a>",
+ indent: 19,
+ indenterTemplate: "<span class='indenter'></span>",
+ initialState: "collapsed",
+ nodeIdAttr: "ttId", // maps to data-tt-id
+ parentIdAttr: "ttParentId", // maps to data-tt-parent-id
+ stringExpand: "Expand",
+ stringCollapse: "Collapse",
+
+ // Events
+ onInitialized: null,
+ onNodeCollapse: null,
+ onNodeExpand: null,
+ onNodeInitialized: null
+ }, options);
+
+ return this.each(function() {
+ var el = $(this), tree;
+
+ if (force || el.data("treetable") === undefined) {
+ tree = new Tree(this, settings);
+ tree.loadRows(this.rows).render();
+
+ el.addClass("treetable").data("treetable", tree);
+
+ if (settings.onInitialized != null) {
+ settings.onInitialized.apply(tree);
+ }
+ }
+
+ return el;
+ });
+ },
+
+ destroy: function() {
+ return this.each(function() {
+ return $(this).removeData("treetable").removeClass("treetable");
+ });
+ },
+
+ collapseAll: function() {
+ this.data("treetable").collapseAll();
+ return this;
+ },
+
+ collapseNode: function(id) {
+ var node = this.data("treetable").tree[id];
+
+ if (node) {
+ node.collapse();
+ } else {
+ throw new Error("Unknown node '" + id + "'");
+ }
+
+ return this;
+ },
+
+ expandAll: function() {
+ this.data("treetable").expandAll();
+ return this;
+ },
+
+ expandNode: function(id) {
+ var node = this.data("treetable").tree[id];
+
+ if (node) {
+ if (!node.initialized) {
+ node._initialize();
+ }
+
+ node.expand();
+ } else {
+ throw new Error("Unknown node '" + id + "'");
+ }
+
+ return this;
+ },
+
+ loadBranch: function(node, rows) {
+ var settings = this.data("treetable").settings,
+ tree = this.data("treetable").tree;
+
+ // TODO Switch to $.parseHTML
+ rows = $(rows);
+
+ if (node == null) { // Inserting new root nodes
+ this.append(rows);
+ } else {
+ var lastNode = this.data("treetable").findLastNode(node);
+ rows.insertAfter(lastNode.row);
+ }
+
+ this.data("treetable").loadRows(rows);
+
+ // Make sure nodes are properly initialized
+ rows.filter("tr").each(function() {
+ tree[$(this).data(settings.nodeIdAttr)].show();
+ });
+
+ if (node != null) {
+ // Re-render parent to ensure expander icon is shown (#79)
+ node.render().expand();
+ }
+
+ return this;
+ },
+
+ move: function(nodeId, destinationId) {
+ var destination, node;
+
+ node = this.data("treetable").tree[nodeId];
+ destination = this.data("treetable").tree[destinationId];
+ this.data("treetable").move(node, destination);
+
+ return this;
+ },
+
+ node: function(id) {
+ return this.data("treetable").tree[id];
+ },
+
+ removeNode: function(id) {
+ var node = this.data("treetable").tree[id];
+
+ if (node) {
+ this.data("treetable").removeNode(node);
+ } else {
+ throw new Error("Unknown node '" + id + "'");
+ }
+
+ return this;
+ },
+
+ reveal: function(id) {
+ var node = this.data("treetable").tree[id];
+
+ if (node) {
+ node.reveal();
+ } else {
+ throw new Error("Unknown node '" + id + "'");
+ }
+
+ return this;
+ },
+
+ sortBranch: function(node, columnOrFunction) {
+ var settings = this.data("treetable").settings,
+ prepValue,
+ sortFun;
+
+ columnOrFunction = columnOrFunction || settings.column;
+ sortFun = columnOrFunction;
+
+ if ($.isNumeric(columnOrFunction)) {
+ sortFun = function(a, b) {
+ var extractValue, valA, valB;
+
+ extractValue = function(node) {
+ var val = node.row.find("td:eq(" + columnOrFunction + ")").text();
+ // Ignore trailing/leading whitespace and use uppercase values for
+ // case insensitive ordering
+ return $.trim(val).toUpperCase();
+ }
+
+ valA = extractValue(a);
+ valB = extractValue(b);
+
+ if (valA < valB) return -1;
+ if (valA > valB) return 1;
+ return 0;
+ };
+ }
+
+ this.data("treetable").sortBranch(node, sortFun);
+ return this;
+ },
+
+ unloadBranch: function(node) {
+ this.data("treetable").unloadBranch(node);
+ return this;
+ }
+ };
+
+ $.fn.treetable = function(method) {
+ if (methods[method]) {
+ return methods[method].apply(this, Array.prototype.slice.call(arguments, 1));
+ } else if (typeof method === 'object' || !method) {
+ return methods.init.apply(this, arguments);
+ } else {
+ return $.error("Method " + method + " does not exist on jQuery.treetable");
+ }
+ };
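+
+ // Illustrative usage; "#example-table" and the node id "2" are hypothetical,
+ // the option and method names are the ones defined above:
+ //
+ //   $("#example-table").treetable({ expandable: true, initialState: "collapsed" });
+ //   $("#example-table").treetable("expandNode", "2");  // "2" is a data-tt-id value
+ //   $("#example-table").treetable("reveal", "2");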
+
+ // Expose classes to world
+ this.TreeTable || (this.TreeTable = {});
+ this.TreeTable.Node = Node;
+ this.TreeTable.Tree = Tree;
+}).call(this);
diff --git a/bitbake/lib/toaster/toastergui/static/js/layerBtn.js b/bitbake/lib/toaster/toastergui/static/js/layerBtn.js
new file mode 100644
index 0000000..a0509f9
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/layerBtn.js
@@ -0,0 +1,77 @@
+"use strict";
+
+function layerBtnsInit(ctx) {
+
+ /* Remove any current bindings to avoid duplicated binds */
+ $(".layerbtn").unbind('click');
+
+ $(".layerbtn").click(function (){
+ var layerObj = $(this).data("layer");
+ var add = ($(this).data('directive') === "add");
+ var thisBtn = $(this);
+
+ libtoaster.addRmLayer(layerObj, add, function (layerDepsList){
+ libtoaster.showChangeNotification(libtoaster.makeLayerAddRmAlertMsg(layerObj, layerDepsList, add));
+
+ /* In-cell notification */
+ var notification = $('<div id="temp-inline-notify" style="display: none; font-size: 11px; line-height: 1.3;" class="tooltip-inner"></div>');
+ thisBtn.parent().append(notification);
+
+ if (add){
+ if (layerDepsList.length > 0)
+ notification.text(String(layerDepsList.length + 1) + " layers added");
+ else
+ notification.text("1 layer added");
+
+ var layerBtnsFadeOut = $();
+ var layerExistsBtnFadeIn = $();
+
+ layerBtnsFadeOut = layerBtnsFadeOut.add(".layer-add-" + layerObj.id);
+ layerExistsBtnFadeIn = layerExistsBtnFadeIn.add(".layer-exists-" + layerObj.id);
+
+ for (var i in layerDepsList){
+ layerBtnsFadeOut = layerBtnsFadeOut.add(".layer-add-" + layerDepsList[i].id);
+ layerExistsBtnFadeIn = layerExistsBtnFadeIn.add(".layer-exists-" + layerDepsList[i].id);
+ }
+
+ layerBtnsFadeOut.fadeOut().promise().done(function(){
+ notification.fadeIn().delay(500).fadeOut(function(){
+ /* Fade in the buttons */
+ layerExistsBtnFadeIn.fadeIn();
+ notification.remove();
+ });
+ });
+ } else {
+ notification.text("1 layer deleted");
+ /* When deleting a layer we only handle the one button */
+ thisBtn.fadeOut(function(){
+ notification.fadeIn().delay(500).fadeOut(function(){
+ $(".layer-add-" + layerObj.id).fadeIn();
+ notification.remove();
+ });
+ });
+ }
+
+ });
+ });
+
+ $(".build-recipe-btn").unbind('click');
+ $(".build-recipe-btn").click(function(e){
+ e.preventDefault();
+ var recipe = $(this).data('recipe-name');
+
+ libtoaster.startABuild(libtoaster.ctx.projectBuildsUrl,
+ libtoaster.ctx.projectId, recipe,
+ function(){
+ /* Success */
+ window.location.replace(libtoaster.ctx.projectBuildsUrl);
+ });
+ });
+
+ /* Setup the initial state of the buttons */
+
+ for (var i in ctx.projectLayers){
+ $(".layer-exists-" + ctx.projectLayers[i]).show();
+ $(".layer-add-" + ctx.projectLayers[i]).hide();
+ }
+}
diff --git a/bitbake/lib/toaster/toastergui/static/js/layerDepsModal.js b/bitbake/lib/toaster/toastergui/static/js/layerDepsModal.js
new file mode 100644
index 0000000..825f9dc
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/layerDepsModal.js
@@ -0,0 +1,90 @@
+/*
+ * layer: Object representing the parent layer { id: .. name: ... url }
+ * dependencies: array of dependency layer objects { id: .. name: ..}
+ * title: optional override for title
+ * body: optional override for body
+ * addToProject: Whether to add layers to project on accept
+ * successAdd: function to run on success
+ */
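+
+/* Illustrative call, with hypothetical layer objects (argument order matches
+ * the function below):
+ *
+ *   showLayerDepsModal({ id: 10, name: "meta-example", layerdetailurl: "#" },
+ *     [ { id: 11, name: "meta-example-dep" } ],
+ *     null, null, true,
+ *     function (selectedLayers) { console.log(selectedLayers); });
+ */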
+function showLayerDepsModal(layer, dependencies, title, body, addToProject, successAdd) {
+
+ if ($("#dependencies-modal").length === 0) {
+ $.get(libtoaster.ctx.htmlUrl + "/layer_deps_modal.html", function(html){
+ $("body").append(html);
+ setupModal();
+ });
+ } else {
+ setupModal();
+ }
+
+ function setupModal(){
+
+ if (title) {
+ $('#dependencies-modal #title').text(title);
+ } else {
+ $('#dependencies-modal #title').text(layer.name);
+ }
+
+ if (body) {
+ $("#dependencies-modal #body-text").html(body);
+ } else {
+ $("#dependencies-modal #layer-name").text(layer.name);
+ }
+
+ var deplistHtml = "";
+ for (var i = 0; i < dependencies.length; i++) {
+ deplistHtml += "<li><label class=\"checkbox\"><input name=\"dependencies\" value=\"";
+ deplistHtml += dependencies[i].id;
+ deplistHtml +="\" type=\"checkbox\" checked=\"checked\"/>";
+ deplistHtml += dependencies[i].name;
+ deplistHtml += "</label></li>";
+ }
+ $('#dependencies-list').html(deplistHtml);
+
+ $("#dependencies-modal").data("deps", dependencies);
+
+ $('#dependencies-modal').modal('show');
+
+ /* Discard the old submission function */
+ $("#dependencies-modal-form").unbind('submit');
+
+ $("#dependencies-modal-form").submit(function (e) {
+ e.preventDefault();
+ var selectedLayerIds = [];
+ var selectedLayers = [];
+
+ $("input[name='dependencies']:checked").each(function () {
+ selectedLayerIds.push(parseInt($(this).val()));
+ });
+
+ /* -1 is a special dummy Id which we use when the layer isn't yet in the
+ * system; normally we would add the current layer to the selection.
+ */
+ if (layer.id != -1)
+ selectedLayerIds.push(layer.id);
+
+ /* Find the selected layer objects from our original list */
+ for (var i = 0; i < selectedLayerIds.length; i++) {
+ for (var j = 0; j < dependencies.length; j++) {
+ if (dependencies[j].id == selectedLayerIds[i]) {
+ selectedLayers.push(dependencies[j]);
+ }
+ }
+ }
+
+ if (addToProject) {
+ libtoaster.editCurrentProject({ 'layerAdd': selectedLayerIds.join(",") }, function () {
+ if (successAdd) {
+ successAdd(selectedLayers);
+ }
+ }, function () {
+ console.warn("Adding layers to project failed");
+ });
+ } else {
+ successAdd(selectedLayers);
+ }
+
+ $('#dependencies-modal').modal('hide');
+ });
+ }
+}
diff --git a/bitbake/lib/toaster/toastergui/static/js/layerdetails.js b/bitbake/lib/toaster/toastergui/static/js/layerdetails.js
new file mode 100644
index 0000000..000e803
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/layerdetails.js
@@ -0,0 +1,392 @@
+"use strict";
+
+function layerDetailsPageInit (ctx) {
+
+ var layerDepInput = $("#layer-dep-input");
+ var layerDepBtn = $("#add-layer-dependency-btn");
+ var layerDepsList = $("#layer-deps-list");
+ var currentLayerDepSelection;
+ var addRmLayerBtn = $("#add-remove-layer-btn");
+
+ /* setup the dependencies typeahead */
+ libtoaster.makeTypeahead(layerDepInput, libtoaster.ctx.layersTypeAheadUrl, { include_added: "true" }, function(item){
+ currentLayerDepSelection = item;
+
+ layerDepBtn.removeAttr("disabled");
+ });
+
+ $(".breadcrumb li:first a").click(function(e){
+ e.preventDefault();
+ /* By default this link goes to the project configuration page. However
+ * if the project has builds we go to the project builds page instead of the default href
+ */
+ libtoaster.getProjectInfo(libtoaster.ctx.projectPageUrl, function(prjInfo){
+ if (prjInfo.builds && prjInfo.builds.length > 0) {
+ window.location.replace(libtoaster.ctx.projectBuildsUrl);
+ } else {
+ window.location.replace(libtoaster.ctx.projectPageUrl);
+ }
+ });
+ });
+
+ function addRemoveDep(depLayerId, add, doneCb) {
+ var data = { layer_version_id : ctx.layerVersion.id };
+ if (add)
+ data.add_dep = depLayerId;
+ else
+ data.rm_dep = depLayerId;
+
+ $.ajax({
+ type: "POST",
+ url: ctx.xhrUpdateLayerUrl,
+ data: data,
+ headers: { 'X-CSRFToken' : $.cookie('csrftoken')},
+ success: function (data) {
+ if (data.error != "ok") {
+ console.warn(data.error);
+ } else {
+ doneCb();
+ }
+ },
+ error: function (data) {
+ console.warn("Call failed");
+ console.warn(data);
+ }
+ });
+ }
+
+ function layerDepRemoveClick() {
+ var toRemove = $(this).parent().data('layer-id');
+ var layerDepItem = $(this);
+
+ addRemoveDep(toRemove, false, function(){
+ layerDepItem.parent().fadeOut(function (){
+ layerDepItem.remove();
+ });
+ });
+ }
+
+ /* Add dependency layer button click handler */
+ layerDepBtn.click(function(){
+ if (currentLayerDepSelection === undefined)
+ return;
+
+ addRemoveDep(currentLayerDepSelection.id, true, function(){
+ /* Make a list item for the new layer dependency */
+ var newLayerDep = $("<li><a></a><span class=\"icon-trash\" data-toggle=\"tooltip\" title=\"Delete\"></span></li>");
+
+ newLayerDep.data('layer-id', currentLayerDepSelection.id);
+ newLayerDep.children("span").tooltip();
+
+ var link = newLayerDep.children("a");
+ link.attr("href", currentLayerDepSelection.layerdetailurl);
+ link.text(currentLayerDepSelection.name);
+ link.tooltip({title: currentLayerDepSelection.tooltip, placement: "right"});
+
+ /* Connect up the trash icon */
+ var trashItem = newLayerDep.children("span");
+ trashItem.click(layerDepRemoveClick);
+
+ layerDepsList.append(newLayerDep);
+ /* Clear the current selection */
+ layerDepInput.val("");
+ currentLayerDepSelection = undefined;
+ layerDepBtn.attr("disabled","disabled");
+ });
+ });
+
+ $(".icon-pencil").click(function (){
+ var mParent = $(this).parent("dd");
+ mParent.prev().css("margin-top", "10px");
+ mParent.children("form").slideDown();
+ var currentVal = mParent.children(".current-value");
+ currentVal.hide();
+ /* Set the current value to the input field */
+ mParent.find("textarea,input").val(currentVal.text());
+ /* Hides the "Not set" text */
+ mParent.children(".muted").hide();
+ /* We're editing so hide the delete icon */
+ mParent.children(".delete-current-value").hide();
+ mParent.find(".cancel").show();
+ $(this).hide();
+ });
+
+ $(".delete-current-value").click(function(){
+ var mParent = $(this).parent("dd");
+ mParent.find("input").val("");
+ mParent.find("textarea").val("");
+ mParent.find(".change-btn").click();
+ });
+
+ $(".cancel").click(function(){
+ var mParent = $(this).parents("dd");
+ $(this).hide();
+ mParent.children("form").slideUp(function(){
+ mParent.children(".current-value").show();
+ /* Show the "Not set" text if we ended up with no value */
+ if (!mParent.children(".current-value").html()){
+ mParent.children(".muted").fadeIn();
+ mParent.children(".delete-current-value").hide();
+ } else {
+ mParent.children(".delete-current-value").show();
+ }
+
+ mParent.children(".icon-pencil").show();
+ mParent.prev().css("margin-top", "0px");
+ });
+ });
+
+ function defaultAddBtnText(){
+ var text = " Add the "+ctx.layerVersion.name+" layer to your project";
+ addRmLayerBtn.text(text);
+ addRmLayerBtn.prepend("<span class=\"icon-plus\"></span>");
+ addRmLayerBtn.removeClass("btn-danger");
+ }
+
+ $("#details-tab").on('show', function(){
+ if (!ctx.layerVersion.inCurrentPrj)
+ defaultAddBtnText();
+
+ window.location.hash = "details";
+ });
+
+ function targetsTabShow(){
+ if (!ctx.layerVersion.inCurrentPrj){
+ if (ctx.numTargets > 0) {
+ var text = " Add the "+ctx.layerVersion.name+" layer to your project "+
+ "to enable these recipes";
+ addRmLayerBtn.text(text);
+ addRmLayerBtn.prepend("<span class=\"icon-plus\"></span>");
+ } else {
+ defaultAddBtnText();
+ }
+ }
+
+ window.location.hash = "recipes";
+ }
+
+ $("#recipestable").on('table-done', function(e, total, tableParams){
+ ctx.numTargets = total;
+
+ if (total === 0 && !tableParams.search) {
+ $("#no-recipes-yet").show();
+ } else {
+ $("#no-recipes-yet").hide();
+ }
+
+ $("#targets-tab").removeClass("muted");
+ if (window.location.hash === "#recipes"){
+ /* re-run targetsTabShow to update the text */
+ targetsTabShow();
+ }
+ });
+
+ $("#machinestable").on('table-done', function(e, total, tableParams){
+ ctx.numMachines = total;
+
+ if (total === 0 && !tableParams.search)
+ $("#no-machines-yet").show();
+ else
+ $("#no-machines-yet").hide();
+
+ $("#machines-tab").removeClass("muted");
+ if (window.location.hash === "#machines"){
+ /* re-run machinesTabShow to update the text */
+ machinesTabShow();
+ }
+
+ $(".select-machine-btn").click(function(e){
+ if ($(this).attr("disabled") === "disabled")
+ e.preventDefault();
+ });
+
+ });
+
+ $("#targets-tab").on('show', targetsTabShow);
+
+ function machinesTabShow(){
+ if (!ctx.layerVersion.inCurrentPrj) {
+ if (ctx.numMachines > 0){
+ var text = " Add the "+ctx.layerVersion.name+" layer to your project " +
+ "to enable these machines";
+ addRmLayerBtn.text(text);
+ addRmLayerBtn.prepend("<span class=\"icon-plus\"></span>");
+ } else {
+ defaultAddBtnText();
+ }
+ }
+
+ window.location.hash = "machines";
+ }
+
+ $("#machines-tab").on('show', machinesTabShow);
+
+ $(".pagesize").change(function(){
+ var search = libtoaster.parseUrlParams();
+ search.limit = this.value;
+
+ window.location.search = libtoaster.dumpsUrlParams(search);
+ });
+
+ /* Enables the Build target and Select Machine buttons and switches the
+ * add/remove button
+ */
+ function setLayerInCurrentPrj(added) {
+ ctx.layerVersion.inCurrentPrj = added;
+
+ if (added){
+ /* enable and switch all the button states */
+ $(".build-target-btn").removeAttr("disabled");
+ $(".select-machine-btn").removeAttr("disabled");
+ addRmLayerBtn.addClass("btn-danger");
+ addRmLayerBtn.data('directive', "remove");
+ addRmLayerBtn.text(" Delete the "+ctx.layerVersion.name+" layer from your project");
+ addRmLayerBtn.prepend("<span class=\"icon-trash\"></span>");
+
+ } else {
+ /* disable and switch all the button states */
+ $(".build-target-btn").attr("disabled","disabled");
+ $(".select-machine-btn").attr("disabled", "disabled");
+ addRmLayerBtn.removeClass("btn-danger");
+ addRmLayerBtn.data('directive', "add");
+
+ /* "special" handler so that we get the correct button text which depends
+ * on which tab is currently visible. Unfortunately we can't just call
+ * tab('show') as if it's already visible it doesn't run the event.
+ */
+ switch ($(".nav-pills .active a").prop('id')){
+ case 'machines-tab':
+ machinesTabShow();
+ break;
+ case 'targets-tab':
+ targetsTabShow();
+ break;
+ default:
+ defaultAddBtnText();
+ break;
+ }
+ }
+ }
+
+ $("#dismiss-alert").click(function(){
+ $(this).parent().fadeOut();
+ });
+
+ /* Add or remove this layer from the project */
+ addRmLayerBtn.click(function() {
+
+ var add = ($(this).data('directive') === "add");
+
+ libtoaster.addRmLayer(ctx.layerVersion, add, function (layersList){
+ var alertMsg = $("#alert-msg");
+ alertMsg.html(libtoaster.makeLayerAddRmAlertMsg(ctx.layerVersion, layersList, add));
+
+ setLayerInCurrentPrj(add);
+
+ $("#alert-area").show();
+ });
+ });
+
+ /* Handler for all of the Change buttons */
+ $(".change-btn").click(function(){
+ var mParent = $(this).parent();
+ var prop = $(this).data('layer-prop');
+
+ /* We have inputs, select and textareas to potentially grab the value
+ * from.
+ */
+ var entryElement = mParent.find("input");
+ if (entryElement.length === 0)
+ entryElement = mParent.find("textarea");
+ if (entryElement.length === 0) {
+ console.warn("Could not find element to get data from for this change");
+ return;
+ }
+
+ var data = { layer_version_id: ctx.layerVersion.id };
+ data[prop] = entryElement.val();
+
+ $.ajax({
+ type: "POST",
+ url: ctx.xhrUpdateLayerUrl,
+ data: data,
+ headers: { 'X-CSRFToken' : $.cookie('csrftoken')},
+ success: function (data) {
+ if (data.error != "ok") {
+ console.warn(data.error);
+ } else {
+ /* success layer property changed */
+ var inputArea = mParent.parents("dd");
+ var text;
+
+ text = entryElement.val();
+
+ /* Hide the "Not set" text if it's visible */
+ inputArea.find(".muted").hide();
+ inputArea.find(".current-value").text(text);
+ /* Same behaviour as cancel in that we hide the form/show current
+ * value.
+ */
+ inputArea.find(".cancel").click();
+ }
+ },
+ error: function (data) {
+ console.warn("Call failed");
+ console.warn(data);
+ }
+ });
+ });
+
+ /* Disable the change button when we have no data in the input */
+ $("dl input, dl textarea").on("input",function() {
+ if ($(this).val().length === 0)
+ $(this).parent().children(".change-btn").attr("disabled", "disabled");
+ else
+ $(this).parent().children(".change-btn").removeAttr("disabled");
+ });
+
+ /* This checks whether the dt's dd has data in it or the change data
+ * form is visible; otherwise hide it
+ */
+ $("dl").children().each(function (){
+ if ($(this).is("dt")) {
+ var dd = $(this).next("dd");
+ if (dd.children("form:visible").length === 0 && !dd.find(".current-value").html()){
+ if (ctx.layerVersion.sourceId == 3){
+ /* There's no current value and the layer is editable
+ * so show the "Not set" and hide the delete icon
+ */
+ dd.find(".muted").show();
+ dd.find(".delete-current-value").hide();
+ } else {
+ /* We're not viewing an editable layer so hide the empty dd/dl pair */
+ $(this).hide();
+ dd.hide();
+ }
+ }
+ }
+ });
+
+ /* Hide the right column if it contains no information */
+ if ($("dl.item-info").children(':visible').length === 0) {
+ $("dl.item-info").parent().hide();
+ }
+
+ /* Clear the current search selection and reload the results */
+ $(".target-search-clear").click(function(){
+ $("#target-search").val("");
+ $(this).parents("form").submit();
+ });
+
+ $(".machine-search-clear").click(function(){
+ $("#machine-search").val("");
+ $(this).parents("form").submit();
+ });
+
+
+ layerDepsList.find(".icon-trash").click(layerDepRemoveClick);
+ layerDepsList.find("a").tooltip();
+ $(".icon-trash").tooltip();
+ $(".commit").tooltip();
+
+}
diff --git a/bitbake/lib/toaster/toastergui/static/js/libtoaster.js b/bitbake/lib/toaster/toastergui/static/js/libtoaster.js
new file mode 100644
index 0000000..c04f7ab
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/libtoaster.js
@@ -0,0 +1,574 @@
+"use strict";
+/* All shared functionality to go in libtoaster object.
+ * This object really just helps readability since we can then have
+ * a traceable namespace.
+ */
+var libtoaster = (function (){
+
+ /* makeTypeahead parameters
+ * jQElement: jQuery element to attach the typeahead to
+ * xhrUrl: the url to get the JSON from; expects JSON in the form:
+ * { "error": "ok", "results": [ { "name": "test", "detail" : "a test thing" }, .... ] }
+ * xhrParams: the data/parameters to pass to the getJSON url e.g.
+ * { 'type' : 'projects' }; the text typed will be passed as 'search'.
+ * selectedCB: function to call once an item has been selected; receives
+ * the selected item as its single argument.
+ */
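+ /* Illustrative call; the element id is hypothetical, layersTypeAheadUrl is
+ * one of the urls exposed on libtoaster.ctx by the Toaster pages:
+ *
+ *   libtoaster.makeTypeahead($("#layer-add-input"),
+ *     libtoaster.ctx.layersTypeAheadUrl,
+ *     { include_added: "false" },
+ *     function (item) { console.log(item.name); });
+ */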
+ function _makeTypeahead (jQElement, xhrUrl, xhrParams, selectedCB) {
+ if (!xhrUrl || xhrUrl.length === 0)
+ throw("No url to typeahead supplied");
+
+ var xhrReq;
+
+ jQElement.typeahead({
+ source: function(query, process){
+ xhrParams.search = query;
+
+ /* If we have a request in progress don't fire off another one*/
+ if (xhrReq)
+ xhrReq.abort();
+
+ xhrReq = $.getJSON(xhrUrl, this.options.xhrParams, function(data){
+ if (data.error !== "ok") {
+ console.log("Error getting data from server "+data.error);
+ return;
+ }
+
+ xhrReq = null;
+
+ return process(data.results);
+ });
+ },
+ updater: function(item) {
+ var itemObj = this.$menu.find('.active').data('itemObject');
+ selectedCB(itemObj);
+ return item;
+ },
+ matcher: function(item) {
+ if (!item.hasOwnProperty('name')) {
+ console.log("Name property missing in data");
+ return 0;
+ }
+
+ if (this.$element.val().length === 0)
+ return 0;
+
+ return 1;
+ },
+ highlighter: function (item) {
+ /* Use jquery to escape the item name and detail */
+ var current = $("<span></span>").text(item.name + ' '+item.detail);
+ current = current.html();
+
+ var query = this.query.replace(/[\-\[\]{}()*+?.,\\\^$|#\s]/g, '\\$&')
+ return current.replace(new RegExp('(' + query + ')', 'ig'), function ($1, match) {
+ return '<strong>' + match + '</strong>'
+ })
+ },
+ sorter: function (items) { return items; },
+ xhrUrl: xhrUrl,
+ xhrParams: xhrParams,
+ xhrReq: xhrReq,
+ });
+
+
+ /* Copy of bootstrap's render func but sets selectedObject value */
+ function customRenderFunc (items) {
+ var that = this;
+
+ items = $(items).map(function (i, item) {
+ i = $(that.options.item).attr('data-value', item.name).data('itemObject', item);
+ i.find('a').html(that.highlighter(item));
+ return i[0];
+ });
+
+ items.first().addClass('active');
+ this.$menu.html(items);
+ return this;
+ }
+
+ jQElement.data('typeahead').render = customRenderFunc;
+ }
+
+ /*
+ * url - the url to POST the xhr build request to */
+ function _startABuild (url, project_id, targets, onsuccess, onfail) {
+
+ var data = {
+ project_id : project_id,
+ targets : targets,
+ };
+
+ $.ajax( {
+ type: "POST",
+ url: url,
+ data: data,
+ headers: { 'X-CSRFToken' : $.cookie('csrftoken')},
+ success: function (_data) {
+ /* No proper response YOCTO #7995
+ if (_data.error !== "ok") {
+ console.warn(_data.error);
+ } else { */
+ if (onsuccess !== undefined) onsuccess(_data);
+ // }
+ },
+ error: function (_data) {
+ console.warn("Call failed");
+ console.warn(_data);
+ if (onfail) onfail(data);
+ } });
+ }
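+
+ /* Illustrative call; the recipe name is hypothetical, the url and project id
+ * come from libtoaster.ctx on Toaster pages:
+ *
+ *   libtoaster.startABuild(libtoaster.ctx.projectBuildsUrl,
+ *     libtoaster.ctx.projectId, "core-image-minimal",
+ *     function (data) { console.log("build request queued"); },
+ *     function (data) { console.warn("build request failed"); });
+ */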
+
+ /* cancelABuild:
+ * url: projectbuilds
+ * build_ids: space separated list of build request ids
+ * onsuccess: callback for successful execution
+ * onfail: callback for failed execution
+ */
+ function _cancelABuild(url, build_ids, onsuccess, onfail){
+ $.ajax( {
+ type: "POST",
+ url: url,
+ data: { 'buildCancel': build_ids },
+ headers: { 'X-CSRFToken' : $.cookie('csrftoken')},
+ success: function (_data) {
+ if (_data.error !== "ok") {
+ console.warn(_data.error);
+ } else {
+ if (onsuccess !== undefined) onsuccess(_data);
+ }
+ },
+ error: function (_data) {
+ console.warn("Call failed");
+ console.warn(_data);
+ if (onfail) onfail(_data);
+ }
+ });
+ }
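+
+ /* Illustrative call; the build request id is hypothetical and the url is
+ * assumed to be the project builds url:
+ *
+ *   libtoaster.cancelABuild(libtoaster.ctx.projectBuildsUrl, "123",
+ *     function () { console.log("build cancelled"); },
+ *     function () { console.warn("cancel failed"); });
+ */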
+
+ /* Get a project's configuration info */
+ function _getProjectInfo(url, onsuccess, onfail){
+ $.ajax({
+ type: "GET",
+ data : { format: "json" },
+ url: url,
+ headers: { 'X-CSRFToken' : $.cookie('csrftoken')},
+ success: function (_data) {
+ if (_data.error !== "ok") {
+ console.warn(_data.error);
+ } else {
+ if (onsuccess !== undefined) onsuccess(_data);
+ }
+ },
+ error: function (_data) {
+ console.warn(_data);
+ if (onfail) onfail(_data);
+ }
+ });
+ }
+
+ /* Properties for data can be:
+ * layerDel (csv)
+ * layerAdd (csv)
+ * projectName
+ * projectVersion
+ * machineName
+ */
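+ /* Illustrative call; the layer ids and machine name are hypothetical:
+ *
+ *   libtoaster.editCurrentProject({ layerAdd: "5,7", machineName: "qemux86" },
+ *     function (data) { console.log("project updated"); },
+ *     function (data) { console.warn("update failed"); });
+ */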
+ function _editCurrentProject(data, onSuccess, onFail){
+ $.ajax({
+ type: "POST",
+ url: libtoaster.ctx.projectPageUrl + "?format=json",
+ data: data,
+ headers: { 'X-CSRFToken' : $.cookie('csrftoken')},
+ success: function (data) {
+ if (data.error != "ok") {
+ console.log(data.error);
+ if (onFail !== undefined)
+ onFail(data);
+ } else {
+ if (onSuccess !== undefined)
+ onSuccess(data);
+ }
+ },
+ error: function (data) {
+ console.log("Call failed");
+ console.log(data);
+ }
+ });
+ }
+
+ function _getLayerDepsForProject(url, onSuccess, onFail){
+ /* Check for dependencies not in the current project */
+ $.getJSON(url,
+ { format: 'json' },
+ function(data) {
+ if (data.error != "ok") {
+ console.log(data.error);
+ if (onFail !== undefined)
+ onFail(data);
+ } else {
+ var deps = {};
+ /* Filter out layer dep ids which are in the
+ * project already.
+ */
+ deps.list = data.layerdeps.list.filter(function(layerObj){
+ return (data.projectlayers.lastIndexOf(layerObj.id) < 0);
+ });
+
+ onSuccess(deps);
+ }
+ }, function() {
+ console.log("E: Failed to make request");
+ });
+ }
+
+ /* parses the query string of the current window.location to an object */
+ function _parseUrlParams() {
+ var string = window.location.search;
+ string = string.substr(1);
+ var stringArray = string.split ("&");
+ var obj = {};
+
+ for (var i in stringArray) {
+ var keyVal = stringArray[i].split ("=");
+ obj[keyVal[0]] = keyVal[1];
+ }
+
+ return obj;
+ }
+
+ /* takes a flat object and outputs it as a query string
+ * e.g. the output of parseUrlParams
+ */
+ function _dumpsUrlParams(obj) {
+ var str = "?";
+
+ for (var key in obj){
+ if (!obj[key])
+ continue;
+
+ str += key+ "="+obj[key].toString();
+ str += "&";
+ }
+
+ /* Maintain the current hash */
+ str += window.location.hash;
+
+ return str;
+ }
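+
+ /* Illustrative round trip (parameter names and values are hypothetical);
+ * this mirrors how the pagesize handler in layerdetails.js uses these:
+ *
+ *   var params = libtoaster.parseUrlParams(); // e.g. { limit: "25", page: "2" }
+ *   params.limit = 50;
+ *   window.location.search = libtoaster.dumpsUrlParams(params);
+ */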
+
+ function _addRmLayer(layerObj, add, doneCb){
+ if (add === true) {
+ /* If adding get the deps for this layer */
+ libtoaster.getLayerDepsForProject(layerObj.layerdetailurl,
+ function (layers) {
+
+ /* got result for dependencies */
+ if (layers.list.length === 0){
+ var editData = { layerAdd : layerObj.id };
+ libtoaster.editCurrentProject(editData, function() {
+ doneCb([]);
+ });
+ return;
+ } else {
+ try {
+ showLayerDepsModal(layerObj, layers.list, null, null, true, doneCb);
+ } catch (e) {
+ $.getScript(libtoaster.ctx.jsUrl + "layerDepsModal.js", function(){
+ showLayerDepsModal(layerObj, layers.list, null, null, true, doneCb);
+ }, function(){
+ console.warn("Failed to load layerDepsModal");
+ });
+ }
+ }
+ }, null);
+ } else if (add === false) {
+ var editData = { layerDel : layerObj.id };
+
+ libtoaster.editCurrentProject(editData, function () {
+ doneCb([]);
+ }, function () {
+ console.warn ("Removing layer from project failed");
+ doneCb(null);
+ });
+ }
+ }
+
+ function _makeLayerAddRmAlertMsg(layer, layerDepsList, add) {
+ var alertMsg;
+
+ if (layerDepsList.length > 0 && add === true) {
+ alertMsg = $("<span>You have added <strong>"+(layerDepsList.length+1)+"</strong> layers to your project: <a id=\"layer-affected-name\"></a> and its dependencies </span>");
+
+ /* Build the layer deps list */
+ layerDepsList.map(function(layer, i){
+ var link = $("<a></a>");
+
+ link.attr("href", layer.layerdetailurl);
+ link.text(layer.name);
+ link.tooltip({title: layer.tooltip});
+
+ if (i !== 0)
+ alertMsg.append(", ");
+
+ alertMsg.append(link);
+ });
+ } else if (layerDepsList.length === 0 && add === true) {
+ alertMsg = $("<span>You have added <strong>1</strong> layer to your project: <a id=\"layer-affected-name\"></a></span></span>");
+ } else if (add === false) {
+ alertMsg = $("<span>You have deleted <strong>1</strong> layer from your project: <a id=\"layer-affected-name\"></a></span>");
+ }
+
+ alertMsg.children("#layer-affected-name").text(layer.name);
+ alertMsg.children("#layer-affected-name").attr("href", layer.layerdetailurl);
+
+ return alertMsg.html();
+ }
+
+ function _showChangeNotification(message){
+ var alertMsg = $("#change-notification-msg");
+
+ alertMsg.html(message);
+ $("#change-notification, #change-notification *").fadeIn();
+ }
+
+
+ return {
+ reload_params : reload_params,
+ startABuild : _startABuild,
+ cancelABuild : _cancelABuild,
+ makeTypeahead : _makeTypeahead,
+ getProjectInfo: _getProjectInfo,
+ getLayerDepsForProject : _getLayerDepsForProject,
+ editCurrentProject : _editCurrentProject,
+ debug: false,
+ parseUrlParams : _parseUrlParams,
+ dumpsUrlParams : _dumpsUrlParams,
+ addRmLayer : _addRmLayer,
+ makeLayerAddRmAlertMsg : _makeLayerAddRmAlertMsg,
+ showChangeNotification : _showChangeNotification,
+ };
+})();
+
+/* keep this in the global scope for compatibility */
+function reload_params(params) {
+ var uri = window.location.href;
+ var splitlist = uri.split("?");
+ var url = splitlist[0];
+ var parameters = splitlist[1];
+ // deserialize the call parameters
+ var cparams = [];
+ if(parameters)
+ cparams = parameters.split("&");
+
+ var nparams = {};
+ for (var i = 0; i < cparams.length; i++) {
+ var temp = cparams[i].split("=");
+ nparams[temp[0]] = temp[1];
+ }
+ // update parameter values
+ for (i in params) {
+ nparams[encodeURIComponent(i)] = encodeURIComponent(params[i]);
+ }
+ // serialize the structure
+ var callparams = [];
+ for (i in nparams) {
+ callparams.push(i+"="+nparams[i]);
+ }
+ window.location.href = url+"?"+callparams.join('&');
+}
+
+
+/* Things that happen for all pages */
+$(document).ready(function() {
+
+ var ajaxLoadingTimer;
+
+ /* If we don't have a console object which might be the case in some
+ * browsers, no-op it to avoid undefined errors.
+ */
+ if (!window.console) {
+ window.console = {};
+ window.console.warn = function() {};
+ window.console.error = function() {};
+ }
+
+ /*
+ * PrettyPrint plugin.
+ *
+ */
+ // Init
+ prettyPrint();
+
+ // Prevent invalid links from jumping page scroll
+ $('a[href=#]').click(function() {
+ return false;
+ });
+
+
+ /* START TODO Delete this section now redundant */
+ /* Belen's additions */
+
+ // turn Edit columns dropdown into a multiselect menu
+ $('.dropdown-menu input, .dropdown-menu label').click(function(e) {
+ e.stopPropagation();
+ });
+
+ // enable popovers in any table cells that contain an anchor with the
+ // .btn class applied, and make sure popovers work on click, are mutually
+ // exclusive and they close when your click outside their area
+
+ $('html').click(function(){
+ $('td > a.btn').popover('hide');
+ });
+
+ $('td > a.btn').popover({
+ html:true,
+ placement:'left',
+ container:'body',
+ trigger:'manual'
+ }).click(function(e){
+ $('td > a.btn').not(this).popover('hide');
+ // ideally we would use 'toggle' here
+ // but it seems buggy in our Bootstrap version
+ $(this).popover('show');
+ e.stopPropagation();
+ });
+
+ // enable tooltips for applied filters
+ $('th a.btn-primary').tooltip({container:'body', html:true, placement:'bottom', delay:{hide:1500}});
+
+ // hide applied filter tooltip when you click on the filter button
+ $('th a.btn-primary').click(function () {
+ $('.tooltip').hide();
+ });
+
+ // enable help information tooltip
+ $(".get-help").tooltip({container:'body', html:true, delay:{show:300}});
+
+ // show help bubble only on hover inside tables
+ $(".hover-help").css("visibility","hidden");
+ $("th, td").hover(function () {
+ $(this).find(".hover-help").css("visibility","visible");
+ });
+ $("th, td").mouseleave(function () {
+ $(this).find(".hover-help").css("visibility","hidden");
+ });
+
+ /* END TODO Delete this section now redundant */
+
+ // show task type and outcome in task details pages
+ $(".task-info").tooltip({ container: 'body', html: true, delay: {show: 200}, placement: 'right' });
+
+ // initialise the tooltips for the icon-pencil icons
+ $(".icon-pencil").tooltip({ container: 'body', html: true, delay: {show: 400}, title: "Change" });
+
+ // initialise the tooltips for the download icons
+ $(".icon-download-alt").tooltip({ container: 'body', html: true, delay: { show: 200 } });
+
+ // initialise popover for debug information
+ $(".icon-info-sign").popover( { placement: 'bottom', html: true, container: 'body' });
+
+ // linking directly to tabs
+ $(function(){
+ var hash = window.location.hash;
+ $('ul.nav a[href="' + hash + '"]').tab('show');
+
+ $('.nav-tabs a').click(function () {
+ $(this).tab('show');
+ $('body').scrollTop();
+ });
+ });
+
+ // toggle for long content (variables, python stack trace, etc)
+ $('.full, .full-hide').hide();
+ $('.full-show').click(function(){
+ $('.full').slideDown(function(){
+ $('.full-hide').show();
+ });
+ $(this).hide();
+ });
+ $('.full-hide').click(function(){
+ $(this).hide();
+ $('.full').slideUp(function(){
+ $('.full-show').show();
+ });
+ });
+
+ //toggle the errors and warnings sections
+ $('.show-errors').click(function() {
+ $('#collapse-errors').addClass('in');
+ });
+ $('.toggle-errors').click(function() {
+ $('#collapse-errors').toggleClass('in');
+ });
+ $('.show-warnings').click(function() {
+ $('#collapse-warnings').addClass('in');
+ });
+ $('.toggle-warnings').click(function() {
+ $('#collapse-warnings').toggleClass('in');
+ });
+ $('.show-exceptions').click(function() {
+ $('#collapse-exceptions').addClass('in');
+ });
+ $('.toggle-exceptions').click(function() {
+ $('#collapse-exceptions').toggleClass('in');
+ });
+
+
+ $("#hide-alert").click(function(){
+ $(this).parent().fadeOut();
+ });
+
+ //show warnings section when requested from the previous page
+ if (location.href.search('#warnings') > -1) {
+ $('#collapse-warnings').addClass('in');
+ }
+
+ /* Show the loading notification if nothing has happened after 1.2
+ * seconds
+ */
+ $(document).bind("ajaxStart", function(){
+ if (ajaxLoadingTimer)
+ window.clearTimeout(ajaxLoadingTimer);
+
+ ajaxLoadingTimer = window.setTimeout(function() {
+ $("#loading-notification").fadeIn();
+ }, 1200);
+ });
+
+ $(document).bind("ajaxStop", function(){
+ if (ajaxLoadingTimer)
+ window.clearTimeout(ajaxLoadingTimer);
+
+ $("#loading-notification").fadeOut();
+ });
+
+ $(document).ajaxError(function(event, jqxhr, settings, errMsg){
+ if (errMsg === 'abort')
+ return;
+
+ console.warn("Problem with xhr call");
+ console.warn(errMsg);
+ console.warn(jqxhr.responseText);
+ });
+
+ function check_for_duplicate_ids () {
+ /* warn about duplicate element ids */
+ var ids = {};
+ $("[id]").each(function() {
+ if (this.id && ids[this.id]) {
+ console.warn('Duplicate element id #'+this.id);
+ }
+ ids[this.id] = true;
+ });
+ }
+
+ if (libtoaster.debug) {
+ check_for_duplicate_ids();
+ } else {
+ /* Debug is false so suppress warnings by overriding the functions */
+ window.console.warn = function () {};
+ window.console.error = function () {};
+ }
+});
diff --git a/bitbake/lib/toaster/toastergui/static/js/prettify.js b/bitbake/lib/toaster/toastergui/static/js/prettify.js
new file mode 100755
index 0000000..eef5ad7
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/prettify.js
@@ -0,0 +1,28 @@
+var q=null;window.PR_SHOULD_USE_CONTINUATION=!0;
+(function(){function L(a){function m(a){var f=a.charCodeAt(0);if(f!==92)return f;var b=a.charAt(1);return(f=r[b])?f:"0"<=b&&b<="7"?parseInt(a.substring(1),8):b==="u"||b==="x"?parseInt(a.substring(2),16):a.charCodeAt(1)}function e(a){if(a<32)return(a<16?"\\x0":"\\x")+a.toString(16);a=String.fromCharCode(a);if(a==="\\"||a==="-"||a==="["||a==="]")a="\\"+a;return a}function h(a){for(var f=a.substring(1,a.length-1).match(/\\u[\dA-Fa-f]{4}|\\x[\dA-Fa-f]{2}|\\[0-3][0-7]{0,2}|\\[0-7]{1,2}|\\[\S\s]|[^\\]/g),a=
+[],b=[],o=f[0]==="^",c=o?1:0,i=f.length;c<i;++c){var j=f[c];if(/\\[bdsw]/i.test(j))a.push(j);else{var j=m(j),d;c+2<i&&"-"===f[c+1]?(d=m(f[c+2]),c+=2):d=j;b.push([j,d]);d<65||j>122||(d<65||j>90||b.push([Math.max(65,j)|32,Math.min(d,90)|32]),d<97||j>122||b.push([Math.max(97,j)&-33,Math.min(d,122)&-33]))}}b.sort(function(a,f){return a[0]-f[0]||f[1]-a[1]});f=[];j=[NaN,NaN];for(c=0;c<b.length;++c)i=b[c],i[0]<=j[1]+1?j[1]=Math.max(j[1],i[1]):f.push(j=i);b=["["];o&&b.push("^");b.push.apply(b,a);for(c=0;c<
+f.length;++c)i=f[c],b.push(e(i[0])),i[1]>i[0]&&(i[1]+1>i[0]&&b.push("-"),b.push(e(i[1])));b.push("]");return b.join("")}function y(a){for(var f=a.source.match(/\[(?:[^\\\]]|\\[\S\s])*]|\\u[\dA-Fa-f]{4}|\\x[\dA-Fa-f]{2}|\\\d+|\\[^\dux]|\(\?[!:=]|[()^]|[^()[\\^]+/g),b=f.length,d=[],c=0,i=0;c<b;++c){var j=f[c];j==="("?++i:"\\"===j.charAt(0)&&(j=+j.substring(1))&&j<=i&&(d[j]=-1)}for(c=1;c<d.length;++c)-1===d[c]&&(d[c]=++t);for(i=c=0;c<b;++c)j=f[c],j==="("?(++i,d[i]===void 0&&(f[c]="(?:")):"\\"===j.charAt(0)&&
+(j=+j.substring(1))&&j<=i&&(f[c]="\\"+d[i]);for(i=c=0;c<b;++c)"^"===f[c]&&"^"!==f[c+1]&&(f[c]="");if(a.ignoreCase&&s)for(c=0;c<b;++c)j=f[c],a=j.charAt(0),j.length>=2&&a==="["?f[c]=h(j):a!=="\\"&&(f[c]=j.replace(/[A-Za-z]/g,function(a){a=a.charCodeAt(0);return"["+String.fromCharCode(a&-33,a|32)+"]"}));return f.join("")}for(var t=0,s=!1,l=!1,p=0,d=a.length;p<d;++p){var g=a[p];if(g.ignoreCase)l=!0;else if(/[a-z]/i.test(g.source.replace(/\\u[\da-f]{4}|\\x[\da-f]{2}|\\[^UXux]/gi,""))){s=!0;l=!1;break}}for(var r=
+{b:8,t:9,n:10,v:11,f:12,r:13},n=[],p=0,d=a.length;p<d;++p){g=a[p];if(g.global||g.multiline)throw Error(""+g);n.push("(?:"+y(g)+")")}return RegExp(n.join("|"),l?"gi":"g")}function M(a){function m(a){switch(a.nodeType){case 1:if(e.test(a.className))break;for(var g=a.firstChild;g;g=g.nextSibling)m(g);g=a.nodeName;if("BR"===g||"LI"===g)h[s]="\n",t[s<<1]=y++,t[s++<<1|1]=a;break;case 3:case 4:g=a.nodeValue,g.length&&(g=p?g.replace(/\r\n?/g,"\n"):g.replace(/[\t\n\r ]+/g," "),h[s]=g,t[s<<1]=y,y+=g.length,
+t[s++<<1|1]=a)}}var e=/(?:^|\s)nocode(?:\s|$)/,h=[],y=0,t=[],s=0,l;a.currentStyle?l=a.currentStyle.whiteSpace:window.getComputedStyle&&(l=document.defaultView.getComputedStyle(a,q).getPropertyValue("white-space"));var p=l&&"pre"===l.substring(0,3);m(a);return{a:h.join("").replace(/\n$/,""),c:t}}function B(a,m,e,h){m&&(a={a:m,d:a},e(a),h.push.apply(h,a.e))}function x(a,m){function e(a){for(var l=a.d,p=[l,"pln"],d=0,g=a.a.match(y)||[],r={},n=0,z=g.length;n<z;++n){var f=g[n],b=r[f],o=void 0,c;if(typeof b===
+"string")c=!1;else{var i=h[f.charAt(0)];if(i)o=f.match(i[1]),b=i[0];else{for(c=0;c<t;++c)if(i=m[c],o=f.match(i[1])){b=i[0];break}o||(b="pln")}if((c=b.length>=5&&"lang-"===b.substring(0,5))&&!(o&&typeof o[1]==="string"))c=!1,b="src";c||(r[f]=b)}i=d;d+=f.length;if(c){c=o[1];var j=f.indexOf(c),k=j+c.length;o[2]&&(k=f.length-o[2].length,j=k-c.length);b=b.substring(5);B(l+i,f.substring(0,j),e,p);B(l+i+j,c,C(b,c),p);B(l+i+k,f.substring(k),e,p)}else p.push(l+i,b)}a.e=p}var h={},y;(function(){for(var e=a.concat(m),
+l=[],p={},d=0,g=e.length;d<g;++d){var r=e[d],n=r[3];if(n)for(var k=n.length;--k>=0;)h[n.charAt(k)]=r;r=r[1];n=""+r;p.hasOwnProperty(n)||(l.push(r),p[n]=q)}l.push(/[\S\s]/);y=L(l)})();var t=m.length;return e}function u(a){var m=[],e=[];a.tripleQuotedStrings?m.push(["str",/^(?:'''(?:[^'\\]|\\[\S\s]|''?(?=[^']))*(?:'''|$)|"""(?:[^"\\]|\\[\S\s]|""?(?=[^"]))*(?:"""|$)|'(?:[^'\\]|\\[\S\s])*(?:'|$)|"(?:[^"\\]|\\[\S\s])*(?:"|$))/,q,"'\""]):a.multiLineStrings?m.push(["str",/^(?:'(?:[^'\\]|\\[\S\s])*(?:'|$)|"(?:[^"\\]|\\[\S\s])*(?:"|$)|`(?:[^\\`]|\\[\S\s])*(?:`|$))/,
+q,"'\"`"]):m.push(["str",/^(?:'(?:[^\n\r'\\]|\\.)*(?:'|$)|"(?:[^\n\r"\\]|\\.)*(?:"|$))/,q,"\"'"]);a.verbatimStrings&&e.push(["str",/^@"(?:[^"]|"")*(?:"|$)/,q]);var h=a.hashComments;h&&(a.cStyleComments?(h>1?m.push(["com",/^#(?:##(?:[^#]|#(?!##))*(?:###|$)|.*)/,q,"#"]):m.push(["com",/^#(?:(?:define|elif|else|endif|error|ifdef|include|ifndef|line|pragma|undef|warning)\b|[^\n\r]*)/,q,"#"]),e.push(["str",/^<(?:(?:(?:\.\.\/)*|\/?)(?:[\w-]+(?:\/[\w-]+)+)?[\w-]+\.h|[a-z]\w*)>/,q])):m.push(["com",/^#[^\n\r]*/,
+q,"#"]));a.cStyleComments&&(e.push(["com",/^\/\/[^\n\r]*/,q]),e.push(["com",/^\/\*[\S\s]*?(?:\*\/|$)/,q]));a.regexLiterals&&e.push(["lang-regex",/^(?:^^\.?|[!+-]|!=|!==|#|%|%=|&|&&|&&=|&=|\(|\*|\*=|\+=|,|-=|->|\/|\/=|:|::|;|<|<<|<<=|<=|=|==|===|>|>=|>>|>>=|>>>|>>>=|[?@[^]|\^=|\^\^|\^\^=|{|\||\|=|\|\||\|\|=|~|break|case|continue|delete|do|else|finally|instanceof|return|throw|try|typeof)\s*(\/(?=[^*/])(?:[^/[\\]|\\[\S\s]|\[(?:[^\\\]]|\\[\S\s])*(?:]|$))+\/)/]);(h=a.types)&&e.push(["typ",h]);a=(""+a.keywords).replace(/^ | $/g,
+"");a.length&&e.push(["kwd",RegExp("^(?:"+a.replace(/[\s,]+/g,"|")+")\\b"),q]);m.push(["pln",/^\s+/,q," \r\n\t\xa0"]);e.push(["lit",/^@[$_a-z][\w$@]*/i,q],["typ",/^(?:[@_]?[A-Z]+[a-z][\w$@]*|\w+_t\b)/,q],["pln",/^[$_a-z][\w$@]*/i,q],["lit",/^(?:0x[\da-f]+|(?:\d(?:_\d+)*\d*(?:\.\d*)?|\.\d\+)(?:e[+-]?\d+)?)[a-z]*/i,q,"0123456789"],["pln",/^\\[\S\s]?/,q],["pun",/^.[^\s\w"-$'./@\\`]*/,q]);return x(m,e)}function D(a,m){function e(a){switch(a.nodeType){case 1:if(k.test(a.className))break;if("BR"===a.nodeName)h(a),
+a.parentNode&&a.parentNode.removeChild(a);else for(a=a.firstChild;a;a=a.nextSibling)e(a);break;case 3:case 4:if(p){var b=a.nodeValue,d=b.match(t);if(d){var c=b.substring(0,d.index);a.nodeValue=c;(b=b.substring(d.index+d[0].length))&&a.parentNode.insertBefore(s.createTextNode(b),a.nextSibling);h(a);c||a.parentNode.removeChild(a)}}}}function h(a){function b(a,d){var e=d?a.cloneNode(!1):a,f=a.parentNode;if(f){var f=b(f,1),g=a.nextSibling;f.appendChild(e);for(var h=g;h;h=g)g=h.nextSibling,f.appendChild(h)}return e}
+for(;!a.nextSibling;)if(a=a.parentNode,!a)return;for(var a=b(a.nextSibling,0),e;(e=a.parentNode)&&e.nodeType===1;)a=e;d.push(a)}var k=/(?:^|\s)nocode(?:\s|$)/,t=/\r\n?|\n/,s=a.ownerDocument,l;a.currentStyle?l=a.currentStyle.whiteSpace:window.getComputedStyle&&(l=s.defaultView.getComputedStyle(a,q).getPropertyValue("white-space"));var p=l&&"pre"===l.substring(0,3);for(l=s.createElement("LI");a.firstChild;)l.appendChild(a.firstChild);for(var d=[l],g=0;g<d.length;++g)e(d[g]);m===(m|0)&&d[0].setAttribute("value",
+m);var r=s.createElement("OL");r.className="linenums";for(var n=Math.max(0,m-1|0)||0,g=0,z=d.length;g<z;++g)l=d[g],l.className="L"+(g+n)%10,l.firstChild||l.appendChild(s.createTextNode("\xa0")),r.appendChild(l);a.appendChild(r)}function k(a,m){for(var e=m.length;--e>=0;){var h=m[e];A.hasOwnProperty(h)?window.console&&console.warn("cannot override language handler %s",h):A[h]=a}}function C(a,m){if(!a||!A.hasOwnProperty(a))a=/^\s*</.test(m)?"default-markup":"default-code";return A[a]}function E(a){var m=
+a.g;try{var e=M(a.h),h=e.a;a.a=h;a.c=e.c;a.d=0;C(m,h)(a);var k=/\bMSIE\b/.test(navigator.userAgent),m=/\n/g,t=a.a,s=t.length,e=0,l=a.c,p=l.length,h=0,d=a.e,g=d.length,a=0;d[g]=s;var r,n;for(n=r=0;n<g;)d[n]!==d[n+2]?(d[r++]=d[n++],d[r++]=d[n++]):n+=2;g=r;for(n=r=0;n<g;){for(var z=d[n],f=d[n+1],b=n+2;b+2<=g&&d[b+1]===f;)b+=2;d[r++]=z;d[r++]=f;n=b}for(d.length=r;h<p;){var o=l[h+2]||s,c=d[a+2]||s,b=Math.min(o,c),i=l[h+1],j;if(i.nodeType!==1&&(j=t.substring(e,b))){k&&(j=j.replace(m,"\r"));i.nodeValue=
+j;var u=i.ownerDocument,v=u.createElement("SPAN");v.className=d[a+1];var x=i.parentNode;x.replaceChild(v,i);v.appendChild(i);e<o&&(l[h+1]=i=u.createTextNode(t.substring(b,o)),x.insertBefore(i,v.nextSibling))}e=b;e>=o&&(h+=2);e>=c&&(a+=2)}}catch(w){"console"in window&&console.log(w&&w.stack?w.stack:w)}}var v=["break,continue,do,else,for,if,return,while"],w=[[v,"auto,case,char,const,default,double,enum,extern,float,goto,int,long,register,short,signed,sizeof,static,struct,switch,typedef,union,unsigned,void,volatile"],
+"catch,class,delete,false,import,new,operator,private,protected,public,this,throw,true,try,typeof"],F=[w,"alignof,align_union,asm,axiom,bool,concept,concept_map,const_cast,constexpr,decltype,dynamic_cast,explicit,export,friend,inline,late_check,mutable,namespace,nullptr,reinterpret_cast,static_assert,static_cast,template,typeid,typename,using,virtual,where"],G=[w,"abstract,boolean,byte,extends,final,finally,implements,import,instanceof,null,native,package,strictfp,super,synchronized,throws,transient"],
+H=[G,"as,base,by,checked,decimal,delegate,descending,dynamic,event,fixed,foreach,from,group,implicit,in,interface,internal,into,is,lock,object,out,override,orderby,params,partial,readonly,ref,sbyte,sealed,stackalloc,string,select,uint,ulong,unchecked,unsafe,ushort,var"],w=[w,"debugger,eval,export,function,get,null,set,undefined,var,with,Infinity,NaN"],I=[v,"and,as,assert,class,def,del,elif,except,exec,finally,from,global,import,in,is,lambda,nonlocal,not,or,pass,print,raise,try,with,yield,False,True,None"],
+J=[v,"alias,and,begin,case,class,def,defined,elsif,end,ensure,false,in,module,next,nil,not,or,redo,rescue,retry,self,super,then,true,undef,unless,until,when,yield,BEGIN,END"],v=[v,"case,done,elif,esac,eval,fi,function,in,local,set,then,until"],K=/^(DIR|FILE|vector|(de|priority_)?queue|list|stack|(const_)?iterator|(multi)?(set|map)|bitset|u?(int|float)\d*)/,N=/\S/,O=u({keywords:[F,H,w,"caller,delete,die,do,dump,elsif,eval,exit,foreach,for,goto,if,import,last,local,my,next,no,our,print,package,redo,require,sub,undef,unless,until,use,wantarray,while,BEGIN,END"+
+I,J,v],hashComments:!0,cStyleComments:!0,multiLineStrings:!0,regexLiterals:!0}),A={};k(O,["default-code"]);k(x([],[["pln",/^[^<?]+/],["dec",/^<!\w[^>]*(?:>|$)/],["com",/^<\!--[\S\s]*?(?:--\>|$)/],["lang-",/^<\?([\S\s]+?)(?:\?>|$)/],["lang-",/^<%([\S\s]+?)(?:%>|$)/],["pun",/^(?:<[%?]|[%?]>)/],["lang-",/^<xmp\b[^>]*>([\S\s]+?)<\/xmp\b[^>]*>/i],["lang-js",/^<script\b[^>]*>([\S\s]*?)(<\/script\b[^>]*>)/i],["lang-css",/^<style\b[^>]*>([\S\s]*?)(<\/style\b[^>]*>)/i],["lang-in.tag",/^(<\/?[a-z][^<>]*>)/i]]),
+["default-markup","htm","html","mxml","xhtml","xml","xsl"]);k(x([["pln",/^\s+/,q," \t\r\n"],["atv",/^(?:"[^"]*"?|'[^']*'?)/,q,"\"'"]],[["tag",/^^<\/?[a-z](?:[\w-.:]*\w)?|\/?>$/i],["atn",/^(?!style[\s=]|on)[a-z](?:[\w:-]*\w)?/i],["lang-uq.val",/^=\s*([^\s"'>]*(?:[^\s"'/>]|\/(?=\s)))/],["pun",/^[/<->]+/],["lang-js",/^on\w+\s*=\s*"([^"]+)"/i],["lang-js",/^on\w+\s*=\s*'([^']+)'/i],["lang-js",/^on\w+\s*=\s*([^\s"'>]+)/i],["lang-css",/^style\s*=\s*"([^"]+)"/i],["lang-css",/^style\s*=\s*'([^']+)'/i],["lang-css",
+/^style\s*=\s*([^\s"'>]+)/i]]),["in.tag"]);k(x([],[["atv",/^[\S\s]+/]]),["uq.val"]);k(u({keywords:F,hashComments:!0,cStyleComments:!0,types:K}),["c","cc","cpp","cxx","cyc","m"]);k(u({keywords:"null,true,false"}),["json"]);k(u({keywords:H,hashComments:!0,cStyleComments:!0,verbatimStrings:!0,types:K}),["cs"]);k(u({keywords:G,cStyleComments:!0}),["java"]);k(u({keywords:v,hashComments:!0,multiLineStrings:!0}),["bsh","csh","sh"]);k(u({keywords:I,hashComments:!0,multiLineStrings:!0,tripleQuotedStrings:!0}),
+["cv","py"]);k(u({keywords:"caller,delete,die,do,dump,elsif,eval,exit,foreach,for,goto,if,import,last,local,my,next,no,our,print,package,redo,require,sub,undef,unless,until,use,wantarray,while,BEGIN,END",hashComments:!0,multiLineStrings:!0,regexLiterals:!0}),["perl","pl","pm"]);k(u({keywords:J,hashComments:!0,multiLineStrings:!0,regexLiterals:!0}),["rb"]);k(u({keywords:w,cStyleComments:!0,regexLiterals:!0}),["js"]);k(u({keywords:"all,and,by,catch,class,else,extends,false,finally,for,if,in,is,isnt,loop,new,no,not,null,of,off,on,or,return,super,then,true,try,unless,until,when,while,yes",
+hashComments:3,cStyleComments:!0,multilineStrings:!0,tripleQuotedStrings:!0,regexLiterals:!0}),["coffee"]);k(x([],[["str",/^[\S\s]+/]]),["regex"]);window.prettyPrintOne=function(a,m,e){var h=document.createElement("PRE");h.innerHTML=a;e&&D(h,e);E({g:m,i:e,h:h});return h.innerHTML};window.prettyPrint=function(a){function m(){for(var e=window.PR_SHOULD_USE_CONTINUATION?l.now()+250:Infinity;p<h.length&&l.now()<e;p++){var n=h[p],k=n.className;if(k.indexOf("prettyprint")>=0){var k=k.match(g),f,b;if(b=
+!k){b=n;for(var o=void 0,c=b.firstChild;c;c=c.nextSibling)var i=c.nodeType,o=i===1?o?b:c:i===3?N.test(c.nodeValue)?b:o:o;b=(f=o===b?void 0:o)&&"CODE"===f.tagName}b&&(k=f.className.match(g));k&&(k=k[1]);b=!1;for(o=n.parentNode;o;o=o.parentNode)if((o.tagName==="pre"||o.tagName==="code"||o.tagName==="xmp")&&o.className&&o.className.indexOf("prettyprint")>=0){b=!0;break}b||((b=(b=n.className.match(/\blinenums\b(?::(\d+))?/))?b[1]&&b[1].length?+b[1]:!0:!1)&&D(n,b),d={g:k,h:n,i:b},E(d))}}p<h.length?setTimeout(m,
+250):a&&a()}for(var e=[document.getElementsByTagName("pre"),document.getElementsByTagName("code"),document.getElementsByTagName("xmp")],h=[],k=0;k<e.length;++k)for(var t=0,s=e[k].length;t<s;++t)h.push(e[k][t]);var e=q,l=Date;l.now||(l={now:function(){return+new Date}});var p=0,d,g=/\blang(?:uage)?-([\w.]+)(?!\S)/;m()};window.PR={createSimpleLexer:x,registerLangHandler:k,sourceDecorator:u,PR_ATTRIB_NAME:"atn",PR_ATTRIB_VALUE:"atv",PR_COMMENT:"com",PR_DECLARATION:"dec",PR_KEYWORD:"kwd",PR_LITERAL:"lit",
+PR_NOCODE:"nocode",PR_PLAIN:"pln",PR_PUNCTUATION:"pun",PR_SOURCE:"src",PR_STRING:"str",PR_TAG:"tag",PR_TYPE:"typ"}})();
diff --git a/bitbake/lib/toaster/toastergui/static/js/projectpage.js b/bitbake/lib/toaster/toastergui/static/js/projectpage.js
new file mode 100644
index 0000000..d367047
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/projectpage.js
@@ -0,0 +1,428 @@
+"use strict";
+
+function projectPageInit(ctx) {
+
+ var layerAddInput = $("#layer-add-input");
+ var layersInPrjList = $("#layers-in-project-list");
+ var layerAddBtn = $("#add-layer-btn");
+
+ var machineChangeInput = $("#machine-change-input");
+ var machineChangeBtn = $("#machine-change-btn");
+ var machineForm = $("#select-machine-form");
+ var machineChangeFormToggle = $("#change-machine-toggle");
+ var machineNameTitle = $("#project-machine-name");
+ var machineChangeCancel = $("#cancel-machine-change");
+
+ var freqBuildBtn = $("#freq-build-btn");
+ var freqBuildList = $("#freq-build-list");
+
+ var releaseChangeFormToggle = $("#release-change-toggle");
+ var releaseTitle = $("#project-release-title");
+ var releaseForm = $("#change-release-form");
+ var releaseModal = $("#change-release-modal");
+ var cancelReleaseChange = $("#cancel-release-change");
+
+ var currentLayerAddSelection;
+ var currentMachineAddSelection = {};
+
+ var urlParams = libtoaster.parseUrlParams();
+
+ libtoaster.getProjectInfo(libtoaster.ctx.projectPageUrl, function(prjInfo){
+ updateProjectLayers(prjInfo.layers);
+ updateFreqBuildRecipes(prjInfo.freqtargets);
+ updateProjectRelease(prjInfo.release);
+ updateProjectReleases(prjInfo.releases, prjInfo.release);
+
+ /* If we're receiving a machine set from the url and it's different from
+ * our current machine then activate set machine sequence.
+ */
+ if (urlParams.hasOwnProperty('setMachine') &&
+ urlParams.setMachine !== prjInfo.machine.name){
+ currentMachineAddSelection.name = urlParams.setMachine;
+ machineChangeBtn.click();
+ } else {
+ updateMachineName(prjInfo.machine.name);
+ }
+
+ /* Now we're really ready show the page */
+ $("#project-page").show();
+ });
+
+ (function notificationRequest(){
+
+ if (urlParams.hasOwnProperty('notify')){
+ switch (urlParams.notify){
+ case 'new-project':
+ $("#project-created-notification").show();
+ break;
+ case 'layer-imported':
+ layerImportedNotification();
+ break;
+ default:
+ break;
+ }
+ }
+ })();
+
+ /* Layer imported notification */
+ function layerImportedNotification(){
+ var imported = $.cookie("layer-imported-alert");
+ var message = "Layer imported";
+
+ if (!imported)
+ return;
+ else
+ imported = JSON.parse(imported);
+
+ if (imported.deps_added.length === 0) {
+ message = "You have imported <strong><a href=\""+imported.imported_layer.layerdetailurl+"\">"+imported.imported_layer.name+"</a></strong> and added it to your project.";
+ } else {
+
+ var links = "<a href=\""+imported.imported_layer.layerdetailurl+"\">"+imported.imported_layer.name+"</a>, ";
+
+ imported.deps_added.map (function(item, index){
+ links +='<a href="'+item.layerdetailurl+'">'+item.name+'</a>';
+ /*If we're at the last element we don't want the trailing comma */
+ if (imported.deps_added[index+1] !== undefined)
+ links += ', ';
+ });
+
+ /* Length + 1 here to do deps + the imported layer */
+ message = 'You have imported <strong><a href="'+imported.imported_layer.layerdetailurl+'">'+imported.imported_layer.name+'</a></strong> and added <strong>'+(imported.deps_added.length+1)+'</strong> layers to your project: <strong>'+links+'</strong>';
+ }
+
+ libtoaster.showChangeNotification(message);
+
+ $.removeCookie("layer-imported-alert", { path: "/"});
+ }
+
+ /* Add/Rm layer functionality */
+
+ libtoaster.makeTypeahead(layerAddInput, libtoaster.ctx.layersTypeAheadUrl, { include_added: "false" }, function(item){
+ currentLayerAddSelection = item;
+ layerAddBtn.removeAttr("disabled");
+ });
+
+ layerAddBtn.click(function(e){
+ e.preventDefault();
+ var layerObj = currentLayerAddSelection;
+
+ addRmLayer(layerObj, true);
+ /* Reset the text input */
+ layerAddInput.val("");
+ });
+
+ function addRmLayer(layerObj, add){
+
+ libtoaster.addRmLayer(layerObj, add, function(layerDepsList){
+ if (add){
+ updateProjectLayers([layerObj]);
+ updateProjectLayers(layerDepsList);
+ }
+
+ /* Show the alert message */
+ var message = libtoaster.makeLayerAddRmAlertMsg(layerObj, layerDepsList, add);
+ libtoaster.showChangeNotification(message);
+ });
+ }
+
+ function updateProjectLayers(layers){
+
+ /* No layers to add */
+ if (layers.length === 0){
+ updateLayersCount();
+ return;
+ }
+
+ for (var i in layers){
+ var layerObj = layers[i];
+
+ var projectLayer = $("<li><a></a><span class=\"icon-trash\" data-toggle=\"tooltip\" title=\"Delete\"></span></li>");
+
+ projectLayer.data('layer', layerObj);
+ projectLayer.children("span").tooltip();
+
+ var link = projectLayer.children("a");
+
+ link.attr("href", layerObj.layerdetailurl);
+ link.text(layerObj.name);
+ /* YOCTO #8024
+ link.tooltip({title: layerObj.giturl + " | "+ layerObj.branch.name, placement: "right"});
+ branch name not accessible sometimes it is revision instead
+ */
+
+ var trashItem = projectLayer.children("span");
+ trashItem.click(function (e) {
+ e.preventDefault();
+ var layerObjToRm = $(this).parent().data('layer');
+
+ addRmLayer(layerObjToRm, false);
+
+ $(this).parent().fadeOut(function (){
+ $(this).remove();
+ updateLayersCount();
+ });
+ });
+
+ layersInPrjList.append(projectLayer);
+
+ updateLayersCount();
+ }
+ }
+
+ function updateLayersCount(){
+ var count = $("#layers-in-project-list").children().length;
+
+ if (count === 0)
+ $("#no-layers-in-project").fadeIn();
+ else
+ $("#no-layers-in-project").hide();
+
+ $("#project-layers-count").text(count);
+
+ return count;
+ }
+
+ /* Frequent builds functionality */
+ function updateFreqBuildRecipes(recipes) {
+ var noMostBuilt = $("#no-most-built");
+
+ if (recipes.length === 0){
+ noMostBuilt.show();
+ freqBuildBtn.hide();
+ } else {
+ noMostBuilt.hide();
+ freqBuildBtn.show();
+ }
+
+ for (var i in recipes){
+ var freqTargetCheck = $('<li><label class="checkbox"><input type="checkbox" /><span class="freq-target-name"></span></label></li>');
+ freqTargetCheck.find(".freq-target-name").text(recipes[i]);
+ freqTargetCheck.find("input").val(recipes[i]);
+ freqTargetCheck.click(function(){
+ if (freqBuildList.find(":checked").length > 0)
+ freqBuildBtn.removeAttr("disabled");
+ else
+ freqBuildBtn.attr("disabled", "disabled");
+ });
+
+ freqBuildList.append(freqTargetCheck);
+ }
+ }
+
+ freqBuildBtn.click(function(e){
+ e.preventDefault();
+
+ var toBuild = "";
+ freqBuildList.find(":checked").each(function(){
+ toBuild += $(this).val() + " ";
+ });
+
+ libtoaster.startABuild(libtoaster.ctx.projectBuildsUrl, libtoaster.ctx.projectId, toBuild, function(){
+ /* Build started */
+ window.location.replace(libtoaster.ctx.projectBuildsUrl);
+ },
+ function(){
+ /* Build start failed */
+ /* [YOCTO #7995] */
+ window.location.replace(libtoaster.ctx.projectBuildsUrl);
+ });
+ });
+
+
+ /* Change machine functionality */
+
+ machineChangeFormToggle.click(function(){
+ machineForm.slideDown();
+ machineNameTitle.hide();
+ $(this).hide();
+ });
+
+ machineChangeCancel.click(function(){
+ machineForm.slideUp(function(){
+ machineNameTitle.show();
+ machineChangeFormToggle.show();
+ });
+ });
+
+ function updateMachineName(machineName){
+ machineChangeInput.val(machineName);
+ machineNameTitle.text(machineName);
+ }
+
+ libtoaster.makeTypeahead(machineChangeInput, libtoaster.ctx.machinesTypeAheadUrl, { }, function(item){
+ currentMachineAddSelection = item;
+ machineChangeBtn.removeAttr("disabled");
+ });
+
+ machineChangeBtn.click(function(e){
+ e.preventDefault();
+ if (currentMachineAddSelection.name === undefined)
+ return;
+
+ libtoaster.editCurrentProject({ machineName : currentMachineAddSelection.name },
+ function(){
+ /* Machine changed successfully */
+ updateMachineName(currentMachineAddSelection.name);
+ machineChangeCancel.click();
+
+ /* Show the alert message */
+ var message = $('<span class="lead">You have changed the machine to: <strong><span id="notify-machine-name"></span></strong></span>');
+ message.find("#notify-machine-name").text(currentMachineAddSelection.name);
+ libtoaster.showChangeNotification(message);
+ },
+ function(){
+ /* Failed to change the machine */
+ console.log("failed to change machine");
+ });
+ });
+
+
+ /* Change release functionality */
+ function updateProjectRelease(release){
+ releaseTitle.text(release.description);
+ }
+
+ function updateProjectReleases(releases, current){
+ for (var i in releases){
+ var releaseOption = $("<option></option>");
+
+ releaseOption.val(releases[i].id);
+ releaseOption.text(releases[i].description);
+ releaseOption.data('release', releases[i]);
+
+ if (releases[i].id == current.id)
+ releaseOption.attr("selected", "selected");
+
+ releaseForm.children("select").append(releaseOption);
+ }
+ }
+
+ releaseChangeFormToggle.click(function(){
+ releaseForm.slideDown();
+ releaseTitle.hide();
+ $(this).hide();
+ });
+
+ cancelReleaseChange.click(function(e){
+ e.preventDefault();
+ releaseForm.slideUp(function(){
+ releaseTitle.show();
+ releaseChangeFormToggle.show();
+ });
+ });
+
+ function changeProjectRelease(release, layersToRm){
+ libtoaster.editCurrentProject({ projectVersion : release.id },
+ function(){
+ /* Success */
+ /* Update layers list with new layers */
+ layersInPrjList.addClass('muted');
+ libtoaster.getProjectInfo(libtoaster.ctx.projectPageUrl,
+ function(prjInfo){
+ layersInPrjList.children().remove();
+ updateProjectLayers(prjInfo.layers);
+ layersInPrjList.removeClass('muted');
+ releaseChangedNotification(release, prjInfo.layers, layersToRm);
+ });
+ updateProjectRelease(release);
+ cancelReleaseChange.click();
+ });
+ }
+
+ /* Create a notification to show the changes to the layer configuration
+ * caused by changing a release.
+ */
+
+ function releaseChangedNotification(release, layers, layersToRm){
+
+ var message;
+
+ if (layers.length === 0 && layersToRm.length === 0){
+ message = $('<span><span class="lead">You have changed the project release to: <strong><span id="notify-release-name"></span></strong>.');
+ message.find("#notify-release-name").text(release.description);
+ libtoaster.showChangeNotification(message);
+ return;
+ }
+
+ /* Create the comma separated list of layers removed */
+ var layersDelList = "";
+
+ layersToRm.map(function(layer, i){
+ layersDelList += layer.name;
+ if (layersToRm[i+1] !== undefined)
+ layersDelList += ', ';
+ });
+
+ message = $('<span><span class="lead">You have changed the project release to: <strong><span id="notify-release-name"></span></strong>. This has caused the following changes in your project layers:</span><ul id="notify-layers-changed-list"></ul></span>');
+
+ var changedList = message.find("#notify-layers-changed-list");
+
+ message.find("#notify-release-name").text(release.description);
+
+ /* Manually construct the list item for changed layers */
+ var li = '<li><strong>'+layers.length+'</strong> layers changed to the <strong>'+release.name+'</strong> release: ';
+ for (var i in layers){
+ li += '<a href='+layers[i].layerdetailurl+'>'+layers[i].name+'</a>';
+ if (Number(i) !== layers.length - 1)
+ li += ', ';
+ }
+
+ changedList.append($(li));
+
+ /* Layers removed */
+ if (layersToRm && layersToRm.length > 0){
+ if (layersToRm.length == 1)
+ li = '<li><strong>1</strong> layer deleted: '+layersToRm[0].name+'</li>';
+ else
+ li = '<li><strong>'+layersToRm.length+'</strong> layers deleted: '+layersDelList+'</li>';
+
+ changedList.append($(li));
+ }
+
+ libtoaster.showChangeNotification(message);
+ }
+
+ /* Show the modal dialog which gives the option to remove layers that
+ * aren't compatible with the proposed release
+ */
+ function showReleaseLayerChangeModal(release, layers){
+ var layersToRmList = releaseModal.find("#layers-to-remove-list");
+ layersToRmList.text("");
+
+ releaseModal.find(".proposed-release-change-name").text(release.description);
+ releaseModal.data("layers", layers);
+ releaseModal.data("release", release);
+
+ for (var i in layers){
+ layersToRmList.append($("<li></li>").text(layers[i].name));
+ }
+ releaseModal.modal('show');
+ }
+
+ $("#change-release-btn").click(function(e){
+ e.preventDefault();
+
+ var newRelease = releaseForm.find("option:selected").data('release');
+
+ $.getJSON(ctx.testReleaseChangeUrl,
+ { new_release_id: newRelease.id },
+ function(layers) {
+ if (layers.rows.length === 0){
+ /* No layers to change for this release */
+ changeProjectRelease(newRelease, []);
+ } else {
+ showReleaseLayerChangeModal(newRelease, layers.rows);
+ }
+ });
+ });
+
+ /* Release change modal accept */
+ $("#change-release-and-rm-layers").click(function(){
+ var layers = releaseModal.data("layers");
+ var release = releaseModal.data("release");
+
+ changeProjectRelease(release, layers);
+ });
+
+}
diff --git a/bitbake/lib/toaster/toastergui/static/js/table.js b/bitbake/lib/toaster/toastergui/static/js/table.js
new file mode 100644
index 0000000..f18034d
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/table.js
@@ -0,0 +1,554 @@
+'use strict';
+
+function tableInit(ctx){
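+ /* Sketch of the expected ctx argument, inferred from its use below
+ * (property names are taken from this file, values are illustrative):
+ * tableInit({ tableName: "mytable", url: "/path/to/table/data" });
+ */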
+
+ if (ctx.url.length === 0) {
+ throw "No url supplied for retreiving data";
+ }
+
+ var tableChromeDone = false;
+ var tableTotal = 0;
+
+ var tableParams = {
+ limit : 25,
+ page : 1,
+ orderby : null,
+ filter : null,
+ search : null,
+ };
+
+ var defaultHiddenCols = [];
+
+ var table = $("#" + ctx.tableName);
+
+ /* if we're loading clean from a URL, use its parameters as the defaults */
+ var urlParams = libtoaster.parseUrlParams();
+
+ /* Merge the tableParams and urlParams object properties */
+ tableParams = $.extend(tableParams, urlParams);
+
+ /* Now fix the types that .extend changed for us */
+ tableParams.limit = Number(tableParams.limit);
+ tableParams.page = Number(tableParams.page);
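+ /* For example (illustrative query string): loading the page with
+ * ?page=2&limit=50&orderby=-name would override the defaults above;
+ * the Number() calls restore page and limit to numeric types since the
+ * URL parameters arrive as strings.
+ */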
+
+ loadData(tableParams);
+
+ window.onpopstate = function(event){
+ if (event.state){
+ tableParams = event.state.tableParams;
+ /* We skip loadData and just update the table */
+ updateTable(event.state.tableData);
+ }
+ };
+
+ function loadData(tableParams){
+ $.ajax({
+ type: "GET",
+ url: ctx.url,
+ data: tableParams,
+ headers: { 'X-CSRFToken' : $.cookie('csrftoken')},
+ success: function(tableData) {
+ updateTable(tableData);
+ window.history.pushState({
+ tableData: tableData,
+ tableParams: tableParams
+ }, null, libtoaster.dumpsUrlParams(tableParams));
+ }
+ });
+ }
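+ /* Each successful load pushes the table data and parameters onto the
+ * browser history, so the onpopstate handler above can restore a previous
+ * view without another request and the address bar stays shareable.
+ */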
+
+ function updateTable(tableData) {
+ var tableBody = table.children("tbody");
+ var pagination = $('#pagination-'+ctx.tableName);
+ var paginationBtns = pagination.children('ul');
+ var tableContainer = $("#table-container-"+ctx.tableName);
+
+ tableContainer.css("visibility", "hidden");
+ /* To avoid page re-layout flicker when paging set fixed height */
+ table.css("padding-bottom", table.height());
+
+ /* Reset table components */
+ tableBody.html("");
+ paginationBtns.html("");
+
+ if (tableParams.search)
+ $('.remove-search-btn-'+ctx.tableName).show();
+ else
+ $('.remove-search-btn-'+ctx.tableName).hide();
+
+ $('.table-count-' + ctx.tableName).text(tableData.total);
+ tableTotal = tableData.total;
+
+ if (tableData.total === 0){
+ tableContainer.hide();
+ /* If we were searching show the new search bar and return */
+ if (tableParams.search){
+ $("#new-search-input-"+ctx.tableName).val(tableParams.search);
+ $("#no-results-"+ctx.tableName).show();
+ }
+ table.trigger("table-done", [tableData.total, tableParams]);
+
+ return;
+
+ /* We don't want to clutter the place with the table chrome if there
+ * are only a few results */
+ } else if (tableData.total <= 10 &&
+ !tableParams.filter &&
+ !tableParams.search){
+ $("#table-chrome-"+ctx.tableName).hide();
+ pagination.hide();
+ } else {
+ tableContainer.show();
+ $("#no-results-"+ctx.tableName).hide();
+ }
+
+ setupTableChrome(tableData);
+
+ /* Add table data rows */
+ var column_index;
+ for (var i in tableData.rows){
+ /* only display if the column is display-able */
+ var row = $("<tr></tr>");
+ column_index = -1;
+ for (var key_j in tableData.rows[i]){
+
+ /* if we have a static: version of a key, prefer the static: version for rendering */
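+ /* Illustrative example (field name assumed): if a row contains both
+ * "outcome" and "static:outcome", the pre-rendered HTML from
+ * "static:outcome" is shown in a <td class="outcome">, and the extra
+ * "static:" entry is skipped so the column is only emitted once.
+ */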
+ var orig_key_j = key_j;
+
+ if (key_j.indexOf("static:") === 0) {
+ if (key_j.substr("static:".length) in tableData.rows[i]) {
+ continue;
+ }
+ orig_key_j = key_j.substr("static:".length)
+ } else if (("static:" + key_j) in tableData.rows[i]) {
+ key_j = "static:" + key_j;
+ }
+
+ /* we skip over un-displayable column entries */
+ column_index += 1;
+ if (! tableData.columns[column_index].displayable) {
+ continue;
+ }
+
+ var td = $("<td></td>");
+ td.prop("class", orig_key_j);
+ if (tableData.rows[i][key_j]){
+ td.html(tableData.rows[i][key_j]);
+ }
+ row.append(td);
+ }
+ tableBody.append(row);
+
+ /* If we have layerbtns then initialise them */
+ layerBtnsInit(ctx);
+
+ /* If we have popovers initialise them now */
+ $('td > a.btn').popover({
+ html:true,
+ placement:'left',
+ container:'body',
+ trigger:'manual'
+ }).click(function(e){
+ $('td > a.btn').not(this).popover('hide');
+ /* ideally we would use 'toggle' here
+ * but it seems buggy in our Bootstrap version
+ */
+ $(this).popover('show');
+ e.stopPropagation();
+ });
+
+ /* enable help information tooltip */
+ $(".get-help").tooltip({container:'body', html:true, delay:{show:300}});
+ }
+
+ /* Setup the pagination controls */
+
+ var start = tableParams.page - 2;
+ var end = tableParams.page + 2;
+ var numPages = Math.ceil(tableData.total/tableParams.limit);
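+ /* Worked example (illustrative numbers): with page 7, limit 25 and 300
+ * total rows, numPages is 12 and the window below shows buttons 5-9
+ * with 7 marked active.
+ */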
+
+ if (numPages > 1){
+ if (tableParams.page < 3)
+ end = 5;
+
+ for (var page_i=1; page_i <= numPages; page_i++){
+ if (page_i >= start && page_i <= end){
+ var btn = $('<li><a href="#" class="page">'+page_i+'</a></li>');
+
+ if (page_i === tableParams.page){
+ btn.addClass("active");
+ }
+
+ /* Add the click handler */
+ btn.click(pageButtonClicked);
+ paginationBtns.append(btn);
+ }
+ }
+ }
+
+ loadColumnsPreference();
+
+ table.css("padding-bottom", 0);
+ tableContainer.css("visibility", "visible");
+
+ table.trigger("table-done", [tableData.total, tableParams]);
+ }
+
+ function setupTableChrome(tableData){
+ if (tableChromeDone === true)
+ return;
+
+ var tableHeadRow = table.find("thead");
+ var editColMenu = $("#table-chrome-"+ctx.tableName).find(".editcol");
+
+ tableHeadRow.html("");
+ editColMenu.html("");
+
+ if (!tableParams.orderby && tableData.default_orderby){
+ tableParams.orderby = tableData.default_orderby;
+ }
+
+ /* Add table header and column toggle menu */
+ for (var i in tableData.columns){
+ var col = tableData.columns[i];
+ if (col.displayable === false) {
+ continue;
+ }
+ var header = $("<th></th>");
+ header.prop("class", col.field_name);
+
+ /* Setup the help text */
+ if (col.help_text.length > 0) {
+ var help_text = $('<i class="icon-question-sign get-help"> </i>');
+ help_text.tooltip({title: col.help_text});
+ header.append(help_text);
+ }
+
+ /* Setup the orderable title */
+ if (col.orderable) {
+ var title = $('<a href=\"#\" ></a>');
+
+ title.data('field-name', col.field_name);
+ title.text(col.title);
+ title.click(sortColumnClicked);
+
+ header.append(title);
+
+ header.append(' <i class="icon-caret-down" style="display:none"></i>');
+ header.append(' <i class="icon-caret-up" style="display:none"></i>');
+
+ /* If we're currently ordered, set up the visual indicator */
+ if (col.field_name === tableParams.orderby ||
+ '-' + col.field_name === tableParams.orderby){
+ header.children("a").addClass("sorted");
+
+ if (tableParams.orderby.indexOf("-") === -1){
+ header.find('.icon-caret-down').show();
+ } else {
+ header.find('.icon-caret-up').show();
+ }
+ }
+
+ } else {
+ /* Not orderable */
+ header.addClass("muted");
+ header.css("font-weight", "normal");
+ header.append(col.title+' ');
+ }
+
+ /* Setup the filter button */
+ if (col.filter_name){
+ var filterBtn = $('<a href="#" role="button" class="pull-right btn btn-mini" data-toggle="modal"><i class="icon-filter filtered"></i></a>');
+
+ filterBtn.data('filter-name', col.filter_name);
+ filterBtn.prop('id', col.filter_name);
+ filterBtn.click(filterOpenClicked);
+
+ /* If we're currently being filtered, set up the visual indicator */
+ if (tableParams.filter &&
+ tableParams.filter.match('^'+col.filter_name)) {
+
+ filterBtnActive(filterBtn, true);
+ }
+ header.append(filterBtn);
+ }
+
+ /* Done making the header now add it */
+ tableHeadRow.append(header);
+
+ /* Now setup the checkbox state and click handler */
+ var toggler = $('<li><label class="checkbox">'+col.title+'<input type="checkbox" id="checkbox-'+ col.field_name +'" class="col-toggle" value="'+col.field_name+'" /></label></li>');
+
+ var togglerInput = toggler.find("input");
+
+ togglerInput.attr("checked","checked");
+
+ /* If we can hide the column enable the checkbox action */
+ if (col.hideable){
+ togglerInput.click(colToggleClicked);
+ } else {
+ toggler.find("label").addClass("muted");
+ togglerInput.attr("disabled", "disabled");
+ }
+
+ if (col.hidden) {
+ defaultHiddenCols.push(col.field_name);
+ }
+
+ editColMenu.append(toggler);
+ } /* End for each column */
+
+ tableChromeDone = true;
+ }
+
+ /* Toggles the active state of the filter button */
+ function filterBtnActive(filterBtn, active){
+ if (active) {
+ filterBtn.addClass("btn-primary");
+
+ filterBtn.tooltip({
+ html: true,
+ title: '<button class="btn btn-small btn-primary" onClick=\'$("#clear-filter-btn-'+ ctx.tableName +'").click();\'>Clear filter</button>',
+ placement: 'bottom',
+ delay: {
+ hide: 1500,
+ show: 400,
+ },
+ });
+ } else {
+ filterBtn.removeClass("btn-primary");
+ filterBtn.tooltip('destroy');
+ }
+ }
+
+ /* Display or hide table columns based on the cookie preference or defaults */
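+ /* The "cols" cookie holds a JSON array of hidden field names, e.g.
+ * '["started_on","errors_no"]' (illustrative values), written by
+ * colToggleClicked() below.
+ */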
+ function loadColumnsPreference(){
+ var cookie_data = $.cookie("cols");
+
+ if (cookie_data) {
+ var cols_hidden = JSON.parse($.cookie("cols"));
+
+ /* For each of the columns check if we should hide them and
+ * update the checked status in the Edit columns menu
+ */
+ $("#"+ctx.tableName+" th").each(function(){
+ for (var i in cols_hidden){
+ if ($(this).hasClass(cols_hidden[i])){
+ $("."+cols_hidden[i]).hide();
+ $("#checkbox-"+cols_hidden[i]).removeAttr("checked");
+ }
+ }
+ });
+ } else {
+ /* Disable these columns by default when we have no columns
+ * user setting.
+ */
+ for (var i in defaultHiddenCols) {
+ $("."+defaultHiddenCols[i]).hide();
+ $("#checkbox-"+defaultHiddenCols[i]).removeAttr("checked");
+ }
+ }
+ }
+
+ function sortColumnClicked(){
+
+ /* We only have one sort at a time so remove any existing sort indicators */
+ $("#"+ctx.tableName+" th .icon-caret-down").hide();
+ $("#"+ctx.tableName+" th .icon-caret-up").hide();
+ $("#"+ctx.tableName+" th a").removeClass("sorted");
+
+ var fieldName = $(this).data('field-name');
+
+ /* if we're already sorted sort the other way */
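+ /* e.g. clicking a "name" column (illustrative field) while orderby is
+ * already "name" switches it to "-name"; the leading "-" requests the
+ * reverse ordering from the server.
+ */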
+ if (tableParams.orderby === fieldName &&
+ tableParams.orderby.indexOf('-') === -1) {
+ tableParams.orderby = '-' + $(this).data('field-name');
+ $(this).parent().children('.icon-caret-up').show();
+ } else {
+ tableParams.orderby = $(this).data('field-name');
+ $(this).parent().children('.icon-caret-down').show();
+ }
+
+ $(this).addClass("sorted");
+
+ loadData(tableParams);
+ }
+
+ function pageButtonClicked(e) {
+ tableParams.page = Number($(this).text());
+ loadData(tableParams);
+ /* Stop page jumps when clicking on # links */
+ e.preventDefault();
+ }
+
+ /* Toggle a table column */
+ function colToggleClicked (){
+ var col = $(this).val();
+ var disabled_cols = [];
+
+ if ($(this).prop("checked")) {
+ $("."+col).show();
+ } else {
+ $("."+col).hide();
+ /* If we're ordered by the column we're hiding remove the order by */
+ if (col === tableParams.orderby ||
+ '-' + col === tableParams.orderby){
+ tableParams.orderby = null;
+ loadData(tableParams);
+ }
+ }
+
+ /* Update the cookie with the unchecked columns */
+ $(".col-toggle").not(":checked").map(function(){
+ disabled_cols.push($(this).val());
+ });
+
+ $.cookie("cols", JSON.stringify(disabled_cols));
+ }
+
+ function filterOpenClicked(){
+ var filterName = $(this).data('filter-name');
+
+ /* We need to pass in the current search so that the filter counts take
+ * into account the current search filter
+ */
+ var params = {
+ 'name' : filterName,
+ 'search': tableParams.search,
+ 'cmd': 'filterinfo',
+ };
+
+ $.ajax({
+ type: "GET",
+ url: ctx.url,
+ data: params,
+ headers: { 'X-CSRFToken' : $.cookie('csrftoken')},
+ success: function (filterData) {
+ var filterActionRadios = $('#filter-actions-'+ctx.tableName);
+
+ $('#filter-modal-title-'+ctx.tableName).text(filterData.title);
+
+ filterActionRadios.text("");
+
+ for (var i in filterData.filter_actions){
+ var filterAction = filterData.filter_actions[i];
+
+ var action = $('<label class="radio"><input type="radio" name="filter" value=""><span class="filter-title"></span></label>');
+ var actionTitle = filterAction.title + ' (' + filterAction.count + ')';
+
+ var radioInput = action.children("input");
+
+ if (Number(filterAction.count) == 0){
+ radioInput.attr("disabled", "disabled");
+ }
+
+ action.children(".filter-title").text(actionTitle);
+
+ radioInput.val(filterName + ':' + filterAction.name);
+
+ /* Set up the currently selected filter, defaulting to 'all' if
+ * no filter is currently selected.
+ */
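+ /* Illustrative value (filter and action names assumed): a filter named
+ * "outcome" with an action "failed" produces the radio value
+ * "outcome:failed"; selecting the "all" action clears the filter when
+ * the modal form is submitted.
+ */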
+ if ((tableParams.filter &&
+ tableParams.filter === radioInput.val()) ||
+ filterAction.name == 'all') {
+ radioInput.attr("checked", "checked");
+ }
+
+ filterActionRadios.append(action);
+ }
+
+ $('#filter-modal-'+ctx.tableName).modal('show');
+ }
+ });
+ }
+
+
+ $(".get-help").tooltip({container:'body', html:true, delay:{show:300}});
+
+ /* Keep the Edit columns menu open after click by eating the event */
+ $('.dropdown-menu').click(function(e) {
+ e.stopPropagation();
+ });
+
+ $(".pagesize-"+ctx.tableName).val(tableParams.limit);
+
+ /* page size selector */
+ $(".pagesize-"+ctx.tableName).change(function(e){
+ tableParams.limit = Number(this.value);
+ if ((tableParams.page * tableParams.limit) > tableTotal)
+ tableParams.page = 1;
+
+ loadData(tableParams);
+ /* sync the other selectors on the page */
+ $(".pagesize-"+ctx.tableName).val(this.value);
+ e.preventDefault();
+ });
+
+ $("#search-submit-"+ctx.tableName).click(function(e){
+ var searchTerm = $("#search-input-"+ctx.tableName).val();
+
+ tableParams.page = 1;
+ tableParams.search = searchTerm;
+
+ /* If a filter was active we remove it */
+ if (tableParams.filter) {
+ var filterBtn = $("#" + tableParams.filter.split(":")[0]);
+ filterBtnActive(filterBtn, false);
+ tableParams.filter = null;
+ }
+
+ loadData(tableParams);
+
+ e.preventDefault();
+ });
+
+ $('.remove-search-btn-'+ctx.tableName).click(function(e){
+ e.preventDefault();
+
+ tableParams.page = 1;
+ tableParams.search = null;
+ loadData(tableParams);
+
+ $("#search-input-"+ctx.tableName).val("");
+ $(this).hide();
+ });
+
+ $("#search-input-"+ctx.tableName).keyup(function(e){
+ if (e.which === 13)
+ $('#search-submit-'+ctx.tableName).click();
+ });
+
+ /* Stop page jumps when clicking on # links */
+ $('a[href="#"]').click(function(e){
+ e.preventDefault();
+ });
+
+ $("#clear-filter-btn-"+ctx.tableName).click(function(){
+ var filterBtn = $("#" + tableParams.filter.split(":")[0]);
+ filterBtnActive(filterBtn, false);
+
+ tableParams.filter = null;
+ loadData(tableParams);
+ });
+
+ $("#filter-modal-form-"+ctx.tableName).submit(function(e){
+ e.preventDefault();
+
+ tableParams.filter = $(this).find("input[type='radio']:checked").val();
+
+ var filterBtn = $("#" + tableParams.filter.split(":")[0]);
+
+ /* All === remove filter */
+ if (tableParams.filter.match(":all$")) {
+ tableParams.filter = null;
+ filterBtnActive(filterBtn, false);
+ } else {
+ filterBtnActive(filterBtn, true);
+ }
+
+ loadData(tableParams);
+
+ $(this).parent().modal('hide');
+ });
+}
diff --git a/bitbake/lib/toaster/toastergui/static/js/tests/test.js b/bitbake/lib/toaster/toastergui/static/js/tests/test.js
new file mode 100644
index 0000000..d610113
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/tests/test.js
@@ -0,0 +1,175 @@
+"use strict";
+/* Unit tests for Toaster's JS */
+
+/* libtoaster tests */
+
+QUnit.test("Layer alert notification", function(assert) {
+ var layer = {
+ "layerdetailurl":"/toastergui/project/1/layer/22",
+ "vcs_url":"git://example.com/example.git",
+ "detail":"[ git://example.com/example.git | master ]",
+ "vcs_reference":"master",
+ "id": 22,
+ "name":"meta-example"
+ };
+
+ var correctResponse = "You have added <strong>3</strong> layers to your project: <a id=\"layer-affected-name\" href=\"/toastergui/project/1/layer/22\">meta-example</a> and its dependencies <a href=\"/toastergui/project/1/layer/9\" data-original-title=\"\" title=\"\">meta-example-two</a>, <a href=\"/toastergui/project/1/layer/9\" data-original-title=\"\" title=\"\">meta-example-three</a>";
+
+ var layerDepsList = [
+ {
+ "layerdetailurl":"/toastergui/project/1/layer/9",
+ "vcs_url":"git://example.com/example.git",
+ "detail":"[ git://example.com/example.git | master ]",
+ "vcs_reference":"master",
+ "id": 9,
+ "name":"meta-example-two"
+ },
+ {
+ "layerdetailurl":"/toastergui/project/1/layer/9",
+ "vcs_url":"git://example.com/example.git",
+ "detail":"[ git://example.com/example.git | master ]",
+ "vcs_reference":"master",
+ "id": 10,
+ "name":"meta-example-three"
+ },
+ ];
+
+ var msg = libtoaster.makeLayerAddRmAlertMsg(layer, layerDepsList, true);
+ var test = $("<div></div>");
+
+ test.html(msg);
+
+ assert.equal(test.children("strong").text(), "3");
+ assert.equal(test.children("a").length, 3);
+});
+
+QUnit.test("Project info", function(assert){
+ var done = assert.async();
+ libtoaster.getProjectInfo(libtoaster.ctx.projectPageUrl, function(prjInfo){
+ assert.ok(prjInfo.machine.name);
+ assert.ok(prjInfo.releases.length > 0);
+ assert.ok(prjInfo.layers.length > 0);
+ assert.ok(prjInfo.freqtargets);
+ assert.ok(prjInfo.release);
+ done();
+ });
+});
+
+QUnit.test("Show notification", function(assert){
+ var msg = "Testing";
+ var element = $("#change-notification-msg");
+
+ libtoaster.showChangeNotification(msg);
+
+ assert.equal(element.text(), msg);
+ assert.ok(element.is(":visible"));
+
+ $("#change-notification").hide();
+});
+
+var layer = {
+ "id": 91,
+ "name": "meta-crystalforest",
+ "layerdetailurl": "/toastergui/project/4/layer/91"
+};
+
+QUnit.test("Add layer", function(assert){
+ var done = assert.async();
+
+ /* Wait for the modal to be added to the dom */
+ var checkModal = setInterval(function(){
+ if ($("#dependencies-modal").length > 0) {
+ $("#dependencies-modal .btn-primary").click();
+ clearInterval(checkModal);
+ }
+ }, 200);
+
+ libtoaster.addRmLayer(layer, true, function(deps){
+ assert.equal(deps.length, 1);
+ done();
+ });
+
+});
+
+QUnit.test("Rm layer", function(assert){
+ var done = assert.async();
+
+ libtoaster.addRmLayer(layer, false, function(deps){
+ assert.equal(deps.length, 0);
+ done();
+ });
+
+});
+
+QUnit.test("Parse url params", function(assert){
+ var params = libtoaster.parseUrlParams();
+ assert.ok(params);
+});
+
+QUnit.test("Dump url params", function(assert){
+ var params = libtoaster.dumpsUrlParams();
+ assert.ok(params);
+});
+
+QUnit.test("Make typeaheads", function(assert){
+ var layersT = $("#layers");
+ var machinesT = $("#machines");
+ var projectsT = $("#projects");
+ var recipesT = $("#recipes");
+
+ libtoaster.makeTypeahead(layersT,
+ libtoaster.ctx.layersTypeAheadUrl, {}, function(){});
+
+ libtoaster.makeTypeahead(machinesT,
+ libtoaster.ctx.machinesTypeAheadUrl, {}, function(){});
+
+ libtoaster.makeTypeahead(projectsT,
+ libtoaster.ctx.projectsTypeAheadUrl, {}, function(){});
+
+ libtoaster.makeTypeahead(recipesT,
+ libtoaster.ctx.recipesTypeAheadUrl, {}, function(){});
+
+ assert.ok(recipesT.data('typeahead'));
+ assert.ok(layersT.data('typeahead'));
+ assert.ok(projectsT.data('typeahead'));
+ assert.ok(recipesT.data('typeahead'));
+});
+
+
+
+/* Page init functions */
+
+QUnit.test("Import layer page init", function(assert){
+ assert.throws(importLayerPageInit());
+});
+
+QUnit.test("Project page init", function(assert){
+ assert.throws(projectPageInit());
+});
+
+QUnit.test("Layer details page init", function(assert){
+ assert.throws(layerDetailsPageInit());
+});
+
+QUnit.test("Layer btns init", function(assert){
+ assert.throws(layerBtnsInit({ projectLayers : [] }));
+});
+
+QUnit.test("Table init", function(assert){
+ assert.throws(tableInit({ url : tableUrl }));
+});
+
+$(document).ajaxError(function(event, jqxhr, settings, errMsg){
+ if (errMsg === 'abort')
+ return;
+
+ QUnit.test("Ajax error", function(assert){
+ assert.notOk(jqxhr.responseText);
+ });
+});
+
+
+
+
+
+
diff --git a/bitbake/lib/toaster/toastergui/static/js/ui-bootstrap-tpls-0.11.0.js b/bitbake/lib/toaster/toastergui/static/js/ui-bootstrap-tpls-0.11.0.js
new file mode 100644
index 0000000..fa6a861
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/ui-bootstrap-tpls-0.11.0.js
@@ -0,0 +1,10 @@
+/*
+ * angular-ui-bootstrap
+ * http://angular-ui.github.io/bootstrap/
+
+ * Version: 0.11.0 - 2014-05-01
+ * License: MIT
+ */
+angular.module("ui.bootstrap",["ui.bootstrap.tpls","ui.bootstrap.transition","ui.bootstrap.collapse","ui.bootstrap.accordion","ui.bootstrap.alert","ui.bootstrap.bindHtml","ui.bootstrap.buttons","ui.bootstrap.carousel","ui.bootstrap.dateparser","ui.bootstrap.position","ui.bootstrap.datepicker","ui.bootstrap.dropdown","ui.bootstrap.modal","ui.bootstrap.pagination","ui.bootstrap.tooltip","ui.bootstrap.popover","ui.bootstrap.progressbar","ui.bootstrap.rating","ui.bootstrap.tabs","ui.bootstrap.timepicker","ui.bootstrap.typeahead"]),angular.module("ui.bootstrap.tpls",["template/accordion/accordion-group.html","template/accordion/accordion.html","template/alert/alert.html","template/carousel/carousel.html","template/carousel/slide.html","template/datepicker/datepicker.html","template/datepicker/day.html","template/datepicker/month.html","template/datepicker/popup.html","template/datepicker/year.html","template/modal/backdrop.html","template/modal/window.html","template/pagination/pager.html","template/pagination/pagination.html","template/tooltip/tooltip-html-unsafe-popup.html","template/tooltip/tooltip-popup.html","template/popover/popover.html","template/progressbar/bar.html","template/progressbar/progress.html","template/progressbar/progressbar.html","template/rating/rating.html","template/tabs/tab.html","template/tabs/tabset.html","template/timepicker/timepicker.html","template/typeahead/typeahead-match.html","template/typeahead/typeahead-popup.html"]),angular.module("ui.bootstrap.transition",[]).factory("$transition",["$q","$timeout","$rootScope",function(a,b,c){function d(a){for(var b in a)if(void 0!==f.style[b])return a[b]}var e=function(d,f,g){g=g||{};var h=a.defer(),i=e[g.animation?"animationEndEventName":"transitionEndEventName"],j=function(){c.$apply(function(){d.unbind(i,j),h.resolve(d)})};return i&&d.bind(i,j),b(function(){angular.isString(f)?d.addClass(f):angular.isFunction(f)?f(d):angular.isObject(f)&&d.css(f),i||h.resolve(d)}),h.promise.cancel=function(){i&&d.unbind(i,j),h.reject("Transition cancelled")},h.promise},f=document.createElement("trans"),g={WebkitTransition:"webkitTransitionEnd",MozTransition:"transitionend",OTransition:"oTransitionEnd",transition:"transitionend"},h={WebkitTransition:"webkitAnimationEnd",MozTransition:"animationend",OTransition:"oAnimationEnd",transition:"animationend"};return e.transitionEndEventName=d(g),e.animationEndEventName=d(h),e}]),angular.module("ui.bootstrap.collapse",["ui.bootstrap.transition"]).directive("collapse",["$transition",function(a){return{link:function(b,c,d){function e(b){function d(){j===e&&(j=void 0)}var e=a(c,b);return j&&j.cancel(),j=e,e.then(d,d),e}function f(){k?(k=!1,g()):(c.removeClass("collapse").addClass("collapsing"),e({height:c[0].scrollHeight+"px"}).then(g))}function g(){c.removeClass("collapsing"),c.addClass("collapse in"),c.css({height:"auto"})}function h(){if(k)k=!1,i(),c.css({height:0});else{c.css({height:c[0].scrollHeight+"px"});{c[0].offsetWidth}c.removeClass("collapse in").addClass("collapsing"),e({height:0}).then(i)}}function i(){c.removeClass("collapsing"),c.addClass("collapse")}var j,k=!0;b.$watch(d.collapse,function(a){a?h():f()})}}}]),angular.module("ui.bootstrap.accordion",["ui.bootstrap.collapse"]).constant("accordionConfig",{closeOthers:!0}).controller("AccordionController",["$scope","$attrs","accordionConfig",function(a,b,c){this.groups=[],this.closeOthers=function(d){var 
e=angular.isDefined(b.closeOthers)?a.$eval(b.closeOthers):c.closeOthers;e&&angular.forEach(this.groups,function(a){a!==d&&(a.isOpen=!1)})},this.addGroup=function(a){var b=this;this.groups.push(a),a.$on("$destroy",function(){b.removeGroup(a)})},this.removeGroup=function(a){var b=this.groups.indexOf(a);-1!==b&&this.groups.splice(b,1)}}]).directive("accordion",function(){return{restrict:"EA",controller:"AccordionController",transclude:!0,replace:!1,templateUrl:"template/accordion/accordion.html"}}).directive("accordionGroup",function(){return{require:"^accordion",restrict:"EA",transclude:!0,replace:!0,templateUrl:"template/accordion/accordion-group.html",scope:{heading:"@",isOpen:"=?",isDisabled:"=?"},controller:function(){this.setHeading=function(a){this.heading=a}},link:function(a,b,c,d){d.addGroup(a),a.$watch("isOpen",function(b){b&&d.closeOthers(a)}),a.toggleOpen=function(){a.isDisabled||(a.isOpen=!a.isOpen)}}}}).directive("accordionHeading",function(){return{restrict:"EA",transclude:!0,template:"",replace:!0,require:"^accordionGroup",link:function(a,b,c,d,e){d.setHeading(e(a,function(){}))}}}).directive("accordionTransclude",function(){return{require:"^accordionGroup",link:function(a,b,c,d){a.$watch(function(){return d[c.accordionTransclude]},function(a){a&&(b.html(""),b.append(a))})}}}),angular.module("ui.bootstrap.alert",[]).controller("AlertController",["$scope","$attrs",function(a,b){a.closeable="close"in b}]).directive("alert",function(){return{restrict:"EA",controller:"AlertController",templateUrl:"template/alert/alert.html",transclude:!0,replace:!0,scope:{type:"@",close:"&"}}}),angular.module("ui.bootstrap.bindHtml",[]).directive("bindHtmlUnsafe",function(){return function(a,b,c){b.addClass("ng-binding").data("$binding",c.bindHtmlUnsafe),a.$watch(c.bindHtmlUnsafe,function(a){b.html(a||"")})}}),angular.module("ui.bootstrap.buttons",[]).constant("buttonConfig",{activeClass:"active",toggleEvent:"click"}).controller("ButtonsController",["buttonConfig",function(a){this.activeClass=a.activeClass||"active",this.toggleEvent=a.toggleEvent||"click"}]).directive("btnRadio",function(){return{require:["btnRadio","ngModel"],controller:"ButtonsController",link:function(a,b,c,d){var e=d[0],f=d[1];f.$render=function(){b.toggleClass(e.activeClass,angular.equals(f.$modelValue,a.$eval(c.btnRadio)))},b.bind(e.toggleEvent,function(){var d=b.hasClass(e.activeClass);(!d||angular.isDefined(c.uncheckable))&&a.$apply(function(){f.$setViewValue(d?null:a.$eval(c.btnRadio)),f.$render()})})}}}).directive("btnCheckbox",function(){return{require:["btnCheckbox","ngModel"],controller:"ButtonsController",link:function(a,b,c,d){function e(){return g(c.btnCheckboxTrue,!0)}function f(){return g(c.btnCheckboxFalse,!1)}function g(b,c){var d=a.$eval(b);return angular.isDefined(d)?d:c}var h=d[0],i=d[1];i.$render=function(){b.toggleClass(h.activeClass,angular.equals(i.$modelValue,e()))},b.bind(h.toggleEvent,function(){a.$apply(function(){i.$setViewValue(b.hasClass(h.activeClass)?f():e()),i.$render()})})}}}),angular.module("ui.bootstrap.carousel",["ui.bootstrap.transition"]).controller("CarouselController",["$scope","$timeout","$transition",function(a,b,c){function d(){e();var c=+a.interval;!isNaN(c)&&c>=0&&(g=b(f,c))}function e(){g&&(b.cancel(g),g=null)}function f(){h?(a.next(),d()):a.pause()}var g,h,i=this,j=i.slides=a.slides=[],k=-1;i.currentSlide=null;var l=!1;i.select=a.select=function(e,f){function 
g(){if(!l){if(i.currentSlide&&angular.isString(f)&&!a.noTransition&&e.$element){e.$element.addClass(f);{e.$element[0].offsetWidth}angular.forEach(j,function(a){angular.extend(a,{direction:"",entering:!1,leaving:!1,active:!1})}),angular.extend(e,{direction:f,active:!0,entering:!0}),angular.extend(i.currentSlide||{},{direction:f,leaving:!0}),a.$currentTransition=c(e.$element,{}),function(b,c){a.$currentTransition.then(function(){h(b,c)},function(){h(b,c)})}(e,i.currentSlide)}else h(e,i.currentSlide);i.currentSlide=e,k=m,d()}}function h(b,c){angular.extend(b,{direction:"",active:!0,leaving:!1,entering:!1}),angular.extend(c||{},{direction:"",active:!1,leaving:!1,entering:!1}),a.$currentTransition=null}var m=j.indexOf(e);void 0===f&&(f=m>k?"next":"prev"),e&&e!==i.currentSlide&&(a.$currentTransition?(a.$currentTransition.cancel(),b(g)):g())},a.$on("$destroy",function(){l=!0}),i.indexOfSlide=function(a){return j.indexOf(a)},a.next=function(){var b=(k+1)%j.length;return a.$currentTransition?void 0:i.select(j[b],"next")},a.prev=function(){var b=0>k-1?j.length-1:k-1;return a.$currentTransition?void 0:i.select(j[b],"prev")},a.isActive=function(a){return i.currentSlide===a},a.$watch("interval",d),a.$on("$destroy",e),a.play=function(){h||(h=!0,d())},a.pause=function(){a.noPause||(h=!1,e())},i.addSlide=function(b,c){b.$element=c,j.push(b),1===j.length||b.active?(i.select(j[j.length-1]),1==j.length&&a.play()):b.active=!1},i.removeSlide=function(a){var b=j.indexOf(a);j.splice(b,1),j.length>0&&a.active?i.select(b>=j.length?j[b-1]:j[b]):k>b&&k--}}]).directive("carousel",[function(){return{restrict:"EA",transclude:!0,replace:!0,controller:"CarouselController",require:"carousel",templateUrl:"template/carousel/carousel.html",scope:{interval:"=",noTransition:"=",noPause:"="}}}]).directive("slide",function(){return{require:"^carousel",restrict:"EA",transclude:!0,replace:!0,templateUrl:"template/carousel/slide.html",scope:{active:"=?"},link:function(a,b,c,d){d.addSlide(a,b),a.$on("$destroy",function(){d.removeSlide(a)}),a.$watch("active",function(b){b&&d.select(a)})}}}),angular.module("ui.bootstrap.dateparser",[]).service("dateParser",["$locale","orderByFilter",function(a,b){function c(a,b,c){return 1===b&&c>28?29===c&&(a%4===0&&a%100!==0||a%400===0):3===b||5===b||8===b||10===b?31>c:!0}this.parsers={};var d={yyyy:{regex:"\\d{4}",apply:function(a){this.year=+a}},yy:{regex:"\\d{2}",apply:function(a){this.year=+a+2e3}},y:{regex:"\\d{1,4}",apply:function(a){this.year=+a}},MMMM:{regex:a.DATETIME_FORMATS.MONTH.join("|"),apply:function(b){this.month=a.DATETIME_FORMATS.MONTH.indexOf(b)}},MMM:{regex:a.DATETIME_FORMATS.SHORTMONTH.join("|"),apply:function(b){this.month=a.DATETIME_FORMATS.SHORTMONTH.indexOf(b)}},MM:{regex:"0[1-9]|1[0-2]",apply:function(a){this.month=a-1}},M:{regex:"[1-9]|1[0-2]",apply:function(a){this.month=a-1}},dd:{regex:"[0-2][0-9]{1}|3[0-1]{1}",apply:function(a){this.date=+a}},d:{regex:"[1-2]?[0-9]{1}|3[0-1]{1}",apply:function(a){this.date=+a}},EEEE:{regex:a.DATETIME_FORMATS.DAY.join("|")},EEE:{regex:a.DATETIME_FORMATS.SHORTDAY.join("|")}};this.createParser=function(a){var c=[],e=a.split("");return angular.forEach(d,function(b,d){var f=a.indexOf(d);if(f>-1){a=a.split(""),e[f]="("+b.regex+")",a[f]="$";for(var g=f+1,h=f+d.length;h>g;g++)e[g]="",a[g]="$";a=a.join(""),c.push({index:f,apply:b.apply})}}),{regex:new RegExp("^"+e.join("")+"$"),map:b(c,"index")}},this.parse=function(b,d){if(!angular.isString(b))return b;d=a.DATETIME_FORMATS[d]||d,this.parsers[d]||(this.parsers[d]=this.createParser(d));var 
e=this.parsers[d],f=e.regex,g=e.map,h=b.match(f);if(h&&h.length){for(var i,j={year:1900,month:0,date:1,hours:0},k=1,l=h.length;l>k;k++){var m=g[k-1];m.apply&&m.apply.call(j,h[k])}return c(j.year,j.month,j.date)&&(i=new Date(j.year,j.month,j.date,j.hours)),i}}}]),angular.module("ui.bootstrap.position",[]).factory("$position",["$document","$window",function(a,b){function c(a,c){return a.currentStyle?a.currentStyle[c]:b.getComputedStyle?b.getComputedStyle(a)[c]:a.style[c]}function d(a){return"static"===(c(a,"position")||"static")}var e=function(b){for(var c=a[0],e=b.offsetParent||c;e&&e!==c&&d(e);)e=e.offsetParent;return e||c};return{position:function(b){var c=this.offset(b),d={top:0,left:0},f=e(b[0]);f!=a[0]&&(d=this.offset(angular.element(f)),d.top+=f.clientTop-f.scrollTop,d.left+=f.clientLeft-f.scrollLeft);var g=b[0].getBoundingClientRect();return{width:g.width||b.prop("offsetWidth"),height:g.height||b.prop("offsetHeight"),top:c.top-d.top,left:c.left-d.left}},offset:function(c){var d=c[0].getBoundingClientRect();return{width:d.width||c.prop("offsetWidth"),height:d.height||c.prop("offsetHeight"),top:d.top+(b.pageYOffset||a[0].documentElement.scrollTop),left:d.left+(b.pageXOffset||a[0].documentElement.scrollLeft)}},positionElements:function(a,b,c,d){var e,f,g,h,i=c.split("-"),j=i[0],k=i[1]||"center";e=d?this.offset(a):this.position(a),f=b.prop("offsetWidth"),g=b.prop("offsetHeight");var l={center:function(){return e.left+e.width/2-f/2},left:function(){return e.left},right:function(){return e.left+e.width}},m={center:function(){return e.top+e.height/2-g/2},top:function(){return e.top},bottom:function(){return e.top+e.height}};switch(j){case"right":h={top:m[k](),left:l[j]()};break;case"left":h={top:m[k](),left:e.left-f};break;case"bottom":h={top:m[j](),left:l[k]()};break;default:h={top:e.top-g,left:l[k]()}}return h}}}]),angular.module("ui.bootstrap.datepicker",["ui.bootstrap.dateparser","ui.bootstrap.position"]).constant("datepickerConfig",{formatDay:"dd",formatMonth:"MMMM",formatYear:"yyyy",formatDayHeader:"EEE",formatDayTitle:"MMMM yyyy",formatMonthTitle:"yyyy",datepickerMode:"day",minMode:"day",maxMode:"year",showWeeks:!0,startingDay:0,yearRange:20,minDate:null,maxDate:null}).controller("DatepickerController",["$scope","$attrs","$parse","$interpolate","$timeout","$log","dateFilter","datepickerConfig",function(a,b,c,d,e,f,g,h){var i=this,j={$setViewValue:angular.noop};this.modes=["day","month","year"],angular.forEach(["formatDay","formatMonth","formatYear","formatDayHeader","formatDayTitle","formatMonthTitle","minMode","maxMode","showWeeks","startingDay","yearRange"],function(c,e){i[c]=angular.isDefined(b[c])?8>e?d(b[c])(a.$parent):a.$parent.$eval(b[c]):h[c]}),angular.forEach(["minDate","maxDate"],function(d){b[d]?a.$parent.$watch(c(b[d]),function(a){i[d]=a?new Date(a):null,i.refreshView()}):i[d]=h[d]?new Date(h[d]):null}),a.datepickerMode=a.datepickerMode||h.datepickerMode,a.uniqueId="datepicker-"+a.$id+"-"+Math.floor(1e4*Math.random()),this.activeDate=angular.isDefined(b.initDate)?a.$parent.$eval(b.initDate):new Date,a.isActive=function(b){return 0===i.compare(b.date,i.activeDate)?(a.activeDateId=b.uid,!0):!1},this.init=function(a){j=a,j.$render=function(){i.render()}},this.render=function(){if(j.$modelValue){var a=new Date(j.$modelValue),b=!isNaN(a);b?this.activeDate=a:f.error('Datepicker directive: "ng-model" value must be a Date object, a number of milliseconds since 01.01.1970 or a string representing an RFC2822 or ISO 8601 
date.'),j.$setValidity("date",b)}this.refreshView()},this.refreshView=function(){if(this.element){this._refreshView();var a=j.$modelValue?new Date(j.$modelValue):null;j.$setValidity("date-disabled",!a||this.element&&!this.isDisabled(a))}},this.createDateObject=function(a,b){var c=j.$modelValue?new Date(j.$modelValue):null;return{date:a,label:g(a,b),selected:c&&0===this.compare(a,c),disabled:this.isDisabled(a),current:0===this.compare(a,new Date)}},this.isDisabled=function(c){return this.minDate&&this.compare(c,this.minDate)<0||this.maxDate&&this.compare(c,this.maxDate)>0||b.dateDisabled&&a.dateDisabled({date:c,mode:a.datepickerMode})},this.split=function(a,b){for(var c=[];a.length>0;)c.push(a.splice(0,b));return c},a.select=function(b){if(a.datepickerMode===i.minMode){var c=j.$modelValue?new Date(j.$modelValue):new Date(0,0,0,0,0,0,0);c.setFullYear(b.getFullYear(),b.getMonth(),b.getDate()),j.$setViewValue(c),j.$render()}else i.activeDate=b,a.datepickerMode=i.modes[i.modes.indexOf(a.datepickerMode)-1]},a.move=function(a){var b=i.activeDate.getFullYear()+a*(i.step.years||0),c=i.activeDate.getMonth()+a*(i.step.months||0);i.activeDate.setFullYear(b,c,1),i.refreshView()},a.toggleMode=function(b){b=b||1,a.datepickerMode===i.maxMode&&1===b||a.datepickerMode===i.minMode&&-1===b||(a.datepickerMode=i.modes[i.modes.indexOf(a.datepickerMode)+b])},a.keys={13:"enter",32:"space",33:"pageup",34:"pagedown",35:"end",36:"home",37:"left",38:"up",39:"right",40:"down"};var k=function(){e(function(){i.element[0].focus()},0,!1)};a.$on("datepicker.focus",k),a.keydown=function(b){var c=a.keys[b.which];if(c&&!b.shiftKey&&!b.altKey)if(b.preventDefault(),b.stopPropagation(),"enter"===c||"space"===c){if(i.isDisabled(i.activeDate))return;a.select(i.activeDate),k()}else!b.ctrlKey||"up"!==c&&"down"!==c?(i.handleKeyDown(c,b),i.refreshView()):(a.toggleMode("up"===c?1:-1),k())}}]).directive("datepicker",function(){return{restrict:"EA",replace:!0,templateUrl:"template/datepicker/datepicker.html",scope:{datepickerMode:"=?",dateDisabled:"&"},require:["datepicker","?^ngModel"],controller:"DatepickerController",link:function(a,b,c,d){var e=d[0],f=d[1];f&&e.init(f)}}}).directive("daypicker",["dateFilter",function(a){return{restrict:"EA",replace:!0,templateUrl:"template/datepicker/day.html",require:"^datepicker",link:function(b,c,d,e){function f(a,b){return 1!==b||a%4!==0||a%100===0&&a%400!==0?i[b]:29}function g(a,b){var c=new Array(b),d=new Date(a),e=0;for(d.setHours(12);b>e;)c[e++]=new Date(d),d.setDate(d.getDate()+1);return c}function h(a){var b=new Date(a);b.setDate(b.getDate()+4-(b.getDay()||7));var c=b.getTime();return b.setMonth(0),b.setDate(1),Math.floor(Math.round((c-b)/864e5)/7)+1}b.showWeeks=e.showWeeks,e.step={months:1},e.element=c;var i=[31,28,31,30,31,30,31,31,30,31,30,31];e._refreshView=function(){var c=e.activeDate.getFullYear(),d=e.activeDate.getMonth(),f=new Date(c,d,1),i=e.startingDay-f.getDay(),j=i>0?7-i:-i,k=new Date(f);j>0&&k.setDate(-j+1);for(var l=g(k,42),m=0;42>m;m++)l[m]=angular.extend(e.createDateObject(l[m],e.formatDay),{secondary:l[m].getMonth()!==d,uid:b.uniqueId+"-"+m});b.labels=new Array(7);for(var n=0;7>n;n++)b.labels[n]={abbr:a(l[n].date,e.formatDayHeader),full:a(l[n].date,"EEEE")};if(b.title=a(e.activeDate,e.formatDayTitle),b.rows=e.split(l,7),b.showWeeks){b.weekNumbers=[];for(var o=h(b.rows[0][0].date),p=b.rows.length;b.weekNumbers.push(o++)<p;);}},e.compare=function(a,b){return new Date(a.getFullYear(),a.getMonth(),a.getDate())-new 
Date(b.getFullYear(),b.getMonth(),b.getDate())},e.handleKeyDown=function(a){var b=e.activeDate.getDate();if("left"===a)b-=1;else if("up"===a)b-=7;else if("right"===a)b+=1;else if("down"===a)b+=7;else if("pageup"===a||"pagedown"===a){var c=e.activeDate.getMonth()+("pageup"===a?-1:1);e.activeDate.setMonth(c,1),b=Math.min(f(e.activeDate.getFullYear(),e.activeDate.getMonth()),b)}else"home"===a?b=1:"end"===a&&(b=f(e.activeDate.getFullYear(),e.activeDate.getMonth()));e.activeDate.setDate(b)},e.refreshView()}}}]).directive("monthpicker",["dateFilter",function(a){return{restrict:"EA",replace:!0,templateUrl:"template/datepicker/month.html",require:"^datepicker",link:function(b,c,d,e){e.step={years:1},e.element=c,e._refreshView=function(){for(var c=new Array(12),d=e.activeDate.getFullYear(),f=0;12>f;f++)c[f]=angular.extend(e.createDateObject(new Date(d,f,1),e.formatMonth),{uid:b.uniqueId+"-"+f});b.title=a(e.activeDate,e.formatMonthTitle),b.rows=e.split(c,3)},e.compare=function(a,b){return new Date(a.getFullYear(),a.getMonth())-new Date(b.getFullYear(),b.getMonth())},e.handleKeyDown=function(a){var b=e.activeDate.getMonth();if("left"===a)b-=1;else if("up"===a)b-=3;else if("right"===a)b+=1;else if("down"===a)b+=3;else if("pageup"===a||"pagedown"===a){var c=e.activeDate.getFullYear()+("pageup"===a?-1:1);e.activeDate.setFullYear(c)}else"home"===a?b=0:"end"===a&&(b=11);e.activeDate.setMonth(b)},e.refreshView()}}}]).directive("yearpicker",["dateFilter",function(){return{restrict:"EA",replace:!0,templateUrl:"template/datepicker/year.html",require:"^datepicker",link:function(a,b,c,d){function e(a){return parseInt((a-1)/f,10)*f+1}var f=d.yearRange;d.step={years:f},d.element=b,d._refreshView=function(){for(var b=new Array(f),c=0,g=e(d.activeDate.getFullYear());f>c;c++)b[c]=angular.extend(d.createDateObject(new Date(g+c,0,1),d.formatYear),{uid:a.uniqueId+"-"+c});a.title=[b[0].label,b[f-1].label].join(" - "),a.rows=d.split(b,5)},d.compare=function(a,b){return a.getFullYear()-b.getFullYear()},d.handleKeyDown=function(a){var b=d.activeDate.getFullYear();"left"===a?b-=1:"up"===a?b-=5:"right"===a?b+=1:"down"===a?b+=5:"pageup"===a||"pagedown"===a?b+=("pageup"===a?-1:1)*d.step.years:"home"===a?b=e(d.activeDate.getFullYear()):"end"===a&&(b=e(d.activeDate.getFullYear())+f-1),d.activeDate.setFullYear(b)},d.refreshView()}}}]).constant("datepickerPopupConfig",{datepickerPopup:"yyyy-MM-dd",currentText:"Today",clearText:"Clear",closeText:"Done",closeOnDateSelection:!0,appendToBody:!1,showButtonBar:!0}).directive("datepickerPopup",["$compile","$parse","$document","$position","dateFilter","dateParser","datepickerPopupConfig",function(a,b,c,d,e,f,g){return{restrict:"EA",require:"ngModel",scope:{isOpen:"=?",currentText:"@",clearText:"@",closeText:"@",dateDisabled:"&"},link:function(h,i,j,k){function l(a){return a.replace(/([A-Z])/g,function(a){return"-"+a.toLowerCase()})}function m(a){if(a){if(angular.isDate(a)&&!isNaN(a))return k.$setValidity("date",!0),a;if(angular.isString(a)){var b=f.parse(a,n)||new Date(a);return isNaN(b)?void k.$setValidity("date",!1):(k.$setValidity("date",!0),b)}return void k.$setValidity("date",!1)}return k.$setValidity("date",!0),null}var n,o=angular.isDefined(j.closeOnDateSelection)?h.$parent.$eval(j.closeOnDateSelection):g.closeOnDateSelection,p=angular.isDefined(j.datepickerAppendToBody)?h.$parent.$eval(j.datepickerAppendToBody):g.appendToBody;h.showButtonBar=angular.isDefined(j.showButtonBar)?h.$parent.$eval(j.showButtonBar):g.showButtonBar,h.getText=function(a){return 
h[a+"Text"]||g[a+"Text"]},j.$observe("datepickerPopup",function(a){n=a||g.datepickerPopup,k.$render()});var q=angular.element("<div datepicker-popup-wrap><div datepicker></div></div>");q.attr({"ng-model":"date","ng-change":"dateSelection()"});var r=angular.element(q.children()[0]);j.datepickerOptions&&angular.forEach(h.$parent.$eval(j.datepickerOptions),function(a,b){r.attr(l(b),a)}),angular.forEach(["minDate","maxDate"],function(a){j[a]&&(h.$parent.$watch(b(j[a]),function(b){h[a]=b}),r.attr(l(a),a))}),j.dateDisabled&&r.attr("date-disabled","dateDisabled({ date: date, mode: mode })"),k.$parsers.unshift(m),h.dateSelection=function(a){angular.isDefined(a)&&(h.date=a),k.$setViewValue(h.date),k.$render(),o&&(h.isOpen=!1,i[0].focus())},i.bind("input change keyup",function(){h.$apply(function(){h.date=k.$modelValue})}),k.$render=function(){var a=k.$viewValue?e(k.$viewValue,n):"";i.val(a),h.date=m(k.$modelValue)};var s=function(a){h.isOpen&&a.target!==i[0]&&h.$apply(function(){h.isOpen=!1})},t=function(a){h.keydown(a)};i.bind("keydown",t),h.keydown=function(a){27===a.which?(a.preventDefault(),a.stopPropagation(),h.close()):40!==a.which||h.isOpen||(h.isOpen=!0)},h.$watch("isOpen",function(a){a?(h.$broadcast("datepicker.focus"),h.position=p?d.offset(i):d.position(i),h.position.top=h.position.top+i.prop("offsetHeight"),c.bind("click",s)):c.unbind("click",s)}),h.select=function(a){if("today"===a){var b=new Date;angular.isDate(k.$modelValue)?(a=new Date(k.$modelValue),a.setFullYear(b.getFullYear(),b.getMonth(),b.getDate())):a=new Date(b.setHours(0,0,0,0))}h.dateSelection(a)},h.close=function(){h.isOpen=!1,i[0].focus()};var u=a(q)(h);p?c.find("body").append(u):i.after(u),h.$on("$destroy",function(){u.remove(),i.unbind("keydown",t),c.unbind("click",s)})}}}]).directive("datepickerPopupWrap",function(){return{restrict:"EA",replace:!0,transclude:!0,templateUrl:"template/datepicker/popup.html",link:function(a,b){b.bind("click",function(a){a.preventDefault(),a.stopPropagation()})}}}),angular.module("ui.bootstrap.dropdown",[]).constant("dropdownConfig",{openClass:"open"}).service("dropdownService",["$document",function(a){var b=null;this.open=function(e){b||(a.bind("click",c),a.bind("keydown",d)),b&&b!==e&&(b.isOpen=!1),b=e},this.close=function(e){b===e&&(b=null,a.unbind("click",c),a.unbind("keydown",d))};var c=function(a){a&&a.isDefaultPrevented()||b.$apply(function(){b.isOpen=!1})},d=function(a){27===a.which&&(b.focusToggleElement(),c())}}]).controller("DropdownController",["$scope","$attrs","$parse","dropdownConfig","dropdownService","$animate",function(a,b,c,d,e,f){var g,h=this,i=a.$new(),j=d.openClass,k=angular.noop,l=b.onToggle?c(b.onToggle):angular.noop;this.init=function(d){h.$element=d,b.isOpen&&(g=c(b.isOpen),k=g.assign,a.$watch(g,function(a){i.isOpen=!!a}))},this.toggle=function(a){return i.isOpen=arguments.length?!!a:!i.isOpen},this.isOpen=function(){return i.isOpen},i.focusToggleElement=function(){h.toggleElement&&h.toggleElement[0].focus()},i.$watch("isOpen",function(b,c){f[b?"addClass":"removeClass"](h.$element,j),b?(i.focusToggleElement(),e.open(i)):e.close(i),k(a,b),angular.isDefined(b)&&b!==c&&l(a,{open:!!b})}),a.$on("$locationChangeSuccess",function(){i.isOpen=!1}),a.$on("$destroy",function(){i.$destroy()})}]).directive("dropdown",function(){return{restrict:"CA",controller:"DropdownController",link:function(a,b,c,d){d.init(b)}}}).directive("dropdownToggle",function(){return{restrict:"CA",require:"?^dropdown",link:function(a,b,c,d){if(d){d.toggleElement=b;var 
e=function(e){e.preventDefault(),b.hasClass("disabled")||c.disabled||a.$apply(function(){d.toggle()})};b.bind("click",e),b.attr({"aria-haspopup":!0,"aria-expanded":!1}),a.$watch(d.isOpen,function(a){b.attr("aria-expanded",!!a)}),a.$on("$destroy",function(){b.unbind("click",e)})}}}}),angular.module("ui.bootstrap.modal",["ui.bootstrap.transition"]).factory("$$stackedMap",function(){return{createNew:function(){var a=[];return{add:function(b,c){a.push({key:b,value:c})},get:function(b){for(var c=0;c<a.length;c++)if(b==a[c].key)return a[c]},keys:function(){for(var b=[],c=0;c<a.length;c++)b.push(a[c].key);return b},top:function(){return a[a.length-1]},remove:function(b){for(var c=-1,d=0;d<a.length;d++)if(b==a[d].key){c=d;break}return a.splice(c,1)[0]},removeTop:function(){return a.splice(a.length-1,1)[0]},length:function(){return a.length}}}}}).directive("modalBackdrop",["$timeout",function(a){return{restrict:"EA",replace:!0,templateUrl:"template/modal/backdrop.html",link:function(b){b.animate=!1,a(function(){b.animate=!0})}}}]).directive("modalWindow",["$modalStack","$timeout",function(a,b){return{restrict:"EA",scope:{index:"@",animate:"="},replace:!0,transclude:!0,templateUrl:function(a,b){return b.templateUrl||"template/modal/window.html"},link:function(c,d,e){d.addClass(e.windowClass||""),c.size=e.size,b(function(){c.animate=!0,d[0].focus()}),c.close=function(b){var c=a.getTop();c&&c.value.backdrop&&"static"!=c.value.backdrop&&b.target===b.currentTarget&&(b.preventDefault(),b.stopPropagation(),a.dismiss(c.key,"backdrop click"))}}}}]).factory("$modalStack",["$transition","$timeout","$document","$compile","$rootScope","$$stackedMap",function(a,b,c,d,e,f){function g(){for(var a=-1,b=n.keys(),c=0;c<b.length;c++)n.get(b[c]).value.backdrop&&(a=c);return a}function h(a){var b=c.find("body").eq(0),d=n.get(a).value;n.remove(a),j(d.modalDomEl,d.modalScope,300,function(){d.modalScope.$destroy(),b.toggleClass(m,n.length()>0),i()})}function i(){if(k&&-1==g()){var a=l;j(k,l,150,function(){a.$destroy(),a=null}),k=void 0,l=void 0}}function j(c,d,e,f){function g(){g.done||(g.done=!0,c.remove(),f&&f())}d.animate=!1;var h=a.transitionEndEventName;if(h){var i=b(g,e);c.bind(h,function(){b.cancel(i),g(),d.$apply()})}else b(g,0)}var k,l,m="modal-open",n=f.createNew(),o={};return e.$watch(g,function(a){l&&(l.index=a)}),c.bind("keydown",function(a){var b;27===a.which&&(b=n.top(),b&&b.value.keyboard&&(a.preventDefault(),e.$apply(function(){o.dismiss(b.key,"escape key press")})))}),o.open=function(a,b){n.add(a,{deferred:b.deferred,modalScope:b.scope,backdrop:b.backdrop,keyboard:b.keyboard});var f=c.find("body").eq(0),h=g();h>=0&&!k&&(l=e.$new(!0),l.index=h,k=d("<div modal-backdrop></div>")(l),f.append(k));var i=angular.element("<div modal-window></div>");i.attr({"template-url":b.windowTemplateUrl,"window-class":b.windowClass,size:b.size,index:n.length()-1,animate:"animate"}).html(b.content);var j=d(i)(b.scope);n.top().value.modalDomEl=j,f.append(j),f.addClass(m)},o.close=function(a,b){var c=n.get(a).value;c&&(c.deferred.resolve(b),h(a))},o.dismiss=function(a,b){var c=n.get(a).value;c&&(c.deferred.reject(b),h(a))},o.dismissAll=function(a){for(var b=this.getTop();b;)this.dismiss(b.key,a),b=this.getTop()},o.getTop=function(){return n.top()},o}]).provider("$modal",function(){var a={options:{backdrop:!0,keyboard:!0},$get:["$injector","$rootScope","$q","$http","$templateCache","$controller","$modalStack",function(b,c,d,e,f,g,h){function i(a){return 
a.template?d.when(a.template):e.get(a.templateUrl,{cache:f}).then(function(a){return a.data})}function j(a){var c=[];return angular.forEach(a,function(a){(angular.isFunction(a)||angular.isArray(a))&&c.push(d.when(b.invoke(a)))}),c}var k={};return k.open=function(b){var e=d.defer(),f=d.defer(),k={result:e.promise,opened:f.promise,close:function(a){h.close(k,a)},dismiss:function(a){h.dismiss(k,a)}};if(b=angular.extend({},a.options,b),b.resolve=b.resolve||{},!b.template&&!b.templateUrl)throw new Error("One of template or templateUrl options is required.");var l=d.all([i(b)].concat(j(b.resolve)));return l.then(function(a){var d=(b.scope||c).$new();d.$close=k.close,d.$dismiss=k.dismiss;var f,i={},j=1;b.controller&&(i.$scope=d,i.$modalInstance=k,angular.forEach(b.resolve,function(b,c){i[c]=a[j++]}),f=g(b.controller,i)),h.open(k,{scope:d,deferred:e,content:a[0],backdrop:b.backdrop,keyboard:b.keyboard,windowClass:b.windowClass,windowTemplateUrl:b.windowTemplateUrl,size:b.size})},function(a){e.reject(a)}),l.then(function(){f.resolve(!0)},function(){f.reject(!1)}),k},k}]};return a}),angular.module("ui.bootstrap.pagination",[]).controller("PaginationController",["$scope","$attrs","$parse",function(a,b,c){var d=this,e={$setViewValue:angular.noop},f=b.numPages?c(b.numPages).assign:angular.noop;this.init=function(f,g){e=f,this.config=g,e.$render=function(){d.render()},b.itemsPerPage?a.$parent.$watch(c(b.itemsPerPage),function(b){d.itemsPerPage=parseInt(b,10),a.totalPages=d.calculateTotalPages()}):this.itemsPerPage=g.itemsPerPage},this.calculateTotalPages=function(){var b=this.itemsPerPage<1?1:Math.ceil(a.totalItems/this.itemsPerPage);return Math.max(b||0,1)},this.render=function(){a.page=parseInt(e.$viewValue,10)||1},a.selectPage=function(b){a.page!==b&&b>0&&b<=a.totalPages&&(e.$setViewValue(b),e.$render())},a.getText=function(b){return a[b+"Text"]||d.config[b+"Text"]},a.noPrevious=function(){return 1===a.page},a.noNext=function(){return a.page===a.totalPages},a.$watch("totalItems",function(){a.totalPages=d.calculateTotalPages()}),a.$watch("totalPages",function(b){f(a.$parent,b),a.page>b?a.selectPage(b):e.$render()})}]).constant("paginationConfig",{itemsPerPage:10,boundaryLinks:!1,directionLinks:!0,firstText:"First",previousText:"Previous",nextText:"Next",lastText:"Last",rotate:!0}).directive("pagination",["$parse","paginationConfig",function(a,b){return{restrict:"EA",scope:{totalItems:"=",firstText:"@",previousText:"@",nextText:"@",lastText:"@"},require:["pagination","?ngModel"],controller:"PaginationController",templateUrl:"template/pagination/pagination.html",replace:!0,link:function(c,d,e,f){function g(a,b,c){return{number:a,text:b,active:c}}function h(a,b){var c=[],d=1,e=b,f=angular.isDefined(k)&&b>k;f&&(l?(d=Math.max(a-Math.floor(k/2),1),e=d+k-1,e>b&&(e=b,d=e-k+1)):(d=(Math.ceil(a/k)-1)*k+1,e=Math.min(d+k-1,b)));for(var h=d;e>=h;h++){var i=g(h,h,h===a);c.push(i)}if(f&&!l){if(d>1){var j=g(d-1,"...",!1);c.unshift(j)}if(b>e){var m=g(e+1,"...",!1);c.push(m)}}return c}var i=f[0],j=f[1];if(j){var k=angular.isDefined(e.maxSize)?c.$parent.$eval(e.maxSize):b.maxSize,l=angular.isDefined(e.rotate)?c.$parent.$eval(e.rotate):b.rotate;c.boundaryLinks=angular.isDefined(e.boundaryLinks)?c.$parent.$eval(e.boundaryLinks):b.boundaryLinks,c.directionLinks=angular.isDefined(e.directionLinks)?c.$parent.$eval(e.directionLinks):b.directionLinks,i.init(j,b),e.maxSize&&c.$parent.$watch(a(e.maxSize),function(a){k=parseInt(a,10),i.render()});var 
m=i.render;i.render=function(){m(),c.page>0&&c.page<=c.totalPages&&(c.pages=h(c.page,c.totalPages))}}}}}]).constant("pagerConfig",{itemsPerPage:10,previousText:"« Previous",nextText:"Next »",align:!0}).directive("pager",["pagerConfig",function(a){return{restrict:"EA",scope:{totalItems:"=",previousText:"@",nextText:"@"},require:["pager","?ngModel"],controller:"PaginationController",templateUrl:"template/pagination/pager.html",replace:!0,link:function(b,c,d,e){var f=e[0],g=e[1];g&&(b.align=angular.isDefined(d.align)?b.$parent.$eval(d.align):a.align,f.init(g,a))}}}]),angular.module("ui.bootstrap.tooltip",["ui.bootstrap.position","ui.bootstrap.bindHtml"]).provider("$tooltip",function(){function a(a){var b=/[A-Z]/g,c="-";
+return a.replace(b,function(a,b){return(b?c:"")+a.toLowerCase()})}var b={placement:"top",animation:!0,popupDelay:0},c={mouseenter:"mouseleave",click:"click",focus:"blur"},d={};this.options=function(a){angular.extend(d,a)},this.setTriggers=function(a){angular.extend(c,a)},this.$get=["$window","$compile","$timeout","$parse","$document","$position","$interpolate",function(e,f,g,h,i,j,k){return function(e,l,m){function n(a){var b=a||o.trigger||m,d=c[b]||b;return{show:b,hide:d}}var o=angular.extend({},b,d),p=a(e),q=k.startSymbol(),r=k.endSymbol(),s="<div "+p+'-popup title="'+q+"tt_title"+r+'" content="'+q+"tt_content"+r+'" placement="'+q+"tt_placement"+r+'" animation="tt_animation" is-open="tt_isOpen"></div>';return{restrict:"EA",scope:!0,compile:function(){var a=f(s);return function(b,c,d){function f(){b.tt_isOpen?m():k()}function k(){(!y||b.$eval(d[l+"Enable"]))&&(b.tt_popupDelay?v||(v=g(p,b.tt_popupDelay,!1),v.then(function(a){a()})):p()())}function m(){b.$apply(function(){q()})}function p(){return v=null,u&&(g.cancel(u),u=null),b.tt_content?(r(),t.css({top:0,left:0,display:"block"}),w?i.find("body").append(t):c.after(t),z(),b.tt_isOpen=!0,b.$digest(),z):angular.noop}function q(){b.tt_isOpen=!1,g.cancel(v),v=null,b.tt_animation?u||(u=g(s,500)):s()}function r(){t&&s(),t=a(b,function(){}),b.$digest()}function s(){u=null,t&&(t.remove(),t=null)}var t,u,v,w=angular.isDefined(o.appendToBody)?o.appendToBody:!1,x=n(void 0),y=angular.isDefined(d[l+"Enable"]),z=function(){var a=j.positionElements(c,t,b.tt_placement,w);a.top+="px",a.left+="px",t.css(a)};b.tt_isOpen=!1,d.$observe(e,function(a){b.tt_content=a,!a&&b.tt_isOpen&&q()}),d.$observe(l+"Title",function(a){b.tt_title=a}),d.$observe(l+"Placement",function(a){b.tt_placement=angular.isDefined(a)?a:o.placement}),d.$observe(l+"PopupDelay",function(a){var c=parseInt(a,10);b.tt_popupDelay=isNaN(c)?o.popupDelay:c});var A=function(){c.unbind(x.show,k),c.unbind(x.hide,m)};d.$observe(l+"Trigger",function(a){A(),x=n(a),x.show===x.hide?c.bind(x.show,f):(c.bind(x.show,k),c.bind(x.hide,m))});var B=b.$eval(d[l+"Animation"]);b.tt_animation=angular.isDefined(B)?!!B:o.animation,d.$observe(l+"AppendToBody",function(a){w=angular.isDefined(a)?h(a)(b):w}),w&&b.$on("$locationChangeSuccess",function(){b.tt_isOpen&&q()}),b.$on("$destroy",function(){g.cancel(u),g.cancel(v),A(),s()})}}}}}]}).directive("tooltipPopup",function(){return{restrict:"EA",replace:!0,scope:{content:"@",placement:"@",animation:"&",isOpen:"&"},templateUrl:"template/tooltip/tooltip-popup.html"}}).directive("tooltip",["$tooltip",function(a){return a("tooltip","tooltip","mouseenter")}]).directive("tooltipHtmlUnsafePopup",function(){return{restrict:"EA",replace:!0,scope:{content:"@",placement:"@",animation:"&",isOpen:"&"},templateUrl:"template/tooltip/tooltip-html-unsafe-popup.html"}}).directive("tooltipHtmlUnsafe",["$tooltip",function(a){return a("tooltipHtmlUnsafe","tooltip","mouseenter")}]),angular.module("ui.bootstrap.popover",["ui.bootstrap.tooltip"]).directive("popoverPopup",function(){return{restrict:"EA",replace:!0,scope:{title:"@",content:"@",placement:"@",animation:"&",isOpen:"&"},templateUrl:"template/popover/popover.html"}}).directive("popover",["$tooltip",function(a){return a("popover","popover","click")}]),angular.module("ui.bootstrap.progressbar",[]).constant("progressConfig",{animate:!0,max:100}).controller("ProgressController",["$scope","$attrs","progressConfig",function(a,b,c){var 
d=this,e=angular.isDefined(b.animate)?a.$parent.$eval(b.animate):c.animate;this.bars=[],a.max=angular.isDefined(b.max)?a.$parent.$eval(b.max):c.max,this.addBar=function(b,c){e||c.css({transition:"none"}),this.bars.push(b),b.$watch("value",function(c){b.percent=+(100*c/a.max).toFixed(2)}),b.$on("$destroy",function(){c=null,d.removeBar(b)})},this.removeBar=function(a){this.bars.splice(this.bars.indexOf(a),1)}}]).directive("progress",function(){return{restrict:"EA",replace:!0,transclude:!0,controller:"ProgressController",require:"progress",scope:{},templateUrl:"template/progressbar/progress.html"}}).directive("bar",function(){return{restrict:"EA",replace:!0,transclude:!0,require:"^progress",scope:{value:"=",type:"@"},templateUrl:"template/progressbar/bar.html",link:function(a,b,c,d){d.addBar(a,b)}}}).directive("progressbar",function(){return{restrict:"EA",replace:!0,transclude:!0,controller:"ProgressController",scope:{value:"=",type:"@"},templateUrl:"template/progressbar/progressbar.html",link:function(a,b,c,d){d.addBar(a,angular.element(b.children()[0]))}}}),angular.module("ui.bootstrap.rating",[]).constant("ratingConfig",{max:5,stateOn:null,stateOff:null}).controller("RatingController",["$scope","$attrs","ratingConfig",function(a,b,c){var d={$setViewValue:angular.noop};this.init=function(e){d=e,d.$render=this.render,this.stateOn=angular.isDefined(b.stateOn)?a.$parent.$eval(b.stateOn):c.stateOn,this.stateOff=angular.isDefined(b.stateOff)?a.$parent.$eval(b.stateOff):c.stateOff;var f=angular.isDefined(b.ratingStates)?a.$parent.$eval(b.ratingStates):new Array(angular.isDefined(b.max)?a.$parent.$eval(b.max):c.max);a.range=this.buildTemplateObjects(f)},this.buildTemplateObjects=function(a){for(var b=0,c=a.length;c>b;b++)a[b]=angular.extend({index:b},{stateOn:this.stateOn,stateOff:this.stateOff},a[b]);return a},a.rate=function(b){!a.readonly&&b>=0&&b<=a.range.length&&(d.$setViewValue(b),d.$render())},a.enter=function(b){a.readonly||(a.value=b),a.onHover({value:b})},a.reset=function(){a.value=d.$viewValue,a.onLeave()},a.onKeydown=function(b){/(37|38|39|40)/.test(b.which)&&(b.preventDefault(),b.stopPropagation(),a.rate(a.value+(38===b.which||39===b.which?1:-1)))},this.render=function(){a.value=d.$viewValue}}]).directive("rating",function(){return{restrict:"EA",require:["rating","ngModel"],scope:{readonly:"=?",onHover:"&",onLeave:"&"},controller:"RatingController",templateUrl:"template/rating/rating.html",replace:!0,link:function(a,b,c,d){var e=d[0],f=d[1];f&&e.init(f)}}}),angular.module("ui.bootstrap.tabs",[]).controller("TabsetController",["$scope",function(a){var b=this,c=b.tabs=a.tabs=[];b.select=function(a){angular.forEach(c,function(b){b.active&&b!==a&&(b.active=!1,b.onDeselect())}),a.active=!0,a.onSelect()},b.addTab=function(a){c.push(a),1===c.length?a.active=!0:a.active&&b.select(a)},b.removeTab=function(a){var d=c.indexOf(a);if(a.active&&c.length>1){var 
e=d==c.length-1?d-1:d+1;b.select(c[e])}c.splice(d,1)}}]).directive("tabset",function(){return{restrict:"EA",transclude:!0,replace:!0,scope:{type:"@"},controller:"TabsetController",templateUrl:"template/tabs/tabset.html",link:function(a,b,c){a.vertical=angular.isDefined(c.vertical)?a.$parent.$eval(c.vertical):!1,a.justified=angular.isDefined(c.justified)?a.$parent.$eval(c.justified):!1}}}).directive("tab",["$parse",function(a){return{require:"^tabset",restrict:"EA",replace:!0,templateUrl:"template/tabs/tab.html",transclude:!0,scope:{active:"=?",heading:"@",onSelect:"&select",onDeselect:"&deselect"},controller:function(){},compile:function(b,c,d){return function(b,c,e,f){b.$watch("active",function(a){a&&f.select(b)}),b.disabled=!1,e.disabled&&b.$parent.$watch(a(e.disabled),function(a){b.disabled=!!a}),b.select=function(){b.disabled||(b.active=!0)},f.addTab(b),b.$on("$destroy",function(){f.removeTab(b)}),b.$transcludeFn=d}}}}]).directive("tabHeadingTransclude",[function(){return{restrict:"A",require:"^tab",link:function(a,b){a.$watch("headingElement",function(a){a&&(b.html(""),b.append(a))})}}}]).directive("tabContentTransclude",function(){function a(a){return a.tagName&&(a.hasAttribute("tab-heading")||a.hasAttribute("data-tab-heading")||"tab-heading"===a.tagName.toLowerCase()||"data-tab-heading"===a.tagName.toLowerCase())}return{restrict:"A",require:"^tabset",link:function(b,c,d){var e=b.$eval(d.tabContentTransclude);e.$transcludeFn(e.$parent,function(b){angular.forEach(b,function(b){a(b)?e.headingElement=b:c.append(b)})})}}}),angular.module("ui.bootstrap.timepicker",[]).constant("timepickerConfig",{hourStep:1,minuteStep:1,showMeridian:!0,meridians:null,readonlyInput:!1,mousewheel:!0}).controller("TimepickerController",["$scope","$attrs","$parse","$log","$locale","timepickerConfig",function(a,b,c,d,e,f){function g(){var b=parseInt(a.hours,10),c=a.showMeridian?b>0&&13>b:b>=0&&24>b;return c?(a.showMeridian&&(12===b&&(b=0),a.meridian===p[1]&&(b+=12)),b):void 0}function h(){var b=parseInt(a.minutes,10);return b>=0&&60>b?b:void 0}function i(a){return angular.isDefined(a)&&a.toString().length<2?"0"+a:a}function j(a){k(),o.$setViewValue(new Date(n)),l(a)}function k(){o.$setValidity("time",!0),a.invalidHours=!1,a.invalidMinutes=!1}function l(b){var c=n.getHours(),d=n.getMinutes();a.showMeridian&&(c=0===c||12===c?12:c%12),a.hours="h"===b?c:i(c),a.minutes="m"===b?d:i(d),a.meridian=n.getHours()<12?p[0]:p[1]}function m(a){var b=new Date(n.getTime()+6e4*a);n.setHours(b.getHours(),b.getMinutes()),j()}var n=new Date,o={$setViewValue:angular.noop},p=angular.isDefined(b.meridians)?a.$parent.$eval(b.meridians):f.meridians||e.DATETIME_FORMATS.AMPMS;this.init=function(c,d){o=c,o.$render=this.render;var e=d.eq(0),g=d.eq(1),h=angular.isDefined(b.mousewheel)?a.$parent.$eval(b.mousewheel):f.mousewheel;h&&this.setupMousewheelEvents(e,g),a.readonlyInput=angular.isDefined(b.readonlyInput)?a.$parent.$eval(b.readonlyInput):f.readonlyInput,this.setupInputEvents(e,g)};var q=f.hourStep;b.hourStep&&a.$parent.$watch(c(b.hourStep),function(a){q=parseInt(a,10)});var r=f.minuteStep;b.minuteStep&&a.$parent.$watch(c(b.minuteStep),function(a){r=parseInt(a,10)}),a.showMeridian=f.showMeridian,b.showMeridian&&a.$parent.$watch(c(b.showMeridian),function(b){if(a.showMeridian=!!b,o.$error.time){var c=g(),d=h();angular.isDefined(c)&&angular.isDefined(d)&&(n.setHours(c),j())}else l()}),this.setupMousewheelEvents=function(b,c){var d=function(a){a.originalEvent&&(a=a.originalEvent);var b=a.wheelDelta?a.wheelDelta:-a.deltaY;return 
a.detail||b>0};b.bind("mousewheel wheel",function(b){a.$apply(d(b)?a.incrementHours():a.decrementHours()),b.preventDefault()}),c.bind("mousewheel wheel",function(b){a.$apply(d(b)?a.incrementMinutes():a.decrementMinutes()),b.preventDefault()})},this.setupInputEvents=function(b,c){if(a.readonlyInput)return a.updateHours=angular.noop,void(a.updateMinutes=angular.noop);var d=function(b,c){o.$setViewValue(null),o.$setValidity("time",!1),angular.isDefined(b)&&(a.invalidHours=b),angular.isDefined(c)&&(a.invalidMinutes=c)};a.updateHours=function(){var a=g();angular.isDefined(a)?(n.setHours(a),j("h")):d(!0)},b.bind("blur",function(){!a.invalidHours&&a.hours<10&&a.$apply(function(){a.hours=i(a.hours)})}),a.updateMinutes=function(){var a=h();angular.isDefined(a)?(n.setMinutes(a),j("m")):d(void 0,!0)},c.bind("blur",function(){!a.invalidMinutes&&a.minutes<10&&a.$apply(function(){a.minutes=i(a.minutes)})})},this.render=function(){var a=o.$modelValue?new Date(o.$modelValue):null;isNaN(a)?(o.$setValidity("time",!1),d.error('Timepicker directive: "ng-model" value must be a Date object, a number of milliseconds since 01.01.1970 or a string representing an RFC2822 or ISO 8601 date.')):(a&&(n=a),k(),l())},a.incrementHours=function(){m(60*q)},a.decrementHours=function(){m(60*-q)},a.incrementMinutes=function(){m(r)},a.decrementMinutes=function(){m(-r)},a.toggleMeridian=function(){m(720*(n.getHours()<12?1:-1))}}]).directive("timepicker",function(){return{restrict:"EA",require:["timepicker","?^ngModel"],controller:"TimepickerController",replace:!0,scope:{},templateUrl:"template/timepicker/timepicker.html",link:function(a,b,c,d){var e=d[0],f=d[1];f&&e.init(f,b.find("input"))}}}),angular.module("ui.bootstrap.typeahead",["ui.bootstrap.position","ui.bootstrap.bindHtml"]).factory("typeaheadParser",["$parse",function(a){var b=/^\s*(.*?)(?:\s+as\s+(.*?))?\s+for\s+(?:([\$\w][\$\w\d]*))\s+in\s+(.*)$/;return{parse:function(c){var d=c.match(b);if(!d)throw new Error('Expected typeahead specification in form of "_modelValue_ (as _label_)? 
for _item_ in _collection_" but got "'+c+'".');return{itemName:d[3],source:a(d[4]),viewMapper:a(d[2]||d[1]),modelMapper:a(d[1])}}}}]).directive("typeahead",["$compile","$parse","$q","$timeout","$document","$position","typeaheadParser",function(a,b,c,d,e,f,g){var h=[9,13,27,38,40];return{require:"ngModel",link:function(i,j,k,l){var m,n=i.$eval(k.typeaheadMinLength)||1,o=i.$eval(k.typeaheadWaitMs)||0,p=i.$eval(k.typeaheadEditable)!==!1,q=b(k.typeaheadLoading).assign||angular.noop,r=b(k.typeaheadOnSelect),s=k.typeaheadInputFormatter?b(k.typeaheadInputFormatter):void 0,t=k.typeaheadAppendToBody?i.$eval(k.typeaheadAppendToBody):!1,u=b(k.ngModel).assign,v=g.parse(k.typeahead),w=i.$new();i.$on("$destroy",function(){w.$destroy()});var x="typeahead-"+w.$id+"-"+Math.floor(1e4*Math.random());j.attr({"aria-autocomplete":"list","aria-expanded":!1,"aria-owns":x});var y=angular.element("<div typeahead-popup></div>");y.attr({id:x,matches:"matches",active:"activeIdx",select:"select(activeIdx)",query:"query",position:"position"}),angular.isDefined(k.typeaheadTemplateUrl)&&y.attr("template-url",k.typeaheadTemplateUrl);var z=function(){w.matches=[],w.activeIdx=-1,j.attr("aria-expanded",!1)},A=function(a){return x+"-option-"+a};w.$watch("activeIdx",function(a){0>a?j.removeAttr("aria-activedescendant"):j.attr("aria-activedescendant",A(a))});var B=function(a){var b={$viewValue:a};q(i,!0),c.when(v.source(i,b)).then(function(c){var d=a===l.$viewValue;if(d&&m)if(c.length>0){w.activeIdx=0,w.matches.length=0;for(var e=0;e<c.length;e++)b[v.itemName]=c[e],w.matches.push({id:A(e),label:v.viewMapper(w,b),model:c[e]});w.query=a,w.position=t?f.offset(j):f.position(j),w.position.top=w.position.top+j.prop("offsetHeight"),j.attr("aria-expanded",!0)}else z();d&&q(i,!1)},function(){z(),q(i,!1)})};z(),w.query=void 0;var C;l.$parsers.unshift(function(a){return m=!0,a&&a.length>=n?o>0?(C&&d.cancel(C),C=d(function(){B(a)},o)):B(a):(q(i,!1),z()),p?a:a?void l.$setValidity("editable",!1):(l.$setValidity("editable",!0),a)}),l.$formatters.push(function(a){var b,c,d={};return s?(d.$model=a,s(i,d)):(d[v.itemName]=a,b=v.viewMapper(i,d),d[v.itemName]=void 0,c=v.viewMapper(i,d),b!==c?b:a)}),w.select=function(a){var b,c,e={};e[v.itemName]=c=w.matches[a].model,b=v.modelMapper(i,e),u(i,b),l.$setValidity("editable",!0),r(i,{$item:c,$model:b,$label:v.viewMapper(i,e)}),z(),d(function(){j[0].focus()},0,!1)},j.bind("keydown",function(a){0!==w.matches.length&&-1!==h.indexOf(a.which)&&(a.preventDefault(),40===a.which?(w.activeIdx=(w.activeIdx+1)%w.matches.length,w.$digest()):38===a.which?(w.activeIdx=(w.activeIdx?w.activeIdx:w.matches.length)-1,w.$digest()):13===a.which||9===a.which?w.$apply(function(){w.select(w.activeIdx)}):27===a.which&&(a.stopPropagation(),z(),w.$digest()))}),j.bind("blur",function(){m=!1});var D=function(a){j[0]!==a.target&&(z(),w.$digest())};e.bind("click",D),i.$on("$destroy",function(){e.unbind("click",D)});var E=a(y)(w);t?e.find("body").append(E):j.after(E)}}}]).directive("typeaheadPopup",function(){return{restrict:"EA",scope:{matches:"=",query:"=",active:"=",position:"=",select:"&"},replace:!0,templateUrl:"template/typeahead/typeahead-popup.html",link:function(a,b,c){a.templateUrl=c.templateUrl,a.isOpen=function(){return a.matches.length>0},a.isActive=function(b){return 
a.active==b},a.selectActive=function(b){a.active=b},a.selectMatch=function(b){a.select({activeIdx:b})}}}}).directive("typeaheadMatch",["$http","$templateCache","$compile","$parse",function(a,b,c,d){return{restrict:"EA",scope:{index:"=",match:"=",query:"="},link:function(e,f,g){var h=d(g.templateUrl)(e.$parent)||"template/typeahead/typeahead-match.html";a.get(h,{cache:b}).success(function(a){f.replaceWith(c(a.trim())(e))})}}}]).filter("typeaheadHighlight",function(){function a(a){return a.replace(/([.?*+^$[\]\\(){}|-])/g,"\\$1")}return function(b,c){return c?(""+b).replace(new RegExp(a(c),"gi"),"<strong>$&</strong>"):b}}),angular.module("template/accordion/accordion-group.html",[]).run(["$templateCache",function(a){a.put("template/accordion/accordion-group.html",'<div class="panel panel-default">\n <div class="panel-heading">\n <h4 class="panel-title">\n <a class="accordion-toggle" ng-click="toggleOpen()" accordion-transclude="heading"><span ng-class="{\'text-muted\': isDisabled}">{{heading}}</span></a>\n </h4>\n </div>\n <div class="panel-collapse" collapse="!isOpen">\n <div class="panel-body" ng-transclude></div>\n </div>\n</div>')}]),angular.module("template/accordion/accordion.html",[]).run(["$templateCache",function(a){a.put("template/accordion/accordion.html",'<div class="panel-group" ng-transclude></div>')}]),angular.module("template/alert/alert.html",[]).run(["$templateCache",function(a){a.put("template/alert/alert.html",'<div class="alert" ng-class="{\'alert-{{type || \'warning\'}}\': true, \'alert-dismissable\': closeable}" role="alert">\n <button ng-show="closeable" type="button" class="close" ng-click="close()">\n <span aria-hidden="true">×</span>\n <span class="sr-only">Close</span>\n </button>\n <div ng-transclude></div>\n</div>\n')}]),angular.module("template/carousel/carousel.html",[]).run(["$templateCache",function(a){a.put("template/carousel/carousel.html",'<div ng-mouseenter="pause()" ng-mouseleave="play()" class="carousel" ng-swipe-right="prev()" ng-swipe-left="next()">\n <ol class="carousel-indicators" ng-show="slides.length > 1">\n <li ng-repeat="slide in slides track by $index" ng-class="{active: isActive(slide)}" ng-click="select(slide)"></li>\n </ol>\n <div class="carousel-inner" ng-transclude></div>\n <a class="left carousel-control" ng-click="prev()" ng-show="slides.length > 1"><span class="glyphicon glyphicon-chevron-left"></span></a>\n <a class="right carousel-control" ng-click="next()" ng-show="slides.length > 1"><span class="glyphicon glyphicon-chevron-right"></span></a>\n</div>\n')}]),angular.module("template/carousel/slide.html",[]).run(["$templateCache",function(a){a.put("template/carousel/slide.html","<div ng-class=\"{\n 'active': leaving || (active && !entering),\n 'prev': (next || active) && direction=='prev',\n 'next': (next || active) && direction=='next',\n 'right': direction=='prev',\n 'left': direction=='next'\n }\" class=\"item text-center\" ng-transclude></div>\n")}]),angular.module("template/datepicker/datepicker.html",[]).run(["$templateCache",function(a){a.put("template/datepicker/datepicker.html",'<div ng-switch="datepickerMode" role="application" ng-keydown="keydown($event)">\n <daypicker ng-switch-when="day" tabindex="0"></daypicker>\n <monthpicker ng-switch-when="month" tabindex="0"></monthpicker>\n <yearpicker ng-switch-when="year" tabindex="0"></yearpicker>\n</div>')}]),angular.module("template/datepicker/day.html",[]).run(["$templateCache",function(a){a.put("template/datepicker/day.html",'<table role="grid" 
aria-labelledby="{{uniqueId}}-title" aria-activedescendant="{{activeDateId}}">\n <thead>\n <tr>\n <th><button type="button" class="btn btn-default btn-sm pull-left" ng-click="move(-1)" tabindex="-1"><i class="glyphicon glyphicon-chevron-left"></i></button></th>\n <th colspan="{{5 + showWeeks}}"><button id="{{uniqueId}}-title" role="heading" aria-live="assertive" aria-atomic="true" type="button" class="btn btn-default btn-sm" ng-click="toggleMode()" tabindex="-1" style="width:100%;"><strong>{{title}}</strong></button></th>\n <th><button type="button" class="btn btn-default btn-sm pull-right" ng-click="move(1)" tabindex="-1"><i class="glyphicon glyphicon-chevron-right"></i></button></th>\n </tr>\n <tr>\n <th ng-show="showWeeks" class="text-center"></th>\n <th ng-repeat="label in labels track by $index" class="text-center"><small aria-label="{{label.full}}">{{label.abbr}}</small></th>\n </tr>\n </thead>\n <tbody>\n <tr ng-repeat="row in rows track by $index">\n <td ng-show="showWeeks" class="text-center h6"><em>{{ weekNumbers[$index] }}</em></td>\n <td ng-repeat="dt in row track by dt.date" class="text-center" role="gridcell" id="{{dt.uid}}" aria-disabled="{{!!dt.disabled}}">\n <button type="button" style="width:100%;" class="btn btn-default btn-sm" ng-class="{\'btn-info\': dt.selected, active: isActive(dt)}" ng-click="select(dt.date)" ng-disabled="dt.disabled" tabindex="-1"><span ng-class="{\'text-muted\': dt.secondary, \'text-info\': dt.current}">{{dt.label}}</span></button>\n </td>\n </tr>\n </tbody>\n</table>\n')}]),angular.module("template/datepicker/month.html",[]).run(["$templateCache",function(a){a.put("template/datepicker/month.html",'<table role="grid" aria-labelledby="{{uniqueId}}-title" aria-activedescendant="{{activeDateId}}">\n <thead>\n <tr>\n <th><button type="button" class="btn btn-default btn-sm pull-left" ng-click="move(-1)" tabindex="-1"><i class="glyphicon glyphicon-chevron-left"></i></button></th>\n <th><button id="{{uniqueId}}-title" role="heading" aria-live="assertive" aria-atomic="true" type="button" class="btn btn-default btn-sm" ng-click="toggleMode()" tabindex="-1" style="width:100%;"><strong>{{title}}</strong></button></th>\n <th><button type="button" class="btn btn-default btn-sm pull-right" ng-click="move(1)" tabindex="-1"><i class="glyphicon glyphicon-chevron-right"></i></button></th>\n </tr>\n </thead>\n <tbody>\n <tr ng-repeat="row in rows track by $index">\n <td ng-repeat="dt in row track by dt.date" class="text-center" role="gridcell" id="{{dt.uid}}" aria-disabled="{{!!dt.disabled}}">\n <button type="button" style="width:100%;" class="btn btn-default" ng-class="{\'btn-info\': dt.selected, active: isActive(dt)}" ng-click="select(dt.date)" ng-disabled="dt.disabled" tabindex="-1"><span ng-class="{\'text-info\': dt.current}">{{dt.label}}</span></button>\n </td>\n </tr>\n </tbody>\n</table>\n')}]),angular.module("template/datepicker/popup.html",[]).run(["$templateCache",function(a){a.put("template/datepicker/popup.html",'<ul class="dropdown-menu" ng-style="{display: (isOpen && \'block\') || \'none\', top: position.top+\'px\', left: position.left+\'px\'}" ng-keydown="keydown($event)">\n <li ng-transclude></li>\n <li ng-if="showButtonBar" style="padding:10px 9px 2px">\n <span class="btn-group">\n <button type="button" class="btn btn-sm btn-info" ng-click="select(\'today\')">{{ getText(\'current\') }}</button>\n <button type="button" class="btn btn-sm btn-danger" ng-click="select(null)">{{ getText(\'clear\') }}</button>\n </span>\n <button type="button" class="btn 
btn-sm btn-success pull-right" ng-click="close()">{{ getText(\'close\') }}</button>\n </li>\n</ul>\n')}]),angular.module("template/datepicker/year.html",[]).run(["$templateCache",function(a){a.put("template/datepicker/year.html",'<table role="grid" aria-labelledby="{{uniqueId}}-title" aria-activedescendant="{{activeDateId}}">\n <thead>\n <tr>\n <th><button type="button" class="btn btn-default btn-sm pull-left" ng-click="move(-1)" tabindex="-1"><i class="glyphicon glyphicon-chevron-left"></i></button></th>\n <th colspan="3"><button id="{{uniqueId}}-title" role="heading" aria-live="assertive" aria-atomic="true" type="button" class="btn btn-default btn-sm" ng-click="toggleMode()" tabindex="-1" style="width:100%;"><strong>{{title}}</strong></button></th>\n <th><button type="button" class="btn btn-default btn-sm pull-right" ng-click="move(1)" tabindex="-1"><i class="glyphicon glyphicon-chevron-right"></i></button></th>\n </tr>\n </thead>\n <tbody>\n <tr ng-repeat="row in rows track by $index">\n <td ng-repeat="dt in row track by dt.date" class="text-center" role="gridcell" id="{{dt.uid}}" aria-disabled="{{!!dt.disabled}}">\n <button type="button" style="width:100%;" class="btn btn-default" ng-class="{\'btn-info\': dt.selected, active: isActive(dt)}" ng-click="select(dt.date)" ng-disabled="dt.disabled" tabindex="-1"><span ng-class="{\'text-info\': dt.current}">{{dt.label}}</span></button>\n </td>\n </tr>\n </tbody>\n</table>\n')}]),angular.module("template/modal/backdrop.html",[]).run(["$templateCache",function(a){a.put("template/modal/backdrop.html",'<div class="modal-backdrop fade"\n ng-class="{in: animate}"\n ng-style="{\'z-index\': 1040 + (index && 1 || 0) + index*10}"\n></div>\n')}]),angular.module("template/modal/window.html",[]).run(["$templateCache",function(a){a.put("template/modal/window.html",'<div tabindex="-1" role="dialog" class="modal fade" ng-class="{in: animate}" ng-style="{\'z-index\': 1050 + index*10, display: \'block\'}" ng-click="close($event)">\n <div class="modal-dialog" ng-class="{\'modal-sm\': size == \'sm\', \'modal-lg\': size == \'lg\'}"><div class="modal-content" ng-transclude></div></div>\n</div>')}]),angular.module("template/pagination/pager.html",[]).run(["$templateCache",function(a){a.put("template/pagination/pager.html",'<ul class="pager">\n <li ng-class="{disabled: noPrevious(), previous: align}"><a href ng-click="selectPage(page - 1)">{{getText(\'previous\')}}</a></li>\n <li ng-class="{disabled: noNext(), next: align}"><a href ng-click="selectPage(page + 1)">{{getText(\'next\')}}</a></li>\n</ul>')}]),angular.module("template/pagination/pagination.html",[]).run(["$templateCache",function(a){a.put("template/pagination/pagination.html",'<ul class="pagination">\n <li ng-if="boundaryLinks" ng-class="{disabled: noPrevious()}"><a href ng-click="selectPage(1)">{{getText(\'first\')}}</a></li>\n <li ng-if="directionLinks" ng-class="{disabled: noPrevious()}"><a href ng-click="selectPage(page - 1)">{{getText(\'previous\')}}</a></li>\n <li ng-repeat="page in pages track by $index" ng-class="{active: page.active}"><a href ng-click="selectPage(page.number)">{{page.text}}</a></li>\n <li ng-if="directionLinks" ng-class="{disabled: noNext()}"><a href ng-click="selectPage(page + 1)">{{getText(\'next\')}}</a></li>\n <li ng-if="boundaryLinks" ng-class="{disabled: noNext()}"><a href 
ng-click="selectPage(totalPages)">{{getText(\'last\')}}</a></li>\n</ul>')}]),angular.module("template/tooltip/tooltip-html-unsafe-popup.html",[]).run(["$templateCache",function(a){a.put("template/tooltip/tooltip-html-unsafe-popup.html",'<div class="tooltip {{placement}}" ng-class="{ in: isOpen(), fade: animation() }">\n <div class="tooltip-arrow"></div>\n <div class="tooltip-inner" bind-html-unsafe="content"></div>\n</div>\n')}]),angular.module("template/tooltip/tooltip-popup.html",[]).run(["$templateCache",function(a){a.put("template/tooltip/tooltip-popup.html",'<div class="tooltip {{placement}}" ng-class="{ in: isOpen(), fade: animation() }">\n <div class="tooltip-arrow"></div>\n <div class="tooltip-inner" ng-bind="content"></div>\n</div>\n')}]),angular.module("template/popover/popover.html",[]).run(["$templateCache",function(a){a.put("template/popover/popover.html",'<div class="popover {{placement}}" ng-class="{ in: isOpen(), fade: animation() }">\n <div class="arrow"></div>\n\n <div class="popover-inner">\n <h3 class="popover-title" ng-bind="title" ng-show="title"></h3>\n <div class="popover-content" ng-bind="content"></div>\n </div>\n</div>\n')}]),angular.module("template/progressbar/bar.html",[]).run(["$templateCache",function(a){a.put("template/progressbar/bar.html",'<div class="progress-bar" ng-class="type && \'progress-bar-\' + type" role="progressbar" aria-valuenow="{{value}}" aria-valuemin="0" aria-valuemax="{{max}}" ng-style="{width: percent + \'%\'}" aria-valuetext="{{percent | number:0}}%" ng-transclude></div>')}]),angular.module("template/progressbar/progress.html",[]).run(["$templateCache",function(a){a.put("template/progressbar/progress.html",'<div class="progress" ng-transclude></div>')}]),angular.module("template/progressbar/progressbar.html",[]).run(["$templateCache",function(a){a.put("template/progressbar/progressbar.html",'<div class="progress">\n <div class="progress-bar" ng-class="type && \'progress-bar-\' + type" role="progressbar" aria-valuenow="{{value}}" aria-valuemin="0" aria-valuemax="{{max}}" ng-style="{width: percent + \'%\'}" aria-valuetext="{{percent | number:0}}%" ng-transclude></div>\n</div>')}]),angular.module("template/rating/rating.html",[]).run(["$templateCache",function(a){a.put("template/rating/rating.html",'<span ng-mouseleave="reset()" ng-keydown="onKeydown($event)" tabindex="0" role="slider" aria-valuemin="0" aria-valuemax="{{range.length}}" aria-valuenow="{{value}}">\n <i ng-repeat="r in range track by $index" ng-mouseenter="enter($index + 1)" ng-click="rate($index + 1)" class="glyphicon" ng-class="$index < value && (r.stateOn || \'glyphicon-star\') || (r.stateOff || \'glyphicon-star-empty\')">\n <span class="sr-only">({{ $index < value ? 
\'*\' : \' \' }})</span>\n </i>\n</span>')}]),angular.module("template/tabs/tab.html",[]).run(["$templateCache",function(a){a.put("template/tabs/tab.html",'<li ng-class="{active: active, disabled: disabled}">\n <a ng-click="select()" tab-heading-transclude>{{heading}}</a>\n</li>\n')}]),angular.module("template/tabs/tabset-titles.html",[]).run(["$templateCache",function(a){a.put("template/tabs/tabset-titles.html","<ul class=\"nav {{type && 'nav-' + type}}\" ng-class=\"{'nav-stacked': vertical}\">\n</ul>\n")}]),angular.module("template/tabs/tabset.html",[]).run(["$templateCache",function(a){a.put("template/tabs/tabset.html",'\n<div>\n <ul class="nav nav-{{type || \'tabs\'}}" ng-class="{\'nav-stacked\': vertical, \'nav-justified\': justified}" ng-transclude></ul>\n <div class="tab-content">\n <div class="tab-pane" \n ng-repeat="tab in tabs" \n ng-class="{active: tab.active}"\n tab-content-transclude="tab">\n </div>\n </div>\n</div>\n')}]),angular.module("template/timepicker/timepicker.html",[]).run(["$templateCache",function(a){a.put("template/timepicker/timepicker.html",'<table>\n <tbody>\n <tr class="text-center">\n <td><a ng-click="incrementHours()" class="btn btn-link"><span class="glyphicon glyphicon-chevron-up"></span></a></td>\n <td> </td>\n <td><a ng-click="incrementMinutes()" class="btn btn-link"><span class="glyphicon glyphicon-chevron-up"></span></a></td>\n <td ng-show="showMeridian"></td>\n </tr>\n <tr>\n <td style="width:50px;" class="form-group" ng-class="{\'has-error\': invalidHours}">\n <input type="text" ng-model="hours" ng-change="updateHours()" class="form-control text-center" ng-mousewheel="incrementHours()" ng-readonly="readonlyInput" maxlength="2">\n </td>\n <td>:</td>\n <td style="width:50px;" class="form-group" ng-class="{\'has-error\': invalidMinutes}">\n <input type="text" ng-model="minutes" ng-change="updateMinutes()" class="form-control text-center" ng-readonly="readonlyInput" maxlength="2">\n </td>\n <td ng-show="showMeridian"><button type="button" class="btn btn-default text-center" ng-click="toggleMeridian()">{{meridian}}</button></td>\n </tr>\n <tr class="text-center">\n <td><a ng-click="decrementHours()" class="btn btn-link"><span class="glyphicon glyphicon-chevron-down"></span></a></td>\n <td> </td>\n <td><a ng-click="decrementMinutes()" class="btn btn-link"><span class="glyphicon glyphicon-chevron-down"></span></a></td>\n <td ng-show="showMeridian"></td>\n </tr>\n </tbody>\n</table>\n')}]),angular.module("template/typeahead/typeahead-match.html",[]).run(["$templateCache",function(a){a.put("template/typeahead/typeahead-match.html",'<a tabindex="-1" bind-html-unsafe="match.label | typeaheadHighlight:query"></a>')}]),angular.module("template/typeahead/typeahead-popup.html",[]).run(["$templateCache",function(a){a.put("template/typeahead/typeahead-popup.html",'<ul class="dropdown-menu" ng-if="isOpen()" ng-style="{top: position.top+\'px\', left: position.left+\'px\'}" style="display: block;" role="listbox" aria-hidden="{{!isOpen()}}">\n <li ng-repeat="match in matches track by $index" ng-class="{active: isActive($index) }" ng-mouseenter="selectActive($index)" ng-click="selectMatch($index)" role="option" id="{{match.id}}">\n <div typeahead-match index="$index" match="match" query="query" template-url="templateUrl"></div>\n </li>\n</ul>')
+}]);
\ No newline at end of file
diff --git a/bitbake/lib/toaster/toastergui/static/js/ui-bootstrap-tpls-0.11.0.min.js b/bitbake/lib/toaster/toastergui/static/js/ui-bootstrap-tpls-0.11.0.min.js
new file mode 100644
index 0000000..fa6a861
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/static/js/ui-bootstrap-tpls-0.11.0.min.js
@@ -0,0 +1,10 @@
+/*
+ * angular-ui-bootstrap
+ * http://angular-ui.github.io/bootstrap/
+
+ * Version: 0.11.0 - 2014-05-01
+ * License: MIT
+ */
+angular.module("ui.bootstrap",["ui.bootstrap.tpls","ui.bootstrap.transition","ui.bootstrap.collapse","ui.bootstrap.accordion","ui.bootstrap.alert","ui.bootstrap.bindHtml","ui.bootstrap.buttons","ui.bootstrap.carousel","ui.bootstrap.dateparser","ui.bootstrap.position","ui.bootstrap.datepicker","ui.bootstrap.dropdown","ui.bootstrap.modal","ui.bootstrap.pagination","ui.bootstrap.tooltip","ui.bootstrap.popover","ui.bootstrap.progressbar","ui.bootstrap.rating","ui.bootstrap.tabs","ui.bootstrap.timepicker","ui.bootstrap.typeahead"]),angular.module("ui.bootstrap.tpls",["template/accordion/accordion-group.html","template/accordion/accordion.html","template/alert/alert.html","template/carousel/carousel.html","template/carousel/slide.html","template/datepicker/datepicker.html","template/datepicker/day.html","template/datepicker/month.html","template/datepicker/popup.html","template/datepicker/year.html","template/modal/backdrop.html","template/modal/window.html","template/pagination/pager.html","template/pagination/pagination.html","template/tooltip/tooltip-html-unsafe-popup.html","template/tooltip/tooltip-popup.html","template/popover/popover.html","template/progressbar/bar.html","template/progressbar/progress.html","template/progressbar/progressbar.html","template/rating/rating.html","template/tabs/tab.html","template/tabs/tabset.html","template/timepicker/timepicker.html","template/typeahead/typeahead-match.html","template/typeahead/typeahead-popup.html"]),angular.module("ui.bootstrap.transition",[]).factory("$transition",["$q","$timeout","$rootScope",function(a,b,c){function d(a){for(var b in a)if(void 0!==f.style[b])return a[b]}var e=function(d,f,g){g=g||{};var h=a.defer(),i=e[g.animation?"animationEndEventName":"transitionEndEventName"],j=function(){c.$apply(function(){d.unbind(i,j),h.resolve(d)})};return i&&d.bind(i,j),b(function(){angular.isString(f)?d.addClass(f):angular.isFunction(f)?f(d):angular.isObject(f)&&d.css(f),i||h.resolve(d)}),h.promise.cancel=function(){i&&d.unbind(i,j),h.reject("Transition cancelled")},h.promise},f=document.createElement("trans"),g={WebkitTransition:"webkitTransitionEnd",MozTransition:"transitionend",OTransition:"oTransitionEnd",transition:"transitionend"},h={WebkitTransition:"webkitAnimationEnd",MozTransition:"animationend",OTransition:"oAnimationEnd",transition:"animationend"};return e.transitionEndEventName=d(g),e.animationEndEventName=d(h),e}]),angular.module("ui.bootstrap.collapse",["ui.bootstrap.transition"]).directive("collapse",["$transition",function(a){return{link:function(b,c,d){function e(b){function d(){j===e&&(j=void 0)}var e=a(c,b);return j&&j.cancel(),j=e,e.then(d,d),e}function f(){k?(k=!1,g()):(c.removeClass("collapse").addClass("collapsing"),e({height:c[0].scrollHeight+"px"}).then(g))}function g(){c.removeClass("collapsing"),c.addClass("collapse in"),c.css({height:"auto"})}function h(){if(k)k=!1,i(),c.css({height:0});else{c.css({height:c[0].scrollHeight+"px"});{c[0].offsetWidth}c.removeClass("collapse in").addClass("collapsing"),e({height:0}).then(i)}}function i(){c.removeClass("collapsing"),c.addClass("collapse")}var j,k=!0;b.$watch(d.collapse,function(a){a?h():f()})}}}]),angular.module("ui.bootstrap.accordion",["ui.bootstrap.collapse"]).constant("accordionConfig",{closeOthers:!0}).controller("AccordionController",["$scope","$attrs","accordionConfig",function(a,b,c){this.groups=[],this.closeOthers=function(d){var 
e=angular.isDefined(b.closeOthers)?a.$eval(b.closeOthers):c.closeOthers;e&&angular.forEach(this.groups,function(a){a!==d&&(a.isOpen=!1)})},this.addGroup=function(a){var b=this;this.groups.push(a),a.$on("$destroy",function(){b.removeGroup(a)})},this.removeGroup=function(a){var b=this.groups.indexOf(a);-1!==b&&this.groups.splice(b,1)}}]).directive("accordion",function(){return{restrict:"EA",controller:"AccordionController",transclude:!0,replace:!1,templateUrl:"template/accordion/accordion.html"}}).directive("accordionGroup",function(){return{require:"^accordion",restrict:"EA",transclude:!0,replace:!0,templateUrl:"template/accordion/accordion-group.html",scope:{heading:"@",isOpen:"=?",isDisabled:"=?"},controller:function(){this.setHeading=function(a){this.heading=a}},link:function(a,b,c,d){d.addGroup(a),a.$watch("isOpen",function(b){b&&d.closeOthers(a)}),a.toggleOpen=function(){a.isDisabled||(a.isOpen=!a.isOpen)}}}}).directive("accordionHeading",function(){return{restrict:"EA",transclude:!0,template:"",replace:!0,require:"^accordionGroup",link:function(a,b,c,d,e){d.setHeading(e(a,function(){}))}}}).directive("accordionTransclude",function(){return{require:"^accordionGroup",link:function(a,b,c,d){a.$watch(function(){return d[c.accordionTransclude]},function(a){a&&(b.html(""),b.append(a))})}}}),angular.module("ui.bootstrap.alert",[]).controller("AlertController",["$scope","$attrs",function(a,b){a.closeable="close"in b}]).directive("alert",function(){return{restrict:"EA",controller:"AlertController",templateUrl:"template/alert/alert.html",transclude:!0,replace:!0,scope:{type:"@",close:"&"}}}),angular.module("ui.bootstrap.bindHtml",[]).directive("bindHtmlUnsafe",function(){return function(a,b,c){b.addClass("ng-binding").data("$binding",c.bindHtmlUnsafe),a.$watch(c.bindHtmlUnsafe,function(a){b.html(a||"")})}}),angular.module("ui.bootstrap.buttons",[]).constant("buttonConfig",{activeClass:"active",toggleEvent:"click"}).controller("ButtonsController",["buttonConfig",function(a){this.activeClass=a.activeClass||"active",this.toggleEvent=a.toggleEvent||"click"}]).directive("btnRadio",function(){return{require:["btnRadio","ngModel"],controller:"ButtonsController",link:function(a,b,c,d){var e=d[0],f=d[1];f.$render=function(){b.toggleClass(e.activeClass,angular.equals(f.$modelValue,a.$eval(c.btnRadio)))},b.bind(e.toggleEvent,function(){var d=b.hasClass(e.activeClass);(!d||angular.isDefined(c.uncheckable))&&a.$apply(function(){f.$setViewValue(d?null:a.$eval(c.btnRadio)),f.$render()})})}}}).directive("btnCheckbox",function(){return{require:["btnCheckbox","ngModel"],controller:"ButtonsController",link:function(a,b,c,d){function e(){return g(c.btnCheckboxTrue,!0)}function f(){return g(c.btnCheckboxFalse,!1)}function g(b,c){var d=a.$eval(b);return angular.isDefined(d)?d:c}var h=d[0],i=d[1];i.$render=function(){b.toggleClass(h.activeClass,angular.equals(i.$modelValue,e()))},b.bind(h.toggleEvent,function(){a.$apply(function(){i.$setViewValue(b.hasClass(h.activeClass)?f():e()),i.$render()})})}}}),angular.module("ui.bootstrap.carousel",["ui.bootstrap.transition"]).controller("CarouselController",["$scope","$timeout","$transition",function(a,b,c){function d(){e();var c=+a.interval;!isNaN(c)&&c>=0&&(g=b(f,c))}function e(){g&&(b.cancel(g),g=null)}function f(){h?(a.next(),d()):a.pause()}var g,h,i=this,j=i.slides=a.slides=[],k=-1;i.currentSlide=null;var l=!1;i.select=a.select=function(e,f){function 
g(){if(!l){if(i.currentSlide&&angular.isString(f)&&!a.noTransition&&e.$element){e.$element.addClass(f);{e.$element[0].offsetWidth}angular.forEach(j,function(a){angular.extend(a,{direction:"",entering:!1,leaving:!1,active:!1})}),angular.extend(e,{direction:f,active:!0,entering:!0}),angular.extend(i.currentSlide||{},{direction:f,leaving:!0}),a.$currentTransition=c(e.$element,{}),function(b,c){a.$currentTransition.then(function(){h(b,c)},function(){h(b,c)})}(e,i.currentSlide)}else h(e,i.currentSlide);i.currentSlide=e,k=m,d()}}function h(b,c){angular.extend(b,{direction:"",active:!0,leaving:!1,entering:!1}),angular.extend(c||{},{direction:"",active:!1,leaving:!1,entering:!1}),a.$currentTransition=null}var m=j.indexOf(e);void 0===f&&(f=m>k?"next":"prev"),e&&e!==i.currentSlide&&(a.$currentTransition?(a.$currentTransition.cancel(),b(g)):g())},a.$on("$destroy",function(){l=!0}),i.indexOfSlide=function(a){return j.indexOf(a)},a.next=function(){var b=(k+1)%j.length;return a.$currentTransition?void 0:i.select(j[b],"next")},a.prev=function(){var b=0>k-1?j.length-1:k-1;return a.$currentTransition?void 0:i.select(j[b],"prev")},a.isActive=function(a){return i.currentSlide===a},a.$watch("interval",d),a.$on("$destroy",e),a.play=function(){h||(h=!0,d())},a.pause=function(){a.noPause||(h=!1,e())},i.addSlide=function(b,c){b.$element=c,j.push(b),1===j.length||b.active?(i.select(j[j.length-1]),1==j.length&&a.play()):b.active=!1},i.removeSlide=function(a){var b=j.indexOf(a);j.splice(b,1),j.length>0&&a.active?i.select(b>=j.length?j[b-1]:j[b]):k>b&&k--}}]).directive("carousel",[function(){return{restrict:"EA",transclude:!0,replace:!0,controller:"CarouselController",require:"carousel",templateUrl:"template/carousel/carousel.html",scope:{interval:"=",noTransition:"=",noPause:"="}}}]).directive("slide",function(){return{require:"^carousel",restrict:"EA",transclude:!0,replace:!0,templateUrl:"template/carousel/slide.html",scope:{active:"=?"},link:function(a,b,c,d){d.addSlide(a,b),a.$on("$destroy",function(){d.removeSlide(a)}),a.$watch("active",function(b){b&&d.select(a)})}}}),angular.module("ui.bootstrap.dateparser",[]).service("dateParser",["$locale","orderByFilter",function(a,b){function c(a,b,c){return 1===b&&c>28?29===c&&(a%4===0&&a%100!==0||a%400===0):3===b||5===b||8===b||10===b?31>c:!0}this.parsers={};var d={yyyy:{regex:"\\d{4}",apply:function(a){this.year=+a}},yy:{regex:"\\d{2}",apply:function(a){this.year=+a+2e3}},y:{regex:"\\d{1,4}",apply:function(a){this.year=+a}},MMMM:{regex:a.DATETIME_FORMATS.MONTH.join("|"),apply:function(b){this.month=a.DATETIME_FORMATS.MONTH.indexOf(b)}},MMM:{regex:a.DATETIME_FORMATS.SHORTMONTH.join("|"),apply:function(b){this.month=a.DATETIME_FORMATS.SHORTMONTH.indexOf(b)}},MM:{regex:"0[1-9]|1[0-2]",apply:function(a){this.month=a-1}},M:{regex:"[1-9]|1[0-2]",apply:function(a){this.month=a-1}},dd:{regex:"[0-2][0-9]{1}|3[0-1]{1}",apply:function(a){this.date=+a}},d:{regex:"[1-2]?[0-9]{1}|3[0-1]{1}",apply:function(a){this.date=+a}},EEEE:{regex:a.DATETIME_FORMATS.DAY.join("|")},EEE:{regex:a.DATETIME_FORMATS.SHORTDAY.join("|")}};this.createParser=function(a){var c=[],e=a.split("");return angular.forEach(d,function(b,d){var f=a.indexOf(d);if(f>-1){a=a.split(""),e[f]="("+b.regex+")",a[f]="$";for(var g=f+1,h=f+d.length;h>g;g++)e[g]="",a[g]="$";a=a.join(""),c.push({index:f,apply:b.apply})}}),{regex:new RegExp("^"+e.join("")+"$"),map:b(c,"index")}},this.parse=function(b,d){if(!angular.isString(b))return b;d=a.DATETIME_FORMATS[d]||d,this.parsers[d]||(this.parsers[d]=this.createParser(d));var 
e=this.parsers[d],f=e.regex,g=e.map,h=b.match(f);if(h&&h.length){for(var i,j={year:1900,month:0,date:1,hours:0},k=1,l=h.length;l>k;k++){var m=g[k-1];m.apply&&m.apply.call(j,h[k])}return c(j.year,j.month,j.date)&&(i=new Date(j.year,j.month,j.date,j.hours)),i}}}]),angular.module("ui.bootstrap.position",[]).factory("$position",["$document","$window",function(a,b){function c(a,c){return a.currentStyle?a.currentStyle[c]:b.getComputedStyle?b.getComputedStyle(a)[c]:a.style[c]}function d(a){return"static"===(c(a,"position")||"static")}var e=function(b){for(var c=a[0],e=b.offsetParent||c;e&&e!==c&&d(e);)e=e.offsetParent;return e||c};return{position:function(b){var c=this.offset(b),d={top:0,left:0},f=e(b[0]);f!=a[0]&&(d=this.offset(angular.element(f)),d.top+=f.clientTop-f.scrollTop,d.left+=f.clientLeft-f.scrollLeft);var g=b[0].getBoundingClientRect();return{width:g.width||b.prop("offsetWidth"),height:g.height||b.prop("offsetHeight"),top:c.top-d.top,left:c.left-d.left}},offset:function(c){var d=c[0].getBoundingClientRect();return{width:d.width||c.prop("offsetWidth"),height:d.height||c.prop("offsetHeight"),top:d.top+(b.pageYOffset||a[0].documentElement.scrollTop),left:d.left+(b.pageXOffset||a[0].documentElement.scrollLeft)}},positionElements:function(a,b,c,d){var e,f,g,h,i=c.split("-"),j=i[0],k=i[1]||"center";e=d?this.offset(a):this.position(a),f=b.prop("offsetWidth"),g=b.prop("offsetHeight");var l={center:function(){return e.left+e.width/2-f/2},left:function(){return e.left},right:function(){return e.left+e.width}},m={center:function(){return e.top+e.height/2-g/2},top:function(){return e.top},bottom:function(){return e.top+e.height}};switch(j){case"right":h={top:m[k](),left:l[j]()};break;case"left":h={top:m[k](),left:e.left-f};break;case"bottom":h={top:m[j](),left:l[k]()};break;default:h={top:e.top-g,left:l[k]()}}return h}}}]),angular.module("ui.bootstrap.datepicker",["ui.bootstrap.dateparser","ui.bootstrap.position"]).constant("datepickerConfig",{formatDay:"dd",formatMonth:"MMMM",formatYear:"yyyy",formatDayHeader:"EEE",formatDayTitle:"MMMM yyyy",formatMonthTitle:"yyyy",datepickerMode:"day",minMode:"day",maxMode:"year",showWeeks:!0,startingDay:0,yearRange:20,minDate:null,maxDate:null}).controller("DatepickerController",["$scope","$attrs","$parse","$interpolate","$timeout","$log","dateFilter","datepickerConfig",function(a,b,c,d,e,f,g,h){var i=this,j={$setViewValue:angular.noop};this.modes=["day","month","year"],angular.forEach(["formatDay","formatMonth","formatYear","formatDayHeader","formatDayTitle","formatMonthTitle","minMode","maxMode","showWeeks","startingDay","yearRange"],function(c,e){i[c]=angular.isDefined(b[c])?8>e?d(b[c])(a.$parent):a.$parent.$eval(b[c]):h[c]}),angular.forEach(["minDate","maxDate"],function(d){b[d]?a.$parent.$watch(c(b[d]),function(a){i[d]=a?new Date(a):null,i.refreshView()}):i[d]=h[d]?new Date(h[d]):null}),a.datepickerMode=a.datepickerMode||h.datepickerMode,a.uniqueId="datepicker-"+a.$id+"-"+Math.floor(1e4*Math.random()),this.activeDate=angular.isDefined(b.initDate)?a.$parent.$eval(b.initDate):new Date,a.isActive=function(b){return 0===i.compare(b.date,i.activeDate)?(a.activeDateId=b.uid,!0):!1},this.init=function(a){j=a,j.$render=function(){i.render()}},this.render=function(){if(j.$modelValue){var a=new Date(j.$modelValue),b=!isNaN(a);b?this.activeDate=a:f.error('Datepicker directive: "ng-model" value must be a Date object, a number of milliseconds since 01.01.1970 or a string representing an RFC2822 or ISO 8601 
date.'),j.$setValidity("date",b)}this.refreshView()},this.refreshView=function(){if(this.element){this._refreshView();var a=j.$modelValue?new Date(j.$modelValue):null;j.$setValidity("date-disabled",!a||this.element&&!this.isDisabled(a))}},this.createDateObject=function(a,b){var c=j.$modelValue?new Date(j.$modelValue):null;return{date:a,label:g(a,b),selected:c&&0===this.compare(a,c),disabled:this.isDisabled(a),current:0===this.compare(a,new Date)}},this.isDisabled=function(c){return this.minDate&&this.compare(c,this.minDate)<0||this.maxDate&&this.compare(c,this.maxDate)>0||b.dateDisabled&&a.dateDisabled({date:c,mode:a.datepickerMode})},this.split=function(a,b){for(var c=[];a.length>0;)c.push(a.splice(0,b));return c},a.select=function(b){if(a.datepickerMode===i.minMode){var c=j.$modelValue?new Date(j.$modelValue):new Date(0,0,0,0,0,0,0);c.setFullYear(b.getFullYear(),b.getMonth(),b.getDate()),j.$setViewValue(c),j.$render()}else i.activeDate=b,a.datepickerMode=i.modes[i.modes.indexOf(a.datepickerMode)-1]},a.move=function(a){var b=i.activeDate.getFullYear()+a*(i.step.years||0),c=i.activeDate.getMonth()+a*(i.step.months||0);i.activeDate.setFullYear(b,c,1),i.refreshView()},a.toggleMode=function(b){b=b||1,a.datepickerMode===i.maxMode&&1===b||a.datepickerMode===i.minMode&&-1===b||(a.datepickerMode=i.modes[i.modes.indexOf(a.datepickerMode)+b])},a.keys={13:"enter",32:"space",33:"pageup",34:"pagedown",35:"end",36:"home",37:"left",38:"up",39:"right",40:"down"};var k=function(){e(function(){i.element[0].focus()},0,!1)};a.$on("datepicker.focus",k),a.keydown=function(b){var c=a.keys[b.which];if(c&&!b.shiftKey&&!b.altKey)if(b.preventDefault(),b.stopPropagation(),"enter"===c||"space"===c){if(i.isDisabled(i.activeDate))return;a.select(i.activeDate),k()}else!b.ctrlKey||"up"!==c&&"down"!==c?(i.handleKeyDown(c,b),i.refreshView()):(a.toggleMode("up"===c?1:-1),k())}}]).directive("datepicker",function(){return{restrict:"EA",replace:!0,templateUrl:"template/datepicker/datepicker.html",scope:{datepickerMode:"=?",dateDisabled:"&"},require:["datepicker","?^ngModel"],controller:"DatepickerController",link:function(a,b,c,d){var e=d[0],f=d[1];f&&e.init(f)}}}).directive("daypicker",["dateFilter",function(a){return{restrict:"EA",replace:!0,templateUrl:"template/datepicker/day.html",require:"^datepicker",link:function(b,c,d,e){function f(a,b){return 1!==b||a%4!==0||a%100===0&&a%400!==0?i[b]:29}function g(a,b){var c=new Array(b),d=new Date(a),e=0;for(d.setHours(12);b>e;)c[e++]=new Date(d),d.setDate(d.getDate()+1);return c}function h(a){var b=new Date(a);b.setDate(b.getDate()+4-(b.getDay()||7));var c=b.getTime();return b.setMonth(0),b.setDate(1),Math.floor(Math.round((c-b)/864e5)/7)+1}b.showWeeks=e.showWeeks,e.step={months:1},e.element=c;var i=[31,28,31,30,31,30,31,31,30,31,30,31];e._refreshView=function(){var c=e.activeDate.getFullYear(),d=e.activeDate.getMonth(),f=new Date(c,d,1),i=e.startingDay-f.getDay(),j=i>0?7-i:-i,k=new Date(f);j>0&&k.setDate(-j+1);for(var l=g(k,42),m=0;42>m;m++)l[m]=angular.extend(e.createDateObject(l[m],e.formatDay),{secondary:l[m].getMonth()!==d,uid:b.uniqueId+"-"+m});b.labels=new Array(7);for(var n=0;7>n;n++)b.labels[n]={abbr:a(l[n].date,e.formatDayHeader),full:a(l[n].date,"EEEE")};if(b.title=a(e.activeDate,e.formatDayTitle),b.rows=e.split(l,7),b.showWeeks){b.weekNumbers=[];for(var o=h(b.rows[0][0].date),p=b.rows.length;b.weekNumbers.push(o++)<p;);}},e.compare=function(a,b){return new Date(a.getFullYear(),a.getMonth(),a.getDate())-new 
Date(b.getFullYear(),b.getMonth(),b.getDate())},e.handleKeyDown=function(a){var b=e.activeDate.getDate();if("left"===a)b-=1;else if("up"===a)b-=7;else if("right"===a)b+=1;else if("down"===a)b+=7;else if("pageup"===a||"pagedown"===a){var c=e.activeDate.getMonth()+("pageup"===a?-1:1);e.activeDate.setMonth(c,1),b=Math.min(f(e.activeDate.getFullYear(),e.activeDate.getMonth()),b)}else"home"===a?b=1:"end"===a&&(b=f(e.activeDate.getFullYear(),e.activeDate.getMonth()));e.activeDate.setDate(b)},e.refreshView()}}}]).directive("monthpicker",["dateFilter",function(a){return{restrict:"EA",replace:!0,templateUrl:"template/datepicker/month.html",require:"^datepicker",link:function(b,c,d,e){e.step={years:1},e.element=c,e._refreshView=function(){for(var c=new Array(12),d=e.activeDate.getFullYear(),f=0;12>f;f++)c[f]=angular.extend(e.createDateObject(new Date(d,f,1),e.formatMonth),{uid:b.uniqueId+"-"+f});b.title=a(e.activeDate,e.formatMonthTitle),b.rows=e.split(c,3)},e.compare=function(a,b){return new Date(a.getFullYear(),a.getMonth())-new Date(b.getFullYear(),b.getMonth())},e.handleKeyDown=function(a){var b=e.activeDate.getMonth();if("left"===a)b-=1;else if("up"===a)b-=3;else if("right"===a)b+=1;else if("down"===a)b+=3;else if("pageup"===a||"pagedown"===a){var c=e.activeDate.getFullYear()+("pageup"===a?-1:1);e.activeDate.setFullYear(c)}else"home"===a?b=0:"end"===a&&(b=11);e.activeDate.setMonth(b)},e.refreshView()}}}]).directive("yearpicker",["dateFilter",function(){return{restrict:"EA",replace:!0,templateUrl:"template/datepicker/year.html",require:"^datepicker",link:function(a,b,c,d){function e(a){return parseInt((a-1)/f,10)*f+1}var f=d.yearRange;d.step={years:f},d.element=b,d._refreshView=function(){for(var b=new Array(f),c=0,g=e(d.activeDate.getFullYear());f>c;c++)b[c]=angular.extend(d.createDateObject(new Date(g+c,0,1),d.formatYear),{uid:a.uniqueId+"-"+c});a.title=[b[0].label,b[f-1].label].join(" - "),a.rows=d.split(b,5)},d.compare=function(a,b){return a.getFullYear()-b.getFullYear()},d.handleKeyDown=function(a){var b=d.activeDate.getFullYear();"left"===a?b-=1:"up"===a?b-=5:"right"===a?b+=1:"down"===a?b+=5:"pageup"===a||"pagedown"===a?b+=("pageup"===a?-1:1)*d.step.years:"home"===a?b=e(d.activeDate.getFullYear()):"end"===a&&(b=e(d.activeDate.getFullYear())+f-1),d.activeDate.setFullYear(b)},d.refreshView()}}}]).constant("datepickerPopupConfig",{datepickerPopup:"yyyy-MM-dd",currentText:"Today",clearText:"Clear",closeText:"Done",closeOnDateSelection:!0,appendToBody:!1,showButtonBar:!0}).directive("datepickerPopup",["$compile","$parse","$document","$position","dateFilter","dateParser","datepickerPopupConfig",function(a,b,c,d,e,f,g){return{restrict:"EA",require:"ngModel",scope:{isOpen:"=?",currentText:"@",clearText:"@",closeText:"@",dateDisabled:"&"},link:function(h,i,j,k){function l(a){return a.replace(/([A-Z])/g,function(a){return"-"+a.toLowerCase()})}function m(a){if(a){if(angular.isDate(a)&&!isNaN(a))return k.$setValidity("date",!0),a;if(angular.isString(a)){var b=f.parse(a,n)||new Date(a);return isNaN(b)?void k.$setValidity("date",!1):(k.$setValidity("date",!0),b)}return void k.$setValidity("date",!1)}return k.$setValidity("date",!0),null}var n,o=angular.isDefined(j.closeOnDateSelection)?h.$parent.$eval(j.closeOnDateSelection):g.closeOnDateSelection,p=angular.isDefined(j.datepickerAppendToBody)?h.$parent.$eval(j.datepickerAppendToBody):g.appendToBody;h.showButtonBar=angular.isDefined(j.showButtonBar)?h.$parent.$eval(j.showButtonBar):g.showButtonBar,h.getText=function(a){return 
h[a+"Text"]||g[a+"Text"]},j.$observe("datepickerPopup",function(a){n=a||g.datepickerPopup,k.$render()});var q=angular.element("<div datepicker-popup-wrap><div datepicker></div></div>");q.attr({"ng-model":"date","ng-change":"dateSelection()"});var r=angular.element(q.children()[0]);j.datepickerOptions&&angular.forEach(h.$parent.$eval(j.datepickerOptions),function(a,b){r.attr(l(b),a)}),angular.forEach(["minDate","maxDate"],function(a){j[a]&&(h.$parent.$watch(b(j[a]),function(b){h[a]=b}),r.attr(l(a),a))}),j.dateDisabled&&r.attr("date-disabled","dateDisabled({ date: date, mode: mode })"),k.$parsers.unshift(m),h.dateSelection=function(a){angular.isDefined(a)&&(h.date=a),k.$setViewValue(h.date),k.$render(),o&&(h.isOpen=!1,i[0].focus())},i.bind("input change keyup",function(){h.$apply(function(){h.date=k.$modelValue})}),k.$render=function(){var a=k.$viewValue?e(k.$viewValue,n):"";i.val(a),h.date=m(k.$modelValue)};var s=function(a){h.isOpen&&a.target!==i[0]&&h.$apply(function(){h.isOpen=!1})},t=function(a){h.keydown(a)};i.bind("keydown",t),h.keydown=function(a){27===a.which?(a.preventDefault(),a.stopPropagation(),h.close()):40!==a.which||h.isOpen||(h.isOpen=!0)},h.$watch("isOpen",function(a){a?(h.$broadcast("datepicker.focus"),h.position=p?d.offset(i):d.position(i),h.position.top=h.position.top+i.prop("offsetHeight"),c.bind("click",s)):c.unbind("click",s)}),h.select=function(a){if("today"===a){var b=new Date;angular.isDate(k.$modelValue)?(a=new Date(k.$modelValue),a.setFullYear(b.getFullYear(),b.getMonth(),b.getDate())):a=new Date(b.setHours(0,0,0,0))}h.dateSelection(a)},h.close=function(){h.isOpen=!1,i[0].focus()};var u=a(q)(h);p?c.find("body").append(u):i.after(u),h.$on("$destroy",function(){u.remove(),i.unbind("keydown",t),c.unbind("click",s)})}}}]).directive("datepickerPopupWrap",function(){return{restrict:"EA",replace:!0,transclude:!0,templateUrl:"template/datepicker/popup.html",link:function(a,b){b.bind("click",function(a){a.preventDefault(),a.stopPropagation()})}}}),angular.module("ui.bootstrap.dropdown",[]).constant("dropdownConfig",{openClass:"open"}).service("dropdownService",["$document",function(a){var b=null;this.open=function(e){b||(a.bind("click",c),a.bind("keydown",d)),b&&b!==e&&(b.isOpen=!1),b=e},this.close=function(e){b===e&&(b=null,a.unbind("click",c),a.unbind("keydown",d))};var c=function(a){a&&a.isDefaultPrevented()||b.$apply(function(){b.isOpen=!1})},d=function(a){27===a.which&&(b.focusToggleElement(),c())}}]).controller("DropdownController",["$scope","$attrs","$parse","dropdownConfig","dropdownService","$animate",function(a,b,c,d,e,f){var g,h=this,i=a.$new(),j=d.openClass,k=angular.noop,l=b.onToggle?c(b.onToggle):angular.noop;this.init=function(d){h.$element=d,b.isOpen&&(g=c(b.isOpen),k=g.assign,a.$watch(g,function(a){i.isOpen=!!a}))},this.toggle=function(a){return i.isOpen=arguments.length?!!a:!i.isOpen},this.isOpen=function(){return i.isOpen},i.focusToggleElement=function(){h.toggleElement&&h.toggleElement[0].focus()},i.$watch("isOpen",function(b,c){f[b?"addClass":"removeClass"](h.$element,j),b?(i.focusToggleElement(),e.open(i)):e.close(i),k(a,b),angular.isDefined(b)&&b!==c&&l(a,{open:!!b})}),a.$on("$locationChangeSuccess",function(){i.isOpen=!1}),a.$on("$destroy",function(){i.$destroy()})}]).directive("dropdown",function(){return{restrict:"CA",controller:"DropdownController",link:function(a,b,c,d){d.init(b)}}}).directive("dropdownToggle",function(){return{restrict:"CA",require:"?^dropdown",link:function(a,b,c,d){if(d){d.toggleElement=b;var 
e=function(e){e.preventDefault(),b.hasClass("disabled")||c.disabled||a.$apply(function(){d.toggle()})};b.bind("click",e),b.attr({"aria-haspopup":!0,"aria-expanded":!1}),a.$watch(d.isOpen,function(a){b.attr("aria-expanded",!!a)}),a.$on("$destroy",function(){b.unbind("click",e)})}}}}),angular.module("ui.bootstrap.modal",["ui.bootstrap.transition"]).factory("$$stackedMap",function(){return{createNew:function(){var a=[];return{add:function(b,c){a.push({key:b,value:c})},get:function(b){for(var c=0;c<a.length;c++)if(b==a[c].key)return a[c]},keys:function(){for(var b=[],c=0;c<a.length;c++)b.push(a[c].key);return b},top:function(){return a[a.length-1]},remove:function(b){for(var c=-1,d=0;d<a.length;d++)if(b==a[d].key){c=d;break}return a.splice(c,1)[0]},removeTop:function(){return a.splice(a.length-1,1)[0]},length:function(){return a.length}}}}}).directive("modalBackdrop",["$timeout",function(a){return{restrict:"EA",replace:!0,templateUrl:"template/modal/backdrop.html",link:function(b){b.animate=!1,a(function(){b.animate=!0})}}}]).directive("modalWindow",["$modalStack","$timeout",function(a,b){return{restrict:"EA",scope:{index:"@",animate:"="},replace:!0,transclude:!0,templateUrl:function(a,b){return b.templateUrl||"template/modal/window.html"},link:function(c,d,e){d.addClass(e.windowClass||""),c.size=e.size,b(function(){c.animate=!0,d[0].focus()}),c.close=function(b){var c=a.getTop();c&&c.value.backdrop&&"static"!=c.value.backdrop&&b.target===b.currentTarget&&(b.preventDefault(),b.stopPropagation(),a.dismiss(c.key,"backdrop click"))}}}}]).factory("$modalStack",["$transition","$timeout","$document","$compile","$rootScope","$$stackedMap",function(a,b,c,d,e,f){function g(){for(var a=-1,b=n.keys(),c=0;c<b.length;c++)n.get(b[c]).value.backdrop&&(a=c);return a}function h(a){var b=c.find("body").eq(0),d=n.get(a).value;n.remove(a),j(d.modalDomEl,d.modalScope,300,function(){d.modalScope.$destroy(),b.toggleClass(m,n.length()>0),i()})}function i(){if(k&&-1==g()){var a=l;j(k,l,150,function(){a.$destroy(),a=null}),k=void 0,l=void 0}}function j(c,d,e,f){function g(){g.done||(g.done=!0,c.remove(),f&&f())}d.animate=!1;var h=a.transitionEndEventName;if(h){var i=b(g,e);c.bind(h,function(){b.cancel(i),g(),d.$apply()})}else b(g,0)}var k,l,m="modal-open",n=f.createNew(),o={};return e.$watch(g,function(a){l&&(l.index=a)}),c.bind("keydown",function(a){var b;27===a.which&&(b=n.top(),b&&b.value.keyboard&&(a.preventDefault(),e.$apply(function(){o.dismiss(b.key,"escape key press")})))}),o.open=function(a,b){n.add(a,{deferred:b.deferred,modalScope:b.scope,backdrop:b.backdrop,keyboard:b.keyboard});var f=c.find("body").eq(0),h=g();h>=0&&!k&&(l=e.$new(!0),l.index=h,k=d("<div modal-backdrop></div>")(l),f.append(k));var i=angular.element("<div modal-window></div>");i.attr({"template-url":b.windowTemplateUrl,"window-class":b.windowClass,size:b.size,index:n.length()-1,animate:"animate"}).html(b.content);var j=d(i)(b.scope);n.top().value.modalDomEl=j,f.append(j),f.addClass(m)},o.close=function(a,b){var c=n.get(a).value;c&&(c.deferred.resolve(b),h(a))},o.dismiss=function(a,b){var c=n.get(a).value;c&&(c.deferred.reject(b),h(a))},o.dismissAll=function(a){for(var b=this.getTop();b;)this.dismiss(b.key,a),b=this.getTop()},o.getTop=function(){return n.top()},o}]).provider("$modal",function(){var a={options:{backdrop:!0,keyboard:!0},$get:["$injector","$rootScope","$q","$http","$templateCache","$controller","$modalStack",function(b,c,d,e,f,g,h){function i(a){return 
a.template?d.when(a.template):e.get(a.templateUrl,{cache:f}).then(function(a){return a.data})}function j(a){var c=[];return angular.forEach(a,function(a){(angular.isFunction(a)||angular.isArray(a))&&c.push(d.when(b.invoke(a)))}),c}var k={};return k.open=function(b){var e=d.defer(),f=d.defer(),k={result:e.promise,opened:f.promise,close:function(a){h.close(k,a)},dismiss:function(a){h.dismiss(k,a)}};if(b=angular.extend({},a.options,b),b.resolve=b.resolve||{},!b.template&&!b.templateUrl)throw new Error("One of template or templateUrl options is required.");var l=d.all([i(b)].concat(j(b.resolve)));return l.then(function(a){var d=(b.scope||c).$new();d.$close=k.close,d.$dismiss=k.dismiss;var f,i={},j=1;b.controller&&(i.$scope=d,i.$modalInstance=k,angular.forEach(b.resolve,function(b,c){i[c]=a[j++]}),f=g(b.controller,i)),h.open(k,{scope:d,deferred:e,content:a[0],backdrop:b.backdrop,keyboard:b.keyboard,windowClass:b.windowClass,windowTemplateUrl:b.windowTemplateUrl,size:b.size})},function(a){e.reject(a)}),l.then(function(){f.resolve(!0)},function(){f.reject(!1)}),k},k}]};return a}),angular.module("ui.bootstrap.pagination",[]).controller("PaginationController",["$scope","$attrs","$parse",function(a,b,c){var d=this,e={$setViewValue:angular.noop},f=b.numPages?c(b.numPages).assign:angular.noop;this.init=function(f,g){e=f,this.config=g,e.$render=function(){d.render()},b.itemsPerPage?a.$parent.$watch(c(b.itemsPerPage),function(b){d.itemsPerPage=parseInt(b,10),a.totalPages=d.calculateTotalPages()}):this.itemsPerPage=g.itemsPerPage},this.calculateTotalPages=function(){var b=this.itemsPerPage<1?1:Math.ceil(a.totalItems/this.itemsPerPage);return Math.max(b||0,1)},this.render=function(){a.page=parseInt(e.$viewValue,10)||1},a.selectPage=function(b){a.page!==b&&b>0&&b<=a.totalPages&&(e.$setViewValue(b),e.$render())},a.getText=function(b){return a[b+"Text"]||d.config[b+"Text"]},a.noPrevious=function(){return 1===a.page},a.noNext=function(){return a.page===a.totalPages},a.$watch("totalItems",function(){a.totalPages=d.calculateTotalPages()}),a.$watch("totalPages",function(b){f(a.$parent,b),a.page>b?a.selectPage(b):e.$render()})}]).constant("paginationConfig",{itemsPerPage:10,boundaryLinks:!1,directionLinks:!0,firstText:"First",previousText:"Previous",nextText:"Next",lastText:"Last",rotate:!0}).directive("pagination",["$parse","paginationConfig",function(a,b){return{restrict:"EA",scope:{totalItems:"=",firstText:"@",previousText:"@",nextText:"@",lastText:"@"},require:["pagination","?ngModel"],controller:"PaginationController",templateUrl:"template/pagination/pagination.html",replace:!0,link:function(c,d,e,f){function g(a,b,c){return{number:a,text:b,active:c}}function h(a,b){var c=[],d=1,e=b,f=angular.isDefined(k)&&b>k;f&&(l?(d=Math.max(a-Math.floor(k/2),1),e=d+k-1,e>b&&(e=b,d=e-k+1)):(d=(Math.ceil(a/k)-1)*k+1,e=Math.min(d+k-1,b)));for(var h=d;e>=h;h++){var i=g(h,h,h===a);c.push(i)}if(f&&!l){if(d>1){var j=g(d-1,"...",!1);c.unshift(j)}if(b>e){var m=g(e+1,"...",!1);c.push(m)}}return c}var i=f[0],j=f[1];if(j){var k=angular.isDefined(e.maxSize)?c.$parent.$eval(e.maxSize):b.maxSize,l=angular.isDefined(e.rotate)?c.$parent.$eval(e.rotate):b.rotate;c.boundaryLinks=angular.isDefined(e.boundaryLinks)?c.$parent.$eval(e.boundaryLinks):b.boundaryLinks,c.directionLinks=angular.isDefined(e.directionLinks)?c.$parent.$eval(e.directionLinks):b.directionLinks,i.init(j,b),e.maxSize&&c.$parent.$watch(a(e.maxSize),function(a){k=parseInt(a,10),i.render()});var 
m=i.render;i.render=function(){m(),c.page>0&&c.page<=c.totalPages&&(c.pages=h(c.page,c.totalPages))}}}}}]).constant("pagerConfig",{itemsPerPage:10,previousText:"« Previous",nextText:"Next »",align:!0}).directive("pager",["pagerConfig",function(a){return{restrict:"EA",scope:{totalItems:"=",previousText:"@",nextText:"@"},require:["pager","?ngModel"],controller:"PaginationController",templateUrl:"template/pagination/pager.html",replace:!0,link:function(b,c,d,e){var f=e[0],g=e[1];g&&(b.align=angular.isDefined(d.align)?b.$parent.$eval(d.align):a.align,f.init(g,a))}}}]),angular.module("ui.bootstrap.tooltip",["ui.bootstrap.position","ui.bootstrap.bindHtml"]).provider("$tooltip",function(){function a(a){var b=/[A-Z]/g,c="-";
+return a.replace(b,function(a,b){return(b?c:"")+a.toLowerCase()})}var b={placement:"top",animation:!0,popupDelay:0},c={mouseenter:"mouseleave",click:"click",focus:"blur"},d={};this.options=function(a){angular.extend(d,a)},this.setTriggers=function(a){angular.extend(c,a)},this.$get=["$window","$compile","$timeout","$parse","$document","$position","$interpolate",function(e,f,g,h,i,j,k){return function(e,l,m){function n(a){var b=a||o.trigger||m,d=c[b]||b;return{show:b,hide:d}}var o=angular.extend({},b,d),p=a(e),q=k.startSymbol(),r=k.endSymbol(),s="<div "+p+'-popup title="'+q+"tt_title"+r+'" content="'+q+"tt_content"+r+'" placement="'+q+"tt_placement"+r+'" animation="tt_animation" is-open="tt_isOpen"></div>';return{restrict:"EA",scope:!0,compile:function(){var a=f(s);return function(b,c,d){function f(){b.tt_isOpen?m():k()}function k(){(!y||b.$eval(d[l+"Enable"]))&&(b.tt_popupDelay?v||(v=g(p,b.tt_popupDelay,!1),v.then(function(a){a()})):p()())}function m(){b.$apply(function(){q()})}function p(){return v=null,u&&(g.cancel(u),u=null),b.tt_content?(r(),t.css({top:0,left:0,display:"block"}),w?i.find("body").append(t):c.after(t),z(),b.tt_isOpen=!0,b.$digest(),z):angular.noop}function q(){b.tt_isOpen=!1,g.cancel(v),v=null,b.tt_animation?u||(u=g(s,500)):s()}function r(){t&&s(),t=a(b,function(){}),b.$digest()}function s(){u=null,t&&(t.remove(),t=null)}var t,u,v,w=angular.isDefined(o.appendToBody)?o.appendToBody:!1,x=n(void 0),y=angular.isDefined(d[l+"Enable"]),z=function(){var a=j.positionElements(c,t,b.tt_placement,w);a.top+="px",a.left+="px",t.css(a)};b.tt_isOpen=!1,d.$observe(e,function(a){b.tt_content=a,!a&&b.tt_isOpen&&q()}),d.$observe(l+"Title",function(a){b.tt_title=a}),d.$observe(l+"Placement",function(a){b.tt_placement=angular.isDefined(a)?a:o.placement}),d.$observe(l+"PopupDelay",function(a){var c=parseInt(a,10);b.tt_popupDelay=isNaN(c)?o.popupDelay:c});var A=function(){c.unbind(x.show,k),c.unbind(x.hide,m)};d.$observe(l+"Trigger",function(a){A(),x=n(a),x.show===x.hide?c.bind(x.show,f):(c.bind(x.show,k),c.bind(x.hide,m))});var B=b.$eval(d[l+"Animation"]);b.tt_animation=angular.isDefined(B)?!!B:o.animation,d.$observe(l+"AppendToBody",function(a){w=angular.isDefined(a)?h(a)(b):w}),w&&b.$on("$locationChangeSuccess",function(){b.tt_isOpen&&q()}),b.$on("$destroy",function(){g.cancel(u),g.cancel(v),A(),s()})}}}}}]}).directive("tooltipPopup",function(){return{restrict:"EA",replace:!0,scope:{content:"@",placement:"@",animation:"&",isOpen:"&"},templateUrl:"template/tooltip/tooltip-popup.html"}}).directive("tooltip",["$tooltip",function(a){return a("tooltip","tooltip","mouseenter")}]).directive("tooltipHtmlUnsafePopup",function(){return{restrict:"EA",replace:!0,scope:{content:"@",placement:"@",animation:"&",isOpen:"&"},templateUrl:"template/tooltip/tooltip-html-unsafe-popup.html"}}).directive("tooltipHtmlUnsafe",["$tooltip",function(a){return a("tooltipHtmlUnsafe","tooltip","mouseenter")}]),angular.module("ui.bootstrap.popover",["ui.bootstrap.tooltip"]).directive("popoverPopup",function(){return{restrict:"EA",replace:!0,scope:{title:"@",content:"@",placement:"@",animation:"&",isOpen:"&"},templateUrl:"template/popover/popover.html"}}).directive("popover",["$tooltip",function(a){return a("popover","popover","click")}]),angular.module("ui.bootstrap.progressbar",[]).constant("progressConfig",{animate:!0,max:100}).controller("ProgressController",["$scope","$attrs","progressConfig",function(a,b,c){var 
d=this,e=angular.isDefined(b.animate)?a.$parent.$eval(b.animate):c.animate;this.bars=[],a.max=angular.isDefined(b.max)?a.$parent.$eval(b.max):c.max,this.addBar=function(b,c){e||c.css({transition:"none"}),this.bars.push(b),b.$watch("value",function(c){b.percent=+(100*c/a.max).toFixed(2)}),b.$on("$destroy",function(){c=null,d.removeBar(b)})},this.removeBar=function(a){this.bars.splice(this.bars.indexOf(a),1)}}]).directive("progress",function(){return{restrict:"EA",replace:!0,transclude:!0,controller:"ProgressController",require:"progress",scope:{},templateUrl:"template/progressbar/progress.html"}}).directive("bar",function(){return{restrict:"EA",replace:!0,transclude:!0,require:"^progress",scope:{value:"=",type:"@"},templateUrl:"template/progressbar/bar.html",link:function(a,b,c,d){d.addBar(a,b)}}}).directive("progressbar",function(){return{restrict:"EA",replace:!0,transclude:!0,controller:"ProgressController",scope:{value:"=",type:"@"},templateUrl:"template/progressbar/progressbar.html",link:function(a,b,c,d){d.addBar(a,angular.element(b.children()[0]))}}}),angular.module("ui.bootstrap.rating",[]).constant("ratingConfig",{max:5,stateOn:null,stateOff:null}).controller("RatingController",["$scope","$attrs","ratingConfig",function(a,b,c){var d={$setViewValue:angular.noop};this.init=function(e){d=e,d.$render=this.render,this.stateOn=angular.isDefined(b.stateOn)?a.$parent.$eval(b.stateOn):c.stateOn,this.stateOff=angular.isDefined(b.stateOff)?a.$parent.$eval(b.stateOff):c.stateOff;var f=angular.isDefined(b.ratingStates)?a.$parent.$eval(b.ratingStates):new Array(angular.isDefined(b.max)?a.$parent.$eval(b.max):c.max);a.range=this.buildTemplateObjects(f)},this.buildTemplateObjects=function(a){for(var b=0,c=a.length;c>b;b++)a[b]=angular.extend({index:b},{stateOn:this.stateOn,stateOff:this.stateOff},a[b]);return a},a.rate=function(b){!a.readonly&&b>=0&&b<=a.range.length&&(d.$setViewValue(b),d.$render())},a.enter=function(b){a.readonly||(a.value=b),a.onHover({value:b})},a.reset=function(){a.value=d.$viewValue,a.onLeave()},a.onKeydown=function(b){/(37|38|39|40)/.test(b.which)&&(b.preventDefault(),b.stopPropagation(),a.rate(a.value+(38===b.which||39===b.which?1:-1)))},this.render=function(){a.value=d.$viewValue}}]).directive("rating",function(){return{restrict:"EA",require:["rating","ngModel"],scope:{readonly:"=?",onHover:"&",onLeave:"&"},controller:"RatingController",templateUrl:"template/rating/rating.html",replace:!0,link:function(a,b,c,d){var e=d[0],f=d[1];f&&e.init(f)}}}),angular.module("ui.bootstrap.tabs",[]).controller("TabsetController",["$scope",function(a){var b=this,c=b.tabs=a.tabs=[];b.select=function(a){angular.forEach(c,function(b){b.active&&b!==a&&(b.active=!1,b.onDeselect())}),a.active=!0,a.onSelect()},b.addTab=function(a){c.push(a),1===c.length?a.active=!0:a.active&&b.select(a)},b.removeTab=function(a){var d=c.indexOf(a);if(a.active&&c.length>1){var 
e=d==c.length-1?d-1:d+1;b.select(c[e])}c.splice(d,1)}}]).directive("tabset",function(){return{restrict:"EA",transclude:!0,replace:!0,scope:{type:"@"},controller:"TabsetController",templateUrl:"template/tabs/tabset.html",link:function(a,b,c){a.vertical=angular.isDefined(c.vertical)?a.$parent.$eval(c.vertical):!1,a.justified=angular.isDefined(c.justified)?a.$parent.$eval(c.justified):!1}}}).directive("tab",["$parse",function(a){return{require:"^tabset",restrict:"EA",replace:!0,templateUrl:"template/tabs/tab.html",transclude:!0,scope:{active:"=?",heading:"@",onSelect:"&select",onDeselect:"&deselect"},controller:function(){},compile:function(b,c,d){return function(b,c,e,f){b.$watch("active",function(a){a&&f.select(b)}),b.disabled=!1,e.disabled&&b.$parent.$watch(a(e.disabled),function(a){b.disabled=!!a}),b.select=function(){b.disabled||(b.active=!0)},f.addTab(b),b.$on("$destroy",function(){f.removeTab(b)}),b.$transcludeFn=d}}}}]).directive("tabHeadingTransclude",[function(){return{restrict:"A",require:"^tab",link:function(a,b){a.$watch("headingElement",function(a){a&&(b.html(""),b.append(a))})}}}]).directive("tabContentTransclude",function(){function a(a){return a.tagName&&(a.hasAttribute("tab-heading")||a.hasAttribute("data-tab-heading")||"tab-heading"===a.tagName.toLowerCase()||"data-tab-heading"===a.tagName.toLowerCase())}return{restrict:"A",require:"^tabset",link:function(b,c,d){var e=b.$eval(d.tabContentTransclude);e.$transcludeFn(e.$parent,function(b){angular.forEach(b,function(b){a(b)?e.headingElement=b:c.append(b)})})}}}),angular.module("ui.bootstrap.timepicker",[]).constant("timepickerConfig",{hourStep:1,minuteStep:1,showMeridian:!0,meridians:null,readonlyInput:!1,mousewheel:!0}).controller("TimepickerController",["$scope","$attrs","$parse","$log","$locale","timepickerConfig",function(a,b,c,d,e,f){function g(){var b=parseInt(a.hours,10),c=a.showMeridian?b>0&&13>b:b>=0&&24>b;return c?(a.showMeridian&&(12===b&&(b=0),a.meridian===p[1]&&(b+=12)),b):void 0}function h(){var b=parseInt(a.minutes,10);return b>=0&&60>b?b:void 0}function i(a){return angular.isDefined(a)&&a.toString().length<2?"0"+a:a}function j(a){k(),o.$setViewValue(new Date(n)),l(a)}function k(){o.$setValidity("time",!0),a.invalidHours=!1,a.invalidMinutes=!1}function l(b){var c=n.getHours(),d=n.getMinutes();a.showMeridian&&(c=0===c||12===c?12:c%12),a.hours="h"===b?c:i(c),a.minutes="m"===b?d:i(d),a.meridian=n.getHours()<12?p[0]:p[1]}function m(a){var b=new Date(n.getTime()+6e4*a);n.setHours(b.getHours(),b.getMinutes()),j()}var n=new Date,o={$setViewValue:angular.noop},p=angular.isDefined(b.meridians)?a.$parent.$eval(b.meridians):f.meridians||e.DATETIME_FORMATS.AMPMS;this.init=function(c,d){o=c,o.$render=this.render;var e=d.eq(0),g=d.eq(1),h=angular.isDefined(b.mousewheel)?a.$parent.$eval(b.mousewheel):f.mousewheel;h&&this.setupMousewheelEvents(e,g),a.readonlyInput=angular.isDefined(b.readonlyInput)?a.$parent.$eval(b.readonlyInput):f.readonlyInput,this.setupInputEvents(e,g)};var q=f.hourStep;b.hourStep&&a.$parent.$watch(c(b.hourStep),function(a){q=parseInt(a,10)});var r=f.minuteStep;b.minuteStep&&a.$parent.$watch(c(b.minuteStep),function(a){r=parseInt(a,10)}),a.showMeridian=f.showMeridian,b.showMeridian&&a.$parent.$watch(c(b.showMeridian),function(b){if(a.showMeridian=!!b,o.$error.time){var c=g(),d=h();angular.isDefined(c)&&angular.isDefined(d)&&(n.setHours(c),j())}else l()}),this.setupMousewheelEvents=function(b,c){var d=function(a){a.originalEvent&&(a=a.originalEvent);var b=a.wheelDelta?a.wheelDelta:-a.deltaY;return 
a.detail||b>0};b.bind("mousewheel wheel",function(b){a.$apply(d(b)?a.incrementHours():a.decrementHours()),b.preventDefault()}),c.bind("mousewheel wheel",function(b){a.$apply(d(b)?a.incrementMinutes():a.decrementMinutes()),b.preventDefault()})},this.setupInputEvents=function(b,c){if(a.readonlyInput)return a.updateHours=angular.noop,void(a.updateMinutes=angular.noop);var d=function(b,c){o.$setViewValue(null),o.$setValidity("time",!1),angular.isDefined(b)&&(a.invalidHours=b),angular.isDefined(c)&&(a.invalidMinutes=c)};a.updateHours=function(){var a=g();angular.isDefined(a)?(n.setHours(a),j("h")):d(!0)},b.bind("blur",function(){!a.invalidHours&&a.hours<10&&a.$apply(function(){a.hours=i(a.hours)})}),a.updateMinutes=function(){var a=h();angular.isDefined(a)?(n.setMinutes(a),j("m")):d(void 0,!0)},c.bind("blur",function(){!a.invalidMinutes&&a.minutes<10&&a.$apply(function(){a.minutes=i(a.minutes)})})},this.render=function(){var a=o.$modelValue?new Date(o.$modelValue):null;isNaN(a)?(o.$setValidity("time",!1),d.error('Timepicker directive: "ng-model" value must be a Date object, a number of milliseconds since 01.01.1970 or a string representing an RFC2822 or ISO 8601 date.')):(a&&(n=a),k(),l())},a.incrementHours=function(){m(60*q)},a.decrementHours=function(){m(60*-q)},a.incrementMinutes=function(){m(r)},a.decrementMinutes=function(){m(-r)},a.toggleMeridian=function(){m(720*(n.getHours()<12?1:-1))}}]).directive("timepicker",function(){return{restrict:"EA",require:["timepicker","?^ngModel"],controller:"TimepickerController",replace:!0,scope:{},templateUrl:"template/timepicker/timepicker.html",link:function(a,b,c,d){var e=d[0],f=d[1];f&&e.init(f,b.find("input"))}}}),angular.module("ui.bootstrap.typeahead",["ui.bootstrap.position","ui.bootstrap.bindHtml"]).factory("typeaheadParser",["$parse",function(a){var b=/^\s*(.*?)(?:\s+as\s+(.*?))?\s+for\s+(?:([\$\w][\$\w\d]*))\s+in\s+(.*)$/;return{parse:function(c){var d=c.match(b);if(!d)throw new Error('Expected typeahead specification in form of "_modelValue_ (as _label_)? 
for _item_ in _collection_" but got "'+c+'".');return{itemName:d[3],source:a(d[4]),viewMapper:a(d[2]||d[1]),modelMapper:a(d[1])}}}}]).directive("typeahead",["$compile","$parse","$q","$timeout","$document","$position","typeaheadParser",function(a,b,c,d,e,f,g){var h=[9,13,27,38,40];return{require:"ngModel",link:function(i,j,k,l){var m,n=i.$eval(k.typeaheadMinLength)||1,o=i.$eval(k.typeaheadWaitMs)||0,p=i.$eval(k.typeaheadEditable)!==!1,q=b(k.typeaheadLoading).assign||angular.noop,r=b(k.typeaheadOnSelect),s=k.typeaheadInputFormatter?b(k.typeaheadInputFormatter):void 0,t=k.typeaheadAppendToBody?i.$eval(k.typeaheadAppendToBody):!1,u=b(k.ngModel).assign,v=g.parse(k.typeahead),w=i.$new();i.$on("$destroy",function(){w.$destroy()});var x="typeahead-"+w.$id+"-"+Math.floor(1e4*Math.random());j.attr({"aria-autocomplete":"list","aria-expanded":!1,"aria-owns":x});var y=angular.element("<div typeahead-popup></div>");y.attr({id:x,matches:"matches",active:"activeIdx",select:"select(activeIdx)",query:"query",position:"position"}),angular.isDefined(k.typeaheadTemplateUrl)&&y.attr("template-url",k.typeaheadTemplateUrl);var z=function(){w.matches=[],w.activeIdx=-1,j.attr("aria-expanded",!1)},A=function(a){return x+"-option-"+a};w.$watch("activeIdx",function(a){0>a?j.removeAttr("aria-activedescendant"):j.attr("aria-activedescendant",A(a))});var B=function(a){var b={$viewValue:a};q(i,!0),c.when(v.source(i,b)).then(function(c){var d=a===l.$viewValue;if(d&&m)if(c.length>0){w.activeIdx=0,w.matches.length=0;for(var e=0;e<c.length;e++)b[v.itemName]=c[e],w.matches.push({id:A(e),label:v.viewMapper(w,b),model:c[e]});w.query=a,w.position=t?f.offset(j):f.position(j),w.position.top=w.position.top+j.prop("offsetHeight"),j.attr("aria-expanded",!0)}else z();d&&q(i,!1)},function(){z(),q(i,!1)})};z(),w.query=void 0;var C;l.$parsers.unshift(function(a){return m=!0,a&&a.length>=n?o>0?(C&&d.cancel(C),C=d(function(){B(a)},o)):B(a):(q(i,!1),z()),p?a:a?void l.$setValidity("editable",!1):(l.$setValidity("editable",!0),a)}),l.$formatters.push(function(a){var b,c,d={};return s?(d.$model=a,s(i,d)):(d[v.itemName]=a,b=v.viewMapper(i,d),d[v.itemName]=void 0,c=v.viewMapper(i,d),b!==c?b:a)}),w.select=function(a){var b,c,e={};e[v.itemName]=c=w.matches[a].model,b=v.modelMapper(i,e),u(i,b),l.$setValidity("editable",!0),r(i,{$item:c,$model:b,$label:v.viewMapper(i,e)}),z(),d(function(){j[0].focus()},0,!1)},j.bind("keydown",function(a){0!==w.matches.length&&-1!==h.indexOf(a.which)&&(a.preventDefault(),40===a.which?(w.activeIdx=(w.activeIdx+1)%w.matches.length,w.$digest()):38===a.which?(w.activeIdx=(w.activeIdx?w.activeIdx:w.matches.length)-1,w.$digest()):13===a.which||9===a.which?w.$apply(function(){w.select(w.activeIdx)}):27===a.which&&(a.stopPropagation(),z(),w.$digest()))}),j.bind("blur",function(){m=!1});var D=function(a){j[0]!==a.target&&(z(),w.$digest())};e.bind("click",D),i.$on("$destroy",function(){e.unbind("click",D)});var E=a(y)(w);t?e.find("body").append(E):j.after(E)}}}]).directive("typeaheadPopup",function(){return{restrict:"EA",scope:{matches:"=",query:"=",active:"=",position:"=",select:"&"},replace:!0,templateUrl:"template/typeahead/typeahead-popup.html",link:function(a,b,c){a.templateUrl=c.templateUrl,a.isOpen=function(){return a.matches.length>0},a.isActive=function(b){return 
a.active==b},a.selectActive=function(b){a.active=b},a.selectMatch=function(b){a.select({activeIdx:b})}}}}).directive("typeaheadMatch",["$http","$templateCache","$compile","$parse",function(a,b,c,d){return{restrict:"EA",scope:{index:"=",match:"=",query:"="},link:function(e,f,g){var h=d(g.templateUrl)(e.$parent)||"template/typeahead/typeahead-match.html";a.get(h,{cache:b}).success(function(a){f.replaceWith(c(a.trim())(e))})}}}]).filter("typeaheadHighlight",function(){function a(a){return a.replace(/([.?*+^$[\]\\(){}|-])/g,"\\$1")}return function(b,c){return c?(""+b).replace(new RegExp(a(c),"gi"),"<strong>$&</strong>"):b}}),angular.module("template/accordion/accordion-group.html",[]).run(["$templateCache",function(a){a.put("template/accordion/accordion-group.html",'<div class="panel panel-default">\n <div class="panel-heading">\n <h4 class="panel-title">\n <a class="accordion-toggle" ng-click="toggleOpen()" accordion-transclude="heading"><span ng-class="{\'text-muted\': isDisabled}">{{heading}}</span></a>\n </h4>\n </div>\n <div class="panel-collapse" collapse="!isOpen">\n <div class="panel-body" ng-transclude></div>\n </div>\n</div>')}]),angular.module("template/accordion/accordion.html",[]).run(["$templateCache",function(a){a.put("template/accordion/accordion.html",'<div class="panel-group" ng-transclude></div>')}]),angular.module("template/alert/alert.html",[]).run(["$templateCache",function(a){a.put("template/alert/alert.html",'<div class="alert" ng-class="{\'alert-{{type || \'warning\'}}\': true, \'alert-dismissable\': closeable}" role="alert">\n <button ng-show="closeable" type="button" class="close" ng-click="close()">\n <span aria-hidden="true">×</span>\n <span class="sr-only">Close</span>\n </button>\n <div ng-transclude></div>\n</div>\n')}]),angular.module("template/carousel/carousel.html",[]).run(["$templateCache",function(a){a.put("template/carousel/carousel.html",'<div ng-mouseenter="pause()" ng-mouseleave="play()" class="carousel" ng-swipe-right="prev()" ng-swipe-left="next()">\n <ol class="carousel-indicators" ng-show="slides.length > 1">\n <li ng-repeat="slide in slides track by $index" ng-class="{active: isActive(slide)}" ng-click="select(slide)"></li>\n </ol>\n <div class="carousel-inner" ng-transclude></div>\n <a class="left carousel-control" ng-click="prev()" ng-show="slides.length > 1"><span class="glyphicon glyphicon-chevron-left"></span></a>\n <a class="right carousel-control" ng-click="next()" ng-show="slides.length > 1"><span class="glyphicon glyphicon-chevron-right"></span></a>\n</div>\n')}]),angular.module("template/carousel/slide.html",[]).run(["$templateCache",function(a){a.put("template/carousel/slide.html","<div ng-class=\"{\n 'active': leaving || (active && !entering),\n 'prev': (next || active) && direction=='prev',\n 'next': (next || active) && direction=='next',\n 'right': direction=='prev',\n 'left': direction=='next'\n }\" class=\"item text-center\" ng-transclude></div>\n")}]),angular.module("template/datepicker/datepicker.html",[]).run(["$templateCache",function(a){a.put("template/datepicker/datepicker.html",'<div ng-switch="datepickerMode" role="application" ng-keydown="keydown($event)">\n <daypicker ng-switch-when="day" tabindex="0"></daypicker>\n <monthpicker ng-switch-when="month" tabindex="0"></monthpicker>\n <yearpicker ng-switch-when="year" tabindex="0"></yearpicker>\n</div>')}]),angular.module("template/datepicker/day.html",[]).run(["$templateCache",function(a){a.put("template/datepicker/day.html",'<table role="grid" 
aria-labelledby="{{uniqueId}}-title" aria-activedescendant="{{activeDateId}}">\n <thead>\n <tr>\n <th><button type="button" class="btn btn-default btn-sm pull-left" ng-click="move(-1)" tabindex="-1"><i class="glyphicon glyphicon-chevron-left"></i></button></th>\n <th colspan="{{5 + showWeeks}}"><button id="{{uniqueId}}-title" role="heading" aria-live="assertive" aria-atomic="true" type="button" class="btn btn-default btn-sm" ng-click="toggleMode()" tabindex="-1" style="width:100%;"><strong>{{title}}</strong></button></th>\n <th><button type="button" class="btn btn-default btn-sm pull-right" ng-click="move(1)" tabindex="-1"><i class="glyphicon glyphicon-chevron-right"></i></button></th>\n </tr>\n <tr>\n <th ng-show="showWeeks" class="text-center"></th>\n <th ng-repeat="label in labels track by $index" class="text-center"><small aria-label="{{label.full}}">{{label.abbr}}</small></th>\n </tr>\n </thead>\n <tbody>\n <tr ng-repeat="row in rows track by $index">\n <td ng-show="showWeeks" class="text-center h6"><em>{{ weekNumbers[$index] }}</em></td>\n <td ng-repeat="dt in row track by dt.date" class="text-center" role="gridcell" id="{{dt.uid}}" aria-disabled="{{!!dt.disabled}}">\n <button type="button" style="width:100%;" class="btn btn-default btn-sm" ng-class="{\'btn-info\': dt.selected, active: isActive(dt)}" ng-click="select(dt.date)" ng-disabled="dt.disabled" tabindex="-1"><span ng-class="{\'text-muted\': dt.secondary, \'text-info\': dt.current}">{{dt.label}}</span></button>\n </td>\n </tr>\n </tbody>\n</table>\n')}]),angular.module("template/datepicker/month.html",[]).run(["$templateCache",function(a){a.put("template/datepicker/month.html",'<table role="grid" aria-labelledby="{{uniqueId}}-title" aria-activedescendant="{{activeDateId}}">\n <thead>\n <tr>\n <th><button type="button" class="btn btn-default btn-sm pull-left" ng-click="move(-1)" tabindex="-1"><i class="glyphicon glyphicon-chevron-left"></i></button></th>\n <th><button id="{{uniqueId}}-title" role="heading" aria-live="assertive" aria-atomic="true" type="button" class="btn btn-default btn-sm" ng-click="toggleMode()" tabindex="-1" style="width:100%;"><strong>{{title}}</strong></button></th>\n <th><button type="button" class="btn btn-default btn-sm pull-right" ng-click="move(1)" tabindex="-1"><i class="glyphicon glyphicon-chevron-right"></i></button></th>\n </tr>\n </thead>\n <tbody>\n <tr ng-repeat="row in rows track by $index">\n <td ng-repeat="dt in row track by dt.date" class="text-center" role="gridcell" id="{{dt.uid}}" aria-disabled="{{!!dt.disabled}}">\n <button type="button" style="width:100%;" class="btn btn-default" ng-class="{\'btn-info\': dt.selected, active: isActive(dt)}" ng-click="select(dt.date)" ng-disabled="dt.disabled" tabindex="-1"><span ng-class="{\'text-info\': dt.current}">{{dt.label}}</span></button>\n </td>\n </tr>\n </tbody>\n</table>\n')}]),angular.module("template/datepicker/popup.html",[]).run(["$templateCache",function(a){a.put("template/datepicker/popup.html",'<ul class="dropdown-menu" ng-style="{display: (isOpen && \'block\') || \'none\', top: position.top+\'px\', left: position.left+\'px\'}" ng-keydown="keydown($event)">\n <li ng-transclude></li>\n <li ng-if="showButtonBar" style="padding:10px 9px 2px">\n <span class="btn-group">\n <button type="button" class="btn btn-sm btn-info" ng-click="select(\'today\')">{{ getText(\'current\') }}</button>\n <button type="button" class="btn btn-sm btn-danger" ng-click="select(null)">{{ getText(\'clear\') }}</button>\n </span>\n <button type="button" class="btn 
btn-sm btn-success pull-right" ng-click="close()">{{ getText(\'close\') }}</button>\n </li>\n</ul>\n')}]),angular.module("template/datepicker/year.html",[]).run(["$templateCache",function(a){a.put("template/datepicker/year.html",'<table role="grid" aria-labelledby="{{uniqueId}}-title" aria-activedescendant="{{activeDateId}}">\n <thead>\n <tr>\n <th><button type="button" class="btn btn-default btn-sm pull-left" ng-click="move(-1)" tabindex="-1"><i class="glyphicon glyphicon-chevron-left"></i></button></th>\n <th colspan="3"><button id="{{uniqueId}}-title" role="heading" aria-live="assertive" aria-atomic="true" type="button" class="btn btn-default btn-sm" ng-click="toggleMode()" tabindex="-1" style="width:100%;"><strong>{{title}}</strong></button></th>\n <th><button type="button" class="btn btn-default btn-sm pull-right" ng-click="move(1)" tabindex="-1"><i class="glyphicon glyphicon-chevron-right"></i></button></th>\n </tr>\n </thead>\n <tbody>\n <tr ng-repeat="row in rows track by $index">\n <td ng-repeat="dt in row track by dt.date" class="text-center" role="gridcell" id="{{dt.uid}}" aria-disabled="{{!!dt.disabled}}">\n <button type="button" style="width:100%;" class="btn btn-default" ng-class="{\'btn-info\': dt.selected, active: isActive(dt)}" ng-click="select(dt.date)" ng-disabled="dt.disabled" tabindex="-1"><span ng-class="{\'text-info\': dt.current}">{{dt.label}}</span></button>\n </td>\n </tr>\n </tbody>\n</table>\n')}]),angular.module("template/modal/backdrop.html",[]).run(["$templateCache",function(a){a.put("template/modal/backdrop.html",'<div class="modal-backdrop fade"\n ng-class="{in: animate}"\n ng-style="{\'z-index\': 1040 + (index && 1 || 0) + index*10}"\n></div>\n')}]),angular.module("template/modal/window.html",[]).run(["$templateCache",function(a){a.put("template/modal/window.html",'<div tabindex="-1" role="dialog" class="modal fade" ng-class="{in: animate}" ng-style="{\'z-index\': 1050 + index*10, display: \'block\'}" ng-click="close($event)">\n <div class="modal-dialog" ng-class="{\'modal-sm\': size == \'sm\', \'modal-lg\': size == \'lg\'}"><div class="modal-content" ng-transclude></div></div>\n</div>')}]),angular.module("template/pagination/pager.html",[]).run(["$templateCache",function(a){a.put("template/pagination/pager.html",'<ul class="pager">\n <li ng-class="{disabled: noPrevious(), previous: align}"><a href ng-click="selectPage(page - 1)">{{getText(\'previous\')}}</a></li>\n <li ng-class="{disabled: noNext(), next: align}"><a href ng-click="selectPage(page + 1)">{{getText(\'next\')}}</a></li>\n</ul>')}]),angular.module("template/pagination/pagination.html",[]).run(["$templateCache",function(a){a.put("template/pagination/pagination.html",'<ul class="pagination">\n <li ng-if="boundaryLinks" ng-class="{disabled: noPrevious()}"><a href ng-click="selectPage(1)">{{getText(\'first\')}}</a></li>\n <li ng-if="directionLinks" ng-class="{disabled: noPrevious()}"><a href ng-click="selectPage(page - 1)">{{getText(\'previous\')}}</a></li>\n <li ng-repeat="page in pages track by $index" ng-class="{active: page.active}"><a href ng-click="selectPage(page.number)">{{page.text}}</a></li>\n <li ng-if="directionLinks" ng-class="{disabled: noNext()}"><a href ng-click="selectPage(page + 1)">{{getText(\'next\')}}</a></li>\n <li ng-if="boundaryLinks" ng-class="{disabled: noNext()}"><a href 
ng-click="selectPage(totalPages)">{{getText(\'last\')}}</a></li>\n</ul>')}]),angular.module("template/tooltip/tooltip-html-unsafe-popup.html",[]).run(["$templateCache",function(a){a.put("template/tooltip/tooltip-html-unsafe-popup.html",'<div class="tooltip {{placement}}" ng-class="{ in: isOpen(), fade: animation() }">\n <div class="tooltip-arrow"></div>\n <div class="tooltip-inner" bind-html-unsafe="content"></div>\n</div>\n')}]),angular.module("template/tooltip/tooltip-popup.html",[]).run(["$templateCache",function(a){a.put("template/tooltip/tooltip-popup.html",'<div class="tooltip {{placement}}" ng-class="{ in: isOpen(), fade: animation() }">\n <div class="tooltip-arrow"></div>\n <div class="tooltip-inner" ng-bind="content"></div>\n</div>\n')}]),angular.module("template/popover/popover.html",[]).run(["$templateCache",function(a){a.put("template/popover/popover.html",'<div class="popover {{placement}}" ng-class="{ in: isOpen(), fade: animation() }">\n <div class="arrow"></div>\n\n <div class="popover-inner">\n <h3 class="popover-title" ng-bind="title" ng-show="title"></h3>\n <div class="popover-content" ng-bind="content"></div>\n </div>\n</div>\n')}]),angular.module("template/progressbar/bar.html",[]).run(["$templateCache",function(a){a.put("template/progressbar/bar.html",'<div class="progress-bar" ng-class="type && \'progress-bar-\' + type" role="progressbar" aria-valuenow="{{value}}" aria-valuemin="0" aria-valuemax="{{max}}" ng-style="{width: percent + \'%\'}" aria-valuetext="{{percent | number:0}}%" ng-transclude></div>')}]),angular.module("template/progressbar/progress.html",[]).run(["$templateCache",function(a){a.put("template/progressbar/progress.html",'<div class="progress" ng-transclude></div>')}]),angular.module("template/progressbar/progressbar.html",[]).run(["$templateCache",function(a){a.put("template/progressbar/progressbar.html",'<div class="progress">\n <div class="progress-bar" ng-class="type && \'progress-bar-\' + type" role="progressbar" aria-valuenow="{{value}}" aria-valuemin="0" aria-valuemax="{{max}}" ng-style="{width: percent + \'%\'}" aria-valuetext="{{percent | number:0}}%" ng-transclude></div>\n</div>')}]),angular.module("template/rating/rating.html",[]).run(["$templateCache",function(a){a.put("template/rating/rating.html",'<span ng-mouseleave="reset()" ng-keydown="onKeydown($event)" tabindex="0" role="slider" aria-valuemin="0" aria-valuemax="{{range.length}}" aria-valuenow="{{value}}">\n <i ng-repeat="r in range track by $index" ng-mouseenter="enter($index + 1)" ng-click="rate($index + 1)" class="glyphicon" ng-class="$index < value && (r.stateOn || \'glyphicon-star\') || (r.stateOff || \'glyphicon-star-empty\')">\n <span class="sr-only">({{ $index < value ? 
\'*\' : \' \' }})</span>\n </i>\n</span>')}]),angular.module("template/tabs/tab.html",[]).run(["$templateCache",function(a){a.put("template/tabs/tab.html",'<li ng-class="{active: active, disabled: disabled}">\n <a ng-click="select()" tab-heading-transclude>{{heading}}</a>\n</li>\n')}]),angular.module("template/tabs/tabset-titles.html",[]).run(["$templateCache",function(a){a.put("template/tabs/tabset-titles.html","<ul class=\"nav {{type && 'nav-' + type}}\" ng-class=\"{'nav-stacked': vertical}\">\n</ul>\n")}]),angular.module("template/tabs/tabset.html",[]).run(["$templateCache",function(a){a.put("template/tabs/tabset.html",'\n<div>\n <ul class="nav nav-{{type || \'tabs\'}}" ng-class="{\'nav-stacked\': vertical, \'nav-justified\': justified}" ng-transclude></ul>\n <div class="tab-content">\n <div class="tab-pane" \n ng-repeat="tab in tabs" \n ng-class="{active: tab.active}"\n tab-content-transclude="tab">\n </div>\n </div>\n</div>\n')}]),angular.module("template/timepicker/timepicker.html",[]).run(["$templateCache",function(a){a.put("template/timepicker/timepicker.html",'<table>\n <tbody>\n <tr class="text-center">\n <td><a ng-click="incrementHours()" class="btn btn-link"><span class="glyphicon glyphicon-chevron-up"></span></a></td>\n <td> </td>\n <td><a ng-click="incrementMinutes()" class="btn btn-link"><span class="glyphicon glyphicon-chevron-up"></span></a></td>\n <td ng-show="showMeridian"></td>\n </tr>\n <tr>\n <td style="width:50px;" class="form-group" ng-class="{\'has-error\': invalidHours}">\n <input type="text" ng-model="hours" ng-change="updateHours()" class="form-control text-center" ng-mousewheel="incrementHours()" ng-readonly="readonlyInput" maxlength="2">\n </td>\n <td>:</td>\n <td style="width:50px;" class="form-group" ng-class="{\'has-error\': invalidMinutes}">\n <input type="text" ng-model="minutes" ng-change="updateMinutes()" class="form-control text-center" ng-readonly="readonlyInput" maxlength="2">\n </td>\n <td ng-show="showMeridian"><button type="button" class="btn btn-default text-center" ng-click="toggleMeridian()">{{meridian}}</button></td>\n </tr>\n <tr class="text-center">\n <td><a ng-click="decrementHours()" class="btn btn-link"><span class="glyphicon glyphicon-chevron-down"></span></a></td>\n <td> </td>\n <td><a ng-click="decrementMinutes()" class="btn btn-link"><span class="glyphicon glyphicon-chevron-down"></span></a></td>\n <td ng-show="showMeridian"></td>\n </tr>\n </tbody>\n</table>\n')}]),angular.module("template/typeahead/typeahead-match.html",[]).run(["$templateCache",function(a){a.put("template/typeahead/typeahead-match.html",'<a tabindex="-1" bind-html-unsafe="match.label | typeaheadHighlight:query"></a>')}]),angular.module("template/typeahead/typeahead-popup.html",[]).run(["$templateCache",function(a){a.put("template/typeahead/typeahead-popup.html",'<ul class="dropdown-menu" ng-if="isOpen()" ng-style="{top: position.top+\'px\', left: position.left+\'px\'}" style="display: block;" role="listbox" aria-hidden="{{!isOpen()}}">\n <li ng-repeat="match in matches track by $index" ng-class="{active: isActive($index) }" ng-mouseenter="selectActive($index)" ng-click="selectMatch($index)" role="option" id="{{match.id}}">\n <div typeahead-match index="$index" match="match" query="query" template-url="templateUrl"></div>\n </li>\n</ul>')
+}]);
\ No newline at end of file
diff --git a/bitbake/lib/toaster/toastergui/tables.py b/bitbake/lib/toaster/toastergui/tables.py
new file mode 100644
index 0000000..92e3b5c
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/tables.py
@@ -0,0 +1,453 @@
+#
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2015 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from toastergui.widgets import ToasterTable
+from orm.models import Recipe, ProjectLayer, Layer_Version, Machine, Project
+from django.db.models import Q, Max
+from django.conf.urls import url
+from django.core.urlresolvers import reverse
+from django.views.generic import TemplateView
+
+
+class ProjectFiltersMixin(object):
+ """Common mixin for recipe, machine in project filters"""
+
+ def filter_in_project(self, count_only=False):
+ query = self.queryset.filter(layer_version__in=self.project_layers)
+ if count_only:
+ return query.count()
+
+ self.queryset = query
+
+ def filter_not_in_project(self, count_only=False):
+ query = self.queryset.exclude(layer_version__in=self.project_layers)
+ if count_only:
+ return query.count()
+
+ self.queryset = query
+
+class LayersTable(ToasterTable):
+ """Table of layers in Toaster"""
+
+ def __init__(self, *args, **kwargs):
+ super(LayersTable, self).__init__(*args, **kwargs)
+ self.default_orderby = "layer__name"
+
+ def get_context_data(self, **kwargs):
+ context = super(LayersTable, self).get_context_data(**kwargs)
+
+ project = Project.objects.get(pk=kwargs['pid'])
+
+ context['project'] = project
+ context['projectlayers'] = [prjlayer.layercommit.id for prjlayer in ProjectLayer.objects.filter(project=project)]
+
+ return context
+
+
+ def setup_filters(self, *args, **kwargs):
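+ # The 'in_current_project' filter is attached to the "Add | Delete" column defined in setup_columns() below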
+ project = Project.objects.get(pk=kwargs['pid'])
+ self.project_layers = ProjectLayer.objects.filter(project=project)
+
+
+ self.add_filter(title="Filter by project layers",
+ name="in_current_project",
+ filter_actions=[
+ self.make_filter_action("in_project", "Layers added to this project", self.filter_in_project),
+ self.make_filter_action("not_in_project", "Layers not added to this project", self.filter_not_in_project)
+ ])
+
+ def filter_in_project(self, count_only=False):
+ query = self.queryset.filter(projectlayer__in=self.project_layers)
+ if count_only:
+ return query.count()
+
+ self.queryset = query
+
+ def filter_not_in_project(self, count_only=False):
+ query = self.queryset.exclude(projectlayer__in=self.project_layers)
+ if count_only:
+ return query.count()
+
+ self.queryset = query
+
+
+ def setup_queryset(self, *args, **kwargs):
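+ # Show only layer versions compatible with the project's release, ordered by layer name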
+ prj = Project.objects.get(pk=kwargs['pid'])
+ compatible_layers = prj.compatible_layerversions()
+
+ self.queryset = compatible_layers.order_by(self.default_orderby)
+
+ def setup_columns(self, *args, **kwargs):
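+ # Cell contents are rendered from the Django template snippets below: 'data' is the row object, 'extra' the table's static context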
+
+ layer_link_template = '''
+ <a href="{% url 'layerdetails' extra.pid data.id %}">
+ {{data.layer.name}}
+ </a>
+ '''
+
+ self.add_column(title="Layer",
+ hideable=False,
+ orderable=True,
+ static_data_name="layer__name",
+ static_data_template=layer_link_template)
+
+ self.add_column(title="Summary",
+ field_name="layer__summary")
+
+ git_url_template = '''
+ <a href="{% url 'layerdetails' extra.pid data.id %}">
+ <code>{{data.layer.vcs_url}}</code>
+ </a>
+ {% if data.get_vcs_link_url %}
+ <a target="_blank" href="{{ data.get_vcs_link_url }}">
+ <i class="icon-share get-info"></i>
+ </a>
+ {% endif %}
+ '''
+
+ self.add_column(title="Git repository URL",
+ help_text="The Git repository for the layer source code",
+ hidden=True,
+ static_data_name="layer__vcs_url",
+ static_data_template=git_url_template)
+
+ git_dir_template = '''
+ <a href="{% url 'layerdetails' extra.pid data.id %}">
+ <code>{{data.dirpath}}</code>
+ </a>
+ {% if data.dirpath and data.get_vcs_dirpath_link_url %}
+ <a target="_blank" href="{{ data.get_vcs_dirpath_link_url }}">
+ <i class="icon-share get-info"></i>
+ </a>
+ {% endif %}'''
+
+ self.add_column(title="Subdirectory",
+ help_text="The layer directory within the Git repository",
+ hidden=True,
+ static_data_name="git_subdir",
+ static_data_template=git_dir_template)
+
+ revision_template = '''
+ {% load projecttags %}
+ {% with vcs_ref=data.get_vcs_reference %}
+ {% if vcs_ref|is_shaid %}
+ <a class="btn" data-content="<ul class='unstyled'> <li>{{vcs_ref}}</li> </ul>">
+ {{vcs_ref|truncatechars:10}}
+ </a>
+ {% else %}
+ {{vcs_ref}}
+ {% endif %}
+ {% endwith %}
+ '''
+
+ self.add_column(title="Revision",
+ help_text="The Git branch, tag or commit. For the layers from the OpenEmbedded layer source, the revision is always the branch compatible with the Yocto Project version you selected for this project",
+ static_data_name="revision",
+ static_data_template=revision_template)
+
+ deps_template = '''
+ {% with ods=data.dependencies.all%}
+ {% if ods.count %}
+ <a class="btn" title="<a href='{% url "layerdetails" extra.pid data.id %}'>{{data.layer.name}}</a> dependencies"
+ data-content="<ul class='unstyled'>
+ {% for i in ods%}
+ <li><a href='{% url "layerdetails" extra.pid i.depends_on.pk %}'>{{i.depends_on.layer.name}}</a></li>
+ {% endfor %}
+ </ul>">
+ {{ods.count}}
+ </a>
+ {% endif %}
+ {% endwith %}
+ '''
+
+ self.add_column(title="Dependencies",
+ help_text="Other layers a layer depends upon",
+ static_data_name="dependencies",
+ static_data_template=deps_template)
+
+ self.add_column(title="Add | Delete",
+ help_text="Add or delete layers to / from your project",
+ hideable=False,
+ filter_name="in_current_project",
+ static_data_name="add-del-layers",
+ static_data_template='{% include "layer_btn.html" %}')
+
+ project = Project.objects.get(pk=kwargs['pid'])
+ self.add_column(title="LayerDetailsUrl",
+ displayable=False,
+ field_name="layerdetailurl",
+ computation=lambda x: reverse('layerdetails', args=(project.id, x.id)))
+
+ self.add_column(title="name",
+ displayable=False,
+ field_name="name",
+ computation=lambda x: x.layer.name)
+
+
+class MachinesTable(ToasterTable, ProjectFiltersMixin):
+ """Table of Machines in Toaster"""
+
+ def __init__(self, *args, **kwargs):
+ super(MachinesTable, self).__init__(*args, **kwargs)
+ self.empty_state = "No machines maybe you need to do a build?"
+ self.default_orderby = "name"
+
+ def get_context_data(self, **kwargs):
+ context = super(MachinesTable, self).get_context_data(**kwargs)
+ context['project'] = Project.objects.get(pk=kwargs['pid'])
+ context['projectlayers'] = [prjlayer.layercommit.id for prjlayer in ProjectLayer.objects.filter(project=context['project'])]
+ return context
+
+ def setup_filters(self, *args, **kwargs):
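+ # The mixin's in/not-in project filter actions compare each machine's layer_version against these project layers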
+ project = Project.objects.get(pk=kwargs['pid'])
+ self.project_layers = project.projectlayer_equivalent_set()
+
+ self.add_filter(title="Filter by project machines",
+ name="in_current_project",
+ filter_actions=[
+ self.make_filter_action("in_project", "Machines provided by layers added to this project", self.filter_in_project),
+ self.make_filter_action("not_in_project", "Machines provided by layers not added to this project", self.filter_not_in_project)
+ ])
+
+ def setup_queryset(self, *args, **kwargs):
+ prj = Project.objects.get(pk=kwargs['pid'])
+ self.queryset = prj.get_all_compatible_machines()
+ self.queryset = self.queryset.order_by(self.default_orderby)
+
+ def setup_columns(self, *args, **kwargs):
+
+ self.add_column(title="Machine",
+ hideable=False,
+ orderable=True,
+ field_name="name")
+
+ self.add_column(title="Description",
+ field_name="description")
+
+ layer_link_template = '''
+ <a href="{% url 'layerdetails' extra.pid data.layer_version.id %}">
+ {{data.layer_version.layer.name}}</a>
+ '''
+
+ self.add_column(title="Layer",
+ static_data_name="layer_version__layer__name",
+ static_data_template=layer_link_template,
+ orderable=True)
+
+ self.add_column(title="Revision",
+ help_text="The Git branch, tag or commit. For the layers from the OpenEmbedded layer source, the revision is always the branch compatible with the Yocto Project version you selected for this project",
+ hidden=True,
+ field_name="layer_version__get_vcs_reference")
+
+ machine_file_template = '''<code>conf/machine/{{data.name}}.conf</code>
+ <a href="{{data.get_vcs_machine_file_link_url}}" target="_blank"><i class="icon-share get-info"></i></a>'''
+
+ self.add_column(title="Machine file",
+ hidden=True,
+ static_data_name="machinefile",
+ static_data_template=machine_file_template)
+
+ self.add_column(title="Select",
+ help_text="Sets the selected machine as the project machine. You can only have one machine per project",
+ hideable=False,
+ filter_name="in_current_project",
+ static_data_name="add-del-layers",
+ static_data_template='{% include "machine_btn.html" %}')
+
+
+class LayerMachinesTable(MachinesTable):
+ """ Smaller version of the Machines table for use in layer details """
+
+ def __init__(self, *args, **kwargs):
+ super(LayerMachinesTable, self).__init__(*args, **kwargs)
+
+ def get_context_data(self, **kwargs):
+ context = super(LayerMachinesTable, self).get_context_data(**kwargs)
+ context['layerversion'] = Layer_Version.objects.get(pk=kwargs['layerid'])
+ return context
+
+
+ def setup_queryset(self, *args, **kwargs):
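+ # Reuse the machines queryset, then keep only machines provided by this layer version; 'in_prj' records whether the layer is already in the project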
+ MachinesTable.setup_queryset(self, *args, **kwargs)
+
+ self.queryset = self.queryset.filter(layer_version__pk=int(kwargs['layerid']))
+ self.static_context_extra['in_prj'] = ProjectLayer.objects.filter(Q(project=kwargs['pid']) & Q(layercommit=kwargs['layerid'])).count()
+
+ def setup_columns(self, *args, **kwargs):
+ self.add_column(title="Machine",
+ hideable=False,
+ orderable=True,
+ field_name="name")
+
+ self.add_column(title="Description",
+ field_name="description")
+
+ select_btn_template = '<a href="{% url "project" extra.pid %}?setMachine={{data.name}}" class="btn btn-block select-machine-btn" {% if extra.in_prj == 0 %}disabled="disabled"{% endif %}>Select machine</a>'
+
+ self.add_column(title="Select machine",
+ static_data_name="add-del-layers",
+ static_data_template=select_btn_template)
+
+
+class RecipesTable(ToasterTable, ProjectFiltersMixin):
+ """Table of Recipes in Toaster"""
+
+ def __init__(self, *args, **kwargs):
+ super(RecipesTable, self).__init__(*args, **kwargs)
+ self.empty_state = "Toaster has no recipe information. To generate recipe information you can configure a layer source then run a build."
+ self.default_orderby = "name"
+
+ def get_context_data(self, **kwargs):
+ project = Project.objects.get(pk=kwargs['pid'])
+ context = super(RecipesTable, self).get_context_data(**kwargs)
+
+ context['project'] = project
+
+ context['projectlayers'] = [prjlayer.layercommit.id for prjlayer in ProjectLayer.objects.filter(project=context['project'])]
+
+ return context
+
+ def setup_filters(self, *args, **kwargs):
+ project = Project.objects.get(pk=kwargs['pid'])
+ self.project_layers = project.projectlayer_equivalent_set()
+
+ self.add_filter(title="Filter by project recipes",
+ name="in_current_project",
+ filter_actions=[
+ self.make_filter_action("in_project", "Recipes provided by layers added to this project", self.filter_in_project),
+ self.make_filter_action("not_in_project", "Recipes provided by layers not added to this project", self.filter_not_in_project)
+ ])
+
+
+ def setup_queryset(self, *args, **kwargs):
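+ # All recipes compatible with the project's release, ordered by recipe name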
+ prj = Project.objects.get(pk=kwargs['pid'])
+
+ self.queryset = prj.get_all_compatible_recipes()
+ self.queryset = self.queryset.order_by(self.default_orderby)
+
+
+ def setup_columns(self, *args, **kwargs):
+
+ self.add_column(title="Recipe",
+ help_text="Information about a single piece of software, including where to download the source, configuration options, how to compile the source files and how to package the compiled output",
+ hideable=False,
+ orderable=True,
+ field_name="name")
+
+ self.add_column(title="Recipe Version",
+ hidden=True,
+ field_name="version")
+
+ self.add_column(title="Description",
+ field_name="get_description_or_summary")
+
+ recipe_file_template = '''
+ <code>{{data.file_path}}</code>
+ <a href="{{data.get_vcs_recipe_file_link_url}}" target="_blank">
+ <i class="icon-share get-info"></i>
+ </a>
+ '''
+
+ self.add_column(title="Recipe file",
+ help_text="Path to the recipe .bb file",
+ hidden=True,
+ static_data_name="recipe-file",
+ static_data_template=recipe_file_template)
+
+ self.add_column(title="Section",
+ help_text="The section in which recipes should be categorized",
+ orderable=True,
+ field_name="section")
+
+ layer_link_template = '''
+ <a href="{% url 'layerdetails' extra.pid data.layer_version.id %}">
+ {{data.layer_version.layer.name}}</a>
+ '''
+
+ self.add_column(title="Layer",
+ help_text="The name of the layer providing the recipe",
+ orderable=True,
+ static_data_name="layer_version__layer__name",
+ static_data_template=layer_link_template)
+
+ self.add_column(title="License",
+ help_text="The list of source licenses for the recipe. Multiple license names separated by the pipe character indicates a choice between licenses. Multiple license names separated by the ampersand character indicates multiple licenses exist that cover different parts of the source",
+ orderable=True,
+ field_name="license")
+
+ self.add_column(title="Revision",
+ field_name="layer_version__get_vcs_reference")
+
+ self.add_column(title="Build",
+ help_text="Add or delete recipes to and from your project",
+ hideable=False,
+ filter_name="in_current_project",
+ static_data_name="add-del-layers",
+ static_data_template='{% include "recipe_btn.html" %}')
+
+ project = Project.objects.get(pk=kwargs['pid'])
+ self.add_column(title="Project compatible Layer ID",
+ displayable=False,
+ field_name="projectcompatible_layer",
+ computation=lambda x: (x.layer_version.get_equivalents_wpriority(project)[0]))
+
+class LayerRecipesTable(RecipesTable):
+ """ Smaller version of the Recipes table for use in layer details """
+
+ def __init__(self, *args, **kwargs):
+ super(LayerRecipesTable, self).__init__(*args, **kwargs)
+
+ def get_context_data(self, **kwargs):
+ context = super(LayerRecipesTable, self).get_context_data(**kwargs)
+ context['layerversion'] = Layer_Version.objects.get(pk=kwargs['layerid'])
+ return context
+
+
+ def setup_queryset(self, *args, **kwargs):
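+ # Reuse the compatible-recipe queryset, restricted to recipes provided by this layer version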
+ RecipesTable.setup_queryset(self, *args, **kwargs)
+ self.queryset = self.queryset.filter(layer_version__pk=int(kwargs['layerid']))
+
+ self.static_context_extra['in_prj'] = ProjectLayer.objects.filter(Q(project=kwargs['pid']) & Q(layercommit=kwargs['layerid'])).count()
+
+ def setup_columns(self, *args, **kwargs):
+ self.add_column(title="Recipe",
+ help_text="Information about a single piece of software, including where to download the source, configuration options, how to compile the source files and how to package the compiled output",
+ hideable=False,
+ orderable=True,
+ field_name="name")
+
+ self.add_column(title="Description",
+ field_name="get_description_or_summary")
+
+
+ build_recipe_template = '<button class="btn btn-block build-recipe-btn" data-recipe-name="{{data.name}}" {% if extra.in_prj == 0 %}disabled="disabled"{% endif %}>Build recipe</button>'
+
+ self.add_column(title="Build recipe",
+ static_data_name="add-del-layers",
+ static_data_template=build_recipe_template)
+
+class ProjectLayersRecipesTable(RecipesTable):
+ """ Table that lists only recipes available for layers added to the project """
+
+ def setup_queryset(self, *args, **kwargs):
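+ # List only recipes provided by layers already added to the project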
+ super(ProjectLayersRecipesTable, self).setup_queryset(*args, **kwargs)
+ prj = Project.objects.get(pk=kwargs['pid'])
+ self.queryset = self.queryset.filter(layer_version__in=prj.projectlayer_equivalent_set())
diff --git a/bitbake/lib/toaster/toastergui/templates/base.html b/bitbake/lib/toaster/toastergui/templates/base.html
new file mode 100644
index 0000000..640bc47
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/base.html
@@ -0,0 +1,182 @@
+<!DOCTYPE html>
+{% load static %}
+{% load projecttags %}
+<html lang="en">
+ <head>
+ <title>{% if objectname %} {{objectname|title}} - {% endif %}Toaster</title>
+<link rel="stylesheet" href="{% static 'css/bootstrap.min.css' %}" type="text/css"/>
+<link rel="stylesheet" href="{% static 'css/bootstrap-responsive.min.css' %}" type='text/css'/>
+<link rel="stylesheet" href="{% static 'css/font-awesome.min.css' %}" type='text/css'/>
+<link rel="stylesheet" href="{% static 'css/prettify.css' %}" type='text/css'/>
+<link rel="stylesheet" href="{% static 'css/default.css' %}" type='text/css'/>
+
+ <meta name="viewport" content="width=device-width, initial-scale=1.0" />
+ <meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
+ <script src="{% static 'js/jquery-2.0.3.min.js' %}">
+ </script>
+ <script src="{% static 'js/jquery.cookie.js' %}">
+ </script>
+ <script src="{% static 'js/bootstrap.min.js' %}">
+ </script>
+ <script src="{% static 'js/prettify.js' %}">
+ </script>
+ <script src="{% static 'js/libtoaster.js' %}">
+ </script>
+ {% if DEBUG %}
+ <script>
+ libtoaster.debug = true;
+ </script>
+ {% endif %}
+ <script>
+ libtoaster.ctx = {
+ jsUrl : "{% static 'js/' %}",
+ htmlUrl : "{% static 'html/' %}",
+ projectsUrl : "{% url 'all-projects' %}",
+ projectsTypeAheadUrl: {% url 'xhr_projectstypeahead' as prjurl%}{{prjurl|json}},
+ {% if project.id %}
+ projectId : {{project.id}},
+ projectPageUrl : {% url 'project' project.id as purl%}{{purl|json}},
+ projectName : {{project.name|json}},
+ recipesTypeAheadUrl: {% url 'xhr_recipestypeahead' project.id as paturl%}{{paturl|json}},
+ layersTypeAheadUrl: {% url 'xhr_layerstypeahead' project.id as paturl%}{{paturl|json}},
+ machinesTypeAheadUrl: {% url 'xhr_machinestypeahead' project.id as paturl%}{{paturl|json}},
+
+      projectBuildsUrl: {% url 'projectbuilds' project.id as pburl %}{{pburl|json}},
+ {% else %}
+ projectId : undefined,
+ projectPageUrl : undefined,
+      projectName : undefined,
+ {% endif %}
+ };
+ </script>
+ <script src="{% static 'js/base.js' %}"></script>
+ <script>
+ $(document).ready(function () {
+ /* Vars needed for base.js */
+ var ctx = {};
+ ctx.numProjects = {{projects|length}};
+ ctx.currentUrl = "{{request.path|escapejs}}";
+
+ basePageInit(ctx);
+ });
+ </script>
+
+{% block extraheadcontent %}
+{% endblock %}
+ </head>
+
+<body style="height: 100%">
+
+ {% csrf_token %}
+ <div id="loading-notification" class="alert lead text-center" style="display:none">
+ Loading <i class="fa-pulse icon-spinner"></i>
+ </div>
+
+ <div id="change-notification" class="alert lead alert-info" style="display:none">
+ <button type="button" class="close" id="hide-alert">×</button>
+ <span id="change-notification-msg"></span>
+ </div>
+
+ <div class="navbar navbar-fixed-top">
+ <div class="navbar-inner">
+ <div class="container-fluid">
+        <a class="brand logo" href="#"><img src="{% static 'img/logo.png' %}" class="" alt="Yocto Project logo"/></a>
+ <span class="brand">
+ <a href="/">Toaster</a>
+ {% if DEBUG %}
+ <i class="icon-info-sign" title="<strong>Toaster version information</strong>" data-content="<dl><dt>Branch</dt><dd>{{TOASTER_BRANCH}}</dd><dt>Revision</dt><dd>{{TOASTER_REVISION}}</dd></dl>"></i>
+ {% endif %}
+ </span>
+ {% if request.resolver_match.url_name != 'landing' and request.resolver_match.url_name != 'newproject' %}
+ <ul class="nav">
+ <li {% if request.resolver_match.url_name == 'all-builds' %}
+ class="active"
+ {% endif %}>
+ <a href="{% url 'all-builds' %}">
+ <i class="icon-tasks"></i>
+ All builds
+ </a>
+ </li>
+ <li {% if request.resolver_match.url_name == 'all-projects' %}
+ class="active"
+ {% endif %}>
+ <a href="{% url 'all-projects' %}">
+ <i class="icon-folder-open"></i>
+ All projects
+ </a>
+ </li>
+ </ul>
+ {% endif %}
+ <ul class="nav pull-right">
+ <li>
+ <a target="_blank" href="http://www.yoctoproject.org/docs/latest/toaster-manual/toaster-manual.html">
+ <i class="icon-book"></i>
+ Manual
+ </a>
+ </li>
+ </ul>
+ <span class="pull-right divider-vertical"></span>
+ <div class="btn-group pull-right">
+ <a class="btn" id="new-project-button" href="{% url 'newproject' %}">New project</a>
+ </div>
+ <!-- New build popover -->
+ <div class="btn-group pull-right" id="new-build-button" style="display:none">
+ <button class="btn dropdown-toggle" data-toggle="dropdown">
+ New build
+ <i class="icon-caret-down"></i>
+ </button>
+ <ul class="dropdown-menu new-build multi-select">
+ <li>
+ <h3>New build</h3>
+ <h6>Project:</h6>
+ <span id="project">
+ {% if project.id %}
+ <a class="lead" href="{% url 'project' project.id %}">{{project.name}}</a>
+ {% else %}
+ <a class="lead" href="#"></a>
+ {% endif %}
+ <i class="icon-pencil"></i>
+ </span>
+ <form id="change-project-form" style="display:none;">
+ <div class="input-append">
+ <input type="text" class="input-medium" id="project-name-input" placeholder="Type a project name" autocomplete="off" data-minLength="1" data-autocomplete="off" data-provide="typeahead"/>
+ <button id="save-project-button" class="btn" type="button">Save</button>
+ <a href="#" id="cancel-change-project" class="btn btn-link" style="display: none">Cancel</a>
+ </div>
+ <p><a id="view-all-projects" href="{% url 'all-projects' %}">View all projects</a></p>
+ </form>
+ </li>
+ <li>
+ <div class="alert" style="display:none;">
+ <p>This project configuration is incomplete, so you cannot run builds.</p>
+ <p><a href="{% if project.id %}{% url 'project' project.id %}{% endif %}">View project configuration</a></p>
+ </div>
+ </li>
+ <li id="targets-form">
+ <h6>Recipe(s):</h6>
+ <form>
+ <input type="text" class="input-xlarge build-target-input" placeholder="Type a recipe name" autocomplete="off" data-minLength="1" data-autocomplete="off" data-provide="typeahead" disabled/>
+ <div class="row-fluid">
+ <button class="btn btn-primary build-button" disabled>Build</button>
+ </div>
+ </form>
+ </li>
+ </ul>
+ </div>
+
+
+ </div>
+ </div>
+</div>
+
+<div class="container-fluid top-padded">
+<div class="row-fluid">
+{% block pagecontent %}
+{% endblock %}
+</div>
+</div>
+</body>
+</html>
+
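
base.html turns server-side context into the client-side libtoaster.ctx
object, so every value it references ({{project.id}}, {{projects|length}},
request.path) must be supplied by the Django view that renders the page. A
rough sketch of a view providing that context (illustrative only; the real
views live in toastergui, and the template name here is just an example):

    from django.shortcuts import render
    from orm.models import Project

    def landing(request):
        # "projects" feeds ctx.numProjects in base.html; project pages also
        # pass a "project" object whose id and name populate libtoaster.ctx
        context = {
            "projects": Project.objects.all(),
        }
        return render(request, "landing.html", context)
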
diff --git a/bitbake/lib/toaster/toastergui/templates/basebuilddetailpage.html b/bitbake/lib/toaster/toastergui/templates/basebuilddetailpage.html
new file mode 100644
index 0000000..22ca50c
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/basebuilddetailpage.html
@@ -0,0 +1,29 @@
+{% extends "base.html" %}
+{% load humanize %}
+{% block pagecontent %}
+
+<div class="row-fluid">
+<!-- Breadcrumbs -->
+ <div class="section">
+ <ul class="breadcrumb" id="breadcrumb">
+ <li class="muted">{{build.project.name}}:</li>
+ <li><a href="{% url 'projectbuilds' build.project.id %}">Builds</a></li>
+ <li><a href="{%url 'builddashboard' build.pk%}">{{build.target_set.all.0.target}} {%if build.target_set.all.count > 1%}(+ {{build.target_set.all.count|add:"-1"}}){%endif%} ({{build.completed_on|date:"d/m/y H:i"}})</a></li>
+ {% block localbreadcrumb %}{% endblock %}
+ </ul>
+ <script>
+ $( function () {
+ $('#breadcrumb > li').append('<span class="divider">→</span>');
+ $('#breadcrumb > li:last').addClass("active");
+ $('#breadcrumb > li:last > span, #breadcrumb > li:first > span').remove();
+ });
+ </script>
+ </div> <!--section-->
+
+ <!-- Begin container -->
+ {% block pagedetailinfomain %}{% endblock %}
+ <!-- End container -->
+
+</div>
+
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/basebuildpage.html b/bitbake/lib/toaster/toastergui/templates/basebuildpage.html
new file mode 100644
index 0000000..d441df8
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/basebuildpage.html
@@ -0,0 +1,88 @@
+{% extends "base.html" %}
+{% load projecttags %}
+{% load humanize %}
+{% block pagecontent %}
+
+
+ <div class="">
+<!-- Breadcrumbs -->
+ <div class="section">
+ <ul class="breadcrumb" id="breadcrumb">
+ <li class="muted">{{build.project.name}}:</li>
+ <li><a href="{% url 'projectbuilds' build.project.id %}">Builds</a></li>
+ <li>
+ {% block parentbreadcrumb %}
+ <a href="{%url 'builddashboard' build.pk%}">
+ {{build.get_sorted_target_list.0.target}} {%if build.target_set.all.count > 1%}(+ {{build.target_set.all.count|add:"-1"}}){%endif%} ({{build.completed_on|date:"d/m/y H:i"}})
+ </a>
+ {% endblock %}
+ </li>
+ {% block localbreadcrumb %}{% endblock %}
+ </ul>
+ <script>
+ $( function () {
+ $('#breadcrumb > li').append('<span class="divider">→</span>');
+ $('#breadcrumb > li:last').addClass("active");
+ $('#breadcrumb > li:last > span, #breadcrumb > li:first > span').remove();
+ });
+ </script>
+ </div>
+
+ <div class="row-fluid">
+
+ <!-- begin left sidebar container -->
+ <div id="nav" class="span2">
+ <ul class="nav nav-list well">
+ <li
+ {% if request.resolver_match.url_name == 'builddashboard' %}
+ class="active"
+ {% endif %} >
+ <a class="nav-parent" href="{% url 'builddashboard' build.pk %}">Build summary</a>
+ </li>
+ {% if build.target_set.all.0.is_image and build.outcome == 0 %}
+ <li class="nav-header">Images</li>
+ {% block nav-target %}
+ {% for t in build.get_sorted_target_list %}
+            <li><a href="{% url 'target' build.pk t.pk %}">{{t.target}}</a></li>
+ {% endfor %}
+ {% endblock %}
+ {% endif %}
+ <li class="nav-header">Build</li>
+ {% block nav-configuration %}
+ <li><a href="{% url 'configuration' build.pk %}">Configuration</a></li>
+ {% endblock %}
+ {% block nav-tasks %}
+ <li><a href="{% url 'tasks' build.pk %}">Tasks</a></li>
+ {% endblock %}
+ {% block nav-recipes %}
+ <li><a href="{% url 'recipes' build.pk %}">Recipes</a></li>
+ {% endblock %}
+ {% block nav-packages %}
+ <li><a href="{% url 'packages' build.pk %}">Packages</a></li>
+ {% endblock %}
+ <li class="nav-header">Performance</li>
+ {% block nav-buildtime %}
+ <li><a href="{% url 'buildtime' build.pk %}">Time</a></li>
+ {% endblock %}
+ {% block nav-cpuusage %}
+ <li><a href="{% url 'cpuusage' build.pk %}">CPU usage</a></li>
+ {% endblock %}
+ {% block nav-diskio %}
+ <li><a href="{% url 'diskio' build.pk %}">Disk I/O</a></li>
+ {% endblock %}
+ </ul>
+ </div>
+ <!-- end left sidebar container -->
+
+ <!-- Begin right container -->
+ {% block buildinfomain %}{% endblock %}
+ <!-- End right container -->
+
+
+ </div>
+ </div>
+
+
+{% endblock %}
+
diff --git a/bitbake/lib/toaster/toastergui/templates/baseprojectbuildspage.html b/bitbake/lib/toaster/toastergui/templates/baseprojectbuildspage.html
new file mode 100644
index 0000000..229cd17
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/baseprojectbuildspage.html
@@ -0,0 +1,15 @@
+{% extends "base.html" %}
+{% load projecttags %}
+{% load humanize %}
+{% block pagecontent %}
+
+{% include "projecttopbar.html" %}
+
+<!-- Begin main page container -->
+<div>
+ {% block projectinfomain %}{% endblock %}
+</div>
+<!-- End main container -->
+
+{% endblock %}
+
diff --git a/bitbake/lib/toaster/toastergui/templates/baseprojectpage.html b/bitbake/lib/toaster/toastergui/templates/baseprojectpage.html
new file mode 100644
index 0000000..668e0bf
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/baseprojectpage.html
@@ -0,0 +1,40 @@
+{% extends "base.html" %}
+{% load projecttags %}
+{% load humanize %}
+{% block pagecontent %}
+
+{% include "projecttopbar.html" %}
+<script type="text/javascript">
+ $(document).ready(function(){
+ $("#config-nav .nav li a").each(function(){
+ if (window.location.pathname === $(this).attr('href'))
+ $(this).parent().addClass('active');
+ else
+ $(this).parent().removeClass('active');
+ });
+
+    $("#topbar-configuration-tab").addClass("active");
+ });
+</script>
+
+<div class="row-fluid">
+ <!-- only on config pages -->
+ <div id="config-nav" class="span2">
+ <ul class="nav nav-list well">
+ <li><a class="nav-parent" href="{% url 'project' project.id %}">Configuration</a></li>
+ <li class="nav-header">Compatible metadata</li>
+<!-- <li><a href="all-image-recipes.html">Image recipes</a></li> -->
+ <li><a href="{% url 'projecttargets' project.id %}">Recipes</a></li>
+ <li><a href="{% url 'projectmachines' project.id %}">Machines</a></li>
+ <li><a href="{% url 'projectlayers' project.id %}">Layers</a></li>
+ <li class="nav-header">Extra configuration</li>
+ <li><a href="{% url 'projectconf' project.id %}">BitBake variables</a></li>
+ </ul>
+ </div>
+ <div class="span10">
+ {% block projectinfomain %}{% endblock %}
+ </div>
+</div>
+
+{% endblock %}
+
diff --git a/bitbake/lib/toaster/toastergui/templates/basetable_bottom.html b/bitbake/lib/toaster/toastergui/templates/basetable_bottom.html
new file mode 100644
index 0000000..ce023f5
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/basetable_bottom.html
@@ -0,0 +1,94 @@
+ </tbody>
+ </table>
+
+<!-- Show pagination controls -->
+<div class="pagination pagination-centered">
+ <div class="pull-left">
+ Showing {{objects.start_index}} to {{objects.end_index}} out of {{objects.paginator.count}} entries.
+ </div>
+
+  <ul class="pagination" style="display: inline-block">
+{%if objects.has_previous %}
+ <li><a href="javascript:reload_params({'page':{{objects.previous_page_number}}})">«</a></li>
+{%else%}
+ <li class="disabled"><a href="#">«</a></li>
+{%endif%}
+{% for i in objects.page_range %}
+ <li{%if i == objects.number %} class="active" {%endif%}><a href="javascript:reload_params({'page':{{i}}})">{{i}}</a></li>
+{% endfor %}
+{%if objects.has_next%}
+ <li><a href="javascript:reload_params({'page':{{objects.next_page_number}}})">»</a></li>
+{%else%}
+ <li class="disabled"><a href="#">»</a></li>
+{%endif%}
+ </ul>
+ <div class="pull-right">
+ <span class="help-inline" style="padding-top:5px;">Show rows:</span>
+ <select style="margin-top:5px;margin-bottom:0px;" class="pagesize">
+ {% with "10 25 50 100 150" as list%}
+ {% for i in list.split %}
+ <option value="{{i}}">{{i}}</option>
+ {% endfor %}
+ {% endwith %}
+ </select>
+ </div>
+</div>
+
+<!-- Update page display settings -->
+
+<script>
+ $(document).ready(function() {
+
+ // we load cookies for the column display
+ save = $.cookie('_displaycols_{{objectname}}');
+ if (save != undefined) {
+ setting = save.split(';');
+ for ( i = 0; i < setting.length; i++) {
+ if (setting[i].length > 0) {
+ splitlist = setting[i].split(':');
+ id = splitlist[0], v = splitlist[1];
+ if (v == 'true') {
+ $('.chbxtoggle#'+id).prop('checked', true);
+ }
+ else {
+ $('.chbxtoggle#'+id).prop('checked', false);
+ }
+ }
+ }
+ }
+
+    // load the number of entries to display per page, if the request
+    // provided one (template-level guard avoids emitting invalid JavaScript)
+    {% if request.GET.count %}
+    pagesize = {{request.GET.count}};
+    {% endif %}
+
+ $('.pagesize option').prop('selected', false)
+ .filter('[value="' + pagesize + '"]')
+ .attr('selected', true);
+
+ $('.chbxtoggle').each(function () {
+ showhideTableColumn($(this).attr('id'), $(this).is(':checked'))
+ });
+
+ //turn edit columns dropdown into a multi-select menu
+ $('.dropdown-menu input, .dropdown-menu label').click(function(e) {
+ e.stopPropagation();
+ });
+
+ //show tooltip with applied filter
+ $('#filtered').tooltip({container:'table', placement:'bottom', delay:{hide:1500}, html:true});
+
+ //progress bar tooltip
+ $('.progress, .lead span').tooltip({container:'table', placement:'top'});
+
+ $(".pagesize").change(function () {
+ reload_params({"count":$(this).val()});
+ });
+});
+</script>
+
+<!-- modal filter boxes -->
+ {% for tc in tablecols %}{% if tc.filter %}{% with objectname=objectname f=tc.filter %}
+ {% include "filtersnippet.html" %}
+ {% endwith %}{% endif %} {% endfor %}
+<!-- end modals -->
diff --git a/bitbake/lib/toaster/toastergui/templates/basetable_top.html b/bitbake/lib/toaster/toastergui/templates/basetable_top.html
new file mode 100644
index 0000000..33ede66
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/basetable_top.html
@@ -0,0 +1,239 @@
+{% load projecttags %}
+<!-- component to display a generic table -->
+ <script>
+
+ //
+ // most of the following javascript is for managing the 'Edit Columns'
+ // pop-up dialog and actions. the idea is that there are 2 types
+ // of actions: immediate - performed while the dialog is still
+ // visible - hide/show columns, and delayed - performed when the
+ // dialog becomes invisible - any resorting if necessary.
+ //
+ // When the dialog is open, an interval timer is set up to
+ // determine if the dialog is still visible. when the dialog
+ // closes - goes invisible, the delayed actions are performed.
+ //
+  // the interval timer and interrupt handler are a way of simulating
+  // an onclose event. there is probably a simpler way to do this,
+  // however the pop-up window id was elusive.
+ //
+
+ var editColTimer;
+ var editColAction;
+
+ //
+ // this is the target function of the interval timeout.
+ // check to see if the dialog is visible. if the dialog
+ // has gone invisible since the last check, take any delayed
+ // actions indicated in the action list and clear the timer.
+ //
+
+ function checkVisible( ) {
+ editcol = document.getElementById( 'editcol' );
+ if ( editcol.offsetWidth <= 0 ) {
+ clearInterval( editColTimer );
+ editColTimer = false;
+ hideshowColumns( );
+ editColAction = [ ];
+ }
+ }
+
+ function filterTableRows(test) {
+ if (test.length > 0) {
+ var r = test.split(/[ ,]+/).map(function (e) { return new RegExp(e, 'i') });
+ $('tr.data').map( function (i, el) {
+ (! r.map(function (j) { return j.test($(el).html())}).reduce(function (c, p) { return c && p;} )) ? $(el).hide() : $(el).show();
+ });
+ } else
+ {
+ $('tr.data').show();
+ }
+ }
+
+ //
+ // determine the value of the indicated url arg.
+ // this is needed to determine whether a resort
+ // is necessary. it looks like a lot of gorp stuff
+  // but it's actually pretty simple.
+ //
+
+ function getURLParameter( name ) {
+ return decodeURIComponent((new RegExp('[?|&]' + name + '=' +
+ '([^&;]+?)(&|#|;|$)').exec(location.search)||[,""])[1].replace(/\+/g,
+ '%20'))||null
+ }
+
+ //
+ // when the dialog box goes invisible
+ // this function is called to interpret
+ // the action list and take any delayed actions necessary.
+ // the editColAction list is a hash table with
+ // the column name as the hash key, the hash value
+ // is a 2 element list. the first element is a flag
+ // indicating whether the column is on or off. the
+ // 2nd element is the sort order indicator for the column.
+ //
+
+ function hideshowColumns( ) {
+ for( var k in editColAction ) {
+ showhideDelayedTableAction( k, editColAction[ k ][ 0 ], editColAction[ k ][ 1 ]);
+ }
+ }
+
+ //
+ // this function actually performs the delayed table actions
+ // namely any resorting if necessary
+ //
+
+ function showhideDelayedTableAction( clname, sh, orderkey ) {
+ if ( !sh ) {
+ p = getURLParameter( "orderby" ).split( ":" )[ 0 ];
+ if ( p == orderkey ) {
+ reload_params({ 'orderby' : '{{default_orderby}}'});
+ }
+ }
+ }
+
+ //
+ // this function actually performs the immediate table actions
+  // namely any columns that need to be hidden/shown
+ //
+
+ function showhideImmediateTableAction( clname, sh, orderkey ) {
+ if ( sh ) {
+ $( '.' + clname ).show( 100 );
+ }
+ else {
+ $( '.' + clname ).hide( 100 );
+ }
+
+ // save cookie for all checkboxes
+ save = '';
+ $( '.chbxtoggle' ).each(function( ) {
+ if ( $( this ).attr( 'id' ) != undefined ) {
+ save += ';' + $( this ).attr( 'id' ) +':'+ $( this ).is( ':checked' )
+ }
+ });
+ $.cookie( '_displaycols_{{objectname}}', save );
+ save = '';
+ }
+
+ //
+ // this is the onclick handler for all of the check box
+ // items in edit columns dialog
+ //
+
+ function showhideTableColumn( clname, sh, orderkey ) {
+ editcol = document.getElementById( 'editcol' );
+ if ( editcol.offsetWidth <= 0 ) {
+
+ //
+ // this path is taken when the page is first
+ // getting initialized - no dialog visible,
+ // perform both the immediate and delayed actions
+ //
+
+ showhideImmediateTableAction( clname, sh, orderkey );
+ showhideDelayedTableAction( clname, sh, orderkey );
+ return;
+ }
+ if ( !editColTimer ) {
+
+ //
+ // we don't have a timer active so set one up
+ // and clear the action list
+ //
+
+ editColTimer = setInterval( checkVisible, 250 );
+ editColAction = [ ];
+ }
+
+ //
+ // save the action to be taken when the dialog closes
+ //
+
+ editColAction[ clname ] = [ sh, orderkey ];
+ showhideImmediateTableAction( clname, sh, orderkey );
+ }
+
+ </script>
+
+<!-- control header -->
+<div class="navbar">
+ <div class="navbar-inner">
+ <form class="navbar-search input-append pull-left" id="searchform">
+ <input class="input-xxlarge" id="search" name="search" type="text" placeholder="Search {%if object_search_display %}{{object_search_display}}{%else%}{{objectname}}{%endif%}" value="{%if request.GET.search %}{{request.GET.search}}{% endif %}"/>{% if request.GET.search %}<a href="javascript:$('#search').val('');searchform.submit()" class="add-on btn" tabindex="-1"><i class="icon-remove"></i></a>{%endif%}
+ <input type="hidden" name="orderby" value="{{request.GET.orderby}}">
+ <input type="hidden" name="page" value="1">
+ <button class="btn" id="search-button" type="submit" value="Search">Search</button>
+ </form>
+ <div class="pull-right">
+{% if tablecols %}
+ <div class="btn-group">
+ <button id="edit-columns-button" class="btn dropdown-toggle" data-toggle="dropdown">Edit columns
+ <span class="caret"></span>
+ </button>
+<!--
+ {{tablecols|sortcols}}
+-->
+ <ul id='editcol' class="dropdown-menu">
+ {% for i in tablecols|sortcols %}
+ <li>
+ <label {% if not i.clclass %} class="checkbox muted" {%else%} class="checkbox" {%endif%}>
+ <input type="checkbox" class="chbxtoggle"
+ {% if i.clclass %}
+ id="{{i.clclass}}"
+ value="ct{{i.name}}"
+ {% if not i.hidden %}
+ checked="checked"
+ {%endif%}
+ onclick="showhideTableColumn(
+ $(this).attr('id'),
+ $(this).is(':checked'),
+ {% if i.ordericon %}
+ '{{i.orderkey}}'
+ {% else %}
+ undefined
+ {% endif %}
+ )"
+ {%else%}
+ checked disabled
+ {% endif %}/> {{i.name}}
+ </label>
+ </li>
+ {% endfor %}
+ </ul>
+ </div>
+{% endif %}
+ <div style="display:inline">
+ <span class="divider-vertical"></span>
+ <span class="help-inline" style="padding-top:5px;">Show rows:</span>
+ <select style="margin-top:5px;margin-bottom:0px;" class="pagesize">
+ {% with "10 25 50 100 150" as list%}
+ {% for i in list.split %}
+ <option value="{{i}}">{{i}}</option>
+ {% endfor %}
+ {% endwith %}
+ </select>
+ </div>
+ </div>
+ </div> <!-- navbar-inner -->
+</div>
+
+<!-- the actual rows of the table -->
+ <table class="table table-bordered table-hover tablesorter" id="otable">
+ <thead>
+ <!-- Table header row; generated from "tablecols" entry in the context dict -->
+ <tr>
+ {% for tc in tablecols %}<th class="{%if tc.dclass%}{{tc.dclass}}{%endif%} {% if tc.clclass %}{{tc.clclass}}{% endif %}">
+ {%if tc.qhelp%}<i class="icon-question-sign get-help" title="{{tc.qhelp}}"></i>{%endif%}
+ {%if tc.orderfield%}<a {%if tc.ordericon%} class="sorted" {%endif%}href="javascript:reload_params({'page': 1, 'orderby' : '{{tc.orderfield}}' })">{{tc.name}}</a>{%else%}<span class="muted">{{tc.name}}</span>{%endif%}
+ {%if tc.ordericon%} <i class="icon-caret-{{tc.ordericon}}"></i>{%endif%}
+ {%if tc.filter%}<div class="btn-group pull-right">
+ <a href="#filter_{{tc.filter.class}}" role="button" class="btn btn-mini {%if request.GET.filter%}{{tc.filter.options|filtered_icon:request.GET.filter}} {%endif%}" {%if request.GET.filter and tc.filter.options|filtered_tooltip:request.GET.filter %} title="<p>{{tc.filter.options|filtered_tooltip:request.GET.filter}}</p><p><a class='btn btn-small btn-primary' href=javascript:reload_params({'filter':''})>Show all {% if filter_search_display %}{{filter_search_display}}{% else %}{{objectname}}{% endif %}</a></p>" {%endif%} data-toggle="modal"> <i class="icon-filter filtered"></i> </a>
+ </div>{%endif%}
+ </th>{% endfor %}
+ </tr>
+ </thead>
+ <tbody>
+
diff --git a/bitbake/lib/toaster/toastergui/templates/basetable_top_layers.html b/bitbake/lib/toaster/toastergui/templates/basetable_top_layers.html
new file mode 100644
index 0000000..722091b
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/basetable_top_layers.html
@@ -0,0 +1,5 @@
+{% extends "basetable_top.html" %}
+
+{%block custombuttons %}
+ <a class="btn" href="{% url 'importlayer' project.id %}" style="margin-right:5px;">Import layer</a>
+{%endblock%}
diff --git a/bitbake/lib/toaster/toastergui/templates/bfile.html b/bitbake/lib/toaster/toastergui/templates/bfile.html
new file mode 100644
index 0000000..c7f5943
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/bfile.html
@@ -0,0 +1,24 @@
+{% extends "basebuildpage.html" %}
+
+{% block pagetitle %}Files for package {{objects.0.bpackage.name}} {% endblock %}
+{% block pagetable %}
+ {% if not objects %}
+ <p>No files were recorded for this package!</p>
+ {% else %}
+
+ <tr>
+ <th>Name</th>
+ <th>Size (Bytes)</th>
+ </tr>
+
+ {% for file in objects %}
+
+ <tr class="data">
+ <td>{{file.path}}</td>
+       <td>{{file.size}}</td>
+    </tr>
+
+ {% endfor %}
+
+ {% endif %}
+
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/bpackage.html b/bitbake/lib/toaster/toastergui/templates/bpackage.html
new file mode 100644
index 0000000..d775fec
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/bpackage.html
@@ -0,0 +1,107 @@
+{% extends "basebuildpage.html" %}
+
+{% load projecttags %}
+
+{% block localbreadcrumb %}
+<li>Packages</li>
+{% endblock %}
+
+{% block nav-packages %}
+ <li class="active"><a href="{% url 'packages' build.pk %}">Packages</a></li>
+{% endblock %}
+
+{% block buildinfomain %}
+<div class="span10">
+
+{% if not request.GET.filter and not request.GET.search and not objects.paginator.count %}
+
+<!-- Empty - no data in database -->
+<div class="page-header">
+ <h1>
+ Packages
+ </h1>
+</div>
+<div class="alert alert-info lead">
+  <strong>No packages were built.</strong> How did this happen? Well, BitBake reuses as much of its previous build output as possible.
+ If all of the packages needed were already built and available in your build infrastructure, BitBake
+ will not rebuild any of them. This might be slightly confusing, but it does make everything faster.
+</div>
+
+{% else %}
+
+<div class="page-header">
+ <h1>
+ {% if request.GET.search and objects.paginator.count > 0 %}
+ {{objects.paginator.count}} package{{objects.paginator.count|pluralize}} found
+ {%elif request.GET.search and objects.paginator.count == 0%}
+ No packages found
+ {%else%}
+ Packages
+ {%endif%}
+ </h1>
+</div>
+
+ {% if objects.paginator.count == 0 %}
+ <div class="row-fluid">
+ <div class="alert">
+ <form class="no-results input-append" id="searchform">
+ <input id="search" name="search" class="input-xxlarge" type="text" value="{{request.GET.search}}"/>{% if request.GET.search %}<a href="javascript:$('#search').val('');searchform.submit()" class="add-on btn" tabindex="-1"><i class="icon-remove"></i></a>{% endif %}
+ <button class="btn" type="submit" value="Search">Search</button>
+ <button class="btn btn-link" onclick="javascript:$('#search').val('');searchform.submit()">Show all packages</button>
+ </form>
+ </div>
+ </div>
+
+ {% else %}
+ {% include "basetable_top.html" %}
+
+ {% for package in objects %}
+
+ <tr class="data">
+
+ <!-- Package -->
+ <td class="package_name"><a href="{% url "package_built_detail" build.pk package.pk %}">{{package.name}}</a></td>
+ <!-- Package Version -->
+ <td class="package_version">{%if package.version%}<a href="{% url "package_built_detail" build.pk package.pk %}">{{package.version}}-{{package.revision}}</a>{%endif%}</td>
+ <!-- Package Size -->
+ <td class="size sizecol">{{package.size|filtered_filesizeformat}}</td>
+ <!-- License -->
+ <td class="license">{{package.license}}</td>
+
+ {%if package.recipe%}
+ <!-- Recipe -->
+ <td class="recipe__name"><a href="{% url "recipe" build.pk package.recipe.pk %}">{{package.recipe.name}}</a></td>
+ <!-- Recipe Version -->
+ <td class="recipe__version"><a href="{% url "recipe" build.pk package.recipe.pk %}">{{package.recipe.version}}</a></td>
+
+ <!-- Layer -->
+ <td class="recipe__layer_version__layer__name">{{package.recipe.layer_version.layer.name}}</td>
+ <!-- Layer branch -->
+ <td class="recipe__layer_version__branch">{{package.recipe.layer_version.branch}}</td>
+ <!-- Layer commit -->
+ <td class="recipe__layer_version__layer__commit">
+ <a class="btn"
+ data-content="<ul class='unstyled'>
+ <li>{{package.recipe.layer_version.commit}}</li>
+ </ul>">
+ {{package.recipe.layer_version.commit|truncatechars:13}}
+ </a>
+ </td>
+ <!-- Layer directory -->
+ {%else%}
+ <td class="recipe__name"></td>
+ <td class="recipe__version"></td>
+ <td class="recipe__layer_version__layer__name"></td>
+ <td class="recipe__layer_version__branch"></td>
+ <td class="recipe__layer_version__layer__commit"></td>
+ <td class="recipe__layer_version__local_path"></td>
+ {%endif%}
+
+ </tr>
+ {% endfor %}
+
+ {% include "basetable_bottom.html" %}
+ {% endif %} {# objects.paginator.count #}
+{% endif %} {# Empty #}
+</div>
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/brtargets.html b/bitbake/lib/toaster/toastergui/templates/brtargets.html
new file mode 100644
index 0000000..4ebd058
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/brtargets.html
@@ -0,0 +1,20 @@
+<span data-toggle="tooltip"
+ {% if buildrequest.brtarget_set.all.count > 1 %}
+ title="Targets:
+ {% for target in buildrequest.brtarget_set.all %}
+ {% if target.task %}
+ {{target.target}}:{{target.task}}
+ {% else %}
+ {{target.target}}
+ {% endif %}
+ {%endfor%}"
+ {%endif%}>
+ {% if buildrequest.brtarget_set.all.0.task %}
+ {{buildrequest.brtarget_set.all.0.target}}:{{buildrequest.brtarget_set.all.0.task}}
+ {% else %}
+ {{buildrequest.brtarget_set.all.0.target}}
+ {% endif %}
+ {% if buildrequest.brtarget_set.all.count > 1 %}
+ (+ {{buildrequest.brtarget_set.all.count|add:"-1"}})
+ {% endif %}
+</span>
diff --git a/bitbake/lib/toaster/toastergui/templates/builddashboard.html b/bitbake/lib/toaster/toastergui/templates/builddashboard.html
new file mode 100644
index 0000000..bab8e38
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/builddashboard.html
@@ -0,0 +1,307 @@
+{% extends "basebuildpage.html" %}
+{% load humanize %}
+{% load projecttags %}
+
+{% block parentbreadcrumb %}
+{{build.get_sorted_target_list.0.target}} {%if build.target_set.all.count > 1%}(+ {{build.target_set.all.count|add:"-1"}}){%endif%} {{build.machine}} ({{build.completed_on|date:"d/m/y H:i"}})
+{% endblock %}
+
+{% block buildinfomain %}
+<!-- page title -->
+<div class="row-fluid span10">
+ <div class="page-header">
+ <h1>{{build.target_set.all|dictsort:"target"|join:", "}} {{build.machine}}</h1>
+ </div>
+</div>
+
+<!-- build result bar -->
+<div class="row-fluid span10 pull-right">
+ <div class="alert {%if build.outcome == build.SUCCEEDED%}alert-success{%elif build.outcome == build.FAILED%}alert-error{%else%}alert-info{%endif%}">
+ <div class="row-fluid lead">
+ <span class="pull-left"><strong>
+ {%if build.outcome == build.SUCCEEDED%}Completed{%elif build.outcome == build.FAILED%}Failed{%else%}{%endif%}
+ </strong> on
+ {{build.completed_on|date:"d/m/y H:i"}}
+</span>
+{% if build.warnings.count or build.errors.count %}
+ with
+{% endif %}
+{%if build.outcome == build.SUCCEEDED or build.outcome == build.FAILED %}
+{% if build.errors.count %}
+ <span > <i class="icon-minus-sign red"></i><strong><a href="#errors" class="error show-errors"> {{build.errors.count}} error{{build.errors.count|pluralize}}</a></strong></span>
+{% endif %}
+{% if build.warnings.count %}
+{% if build.errors.count %}
+ and
+{% endif %}
+ <span > <i class="icon-warning-sign yellow"></i><strong><a href="#warnings" class="warning show-warnings"> {{build.warnings.count}} warning{{build.warnings.count|pluralize}}</a></strong></span>
+{% endif %}
+ <span class="pull-right">Build time: <a href="{% url 'buildtime' build.pk %}">{{ build.timespent_seconds|sectohms }}</a>
+ <a class="btn {%if build.outcome == build.SUCCEEDED%}btn-success{%else%}btn-danger{%endif%} pull-right log" href="{% url 'build_artifact' build.id "cookerlog" build.id %}">Download build log</a>
+ </span>
+
+{%endif%}
+ </div>
+ {% if build.toaster_exceptions.count > 0 %}
+ <div class="row">
+ <small class="pull-right">
+ <i class="icon-question-sign get-help get-help-blue" title="" data-original-title="Toaster exceptions do not affect your build: only the operation of Toaster"></i>
+ <a class="show-exceptions" href="#exceptions">Toaster threw {{build.toaster_exceptions.count}} exception{{build.toaster_exceptions.count|pluralize}}</a>
+ </small>
+ </div>
+ {% endif %}
+ </div>
+</div>
+
+{% if build.errors.count %}
+<div class="accordion span10 pull-right" id="errors">
+ <div class="accordion-group">
+ <div class="accordion-heading">
+ <a class="accordion-toggle error toggle-errors">
+ <h2 id="error-toggle">
+ <i class="icon-minus-sign"></i>
+ {{build.errors.count}} error{{build.errors.count|pluralize}}
+ </h2>
+ </a>
+ </div>
+ <div class="accordion-body collapse in" id="collapse-errors">
+ <div class="accordion-inner">
+ <div class="span10">
+ {% for error in logmessages %}{% if error.level == 2 %}
+ <div class="alert alert-error">
+ <pre>{{error.message}}</pre>
+ </div>
+ {% endif %}
+ {% endfor %}
+ </div>
+ </div>
+ </div>
+ </div>
+</div>
+{% endif %}
+
+{%if build.outcome == build.SUCCEEDED%}
+<!-- built images -->
+{% if hasImages %}
+<div class="row-fluid span10 pull-right">
+ <h2>Images</h2>
+ {% for target in targets %}
+ {% if target.target.is_image %}
+ <div class="well dashboard-section">
+ <h3><a href="{% url 'target' build.pk target.target.pk %}">{{target.target}}</a>
+ </h3>
+ <dl class="dl-horizontal">
+ <dt>Packages included</dt>
+ <dd><a href="{% url 'target' build.pk target.target.pk %}">{{target.npkg}}</a></dd>
+ <dt>Total package size</dt>
+ <dd>{{target.pkgsz|filtered_filesizeformat}}</dd>
+ {% if target.targetHasNoImages %}
+ </dl>
+ <div class="row-fluid">
+ <div class="alert alert-info span7">
+ <p>
+ <b>This build did not create any image files</b>
+ </p>
+ <p>
+ This is probably because valid image and license manifest
+ files from a previous build already exist in your
+ <code>.../poky/build/tmp/deploy</code>
+ directory. You can
+ also <a href="{% url 'targetpkg' build.pk target.target.pk %}">view the
+ license manifest information</a> in Toaster.
+ </p>
+ </div>
+ </div>
+ {% else %}
+ <dt>
+ <i class="icon-question-sign get-help" title="The location in disk of the license manifest, a document listing all packages installed in your image and their licenses"></i>
+
+ License manifest
+ </dt>
+ <dd>
+ <a href="{% url 'targetpkg' build.pk target.target.pk %}">View in Toaster</a> |
+ <a href="{% url 'build_artifact' build.pk 'licensemanifest' target.target.pk %}">Download</a></dd>
+ <dt>
+ <i class="icon-question-sign get-help" title="Image files are stored in <code>/build/tmp/deploy/images/</code>"></i>
+ Image files
+ </dt>
+ <dd>
+ <ul>
+ {% for i in target.imageFiles %}
+ {% if build.project %}
+ <li><a href="{% url 'build_artifact' build.pk 'imagefile' i.id %}">{{i.path}}</a>
+ {% else %}
+ <li>{{i.path}}
+ {% endif %}
+ ({{i.size|filtered_filesizeformat}})</li>
+ {% endfor %}
+ </ul>
+ </dd>
+ </dl>
+ {% endif %}
+ </div>
+ {% endif %}
+ {% endfor %}
+</div>
+{% endif %}
+
+{%else%}
+<!-- error dump -->
+{%endif%}
+
+<!-- other artifacts -->
+{% if build.buildartifact_set.all.count > 0 %}
+<div class="row-fluid span10 pull-right">
+<h2>Other artifacts</h2>
+
+ <div class="well dashboard-section">
+ <dl class="dl-horizontal">
+ <dt>
+ <i class="icon-question-sign get-help" title="Build artifacts discovered in <i>tmp/deploy/images</i>. Usually kernel images and kernel modules."></i>
+ Other artifacts</dt>
+ <dd><div>
+ {% for ba in build.buildartifact_set.all|dictsort:"file_name" %}
+ <a href="{%url 'build_artifact' build.id 'buildartifact' ba.id %}">
+ {{ba.get_local_file_name}}
+ </a>
+
+ ({{ba.file_size|filtered_filesizeformat}}) <br/>
+ {% endfor %}
+ </div>
+ </dd>
+ </dl>
+ </div>
+
+</div>
+{% endif %}
+<!-- build summary -->
+<div class="row-fluid span10 pull-right">
+<h2>Build summary</h2>
+ <div class="well span4 dashboard-section" style="margin-left:0px;">
+ <h4><a href="{%url 'configuration' build.pk%}">Configuration</a></h4>
+ <dl>
+ <dt>Machine</dt><dd>{{build.machine}}</dd>
+ <dt>Distro</dt><dd>{{build.distro}}</dd>
+ <dt>Layers</dt>{% for i in build.layer_version_build.all|dictsort:"layer.name" %}<dd>{{i.layer.name}}</dd>{%endfor%}
+ </dl>
+ </div>
+ <div class="well span4 dashboard-section">
+ <h4><a href="{%url 'tasks' build.pk%}">Tasks</a></h4>
+ <dl>
+ {% query build.task_build outcome=4 order__gt=0 as exectask%}
+ {% if exectask.count > 0 %}
+ <dt>Failed tasks</dt>
+ <dd>
+ {% if exectask.count == 1 %}
+ <a class="error" href="{% url "task" build.id exectask.0.id %}">
+ {{exectask.0.recipe.name}}
+ <span class="task-name">{{exectask.0.task_name}}</span>
+ </a>
+
+ <a href="{% url 'build_artifact' build.id "tasklogfile" exectask.0.id %}">
+ <i class="icon-download-alt" title="" data-original-title="Download task log file"></i>
+ </a>
+
+ {% elif exectask.count > 1%}
+ <a class="error" href="{% url "tasks" build.id %}?filter=outcome%3A4">{{exectask.count}}</a>
+ {% endif %}
+ </dd>
+ {% endif %}
+ <dt>Total number of tasks</dt><dd><a href="{% url 'tasks' build.pk %}">{% query build.task_build order__gt=0 as alltasks %}{{alltasks.count}}</a></dd>
+ <dt>
+ Tasks executed
+ <i class="icon-question-sign get-help" title="'Executed' tasks are those that need to be run in order to generate the task output"></i>
+ </dt>
+ <dd><a href="{% url 'tasks' build.pk %}?filter=task_executed%3A1&count=25&search=&page=1&orderby=order%3A%2B">{% query build.task_build task_executed=1 order__gt=0 as exectask%}{{exectask.count}}</a></dd>
+ <dt>
+ Tasks not executed
+ <i class="icon-question-sign get-help" title="'Not executed' tasks don't need to run because their outcome is provided by another task"></i>
+ </dt>
+ <dd><a href="{% url 'tasks' build.pk %}?filter=task_executed%3A0&count=25&search=&page=1&orderby=order%3A%2B">{% query build.task_build task_executed=0 order__gt=0 as noexectask%}{{noexectask.count}}</a></dd>
+ <dt>
+ Reuse
+ <i class="icon-question-sign get-help" title="The percentage of 'not executed' tasks over the total number of tasks, which is a measure of the efficiency of your build"></i>
+ </dt>
+ <dd>
+{% query build.task_build order__gt=0 as texec %}
+{% if noexectask.count|multiply:100|divide:texec.count < 0 %}
+0
+{% else %}
+{{noexectask.count|multiply:100|divide:texec.count}}
+{% endif %}
+%
+ </dd>
+ </dl>
+ </div>
+ <div class="well span4 dashboard-section">
+ <h4><a href="{% url 'recipes' build.pk %}">Recipes</a> & <a href="{% url 'packages' build.pk %}">Packages</a></h4>
+ <dl>
+ <dt>Recipes built</dt><dd><a href="{% url 'recipes' build.pk %}">{{recipecount}}</a></dd>
+ <dt>Packages built</dt><dd><a href="{% url 'packages' build.pk %}">{{packagecount}}</a></dd>
+ </dl>
+ </div>
+</div>
+
+{% if build.warnings.count %}
+<div class="accordion span10 pull-right" id="warnings">
+ <div class="accordion-group">
+ <div class="accordion-heading">
+ <a class="accordion-toggle warning toggle-warnings">
+ <h2 id="warning-toggle">
+ <i class="icon-warning-sign"></i>
+ {{build.warnings.count}} warning{{build.warnings.count|pluralize}}
+ </h2>
+ </a>
+ </div>
+ <div class="accordion-body collapse" id="collapse-warnings">
+ <div class="accordion-inner">
+ <div class="span10">
+ {% for warning in logmessages %}{% if warning.level == 1 %}
+ <div class="alert alert-warning">
+ <pre>{{warning.message}}</pre>
+ </div>
+ {% endif %}{% endfor %}
+ </div>
+ </div>
+ </div>
+ </div>
+</div>
+{% endif %}
+
+
+{% if build.toaster_exceptions.count > 0 %}
+<div class="accordion span10 pull-right" id="exceptions">
+ <div class="accordion-group">
+ <div class="accordion-heading">
+ <a class="accordion-toggle exception toggle-exceptions">
+ <h2 id="exception-toggle">
+ <i class="icon-warning-sign"></i>
+ {{build.toaster_exceptions.count}} Toaster exception{{build.toaster_exceptions.count|pluralize}}
+ </h2>
+ </a>
+ </div>
+ <div class="accordion-body collapse" id="collapse-exceptions">
+ <div class="accordion-inner">
+ <div class="span10">
+ {% for exception in build.toaster_exceptions %}
+ <div class="alert alert-exception">
+ <pre>{{exception.message}}</pre>
+ </div>
+ {% endfor %}
+ </div>
+ </div>
+ </div>
+ </div>
+</div>
+{% endif %}
+
+<script type="text/javascript">
+ $(document).ready(function() {
+ //show warnings section when requested from the previous page
+ if (location.href.search('#warnings') > -1) {
+ $('#collapse-warnings').addClass('in');
+ }
+ });
+</script>
+
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/buildrequestdetails.html b/bitbake/lib/toaster/toastergui/templates/buildrequestdetails.html
new file mode 100644
index 0000000..70fa1fb
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/buildrequestdetails.html
@@ -0,0 +1,64 @@
+{% extends "baseprojectpage.html" %}
+
+{% load static %}
+{% load projecttags %}
+{% load humanize %}
+
+
+{% block projectinfomain %}
+ <!-- begin content -->
+
+ <div class="row-fluid">
+
+ <!-- end left sidebar container -->
+ <!-- Begin right container -->
+ <div class="span10">
+ <div class="page-header">
+ <h1>
+ <span data-toggle="tooltip" {%if buildrequest.brtarget_set.all.count > 1%}title="Targets: {%for target in buildrequest.brtarget_set.all%}{{target.target}} {%endfor%}"{%endif%}>{{buildrequest.brtarget_set.all.0.target}} {%if buildrequest.brtarget_set.all.count > 1%}(+ {{buildrequest.brtarget_set.all.count|add:"-1"}}){%endif%} {{buildrequest.get_machine}} </span>
+
+ </h1>
+ </div>
+ <div class="alert alert-error">
+ <p class="lead">
+ <strong>Failed</strong>
+ on {{ buildrequest.updated|date:'d/m/y H:i' }}
+ with
+
+ <i class="icon-minus-sign error" style="margin-left:6px;"></i>
+ <strong><a class="error accordion-toggle toggle-errors" href="#errors">
+ {{buildrequest.brerror_set.all.count}} error{{buildrequest.brerror_set.all.count|pluralize}}
+ </a></strong>
+ <span class="pull-right">Build time: {{buildrequest.get_duration|sectohms}}</span>
+ </p>
+ </div>
+
+ <div class="accordion" id="errors">
+ <div class="accordion-group">
+ <div class="accordion-heading">
+ <a class="accordion-toggle error toggle-errors">
+ <h2>
+ <i class="icon-minus-sign"></i>
+ {{buildrequest.brerror_set.all.count}} error{{buildrequest.brerror_set.all.count|pluralize}}
+ </h2>
+ </a>
+ </div>
+ <div class="accordion-body collapse in" id="collapse-errors">
+ <div class="accordion-inner">
+ <div class="span10">
+ {% for error in buildrequest.brerror_set.all %}
+ <div class="alert alert-error">
+ ERROR: <div class="air well"><pre>{{error.errmsg}}</pre></div>
+ </div>
+ {% endfor %}
+ </div>
+ </div>
+ </div>
+
+ </div>
+ </div>
+ </div>
+ </div> <!-- end of row-fluid -->
+
+
+{%endblock%}
diff --git a/bitbake/lib/toaster/toastergui/templates/builds.html b/bitbake/lib/toaster/toastergui/templates/builds.html
new file mode 100644
index 0000000..c0d0c64
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/builds.html
@@ -0,0 +1,109 @@
+{% extends "base.html" %}
+
+{% load static %}
+{% load projecttags %}
+{% load humanize %}
+
+{% block extraheadcontent %}
+<link rel="stylesheet" href="/static/css/jquery-ui.min.css" type='text/css'>
+<link rel="stylesheet" href="/static/css/jquery-ui.structure.min.css" type='text/css'>
+<link rel="stylesheet" href="/static/css/jquery-ui.theme.min.css" type='text/css'>
+<script src="/static/js/jquery-ui.min.js"></script>
+<script src="/static/js/filtersnippet.js"></script>
+{% endblock %}
+
+{% block pagecontent %}
+
+{% if last_date_from and last_date_to %}
+<script>
+ // initialize the date range controls
+ $(document).ready(function () {
+ date_init('started_on','{{last_date_from}}','{{last_date_to}}','{{dateMin_started_on}}','{{dateMax_started_on}}','{{daterange_selected}}');
+ date_init('completed_on','{{last_date_from}}','{{last_date_to}}','{{dateMin_completed_on}}','{{dateMax_completed_on}}','{{daterange_selected}}');
+ });
+</script>
+{%endif%} {# last_date_from and last_date_to #}
+
+<div class="row-fluid">
+
+ {% include "mrb_section.html" %}
+
+
+ {% if 1 %}
+ <div class="page-header top-air">
+ <h1>
+ {% if request.GET.filter and objects.paginator.count > 0 or request.GET.search and objects.paginator.count > 0 %}
+ {{objects.paginator.count}} build{{objects.paginator.count|pluralize}} found
+ {%elif request.GET.filter and objects.paginator.count == 0 or request.GET.search and objects.paginator.count == 0 %}
+ No builds found
+ {%else%}
+ All builds
+ {%endif%}
+ </h1>
+ </div>
+
+ {% if objects.paginator.count == 0 %}
+ <div class="row-fluid">
+ <div class="alert">
+ <form class="no-results input-append" id="searchform">
+        <input id="search" name="search" class="input-xxlarge" type="text" value="{% if request.GET.search %}{{request.GET.search}}{% endif %}"/>
+ {% if request.GET.search %}<a href="javascript:$('#search').val('');searchform.submit()" class="add-on btn" tabindex="-1"><i class="icon-remove"></i></a>{% endif %}
+ <button class="btn" type="submit" value="Search">Search</button>
+ <button class="btn btn-link" onclick="javascript:$('#search').val('');searchform.submit()">Show all builds</button>
+ </form>
+ </div>
+ </div>
+
+
+ {% else %}
+ {% include "basetable_top.html" %}
+ <!-- Table data rows; the order needs to match the order of "tablecols" definitions; and the <td class value needs to match the tablecols clclass value for show/hide buttons to work -->
+ {% for build in objects %}
+ <tr class="data">
+ <td class="outcome">
+ <a href="{% url "builddashboard" build.id %}">{%if build.outcome == build.SUCCEEDED%}<i class="icon-ok-sign success"></i>{%elif build.outcome == build.FAILED%}<i class="icon-minus-sign error"></i>{%else%}{%endif%}</a>
+ </td>
+ <td class="target">{% for t in build.target_set.all %} <a href="{% url "builddashboard" build.id %}"> {{t.target}} </a> <br />{% endfor %}</td>
+ <td class="machine"><a href="{% url "builddashboard" build.id %}">{{build.machine}}</a></td>
+ <td class="started_on"><a href="{% url "builddashboard" build.id %}">{{build.started_on|date:"d/m/y H:i"}}</a></td>
+ <td class="completed_on"><a href="{% url "builddashboard" build.id %}">{{build.completed_on|date:"d/m/y H:i"}}</a></td>
+ <td class="failed_tasks error">
+ {% query build.task_build outcome=4 order__gt=0 as exectask%}
+ {% if exectask.count == 1 %}
+ <a href="{% url "task" build.id exectask.0.id %}">{{exectask.0.recipe.name}}.{{exectask.0.task_name}}</a>
+ <a href="{% url 'build_artifact' build.id "tasklogfile" exectask.0.id %}">
+ <i class="icon-download-alt" title="" data-original-title="Download task log file"></i>
+ </a>
+ {% elif exectask.count > 1%}
+ <a href="{% url "tasks" build.id %}?filter=outcome%3A4">{{exectask.count}} task{{exectask.count|pluralize}}</a>
+ {%endif%}
+ </td>
+ <td class="errors.count errors_no">
+ {% if build.errors.count %}
+ <a class="errors.count error" href="{% url "builddashboard" build.id %}#errors">{{build.errors.count}} error{{build.errors.count|pluralize}}</a>
+ {%endif%}
+ </td>
+ <td class="warnings.count warnings_no">{% if build.warnings.count %}<a class="warnings.count warning" href="{% url "builddashboard" build.id %}#warnings">{{build.warnings.count}} warning{{build.warnings.count|pluralize}}</a>{%endif%}</td>
+ <td class="time"><a href="{% url "buildtime" build.id %}">{{build.timespent_seconds|sectohms}}</a></td>
+ <td class="output">
+ {% if build.outcome == build.SUCCEEDED %}
+ <a href="{%url "builddashboard" build.id%}#images">{{fstypes|get_dict_value:build.id}}</a>
+ {% endif %}
+ </td>
+ <td>
+ <a href="{% url 'project' build.project.id %}">{{build.project.name}}</a>
+ </td>
+ </tr>
+
+ {% endfor %}
+
+
+ {% include "basetable_bottom.html" %}
+ {% endif %} {# objects.paginator.count #}
+{% endif %} {# empty #}
+</div><!-- end row-fluid-->
+
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/buildtime.html b/bitbake/lib/toaster/toastergui/templates/buildtime.html
new file mode 100644
index 0000000..ea84ae7
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/buildtime.html
@@ -0,0 +1,4 @@
+{% extends "basebuildpage.html" %}
+{% block localbreadcrumb %}
+<li>Build Time</li>
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/configuration.html b/bitbake/lib/toaster/toastergui/templates/configuration.html
new file mode 100644
index 0000000..3e48991
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/configuration.html
@@ -0,0 +1,71 @@
+{% extends "basebuildpage.html" %}
+{% load projecttags %}
+
+{% block localbreadcrumb %}
+<li>Configuration</li>
+{% endblock %}
+
+{% block nav-configuration %}
+ <li class="active"><a href="{% url 'configuration' build.pk %}">Configuration</a></li>
+{% endblock %}
+
+{% block buildinfomain %}
+<!-- page title -->
+<div class="row-fluid span10">
+ <div class="page-header">
+ <h1>Configuration</h1>
+ </div>
+</div>
+
+<!-- configuration table -->
+<div class="row-fluid pull-right span10" id="navTab">
+<ul class="nav nav-pills">
+ <li class="active"><a href="#">Summary</a></li>
+ <li class=""><a href="{% url 'configvars' build.id %}">BitBake variables</a></li>
+</ul>
+
+ <!-- summary -->
+ <div id="summary" class="tab-pane active">
+ <h3>Build configuration</h3>
+ <dl class="dl-horizontal">
+ {%if BB_VERSION %}<dt>BitBake version</dt><dd>{{BB_VERSION}}</dd> {% endif %}
+ {%if BUILD_SYS %}<dt>Build system</dt><dd>{{BUILD_SYS}}</dd> {% endif %}
+ {%if NATIVELSBSTRING %}<dt>Host distribution</dt><dd>{{NATIVELSBSTRING}}</dd> {% endif %}
+ {%if TARGET_SYS %}<dt>Target system</dt><dd>{{TARGET_SYS}}</dd> {% endif %}
+ {%if MACHINE %}<dt>Machine</dt><dd>{{MACHINE}}</dd> {% endif %}
+ {%if DISTRO %}<dt>Distro</dt><dd>{{DISTRO}}</dd> {% endif %}
+ {%if DISTRO_VERSION %}<dt>Distro version</dt><dd>{{DISTRO_VERSION}}</dd> {% endif %}
+ {%if TUNE_FEATURES %}<dt>Tune features</dt><dd>{{TUNE_FEATURES}}</dd> {% endif %}
+ {%if TARGET_FPU %}<dt>Target FPU</dt><dd>{{TARGET_FPU}}</dd> {% endif %}
+ {%if targets.all %}<dt>Target(s)</dt>
+ <dd> <ul> {% for target in targets.all %}
+ <li>{{target.target}}{%if forloop.counter > 1 %}<br>{% endif %}</li>
+ {% endfor %} </ul> </dd> {% endif %}
+ </dl>
+ <h3>Layers</h3>
+ <div class="span9" style="margin-left:0px;">
+ <table class="table table-bordered table-hover">
+ <thead>
+ <tr>
+ <th>Layer</th>
+ <th>Layer branch</th>
+ <th>Layer commit</th>
+ </tr>
+ </thead>
+ <tbody>{% for lv in build.layer_version_build.all|dictsort:"layer.name" %}
+ <tr>
+ <td>{{lv.layer.name}}</td>
+ <td>{{lv.branch}}</td>
+ <td> <a class="btn" data-content="<ul class='unstyled'>
+ <li>{{lv.commit}}</li> </ul>">
+ {{lv.commit|truncatechars:13}}
+ </a></td>
+ </tr>{% endfor %}
+ </tbody>
+ </table>
+ </div>
+ </div>
+
+
+</div>
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/configvars.html b/bitbake/lib/toaster/toastergui/templates/configvars.html
new file mode 100644
index 0000000..8a572ae
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/configvars.html
@@ -0,0 +1,132 @@
+{% extends "basebuildpage.html" %}
+{% load projecttags %}
+
+{% block localbreadcrumb %}
+<li>Configuration</li>
+{% endblock %}
+
+{% block nav-configuration %}
+ <li class="active"><a href="{% url 'configuration' build.pk %}">Configuration</a></li>
+{% endblock %}
+
+{% block buildinfomain %}
+<!-- page title -->
+<div class="row-fluid span10">
+ <div class="page-header">
+ <h1>
+ {% if request.GET.filter and objects.paginator.count > 0 or request.GET.search and objects.paginator.count > 0 %}
+ {{objects.paginator.count}} variable{{objects.paginator.count|pluralize}} found
+ {%elif request.GET.filter and objects.paginator.count == 0 or request.GET.search and objects.paginator.count == 0 %}
+ No variables found
+ {%else%}
+ Configuration
+ {%endif%}
+ </h1>
+ </div>
+</div>
+
+<!-- configuration table -->
+<div class="row-fluid pull-right span10" id="navTab">
+ <ul class="nav nav-pills">
+ <li class=""><a href="{% url 'configuration' build.id %}">Summary</a></li>
+ <li class="active"><a href="#" >BitBake variables</a></li>
+ </ul>
+
+ <!-- variables -->
+ <div id="variables" class="tab-pane">
+
+ {% if objects.paginator.count == 0 %}
+ <div class="row-fluid">
+ <div class="alert">
+ <form class="no-results input-append" id="searchform">
+ <input id="search" name="search" class="input-xxlarge" type="text" value="{% if request.GET.search %}{{request.GET.search}}{% endif %}"/>{% if request.GET.search %}<a href="javascript:$('#search').val('');searchform.submit()" class="add-on btn" tabindex="-1"><i class="icon-remove"></i></a>{% endif %}
+ <button class="btn" type="submit" value="Search">Search</button>
+ <button class="btn btn-link" onclick="javascript:$('#search').val('');searchform.submit()">Show all variables</button>
+ </form>
+ </div>
+ </div>
+
+ {% else %}
+ {% include "basetable_top.html" %}
+
+ {% for variable in objects %}
+ <tr class="data">
+ <td class="variable_name"><a data-toggle="modal" href="#variable-{{variable.pk}}">{{variable.variable_name}}</a></td>
+ <td class="variable_value"><a data-toggle="modal" href="#variable-{{variable.pk}}">{{variable.variable_value|truncatechars:153}}</a></td>
+ <td class="file"><a data-toggle="modal" href="#variable-{{variable.pk}}">
+ {% if variable.vhistory.all %}
+ {% for path in variable.vhistory.all|filter_setin_files:file_filter %}
+ {{path}}<br/>
+ {% endfor %}
+ {% endif %}
+ </a></td>
+ <td class="description">
+ {% if variable.description %}
+ {{variable.description}}
+ <a href="http://www.yoctoproject.org/docs/current/ref-manual/ref-manual.html#var-{{variable.variable_name|variable_parent_name}}" target="_blank">
+ <i class="icon-share get-info"></i></a>
+ {% endif %}
+ </td>
+ </tr>
+{% endfor %}
+
+{% include "basetable_bottom.html" %}
+{% endif %}
+</div> <!-- endvariables -->
+
+<!-- file list popups -->
+{% for variable in objects %}
+ {% if variable.vhistory.count %}
+ <div id="variable-{{variable.pk}}" class="modal hide fade" tabindex="-1" role="dialog">
+ <div class="modal-header">
+ <button type="button" class="close" data-dismiss="modal" aria-hidden="true">x</button>
+ <h3>History of {{variable.variable_name}}</h3>
+ </div>
+ <div class="modal-body">
+ {% if variable.variable_value %}
+ {% if variable.variable_value|length < 570 %}
+ <h4>{{variable.variable_name}} value is:</h4>
+ <p>
+ {{variable.variable_value}}
+ </p>
+ {% else %}
+ <h4>{{variable.variable_name}} value is:</h4>
+ <p>
+ <span>{{variable.variable_value|string_slice:':570'}}
+ <span class="full"> {{variable.variable_value|string_slice:'570:'}}
+ </span>
+ <a class="btn btn-mini full-show">...</a>
+ </span>
+ </p>
+ <a class="btn btn-mini full-hide">Collapse variable value <i class="icon-caret-up"></i>
+ </a>
+ {% endif %}
+ {% else %}
+ <div class="alert alert-info">The value of <strong>{{variable.variable_name}}</strong> is an empty string</div>
+ {% endif %}
+ <h4>The value was set in the following configuration files:</h4>
+ <table class="table table-bordered table-hover">
+ <thead>
+ <tr>
+ <th>Order</th>
+ <th>Configuration file</th>
+ <th>Operation</th>
+ <th>Line number</th>
+ </tr>
+ </thead>
+ <tbody>
+ {% for vh in variable.vhistory.all %}
+ <tr>
+ <td>{{forloop.counter}}</td><td>{{vh.file_name}}</td><td>{{vh.operation}}</td><td>{{vh.line_number}}</td>
+ </tr>
+ {%endfor%}
+ </tbody>
+ </table>
+ </div>
+ </div>
+ {% endif %}
+{% endfor %}
+
+</div> <!-- buildinfomain -->
+
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/cpuusage.html b/bitbake/lib/toaster/toastergui/templates/cpuusage.html
new file mode 100644
index 0000000..02f07b7
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/cpuusage.html
@@ -0,0 +1,4 @@
+{% extends "basebuildpage.html" %}
+{% block localbreadcrumb %}
+<li>CPU usage</li>
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/detail_pagination_bottom.html b/bitbake/lib/toaster/toastergui/templates/detail_pagination_bottom.html
new file mode 100644
index 0000000..f40c21d
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/detail_pagination_bottom.html
@@ -0,0 +1,55 @@
+{% comment %}
+ Show pagination controls as per search/pagination table detail spec.
+ Input: objects, setup for pagination using the standard method in views.
+ object_count, count for complete list of objects, (all pages, no pattern)
+{% endcomment %}
+
+{# only paginate if 10 or more rows unfiltered, all pages #}
+{% if object_count >= 10 %}
+<div class="pagination">
+ <ul>
+{%if objects.has_previous %}
+ <li><a href="javascript:reload_params({'page':{{objects.previous_page_number}}})">«</a></li>
+{%else%}
+ <li class="disabled"><a href="#">«</a></li>
+{%endif%}
+{% for i in objects.page_range %}
+ <li{%if i == objects.number %} class="active" {%endif%}><a href="javascript:reload_params({'page':{{i}}})">{{i}}</a></li>
+{% endfor %}
+{%if objects.has_next%}
+ <li><a href="javascript:reload_params({'page':{{objects.next_page_number}}})">»</a></li>
+{%else%}
+ <li class="disabled"><a href="#">»</a></li>
+{%endif%}
+ </ul>
+
+ <div class="pull-right">
+ <span class="help-inline" style="padding-bottom:10px;">Show rows:</span>
+ <select class="pagesize">
+ {% with "10 25 50 100 150" as list%}
+ {% for i in list.split %}
+ <option value="{{i}}">{{i}}</option>
+ {% endfor %}
+ {% endwith %}
+ </select>
+ </div>
+</div>
+
+<!-- Update page display settings -->
+<script>
+ $(document).ready(function() {
+    // load the number of entries to display per page, if the request
+    // provided one (template-level guard avoids emitting invalid JavaScript)
+    {% if request.GET.count %}
+    pagesize = {{request.GET.count}};
+    {% endif %}
+ $('.pagesize option').prop('selected', false)
+ .filter('[value="' + pagesize + '"]')
+ .attr('selected', true);
+
+ $(".pagesize").change(function () {
+ reload_params({"count":$(this).val()});
+ });
+});
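+
+    /* Note (an assumption added here, not part of the original file):
+       reload_params() is expected to be defined by the shared Toaster page
+       scripts. It merges the given keys into the current query string and
+       reloads the page, so the pagination links and the page-size handler
+       above call it like:
+
+           reload_params({'page': 2});    // hypothetical page number
+           reload_params({'count': 25});  // rows per page
+    */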
+</script>
+{% endif %}
+
diff --git a/bitbake/lib/toaster/toastergui/templates/detail_search_header.html b/bitbake/lib/toaster/toastergui/templates/detail_search_header.html
new file mode 100644
index 0000000..7bea3f4
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/detail_search_header.html
@@ -0,0 +1,68 @@
+{% comment %}
+  Show a detail table's Search field and "Rows per page" selector.
+  Input:
+    objects, our boilerplate paginated object list, with search fields set.
+    object_count, the count of the full, unfiltered object list.
+    search_what, fills in "Search ___".
+  Only show the search form if we have more than 10 results,
+  or if we are returning from a previous search.
+{% endcomment %}
+
+
+<script>
+$(document).ready(function() {
+ /* Clear the current search selection and reload the results */
+ $(".search-clear").click(function(){
+ $("#search").val("");
+ $(this).parents("form").submit();
+ });
+});
+</script>
+<div class="row-fluid">
+{% if objects.paginator.count > 10 or request.GET.search %}
+ {% if objects.paginator.count == 0 %}
+ <div class="alert">
+ <h3>No {{search_what}} found</h3>
+ <form id="searchform" class="input-append">
+ {% else %}
+ <form id="searchform" class="navbar-search input-append pull-left">
+ {% endif %}
+
+ <input id="search" class="input-xlarge" type="text" placeholder="Search {{search_what}}" name="search" value="{% if request.GET.search %}{{request.GET.search}}{% endif %}">
+ <input type="hidden" value="name:+" name="orderby">
+     <input type="hidden" value="1" name="page">
+ {% if request.GET.search %}
+ <a class="add-on btn search-clear">
+ <i class="icon-remove"></i>
+ </a>
+ {% endif %}
+ <button type="submit" class="btn">Search</button>
+ {% if objects.paginator.count == 0 %}
+ <button type="submit" class="btn btn-link search-clear">
+ Show all {{search_what}}
+ </button>
+ {% endif %}
+ </form>
+{% endif %}
+
+{% if objects.paginator.count == 0 %}
+ </div> {# end alert #}
+{% else %}
+ {% if object_count > 10 %}
+ <div class="pull-right">
+ <span class="help-inline" style="padding-top:5px;">Show rows:</span>
+ <select style="margin-top:5px;margin-bottom:0px;" class="pagesize">
+ {% with "10 25 50 100 150" as list%}
+ {% for i in list.split %}
+ {% if request.session.limit == i %}
+ <option value="{{i}}" selected>{{i}}</option>
+ {% else %}
+ <option value="{{i}}">{{i}}</option>
+ {% endif %}
+ {% endfor %}
+ {% endwith %}
+ </select>
+ </div>
+ {% endif %}
+{% endif %}
+</div> {# row-fluid #}
diff --git a/bitbake/lib/toaster/toastergui/templates/detail_sorted_header.html b/bitbake/lib/toaster/toastergui/templates/detail_sorted_header.html
new file mode 100644
index 0000000..6ce292e
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/detail_sorted_header.html
@@ -0,0 +1,25 @@
+{% comment %}
+ Adds sorted columns to a detail table.
+ Must be preceded by <table class="table table-bordered table-hover tablesorter" id="otable">
+ Must be followed by <tbody>...</tbody></table>.
+  Requires the "tablecols" context entries to set up the column fields: dclass, clclass, qhelp, orderfield.
+{% endcomment %}
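+{% comment %}
+  Illustrative sketch (an assumption added here, not part of the original
+  template): based on the lookups below, each "tablecols" entry is expected to
+  look roughly like the following dict; all values are hypothetical examples.
+
+    {
+      'name': 'Completed on',           # column heading text
+      'qhelp': 'When the build finished',
+      'dclass': 'span3',                # optional <th> class
+      'clclass': 'completed_on',        # optional column-toggle class
+      'orderfield': 'completed_on:-',   # optional sort key passed to reload_params
+      'ordericon': 'down',              # optional caret direction when sorted
+      'filter': {'class': 'outcome', 'options': []},  # optional filter popup
+    }
+{% endcomment %}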
+{% load projecttags %}
+{# <table class="table table-bordered table-hover tablesorter" id="otable"> #}
+ <thead>
+ <!-- Table header row; generated from "tablecols" entry in the context dict -->
+ <tr>
+    {% for tc in tablecols %}<th class="{%if tc.dclass%}{{tc.dclass}}{% endif %} {%if tc.clclass %}{{tc.clclass}}{% endif %}">
+ {%if tc.qhelp%}<i class="icon-question-sign get-help" title="{{tc.qhelp}}"></i>{%endif%}
+ {%if tc.orderfield%}<a {%if tc.ordericon%} class="sorted" {%endif%}href="javascript:reload_params({'page': 1, 'orderby' : '{{tc.orderfield}}' })">{{tc.name}}</a>{%else%}<span class="muted">{{tc.name}}</span>{%endif%}
+ {%if tc.ordericon%} <i class="icon-caret-{{tc.ordericon}}"></i>{%endif%}
+ {% if request.GET.search and forloop.first %}
+ <span class="badge badge-info">{{objects.paginator.count}}</span>
+ {% endif %}
+ {%if tc.filter%}<div class="btn-group pull-right">
+ <a href="#filter_{{tc.filter.class}}" role="button" class="btn btn-mini {%if request.GET.filter%}{{tc.filter.options|filtered_icon:request.GET.filter}} {%endif%}" {%if request.GET.filter and tc.filter.options|filtered_tooltip:request.GET.filter %} title="<p>{{tc.filter.options|filtered_tooltip:request.GET.filter}}</p><p><a class='btn btn-small btn-primary' href=javascript:reload_params({'filter':''})>Show all {% if filter_search_display %}{{filter_search_display}}{% else %}{{objectname}}{% endif %}</a></p>" {%endif%} data-toggle="modal"> <i class="icon-filter filtered"></i> </a>
+ </div>{%endif%}
+ </th>{% endfor %}
+ </tr>
+ </thead>
+
diff --git a/bitbake/lib/toaster/toastergui/templates/dirinfo.html b/bitbake/lib/toaster/toastergui/templates/dirinfo.html
new file mode 100644
index 0000000..a5bc481
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/dirinfo.html
@@ -0,0 +1,236 @@
+{% extends "basebuildpage.html" %}
+{% block extraheadcontent %}
+{% load static %}
+<link rel="stylesheet" href="{% static 'css/jquery.treetable.css' %}" type="text/css">
+<link rel="stylesheet" href="{% static 'css/jquery.treetable.theme.toaster.css' %}" type="text/css">
+{% endblock extraheadcontent %}
+
+{% block localbreadcrumb %}
+<li>{{target.target}}</li>
+{% endblock localbreadcrumb%}
+
+{% block buildinfomain %}
+
+{% load static %}
+<script src="{% static 'js/jquery.treetable.js' %}">
+</script>
+{% load projecttags %}
+
+<script type='text/javascript'>
+ function setupTreetable() {
+ $("#dirtable").treetable({
+ expandable: true,
+ branchAttr: "ttBranch",
+ clickableNodeNames: true,
+ onNodeCollapse: function() {
+ /* Do nothing, keep cached */
+ },
+ onNodeExpand: function() {
+ var start = this.id;
+ var n = $("#dirtable").treetable("node", start);
+ if (this.children.length > 0) {
+ /* already was expanded once */
+ $("#dirtable").treetable("reveal", start);
+ }
+ else {
+ var url = "{% url "dirinfo_ajax" build.id target.id %}";
+ $.ajax({
+ async: false,
+ type : "GET",
+ url : url,
+ data : "start=" + start,
+ success : function(response) {
+ addRows(n, response)
+ },
+ error : function(jqXHR, textStatus, errorThrown ) {alert(textStatus + ":" + errorThrown)},
+ });
+ }
+ },
+ });
+ }
+ function td(data) {
+ if (data == null) {
+ data = '';
+ }
+ return '<td>' + data + '</td>'
+ }
+
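+    /* Sketch (an assumption added here, not part of the original file): each
+       object handled by formatRow() below appears to carry the fields read
+       from "o"; an illustrative entry, with hypothetical values:
+
+       {
+           "name": "bash",                // file or directory name
+           "fullpath": "/bin/bash",       // used as the tree node id
+           "parent": "/bin",              // "/" for top-level entries
+           "isdir": false,
+           "childcount": 0,
+           "size": 675000,
+           "permission": "-rwxr-xr-x",
+           "owner": "root",
+           "group": "root",
+           "link_to": null,               // symlink target, or null
+           "package": "bash",             // recipe-time package name, or null
+           "package_id": 7,
+           "installed_package": "bash"    // name the package was installed under
+       }
+    */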
+ function formatRow(o) {
+ /* setup tr-wide formatting */
+ var tr = '<tr class="';
+ if (o.link_to != null) {
+ tr += 'muted ';
+ }
+ if (o.isdir && o.childcount) {
+ tr += 'branch" data-tt-branch="true" ';
+ }
+ else {
+ tr += 'leaf" data-tt-branch="false" ';
+ }
+ tr += ' data-tt-id="' + o.fullpath +'" ';
+ if (o.parent != "/") {
+ tr += ' data-tt-parent-id="' + o.parent +'" ';
+ }
+ tr += '>';
+
+ /* setup td specific formatting */
+ var link_to = td(o.link_to);
+ var size = '<td class = "sizecol">' + o.size + '</td>'
+ var permission = td(o.permission);
+ var owner = td(o.owner);
+ var group = td(o.group);
+
+ /* handle the name column */
+        var name = null;
+ var namespan=1;
+ if (o.isdir) {
+ if (o.link_to == null) {
+ namespan = 2;
+ if (o.package == null) {
+ namespan = 3;
+ }
+ }
+ var colspan = 'colspan="' + namespan + '"';
+ name = '<td class="content-directory"' + colspan + '>';
+ if (o.childcount) {
+ name += '<a href="">';
+ }
+ name += '<i class="icon-folder-close"></i>';
+ name += ' ' + o.name;
+ if (o.childcount) {
+ name += '</a>';
+ }
+ name += '</td>';
+ }
+ else {
+ name = '<td>';
+ if (o.link_to == null) {
+ name += '<i class="icon-file"></i>';
+ }
+ else {
+ name += '<i class="icon-hand-right"></i>';
+ }
+ name += ' ' + o.name;
+ name += '</td>';
+ }
+
+ /* handle the package column */
+ var package = null;
+ if (o.package != null) {
+ /* add link to included package page */
+ build_id = {{ build.id }};
+ target_id = {{ target.id }};
+ /* Create a url for a dummy package id of 0 */
+ dummy = "{% url 'package_included_detail' build.id target.id 0 %}"
+ /* fill in the package id */
+ url = dummy.substr(0, dummy.length-1) + o.package_id;
+ package = '<a href=' + url + '>' ;
+ package += o.package;
+ package += '</a>';
+ if (o.installed_package != o.package) {
+ /* make class muted and add hover help */
+ package += '<span class="muted"> as ' + o.installed_package + ' </span>';
+ package += '<i class="icon-question-sign get-help hover-help" ';
+ package += 'title="' + o.package + ' was renamed at packaging time and was installed in your image as ' + o.installed_package + '">';
+ package += '</i>';
+ }
+ }
+ package = td(package);
+
+ var cols1to3;
+ switch (namespan) {
+ case 3:
+ cols1to3 = name;
+ break;
+ case 2:
+ cols1to3 = name + package;
+ break;
+ default:
+ cols1to3 = name + link_to + package;
+ }
+ r = tr + cols1to3 + size + permission + owner + group + "</tr>"
+ return r;
+ }
+
+ function addRows(n, objs) {
+ rows = "";
+ for (i=0; i<objs.length; i++) {
+ rows += formatRow(objs[i]);
+ }
+ $("#dirtable").treetable("loadBranch", n, rows);
+ }
+
+ $.fn.isOffScreen = function(){
+ var win = $(window);
+ viewportBottom = win.scrollTop() + win.height();
+
+ var bounds = this.offset();
+ bounds.bottom = bounds.top + this.outerHeight();
+
+ return (bounds.bottom > viewportBottom);
+ };
+
+ function selectRow(path) {
+ var row = $('tr[data-tt-id="' + path + '"]');
+ row.addClass(" highlight");
+ if (row.isOffScreen()) {
+ $('html, body').animate({ scrollTop: row.offset().top - 150}, 2000);
+ }
+ }
+</script>
+
+<div class="span10">
+
+ <div class="page-header">
+ <h1> {{target.target}} </h1>
+ </div>
+
+ <ul class="nav nav-pills">
+ <li class="">
+ <a href="{% url 'target' build.id target.id %}">
+ <i class="icon-question-sign get-help" title="Of all the packages built, the subset installed in the root file system of this image"></i>
+ Packages included ({{target.package_count}} - {{packages_sum|filtered_filesizeformat}})
+ </a>
+ </li>
+ <li class="active">
+ <a href="{% url 'dirinfo' build.id target.id %}">
+ <i class="icon-question-sign get-help" title="The directories and files in the root file system of this image"></i>
+ Directory structure
+ </a>
+ </li>
+ </ul>
+
+ <div id="directory-structure" class="tab-pane active">
+ <table id="dirtable" class="table table-bordered table-hover treetable">
+ <thead>
+ <tr>
+ <th>Directory / File</th>
+ <th>Symbolic link to</th>
+ <th>Source package</th>
+ <th>Size</th>
+ <th>Permissions</th>
+ <th>Owner</th>
+ <th>Group</th>
+ </tr>
+ </thead>
+ <tbody>
+ <script type='text/javascript'>
+ setupTreetable();
+ addRows(null, {{ objects|safe }} );
+ {% if file_path %}
+ {% comment %}
+ link from package_included_detail specifies file path
+ {% endcomment %}
+ {% for dir_elem in dir_list %}
+ $("#dirtable").treetable("expandNode", "{{dir_elem}}");
+ {% endfor %}
+ selectRow("{{file_path}}");
+ {% endif %}
+ </script>
+ </tbody>
+ </table>
+ </div> <!-- directory-structure -->
+</div> <!-- span10 -->
+
+{% endblock buildinfomain %}
+
diff --git a/bitbake/lib/toaster/toastergui/templates/diskio.html b/bitbake/lib/toaster/toastergui/templates/diskio.html
new file mode 100644
index 0000000..c5cef6f
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/diskio.html
@@ -0,0 +1,4 @@
+{% extends "basebuildpage.html" %}
+{% block localbreadcrumb %}
+<li>Disk I/O</li>
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/filtersnippet.html b/bitbake/lib/toaster/toastergui/templates/filtersnippet.html
new file mode 100644
index 0000000..1101aa8
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/filtersnippet.html
@@ -0,0 +1,57 @@
+{% load projecttags %}
+<!-- '{{f.class}}' filter -->
+{% with f.class as key %}
+<form id="filter_{{f.class}}" class="modal hide fade" tabindex="-1" role="dialog" aria-hidden="true">
+ <input type="hidden" name="search" value="{%if request.GET.search %}{{request.GET.search}}{%endif%}"/>
+ <div class="modal-header">
+ <button type="button" class="close" data-dismiss="modal" aria-hidden="true">x</button>
+ {% if search_term %}
+ <h3>Filter {{total_count}} {%if filter_search_display%}{{filter_search_display|title}}{%else%}{{objectname|title}}{%endif%} matching '{{search_term}}' by '{{tc.name}}'</h3>
+ {% else %}
+ <h3>Filter {%if filter_search_display%}{{filter_search_display|title}}{%else%}{{objectname|title}}{%endif%} by '{{tc.name}}'</h3>
+ {% endif %}
+ </div>
+ <div class="modal-body">
+ <p>{{f.label}}</p>
+ <label class="radio">
+ <input type="radio" name="filter" {%if request.GET.filter%}{{f.options|check_filter_status:request.GET.filter}} {%else%} checked {%endif%} value="" data-key="{{key}}"> All {%if filter_search_display%}{{filter_search_display|title}}{%else%}{{objectname|title}}{%endif%}
+ </label>
+ {% for option in f.options %}
+ {% if option.1 == 'daterange' %}
+ <div class="form-inline">
+ <label class="radio">
+ <input type="radio" name="filter" id="filter_value_{{key}}" {%if key == daterange_selected %}checked{%endif%} value="{{option.1}}" data-key="{{key}}"> {{option.0}}
+ {% else %}
+ {% if 1 %}
+ <label class="radio">
+ <input type="radio" name="filter" {%if request.GET.filter == option.1 %}checked{%endif%} value="{{option.1}}" data-key="{{key}}"> {{option.0}}
+ {% comment "do not disable radio selections by count for now" %}{% else %}
+ <label class="radio muted">
+ <input type="radio" name="filter" disabled {%if request.GET.filter == option.1 %}checked{%endif%} value="{{option.1}}" data-key="{{key}}"> {{option.0}}
+ {% endcomment %}{% endif %}
+ {% endif %}
+ {% if option.3 %}<i class="icon-question-sign get-help" data-placement="right" title="{{option.3}}"></i>{% endif %}
+ </label>
+ {% if option.1 == 'daterange' %}
+ <input type="text" id="date_from_{{key}}" name="date_from_{{key}}" disabled class="input-small" /><label class="help-inline">to</label>
+ <input type="text" id="date_to_{{key}}" name="date_to_{{key}}" disabled class="input-small" />
+ <label class="help-inline get-help" >(dd/mm/yyyy)</label>
+ </div>
+ {% endif %}
+ {% endfor %}
+ <!-- daterange persistence -->
+ {% if last_date_from and last_date_to %}
+ <input type="hidden" id="last_date_from_{{key}}" name="last_date_from" value="{{last_date_from}}"/>
+ <input type="hidden" id="last_date_to_{{key}}" name="last_date_to" value="{{last_date_to}}"/>
+ {% endif %}
+ </div>
+ <div class="modal-footer">
+ <button type="submit" class="btn btn-primary" data-key="{{key}}">Apply</button>
+ {% if request.GET.filter %}
+ {% if request.GET.filter|string_remove_regex:':.*' != f.options.0.1|string_remove_regex:':.*' %}
+ <span class="help-inline pull-left">You can only apply one filter to the table. This filter will override the current filter.</span>
+ {% endif %}
+ {% endif %}
+ </div>
+</form>
+{% endwith %}
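+{% comment %}
+  Sketch (an assumption added here, not part of the original snippet): each
+  entry of f.options iterated above appears to be a sequence where index 0 is
+  the radio label, index 1 the filter value (for example 'outcome:1' or the
+  special 'daterange'), and index 3 optional help text; index 2 is not read in
+  this snippet. A hypothetical entry:
+
+    ('Successful builds', 'outcome:0', None, 'Builds that completed without errors')
+{% endcomment %}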
diff --git a/bitbake/lib/toaster/toastergui/templates/generic-toastertable-page.html b/bitbake/lib/toaster/toastergui/templates/generic-toastertable-page.html
new file mode 100644
index 0000000..33aa8ce
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/generic-toastertable-page.html
@@ -0,0 +1,17 @@
+{% extends "baseprojectpage.html" %}
+{% load projecttags %}
+{% load humanize %}
+{% load static %}
+
+{% block projectinfomain %}
+
+<h2>{{title}} (<span class="table-count-{{table_name}}"></span>)
+ {% if project.release %}
+ <i class="icon-question-sign get-help heading-help" title="This page lists {{title}} compatible with the release selected for this project, which is {{project.release.description}}"></i>
+ {% endif %}
+</h2>
+
+{% url table_name project.id as xhr_table_url %}
+{% include "toastertable.html" %}
+
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/importlayer.html b/bitbake/lib/toaster/toastergui/templates/importlayer.html
new file mode 100644
index 0000000..ce3d724
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/importlayer.html
@@ -0,0 +1,138 @@
+{% extends "base.html" %}
+{% load projecttags %}
+{% load humanize %}
+{% load static %}
+{% block pagecontent %}
+
+{% include "projecttopbar.html" %}
+
+
+ {% if project and project.release %}
+ <script src="{% static 'js/layerDepsModal.js' %}"></script>
+ <script src="{% static 'js/importlayer.js' %}"></script>
+ <script>
+ $(document).ready(function (){
+ var ctx = {
+ xhrImportLayerUrl : "{% url 'xhr_importlayer' %}",
+ };
+
+ try {
+ importLayerPageInit(ctx);
+ } catch (e) {
+          document.write("Sorry, an error occurred while loading this page");
+ console.warn(e);
+ }
+ });
+ </script>
+
+ <form class="span11">
+ <fieldset>
+ <legend>Layer repository information</legend>
+ <span class="help-block">The layer you are importing must be compatible with <strong>{{project.release.description}}</strong>, which is the release you are using in this project.</span>
+ <div class="alert alert-error" id="import-error" style="display:none">
+ <button type="button" class="close" data-dismiss="alert">×</button>
+ <h3> </h3>
+ <p></p>
+ <ul></ul>
+ </div>
+
+ <div class="control-group" id="layer-name-ctrl">
+ <label class="control-label air" for="import-layer-name">
+ Layer name
+ <span class="icon-question-sign get-help" title="Something like 'meta-mylayer'. Your layer name must be unique and can only include letters, numbers and dashes"></span>
+ </label>
+ <div class="controls">
+ <input id="import-layer-name" type="text" required autofocus data-autocomplete="off" data-provide="typeahead">
+ <span class="help-inline" style="display: none;" id="invalid-layer-name-hint">A valid layer name can only include letters, numbers and dashes</span>
+ <span class="help-inline" style="display: none;" id="duplicated-layer-name-hint"></span>
+ </div>
+
+ </div>
+ <div id="duplicate-layer-info" style="display:none">
+ <div class="alert warning">
+ <h3>A layer called <a href="" class="dup-layer-link"><span class="dup-layer-name"></span></a> already exists</h3>
+          <p>Layer names must be unique. Please use a different layer name.</p>
+ </div>
+ <dl>
+ <dt>
+ The <span class="dup-layer-name"></span> repository url is
+ </dt>
+ <dd>
+ <span id="dup-layer-vcs-url"></span>
+ </dd>
+
+ <dt>
+ The <span class="dup-layer-name"></span> revision is
+ </dt>
+ <dd>
+ <span id="dup-layer-revision"></span>
+ </dd>
+ </dl>
+
+ <p><a href="" class="dup-layer-link">View the <span class="dup-layer-name"></span> layer information</a></p>
+
+ </div>
+
+ <div class="fields-apart-from-layer-name">
+ <label for="layer-git-repo-url" class="project-form">
+ Git repository URL
+ <span class="icon-question-sign get-help" title="Fetch/clone URL of the repository. Currently, Toaster only supports Git repositories." ></span>
+ </label>
+
+ <input type="text" id="layer-git-repo-url" class="input-xxlarge" required>
+ <label class="project-form" for="layer-subdir">
+ Repository subdirectory
+ <span class="muted">(optional)</span>
+ <span class="icon-question-sign get-help" title="Subdirectory within the repository where the layer is located, if not in the root (usually only used if the repository contains more than one layer)"></span>
+ </label>
+ <input type="text" id="layer-subdir">
+
+ <div class="control-group" id="layer-revision-ctrl">
+ <label class="control-label project-form" for="layer-git-ref">Revision
+ <span class="icon-question-sign get-help" title="You can provide a Git branch, a tag or a commit SHA as the revision"></span>
+ </label>
+ <div class="controls">
+ <input type="text" class="span3" id="layer-git-ref" required>
+            <span class="help-inline" style="display:none;" id="invalid-layer-revision-hint"></span>
+ </div>
+ </div>
+ </div>
+
+ </fieldset>
+
+ <div class="fields-apart-from-layer-name">
+ <fieldset class="air">
+ <legend>
+ Layer dependencies
+ <span class="muted">(optional)</span>
+ <span class="icon-question-sign get-help heading-help" title="Other layers this layer depends upon"></span>
+ </legend>
+ <ul class="unstyled configuration-list" id="layer-deps-list">
+ </ul>
+ <div class="input-append">
+ <input type="text" autocomplete="off" data-minLength="1" data-autocomplete="off" data-provide="typeahead" placeholder="Type a layer name" id="layer-dependency" class="input-xlarge">
+ <a class="btn" id="add-layer-dependency-btn">
+ Add layer
+ </a>
+ </div>
+ <span class="help-inline">You can only add layers Toaster knows about</span>
+ </fieldset>
+ <div class="air" id="form-actions">
+ <button class="btn btn-primary btn-large" data-toggle="modal" id="import-and-add-btn" data-target="#dependencies-message" disabled>Import and add to project</button>
+ <span class="help-inline" id="import-and-add-hint" style="vertical-align: middle;">To import a layer you need to enter a layer name, a Git repository URL and a revision (branch, tag or commit)</span>
+ </div>
+ </div>
+ </form>
+
+ {% else %} {#project and project release#}
+ <div class="page-header">
+ <h1>Import layer</h1>
+ </div>
+ <div class="alert alert-info" id="import-error" >
+ <h3>Unsupported project type</h3>
+ <p>This project does not support importing layers.</p>
+ <ul></ul>
+ </div>
+
+ {% endif %}
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/js-unit-tests.html b/bitbake/lib/toaster/toastergui/templates/js-unit-tests.html
new file mode 100644
index 0000000..5b8fd84
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/js-unit-tests.html
@@ -0,0 +1,39 @@
+{% extends "base.html" %}
+{% load projecttags %}
+{% load humanize %}
+{% load static %}
+{% block pagecontent %}
+
+<link rel="stylesheet" href="//code.jquery.com/qunit/qunit-1.18.0.css" />
+
+<script src="//code.jquery.com/qunit/qunit-1.18.0.js"></script>
+
+<script src="{% static 'js/layerDepsModal.js' %}"></script>
+<script src="{% static 'js/projectpage.js' %}"></script>
+
+<script src="{% static 'js/bootstrap.min.js' %}"></script>
+<script src="{% static 'js/filtersnippet.js' %}"></script>
+<script src="{% static 'js/importlayer.js' %}"></script>
+<script src="{% static 'js/prettify.js' %}"></script>
+<script src="{% static 'js/layerBtn.js' %}"></script>
+<script src="{% static 'js/layerDepsModal.js' %}"></script>
+<script src="{% static 'js/projectpage.js' %}"></script>
+<script src="{% static 'js/layerdetails.js' %}"></script>
+<script src="{% static 'js/table.js' %}"></script>
+
+<script>
+ var tableUrl = '{% url 'projectlayers' project.pk %}';
+</script>
+
+<script src="{% static 'js/tests/test.js' %}"></script>
+
+<div id="qunit"></div>
+
+<input type="text" id="layers" placeholder="layers" ></input>
+<input type="text" id="recipes" placeholder="recipes"></input>
+<input type="text" id="projects" placeholder="projects"></input>
+<input type="text" id="machines" placeholder="machines"></input>
+
+{% endblock %}
+
+
diff --git a/bitbake/lib/toaster/toastergui/templates/landing.html b/bitbake/lib/toaster/toastergui/templates/landing.html
new file mode 100644
index 0000000..45e9532
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/landing.html
@@ -0,0 +1,58 @@
+{% extends "base.html" %}
+
+{% load static %}
+{% load projecttags %}
+{% load humanize %}
+
+{% block pagecontent %}
+
+ <div class="container-fluid">
+ <div class="row-fluid">
+ <!-- Empty - no data in database -->
+ <div class="hero-unit span12 well-transparent">
+ <div class="row-fluid">
+ <div class="span6">
+ <h1>
+ This is Toaster
+ </h1>
+ <p>A web interface to <a href="http://www.openembedded.org">OpenEmbedded</a> and <a href="http://www.yoctoproject.org/tools-resources/projects/bitbake">BitBake</a>, the <a href="http://www.yoctoproject.org">Yocto Project</a> build system.</p>
+
+
+ {% if lvs_nos %}
+ <p class="hero-actions">
+ <a class="btn btn-primary btn-large" href="{% url 'newproject' %}">
+ To start building, create your first Toaster project
+ </a>
+ </p>
+ {% else %}
+ <div class="alert alert-info lead air">
+ Toaster has no layer information. Without layer information, you cannot run builds. To generate layer information you can:
+ <ul>
+ <li>
+ <a href="http://www.yoctoproject.org/docs/latest/toaster-manual/toaster-manual.html#layer-source">Configure a layer source</a>
+ </li>
+ <li>
+ <a href="{% url 'newproject' %}">Create a project</a>, then import layers
+ </li>
+ </ul>
+ </div>
+ {% endif %}
+
+ <ul class="unstyled">
+ <li>
+ <a href="http://www.yoctoproject.org/docs/latest/toaster-manual/toaster-manual.html">Read the Toaster manual</a>
+ </li>
+ <li>
+ <a href="https://wiki.yoctoproject.org/wiki/Contribute_to_Toaster">Contribute to Toaster</a>
+ </li>
+ </ul>
+
+ </div>
+ <div class="span6">
+ <img alt="Yocto Project" class="thumbnail" src="{% static 'img/toaster_bw.png' %}"/>
+ </div>
+ </div>
+ </div>
+ </div>
+
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/landing_not_managed.html b/bitbake/lib/toaster/toastergui/templates/landing_not_managed.html
new file mode 100644
index 0000000..5bc435d
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/landing_not_managed.html
@@ -0,0 +1,32 @@
+{% extends "base.html" %}
+
+{% load static %}
+{% load projecttags %}
+{% load humanize %}
+
+{% block pagecontent %}
+
+ <div class="container-fluid">
+ <div class="row-fluid">
+ <!-- Empty - no build module -->
+ <div class="page-header top-air">
+ <h1>
+ This page only works with Toaster in 'Build' mode
+ </h1>
+ </div>
+ <div class="alert alert-info lead">
+      <p>
+ The 'Build' mode allows you to configure and run your Yocto Project builds from Toaster.
+ <ul>
+ <li><a href="http://www.yoctoproject.org/docs/latest/toaster-manual/toaster-manual.html#intro-modes">
+ Read about the 'Build' mode
+ </a></li>
+ <li><a href="/">
+ View your builds
+ </a></li>
+ </ul>
+ </p>
+ </div>
+    </div> <!-- row-fluid -->
+  </div> <!-- container-fluid -->
+
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/layer_btn.html b/bitbake/lib/toaster/toastergui/templates/layer_btn.html
new file mode 100644
index 0000000..a2e9393
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/layer_btn.html
@@ -0,0 +1,9 @@
+<button class="btn btn-danger btn-block layer-exists-{{data.pk}} layerbtn" style="display:none;" data-layer='{ "id": {{data.pk}}, "name": "{{data.layer.name}}", "layerdetailurl": "{%url 'layerdetails' extra.pid data.pk%}"}' data-directive="remove" >
+ <i class="icon-trash"></i>
+ Delete layer
+</button>
+<button class="btn btn-block layer-add-{{data.pk}} layerbtn" data-layer='{ "id": {{data.pk}}, "name": "{{data.layer.name}}", "layerdetailurl": "{%url 'layerdetails' extra.pid data.pk%}"}' data-directive="add">
+ <i class="icon-plus"></i>
+ Add layer
+</button>
+
diff --git a/bitbake/lib/toaster/toastergui/templates/layerdetails.html b/bitbake/lib/toaster/toastergui/templates/layerdetails.html
new file mode 100644
index 0000000..7dd3db2
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/layerdetails.html
@@ -0,0 +1,276 @@
+{% extends "base.html" %}
+{% load projecttags %}
+{% load humanize %}
+{% load static %}
+
+{% block pagecontent %}
+
+<div class="section">
+ <ul class="breadcrumb">
+ <li class="muted">{{project.name}}:</li>
+ <li>
+ <a href="{% url 'project' project.id %}">Configuration</a>
+ <span class="divider">→</span>
+ </li>
+ <li><a href="{% url 'projectlayers' project.id %}">Compatible layers</a>
+ <span class="divider">→</span>
+ </li>
+ <li class="active">
+ {{layerversion.layer.name}} ({{layerversion.get_vcs_reference|truncatechars:13}})
+ </li>
+ </ul>
+</div>
+
+{# If this is not an imported layer then hide the edit ui #}
+{% if not layerversion.layer_source_id or layerversion.layer_source.sourcetype != layerversion.layer_source.TYPE_IMPORTED %}
+<style scoped>
+ .icon-pencil {
+ display:none;
+ }
+.delete-current-value{
+ display: none;
+}
+ li .icon-trash {
+ display:none;
+ }
+ .add-deps {
+ display:none;
+ }
+</style>
+{% endif %}
+
+
+<script src="{% static 'js/layerdetails.js' %}"></script>
+<script>
+
+ $(document).ready(function (){
+ var ctx = {
+ projectBuildsUrl : "{% url 'projectbuilds' project.id %}",
+ xhrUpdateLayerUrl : "{% url 'xhr_updatelayer' %}",
+ layerVersion : {
+ name : "{{layerversion.layer.name}}",
+ id : {{layerversion.id}},
+ commit: "{{layerversion.get_vcs_reference}}",
+ {%if layerversion.id in projectlayers %}
+ inCurrentPrj : true,
+ {% else %}
+ inCurrentPrj : false,
+ {% endif %}
+ layerdetailurl : "{% url 'layerdetails' project.id layerversion.id %}",
+ sourceId: {{layerversion.layer_source_id|json}},
+ }
+ };
+
+ try {
+ layerDetailsPageInit(ctx);
+ } catch (e) {
+          document.write("Sorry, an error occurred while loading this page");
+ console.warn(e);
+ }
+ });
+</script>
+
+<div class="row-fluid span11">
+ <div class="page-header">
+ <h1>{{layerversion.layer.name}} <small class="commit"
+ {% if layerversion.get_vcs_reference|length > 13 %}
+ data-toggle="tooltip" title="{{layerversion.get_vcs_reference}}"
+ {% endif %}>
+ ({{layerversion.get_vcs_reference|truncatechars:13}})</small></h1>
+ </div>
+</div>
+
+<!-- container for tabs -->
+<div class="row-fluid span7 tabbable">
+ <div class="alert alert-info lead" id="alert-area" style="display:none">
+ <button type="button" class="close" id="dismiss-alert">×</button>
+ <span id="alert-msg"></span>
+ </div>
+ <ul class="nav nav-pills">
+ <li class="active">
+ <a data-toggle="tab" href="#information" id="details-tab">Layer details</a>
+ </li>
+ <li>
+ <a data-toggle="tab" href="#recipes" class="muted" id="targets-tab">Recipes (<span class="table-count-recipestable"></span>)</a>
+ </li>
+ <li>
+ <a data-toggle="tab" href="#machines" class="muted" id="machines-tab">Machines (<span class="table-count-machinestable"></span>)</a>
+ </li>
+ </ul>
+ <div class="tab-content">
+ <span class="button-place">
+ {% if layerversion.id not in projectlayers %}
+ <button id="add-remove-layer-btn" data-directive="add" class="btn btn-large btn-block">
+ <span class="icon-plus"></span>
+ Add the {{layerversion.layer.name}} layer to your project
+ </button>
+ {% else %}
+ <button id="add-remove-layer-btn" data-directive="remove" class="btn btn-block btn-large btn-danger">
+ <span class="icon-trash"></span>
+ Delete the {{layerversion.layer.name}} layer from your project
+ </button>
+ {% endif %}
+ </span>
+
+ <!-- layer details pane -->
+ <div id="information" class="tab-pane active">
+ <dl class="dl-horizontal">
+ <dt class="">
+ <i class="icon-question-sign get-help" title="Fetch/clone URL of the repository"></i>
+ Repository URL
+ </dt>
+ <dd>
+ <span class="current-value">{{layerversion.layer.vcs_url}}</span>
+ {% if layerversion.get_vcs_link_url %}
+ <a href="{{layerversion.get_vcs_link_url}}/" class="icon-share get-info" target="_blank"></a>
+ {% endif %}
+ <form id="change-repo-form" class="control-group" style="display:none">
+ <div class="input-append">
+ <input type="text" class="input-xlarge" value="{{layerversion.layer.vcs_url}}">
+ <button data-layer-prop="vcs_url" class="btn change-btn" type="button">Save</button>
+ <a href="#" style="display:none" class="btn btn-link cancel">Cancel</a>
+ </div>
+ </form>
+ <i class="icon-pencil" ></i>
+ </dd>
+ <dt>
+ <i class="icon-question-sign get-help" title="Subdirectory within the repository where the layer is located, if not in the root (usually only used if the repository contains more than one layer)"></i>
+ Repository subdirectory
+ </dt>
+ <dd>
+ <span class="muted" style="display:none">Not set</span>
+ <span class="current-value">{{layerversion.dirpath}}</span>
+ {% if layerversion.get_vcs_dirpath_link_url %}
+ <a href="{{layerversion.get_vcs_dirpath_link_url}}" class="icon-share get-info" target="_blank"></a>
+ {% endif %}
+ <form id="change-subdir-form" style="display:none;">
+ <div class="input-append">
+ <input type="text" value="{{layerversion.dirpath}}">
+ <button data-layer-prop="dirpath" class="btn change-btn" type="button">Save</button>
+ <a href="#" style="display:none" class="btn btn-link cancel">Cancel</a>
+ </div>
+ </form>
+ <i id="change-subdir" class="icon-pencil"></i>
+ <span class="icon-trash delete-current-value" data-toggle="tooltip" title="Delete"></span>
+ </dd>
+ <dt>
+ <i class="icon-question-sign get-help" title="The Git branch, tag or commit"></i>
+ Revision
+ </dt>
+ <dd>
+ <span class="current-value">{{layerversion.get_vcs_reference}}</span>
+ <form style="display:none;">
+ <div class="input-append">
+ <input type="text" value="{{layerversion.get_vcs_reference}}">
+ <button data-layer-prop="commit" class="btn change-btn" type="button">Save</button>
+ <a href="#" style="display:none" class="btn btn-link cancel">Cancel</a>
+ </div>
+ </form>
+ <i class="icon-pencil"></i>
+ </dd>
+ <dt>
+ <i class="icon-question-sign get-help" title="Other layers this layer depends upon"></i>
+ Layer dependencies
+ </dt>
+ <dd>
+ <ul class="unstyled current-value" id="layer-deps-list">
+ {% for ld in layerversion.dependencies.all %}
+ <li data-layer-id="{{ld.depends_on.id}}">
+ <a data-toggle="tooltip" title="{{ld.depends_on.layer.vcs_url}} | {{ld.depends_on.get_vcs_reference}}" href="{% url 'layerdetails' project.id ld.depends_on.id %}">{{ld.depends_on.layer.name}}</a>
+ <span class="icon-trash " data-toggle="tooltip" title="Delete"></span>
+ </li>
+ {% endfor %}
+ </ul>
+ <div class="input-append add-deps">
+ <input type="text" autocomplete="off" data-minLength="1" data-autocomplete="off" placeholder="Type a layer name" id="layer-dep-input">
+ <a class="btn" id="add-layer-dependency-btn" >
+ Add layer
+ </a>
+ </div>
+ <span class="help-block add-deps">You can only add layers Toaster knows about</span>
+ </dd>
+ </dl>
+ </div>
+ <!-- end layerdetails tab -->
+ <!-- targets tab -->
+ <div id="recipes" class="tab-pane">
+ <!-- Recipe table -->
+ <div id="no-recipes-yet" class="alert alert-info" style="display:none">
+ <p>Toaster does not have recipe information for the <strong> {{layerversion.layer.name}} </strong> layer.</p>
+ <p>Toaster learns about layers when you build them. If this layer provides any recipes, they will be listed here after you build the <strong> {{layerversion.layer.name}} </strong> layer.</p>
+ </div>
+
+
+
+ {% url 'layerrecipestable' project.id layerversion.id as xhr_table_url %}
+ {% with "recipestable" as table_name %}
+ {% with "Recipes" as title %}
+ {% include 'toastertable-simple.html' %}
+ {% endwith %}
+ {% endwith %}
+ </div>
+
+ <div id="machines" class="tab-pane">
+
+ <div id="no-machines-yet" class="alert alert-info" style="display:none">
+ <p>Toaster does not have machine information for the <strong> {{layerversion.layer.name}} </strong> layer.</p>
+ <p>Toaster learns about layers when you build them. If this layer provides any machines, they will be listed here after you build the <strong> {{layerversion.layer.name}} </strong> layer.</p>
+ </div>
+
+
+ <!-- Machines table -->
+ {% url 'layermachinestable' project.id layerversion.id as xhr_table_url %}
+ {% with "machinestable" as table_name %}
+ {% with "Machines" as title %}
+ {% include 'toastertable-simple.html' %}
+ {% endwith %}
+ {% endwith %}
+ </div>
+ </div> <!-- end tab content -->
+ </div> <!-- end tabable -->
+
+ <div class="row-fluid span4 well"> <!-- info side panel -->
+ <h2>About {{layerversion.layer.name}}</h2>
+ <dl class="item-info">
+
+ <dt>
+ Summary
+ <i class="icon-question-sign get-help" title="One-line description of the layer"></i>
+ </dt>
+ <dd>
+ <span class="muted" style="display:none">Not set</span>
+ <span class="current-value">{{layerversion.layer.summary|default_if_none:''}}</span>
+ <form style="display:none; margin-bottom:20px">
+ <textarea class="span12" rows="2">{% if layerversion.layer.summary %}{{layerversion.layer.summary}}{% endif %}</textarea>
+ <button class="btn change-btn" data-layer-prop="summary" type="button">Save</button>
+ <a href="#" class="btn btn-link cancel">Cancel</a>
+ </form>
+ <i class="icon-pencil"></i>
+ <span class="icon-trash delete-current-value" data-toggle="tooltip" title="Delete"></span>
+ </dd>
+ <dt>
+ Description
+ </dt>
+ <dd>
+ <span class="muted" style="display:none">Not set</span>
+ <span class="current-value">{{layerversion.layer.description|default_if_none:''}}</span>
+ <form style="display:none; margin-bottom:20px">
+ <textarea class="span12" rows="6">{% if layerversion.layer.description %}{{layerversion.layer.description}}{% endif %}</textarea>
+ <button class="btn change-btn" data-layer-prop="description" type="button" >Save</button>
+ <a href="#" class="btn btn-link cancel">Cancel</a>
+ </form>
+ <i class="icon-pencil"></i>
+ <span class="icon-trash delete-current-value" data-toggle="tooltip" title="Delete"></span>
+ </dd>
+ {% if layerversion.layer.up_id %}
+ <dt>Layer index</dt>
+ <dd>
+ <a href="http://layers.openembedded.org/layerindex/branch/{{layerversion.up_branch.name}}/layer/{{layerversion.layer.name}}">layer index link</a>
+
+ </dd>
+ {% endif %}
+
+ </dl>
+ </div>
+
+ {% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/machine_btn.html b/bitbake/lib/toaster/toastergui/templates/machine_btn.html
new file mode 100644
index 0000000..d2cb55b
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/machine_btn.html
@@ -0,0 +1,8 @@
+<a href="{% url 'project' extra.pid %}?setMachine={{data.name}}" class="btn btn-block layer-exists-{{data.layer_version.id}}" style="margin-top: 5px; display:none">
+ Select machine</a>
+<button class="btn btn-block layerbtn layer-add-{{data.layer_version.id}}" data-layer='{ "id": {{data.layer_version.id}}, "name": "{{data.layer_version.layer.name}}", "layerdetailurl": "{%url 'layerdetails' extra.pid data.layer_version.id %}"}' data-directive="add">
+ <i class="icon-plus"></i>
+ Add layer
+ <i title="" class="icon-question-sign get-help" data-original-title="To enable this machine, you must first add the {{data.layer_version.layer.name}} layer to your project"></i>
+</button>
+
diff --git a/bitbake/lib/toaster/toastergui/templates/mrb_section.html b/bitbake/lib/toaster/toastergui/templates/mrb_section.html
new file mode 100644
index 0000000..396fb8e
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/mrb_section.html
@@ -0,0 +1,121 @@
+{% load static %}
+{% load projecttags %}
+{% load humanize %}
+
+
+{%if mru.count > 0%}
+
+ <div class="page-header">
+ <h1>
+ Latest builds
+ </h1>
+ </div>
+ <div id="latest-builds">
+ {% for build in mru %}
+ <div class="alert {%if build.outcome == build.SUCCEEDED%}alert-success{%elif build.outcome == build.FAILED%}alert-error{%else%}alert-info{%endif%} project-name ">
+ <span class="label {%if build.outcome == build.SUCCEEDED%}label-success{%elif build.outcome == build.FAILED%}label-important{%else%}label-info{%endif%}">
+ <a href={% url 'project' build.project.pk %}>
+ {{build.project.name}}
+ </a>
+ </span>
+
+ <div class="row-fluid">
+ <div class="span3 lead">
+ {%if build.outcome == build.SUCCEEDED or build.outcome == build.FAILED %}
+ <a href="{%url 'builddashboard' build.pk%}" class="{%if build.outcome == build.SUCCEEDED %}success{%else%}error{%endif%}">
+ {% endif %}
+ {% if build.target_set.all.count > 0 %}
+ <span data-toggle="tooltip"
+ {%if build.target_set.all.count > 1%}
+ title="Targets: {%for target in build.target_set.all%}{{target.target}} {%endfor%}"
+ {%endif%}
+ >
+
+ {{build.target_set.all.0.target}} {%if build.target_set.all.count > 1%}(+ {{build.target_set.all.count|add:"-1"}}){%endif%}
+ </span>
+ {% endif %}
+ {%if build.outcome == build.SUCCEEDED or build.outcome == build.FAILED %}
+ </a>
+ {% endif %}
+ </div>
+ <div class="span2 lead">
+ {% if build.completed_on|format_build_date %}
+ {{ build.completed_on|date:'d/m/y H:i' }}
+ {% else %}
+ {{ build.completed_on|date:'H:i' }}
+ {% endif %}
+ </div>
+ {%if build.outcome == build.SUCCEEDED or build.outcome == build.FAILED %}
+ <div class="span2 lead">
+ {% if build.errors.count %}
+ <i class="icon-minus-sign red"></i> <a href="{%url 'builddashboard' build.pk%}#errors" class="error">{{build.errors.count}} error{{build.errors.count|pluralize}}</a>
+ {% endif %}
+ </div>
+ <div class="span2 lead">
+ {% if build.warnings.count %}
+ <i class="icon-warning-sign yellow"></i> <a href="{%url 'builddashboard' build.pk%}#warnings" class="warning">{{build.warnings.count}} warning{{build.warnings.count|pluralize}}</a>
+ {% endif %}
+ </div>
+ <div class="lead ">
+ <span class="lead">
+ Build time: <a href="{% url 'buildtime' build.pk %}">{{ build.timespent_seconds|sectohms }}</a>
+ </span>
+ <button class="btn
+ {% if build.outcome == build.SUCCEEDED %}
+ btn-success
+ {% elif build.outcome == build.FAILED %}
+ btn-danger
+ {% else %}
+ btn-info
+ {%endif%}
+ pull-right"
+ onclick='scheduleBuild({% url 'projectbuilds' build.project.id as bpi %}{{bpi|json}},
+ {{build.project.name|json}},
+ {% url 'project' build.project.id as bpurl %}{{bpurl|json}},
+ {{build.target_set.all|get_tasks|json}})'>
+
+ Run again
+ </button>
+ </div>
+ {%endif%}
+ {%if build.outcome == build.IN_PROGRESS %}
+ <div class="span4">
+ <div class="progress" style="margin-top:5px;" data-toggle="tooltip" title="{{build.completeper}}% of tasks complete">
+ <div style="width: {{build.completeper}}%;" class="bar"></div>
+ </div>
+ </div>
+ <div class="lead pull-right">{{build.completeper}}% of tasks complete</div>
+ {%endif%}
+ </div>
+ </div>
+
+ {% endfor %}
+ </div>
+
+<script>
+
+function scheduleBuild(url, projectName, projectUrl, buildlist) {
+ console.log("scheduleBuild");
+ libtoaster.startABuild(url, null, buildlist.join(" "), function(){
+ console.log("reloading page");
+ window.location.reload();
+ }, null);
+}
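+
+/* Note (an assumption added here, not part of the original file):
+   libtoaster.startABuild is provided by the shared libtoaster script.
+   Mirroring the call above, a hypothetical "Run again" request for a single
+   target would look like:
+
+       scheduleBuild("/toastergui/project/1/builds/",   // hypothetical builds URL
+                     "my-project",                      // hypothetical project name
+                     "/toastergui/project/1/",          // hypothetical project URL
+                     ["core-image-minimal"]);           // target list
+*/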
+
+$(document).ready(function(){
+
+ $(".cancel-build-btn").click(function (){
+ var url = $(this).data('request-url');
+ var buildIds = $(this).data('build-id');
+ var btn = $(this);
+
+ libtoaster.cancelABuild(url, buildIds, function(){
+ btn.parents(".alert").fadeOut();
+ }, null);
+ });
+});
+
+</script>
+
+{%endif%}
+
diff --git a/bitbake/lib/toaster/toastergui/templates/newproject.html b/bitbake/lib/toaster/toastergui/templates/newproject.html
new file mode 100644
index 0000000..997390b
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/newproject.html
@@ -0,0 +1,130 @@
+{% extends "base.html" %}
+{% load projecttags %}
+{% load humanize %}
+{% block pagecontent %}
+<div class="row-fluid">
+ <div class="page-header">
+ <h1>Create a new project</h1>
+ </div>
+ <div class="container-fluid">
+ {% if alert %}
+ <div class="alert alert-error row-fluid" role="alert">{{alert}}</div>
+ {% endif %}
+ </div>
+
+ <div class="row-fluid">
+ <div class="span6">
+ <form method="POST">{% csrf_token %}
+
+ <fieldset>
+ <label>Project name <span class="muted">(required)</span></label>
+ <input type="text" class="input-xlarge" required id="new-project-name" name="projectname">
+ </fieldset>
+<!--
+ <fieldset>
+ <label class="project-form">Project type</label>
+ <label class="project-form radio"><input type="radio" name="ptype" value="analysis" checked/> Analysis Project</label>
+
+ {% if releases.count > 0 %}
+ <label class="project-form radio"><input type="radio" name="ptype" value="build" checked /> Build Project</label>
+ {% endif %}
+ </fieldset> -->
+ <input type="hidden" name="ptype" value="build" />
+
+ {% if releases.count > 0 %}
+ <fieldset class="release">
+ {% if releases.count > 1 %}
+ <label class="project-form">
+ Release
+ <i class="icon-question-sign get-help" title="The version of the build system you want to use"></i>
+ </label>
+ <select name="projectversion" id="projectversion">
+ {% for release in releases %}
+ <option value="{{release.id}}"
+ {%if defaultbranch == release.name %}
+ selected
+ {%endif%}
+ >{{release.description}}</option>
+ {% endfor %}
+ </select>
+ {% for release in releases %}
+ <div class="row-fluid helptext" id="description-{{release.id}}" style="display: none">
+ <span class="help-block span5">{{release.helptext|safe}}</span>
+ </div>
+ {% endfor %}
+ {% else %}
+ <input type="hidden" name="projectversion" value="{{releases.0.id}}"/>
+ {% endif %}
+ </fieldset>
+ {% endif %}
+
+ <div class="form-actions">
+ <input type="submit" id="create-project-button" class="btn btn-primary btn-large" value="Create project"/>
+ <span class="help-inline" style="vertical-align:middle;">To create a project, you need to enter a project name</span>
+ </div>
+ </form>
+ </div>
+ <!--
+ <div class="span5 well">
+ <span class="help-block">
+ <h4>Toaster project types</h4>
+ <p>With a <strong>build project</strong> you configure and run your builds from Toaster.</p>
+ <p>With an <strong>analysis project</strong>, the builds are configured and run by another tool
+ (something like Buildbot or Jenkins), and the project only collects the information about the
+ builds (packages, recipes, dependencies, logs, etc). </p>
+ <p>You can read more on <a href="#">how to set up an analysis project</a>
+ in the Toaster manual.</p>
+ <h4>Release</h4>
+ <p>If you create a <strong>build project</strong>, you will need to select a <strong>release</strong>,
+ which is the version of the build system you want to use to run your builds.</p>
+ </div> -->
+ </div>
+ </div>
+
+ <script type="text/javascript">
+ $(document).ready(function () {
+ // hide the new project button
+ $("#new-project-button").hide();
+ $('.btn-primary').attr('disabled', 'disabled');
+
+ // enable submit button when all required fields are populated
+ $("input#new-project-name").on('input', function() {
+ if ($("input#new-project-name").val().length > 0 ){
+ $('.btn-primary').removeAttr('disabled');
+ $(".help-inline").css('visibility','hidden');
+ }
+ else {
+ $('.btn-primary').attr('disabled', 'disabled');
+ $(".help-inline").css('visibility','visible');
+ }
+ });
+
+ // show relevant help text for the selected release
+ var selected_release = $('select').val();
+ $("#description-" + selected_release).show();
+
+
+ $('select').change(function(){
+ var new_release = $('select').val();
+ $(".helptext").hide();
+ $('#description-' + new_release).fadeIn();
+ });
+
+/* // Hide the project release when you select an analysis project
+ function projectType() {
+ if ($("input[type='radio']:checked").val() == 'build') {
+ $('.release').fadeIn();
+ }
+ else {
+ $('.release').fadeOut();
+ }
+ }
+ projectType();
+
+ $('input:radio').change(function(){
+ projectType();
+ }); */
+ });
+ </script>
+
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/package_built_dependencies.html b/bitbake/lib/toaster/toastergui/templates/package_built_dependencies.html
new file mode 100644
index 0000000..e6f20c3
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/package_built_dependencies.html
@@ -0,0 +1,99 @@
+{% extends "package_detail_base.html" %}
+{% load projecttags %}
+
+{% block tabcontent %}
+ <ul class="nav nav-pills">
+ <li class="">
+ <a href="{% url 'package_built_detail' build.id package.id %}">
+ <i class="icon-question-sign get-help" title="Shows the files produced by this package."></i>
+ Generated files ({{package.buildfilelist_package.count}})
+ </a>
+ </li>
+ <li class="active">
+ <a href="{% url 'package_built_dependencies' build.id package.id %}">
+ <i class="icon-question-sign get-help" title="Shows the runtime packages required by this package."></i>
+ Runtime dependencies ({{dependency_count}})
+ </a>
+ </li>
+ </ul>
+ <div class="tab-content">
+ <div class="tab-pane active" id="dependencies">
+ {% ifequal runtime_deps|length 0 %}
+ <div class="alert alert-info">
+ <strong>{{package.fullpackagespec}}</strong> has no runtime dependencies.
+ </div>
+ {% else %}
+ <div class="alert alert-info">
+ <strong>{{package.fullpackagespec}}</strong> is <strong>not included</strong> in any image. This page shows you the projected runtime dependencies if you include <strong>{{package.fullpackagespec}}</strong> in future builds.
+ </div>
+ <table class="table table-bordered table-hover">
+ <thead>
+ <tr>
+ <th>Package</th>
+ <th>Version</th>
+ <th class="sizecol span2">Size</th>
+ </tr>
+ </thead>
+ <tbody>
+ {% for runtime_dep in runtime_deps %}
+ <tr {{runtime_dep.size|format_vpackage_rowclass}} >
+ {% if runtime_dep.size != -1 %}
+ <td>
+ <a href="{% url 'package_built_detail' build.id runtime_dep.depends_on_id %}">
+ {{runtime_dep.name}}
+ </a>
+ </td>
+ {% else %}
+ <td>
+ {{runtime_dep.name|format_vpackage_namehelp}}
+ </td>
+ {% endif %}
+ <td>{{runtime_dep.version}}</td>
+ <td class="sizecol">{{runtime_dep.size|filtered_filesizeformat}}</td>
+ </tr>
+ {% endfor %}
+ </tbody>
+ </table>
+ {% endifequal %}
+ {% ifnotequal other_deps|length 0 %}
+ <h3>Other runtime relationships</h3>
+ <table class="table table-bordered table-hover">
+ <thead>
+ <tr>
+ <th>Package</th>
+ <th>Version</th>
+ <th class="sizecol span2">Size</th>
+ <th>
+ <i class="icon-question-sign get-help" title="Five relationship types exist: recommends, suggests, provides, replaces and conflicts"></i>
+ Relationship type
+ </th>
+ </tr>
+ </thead>
+ <tbody>
+ {% for other_dep in other_deps %}
+ <tr {{other_dep.size|format_vpackage_rowclass}} >
+ {% if other_dep.size != -1 %}
+ <td>
+ <a href="{% url 'package_built_detail' build.id other_dep.depends_on_id %}">
+ {{other_dep.name}}
+ </a>
+ </td>
+ {% else %}
+ <td>
+ {{other_dep.name|format_vpackage_namehelp}}
+ </td>
+ {% endif %}
+ <td>{{other_dep.version}}</td>
+ <td class="sizecol">{{other_dep.size|filtered_filesizeformat}}</td>
+ <td>
+ {{other_dep.dep_type_display}}
+ <i class="icon-question-sign get-help hover-help" title="{{other_dep.dep_type_help}}" ></i>
+ </td>
+ </tr>
+ {% endfor %}
+ </tbody>
+ </table>
+ {% endifnotequal %}
+ </div> <!-- tab-pane -->
+ </div> <!-- tab-content -->
+{% endblock tabcontent %}
diff --git a/bitbake/lib/toaster/toastergui/templates/package_built_detail.html b/bitbake/lib/toaster/toastergui/templates/package_built_detail.html
new file mode 100644
index 0000000..9be8ccb
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/package_built_detail.html
@@ -0,0 +1,65 @@
+{% extends "package_detail_base.html" %}
+{% load projecttags %}
+
+{% block tabcontent %}
+ {% with packageFileCount=package.buildfilelist_package.count %}
+ <!-- Generated Files -->
+ {% if package.buildtargetlist_package.count == 0 %}
+ {# Not included case #}
+ <ul class="nav nav-pills">
+ <li class="active"> <a href="#">
+ <i class="icon-question-sign get-help" title="Files added to a root file system when you include {{package.name}} in an image"></i>
+ Generated files ({{packageFileCount}})
+ </a></li>
+ <li class=""><a href="{% url 'package_built_dependencies' build.id package.id %}">
+ <i class="icon-question-sign get-help" title="Projected runtime dependencies when you include {{package.name}} in an image"></i>
+ Runtime dependencies ({{dependency_count}})
+ </a></li>
+ </ul>
+ <div class="tab-content">
+ <div class="tab-pane active" id="files">
+ <!-- Package file list or if empty, alert pane -->
+ {% if packageFileCount > 0 %}
+ <div class="alert alert-info">
+ <strong>{{package.fullpackagespec}}</strong> is <strong>not included</strong> in any image. This page shows you the files added to an image root file system if you include <strong>{{package.fullpackagespec}}</strong> in future builds.
+ </div>
+ {% include "tablesort.html" %}
+ <tbody>
+ {% for file in objects %}
+ <tr>
+ <td class="path">{{file.path}}</td>
+ <td class="filesize sizecol">{{file.size|filtered_filesizeformat}}</td>
+ </tr>
+ {% endfor %}
+ </tbody>
+ </table>
+
+ {% else %}
+ <div class="alert alert-info">
+ <strong>{{package.fullpackagespec}}</strong> does not generate any files.
+ </div>
+ {% endif %}
+
+ </div> <!-- tab-pane active -->
+ </div> <!-- tab-content -->
+ {% else %}
+ {# Included case #}
+ <div class="tab-content">
+ <div class="tab-pane active">
+ <div class="lead well">
+ Package included in:
+ {% for itarget in package.buildtargetlist_package.all|dictsort:"target.target" %}
+ <a href="{% url 'package_included_detail' build.id itarget.target.id package.id %}">
+ {% if forloop.counter0 > 0 %}
+ ,
+ {% endif %}
+ {{itarget.target.target}}
+ </a>
+ {% endfor %}
+ </div>
+ </div> <!-- tab-pane active -->
+ </div> <!-- tab-content -->
+ {% endif %}
+
+ {% endwith %}
+{% endblock tabcontent %}
diff --git a/bitbake/lib/toaster/toastergui/templates/package_detail_base.html b/bitbake/lib/toaster/toastergui/templates/package_detail_base.html
new file mode 100644
index 0000000..a24bc8e
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/package_detail_base.html
@@ -0,0 +1,142 @@
+{% extends "basebuilddetailpage.html" %}
+{% load projecttags %}
+
+{% block extraheadcontent %}
+ <!-- functions to format package 'installed_package' alias -->
+ <script>
+ function fmtAliasHelp(package_name, alias, hover) {
+ var r = null;
+ if (alias != null && alias != '') {
+ r = '<span class="muted"> as ' + alias + ' ';
+ r += '<i class="icon-question-sign get-help';
+ if (hover) {
+ r+= ' hover-help';
+ }
+ else {
+ r+= ' heading-help';
+ }
+ r += '"';
+ title = package_name + ' was renamed at packaging time and was installed on your system as ' + alias;
+ r += ' title="' + title + '">';
+ r += '</i>';
+ r += '</span>';
+ document.write(r);
+ }
+ }
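+
+    /* Example (an assumption added here, mirroring how the package detail
+       templates call this helper):
+
+           fmtAliasHelp("busybox", "busybox-hwclock", true);
+
+       writes a muted "as busybox-hwclock" span followed by a hover-help icon,
+       and writes nothing when the alias is empty. The names are hypothetical. */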
+ </script>
+{% endblock extraheadcontent %}
+{% block localbreadcrumb %}
+ {% if target %}
+ <li><a href="{% url "target" build.id target.id %}">{{target.target}}</a></li>
+ <li>{{package.fullpackagespec}} {% if package.alias %} as {{package.alias}}{% endif %}</li>
+ {% else %}
+ <li><a href="{% url "packages" build.id %}"> Packages </a></li>
+ <li>{{package.fullpackagespec}}</li>
+ {% endif %}
+{% endblock localbreadcrumb %}
+
+{% block pagedetailinfomain %}
+ <div class="row span11">
+ <div class="page-header">
+ {% block title %}
+ <h1>{{package.fullpackagespec}}</h1>
+ {% endblock title %}
+ </div> <!-- page-header -->
+ </div> <!-- row span11 page-header -->
+
+ {% block twocolumns %}
+ <div class="row span7 tabbable">
+ {% block tabcontent %}
+ {% endblock tabcontent %}
+ </div> <!-- row span7 -->
+
+ <div class="row span4 well">
+ <h2>Package information</h2>
+
+ <!-- info presented as definition list -->
+ <dl class="item-info">
+ <dt>
+ Size
+ <i class="icon-question-sign get-help" title="The size of the package"></i>
+ </dt>
+ <dd>
+ {% comment %}
+ if recipe is absent, filesize is not 0
+ {% endcomment %}
+ {% if package.recipe_id > 0 %}
+ {{package.size|filtered_filesizeformat}}
+ {% if target.file_size %}
+ ({{package.size|multiply:100|divide:target.file_size}}% of included package size)
+ {% endif %}
+
+ {% endif %}
+ </dd>
+
+ <dt>
+ License
+ <i class="icon-question-sign get-help" title="The license under which this package is distributed"></i>
+ </dt>
+ <dd>{{package.license}}</dd>
+
+ {% comment %}
+ # Removed per review on 1/18/2014 until license data population
+      # problems are resolved.
+ <dt>
+ License files
+ <i class="icon-question-sign get-help" title="Path to the license files that apply to the package"></i>
+ </dt>
+ <dd></dd>
+ {% endcomment %}
+
+ <dt>
+ Recipe
+ <i class="icon-question-sign get-help" title="The name of the recipe building this package"></i>
+ </dt>
+ <dd>
+ {% if package.recipe_id > 0 %}
+ <a href="{% url "recipe" build.id package.recipe_id %}"> {{package.recipe.name}} </a>
+ {% else %}
+ {{package.recipe.name}}
+ {% endif %}
+ </dd>
+
+ <dt>
+ Recipe version
+ <i class="icon-question-sign get-help" title="The version of the recipe building this package"></i>
+ </dt>
+ <dd>{{package.recipe.version}}</dd>
+
+ <dt>
+ Layer
+ <i class="icon-question-sign get-help" title="The name of the layer providing the recipe that builds this package"></i>
+ </dt>
+ <dd>
+ {{package.recipe.layer_version.layer.name}}
+ {% if package.recipe.layer_version.layer.name|format_none_and_zero != "" %}
+ {% comment %}
+ # Removed per team meeting of 1/29/2014 until
+ # decision on index search algorithm
+ <a href="http://layers.openembedded.org" target="_blank">
+ <i class="icon-share get-info"></i>
+ </a>
+ {% endcomment %}
+ {% endif %}
+ </dd>
+ {% if package.recipe.layer_version.branch %}
+ <dt>
+ Layer branch
+ <i class="icon-question-sign get-help" title="The Git branch of the layer providing the recipe that builds this package"></i>
+ </dt>
+ <dd>{{package.recipe.layer_version.branch}}</dd>
+ {% endif %}
+ <dt>
+ Layer commit
+ <i class="icon-question-sign get-help" title="The Git commit of the layer providing the recipe that builds this package"></i>
+ </dt>
+
+ <dd class="iscommit">{{package.recipe.layer_version.commit}}</dd>
+
+ </dl>
+ </div> <!-- row4 well -->
+ {% endblock twocolumns %}
+{% endblock pagedetailinfomain %}
diff --git a/bitbake/lib/toaster/toastergui/templates/package_included_dependencies.html b/bitbake/lib/toaster/toastergui/templates/package_included_dependencies.html
new file mode 100644
index 0000000..642ca69
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/package_included_dependencies.html
@@ -0,0 +1,110 @@
+{% extends "package_detail_base.html" %}
+{% load projecttags %}
+
+{% block title %}
+ <h1>
+ {{package.fullpackagespec}}
+ <script> fmtAliasHelp("{{package.name}}", "{{package.alias}}", false) </script>
+ <small>({{target.target}})</small>
+ </h1>
+{% endblock title %}
+
+{% block tabcontent %}
+ {% with packageFileCount=package.buildfilelist_package.count %}
+ {% include "package_included_tabs.html" with active_tab="dependencies" %}
+ <div class="tab-content">
+ <div class="tab-pane active" id="dependencies">
+ {% ifnotequal runtime_deps|length 0 %}
+ <table class="table table-bordered table-hover">
+ <thead>
+ <tr>
+ <th>Package</th>
+ <th>Version</th>
+ <th class='sizecol span2'>Size</th>
+ </tr>
+ </thead>
+ <tbody>
+ {% for runtime_dep in runtime_deps %}
+ <tr {{runtime_dep.size|format_vpackage_rowclass}} >
+ {% if runtime_dep.size != -1 %}
+ <td>
+ <a href="{% url 'package_included_detail' build.id target.id runtime_dep.depends_on_id %}">
+ {{runtime_dep.name}}
+ </a>
+ <script>fmtAliasHelp("{{runtime_dep.name}}", "{{runtime_dep.alias}}", true)</script>
+ </td>
+ {% else %}
+ <td>
+ {{runtime_dep.name|format_vpackage_namehelp}}
+ </td>
+ {% endif %}
+ <td>{{runtime_dep.version}} </td>
+ <td class='sizecol'>{{runtime_dep.size|filtered_filesizeformat}} </td>
+ </tr>
+ {% endfor %}
+ </tbody>
+ </table>
+ {% else %}
+ <div class="alert alert-info">
+ <strong>{{package.fullpackagespec}}</strong> has no runtime dependencies.
+ </div>
+ {% endifnotequal %}
+
+ {% ifnotequal other_deps|length 0 %}
+ <h3>Other runtime relationships</h3>
+ <table class="table table-bordered table-hover">
+ <thead>
+ <tr>
+ <th>Package</th>
+ <th>Version</th>
+ <th class='sizecol span2'>Size</th>
+ <th>
+ <i class="icon-question-sign get-help" title="Five relationship types exist: recommends, suggests, provides, replaces and conflicts"></i>
+ Relationship type
+ </th>
+ </tr>
+ </thead>
+ <tbody>
+ {% for other_dep in other_deps %}
+ {% if other_dep.installed %}
+ <tr {{other_dep.size|format_vpackage_rowclass}}>
+ {% if other_dep.size != -1 %}
+ <td>
+ <a href="{% url 'package_included_detail' build.id target.id other_dep.depends_on_id %}">
+ {{other_dep.name}}
+ <script>
+ fmtAliasHelp("{{other_dep.name}}","{{other_dep.alias}}", true)
+ </script>
+ </a>
+ </td>
+ {% else %}
+ <td>
+ {{other_dep.name|format_vpackage_namehelp}}
+ </td>
+ {% endif %}
+ <td>{{other_dep.version}} </td>
+ <td class='sizecol'>{{other_dep.size|filtered_filesizeformat}} </td>
+ <td>
+ {{other_dep.dep_type_display}}
+ <i class="icon-question-sign get-help hover-help" title="{{other_dep.dep_type_help}}" ></i>
+ </td>
+ </tr>
+ {% else %}
+ <tr class="muted">
+ <td>{{other_dep.name}}</td>
+ <td>{{other_dep.version}}</td>
+ <td></td>
+ <td>
+ {{other_dep.dep_type_display}}
+ <i class="icon-question-sign get-help hover-help" title="{{other_dep.dep_type_help}}" ></i>
+ </td>
+ </tr>
+ {% endif %}
+ {% endfor %}
+ </tbody>
+ </table>
+ {% endifnotequal %}
+ </div> <!-- end tab-pane -->
+ </div> <!-- end tab content -->
+ {% endwith %}
+{% endblock tabcontent %}
diff --git a/bitbake/lib/toaster/toastergui/templates/package_included_detail.html b/bitbake/lib/toaster/toastergui/templates/package_included_detail.html
new file mode 100644
index 0000000..d2aa26e
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/package_included_detail.html
@@ -0,0 +1,44 @@
+{% extends "package_detail_base.html" %}
+{% load projecttags %}
+
+{% block title %}
+ <h1>
+ {{package.fullpackagespec}}
+ <script>
+ fmtAliasHelp("{{package.name}}", "{{package.alias}}", false)
+ </script>
+ <small>({{target.target}})</small>
+ </h1>
+{% endblock title %}
+
+{% block tabcontent %}
+{% with packageFileCount=package.buildfilelist_package.count %}
+ {% include "package_included_tabs.html" with active_tab="detail" %}
+ <div class="tab-content">
+ <div class="tab-pane active" id="files">
+ {% if packageFileCount > 0 %}
+ {% include "tablesort.html" %}
+ <tbody>
+ {% for file in objects %}
+ <tr>
+ <td class="path">
+ <a href="{% url 'dirinfo_filepath' build.id target.id file.path %}">
+ {{file.path}}
+ </a>
+ </td>
+ <td class="filesize sizecol" >{{file.size|filtered_filesizeformat}}</td>
+ </tr>
+ {% endfor %}
+ </tbody>
+ </table>
+
+ {% else %}
+ <div class="alert alert-info">
+ <strong>{{package.fullpackagespec}}</strong> does not generate any files.
+ </div>
+ {% endif %}
+ </div> <!-- end tab-pane -->
+ </div> <!-- end tab content -->
+
+{% endwith %}
+{% endblock tabcontent %}
diff --git a/bitbake/lib/toaster/toastergui/templates/package_included_reverse_dependencies.html b/bitbake/lib/toaster/toastergui/templates/package_included_reverse_dependencies.html
new file mode 100644
index 0000000..5cc8b47
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/package_included_reverse_dependencies.html
@@ -0,0 +1,50 @@
+{% extends "package_detail_base.html" %}
+{% load projecttags %}
+
+{% block title %}
+ <h1>
+ {{package.fullpackagespec}}
+ <script> fmtAliasHelp("{{package.name}}", "{{package.alias}}", false) </script>
+ <small>({{target.target}})</small>
+ </h1>
+{% endblock title %}
+
+{% block tabcontent %}
+ {% with packageFileCount=package.buildfilelist_package.count %}
+ {% include "package_included_tabs.html" with active_tab="reverse" %}
+ <div class="tab-content">
+ <div class="tab-pane active" id="brought-in-by">
+
+ {% ifequal reverse_count 0 %}
+ <div class="alert alert-info">
+ <strong>{{package.fullpackagespec}}</strong> has no reverse runtime dependencies.
+ </div>
+ {% else %}
+ {% include "tablesort.html" %}
+ <tbody>
+ {% for reverse_dep in objects %}
+ <tr {% if reverse_dep.size %}{{reverse_dep.size|format_vpackage_rowclass}}{%endif%} >
+ {% if reverse_dep.size != -1 %}
+ <td>
+ <a href="{% url 'package_included_detail' build.id target.id reverse_dep.package_id %}">
+ {{reverse_dep.package.name}}
+ </a>
+ <script>fmtAliasHelp("{{reverse_dep.package.name}}", "{{reverse_dep.alias}}", true)</script>
+ </td>
+ {% else %}
+ <td>
+ {{reverse_dep.name|format_vpackage_namehelp}}
+ </td>
+ {% endif %}
+
+ <td>{{reverse_dep.package.version}} </td>
+ <td class='sizecol'>{{reverse_dep.package.size|filtered_filesizeformat}} </td>
+ </tr>
+ {% endfor %}
+ </tbody>
+ </table>
+ {% endifequal %}
+ </div> <!-- end tab-pane -->
+ </div> <!-- end tab content -->
+ {% endwith %}
+{% endblock tabcontent %}
diff --git a/bitbake/lib/toaster/toastergui/templates/package_included_tabs.html b/bitbake/lib/toaster/toastergui/templates/package_included_tabs.html
new file mode 100644
index 0000000..958aa88
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/package_included_tabs.html
@@ -0,0 +1,33 @@
+
+ <ul class="nav nav-pills">
+ {% if active_tab == "detail" %}
+ <li class="active">
+ {% else %}
+ <li class="">
+ {% endif %}
+ <a href="{% url 'package_included_detail' build.id target.id package.id %}">
+ <i class="icon-question-sign get-help" title="The files this package adds to the image root file system"></i>
+ Files in root file system ({{packageFileCount}})
+ </a>
+ </li>
+ {% if active_tab == "dependencies" %}
+ <li class="active">
+ {% else %}
+ <li class="">
+ {% endif %}
+ <a href="{% url 'package_included_dependencies' build.id target.id package.id %}">
+ <i class="icon-question-sign get-help" title="Package runtime dependencies"></i>
+ Runtime dependencies ({{dependency_count}})
+ </a>
+ </li>
+ {% if active_tab == "reverse" %}
+ <li class="active">
+ {% else %}
+ <li class="">
+ {% endif %}
+ <a href="{% url 'package_included_reverse_dependencies' build.id target.id package.id %}">
+ <i class="icon-question-sign get-help" title="The package runtime reverse dependencies (i.e. the packages in this image that depend on this package). Reverse dependencies reflect only the 'depends' dependency type"></i>
+ Reverse runtime dependencies ({{reverse_count}})
+ </a>
+ </li>
+ </ul>
diff --git a/bitbake/lib/toaster/toastergui/templates/project.html b/bitbake/lib/toaster/toastergui/templates/project.html
new file mode 100644
index 0000000..e8354fd
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/project.html
@@ -0,0 +1,130 @@
+{% extends "baseprojectpage.html" %}
+
+{% load projecttags %}
+{% load humanize %}
+{% load static %}
+
+{% block projectinfomain %}
+
+<script src="{% static 'js/layerDepsModal.js' %}"></script>
+<script src="{% static 'js/projectpage.js' %}"></script>
+<script>
+ $(document).ready(function (){
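+ // context handed to projectpage.js; currently just the XHR endpoint used to test a project release change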
+ var ctx = {
+ testReleaseChangeUrl: "{% url 'xhr_testreleasechange' project.id %}",
+ };
+
+ try {
+ projectPageInit(ctx);
+ } catch (e) {
+ document.write("Sorry, an error occurred while loading this page");
+ console.warn(e);
+ }
+ });
+</script>
+
+<div id="change-release-modal" class="modal hide fade in" tabindex="-1" role="dialog" aria-labelledby="change-release-modal" aria-hidden="false">
+ <div class="modal-header">
+ <button type="button" class="close" data-dismiss="modal" aria-hidden="true">x</button>
+ <h3>Changing Yocto Project release to <span class="proposed-release-change-name"></span></h3>
+ </div>
+ <div class="modal-body">
+ <p>The following added layers do not exist for <span class="proposed-release-change-name"></span>: </p>
+ <ul id="layers-to-remove-list">
+ </ul>
+ <p>If you change the Yocto Project release to <span class="proposed-release-change-name"></span>, the above layers will be deleted from your added layers.</p>
+ </div>
+ <div class="modal-footer">
+ <button id="change-release-and-rm-layers" data-dismiss="modal" type="submit" class="btn btn-primary">Change release and delete layers</button>
+ <button class="btn" data-dismiss="modal" aria-hidden="true">Cancel</button>
+ </div>
+</div>
+
+
+<div class="row-fluid" id="project-page" style="display:none">
+ <div class="span6">
+ <div class="well well-transparent" id="machine-section">
+ <h3>Machine</h3>
+
+ <p class="lead"><span id="project-machine-name"></span> <i title="" data-original-title="" id="change-machine-toggle" class="icon-pencil"></i></p>
+
+ <form id="select-machine-form" style="display:none;">
+ <div class="alert alert-info">
+ <strong>Machine changes have a big impact on build outcome.</strong> You cannot really compare the builds for the new machine with the previous ones.
+ </div>
+
+ <div class="input-append">
+ <input id="machine-change-input" autocomplete="off" value="" data-provide="typeahead" data-minlength="1" data-autocomplete="off" type="text">
+ <button id="machine-change-btn" class="btn" type="button">Save</button> <a href="#" id="cancel-machine-change" class="btn btn-link">Cancel</a>
+ </div>
+
+ <p><a href="{% url 'projectmachines' project.id %}" class="link">View compatible machines</a></p>
+ </form>
+ </div>
+
+ <div class="well well-transparent">
+ <h3>Most built recipes</h3>
+
+ <div class="alert alert-info" style="display:none" id="no-most-built">
+ <span class="lead">You haven't built any recipes yet</span>
+ <p style="margin-top: 10px;"><a href="{% url 'projecttargets' project.id %}">Choose a recipe to build</a></p>
+ </div>
+
+ <ul class="unstyled configuration-list" id="freq-build-list">
+ </ul>
+ <button class="btn btn-primary" id="freq-build-btn" disabled="disabled">Build selected recipes</button>
+ </div>
+
+ <div class="well well-transparent">
+ <h3>Project release</h3>
+
+ <p class="lead"><span id="project-release-title"></span> <i title="" data-original-title="" id="release-change-toggle" class="icon-pencil"></i></p>
+
+ <form class="form-inline" id="change-release-form" style="display:none;">
+ <select></select>
+ <button class="btn" style="margin-left:5px;" id="change-release-btn">Change</button> <a href="#" id="cancel-release-change" class="btn btn-link">Cancel</a>
+ </form>
+ </div>
+ </div>
+
+ <div class="span6">
+ <div class="well well-transparent" id="layer-container">
+ <h3>Layers <span class="muted counter">(<span id="project-layers-count"></span>)</span>
+ <i data-original-title="OpenEmbedded organises metadata into modules called 'layers'. Layers allow you to isolate different types of customizations from each other. <a href='http://www.yoctoproject.org/docs/current/dev-manual/dev-manual.html#understanding-and-creating-layers' target='_blank'>More on layers</a>" class="icon-question-sign get-help heading-help" title=""></i>
+ </h3>
+
+ <div class="alert lead" id="no-layers-in-project" style="display:none">
+ You need to add some layers. For that you can:
+ <ul>
+ <li><a href="{% url 'projectlayers' project.id %}">View all layers compatible with this project</a></li>
+ <li><a href="{% url 'importlayer' project.id %}">Import a layer</a></li>
+ <li><a href="http://www.yoctoproject.org/docs/current/dev-manual/dev-manual.html#understanding-and-creating-layers" target="_blank">Read about layers in the documentation</a></li>
+ </ul>
+ <p>Or type a layer name below.</p>
+ </div>
+
+ <form style="margin-top:20px">
+ <!--div class="control-group error"-->
+
+ <div class="input-append">
+ <input id="layer-add-input" autocomplete="off" placeholder="Type a layer name" data-minlength="1" data-autocomplete="off" data-provide="typeahead" data-source="" type="text">
+ <button id="add-layer-btn" class="btn" disabled>Add</button>
+ </div>
+
+ <div id="import-alert" class="alert alert-info" style="display:none;">
+ Toaster does not know about this layer. Please <a href="#">import it</a>
+ </div>
+
+ <p>
+ <a href="{% url 'projectlayers' project.id %}" id="view-compatible-layers">View compatible layers</a>
+ <i data-original-title="View all the layers you can build with the release selected for this project, which is Yocto Project master" class="icon-question-sign get-help" title=""></i>
+ | <a href="{% url 'importlayer' project.id %}">Import layer</a>
+ </p>
+ </form>
+
+ <ul class="unstyled configuration-list" id="layers-in-project-list">
+ </ul>
+ </div>
+ </div>
+</div>
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/projectbuilds.html b/bitbake/lib/toaster/toastergui/templates/projectbuilds.html
new file mode 100644
index 0000000..df809de
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/projectbuilds.html
@@ -0,0 +1,103 @@
+{% extends "baseprojectbuildspage.html" %}
+{% load projecttags %}
+{% load humanize %}
+
+
+{% block extraheadcontent %}
+<link rel="stylesheet" href="/static/css/jquery-ui.min.css" type='text/css'>
+<link rel="stylesheet" href="/static/css/jquery-ui.structure.min.css" type='text/css'>
+<link rel="stylesheet" href="/static/css/jquery-ui.theme.min.css" type='text/css'>
+<script src="/static/js/jquery-ui.min.js"></script>
+<script src="/static/js/filtersnippet.js"></script>
+{% endblock %}
+
+{% block projectinfomain %}
+
+<script>
+ // initialize the date range controls
+ $(document).ready(function () {
+ date_init('created','{{last_date_from}}','{{last_date_to}}','{{dateMin_started_on}}','{{dateMax_started_on}}','{{daterange_selected}}');
+ date_init('updated','{{last_date_from}}','{{last_date_to}}','{{dateMin_completed_on}}','{{dateMax_completed_on}}','{{daterange_selected}}');
+ });
+</script>
+
+ <h2>
+ {% if request.GET.filter and objects.paginator.count > 0 or request.GET.search and objects.paginator.count > 0 %}
+ {{objects.paginator.count}} build{{objects.paginator.count|pluralize}} found
+ {%elif request.GET.filter and objects.paginator.count == 0 or request.GET.search and objects.paginator.count == 0 %}
+ No builds found
+ {%else%}
+ Project builds
+ {%endif%}
+ <i class="icon-question-sign get-help heading-help" title="This page lists all the builds for the current project"></i>
+ </h2>
+
+
+ {% if objects.paginator.count == 0 %}
+ {% if request.GET.filter or request.GET.search %}
+ <div class="row-fluid">
+ <div class="alert">
+ <form class="no-results input-append" id="searchform">
+ <input id="search" name="search" class="input-xxlarge" type="text" value="{{request.GET.search}}"/>{% if request.GET.search %}<a href="javascript:$('#search').val('');searchform.submit()" class="add-on btn" tabindex="-1"><i class="icon-remove"></i></a>{% endif %}
+ <button class="btn" type="submit" value="Search">Search</button>
+ <button class="btn btn-link" onclick="javascript:$('#search').val('');searchform.submit()">Show all builds</button>
+ </form>
+ </div>
+ </div>
+ {% else %}
+ <div class="alert alert-info">
+ <p class="lead">
+ This project has no builds.
+ </p>
+ </div>
+ {% endif %}
+
+ {% else %}
+
+ {% include "basetable_top.html" %}
+ <!-- Table data rows; the row order needs to match the order of the "tablecols" definitions, and each <td> class value needs to match the corresponding tablecols "clclass" value for the show/hide buttons to work -->
+ {% for build in objects %} {# if we have a build, just display it #}
+ <tr class="data">
+ <td class="outcome"><a href="{% url "builddashboard" build.id %}">{%if build.outcome == build.SUCCEEDED%}<i class="icon-ok-sign success"></i>{%elif build.outcome == build.FAILED%}<i class="icon-minus-sign error"></i>{%else%}{%endif%}</a>
+ {% if build.project %}
+ <a href="{% url 'build_artifact' build.id "cookerlog" build.id %}">
+ <i class="icon-download-alt" title="" data-original-title="Download build log"></i>
+ </a>
+ {% endif %}
+ </td>
+
+ <td class="target">{% for t in build.target_set.all %} <a href="{% url "builddashboard" build.id %}"> {{t.target}} </a> <br />{% endfor %}</td>
+ <td class="machine"><a href="{% url "builddashboard" build.id %}">{{build.machine}}</a></td>
+ <td class="started_on"><a href="{% url "builddashboard" build.id %}">{{build.started_on|date:"d/m/y H:i"}}</a></td>
+ <td class="completed_on"><a href="{% url "builddashboard" build.id %}">{{build.completed_on|date:"d/m/y H:i"}}</a></td>
+ <td class="failed_tasks error">
+ {% query build.task_build outcome=4 order__gt=0 as exectask%}
+ {% if exectask.count == 1 %}
+ <a href="{% url "task" build.id exectask.0.id %}">{{exectask.0.recipe.name}}.{{exectask.0.task_name}}</a>
+ <a href="{% url 'build_artifact' build.id "tasklogfile" exectask.0.id %}">
+ <i class="icon-download-alt" title="" data-original-title="Download task log file"></i>
+ </a>
+ {% elif exectask.count > 1%}
+ <a href="{% url "tasks" build.id %}?filter=outcome%3A4">{{exectask.count}} task{{exectask.count|pluralize}}</a>
+ {%endif%}
+ </td>
+ <td class="errors.count">
+ {% if build.errors.count %}
+ <a class="errors.count error" href="{% url "builddashboard" build.id %}#errors">{{build.errors.count}} error{{build.errors.count|pluralize}}</a>
+ {%endif%}
+ </td>
+ <td class="warnings.count">{% if build.warnings.count %}<a class="warnings.count warning" href="{% url "builddashboard" build.id %}#warnings">{{build.warnings.count}} warning{{build.warnings.count|pluralize}}</a>{%endif%}</td>
+ <td class="time"><a href="{% url "buildtime" build.id %}">{{build.timespent_seconds|sectohms}}</a></td>
+ <td class="output">
+ {% if build.outcome == build.SUCCEEDED %}
+ <a href="{%url "builddashboard" build.id%}#images">{{fstypes|get_dict_value:build.id}}</a>
+ {% endif %}
+ </td>
+ </tr>
+ {% endfor %}
+
+
+ {% include "basetable_bottom.html" %}
+{% endif %}
+
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/projectconf.html b/bitbake/lib/toaster/toastergui/templates/projectconf.html
new file mode 100644
index 0000000..4c5a188
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/projectconf.html
@@ -0,0 +1,849 @@
+{% extends "baseprojectpage.html" %}
+{% load projecttags %}
+{% load humanize %}
+
+
+{% block projectinfomain %}
+
+<h2>Bitbake variables</h2>
+
+ <div style="padding-left:19px;">
+
+ <dl class="dl-vertical">
+ {% if distro_defined %}
+ <dt>
+ <span class="js-config-var-name js-config-var-managed-name">DISTRO</span>
+ <i class="icon-question-sign get-help" title="The short name of the distribution. If the variable is blank, meta/conf/distro/defaultsetup.conf will be used. <br /><a href='http://www.yoctoproject.org/docs/1.6.1/ref-manual/ref-manual.html#var-DISTRO' target='_blank'>Read more in the manual</a>"></i>
+ </dt>
+ <dd class="lead">
+ <span id="distro">{{distro}}</span>
+ <i class="icon-pencil" id="change-distro-icon"></i>
+ <form id="change-distro-form" style="display:none;">
+ <div class="input-append">
+ <span id="edit-distro-name-div" class="control-group">
+ <input type="text" id="new-distro" value="{{distro}}">
+ <button id="apply-change-distro" class="btn" type="button">Save</button>
+ <button id="cancel-change-distro" type="button" class="btn btn-link">Cancel</button>
+ </span>
+ <span class="help-block error" id="distro-error-message"></span>
+ </div>
+ </form>
+ </dd>
+ {% endif %}
+
+ {% if fstypes_defined %}
+ <dt>
+ <span class="js-config-var-name js-config-var-managed-name">IMAGE_FSTYPES</span>
+ <i class="icon-question-sign get-help" title="Formats of root file system images that you want to have created <br /><a href='http://www.yoctoproject.org/docs/1.6.1/ref-manual/ref-manual.html#var-IMAGE_FSTYPES' target='_blank'>Read more in the manual</a>"></i>
+ </dt>
+ <dd class="lead">
+ <span id="image_fstypes">{{fstypes}}</span>
+ <i class="icon-pencil" id="change-image_fstypes-icon"></i>
+ <form id="change-image_fstypes-form" style="display:none;">
+ <input id="filter-image_fstypes" type="text" placeholder="Search image types" class="span4">
+ <div id="all-image_fstypes" class="scrolling">
+ </div>
+ <button id="apply-change-image_fstypes" type="button" class="btn">Save</button>
+ <button id="cancel-change-image_fstypes" type="button" class="btn btn-link">Cancel</button>
+ </form>
+ </dd>
+ {% endif %}
+
+ {% if image_install_append_defined %}
+ <dt>
+ <span class="js-config-var-name js-config-var-managed-name">IMAGE_INSTALL_append</span>
+ <i class="icon-question-sign get-help" title="Specifies additional packages to install into an image. If your build creates more than one image, the packages will be installed in <strong>all of them</strong> <br /><a href='http://www.yoctoproject.org/docs/1.6.1/ref-manual/ref-manual.html#var-IMAGE_INSTALL' target='_blank'>Read more in the manual</a>"></i>
+ </dt>
+ <dd class="lead">
+ <span id="image_install"{% if image_install_append %}{%else%} class="muted"{%endif%}>{% if image_install_append %}{{image_install_append}}{%else%}Not set{%endif%}</span>
+ <i class="icon-pencil" id="change-image_install-icon"></i>
+ <i class="icon-trash" id="delete-image_install-icon" {% if image_install_append %}{%else%}style="display:none;"{%endif%}></i>
+ <form id="change-image_install-form" style="display:none;">
+ <div class="row-fluid">
+ <span class="help-block span4">To set IMAGE_INSTALL_append to more than one package, type the package names separated by a space.</span>
+ </div>
+ <div class="input-append">
+ <input type="text" class="input-xlarge" id="new-image_install" placeholder="Type one or more package names">
+ <button id="apply-change-image_install" class="btn" type="button">Save</button>
+ <button id="cancel-change-image_install" type="button" class="btn btn-link">Cancel</button>
+ </div>
+ </form>
+ </dd>
+ {% endif %}
+
+ {% if package_classes_defined %}
+ <dt>
+ <span class="js-config-var-name js-config-var-managed-name">PACKAGE_CLASSES</span>
+ <i class="icon-question-sign get-help" title="Specifies the package manager to use when packaging data <br /><a href='http://www.yoctoproject.org/docs/1.6.1/ref-manual/ref-manual.html#var-PACKAGE_CLASSES' target='_blank'>Read more in the manual</a>"></i>
+ </dt>
+ <dd class="lead">
+ <span id="package_classes">{{package_classes}}</span>
+ <i id="change-package_classes-icon" class="icon-pencil"></i>
+ <form id="change-package_classes-form" style="display:none;">
+ <label>
+ Root file system package format
+ <i class="icon-question-sign get-help" title="The package format used to generate the root file system. Options are <code>deb</code>, <code>ipk</code> and <code>rpm</code>"></i>
+ </label>
+ <select id="package_classes-select">
+ <option>package_deb</option>
+ <option>package_ipk</option>
+ <option>package_rpm</option>
+ </select>
+ <label>
+ Additional package formats
+ <i class="icon-question-sign get-help" title="Extra package formats to build"></i>
+ </label>
+ <label class="checkbox" id="package_class_1">
+ <input type="checkbox" id="package_class_1_input"> package_deb
+ </label>
+ <label class="checkbox" id="package_class_2">
+ <input type="checkbox" id="package_class_2_input"> package_ipk
+ </label>
+ <div style="padding-top:10px;">
+ <button id="apply-change-package_classes" type="button" class="btn">Save</button>
+ <button id="cancel-change-package_classes" type="button" class="btn btn-link">Cancel</button>
+ </div>
+ </form>
+ </dd>
+ {% endif %}
+
+ {% if sdk_machine_defined %}
+ <dt>
+ <span class="js-config-var-name js-config-var-managed-name">SDKMACHINE</span>
+ <i class="icon-question-sign get-help" title="Specifies the architecture (i.e. i686 or x86_64) for which to build SDK and ADT items <br /><a href='http://www.yoctoproject.org/docs/1.6.1/ref-manual/ref-manual.html#var-SDKMACHINE' target='_blank'>Read more in the manual</a>"></i>
+ </dt>
+ <dd class="lead">
+ <span id="sdkmachine">{{sdk_machine}}</span>
+ <i id="change-sdkmachine-icon" class="icon-pencil"></i>
+ <form id="change-sdkmachine-form" style="display:none;">
+ <label class="radio">
+ <input type="radio" name="sdkmachine" value="i686">
+ i686
+ </label>
+ <label class="radio">
+ <input type="radio" name="sdkmachine" value="x86_64">
+ x86_64
+ </label>
+ <div style="padding-top:10px;">
+ <button id="apply-change-sdkmachine" type="button" class="btn">Save</button>
+ <button id="cancel-change-sdkmachine" type="button" class="btn btn-link">Cancel</button>
+ </div>
+ </form>
+ </dd>
+ {% endif %}
+
+ </dl>
+
+ <!-- <ul class="unstyled configuration-list" id="configvar-list"> -->
+ <dl id="configvar-list">
+ <!-- the added configuration variables are inserted here -->
+ </dl>
+
+ <!-- pass the fstypes list, black list, and externally managed variables here -->
+ {% for fstype in vars_fstypes %}
+ <input type="hidden" class="js-checkbox-fstypes-list" value="{{fstype}}">
+ {% endfor %}
+ {% for b in vars_blacklist %}
+ <input type="hidden" class="js-config-blacklist-name" value="{{b}}">
+ {% endfor %}
+ {% for b in vars_managed %}
+ <input type="hidden" class="js-config-managed-name" value="{{b}}">
+ {% endfor %}
+
+ <div class="row-fluid">
+ <form id="variable-form">
+ <fieldset style="padding-left:0px;">
+ <legend>Add variable</legend>
+ <div class="span3" style="margin-left:0px;">
+ <span id="add-configvar-name-div" class="control-group">
+ <label>
+ Variable
+ <i title="" class="icon-question-sign get-help"
+ data-original-title="Variable names are case sensitive,
+ cannot have spaces, and can only include letters, numbers, underscores
+ and dashes"></i>
+ </label>
+ <input type="text" placeholder="Type variable name" id="variable">
+ <span class="help-block error" id="new-variable-error-message"></span>
+ </span>
+ <label>Value</label>
+ <input id="value" type="text" placeholder="Type variable value"><p>
+ <div>
+ <button id="add-configvar-button" class="btn save" type="button" disabled>Add variable</button>
+ </div>
+ </div>
+ <div class="span5 help-block">
+ <h5>Some variables cannot be set from Toaster</h5>
+ <p>Toaster cannot set any variables that impact 1) the configuration of the build servers,
+ or 2) where artifacts produced by the build are stored. Such variables include: </p>
+ <p>
+ <code><a href="http://www.yoctoproject.org/docs/1.6.1/ref-manual/ref-manual.html#var-BB_DISKMON_DIRS" target="_blank">BB_DISKMON_DIRS</a></code>
+ <code><a href="http://www.yoctoproject.org/docs/1.6.1/ref-manual/ref-manual.html#var-BB_NUMBER_THREADS" target="_blank">BB_NUMBER_THREADS</a></code>
+ <code>CVS_PROXY_HOST</code>
+ <code>CVS_PROXY_PORT</code>
+ <code><a href="http://www.yoctoproject.org/docs/1.6.1/ref-manual/ref-manual.html#var-DL_DIR" target="_blank">DL_DIR</a></code>
+ <code><a href="http://www.yoctoproject.org/docs/1.6.1/ref-manual/ref-manual.html#var-PARALLEL_MAKE" target="_blank">PARALLEL_MAKE</a></code>
+ <code><a href="http://www.yoctoproject.org/docs/1.6.1/ref-manual/ref-manual.html#var-SSTATE_DIR" target="_blank">SSTATE_DIR</a></code>
+ <code><a href="http://www.yoctoproject.org/docs/1.6.1/ref-manual/ref-manual.html#var-SSTATE_MIRRORS" target="_blank">SSTATE_MIRRORS</a></code>
+ <code><a href="http://www.yoctoproject.org/docs/1.6.1/ref-manual/ref-manual.html#var-TMPDIR" target="_blank">TMPDIR</a></code></p>
+ <p>Plus the following standard shell environment variables:</p>
+ <p><code>http_proxy</code> <code>ftp_proxy</code> <code>https_proxy</code> <code>all_proxy</code></p>
+ </div>
+ </fieldset>
+ </form>
+ </div>
+
+ </div>
+
+ <script>
+
+ // global variables
+ var do_reload=false;
+
+ // validate new variable name
+ function validate_new_variable() {
+ var variable = $("input#variable").val();
+ var value = $("input#value").val();
+
+ // presumed innocence
+ $('#new-variable-error-message').text("");
+ var error_msg = "";
+
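+ // a name is rejected if it duplicates a variable already shown on this page, or matches a blacklisted or Toaster-managed variable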
+ var existing_configvars = document.getElementsByClassName('js-config-var-name');
+ for (var i = 0, length = existing_configvars.length; i < length; i++) {
+ if (existing_configvars[i].innerHTML.toUpperCase() == variable.toUpperCase()) {
+ error_msg = "This variable is already set in this page, edit its value instead";
+ }
+ }
+
+ var blacklist_configvars = document.getElementsByClassName('js-config-blacklist-name');
+ for (var i = 0, length = blacklist_configvars.length; i < length; i++) {
+ if (blacklist_configvars[i].value.toUpperCase() == variable.toUpperCase()) {
+ error_msg = "You cannot edit this variable in Toaster because it is set by the build servers";
+ }
+ }
+
+ var managed_configvars = document.getElementsByClassName('js-config-managed-name');
+ for (var i = 0, length = managed_configvars.length; i < length; i++) {
+ if (managed_configvars[i].value.toUpperCase() == variable.toUpperCase()) {
+ error_msg = "You cannot set this variable here. Please set it in the <a href=\"{% url 'project' project.id %}\">project main page</a>";
+ }
+ }
+
+ var bad_chars = /[^a-zA-Z0-9\-_]/.test(variable);
+ var has_spaces = (0 <= variable.indexOf(" "));
+ var only_spaces = (0 < variable.length) && (0 == variable.trim().length);
+
+ if (only_spaces) {
+ error_msg = "A valid variable name cannot include spaces";
+ } else if (bad_chars && has_spaces) {
+ error_msg = "A valid variable name can only include letters, numbers, underscores and dashes, and cannot include spaces";
+ } else if (bad_chars) {
+ error_msg = "A valid variable name can only include letters, numbers, underscores, and dashes";
+ }
+
+ if ("" != error_msg) {
+ $('#new-variable-error-message').html(error_msg);
+ $(".save").attr("disabled","disabled");
+
+ // add one (and only one) error class append
+ var d = document.getElementById("add-configvar-name-div");
+ d.className = d.className.replace(" error","");
+ d.className = d.className + " error";
+
+ return false;
+ } else if (0 == variable.length) {
+ $(".save").attr("disabled","disabled");
+ return false;
+ }
+
+ var d = document.getElementById("add-configvar-name-div");
+ d.className = d.className.replace(" error","");
+
+ // now set the "Save" enablement if 'value' also passes
+ if (value.trim().length > 0) {
+ $(".save").removeAttr("disabled");
+ } else {
+ $(".save").attr("disabled","disabled");
+ }
+
+ return true;
+ }
+
+ // validate distro name
+ function validate_distro_name() {
+ var value = $("input#new-distro").val();
+
+ // presumed innocence
+ $('#distro-error-message').text("");
+ var error_msg = "";
+
+ var has_spaces = (0 <= value.indexOf(" "));
+
+ if (has_spaces) {
+ error_msg = "A valid distro name cannot include spaces";
+ } else if (0 == value.length) {
+ error_msg = " ";
+ }
+
+ if ("" != error_msg) {
+ $('#distro-error-message').text(error_msg);
+ $("#apply-change-distro").attr("disabled","disabled");
+
+ // add one (and only one) error class append
+ var d = document.getElementById("edit-distro-name-div");
+ d.className = d.className.replace(" error","");
+ d.className = d.className + " error";
+
+ return false;
+ }
+
+ var d = document.getElementById("edit-distro-name-div");
+ d.className = d.className.replace(" error","");
+ $("#apply-change-distro").removeAttr("disabled");
+ return true;
+ }
+
+ // Ensure that at least one FS type is checked
+ function enableFsTypesSave() {
+ var any_checked = 0;
+ $(".fs-checkbox-fstypes:checked").each(function(){
+ any_checked = 1;
+ });
+ if ( 0 == any_checked ) {
+ $("#apply-change-image_fstypes").attr("disabled","disabled");
+ }
+ else {
+ $("#apply-change-image_fstypes").removeAttr("disabled");
+ }
+ }
+
+ // Preset or reset the Package Class checkbox labels
+ function updatePackageClassCheckboxes() {
+ $('#package_class_1, #package_class_2').hide();
+ if ($('select').val() == 'package_deb') {
+ $('#package_class_1').html('<input type="checkbox" id="package_class_1_input"> package_ipk');
+ $('#package_class_2').html('<input type="checkbox" id="package_class_2_input"> package_rpm');
+ }
+ if ($('select').val() == 'package_ipk') {
+ $('#package_class_1').html('<input type="checkbox" id="package_class_1_input"> package_deb');
+ $('#package_class_2').html('<input type="checkbox" id="package_class_2_input"> package_rpm');
+ }
+ if ($('select').val() == 'package_rpm') {
+ $('#package_class_1').html('<input type="checkbox" id="package_class_1_input"> package_deb');
+ $('#package_class_2').html('<input type="checkbox" id="package_class_2_input"> package_ipk');
+ }
+ $('#package_class_1, #package_class_2').fadeIn(1500);
+ }
+
+ // Re-assert handlers when the page is served and/or refreshed via Ajax
+ function setEventHandlersForDynamicElements() {
+
+ // change variable value
+ $('.js-icon-pencil-config_var').click(function (evt) {
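+ // the pencil and trash icons carry the variable's database primary key in their x-data attribute, which is used to address the per-variable DOM elements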
+ var pk = evt.target.attributes["x-data"].value;
+ var current_val = $("span#config_var_value_"+pk).text();
+ $('.js-icon-pencil-config_var, .js-icon-trash-config_var, #config_var_value_'+pk).hide();
+ $("#change-config_var-form_"+pk).slideDown();
+ $("input#new-config_var_"+pk).val(current_val);
+ });
+
+ $('.js-cancel-change-config_var').click(function (evt) {
+ var pk = evt.target.attributes["x-data"].value;
+ $("#change-config_var-form_"+pk).slideUp(function() {
+ $('.js-icon-pencil-config_var, .js-icon-trash-config_var, #config_var_value_'+pk).show();
+ });
+ });
+
+ $(".js-new-config_var").on('input', function(){
+ if ($(this).val().length == 0) {
+ $(".js-apply-change-config_var").attr("disabled","disabled");
+ }
+ else {
+ $(".js-apply-change-config_var").removeAttr("disabled");
+ }
+ });
+
+ $('.js-apply-change-config_var').click(function (evt) {
+ var xdata = evt.target.attributes["x-data"].value.split(":");
+ var pk = xdata[0];
+ var variable = xdata[1];
+ var val = $('#new-config_var_'+pk).val();
+ postEditAjaxRequest({"configvarChange" : variable+':'+val});
+ $('#config_var_value_'+pk).parent().removeClass('muted');
+ $("#change-config_var-form_"+pk).slideUp(function() {
+ $('.js-icon-pencil-config_var, .js-icon-trash-config_var, #config_var_value_'+pk).show();
+ });
+ });
+
+ // delete variable
+ $(".js-icon-trash-config_var").click(function (evt) {
+ var xdata = evt.target.attributes["x-data"].value.split(":");
+ var pk = xdata[0];
+
+ // hide the dangling trash tooltip
+ $('#config_var_trash_'+pk).hide();
+
+ // fade out the variable+value div, then refresh the variable list
+ $('#config_var_entry_'+pk).parent().parent().fadeOut(1000, function(){
+ postEditAjaxRequest({"configvarDel": evt.target.attributes["x-data"].value});
+ });
+
+ });
+
+ }
+
+ function onEditPageUpdate(data) {
+ // rebuild the list of user-defined configuration variables
+ var i; var orightml = "";
+
+ var configvars_sorted = data.configvars.sort(function(a, b){ return (a[0] > b[0]) ? 1 : ((a[0] < b[0]) ? -1 : 0); });
+
+ var managed_configvars = document.getElementsByClassName('js-config-var-managed-name');
+
+ for (i = 0; i < configvars_sorted.length; i++) {
+ // skip if the variable name has a special context (not user defined)
+ var var_context=undefined;
+ for (var j = 0, length = managed_configvars.length; j < length; j++) {
+ if ((managed_configvars[j].innerHTML == configvars_sorted[i][0]) ||
+ (managed_configvars[j].value == configvars_sorted[i][0]) ) {
+ var_context='m';
+ }
+ }
+ if (var_context == undefined) {
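+ // build the HTML skeleton for this user-defined variable; the actual name and value text is filled in afterwards with .text() so it is never interpreted as HTML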
+ orightml += '<div> <dt><span id="config_var_entry_'+configvars_sorted[i][2]+'" class="js-config-var-name"></span><i class="icon-trash js-icon-trash-config_var" id="config_var_trash_'+configvars_sorted[i][2]+'" x-data="'+configvars_sorted[i][2]+'"></i> </dt>'
+ orightml += '<dd class="lead">'
+ orightml += ' <span id="config_var_value_'+configvars_sorted[i][2]+'"></span>'
+ orightml += ' <i class="icon-pencil js-icon-pencil-config_var" x-data="'+configvars_sorted[i][2]+'"></i>'
+ orightml += ' <form id="change-config_var-form_'+configvars_sorted[i][2]+'" style="display:none;">'
+ orightml += ' <div class="input-append">'
+ orightml += ' <input type="text" class="input-xlarge js-new-config_var" id="new-config_var_'+configvars_sorted[i][2]+'" value="">'
+ orightml += ' <button class="btn js-apply-change-config_var" type="button" x-data="'+configvars_sorted[i][2]+':'+configvars_sorted[i][0]+'" disabled>Save</button>'
+ orightml += ' <button type="button" class="btn btn-link js-cancel-change-config_var" x-data="'+configvars_sorted[i][2]+'">Cancel</button>'
+ orightml += ' </div>'
+ orightml += ' </form>'
+ orightml += '</dd> </div>'
+ }
+ }
+
+ // update configvars list HTML framework
+ $("dl#configvar-list").html(orightml);
+
+ // insert the name/value pairs safely as non-HTML
+ for (i = 0; i < configvars_sorted.length; i++) {
+ $('#config_var_entry_'+configvars_sorted[i][2]).text(configvars_sorted[i][0]);
+ $('#config_var_value_'+configvars_sorted[i][2]).text(configvars_sorted[i][1]);
+ }
+
+ // Add the tooltips
+ $(".js-icon-trash-config_var").each( function(){ setDeleteTooltip($(this)); });
+ $(".js-icon-pencil-config_var").each(function(){ setChangeTooltip($(this)); });
+
+ // re-assert these event handlers
+ setEventHandlersForDynamicElements();
+ }
+
+ function onEditAjaxSuccess(data, textstatus) {
+ // console.log("XHR returned:", data, "(" + textstatus + ")");
+ if (data.error != "ok") {
+ alert("error on request:\n" + data.error);
+ return;
+ }
+
+ // delayed page reload?
+ if (do_reload) {
+ do_reload=false;
+ location.reload(true);
+ } else {
+ onEditPageUpdate(data);
+ }
+ }
+
+ function onEditAjaxError(jqXHR, textstatus, error) {
+ alert("XHR errored:\n" + error + "\n(" + textstatus + ")");
+ // re-assert the event handlers
+ }
+
+ /* ensure cookie exists {% csrf_token %} */
+ function postEditAjaxRequest(reqdata) {
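+ // POST the requested change to the project's configuration-variable XHR endpoint, sending the CSRF token from the cookie in the X-CSRFToken header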
+ var ajax = $.ajax({
+ type:"POST",
+ data: $.param(reqdata),
+ url:"{% url 'xhr_configvaredit' project.id%}",
+ headers: { 'X-CSRFToken': $.cookie("csrftoken")},
+ success: onEditAjaxSuccess,
+ error: onEditAjaxError,
+ })
+ }
+
+ function setDeleteTooltip(object) {
+ object.tooltip({ container: 'body', html: true, delay: {show: 400}, title: "Delete" });
+ }
+ function setChangeTooltip(object) {
+ object.tooltip({ container: 'body', html: true, delay: {show: 400}, title: "Change" });
+ }
+
+ $(document).ready(function() {
+
+ //
+ // Register handlers for static elements
+ //
+
+ {% if distro_defined %}
+ // change distro variable
+ $('#change-distro-icon').click(function() {
+ $('#change-distro-icon, #distro').hide();
+ $("#change-distro-form").slideDown();
+ $("#new-distro").val( $('#distro').text() );
+ });
+
+ $('#cancel-change-distro').click(function(){
+ $("#change-distro-form").slideUp(function() {
+ $('#distro, #change-distro-icon').show();
+
+ // reset any dangling error state
+ $('#distro-error-message').text("");
+ var d = document.getElementById("edit-distro-name-div");
+ d.className = d.className.replace(" error","");
+ });
+ });
+
+ // validate new distro name
+ $("input#new-distro").on('input', function (evt) {
+ validate_distro_name();
+ });
+
+ $('#apply-change-distro').click(function(){
+ //$('#repo').parent().removeClass('highlight-go');
+ var name = $('#new-distro').val();
+ postEditAjaxRequest({"configvarChange" : 'DISTRO:'+name});
+ $('#distro').text(name);
+ $("#change-distro-form").slideUp(function () {
+ $('#distro, #change-distro-icon').show();
+ });
+ });
+ {% endif %}
+
+
+ {% if fstypes_defined %}
+ // change IMAGE_FSTYPES variable
+
+ $('#change-image_fstypes-icon').click(function() {
+ $('#change-image_fstypes-icon, #image_fstypes').hide();
+ $("#change-image_fstypes-form").slideDown();
+ // avoid false substring matches by including space separators
+ var html = "";
+ var fstypes = " " + document.getElementById("image_fstypes").innerHTML + " ";
+ var fstypes_list = document.getElementsByClassName('js-checkbox-fstypes-list');
+ // Add the checked boxes first
+ if (" " != fstypes) {
+ for (var i = 0, length = fstypes_list.length; i < length; i++) {
+ if (0 <= fstypes.indexOf(" "+fstypes_list[i].value+" ")) {
+ html += '<label class="checkbox"><input type="checkbox" class="fs-checkbox-fstypes" value="'+fstypes_list[i].value+'" checked="checked">'+fstypes_list[i].value+'</label>\n';
+ }
+ }
+ }
+ // Add the un-checked boxes second
+ for (var i = 0, length = fstypes_list.length; i < length; i++) {
+ if (0 > fstypes.indexOf(" "+fstypes_list[i].value+" ")) {
+ html += '<label class="checkbox"><input type="checkbox" class="fs-checkbox-fstypes" value="'+fstypes_list[i].value+'">'+fstypes_list[i].value+'</label>\n';
+ }
+ }
+ document.getElementById("all-image_fstypes").innerHTML = html;
+
+ // Watch elements to disable Save when none are checked
+ $(".fs-checkbox-fstypes").each(function(){
+ $(this).click(function() {
+ enableFsTypesSave();
+ });
+ });
+
+ // clear the previous filter values
+ $("input#filter-image_fstypes").val("");
+ });
+
+ $('#cancel-change-image_fstypes').click(function(){
+ $("#change-image_fstypes-form").slideUp(function() {
+ $('#image_fstypes, #change-image_fstypes-icon').show();
+ });
+ });
+
+ $('#filter-image_fstypes').on('input', function(){
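+ // show only the checkbox labels whose text contains the typed filter string (case-insensitive)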
+ var valThis = $(this).val().toLowerCase();
+ $('#all-image_fstypes label').each(function(){
+ var text = $(this).text().toLowerCase();
+ var match = text.indexOf(valThis);
+ if (match >= 0) {
+ $(this).show();
+ }
+ else {
+ $(this).hide();
+ }
+ });
+ });
+
+ $('#apply-change-image_fstypes').click(function(){
+ // extract the selected fstypes and sort them
+ var fstypes_array = [];
+ var checkboxes = document.getElementsByClassName('fs-checkbox-fstypes');
+ $(".fs-checkbox-fstypes:checked").each(function(){
+ fstypes_array.push($(this).val());
+ });
+ fstypes_array.sort();
+
+ // now make a string of them
+ var fstypes = '';
+ for (var i = 0, length = fstypes_array.length; i < length; i++) {
+ fstypes += fstypes_array[i] + ' ';
+ }
+ fstypes = fstypes.trim();
+
+ postEditAjaxRequest({"configvarChange" : 'IMAGE_FSTYPES:'+fstypes});
+ $('#image_fstypes').text(fstypes);
+ $('#image_fstypes').parent().removeClass('muted');
+
+ $("#change-image_fstypes-form").slideUp(function() {
+ $('#image_fstypes, #change-image_fstypes-icon').show();
+ });
+ });
+ {% endif %}
+
+
+ {% if image_install_append_defined %}
+
+ // init IMAGE_INSTALL_append trash icon
+ setDeleteTooltip($('#delete-image_install-icon'));
+
+ // change IMAGE_INSTALL_append variable
+ $('#change-image_install-icon').click(function() {
+ // preset the edit value
+ var current_val = $("span#image_install").text().trim();
+ if (current_val == "Not set") {
+ current_val="";
+ $("#apply-change-image_install").attr("disabled","disabled");
+ } else {
+ // ensure these non-empty values have a single space prefix
+ current_val=" " + current_val;
+ }
+ $("input#new-image_install").val(current_val);
+
+ $('#change-image_install-icon, #delete-image_install-icon, #image_install').hide();
+ $("#change-image_install-form").slideDown();
+ });
+
+ $('#cancel-change-image_install').click(function(){
+ $("#change-image_install-form").slideUp(function() {
+ $('#image_install, #change-image_install-icon').show();
+ if ($("span#image_install").text() != "Not set") {
+ $('#delete-image_install-icon').show();
+ setDeleteTooltip($('#delete-image_install-icon'));
+ }
+ });
+ });
+
+ $("#new-image_install").on('input', function(){
+ if ($(this).val().trim().length == 0) {
+ $("#apply-change-image_install").attr("disabled","disabled");
+ }
+ else {
+ $("#apply-change-image_install").removeAttr("disabled");
+ }
+ });
+
+ $('#apply-change-image_install').click(function(){
+ // ensure these non-empty values have a single space prefix
+ var value = " " + $('#new-image_install').val().trim();
+ postEditAjaxRequest({"configvarChange" : 'IMAGE_INSTALL_append:'+value});
+ $('#image_install').text(value);
+ $('#image_install').removeClass('muted');
+ $("#change-image_install-form").slideUp(function () {
+ $('#image_install, #change-image_install-icon').show();
+ if (value.trim().length > 0) {
+ $('#delete-image_install-icon').show();
+ setDeleteTooltip($('#delete-image_install-icon'));
+ }
+ });
+ });
+
+ // delete IMAGE_INSTALL_append variable value
+ $('#delete-image_install-icon').click(function(){
+ $(this).tooltip('hide');
+ postEditAjaxRequest({"configvarChange" : 'IMAGE_INSTALL_append:'+''});
+ $('#image_install').parent().fadeOut(1000, function(){
+ $('#image_install').addClass('muted');
+ $('#image_install').text('Not set');
+ $('#delete-image_install-icon').hide();
+ $('#image_install').parent().fadeIn(1000);
+ });
+ });
+ {% endif %}
+
+
+ {% if package_classes_defined %}
+ // change PACKAGE_CLASSES variable
+ $('#change-package_classes-icon').click(function() {
+ $('#change-package_classes-icon, #package_classes').hide();
+ $("#change-package_classes-form").slideDown();
+
+ // initialize the pulldown and checkboxes
+ var value = $("#package_classes").text();
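+ // the first package class in the current value selects the pulldown entry; any additional classes pre-check the matching "additional package formats" checkboxes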
+ if ( value.indexOf("package_deb") == 0 ) {
+ $("#package_classes-select").prop('selectedIndex', 0);
+ updatePackageClassCheckboxes();
+ if ( value.indexOf("_ipk") > 0 ) {
+ $("#package_class_1_input").attr("checked",true);
+ }
+ if ( value.indexOf("_rpm") > 0 ) {
+ $("#package_class_2_input").attr("checked",true);
+ }
+ }
+
+ if ( value.indexOf("package_ipk") == 0 ) {
+ $("#package_classes-select").prop('selectedIndex', 1);
+ updatePackageClassCheckboxes();
+ if ( value.indexOf("_deb") > 0 ) {
+ $("#package_class_1_input").attr("checked",true);
+ }
+ if ( value.indexOf("_rpm") > 0 ) {
+ $("#package_class_2_input").attr("checked",true);
+ }
+ }
+
+ if ( value.indexOf("package_rpm") == 0 ) {
+ $("#package_classes-select").prop('selectedIndex', 2);
+ updatePackageClassCheckboxes();
+ if ( value.indexOf("_deb") > 0 ) {
+ $("#package_class_1_input").attr("checked",true);
+ }
+ if ( value.indexOf("_ipk") > 0 ) {
+ $("#package_class_2_input").attr("checked",true);
+ }
+ }
+ });
+
+ $('#cancel-change-package_classes').click(function(){
+ $("#change-package_classes-form").slideUp(function() {
+ $('#package_classes, #change-package_classes-icon').show();
+ });
+ });
+
+ $('select').change(function() {
+ updatePackageClassCheckboxes();
+ });
+
+ $('#apply-change-package_classes').click(function(){
+ var e = document.getElementById("package_classes-select");
+ var val = e.options[e.selectedIndex].text;
+
+ var pc1_checked = document.getElementById("package_class_1_input").checked;
+ var pc2_checked = document.getElementById("package_class_2_input").checked;
+ if (val == "package_deb") {
+ if (pc1_checked) val = val + " package_ipk";
+ if (pc2_checked) val = val + " package_rpm";
+ }
+ if (val == "package_ipk") {
+ if (pc1_checked) val = val + " package_deb";
+ if (pc2_checked) val = val + " package_rpm";
+ }
+ if (val == "package_rpm") {
+ if (pc1_checked) val = val + " package_deb";
+ if (pc2_checked) val = val + " package_ipk";
+ }
+
+ $('#package_classes').text(val);
+ //$('#package_classes').parent().removeClass('muted');
+ postEditAjaxRequest({"configvarChange" : 'PACKAGE_CLASSES:'+val});
+ $("#change-package_classes-form").slideUp(function() {
+ $('#package_classes, #change-package_classes-icon').show();
+ });
+ });
+ {% endif %}
+
+
+ {% if sdk_machine_defined %}
+ // change SDKMACHINE variable
+ $('#change-sdkmachine-icon').click(function() {
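+ // pre-select the radio button that matches the current SDKMACHINE value before showing the form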
+ var current_value = document.getElementById("sdkmachine").innerHTML;
+ var radios = document.getElementsByName('sdkmachine');
+ for (var i = 0, length = radios.length; i < length; i++) {
+ radios[i].checked = false;
+ if (radios[i].value == current_value) {
+ radios[i].checked = true;
+ }
+ }
+ $('#change-sdkmachine-icon, #sdkmachine').hide();
+ $("#change-sdkmachine-form").slideDown();
+ });
+
+ $('#cancel-change-sdkmachine').click(function(){
+ $("#change-sdkmachine-form").slideUp(function() {
+ $('#sdkmachine, #change-sdkmachine-icon').show();
+ });
+ });
+
+ $('#apply-change-sdkmachine').click(function(){
+ var value="";
+ var radios = document.getElementsByName('sdkmachine');
+ for (var i = 0, length = radios.length; i < length; i++) {
+ if (radios[i].checked) {
+ // remember the value of the selected radio button
+ value=radios[i].value;
+ break;
+ }
+ }
+ postEditAjaxRequest({"configvarChange" : 'SDKMACHINE:'+value});
+ $('#sdkmachine').text(value);
+ $("#change-sdkmachine-form").slideUp(function() {
+ $('#sdkmachine, #change-sdkmachine-icon').show();
+ });
+
+ });
+ {% endif %}
+
+
+ // add new variable
+ $("button#add-configvar-button").click( function (evt) {
+ var variable = $("input#variable").val();
+ var value = $("input#value").val();
+
+ postEditAjaxRequest({"configvarAdd" : variable+':'+value});
+
+ // clear the previous values
+ $("input#variable").val("");
+ $("input#value").val("");
+ // Disable add button
+ $(".save").attr("disabled","disabled");
+
+ // Reload page if admin-removed core managed value is manually added back in
+ if (0 <= " DISTRO IMAGE_FSTYPES IMAGE_INSTALL_append PACKAGE_CLASSES SDKMACHINE ".indexOf( " "+variable+" " )) {
+ // delayed reload to avoid race condition with postEditAjaxRequest
+ do_reload=true;
+ }
+ });
+
+ // validate new variable name and value
+ $("#variable, #value").on('input', function() {
+ validate_new_variable();
+ });
+
+ //
+ // draw and register the dynamic configuration variables and handlers
+ //
+
+ var data = {
+ configvars : [],
+ vars_context : {}
+ };
+ {% for c in configvars %}
+ data.configvars.push([ "{{c.name}}","{{c.value}}","{{c.pk}}" ]);
+ {% if '' != vars_context|get_dict_value:c.name %}
+ data.vars_context[ "{{c.name}}" ] = "{{vars_context|get_dict_value:c.name }}";
+ {% endif %}
+ {% endfor %}
+
+ // draw these elements and assert their event handlers
+ onEditPageUpdate(data);
+ });
+
+ </script>
+
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/projects.html b/bitbake/lib/toaster/toastergui/templates/projects.html
new file mode 100644
index 0000000..c2d77b5
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/projects.html
@@ -0,0 +1,77 @@
+{% extends "base.html" %}
+
+{% load static %}
+{% load projecttags %}
+{% load humanize %}
+
+{% block pagecontent %}
+
+
+ <div class="page-header top-air">
+ <h1>
+ {% if request.GET.filter and objects.paginator.count > 0 or request.GET.search and objects.paginator.count > 0 %}
+ {{objects.paginator.count}} project{{objects.paginator.count|pluralize}} found
+ {%elif request.GET.filter and objects.paginator.count == 0 or request.GET.search and objects.paginator.count == 0 %}
+ No projects found
+ {%else%}
+ All projects
+ {%endif%}
+ </h1>
+ </div>
+
+ {% if objects.paginator.count == 0 %}
+ <div class="row-fluid">
+ <div class="alert">
+ <form class="no-results input-append" id="searchform">
+ <input id="search" name="search" class="input-xxlarge" type="text" value="{% if request.GET.search %}{{request.GET.search}}{% endif %}"/>{% if request.GET.search %}<a href="javascript:$('#search').val('');searchform.submit()" class="add-on btn" tabindex="-1"><i class="icon-remove"></i></a>{% endif %}
+ <button class="btn" type="submit" value="Search">Search</button>
+ <button class="btn btn-link" onclick="javascript:$('#search').val('');searchform.submit()">Show all projects</button>
+ </form>
+ </div>
+ </div>
+
+ {% else %} {# We have builds to display #}
+ {% include "basetable_top.html" %}
+ {% for o in objects %}
+ <tr class="data">
+ <td><a href="{% url 'project' o.id %}">{{o.name}}</a></td>
+ <td class="updated"><a href="{% url 'project' o.id %}">{{o.updated|date:"d/m/y H:i"}}</a></td>
+ <td>
+ {% if o.release %}
+ <a href="{% url 'project' o.id %}#project-details">{{o.release.name}}</a>
+ {% else %}
+ No release available
+ {% endif %}
+ </td>
+ <td><a href="{% url 'project' o.id %}#machine-distro">{{o.get_current_machine_name}}</a></td>
+ {% if o.get_number_of_builds == 0 %}
+ <td class="muted">{{o.get_number_of_builds}}</td>
+ <td class="loutcome"></td>
+ <td class="ltarget"></td>
+ <td class="lerrors"></td>
+ <td class="lwarnings"></td>
+ <td class="limagefiles"></td>
+ {% else %}
+ <td><a href="{% url 'projectbuilds' o.id %}">{{o.get_number_of_builds}}</a></td>
+ <td class="loutcome"><a href="{% url "builddashboard" o.get_last_build_id %}">{%if o.get_last_outcome == build_SUCCEEDED%}<i class="icon-ok-sign success"></i>{%elif o.get_last_outcome == build_FAILED%}<i class="icon-minus-sign error"></i>{%else%}{%endif%}</a></td>
+ <td class="ltarget"><a href="{% url "builddashboard" o.get_last_build_id %}">{{o.get_last_target}} </a></td>
+ <td class="lerrors">{% if o.get_last_errors %}<a class="errors.count error" href="{% url "builddashboard" o.get_last_build_id %}#errors">{{o.get_last_errors}} error{{o.get_last_errors|pluralize}}</a>{%endif%}</td>
+ <td class="lwarnings">{% if o.get_last_warnings %}<a class="warnings.count warning" href="{% url "builddashboard" o.get_last_build_id %}#warnings">{{o.get_last_warnings}} warning{{o.get_last_warnings|pluralize}}</a>{%endif%}</td>
+ <td class="limagefiles">
+ {% if o.get_last_outcome == build_SUCCEEDED %}
+ <a href="{%url "builddashboard" o.get_last_build_id %}#images">{{fstypes|get_dict_value:o.id}}</a>
+ {% endif %}
+ </td>
+
+ {% endif %}
+ </tr>
+ {% endfor %}
+ {% include "basetable_bottom.html" %}
+ {% endif %} {# empty #}
+
+{% endblock %}
+
+
diff --git a/bitbake/lib/toaster/toastergui/templates/projecttopbar.html b/bitbake/lib/toaster/toastergui/templates/projecttopbar.html
new file mode 100644
index 0000000..ca2741d
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/projecttopbar.html
@@ -0,0 +1,47 @@
+<div class="alert alert-success lead" id="project-created-notification" style="margin-top:15px; display:none">
+ <button type="button" class="close" data-dismiss="alert">×</button>
+ Your project <strong>{{project.name}}</strong> has been created. You can now <a href="{% url 'projectmachines' project.id %}">select your target machine</a> and <a href="{% url 'projecttargets' project.id %}">choose image recipes</a> to build.
+</div>
+
+<!-- project name -->
+<div class="page-header">
+ <h1><span id="project-name">{{project.name}}</span>
+ <i class="icon-pencil" data-original-title="" id="project-change-form-toggle" title=""></i>
+ </h1>
+ <form id="project-name-change-form" style="margin-bottom: 0px; display: none;">
+ <div class="input-append">
+ <input class="huge input-xxlarge" type="text" id="project-name-change-input" autocomplete="off" value="{{project.name}}">
+ <button id="project-name-change-btn" class="btn btn-large" type="button">Save</button>
+ <a href="#" id="project-name-change-cancel" class="btn btn-large btn-link">Cancel</a>
+ </div>
+ </form>
+</div>
+
+<div id="project-topbar">
+ <ul class="nav nav-pills">
+ <li>
+ <a href="{% url 'projectbuilds' project.id %}">
+ Builds (<span class="total-builds">0</span>)
+ </a>
+ </li>
+ <li id="topbar-configuration-tab">
+ <a href="{% url 'project' project.id %}">
+ Configuration
+ </a>
+ </li>
+ <li>
+ <a href="{% url 'importlayer' project.id %}">
+ Import layer
+ </a>
+ </li>
+ <li class="pull-right">
+ <form class="form-inline" style="margin-bottom:0px;">
+ <i class="icon-question-sign get-help heading-help" data-placement="left" title="" data-original-title="Type the name of one or more recipes you want to build, separated by a space. You can also specify a task by appending a semicolon and a task name to the recipe name, like so: <code>busybox:clean</code>"></i>
+ <div class="input-append">
+ <input id="build-input" type="text" class="input-xlarge input-lg build-target-input" placeholder="Type the recipe you want to build" autocomplete="off" disabled>
+ <button id="build-button" class="btn btn-primary btn-large build-button" data-project-id="{{project.id}}" disabled>Build</button>
+ </div>
+ </form>
+ </li>
+ </ul>
+</div>
diff --git a/bitbake/lib/toaster/toastergui/templates/recipe.html b/bitbake/lib/toaster/toastergui/templates/recipe.html
new file mode 100644
index 0000000..b5e4192
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/recipe.html
@@ -0,0 +1,242 @@
+{% extends "basebuilddetailpage.html" %}
+
+{% load projecttags %}
+
+{% block localbreadcrumb %}
+<li><a href="{% url 'recipes' build.pk %}">Recipes</a></li>
+<li>{{object.name}}_{{object.version}} </li>
+{% endblock %}
+
+{% block pagedetailinfomain %}
+
+<!-- Begin container -->
+
+<div class="row span11">
+ <div class="page-header">
+ <h1>{{object.name}}_{{object.version}}</h1>
+ </div>
+</div>
+
+<div class="row span7 tabbable">
+ <ul class="nav nav-pills">
+ <li class="{{tab_states.1}}">
+ <a href="#information" data-toggle="tab">
+ <i class="icon-question-sign get-help" title="Build-related information about the recipe"></i>
+ Recipe details
+ </a>
+ </li>
+ <li>
+ <a href="{% url "recipe_packages" build.pk object.id %}">
+ <i class="icon-question-sign get-help" title="The packaged output resulting from building the recipe"></i>
+ Packages ({{package_count}})
+ </a>
+ </li>
+ <li class="{{tab_states.3}}">
+ <a href="#dependencies" data-toggle="tab">
+ <i class="icon-question-sign get-help" title="The recipe build-time dependencies (i.e. other recipes)"></i>
+ Build dependencies ({{object.r_dependencies_recipe.all.count}})
+ </a>
+ </li>
+ <li class="{{tab_states.4}}">
+ <a href="#brought-in-by" data-toggle="tab">
+ <i class="icon-question-sign get-help" title="The recipe build-time reverse dependencies (i.e. the recipes that depend on this recipe)"></i>
+ Reverse build dependencies ({{object.r_dependencies_depends.all.count}})
+ </a>
+ </li>
+ </ul>
+ <div class="tab-content">
+ <div class="tab-pane {{tab_states.1}}" id="information">
+ <dl class="dl-horizontal">
+ <dt>
+ <i class="icon-question-sign get-help" title="The name of the layer providing the recipe"></i>
+ Layer
+ </dt>
+ <dd>{{layer.name}}</dd>
+
+ <dt>
+ <i class="icon-question-sign get-help" title="Path to the recipe .bb file"></i>
+ Recipe file
+ </dt>
+ <dd><code>{{object.file_path}} {% if object.pathflags %}<i>({{object.pathflags}})</i>{% endif %}</code></dd>
+ {% if layer_version.branch %}
+ <dt>
+ <i class="icon-question-sign get-help" title="The Git branch of the layer providing the recipe"></i>
+ Layer branch
+ </dt>
+ <dd>{{layer_version.branch}}</dd>
+ {% endif %}
+ <dt>
+ <i class="icon-question-sign get-help" title="The Git commit of the layer providing the recipe"></i>
+ Layer commit
+ </dt>
+ <dd class="iscommit">{{layer_version.commit}}</dd>
+ </dl>
+
+ <h2 class="details">Tasks</h2>
+ {% if not tasks %}
+ <div class="alert alert-info">
+ <strong>{{object.name}}_{{object.version}}</strong> does not have any tasks in this build.
+ </div>
+ {% else %}
+ <table class="table table-bordered table-hover">
+ <thead>
+ <tr>
+ <th>
+ <i class="icon-question-sign get-help" title="The running sequence of each task in the build"></i>
+ Order
+ </th>
+ <th>
+ <i class="icon-question-sign get-help" title="The name of the task"></i>
+ Task
+ </th>
+ <th>
+ <i class="icon-question-sign get-help" title="This value tells you if a task had to run (executed) in order to generate the task output, or if the output was provided by another task and therefore the task didn't need to run (not executed)"></i>
+ Executed
+ </th>
+ <th>
+ <i class="icon-question-sign get-help" title="This column tells you if 'executed' tasks succeeded or failed. The column also tells you why 'not executed' tasks did not need to run"></i>
+ Outcome
+ </th>
+ <th>
+ <i class="icon-question-sign get-help" title="This column tells you if a task tried to restore output from the <code>sstate-cache</code> directory or mirrors, and reports the result: Succeeded, Failed or File not in cache"></i>
+ Cache attempt
+ </th>
+ </tr>
+ </thead>
+ <tbody>
+
+ {% for task in tasks %}
+
+ <tr {{ task|task_color }} >
+
+ <td><a {{ task|task_color }} href="{% url "task" build.pk task.pk %}">{{task.order}}</a></td>
+ <td>
+ <a {{ task|task_color }} href="{% url "task" build.pk task.pk %}">{{task.task_name}}</a>
+ {% if task.get_description %}<i class="icon-question-sign get-help hover-help" title="" data-original-title="{{task.get_description}}"></i> {% endif %}
+ </td>
+
+ <td><a {{ task|task_color }} href="{% url "task" build.pk task.pk %}">{{task.get_executed_display}}</a></td>
+
+ <td>
+ <a {{ task|task_color }} href="{% url "task" build.pk task.pk %}">{{task.get_outcome_display}} </a>
+            {% if task.outcome == task.OUTCOME_FAILED %}
+ <a href="{% url 'build_artifact' build.pk "tasklogfile" task.pk %}">
+ <i class="icon-download-alt" title="Download task log file"></i>
+ </a>
+ {% endif %}
+ <i class="icon-question-sign get-help hover-help" title="{{task.get_outcome_help}}"></i>
+ </td>
+ <td>
+ {% ifnotequal task.sstate_result task.SSTATE_NA %}
+ <a {{ task|task_color }} href="{% url "task" build.pk task.pk %}">{{task.get_sstate_result_display}}</a>
+ {% endifnotequal %}
+ </td>
+
+ </tr>
+
+ {% endfor %}
+ </tbody>
+ </table>
+ {% endif %}
+ </div>
+ <div class="tab-pane {{tab_states.3}}" id="dependencies">
+
+ {% if not object.r_dependencies_recipe.all %}
+ <div class="alert alert-info">
+ <strong>{{object.name}}_{{object.version}}</strong> has no build dependencies.
+ </div>
+ {% else %}
+ <table class="table table-bordered table-hover">
+ <thead>
+ <tr>
+ <th>
+ Recipe
+ </th>
+ <th>
+ Version
+ </th>
+ </tr>
+ </thead>
+ <tbody>
+
+ {% for rr in object.r_dependencies_recipe.all|dictsort:"depends_on.name" %}
+ <tr>
+ <td><a href="{% url "recipe" build.pk rr.depends_on.pk %}">{{rr.depends_on.name}}</a></td>
+ <td><a href="{% url "recipe" build.pk rr.depends_on.pk %}">{{rr.depends_on.version}}</a></td>
+ </tr>
+ {% endfor %}
+
+ </tbody>
+ </table>
+ {% endif %}
+
+ </div>
+ <div class="tab-pane {{tab_states.4}}" id="brought-in-by">
+
+ {% if not object.r_dependencies_depends.all %}
+ <div class="alert alert-info">
+ <strong>{{object.name}}_{{object.version}}</strong> has no reverse build dependencies.
+ </div>
+ {% else %}
+ <table class="table table-bordered table-hover">
+ <thead>
+ <tr>
+ <th>
+ Recipe
+ </th>
+ <th>
+ Version
+ </th>
+ </tr>
+ </thead>
+ <tbody>
+
+ {% for rr in object.r_dependencies_depends.all|dictsort:"recipe.name" %}
+ <tr>
+ <td><a href="{% url "recipe" build.pk rr.recipe.pk %}">{{rr.recipe.name}}</a></td>
+ <td><a href="{% url "recipe" build.pk rr.recipe.pk %}">{{rr.recipe.version}}</a></td>
+ </tr>
+ {% endfor %}
+
+ </tbody>
+ </table>
+ {% endif %}
+
+ </div>
+ </div>
+</div>
+
+<div class="row span4 well">
+ <h2>About {{object.name}}</h2>
+ <dl class="item-info">
+ {% if object.summary %}
+ <dt>Summary</dt>
+ <dd>{{object.summary}}</dd>
+ {% endif %}
+ {% if object.description %}
+ <dt>Description</dt>
+ <dd>{{object.description}}</dd>
+ {% endif %}
+ {% if object.homepage %}
+ <dt>Homepage</dt>
+ <dd><a href="{{object.homepage}}">{{object.homepage}}</a></dd>
+ {% endif %}
+ {% if object.bugtracker %}
+ <dt>Bugtracker</dt>
+ <dd><a href="{{object.bugtracker}}">{{object.bugtracker}}</a></dd>
+ {% endif %}
+ {% if object.section %}
+ <dt>
+ Section
+ <i class="icon-question-sign get-help" title="The section in which recipes should be categorized"></i>
+ </dt>
+ <dd>{{object.section}}</dd>
+ {% endif %}
+ {% if object.license %}
+ <dt>License</dt>
+ <dd>{{object.license}}</dd>
+ {% endif %}
+ </dl>
+</div>
+
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/recipe_btn.html b/bitbake/lib/toaster/toastergui/templates/recipe_btn.html
new file mode 100644
index 0000000..77c1b23
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/recipe_btn.html
@@ -0,0 +1,8 @@
+<button data-recipe-name="{{data.name}}" class="btn btn-block layer-exists-{{data.layer_version.pk}} build-recipe-btn" style="display:none; margin-top: 5px;" >
+ Build recipe
+</button>
+<button class="btn btn-block layerbtn layer-add-{{data.layer_version.pk}}" data-layer='{ "id": {{data.layer_version.pk}}, "name": "{{data.layer_version.layer.name}}", "layerdetailurl": "{%url 'layerdetails' extra.pid data.layer_version.pk%}"}' data-directive="add">
+ <i class="icon-plus"></i>
+ Add layer
+ <i title="" class="icon-question-sign get-help" data-original-title="To build this target, you must first add the {{data.layer_version.layer.name}} layer to your project"></i>
+</button>
diff --git a/bitbake/lib/toaster/toastergui/templates/recipe_packages.html b/bitbake/lib/toaster/toastergui/templates/recipe_packages.html
new file mode 100644
index 0000000..d25847b
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/recipe_packages.html
@@ -0,0 +1,123 @@
+{% extends "basebuilddetailpage.html" %}
+
+{% load projecttags %}
+{% load humanize %}
+{% block localbreadcrumb %}
+<li><a href="{% url 'recipes' build.pk %}">Recipes</a></li>
+<li>{{recipe.name}}_{{recipe.version}} </li>
+{% endblock %}
+
+{% block pagedetailinfomain %}
+
+<!-- Begin container -->
+
+<div class="row-fluid span11">
+ <div class="page-header">
+ <h1>{{recipe.name}}_{{recipe.version}}</h1>
+ </div>
+</div>
+
+<div class="row-fluid span7 tabbable">
+ <ul class="nav nav-pills">
+ <li>
+ <a href="{% url "recipe" build.pk recipe.id "1" %}">
+ <i class="icon-question-sign get-help" title="Build-related information about the recipe"></i>
+ Recipe details
+ </a>
+ </li>
+ <li class="active">
+ <a href="#packages-built" data-toggle="tab">
+ <i class="icon-question-sign get-help" title="The packaged output resulting from building the recipe"></i>
+ Packages ({{object_count}})
+ </a>
+ </li>
+ <li>
+ <a href="{% url "recipe" build.pk recipe.id "3" %}">
+ <i class="icon-question-sign get-help" title="The recipe build-time dependencies (i.e. other recipes)"></i>
+ Build dependencies ({{recipe.r_dependencies_recipe.all.count}})
+ </a>
+ </li>
+ <li>
+ <a href="{% url "recipe" build.pk recipe.id "4" %}">
+ <i class="icon-question-sign get-help" title="The recipe build-time reverse dependencies (i.e. the recipes that depend on this recipe)"></i>
+ Reverse build dependencies ({{recipe.r_dependencies_depends.all.count}})
+ </a>
+ </li>
+ </ul>
+ <div class="tab-content">
+{# <div class="tab-pane active" id="packages-built" name="packages-built">#}
+ <div class="tab-pane active" id="packages-built">
+ {% if not objects and not request.GET.search %}
+ <div class="alert alert-info">
+ <strong>{{recipe.name}}_{{recipe.version}}</strong> does not build any packages.
+ </div>
+
+ {% elif not objects %}
+ {# have empty search results, no table nor pagination #}
+ {% with "packages" as search_what %}
+ {% include "detail_search_header.html" %}
+ {% endwith %}
+
+ {% else %}
+
+
+ {% with "packages" as search_what %}
+ {% include "detail_search_header.html" %}
+ {% endwith %}
+ <table class="table table-bordered table-hover tablesorter" id="otable">
+ {% include "detail_sorted_header.html" %}
+
+ <tbody>
+ {% for package in objects %}
+
+ <tr>
+ <td><a href="{% url "package_built_detail" build.pk package.pk %}">{{package.name}}</a></td>
+ <td><a href="{% url "package_built_detail" build.pk package.pk %}">{{package.version}}_{{package.revision}}</a></td>
+ <td class="sizecol"><a href="{% url "package_built_detail" build.pk package.pk %}">{{package.size|filtered_filesizeformat}}</a></td>
+ </tr>
+
+ {% endfor %}
+
+ {% endif %}
+ {% if objects %}
+ </tbody>
+ </table>
+ {% include "detail_pagination_bottom.html" %}
+ {% endif %}
+ </div> {# tab-pane #}
+ </div> {# tab-content #}
+</div> {# span7 #}
+
+<div class="row span4 well">
+ <h2>About {{recipe.name}}</h2>
+ <dl class="item-info">
+ {% if recipe.summary %}
+ <dt>Summary</dt>
+ <dd>{{recipe.summary}}</dd>
+ {% endif %}
+ {% if recipe.description %}
+ <dt>Description</dt>
+ <dd>{{recipe.description}}</dd>
+ {% endif %}
+ {% if recipe.homepage %}
+ <dt>Homepage</dt>
+ <dd><a href="{{recipe.homepage}}">{{recipe.homepage}}</a></dd>
+ {% endif %}
+ {% if recipe.bugtracker %}
+ <dt>Bugtracker</dt>
+ <dd><a href="{{recipe.bugtracker}}">{{recipe.bugtracker}}</a></dd>
+ {% endif %}
+ {% if recipe.section %}
+ <dt>
+ Section
+ <i class="icon-question-sign get-help" title="The section in which recipes should be categorized"></i>
+ </dt>
+ <dd>{{recipe.section}}</dd>
+ {% endif %}
+ {% if recipe.license %}
+ <dt>License</dt>
+ <dd>{{recipe.license}}</dd>
+ {% endif %}
+ </dl>
+</div>
+{% endblock pagedetailinfomain %}
diff --git a/bitbake/lib/toaster/toastergui/templates/recipes.html b/bitbake/lib/toaster/toastergui/templates/recipes.html
new file mode 100644
index 0000000..5cdac43
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/recipes.html
@@ -0,0 +1,112 @@
+{% extends "basebuildpage.html" %}
+
+{% load projecttags %}
+
+{% block localbreadcrumb %}
+<li>Recipes</li>
+{% endblock %}
+
+{% block nav-recipes %}
+ <li class="active"><a href="{% url 'recipes' build.pk %}">Recipes</a></li>
+{% endblock %}
+
+{% block buildinfomain %}
+<div class="span10">
+<div class="page-header">
+<h1>
+ {% if request.GET.search and objects.paginator.count > 0 %}
+ {{objects.paginator.count}} recipe{{objects.paginator.count|pluralize}} found
+ {%elif request.GET.search and objects.paginator.count == 0%}
+ No recipes found
+ {%else%}
+ Recipes
+ {%endif%}
+ </h1>
+</div>
+
+{% if objects.paginator.count == 0 %}
+ <div class="row-fluid">
+ <div class="alert">
+ <form class="no-results input-append" id="searchform">
+ <input id="search" name="search" class="input-xxlarge" type="text" value="{%if request.GET.search%}{{request.GET.search}}{%endif%}"/>{% if request.GET.search %}<a href="javascript:$('#search').val('');searchform.submit()" class="add-on btn" tabindex="-1"><i class="icon-remove"></i></a>{% endif %}
+ <button class="btn" type="submit" value="Search">Search</button>
+ <button class="btn btn-link" onclick="javascript:$('#search').val('');searchform.submit()">Show all recipes</button>
+ </form>
+ </div>
+ </div>
+
+{% else %}
+{% include "basetable_top.html" %}
+
+ {% for recipe in objects %}
+
+ <tr class="data">
+ <td class="recipe__name">
+ <a href="{% url "recipe" build.pk recipe.pk %}">{{recipe.name}}</a>
+ </td>
+ <td class="recipe__version">
+ <a href="{% url "recipe" build.pk recipe.pk %}">{{recipe.version}}</a>
+ </td>
+ <!-- Depends -->
+ <td class="depends_on">
+ {% with deps=recipe_deps|get_dict_value:recipe.pk %}
+ {% with count=deps|length %}
+ {% if count %}
+ <a class="btn"
+ title="<a href='{% url "recipe" build.pk recipe.pk %}#dependencies'>{{recipe.name}}</a> dependencies"
+ data-content="<ul class='unstyled'>
+ {% for i in deps|dictsort:"depends_on.name"%}
+ <li><a href='{% url "recipe" build.pk i.depends_on.pk %}'>{{i.depends_on.name}}</a></li>
+ {% endfor %}
+ </ul>">
+ {{count}}
+ </a>
+ {% endif %}
+ {% endwith %}
+ {% endwith %}
+ </td>
+ <!-- Brought in by -->
+ <td class="depends_by">
+ {% with revs=recipe_revs|get_dict_value:recipe.pk %}
+ {% with count=revs|length %}
+ {% if count %}
+ <a class="btn"
+ title="<a href='{% url "recipe" build.pk recipe.pk %}#brought-in-by'>{{recipe.name}}</a> reverse dependencies"
+ data-content="<ul class='unstyled'>
+ {% for i in revs|dictsort:"recipe.name" %}
+ <li><a href='{% url "recipe" build.pk i.recipe.pk %}'>{{i.recipe.name}}</a></li>
+ {% endfor %}
+ </ul>">
+ {{count}}
+ </a>
+ {% endif %}
+ {% endwith %}
+ {% endwith %}
+ </td>
+ <!-- Recipe file -->
+ <td class="recipe_file">{{recipe.file_path}} {% if recipe.pathflags %}<i>({{recipe.pathflags}})</i>{% endif %}</td>
+ <!-- Section -->
+ <td class="recipe_section">{{recipe.section}}</td>
+ <!-- License -->
+ <td class="recipe_license">{{recipe.license}}</td>
+ <!-- Layer -->
+ <td class="layer_version__layer__name">{{recipe.layer_version.layer.name}}</td>
+ <!-- Layer branch -->
+ <td class="layer_version__branch">{{recipe.layer_version.branch}}</td>
+ <!-- Layer commit -->
+ <td class="layer_version__layer__commit">
+ <a class="btn"
+ data-content="<ul class='unstyled'>
+ <li>{{recipe.layer_version.commit}}</li>
+ </ul>">
+ {{recipe.layer_version.commit|truncatechars:13}}
+ </a>
+ </td>
+ </tr>
+
+ {% endfor %}
+
+{% include "basetable_bottom.html" %}
+{% endif %}
+</div>
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/tablesort.html b/bitbake/lib/toaster/toastergui/templates/tablesort.html
new file mode 100644
index 0000000..3624742
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/tablesort.html
@@ -0,0 +1,38 @@
+{% load projecttags %}
+<!-- component to display a generic table -->
+ {% if disable_sort %}
+ <table class="table table-bordered table-hover" id="detail_table">
+ <thead>
+ <tr>
+ {% for tc in tablecols %}
+ <th class="{%if tc.dclass%}{{tc.dclass}}{%endif%} {%if tc.clclass%}{{tc.clclass}}{%endif%}">
+ {%if tc.qhelp%}<i class="icon-question-sign get-help" title="{{tc.qhelp}}"></i>{%endif%}
+ {{tc.name}}
+ </th>
+ {% endfor %}
+ </tr>
+ </thead>
+ {% else %}
+ <table class="table table-bordered table-hover tablesorter" id="otable">
+ <thead>
+ <!-- Table header row; generated from "tablecols" entry in the context dict -->
+ <tr>
+ {% for tc in tablecols %}
+ <th class="{%if tc.dclass%}{{tc.dclass}}{%endif%} {%if tc.clclass%}{{tc.clclass}}{%endif%}">
+ {%if tc.qhelp%}<i class="icon-question-sign get-help" title="{{tc.qhelp}}"></i>{%endif%}
+ {%if tc.orderfield%}
+ <a {%if tc.ordericon%} class="sorted" {%endif%}
+ href="javascript:reload_params({'page': 1, 'orderby' : '{{tc.orderfield}}' })" >
+ {{tc.name}}
+ </a>
+ {%else%}
+ <span class="muted">
+ {{tc.name}}
+ </span>
+ {%endif%}
+ {%if tc.ordericon%} <i class="icon-caret-{{tc.ordericon}}"></i>{%endif%}
+ </th>
+ {% endfor %}
+ </tr>
+ </thead>
+ {% endif %}
diff --git a/bitbake/lib/toaster/toastergui/templates/target.html b/bitbake/lib/toaster/toastergui/templates/target.html
new file mode 100644
index 0000000..65e6c4a
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/target.html
@@ -0,0 +1,162 @@
+{% extends "basebuildpage.html" %}
+{% block localbreadcrumb %}
+<li>{{target.target}}</li>
+{% endblock localbreadcrumb%}
+
+{% load projecttags %}
+
+{% block nav-target %}
+ {% for t in build.get_sorted_target_list %}
+ {% ifequal target.pk t.pk %}
+ <li class="active"><a href="{% url 'target' build.pk t.pk %}">{{t.target}}</a><li>
+ {% else %}
+ <li><a href="{% url 'target' build.pk t.pk %}">{{t.target}}</a><li>
+ {% endifequal %}
+ {% endfor %}
+{% endblock %}
+
+{% block buildinfomain %}
+
+<div class="row-fluid span10">
+ <div class="page-header">
+ <h1>
+ {% if request.GET.search and objects.paginator.count > 0 %}
+ {{objects.paginator.count}} package{{objects.paginator.count|pluralize}} found
+ {% elif request.GET.search and objects.paginator.count == 0 %}
+ No packages found
+ {% else %}
+ {{target.target}}
+ {% endif %}
+ </h1>
+ </div>
+</div>
+
+<div class="row-fluid pull-right span10" id="navTab">
+ <ul class="nav nav-pills">
+ <li class="active">
+ <a href="#target">
+ <i class="icon-question-sign get-help" title="Of all the packages built, the subset installed in the root file system of this image"></i>
+ Packages included ({{target.package_count}} - {{packages_sum|filtered_filesizeformat}})
+ </a>
+ </li>
+ <li>
+ <a href="{% url 'dirinfo' build.id target.id %}">
+ <i class="icon-question-sign get-help" title="The directories and files in the root file system of this image"></i>
+ Directory structure
+ </a>
+ </li>
+ </ul>
+
+ <div id="image-packages" class="tab-pane">
+
+ {% if objects.paginator.count == 0 %}
+ <div class="row-fluid">
+ <div class="alert">
+ <form class="no-results input-append" id="searchform">
+ <input id="search" name="search" class="input-xxlarge" type="text" value="{% if request.GET.search %}{{request.GET.search}}{% endif %}"/>{% if request.GET.search %}<a href="javascript:$('#search').val('');searchform.submit()" class="add-on btn" tabindex="-1"><i class="icon-remove"></i></a>{% endif %}
+ <button class="btn" type="submit" value="Search">Search</button>
+ <button class="btn btn-link" onclick="javascript:$('#search').val('');searchform.submit()">Show all packages</button>
+ </form>
+ </div>
+ </div>
+
+
+ {% else %}
+ {% include "basetable_top.html" %}
+ {% for package in objects %}
+ <tr>
+ {# order of the table data must match the columns defined in template's context tablecols #}
+ <td class="package_name">
+ <a href="{% url 'package_included_detail' build.id target.id package.id %}">
+ {{package.name}}
+ </a>
+ {% if package.installed_name and package.name != package.installed_name %}
+ <span class="muted"> as {{package.installed_name}}</span>
+ <i class="icon-question-sign get-help hover-help" title='{{package.name|add:" was renamed at packaging time and was installed in your image as "|add:package.installed_name}}'></i>
+ {% endif %}
+ </td>
+ <td class="package_version">
+ <a href="{% url 'package_included_detail' build.id target.id package.id %}">
+ {{package.version|filtered_packageversion:package.revision}}
+ </a>
+ </td>
+ <td class="license">
+ {{package.license}}
+ </td>
+ <td class="size sizecol">
+ {{package.size|filtered_installedsize:package.installed_size|filtered_filesizeformat}}
+ </td>
+
+ <td class="size_over_total sizecol">
+ {{package|filter_sizeovertotal:packages_sum}}
+ </td>
+ <td class="depends">
+ {% with deps=package.runtime_dependencies %}
+ {% with deps_count=deps|length %}
+ {% if deps_count > 0 %}
+ <a class="btn"
+ title="<a href='{% url "package_included_dependencies" build.id target.id package.id %}'>{{package.name}}</a> dependencies"
+ data-content="<ul class='unstyled'>
+ {% for i in deps|dictsort:'depends_on.name' %}
+ <li><a href='{% url "package_included_detail" build.pk target.id i.depends_on.pk %}'>{{i.depends_on.name}}</a></li>
+ {% endfor %}
+ </ul>">
+ {{deps_count}}
+ </a>
+ {% endif %}
+ {% endwith %}
+ {% endwith %}
+ </td>
+ <td class="brought_in_by">
+ {% with rdeps=package.reverse_runtime_dependencies %}
+ {% with rdeps_count=rdeps|length %}
+ {% if rdeps_count > 0 %}
+ <a class="btn"
+ title="<a href='{% url "package_included_reverse_dependencies" build.id target.id package.id %}'>{{package.name}}</a> reverse dependencies"
+ data-content="<ul class='unstyled'>
+ {% for i in rdeps|dictsort:'package.name' %}
+ <li><a href='{% url "package_included_detail" build.id target.id i.package.id %}'>{{i.package.name}}</a></li>
+ {% endfor %}
+ </ul>">
+ {{rdeps_count}}
+ </a>
+ {% endif %}
+ {% endwith %}
+ {% endwith %}
+ </td>
+ <td class="recipe_name">
+ {% if package.recipe.version %}
+ <a href="{% url 'recipe' build.id package.recipe_id %}">
+ {{ package.recipe.name }}
+ </a>
+ {% endif %}
+ </td>
+ <td class="recipe_version">
+ {% if package.recipe.version %}
+ <a href="{% url 'recipe' build.id package.recipe_id %}">
+ {{ package.recipe.version }}
+ </a>
+ {% endif %}
+ </td>
+ <td class="layer_name">
+ {{ package.recipe.layer_version.layer.name }}
+ </td>
+ <td class="layer_branch">
+ {{ package.recipe.layer_version.branch}}
+ </td>
+ <td class="layer_commit">
+ <a class="btn"
+ data-content="<ul class='unstyled'>
+ <li>{{package.recipe.layer_version.commit}}</li>
+ </ul>">
+ {{package.recipe.layer_version.commit|truncatechars:13}}
+ </a>
+ </td>
+ </tr>
+ {% endfor %}
+
+ {% include "basetable_bottom.html" %}
+ {% endif %}
+ </div> <!-- tabpane -->
+</div> <!--span 10-->
+{% endblock buildinfomain %}
diff --git a/bitbake/lib/toaster/toastergui/templates/task.html b/bitbake/lib/toaster/toastergui/templates/task.html
new file mode 100644
index 0000000..635098a
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/task.html
@@ -0,0 +1,269 @@
+{% extends "basebuilddetailpage.html" %}
+
+{% load projecttags %}
+{% load humanize %}
+
+{% block localbreadcrumb %}
+<li><a href="{% url 'tasks' build.pk %}">Tasks</a></li>
+<li>{{task.recipe.name}}_{{task.recipe.version}} {{task.task_name}}</li>
+{% endblock %}
+
+{% block pagedetailinfomain %}
+
+<div class="row span11">
+ <div class="page-header">
+ <h1><a href="{%url 'recipe' build.pk task.recipe.pk %}">{{task.recipe.name}}_{{task.recipe.version}}</a> {{task.task_name}}</h1>
+ </div>
+
+{# Outcome section #}
+<h2 {{ task|task_color:True }}>
+ {{task.get_outcome_display}}
+ <i class="icon-question-sign get-help heading-help" title="{{task.get_outcome_help}}"></i>
+</h2>
+{%if task.task_executed %}
+ {# executed tasks outcome #}
+ {% if task.logfile %}
+ <a class="btn btn-large" href="{% url 'build_artifact' build.id "tasklogfile" task.pk %}" style="margin:15px;">Download task log</a>
+ {% endif %}
+ {# show stack trace for failed task #}
+ {% if task.outcome == task.OUTCOME_FAILED and log_head %}
+ <h3>Python stack trace</h3>
+ <div>
+ <pre style="min-height:160px;">
+ <code>{{log_head}}</code><a id="full-trace-show" data-target="#fulltrace" data-toggle="collapse" class="btn btn-mini">...</a>
+ <div id="fulltrace" class="collapse" style="margin-top: -20px; height: 0px;">
+ <code>{{log_body}}</code><br><a id="full-trace-hide" class="btn btn-mini collapsed" style="font-family:Helvetica Neue" data-target="#fulltrace" data-toggle="collapse">Collapse stack trace<i class="icon-caret-up"></i></a></div></pre>
+ </div>
+ {% endif %}
+{% else %}
+{# not executed tasks outcome #}
+ {% if task.outcome == task.OUTCOME_PREBUILT %}
+ {% if not showing_matches %}
+ <a class="btn" href="javascript:reload_params({'show_matches' : 'true' })">Match to tasks in previous builds <i class="icon-question-sign get-help" style="margin-top:20px;" title="This shows you a list of tasks from previous builds with the same signature generated from the same inputs as used in the prebuilt task. Any of them could be the task that generated the output this prebuilt task is reusing"></i></a>
+ {% elif matching_tasks %}
+ <h3 class="details">Prebuilt task could be based on
+ <i class="icon-question-sign get-help heading-help" title="This table shows a list of tasks from previous builds with the same signature generated from the same inputs as used in the prebuilt task. Any of them could be the task that generated the output this prebuilt task is reusing"></i>
+ </h3>
+ <table class="table table-bordered table-hover">
+ <thead>
+ <th>
+ <i class="icon-question-sign get-help" title="The name of the recipe to which each task applies"></i>
+ Recipe
+ </th>
+ <th>
+ <i class="icon-question-sign get-help" title="The name of the task"></i>
+ Task
+ </th>
+ <th>
+ <i class="icon-question-sign get-help" title="This value tells you if a task had to run (executed) in order to generate the task output, or if the output was provided by another task and therefore the task didn't need to run (not executed)"></i>
+ Executed
+ </th>
+ <th>
+ <i class="icon-question-sign get-help" title="This column tells you if 'executed' tasks succeeded or failed. The column also tells you why 'not executed' tasks did not need to run"></i>
+ Outcome
+ </th>
+ <th>
+ <i class="icon-question-sign get-help" title="The date and time the build finished"></i>
+ Build completed on
+ </th>
+ </thead>
+ <tbody>
+ {% for match in matching_tasks %}
+ <tr {{ match|task_color }}>
+ <td>
+ <a href="{%url "task" match.build.pk match.pk%}">{{match.recipe.name}}</a>
+ </td>
+ <td>
+ <a href="{%url "task" match.build.pk match.pk%}">{{match.task_name}}</a>
+ {% if task.get_description %}
+ <i class="icon-question-sign get-help hover-help" title="{{task.get_description}}"></i>
+ {% endif %}
+ </td>
+ <td>
+ <a href="{%url "task" match.build.pk match.pk%}">{{match.get_executed_display}}</a>
+ </td>
+ <td>
+ <a href="{%url "task" match.build.pk match.pk%}">{{match.get_outcome_display}} </a><i class="icon-question-sign get-help hover-help" title="{{match.get_outcome_help}}"></i>
+ </td>
+ <td>
+ <a href="{%url "task" match.build.pk match.pk%}">{{match.build.completed_on|date:"d/m/y H:i"}}</a>
+ </td>
+ </tr>
+ {% endfor %}
+ </tbody>
+ </table>
+ {% else %}
+ <p class="alert">
+ <strong> We have found no tasks matching this prebuilt task</strong><br/>
+ The task you are looking for could belong to a build for which Toaster has no data.
+ </p>
+ {% endif %}
+ {% elif task.outcome == task.OUTCOME_COVERED %}
+ <dl class="dl-horizontal">
+ <dt>
+ <i class="icon-question-sign get-help" title="The task(s) providing the outcome of this task"></i> Task covered by
+ </dt>
+ <dd>
+ <ul>
+ {% for t in covered_by %}
+ <li>
+ <a href="{%url 'task' t.build.pk t.pk%}"
+ class="task-info"
+ title="{{t.get_executed_display}} | {{t.get_outcome_display}}">
+ {{t.recipe.name}}_{{t.recipe.version}}
+ {{t.task_name}}
+ </a>
+ </li>
+ {% endfor %}
+ </ul>
+ </dd>
+ </dl>
+ {%elif task.outcome == task.OUTCOME_CACHED%}
+ {% for t in task.get_related_setscene %}
+ {% if forloop.last %}
+ <a class="btn btn-large" href="{% url 'build_artifact' build.id "tasklogfile" t.pk %}" style="margin:15px;">Download task log</a>
+ {% endif %}
+ {% endfor %}
+
+ {%elif task.outcome == task.OUTCOME_EMPTY%}
+ <div class="alert alert-info details">
+ This task is empty because it has the <code>noexec</code> flag set to <code>1</code>, or the task function is empty
+ </div>
+ {% endif %}
+{% endif %}
+
+{# Execution section #}
+ {% if task.task_executed %}
+ <h2>
+ Executed
+ <i class="icon-question-sign get-help heading-help" title="'Executed' tasks are those that need to run in order to generate the task output"></i>
+ {% else %}
+ <h2 class="muted">
+ Not Executed
+ <i class="icon-question-sign get-help heading-help" title="'Not executed' tasks don't need to run because their outcome is provided by another task"></i>
+ {% endif %}
+ </h2>
+
+<dl class="dl-horizontal">
+ <dt>
+ <i class="icon-question-sign get-help" title="To make builds more efficient, the build system detects changes in the 'inputs' to a given task by creating a 'task signature'. If the signature changes, the build system assumes the inputs have changed and the task needs to be rerun"></i>
+ Task inputs signature
+ </dt>
+ <dd>
+ {{task.sstate_checksum}}
+ </dd>
+</dl>
+ {% if task.sstate_result != task.SSTATE_NA %}
+ <div class="alert alert-info">Attempting to restore output from sstate cache
+ <i class="icon-question-sign get-help get-help-blue" title="The build system is searching for the task output in your <code>sstate-cache</code> directory and mirrors. If the build system finds the task output, it will reuse it instead of building it from scratch by running the real task. Reusing the task output makes the build faster"></i>
+ </div>
+ <dl class="dl-horizontal">
+ <dt>
+ <i class="icon-question-sign get-help" title="The name of the file searched for in your <code>sstate-cache</code> directory and mirrors"></i>
+ File searched for
+ </dt>
+ <dd><code>{{task.path_to_sstate_obj}}</code></dd>
+ <dt>
+ <i class="icon-question-sign get-help" title="The locations searched for the above file (i.e. your <code>sstate-cache</code> directory and any mirrors you have set up)"></i>
+ URI(s) searched
+ </dt>
+ <dd><ul>{% for uri in uri_list %}<li><code>{{uri}}</code></li>{% endfor %}</ul></dd>
+ </dl>
+ {% endif %}
+ {% if task.sstate_result == task.SSTATE_MISS %}
+ <div class="alert alert-info">
+ <strong>File not in sstate cache.</strong> Running the real task instead.
+ </div>
+ {% elif task.sstate_result == task.SSTATE_FAILED%}
+ <div class="alert">
+ <strong>Failed</strong> to restore output from sstate cache. The file was found but could not be unpacked.
+ </div>
+ <dl class="dl-horizontal">
+ <a href="{% url 'build_artifact' build.id "tasklogfile" task.pk %}" style="margin:15px;">Download log</a>
+ </dl>
+ <div class="alert alert-info">
+ Running the real task instead.
+ </div>
+ {% elif task.sstate_result == task.SSTATE_RESTORED %}
+ <div class="alert alert-info">
+ Output <strong>successfully restored</strong> from sstate cache.
+ </div>
+ {% endif %}
+ <dl class="dl-horizontal">
+ <dt>
+ <i class="icon-question-sign get-help" title="The running sequence of each task in the build"></i>
+ Task order
+ </dt>
+ <dd><a href="{%url "tasks_task" build.pk task.order %}#{{task.order}}">{{task.order}}</a></dd>
+ {% if task.task_executed %}
+ <dt>
+ <i class="icon-question-sign get-help" title="Indicates if this task executes a Python or Shell function(s)"></i>
+ Task script type
+ </dt>
+ <dd>{{task.get_script_type_display}}</dd>
+ {% endif %}
+<!--
+ <dt>
+ <i class="icon-question-sign get-help" title="The code executed by the task"></i>
+ Task executable output
+ </dt>
+ <dd><code>{{task.source_url}}</code></dd>
+-->
+ <dt>
+ <i class="icon-question-sign get-help" title="Task dependency chain (i.e. other tasks)"></i>
+ Dependencies
+ </dt>
+ <dd>
+ <ul>
+ {% for dep in deps %}
+ <li><a href="{%url 'task' dep.build.pk dep.pk%}" class="task-info" title="{{dep.get_executed_display}} | {{dep.get_outcome_display}}">{{dep.recipe.name}}_{{dep.recipe.version}} <span class="task-name">{{dep.task_name}}</span></a></li>
+ {% empty %}
+ <li class="muted">This task has no dependencies</li>
+ {% endfor %}
+ </ul>
+ </dd>
+ <dt>
+ <i class="icon-question-sign get-help" title="Tasks that depend on this task"></i>
+ Reverse dependencies
+ </dt>
+ <dd>
+ <ul>
+ {% for dep in rdeps %}
+ <li><a href="{%url 'task' dep.build.pk dep.pk%}" class="task-info" title="{{dep.get_executed_display}} | {{dep.get_outcome_display}}">{{dep.recipe.name}}_{{dep.recipe.version}} <span class="task-name">{{dep.task_name}}</span></a></li>
+ {% empty %}
+ <li class="muted">This task has no reverse dependencies</li>
+ {% endfor %}
+ </ul>
+  </dd>
+</dl>
+
+{# Performance section - shown only for executed tasks #}
+{%if task.elapsed_time or task.cpu_usage or task.disk_io %}
+ <h2 class="details">Performance</h2>
+{% endif %}
+ <dl class="dl-horizontal">
+ {% if task.elapsed_time %}
+ <dt>
+ <i class="icon-question-sign get-help" title="How long it took the task to finish in seconds"></i>
+ Time (secs)
+ </dt>
+ <dd>{{task.elapsed_time|format_none_and_zero|floatformat:2}}</dd>
+ {% endif %}
+ {% if task.cpu_usage > 0 %}
+ <dt>
+ <i class="icon-question-sign get-help" title="The percentage of task CPU utilization"></i>
+ CPU usage
+ </dt>
+ <dd>{{task.cpu_usage|format_none_and_zero|floatformat:2}}%</dd>
+ {% endif %}
+ {% if task.disk_io > 0 %}
+ <dt>
+ <i class="icon-question-sign get-help" title="Number of miliseconds the task spent doing disk input and output"></i>
+ Disk I/O (ms)
+ </dt>
+ <dd>{{task.disk_io|format_none_and_zero}}</dd>
+ {% endif %}
+ </dl>
+
+</div>
+{% endblock %}
+
diff --git a/bitbake/lib/toaster/toastergui/templates/tasks.html b/bitbake/lib/toaster/toastergui/templates/tasks.html
new file mode 100644
index 0000000..b18b5c7
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/tasks.html
@@ -0,0 +1,136 @@
+{% extends "basebuildpage.html" %}
+{% load projecttags %}
+
+{% block localbreadcrumb %}
+<li>{{title}}</li>
+{% endblock %}
+
+{% block nav-tasks %}
+ {% if 'Tasks' == title %}
+ <li class="active"><a href="{% url 'tasks' build.pk %}">Tasks</a></li>
+ {% else %}
+ <li><a href="{% url 'tasks' build.pk %}">Tasks</a></li>
+ {% endif %}
+{% endblock %}
+{% block nav-buildtime %}
+ {% if 'Time' == title %}
+ <li class="active"><a href="{% url 'buildtime' build.pk %}">Time</a></li>
+ {% else %}
+ <li><a href="{% url 'buildtime' build.pk %}">Time</a></li>
+ {% endif %}
+{% endblock %}
+{% block nav-cpuusage %}
+ {% if 'CPU usage' == title %}
+ <li class="active"><a href="{% url 'cpuusage' build.pk %}">CPU usage</a></li>
+ {% else %}
+ <li><a href="{% url 'cpuusage' build.pk %}">CPU usage</a></li>
+ {% endif %}
+{% endblock %}
+{% block nav-diskio %}
+ {% if 'Disk I/O' == title %}
+ <li class="active"><a href="{% url 'diskio' build.pk %}">Disk I/O</a></li>
+ {% else %}
+ <li><a href="{% url 'diskio' build.pk %}">Disk I/O</a></li>
+ {% endif %}
+{% endblock %}
+
+{% block buildinfomain %}
+<div class="span10">
+{% if not request.GET.filter and not request.GET.search and not objects.paginator.count %}
+ <!-- Empty - no data in database -->
+ <div class="page-header">
+ <h1>{{title}}</h1>
+ </div>
+ <div class="alert alert-info lead">
+ No data was recorded for this build.
+ </div>
+
+{% else %}
+
+ <div class="page-header">
+ <h1>
+ {% if request.GET.filter and objects.paginator.count > 0 or request.GET.search and objects.paginator.count > 0 %}
+ {{objects.paginator.count}} task{{objects.paginator.count|pluralize}} found
+ {%elif request.GET.filter and objects.paginator.count == 0 or request.GET.search and objects.paginator.count == 0 %}
+ No tasks found
+ {%else%}
+ {{title}}
+ {%endif%}
+ </h1>
+ </div>
+
+ {% if objects.paginator.count == 0 %}
+ <div class="row-fluid">
+ <div class="alert">
+ <form class="no-results input-append" id="searchform">
+ <input id="search" name="search" class="input-xxlarge" type="text" value="{{request.GET.search}}"/>{% if request.GET.search %}<a href="javascript:$('#search').val('');searchform.submit()" class="add-on btn" tabindex="-1"><i class="icon-remove"></i></a>{% endif %}
+ <button class="btn" type="submit" value="Search">Search</button>
+ <button class="btn btn-link" onclick="javascript:$('#search').val('');searchform.submit()">Show all tasks</button>
+ </form>
+ </div>
+ </div>
+
+
+ {% else %}
+ {% include "basetable_top.html" %}
+
+ {% for task in objects %}
+ <tr {{ task|task_color }} id="{{task.order}}">
+ <td class="order">
+ <a href="{%url "task" build.pk task.pk%}">{{task.order}}</a>
+ </td>
+ <td class="recipe_name" >
+ <a href="{% url "recipe" build.pk task.recipe.pk %}">{{task.recipe.name}}</a>
+ </td>
+ <td class="recipe_version">
+ <a href="{% url "recipe" build.pk task.recipe.pk %}">{{task.recipe.version}}</a>
+ </td>
+ <td class="task_name">
+ <a href="{%url "task" build.pk task.pk%}">{{task.task_name}}</a> {% if task.get_description %}<i class="icon-question-sign get-help hover-help" title="{{task.get_description}}"></i> {% endif %}
+ </td>
+ <td class="executed">
+ <a href="{%url "task" build.pk task.pk%}">{{task.get_executed_display}}</a>
+ </td>
+ <td class="outcome">
+ <a href="{%url "task" build.pk task.pk%}">{{task.get_outcome_display}} </a>
+          {% if task.outcome == task.OUTCOME_FAILED %}
+ <a href="{% url 'build_artifact' build.pk "tasklogfile" task.pk %}">
+ <i class="icon-download-alt" title="Download task log file"></i>
+ </a>
+ {% endif %}
+ <i class="icon-question-sign get-help hover-help" title="{{task.get_outcome_help}}"></i>
+ </td>
+ <td class="cache_attempt">
+ <a href="{%url "task" build.pk task.pk%}">{{task.get_sstate_result_display|format_none_and_zero}}</a>
+ </td>
+ <td class="time_taken">
+ {{task.elapsed_time|format_none_and_zero|floatformat:2}}
+ </td>
+ <td class="cpu_used">
+ {{task.cpu_usage|format_none_and_zero|floatformat:2}}{% if task.cpu_usage %}%{% endif %}
+ </td>
+ <td class="disk_io">
+ {{task.disk_io|format_none_and_zero}}
+ </td>
+
+ </tr>
+ {% endfor %}
+
+ {% include "basetable_bottom.html" %}
+ {% endif %} {# objects.paginator.count #}
+{% endif %} {# empty #}
+</div>
+
+<script type="text/javascript">
+
+ $(document).ready(function() {
+    // enable blue highlight animation for the order link
+ if (location.href.search('#') > -1) {
+ var task_order = location.href.split('#')[1];
+ $("#" + task_order).addClass("highlight");
+ }
+ });
+
+</script>
+
+{% endblock %}
diff --git a/bitbake/lib/toaster/toastergui/templates/toastertable-filter.html b/bitbake/lib/toaster/toastergui/templates/toastertable-filter.html
new file mode 100644
index 0000000..7c8dc49
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/toastertable-filter.html
@@ -0,0 +1,18 @@
+<!-- filter modal -->
+<div id="filter-modal-{{table_name}}" class="modal hide fade" tabindex="-1" role="dialog" aria-hidden="false">
+ <form id="filter-modal-form-{{table_name}}" style="margin-bottom: 0px">
+ <div class="modal-header">
+ <button type="button" class="close" data-dismiss="modal" aria-hidden="true">x</button>
+ <h3 id="filter-modal-title-{{table_name}}"> </h3>
+ </div>
+ <div class="modal-body">
+ <p>Show:</p>
+ <span id="filter-actions-{{table_name}}"></span>
+ </div>
+ <div class="modal-footer">
+ <button class="btn btn-primary" type="submit">Apply</button>
+ </div>
+ </form>
+</div>
+<button id="clear-filter-btn-{{table_name}}" style="display:none"></button>
+<!-- end filter modal -->
diff --git a/bitbake/lib/toaster/toastergui/templates/toastertable-simple.html b/bitbake/lib/toaster/toastergui/templates/toastertable-simple.html
new file mode 100644
index 0000000..212318b
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/toastertable-simple.html
@@ -0,0 +1,94 @@
+
+{% load static %}
+{% load projecttags %}
+
+<script src="{% static 'js/table.js' %}"></script>
+<script src="{% static 'js/layerBtn.js' %}"></script>
+<script>
+ $(document).ready(function() {
+ (function(){
+
+ var ctx = {
+ tableName : "{{table_name}}",
+ url : "{{ xhr_table_url }}?format=json",
+ title : "{{title}}",
+ projectLayers : {{projectlayers|json}},
+ };
+
+ try {
+ tableInit(ctx);
+ } catch (e) {
+ document.write("Problem loading table widget: " + e);
+ }
+ })();
+ });
+</script>
+
+{% include 'toastertable-filter.html' %}
+
+<div class="row-fluid" id="no-results-{{table_name}}" style="display:none">
+ <div class="alert">
+ <form class="no-results input-append">
+ <input class="input-xlarge" id="new-search-input-{{table_name}}" name="search" type="text" placeholder="Search {{title|lower}}" value="{% if request.GET.search %}{{request.GET.search}}{% endif %}"/>
+ <a href="#" class="add-on btn remove-search-btn-{{table_name}}" tabindex="-1">
+ <i class="icon-remove"></i>
+ </a>
+ <button class="btn search-submit-{{table_name}}" >Search</button>
+ <button class="btn btn-link remove-search-btn-{{table_name}}">Show {{title|lower}}
+ </button>
+ </form>
+ </div>
+</div>
+<div id="table-container-{{table_name}}" style="visibility: hidden">
+ <!-- control header -->
+ <div class="row-fluid" id="table-chrome-{{table_name}}">
+ <div class="navbar-search input-append pull-left">
+
+ <input class="input-xlarge" id="search-input-{{table_name}}" name="search" type="text" placeholder="Search {{title|lower}}" value="{% if request.GET.search %}{{request.GET.search}}{% endif %}"/>
+ <a href="#" style="display:none" class="add-on btn remove-search-btn-{{table_name}}" tabindex="-1">
+ <i class="icon-remove"></i>
+ </a>
+ <button class="btn" id="search-submit-{{table_name}}" >Search</button>
+ </div>
+
+ <div class="pull-right">
+
+ <div style="display:inline">
+ <span class="divider-vertical"></span>
+ <span class="help-inline" style="padding-top:5px;">Show rows:</span>
+ <select style="margin-top:5px;margin-bottom:0px;" class="pagesize-{{table_name}}">
+ {% with "10 25 50 100 150" as list%}
+ {% for i in list.split %}
+ <option value="{{i}}">{{i}}</option>
+ {% endfor %}
+ {% endwith %}
+ </select>
+ </div>
+ </div>
+ </div>
+
+ <!-- The actual table -->
+ <table class="table table-bordered table-hover tablesorter" id="{{table_name}}">
+ <thead>
+ <tr><th></th></tr>
+ </thead>
+ <tbody></tbody>
+ </table>
+
+ <!-- Pagination controls -->
+ <div class="pagination pagination-centered" id="pagination-{{table_name}}">
+ <ul class="pagination" style="display: block-inline">
+ </ul>
+
+ <div class="pull-right">
+ <span class="help-inline" style="padding-top:5px;">Show rows:</span>
+ <select style="margin-top:5px;margin-bottom:0px;" class="pagesize-{{table_name}}">
+ {% with "10 25 50 100 150" as list%}
+ {% for i in list.split %}
+ <option value="{{i}}">{{i}}</option>
+ {% endfor %}
+ {% endwith %}
+ </select>
+ </div>
+ </div>
+</div>
diff --git a/bitbake/lib/toaster/toastergui/templates/toastertable.html b/bitbake/lib/toaster/toastergui/templates/toastertable.html
new file mode 100644
index 0000000..9ef4c6f
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/toastertable.html
@@ -0,0 +1,103 @@
+
+{% load static %}
+{% load projecttags %}
+
+<script src="{% static 'js/table.js' %}"></script>
+<script src="{% static 'js/layerBtn.js' %}"></script>
+<script>
+ $(document).ready(function() {
+ (function(){
+
+ var ctx = {
+ tableName : "{{table_name}}",
+ url : "{{ xhr_table_url }}?format=json",
+ title : "{{title}}",
+ projectLayers : {{projectlayers|json}},
+ };
+
+ try {
+ tableInit(ctx);
+ } catch (e) {
+ document.write("Problem loading table widget: " + e);
+ }
+ })();
+ });
+</script>
+
+{% include 'toastertable-filter.html' %}
+
+<div class="row-fluid" id="no-results-{{table_name}}" style="display:none">
+ <div class="alert">
+ <form class="no-results input-append">
+ <input class="input-xxlarge" id="new-search-input-{{table_name}}" name="search" type="text" placeholder="Search {{title|lower}}" value="{%if request.GET.search %}{{request.GET.search}}{%endif%}"/>
+ <a href="#" class="add-on btn remove-search-btn-{{table_name}}" tabindex="-1">
+ <i class="icon-remove"></i>
+ </a>
+ <button class="btn search-submit-{{table_name}}" >Search</button>
+ <button class="btn btn-link remove-search-btn-{{table_name}}">Show {{title|lower}}
+ </button>
+ </form>
+ </div>
+</div>
+
+<div id="table-container-{{table_name}}" style="visibility: hidden">
+ <!-- control header -->
+ <div class="navbar" id="table-chrome-{{table_name}}">
+ <div class="navbar-inner">
+ <div class="navbar-search input-append pull-left">
+
+ <input class="input-xxlarge" id="search-input-{{table_name}}" name="search" type="text" placeholder="Search {{title|lower}}" value="{%if request.GET.search%}{{request.GET.search}}{%endif%}"/>
+ <a href="#" style="display:none" class="add-on btn remove-search-btn-{{table_name}}" tabindex="-1">
+ <i class="icon-remove"></i>
+ </a>
+ <button class="btn" id="search-submit-{{table_name}}" >Search</button>
+ </div>
+
+ <div class="pull-right">
+ <div class="btn-group">
+ <button class="btn dropdown-toggle" data-toggle="dropdown">Edit columns
+ <span class="caret"></span>
+ </button>
+ <ul class="dropdown-menu editcol">
+ </ul>
+ </div>
+ <div style="display:inline">
+ <span class="divider-vertical"></span>
+ <span class="help-inline" style="padding-top:5px;">Show rows:</span>
+ <select style="margin-top:5px;margin-bottom:0px;" class="pagesize-{{table_name}}">
+ {% with "10 25 50 100 150" as list%}
+ {% for i in list.split %}
+ <option value="{{i}}">{{i}}</option>
+ {% endfor %}
+ {% endwith %}
+ </select>
+ </div>
+ </div>
+ </div>
+ </div>
+
+ <!-- The actual table -->
+ <table class="table table-bordered table-hover tablesorter" id="{{table_name}}">
+ <thead>
+ <tr><th></th></tr>
+ </thead>
+ <tbody></tbody>
+ </table>
+
+ <!-- Pagination controls -->
+ <div class="pagination pagination-centered" id="pagination-{{table_name}}">
+ <ul class="pagination" style="display: block-inline">
+ </ul>
+
+ <div class="pull-right">
+ <span class="help-inline" style="padding-top:5px;">Show rows:</span>
+ <select style="margin-top:5px;margin-bottom:0px;" class="pagesize-{{table_name}}">
+ {% with "10 25 50 100 150" as list%}
+ {% for i in list.split %}
+ <option value="{{i}}">{{i}}</option>
+ {% endfor %}
+ {% endwith %}
+ </select>
+ </div>
+ </div>
+</div>
diff --git a/bitbake/lib/toaster/toastergui/templates/unavailable_artifact.html b/bitbake/lib/toaster/toastergui/templates/unavailable_artifact.html
new file mode 100644
index 0000000..b9f8fee
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templates/unavailable_artifact.html
@@ -0,0 +1,15 @@
+{% extends "base.html" %}
+{% load projecttags %}
+{% load humanize %}
+{% load static %}
+
+{% block pagecontent %}
+
+<div class="row-fluid air">
+ <div class="alert alert-info span8 lead">
+ <p"> The build artifact you are trying to download no longer exists.</p>
+ <p><a href="javascript:window.history.back()">Back to previous page</a></p>
+ </div>
+</div>
+{% endblock %}
+
diff --git a/bitbake/lib/toaster/toastergui/templatetags/__init__.py b/bitbake/lib/toaster/toastergui/templatetags/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templatetags/__init__.py
diff --git a/bitbake/lib/toaster/toastergui/templatetags/projecttags.py b/bitbake/lib/toaster/toastergui/templatetags/projecttags.py
new file mode 100644
index 0000000..75f2261
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/templatetags/projecttags.py
@@ -0,0 +1,299 @@
+#
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2013 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from datetime import datetime, timedelta
+from os.path import relpath
+import re
+from django import template
+from django.utils import timezone
+from django.template.defaultfilters import filesizeformat
+import json as JsonLib
+from django.utils.safestring import mark_safe
+
+register = template.Library()
+
+@register.simple_tag
+def time_difference(start_time, end_time):
+ return end_time - start_time
+
+@register.filter(name = 'sectohms')
+def sectohms(time):
+ try:
+ tdsec = int(time)
+ except ValueError:
+ tdsec = 0
+ hours = int(tdsec / 3600)
+ return "%02d:%02d:%02d" % (hours, int((tdsec - (hours * 3600))/ 60), int(tdsec) % 60)
+
+
+@register.filter(name = 'get_tasks')
+def get_tasks(queryset):
+ return list(target + ':' + task if task else target \
+ for target, task in queryset.values_list('target', 'task'))
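+# For example, targets ("core-image-minimal", "") and ("busybox", "clean") yield the list
+# ["core-image-minimal", "busybox:clean"].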
+
+
+@register.filter(name = "json")
+def json(value, default = None):
+ # JSON spec says that "\/" is functionally identical to "/" to allow for HTML-tag embedding in JSON strings
+ # unfortunately, I can't find any option in the json module to turn on forward-slash escaping, so we do
+ # it manually here
+ return mark_safe(JsonLib.dumps(value, indent=2, default = default, ensure_ascii=False).replace('</', '<\\/'))
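+# Used to embed context data in inline scripts, e.g. projectLayers : {{projectlayers|json}} in
+# toastertable.html; escaping "</" prevents a literal "</script>" inside the data from closing the tag.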
+
+@register.assignment_tag
+def query(qs, **kwargs):
+ """ template tag which allows queryset filtering. Usage:
+ {% query books author=author as mybooks %}
+ {% for book in mybooks %}
+ ...
+ {% endfor %}
+ """
+ return qs.filter(**kwargs)
+
+
+@register.filter("whitespace_slice")
+def whitespace_space_filter(value, arg):
+ try:
+ bits = []
+ for x in arg.split(":"):
+ if len(x) == 0:
+ bits.append(None)
+ else:
+ # convert numeric value to the first whitespace after
+ first_whitespace = value.find(" ", int(x))
+ if first_whitespace == -1:
+ bits.append(int(x))
+ else:
+ bits.append(first_whitespace)
+ return value[slice(*bits)]
+ except (ValueError, TypeError):
+ raise
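+# Illustrative example: "alpha beta gamma"|whitespace_slice:":3" gives "alpha", because the numeric
+# end index 3 is extended to the first whitespace at or after that position before slicing.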
+
+@register.filter
+def divide(value, arg):
+ if int(arg) == 0:
+ return -1
+ return int(value) / int(arg)
+
+@register.filter
+def multiply(value, arg):
+ return int(value) * int(arg)
+
+@register.assignment_tag
+def datecompute(delta, start=None):
+    # Evaluate the default start time at call time rather than at module import time
+    if start is None:
+        start = timezone.now()
+    return start + timedelta(delta)
+
+
+@register.filter(name = 'sortcols')
+def sortcols(tablecols):
+ return sorted(tablecols, key = lambda t: t['name'])
+
+@register.filter
+def task_color(task_object, show_green=False):
+ """ Return css class depending on Task execution status and execution outcome.
+ By default, green is not returned for executed and successful tasks;
+ show_green argument should be True to get green color.
+ """
+ if not task_object.task_executed:
+ return 'class=muted'
+ elif task_object.outcome == task_object.OUTCOME_FAILED:
+ return 'class=error'
+ elif task_object.outcome == task_object.OUTCOME_SUCCESS and show_green:
+ return 'class=green'
+ else:
+ return ''
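+# Emitted directly as an HTML attribute, e.g. <tr {{ task|task_color }}> in tasks.html and
+# recipe.html; task.html passes show_green=True ({{ task|task_color:True }}) for the outcome heading.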
+
+@register.filter
+def filtered_icon(options, filter):
+ """Returns btn-primary if the filter matches one of the filter options
+ """
+ for option in options:
+ if filter == option[1]:
+ return "btn-primary"
+ if ('daterange' == option[1]) and filter.startswith(option[4]):
+ return "btn-primary"
+ return ""
+
+@register.filter
+def filtered_tooltip(options, filter):
+ """Returns tooltip for the filter icon if the filter matches one of the filter options
+ """
+ for option in options:
+ if filter == option[1]:
+ return "Showing only %s"%option[0]
+ if ('daterange' == option[1]) and filter.startswith(option[4]):
+ return "Showing only %s"%option[0]
+ return ""
+
+@register.filter
+def format_none_and_zero(value):
+ """Return empty string if the value is None, zero or Not Applicable
+ """
+ return "" if (not value) or (value == 0) or (value == "0") or (value == 'Not Applicable') else value
+
+@register.filter
+def filtered_filesizeformat(value):
+ """
+    If the value is -1 return an empty string. Otherwise, return
+    Django's filesizeformat output with 'bytes' shortened to 'B'.
+ """
+ if value == -1:
+ return ''
+
+ return filesizeformat(value).replace("bytes", "B")
+
+@register.filter
+def filtered_packagespec(value):
+ """Strip off empty version and revision"""
+ return re.sub(r'(--$)', '', value)
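+# For example, a spec with empty version and revision such as "busybox--" becomes "busybox";
+# a full spec like "busybox-1.23.1-r0" is returned unchanged.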
+
+@register.filter
+def check_filter_status(options, filter):
+ """Check if the active filter is among the available options, and return 'checked'
+ if filter is not active.
+ Used in FilterDialog to select the first radio button if the filter is not active.
+ """
+ for option in options:
+ if filter == option[1]:
+ return ""
+ return "checked"
+
+@register.filter
+def variable_parent_name(value):
+ """ filter extended variable names to the parent name
+ """
+ value=re.sub('_\$.*', '', value)
+ return re.sub('_[a-z].*', '', value)
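+# For example, "IMAGE_INSTALL_append" reduces to its parent variable name "IMAGE_INSTALL".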
+
+@register.filter
+def filter_setin_files(file_list, matchstr):
+ """Filter/search the 'set in' file lists."""
+ result = []
+ search, filter = matchstr.split(':')
+ for pattern in (search, filter):
+ if pattern:
+ for fobj in file_list:
+ fname = fobj.file_name
+ if fname not in result and re.search(pattern, fname):
+ result.append(fname)
+
+    # no filter: also show the last file, if the list is not empty
+    if not filter:
+        files = list(file_list)
+        if files and files[-1].file_name not in result:
+            result.append(files[-1].file_name)
+
+ return result
+
+@register.filter
+def string_slice(strvar,slicevar):
+ """ slice a string with |string_slice:'[first]:[last]'
+ """
+ first,last= slicevar.partition(':')[::2]
+ if first=='':
+ return strvar[:int(last)]
+ elif last=='':
+ return strvar[int(first):]
+ else:
+ return strvar[int(first):int(last)]
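+# For example, "1.7.1"|string_slice:":3" gives "1.7" and "1.7.1"|string_slice:"2:" gives "7.1".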
+
+@register.filter
+def string_remove_regex(value,ex):
+ """ remove sub-string of string that matches regex
+ """
+ return re.sub(ex, '', value)
+
+@register.filter
+def filtered_installedsize(size, installed_size):
+ """If package.installed_size not null and not empty return it,
+ else return package.size
+ """
+ return size if (installed_size == 0) or (installed_size == "") or (installed_size == None) else installed_size
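+# For example, size 4096 with installed_size 0 gives 4096; with installed_size 5120 it gives 5120.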
+
+@register.filter
+def filtered_packageversion(version, revision):
+ """ Emit "version-revision" if version and revision are not null
+ else "version" if version is not null
+ else ""
+ """
+ return "" if (not version or version == "") else version if (not revision or revision == "") else version + "-" + revision
+
+@register.filter
+def filter_sizeovertotal(package_object, total_size):
+ """ Return the % size of the package over the total size argument
+ formatted nicely.
+ """
+ size = package_object.installed_size
+ if size == None or size == '':
+ size = package_object.size
+
+ return '{:.1%}'.format(float(size)/float(total_size))
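+# For example, a package with installed_size 512 against a total of 2048 renders as "25.0%".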
+
+from django.utils.safestring import mark_safe
+@register.filter
+def format_vpackage_rowclass(size):
+ if size == -1:
+ return mark_safe('class="muted"')
+ return ''
+
+@register.filter
+def format_vpackage_namehelp(name):
+ r = name + ' '
+ r += '<i class="icon-question-sign get-help hover-help"'
+ r += ' title = "' + name + ' has not been built">'
+ r += '</i>'
+ return mark_safe(r)
+
+@register.filter
+def get_dict_value(dictionary, key):
+ """ return the value of a dictionary key
+ """
+ try:
+ return dictionary[key]
+ except (KeyError, IndexError):
+ return ''
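+# Used with per-row dictionaries computed in the view, e.g. {% with deps=recipe_deps|get_dict_value:recipe.pk %}
+# in recipes.html; a missing key renders as an empty string instead of raising an error.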
+
+@register.filter
+def format_build_date(completed_on):
+ now = timezone.now()
+ delta = now - completed_on
+
+ if delta.days >= 1:
+ return True
+
+@register.filter
+def is_shaid(text):
+ """ return True if text length is 40 characters and all hex-digits
+ """
+ try:
+ int(text, 16)
+ if len(text) == 40:
+ return True
+ return False
+ except ValueError:
+ return False
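+# For example, a full 40-character hexadecimal git commit id returns True; "master" (not hex)
+# or an abbreviated id of a different length returns False.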
+
+@register.filter
+def cut_path_prefix(fullpath, prefixes):
+ """Cut path prefix from fullpath."""
+ for prefix in prefixes:
+ if fullpath.startswith(prefix):
+ return relpath(fullpath, prefix)
+ return fullpath
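+# Illustrative example (paths assumed): cut_path_prefix("/home/user/poky/meta/recipes-core/busybox/busybox_1.23.1.bb",
+# ["/home/user/poky"]) returns "meta/recipes-core/busybox/busybox_1.23.1.bb".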
diff --git a/bitbake/lib/toaster/toastergui/tests.py b/bitbake/lib/toaster/toastergui/tests.py
new file mode 100644
index 0000000..1a8b478
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/tests.py
@@ -0,0 +1,294 @@
+#! /usr/bin/env python
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2013-2015 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+"""Test cases for Toaster GUI and ReST."""
+
+from django.test import TestCase
+from django.core.urlresolvers import reverse
+from django.utils import timezone
+from orm.models import Project, Release, BitbakeVersion, Build
+from orm.models import ReleaseLayerSourcePriority, LayerSource, Layer
+from orm.models import Layer_Version, Recipe, Machine, ProjectLayer
+import json
+
+PROJECT_NAME = "test project"
+
+class ViewTests(TestCase):
+ """Tests to verify view APIs."""
+
+ def setUp(self):
+ bbv = BitbakeVersion.objects.create(name="test bbv", giturl="/tmp/",
+ branch="master", dirpath="")
+ release = Release.objects.create(name="test release",
+ bitbake_version=bbv)
+ self.project = Project.objects.create_project(name=PROJECT_NAME,
+ release=release)
+
+ layersrc = LayerSource.objects.create(sourcetype=LayerSource.TYPE_IMPORTED)
+ self.priority = ReleaseLayerSourcePriority.objects.create(release=release,
+ layer_source=layersrc)
+ layer = Layer.objects.create(name="base-layer", layer_source=layersrc,
+ vcs_url="/tmp/")
+
+ lver = Layer_Version.objects.create(layer=layer, project=self.project,
+ layer_source=layersrc, commit="master")
+
+ Recipe.objects.create(layer_source=layersrc, name="base-recipe",
+ version="1.2", summary="one recipe",
+ description="recipe", layer_version=lver)
+
+ Machine.objects.create(layer_version=lver, name="wisk",
+ description="wisking machine")
+
+ ProjectLayer.objects.create(project=self.project, layercommit=lver)
+
+ self.assertTrue(lver in self.project.compatible_layerversions())
+
+ def test_get_base_call_returns_html(self):
+ """Basic test for all-projects view"""
+ response = self.client.get(reverse('all-projects'), follow=True)
+ self.assertEqual(response.status_code, 200)
+ self.assertTrue(response['Content-Type'].startswith('text/html'))
+ self.assertTemplateUsed(response, "projects.html")
+ self.assertTrue(PROJECT_NAME in response.content)
+
+ def test_get_json_call_returns_json(self):
+ """Test for all projects output in json format"""
+ url = reverse('all-projects')
+ response = self.client.get(url, {"format": "json"}, follow=True)
+ self.assertEqual(response.status_code, 200)
+ self.assertTrue(response['Content-Type'].startswith('application/json'))
+
+ data = json.loads(response.content)
+
+ self.assertTrue("error" in data)
+ self.assertEqual(data["error"], "ok")
+ self.assertTrue("rows" in data)
+
+ self.assertTrue(PROJECT_NAME in [x["name"] for x in data["rows"]])
+ self.assertTrue("id" in data["rows"][0])
+
+ self.assertEqual(sorted(data["rows"][0]),
+ ['bitbake_version_id', 'created', 'id',
+ 'is_default', 'layersTypeAheadUrl', 'name',
+ 'num_builds', 'projectBuildsUrl', 'projectPageUrl',
+ 'recipesTypeAheadUrl', 'release_id',
+ 'short_description', 'updated', 'user_id'])
+
+ def test_typeaheads(self):
+ """Test typeahead ReST API"""
+ layers_url = reverse('xhr_layerstypeahead', args=(self.project.id,))
+ prj_url = reverse('xhr_projectstypeahead')
+
+ urls = [layers_url,
+ prj_url,
+ reverse('xhr_recipestypeahead', args=(self.project.id,)),
+ reverse('xhr_machinestypeahead', args=(self.project.id,)),
+ ]
+
+ def basic_reponse_check(response, url):
+ """Check data structure of http response."""
+ self.assertEqual(response.status_code, 200)
+ self.assertTrue(response['Content-Type'].startswith('application/json'))
+
+ data = json.loads(response.content)
+
+ self.assertTrue("error" in data)
+ self.assertEqual(data["error"], "ok")
+ self.assertTrue("results" in data)
+
+ # We got a result so now check the fields
+ if len(data['results']) > 0:
+ result = data['results'][0]
+
+ self.assertTrue(len(result['name']) > 0)
+ self.assertTrue("detail" in result)
+ self.assertTrue(result['id'] > 0)
+
+ # Special check for the layers typeahead's extra fields
+ if url == layers_url:
+ self.assertTrue(len(result['layerdetailurl']) > 0)
+ self.assertTrue(len(result['vcs_url']) > 0)
+ self.assertTrue(len(result['vcs_reference']) > 0)
+ # Special check for project typeahead extra fields
+ elif url == prj_url:
+ self.assertTrue(len(result['projectPageUrl']) > 0)
+
+ return True
+
+ return False
+
+ import string
+
+ for url in urls:
+ results = False
+
+ for typeing in list(string.ascii_letters):
+ response = self.client.get(url, {'search': typeing})
+ results = basic_reponse_check(response, url)
+ if results:
+ break
+
+ # After "typeing" the whole alphabet we should have a True result
+ # from each of the urls
+ self.assertTrue(results)
+
+ def test_xhr_import_layer(self):
+ """Test xhr_importlayer API"""
+ #Test for importing an already existing layer
+ args = {'vcs_url' : "git://git.example.com/test",
+ 'name' : "base-layer",
+ 'git_ref': "c12b9596afd236116b25ce26dbe0d793de9dc7ce",
+ 'project_id': 1, 'dir_path' : "/path/in/repository"}
+ response = self.client.post(reverse('xhr_importlayer'), args)
+ data = json.loads(response.content)
+ self.assertEqual(response.status_code, 200)
+ self.assertNotEqual(data["error"], "ok")
+
+ #Test that a layer can be imported successfully
+ args['name'] = "meta-oe"
+ response = self.client.post(reverse('xhr_importlayer'), args)
+ data = json.loads(response.content)
+ self.assertEqual(data["error"], "ok")
+
+ #Test for html tag in the data
+ args['<'] = "testing html tag"
+ response = self.client.post(reverse('xhr_importlayer'), args)
+ data = json.loads(response.content)
+ self.assertNotEqual(data["error"], "ok")
+
+ #Empty data passed
+ args = {}
+ response = self.client.post(reverse('xhr_importlayer'), args)
+ data = json.loads(response.content)
+ self.assertNotEqual(data["error"], "ok")
+
+class LandingPageTests(TestCase):
+ """ Tests for redirects on the landing page """
+ # disable bogus pylint message error:
+ # "Instance of 'WSGIRequest' has no 'url' member (no-member)"
+ # (see https://github.com/landscapeio/pylint-django/issues/42)
+ # pylint: disable=E1103
+
+ LANDING_PAGE_TITLE = 'This is Toaster'
+
+ def setUp(self):
+ """ Add default project manually """
+ self.project = Project.objects.create_project('foo', None)
+ self.project.is_default = True
+ self.project.save()
+
+ def test_only_default_project(self):
+ """
+ No projects except default
+ => get the landing page
+ """
+ response = self.client.get(reverse('landing'))
+ self.assertTrue(self.LANDING_PAGE_TITLE in response.content)
+
+ def test_default_project_has_build(self):
+ """
+ Default project has a build, no other projects
+ => get the builds page
+ """
+ now = timezone.now()
+ build = Build.objects.create(project=self.project,
+ started_on=now,
+ completed_on=now)
+ build.save()
+
+ response = self.client.get(reverse('landing'))
+ self.assertEqual(response.status_code, 302,
+ 'response should be a redirect')
+ self.assertTrue('/builds' in response.url,
+ 'should redirect to builds')
+
+ def test_user_project_exists(self):
+ """
+ User has added a project (without builds)
+ => get the projects page
+ """
+ user_project = Project.objects.create_project('foo', None)
+ user_project.save()
+
+ response = self.client.get(reverse('landing'))
+ self.assertEqual(response.status_code, 302,
+ 'response should be a redirect')
+ self.assertTrue('/projects' in response.url,
+ 'should redirect to projects')
+
+ def test_user_project_has_build(self):
+ """
+ User has added a project (with builds)
+ => get the builds page
+ """
+ user_project = Project.objects.create_project('foo', None)
+ user_project.save()
+
+ now = timezone.now()
+ build = Build.objects.create(project=user_project,
+ started_on=now,
+ completed_on=now)
+ build.save()
+
+ response = self.client.get(reverse('landing'))
+ self.assertEqual(response.status_code, 302,
+ 'response should be a redirect')
+ self.assertTrue('/builds' in response.url,
+ 'should redirect to builds')
+
+class ProjectsPageTests(TestCase):
+ """ Tests for projects page """
+
+ PROJECT_NAME = 'cli builds'
+
+ def setUp(self):
+ """ Add default project manually """
+ project = Project.objects.create_project(self.PROJECT_NAME, None)
+ self.default_project = project
+ self.default_project.is_default = True
+ self.default_project.save()
+
+ def test_default_project_hidden(self):
+ """ The default project should be hidden if it has no builds """
+ params = {"count": 10, "orderby": "updated:-", "page": 1}
+ response = self.client.get(reverse('all-projects'), params)
+
+ self.assertFalse('tr class="data"' in response.content,
+ 'should be no project rows in the page')
+ self.assertFalse(self.PROJECT_NAME in response.content,
+ 'default project "cli builds" should not be in page')
+
+ def test_default_project_has_build(self):
+ """ The default project should be shown if it has builds """
+ now = timezone.now()
+ build = Build.objects.create(project=self.default_project,
+ started_on=now,
+ completed_on=now)
+ build.save()
+
+ params = {"count": 10, "orderby": "updated:-", "page": 1}
+ response = self.client.get(reverse('all-projects'), params)
+
+ self.assertTrue('tr class="data"' in response.content,
+ 'should be a project row in the page')
+ self.assertTrue(self.PROJECT_NAME in response.content,
+ 'default project "cli builds" should be in page')
diff --git a/bitbake/lib/toaster/toastergui/typeaheads.py b/bitbake/lib/toaster/toastergui/typeaheads.py
new file mode 100644
index 0000000..d5bec58
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/typeaheads.py
@@ -0,0 +1,145 @@
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2015 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from toastergui.widgets import ToasterTypeAhead
+from orm.models import Project
+from django.core.urlresolvers import reverse
+
+class LayersTypeAhead(ToasterTypeAhead):
+ """ Typeahead for layers available and not added in the current project's
+ configuration """
+ def __init__(self):
+ super(LayersTypeAhead, self).__init__()
+
+ def apply_search(self, search_term, prj, request):
+ layers = prj.compatible_layerversions()
+ layers = layers.order_by('layer__name')
+
+ # Unlike the other typeaheads, we don't want to show suggestions for
+ # layers already in the project, unless explicitly requested (e.g. when
+ # adding layer dependencies to a new layer).
+ if ("include_added" in request.GET and
+ request.GET['include_added'] != "true"):
+ layers = layers.exclude(pk__in=prj.projectlayer_equivalent_set)
+
+ primary_results = layers.filter(layer__name__istartswith=search_term)
+ secondary_results = layers.filter(layer__name__icontains=search_term).exclude(pk__in=primary_results)
+
+ results = []
+
+ for layer_version in list(primary_results) + list(secondary_results):
+ vcs_reference = layer_version.get_vcs_reference()
+
+ detail = "[ %s | %s ]" % (layer_version.layer.vcs_url,
+ vcs_reference)
+ needed_fields = {
+ 'id' : layer_version.pk,
+ 'name' : layer_version.layer.name,
+ 'layerdetailurl' : layer_version.get_detailspage_url(prj.pk),
+ 'vcs_url' : layer_version.layer.vcs_url,
+ 'vcs_reference' : vcs_reference,
+ 'detail' : detail,
+ }
+
+ results.append(needed_fields)
+
+ return results
+
+class MachinesTypeAhead(ToasterTypeAhead):
+ """ Typeahead for all the machines available in the current project's
+ configuration """
+ def __init__(self):
+ super(MachinesTypeAhead, self).__init__()
+
+ def apply_search(self, search_term, prj, request):
+ machines = prj.get_available_machines()
+ machines = machines.order_by("name")
+
+ primary_results = machines.filter(name__istartswith=search_term)
+ secondary_results = machines.filter(name__icontains=search_term).exclude(pk__in=primary_results)
+ tertiary_results = machines.filter(layer_version__layer__name__icontains=search_term).exclude(pk__in=primary_results).exclude(pk__in=secondary_results)
+
+ results = []
+
+ for machine in list(primary_results) + list(secondary_results) + list(tertiary_results):
+
+ detail = "[ %s ]" % (machine.layer_version.layer.name)
+ needed_fields = {
+ 'id' : machine.pk,
+ 'name' : machine.name,
+ 'detail' : detail,
+ }
+
+ results.append(needed_fields)
+
+ return results
+
+class RecipesTypeAhead(ToasterTypeAhead):
+ """ Typeahead for all the recipes available in the current project's
+ configuration """
+ def __init__(self):
+ super(RecipesTypeAhead, self).__init__()
+
+ def apply_search(self, search_term, prj, request):
+ recipes = prj.get_available_recipes()
+ recipes = recipes.order_by("name")
+
+
+ primary_results = recipes.filter(name__istartswith=search_term)
+ secondary_results = recipes.filter(name__icontains=search_term).exclude(pk__in=primary_results)
+ tertiary_results = recipes.filter(layer_version__layer__name__icontains=search_term).exclude(pk__in=primary_results).exclude(pk__in=secondary_results)
+
+ results = []
+
+ for recipe in list(primary_results) + list(secondary_results) + list(tertiary_results):
+
+ detail = "[ %s ]" % (recipe.layer_version.layer.name)
+ needed_fields = {
+ 'id' : recipe.pk,
+ 'name' : recipe.name,
+ 'detail' : detail,
+ }
+
+ results.append(needed_fields)
+
+ return results
+
+class ProjectsTypeAhead(ToasterTypeAhead):
+ """ Typeahead for all the projects """
+ def __init__(self):
+ super(ProjectsTypeAhead, self).__init__()
+
+ def apply_search(self, search_term, prj, request):
+ projects = Project.objects.all().order_by("name")
+
+ primary_results = projects.filter(name__istartswith=search_term)
+ secondary_results = projects.filter(name__icontains=search_term).exclude(pk__in=primary_results)
+
+ results = []
+
+ for project in list(primary_results) + list(secondary_results):
+ needed_fields = {
+ 'id' : project.pk,
+ 'name' : project.name,
+ 'detail' : "",
+ 'projectPageUrl' : reverse('project', args=(project.pk,))
+ }
+
+ results.append(needed_fields)
+
+ return results
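+# These classes are exposed as class-based views under /xhr_typeahead/ in
+# toastergui/urls.py; clients issue GET requests with a "search" parameter and
+# receive JSON with "error" and "results" keys, as exercised by
+# ViewTests.test_typeaheads in toastergui/tests.py. The shared request
+# handling comes from the ToasterTypeAhead base class imported from
+# toastergui.widgets.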
diff --git a/bitbake/lib/toaster/toastergui/urls.py b/bitbake/lib/toaster/toastergui/urls.py
new file mode 100644
index 0000000..46e5761
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/urls.py
@@ -0,0 +1,153 @@
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2013 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from django.conf.urls import patterns, include, url
+from django.views.generic import RedirectView, TemplateView
+
+from django.http import HttpResponseBadRequest
+from toastergui import tables
+from toastergui import typeaheads
+
+urlpatterns = patterns('toastergui.views',
+ # landing page
+ url(r'^landing/$', 'landing', name='landing'),
+
+ url(r'^builds/$', 'builds', name='all-builds'),
+ # build info navigation
+ url(r'^build/(?P<build_id>\d+)$', 'builddashboard', name="builddashboard"),
+
+ url(r'^build/(?P<build_id>\d+)/tasks/$', 'tasks', name='tasks'),
+ url(r'^build/(?P<build_id>\d+)/tasks/(?P<task_id>\d+)/$', 'tasks_task', name='tasks_task'),
+ url(r'^build/(?P<build_id>\d+)/task/(?P<task_id>\d+)$', 'task', name='task'),
+
+ url(r'^build/(?P<build_id>\d+)/recipes/$', 'recipes', name='recipes'),
+ url(r'^build/(?P<build_id>\d+)/recipe/(?P<recipe_id>\d+)/active_tab/(?P<active_tab>\d{1})$', 'recipe', name='recipe'),
+ url(r'^build/(?P<build_id>\d+)/recipe/(?P<recipe_id>\d+)$', 'recipe', name='recipe'),
+ url(r'^build/(?P<build_id>\d+)/recipe_packages/(?P<recipe_id>\d+)$', 'recipe_packages', name='recipe_packages'),
+
+ url(r'^build/(?P<build_id>\d+)/packages/$', 'bpackage', name='packages'),
+ url(r'^build/(?P<build_id>\d+)/package/(?P<package_id>\d+)$', 'package_built_detail',
+ name='package_built_detail'),
+ url(r'^build/(?P<build_id>\d+)/package_built_dependencies/(?P<package_id>\d+)$',
+ 'package_built_dependencies', name='package_built_dependencies'),
+ url(r'^build/(?P<build_id>\d+)/package_included_detail/(?P<target_id>\d+)/(?P<package_id>\d+)$',
+ 'package_included_detail', name='package_included_detail'),
+ url(r'^build/(?P<build_id>\d+)/package_included_dependencies/(?P<target_id>\d+)/(?P<package_id>\d+)$',
+ 'package_included_dependencies', name='package_included_dependencies'),
+ url(r'^build/(?P<build_id>\d+)/package_included_reverse_dependencies/(?P<target_id>\d+)/(?P<package_id>\d+)$',
+ 'package_included_reverse_dependencies', name='package_included_reverse_dependencies'),
+
+ # images are known as targets in the internal model
+ url(r'^build/(?P<build_id>\d+)/target/(?P<target_id>\d+)$', 'target', name='target'),
+ url(r'^build/(?P<build_id>\d+)/target/(?P<target_id>\d+)/targetpkg$', 'targetpkg', name='targetpkg'),
+ url(r'^dentries/build/(?P<build_id>\d+)/target/(?P<target_id>\d+)$', 'xhr_dirinfo', name='dirinfo_ajax'),
+ url(r'^build/(?P<build_id>\d+)/target/(?P<target_id>\d+)/dirinfo$', 'dirinfo', name='dirinfo'),
+ url(r'^build/(?P<build_id>\d+)/target/(?P<target_id>\d+)/dirinfo_filepath/_(?P<file_path>(?:/[^/\n]+)*)$', 'dirinfo', name='dirinfo_filepath'),
+ url(r'^build/(?P<build_id>\d+)/configuration$', 'configuration', name='configuration'),
+ url(r'^build/(?P<build_id>\d+)/configvars$', 'configvars', name='configvars'),
+ url(r'^build/(?P<build_id>\d+)/buildtime$', 'buildtime', name='buildtime'),
+ url(r'^build/(?P<build_id>\d+)/cpuusage$', 'cpuusage', name='cpuusage'),
+ url(r'^build/(?P<build_id>\d+)/diskio$', 'diskio', name='diskio'),
+
+ # image information dir
+ url(r'^build/(?P<build_id>\d+)/target/(?P<target_id>\d+)/packagefile/(?P<packagefile_id>\d+)$',
+ 'image_information_dir', name='image_information_dir'),
+
+ # build download artifact
+ url(r'^build/(?P<build_id>\d+)/artifact/(?P<artifact_type>\w+)/id/(?P<artifact_id>\w+)', 'build_artifact', name="build_artifact"),
+
+ # project URLs
+ url(r'^newproject/$', 'newproject', name='newproject'),
+
+
+ url(r'^projects/$', 'projects', name='all-projects'),
+
+ url(r'^project/(?P<pid>\d+)/$', 'project', name='project'),
+ url(r'^project/(?P<pid>\d+)/configuration$', 'projectconf', name='projectconf'),
+ url(r'^project/(?P<pid>\d+)/builds/$', 'projectbuilds', name='projectbuilds'),
+
+ # importing a layer is project-specific functionality
+ url(r'^project/(?P<pid>\d+)/importlayer$', 'importlayer', name='importlayer'),
+
+ # the table pages that have been converted to ToasterTable widget
+ url(r'^project/(?P<pid>\d+)/machines/$',
+ tables.MachinesTable.as_view(template_name="generic-toastertable-page.html"),
+ { 'table_name': tables.MachinesTable.__name__.lower(),
+ 'title' : 'Compatible machines' },
+ name="projectmachines"),
+
+ url(r'^project/(?P<pid>\d+)/recipes/$',
+ tables.RecipesTable.as_view(template_name="generic-toastertable-page.html"),
+ { 'table_name': tables.RecipesTable.__name__.lower(),
+ 'title' : 'Compatible recipes' },
+ name="projecttargets"),
+
+ url(r'^project/(?P<pid>\d+)/availablerecipes/$',
+ tables.ProjectLayersRecipesTable.as_view(template_name="generic-toastertable-page.html"),
+ { 'table_name': tables.ProjectLayersRecipesTable.__name__.lower(),
+ 'title' : 'Recipes available for layers in the current project' },
+ name="projectavailabletargets"),
+
+ url(r'^project/(?P<pid>\d+)/layers/$',
+ tables.LayersTable.as_view(template_name="generic-toastertable-page.html"),
+ { 'table_name': tables.LayersTable.__name__.lower(),
+ 'title' : 'Compatible layers' },
+ name="projectlayers"),
+
+ url(r'^project/(?P<pid>\d+)/layer/(?P<layerid>\d+)$',
+ 'layerdetails', name='layerdetails'),
+
+ url(r'^project/(?P<pid>\d+)/layer/(?P<layerid>\d+)/recipes/$',
+ tables.LayerRecipesTable.as_view(template_name="generic-toastertable-page.html"),
+ { 'table_name': tables.LayerRecipesTable.__name__.lower(),
+ 'title' : 'All recipes in layer' },
+ name=tables.LayerRecipesTable.__name__.lower()),
+
+ url(r'^project/(?P<pid>\d+)/layer/(?P<layerid>\d+)/machines/$',
+ tables.LayerMachinesTable.as_view(template_name="generic-toastertable-page.html"),
+ { 'table_name': tables.LayerMachinesTable.__name__.lower(),
+ 'title' : 'All machines in layer' },
+ name=tables.LayerMachinesTable.__name__.lower()),
+
+
+ # typeahead api end points
+ url(r'^xhr_typeahead/(?P<pid>\d+)/layers$',
+ typeaheads.LayersTypeAhead.as_view(), name='xhr_layerstypeahead'),
+ url(r'^xhr_typeahead/(?P<pid>\d+)/machines$',
+ typeaheads.MachinesTypeAhead.as_view(), name='xhr_machinestypeahead'),
+ url(r'^xhr_typeahead/(?P<pid>\d+)/recipes$',
+ typeaheads.RecipesTypeAhead.as_view(), name='xhr_recipestypeahead'),
+ url(r'^xhr_typeahead/projects$',
+ typeaheads.ProjectsTypeAhead.as_view(), name='xhr_projectstypeahead'),
+
+
+
+ url(r'^xhr_testreleasechange/(?P<pid>\d+)$', 'xhr_testreleasechange',
+ name='xhr_testreleasechange'),
+ url(r'^xhr_configvaredit/(?P<pid>\d+)$', 'xhr_configvaredit',
+ name='xhr_configvaredit'),
+
+ url(r'^xhr_importlayer/$', 'xhr_importlayer', name='xhr_importlayer'),
+ url(r'^xhr_updatelayer/$', 'xhr_updatelayer', name='xhr_updatelayer'),
+
+ # JS Unit tests
+ url(r'^js-unit-tests/$', 'jsunittests', name='js-unit-tests'),
+
+ # default redirection
+ url(r'^$', RedirectView.as_view( url= 'landing')),
+)
diff --git a/bitbake/lib/toaster/toastergui/views.py b/bitbake/lib/toaster/toastergui/views.py
new file mode 100755
index 0000000..4e8f69e
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/views.py
@@ -0,0 +1,2912 @@
+#
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2013 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+# pylint: disable=method-hidden
+# Gives E:848, 4: An attribute defined in json.encoder line 162 hides this method (method-hidden)
+# which is an invalid warning
+
+import operator,re
+
+from django.db.models import F, Q, Sum, Count, Max
+from django.db import IntegrityError
+from django.shortcuts import render, redirect
+from orm.models import Build, Target, Task, Layer, Layer_Version, Recipe, LogMessage, Variable
+from orm.models import Task_Dependency, Recipe_Dependency, Package, Package_File, Package_Dependency
+from orm.models import Target_Installed_Package, Target_File, Target_Image_File, BuildArtifact
+from orm.models import BitbakeVersion
+from bldcontrol import bbcontroller
+from django.views.decorators.cache import cache_control
+from django.core.urlresolvers import reverse, resolve
+from django.core.exceptions import MultipleObjectsReturned
+from django.core.paginator import Paginator, EmptyPage, PageNotAnInteger
+from django.http import HttpResponseBadRequest, HttpResponseNotFound
+from django.utils import timezone
+from django.utils.html import escape
+from datetime import timedelta, datetime, date
+from django.utils import formats
+from toastergui.templatetags.projecttags import json as jsonfilter
+import json
+from os.path import dirname
+import itertools
+
+import logging
+
+logger = logging.getLogger("toaster")
+
+# all new sessions should come through the landing page;
+# determine which mode we are running in, and redirect appropriately
+def landing(request):
+ # we only redirect to projects page if there is a user-generated project
+ user_projects = Project.objects.filter(is_default = False)
+ has_user_project = user_projects.count() > 0
+
+ if Build.objects.count() == 0 and has_user_project:
+ return redirect(reverse('all-projects'), permanent = False)
+
+ if Build.objects.all().count() > 0:
+ return redirect(reverse('all-builds'), permanent = False)
+
+ context = {'lvs_nos' : Layer_Version.objects.all().count()}
+
+ return render(request, 'landing.html', context)
+
+
+
+# returns a list of the most recent builds: any in-progress builds first,
+# then the three most recently completed builds
+def _get_latest_builds(prj=None):
+ queryset = Build.objects.all()
+
+ if prj is not None:
+ queryset = queryset.filter(project = prj)
+
+ return list(itertools.chain(
+ queryset.filter(outcome=Build.IN_PROGRESS).order_by("-pk"),
+ queryset.filter(outcome__lt=Build.IN_PROGRESS).order_by("-pk")[:3] ))
+
+
+# a JSON-able dict of recent builds; for use in the Project page, xhr_ updates, and other places, as needed
+def _project_recent_build_list(prj):
+ data = []
+ # take the most recent 3 completed builds, plus any builds in progress
+ for x in _get_latest_builds(prj):
+ d = {
+ "id": x.pk,
+ "targets" : map(lambda y: {"target": y.target, "task": y.task }, x.target_set.all()), # TODO: create the task entry in the Target table
+ "status": x.get_current_status(),
+ "errors": map(lambda y: {"type": y.lineno, "msg": y.message, "tb": y.pathname}, (x.logmessage_set.filter(level__gte=LogMessage.WARNING)|x.logmessage_set.filter(level=LogMessage.EXCEPTION))),
+ "updated": x.completed_on.strftime('%s')+"000",
+ "command_time": (x.completed_on - x.started_on).total_seconds(),
+ "br_page_url": reverse('builddashboard', args=(x.pk,) ),
+ "build" : map( lambda y: {"id": y.pk,
+ "status": y.get_outcome_display(),
+ "completed_on" : y.completed_on.strftime('%s')+"000",
+ "build_time" : (y.completed_on - y.started_on).total_seconds(),
+ "build_page_url" : reverse('builddashboard', args=(y.pk,)),
+ 'build_time_page_url': reverse('buildtime', args=(y.pk,)),
+ "errors": y.errors.count(),
+ "warnings": y.warnings.count(),
+ "completeper": y.completeper() if y.outcome == Build.IN_PROGRESS else "0",
+ "eta": y.eta().strftime('%s')+"000" if y.outcome == Build.IN_PROGRESS else "0",
+ }, [x]),
+ }
+ data.append(d)
+
+ return data
+
+
+
+def objtojson(obj):
+ from django.db.models.query import QuerySet
+ from django.db.models import Model
+
+ if isinstance(obj, datetime):
+ return obj.isoformat()
+ elif isinstance(obj, timedelta):
+ return obj.total_seconds()
+ elif isinstance(obj, QuerySet) or isinstance(obj, set):
+ return list(obj)
+ elif type(obj).__name__ == "RelatedManager":
+ return [x.pk for x in obj.all()]
+ elif hasattr( obj, '__dict__') and isinstance(obj, Model):
+ d = obj.__dict__
+ nd = dict(d)
+ for di in d.keys():
+ if di.startswith("_"):
+ del nd[di]
+ elif isinstance(d[di], Model):
+ nd[di] = d[di].pk
+ elif isinstance(d[di], int) and hasattr(obj, "get_%s_display" % di):
+ nd[di] = getattr(obj, "get_%s_display" % di)()
+ return nd
+ elif isinstance( obj, type(lambda x:x)):
+ import inspect
+ return inspect.getsourcelines(obj)[0]
+ else:
+ raise TypeError("Unserializable object %s (%s) of type %s" % ( obj, dir(obj), type(obj)))
+
+
+def _template_renderer(template):
+ def func_wrapper(view):
+ def returned_wrapper(request, *args, **kwargs):
+ try:
+ context = view(request, *args, **kwargs)
+ except RedirectException as e:
+ return e.get_redirect_response()
+
+ if request.GET.get('format', None) == 'json':
+ # objects is a special keyword - it's a Page, but we need the actual objects here
+ # in XHR, the objects come in the "rows" property
+ if "objects" in context:
+ context["rows"] = context["objects"].object_list
+ del context["objects"]
+
+ # we're about to return; to keep up with the XHR API, we set the error to OK
+ context["error"] = "ok"
+
+ return HttpResponse(jsonfilter(context, default=objtojson ),
+ content_type = "application/json; charset=utf-8")
+ else:
+ return render(request, template, context)
+ return returned_wrapper
+ return func_wrapper
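+# _template_renderer is used as a decorator around view functions that return
+# a context dictionary rather than an HttpResponse. An illustrative sketch
+# (not an exact view from this file):
+#
+#   @_template_renderer("projects.html")
+#   def projects(request):
+#       ...
+#       return {'objects': projects_page, 'default_orderby': 'updated:-'}
+#
+# A normal request renders the dictionary with the given template; appending
+# ?format=json returns the same context as JSON, with "objects" exposed as
+# "rows" and "error" set to "ok" (the structure checked by
+# ViewTests.test_get_json_call_returns_json in toastergui/tests.py).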
+
+
+def _lv_to_dict(prj, x = None):
+ if x is None:
+ def wrapper(x):
+ return _lv_to_dict(prj, x)
+ return wrapper
+
+ return {"id": x.pk,
+ "name": x.layer.name,
+ "tooltip": "%s | %s" % (x.layer.vcs_url,x.get_vcs_reference()),
+ "detail": "(%s" % x.layer.vcs_url + (")" if x.up_branch == None else " | "+x.get_vcs_reference()+")"),
+ "giturl": x.layer.vcs_url,
+ "layerdetailurl" : reverse('layerdetails', args=(prj.id,x.pk)),
+ "revision" : x.get_vcs_reference(),
+ }
+
+
+def _build_page_range(paginator, index = 1):
+ try:
+ page = paginator.page(index)
+ except PageNotAnInteger:
+ page = paginator.page(1)
+ except EmptyPage:
+ page = paginator.page(paginator.num_pages)
+
+
+ page.page_range = [page.number]
+ crt_range = 0
+ for i in range(1,5):
+ if (page.number + i) <= paginator.num_pages:
+ page.page_range = page.page_range + [ page.number + i]
+ crt_range +=1
+ if (page.number - i) > 0:
+ page.page_range = [page.number -i] + page.page_range
+ crt_range +=1
+ if crt_range == 4:
+ break
+ return page
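+# _build_page_range() attaches a sliding window of up to five page numbers to
+# the returned page, e.g. page 7 of 20 gets page_range == [5, 6, 7, 8, 9] and
+# page 1 gets [1, 2, 3, 4, 5].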
+
+
+def _verify_parameters(g, mandatory_parameters):
+ miss = []
+ for mp in mandatory_parameters:
+ if not mp in g:
+ miss.append(mp)
+ if len(miss):
+ return miss
+ return None
+
+def _redirect_parameters(view, g, mandatory_parameters, *args, **kwargs):
+ import urllib
+ url = reverse(view, kwargs=kwargs)
+ params = {}
+ for i in g:
+ params[i] = g[i]
+ for i in mandatory_parameters:
+ if not i in params:
+ params[i] = urllib.unquote(str(mandatory_parameters[i]))
+
+ return redirect(url + "?%s" % urllib.urlencode(params), permanent = False, **kwargs)
+
+class RedirectException(Exception):
+ def __init__(self, view, g, mandatory_parameters, *args, **kwargs):
+ super(RedirectException, self).__init__()
+ self.view = view
+ self.g = g
+ self.mandatory_parameters = mandatory_parameters
+ self.oargs = args
+ self.okwargs = kwargs
+
+ def get_redirect_response(self):
+ return _redirect_parameters(self.view, self.g, self.mandatory_parameters, self.oargs, **self.okwargs)
+
+FIELD_SEPARATOR = ":"
+AND_VALUE_SEPARATOR = "!"
+OR_VALUE_SEPARATOR = "|"
+DESCENDING = "-"
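+# A filter string, as parsed by _get_filtering_query() below, has the form
+# "<key>[!<key>...]:<value>[!<value>...]", e.g. "outcome:1" or
+# "outcome!sstate_result:1!3": "!" separates AND-ed key/value pairs, "|"
+# separates OR-ed alternatives within a key or value, values may embed "OR"
+# and "AND", and a value prefixed with "NOT" negates the match. An orderby
+# string is "<field>:+" (ascending) or "<field>:-" (descending).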
+
+def __get_q_for_val(name, value):
+ if "OR" in value:
+ return reduce(operator.or_, map(lambda x: __get_q_for_val(name, x), [ x for x in value.split("OR") ]))
+ if "AND" in value:
+ return reduce(operator.and_, map(lambda x: __get_q_for_val(name, x), [ x for x in value.split("AND") ]))
+ if value.startswith("NOT"):
+ value = value[3:]
+ if value == 'None':
+ value = None
+ kwargs = { name : value }
+ return ~Q(**kwargs)
+ else:
+ if value == 'None':
+ value = None
+ kwargs = { name : value }
+ return Q(**kwargs)
+
+def _get_filtering_query(filter_string):
+
+ search_terms = filter_string.split(FIELD_SEPARATOR)
+ and_keys = search_terms[0].split(AND_VALUE_SEPARATOR)
+ and_values = search_terms[1].split(AND_VALUE_SEPARATOR)
+
+ and_query = []
+ for kv in zip(and_keys, and_values):
+ or_keys = kv[0].split(OR_VALUE_SEPARATOR)
+ or_values = kv[1].split(OR_VALUE_SEPARATOR)
+ querydict = dict(zip(or_keys, or_values))
+ and_query.append(reduce(operator.or_, map(lambda x: __get_q_for_val(x, querydict[x]), [k for k in querydict])))
+
+ return reduce(operator.and_, [k for k in and_query])
+
+def _get_toggle_order(request, orderkey, toggle_reverse = False):
+ if toggle_reverse:
+ return "%s:+" % orderkey if request.GET.get('orderby', "") == "%s:-" % orderkey else "%s:-" % orderkey
+ else:
+ return "%s:-" % orderkey if request.GET.get('orderby', "") == "%s:+" % orderkey else "%s:+" % orderkey
+
+def _get_toggle_order_icon(request, orderkey):
+ if request.GET.get('orderby', "") == "%s:+"%orderkey:
+ return "down"
+ elif request.GET.get('orderby', "") == "%s:-"%orderkey:
+ return "up"
+ else:
+ return None
+
+# we check that the input comes in a valid form that we can recognize
+def _validate_input(field_input, model):
+
+ invalid = None
+
+ if field_input:
+ field_input_list = field_input.split(FIELD_SEPARATOR)
+
+ # Check we have only one colon
+ if len(field_input_list) != 2:
+ invalid = "We have an invalid number of separators: " + field_input + " -> " + str(field_input_list)
+ return None, invalid
+
+ # Check we have an equal number of terms on both sides of the colon
+ if len(field_input_list[0].split(AND_VALUE_SEPARATOR)) != len(field_input_list[1].split(AND_VALUE_SEPARATOR)):
+ invalid = "Not all arg names got values"
+ return None, invalid + str(field_input_list)
+
+ # Check we are looking for a valid field
+ valid_fields = model._meta.get_all_field_names()
+ for field in field_input_list[0].split(AND_VALUE_SEPARATOR):
+ if not reduce(lambda x, y: x or y, [ field.startswith(x) for x in valid_fields ]):
+ return None, (field, [ x for x in valid_fields ])
+
+ return field_input, invalid
+
+# uses search_allowed_fields in orm/models.py to create a search query
+# for these fields with the supplied input text
+def _get_search_results(search_term, queryset, model):
+ search_objects = []
+ for st in search_term.split(" "):
+ q_map = map(lambda x: Q(**{x+'__icontains': st}),
+ model.search_allowed_fields)
+
+ search_objects.append(reduce(operator.or_, q_map))
+ search_object = reduce(operator.and_, search_objects)
+ queryset = queryset.filter(search_object)
+
+ return queryset
+
+
+# function to extract the search/filter/ordering parameters from the request
+# it uses the request and the model to validate input for the filter and orderby values
+def _search_tuple(request, model):
+ ordering_string, invalid = _validate_input(request.GET.get('orderby', ''), model)
+ if invalid:
+ raise BaseException("Invalid ordering model:" + str(model) + str(invalid))
+
+ filter_string, invalid = _validate_input(request.GET.get('filter', ''), model)
+ if invalid:
+ raise BaseException("Invalid filter " + str(invalid))
+
+ search_term = request.GET.get('search', '')
+ return (filter_string, search_term, ordering_string)
+
+
+# returns a lazy-evaluated queryset for a filter/search/order combination
+def _get_queryset(model, queryset, filter_string, search_term, ordering_string, ordering_secondary=''):
+ if filter_string:
+ filter_query = _get_filtering_query(filter_string)
+ queryset = queryset.filter(filter_query)
+ else:
+ queryset = queryset.all()
+
+ if search_term:
+ queryset = _get_search_results(search_term, queryset, model)
+
+ if ordering_string:
+ column, order = ordering_string.split(':')
+ if column == re.sub('-','',ordering_secondary):
+ ordering_secondary=''
+ if order.lower() == DESCENDING:
+ column = '-' + column
+ if ordering_secondary:
+ queryset = queryset.order_by(column, ordering_secondary)
+ else:
+ queryset = queryset.order_by(column)
+
+ # ensure only distinct records (e.g. from multiple search hits) are returned
+ return queryset.distinct()
+
+# returns the number of entries per page and the name of the applied sorting field.
+# if the value is given explicitly as a GET parameter it takes precedence,
+# otherwise the value stored in the session is used.
+def _get_parameters_values(request, default_count, default_order):
+ current_url = resolve(request.path_info).url_name
+ pagesize = request.GET.get('count', request.session.get('%s_count' % current_url, default_count))
+ orderby = request.GET.get('orderby', request.session.get('%s_orderby' % current_url, default_order))
+ return (pagesize, orderby)
+
+
+# store the parameters in the session. this is useful in case parameters are set
+# manually from the GET values of the link
+def _set_parameters_values(pagesize, orderby, request):
+ from django.core.urlresolvers import resolve
+ current_url = resolve(request.path_info).url_name
+ request.session['%s_count' % current_url] = pagesize
+ request.session['%s_orderby' % current_url] =orderby
+
+# date range: normalize GUI's dd/mm/yyyy to date object
+def _normalize_input_date(date_str,default):
+ date_str=re.sub('/', '-', date_str)
+ # accept dd/mm/yyyy to d/m/yy
+ try:
+ date_in = datetime.strptime(date_str, "%d-%m-%Y")
+ except ValueError:
+ # courtesy try with two digit year
+ try:
+ date_in = datetime.strptime(date_str, "%d-%m-%y")
+ except ValueError:
+ return default
+ date_in = date_in.replace(tzinfo=default.tzinfo)
+ return date_in
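+# For example, _normalize_input_date("01/03/2015", today) returns a datetime
+# for 1 March 2015 carrying today's tzinfo, while an unparseable string
+# returns the supplied default unchanged.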
+
+# convert and normalize any received date range filter, for example:
+# "completed_on__gte!completed_on__lt:01/03/2015!02/03/2015_daterange" to
+# "completed_on__gte!completed_on__lt:2015-03-01!2015-03-02"
+def _modify_date_range_filter(filter_string):
+ # was the date range radio button selected?
+ if 0 > filter_string.find('_daterange'):
+ return filter_string,''
+ # normalize GUI dates to database format
+ filter_string = filter_string.replace('_daterange','').replace(':','!')
+ filter_list = filter_string.split('!')
+ if 4 != len(filter_list):
+ return filter_string
+ today = timezone.localtime(timezone.now())
+ date_id = filter_list[1]
+ date_from = _normalize_input_date(filter_list[2],today)
+ date_to = _normalize_input_date(filter_list[3],today)
+ # swap dates if manually set dates are out of order
+ if date_to < date_from:
+ date_to,date_from = date_from,date_to
+ # convert to strings, make 'date_to' inclusive by moving to the beginning of the next day
+ date_from_str = date_from.strftime("%Y-%m-%d")
+ date_to_str = (date_to+timedelta(days=1)).strftime("%Y-%m-%d")
+ filter_string=filter_list[0]+'!'+filter_list[1]+':'+date_from_str+'!'+date_to_str
+ daterange_selected = re.sub('__.*','', date_id)
+ return filter_string,daterange_selected
+
+def _add_daterange_context(queryset_all, request, daterange_list):
+ # calculate the exact beginning of local today and yesterday
+ today_begin = timezone.localtime(timezone.now())
+ today_begin = date(today_begin.year,today_begin.month,today_begin.day)
+ yesterday_begin = today_begin-timedelta(days=1)
+ # add persistent daterange context
+ context_date = {}
+ context_date['last_date_from'] = request.GET.get('last_date_from',timezone.localtime(timezone.now()).strftime("%d/%m/%Y"))
+ context_date['last_date_to' ] = request.GET.get('last_date_to' ,context_date['last_date_from'])
+ # calculate the date ranges, avoid second sort for 'created'
+ # fetch the respective max range from the database
+ context_date['daterange_filter']=''
+ for key in daterange_list:
+ queryset_key = queryset_all.order_by(key)
+ try:
+ context_date['dateMin_'+key]=timezone.localtime(getattr(queryset_key.first(),key)).strftime("%d/%m/%Y")
+ except AttributeError:
+ context_date['dateMin_'+key]=timezone.localtime(timezone.now())
+ try:
+ context_date['dateMax_'+key]=timezone.localtime(getattr(queryset_key.last(),key)).strftime("%d/%m/%Y")
+ except AttributeError:
+ context_date['dateMax_'+key]=timezone.localtime(timezone.now())
+ return context_date,today_begin,yesterday_begin
+
+
+##
+# build dashboard for a single build, passed in as an argument.
+# Each build may contain multiple targets and each target
+# may generate multiple image files. Display them all.
+#
+def builddashboard( request, build_id ):
+ template = "builddashboard.html"
+ if Build.objects.filter( pk=build_id ).count( ) == 0 :
+ return redirect( builds )
+ build = Build.objects.get( pk = build_id )
+ layerVersionId = Layer_Version.objects.filter( build = build_id )
+ recipeCount = Recipe.objects.filter( layer_version__id__in = layerVersionId ).count( )
+ tgts = Target.objects.filter( build_id = build_id ).order_by( 'target' )
+
+ ##
+ # set up custom target list with computed package and image data
+ #
+
+ targets = [ ]
+ ntargets = 0
+ hasImages = False
+ targetHasNoImages = False
+ for t in tgts:
+ elem = { }
+ elem[ 'target' ] = t
+ if ( t.is_image ):
+ hasImages = True
+ npkg = 0
+ pkgsz = 0
+ package = None
+ for package in Package.objects.filter(id__in = [x.package_id for x in t.target_installed_package_set.all()]):
+ pkgsz = pkgsz + package.size
+ if ( package.installed_name ):
+ npkg = npkg + 1
+ elem[ 'npkg' ] = npkg
+ elem[ 'pkgsz' ] = pkgsz
+ ti = Target_Image_File.objects.filter( target_id = t.id )
+ imageFiles = [ ]
+ for i in ti:
+ ndx = i.file_name.rfind( '/' )
+ if ( ndx < 0 ):
+ ndx = 0
+ f = i.file_name[ ndx + 1: ]
+ imageFiles.append({ 'id': i.id, 'path': f, 'size' : i.file_size })
+ if ( t.is_image and
+ (( len( imageFiles ) <= 0 ) or ( len( t.license_manifest_path ) <= 0 ))):
+ targetHasNoImages = True
+ elem[ 'imageFiles' ] = imageFiles
+ elem[ 'targetHasNoImages' ] = targetHasNoImages
+ targets.append( elem )
+
+ ##
+ # how many packages in this build - ignore anonymous ones
+ #
+
+ packageCount = 0
+ packages = Package.objects.filter( build_id = build_id )
+ for p in packages:
+ if ( p.installed_name ):
+ packageCount = packageCount + 1
+
+ logmessages = list(LogMessage.objects.filter( build = build_id ))
+
+ context = {
+ 'build' : build,
+ 'hasImages' : hasImages,
+ 'ntargets' : ntargets,
+ 'targets' : targets,
+ 'recipecount' : recipeCount,
+ 'packagecount' : packageCount,
+ 'logmessages' : logmessages,
+ }
+ return render( request, template, context )
+
+
+
+def generateCoveredList2( revlist = None ):
+ if not revlist:
+ revlist = []
+ covered_list = [ x for x in revlist if x.outcome == Task.OUTCOME_COVERED ]
+ while len(covered_list):
+ revlist = [ x for x in revlist if x.outcome != Task.OUTCOME_COVERED ]
+ if len(revlist) > 0:
+ return revlist
+
+ newlist = _find_task_revdep_list(covered_list)
+
+ revlist = list(set(revlist + newlist))
+ covered_list = [ x for x in revlist if x.outcome == Task.OUTCOME_COVERED ]
+ return revlist
+
+def task( request, build_id, task_id ):
+ template = "task.html"
+ tasks_list = Task.objects.filter( pk=task_id )
+ if tasks_list.count( ) == 0:
+ return redirect( builds )
+ task_object = tasks_list[ 0 ]
+ dependencies = sorted(
+ _find_task_dep( task_object ),
+ key=lambda t:'%s_%s %s'%(t.recipe.name, t.recipe.version, t.task_name))
+ reverse_dependencies = sorted(
+ _find_task_revdep( task_object ),
+ key=lambda t:'%s_%s %s'%( t.recipe.name, t.recipe.version, t.task_name ))
+ coveredBy = ''
+ if ( task_object.outcome == Task.OUTCOME_COVERED ):
+# _list = generateCoveredList( task )
+ coveredBy = sorted(generateCoveredList2( _find_task_revdep( task_object ) ), key = lambda x: x.recipe.name)
+ log_head = ''
+ log_body = ''
+ if task_object.outcome == task_object.OUTCOME_FAILED:
+ pass
+
+ uri_list= [ ]
+ variables = Variable.objects.filter(build=build_id)
+ v=variables.filter(variable_name='SSTATE_DIR')
+ if v.count() > 0:
+ uri_list.append(v[0].variable_value)
+ v=variables.filter(variable_name='SSTATE_MIRRORS')
+ if (v.count() > 0):
+ for mirror in v[0].variable_value.split('\\n'):
+ s=re.sub('.* ','',mirror.strip(' \t\n\r'))
+ if len(s):
+ uri_list.append(s)
+
+ context = {
+ 'build' : Build.objects.filter( pk = build_id )[ 0 ],
+ 'object' : task_object,
+ 'task' : task_object,
+ 'covered_by' : coveredBy,
+ 'deps' : dependencies,
+ 'rdeps' : reverse_dependencies,
+ 'log_head' : log_head,
+ 'log_body' : log_body,
+ 'showing_matches' : False,
+ 'uri_list' : uri_list,
+ }
+ if request.GET.get( 'show_matches', "" ):
+ context[ 'showing_matches' ] = True
+ context[ 'matching_tasks' ] = Task.objects.filter(
+ sstate_checksum=task_object.sstate_checksum ).filter(
+ build__completed_on__lt=task_object.build.completed_on).exclude(
+ order__isnull=True).exclude(outcome=Task.OUTCOME_NA).order_by('-build__completed_on')
+
+ return render( request, template, context )
+
+def recipe(request, build_id, recipe_id, active_tab="1"):
+ template = "recipe.html"
+ if Recipe.objects.filter(pk=recipe_id).count() == 0 :
+ return redirect(builds)
+
+ recipe_object = Recipe.objects.get(pk=recipe_id)
+ layer_version = Layer_Version.objects.get(pk=recipe_object.layer_version_id)
+ layer = Layer.objects.get(pk=layer_version.layer_id)
+ tasks_list = Task.objects.filter(recipe_id = recipe_id, build_id = build_id).exclude(order__isnull=True).exclude(task_name__endswith='_setscene').exclude(outcome=Task.OUTCOME_NA)
+ package_count = Package.objects.filter(recipe_id = recipe_id).filter(build_id = build_id).filter(size__gte=0).count()
+
+ if active_tab != '1' and active_tab != '3' and active_tab != '4' :
+ active_tab = '1'
+ tab_states = {'1': '', '3': '', '4': ''}
+ tab_states[active_tab] = 'active'
+
+ context = {
+ 'build' : Build.objects.get(pk=build_id),
+ 'object' : recipe_object,
+ 'layer_version' : layer_version,
+ 'layer' : layer,
+ 'tasks' : tasks_list,
+ 'package_count' : package_count,
+ 'tab_states' : tab_states,
+ }
+ return render(request, template, context)
+
+def recipe_packages(request, build_id, recipe_id):
+ template = "recipe_packages.html"
+ if Recipe.objects.filter(pk=recipe_id).count() == 0 :
+ return redirect(builds)
+
+ (pagesize, orderby) = _get_parameters_values(request, 10, 'name:+')
+ mandatory_parameters = { 'count': pagesize, 'page' : 1, 'orderby': orderby }
+ retval = _verify_parameters( request.GET, mandatory_parameters )
+ if retval:
+ return _redirect_parameters( 'recipe_packages', request.GET, mandatory_parameters, build_id = build_id, recipe_id = recipe_id)
+ (filter_string, search_term, ordering_string) = _search_tuple(request, Package)
+
+ recipe_object = Recipe.objects.get(pk=recipe_id)
+ queryset = Package.objects.filter(recipe_id = recipe_id).filter(build_id = build_id).filter(size__gte=0)
+ package_count = queryset.count()
+ queryset = _get_queryset(Package, queryset, filter_string, search_term, ordering_string, 'name')
+
+ packages = _build_page_range(Paginator(queryset, pagesize),request.GET.get('page', 1))
+
+ context = {
+ 'build' : Build.objects.get(pk=build_id),
+ 'recipe' : recipe_object,
+ 'objects' : packages,
+ 'object_count' : package_count,
+ 'tablecols':[
+ {
+ 'name':'Package',
+ 'orderfield': _get_toggle_order(request,"name"),
+ 'ordericon': _get_toggle_order_icon(request,"name"),
+ 'orderkey': "name",
+ },
+ {
+ 'name':'Version',
+ },
+ {
+ 'name':'Size',
+ 'orderfield': _get_toggle_order(request,"size", True),
+ 'ordericon': _get_toggle_order_icon(request,"size"),
+ 'orderkey': 'size',
+ 'dclass': 'sizecol span2',
+ },
+ ]
+ }
+ response = render(request, template, context)
+ _set_parameters_values(pagesize, orderby, request)
+ return response
+
+def target_common( request, build_id, target_id, variant ):
+ template = "target.html"
+ (pagesize, orderby) = _get_parameters_values(request, 25, 'name:+')
+ mandatory_parameters = { 'count': pagesize, 'page' : 1, 'orderby': orderby }
+ retval = _verify_parameters( request.GET, mandatory_parameters )
+ if retval:
+ return _redirect_parameters(
+ variant, request.GET, mandatory_parameters,
+ build_id = build_id, target_id = target_id )
+ ( filter_string, search_term, ordering_string ) = _search_tuple( request, Package )
+
+ # FUTURE: get rid of nested sub-queries replacing with ManyToMany field
+ queryset = Package.objects.filter(
+ size__gte = 0,
+ id__in = Target_Installed_Package.objects.filter(
+ target_id=target_id ).values( 'package_id' ))
+ packages_sum = queryset.aggregate( Sum( 'installed_size' ))
+ queryset = _get_queryset(
+ Package, queryset, filter_string, search_term, ordering_string, 'name' )
+ queryset = queryset.select_related("recipe", "recipe__layer_version", "recipe__layer_version__layer")
+ packages = _build_page_range( Paginator(queryset, pagesize), request.GET.get( 'page', 1 ))
+
+
+
+ build = Build.objects.get( pk = build_id )
+
+ # bring in package dependencies
+ for p in packages.object_list:
+ p.runtime_dependencies = p.package_dependencies_source.filter(
+ target_id = target_id, dep_type=Package_Dependency.TYPE_TRDEPENDS ).select_related("depends_on")
+ p.reverse_runtime_dependencies = p.package_dependencies_target.filter(
+ target_id = target_id, dep_type=Package_Dependency.TYPE_TRDEPENDS ).select_related("package")
+ tc_package = {
+ 'name' : 'Package',
+ 'qhelp' : 'Packaged output resulting from building a recipe included in this image',
+ 'orderfield' : _get_toggle_order( request, "name" ),
+ 'ordericon' : _get_toggle_order_icon( request, "name" ),
+ }
+ tc_packageVersion = {
+ 'name' : 'Package version',
+ 'qhelp' : 'The package version and revision',
+ }
+ tc_size = {
+ 'name' : 'Size',
+ 'qhelp' : 'The size of the package',
+ 'orderfield' : _get_toggle_order( request, "size", True ),
+ 'ordericon' : _get_toggle_order_icon( request, "size" ),
+ 'orderkey' : 'size',
+ 'clclass' : 'size',
+ 'dclass' : 'span2',
+ }
+ if ( variant == 'target' ):
+ tc_size[ "hidden" ] = 0
+ else:
+ tc_size[ "hidden" ] = 1
+ tc_sizePercentage = {
+ 'name' : 'Size over total (%)',
+ 'qhelp' : 'Proportion of the overall size represented by this package',
+ 'clclass' : 'size_over_total',
+ 'hidden' : 1,
+ }
+ tc_license = {
+ 'name' : 'License',
+ 'qhelp' : 'The license under which the package is distributed. Separate license names '
+ 'using | (pipe) means there is a choice between licenses. Separate license '
+ 'names using & (ampersand) means multiple licenses exist that cover different '
+ 'parts of the source',
+ 'orderfield' : _get_toggle_order( request, "license" ),
+ 'ordericon' : _get_toggle_order_icon( request, "license" ),
+ 'orderkey' : 'license',
+ 'clclass' : 'license',
+ }
+ if ( variant == 'target' ):
+ tc_license[ "hidden" ] = 1
+ else:
+ tc_license[ "hidden" ] = 0
+ tc_dependencies = {
+ 'name' : 'Dependencies',
+ 'qhelp' : "Package runtime dependencies (other packages)",
+ 'clclass' : 'depends',
+ }
+ if ( variant == 'target' ):
+ tc_dependencies[ "hidden" ] = 0
+ else:
+ tc_dependencies[ "hidden" ] = 1
+ tc_rdependencies = {
+ 'name' : 'Reverse dependencies',
+ 'qhelp' : 'Package run-time reverse dependencies (i.e. which other packages depend on this package)',
+ 'clclass' : 'brought_in_by',
+ }
+ if ( variant == 'target' ):
+ tc_rdependencies[ "hidden" ] = 0
+ else:
+ tc_rdependencies[ "hidden" ] = 1
+ tc_recipe = {
+ 'name' : 'Recipe',
+ 'qhelp' : 'The name of the recipe building the package',
+ 'orderfield' : _get_toggle_order( request, "recipe__name" ),
+ 'ordericon' : _get_toggle_order_icon( request, "recipe__name" ),
+ 'orderkey' : "recipe__name",
+ 'clclass' : 'recipe_name',
+ 'hidden' : 0,
+ }
+ tc_recipeVersion = {
+ 'name' : 'Recipe version',
+ 'qhelp' : 'Version and revision of the recipe building the package',
+ 'clclass' : 'recipe_version',
+ 'hidden' : 1,
+ }
+ tc_layer = {
+ 'name' : 'Layer',
+ 'qhelp' : 'The name of the layer providing the recipe that builds the package',
+ 'orderfield' : _get_toggle_order( request, "recipe__layer_version__layer__name" ),
+ 'ordericon' : _get_toggle_order_icon( request, "recipe__layer_version__layer__name" ),
+ 'orderkey' : "recipe__layer_version__layer__name",
+ 'clclass' : 'layer_name',
+ 'hidden' : 1,
+ }
+ tc_layerBranch = {
+ 'name' : 'Layer branch',
+ 'qhelp' : 'The Git branch of the layer providing the recipe that builds the package',
+ 'orderfield' : _get_toggle_order( request, "recipe__layer_version__branch" ),
+ 'ordericon' : _get_toggle_order_icon( request, "recipe__layer_version__branch" ),
+ 'orderkey' : "recipe__layer_version__branch",
+ 'clclass' : 'layer_branch',
+ 'hidden' : 1,
+ }
+ tc_layerCommit = {
+ 'name' : 'Layer commit',
+ 'qhelp' : 'The Git commit of the layer providing the recipe that builds the package',
+ 'clclass' : 'layer_commit',
+ 'hidden' : 1,
+ }
+
+ context = {
+ 'objectname': variant,
+ 'build' : build,
+ 'target' : Target.objects.filter( pk = target_id )[ 0 ],
+ 'objects' : packages,
+ 'packages_sum' : packages_sum[ 'installed_size__sum' ],
+ 'object_search_display': "packages included",
+ 'default_orderby' : orderby,
+ 'tablecols' : [
+ tc_package,
+ tc_packageVersion,
+ tc_license,
+ tc_size,
+ tc_sizePercentage,
+ tc_dependencies,
+ tc_rdependencies,
+ tc_recipe,
+ tc_recipeVersion,
+ tc_layer,
+ tc_layerBranch,
+ tc_layerCommit,
+ ]
+ }
+
+
+ response = render(request, template, context)
+ _set_parameters_values(pagesize, orderby, request)
+ return response
+
+def target( request, build_id, target_id ):
+ return( target_common( request, build_id, target_id, "target" ))
+
+def targetpkg( request, build_id, target_id ):
+ return( target_common( request, build_id, target_id, "targetpkg" ))
+
+from django.core.serializers.json import DjangoJSONEncoder
+from django.http import HttpResponse
+def xhr_dirinfo(request, build_id, target_id):
+ top = request.GET.get('start', '/')
+ return HttpResponse(_get_dir_entries(build_id, target_id, top), content_type = "application/json")
+
+from django.utils.functional import Promise
+from django.utils.encoding import force_text
+class LazyEncoder(json.JSONEncoder):
+ def default(self, obj):
+ if isinstance(obj, Promise):
+ return force_text(obj)
+ return super(LazyEncoder, self).default(obj)
+
+from toastergui.templatetags.projecttags import filtered_filesizeformat
+import os
+import traceback
+def _get_dir_entries(build_id, target_id, start):
+ node_str = {
+ Target_File.ITYPE_REGULAR : '-',
+ Target_File.ITYPE_DIRECTORY : 'd',
+ Target_File.ITYPE_SYMLINK : 'l',
+ Target_File.ITYPE_SOCKET : 's',
+ Target_File.ITYPE_FIFO : 'p',
+ Target_File.ITYPE_CHARACTER : 'c',
+ Target_File.ITYPE_BLOCK : 'b',
+ }
+ response = []
+ objects = Target_File.objects.filter(target__exact=target_id, directory__path=start)
+ target_packages = Target_Installed_Package.objects.filter(target__exact=target_id).values_list('package_id', flat=True)
+ for o in objects:
+ # exclude root inode '/'
+ if o.path == '/':
+ continue
+ try:
+ entry = {}
+ entry['parent'] = start
+ entry['name'] = os.path.basename(o.path)
+ entry['fullpath'] = o.path
+
+ # set defaults, not all dentries have packages
+ entry['installed_package'] = None
+ entry['package_id'] = None
+ entry['package'] = None
+ entry['link_to'] = None
+ if o.inodetype == Target_File.ITYPE_DIRECTORY:
+ entry['isdir'] = 1
+ # is there content in directory
+ entry['childcount'] = Target_File.objects.filter(target__exact=target_id, directory__path=o.path).all().count()
+ else:
+ entry['isdir'] = 0
+
+ # resolve the file to get the package from the resolved file
+ resolved_id = o.sym_target_id
+ resolved_path = o.path
+ if target_packages.count():
+ while resolved_id != "" and resolved_id != None:
+ tf = Target_File.objects.get(pk=resolved_id)
+ resolved_path = tf.path
+ resolved_id = tf.sym_target_id
+
+ thisfile=Package_File.objects.all().filter(path__exact=resolved_path, package_id__in=target_packages)
+ if thisfile.count():
+ p = Package.objects.get(pk=thisfile[0].package_id)
+ entry['installed_package'] = p.installed_name
+ entry['package_id'] = str(p.id)
+ entry['package'] = p.name
+ # don't use resolved path from above, show immediate link-to
+ if o.sym_target_id != "" and o.sym_target_id != None:
+ entry['link_to'] = Target_File.objects.get(pk=o.sym_target_id).path
+ entry['size'] = filtered_filesizeformat(o.size)
+ entry['permission'] = node_str[o.inodetype] + o.permission
+ entry['owner'] = o.owner
+ entry['group'] = o.group
+ response.append(entry)
+
+ except Exception as e:
+ print "Exception ", e
+ traceback.print_exc(e)
+
+ # sort by directories first, then by name
+ rsorted = sorted(response, key=lambda entry : entry['name'])
+ rsorted = sorted(rsorted, key=lambda entry : entry['isdir'], reverse=True)
+ return json.dumps(rsorted, cls=LazyEncoder).replace('</', '<\\/')
+
+def dirinfo(request, build_id, target_id, file_path=None):
+ template = "dirinfo.html"
+ objects = _get_dir_entries(build_id, target_id, '/')
+ packages_sum = Package.objects.filter(id__in=Target_Installed_Package.objects.filter(target_id=target_id).values('package_id')).aggregate(Sum('installed_size'))
+ dir_list = None
+ if file_path != None:
+ """
+ Link from the included package detail file list page and is
+ requesting opening the dir info to a specific file path.
+ Provide the list of directories to expand and the full path to
+ highlight in the page.
+ """
+ # Aassume target's path separator matches host's, that is, os.sep
+ sep = os.sep
+ dir_list = []
+ head = file_path
+ while head != sep:
+ (head, tail) = os.path.split(head)
+ if head != sep:
+ dir_list.insert(0, head)
+
+ context = { 'build': Build.objects.get(pk=build_id),
+ 'target': Target.objects.get(pk=target_id),
+ 'packages_sum': packages_sum['installed_size__sum'],
+ 'objects': objects,
+ 'dir_list': dir_list,
+ 'file_path': file_path,
+ }
+ return render(request, template, context)
+
+def _find_task_dep(task_object):
+ return map(lambda x: x.depends_on, Task_Dependency.objects.filter(task=task_object).filter(depends_on__order__gt = 0).exclude(depends_on__outcome = Task.OUTCOME_NA).select_related("depends_on"))
+
+
+def _find_task_revdep(task_object):
+ tp = []
+ tp = map(lambda t: t.task, Task_Dependency.objects.filter(depends_on=task_object).filter(task__order__gt=0).exclude(task__outcome = Task.OUTCOME_NA).select_related("task", "task__recipe", "task__build"))
+ return tp
+
+def _find_task_revdep_list(tasklist):
+ tp = []
+ tp = map(lambda t: t.task, Task_Dependency.objects.filter(depends_on__in=tasklist).filter(task__order__gt=0).exclude(task__outcome = Task.OUTCOME_NA).select_related("task", "task__recipe", "task__build"))
+ return tp
+
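+# Walk the reverse dependencies of a covered task to find the task that
+# actually provided its output: prefer a direct reverse dependency whose
+# outcome is not 'covered', otherwise recurse upwards through the chain.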
+def _find_task_provider(task_object):
+ task_revdeps = _find_task_revdep(task_object)
+ for tr in task_revdeps:
+ if tr.outcome != Task.OUTCOME_COVERED:
+ return tr
+ for tr in task_revdeps:
+ trc = _find_task_provider(tr)
+ if trc is not None:
+ return trc
+ return None
+
+def tasks_common(request, build_id, variant, task_anchor):
+# This function is shared between these pages
+#
+# Column tasks buildtime diskio cpuusage
+# --------- ------ ---------- ------- ---------
+# Cache def
+# CPU min -
+# Disk min -
+# Executed def def def def
+# Log
+# Order def +
+# Outcome def def def def
+# Recipe min min min min
+# Version
+# Task min min min min
+# Time min -
+#
+# 'min':on always, 'def':on by default, else hidden
+# '+' default column sort up, '-' default column sort down
+
+ anchor = request.GET.get('anchor', '')
+ if not anchor:
+ anchor=task_anchor
+
+ # default ordering depends on variant
+ if 'buildtime' == variant:
+ title_variant='Time'
+ object_search_display="time data"
+ filter_search_display="tasks"
+ (pagesize, orderby) = _get_parameters_values(request, 25, 'elapsed_time:-')
+ elif 'diskio' == variant:
+ title_variant='Disk I/O'
+ object_search_display="disk I/O data"
+ filter_search_display="tasks"
+ (pagesize, orderby) = _get_parameters_values(request, 25, 'disk_io:-')
+ elif 'cpuusage' == variant:
+ title_variant='CPU usage'
+ object_search_display="CPU usage data"
+ filter_search_display="tasks"
+ (pagesize, orderby) = _get_parameters_values(request, 25, 'cpu_usage:-')
+ else :
+ title_variant='Tasks'
+ object_search_display="tasks"
+ filter_search_display="tasks"
+ (pagesize, orderby) = _get_parameters_values(request, 25, 'order:+')
+
+
+ mandatory_parameters = { 'count': pagesize, 'page' : 1, 'orderby': orderby }
+
+ template = 'tasks.html'
+ retval = _verify_parameters( request.GET, mandatory_parameters )
+ if retval:
+ if task_anchor:
+ mandatory_parameters['anchor']=task_anchor
+ return _redirect_parameters( variant, request.GET, mandatory_parameters, build_id = build_id)
+ (filter_string, search_term, ordering_string) = _search_tuple(request, Task)
+ queryset_all = Task.objects.filter(build=build_id).exclude(order__isnull=True).exclude(outcome=Task.OUTCOME_NA)
+ queryset_all = queryset_all.select_related("recipe", "build")
+
+ queryset_with_search = _get_queryset(Task, queryset_all, None , search_term, ordering_string, 'order')
+
+ if ordering_string.startswith('outcome'):
+ queryset = _get_queryset(Task, queryset_all, filter_string, search_term, 'order:+', 'order')
+ queryset = sorted(queryset, key=lambda ur: (ur.outcome_text), reverse=ordering_string.endswith('-'))
+ elif ordering_string.startswith('sstate_result'):
+ queryset = _get_queryset(Task, queryset_all, filter_string, search_term, 'order:+', 'order')
+ queryset = sorted(queryset, key=lambda ur: (ur.sstate_text), reverse=ordering_string.endswith('-'))
+ else:
+ queryset = _get_queryset(Task, queryset_all, filter_string, search_term, ordering_string, 'order')
+
+
+ # compute the anchor's page
+ if anchor:
+ request.GET = request.GET.copy()
+ del request.GET['anchor']
+ i=0
+ a=int(anchor)
+ count_per_page=int(pagesize)
+ for task_object in queryset.iterator():
+ if a == task_object.order:
+ new_page= (i / count_per_page ) + 1
+ request.GET.__setitem__('page', new_page)
+ mandatory_parameters['page']=new_page
+ return _redirect_parameters( variant, request.GET, mandatory_parameters, build_id = build_id)
+ i += 1
+
+ task_objects = _build_page_range(Paginator(queryset, pagesize),request.GET.get('page', 1))
+
+ # define (and modify by variants) the 'tablecols' members
+ tc_order={
+ 'name':'Order',
+ 'qhelp':'The running sequence of each task in the build',
+ 'clclass': 'order', 'hidden' : 1,
+ 'orderkey' : 'order',
+ 'orderfield':_get_toggle_order(request, "order"),
+ 'ordericon':_get_toggle_order_icon(request, "order")}
+ if 'tasks' == variant:
+ tc_order['hidden']='0'
+ del tc_order['clclass']
+
+ tc_recipe={
+ 'name':'Recipe',
+ 'qhelp':'The name of the recipe to which each task applies',
+ 'orderkey' : 'recipe__name',
+ 'orderfield': _get_toggle_order(request, "recipe__name"),
+ 'ordericon':_get_toggle_order_icon(request, "recipe__name"),
+ }
+ tc_recipe_version={
+ 'name':'Recipe version',
+ 'qhelp':'The version of the recipe to which each task applies',
+ 'clclass': 'recipe_version', 'hidden' : 1,
+ }
+ tc_task={
+ 'name':'Task',
+ 'qhelp':'The name of the task',
+ 'orderfield': _get_toggle_order(request, "task_name"),
+ 'ordericon':_get_toggle_order_icon(request, "task_name"),
+ 'orderkey' : 'task_name',
+ }
+ tc_executed={
+ 'name':'Executed',
+ 'qhelp':"This value tells you if a task had to run (executed) in order to generate the task output, or if the output was provided by another task and therefore the task didn't need to run (not executed)",
+ 'clclass': 'executed', 'hidden' : 0,
+ 'orderfield': _get_toggle_order(request, "task_executed"),
+ 'ordericon':_get_toggle_order_icon(request, "task_executed"),
+ 'orderkey' : 'task_executed',
+ 'filter' : {
+ 'class' : 'executed',
+ 'label': 'Show:',
+ 'options' : [
+ ('Executed Tasks', 'task_executed:1', queryset_with_search.filter(task_executed=1).count()),
+ ('Not Executed Tasks', 'task_executed:0', queryset_with_search.filter(task_executed=0).count()),
+ ]
+ }
+
+ }
+ tc_outcome={
+ 'name':'Outcome',
+ 'qhelp':"This column tells you if 'executed' tasks succeeded or failed. The column also tells you why 'not executed' tasks did not need to run",
+ 'clclass': 'outcome', 'hidden' : 0,
+ 'orderfield': _get_toggle_order(request, "outcome"),
+ 'ordericon':_get_toggle_order_icon(request, "outcome"),
+ 'orderkey' : 'outcome',
+ 'filter' : {
+ 'class' : 'outcome',
+ 'label': 'Show:',
+ 'options' : [
+ ('Succeeded Tasks', 'outcome:%d'%Task.OUTCOME_SUCCESS, queryset_with_search.filter(outcome=Task.OUTCOME_SUCCESS).count(), "'Succeeded' tasks are those that ran and completed during the build" ),
+ ('Failed Tasks', 'outcome:%d'%Task.OUTCOME_FAILED, queryset_with_search.filter(outcome=Task.OUTCOME_FAILED).count(), "'Failed' tasks are those that ran but did not complete during the build"),
+ ('Cached Tasks', 'outcome:%d'%Task.OUTCOME_CACHED, queryset_with_search.filter(outcome=Task.OUTCOME_CACHED).count(), 'Cached tasks restore output from the <code>sstate-cache</code> directory or mirrors'),
+ ('Prebuilt Tasks', 'outcome:%d'%Task.OUTCOME_PREBUILT, queryset_with_search.filter(outcome=Task.OUTCOME_PREBUILT).count(),'Prebuilt tasks didn\'t need to run because their output was reused from a previous build'),
+ ('Covered Tasks', 'outcome:%d'%Task.OUTCOME_COVERED, queryset_with_search.filter(outcome=Task.OUTCOME_COVERED).count(), 'Covered tasks didn\'t need to run because their output is provided by another task in this build'),
+ ('Empty Tasks', 'outcome:%d'%Task.OUTCOME_EMPTY, queryset_with_search.filter(outcome=Task.OUTCOME_EMPTY).count(), 'Empty tasks have no executable content'),
+ ]
+ }
+
+ }
+
+ tc_cache={
+ 'name':'Cache attempt',
+ 'qhelp':'This column tells you if a task tried to restore output from the <code>sstate-cache</code> directory or mirrors, and reports the result: Succeeded, Failed or File not in cache',
+ 'clclass': 'cache_attempt', 'hidden' : 0,
+ 'orderfield': _get_toggle_order(request, "sstate_result"),
+ 'ordericon':_get_toggle_order_icon(request, "sstate_result"),
+ 'orderkey' : 'sstate_result',
+ 'filter' : {
+ 'class' : 'cache_attempt',
+ 'label': 'Show:',
+ 'options' : [
+ ('Tasks with cache attempts', 'sstate_result__gt:%d'%Task.SSTATE_NA, queryset_with_search.filter(sstate_result__gt=Task.SSTATE_NA).count(), 'Show all tasks that tried to restore output from the <code>sstate-cache</code> directory or mirrors'),
+ ("Tasks with 'File not in cache' attempts", 'sstate_result:%d'%Task.SSTATE_MISS, queryset_with_search.filter(sstate_result=Task.SSTATE_MISS).count(), 'Show tasks that tried to restore output, but did not find it in the <code>sstate-cache</code> directory or mirrors'),
+ ("Tasks with 'Failed' cache attempts", 'sstate_result:%d'%Task.SSTATE_FAILED, queryset_with_search.filter(sstate_result=Task.SSTATE_FAILED).count(), 'Show tasks that found the required output in the <code>sstate-cache</code> directory or mirrors, but could not restore it'),
+ ("Tasks with 'Succeeded' cache attempts", 'sstate_result:%d'%Task.SSTATE_RESTORED, queryset_with_search.filter(sstate_result=Task.SSTATE_RESTORED).count(), 'Show tasks that successfully restored the required output from the <code>sstate-cache</code> directory or mirrors'),
+ ]
+ }
+
+ }
+ #if 'tasks' == variant: tc_cache['hidden']='0';
+ tc_time={
+ 'name':'Time (secs)',
+ 'qhelp':'How long it took the task to finish in seconds',
+ 'orderfield': _get_toggle_order(request, "elapsed_time", True),
+ 'ordericon':_get_toggle_order_icon(request, "elapsed_time"),
+ 'orderkey' : 'elapsed_time',
+ 'clclass': 'time_taken', 'hidden' : 1,
+ }
+ if 'buildtime' == variant:
+ tc_time['hidden']='0'
+ del tc_time['clclass']
+ tc_cache['hidden']='1'
+
+ tc_cpu={
+ 'name':'CPU usage',
+ 'qhelp':'The percentage of task CPU utilization',
+ 'orderfield': _get_toggle_order(request, "cpu_usage", True),
+ 'ordericon':_get_toggle_order_icon(request, "cpu_usage"),
+ 'orderkey' : 'cpu_usage',
+ 'clclass': 'cpu_used', 'hidden' : 1,
+ }
+
+ if 'cpuusage' == variant:
+ tc_cpu['hidden']='0'
+ del tc_cpu['clclass']
+ tc_cache['hidden']='1'
+
+ tc_diskio={
+ 'name':'Disk I/O (ms)',
+ 'qhelp':'The number of milliseconds the task spent doing disk input and output',
+ 'orderfield': _get_toggle_order(request, "disk_io", True),
+ 'ordericon':_get_toggle_order_icon(request, "disk_io"),
+ 'orderkey' : 'disk_io',
+ 'clclass': 'disk_io', 'hidden' : 1,
+ }
+ if 'diskio' == variant:
+ tc_diskio['hidden']='0'
+ del tc_diskio['clclass']
+ tc_cache['hidden']='1'
+
+ build = Build.objects.get(pk=build_id)
+
+ context = { 'objectname': variant,
+ 'object_search_display': object_search_display,
+ 'filter_search_display': filter_search_display,
+ 'title': title_variant,
+ 'build': build,
+ 'objects': task_objects,
+ 'default_orderby' : orderby,
+ 'search_term': search_term,
+ 'total_count': queryset_with_search.count(),
+ 'tablecols':[
+ tc_order,
+ tc_recipe,
+ tc_recipe_version,
+ tc_task,
+ tc_executed,
+ tc_outcome,
+ tc_cache,
+ tc_time,
+ tc_cpu,
+ tc_diskio,
+ ]}
+
+
+ response = render(request, template, context)
+ _set_parameters_values(pagesize, orderby, request)
+ return response
+
+def tasks(request, build_id):
+ return tasks_common(request, build_id, 'tasks', '')
+
+def tasks_task(request, build_id, task_id):
+ return tasks_common(request, build_id, 'tasks', task_id)
+
+def buildtime(request, build_id):
+ return tasks_common(request, build_id, 'buildtime', '')
+
+def diskio(request, build_id):
+ return tasks_common(request, build_id, 'diskio', '')
+
+def cpuusage(request, build_id):
+ return tasks_common(request, build_id, 'cpuusage', '')
+
+
+def recipes(request, build_id):
+ template = 'recipes.html'
+ (pagesize, orderby) = _get_parameters_values(request, 100, 'name:+')
+ mandatory_parameters = { 'count': pagesize, 'page' : 1, 'orderby' : orderby }
+ retval = _verify_parameters( request.GET, mandatory_parameters )
+ if retval:
+ return _redirect_parameters( 'recipes', request.GET, mandatory_parameters, build_id = build_id)
+ (filter_string, search_term, ordering_string) = _search_tuple(request, Recipe)
+ queryset = Recipe.objects.filter(layer_version__id__in=Layer_Version.objects.filter(build=build_id)).select_related("layer_version", "layer_version__layer")
+ queryset = _get_queryset(Recipe, queryset, filter_string, search_term, ordering_string, 'name')
+
+ recipes = _build_page_range(Paginator(queryset, pagesize),request.GET.get('page', 1))
+
+ # prefetch the forward and reverse recipe dependencies
+ deps = { }
+ revs = { }
+ queryset_dependency=Recipe_Dependency.objects.filter(recipe__layer_version__build_id = build_id).select_related("depends_on", "recipe")
+ for recipe in recipes:
+ deplist = [ ]
+ for recipe_dep in [x for x in queryset_dependency if x.recipe_id == recipe.id]:
+ deplist.append(recipe_dep)
+ deps[recipe.id] = deplist
+ revlist = [ ]
+ for recipe_dep in [x for x in queryset_dependency if x.depends_on_id == recipe.id]:
+ revlist.append(recipe_dep)
+ revs[recipe.id] = revlist
+
+ build = Build.objects.get(pk=build_id)
+
+ context = {
+ 'objectname': 'recipes',
+ 'build': build,
+ 'objects': recipes,
+ 'default_orderby' : 'name:+',
+ 'recipe_deps' : deps,
+ 'recipe_revs' : revs,
+ 'tablecols':[
+ {
+ 'name':'Recipe',
+ 'qhelp':'Information about a single piece of software, including where to download the source, configuration options, how to compile the source files and how to package the compiled output',
+ 'orderfield': _get_toggle_order(request, "name"),
+ 'ordericon':_get_toggle_order_icon(request, "name"),
+ },
+ {
+ 'name':'Recipe version',
+ 'qhelp':'The recipe version and revision',
+ },
+ {
+ 'name':'Dependencies',
+ 'qhelp':'Recipe build-time dependencies (i.e. other recipes)',
+ 'clclass': 'depends_on', 'hidden': 1,
+ },
+ {
+ 'name':'Reverse dependencies',
+ 'qhelp':'Recipe build-time reverse dependencies (i.e. the recipes that depend on this recipe)',
+ 'clclass': 'depends_by', 'hidden': 1,
+ },
+ {
+ 'name':'Recipe file',
+ 'qhelp':'Path to the recipe .bb file',
+ 'orderfield': _get_toggle_order(request, "file_path"),
+ 'ordericon':_get_toggle_order_icon(request, "file_path"),
+ 'orderkey' : 'file_path',
+ 'clclass': 'recipe_file', 'hidden': 0,
+ },
+ {
+ 'name':'Section',
+ 'qhelp':'The section in which recipes should be categorized',
+ 'orderfield': _get_toggle_order(request, "section"),
+ 'ordericon':_get_toggle_order_icon(request, "section"),
+ 'orderkey' : 'section',
+ 'clclass': 'recipe_section', 'hidden': 0,
+ },
+ {
+ 'name':'License',
+ 'qhelp':'The list of source licenses for the recipe. Multiple license names separated by the pipe character indicate a choice between licenses. Multiple license names separated by the ampersand character indicate that multiple licenses exist that cover different parts of the source',
+ 'orderfield': _get_toggle_order(request, "license"),
+ 'ordericon':_get_toggle_order_icon(request, "license"),
+ 'orderkey' : 'license',
+ 'clclass': 'recipe_license', 'hidden': 0,
+ },
+ {
+ 'name':'Layer',
+ 'qhelp':'The name of the layer providing the recipe',
+ 'orderfield': _get_toggle_order(request, "layer_version__layer__name"),
+ 'ordericon':_get_toggle_order_icon(request, "layer_version__layer__name"),
+ 'orderkey' : 'layer_version__layer__name',
+ 'clclass': 'layer_version__layer__name', 'hidden': 0,
+ },
+ {
+ 'name':'Layer branch',
+ 'qhelp':'The Git branch of the layer providing the recipe',
+ 'orderfield': _get_toggle_order(request, "layer_version__branch"),
+ 'ordericon':_get_toggle_order_icon(request, "layer_version__branch"),
+ 'orderkey' : 'layer_version__branch',
+ 'clclass': 'layer_version__branch', 'hidden': 1,
+ },
+ {
+ 'name':'Layer commit',
+ 'qhelp':'The Git commit of the layer providing the recipe',
+ 'clclass': 'layer_version__layer__commit', 'hidden': 1,
+ },
+ ]
+ }
+
+ response = render(request, template, context)
+ _set_parameters_values(pagesize, orderby, request)
+ return response
+
+def configuration(request, build_id):
+ template = 'configuration.html'
+
+ var_names = ('BB_VERSION', 'BUILD_SYS', 'NATIVELSBSTRING', 'TARGET_SYS',
+ 'MACHINE', 'DISTRO', 'DISTRO_VERSION', 'TUNE_FEATURES', 'TARGET_FPU')
+ context = dict(Variable.objects.filter(build=build_id, variable_name__in=var_names)\
+ .values_list('variable_name', 'variable_value'))
+ context.update({'objectname': 'configuration',
+ 'object_search_display':'variables',
+ 'filter_search_display':'variables',
+ 'build': Build.objects.get(pk=build_id),
+ 'targets': Target.objects.filter(build=build_id)})
+ return render(request, template, context)
+
+
+def configvars(request, build_id):
+ template = 'configvars.html'
+ (pagesize, orderby) = _get_parameters_values(request, 100, 'variable_name:+')
+ mandatory_parameters = { 'count': pagesize, 'page' : 1, 'orderby' : orderby, 'filter' : 'description__regex:.+' }
+ retval = _verify_parameters( request.GET, mandatory_parameters )
+ (filter_string, search_term, ordering_string) = _search_tuple(request, Variable)
+ if retval:
+ # if new search, clear the default filter
+ if search_term and len(search_term):
+ mandatory_parameters['filter']=''
+ return _redirect_parameters( 'configvars', request.GET, mandatory_parameters, build_id = build_id)
+
+ queryset = Variable.objects.filter(build=build_id).exclude(variable_name__istartswith='B_').exclude(variable_name__istartswith='do_')
+ queryset_with_search = _get_queryset(Variable, queryset, None, search_term, ordering_string, 'variable_name').exclude(variable_value='',vhistory__file_name__isnull=True)
+ queryset = _get_queryset(Variable, queryset, filter_string, search_term, ordering_string, 'variable_name')
+ # remove records where the value is empty AND there are no history files
+ queryset = queryset.exclude(variable_value='',vhistory__file_name__isnull=True)
+
+ variables = _build_page_range(Paginator(queryset, pagesize), request.GET.get('page', 1))
+
+ # show all matching files (not just the last one)
+ file_filter= search_term + ":"
+ if filter_string.find('/conf/') > 0:
+ file_filter += 'conf/(local|bblayers).conf'
+ if filter_string.find('conf/machine/') > 0:
+ file_filter += 'conf/machine/'
+ if filter_string.find('conf/distro/') > 0:
+ file_filter += 'conf/distro/'
+ if filter_string.find('/bitbake.conf') > 0:
+ file_filter += '/bitbake.conf'
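+ # illustrative example: with an empty search term and the local.conf
+ # filter active, file_filter becomes ':conf/(local|bblayers).conf';
+ # the string is passed to the template as 'file_filter' below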
+ build_dir=re.sub("/tmp/log/.*","",Build.objects.get(pk=build_id).cooker_log_path)
+
+ context = {
+ 'objectname': 'configvars',
+ 'object_search_display':'BitBake variables',
+ 'filter_search_display':'variables',
+ 'file_filter': file_filter,
+ 'build': Build.objects.get(pk=build_id),
+ 'objects' : variables,
+ 'total_count':queryset_with_search.count(),
+ 'default_orderby' : 'variable_name:+',
+ 'search_term':search_term,
+ # Specifies the display of columns for the table, appearance in "Edit columns" box, toggling default show/hide, and specifying filters for columns
+ 'tablecols' : [
+ {'name': 'Variable',
+ 'qhelp': "BitBake is a generic task executor that considers a list of tasks with dependencies and handles metadata that consists of variables in a certain format that get passed to the tasks",
+ 'orderfield': _get_toggle_order(request, "variable_name"),
+ 'ordericon':_get_toggle_order_icon(request, "variable_name"),
+ },
+ {'name': 'Value',
+ 'qhelp': "The value assigned to the variable",
+ 'dclass': "span4",
+ },
+ {'name': 'Set in file',
+ 'qhelp': "The last configuration file that touched the variable value",
+ 'clclass': 'file', 'hidden' : 0,
+ 'orderkey' : 'vhistory__file_name',
+ 'filter' : {
+ 'class' : 'vhistory__file_name',
+ 'label': 'Show:',
+ 'options' : [
+ ('Local configuration variables', 'vhistory__file_name__contains:'+build_dir+'/conf/',queryset_with_search.filter(vhistory__file_name__contains=build_dir+'/conf/').count(), 'Select this filter to see variables set by the <code>local.conf</code> and <code>bblayers.conf</code> configuration files inside the <code>/build/conf/</code> directory'),
+ ('Machine configuration variables', 'vhistory__file_name__contains:conf/machine/',queryset_with_search.filter(vhistory__file_name__contains='conf/machine').count(), 'Select this filter to see variables set by the configuration file(s) inside your layers <code>/conf/machine/</code> directory'),
+ ('Distro configuration variables', 'vhistory__file_name__contains:conf/distro/',queryset_with_search.filter(vhistory__file_name__contains='conf/distro').count(), 'Select this filter to see variables set by the configuration file(s) inside your layers <code>/conf/distro/</code> directory'),
+ ('Layer configuration variables', 'vhistory__file_name__contains:conf/layer.conf',queryset_with_search.filter(vhistory__file_name__contains='conf/layer.conf').count(), 'Select this filter to see variables set by the <code>layer.conf</code> configuration file inside your layers'),
+ ('bitbake.conf variables', 'vhistory__file_name__contains:/bitbake.conf',queryset_with_search.filter(vhistory__file_name__contains='/bitbake.conf').count(), 'Select this filter to see variables set by the <code>bitbake.conf</code> configuration file'),
+ ]
+ },
+ },
+ {'name': 'Description',
+ 'qhelp': "A brief explanation of the variable",
+ 'clclass': 'description', 'hidden' : 0,
+ 'dclass': "span4",
+ 'filter' : {
+ 'class' : 'description',
+ 'label': 'Show:',
+ 'options' : [
+ ('Variables with description', 'description__regex:.+', queryset_with_search.filter(description__regex='.+').count(), 'We provide descriptions for the most common BitBake variables. The list of descriptions lives in <code>meta/conf/documentation.conf</code>'),
+ ]
+ },
+ },
+ ],
+ }
+
+ response = render(request, template, context)
+ _set_parameters_values(pagesize, orderby, request)
+ return response
+
+def bpackage(request, build_id):
+ template = 'bpackage.html'
+ (pagesize, orderby) = _get_parameters_values(request, 100, 'name:+')
+ mandatory_parameters = { 'count' : pagesize, 'page' : 1, 'orderby' : orderby }
+ retval = _verify_parameters( request.GET, mandatory_parameters )
+ if retval:
+ return _redirect_parameters( 'packages', request.GET, mandatory_parameters, build_id = build_id)
+ (filter_string, search_term, ordering_string) = _search_tuple(request, Package)
+ queryset = Package.objects.filter(build = build_id).filter(size__gte=0)
+ queryset = _get_queryset(Package, queryset, filter_string, search_term, ordering_string, 'name')
+
+ packages = _build_page_range(Paginator(queryset, pagesize),request.GET.get('page', 1))
+
+ build = Build.objects.get( pk = build_id )
+
+ context = {
+ 'objectname': 'packages built',
+ 'build': build,
+ 'objects' : packages,
+ 'default_orderby' : 'name:+',
+ 'tablecols':[
+ {
+ 'name':'Package',
+ 'qhelp':'Packaged output resulting from building a recipe',
+ 'orderfield': _get_toggle_order(request, "name"),
+ 'ordericon':_get_toggle_order_icon(request, "name"),
+ },
+ {
+ 'name':'Package version',
+ 'qhelp':'The package version and revision',
+ },
+ {
+ 'name':'Size',
+ 'qhelp':'The size of the package',
+ 'orderfield': _get_toggle_order(request, "size", True),
+ 'ordericon':_get_toggle_order_icon(request, "size"),
+ 'orderkey' : 'size',
+ 'clclass': 'size', 'hidden': 0,
+ 'dclass' : 'span2',
+ },
+ {
+ 'name':'License',
+ 'qhelp':'The license under which the package is distributed. Multiple license names separated by the pipe character indicate a choice between licenses. Multiple license names separated by the ampersand character indicate that multiple licenses exist that cover different parts of the source',
+ 'orderfield': _get_toggle_order(request, "license"),
+ 'ordericon':_get_toggle_order_icon(request, "license"),
+ 'orderkey' : 'license',
+ 'clclass': 'license', 'hidden': 1,
+ },
+ {
+ 'name':'Recipe',
+ 'qhelp':'The name of the recipe building the package',
+ 'orderfield': _get_toggle_order(request, "recipe__name"),
+ 'ordericon':_get_toggle_order_icon(request, "recipe__name"),
+ 'orderkey' : 'recipe__name',
+ 'clclass': 'recipe__name', 'hidden': 0,
+ },
+ {
+ 'name':'Recipe version',
+ 'qhelp':'Version and revision of the recipe building the package',
+ 'clclass': 'recipe__version', 'hidden': 1,
+ },
+ {
+ 'name':'Layer',
+ 'qhelp':'The name of the layer providing the recipe that builds the package',
+ 'orderfield': _get_toggle_order(request, "recipe__layer_version__layer__name"),
+ 'ordericon':_get_toggle_order_icon(request, "recipe__layer_version__layer__name"),
+ 'orderkey' : 'recipe__layer_version__layer__name',
+ 'clclass': 'recipe__layer_version__layer__name', 'hidden': 1,
+ },
+ {
+ 'name':'Layer branch',
+ 'qhelp':'The Git branch of the layer providing the recipe that builds the package',
+ 'orderfield': _get_toggle_order(request, "recipe__layer_version__branch"),
+ 'ordericon':_get_toggle_order_icon(request, "recipe__layer_version__branch"),
+ 'orderkey' : 'recipe__layer_version__branch',
+ 'clclass': 'recipe__layer_version__branch', 'hidden': 1,
+ },
+ {
+ 'name':'Layer commit',
+ 'qhelp':'The Git commit of the layer providing the recipe that builds the package',
+ 'clclass': 'recipe__layer_version__layer__commit', 'hidden': 1,
+ },
+ ]
+ }
+
+ response = render(request, template, context)
+ _set_parameters_values(pagesize, orderby, request)
+ return response
+
+def bfile(request, build_id, package_id):
+ template = 'bfile.html'
+ files = Package_File.objects.filter(package = package_id)
+ context = {'build': Build.objects.get(pk=build_id), 'objects' : files}
+ return render(request, template, context)
+
+
+# A set of dependency types valid for both included and built package views
+OTHER_DEPENDS_BASE = [
+ Package_Dependency.TYPE_RSUGGESTS,
+ Package_Dependency.TYPE_RPROVIDES,
+ Package_Dependency.TYPE_RREPLACES,
+ Package_Dependency.TYPE_RCONFLICTS,
+ ]
+
+# value for invalid row id
+INVALID_KEY = -1
+
+"""
+Given a package id, target_id retrieves two sets of this image and package's
+dependencies. The return value is a dictionary consisting of two other
+lists: a list of 'runtime' dependencies, that is, having RDEPENDS
+values in source package's recipe, and a list of other dependencies, that is
+the list of possible recipe variables as found in OTHER_DEPENDS_BASE plus
+the RRECOMMENDS or TRECOMMENDS value.
+The lists are built in the sort order specified for the package runtime
+dependency views.
+"""
+def _get_package_dependencies(package_id, target_id = INVALID_KEY):
+ runtime_deps = []
+ other_deps = []
+ other_depends_types = OTHER_DEPENDS_BASE
+
+ if target_id != INVALID_KEY :
+ rdepends_type = Package_Dependency.TYPE_TRDEPENDS
+ other_depends_types += [Package_Dependency.TYPE_TRECOMMENDS]
+ else :
+ rdepends_type = Package_Dependency.TYPE_RDEPENDS
+ other_depends_types += [Package_Dependency.TYPE_RRECOMMENDS]
+
+ package = Package.objects.get(pk=package_id)
+ if target_id != INVALID_KEY :
+ alldeps = package.package_dependencies_source.filter(target_id__exact = target_id)
+ else :
+ alldeps = package.package_dependencies_source.all()
+ for idep in alldeps:
+ dep_package = Package.objects.get(pk=idep.depends_on_id)
+ dep_entry = Package_Dependency.DEPENDS_DICT[idep.dep_type]
+ if dep_package.version == '' :
+ version = ''
+ else :
+ version = dep_package.version + "-" + dep_package.revision
+ installed = False
+ if target_id != INVALID_KEY :
+ if Target_Installed_Package.objects.filter(target_id__exact = target_id, package_id__exact = dep_package.id).count() > 0:
+ installed = True
+ dep = {
+ 'name' : dep_package.name,
+ 'version' : version,
+ 'size' : dep_package.size,
+ 'dep_type' : idep.dep_type,
+ 'dep_type_display' : dep_entry[0].capitalize(),
+ 'dep_type_help' : dep_entry[1] % (dep_package.name, package.name),
+ 'depends_on_id' : dep_package.id,
+ 'installed' : installed,
+ }
+
+ if target_id != INVALID_KEY:
+ dep['alias'] = _get_package_alias(dep_package)
+
+ if idep.dep_type == rdepends_type :
+ runtime_deps.append(dep)
+ elif idep.dep_type in other_depends_types :
+ other_deps.append(dep)
+
+ rdep_sorted = sorted(runtime_deps, key=lambda k: k['name'])
+ odep_sorted = sorted(
+ sorted(other_deps, key=lambda k: k['name']),
+ key=lambda k: k['dep_type'])
+ retvalues = {'runtime_deps' : rdep_sorted, 'other_deps' : odep_sorted}
+ return retvalues
+
+# Return the count of packages dependent on package for this target_id image
+def _get_package_reverse_dep_count(package, target_id):
+ return package.package_dependencies_target.filter(target_id__exact=target_id, dep_type__exact = Package_Dependency.TYPE_TRDEPENDS).count()
+
+# Return the count of the packages that this package_id is dependent on.
+# Use one of the two RDEPENDS types, either TRDEPENDS if the package was
+# installed, or else RDEPENDS if only built.
+def _get_package_dependency_count(package, target_id, is_installed):
+ if is_installed :
+ return package.package_dependencies_source.filter(target_id__exact = target_id,
+ dep_type__exact = Package_Dependency.TYPE_TRDEPENDS).count()
+ else :
+ return package.package_dependencies_source.filter(dep_type__exact = Package_Dependency.TYPE_RDEPENDS).count()
+
+def _get_package_alias(package):
+ alias = package.installed_name
+ if alias != None and alias != '' and alias != package.name:
+ return alias
+ else:
+ return ''
+
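+# Build the full package specification string from name, version and revision,
+# e.g. (illustrative values) 'busybox', '1.23.2', 'r0' give 'busybox_1.23.2-r0'.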
+def _get_fullpackagespec(package):
+ r = package.name
+ version_good = package.version != None and package.version != ''
+ revision_good = package.revision != None and package.revision != ''
+ if version_good or revision_good:
+ r += '_'
+ if version_good:
+ r += package.version
+ if revision_good:
+ r += '-'
+ if revision_good:
+ r += package.revision
+ return r
+
+def package_built_detail(request, build_id, package_id):
+ template = "package_built_detail.html"
+ if Build.objects.filter(pk=build_id).count() == 0 :
+ return redirect(builds)
+
+ # follow convention for pagination w/ search although not used for this view
+ queryset = Package_File.objects.filter(package_id__exact=package_id)
+ (pagesize, orderby) = _get_parameters_values(request, 25, 'path:+')
+ mandatory_parameters = { 'count': pagesize, 'page' : 1, 'orderby' : orderby }
+ retval = _verify_parameters( request.GET, mandatory_parameters )
+ if retval:
+ return _redirect_parameters( 'package_built_detail', request.GET, mandatory_parameters, build_id = build_id, package_id = package_id)
+
+ (filter_string, search_term, ordering_string) = _search_tuple(request, Package_File)
+ paths = _get_queryset(Package_File, queryset, filter_string, search_term, ordering_string, 'path')
+
+ package = Package.objects.get(pk=package_id)
+ package.fullpackagespec = _get_fullpackagespec(package)
+ context = {
+ 'build' : Build.objects.get(pk=build_id),
+ 'package' : package,
+ 'dependency_count' : _get_package_dependency_count(package, -1, False),
+ 'objects' : paths,
+ 'tablecols':[
+ {
+ 'name':'File',
+ 'orderfield': _get_toggle_order(request, "path"),
+ 'ordericon':_get_toggle_order_icon(request, "path"),
+ },
+ {
+ 'name':'Size',
+ 'orderfield': _get_toggle_order(request, "size", True),
+ 'ordericon':_get_toggle_order_icon(request, "size"),
+ 'dclass': 'sizecol span2',
+ },
+ ]
+ }
+ if paths.all().count() < 2:
+ context['disable_sort'] = True
+
+ response = render(request, template, context)
+ _set_parameters_values(pagesize, orderby, request)
+ return response
+
+def package_built_dependencies(request, build_id, package_id):
+ template = "package_built_dependencies.html"
+ if Build.objects.filter(pk=build_id).count() == 0 :
+ return redirect(builds)
+
+ package = Package.objects.get(pk=package_id)
+ package.fullpackagespec = _get_fullpackagespec(package)
+ dependencies = _get_package_dependencies(package_id)
+ context = {
+ 'build' : Build.objects.get(pk=build_id),
+ 'package' : package,
+ 'runtime_deps' : dependencies['runtime_deps'],
+ 'other_deps' : dependencies['other_deps'],
+ 'dependency_count' : _get_package_dependency_count(package, -1, False)
+ }
+ return render(request, template, context)
+
+
+def package_included_detail(request, build_id, target_id, package_id):
+ template = "package_included_detail.html"
+ if Build.objects.filter(pk=build_id).count() == 0 :
+ return redirect(builds)
+
+ # follow convention for pagination w/ search although not used for this view
+ (pagesize, orderby) = _get_parameters_values(request, 25, 'path:+')
+ mandatory_parameters = { 'count': pagesize, 'page' : 1, 'orderby' : orderby }
+ retval = _verify_parameters( request.GET, mandatory_parameters )
+ if retval:
+ return _redirect_parameters( 'package_included_detail', request.GET, mandatory_parameters, build_id = build_id, target_id = target_id, package_id = package_id)
+ (filter_string, search_term, ordering_string) = _search_tuple(request, Package_File)
+
+ queryset = Package_File.objects.filter(package_id__exact=package_id)
+ paths = _get_queryset(Package_File, queryset, filter_string, search_term, ordering_string, 'path')
+
+ package = Package.objects.get(pk=package_id)
+ package.fullpackagespec = _get_fullpackagespec(package)
+ package.alias = _get_package_alias(package)
+ target = Target.objects.get(pk=target_id)
+ context = {
+ 'build' : Build.objects.get(pk=build_id),
+ 'target' : target,
+ 'package' : package,
+ 'reverse_count' : _get_package_reverse_dep_count(package, target_id),
+ 'dependency_count' : _get_package_dependency_count(package, target_id, True),
+ 'objects': paths,
+ 'tablecols':[
+ {
+ 'name':'File',
+ 'orderfield': _get_toggle_order(request, "path"),
+ 'ordericon':_get_toggle_order_icon(request, "path"),
+ },
+ {
+ 'name':'Size',
+ 'orderfield': _get_toggle_order(request, "size", True),
+ 'ordericon':_get_toggle_order_icon(request, "size"),
+ 'dclass': 'sizecol span2',
+ },
+ ]
+ }
+ if paths.all().count() < 2:
+ context['disable_sort'] = True
+ response = render(request, template, context)
+ _set_parameters_values(pagesize, orderby, request)
+ return response
+
+def package_included_dependencies(request, build_id, target_id, package_id):
+ template = "package_included_dependencies.html"
+ if Build.objects.filter(pk=build_id).count() == 0 :
+ return redirect(builds)
+
+ package = Package.objects.get(pk=package_id)
+ package.fullpackagespec = _get_fullpackagespec(package)
+ package.alias = _get_package_alias(package)
+ target = Target.objects.get(pk=target_id)
+
+ dependencies = _get_package_dependencies(package_id, target_id)
+ context = {
+ 'build' : Build.objects.get(pk=build_id),
+ 'package' : package,
+ 'target' : target,
+ 'runtime_deps' : dependencies['runtime_deps'],
+ 'other_deps' : dependencies['other_deps'],
+ 'reverse_count' : _get_package_reverse_dep_count(package, target_id),
+ 'dependency_count' : _get_package_dependency_count(package, target_id, True)
+ }
+ return render(request, template, context)
+
+def package_included_reverse_dependencies(request, build_id, target_id, package_id):
+ template = "package_included_reverse_dependencies.html"
+ if Build.objects.filter(pk=build_id).count() == 0 :
+ return redirect(builds)
+
+ (pagesize, orderby) = _get_parameters_values(request, 25, 'package__name:+')
+ mandatory_parameters = { 'count': pagesize, 'page' : 1, 'orderby': orderby }
+ retval = _verify_parameters( request.GET, mandatory_parameters )
+ if retval:
+ return _redirect_parameters( 'package_included_reverse_dependencies', request.GET, mandatory_parameters, build_id = build_id, target_id = target_id, package_id = package_id)
+ (filter_string, search_term, ordering_string) = _search_tuple(request, Package_File)
+
+ queryset = Package_Dependency.objects.select_related('depends_on__name', 'depends_on__size').filter(depends_on=package_id, target_id=target_id, dep_type=Package_Dependency.TYPE_TRDEPENDS)
+ objects = _get_queryset(Package_Dependency, queryset, filter_string, search_term, ordering_string, 'package__name')
+
+ package = Package.objects.get(pk=package_id)
+ package.fullpackagespec = _get_fullpackagespec(package)
+ package.alias = _get_package_alias(package)
+ target = Target.objects.get(pk=target_id)
+ for o in objects:
+ if o.package.version != '':
+ o.package.version += '-' + o.package.revision
+ o.alias = _get_package_alias(o.package)
+ context = {
+ 'build' : Build.objects.get(pk=build_id),
+ 'package' : package,
+ 'target' : target,
+ 'objects' : objects,
+ 'reverse_count' : _get_package_reverse_dep_count(package, target_id),
+ 'dependency_count' : _get_package_dependency_count(package, target_id, True),
+ 'tablecols':[
+ {
+ 'name':'Package',
+ 'orderfield': _get_toggle_order(request, "package__name"),
+ 'ordericon': _get_toggle_order_icon(request, "package__name"),
+ },
+ {
+ 'name':'Version',
+ },
+ {
+ 'name':'Size',
+ 'orderfield': _get_toggle_order(request, "package__size", True),
+ 'ordericon': _get_toggle_order_icon(request, "package__size"),
+ 'dclass': 'sizecol span2',
+ },
+ ]
+ }
+ if objects.all().count() < 2:
+ context['disable_sort'] = True
+ response = render(request, template, context)
+ _set_parameters_values(pagesize, orderby, request)
+ return response
+
+def image_information_dir(request, build_id, target_id, packagefile_id):
+ # stubbed for now
+ return redirect(builds)
+ # the context processor that supplies data used across all the pages
+
+
+def managedcontextprocessor(request):
+ ret = {
+ "projects": Project.objects.all(),
+ "DEBUG" : toastermain.settings.DEBUG,
+ "TOASTER_BRANCH": toastermain.settings.TOASTER_BRANCH,
+ "TOASTER_REVISION" : toastermain.settings.TOASTER_REVISION,
+ }
+ return ret
+
+
+
+import toastermain.settings
+
+from orm.models import Project, ProjectLayer, ProjectTarget, ProjectVariable
+
+# we have a set of functions if we're in managed mode, or
+# a default "page not available" simple functions for interactive mode
+
+if True:
+ from django.contrib.auth.models import User
+ from django.contrib.auth import authenticate, login
+ from django.contrib.auth.decorators import login_required
+
+ from orm.models import Branch, LayerSource, ToasterSetting, Release, Machine, LayerVersionDependency
+ from bldcontrol.models import BuildRequest
+
+ import traceback
+
+ class BadParameterException(Exception):
+ ''' The exception raised on invalid POST requests '''
+ pass
+
+ # shows the "all builds" page for managed mode; it displays build requests (at least started!) instead of actual builds
+ @_template_renderer("builds.html")
+ def builds(request):
+ # define here what parameters the view needs in the GET portion in order to
+ # be able to display something. 'count' and 'page' are mandatory for all views
+ # that use paginators.
+
+ queryset = Build.objects.exclude(outcome = Build.IN_PROGRESS)
+
+ try:
+ context, pagesize, orderby = _build_list_helper(request, queryset)
+ # the all builds page has a Project column
+ context['tablecols'].append({'name': 'Project', 'clclass': 'project_column', })
+ except RedirectException as re:
+ # rewrite the RedirectException
+ re.view = resolve(request.path_info).url_name
+ raise re
+
+ _set_parameters_values(pagesize, orderby, request)
+ return context
+
+
+ # helper function, to be used on "all builds" and "project builds" pages
+ def _build_list_helper(request, queryset_all):
+
+ default_orderby = 'completed_on:-'
+ (pagesize, orderby) = _get_parameters_values(request, 10, default_orderby)
+ mandatory_parameters = { 'count': pagesize, 'page' : 1, 'orderby' : orderby }
+ retval = _verify_parameters( request.GET, mandatory_parameters )
+ if retval:
+ raise RedirectException( None, request.GET, mandatory_parameters)
+
+ # boilerplate code that takes a request for an object type and returns a queryset
+ # for that object type. copypasta for all needed table searches
+ (filter_string, search_term, ordering_string) = _search_tuple(request, Build)
+ # post-process any date range filters
+ filter_string,daterange_selected = _modify_date_range_filter(filter_string)
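+ # annotate each build with its error and warning counts and with the
+ # time spent (completed_on - started_on), all computed in the database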
+ queryset_all = queryset_all.select_related("project").annotate(errors_no = Count('logmessage', only=Q(logmessage__level=LogMessage.ERROR)|Q(logmessage__level=LogMessage.EXCEPTION))).annotate(warnings_no = Count('logmessage', only=Q(logmessage__level=LogMessage.WARNING))).extra(select={'timespent':'completed_on - started_on'})
+ queryset_with_search = _get_queryset(Build, queryset_all, None, search_term, ordering_string, '-completed_on')
+ queryset = _get_queryset(Build, queryset_all, filter_string, search_term, ordering_string, '-completed_on')
+
+ # retrieve the objects that will be displayed in the table; builds a paginator and gets a page range to display
+ build_info = _build_page_range(Paginator(queryset, pagesize), request.GET.get('page', 1))
+
+ # build view-specific information; this is rendered specifically in the builds page, at the top of the page (i.e. Recent builds)
+ build_mru = Build.objects.order_by("-started_on")[:3]
+
+ # calculate the exact beginning of local today and yesterday, append to the context
+ context_date,today_begin,yesterday_begin = _add_daterange_context(queryset_all, request, {'started_on','completed_on'})
+
+ # set up list of fstypes for each build
+ fstypes_map = {};
+ for build in build_info:
+ targets = Target.objects.filter( build_id = build.id )
+ comma = "";
+ extensions = "";
+ for t in targets:
+ if ( not t.is_image ):
+ continue
+ tif = Target_Image_File.objects.filter( target_id = t.id )
+ for i in tif:
+ s=re.sub('.*tar.bz2', 'tar.bz2', i.file_name)
+ if s == i.file_name:
+ s=re.sub('.*\.', '', i.file_name)
+ if None == re.search(s,extensions):
+ extensions += comma + s
+ comma = ", "
+ fstypes_map[build.id]=extensions
+
+ # send the data to the template
+ context = {
+ # specific info for the builds page: the most recently started builds ('mru')
+ 'mru' : build_mru,
+ # TODO: common objects for all table views, adapt as needed
+ 'objects' : build_info,
+ 'objectname' : "builds",
+ 'default_orderby' : default_orderby,
+ 'fstypes' : fstypes_map,
+ 'search_term' : search_term,
+ 'total_count' : queryset_with_search.count(),
+ 'daterange_selected' : daterange_selected,
+ # Specifies the display of columns for the table, appearance in "Edit columns" box, toggling default show/hide, and specifying filters for columns
+ 'tablecols' : [
+ {'name': 'Outcome', # column with a single filter
+ 'qhelp' : "The outcome tells you if a build successfully completed or failed", # the help button content
+ 'dclass' : "span2", # indication about column width; comes from the design
+ 'orderfield': _get_toggle_order(request, "outcome"), # adds ordering by the field value; default ascending unless clicked from ascending into descending
+ 'ordericon':_get_toggle_order_icon(request, "outcome"),
+ # filter field will set a filter on that column with the specs in the filter description
+ # the class field in the filter has no relation with clclass; they control different aspects of the UI
+ # still, it is recommended for the values to be identical for easy tracking in the generated HTML
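+ # each entry in 'options' is a tuple: (label shown in the UI,
+ # 'field:value' filter expression, row count for that filter
+ # [, optional tooltip text, optional extra data for date-range filters])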
+ 'filter' : {'class' : 'outcome',
+ 'label': 'Show:',
+ 'options' : [
+ ('Successful builds', 'outcome:' + str(Build.SUCCEEDED), queryset_with_search.filter(outcome=str(Build.SUCCEEDED)).count()), # this is the field search expression
+ ('Failed builds', 'outcome:'+ str(Build.FAILED), queryset_with_search.filter(outcome=str(Build.FAILED)).count()),
+ ]
+ }
+ },
+ {'name': 'Recipe', # default column, disabled box, with just the name in the list
+ 'qhelp': "What you built (i.e. one or more recipes or image recipes)",
+ 'orderfield': _get_toggle_order(request, "target__target"),
+ 'ordericon':_get_toggle_order_icon(request, "target__target"),
+ },
+ {'name': 'Machine',
+ 'qhelp': "The machine is the hardware for which you are building a recipe or image recipe",
+ 'orderfield': _get_toggle_order(request, "machine"),
+ 'ordericon':_get_toggle_order_icon(request, "machine"),
+ 'dclass': 'span3'
+ }, # a slightly wider column
+ {'name': 'Started on', 'clclass': 'started_on', 'hidden' : 1, # this is an unchecked box, which hides the column
+ 'qhelp': "The date and time you started the build",
+ 'orderfield': _get_toggle_order(request, "started_on", True),
+ 'ordericon':_get_toggle_order_icon(request, "started_on"),
+ 'orderkey' : "started_on",
+ 'filter' : {'class' : 'started_on',
+ 'label': 'Show:',
+ 'options' : [
+ ("Today's builds" , 'started_on__gte:'+today_begin.strftime("%Y-%m-%d"), queryset_all.filter(started_on__gte=today_begin).count()),
+ ("Yesterday's builds",
+ 'started_on__gte!started_on__lt:'
+ +yesterday_begin.strftime("%Y-%m-%d")+'!'
+ +today_begin.strftime("%Y-%m-%d"),
+ queryset_all.filter(
+ started_on__gte=yesterday_begin,
+ started_on__lt=today_begin
+ ).count()),
+ ("Build date range", 'daterange', 1, '', 'started_on'),
+ ]
+ }
+ },
+ {'name': 'Completed on',
+ 'qhelp': "The date and time the build finished",
+ 'orderfield': _get_toggle_order(request, "completed_on", True),
+ 'ordericon':_get_toggle_order_icon(request, "completed_on"),
+ 'orderkey' : 'completed_on',
+ 'filter' : {'class' : 'completed_on',
+ 'label': 'Show:',
+ 'options' : [
+ ("Today's builds" , 'completed_on__gte:'+today_begin.strftime("%Y-%m-%d"), queryset_all.filter(completed_on__gte=today_begin).count()),
+ ("Yesterday's builds",
+ 'completed_on__gte!completed_on__lt:'
+ +yesterday_begin.strftime("%Y-%m-%d")+'!'
+ +today_begin.strftime("%Y-%m-%d"),
+ queryset_all.filter(
+ completed_on__gte=yesterday_begin,
+ completed_on__lt=today_begin
+ ).count()),
+ ("Build date range", 'daterange', 1, '', 'completed_on'),
+ ]
+ }
+ },
+ {'name': 'Failed tasks', 'clclass': 'failed_tasks', # specifying a clclass will enable the checkbox
+ 'qhelp': "How many tasks failed during the build",
+ 'filter' : {'class' : 'failed_tasks',
+ 'label': 'Show:',
+ 'options' : [
+ ('Builds with failed tasks', 'task_build__outcome:4', queryset_with_search.filter(task_build__outcome=4).count()),
+ ('Builds without failed tasks', 'task_build__outcome:NOT4', queryset_with_search.filter(~Q(task_build__outcome=4)).count()),
+ ]
+ }
+ },
+ {'name': 'Errors', 'clclass': 'errors_no',
+ 'qhelp': "How many errors were encountered during the build (if any)",
+ 'orderfield': _get_toggle_order(request, "errors_no", True),
+ 'ordericon':_get_toggle_order_icon(request, "errors_no"),
+ 'orderkey' : 'errors_no',
+ 'filter' : {'class' : 'errors_no',
+ 'label': 'Show:',
+ 'options' : [
+ ('Builds with errors', 'errors_no__gte:1', queryset_with_search.filter(errors_no__gte=1).count()),
+ ('Builds without errors', 'errors_no:0', queryset_with_search.filter(errors_no=0).count()),
+ ]
+ }
+ },
+ {'name': 'Warnings', 'clclass': 'warnings_no',
+ 'qhelp': "How many warnings were encountered during the build (if any)",
+ 'orderfield': _get_toggle_order(request, "warnings_no", True),
+ 'ordericon':_get_toggle_order_icon(request, "warnings_no"),
+ 'orderkey' : 'warnings_no',
+ 'filter' : {'class' : 'warnings_no',
+ 'label': 'Show:',
+ 'options' : [
+ ('Builds with warnings','warnings_no__gte:1', queryset_with_search.filter(warnings_no__gte=1).count()),
+ ('Builds without warnings','warnings_no:0', queryset_with_search.filter(warnings_no=0).count()),
+ ]
+ }
+ },
+ {'name': 'Time', 'clclass': 'time', 'hidden' : 1,
+ 'qhelp': "How long it took the build to finish",
+ 'orderfield': _get_toggle_order(request, "timespent", True),
+ 'ordericon':_get_toggle_order_icon(request, "timespent"),
+ 'orderkey' : 'timespent',
+ },
+ {'name': 'Image files', 'clclass': 'output',
+ 'qhelp': "The root file system types produced by the build. You can find them in your <code>/build/tmp/deploy/images/</code> directory",
+ # TODO: compute image fstypes from Target_Image_File
+ }
+ ]
+ }
+
+ # merge daterange values
+ context.update(context_date)
+ return context, pagesize, orderby
+
+
+
+ # new project
+ def newproject(request):
+ template = "newproject.html"
+ context = {
+ 'email': request.user.email if request.user.is_authenticated() else '',
+ 'username': request.user.username if request.user.is_authenticated() else '',
+ 'releases': Release.objects.order_by("description"),
+ }
+
+ try:
+ context['defaultbranch'] = ToasterSetting.objects.get(name = "DEFAULT_RELEASE").value
+ except ToasterSetting.DoesNotExist:
+ pass
+
+ if request.method == "GET":
+ # render new project page
+ return render(request, template, context)
+ elif request.method == "POST":
+ mandatory_fields = ['projectname', 'ptype']
+ try:
+ ptype = request.POST.get('ptype')
+ if ptype == "build":
+ mandatory_fields.append('projectversion')
+ # make sure we have values for all mandatory_fields
+ if reduce( lambda x, y: x or y, map(lambda x: len(request.POST.get(x, '')) == 0, mandatory_fields)):
+ # set alert for missing fields
+ raise BadParameterException("Fields missing: " +
+ ", ".join([x for x in mandatory_fields if len(request.POST.get(x, '')) == 0 ]))
+
+ if not request.user.is_authenticated():
+ user = authenticate(username = request.POST.get('username', '_anonuser'), password = 'nopass')
+ if user is None:
+ user = User.objects.create_user(username = request.POST.get('username', '_anonuser'), email = request.POST.get('email', ''), password = "nopass")
+
+ user = authenticate(username = user.username, password = 'nopass')
+ login(request, user)
+
+ # save the project
+ if ptype == "analysis":
+ release = None
+ else:
+ release = Release.objects.get(pk = request.POST.get('projectversion', None ))
+
+ prj = Project.objects.create_project(name = request.POST['projectname'], release = release)
+ prj.user_id = request.user.pk
+ prj.save()
+ return redirect(reverse(project, args=(prj.pk,)) + "?notify=new-project")
+
+ except (IntegrityError, BadParameterException) as e:
+ # fill in page with previously submitted values
+ map(lambda x: context.__setitem__(x, request.POST.get(x, "-- missing")), mandatory_fields)
+ if isinstance(e, IntegrityError) and "username" in str(e):
+ context['alert'] = "Your chosen username is already used"
+ else:
+ context['alert'] = str(e)
+ return render(request, template, context)
+
+ raise Exception("Invalid HTTP method for this page")
+
+
+
+ # Shows the edit project page
+ @_template_renderer('project.html')
+ def project(request, pid):
+ prj = Project.objects.get(id = pid)
+
+ try:
+ puser = User.objects.get(id = prj.user_id)
+ except User.DoesNotExist:
+ puser = None
+
+ # execute POST requests
+ if request.method == "POST":
+ # add layers
+ if 'layerAdd' in request.POST and len(request.POST['layerAdd']) > 0:
+ for lc in Layer_Version.objects.filter(pk__in=[i for i in request.POST['layerAdd'].split(",") if len(i) > 0]):
+ ProjectLayer.objects.get_or_create(project = prj, layercommit = lc)
+
+ # remove layers
+ if 'layerDel' in request.POST and len(request.POST['layerDel']) > 0:
+ for t in request.POST['layerDel'].strip().split(" "):
+ pt = ProjectLayer.objects.filter(project = prj, layercommit_id = int(t)).delete()
+
+ if 'projectName' in request.POST:
+ prj.name = request.POST['projectName']
+ prj.save();
+
+ if 'projectVersion' in request.POST:
+ # If the release is the current project then return now
+ if prj.release.pk == int(request.POST.get('projectVersion',-1)):
+ return {}
+
+ prj.release = Release.objects.get(pk = request.POST['projectVersion'])
+ # we need to change the bitbake version
+ prj.bitbake_version = prj.release.bitbake_version
+ prj.save()
+ # we need to change the layers
+ for i in prj.projectlayer_set.all():
+ # find and add a similarly-named layer on the new branch
+ try:
+ lv = prj.compatible_layerversions(layer_name = i.layercommit.layer.name)[0]
+ ProjectLayer.objects.get_or_create(project = prj, layercommit = lv)
+ except IndexError:
+ pass
+ finally:
+ # get rid of the old entry
+ i.delete()
+
+ if 'machineName' in request.POST:
+ machinevar = prj.projectvariable_set.get(name="MACHINE")
+ machinevar.value=request.POST['machineName']
+ machinevar.save()
+
+
+ # we use implicit knowledge of the current user's project to filter layer information, e.g.
+ pid = prj.id
+
+ from collections import Counter
+ freqtargets = []
+ try:
+ freqtargets += map(lambda x: x.target, reduce(lambda x, y: x + y, map(lambda x: list(x.target_set.all()), Build.objects.filter(project = prj, outcome__lt = Build.IN_PROGRESS))))
+ freqtargets += map(lambda x: x.target, reduce(lambda x, y: x + y, map(lambda x: list(x.brtarget_set.all()), BuildRequest.objects.filter(project = prj, state = BuildRequest.REQ_FAILED))))
+ except TypeError:
+ pass
+ freqtargets = Counter(freqtargets)
+ freqtargets = sorted(freqtargets, key = lambda x: freqtargets[x], reverse=True)
+
+ context = {
+ "project" : prj,
+ "lvs_nos" : Layer_Version.objects.all().count(),
+ "completedbuilds": Build.objects.filter(project_id = pid).filter(outcome__lte = Build.IN_PROGRESS),
+ "prj" : {"name": prj.name, },
+ "buildrequests" : prj.build_set.filter(outcome=Build.IN_PROGRESS),
+ "builds" : _project_recent_build_list(prj),
+ "layers" : map(lambda x: {
+ "id": x.layercommit.pk,
+ "orderid": x.pk,
+ "name" : x.layercommit.layer.name,
+ "vcs_url": x.layercommit.layer.vcs_url,
+ "vcs_reference" : x.layercommit.get_vcs_reference(),
+ "url": x.layercommit.layer.layer_index_url,
+ "layerdetailurl": x.layercommit.get_detailspage_url(prj.pk),
+ # This branch name is actually the release
+ "branch" : { "name" : x.layercommit.get_vcs_reference(), "layersource" : x.layercommit.up_branch.layer_source.name if x.layercommit.up_branch != None else None}},
+ prj.projectlayer_set.all().order_by("id")),
+ "targets" : map(lambda x: {"target" : x.target, "task" : x.task, "pk": x.pk}, prj.projecttarget_set.all()),
+ "variables": map(lambda x: (x.name, x.value), prj.projectvariable_set.all()),
+ "freqtargets": freqtargets[:5],
+ "releases": map(lambda x: {"id": x.pk, "name": x.name, "description":x.description}, Release.objects.all()),
+ "project_html": 1,
+ "recipesTypeAheadUrl": reverse('xhr_recipestypeahead', args=(prj.pk,)),
+ "projectBuildsUrl": reverse('projectbuilds', args=(prj.pk,)),
+ }
+
+ if prj.release is not None:
+ context['release'] = { "id": prj.release.pk, "name": prj.release.name, "description": prj.release.description}
+
+
+ try:
+ context["machine"] = {"name": prj.projectvariable_set.get(name="MACHINE").value}
+ except ProjectVariable.DoesNotExist:
+ context["machine"] = None
+ try:
+ context["distro"] = prj.projectvariable_set.get(name="DISTRO").value
+ except ProjectVariable.DoesNotExist:
+ context["distro"] = "-- not set yet"
+
+ return context
+
+ def jsunittests(request):
+ """ Provides a page for the js unit tests """
+ bbv = BitbakeVersion.objects.filter(branch="master").first()
+ release = Release.objects.filter(bitbake_version=bbv).first()
+
+ name = "_js_unit_test_prj_"
+
+ # If there is an existing project by this name, delete it. We don't want
+ # lots of duplicates cluttering up the projects.
+ Project.objects.filter(name=name).delete()
+
+ new_project = Project.objects.create_project(name=name, release=release)
+
+ context = { 'project' : new_project }
+ return render(request, "js-unit-tests.html", context)
+
+ from django.views.decorators.csrf import csrf_exempt
+ @csrf_exempt
+ def xhr_testreleasechange(request, pid):
+ """ returns layer versions that would be deleted on the new
+ release__pk """
+ def response(data):
+ return HttpResponse(jsonfilter(data),
+ content_type="application/json")
+
+ try:
+ prj = Project.objects.get(pk = pid)
+ new_release_id = request.GET['new_release_id']
+
+ # If we're already on this project do nothing
+ if prj.release.pk == int(new_release_id):
+ return reponse({"error": "ok", "rows": []})
+
+ retval = []
+
+ for i in prj.projectlayer_set.all():
+ lv = prj.compatible_layerversions(release = Release.objects.get(pk=new_release_id)).filter(layer__name = i.layercommit.layer.name)
+ # there is no layer_version with the new release id,
+ # and the same name
+ if lv.count() < 1:
+ retval.append(i)
+
+ return response({"error":"ok",
+ "rows" : map( _lv_to_dict(prj),
+ map(lambda x: x.layercommit, retval ))
+ })
+
+ except Exception as e:
+ return response({"error": str(e) })
+
+ def xhr_configvaredit(request, pid):
+ try:
+ prj = Project.objects.get(id = pid)
+ # add conf variables
+ if 'configvarAdd' in request.POST:
+ t=request.POST['configvarAdd'].strip()
+ if ":" in t:
+ variable, value = t.split(":")
+ else:
+ variable = t
+ value = ""
+
+ pt, created = ProjectVariable.objects.get_or_create(project = prj, name = variable, value = value)
+ # change conf variables
+ if 'configvarChange' in request.POST:
+ t=request.POST['configvarChange'].strip()
+ if ":" in t:
+ variable, value = t.split(":")
+ else:
+ variable = t
+ value = ""
+
+ pt, created = ProjectVariable.objects.get_or_create(project = prj, name = variable)
+ pt.value=value
+ pt.save()
+ # remove conf variables
+ if 'configvarDel' in request.POST:
+ t=request.POST['configvarDel'].strip()
+ pt = ProjectVariable.objects.get(pk = int(t)).delete()
+
+ # return all project settings, filter out blacklist and elsewhere-managed variables
+ vars_managed,vars_fstypes,vars_blacklist = get_project_configvars_context()
+ configvars_query = ProjectVariable.objects.filter(project_id = pid).all()
+ for var in vars_managed:
+ configvars_query = configvars_query.exclude(name = var)
+ for var in vars_blacklist:
+ configvars_query = configvars_query.exclude(name = var)
+
+ return_data = {
+ "error": "ok",
+ 'configvars' : map(lambda x: (x.name, x.value, x.pk), configvars_query),
+ }
+ try:
+ return_data['distro'] = ProjectVariable.objects.get(project = prj, name = "DISTRO").value
+ except ProjectVariable.DoesNotExist:
+ pass
+ try:
+ return_data['fstypes'] = ProjectVariable.objects.get(project = prj, name = "IMAGE_FSTYPES").value
+ except ProjectVariable.DoesNotExist:
+ pass
+ try:
+ return_data['image_install_append'] = ProjectVariable.objects.get(project = prj, name = "IMAGE_INSTALL_append").value
+ except ProjectVariable.DoesNotExist:
+ pass
+ try:
+ return_data['package_classes'] = ProjectVariable.objects.get(project = prj, name = "PACKAGE_CLASSES").value
+ except ProjectVariable.DoesNotExist:
+ pass
+ try:
+ return_data['sdk_machine'] = ProjectVariable.objects.get(project = prj, name = "SDKMACHINE").value
+ except ProjectVariable.DoesNotExist:
+ pass
+
+ return HttpResponse(json.dumps( return_data ), content_type = "application/json")
+
+ except Exception as e:
+ return HttpResponse(json.dumps({"error":str(e) + "\n" + traceback.format_exc()}), content_type = "application/json")
+
+
+ def xhr_importlayer(request):
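+ """ Import an external layer into a project.
+
+ Required POST parameters: vcs_url, name, git_ref and project_id;
+ dir_path is used when creating the layer version, and layer_deps
+ (a comma-separated list of Layer_Version ids) adds dependencies. """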
+ if (not request.POST.has_key('vcs_url') or
+ not request.POST.has_key('name') or
+ not request.POST.has_key('git_ref') or
+ not request.POST.has_key('project_id')):
+ return HttpResponse(jsonfilter({"error": "Missing parameters; requires vcs_url, name, git_ref and project_id"}), content_type = "application/json")
+
+ layers_added = []
+
+ # Rudimentary check for any possible html tags in the submitted values
+ for val in request.POST.values():
+ if "<" in val:
+ return HttpResponse(jsonfilter({"error": "Invalid character <"}), content_type = "application/json")
+
+ prj = Project.objects.get(pk=request.POST['project_id'])
+
+ # Strip trailing/leading whitespace from all values
+ # put them into a new dict because the POST QueryDict is immutable
+ post_data = dict()
+ for key,val in request.POST.iteritems():
+ post_data[key] = val.strip()
+
+
+ # We need to know what release the current project is so that we
+ # can set the imported layer's up_branch_id
+ prj_branch_name = Release.objects.get(pk=prj.release_id).branch_name
+ up_branch, branch_created = Branch.objects.get_or_create(name=prj_branch_name, layer_source_id=LayerSource.TYPE_IMPORTED)
+
+ layer_source = LayerSource.objects.get(sourcetype=LayerSource.TYPE_IMPORTED)
+ try:
+ layer, layer_created = Layer.objects.get_or_create(name=post_data['name'])
+ except MultipleObjectsReturned:
+ return HttpResponse(jsonfilter({"error": "hint-layer-exists"}), content_type = "application/json")
+
+ if layer:
+ if layer_created:
+ layer.layer_source = layer_source
+ layer.vcs_url = post_data['vcs_url']
+ layer.up_date = timezone.now()
+ layer.save()
+ else:
+ # We have an existing layer by this name. If its git url matches the
+ # one submitted we can simply create a new layer version for it;
+ # otherwise we need to bail out.
+ if layer.vcs_url != post_data['vcs_url']:
+ return HttpResponse(jsonfilter({"error": "hint-layer-exists-with-different-url" , "current_url" : layer.vcs_url, "current_id": layer.id }), content_type = "application/json")
+
+
+ layer_version, version_created = Layer_Version.objects.get_or_create(layer_source=layer_source, layer=layer, project=prj, up_branch_id=up_branch.id,branch=post_data['git_ref'], commit=post_data['git_ref'], dirpath=post_data['dir_path'])
+
+ if layer_version:
+ if not version_created:
+ return HttpResponse(jsonfilter({"error": "hint-layer-version-exists", "existing_layer_version": layer_version.id }), content_type = "application/json")
+
+ layer_version.up_date = timezone.now()
+ layer_version.save()
+
+ # Add the dependencies specified for this new layer
+ if (post_data.has_key("layer_deps") and
+ version_created and
+ len(post_data["layer_deps"]) > 0):
+ for layer_dep_id in post_data["layer_deps"].split(","):
+
+ layer_dep_obj = Layer_Version.objects.get(pk=layer_dep_id)
+ LayerVersionDependency.objects.get_or_create(layer_version=layer_version, depends_on=layer_dep_obj)
+ # Now add them to the project; we could get an exception
+ # if the project already contains the exact dependency
+ # (e.g. it was modified on another page)
+ try:
+ prj_layer, prj_layer_created = ProjectLayer.objects.get_or_create(layercommit=layer_dep_obj, project=prj)
+ except IntegrityError as e:
+ logger.warning("Integrity error while saving Project Layers: %s (original %s)" % (e, e.__cause__))
+ continue
+
+ if prj_layer_created:
+ layerdepdetailurl = reverse('layerdetails', args=(prj.id, layer_dep_obj.pk))
+ layers_added.append({'id': layer_dep_obj.id, 'name': Layer.objects.get(id=layer_dep_obj.layer_id).name, 'layerdetailurl': layerdepdetailurl })
+
+
+ # If an old layer version exists in our project then remove it
+ for prj_layers in ProjectLayer.objects.filter(project=prj):
+ dup_layer_v = Layer_Version.objects.filter(id=prj_layers.layercommit_id, layer_id=layer.id)
+ if len(dup_layer_v) >0 :
+ prj_layers.delete()
+
+ # finally add the imported layer (version id) to the project
+ ProjectLayer.objects.create(layercommit=layer_version, project=prj,optional=1)
+
+ else:
+ # We didn't create a layer version so back out now and clean up.
+ if layer_created:
+ layer.delete()
+
+ return HttpResponse(jsonfilter({"error": "Uncaught error: Could not create layer version"}), content_type = "application/json")
+
+ layerdetailurl = reverse('layerdetails', args=(prj.id, layer_version.pk))
+
+ json_response = {"error": "ok",
+ "imported_layer" : {
+ "name" : layer.name,
+ "id": layer_version.id,
+ "layerdetailurl": layerdetailurl,
+ },
+ "deps_added": layers_added }
+
+ return HttpResponse(jsonfilter(json_response), content_type = "application/json")
+
+ def xhr_updatelayer(request):
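+ """ Update an existing layer version from XHR POST data.
+
+ Requires 'layer_version_id'; the optional parameters handled below are
+ vcs_url, dirpath, commit, up_branch, add_dep, rm_dep, summary and
+ description. """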
+
+ def error_response(error):
+ return HttpResponse(jsonfilter({"error": error}), content_type = "application/json")
+
+ if not request.POST.has_key("layer_version_id"):
+ return error_response("Please specify a layer version id")
+ try:
+ layer_version_id = request.POST["layer_version_id"]
+ layer_version = Layer_Version.objects.get(id=layer_version_id)
+ except Layer_Version.DoesNotExist:
+ return error_response("Cannot find layer to update")
+
+
+ if request.POST.has_key("vcs_url"):
+ layer_version.layer.vcs_url = request.POST["vcs_url"]
+ if request.POST.has_key("dirpath"):
+ layer_version.dirpath = request.POST["dirpath"]
+ if request.POST.has_key("commit"):
+ layer_version.commit = request.POST["commit"]
+ if request.POST.has_key("up_branch"):
+ layer_version.up_branch_id = int(request.POST["up_branch"])
+
+ if request.POST.has_key("add_dep"):
+ lvd = LayerVersionDependency(layer_version=layer_version, depends_on_id=request.POST["add_dep"])
+ lvd.save()
+
+ if request.POST.has_key("rm_dep"):
+ rm_dep = LayerVersionDependency.objects.get(layer_version=layer_version, depends_on_id=request.POST["rm_dep"])
+ rm_dep.delete()
+
+ if request.POST.has_key("summary"):
+ layer_version.layer.summary = request.POST["summary"]
+ if request.POST.has_key("description"):
+ layer_version.layer.description = request.POST["description"]
+
+ try:
+ layer_version.layer.save()
+ layer_version.save()
+ except Exception as e:
+ return error_response("Could not update layer version entry: %s" % e)
+
+ return HttpResponse(jsonfilter({"error": "ok",}), content_type = "application/json")
+
+
+
+ def importlayer(request, pid):
+ template = "importlayer.html"
+ context = {
+ 'project': Project.objects.get(id=pid),
+ }
+ return render(request, template, context)
+
+ @_template_renderer('layerdetails.html')
+ def layerdetails(request, pid, layerid):
+ project = Project.objects.get(pk=pid)
+ layer_version = Layer_Version.objects.get(pk=layerid)
+
+ context = { 'project' : project,
+ 'layerversion' : layer_version,
+ 'layerdeps' : { "list": [
+ [{"id": y.id, "name": y.layer.name} for y in x.depends_on.get_equivalents_wpriority(project)][0] for x in layer_version.dependencies.all()]},
+ 'projectlayers': map(lambda prjlayer: prjlayer.layercommit.id, ProjectLayer.objects.filter(project=project))
+ }
+
+ return context
+
+
+ def get_project_configvars_context():
+ # Vars managed outside of this view
+ vars_managed = {
+ 'MACHINE', 'BBLAYERS'
+ }
+
+ vars_blacklist = {
+ 'DL_DIR','PARALLEL_MAKE','BB_NUMBER_THREADS','SSTATE_DIR',
+ 'BB_DISKMON_DIRS','CVS_PROXY_HOST','CVS_PROXY_PORT',
+ 'SSTATE_MIRRORS','TMPDIR',
+ 'all_proxy','ftp_proxy','http_proxy','https_proxy'
+ }
+
+ vars_fstypes = {
+ 'btrfs','cpio','cpio.gz','cpio.lz4','cpio.lzma','cpio.xz','cramfs',
+ 'elf','ext2','ext2.bz2','ext2.gz','ext2.lzma', 'ext4', 'ext4.gz', 'ext3','ext3.gz','hddimg',
+ 'iso','jffs2','jffs2.sum','squashfs','squashfs-lzo','squashfs-xz','tar.bz2',
+ 'tar.lz4','tar.xz','tar.gz','ubi','ubifs','vmdk'
+ }
+
+ return(vars_managed,sorted(vars_fstypes),vars_blacklist)
+
+ @_template_renderer("projectconf.html")
+ def projectconf(request, pid):
+
+ try:
+ prj = Project.objects.get(id = pid)
+ except Project.DoesNotExist:
+ return HttpResponseNotFound("<h1>Project id " + pid + " is unavailable</h1>")
+
+ # remove blacklisted and externally managed variables from this list
+ vars_managed,vars_fstypes,vars_blacklist = get_project_configvars_context()
+ configvars = ProjectVariable.objects.filter(project_id = pid).all()
+ for var in vars_managed:
+ configvars = configvars.exclude(name = var)
+ for var in vars_blacklist:
+ configvars = configvars.exclude(name = var)
+
+ context = {
+ 'project': prj,
+ 'configvars': configvars,
+ 'vars_managed': vars_managed,
+ 'vars_fstypes': vars_fstypes,
+ 'vars_blacklist': vars_blacklist,
+ }
+
+ try:
+ context['distro'] = ProjectVariable.objects.get(project = prj, name = "DISTRO").value
+ context['distro_defined'] = "1"
+ except ProjectVariable.DoesNotExist:
+ pass
+ try:
+ context['fstypes'] = ProjectVariable.objects.get(project = prj, name = "IMAGE_FSTYPES").value
+ context['fstypes_defined'] = "1"
+ except ProjectVariable.DoesNotExist:
+ pass
+ try:
+ context['image_install_append'] = ProjectVariable.objects.get(project = prj, name = "IMAGE_INSTALL_append").value
+ context['image_install_append_defined'] = "1"
+ except ProjectVariable.DoesNotExist:
+ pass
+ try:
+ context['package_classes'] = ProjectVariable.objects.get(project = prj, name = "PACKAGE_CLASSES").value
+ context['package_classes_defined'] = "1"
+ except ProjectVariable.DoesNotExist:
+ pass
+ try:
+ context['sdk_machine'] = ProjectVariable.objects.get(project = prj, name = "SDKMACHINE").value
+ context['sdk_machine_defined'] = "1"
+ except ProjectVariable.DoesNotExist:
+ pass
+
+ return context
+
+ @_template_renderer('projectbuilds.html')
+ def projectbuilds(request, pid):
+ prj = Project.objects.get(id = pid)
+
+ if request.method == "POST":
+ # process any build request
+
+ if 'buildCancel' in request.POST:
+ for i in request.POST['buildCancel'].strip().split(" "):
+ try:
+ br = BuildRequest.objects.select_for_update().get(project = prj, pk = i, state__lte = BuildRequest.REQ_QUEUED)
+ br.state = BuildRequest.REQ_DELETED
+ br.save()
+ except BuildRequest.DoesNotExist:
+ pass
+
+ if 'buildDelete' in request.POST:
+ for i in request.POST['buildDelete'].strip().split(" "):
+ try:
+ br = BuildRequest.objects.select_for_update().get(project = prj, pk = i, state__lte = BuildRequest.REQ_DELETED).delete()
+ except BuildRequest.DoesNotExist:
+ pass
+
+ if 'targets' in request.POST:
+ ProjectTarget.objects.filter(project = prj).delete()
+ s = str(request.POST['targets'])
+ for t in s.translate(None, ";%|\"").split(" "):
+ if ":" in t:
+ target, task = t.split(":")
+ else:
+ target = t
+ task = ""
+ ProjectTarget.objects.create(project = prj, target = target, task = task)
+
+ br = prj.schedule_build()
+
+
+ queryset = Build.objects.filter(outcome__lte = Build.IN_PROGRESS)
+
+ try:
+ context, pagesize, orderby = _build_list_helper(request, queryset)
+ except RedirectException as re:
+ # rewrite the RedirectException with our current url information
+ re.view = resolve(request.path_info).url_name
+ re.okwargs = {"pid" : pid}
+ raise re
+
+ context['project'] = prj
+ _set_parameters_values(pagesize, orderby, request)
+
+ return context
+
+
+ def _file_name_for_artifact(b, artifact_type, artifact_id):
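+ """ Resolve the stored file name for a build artifact; artifact_type is
+ one of imagefile, buildartifact, licensemanifest, tasklogfile or
+ logmessagefile, as handled below. """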
+ file_name = None
+ # Target_Image_File file_name
+ if artifact_type == "imagefile":
+ file_name = Target_Image_File.objects.get(target__build = b, pk = artifact_id).file_name
+
+ elif artifact_type == "buildartifact":
+ file_name = BuildArtifact.objects.get(build = b, pk = artifact_id).file_name
+
+ elif artifact_type == "licensemanifest":
+ file_name = Target.objects.get(build = b, pk = artifact_id).license_manifest_path
+
+ elif artifact_type == "tasklogfile":
+ file_name = Task.objects.get(build = b, pk = artifact_id).logfile
+
+ elif artifact_type == "logmessagefile":
+ file_name = LogMessage.objects.get(build = b, pk = artifact_id).pathname
+ else:
+ raise Exception("FIXME: artifact type %s not implemented" % (artifact_type))
+
+ return file_name
+
+
+ def build_artifact(request, build_id, artifact_type, artifact_id):
+ if artifact_type in ["cookerlog"]:
+ # these artifacts are saved after building, so they are on the server itself
+ def _mimetype_for_artifact(path):
+ try:
+ import magic
+
+ # fair warning: this is a mess; there are multiple competing and incompatible
+ # magic modules floating around, so we try some of the most common combinations
+
+ try: # we try ubuntu's python-magic 5.4
+ m = magic.open(magic.MAGIC_MIME_TYPE)
+ m.load()
+ return m.file(path)
+ except AttributeError:
+ pass
+
+ try: # we try python-magic 0.4.6
+ m = magic.Magic(magic.MAGIC_MIME)
+ return m.from_file(path)
+ except AttributeError:
+ pass
+
+ try: # we try pip filemagic 1.6
+ m = magic.Magic(flags=magic.MAGIC_MIME_TYPE)
+ return m.id_filename(path)
+ except AttributeError:
+ pass
+
+ return "binary/octet-stream"
+ except ImportError:
+ return "binary/octet-stream"
+ try:
+ # match code with runbuilds.Command.archive()
+ build_artifact_storage_dir = os.path.join(ToasterSetting.objects.get(name="ARTIFACTS_STORAGE_DIR").value, "%d" % int(build_id))
+ file_name = os.path.join(build_artifact_storage_dir, "cooker_log.txt")
+
+ fsock = open(file_name, "r")
+ content_type=_mimetype_for_artifact(file_name)
+
+ response = HttpResponse(fsock, content_type = content_type)
+
+ response['Content-Disposition'] = 'attachment; filename=' + os.path.basename(file_name)
+ return response
+ except IOError:
+ context = {
+ 'build' : Build.objects.get(pk = build_id),
+ }
+ return render(request, "unavailable_artifact.html", context)
+
+ else:
+ # retrieve the artifact directly from the build environment
+ return _get_be_artifact(request, build_id, artifact_type, artifact_id)
+
+
+ def _get_be_artifact(request, build_id, artifact_type, artifact_id):
+ try:
+ b = Build.objects.get(pk=build_id)
+ if b.buildrequest is None or b.buildrequest.environment is None:
+ raise Exception("Artifact not available for download (missing build request or build environment)")
+
+ file_name = _file_name_for_artifact(b, artifact_type, artifact_id)
+ fsock = None
+ content_type='application/force-download'
+
+ if file_name is None:
+ raise Exception("Could not handle artifact %s id %s" % (artifact_type, artifact_id))
+ else:
+ content_type = b.buildrequest.environment.get_artifact_type(file_name)
+ fsock = b.buildrequest.environment.get_artifact(file_name)
+ file_name = os.path.basename(file_name) # we assume the build environment uses the same path conventions as the host
+
+ response = HttpResponse(fsock, content_type = content_type)
+
+ # returns a file from the environment
+ response['Content-Disposition'] = 'attachment; filename=' + file_name
+ return response
+ except IOError:
+ context = {
+ 'build' : Build.objects.get(pk = build_id),
+ }
+ return render(request, "unavailable_artifact.html", context)
+
+
+
+
+ @_template_renderer("projects.html")
+ def projects(request):
+ (pagesize, orderby) = _get_parameters_values(request, 10, 'updated:-')
+ mandatory_parameters = { 'count': pagesize, 'page' : 1, 'orderby' : orderby }
+ retval = _verify_parameters( request.GET, mandatory_parameters )
+ if retval:
+ raise RedirectException( 'all-projects', request.GET, mandatory_parameters )
+
+ queryset_all = Project.objects.all()
+
+ # annotate each project with its number of builds
+ queryset_all = queryset_all.annotate(num_builds=Count('build'))
+
+ # exclude the command line builds project if it has no builds
+ q_default_with_builds = Q(is_default=True) & Q(num_builds__gt=0)
+ queryset_all = queryset_all.filter(Q(is_default=False) |
+ q_default_with_builds)
+
+ # boilerplate code that takes a request for an object type and returns a queryset
+ # for that object type. copypasta for all needed table searches
+ (filter_string, search_term, ordering_string) = _search_tuple(request, Project)
+ queryset_with_search = _get_queryset(Project, queryset_all, None, search_term, ordering_string, '-updated')
+ queryset = _get_queryset(Project, queryset_all, filter_string, search_term, ordering_string, '-updated')
+
+ # retrieve the objects that will be displayed in the table; build a paginator and get the page range to display
+ project_info = _build_page_range(Paginator(queryset, pagesize), request.GET.get('page', 1))
+
+ # add fields needed in JSON dumps for API call support
+ for p in project_info.object_list:
+ p.id = p.pk
+ p.projectPageUrl = reverse('project', args=(p.id,))
+ p.layersTypeAheadUrl = reverse('xhr_layerstypeahead', args=(p.id,))
+ p.recipesTypeAheadUrl = reverse('xhr_recipestypeahead', args=(p.id,))
+ p.projectBuildsUrl = reverse('projectbuilds', args=(p.id,))
+
+ # build view-specific information; this is rendered specifically in the builds page, at the top of the page (i.e. Recent builds)
+ build_mru = _get_latest_builds()
+
+ # translate the project's build target strings
+ fstypes_map = {}
+ for project in project_info:
+ try:
+ targets = Target.objects.filter( build_id = project.get_last_build_id() )
+ comma = ""
+ extensions = ""
+ for t in targets:
+ if ( not t.is_image ):
+ continue
+ tif = Target_Image_File.objects.filter( target_id = t.id )
+ for i in tif:
+ s=re.sub('.*tar.bz2', 'tar.bz2', i.file_name)
+ if s == i.file_name:
+ s=re.sub('.*\.', '', i.file_name)
+ if re.search(s, extensions) is None:
+ extensions += comma + s
+ comma = ", "
+ fstypes_map[project.id]=extensions
+ except (Target.DoesNotExist,IndexError):
+ fstypes_map[project.id]=project.get_last_imgfiles
+
+ context = {
+ 'mru' : build_mru,
+
+ 'objects' : project_info,
+ 'objectname' : "projects",
+ 'default_orderby' : 'id:-',
+ 'search_term' : search_term,
+ 'total_count' : queryset_with_search.count(),
+ 'fstypes' : fstypes_map,
+ 'build_FAILED' : Build.FAILED,
+ 'build_SUCCEEDED' : Build.SUCCEEDED,
+ 'tablecols': [
+ {'name': 'Project',
+ 'orderfield': _get_toggle_order(request, "name"),
+ 'ordericon':_get_toggle_order_icon(request, "name"),
+ 'orderkey' : 'name',
+ },
+ {'name': 'Last activity on',
+ 'clclass': 'updated',
+ 'qhelp': "Shows the starting date and time of the last project build. If the project has no builds, it shows the date the project was created",
+ 'orderfield': _get_toggle_order(request, "updated", True),
+ 'ordericon':_get_toggle_order_icon(request, "updated"),
+ 'orderkey' : 'updated',
+ },
+ {'name': 'Release',
+ 'qhelp' : "The version of the build system used by the project",
+ 'orderfield': _get_toggle_order(request, "release__name"),
+ 'ordericon':_get_toggle_order_icon(request, "release__name"),
+ 'orderkey' : 'release__name',
+ },
+ {'name': 'Machine',
+ 'qhelp': "The hardware currently selected for the project",
+ },
+ {'name': 'Number of builds',
+ 'qhelp': "How many builds have been run for the project",
+ },
+ {'name': 'Last build outcome', 'clclass': 'loutcome',
+ 'qhelp': "Tells you if the last project build completed successfully or failed",
+ },
+ {'name': 'Recipe', 'clclass': 'ltarget',
+ 'qhelp': "The last recipe that was built in this project",
+ },
+ {'name': 'Errors', 'clclass': 'lerrors',
+ 'qhelp': "How many errors were encountered during the last project build (if any)",
+ },
+ {'name': 'Warnings', 'clclass': 'lwarnings',
+ 'qhelp': "How many warnings were encountered during the last project build (if any)",
+ },
+ {'name': 'Image files', 'clclass': 'limagefiles', 'hidden': 1,
+ 'qhelp': "The root file system types produced by the last project build",
+ },
+ ]
+ }
+
+ _set_parameters_values(pagesize, orderby, request)
+ return context
diff --git a/bitbake/lib/toaster/toastergui/widgets.py b/bitbake/lib/toaster/toastergui/widgets.py
new file mode 100644
index 0000000..eb2914d
--- /dev/null
+++ b/bitbake/lib/toaster/toastergui/widgets.py
@@ -0,0 +1,411 @@
+#
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2015 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from django.views.generic import View, TemplateView
+from django.shortcuts import HttpResponse
+from django.http import HttpResponseBadRequest
+from django.core import serializers
+from django.core.cache import cache
+from django.core.paginator import Paginator, EmptyPage
+from django.db.models import Q
+from orm.models import Project, ProjectLayer, Layer_Version
+from django.template import Context, Template
+from django.core.serializers.json import DjangoJSONEncoder
+from django.core.exceptions import FieldError
+from django.conf.urls import url, patterns
+
+import types
+import json
+import collections
+import operator
+import re
+
+from toastergui.views import objtojson
+
+class ToasterTable(TemplateView):
+ def __init__(self, *args, **kwargs):
+ super(ToasterTable, self).__init__()
+ if 'template_name' in kwargs:
+ self.template_name = kwargs['template_name']
+ self.title = None
+ self.queryset = None
+ self.columns = []
+ self.filters = {}
+ self.total_count = 0
+ self.static_context_extra = {}
+ self.filter_actions = {}
+ self.empty_state = "Sorry - no data found"
+ self.default_orderby = ""
+
+ # add the "id" column, undisplayable, by default
+ self.add_column(title="Id",
+ displayable=False,
+ orderable=True,
+ field_name="id")
+
+
+ def get(self, request, *args, **kwargs):
+ if request.GET.get('format', None) == 'json':
+
+ self.setup_queryset(*args, **kwargs)
+ # Put the project id into the context for the static_data_template
+ if 'pid' in kwargs:
+ self.static_context_extra['pid'] = kwargs['pid']
+
+ cmd = request.GET.get('cmd', None)
+ if cmd and 'filterinfo' in cmd:
+ data = self.get_filter_info(request, **kwargs)
+ else:
+ # If no cmd is specified we give you the table data
+ data = self.get_data(request, **kwargs)
+
+ return HttpResponse(data, content_type="application/json")
+
+ return super(ToasterTable, self).get(request, *args, **kwargs)
+
+ def get_filter_info(self, request, **kwargs):
+ data = None
+
+ self.setup_filters(**kwargs)
+
+ search = request.GET.get("search", None)
+ if search:
+ self.apply_search(search)
+
+ name = request.GET.get("name", None)
+ if name is None:
+ data = json.dumps(self.filters,
+ indent=2,
+ cls=DjangoJSONEncoder)
+ else:
+ for actions in self.filters[name]['filter_actions']:
+ actions['count'] = self.filter_actions[actions['name']](count_only=True)
+
+ # Add the "All" items filter action
+ self.filters[name]['filter_actions'].insert(0, {
+ 'name' : 'all',
+ 'title' : 'All',
+ 'count' : self.queryset.count(),
+ })
+
+ data = json.dumps(self.filters[name],
+ indent=2,
+ cls=DjangoJSONEncoder)
+
+ return data
+
+ def setup_columns(self, *args, **kwargs):
+ """ function to implement in the subclass which sets up the columns """
+ pass
+ def setup_filters(self, *args, **kwargs):
+ """ function to implement in the subclass which sets up the filters """
+ pass
+ def setup_queryset(self, *args, **kwargs):
+ """ function to implement in the subclass which sets up the queryset"""
+ pass
+
+ def add_filter(self, name, title, filter_actions):
+ """Add a filter to the table.
+
+ Args:
+ name (str): Unique identifier of the filter.
+ title (str): Title of the filter.
+ filter_actions: Actions for all the filters.
+ """
+ self.filters[name] = {
+ 'title' : title,
+ 'filter_actions' : filter_actions,
+ }
+
+ def make_filter_action(self, name, title, action_function):
+ """ Utility to make a filter_action """
+
+ action = {
+ 'title' : title,
+ 'name' : name,
+ }
+
+ self.filter_actions[name] = action_function
+
+ return action
+
+ def add_column(self, title="", help_text="",
+ orderable=False, hideable=True, hidden=False,
+ field_name="", filter_name=None, static_data_name=None,
+ displayable=True, computation=None,
+ static_data_template=None):
+ """Add a column to the table.
+
+ Args:
+ title (str): Title for the table header
+ help_text (str): Optional help text to describe the column
+ orderable (bool): Whether the column can be ordered.
+ We order on the field_name.
+ hideable (bool): Whether the user can hide the column
+ hidden (bool): Whether the column is default hidden
+ field_name (str or list): field(s) required for this column's data
+ static_data_name (str, optional): The column's main identifier
+ which will replace the field_name.
+ static_data_template(str, optional): The template to be rendered
+ as data
+ """
+
+ self.columns.append({'title' : title,
+ 'help_text' : help_text,
+ 'orderable' : orderable,
+ 'hideable' : hideable,
+ 'hidden' : hidden,
+ 'field_name' : field_name,
+ 'filter_name' : filter_name,
+ 'static_data_name': static_data_name,
+ 'static_data_template': static_data_template,
+ 'displayable': displayable,
+ 'computation': computation,
+ })
+
+ def render_static_data(self, template, row):
+ """Utility function to render the static data template"""
+
+ context = {
+ 'extra' : self.static_context_extra,
+ 'data' : row,
+ }
+
+ context = Context(context)
+ template = Template(template)
+
+ return template.render(context)
+
+ def apply_filter(self, filters, **kwargs):
+ self.setup_filters(**kwargs)
+
+ try:
+ filter_name, filter_action = filters.split(':')
+ except ValueError:
+ return
+
+ if "all" in filter_action:
+ return
+
+ try:
+ self.filter_actions[filter_action]()
+ except KeyError:
+ # pass it to the user - programming error here
+ raise
+
+ def apply_orderby(self, orderby):
+ # Note that django will execute this when we try to retrieve the data
+ self.queryset = self.queryset.order_by(orderby)
+
+ def apply_search(self, search_term):
+ """Creates a query based on the model's search_allowed_fields"""
+
+ if not hasattr(self.queryset.model, 'search_allowed_fields'):
+ raise Exception("Search fields aren't defined in the model")
+
+ search_queries = []
+ for st in search_term.split(" "):
+ q_map = [Q(**{field + '__icontains': st})
+ for field in self.queryset.model.search_allowed_fields]
+
+ search_queries.append(reduce(operator.or_, q_map))
+
+ search_queries = reduce(operator.and_, search_queries)
+
+ self.queryset = self.queryset.filter(search_queries)
+
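+ # For illustration only: apply_search() expects the model to declare the
+ # fields it may search, e.g. a hypothetical
+ #
+ #     class Project(models.Model):
+ #         search_allowed_fields = ["name", "short_description", "release__name"]
+ #
+ # Each space-separated search word is OR-ed across those fields and the
+ # words are then AND-ed together.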
+
+ def get_data(self, request, **kwargs):
+ """Returns the data for the page requested with the specified
+ parameters applied"""
+
+ page_num = request.GET.get("page", 1)
+ limit = request.GET.get("limit", 10)
+ search = request.GET.get("search", None)
+ filters = request.GET.get("filter", None)
+ orderby = request.GET.get("orderby", None)
+
+ # Make a unique cache name
+ cache_name = self.__class__.__name__
+
+ for key, val in request.GET.iteritems():
+ cache_name = cache_name + str(key) + str(val)
+
+ for key, val in kwargs.iteritems():
+ cache_name = cache_name + str(key) + str(val)
+
+ # No special chars allowed in the cache name apart from dash
+ cache_name = re.sub(r'[^A-Za-z0-9-]', "", cache_name)
+ data = cache.get(cache_name)
+
+ if data:
+ return data
+
+ self.setup_columns(**kwargs)
+
+ if search:
+ self.apply_search(search)
+ if filters:
+ self.apply_filter(filters, **kwargs)
+ if orderby:
+ self.apply_orderby(orderby)
+
+ paginator = Paginator(self.queryset, limit)
+
+ try:
+ page = paginator.page(page_num)
+ except EmptyPage:
+ page = paginator.page(1)
+
+ data = {
+ 'total' : self.queryset.count(),
+ 'default_orderby' : self.default_orderby,
+ 'columns' : self.columns,
+ 'rows' : [],
+ 'error' : "ok",
+ }
+
+ try:
+ for row in page.object_list:
+ # Use an OrderedDict to maintain the column order
+ required_data = collections.OrderedDict()
+
+ for col in self.columns:
+ field = col['field_name']
+ if not field:
+ field = col['static_data_name']
+ if not field:
+ raise Exception("Must supply a field_name or static_data_name for column %s.%s" % (self.__class__.__name__,col))
+ # Check if we need to process some static data
+ if "static_data_name" in col and col['static_data_name']:
+ required_data["static:%s" % col['static_data_name']] = self.render_static_data(col['static_data_template'], row)
+
+ # Overwrite the field_name with static_data_name
+ # so that this can be used as the html class name
+
+ col['field_name'] = col['static_data_name']
+
+ # compute the computation on the raw data if needed
+ model_data = row
+ if col['computation']:
+ model_data = col['computation'](row)
+ else:
+ # Traverse to any foreign key in the object hierarchy
+ for subfield in field.split("__"):
+ if hasattr(model_data, subfield):
+ model_data = getattr(model_data, subfield)
+ # The field could be a function on the model so check
+ # If it is then call it
+ if isinstance(model_data, types.MethodType):
+ model_data = model_data()
+
+ required_data[col['field_name']] = model_data
+
+ data['rows'].append(required_data)
+
+ except FieldError:
+ # pass it to the user - programming-error here
+ raise
+ data = json.dumps(data, indent=2, default=objtojson)
+ cache.set(cache_name, data, 60*30)
+
+ return data
+
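+# The following commented-out sketch illustrates how a subclass is expected to
+# use the ToasterTable machinery above. It is an example only; the Project
+# fields referenced (name, is_default) are assumptions made for illustration.
+#
+# class ExampleProjectsTable(ToasterTable):
+#     def setup_queryset(self, *args, **kwargs):
+#         self.queryset = Project.objects.all()
+#         self.default_orderby = "name"
+#
+#     def setup_columns(self, *args, **kwargs):
+#         self.add_column(title="Project", orderable=True, field_name="name")
+#         self.add_column(title="Is default project", hidden=True,
+#                         field_name="is_default")
+#
+#     def setup_filters(self, *args, **kwargs):
+#         def filter_defaults(count_only=False):
+#             filtered = self.queryset.filter(is_default=True)
+#             if count_only:
+#                 return filtered.count()
+#             self.queryset = filtered
+#
+#         action = self.make_filter_action("defaults_only",
+#                                          "Default projects only",
+#                                          filter_defaults)
+#         self.add_filter(name="is_default",
+#                         title="Filter projects by type",
+#                         filter_actions=[action])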
+
+class ToasterTemplateView(TemplateView):
+ # renders an instance in a template, or returns the context as json
+ # the class-equivalent of the _template_renderer decorator for views
+
+ def __init__(self, *args, **kwargs):
+ super(ToasterTemplateView, self).__init__(*args, **kwargs)
+ self.context_entries = []
+
+ def get(self, *args, **kwargs):
+ if self.request.GET.get('format', None) == 'json':
+ from django.core.urlresolvers import reverse
+ from django.shortcuts import HttpResponse
+ from views import objtojson
+ from toastergui.templatetags.projecttags import json as jsonfilter
+
+ context = self.get_context_data(**kwargs)
+
+ for x in context.keys():
+ if x not in self.context_entries:
+ del context[x]
+
+ context["error"] = "ok"
+
+ return HttpResponse(jsonfilter(context, default=objtojson ),
+ content_type = "application/json; charset=utf-8")
+
+ return super(ToasterTemplateView, self).get(*args, **kwargs)
+
+class ToasterTypeAhead(View):
+ """ A typeahead mechanism to support the front end typeahead widgets """
+ MAX_RESULTS = 6
+
+ class MissingFieldsException(Exception):
+ pass
+
+ def __init__(self, *args, **kwargs):
+ super(ToasterTypeAhead, self).__init__()
+
+ def get(self, request, *args, **kwargs):
+ def response(data):
+ return HttpResponse(json.dumps(data,
+ indent=2,
+ cls=DjangoJSONEncoder),
+ content_type="application/json")
+
+ error = "ok"
+
+ search_term = request.GET.get("search", None)
+ if search_term is None:
+ # We got no search value so return an empty response
+ return response({'error' : error , 'results': []})
+
+ try:
+ prj = Project.objects.get(pk=kwargs['pid'])
+ except KeyError:
+ prj = None
+
+ results = self.apply_search(search_term, prj, request)[:ToasterTypeAhead.MAX_RESULTS]
+
+ if len(results) > 0:
+ try:
+ self.validate_fields(results[0])
+ except self.MissingFieldsException as e:
+ error = str(e)
+
+ data = { 'results' : results,
+ 'error' : error,
+ }
+
+ return response(data)
+
+ def validate_fields(self, result):
+ if 'name' not in result or 'detail' not in result:
+ raise self.MissingFieldsException("name and detail are required fields")
+
+ def apply_search(self, search_term, prj, request):
+ """ Override this function to implement search. Return an array of
+ dictionaries with a minimum of a name and detail field """
+ pass
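+
+# Commented-out sketch of a ToasterTypeAhead subclass, for illustration only;
+# the Layer_Version/Layer fields used here are assumptions for the example.
+#
+# class ExampleLayersTypeAhead(ToasterTypeAhead):
+#     def apply_search(self, search_term, prj, request):
+#         matches = Layer_Version.objects.filter(
+#             layer__name__icontains=search_term)[:ToasterTypeAhead.MAX_RESULTS]
+#         return [{'id': lv.pk,
+#                  'name': lv.layer.name,
+#                  'detail': '[ %s | %s ]' % (lv.layer.vcs_url, lv.commit)}
+#                 for lv in matches]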
diff --git a/bitbake/lib/toaster/toastermain/__init__.py b/bitbake/lib/toaster/toastermain/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/toastermain/__init__.py
diff --git a/bitbake/lib/toaster/toastermain/management/__init__.py b/bitbake/lib/toaster/toastermain/management/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/toastermain/management/__init__.py
diff --git a/bitbake/lib/toaster/toastermain/management/commands/__init__.py b/bitbake/lib/toaster/toastermain/management/commands/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/bitbake/lib/toaster/toastermain/management/commands/__init__.py
diff --git a/bitbake/lib/toaster/toastermain/management/commands/builddelete.py b/bitbake/lib/toaster/toastermain/management/commands/builddelete.py
new file mode 100644
index 0000000..343d311
--- /dev/null
+++ b/bitbake/lib/toaster/toastermain/management/commands/builddelete.py
@@ -0,0 +1,49 @@
+from django.core.management.base import BaseCommand, CommandError
+from orm.models import Build
+from django.db import OperationalError
+import os
+
+
+
+class Command(BaseCommand):
+ args = "buildId"
+ help = "Deletes selected build(s)"
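+ # Invoked through manage.py with a comma-separated list of build ids;
+ # the ids below are illustrative only:
+ #   ./manage.py builddelete 12,13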
+
+ def handle(self, buildId, *args, **options):
+ for bid in buildId.split(","):
+ b = Build.objects.get(pk = bid)
+ # theoretically, just b.delete() would suffice
+ # however SQLite runs into problems when you try to
+ # delete too many rows at once, so we delete some direct
+ # relationships from Build manually.
+ for t in b.target_set.all():
+ t.delete()
+ for t in b.task_build.all():
+ t.delete()
+ for p in b.package_set.all():
+ p.delete()
+ for lv in b.layer_version_build.all():
+ lv.delete()
+ for v in b.variable_build.all():
+ v.delete()
+ for l in b.logmessage_set.all():
+ l.delete()
+
+ # delete the build; some databases might have had problem with migration of the bldcontrol app
+ retry_count = 0
+ need_bldcontrol_migration = False
+ while True:
+ if retry_count >= 5:
+ break
+ retry_count += 1
+ if need_bldcontrol_migration:
+ from django.core import management
+ management.call_command('migrate', 'bldcontrol', interactive=False)
+
+ try:
+ b.delete()
+ break
+ except OperationalError as e:
+ # execute migrations
+ need_bldcontrol_migration = True
+
diff --git a/bitbake/lib/toaster/toastermain/management/commands/buildslist.py b/bitbake/lib/toaster/toastermain/management/commands/buildslist.py
new file mode 100644
index 0000000..cad987f
--- /dev/null
+++ b/bitbake/lib/toaster/toastermain/management/commands/buildslist.py
@@ -0,0 +1,13 @@
+from django.core.management.base import NoArgsCommand, CommandError
+from orm.models import Build
+import os
+
+
+
+class Command(NoArgsCommand):
+ args = ""
+ help = "Lists current builds"
+
+ def handle_noargs(self,**options):
+ for b in Build.objects.all():
+ print "%d: %s %s %s" % (b.pk, b.machine, b.distro, ",".join([x.target for x in b.target_set.all()]))
diff --git a/bitbake/lib/toaster/toastermain/management/commands/perf.py b/bitbake/lib/toaster/toastermain/management/commands/perf.py
new file mode 100644
index 0000000..71a48e9
--- /dev/null
+++ b/bitbake/lib/toaster/toastermain/management/commands/perf.py
@@ -0,0 +1,58 @@
+from django.core.management.base import BaseCommand
+from django.test.client import Client
+import os, sys, re
+import requests
+from django.conf import settings
+
+# pylint: disable=E1103
+# Instance of 'WSGIRequest' has no 'status_code' member
+# (but some types could not be inferred) (maybe-no-member)
+
+
+class Command(BaseCommand):
+ help = "Test the response time for all toaster urls"
+
+ def handle(self, *args, **options):
+ root_urlconf = __import__(settings.ROOT_URLCONF)
+ patterns = root_urlconf.urls.urlpatterns
+ global full_url
+ for pat in patterns:
+ if pat.__class__.__name__ == 'RegexURLResolver':
+ url_root_res = str(pat).split('^')[1].replace('>', '')
+ if 'gui' in url_root_res:
+ for url_patt in pat.url_patterns:
+ full_url = self.get_full_url(url_patt, url_root_res)
+ info = self.url_info(full_url)
+ status_code = info[0]
+ load_time = info[1]
+ print 'Trying \'' + full_url + '\', ' + str(status_code) + ', ' + str(load_time)
+
+ def get_full_url(self, url_patt, url_root_res):
+ full_url = str(url_patt).split('^')[1].replace('$>', '').replace('(?P<file_path>(?:/[', '/bin/busybox').replace('.*', '')
+ full_url = str(url_root_res + full_url)
+ full_url = re.sub('\(\?P<.*?>\\\d\+\)', '1', full_url)
+ full_url = 'http://localhost:8000/' + full_url
+ return full_url
+
+ def url_info(self, full_url):
+ client = Client()
+ info = []
+ try:
+ resp = client.get(full_url, follow = True)
+ except Exception as e_status_code:
+ self.error('Url: %s, error: %s' % (full_url, e_status_code))
+ resp = type('object', (), {'status_code':0, 'content': str(e_status_code)})
+ status_code = resp.status_code
+ info.append(status_code)
+ try:
+ req = requests.get(full_url)
+ load_time = req.elapsed
+ except Exception as e_load_time:
+ self.error('Url: %s, error: %s' % (full_url, e_load_time))
+ load_time = 0
+ info.append(load_time)
+ return info
+
+ def error(self, *args):
+ for arg in args:
+ print >>sys.stderr, arg,
+ print >>sys.stderr
diff --git a/bitbake/lib/toaster/toastermain/settings.py b/bitbake/lib/toaster/toastermain/settings.py
new file mode 100644
index 0000000..b149a5e
--- /dev/null
+++ b/bitbake/lib/toaster/toastermain/settings.py
@@ -0,0 +1,410 @@
+#
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2013 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+# Django settings for Toaster project.
+
+import os, re
+
+DEBUG = True
+TEMPLATE_DEBUG = DEBUG
+
+# Set to True to see the SQL queries in console
+SQL_DEBUG = False
+if os.environ.get("TOASTER_SQLDEBUG", None) is not None:
+ SQL_DEBUG = True
+
+
+ADMINS = (
+ # ('Your Name', 'your_email@example.com'),
+)
+
+MANAGERS = ADMINS
+
+DATABASES = {
+ 'default': {
+ 'ENGINE': 'django.db.backends.sqlite3', # Add 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
+ 'NAME': 'toaster.sqlite', # Or path to database file if using sqlite3.
+ 'USER': '',
+ 'PASSWORD': '',
+ 'HOST': '127.0.0.1', # Empty for localhost through domain sockets or '127.0.0.1' for localhost through TCP.
+ 'PORT': '3306', # Set to empty string for default.
+ }
+}
+
+# Needed when Using sqlite especially to add a longer timeout for waiting
+# for the database lock to be released
+# https://docs.djangoproject.com/en/1.6/ref/databases/#database-is-locked-errors
+if 'sqlite' in DATABASES['default']['ENGINE']:
+ DATABASES['default']['OPTIONS'] = { 'timeout': 20 }
+
+# Reinterpret database settings if we have DATABASE_URL environment variable defined
+
+if 'DATABASE_URL' in os.environ:
+ dburl = os.environ['DATABASE_URL']
+ if dburl.startswith('sqlite3://'):
+ result = re.match('sqlite3://(.*)', dburl)
+ if result is None:
+ raise Exception("ERROR: Could not read sqlite database url: %s" % dburl)
+ DATABASES['default'] = {
+ 'ENGINE': 'django.db.backends.sqlite3',
+ 'NAME': result.group(1),
+ 'USER': '',
+ 'PASSWORD': '',
+ 'HOST': '',
+ 'PORT': '',
+ }
+ elif dburl.startswith('mysql://'):
+ # URL must be in this form: mysql://user:pass@host:port/name
+ result = re.match(r"mysql://([^:]*):([^@]*)@([^:]*):(\d+)/([^/]*)", dburl)
+ if result is None:
+ raise Exception("ERROR: Could not read mysql database url: %s" % dburl)
+ DATABASES['default'] = {
+ 'ENGINE': 'django.db.backends.mysql',
+ 'NAME': result.group(5),
+ 'USER': result.group(1),
+ 'PASSWORD': result.group(2),
+ 'HOST': result.group(3),
+ 'PORT': result.group(4),
+ }
+ else:
+ raise Exception("FIXME: Please implement missing database url schema for url: %s" % dburl)
+
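+# Illustrative examples only of the two accepted DATABASE_URL forms:
+#   DATABASE_URL=sqlite3:///opt/toaster/toaster.sqlite
+#   DATABASE_URL=mysql://toaster:toasterpass@127.0.0.1:3306/toasterdb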
+
+if 'TOASTER_MANAGED' in os.environ and os.environ['TOASTER_MANAGED'] == "1":
+ MANAGED = True
+else:
+ MANAGED = False
+
+# Allows current database settings to be exported as a DATABASE_URL environment variable value
+
+def getDATABASE_URL():
+ d = DATABASES['default']
+ if d['ENGINE'] == 'django.db.backends.sqlite3':
+ if d['NAME'] == ':memory:':
+ return 'sqlite3://:memory:'
+ elif d['NAME'].startswith("/"):
+ return 'sqlite3://' + d['NAME']
+ return "sqlite3://" + os.path.join(os.getcwd(), d['NAME'])
+
+ elif d['ENGINE'] == 'django.db.backends.mysql':
+ return "mysql://" + d['USER'] + ":" + d['PASSWORD'] + "@" + d['HOST'] + ":" + d['PORT'] + "/" + d['NAME']
+
+ raise Exception("FIXME: Please implement missing database url schema for engine: %s" % d['ENGINE'])
+
+
+
+# Hosts/domain names that are valid for this site; required if DEBUG is False
+# See https://docs.djangoproject.com/en/1.5/ref/settings/#allowed-hosts
+ALLOWED_HOSTS = []
+
+# Local time zone for this installation. Choices can be found here:
+# http://en.wikipedia.org/wiki/List_of_tz_zones_by_name
+# although not all choices may be available on all operating systems.
+# In a Windows environment this must be set to your system time zone.
+
+# Always use the local computer's time zone: take it from TZ if set,
+# otherwise work it out from /etc/localtime below
+import hashlib
+if 'TZ' in os.environ:
+ TIME_ZONE = os.environ['TZ']
+else:
+ # need to read the /etc/localtime file, which is the libc standard,
+ # and do a reverse mapping to /usr/share/zoneinfo/; the timezone may
+ # match any number of identical timezone definitions, so take the first match
+
+ zonefilelist = {}
+ ZONEINFOPATH = '/usr/share/zoneinfo/'
+ for dirpath, dirnames, filenames in os.walk(ZONEINFOPATH):
+ for fn in filenames:
+ filepath = os.path.join(dirpath, fn)
+ # strip the ZONEINFOPATH prefix to get the zone name
+ zonename = filepath[len(ZONEINFOPATH):].strip()
+ try:
+ import pytz
+ from pytz.exceptions import UnknownTimeZoneError
+ try:
+ if pytz.timezone(zonename) is not None:
+ zonefilelist[hashlib.md5(open(filepath).read()).hexdigest()] = zonename
+ except (UnknownTimeZoneError, ValueError):
+ # we expect timezone failures here, just move on
+ pass
+ except ImportError:
+ zonefilelist[hashlib.md5(open(filepath).read()).hexdigest()] = zonename
+
+ TIME_ZONE = zonefilelist[hashlib.md5(open('/etc/localtime').read()).hexdigest()]
+
+# Language code for this installation. All choices can be found here:
+# http://www.i18nguy.com/unicode/language-identifiers.html
+LANGUAGE_CODE = 'en-us'
+
+SITE_ID = 1
+
+# If you set this to False, Django will make some optimizations so as not
+# to load the internationalization machinery.
+USE_I18N = True
+
+# If you set this to False, Django will not format dates, numbers and
+# calendars according to the current locale.
+USE_L10N = True
+
+# If you set this to False, Django will not use timezone-aware datetimes.
+USE_TZ = True
+
+# Absolute filesystem path to the directory that will hold user-uploaded files.
+# Example: "/var/www/example.com/media/"
+MEDIA_ROOT = ''
+
+# URL that handles the media served from MEDIA_ROOT. Make sure to use a
+# trailing slash.
+# Examples: "http://example.com/media/", "http://media.example.com/"
+MEDIA_URL = ''
+
+# Absolute path to the directory static files should be collected to.
+# Don't put anything in this directory yourself; store your static files
+# in apps' "static/" subdirectories and in STATICFILES_DIRS.
+# Example: "/var/www/example.com/static/"
+STATIC_ROOT = ''
+
+# URL prefix for static files.
+# Example: "http://example.com/static/", "http://static.example.com/"
+STATIC_URL = '/static/'
+
+# Additional locations of static files
+STATICFILES_DIRS = (
+ # Put strings here, like "/home/html/static" or "C:/www/django/static".
+ # Always use forward slashes, even on Windows.
+ # Don't forget to use absolute paths, not relative paths.
+)
+
+# List of finder classes that know how to find static files in
+# various locations.
+STATICFILES_FINDERS = (
+ 'django.contrib.staticfiles.finders.FileSystemFinder',
+ 'django.contrib.staticfiles.finders.AppDirectoriesFinder',
+# 'django.contrib.staticfiles.finders.DefaultStorageFinder',
+)
+
+# Make this unique, and don't share it with anybody.
+SECRET_KEY = 'NOT_SUITABLE_FOR_HOSTED_DEPLOYMENT'
+
+# List of callables that know how to import templates from various sources.
+TEMPLATE_LOADERS = (
+ 'django.template.loaders.filesystem.Loader',
+ 'django.template.loaders.app_directories.Loader',
+# 'django.template.loaders.eggs.Loader',
+)
+
+MIDDLEWARE_CLASSES = (
+ 'django.middleware.common.CommonMiddleware',
+ 'django.contrib.sessions.middleware.SessionMiddleware',
+ 'django.middleware.csrf.CsrfViewMiddleware',
+ 'django.contrib.auth.middleware.AuthenticationMiddleware',
+ 'django.contrib.messages.middleware.MessageMiddleware',
+ # Uncomment the next line for simple clickjacking protection:
+ # 'django.middleware.clickjacking.XFrameOptionsMiddleware',
+)
+
+CACHES = {
+ # 'default': {
+ # 'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
+ # 'LOCATION': '127.0.0.1:11211',
+ # },
+ 'default': {
+ 'BACKEND': 'django.core.cache.backends.filebased.FileBasedCache',
+ 'LOCATION': '/tmp/django-default-cache',
+ 'TIMEOUT': 1,
+ }
+ }
+
+
+from os.path import dirname as DN
+SITE_ROOT=DN(DN(os.path.abspath(__file__)))
+
+import subprocess
+TOASTER_BRANCH = subprocess.Popen('git branch | grep "^* " | tr -d "* "', cwd = SITE_ROOT, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()[0]
+TOASTER_REVISION = subprocess.Popen('git rev-parse HEAD ', cwd = SITE_ROOT, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()[0]
+
+ROOT_URLCONF = 'toastermain.urls'
+
+# Python dotted path to the WSGI application used by Django's runserver.
+WSGI_APPLICATION = 'toastermain.wsgi.application'
+
+TEMPLATE_DIRS = (
+ # Put strings here, like "/home/html/django_templates" or "C:/www/django/templates".
+ # Always use forward slashes, even on Windows.
+ # Don't forget to use absolute paths, not relative paths.
+)
+
+TEMPLATE_CONTEXT_PROCESSORS = ('django.contrib.auth.context_processors.auth',
+ 'django.core.context_processors.debug',
+ 'django.core.context_processors.i18n',
+ 'django.core.context_processors.media',
+ 'django.core.context_processors.static',
+ 'django.core.context_processors.tz',
+ 'django.contrib.messages.context_processors.messages',
+ "django.core.context_processors.request",
+ 'toastergui.views.managedcontextprocessor',
+ )
+
+INSTALLED_APPS = (
+ 'django.contrib.auth',
+ 'django.contrib.contenttypes',
+ 'django.contrib.messages',
+ 'django.contrib.sessions',
+ 'django.contrib.admin',
+ 'django.contrib.staticfiles',
+
+ # Uncomment the next line to enable admin documentation:
+ # 'django.contrib.admindocs',
+ 'django.contrib.humanize',
+ 'bldcollector',
+ 'toastermain',
+ 'south',
+)
+
+
+INTERNAL_IPS = ['127.0.0.1', '192.168.2.28']
+
+# Load django-fresh if TOASTER_DEVEL is set, and the module is available
+FRESH_ENABLED = False
+if os.environ.get('TOASTER_DEVEL', None) is not None:
+ try:
+ import fresh
+ MIDDLEWARE_CLASSES = ("fresh.middleware.FreshMiddleware",) + MIDDLEWARE_CLASSES
+ INSTALLED_APPS = INSTALLED_APPS + ('fresh',)
+ FRESH_ENABLED = True
+ except:
+ pass
+
+DEBUG_PANEL_ENABLED = False
+if os.environ.get('TOASTER_DEVEL', None) is not None:
+ try:
+ import debug_toolbar, debug_panel
+ MIDDLEWARE_CLASSES = ('debug_panel.middleware.DebugPanelMiddleware',) + MIDDLEWARE_CLASSES
+ #MIDDLEWARE_CLASSES = MIDDLEWARE_CLASSES + ('debug_toolbar.middleware.DebugToolbarMiddleware',)
+ INSTALLED_APPS = INSTALLED_APPS + ('debug_toolbar','debug_panel',)
+ DEBUG_PANEL_ENABLED = True
+
+ # this cache backend will be used by django-debug-panel
+ CACHES['debug-panel'] = {
+ 'BACKEND': 'django.core.cache.backends.filebased.FileBasedCache',
+ 'LOCATION': '/var/tmp/debug-panel-cache',
+ 'TIMEOUT': 300,
+ 'OPTIONS': {
+ 'MAX_ENTRIES': 200
+ }
+ }
+
+ except:
+ pass
+
+
+SOUTH_TESTS_MIGRATE = False
+
+
+# We automatically detect and install applications here if
+# they have a 'models.py' or 'views.py' file
+import os
+currentdir = os.path.dirname(__file__)
+for t in os.walk(os.path.dirname(currentdir)):
+ modulename = os.path.basename(t[0])
+ #if we have a virtualenv skip it to avoid incorrect imports
+ if os.environ.has_key('VIRTUAL_ENV') and os.environ['VIRTUAL_ENV'] in t[0]:
+ continue
+
+ if ("views.py" in t[2] or "models.py" in t[2]) and not modulename in INSTALLED_APPS:
+ INSTALLED_APPS = INSTALLED_APPS + (modulename,)
+
+# A sample logging configuration. The only tangible logging
+# performed by this configuration is to send an email to
+# the site admins on every HTTP 500 error when DEBUG=False.
+# See http://docs.djangoproject.com/en/dev/topics/logging for
+# more details on how to customize your logging configuration.
+LOGGING = {
+ 'version': 1,
+ 'disable_existing_loggers': False,
+ 'filters': {
+ 'require_debug_false': {
+ '()': 'django.utils.log.RequireDebugFalse'
+ }
+ },
+ 'formatters': {
+ 'datetime': {
+ 'format': '%(asctime)s %(levelname)s %(message)s'
+ }
+ },
+ 'handlers': {
+ 'mail_admins': {
+ 'level': 'ERROR',
+ 'filters': ['require_debug_false'],
+ 'class': 'django.utils.log.AdminEmailHandler'
+ },
+ 'console': {
+ 'level': 'DEBUG',
+ 'class': 'logging.StreamHandler',
+ 'formatter': 'datetime',
+ }
+ },
+ 'loggers': {
+ 'toaster' : {
+ 'handlers': ['console'],
+ 'level': 'DEBUG',
+ },
+ 'django.request': {
+ 'handlers': ['console'],
+ 'level': 'WARN',
+ 'propagate': True,
+ },
+ }
+}
+
+if DEBUG and SQL_DEBUG:
+ LOGGING['loggers']['django.db.backends'] = {
+ 'level': 'DEBUG',
+ 'handlers': ['console'],
+ }
+
+
+# If we're using sqlite, we need to tweak the performance a bit
+from django.db.backends.signals import connection_created
+def activate_synchronous_off(sender, connection, **kwargs):
+ if connection.vendor == 'sqlite':
+ cursor = connection.cursor()
+ cursor.execute('PRAGMA synchronous = 0;')
+connection_created.connect(activate_synchronous_off)
+#
+
+
+class InvalidString(str):
+ def __mod__(self, other):
+ from django.template.base import TemplateSyntaxError
+ raise TemplateSyntaxError(
+ "Undefined variable or unknown value for: \"%s\"" % other)
+
+TEMPLATE_STRING_IF_INVALID = InvalidString("%s")
+
+import sys
+sys.path.append(
+ os.path.join(
+ os.path.join(
+ os.path.dirname(os.path.dirname(os.path.abspath(__file__))),
+ "contrib"),
+ "django-aggregate-if-master")
+ )
diff --git a/bitbake/lib/toaster/toastermain/urls.py b/bitbake/lib/toaster/toastermain/urls.py
new file mode 100644
index 0000000..521588a
--- /dev/null
+++ b/bitbake/lib/toaster/toastermain/urls.py
@@ -0,0 +1,91 @@
+#
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+# BitBake Toaster Implementation
+#
+# Copyright (C) 2013 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+from django.conf.urls import patterns, include, url
+from django.views.generic import RedirectView
+from django.views.decorators.cache import never_cache
+
+import logging
+
+logger = logging.getLogger("toaster")
+
+# Uncomment the next two lines to enable the admin:
+from django.contrib import admin
+admin.autodiscover()
+
+urlpatterns = patterns('',
+
+ # Examples:
+ # url(r'^toaster/', include('toaster.foo.urls')),
+
+ # Uncomment the admin/doc line below to enable admin documentation:
+ # url(r'^admin/doc/', include('django.contrib.admindocs.urls')),
+
+
+ # This is here to maintain backward compatibility and will be deprecated
+ # in the future.
+ url(r'^orm/eventfile$', 'bldcollector.views.eventfile'),
+
+ # if no application is selected, we have the magic toastergui app here
+ url(r'^$', never_cache(RedirectView.as_view(url='/toastergui/'))),
+)
+
+import toastermain.settings
+
+if toastermain.settings.FRESH_ENABLED:
+ urlpatterns.insert(1, url(r'', include('fresh.urls')))
+ #logger.info("Enabled django-fresh extension")
+
+if toastermain.settings.DEBUG_PANEL_ENABLED:
+ import debug_toolbar
+ urlpatterns.insert(1, url(r'', include(debug_toolbar.urls)))
+ #logger.info("Enabled django_toolbar extension")
+
+
+if toastermain.settings.MANAGED:
+ urlpatterns = [
+ # Uncomment the next line to enable the admin:
+ url(r'^admin/', include(admin.site.urls)),
+ ] + urlpatterns
+# Automatically discover urls.py in various apps, beside our own
+# and map module directories to the patterns
+
+import os
+currentdir = os.path.dirname(__file__)
+for t in os.walk(os.path.dirname(currentdir)):
+ #if we have a virtualenv skip it to avoid incorrect imports
+ if os.environ.has_key('VIRTUAL_ENV') and os.environ['VIRTUAL_ENV'] in t[0]:
+ continue
+
+ if "urls.py" in t[2] and t[0] != currentdir:
+ modulename = os.path.basename(t[0])
+ # make sure we don't already have this module name in urlpatterns
+ conflict = False
+ for p in urlpatterns:
+ if p.regex.pattern == '^' + modulename + '/':
+ conflict = True
+ if not conflict:
+ urlpatterns.insert(0, url(r'^' + modulename + '/', include ( modulename + '.urls')))
+ else:
+ logger.warn("Module \'%s\' has a regexp conflict, was not added to the urlpatterns" % modulename)
+
+from pprint import pformat
+#logger.debug("urlpatterns list %s", pformat(urlpatterns))
diff --git a/bitbake/lib/toaster/toastermain/wsgi.py b/bitbake/lib/toaster/toastermain/wsgi.py
new file mode 100644
index 0000000..031b314
--- /dev/null
+++ b/bitbake/lib/toaster/toastermain/wsgi.py
@@ -0,0 +1,35 @@
+"""
+# ex:ts=4:sw=4:sts=4:et
+# -*- tab-width: 4; c-basic-offset: 4; indent-tabs-mode: nil -*-
+#
+WSGI config for Toaster project.
+
+This module contains the WSGI application used by Django's development server
+and any production WSGI deployments. It should expose a module-level variable
+named ``application``. Django's ``runserver`` and ``runfcgi`` commands discover
+this application via the ``WSGI_APPLICATION`` setting.
+
+Usually you will have the standard Django WSGI application here, but it also
+might make sense to replace the whole Django WSGI application with a custom one
+that later delegates to the Django one. For example, you could introduce WSGI
+middleware here, or combine a Django application with an application of another
+framework.
+
+"""
+import os
+
+# We defer to a DJANGO_SETTINGS_MODULE already in the environment. This breaks
+# if running multiple sites in the same mod_wsgi process. To fix this, use
+# mod_wsgi daemon mode with each site in its own daemon process, or use
+# os.environ["DJANGO_SETTINGS_MODULE"] = "Toaster.settings"
+os.environ.setdefault("DJANGO_SETTINGS_MODULE", "toastermain.settings")
+
+# This application object is used by any WSGI server configured to use this
+# file. This includes Django's development server, if the WSGI_APPLICATION
+# setting points here.
+from django.core.wsgi import get_wsgi_application
+application = get_wsgi_application()
+
+# Apply WSGI middleware here.
+# from helloworld.wsgi import HelloWorldApplication
+# application = HelloWorldApplication(application)
diff --git a/bitbake/toaster-requirements.txt b/bitbake/toaster-requirements.txt
new file mode 100644
index 0000000..19b5293
--- /dev/null
+++ b/bitbake/toaster-requirements.txt
@@ -0,0 +1,4 @@
+Django==1.6
+South==0.8.4
+argparse==1.2.1
+wsgiref==0.1.2