osc-1.12.1/.gitignore:

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Temporary/backup files of text editors
*.swp
*~
.#*

# Packages, archives
*.rpm
*.tar*

osc-1.12.1/AUTHORS:

Adrian Schroeter
Anatoli Babenia
Andreas Bauer
Andreas Schwab
Bernhard M. Wiedemann
Christoph Thiel
Daniel Mach
Danny Al-Gaaf
Danny Kukawka
David Mayr
Dirk Mueller
Jan-Simon Möller
Juergen Weigert
Lars Vogdt
Lenz Grimmer
Ludwig Nussel
Marco Strigl
Marcus Huewe
Marcus Rueckert
Martin Mohring
Michael Schroeder
Michael Wolf
Michal Čihař
Michal Marek
Michal Vyskocil
Pavol Rusnak
Peter Poeml
Sascha Peilicke
Stephan Kulow
Susanne Oberhauser
Tom Patzig
Werner Fink
Will Stephenson

osc-1.12.1/COPYING:

                    GNU GENERAL PUBLIC LICENSE
                       Version 2, June 1991

 Copyright (C) 1989, 1991 Free Software Foundation, Inc.,
 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

  The licenses for most software are designed to take away your freedom to share and change it. By contrast, the GNU General Public License is intended to guarantee your freedom to share and change free software--to make sure the software is free for all its users. This General Public License applies to most of the Free Software Foundation's software and to any other program whose authors commit to using it. (Some other Free Software Foundation software is covered by the GNU Lesser General Public License instead.) You can apply it to your programs, too.

  When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for this service if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs; and that you know you can do these things.

  To protect your rights, we need to make restrictions that forbid anyone to deny you these rights or to ask you to surrender the rights. These restrictions translate to certain responsibilities for you if you distribute copies of the software, or if you modify it.

  For example, if you distribute copies of such a program, whether gratis or for a fee, you must give the recipients all the rights that you have. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights.

  We protect your rights with two steps: (1) copyright the software, and (2) offer you this license which gives you legal permission to copy, distribute and/or modify the software.

  Also, for each author's protection and ours, we want to make certain that everyone understands that there is no warranty for this free software.
If the software is modified by someone else and passed on, we want its recipients to know that what they have is not the original, so that any problems introduced by others will not reflect on the original authors' reputations. Finally, any free program is threatened constantly by software patents. We wish to avoid the danger that redistributors of a free program will individually obtain patent licenses, in effect making the program proprietary. To prevent this, we have made it clear that any patent must be licensed for everyone's free use or not licensed at all. The precise terms and conditions for copying, distribution and modification follow. GNU GENERAL PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. This License applies to any program or other work which contains a notice placed by the copyright holder saying it may be distributed under the terms of this General Public License. The "Program", below, refers to any such program or work, and a "work based on the Program" means either the Program or any derivative work under copyright law: that is to say, a work containing the Program or a portion of it, either verbatim or with modifications and/or translated into another language. (Hereinafter, translation is included without limitation in the term "modification".) Each licensee is addressed as "you". Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running the Program is not restricted, and the output from the Program is covered only if its contents constitute a work based on the Program (independent of having been made by running the Program). Whether that is true depends on what the Program does. 1. You may copy and distribute verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice and disclaimer of warranty; keep intact all the notices that refer to this License and to the absence of any warranty; and give any other recipients of the Program a copy of this License along with the Program. You may charge a fee for the physical act of transferring a copy, and you may at your option offer warranty protection in exchange for a fee. 2. You may modify your copy or copies of the Program or any portion of it, thus forming a work based on the Program, and copy and distribute such modifications or work under the terms of Section 1 above, provided that you also meet all of these conditions: a) You must cause the modified files to carry prominent notices stating that you changed the files and the date of any change. b) You must cause any work that you distribute or publish, that in whole or in part contains or is derived from the Program or any part thereof, to be licensed as a whole at no charge to all third parties under the terms of this License. c) If the modified program normally reads commands interactively when run, you must cause it, when started running for such interactive use in the most ordinary way, to print or display an announcement including an appropriate copyright notice and a notice that there is no warranty (or else, saying that you provide a warranty) and that users may redistribute the program under these conditions, and telling the user how to view a copy of this License. 
(Exception: if the Program itself is interactive but does not normally print such an announcement, your work based on the Program is not required to print an announcement.) These requirements apply to the modified work as a whole. If identifiable sections of that work are not derived from the Program, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works. But when you distribute the same sections as part of a whole which is a work based on the Program, the distribution of the whole must be on the terms of this License, whose permissions for other licensees extend to the entire whole, and thus to each and every part regardless of who wrote it. Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by you; rather, the intent is to exercise the right to control the distribution of derivative or collective works based on the Program. In addition, mere aggregation of another work not based on the Program with the Program (or with a work based on the Program) on a volume of a storage or distribution medium does not bring the other work under the scope of this License. 3. You may copy and distribute the Program (or a work based on it, under Section 2) in object code or executable form under the terms of Sections 1 and 2 above provided that you also do one of the following: a) Accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or, b) Accompany it with a written offer, valid for at least three years, to give any third party, for a charge no more than your cost of physically performing source distribution, a complete machine-readable copy of the corresponding source code, to be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or, c) Accompany it with the information you received as to the offer to distribute corresponding source code. (This alternative is allowed only for noncommercial distribution and only if you received the program in object code or executable form with such an offer, in accord with Subsection b above.) The source code for a work means the preferred form of the work for making modifications to it. For an executable work, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the executable. However, as a special exception, the source code distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable. If distribution of executable or object code is made by offering access to copy from a designated place, then offering equivalent access to copy the source code from the same place counts as distribution of the source code, even though third parties are not compelled to copy the source along with the object code. 4. You may not copy, modify, sublicense, or distribute the Program except as expressly provided under this License. 
Any attempt otherwise to copy, modify, sublicense or distribute the Program is void, and will automatically terminate your rights under this License. However, parties who have received copies, or rights, from you under this License will not have their licenses terminated so long as such parties remain in full compliance. 5. You are not required to accept this License, since you have not signed it. However, nothing else grants you permission to modify or distribute the Program or its derivative works. These actions are prohibited by law if you do not accept this License. Therefore, by modifying or distributing the Program (or any work based on the Program), you indicate your acceptance of this License to do so, and all its terms and conditions for copying, distributing or modifying the Program or works based on it. 6. Each time you redistribute the Program (or any work based on the Program), the recipient automatically receives a license from the original licensor to copy, distribute or modify the Program subject to these terms and conditions. You may not impose any further restrictions on the recipients' exercise of the rights granted herein. You are not responsible for enforcing compliance by third parties to this License. 7. If, as a consequence of a court judgment or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Program at all. For example, if a patent license would not permit royalty-free redistribution of the Program by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Program. If any portion of this section is held invalid or unenforceable under any particular circumstance, the balance of the section is intended to apply and the section as a whole is intended to apply in other circumstances. It is not the purpose of this section to induce you to infringe any patents or other property right claims or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of the free software distribution system, which is implemented by public license practices. Many people have made generous contributions to the wide range of software distributed through that system in reliance on consistent application of that system; it is up to the author/donor to decide if he or she is willing to distribute software through any other system and a licensee cannot impose that choice. This section is intended to make thoroughly clear what is believed to be a consequence of the rest of this License. 8. If the distribution and/or use of the Program is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the Program under this License may add an explicit geographical distribution limitation excluding those countries, so that distribution is permitted only in or among countries not thus excluded. In such case, this License incorporates the limitation as if written in the body of this License. 9. 
The Free Software Foundation may publish revised and/or new versions of the General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Program specifies a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of this License, you may choose any version ever published by the Free Software Foundation. 10. If you wish to incorporate parts of the Program into other free programs whose distribution conditions are different, write to the author to ask for permission. For software which is copyrighted by the Free Software Foundation, write to the Free Software Foundation; we sometimes make exceptions for this. Our decision will be guided by the two goals of preserving the free status of all derivatives of our free software and of promoting the sharing and reuse of software generally. NO WARRANTY 11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. END OF TERMS AND CONDITIONS How to Apply These Terms to Your New Programs If you develop a new program, and you want it to be of the greatest possible use to the public, the best way to achieve this is to make it free software which everyone can redistribute and change under these terms. To do so, attach the following notices to the program. It is safest to attach them to the start of each source file to most effectively convey the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found. Copyright (C) This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. 
You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. Also add information on how to contact you by electronic and paper mail. If the program is interactive, make it output a short notice like this when it starts in an interactive mode: Gnomovision version 69, Copyright (C) year name of author Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'. This is free software, and you are welcome to redistribute it under certain conditions; type `show c' for details. The hypothetical commands `show w' and `show c' should show the appropriate parts of the General Public License. Of course, the commands you use may be called something other than `show w' and `show c'; they could even be mouse-clicks or menu items--whatever suits your program. You should also get your employer (if you work as a programmer) or your school, if any, to sign a "copyright disclaimer" for the program, if necessary. Here is a sample; alter the names: Yoyodyne, Inc., hereby disclaims all copyright interest in the program `Gnomovision' (which makes passes at compilers) written by James Hacker. , 1 April 1989 Ty Coon, President of Vice This General Public License does not permit incorporating your program into proprietary programs. If your program is a subroutine library, you may consider it more useful to permit linking proprietary applications with the library. If this is what you want to do, use the GNU Lesser General Public License instead of this License. osc-1.12.1/MANIFEST.in000066400000000000000000000000771475337502500141350ustar00rootroot00000000000000include NEWS include README.md include AUTHORS include COPYING osc-1.12.1/NEWS000066400000000000000000001005671475337502500131030ustar00rootroot00000000000000- 1.12.1 - Command-line: - Improve 'maintenancerequest' command to inherit description from superseded request - Library: - Tell the build script to disable repos/containers handling for normal builds - Improve detecting git packages, use .osc metadata from project in parent directory - Fix retrieving apiurl from git repositories in get_api_url() in the command-line code - Fix typo in core.binary() that caused text files being detected as binary - Improve core.binary() by always considering data that contains \0 binary - Fix assembling scm_url when scmsync_obsinfo.revision is None - 1.12.0 - Command-line: - Add 'git-obs pr' command - Add 'git-obs api' command for making arbitrary API calls - Change 'git-obs' to use owner/repo[#pull] arguments consistently - Change 'git-obs repo clone' command to store ssh command in core.sshCommand git config option - Fix 'request list --interactive' command that wasn't showing request forward prompt when message was part of the state change - Library: - Avoid fetching _scmsync.obsinfo when scmsync url contains 'noobsinfo' query parameter - Fix ssh key priority in 'git-obs repo clone' command - Fix loading config entries with underscores instead of dashes in the keys - Fix detecting binary files - Fix diff highligting - Connection: - Implement retries in 'git-obs' - 1.11.1 - Command-line: - Fix 'linkpac' command for projects with a project link - Fix 'linkpac' command by always removing element from target meta - Fix command descriptions in help by moving the import statements under docstring in the do_*() methods - 1.11.0 - Command-line: - Add hint how to deal with scmsync-nobranch scenarios - Speed loading 
commands up - Fix 'log' command to work correctly with --meta --patch options - Document that 'status' command prints unmodified files in verbose output - Hide progressbar.Bar widget after ProgressBar has completed - Avoid printing urlquoted file names in 'getbinaries' command - Configuration: - Mute oscrc permissions warning when there's no password set - Connection: - Save session cookie even if a request fails - Library: - Fix local building in git projects - Extend xml.etree.ElementTree.ParseError output with a snippet of broken XML - Sync ScmsyncObsinfo with obs-scm-bridge - Move running obs_scm_bridge into run_obs_scm_bridge() function - Convert remaining makeurl() query parameters from deprecated string to dict - Add more operators to XPathQuery - Make 'title' and 'description' fields optional in the Package model - Fix progress bar code to support progressbar2 - Support LoongArch64 architecture - Mute pylint errors in show_package_disabled_repos() that are false-positives - Fix another bytes/unicode issues in core.link_pac() by replacing ElementTree code with XML models - Fix handling 'lock' field in 'Package' model - Fix handling SimpleFlag model - Extend RequestActionSource with 'repository' attribute - Load most of the modules in commandline.py on-demand - Fix traceback in 'linkpac --disable-build' - Spec: - Add conflict with older versions of obs-service-source_validator to ensure that version compatible with .osc store 2.0 is installed - 1.10.1 - Command-line: - Fix crash in 'build' command due to undefined 'build_root' variable - Spec: - Add missing python3-ruamel.yaml BuildRequires - Fix rpmlint error about creating the /usr/libexec/git/obs symlink - 1.10.0 - Command-line: - New git-obs executable with several subcommands - Support parameters on token triggers - Change 'update' command to treat empty '.osc/_in_update/_files' as missing - Change 'checkout' command to allow checkout obs imported sources of scmsync sources - Change 'creq' command to allow specifying source and target repository - Change the help output to determine executable name from the command-line arguments - Add '--no-timestamps' parameter to 'build' command - Fix 'token' command to avoid crash when a wipe token exists - Fix crash in 'results' command by skipping non-status elements in project results - Fix crash in 'build' command when building with --local-package --alternative-project from a locally initialized .osc package - Print buildroot directory when build or chroot finishes - Library: - Add 'gitea_api' module - Support the mkosi build type - Support copy of scmsync packages - Support Dockerfile.* in _multibuild packages - Implement obs_api.Status.data property that returns status data as a dictionary - Implement obs_api.Package.cmd_fork() - Fix meter by setting default of use_pb_fallback to False - Fix store migration from 1.0 to 2.0 when there is a 'sources' file that would conflict with 'sources' directory - Fix revision'd checkout of scmsync package - Refactor code handling _scmsync.obsinfo to obs_api.scmsync_obsinfo.ScmsyncObsinfo class - Store the container annotation in the "containers" directory - Spec: - Install symlink /usr/bin/git-obs to /usr/libexec/git/obs to make git-obs available as a git sub-command - 1.9.2 - Command-line: - Fix 'resolved' command to skip subdirectories in package checkouts - Fix 'comment' command to resolve project name - Implement meter.SimpleTextMeter that prints what's being downloaded - Connection: - Use configured 'http_headers' in HTTP requests - Library: - 
Fix storing _buildinfo and _buildconfig files in .osc rather than among the source files - Use findtext() instead of find().text - Spec: - Recommend python3-zstandard to support opening control.tar.zst - 1.9.1 - Command-line: - Add 'createrequest release' subcommand - Change 'review list' command to omit requests with 'declined' state - Fix `osc build --local-package` - Fix typos - Library: - Fix extraction of the 'ar' archives when they don't contain header with long filenames - Fix iterating through arch_list in core.get_repos_of_project() - Check for None & len() of ET.Element instead of bool() in PackageBase.get_meta_value() - Spec: - Replace 'setup.py test' that was removed in setuptools 72 with 'python3 -m unittest' - 1.9.0 - Security: - Fix possibility to overwrite special files in .osc (CVE-2024-22034 boo#1225911) Source files are now stored in the 'sources' subdirectory which prevents name collisons. This requires changing version of '.osc' store to 2.0. - Command-line: - Introduce build --checks parameter - Library: - OscConfigParser: Remove automatic __name__ option - 1.8.3 - Command-line: - Change 'repairwc' command to always run all repair steps - Library: - Make most of the fields in KeyinfoPubkey and KeyinfoSslcert models optional - Fix colorize() to avoid wrapping empty string into color escape sequences - Provide default values for kwargs.get/pop in get_results() function - 1.8.2 - Library: - Change 'repairwc' command to fix missing .osc/_osclib_version - Make error message in check_store_version() more generic to work for both projects and packages - Fix check_store_version in project store - 1.8.1 - Command-line: - Fix 'linkpac' command crash when used with '--disable-build' or '--disable-publish' option - 1.8.0 - Command-line: - Improve 'submitrequest' command to inherit description from superseded request - Fix 'mv' command when renaming a file multiple times - Improve 'info' command to support projects - Improve 'getbinaries' command by accepting '-M' / '--multibuild-package' option outside checkouts - Add architecture filtering to 'release' command - Change 'results' command so the normal and multibuild packages have the same output - Change 'results' command to use csv writer instead of formatting csv as string - Add couple mutually exclusive options errors to 'results' command - Set a default value for 'results --format' only for the csv output - Add support for 'results --format' for the default text mode - Update help text for '--format' option in 'results' command - Add 'results --fail-on-error/-F' flag - Redirect venv warnings from stderr to debug output - Configuration: - Fix config parser to throw an exception on duplicate sections or options - Modify conf.get_config() to print permissions warning to stderr rather than stdout - Library: - Run check_store_version() in obs_scm.Store and fix related code in Project and Package - Forbid extracting files with absolute path from 'cpio' archives (boo#1122683) - Forbid extracting files with absolute path from 'ar' archives (boo#1122683) - Remove no longer valid warning from core.unpack_srcrpm() - Make obs_api.KeyinfoSslcert keyid and fingerprint fields optional - Fix return value in build build.create_build_descr_data() - Fix core.get_package_results() to obey 'multibuild_packages' argument - Tests: - Fix tests so they don't modify fixtures - 1.7.0 - Command-line: - Add 'person search' command - Add 'person register' command - Add '-M/--multibuild-package' option to '[what]dependson' commands - Update '-U/--user' 
option in 'maintainer' command to accept also an email address - Fix 'branch' command to allow using '--new-package' option on packages that do not exist - Fix 'buildinfo' command to include obs:cli_debug_packages by default - Fix 'buildinfo' command to send complete local build environment as the 'build' command does - Fix 'maintainer --devel-project' to raise an error if running outside a working copy without any arguments - Fix handling arguments in 'service remoterun prj/pac' - Fix 'rebuild' command so the '--all' option conflicts with the 'package' argument - Fix crash when removing 'scmsync' element from dst package meta in 'linkpac' command - Fix crash when reading dst package meta in 'linkpac' command - Allow `osc rpmlint` to infer prj/pkg from CWD - Propagate exit code from the run() and do_() commandline methods - Give a hint where a scmsync git is hosted - Fix crash in 'updatepacmetafromspec' command when working with an incomplete spec - Improve 'updatepacmetafromspec' command to expand rpm spec macros by calling rpmspec to query the data - Improve 'build' and 'buildinfo' commands by uploading *.inc files to OBS for parsing BuildRequires (boo#1221340) - Improve 'service' command by printing names of running services - Improve 'getbinaries' command by ignoring source and debuginfo filters when a binary name is specified - Change 'build' command to pass '--jobs' option to 'build' tool only if 'build_jobs' > 0 - Clarify 'list' command's help that that listing binaries doesn't contain md5 checksums - Improve 'log' command: produce proper CSV and XML outputs, add -p/--patch option for the text output - Allow setlinkrev to set a specific vrev - Document '--buildtool-opt=--noclean' example in 'build' command's help - Fix handling the default package argument on the command-line - Configuration: - Document loading configuration from env variables - Connection: - Don't retry on error 400 - Remove now unused 'retry_on_400' http_request() option from XmlModel - Revert "Don't retry on 400 HTTP status code in core.server_diff()" - Revert "connection: Allow disabling retry on 400 HTTP status code" - Authentication: - Update SignatureAuthHandler to support specifying ssh key by its fingerprint - Use ssh key from ssh agent that contains comment 'obs=' - Use strings instead of bytes in SignatureAuthHandler - Cache password from SecretService to avoid spamming user with an accept dialog - Never ask for credentials when displaying help - Remove unused SignatureAuthHandler.get_fingerprint() - Library: - Add rootless build support for 'qemu' VM type - Support package linking of packages from scmsync projects - Fix do_createrequest() function to return None instead of request id - Replace invalid 'if' with 'elif' in BaseModel.dict() - Fix crash when no prefered packages are defined - Add XmlModel class that encapsulates manipulation with XML - Add obs_api.Person.cmd_register() for registering new users - Fix conf.get_config() to ignore file type bits when comparing oscrc perms - Fix conf.get_config() to correctly handle overrides when env variables are set - Fix output.tty.IS_INTERACTIVE when os.isatty() throws OSError - Improve cmdln.HelpFormatter to obey newline characters - Update list of color codes in 'output.tty' module - Remove core.setDevelProject() in favor of core.set_devel_project() - Move removing control characters to output.sanitize_text() - Improve sanitize_text() to keep selected CSI escape sequences - Add output.pipe_to_pager() that pipes lines to a pager without creating an 
intermediate temporary file - Fix output.safe_write() in connection with NamedTemporaryFile - Modernize output.run_pager() - Extend output.print_msg() to accept 'error' and 'warning' values of 'to_print' argument - Add XPathQuery class for translating keyword arguments to an xpath query - Add obs_api.Keyinfo class - Add obs_api.Package class - Add Package.get_revision_list() for listing commit log - Add obs_api.PackageSources class for handling OBS SCM sources - Add obs_api.Person class - Add obs_api.Project class - Add obs_api.Request class - Add obs_api.Token class - Allow storing apiurl in the XmlModel instances - Allow retrieving default field value from top-level model - Fix BaseModel to convert dictionaries to objects on retrieving a model list - Fix BaseModel to always deepcopy mutable defaults on first use - Implement do_snapshot() and has_changed() methods to determine changes in BaseModel - Implement total ordering on BaseModel - Add comments with available attributes/elements to edited XML - Refactoring: - Migrate repo {list,add,remove} commands to obs_api.Project - Migrate core.show_package_disabled_repos() to obs_api.Package - Migrate core.Package.update_package_meta() to obs_api.Package - Migrate core.get_repos_of_project() to obs_api.Project - Migrate core.get_repositories_of_project() to obs_api.Project - Migrate core.show_scmsync() to obs_api.{Package,Project} - Migrate core.set_devel_project() to obs_api.Package - Migrate core.show_devel_project() to obs_api.Package - Migrate Fetcher.run() to obs_api.Keyinfo - Migrate core.create_submit_request() to obs_api.Request - Migrate 'token' command to obs_api.Token - Migrate 'whois/user' command to obs_api.Person - Migrate 'signkey' command to obs_api.Keyinfo - Move print_msg() to the 'osc.output' module - Move run_pager() and get_default_pager() from 'core' to 'output' module - Move core.Package to obs_scm.Package - Move core.Project to obs_scm.Project - Move functions manipulating store from core to obs_scm.store - Move store.Store to obs_scm.Store - Move core.Linkinfo to obs_scm.Linkinfo - Move core.Serviceinfo to obs_scm.Serviceinfo - Move core.File to obs_scm.File - Merge _private.project.ProjectMeta into obs_api.Project - Spec: - Remove dependency on /usr/bin/python3 using %python3_fix_shebang macro (bsc#1212476) - 1.6.1 - Command-line: - Use busybox compatible commands for completion - Change 'wipe' command to use the new get_user_input() function - Fix error 500 in running 'meta attribute ' - Configuration: - Fix resolving config symlink to the actual config file - Honor XDG_CONFIG_HOME and XDG_CACHE_HOME env vars - Warn about ignoring XDG_CONFIG_HOME and ~/.config/osc/oscrc if ~/.oscrc exists - Library: - Error out when branching a scmsync package - New get_user_input() function for consistent handling of user input - Move xml_indent, xml_quote and xml_unquote to osc.util.xml module - Refactor makeurl(), deprecate query taking string or list arguments, drop osc_urlencode() - Remove all path quoting, rely on makeurl() - Always use dict query in makeurl() - Fix core.slash_split() to strip both leading and trailing slashes - 1.6.0 - Command-line: - The 'token --trigger' command no longer sets '--operation=runservice' by default. 
- Change 'token --create' command to require '--operation' - Fix 'linkdiff' command error 400: prj/pac/md5 not in repository - Update 'build' command to support building 'productcompose' build type with updateinfo.xml data - Don't show meter in terminals that are not interactive - Fix traceback when running osc from an arbitrary git repo that fails to map branch to a project (boo#1218170) - Configuration: - Implement reading credentials from environmental variables - Allow starting with an empty config if --configfile is either empty or points to /dev/null - Implement 'quiet' conf option - Password can be an empty string (commonly used with ssh auth) - Connection: - Allow -X HEAD on osc api requests as well - Library: - Fix credentials managers to consistently return Password - Fix Password.encode() on python < 3.8 - Refactor 'meter' module, use config settings to pick the right class - Convert to using f-strings - Use Field.get_callback to handle quiet/verbose and http_debug/http_full_debug options - Implement get_callback that allows modifying returned value to the Field class - Add support for List[BaseModel] type to Field class - Report class name when reporting an error during instantiating BaseModel object - Fix exporting an empty model field in BaseModel.dict() - Fix initializing a sub-model instance from a dictionary - Implement 'Enum' support in models - Fix Field.origin_type for Optional types - Drop unused 'exclude_unset' argument from BaseModel.dict() method - Store cached model defaults in self._defaults, avoid sharing references to mutable defaults - Limit model attributes to predefined fields by forbidding creating new attributes on fly - Store model values in self._values dict instead of private attributes - Spec: - Recommend openssh-clients for ssh-add that is required during ssh auth - Add 0%{?amzn} macro that wasn't usptreamed - 1.5.1 - Library: - Avoid using '/public/' API routes - Update 'osc.util.models' to avoid including lazy defaults in the rendered man pages - Spec: - Simplify distro-specific macros - Use %{?rhel} macros - 1.5.0 - Command-line: - Change 'rdiff' command to display diff for _project if no package is specified - Update 'build' command with initial support for 'productcompose' build type - Change 'build' command to disable preinstall images in rootless builds - Configuration: - Fix conf.write_initial_config() to use read_file() instead of deprecated readfp() - Other: - Support installing osc into virtualenv - Spec: - Recommend 'build' on openSUSE/SLE and 'obs-build' on all other distros - 1.4.4 - Command-line: - Fix autocompletion for new locations - Configuration: - Fix apiurl_aliases handling in OscOptions.__getitem__ - Fix crash when there's no [general]/apiurl option in the config file - Spec: - Install bash completion with .bash suffix rather than .sh - 1.4.3 - Configuration: - Allow undefined fields in Options and HostOptions - 1.4.2 - Command-line: - Change NoPBTextMeter to display no output at all - Fix retrieving the configured user in 'user' command - Configuration: - Restore 'passx' host option that contains an obfuscated password - Fix retrieving a password in case a function returns another callable - Fix retrieving config values in core.vc_export_env() - 1.4.1 - Configuration: - Always display apiurl when asking for credentials - Ask for new credentials when user is missing from an apiurl section in the config file - Library: - Fix testing revision for being empty - Fix core.change_request_state_template() to always return a string - 
Tests: - Replace 'git init -b' with 'git init' and 'git checkout -b' - Spec: - Run fdupes after install - List the python sitelib paths explicitly - Mark csh completion files as configs - Own zsh completion dirs to mute rpmlint errors - Move bash completion from /etc to /usr/share - 1.4.0 - Command-line: - Add rootless build support to 'build' command for 'kvm' and 'podman' vm types - Print a hint to clean the build root after a failed build - Avoid adding a newline to prompt in 'wipe' command - Fix 'build' command to pass '--vm-type' option to the underlying build tool - Add '--just-print-buildroot' option to print build root path and exit to 'build' command - Add support for keep_packages_locked on request revoke - Import zsh completition made by Holger Macht and improve it - Use XDG locations in completion - Fix 'search' command to resolve '-B .' to the current project - Add '-M/--multibuild-package' option to 'checkconstraints' command - Allow constraints file with remote request in 'checkconstraints' command - Unify how the 'commit' and 'build' commands work with '--noservice' option - Fix 'request show' command to print superseded_by information - Fix 'service' command to support already documented 'r' abbreviation for 'run' - Configuration: - Implement 'exclude_files' and 'include_files' config options that allow skipping files in the 'checkout' command - Fix api_host_options for custom CAs (cafile and capath options work again) - Switch 'osc.conf.config' from dict to Options class with type checking - Rename conf.Options.build_type to vm_type to be consistent with obs build and osc --vm-type option - Update list of supported vm_type values in conf.Options.vm_type - Remove any duplicated code loading configuration from ENV - Library: - Add 'osc.util.models' module implementing an alternative pydantic-like data validation - Add 'osc.util.xdg' module for handling XDG paths - Fix handling empty vm_type in Store.last_buildroot - Spec: - Install zsh completion - Build and install oscrc man page - 1.3.1 - Command-line: - Fix string + int concatenation errors in 'build' command by using f-strings instead - Fix '--all' option in 'rebuild' command - Fix 'build' command when '--alternative-project' option is specified and the git branch cannot be mapped to a project - Stop suggesting that the working directory is git/mercurial/svn/cvs - Be helpful in deprecating commands - Configuration: - Add more config options among integer options - Library: - Fix GitStore to error out properly if there is no 'origin' remote in the git repo - print_buildlog: Remove control characters from build log before printing (CVE-2012-1095) - 1.3.0 - Command-line: - Add experimental support of Git SCM to the 'build' command - Add experimental support of Git SCM to the 'service' command - Make 'meta' command capable of editing attributes - Change '--add' option in 'meta attribute' command to skip duplicate values - Add an interactive option to display build log in 'request list -i' command - Add '--setopt' option for setting config options from the command-line - Fix '--prefer-pkgs' option for noinstall="1" packages in kiwi builds - Change 'checkout' command to print open requests only when running in an interactive terminal - Enhance '--force' option description in the 'request' command - Connection: - Fix crash when HTTP_PROXY env contains no auth - Library: - Add 'git_scm' module for handling packages that live in git scm rather than usual obs scm - Change pop_project_package_from_args() to use get_store() to 
support Git SCM - Change osc.build module to use 'store' object instead of calling core.store_*() functions - Use alternative project if specified in parse_repoarchdescr() - Fix xml indent() on Python 3.6 - Fix less pager by adding '-R' to LESS env - Improve print_msg() and migrate some arbitrary prints to it - 1.2.0 - Command-line: - Add 'repo' command and subcommands for managing repositories in project meta - Extend 'browse' command to open requests in a web browser - Add highlighting for 'osc diff' and similar commands - Fix 'api' command to stream output to avoid running out of memory - Fix printing utf-8 characters to stdout - Connection: - Fix ValueError: Cannot set verify_mode to CERT_NONE when check_hostname is enabled - Authentication: - Correctly handle passwords with utf-8 characters - Library: - Fix crash when submiting a SCM package which has no _link - Fix local service execution of scmsync packages - Detect target package by its full name, instead of assuming its origin is identical to the source package type - Other: - Spell openSUSE correctly - 1.1.4 - Command-line: - Change 'review list' command to display open requests (state: new, review, declined) - Fix running osc in an AppImage by switching to the correct working directory - Handle ProtocolError exception - Library: - Add 'req_states' parameter to osc.core.get_review_list() - Connection: - Fix grabber to work with old urllib3 versions that do not contain URLSchemeUnknown exception - 1.1.3 - Command-line: - Backup edited messages and notify user about them when osc errors out - Consider only open requests when listing requests with a given review state - Fix 'diff' command when no files are specified - Configuration: - Add glob support to the 'trusted_prj' config option - Library: - Fix core.xmlindent() to work with ElementTree objects - 1.1.2 - Command-line: - Add '--buildtool-opt' option passing options to underlying rpmbuild to the 'build' command - Fix 'diff' command to support diffing selected files only - Identify inherited packages in the 'dependson' command output - Bring the '--debug' option back to the 'buildinfo' command - Fix 'buildhistory' command by setting the type of the '--limit' option to int - Library: - Fix a traceback when failed to unlock a keyring - Don't retry on 400 HTTP status code in core.server_diff() - Clean-up the '.old' folder if an exception happens - Document 'popt' attribute in the _link template - Fix build.get_repo() to return only directory that contains 'repodata/repomd.xml' - Connection: - Retry on receiving the following HTTP status codes: 400, 500, 502, 503, 504 - Allow disabling retry on 400 HTTP status code - Fix urlgrab to skip mirrors with invalid scheme - 1.1.1 - Command-line: - Fix 'creq' command that wasn't working at all - Fix 'ls' command when listing all projects by setting project argument to '/' - Fix regression: Run interactive config setup on missing config or credentials - Append plugin dirs to sys.path to allow loading modules installed next to the plugins - Do not recurse into subdirs when loading plugins - Configuration: - Display apiurl when asking for a username or a password - If apiurl is not set in interactive_config_setup(), use apiurl from DEFAULTS - Library: - Decode entities in HTTPError message body - 1.1.0 - Command-line: - New class-based commands - Sort commands before printing help - No longer read plugins from /var/lib/osc-plugins - Configuration: - Do not error out on setting oscrc permissions if the file is owned by another user - Library: - 
Restore 'include_request_from_project' conf option functionality - Simplify how babysitter works with options and config - Prefer f-strings over c-style string expansion - 1.0.1 - Configuration: - Fix a cut&paste error in setting 'disable_hdrmd5_check' config option - Connection: - Set Content-Type of POST requests without data to 'application/x-www-form-urlencoded' - 1.0.0 - Command-line: - Use '.' as a wildcard that resolves to a project or a package name from the current working copy, for example 'osc rdiff ./. -c ' - Add 'create-pbuild-config' (cpc) command - Add '--disable-build' option to the 'branch' command - Add '--disable-build' option to the 'linkpac' command - Add '-X/--extra-pkgs-from' option to the 'build' command - Add '--add' option to the 'meta' command that appends new values to the existing values - Replace '-q/--hide-legend' option in 'prjresults' command with global '-q/--quiet' option - Replace '--debug' option in the 'getbinaries' command with '--debuginfo' to avoid conflicts with the global '--debug' - Replace '--verbose' option in the 'build' command with '--verbose-mode' to avoid conflicts with the global '--verbose' - Replace '--version' option with 'version' command - Enable forwarding requests to the parent projects in 'rq list -i' and 'sr accept' commands - Make use of '-M/--multibuild' option consistent across the commands - Enhance '--force' option in the 'commit' command to allow deleting packages even if other packages depend on them - Print URLs and xpaths in the debug rather than the verbose mode - Fix 'add' command for github /archive/ URLs - Fix 'buildhistory' command to produce proper output using build_table() and a CSV writer - Fix 'linkpac' command to avoid copying a lock from a locked package to the target package - Fix 'setlinkrev' command to write a log message on setting a revision - Fix 'submitrequest' command not to error out on using an alias to apiurl - Fix 'update' command on project level for scm packages - Fix '--mine' option in the 'request' command to show only requests created by the user - Fix the documentation url in the 'maintenancerequest' and 'createincident' commands - Remove '--skip-validation' option from the 'commit' command - Remove '--oldpkg', '--oldprj' options from the 'rdiff' command - Remove 'create', 'list', 'log', 'show', 'decline', 'accept', 'delete', 'revoke' subcommands from the 'submitrequest' command - Remove '--seperate-requests' option from the 'submitrequest' command - Remove '--raw' option from the 'develproject' command - Remove '--maintained' option from the 'search' command - Remove 'deleteprj' command - Remove 'deletepac' command - Remove 'editmeta' command - Remove 'results_meta' command - Remove 'rlog' command - Remove 'rprjresults' command - Remove 'rresults' command - Configuration: - Add 'project_separator' config option - Add 'disable_hdrmd5_check' config option to ignore hdrmd5 mismatches - Remove 'plaintext_passwd' config option - Library: - Add Store class that will replace store_{read,write}* functions - Remove 'GnomeKeyringCredentialsManager' and related code - Remove Request.get_creator() method - Replace unmaintained cmdln.py with a custom code based on argparse - Replace core.findpacs() with Package.from_paths() and Package.from_paths_nofail() - Drop Python 2 support, Python 3.6 is the lowest supported version - Code cleanups, following PEP 8 and the latest best practices now - Improve handling of hdrmd5 mismatches - Handle uncompressed Debian packages - Replace arbitrary XML escaping code 
with xml_escape()
    - Fix race condition in using .old directory in Serviceinfo.execute()
    - Fix manual run of source service
  - Connection:
    - Switch http_request() to urllib3 and cryptography (drop M2Crypto dependency)
    - Warn when using HTTP connection, make HTTPS the default
    - Send HTTP header Accept: application/xml
    - Wait between retries
  - Authentication:
    - Support signature (ssh) auth including ssh-agent forwarding
    - Lock cookiejar to prevent unnecessary signature auth by waiting for a session cookie
    - Print user and apiurl when prompting for a password
    - Fix a crash when deleting a password
  - Known issues:
    - Commandline option values cannot start with '-', for example: osc build -x -vim
      Background: This is a limitation of the underlying Python argparse
      How to fix: Use '=' to join the option with its value: osc build -x=-vim
      URL: https://github.com/openSUSE/osc/issues/1227
    - Commandline positional arguments no longer recognize '/' as a universal argument separator
      Background: This is an attempt to bring some consistency into argument parsing, reducing the number of separators and value combinations
      How to fix: Separate the project from the package with a space
      URL: https://github.com/openSUSE/osc/issues/1272

osc-1.12.1/README.md:

[![unit tests](https://github.com/openSUSE/osc/actions/workflows/tests.yaml/badge.svg)](https://github.com/openSUSE/osc/actions/workflows/tests.yaml)
[![docs](https://readthedocs.org/projects/opensuse-commander/badge/?version=latest)](https://opensuse-commander.readthedocs.io/en/latest/?badge=latest)
[![codecov](https://codecov.io/gh/openSUSE/osc/branch/master/graph/badge.svg)](https://codecov.io/gh/openSUSE/osc)
[![code climate](https://github.com/openSUSE/osc/actions/workflows/codeql.yml/badge.svg)](https://github.com/openSUSE/osc/actions/workflows/codeql.yml)
[![contributors](https://img.shields.io/github/contributors/openSUSE/osc.svg)](https://github.com/openSUSE/osc/graphs/contributors)

# openSUSE Commander

openSUSE Commander (osc) is a command-line interface to the [Open Build Service (OBS)](https://github.com/openSUSE/open-build-service/).

## Installation

RPM packages are available in the [openSUSE:Tools](http://download.opensuse.org/repositories/openSUSE:/Tools/) repository.

    zypper addrepo --repo http://download.opensuse.org/repositories/openSUSE:/Tools/openSUSE_Tumbleweed/openSUSE:Tools.repo
    zypper install osc

**Unstable** RPM packages are available in the [OBS:Server:Unstable](http://download.opensuse.org/repositories/OBS:/Server:/Unstable/) repository.

    zypper addrepo --repo http://download.opensuse.org/repositories/OBS:/Server:/Unstable/openSUSE_Factory/OBS:Server:Unstable.repo
    zypper install osc

To install from git, run

    ./setup.py build
    ./setup.py install

Alternatively, you can directly use `./osc-wrapper.py` from the source directory, which is easier if you develop on osc.

## Configuration

When you use osc for the first time, it will ask you for your username and password and store them in `~/.config/osc/oscrc`.
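For reference, the stored configuration is a plain INI-style file. A minimal `~/.config/osc/oscrc` might look roughly like the following; the exact options written by the interactive setup can differ, and the values below are placeholders:

    [general]
    apiurl = https://api.opensuse.org

    [https://api.opensuse.org]
    user = your_username
    pass = your_password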
## Keyrings

Osc can store passwords in keyrings instead of `~/.config/osc/oscrc`. To use them, you need python3-keyring with a backend of your choice installed:

- kwalletd5 (a password manager for KDE)
- secrets (a password manager for GNOME)
- python3-keyring-keyutils (a python-keyring backend for the kernel keyring)

If you want to switch to using a keyring, you need to delete the apiurl section from `~/.config/osc/oscrc`; you will then be asked for credentials again, which will be stored in the keyring application.

## Usage

For more details please check the [openSUSE wiki](https://en.opensuse.org/openSUSE:OSC).

To list existing content on the server

    osc ls                    # list projects
    osc ls Apache             # list packages in a project
    osc ls Apache subversion  # list files of package of a project

Check out content

    osc co Apache                 # entire project
    osc co Apache subversion      # a package
    osc co Apache subversion foo  # single file

Update a working copy

    osc up
    osc up [pac_dir]  # update a single package by its path
    osc up *          # from within a project dir, update all packages
    osc up            # from within a project dir, update all packages
                      # AND check out all newly added packages

If an update can't be merged automatically, a file is in `C` (conflict) state, and conflicts are marked with special `<<<<<<<` and `>>>>>>>` lines. After manually resolving the problem, use

    osc resolved foo

Upload changed content

    osc ci  # current dir
    osc ci
    osc ci file1 file2 ...

Show the status (which files have been changed locally)

    osc st
    osc st
    osc st file1 file2 ...

Mark files to be added or removed on the next 'checkin'

    osc add file1 file2 ...
    osc rm file1 file2 ...

Add all new files in the local copy and remove all disappeared files

    osc addremove

Generate a diff to view the changes

    osc diff  # current dir
    osc diff file1 file2 ...

Show the build results of the package

    osc results
    osc results [repository]

Show the log file of a package (you need to be inside a package directory)

    osc log

Show the URLs of .repo files which are package sources for Yum/YaST/smart

    osc repourls [dir]

Trigger a package rebuild for all repositories/architectures of a package

    osc rebuildpac [dir]

Show available repository/build targets

    osc repository

Show the configured repository/build targets of a project

    osc repository

Show meta information

    osc meta Apache
    osc meta Apache subversion
    osc id username

Edit meta information (creates a new package/project if it doesn't exist)

    osc editmeta Apache
    osc editmeta Apache subversion

Update package meta data with metadata taken from the spec file

    osc updatepacmetafromspec

There are other commands, which you may not need (they may be useful in scripts)

    osc repos
    osc buildconfig
    osc buildinfo

Locally build a package (see 'osc help build' for more info)

    osc build specfile [--clean|--noinit]

Update a package to different sources (directory foo_package_source)

    cp -a foo_package_source foo
    cd foo
    osc init
    osc addremove
    osc ci
    cd $OLDPWD
    rm -r foo

## Contributing

Report [issues](https://github.com/openSUSE/osc/issues) or submit [pull-requests](https://github.com/openSUSE/osc/pulls) to the [osc](https://github.com/openSUSE/osc/issues) project on GitHub.

## Testing

Unit tests can be run from a git checkout by executing

    ./setup.py test
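Note that newer setuptools releases removed the `setup.py test` subcommand (see the 1.9.1 notes in the NEWS file above), so on such systems the suite can be run with the standard unittest runner from the checkout root instead:

    python3 -m unittest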
osc-1.12.1/contrib/complete.csh:

onintr -
if (! $?prompt || ! $?tcsh) goto end
if ($tcsh == 1) goto end
set rev=$tcsh:r
set rel=$rev:e
set pat=$tcsh:e
set rev=$rev:r
if ($rev > 5 && $rel > 1) then
    if ( -s /usr/share/osc/complete ) complete osc 'p@*@`\/usr/share/osc/complete`@'
    if ( -s /usr/lib64/osc/complete ) complete osc 'p@*@`\/usr/lib64/osc/complete`@'
    if ( -s /usr/lib/osc/complete ) complete osc 'p@*@`\/usr/lib/osc/complete`@'
endif
end:
onintr

osc-1.12.1/contrib/complete.sh:

test -z "$BASH_VERSION" && return
complete -o default _nullcommand >/dev/null 2>&1 || return
complete -r _nullcommand >/dev/null 2>&1 || return
test -s /usr/share/osc/complete && complete -o default -C /usr/share/osc/complete osc && return
test -s /usr/lib64/osc/complete && complete -o default -C /usr/lib64/osc/complete osc && return
test -s /usr/lib/osc/complete && complete -o default -C /usr/lib/osc/complete osc

osc-1.12.1/contrib/osc.complete:

#!/bin/bash
#
# Helper script for completion, usage with tcsh:
#
#   complete osc 'p@*@`\osc.complete`@'
#
# usage with bash
#
#   complete -C osc.complete osc
#
# Author: Werner Fink
#

## For debugging only:
## Choose your terminal not identical with the test terminal
## exec 2>/dev/pts/9
## set -x

set -o noclobber
shopt -s extglob
typeset -i last
typeset -i off
typeset -i count
typeset -i offset
typeset -i remove
typeset -i colon
typeset -r OIFS="$IFS"

# Do not pollute the terminal session with warnings or errors
exec 2>/dev/null

if test "/proc/$PPID/exe" -ef /bin/tcsh ; then
    export COMP_TYPE=63
    export COMP_KEY=9
    export COMP_LINE="${COMMAND_LINE}"
    export COMP_POINT="${#COMMAND_LINE}"
    let colon=0
else
    COMMAND_LINE="${COMP_LINE:0:$COMP_POINT}"
    let colon=0
    case "$COMP_WORDBREAKS" in
    *:*) let colon=1
    esac
    [[ $COMMAND_LINE =~ \\: ]] && COMMAND_LINE="${COMMAND_LINE//\\:/:}"
fi

IFS="${IFS}="
cmdline=($COMMAND_LINE)
IFS="$OIFS"

case "${cmdline[0]}" in
iosc|isc|osc) ;;
*) exit 1
esac

let last=${#COMMAND_LINE}
let last--
let count=${#cmdline[@]}
let count--
test "${COMMAND_LINE:$last}" = " " && let count++
unset last

XDG_CACHE_HOME="${XDG_CACHE_HOME:-$HOME/.cache}"
XDG_CONFIG_HOME="${XDG_CONFIG_HOME:-$HOME/.config}"
for xdg_dir in "${XDG_CACHE_HOME}" "${XDG_CONFIG_HOME}"; do
    if [ ! -d "${xdg_dir}" ]; then
        mkdir -p "${xdg_dir}"
    fi
done

projects="${XDG_CACHE_HOME}/osc.projects"
if [ -f ~/.osc.projects ]; then
    rm ~/.osc.projects -f
fi
oscrc="${XDG_CONFIG_HOME}/osc/oscrc"
command=osc
oscopts=(--version --help --debugger --post-mortem --traceback --http-full-debug --debug --apiurl -A --config -c --no-keyring --no-gnome-keyring --verbose --quiet)
osccmds=(abortbuild add addremove aggregatepac api ar bco bl blt branch branchco bsdevelproject bse bugowner build buildconfig buildhist buildhistory buildinfo buildlog buildlogtail cat changedevelreq changedevelrequest checkconstraints checkin checkout chroot ci co comment commit config copypac cr createincident createrequest creq del delete deletereq deleterequest dependson detachbranch develproject di diff distributions dists dr dropreq droprequest getbinaries getpac help importsrcpkg info init jobhist jobhistory lbl ldiff less linkdiff linkpac linktobranch list LL localbuildlog log ls maintained maintainer maintenancerequest man mbranch meta metafromspec mkpac mr mv my patchinfo pdiff platforms pr prdiff prjresults projdiff projectdiff pull r rbl rblt rbuildlog rbuildlogtail rdelete rdiff rebuild rebuildpac releaserequest remotebuildlog remotebuildlogtail remove repairlink repairwc repos repositories repourls reqbs reqbugownership reqmaintainership reqms request requestbugownership requestmaintainership resolved results revert review rm rq rremove se search service setlinkrev signkey sm sr st status submitpac submitreq submitrequest tr triggerreason undelete unlock up update updatepacmetafromspec user vc whatdependson who whois wipebinaries workerinfo)
oscreq=(list log show accept decline revoke reopen setincident supersede approvenew checkout clone)
oscrev=(show list add accept decline reopen supersede)
oscmy=(work pkg prj rq sr)
osccmt=(list create delete)
osccmtkind=(package project request)
oscprj=""
oscpkg=""
lnkprj=""
lnkpkg=""
apiurl=""
alias=""
test -s "${PWD}/.osc/_project" && read -t 1 oscprj < "${PWD}/.osc/_project"
test -s "${PWD}/.osc/_package" && read -t 1 oscpkg < "${PWD}/.osc/_package"
if test -s "${PWD}/.osc/_files" ; then
    lnkprj=$(command sed -rn '/ /dev/null)
fi
if test "${cmdline[0]}" = isc ; then
    alias=internal
fi
case "${cmdline[1]}" in
-A|--apiurl)
    if test -n "${cmdline[2]}" -a -s "${oscrc}" ; then
        hints=($(sed -rn '/^(aliases=|\[http)/{s/,/ /g;s/(aliases=|\[|\])//gp}' < "${oscrc}" 2> /dev/null))
        for h in ${hints[@]} ; do
            case "$h" in
            http*)
                tmp=$(sed -rn '\@^\['${h}'@,\@=@{\@^aliases=@{s@[^=]+=([^,]+),.*@\1@p};}' < "${oscrc}" 2> /dev/null)
                if test "${cmdline[2]}" = "$h" ; then
                    alias=$tmp
                    break
                fi
                ;;
            *)
                if test "${cmdline[2]}" = "$h" ; then
                    alias=$h
                    break
                fi
            esac
        done
    fi
esac
if test -n "$alias" ; then
    projects="${projects}.${alias}"
    command="$command -A $alias"
fi

update_projects_list () {
    if test -s "${projects}" ; then
        typeset -i ctime=$(command stat -c '%Z' ${projects})
        typeset -i now=$(command date +'%s')
        if ((now - ctime > 86400)) ; then
            if tmp=$(mktemp ${projects}.XXXXXX) ; then
                command ${command} ls / >| $tmp
                mv -f $tmp ${projects}
            fi
        fi
    else
        command ${command} ls / >| "${projects}"
    fi
}

projects () {
    local -a list
    local -a argv
    local -i argc=0
    local arg cur
    for arg; do
        if test $arg == "--" ; then
            let argc++
            break
        fi
        argv[argc++]=$arg
    done
    shift $argc
    update_projects_list
    cur="$1"
    if ((colon)) ; then
        local colon_word
        colon_word=${cur%${cur##*:}}
        if test -n "${cur}" ; then
            builtin compgen -W '`grep -E "^${cur}" ${projects}`' -- "${cur}" | sed -r "s@^${colon_word}@@g"
        else
            builtin compgen -W '`cat
${projects}`' -- "${cur}" | sed -r "s@^${colon_word}@@g" fi else builtin compgen -W "${list[*]}" -- "${cur}" fi } packages () { local -a list local -a argv local -i argc=0 local arg cur for arg; do if test $arg == "--" ; then let argc++ break fi argv[argc++]=$arg done shift $argc cur="$1" if test -n "${cur}" ; then list=($(command ${command} ls ${argv[@]}|command grep -E "^${cur}")) else list=($(command ${command} ls ${argv[@]})) fi builtin compgen -W "${list[*]}" -- "${cur}" } repositories () { local -a list local -a argv local -i argc=0 local arg for arg; do if test $arg == "--" ; then let argc++ break fi argv[argc++]=$arg done shift $argc if test -n "$1" ; then list=($(command ${command} meta prj ${argv[@]}|\ command sed -rn '//{s@^\s*(.*)@\1@p}'|\ command sort -u|command grep -E "^$1")) else list=($(command ${command} meta prj ${argv[@]}|\ command sed -rn '//{s@^\s*(.*)@\1@p}'|\ command sort -u)) fi builtin compgen -W "${list[*]}" -- ${1+"$@"} } targets () { local -a targets=() local -a argv local -i argc=0 local arg for arg; do if test $arg == "--" ; then let argc++ break fi argv[argc++]=$arg done shift $argc let argc=0 for arg in $(builtin compgen -o filenames -o bashdefault -f -X '.osc' -- ${1+"$@"}); do test -d $arg && targets[argc]=$arg/ || targets[argc]=$arg let argc++ done builtin compgen -W "${argv[*]}${targets+ ${targets[*]}}" -- ${1+"$@"} } users () { update_projects_list if test -s ${projects} ; then command sed -rn "/^home:$1/{ s/^home:([^:]*):.*/\1/p}" ${projects}|command sort -u elif test -s "${oscrc}"; then command sed -rn '/^(user=)/{s/(user=)//p}' "${oscrc}" | command sort -u else command id -un fi } submit () { local -i pos=$1 local target if ((pos == 1)) ; then if test -n "${oscprj}" -a -z "${cmdline[2]}" ; then builtin compgen -W "${oscprj}" -- "${cmdline[2]}" else if [[ -n "${oscprj}" && "${oscprj}" =~ "${cmdline[2]}" ]] ; then builtin compgen -W "${oscprj}" -- "${cmdline[2]}" else projects -- "${cmdline[2]}" fi fi elif ((pos == 2)) ; then if test -n "${oscpkg}" -a -z "${cmdline[3]}" ; then builtin compgen -W "${oscpkg}" -- "${cmdline[3]}" else if [[ -n "${oscpkg}" && "${oscpkg}" =~ "${cmdline[3]}" ]] ; then builtin compgen -W "${oscpkg}" -- "${cmdline[3]}" else packages "${cmdline[2]}" -- "${cmdline[3]}" fi fi elif ((pos == 3)) ; then if test -n "${lnkprj}" -a -z "${cmdline[4]}" ; then builtin compgen -W "${lnkprj}" -- "${cmdline[4]}" else projects -- "${cmdline[4]}" fi elif ((pos == 4)) ; then target="${lnkpkg}" target="${target:+$target }$oscpkg" if test -n "${target}" ; then builtin compgen -W "${target}" -- "${cmdline[5]}" else packages "${cmdline[4]}" -- "${cmdline[5]}" fi fi } # # The main options # let remove=0 while test "${cmdline[1+remove]::1}" = "-" ; do case "${cmdline[1+remove]}" in -A|--apiurl) if ((count-remove == 1)); then builtin compgen -W "${oscopts[*]}" -- "${cmdline[1+remove]}" exit elif ((count-remove == 2)); then if test -s "${oscrc}" ; then hints=($(sed -rn '/^(aliases=|\[http)/{s/,/ /g;s/(aliases=|\[|\])//gp}' "${oscrc}" | sort -u)) builtin compgen -W "${hints[*]}" -- "${cmdline[2+remove]}" else builtin compgen -P https:// -A hostname fi exit fi let remove+=2 ;; -c|--config) if ((count-remove == 1)); then builtin compgen -W "${oscopts[*]}" -- "${cmdline[1+remove]}" exit elif ((count-remove == 2)); then builtin compgen -o filenames -o bashdefault -f -X '.osc' -- "${cmdline[2+remove]}" exit fi let remove+=2 ;; -*) if ((count-remove == 1)); then builtin compgen -W "${oscopts[*]}" -- "${cmdline[1+remove]}" exit fi let remove++ ;; *) break esac 
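# NOTE: the case statement above either completes the global option (or its
# argument) under the cursor and exits, or just advances `remove`:
# -A/--apiurl and -c/--config consume two words, any other dashed option one.
# After the loop, the counted words are spliced out of `cmdline` so that the
# sub-command name is always found at index 1 by the handlers further down.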
done if ((remove)) ; then cmdline=(${cmdline[0]} ${cmdline[@]:remove+1}) let count-=remove let remove=0 fi case "${cmdline[1]}" in add|addremove|ar) opts=(--help --recursive) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" else for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count >= 2)) ; then targets ${opts[*]} -- "${cmdline[count]}" fi ;; build) opts=(--help --oldpackages --disable-cpio-bulk-download --release --baselibs --disable-debuginfo --debuginfo --alternative-project --vm-type --linksources --local-package --build-uid --userootforbuild --define --without --with --ccache --icecream --jobs --root --extra-pkgs --keep-pkgs --prefer-pkgs --noservice --no-service --no-verify --nochecks --no-checks --noinit --no-init --overlay --rsync-dest --rsync-src --no-changelog --preload --offline --clean) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --alternative-project) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then projects -- "${cmdline[count]}" exit elif ((count-remove == off+2)) ; then repositories "${cmdline[off+1+remove]}" -- "${cmdline[off+2+$remove]}" exit elif ((count-remove == off+3)) ; then architectures "${cmdline[off+1+remove]}" -- "${cmdline[off+3+remove]}" exit fi let remove+=4 ;; --define) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then builtin compgen -P "'" -S "'" -W 'macro\ definition' -- "${cmdline[off+1+remvoe]}" exit elif ((count-remove == off+2)) ; then exit fi let remove+=3 ;; --@(root|oldpackages|keep-pkgs|prefer-pkgs|rsync-dest|rsync-src)) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then builtin compgen -o dirnames -d -- ${cmdline[off+1+remove]} exit fi let remove+=2 ;; --@(release|icecream|jobs|without|with|overlay)) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; --build-uid) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then builtin compgen -W "399:399 $(id -u):$(id -g)" -- "${cmdline[off+2+remove]}" exit fi let remove+=2 ;; -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then specs=($(command ls *.spec)) builtin compgen -W "${opts[*]} ${specs[*]}" -- "${cmdline[count]}" fi ;; branch|getpac|bco|branchco) opts=(--help --revision --new-package --maintenance --noaccess --extend-package-names --add-repositories --force --nodevelproject) if ((count == 1)) ; then builtin 
compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then case "${cmdline[1]}" in branch) opts[${#opts[@]}]=--checkout esac for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --revision) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; --message) exit ;; -*) if ((count-remove == off)) ; then ((count >= 6)) && opts[${#opts[@]}]=--message builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" projects -- "${cmdline[2]}" elif ((count == 3)) ; then packages "${cmdline[2]}" -- "${cmdline[3]}" elif ((count == 4)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[4]}" projects -- "${cmdline[4]}" elif ((count == 5)) ; then packages "${cmdline[4]}" -- "${cmdline[5]}" elif ((count == 6)) ; then builtin compgen -W "--message ${opts[*]}" -- "${cmdline[6]}" fi ;; list|ls|ll|LL) opts=(--help --meta --deleted --long --verbose --unexpand --expand --binaries --repo --revision --arch) if ((count == 1)) ; then builtin compgen -W 'list ls ll LL' -- "${cmdline[count]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --@(revision|repo|arch)) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" projects -- "${cmdline[2]}" elif ((count == 3)) ; then packages "${cmdline[2]}" -- "${cmdline[3]}" elif ((count == 4)) ; then packages -u "${cmdline[2]}" "${cmdline[3]}" -- "${cmdline[4]}" else builtin compgen -W "${opts[*]}" -- "${cmdline[count]}" fi ;; less|cat) opts=(--help --meta --unexpand --expand --revision) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --revision) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" projects -- "${cmdline[2]}" elif ((count == 3)) ; then packages "${cmdline[2]}" -- "${cmdline[3]}" elif ((count == 4)) ; then packages -u "${cmdline[2]}" 
"${cmdline[3]}" -- "${cmdline[4]}" else builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" fi ;; sr|submitpac|submitreq|submitrequest) opts=(--help --yes --diff --no-update --no-cleanup --cleanup --separate-requests --nodevelproject --supersede --revision) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --@(revision|supersede)) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; --message) exit ;; -*) if ((count-remove == off)) ; then ((count >= 6)) && opts[${#opts[@]}]=--message builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count >= 2 && count <= 5)) ; then if ((count == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" fi submit $((count-1)) 1 elif ((count == 6)) ; then builtin compgen -W "--message ${opts[*]}" -- "${cmdline[6]}" fi ;; rq|request|review) opts=(--help --involved-projects --exclude-target-project --non-interactive --interactive --edit) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then while test "${cmdline[2+remove]::1}" = "-" ; do case "${cmdline[2+remove]}" in --exclude-target-project) if ((count-remove == 2)); then builtin compgen -W "${opts[*]}" -- "${cmdline[2+remove]}" exit elif ((count-remove == 3)) ; then builtin echo -n EXCLUDE_TARGET_PROJECT exit fi let remove+=2 ;; -*) if ((count-remove == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:2} ${cmdline[@]:remove+2}) let count-=remove let remove=0 fi fi case "${cmdline[2]}" in log|checkout) opts=(--help) if ((count == 2)) ; then builtin compgen -W 'log checkout' -- "${cmdline[count]}" elif ((count >= 3)) ; then for ((off=3; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 3)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[3]}" builtin echo -n 'ID' else builtin compgen -W "${opts[*]}" -- "${cmdline[count]}" fi ;; revoke|clone) opts=(--help) if ((count == 2)) ; then builtin compgen -W 'revoke clone' -- "${cmdline[count]}" elif ((count >= 3)) ; then for ((off=3; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --message) exit ;; -*) if ((count-remove == off)) ; then ((count >= 4)) && opts[${#opts[@]}]=--message builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 3)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[3]}" builtin echo -n 'ID' elif ((count == 4)) ; then builtin compgen -W "--message ${opts[*]}" -- "${cmdline[4]}" fi ;; setincident) 
opts=(--help) if ((count == 2)) ; then builtin compgen -W 'setincident' -- "${cmdline[count]}" elif ((count >= 3)) ; then for ((off=3; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --message) exit ;; -*) if ((count-remove == off)) ; then ((count >= 4)) && opts[${#opts[@]}]=--message builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 3)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[3]}" builtin echo -n 'ID' elif ((count == 4)) ; then builtin echo -n 'INCIDENT' elif ((count == 5)) ; then builtin compgen -W "--message ${opts[*]}" -- "${cmdline[5]}" fi ;; supersede|add|accept|decline|reopen) case "${cmdline[1]}" in rq|request) opts=() ;; review) opts=(--user --group --project --package) esac if ((count == 2)) ; then builtin compgen -W 'supersede add accept decline reopen' -- "${cmdline[count]}" elif ((count >= 3)) ; then typeset -i supersede=0 case "${cmdline[2]}" in supersede) let supersede=1 esac typeset project="" for ((off=3; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --user) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then user=($(users ${cmdline[off+1+remove]})) builtin compgen -W "${user[*]}" -- ${cmdline[off+1+remove]} fi let remove+=2 ;; --group) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; --project) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then projects -- "${cmdline[off+1+remove]}" exit else project="${cmdline[off+1+remove]}" fi let remove+=2 ;; --package) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then test -z "$project" && project=PROJECT_REQUIRED packages -u "$project" -- "${cmdline[off+1+remove]}" exit fi let remove+=2 ;; --message) exit ;; -*) if ((count-remove == off)) ; then ((count >= 4)) && opts[${#opts[@]}]=--message builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 3)) ; then builtin compgen -W "${opts+${opts[*]} }ID" -- "${cmdline[3]}" elif ((count == 4)) ; then if ((supersede)) ; then builtin echo -n 'SUPERSEDING_ID' else builtin compgen -W '--message' -- "${cmdline[4]}" fi elif ((count == 5 && supersede)) ; then builtin compgen -W '--message' -- "${cmdline[5]}" fi ;; approvenew) opts=(--help) if ((count == 2)) ; then builtin compgen -W 'approvenew' -- "${cmdline[count]}" elif ((count >= 3)) ; then for ((off=3; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --message) exit ;; -*) if ((count-remove == off)) ; then ((count >= 4)) && opts[${#opts[@]}]=--message builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if 
((count == 3)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[3]}" projects -- "${cmdline[3]}" elif ((count == 4)) ; then builtin compgen -W "--message ${opts[*]}" -- "${cmdline[4]}" fi ;; show) opts=(--diff --brief --source-buildstatus) if ((count == 2)) ; then builtin compgen -W 'show' -- "${cmdline[count]}" elif ((count >= 3)) ; then for ((off=3; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=("${cmdline[@]:0:off}" "${cmdline[@]:remove+off}") let count-=remove let remove=0 fi done fi if ((count == 3)) ; then builtin compgen -W "${opts[*]} ID" -- "${cmdline[3]}" else builtin compgen -W "${opts[*]}" -- "${cmdline[count]}" fi ;; list) case "${cmdline[1]}" in rq|request) opts=(--mine --user --state -days --type --bugowner) ;; review) opts=(--user --group --project --package) esac if ((count == 2)) ; then builtin compgen -W 'list' -- "${cmdline[count]}" elif ((count >= 3)) ; then typeset project="" for ((off=3; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --user) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then user=($(users ${cmdline[off+1+remove]})) builtin compgen -W "${user[*]}" -- ${cmdline[off+1+remove]} fi let remove+=2 ;; --group) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; --project) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then projects -- "${cmdline[off+1+remove]}" exit else project="${cmdline[off+1+remove]}" fi let remove+=2 ;; --package) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then test -z "$project" && project=PROJECT_REQUIRED packages -u "$project" -- "${cmdline[off+1+remove]}" exit fi let remove+=2 ;; -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=("${cmdline[@]:0:off}" "${cmdline[@]:remove+off}") let count-=remove let remove=0 fi done fi if ((count == 3)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[3]}" projects -- "${cmdline[3]}" fi if ((count == 4)) ; then packages -u "${cmdline[3]}" -- "${cmdline[4]}" fi ;; *) if ((count == 2)) ; then case "${cmdline[1+remove]}" in rq|request) builtin compgen -W "${opts[*]} ${oscreq[*]}" -- "${cmdline[2]}" ;; review) builtin compgen -W "${opts[*]} ${oscrev[*]}" -- "${cmdline[2]}" ;; esac fi esac ;; my) opts=(--help --maintained --verbose --exclude-project --user --all --maintainer --bugowner) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --user) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then user=($(users ${cmdline[off+1+remove]})) builtin compgen -W "${user[*]}" -- ${cmdline[off+1+remove]} fi 
let remove+=2 ;; --exclude-project) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then projects -- "${cmdline[off+1+remove]}" exit fi let remove+=2 ;; -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then builtin compgen -W "${opts[*]} ${oscmy[*]}" -- "${cmdline[2]}" elif ((count >= 3)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[3]}" fi ;; comment) opts=(--comment --parent) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count == 2)) ; then builtin compgen -W "${opts[*]} ${osccmt[*]}" -- "${cmdline[2]}" elif ((count == 3)) ; then builtin compgen -W "${opts[*]} ${osccmtkind[*]}" -- "${cmdline[3]}" fi ;; copypac|linkpac) opts=(--help --expand --to-apiurl --revision --keep-develproject --keep-link --keep-maintainers --client-side-copy) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --@(revision|to-apiurl)) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" projects -- "${cmdline[2]}" elif ((count == 3)) ; then packages "${cmdline[2]}" -- "${cmdline[3]}" elif ((count == 4)) ; then projects -- "${cmdline[4]}" elif ((count == 5)) ; then packages "${cmdline[4]}" -- "${cmdline[5]}" elif ((count == 6)) ; then builtin compgen -W "--message ${opts[*]}" -- "${cmdline[6]}" fi ;; delete) opts=(--help --force) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" else for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count >= 2)) ; then targets ${opts[*]} -- "${cmdline[count]}" fi ;; deleterequest|deletereq|droprequest|dropreq|dr) typeset -i repository=0 opts=(--help --repository --message) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count == 2)) ; then projects -- "${cmdline[2]}" elif ((count >= 3)) ; then for ((off=3; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --repository) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 let repository++ ;; --message) exit ;; -*) if ((count-remove == off)) ; 
then ((count >= 4)) && opts[${#opts[@]}]=--message builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 3)) ; then if ((repository)) ; then builtin compgen -W '--message' -- "${cmdline[4]}" else builtin compgen -W "${opts[*]}" -- "${cmdline[3]}" packages "${cmdline[2]}" -- "${cmdline[3]}" fi elif ((count == 4)) ; then builtin compgen -W "--message ${opts[*]}" -- "${cmdline[4]}" fi ;; changedevelrequest|changedevelreq|cr) opts=(--help) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" projects -- "${cmdline[2]}" elif ((count == 3)) ; then packages "${cmdline[2]}" -- "${cmdline[3]}" elif ((count == 4)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" projects -- "${cmdline[4]}" elif ((count == 5)) ; then packages "${cmdline[4]}" -- "${cmdline[5]}" fi ;; rdiff) opts=(--help --unexpand --missingok --change --plain --revision --meta) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --@(revision|change)) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" projects -- "${cmdline[2]}" elif ((count == 3)) ; then packages "${cmdline[2]}" -- "${cmdline[3]}" elif ((count == 4)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[4]}" projects -- "${cmdline[4]}" elif ((count == 5)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[5]}" packages "${cmdline[4]}" -- "${cmdline[5]}" fi ;; ci|commit|checkin) opts=(--help --skip-local-service-run --noservice --verbose --skip-validation --force --file --message) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --file) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; --message) exit ;; -*) if ((count-remove == off)) ; then ((count >= 3)) && opts[${#opts[@]}]=--message builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then 
cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then targets ${opts[*]} -- "${cmdline[2]}" elif ((count == 3)) ; then builtin compgen -W "--message ${opts[*]}" -- "${cmdline[3]}" fi ;; co|checkout) opts=(--help --limit-size --server-side-source-service-files --source-service-files --output-dir --current-dir --meta --unexpand-link --expand-link --revision) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --@(output-dir|revision)) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ ;; *) break esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" projects -- "${cmdline[2]}" elif ((count == 3)) ; then packages "${cmdline[2]}" -- "${cmdline[3]}" elif ((count == 4)) ; then packages "${cmdline[2]}" "${cmdline[3]}" -- "${cmdline[4]}" fi ;; maintainer) opts=(--help --role --delete --set-bugowner-request --set-bugowner --all --add --devel-project --verbose --nodevelproject --email --bugowner --bugowner-only) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --@(delete|set-bugowner-request|set-bugowner|add|devel-projec)) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; --devel-projec) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then projects -- "${cmdline[off+1+remove]}" exit fi let remove+=2 ;; --role) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then builtin compgen -W 'bugowner maintainer involved' -- "${cmdline[off+1+remove]}" exit fi let remove+=2 ;; -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" projects -- "${cmdline[2]}" elif ((count > 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[count]}" if ((count == 3)) ; then packages "${cmdline[2]}" -- "${cmdline[3]}" fi fi ;; up|update) opts=(--help --limit-size --server-side-source-service-files --source-service-files --expand-link --unexpand-link --revision) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then while test "${cmdline[2+remove]::1}" = "-" ; do case "${cmdline[2+remove]}" in --revision) if ((count-remove == 2)); then builtin compgen 
-W "${opts[*]}" -- "${cmdline[2+remove]}" exit elif ((count-remove == 3)) && test -z "${cmdline[3+remove]}" ; then hint="${cmdline[2+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; -*) if ((count-remove == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2+remove]}" exit fi let remove++ esac done fi builtin compgen -W "${opts[*]}" -- "${cmdline[count]}" ;; meta) opts=(--help --delete --set --remove-linking-repositories --create --edit --file --force --attribute-project --attribute-defaults --attribute) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --@(attribute|file)) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; --set) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then builtin echo -n ATTRIBUTE_VALUES exit fi let remove+=2 ;; -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then builtin compgen -W 'prj pkg prjconf user pattern attribute' -- "${cmdline[2]}" elif ((count == 3)) ; then if test "${cmdline[2]}" = user ; then user=($(users ${cmdline[3]})) builtin compgen -W "${user[*]}" -- ${cmdline[3]} else projects -- "${cmdline[3]}" fi elif ((count == 4)) ; then if test "${cmdline[2]}" = pkg ; then packages "${cmdline[3]}" -- "${cmdline[4]}" elif test "${cmdline[2]}" = attribute ; then builtin compgen -W "${opts[*]}" -- "${cmdline[4]}" packages "${cmdline[3]}" -- "${cmdline[4]}" elif test "${cmdline[2]}" = user ; then user=($(users ${cmdline[4]})) builtin compgen -W "${user[*]}" -- ${cmdline[4]} else builtin compgen -W "${opts[*]}" -- ${cmdline[4]} fi elif ((count == 5)) ; then builtin compgen -W "${opts[*]}" -- ${cmdline[5]} fi ;; wipebinaries) opts=(--help --all --unresolvable --broken --build-failed --build-disabled --repo --arch) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --@(repo|arch)) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" projects -- "${cmdline[2]}" elif ((count-remove == 3)) ; then packages "${cmdline[2]}" -- "${cmdline[3]}" fi ;; help) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" else builtin compgen -W "${osccmds[*]}" -- "${cmdline[2]}" fi ;; search) opts=(--help --all 
--binaryversion --baseproject --binary --csv --mine --maintained --maintainer --bugowner --involved --version --verbose --limit-to-attribute --description --title --project --package --substring --exact --repos-baseurl) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then while test "${cmdline[2+remove]::1}" = "-" ; do case "${cmdline[2+remove]}" in --@(binaryversion|baseproject|limit-to-attribute)) if ((count-remove == 2)); then builtin compgen -W "${opts[*]}" -- "${cmdline[2+remove]}" exit elif ((count-remove == 3)) && test -z "${cmdline[3+remove]}" ; then hint="${cmdline[2+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; --@(maintainer|bugowner|involved)) if ((count-remove == 2)); then builtin compgen -W "${opts[*]}" -- "${cmdline[2+remove]}" exit elif ((count-remove == 3)) ; then user=($(users ${cmdline[3+remove]})) builtin compgen -W "${user[*]}" -- ${cmdline[3+remove]} fi let remove+=2 ;; -*) if ((count-remove == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2+remove]}" exit fi let remove++ esac done fi if ((count-remove == 2)) ; then builtin compgen -W "${opts[*]} SEARCH_TERM" -- "${cmdline[count]}" fi ;; pr|prjresults) opts=(--help --show-excluded --vertical --repo --arch --name-filter --status-filter --xml --csv --hide-legend) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[2]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --@(repo|arch)) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; --name-filter) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then builtin echo -n EXPR exit fi let remove+=2 ;; --status-filter) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) ; then status=(disabled failed finished building succeeded broken scheduled unresolvable signing blocked) builtin compgen -W "${status[*]}" -- "${cmdline[off+1+remove]}" exit fi let remove+=2 ;; -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" projects -- "${cmdline[2]}" else builtin compgen -W "${opts[*]}" -- "${cmdline[count]}" fi ;; r|results) opts=(--help --format --csv --xml --watch --verbose --arch --repo --last-build) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --@(repo|arch|format)) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then 
cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" projects -- "${cmdline[2]}" elif ((count == 3)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[3]}" packages "${cmdline[2]}" -- "${cmdline[3]}" elif ((count > 3)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[count]}" fi ;; diff|linkdiff) opts=(--help --missingok --link --plain --revision --change) typeset -i link=0 if ((count == 1)) ; then builtin compgen -W "${osccmds[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then case "${cmdline[1]}" in linkdiff) let link++ ;; esac for ((off=2; off<=count; off++)) ; do while test "${cmdline[off+remove]::1}" = "-" ; do case "${cmdline[off+remove]}" in --@(revision|change)) if ((count-remove == off)); then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit elif ((count-remove == off+1)) && test -z "${cmdline[off+1+remove]}" ; then hint="${cmdline[off+remove]^^}" builtin echo -n ${hint##*-} exit fi let remove+=2 ;; --link) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ let link++ ;; -*) if ((count-remove == off)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[off+remove]}" exit fi let remove++ esac done if ((remove)) ; then cmdline=(${cmdline[*]:0:off} ${cmdline[@]:remove+off}) let count-=remove let remove=0 fi done fi if ((count == 2)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[2]}" ((link)) && projects -- "${cmdline[2]}" elif ((count == 3)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[3]}" ((link)) && packages "${cmdline[2]}" -- "${cmdline[3]}" elif ((count > 3)) ; then builtin compgen -W "${opts[*]}" -- "${cmdline[count]}" fi ;; workerinfo) opts=(--help) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]} ${oscopts[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then if test "${cmdline[count]::1}" = "-" ; then builtin compgen -W "${opts[*]}" -- "${cmdline[count]}" else targets ${opts[*]} -- "${cmdline[count]}" fi fi ;; checkconstraints) opts=(--help --ignore-file) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]} ${oscopts[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then if test "${cmdline[count]::1}" = "-" ; then builtin compgen -W "${opts[*]}" -- "${cmdline[count]}" else targets ${opts[*]} -- "${cmdline[count]}" fi fi ;; *) opts=(--help) if ((count == 1)) ; then builtin compgen -W "${osccmds[*]} ${oscopts[*]}" -- "${cmdline[count]}" elif ((count >= 2)) ; then if test "${cmdline[count]::1}" = "-" ; then builtin compgen -W "${opts[*]}" -- "${cmdline[count]}" else targets ${opts[*]} -- "${cmdline[count]}" fi fi esac osc-1.12.1/contrib/osc.fish000066400000000000000000000377221475337502500155050ustar00rootroot00000000000000# fish completion for git # vim: smartindent:expandtab:ts=2:sw=2 function __fish_osc_needs_command set cmd (commandline -opc) if contains "$cmd" 'osc' 'osc help' return 0 end return 1 end function __fish_osc_using_command set cmd (commandline -opc) if [ (count $cmd) -gt 1 ] for arg in $argv if [ $arg = $cmd[2] ] return 0 end end end return 1 end # general options complete -f -c osc -n 'not __fish_osc_needs_command' -s A -l apiurl -d 'specify URL to access API server at or an alias' complete -f -c osc -n 'not __fish_osc_needs_command' -s c -l config -d 'specify alternate configuration file' complete -f -c osc -n 'not __fish_osc_needs_command' -s d -l debug -d 'print info useful for debugging' complete -f -c osc -n 'not 
__fish_osc_needs_command' -l debugger -d 'jump into the debugger before executing anything' complete -f -c osc -n 'not __fish_osc_needs_command' -s h -l help -d 'show this help message and exit' complete -f -c osc -n 'not __fish_osc_needs_command' -s H -l http-debug -d 'debug HTTP traffic (filters some headers)' complete -f -c osc -n 'not __fish_osc_needs_command' -l http-full-debug -d 'debug HTTP traffic (filters no headers)' complete -f -c osc -n 'not __fish_osc_needs_command' -l no-gnome-keyring -d 'disable usage of GNOME Keyring' complete -f -c osc -n 'not __fish_osc_needs_command' -l no-keyring -d 'disable usage of desktop keyring system' complete -f -c osc -n 'not __fish_osc_needs_command' -l post-mortem -d 'jump into the debugger in case of errors' complete -f -c osc -n 'not __fish_osc_needs_command' -s q -l quiet -d 'be quiet, not verbose' complete -f -c osc -n 'not __fish_osc_needs_command' -s t -l traceback -d 'print call trace in case of errors' complete -f -c osc -n 'not __fish_osc_needs_command' -s v -l verbose -d 'increase verbosity' complete -f -c osc -n 'not __fish_osc_needs_command' -l version -d 'show program\'s version number and exit' # osc commands complete -f -c osc -n '__fish_osc_needs_command' -a 'add' -d 'Mark files to be added upon the next commit' complete -f -c osc -n '__fish_osc_needs_command' -a 'addremove ar' -d 'Adds new files, removes disappeared files' complete -f -c osc -n '__fish_osc_needs_command' -a 'aggregatepac' -d '"Aggregate" a package to another package' complete -f -c osc -n '__fish_osc_needs_command' -a 'api' -d 'Issue an arbitrary request to the API' complete -f -c osc -n '__fish_osc_needs_command' -a 'branch bco branchco getpac' -d 'Branch a package' complete -f -c osc -n '__fish_osc_needs_command' -a 'chroot' -d 'into the buildchroot' complete -f -c osc -n '__fish_osc_needs_command' -a 'clean' -d 'removes all untracked files from the package working ...' complete -f -c osc -n '__fish_osc_needs_command' -a 'commit checkin ci' -d 'Upload content to the repository server' complete -f -c osc -n '__fish_osc_needs_command' -a 'config' -d 'get/set a config option' complete -f -c osc -n '__fish_osc_needs_command' -a 'copypac' -d 'Copy a package' complete -f -c osc -n '__fish_osc_needs_command' -a 'createincident' -d 'Create a maintenance incident' complete -f -c osc -n '__fish_osc_needs_command' -a 'createrequest creq' -d 'create multiple requests with a single command' complete -f -c osc -n '__fish_osc_needs_command' -a 'delete del remove rm' -d 'Mark files or package directories to be deleted upon ...' complete -f -c osc -n '__fish_osc_needs_command' -a 'deleterequest deletereq dr dropreq droprequest' -d 'Request to delete (or "drop") a package or project' complete -f -c osc -n '__fish_osc_needs_command' -a 'dependson whatdependson' -d 'Show the build dependencies' complete -f -c osc -n '__fish_osc_needs_command' -a 'detachbranch' -d 'replace a link with its expanded sources' complete -f -c osc -n '__fish_osc_needs_command' -a 'develproject bsdevelproject dp' -d 'print the devel project / package of a package' complete -f -c osc -n '__fish_osc_needs_command' -a 'diff di ldiff linkdiff' -d 'Generates a diff' complete -f -c osc -n '__fish_osc_needs_command' -a 'distributions dists' -d 'Shows all available distributions' complete -f -c osc -n '__fish_osc_needs_command' -a 'getbinaries' -d 'Download binaries to a local directory' complete -f -c osc -n '__fish_osc_needs_command' -a 'help ? 
h' -d 'give detailed help on a specific sub-command' complete -f -c osc -n '__fish_osc_needs_command' -a 'importsrcpkg' -d 'Import a new package from a src.rpm' complete -f -c osc -n '__fish_osc_needs_command' -a 'info' -d 'Print information about a working copy' complete -f -c osc -n '__fish_osc_needs_command' -a 'init' -d 'Initialize a directory as working copy' complete -f -c osc -n '__fish_osc_needs_command' -a 'jobhistory jobhist' -d 'Shows the job history of a project' complete -f -c osc -n '__fish_osc_needs_command' -a 'linkpac' -d '"Link" a package to another package' complete -f -c osc -n '__fish_osc_needs_command' -a 'linktobranch' -d 'Convert a package containing a classic link with patc...' complete -f -c osc -n '__fish_osc_needs_command' -a 'list LL lL ll ls' -d 'List sources or binaries on the server' complete -f -c osc -n '__fish_osc_needs_command' -a 'localbuildlog lbl' -d 'Shows the build log of a local buildchroot' complete -f -c osc -n '__fish_osc_needs_command' -a 'log' -d 'Shows the commit log of a package' complete -f -c osc -n '__fish_osc_needs_command' -a 'maintainer bugowner' -d 'Show maintainers according to server side configuration' complete -f -c osc -n '__fish_osc_needs_command' -a 'maintenancerequest mr' -d 'Create a request for starting a maintenance incident.' complete -f -c osc -n '__fish_osc_needs_command' -a 'man' -d 'generates a man page' complete -f -c osc -n '__fish_osc_needs_command' -a 'mbranch maintained sm' -d 'Search or banch multiple instances of a package' complete -f -c osc -n '__fish_osc_needs_command' -a 'meta' -d 'Show meta information, or edit it' complete -f -c osc -n '__fish_osc_needs_command' -a 'mkpac' -d 'Create a new package under version control' complete -f -c osc -n '__fish_osc_needs_command' -a 'mv' -d 'Move SOURCE file to DEST and keep it under version co...' complete -f -c osc -n '__fish_osc_needs_command' -a 'my' -d 'show waiting work, packages, projects or requests inv...' complete -f -c osc -n '__fish_osc_needs_command' -a 'patchinfo' -d 'Generate and edit a patchinfo file.' complete -f -c osc -n '__fish_osc_needs_command' -a 'pdiff' -d 'Quick alias to diff the content of a package with its...' complete -f -c osc -n '__fish_osc_needs_command' -a 'prdiff projdiff projectdiff' -d 'Server-side diff of two projects' complete -f -c osc -n '__fish_osc_needs_command' -a 'prjresults pr' -d 'Shows project-wide build results' complete -f -c osc -n '__fish_osc_needs_command' -a 'pull' -d 'merge the changes of the link target into your workin...' complete -f -c osc -n '__fish_osc_needs_command' -a 'rdelete' -d 'Delete a project or packages on the server.' complete -f -c osc -n '__fish_osc_needs_command' -a 'rdiff' -d 'Server-side "pretty" diff of two packages' complete -f -c osc -n '__fish_osc_needs_command' -a 'rebuild rebuildpac' -d 'Trigger package rebuilds' complete -f -c osc -n '__fish_osc_needs_command' -a 'release' -d 'Release sources and binaries' complete -f -c osc -n '__fish_osc_needs_command' -a 'releaserequest' -d 'Create a request for releasing a maintenance update.' 
complete -f -c osc -n '__fish_osc_needs_command' -a 'remotebuildlog rbl rblt rbuildlog rbuildlogtail remotebuildlogtail' -d 'Shows the build log of a package' complete -f -c osc -n '__fish_osc_needs_command' -a 'repairlink' -d 'Repair a broken source link' complete -f -c osc -n '__fish_osc_needs_command' -a 'repairwc' -d 'try to repair an inconsistent working copy' complete -f -c osc -n '__fish_osc_needs_command' -a 'repositories platforms repos' -d 'shows repositories configured for a project. It skips...' complete -f -c osc -n '__fish_osc_needs_command' -a 'repourls' -d 'Shows URLs of .repo files' complete -f -c osc -n '__fish_osc_needs_command' -a 'request review rq' -d 'Show or modify requests and reviews' complete -f -c osc -n '__fish_osc_needs_command' -a 'requestmaintainership reqbs reqbugownership reqmaintainership reqms requestbugownership' -d 'requests to add user as maintainer or bugowner' complete -f -c osc -n '__fish_osc_needs_command' -a 'resolved' -d 'Remove "conflicted" state on working copy files' complete -f -c osc -n '__fish_osc_needs_command' -a 'restartbuild abortbuild' -d 'Restart the build of a certain project or package' complete -f -c osc -n '__fish_osc_needs_command' -a 'results r' -d 'Shows the build results of a package or project' complete -f -c osc -n '__fish_osc_needs_command' -a 'revert' -d 'Restore changed files or the entire working copy.' complete -f -c osc -n '__fish_osc_needs_command' -a 'rremove' -d 'Remove source files from selected package' complete -f -c osc -n '__fish_osc_needs_command' -a 'search bse se' -d 'Search for a project and/or package.' complete -f -c osc -n '__fish_osc_needs_command' -a 'service' -d 'Handle source services' complete -f -c osc -n '__fish_osc_needs_command' -a 'setdevelproject sdp' -d 'Set the devel project / package of a package' complete -f -c osc -n '__fish_osc_needs_command' -a 'setlinkrev' -d 'Updates a revision number in a source link.' complete -f -c osc -n '__fish_osc_needs_command' -a 'signkey' -d 'Manage Project Signing Key' complete -f -c osc -n '__fish_osc_needs_command' -a 'status st' -d 'Show status of files in working copy' complete -f -c osc -n '__fish_osc_needs_command' -a 'submitrequest sr submitpac submitreq' -d 'Create request to submit source into another Project' complete -f -c osc -n '__fish_osc_needs_command' -a 'token' -d 'Show and manage authentication token' complete -f -c osc -n '__fish_osc_needs_command' -a 'triggerreason tr' -d 'Show reason why a package got triggered to build' complete -f -c osc -n '__fish_osc_needs_command' -a 'undelete' -d 'Restores a deleted project or package on the server.' 
complete -f -c osc -n '__fish_osc_needs_command' -a 'unlock' -d 'Unlocks a project or package' complete -f -c osc -n '__fish_osc_needs_command' -a 'update up' -d 'Update a working copy' complete -f -c osc -n '__fish_osc_needs_command' -a 'updatepacmetafromspec metafromspec updatepkgmetafromspec' -d 'Update package meta information from a specfile' complete -f -c osc -n '__fish_osc_needs_command' -a 'vc' -d 'Edit the changes file' complete -f -c osc -n '__fish_osc_needs_command' -a 'whois user who' -d 'Show fullname and email of a buildservice user' complete -f -c osc -n '__fish_osc_needs_command' -a 'wipebinaries' -d 'Delete all binary packages of a certain project/package' osc-1.12.1/contrib/osc.zsh000066400000000000000000000206731475337502500153550ustar00rootroot00000000000000#compdef osc # # Copyright (C) 2009,2010 Holger Macht # Copyright (C) 2023 Björn Bidar # # This file is released under the GPLv2. # # Based on the zsh guide from http://zsh.dotsrc.org/Guide/zshguide06.html # # Toggle verbose completions: zstyle ':completion:*:osc:*' verbose no # zstyle ':completion:*:osc-subcommand:*' verbose no # # version 0.2 # # Main dispatcher _osc() { # Variables shared by all internal functions local osc_projects osc_rc osc_cmd osc_alias _osc_complete_prepare osc_projects="${XDG_CACHE_HOME}/osc.projects" osc_rc="${XDG_CONFIG_HOME}/osc/oscrc" osc_cmd=osc if [[ "${words[0]}" = "isc" ]] ; then osc_alias=internal fi if [ -s "${PWD}/.osc/_apiurl" -a -s "${osc_rc}" ]; then local osc_apiurl read osc_apiurl < "${PWD}/.osc/_apiurl" # We prefer to match an apiurl with an alias so that the project list # cache would match also when -A was passed with said alias. # If there's no alias for that api url match to use the plain apiurl instead. osc_alias=$(sed -rn '\@^\['${apiurl}'@,\@=@{\@^aliases=@{s@[^=]+=([^,]+),.*@\1@p};}' < "${osc_rc}" 2> /dev/null) if [ -z $osc_alias ] ; then osc_alias=${osc_apiurl} fi fi if (( CURRENT > 2 )) && [[ ${words[2]} != "help" ]]; then # Remember the subcommand name local cmd=${words[2]} # Set the context for the subcommand. 
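# ${curcontext%:*:*} drops the last two colon-separated fields of the current
# completion context, and ":osc-subcommand" is appended in the assignment
# below; this lets per-subcommand styles be set, e.g.
#   zstyle ':completion:*:osc-subcommand:*' verbose no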
curcontext="${curcontext%:*:*}:osc-subcommand" # Narrow the range of words we are looking at to exclude `osc' (( CURRENT-- )) shift words # Run the completion for the subcommand if [ $cmd = -A -o $cmd = --apiurl ] ; then if [[ -s "${osc_rc}" ]] ; then local hints=($(sed -rn '/^(aliases=|\[http)/{s/,/ /g;s/(aliases=|\[|\])//gp}' < "${osc_rc}" 2> /dev/null)) if [[ -n "${words[2]}" ]]; then for h in ${hints[@]} ; do case "$h" in http*) local tmp=$(sed -rn '\@^\['${h}'@,\@=@{\@^aliases=@{s@[^=]+=([^,]+),.*@\1@p};}' < "${osc_rc}" 2> /dev/null) if [[ "${words[2]}" = "$h" ]]; then osc_alias=$tmp break fi ;; *) if [[ "${words[2]}" = "$h" ]]; then osc_alias=$h break fi esac done else _arguments '1:ALIAS:( `echo $hints`)' return fi fi fi if [[ -n "$osc_alias" ]] ; then osc_projects="${osc_projects}.${osc_alias//\//_}" osc_command="$osc_command -A ${osc_alias}" fi _osc_update_project_list case $cmd in submitrequest|submitreq|sr) _osc_cmd_submitreq ;; getbinaries) _osc_cmd_getbinaries ;; build) _osc_cmd_build ;; checkout|co|branch|getpac|bco|branchco) _osc_cmd_checkout ;; buildlog|buildinfo|bl|blt|buildlogtail) _osc_cmd_buildlog ;; *) _osc_cmd_do $cmd esac else local hline local -a cmdlist local tag=0 _call_program help-commands osc help | while read -A hline; do # start parsing with "commands:" [[ $hline[1] = "commands:" ]] && tag=1 # stop parsing at the line starting with "For" [[ $hline[1] = "For" ]] && tag=0 [[ $tag = 0 ]] && continue # all commands have to start with lower case letters [[ $hline[1] =~ ^[A-Z] ]] && continue (( ${#hline} < 2 )) && continue # ${hline[1]%,} truncates the last ',' cmdlist=($cmdlist "${hline[1]%,}:${hline[2,-1]}") done _describe -t osc-commands 'osc command' cmdlist fi } _osc_call_me_maybe() { typeset -i ctime=$(command date -d "$(command stat -c '%z' ${1})" +'%s') typeset -i now=$(command date -d now +'%s') if ((now - ctime < 86400)) ; then return 1 fi return 0 } _osc_complete_prepare() { local xdg_dir for xdg_dir in "${XDG_CACHE_HOME:=$HOME/.cache}" "${XDG_CONFIG_HOME:=$HOME/.config}"; do if [[ ! -d "${xdg_dir}" ]]; then mkdir -p "${xdg_dir}" fi done if [[ -f ~/.osc.projects ]]; then rm ~/.osc.projects -f fi } _osc_update_project_list() { if [[ -s "${osc_projects}" ]] ; then if _osc_call_me_maybe "$osc_projects" ; then if tmp=$(mktemp ${osc_projects}.XXXXXX) ; then command ${osc_cmd} ls / >| $tmp mv -uf $tmp ${osc_projects} fi fi else command ${osc_cmd} ls / >| "${osc_projects}" fi } _osc_project_repositories() { if [ ! -s $PWD/.osc/_build_repositories ] || \ _osc_call_me_maybe $PWD/.osc/_build_repositories ; then osc repositories > /dev/null fi # Just check if file exist in case the call to the api failed if [ -s $PWD/.osc/_build_repositories ] ; then cat $PWD/.osc/_build_repositories | while read build_repository ; do # Only output first word of each line echo ${build_repository%\ *} done | sort -u fi } _osc_project_repositories_arches() { if [ ! 
-s $PWD/.osc/_build_repositories ] || \ _osc_call_me_maybe $PWD/.osc/_build_repositories ; then osc repositories > /dev/null fi # Just check if file exist in case the call to the api failed if [ -s $PWD/.osc/_build_repositories ] ; then grep -- $1 $PWD/.osc/_build_repositories | while read build_repository ; do # Only output second word of each line echo ${build_repository#*\ } done | sort -u fi } _osc_cmd_getbinaries() { if [ "$words[2]" = "-" ]; then _osc_complete_help_commands 'options' 'option' return else if [ -n "$words[2]" ] ; then local osc_project_repository_arch=$(_osc_project_repositories_arches \ "${words[2]}") fi _arguments \ '1:PROJECT:( `cat $osc_projects` )' \ '2:PACKAGE:(PACKAGE)' \ '3:REPOSITORY:( `_osc_project_repositories`' \ '4:ARCHITECTURE:(`echo $osc_project_repository_arch`)' fi } _osc_cmd_checkout() { if [ "$words[2]" = "-" ]; then _osc_complete_help_commands 'options' 'option' return else _arguments \ '1:PROJECT:( `cat $osc_projects` )' \ '2:PACKAGE:(PACKAGE)' fi } _osc_cmd_buildlog() { if [ "$words[2]" = "-" ]; then _osc_complete_help_commands 'options' 'option' return else if [ -n "$words[2]" ] ; then local osc_project_repository_arch=$(_osc_project_repositories_arches \ "${words[2]}") fi _arguments \ '1:REPOSITORY:( `_osc_project_repositories` )' \ '2:ARCHITECTURE:(`echo $osc_project_repository_arch`)' fi } _osc_cmd_build() { if [ "$words[2]" = "-" ]; then _osc_complete_help_commands 'options' 'option' return else if [ -n "$words[2]" ] ; then local osc_project_repository_arch=$(_osc_project_repositories_arches \ "${words[2]}") fi _arguments \ '1:REPOSITORY:( `_osc_project_repositories` )' \ '2:ARCHITECTURE:(`echo $osc_project_repository_arch`)' \ '3:Build Description:_files' fi } _osc_cmd_submitreq() { _osc_complete_help_commands 'options' 'option' } _osc_complete_help_commands() { local hline local -a cmdlist local tag=0 _call_program help-commands osc help $cmd | while read -A hline; do # start parsing from "usage:" [[ $hline[1] = "${1}:" ]] && tag=1 [[ $tag = 0 ]] && continue if [[ $hline[1] =~ ^osc ]]; then shift hline; shift hline elif ! [[ $hline[1] =~ ^- ]]; then # Option has to start with a '-' or 'osc submitrequest' continue fi (( ${#hline} < 2 )) && continue cmdlist=($cmdlist "${hline[1]%,}:${hline[2,-1]}") done if [ -n "$cmdlist" ] ; then _describe -t osc-commands "osc $2" cmdlist else return 1 fi } _osc_cmd_do() { # only start completion if there's some '-' on the line if ! [ "$words[2]" = "-" ]; then _complete return fi if ! _osc_complete_help_commands 'options' 'option'; then _complete fi } # Code to make sure _osc is run when we load it _osc "$@" osc-1.12.1/doc/000077500000000000000000000000001475337502500131405ustar00rootroot00000000000000osc-1.12.1/doc/Makefile000066400000000000000000000011721475337502500146010ustar00rootroot00000000000000# Minimal makefile for Sphinx documentation # # You can set these variables from the command line, and also # from the environment for the first two. SPHINXOPTS ?= SPHINXBUILD ?= sphinx-build SOURCEDIR = . BUILDDIR = _build # Put it first so that "make" without argument is like "make help". help: @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) .PHONY: help Makefile # Catch-all target: route all unknown targets to Sphinx using the new # "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS). 
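# For example (a sketch, assuming SPHINXBUILD is left at its default and the
# variables above are unchanged, i.e. SOURCEDIR=. and BUILDDIR=_build), a target
# such as `make html` is picked up by the catch-all rule below and effectively runs:
#   sphinx-build -M html . _build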
%: Makefile @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) osc-1.12.1/doc/README.md000066400000000000000000000000651475337502500144200ustar00rootroot00000000000000This is the place where all osc documentation starts osc-1.12.1/doc/_static/000077500000000000000000000000001475337502500145665ustar00rootroot00000000000000osc-1.12.1/doc/_static/.keepme000066400000000000000000000000001475337502500160230ustar00rootroot00000000000000osc-1.12.1/doc/_static/css/000077500000000000000000000000001475337502500153565ustar00rootroot00000000000000osc-1.12.1/doc/_static/css/custom.css000066400000000000000000000000561475337502500174030ustar00rootroot00000000000000dl.property { display: block !important; } osc-1.12.1/doc/api/000077500000000000000000000000001475337502500137115ustar00rootroot00000000000000osc-1.12.1/doc/api/modules.rst000066400000000000000000000003011475337502500161050ustar00rootroot00000000000000osc === These are the packages in the osc package. .. toctree:: :maxdepth: 4 osc.core osc.util osc.credentials osc.build osc.commandline osc.conf osc.OscConfigParser osc-1.12.1/doc/api/osc.OscConfigParser.rst000066400000000000000000000002711475337502500202550ustar00rootroot00000000000000.. py:module:: osc.OscConfigParser OscConfigParser =============== This is the osc config parser. basic structures ---------------- .. automodule:: osc.OscConfigParser :members: osc-1.12.1/doc/api/osc.build.rst000066400000000000000000000002541475337502500163260ustar00rootroot00000000000000.. py:module:: osc.build build ===== This is the osc build module to talk to the build script. basic structures ---------------- .. automodule:: osc.build :members: osc-1.12.1/doc/api/osc.commandline.rst000066400000000000000000000010551475337502500175150ustar00rootroot00000000000000commandline =========== The ``osc.commandline`` module provides functionality for creating osc command-line plugins. .. autoclass:: osc.commandline.OscCommand :inherited-members: :members: .. autoclass:: osc.commandline.OscMainCommand :members: main .. automodule:: osc.commandline :members: ensure_no_remaining_args, pop_project_package_from_args, pop_project_package_targetproject_targetpackage_from_args, pop_project_package_repository_arch_from_args, pop_repository_arch_from_args osc-1.12.1/doc/api/osc.conf.rst000066400000000000000000000002721475337502500161540ustar00rootroot00000000000000.. py:module:: osc.conf osc.conf ======== .. automodule:: osc.conf :members: :exclude-members: maintained_attribute, maintenance_attribute, maintained_update_project_attribute osc-1.12.1/doc/api/osc.core.rst000066400000000000000000000004501475337502500161550ustar00rootroot00000000000000.. py:module:: osc.core core ==== This is the osc core module. basic structures ---------------- .. automodule:: osc.core :members: .. autoclass:: File :members: .. autoclass:: Serviceinfo :members: .. autoclass:: Linkinfo :members: .. autoclass:: Project :members: osc-1.12.1/doc/api/osc.credentials.rst000066400000000000000000000002561475337502500175260ustar00rootroot00000000000000.. py:module:: osc.credentials credentials =========== This is the osc credentials module. basic structures ---------------- .. automodule:: osc.credentials :members: osc-1.12.1/doc/api/osc.util.rst000066400000000000000000000024651475337502500162120ustar00rootroot00000000000000osc.util package ================ Submodules ---------- osc.util.ar module ------------------ .. 
automodule:: osc.util.ar :members: :undoc-members: :show-inheritance: osc.util.archquery module ------------------------- .. automodule:: osc.util.archquery :members: :undoc-members: :show-inheritance: osc.util.cpio module -------------------- .. automodule:: osc.util.cpio :members: :undoc-members: :show-inheritance: osc.util.debquery module ------------------------ .. automodule:: osc.util.debquery :members: :undoc-members: :show-inheritance: osc.util.packagequery module ---------------------------- .. automodule:: osc.util.packagequery :members: :undoc-members: :show-inheritance: osc.util.repodata module ------------------------ .. automodule:: osc.util.repodata :members: :undoc-members: :show-inheritance: osc.util.rpmquery module ------------------------ .. automodule:: osc.util.rpmquery :members: :undoc-members: :show-inheritance: osc.util.safewriter module -------------------------- .. automodule:: osc.util.safewriter :members: :undoc-members: :show-inheritance: osc.util.helper module -------------------------- .. automodule:: osc.util.helper :members: :undoc-members: :show-inheritance: osc-1.12.1/doc/api/tutorial.rst000066400000000000000000000046361475337502500163150ustar00rootroot00000000000000Tutorial ======== This is a tutorial on how to use the osc python api. The key elements of the remote |obs| are: #. A **project** #. A project has multiple associated **repositories** (Linux distributions) #. Multiple **packages** in a project hold the builds against the different **repositories** A user deals with a local checkout of a project in a **working copy**: this is similar to the subversion checkout model. Initial config setup -------------------- The osc library requires an initial setup: >>> import osc.conf >>> osc.conf.get_config() This will read all the external config files (e.g. ~/.oscrc) and the internal configuration values. Acquiring the apiurl -------------------- All the osc operations use an **apiurl** to look up things like passwords, usernames and other parameters while performing operations: >>> apiurl = osc.conf.config['apiurl'] Operations on a remote build server ----------------------------------- osc is similar to subversion: it has a remote server and a local (checkout) **working** directory. First we'll go through the remote operations on a server, which do **NOT** require a checkout. Operations are contained in the osc.core module: >>> import osc.core List all the projects and packages ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ This will show all the projects on the remote |obs|: >>> for prj in osc.core.meta_get_project_list(apiurl, deleted=False): print(prj) A project has **repositories** associated with it (e.g. Linux distributions): >>> prj = 'home:cavallo71:opt-python-interpreters' >>> for repo in osc.core.get_repos_of_project(apiurl, prj): print(repo) A project contains packages; to list them all: >>> prj = 'home:cavallo71:opt-python-interpreters' >>> for pkg in osc.core.meta_get_packagelist(apiurl, prj): print(pkg) Add a package to an existing project ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Operations in a checked out **working copy** -------------------------------------------- Create your first project: the hello project -------------------------------------------- .. todo:: add the description on how to init a project Adding your first package to the project hello: the world package ----------------------------------------------------------------- ..
todo:: add he description on how to add a package Setting the build architectures ------------------------------- osc-1.12.1/doc/conf.py000066400000000000000000000071141475337502500144420ustar00rootroot00000000000000# Configuration file for the Sphinx documentation builder. # # This file only contains a selection of the most common options. For a full # list see the documentation: # https://www.sphinx-doc.org/en/master/usage/configuration.html # -- Path setup -------------------------------------------------------------- # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. # import os import sys import textwrap TOPDIR = os.path.dirname(os.path.abspath(__file__)) sys.path.insert(0, os.path.join(TOPDIR, "..")) import osc.conf # -- Project information ----------------------------------------------------- project = 'osc' copyright = 'Contributors to the osc project' author = 'see the AUTHORS list' # -- General configuration --------------------------------------------------- # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. extensions = [ 'sphinx.ext.todo', 'sphinx.ext.autodoc', 'sphinx.ext.doctest', 'sphinx.ext.ifconfig', ] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. # This pattern also affects html_static_path and html_extra_path. exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store'] # A string of reStructuredText that will be included at the end of every # source file that is read. This is a possible place to add substitutions # that should be available in every file. rst_epilog = """ .. |obs| replace:: open build service """ master_doc = 'index' # order members by __all__ or their order in the source code autodoc_default_options = { 'member-order': 'bysource', } autodoc_typehints = "both" # -- Generate documents ------------------------------------------------- osc.conf._model_to_rst( cls=osc.conf.Options, title="Configuration file", description=textwrap.dedent( """ The configuration file path is ``$XDG_CONFIG_HOME/osc/oscrc``, which usually translates into ``~/.config/osc/oscrc``. The configuration options are loaded with the following priority: 1. environment variables: ``OSC_`` or ``OSC__`` 2. command-line options 3. oscrc config file """ ), sections={ "Host options": osc.conf.HostOptions, }, output_file=os.path.join(TOPDIR, "oscrc.rst"), ) # -- Options for HTML output ------------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. # # html_theme = 'alabaster' html_theme = 'sphinx_rtd_theme' # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ['_static'] html_css_files = [ # fixes https://github.com/readthedocs/sphinx_rtd_theme/issues/1301 'css/custom.css', ] # -- Options for MAN output ------------------------------------------------- # (source start file, name, description, authors, manual section). 
man_pages = [ ("oscrc", "oscrc", "openSUSE Commander configuration file", "openSUSE project ", 5), ] osc-1.12.1/doc/index.rst000066400000000000000000000010631475337502500150010ustar00rootroot00000000000000.. osc documentation master file, created by sphinx-quickstart on Sun Jan 24 13:06:29 2016. You can adapt this file completely to your liking, but it should at least contain the root `toctree` directive. Welcome to osc's documentation! =============================== This is the documentation for the osc python client to the |obs|. Tutorial .. TODO:: add more documentation API: .. toctree:: :maxdepth: 2 api/modules plugins/index oscrc Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` osc-1.12.1/doc/plugins/000077500000000000000000000000001475337502500146215ustar00rootroot00000000000000osc-1.12.1/doc/plugins/index.rst000066400000000000000000000023011475337502500164560ustar00rootroot00000000000000Extending osc with plugins ========================== .. note:: New in osc 1.1.0 .. warning:: Plugins are currently NOT supported in virtualenv. This is a simple tutorial. More details can be found in the :py:class:`osc.commandline.OscCommand` reference. Steps ----- 1. First, we choose a location where to put the plugin .. include:: plugin_locations.rst 2. Then we pick a file name - The file should contain a single command and its name should correspond with the command name. - The file name should be prefixed with parent command(s) (only if applicable). - Example: Adding ``list`` subcommand to ``osc request`` -> ``request_list.py``. 3. And then we write a class that inherits from :py:class:`osc.commandline.OscCommand` and implements our command. - The class name should also correspond with the command name incl. the parent prefix. - Examples follow... A simple command ---------------- ``simple.py`` .. literalinclude:: simple.py Command with subcommands ------------------------ ``request.py`` .. literalinclude:: request.py ``request_list.py`` .. literalinclude:: request_list.py ``request_accept.py`` .. literalinclude:: request_accept.py osc-1.12.1/doc/plugins/plugin_locations.rst000066400000000000000000000002501475337502500207210ustar00rootroot00000000000000 - The directory from where the ``osc.commands`` module gets loaded. 
- /usr/lib/osc-plugins - /usr/local/lib/osc-plugins - ~/.local/lib/osc-plugins - ~/.osc-pluginsosc-1.12.1/doc/plugins/request.py000066400000000000000000000005701475337502500166650ustar00rootroot00000000000000import osc.commandline class RequestCommand(osc.commandline.OscCommand): """ Manage requests """ name = "request" aliases = ["rq"] # arguments specified here will get inherited to all subcommands automatically def init_arguments(self): self.add_argument( "-m", "--message", metavar="TEXT", ) osc-1.12.1/doc/plugins/request_accept.py000066400000000000000000000005351475337502500202050ustar00rootroot00000000000000import osc.commandline class RequestAcceptCommand(osc.commandline.OscCommand): """ Accept request """ name = "accept" parent = "RequestCommand" def init_arguments(self): self.add_argument( "id", type=int, ) def run(self, args): print(f"Accepting request '{args.id}'") osc-1.12.1/doc/plugins/request_list.py000066400000000000000000000003361475337502500177200ustar00rootroot00000000000000import osc.commandline class RequestListCommand(osc.commandline.OscCommand): """ List requests """ name = "list" parent = "RequestCommand" def run(self, args): print("Listing requests") osc-1.12.1/doc/plugins/simple.py000066400000000000000000000013031475337502500164610ustar00rootroot00000000000000import osc.commandline class SimpleCommand(osc.commandline.OscCommand): """ A command that does nothing More description of what the command does. """ # command name name = "simple" # options and positional arguments def init_arguments(self): self.add_argument( "--bool-option", action="store_true", help="...", ) self.add_argument( "arguments", metavar="arg", nargs="+", help="...", ) # code of the command def run(self, args): print(f"Bool option is {args.bool_option}") print(f"Positional arguments are {args.arguments}") osc-1.12.1/doc/requirements.txt000066400000000000000000000000461475337502500164240ustar00rootroot00000000000000cryptography sphinx-rtd-theme urllib3 osc-1.12.1/git-obs.py000077500000000000000000000003211475337502500143100ustar00rootroot00000000000000#!/usr/bin/env python3 """ This wrapper allows git-obs to be called from the source directory during development. """ import osc.commandline_git if __name__ == "__main__": osc.commandline_git.main() osc-1.12.1/osc-wrapper.py000077500000000000000000000002411475337502500152070ustar00rootroot00000000000000#!/usr/bin/env python3 """ This wrapper allows osc to be called from the source directory during development. """ import osc.babysitter osc.babysitter.main() osc-1.12.1/osc/000077500000000000000000000000001475337502500131575ustar00rootroot00000000000000osc-1.12.1/osc/.gitignore000066400000000000000000000000141475337502500151420ustar00rootroot00000000000000*.pyc *.swp osc-1.12.1/osc/OscConfigParser.py000066400000000000000000000322311475337502500165610ustar00rootroot00000000000000# Copyright 2008,2009 Marcus Huewe # # This program is free software; you can redistribute it and/or # modify it under the terms of the GNU General Public License version 2 # as published by the Free Software Foundation; # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. 
# # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA import configparser # inspired from http://code.google.com/p/iniparse/ - although their implementation is # quite different class ConfigLineOrder: """ A ConfigLineOrder() instance task is to preserve the order of a config file. It keeps track of all lines (including comments) in the _lines list. This list either contains SectionLine() instances or CommentLine() instances. """ def __init__(self): self._lines = [] def _append(self, line_obj): self._lines.append(line_obj) def _find_section(self, section): for line in self._lines: if line.type == 'section' and line.name == section: return line return None def add_section(self, sectname): self._append(SectionLine(sectname)) def get_section(self, sectname): section = self._find_section(sectname) if section: return section section = SectionLine(sectname) self._append(section) return section def add_other(self, sectname, line): if sectname: self.get_section(sectname).add_other(line) else: self._append(CommentLine(line)) def keys(self): return [i.name for i in self._lines if i.type == 'section'] def __setitem__(self, key, value): section = SectionLine(key) self._append(section) def __getitem__(self, key): section = self._find_section(key) if not section: raise KeyError() return section def __delitem__(self, key): line = self._find_section(key) if not line: raise KeyError(key) self._lines.remove(line) def __iter__(self): # return self._lines.__iter__() for line in self._lines: if line.type == 'section': yield line.name class Line: """Base class for all line objects""" def __init__(self, name, type): self.name = name self.type = type class SectionLine(Line): """ This class represents a [section]. It stores all lines which belongs to this certain section in the _lines list. The _lines list either contains CommentLine() or OptionLine() instances. """ def __init__(self, sectname): super().__init__(sectname, 'section') self._lines = [] def _find(self, name): for line in self._lines: if line.name == name: return line return None def _add_option(self, optname, value=None, line=None, sep='='): if value is None and line is None: raise configparser.Error('Either value or line must be passed in') elif value and line: raise configparser.Error('value and line are mutually exclusive') if value is not None: line = f'{optname}{sep}{value}' opt = self._find(optname) if opt: opt.format(line) else: self._lines.append(OptionLine(optname, line)) def add_other(self, line): self._lines.append(CommentLine(line)) def copy(self): return dict(self.items()) def items(self): return [(i.name, i.value) for i in self._lines if i.type == 'option'] def keys(self): return [i.name for i in self._lines] def __setitem__(self, key, val): self._add_option(key, val) def __getitem__(self, key): line = self._find(key) if not line: raise KeyError(key) return str(line) def __delitem__(self, key): line = self._find(key) if not line: raise KeyError(key) self._lines.remove(line) def __str__(self): return self.name # XXX: needed to support 'x' in cp._sections['sectname'] def __iter__(self): for line in self._lines: yield line.name class CommentLine(Line): """Store a commentline""" def __init__(self, line): super().__init__(line.strip('\n'), 'comment') def __str__(self): return self.name class OptionLine(Line): """ This class represents an option. 
The class' ``name`` attribute is used to store the option's name and the "value" attribute contains the option's value. The ``frmt`` attribute preserves the format which was used in the configuration file. Example:: optionx:value => self.frmt = '%s:%s' optiony=value;some_comment => self.frmt = '%s=%s;some_comment """ def __init__(self, optname, line): super().__init__(optname, 'option') self.name = optname self.format(line) def format(self, line): mo = configparser.ConfigParser.OPTCRE.match(line.strip()) key, val = mo.group('option', 'value') self.frmt = line.replace(key.strip(), '%s', 1) pos = val.find(' ;') if pos >= 0: val = val[:pos] self.value = val self.frmt = self.frmt.replace(val.strip(), '%s', 1).rstrip('\n') def __str__(self): return self.value class OscConfigParser(configparser.ConfigParser): """ OscConfigParser() behaves like a normal ConfigParser() object. The only differences is that it preserves the order+format of configuration entries and that it stores comments. In order to keep the order and the format it makes use of the ConfigLineOrder() class. """ def __init__(self, defaults=None): super().__init__(defaults or {}, interpolation=None) self._sections = ConfigLineOrder() # XXX: unfortunately we have to override the _read() method from the ConfigParser() # class because a) we need to store comments b) the original version doesn't use # the its set methods to add and set sections, options etc. instead they use a # dictionary (this makes it hard for subclasses to use their own objects, IMHO # a bug) and c) in case of an option we need the complete line to store the format. # This all sounds complicated but it isn't - we only needed some slight changes def _read(self, fp, fpname): """Parse a sectioned setup file. The sections in setup file contains a title line at the top, indicated by a name in square brackets (`[]'), plus key/value options lines, indicated by `name: value' format lines. Continuations are represented by an embedded newline then leading whitespace. Blank lines, lines beginning with a '#', and just about everything else are ignored. """ cursect = None # None, or a dictionary optname = None lineno = 0 e = None # None, or an exception while True: line = fp.readline() if not line: break lineno = lineno + 1 # comment or blank line? if line.strip() == '' or line[0] in '#;': self._sections.add_other(cursect, line) continue if line.split(None, 1)[0].lower() == 'rem' and line[0] in "rR": # no leading whitespace continue # continuation line? if line[0].isspace() and cursect is not None and optname: value = line.strip() if value: #cursect[optname] = "%s\n%s" % (cursect[optname], value) #self.set(cursect, optname, "%s\n%s" % (self.get(cursect, optname), value)) if cursect == configparser.DEFAULTSECT: self._defaults[optname] = f"{self._defaults[optname]}\n{value}" else: # use the raw value here (original version uses raw=False) self._sections[cursect]._find(optname).value = f'{self.get(cursect, optname, raw=True)}\n{value}' # a section header or option header? else: # is it a section header? mo = self.SECTCRE.match(line) if mo: sectname = mo.group('header') if self._strict and sectname in self._sections: raise configparser.DuplicateSectionError(sectname, fpname, lineno) elif sectname in self._sections: cursect = self._sections[sectname] elif sectname == configparser.DEFAULTSECT: cursect = self._defaults else: self.add_section(sectname) # So sections can't start with a continuation line cursect = sectname optname = None # no section header in the file? 
elif cursect is None: raise configparser.MissingSectionHeaderError(fpname, lineno, line) # an option line? else: mo = self.OPTCRE.match(line) if mo: optname, vi, optval = mo.group('option', 'vi', 'value') if vi in ('=', ':') and ';' in optval: # ';' is a comment delimiter only if it follows # a spacing character pos = optval.find(';') if pos != -1 and optval[pos - 1].isspace(): optval = optval[:pos] optval = optval.strip() # allow empty values if optval == '""': optval = '' optname = self.optionxform(optname.rstrip()) if self._strict and optname in self._sections[cursect]: raise configparser.DuplicateOptionError(sectname, optname, fpname, lineno) elif cursect == configparser.DEFAULTSECT: self._defaults[optname] = optval else: self._sections[cursect]._add_option(optname, line=line) else: # a non-fatal parsing error occurred. set up the # exception but keep going. the exception will be # raised at the end of the file and will contain a # list of all bogus lines if not e: e = configparser.ParsingError(fpname) e.append(lineno, repr(line)) # if any parsing errors occurred, raise an exception if e: raise e # pylint: disable-msg=E0702 def write(self, fp, comments=False): """ write the configuration file. If comments is True all comments etc. will be written to fp otherwise the ConfigParsers' default write method will be called. """ if comments: fp.write(str(self)) fp.write('\n') else: super().write(fp) def has_option(self, section, option, proper=False, **kwargs): """ Returns True, if the passed section contains the specified option. If proper is True, True is only returned if the option is owned by this section and not "inherited" from the default. """ if proper: return self.optionxform(option) in self._sections[section].keys() return super().has_option(section, option, **kwargs) # XXX: simplify! def __str__(self): ret = [] first = True for line in self._sections._lines: if line.type == 'section': if first: first = False else: ret.append('') ret.append(f'[{line.name}]') for sline in line._lines: if sline.type == 'option': # special handling for continuation lines val = '\n '.join(sline.value.split('\n')) ret.append(sline.frmt % (sline.name, val)) elif str(sline) != '': ret.append(str(sline)) else: ret.append(str(line)) return '\n'.join(ret) def _validate_value_types(self, section="", option="", value=""): if not isinstance(section, str): raise TypeError("section names must be strings") if not isinstance(option, str): raise TypeError("option keys must be strings") # vim: sw=4 et osc-1.12.1/osc/__init__.py000066400000000000000000000004041475337502500152660ustar00rootroot00000000000000__all__ = [ 'babysitter', 'build', 'connection', 'commandline', 'core', 'fetch', 'grabber', 'meter', 'oscerr', 'oscssl', ] from .util import git_version __version__ = git_version.get_version('1.12.1') # vim: sw=4 et osc-1.12.1/osc/_private/000077500000000000000000000000001475337502500147705ustar00rootroot00000000000000osc-1.12.1/osc/_private/__init__.py000066400000000000000000000012641475337502500171040ustar00rootroot00000000000000# This is a private implementation of osc.core that will replace it in the future. # The existing osc.core needs to stay for a while to emit deprecation warnings. # # The cherry-picked imports will be the supported API. 
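# A minimal usage sketch (illustrative only; it assumes these names keep being
# re-exported from this package):
#
#   from osc._private import print_msg
#   print_msg("refreshing cached metadata", print_to="debug")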
from .api_build import BuildHistory from .api_configuration import get_configuration_value from .api_source import add_channels from .api_source import add_containers from .api_source import enable_channels from .api_source import get_linked_packages from .api_source import release from .common import print_msg from .common import format_msg_project_package_options from .package import ApiPackage from .package import LocalPackage from .request import forward_request osc-1.12.1/osc/_private/api.py000066400000000000000000000140311475337502500161120ustar00rootroot00000000000000""" Functions that communicate with OBS API and work with related XML data. """ import xml.sax.saxutils from xml.etree import ElementTree as ET from ..util.xml import xml_escape from ..util.xml import xml_indent from ..util.xml import xml_unescape from ..util.xml import xml_parse def get(apiurl, path, query=None): """ Send a GET request to OBS. :param apiurl: OBS apiurl. :type apiurl: str :param path: URL path segments. :type path: list(str) :param query: URL query values. :type query: dict(str, str) :returns: Parsed XML root. :rtype: xml.etree.ElementTree.Element """ from .. import connection as osc_connection from .. import core as osc_core assert apiurl assert path if not isinstance(path, (list, tuple)): raise TypeError("Argument `path` expects a list of strings") url = osc_core.makeurl(apiurl, path, query) with osc_connection.http_GET(url) as f: root = xml_parse(f).getroot() return root def post(apiurl, path, query=None): """ Send a POST request to OBS. :param apiurl: OBS apiurl. :type apiurl: str :param path: URL path segments. :type path: list(str) :param query: URL query values. :type query: dict(str, str) :returns: Parsed XML root. :rtype: xml.etree.ElementTree.Element """ from .. import connection as osc_connection from .. import core as osc_core assert apiurl assert path if not isinstance(path, (list, tuple)): raise TypeError("Argument `path` expects a list of strings") url = osc_core.makeurl(apiurl, path, query) with osc_connection.http_POST(url) as f: root = xml_parse(f).getroot() return root def put(apiurl, path, query=None, data=None): """ Send a PUT request to OBS. :param apiurl: OBS apiurl. :type apiurl: str :param path: URL path segments. :type path: list(str) :param query: URL query values. :type query: dict(str, str) :returns: Parsed XML root. :rtype: xml.etree.ElementTree.Element """ from osc import connection as osc_connection from osc import core as osc_core assert apiurl assert path if not isinstance(path, (list, tuple)): raise TypeError("Argument `path` expects a list of strings") url = osc_core.makeurl(apiurl, path, query) with osc_connection.http_PUT(url, data=data) as f: root = xml_parse(f).getroot() return root def _to_xpath(*args): """ Convert strings and dictionaries to xpath: string gets translated to a node name dictionary gets translated to [@key='value'] predicate All values are properly escaped. 
Examples: args: ["directory", "entry", {"name": "osc"}] result: "directory/entry[@name='osc']" args: ["attributes", "attribute", {"namespace": "OBS", "name": "BranchSkipRepositories"}, "value"] result: "attributes/attribute[@namespace='OBS'][@name='BranchSkipRepositories']/value" """ xpath = "" for arg in args: if isinstance(arg, str): arg = xml.sax.saxutils.escape(arg) xpath += f"/{arg}" elif isinstance(arg, dict): for key, value in arg.items(): key = xml.sax.saxutils.escape(key) value = xml.sax.saxutils.escape(value) xpath += f"[@{key}='{value}']" else: raise TypeError(f"Argument '{arg}' has invalid type '{type(arg).__name__}'. Expected types: str, dict") # strip the leading slash because we're making a relative search xpath = xpath.lstrip("/") return xpath def find_nodes(root, root_name, *args): """ Find nodes with given `node_name`. Also, verify that the root tag matches the `root_name`. :param root: Root node. :type root: xml.etree.ElementTree.Element :param root_name: Expected (tag) name of the root node. :type root_name: str :param *args: Simplified xpath notation: strings are node names, dictionaries translate to [@key='value'] predicates. :type *args: list[str, dict] :returns: List of nodes that match xpath based on the given `args`. :rtype: list(xml.etree.ElementTree.Element) """ assert root.tag == root_name return root.findall(_to_xpath(*args)) def find_node(root, root_name, *args): """ Find a single node with given `node_name`. If `node_name` is not specified, the root node is returned. Also, verify that the root tag matches the `root_name`. :param root: Root node. :type root: xml.etree.ElementTree.Element :param root_name: Expected (tag) name of the root node. :type root_name: str :param *args: Simplified xpath notation: strings are node names, dictionaries translate to [@key='value'] predicates. :type *args: list[str, dict] :returns: The node that matches xpath based on the given `args` or the root node if `args` are not specified. :rtype: xml.etree.ElementTree.Element """ assert root.tag == root_name if not args: # only verify the root tag return root return root.find(_to_xpath(*args)) def group_child_nodes(node): nodes = node[:] result = [] while nodes: # look at the tag of the first node tag = nodes[0].tag # collect all nodes with the same tag and append them to the result # then repeat the step for the next tag(s) matches = [] others = [] for i in nodes: if i.tag == tag: matches.append(i) else: others.append(i) result += matches nodes = others node[:] = result def write_xml_node_to_file(node, path, indent=True): """ Write a XML node to a file. :param node: Node to write. :type node: xml.etree.ElementTree.Element :param path: Path to a file that will be written to. :type path: str :param indent: Whether to indent (pretty-print) the written XML. :type indent: bool """ if indent: xml_indent(node) ET.ElementTree(node).write(path) osc-1.12.1/osc/_private/api_build.py000066400000000000000000000044421475337502500172760ustar00rootroot00000000000000import csv import io import time from . 
import api class BuildHistory: def __init__( self, apiurl: str, project: str, package: str, repository: str, arch: str, limit: int = 0, ): self.apiurl = apiurl self.project = project self.package = package self.repository = repository self.arch = arch self._limit = int(limit) self.entries = self._get_entries() def _get_entries(self): url_path = [ "build", self.project, self.repository, self.arch, self.package, "_history", ] url_query = {} if self._limit and self._limit > 0: url_query["limit"] = self._limit root = api.get(self.apiurl, url_path, url_query) result = [] nodes = api.find_nodes(root, "buildhistory", "entry") for node in nodes: item = { "rev": node.get("rev"), "srcmd5": node.get("srcmd5"), "ver_rel": node.get("versrel"), "build_count": int(node.get("bcnt")), "time": time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(int(node.get("time")))), } # duration may not always be available duration = node.get("duration") if duration: item["duration"] = int(duration) result.append(item) return result def to_csv(self): out = io.StringIO() header = ["time", "srcmd5", "rev", "ver_rel", "build_count", "duration"] writer = csv.DictWriter(out, fieldnames=header, quoting=csv.QUOTE_ALL) writer.writeheader() for i in self.entries: writer.writerow(i) return out.getvalue() def to_text_table(self): from ..core import build_table header = ("TIME", "SRCMD5", "VER-REL.BUILD#", "REV", "DURATION") data = [] for i in self.entries: item = ( i["time"], i["srcmd5"], f"{i['ver_rel']}.{i['build_count']}", i["rev"], i.get("duration", ""), ) data.extend(item) return "\n".join(build_table(len(header), data, header)) osc-1.12.1/osc/_private/api_configuration.py000066400000000000000000000005751475337502500210510ustar00rootroot00000000000000from . import api from .. import oscerr def get_configuration_value(apiurl, option): url_path = ["configuration"] url_query = {} root = api.get(apiurl, url_path, url_query) node = api.find_node(root, "configuration", option) if node is None or not node.text: raise oscerr.APIError(f"Couldn't get configuration option '{option}'") return node.text osc-1.12.1/osc/_private/api_source.py000066400000000000000000000066071475337502500175040ustar00rootroot00000000000000from . import api from .common import format_msg_project_package_options from .. 
import oscerr from ..output import print_msg def add_channels(apiurl, project, package=None, enable_all=False, skip_disabled=False, print_to="debug"): if all((enable_all, skip_disabled)): raise oscerr.OscValueError("Options 'enable_all' and 'skip_disabled' are mutually exclusive") msg = format_msg_project_package_options( "Adding channels to", project, package, enable_all=enable_all, skip_disabled=skip_disabled, ) print_msg(msg, print_to=print_to) url_path = ["source", project] if package: url_path += [package] url_query = {"cmd": "addchannels"} if enable_all: url_query["mode"] = "enable_all" if skip_disabled: url_query["mode"] = "skip_disabled" return api.post(apiurl, url_path, url_query) def add_containers(apiurl, project, package, extend_package_names=False, print_to="debug"): msg = format_msg_project_package_options( "Adding containers to", project, package, extend_package_names=extend_package_names, ) print_msg(msg, print_to=print_to) url_path = ["source", project, package] url_query = {"cmd": "addcontainers"} if extend_package_names: url_query["extend_package_names"] = "1" return api.post(apiurl, url_path, url_query) def enable_channels(apiurl, project, package=None, print_to="debug"): msg = format_msg_project_package_options( "Enabling channels in", project, package, ) print_msg(msg, print_to=print_to) url_path = ["source", project] if package: url_path += [package] if package: url_query = {"cmd": "enablechannel"} else: url_query = {"cmd": "modifychannels", "mode": "enable_all"} return api.post(apiurl, url_path, url_query) def get_linked_packages(apiurl, project, package): url_path = ["source", project, package] url_query = {"cmd": "showlinked"} root = api.post(apiurl, url_path, url_query) result = [] nodes = api.find_nodes(root, "collection", "package") for node in nodes: item = { "project": node.get("project"), "name": node.get("name"), } result.append(item) return result def release( apiurl, project, package, repository, architecture, target_project, target_repository, set_release_to=None, delayed=False, print_to="debug", ): msg = format_msg_project_package_options( "Releasing", project, package, target_project, target_package=None, repository=repository, architecture=architecture, dest_repository=target_repository, delayed=delayed, ) print_msg(msg, print_to=print_to) url_path = ["source", project] if package: url_path += [package] url_query = {"cmd": "release"} if repository: url_query["repository"] = repository if architecture: url_query["arch"] = architecture if target_project: url_query["target_project"] = target_project if target_repository: url_query["target_repository"] = target_repository if set_release_to: url_query["setrelease"] = set_release_to if not delayed: url_query["nodelay"] = "1" return api.post(apiurl, url_path, url_query) osc-1.12.1/osc/_private/common.py000066400000000000000000000024061475337502500166340ustar00rootroot00000000000000from ..output.output import print_msg def format_msg_project_package_options( msg, project=None, package=None, dest_project=None, dest_package=None, repository=None, architecture=None, dest_repository=None, **options, ): """ Format msg, project, package, dest_project, dest_package and options into a meaningful message that can be printed out directly or as a debug message. 
""" if project and not package: msg += f" project '{project}'" else: msg += f" package '{project}/{package}'" if repository: msg += f" repository '{repository}'" if any([dest_project, dest_package, dest_repository]): msg += " to" if dest_project and not dest_package: msg += f" project '{dest_project}'" elif dest_project and dest_package: msg += f" package '{dest_project}/{dest_package}'" if dest_repository: msg += f" repository '{dest_repository}'" if architecture: msg += f" architecture '{architecture}'" msg_options = [key.replace("_", "-") for key, value in options.items() if value] if msg_options: msg_options.sort() msg_options_str = ", ".join(msg_options) msg += f" options: {msg_options_str}" return msg osc-1.12.1/osc/_private/package.py000066400000000000000000000071511475337502500167410ustar00rootroot00000000000000import functools from .. import oscerr from . import api @functools.total_ordering class PackageBase: def __init__(self, apiurl, project, package): self.apiurl = apiurl self.project = project self.name = package self.rev = None self.vrev = None self.srcmd5 = None self.linkinfo = None self.files = [] directory_node = self._get_directory_node() self._load_from_directory_node(directory_node) self._meta_node = None def __str__(self): return f"{self.project}/{self.name}" def __repr__(self): return super().__repr__() + f"({self})" def __hash__(self): return hash((self.name, self.project, self.apiurl)) def __eq__(self, other): return (self.name, self.project, self.apiurl) == (other.name, other.project, other.apiurl) def __lt__(self, other): return (self.name, self.project, self.apiurl) < (other.name, other.project, other.apiurl) def _get_directory_node(self): raise NotImplementedError def _load_from_directory_node(self, directory_node): from .. 
import core as osc_core # attributes self.rev = directory_node.get("rev") self.vrev = directory_node.get("vrev") self.srcmd5 = directory_node.get("srcmd5") # files file_nodes = api.find_nodes(directory_node, "directory", "entry") for file_node in file_nodes: self.files.append(osc_core.File.from_xml_node(file_node)) # linkinfo linkinfo_node = api.find_node(directory_node, "directory", "linkinfo") if linkinfo_node is not None: self.linkinfo = osc_core.Linkinfo() self.linkinfo.read(linkinfo_node) if self.linkinfo.project and not self.linkinfo.package: # if the link points to a package with the same name, # the name is omitted and we want it present for overall sanity self.linkinfo.package = self.name def _get_meta_node(self): raise NotImplementedError() def get_meta_value(self, option): if self._meta_node is None or len(self._meta_node) == 0: self._meta_node = self._get_meta_node() if self._meta_node is None or len(self._meta_node) == 0: return None node = api.find_node(self._meta_node, "package", option) if node is None or not node.text: raise oscerr.APIError(f"Couldn't get '{option}' from package _meta") return node.text class ApiPackage(PackageBase): def __init__(self, apiurl, project, package, rev=None): # for loading the directory node from the API # the actual revision is loaded from the directory node self.__rev = rev super().__init__(apiurl, project, package) def _get_directory_node(self): url_path = ["source", self.project, self.name] url_query = {} if self.__rev: url_query["rev"] = self.__rev return api.get(self.apiurl, url_path, url_query) def _get_meta_node(self): url_path = ["source", self.project, self.name, "_meta"] url_query = {} return api.get(self.apiurl, url_path, url_query) class LocalPackage(PackageBase): def __init__(self, path): from .. import store as osc_store self.dir = path self.store = osc_store.Store(self.dir) super().__init__(self.store.apiurl, self.store.project, self.store.package) def _get_directory_node(self): return self.store.read_xml_node("_files", "directory").getroot() def _get_meta_node(self): return self.store._meta_node osc-1.12.1/osc/_private/request.py000066400000000000000000000022671475337502500170410ustar00rootroot00000000000000from . import package as osc_package def forward_request(apiurl, request, interactive=True): """ Forward the specified `request` to the projects the packages were branched from. """ from .. import core as osc_core for action in request.get_actions("submit"): package = osc_package.ApiPackage(apiurl, action.tgt_project, action.tgt_package) if not package.linkinfo: # not a linked/branched package, can't forward to parent continue project = package.linkinfo.project package = package.linkinfo.package if interactive: reply = input(f"\nForward request to {project}/{package}? ([y]/n) ") if reply.lower() not in ("y", ""): continue msg = f"Forwarded request #{request.reqid} from {request.creator}\n\n{request.description}" new_request_id = osc_core.create_submit_request( apiurl, action.tgt_project, action.tgt_package, project, package, msg, ) msg = f"Forwarded request #{request.reqid} from {request.creator} to {project}/{package}: #{new_request_id}" print(msg) osc-1.12.1/osc/babysitter.py000066400000000000000000000205111475337502500157000ustar00rootroot00000000000000# Copyright (C) 2008 Novell Inc. All rights reserved. # This program is free software; it may be used, copied, modified # and distributed under the terms of the GNU General Public Licence, # either version 2, or (at your option) any later version. 
import errno import os import pdb import signal import ssl import sys import traceback from http.client import HTTPException, BadStatusLine from urllib.error import URLError, HTTPError import urllib3.exceptions from . import _private from . import commandline from . import conf as osc_conf from . import core as osc_core from . import oscerr from . import output from .OscConfigParser import configparser from .oscssl import CertVerificationError from .util.cpio import CpioError from .util.helper import decode_it from .util.packagequery import PackageError try: # import as RPMError because the class "error" is too generic # pylint: disable=E0611 from rpm import error as RPMError except: # if rpm-python isn't installed (we might be on a debian system): class RPMError(Exception): pass try: from keyring.errors import KeyringLocked except ImportError: # python-keyring is not installed class KeyringLocked(Exception): pass # the good things are stolen from Matt Mackall's mercurial def catchterm(*args): raise oscerr.SignalInterrupt # Signals which should terminate the program safely for name in 'SIGBREAK', 'SIGHUP', 'SIGTERM': num = getattr(signal, name, None) if num: signal.signal(num, catchterm) def run(prg, argv=None): try: try: # we haven't parsed options yet, that's why we rely on argv directly if "--debugger" in (argv or sys.argv[1:]): pdb.set_trace() # here we actually run the program prg.main(argv) return 0 except: # If any of these was set via the command-line options, # the config values are expected to be changed accordingly. # That's why we're working only with the config. if osc_conf.config["traceback"] or osc_conf.config["post_mortem"]: traceback.print_exc(file=sys.stderr) # we could use http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/52215 # enter the debugger, if desired if osc_conf.config["post_mortem"]: if sys.stdout.isatty() and not hasattr(sys, 'ps1'): pdb.post_mortem(sys.exc_info()[2]) else: print('sys.stdout is not a tty. 
Not jumping into pdb.', file=sys.stderr) raise except oscerr.SignalInterrupt: print('killed!', file=sys.stderr) except KeyboardInterrupt: print('interrupted!', file=sys.stderr) return 130 except oscerr.UserAbort: print('aborted.', file=sys.stderr) except oscerr.APIError as e: print('BuildService API error:', e.msg, file=sys.stderr) except oscerr.LinkExpandError as e: print(f'Link "{e.prj}/{e.pac}" cannot be expanded:\n', e.msg, file=sys.stderr) print('Use "osc repairlink" to fix merge conflicts.\n', file=sys.stderr) except oscerr.WorkingCopyWrongVersion as e: print(e, file=sys.stderr) except oscerr.NoWorkingCopy as e: print(e, file=sys.stderr) except HTTPError as e: print('Server returned an error:', e, file=sys.stderr) if hasattr(e, 'osc_msg'): print(e.osc_msg, file=sys.stderr) try: body = e.read() except AttributeError: body = '' output.print_msg(e.hdrs, print_to="debug") output.print_msg(body, print_to="debug") if e.code in [400, 403, 404, 500]: if b'' in body: msg = body.split(b'')[1] msg = msg.split(b'')[0] msg = _private.api.xml_unescape(msg) print(decode_it(msg), file=sys.stderr) if e.code >= 500 and e.code <= 599: print(f'\nRequest: {e.filename}') print('Headers:') for h, v in e.hdrs.items(): if h != 'Set-Cookie': print(f"{h}: {v}") except BadStatusLine as e: print('Server returned an invalid response:', e, file=sys.stderr) print(e.line, file=sys.stderr) except HTTPException as e: print(e, file=sys.stderr) except URLError as e: msg = 'Failed to reach a server' if hasattr(e, '_osc_host_port'): msg += f' ({e._osc_host_port})' msg += ':\n' print(msg, e.reason, file=sys.stderr) except ssl.SSLError as e: if 'tlsv1' in str(e): print('The python on this system or the server does not support TLSv1.2', file=sys.stderr) print("SSL Error:", e, file=sys.stderr) except OSError as e: # ignore broken pipe if e.errno != errno.EPIPE: raise except OSError as e: if e.errno != errno.ENOENT: raise print(e, file=sys.stderr) except (oscerr.ConfigError, oscerr.NoConfigfile) as e: print(e, file=sys.stderr) except configparser.Error as e: print(e.message, file=sys.stderr) except oscerr.OscIOError as e: print(e.msg, file=sys.stderr) output.print_msg(e.e, print_to="debug") except (oscerr.WrongOptions, oscerr.WrongArgs) as e: print(e, file=sys.stderr) return 2 except oscerr.ExtRuntimeError as e: print(f"{e.file}:", e.msg, file=sys.stderr) except oscerr.ServiceRuntimeError as e: print(e.msg, file=sys.stderr) except oscerr.WorkingCopyOutdated as e: print(e, file=sys.stderr) except (oscerr.PackageExists, oscerr.PackageMissing, oscerr.WorkingCopyInconsistent) as e: print(e.msg, file=sys.stderr) except oscerr.PackageInternalError as e: print('a package internal error occured\n' 'please file a bug and attach your current package working copy ' 'and the following traceback to it:', file=sys.stderr) print(e.msg, file=sys.stderr) traceback.print_exc(file=sys.stderr) except oscerr.PackageError as e: print(str(e), file=sys.stderr) except PackageError as e: print(f'{e.fname}:', e.msg, file=sys.stderr) except RPMError as e: print(e, file=sys.stderr) except KeyringLocked as e: print(e, file=sys.stderr) except CertVerificationError as e: print(e, file=sys.stderr) except urllib3.exceptions.MaxRetryError as e: print(e.reason, file=sys.stderr) except urllib3.exceptions.ProtocolError as e: print(e.args[0], file=sys.stderr) except CpioError as e: print(e, file=sys.stderr) except oscerr.OscBaseError as e: print('*** Error:', e, file=sys.stderr) if osc_core.MESSAGE_BACKUPS: print() print("If you lost any edited commit messages due 
to an error, you may find them here:") for path in osc_core.MESSAGE_BACKUPS: print(f" - {path}") return 1 def main(): # avoid buffering output on pipes (bnc#930137) Basically, # a "print('foo')" call is translated to a corresponding # fwrite call that writes to the stdout stream (cf. # string_print (Objects/stringobject.c) and builtin_print # (Python/bltinmodule.c)); If no pipe is used, stdout is # a tty/refers to a terminal => the stream is line buffered # (see _IO_file_doallocate (libio/filedoalloc.c)). If a pipe # is used, stdout does not refer to a terminal anymore => the # stream is fully buffered by default (see # _IO_file_doallocate). The following fdopen call makes # stdout line buffered again (at least on systems that # support setvbuf - if setvbuf is not supported, the stream # remains fully buffered (see PyFile_SetBufSize # (Objects/fileobject.c))). if not os.isatty(sys.stdout.fileno()): sys.stdout = os.fdopen(sys.stdout.fileno(), sys.stdout.mode, 1) sys.stderr = os.fdopen(sys.stderr.fileno(), sys.stderr.mode, 1) appimage = os.getenv("APPIMAGE", None) owd = os.getenv("OWD", None) if appimage and owd: # OWD stands for Original Working Directory and we need to switch there when running in an AppImage # https://docs.appimage.org/packaging-guide/environment-variables.html os.chdir(owd) sys.exit(run(commandline.OscMainCommand())) # vim: sw=4 et osc-1.12.1/osc/build.py000066400000000000000000002023051475337502500146320ustar00rootroot00000000000000# Copyright (C) 2006 Novell Inc. All rights reserved. # This program is free software; it may be used, copied, modified # and distributed under the terms of the GNU General Public Licence, # either version 2, or (at your option) any later version. import fnmatch import getpass import glob import os import re import shutil import subprocess import sys from tempfile import NamedTemporaryFile, mkdtemp from typing import List from typing import Optional from urllib.parse import urlsplit from urllib.request import URLError, HTTPError from xml.etree import ElementTree as ET from . import conf from . import connection from . import core from . 
import oscerr from .core import get_buildinfo, meta_exists, get_buildconfig, dgst from .core import get_binarylist, get_binary_file, run_external, return_external, raw_input from .fetch import Fetcher, OscFileGrabber, verify_pacs from .meter import create_text_meter from .util import cpio from .util import archquery, debquery, packagequery, rpmquery from .util import repodata from .util.helper import decode_it from .util.xml import xml_parse change_personality = { 'i686': 'linux32', 'i586': 'linux32', 'i386': 'linux32', 'ppc': 'powerpc32', 's390': 's390', 'sparc': 'linux32', 'sparcv8': 'linux32', } can_also_build = { 'aarch64': ['aarch64'], # only needed due to used heuristics in build parameter evaluation 'armv6l': ['armv4l', 'armv5l', 'armv6l', 'armv5el', 'armv6el'], 'armv7l': ['armv4l', 'armv5l', 'armv6l', 'armv7l', 'armv5el', 'armv6el', 'armv7el'], 'armv5el': ['armv4l', 'armv5l', 'armv5el'], # not existing arch, just for compatibility 'armv6el': ['armv4l', 'armv5l', 'armv6l', 'armv5el', 'armv6el'], # not existing arch, just for compatibility 'armv6hl': ['armv4l', 'armv5l', 'armv6l', 'armv5el', 'armv6el'], 'armv7el': ['armv4l', 'armv5l', 'armv6l', 'armv7l', 'armv5el', 'armv6el', 'armv7el'], # not existing arch, just for compatibility 'armv7hl': ['armv7hl'], # not existing arch, just for compatibility 'armv8el': ['armv4l', 'armv5el', 'armv6el', 'armv7el', 'armv8el'], # not existing arch, just for compatibility 'armv8l': ['armv4l', 'armv5el', 'armv6el', 'armv7el', 'armv8el'], # not existing arch, just for compatibility 'armv5tel': ['armv4l', 'armv5el', 'armv5tel'], 's390x': ['s390'], 'ppc64': ['ppc', 'ppc64', 'ppc64p7', 'ppc64le'], 'ppc64le': ['ppc64le', 'ppc64'], 'i586': ['i386'], 'i686': ['i586', 'i386'], 'x86_64': ['i686', 'i586', 'i386'], 'sparc64': ['sparc64v', 'sparcv9v', 'sparcv9', 'sparcv8', 'sparc'], 'parisc': ['hppa'], } # real arch of this machine hostarch = os.uname()[4] if hostarch == 'i686': # FIXME hostarch = 'i586' if hostarch == 'parisc': hostarch = 'hppa' class Buildinfo: """represent the contents of a buildinfo file""" def __init__(self, filename, apiurl, buildtype='spec', localpkgs=None, binarytype='rpm'): localpkgs = localpkgs or [] try: tree = xml_parse(filename) except ET.ParseError: print('could not parse the buildinfo:', file=sys.stderr) print(open(filename).read(), file=sys.stderr) sys.exit(1) root = tree.getroot() self.apiurl = apiurl if root.find('error') is not None: sys.stderr.write('buildinfo is broken... it says:\n') error = root.findtext("error") if error.startswith('unresolvable: '): sys.stderr.write('unresolvable: ') sys.stderr.write('\n '.join(error[14:].split(','))) else: sys.stderr.write(error) sys.stderr.write('\n') sys.exit(1) if not (apiurl.startswith('https://') or apiurl.startswith('http://')): raise URLError('invalid protocol for the apiurl: \'%s\'' % apiurl) self.buildtype = buildtype self.binarytype = binarytype self.apiurl = apiurl # are we building .rpm or .deb? # XXX: shouldn't we deliver the type via the buildinfo? self.pacsuffix = 'rpm' if self.buildtype in ('dsc', 'collax', 'deb'): self.pacsuffix = 'deb' if self.buildtype == 'arch': self.pacsuffix = 'arch' if self.buildtype == 'livebuild': self.pacsuffix = 'deb' if self.buildtype == 'snapcraft': # atm ubuntu is used as base, but we need to be more clever when # snapcraft also supports rpm self.pacsuffix = 'deb' # The architectures become a bit mad ... 
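# --- Illustrative sketch (not part of upstream build.py) ---
# The module-level `hostarch` value and the `can_also_build` table defined
# above are what the build code consults later on to decide whether this
# machine can build a given target architecture natively or only via QEMU
# user emulation.  A minimal, self-contained version of that check:
def _can_build_natively(host_arch: str, target_arch: str) -> bool:
    """Return True if `host_arch` can build `target_arch` without emulation."""
    return target_arch == host_arch or target_arch in can_also_build.get(host_arch, [])
# e.g. _can_build_natively('x86_64', 'i586')    -> True
#      _can_build_natively('x86_64', 'aarch64') -> False  (emulation needed)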
# buildarch: The architecture of the build result (host arch in GNU definition) # hostarch: The architecture of the build environment (build arch in GNU defintion) # crossarch: Same as hostarch, but indicating that a sysroot with an incompatible architecture exists self.buildarch = root.findtext("arch") self.crossarch = root.findtext("crossarch") self.hostarch = root.findtext("hostarch") self.release = root.findtext("release") if conf.config['api_host_options'][apiurl]['downloadurl']: # Formerly, this was set to False, but we have to set it to True, because a large # number of repos in OBS are misconfigured and don't actually have repos setup - they # are API only. self.enable_cpio = True self.downloadurl = conf.config['api_host_options'][apiurl]['downloadurl'] + "/repositories" if conf.config['http_debug']: print("⚠️ setting dl_url to %s" % conf.config['api_host_options'][apiurl]['downloadurl']) else: self.enable_cpio = True self.downloadurl = root.get('downloadurl') self.debuginfo = 0 if root.find('debuginfo') is not None: try: self.debuginfo = int(root.findtext("debuginfo")) except ValueError: pass self.deps = [] self.projects = {} self.keys = [] self.prjkeys = [] self.pathes = [] self.urls = {} self.modules = [] for node in root.findall('module'): self.modules.append(node.text) for node in root.findall('bdep'): if node.find('sysroot'): p = Pac(node, self.buildarch, self.pacsuffix, apiurl, localpkgs) else: pac_arch = self.crossarch if pac_arch is None: pac_arch = self.buildarch p = Pac(node, pac_arch, self.pacsuffix, apiurl, localpkgs) if p.project: self.projects[p.project] = 1 self.deps.append(p) for node in root.findall('path'): # old simple list for compatibility # XXX: really old? This is currently used for kiwi builds self.pathes.append(node.get('project') + "/" + node.get('repository')) # a hash providing the matching URL for specific repos for newer OBS instances if node.get('url'): baseurl = node.get('url').replace('%', '%%') if conf.config['api_host_options'][apiurl]['downloadurl']: # Add the path element to the download url override. 
baseurl = conf.config['api_host_options'][apiurl]['downloadurl'] + urlsplit(node.get('url'))[2] self.urls[node.get('project') + "/" + node.get('repository')] = baseurl + '/%(arch)s/%(filename)s' self.vminstall_list = [dep.name for dep in self.deps if dep.vminstall] self.preinstall_list = [dep.name for dep in self.deps if dep.preinstall] self.runscripts_list = [dep.name for dep in self.deps if dep.runscripts] self.noinstall_list = [dep.name for dep in self.deps if dep.noinstall] self.installonly_list = [dep.name for dep in self.deps if dep.installonly] if root.find('preinstallimage') is not None: self.preinstallimage = root.find('preinstallimage') else: self.preinstallimage = None self.containerannotation = root.findtext("containerannotation") def has_dep(self, name): for i in self.deps: if i.name == name: return True return False def remove_dep(self, name): # we need to iterate over all deps because if this a # kiwi build the same package might appear multiple times # NOTE: do not loop and remove items, the second same one would not get catched self.deps = [i for i in self.deps if not i.name == name] class Pac: """represent a package to be downloaded We build a map that's later used to fill our URL templates """ def __init__(self, node, buildarch, pacsuffix, apiurl, localpkgs=None): localpkgs = localpkgs or [] # set attributes to mute pylint error E1101: Instance of 'Pac' has no '' member (no-member) self.project = None self.name = None self.canonname = None self.repository = None self.repoarch = None self.mp = {} for i in ['binary', 'package', 'epoch', 'version', 'release', 'hdrmd5', 'project', 'repository', 'sysroot', 'preinstall', 'vminstall', 'runscripts', 'noinstall', 'installonly', 'notmeta', ]: self.mp[i] = node.get(i) self.mp['buildarch'] = buildarch self.mp['pacsuffix'] = pacsuffix self.mp['arch'] = node.get('arch') or self.mp['buildarch'] self.mp['name'] = node.get('name') or self.mp['binary'] # this is not the ideal place to check if the package is a localdep or not localdep = self.mp['name'] in localpkgs # and not self.mp['noinstall'] if not localdep and not (node.get('project') and node.get('repository')): raise oscerr.APIError('incomplete information for package %s, may be caused by a broken project configuration.' 
% self.mp['name']) if not localdep: self.mp['extproject'] = node.get('project').replace(':', ':/') self.mp['extrepository'] = node.get('repository').replace(':', ':/') self.mp['repopackage'] = node.get('package') or '_repository' self.mp['repoarch'] = node.get('repoarch') or self.mp['buildarch'] if pacsuffix == 'deb' and not (self.mp['name'] and self.mp['arch'] and self.mp['version']): raise oscerr.APIError( "buildinfo for package %s/%s/%s is incomplete" % (self.mp['name'], self.mp['arch'], self.mp['version'])) self.mp['apiurl'] = apiurl if self.mp['epoch'] is None: epoch = None else: epoch = self.mp['epoch'].encode() if self.mp['release'] is None: release = None else: release = self.mp['release'].encode() if self.mp['binary'] == 'updateinfo.xml': canonname = 'updateinfo.xml' elif self.mp['name'].startswith('container:'): canonname = self.mp['name'] + '.tar.xz' elif pacsuffix == 'deb': canonname = debquery.DebQuery.filename(self.mp['name'].encode(), epoch, self.mp['version'].encode(), release, self.mp['arch'].encode()) elif pacsuffix == 'arch': canonname = archquery.ArchQuery.filename(self.mp['name'].encode(), epoch, self.mp['version'].encode(), release, self.mp['arch'].encode()) else: canonname = rpmquery.RpmQuery.filename(self.mp['name'].encode(), epoch, self.mp['version'].encode(), release or b'0', self.mp['arch'].encode()) self.mp['canonname'] = decode_it(canonname) # maybe we should rename filename key to binary self.mp['filename'] = node.get('binary') or decode_it(canonname) if self.mp['repopackage'] == '_repository': self.mp['repofilename'] = self.mp['name'] else: # OBS 2.3 puts binary into product bdeps (noinstall ones) self.mp['repofilename'] = self.mp['filename'] # make the content of the dictionary accessible as class attributes self.__dict__.update(self.mp) def makeurls(self, cachedir, urllist): self.localdir = '%s/%s/%s/%s' % (cachedir, self.project, self.repository, self.repoarch) self.fullfilename = os.path.join(self.localdir, self.canonname) self.urllist = [url % self.mp for url in urllist] def __str__(self): return self.name or "" def __repr__(self): return "%s" % self.name def get_preinstall_image(apiurl, arch, cache_dir, img_info, offline=False): """ Searches preinstall image according to build info and downloads it to cache (unless offline is set to ``True`` (default: ``False``)). Returns preinstall image path, source and list of image binaries, which can be used to create rpmlist. .. note:: preinstall image can be used only for new build roots! 
""" imagefile = '' imagesource = '' img_bins = [] for bin in img_info.findall('binary'): img_bins.append(bin.text) img_project = img_info.get('project') img_repository = img_info.get('repository') img_arch = arch img_pkg = img_info.get('package') img_file = img_info.get('filename') img_hdrmd5 = img_info.get('hdrmd5') if not img_hdrmd5: img_hdrmd5 = img_file cache_path = '%s/%s/%s/%s' % (cache_dir, img_project, img_repository, img_arch) ifile_path = '%s/%s' % (cache_path, img_file) ifile_path_part = '%s.part' % ifile_path imagefile = ifile_path imagesource = "%s/%s/%s [%s]" % (img_project, img_repository, img_pkg, img_hdrmd5) if not os.path.exists(ifile_path): if offline: return '', '', [] url = "%s/build/%s/%s/%s/%s/%s" % (apiurl, img_project, img_repository, img_arch, img_pkg, img_file) print("downloading preinstall image %s" % imagesource) if not os.path.exists(cache_path): try: os.makedirs(cache_path, mode=0o755) except OSError as e: print('packagecachedir is not writable for you?', file=sys.stderr) print(e, file=sys.stderr) sys.exit(1) progress_obj = None if sys.stdout.isatty(): progress_obj = create_text_meter(use_pb_fallback=False) gr = OscFileGrabber(progress_obj=progress_obj) try: gr.urlgrab(url, filename=ifile_path_part, text='fetching image') except HTTPError as e: print("Failed to download! ecode:%i reason:%s" % (e.code, e.reason)) return ('', '', []) # download ok, rename partial file to final file name os.rename(ifile_path_part, ifile_path) return (imagefile, imagesource, img_bins) def get_built_files(pacdir, buildtype): if buildtype == 'spec': debs_dir = os.path.join(pacdir, 'DEBS') sdebs_dir = os.path.join(pacdir, 'SDEBS') if os.path.isdir(debs_dir) or os.path.isdir(sdebs_dir): # (S)DEBS directories detected, list their *.(s)deb files b_built = subprocess.Popen(['find', debs_dir, '-name', '*.deb'], stdout=subprocess.PIPE).stdout.read().strip() s_built = subprocess.Popen(['find', sdebs_dir, '-name', '*.sdeb'], stdout=subprocess.PIPE).stdout.read().strip() else: # default: (S)RPMS directories and their *.rpm files b_built = subprocess.Popen(['find', os.path.join(pacdir, 'RPMS'), '-name', '*.rpm'], stdout=subprocess.PIPE).stdout.read().strip() s_built = subprocess.Popen(['find', os.path.join(pacdir, 'SRPMS'), '-name', '*.rpm'], stdout=subprocess.PIPE).stdout.read().strip() elif buildtype == 'kiwi': b_built = subprocess.Popen(['find', os.path.join(pacdir, 'KIWI'), '-type', 'f'], stdout=subprocess.PIPE).stdout.read().strip() s_built = '' elif buildtype == 'docker': b_built = subprocess.Popen(['find', os.path.join(pacdir, 'DOCKER'), '-type', 'f'], stdout=subprocess.PIPE).stdout.read().strip() s_built = '' elif buildtype == 'podman': b_built = subprocess.Popen(['find', os.path.join(pacdir, 'DOCKER'), '-type', 'f'], stdout=subprocess.PIPE).stdout.read().strip() s_built = '' elif buildtype == 'fissile': b_built = subprocess.Popen(['find', os.path.join(pacdir, 'FISSILE'), '-type', 'f'], stdout=subprocess.PIPE).stdout.read().strip() s_built = '' elif buildtype in ('dsc', 'collax'): b_built = subprocess.Popen(['find', os.path.join(pacdir, 'DEBS'), '-name', '*.deb'], stdout=subprocess.PIPE).stdout.read().strip() s_built = subprocess.Popen(['find', os.path.join(pacdir, 'SOURCES.DEB'), '-type', 'f'], stdout=subprocess.PIPE).stdout.read().strip() elif buildtype == 'arch': b_built = subprocess.Popen(['find', os.path.join(pacdir, 'ARCHPKGS'), '-name', '*.pkg.tar*'], stdout=subprocess.PIPE).stdout.read().strip() s_built = '' elif buildtype == 'livebuild': b_built = subprocess.Popen(['find', 
os.path.join(pacdir, 'OTHER'), '-name', '*.iso*'], stdout=subprocess.PIPE).stdout.read().strip() s_built = '' elif buildtype == 'helm': b_built = subprocess.Popen(['find', os.path.join(pacdir, 'HELM'), '-type', 'f'], stdout=subprocess.PIPE).stdout.read().strip() s_built = '' elif buildtype == 'snapcraft': b_built = subprocess.Popen(['find', os.path.join(pacdir, 'OTHER'), '-name', '*.snap'], stdout=subprocess.PIPE).stdout.read().strip() s_built = '' elif buildtype == 'appimage': b_built = subprocess.Popen(['find', os.path.join(pacdir, 'OTHER'), '-name', '*.AppImage'], stdout=subprocess.PIPE).stdout.read().strip() s_built = '' elif buildtype == 'simpleimage': b_built = subprocess.Popen(['find', os.path.join(pacdir, 'OTHER'), '-type', 'f'], stdout=subprocess.PIPE).stdout.read().strip() s_built = '' elif buildtype == 'flatpak': b_built = subprocess.Popen(['find', os.path.join(pacdir, 'OTHER'), '-type', 'f'], stdout=subprocess.PIPE).stdout.read().strip() s_built = '' elif buildtype == 'preinstallimage': b_built = subprocess.Popen(['find', os.path.join(pacdir, 'OTHER'), '-type', 'f'], stdout=subprocess.PIPE).stdout.read().strip() s_built = '' elif buildtype == 'productcompose': b_built = subprocess.Popen(['find', os.path.join(pacdir, 'PRODUCT'), '-type', 'f'], stdout=subprocess.PIPE).stdout.read().strip() s_built = '' elif buildtype == 'mkosi': b_built = subprocess.Popen(['find', os.path.join(pacdir, 'OTHER'), '-type', 'f'], stdout=subprocess.PIPE).stdout.read().strip() s_built = '' else: print('WARNING: Unknown package type \'%s\'.' % buildtype, file=sys.stderr) b_built = '' s_built = '' return s_built, b_built def get_repo(path): """ Walks up path looking for any repodata directories. :param path: path to a directory :return: path to repository directory containing repodata directory with repomd.xml file :rtype: str """ for root, dirs, files in os.walk(path): if not "repodata" in dirs: continue if "repomd.xml" in os.listdir(os.path.join(root, "repodata")): return root return None def get_prefer_pkgs(dirs, wanted_arch, type, cpio): paths = [] repositories = [] suffix = '*.rpm' if type in ('dsc', 'collax', 'livebuild'): suffix = '*.deb' elif type == 'arch': suffix = '*.pkg.tar.*' for dir in dirs: # check for repodata repository = get_repo(dir) if repository is None: paths += glob.glob(os.path.join(os.path.abspath(dir), suffix)) else: repositories.append(repository) packageQueries = packagequery.PackageQueries(wanted_arch) for repository in repositories: repodataPackageQueries = repodata.queries(repository) for packageQuery in repodataPackageQueries: packageQueries.add(packageQuery) for path in paths: if path.endswith('.src.rpm') or path.endswith('.nosrc.rpm'): continue if path.endswith('.patch.rpm') or path.endswith('.delta.rpm'): continue packageQuery = packagequery.PackageQuery.query(path) packageQueries.add(packageQuery) prefer_pkgs = {decode_it(name): packageQuery.path() for name, packageQuery in packageQueries.items()} depfile = create_deps(packageQueries.values()) cpio.add(b'deps', b'\n'.join(depfile)) return prefer_pkgs def create_deps(pkgqs): """ creates a list of dependencies which corresponds to build's internal dependency file format """ depfile = [] for p in pkgqs: id = b'%s.%s-0/0/0: ' % (p.name(), p.arch()) depfile.append(b'P:%s%s' % (id, b' '.join(p.provides()))) depfile.append(b'R:%s%s' % (id, b' '.join(p.requires()))) d = p.conflicts() if d: depfile.append(b'C:%s%s' % (id, b' '.join(d))) d = p.obsoletes() if d: depfile.append(b'O:%s%s' % (id, b' '.join(d))) d = p.recommends() if 
d: depfile.append(b'r:%s%s' % (id, b' '.join(d))) d = p.supplements() if d: depfile.append(b's:%s%s' % (id, b' '.join(d))) depfile.append(b'I:%s%s-%s 0-%s' % (id, p.name(), p.evr(), p.arch())) return depfile trustprompt = """Would you like to ... 0 - quit (default) 1 - always trust packages from '%(project)s' 2 - trust packages just this time ? """ def check_trusted_projects(apiurl, projects, interactive=True): trusted = conf.config['api_host_options'][apiurl]['trusted_prj'] tlen = len(trusted) for prj in projects: is_trusted = False for pattern in trusted: if fnmatch.fnmatch(prj, pattern): is_trusted = True break if not is_trusted: print("\nThe build root needs packages from project '%s'." % prj) print("Note that malicious packages can compromise the build result or even your system.") if interactive: r = raw_input(trustprompt % {'project': prj}) else: r = "0" if r == '1': print("adding '%s' to oscrc: ['%s']['trusted_prj']" % (prj, apiurl)) trusted.append(prj) elif r != '2': print("Well, goodbye then :-)") raise oscerr.UserAbort() if tlen != len(trusted): conf.config['api_host_options'][apiurl]['trusted_prj'] = trusted conf.config_set_option(apiurl, 'trusted_prj', ' '.join(trusted)) def get_kiwipath_from_buildinfo(bi, prj, repo): # If the project does not have a path defined we need to get the config # via the repositories in the kiwi file. Unfortunately the buildinfo # does not include a hint if this is the case, so we rely on a heuristic # here: if the path list contains our own repo, it probably does not # come from the kiwi file and thus a path is defined in the config. # It is unlikely that our own repo is included in the kiwi file, as it # contains no packages. myprp = prj + '/' + repo if myprp in bi.pathes: return None kiwipath = bi.pathes kiwipath.insert(0, myprp) return kiwipath def calculate_prj_pac(store, opts, descr): project = opts.alternative_project or store.project if opts.local_package: package = os.path.splitext(os.path.basename(descr))[0] else: store.assert_is_package() package = store.package return project, package def calculate_build_root_user(vm_type): if vm_type in ("kvm", "podman", "qemu"): return getpass.getuser() return None def calculate_build_root(apihost, prj, pac, repo, arch, user=None): user = user or "" dash_user = f"-{user:s}" if user else "" buildroot = conf.config["build-root"] % { 'apihost': apihost, 'project': prj, 'package': pac, 'repo': repo, 'arch': arch, "user": user, "dash_user": dash_user, } return buildroot def build_as_user(vm_type=None): if not conf.config.su_wrapper: return True if calculate_build_root_user(vm_type): return True return False def su_wrapper(cmd): sucmd = conf.config['su-wrapper'].split() if sucmd: if sucmd[0] == 'su': if sucmd[-1] == '-c': sucmd.pop() cmd = sucmd + ['-s', cmd[0], 'root', '--'] + cmd[1:] else: cmd = sucmd + cmd return cmd def run_build(opts, *args): cmd = [conf.config['build-cmd']] cmd += args if opts.vm_type: cmd.extend(["--vm-type", opts.vm_type]) user = calculate_build_root_user(opts.vm_type) if not user: cmd = su_wrapper(cmd) if not opts.userootforbuild: cmd.append('--norootforbuild') return run_external(cmd[0], *cmd[1:]) def create_build_descr_data( build_descr_path: Optional[str], *, build_type: Optional[str], repo: Optional[str] = None, arch: Optional[str] = None, prefer_pkgs: Optional[List[str]] = None, define: Optional[List[str]] = None, define_with: Optional[List[str]] = None, define_without: Optional[List[str]] = None, ): if build_descr_path: build_descr_path = os.path.abspath(build_descr_path) 
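# --- Illustrative sketch (not part of upstream build.py) ---
# calculate_build_root() above expands the configured build-root template via
# %-style substitution.  The template below is a hypothetical value; the real
# one comes from conf.config["build-root"], and `user` is what
# calculate_build_root_user() returns for kvm/podman/qemu builds.
def _example_build_root() -> str:
    template = "/var/tmp/build-root/%(repo)s-%(arch)s%(dash_user)s"  # hypothetical config value
    user = "alice"
    return template % {
        "apihost": "api.example.org",
        "project": "home:alice",
        "package": "hello",
        "repo": "openSUSE_Tumbleweed",
        "arch": "x86_64",
        "user": user,
        "dash_user": f"-{user}" if user else "",
    }
# _example_build_root() -> "/var/tmp/build-root/openSUSE_Tumbleweed-x86_64-alice"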
topdir = os.path.dirname(build_descr_path) else: topdir = None result_data = [] if build_descr_path: print(f"Using local file: {os.path.basename(build_descr_path)}", file=sys.stderr) with open(build_descr_path, "rb") as f: build_descr_data = f.read() # HACK: there's no api to provide custom defines # TODO: check if we're working with a spec? defines: List[bytes] = [] for i in define or []: defines.append(f"%define {i}".encode("utf-8")) for i in define_with or []: defines.append(f"%define _with_{i} 1".encode("utf-8")) for i in define_without or []: defines.append(f"%define _without_{i} 1".encode("utf-8")) if defines: build_descr_data = b"\n".join(defines) + b"\n\n" + build_descr_data # build recipe must go first for compatibility with the older OBS versions result_data.append((os.path.basename(build_descr_path).encode("utf-8"), build_descr_data)) if topdir: for include_file in glob.glob(os.path.join(topdir, "*.inc")): fn = os.path.basename(include_file) print(f"Using local file: {fn}", file=sys.stderr) with open(include_file, "rb") as f: result_data.append((fn.encode("utf-8"), f.read())) if topdir: buildenv_file = os.path.join(topdir, f"_buildenv.{repo}.{arch}") if not os.path.isfile(buildenv_file): buildenv_file = os.path.join(topdir, "_buildenv") if os.path.isfile(buildenv_file): print(f"Using local file: {os.path.basename(buildenv_file)}", file=sys.stderr) with open(buildenv_file, "rb") as f: result_data.append((b"buildenv", f.read())) if topdir: service_file = os.path.join(topdir, "_service") if os.path.isfile(service_file): print("Using local file: _service", file=sys.stderr) with open(service_file, "rb") as f: result_data.append((b"_service", f.read())) if not result_data and not prefer_pkgs: return None, {} cpio_data = cpio.CpioWrite() for key, value in result_data: cpio_data.add(key, value) if prefer_pkgs: print(f"Scanning the following dirs for local preferred packages: {', '.join(prefer_pkgs)}", file=sys.stderr) prefer_pkgs_result = get_prefer_pkgs(prefer_pkgs, arch, build_type, cpio_data) else: prefer_pkgs_result = {} return cpio_data.get(), prefer_pkgs_result def main(apiurl, store, opts, argv): repo = argv[0] arch = argv[1] build_descr = argv[2] xp = [] build_root = None cache_dir = None build_uid = '' config = conf.config build_shell_after_fail = config['build-shell-after-fail'] vm_memory = config['build-memory'] vm_disk_size = config['build-vmdisk-rootsize'] vm_type = config['build-type'] vm_telnet = None build_descr = os.path.abspath(build_descr) build_type = os.path.splitext(build_descr)[1][1:] if build_type in ['spec', 'dsc', 'kiwi', 'productcompose', 'livebuild']: # File extension works pass elif os.path.basename(build_descr) == 'PKGBUILD': build_type = 'arch' elif os.path.basename(build_descr) == 'build.collax': build_type = 'collax' elif os.path.basename(build_descr) == 'appimage.yml': build_type = 'appimage' elif os.path.basename(build_descr) == 'Chart.yaml': build_type = 'helm' elif os.path.basename(build_descr) == 'snapcraft.yaml': build_type = 'snapcraft' elif os.path.basename(build_descr) == 'simpleimage': build_type = 'simpleimage' elif os.path.basename(build_descr) == 'Containerfile' or os.path.basename(build_descr).startswith('Containerfile.'): build_type = 'docker' elif os.path.basename(build_descr) == 'Dockerfile' or os.path.basename(build_descr).startswith('Dockerfile.'): build_type = 'docker' elif os.path.basename(build_descr) == 'fissile.yml': build_type = 'fissile' elif os.path.basename(build_descr) == '_preinstallimage': build_type = 'preinstallimage' 
elif build_descr.endswith('flatpak.yaml') or build_descr.endswith('flatpak.yml') or build_descr.endswith('flatpak.json'): build_type = 'flatpak' elif os.path.basename(build_descr).startswith('mkosi.'): build_type = 'mkosi' else: raise oscerr.WrongArgs( 'Unknown build type: \'%s\'. ' 'Build description should end in .spec, .dsc, .kiwi, .productcompose or .livebuild. ' 'Or being named PKGBUILD, build.collax, simpleimage, appimage.yml, ' 'Chart.yaml, snapcraft.yaml, flatpak.json, flatpak.yml, flatpak.yaml, ' 'preinstallimage, Dockerfile.*, Containerfile.* or mkosi.*' % build_type) if not os.path.isfile(build_descr): raise oscerr.WrongArgs('Error: build description file named \'%s\' does not exist.' % build_descr) buildargs = [] buildargs.append('--statistics') if not opts.userootforbuild: buildargs.append('--norootforbuild') if opts.clean: buildargs.append('--clean') if opts.checks: buildargs.append('--checks') if opts.nochecks: buildargs.append('--no-checks') if not opts.no_changelog: buildargs.append('--changelog') if opts.root: build_root = opts.root if opts.target: buildargs.append('--target=%s' % opts.target) if opts.threads: buildargs.append('--threads=%s' % opts.threads) if opts.jobs: buildargs.append('--jobs=%s' % opts.jobs) elif config['build-jobs'] > 0: buildargs.append('--jobs=%s' % config['build-jobs']) if opts.icecream or config['icecream'] != '0': if opts.icecream: num = opts.icecream else: num = config['icecream'] if int(num) > 0: buildargs.append('--icecream=%s' % num) xp.append('icecream') xp.append('gcc-c++') if opts.ccache or config['ccache']: buildargs.append('--ccache') xp.append('ccache') if opts.pkg_ccache: buildargs.append('--pkg-ccache=%s' % opts.pkg_ccache) xp.append('ccache') if opts.linksources: buildargs.append('--linksources') if opts.baselibs: buildargs.append('--baselibs') if opts.debuginfo: buildargs.append('--debug') if opts._with: for o in opts._with: buildargs.append('--with=%s' % o) if opts.without: for o in opts.without: buildargs.append('--without=%s' % o) if opts.define: for o in opts.define: buildargs.append('--define=%s' % o) if config['build-uid']: build_uid = config['build-uid'] if opts.build_uid: build_uid = opts.build_uid if build_uid: buildidre = re.compile('^[0-9]+:[0-9]+$') if build_uid == 'caller': buildargs.append('--uid=%s:%s' % (os.getuid(), os.getgid())) elif buildidre.match(build_uid): buildargs.append('--uid=%s' % build_uid) else: print('Error: build-uid arg must be 2 colon separated numerics: "uid:gid" or "caller"', file=sys.stderr) return 1 if opts.shell_after_fail: build_shell_after_fail = opts.shell_after_fail if opts.vm_memory: vm_memory = opts.vm_memory if opts.vm_disk_size: vm_disk_size = opts.vm_disk_size if opts.vm_type: vm_type = opts.vm_type if opts.vm_telnet: vm_telnet = opts.vm_telnet if opts.alternative_project: prj = opts.alternative_project pac = '_repository' else: prj = store.project if opts.local_package: pac = '_repository' else: pac = store.package if opts.multibuild_package: buildargs.append('--buildflavor=%s' % opts.multibuild_package) pac = pac + ":" + opts.multibuild_package if opts.verbose_mode: buildargs.append('--verbose=%s' % opts.verbose_mode) if opts.no_timestamps: buildargs.append('--no-timestamps') if opts.wipe: buildargs.append("--wipe") pacname = pac if pacname == '_repository': if not opts.local_package: try: pacname = store.package except oscerr.NoWorkingCopy: opts.local_package = True if opts.local_package: pacname = os.path.splitext(os.path.basename(build_descr))[0] apihost = urlsplit(apiurl)[1] if 
not build_root: user = calculate_build_root_user(vm_type) build_root = calculate_build_root(apihost, prj, pacname, repo, arch, user) # We configure sccache after pacname, so that in default cases we can have an sccache for each # package to prevent cross-cache polutions. It helps to make the local-use case a bit nicer. if opts.sccache_uri or config['sccache_uri'] or opts.sccache or config['sccache']: if opts.pkg_ccache or opts.ccache or config['ccache']: raise oscerr.WrongArgs('Error: sccache and ccache can not be enabled at the same time') sccache_arg = "--sccache-uri=/var/tmp/osbuild-sccache-{pkgname}.tar" if opts.sccache_uri: sccache_arg = '--sccache-uri=%s' % opts.sccache_uri elif config['sccache_uri']: sccache_arg = '--sccache-uri=%s' % config['sccache_uri'] # Format the package name. sccache_arg = sccache_arg.format(pkgname=pacname) buildargs.append(sccache_arg) xp.append('sccache') # define buildinfo & config local cache bi_file = None bc_file = None bi_filename = '_buildinfo-%s-%s.xml' % (repo, arch) bc_filename = '_buildconfig-%s-%s' % (repo, arch) if store is not None and store.is_package and os.access(core.store, os.W_OK): bi_filename = os.path.join(os.getcwd(), core.store, bi_filename) bc_filename = os.path.join(os.getcwd(), core.store, bc_filename) elif not os.access('.', os.W_OK): bi_file = NamedTemporaryFile(prefix=bi_filename) bi_filename = bi_file.name bc_file = NamedTemporaryFile(prefix=bc_filename) bc_filename = bc_file.name else: bi_filename = os.path.abspath(bi_filename) bc_filename = os.path.abspath(bc_filename) if opts.shell: buildargs.append("--shell") if build_shell_after_fail: buildargs.append("--shell-after-fail") if opts.shell_cmd: buildargs.append("--shell-cmd") buildargs.append(opts.shell_cmd) if opts.noinit: buildargs.append('--noinit') if opts.local_package or not store.is_package: opts.skip_local_service_run = True # check for source services if not opts.offline and not opts.skip_local_service_run: p = core.Package(os.curdir) r = p.run_source_services(verbose=True) if r: raise oscerr.ServiceRuntimeError('Source service run failed!') cache_dir = config['packagecachedir'] % {'apihost': apihost} extra_pkgs = [] if not opts.extra_pkgs: extra_pkgs = config.get('extra-pkgs', []) elif opts.extra_pkgs != ['']: extra_pkgs = opts.extra_pkgs if opts.extra_pkgs_from: for filename in opts.extra_pkgs_from: with open(filename, encoding="utf-8") as f: for line in f: extra_pkgs.append(line.rstrip('\n')) if xp: extra_pkgs += xp build_descr_data, prefer_pkgs = create_build_descr_data( build_descr, build_type=build_type, repo=repo, arch=arch, prefer_pkgs=opts.prefer_pkgs, define=opts.define, define_with=opts._with, define_without=opts.without, ) # special handling for overlay and rsync-src/dest specialcmdopts = [] if opts.rsyncsrc or opts.rsyncdest: if not opts.rsyncsrc or not opts.rsyncdest: raise oscerr.WrongOptions('When using --rsync-{src,dest} both parameters have to be specified.') myrsyncsrc = os.path.abspath(os.path.expanduser(os.path.expandvars(opts.rsyncsrc))) if not os.path.isdir(myrsyncsrc): raise oscerr.WrongOptions('--rsync-src %s is no valid directory!' % opts.rsyncsrc) # can't check destination - its in the target chroot ;) - but we can check for sanity myrsyncdest = os.path.expandvars(opts.rsyncdest) if not os.path.isabs(myrsyncdest): raise oscerr.WrongOptions('--rsync-dest %s is no absolute path (starting with \'/\')!' 
% opts.rsyncdest) specialcmdopts = ['--rsync-src=' + myrsyncsrc, '--rsync-dest=' + myrsyncdest] if opts.overlay: myoverlay = os.path.abspath(os.path.expanduser(os.path.expandvars(opts.overlay))) if not os.path.isdir(myoverlay): raise oscerr.WrongOptions('--overlay %s is no valid directory!' % opts.overlay) specialcmdopts += ['--overlay=' + myoverlay] try: if opts.noinit: if not os.path.isfile(bi_filename): raise oscerr.WrongOptions('--noinit is not possible, no local buildinfo file') print('Use local \'%s\' file as buildinfo' % bi_filename) if not os.path.isfile(bc_filename): raise oscerr.WrongOptions('--noinit is not possible, no local buildconfig file') print('Use local \'%s\' file as buildconfig' % bc_filename) elif opts.offline: if not os.path.isfile(bi_filename): raise oscerr.WrongOptions('--offline is not possible, no local buildinfo file') print('Use local \'%s\' file as buildinfo' % bi_filename) if not os.path.isfile(bc_filename): raise oscerr.WrongOptions('--offline is not possible, no local buildconfig file') else: print('Getting buildconfig from server and store to %s' % bc_filename) bc = get_buildconfig(apiurl, prj, repo) if not bc_file: bc_file = open(bc_filename, 'w') bc_file.write(decode_it(bc)) bc_file.flush() if os.path.exists('/usr/lib/build/queryconfig') and not opts.nodebugpackages: debug_pkgs = decode_it(return_external('/usr/lib/build/queryconfig', '--dist', bc_filename, 'substitute', 'obs:cli_debug_packages')) if len(debug_pkgs) > 0: extra_pkgs.extend(debug_pkgs.strip().split(" ")) print('Getting buildinfo from server and store to %s' % bi_filename) bi_text = decode_it(get_buildinfo(apiurl, prj, pac, repo, arch, specfile=build_descr_data, addlist=extra_pkgs)) if not bi_file: bi_file = open(bi_filename, 'w') # maybe we should check for errors before saving the file bi_file.write(bi_text) bi_file.flush() kiwipath = None if build_type == 'kiwi': bi = Buildinfo(bi_filename, apiurl, 'kiwi', list(prefer_pkgs.keys())) kiwipath = get_kiwipath_from_buildinfo(bi, prj, repo) bc = get_buildconfig(apiurl, prj, repo, kiwipath) bc_file.seek(0) bc_file.write(decode_it(bc)) bc_file.flush() except HTTPError as e: if e.code == 404: # check what caused the 404 if meta_exists(metatype='prj', path_args=(prj, ), template_args=None, create_new=False, apiurl=apiurl): pkg_meta_e = None try: # take care, not to run into double trouble. 
pkg_meta_e = meta_exists(metatype='pkg', path_args=(prj, pac), template_args=None, create_new=False, apiurl=apiurl) except: pass if pkg_meta_e: print('ERROR: Either wrong repo/arch as parameter or a parse error of .spec/.dsc/.kiwi file due to syntax error', file=sys.stderr) else: print('The package \'%s\' does not exist - please ' 'rerun with \'--local-package\'' % pac, file=sys.stderr) else: print('The project \'%s\' does not exist - please ' 'rerun with \'--alternative-project \'' % prj, file=sys.stderr) sys.exit(1) else: raise # Set default binary type if cannot be detected binary_type = 'rpm' if os.path.exists('/usr/lib/build/queryconfig'): binary_type = decode_it(return_external('/usr/lib/build/queryconfig', '--dist', bc_filename, 'binarytype')).strip() # If binary type is set to a useless value, reset to 'rpm' if binary_type == 'UNDEFINED': binary_type = 'rpm' bi = Buildinfo(bi_filename, apiurl, build_type, list(prefer_pkgs.keys()), binary_type) if bi.debuginfo and not (opts.disable_debuginfo or '--debug' in buildargs): buildargs.append('--debug') if opts.release: bi.release = opts.release if bi.release: buildargs.append('--release') buildargs.append(bi.release) if opts.stage: buildargs.append('--stage') buildargs.append(opts.stage) if opts.build_opt: buildargs += opts.build_opt if opts.buildtool_opt: buildargs += [f"--buildtool-opt={opt}" for opt in opts.buildtool_opt] # real arch of this machine # vs. # arch we are supposed to build for if vm_type != "emulator" and vm_type != "qemu": if bi.hostarch is not None: if hostarch != bi.hostarch and bi.hostarch not in can_also_build.get(hostarch, []): print('Error: hostarch \'%s\' is required.' % (bi.hostarch), file=sys.stderr) return 1 elif hostarch != bi.buildarch: if bi.buildarch not in can_also_build.get(hostarch, []): print('WARNING: It is guessed to build on hostarch \'%s\' for \'%s\' via QEMU user emulation.' % (hostarch, bi.buildarch), file=sys.stderr) rpmlist_prefers = [] if prefer_pkgs: print('Evaluating preferred packages') for name, path in prefer_pkgs.items(): if bi.has_dep(name): # We remove a preferred package from the buildinfo, so that the # fetcher doesn't take care about them. # Instead, we put it in a list which is appended to the rpmlist later. # At the same time, this will make sure that these packages are # not verified. bi.remove_dep(name) rpmlist_prefers.append((name, path)) print(' - %s (%s)' % (name, path)) print('Updating cache of required packages') urllist = [] if not opts.download_api_only: # transform 'url1, url2, url3' form into a list if 'urllist' in config: if isinstance(config['urllist'], str): re_clist = re.compile('[, ]+') urllist = [i.strip() for i in re_clist.split(config['urllist'].strip())] else: urllist = config['urllist'] # OBS 1.5 and before has no downloadurl defined in buildinfo, but it is obsolete again meanwhile. # we have now specific download repositories per repository. Could be removed IMHO, since the api fallback # is there. In worst case it could fetch the wrong rpm... 
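# --- Illustrative sketch (not part of upstream build.py) ---
# The download locations collected in `urllist` below are %-style templates;
# Pac.makeurls() fills them from each package's mapping (Pac.mp).  The values
# used here are hypothetical:
_example_template = '%(apiurl)s/build/%(project)s/%(repository)s/%(repoarch)s/%(repopackage)s/%(repofilename)s'
_example_mapping = {
    "apiurl": "https://api.example.org",
    "project": "openSUSE:Factory",
    "repository": "standard",
    "repoarch": "x86_64",
    "repopackage": "_repository",
    "repofilename": "hello",
}
# _example_template % _example_mapping
# -> "https://api.example.org/build/openSUSE:Factory/standard/x86_64/_repository/hello"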
if bi.downloadurl: urllist.append(bi.downloadurl.replace('%', '%%') + '/%(extproject)s/%(extrepository)s/%(arch)s/%(filename)s') if opts.disable_cpio_bulk_download: urllist.append('%(apiurl)s/build/%(project)s/%(repository)s/%(repoarch)s/%(repopackage)s/%(repofilename)s') fetcher = Fetcher(cache_dir, urllist=urllist, offline=opts.noinit or opts.offline, http_debug=config['http_debug'], modules=bi.modules, enable_cpio=not opts.disable_cpio_bulk_download and bi.enable_cpio, cookiejar=connection.CookieJarAuthHandler(apiurl, os.path.expanduser(config["cookiejar"]))._cookiejar, download_api_only=opts.download_api_only) if not opts.trust_all_projects: # implicitly trust the project we are building for check_trusted_projects(apiurl, [i for i in bi.projects.keys() if not i == prj]) imagefile = '' imagesource = '' imagebins = [] if build_as_user(vm_type): # preinstallimage extraction will fail because unprivileged user cannot chroot or extract devices from the tarball bi.preinstallimage = None if build_type == 'preinstallimage': # preinstallimage would repackage just the previously built preinstallimage bi.preinstallimage = None if (not config['no_preinstallimage'] and not opts.nopreinstallimage and bi.preinstallimage and not opts.noinit and (opts.clean or (not os.path.exists(build_root + "/installed-pkg") and not os.path.exists(build_root + "/.build/init_buildsystem.data")))): (imagefile, imagesource, imagebins) = get_preinstall_image(apiurl, arch, cache_dir, bi.preinstallimage, opts.offline) if imagefile: # remove binaries from build deps which are included in preinstall image for i in bi.deps: if i.name in imagebins: bi.remove_dep(i.name) # now update the package cache fetcher.run(bi) old_pkg_dir = None if opts.oldpackages: old_pkg_dir = opts.oldpackages if not old_pkg_dir.startswith('/') and not opts.offline: data = [prj, pacname, repo, arch] if old_pkg_dir == '_link': p = core.Package(os.curdir) if not p.islink(): raise oscerr.WrongOptions('package is not a link') data[0] = p.linkinfo.project data[1] = p.linkinfo.package repos = core.get_repositories_of_project(apiurl, data[0]) # hack for links to e.g. Factory if not data[2] in repos and 'standard' in repos: data[2] = 'standard' elif old_pkg_dir != '' and old_pkg_dir != '_self': a = old_pkg_dir.split('/') for i in range(0, len(a)): data[i] = a[i] destdir = os.path.join(cache_dir, data[0], data[2], data[3]) old_pkg_dir = None try: print("Downloading previous build from %s ..." 
% '/'.join(data)) binaries = get_binarylist(apiurl, data[0], data[2], data[3], package=data[1], verbose=True) except Exception as e: print("Error: failed to get binaries: %s" % str(e)) binaries = [] if binaries: class mytmpdir: """ temporary directory that removes itself""" def __init__(self, *args, **kwargs): self.name = mkdtemp(*args, **kwargs) _rmtree = staticmethod(shutil.rmtree) def cleanup(self): self._rmtree(self.name) def __del__(self): self.cleanup() def __exit__(self, exc_type, exc_value, traceback): self.cleanup() def __str__(self): return self.name or "" old_pkg_dir = mytmpdir(prefix='.build.oldpackages', dir=os.path.abspath(os.curdir)) if not os.path.exists(destdir): os.makedirs(destdir) for i in binaries: fname = os.path.join(destdir, i.name) os.symlink(fname, os.path.join(str(old_pkg_dir), i.name)) if os.path.exists(fname): st = os.stat(fname) if st.st_mtime == i.mtime and st.st_size == i.size: continue get_binary_file(apiurl, data[0], data[2], data[3], i.name, package=data[1], target_filename=fname, target_mtime=i.mtime, progress_meter=True) if old_pkg_dir is not None: buildargs.append('--oldpackages=%s' % old_pkg_dir) # Make packages from buildinfo available as repos for kiwi/docker/fissile if build_type in ('kiwi', 'docker', 'podman', 'fissile', 'productcompose'): if os.path.lexists('repos'): shutil.rmtree('repos') if os.path.lexists('containers'): shutil.rmtree('containers') os.mkdir('repos') for i in bi.deps: if not i.extproject: # remove bi.deps.remove(i) continue if i.notmeta: continue # project pdir = str(i.extproject).replace(':/', ':') # repo rdir = str(i.extrepository).replace(':/', ':') # arch adir = i.repoarch # source fullfilename sffn = i.fullfilename filename = sffn.split("/")[-1] if i.name == 'updateinfo.xml': adir = 'updateinfo' filename = i.package + ':' + i.repoarch + ':updateinfo.xml' # project/repo if i.name.startswith("container:"): prdir = "containers/" + pdir + "/" + rdir pradir = prdir filename = filename[10:] if build_type == 'kiwi': buildargs.append('--kiwi-parameter') buildargs.append('--set-container-derived-from=dir://./' + prdir + "/" + filename) else: prdir = "repos/" + pdir + "/" + rdir # project/repo/arch pradir = prdir + "/" + adir # target fullfilename tffn = pradir + "/" + filename if not os.path.exists(os.path.join(pradir)): os.makedirs(os.path.join(pradir)) if not os.path.exists(tffn): print("Using package: " + sffn) if opts.linksources: os.link(sffn, tffn) else: os.symlink(sffn, tffn) if prefer_pkgs: for name, path in prefer_pkgs.items(): if name == filename: print("Using prefered package: " + path + "/" + filename) os.unlink(tffn) if bi.containerannotation: if not os.path.exists("containers"): os.makedirs("containers") with open("containers/annotation", "wb") as f: f.write(bi.containerannotation.encode()) if prefer_pkgs: localpkgdir = "repos/_local/" os.mkdir(localpkgdir) buildargs.append("--kiwi-parameter") buildargs.append("--add-repo") buildargs.append("--kiwi-parameter") buildargs.append(f"dir://./{localpkgdir}") buildargs.append("--kiwi-parameter") buildargs.append("--add-repotype") buildargs.append("--kiwi-parameter") buildargs.append("rpm-md") for name, path in prefer_pkgs.items(): tffn = os.path.join(localpkgdir, os.path.basename(path)) if opts.linksources: os.link(path, tffn) else: os.symlink(path, tffn) else: buildargs.append("--repos-directory=-") buildargs.append("--containers-directory=-") if build_type == 'kiwi': # Is a obsrepositories tag used? 
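# --- Illustrative sketch (not part of upstream build.py) ---
# For image builds (kiwi, docker, podman, fissile, productcompose) the loop
# above mirrors every cached build dependency into a plain directory tree
# "repos/<project>/<repo>/<arch>/" (or "containers/..." for container deps)
# that the image tool can consume as a local repository.  Hypothetical values:
_project, _repo, _arch = "openSUSE:Factory", "standard", "x86_64"
_filename = "hello-1.0-1.x86_64.rpm"
_target = "repos/" + _project + "/" + _repo + "/" + _arch + "/" + _filename
# -> "repos/openSUSE:Factory/standard/x86_64/hello-1.0-1.x86_64.rpm"
# the target is a symlink (or a hard link with --linksources) into the package cache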
try: tree = xml_parse(build_descr) except: print('could not parse the kiwi file:', file=sys.stderr) print(open(build_descr).read(), file=sys.stderr) sys.exit(1) root = tree.getroot() # product if root.find('instsource'): # leads to unsigned media, but avoids build failure buildargs.append('--signdummy') for xml in root.findall('instsource'): found_obsrepositories = 0 for node in xml.findall('instrepo'): if node and node.find('source').get('path') == 'obsrepositories:/': for path in bi.pathes: found_obsrepositories += 1 new_node = ET.SubElement(xml, 'instrepo') new_node.set('name', node.get('name') + "_" + str(found_obsrepositories)) new_node.set('priority', node.get('priority')) new_node.set('local', 'true') new_source_node = ET.SubElement(new_node, 'source') new_source_node.set('path', "obs://" + path) xml.remove(node) if found_obsrepositories > 0: build_descr = os.getcwd() + '/_service:osc_obsrepositories:' + build_descr.rsplit('/', 1)[-1] tree.write(open(build_descr, 'wb')) # appliance expand_obsrepos = None for xml in root.findall('repository'): if xml.find('source').get('path') == 'obsrepositories:/': expand_obsrepos = True if expand_obsrepos: buildargs.append('--kiwi-parameter') buildargs.append('--ignore-repos') for xml in root.findall('repository'): if xml.find('source').get('path') == 'obsrepositories:/': for path in bi.pathes: if not os.path.isdir("repos/" + path): continue buildargs.append('--kiwi-parameter') buildargs.append('--add-repo') buildargs.append('--kiwi-parameter') buildargs.append("dir://./repos/" + path) buildargs.append('--kiwi-parameter') buildargs.append('--add-repotype') buildargs.append('--kiwi-parameter') buildargs.append('rpm-md') if xml.get('priority'): buildargs.append('--kiwi-parameter') buildargs.append('--add-repoprio=' + xml.get('priority')) else: m = re.match(r"obs://[^/]+/([^/]+)/(\S+)", xml.find('source').get('path')) if not m: # short path without obs instance name m = re.match(r"obs://([^/]+)/(.+)", xml.find('source').get('path')) project = m.group(1).replace(":", ":/") repo = m.group(2) buildargs.append('--kiwi-parameter') buildargs.append('--add-repo') buildargs.append('--kiwi-parameter') buildargs.append("dir://./repos/" + project + "/" + repo) buildargs.append('--kiwi-parameter') buildargs.append('--add-repotype') buildargs.append('--kiwi-parameter') buildargs.append('rpm-md') if xml.get('priority'): buildargs.append('--kiwi-parameter') buildargs.append('--add-repopriority=' + xml.get('priority')) if vm_type in ('xen', 'kvm', 'lxc', 'nspawn'): print('Skipping verification of package signatures due to secure VM build') elif bi.pacsuffix == 'rpm': if opts.no_verify: print('Skipping verification of package signatures') else: print('Verifying integrity of cached packages') verify_pacs(bi) elif bi.pacsuffix == 'deb': if opts.no_verify or opts.noinit: print('Skipping verification of package signatures') else: print('WARNING: deb packages get not verified, they can compromise your system !') else: print('WARNING: unknown packages get not verified, they can compromise your system !') for i in bi.deps: if i.hdrmd5: if not i.name.startswith('container:') and not i.fullfilename.endswith(".rpm"): continue if i.name.startswith('container:'): hdrmd5 = dgst(i.fullfilename) else: hdrmd5 = packagequery.PackageQuery.queryhdrmd5(i.fullfilename) if not hdrmd5: print("Error: cannot get hdrmd5 for %s" % i.fullfilename) sys.exit(1) if hdrmd5 != i.hdrmd5: if conf.config["api_host_options"][apiurl]["disable_hdrmd5_check"]: print(f"Warning: Ignoring a hdrmd5 mismatch 
for {i.fullfilename}: {hdrmd5} (actual) != {i.hdrmd5} (expected)") else: print(f"Error: hdrmd5 mismatch for {i.fullfilename}: {hdrmd5} (actual) != {i.hdrmd5} (expected)") sys.exit(1) print('Writing build configuration') if build_type in ('kiwi', 'docker', 'podman', 'fissile', 'productcompose'): rpmlist = ['%s %s\n' % (i.name, i.fullfilename) for i in bi.deps if not i.noinstall] else: rpmlist = [] for dep in bi.deps: if dep.sysroot: # packages installed in sysroot subdirectory need to get a prefix for init_buildsystem rpmlist.append("sysroot: %s %s\n" % (dep.name, dep.fullfilename)) else: rpmlist.append("%s %s\n" % (dep.name, dep.fullfilename)) for i in imagebins: rpmlist.append("%s preinstallimage\n" % i) rpmlist += ["%s %s\n" % (i[0], i[1]) for i in rpmlist_prefers] if imagefile: rpmlist.append('preinstallimage: %s\n' % imagefile) if imagesource: rpmlist.append('preinstallimagesource: %s\n' % imagesource) rpmlist.append('preinstall: ' + ' '.join(bi.preinstall_list) + '\n') rpmlist.append('vminstall: ' + ' '.join(bi.vminstall_list) + '\n') rpmlist.append('runscripts: ' + ' '.join(bi.runscripts_list) + '\n') if build_type != 'kiwi' and build_type != 'docker' and build_type != 'podman' and build_type != 'fissile': if bi.noinstall_list: rpmlist.append('noinstall: ' + ' '.join(bi.noinstall_list) + '\n') if bi.installonly_list: rpmlist.append('installonly: ' + ' '.join(bi.installonly_list) + '\n') rpmlist_file = NamedTemporaryFile(mode='w+t', prefix='rpmlist.') rpmlist_filename = rpmlist_file.name rpmlist_file.writelines(rpmlist) rpmlist_file.flush() subst = {'repo': repo, 'arch': arch, 'project': prj, 'package': pacname} vm_options = [] # XXX check if build-device present my_build_device = '' if config['build-device']: my_build_device = config['build-device'] % subst else: # obs worker uses /root here but that collides with the # /root directory if the build root was used without vm # before my_build_device = build_root + '/img' if vm_type: if config['build-swap']: my_build_swap = config['build-swap'] % subst else: my_build_swap = build_root + '/swap' vm_options = [f"--vm-type={vm_type}"] if vm_telnet: vm_options += [f"--vm-telnet={vm_telnet}"] if vm_memory: vm_options += [f"--memory={vm_memory}"] if vm_type != 'lxc' and vm_type != 'nspawn': vm_options += [f"--vm-disk={my_build_device}"] vm_options += [f"--vm-swap={my_build_swap}"] vm_options += [f"--logfile={build_root}/.build.log"] if vm_type == 'kvm': if config['build-kernel']: vm_options += [f"--vm-kernel={config['build-kernel']}"] if config['build-initrd']: vm_options += [f"--vm-initrd={config['build-initrd']}"] build_root += '/.mount' if vm_disk_size: vm_options += [f"--vmdisk-rootsize={vm_disk_size}"] if config['build-vmdisk-swapsize']: vm_options += [f"--vmdisk-swapsize={config['build-vmdisk-swapsize']}"] if config['build-vmdisk-filesystem']: vm_options += [f"--vmdisk-filesystem={config['build-vmdisk-filesystem']}"] if config['build-vm-user']: vm_options += [f"--vm-user={config['build-vm-user']}"] if opts.preload: print("Preload done for selected repo/arch.") sys.exit(0) print('Running build') cmd = [ config['build-cmd'], f"--root={build_root}", f"--rpmlist={rpmlist_filename}", f"--dist={bc_filename}", f"--arch={bi.buildarch}", ] cmd += specialcmdopts + vm_options + buildargs cmd += [build_descr] # determine if we're building under root (user == None) and use su_wrapper accordingly if calculate_build_root_user(vm_type) is None: cmd = su_wrapper(cmd) # change personality, if needed if hostarch != bi.buildarch and bi.buildarch in 
change_personality: cmd = [change_personality[bi.buildarch]] + cmd # record our settings for later builds if not opts.local_package and store.is_package: store.last_buildroot = repo, arch, vm_type try: rc = run_external(cmd[0], *cmd[1:]) if rc: print() print(f"Build failed with exit code {rc}") print(f"The buildroot was: {build_root}") print() print("Cleaning the build root may fix the problem or allow you to start debugging from a well-defined state:") print(" - add '--clean' option to your 'osc build' command") print(" - run 'osc wipe [--vm-type=...]' prior running your 'osc build' command again") sys.exit(rc) except KeyboardInterrupt as keyboard_interrupt_exception: print("keyboard interrupt, killing build ...") cmd.append('--kill') run_external(cmd[0], *cmd[1:]) raise keyboard_interrupt_exception pacdir = os.path.join(build_root, '.build.packages') if os.path.islink(pacdir): pacdir = os.readlink(pacdir) pacdir = os.path.join(build_root, pacdir) if os.path.exists(pacdir): (s_built, b_built) = get_built_files(pacdir, bi.buildtype) print() if s_built: print(decode_it(s_built)) print() print(decode_it(b_built)) if opts.keep_pkgs: for i in b_built.splitlines() + s_built.splitlines(): shutil.copy2(i, os.path.join(opts.keep_pkgs, os.path.basename(decode_it(i)))) if bi_file: bi_file.close() if bc_file: bc_file.close() rpmlist_file.close() # vim: sw=4 et osc-1.12.1/osc/checker.py000066400000000000000000000062051475337502500151400ustar00rootroot00000000000000import base64 import os from tempfile import mkdtemp from shutil import rmtree class KeyError(Exception): def __init__(self, key, *args): super().__init__() self.args = args self.key = key def __str__(self): return f"{self.key} :{' '.join(self.args)}" class Checker: def __init__(self): import rpm self.dbdir = mkdtemp(prefix='oscrpmdb') self.imported = {} # pylint: disable=E1101 rpm.addMacro('_dbpath', self.dbdir) self.ts = rpm.TransactionSet() self.ts.initDB() self.ts.openDB() self.ts.setVSFlags(0) # self.ts.Debug(1) def readkeys(self, keys=None): import rpm keys = keys or [] # pylint: disable=E1101 rpm.addMacro('_dbpath', self.dbdir) for key in keys: try: self.readkey(key) except KeyError as e: print(e) if not self.imported: raise KeyError('', "no key imported") import rpm # pylint: disable=E1101 rpm.delMacro("_dbpath") # python is an idiot # def __del__(self): # self.cleanup() def cleanup(self): self.ts.closeDB() rmtree(self.dbdir) def readkey(self, file): if file in self.imported: return fd = open(file) line = fd.readline() if line and line[0:14] == "-----BEGIN PGP": line = fd.readline() while line and line != "\n": line = fd.readline() if not line: raise KeyError(file, "not a pgp public key") else: raise KeyError(file, "not a pgp public key") key = '' line = fd.readline() crc = None while line: if line[0:12] == "-----END PGP": break line = line.rstrip() if line[0] == '=': crc = line[1:] line = fd.readline() break else: key += line line = fd.readline() fd.close() if not line or line[0:12] != "-----END PGP": raise KeyError(file, "not a pgp public key") # TODO: compute and compare CRC, see RFC 2440 bkey = base64.b64decode(key) r = self.ts.pgpImportPubkey(bkey) if r != 0: raise KeyError(file, "failed to import pubkey") self.imported[file] = 1 def check(self, pkg): # avoid errors on non rpm if pkg[-4:] != '.rpm': return fd = None try: fd = os.open(pkg, os.O_RDONLY) hdr = self.ts.hdrFromFdno(fd) finally: if fd is not None: os.close(fd) if __name__ == "__main__": import sys keyfiles = [] pkgs = [] for arg in sys.argv[1:]: if arg[-4:] == '.rpm': 
pkgs.append(arg) else: keyfiles.append(arg) checker = Checker() try: checker.readkeys(keyfiles) for pkg in pkgs: checker.check(pkg) except Exception as e: checker.cleanup() raise e # vim: sw=4 et osc-1.12.1/osc/cmdln.py000066400000000000000000000246311475337502500146340ustar00rootroot00000000000000""" A modern, lightweight alternative to cmdln.py from https://github.com/trentm/cmdln """ import argparse import inspect import sys import textwrap def option(*args, **kwargs): """ Decorator to add an option to the optparser argument of a Cmdln subcommand. Example: class MyShell(cmdln.Cmdln): @cmdln.option("-f", "--force", help="force removal") def do_remove(self, subcmd, opts, *args): #... """ def decorate(f): if not hasattr(f, "options"): f.options = [] new_args = [i for i in args if i] f.options.insert(0, (new_args, kwargs)) return f return decorate def alias(*aliases): """ Decorator to add aliases for Cmdln.do_* command handlers. Example: class MyShell(cmdln.Cmdln): @cmdln.alias("!", "sh") def do_shell(self, argv): #...implement 'shell' command """ def decorate(f): if not hasattr(f, "aliases"): f.aliases = [] f.aliases += aliases return f return decorate def name(name): """ Decorator to explicitly name a Cmdln subcommand. Example: class MyShell(cmdln.Cmdln): @cmdln.name("cmd-with-dashes") def do_cmd_with_dashes(self, subcmd, opts): #... """ def decorate(f): f.name = name return f return decorate def hide(value=True): """ For obsolete calls, hide them in help listings. Example: class MyShell(cmdln.Cmdln): @cmdln.hide() def do_shell(self, argv): #...implement 'shell' command """ def decorate(f): f.hidden = bool(value) return f return decorate class HelpFormatter(argparse.RawDescriptionHelpFormatter): def _split_lines(self, text, width): # remove the leading and trailing whitespaces to avoid printing unwanted blank lines text = text.strip() result = [] for line in text.splitlines(): if not line.strip(): # textwrap normally returns [] on a string that contains only whitespaces; we want [""] to print a blank line result.append("") else: result.extend(textwrap.wrap(line, width)) return result def _format_action(self, action): if isinstance(action, argparse._SubParsersAction): parts = [] subactions = action._get_subactions() subactions.sort(key=lambda x: x.metavar) for i in subactions: if i.help == argparse.SUPPRESS: # don't display commands with suppressed help continue if len(i.metavar) > 20: parts.append("%*s%-21s" % (self._current_indent, "", i.metavar)) parts.append("%*s %s" % (self._current_indent + 21, "", i.help)) else: parts.append("%*s%-21s %s" % (self._current_indent, "", i.metavar, i.help)) return "\n".join(parts) return super()._format_action(action) class Cmdln: def get_argparser_usage(self): return "%(prog)s [global opts] [--help] [opts] [args]" def get_subcommand_prog(self, subcommand): return f"{self.argparser.prog} [global opts] {subcommand}" def _remove_leading_spaces_from_text(self, text): lines = text.splitlines() lines = self._remove_leading_spaces_from_lines(lines) return "\n".join(lines) def _remove_leading_spaces_from_lines(self, lines): # compute the indentation (leading spaces) in the docstring leading_spaces = 0 for line in lines: line_leading_spaces = len(line) - len(line.lstrip(' ')) if leading_spaces == 0: leading_spaces = line_leading_spaces leading_spaces = min(leading_spaces, line_leading_spaces) # dedent the lines (remove leading spaces) lines = [line[leading_spaces:] for line in lines] return lines def create_argparser(self): """ Create `.argparser` and 
`.subparsers`. Override this method to replace them with your own. """ self.argparser = argparse.ArgumentParser( usage=self.get_argparser_usage(), description=self._remove_leading_spaces_from_text(self.__doc__), formatter_class=HelpFormatter, ) self.subparsers = self.argparser.add_subparsers( title="commands", dest="command", ) self.pre_argparse() self.add_global_options(self.argparser) # map command name to `do_*` function that runs the command self.cmd_map = {} # map aliases back to the command names self.alias_to_cmd_name_map = {} for attr in dir(self): if not attr.startswith("do_"): continue cmd_name = attr[3:] cmd_func = getattr(self, attr) # extract data from the function cmd_name = getattr(cmd_func, "name", cmd_name) options = getattr(cmd_func, "options", []) aliases = getattr(cmd_func, "aliases", []) hidden = getattr(cmd_func, "hidden", False) # map command name and aliases to the function self.cmd_map[cmd_name] = cmd_func self.alias_to_cmd_name_map[cmd_name] = cmd_name for i in aliases: self.cmd_map[i] = cmd_func self.alias_to_cmd_name_map[i] = cmd_name if cmd_func.__doc__: # split doctext into lines, allow the first line to start at a new line help_lines = cmd_func.__doc__.lstrip().splitlines() # use the first line as help text help_text = help_lines.pop(0) # use the remaining lines as description help_lines = self._remove_leading_spaces_from_lines(help_lines) help_desc = "\n".join(help_lines) help_desc = help_desc.strip() else: help_text = "" help_desc = "" if hidden: help_text = argparse.SUPPRESS subparser = self.subparsers.add_parser( cmd_name, aliases=aliases, help=help_text, description=help_desc, prog=self.get_subcommand_prog(cmd_name), formatter_class=HelpFormatter, conflict_handler="resolve", ) # add hidden copy of global options so they can be used in any place self.add_global_options(subparser, suppress=True) # add sub-command options, overriding hidden copies of global options if needed (due to conflict_handler="resolve") for option_args, option_kwargs in options: subparser.add_argument(*option_args, **option_kwargs) def argparse_error(self, *args, **kwargs): """ Raise an argument parser error. Automatically pick the right parser for the main program or a subcommand. """ if not self.options.command: parser = self.argparser else: parser = self.subparsers._name_parser_map.get(self.options.command, self.argparser) parser.error(*args, **kwargs) def pre_argparse(self): """ Hook method executed after `.main()` creates `.argparser` instance and before `parse_args()` is called. """ pass def add_global_options(self, parser, suppress=False): """ Add options to the main argument parser and all subparsers. """ pass def post_argparse(self): """ Hook method executed after `.main()` calls `parse_args()`. When called, `.options` and `.args` hold the results of `parse_args()`. 
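
        Example (an illustrative sketch, not part of this base class; it assumes
        a --verbose option was registered in add_global_options):

            class MyShell(cmdln.Cmdln):
                def add_global_options(self, parser, suppress=False):
                    parser.add_argument("--verbose", action="store_true")

                def post_argparse(self):
                    if self.options.verbose:
                        print("running in verbose mode")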
""" pass def main(self, argv=None): if argv is None: argv = sys.argv else: argv = argv[:] # don't modify caller's list self.create_argparser() self.options, self.args = self.argparser.parse_known_args(argv[1:]) unrecognized = [i for i in self.args if i.startswith("-")] if unrecognized: self.argparser.error(f"unrecognized arguments: {' '.join(unrecognized)}") self.post_argparse() if not self.options.command: self.argparser.error("Please specify a command") # find the `do_*` function to call by its name cmd = self.cmd_map[self.options.command] # run the command with parsed args sig = inspect.signature(cmd) arg_names = list(sig.parameters.keys()) if arg_names == ["subcmd", "opts"]: # positional args specified manually via @cmdln.option if self.args: self.argparser.error(f"unrecognized arguments: {' '.join(self.args)}") cmd(self.options.command, self.options) elif arg_names == ["subcmd", "opts", "args"]: # positional args are the remaining (unrecognized) args cmd(self.options.command, self.options, *self.args) else: # positional args are the remaining (unrecongnized) args # and the do_* handler takes other arguments than "subcmd", "opts", "args" import warnings warnings.warn( f"do_{self.options.command}() handler has deprecated signature. " f"It takes the following args: {arg_names}, while it should be taking ['subcmd', 'opts'] " f"and handling positional arguments explicitly via @cmdln.option.", FutureWarning ) try: cmd(self.options.command, self.options, *self.args) except TypeError as e: if e.args[0].startswith("do_"): sys.exit(str(e)) raise @alias("?") def do_help(self, subcmd, opts, *args): """ Give detailed help on a specific sub-command usage: %(prog)s [SUBCOMMAND] """ if not args: self.argparser.print_help() return for action in self.argparser._actions: if not isinstance(action, argparse._SubParsersAction): continue for choice, subparser in action.choices.items(): if choice == args[0]: subparser.print_help() return osc-1.12.1/osc/commandline.py000066400000000000000000016316151475337502500160340ustar00rootroot00000000000000# Copyright (C) 2006 Novell Inc. All rights reserved. # This program is free software; it may be used, copied, modified # and distributed under the terms of the GNU General Public Licence, # either version 2, or version 3 (at your option). import argparse import getpass import glob import importlib import importlib.util import inspect import os import pkgutil import re import shutil import subprocess import sys import textwrap import tempfile import time import traceback from functools import cmp_to_key from operator import itemgetter from pathlib import Path from tempfile import NamedTemporaryFile from typing import List from urllib.parse import urlsplit from urllib.error import HTTPError from . import commands as osc_commands from . import oscerr from .commandline_common import * from .util.xml import xml_fromstring from .util.xml import xml_parse class OscCommand(Command): """ Inherit from this class to create new commands. The first line of the docstring becomes the help text, the remaining lines become the command description. 
""" class OscMainCommand(MainCommand): name = "osc" MODULES = ( ("osc.commands", osc_commands.__path__[0]), ) if not IN_VENV: MODULES += ( ("osc.commands.usr_lib", "/usr/lib/osc-plugins"), ("osc.commands.usr_local_lib", "/usr/local/lib/osc-plugins"), ("osc.commands.home_local_lib", "~/.local/lib/osc-plugins"), ("osc.commands.home", "~/.osc-plugins"), ) def __init__(self): super().__init__() self.args = None self.download_progress = None def init_arguments(self): self.add_argument( "-v", "--verbose", action="store_true", help="increase verbosity (conflicts with --quiet)", ) self.add_argument( "-q", "--quiet", action="store_true", help="be quiet, not verbose (conflicts with --verbose)", ) self.add_argument( "--debug", action="store_true", help="print info useful for debugging", ) self.add_argument( "--debugger", action="store_true", help="jump into the debugger before executing anything", ) self.add_argument( "--post-mortem", action="store_true", help="jump into the debugger in case of errors", ) self.add_argument( "--traceback", action="store_true", help="print call trace in case of errors", ) self.add_argument( "-H", "--http-debug", action="store_true", help="debug HTTP traffic (filters some headers)", ) self.add_argument( "--http-full-debug", action="store_true", help="debug HTTP traffic (filters no headers)", ) self.add_argument( "-A", "--apiurl", metavar="URL", help="Open Build Service API URL or a configured alias", ) self.add_argument( "--config", dest="conffile", metavar="FILE", help="specify alternate configuration file", ) self.add_argument( "--setopt", metavar="KEY=VALUE", action="append", default=[], help="set a config option for the current program run", ) self.add_argument( "--no-keyring", action="store_true", help="disable usage of desktop keyring system", ) def post_parse_args(self, args): from . import conf from . 
import store as osc_store from .meter import create_text_meter if args.command == "help": # HACK: never ask for credentials when displaying help return # apiurl hasn't been specified by the user # we need to set it here because the 'default' option of an argument doesn't support lazy evaluation if args.apiurl is None: try: # try reading the apiurl from the working copy args.apiurl = osc_store.Store(Path.cwd()).apiurl except oscerr.NoWorkingCopy: # we can't use conf.config["apiurl"] because it contains the default "https://api.opensuse.org" # let's leave setting the right value to conf.get_config() pass overrides = {} for i in args.setopt: key, value = i.split("=") overrides[key] = value try: conf.get_config( override_apiurl=args.apiurl, override_conffile=args.conffile, override_debug=args.debug, override_http_debug=args.http_debug, override_http_full_debug=args.http_full_debug, override_no_keyring=args.no_keyring, override_post_mortem=args.post_mortem, override_quiet=args.quiet, override_traceback=args.traceback, override_verbose=args.verbose, overrides=overrides, ) except oscerr.NoConfigfile as e: print(e.msg, file=sys.stderr) print(f"Creating osc configuration file {e.file} ...", file=sys.stderr) conf.interactive_config_setup(e.file, args.apiurl) print("done", file=sys.stderr) self.post_parse_args(args) except oscerr.ConfigMissingApiurl as e: print(e.msg, file=sys.stderr) conf.interactive_config_setup(e.file, e.url, initial=False) self.post_parse_args(args) except oscerr.ConfigMissingCredentialsError as e: print(e.msg, file=sys.stderr) print("Please enter new credentials.", file=sys.stderr) conf.interactive_config_setup(e.file, e.url, initial=False) self.post_parse_args(args) # write config values back to args # this is crucial mainly for apiurl to resolve an alias to full url for i in ["apiurl", "debug", "http_debug", "http_full_debug", "post_mortem", "traceback", "verbose"]: setattr(args, i, conf.config[i]) args.no_keyring = not conf.config["use_keyring"] if conf.config["show_download_progress"]: self.download_progress = create_text_meter() if not args.apiurl: self.parser.error("Could not determine apiurl, use -A/--apiurl to specify one") # needed for LegacyOsc class self.args = args def _wrap_legacy_command(self, func_): class LegacyCommandWrapper(Command): func = func_ __doc__ = getattr(func_, "__doc__", "") aliases = getattr(func_, "aliases", []) hidden = getattr(func_, "hidden", False) name = getattr(func_, "name", func_.__name__[3:]) def __repr__(self): result = super().__repr__() result += f"({self.func.__name__})" return result def init_arguments(self): options = getattr(self.func, "options", []) for option_args, option_kwargs in options: self.add_argument(*option_args, **option_kwargs) def run(self, args): sig = inspect.signature(self.func) arg_names = list(sig.parameters.keys()) if arg_names == ["subcmd", "opts"]: # handler doesn't take positional args via *args if args.positional_args: self.parser.error(f"unrecognized arguments: " + " ".join(args.positional_args)) return self.func(args.command, args) else: # handler takes positional args via *args return self.func(args.command, args, *args.positional_args) return LegacyCommandWrapper def load_legacy_commands(self): # lazy links of attributes that would normally be initialized in the instance of Osc class class LegacyOsc(Osc): # pylint: disable=used-before-assignment # pylint: disable=no-self-argument @property def argparser(self_): return self.parser # pylint: disable=no-self-argument @property def download_progress(self_): 
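            # proxies the meter created in OscMainCommand.post_parse_args();
            # note that `self` refers to the enclosing OscMainCommand instance,
            # while `self_` is the legacy Osc instance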
return self.download_progress # pylint: disable=no-self-argument @property def options(self_): return self.args # pylint: disable=no-self-argument @options.setter def options(self_, value): pass # pylint: disable=no-self-argument @property def subparsers(self_): return self.subparsers osc_instance = LegacyOsc() for name in dir(osc_instance): if not name.startswith("do_"): continue func = getattr(osc_instance, name) if not inspect.ismethod(func) and not inspect.isfunction(func): continue cls = self._wrap_legacy_command(func) self.load_command(cls, "osc.commands.old") @classmethod def main(cls, argv=None, run=True): """ Initialize OscMainCommand, load all commands and run the selected command. """ cmd = cls() cmd.load_commands() cmd.load_legacy_commands() if run: args = cmd.parse_args(args=argv) exit_code = cmd.run(args) sys.exit(exit_code) else: args = None return cmd, args def get_parser(): """ Needed by argparse-manpage to generate man pages from the argument parser. """ main, _ = OscMainCommand.main(run=False) return main.parser # ================================================================================ # The legacy code follows. # Please do not use it if possible. # ================================================================================ HELP_MULTIBUILD_MANY = """Only work with the specified flavors of a multibuild package. Globs are resolved according to _multibuild file from server. Empty string is resolved to a package without a flavor.""" HELP_MULTIBUILD_ONE = "Only work with the specified flavor of a multibuild package." def pop_args( args, arg1_name: str = None, arg1_is_optional: bool = False, arg1_default: str = None, arg2_name: str = None, arg2_is_optional: bool = False, arg2_default: str = None, ): """ Pop 2 arguments from `args`. They may be either 2 individual entries or a single entry with values separated with "/". .. warning:: The `args` list gets modified in this function call! :param args: List of command-line arguments. :type args: list(str) :param arg1_name: Name of the first argument :type arg1_name: str :param arg1_is_optional: Whether to error out when arg1 cannot be retrieved. :type arg1_is_optional: bool :param arg1_default: Used if arg1 is not specified in `args`. :type arg1_default: bool :param arg2_name: Name of the second argument :type arg2_name: str :param arg2_is_optional: Whether to error out when arg2 cannot be retrieved. :type arg2_is_optional: bool :param arg2_default: Used if arg2 is not specified in `args`. :type arg2_default: bool :returns: Project and package. :rtype: tuple(str) """ assert isinstance(args, list) assert isinstance(arg1_name, str) assert isinstance(arg2_name, str) if arg1_is_optional: arg2_is_optional = True used_arg1_default = False try: arg1 = args.pop(0) except IndexError: arg1 = arg1_default used_arg1_default = True if arg1 is None and not arg1_is_optional: raise oscerr.OscValueError(f"Please specify a {arg1_name}") from None if not isinstance(arg1, (str, type(None))): raise TypeError(f"{arg1_name.capitalize()} should be 'str', found: {type(arg1).__name__}") arg2 = None if arg1 and "/" in arg1: # project/package, repo/arch, etc. 
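        # illustrative sketch of the split behaviour (not executed anywhere here):
        #   pop_args(["foo/bar"], arg1_name="project", arg2_name="package")    -> ("foo", "bar")
        #   pop_args(["foo", "bar"], arg1_name="project", arg2_name="package") -> ("foo", "bar")
        # in both cases the passed-in list is consumed (left empty) afterwards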
if arg1.count("/") != 1: raise oscerr.OscValueError(f"Argument doesn't match the '<{arg1_name}>/<{arg2_name}>' pattern: {arg1}") arg1, arg2 = arg1.split("/") if arg2 is None: try: arg2 = args.pop(0) except IndexError: if used_arg1_default: # we use arg2_default only after arg1_default was used arg2 = arg2_default if arg2 is None and not arg2_is_optional: raise oscerr.OscValueError(f"Please specify a {arg2_name}") from None if not isinstance(arg2, (str, type(None))): raise TypeError(f"{arg2_name.capitalize()} should be 'str', found: {type(arg2).__name__}") return arg1, arg2 def pop_project_package_from_args( args: List[str], project_is_optional: bool = False, default_project: str = None, package_is_optional: bool = False, default_package: str = None, ): """ Pop project and package from given `args`. They may be either 2 individual entries or a single entry with values separated with "/". .. warning:: The `args` list gets modified in this function call! :param args: List of command-line arguments. :type args: list(str) :param project_is_optional: Whether to error out when project cannot be retrieved. Implies `package_is_optional=False`. :type project_is_optional: bool :param default_project: Used if project is not specified in `args`. Resolved from the current working copy if set to '.'. :type default_project: str :param package_is_optional: Whether to error out when package cannot be retrieved. :type package_is_optional: bool :param default_package: Used if package is not specified in `args`. Resolved from the current working copy if set to '.'. :type default_package: str :returns: Project and package. :rtype: tuple(str) """ from . import store as osc_store project, package = pop_args( args, arg1_name="project", arg1_is_optional=project_is_optional, arg1_default=default_project, arg2_name="package", arg2_is_optional=package_is_optional, arg2_default=default_package, ) path = Path.cwd() project_store = None package_store = None if project == ".": # project name taken from the working copy try: project_store = osc_store.get_store(path) project = project_store.project except oscerr.NoWorkingCopy: if not project_is_optional: raise project = None if package == ".": # package name taken from the working copy try: package_store = osc_store.get_store(path) package_store.assert_is_package() package = package_store.package except oscerr.NoWorkingCopy: if package_is_optional: package = None elif not project_store and default_package == ".": # project wasn't retrieved from store, let's ask for specifying a package raise oscerr.OscValueError("Please specify a package") from None else: raise return project, package def pop_repository_arch_from_args( args: List[str], repository_is_optional: bool = False, default_repository: str = None, arch_is_optional: bool = False, default_arch: str = None, ): """ Pop repository and arch from given `args`. They may be either 2 individual entries or a single entry with values separated with "/". .. warning:: The `args` list gets modified in this function call! :param args: List of command-line arguments. :type args: list(str) :param repository_is_optional: Whether to error out when project cannot be retrieved. Implies `arch_is_optional=False`. :type repository_is_optional: bool :param default_repository: Used if repository is not specified in `args`. :type default_repository: str :param arch_is_optional: Whether to error out when arch cannot be retrieved. :type arch_is_optional: bool :param default_arch: Used if arch is not specified in `args`. 
:type default_arch: str :returns: Repository and arch. :rtype: tuple(str) """ repository, arch = pop_args( args, arg1_name="repository", arg1_is_optional=repository_is_optional, arg1_default=default_repository, arg2_name="arch", arg2_is_optional=arch_is_optional, arg2_default=default_arch, ) return repository, arch def pop_project_package_repository_arch_from_args( args: List[str], project_is_optional: bool = False, default_project: str = None, package_is_optional: bool = False, default_package: str = None, repository_is_optional: bool = False, default_repository: str = None, arch_is_optional: bool = False, default_arch: str = None, ): """ Pop project, package, repository and arch from given `args`. .. warning:: The `args` list gets modified in this function call! :param args: List of command-line arguments. :type args: list(str) :param project_is_optional: Whether to error out when project cannot be retrieved. Implies `package_is_optional=False`. :type project_is_optional: bool :param default_project: Used if project is not specified in `args`. Resolved from the current working copy if set to '.'. :type default_project: str :param package_is_optional: Whether to error out when package cannot be retrieved. :type package_is_optional: bool :param default_package: Used if package is not specified in `args`. Resolved from the current working copy if set to '.'. :type default_package: str :param repository_is_optional: Whether to error out when project cannot be retrieved. Implies `arch_is_optional=False`. :type repository_is_optional: bool :param default_repository: Used if repository is not specified in `args`. :type default_repository: str :param arch_is_optional: Whether to error out when arch cannot be retrieved. :type arch_is_optional: bool :param default_arch: Used if arch is not specified in `args`. :type default_arch: str :returns: Project, package, repository and arch. :rtype: tuple(str) """ args_backup = args.copy() if project_is_optional or package_is_optional: repository_is_optional = True arch_is_optional = True try_working_copy = default_project == "." or default_package == "." try: # try this sequence first: project package repository arch project, package = pop_project_package_from_args( args, project_is_optional=project_is_optional, default_project=default_project, package_is_optional=package_is_optional, default_package=default_package, ) if args: # we got more than 2 arguments -> we shouldn't try to retrieve project and package from a working copy try_working_copy = False repository, arch = pop_repository_arch_from_args( args, repository_is_optional=repository_is_optional, default_repository=default_repository, arch_is_optional=arch_is_optional, default_arch=default_arch, ) except oscerr.OscValueError as ex: if not try_working_copy: raise ex from None # then read project and package from working copy and try repository arch args[:] = args_backup.copy() project, package = pop_project_package_from_args( [], default_project=".", default_package="." 
) repository, arch = pop_repository_arch_from_args( args, repository_is_optional=repository_is_optional, default_repository=default_repository, arch_is_optional=arch_is_optional, default_arch=default_arch, ) return project, package, repository, arch def pop_project_package_targetproject_targetpackage_from_args( args: List[str], project_is_optional: bool = False, default_project: str = None, package_is_optional: bool = False, default_package: str = None, target_project_is_optional: bool = False, default_target_project: str = None, target_package_is_optional: bool = False, default_target_package: str = None, ): """ Pop project, package, target project and target package from given `args`. .. warning:: The `args` list gets modified in this function call! :param args: List of command-line arguments. :type args: list(str) :param project_is_optional: Whether to error out when project cannot be retrieved. Implies `package_is_optional=False`. :type project_is_optional: bool :param default_project: Used if project is not specified in `args`. Resolved from the current working copy if set to '.'. :type default_project: str :param package_is_optional: Whether to error out when package cannot be retrieved. :type package_is_optional: bool :param default_package: Used if package is not specified in `args`. Resolved from the current working copy if set to '.'. :type default_package: str :param target_project_is_optional: Whether to error out when target project cannot be retrieved. Implies `target_package_is_optional=False`. :type target_project_is_optional: bool :param default_target_project: Used if target project is not specified in `args`. Resolved from the current working copy if set to '.'. :type default_target_project: str :param target_package_is_optional: Whether to error out when target package cannot be retrieved. :type target_package_is_optional: bool :param default_target_package: Used if target package is not specified in `args`. Resolved from the current working copy if set to '.'. :type default_target_package: str :returns: Project, package, target project and target package. :rtype: tuple(str) """ args_backup = args.copy() if project_is_optional or package_is_optional: target_project_is_optional = True target_package_is_optional = True try_working_copy = default_project == "." or default_package == "." 
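    # the working-copy fallback below is attempted only when "." defaults were
    # requested and the explicit arguments could not be resolved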
try: # try this sequence first: project package target_project target_package project, package = pop_project_package_from_args( args, project_is_optional=project_is_optional, default_project=default_project, package_is_optional=package_is_optional, default_package=default_package, ) if args: # we got more than 2 arguments -> we shouldn't try to retrieve project and package from a working copy try_working_copy = False target_project, target_package = pop_project_package_from_args( args, project_is_optional=target_project_is_optional, default_project=default_target_project, package_is_optional=target_package_is_optional, default_package=default_target_package, ) except oscerr.OscValueError as ex: if not try_working_copy: raise ex from None # then read project and package from working copy and target_project target_package args[:] = args_backup.copy() project, package = pop_project_package_from_args( [], default_project=".", default_package=".", package_is_optional=False, ) target_project, target_package = pop_project_package_from_args( args, project_is_optional=target_project_is_optional, default_project=default_target_project, package_is_optional=target_package_is_optional, default_package=default_target_package, ) return project, package, target_project, target_package def ensure_no_remaining_args(args): """ Error out when `args` still contains arguments. :raises oscerr.WrongArgs: The `args` list still contains arguments. """ if not args: return args_str = " ".join(args) raise oscerr.WrongArgs(f"Unexpected args: {args_str}") class Osc(cmdln.Cmdln): """ openSUSE commander is a command-line interface to the Open Build Service. Type 'osc --help' for help on a specific subcommand. For additional information, see * http://en.opensuse.org/openSUSE:Build_Service_Tutorial * http://en.opensuse.org/openSUSE:OSC You can modify osc commands, or roll your own, via the plugin API: * http://en.opensuse.org/openSUSE:OSC_plugins """ name = 'osc' def __init__(self): from .util import safewriter self.options = None self._load_plugins() sys.stderr = safewriter.SafeWriter(sys.stderr) sys.stdout = safewriter.SafeWriter(sys.stdout) def _debug(self, *args): # if options are not initialized, still allow to use it if not hasattr(self, 'options') or self.options.debug: print(*args, file=sys.stderr) def get_version(self): from .core import get_osc_version return get_osc_version() def _process_project_name(self, project): from . 
import conf from .core import is_package_dir from .core import is_project_dir from .core import store_read_project if isinstance(project, str): if project == '.': if is_package_dir(Path.cwd()) or is_project_dir(Path.cwd()): project = store_read_project(Path.cwd()) else: raise oscerr.WrongArgs('No working directory') return project.replace(conf.config['project_separator'], ':') return project def add_global_options(self, parser, suppress=False): def _add_parser_arguments_from_data(argument_parser, data): for kwargs in data: args = kwargs.pop("names") if suppress: kwargs["help"] = argparse.SUPPRESS kwargs["default"] = argparse.SUPPRESS argument_parser.add_argument(*args, **kwargs) arguments = [] arguments.append(dict( names=['-v', '--verbose'], action='store_true', help='increase verbosity', )) arguments.append(dict( names=['-q', '--quiet'], action='store_true', help='be quiet, not verbose', )) arguments.append(dict( names=['--debugger'], action='store_true', help='jump into the debugger before executing anything', )) arguments.append(dict( names=['--post-mortem'], action='store_true', help='jump into the debugger in case of errors', )) arguments.append(dict( names=['--traceback'], action='store_true', help='print call trace in case of errors', )) arguments.append(dict( names=['-H', '--http-debug'], action='store_true', help='debug HTTP traffic (filters some headers)', )) arguments.append(dict( names=['--http-full-debug'], action='store_true', help='debug HTTP traffic (filters no headers)', )) arguments.append(dict( names=['--debug'], action='store_true', help='print info useful for debugging', )) arguments.append(dict( names=['-A', '--apiurl'], metavar='URL/alias', help='specify URL to access API server at or an alias', )) arguments.append(dict( names=['--config'], dest='conffile', metavar='FILE', help='specify alternate configuration file', )) arguments.append(dict( names=['--no-keyring'], action='store_true', help='disable usage of desktop keyring system', )) _add_parser_arguments_from_data(parser, arguments) def post_argparse(self): from . import conf from .meter import create_text_meter """merge commandline options into the config""" # handle conflicting options manually because the mutually exclusive group is buggy # https://github.com/python/cpython/issues/96310 if self.options.quiet and self.options.verbose: self.argparse_error("argument -q/--quiet: not allowed with argument -v/--verbose") # avoid loading config that may trigger prompt for username, password etc. 
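        # (plain "osc" and "osc help" must keep working even without credentials)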
if not self.options.command: # no command specified return if self.alias_to_cmd_name_map.get(self.options.command, None) == "help": # help command specified return try: conf.get_config(override_conffile=self.options.conffile, override_apiurl=self.options.apiurl, override_debug=self.options.debug, override_http_debug=self.options.http_debug, override_http_full_debug=self.options.http_full_debug, override_traceback=self.options.traceback, override_post_mortem=self.options.post_mortem, override_no_keyring=self.options.no_keyring, override_verbose=self.options.verbose) except oscerr.NoConfigfile as e: print(e.msg, file=sys.stderr) print(f'Creating osc configuration file {e.file} ...', file=sys.stderr) conf.interactive_config_setup(e.file, self.options.apiurl) print('done', file=sys.stderr) self.post_argparse() except oscerr.ConfigMissingApiurl as e: print(e.msg, file=sys.stderr) conf.interactive_config_setup(e.file, e.url, initial=False) self.post_argparse() except oscerr.ConfigMissingCredentialsError as e: print(e.msg) print('Please enter new credentials.') conf.interactive_config_setup(e.file, e.url, initial=False) self.post_argparse() self.options.verbose = conf.config['verbose'] self.download_progress = None if conf.config.get('show_download_progress', False): self.download_progress = create_text_meter() def get_api_url(self): from . import conf from . import store as osc_store from .core import is_package_dir from .core import is_project_dir try: localdir = Path.cwd() except Exception as e: # check for Stale NFS file handle: '.' try: os.stat('.') except Exception as ee: e = ee print("Path.cwd() failed: ", e, file=sys.stderr) sys.exit(1) try: store = osc_store.get_store(Path.cwd()) return store.apiurl except oscerr.NoWorkingCopy: return conf.config['apiurl'] def do_version(self, subcmd, opts): """ Give version of osc binary usage: osc version """ from .core import get_osc_version print(get_osc_version()) @cmdln.option('project') @cmdln.option('package', nargs='?') @cmdln.option('scm_url', nargs='?') def do_init(self, subcmd, opts): """ Initialize a directory as working copy Initialize an existing directory to be a working copy of an (already existing) buildservice project/package. (This is the same as checking out a package and then copying sources into the directory. It does NOT create a new package. To create a package, use 'osc meta pkg ... ...') You wouldn't normally use this command. To get a working copy of a package (e.g. for building it or working on it, you would normally use the checkout command. Use "osc help checkout" to get help for it. usage: osc init PRJ osc init PRJ PAC osc init PRJ PAC SCM_URL """ from . 
import conf from .core import Package from .core import Project from .core import show_files_meta from .core import show_scmsync from .core import store_write_string project = opts.project package = opts.package scm_url = opts.scm_url apiurl = self.get_api_url() if not scm_url: scm_url = show_scmsync(apiurl, project, package) if scm_url: if package: Package.init_package(apiurl, project, package, Path.cwd(), scm_url=scm_url) print(f'Initializing {Path.cwd()} (Project: {project}, Package: {package}) as git repository') else: Project.init_project(apiurl, Path.cwd(), project, conf.config['do_package_tracking'], getPackageList=False, scm_url=scm_url) print(f'Initializing {Path.cwd()} (Project: {project}) as scm repository') return if not package: Project.init_project(apiurl, Path.cwd(), project, conf.config['do_package_tracking'], getPackageList=False) print(f'Initializing {Path.cwd()} (Project: {project})') else: Package.init_package(apiurl, project, package, Path.cwd()) store_write_string(Path.cwd(), '_files', show_files_meta(apiurl, project, package) + b'\n') print(f'Initializing {Path.cwd()} (Project: {project}, Package: {package})') @cmdln.alias('ls') @cmdln.alias('ll') @cmdln.alias('lL') @cmdln.alias('LL') @cmdln.option('-a', '--arch', metavar='ARCH', help='specify architecture (only for binaries)') @cmdln.option('-r', '--repo', metavar='REPO', help='specify repository (only for binaries)') @cmdln.option('-b', '--binaries', action='store_true', help='list built binaries instead of sources') @cmdln.option('-e', '--expand', action='store_true', help='expand linked package (only for sources)') @cmdln.option('-u', '--unexpand', action='store_true', help='always work with unexpanded (source) packages') @cmdln.option('-l', '--long', dest='verbose', action='store_true', help='print extra information') @cmdln.option('-D', '--deleted', action='store_true', help='show only the former deleted projects or packages') @cmdln.option('-M', '--meta', action='store_true', help='list meta data files') @cmdln.option('-R', '--revision', metavar='REVISION', help='specify revision (only for sources)') def do_list(self, subcmd, opts, *args): """ List sources or binaries on the server Examples for listing sources: ls # list all projects (deprecated) ls / # list all projects ls . # take PROJECT/PACKAGE from current dir. 
ls PROJECT # list packages in a project ls PROJECT PACKAGE # list source files of package of a project ls PROJECT PACKAGE # list if this file exists ls -v PROJECT PACKAGE # verbosely list source files of package ls -l PROJECT PACKAGE # verbosely list source files of package ll PROJECT PACKAGE # verbosely list source files of package LL PROJECT PACKAGE # verbosely list source files of expanded link With --verbose, the following fields will be shown for each item: MD5 hash of file (doesn't apply to binaries) Revision number of the last commit Size (in bytes) Date and time of the last commit Examples for listing binaries: ls -b PROJECT # list all binaries of a project ls -b PROJECT -a ARCH # list ARCH binaries of a project ls -b PROJECT -r REPO # list binaries in REPO ls -b PROJECT PACKAGE REPO ARCH usage: ls [PROJECT [PACKAGE]] ls -b [PROJECT [PACKAGE [REPO [ARCH]]]] """ from .core import ET from .core import Linkinfo from .core import Repo from .core import get_binarylist from .core import get_repos_of_project from .core import is_package_dir from .core import is_project_dir from .core import meta_get_filelist from .core import meta_get_packagelist from .core import meta_get_project_list from .core import revision_is_empty from .core import shorttime from .core import show_files_meta from .core import slash_split from .core import store_read_package from .core import store_read_project args = slash_split(args) if subcmd == 'll': opts.verbose = True if subcmd in ('lL', 'LL'): opts.verbose = True opts.expand = True project = None package = None fname = None if len(args) == 0: # For consistency with *all* other commands # this lists what the server has in the current wd. # CAUTION: 'osc ls -b' already works like this. pass if len(args) > 0: project = args[0] if project == '/': project = None if project == '.': project = self._process_project_name(project) cwd = os.getcwd() if is_package_dir(cwd): package = store_read_package(cwd) project = self._process_project_name(project) if len(args) > 1: package = args[1] if len(args) > 2: if opts.deleted: raise oscerr.WrongArgs("Too many arguments when listing deleted packages") if opts.binaries: if opts.repo: if opts.repo != args[2]: raise oscerr.WrongArgs(f"conflicting repos specified ('{opts.repo}' vs '{args[2]}')") else: opts.repo = args[2] else: fname = args[2] if len(args) > 3: if not opts.binaries: raise oscerr.WrongArgs('Too many arguments') if opts.arch: if opts.arch != args[3]: raise oscerr.WrongArgs(f"conflicting archs specified ('{opts.arch}' vs '{args[3]}')") else: opts.arch = args[3] if opts.binaries and opts.expand: raise oscerr.WrongOptions('Sorry, --binaries and --expand are mutual exclusive.') apiurl = self.get_api_url() # list binaries if opts.binaries: # ls -b toplevel doesn't make sense, so use info from # current dir if available if len(args) == 0: cwd = Path.cwd() if is_project_dir(cwd): project = store_read_project(cwd) elif is_package_dir(cwd): project = store_read_project(cwd) package = store_read_package(cwd) if not project: raise oscerr.WrongArgs('There are no binaries to list above project level.') if opts.revision: raise oscerr.WrongOptions('Sorry, the --revision option is not supported for binaries.') repos = [] if opts.repo and opts.arch: repos.append(Repo(opts.repo, opts.arch)) elif opts.repo and not opts.arch: repos = [repo for repo in get_repos_of_project(apiurl, project) if repo.name == opts.repo] elif opts.arch and not opts.repo: repos = [repo for repo in get_repos_of_project(apiurl, project) if repo.arch == 
opts.arch] else: repos = get_repos_of_project(apiurl, project) results = [] for repo in repos: results.append((repo, get_binarylist(apiurl, project, repo.name, repo.arch, package=package, verbose=opts.verbose))) for result in results: indent = '' if len(results) > 1: print(f'{result[0].name}/{result[0].arch}') indent = ' ' if opts.verbose: for f in result[1]: if f.size is None and f.mtime is None: print("%9s %12s %-40s" % ('unknown', 'unknown', f.name)) elif f.size is None and f.mtime is not None: print("%9s %s %-40s" % ('unknown', shorttime(f.mtime), f.name)) elif f.size is not None and f.mtime is None: print("%9d %12s %-40s" % (f.size, 'unknown', f.name)) else: print("%9d %s %-40s" % (f.size, shorttime(f.mtime), f.name)) else: for f in result[1]: print(indent + f) # list sources elif not opts.binaries: if not args or not project: for prj in meta_get_project_list(apiurl, opts.deleted): print(prj) elif len(args) == 1: for pkg in meta_get_packagelist(apiurl, project, deleted=opts.deleted, expand=opts.expand): print(pkg) elif len(args) == 2 or len(args) == 3: link_seen = False print_not_found = True rev = opts.revision for i in [1, 2]: l = meta_get_filelist(apiurl, project, package, verbose=opts.verbose, expand=opts.expand, meta=opts.meta, deleted=opts.deleted, revision=rev) link_seen = '_link' in l if opts.verbose: out = ['%s %7s %9d %s %s' % (i.md5, i.rev, i.size, shorttime(i.mtime), i.name) for i in l if not fname or fname == i.name] if len(out) > 0: print_not_found = False print('\n'.join(out)) elif fname: if fname in l: print(fname) print_not_found = False else: print('\n'.join(l)) if opts.expand or opts.unexpand or not link_seen: break m = show_files_meta(apiurl, project, package) li = Linkinfo() root = xml_fromstring(m) li.read(root.find('linkinfo')) if li.haserror(): raise oscerr.LinkExpandError(project, package, li.error) project, package, rev = li.project, li.package, li.rev if not revision_is_empty(rev): print(f'# -> {project} {package} ({rev})') else: print(f'# -> {project} {package} (latest)') opts.expand = True if fname and print_not_found: print(f'file \'{fname}\' does not exist') return 1 @cmdln.option('--extend-package-names', default=False, action="store_true", help='Extend packages names with project name as suffix') def do_addcontainers(self, subcmd, opts, *args): """ Add maintained containers for a give package The command adds all containers which are marked as maintained and contain an rpm originating from the specified source package. Examples: osc addcontainers [PROJECT PACKAGE] """ from . import _private apiurl = self.get_api_url() args = list(args) project, package = pop_project_package_from_args( args, default_project=".", default_package=".", package_is_optional=False ) _private.add_containers( apiurl, project, package, extend_package_names=opts.extend_package_names, print_to="stdout" ) @cmdln.option('-s', '--skip-disabled', action='store_true', help='Skip disabled channels. Otherwise the source gets added, but not the repositories.') @cmdln.option('-e', '--enable-all', action='store_true', help='Enable all added channels including the ones disabled by default.') def do_addchannels(self, subcmd, opts, *args): """ Add channels to project The command adds all channels which are defined to be used for a given source package. The source link target is used to lookup the channels. The command can be used for a certain package or for all in the specified project. In case no channel is defined the operation is just returning. 
Examples: osc addchannels [PROJECT [PACKAGE]] """ from . import _private apiurl = self.get_api_url() args = list(args) project, package = pop_project_package_from_args( args, default_project=".", default_package=".", package_is_optional=True ) if opts.enable_all and opts.skip_disabled: self.argparse_error("Options '--enable-all' and '--skip-disabled' are mutually exclusive") _private.add_channels( apiurl, project, package, enable_all=opts.enable_all, skip_disabled=opts.skip_disabled, print_to="stdout" ) @cmdln.alias('enablechannel') def do_enablechannels(self, subcmd, opts, *args): """ Enables channels Enables existing channel packages in a project. Enabling means adding the needed repositories for building. The command can be used to enable a specific one or all channels of a project. Examples: osc enablechannels [PROJECT [PACKAGE]] """ from . import _private apiurl = self.get_api_url() args = list(args) project, package = pop_project_package_from_args( args, default_project=".", default_package=".", package_is_optional=True ) _private.enable_channels(apiurl, project, package, print_to="stdout") @cmdln.option('-f', '--force', action='store_true', help='force generation of new patchinfo file, do not update existing one.') def do_patchinfo(self, subcmd, opts, *args): """ Generate and edit a patchinfo file A patchinfo file describes the packages for an update and the kind of problem it solves. This command either creates a new _patchinfo or updates an existing one. Examples: osc patchinfo osc patchinfo [PROJECT [PATCH_NAME]] """ from .core import Package from .core import checkout_package from .core import http_POST from .core import is_package_dir from .core import is_project_dir from .core import makeurl from .core import meta_get_filelist from .core import meta_get_packagelist from .core import run_editor from .core import slash_split from .core import store_read_package from .core import store_read_project args = slash_split(args) apiurl = self.get_api_url() project_dir = localdir = Path.cwd() patchinfo = 'patchinfo' if len(args) == 0: if is_project_dir(localdir): project = store_read_project(localdir) apiurl = self.get_api_url() for p in meta_get_packagelist(apiurl, project): if p.startswith("_patchinfo") or p.startswith("patchinfo"): patchinfo = p else: if is_package_dir(localdir): project = store_read_project(localdir) patchinfo = store_read_package(localdir) apiurl = self.get_api_url() if not os.path.exists('_patchinfo'): sys.exit('Current checked out package has no _patchinfo. 
Either call it from project level or specify patch name.') else: sys.exit('This command must be called in a checked out project or patchinfo package.') else: project = self._process_project_name(args[0]) if len(args) > 1: patchinfo = args[1] filelist = None if patchinfo: try: filelist = meta_get_filelist(apiurl, project, patchinfo) except HTTPError: pass if opts.force or not filelist or '_patchinfo' not in filelist: print("Creating new patchinfo...") query = {"cmd": "createpatchinfo", "name": patchinfo} if opts.force: query["force"] = 1 url = makeurl(apiurl, ['source', project], query=query) f = http_POST(url) for p in meta_get_packagelist(apiurl, project): if p.startswith("_patchinfo") or p.startswith("patchinfo"): patchinfo = p else: print("Update existing _patchinfo file...") query = {"cmd": "updatepatchinfo"} url = makeurl(apiurl, ['source', project, patchinfo], query=query) f = http_POST(url) # CAUTION: # Both conf.config['checkout_no_colon'] and conf.config['checkout_rooted'] # fool this test: if is_package_dir(localdir): pac = Package(localdir) pac.update() filename = "_patchinfo" else: checkout_package(apiurl, project, patchinfo, prj_dir=project_dir) filename = project_dir / patchinfo / "/_patchinfo" run_editor(filename) @cmdln.alias('bsdevelproject') @cmdln.alias('dp') def do_develproject(self, subcmd, opts, *args): """ Print the devel project / package of a package Examples: osc develproject [PROJECT PACKAGE] """ from .core import show_devel_project apiurl = self.get_api_url() args = list(args) project, package = pop_project_package_from_args(args, default_project=".", default_package=".") devel_project, devel_package = show_devel_project(apiurl, project, package) if not devel_project: print(f"Package {project}/{package} has no devel project", file=sys.stderr) sys.exit(1) print(f"{devel_project}/{devel_package}") @cmdln.alias('ca') def do_cleanassets(self, subcmd, opts, *args): """ Clean all previous downloaded assets This is useful to prepare a new git commit. 
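
        Examples:
            osc cleanassets
            osc ca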
""" from .core import clean_assets clean_assets(".") @cmdln.alias('da') def do_downloadassets(self, subcmd, opts, *args): from .core import download_assets """ Download all assets referenced in the build descriptions """ download_assets(".") @cmdln.alias('sdp') @cmdln.option('-u', '--unset', action='store_true', help='remove devel project') def do_setdevelproject(self, subcmd, opts, *args): """Set the devel project / package of a package Examples: osc setdevelproject [PROJECT PACKAGE] DEVEL_PROJECT [DEVEL_PACKAGE] """ from .core import set_devel_project apiurl = self.get_api_url() args = list(args) if opts.unset: project, package = pop_project_package_from_args( args, default_project=".", default_package=".", package_is_optional=False ) devel_project = None devel_package = None else: args_backup = args.copy() try: # try this sequence first: project package devel_project [devel_package] project, package = pop_project_package_from_args(args, package_is_optional=False) devel_project, devel_package = pop_project_package_from_args( args, default_package=package, package_is_optional=True ) except oscerr.OscValueError: # then read project and package from working copy and try devel_project [devel_package] args = args_backup.copy() project, package = pop_project_package_from_args( [], default_project=".", default_package=".", package_is_optional=False ) devel_project, devel_package = pop_project_package_from_args( args, default_package=package, package_is_optional=True ) if args: args_str = ", ".join(args) self.argparse_error(f"Unknown arguments: {args_str}") set_devel_project(apiurl, project, package, devel_project, devel_package, print_to="stdout") def do_showlinked(self, subcmd, opts, *args): """ Show all packages linking to a given one Examples: osc showlinked [PROJECT PACKAGE] """ from . import _private apiurl = self.get_api_url() args = list(args) project, package = pop_project_package_from_args( args, default_project=".", default_package=".", package_is_optional=False ) linked_packages = _private.get_linked_packages(apiurl, project, package) for pkg in linked_packages: print(f"{pkg['project']}/{pkg['name']}") @cmdln.option('-c', '--create', action='store_true', help='Create a new token') @cmdln.option('-d', '--delete', metavar='TOKENID', help='Delete a token') @cmdln.option('-o', '--operation', metavar='OPERATION', help="Operation associated with the token. Choices: runservice, branch, release, rebuild, workflow") @cmdln.option('-t', '--trigger', metavar='TOKENSTRING', help='Trigger the action of a token') @cmdln.option('', '--scm-token', metavar='SCM_TOKEN', help='The scm\'s access token (only in combination with a --operation=workflow option)') @cmdln.option('-a', '--arch', help='Release/Rebuild only binaries from the specified architecture') @cmdln.option('-r', '--repo', help='Release/Rebuild only binaries from the specified repository') @cmdln.option('--target-project', metavar='PROJECT', help='Release only to specified project') @cmdln.option('--target-repo', metavar='REPO', help='Release only to specified repository') @cmdln.option('--set-release', metavar='RELEASE_TAG', help='Rename binaries during release using this release tag') def do_token(self, subcmd, opts, *args): """ Show and manage authentication token Authentication token can be used to run specific commands without sending credentials. usage: osc token osc token --create --operation [ ] osc token --delete osc token --trigger [--operation ] [ ] """ from . import conf from . 
import obs_api from .core import slash_split args = slash_split(args) if opts.scm_token and opts.operation != 'workflow': msg = 'The --scm-token option requires a --operation=workflow option' raise oscerr.WrongOptions(msg) apiurl = self.get_api_url() user = conf.get_apiurl_usr(apiurl) if len(args) > 1: project = args[0] package = args[1] else: project = None package = None if opts.create: if not opts.operation: self.argparser.error("Please specify --operation") if opts.operation == 'workflow' and not opts.scm_token: msg = 'The --operation=workflow option requires a --scm-token= option' raise oscerr.WrongOptions(msg) print("Create a new token") status = obs_api.Token.cmd_create( apiurl, user, operation=opts.operation, project=project, package=package, scm_token=opts.scm_token, ) print(status.to_string()) elif opts.delete: print("Delete token") status = obs_api.Token.do_delete(apiurl, user, token=opts.delete) print(status.to_string()) elif opts.trigger: print("Trigger token") status = obs_api.Token.do_trigger( apiurl, token=opts.trigger, project=project, package=package, repo=opts.repo, arch=opts.arch, target_project=opts.target_project, target_repo=opts.target_repo, set_release=opts.set_release, ) print(status.to_string()) else: if args and args[0] in ['create', 'delete', 'trigger']: raise oscerr.WrongArgs("Did you mean --" + args[0] + "?") # just list tokens token_list = obs_api.Token.do_list(apiurl, user) for obj in token_list: print(obj.to_human_readable_string()) print() @cmdln.option('-a', '--attribute', metavar='ATTRIBUTE', help='affect only a given attribute') @cmdln.option('--attribute-defaults', action='store_true', default=None, help='include defined attribute defaults') @cmdln.option('--attribute-project', action='store_true', default=None, help='include project values, if missing in packages ') @cmdln.option('--blame', action='store_true', help='show author and time of each line') @cmdln.option('-f', '--force', action='store_true', help='force the save operation, allows one to ignores some errors like depending repositories. For prj meta only.') @cmdln.option('-F', '--file', metavar='FILE', help='read metadata from FILE, instead of opening an editor. ' '\'-\' denotes standard input. ') @cmdln.option('-r', '--revision', metavar='REV', help='checkout given revision instead of head revision. For prj and prjconf meta only') @cmdln.option('-m', '--message', metavar='TEXT', help='specify log message TEXT. For prj and prjconf meta only') @cmdln.option('-e', '--edit', action='store_true', help='edit metadata') @cmdln.option('-c', '--create', action='store_true', help='create attribute without values') @cmdln.option('-R', '--remove-linking-repositories', action='store_true', help='Try to remove also all repositories building against remove ones.') @cmdln.option('-s', '--set', metavar='ATTRIBUTE_VALUES', help='set attribute values') @cmdln.option('--add', metavar='ATTRIBUTE_VALUES', help='add to the existing attribute values, skip duplicates') @cmdln.option('--delete', action='store_true', help='delete a pattern or attribute') def do_meta(self, subcmd, opts, *args): """ Show meta information, or edit it Show or edit build service metadata of type . This command displays metadata on buildservice objects like projects, packages, or users. The type of metadata is specified by the word after "meta", like e.g. "meta prj". prj denotes metadata of a buildservice project. prjconf denotes the (build) configuration of a project. pkg denotes metadata of a buildservice package. 
user denotes the metadata of a user. group denotes the metadata of a group. pattern denotes installation patterns defined for a project. To list patterns, use 'osc meta pattern PRJ'. An additional argument will be the pattern file to view or edit. With the --edit switch, the metadata can be edited. Per default, osc opens the program specified by the environmental variable EDITOR with a temporary file. Alternatively, content to be saved can be supplied via the --file switch. If the argument is '-', input is taken from stdin: osc meta prjconf home:user | sed ... | osc meta prjconf home:user -F - For meta prj and prjconf updates optional commit messages can be applied with --message. When trying to edit a non-existing resource, it is created implicitly. Examples: osc meta prj PRJ osc meta pkg PRJ PKG osc meta pkg PRJ PKG -e usage: osc meta [-r|--revision REV] ARGS... osc meta ARGS... osc meta [-m|--message TEXT] -e|--edit ARGS... osc meta [-m|--message TEXT] -F|--file ARGS... osc meta pattern --delete PRJ PATTERN osc meta attribute PRJ [PKG [SUBPACKAGE]] [--attribute ATTRIBUTE] [--create [--set ]|--delete|--set ] """ from . import _private from . import conf from . import store as osc_store from .core import decode_it from .core import edit_meta from .core import get_group_meta from .core import get_user_meta from .core import http_DELETE from .core import http_POST from .core import makeurl from .core import metatypes from .core import show_attribute_meta from .core import show_package_meta from .core import show_pattern_meta from .core import show_pattern_metalist from .core import show_project_conf from .core import show_project_meta from .core import slash_split from .core import store_read_package from .core import store_read_project from .core import streamfile args = slash_split(args) if not args or args[0] not in metatypes.keys(): raise oscerr.WrongArgs('Unknown meta type. Choose one of %s.' % ', '.join(metatypes)) cmd = args[0] del args[0] if cmd == 'pkg': min_args, max_args = 0, 2 elif cmd == 'pattern': min_args, max_args = 1, 2 elif cmd == 'attribute': min_args, max_args = 1, 3 elif cmd in ('prj', 'prjconf'): min_args, max_args = 0, 1 else: min_args, max_args = 1, 1 if len(args) < min_args: raise oscerr.WrongArgs('Too few arguments.') if len(args) > max_args: raise oscerr.WrongArgs('Too many arguments.') if opts.add and opts.set: self.argparse_error("Options --add and --set are mutually exclusive") if cmd == "attribute" and opts.edit and not opts.attribute: self.argparse_error("Please specify --attribute") apiurl = self.get_api_url() project = None package = None subpackage = None user = None group = None pattern = None # Specific arguments # # If project or package arguments missing, assume to work # with project and/or package in current local directory. 
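        # e.g. running "osc meta pkg" inside a checked-out package directory picks
        # up PRJ and PKG from the working copy instead of requiring them as arguments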
attributepath = [] if cmd in ['prj', 'prjconf']: if len(args) < 1: apiurl = osc_store.Store(Path.cwd()).apiurl project = store_read_project(Path.cwd()) else: project = self._process_project_name(args[0]) elif cmd == 'pkg': if len(args) < 2: apiurl = osc_store.Store(Path.cwd()).apiurl project = store_read_project(Path.cwd()) if len(args) < 1: package = store_read_package(Path.cwd()) else: package = args[0] else: project = self._process_project_name(args[0]) package = args[1] elif cmd == 'attribute': project = self._process_project_name(args[0]) if len(args) > 1: package = args[1] else: package = None if opts.attribute_project: raise oscerr.WrongOptions('--attribute-project works only when also a package is given') if len(args) > 2: subpackage = args[2] else: subpackage = None attributepath.append('source') attributepath.append(project) if package: attributepath.append(package) if subpackage: attributepath.append(subpackage) attributepath.append('_attribute') elif cmd == 'user': user = args[0] elif cmd == 'group': group = args[0] elif cmd == 'pattern': project = self._process_project_name(args[0]) if len(args) > 1: pattern = args[1] else: pattern = None # enforce pattern argument if needed if opts.edit or opts.file: raise oscerr.WrongArgs('A pattern file argument is required.') if cmd not in ['prj', 'prjconf'] and (opts.message or opts.revision): raise oscerr.WrongOptions('options --revision and --message are only supported for the prj or prjconf subcommand') # show if not opts.edit and not opts.file and not opts.delete and not opts.create and not opts.set and not opts.add: if cmd == 'prj': sys.stdout.write(decode_it(b''.join(show_project_meta(apiurl, project, rev=opts.revision, blame=opts.blame)))) elif cmd == 'pkg': sys.stdout.write(decode_it(b''.join(show_package_meta(apiurl, project, package, blame=opts.blame)))) elif cmd == 'attribute': sys.stdout.write(decode_it(b''.join(show_attribute_meta(apiurl, project, package, subpackage, opts.attribute, opts.attribute_defaults, opts.attribute_project)))) elif cmd == 'prjconf': sys.stdout.write(decode_it(b''.join(show_project_conf(apiurl, project, rev=opts.revision, blame=opts.blame)))) elif cmd == 'user': r = get_user_meta(apiurl, user) if r: sys.stdout.write(decode_it(r)) elif cmd == 'group': r = get_group_meta(apiurl, group) if r: sys.stdout.write(decode_it(r)) elif cmd == 'pattern': if pattern: r = show_pattern_meta(apiurl, project, pattern) if r: sys.stdout.write(''.join(r)) else: r = show_pattern_metalist(apiurl, project) if r: sys.stdout.write('\n'.join(r) + '\n') # edit if opts.edit and not opts.file: if cmd == 'prj': edit_meta(metatype='prj', edit=True, force=opts.force, remove_linking_repositories=opts.remove_linking_repositories, path_args=(project, ), apiurl=apiurl, msg=opts.message, template_args=({ 'name': project, 'user': conf.get_apiurl_usr(apiurl)})) elif cmd == 'pkg': edit_meta(metatype='pkg', edit=True, path_args=(project, package), apiurl=apiurl, template_args=({ 'name': package, 'user': conf.get_apiurl_usr(apiurl)})) elif cmd == 'prjconf': edit_meta(metatype='prjconf', edit=True, path_args=(project, ), apiurl=apiurl, msg=opts.message, template_args=None) elif cmd == 'user': edit_meta(metatype='user', edit=True, path_args=(user, ), apiurl=apiurl, template_args=({'user': user})) elif cmd == 'group': edit_meta(metatype='group', edit=True, path_args=(group, ), apiurl=apiurl, template_args=({'group': group})) elif cmd == 'pattern': edit_meta(metatype='pattern', edit=True, path_args=(project, pattern), apiurl=apiurl, 
template_args=None) elif cmd == 'attribute': edit_meta( metatype='attribute', edit=True, path_args=(project, opts.attribute), apiurl=apiurl, # PUT is not supported method="POST", template_args=None, ) # create attribute entry if (opts.create or opts.set or opts.add) and cmd == 'attribute': if not opts.attribute: raise oscerr.WrongOptions('no attribute given to create') aname = opts.attribute.split(":") if len(aname) != 2: raise oscerr.WrongOptions('Given attribute is not in "NAMESPACE:NAME" style') values = [] if opts.add: # read the existing values from server root = _private.api.get(apiurl, attributepath) nodes = _private.api.find_nodes(root, "attributes", "attribute", {"namespace": aname[0], "name": aname[1]}, "value") for node in nodes: # append the existing values values.append(node.text) # pretend we're setting values in order to append the values we have specified on the command-line, # because OBS API doesn't support extending the value list directly opts.set = opts.add if opts.set: for i in opts.set.split(','): # append the new values # we skip duplicates during --add if opts.add and i in values: continue values.append(i) values_str = "" for value in values: value = _private.api.xml_escape(value) values_str += f"{value}" ns = _private.api.xml_escape(aname[0]) name = _private.api.xml_escape(aname[1]) d = f"{values_str}" url = makeurl(apiurl, attributepath) for data in streamfile(url, http_POST, data=d): sys.stdout.buffer.write(data) # upload file if opts.file: if opts.file == '-': f = sys.stdin.read() else: try: f = open(opts.file).read() except: sys.exit(f'could not open file \'{opts.file}\'.') if cmd == 'prj': edit_meta(metatype='prj', data=f, edit=opts.edit, force=opts.force, remove_linking_repositories=opts.remove_linking_repositories, apiurl=apiurl, msg=opts.message, path_args=(project, )) elif cmd == 'pkg': edit_meta(metatype='pkg', data=f, edit=opts.edit, apiurl=apiurl, path_args=(project, package)) elif cmd == 'prjconf': edit_meta(metatype='prjconf', data=f, edit=opts.edit, apiurl=apiurl, msg=opts.message, path_args=(project, )) elif cmd == 'user': edit_meta(metatype='user', data=f, edit=opts.edit, apiurl=apiurl, path_args=(user, )) elif cmd == 'group': edit_meta(metatype='group', data=f, edit=opts.edit, apiurl=apiurl, path_args=(group, )) elif cmd == 'pattern': edit_meta(metatype='pattern', data=f, edit=opts.edit, apiurl=apiurl, path_args=(project, pattern)) # delete if opts.delete: path = metatypes[cmd]['path'] if cmd == 'pattern': path = path % (project, pattern) u = makeurl(apiurl, [path]) http_DELETE(u) elif cmd == 'attribute': if not opts.attribute: raise oscerr.WrongOptions('no attribute given to create') attributepath.append(opts.attribute) u = makeurl(apiurl, attributepath) for data in streamfile(u, http_DELETE): sys.stdout.buffer.write(data) else: raise oscerr.WrongOptions('The --delete switch is only for pattern metadata or attributes.') # TODO: rewrite and consolidate the current submitrequest/createrequest "mess" @cmdln.option('-m', '--message', metavar='TEXT', help='specify message TEXT') @cmdln.option('-F', '--file', metavar='FILE', help='read log message from FILE, \'-\' denotes standard input.') @cmdln.option('-r', '--revision', metavar='REV', help='specify a certain source revision ID (the md5 sum) for the source package') @cmdln.option('-s', '--supersede', metavar='REQUEST_ID', help='Superseding another request by this one') @cmdln.option('--nodevelproject', action='store_true', help='do not follow a defined devel project ' '(primary project where a 
package is developed)') @cmdln.option('--separate-requests', action='store_true', help='Create multiple requests instead of a single one (when command is used for entire project)') @cmdln.option('--cleanup', action='store_true', help='remove package if submission gets accepted (default for home::branch projects)') @cmdln.option('--no-cleanup', action='store_true', help='never remove source package on accept, but update its content') @cmdln.option('--no-update', action='store_true', help='never touch source package on accept (will break source links)') @cmdln.option('--update-link', action='store_true', help='This transfers the source including the _link file.') @cmdln.option('-d', '--diff', action='store_true', help='show diff only instead of creating the actual request') @cmdln.option('--yes', action='store_true', help='proceed without asking.') @cmdln.alias("sr") @cmdln.alias("submitreq") @cmdln.alias("submitpac") def do_submitrequest(self, subcmd, opts, *args): """ Create request to submit source into another Project [See http://en.opensuse.org/openSUSE:Build_Service_Collaboration for information on this topic.] See the "request" command for showing and modifying existing requests. usage: osc submitreq [OPTIONS] osc submitreq [OPTIONS] DESTPRJ [DESTPKG] osc submitreq [OPTIONS] SOURCEPRJ SOURCEPKG DESTPRJ [DESTPKG] osc submitpac ... is a shorthand for osc submitreq --cleanup ... """ from . import _private from . import conf from .core import ET from .core import Package from .core import _html_escape from .core import change_request_state from .core import check_existing_maintenance_requests from .core import check_existing_requests from .core import create_submit_request from .core import decode_it from .core import edit_message from .core import highlight_diff from .core import http_GET from .core import http_POST from .core import is_project_dir from .core import makeurl from .core import meta_get_packagelist from .core import parse_diff_for_commit_message from .core import raw_input from .core import run_pager from .core import server_diff from .core import show_devel_project from .core import slash_split from .core import store_read_project def _check_service(root): serviceinfo = root.find('serviceinfo') if serviceinfo is not None: # code "running" is ok, because the api will choke when trying # to create the sr (if it is still running) if serviceinfo.get('code') not in ('running', 'succeeded'): print('A service run for package %s %s:' % (root.get('name'), serviceinfo.get('code')), file=sys.stderr) error = serviceinfo.find('error') if error is not None: print('\n'.join(error.text.split('\\n'))) sys.exit('\nPlease fix this first') if opts.cleanup and opts.no_cleanup: raise oscerr.WrongOptions('\'--cleanup\' and \'--no-cleanup\' are mutually exclusive') src_update = conf.config['submitrequest_on_accept_action'] or None # we should check here for home::branch and default to update, but that would require OBS 1.7 server if subcmd == 'submitpac' and not opts.no_cleanup: opts.cleanup = True if opts.cleanup: src_update = "cleanup" elif opts.no_cleanup: src_update = "update" elif opts.no_update: src_update = "noupdate" if opts.message: pass elif opts.file: if opts.file == '-': opts.message = sys.stdin.read() else: try: opts.message = open(opts.file).read() except: sys.exit(f'could not open file \'{opts.file}\'.') myreqs = [] if opts.supersede: myreqs = [opts.supersede] args = slash_split(args) if len(args) > 4: raise oscerr.WrongArgs('Too many arguments.') if len(args) == 2 and 
is_project_dir(Path.cwd()): sys.exit('You can not specify a target package when submitting an entire project\n') apiurl = self.get_api_url() if len(args) < 2 and is_project_dir(Path.cwd()): if opts.diff: raise oscerr.WrongOptions('\'--diff\' is not supported in a project working copy') project = store_read_project(Path.cwd()) sr_ids = [] target_project = None if len(args) == 1: target_project = self._process_project_name(args[0]) if opts.separate_requests: for p in meta_get_packagelist(apiurl, project): # get _link info from server, that knows about the local state ... u = makeurl(apiurl, ['source', project, p]) f = http_GET(u) root = xml_parse(f).getroot() _check_service(root) linkinfo = root.find('linkinfo') if linkinfo is None: if len(args) < 1: print("Package ", p, " is not a source link and no target specified.") sys.exit("This is currently not supported.") else: if linkinfo.get('error'): print("Package ", p, " is a broken source link.") sys.exit("Please fix this first") t = linkinfo.get('project') if t is None: print("Skipping package ", p, " since it is a source link pointing inside the project.") continue print("Submitting package ", p) try: result = create_submit_request(apiurl, project, p, target_project, src_update=src_update) except HTTPError as e: if e.hdrs.get('X-Opensuse-Errorcode') == 'missing_action': print("Package ", p, " no changes. Skipping...") continue raise if not result: sys.exit("submit request creation failed") sr_ids.append(result) else: actionxml = "" options_block = "" if src_update: options_block += f"""{src_update}""" if opts.update_link: options_block + """true """ options_block += "" target_prj_block = "" if target_project is not None: target_prj_block = f"""""" s = """ %s %s """ % \ (project, target_prj_block, options_block) actionxml += s xml = """ %s %s """ % \ (actionxml, _html_escape(opts.message or "")) u = makeurl(apiurl, ['request'], query={"cmd": "create", "addrevision": "1"}) f = http_POST(u, data=xml) root = xml_parse(f).getroot() sr_ids.append(root.get('id')) print("Request(s) created: ", end=' ') for i in sr_ids: print(i, end=' ') # was this project created by clone request ? u = makeurl(apiurl, ['source', project, '_attribute', 'OBS:RequestCloned']) f = http_GET(u) root = xml_parse(f).getroot() value = root.findtext('attribute/value') if value and not opts.yes: repl = '' print('\n\nThere are already following submit request: %s.' % ', '.join([str(i) for i in myreqs])) repl = raw_input('\nSupersede the old requests? (y/n) ') if repl.lower() == 'y': myreqs += [value] if len(myreqs) > 0: for req in myreqs: change_request_state(apiurl, str(req), 'superseded', f'superseded by {sr_ids[0]}', sr_ids[0]) sys.exit('Successfully finished') elif len(args) <= 2: # try using the working copy at hand p = Package(Path.cwd()) src_project = p.prjname src_package = p.name if p.apiurl != apiurl: print(f'The apiurl for the working copy of this package is {p.apiurl}') print(f'You cannot use this command with the -A {self.options.apiurl} option.') sys.exit(1) apiurl = p.apiurl if len(args) == 0 and p.islink(): dst_project = p.linkinfo.project dst_package = p.linkinfo.package elif len(args) > 0: dst_project = self._process_project_name(args[0]) if len(args) == 2: dst_package = args[1] else: if p.islink(): dst_package = p.linkinfo.package else: dst_package = src_package else: sys.exit('Package \'%s\' is not a source link, so I cannot guess the submit target.\n' 'Please provide it the target via commandline arguments.' 
% p.name) modified = [i for i in p.filenamelist if not p.status(i) in (' ', '?', 'S')] if len(modified) > 0 and not opts.yes: print('Your working copy has local modifications.') repl = raw_input('Proceed without committing the local changes? (y|N) ') if repl != 'y': raise oscerr.UserAbort() elif len(args) >= 3: # get the arguments from the commandline src_project, src_package, dst_project = args[0:3] if len(args) == 4: dst_package = args[3] else: dst_package = src_package src_project = self._process_project_name(src_project) dst_project = self._process_project_name(dst_project) else: self.argparse_error("Incorrect number of arguments.") # check for failed source service u = makeurl(apiurl, ['source', src_project, src_package]) f = http_GET(u) root = xml_parse(f).getroot() _check_service(root) if not opts.nodevelproject: devloc = None try: devloc, _ = show_devel_project(apiurl, dst_project, dst_package) except HTTPError: print("""\ Warning: failed to fetch meta data for '%s' package '%s' (new package?) """ % (dst_project, dst_package), file=sys.stderr) if devloc and \ dst_project != devloc and \ src_project != devloc: print("""\ A different project, %s, is defined as the place where development of the package %s primarily takes place. Please submit there instead, or use --nodevelproject to force direct submission.""" % (devloc, dst_package)) if not opts.diff: sys.exit(1) rev = opts.revision if not rev: # get _link info from server, that knows about the local state ... u = makeurl(apiurl, ['source', src_project, src_package], query={"expand": "1"}) f = http_GET(u) root = xml_parse(f).getroot() linkinfo = root.find('linkinfo') if linkinfo is None: rev = root.get('rev') else: if linkinfo.get('project') != dst_project or linkinfo.get('package') != dst_package: # the submit target is not link target. use merged md5sum references to # avoid not mergable sources when multiple request from same source get created. 
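# (with expand=1, 'srcmd5' is the checksum of the fully expanded source tree,
# i.e. the link already merged with its target, so the request pins the exact
# merged content instead of a plain link revision)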
rev = root.get('srcmd5') rdiff = None if opts.diff or not opts.message: try: rdiff = b'old: %s/%s\nnew: %s/%s rev %s\n' % (dst_project.encode(), dst_package.encode(), src_project.encode(), src_package.encode(), str(rev).encode()) rdiff += server_diff(apiurl, dst_project, dst_package, None, src_project, src_package, rev, True) except: rdiff = b'' if opts.diff: run_pager(highlight_diff(rdiff)) return if rdiff is not None: rdiff = decode_it(rdiff) supersede_existing = False reqs = [] if not opts.supersede: (supersede_existing, reqs) = check_existing_requests(apiurl, src_project, src_package, dst_project, dst_package, not opts.yes) if not supersede_existing: (supersede_existing, reqs) = check_existing_maintenance_requests(apiurl, src_project, [src_package], dst_project, None, not opts.yes) if not opts.message: msg = "" if opts.supersede: from .obs_api import Request req = Request.from_api(apiurl, opts.supersede) msg = req.description + "\n" difflines = [] doappend = False changes_re = re.compile(r'^--- .*\.changes ') for line in rdiff.split('\n'): if line.startswith('--- '): if changes_re.match(line): doappend = True else: doappend = False if doappend: difflines.append(line) diff = "\n".join(parse_diff_for_commit_message("\n".join(difflines))) opts.message = edit_message(footer=rdiff, template=f"{msg}{diff}") result = create_submit_request(apiurl, src_project, src_package, dst_project, dst_package, opts.message, orev=rev, src_update=src_update, dst_updatelink=opts.update_link) print('created request id', result) if conf.config['print_web_links']: obs_url = _private.get_configuration_value(apiurl, "obs_url") print(f"{obs_url}/request/show/{result}") if supersede_existing: for req in reqs: change_request_state(apiurl, req.reqid, 'superseded', f'superseded by {result}', result) if opts.supersede: change_request_state(apiurl, opts.supersede, 'superseded', opts.message or '', result) def _submit_request(self, args, opts, options_block): from . import conf from .core import ET from .core import Package from .core import get_request_collection from .core import http_GET from .core import is_project_dir from .core import makeurl from .core import meta_get_packagelist from .core import raw_input from .core import server_diff from .core import show_devel_project from .core import show_upstream_rev from .core import store_read_project actionxml = "" apiurl = self.get_api_url() if len(args) == 0 and is_project_dir(Path.cwd()): # submit requests for multiple packages are currently handled via multiple requests # They could be also one request with multiple actions, but that avoids to accepts parts of it. project = store_read_project(Path.cwd()) pi = [] pac = [] targetprojects = [] # loop via all packages for checking their state for p in meta_get_packagelist(apiurl, project): if p.startswith("_patchinfo:"): pi.append(p) else: # get _link info from server, that knows about the local state ... 
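# The directory listing at /source/PROJECT/PACKAGE contains a 'linkinfo'
# element only when the package is a source link; its attributes name the
# link target and carry an 'error' attribute if the link is broken.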
u = makeurl(apiurl, ['source', project, p]) f = http_GET(u) root = xml_parse(f).getroot() linkinfo = root.find('linkinfo') if linkinfo is None: print("Package ", p, " is not a source link.") sys.exit("This is currently not supported.") if linkinfo.get('error'): print("Package ", p, " is a broken source link.") sys.exit("Please fix this first") t = linkinfo.get('project') if t: rdiff = b'' try: rdiff = server_diff(apiurl, t, p, opts.revision, project, p, None, True) except: rdiff = b'' if rdiff != b'': targetprojects.append(t) pac.append(p) else: print("Skipping package ", p, " since it has no difference with the target package.") else: print("Skipping package ", p, " since it is a source link pointing inside the project.") # loop via all packages to do the action for p in pac: s = """ %s """ % \ (project, p, opts.revision or show_upstream_rev(apiurl, project, p), t, p, options_block) actionxml += s # create submit requests for all found patchinfos for p in pi: for t in targetprojects: s = """ %s """ % \ (project, p, t, p, options_block) actionxml += s return actionxml, [] elif len(args) <= 2: # try using the working copy at hand p = Package(Path.cwd()) src_project = p.prjname src_package = p.name if len(args) == 0 and p.islink(): dst_project = p.linkinfo.project dst_package = p.linkinfo.package elif len(args) > 0: dst_project = self._process_project_name(args[0]) if len(args) == 2: dst_package = args[1] else: dst_package = src_package else: sys.exit('Package \'%s\' is not a source link, so I cannot guess the submit target.\n' 'Please provide it the target via commandline arguments.' % p.name) modified = [i for i in p.filenamelist if p.status(i) != ' ' and p.status(i) != '?'] if len(modified) > 0: print('Your working copy has local modifications.') repl = raw_input('Proceed without committing the local changes? (y|N) ') if repl != 'y': sys.exit(1) elif len(args) >= 3: # get the arguments from the commandline src_project, src_package, dst_project = args[0:3] if len(args) == 4: dst_package = args[3] else: dst_package = src_package src_project = self._process_project_name(src_project) dst_project = self._process_project_name(dst_project) else: self.argparse_error("Incorrect number of arguments.") if not opts.nodevelproject: devloc = None try: devloc, _ = show_devel_project(apiurl, dst_project, dst_package) except HTTPError: print("""\ Warning: failed to fetch meta data for '%s' package '%s' (new package?) """ % (dst_project, dst_package), file=sys.stderr) if devloc and \ dst_project != devloc and \ src_project != devloc: print("""\ A different project, %s, is defined as the place where development of the package %s primarily takes place. Please submit there instead, or use --nodevelproject to force direct submission.""" % (devloc, dst_package)) sys.exit(1) reqs = get_request_collection(apiurl, project=dst_project, package=dst_package, types=['submit'], states=['new', 'review']) user = conf.get_apiurl_usr(apiurl) myreqs = [i for i in reqs if i.state.who == user and i.reqid != opts.supersede] myreq_ids = [r.reqid for r in myreqs] repl = 'y' if len(myreqs) > 0 and not opts.yes: print('You already created the following submit request: %s.' % ', '.join(myreq_ids)) repl = raw_input('Supersede the old requests? 
(y/n/c) ') if repl.lower() == 'c': print('Aborting', file=sys.stderr) sys.exit(1) elif repl.lower() != 'y': myreqs = [] actionxml = """ %s """ % \ (src_project, src_package, opts.revision or show_upstream_rev(apiurl, src_project, src_package), dst_project, dst_package, options_block) if opts.supersede: myreq_ids.append(opts.supersede) # print 'created request id', result return actionxml, myreq_ids def _delete_request(self, args, opts): if len(args) < 1: raise oscerr.WrongArgs('Please specify at least a project.') if len(args) > 2: raise oscerr.WrongArgs('Too many arguments.') package = "" if len(args) > 1: package = f"""package="{args[1]}" """ actionxml = f""" """ return actionxml def _changedevel_request(self, args, opts): from .core import find_default_project from .core import is_package_dir from .core import store_read_package from .core import store_read_project if len(args) > 4: raise oscerr.WrongArgs('Too many arguments.') if len(args) == 0 and is_package_dir('.') and find_default_project(): wd = Path.cwd() devel_project = store_read_project(wd) devel_package = package = store_read_package(wd) project = find_default_project(self.get_api_url(), package) else: if len(args) < 3: raise oscerr.WrongArgs('Too few arguments.') devel_project = self._process_project_name(args[2]) project = self._process_project_name(args[0]) package = args[1] devel_package = package if len(args) > 3: devel_package = self._process_project_name(args[3]) actionxml = """ """ % \ (devel_project, devel_package, project, package) return actionxml def _release_request(self, args, opts): if len(args) < 3 or len(args) > 5: raise oscerr.WrongArgs('Wrong number of arguments for release' + str(len(args))) project = self._process_project_name(args[0]) package = args[1] target_project = args[2] source_repository = target_repository = "" if len(args) == 5: source_repository = """ repository="%s" """ % args[3] target_repository = """ repository="%s" """ % args[4] elif len(args) == 4: target_repository = """ repository="%s" """ % args[3] actionxml = """ """ % \ (project, package, source_repository, target_project, target_repository) return actionxml def _add_me(self, args, opts): from . 
import conf from .core import get_user_meta if len(args) > 3: raise oscerr.WrongArgs('Too many arguments.') if len(args) < 2: raise oscerr.WrongArgs('Too few arguments.') apiurl = self.get_api_url() user = conf.get_apiurl_usr(apiurl) role = args[0] project = self._process_project_name(args[1]) actionxml = """ """ % \ (project, user, role) if len(args) > 2: package = args[2] actionxml = """ """ % \ (project, package, user, role) if get_user_meta(apiurl, user) is None: raise oscerr.WrongArgs('osc: an error occurred.') return actionxml def _add_user(self, args, opts): from .core import get_user_meta if len(args) > 4: raise oscerr.WrongArgs('Too many arguments.') if len(args) < 3: raise oscerr.WrongArgs('Too few arguments.') apiurl = self.get_api_url() user = args[0] role = args[1] project = self._process_project_name(args[2]) actionxml = """ """ % \ (project, user, role) if len(args) > 3: package = args[3] actionxml = """ """ % \ (project, package, user, role) if get_user_meta(apiurl, user) is None: raise oscerr.WrongArgs('osc: an error occured.') return actionxml def _add_group(self, args, opts): from .core import get_group_meta if len(args) > 4: raise oscerr.WrongArgs('Too many arguments.') if len(args) < 3: raise oscerr.WrongArgs('Too few arguments.') apiurl = self.get_api_url() group = args[0] role = args[1] project = self._process_project_name(args[2]) actionxml = """ """ % \ (project, group, role) if len(args) > 3: package = args[3] actionxml = """ """ % \ (project, package, group, role) if get_group_meta(apiurl, group) is None: raise oscerr.WrongArgs('osc: an error occured.') return actionxml def _set_bugowner(self, args, opts): from .core import get_group_meta from .core import get_user_meta if len(args) > 3: raise oscerr.WrongArgs('Too many arguments.') if len(args) < 2: raise oscerr.WrongArgs('Too few arguments.') apiurl = self.get_api_url() user = args[0] project = self._process_project_name(args[1]) package = "" if len(args) > 2: package = f"""package="{args[2]}" """ if user.startswith('group:'): group = user.replace('group:', '') actionxml = """ """ % \ (project, package, group) if get_group_meta(apiurl, group) is None: raise oscerr.WrongArgs('osc: an error occurred.') else: actionxml = """ """ % \ (project, package, user) if get_user_meta(apiurl, user) is None: raise oscerr.WrongArgs('osc: an error occured.') return actionxml @cmdln.option('-a', '--action', action='append', nargs='+', metavar=('ACTION', '[ARGS]'), dest='actions', default=[], help='specify action type of a request, can be : submit/delete/change_devel/add_role/set_bugowner') @cmdln.option('-m', '--message', metavar='TEXT', help='specify message TEXT') @cmdln.option('-r', '--revision', metavar='REV', help='for "create", specify a certain source revision ID (the md5 sum)') @cmdln.option('-s', '--supersede', metavar='REQUEST_ID', help='Superseding another request by this one') @cmdln.option('--nodevelproject', action='store_true', help='do not follow a defined devel project ' '(primary project where a package is developed)') @cmdln.option('--cleanup', action='store_true', help='remove package if submission gets accepted (default for home::branch projects)') @cmdln.option('--no-cleanup', action='store_true', help='never remove source package on accept, but update its content') @cmdln.option('--no-update', action='store_true', help='never touch source package on accept (will break source links)') @cmdln.option('--yes', action='store_true', help='proceed without asking.') @cmdln.alias("creq") def do_createrequest(self, 
subcmd, opts, *args): """ Create a request with multiple actions usage: osc creq [OPTIONS] [ -a submit SOURCEPRJ SOURCEPKG DESTPRJ [DESTPKG] -a delete PROJECT [PACKAGE] -a change_devel PROJECT PACKAGE DEVEL_PROJECT [DEVEL_PACKAGE] -a add_me ROLE PROJECT [PACKAGE] -a add_group GROUP ROLE PROJECT [PACKAGE] -a add_role USER ROLE PROJECT [PACKAGE] -a set_bugowner USER PROJECT [PACKAGE] -a release PROJECT PACKAGE TARGET_PROJECT [[SOURCE_REPOSITORY] TARGET_REPOSITORY] ] Option -m works for all types of request actions, the rest work only for submit. Example: osc creq -a submit -a delete home:someone:branches:openSUSE:Tools -a change_devel openSUSE:Tools osc home:someone:branches:openSUSE:Tools -m ok This will submit all modified packages under current directory, delete project home:someone:branches:openSUSE:Tools and change the devel project to home:someone:branches:openSUSE:Tools for package osc in project openSUSE:Tools. """ from . import conf from .core import ET from .core import _html_escape from .core import change_request_state from .core import edit_message from .core import http_POST from .core import makeurl from .core import slash_split src_update = conf.config['submitrequest_on_accept_action'] or None # we should check here for home::branch and default to update, but that would require OBS 1.7 server if opts.cleanup: src_update = "cleanup" elif opts.no_cleanup: src_update = "update" elif opts.no_update: src_update = "noupdate" options_block = "" if src_update: options_block = f"""{src_update} """ args = slash_split(args) apiurl = self.get_api_url() actionsxml = "" supersede = set() for actiondata in opts.actions: action = actiondata[0] args = actiondata[1:] if action == 'submit': actions, to_supersede = self._submit_request(args, opts, options_block) actionsxml += actions supersede.update(to_supersede) elif action == 'delete': actionsxml += self._delete_request(args, opts) elif action == 'change_devel': actionsxml += self._changedevel_request(args, opts) elif action == 'release': actionsxml += self._release_request(args, opts) elif action == 'add_me': actionsxml += self._add_me(args, opts) elif action == 'add_group': actionsxml += self._add_group(args, opts) elif action == 'add_role': actionsxml += self._add_user(args, opts) elif action == 'set_bugowner': actionsxml += self._set_bugowner(args, opts) else: raise oscerr.WrongArgs(f"Unsupported action {action}") if actionsxml == "": sys.exit('No actions need to be taken.') if not opts.message: opts.message = edit_message() xml = """ %s %s """ % \ (actionsxml, _html_escape(opts.message or "")) u = makeurl(apiurl, ['request'], query={"cmd": "create"}) f = http_POST(u, data=xml) root = xml_parse(f).getroot() rid = root.get('id') print(f"Request {rid} created") for srid in supersede: change_request_state(apiurl, srid, 'superseded', f'superseded by {rid}', rid) @cmdln.option('-m', '--message', metavar='TEXT', help='specify message TEXT') @cmdln.option('-r', '--role', metavar='role', help='specify user role (default: maintainer)') @cmdln.alias("reqbugownership") @cmdln.alias("requestbugownership") @cmdln.alias("reqmaintainership") @cmdln.alias("reqms") @cmdln.alias("reqbs") def do_requestmaintainership(self, subcmd, opts, *args): """ Requests to add user as maintainer or bugowner usage: osc requestmaintainership # for current user in checked out package osc requestmaintainership USER # for specified user in checked out package osc requestmaintainership PROJECT # for current user if cwd is not a checked out package osc requestmaintainership 
PROJECT group:NAME # request for specified group osc requestmaintainership PROJECT PACKAGE # for current user osc requestmaintainership PROJECT PACKAGE USER # request for specified user osc requestmaintainership PROJECT PACKAGE group:NAME # request for specified group osc requestbugownership ... # accepts same parameters but uses bugowner role """ from . import conf from .core import Request from .core import edit_message from .core import is_package_dir from .core import slash_split from .core import store_read_package from .core import store_read_project args = slash_split(args) apiurl = self.get_api_url() if len(args) == 2: project = self._process_project_name(args[0]) package = args[1] if package.startswith('group:'): user = package package = None else: user = conf.get_apiurl_usr(apiurl) elif len(args) == 3: project = self._process_project_name(args[0]) package = args[1] user = args[2] elif len(args) < 2 and is_package_dir(Path.cwd()): project = store_read_project(Path.cwd()) package = store_read_package(Path.cwd()) if len(args) == 0: user = conf.get_apiurl_usr(apiurl) else: user = args[0] elif len(args) == 1: user = conf.get_apiurl_usr(apiurl) project = self._process_project_name(args[0]) package = None else: raise oscerr.WrongArgs('Wrong number of arguments.') role = 'maintainer' if subcmd in ('reqbugownership', 'requestbugownership', 'reqbs'): role = 'bugowner' if opts.role: role = opts.role if role not in ('maintainer', 'bugowner'): raise oscerr.WrongOptions('invalid \'--role\': either specify \'maintainer\' or \'bugowner\'') if not opts.message: opts.message = edit_message() r = Request() if user.startswith('group:'): group = user.replace('group:', '') if role == 'bugowner': r.add_action('set_bugowner', tgt_project=project, tgt_package=package, group_name=group) else: r.add_action('add_role', tgt_project=project, tgt_package=package, group_name=group, group_role=role) elif role == 'bugowner': r.add_action('set_bugowner', tgt_project=project, tgt_package=package, person_name=user) else: r.add_action('add_role', tgt_project=project, tgt_package=package, person_name=user, person_role=role) r.description = opts.message r.create(apiurl) print(r.reqid) @cmdln.option('-m', '--message', metavar='TEXT', help='specify message TEXT') @cmdln.option('-r', '--repository', metavar='REPOSITORY', help='specify repository') @cmdln.option('--all', action='store_true', help='deletes entire project with packages inside') @cmdln.option('--accept-in-hours', metavar='HOURS', help='specify time when request shall get accepted automatically. 
Only works with write permissions in target.') @cmdln.alias("dr") @cmdln.alias("dropreq") @cmdln.alias("droprequest") @cmdln.alias("deletereq") def do_deleterequest(self, subcmd, opts, *args): """ Request to delete (or 'drop') a package or project usage: osc deletereq [-m TEXT] # works in checked out project/package osc deletereq [-m TEXT] PROJECT PACKAGE osc deletereq [-m TEXT] PROJECT [--all|--repository REPOSITORY] """ from .core import Request from .core import edit_message from .core import is_package_dir from .core import is_project_dir from .core import slash_split from .core import store_read_package from .core import store_read_project args = slash_split(args) project = None package = None repository = None if len(args) > 2: raise oscerr.WrongArgs('Too many arguments.') elif len(args) == 1: project = self._process_project_name(args[0]) elif len(args) == 2: project = self._process_project_name(args[0]) package = args[1] elif is_project_dir(Path.cwd()): project = store_read_project(Path.cwd()) elif is_package_dir(Path.cwd()): project = store_read_project(Path.cwd()) package = store_read_package(Path.cwd()) else: raise oscerr.WrongArgs('Please specify at least a project.') if not opts.all and package is None and not opts.repository: raise oscerr.WrongOptions('No package name has been provided. Use --all option, if you want to request to delete the entire project.') if opts.repository: repository = opts.repository if not opts.message: if package is not None: footer = textwrap.TextWrapper(width=66).fill( 'please explain why you like to delete package %s of project %s' % (package, project)) else: footer = textwrap.TextWrapper(width=66).fill( f'please explain why you like to delete project {project}') opts.message = edit_message(footer) r = Request() r.add_action('delete', tgt_project=project, tgt_package=package, tgt_repository=repository) r.description = opts.message if opts.accept_in_hours: r.accept_at_in_hours(int(opts.accept_in_hours)) r.create(self.get_api_url()) print(r.reqid) @cmdln.option('-m', '--message', metavar='TEXT', help='specify message TEXT') @cmdln.alias("cr") @cmdln.alias("changedevelreq") def do_changedevelrequest(self, subcmd, opts, *args): """ Create request to change the devel package definition [See http://en.opensuse.org/openSUSE:Build_Service_Collaboration for information on this topic.] See the "request" command for showing and modifying existing requests. 
usage: osc changedevelrequest PROJECT PACKAGE DEVEL_PROJECT [DEVEL_PACKAGE] """ from .core import Request from .core import edit_message from .core import find_default_project from .core import is_package_dir from .core import store_read_package from .core import store_read_project if len(args) == 0 and is_package_dir('.') and find_default_project(): wd = Path.cwd() devel_project = store_read_project(wd) devel_package = package = store_read_package(wd) project = find_default_project(self.get_api_url(), package) elif len(args) < 3: raise oscerr.WrongArgs('Too few arguments.') elif len(args) > 4: raise oscerr.WrongArgs('Too many arguments.') else: devel_project = self._process_project_name(args[2]) project = self._process_project_name(args[0]) package = args[1] devel_package = package if len(args) == 4: devel_package = args[3] if not opts.message: footer = textwrap.TextWrapper(width=66).fill( 'please explain why you like to change the devel project of %s/%s to %s/%s' % (project, package, devel_project, devel_package)) opts.message = edit_message(footer) r = Request() r.add_action('change_devel', src_project=devel_project, src_package=devel_package, tgt_project=project, tgt_package=package) r.description = opts.message r.create(self.get_api_url()) print(r.reqid) @cmdln.option('-d', '--diff', action='store_true', help='generate a diff') @cmdln.option('-S', '--superseded-request', metavar='SUPERSEDED_REQUEST', help='Create the diff relative to a given former request') @cmdln.option('-u', '--unified', action='store_true', help='output the diff in the unified diff format') @cmdln.option('--no-devel', action='store_true', help='Do not attempt to forward to devel project') @cmdln.option('-m', '--message', metavar='TEXT', help='specify message TEXT') @cmdln.option('-t', '--type', metavar='TYPE', help='limit to requests which contain a given action type (submit/delete/change_devel/add_role/set_bugowner/maintenance_incident/maintenance_release)') @cmdln.option('-a', '--all', action='store_true', help='all states. 
Same as\'-s all\'') @cmdln.option('-f', '--force', action='store_true', help='enforce state change, can be used to ignore open reviews, devel-package dependencies and more') @cmdln.option('-s', '--state', help='only list requests in one of the comma separated given states (new/review/accepted/revoked/declined) or "all" [default="new,review,declined"]') @cmdln.option('-D', '--days', metavar='DAYS', help='only list requests in state "new" or changed in the last DAYS.') @cmdln.option('-U', '--user', metavar='USER', help='requests or reviews limited for the specified USER') @cmdln.option('-G', '--group', metavar='GROUP', help='requests or reviews limited for the specified GROUP') @cmdln.option('-P', '--project', metavar='PROJECT', help='requests or reviews limited for the specified PROJECT') @cmdln.option('-p', '--package', metavar='PACKAGE', help='requests or reviews limited for the specified PACKAGE, requires also a PROJECT') @cmdln.option('-b', '--brief', action='store_true', default=False, help='print output in list view as list subcommand') @cmdln.option('-M', '--mine', action='store_true', help='only show requests created by yourself') @cmdln.option('-B', '--bugowner', action='store_true', help='also show requests about packages where I am bugowner') @cmdln.option('-e', '--edit', action='store_true', help='edit a submit action') @cmdln.option('-i', '--interactive', action='store_true', help='interactive review of request') @cmdln.option('--or-revoke', action='store_true', help='For automation scripts: accepts (if using with accept argument) a request when it is in new or review state. Or revoke it when it got declined. Otherwise just do nothing.') @cmdln.option('--non-interactive', action='store_true', help='non-interactive review of request') @cmdln.option('--exclude-target-project', action='append', help='exclude target project from request list') @cmdln.option('--keep-packages-locked', action='store_true', help='Avoid unlocking of packages in maintenance incident when revoking release requests') @cmdln.option('--incoming', action='store_true', help='Show only requests where the project is target') @cmdln.option('--involved-projects', action='store_true', help='show all requests for project/packages where USER is involved') @cmdln.option('--target-package-filter', metavar='TARGET_PACKAGE_FILTER', help='only list requests for the packages matching the package filter. A (python) regular expression is expected.') @cmdln.option('--source-buildstatus', action='store_true', help='print the buildstatus of the source package (only works with "show" and the interactive review)') @cmdln.alias("rq") @cmdln.alias("review") # FIXME: rewrite this mess and split request and review def do_request(self, subcmd, opts, *args): """ Show or modify requests and reviews [See http://en.opensuse.org/openSUSE:Build_Service_Collaboration for information on this topic.] The 'request' command has the following sub commands: "list" lists open requests attached to a project or package or person. Uses the project/package of the current directory if none of -M, -U USER, project/package are given. "log" will show the history of the given ID "show" will show the request itself, and generate a diff for review, if used with the --diff option. The keyword show can be omitted if the ID is numeric. "decline" will change the request state to "declined" "reopen" will set the request back to new or review. 
"setincident" will direct "maintenance" requests into specific incidents "supersede" will supersede one request with another existing one. "revoke" will set the request state to "revoked" WARNING: Revoking a maitenance release request unlocks packages in the source project. To avoid unlocking, use the --keep-packages-locked option. "accept" will change the request state to "accepted" and will trigger the actual submit process. That would normally be a server-side copy of the source package to the target package. "approve" marks a requests in "review" state as approved. This request will get accepted automatically when the last review got accepted. "checkout" will checkout the request's source package ("submit" requests only). "prioritize" change the priority of a request to either "critical", "important", "moderate" or "low" The 'review' command has the following sub commands: "list" lists open requests that need to be reviewed by the specified user or group "add" adds a person or group as reviewer to a request "accept" mark the review positive "decline" mark the review negative. A negative review will decline the request. usage: osc request list [-M] [-U USER] [-s state] [-D DAYS] [-t type] [-B] [PRJ [PKG]] osc request log ID osc request [show] [-d] [-b] ID osc request accept [-m TEXT] ID osc request approve [-m TEXT] ID osc request cancelapproval [-m TEXT] ID osc request decline [-m TEXT] ID osc request revoke [-m TEXT] ID osc request reopen [-m TEXT] ID osc request setincident [-m TEXT] ID INCIDENT osc request supersede [-m TEXT] ID SUPERSEDING_ID osc request approvenew [-m TEXT] PROJECT osc request prioritize [-m TEXT] ID PRIORITY osc request checkout/co ID osc request clone [-m TEXT] ID osc review show [-d] [-b] ID osc review list [-U USER] [-G GROUP] [-P PROJECT [-p PACKAGE]] [-s state] osc review add [-m TEXT] [-U USER] [-G GROUP] [-P PROJECT [-p PACKAGE]] ID osc review accept [-m TEXT] [-U USER] [-G GROUP] [-P PROJECT [-p PACKAGE]] ID osc review decline [-m TEXT] [-U USER] [-G GROUP] [-P PROJECT [-p PACKAGE]] ID osc review reopen [-m TEXT] [-U USER] [-G GROUP] [-P PROJECT [-p PACKAGE]] ID osc review supersede [-m TEXT] [-U USER] [-G GROUP] [-P PROJECT [-p PACKAGE]] ID SUPERSEDING_ID """ from . import _private from . 
import conf from .core import ET from .core import change_request_state from .core import change_request_state_template from .core import change_review_state from .core import check_existing_requests from .core import checkout_package from .core import clone_request from .core import create_submit_request from .core import edit_message from .core import get_request from .core import get_request_collection from .core import get_request_log from .core import get_results from .core import get_review_list from .core import get_user_projpkgs_request_list from .core import highlight_diff from .core import http_GET from .core import http_POST from .core import makeurl from .core import print_comments from .core import raw_input from .core import request_diff from .core import request_interactive_review from .core import run_pager from .core import show_package_meta from .core import show_project_meta from .core import slash_split from .core import store_read_package from .core import store_read_project from .core import submit_action_diff args = slash_split(args) if opts.all and opts.state: raise oscerr.WrongOptions('Sorry, the options \'--all\' and \'--state\' ' 'are mutually exclusive.') if opts.mine and opts.user: raise oscerr.WrongOptions('Sorry, the options \'--user\' and \'--mine\' ' 'are mutually exclusive.') if opts.interactive and opts.non_interactive: raise oscerr.WrongOptions('Sorry, the options \'--interactive\' and ' '\'--non-interactive\' are mutually exclusive') if opts.all: state_list = ["all"] else: if not opts.state: opts.state = "new,review,declined" state_list = opts.state.split(",") state_list = [i for i in state_list if i.strip()] if not args: args = ['list'] opts.mine = 1 if opts.incoming: conf.config['include_request_from_project'] = False cmds = ['list', 'ls', 'log', 'show', 'decline', 'reopen', 'clone', 'accept', 'approve', 'cancelapproval', 'approvenew', 'wipe', 'setincident', 'supersede', 'revoke', 'checkout', 'co', 'priorize', 'prioritize'] if subcmd != 'review' and args[0] not in cmds: raise oscerr.WrongArgs('Unknown request action %s. Choose one of %s.' % (args[0], ', '.join(cmds))) cmds = ['show', 'list', 'add', 'decline', 'accept', 'reopen', 'supersede'] if subcmd == 'review' and args[0] not in cmds: raise oscerr.WrongArgs('Unknown review action %s. Choose one of %s.' 
% (args[0], ', '.join(cmds))) cmd = args[0] del args[0] if cmd == 'ls': cmd = "list" apiurl = self.get_api_url() if cmd == 'list': min_args, max_args = 0, 2 elif cmd in ('supersede', 'setincident', 'prioritize', 'priorize'): min_args, max_args = 2, 2 else: min_args, max_args = 1, 1 if len(args) < min_args: raise oscerr.WrongArgs('Too few arguments.') if len(args) > max_args: raise oscerr.WrongArgs('Too many arguments.') if cmd in ['add'] and not opts.user and not opts.group and not opts.project: raise oscerr.WrongArgs('No reviewer specified.') source_buildstatus = conf.config['request_show_source_buildstatus'] or opts.source_buildstatus reqid = None supersedid = None if cmd == 'list' or cmd == 'approvenew': package = None project = None if len(args) > 0: project = self._process_project_name(args[0]) elif opts.project: project = opts.project if opts.package: package = opts.package elif not opts.mine and not opts.user and not opts.group: try: project = store_read_project(Path.cwd()) package = store_read_package(Path.cwd()) except oscerr.NoWorkingCopy: pass if len(args) > 1: package = args[1] elif cmd == 'supersede': reqid = args[0] supersedid = args[1] elif cmd == 'setincident': reqid = args[0] incident = args[1] elif cmd in ['prioritize', 'priorize']: reqid = args[0] priority = args[1] elif cmd in ['log', 'add', 'show', 'decline', 'reopen', 'clone', 'accept', 'wipe', 'revoke', 'checkout', 'co', 'approve', 'cancelapproval']: reqid = args[0] # clone all packages from a given request if cmd in ['clone']: # should we force a message? print(f'Cloned packages are available in project: {clone_request(apiurl, reqid, opts.message)}') # approve request elif cmd == 'approve' or cmd == 'cancelapproval': query = {'cmd': cmd} url = makeurl(apiurl, ['request', reqid], query) r = http_POST(url, data=opts.message) print(xml_parse(r).getroot().get('code')) # change incidents elif cmd == 'setincident': query = {'cmd': 'setincident', 'incident': incident} url = makeurl(apiurl, ['request', reqid], query) r = http_POST(url, data=opts.message) print(xml_parse(r).getroot().get('code')) # change priority elif cmd in ['prioritize', 'priorize']: query = {'cmd': 'setpriority', 'priority': priority} url = makeurl(apiurl, ['request', reqid], query) r = http_POST(url, data=opts.message) print(xml_parse(r).getroot().get('code')) # add new reviewer to existing request elif cmd in ['add'] and subcmd == 'review': query = {'cmd': 'addreview'} if opts.user: query['by_user'] = opts.user if opts.group: query['by_group'] = opts.group if opts.project: query['by_project'] = opts.project if opts.package: query['by_package'] = opts.package url = makeurl(apiurl, ['request', reqid], query) if not opts.message: opts.message = edit_message() r = http_POST(url, data=opts.message) print(xml_parse(r).getroot().get('code')) # list and approvenew elif cmd == 'list' or cmd == 'approvenew': states = ('new', 'accepted', 'revoked', 'declined', 'review', 'superseded') who = '' if cmd == 'approvenew': states = ('new',) results = get_request_collection(apiurl, project=project, package=package, states=['new']) else: for s in state_list: if s != 'all' and s not in states: raise oscerr.WrongArgs(f"Unknown state '{s}', try one of {','.join(states)}") if opts.mine: who = conf.get_apiurl_usr(apiurl) if opts.user: who = opts.user # FIXME -B not implemented! 
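# Three list flavours follow: 'review' queries the review queue for the given
# user/group, --involved-projects collects requests for all projects and
# packages the user is involved in, and the default path filters the request
# collection by project/package/state/type.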
if opts.bugowner: self._debug('list: option --bugowner ignored: not impl.') if subcmd == 'review': # FIXME: do the review list for the user and for all groups he belong to results = get_review_list(apiurl, project, package, who, opts.group, opts.project, opts.package, state_list, opts.type, req_states=("new", "review")) else: if opts.involved_projects: who = who or conf.get_apiurl_usr(apiurl) results = get_user_projpkgs_request_list(apiurl, who, req_state=state_list, req_type=opts.type, exclude_projects=opts.exclude_target_project or []) else: roles = ["creator"] if opts.mine else None types = [opts.type] if opts.type else None results = get_request_collection( apiurl, project=project, package=package, user=who, states=state_list, types=types, roles=roles) # Check if project actually exists if result list is empty if not results: if project: msg = 'No results for %(kind)s %(entity)s' emsg = '%(kind)s %(entity)s does not exist' d = {'entity': [project], 'kind': 'project'} meth = show_project_meta if package: d['kind'] = 'package' d['entity'].append(package) meth = show_package_meta try: entity = d['entity'] d['entity'] = '/'.join(entity) meth(apiurl, *entity) print(msg % d) except HTTPError: print(emsg % d) else: print('No results') return # we must not sort the results here, since the api is doing it already "the right way" days = opts.days or conf.config['request_list_days'] since = '' try: days = float(days) except ValueError: days = 0 if days > 0: since = time.strftime('%Y-%m-%dT%H:%M:%S', time.localtime(time.time() - days * 24 * 3600)) skipped = 0 # bs has received 2009-09-20 a new xquery compare() function # which allows us to limit the list inside of get_request_list # That would be much faster for coolo. But counting the remainder # would not be possible with current xquery implementation. # Workaround: fetch all, and filter on client side. # FIXME: date filtering should become implemented on server side if opts.target_package_filter: filter_pattern = re.compile(opts.target_package_filter) for result in results: filtered = False for action in result.actions: if action.type == 'group' or not opts.target_package_filter: continue if action.tgt_package is not None and not filter_pattern.match(action.tgt_package): filtered = True break if not filtered: if days == 0 or result.state.when > since or result.state.name == 'new': if (opts.interactive or conf.config['request_show_interactive']) and not opts.non_interactive: ignore_reviews = subcmd != 'review' request_interactive_review(apiurl, result, group=opts.group, ignore_reviews=ignore_reviews, source_buildstatus=source_buildstatus) else: print(result.list_view(), '\n') else: skipped += 1 if skipped: print("There are %d requests older than %s days.\n" % (skipped, days)) if cmd == 'approvenew': print("\n *** Approve them all ? 
[y/n] ***") if sys.stdin.read(1) == "y": if not opts.message: opts.message = edit_message() for result in results: print(result.reqid, ": ", end=' ') r = change_request_state(apiurl, result.reqid, 'accepted', opts.message or '', force=opts.force) print(f'Result of change request state: {r}') else: print('Aborted...', file=sys.stderr) raise oscerr.UserAbort() elif cmd == 'log': for l in get_request_log(apiurl, reqid): print(l) # show elif cmd == 'show': r = get_request(apiurl, reqid) if opts.brief: print(r.list_view()) elif opts.edit: if not r.get_actions('submit'): raise oscerr.WrongOptions('\'--edit\' not possible ' '(request has no \'submit\' action)') return request_interactive_review(apiurl, r, 'e') elif (opts.interactive or conf.config['request_show_interactive']) and not opts.non_interactive: ignore_reviews = subcmd != 'review' return request_interactive_review(apiurl, r, group=opts.group, ignore_reviews=ignore_reviews, source_buildstatus=source_buildstatus) else: print(r) print_comments(apiurl, 'request', reqid) if source_buildstatus: sr_actions = r.get_actions('submit') if not sr_actions: raise oscerr.WrongOptions('\'--source-buildstatus\' not possible ' '(request has no \'submit\' actions)') for action in sr_actions: print(f'Buildstatus for \'{action.src_project}/{action.src_package}\':') print('\n'.join(get_results(apiurl, action.src_project, action.src_package))) if opts.diff: diff = b'' try: # works since OBS 2.1 diff = request_diff(apiurl, reqid, opts.superseded_request) except HTTPError as e: if e.code == 404: # Any referenced object does not exist, eg. the superseded request root = xml_fromstring(e.read()) summary = root.find('summary') print(summary.text, file=sys.stderr) raise oscerr.WrongOptions("Object does not exist") # for OBS 2.0 and before sr_actions = r.get_actions('submit') if not r.get_actions('submit') and not r.get_actions('maintenance_incident') and not r.get_actions('maintenance_release'): raise oscerr.WrongOptions('\'--diff\' not possible (request has no supported actions)') for action in sr_actions: diff += b'old: %s/%s\nnew: %s/%s\n' % (action.src_project.encode(), action.src_package.encode(), action.tgt_project.encode(), action.tgt_package.encode()) diff += submit_action_diff(apiurl, action) diff += b'\n\n' run_pager(highlight_diff(diff), tmp_suffix="") # checkout elif cmd in ('checkout', 'co'): r = get_request(apiurl, reqid) sr_actions = r.get_actions('submit', 'maintenance_release') if not sr_actions: raise oscerr.WrongArgs('\'checkout\' not possible (request has no \'submit\' actions)') for action in sr_actions: checkout_package(apiurl, action.src_project, action.src_package, action.src_rev, expand_link=True, prj_dir=Path(action.src_project)) else: state_map = {'reopen': 'new', 'accept': 'accepted', 'decline': 'declined', 'wipe': 'deleted', 'revoke': 'revoked', 'supersede': 'superseded'} # Change review state only if subcmd == 'review': if not opts.message: opts.message = edit_message() if cmd in ('accept', 'decline', 'reopen', 'supersede'): if opts.user or opts.group or opts.project or opts.package: r = change_review_state(apiurl, reqid, state_map[cmd], opts.user, opts.group, opts.project, opts.package, opts.message or '', supersed=supersedid) print(r) else: rq = get_request(apiurl, reqid) if rq.state.name in ('new', 'review'): for review in rq.reviews: # try all, but do not fail on error try: r = change_review_state(apiurl, reqid, state_map[cmd], review.by_user, review.by_group, review.by_project, review.by_package, opts.message or '', 
supersed=supersedid) print(r) except HTTPError as e: body = e.read() if e.code in [403]: if review.by_user: print(f'No permission on review by user {review.by_user}:') if review.by_group: print(f'No permission on review by group {review.by_group}') if review.by_package: print(f'No permission on review by package {review.by_project} / {review.by_package}') elif review.by_project: print(f'No permission on review by project {review.by_project}') print(e, file=sys.stderr) else: print('Request is closed, please reopen the request first before changing any reviews.') # Change state of entire request elif cmd in ['reopen', 'accept', 'decline', 'wipe', 'revoke', 'supersede']: rq = get_request(apiurl, reqid) if opts.or_revoke: if rq.state.name == "declined": cmd = "revoke" elif rq.state.name != "new" and rq.state.name != "review": return 0 if rq.state.name == state_map[cmd]: repl = raw_input("\n *** The state of the request (#%s) is already '%s'. Change state anyway? [y/n] *** " % (reqid, rq.state.name)) if repl.lower() != 'y': print('Aborted...', file=sys.stderr) raise oscerr.UserAbort() if not opts.message: tmpl = change_request_state_template(rq, state_map[cmd]) opts.message = edit_message(template=tmpl) try: r = change_request_state(apiurl, reqid, state_map[cmd], opts.message or '', supersed=supersedid, force=opts.force, keep_packages_locked=opts.keep_packages_locked) print(f'Result of change request state: {r}') except HTTPError as e: print(e, file=sys.stderr) details = e.hdrs.get('X-Opensuse-Errorcode') if details: print(details, file=sys.stderr) root = xml_fromstring(e.read()) summary = root.find('summary') if summary is not None: print(summary.text) if opts.or_revoke: if e.code in [400, 403, 404, 500]: print('Revoking it ...') r = change_request_state(apiurl, reqid, 'revoked', opts.message or '', supersed=supersedid, force=opts.force) sys.exit(1) # check for devel instances after accepted requests if cmd in ['accept']: print(rq) if opts.interactive: _private.forward_request(apiurl, rq, interactive=True) sr_actions = rq.get_actions('submit') for action in sr_actions: u = makeurl(apiurl, ['/search/package'], { 'match': f"([devel[@project='{action.tgt_project}' and @package='{action.tgt_package}']])" }) f = http_GET(u) root = xml_parse(f).getroot() if root.findall('package') and not opts.no_devel: for node in root.findall('package'): project = node.get('project') package = node.get('name') # skip it when this is anyway a link to me link_url = makeurl(apiurl, ['source', project, package]) links_to_project = links_to_package = None try: file = http_GET(link_url) root = xml_parse(file).getroot() link_node = root.find('linkinfo') if link_node is not None: links_to_project = link_node.get('project') or project links_to_package = link_node.get('package') or package except HTTPError as e: if e.code != 404: print(f'Cannot get list of files for {project}/{package}: {e}', file=sys.stderr) except SyntaxError as e: print(f'Cannot parse list of files for {project}/{package}: {e}', file=sys.stderr) if links_to_project == action.tgt_project and links_to_package == action.tgt_package: # links to my request target anyway, no need to forward submit continue print(project, end=' ') if package != action.tgt_package: print("/", package, end=' ') repl = raw_input('\nForward this submit to it? 
([y]/n)') if repl.lower() == 'y' or repl == '': (supersede, reqs) = check_existing_requests(apiurl, action.tgt_project, action.tgt_package, project, package) msg = f"{rq.description} (forwarded request {reqid} from {rq.creator})" rid = create_submit_request(apiurl, action.tgt_project, action.tgt_package, project, package, msg) print(msg) print("New request #", rid) for req in reqs: change_request_state(apiurl, req.reqid, 'superseded', f'superseded by {rid}', rid) @cmdln.option('-r', '--revision', metavar='rev', help='use the specified revision.') @cmdln.option('-R', '--use-plain-revision', action='store_true', help='Do not expand revision the specified or latest rev') @cmdln.option('-u', '--unset', action='store_true', help='remove revision in link, it will point always to latest revision') @cmdln.option('--vrev', metavar='vrev', help='Enforce a given vrev') def do_setlinkrev(self, subcmd, opts, *args): """ Updates a revision number in a source link This command adds or updates a specified revision number in a source link. The current revision of the source is used, if no revision number is specified. usage: osc setlinkrev osc setlinkrev PROJECT [PACKAGE] """ from .core import Package from .core import meta_get_packagelist from .core import parseRevisionOption from .core import set_link_rev apiurl = self.get_api_url() rev = parseRevisionOption(opts.revision)[0] or '' if opts.unset: rev = None args = list(args) if not args: p = Package(Path.cwd()) project = p.prjname package = p.name assert apiurl == p.apiurl if not p.islink(): sys.exit('Local directory is no checked out source link package, aborting') else: project, package = pop_project_package_from_args( args, default_project=".", default_package=".", package_is_optional=True ) if opts.revision and not package: # It is highly unlikely that all links for all packages in a project should be set to the same revision. self.argparse_error("The --revision option requires to specify a package") if package: packages = [package] else: packages = meta_get_packagelist(apiurl, project) for p in packages: try: rev = set_link_rev(apiurl, project, p, revision=rev, expand=not opts.use_plain_revision, vrev=opts.vrev) except HTTPError as e: if e.code != 404: raise print(f"WARNING: Package {project}/{p} has no link", file=sys.stderr) continue if rev is None: print(f"Removed link revision from package {project}/{p}") else: print(f"Set link revision of package {project}/{p} to {rev}") def do_linktobranch(self, subcmd, opts, *args): """ Convert a package containing a classic link with patch to a branch This command tells the server to convert a _link with or without a project.diff to a branch. This is a full copy with a _link file pointing to the branched place. usage: osc linktobranch # from a package working copy osc linktobranch PROJECT PACKAGE """ from .core import Package from .core import link_to_branch apiurl = self.get_api_url() # assume we're in a working copy if no args were specified update_working_copy = not args args = list(args) project, package = pop_project_package_from_args( args, default_project=".", default_package=".", package_is_optional=False ) ensure_no_remaining_args(args) link_to_branch(apiurl, project, package) if update_working_copy: pac = Package(Path.cwd()) pac.update(rev=pac.latest_rev()) @cmdln.option('-m', '--message', metavar='TEXT', help='specify message TEXT') def do_detachbranch(self, subcmd, opts, *args): """ Replace a link with its expanded sources If a package is a link it is replaced with its expanded sources. 
The link does not exist anymore afterwards. usage: osc detachbranch # from a package working copy osc detachbranch PROJECT PACKAGE """ from .core import ET from .core import Linkinfo from .core import copy_pac from .core import delete_files from .core import show_files_meta from .core import show_package_meta apiurl = self.get_api_url() args = list(args) project, package = pop_project_package_from_args( args, default_project=".", default_package=".", package_is_optional=False ) ensure_no_remaining_args(args) try: copy_pac(apiurl, project, package, apiurl, project, package, expand=True, comment=opts.message) except HTTPError as e: root = xml_fromstring(show_files_meta(apiurl, project, package, 'latest', expand=False)) li = Linkinfo() li.read(root.find('linkinfo')) if li.islink() and li.haserror(): try: show_package_meta(apiurl, li.project, li.package) except HTTPError as e: if e.code == 404: print("Link target got removed, dropping link. WARNING: latest submissions in link target might be lost!") delete_files(apiurl, project, package, ['_link']) else: raise oscerr.LinkExpandError(project, package, li.error) elif not li.islink(): print(f'package \'{project}/{package}\' is not a link', file=sys.stderr) else: raise e @cmdln.option('-C', '--cicount', choices=['add', 'copy', 'local'], help='cicount attribute in the link, known values are add, copy, and local; the default in the build service is currently add.') @cmdln.option('-c', '--current', action='store_true', help='link fixed against current revision.') @cmdln.option('-r', '--revision', metavar='rev', help='link the specified revision.') @cmdln.option('-f', '--force', action='store_true', help='overwrite an existing link file if it is there.') @cmdln.option('--disable-build', action='store_true', help='disable building of the linked package') @cmdln.option('-d', '--disable-publish', action='store_true', help='disable publishing of the linked package') @cmdln.option('-N', '--new-package', action='store_true', help='create a link to a not yet existing package') def do_linkpac(self, subcmd, opts, *args): """ "Link" a package to another package A linked package is a clone of another package, plus local modifications. It can be cross-project. The TARGET_PACKAGE name is optional; the source package's name will be used if TARGET_PACKAGE is omitted. Afterwards, you will want to 'checkout TARGET_PROJECT TARGET_PACKAGE'. To add a patch, add the patch as a file and add it to the _link file. You can also specify text which will be inserted at the top of the spec file. See the examples in the _link file. NOTE: In case you want to fix or update another package, you should use the 'branch' command. A branch has correct repositories (and a link) set up by default and will be cleaned up automatically after it was submitted back.
usage: osc linkpac PROJECT PACKAGE TARGET_PROJECT [TARGET_PACKAGE] osc linkpac TARGET_PROJECT [TARGET_PACKAGE] # from a package checkout """ from .core import link_pac from .core import parseRevisionOption from .core import show_upstream_rev_vrev apiurl = self.get_api_url() args = list(args) src_project, src_package, tgt_project, tgt_package = pop_project_package_targetproject_targetpackage_from_args( args, default_project=".", default_package=".", target_package_is_optional=True, ) ensure_no_remaining_args(args) if not tgt_package: tgt_package = src_package rev, dummy = parseRevisionOption(opts.revision) vrev = None if src_project == tgt_project and not opts.cicount: # in this case, the user usually wants to build different spec # files from the same source opts.cicount = "copy" if opts.current and not opts.new_package: rev, vrev = show_upstream_rev_vrev(apiurl, src_project, src_package, expand=True) if rev is None or len(rev) < 32: # vrev is only needed for srcmd5 and OBS instances < 2.1.17 do not support it vrev = None link_pac( src_project, src_package, tgt_project, tgt_package, opts.force, rev, opts.cicount, opts.disable_publish, opts.new_package, vrev, disable_build=opts.disable_build, ) @cmdln.option('--nosources', action='store_true', help='ignore source packages when copying build results to destination project') @cmdln.option('-m', '--map-repo', metavar='SRC=TARGET[,SRC=TARGET]', help='Allows repository mapping(s) to be given as SRC=TARGET[,SRC=TARGET]') @cmdln.option('-d', '--disable-publish', action='store_true', help='disable publishing of the aggregated package') def do_aggregatepac(self, subcmd, opts, *args): """ "Aggregate" a package to another package Aggregation of a package means that the build results (binaries) of a package are basically copied into another project. This can be used to make packages available from building that are needed in a project but available only in a different project. Note that this is done at the expense of disk space. See http://en.opensuse.org/openSUSE:Build_Service_Tips_and_Tricks#link_and_aggregate for more information. The DESTPAC name is optional; the source packages' name will be used if DESTPAC is omitted. usage: osc aggregatepac SOURCEPRJ SOURCEPAC[:FLAVOR] DESTPRJ [DESTPAC] """ from .core import aggregate_pac args = list(args) src_project, src_package, tgt_project, tgt_package = pop_project_package_targetproject_targetpackage_from_args( args, target_package_is_optional=True, ) ensure_no_remaining_args(args) if not tgt_package: tgt_package = src_package repo_map = {} if opts.map_repo: for pair in opts.map_repo.split(','): src_tgt = pair.split('=') if len(src_tgt) != 2: raise oscerr.WrongOptions(f'map "{opts.map_repo}" must be SRC=TARGET[,SRC=TARGET]') repo_map[src_tgt[0]] = src_tgt[1] aggregate_pac(src_project, src_package, tgt_project, tgt_package, repo_map, opts.disable_publish, opts.nosources) @cmdln.option('-c', '--client-side-copy', action='store_true', help='do a (slower) client-side copy') @cmdln.option('-k', '--keep-maintainers', action='store_true', help='keep original maintainers. Default is remove all and replace with the one calling the script.') @cmdln.option('-K', '--keep-link', action='store_true', help='If the target package is a link, the link is kept, but may be updated. 
If the source package is a link, its expanded version is considered.') @cmdln.option('-d', '--keep-develproject', action='store_true', help='keep develproject tag in the package metadata') @cmdln.option('-r', '--revision', metavar='rev', help='copy the specified revision.') @cmdln.option('-t', '--to-apiurl', metavar='URL', help='URL of destination api server. Default is the source api server.') @cmdln.option('-m', '--message', metavar='TEXT', help='specify message TEXT') @cmdln.option('-e', '--expand', action='store_true', help='if the source package is a link then copy the expanded version of the link') def do_copypac(self, subcmd, opts, *args): """ Copy a package A way to copy package to somewhere else. It can be done across buildservice instances, if the -t option is used. In that case, a client-side copy and link expansion are implied. Using --client-side-copy always involves downloading all files, and uploading them to the target. The DESTPAC name is optional; the source packages' name will be used if DESTPAC is omitted. If SOURCEPRJ or DESTPRJ is '.' it will be expanded to the PRJ of the current directory. usage: osc copypac SOURCEPRJ SOURCEPAC DESTPRJ [DESTPAC] """ from . import conf from .core import copy_pac from .core import decode_it from .core import parseRevisionOption from .core import show_upstream_rev args = list(args) src_project, src_package, tgt_project, tgt_package = pop_project_package_targetproject_targetpackage_from_args( args, target_package_is_optional=True, ) ensure_no_remaining_args(args) if not tgt_package: tgt_package = src_package src_apiurl = conf.config['apiurl'] if opts.to_apiurl: tgt_apiurl = conf.config.apiurl_aliases.get(opts.to_apiurl, opts.to_apiurl) else: tgt_apiurl = src_apiurl if src_apiurl != tgt_apiurl: opts.client_side_copy = True opts.expand = True rev, _ = parseRevisionOption(opts.revision) if opts.message: comment = opts.message else: src_rev = rev or show_upstream_rev(src_apiurl, src_project, src_package) comment = f'osc copypac from project:{src_project} package:{src_package} revision:{src_rev}' if opts.keep_link: comment += ", using keep-link" if opts.expand: comment += ", using expand" if opts.client_side_copy: comment += ", using client side copy" r = copy_pac(src_apiurl, src_project, src_package, tgt_apiurl, tgt_project, tgt_package, client_side_copy=opts.client_side_copy, keep_maintainers=opts.keep_maintainers, keep_develproject=opts.keep_develproject, expand=opts.expand, revision=rev, comment=comment, keep_link=opts.keep_link) if r is not None: print(decode_it(r)) @cmdln.option('-a', '--arch', metavar='ARCH', help='Release only binaries from the specified architecture') @cmdln.option('-r', '--repo', metavar='REPO', help='Release only binaries from the specified repository') @cmdln.option('--target-project', metavar='TARGETPROJECT', help='Release only to specified project') @cmdln.option('--target-repository', metavar='TARGETREPOSITORY', help='Release only to specified repository') @cmdln.option('--set-release', metavar='RELEASETAG', help='rename binaries during release using this release tag') @cmdln.option('--no-delay', action='store_true', help="Don't put the release job in a queue to be run later, but immediately run it. Thus the next call to osc prjresult will reflect it. Otherwise there is no way to know if it is finished or didn't start yet.") def do_release(self, subcmd, opts, *args): """ Release sources and binaries This command is used to transfer sources and binaries without rebuilding them. 
It requires defined release targets set to trigger="manual". usage: osc release [PROJECT [PACKAGE]] """ from . import _private apiurl = self.get_api_url() args = list(args) project, package = pop_project_package_from_args( args, default_project=".", default_package=".", package_is_optional=True ) _private.release( apiurl, project=project, package=package, repository=opts.repo, architecture=opts.arch, target_project=opts.target_project, target_repository=opts.target_repository, set_release_to=opts.set_release, delayed=not opts.no_delay, print_to="stdout", ) @cmdln.option('-m', '--message', metavar='TEXT', help='specify message TEXT') @cmdln.option('-p', '--package', metavar='PKG', action='append', help='specify packages to release') def do_releaserequest(self, subcmd, opts, *args): """ Create a release request For maintenance incident projects: This command is used by the maintenance team to start the release process of a maintenance update. This includes usually testing based on the defined reviewers of the update project. [See https://openbuildservice.org/help/manuals/obs-user-guide/cha.obs.maintenance_setup.html for information on this topic.] For normal projects: This command is used to transfer sources and binaries without rebuilding them. It requires defined release targets set to trigger="manual". [See https://openbuildservice.org/help/manuals/obs-user-guide/cha.obs.request_and_review_system.html for information on this topic.] usage: osc releaserequest [-p package] [ SOURCEPROJECT ] """ from .core import ET from .core import Request from .core import create_release_request from .core import edit_message from .core import is_project_dir from .core import show_project_meta from .core import slash_split from .core import store_read_project # FIXME: additional parameters can be a certain repo list to create a partitial release args = slash_split(args) apiurl = self.get_api_url() source_project = None if len(args) > 1: raise oscerr.WrongArgs('Too many arguments.') if len(args) == 0 and is_project_dir(Path.cwd()): source_project = store_read_project(Path.cwd()) elif len(args) == 0: raise oscerr.WrongArgs('Too few arguments.') if len(args) > 0: source_project = self._process_project_name(args[0]) f = show_project_meta(apiurl, source_project) root = xml_fromstring(b''.join(f)) if not opts.message: opts.message = edit_message() if 'kind' in root.attrib and root.attrib['kind'] == 'maintenance_incident': r = create_release_request(apiurl, source_project, opts.message) else: r = Request() if opts.package: for pac in opts.package: r.add_action('release', src_project=source_project, src_package=pac) else: r.add_action('release', src_project=source_project) r.description = opts.message r.create(apiurl) print(r.reqid) @cmdln.option('-a', '--attribute', metavar='ATTRIBUTE', help='Use this attribute to find default maintenance project (default is OBS:MaintenanceProject)') @cmdln.option('--noaccess', action='store_true', help='Create a hidden project') @cmdln.option('-m', '--message', metavar='TEXT', help='specify message TEXT') def do_createincident(self, subcmd, opts, *args): """ Create a maintenance incident [See https://openbuildservice.org/help/manuals/obs-user-guide/cha.obs.maintenance_setup.html for information on this topic.] This command is asking to open an empty maintenance incident. This can usually only be done by a responsible maintenance team. 
Please see the "mbranch" command on how to fill such a project with content and the "patchinfo" command on how to add the required maintenance update information. usage: osc createincident [ MAINTENANCEPROJECT ] """ from . import conf from .core import ET from .core import http_POST from .core import makeurl from .core import search from .core import slash_split args = slash_split(args) apiurl = self.get_api_url() maintenance_attribute = conf.config['maintenance_attribute'] if opts.attribute: maintenance_attribute = opts.attribute source_project = target_project = None if len(args) > 1: raise oscerr.WrongArgs('Too many arguments.') if len(args) == 1: target_project = self._process_project_name(args[0]) else: xpath = f'attribute/@name = \'{maintenance_attribute}\'' res = search(apiurl, project_id=xpath) root = res['project_id'] project = root.find('project') if project is None: sys.exit('Unable to find defined OBS:MaintenanceProject project on server.') target_project = project.get('name') print(f'Using target project \'{target_project}\'') query = {'cmd': 'createmaintenanceincident'} if opts.noaccess: query["noaccess"] = 1 url = makeurl(apiurl, ['source', target_project], query=query) r = http_POST(url, data=opts.message) project = None for i in xml_fromstring(r.read()).findall('data'): if i.get('name') == 'targetproject': project = i.text.strip() if project: print("Incident project created: ", project) else: print(xml_parse(r).getroot().get('code')) print(xml_parse(r).getroot().get('error')) @cmdln.option('-a', '--attribute', metavar='ATTRIBUTE', help='Use this attribute to find default maintenance project (default is OBS:MaintenanceProject)') @cmdln.option('-m', '--message', metavar='TEXT', help='specify message TEXT') @cmdln.option('--release-project', metavar='RELEASEPROJECT', help='Specify the release project') @cmdln.option('--enforce-branching', action='store_true', help='submit from a fresh branched project') @cmdln.option('--no-cleanup', action='store_true', help='do not remove source project on accept') @cmdln.option('--cleanup', action='store_true', help='do remove source project on accept') @cmdln.option('--incident', metavar='INCIDENT', help='specify incident number to merge in') @cmdln.option('--incident-project', metavar='INCIDENT_PROJECT', help='specify incident project to merge in') @cmdln.option('-s', '--supersede', metavar='REQUEST_ID', help='Supersede another request with this one') @cmdln.alias("mr") def do_maintenancerequest(self, subcmd, opts, *args): """ Create a request for starting a maintenance incident [See https://openbuildservice.org/help/manuals/obs-user-guide/cha.obs.maintenance_setup.html for information on this topic.] This command asks the maintenance team to start a maintenance incident based on a created maintenance update. Please see the "mbranch" command on how to create such a project and the "patchinfo" command on how to add the required maintenance update information. usage: osc maintenancerequest [ SOURCEPROJECT [ SOURCEPACKAGES RELEASEPROJECT ] ] osc maintenancerequest . The second form, when issued within a package directory, provides a shortcut for submitting the single package in the current directory from its project to the release project this package links to. This syntax is only valid when specified from a package subdirectory.
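example (the project and package names below are placeholders):
    osc maintenancerequest home:user:branches:MyDistro:Update mypackage MyDistro:Update
    osc mr .    # shortcut form, issued from within a checked out package working copy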
""" # FIXME: the follow syntax would make more sense and would obsolete the --release-project parameter # but is incompatible with the current one # osc maintenancerequest [ SOURCEPROJECT [ RELEASEPROJECT [ SOURCEPACKAGES ] ] from . import _private from . import conf from .core import Package from .core import change_request_state from .core import check_existing_maintenance_requests from .core import create_maintenance_request from .core import edit_message from .core import is_package_dir from .core import is_project_dir from .core import search from .core import slash_split from .core import store_read_package from .core import store_read_project args = slash_split(args) apiurl = self.get_api_url() maintenance_attribute = conf.config['maintenance_attribute'] if opts.attribute: maintenance_attribute = opts.attribute source_project = target_project = release_project = opt_sourceupdate = None source_packages = [] if len(args) == 0 and (is_project_dir(Path.cwd()) or is_package_dir(Path.cwd())): source_project = store_read_project(Path.cwd()) if is_package_dir(Path.cwd()): source_packages = [store_read_package(Path.cwd())] elif len(args) == 0: raise oscerr.WrongArgs('Too few arguments.') if len(args) > 0: if len(args) == 1 and args[0] == '.': if is_package_dir(Path.cwd()): source_project = store_read_project(Path.cwd()) source_packages = [store_read_package(Path.cwd())] p = Package(Path.cwd()) release_project = p.linkinfo.project else: raise oscerr.WrongArgs('No package directory') else: source_project = self._process_project_name(args[0]) if len(args) > 1: if len(args) == 2: sys.exit('Source package defined, but no release project.') source_packages = args[1:] release_project = self._process_project_name(args[-1]) source_packages.remove(release_project) if opts.cleanup: opt_sourceupdate = 'cleanup' if not opts.no_cleanup: default_branch = f'home:{conf.get_apiurl_usr(apiurl)}:branches:' if source_project.startswith(default_branch): opt_sourceupdate = 'cleanup' if opts.release_project: release_project = opts.release_project if opts.incident_project: target_project = opts.incident_project else: xpath = f'attribute/@name = \'{maintenance_attribute}\'' res = search(apiurl, project_id=xpath) root = res['project_id'] project = root.find('project') if project is None: sys.exit('Unable to find defined OBS:MaintenanceProject project on server.') target_project = project.get('name') if opts.incident: target_project += ":" + opts.incident release_in = '' if release_project is not None: release_in = f'. 
(release in \'{release_project}\')' print(f'Using target project \'{target_project}\'{release_in}') if not opts.message: msg = "" if opts.supersede: from .obs_api import Request req = Request.from_api(apiurl, opts.supersede) msg = req.description + "\n" opts.message = edit_message(template=f"{msg}") supersede_existing = False reqs = [] if not opts.supersede: (supersede_existing, reqs) = check_existing_maintenance_requests(apiurl, source_project, source_packages, target_project, None) # unspecified release project r = create_maintenance_request(apiurl, source_project, source_packages, target_project, release_project, opt_sourceupdate, opts.message, opts.enforce_branching) print(r.reqid) if conf.config['print_web_links']: obs_url = _private.get_configuration_value(apiurl, "obs_url") print(f"{obs_url}/request/show/{r.reqid}") if supersede_existing: for req in reqs: change_request_state(apiurl, req.reqid, 'superseded', f'superseded by {r.reqid}', r.reqid) if opts.supersede: change_request_state(apiurl, opts.supersede, 'superseded', opts.message or '', r.reqid) @cmdln.option('-c', '--checkout', action='store_true', help='Checkout branched package afterwards ' '(\'osc bco\' is a shorthand for this option)') @cmdln.option('-a', '--attribute', metavar='ATTRIBUTE', help='Use this attribute to find affected packages (default is OBS:Maintained)') @cmdln.option('-u', '--update-project-attribute', metavar='UPDATE_ATTRIBUTE', help='Use this attribute to find update projects (default is OBS:UpdateProject) ') @cmdln.option('--dryrun', action='store_true', help='Just simulate the action and report back the result.') @cmdln.option('--noaccess', action='store_true', help='Create a hidden project') @cmdln.option('--nodevelproject', action='store_true', help='do not follow a defined devel project ' '(primary project where a package is developed)') @cmdln.option('--version', action='store_true', help='print version of maintained package') @cmdln.alias('sm') @cmdln.alias('maintained') def do_mbranch(self, subcmd, opts, *args): """ Search or branch multiple instances of a package This command is used for searching all relevant instances of packages and creating links of them in one project. This is esp. used for maintenance updates. It can also be used to branch all packages marked before with a given attribute. [See http://en.opensuse.org/openSUSE:Build_Service_Concept_Maintenance for information on this topic.] The branched package will live in home:USERNAME:branches:ATTRIBUTE:PACKAGE if nothing else specified. If osc maintained or sm is issued only the relevant instances of a package will be shown. No branch will be created. This is similar to osc mbranch --dryrun. usage: osc sm [SOURCEPACKAGE] [-a ATTRIBUTE] osc mbranch [ SOURCEPACKAGE [ TARGETPROJECT ] ] """ from . 
import conf from .core import Project from .core import attribute_branch_pkg from .core import checkout_package from .core import get_source_rev from .core import meta_get_packagelist from .core import output from .core import slash_split from .core import statfrmt args = slash_split(args) apiurl = self.get_api_url() tproject = None maintained_attribute = conf.config['maintained_attribute'] if opts.attribute: maintained_attribute = opts.attribute maintained_update_project_attribute = conf.config['maintained_update_project_attribute'] if opts.update_project_attribute: maintained_update_project_attribute = opts.update_project_attribute if not args or len(args) > 2: raise oscerr.WrongArgs('Wrong number of arguments.') if len(args) >= 1: package = args[0] if len(args) >= 2: tproject = self._process_project_name(args[1]) if subcmd in ('maintained', 'sm'): opts.dryrun = 1 result = attribute_branch_pkg(apiurl, maintained_attribute, maintained_update_project_attribute, package, tproject, noaccess=opts.noaccess, nodevelproject=opts.nodevelproject, dryrun=opts.dryrun) if result is None: print('ERROR: Attribute branch call did not return a project.', file=sys.stderr) sys.exit(1) if opts.dryrun: for r in result.findall('package'): line = f"{r.get('project')}/{r.get('package')}" if opts.version: sr = get_source_rev(apiurl, r.get('project'), r.get('package')) version = sr.get('version') if not version or version == 'unknown': version = 'unknown' line = line + f' (version: {version})' for d in r.findall('devel'): line += f" using sources from {d.get('project')}/{d.get('package')}" print(line) return apiopt = '' if conf.get_configParser().get("general", "apiurl", fallback=None) != apiurl: apiopt = f'-A {apiurl} ' print('A working copy of the maintenance branch can be checked out with:\n\n' 'osc %sco %s' % (apiopt, result)) if opts.checkout: Project.init_project(apiurl, result, result, conf.config['do_package_tracking']) print(statfrmt('A', result)) # all packages for package in meta_get_packagelist(apiurl, result): try: checkout_package(apiurl, result, package, expand_link=True, prj_dir=Path(result)) except: print('Error while checking out package:\n', package, file=sys.stderr) output.print_msg('Note: You can use "osc delete" or "osc submitpac" when done.\n', print_to="verbose") @cmdln.alias('branchco') @cmdln.alias('bco') @cmdln.alias('getpac') @cmdln.option('--nodevelproject', action='store_true', help='do not follow a defined devel project ' '(primary project where a package is developed)') @cmdln.option('-c', '--checkout', action='store_true', help='Checkout branched package afterwards using "co -e -S" ' '(\'osc bco\' is a shorthand for this option)') @cmdln.option('-f', '--force', default=False, action="store_true", help='force branch, overwrite target') @cmdln.option('--add-repositories', default=False, action="store_true", help='Add repositories to target project (happens by default when project is new)') @cmdln.option('--extend-package-names', default=False, action="store_true", help='Extend package names with the project name as suffix') @cmdln.option('--noaccess', action='store_true', help='Create a hidden project') @cmdln.option('-m', '--message', metavar='TEXT', help='specify message TEXT') @cmdln.option('-M', '--maintenance', default=False, action="store_true", help='Create project and package in maintenance mode') @cmdln.option('-N', '--new-package', action='store_true', help='create a branch pointing to a not yet existing package') @cmdln.option('-r', '--revision', metavar='rev', help='branch 
against a specific revision') @cmdln.option('--linkrev', metavar='linkrev', help='specify the used revision in the link target.') @cmdln.option('--add-repositories-block', metavar='add_repositories_block', help='specify the used block strategy for new repositories') @cmdln.option('--add-repositories-rebuild', metavar='add_repositories_rebuild', help='specify the used rebuild strategy for new repositories') @cmdln.option('--disable-build', action='store_true', help='disable building of the branched package') def do_branch(self, subcmd, opts, *args): """ Branch a package [See http://en.opensuse.org/openSUSE:Build_Service_Collaboration for information on this topic.] Create a source link from a package of an existing project to a new subproject of the requesters home project (home:branches:) The branched package will live in home:USERNAME:branches:PROJECT/PACKAGE if nothing else specified. With getpac or bco, the branched package will come from one of %(getpac_default_project)s (list of projects from oscrc:getpac_default_project) if nothing else is specified on the command line. In case of branch errors, where the source has currently merge conflicts use --linkrev=base option. usage: osc branch osc branch SOURCEPROJECT SOURCEPACKAGE osc branch SOURCEPROJECT SOURCEPACKAGE TARGETPROJECT|. osc branch SOURCEPROJECT SOURCEPACKAGE TARGETPROJECT|. TARGETPACKAGE osc getpac SOURCEPACKAGE osc bco ... """ from . import conf from .core import ET from .core import branch_pkg from .core import checkout_package from .core import find_default_project from .core import is_package_dir from .core import output from .core import print_request_list from .core import show_attribute_meta from .core import slash_split from .core import store_read_package from .core import store_read_project if subcmd in ('getpac', 'branchco', 'bco'): opts.checkout = True args = slash_split(args) tproject = tpackage = None if subcmd in ('getpac', 'bco') and len(args) == 1: def_p = find_default_project(self.get_api_url(), args[0]) print(f'defaulting to {def_p}/{args[0]}', file=sys.stderr) # python has no args.unshift ??? 
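# Prepend the guessed default project so that the single-argument
# 'osc getpac PACKAGE' / 'osc bco PACKAGE' form is handled below exactly like
# 'osc branch <default_project> PACKAGE'; e.g. ['osc'] becomes
# ['openSUSE:Factory', 'osc'] when openSUSE:Factory is the configured
# getpac_default_project.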
args = [def_p, args[0]] if len(args) == 0 and is_package_dir('.'): args = (store_read_project('.'), store_read_package('.')) if len(args) < 2 or len(args) > 4: raise oscerr.WrongArgs('Wrong number of arguments.') apiurl = self.get_api_url() expected = f'home:{conf.get_apiurl_usr(apiurl)}:branches:{args[0]}' if len(args) >= 3: expected = tproject = self._process_project_name(args[2]) if len(args) >= 4: tpackage = args[3] try: exists, targetprj, targetpkg, srcprj, srcpkg = \ branch_pkg(apiurl, self._process_project_name(args[0]), args[1], nodevelproject=opts.nodevelproject, rev=opts.revision, linkrev=opts.linkrev, target_project=tproject, target_package=tpackage, return_existing=opts.checkout, msg=opts.message or '', force=opts.force, noaccess=opts.noaccess, add_repositories=opts.add_repositories, add_repositories_block=opts.add_repositories_block, add_repositories_rebuild=opts.add_repositories_rebuild, extend_package_names=opts.extend_package_names, missingok=opts.new_package, maintenance=opts.maintenance, disable_build=opts.disable_build) except oscerr.NotMissing as e: print('NOTE: Package target exists already via project links, link will point to given project.') print(' A submission will initialize a new instance.') exists, targetprj, targetpkg, srcprj, srcpkg = \ branch_pkg(apiurl, self._process_project_name(args[0]), args[1], nodevelproject=opts.nodevelproject, rev=opts.revision, linkrev=opts.linkrev, target_project=tproject, target_package=tpackage, return_existing=opts.checkout, msg=opts.message or '', force=opts.force, noaccess=opts.noaccess, add_repositories=opts.add_repositories, add_repositories_block=opts.add_repositories_block, add_repositories_rebuild=opts.add_repositories_rebuild, extend_package_names=opts.extend_package_names, missingok=False, maintenance=opts.maintenance, newinstance=opts.new_package, disable_build=opts.disable_build) if exists: print(f'Using existing branch project: {targetprj}', file=sys.stderr) devloc = None if not exists and (srcprj != self._process_project_name(args[0]) or srcpkg != args[1]): try: root = xml_fromstring(b''.join(show_attribute_meta(apiurl, args[0], None, None, conf.config['maintained_update_project_attribute'], None, None))) # this might raise an AttributeError uproject = root.find('attribute').find('value').text print('\nNote: The branch has been created from the configured update project: %s' % uproject) except (AttributeError, HTTPError) as e: devloc = srcprj print('\nNote: The branch has been created of a different project,\n' ' %s,\n' ' which is the primary location of where development for\n' ' that package takes place.\n' ' That\'s also where you would normally make changes against.\n' ' A direct branch of the specified package can be forced\n' ' with the --nodevelproject option.\n' % devloc) package = targetpkg or args[1] if opts.checkout: checkout_package(apiurl, targetprj, package, server_service_files=False, expand_link=True, prj_dir=Path(targetprj)) output.print_msg('Note: You can use "osc delete" or "osc submitpac" when done.\n', print_to="verbose") else: apiopt = '' if conf.get_configParser().get("general", "apiurl", fallback=None) != apiurl: apiopt = f'-A {apiurl} ' print('A working copy of the branched package can be checked out with:\n\n' 'osc %sco %s/%s' % (apiopt, targetprj, package)) print_request_list(apiurl, args[0], args[1]) if devloc: print_request_list(apiurl, devloc, srcpkg) @cmdln.option('-m', '--message', metavar='TEXT', help='specify log message TEXT') def do_undelete(self, subcmd, opts, *args): """ 
Restores a deleted project or package on the server The server restores a package including the sources and meta configuration. Binaries remain lost and will be rebuilt. usage: osc undelete PROJECT osc undelete PROJECT PACKAGE """ from .core import edit_message from .core import undelete_package from .core import undelete_project apiurl = self.get_api_url() args = list(args) project, package = pop_project_package_from_args( args, package_is_optional=True ) ensure_no_remaining_args(args) msg = opts.message or edit_message() if package: undelete_package(apiurl, project, package, msg) else: undelete_project(apiurl, project, msg) @cmdln.option('-r', '--recursive', action='store_true', help='deletes a project with packages inside') @cmdln.option('-f', '--force', action='store_true', help='deletes a project that other projects depend on') @cmdln.option('-m', '--message', metavar='TEXT', help='specify log message TEXT') def do_rdelete(self, subcmd, opts, *args): """ Delete a project or packages on the server As a safety measure, the project must be empty (i.e., you need to delete all packages first). Also, packages must have no requests pending (i.e., you need to accept/revoke such requests first). If you are sure that you want to remove this project and all its packages, use the \'--recursive\' switch. It may still not work because other projects depend on it. If you want to ignore this as well, use the \'--force\' switch. usage: osc rdelete [-r] [-f] PROJECT [PACKAGE] """ from .core import delete_package from .core import delete_project from .core import edit_message apiurl = self.get_api_url() args = list(args) project, package = pop_project_package_from_args( args, package_is_optional=True ) ensure_no_remaining_args(args) msg = opts.message or edit_message() # empty arguments result in recursive project delete ... if not project: raise oscerr.WrongArgs('Project argument is empty') if package: delete_package(apiurl, project, package, opts.force, msg) else: delete_project(apiurl, project, opts.force, msg, recursive=opts.recursive) @cmdln.option('-m', '--message', metavar='TEXT', help='specify log message TEXT') def do_lock(self, subcmd, opts, *args): """ Locks a project or package usage: osc lock PROJECT [PACKAGE] """ from .core import lock apiurl = self.get_api_url() args = list(args) project, package = pop_project_package_from_args( args, package_is_optional=True ) ensure_no_remaining_args(args) # TODO: make consistent with unlock and require a message? lock(apiurl, project, package, opts.message) @cmdln.option('-m', '--message', metavar='TEXT', help='specify log message TEXT') def do_unlock(self, subcmd, opts, *args): """ Unlocks a project or package Unlocks a locked project or package. A comment is required. usage: osc unlock PROJECT [PACKAGE] """ from .core import edit_message from .core import unlock_package from .core import unlock_project apiurl = self.get_api_url() args = list(args) project, package = pop_project_package_from_args( args, package_is_optional=True ) ensure_no_remaining_args(args) msg = opts.message or edit_message() if package: unlock_package(apiurl, project, package, msg) else: unlock_project(apiurl, project, msg) @cmdln.alias('metafromspec') @cmdln.alias('updatepkgmetafromspec') @cmdln.option('', '--specfile', metavar='FILE', help='Path to specfile. (if you pass more than one working copy this option is ignored)') def do_updatepacmetafromspec(self, subcmd, opts, *args): """ Update package meta information from a specfile ARG, if specified, is a package working copy.
""" from .core import Package from .core import parseargs args = parseargs(args) if opts.specfile and len(args) == 1: specfile = opts.specfile else: specfile = None pacs = Package.from_paths(args) for p in pacs: p.read_meta_from_spec(specfile) p.update_package_meta() @cmdln.alias('linkdiff') @cmdln.alias('ldiff') @cmdln.alias('di') @cmdln.option('-c', '--change', metavar='rev', help='the change made by revision rev (like -r rev-1:rev).' 'If rev is negative this is like -r rev:rev-1.') @cmdln.option('-r', '--revision', metavar='rev1[:rev2]', help='If rev1 is specified it will compare your working copy against ' 'the revision (rev1) on the server. ' 'If rev1 and rev2 are specified it will compare rev1 against rev2 ' '(NOTE: changes in your working copy are ignored in this case)') @cmdln.option('-M', '--meta', action='store_true', help='operate on meta files') @cmdln.option('-p', '--plain', action='store_true', help='output the diff in plain (not unified) diff format') @cmdln.option('-l', '--link', action='store_true', help='(osc linkdiff): compare against the base revision of the link') @cmdln.option('--missingok', action='store_true', help='do not fail if the source or target project/package does not exist on the server') @cmdln.option('-u', '--unexpand', action='store_true', help='Local changes only, ignore changes in linked package sources') def do_diff(self, subcmd, opts, *args): """ Generates a diff Generates a diff, comparing local changes against the repository server. usage: ARG, if specified, is a filename to include in the diff. Default: all files. osc diff --link osc linkdiff Compare current checkout directory against the link base. osc diff --link PROJ PACK osc linkdiff PROJ PACK Compare a package against the link base (ignoring working copy changes). """ from .core import ET from .core import Package from .core import highlight_diff from .core import http_GET from .core import makeurl from .core import parseRevisionOption from .core import parseargs from .core import run_pager from .core import server_diff from .core import server_diff_noex if (subcmd in ('ldiff', 'linkdiff')): opts.link = True args = parseargs(args) pacs = None if not opts.link or not len(args) == 2: pacs = Package.from_paths(args) if opts.link: query = {'rev': 'latest'} if pacs: apiurl = pacs[0].apiurl project = pacs[0].prjname package = pacs[0].name else: apiurl = self.get_api_url() project = args[0] package = args[1] u = makeurl(apiurl, ['source', project, package], query=query) f = http_GET(u) root = xml_parse(f).getroot() linkinfo = root.find('linkinfo') if linkinfo is None: raise oscerr.APIError('package is not a source link') baserev = linkinfo.get('baserev') opts.revision = baserev print(f"diff committed package against linked revision {baserev}\n") run_pager( highlight_diff( server_diff( self.get_api_url(), linkinfo.get("project"), linkinfo.get("package"), baserev, project, package, linkinfo.get("lsrcmd5"), not opts.plain, opts.missingok, ) ) ) return if opts.change: try: rev = int(opts.change) if rev > 0: rev1 = rev - 1 rev2 = rev elif rev < 0: rev1 = -rev rev2 = -rev - 1 else: return except: print(f'Revision \'{opts.change}\' not an integer', file=sys.stderr) return else: rev1, rev2 = parseRevisionOption(opts.revision) diff = b'' for pac in pacs: if not rev2: for i in pac.get_diff(rev1): diff += b''.join(i) else: if args == ["."]: # parseargs() returns ["."] (list with workdir) if no args are specified # "." 
is illegal filename that causes server to return 400 files = None else: files = args diff += server_diff_noex(pac.apiurl, pac.prjname, pac.name, rev1, pac.prjname, pac.name, rev2, not opts.plain, opts.missingok, opts.meta, not opts.unexpand, files=files) run_pager(highlight_diff(diff)) @cmdln.option('--issues-only', action='store_true', help='show only issues in diff') @cmdln.option('-M', '--meta', action='store_true', help='diff meta data') @cmdln.option('-r', '--revision', metavar='N[:M]', help='revision id, where N = old revision and M = new revision') @cmdln.option('-p', '--plain', action='store_true', help='output the diff in plain (not unified) diff format' ' and show diff of files in archives') @cmdln.option('-c', '--change', metavar='rev', help='the change made by revision rev (like -r rev-1:rev). ' 'If rev is negative this is like -r rev:rev-1.') @cmdln.option('--missingok', action='store_true', help='do not fail if the source or target project/package does not exist on the server') @cmdln.option('-u', '--unexpand', action='store_true', help='diff unexpanded version if sources are linked') @cmdln.option('--xml', action='store_true', help='show diff as xml (only for issues diff)') def do_rdiff(self, subcmd, opts, *args): """ Server-side "pretty" diff of two packages Compares two packages (three or four arguments) or shows the changes of a specified revision of a package (two arguments) If no revision is specified the latest revision is used. usage: osc rdiff OLDPRJ OLDPAC NEWPRJ [NEWPAC] osc rdiff PROJECT PACKAGE osc rdiff PROJECT --meta """ from .core import decode_it from .core import highlight_diff from .core import parseRevisionOption from .core import run_pager from .core import server_diff_noex apiurl = self.get_api_url() args = list(args) old_project, old_package, new_project, new_package = pop_project_package_targetproject_targetpackage_from_args( args, package_is_optional=True, target_project_is_optional=True, target_package_is_optional=True, ) ensure_no_remaining_args(args) if not new_project: new_project = old_project if not new_package: new_package = old_package if not old_package: new_project = old_project new_package = "_project" old_project = None old_package = None if opts.meta: opts.unexpand = True rev1 = None rev2 = None if opts.change: try: rev = int(opts.change) if rev > 0: rev1 = rev - 1 rev2 = rev elif rev < 0: rev1 = -rev rev2 = -rev - 1 else: return except: print(f'Revision \'{opts.change}\' not an integer', file=sys.stderr) return else: if opts.revision: rev1, rev2 = parseRevisionOption(opts.revision) rdiff = server_diff_noex(apiurl, old_project, old_package, rev1, new_project, new_package, rev2, not opts.plain, opts.missingok, meta=opts.meta, expand=not opts.unexpand, onlyissues=opts.issues_only, xml=opts.xml) if opts.issues_only: print(decode_it(rdiff)) else: run_pager(highlight_diff(rdiff)) def _pdiff_raise_non_existing_package(self, project, package, msg=None): raise oscerr.PackageMissing(project, package, msg or f'{project}/{package} does not exist.') def _pdiff_package_exists(self, apiurl, project, package): from .core import show_package_meta try: show_package_meta(apiurl, project, package) return True except HTTPError as e: if e.code != 404: print(f'Cannot check that {project}/{package} exists: {e}', file=sys.stderr) return False def _pdiff_guess_parent(self, apiurl, project, package, check_exists_first=False): # Make sure the parent exists if check_exists_first and not self._pdiff_package_exists(apiurl, project, package): 
self._pdiff_raise_non_existing_package(project, package) if project.startswith('home:'): guess = project[len('home:'):] # remove user name pos = guess.find(':') if pos > 0: guess = guess[guess.find(':') + 1:] if guess.startswith('branches:'): guess = guess[len('branches:'):] return (guess, package) return (None, None) def _pdiff_get_parent_from_link(self, apiurl, project, package): from .core import ET from .core import http_GET from .core import makeurl link_url = makeurl(apiurl, ['source', project, package, '_link']) try: file = http_GET(link_url) root = xml_parse(file).getroot() except HTTPError as e: return (None, None) except SyntaxError as e: print(f'Cannot parse {project}/{package}/_link: {e}', file=sys.stderr) return (None, None) parent_project = root.get('project') parent_package = root.get('package') or package if parent_project is None: return (None, None) return (parent_project, parent_package) def _pdiff_get_exists_and_parent(self, apiurl, project, package): from .core import ET from .core import http_GET from .core import makeurl link_url = makeurl(apiurl, ['source', project, package]) try: file = http_GET(link_url) root = xml_parse(file).getroot() except HTTPError as e: if e.code != 404: print(f'Cannot get list of files for {project}/{package}: {e}', file=sys.stderr) return (None, None, None) except SyntaxError as e: print(f'Cannot parse list of files for {project}/{package}: {e}', file=sys.stderr) return (None, None, None) link_node = root.find('linkinfo') if link_node is None: return (True, None, None) parent_project = link_node.get('project') parent_package = link_node.get('package') or package if parent_project is None: raise oscerr.APIError(f'{project}/{package} is a link with no parent?') return (True, parent_project, parent_package) @cmdln.option('-p', '--plain', action='store_true', dest='plain', help='output the diff in plain (not unified) diff format') @cmdln.option('-n', '--nomissingok', action='store_true', dest='nomissingok', help='fail if the parent package does not exist on the server') def do_pdiff(self, subcmd, opts, *args): """ Quick alias to diff the content of a package with its parent usage: osc pdiff [--plain|-p] [--nomissing-ok|-n] osc pdiff [--plain|-p] [--nomissing-ok|-n] PKG osc pdiff [--plain|-p] [--nomissing-ok|-n] PRJ PKG """ from .core import highlight_diff from .core import is_package_dir from .core import is_project_dir from .core import run_pager from .core import server_diff from .core import slash_split from .core import store_read_package from .core import store_read_project apiurl = self.get_api_url() args = slash_split(args) unified = not opts.plain noparentok = not opts.nomissingok if len(args) > 2: raise oscerr.WrongArgs('Too many arguments.') if len(args) == 0: if not is_package_dir(Path.cwd()): raise oscerr.WrongArgs('Current directory is not a checked out package. Please specify a project and a package.') project = store_read_project(Path.cwd()) package = store_read_package(Path.cwd()) elif len(args) == 1: if not is_project_dir(Path.cwd()): raise oscerr.WrongArgs('Current directory is not a checked out project. 
Please specify a project and a package.') project = store_read_project(Path.cwd()) package = args[0] elif len(args) == 2: project = self._process_project_name(args[0]) package = args[1] else: raise RuntimeError('Internal error: bad check for arguments.') # Find parent package # Old way, that does one more request to api #(parent_project, parent_package) = self._pdiff_get_parent_from_link(apiurl, project, package) # if not parent_project: # (parent_project, parent_package) = self._pdiff_guess_parent(apiurl, project, package, check_exists_first = True) # if parent_project and parent_package: # print 'Guessed that %s/%s is the parent package.' % (parent_project, parent_package) # New way (exists, parent_project, parent_package) = self._pdiff_get_exists_and_parent(apiurl, project, package) if not exists: self._pdiff_raise_non_existing_package(project, package) if not parent_project: (parent_project, parent_package) = self._pdiff_guess_parent(apiurl, project, package, check_exists_first=False) if parent_project and parent_package: print(f'Guessed that {parent_project}/{parent_package} is the parent package.') if not parent_project or not parent_package: print(f'Cannot find a parent for {project}/{package} to diff against.', file=sys.stderr) return 1 if not noparentok and not self._pdiff_package_exists(apiurl, parent_project, parent_package): self._pdiff_raise_non_existing_package(parent_project, parent_package, msg='Parent for %s/%s (%s/%s) does not exist.' % (project, package, parent_project, parent_package)) rdiff = server_diff(apiurl, parent_project, parent_package, None, project, package, None, unified=unified, missingok=noparentok) run_pager(highlight_diff(rdiff)) def _get_branch_parent(self, prj): m = re.match('^home:[^:]+:branches:(.+)', prj) # OBS_Maintained is a special case if m and prj.find(':branches:OBS_Maintained:') == -1: return m.group(1) return None def _prdiff_skip_package(self, opts, pkg): if opts.exclude and re.search(opts.exclude, pkg): return True if opts.include and not re.search(opts.include, pkg): return True return False def _prdiff_output_diff(self, opts, rdiff): from .core import decode_it if opts.diffstat: print() with subprocess.Popen("diffstat", stdin=subprocess.PIPE, stdout=subprocess.PIPE, close_fds=True) as p: p.stdin.write(rdiff) p.stdin.close() print("".join(decode_it(x) for x in p.stdout.readlines())) elif opts.unified: print() if isinstance(rdiff, str): print(rdiff) else: try: sys.stdout.buffer.write(rdiff) except AttributeError as e: print(decode_it(rdiff)) # run_pager(rdiff) def _prdiff_output_matching_requests(self, opts, requests, srcprj, pkg): """ Search through the given list of requests and output any submitrequests which target pkg and originate from srcprj. 
""" for req in requests: for action in req.get_actions('submit'): if action.src_project != srcprj: continue if action.tgt_package != pkg: continue print() print(req.list_view()) break @cmdln.alias('projectdiff') @cmdln.alias('projdiff') @cmdln.option('-r', '--requests', action='store_true', help='show open requests for any packages with differences') @cmdln.option('-e', '--exclude', metavar='REGEXP', dest='exclude', help='skip packages matching REGEXP') @cmdln.option('-i', '--include', metavar='REGEXP', dest='include', help='only consider packages matching REGEXP') @cmdln.option('-n', '--show-not-in-old', action='store_true', help='show packages only in the new project') @cmdln.option('-o', '--show-not-in-new', action='store_true', help='show packages only in the old project') @cmdln.option('-u', '--unified', action='store_true', help='show full unified diffs of differences') @cmdln.option('-d', '--diffstat', action='store_true', help='show diffstat of differences') def do_prdiff(self, subcmd, opts, *args): """ Server-side diff of two projects Compares two projects and either summarizes or outputs the differences in full. In the second form, a project is compared with one of its branches inside a home:$USER project (the branch is treated as NEWPRJ). The home branch is optional if the current working directory is a checked out copy of it. usage: osc prdiff [OPTIONS] OLDPRJ NEWPRJ osc prdiff [OPTIONS] [home:$USER:branch:$PRJ] """ from .core import Project from .core import get_request_collection from .core import is_project_dir from .core import meta_get_packagelist from .core import server_diff_noex if len(args) > 2: raise oscerr.WrongArgs('Too many arguments.') if len(args) == 0: if is_project_dir(Path.cwd()): newprj = Project('.', getPackageList=False).name oldprj = self._get_branch_parent(newprj) if oldprj is None: raise oscerr.WrongArgs('Current directory is not a valid home branch.') else: raise oscerr.WrongArgs('Current directory is not a project.') elif len(args) == 1: newprj = self._process_project_name(args[0]) oldprj = self._get_branch_parent(newprj) if oldprj is None: raise oscerr.WrongArgs('Single-argument form must be for a home branch.') elif len(args) == 2: oldprj = self._process_project_name(args[0]) newprj = self._process_project_name(args[1]) else: raise RuntimeError('BUG in argument parsing, please report.\n' 'args: ' + repr(args)) if opts.diffstat and opts.unified: print('error - cannot specify both --diffstat and --unified', file=sys.stderr) sys.exit(1) apiurl = self.get_api_url() old_packages = meta_get_packagelist(apiurl, oldprj) new_packages = meta_get_packagelist(apiurl, newprj) if opts.requests: requests = get_request_collection(apiurl, project=oldprj, states=('new', 'review')) for pkg in old_packages: if self._prdiff_skip_package(opts, pkg): continue if pkg not in new_packages: if opts.show_not_in_new: print(f"old only: {pkg}") continue rdiff = server_diff_noex( apiurl, oldprj, pkg, None, newprj, pkg, None, unified=True, missingok=False, meta=False, expand=True ) if rdiff: print(f"differs: {pkg}") self._prdiff_output_diff(opts, rdiff) if opts.requests: self._prdiff_output_matching_requests(opts, requests, newprj, pkg) else: print(f"identical: {pkg}") for pkg in new_packages: if self._prdiff_skip_package(opts, pkg): continue if pkg not in old_packages: if opts.show_not_in_old: print(f"new only: {pkg}") def do_repourls(self, subcmd, opts, *args): """ Shows URLs of .repo files Shows URLs on which to access the project repositories. 
usage: osc repourls [PROJECT] """ from . import _private from .core import decode_it from .core import get_buildconfig from .core import get_repositories_of_project from .core import return_external from .core import store_read_project def _repo_type(apiurl, project, repo): if not os.path.exists('/usr/lib/build/queryconfig'): return None build_config = get_buildconfig(apiurl, project, repo) with tempfile.NamedTemporaryFile() as f: f.write(build_config) f.flush() repo_type = return_external('/usr/lib/build/queryconfig', '--dist', f.name, 'repotype').rstrip(b'\n') if not repo_type: return None return decode_it(repo_type) apiurl = self.get_api_url() if len(args) == 1: project = self._process_project_name(args[0]) elif len(args) == 0: project = store_read_project('.') else: raise oscerr.WrongArgs('Wrong number of arguments') download_url = _private.get_configuration_value(apiurl, "download_url") url_deb_tmpl = 'deb ' + download_url + '/%s/%s/ /' url_arch_tmpl = 'Server=' + download_url + '/%s/%s/$arch' url_tmpl = download_url + '/%s/%s/%s.repo' repos = get_repositories_of_project(apiurl, project) for repo in repos: repo_type = _repo_type(apiurl, project, repo) if repo_type == 'debian': print(url_deb_tmpl % (project.replace(':', ':/'), repo)) elif repo_type == 'arch': print(url_arch_tmpl % (project.replace(':', ':/'), repo)) else: # We assume everything else is rpm-md print(url_tmpl % (project.replace(':', ':/'), repo, project)) def do_browse(self, subcmd, opts, *args): """ Opens browser usage: osc browse [PROJECT [PACKAGE]] osc browse [REQUEST_ID] """ from . import _private from .core import run_external args = list(args) apiurl = self.get_api_url() obs_url = _private.get_configuration_value(apiurl, "obs_url") if len(args) == 1 and args[0].isnumeric(): reqid = args.pop(0) url = f"{obs_url}/request/show/{reqid}" else: project, package = pop_project_package_from_args( args, default_project=".", default_package=".", package_is_optional=True ) if package: url = f"{obs_url}/package/show/{project}/{package}" else: url = f"{obs_url}/project/show/{project}" ensure_no_remaining_args(args) run_external('xdg-open', url) @cmdln.option('-r', '--revision', metavar='rev', help='checkout the specified revision. ' 'NOTE: if you checkout the complete project ' 'this option is ignored!') @cmdln.option('-e', '--expand-link', action='store_true', help='if a package is a link, check out the expanded ' 'sources (no-op, since this became the default)') @cmdln.option('-D', '--deleted', action='store_true', help='checkout an already deleted package. 
No meta information is included.') @cmdln.option('-u', '--unexpand-link', action='store_true', help='if a package is a link, check out the _link file ' 'instead of the expanded sources') @cmdln.option('-M', '--meta', action='store_true', help='check out meta data instead of sources') @cmdln.option('-c', '--current-dir', action='store_true', help='place PACKAGE folder in the current directory ' 'instead of a PROJECT/PACKAGE directory') @cmdln.option('-o', '--output-dir', metavar='outdir', help='place package in the specified directory ' 'instead of a PROJECT/PACKAGE directory') @cmdln.option('-s', '--source-service-files', action='store_true', help='Run source services.') @cmdln.option('-S', '--server-side-source-service-files', action='store_true', help='Use server side generated sources instead of local generation.') @cmdln.option('-l', '--limit-size', metavar='limit_size', help='Skip all files exceeding the given size') @cmdln.option('--native-obs-package', action='store_true', help='Do not clone native scm repositories: Different representation and you will not be able to submit changes!') @cmdln.alias('co') def do_checkout(self, subcmd, opts, *args): """ Check out content from the repository Check out content from the repository server, creating a local working copy. When checking out a single package, the option --revision can be used to specify a revision of the package to be checked out. When a package is a source link, it will be checked out in expanded form. If the --unexpand-link option is used, the checkout will instead produce the raw _link file plus patches. usage: osc co PROJECT [PACKAGE] [FILE] osc co PROJECT # entire project osc co PROJECT PACKAGE # a package osc co PROJECT PACKAGE FILE # single file -> to current dir while inside a project directory: osc co PACKAGE # check out PACKAGE from project with the result of rpm -q --qf '%%{DISTURL}\\n' PACKAGE osc co obs://API/PROJECT/PLATFORM/REVISION-PACKAGE """ from . 
import conf from .core import ET from .core import Linkinfo from .core import Project from .core import checkRevision from .core import checkout_deleted_package from .core import checkout_package from .core import get_osc_version from .core import get_source_file from .core import is_project_dir from .core import meta_get_packagelist from .core import parseRevisionOption from .core import print_request_list from .core import revision_is_empty from .core import run_obs_scm_bridge from .core import show_files_meta from .core import show_project_meta from .core import show_scmsync from .core import show_upstream_srcmd5 from .core import slash_split from .core import statfrmt from .core import store_read_project if opts.unexpand_link: expand_link = False else: expand_link = True if not args: self.argparse_error("Incorrect number of arguments.") # A DISTURL can be found in build results to be able to relocate the source used to build # obs://$OBS_INSTANCE/$PROJECT/$REPOSITORY/$XSRCMD5-$PACKAGE(:$FLAVOR) # obs://build.opensuse.org/openSUSE:11.3/standard/fc6c25e795a89503e99d59da5dc94a79-screen m = re.match(r"obs://([^/]+)/(\S+)/([^/]+)/([A-Fa-f\d]+)\-([^:]*)(:\S+)?", args[0]) if m and len(args) == 1: apiurl = "https://" + m.group(1) project = m.group(2) project_dir = Path(project) # platform = m.group(3) opts.revision = m.group(4) package = m.group(5) apiurl = apiurl.replace('/build.', '/api.') filename = None else: args = slash_split(args) project = package = filename = None apiurl = self.get_api_url() try: project = self._process_project_name(args[0]) project_dir = Path(project) package = args[1] filename = args[2] except: pass if len(args) == 1 and is_project_dir(Path.cwd()): project = store_read_project(Path.cwd()) project_dir = Path.cwd() package = args[0] if opts.deleted and package: if not opts.output_dir: raise oscerr.WrongOptions('-o | --output-dir is needed to get deleted sources') elif opts.deleted and not package: raise oscerr.WrongOptions('-D | --deleted can only be used with a package') rev, dummy = parseRevisionOption(opts.revision) if revision_is_empty(rev): rev = "latest" if not revision_is_empty(rev) and rev != "latest" and not checkRevision(project, package, rev): print(f'Revision \'{rev}\' does not exist', file=sys.stderr) sys.exit(1) if filename: # Note: same logic as with 'osc cat' (not 'osc ls', which never merges!) 
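# Single-file checkout of a source link: resolve the expanded srcmd5 of the
# requested revision first, so that get_source_file() fetches the file as it
# appears in the merged (expanded) sources rather than in the raw _link.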
if expand_link: rev = show_upstream_srcmd5(apiurl, project, package, expand=True, revision=rev) get_source_file(apiurl, project, package, filename, revision=rev, progress_obj=self.download_progress) elif package: if opts.deleted: checkout_deleted_package(apiurl, project, package, opts.output_dir) else: if opts.current_dir: project_dir = None checkout_package(apiurl, project, package, rev, expand_link=expand_link, prj_dir=project_dir, service_files=opts.source_service_files, server_service_files=opts.server_side_source_service_files, progress_obj=self.download_progress, size_limit=opts.limit_size, meta=opts.meta, outdir=opts.output_dir, native_obs_package=opts.native_obs_package) if os.isatty(sys.stdout.fileno()): print_request_list(apiurl, project, package) elif project: sep = '/' if not opts.output_dir and conf.config['checkout_no_colon'] else conf.config['project_separator'] chosen_output = opts.output_dir if opts.output_dir else project prj_dir = Path(chosen_output.replace(':', sep)) if os.path.exists(prj_dir): sys.exit(f'osc: project directory \'{prj_dir}\' already exists') # check if the project does exist (show_project_meta will throw an exception) show_project_meta(apiurl, project) scm_url = show_scmsync(apiurl, project) if scm_url is not None and not opts.native_obs_package: run_obs_scm_bridge(url=scm_url, target_dir=str(prj_dir)) Project.init_project(apiurl, prj_dir, project, conf.config['do_package_tracking'], scm_url=scm_url) print(statfrmt('A', prj_dir)) if scm_url is not None: return # all packages for package in meta_get_packagelist(apiurl, project): if opts.output_dir is not None: outputdir = os.path.join(opts.output_dir, package) if not os.path.exists(opts.output_dir): os.mkdir(os.path.join(opts.output_dir)) else: outputdir = None # don't check out local links by default try: m = show_files_meta(apiurl, project, package) li = Linkinfo() li.read(xml_fromstring(''.join(m)).find('linkinfo')) if not li.haserror(): if li.project == project: print(statfrmt('S', package + " link to package " + li.package)) continue except: pass try: checkout_package(apiurl, project, package, expand_link=expand_link, prj_dir=prj_dir, service_files=opts.source_service_files, server_service_files=opts.server_side_source_service_files, progress_obj=self.download_progress, size_limit=opts.limit_size, meta=opts.meta, native_obs_package=opts.native_obs_package) except oscerr.LinkExpandError as e: print('Link cannot be expanded:\n', e, file=sys.stderr) print('Use "osc repairlink" for fixing merge conflicts:\n', file=sys.stderr) # check out in unexpanded form at least checkout_package(apiurl, project, package, expand_link=False, prj_dir=prj_dir, service_files=opts.source_service_files, server_service_files=opts.server_side_source_service_files, progress_obj=self.download_progress, size_limit=opts.limit_size, meta=opts.meta, native_obs_package=opts.native_obs_package) if os.isatty(sys.stdout.fileno()): print_request_list(apiurl, project) else: self.argparse_error("Incorrect number of arguments.") @cmdln.option('-e', '--show-excluded', action='store_true', help='also show files which are excluded by the ' '"exclude_glob" config option') @cmdln.alias('st') def do_status(self, subcmd, opts, *args): """ Show status of files in working copy Show the status of files in a local working copy, indicating whether files have been changed locally, deleted, added, ... 
The first column in the output specifies the status and is one of the following characters: ' ' no modifications (shown only in verbose output) 'A' Added 'C' Conflicted 'D' Deleted 'M' Modified 'R' Replaced (file was deleted and added again afterwards) '?' item is not under version control '!' item is missing (removed by non-osc command) or incomplete 'S' item is skipped (item exceeds a file size limit or is _service:* file) 'F' Frozen (use "osc pull" to merge conflicts) (package-only state) examples: osc st osc st osc st file1 file2 ... usage: osc status [OPTS] [PATH...] """ from .core import Package from .core import Project from .core import compare from .core import is_project_dir from .core import parseargs from .core import statfrmt args = parseargs(args) lines = [] excl_states = (' ',) if opts.quiet: excl_states += ('?',) elif opts.verbose: excl_states = () for arg in args: if is_project_dir(arg): prj = Project(arg, False) # don't exclude packages with state ' ' because the packages # might have modified etc. files prj_excl = [st for st in excl_states if st != ' '] for st, pac in sorted(prj.get_status(*prj_excl), key=cmp_to_key(compare)): p = prj.get_pacobj(pac) if p is None: # state is != ' ' lines.append(statfrmt(st, os.path.normpath(os.path.join(prj.dir, pac)))) continue if p.isfrozen(): lines.append(statfrmt('F', os.path.normpath(os.path.join(prj.dir, pac)))) elif st == ' ' and opts.verbose or st != ' ': lines.append(statfrmt(st, os.path.normpath(os.path.join(prj.dir, pac)))) states = p.get_status(opts.show_excluded, *excl_states) for st, filename in sorted(states, key=cmp_to_key(compare)): lines.append(statfrmt(st, os.path.normpath(os.path.join(p.dir, filename)))) else: p = Package(arg) for st, filename in sorted(p.get_status(opts.show_excluded, *excl_states), key=cmp_to_key(compare)): lines.append(statfrmt(st, os.path.normpath(os.path.join(p.dir, filename)))) if lines: print('\n'.join(lines)) @cmdln.option('-f', '--force', action='store_true', help='add files even if they are excluded by the exclude_glob config option') def do_add(self, subcmd, opts, *args): """ Mark files to be added upon the next commit In case a URL is given the file will get downloaded and registered to be downloaded by the server as well via the download_url source service. This is recommended for release tar balls to track their source and to help others to review your changes esp. on version upgrades. usage: osc add URL [URL...] osc add FILE [FILE...] """ from .core import addDownloadUrlService from .core import addFiles from .core import addGitSource from .core import parseargs if not args: self.argparse_error("Incorrect number of arguments.") # Do some magic here, when adding a url. We want that the server to download the tar ball and to verify it for arg in parseargs(args): if arg.endswith('.git') or arg.startswith('git://') or \ arg.startswith('git@') or (arg.startswith('https://github.com') and '/releases/' not in arg and '/archive/' not in arg) or \ arg.startswith('https://gitlab.com'): addGitSource(arg) elif arg.startswith('http://') or arg.startswith('https://') or arg.startswith('ftp://'): addDownloadUrlService(arg) else: addFiles([arg], force=opts.force) def do_mkpac(self, subcmd, opts, *args): """ Create a new package under version control usage: osc mkpac new_package """ from . 
import conf from .core import createPackageDir if not conf.config['do_package_tracking']: print("to use this feature you have to enable \'do_package_tracking\' " "in the [general] section in the configuration file", file=sys.stderr) sys.exit(1) if len(args) != 1: raise oscerr.WrongArgs('Wrong number of arguments.') createPackageDir(args[0]) @cmdln.option('-r', '--recursive', action='store_true', help='If CWD is a project dir then scan all package dirs as well') @cmdln.alias('ar') def do_addremove(self, subcmd, opts, *args): """ Adds new files, removes disappeared files Adds all files new in the local copy, and removes all disappeared files. ARG, if specified, is a package working copy. """ from . import conf from .core import Package from .core import Project from .core import addFiles from .core import getTransActPath from .core import is_project_dir from .core import parseargs from .core import statfrmt args = parseargs(args) arg_list = args[:] for arg in arg_list: if is_project_dir(arg) and conf.config['do_package_tracking']: prj = Project(arg, False) for pac in prj.pacs_unvers: pac_dir = getTransActPath(os.path.join(prj.dir, pac)) if os.path.isdir(pac_dir): addFiles([pac_dir], prj) for pac in prj.pacs_broken: if prj.get_state(pac) != 'D': prj.set_state(pac, 'D') print(statfrmt('D', getTransActPath(os.path.join(prj.dir, pac)))) if opts.recursive: for pac in prj.pacs_have: state = prj.get_state(pac) if state is not None and state != 'D': pac_dir = getTransActPath(os.path.join(prj.dir, pac)) args.append(pac_dir) args.remove(arg) prj.write_packages() elif is_project_dir(arg): print('osc: addremove is not supported in a project dir unless ' '\'do_package_tracking\' is enabled in the configuration file', file=sys.stderr) sys.exit(1) pacs = Package.from_paths(args) for p in pacs: todo = list(set(p.filenamelist + p.filenamelist_unvers + p.to_be_added)) for filename in todo: abs_filename = os.path.join(p.absdir, filename) if os.path.isdir(abs_filename): continue # ignore foo.rXX, foo.mine for files which are in 'C' state if os.path.splitext(filename)[0] in p.in_conflict: continue state = p.status(filename) if state == '?': # TODO: should ignore typical backup files suffix ~ or .orig p.addfile(filename) elif state == 'D' and os.path.isfile(abs_filename): # if the "deleted" file exists in the wc, track it again p.addfile(filename) elif state == '!': p.delete_file(filename) print(statfrmt('D', getTransActPath(os.path.join(p.dir, filename)))) @cmdln.alias('ci') @cmdln.alias('checkin') @cmdln.option('-m', '--message', metavar='TEXT', help='specify log message TEXT') @cmdln.option('-n', '--no-message', default=False, action='store_true', help='do not specify a log message') @cmdln.option('-F', '--file', metavar='FILE', help='read log message from FILE, \'-\' denotes standard input.') @cmdln.option('-f', '--force', default=False, action="store_true", help='Allow empty commit with no changes. When committing a project, allow removing packages even if other packages depend on them.') @cmdln.option("--skip-local-service-run", "--noservice", "--no-service", default=False, action="store_true", help="Skip run of local source services as specified in _service file.") def do_commit(self, subcmd, opts, *args): """ Upload content to the repository server Upload content which is changed in your working copy, to the repository server. examples: osc ci # current dir osc ci osc ci file1 file2 ... 
""" from .core import get_default_editor try: self._commit(subcmd, opts, args) except oscerr.ExtRuntimeError as e: pattern = re.compile("No such file") if "No such file" in e.msg: editor = os.getenv('EDITOR', default=get_default_editor()) print(f"Editor {editor} not found") return 1 print("ERROR: service run failed", e, file=sys.stderr) return 1 except oscerr.PackageNotInstalled as e: print(f"ERROR: please install {e.args} ", end='') print("or use the --noservice option") return 1 def _commit(self, subcmd, opts, args): from . import conf from .core import Package from .core import Project from .core import edit_message from .core import get_commit_msg from .core import is_project_dir from .core import parseargs from .core import raw_input from .core import store_unlink_file args = parseargs(args) msg = '' if opts.message: msg = opts.message elif opts.file: if opts.file == '-': msg = sys.stdin.read() else: try: msg = open(opts.file).read() except: sys.exit(f'could not open file \'{opts.file}\'.') skip_local_service_run = False if not conf.config['local_service_run'] or opts.skip_local_service_run: skip_local_service_run = True for arg in args.copy(): if conf.config['do_package_tracking'] and is_project_dir(arg): prj = Project(arg) if prj.scm_url: print(f"WARNING: Skipping project '{prj.name}' because it is managed in scm (git): {prj.scm_url}") args.remove(arg) continue if not msg and not opts.no_message: msg = edit_message() # check any of the packages is a link, if so, as for branching pacs = (Package(os.path.join(prj.dir, pac)) for pac in prj.pacs_have if prj.get_state(pac) == ' ') can_branch = False if any(pac.is_link_to_different_project() for pac in pacs): repl = raw_input('Some of the packages are links to a different project!\n' 'Create a local branch before commit? (y|N) ') if repl in ('y', 'Y'): can_branch = True prj.commit(msg=msg, skip_local_service_run=skip_local_service_run, verbose=opts.verbose, can_branch=can_branch) args.remove(arg) pacs, no_pacs = Package.from_paths_nofail(args) for pac in pacs.copy(): if pac.scm_url: print(f"WARNING: Skipping package '{pac.name}' because it is managed in scm (git): {pac.scm_url}") pacs.remove(pac) continue if conf.config['do_package_tracking'] and (pacs or no_pacs): prj_paths = {} single_paths = [] files = {} # XXX: this is really ugly pac_objs = {} # it is possible to commit packages from different projects at the same # time: iterate over all pacs and put each pac to the right project in the dict for pac in pacs: path = os.path.normpath(os.path.join(pac.dir, os.pardir)) if is_project_dir(path): # use this path construction for computing "pac_name", # because it is possible that pac.name != pac_name (e.g. # for an external package wc) pac_name = os.path.basename(os.path.normpath(pac.absdir)) prj_paths.setdefault(path, []).append(pac_name) pac_objs.setdefault(path, []).append(pac) files.setdefault(path, {})[pac_name] = pac.todo else: single_paths.append(pac.dir) if not pac.todo: pac.todo = pac.filenamelist + pac.filenamelist_unvers pac.todo.sort() for pac in no_pacs: if os.path.exists(pac): # fail with an appropriate error message Package(pac) path = os.path.normpath(os.path.join(pac, os.pardir)) if is_project_dir(path): pac_name = os.path.basename(os.path.normpath(os.path.abspath(pac))) prj_paths.setdefault(path, []).append(pac_name) pac_objs.setdefault(path, []) # wrt. 
the current implementation of Project.commit, this # actually not needed files.setdefault(path, {})[pac_name] = [] else: # fail with an appropriate error message Package(pac) for prj_path, packages in prj_paths.items(): prj = Project(prj_path) if not msg and not opts.no_message: msg = get_commit_msg(prj.absdir, pac_objs[prj_path]) # check any of the packages is a link, if so, as for branching can_branch = False if any(pac.is_link_to_different_project() for pac in pacs): repl = raw_input('Some of the packages are links to a different project!\n' 'Create a local branch before commit? (y|N) ') if repl in ('y', 'Y'): can_branch = True prj_files = files[prj_path] prj.commit(packages, msg=msg, files=prj_files, skip_local_service_run=skip_local_service_run, verbose=opts.verbose, can_branch=can_branch, force=opts.force) store_unlink_file(prj.absdir, '_commit_msg') for pac in single_paths: p = Package(pac) if not msg and not opts.no_message: msg = get_commit_msg(p.absdir, [p]) p.commit(msg, skip_local_service_run=skip_local_service_run, verbose=opts.verbose, force=opts.force) store_unlink_file(p.absdir, '_commit_msg') elif no_pacs: # fail with an appropriate error message Package.from_paths(no_pacs) else: for p in pacs: if not p.todo: p.todo = p.filenamelist + p.filenamelist_unvers p.todo.sort() if not msg and not opts.no_message: msg = get_commit_msg(p.absdir, [p]) p.commit(msg, skip_local_service_run=skip_local_service_run, verbose=opts.verbose, force=opts.force) store_unlink_file(p.absdir, '_commit_msg') @cmdln.option('-r', '--revision', metavar='REV', help='update to specified revision (this option will be ignored ' 'if you are going to update the complete project or more than ' 'one package)') @cmdln.option('', '--linkrev', metavar='REV', help='revision of the link target that is used during link expansion') @cmdln.option('-u', '--unexpand-link', action='store_true', help='if a package is an expanded link, update to the raw _link file') @cmdln.option('-e', '--expand-link', action='store_true', help='if a package is a link, update to the expanded sources') @cmdln.option('-s', '--source-service-files', action='store_true', help='Run local source services after update.') @cmdln.option('-S', '--server-side-source-service-files', action='store_true', help='Use server side generated sources instead of local generation.') @cmdln.option('-l', '--limit-size', metavar='limit_size', help='Skip all files with a given size') @cmdln.alias('up') def do_update(self, subcmd, opts, *args): """ Update a working copy examples: 1. osc up If the current working directory is a package, update it. If the directory is a project directory, update all contained packages, AND check out newly added packages. To update only checked out packages, without checking out new ones, you might want to use "osc up *" from within the project dir. 2. osc up PAC Update the packages specified by the path argument(s) When --expand-link is used with source link packages, the expanded sources will be checked out. Without this option, the _link file and patches will be checked out. The option --unexpand-link can be used to switch back to the "raw" source with a _link file plus patch(es). """ from . 
import conf from .core import ET from .core import Linkinfo from .core import Package from .core import Project from .core import checkRevision from .core import is_project_dir from .core import parseRevisionOption from .core import parseargs from .core import print_request_list from .core import show_files_meta from .core import show_upstream_rev if opts.expand_link and opts.unexpand_link: raise oscerr.WrongOptions('Sorry, the options --expand-link and ' '--unexpand-link and are mutually ' 'exclusive.') args = parseargs(args) arg_list = args[:] for arg in arg_list: if is_project_dir(arg): prj = Project(arg, progress_obj=self.download_progress) if prj.scm_url: print("Please use git to update project", prj.name) print("This git repository is hosted at", prj.scm_url) continue if conf.config['do_package_tracking']: prj.update(expand_link=opts.expand_link, unexpand_link=opts.unexpand_link) args.remove(arg) else: # if not tracking package, and 'update' is run inside a project dir, # it should do the following: # (a) update all packages args += prj.pacs_have # (b) fetch new packages prj.checkout_missing_pacs(opts.expand_link, opts.unexpand_link) args.remove(arg) print_request_list(prj.apiurl, prj.name) args.sort() pacs = Package.from_paths(args, progress_obj=self.download_progress) if opts.revision and len(args) == 1: rev, dummy = parseRevisionOption(opts.revision) if not checkRevision(pacs[0].prjname, pacs[0].name, rev, pacs[0].apiurl): print(f'Revision \'{rev}\' does not exist', file=sys.stderr) sys.exit(1) if opts.expand_link or opts.unexpand_link: meta = show_files_meta(pacs[0].apiurl, pacs[0].prjname, pacs[0].name, revision=rev, linkrev=opts.linkrev, expand=opts.server_side_source_service_files) directory = xml_fromstring(meta) li_node = directory.find('linkinfo') if li_node is None: print(f'Revision \'{rev}\' is no link', file=sys.stderr) sys.exit(1) li = Linkinfo() li.read(li_node) if li.haserror() and opts.expand_link: raise oscerr.LinkExpandError(pacs[0].prjname, pacs[0].name, li.error) rev = li.lsrcmd5 if opts.expand_link: rev = li.xsrcmd5 if rev is None: # 2 cases: a) unexpand and passed rev has linkerror # b) expand and passed rev is already expanded rev = directory.get('srcmd5') else: rev = None for p in pacs: if len(pacs) > 1: print(f'Updating {p.name}') # this shouldn't be needed anymore with the new update mechanism # an expand/unexpand update is treated like a normal update (there's nothing special) # FIXME: ugly workaround for #399247 # if opts.expand_link or opts.unexpand_link: # if [ i for i in p.filenamelist+p.filenamelist_unvers if p.status(i) != ' ' and p.status(i) != '?']: # print >>sys.stderr, 'osc: cannot expand/unexpand because your working ' \ # 'copy has local modifications.\nPlease revert/commit them ' \ # 'and try again.' 
# sys.exit(1) if p.scm_url: print("Please use git to update package", p.name) print("This git repository is hosted at", p.scm_url) continue if not rev: if opts.expand_link: rev = p.latest_rev(expand=True) if p.islink() and not p.isexpanded(): print('Expanding to rev', rev) elif opts.unexpand_link and p.islink() and p.isexpanded(): rev = show_upstream_rev(p.apiurl, p.prjname, p.name, meta=p.meta) print('Unexpanding to rev', rev) elif (p.islink() and p.isexpanded()) or opts.server_side_source_service_files: rev = p.latest_rev(include_service_files=opts.server_side_source_service_files) p.update(rev, opts.server_side_source_service_files, opts.limit_size) if opts.source_service_files: print('Running local source services') p.run_source_services() if opts.unexpand_link: p.unmark_frozen() rev = None print_request_list(p.apiurl, p.prjname, p.name) @cmdln.option('-f', '--force', action='store_true', help='forces removal of entire package and its files') @cmdln.alias('rm') @cmdln.alias('del') @cmdln.alias('remove') def do_delete(self, subcmd, opts, *args): """ Mark files or package directories to be deleted upon the next 'checkin' usage: cd .../PROJECT/PACKAGE osc delete FILE [...] cd .../PROJECT osc delete PACKAGE [...] This command works on check out copies. Use "rdelete" for working on server side only. This is needed for removing the entire project. As a safety measure, projects must be empty (i.e., you need to delete all packages first). If you are sure that you want to remove a package and all its files use \'--force\' switch. Sometimes this also works without --force. """ from . import conf from .core import Package from .core import Project from .core import getPrjPacPaths from .core import getTransActPath from .core import is_project_dir from .core import parseargs from .core import statfrmt if not args: self.argparse_error("Incorrect number of arguments.") args = parseargs(args) # check if args contains a package which was removed by # a non-osc command and mark it with the 'D'-state arg_list = args[:] for i in arg_list: if not os.path.exists(i): prj_dir, pac_dir = getPrjPacPaths(i) if is_project_dir(prj_dir): prj = Project(prj_dir, False) if i in prj.pacs_broken: if prj.get_state(i) != 'A': prj.set_state(pac_dir, 'D') else: prj.del_package_node(i) print(statfrmt('D', getTransActPath(i))) args.remove(i) prj.write_packages() pacs = Package.from_paths(args) for p in pacs: if not p.todo: prj_dir, pac_dir = getPrjPacPaths(p.absdir) if is_project_dir(prj_dir): if conf.config['do_package_tracking']: prj = Project(prj_dir, False) prj.delPackage(p, opts.force) else: print("WARNING: package tracking is disabled, operation skipped !", file=sys.stderr) else: pathn = getTransActPath(p.dir) for filename in p.todo: p.clear_from_conflictlist(filename) ret, state = p.delete_file(filename, opts.force) if ret: print(statfrmt('D', os.path.join(pathn, filename))) continue if state == '?': sys.exit(f'\'{filename}\' is not under version control') elif state in ('A', 'M') and not opts.force: sys.exit(f'\'{filename}\' has local modifications (use --force to remove this file)') elif state == 'S': sys.exit(f'\'{filename}\' is marked as skipped and no local file with this name exists') def do_resolved(self, subcmd, opts, *args): """ Remove 'conflicted' state on working copy files If an upstream change can't be merged automatically, a file is put into in 'conflicted' ('C') state. Within the file, conflicts are marked with special <<<<<<< as well as ======== and >>>>>>> lines. 
After manually resolving all conflicting parts, use this command to remove the 'conflicted' state. Note: this subcommand does not semantically resolve conflicts or remove conflict markers; it merely removes the conflict-related artifact files and allows PATH to be committed again. usage: osc resolved FILE [FILE...] """ from .core import Package from .core import parseargs if not args: self.argparse_error("Incorrect number of arguments.") args = parseargs(args) pacs = Package.from_paths(args, skip_dirs=True) for p in pacs: for filename in p.todo: print(f'Resolved conflicted state of "{filename}"') p.clear_from_conflictlist(filename) @cmdln.alias('dists') def do_distributions(self, subcmd, opts, *args): """ Shows all available distributions This command shows the available distributions. For active distributions it shows the name, project and name of the repository and a suggested default repository name. usage: osc distributions """ from .core import get_distributions from .util.helper import format_table apiurl = self.get_api_url() dists = get_distributions(apiurl) if dists: headers = dists[0].keys() rows = [] for dist in dists: rows.append([dist[h] for h in headers]) print(format_table(rows, headers).rstrip()) @cmdln.option('-f', '--force', action='store_true', default=False, help="Don't ask and delete files") @cmdln.option('project') @cmdln.option('package') @cmdln.option('files', metavar="file", nargs='+') def do_rremove(self, subcmd, opts): """ Remove source files from selected package """ from .core import delete_files from .core import raw_input project = opts.project package = opts.package files = opts.files apiurl = self.get_api_url() if len(files) == 0: if '/' not in project: raise oscerr.WrongArgs("Missing operand, type osc help rremove for help") else: files = (package, ) project, package = project.split('/') for filename in files: if not opts.force: resp = raw_input(f"rm: remove source file `{filename}' from `{project}/{package}'? (yY|nN) ") if resp not in ('y', 'Y'): continue try: delete_files(apiurl, project, package, (filename, )) except HTTPError as e: if opts.force: print(e, file=sys.stderr) body = e.read() if e.code in (400, 403, 404, 500): if '' in body: msg = body.split('')[1] msg = msg.split('')[0] print(msg, file=sys.stderr) else: raise e @cmdln.alias('r') @cmdln.option('-l', '--last-build', action='store_true', default=None, help='show last build results (succeeded/failed/unknown)') @cmdln.option('-r', '--repo', action='append', default=[], help='Show results only for specified repo(s)') @cmdln.option('-a', '--arch', action='append', default=[], help='Show results only for specified architecture(s)') @cmdln.option('-b', '--brief', action='store_true', help='show the result in "pkgname repo arch result". 
Default for -f') @cmdln.option('--no-multibuild', action='store_true', default=False, help='Disable results for all direct affect packages inside of the project') @cmdln.option('-M', '--multibuild-package', metavar='FLAVOR', action='append', default=[], help=HELP_MULTIBUILD_MANY) @cmdln.option('-V', '--vertical', action='store_true', help='list packages vertically instead horizontally for entire project') @cmdln.option('-w', '--watch', action='store_true', help='watch the results until all finished building') @cmdln.option('-F', '--fail-on-error', action='store_true', help='fail with exit 1 if any build has errored') @cmdln.option('-s', '--status-filter', help='only show packages with the given build status') @cmdln.option('-f', '--failed', action='store_true', help='show only failed results') @cmdln.option('', '--xml', action='store_true', default=False, help='generate output in XML (former results_meta)') @cmdln.option('', '--csv', action='store_true', default=False, help='generate output in CSV format') @cmdln.option('', '--format', default=None, help="Change the format of the text (default) or csv output. Not supported for xml output.\n" "Supported fields: project, package, repository, arch, state, dirty, code, details.\n" "Text output format requires using the field names in form of named fields for string interpolation: ``%%(field)s``.\n" "CSV output format requires field names separated with commas.") @cmdln.option('--show-excluded', action='store_true', help='show repos that are excluded for this package') def do_results(self, subcmd, opts, *args): """ Shows the build results of a package or project usage: osc results # (inside working copy of PRJ or PKG) osc results PROJECT [PACKAGE[:FLAVOR]] """ from .core import MultibuildFlavorResolver from .core import csv from .core import decode_it from .core import get_package_results from .core import get_results from .core import io from .core import is_package_dir from .core import is_project_dir from .core import result_xml_to_dicts from .core import slash_split from .core import store_read_package from .core import store_read_project args = slash_split(args) apiurl = self.get_api_url() if len(args) > 2: raise oscerr.WrongArgs('Too many arguments (required none, one, or two)') project = package = None wd = Path.cwd() if is_project_dir(wd): project = store_read_project(wd) elif is_package_dir(wd): project = store_read_project(wd) package = store_read_package(wd) if len(args) > 0: project = self._process_project_name(args[0]) if len(args) > 1: package = args[1] if project is None: raise oscerr.WrongOptions("No project given") if opts.failed and opts.status_filter: raise oscerr.WrongArgs('-s and -f cannot be used together') if opts.multibuild_package and opts.no_multibuild: self.argparser.error("-M/--multibuild-package and --no-multibuild are mutually exclusive") if opts.xml and opts.format: self.argparser.error("--xml and --format are mutually exclusive") if opts.failed: opts.status_filter = 'failed' opts.brief = True if package is None: opts.hide_legend = None opts.name_filter = None opts.show_non_building = None opts.show_excluded = None return self.do_prjresults('prjresults', opts, *args) if opts.xml and opts.csv: raise oscerr.WrongOptions("--xml and --csv are mutual exclusive") kwargs = {'apiurl': apiurl, 'project': project, 'package': package, 'lastbuild': opts.last_build, 'repository': opts.repo, 'arch': opts.arch, 'wait': opts.watch, 'showexcl': opts.show_excluded, 'code': opts.status_filter} if opts.multibuild_package: 
opts.no_multibuild = False resolver = MultibuildFlavorResolver(apiurl, project, package, use_local=False) kwargs['multibuild_packages'] = resolver.resolve(opts.multibuild_package) if not opts.no_multibuild: kwargs['multibuild'] = kwargs['locallink'] = True if opts.xml or opts.csv: # hmm should we filter excluded repos here as well? # for now, ignore --show-excluded del kwargs['showexcl'] for xml in get_package_results(**kwargs): if opts.xml: print(decode_it(xml), end='') else: # csv formatting if opts.format is None: columns = ["repository", "arch", "package", "state", "dirty", "code", "details"] else: # split columns by colon, semicolon or pipe columns = opts.format.split(",") supported_columns = ["project", "package", "repository", "arch", "state", "dirty", "code", "details"] unknown_columns = sorted(set(columns) - set(supported_columns)) if unknown_columns: self.argparser.error(f"Unknown format fields: {''.join(unknown_columns)}") f = io.StringIO() writer = csv.writer(f, dialect="unix") rows = [r for r, _ in result_xml_to_dicts(xml)] for row in rows: writer.writerow([row[i] for i in columns]) f.seek(0) print(f.read(), end="") else: kwargs['verbose'] = opts.verbose kwargs['wait'] = opts.watch kwargs['printJoin'] = '\n' kwargs['format'] = opts.format out = {} get_results(out=out, **kwargs) if opts.fail_on_error and out['failed']: sys.exit(1) # WARNING: this function is also called by do_results. You need to set a default there # as well when adding a new option! @cmdln.option('-b', '--brief', action='store_true', help='show the result in "pkgname repo arch result"') @cmdln.option('-w', '--watch', action='store_true', help='watch the results until all finished building, only supported with --xml') @cmdln.option('-c', '--csv', action='store_true', help='csv output') @cmdln.option('', '--xml', action='store_true', default=False, help='generate output in XML') @cmdln.option('-s', '--status-filter', metavar='STATUS', help='show only packages with buildstatus STATUS (see legend)') @cmdln.option('-n', '--name-filter', metavar='EXPR', help='show only packages whose names match EXPR') @cmdln.option('-a', '--arch', metavar='ARCH', action='append', help='show results only for specified architecture(s)') @cmdln.option('-r', '--repo', metavar='REPO', action='append', help='show results only for specified repo(s)') @cmdln.option('-V', '--vertical', action='store_true', help='list packages vertically instead horizontally') @cmdln.option('--show-excluded', action='store_true', help='show packages that are excluded in all repos, also hide repos that have only excluded packages') @cmdln.alias('pr') def do_prjresults(self, subcmd, opts, *args): """ Shows project-wide build results usage: osc prjresults (inside working copy) osc prjresults PROJECT """ from .core import decode_it from .core import get_package_results from .core import get_prj_results from .core import store_read_project apiurl = self.get_api_url() if args: if len(args) == 1: project = self._process_project_name(args[0]) else: raise oscerr.WrongArgs('Wrong number of arguments.') else: wd = Path.cwd() project = store_read_project(wd) if opts.xml: kwargs = {} if opts.repo: kwargs['repository'] = opts.repo if opts.arch: kwargs['arch'] = opts.arch kwargs['wait'] = opts.watch for results in get_package_results(apiurl, project, **kwargs): print(decode_it(results)) return if opts.watch: print('Please implement support for osc prjresults --watch without --xml.') return 2 print('\n'.join(get_prj_results(apiurl, project, hide_legend=opts.quiet, 
csv=opts.csv, status_filter=opts.status_filter, name_filter=opts.name_filter, repo=opts.repo, arch=opts.arch, vertical=opts.vertical, show_excluded=opts.show_excluded, brief=opts.brief))) @cmdln.alias('rpmlint') @cmdln.alias('lint') def do_rpmlintlog(self, subcmd, opts, *args): """ Shows the rpmlint logfile Shows the rpmlint logfile to analyse if there are any problems with the spec file and the built binaries. usage: osc rpmlintlog project package repository arch """ from . import build as osc_build from .core import decode_it from .core import get_rpmlint_log from .core import slash_split from .core import store_read_package from .core import store_read_project apiurl = self.get_api_url() args = slash_split(args) if len(args) <= 3: project = store_read_project(Path.cwd()) package = store_read_package(Path.cwd()) if len(args) == 1: repository, arch = self._find_last_repo_arch(args[0], fatal=False) if repository is None: # no local build with this repo was done print('failed to guess arch, using hostarch') repository = args[0] arch = osc_build.hostarch elif len(args) == 2: repository, arch = args elif len(args) == 3: raise oscerr.WrongArgs('Too many arguments.') else: # len(args) = 0 case self.print_repos() elif len(args) == 4: project, package, repository, arch = args else: raise oscerr.WrongArgs('please provide project package repository arch.') print(decode_it(get_rpmlint_log(apiurl, project, package, repository, arch))) @cmdln.alias('bl') @cmdln.alias('blt') @cmdln.alias('buildlogtail') @cmdln.option('-l', '--last', action='store_true', help='Show the last finished log file') @cmdln.option('--lastsucceeded', '--last-succeeded', action='store_true', help='Show the last succeeded log file') @cmdln.option('-M', '--multibuild-package', metavar='FLAVOR', help=HELP_MULTIBUILD_ONE) @cmdln.option('-o', '--offset', metavar='OFFSET', help='get log start or end from the offset') @cmdln.option('-s', '--strip-time', action='store_true', help='strip leading build time from the log') def do_buildlog(self, subcmd, opts, *args): """ Shows the build log of a package Shows the log file of the build of a package. Can be used to follow the log while it is being written. Needs to be called from within a package directory. When called as buildlogtail (or blt) it just shows the end of the logfile. This is useful to see just a build failure reasons. The arguments REPOSITORY and ARCH are the first two columns in the 'osc results' output. If the buildlog url is used buildlog command has the same behavior as remotebuildlog. buildlog [REPOSITORY ARCH | BUILDLOGURL] """ from . import build as osc_build from . 
import conf from .core import ET from .core import http_GET from .core import makeurl from .core import parse_buildlogurl from .core import print_buildlog from .core import store_read_package from .core import store_read_project project = package = repository = arch = None apiurl = self.get_api_url() if len(args) == 1 and args[0].startswith('http'): apiurl, project, package, repository, arch = parse_buildlogurl(args[0]) else: project = store_read_project(Path.cwd()) package = store_read_package(Path.cwd()) if len(args) == 1: repository, arch = self._find_last_repo_arch(args[0], fatal=False) if repository is None: # no local build with this repo was done print('failed to guess arch, using hostarch') repository = args[0] arch = osc_build.hostarch elif len(args) < 2: self.print_repos() elif len(args) > 2: raise oscerr.WrongArgs('Too many arguments.') else: repository = args[0] arch = args[1] if opts.multibuild_package: package = package + ":" + opts.multibuild_package offset = 0 if subcmd in ("blt", "buildlogtail"): query = {'view': 'entry'} if opts.last: query['last'] = 1 if opts.lastsucceeded: query['lastsucceeded'] = 1 u = makeurl(self.get_api_url(), ['build', project, repository, arch, package, '_log'], query=query) f = http_GET(u) root = xml_parse(f).getroot() offset = int(root.find('entry').get('size')) if opts.offset: offset = offset - int(opts.offset) else: offset = offset - (8 * 1024) if offset < 0: offset = 0 elif opts.offset: offset = int(opts.offset) strip_time = opts.strip_time or conf.config['buildlog_strip_time'] print_buildlog(apiurl, project, package, repository, arch, offset, strip_time, opts.last, opts.lastsucceeded) def print_repos(self, repos_only=False, exc_class=oscerr.WrongArgs, exc_msg='Missing arguments', project=None): from .core import is_package_dir from .core import is_project_dir wd = Path.cwd() doprint = False if is_package_dir(wd): msg = 'Valid arguments for this package are:' doprint = True elif is_project_dir(wd): msg = 'Valid arguments for this project are:' doprint = True args = [] if project is not None: args.append(project) msg = 'Valid arguments are:' doprint = True if doprint: print(msg) print() if repos_only: self.do_repositories("repos_only", None, *args) else: self.do_repositories(None, None, *args) raise exc_class(exc_msg) @cmdln.alias('rbl') @cmdln.alias('rbuildlog') @cmdln.alias('rblt') @cmdln.alias('rbuildlogtail') @cmdln.alias('remotebuildlogtail') @cmdln.option('-l', '--last', action='store_true', help='Show the last finished log file') @cmdln.option('--lastsucceeded', '--last-succeeded', action='store_true', help='Show the last succeeded log file') @cmdln.option('-M', '--multibuild-package', metavar='FLAVOR', help=HELP_MULTIBUILD_ONE) @cmdln.option('-o', '--offset', metavar='OFFSET', help='get log starting or ending from the offset') @cmdln.option('-s', '--strip-time', action='store_true', help='strip leading build time from the log') def do_remotebuildlog(self, subcmd, opts, *args): """ Shows the build log of a package Shows the log file of the build of a package. Can be used to follow the log while it is being written. remotebuildlogtail shows just the tail of the log file. usage: osc remotebuildlog project package[:flavor] repository arch or osc remotebuildlog project/package[:flavor]/repository/arch or osc remotebuildlog buildlogurl """ from . 
import conf from .core import ET from .core import http_GET from .core import makeurl from .core import parse_buildlogurl from .core import print_buildlog from .core import slash_split if len(args) == 1 and args[0].startswith('http'): apiurl, project, package, repository, arch = parse_buildlogurl(args[0]) else: args = slash_split(args) apiurl = self.get_api_url() if len(args) < 4: raise oscerr.WrongArgs('Too few arguments.') elif len(args) > 4: raise oscerr.WrongArgs('Too many arguments.') else: project, package, repository, arch = args project = self._process_project_name(project) if opts.multibuild_package: package = package + ":" + opts.multibuild_package offset = 0 if subcmd in ("rblt", "rbuildlogtail", "remotebuildlogtail"): query = {'view': 'entry'} if opts.last: query['last'] = 1 if opts.lastsucceeded: query['lastsucceeded'] = 1 u = makeurl(self.get_api_url(), ['build', project, repository, arch, package, '_log'], query=query) f = http_GET(u) root = xml_parse(f).getroot() offset = int(root.find('entry').get('size')) if opts.offset: offset = offset - int(opts.offset) else: offset = offset - (8 * 1024) if offset < 0: offset = 0 elif opts.offset: offset = int(opts.offset) strip_time = opts.strip_time or conf.config['buildlog_strip_time'] print_buildlog(apiurl, project, package, repository, arch, offset, strip_time, opts.last, opts.lastsucceeded) def _find_last_repo_arch(self, repo=None, fatal=True): from .core import ET from .core import store files = glob.glob(os.path.join(Path.cwd(), store, "_buildinfo-*")) if repo is not None: files = [f for f in files if os.path.basename(f).replace('_buildinfo-', '').startswith(repo + '-')] if not files: if not fatal: return None, None self.print_repos() cfg = files[0] # find newest file for f in files[1:]: if os.stat(f).st_atime > os.stat(cfg).st_atime: cfg = f root = xml_parse(cfg).getroot() repo = root.get("repository") arch = root.findtext("arch") return repo, arch @cmdln.alias('lbl') @cmdln.option('-o', '--offset', metavar='OFFSET', help='get log starting from offset') @cmdln.option('-s', '--strip-time', action='store_true', help='strip leading build time from the log') def do_localbuildlog(self, subcmd, opts, *args): """ Shows the build log of a local buildchroot usage: osc lbl [REPOSITORY [ARCH]] osc lbl # show log of newest last local build """ from . import build as osc_build from . 
import conf from .core import BUFSIZE from .core import buildlog_strip_time from .core import is_package_dir from .core import store_read_package from .core import store_read_project if conf.config['build-type']: # FIXME: raise Exception instead print('Not implemented for VMs', file=sys.stderr) sys.exit(1) if len(args) == 0 or len(args) == 1: project = store_read_project('.') package = store_read_package('.') repo = None if args: repo = args[0] repo, arch = self._find_last_repo_arch(repo) elif len(args) == 2: project = store_read_project('.') package = store_read_package('.') repo = args[0] arch = args[1] else: if is_package_dir(Path.cwd()): self.print_repos() raise oscerr.WrongArgs('Wrong number of arguments.') # TODO: refactor/unify buildroot calculation and move it to core.py apihost = urlsplit(self.get_api_url())[1] buildroot = osc_build.calculate_build_root(apihost, project, package, repo, arch) offset = 0 if opts.offset: offset = int(opts.offset) logfile = os.path.join(buildroot, '.build.log') if not os.path.isfile(logfile): raise oscerr.OscIOError(None, f'logfile \'{logfile}\' does not exist') f = open(logfile, 'rb') f.seek(offset) data = f.read(BUFSIZE) while len(data): if opts.strip_time or conf.config['buildlog_strip_time']: # FIXME: this is not working when the time is split between 2 chunks data = buildlog_strip_time(data) sys.stdout.buffer.write(data) data = f.read(BUFSIZE) f.close() @cmdln.option('-M', '--multibuild-package', metavar='FLAVOR', help=HELP_MULTIBUILD_ONE) @cmdln.alias('tr') def do_triggerreason(self, subcmd, opts, *args): """ Show reason why a package got triggered to build The server decides when a package needs to get rebuild, this command shows the detailed reason for a package. A brief reason is also stored in the jobhistory, which can be accessed via "osc jobhistory". Trigger reasons might be: - new build (never build yet or rebuild manually forced) - source change (e.g. 
on updating sources) - meta change (packages which are used for building have changed) - rebuild count sync (In case that it is configured to sync release numbers) usage in package or project directory: osc triggerreason REPOSITORY ARCH osc triggerreason PROJECT PACKAGE[:FLAVOR] REPOSITORY ARCH """ from .core import ET from .core import is_package_dir from .core import show_package_trigger_reason from .core import slash_split from .core import store_read_package from .core import store_read_project wd = Path.cwd() args = slash_split(args) project = package = repository = arch = None if len(args) < 2: self.print_repos() apiurl = self.get_api_url() if len(args) == 2: # 2 if is_package_dir('.'): package = store_read_package(wd) else: raise oscerr.WrongArgs('package is not specified.') project = store_read_project(wd) repository = args[0] arch = args[1] elif len(args) == 4: project = self._process_project_name(args[0]) package = args[1] repository = args[2] arch = args[3] else: raise oscerr.WrongArgs('Too many arguments.') if opts.multibuild_package: package = package + ":" + opts.multibuild_package print(apiurl, project, package, repository, arch) xml = show_package_trigger_reason(apiurl, project, package, repository, arch) root = xml_fromstring(xml) if root.find('explain') is None: reason = "No triggerreason found" print(reason) else: reason = root.find('explain').text triggertime = time.strftime('%Y-%m-%d %H:%M:%S', time.gmtime(int(root.find('time').text))) print(f"{reason} (at {triggertime})") if reason == "meta change": print("changed keys:") for package in root.findall('packagechange'): print(" ", package.get('change'), package.get('key')) # FIXME: the new osc syntax should allow to specify multiple packages # FIXME: the command should optionally use buildinfo data to show all dependencies @cmdln.option('-M', '--multibuild-package', metavar='FLAVOR', help=HELP_MULTIBUILD_ONE) def do_dependson(self, subcmd, opts, *args): """ Dependson shows the build dependencies inside of a project, valid for a given repository and architecture The command can be used to find build dependencies (wrt. a given repository and arch) that reside in the same project. To see all build dependencies use the buildinfo command. This is no guarantee, since the new build might have changed dependencies. NOTE: to see all binary packages, which can trigger a build you need to refer the buildinfo, since this command shows only the dependencies inside of a project. The arguments REPOSITORY and ARCH can be taken from the first two columns of the 'osc repos' output. usage in package or project directory: osc dependson REPOSITORY ARCH usage: osc dependson PROJECT [PACKAGE[:FLAVOR]] REPOSITORY ARCH """ self._dependson(False, opts, *args) @cmdln.option('-M', '--multibuild-package', metavar='FLAVOR', help=HELP_MULTIBUILD_ONE) def do_whatdependson(self, subcmd, opts, *args): """ Show the packages that require the specified package during the build The command whatdependson can be used to find out what will be triggered when a certain package changes. This is no guarantee, since the new build might have changed dependencies. The packages marked with the [i] flag are inherited to the project. The arguments REPOSITORY and ARCH can be taken from the first two columns of the 'osc repos' output. 
usage in package or project directory: osc whatdependson REPOSITORY ARCH usage: osc whatdependson PROJECT [PACKAGE[:FLAVOR]] REPOSITORY ARCH """ self._dependson(True, opts, *args) def _dependson(self, reverse, opts, *args): from .core import ET from .core import get_dependson from .core import is_package_dir from .core import is_project_dir from .core import meta_get_packagelist from .core import slash_split from .core import store_read_package from .core import store_read_project wd = Path.cwd() args = slash_split(args) project = packages = repository = arch = None if len(args) < 2 and (is_package_dir('.') or is_project_dir('.')): self.print_repos() if len(args) > 4: raise oscerr.WrongArgs('Too many arguments.') apiurl = self.get_api_url() if len(args) < 3: # 2 if is_package_dir('.'): packages = [store_read_package(wd)] elif not is_project_dir('.'): raise oscerr.WrongArgs('Project and package is not specified.') project = store_read_project(wd) repository = args[0] arch = args[1] if len(args) == 3: project = self._process_project_name(args[0]) repository = args[1] arch = args[2] if len(args) == 4: project = self._process_project_name(args[0]) packages = [args[1]] repository = args[2] arch = args[3] if packages is not None and opts.multibuild_package: packages = [packages[0] + ":" + opts.multibuild_package] project_packages = meta_get_packagelist(apiurl, project, deleted=False, expand=False) xml = get_dependson(apiurl, project, repository, arch, packages, reverse) root = xml_fromstring(xml) for package in root.findall('package'): print(package.get('name'), ":") for dep in package.findall('pkgdep'): inherited = " " if dep.text in project_packages else "[i]" print(f" {inherited} {dep.text}") @cmdln.option('--alternative-project', metavar='PROJECT', help='specify the build target project') @cmdln.option('-d', '--debug', action='store_true', dest="debug_dependencies", help='verbose output of build dependencies') @cmdln.option('-M', '--multibuild-package', metavar='FLAVOR', help=HELP_MULTIBUILD_ONE) @cmdln.option('-x', '--extra-pkgs', metavar='PAC', action='append', help='Add this package when computing the buildinfo') @cmdln.option('-p', '--prefer-pkgs', metavar='DIR', action='append', help='Prefer packages from this directory when installing the build-root') @cmdln.option('--nodebugpackages', '--no-debug-packages', action='store_true', help='Skip installation of additional debug packages for CLI builds (specified in obs:cli_debug_packages in project metadata)') def do_buildinfo(self, subcmd, opts, *args): """ Shows the build info Shows the build "info" which is used in building a package. This command is mostly used internally by the 'build' subcommand. It needs to be called from within a package directory. The BUILD_DESCR argument is optional. BUILD_DESCR is a local RPM specfile or Debian "dsc" file. If specified, it is sent to the server, and the buildinfo will be based on it. If the argument is not supplied, the buildinfo is derived from the specfile which is currently on the source repository server. The returned data is XML and contains a list of the packages used in building, their source, and the expanded BuildRequires. The arguments REPOSITORY and ARCH are optional. They can be taken from the first two columns of the 'osc repos' output. If not specified, REPOSITORY defaults to the 'build_repository' config entry in your 'oscrc' and ARCH defaults to your host architecture. 
usage: in a package working copy: osc buildinfo [OPTS] REPOSITORY ARCH BUILD_DESCR osc buildinfo [OPTS] REPOSITORY (ARCH = hostarch, BUILD_DESCR is detected automatically) osc buildinfo [OPTS] ARCH (REPOSITORY = build_repository (config option), BUILD_DESCR is detected automatically) osc buildinfo [OPTS] BUILD_DESCR (REPOSITORY = build_repository (config option), ARCH = hostarch) osc buildinfo [OPTS] (REPOSITORY = build_repository (config option), ARCH = hostarch, BUILD_DESCR is detected automatically) Note: if BUILD_DESCR does not exist locally the remote BUILD_DESCR is used osc buildinfo [OPTS] PROJECT PACKAGE[:FLAVOR] REPOSITORY ARCH [BUILD_DESCR] """ from . import build as osc_build from .core import decode_it from .core import get_buildconfig from .core import get_buildinfo from .core import is_package_dir from .core import return_external from .core import slash_split from .core import store_read_package from .core import store_read_project wd = Path.cwd() args = slash_split(args) project = package = repository = arch = build_descr = None if len(args) <= 3: if not is_package_dir('.'): raise oscerr.WrongArgs('Incorrect number of arguments (Note: \'.\' is no package wc)') if opts.alternative_project: project = opts.alternative_project package = '_repository' else: project = store_read_project('.') package = store_read_package('.') repository, arch, build_descr = self.parse_repoarchdescr(args, alternative_project=opts.alternative_project, ignore_descr=True, multibuild_package=opts.multibuild_package) elif len(args) == 4 or len(args) == 5: project = self._process_project_name(args[0]) package = args[1] repository = args[2] arch = args[3] if len(args) == 5: build_descr = args[4] else: raise oscerr.WrongArgs('Too many arguments.') apiurl = self.get_api_url() # TODO: refactor retrieving build type in build.py and use it here or directly in create_build_descr_data() if build_descr: build_type = os.path.splitext(build_descr)[1] else: build_type = None build_descr_data, prefer_pkgs = osc_build.create_build_descr_data( build_descr, build_type=build_type, repo=repository, arch=arch, prefer_pkgs=opts.prefer_pkgs, # define=, # define_with=, # define_without=, ) if opts.multibuild_package: package = package + ":" + opts.multibuild_package extra_pkgs = opts.extra_pkgs.copy() if opts.extra_pkgs else [] if os.path.exists("/usr/lib/build/queryconfig") and not opts.nodebugpackages: with NamedTemporaryFile(mode="w+b", prefix="obs_buildconfig_") as bc_file: # print('Getting buildconfig from server and store to %s' % bc_filename) bc = get_buildconfig(apiurl, project, repository) bc_file.write(bc) bc_file.flush() debug_pkgs = decode_it(return_external("/usr/lib/build/queryconfig", "--dist", bc_file.name, "substitute", "obs:cli_debug_packages")) if debug_pkgs: extra_pkgs.extend(debug_pkgs.strip().split(" ")) buildinfo = get_buildinfo( apiurl, project, package, repository, arch, specfile=build_descr_data, debug=opts.debug_dependencies, addlist=extra_pkgs, ) print(decode_it(buildinfo)) def do_buildconfig(self, subcmd, opts, *args): """ Shows the build config Shows the build configuration which is used in building a package. This command is mostly used internally by the 'build' command. The returned data is the project-wide build configuration in a format which is directly readable by the build script. It contains RPM macros and BuildRequires expansions, for example. The argument REPOSITORY an be taken from the first column of the 'osc repos' output. 
usage: osc buildconfig REPOSITORY (in pkg or prj dir) osc buildconfig PROJECT REPOSITORY """ from .core import decode_it from .core import get_buildconfig from .core import is_package_dir from .core import is_project_dir from .core import slash_split from .core import store_read_project wd = Path.cwd() args = slash_split(args) if len(args) < 1 and (is_package_dir('.') or is_project_dir('.')): self.print_repos(True) if len(args) > 2: raise oscerr.WrongArgs('Too many arguments.') apiurl = self.get_api_url() if len(args) == 1: # FIXME: check if args[0] is really a repo and not a project, need a is_project() function for this project = store_read_project(wd) repository = args[0] elif len(args) == 2: project = self._process_project_name(args[0]) repository = args[1] else: raise oscerr.WrongArgs('Wrong number of arguments.') print(decode_it(get_buildconfig(apiurl, project, repository))) @cmdln.option('worker', metavar=':') def do_workerinfo(self, subcmd, opts): """ Gets the information to a worker from the server Examples: osc workerinfo x86_64:goat:1 usage: osc workerinfo : """ from .core import get_worker_info worker = opts.worker apiurl = self.get_api_url() print(''.join(get_worker_info(apiurl, worker))) @cmdln.option('', '--ignore-file', action='store_true', help='ignore _constraints file and only check project constraints') @cmdln.option('-M', '--multibuild-package', metavar='FLAVOR', help=HELP_MULTIBUILD_ONE) def do_checkconstraints(self, subcmd, opts, *args): """ Check the constraints and view compliant workers Checks the constraints for compliant workers. usage: remote request: osc checkconstraints [OPTS] PROJECT PACKAGE REPOSITORY ARCH [CONSTRAINTSFILE] in a package working copy: osc checkconstraints [OPTS] REPOSITORY ARCH [CONSTRAINTSFILE] osc checkconstraints [OPTS] CONSTRAINTSFILE osc checkconstraints [OPTS] """ from .core import check_constraints from .core import get_repos_of_project from .core import slash_split from .core import store_read_package from .core import store_read_project repository = arch = constraintsfile = None args = slash_split(args) if len(args) > 5: raise oscerr.WrongArgs('Too many arguments') if len(args) == 4 or len(args) == 5: project = self._process_project_name(args[0]) package = args[1] repository = args[2] arch = args[3] opts.ignore_file = True if len(args) == 5: constraintsfile = args[4]; else: project = store_read_project('.') package = store_read_package('.') if opts.multibuild_package: package = package + ":" + opts.multibuild_package if len(args) == 1: constraintsfile = args[0] elif len(args) == 2 or len(args) == 3: repository = args[0] arch = args[1] if len(args) == 3: constraintsfile = args[2] constraintsfile_data = None if constraintsfile is not None: constraintsfile_data = open(constraintsfile).read() elif not opts.ignore_file: if os.path.isfile("_constraints"): constraintsfile_data = open("_constraints").read() else: print("No local _constraints file. 
Using just the project constraints") apiurl = self.get_api_url() r = [] if not arch and not repository: result_line_templ = '%(name)-25s %(arch)-25s %(comp_workers)s' for repo in get_repos_of_project(apiurl, project): rmap = {} rmap['name'] = repo.name rmap['arch'] = repo.arch workers = check_constraints(apiurl, project, repo.name, repo.arch, package, constraintsfile_data) rmap['comp_workers'] = len(workers) r.append(result_line_templ % rmap) r.insert(0, 'Repository Arch Worker') r.insert(1, '---------- ---- ------') else: r = check_constraints(apiurl, project, repository, arch, package, constraintsfile_data) r.insert(0, 'Worker') r.insert(1, '------') print('\n'.join(r)) @cmdln.alias('repos') @cmdln.alias('platforms') def do_repositories(self, subcmd, opts, *args): """ Shows repositories configured for a project It skips repositories by default which are disabled for a given package. usage: osc repos osc repos [PROJECT] [PACKAGE] """ from . import store as osc_store from .core import build_table from .core import get_repos_of_project from .core import get_repositories_of_project from .core import is_package_dir from .core import is_project_dir from .core import show_package_disabled_repos from .core import store_read_package from .core import store_read_project apiurl = self.get_api_url() project = None package = None disabled = None if len(args) == 1: project = self._process_project_name(args[0]) elif len(args) == 2: project = self._process_project_name(args[0]) package = args[1] elif len(args) == 0: store = osc_store.get_store(".") if store.is_package: package = store.package project = store.project elif store.is_project: project = store.project else: raise oscerr.WrongArgs('Wrong number of arguments') if project is None: raise oscerr.WrongArgs('No project specified') if package is not None: disabled = show_package_disabled_repos(apiurl, project, package) if subcmd == 'repos_only': for repo in get_repositories_of_project(apiurl, project): if (disabled is None) or ((disabled is not None) and (repo not in [d['repo'] for d in disabled])): print(repo) else: data = [] for repo in get_repos_of_project(apiurl, project): if disabled is not None: if ({'repo': repo.name, 'arch': repo.arch} in disabled or repo.name in [d['repo'] for d in disabled if d['arch'] is None] or repo.arch in [d['arch'] for d in disabled if d['repo'] is None]): continue data += [repo.name, repo.arch] for row in build_table(2, data, width=2): print(row) def parse_repoarchdescr(self, args, noinit=False, alternative_project=None, ignore_descr=False, vm_type=None, multibuild_package=None): from . import build as osc_build from . 
import conf from .core import Package from .core import Repo from .core import decode_it from .core import get_buildconfig from .core import get_repos_of_project from .core import is_package_dir from .core import return_external from .core import store from .core import store_read_package from .core import store_read_project """helper to parse the repo, arch and build description from args""" arg_arch = arg_repository = arg_descr = None if len(args) < 3: # some magic, works only sometimes, but people seem to like it :/ all_archs = [] for mainarch in osc_build.can_also_build: all_archs.append(mainarch) for subarch in osc_build.can_also_build.get(mainarch): all_archs.append(subarch) for arg in args: if (arg.endswith('.spec') or arg.endswith('.dsc') or arg.endswith('.kiwi') or arg.endswith('.livebuild') or arg.endswith('flatpak.yaml') or arg.endswith('flatpak.yml') or arg.endswith('flatpak.json') or arg.startswith('Dockerfile.') or arg.startswith('Containerfile.') or arg in ('PKGBUILD', 'build.collax', 'Chart.yaml', 'Containerfile', 'Dockerfile', 'fissile.yml', 'appimage.yml', '_preinstallimage')): arg_descr = arg else: if (arg == osc_build.hostarch or arg in all_archs) and arg_arch is None: # it seems to be an architecture in general arg_arch = arg if not (arg == osc_build.hostarch or arg in osc_build.can_also_build.get(osc_build.hostarch, [])): if vm_type not in ('qemu', 'emulator'): print("WARNING: native compile is not possible, a emulator via binfmt misc handler must be configured!") elif not arg_repository: arg_repository = arg else: # raise oscerr.WrongArgs('\'%s\' is neither a build description nor a supported arch' % arg) # take it as arch (even though this is no supported arch) - hopefully, this invalid # arch will be detected below arg_arch = arg else: arg_repository, arg_arch, arg_descr = args arg_arch = arg_arch or osc_build.hostarch self._debug("hostarch: ", osc_build.hostarch) self._debug("arg_arch: ", arg_arch) self._debug("arg_repository: ", arg_repository) self._debug("arg_descr: ", arg_descr) repositories = [] # store list of repos for potential offline use repolistfile = os.path.join(Path.cwd(), store, "_build_repositories") if noinit: repositories = Repo.fromfile(repolistfile) else: project = alternative_project or store_read_project('.') apiurl = self.get_api_url() repositories = list(get_repos_of_project(apiurl, project)) if not repositories: raise oscerr.WrongArgs(f'no repositories defined for project \'{project}\'') if alternative_project is None: # only persist our own repos Repo.tofile(repolistfile, repositories) no_repo = False repo_names = sorted({r.name for r in repositories}) if not arg_repository and repositories: # XXX: we should avoid hardcoding repository names # Use a default value from config, but just even if it's available # unless try standard, or openSUSE_Factory, or openSUSE_Tumbleweed no_repo = True arg_repository = repositories[-1].name for repository in (conf.config['build_repository'], 'standard', 'openSUSE_Factory', 'openSUSE_Tumbleweed'): if repository in repo_names: arg_repository = repository no_repo = False break if not arg_repository: raise oscerr.WrongArgs('please specify a repository') if not noinit: if arg_repository not in repo_names: raise oscerr.WrongArgs(f"{arg_repository} is not a valid repository, use one of: {', '.join(repo_names)}") arches = [r.arch for r in repositories if r.name == arg_repository and r.arch] if arches and arg_arch not in arches: raise oscerr.WrongArgs(f"{arg_arch} is not a valid arch for the repository 
{arg_repository}, use one of: {', '.join(arches)}") # can be implemented using # reduce(lambda x, y: x + y, (glob.glob(x) for x in ('*.spec', '*.dsc', '*.kiwi'))) # but be a bit more readable :) descr = glob.glob('*.spec') + glob.glob('*.dsc') + glob.glob('*.kiwi') + glob.glob('*.livebuild') + \ glob.glob('PKGBUILD') + glob.glob('build.collax') + glob.glob('Dockerfile') + \ glob.glob('Dockerfile.*') + glob.glob('Containerfile') + glob.glob('Containerfile.*') + \ glob.glob('fissile.yml') + glob.glob('appimage.yml') + glob.glob('Chart.yaml') + \ glob.glob('*flatpak.yaml') + glob.glob('*flatpak.yml') + glob.glob('*flatpak.json') + \ glob.glob('*.productcompose') + glob.glob('mkosi.*') # FIXME: # * request repos from server and select by build type. if not arg_descr and len(descr) == 1: arg_descr = descr[0] elif not arg_descr: msg = None if len(descr) > 1: if no_repo: raise oscerr.WrongArgs("Repository is missing. Cannot guess build description without repository") apiurl = self.get_api_url() project = alternative_project or store_read_project('.') # some distros like Debian rename and move build to obs-build if not os.path.isfile('/usr/lib/build/queryconfig') and os.path.isfile('/usr/lib/obs-build/queryconfig'): queryconfig = '/usr/lib/obs-build/queryconfig' else: queryconfig = '/usr/lib/build/queryconfig' if noinit: bc_filename = f'_buildconfig-{arg_repository}-{arg_arch}' if is_package_dir('.'): bc_filename = os.path.join(Path.cwd(), store, bc_filename) else: bc_filename = os.path.abspath(bc_filename) if not os.path.isfile(bc_filename): raise oscerr.WrongOptions('--offline is not possible, no local buildconfig file') recipe = return_external(queryconfig, '--dist', bc_filename, 'type') else: bc = get_buildconfig(apiurl, project, arg_repository) with tempfile.NamedTemporaryFile() as f: f.write(bc) f.flush() recipe = return_external(queryconfig, '--dist', f.name, 'type') recipe = recipe.strip() if recipe == 'arch': recipe = 'PKGBUILD' recipe = decode_it(recipe) pac = os.path.basename(Path.cwd()) if is_package_dir(Path.cwd()): pac = store_read_package(Path.cwd()) if multibuild_package: pac = multibuild_package if recipe == 'PKGBUILD': cands = [d for d in descr if d.startswith(recipe)] elif recipe == 'mkosi': cands = [d for d in descr if d.startswith('mkosi.')] else: cands = [d for d in descr if d.endswith('.' 
+ recipe)] if len(cands) > 1: repo_cands = [d for d in cands if d == f'{pac}-{arg_repository}.{recipe}'] if repo_cands: cands = repo_cands else: pac_cands = [d for d in cands if d == f'{pac}.{recipe}'] if pac_cands: cands = pac_cands if len(cands) == 1: arg_descr = cands[0] if not arg_descr: msg = f"Multiple build description files found: {', '.join(cands)}" elif not ignore_descr: msg = 'Missing argument: build description (for example a spec, dsc or kiwi file)' try: p = Package('.') if p.islink() and not p.isexpanded(): msg += ' (this package is not expanded - you might want to try osc up --expand)' except: pass if msg: raise oscerr.WrongArgs(msg) return arg_repository, arg_arch, arg_descr @cmdln.option('--clean', action='store_true', help='Delete old build root before initializing it') @cmdln.option('-o', '--offline', action='store_true', help='Start with cached prjconf and packages without contacting the api server') @cmdln.option('-l', '--preload', action='store_true', help='Preload all files into the cache for offline operation') @cmdln.option('--no-changelog', action='store_true', help='don\'t update the package changelog from a changes file') @cmdln.option('--rsync-src', metavar='RSYNCSRCPATH', dest='rsyncsrc', help='Copy folder to buildroot after installing all RPMs. Use together with --rsync-dest. This is the path on the HOST filesystem e.g. /tmp/linux-kernel-tree. It defines RSYNCDONE 1 .') @cmdln.option('--rsync-dest', metavar='RSYNCDESTPATH', dest='rsyncdest', help='Copy folder to buildroot after installing all RPMs. Use together with --rsync-src. This is the path on the TARGET filesystem e.g. /usr/src/packages/BUILD/linux-2.6 .') @cmdln.option('--overlay', metavar='OVERLAY', help='Copy overlay filesystem to buildroot after installing all RPMs .') @cmdln.option('--noinit', '--no-init', action='store_true', help='Skip initialization of build root and start with build immediately.') @cmdln.option('--checks', action='store_true', help='Run checks even if disabled in the build config') @cmdln.option('--nochecks', '--no-checks', action='store_true', help='Do not run build checks on the resulting packages.') @cmdln.option('--no-verify', '--noverify', action='store_true', help='Skip signature verification (via pgp keys) of packages used for build. 
(Global config in oscrc: no_verify)') @cmdln.option('--nodebugpackages', '--no-debug-packages', action='store_true', help='Skip installation of additional debug packages for CLI builds (specified in obs:cli_debug_packages in project metadata)') @cmdln.option("--skip-local-service-run", "--noservice", "--no-service", default=False, action="store_true", help="Skip run of local source services as specified in _service file.") @cmdln.option('-p', '--prefer-pkgs', metavar='DIR', action='append', help='Prefer packages from this directory when installing the build-root') @cmdln.option('-k', '--keep-pkgs', metavar='DIR', help='Save built packages into this directory') @cmdln.option('-M', '--multibuild-package', metavar='FLAVOR', help=HELP_MULTIBUILD_ONE) @cmdln.option('-x', '--extra-pkgs', metavar='PAC', action='append', help='Add this package when installing the build-root') @cmdln.option('-X', '--extra-pkgs-from', metavar='FILE', action='append', help='Add packages listed in this file when installing the build-root') @cmdln.option('--root', metavar='ROOT', help='Build in specified directory') @cmdln.option('-j', '--jobs', metavar='N', help='Compile with N jobs') @cmdln.option('-t', '--threads', metavar='N', help='Compile with N threads') @cmdln.option('--icecream', metavar='N', help='use N parallel build jobs with icecream') @cmdln.option('--ccache', action='store_true', help='use ccache to speed up rebuilds') @cmdln.option('--pkg-ccache', metavar='/path/to/_ccache.tar', help='path to an existing uncompressed archive ccache. Using this option implies --ccache') @cmdln.option('--sccache', action='store_true', help='use sccache to speed up rebuilds. Conflicts with --ccache') @cmdln.option('--sccache-uri', metavar='redis://127.0.0.1:6389', help='Optional remote URI for sccache storage. Implies --sccache.') @cmdln.option('--with', metavar='X', dest='_with', action='append', help='enable feature X for build') @cmdln.option('--without', metavar='X', action='append', help='disable feature X for build') @cmdln.option('--define', metavar='\'X Y\'', action='append', help='define macro X with value Y') @cmdln.option('--build-opt', metavar='OPT', action='append', help='pass option OPT to the build command') @cmdln.option('--buildtool-opt', metavar='OPT', action='append', help=textwrap.dedent( """ pass option OPT to the build tool command (rpmbuild), for example: * don't clean build environment after a successful build: --buildtool-opt=--noclean """ ), ) @cmdln.option('--userootforbuild', '--login-as-root', action='store_true', help='Run build or shell as root. The default is to build as ' 'unprivileged user. Note that a line "# norootforbuild" ' 'in the spec file will invalidate this option.') @cmdln.option('--build-uid', metavar='uid:gid|"caller"', help='specify the numeric uid:gid pair to assign to the ' 'unprivileged "abuild" user or use "caller" to use the current user uid:gid') @cmdln.option('--local-package', action='store_true', help='build a package which does not exist on the server') @cmdln.option('--stage', metavar='STAGE', help='runs a specific stage, default is "a" for all. Append a trailing "=" ' 'to only run the specified stage. Append a trailing "+" to run ' 'the specified stage and all stages coming after it. With no ' 'suffix, stages up to and including the specified stage are run.')
@cmdln.option('--linksources', action='store_true', help='use hard links instead of a deep copied source') @cmdln.option('--vm-memory', metavar='MEMORY', help='amount of memory for VM defined in MB') @cmdln.option('--vm-disk-size', metavar='DISKSIZE', help='size for newly created disk image in MB') @cmdln.option('--vm-type', metavar='TYPE', help='use VM type TYPE (e.g. kvm)') @cmdln.option('--vm-telnet', metavar='TELNET', help='Launch a telnet server inside of VM build') @cmdln.option('--target', metavar='TARGET', help='define target platform') @cmdln.option('--alternative-project', metavar='PROJECT', help='specify the build target project') @cmdln.option('-d', '--debuginfo', action='store_true', help='also build debuginfo sub-packages') @cmdln.option('--disable-debuginfo', action='store_true', help='disable build of debuginfo packages') @cmdln.option('-b', '--baselibs', action='store_true', help='Create -32bit/-64bit/-x86 rpms for other architectures') @cmdln.option('--release', metavar='N', help='set release number of the package to N') @cmdln.option('--disable-cpio-bulk-download', action='store_true', help='disable downloading packages as cpio archive from api') @cmdln.option('--cpio-bulk-download', action='store_false', dest='disable_cpio_bulk_download', help=argparse.SUPPRESS) @cmdln.option('--download-api-only', action='store_true', help='only fetch packages from the api') @cmdln.option('--oldpackages', metavar='DIR', help='take previous build from DIR (special values: _self, _link)') @cmdln.option('--verbose-mode', metavar='MODE', help='set verbose mode of the "build" program, arguments can be "all" or "vm"') @cmdln.option('--wipe', action='store_true', help=argparse.SUPPRESS) @cmdln.option('--shell', action='store_true', help=argparse.SUPPRESS) @cmdln.option('--shell-after-fail', action='store_true', help="run a shell if the build tool fails") @cmdln.option('--shell-cmd', metavar='COMMAND', help='run specified command instead of bash') @cmdln.option('-f', '--force', action='store_true', help='Do not ask for confirmation to wipe') @cmdln.option('--host', metavar='HOST', help='perform the build on a remote server - user@server:~/remote/directory') @cmdln.option('--trust-all-projects', action='store_true', help='trust packages from all projects') @cmdln.option('--nopreinstallimage', '--no-preinstallimage', action='store_true', help='Do not use preinstall images for creating the build root.') @cmdln.option("--just-print-buildroot", action="store_true", help="Print build root path and exit.") @cmdln.option('--no-timestamps', '-s', '--strip-time', action='store_true', help='Hide the time prefix in output.') @cmdln.alias('chroot') @cmdln.alias('shell') @cmdln.alias('wipe') def do_build(self, subcmd, opts, *args): """ Build a package on your local machine The command works from a package checkout (local changes are fine). You can use `osc repos` to look up REPOSITORY and ARCH arguments. If they are not set, `osc` chooses from configured repos in this priority: 1. `build_repository` mentioned in config file 2. "standard" 3. "openSUSE_Factory" 4. "openSUSE_Tumbleweed" 5. last repo from the sorted list BUILD_DESCR is either an RPM spec file, or a Debian dsc file. The command honors packagecachedir, build-root and build-uid settings in oscrc, if present. You may want to set su-wrapper = 'sudo' in oscrc, and configure sudo with option NOPASSWD for /usr/bin/build.
If neither --clean nor --noinit is given, build will reuse an existing build-root, removing unneeded packages and adding missing ones. This is usually the fastest option. If the package doesn't exist on the server please use the --local-package option. If the project of the package doesn't exist on the server use the --alternative-project option. Example: osc build [OPTS] --alternative-project openSUSE:10.3 standard i586 BUILD_DESCR usage: osc build [OPTS] # will try to guess a build environment osc build [OPTS] REPOSITORY ARCH BUILD_DESCR osc build [OPTS] REPOSITORY ARCH osc build [OPTS] REPOSITORY (ARCH = hostarch, BUILD_DESCR is detected automatically) osc build [OPTS] ARCH (REPOSITORY = build_repository (config option), BUILD_DESCR is detected automatically) osc build [OPTS] BUILD_DESCR (REPOSITORY = build_repository (config option), ARCH = hostarch) osc build [OPTS] (REPOSITORY = build_repository (config option), ARCH = hostarch, BUILD_DESCR is detected automatically) For debugging after a build you can jump into the build environment: osc shell [OPTS] REPOSITORY ARCH Run a single command inside of the build environment: osc shell --shell-cmd=COMMAND [OPTS] REPOSITORY ARCH Useful `shell` OPTS --noinit # for faster run --shell-cmd=COMMAND --shell-after-fail --extra-pkgs=PACKAGE # install additional packages To clean up the build environment run osc wipe [OPTS] osc wipe [OPTS] REPOSITORY ARCH If you've set the used VM type in oscrc, it can also be overridden here --vm-type=chroot # for a faster, but less clean and less secure build --vm-type=kvm # for a clean and secure build --vm-type=qemu # for a slow cross-architecture build using a system emulator # Note: # Configuration can be overridden by envvars, e.g. # OSC_SU_WRAPPER overrides the setting of su-wrapper. # OSC_BUILD_ROOT overrides the setting of build-root. # OSC_PACKAGECACHEDIR overrides the setting of packagecachedir. """ from . import build as osc_build from . import conf from . import git_scm from . import store as osc_store from .core import Project from .output import get_user_input if shutil.which(conf.config['build-cmd']) is None: print(f"Error: build ('{conf.config['build-cmd']}') command not found", file=sys.stderr) print('Install the build package from http://download.opensuse.org/repositories/openSUSE:/Tools/', file=sys.stderr) return 1 if opts.debuginfo and opts.disable_debuginfo: raise oscerr.WrongOptions('osc: --debuginfo and --disable-debuginfo are mutually exclusive') if subcmd == 'wipe': opts.wipe = True if len(args) > 3: raise oscerr.WrongArgs('Too many arguments') if not opts.local_package: store = osc_store.get_store(Path.cwd(), print_warnings=True) if isinstance(store, git_scm.store.GitStore): opts.local_package = True else: store.assert_is_package() try: if opts.alternative_project and opts.alternative_project == store.project: opts.alternative_project = None except RuntimeError: # ignore the following exception: Couldn't map git branch '' to a project pass else: try: store = osc_store.get_store(Path.cwd(), print_warnings=True) except oscerr.NoWorkingCopy: store = None if store is None: try: # if opts.local_package is set, build.main() reads project from the store and sets package to "_project" # that's why we're ok with store from the parent directory that holds information about the project # FIXME: the parent directory may contain a git repo that doesn't contain a project; we have no way of recognizing that!
store = osc_store.get_store(os.path.dirname(Path.cwd()), print_warnings=True) except oscerr.NoWorkingCopy: store = None # HACK: avoid calling some underlying store_*() functions from parse_repoarchdescr() method # We'll fix parse_repoarchdescr() later because it requires a larger change if not opts.alternative_project and isinstance(store, git_scm.GitStore): opts.alternative_project = store.project if len(args) == 0 and store and store.is_package and store.last_buildroot: # build env not specified, just read from last build attempt args = [store.last_buildroot[0], store.last_buildroot[1]] if not opts.vm_type: opts.vm_type = store.last_buildroot[2] if opts.just_print_buildroot: if opts.root: build_root = opts.root else: args = self.parse_repoarchdescr(args, opts.noinit or opts.offline, opts.alternative_project, False, opts.vm_type, opts.multibuild_package) repo, arch, build_descr = args prj, pac = osc_build.calculate_prj_pac(store, opts, build_descr) apihost = urlsplit(self.get_api_url())[1] user = osc_build.calculate_build_root_user(opts.vm_type) build_root = osc_build.calculate_build_root(apihost, prj, pac, repo, arch, user) print(build_root) return vm_chroot = opts.vm_type or conf.config['build-type'] if (subcmd in ('shell', 'chroot') or opts.shell or opts.wipe) and not vm_chroot: if opts.root: build_root = opts.root else: args = self.parse_repoarchdescr(args, opts.noinit or opts.offline, opts.alternative_project, False, opts.vm_type, opts.multibuild_package) repo, arch, build_descr = args prj, pac = osc_build.calculate_prj_pac(store, opts, build_descr) apihost = urlsplit(self.get_api_url())[1] user = osc_build.calculate_build_root_user(opts.vm_type) build_root = osc_build.calculate_build_root(apihost, prj, pac, repo, arch, user) if opts.wipe and not opts.force: # Confirm delete reply = get_user_input( f"Really wipe '{build_root}'?", answers={"y": "yes", "n": "no"}, default_answer="n", ) if reply != "y": raise oscerr.UserAbort() build_args = ['--root=' + build_root, '--noinit', '--shell'] if opts.wipe: build_args.append('--wipe') ret = osc_build.run_build(opts, *build_args) print(f"The buildroot was: {build_root}") sys.exit(ret) elif subcmd in ('shell', 'chroot') or opts.shell: print(f'--shell in combination with build-type {vm_chroot} is experimental.') print('The semantics may change at any time!') opts.shell = True args = self.parse_repoarchdescr(args, opts.noinit or opts.offline, opts.alternative_project, False, opts.vm_type, opts.multibuild_package) if not opts.local_package: try: prj = Project(os.pardir, getPackageList=False, wc_check=False) if prj.status(store.package) == "A": # a package with state 'A' most likely does not exist on # the server - hence, treat it as a local package opts.local_package = True print("INFO: Building the package as a local package.", file=sys.stderr) except oscerr.NoWorkingCopy: pass if conf.config['no_verify']: opts.no_verify = True if opts.keep_pkgs and not os.path.isdir(opts.keep_pkgs): if os.path.exists(opts.keep_pkgs): raise oscerr.WrongOptions(f'Preferred save location \'{opts.keep_pkgs}\' is not a directory') else: os.makedirs(opts.keep_pkgs) if opts.prefer_pkgs: for d in opts.prefer_pkgs: if not os.path.isdir(d): raise oscerr.WrongOptions(f'Preferred package location \'{d}\' is not a directory') if opts.offline and opts.preload: raise oscerr.WrongOptions('--offline and --preload are mutually exclusive') if not opts.skip_local_service_run: # if the option is not set by the user read the default from config if not 
conf.config['local_service_run']: opts.skip_local_service_run = True if opts.shell or opts.wipe: opts.skip_local_service_run = True if opts.preload: opts.nopreinstallimage = True print(f'Building {args[2]} for {args[0]}/{args[1]}') if not opts.host: ret = osc_build.main(self.get_api_url(), store, opts, args) if (subcmd in ('shell', 'chroot') or opts.shell or opts.wipe) and not vm_chroot: print(f"The buildroot was: {build_root}") return ret else: return self._do_rbuild(subcmd, opts, *args) def _do_rbuild(self, subcmd, opts, *args): from .core import run_external # drop the --argument, value tuple from the list def drop_arg2(lst, name): if not name: return lst while name in lst: i = lst.index(name) lst.pop(i + 1) lst.pop(i) return lst # change the local directory to more suitable remote one in hostargs # and perform the rsync to such location as well def rsync_dirs_2host(hostargs, short_name, long_name, dirs): drop_arg2(hostargs, short_name) drop_arg2(hostargs, long_name) for pdir in dirs: # drop the last '/' from pdir name - this is because # rsync foo remote:/bar create /bar/foo on remote machine # rsync foo/ remote:/bar copy the content of foo in the /bar if pdir[-1:] == os.path.sep: pdir = pdir[:-1] hostprefer = os.path.join( hostpath, basename, f"{long_name.replace('-', '_')}__", os.path.basename(os.path.abspath(pdir))) hostargs.append(long_name) hostargs.append(hostprefer) rsync_prefer_cmd = ['rsync', '-az', '--delete', '-e', 'ssh', pdir, f"{hostname}:{os.path.dirname(hostprefer)}"] print(f"Run: {' '.join(rsync_prefer_cmd)}") ret = run_external(rsync_prefer_cmd[0], *rsync_prefer_cmd[1:]) if ret != 0: return ret return 0 cwd = Path.cwd() basename = os.path.basename(cwd) if ':' not in opts.host: hostname = opts.host hostpath = "~/" else: hostname, hostpath = opts.host.split(':', 1) # arguments for build: use all arguments behind build and drop --host 'HOST' hostargs = sys.argv[sys.argv.index(subcmd) + 1:] drop_arg2(hostargs, '--host') # global arguments: use first '-' up to subcmd gi = 0 for i, a in enumerate(sys.argv): if a == subcmd: break if a[0] == '-': gi = i break if gi: hostglobalargs = sys.argv[gi: sys.argv.index(subcmd) + 1] else: hostglobalargs = (subcmd, ) # keep-pkgs hostkeep = None if opts.keep_pkgs: drop_arg2(hostargs, '-k') drop_arg2(hostargs, '--keep-pkgs') hostkeep = os.path.join( hostpath, basename, "__keep_pkgs__", "") # <--- this adds last '/', thus triggers correct rsync behavior hostargs.append('--keep-pkgs') hostargs.append(hostkeep) ### run all commands ### # 1.) rsync sources rsync_source_cmd = ['rsync', '-az', '--delete', '-e', 'ssh', cwd, f"{hostname}:{hostpath}"] print(f"Run: {' '.join(rsync_source_cmd)}") ret = run_external(rsync_source_cmd[0], *rsync_source_cmd[1:]) if ret != 0: return ret # 2.) rsync prefer-pkgs dirs, overlay and rsyns-src if opts.prefer_pkgs: ret = rsync_dirs_2host(hostargs, '-p', '--prefer-pkgs', opts.prefer_pkgs) if ret != 0: return ret for arg, long_name in ((opts.rsyncsrc, '--rsync-src'), (opts.overlay, '--overlay')): if not arg: continue ret = rsync_dirs_2host(hostargs, None, long_name, (arg, )) if ret != 0: return ret # 3.) 
call osc build osc_cmd = "osc" for var in ('OSC_SU_WRAPPER', 'OSC_BUILD_ROOT', 'OSC_PACKAGECACHEDIR'): if os.getenv(var): osc_cmd = f"{var}={os.getenv(var)} {osc_cmd}" ssh_cmd = \ ['ssh', '-t', hostname, "cd %(remote_dir)s; %(osc_cmd)s %(global_args)s %(local_args)s" % dict( remote_dir=os.path.join(hostpath, basename), osc_cmd=osc_cmd, global_args=" ".join(hostglobalargs), local_args=" ".join(hostargs)) ] print(f"Run: {' '.join(ssh_cmd)}") build_ret = run_external(ssh_cmd[0], *ssh_cmd[1:]) if build_ret != 0: return build_ret # 4.) get keep-pkgs back if opts.keep_pkgs: ret = rsync_keep_cmd = ['rsync', '-az', '-e', 'ssh', f"{hostname}:{hostkeep}", opts.keep_pkgs] print(f"Run: {' '.join(rsync_keep_cmd)}") ret = run_external(rsync_keep_cmd[0], *rsync_keep_cmd[1:]) if ret != 0: return ret return build_ret @cmdln.option('', '--csv', action='store_true', help='generate output in CSV (separated by |)') @cmdln.option('-l', '--limit', metavar='limit', type=int, default=0, help='for setting the number of results') @cmdln.option('-M', '--multibuild-package', metavar='FLAVOR', help=HELP_MULTIBUILD_ONE) @cmdln.alias('buildhist') def do_buildhistory(self, subcmd, opts, *args): """ Shows the build history of a package The arguments REPOSITORY and ARCH can be taken from the first two columns of the 'osc repos' output. usage: osc buildhist REPOSITORY ARCHITECTURE osc buildhist PROJECT PACKAGE[:FLAVOR] REPOSITORY ARCHITECTURE """ from . import _private apiurl = self.get_api_url() args = list(args) args_backup = args.copy() try: args = [".", "."] + args_backup.copy() project, package, repository, arch = pop_project_package_repository_arch_from_args(args) ensure_no_remaining_args(args) except (oscerr.NoWorkingCopy, oscerr.WrongArgs): args[:] = args_backup.copy() project, package, repository, arch = pop_project_package_repository_arch_from_args(args) ensure_no_remaining_args(args) if opts.multibuild_package: package = package + ":" + opts.multibuild_package history = _private.BuildHistory(apiurl, project, package, repository, arch, limit=opts.limit) if opts.csv: print(history.to_csv(), end="") else: print(history.to_text_table()) @cmdln.option('', '--csv', action='store_true', help='generate output in CSV (separated by |)') @cmdln.option('-l', '--limit', metavar='limit', help='for setting the number of results') @cmdln.option('-M', '--multibuild-package', metavar='FLAVOR', help=HELP_MULTIBUILD_ONE) @cmdln.alias('jobhist') def do_jobhistory(self, subcmd, opts, *args): """ Shows the job history of a project The arguments REPOSITORY and ARCH can be taken from the first two columns of the 'osc repos' output. 
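Example (the project, package, repository and architecture names below are only illustrative):
    osc jobhist standard x86_64                       # inside a project or package checkout
    osc jobhist openSUSE:Factory osc standard x86_64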
usage: osc jobhist REPOSITORY ARCHITECTURE (in project dir) osc jobhist PROJECT [PACKAGE[:FLAVOR]] REPOSITORY ARCHITECTURE """ from .core import is_package_dir from .core import is_project_dir from .core import print_jobhistory from .core import slash_split from .core import store_read_package from .core import store_read_project wd = Path.cwd() args = slash_split(args) if len(args) < 2 and (is_project_dir('.') or is_package_dir('.')): self.print_repos() apiurl = self.get_api_url() if len(args) == 4: project = self._process_project_name(args[0]) package = args[1] repository = args[2] arch = args[3] elif len(args) == 3: project = self._process_project_name(args[0]) package = None # skipped = prj repository = args[1] arch = args[2] elif len(args) == 2: package = None try: package = store_read_package(wd) except: pass project = store_read_project(wd) repository = args[0] arch = args[1] else: raise oscerr.WrongArgs('Wrong number of arguments') if opts.multibuild_package and package: package = package + ":" + opts.multibuild_package format = 'text' if opts.csv: format = 'csv' print_jobhistory(apiurl, project, package, repository, arch, format, opts.limit) @cmdln.option('-r', '--revision', metavar='rev', help='show log of the specified revision') @cmdln.option("-p", "--patch", action="store_true", help='show patch for each revision; NOTE: use this option carefully because it loads patches on demand in a pager') @cmdln.option('', '--csv', action='store_true', help='generate output in CSV') @cmdln.option('', '--xml', action='store_true', help='generate output in XML') @cmdln.option('-D', '--deleted', action='store_true', default=None, help='work on deleted package') @cmdln.option('-M', '--meta', action='store_true', default=None, help='checkout out meta data instead of sources') def do_log(self, subcmd, opts, *args): """ Shows the commit log of a package usage: osc log (inside working copy) osc log remote_project [remote_package] """ from .core import checkRevision from .core import get_commitlog from .core import parseRevisionOption from .core import revision_is_empty from .output import pipe_to_pager apiurl = self.get_api_url() args = list(args) project, package = pop_project_package_from_args( args, default_project=".", default_package=".", package_is_optional=True ) rev, rev_upper = parseRevisionOption(opts.revision) if not revision_is_empty(rev) and not checkRevision(project, package, rev, apiurl, opts.meta): print(f'Revision \'{rev}\' does not exist', file=sys.stderr) sys.exit(1) format = 'text' if opts.csv: format = 'csv' if opts.xml: format = 'xml' lines = get_commitlog(apiurl, project, package, rev, format, opts.meta, opts.deleted, rev_upper, patch=opts.patch) pipe_to_pager(lines, add_newlines=True) @cmdln.option('-v', '--verbose', action='store_true', help='verbose run of local services for debugging purposes') def do_service(self, subcmd, opts, *args): """ Handle source services Source services can be used to modify sources like downloading files, verify files, generating files or modify existing files. usage: osc service COMMAND (inside working copy) osc service run [SOURCE_SERVICE] osc service runall osc service manualrun [SOURCE_SERVICE] osc service remoterun [PROJECT PACKAGE] osc service merge [PROJECT PACKAGE] osc service wait [PROJECT PACKAGE] COMMAND can be: run r run defined services with modes "trylocal", "localonly", or no mode set locally, may take an optional parameter to run only a specified source service. 
In case parameters exist for this one in _service file they are used. runall ra run all services independent of the used mode manualrun mr run all services with mode "manual", may take an optional parameter to run only a specified source service remoterun rr trigger a re-run on the server side merge commits all server side generated files and drops the _service definition wait waits until the service finishes and returns with an error if it failed Not for common usage anymore: localrun lr the same as "run" but services with mode "serveronly" are also executed disabledrun dr run all services with mode "disabled" """ # disabledrun and localrun exists as well, but are considered to be obsolete from . import store as osc_store from .core import Package from .core import mergeservice from .core import runservice from .core import waitservice args = list(args) apiurl = self.get_api_url() project = None package = None singleservice = None mode = None remote_commands = ("remoterun", "rr", "merge", "wait") if len(args) < 1: self.argparse_error("Please specify a command") command = args.pop(0) if command not in ("runall", "ra", "run", "localrun", "manualrun", "disabledrun", "remoterun", "lr", "dr", "mr", "r", "rr", "merge", "wait"): self.argparse_error(f"Invalid command: {command}") if command in ("localrun", "lr"): print(f"WARNING: Command '{command}' is obsolete, please use 'run' instead.", file=sys.stderr) if command in ("disabledrun", "dr"): print(f"WARNING: Command '{command}' is obsolete,\n" "please convert your _service to use 'manual' and then 'manualrun/mr' instead.", file=sys.stderr) if command in remote_commands: project, package = pop_project_package_from_args( args, default_project=".", default_package=".", package_is_optional=False ) elif len(args) == 1: singleservice = args.pop(0) ensure_no_remaining_args(args) if command in ('remoterun', 'rr'): print(runservice(apiurl, project, package)) return if command == "wait": print(waitservice(apiurl, project, package)) return if command == "merge": print(mergeservice(apiurl, project, package)) return store = osc_store.get_store(Path.cwd(), print_warnings=True) store.assert_is_package() if command in ('runall', 'ra', 'run', 'localrun', 'manualrun', 'disabledrun', 'lr', 'mr', 'dr', 'r'): p = Package(".") if command in ("localrun", "lr"): mode = "local" elif command in ("manualrun", "mr"): mode = "manual" elif command in ("disabledrun", "dr"): mode = "disabled" elif command in ("runall", "ra"): mode = "all" return p.run_source_services(mode, singleservice, opts.verbose) @cmdln.option('-a', '--arch', metavar='ARCH', help='trigger rebuilds for a specific architecture') @cmdln.option('-r', '--repo', metavar='REPO', help='trigger rebuilds for a specific repository') @cmdln.option('-f', '--failed', action='store_true', help='rebuild all failed packages') @cmdln.option('-M', '--multibuild-package', metavar="FLAVOR", action='append', help=HELP_MULTIBUILD_MANY) @cmdln.option('--all', action='store_true', help='Rebuild all packages of entire project') @cmdln.alias('rebuildpac') def do_rebuild(self, subcmd, opts, *args): """ Trigger package rebuilds Note that it is normally NOT needed to kick off rebuilds like this, because they principally happen in a fully automatic way, triggered by source check-ins. In particular, the order in which packages are built is handled by the build service. The arguments REPOSITORY and ARCH can be taken from the first two columns of the 'osc repos' output. 
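Examples (project/package names below are illustrative):
    osc rebuild                                       # inside a package checkout: rebuild this package
    osc rebuild openSUSE:Factory osc standard x86_64  # rebuild one package for a single repository/arch
    osc rebuild openSUSE:Factory --failed             # rebuild only the failed packages of a project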
usage: osc rebuild [PROJECT [PACKAGE[:FLAVOR] [REPOSITORY [ARCH]]]] """ from .core import MultibuildFlavorResolver from .core import rebuild apiurl = self.get_api_url() args = list(args) project, package, repo, arch = pop_project_package_repository_arch_from_args( args, package_is_optional=True, repository_is_optional=True, arch_is_optional=True, default_project='.', default_package='.', ) ensure_no_remaining_args(args) if opts.repo: repo = opts.repo if opts.arch: arch = opts.arch code = None if opts.failed: code = 'failed' if not (opts.all or package or repo or arch or code): raise oscerr.WrongOptions('No option has been provided. If you want to rebuild all packages of the entire project, use --all option.') if opts.all and package: self.argparser.error("Option '--all' conflicts with the 'package' argument. Omit the argument or run osc from outside a package working copy.") if opts.multibuild_package: resolver = MultibuildFlavorResolver(apiurl, project, package, use_local=False) packages = resolver.resolve_as_packages(opts.multibuild_package) else: packages = [package] for package in packages: print(rebuild(apiurl, project, package, repo, arch, code)) def do_info(self, subcmd, opts, *args): """ Print information about a working copy Print information about each ARG (default: '.') ARG is a working-copy path. """ from . import store as osc_store from .core import Package from .core import Project from .core import parseargs args = parseargs(args) for pdir in args: store = osc_store.get_store(pdir) if store.is_package: p = Package(pdir) else: p = Project(pdir, getPackageList=False, wc_check=False) print(p.info()) @cmdln.option('-M', '--multibuild-package', metavar='FLAVOR', action='append', help=HELP_MULTIBUILD_MANY) def do_sendsysrq(self, subcmd, opts, *args): """ Trigger a sysrq in a running build This is only going to work when the build is running in a supported VM. Also only a subset of sysrq are supported. Typical use case for debugging are 9, t and w in this sequence. usage: osc sendsysrq REPOSITORY ARCH SYSRQ osc sendsysrq PROJECT PACKAGE[:FLAVOR] REPOSITORY ARCH SYSRQ """ from . 
import store as osc_store from .core import MultibuildFlavorResolver from .core import cmdbuild from .core import is_package_dir from .core import slash_split from .core import store_read_package from .core import store_read_project args = slash_split(args) project = package = repo = arch = sysrq = None apiurl = self.get_api_url() if len(args) < 4: if is_package_dir(Path.cwd()): project = store_read_project(Path.cwd()) package = store_read_package(Path.cwd()) apiurl = osc_store.Store(Path.cwd()).apiurl repo = args[0] arch = args[1] sysrq = args[2] else: raise oscerr.WrongArgs('Too few arguments.') elif len(args) != 5: raise oscerr.WrongArgs('Wrong number of arguments.') else: project = self._process_project_name(args[0]) package = args[1] repo = args[2] arch = args[3] sysrq = args[4] if opts.multibuild_package: resolver = MultibuildFlavorResolver(apiurl, project, package, use_local=False) packages = resolver.resolve_as_packages(opts.multibuild_package) else: packages = [package] for package in packages: print(cmdbuild(apiurl, 'sendsysrq', project, package, arch, repo, None, sysrq)) @cmdln.option('-a', '--arch', metavar='ARCH', help='Restart builds for a specific architecture') @cmdln.option('-M', '--multibuild-package', metavar="FLAVOR", action='append', help=HELP_MULTIBUILD_MANY) @cmdln.option('-r', '--repo', metavar='REPO', help='Restart builds for a specific repository') @cmdln.option('--all', action='store_true', help='Restart all running builds of entire project') @cmdln.alias('abortbuild') def do_restartbuild(self, subcmd, opts, *args): """ Restart the build of a certain project or package usage: osc restartbuild [PROJECT [PACKAGE[:FLAVOR] [REPOSITORY [ARCH]]]] """ from . import store as osc_store from .core import MultibuildFlavorResolver from .core import cmdbuild from .core import is_package_dir from .core import is_project_dir from .core import slash_split from .core import store_read_package from .core import store_read_project args = slash_split(args) package = repo = arch = code = None apiurl = self.get_api_url() if opts.repo: repo = opts.repo if opts.arch: arch = opts.arch if len(args) < 1: if is_package_dir(Path.cwd()): project = store_read_project(Path.cwd()) package = store_read_package(Path.cwd()) apiurl = osc_store.Store(Path.cwd()).apiurl elif is_project_dir(Path.cwd()): project = store_read_project(Path.cwd()) apiurl = osc_store.Store(Path.cwd()).apiurl else: raise oscerr.WrongArgs('Too few arguments.') else: project = self._process_project_name(args[0]) if len(args) > 1: package = args[1] if len(args) > 2: repo = args[2] if len(args) > 3: arch = args[3] if not (opts.all or package or repo or arch): raise oscerr.WrongOptions('No option has been provided. 
If you want to restart all packages of the entire project, use --all option.') if opts.multibuild_package: resolver = MultibuildFlavorResolver(apiurl, project, package, use_local=False) packages = resolver.resolve_as_packages(opts.multibuild_package) else: packages = [package] for package in packages: print(cmdbuild(apiurl, subcmd, project, package, arch, repo)) @cmdln.option('-a', '--arch', metavar='ARCH', help='Delete all binary packages for a specific architecture') @cmdln.option('-M', '--multibuild-package', metavar="FLAVOR", action='append', help=HELP_MULTIBUILD_MANY) @cmdln.option('-r', '--repo', metavar='REPO', help='Delete all binary packages for a specific repository') @cmdln.option('--build-disabled', action='store_true', help='Delete all binaries of packages for which the build is disabled') @cmdln.option('--build-failed', action='store_true', help='Delete all binaries of packages for which the build failed') @cmdln.option('--broken', action='store_true', help='Delete all binaries of packages for which the package source is bad') @cmdln.option('--unresolvable', action='store_true', help='Delete all binaries of packages which have dependency errors') @cmdln.option('--all', action='store_true', help='Delete all binaries regardless of the package status (previously default)') @cmdln.alias("unpublish") def do_wipebinaries(self, subcmd, opts, *args): """ Delete all binary packages of a certain project/package With the optional argument you can specify a certain package otherwise all binary packages in the project will be deleted. usage: osc wipebinaries OPTS # works in checked out project dir osc wipebinaries OPTS PROJECT [PACKAGE[:FLAVOR]] osc unpublish OPTS # works in checked out project dir osc unpublish OPTS PROJECT [PACKAGE[:FLAVOR]] """ from .core import MultibuildFlavorResolver from .core import is_package_dir from .core import is_project_dir from .core import slash_split from .core import store_read_package from .core import store_read_project from .core import unpublish from .core import wipebinaries args = slash_split(args) package = project = None apiurl = self.get_api_url() # try to get project and package from checked out dirs if len(args) < 1: if is_project_dir(Path.cwd()): project = store_read_project(Path.cwd()) if is_package_dir(Path.cwd()): project = store_read_project(Path.cwd()) package = store_read_package(Path.cwd()) if project is None: raise oscerr.WrongArgs('Missing argument.') if len(args) > 2: raise oscerr.WrongArgs('Wrong number of arguments.') # respect given project and package if len(args) >= 1: project = self._process_project_name(args[0]) if len(args) == 2: package = args[1] codes = [] if opts.build_disabled: codes.append('disabled') if opts.build_failed: codes.append('failed') if opts.broken: codes.append('broken') if opts.unresolvable: codes.append('unresolvable') if len(codes) == 0: # don't do a second wipe if a filter got specified if opts.all or opts.repo or opts.arch: codes.append(None) if len(codes) == 0: raise oscerr.WrongOptions('No option has been provided. 
If you want to delete all binaries, use --all option.') if opts.multibuild_package: resolver = MultibuildFlavorResolver(apiurl, project, package, use_local=False) packages = resolver.resolve_as_packages(opts.multibuild_package) else: packages = [package] # make a new request for each code= parameter and for each package in packages for package in packages: for code in codes: if subcmd == 'unpublish': print(unpublish(apiurl, project, package, opts.arch, opts.repo, code)) else: print(wipebinaries(apiurl, project, package, opts.arch, opts.repo, code)) @cmdln.option('-d', '--destdir', default='./binaries', metavar='DIR', help='destination directory') @cmdln.option('-M', '--multibuild-package', metavar="FLAVOR", action='append', help='Get binaries from the specified flavor of a multibuild package.' ' It is meant for use from a package checkout when it is not possible to specify package:flavor.') @cmdln.option('--sources', action="store_true", help='also fetch source packages') @cmdln.option('--debuginfo', action="store_true", help='also fetch debug packages') @cmdln.option('--ccache', action="store_true", help='allow fetching ccache archive') def do_getbinaries(self, subcmd, opts, *args): """ Download binaries to a local directory This command downloads packages directly from the api server. Thus, it directly accesses the packages that are used for building others even when they are not "published" yet. usage: osc getbinaries REPOSITORY # works in checked out project/package (check out all archs in subdirs) osc getbinaries REPOSITORY ARCHITECTURE # works in checked out project/package osc getbinaries PROJECT REPOSITORY ARCHITECTURE osc getbinaries PROJECT PACKAGE REPOSITORY ARCHITECTURE osc getbinaries PROJECT PACKAGE REPOSITORY ARCHITECTURE FILE """ from .core import MultibuildFlavorResolver from .core import get_binary_file from .core import get_binarylist from .core import get_repos_of_project from .core import is_package_dir from .core import is_project_dir from .core import meta_get_packagelist from .core import output from .core import slash_split from .core import store_read_package from .core import store_read_project args = slash_split(args) apiurl = self.get_api_url() project = None package = None binary = None if len(args) < 1 and is_package_dir('.'): self.print_repos() architecture = None if len(args) == 4 or len(args) == 5: project = self._process_project_name(args[0]) package = args[1] repository = args[2] architecture = args[3] if len(args) == 5: binary = args[4] elif len(args) == 3: project, repository, architecture = args elif len(args) >= 1 and len(args) <= 2: if is_package_dir(Path.cwd()): project = store_read_project(Path.cwd()) package = store_read_package(Path.cwd()) elif is_project_dir(Path.cwd()): project = store_read_project(Path.cwd()) else: raise oscerr.WrongArgs('Missing arguments: either specify and ' ' or move to a project or package working copy') repository = args[0] if len(args) == 2: architecture = args[1] else: raise oscerr.WrongArgs('Need either 1, 2, 3 or 4 arguments') repos = list(get_repos_of_project(apiurl, project)) if not [i for i in repos if repository == i.name]: self.print_repos(exc_msg=f'Invalid repository \'{repository}\'', project=project) arches = [architecture] if architecture is None: arches = [i.arch for i in repos if repository == i.name] if package is None: package_specified = False package = meta_get_packagelist(apiurl, project, deleted=0) if opts.multibuild_package: # remove packages that do not have matching flavor for i in 
package.copy(): package_flavor = i.rsplit(":", 1) # package has flavor, check if the flavor is in opts.multibuild_package flavor_match = len(package_flavor) == 2 and package_flavor[1] in opts.multibuild_package # package has no flavor, check if "" is in opts.multibuild_package no_flavor_match = len(package_flavor) == 1 and "" in opts.multibuild_package if not flavor_match and not no_flavor_match: package.remove(i) else: package_specified = True if opts.multibuild_package: resolver = MultibuildFlavorResolver(apiurl, project, package, use_local=False) package = resolver.resolve_as_packages(opts.multibuild_package) else: package = [package] if binary is not None: output.print_msg("Binary filename was specified, ignoring source and debuginfo filters", print_to="debug") opts.sources = True opts.debuginfo = True # Set binary target directory and create if not existing target_dir = os.path.normpath(opts.destdir) if not os.path.isdir(target_dir): print(f'Creating directory "{target_dir}"') os.makedirs(target_dir, 0o755) for arch in arches: for pac in package: binaries = get_binarylist(apiurl, project, repository, arch, package=pac, verbose=True, withccache=opts.ccache) if not binaries: print('no binaries found: Either the package %s ' 'does not exist or no binaries have been built.' % pac, file=sys.stderr) continue for i in binaries: if binary is not None and binary != i.name: continue # skip source rpms if not opts.sources and (i.name.endswith('.src.rpm') or i.name.endswith('.sdeb')): continue # skip debuginfo rpms if not opts.debuginfo and ('-debuginfo-' in i.name or '-debugsource-' in i.name): continue if package_specified: # if package is specified, download everything into the target dir fname = f'{target_dir}/{i.name}' elif i.name.startswith("_") or i.name.endswith(".log"): # download logs and metadata into subdirs # to avoid overwriting them with files with identical names fname = f'{target_dir}/{pac}/{i.name}' else: # always download packages into the target dir fname = f'{target_dir}/{i.name}' if os.path.exists(fname): st = os.stat(fname) if st.st_mtime == i.mtime and st.st_size == i.size: continue get_binary_file(apiurl, project, repository, arch, i.name, package=pac, target_filename=fname, target_mtime=i.mtime, progress_meter=not opts.quiet) @cmdln.option('-b', '--bugowner', action='store_true', help='restrict listing to items where the user is bugowner') @cmdln.option('-m', '--maintainer', action='store_true', help='restrict listing to items where the user is maintainer') @cmdln.option('-a', '--all', action='store_true', help='all involvements') @cmdln.option('-U', '--user', metavar='USER', help='search for USER instead of yourself') @cmdln.option('--exclude-project', action='append', help='exclude requests for specified project') @cmdln.option('--maintained', action='store_true', help='limit search results to packages with maintained attribute set.') def do_my(self, subcmd, opts, *args): """ Show waiting work, packages, projects or requests involving yourself Examples: # list all open tasks for me osc my [work] # list packages where I am bugowner osc my pkg -b # list projects where I am maintainer osc my prj -m # list requests for all my projects and packages osc my rq # list requests, excluding project 'foo' and 'bar' osc my rq --exclude-project foo,bar # list requests I made osc my sr usage: osc my <TYPE> where TYPE is one of requests, submitrequests, projects or packages (rq, sr, prj or pkg) """ # TODO: please clarify the difference between sr and rq.
# My first implementation was to make no difference between requests FROM one # of my projects and TO one of my projects. The current implementation appears to make this difference. # The usage above indicates that sr would be a subset of rq, which is not the case with my tests. # jw. from . import conf from .core import ET from .core import Request from .core import get_request_collection from .core import get_user_projpkgs from .core import get_user_projpkgs_request_list from .core import http_GET from .core import makeurl args_rq = ('requests', 'request', 'req', 'rq', 'work') args_sr = ('submitrequests', 'submitrequest', 'submitreq', 'submit', 'sr') args_prj = ('projects', 'project', 'projs', 'proj', 'prj') args_pkg = ('packages', 'package', 'pack', 'pkgs', 'pkg') args_patchinfos = ('patchinfos', 'work') if opts.bugowner and opts.maintainer: raise oscerr.WrongOptions('Sorry, \'--bugowner\' and \'--maintainer\' are mutually exclusive') elif opts.all and (opts.bugowner or opts.maintainer): raise oscerr.WrongOptions('Sorry, \'--all\' and \'--bugowner\' or \'--maintainer\' are mutually exclusive') apiurl = self.get_api_url() exclude_projects = [] for i in opts.exclude_project or []: prj = i.split(',') if len(prj) == 1: exclude_projects.append(i) else: exclude_projects.extend(prj) if not opts.user: user = conf.get_apiurl_usr(apiurl) else: user = opts.user what = {'project': '', 'package': ''} type = "work" if len(args) > 0: type = args[0] list_patchinfos = list_requests = False if type in args_patchinfos: list_patchinfos = True if type in args_rq: list_requests = True elif type in args_prj: what = {'project': ''} elif type in args_sr: requests = get_request_collection(apiurl, roles=['creator'], user=user) for r in sorted(requests, key=lambda x: x.reqid): print(r.list_view(), '\n') return elif type not in args_pkg: raise oscerr.WrongArgs(f"invalid type {type}") role_filter = '' if opts.maintainer: role_filter = 'maintainer' elif opts.bugowner: role_filter = 'bugowner' elif list_requests: role_filter = 'maintainer' if opts.all: role_filter = '' if list_patchinfos: u = makeurl(apiurl, ['/search/package'], { 'match': f"([kind='patchinfo' and issue[@state='OPEN' and owner/@login='{user}']])" }) f = http_GET(u) root = xml_parse(f).getroot() if root.findall('package'): print("Patchinfos with open bugs assigned to you:\n") for node in root.findall('package'): project = node.get('project') package = node.get('name') print(project, "/", package, '\n') p = makeurl(apiurl, ['source', project, package], {'view': 'issues'}) fp = http_GET(p) issues = xml_parse(fp).findall('issue') for issue in issues: if issue.find('state') is None or issue.find('state').text != "OPEN": continue if issue.find('owner') is None or issue.find('owner').find('login').text != user: continue print(" #", issue.find('label').text, ': ', end=' ') desc = issue.find('summary') if desc is not None: print(desc.text) else: print("\n") print("") if list_requests: # try api side search as supported since OBS 2.2 try: requests = [] # open reviews u = makeurl(apiurl, ['request'], { 'view': 'collection', 'states': 'review', 'reviewstates': 'new', 'roles': 'reviewer', 'user': user, }) f = http_GET(u) root = xml_parse(f).getroot() if root.findall('request'): print("Requests which request a review by you:\n") for node in root.findall('request'): r = Request() r.read(node) print(r.list_view(), '\n') print("") # open requests u = makeurl(apiurl, ['request'], { 'view': 'collection', 'states': 'new', 'roles': 'maintainer', 'user': user, }) f =
http_GET(u) root = xml_parse(f).getroot() if root.findall('request'): print("Requests for your packages:\n") for node in root.findall('request'): r = Request() r.read(node) print(r.list_view(), '\n') print("") # declined requests submitted by me u = makeurl(apiurl, ['request'], { 'view': 'collection', 'states': 'declined', 'roles': 'creator', 'user': user, }) f = http_GET(u) root = xml_parse(f).getroot() if root.findall('request'): print("Declined requests created by you (revoke, reopen or supersede):\n") for node in root.findall('request'): r = Request() r.read(node) print(r.list_view(), '\n') print("") return except HTTPError as e: if e.code != 400: raise e # skip it ... try again with old style below res = get_user_projpkgs(apiurl, user, role_filter, exclude_projects, 'project' in what, 'package' in what, opts.maintained, opts.verbose) # map of project =>[list of packages] # if list of packages is empty user is maintainer of the whole project request_todo = {} dummy_elm = ET.Element('dummy') roles = {} if len(what.keys()) == 2: for i in res.get('project_id', res.get('project', dummy_elm)).findall('project'): request_todo[i.get('name')] = [] roles[i.get('name')] = [p.get('role') for p in i.findall('person') if p.get('userid') == user] for i in res.get('package_id', res.get('package', dummy_elm)).findall('package'): prj = i.get('project') roles[f"{prj}/{i.get('name')}"] = [p.get('role') for p in i.findall('person') if p.get('userid') == user] if prj not in request_todo or request_todo[prj] != []: request_todo.setdefault(prj, []).append(i.get('name')) else: for i in res.get('project_id', res.get('project', dummy_elm)).findall('project'): roles[i.get('name')] = [p.get('role') for p in i.findall('person') if p.get('userid') == user] if list_requests: # old style, only for OBS 2.1 and before. Should not be used, since it is slow and incomplete requests = get_user_projpkgs_request_list(apiurl, user, projpkgs=request_todo) for r in sorted(requests, key=lambda x: x.reqid): print(r.list_view(), '\n') if not requests: print(" -> try also 'osc my sr' to see more.") else: for i in sorted(roles.keys()): out = f'{i}' prjpac = i.split('/') if type in args_pkg and len(prjpac) == 1 and not opts.verbose: continue if opts.verbose: out = f"{i} ({', '.join(sorted(roles[i]))})" if len(prjpac) == 2: out = f" {prjpac[1]} ({', '.join(sorted(roles[i]))})" print(out) @cmdln.option('--repos-baseurl', action='store_true', help='show base URLs of download repositories') @cmdln.option('-e', '--exact', action='store_true', help='show only exact matches, this is default now') @cmdln.option('-s', '--substring', action='store_true', help='Show also results where the search term is a sub string, slower search') @cmdln.option('--package', action='store_true', help='search for a package') @cmdln.option('--project', action='store_true', help='search for a project') @cmdln.option('--title', action='store_true', help='search for matches in the \'title\' element') @cmdln.option('--description', action='store_true', help='search for matches in the \'description\' element') @cmdln.option('-a', '--limit-to-attribute', metavar='ATTRIBUTE', help='match only when given attribute exists in meta data') @cmdln.option('-V', '--version', action='store_true', help='show package version, revision, and srcmd5. 
CAUTION: This is slow and unreliable') @cmdln.option('-i', '--involved', action='store_true', help='show projects/packages where given person (or myself) is involved as bugowner or maintainer [[{group|person}/]] default: person') @cmdln.option('-b', '--bugowner', action='store_true', help='as -i, but only bugowner') @cmdln.option('-m', '--maintainer', action='store_true', help='as -i, but only maintainer') @cmdln.option('-M', '--mine', action='store_true', help='shorthand for --bugowner --package') @cmdln.option('--csv', action='store_true', help='generate output in CSV (separated by |)') @cmdln.option('--binary', action='store_true', help='search binary packages') @cmdln.option('-B', '--baseproject', metavar='PROJECT', help='search packages built for PROJECT (implies --binary)') @cmdln.option('--binaryversion', metavar='VERSION', help='search for binary with specified version (implies --binary)') @cmdln.alias('se') @cmdln.alias('bse') def do_search(self, subcmd, opts, *args): """ Search for a project and/or package If no option is specified osc will search for projects and packages which contains the \'search term\' in their name, title or description. usage: osc search \'search term\' osc bse ... ('osc search --binary') osc se 'perl(Foo::Bar)' ('osc search --package perl-Foo-Bar') """ from . import conf from .core import build_table from .core import filter_role from .core import get_source_rev from .core import search from .core import xpath_join def build_xpath(attr, what, substr=False): if substr: return f'contains({attr}, \'{what}\')' else: return f'{attr} = \'{what}\'' search_term = '' if len(args) > 1: raise oscerr.WrongArgs('Too many arguments') elif len(args) == 0: if opts.involved or opts.bugowner or opts.maintainer or opts.mine: search_term = conf.get_apiurl_usr(conf.config['apiurl']) else: raise oscerr.WrongArgs('Too few arguments') else: search_term = args[0] # XXX: is it a good idea to make this the default? 
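# Illustrative note (mirrors the docstring above): a term like 'perl(Foo::Bar)' is
# rewritten to 'perl-Foo-Bar' by the substitutions below ('::' and '(' become '-',
# the trailing ')' is dropped) and the query is turned into a package search.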
# support perl symbols: if re.match(r'^perl\(\w+(::\w+)*\)$', search_term): search_term = re.sub(r'\)', '', re.sub(r'(::|\()', '-', search_term)) opts.package = True if opts.mine: opts.bugowner = True opts.package = True if (opts.title or opts.description) and (opts.involved or opts.bugowner or opts.maintainer): raise oscerr.WrongOptions('Sorry, the options \'--title\' and/or \'--description\' ' 'are mutually exclusive with \'-i\'/\'-b\'/\'-m\'/\'-M\'') if opts.substring and opts.exact: raise oscerr.WrongOptions('Sorry, the options \'--substring\' and \'--exact\' are mutually exclusive') if not opts.substring: opts.exact = True if subcmd == 'bse' or opts.baseproject or opts.binaryversion: opts.binary = True if opts.binary and (opts.title or opts.description or opts.involved or opts.bugowner or opts.maintainer or opts.project or opts.package): raise oscerr.WrongOptions('Sorry, \'--binary\' and \'--title\' or \'--description\' or \'--involved ' 'or \'--bugowner\' or \'--maintainer\' or \'--limit-to-attribute \\ ' 'or \'--project\' or \'--package\' are mutually exclusive') apiurl = self.get_api_url() xpath = '' if opts.title: xpath = xpath_join(xpath, build_xpath('title', search_term, opts.substring), inner=True) if opts.description: xpath = xpath_join(xpath, build_xpath('description', search_term, opts.substring), inner=True) if opts.project or opts.package or opts.binary: xpath = xpath_join(xpath, build_xpath('@name', search_term, opts.substring), inner=True) # role filter role_filter = '' if opts.bugowner or opts.maintainer or opts.involved: tmp = search_term.split(':') if len(tmp) > 1: search_type, search_term = [tmp[0], tmp[1]] else: search_type = 'person' search_dict = {'person': 'userid', 'group': 'groupid'} try: search_id = search_dict[search_type] except KeyError: search_type, search_id = ['person', 'userid'] xpath = xpath_join(xpath, f'{search_type}/@{search_id} = \'{search_term}\'', inner=True) role_filter = f'{search_term} ({search_type})' role_filter_xpath = xpath if opts.bugowner and not opts.maintainer: xpath = xpath_join(xpath, f'{search_type}/@role=\'bugowner\'', op='and') role_filter = 'bugowner' elif not opts.bugowner and opts.maintainer: xpath = xpath_join(xpath, f'{search_type}/@role=\'maintainer\'', op='and') role_filter = 'maintainer' if opts.limit_to_attribute: xpath = xpath_join(xpath, f'attribute/@name=\'{opts.limit_to_attribute}\'', op='and') if opts.baseproject: xpath = xpath_join(xpath, f'path/@project=\'{self._process_project_name(opts.baseproject)}\'', op='and') if opts.binaryversion: m = re.match(r'(.+)-(.*?)$', opts.binaryversion) if m: if m.group(2) != '': xpath = xpath_join(xpath, f'@versrel=\'{opts.binaryversion}\'', op='and') else: xpath = xpath_join(xpath, f'@version=\'{m.group(1)}\'', op='and') else: xpath = xpath_join(xpath, f'@version=\'{opts.binaryversion}\'', op='and') if not xpath: xpath = xpath_join(xpath, build_xpath('@name', search_term, opts.substring), inner=True) xpath = xpath_join(xpath, build_xpath('title', search_term, opts.substring), inner=True) xpath = xpath_join(xpath, build_xpath('description', search_term, opts.substring), inner=True) what = {'project': xpath, 'package': xpath} if opts.project and not opts.package: what = {'project': xpath} elif not opts.project and opts.package: what = {'package': xpath} elif opts.binary: what = {'published/binary/id': xpath} try: res = search(apiurl, **what) except HTTPError as e: if e.code != 400 or not role_filter: raise e # backward compatibility: local role filtering if opts.limit_to_attribute: 
role_filter_xpath = xpath_join(role_filter_xpath, f'attribute/@name=\'{opts.limit_to_attribute}\'', op='and') what = {kind: role_filter_xpath for kind in what.keys()} res = search(apiurl, **what) filter_role(res, search_term, role_filter) if role_filter: role_filter = f'{search_term} ({role_filter})' kind_map = {'published/binary/id': 'binary'} for kind, root in res.items(): results = [] for node in root.findall(kind_map.get(kind, kind)): result = [] project = node.get('project') package = None if project is None: project = node.get('name') else: if kind == 'published/binary/id': package = node.get('package') else: package = node.get('name') result.append(project) if package is not None: result.append(package) if opts.version and package is not None: sr = get_source_rev(apiurl, project, package) v = sr.get('version') r = sr.get('rev') s = sr.get('srcmd5') if not v or v == 'unknown': v = '-' if not r: r = '-' if not s: s = '-' result.append(v) result.append(r) result.append(s) if opts.verbose: if opts.binary: result.append(node.get('repository') or '-') result.append(node.get('arch') or '-') result.append(node.get('version') or '-') result.append(node.get('release') or '-') else: title = node.findtext('title').strip() if len(title) > 60: title = title[:61] + '...' result.append(title) if opts.repos_baseurl: # FIXME: no hardcoded URL of instance result.append(f"http://download.opensuse.org/repositories/{project.replace(':', ':/')}/") if kind == 'published/binary/id': result.append(node.get('filepath')) results.append(result) if not results: print(f'No matches found for \'{role_filter or search_term}\' in {kind}s') continue # construct a sorted, flat list # Sort by first column, follwed by second column if we have two columns, else sort by first. if len(results[0]) > 1: sorted_results = sorted(results, key=itemgetter(0, 1)) else: sorted_results = sorted(results, key=itemgetter(0)) new = [] for i in sorted_results: new.extend(i) results = new headline = [] if kind in ('package', 'published/binary/id'): headline = ['# Project', '# Package'] else: headline = ['# Project'] if opts.version and kind == 'package': headline.append('# Ver') headline.append('Rev') headline.append('Srcmd5') if opts.verbose: if opts.binary: headline.extend(['# Repository', '# Arch', '# Version', '# Release']) else: headline.append('# Title') if opts.repos_baseurl: headline.append('# URL') if opts.binary: headline.append('# filepath') if not opts.csv: if len(what.keys()) > 1: print('#' * 68) print(f'matches for \'{role_filter or search_term}\' in {kind}s:\n') for row in build_table(len(headline), results, headline, 2, csv=opts.csv): print(row) @cmdln.option('-p', '--project', metavar='project', help='specify the path to a project') @cmdln.option('-n', '--name', metavar='name', help='specify a package name') @cmdln.option('-t', '--title', metavar='title', help='set a title') @cmdln.option('-d', '--description', metavar='description', help='set the description of the package') @cmdln.option('', '--delete-old-files', action='store_true', help='delete existing files from the server') @cmdln.option('-c', '--commit', action='store_true', help='commit the new files') @cmdln.option('srpm') def do_importsrcpkg(self, subcmd, opts): """ Import a new package from a src.rpm A new package dir will be created inside the project dir (if no project is specified and the current working dir is a project dir the package will be created in this project). 
If the package does not exist on the server it will be created too, otherwise the meta data of the existing package will be updated (<title /> and <description />). The src.rpm will be extracted into the package dir. The files won't be committed unless you explicitly pass the --commit switch. SRPM is the path of the src.rpm in the local filesystem, or a URL. """ from . import conf from . import store as osc_store from .core import ET from .core import Package from .core import Project from .core import addFiles from .core import createPackageDir from .core import edit_meta from .core import is_project_dir from .core import meta_exists from .core import parse_meta_to_string from .core import store_read_project from .core import unpack_srcrpm from .grabber import OscFileGrabber from .util import rpmquery srpm = opts.srpm if opts.delete_old_files and conf.config['do_package_tracking']: # IMHO the --delete-old-files option doesn't really fit into our # package tracking strategy print('--delete-old-files is not supported anymore', file=sys.stderr) print('when do_package_tracking is enabled', file=sys.stderr) sys.exit(1) if '://' in srpm: if srpm.endswith('/'): print(f'{srpm} is not a valid link. It must not end with /') sys.exit(1) print('trying to fetch', srpm) OscFileGrabber().urlgrab(srpm) srpm = os.path.basename(srpm) srpm = os.path.abspath(srpm) if not os.path.isfile(srpm): print(f'file \'{srpm}\' does not exist', file=sys.stderr) sys.exit(1) if opts.project: project_dir = opts.project else: project_dir = str(Path.cwd()) if not is_project_dir(project_dir): raise oscerr.WrongArgs(f"'{project_dir}' is no project working copy") if conf.config['do_package_tracking']: project = Project(project_dir) else: project = store_read_project(project_dir) rpmq = rpmquery.RpmQuery.query(srpm) title, pac, descr, url = rpmq.summary(), rpmq.name(), rpmq.description(), rpmq.url() if url is None: url = '' if opts.title: title = opts.title if opts.name: pac = opts.name elif pac is not None: pac = pac.decode() if opts.description: descr = opts.description # title and description can be empty if not pac: print('please specify a package name with the \'--name\' option.
' 'The automatic detection failed', file=sys.stderr) sys.exit(1) if conf.config['do_package_tracking']: createPackageDir(os.path.join(project.dir, pac), project) else: if not os.path.exists(os.path.join(project_dir, pac)): apiurl = osc_store.Store(project_dir).apiurl user = conf.get_apiurl_usr(apiurl) data = meta_exists(metatype='pkg', path_args=(project, pac), template_args=({ 'name': pac, 'user': user}), apiurl=apiurl) if data: data = xml_fromstring(parse_meta_to_string(data)) data.find('title').text = ''.join(title) data.find('description').text = ''.join(descr) data.find('url').text = url data = ET.tostring(data, encoding="unicode") else: print('error - cannot get meta data', file=sys.stderr) sys.exit(1) edit_meta(metatype='pkg', path_args=(project, pac), data=data, apiurl=apiurl) Package.init_package(apiurl, project, pac, os.path.join(project_dir, pac)) else: print('error - local package already exists', file=sys.stderr) sys.exit(1) unpack_srcrpm(srpm, os.path.join(project_dir, pac)) p = Package(os.path.join(project_dir, pac)) if len(p.filenamelist) == 0 and opts.commit: print('Adding files to working copy...') addFiles(glob.glob(f'{os.path.join(project_dir, pac)}/*')) if conf.config['do_package_tracking']: project.commit((pac, )) else: p.update_datastructs() p.commit() elif opts.commit and opts.delete_old_files: for filename in p.filenamelist: p.delete_remote_source_file(filename) p.update_local_filesmeta() print('Adding files to working copy...') addFiles(glob.glob('*')) p.update_datastructs() p.commit() else: print('No files were committed to the server. Please ' 'commit them manually.') print(f'Package \'{pac}\' only imported locally') sys.exit(1) print(f'Package \'{pac}\' imported successfully') @cmdln.option('-X', '-m', '--method', default='GET', metavar='HTTP_METHOD', choices=('HEAD', 'GET', 'PUT', 'POST', 'DELETE'), help='specify HTTP method to use (HEAD|GET|PUT|DELETE|POST)') @cmdln.option('-e', '--edit', default=None, action='store_true', help='GET, edit and PUT the location') @cmdln.option('-d', '--data', default=None, metavar='STRING', help='specify string data for e.g. POST') @cmdln.option('-T', '-f', '--file', default=None, metavar='FILE', help='specify filename to upload, uses PUT mode by default') @cmdln.option('-a', '--add-header', default=None, metavar='NAME STRING', nargs=2, action='append', dest='headers', help='add the specified header to the request') @cmdln.option('url', help="either URL '/path' or full URL with 'scheme://hostname/path'") def do_api(self, subcmd, opts): """ Issue an arbitrary request to the API Useful for testing. URL can be specified either partially (only the path component), or fully with URL scheme and hostname ('http://...'). Note the global -A and -H options (see osc help). 
Examples: osc api /source/home:user osc api -X PUT -T /etc/fstab source/home:user/test5/myfstab osc api -e /configuration """ from .core import edit_text from .core import http_request url = opts.url apiurl = self.get_api_url() # default is PUT when uploading files if opts.file and opts.method == 'GET': opts.method = 'PUT' if not url.startswith('http'): if not url.startswith('/'): url = '/' + url url = apiurl + url if opts.headers: opts.headers = dict(opts.headers) r = http_request(opts.method, url, data=opts.data, file=opts.file, headers=opts.headers) if opts.edit: # to edit the output, we need to read all of it # it's going to run out of memory if the data is too big out = r.read() text = edit_text(out) r = http_request("PUT", url, data=text, headers=opts.headers) while True: data = r.read(8192) if not data: break sys.stdout.buffer.write(data) @cmdln.option('-b', '--bugowner-only', action='store_true', help='Show only the bugowner') @cmdln.option('-B', '--bugowner', action='store_true', help='Show only the bugowner if defined, or maintainer otherwise') @cmdln.option('-e', '--email', action='store_true', help='show email addresses instead of user names') @cmdln.option('--nodevelproject', action='store_true', help='do not follow a defined devel project ' '(primary project where a package is developed)') @cmdln.option('-D', '--devel-project', metavar='devel_project', help='define the project where this package is primarily developed') @cmdln.option('-a', '--add', metavar='user', help='add a new person for given role ("maintainer" by default)') @cmdln.option('-A', '--all', action='store_true', help='list all found entries not just the first one') @cmdln.option('-s', '--set-bugowner', metavar='user', help='Set the bugowner to specified person (or group via group: prefix)') @cmdln.option('-S', '--set-bugowner-request', metavar='user', help='Set the bugowner to specified person via a request (or group via group: prefix)') @cmdln.option('-U', '--user', metavar='USER', help='All official maintained instances for the specified USER (specified by the username or email)') @cmdln.option('-G', '--group', metavar='GROUP', help='All official maintained instances for the specified GROUP') @cmdln.option('-d', '--delete', metavar='user', help='delete a maintainer/bugowner (can be specified via --role)') @cmdln.option('-r', '--role', metavar='role', action='append', default=[], help='Specify user role') @cmdln.option('-m', '--message', help='Define message as commit entry or request description') @cmdln.alias('bugowner') def do_maintainer(self, subcmd, opts, *args): """ Show maintainers according to server side configuration # Search for official maintained sources in OBS instance osc maintainer BINARY_OR_PACKAGE_NAME <options> osc maintainer -U <user> <options> osc maintainer -G <group> <options> # Lookup via containers osc maintainer <options> osc maintainer PRJ <options> osc maintainer PRJ PKG <options> The tool looks up the default responsible person for a certain project or package. When used with an OBS 2.4 (or later) server, it does the lookup for a given binary according to the server side configuration of default owners. The tool also looks into devel packages and falls back to the project in case a package has no defined maintainer. Please use "osc meta pkg" in case you need to know the definition in a specific container. PRJ and PKG default to current working-copy path.
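        Examples (illustrative only, PRJ/PKG/USER are placeholders):
            osc maintainer                    # maintainers of the current working copy
            osc maintainer PRJ PKG -e         # show email addresses instead of user names
            osc maintainer PRJ PKG -s USER    # set USER as bugowner of PRJ/PKG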
""" from .core import ET from .core import _html_escape from .core import addPerson from .core import build_table from .core import delPerson from .core import edit_message from .core import get_group_data from .core import get_user_data from .core import http_POST from .core import makeurl from .core import owner from .core import raw_input from .core import setBugowner from .core import set_devel_project from .core import show_package_meta from .core import show_project_meta from .core import slash_split from .core import store_read_package from .core import store_read_project def get_maintainer_data(apiurl, maintainer, verbose=False): tags = ('email',) if maintainer.startswith('group:'): group = maintainer.replace('group:', '') if verbose: return [maintainer] + get_group_data(apiurl, group, 'title', *tags) return get_group_data(apiurl, group, 'email') if verbose: tags = ('login', 'realname', 'email') return get_user_data(apiurl, maintainer, *tags) def setBugownerHelper(apiurl, project, package, bugowner): try: setBugowner(apiurl, project, package, bugowner) except HTTPError as e: if e.code != 403: raise print("No write permission in", project, end=' ') if package: print("/", package, end=' ') print() repl = raw_input('\nCreating a request instead? (y/n) ') if repl.lower() == 'y': opts.set_bugowner_request = bugowner opts.set_bugowner = None search_term = None prj = None pac = None metaroot = None searchresult = None roles = ['bugowner', 'maintainer'] if len(opts.role): roles = opts.role elif opts.bugowner_only or opts.bugowner or subcmd == 'bugowner': roles = ['bugowner'] args = slash_split(args) if opts.user or opts.group: if len(args) != 0: raise oscerr.WrongArgs('Either search for user or for packages.') elif len(args) == 0: try: pac = store_read_package('.') except oscerr.NoWorkingCopy: pass prj = store_read_project('.') elif len(args) == 1: # it is unclear if one argument is a search_term or a project, try search_term first for new OBS 2.4 search_term = prj = self._process_project_name(args[0]) elif len(args) == 2: prj = self._process_project_name(args[0]) pac = args[1] else: raise oscerr.WrongArgs('Wrong number of arguments.') apiurl = self.get_api_url() # Try the OBS 2.4 way first. if search_term or opts.user or opts.group: limit = None if opts.all: limit = 0 filterroles = roles if filterroles == ['bugowner', 'maintainer']: # use server side configured default filterroles = None if search_term: # try the package name first, it is faster and may catch cases where no # binary with that name exists for the given package name searchresult = owner(apiurl, search_term, "package", usefilter=filterroles, devel=None, limit=limit) if searchresult is None or len(searchresult) == 0: searchresult = owner(apiurl, search_term, "binary", usefilter=filterroles, devel=None, limit=limit) if searchresult is not None and len(searchresult) == 0: # We talk to an OBS 2.4 or later understanding the call if opts.set_bugowner or opts.set_bugowner_request: # filtered search did not succeed, but maybe we want to set an owner initially? searchresult = owner(apiurl, search_term, "binary", usefilter="", devel=None, limit=-1) if searchresult: print("WARNING: the binary exists, but has no matching maintainership roles defined.") print("Do you want to set it in the container where the binary appeared first?") result = searchresult.find('owner') print("This is: " + result.get('project'), end=' ') if result.get('package'): print(" / " + result.get('package')) repl = raw_input('\nUse this container? 
(y/n) ') if repl.lower() != 'y': searchresult = None elif opts.user: if "@" in opts.user: # resolve email address to login from . import obs_api users = obs_api.Person.search(apiurl, email=opts.user) if users: opts.user = users[0].login searchresult = owner(apiurl, opts.user, "user", usefilter=filterroles, devel=None) elif opts.group: searchresult = owner(apiurl, opts.group, "group", usefilter=filterroles, devel=None) else: raise oscerr.WrongArgs('osc bug, no valid search criteria') if opts.add: if searchresult: for result in searchresult.findall('owner'): for role in roles: addPerson(apiurl, result.get('project'), result.get('package'), opts.add, role) else: for role in roles: addPerson(apiurl, prj, pac, opts.add, role) elif opts.set_bugowner or opts.set_bugowner_request: bugowner = opts.set_bugowner or opts.set_bugowner_request requestactionsxml = "" if searchresult: for result in searchresult.findall('owner'): if opts.set_bugowner: setBugownerHelper(apiurl, result.get('project'), result.get('package'), opts.set_bugowner) if opts.set_bugowner_request: args = [bugowner, result.get('project')] if result.get('package'): args = args + [result.get('package')] requestactionsxml += self._set_bugowner(args, opts) else: if opts.set_bugowner: setBugownerHelper(apiurl, prj, pac, opts.set_bugowner) if opts.set_bugowner_request: args = [bugowner, prj] if pac: args = args + [pac] requestactionsxml += self._set_bugowner(args, opts) if requestactionsxml != "": if opts.message: message = opts.message else: message = edit_message() xml = """<request> %s <state name="new"/> <description>%s</description> </request> """ % \ (requestactionsxml, _html_escape(message or "")) u = makeurl(apiurl, ['request'], query={"cmd": "create"}) f = http_POST(u, data=xml) root = xml_parse(f).getroot() print("Request ID:", root.get('id')) elif opts.delete: if searchresult: for result in searchresult.findall('owner'): for role in roles: delPerson(apiurl, result.get('project'), result.get('package'), opts.delete, role) else: for role in roles: delPerson(apiurl, prj, pac, opts.delete, role) elif opts.devel_project: # XXX: does it really belong to this command? 
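            # Note (added comment): the '-D/--devel-project' branch below only updates
            # the devel project of PRJ/PKG via set_devel_project(); it therefore needs a
            # project context, which is what the working-copy check right below enforces.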
if not prj: path = os.getcwd() msg = f"Directory '{path}' is not a working copy" raise oscerr.NoWorkingCopy(msg) set_devel_project(apiurl, prj, pac, opts.devel_project) else: if pac: m = show_package_meta(apiurl, prj, pac) metaroot = xml_fromstring(b''.join(m)) if not opts.nodevelproject: while metaroot.findall('devel'): d = metaroot.find('devel') prj = d.get('project', prj) pac = d.get('package', pac) if opts.verbose: print(f"Following to the development space: {prj}/{pac}") m = show_package_meta(apiurl, prj, pac) metaroot = xml_fromstring(b''.join(m)) if not metaroot.findall('person') and not metaroot.findall('group'): if opts.verbose: print("No dedicated persons in package defined, showing the project persons.") pac = None m = show_project_meta(apiurl, prj) metaroot = xml_fromstring(b''.join(m)) else: # fallback to project lookup for old servers if prj and not searchresult: m = show_project_meta(apiurl, prj) metaroot = xml_fromstring(b''.join(m)) # extract the maintainers projects = [] # from owner search if searchresult: for result in searchresult.findall('owner'): maintainers = {} maintainers.setdefault("project", result.get('project')) maintainers.setdefault("package", result.get('package')) for person in result.findall('person'): maintainers.setdefault(person.get('role'), []).append(person.get('name')) for group in result.findall('group'): maintainers.setdefault(group.get('role'), []).append("group:" + group.get('name')) projects = projects + [maintainers] # from meta data if metaroot: # we have just one result maintainers = {} for person in metaroot.findall('person'): maintainers.setdefault(person.get('role'), []).append(person.get('userid')) for group in metaroot.findall('group'): maintainers.setdefault(group.get('role'), []).append("group:" + group.get('groupid')) projects = [maintainers] # showing the maintainers for maintainers in projects: indent = "" definingproject = maintainers.get("project") if definingproject: definingpackage = maintainers.get("package") indent = " " if definingpackage: print(f"Defined in package: {definingproject}/{definingpackage} ") else: print("Defined in project: ", definingproject) if prj: # not for user/group search for role in roles: if opts.bugowner and not maintainers.get(role, []): role = 'maintainer' if pac: print(f"{indent}{role} of {prj}/{pac} : ") else: print(f"{indent}{role} of {prj} : ") if opts.email: emails = [] for maintainer in maintainers.get(role, []): user = get_maintainer_data(apiurl, maintainer, verbose=False) if user: emails.append(''.join(user)) print(indent, end=' ') print(', '.join(emails) or '-') elif opts.verbose: userdata = [] for maintainer in maintainers.get(role, []): user = get_maintainer_data(apiurl, maintainer, verbose=True) userdata.append(user[0]) if user[1] != '-': userdata.append(f"{user[1]} <{user[2]}>") else: userdata.append(user[2]) for row in build_table(2, userdata, None, 3): print(indent, end=' ') print(row) else: print(indent, end=' ') print(', '.join(maintainers.get(role, [])) or '-') print() @cmdln.alias('who') @cmdln.alias('user') @cmdln.option('user', nargs='*') def do_whois(self, subcmd, opts): """ Show fullname and email of a buildservice user """ from . import conf from . 
import obs_api from .output import print_msg apiurl = self.get_api_url() usernames = opts.user if not usernames: usernames = [conf.config["api_host_options"][apiurl]["user"]] # remove duplicates usernames = list(set(usernames)) users = obs_api.Person.search(apiurl, login=usernames) users_by_login = {i.login: i for i in users} for username in usernames: user = users_by_login.get(username, None) if not user: print_msg(f"User '{username}' does not exist", print_to="warning") continue print(f'{user.login}: "{user.realname}" <{user.email}>') @cmdln.name("create-pbuild-config") @cmdln.alias('cpc') @cmdln.option("repository") @cmdln.option("arch") def do_create_pbuild_config(self, subcmd, opts): """ This command creates the files necessary to build with the pbuild tool. It creates the _config and _pbuild files in the project directory. Changes made there cannot be submitted back, unless the project is managed in git. Examples: osc cpc REPOSITORY ARCH """ from .core import create_pbuild_config from .core import is_package_dir from .core import is_project_dir from .core import store_read_project apiurl = self.get_api_url() project = None project_dir = str(Path.cwd()) if is_project_dir(project_dir): project = store_read_project(project_dir) elif is_package_dir(project_dir): project_dir = str(Path.cwd().parent) project = store_read_project(project_dir) else: raise oscerr.WrongArgs('Creating pbuild only works in a checked out project or package') create_pbuild_config(apiurl, project, opts.repository, opts.arch, project_dir) @cmdln.option('-r', '--revision', metavar='rev', help='print out the specified revision') @cmdln.option('-e', '--expand', action='store_true', help='(default) force expansion of linked packages.') @cmdln.option('-u', '--unexpand', action='store_true', help='always work with unexpanded packages.') @cmdln.option('-D', '--deleted', action='store_true', help='access file in a deleted package') @cmdln.option('-M', '--meta', action='store_true', help='list meta data files') @cmdln.alias('blame') @cmdln.alias('less') def do_cat(self, subcmd, opts, *args): """ Output the content of a file to standard output Examples: osc cat file osc cat project package file osc cat project/package/file osc cat http://api.opensuse.org/build/.../_log osc cat http://api.opensuse.org/source/../_link osc less file osc less project package file osc blame file osc blame project package file """ from .core import http_GET from .core import is_package_dir from .core import makeurl from .core import parseRevisionOption from .core import run_pager from .core import show_upstream_srcmd5 from .core import slash_split from .core import store_read_package from .core import store_read_project from .core import streamfile if len(args) == 1 and (args[0].startswith('http://') or args[0].startswith('https://')): opts.method = 'GET' opts.headers = None opts.data = None opts.file = None return self.do_api('list', opts, *args) args = slash_split(args) project = package = filename = None if len(args) == 3: project = self._process_project_name(args[0]) package = args[1] filename = args[2] elif len(args) == 1 and is_package_dir(Path.cwd()): project = store_read_project(Path.cwd()) package = store_read_package(Path.cwd()) filename = args[0] else: raise oscerr.WrongArgs('Wrong number of arguments.') rev, dummy = parseRevisionOption(opts.revision) apiurl = self.get_api_url() query = {} if subcmd == 'blame': query['view'] = "blame" if opts.meta: query['meta'] = 1 if opts.deleted: query['deleted'] = 1 if opts.revision:
query['rev'] = opts.revision if not opts.unexpand: query['rev'] = show_upstream_srcmd5(apiurl, project, package, expand=True, revision=opts.revision, meta=opts.meta, deleted=opts.deleted) query['expand'] = 1 # important for blame case to follow links in old revisions u = makeurl(apiurl, ['source', project, package, filename], query=query) if subcmd == 'less': f = http_GET(u) run_pager(b''.join(f.readlines())) else: for data in streamfile(u): if isinstance(data, str): sys.stdout.write(data) else: sys.stdout.buffer.write(data) # helper function to download a file from a specific revision def download(self, name, md5, dir, destfile): from .core import BUFSIZE from .core import http_GET from .core import makeurl from .core import streamfile o = open(destfile, 'wb') if md5 != '': query = {'rev': dir['srcmd5']} u = makeurl(dir['apiurl'], ['source', dir['project'], dir['package'], name], query=query) for buf in streamfile(u, http_GET, BUFSIZE): o.write(buf) o.close() @cmdln.option('-d', '--destdir', default='repairlink', metavar='DIR', help='destination directory') def do_repairlink(self, subcmd, opts, *args): """ Repair a broken source link This command checks out a package with merged source changes. It uses a 3-way merge to resolve file conflicts. After reviewing/repairing the merge, use 'osc resolved ...' and 'osc ci' to re-create a working source link. usage: * For merging conflicting changes of a checkout package: osc repairlink * Check out a package and merge changes: osc repairlink PROJECT PACKAGE * Pull conflicting changes from one project into another one: osc repairlink PROJECT PACKAGE INTO_PROJECT [INTO_PACKAGE] """ from .core import ET from .core import Package from .core import binary_file from .core import http_GET from .core import is_package_dir from .core import makeurl from .core import run_external from .core import slash_split from .core import statfrmt from .core import store_read_package from .core import store_read_project from .core import store_write_string apiurl = self.get_api_url() args = slash_split(args) if len(args) >= 3 and len(args) <= 4: prj = self._process_project_name(args[0]) package = target_package = args[1] target_prj = self._process_project_name(args[2]) if len(args) == 4: target_package = args[3] elif len(args) == 2: target_prj = prj = self._process_project_name(args[0]) target_package = package = args[1] elif is_package_dir(Path.cwd()): target_prj = prj = store_read_project(Path.cwd()) target_package = package = store_read_package(Path.cwd()) else: raise oscerr.WrongArgs('Please specify project and package') # first try stored reference, then lastworking query = {'rev': 'latest'} u = makeurl(apiurl, ['source', prj, package], query=query) f = http_GET(u) root = xml_parse(f).getroot() linkinfo = root.find('linkinfo') if linkinfo is None: raise oscerr.APIError('package is not a source link') if linkinfo.get('error') is None: raise oscerr.APIError('source link is not broken') workingrev = None if linkinfo.get('baserev'): query = {'rev': 'latest', 'linkrev': 'base'} u = makeurl(apiurl, ['source', prj, package], query=query) f = http_GET(u) root = xml_parse(f).getroot() linkinfo = root.find('linkinfo') if linkinfo.get('error') is None: workingrev = linkinfo.get('xsrcmd5') if workingrev is None: query = {'lastworking': 1} u = makeurl(apiurl, ['source', prj, package], query=query) f = http_GET(u) root = xml_parse(f).getroot() linkinfo = root.find('linkinfo') if linkinfo is None: raise oscerr.APIError('package is not a source link') if linkinfo.get('error') is 
None: raise oscerr.APIError('source link is not broken') workingrev = linkinfo.get('lastworking') if workingrev is None: raise oscerr.APIError('source link never worked') print("using last working link target") else: print("using link target of last commit") query = {'expand': 1, 'emptylink': 1} u = makeurl(apiurl, ['source', prj, package], query=query) f = http_GET(u) meta = f.readlines() root_new = xml_fromstring(b''.join(meta)) dir_new = {'apiurl': apiurl, 'project': prj, 'package': package} dir_new['srcmd5'] = root_new.get('srcmd5') dir_new['entries'] = [[n.get('name'), n.get('md5')] for n in root_new.findall('entry')] query = {'rev': workingrev} u = makeurl(apiurl, ['source', prj, package], query=query) f = http_GET(u) root_oldpatched = xml_parse(f).getroot() linkinfo_oldpatched = root_oldpatched.find('linkinfo') if linkinfo_oldpatched is None: raise oscerr.APIError('working rev is not a source link?') if linkinfo_oldpatched.get('error') is not None: raise oscerr.APIError('working rev is not working?') dir_oldpatched = {'apiurl': apiurl, 'project': prj, 'package': package} dir_oldpatched['srcmd5'] = root_oldpatched.get('srcmd5') dir_oldpatched['entries'] = [[n.get('name'), n.get('md5')] for n in root_oldpatched.findall('entry')] query = {} query['rev'] = linkinfo_oldpatched.get('srcmd5') u = makeurl(apiurl, ['source', linkinfo_oldpatched.get('project'), linkinfo_oldpatched.get('package')], query=query) f = http_GET(u) root_old = xml_parse(f).getroot() dir_old = {'apiurl': apiurl} dir_old['project'] = linkinfo_oldpatched.get('project') dir_old['package'] = linkinfo_oldpatched.get('package') dir_old['srcmd5'] = root_old.get('srcmd5') dir_old['entries'] = [[n.get('name'), n.get('md5')] for n in root_old.findall('entry')] entries_old = dict(dir_old['entries']) entries_oldpatched = dict(dir_oldpatched['entries']) entries_new = dict(dir_new['entries']) entries = {} entries.update(entries_old) entries.update(entries_oldpatched) entries.update(entries_new) destdir = opts.destdir if os.path.isdir(destdir): shutil.rmtree(destdir) os.mkdir(destdir) Package.init_package(apiurl, target_prj, target_package, destdir) store_write_string(destdir, '_files', b''.join(meta) + b'\n') store_write_string(destdir, '_linkrepair', '') pac = Package(destdir) for name in sorted(entries.keys()): md5_old = entries_old.get(name, '') md5_new = entries_new.get(name, '') md5_oldpatched = entries_oldpatched.get(name, '') if md5_new != '': self.download(name, md5_new, dir_new, pac.store.sources_get_path(name)) if md5_old == md5_new: if md5_oldpatched == '': pac.put_on_deletelist(name) continue print(statfrmt(' ', name)) self.download(name, md5_oldpatched, dir_oldpatched, os.path.join(destdir, name)) continue if md5_old == md5_oldpatched: if md5_new == '': continue print(statfrmt('U', name)) shutil.copy2(pac.store.sources_get_path(name), os.path.join(destdir, name)) continue if md5_new == md5_oldpatched: if md5_new == '': continue print(statfrmt('G', name)) shutil.copy2(pac.store.sources_get_path(name), os.path.join(destdir, name)) continue self.download(name, md5_oldpatched, dir_oldpatched, os.path.join(destdir, name + '.mine')) if md5_new != '': shutil.copy2(pac.store.sources_get_path(name), os.path.join(destdir, name + '.new')) else: self.download(name, md5_new, dir_new, os.path.join(destdir, name + '.new')) self.download(name, md5_old, dir_old, os.path.join(destdir, name + '.old')) if binary_file(os.path.join(destdir, name + '.mine')) or \ binary_file(os.path.join(destdir, name + '.old')) or \ 
binary_file(os.path.join(destdir, name + '.new')): shutil.copy2(os.path.join(destdir, name + '.new'), os.path.join(destdir, name)) print(statfrmt('C', name)) pac.put_on_conflictlist(name) continue o = open(os.path.join(destdir, name), 'wb') code = run_external( 'diff3', '-m', '-E', '-L', '.mine', os.path.join(destdir, name + '.mine'), '-L', '.old', os.path.join(destdir, name + '.old'), '-L', '.new', os.path.join(destdir, name + '.new'), stdout=o ) if code == 0: print(statfrmt('G', name)) os.unlink(os.path.join(destdir, name + '.mine')) os.unlink(os.path.join(destdir, name + '.old')) os.unlink(os.path.join(destdir, name + '.new')) elif code == 1: print(statfrmt('C', name)) pac.put_on_conflictlist(name) else: print(statfrmt('?', name)) pac.put_on_conflictlist(name) pac.write_deletelist() pac.write_conflictlist() print() print(f'Please change into the \'{destdir}\' directory,') print('fix the conflicts (files marked with \'C\' above),') print('run \'osc resolved ...\', and commit the changes.') def do_pull(self, subcmd, opts, *args): """ Merge the changes of the link target into your working copy """ from .core import ET from .core import Package from .core import binary_file from .core import http_GET from .core import makeurl from .core import run_external from .core import statfrmt from .core import store_write_string p = Package('.') # check if everything is committed for filename in p.filenamelist: state = p.status(filename) if state != ' ' and state != 'S': raise oscerr.WrongArgs('Please commit your local changes first!') # check if we need to update upstream_rev = p.latest_rev() if not (p.isfrozen() or p.ispulled()): raise oscerr.WrongArgs('osc pull makes only sense with a detached head, did you mean osc up?') if p.rev != upstream_rev: raise oscerr.WorkingCopyOutdated((p.absdir, p.rev, upstream_rev)) elif not p.islink(): raise oscerr.WrongArgs('osc pull only works on linked packages.') elif not p.isexpanded(): raise oscerr.WrongArgs('osc pull only works on expanded links.') linkinfo = p.linkinfo baserev = linkinfo.baserev if baserev is None: raise oscerr.WrongArgs('osc pull only works on links containing a base revision.') # get revisions we need query = {'expand': 1, 'emptylink': 1} u = makeurl(p.apiurl, ['source', p.prjname, p.name], query=query) f = http_GET(u) meta = f.readlines() root_new = xml_fromstring(b''.join(meta)) linkinfo_new = root_new.find('linkinfo') if linkinfo_new is None: raise oscerr.APIError('link is not a really a link?') if linkinfo_new.get('error') is not None: raise oscerr.APIError('link target is broken') if linkinfo_new.get('srcmd5') == baserev: print("Already up-to-date.") p.unmark_frozen() return dir_new = {'apiurl': p.apiurl, 'project': p.prjname, 'package': p.name} dir_new['srcmd5'] = root_new.get('srcmd5') dir_new['entries'] = [[n.get('name'), n.get('md5')] for n in root_new.findall('entry')] dir_oldpatched = {'apiurl': p.apiurl, 'project': p.prjname, 'package': p.name, 'srcmd5': p.srcmd5} dir_oldpatched['entries'] = [[f.name, f.md5] for f in p.filelist] query = {'rev': linkinfo.srcmd5} u = makeurl(p.apiurl, ['source', linkinfo.project, linkinfo.package], query=query) f = http_GET(u) root_old = xml_parse(f).getroot() dir_old = {'apiurl': p.apiurl, 'project': linkinfo.project, 'package': linkinfo.package, 'srcmd5': linkinfo.srcmd5} dir_old['entries'] = [[n.get('name'), n.get('md5')] for n in root_old.findall('entry')] # now do 3-way merge entries_old = dict(dir_old['entries']) entries_oldpatched = dict(dir_oldpatched['entries']) entries_new = 
dict(dir_new['entries']) entries = {} entries.update(entries_old) entries.update(entries_oldpatched) entries.update(entries_new) for name in sorted(entries.keys()): if name.startswith('_service:') or name.startswith('_service_'): continue md5_old = entries_old.get(name, '') md5_new = entries_new.get(name, '') md5_oldpatched = entries_oldpatched.get(name, '') if md5_old == md5_new or md5_oldpatched == md5_new: continue if md5_old == md5_oldpatched: if md5_new == '': print(statfrmt('D', name)) p.put_on_deletelist(name) os.unlink(name) elif md5_old == '': print(statfrmt('A', name)) self.download(name, md5_new, dir_new, name) p.put_on_addlist(name) else: print(statfrmt('U', name)) self.download(name, md5_new, dir_new, name) continue # need diff3 to resolve issue if md5_oldpatched == '': open(name, 'w').write('') os.rename(name, name + '.mine') self.download(name, md5_new, dir_new, name + '.new') self.download(name, md5_old, dir_old, name + '.old') if binary_file(name + '.mine') or binary_file(name + '.old') or binary_file(name + '.new'): shutil.copy2(name + '.new', name) print(statfrmt('C', name)) p.put_on_conflictlist(name) continue o = open(name, 'wb') code = run_external( 'diff3', '-m', '-E', '-L', '.mine', name + '.mine', '-L', '.old', name + '.old', '-L', '.new', name + '.new', stdout=o ) if code == 0: print(statfrmt('G', name)) os.unlink(name + '.mine') os.unlink(name + '.old') os.unlink(name + '.new') elif code == 1: print(statfrmt('C', name)) p.put_on_conflictlist(name) else: print(statfrmt('?', name)) p.put_on_conflictlist(name) p.write_deletelist() p.write_addlist() p.write_conflictlist() # store new linkrev store_write_string(p.absdir, '_pulled', linkinfo_new.get('srcmd5') + '\n') p.unmark_frozen() print() if p.in_conflict: print('Please fix the conflicts (files marked with \'C\' above),') print('run \'osc resolved ...\', and commit the changes') print('to update the link information.') else: print('Please commit the changes to update the link information.') @cmdln.option('--create', action='store_true', default=False, help='create new gpg signing key for this project') @cmdln.option('--extend', action='store_true', default=False, help='extend expiration date of the gpg public key for this project') @cmdln.option('--delete', action='store_true', default=False, help='delete the gpg signing key in this project') @cmdln.option('--notraverse', action='store_true', default=False, help='don\'t traverse projects upwards to find key') @cmdln.option('--sslcert', action='store_true', default=False, help='fetch SSL certificate instead of GPG key') def do_signkey(self, subcmd, opts, *args): """ Manage Project Signing Key osc signkey [--create|--delete|--extend] <PROJECT> osc signkey [--notraverse] <PROJECT> This command is for managing gpg keys. It shows the public key by default. There is no way to download or upload the private part of a key by design. However you can create a new own key. You may want to consider to sign the public key with your own existing key. If a project has no key, the key from upper level project will be used (e.g. when dropping "KDE:KDE4:Community" key, the one from "KDE:KDE4" will be used). 
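        Examples (illustrative, PROJECT is a placeholder):
            osc signkey PROJECT              # show the public GPG key
            osc signkey --sslcert PROJECT    # show the SSL certificate instead
            osc signkey --extend PROJECT     # extend the expiration date of the key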
WARNING: THE OLD KEY CANNOT BE RESTORED AFTER USING DELETE OR CREATE """ from .core import http_DELETE from .core import http_POST from .core import is_package_dir from .core import is_project_dir from .core import makeurl from .core import store_read_project apiurl = self.get_api_url() f = None prj = None if len(args) == 0: cwd = Path.cwd() if is_project_dir(cwd) or is_package_dir(cwd): prj = store_read_project(cwd) if len(args) == 1: prj = self._process_project_name(args[0]) if not prj: raise oscerr.WrongArgs('Please specify just the project') if opts.create: url = makeurl(apiurl, ['source', prj], query={"cmd": "createkey"}) f = http_POST(url) elif opts.extend: url = makeurl(apiurl, ['source', prj], query={"cmd": "extendkey"}) f = http_POST(url) elif opts.delete: url = makeurl(apiurl, ['source', prj, "_pubkey"]) f = http_DELETE(url) else: from . import obs_api try: # use current api, supporting fallback to higher project and server side scripts keyinfo = obs_api.Keyinfo.from_api(apiurl, prj) if opts.sslcert: for sslcert in keyinfo.sslcert_list or []: print(sslcert.to_human_readable_string()) print() else: for pubkey in keyinfo.pubkey_list or []: print(pubkey.to_human_readable_string()) print() return except HTTPError as e: if e.code != 404: raise # the _keyinfo endpoint doesn't exist, use the old _pubkey/_sslcert instead if opts.sslcert: result = obs_api.Keyinfo.get_sslcert_deprecated(apiurl, prj, traverse=not(opts.notraverse)) else: result = obs_api.Keyinfo.get_pubkey_deprecated(apiurl, prj, traverse=not(opts.notraverse)) if result: _, key = result print(key) return while True: data = f.read(16384) if not data: break sys.stdout.buffer.write(data) @cmdln.option('-m', '--message', help='add MESSAGE to changes (do not open an editor)') @cmdln.option('-F', '--file', metavar='FILE', help='read changes message from FILE (do not open an editor)') @cmdln.option('-e', '--just-edit', action='store_true', default=False, help='just open changes (cannot be used with -m)') def do_vc(self, subcmd, opts, *args): """ Edit the changes file osc vc [-m MESSAGE|-e] [filename[.changes]|path [file_with_comment]] If no <filename> is given, exactly one *.changes or *.spec file has to be in the cwd or in path. The email address used in .changes file is read from BuildService instance, or should be defined in oscrc [https://api.opensuse.org/] user = login pass = password email = user@defined.email or can be specified via mailaddr environment variable. By default, osc vc opens the program specified by the EDITOR environment variable (and it uses Vim if that variable is not set) with a temporary file that should replace the *.changes file when saved by the editor, or discarded otherwise. """ from . import conf from . 
import store as osc_store from .core import is_package_dir from .core import vc_export_env from .core import which if opts.message and opts.file: raise oscerr.WrongOptions('\'--message\' and \'--file\' are mutually exclusive') elif opts.message and opts.just_edit: raise oscerr.WrongOptions('\'--message\' and \'--just-edit\' are mutually exclusive') elif opts.file and opts.just_edit: raise oscerr.WrongOptions('\'--file\' and \'--just-edit\' are mutually exclusive') meego_style = False if not args: try: fn_changelog = glob.glob('*.changes')[0] fp = open(fn_changelog) titleline = fp.readline() fp.close() if re.match(r'^\*\W+(.+\W+\d{1,2}\W+20\d{2})\W+(.+)\W+<(.+)>\W+(.+)$', titleline): meego_style = True except IndexError: pass cmd_list = [conf.config['vc-cmd']] if meego_style: if not os.path.exists('/usr/bin/vc'): print('Error: you need meego-packaging-tools for /usr/bin/vc command', file=sys.stderr) return 1 cmd_list = ['/usr/bin/vc'] elif which(cmd_list[0]) is None: print(f'Error: vc (\'{cmd_list[0]}\') command not found', file=sys.stderr) print('Install the build package from http://download.opensuse.org/repositories/openSUSE:/Tools/', file=sys.stderr) return 1 if args and is_package_dir(args[0]): apiurl = osc_store.Store(args[0]).apiurl else: apiurl = self.get_api_url() if meego_style: if opts.message or opts.just_edit: print('Warning: to edit MeeGo style changelog, opts will be ignored.', file=sys.stderr) else: if opts.message: cmd_list.append("-m") cmd_list.append(opts.message) if opts.file: if len(args) > 1: raise oscerr.WrongOptions('--file and file_with_comment are mutually exclusive') elif not os.path.isfile(opts.file): raise oscerr.WrongOptions(f'\'{opts.file}\': is no file') args = list(args) if not args: args.append('') args.append(opts.file) if opts.just_edit: cmd_list.append("-e") cmd_list.extend(args) vc_export_env(apiurl) vc = subprocess.Popen(cmd_list) vc.wait() sys.exit(vc.returncode) @cmdln.option('-f', '--force', action='store_true', help='forces removal of entire package and its files') @cmdln.option('source') @cmdln.option('dest') def do_mv(self, subcmd, opts): """ Move SOURCE file to DEST and keep it under version control """ from .core import Package source = opts.source dest = opts.dest if not os.path.isfile(source): raise oscerr.WrongArgs(f"Source file '{source}' does not exist or is not a file") if not opts.force and os.path.isfile(dest): raise oscerr.WrongArgs(f"Dest file '{dest}' already exists") if os.path.isdir(dest): dest = os.path.join(dest, os.path.basename(source)) src_pkg = Package(source) tgt_pkg = Package(dest) if not src_pkg: raise oscerr.NoWorkingCopy(f"Error: \"{os.path.abspath(source)}\" is not located in an osc working copy.") if not tgt_pkg: raise oscerr.NoWorkingCopy(f"Error: \"{os.path.abspath(dest)}\" does not point to an osc working copy.") os.rename(source, dest) try: tgt_pkg.addfile(os.path.basename(dest)) except oscerr.PackageFileConflict: # file is already tracked pass # instantiate src_pkg *again* to load fresh state from .osc that was written on deleting a file in tgt_pkg # it would be way better to use a single Package instance where possible src_pkg = Package(source) src_pkg.delete_file(os.path.basename(source), force=opts.force) @cmdln.option('-d', '--delete', action='store_true', help='delete option from config or reset option to the default)') @cmdln.option('-s', '--stdin', action='store_true', help='indicates that the config value should be read from stdin') @cmdln.option('-p', '--prompt', action='store_true', help='prompt for a 
value') @cmdln.option('--change-password', action='store_true', help='Change password') @cmdln.option('--select-password-store', action='store_true', help='Change the password store') @cmdln.option('--no-echo', action='store_true', help='prompt for a value but do not echo entered characters') @cmdln.option('--dump', action='store_true', help='dump the complete configuration (without \'pass\' and \'passx\' options)') @cmdln.option('--dump-full', action='store_true', help='dump the complete configuration (including \'pass\' and \'passx\' options)') def do_config(self, subcmd, opts, *args): """ Get/set a config option Examples: osc config section option (get current value) osc config section option value (set to value) osc config section option --delete (delete option/reset to the default) osc config section --change-password (changes the password in section "section") (section is either an apiurl or an alias or 'general') osc config --dump (dump the complete configuration) """ from . import conf from .core import raw_input prompt_value = 'Value: ' if opts.change_password: opts.no_echo = True opts.prompt = True opts.select_password_store = True prompt_value = 'Password : ' if len(args) != 1: raise oscerr.WrongArgs('--change-password only needs the apiurl') args = [args[0], 'pass'] elif opts.select_password_store: if len(args) != 1: raise oscerr.WrongArgs('--select-password-store only needs the apiurl') args = [args[0], 'pass'] if len(args) < 2 and not (opts.dump or opts.dump_full): raise oscerr.WrongArgs('Too few arguments') elif opts.dump or opts.dump_full: cp = conf.get_configParser(conf.config['conffile']) for sect in cp.sections(): print(f'[{sect}]') for opt in sorted(cp.options(sect)): if sect == 'general' and opt in conf.api_host_options or \ sect != 'general' and opt not in conf.api_host_options: continue if opt in ('pass', 'passx') and not opts.dump_full: continue val = str(cp.get(sect, opt, raw=True)) # special handling for continuation lines val = '\n '.join(val.split('\n')) print(f'{opt} = {val}') print() return section, opt, val = args[0], args[1], args[2:] if val and (opts.delete or opts.stdin or opts.prompt or opts.no_echo): raise oscerr.WrongOptions('Sorry, \'--delete\' or \'--stdin\' or \'--prompt\' or \'--no-echo\' ' 'and the specification of a value argument are mutually exclusive') elif (opts.prompt or opts.no_echo) and opts.stdin: raise oscerr.WrongOptions('Sorry, \'--prompt\' or \'--no-echo\' and \'--stdin\' are mutually exclusive') elif opts.stdin: # strip lines val = [i.strip() for i in sys.stdin.readlines() if i.strip()] if not val: raise oscerr.WrongArgs('error: read empty value from stdin') elif opts.no_echo or opts.prompt: if opts.no_echo: inp = getpass.getpass(prompt_value).strip() else: inp = raw_input(prompt_value).strip() if not inp: raise oscerr.WrongArgs('error: no value was entered') val = [inp] creds_mgr_descr = None if opt == 'pass' and opts.select_password_store: creds_mgr_descr = conf.select_credentials_manager_descr() orig_opt = opt opt, newval = conf.config_set_option(section, opt, ' '.join(val), delete=opts.delete, update=True, creds_mgr_descr=creds_mgr_descr) if newval is None and opts.delete: print(f'\'{section}\': \'{opt}\' got removed') elif newval is None: print(f'\'{section}\': \'{opt}\' is not set') else: if orig_opt == 'pass': print('Password has been changed.') elif opts.no_echo: # supress value print(f'\'{section}\': set \'{opt}\'') else: print(f'\'{section}\': \'{opt}\' is set to \'{newval}\'') @cmdln.option('file', nargs='+') def 
do_revert(self, subcmd, opts): """ Restore changed files or the entire working copy Examples: osc revert <modified file(s)> osc revert . Note: this only works for package working copies """ from .core import Package files = opts.file pacs = Package.from_paths(files) for p in pacs: if not p.todo: p.todo = p.filenamelist + p.to_be_added for f in p.todo: p.revert(f) @cmdln.option('--force-apiurl', action='store_true', help='ask once for an apiurl and force this apiurl for all inconsistent projects/packages') def do_repairwc(self, subcmd, opts, *args): """ Try to repair an inconsistent working copy Examples: osc repairwc <path> Note: if <path> is omitted it defaults to '.' (<path> can be a project or package working copy) Warning: This command might delete some files in the storedir (.osc). Please check the state of the wc afterwards (via 'osc status'). """ from . import conf from .core import Package from .core import Project from .core import is_package_dir from .core import is_project_dir from .core import parseargs from .core import raw_input def get_apiurl(apiurls): print('No apiurl is defined for this working copy.\n' 'Please choose one from the following list (enter the number):') for i in range(len(apiurls)): print(' %d) %s' % (i, apiurls[i])) num = raw_input('> ') try: num = int(num) except ValueError: raise oscerr.WrongArgs(f'\'{num}\' is not a number. Aborting') if num < 0 or num >= len(apiurls): raise oscerr.WrongArgs(f'number \'{num}\' out of range. Aborting') return apiurls[num] args = parseargs(args) pacs = [] apiurls = list(conf.config['api_host_options'].keys()) apiurl = '' for i in args: if is_project_dir(i): try: prj = Project(i, getPackageList=False) except (oscerr.WorkingCopyInconsistent, oscerr.NoWorkingCopy) as e: dirty_files = getattr(e, "dirty_files", []) if '_apiurl' in dirty_files and (not apiurl or not opts.force_apiurl): apiurl = get_apiurl(apiurls) prj = Project(i, getPackageList=False, wc_check=False) prj.wc_repair(apiurl) for p in prj.pacs_have: if p in prj.pacs_broken: continue try: Package(os.path.join(i, p)) except oscerr.WorkingCopyInconsistent: pacs.append(os.path.join(i, p)) elif is_package_dir(i): pacs.append(i) else: print('\'%s\' is neither a project working copy ' 'nor a package working copy' % i, file=sys.stderr) for pdir in pacs: try: p = Package(pdir) except (oscerr.WorkingCopyInconsistent, oscerr.NoWorkingCopy) as e: dirty_files = getattr(e, "dirty_files", []) if '_apiurl' in dirty_files and (not apiurl or not opts.force_apiurl): apiurl = get_apiurl(apiurls) p = Package(pdir, wc_check=False) repaired = p.wc_repair(apiurl) if repaired: print(f'done. Please check the state of the wc (via \'osc status {i}\').') else: print(f'osc: working copy \'{i}\' is not inconsistent', file=sys.stderr) @cmdln.option('-n', '--dry-run', action='store_true', help='print the results without actually removing a file') def do_clean(self, subcmd, opts, *args): """ Removes all untracked files from the package working copy Examples: osc clean <path> Note: if <path> is omitted it defaults to '.' (<path> has to be a package working copy) Warning: This command removes all files with status '?'. 
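        Tip (illustrative): pass '-n'/'--dry-run' first to only print what would be removed:
            osc clean --dry-run .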
""" from .core import Package from .core import getTransActPath from .core import is_package_dir from .core import parseargs pacs = parseargs(args) # do a sanity check first for pac in pacs: if not is_package_dir(pac): raise oscerr.WrongArgs(f'\'{pac}\' is no package working copy') for pdir in pacs: p = Package(pdir) pdir = getTransActPath(pdir) todo = [fname for st, fname in p.get_status() if st == '?'] for fname in p.excluded: # there might be some rare cases, where an excluded file has # not state '?' if os.path.isfile(fname) and p.status(fname) == '?': todo.append(fname) for filename in todo: print(f'Removing: {os.path.join(pdir, filename)}') if not opts.dry_run: os.unlink(os.path.join(p.absdir, filename)) @cmdln.option('-c', '--comment', help='comment text', metavar='COMMENT') @cmdln.option('-p', '--parent', help='reply to comment with parent id', metavar='PARENT') def do_comment(self, subcmd, opts, *args): """ List / create / delete comments On create: If -p is given a reply to the ID is created. Otherwise a toplevel comment is created. If -c is not given the default editor will be opened and you can type your comment usage: osc comment list package PROJECT PACKAGE osc comment list project PROJECT osc comment list request REQUEST_ID osc comment create [-p PARENT_ID] [-c COMMENT] package PROJECT PACKAGE osc comment create [-p PARENT_ID] [-c COMMENT] project PROJECT osc comment create [-p PARENT_ID] [-c COMMENT] request REQUEST_ID osc comment delete ID """ from .core import create_comment from .core import delete_comment from .core import edit_text from .core import print_comments from .core import slash_split comment = None args = slash_split(args) apiurl = self.get_api_url() if len(args) < 2: self.argparse_error("Incorrect number of arguments.") cmds = ['list', 'create', 'delete'] if args[0] not in cmds: raise oscerr.WrongArgs('Unknown comment action %s. Choose one of %s.' % (args[0], ', '.join(cmds))) comment_targets = ['package', 'project', 'request'] if args[0] != 'delete' and args[1] not in comment_targets: raise oscerr.WrongArgs('Unknown comment target %s. Choose one of %s.' % (args[1], ', '.join(comment_targets))) if args[1] == 'package' and len(args) != 4: raise oscerr.WrongArgs('Please use PROJECT PACKAGE') elif args[1] == 'project' and len(args) != 3: raise oscerr.WrongArgs('Please use PROJECT') elif args[1] == 'request' and len(args) != 3: raise oscerr.WrongArgs('Please use REQUEST') elif args[0] == 'delete' and len(args) != 2: raise oscerr.WrongArgs('Please use COMMENT_ID') if not opts.comment and args[0] == 'create': comment = edit_text() else: comment = opts.comment if args[0] == 'list': if args[1] == 'package' or args[1] == 'project': args[2] = self._process_project_name(args[2]) print_comments(apiurl, args[1], *args[2:]) elif args[0] == 'create': if args[1] == 'package' or args[1] == 'project': args[2] = self._process_project_name(args[2]) result = create_comment(apiurl, args[1], comment, *args[2:], parent=opts.parent) print(result) elif args[0] == 'delete': result = delete_comment(apiurl, args[1]) print(result) def _load_plugins(self): from . 
import output if IN_VENV: output.print_msg("Running in virtual environment, skipping loading legacy plugins.", print_to="debug") return plugin_dirs = [ '/usr/lib/osc-plugins', '/usr/local/lib/osc-plugins', os.path.expanduser('~/.local/lib/osc-plugins'), os.path.expanduser('~/.osc-plugins')] for plugin_dir in plugin_dirs: if not os.path.isdir(plugin_dir): continue sys.path.append(plugin_dir) for extfile in os.listdir(plugin_dir): if not extfile.endswith('.py'): continue try: modname = "osc.plugins." + os.path.splitext(extfile)[0] spec = importlib.util.spec_from_file_location(modname, os.path.join(plugin_dir, extfile)) mod = importlib.util.module_from_spec(spec) sys.modules[modname] = mod spec.loader.exec_module(mod) # restore the old exec semantic mod.__dict__.update(globals()) for name in dir(mod): data = getattr(mod, name) # Add all functions (which are defined in the imported module) # to the class (filtering only methods which start with "do_" # breaks the old behavior). # Also add imported modules (needed for backward compatibility). # New plugins should not use "self.<imported modname>.<something>" # to refer to the imported module. Instead use # "<imported modname>.<something>". if (inspect.isfunction(data) and inspect.getmodule(data) == mod or inspect.ismodule(data)): setattr(self.__class__, name, data) except (SyntaxError, NameError, ImportError) as e: if os.environ.get('OSC_PLUGIN_FAIL_IGNORE'): print(f"{os.path.join(plugin_dir, extfile)}: {e}\n", file=sys.stderr) else: traceback.print_exc(file=sys.stderr) print(f'\n{os.path.join(plugin_dir, extfile)}: {e}', file=sys.stderr) print("\n Try 'env OSC_PLUGIN_FAIL_IGNORE=1 osc ...'", file=sys.stderr) sys.exit(1) # fini! ############################################################################### # vim: sw=4 et �������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commandline_common.py����������������������������������������������������������������0000664�0000000�0000000�00000023367�14753375025�0017402�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import argparse import copy import importlib import inspect import os import pkgutil import sys import textwrap from typing import List from . import cmdln # python3.6 requires reading sys.real_prefix to detect virtualenv IN_VENV = getattr(sys, "real_prefix", sys.base_prefix) != sys.prefix class OscArgumentParser(argparse.ArgumentParser): def _get_formatter(self): # cache formatter to speed things a little bit up if not hasattr(self, "_formatter"): self._formatter = self.formatter_class(prog=self.prog) return self._formatter def add_argument(self, *args, **kwargs): # remember added arguments so we can add them to subcommands easily if not hasattr(self, "_added_arguments"): self._added_arguments = [] self._added_arguments.append((args, kwargs)) super().add_argument(*args, **kwargs) class Command: #: Name of the command as used in the argument parser. name: str = None #: Optional aliases to the command. aliases: List[str] = [] #: Whether the command is hidden from help. #: Defaults to ``False``. hidden: bool = False #: Name of the parent command class. 
#: Can be prefixed if the parent comes from a different location, #: for example ``osc.commands.<ClassName>`` when extending osc command with a plugin. #: See ``OscMainCommand.MODULES`` for available prefixes. parent: str = None def __init__(self, full_name, parent=None): self.full_name = full_name self.parent = parent self.subparsers = None if not self.name: raise ValueError(f"Command '{self.full_name}' has no 'name' set") if parent: self.parser = self.parent.subparsers.add_parser( self.name, aliases=self.aliases, help=self.get_help(), description=self.get_description(), formatter_class=cmdln.HelpFormatter, conflict_handler="resolve", prog=f"{self.main_command.name} [global opts] {self.name}", ) self.parser.set_defaults(_selected_command=self) else: self.parser = OscArgumentParser( description=self.get_description(), formatter_class=cmdln.HelpFormatter, usage="%(prog)s [global opts] <command> [--help] [opts] [args]", ) if self.parent: for arg_args, arg_kwargs in self.parent.parser._added_arguments: if not arg_args: continue if not arg_args[0].startswith("-"): continue if "--help" in arg_args: continue arg_kwargs = arg_kwargs.copy() arg_kwargs["help"] = argparse.SUPPRESS arg_kwargs["default"] = argparse.SUPPRESS self.parser.add_argument(*arg_args, **arg_kwargs) self.init_arguments() def __repr__(self): return f"<osc plugin {self.full_name} at {self.__hash__():#x}>" def get_help(self): """ Return the help text of the command. The first line of the docstring is returned by default. """ if self.hidden: return argparse.SUPPRESS if not self.__doc__: return "" help_lines = self.__doc__.strip().splitlines() if not help_lines: return "" return help_lines[0] def get_description(self): """ Return the description of the command. The docstring without the first line is returned by default. """ if not self.__doc__: return "" help_lines = self.__doc__.strip().splitlines() if not help_lines: return "" # skip the first line that contains help text help_lines.pop(0) # remove any leading empty lines while help_lines and not help_lines[0]: help_lines.pop(0) result = "\n".join(help_lines) result = textwrap.dedent(result) return result @property def main_command(self): """ Return reference to the main command that represents the executable and contains the main instance of ArgumentParser. """ if not self.parent: return self return self.parent.main_command def add_argument(self, *args, **kwargs): """ Add a new argument to the command's argument parser. See `argparse <https://docs.python.org/3/library/argparse.html>`_ documentation for allowed parameters. """ self.parser.add_argument(*args, **kwargs) def init_arguments(self): """ Override to add arguments to the argument parser. .. note:: Make sure you're adding arguments only by calling ``self.add_argument()``. Using ``self.parser.add_argument()`` directly is not recommended because it disables argument intermixing. """ def run(self, args): """ Override to implement the command functionality. .. note:: ``args.positional_args`` is a list containing any unknown (unparsed) positional arguments. .. note:: Consider moving any reusable code into a library, leaving the command-line code only a thin wrapper on top of it. If the code is generic enough, it should be added to osc directly. In such case don't hesitate to open an `issue <https://github.com/openSUSE/osc/issues>`_. 
""" raise NotImplementedError() def register(self, command_class, command_full_name): if not self.subparsers: # instantiate subparsers on first use self.subparsers = self.parser.add_subparsers(dest="command", title="commands") # Check for parser conflicts. # This is how Python 3.11+ behaves by default. if command_class.name in self.subparsers._name_parser_map: raise argparse.ArgumentError(self.subparsers, f"conflicting subparser: {command_class.name}") for alias in command_class.aliases: if alias in self.subparsers._name_parser_map: raise argparse.ArgumentError(self.subparsers, f"conflicting subparser alias: {alias}") command = command_class(command_full_name, parent=self) return command class MainCommand(Command): MODULES = () def __init__(self): super().__init__(self.__class__.__name__) self.command_classes = {} self.download_progress = None def post_parse_args(self, args): pass def run(self, args): cmd = getattr(args, "_selected_command", None) if not cmd: self.parser.error("Please specify a command") self.post_parse_args(args) return cmd.run(args) def load_command(self, cls, module_prefix): mod_cls_name = f"{module_prefix}.{cls.__name__}" parent_name = getattr(cls, "parent", None) if parent_name: # allow relative references to classes in the the same module/directory if "." not in parent_name: parent_name = f"{module_prefix}.{parent_name}" try: parent = self.main_command.command_classes[parent_name] except KeyError: msg = f"Failed to load command class '{mod_cls_name}' because it references parent '{parent_name}' that doesn't exist" print(msg, file=sys.stderr) return None cmd = parent.register(cls, mod_cls_name) else: cmd = self.main_command.register(cls, mod_cls_name) cmd.full_name = mod_cls_name self.main_command.command_classes[mod_cls_name] = cmd return cmd def load_commands(self): if IN_VENV: from . 
import output # pylint: disable=import-outside-toplevel output.print_msg("Running in virtual environment, skipping loading plugins installed outside the virtual environment.", print_to="debug") for module_prefix, module_path in self.MODULES: module_path = os.path.expanduser(module_path) # some plugins have their modules installed next to them instead of site-packages if module_path not in sys.path: sys.path.append(module_path) for loader, module_name, _ in pkgutil.iter_modules(path=[module_path]): full_name = f"{module_prefix}.{module_name}" spec = loader.find_spec(full_name) mod = importlib.util.module_from_spec(spec) try: spec.loader.exec_module(mod) except Exception as e: # pylint: disable=broad-except msg = f"Failed to load commands from module '{full_name}': {e}" print(msg, file=sys.stderr) continue for name in dir(mod): if name.startswith("_"): continue cls = getattr(mod, name) if not inspect.isclass(cls): continue if not issubclass(cls, Command): continue if cls.__module__ != full_name: # skip classes that weren't defined directly in the loaded plugin module continue self.load_command(cls, module_prefix) def parse_args(self, *args, **kwargs): namespace, unknown_args = self.parser.parse_known_args(*args, **kwargs) unrecognized = [i for i in unknown_args if i.startswith("-")] if unrecognized: self.parser.error(f"unrecognized arguments: " + " ".join(unrecognized)) namespace.positional_args = list(unknown_args) return namespace �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commandline_git.py�������������������������������������������������������������������0000664�0000000�0000000�00000013064�14753375025�0016666�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import argparse import os import subprocess import sys import osc.commandline_common import osc.commands_git from . import oscerr from .output import print_msg class OwnerRepoAction(argparse.Action): def __call__(self, parser, namespace, value, option_string=None): from . import gitea_api try: if isinstance(value, list): namespace_value = [gitea_api.Repo.split_id(i) for i in value] else: namespace_value = gitea_api.Repo.split_id(value) except ValueError as e: raise argparse.ArgumentError(self, str(e)) setattr(namespace, self.dest, namespace_value) class OwnerRepoPullAction(argparse.Action): def __call__(self, parser, namespace, value, option_string=None): from . 
import gitea_api try: if isinstance(value, list): namespace_value = [gitea_api.PullRequest.split_id(i) for i in value] else: namespace_value = gitea_api.PullRequest.split_id(value) except ValueError as e: raise argparse.ArgumentError(self, str(e)) setattr(namespace, self.dest, namespace_value) class BooleanAction(argparse.Action): def __call__(self, parser, namespace, value, option_string=None): if value is None: setattr(namespace, self.dest, None) elif value.lower() in ["0", "no", "false", "off"]: setattr(namespace, self.dest, False) elif value.lower() in ["1", "yes", "true", "on"]: setattr(namespace, self.dest, True) else: raise argparse.ArgumentError(self, f"Invalid boolean value: {value}") class GitObsCommand(osc.commandline_common.Command): @property def gitea_conf(self): return self.main_command.gitea_conf @property def gitea_login(self): return self.main_command.gitea_login @property def gitea_conn(self): return self.main_command.gitea_conn def print_gitea_settings(self): print(f"Using the following Gitea settings:", file=sys.stderr) print(f" * Config path: {self.gitea_conf.path}", file=sys.stderr) print(f" * Login (name of the entry in the config file): {self.gitea_login.name}", file=sys.stderr) print(f" * URL: {self.gitea_login.url}", file=sys.stderr) print(f" * User: {self.gitea_login.user}", file=sys.stderr) print("", file=sys.stderr) def add_argument_owner_repo(self, **kwargs): self.add_argument( "owner_repo", action=OwnerRepoAction, help="Owner and repo: (format: <owner>/<repo>)", **kwargs, ) def add_argument_owner_repo_pull(self, **kwargs): self.add_argument( "owner_repo_pull", action=OwnerRepoPullAction, help="Owner, repo and pull request number (format: <owner>/<repo>#<pull-request-number>)", **kwargs, ) def add_argument_new_repo_name(self): self.add_argument( "--new-repo-name", help="Name of the newly forked repo", ) class GitObsMainCommand(osc.commandline_common.MainCommand): name = "git-obs" MODULES = ( ("osc.commands_git", osc.commands_git.__path__[0]), ) def __init__(self, *args, **kwargs): super().__init__(*args, **kwargs) self._args = None self._gitea_conf = None self._gitea_login = None self._gitea_conn = None def init_arguments(self): self.add_argument( "--gitea-config", help="Path to gitea config. Default: $GIT_OBS_CONFIG or ~/.config/tea/config.yml.", ) self.add_argument( "-G", "--gitea-login", help="Name of the login entry in the config file. Default: $GIT_OBS_LOGIN or the default entry from the config file.", ) def post_parse_args(self, args): if not args.gitea_config: value = os.getenv("GIT_OBS_CONFIG", "").strip() if value: args.gitea_config = value if not args.gitea_login: value = os.getenv("GIT_OBS_LOGIN", "").strip() if value: args.gitea_login = value self._args = args @classmethod def main(cls, argv=None, run=True): """ Initialize OscMainCommand, load all commands and run the selected command. """ cmd = cls() cmd.load_commands() if run: args = cmd.parse_args(args=argv) exit_code = cmd.run(args) sys.exit(exit_code) else: args = None return cmd, args @property def gitea_conf(self): from . import gitea_api if self._gitea_conf is None: self._gitea_conf = gitea_api.Config(self._args.gitea_config) return self._gitea_conf @property def gitea_login(self): if self._gitea_login is None: self._gitea_login = self.gitea_conf.get_login(name=self._args.gitea_login) return self._gitea_login @property def gitea_conn(self): from . 
import gitea_api

        if self._gitea_conn is None:
            self._gitea_conn = gitea_api.Connection(self.gitea_login)
            assert self._gitea_login is not None
        return self._gitea_conn


def main():
    try:
        GitObsMainCommand.main()
    except oscerr.OscBaseError as e:
        print_msg(str(e), print_to="error")
        sys.exit(1)
    except subprocess.CalledProcessError as e:
        print_msg(str(e), print_to="error")
        sys.exit(1)


if __name__ == "__main__":
    main()

osc-1.12.1/osc/commands/
osc-1.12.1/osc/commands/__init__.py

osc-1.12.1/osc/commands/person.py

import osc.commandline


class PersonCommand(osc.commandline.OscCommand):
    """
    Manage persons
    """

    name = "person"

    def run(self, args):
        pass

osc-1.12.1/osc/commands/person_register.py

import osc.commandline


class PersonRegisterCommand(osc.commandline.OscCommand):
    """
    Register a new person (user)
    """

    name = "register"
    parent = "PersonCommand"

    def init_arguments(self):
        self.add_argument(
            "--login",
            required=True,
            help="Login.",
        )
        self.add_argument(
            "--realname",
            required=True,
            help="Real name of the person.",
        )
        self.add_argument(
            "--email",
            required=True,
            help="Email address.",
        )
        self.add_argument(
            "--password",
            help="Password. An interactive prompt is shown if password is not specified.",
        )
        self.add_argument(
            "--note",
            help="Any notes about the person.",
        )
        self.add_argument(
            "--state",
            help="State of the account. Defaults to 'unconfirmed'.",
        )

    def run(self, args):
        from osc import obs_api
        from osc.util.helper import raw_input

        if args.password:
            password = args.password
        else:
            password = raw_input(f"Enter password for {args.login}@{args.apiurl}: ")

        obs_api.Person.cmd_register(
            args.apiurl,
            login=args.login,
            realname=args.realname,
            email=args.email,
            password=password,
            note=args.note,
            state=args.state,
        )

osc-1.12.1/osc/commands/person_search.py

import osc.commandline


class PersonSearchCommand(osc.commandline.OscCommand):
    """
    Search a person (user)
    """

    name = "search"
    parent = "PersonCommand"

    def init_arguments(self):
        self.add_argument(
            "--login",
            help="Search by a login.",
        )
        self.add_argument(
            "--login-contains",
            metavar="SUBSTR",
            help="Search by a substring in a login.",
        )
        self.add_argument(
            "--realname-contains",
            metavar="SUBSTR",
            help="Search by a substring in a realname.",
        )
        self.add_argument(
            "--email",
            help="Search by an email address.",
        )
        self.add_argument(
            "--email-contains",
            metavar="SUBSTR",
            help="Search by a substring in an email address.",
        )

    def run(self, args):
        from ..
import obs_api persons = obs_api.Person.search( args.apiurl, login=args.login, login__contains=args.login_contains, realname__contains=args.realname_contains, email=args.email, email__contains=args.email_contains, ) for person in persons: print(person.to_human_readable_string()) print() �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands/repo.py���������������������������������������������������������������������0000664�0000000�0000000�00000000272�14753375025�0016300�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import osc.commandline class RepoCommand(osc.commandline.OscCommand): """ Manage repositories in project meta """ name = "repo" def run(self, args): pass ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands/repo_add.py�����������������������������������������������������������������0000664�0000000�0000000�00000006611�14753375025�0017113�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import difflib import osc.commandline from .. import oscerr class RepoAddCommand(osc.commandline.OscCommand): """ Add a repository to project meta """ name = "add" parent = "RepoCommand" def init_arguments(self): self.add_argument( "project", help="Name of the project", ) self.add_argument( "--repo", metavar="NAME", required=True, help="Name of the repository we're adding", ) self.add_argument( "--arch", dest="arches", metavar="[ARCH]", action="append", required=True, help="Architecture of the repository. Can be specified multiple times.", ) self.add_argument( "--path", dest="paths", metavar="[PROJECT/REPO]", action="append", required=True, help="Path associated to the repository. Format is PROJECT/REPO. Can be specified multiple times.", ) self.add_argument( "--disable-publish", action="store_true", default=False, help="Disable publishing the added repository", ) self.add_argument( "--yes", action="store_true", help="Proceed without asking", ) def run(self, args): from .. 
import obs_api from ..output import get_user_input paths = [] for path in args.paths: if "/" not in path: self.parser.error(f"Invalid path (expected format is PROJECT/REPO): {path}") project, repo = path.split("/") paths.append({"project": project, "repository": repo}) project_obj = obs_api.Project.from_api(args.apiurl, args.project) old = project_obj.to_string() matching_repos = [i for i in project_obj.repository_list or [] if i.name == args.repo] if matching_repos: raise oscerr.OscValueError(f"Repository '{args.repo}' already exists in project meta") project_obj.repository_list.append( { "name": args.repo, "arch_list": args.arches, "path_list": paths, } ) if args.disable_publish: matching_publish_disable_repos = [ i for i in project_obj.publish_list or [] if i.flag == "disable" and i.repository == args.repo ] if not matching_publish_disable_repos: if project_obj.publish_list is None: project_obj.publish_list = [] project_obj.publish_list.append( { "flag": "disable", "repository": args.repo, } ) if not args.yes: new = project_obj.to_string() diff = difflib.unified_diff(old.splitlines(), new.splitlines(), fromfile="old", tofile="new") print("\n".join(diff)) print() reply = get_user_input( f""" You're changing meta of project '{args.project}' Do you want to apply the changes? """, answers={"y": "yes", "n": "no"}, ) if reply == "n": raise oscerr.UserAbort() project_obj.to_api(args.apiurl) �����������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands/repo_list.py����������������������������������������������������������������0000664�0000000�0000000�00000003400�14753375025�0017327�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import osc.commandline class RepoListCommand(osc.commandline.OscCommand): """ List repositories in project meta """ name = "list" aliases = ["ls"] parent = "RepoCommand" def init_arguments(self): self.add_argument( "project", help="Name of the project", ) def run(self, args): from .. 
import obs_api from ..output import KeyValueTable project_obj = obs_api.Project.from_api(args.apiurl, args.project) repo_flags = project_obj.resolve_repository_flags() flag_map = {} for (repo_name, arch), data in repo_flags.items(): for flag_name, flag_value in data.items(): if flag_value is None: continue action = "enable" if flag_value else "disable" flag_map.setdefault(repo_name, {}).setdefault(flag_name, {}).setdefault(action, []).append(arch) table = KeyValueTable() for repo in project_obj.repository_list or []: table.add("Repository", repo.name, color="bold") table.add("Architectures", ", ".join(repo.arch_list)) if repo.path_list: paths = [f"{path.project}/{path.repository}" for path in repo.path_list] table.add("Paths", paths) if repo.name in flag_map: table.add("Flags", None) for flag_name in flag_map[repo.name]: lines = [] for action, archs in flag_map[repo.name][flag_name].items(): lines.append(f"{action + ':':<8s} {', '.join(archs)}") lines.sort() table.add(flag_name, lines, indent=4) table.newline() print(str(table)) ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands/repo_remove.py��������������������������������������������������������������0000664�0000000�0000000�00000003757�14753375025�0017670�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import difflib import osc.commandline from .. import oscerr class RepoRemoveCommand(osc.commandline.OscCommand): """ Remove repositories from project meta """ name = "remove" aliases = ["rm"] parent = "RepoCommand" def init_arguments(self): self.add_argument( "project", help="Name of the project", ) self.add_argument( "--repo", metavar="[NAME]", action="append", required=True, help="Name of the repository we're removing. Can be specified multiple times.", ) self.add_argument( "--yes", action="store_true", help="Proceed without asking", ) def run(self, args): from .. import obs_api from ..output import get_user_input project_obj = obs_api.Project.from_api(args.apiurl, args.project) old = project_obj.to_string() for repo in args.repo: if project_obj.repository_list is not None: project_obj.repository_list = [i for i in project_obj.repository_list if i.name != repo] if project_obj.publish_list is not None: project_obj.publish_list = [ i for i in project_obj.publish_list if i.flag != "disable" or i.repository != repo ] if not project_obj.has_changed(): return if not args.yes: new = project_obj.to_string() diff = difflib.unified_diff(old.splitlines(), new.splitlines(), fromfile="old", tofile="new") print("\n".join(diff)) print() reply = get_user_input( f""" You're changing meta of project '{args.project}' Do you want to apply the changes? 
""", answers={"y": "yes", "n": "no"}, ) if reply == "n": raise oscerr.UserAbort() project_obj.to_api(args.apiurl) �����������������osc-1.12.1/osc/commands_git/������������������������������������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0015623�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands_git/__init__.py�������������������������������������������������������������0000664�0000000�0000000�00000000000�14753375025�0017722�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands_git/api.py������������������������������������������������������������������0000664�0000000�0000000�00000002366�14753375025�0016755�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import json import sys import osc.commandline_git class ApiCommand(osc.commandline_git.GitObsCommand): """ Make an arbitrary request to API """ name = "api" def init_arguments(self): self.add_argument( "-X", "--method", choices=["GET", "HEAD", "POST", "PATCH", "PUT"], default="GET", ) self.add_argument( "url", ) self.add_argument("--data") def run(self, args): from osc import gitea_api from osc.output import tty self.print_gitea_settings() url = self.gitea_conn.makeurl(args.url) json_data = None if args.data: json_data = json.loads(args.data) response = self.gitea_conn.request( method=args.method, url=url, json_data=json_data ) print(tty.colorize("Response:", "white,bold"), file=sys.stderr) if response.headers.get("Content-Type", "").startswith("application/json;"): print( json.dumps( json.loads(response.data), indent=4, sort_keys=True, ) ) else: print(response.data.decode("utf-8")) ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands_git/login.py����������������������������������������������������������������0000664�0000000�0000000�00000000422�14753375025�0017303�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import osc.commandline_git class LoginCommand(osc.commandline_git.GitObsCommand): """ Manage configured credentials to Gitea servers """ name = "login" def init_arguments(self): pass def run(self, args): 
        self.parser.print_help()

osc-1.12.1/osc/commands_git/login_add.py

import sys

import osc.commandline_git


class LoginAddCommand(osc.commandline_git.GitObsCommand):
    """
    Add a Gitea credentials entry
    """

    name = "add"
    parent = "LoginCommand"

    def init_arguments(self):
        self.parser.add_argument("name")
        self.parser.add_argument("--url", required=True)
        self.parser.add_argument("--user", required=True)
        self.parser.add_argument("--token", required=True)
        self.parser.add_argument("--ssh-key")
        self.parser.add_argument("--set-as-default", action="store_true", default=None)

    def run(self, args):
        from osc import gitea_api

        print(f"Adding a Gitea credentials entry with name '{args.name}' ...", file=sys.stderr)
        print(f" * Config path: {self.gitea_conf.path}", file=sys.stderr)
        print("", file=sys.stderr)

        # TODO: try to authenticate to verify that the new entry works

        login = gitea_api.Login(name=args.name, url=args.url, user=args.user, token=args.token, ssh_key=args.ssh_key, default=args.set_as_default)
        self.gitea_conf.add_login(login)

        print("Added entry:")
        print(login.to_human_readable_string())

osc-1.12.1/osc/commands_git/login_list.py

import osc.commandline_git


class LoginListCommand(osc.commandline_git.GitObsCommand):
    """
    List Gitea credentials entries
    """

    name = "list"
    parent = "LoginCommand"

    def init_arguments(self):
        self.parser.add_argument("--show-tokens", action="store_true", help="Show tokens in the output")

    def run(self, args):
        for login in self.gitea_conf.list_logins():
            print(login.to_human_readable_string(show_token=args.show_tokens))
            print()
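# Illustrative sketch (not part of osc-1.12.1): the login_* commands above all follow the
# same pattern -- subclass GitObsCommand, set `name` and `parent`, declare options in
# init_arguments() and implement run(). A hypothetical "git-obs login show" subcommand that
# reuses only APIs demonstrated in this document (self.gitea_conf.get_login() and
# Login.to_human_readable_string()) could look roughly like the sketch below. The class
# name and command name are assumptions made for the example; they are not shipped with osc.

import osc.commandline_git


class LoginShowCommand(osc.commandline_git.GitObsCommand):
    """
    Show a single Gitea credentials entry (illustrative sketch)
    """

    name = "show"
    parent = "LoginCommand"

    def init_arguments(self):
        self.parser.add_argument("name")

    def run(self, args):
        # get_login() is used the same way in login_update.py below
        login = self.gitea_conf.get_login(args.name)
        print(login.to_human_readable_string())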
osc-1.12.1/osc/commands_git/login_remove.py

import sys

import osc.commandline_git


class LoginRemoveCommand(osc.commandline_git.GitObsCommand):
    """
    Remove a Gitea credentials entry
    """

    name = "remove"
    parent = "LoginCommand"

    def init_arguments(self):
        self.parser.add_argument("name")

    def run(self, args):
        print(f"Removing a Gitea credentials entry with name '{args.name}' ...", file=sys.stderr)
        print(f" * Config path: {self.gitea_conf.path}", file=sys.stderr)
        print("", file=sys.stderr)

        login = self.gitea_conf.remove_login(args.name)

        print("Removed entry:")
        print(login.to_human_readable_string())

osc-1.12.1/osc/commands_git/login_update.py

import sys

import osc.commandline_git


class LoginUpdateCommand(osc.commandline_git.GitObsCommand):
    """
    Update a Gitea credentials entry
    """

    name = "update"
    parent = "LoginCommand"

    def init_arguments(self):
        self.parser.add_argument("name")
        self.parser.add_argument("--new-name")
        self.parser.add_argument("--new-url")
        self.parser.add_argument("--new-user")
        self.parser.add_argument("--new-token")
        self.parser.add_argument("--new-ssh-key")
        self.parser.add_argument("--set-as-default", action="store_true")

    def run(self, args):
        print(f"Updating a Gitea credentials entry with name '{args.name}' ...", file=sys.stderr)
        print(f" * Config path: {self.gitea_conf.path}", file=sys.stderr)
        print("", file=sys.stderr)

        # TODO: try to authenticate to verify that the updated entry works

        original_login = self.gitea_conf.get_login(args.name)
        print("Original entry:")
        print(original_login.to_human_readable_string())

        updated_login = self.gitea_conf.update_login(
            args.name,
            new_name=args.new_name,
            new_url=args.new_url,
            new_user=args.new_user,
            new_token=args.new_token,
            new_ssh_key=args.new_ssh_key,
            set_as_default=args.set_as_default,
        )

        print("")
        print("Updated entry:")
        print(updated_login.to_human_readable_string())
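# Illustrative sketch (not part of osc-1.12.1): the BooleanAction helper defined in
# osc/commandline_git.py above turns explicit "yes"/"no"-style values into True/False/None,
# which is how pr_set.py's --allow-maintainer-edit option consumes it. A minimal standalone
# parser, assuming only the standard library and the class shown above:

import argparse

from osc.commandline_git import BooleanAction

parser = argparse.ArgumentParser()
parser.add_argument("--allow-maintainer-edit", action=BooleanAction, default=None)

assert parser.parse_args(["--allow-maintainer-edit", "yes"]).allow_maintainer_edit is True
assert parser.parse_args(["--allow-maintainer-edit", "off"]).allow_maintainer_edit is False
assert parser.parse_args([]).allow_maintainer_edit is None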
����������������������������������������������������������osc-1.12.1/osc/commands_git/pr.py�������������������������������������������������������������������0000664�0000000�0000000�00000000577�14753375025�0016627�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import osc.commandline_git # we decided not to use the command name 'pull' because that could be confused # with the completely unrelated 'git pull' command class PullRequestCommand(osc.commandline_git.GitObsCommand): """ Manage pull requests """ name = "pr" def init_arguments(self): pass def run(self, args): self.parser.print_help() ���������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands_git/pr_checkout.py����������������������������������������������������������0000664�0000000�0000000�00000003770�14753375025�0020512�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import subprocess import osc.commandline_git class PullRequestCheckoutCommand(osc.commandline_git.GitObsCommand): """ Check out a pull request """ name = "checkout" parent = "PullRequestCommand" def init_arguments(self): self.add_argument( "pull", type=int, help="Number of the pull request", ) self.add_argument( "-f", "--force", action="store_true", help="Reset the existing local branch to the latest state of the pull request", ) def run(self, args): from osc import gitea_api self.print_gitea_settings() git = gitea_api.Git(".") owner, repo = git.get_owner_repo() pr = gitea_api.PullRequest.get(self.gitea_conn, owner, repo, args.pull).json() head_ssh_url = pr["head"]["repo"]["ssh_url"] head_owner = pr["head"]["repo"]["owner"]["login"] head_branch = pr["head"]["ref"] try: git.add_remote(head_owner, head_ssh_url) except subprocess.CalledProcessError as e: # TODO: check if the remote url matches if e.returncode != 3: # returncode 3 means that the remote exists; see `man git-remote` raise git.fetch(head_owner) local_branch = git.fetch_pull_request(args.pull, force=args.force) # LFS data may not be accessible in the "origin" remote, we need to allow searching in all remotes git.set_config("lfs.remote.searchall", "1") # configure branch for `git push` git.set_config(f"branch.{local_branch}.remote", head_owner) git.set_config(f"branch.{local_branch}.pushRemote", head_owner) git.set_config(f"branch.{local_branch}.merge", f"refs/heads/{head_branch}") # allow `git push` with no arguments to push to a remote branch that is named differently than the local branch git.set_config("push.default", "upstream") git.switch(local_branch) 
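# Illustrative sketch (not part of osc-1.12.1): besides the `git-obs` console entry point,
# GitObsMainCommand.main(run=False) from osc/commandline_git.py above can drive the same
# command tree programmatically. The owner/repo value below is a placeholder chosen for the
# example, and actually running this would contact the configured Gitea instance.

from osc.commandline_git import GitObsMainCommand

# Build the command tree without executing anything; with run=False this returns (cmd, None).
cmd, _ = GitObsMainCommand.main(run=False)

# Parse our own argv and dispatch it, e.g. list open pull requests of a repo.
args = cmd.parse_args(args=["pr", "list", "example-owner/example-repo"])
exit_code = cmd.run(args)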
��������osc-1.12.1/osc/commands_git/pr_create.py������������������������������������������������������������0000664�0000000�0000000�00000015637�14753375025�0020155�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import os import re import subprocess import sys from typing import Optional import osc.commandline_git def get_editor() -> str: import shutil editor = os.getenv("EDITOR", None) if editor: candidates = [editor] else: candidates = ["vim", "vi"] editor_path = None for i in candidates: editor_path = shutil.which(i) if editor_path: break if not editor_path: raise RuntimeError(f"Unable to start editor '{candidates[0]}'") return editor_path def run_editor(file_path: str): cmd = [get_editor(), file_path] subprocess.run(cmd) def edit_message(template: Optional[str] = None) -> str: import tempfile with tempfile.NamedTemporaryFile(mode="w+", encoding="utf-8", prefix="git_obs_message_") as f: if template: f.write(template) f.flush() run_editor(f.name) f.seek(0) return f.read() NEW_PULL_REQUEST_TEMPLATE = """ # title {title} # description {description} # # Please enter pull request title and description in the following format: # <title> # <blank line> # <description> # # Lines starting with '#' will be ignored, and an empty message aborts the operation. # # Creating {source_owner}/{source_repo}#{source_branch} -> {target_owner}/{target_repo}#{target_branch} # """.lstrip() class PullRequestCreateCommand(osc.commandline_git.GitObsCommand): """ Create a pull request """ name = "create" parent = "PullRequestCommand" def init_arguments(self): self.add_argument( "--title", metavar="TEXT", help="Pull request title", ) self.add_argument( "--description", metavar="TEXT", help="Pull request description (body)", ) self.add_argument( "--source-owner", metavar="OWNER", help="Owner of the source repo (default: derived from remote URL in local git repo)", ) self.add_argument( "--source-repo", metavar="REPO", help="Name of the source repo (default: derived from remote URL in local git repo)", ) self.add_argument( "--source-branch", metavar="BRANCH", help="Source branch (default: the current branch in local git repo)", ) self.add_argument( "--target-branch", metavar="BRANCH", help="Target branch (default: derived from the current branch in local git repo)", ) def run(self, args): from osc import gitea_api # the source args are optional, but if one of them is used, the others must be used too source_args = (args.source_owner, args.source_repo, args.source_branch) if sum((int(i is not None) for i in source_args)) not in (0, len(source_args)): self.parser.error("All of the following options must be used together: --source-owner, --source-repo, --source-branch") self.print_gitea_settings() use_local_git = args.source_owner is None if use_local_git: # local git repo git = gitea_api.Git(".") local_owner, local_repo = git.get_owner_repo() local_branch = git.current_branch local_rev = git.get_branch_head(local_branch) # remote git repo - source if use_local_git: source_owner = local_owner source_repo = local_repo source_branch = local_branch else: source_owner = args.source_owner source_repo = args.source_repo source_branch = args.source_branch source_repo_data = gitea_api.Repo.get(self.gitea_conn, source_owner, 
source_repo).json() source_branch_data = gitea_api.Branch.get(self.gitea_conn, source_owner, source_repo, source_branch).json() source_rev = source_branch_data["commit"]["id"] # remote git repo - target target_owner, target_repo = source_repo_data["parent"]["full_name"].split("/") if source_branch.startswith("for/"): # source branch name format: for/<target-branch>/<what-the-branch-name-would-normally-be> target_branch = source_branch.split("/")[1] else: target_branch = source_branch target_branch_data = gitea_api.Branch.get(self.gitea_conn, target_owner, target_repo, target_branch).json() target_rev = target_branch_data["commit"]["id"] print("Creating a pull request ...", file=sys.stderr) if use_local_git: print(f" * Local git: branch: {local_branch}, rev: {local_rev}", file=sys.stderr) print(f" * Source: {source_owner}/{source_repo}, branch: {source_branch}, rev: {source_rev}", file=sys.stderr) print(f" * Target: {target_owner}/{target_repo}, branch: {target_branch}, rev: {target_rev}", file=sys.stderr) if use_local_git and local_rev != source_rev: from osc.output import tty print(f"{tty.colorize('ERROR', 'red,bold')}: Local commit doesn't correspond with the latest commit in the remote source branch") sys.exit(1) if source_rev == target_rev: from osc.output import tty print(f"{tty.colorize('ERROR', 'red,bold')}: Source and target are identical, make and push changes to the remote source repo first") sys.exit(1) title = args.title or "" description = args.description or "" if not title or not description: # TODO: add list of commits and list of changed files to the template; requires local git repo message = edit_message(template=NEW_PULL_REQUEST_TEMPLATE.format(**locals())) # remove comments message = "\n".join([i for i in message.splitlines() if not i.startswith("#")]) # strip leading and trailing spaces message = message.strip() if not message: raise RuntimeError("Aborting operation due to empty title and description.") parts = re.split(r"\n\n", message, 1) if len(parts) == 1: # empty description title = parts[0] description = "" else: title = parts[0] description = parts[1] title = title.strip() description = description.strip() pull = gitea_api.PullRequest.create( self.gitea_conn, target_owner=target_owner, target_repo=target_repo, target_branch=target_branch, source_owner=source_owner, # source_repo is not required because the information lives in Gitea database source_branch=source_branch, title=title, description=description, ).json() print("", file=sys.stderr) print("Pull request created:", file=sys.stderr) print(gitea_api.PullRequest.to_human_readable_string(pull)) �������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands_git/pr_get.py���������������������������������������������������������������0000664�0000000�0000000�00000003521�14753375025�0017456�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import sys import osc.commandline_git class PullRequestGetCommand(osc.commandline_git.GitObsCommand): """ Get details about the specified pull requests """ name = "get" aliases = ["show"] # for compatibility with osc parent = "PullRequestCommand" def init_arguments(self): self.add_argument_owner_repo_pull(nargs="+") 
self.add_argument( "-p", "--patch", action="store_true", help="Show patches associated with the pull requests", ) def run(self, args): from osc import gitea_api from osc.core import highlight_diff from osc.output import tty self.print_gitea_settings() num_entries = 0 failed_entries = [] for owner, repo, pull in args.owner_repo_pull: try: pr = gitea_api.PullRequest.get(self.gitea_conn, owner, repo, int(pull)).json() num_entries += 1 except gitea_api.GiteaException as e: if e.status == 404: failed_entries.append(f"{owner}/{repo}#{pull}") continue raise print(gitea_api.PullRequest.to_human_readable_string(pr)) if args.patch: print("") print(tty.colorize("Patch:", "bold")) patch = gitea_api.PullRequest.get_patch(self.gitea_conn, owner, repo, pull).data patch = highlight_diff(patch) print(patch.decode("utf-8")) print() print(f"Total entries: {num_entries}", file=sys.stderr) if failed_entries: print( f"{tty.colorize('ERROR', 'red,bold')}: Couldn't retrieve the following pull requests: {', '.join(failed_entries)}", file=sys.stderr, ) sys.exit(1) �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands_git/pr_list.py��������������������������������������������������������������0000664�0000000�0000000�00000002027�14753375025�0017652�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import sys import osc.commandline_git class PullRequestListCommand(osc.commandline_git.GitObsCommand): """ List pull requests in a repository """ name = "list" parent = "PullRequestCommand" def init_arguments(self): self.add_argument_owner_repo(nargs="+") self.add_argument( "--state", choices=["open", "closed", "all"], default="open", help="State of the pull requests (default: open)", ) def run(self, args): from osc import gitea_api self.print_gitea_settings() total_entries = 0 for owner, repo in args.owner_repo: data = gitea_api.PullRequest.list(self.gitea_conn, owner, repo, state=args.state).json() total_entries += len(data) text = gitea_api.PullRequest.list_to_human_readable_string(data, sort=True) if text: print(text) print("", file=sys.stderr) print(f"Total entries: {total_entries}", file=sys.stderr) 
���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands_git/pr_search.py������������������������������������������������������������0000664�0000000�0000000�00000004374�14753375025�0020153�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import sys import osc.commandline_git class PullRequestSearchCommand(osc.commandline_git.GitObsCommand): """ Search pull requests in the whole gitea instance """ name = "search" parent = "PullRequestCommand" def init_arguments(self): self.add_argument( "--state", choices=["open", "closed"], default="open", help="Filter by state: open, closed (default: open)", ) self.add_argument( "--title", help="Filter by substring in title", ) self.add_argument( "--owner", help="Filter by owner of the repository associated with the pull requests", ) self.add_argument( "--label", dest="labels", metavar="LABEL", action="append", help="Filter by associated labels. Non existent labels are discarded. Can be specified multiple times.", ) self.add_argument( "--assigned", action="store_true", help="Filter pull requests assigned to you", ) self.add_argument( "--created", action="store_true", help="Filter pull requests created by you", ) self.add_argument( "--mentioned", action="store_true", help="Filter pull requests mentioning you", ) self.add_argument( "--review-requested", action="store_true", help="Filter pull requests requesting your review", ) def run(self, args): from osc import gitea_api self.print_gitea_settings() data = gitea_api.PullRequest.search( self.gitea_conn, state=args.state, title=args.title, owner=args.owner, labels=args.labels, assigned=args.assigned, created=args.created, mentioned=args.mentioned, review_requested=args.review_requested, ).json() text = gitea_api.PullRequest.list_to_human_readable_string(data, sort=True) if text: print(text) print("", file=sys.stderr) print(f"Total entries: {len(data)}", file=sys.stderr) ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands_git/pr_set.py���������������������������������������������������������������0000664�0000000�0000000�00000004057�14753375025�0017477�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import sys import osc.commandline_git def b(value: str): if value is not None: return value.lower() in ["1", "yes", "true", "on"] 
return None class PullRequestSetCommand(osc.commandline_git.GitObsCommand): """ Change a pull request """ name = "set" parent = "PullRequestCommand" def init_arguments(self): self.add_argument_owner_repo_pull(nargs="+") self.add_argument( "--title", ) self.add_argument( "--description", ) self.add_argument( "--allow-maintainer-edit", action=osc.commandline_git.BooleanAction, help="Users with write access to the base branch can also push to the pull request's head branch", ) def run(self, args): from osc import gitea_api from osc.core import highlight_diff from osc.output import tty self.print_gitea_settings() print(args) num_entries = 0 failed_entries = [] for owner, repo, pull in args.owner_repo_pull: try: pr = gitea_api.PullRequest.set( self.gitea_conn, owner, repo, int(pull), title=args.title, description=args.description, allow_maintainer_edit=args.allow_maintainer_edit, ).json() num_entries += 1 except gitea_api.GiteaException as e: if e.status == 404: failed_entries.append(f"{owner}/{repo}#{pull}") continue raise print(gitea_api.PullRequest.to_human_readable_string(pr)) print() print(f"Total modified entries: {num_entries}", file=sys.stderr) if failed_entries: print( f"{tty.colorize('ERROR', 'red,bold')}: Couldn't change the following pull requests: {', '.join(failed_entries)}", file=sys.stderr, ) sys.exit(1) ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands_git/repo.py�����������������������������������������������������������������0000664�0000000�0000000�00000000362�14753375025�0017143�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import osc.commandline_git class RepoCommand(osc.commandline_git.GitObsCommand): """ Manage git repos """ name = "repo" def init_arguments(self): pass def run(self, args): self.parser.print_help() ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands_git/repo_clone.py�����������������������������������������������������������0000664�0000000�0000000�00000005570�14753375025�0020331�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import subprocess import sys import osc.commandline_git class RepoCloneCommand(osc.commandline_git.GitObsCommand): """ Clone a git repo NOTE: Some of the options may result in setting "core.sshCommand" config option in the git repository." 
""" name = "clone" parent = "RepoCommand" def init_arguments(self): self.add_argument_owner_repo(nargs="+") self.add_argument( "-a", "--anonymous", action="store_true", default=None, help="Clone anonymously via the http protocol", ) self.add_argument( "-i", "--ssh-key", help="Path to a private SSH key (identity file)", ) self.add_argument( "--no-ssh-strict-host-key-checking", action="store_true", help="Set 'StrictHostKeyChecking no' ssh option", ) # TODO: replace with an optional argument to get closer to the `git clone` command? self.add_argument( "--directory", help="Clone into the given directory", ) def run(self, args): from osc import gitea_api from osc.output import tty self.print_gitea_settings() if len(args.owner_repo) > 1 and args.directory: self.parser.error("The --directory option cannot be used with multiple repos") num_entries = 0 failed_entries = [] for owner, repo in args.owner_repo: print(f"Cloning git repo {owner}/{repo} ...", file=sys.stderr) try: gitea_api.Repo.clone( self.gitea_conn, owner, repo, directory=args.directory, anonymous=args.anonymous, add_remotes=True, ssh_private_key_path=args.ssh_key or self.gitea_login.ssh_key, ssh_strict_host_key_checking=not(args.no_ssh_strict_host_key_checking), ) num_entries += 1 except gitea_api.GiteaException as e: if e.status == 404: print(f" * {tty.colorize('ERROR', 'red,bold')}: Repo doesn't exist: {owner}/{repo}", file=sys.stderr) failed_entries.append(f"{owner}/{repo}") continue raise except subprocess.CalledProcessError as e: print(f" * {tty.colorize('ERROR', 'red,bold')}: git clone failed", file=sys.stderr) failed_entries.append(f"{owner}/{repo}") continue print("", file=sys.stderr) print(f"Total cloned repos: {num_entries}", file=sys.stderr) if failed_entries: print(f"{tty.colorize('ERROR', 'red,bold')}: Couldn't clone the following repos: {', '.join(failed_entries)}", file=sys.stderr) sys.exit(1) ����������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/commands_git/repo_fork.py������������������������������������������������������������0000664�0000000�0000000�00000004156�14753375025�0020171�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import sys import osc.commandline_git class RepoForkCommand(osc.commandline_git.GitObsCommand): """ Fork a git repo """ name = "fork" parent = "RepoCommand" def init_arguments(self): self.add_argument_owner_repo(nargs="+") self.add_argument_new_repo_name() def run(self, args): from osc import gitea_api from osc.output import tty self.print_gitea_settings() if len(args.owner_repo) > 1 and args.new_repo_name: self.parser.error("The --new-repo-name option cannot be used with multiple repos") num_entries = 0 failed_entries = [] for owner, repo in args.owner_repo: print(f"Forking git repo {owner}/{repo} ...", file=sys.stderr) try: response = gitea_api.Fork.create(self.gitea_conn, owner, repo, new_repo_name=args.new_repo_name) repo = response.json() fork_owner = repo["owner"]["login"] fork_repo = repo["name"] print(f" * Fork created: {fork_owner}/{fork_repo}", file=sys.stderr) num_entries += 1 except gitea_api.ForkExists as e: fork_owner = e.fork_owner fork_repo = e.fork_repo print(f" * Fork already 
osc-1.12.1/osc/commands_git/repo_fork.py

import sys

import osc.commandline_git


class RepoForkCommand(osc.commandline_git.GitObsCommand):
    """
    Fork a git repo
    """

    name = "fork"
    parent = "RepoCommand"

    def init_arguments(self):
        self.add_argument_owner_repo(nargs="+")
        self.add_argument_new_repo_name()

    def run(self, args):
        from osc import gitea_api
        from osc.output import tty

        self.print_gitea_settings()

        if len(args.owner_repo) > 1 and args.new_repo_name:
            self.parser.error("The --new-repo-name option cannot be used with multiple repos")

        num_entries = 0
        failed_entries = []
        for owner, repo in args.owner_repo:
            print(f"Forking git repo {owner}/{repo} ...", file=sys.stderr)
            try:
                response = gitea_api.Fork.create(self.gitea_conn, owner, repo, new_repo_name=args.new_repo_name)
                repo = response.json()
                fork_owner = repo["owner"]["login"]
                fork_repo = repo["name"]
                print(f" * Fork created: {fork_owner}/{fork_repo}", file=sys.stderr)
                num_entries += 1
            except gitea_api.ForkExists as e:
                fork_owner = e.fork_owner
                fork_repo = e.fork_repo
                print(f" * Fork already exists: {fork_owner}/{fork_repo}", file=sys.stderr)
                print(f" * {tty.colorize('WARNING', 'yellow,bold')}: Using an existing fork with a different name than requested", file=sys.stderr)
                num_entries += 1
            except gitea_api.GiteaException as e:
                if e.status == 404:
                    print(f" * {tty.colorize('ERROR', 'red,bold')}: Repo doesn't exist: {owner}/{repo}", file=sys.stderr)
                    failed_entries.append(f"{owner}/{repo}")
                    continue
                raise

        print("", file=sys.stderr)
        print(f"Total forked repos: {num_entries}", file=sys.stderr)
        if failed_entries:
            print(f"{tty.colorize('ERROR', 'red,bold')}: Couldn't fork the following repos: {', '.join(failed_entries)}", file=sys.stderr)
            sys.exit(1)


osc-1.12.1/osc/commands_git/ssh_key.py

import osc.commandline_git


class SSHKeyCommand(osc.commandline_git.GitObsCommand):
    """
    Manage public SSH keys
    """

    name = "ssh-key"

    def init_arguments(self):
        pass

    def run(self, args):
        self.parser.print_help()


osc-1.12.1/osc/commands_git/ssh_key_add.py

import os

import osc.commandline_git


class SSHKeyAddCommand(osc.commandline_git.GitObsCommand):
    """
    Add a public SSH key
    """

    name = "add"
    parent = "SSHKeyCommand"

    def init_arguments(self):
        group = self.parser.add_mutually_exclusive_group(required=True)
        group.add_argument(
            "--key",
            help="SSH public key",
        )
        group.add_argument(
            "--key-path",
            metavar="PATH",
            help="Path to the SSH public key",
        )

    def run(self, args):
        from osc import gitea_api

        self.print_gitea_settings()

        if args.key:
            key = args.key
        else:
            with open(os.path.expanduser(args.key_path)) as f:
                key = f.read().strip()

        response = gitea_api.SSHKey.create(self.gitea_conn, key)
        print("Added entry:")
        print(gitea_api.SSHKey.to_human_readable_string(response.json()))
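
The SSH key handling above boils down to a single ``gitea_api.SSHKey.create()`` call. A short sketch, again assuming ``conn`` is an authenticated connection and with an invented helper name, of registering a public key read from a file the way ``SSHKeyAddCommand.run()`` does::

    import os

    from osc import gitea_api

    def add_ssh_key_from_file(conn, key_path):
        # read the public key and register it, as SSHKeyAddCommand.run() does
        with open(os.path.expanduser(key_path)) as f:
            key = f.read().strip()
        response = gitea_api.SSHKey.create(conn, key)
        return gitea_api.SSHKey.to_human_readable_string(response.json())
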
osc-1.12.1/osc/commands_git/ssh_key_list.py

import osc.commandline_git


class SSHKeyListCommand(osc.commandline_git.GitObsCommand):
    """
    List public SSH keys
    """

    name = "list"
    parent = "SSHKeyCommand"

    def init_arguments(self):
        pass

    def run(self, args):
        from osc import gitea_api

        self.print_gitea_settings()

        for i in gitea_api.SSHKey.list(self.gitea_conn).json():
            print(gitea_api.SSHKey.to_human_readable_string(i))
            print()


osc-1.12.1/osc/commands_git/ssh_key_remove.py

import sys

import osc.commandline_git


class SSHKeyRemoveCommand(osc.commandline_git.GitObsCommand):
    """
    Remove a public SSH key
    """

    name = "remove"
    parent = "SSHKeyCommand"

    def init_arguments(self):
        self.parser.add_argument(
            "id",
            type=int,
            help="Id of the SSH public key",
        )

    def run(self, args):
        from osc import gitea_api

        self.print_gitea_settings()

        print(f"Removing ssh key with id='{args.id}' ...", file=sys.stderr)

        response = gitea_api.SSHKey.get(self.gitea_conn, args.id)
        gitea_api.SSHKey.delete(self.gitea_conn, args.id)
        print("Removed entry:")
        print(gitea_api.SSHKey.to_human_readable_string(response.json()))


osc-1.12.1/osc/conf.py

# Copyright Contributors to the osc project.
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.

"""
This module handles configuration of osc.
Configuring osc from oscrc -------------------------- To configure osc from oscrc, do following:: import osc.conf # see ``get_config()`` documentation for available function arguments # see ``oscrc(5)`` man page for oscrc configuration options osc.conf.get_config() Configuring osc from API ------------------------ To configure osc purely from the API (without reading oscrc), do following:: import osc.conf # initialize the main config object config = osc.conf.Options() # configure host options for an apiurl apiurl = osc.conf.sanitize_apiurl(apiurl) host_options = HostOptions(apiurl=apiurl, username=..., _parent=config) config.api_host_options[apiurl] = host_options # set the default ``apiurl`` config.apiurl = ... # place the config object in `osc.conf` osc.conf.config = config # optional: enable http debugging according to the ``http_debug`` and ``http_full_debug`` options from osc.connection import enable_http_debug enable_http_debug(osc.conf.config) """ import collections import errno import getpass import http.client import os import re import shutil import sys import textwrap from io import BytesIO from io import StringIO from urllib.parse import urlsplit from . import credentials from . import OscConfigParser from . import oscerr from .output import tty from .util import xdg from .util.helper import raw_input from .util.models import * GENERIC_KEYRING = False try: import keyring GENERIC_KEYRING = True except: pass __all__ = [ "get_config", "Options", "HostOptions", "Password", "config", ] class Password(collections.UserString): """ Lazy password that wraps either a string or a function. The result of the function gets returned any time the object is used as a string. """ def __init__(self, data): self._data = data @property def data(self): if callable(self._data): # if ``data`` is a function, call it every time the string gets evaluated # we use the password only from time to time to make a session cookie # and there's no need to keep the password in program memory longer than necessary result = self._data() # the function can also return a function, let's evaluate them recursively while callable(result): result = result() if result is None: raise oscerr.OscIOError(None, "Unable to retrieve password") return result return self._data def __format__(self, format_spec): if format_spec.endswith("s"): return f"{self.__str__():{format_spec}}" return super().__format__(format_spec) def encode(self, *args, **kwargs): if sys.version_info < (3, 8): # avoid returning the Password object on python < 3.8 return str(self).encode(*args, **kwargs) return super().encode(*args, **kwargs) HttpHeader = NewType("HttpHeader", Tuple[str, str]) class OscOptions(BaseModel): def __init__(self, *args, **kwargs): super().__init__(*args, **kwargs) self._allow_new_attributes = True self._extra_fields = {} self._allow_new_attributes = False # compat function with the config dict def _get_field_name(self, name): if name in self.__fields__: return name for field_name, field in self.__fields__.items(): ini_key = field.extra.get("ini_key", None) if ini_key == name: return field_name return None # compat function with the config dict def __getitem__(self, name): field_name = self._get_field_name(name) if field_name is None and not hasattr(self, name): return self._extra_fields[name] field_name = field_name or name try: return getattr(self, field_name) except AttributeError: raise KeyError(name) # compat function with the config dict def __setitem__(self, name, value): field_name = self._get_field_name(name) if field_name 
is None and not hasattr(self, name): self._extra_fields[name] = value return field_name = field_name or name setattr(self, field_name, value) # compat function with the config dict def __contains__(self, name): try: self[name] except KeyError: return False return True # compat function with the config dict def setdefault(self, name, default=None): field_name = self._get_field_name(name) # we're ignoring ``default`` because the field always exists return getattr(self, field_name, None) # compat function with the config dict def get(self, name, default=None): try: return self[name] except KeyError: return default def set_value_from_string(self, name, value): field_name = self._get_field_name(name) field = self.__fields__[field_name] if not isinstance(value, str): setattr(self, field_name, value) return if not value.strip(): if field.is_optional: setattr(self, field_name, None) return if field.origin_type is Password: value = Password(value) setattr(self, field_name, value) return if field.type is List[HttpHeader]: value = http.client.parse_headers(BytesIO(value.strip().encode("utf-8"))).items() setattr(self, field_name, value) return if field.origin_type is list: # split list options into actual lists value = re.split(r"[, ]+", value) setattr(self, field_name, value) return if field.origin_type is bool: if value.lower() in ["1", "yes", "true", "on"]: value = True setattr(self, field_name, value) return if value.lower() in ["0", "no", "false", "off"]: value = False setattr(self, field_name, value) return if field.origin_type is int: value = int(value) setattr(self, field_name, value) return setattr(self, field_name, value) class HostOptions(OscOptions): """ Configuration options for individual apiurls. """ def __init__(self, _parent, **kwargs): super().__init__(_parent=_parent, **kwargs) apiurl: str = Field( default=None, description=textwrap.dedent( """ URL to the API server. """ ), ) # type: ignore[assignment] aliases: List[str] = Field( default=[], description=textwrap.dedent( """ Aliases of the apiurl. """ ), ) # type: ignore[assignment] username: str = Field( default=None, description=textwrap.dedent( """ Username for the apiurl. """ ), ini_key="user", ) # type: ignore[assignment] credentials_mgr_class: Optional[str] = Field( default=None, description=textwrap.dedent( """ Fully qualified name of a class used to fetch a password. """ ), ) # type: ignore[assignment] password: Optional[Password] = Field( default=None, description=textwrap.dedent( """ Password for the apiurl. May be empty if the credentials manager fetches the password from a keyring or ``sshkey`` is used. """ ), ini_key="pass", ) # type: ignore[assignment] sshkey: Optional[str] = Field( default=FromParent("sshkey"), description=textwrap.dedent( """ A pointer to public SSH key that corresponds with a private SSH used for authentication: - keep empty for auto detection - path to the public SSH key - public SSH key filename (must be placed in ~/.ssh) - fingerprint of a SSH key (2nd column of ``ssh-add -l``) NOTE: The private key may not be available on disk because it could be in a GPG keyring, on YubiKey or forwarded through SSH agent. TIP: To give osc a hint which ssh key from the agent to use during auto detection, append ``obs=<apiurl-hostname>`` to the **private** key's comment. This will also work nicely during SSH agent forwarding, because the comments get forwarded too. 
- To edit the key, run: ``ssh-keygen -c -f ~/.ssh/<private-key>`` - To query the key, run: ``ssh-keygen -y -f ~/.ssh/<private-key>`` - Example comment: ``<username@host> obs=api.example.com obs=api-test.example.com`` """ ), ) # type: ignore[assignment] downloadurl: Optional[str] = Field( default=None, description=textwrap.dedent( """ Redirect downloads of packages used during build to an alternative location. This allows specifying a local mirror or a proxy, which can greatly improve download performance, latency and more. """ ), ) # type: ignore[assignment] cafile: Optional[str] = Field( default=None, description=textwrap.dedent( """ The path to a file of concatenated CA certificates in PEM format. If specified, the CA certificates from the path will be used to validate other peers' certificates instead of the system-wide certificates. """ ), ) # type: ignore[assignment] capath: Optional[str] = Field( default=None, description=textwrap.dedent( """ The path to a directory containing several CA certificates in PEM format. If specified, the CA certificates from the path will be used to validate other peers' certificates instead of the system-wide certificates. """ ), ) # type: ignore[assignment] sslcertck: bool = Field( default=True, description=textwrap.dedent( """ Whether to validate SSL certificate of the server. It is highly recommended to keep this option enabled. """ ), ) # type: ignore[assignment] allow_http: bool = Field( default=False, description=textwrap.dedent( """ Whether to allow plain HTTP connections. Using HTTP is insecure because it sends passwords and session cookies in plain text. It is highly recommended to keep this option disabled. """ ), ) # type: ignore[assignment] http_headers: List[HttpHeader] = Field( default=[], description=textwrap.dedent( """ Additional HTTP headers attached to each HTTP or HTTPS request. The format is [(header-name, header-value)]. """ ), ini_description=textwrap.dedent( """ Additional HTTP headers attached to each request. The format is HTTP headers separated with newlines. Example:: http_headers = X-Header1: Value1 X-Header2: Value2 """ ), ini_type="newline-separated-list", ) # type: ignore[assignment] trusted_prj: List[str] = Field( default=[], description=textwrap.dedent( """ List of names of the trusted projects. The names can contain globs. Please note that some repos may contain malicious packages that can compromise the build result or even your system! """ ), ) # type: ignore[assignment] disable_hdrmd5_check: bool = Field( default=FromParent("disable_hdrmd5_check"), description=textwrap.dedent( """ Disable hdrmd5 checks of downloaded and cached packages in ``osc build``. It is recommended to keep the check enabled. OBS builds the noarch packages once per binary arch. Such noarch packages are supposed to be nearly identical across all build arches, any discrepancy in the payload and dependencies is considered a packaging bug. But to guarantee that the local builds work identically to builds in OBS, using the arch-specific copy of the noarch package is required. Unfortunatelly only one of the noarch packages gets distributed and can be downloaded from a local mirror. All other noarch packages are available through the OBS API only. Since there is currently no information about hdrmd5 checksums of published noarch packages, we download them, verify hdrmd5 and re-download the package from OBS API on mismatch. 
The same can also happen for architecture depend packages when someone is messing around with the source history or the release number handling in a way that it is not increasing. If you want to save some bandwidth and don't care about the exact rebuilds you can turn this option on to disable hdrmd5 checks completely. """ ), ) # type: ignore[assignment] passx: Optional[str] = Field( default=None, deprecated_text=textwrap.dedent( """ Option 'passx' (oscrc option [$apiurl]/passx) is deprecated. You should be using the 'password' option with 'credentials_mgr_class' set to 'osc.credentials.ObfuscatedConfigFileCredentialsManager' instead. """ ), ) # type: ignore[assignment] realname: Optional[str] = Field( default=FromParent("realname"), description=textwrap.dedent( """ Name of the user passed to the ``vc`` tool via ``VC_REALNAME`` env variable. """ ), ) # type: ignore[assignment] email: Optional[str] = Field( default=FromParent("email"), description=textwrap.dedent( """ Email of the user passed to the ``vc`` tool via ``VC_MAILADDR`` env variable. """ ), ) # type: ignore[assignment] class Options(OscOptions): """ Main configuration options. """ # for internal use conffile: Optional[str] = Field( default=None, exclude=True, ) # type: ignore[assignment] api_host_options: Dict[str, HostOptions] = Field( default={}, description=textwrap.dedent( """ A dictionary that maps ``apiurl`` to ``HostOptions``. """ ), ini_exclude=True, ) # type: ignore[assignment] @property def apiurl_aliases(self): """ Compute and return a dictionary that maps ``alias`` to ``apiurl``. """ result = {} for apiurl, opts in self.api_host_options.items(): result[apiurl] = apiurl for alias in opts.aliases: result[alias] = apiurl return result section_generic: str = Field( default="Generic options", exclude=True, section=True, ) # type: ignore[assignment] apiurl: str = Field( default="https://api.opensuse.org", description=textwrap.dedent( """ Default URL to the API server. Credentials and other ``apiurl`` specific settings must be configured in a ``[$apiurl]`` config section or via API in an ``api_host_options`` entry. """ ), ) # type: ignore[assignment] section_auth: str = Field( default="Authentication options", exclude=True, section=True, ) # type: ignore[assignment] username: Optional[str] = Field( default=None, ini_key="user", deprecated_text=textwrap.dedent( """ Option 'username' (oscrc option [global]/user) is deprecated. You should be using username for each apiurl instead. """ ), ) # type: ignore[assignment] password: Optional[Password] = Field( default=None, ini_key="pass", deprecated_text=textwrap.dedent( """ Option 'password' (oscrc option [global]/pass) is deprecated. You should be using password for each apiurl instead. """ ), ) # type: ignore[assignment] passx: Optional[str] = Field( default=None, deprecated_text=textwrap.dedent( """ Option 'passx' (oscrc option [global]/passx) is deprecated. You should be using password for each apiurl instead. """ ), ) # type: ignore[assignment] sshkey: Optional[str] = Field( default=None, description=HostOptions.__fields__["sshkey"].description, ) # type: ignore[assignment] use_keyring: bool = Field( default=False, description=textwrap.dedent( """ Enable keyring as an option for storing passwords. """ ), ) # type: ignore[assignment] section_verbosity: str = Field( default="Verbosity options", exclude=True, section=True, ) # type: ignore[assignment] quiet: bool = Field( default=False, description=textwrap.dedent( """ Reduce amount of printed information to bare minimum. 
If enabled, automatically sets ``verbose`` to ``False``. """ ), ) # type: ignore[assignment] verbose: bool = Field( default=False, description=textwrap.dedent( """ Increase amount of printed information to stdout. Automatically set to ``False`` when ``quiet`` is enabled. """ ), get_callback=lambda conf, value: False if conf.quiet else value, ) # type: ignore[assignment] debug: bool = Field( default=False, description=textwrap.dedent( """ Print debug information to stderr. """ ), ) # type: ignore[assignment] http_debug: bool = Field( default=False, description=textwrap.dedent( """ Print HTTP traffic to stderr. Automatically set to ``True`` when``http_full_debug`` is enabled. """ ), get_callback=lambda conf, value: True if conf.http_full_debug else value, ) # type: ignore[assignment] http_full_debug: bool = Field( default=False, description=textwrap.dedent( """ [CAUTION!] Print HTTP traffic incl. authentication data to stderr. If enabled, automatically sets ``http_debug`` to ``True``. """ ), ) # type: ignore[assignment] post_mortem: bool = Field( default=False, description=textwrap.dedent( """ Jump into a debugger when an unandled exception occurs. """ ), ) # type: ignore[assignment] traceback: bool = Field( default=False, description=textwrap.dedent( """ Print full traceback to stderr when an unandled exception occurs. """ ), ) # type: ignore[assignment] show_download_progress: bool = Field( default=True, description=textwrap.dedent( """ Show download progressbar. """ ), ) # type: ignore[assignment] section_connection: str = Field( default="Connection options", exclude=True, section=True, ) # type: ignore[assignment] http_retries: int = Field( default=3, description=textwrap.dedent( """ Number of retries on HTTP error. """ ), ) # type: ignore[assignment] cookiejar: str = Field( default=os.path.join(xdg.XDG_STATE_HOME, "osc", "cookiejar"), description=textwrap.dedent( """ Path to a cookie jar that stores session cookies. """ ), ) # type: ignore[assignment] section_scm: str = Field( default="SCM options", exclude=True, section=True, ) # type: ignore[assignment] realname: Optional[str] = Field( default=None, description=HostOptions.__fields__["realname"].description, ) # type: ignore[assignment] email: Optional[str] = Field( default=None, description=HostOptions.__fields__["email"].description, ) # type: ignore[assignment] local_service_run: bool = Field( default=True, description=textwrap.dedent( """ Run local services during commit. """ ), ) # type: ignore[assignment] getpac_default_project: str = Field( default="openSUSE:Factory", description=textwrap.dedent( """ The default project for ``osc getpac`` and ``osc bco``. The value is a space separated list of strings. """ ), ) # type: ignore[assignment] exclude_glob: List[str] = Field( default=[".osc", "CVS", ".svn", ".*", "_linkerror", "*~", "#*#", "*.orig", "*.bak", "*.changes.vctmp.*"], description=textwrap.dedent( """ Space separated list of files ignored by SCM. The files can contain globs. """ ), ) # type: ignore[assignment] exclude_files: List[str] = Field( default=[], description=textwrap.dedent( """ Files that match the listed glob patterns get skipped during checkout. """ ), ) # type: ignore[assignment] include_files: List[str] = Field( default=[], description=textwrap.dedent( """ Files that do not match the listed glob patterns get skipped during checkout. The ``exclude_files`` option takes priority over ``include_files``. 
""" ), ) # type: ignore[assignment] checkout_no_colon: bool = Field( default=False, description=textwrap.dedent( """ Use '/' as project separator instead the default ':' and create corresponding subdirs. If enabled, it takes priority over the ``project_separator`` option. """ ), ) # type: ignore[assignment] project_separator: str = Field( default=":", description=textwrap.dedent( """ Use the specified string to separate projects. """ ), ) # type: ignore[assignment] check_filelist: bool = Field( default=True, description=textwrap.dedent( """ Check for untracked files and removed files before commit. """ ), ) # type: ignore[assignment] do_package_tracking: bool = Field( default=True, description=textwrap.dedent( """ Track packages in parent project's .osc/_packages. """ ), ) # type: ignore[assignment] checkout_rooted: bool = Field( default=False, description=textwrap.dedent( """ Prevent checking out projects inside other projects or packages. """ ), ) # type: ignore[assignment] status_mtime_heuristic: bool = Field( default=False, description=textwrap.dedent( """ Consider a file with a modified mtime as modified. """ ), ) # type: ignore[assignment] linkcontrol: bool = Field( default=False, description=textwrap.dedent( # TODO: explain what linkcontrol does """ """ ), ) # type: ignore[assignment] section_build: str = Field( default="Build options", exclude=True, section=True, ) # type: ignore[assignment] build_repository: str = Field( default="openSUSE_Factory", description=textwrap.dedent( """ The default repository used when the ``repository`` argument is omitted from ``osc build``. """ ), ) # type: ignore[assignment] buildlog_strip_time: bool = Field( default=False, description=textwrap.dedent( """ Strip the build time from the build logs. """ ), ) # type: ignore[assignment] package_cache_dir: str = Field( default="/var/tmp/osbuild-packagecache", description=textwrap.dedent( """ The directory where downloaded packages are stored. Must be writable by you. """ ), ini_key="packagecachedir", ) # type: ignore[assignment] no_verify: bool = Field( default=False, description=textwrap.dedent( """ Disable signature verification of packages used for build. """ ), ) # type: ignore[assignment] builtin_signature_check: bool = Field( default=True, description=textwrap.dedent( """ Use the RPM's built-in package signature verification. """ ), ) # type: ignore[assignment] disable_hdrmd5_check: bool = Field( default=False, description=HostOptions.__fields__["disable_hdrmd5_check"].description, ) # type: ignore[assignment] section_request: str = Field( default="Request options", exclude=True, section=True, ) # type: ignore[assignment] include_request_from_project: bool = Field( default=True, description=textwrap.dedent( """ When querying requests, show also those that originate in the specified projects. """ ), ) # type: ignore[assignment] request_list_days: int = Field( default=0, description=textwrap.dedent( """ Limit the age of requests shown with ``osc req list`` to the given number of days. This is only the default that can be overridden with ``osc request list -D <VALUE>``. Use ``0`` for unlimited. """ ), ) # type: ignore[assignment] check_for_request_on_action: bool = Field( default=True, description=textwrap.dedent( """ Check for pending requests after executing an action (e.g. checkout, update, commit). """ ), ) # type: ignore[assignment] request_show_interactive: bool = Field( default=False, description=textwrap.dedent( """ Show requests in the interactive mode by default. 
""" ), ) # type: ignore[assignment] print_web_links: bool = Field( default=False, description=textwrap.dedent( """ Print links to Web UI that can be directly pasted to a web browser where possible. """ ), ) # type: ignore[assignment] request_show_source_buildstatus: bool = Field( default=False, description=textwrap.dedent( """ Print the buildstatus of the source package. Works only with ``osc request show`` and the interactive review. """ ), ) # type: ignore[assignment] submitrequest_accepted_template: Optional[str] = Field( default=None, description=textwrap.dedent( """ Template message for accepting a request. Supported substitutions: ``%(reqid)s``, ``%(type)s``, ``%(who)s``, ``%(src_project)s``, ``%(src_package)s``, ``%(src_rev)s``, ``%(tgt_project)s``, ``%(tgt_package)s`` Example:: Hi %(who)s, your request %(reqid)s (type: %(type)s) for %(tgt_project)s/%(tgt_package)s has been accepted. Thank you for your contribution. """ ), ) # type: ignore[assignment] submitrequest_declined_template: Optional[str] = Field( default=None, description=textwrap.dedent( """ Template message for declining a request. Supported substitutions: ``%(reqid)s``, ``%(type)s``, ``%(who)s``, ``%(src_project)s``, ``%(src_package)s``, ``%(src_rev)s``, ``%(tgt_project)s``, ``%(tgt_package)s`` Example:: Hi %(who)s, your request %(reqid)s (type: %(type)s) for %(tgt_project)s/%(tgt_package)s has been declined because ... """ ), ) # type: ignore[assignment] request_show_review: bool = Field( default=False, description=textwrap.dedent( """ Review requests interactively. """ ), ) # type: ignore[assignment] review_inherit_group: bool = Field( default=False, description=textwrap.dedent( """ If a review was accepted in interactive mode and a group was specified, the review will be accepted for this group. """ ), ) # type: ignore[assignment] submitrequest_on_accept_action: Optional[str] = Field( default=None, description=textwrap.dedent( """ What to do with the source package if the request has been accepted. If nothing is specified the API default is used. Choices: cleanup, update, noupdate """ ), ) # type: ignore[assignment] # XXX: let's hide attributes from documentation as it is not clear if anyone uses them and should them change from their defaults # section_obs_attributes: str = Field( # default="OBS attributes", # exclude=True, # section=True, # ) # type: ignore[assignment] maintained_attribute: str = Field( default="OBS:Maintained", ) # type: ignore[assignment] maintenance_attribute: str = Field( default="OBS:MaintenanceProject", ) # type: ignore[assignment] maintained_update_project_attribute: str = Field( default="OBS:UpdateProject", ) # type: ignore[assignment] section_build_tool: str = Field( default="Build tool options", exclude=True, section=True, ) # type: ignore[assignment] build_jobs: Optional[int] = Field( default=os.cpu_count, description=textwrap.dedent( """ The number of parallel processes during the build. Defaults to the number of available CPU threads. If the value is greater than ``0`` then it is passed as ``--jobs`` to the build tool. 
""" ), ini_key="build-jobs", ) # type: ignore[assignment] vm_type: Optional[str] = Field( default=None, description=textwrap.dedent( """ Type of the build environment passed the build tool as the ``--vm-type`` option: - <empty> chroot build - kvm KVM VM build (rootless, needs build-device, build-swap, build-memory) - xen XEN VM build (needs build-device, build-swap, build-memory) - qemu [EXPERIMENTAL] QEMU VM build - lxc [EXPERIMENTAL] LXC build - uml - zvm - openstack - ec2 - docker - podman (rootless) - pvm - nspawn See ``build --help`` for more details about supported options. """ ), ini_key="build-type", ) # type: ignore[assignment] build_memory: Optional[int] = Field( default=None, description=textwrap.dedent( """ The amount of RAM (in MiB) assigned to a build VM. """ ), ini_key="build-memory", ) # type: ignore[assignment] build_root: str = Field( default="/var/tmp/build-root%(dash_user)s/%(repo)s-%(arch)s", description=textwrap.dedent( """ Path to the build root directory. Supported substitutions: ``%(repo)s``, ``%(arch)s``, ``%(project)s``, ``%(package)s``, ``%(apihost)s``, ``%(user)s``, ``%(dash_user)s`` where:: - ``apihost`` is the hostname extracted from the currently used ``apiurl``. - ``dash_user`` is the username prefixed with a dash. If ``user`` is empty, ``dash_user`` is also empty. NOTE: The configuration holds the original unexpanded string. Call ``osc.build.get_build_root()`` with proper arguments to retrieve an actual path. Passed as ``--root <VALUE>`` to the build tool. """ ), ini_key="build-root", ) # type: ignore[assignment] build_shell_after_fail: bool = Field( default=False, description=textwrap.dedent( """ Start a shell prompt in the build environment if a build fails. Passed as ``--shell-after-fail`` to the build tool. """ ), ini_key="build-shell-after-fail", ) # type: ignore[assignment] build_uid: Optional[str] = Field( default=None, description=textwrap.dedent( """ Numeric uid:gid to use for the abuild user. Neither of the values should be 0. This is useful if you are hacking in the buildroot. This must be set to the same value if the buildroot is re-used. Passed as ``--uid <VALUE>`` to the build tool. """ ), ini_key="build-uid", ) # type: ignore[assignment] build_vm_kernel: Optional[str] = Field( default=None, description=textwrap.dedent( """ The kernel used in a VM build. """ ), ini_key="build-kernel", ) # type: ignore[assignment] build_vm_initrd: Optional[str] = Field( default=None, description=textwrap.dedent( """ The initrd used in a VM build. """ ), ini_key="build-initrd", ) # type: ignore[assignment] build_vm_disk: Optional[str] = Field( default=None, description=textwrap.dedent( """ The disk image used as rootfs in a VM build. Passed as ``--vm-disk <VALUE>`` to the build tool. """ ), ini_key="build-device", ) # type: ignore[assignment] build_vm_disk_filesystem: Optional[str] = Field( default=None, description=textwrap.dedent( """ The file system type of the disk image used as rootfs in a VM build. Supported values: ext3 (default), ext4, xfs, reiserfs, btrfs. Passed as ``--vm-disk-filesystem <VALUE>`` to the build tool. """ ), ini_key="build-vmdisk-filesystem", ) # type: ignore[assignment] build_vm_disk_size: Optional[int] = Field( default=None, description=textwrap.dedent( """ The size of the disk image (in MiB) used as rootfs in a VM build. Passed as ``--vm-disk-size`` to the build tool. 
""" ), ini_key="build-vmdisk-rootsize", ) # type: ignore[assignment] build_vm_swap: Optional[str] = Field( default=None, description=textwrap.dedent( """ Path to the disk image used as a swap for VM builds. Passed as ``--swap`` to the build tool. """ ), ini_key="build-swap", ) # type: ignore[assignment] build_vm_swap_size: Optional[int] = Field( default=None, description=textwrap.dedent( """ The size of the disk image (in MiB) used as swap in a VM build. Passed as ``--vm-swap-size`` to the build tool. """ ), ini_key="build-vmdisk-swapsize", ) # type: ignore[assignment] build_vm_user: Optional[str] = Field( default=None, description=textwrap.dedent( """ The username of a user used to run QEMU/KVM process. """ ), ini_key="build-vm-user", ) # type: ignore[assignment] icecream: int = Field( default=0, description=textwrap.dedent( """ Use Icecream distributed compiler. The value represents the number of parallel build jobs. Passed as ``--icecream <VALUE>`` to the build tool. """ ), ) # type: ignore[assignment] ccache: bool = Field( default=False, description=textwrap.dedent( """ Enable compiler cache (ccache) in build roots. Passed as ``--ccache`` to the build tool. """ ), ) # type: ignore[assignment] sccache: bool = Field( default=False, description=textwrap.dedent( """ Enable shared compilation cache (sccache) in build roots. Conflicts with ``ccache``. Passed as ``--sccache`` to the build tool. """ ), ) # type: ignore[assignment] sccache_uri: Optional[str] = Field( default=None, description=textwrap.dedent( """ Optional URI for sccache storage. Supported URIs depend on the sccache configuration. The URI allows the following substitutions: - ``{pkgname}``: name of the package to be build Examples: - file:///var/tmp/osbuild-sccache-{pkgname}.tar.lzop - file:///var/tmp/osbuild-sccache-{pkgname}.tar - redis://127.0.0.1:6379 Passed as ``--sccache-uri <VALUE>`` to the build tool. """ ), ) # type: ignore[assignment] no_preinstallimage: bool = Field( default=False, description=textwrap.dedent( """ Do not use preinstall images to initialize build roots. """ ), ) # type: ignore[assignment] extra_pkgs: List[str] = Field( default=[], description=textwrap.dedent( """ Extra packages to install into the build root when building packages locally with ``osc build``. This corresponds to ``osc build -x pkg1 -x pkg2 ...``. The configured values can be overriden from the command-line with ``-x ''``. This global setting may leads to dependency problems when the base distro is not providing the package. Therefore using server-side ``cli_debug_packages`` option instead is recommended. Passed as ``--extra-packs <VALUE>`` to the build tool. """ ), ini_key="extra-pkgs", ) # type: ignore[assignment] section_programs: str = Field( default="Paths to programs", exclude=True, section=True, ) # type: ignore[assignment] build_cmd: str = Field( default= shutil.which("build", path="/usr/bin:/usr/lib/build:/usr/lib/obs-build") or shutil.which("obs-build", path="/usr/bin:/usr/lib/build:/usr/lib/obs-build") or "/usr/bin/build", description=textwrap.dedent( """ Path to the 'build' tool. """ ), ini_key="build-cmd", ) # type: ignore[assignment] download_assets_cmd: str = Field( default= shutil.which("download_assets", path="/usr/lib/build:/usr/lib/obs-build") or "/usr/lib/build/download_assets", description=textwrap.dedent( """ Path to the 'download_assets' tool used for downloading assets in SCM/Git based builds. 
""" ), ini_key="download-assets-cmd", ) # type: ignore[assignment] obs_scm_bridge_cmd: str = Field( default= shutil.which("obs_scm_bridge", path="/usr/lib/obs/service") or "/usr/lib/obs/service/obs_scm_bridge", description=textwrap.dedent( """ Path to the 'obs_scm_bridge' tool used for cloning scmsync projects and packages. """ ), ini_key="obs-scm-bridge-cmd", ) # type: ignore[assignment] vc_cmd: str = Field( default=shutil.which("vc", path="/usr/lib/build:/usr/lib/obs-build") or "/usr/lib/build/vc", description=textwrap.dedent( """ Path to the 'vc' tool. """ ), ini_key="vc-cmd", ) # type: ignore[assignment] su_wrapper: str = Field( default="sudo", description=textwrap.dedent( """ The wrapper to call build tool as root (sudo, su -, ...). If empty, the build tool runs under the current user wich works only with KVM at this moment. """ ), ini_key="su-wrapper", ) # type: ignore[assignment] # Generate rst from a model. Use it to generate man page in sphinx. # This IS NOT a public API. def _model_to_rst(cls, title=None, description=None, sections=None, output_file=None): def header(text, char="-"): result = f"{text}\n" result += f"{'':{char}^{len(text)}}" return result def bold(text): text = text.replace(r"*", r"\*") return f"**{text}**" def italic(text): text = text.replace(r"*", r"\*") return f"*{text}*" def get_type(name, field): ini_type = field.extra.get("ini_type", None) if ini_type: return ini_type if field.origin_type.__name__ == "list": return "space-separated-list" return field.origin_type.__name__ def get_default(name, field): if field.default is None: return None if field.default_is_lazy: # lazy default may return different results under different circumstances -> return nothing return None ini_type = field.extra.get("ini_type", None) if ini_type: return None if isinstance(field.default, FromParent): return None origin_type = field.origin_type if origin_type == bool: return str(int(field.default)) if origin_type == int: return str(field.default) if origin_type == list: if not field.default: return None default_str = " ".join(field.default) return f'"{default_str}"' if origin_type == str: return f'"{field.default}"' # TODO: raise Exception(f"{name} {field}, {origin_type}") result = [] if title: result.append(header(title, char="=")) result.append("") if description: result.append(description) result.append("") for name, field in cls.__fields__.items(): extra = field.extra is_section_header = extra.get("section", False) if is_section_header: result.append(header(field.default)) result.append("") continue exclude = extra.get("ini_exclude", False) or field.exclude exclude |= field.description is None if exclude: continue ini_key = extra.get("ini_key", name) x = f"{bold(ini_key)} : {get_type(name, field)}" default = get_default(name, field) if default: x += f" = {italic(default)}" result.append(x) result.append("") desc = extra.get("ini_description", None) or field.description or "" for line in desc.splitlines(): result.append(f" {line}") result.append("") sections = sections or {} for section_name, section_class in sections.items(): result.append(header(section_name)) result.append(_model_to_rst(section_class)) if output_file: with open(output_file, "w", encoding="utf-8") as f: f.write("\n".join(result)) return "\n".join(result) # being global to this module, this object can be accessed from outside # it will hold the parsed configuration config = Options() general_opts = [field.extra.get("ini_key", field.name) for field in Options.__fields__.values() if not field.exclude] 
api_host_options = [field.extra.get("ini_key", field.name) for field in HostOptions.__fields__.values() if not field.exclude] # HACK: Proxy object that modifies field defaults in the Options class; needed for compatibility with the old DEFAULTS dict; prevents breaking osc-plugin-collab # This IS NOT a public API. class Defaults: def _get_field(self, name): if hasattr(Options, name): return getattr(Options, name) for i in dir(Options): field = getattr(Options, i) if field.extra.get("ini_key", None) == name: return field return None def __getitem__(self, name): field = self._get_field(name) result = field.default if field.type is List[str]: # return list as a string so we can append another string to it return ", ".join(result) return result def __setitem__(self, name, value): obj = Options() obj.set_value_from_string(name, value) field = self._get_field(name) field.default = obj[name] DEFAULTS = Defaults() new_conf_template = """ # see oscrc(5) man page for the full list of available options [general] # Default URL to the API server. # Credentials and other `apiurl` specific settings must be configured in a `[$apiurl]` config section. apiurl=%(apiurl)s [%(apiurl)s] # aliases= # user= # pass= # credentials_mgr_class=osc.credentials... """ account_not_configured_text = """ Your user account / password are not configured yet. You will be asked for them below, and they will be stored in %s for future use. """ config_incomplete_text = """ Your configuration file %s is not complete. Make sure that it has a [general] section. (You can copy&paste the below. Some commented defaults are shown.) """ config_missing_apiurl_text = """ The apiurl \'%s\' does not exist in the config file. Please enter your credentials for this apiurl. """ def sanitize_apiurl(apiurl): """ Sanitize apiurl: - add https:// schema if apiurl contains none - strip trailing slashes """ return urljoin(*parse_apisrv_url(None, apiurl)) def parse_apisrv_url(scheme, apisrv): if apisrv.startswith('http://') or apisrv.startswith('https://'): url = apisrv elif scheme is not None: url = scheme + apisrv else: url = f"https://{apisrv}" scheme, url, path = urlsplit(url)[0:3] return scheme, url, path.rstrip('/') def urljoin(scheme, apisrv, path=''): return f"{scheme}://{apisrv}" + path def is_known_apiurl(url): """returns ``True`` if url is a known apiurl""" apiurl = sanitize_apiurl(url) return apiurl in config['api_host_options'] def extract_known_apiurl(url): """ Return longest prefix of given url that is known apiurl, None if there is no known apiurl that is prefix of given url. """ scheme, host, path = parse_apisrv_url(None, url) p = path.split('/') while p: apiurl = urljoin(scheme, host, '/'.join(p)) if apiurl in config['api_host_options']: return apiurl p.pop() return None def get_apiurl_api_host_options(apiurl): """ Returns all apihost specific options for the given apiurl, ``None`` if no such specific options exist. """ # FIXME: in A Better World (tm) there was a config object which # knows this instead of having to extract it from a url where it # had been mingled into before. But this works fine for now. apiurl = sanitize_apiurl(apiurl) if is_known_apiurl(apiurl): return config['api_host_options'][apiurl] raise oscerr.ConfigMissingApiurl(f'missing credentials for apiurl: \'{apiurl}\'', '', apiurl) def get_apiurl_usr(apiurl): """ returns the user for this host - if this host does not exist in the internal api_host_options the default user is returned. 
""" # FIXME: maybe there should be defaults not just for the user but # for all apihost specific options. The ConfigParser class # actually even does this but for some reason we don't use it # (yet?). try: return get_apiurl_api_host_options(apiurl)['user'] except KeyError: print('no specific section found in config file for host of [\'%s\'] - using default user: \'%s\'' % (apiurl, config['user']), file=sys.stderr) return config['user'] def get_configParser(conffile=None, force_read=False): """ Returns an ConfigParser() object. After its first invocation the ConfigParser object is stored in a method attribute and this attribute is returned unless you pass force_read=True. """ if not conffile: conffile = identify_conf() conffile = os.path.expanduser(conffile) if 'conffile' not in get_configParser.__dict__: get_configParser.conffile = conffile if force_read or 'cp' not in get_configParser.__dict__ or conffile != get_configParser.conffile: get_configParser.cp = OscConfigParser.OscConfigParser() get_configParser.cp.read(conffile) get_configParser.conffile = conffile return get_configParser.cp def write_config(fname, cp): """write new configfile in a safe way""" # config file is behind a symlink # resolve the symlink and continue writing the config as usual if os.path.islink(fname): fname = os.path.realpath(fname) if os.path.exists(fname) and not os.path.isfile(fname): # only write to a regular file return # create directories to the config file (if they don't exist already) fdir = os.path.dirname(fname) if fdir: try: os.makedirs(fdir, mode=0o700) except OSError as e: if e.errno != errno.EEXIST: raise with open(f"{fname}.new", 'w') as f: cp.write(f, comments=True) try: os.rename(f"{fname}.new", fname) os.chmod(fname, 0o600) except: if os.path.exists(f"{fname}.new"): os.unlink(f"{fname}.new") raise def config_set_option(section, opt, val=None, delete=False, update=True, creds_mgr_descr=None, **kwargs): """ Sets a config option. If val is not specified the current/default value is returned. If val is specified, opt is set to val and the new value is returned. If an option was modified get_config is called with ``**kwargs`` unless update is set to ``False`` (``override_conffile`` defaults to ``config['conffile']``). If val is not specified and delete is ``True`` then the option is removed from the config/reset to the default value. 
""" cp = get_configParser(config['conffile']) if section != 'general': section = config.apiurl_aliases.get(section, section) scheme, host, path = \ parse_apisrv_url(config.get('scheme', 'https'), section) section = urljoin(scheme, host, path) sections = {} for url in cp.sections(): if url == 'general': sections[url] = url else: scheme, host, path = \ parse_apisrv_url(config.get('scheme', 'https'), url) apiurl = urljoin(scheme, host, path) sections[apiurl] = url section = sections.get(section.rstrip('/'), section) if section not in cp.sections(): raise oscerr.ConfigError(f'unknown section \'{section}\'', config['conffile']) if section == 'general' and opt not in general_opts or \ section != 'general' and opt not in api_host_options: raise oscerr.ConfigError(f'unknown config option \'{opt}\'', config['conffile']) if not val and not delete and opt == 'pass' and creds_mgr_descr is not None: # change password store creds_mgr = _get_credentials_manager(section, cp) user = _extract_user_compat(cp, section, creds_mgr) val = creds_mgr.get_password(section, user, defer=False) run = False if val: if opt == 'pass': creds_mgr = _get_credentials_manager(section, cp) user = _extract_user_compat(cp, section, creds_mgr) old_pw = creds_mgr.get_password(section, user, defer=False) try: creds_mgr.delete_password(section, user) if creds_mgr_descr: creds_mgr_new = creds_mgr_descr.create(cp) else: creds_mgr_new = creds_mgr creds_mgr_new.set_password(section, user, val) write_config(config['conffile'], cp) opt = credentials.AbstractCredentialsManager.config_entry old_pw = None finally: if old_pw is not None: creds_mgr.set_password(section, user, old_pw) # not nice, but needed if the Credentials Manager will change # something in cp write_config(config['conffile'], cp) else: cp.set(section, opt, val) write_config(config['conffile'], cp) run = True elif delete and (cp.has_option(section, opt) or opt == 'pass'): if opt == 'pass': creds_mgr = _get_credentials_manager(section, cp) user = _extract_user_compat(cp, section, creds_mgr) creds_mgr.delete_password(section, user) else: cp.remove_option(section, opt) write_config(config['conffile'], cp) run = True if run and update: kw = { 'override_conffile': config['conffile'], 'override_no_keyring': config['use_keyring'], } kw.update(kwargs) get_config(**kw) if cp.has_option(section, opt): return (opt, cp.get(section, opt, raw=True)) return (opt, None) def _extract_user_compat(cp, section, creds_mgr): """ This extracts the user either from the ConfigParser or the creds_mgr. Only needed for deprecated Gnome Keyring """ user = cp.get(section, 'user') if user is None and hasattr(creds_mgr, 'get_user'): user = creds_mgr.get_user(section) return user def write_initial_config(conffile, entries, custom_template='', creds_mgr_descriptor=None): """ write osc's intial configuration file. entries is a dict which contains values for the config file (e.g. { 'user' : 'username', 'pass' : 'password' } ). custom_template is an optional configuration template. 
""" conf_template = custom_template or new_conf_template config = globals()["config"].dict() config.update(entries) sio = StringIO(conf_template.strip() % config) cp = OscConfigParser.OscConfigParser() cp.read_file(sio) cp.set(config['apiurl'], 'user', config['user']) if creds_mgr_descriptor: creds_mgr = creds_mgr_descriptor.create(cp) else: creds_mgr = _get_credentials_manager(config['apiurl'], cp) creds_mgr.set_password(config['apiurl'], config['user'], config['pass']) write_config(conffile, cp) def add_section(filename, url, user, passwd, creds_mgr_descriptor=None, allow_http=None): """ Add a section to config file for new api url. """ global config cp = get_configParser(filename) try: cp.add_section(url) except OscConfigParser.configparser.DuplicateSectionError: # Section might have existed, but was empty pass cp.set(url, 'user', user) if creds_mgr_descriptor: creds_mgr = creds_mgr_descriptor.create(cp) else: creds_mgr = _get_credentials_manager(url, cp) creds_mgr.set_password(url, user, passwd) if allow_http: cp.set(url, 'allow_http', "1") write_config(filename, cp) def _get_credentials_manager(url, cp): if cp.has_option(url, credentials.AbstractCredentialsManager.config_entry): creds_mgr = credentials.create_credentials_manager(url, cp) if creds_mgr is None: msg = f'Unable to instantiate creds mgr (section: {url})' conffile = get_configParser.conffile raise oscerr.ConfigMissingCredentialsError(msg, conffile, url) return creds_mgr if config['use_keyring'] and GENERIC_KEYRING: return credentials.get_keyring_credentials_manager(cp) elif cp.get(url, "passx", fallback=None) is not None: return credentials.ObfuscatedConfigFileCredentialsManager(cp, None) return credentials.PlaintextConfigFileCredentialsManager(cp, None) def get_config(override_conffile=None, override_apiurl=None, override_debug=None, override_http_debug=None, override_http_full_debug=None, override_traceback=None, override_post_mortem=None, override_quiet=None, override_no_keyring=None, override_verbose=None, overrides=None ): """ Configure osc. The configuration options are loaded with the following priority: 1. environment variables: ``OSC_<uppercase_option>`` or ``OSC_<uppercase_host_alias>_<uppercase_host_option>`` 2. override arguments provided to ``get_config()`` 3. oscrc config file """ if overrides: overrides = overrides.copy() else: overrides = {} if override_apiurl is not None: overrides["apiurl"] = override_apiurl if override_debug is not None: overrides["debug"] = override_debug if override_http_debug is not None: overrides["http_debug"] = override_http_debug if override_http_full_debug is not None: overrides["http_full_debug"] = override_http_full_debug if override_traceback is not None: overrides["traceback"] = override_traceback if override_post_mortem is not None: overrides["post_mortem"] = override_post_mortem if override_no_keyring is not None: overrides["use_keyring"] = not override_no_keyring if override_quiet is not None: overrides["quiet"] = override_quiet if override_verbose is not None: overrides["verbose"] = override_verbose if override_conffile is not None: conffile = override_conffile else: conffile = identify_conf() if conffile in ["", "/dev/null"]: cp = OscConfigParser.OscConfigParser() cp.add_section("general") else: conffile = os.path.expanduser(conffile) if not os.path.exists(conffile): raise oscerr.NoConfigfile(conffile, account_not_configured_text % conffile) cp = get_configParser(conffile) if not cp.has_section("general"): # FIXME: it might be sufficient to just assume defaults? 
msg = config_incomplete_text % conffile defaults = Options().dict() msg += new_conf_template % defaults raise oscerr.ConfigError(msg, conffile) has_password = False for section in cp.sections(): keys = ["pass", "passx"] for key in keys: value = cp.get(section, key, fallback="").strip() if value: has_password = True break # make sure oscrc is not world readable, it may contain a password conffile_stat = os.stat(conffile) # applying 0o7777 mask because we want to ignore the file type bits if conffile_stat.st_mode & 0o7777 != 0o600: try: os.chmod(conffile, 0o600) except OSError as e: if e.errno in (errno.EROFS, errno.EPERM): if has_password: print(f"Warning: Configuration file '{conffile}' may have insecure file permissions.", file=sys.stderr) else: raise e global config config = Options() config.conffile = conffile # read 'debug' value before it gets properly stored into Options for early debug messages if override_debug: debug_str = str(override_debug) elif "OSC_DEBUG" in os.environ: debug_str = os.environ["OSC_DEBUG"] elif "debug" in cp["general"]: debug_str = cp["general"]["debug"] else: debug_str = "0" debug = True if debug_str.strip().lower() in ("1", "yes", "true", "on") else False # read host options first in order to populate apiurl aliases urls = [i for i in cp.sections() if i != "general"] for url in urls: apiurl = sanitize_apiurl(url) # the username will be overwritten later while reading actual config values username = cp[url].get("user", "") host_options = HostOptions(apiurl=apiurl, username=username, _parent=config) known_ini_keys = set() for name, field in host_options.__fields__.items(): # the following code relies on interating through fields in a given order: aliases, username, credentials_mgr_class, password ini_key = field.extra.get("ini_key", name) known_ini_keys.add(ini_key) known_ini_keys.add(name) # iterate through aliases and store the value of the the first env that matches OSC_HOST_{ALIAS}_{NAME} env_value = None for alias in host_options.aliases: alias = alias.replace("-", "_") env_key = f"OSC_HOST_{alias.upper()}_{name.upper()}" env_value = os.environ.get(env_key, None) if env_value is not None: break if env_value is not None: value = env_value elif ini_key in cp[url]: value = cp[url][ini_key] else: value = None if name == "credentials_mgr_class": # HACK: inject credentials_mgr_class back in case we have specified it from env to have it available for reading password if value: cp[url][credentials.AbstractCredentialsManager.config_entry] = value elif name == "password": creds_mgr = _get_credentials_manager(url, cp) if env_value is None: value = creds_mgr.get_password(url, host_options.username, defer=True, apiurl=host_options.apiurl) if value is not None: host_options.set_value_from_string(name, value) for key, value in cp[url].items(): if key.startswith("_"): continue if key in known_ini_keys: continue if debug: print(f"DEBUG: Config option '[{url}]/{key}' doesn't map to any HostOptions field", file=sys.stderr) host_options[key] = value scheme = urlsplit(apiurl)[0] if scheme == "http" and not host_options.allow_http: msg = "The apiurl '{apiurl}' uses HTTP protocol without any encryption.\n" msg += "All communication incl. 
sending your password IS NOT ENCRYPTED!\n" msg += "Add 'allow_http=1' to the [{apiurl}] config file section to mute this message.\n" print(msg.format(apiurl=apiurl), file=sys.stderr) config.api_host_options[apiurl] = host_options # read the main options known_ini_keys = set() for name, field in config.__fields__.items(): ini_key = field.extra.get("ini_key", name) known_ini_keys.add(ini_key) known_ini_keys.add(name) env_key = f"OSC_{name.upper()}" # priority: env, overrides, config if env_key in os.environ: value = os.environ[env_key] # remove any matching records from overrides because they are checked for emptiness later overrides.pop(name, None) overrides.pop(ini_key, None) elif name in overrides: value = overrides.pop(name) elif ini_key in overrides: value = overrides.pop(ini_key) elif ini_key in cp["general"]: value = cp["general"][ini_key] else: continue if name == "apiurl": # resolve an apiurl alias to an actual apiurl apiurl = config.apiurl_aliases.get(value, None) if not apiurl: # no alias matched, try again with a sanitized apiurl (with https:// prefix) # and if there's no match again, just use the sanitized apiurl apiurl = sanitize_apiurl(value) apiurl = config.apiurl_aliases.get(apiurl, apiurl) value = apiurl config.set_value_from_string(name, value) # BEGIN: override credentials for the default apiurl # OSC_APIURL is handled already because it's a regular field env_username = os.environ.get("OSC_USERNAME", "") env_credentials_mgr_class = os.environ.get("OSC_CREDENTIALS_MGR_CLASS", None) env_password = os.environ.get("OSC_PASSWORD", None) if config.apiurl not in config.api_host_options: host_options = HostOptions(apiurl=config.apiurl, username=env_username, _parent=config) config.api_host_options[config.apiurl] = host_options # HACK: inject section so we can add credentials_mgr_class later cp.add_section(config.apiurl) host_options = config.api_host_options[config.apiurl] if env_username: host_options.set_value_from_string("username", env_username) if env_credentials_mgr_class: host_options.set_value_from_string("credentials_mgr_class", env_credentials_mgr_class) # HACK: inject credentials_mgr_class in case we have specified it from env to have it available for reading password cp[config.apiurl]["credentials_mgr_class"] = env_credentials_mgr_class if env_password: password = Password(env_password) host_options.password = password elif env_credentials_mgr_class: creds_mgr = _get_credentials_manager(config.apiurl, cp) password = creds_mgr.get_password(config.apiurl, host_options.username, defer=True, apiurl=host_options.apiurl) host_options.password = password # END: override credentials for the default apiurl for apiurl, host_options in config.api_host_options.items(): if not host_options.username: raise oscerr.ConfigMissingCredentialsError(f"No user configured for apiurl {apiurl}", conffile, apiurl) if host_options.password is None: raise oscerr.ConfigMissingCredentialsError(f"No password configured for apiurl {apiurl}", conffile, apiurl) for key, value in cp["general"].items(): if key.startswith("_"): continue if key in known_ini_keys: continue if debug: print(f"DEBUG: Config option '[general]/{key}' doesn't map to any Options field", file=sys.stderr) config[key] = value if overrides: unused_overrides_str = ", ".join((f"'{i}'" for i in overrides)) raise oscerr.ConfigError(f"Unknown config options: {unused_overrides_str}", "<command-line>") # XXX unless config['user'] goes away (and is replaced with a handy function, or # config becomes an object, even better), set the global 
'user' here as well, # provided that there _are_ credentials for the chosen apiurl: try: config['user'] = get_apiurl_usr(config['apiurl']) except oscerr.ConfigMissingApiurl as e: e.msg = config_missing_apiurl_text % config['apiurl'] e.file = conffile raise e # enable connection debugging after all config options are set from .connection import enable_http_debug enable_http_debug(config) def identify_conf(): # needed for compat reasons(users may have their oscrc still in ~ if 'OSC_CONFIG' in os.environ: return os.environ.get('OSC_CONFIG') conffile = os.path.join(xdg.XDG_CONFIG_HOME, "osc", "oscrc") if os.path.exists(os.path.expanduser("~/.oscrc")) or os.path.islink(os.path.expanduser("~/.oscrc")): if "XDG_CONFIG_HOME" in os.environ: print(f"{tty.colorize('WARNING', 'yellow,bold')}: Ignoring XDG_CONFIG_HOME env, loading an existing config from '~/.oscrc' instead", file=sys.stderr) print(" To fix this, move the existing '~/.oscrc' to XDG location such as '~/.config/osc/oscrc'", file=sys.stderr) elif os.path.exists(os.path.expanduser(conffile)): print(f"{tty.colorize('WARNING', 'yellow,bold')}: Ignoring config '{conffile}' in XDG location, loading an existing config from ~/.oscrc instead", file=sys.stderr) print(" To fix this, remove '~/.oscrc'", file=sys.stderr) return '~/.oscrc' return conffile def interactive_config_setup(conffile, apiurl, initial=True): if not apiurl: apiurl = Options()["apiurl"] scheme = urlsplit(apiurl)[0] http = scheme == "http" if http: msg = "The apiurl '{apiurl}' uses HTTP protocol without any encryption.\n" msg += "All communication incl. sending your password WILL NOT BE ENCRYPTED!\n" msg += "Do you really want to continue with no encryption?\n" print(msg.format(apiurl=apiurl), file=sys.stderr) yes = raw_input("Type 'YES' to continue: ") if yes != "YES": raise oscerr.UserAbort() print() apiurl_no_scheme = urlsplit(apiurl)[1] or apiurl user_prompt = f"Username [{apiurl_no_scheme}]: " user = raw_input(user_prompt) pass_prompt = f"Password [{user}@{apiurl_no_scheme}]: " passwd = getpass.getpass(pass_prompt) creds_mgr_descr = select_credentials_manager_descr() if initial: config = {'user': user, 'pass': passwd} if apiurl: config['apiurl'] = apiurl if http: config['allow_http'] = 1 write_initial_config(conffile, config, creds_mgr_descriptor=creds_mgr_descr) else: add_section(conffile, apiurl, user, passwd, creds_mgr_descriptor=creds_mgr_descr, allow_http=http) def select_credentials_manager_descr(): if not credentials.has_keyring_support(): print('To use keyrings please install python%d-keyring.' 
% sys.version_info.major) creds_mgr_descriptors = credentials.get_credentials_manager_descriptors() rows = [] for i, creds_mgr_descr in enumerate(creds_mgr_descriptors, 1): rows += [str(i), creds_mgr_descr.name(), creds_mgr_descr.description()] from .core import build_table headline = ('NUM', 'NAME', 'DESCRIPTION') table = build_table(len(headline), rows, headline) print() for row in table: print(row) i = raw_input('Select credentials manager [default=1]: ') if not i: i = "1" if not i.isdigit(): sys.exit('Invalid selection') i = int(i) - 1 if i < 0 or i >= len(creds_mgr_descriptors): sys.exit('Invalid selection') return creds_mgr_descriptors[i] # vim: sw=4 et ��������������������������������������������������������������������������������osc-1.12.1/osc/connection.py������������������������������������������������������������������������0000664�0000000�0000000�00000070072�14753375025�0015676�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������import base64 import fcntl import inspect import os import re import shutil import subprocess import ssl import sys import tempfile import time import warnings import http.client import http.cookiejar import urllib.parse import urllib.request import urllib3.exceptions import urllib3.poolmanager import urllib3.response import urllib3.util from . import __version__ from . import conf from . import oscerr from . import oscssl from . import output from .util.helper import decode_it # print only the first occurrence of matching warnings, regardless of location warnings.filterwarnings("once", category=urllib3.exceptions.InsecureRequestWarning) class MockRequest: """ Mock a request object for `cookiejar.extract_cookies()` and `cookiejar.add_cookie_header()`. """ def __init__(self, url, headers): self.url = url self.headers = headers self.unverifiable = False self.type = "https" def get_full_url(self): return self.url def get_header(self, header_name, default=None): return self.headers.get(header_name, default) def has_header(self, header_name): return header_name in self.headers def add_unredirected_header(self, key, val): # modifies the `headers` variable that was passed to object's constructor self.headers[key] = val def enable_http_debug(config): if not int(config["http_debug"]) and not int(config["http_full_debug"]): http.client.print = lambda *args, **kwargs: None return # HACK: override HTTPResponse's init to increase debug level old_HTTPResponse__init__ = http.client.HTTPResponse.__init__ def new_HTTPResponse__init__(self, *args, **kwargs): old_HTTPResponse__init__(self, *args, **kwargs) self.debuglevel = 1 http.client.HTTPResponse.__init__ = new_HTTPResponse__init__ # increase HTTPConnection debug level http.client.HTTPConnection.debuglevel = 1 # HACK: because HTTPResponse's debug data uses print(), # let's inject custom print() function to that module def new_print(*args, file=None): if not int(config["http_full_debug"]) and args: # hide private data (authorization and cookies) when full debug is not enabled if args[:2] == ("header:", "Set-Cookie:"): return if args[0] == "send:": args = list(args) # (?<=...) - '...' 
must be present before the pattern (positive lookbehind assertion) args[1] = re.sub(r"(?<=\\r\\n)authorization:.*?\\r\\n", "", args[1], re.I) args[1] = re.sub(r"(?<=\\r\\n)Cookie:.*?\\r\\n", "", args[1], re.I) print("DEBUG:", *args, file=sys.stderr) http.client.print = new_print def get_proxy_manager(env): proxy_url = os.environ.get(env, None) if not proxy_url: return proxy_purl = urllib3.util.parse_url(proxy_url) # rebuild proxy url in order to remove auth because ProxyManager would fail on it if proxy_purl.port: proxy_url = f"{proxy_purl.scheme}://{proxy_purl.host}:{proxy_purl.port}" else: proxy_url = f"{proxy_purl.scheme}://{proxy_purl.host}" proxy_headers = urllib3.make_headers( user_agent=f"osc/{__version__}", ) if proxy_purl.auth: proxy_basic_auth = urllib.parse.unquote(proxy_purl.auth) proxy_basic_auth = proxy_basic_auth.encode("utf-8") proxy_basic_auth = base64.b64encode(proxy_basic_auth).decode() proxy_headers["Proxy-Authorization"] = f"Basic {proxy_basic_auth:s}" manager = urllib3.ProxyManager(proxy_url, proxy_headers=proxy_headers) return manager # Instantiate on first use in `http_request()`. # Each `apiurl` requires a differently configured pool # (incl. trusted keys for example). CONNECTION_POOLS = {} # Pool manager for requests outside apiurls. POOL_MANAGER = urllib3.PoolManager() # Proxy manager for HTTP connections. HTTP_PROXY_MANAGER = get_proxy_manager("HTTP_PROXY") # Proxy manager for HTTPS connections. HTTPS_PROXY_MANAGER = get_proxy_manager("HTTPS_PROXY") def http_request_wrap_file(func): """ Turn file path into a file object and close it automatically by using a context manager. """ def new_func(method, url, headers=None, data=None, file=None): if file: with open(file, "rb") as f: return func(method, url, headers, data, file=f) else: return func(method, url, headers, data, file) new_func.__name__ = func.__name__ new_func.__doc__ = func.__doc__ return new_func @http_request_wrap_file def http_request(method: str, url: str, headers=None, data=None, file=None): """ Send a HTTP request to a server. Features: * Authentication ([apiurl]/{user,pass} in oscrc) * Session cookie support (~/.local/state/osc/cookiejar) * SSL certificate verification incl. managing trusted certs * SSL certificate verification bypass (if [apiurl]/sslcertck=0 in oscrc) * Expired SSL certificates are no longer accepted. Either prolong them or set sslcertck=0. * Proxy support (HTTPS_PROXY env, NO_PROXY is respected) * Retries (http_retries in oscrc) * Requests outside apiurl (incl. proxy support) * Connection debugging (-H/--http-debug, --http-full-debug) :param method: HTTP request method (such as GET, POST, PUT, DELETE). :param url: The URL to perform the request on. :param headers: Dictionary of custom headers to send. :param data: Data to send in the request body (conflicts with `file`). :param file: Path to a file to send as data in the request body (conflicts with `data`). 
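    Example (illustrative sketch; assumes an account for the queried apiurl is
    already configured in oscrc and that the server answers ``/about``)::

        from osc import conf
        from osc.connection import http_request

        conf.get_config()
        response = http_request("GET", "https://api.opensuse.org/about")
        print(response.status)                    # e.g. 200
        print(response.read().decode("utf-8"))    # XML body returned by the server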
""" purl = urllib3.util.parse_url(url) apiurl = conf.extract_known_apiurl(url) headers = urllib3.response.HTTPHeaderDict(headers or {}) # identify osc headers.update(urllib3.make_headers(user_agent=f"osc/{__version__}")) if data and file: raise RuntimeError('Specify either `data` or `file`') elif data: if hasattr(data, "encode"): data = data.encode("utf-8") content_length = len(data) elif file: content_length = os.fstat(file.fileno()).st_size data = file else: content_length = 0 if content_length: headers.add("Content-Length", str(content_length)) # handle requests that go outside apiurl # do not set auth cookie or auth credentials if not apiurl: if purl.scheme == "http" and HTTP_PROXY_MANAGER and not urllib.request.proxy_bypass(url): # connection through proxy manager = HTTP_PROXY_MANAGER elif purl.scheme == "https" and HTTPS_PROXY_MANAGER and not urllib.request.proxy_bypass(url): # connection through proxy manager = HTTPS_PROXY_MANAGER else: # direct connection manager = POOL_MANAGER response = manager.urlopen(method, url, body=data, headers=headers, preload_content=False) if response.status / 100 != 2: raise urllib.error.HTTPError(url, response.status, response.reason, response.headers, response) return response options = conf.config["api_host_options"][apiurl] if options.http_headers: new_headers = urllib3.response.HTTPHeaderDict() # user-defined headers from the config file new_headers.update(options.http_headers) # original ``headers`` (Content-Length, User-Agent) must prevail over user-defined headers new_headers.update(headers) headers = new_headers global CONNECTION_POOLS pool = CONNECTION_POOLS.get(apiurl, None) if not pool: pool_kwargs = {} # urllib3.Retry() argument 'method_whitelist' got renamed to 'allowed_methods' sig = inspect.signature(urllib3.Retry) arg_names = list(sig.parameters.keys()) if "allowed_methods" in arg_names: retries_kwargs = {"allowed_methods": None} else: retries_kwargs = {"method_whitelist": None} pool_kwargs["retries"] = urllib3.Retry( total=int(conf.config["http_retries"]), backoff_factor=2, status_forcelist=( 500, # Internal Server Error 502, # Bad Gateway 503, # Service Unavailable 504, # Gateway Timeout ), # don't raise because we want an actual response rather than a MaxRetryError with "too many <status_code> error responses" message raise_on_status=False, **retries_kwargs, ) if purl.scheme == "https": ssl_context = oscssl.create_ssl_context() ssl_context.load_default_certs() pool_kwargs["ssl_context"] = ssl_context # turn cert verification off if sslcertck = 0 if options["cafile"] or options["capath"]: ssl_context.load_verify_locations(cafile=options["cafile"], capath=options["capath"]) # urllib3 v1 pool_kwargs["cert_reqs"] = "CERT_REQUIRED" if options["sslcertck"] else "CERT_NONE" # urllib3 v2 if options["sslcertck"]: ssl_context.check_hostname = True ssl_context.verify_mode = ssl.CERT_REQUIRED else: ssl_context.check_hostname = False ssl_context.verify_mode = ssl.CERT_NONE if purl.scheme == "http" and HTTP_PROXY_MANAGER and not urllib.request.proxy_bypass(url): # connection through HTTP proxy pool = HTTP_PROXY_MANAGER.connection_from_host( host=purl.host, port=purl.port, scheme=purl.scheme, pool_kwargs=pool_kwargs ) HTTP_PROXY_MANAGER.request('GET', url) elif purl.scheme == "https" and HTTPS_PROXY_MANAGER and not urllib.request.proxy_bypass(url): # connection through HTTPS proxy pool = HTTPS_PROXY_MANAGER.connection_from_host( host=purl.host, port=purl.port, scheme=purl.scheme, pool_kwargs=pool_kwargs ) elif purl.scheme == "https": # direct 
connection pool = urllib3.HTTPSConnectionPool(host=purl.host, port=purl.port, **pool_kwargs) else: pool = urllib3.HTTPConnectionPool(host=purl.host, port=purl.port, **pool_kwargs) if purl.scheme == "https": # inject ssl context instance into pool so we can use it later pool.ssl_context = ssl_context # inject trusted cert store instance into pool so we can use it later pool.trusted_cert_store = oscssl.TrustedCertStore(ssl_context, purl.host, purl.port) CONNECTION_POOLS[apiurl] = pool auth_handlers = [ CookieJarAuthHandler(apiurl, os.path.expanduser(conf.config["cookiejar"])), SignatureAuthHandler(apiurl, options["user"], options["sshkey"], options["pass"]), BasicAuthHandler(apiurl, options["user"], options["pass"]), ] for handler in auth_handlers: # authenticate using a cookie (if available) success = handler.set_request_headers(url, headers) if success: break # Rails sends a html response if the header is not set # https://github.com/openSUSE/open-build-service/pull/13019 headers.add("Accept", "application/xml") if method == "PUT" or (method == "POST" and (data or file)): headers.add("Content-Type", "application/xml; charset=utf-8") elif method == "POST": headers.add("Content-Type", "application/x-www-form-urlencoded") if purl.scheme == "http" and HTTP_PROXY_MANAGER: # HTTP proxy requires full URL with 'same host' checking off urlopen_url = url assert_same_host = False else: # everything else is fine with path only # join path and query, ignore the remaining args; args are (scheme, netloc, path, query, fragment) urlopen_url = urllib.parse.urlunsplit(("", "", purl.path, purl.query, "")) assert_same_host = True if int(conf.config['http_debug']): # use the hacked print() for consistency http.client.print(40 * '-') http.client.print(method, url) try: response = pool.urlopen( method, urlopen_url, body=data, headers=headers, preload_content=False, assert_same_host=assert_same_host ) except urllib3.exceptions.MaxRetryError as e: if not isinstance(e.reason, urllib3.exceptions.SSLError): # re-raise exceptions that are not related to SSL raise # ssl.SSLCertVerificationError doesn't exist on python 3.6 # ssl.CertificateError is an alias for ssl.SSLCertVerificationError on python 3.7+ if isinstance(e.reason.args[0], ssl.CertificateError): self_signed_verify_codes = ( oscssl.X509_V_ERR_DEPTH_ZERO_SELF_SIGNED_CERT, oscssl.X509_V_ERR_SELF_SIGNED_CERT_IN_CHAIN, ) if e.reason.args[0].verify_code not in self_signed_verify_codes: # re-raise ssl exceptions that are not related to self-signed certs raise e.reason.args[0] from None else: # re-raise other than ssl exceptions raise e.reason.args[0] from None # get the untrusted certificated from server cert = pool.trusted_cert_store.get_server_certificate() # prompt user if we should trust the certificate pool.trusted_cert_store.prompt_trust(cert, reason=e.reason) if hasattr(data, 'seek'): data.seek(0) response = pool.urlopen( method, urlopen_url, body=data, headers=headers, preload_content=False, assert_same_host=assert_same_host ) if response.status == 401: # session cookie has expired, re-authenticate for handler in auth_handlers: success = handler.set_request_headers_after_401(url, headers, response) if success: break if hasattr(data, 'seek'): data.seek(0) response = pool.urlopen(method, urlopen_url, body=data, headers=headers, preload_content=False) # we want to save a session cookie before an exception is raised on failed requests for handler in auth_handlers: handler.process_response(url, headers, response) if response.status / 100 != 2: raise 
urllib.error.HTTPError(url, response.status, response.reason, response.headers, response) return response # pylint: disable=C0103,C0116 def http_GET(*args, **kwargs): return http_request("GET", *args, **kwargs) # pylint: disable=C0103,C0116 def http_POST(*args, **kwargs): return http_request("POST", *args, **kwargs) # pylint: disable=C0103,C0116 def http_PUT(*args, **kwargs): return http_request("PUT", *args, **kwargs) # pylint: disable=C0103,C0116 def http_DELETE(*args, **kwargs): return http_request("DELETE", *args, **kwargs) class AuthHandlerBase: def __init__(self, apiurl): self.apiurl = apiurl def _get_auth_schemes(self, response): """ Extract all `www-authenticate` headers from `response` and return them in a dictionary: `{scheme: auth_method}`. """ result = {} for auth_method in response.headers.get_all("www-authenticate", []): scheme = auth_method.split()[0].lower() result[scheme] = auth_method return result def set_request_headers(self, url, request_headers): """ Modify request headers with auth headers. :param url: Request URL provides context for `request_headers` modifications :type url: str :param request_headers: object to be modified :type request_headers: urllib3.response.HTTPHeaderDict :return: `True` on if `request_headers` was modified, `False` otherwise """ raise NotImplementedError def set_request_headers_after_401(self, url, request_headers, response): """ Modify request headers with auth headers after getting 401 response. :param url: Request URL provides context for `request_headers` modifications :type url: str :param request_headers: object to be modified :type request_headers: urllib3.response.HTTPHeaderDict :param response: Response object provides context for `request_headers` modifications :type response: urllib3.response.HTTPResponse :return: `True` on if `request_headers` was modified, `False` otherwise """ raise NotImplementedError def process_response(self, url, request_headers, response): """ Retrieve data from response, save cookies, etc. :param url: Request URL provides context for `request_headers` modifications :type url: str :param request_headers: object to be modified :type request_headers: urllib3.response.HTTPHeaderDict :param response: Response object provides context for `request_headers` modifications :type response: urllib3.response.HTTPResponse """ raise NotImplementedError class CookieJarAuthHandler(AuthHandlerBase): # Shared among instances, instantiate on first use, key equals to cookiejar path. COOKIEJARS = {} def __init__(self, apiurl, cookiejar_path): super().__init__(apiurl) self.cookiejar_path = cookiejar_path if self.cookiejar_path in self.COOKIEJARS: self.cookiejar_lock_path = None else: # Cookiejar hasn't been loaded yet, let's lock it to avoid # doing expensive signature auth in multiple processes. # This usually happens when a user runs multiple osc instances # from the command-line in parallel. 
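# ---------------------------------------------------------------------------
# Illustrative sketch (not part of this module): the minimal shape of an auth
# handler as described by AuthHandlerBase above. "StaticTokenAuthHandler" and
# the bearer-token header are made up purely to show the three-method
# interface; osc does not ship such a class.
class StaticTokenAuthHandler(AuthHandlerBase):
    """Answer a 401 challenge with a fixed token."""

    def __init__(self, apiurl, token):
        super().__init__(apiurl)
        self.token = token

    def set_request_headers(self, url, request_headers):
        # nothing to do up front; wait for the 401 challenge
        return False

    def set_request_headers_after_401(self, url, request_headers, response):
        request_headers["Authorization"] = f"Bearer {self.token}"
        return True

    def process_response(self, url, request_headers, response):
        pass
# (end of illustrative sketch)
# ---------------------------------------------------------------------------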
self.cookiejar_lock_path = f"{self.cookiejar_path}.lock" self.cookiejar_lock_fd = None @property def _cookiejar(self): jar = self.COOKIEJARS.get(self.cookiejar_path, None) if not jar: try: os.makedirs(os.path.dirname(self.cookiejar_path), mode=0o700) except FileExistsError: pass jar = http.cookiejar.LWPCookieJar(self.cookiejar_path) if os.path.isfile(self.cookiejar_path): try: jar.load() except http.cookiejar.LoadError: pass self.COOKIEJARS[self.cookiejar_path] = jar return jar def _lock(self): if self.cookiejar_lock_path: try: os.makedirs(os.path.dirname(self.cookiejar_lock_path), mode=0o700) except FileExistsError: pass self.cookiejar_lock_fd = open(self.cookiejar_lock_path, "w") fcntl.flock(self.cookiejar_lock_fd, fcntl.LOCK_EX) def _unlock(self): if self.cookiejar_lock_path: self.cookiejar_lock_path = None fcntl.flock(self.cookiejar_lock_fd, fcntl.LOCK_UN) self.cookiejar_lock_fd.close() def set_request_headers(self, url, request_headers): self._lock() self._cookiejar.add_cookie_header(MockRequest(url, request_headers)) if request_headers.get_all("cookie", None): # we have a valid cookie already -> unlock immediately self._unlock() return True return False def set_request_headers_after_401(self, url, request_headers, response): # can't do anything, we have tried setting a cookie already return False def process_response(self, url, request_headers, response): if response.headers.get_all("set-cookie", None): self._cookiejar.extract_cookies(response, MockRequest(url, response.headers)) self._cookiejar.save() self._unlock() class BasicAuthHandler(AuthHandlerBase): def __init__(self, apiurl, user, password): super().__init__(apiurl) self.user = user self.password = password def set_request_headers(self, url, request_headers): return False def set_request_headers_after_401(self, url, request_headers, response): auth_schemes = self._get_auth_schemes(response) if "basic" not in auth_schemes: return False if not self.user or not self.password: return False basic_auth = f"{self.user:s}:{self.password:s}" basic_auth = basic_auth.encode("utf-8") basic_auth = base64.b64encode(basic_auth).decode() request_headers["Authorization"] = f"Basic {basic_auth:s}" return True def process_response(self, url, request_headers, response): pass class SignatureAuthHandler(AuthHandlerBase): def __init__(self, apiurl, user, sshkey, basic_auth_password=None): super().__init__(apiurl) self.user = user self.sshkey = None self.sshkey_fingerprint = None if sshkey and re.match("^[A-Z0-9]+:.*", sshkey): # if it starts with a prefix such as 'SHA256:' then it's a fingerprint self.sshkey_fingerprint = sshkey else: self.sshkey = sshkey if self.sshkey: # if only a file name is provided, prepend ~/.ssh if "/" not in self.sshkey: self.sshkey = os.path.join("~", ".ssh", self.sshkey) self.sshkey = os.path.expanduser(self.sshkey) output.print_msg(f"Using ssh key file configured in oscrc: {self.sshkey}", print_to="debug") self.ssh_keygen_path = shutil.which("ssh-keygen") self.ssh_add_path = shutil.which("ssh-add") creds_mgr = conf.config["api_host_options"][self.apiurl].get("credentials_mgr_class", None) if creds_mgr == "osc.credentials.TransientCredentialsManager": self.basic_auth_password = False else: # value of `basic_auth_password` is only used as a hint if we should skip signature auth self.basic_auth_password = bool(basic_auth_password) self.temp_pubkey = None def list_ssh_agent_keys(self): if not self.ssh_add_path: return [] cmd = [self.ssh_add_path, '-L'] proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, 
stderr=subprocess.PIPE, encoding="utf-8") stdout, _ = proc.communicate() if proc.returncode == 0 and stdout.strip(): return stdout.strip().splitlines() else: return [] def list_ssh_agent_fingerprints(self): if not self.ssh_add_path: return [] cmd = [self.ssh_add_path, '-l'] proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, encoding="utf-8") stdout, _ = proc.communicate() if proc.returncode == 0 and stdout.strip(): lines = stdout.strip().splitlines() return [i.split(" ")[1] for i in lines] else: return [] def guess_keyfile(self): # `ssh-keygen -Y sign` requires a file with a key which is not available during ssh agent forwarding # that's why we need to list ssh-agent's keys and store the first one into a temp file keys_in_agent = self.list_ssh_agent_keys() if keys_in_agent: selected_key = None # use ssh key from ssh agent by the specified fingerprint if self.sshkey_fingerprint: fingerprints_in_agent = self.list_ssh_agent_fingerprints() try: indx = fingerprints_in_agent.index(self.sshkey_fingerprint) selected_key = keys_in_agent[indx] output.print_msg(f"Using ssh key from ssh agent that matches fingerprint '{self.sshkey_fingerprint}': {selected_key}", print_to="debug") except ValueError: pass # use ssh key from ssh agent by key's comment obs=<hostname> matching the hostname of apiurl if selected_key is None: apiurl_hostname = urllib.parse.urlparse(self.apiurl).hostname for key in keys_in_agent: comments = key.strip().split(" ")[2:] pattern = f"obs={apiurl_hostname}" if pattern in comments: selected_key = key output.print_msg(f"Using ssh key from ssh agent that has comment '{pattern}' which matches apiurl '{self.apiurl}': {selected_key}", print_to="debug") break # use the first ssh key from ssh agent if selected_key is None: selected_key = keys_in_agent[0] output.print_msg(f"Using the first ssh key from ssh agent (see `ssh-add -L`): {selected_key}", print_to="debug") self.temp_pubkey = tempfile.NamedTemporaryFile(mode="w+") self.temp_pubkey.write(keys_in_agent[0]) self.temp_pubkey.flush() return self.temp_pubkey.name sshdir = os.path.expanduser('~/.ssh') keyfiles = ('id_ed25519', 'id_ed25519_sk', 'id_rsa', 'id_ecdsa', 'id_ecdsa_sk', 'id_dsa') output.print_msg(f"Searching ssh keys in '{sshdir}' in the following order: {', '.join(keyfiles)}", print_to="debug") for keyfile in keyfiles: keyfile_path = os.path.join(sshdir, keyfile) if os.path.isfile(keyfile_path): output.print_msg(f"Using ssh key from file: {keyfile_path}", print_to="debug") return keyfile_path return None def ssh_sign(self, data, namespace, keyfile=None): if not keyfile: keyfile = self.guess_keyfile() if not keyfile: raise oscerr.OscIOError(None, "No SSH key configured or auto-detected") keyfile = os.path.expanduser(keyfile) cmd = [self.ssh_keygen_path, '-Y', 'sign', '-f', keyfile, '-n', namespace, '-q'] proc = subprocess.Popen(cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE, encoding="utf-8") signature, _ = proc.communicate(data) if self.temp_pubkey: self.temp_pubkey.close() self.temp_pubkey = None if proc.returncode: raise oscerr.OscIOError(None, 'ssh-keygen signature creation failed: %d' % proc.returncode) match = re.match(r"\A-----BEGIN SSH SIGNATURE-----\n(.*)\n-----END SSH SIGNATURE-----", signature, re.S) if not match: raise oscerr.OscIOError(None, 'could not extract ssh signature') return base64.b64decode(match.group(1)) def get_authorization(self, chal): realm = chal.get('realm', '') now = int(time.time()) sigdata = "(created): %d" % now signature = self.ssh_sign(sigdata, realm, self.sshkey) 
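# Illustrative note (not part of this module): the ssh_sign() call above is
# roughly equivalent to running the following on the command line, where the
# key path is only an example and <realm> is the realm taken from the server's
# WWW-Authenticate challenge:
#
#     printf '(created): 1700000000' | ssh-keygen -Y sign -f ~/.ssh/id_ed25519 -n <realm> -q
#
# ssh-keygen prints a PEM-armored "SSH SIGNATURE" block; ssh_sign() strips the
# armor and base64-decodes it, and the line below re-encodes the raw signature
# for the "signature" field of the Signature authorization header.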
signature = decode_it(base64.b64encode(signature)) return 'keyId="%s",algorithm="ssh",headers="(created)",created=%d,signature="%s"' \ % (self.user, now, signature) def add_signature_auth_header(self, req, auth): token, challenge = auth.split(' ', 1) chal = urllib.request.parse_keqv_list(filter(None, urllib.request.parse_http_list(challenge))) auth = self.get_authorization(chal) if not auth: return False auth_val = f'Signature {auth}' req.add('Authorization', auth_val) return True def set_request_headers(self, url, request_headers): return False def set_request_headers_after_401(self, url, request_headers, response): auth_schemes = self._get_auth_schemes(response) if "signature" not in auth_schemes: # unsupported on server return False if not self.user: return False if self.basic_auth_password and "basic" in auth_schemes: # prefer basic auth, but only if password is set return False if not self.ssh_keygen_path: output.print_msg("Skipping signature auth because ssh-keygen is not available", print_to="debug") return False if not self.sshkey_known(): # ssh key not set, try to guess it self.sshkey = self.guess_keyfile() if not self.sshkey_known(): # ssh key cannot be guessed return False return self.add_signature_auth_header(request_headers, auth_schemes["signature"]) def process_response(self, url, request_headers, response): pass def sshkey_known(self): return self.sshkey is not None ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/osc/core.py������������������������������������������������������������������������������0000664�0000000�0000000�00000670041�14753375025�0014471�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (C) 2006 Novell Inc. All rights reserved. # This program is free software; it may be used, copied, modified # and distributed under the terms of the GNU General Public Licence, # either version 2, or version 3 (at your option). import codecs import copy import csv import datetime import difflib import errno import fnmatch import glob import hashlib import io import locale import os import platform import re import shlex import shutil import subprocess import sys import tempfile import textwrap import time import warnings from functools import cmp_to_key, total_ordering from http.client import IncompleteRead from io import StringIO from pathlib import Path from typing import Optional, Dict, Union, List, Iterable from urllib.parse import parse_qs, urlsplit, urlunsplit, urlparse, quote, urlencode, unquote from urllib.error import HTTPError from xml.etree import ElementTree as ET try: import distro except ImportError: distro = None from . import __version__ from . import _private from . import conf from . import meter from . import oscerr from . import output from . 
import store as osc_store from .connection import http_request, http_GET, http_POST, http_PUT, http_DELETE from .obs_scm import File from .obs_scm import Linkinfo from .obs_scm import Package from .obs_scm import Project from .obs_scm import Serviceinfo from .obs_scm import Store from .obs_scm.store import __store_version__ from .obs_scm.store import check_store_version from .obs_scm.store import delete_storedir from .obs_scm.store import is_package_dir from .obs_scm.store import is_project_dir from .obs_scm.store import read_inconflict from .obs_scm.store import read_filemeta from .obs_scm.store import read_sizelimit from .obs_scm.store import read_tobeadded from .obs_scm.store import read_tobedeleted from .obs_scm.store import store from .obs_scm.store import store_read_apiurl from .obs_scm.store import store_read_file from .obs_scm.store import store_read_last_buildroot from .obs_scm.store import store_readlist from .obs_scm.store import store_read_package from .obs_scm.store import store_read_project from .obs_scm.store import store_read_scmurl from .obs_scm.store import store_unlink_file from .obs_scm.store import store_write_apiurl from .obs_scm.store import store_write_initial_packages from .obs_scm.store import store_write_last_buildroot from .obs_scm.store import store_write_project from .obs_scm.store import store_write_string from .output import get_default_pager from .output import run_pager from .output import sanitize_text from .util import xdg from .util.helper import decode_list, decode_it, raw_input, _html_escape from .util.xml import xml_fromstring from .util.xml import xml_indent_compat as xmlindent from .util.xml import xml_parse ET_ENCODING = "unicode" def compare(a, b): return cmp(a[1:], b[1:]) def cmp(a, b): return (a > b) - (a < b) DISTURL_RE = re.compile(r"^(?P<bs>.*)://(?P<apiurl>.*?)/(?P<project>.*?)/(?P<repository>.*?)/(?P<revision>.*)-(?P<source>.*)$") BUILDLOGURL_RE = re.compile(r"^(?P<apiurl>https?://.*?)/build/(?P<project>.*?)/(?P<repository>.*?)/(?P<arch>.*?)/(?P<package>.*?)/_log$") BUFSIZE = 1024 * 1024 new_project_templ = """\ <project name="%(name)s"> <title> """ new_package_templ = """\ """ new_attribute_templ = """\ """ new_user_template = """\ %(user)s PUT_EMAIL_ADDRESS_HERE PUT_REAL_NAME_HERE """ new_group_template = """\ %(group)s """ info_templ = """\ Project name: %s Package name: %s Path: %s API URL: %s Source URL: %s srcmd5: %s Revision: %s Link info: %s """ project_info_templ = """\ Project name: %s Path: %s API URL: %s Source URL: %s """ new_pattern_template = """\ """ buildstatus_symbols = {'succeeded': '.', 'disabled': ' ', 'expansion error': 'U', # obsolete with OBS 2.0 'unresolvable': 'U', 'failed': 'F', 'broken': 'B', 'blocked': 'b', 'building': '%', 'finished': 'f', 'scheduled': 's', 'locked': 'L', 'excluded': 'x', 'dispatching': 'd', 'signing': 'S', } # os.path.samefile is available only under Unix def os_path_samefile(path1, path2): try: return os.path.samefile(path1, path2) except AttributeError: return os.path.realpath(path1) == os.path.realpath(path2) def revision_is_empty(rev: Union[None, str, int]): return rev in (None, "") class DirectoryServiceinfo: def __init__(self): self.code = None self.xsrcmd5 = None self.lsrcmd5 = None self.error = '' def read(self, serviceinfo_node): if serviceinfo_node is None: return self.code = serviceinfo_node.get('code') self.xsrcmd5 = serviceinfo_node.get('xsrcmd5') self.lsrcmd5 = serviceinfo_node.get('lsrcmd5') self.error = serviceinfo_node.find('error') if self.error: self.error = self.error.text 
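# ---------------------------------------------------------------------------
# Illustrative sketch (not part of this module): feeding a <serviceinfo>
# element into the DirectoryServiceinfo class defined above. The XML snippet
# is a made-up example of what a source directory listing may contain.
from xml.etree import ElementTree as ET  # mirrors the import at the top of this file

_example_node = ET.fromstring('<serviceinfo code="running" lsrcmd5="deadbeef"/>')
_example_info = DirectoryServiceinfo()
_example_info.read(_example_node)
print(_example_info.code)          # "running"
print(_example_info.isexpanded())  # True: lsrcmd5 is set, xsrcmd5 is missing
print(_example_info.haserror())    # False: no <error> child element
# (end of illustrative sketch)
# ---------------------------------------------------------------------------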
def isexpanded(self): """ Returns true, if the directory contains the "expanded"/generated service files """ return self.lsrcmd5 is not None and self.xsrcmd5 is None def haserror(self): return self.error is not None class AbstractState: """ Base class which represents state-like objects (````, ````). """ def __init__(self, tag): self.__tag = tag def get_node_attrs(self): """:return: attributes for the tag/element""" raise NotImplementedError() def get_node_name(self): """:return: tag/element name""" return self.__tag def get_comment(self): """:return: data from ```` tag""" raise NotImplementedError() def get_description(self): """:return: data from ```` tag""" raise NotImplementedError() def to_xml(self): """:return: object serialized to XML""" root = ET.Element(self.get_node_name()) for attr in self.get_node_attrs(): val = getattr(self, attr) if val is not None: root.set(attr, val) if self.get_description(): ET.SubElement(root, 'description').text = self.get_description() if self.get_comment(): ET.SubElement(root, 'comment').text = self.get_comment() return root def to_str(self): """:return: object serialized to pretty-printed XML""" root = self.to_xml() xmlindent(root) return ET.tostring(root, encoding=ET_ENCODING) class ReviewState(AbstractState): """Represents the review state in a request""" def __init__(self, review_node): if not review_node.get('state'): raise oscerr.APIError('invalid review node (state attr expected): %s' % ET.tostring(review_node, encoding=ET_ENCODING)) super().__init__(review_node.tag) self.state = review_node.get('state') self.by_user = review_node.get('by_user') self.by_group = review_node.get('by_group') self.by_project = review_node.get('by_project') self.by_package = review_node.get('by_package') self.who = review_node.get('who') self.when = review_node.get('when') self.comment = review_node.findtext("comment", default="").strip() def __repr__(self): result = super().__repr__() result += "(" result += f"{self.state}" if self.who: result += f" by {self.who}" for by in ("user", "group", "project", "package"): by_value = getattr(self, f"by_{by}", None) if by_value: result += f" [{by} {by_value}])" break result += ")" return result def get_node_attrs(self): return ('state', 'by_user', 'by_group', 'by_project', 'by_package', 'who', 'when') def get_comment(self): return self.comment def get_description(self): return None class RequestHistory(AbstractState): """Represents a history element of a request""" re_name = re.compile(r'^Request (?:got )?([^\s]+)$') def __init__(self, history_node): super().__init__(history_node.tag) self.who = history_node.get('who') self.when = history_node.get('when') if history_node.find('description') is not None: # OBS 2.6 self.description = history_node.findtext("description").strip() else: # OBS 2.5 and before self.description = history_node.get("name") self.comment = '' if history_node.find("comment") is not None: self.comment = history_node.findtext("comment").strip() self.name = self._parse_name(history_node) def _parse_name(self, history_node): name = history_node.get('name', None) if name is not None: # OBS 2.5 and before return name mo = self.re_name.search(self.description) if mo is not None: return mo.group(1) return self.description def get_node_attrs(self): return ('who', 'when') def get_description(self): return self.description def get_comment(self): return self.comment class RequestState(AbstractState): """Represents the state of a request""" def __init__(self, state_node): if not state_node.get('name'): raise 
oscerr.APIError('invalid request state node (name attr expected): %s' % ET.tostring(state_node, encoding=ET_ENCODING)) super().__init__(state_node.tag) self.name = state_node.get('name') self.who = state_node.get('who') self.when = state_node.get('when') self.approver = state_node.get('approver') self.superseded_by = state_node.get("superseded_by", None) if state_node.find('description') is None: # OBS 2.6 has it always, before it did not exist self.description = state_node.get('description') self.comment = '' if state_node.find('comment') is not None: self.comment = state_node.findtext("comment").strip() def get_node_attrs(self): return ('name', 'who', 'when', 'approver') def get_comment(self): return self.comment def get_description(self): return None class Action: """ Represents an ```` element of a Request. This class is quite common so that it can be used for all different action types. .. note:: Instances only provide attributes for their specific type. Examples:: r = Action('set_bugowner', tgt_project='foo', person_name='buguser') # available attributes: r.type (== 'set_bugowner'), r.tgt_project (== 'foo'), r.tgt_package (is None) r.to_str() -> r = Action('delete', tgt_project='foo', tgt_package='bar') # available attributes: r.type (== 'delete'), r.tgt_project (== 'foo'), r.tgt_package (=='bar') r.to_str() -> """ # allowed types + the corresponding (allowed) attributes type_args = {'submit': ('src_project', 'src_package', 'src_rev', 'tgt_project', 'tgt_package', 'opt_sourceupdate', 'acceptinfo_rev', 'acceptinfo_srcmd5', 'acceptinfo_xsrcmd5', 'acceptinfo_osrcmd5', 'acceptinfo_oxsrcmd5', 'opt_updatelink', 'opt_makeoriginolder'), 'add_role': ('tgt_project', 'tgt_package', 'person_name', 'person_role', 'group_name', 'group_role'), 'set_bugowner': ('tgt_project', 'tgt_package', 'person_name', 'group_name'), 'maintenance_release': ('src_project', 'src_package', 'src_rev', 'tgt_project', 'tgt_package', 'person_name', 'acceptinfo_rev', 'acceptinfo_srcmd5', 'acceptinfo_xsrcmd5', 'acceptinfo_osrcmd5', 'acceptinfo_oxsrcmd5', 'acceptinfo_oproject', 'acceptinfo_opackage'), 'release': ('src_project', 'src_package', 'src_rev', 'src_repository', 'tgt_project', 'tgt_package', 'person_name', 'acceptinfo_rev', 'acceptinfo_srcmd5', 'acceptinfo_xsrcmd5', 'acceptinfo_osrcmd5', 'acceptinfo_oxsrcmd5', 'acceptinfo_oproject', 'acceptinfo_opackage', 'tgt_repository'), 'maintenance_incident': ('src_project', 'src_package', 'src_rev', 'tgt_project', 'tgt_package', 'tgt_releaseproject', 'person_name', 'opt_sourceupdate', 'opt_makeoriginolder', 'acceptinfo_rev', 'acceptinfo_srcmd5', 'acceptinfo_xsrcmd5', 'acceptinfo_osrcmd5', 'acceptinfo_oxsrcmd5'), 'delete': ('tgt_project', 'tgt_package', 'tgt_repository'), 'change_devel': ('src_project', 'src_package', 'tgt_project', 'tgt_package'), 'group': ('grouped_id', )} # attribute prefix to element name map (only needed for abbreviated attributes) prefix_to_elm = {'src': 'source', 'tgt': 'target', 'opt': 'options'} def __init__(self, type, **kwargs): self.apiurl = kwargs.pop("apiurl", None) self._src_pkg_object = None self._tgt_pkg_object = None if type not in Action.type_args.keys(): raise oscerr.WrongArgs(f'invalid action type: \'{type}\'') self.type = type for i in kwargs.keys(): if i not in Action.type_args[type]: raise oscerr.WrongArgs(f'invalid argument: \'{i}\'') # set all type specific attributes for i in Action.type_args[type]: setattr(self, i, kwargs.get(i)) def __repr__(self): result = super().__repr__() result += "(" result += f"type={self.type}" src_pkg = 
self.src_pkg_object if src_pkg: result += f" source={src_pkg.project}/{src_pkg.name}" elif getattr(self, "src_project", None): result += f" source={self.src_project}" tgt_pkg = self.tgt_pkg_object if tgt_pkg: result += f" target={tgt_pkg.project}/{tgt_pkg.name}" elif getattr(self, "tgt_project", None): result += f" target={self.tgt_project}" result += ")" return result @property def src_pkg_object(self): if not getattr(self, "src_project", None) or not getattr(self, "src_package", None): return None if not self._src_pkg_object: src_rev = getattr(self, "src_rev", None) self._src_pkg_object = _private.ApiPackage(self.apiurl, self.src_project, self.src_package, src_rev) return self._src_pkg_object @property def tgt_pkg_object(self): if not self._tgt_pkg_object: if self.type == "maintenance_incident": # the target project for maintenance incidents is virtual and cannot be queried # the actual target project is in the "releaseproject" attribute # # tgt_releaseproject is always set for a maintenance_incident # pylint: disable=no-member tgt_project = self.tgt_releaseproject # the target package is not specified # we need to extract it from source package's _meta src_package_meta_releasename = self.src_pkg_object.get_meta_value("releasename") tgt_package = src_package_meta_releasename.split(".")[0] else: if not getattr(self, "tgt_project", None) or not getattr(self, "tgt_package", None): return None # tgt_project and tgt_package are checked above # pylint: disable=no-member tgt_project = self.tgt_project tgt_package = self.tgt_package self._tgt_pkg_object = _private.ApiPackage(self.apiurl, tgt_project, tgt_package) return self._tgt_pkg_object def to_xml(self): """ Serialize object to XML. The xml tag names and attributes are constructed from the instance's attributes. :return: object serialized to XML Example:: self.group_name -> tag name is "group", attribute name is "name" self.src_project -> tag name is "source" (translated via prefix_to_elm dict), attribute name is "project" Attributes prefixed with ``opt_`` need a special handling, the resulting xml should look like this: ``opt_updatelink`` -> ``value``. Attributes which are ``None`` will be skipped. 
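        A concrete sketch (project/package names are made up)::

            action = Action('submit', src_project='home:user', src_package='foo',
                            tgt_project='openSUSE:Factory', tgt_package='foo',
                            opt_sourceupdate='cleanup')
            ET.tostring(action.to_xml(), encoding=ET_ENCODING)
            # roughly:
            # <action type="submit">
            #   <source project="home:user" package="foo" />
            #   <target project="openSUSE:Factory" package="foo" />
            #   <options><sourceupdate>cleanup</sourceupdate></options>
            # </action>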
""" root = ET.Element('action', type=self.type) for i in Action.type_args[self.type]: prefix, attr = i.split('_', 1) vals = getattr(self, i) # single, plain elements are _not_ stored in a list plain = False if vals is None: continue elif not hasattr(vals, 'append'): vals = [vals] plain = True for val in vals: elm = root.find(Action.prefix_to_elm.get(prefix, prefix)) if elm is None or not plain: elm = ET.Element(Action.prefix_to_elm.get(prefix, prefix)) root.append(elm) if prefix == 'opt': ET.SubElement(elm, attr).text = val else: elm.set(attr, val) return root def to_str(self): """:return: object serialized to pretty-printed XML""" root = self.to_xml() xmlindent(root) return ET.tostring(root, encoding=ET_ENCODING) @staticmethod def from_xml(action_node, apiurl=None): """create action from XML""" if action_node is None or \ action_node.get('type') not in Action.type_args.keys() or \ action_node.tag not in ('action', 'submit'): raise oscerr.WrongArgs('invalid argument') elm_to_prefix = {i[1]: i[0] for i in Action.prefix_to_elm.items()} kwargs = {} for node in action_node: prefix = elm_to_prefix.get(node.tag, node.tag) if prefix == 'opt': data = [(f'opt_{opt.tag}', opt.text.strip()) for opt in node if opt.text] else: data = [(f'{prefix}_{k}', v) for k, v in node.items()] # it would be easier to store everything in a list but in # this case we would lose some "structure" (see to_xml) for k, v in data: if k in kwargs: l = kwargs[k] if not hasattr(l, 'append'): l = [l] kwargs[k] = l l.append(v) else: kwargs[k] = v kwargs["apiurl"] = apiurl return Action(action_node.get('type'), **kwargs) @total_ordering class Request: """Represents a request (````)""" @classmethod def from_api(cls, apiurl: str, req_id: int): # TODO: deprecate get_request() or move its content here req_id = str(req_id) return get_request(apiurl, req_id) def __init__(self): self._init_attributes() def _init_attributes(self): """initialize attributes with default values""" self.reqid = None self.creator = '' self.title = '' self.description = '' self.priority = None self.state = None self.accept_at = None self.actions = [] self.statehistory = [] self.reviews = [] self._issues = None def __eq__(self, other): return int(self.reqid) == int(other.reqid) def __lt__(self, other): return int(self.reqid) < int(other.reqid) @property def id(self): return self.reqid @property def issues(self): if self._issues is None: self._issues = get_request_issues(self.apiurl, self.id) return self._issues def read(self, root, apiurl=None): """read in a request""" self._init_attributes() self.apiurl = apiurl if not root.get('id'): raise oscerr.APIError(f'invalid request: {ET.tostring(root, encoding=ET_ENCODING)}\n') self.reqid = root.get('id') if root.get('creator'): # OBS 2.8 and later is delivering creator informations self.creator = root.get('creator') if root.find('state') is None: raise oscerr.APIError(f'invalid request (state expected): {ET.tostring(root, encoding=ET_ENCODING)}\n') self.state = RequestState(root.find('state')) action_nodes = root.findall('action') if not action_nodes: # check for old-style requests for i in root.findall('submit'): i.set('type', 'submit') action_nodes.append(i) for action in action_nodes: self.actions.append(Action.from_xml(action, self.apiurl)) for review in root.findall('review'): self.reviews.append(ReviewState(review)) for history_element in root.findall('history'): self.statehistory.append(RequestHistory(history_element)) if root.findtext("priority"): self.priority = root.findtext("priority").strip() if 
root.findtext("accept_at"): self.accept_at = root.findtext("accept_at").strip() if root.findtext("title"): self.title = root.findtext("title").strip() if root.findtext("description"): self.description = root.findtext("description").strip() def add_action(self, type, **kwargs): """add a new action to the request""" self.actions.append(Action(type, **kwargs)) def get_actions(self, *types) -> List[Action]: """ get all actions with a specific type (if types is empty return all actions) """ if not types: return self.actions return [i for i in self.actions if i.type in types] def to_xml(self): """:return: object serialized to XML""" root = ET.Element('request') if self.reqid is not None: root.set('id', self.reqid) if self.creator: root.set('creator', self.creator) for action in self.actions: root.append(action.to_xml()) if self.state is not None: root.append(self.state.to_xml()) for review in self.reviews: root.append(review.to_xml()) for hist in self.statehistory: root.append(hist.to_xml()) if self.title: ET.SubElement(root, 'title').text = self.title if self.description: ET.SubElement(root, 'description').text = self.description if self.accept_at: ET.SubElement(root, 'accept_at').text = self.accept_at if self.priority: ET.SubElement(root, 'priority').text = self.priority return root def to_str(self): """:return: object serialized to pretty-printed XML""" root = self.to_xml() xmlindent(root) return ET.tostring(root, encoding=ET_ENCODING) def accept_at_in_hours(self, hours): """set auto accept_at time""" now = datetime.datetime.utcnow() now = now + datetime.timedelta(hours=hours) self.accept_at = now.isoformat() + '+00:00' @staticmethod def format_review(review, show_srcupdate=False): """ format a review depending on the reviewer's type. A dict which contains the formatted str's is returned. """ d = {'state': f'{review.state}:'} if review.by_package: d['by'] = f'{review.by_project}/{review.by_package}' d['type'] = 'Package' elif review.by_project: d['by'] = f'{review.by_project}' d['type'] = 'Project' elif review.by_group: d['by'] = f'{review.by_group}' d['type'] = 'Group' else: d['by'] = f'{review.by_user}' d['type'] = 'User' if review.who: d['by'] += f'({review.who})' return d def format_action(self, action: Action, show_srcupdate=False): """ format an action depending on the action's type. A dict which contains the formatted str's is returned. 
""" def prj_pkg_join(prj, pkg, repository=None): if not pkg: if not repository: return prj or '' return f'{prj}({repository})' return f'{prj}/{pkg}' d = {'type': f'{action.type}:'} if action.type == 'set_bugowner': if action.person_name: d['source'] = action.person_name if action.group_name: d['source'] = f'group:{action.group_name}' d['target'] = prj_pkg_join(action.tgt_project, action.tgt_package) elif action.type == 'change_devel': d['source'] = prj_pkg_join(action.tgt_project, action.tgt_package) d['target'] = f'developed in {prj_pkg_join(action.src_project, action.src_package)}' elif action.type == 'maintenance_incident': d['source'] = f'{action.src_project} ->' if action.src_package: d['source'] = f'{prj_pkg_join(action.src_project, action.src_package)}' if action.src_rev: d['source'] = d['source'] + f'@{action.src_rev}' d['source'] = d['source'] + ' ->' d['target'] = action.tgt_project if action.tgt_releaseproject: d['target'] += " (release in " + action.tgt_releaseproject + ")" srcupdate = ' ' if action.opt_sourceupdate and show_srcupdate: srcupdate = f'({action.opt_sourceupdate})' elif action.type in ('maintenance_release', 'release'): d['source'] = f'{prj_pkg_join(action.src_project, action.src_package)}' if action.src_rev: d['source'] = d['source'] + f'@{action.src_rev}' d['source'] = d['source'] + ' ->' d['target'] = prj_pkg_join(action.tgt_project, action.tgt_package) elif action.type == 'submit': d['source'] = f'{prj_pkg_join(action.src_project, action.src_package)}' if action.src_rev: d['source'] = d['source'] + f'@{action.src_rev}' if action.opt_sourceupdate and show_srcupdate: d['source'] = d['source'] + f'({action.opt_sourceupdate})' d['source'] = d['source'] + ' ->' tgt_package = action.tgt_package if action.src_package == action.tgt_package: tgt_package = '' d['target'] = prj_pkg_join(action.tgt_project, tgt_package) if action.opt_makeoriginolder: d['target'] = d['target'] + ' ***make origin older***' if action.opt_updatelink: d['target'] = d['target'] + ' ***update link***' elif action.type == 'add_role': roles = [] if action.person_name and action.person_role: roles.append(f'person: {action.person_name} as {action.person_role}') if action.group_name and action.group_role: roles.append(f'group: {action.group_name} as {action.group_role}') d['source'] = ', '.join(roles) d['target'] = prj_pkg_join(action.tgt_project, action.tgt_package) elif action.type == 'delete': d['source'] = '' d['target'] = prj_pkg_join(action.tgt_project, action.tgt_package, action.tgt_repository) elif action.type == 'group': l = action.grouped_id if l is None: # there may be no requests in a group action l = '' if not hasattr(l, 'append'): l = [l] d['source'] = ', '.join(l) + ' ->' d['target'] = self.reqid else: raise oscerr.APIError(f'Unknown action type {action.type}\n') return d def list_view(self): """return "list view" format""" status = self.state.name if self.state.name == 'review' and self.state.approver: status += "(approved)" lines = ['%6s State:%-10s By:%-12s When:%-19s' % (self.reqid, status, self.state.who, self.state.when)] lines += [f" Created by: {self.creator}"] tmpl = ' %(type)-16s %(source)-50s %(target)s' for action in self.actions: lines.append(tmpl % self.format_action(action)) tmpl = ' Review by %(type)-10s is %(state)-10s %(by)-50s' for review in self.reviews: lines.append(tmpl % Request.format_review(review)) history = [f'{hist.description}: {hist.who}' for hist in self.statehistory] if history: lines.append(f" From: {' -> '.join(history)}") if self.description: 
lines.append(textwrap.fill(self.description, width=80, initial_indent=' Descr: ', subsequent_indent=' ')) lines.append(textwrap.fill(self.state.comment, width=80, initial_indent=' Comment: ', subsequent_indent=' ')) return '\n'.join(lines) def __str__(self): """return "detailed" format""" lines = [ f"Request: {self.reqid}", f"Created by: {self.creator}", ] if self.accept_at and self.state.name in ['new', 'review']: lines.append(' *** This request will get automatically accepted after ' + self.accept_at + ' ! ***\n') if self.priority in ['critical', 'important'] and self.state.name in ['new', 'review']: lines.append(' *** This request has classified as ' + self.priority + ' ! ***\n') if self.state and self.state.approver and self.state.name == 'review': lines.append(' *** This request got approved by ' + self.state.approver + '. It will get automatically accepted after last review got accepted! ***\n') lines += ["", "Actions:"] for action in self.actions: fmt_action = self.format_action(action, show_srcupdate=True) if action.type == 'delete': lines += [f" {fmt_action['type']:13} {fmt_action['target']}"] else: lines += [f" {fmt_action['type']:13} {fmt_action['source']} {fmt_action['target']}"] lines += ["", "Message:", textwrap.indent(self.description or "", prefix=" ")] if self.state: state_name = self.state.name if self.state.superseded_by: state_name += f" by {self.state.superseded_by}" lines += ["", "State:", f" {state_name:61} {self.state.when:12} {self.state.who}"] if self.state.comment: lines += [textwrap.indent(self.state.comment, prefix=" | ", predicate=lambda line: True)] if self.reviews: lines += [""] lines += ["Review:"] for review in reversed(self.reviews): d = {'state': review.state} if review.by_user: d['by'] = "User: " + review.by_user if review.by_group: d['by'] = "Group: " + review.by_group if review.by_package: d['by'] = "Package: " + review.by_project + "/" + review.by_package elif review.by_project: d['by'] = "Project: " + review.by_project d['when'] = review.when or '' d['who'] = review.who or '' lines += [f" {d['state']:10} {d['by']:50} {d['when']:12} {d['who']}"] if review.comment: lines += [textwrap.indent(review.comment, prefix=" | ", predicate=lambda line: True)] if self.statehistory: lines += ["", "History:"] for hist in reversed(self.statehistory): lines += [f" {hist.when:10} {hist.who:30} {hist.description}"] return '\n'.join(lines) def create(self, apiurl: str, addrevision=False, enforce_branching=False): """create a new request""" query = {'cmd': 'create'} if addrevision: query['addrevision'] = "1" if enforce_branching: query['enforce_branching'] = "1" u = makeurl(apiurl, ['request'], query=query) f = http_POST(u, data=self.to_str()) root = xml_fromstring(f.read()) self.read(root) def shorttime(t): """format time as Apr 02 18:19 or Apr 02 2005 depending on whether it is in the current year """ if time.gmtime()[0] == time.gmtime(t)[0]: # same year return time.strftime('%b %d %H:%M', time.gmtime(t)) else: return time.strftime('%b %d %Y', time.gmtime(t)) def parse_disturl(disturl: str): """Parse a disturl, returns tuple (apiurl, project, source, repository, revision), else raises an oscerr.WrongArgs exception """ global DISTURL_RE m = DISTURL_RE.match(disturl) if not m: raise oscerr.WrongArgs(f"`{disturl}' does not look like disturl") apiurl = m.group('apiurl') if apiurl.split('.')[0] != 'api': apiurl = 'https://api.' 
+ ".".join(apiurl.split('.')[1:]) return (apiurl, m.group('project'), m.group('source'), m.group('repository'), m.group('revision')) def parse_buildlogurl(buildlogurl: str): """Parse a build log url, returns a tuple (apiurl, project, package, repository, arch), else raises oscerr.WrongArgs exception""" global BUILDLOGURL_RE m = BUILDLOGURL_RE.match(buildlogurl) if not m: raise oscerr.WrongArgs(f'\'{buildlogurl}\' does not look like url with a build log') return (m.group('apiurl'), m.group('project'), m.group('package'), m.group('repository'), m.group('arch')) def slash_split(args): """Split command line arguments like 'foo/bar' into 'foo' 'bar'. This is handy to allow copy/paste a project/package combination in this form. Leading and trailing slashes are removed before the split, because the split could otherwise give additional empty strings. """ result = [] for arg in args: arg = arg.strip("/") result += arg.split("/") return result def expand_proj_pack(args, idx=0, howmany=0): """looks for occurance of '.' at the position idx. If howmany is 2, both proj and pack are expanded together using the current directory, or none of them if not possible. If howmany is 0, proj is expanded if possible, then, if there is no idx+1 element in args (or args[idx+1] == '.'), pack is also expanded, if possible. If howmany is 1, only proj is expanded if possible. If args[idx] does not exist, an implicit '.' is assumed. If not enough elements up to idx exist, an error is raised. See also parseargs(args), slash_split(args), Package.from_paths(args) All these need unification, somehow. """ # print args,idx,howmany if len(args) < idx: raise oscerr.WrongArgs('not enough argument, expected at least %d' % idx) if len(args) == idx: args += '.' if args[idx + 0] == '.': if howmany == 0 and len(args) > idx + 1: if args[idx + 1] == '.': # we have two dots. # remove one dot and make sure to expand both proj and pack args.pop(idx + 1) howmany = 2 else: howmany = 1 # print args,idx,howmany args[idx + 0] = store_read_project('.') if howmany == 0: try: package = store_read_package('.') args.insert(idx + 1, package) except: pass elif howmany == 2: package = store_read_package('.') args.insert(idx + 1, package) return args def findpacs(files, progress_obj=None, fatal=True): """collect Package objects belonging to the given files and make sure each Package is returned only once""" import warnings warnings.warn( "osc.core.findpacs() is deprecated. " "Use osc.core.Package.from_paths() or osc.core.Package.from_paths_nofail() instead.", DeprecationWarning ) if fatal: return Package.from_paths(files, progress_obj) return Package.from_paths_nofail(files, progress_obj) def parseargs(list_of_args): """Convenience method osc's commandline argument parsing. If called with an empty tuple (or list), return a list containing the current directory. Otherwise, return a list of the arguments.""" if list_of_args: return list(list_of_args) else: return [os.curdir] def statfrmt(statusletter, filename): return f'{statusletter} {filename}' def pathjoin(a, *p): """Join two or more pathname components, inserting '/' as needed. 
Cut leading ./""" path = os.path.join(a, *p) if path.startswith('./'): path = path[2:] return path class UrlQueryArray(list): """ Passing values wrapped in this object causes ``makeurl()`` to encode the list in Ruby on Rails compatible way (adding square brackets to the parameter names): {"file": UrlQueryArray(["foo", "bar"])} -> &file[]=foo&file[]=bar """ pass def makeurl(apiurl: str, path: List[str], query: Optional[dict] = None): """ Construct an URL based on the given arguments. :param apiurl: URL to the API server. :param path: List of URL path components. :param query: Optional dictionary with URL query data. Values can be: ``str``, ``int``, ``bool``, ``[str]``, ``[int]``. Items with value equal to ``None`` will be skipped. """ apiurl_scheme, apiurl_netloc, apiurl_path = urlsplit(apiurl)[0:3] path = apiurl_path.split("/") + [i.strip("/") for i in path] path = [quote(i, safe="/:") for i in path] path_str = "/".join(path) # DEPRECATED if isinstance(query, (list, tuple)): warnings.warn( "makeurl() query taking a list or a tuple is deprecated. Use dict instead.", DeprecationWarning ) query_str = "&".join(query) return urlunsplit((apiurl_scheme, apiurl_netloc, path_str, query_str, "")) # DEPRECATED if isinstance(query, str): warnings.warn( "makeurl() query taking a string is deprecated. Use dict instead.", DeprecationWarning ) query_str = query return urlunsplit((apiurl_scheme, apiurl_netloc, path_str, query_str, "")) if query is None: query = {} query = copy.deepcopy(query) for key in list(query): value = query[key] if value in (None, [], ()): # remove items with value equal to None or [] or () del query[key] elif isinstance(value, bool): # convert boolean values to "0" or "1" query[key] = str(int(value)) elif isinstance(value, UrlQueryArray): # encode lists in Ruby on Rails compatible way: # {"file": ["foo", "bar"]} -> &file[]=foo&file[]=bar del query[key] query[f"{key}[]"] = value query_str = urlencode(query, doseq=True) _private.print_msg("makeurl:", path_str+"?"+query_str, print_to="debug") return urlunsplit((apiurl_scheme, apiurl_netloc, path_str, query_str, "")) def meta_get_packagelist(apiurl: str, prj, deleted=None, expand=False): query = {} if deleted: query['deleted'] = 1 elif deleted in (False, 0): # HACK: Omitted 'deleted' and 'deleted=0' produce different results. # By explicit 'deleted=0', we also get multibuild packages listed. # See: https://github.com/openSUSE/open-build-service/issues/9715 query['deleted'] = 0 if expand: query['expand'] = 1 u = makeurl(apiurl, ['source', prj], query) f = http_GET(u) root = xml_parse(f).getroot() return [node.get('name') for node in root.findall('entry')] def meta_get_filelist( apiurl: str, prj: str, package: str, verbose=False, expand=False, revision=None, meta=False, deleted=False ): """return a list of file names, or a list File() instances if verbose=True""" query: Dict[str, Union[str, int]] = {} if deleted: query['deleted'] = 1 if expand: query['expand'] = 1 if meta: query['meta'] = 1 if not revision_is_empty(revision): query['rev'] = revision else: query['rev'] = 'latest' u = makeurl(apiurl, ['source', prj, package], query=query) f = http_GET(u) root = xml_parse(f).getroot() if not verbose: return [node.get('name') for node in root.findall('entry')] else: l = [] # rev = int(root.get('rev')) # don't force int. also allow srcmd5 here. 
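        # Example for makeurl() defined above (illustrative; the API host and
        # the path components are made up). Items whose value is None are
        # dropped, booleans become "0"/"1", and values wrapped in
        # UrlQueryArray get the Rails-style "[]" suffix on the parameter name
        # (see the class docstring above; the brackets may appear
        # percent-encoded on the wire):
        #
        #   makeurl("https://api.example.org",
        #           ["source", "openSUSE:Factory", "osc"],
        #           {"deleted": None, "expand": True, "view": "info"})
        #   -> 'https://api.example.org/source/openSUSE:Factory/osc?expand=1&view=info'
        #
        #   makeurl("https://api.example.org", ["request", "1234"],
        #           {"cmd": "diff", "file": UrlQueryArray(["foo.spec", "foo.changes"])})
        #   -> query parameters cmd=diff, file[]=foo.spec, file[]=foo.changes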
rev = root.get('rev') for node in root.findall('entry'): f = File(node.get('name'), node.get('md5'), int(node.get('size')), int(node.get('mtime'))) f.rev = rev l.append(f) return l def meta_get_project_list(apiurl: str, deleted=False): query = {} if deleted: query['deleted'] = 1 u = makeurl(apiurl, ['source'], query) f = http_GET(u) root = xml_parse(f).getroot() return sorted(node.get('name') for node in root if node.get('name')) def show_project_meta(apiurl: str, prj: str, rev=None, blame=None): query = {} if blame: query['view'] = "blame" if not revision_is_empty(rev): query['rev'] = rev url = makeurl(apiurl, ['source', prj, '_project', '_meta'], query) try: f = http_GET(url) except HTTPError as e: error_help = "%d" % e.code os_err = e.hdrs.get('X-Opensuse-Errorcode') if os_err: error_help = "%s (%d) project: %s" % (os_err, e.code, prj) if e.code == 404 and os_err == 'unknown_package': error_help = 'option -r|--revision is not supported by this OBS version' e.osc_msg = f'BuildService API error: {error_help}' raise else: if blame: url = makeurl(apiurl, ['source', prj, '_project', '_meta'], query) else: url = makeurl(apiurl, ['source', prj, '_meta']) f = http_GET(url) return f.readlines() def show_project_conf(apiurl: str, prj: str, rev=None, blame=None): query = {} url = None if not revision_is_empty(rev): query['rev'] = rev if blame: query['view'] = "blame" url = makeurl(apiurl, ['source', prj, '_project', '_config'], query=query) else: url = makeurl(apiurl, ['source', prj, '_config'], query=query) f = http_GET(url) return f.readlines() def show_package_trigger_reason(apiurl: str, prj: str, pac: str, repo: str, arch: str): url = makeurl(apiurl, ['build', prj, repo, arch, pac, '_reason']) try: f = http_GET(url) return f.read() except HTTPError as e: e.osc_msg = f'Error getting trigger reason for project \'{prj}\' package \'{pac}\'' raise def show_package_meta(apiurl: str, prj: str, pac: str, meta=False, blame=None): query: Dict[str, Union[str, int]] = {} if meta: query['meta'] = 1 if blame: query['view'] = "blame" query['meta'] = 1 url = makeurl(apiurl, ['source', prj, pac, '_meta'], query) try: f = http_GET(url) return f.readlines() except HTTPError as e: e.osc_msg = f'Error getting meta for project \'{unquote(prj)}\' package \'{pac}\'' raise def show_attribute_meta(apiurl: str, prj: str, pac, subpac, attribute, with_defaults, with_project): path = [] path.append('source') path.append(prj) if pac: path.append(pac) if pac and subpac: path.append(subpac) path.append('_attribute') if attribute: path.append(attribute) query = {} query["with_default"] = with_defaults query["with_project"] = with_project url = makeurl(apiurl, path, query) try: f = http_GET(url) return f.readlines() except HTTPError as e: e.osc_msg = f'Error getting meta for project \'{prj}\' package \'{pac}\'' raise def clean_assets(directory): return run_external(conf.config['download-assets-cmd'], '--clean', directory) def download_assets(directory): return run_external(conf.config['download-assets-cmd'], '--unpack', '--noassetdir', directory) def show_scmsync(apiurl, prj, pac=None): from . import obs_api if pac: package_obj = obs_api.Package.from_api(apiurl, prj, pac) return package_obj.scmsync project_obj = obs_api.Project.from_api(apiurl, prj) return project_obj.scmsync def show_devel_project(apiurl, prj, pac): from . 
import obs_api package_obj = obs_api.Package.from_api(apiurl, prj, pac) if package_obj.devel is None: return None, None # mute a false-positive: Instance of 'dict' has no 'project' member (no-member) # pylint: disable=no-member return package_obj.devel.project, package_obj.devel.package def set_devel_project(apiurl, prj, pac, devprj=None, devpac=None, print_to="debug"): from . import obs_api if devprj: msg = "Setting devel project of" else: msg = "Unsetting devel project from" msg = _private.format_msg_project_package_options( msg, prj, pac, devprj, devpac, ) output.print_msg(msg, print_to=print_to) package_obj = obs_api.Package.from_api(apiurl, prj, pac) if devprj is None: package_obj.devel = None else: package_obj.devel = {"project": devprj, "package": devpac} if package_obj.has_changed(): return package_obj.to_api(apiurl) # TODO: debug log that we have skipped the API call return None def show_package_disabled_repos(apiurl: str, prj: str, pac: str): from . import obs_api # FIXME: don't work if all repos of a project are disabled and only some are enabled since is empty package_obj = obs_api.Package.from_api(apiurl, prj, pac) result = [] for i in package_obj.build_list or []: if i.flag == "disable": # pylint: disable=no-member result.append({"repo": i.repository, "arch": i.arch}) # pylint: disable=no-member return result def show_pattern_metalist(apiurl: str, prj: str): url = makeurl(apiurl, ['source', prj, '_pattern']) try: f = http_GET(url) tree = xml_parse(f) except HTTPError as e: e.osc_msg = f'show_pattern_metalist: Error getting pattern list for project \'{prj}\'' raise r = sorted(node.get('name') for node in tree.getroot()) return r def show_pattern_meta(apiurl: str, prj: str, pattern: str): url = makeurl(apiurl, ['source', prj, '_pattern', pattern]) try: f = http_GET(url) return f.readlines() except HTTPError as e: e.osc_msg = f'show_pattern_meta: Error getting pattern \'{pattern}\' for project \'{prj}\'' raise def show_configuration(apiurl): u = makeurl(apiurl, ['configuration']) f = http_GET(u) return f.readlines() class metafile: """metafile that can be manipulated and is stored back after manipulation.""" class _URLFactory: # private class which might go away again... def __init__(self, delegate, force_supported=True): self._delegate = delegate self._force_supported = force_supported def is_force_supported(self): return self._force_supported def __call__(self, **kwargs): return self._delegate(**kwargs) def __init__(self, url, input, change_is_required=False, file_ext='.xml', method=None): if isinstance(url, self._URLFactory): self._url_factory = url else: delegate = lambda **kwargs: url # force is not supported for a raw url self._url_factory = self._URLFactory(delegate, False) self.url = self._url_factory() self.change_is_required = change_is_required (fd, self.filename) = tempfile.mkstemp(prefix='osc_metafile.', suffix=file_ext) self._method = method open_mode = 'w' input_as_str = None if not isinstance(input, list): input = [input] if input and isinstance(input[0], str): input_as_str = ''.join(input) else: open_mode = 'wb' input_as_str = b''.join(input) f = os.fdopen(fd, open_mode) f.write(input_as_str) f.close() self.hash_orig = dgst(self.filename) def sync(self): if self.change_is_required and self.hash_orig == dgst(self.filename): print('File unchanged. Not saving.') os.unlink(self.filename) return print('Sending meta data...') # don't do any exception handling... 
it's up to the caller what to do in case # of an exception if self._method == "POST": http_POST(self.url, file=self.filename) else: http_PUT(self.url, file=self.filename) os.unlink(self.filename) print('Done.') def edit(self): try: try_force = False while True: if not try_force: run_editor(self.filename) try_force = False try: self.sync() break except HTTPError as e: error_help = "%d" % e.code if e.hdrs.get('X-Opensuse-Errorcode'): error_help = "%s (%d)" % (e.hdrs.get('X-Opensuse-Errorcode'), e.code) print('BuildService API error:', error_help, file=sys.stderr) # examine the error - we can't raise an exception because we might want # to try again root = xml_fromstring(e.read()) summary = root.find('summary') if summary is not None: print(summary.text, file=sys.stderr) if self._url_factory.is_force_supported(): prompt = 'Try again? ([y/N/f]): ' else: prompt = 'Try again? ([y/N): ' ri = raw_input(prompt) if ri in ('y', 'Y'): self.url = self._url_factory() elif ri in ('f', 'F') and self._url_factory.is_force_supported(): self.url = self._url_factory(force='1') try_force = True else: break finally: self.discard() def discard(self): if os.path.exists(self.filename): print(f'discarding {self.filename}') os.unlink(self.filename) # different types of metadata metatypes = {'prj': {'path': 'source/%s/_meta', 'template': new_project_templ, 'file_ext': '.xml' }, 'pkg': {'path': 'source/%s/%s/_meta', 'template': new_package_templ, 'file_ext': '.xml' }, 'attribute': {'path': 'source/%s/_attribute/%s', 'template': new_attribute_templ, 'file_ext': '.xml' }, 'prjconf': {'path': 'source/%s/_config', 'template': '', 'file_ext': '.txt' }, 'user': {'path': 'person/%s', 'template': new_user_template, 'file_ext': '.xml' }, 'group': {'path': 'group/%s', 'template': new_group_template, 'file_ext': '.xml' }, 'pattern': {'path': 'source/%s/_pattern/%s', 'template': new_pattern_template, 'file_ext': '.xml' }, } def meta_exists(metatype: str, path_args=None, template_args=None, create_new=True, apiurl=None): global metatypes if not apiurl: apiurl = conf.config['apiurl'] url = make_meta_url(metatype, path_args, apiurl) try: data = http_GET(url).readlines() except HTTPError as e: if e.code == 404 and create_new: data = metatypes[metatype]['template'] if template_args: data = StringIO(data % template_args).readlines() else: raise e return data def make_meta_url( metatype: str, path_args=None, apiurl: Optional[str] = None, force=False, remove_linking_repositories=False, msg=None, ): global metatypes if not apiurl: apiurl = conf.config['apiurl'] if metatype not in metatypes.keys(): raise AttributeError(f'make_meta_url(): Unknown meta type \'{metatype}\'') path = metatypes[metatype]['path'] if path_args: path = path % path_args query = {} if force: query = {'force': '1'} if remove_linking_repositories: query['remove_linking_repositories'] = '1' if msg: query['comment'] = msg return makeurl(apiurl, [path], query) def parse_meta_to_string(data: Union[bytes, list, Iterable]) -> str: """ Converts the output of meta_exists into a string value """ # data can be a bytes object, a list with strings, a list with bytes, just a string. # So we need the following even if it is ugly. 
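    # Illustrative examples (inputs are made up); each call below returns the
    # same string '<package/>', regardless of which input shape was passed in:
    #
    #   parse_meta_to_string(b"<package/>")
    #   parse_meta_to_string([b"<pack", b"age/>"])
    #   parse_meta_to_string("<package/>")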
if isinstance(data, bytes): data = decode_it(data) elif isinstance(data, list): data = decode_list(data) return ''.join(data) def edit_meta( metatype, path_args=None, data: Optional[List[str]] = None, template_args=None, edit=False, force=False, remove_linking_repositories=False, change_is_required=False, apiurl: Optional[str] = None, method: Optional[str] = None, msg=None, ): global metatypes if not apiurl: apiurl = conf.config['apiurl'] if not data: data = meta_exists(metatype, path_args, template_args, create_new=metatype != 'prjconf', # prjconf always exists, 404 => unknown prj apiurl=apiurl) if edit: change_is_required = True if metatype == 'pkg': # check if the package is a link to a different project project, package = path_args orgprj = xml_fromstring(parse_meta_to_string(data)).get('project') if orgprj is not None and unquote(project) != orgprj: print('The package is linked from a different project.') print('If you want to edit the meta of the package create first a branch.') print(f' osc branch {orgprj} {package} {unquote(project)}') print(f' osc meta pkg {unquote(project)} {package} -e') return def delegate(force=force): return make_meta_url(metatype, path_args, apiurl, force, remove_linking_repositories, msg) url_factory = metafile._URLFactory(delegate) f = metafile(url_factory, data, change_is_required, metatypes[metatype]['file_ext'], method=method) if edit: f.edit() else: f.sync() def show_files_meta( apiurl: str, prj: str, pac: str, revision=None, expand=False, linkrev=None, linkrepair=False, meta=False, deleted=False, ): query = {} if not revision_is_empty(revision): query['rev'] = revision else: query['rev'] = 'latest' if not revision_is_empty(linkrev): query['linkrev'] = linkrev elif conf.config['linkcontrol']: query['linkrev'] = 'base' if meta: query['meta'] = 1 if deleted: query['deleted'] = 1 if expand: query['expand'] = 1 if linkrepair: query['emptylink'] = 1 f = http_GET(makeurl(apiurl, ['source', prj, pac], query=query)) return f.read() def show_upstream_srcmd5( apiurl: str, prj: str, pac: str, expand=False, revision=None, meta=False, include_service_files=False, deleted=False ): m = show_files_meta(apiurl, prj, pac, expand=expand, revision=revision, meta=meta, deleted=deleted) et = xml_fromstring(m) if include_service_files: try: sinfo = et.find('serviceinfo') if sinfo is not None and sinfo.get('xsrcmd5') and not sinfo.get('error'): return sinfo.get('xsrcmd5') except: pass return et.get('srcmd5') def show_upstream_xsrcmd5( apiurl: str, prj, pac, revision=None, linkrev=None, linkrepair=False, meta=False, include_service_files=False ): m = show_files_meta( apiurl, prj, pac, revision=revision, linkrev=linkrev, linkrepair=linkrepair, meta=meta, expand=include_service_files, ) et = xml_fromstring(m) if include_service_files: return et.get('srcmd5') li_node = et.find('linkinfo') if li_node is None: return None li = Linkinfo() li.read(li_node) if li.haserror(): raise oscerr.LinkExpandError(prj, pac, li.error) return li.xsrcmd5 def show_project_sourceinfo(apiurl: str, project: str, nofilename: bool, *packages): query = {} query["view"] = "info" query["package"] = packages query["nofilename"] = nofilename f = http_GET(makeurl(apiurl, ['source', project], query=query)) return f.read() def get_project_sourceinfo(apiurl: str, project: str, nofilename: bool, *packages): try: si = show_project_sourceinfo(apiurl, project, nofilename, *packages) except HTTPError as e: # old API servers (e.g. 
        # 2.3.5) do not know the 'nofilename' parameter, so retry without
        if e.code == 400 and nofilename:
            return get_project_sourceinfo(apiurl, project, False, *packages)
        # an uri too long error is sometimes handled as status 500
        # (depending, e.g., on the apache2 configuration)
        if e.code not in (414, 500):
            raise
        if len(packages) == 1:
            raise oscerr.APIError(f'package name too long: {packages[0]}')
        n = int(len(packages) / 2)
        pkgs = packages[:n]
        res = get_project_sourceinfo(apiurl, project, nofilename, *pkgs)
        pkgs = packages[n:]
        res.update(get_project_sourceinfo(apiurl, project, nofilename, *pkgs))
        return res
    root = xml_fromstring(si)
    res = {}
    for sinfo in root.findall('sourceinfo'):
        res[sinfo.get('package')] = sinfo
    return res


def show_upstream_rev_vrev(apiurl: str, prj, pac, revision=None, expand=False, meta=False):
    m = show_files_meta(apiurl, prj, pac, revision=revision, expand=expand, meta=meta)
    et = xml_fromstring(m)
    rev = et.get("rev") or None
    vrev = et.get("vrev") or None
    return rev, vrev


def show_upstream_rev(
    apiurl: str, prj, pac, revision=None, expand=False, linkrev=None, meta=False, include_service_files=False
):
    m = show_files_meta(apiurl, prj, pac, revision=revision, expand=expand, linkrev=linkrev, meta=meta)
    et = xml_fromstring(m)
    if include_service_files:
        try:
            sinfo = et.find('serviceinfo')
            if sinfo is not None and sinfo.get('xsrcmd5') and not sinfo.get('error'):
                return sinfo.get('xsrcmd5')
        except:
            pass
    return et.get('rev')


def read_meta_from_spec(specfile, *args):
    """
    Read tags and sections from spec file. To read out a tag the passed
    argument mustn't end with a colon. To read out a section the passed
    argument must start with a '%'. This method returns a dictionary which
    contains the requested data.
    """
    if not os.path.isfile(specfile):
        raise oscerr.OscIOError(None, f'\'{specfile}\' is not a regular file')

    rpmspec_path = shutil.which("rpmspec")
    if rpmspec_path:
        result = {}
        for arg in args:
            # convert tag to lower case and remove the leading '%'
            tag = arg.lower().lstrip("%")
            cmd = [rpmspec_path, "-q", specfile, "--srpm", "--qf", "%{" + tag + "}"]
            value = subprocess.check_output(cmd, encoding="utf-8")
            if value == "(none)":
                value = ""
            result[arg] = value
        return result

    try:
        lines = codecs.open(specfile, 'r', locale.getpreferredencoding()).readlines()
    except UnicodeDecodeError:
        lines = open(specfile).readlines()

    tags = []
    sections = []
    spec_data = {}

    for itm in args:
        if itm.startswith('%'):
            sections.append(itm)
        else:
            tags.append(itm)

    tag_pat = r'(?P<tag>^%s)\s*:\s*(?P<val>.*)'
    for tag in tags:
        m = re.compile(tag_pat % tag, re.I | re.M).search(''.join(lines))
        if m and m.group('val'):
            spec_data[tag] = m.group('val').strip()

    section_pat = r'^%s\s*?$'
    for section in sections:
        m = re.compile(section_pat % section, re.I | re.M).search(''.join(lines))
        if m is None:
            spec_data[section] = ""
            continue
        start = lines.index(m.group() + '\n') + 1
        data = []
        for line in lines[start:]:
            if line.startswith('%'):
                break
            data.append(line)
        spec_data[section] = data

    return spec_data


def _get_linux_distro():
    if distro is not None:
        return distro.id()
    return None


def get_default_editor():
    system = platform.system()
    if system == 'Linux':
        dist = _get_linux_distro()
        if dist == 'debian':
            return 'editor'
        elif dist == 'fedora':
            return 'vi'
        return 'vim'
    return 'vi'


def format_diff_line(line):
    if line.startswith(b"+++ ") or line.startswith(b"--- ") or line.startswith(b"Index:"):
        line = b"\x1b[1m" + line + b"\x1b[0m"
    elif line.startswith(b"+"):
        line = b"\x1b[32m" + line + b"\x1b[0m"
    elif line.startswith(b"-"):
        line = b"\x1b[31m" + line
+ b"\x1b[0m" elif line.startswith(b"@"): line = b"\x1b[96m" + line + b"\x1b[0m" return line def highlight_diff(diff): if sys.stdout.isatty(): diff = b"\n".join((format_diff_line(line) for line in diff.split(b"\n"))) return diff def run_editor(filename): cmd = _editor_command() cmd.append(filename) return run_external(cmd[0], *cmd[1:]) def _editor_command(): editor = os.getenv("EDITOR", default="").strip() editor = editor or get_default_editor() try: cmd = shlex.split(editor) except SyntaxError: cmd = editor.split() return cmd # list of files with message backups # we'll show this list when osc errors out MESSAGE_BACKUPS = [] def _edit_message_open_editor(filename, data, orig_mtime): editor = _editor_command() mtime = os.stat(filename).st_mtime if isinstance(data, str): data = bytes(data, 'utf-8') if mtime == orig_mtime: # prepare file for editors if editor[0] in ('vi', 'vim'): with tempfile.NamedTemporaryFile() as f: f.write(data) f.flush() editor.extend(['-c', f':r {f.name}', filename]) run_external(editor[0], *editor[1:]) else: with open(filename, 'wb') as f: f.write(data) orig_mtime = os.stat(filename).st_mtime run_editor(filename) else: run_editor(filename) if os.stat(filename).st_mtime != orig_mtime: # file has changed cache_dir = os.path.expanduser(os.path.join(xdg.XDG_CACHE_HOME, "osc", "edited-messages")) try: os.makedirs(cache_dir, mode=0o700) except FileExistsError: pass # remove any stored messages older than 1 day now = datetime.datetime.now() epoch = datetime.datetime.timestamp(now - datetime.timedelta(days=1)) for fn in os.listdir(cache_dir): path = os.path.join(cache_dir, fn) if not os.path.isfile(path): continue mtime = os.path.getmtime(path) if mtime < epoch: os.unlink(path) # store the current message's backup to the cache dir message_backup_path = os.path.join(cache_dir, str(now).replace(" ", "_")) shutil.copyfile(filename, message_backup_path) MESSAGE_BACKUPS.append(message_backup_path) return True return False def edit_message(footer='', template='', templatelen=30): delim = '--This line, and those below, will be ignored--\n' data = '' if template != '': if templatelen is not None: lines = template.splitlines() data = '\n'.join(lines[:templatelen]) if lines[templatelen:]: footer = '%s\n\n%s' % ('\n'.join(lines[templatelen:]), footer) data += '\n' + delim + '\n' + footer return edit_text(data, delim, suffix='.diff', template=template) def edit_text(data='', delim=None, suffix='.txt', template=''): try: (fd, filename) = tempfile.mkstemp(prefix='osc-editor', suffix=suffix) os.close(fd) mtime = os.stat(filename).st_mtime ri_err = False while True: if not ri_err: file_changed = _edit_message_open_editor(filename, data, mtime) msg = open(filename).read() if delim: msg = msg.split(delim)[0].rstrip() if msg and file_changed: break else: reason = 'Log message not specified' if template == msg: reason = 'Default log message was not changed. Press \'c\' to continue.' 
ri = raw_input(f'{reason}\na)bort, c)ontinue, e)dit: ') if ri in 'aA': raise oscerr.UserAbort() elif ri in 'cC': break elif ri in 'eE': ri_err = False else: print(f"{ri} is not a valid option.") ri_err = True finally: os.unlink(filename) return msg def clone_request(apiurl: str, reqid, msg=None): query = {'cmd': 'branch', 'request': reqid} url = makeurl(apiurl, ['source'], query) r = http_POST(url, data=msg) root = xml_fromstring(r.read()) project = None for i in root.findall('data'): if i.get('name') == 'targetproject': project = i.text.strip() if not project: raise oscerr.APIError(f'invalid data from clone request:\n{ET.tostring(root, encoding=ET_ENCODING)}\n') return project # create a maintenance release request def create_release_request(apiurl: str, src_project, message=""): r = Request() # api will complete the request r.add_action('maintenance_release', src_project=src_project) r.description = message r.create(apiurl) return r # create a maintenance incident per request def create_maintenance_request( apiurl: str, src_project, src_packages, tgt_project, tgt_releaseproject, opt_sourceupdate, message="", enforce_branching=False, rev=None, ): r = Request() if src_packages: for p in src_packages: r.add_action('maintenance_incident', src_project=src_project, src_package=p, src_rev=rev, tgt_project=tgt_project, tgt_releaseproject=tgt_releaseproject, opt_sourceupdate=opt_sourceupdate) else: r.add_action('maintenance_incident', src_project=src_project, tgt_project=tgt_project, tgt_releaseproject=tgt_releaseproject, opt_sourceupdate=opt_sourceupdate) r.description = message r.create(apiurl, addrevision=True, enforce_branching=enforce_branching) return r def create_submit_request( apiurl: str, src_project: str, src_package: Optional[str] = None, dst_project: Optional[str] = None, dst_package: Optional[str] = None, message: str = "", orev: Optional[str] = None, src_update: Optional[str] = None, dst_updatelink: Optional[bool] = None, ): from . import obs_api req = obs_api.Request( action_list=[ { "type": "submit", "source": { "project": src_project, "package": src_package, "rev": orev or show_upstream_rev(apiurl, src_project, src_package), }, "target": { "project": dst_project, "package": dst_package, }, "options": { "sourceupdate": src_update, "updatelink": "true" if dst_updatelink else None, } }, ], description=message, ) try: new_req = req.cmd_create(apiurl) except HTTPError as e: if e.hdrs.get('X-Opensuse-Errorcode') == "submit_request_rejected": print('WARNING: As the project is in maintenance, a maintenance incident request is') print('WARNING: being created (instead of a regular submit request). 
If this is not your') print('WARNING: intention please revoke it to avoid unnecessary work for all involved parties.') xpath = f"maintenance/maintains/@project = '{dst_project}' and attribute/@name = '{conf.config['maintenance_attribute']}'" res = search(apiurl, project_id=xpath) root = res['project_id'] project = root.find('project') if project is None: print(f"WARNING: This project is not maintained in the maintenance project specified by '{conf.config['maintenance_attribute']}', looking elsewhere") xpath = f'maintenance/maintains/@project = \'{dst_project}\'' res = search(apiurl, project_id=xpath) root = res['project_id'] project = root.find('project') if project is None: raise oscerr.APIError("Server did not define a default maintenance project, can't submit.") tproject = project.get('name') r = create_maintenance_request(apiurl, src_project, [src_package], tproject, dst_project, src_update, message, rev=orev) return r.reqid else: raise return new_req.id def get_request(apiurl: str, reqid): u = makeurl(apiurl, ['request', reqid], {'withfullhistory': '1'}) f = http_GET(u) root = xml_parse(f).getroot() r = Request() r.read(root, apiurl=apiurl) return r def change_review_state( apiurl: str, reqid, newstate, by_user="", by_group="", by_project="", by_package="", message="", supersed=None ): query = {"cmd": "changereviewstate", "newstate": newstate} if by_user: query['by_user'] = by_user if by_group: query['by_group'] = by_group if by_project: query['by_project'] = by_project if by_package: query['by_package'] = by_package if supersed: query['superseded_by'] = supersed u = makeurl(apiurl, ['request', reqid], query=query) f = http_POST(u, data=message) root = xml_parse(f).getroot() return root.get('code') def change_request_state(apiurl: str, reqid, newstate, message="", supersed=None, force=False, keep_packages_locked=False): query = {"cmd": "changestate", "newstate": newstate} if supersed: query['superseded_by'] = supersed if force: query['force'] = "1" if keep_packages_locked: query['keep_packages_locked'] = "1" u = makeurl(apiurl, ['request', reqid], query=query) f = http_POST(u, data=message) root = xml_parse(f).getroot() return root.get('code', 'unknown') def change_request_state_template(req, newstate): if not req.actions: return '' action = req.actions[0] tmpl_name = f'{action.type}request_{newstate}_template' tmpl = conf.config.get(tmpl_name, "") or "" tmpl = tmpl.replace('\\t', '\t').replace('\\n', '\n') data = {'reqid': req.reqid, 'type': action.type, 'who': req.creator} if req.actions[0].type == 'submit': data.update({'src_project': action.src_project, 'src_package': action.src_package, 'src_rev': action.src_rev, 'dst_project': action.tgt_project, 'dst_package': action.tgt_package, 'tgt_project': action.tgt_project, 'tgt_package': action.tgt_package}) try: return tmpl % data except KeyError as e: print(f'error: cannot interpolate \'{e.args[0]}\' in \'{tmpl_name}\'', file=sys.stderr) return '' def get_review_list( apiurl: str, project="", package="", byuser="", bygroup="", byproject="", bypackage="", states=(), req_type="", req_states=("review",) ): # this is so ugly... 
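    # Illustrative sketch for change_request_state_template() defined above.
    # The template string is an assumption -- it would normally come from a
    # configuration option such as 'submitrequest_accepted_template' (the name
    # is derived from the action type and the new state), and is interpolated
    # with the request data:
    #
    #   tmpl = 'Request %(reqid)s: %(src_project)s/%(src_package)s -> %(tgt_project)s'
    #   tmpl % data  ->  'Request 1234: home:alice/foo -> openSUSE:Factory'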
def build_by(xpath, val): if 'all' in states: return xpath_join(xpath, f'review/{val}', op='and') elif states: s_xp = '' for state in states: s_xp = xpath_join(s_xp, f'@state=\'{state}\'', inner=True) val = val.strip('[').strip(']') return xpath_join(xpath, f'review[{val} and ({s_xp})]', op='and') else: # default case return xpath_join(xpath, f'review[{val} and @state=\'new\']', op='and') return '' xpath = '' # By default we're interested only in reviews of requests that are in state review. for req_state in req_states: xpath = xpath_join(xpath, f"state/@name='{req_state}'", inner=True) xpath = f"({xpath})" if states == (): xpath = xpath_join(xpath, 'review/@state=\'new\'', op='and') if byuser: xpath = build_by(xpath, f'@by_user=\'{byuser}\'') if bygroup: xpath = build_by(xpath, f'@by_group=\'{bygroup}\'') if bypackage: xpath = build_by(xpath, f'@by_project=\'{byproject}\' and @by_package=\'{bypackage}\'') elif byproject: xpath = build_by(xpath, f'@by_project=\'{byproject}\'') if req_type: xpath = xpath_join(xpath, f'action/@type=\'{req_type}\'', op='and') # XXX: we cannot use the '|' in the xpath expression because it is not supported # in the backend todo = {} if project: todo['project'] = project if package: todo['package'] = package for kind, val in todo.items(): xpath_base = 'action/target/@%(kind)s=\'%(val)s\'' if conf.config['include_request_from_project']: xpath_base = xpath_join(xpath_base, 'action/source/@%(kind)s=\'%(val)s\'', op='or', inner=True) xpath = xpath_join(xpath, xpath_base % {'kind': kind, 'val': val}, op='and', nexpr_parentheses=True) output.print_msg(f"[ {xpath} ]", print_to="debug") res = search(apiurl, request=xpath) collection = res['request'] requests = [] for root in collection.findall('request'): r = Request() r.read(root) requests.append(r) return requests # this function uses the logic in the api which is faster and more exact then the xpath search def get_request_collection( apiurl: str, user=None, group=None, roles=None, project=None, package=None, states=None, review_states=None, types: List[str] = None, ids=None, withfullhistory=False ): # We don't want to overload server by requesting everything. # Let's enforce specifying at least some search criteria. 
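    # Illustrative usage sketch (project and package names are made up). With
    # the defaults this results in a collection query roughly like
    #   /request?view=collection&project=openSUSE:Factory&package=osc&states=new,review,declined
    #
    #   for r in get_request_collection(apiurl, project="openSUSE:Factory", package="osc"):
    #       print(r.list_view())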
if not any([user, group, project, package, ids]): raise oscerr.OscValueError("Please specify search criteria") query = {"view": "collection"} if user: query["user"] = user if group: query["group"] = group if roles: query["roles"] = ",".join(roles) if project: query["project"] = project if package: if not project: raise ValueError("Project must be set to query a package; see https://github.com/openSUSE/open-build-service/issues/13075") query["package"] = package states = states or ("new", "review", "declined") if states: if "all" not in states: query["states"] = ",".join(states) if review_states: if "all" not in review_states: query["review_states"] = ",".join(review_states) if types: assert not isinstance(types, str) query["types"] = ",".join(types) if ids: query["ids"] = ",".join(ids) if withfullhistory: query["withfullhistory"] = "1" u = makeurl(apiurl, ['request'], query) f = http_GET(u) res = xml_parse(f).getroot() requests = [] for root in res.findall('request'): r = Request() r.read(root) # post-process results until we switch back to the /search/request # which seems to be more suitable for such queries exclude = False for action in r.actions: src_project = getattr(action, "src_project", None) src_package = getattr(action, "src_package", None) tgt_project = getattr(action, "tgt_project", None) tgt_package = getattr(action, "tgt_package", None) # skip if neither of source and target project matches if "project" in query and query["project"] not in (src_project, tgt_project): exclude = True break # skip if neither of source and target package matches if "package" in query and query["package"] not in (src_package, tgt_package): exclude = True break if not conf.config["include_request_from_project"]: if "project" in query and "package" in query: if (src_project, src_package) == (query["project"], query["package"]): exclude = True break elif "project" in query: if src_project == query["project"]: exclude = True break if exclude: continue requests.append(r) return requests def get_exact_request_list( apiurl: str, src_project: str, dst_project: str, src_package: Optional[str] = None, dst_package: Optional[str] = None, req_who: Optional[str] = None, req_state=("new", "review", "declined"), req_type: Optional[str] = None, ): xpath = "" if "all" not in req_state: for state in req_state: xpath = xpath_join(xpath, f'state/@name=\'{state}\'', op='or', inner=True) xpath = f'({xpath})' if req_who: xpath = xpath_join(xpath, '(state/@who=\'%(who)s\' or history/@who=\'%(who)s\')' % {'who': req_who}, op='and') xpath += f" and action[source/@project='{src_project}'" if src_package: xpath += f" and source/@package='{src_package}'" xpath += f" and target/@project='{dst_project}'" if dst_package: xpath += f" and target/@package='{dst_package}'" xpath += "]" if req_type: xpath += f" and action/@type='{req_type}'" output.print_msg(f"[ {xpath} ]", print_to="debug") res = search(apiurl, request=xpath) collection = res['request'] requests = [] for root in collection.findall('request'): r = Request() r.read(root) requests.append(r) return requests def get_request_list( apiurl: str, project="", package="", req_who="", req_state=("new", "review", "declined"), req_type=None, exclude_target_projects=None, withfullhistory=False, roles=None, ): kwargs = { "apiurl": apiurl, "user": req_who, "roles": roles, "project": project, "package": package, "states": req_state, "withfullhistory": withfullhistory, } if req_type is not None: kwargs["types"] = [req_type] assert not exclude_target_projects, "unsupported" return 
get_request_collection(**kwargs) # old style search, this is to be removed def get_user_projpkgs_request_list( apiurl: str, user, req_state=( "new", "review", "declined", ), req_type=None, exclude_projects=None, projpkgs=None, ): """OBSOLETE: user involved request search is supported by OBS 2.2 server side in a better way Return all running requests for all projects/packages where is user is involved""" exclude_projects = exclude_projects or [] projpkgs = projpkgs or {} if not projpkgs: res = get_user_projpkgs(apiurl, user, exclude_projects=exclude_projects) projects = [] for i in res['project_id'].findall('project'): projpkgs[i.get('name')] = [] projects.append(i.get('name')) for i in res['package_id'].findall('package'): if not i.get('project') in projects: projpkgs.setdefault(i.get('project'), []).append(i.get('name')) if not projpkgs: return [] xpath = '' for prj, pacs in projpkgs.items(): if not pacs: xpath = xpath_join(xpath, f'action/target/@project=\'{prj}\'', inner=True) else: xp = '' for p in pacs: xp = xpath_join(xp, f'action/target/@package=\'{p}\'', inner=True) xp = xpath_join(xp, f'action/target/@project=\'{prj}\'', op='and') xpath = xpath_join(xpath, xp, inner=True) if req_type: xpath = xpath_join(xpath, f'action/@type=\'{req_type}\'', op='and') if 'all' not in req_state: xp = '' for state in req_state: xp = xpath_join(xp, f'state/@name=\'{state}\'', inner=True) xpath = xpath_join(xp, xpath, op='and', nexpr_parentheses=True) res = search(apiurl, request=xpath) result = [] for root in res['request'].findall('request'): r = Request() r.read(root) result.append(r) return result def get_request_log(apiurl: str, reqid): r = get_request(apiurl, reqid) data = [] frmt = '-' * 76 + '\n%s | %s | %s\n\n%s' r.statehistory.reverse() # the description of the request is used for the initial log entry # otherwise its comment attribute would contain None if len(r.statehistory) >= 1: r.statehistory[-1].comment = r.description else: r.state.comment = r.description for state in [r.state] + r.statehistory: s = frmt % (state.name, state.who, state.when, str(state.comment)) data.append(s) return data def check_existing_requests( apiurl: str, src_project: str, src_package: str, dst_project: str, dst_package: str, ask=True ): reqs = get_exact_request_list( apiurl, src_project, dst_project, src_package, dst_package, req_type="submit", req_state=["new", "review", "declined"], ) if not ask: return True, reqs repl = '' if reqs: open_request_string = "The following submit request is already open:" supersede_request_string = "Supersede the old request?" if len(reqs) > 1: open_request_string = "The following submit requests are already open:" supersede_request_string = "Supersede the old requests?" 
print(f"{open_request_string} {', '.join([i.reqid for i in reqs])}.") repl = raw_input(f'{supersede_request_string} (y/n/c) ') while repl.lower() not in ['c', 'y', 'n']: print(f'{repl} is not a valid option.') repl = raw_input(f'{supersede_request_string} (y/n/c) ') if repl.lower() == 'c': print('Aborting', file=sys.stderr) raise oscerr.UserAbort() return repl == 'y', reqs def check_existing_maintenance_requests( apiurl: str, src_project: str, src_packages: List[str], dst_project: str, release_project, ask=True ): reqs = [] for src_package in src_packages: reqs += get_exact_request_list( apiurl, src_project, dst_project, src_package, None, req_type="maintenance_incident", req_state=["new", "review", "declined"], ) if not ask: return True, reqs repl = '' if reqs: open_request_string = "The following maintenance incident request is already open:" supersede_request_string = "Supersede the old request?" if len(reqs) > 1: open_request_string = "The following maintenance incident requests are already open:" supersede_request_string = "Supersede the old requests?" print(f"{open_request_string} {', '.join([i.reqid for i in reqs])}.") repl = raw_input(f'{supersede_request_string} (y/n/c) ') while repl.lower() not in ['c', 'y', 'n']: print(f'{repl} is not a valid option.') repl = raw_input(f'{supersede_request_string} (y/n/c) ') if repl.lower() == 'c': print('Aborting', file=sys.stderr) raise oscerr.UserAbort() return repl == 'y', reqs # old function for compat reasons. Some plugins may call this function. # and we do not want to break the plugins. def get_group(apiurl: str, group: str): return get_group_meta(apiurl, group) def get_group_meta(apiurl: str, group: str): u = makeurl(apiurl, ['group', group]) try: f = http_GET(u) return b''.join(f.readlines()) except HTTPError: print(f'group \'{group}\' not found') return None def get_user_meta(apiurl: str, user: str): u = makeurl(apiurl, ['person', user]) try: f = http_GET(u) return b''.join(f.readlines()) except HTTPError: print(f'user \'{user}\' not found') return None def _get_xml_data(meta, *tags): data = [] if meta is not None: root = xml_fromstring(meta) for tag in tags: elm = root.find(tag) if elm is None or elm.text is None: data.append('-') else: data.append(elm.text) return data def get_user_data(apiurl: str, user: str, *tags): """get specified tags from the user meta""" meta = get_user_meta(apiurl, user) return _get_xml_data(meta, *tags) def get_group_data(apiurl: str, group: str, *tags): meta = get_group_meta(apiurl, group) return _get_xml_data(meta, *tags) def download(url: str, filename, progress_obj=None, mtime=None): global BUFSIZE o = None try: prefix = os.path.basename(filename) path = os.path.dirname(filename) (fd, tmpfile) = tempfile.mkstemp(dir=path, prefix=prefix, suffix='.osctmp') os.fchmod(fd, 0o644) try: o = os.fdopen(fd, 'wb') for buf in streamfile(url, http_GET, BUFSIZE, progress_obj=progress_obj, text=filename): if isinstance(buf, str): o.write(bytes(buf, "utf-8")) else: o.write(buf) o.close() os.rename(tmpfile, filename) except: os.unlink(tmpfile) raise finally: if o is not None: o.close() if mtime: utime(filename, (-1, mtime)) def get_source_file( apiurl: str, prj: str, package: str, filename, targetfilename=None, revision=None, progress_obj=None, mtime=None, meta=False, ): targetfilename = targetfilename or filename query = {} if meta: query['meta'] = 1 if not revision_is_empty(revision): query['rev'] = revision u = makeurl( apiurl, ["source", prj, package, filename], query=query, ) download(u, targetfilename, 
progress_obj, mtime) def get_binary_file( apiurl: str, prj: str, repo: str, arch: str, filename, package: Optional[str] = None, target_filename=None, target_mtime=None, progress_meter=False, ): progress_obj = None if progress_meter: progress_obj = meter.create_text_meter() target_filename = target_filename or filename # create target directory if it doesn't exist target_dir = os.path.dirname(target_filename) if target_dir: try: os.makedirs(target_dir, 0o755) except OSError as e: if e.errno != errno.EEXIST: raise where = package or '_repository' u = makeurl(apiurl, ['build', prj, repo, arch, where, filename]) download(u, target_filename, progress_obj, target_mtime) if target_filename.endswith('.AppImage'): os.chmod(target_filename, 0o755) def dgst(file): # if not os.path.exists(file): # return None global BUFSIZE s = hashlib.md5() f = open(file, 'rb') while True: buf = f.read(BUFSIZE) if not buf: break s.update(buf) f.close() return s.hexdigest() def sha256_dgst(file): global BUFSIZE f = open(file, 'rb') s = hashlib.sha256() while True: buf = f.read(BUFSIZE) if not buf: break s.update(buf) f.close() return s.hexdigest() def binary(data: bytes): """ Return ``True`` if ``data`` is binary data. We're using heuristics according to OBS: src/backend/BSSrcServer/filediff - look for "diff binary detection" """ if b"\0" in data: return True binary_chars = re.findall(b"[\x00-\x07\x0e-\x1f]", data) return len(binary_chars) * 40 > len(data) def binary_file(fn): """read 4096 bytes from a file named fn, and call binary() on the data""" with open(fn, 'rb') as f: return binary(f.read(4096)) def get_source_file_diff(dir, filename, rev, oldfilename=None, olddir=None, origfilename=None): """ This methods diffs oldfilename against filename (so filename will be shown as the new file). The variable origfilename is used if filename and oldfilename differ in their names (for instance if a tempfile is used for filename etc.) 
""" global store if not oldfilename: oldfilename = filename if not olddir: olddir = os.path.join(dir, store, "sources") if not origfilename: origfilename = filename file1 = os.path.join(olddir, oldfilename) # old/stored original file2 = os.path.join(dir, filename) # working copy if binary_file(file1) or binary_file(file2): return [b'Binary file \'%s\' has changed.\n' % origfilename.encode()] f1 = f2 = None try: f1 = open(file1, 'rb') s1 = f1.readlines() f1.close() f2 = open(file2, 'rb') s2 = f2.readlines() f2.close() finally: if f1: f1.close() if f2: f2.close() from_file = b'%s\t(revision %s)' % (origfilename.encode(), str(rev).encode()) to_file = b'%s\t(working copy)' % origfilename.encode() d = difflib.diff_bytes(difflib.unified_diff, s1, s2, fromfile=from_file, tofile=to_file) d = list(d) # python2.7's difflib slightly changed the format # adapt old format to the new format if len(d) > 1: d[0] = d[0].replace(b' \n', b'\n') d[1] = d[1].replace(b' \n', b'\n') # if file doesn't end with newline, we need to append one in the diff result for i, line in enumerate(d): if not line.endswith(b'\n'): d[i] += b'\n\\ No newline at end of file' if i + 1 != len(d): d[i] += b'\n' return d def server_diff( apiurl: str, old_project: str, old_package: str, old_revision: str, new_project: str, new_package: str, new_revision: str, unified=False, missingok=False, meta=False, expand=True, onlyissues=False, full=True, xml=False, files: list = None, ): query: Dict[str, Union[str, int]] = {"cmd": "diff"} if expand: query['expand'] = 1 if old_project: query['oproject'] = old_project if old_package: query['opackage'] = old_package if not revision_is_empty(old_revision): query['orev'] = old_revision if not revision_is_empty(new_revision): query['rev'] = new_revision if unified: query['unified'] = 1 if missingok: query['missingok'] = 1 if meta: query['meta'] = 1 if full: query['filelimit'] = 0 query['tarlimit'] = 0 if onlyissues: query['onlyissues'] = 1 query['view'] = 'xml' query['unified'] = 0 if files: query["file"] = UrlQueryArray(files) u = makeurl(apiurl, ['source', new_project, new_package], query=query) f = http_POST(u) if onlyissues and not xml: del_issue_list = [] add_issue_list = [] chn_issue_list = [] root = xml_fromstring(f.read()) node = root.find('issues') for issuenode in node.findall('issue'): if issuenode.get('state') == 'deleted': del_issue_list.append(issuenode.get('label')) elif issuenode.get('state') == 'added': add_issue_list.append(issuenode.get('label')) else: chn_issue_list.append(issuenode.get('label')) string = 'added:\n----------\n' + '\n'.join(add_issue_list) + \ '\n\nchanged:\n----------\n' + '\n'.join(chn_issue_list) + \ '\n\ndeleted:\n----------\n' + '\n'.join(del_issue_list) return string return f.read() def server_diff_noex( apiurl: str, old_project: str, old_package: str, old_revision: str, new_project: str, new_package: str, new_revision: str, unified=False, missingok=False, meta=False, expand=True, onlyissues=False, xml=False, files: list = None, ): try: return server_diff(apiurl, old_project, old_package, old_revision, new_project, new_package, new_revision, unified, missingok, meta, expand, onlyissues, True, xml, files=files) except HTTPError as e: msg = None body = None try: body = e.read() if b'bad link' not in body: return b'# diff failed: ' + body except: return b'# diff failed with unknown error' if expand: rdiff = b"## diff on expanded link not possible, showing unexpanded version\n" try: rdiff += server_diff_noex(apiurl, old_project, old_package, old_revision, 
new_project, new_package, new_revision, unified, missingok, meta, False, files=files) except: elm = xml_fromstring(body).find('summary') summary = '' if elm is not None and elm.text is not None: summary = elm.text return b'error: diffing failed: %s' % summary.encode() return rdiff def request_diff(apiurl: str, reqid, superseded_reqid=None): query = {'cmd': 'diff'} if superseded_reqid: query['diff_to_superseded'] = superseded_reqid u = makeurl(apiurl, ['request', reqid], query) f = http_POST(u) return f.read() def get_request_issues(apiurl: str, reqid): """ gets a request xml with the issues for the request inside and creates a list 'issue_list' with a dict of the relevant information for the issues. This only works with bugtrackers we can access, like buzilla.o.o """ u = makeurl(apiurl, ['request', reqid], query={'cmd': 'diff', 'view': 'xml', 'withissues': '1'}) f = http_POST(u) request_tree = xml_parse(f).getroot() issue_list = [] for elem in request_tree.iterfind('action/sourcediff/issues/issue'): issue_id = elem.get('name') encode_search = f'@name=\'{issue_id}\'' u = makeurl(apiurl, ['search/issue'], query={'match': encode_search}) f = http_GET(u) collection = xml_parse(f).getroot() for cissue in collection: issue = {} for issue_detail in cissue.iter(): if issue_detail.text: issue[issue_detail.tag] = issue_detail.text.strip() issue_list.append(issue) return issue_list def submit_action_diff(apiurl: str, action: Action): """diff a single submit action""" # backward compatiblity: only a recent api/backend supports the missingok parameter try: return server_diff(apiurl, action.tgt_project, action.tgt_package, None, action.src_project, action.src_package, action.src_rev, True, True) except HTTPError as e: if e.code == 400: try: return server_diff(apiurl, action.tgt_project, action.tgt_package, None, action.src_project, action.src_package, action.src_rev, True, False) except HTTPError as e: if e.code != 404: raise e root = xml_fromstring(e.read()) return b'error: \'%s\' does not exist' % root.findtext("summary").encode() elif e.code == 404: root = xml_fromstring(e.read()) return b'error: \'%s\' does not exist' % root.findtext("summary").encode() raise e def make_dir( apiurl: str, project: str, package: str, pathname=None, prj_dir=None, package_tracking=True, pkg_path=None ): """ creates the plain directory structure for a package dir. The 'apiurl' parameter is needed for the project dir initialization. The 'project' and 'package' parameters specify the name of the project and the package. The optional 'pathname' parameter is used for printing out the message that a new dir was created (default: 'prj_dir/package'). The optional 'prj_dir' parameter specifies the path to the project dir (default: 'project'). If pkg_path is not None store the package's content in pkg_path (no project structure is created) """ prj_dir = prj_dir or project # FIXME: carefully test each patch component of prj_dir, # if we have a .osc/_files entry at that level. # -> if so, we have a package/project clash, # and should rename this path component by appending '.proj' # and give user a warning message, to discourage such clashes if pkg_path is None: pathname = pathname or getTransActPath(os.path.join(prj_dir, package)) pkg_path = os.path.join(prj_dir, package) if is_package_dir(prj_dir): # we want this to become a project directory, # but it already is a package directory. raise oscerr.OscIOError(None, 'checkout_package: package/project clash. 
Moving myself away not implemented') if not is_project_dir(prj_dir): # this directory could exist as a parent direory for one of our earlier # checked out sub-projects. in this case, we still need to initialize it. print(statfrmt('A', prj_dir)) Project.init_project(apiurl, prj_dir, project, package_tracking) if is_project_dir(os.path.join(prj_dir, package)): # the thing exists, but is a project directory and not a package directory # FIXME: this should be a warning message to discourage package/project clashes raise oscerr.OscIOError(None, 'checkout_package: package/project clash. Moving project away not implemented') else: pathname = pkg_path if not os.path.exists(pkg_path): print(statfrmt('A', pathname)) os.mkdir(os.path.join(pkg_path)) # os.mkdir(os.path.join(prj_dir, package, store)) return pkg_path def run_obs_scm_bridge(url: str, target_dir: str): if not os.path.isfile(conf.config.obs_scm_bridge_cmd): raise oscerr.OscIOError(None, "Install the obs-scm-bridge package to work on packages managed in scm (git)!") env = os.environ.copy() env["OSC_VERSION"] = get_osc_version() run_external([conf.config.obs_scm_bridge_cmd, "--outdir", target_dir, "--url", url], env=env) def checkout_package( apiurl: str, project: str, package: str, revision=None, pathname=None, prj_obj=None, expand_link=False, prj_dir: Path=None, server_service_files=None, service_files=None, native_obs_package=False, progress_obj=None, size_limit=None, meta=False, outdir=None, ): try: # the project we're in might be deleted. # that'll throw an error then. olddir = Path.cwd() except FileNotFoundError: olddir = Path(os.environ.get("PWD")) if not prj_dir: prj_dir = olddir else: sep = "/" if conf.config['checkout_no_colon'] else conf.config['project_separator'] prj_dir = Path(str(prj_dir).replace(':', sep)) root_dots = Path('.') oldproj = None if conf.config['checkout_rooted']: if prj_dir.stem == '/': output.print_msg(f"checkout_rooted ignored for {prj_dir}", print_to="verbose") # ?? should we complain if not is_project_dir(prj_dir) ?? else: # if we are inside a project or package dir, ascend to parent # directories, so that all projects are checked out relative to # the same root. if is_project_dir(".."): # if we are in a package dir, goto parent. # Hmm, with 'checkout_no_colon' in effect, we have directory levels that # do not easily reveal the fact, that they are part of a project path. # At least this test should find that the parent of 'home/username/branches' # is a project (hack alert). Also goto parent in this case. root_dots = Path("../") elif is_project_dir("../.."): # testing two levels is better than one. # May happen in case of checkout_no_colon, or # if project roots were previously inconsistent root_dots = Path("../../") if is_project_dir(root_dots): oldproj = store_read_project(root_dots) if conf.config['checkout_no_colon']: n = len(oldproj.split(':')) else: n = 1 root_dots = root_dots / ("../" * n) if str(root_dots) != '.': output.print_msg(f"{prj_dir} is project dir of {oldproj}. 
Root found at {os.path.abspath(root_dots)}", print_to="verbose") prj_dir = root_dots / prj_dir if not pathname: pathname = getTransActPath(os.path.join(prj_dir, package)) # before we create directories and stuff, check if the package actually # exists meta_data = b''.join(show_package_meta(apiurl, project, package)) root = xml_fromstring(meta_data) scmsync_element = root.find("scmsync") if not native_obs_package and scmsync_element is not None and scmsync_element.text is not None: directory = make_dir(apiurl, project, package, pathname, prj_dir, conf.config['do_package_tracking'], outdir) scm_url = scmsync_element.text fetch_obsinfo = "noobsinfo" not in parse_qs(urlparse(scm_url).query) if revision is not None and fetch_obsinfo: # search for the git sha sum based on the OBS DISTURL package source revision # we need also take into account that the url was different at that point of time from .obs_api.scmsync_obsinfo import ScmsyncObsinfo scmsync_obsinfo = ScmsyncObsinfo.from_api(apiurl, project, package, rev=revision) if scmsync_obsinfo.revision: scm_url = f"{scmsync_obsinfo.url}#{scmsync_obsinfo.revision}" else: scm_url = f"{scmsync_obsinfo.url}" run_obs_scm_bridge(url=scm_url, target_dir=directory) Package.init_package(apiurl, project, package, directory, size_limit, meta, progress_obj, scm_url) # add package to /.obs/_packages if not prj_obj: prj_obj = Project(prj_dir) prj_obj.set_state(package, ' ') prj_obj.write_packages() return isfrozen = False if expand_link: # try to read from the linkinfo # if it is a link we use the xsrcmd5 as the revision to be # checked out try: x = show_upstream_xsrcmd5(apiurl, project, package, revision=revision, meta=meta, include_service_files=server_service_files) except: x = show_upstream_xsrcmd5(apiurl, project, package, revision=revision, meta=meta, linkrev='base', include_service_files=server_service_files) if x: isfrozen = True if x: revision = x directory = make_dir(apiurl, project, package, pathname, prj_dir, conf.config['do_package_tracking'], outdir) p = Package.init_package(apiurl, project, package, directory, size_limit, meta, progress_obj) if isfrozen: p.mark_frozen() # no project structure is wanted when outdir is used if conf.config['do_package_tracking'] and outdir is None: # check if we can re-use an existing project object if prj_obj is None: prj_obj = Project(prj_dir) prj_obj.set_state(p.name, ' ') prj_obj.write_packages() p.update(revision, server_service_files, size_limit) if service_files: print('Running all source services local') p.run_source_services() def replace_pkg_meta( pkgmeta, new_name: str, new_prj: str, keep_maintainers=False, dst_userid=None, keep_develproject=False, keep_lock: bool = False, keep_scmsync: bool = True, ): """ update pkgmeta with new new_name and new_prj and set calling user as the only maintainer (unless keep_maintainers is set). Additionally remove the develproject entry () unless keep_develproject is true. 
""" root = xml_fromstring(b''.join(pkgmeta)) root.set('name', new_name) root.set('project', new_prj) # never take releasename, it needs to be explicit for releasename in root.findall('releasename'): root.remove(releasename) if not keep_maintainers: for person in root.findall('person'): root.remove(person) for group in root.findall('group'): root.remove(group) if not keep_develproject: for dp in root.findall('devel'): root.remove(dp) if not keep_lock: for node in root.findall("lock"): root.remove(node) if not keep_scmsync: for node in root.findall("scmsync"): root.remove(node) return ET.tostring(root, encoding=ET_ENCODING) def link_to_branch(apiurl: str, project: str, package: str): """ convert a package with a _link + project.diff to a branch """ if '_link' in meta_get_filelist(apiurl, project, package): u = makeurl(apiurl, ["source", project, package], {"cmd": "linktobranch"}) http_POST(u) else: raise oscerr.OscIOError(None, f'no _link file inside project \'{project}\' package \'{package}\'') def link_pac( src_project: str, src_package: str, dst_project: str, dst_package: str, force: bool, rev=None, cicount=None, disable_publish=False, missing_target=False, vrev=None, disable_build=False, ): """ create a linked package - "src" is the original package - "dst" is the "link" package that we are creating here """ from . import obs_api if src_project == dst_project and src_package == dst_package: raise oscerr.OscValueError("Cannot link package. Source and target are the same.") if not revision_is_empty(rev) and not checkRevision(src_project, src_package, rev): raise oscerr.OscValueError(f"Revision doesn't exist: {rev}") apiurl = conf.config["apiurl"] create_dst_package = False src_package_obj = obs_api.Package.from_api(apiurl, src_project, src_package) try: dst_package_obj = obs_api.Package.from_api(apiurl, dst_project, dst_package) if dst_package_obj.project != dst_project: # If the target package doesn't exist and the target project contains a project link, # the package meta from the linked project is returned instead! # We need to detect it and create the target package based on source package meta. create_dst_package = True except HTTPError as e: if e.code != 404: raise create_dst_package = True if create_dst_package: if missing_target: # we start with empty values because we want has_changed() to return True dst_package_obj = obs_api.Package(project="", name="") else: dst_package_obj = copy.deepcopy(src_package_obj) # purging unwanted fields; see also replace_pkg_meta() # TODO: create Package.clone() or .copy() method instead of this dst_package_obj.devel = None dst_package_obj.group_list = [] dst_package_obj.lock = None dst_package_obj.person_list = [] dst_package_obj.releasename = None dst_package_obj.scmsync = None dst_package_obj.project = dst_project dst_package_obj.name = dst_package if disable_build: dst_package_obj.build_list = [{"flag": "disable"}] if disable_publish: dst_package_obj.publish_list = [{"flag": "disable"}] dst_package_obj.scmsync = None if dst_package_obj.has_changed(): dst_package_obj.to_api(apiurl) # create the _link file # but first, make sure not to overwrite an existing one if '_link' in meta_get_filelist(apiurl, dst_project, dst_package): if force: print('forced overwrite of existing _link file', file=sys.stderr) else: print(file=sys.stderr) print('_link file already exists...! 
Aborting', file=sys.stderr) sys.exit(1) if not revision_is_empty(rev): rev = f' rev="{rev}"' else: rev = '' if vrev: vrev = f' vrev="{vrev}"' else: vrev = '' missingok = '' if missing_target: missingok = ' missingok="true"' if cicount: cicount = f' cicount="{cicount}"' else: cicount = '' print('Creating _link...', end=' ') project = '' if src_project != dst_project: project = f'project="{src_project}"' link_template = """\ """ % (project, src_package, missingok, rev, vrev, cicount) u = makeurl(apiurl, ['source', dst_project, dst_package, '_link']) http_PUT(u, data=link_template) print('Done.') def aggregate_pac( src_project: str, src_package: str, dst_project: str, dst_package: str, repo_map: Optional[dict] = None, disable_publish=False, nosources=False, repo_check=True, ): """ aggregate package - "src" is the original package - "dst" is the "aggregate" package that we are creating here - "map" is a dictionary SRC => TARGET repository mappings - "repo_check" determines if presence of repos in the source and destination repos is checked """ if (src_project, src_package) == (dst_project, dst_package): raise oscerr.OscValueError("Cannot aggregate package. Source and target are the same.") meta_change = False dst_meta = '' apiurl = conf.config['apiurl'] repo_map = repo_map or {} # we need to remove :flavor from the package names when accessing meta src_package_meta = src_package.split(":")[0] dst_package_meta = dst_package.split(":")[0] try: dst_meta = meta_exists(metatype='pkg', path_args=(dst_project, dst_package_meta), template_args=None, create_new=False, apiurl=apiurl) root = xml_fromstring(parse_meta_to_string(dst_meta)) if root.get('project') != dst_project: # The source comes from a different project via a project link, we need to create this instance meta_change = True except HTTPError as e: if e.code != 404: raise meta_change = True if repo_check: src_repos = set(get_repositories_of_project(apiurl, src_project)) dst_repos = set(get_repositories_of_project(apiurl, dst_project)) if repo_map: map_from = set(repo_map.keys()) map_to = set(repo_map.values()) # only repos that do not exist in src/dst remain delta_from = map_from - src_repos delta_to = map_to - dst_repos if delta_from or delta_to: msg = ["The following repos in repo map do not exist"] if delta_from: msg += [" Source repos: " + ", ".join(sorted(delta_from))] if delta_to: msg += [" Destination repos: " + ", ".join(sorted(delta_to))] raise oscerr.OscBaseError("\n".join(msg)) else: # no overlap between src and dst repos leads to the 'broken: missing repositories: ' message if not src_repos & dst_repos: msg = [ "The source and the destination project do not have any repository names in common.", "Use repo map to specify actual repository mapping.", ] raise oscerr.OscBaseError("\n".join(msg)) if meta_change: src_meta = show_package_meta(apiurl, src_project, src_package_meta) dst_meta = replace_pkg_meta(src_meta, dst_package_meta, dst_project) meta_change = True if disable_publish: meta_change = True root = xml_fromstring(''.join(dst_meta)) elm = root.find('publish') if not elm: elm = ET.SubElement(root, 'publish') elm.clear() ET.SubElement(elm, 'disable') dst_meta = ET.tostring(root, encoding=ET_ENCODING) if meta_change: edit_meta('pkg', path_args=(dst_project, dst_package_meta), data=dst_meta) # create the _aggregate file # but first, make sure not to overwrite an existing one if '_aggregate' in meta_get_filelist(apiurl, dst_project, dst_package_meta): print(file=sys.stderr) print('_aggregate file already exists...! 
Aborting', file=sys.stderr) sys.exit(1) print('Creating _aggregate...', end=' ') aggregate_template = f""" """ aggregate_template += f""" {src_package} """ if nosources: aggregate_template += """\ """ for src, tgt in repo_map.items(): aggregate_template += f""" """ aggregate_template += """\ """ u = makeurl(apiurl, ['source', dst_project, dst_package_meta, '_aggregate']) http_PUT(u, data=aggregate_template) print('Done.') def attribute_branch_pkg( apiurl: str, attribute: str, maintained_update_project_attribute, package: str, targetproject: str, return_existing=False, force=False, noaccess=False, add_repositories=False, dryrun=False, nodevelproject=False, maintenance=False, ): """ Branch packages defined via attributes (via API call) """ query = {'cmd': 'branch'} query['attribute'] = attribute if targetproject: query['target_project'] = targetproject if dryrun: query['dryrun'] = "1" if force: query['force'] = "1" if noaccess: query['noaccess'] = "1" if nodevelproject: query['ignoredevel'] = '1' if add_repositories: query['add_repositories'] = "1" if maintenance: query['maintenance'] = "1" if package: query['package'] = package if maintained_update_project_attribute: query['update_project_attribute'] = maintained_update_project_attribute u = makeurl(apiurl, ['source'], query=query) f = None try: f = http_POST(u) except HTTPError as e: root = xml_fromstring(e.read()) summary = root.find('summary') if summary is not None and summary.text is not None: raise oscerr.APIError(summary.text) msg = f'unexpected response: {ET.tostring(root, encoding=ET_ENCODING)}' raise oscerr.APIError(msg) r = None root = xml_fromstring(f.read()) if dryrun: return root # TODO: change api here and return parsed XML as class if conf.config['http_debug']: print(ET.tostring(root, encoding=ET_ENCODING), file=sys.stderr) for node in root.findall('data'): r = node.get('name') if r and r == 'targetproject': return node.text return r def branch_pkg( apiurl: str, src_project: str, src_package: str, nodevelproject=False, rev=None, linkrev=None, target_project: Optional[str] = None, target_package=None, return_existing=False, msg="", force=False, noaccess=False, add_repositories=False, add_repositories_block=None, add_repositories_rebuild=None, extend_package_names=False, missingok=False, maintenance=False, newinstance=False, disable_build=False, ): """ Branch a package (via API call) """ # BEGIN: Error out on branching scmsync packages; this should be properly handled in the API # read src_package meta try: m = b"".join(show_package_meta(apiurl, src_project, src_package)) root = xml_fromstring(m) except HTTPError as e: if e.code == 404 and missingok: root = None else: raise devel_project = None devel_package = None if root is not None and not nodevelproject: devel_node = root.find("devel") if devel_node is not None: devel_project = devel_node.get("project") devel_package = devel_node.get("package", src_package) if devel_project: # replace src_package meta with devel_package meta because we're about branch from devel m = b"".join(show_package_meta(apiurl, devel_project, devel_package)) root = xml_fromstring(m) # error out if we're branching a scmsync package (we'd end up with garbage anyway) if root is not None and root.find("scmsync") is not None: msg = ("osc cannot branch packages with , i.e. externally " "managed sources. Often, the URL for cloning is also the URL " "for a collaborative web interface where you can fork (branch). 
" "The scmsync URL was: " + root.find("scmsync").text) if devel_project: raise oscerr.PackageError(devel_project, devel_package, msg) raise oscerr.PackageError(src_project, src_package, msg) # END: Error out on branching scmsync packages; this should be properly handled in the API query = {'cmd': 'branch'} if nodevelproject: query['ignoredevel'] = '1' if force: query['force'] = '1' if noaccess: query['noaccess'] = '1' if add_repositories: query['add_repositories'] = "1" if add_repositories_block: query['add_repositories_block'] = add_repositories_block if add_repositories_rebuild: query['add_repositories_rebuild'] = add_repositories_rebuild if maintenance: query['maintenance'] = "1" if missingok: query['missingok'] = "1" if newinstance: query['newinstance'] = "1" if extend_package_names: query['extend_package_names'] = "1" if not revision_is_empty(rev): query['rev'] = rev if not revision_is_empty(linkrev): query['linkrev'] = linkrev if target_project: query['target_project'] = target_project if target_package: query['target_package'] = target_package if msg: query['comment'] = msg u = makeurl(apiurl, ['source', src_project, src_package], query=query) try: f = http_POST(u) except HTTPError as e: root = xml_fromstring(e.read()) if missingok: if root and root.get('code') == "not_missing": raise oscerr.NotMissing("Package exists already via project link, but link will point to given project") summary = root.find('summary') if summary is None: raise oscerr.APIError(f'unexpected response:\n{ET.tostring(root, encoding=ET_ENCODING)}') if not return_existing: raise oscerr.APIError(f'failed to branch: {summary.text}') m = re.match(r"branch target package already exists: (\S+)/(\S+)", summary.text) if not m: e.msg += '\n' + summary.text raise return (True, m.group(1), m.group(2), None, None) root = xml_fromstring(f.read()) if conf.config['http_debug']: print(ET.tostring(root, encoding=ET_ENCODING), file=sys.stderr) data = {} for i in root.findall('data'): data[i.get('name')] = i.text if disable_build: target_meta = show_package_meta(apiurl, data["targetproject"], data["targetpackage"]) root = xml_fromstring(b''.join(target_meta)) elm = root.find('build') if not elm: elm = ET.SubElement(root, 'build') elm.clear() ET.SubElement(elm, 'disable') target_meta = ET.tostring(root, encoding=ET_ENCODING) edit_meta('pkg', path_args=(data["targetproject"], data["targetpackage"]), data=target_meta) return (False, data.get('targetproject', None), data.get('targetpackage', None), data.get('sourceproject', None), data.get('sourcepackage', None)) def copy_pac( src_apiurl: str, src_project: str, src_package: str, dst_apiurl: str, dst_project: str, dst_package: str, client_side_copy=False, keep_maintainers=False, keep_develproject=False, expand=False, revision=None, comment=None, force_meta_update=None, keep_link=None, ): """ Create a copy of a package. Copying can be done by downloading the files from one package and commit them into the other by uploading them (client-side copy) -- or by the server, in a single api call. """ if (src_apiurl, src_project, src_package) == (dst_apiurl, dst_project, dst_package): # special cases when source and target can be the same: # * expanding sources # * downgrading package to an old revision if not any([expand, revision]): raise oscerr.OscValueError("Cannot copy package. 
Source and target are the same.") meta = None if not (src_apiurl == dst_apiurl and src_project == dst_project and src_package == dst_package): src_meta = show_package_meta(src_apiurl, src_project, src_package) dst_userid = conf.get_apiurl_usr(dst_apiurl) meta = replace_pkg_meta(src_meta, dst_package, dst_project, keep_maintainers, dst_userid, keep_develproject, keep_scmsync=(not client_side_copy)) url = make_meta_url('pkg', (dst_project, dst_package), dst_apiurl) found = None try: found = http_GET(url).readlines() except HTTPError as e: pass if force_meta_update or not found: print('Sending meta data...') u = makeurl(dst_apiurl, ['source', dst_project, dst_package, '_meta']) http_PUT(u, data=meta) if meta is None: meta = show_files_meta(dst_apiurl, dst_project, dst_package) root = xml_fromstring(meta) if root.find("scmsync") is not None: print("Note: package source is managed via SCM") return print('Copying files...') if not client_side_copy: query = {'cmd': 'copy', 'oproject': src_project, 'opackage': src_package} if expand or keep_link: query['expand'] = '1' if keep_link: query['keeplink'] = '1' if not revision_is_empty(revision): query['orev'] = revision if comment: query['comment'] = comment u = makeurl(dst_apiurl, ['source', dst_project, dst_package], query=query) f = http_POST(u) return f.read() else: # copy one file after the other query = {'rev': 'upload'} xml = show_files_meta(src_apiurl, src_project, src_package, expand=expand, revision=revision) filelist = xml_fromstring(xml) revision = filelist.get('srcmd5') # filter out _service: files for entry in filelist.findall('entry'): # hmm the old code also checked for _service_ (but this is # probably a relict from former times (if at all)) if entry.get('name').startswith('_service:'): filelist.remove(entry) tfilelist = Package.commit_filelist(dst_apiurl, dst_project, dst_package, filelist, msg=comment) todo = Package.commit_get_missing(tfilelist) for filename in todo: print(' ', filename) # hmm ideally, we would pass a file-like (that delegates to # streamfile) to http_PUT... with tempfile.NamedTemporaryFile(prefix='osc-copypac') as f: get_source_file(src_apiurl, src_project, src_package, filename, targetfilename=f.name, revision=revision) path = ['source', dst_project, dst_package, filename] u = makeurl(dst_apiurl, path, query={'rev': 'repository'}) http_PUT(u, file=f.name) tfilelist = Package.commit_filelist(dst_apiurl, dst_project, dst_package, filelist, msg=comment) todo = Package.commit_get_missing(tfilelist) if todo: raise oscerr.APIError(f"failed to copy: {', '.join(todo)}") return 'Done.' 
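# --------------------------------------------------------------------------
# (editor sketch) minimal illustration of how the copy/link helpers above can
# be called from client code.  The apiurl, project and package names are
# hypothetical placeholders, not values defined in this module.
#
#   apiurl = conf.config["apiurl"]
#
#   # server-side copy, expanding a possible _link of the source package
#   copy_pac(apiurl, "home:alice", "foo",
#            apiurl, "home:alice:staging", "foo",
#            expand=True, comment="sync staging copy")
#
#   # create a _link in the target package instead of a full copy
#   link_pac("home:alice", "foo", "home:alice:staging", "foo", force=False)
# --------------------------------------------------------------------------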
def lock(apiurl: str, project: str, package: str, msg: str = None): url_path = ["source", project] if package: url_path += [package] url_query = { "cmd": "set_flag", "flag": "lock", "status": "enable", } if msg: url_query["comment"] = msg _private.api.post(apiurl, url_path, url_query) def unlock_package(apiurl: str, prj: str, pac: str, msg): query = {'cmd': 'unlock', 'comment': msg} u = makeurl(apiurl, ['source', prj, pac], query) http_POST(u) def unlock_project(apiurl: str, prj: str, msg=None): query = {'cmd': 'unlock', 'comment': msg} u = makeurl(apiurl, ['source', prj], query) http_POST(u) def undelete_package(apiurl: str, prj: str, pac: str, msg=None): query = {'cmd': 'undelete'} if msg: query['comment'] = msg else: query['comment'] = 'undeleted via osc' u = makeurl(apiurl, ['source', prj, pac], query) http_POST(u) def undelete_project(apiurl: str, prj: str, msg=None): query = {'cmd': 'undelete'} if msg: query['comment'] = msg else: query['comment'] = 'undeleted via osc' u = makeurl(apiurl, ['source', prj], query) http_POST(u) def delete_package(apiurl: str, prj: str, pac: str, force=False, msg=None): if not force: requests = get_request_collection(apiurl, project=prj, package=pac) if requests: error_msg = \ "Package has pending requests. Deleting the package will break them. " \ "They should be accepted/declined/revoked before deleting the package. " \ "Or just use the 'force' option" raise oscerr.PackageError(prj, pac, error_msg) query = {} if force: query['force'] = "1" if msg: query['comment'] = msg u = makeurl(apiurl, ['source', prj, pac], query) http_DELETE(u) def delete_project(apiurl: str, prj: str, force=False, msg=None, recursive=False): if not recursive: packages = meta_get_packagelist(apiurl, prj) if packages: error_msg = \ "Project contains packages. It must be empty before deleting it. " \ "If you are sure that you want to remove this project and all its " \ "packages use the 'recursive' option." raise oscerr.ProjectError(prj, error_msg) query = {} if force: query['force'] = "1" if msg: query['comment'] = msg u = makeurl(apiurl, ['source', prj], query) http_DELETE(u) def delete_files(apiurl: str, prj: str, pac: str, files): for filename in files: u = makeurl(apiurl, ['source', prj, pac, filename], query={'comment': f'removed {filename}'}) http_DELETE(u) # old compat lib call def get_platforms(apiurl: str): return get_repositories(apiurl) def get_repositories(apiurl: str): f = http_GET(makeurl(apiurl, ['platform'])) tree = xml_parse(f) r = sorted(node.get('name') for node in tree.getroot()) return r def get_distributions(apiurl: str): """Returns list of dicts with headers 'distribution', 'project', 'repository', 'reponame'""" f = http_GET(makeurl(apiurl, ['distributions'])) root = xml_fromstring(b''.join(f)) distlist = [] for node in root.findall('distribution'): dmap = {} for child in node: if child.tag == 'name': dmap['distribution'] = child.text elif child.tag in ('project', 'repository', 'reponame'): dmap[child.tag] = child.text distlist.append(dmap) return distlist # old compat lib call def get_platforms_of_project(apiurl: str, prj: str): return get_repositories_of_project(apiurl, prj) def get_repositories_of_project(apiurl: str, prj: str): from . 
import obs_api project_obj = obs_api.Project.from_api(apiurl, prj) return [i.name for i in project_obj.repository_list or []] class Repo: repo_line_templ = '%-15s %-10s' def __init__(self, name: str, arch: str): self.name = name self.arch = arch def __str__(self): return self.repo_line_templ % (self.name, self.arch) def __repr__(self): return f'Repo({self.name} {self.arch})' @staticmethod def fromfile(filename): if not os.path.exists(filename): return [] repos = [] lines = open(filename).readlines() for line in lines: data = line.split() if len(data) == 2: repos.append(Repo(data[0], data[1])) elif len(data) == 1: # only for backward compatibility repos.append(Repo(data[0], '')) return repos @staticmethod def tofile(filename, repos): with open(filename, 'w') as f: for repo in repos: f.write(f'{repo.name} {repo.arch}\n') def get_repos_of_project(apiurl: str, prj: str): from . import obs_api project_obj = obs_api.Project.from_api(apiurl, prj) for repo in project_obj.repository_list or []: for arch in repo.arch_list or []: yield Repo(repo.name, arch) def get_binarylist( apiurl: str, prj: str, repo: str, arch: str, package: Optional[str] = None, verbose=False, withccache=False ): what = package or '_repository' query = {} if withccache: query['withccache'] = 1 u = makeurl(apiurl, ['build', prj, repo, arch, what], query=query) f = http_GET(u) tree = xml_parse(f) if not verbose: return [node.get('filename') for node in tree.findall('binary')] else: l = [] for node in tree.findall('binary'): f = File(node.get('filename'), None, int(node.get('size') or 0) or None, int(node.get('mtime') or 0) or None) l.append(f) return l def get_binarylist_published(apiurl: str, prj: str, repo: str, arch: str): u = makeurl(apiurl, ['published', prj, repo, arch]) f = http_GET(u) tree = xml_parse(f) r = [node.get('name') for node in tree.findall('entry')] return r def show_results_meta( apiurl: str, prj: str, package: Optional[str] = None, lastbuild: Optional[str] = None, repository: Optional[List[str]] = None, arch: Optional[List[str]] = None, oldstate: Optional[str] = None, multibuild: Optional[bool] = None, locallink: Optional[bool] = None, code: Optional[str] = None, ): repository = repository or [] arch = arch or [] query = {} query["package"] = package query["oldstate"] = oldstate query["lastbuild"] = lastbuild query["multibuild"] = multibuild query["locallink"] = locallink query["code"] = code query["repository"] = repository query["arch"] = arch u = makeurl(apiurl, ['build', prj, '_result'], query=query) f = http_GET(u) return f.readlines() def show_prj_results_meta( apiurl: str, prj: str, repositories: Optional[List[str]] = None, arches: Optional[List[str]] = None ): # this function is only needed for backward/api compatibility if repositories is None: repositories = [] if arches is None: arches = [] return show_results_meta(apiurl, prj, repository=repositories, arch=arches) def result_xml_to_dicts(xml): # assumption: xml contains at most one status element (maybe we should # generalize this to arbitrary status element) root = xml_fromstring(xml) for node in root.findall('result'): rmap = {} rmap['project'] = rmap['prj'] = node.get('project') rmap['repository'] = rmap['repo'] = rmap['rep'] = node.get('repository') rmap['arch'] = node.get('arch') rmap['state'] = node.get('state') rmap['dirty'] = node.get('dirty') == 'true' or node.get('code') == 'blocked' rmap['repostate'] = node.get('code') rmap['pkg'] = rmap['package'] = rmap['pac'] = '' rmap['code'] = node.get('code') rmap['details'] = node.get('details') 
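        # (editor note) rmap mirrors the attributes of a <result> element; the
        # duplicate keys ('prj', 'repo'/'rep', 'pac'/'pkg') are just aliases so
        # that callers can use short names in %-format strings such as
        # '%(repo)s' or '%(rep)s'.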
# the way we currently use this function, there should be # always a status element snodes = node.findall('status') is_multi = len(snodes) > 1 if len(snodes) < 1: # the repository setup is broken smap = dict(rmap) smap['pkg'] = "_repository" smap['code'] = rmap['repostate'] smap['details'] = node.get('details') yield smap, is_multi continue for statusnode in snodes: smap = dict(rmap) smap['pkg'] = smap['package'] = smap['pac'] = statusnode.get('package') smap['code'] = statusnode.get('code', '') details = statusnode.find('details') if details is not None: smap['details'] = details.text if rmap['code'] == 'broken': # real error just becomes visible in details/verbose smap['code'] = rmap['code'] smap['details'] = "repository: " + rmap['details'] yield smap, is_multi def format_results(results, format): """apply selected format on each dict in results and return it as a list of strings""" return [format % r for r in results] def get_results( apiurl: str, project: str, package: str, verbose=False, printJoin="", out: Optional[dict] = None, *args, **kwargs ): """returns list of/or prints a human readable status for the specified package""" # hmm the function name is a bit too generic - something like # get_package_results_human would be better, but this would break the existing # api (unless we keep get_results around as well)... format = kwargs.pop('format', None) if format is None: format = '%(rep)-20s %(arch)-10s %(pkg)-30s %(status)s' r = [] printed = False failed = False multibuild_packages = kwargs.pop('multibuild_packages', []) show_excluded = kwargs.pop('showexcl', False) code_filter = kwargs.get('code', None) for results in get_package_results(apiurl, project, package, **kwargs): r = [] for res, is_multi in result_xml_to_dicts(results): if not show_excluded and res['code'] == 'excluded': continue if '_oldstate' in res: oldstate = res['_oldstate'] continue if multibuild_packages: l = res['pkg'].rsplit(':', 1) if (len(l) != 2 or l[1] not in multibuild_packages) and not (len(l) == 1 and "" in multibuild_packages): # special case: packages without flavor when multibuild_packages contains an empty string continue res['status'] = res['code'] if verbose and res['details'] is not None: if res['code'] in ('unresolvable', 'expansion error'): lines = res['details'].split(',') res['status'] += ': \n ' + '\n '.join(lines) else: res['status'] += f": {res['details']}" elif res['code'] in ('scheduled', ) and res['details']: # highlight scheduled jobs with possible dispatch problems res['status'] += '*' if res['dirty']: if verbose: res['status'] = f"outdated (was: {res['status']})" else: res['status'] += '*' elif res['code'] in ('succeeded', ) and res['repostate'] != "published": if verbose: res['status'] += '(unpublished)' else: res['status'] += '*' # we need to do the code filtering again, because result_xml_to_dicts returns the code # of the repository if the result is already prefiltered by the backend. So we need # to filter out the repository states. 
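            # (editor note) with the default format string
            # '%(rep)-20s %(arch)-10s %(pkg)-30s %(status)s' a line of output
            # looks roughly like:
            #   standard             x86_64     foo                            succeeded
            # (illustrative only; callers may pass their own format)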
            if code_filter is None or code_filter == res['code']:
                r.append(format % res)
            if res['code'] in ('failed', 'broken', 'unresolvable'):
                failed = True
        if printJoin:
            if printed:
                # will print a newline if already a result was printed (improves readability)
                print()
            print(printJoin.join(r))
            printed = True

    if out is None:
        out = {}
    out["failed"] = failed
    return r


def get_package_results(apiurl: str, project: str, package: Optional[str] = None, wait=False,
                        multibuild_packages: Optional[List[str]] = None, *args, **kwargs):
    """generator that returns the package results as an xml structure"""
    xml = b''
    waiting_states = ('blocked', 'scheduled', 'dispatching', 'building', 'signing', 'finished')
    while True:
        waiting = False
        try:
            xml = b''.join(show_results_meta(apiurl, project, package, *args, **kwargs))
        except HTTPError as e:
            # check for simple timeout error and fetch again
            if e.code == 502 or e.code == 504:
                # re-try result request
                continue
            root = xml_fromstring(e.read())
            if e.code == 400 and kwargs.get('multibuild') and re.search('multibuild', getattr(root.find('summary'), 'text', '')):
                kwargs['multibuild'] = None
                kwargs['locallink'] = None
                continue
            raise

        root = xml_fromstring(xml)
        kwargs['oldstate'] = root.get('state')
        for result in root.findall('result'):
            if result.get('dirty') is not None:
                waiting = True
                break
            elif result.get('code') in waiting_states:
                waiting = True
                break
            else:
                pkg = result.find('status')
                if pkg is not None and pkg.get('code') in waiting_states:
                    waiting = True
                    break

        # filter the result according to the specified multibuild_packages (flavors)
        if multibuild_packages:
            for result in list(root):
                for status in list(result):
                    package = status.attrib["package"]
                    package_flavor = package.rsplit(":", 1)

                    # package has flavor, check if the flavor is in multibuild_packages
                    flavor_match = len(package_flavor) == 2 and package_flavor[1] in multibuild_packages

                    # package has no flavor, check if "" is in multibuild_packages
                    no_flavor_match = len(package_flavor) == 1 and "" in multibuild_packages

                    if not flavor_match and not no_flavor_match:
                        # package doesn't match multibuild_packages, remove the corresponding status element
                        result.remove(status)

                # remove result elements that have no status children left
                if len(result) == 0:
                    root.remove(result)

            if len(root) == 0:
                break

            xmlindent(root)
            xml = ET.tostring(root)

        if not wait or not waiting:
            break
        else:
            yield xml
    yield xml


def get_prj_results(
    apiurl: str,
    prj: str,
    hide_legend=False,
    csv=False,
    status_filter=None,
    name_filter=None,
    arch=None,
    repo=None,
    vertical=None,
    show_excluded=None,
    brief=False,
):
    # print '----------------------------------------'
    global buildstatus_symbols

    r = []

    f = show_prj_results_meta(apiurl, prj)
    root = xml_fromstring(b''.join(f))

    if name_filter is not None:
        name_filter = re.compile(name_filter)

    pacs = []
    # sequence of (repo,arch) tuples
    targets = []
    # {package: {(repo,arch): status}}
    status = {}
    if root.find('result') is None:
        return []
    for results in root.findall('result'):
        for node in results.findall('status'):
            pacs.append(node.get('package'))
    pacs = sorted(list(set(pacs)))
    for node in root.findall('result'):
        # filter architecture and repository
        if arch and node.get('arch') not in arch:
            continue
        if repo and node.get('repository') not in repo:
            continue
        if node.get('dirty') == "true":
            state = "outdated"
        else:
            state = node.get('state')
        if node.get('details'):
            state += ' details: ' + node.get('details')

        tg = (node.get('repository'), node.get('arch'), state)
        targets.append(tg)
        for pacnode in node.findall('status'):
            pac = pacnode.get('package')
            if pac not in status:
                status[pac] = {}
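                # (editor sketch) each <result> element walked here carries
                # repository/arch/state (and optionally dirty/details) attributes
                # and <status package=... code=...> children, e.g.
                #   <result repository="standard" arch="x86_64" state="published">
                #     <status package="foo" code="succeeded"/>
                #   </result>
                # attribute names come from the parsing code above; the values
                # are illustrative only.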
status[pac][tg] = pacnode.get('code') targets.sort() # filter option filters = [] if status_filter or name_filter or not show_excluded: pacs_to_show = [] targets_to_show = [] # filtering for Package Status if status_filter: if status_filter in buildstatus_symbols.values(): # a list is needed because if status_filter == "U" # we have to filter either an "expansion error" (obsolete) # or an "unresolvable" state for txt, sym in buildstatus_symbols.items(): if sym == status_filter: filters.append(txt) else: filters.append(status_filter) for filt_txt in filters: for pkg in status.keys(): for repo in status[pkg].keys(): if status[pkg][repo] == filt_txt: if not name_filter: pacs_to_show.append(pkg) targets_to_show.append(repo) elif name_filter.search(pkg) is not None: pacs_to_show.append(pkg) # filtering for Package Name elif name_filter: for pkg in pacs: if name_filter.search(pkg) is not None: pacs_to_show.append(pkg) # filter non building states elif not show_excluded: enabled = {} for pkg in status.keys(): showpkg = False for repo in status[pkg].keys(): if status[pkg][repo] != "excluded": enabled[repo] = 1 showpkg = True if showpkg: pacs_to_show.append(pkg) targets_to_show = enabled.keys() pacs = [i for i in pacs if i in pacs_to_show] if targets_to_show: targets = [i for i in targets if i in targets_to_show] # csv output if csv: # TODO: option to disable the table header row = ['_'] + ['/'.join(tg) for tg in targets] r.append(';'.join(row)) for pac in pacs: row = [pac] + [status[pac][tg] for tg in targets if tg in status[pac]] r.append(';'.join(row)) return r if brief: for pac, repo_states in status.items(): for repo, state in repo_states.items(): if filters and state not in filters: continue r.append(f'{pac} {repo[0]} {repo[1]} {state}') return r if not vertical: # human readable output max_pacs = 40 for startpac in range(0, len(pacs), max_pacs): offset = 0 for pac in pacs[startpac:startpac + max_pacs]: r.append(' |' * offset + ' ' + pac) offset += 1 for tg in targets: line = [] line.append(' ') for pac in pacs[startpac:startpac + max_pacs]: st = '' if pac not in status or tg not in status[pac]: # for newly added packages, status may be missing st = '?' else: try: st = buildstatus_symbols[status[pac][tg]] except: print(f'osc: warn: unknown status \'{status[pac][tg]}\'...') print('please edit osc/core.py, and extend the buildstatus_symbols dictionary.') st = '?' buildstatus_symbols[status[pac][tg]] = '?' line.append(st) line.append(' ') line.append(' %s %s (%s)' % tg) line = ''.join(line) r.append(line) r.append('') else: offset = 0 for tg in targets: r.append('| ' * offset + '%s %s (%s)' % tg) offset += 1 for pac in pacs: line = [] for tg in targets: st = '' if pac not in status or tg not in status[pac]: # for newly added packages, status may be missing st = '?' else: try: st = buildstatus_symbols[status[pac][tg]] except: print(f'osc: warn: unknown status \'{status[pac][tg]}\'...') print('please edit osc/core.py, and extend the buildstatus_symbols dictionary.') st = '?' buildstatus_symbols[status[pac][tg]] = '?' line.append(st) line.append(' ' + pac) r.append(' '.join(line)) line = [] for i in range(0, len(targets)): line.append(str(i % 10)) r.append(' '.join(line)) r.append('') if not hide_legend and len(pacs): r.append(' Legend:') legend = [] for i, j in buildstatus_symbols.items(): if i == "expansion error": continue legend.append('%3s %-20s' % (j, i)) legend.append(' ? 
buildstatus not available (only new packages)') if vertical: for i in range(0, len(targets)): s = '%1d %s %s (%s)' % (i % 10, targets[i][0], targets[i][1], targets[i][2]) if i < len(legend): legend[i] += s else: legend.append(' ' * 24 + s) r += legend return r def streamfile(url: str, http_meth=http_GET, bufsize=8192, data=None, progress_obj=None, text=None): """ performs http_meth on url and read bufsize bytes from the response until EOF is reached. After each read bufsize bytes are yielded to the caller. A spezial usage is bufsize="line" to read line by line (text). """ cl = '' retries = 0 # Repeat requests until we get reasonable Content-Length header # Server (or iChain) is corrupting data at some point, see bnc#656281 while cl == '': if retries >= int(conf.config['http_retries']): raise oscerr.OscIOError(None, f'Content-Length is empty for {url}, protocol violation') retries = retries + 1 if retries > 1 and conf.config['http_debug']: print('\n\nRetry %d --' % (retries - 1), url, file=sys.stderr) f = http_meth.__call__(url, data=data) cl = f.info().get('Content-Length') if cl is not None: # sometimes the proxy adds the same header again # which yields in value like '3495, 3495' # use the first of these values (should be all the same) cl = cl.split(',')[0] cl = int(cl) if progress_obj: if not text: basename = os.path.basename(urlsplit(url)[2]) else: basename = text progress_obj.start(basename, cl) if bufsize == "line": bufsize = 8192 xread = f.readline else: xread = f.read read = 0 while True: data = xread(bufsize) if not data: break read += len(data) if progress_obj: progress_obj.update(read) yield data if progress_obj: progress_obj.end() f.close() if cl is not None and read != cl: raise oscerr.OscIOError(None, 'Content-Length is not matching file size for %s: %i vs %i file size' % (url, cl, read)) def buildlog_strip_time(data): """Strips the leading build time from the log""" if isinstance(data, str): time_regex = re.compile(r'^\[[^\]]*\] ', re.M) return time_regex.sub('', data) else: time_regex = re.compile(br'^\[[^\]]*\] ', re.M) return time_regex.sub(b'', data) def print_buildlog( apiurl: str, prj: str, package: str, repository: str, arch: str, offset=0, strip_time=False, last=False, lastsucceeded=False, output_buffer=None, ): """prints out the buildlog on stdout""" output_buffer = output_buffer or sys.stdout.buffer def print_data(data, strip_time=False): if strip_time: data = buildlog_strip_time(data) # to protect us against control characters (CVE-2012-1095) output_buffer.write(sanitize_text(data)) query = {'nostream': '1', 'start': f'{offset}'} if last: query['last'] = 1 if lastsucceeded: query['lastsucceeded'] = 1 retry_count = 0 while True: query['start'] = offset start_offset = offset u = makeurl(apiurl, ['build', prj, repository, arch, package, '_log'], query=query) try: for data in streamfile(u): offset += len(data) print_data(data, strip_time) except IncompleteRead as e: if retry_count >= 3: raise e retry_count += 1 data = e.partial if len(data): offset += len(data) print_data(data, strip_time) continue if start_offset == offset: break def get_dependson(apiurl: str, project: str, repository: str, arch: str, packages=None, reverse=None): query = {} query["package"] = packages if reverse: query["view"] = "revpkgnames" else: query["view"] = "pkgnames" u = makeurl(apiurl, ['build', project, repository, arch, '_builddepinfo'], query=query) f = http_GET(u) return f.read() def get_buildinfo( apiurl: str, prj: str, package: str, repository: str, arch: str, specfile=None, 
                  addlist=None, debug=None):
    query = {}
    query["add"] = addlist
    query["debug"] = debug
    u = makeurl(apiurl, ['build', prj, repository, arch, package, '_buildinfo'], query=query)

    if specfile:
        f = http_POST(u, data=specfile)
    else:
        f = http_GET(u)
    return f.read()


def get_buildconfig(apiurl: str, prj: str, repository: str, path=None):
    query = {}
    query["path"] = path
    u = makeurl(apiurl, ['build', prj, repository, '_buildconfig'], query=query)
    f = http_GET(u)
    return f.read()


def create_pbuild_config(apiurl: str, project: str, repository: str, arch: str, project_dir):
    """
    This always replaces a possibly existing config for now.
    We could extend the _pbuild file easily, but what should we do with
    multiple instances of the _config?
    """
    # get expanded buildconfig for given project and repository
    bc = get_buildconfig(apiurl, project, repository)
    if not bc:
        msg = f"Failed to get build config for project '{project}', repository '{repository}'"
        raise oscerr.NotFoundAPIError(msg)

    with open(os.path.join(project_dir, '_config'), "w") as f:
        f.write(decode_it(bc))

    # create the _pbuild file based on expanded repository path information
    pb = xml_fromstring('<pbuild/>')
    tree = ET.ElementTree(pb)
    preset = ET.SubElement(pb, 'preset', name=repository, default="")  # default should be empty, but ET crashes

    bi_text = decode_it(get_buildinfo(apiurl, project, '_repository', repository, arch, specfile="Name: dummy"))
    root = xml_fromstring(bi_text)
    # cross compile setups are not yet supported
    # for path in root.findall('hostsystem'):
    #     ET.SubElement(preset, 'hostrepo').text = path.get('url')
    for path in root.findall('path'):
        ET.SubElement(preset, 'repo').text = path.get('url')
    ET.SubElement(preset, 'arch').text = arch
    xmlindent(tree)
    tree.write(os.path.join(project_dir, '_pbuild'), encoding="utf-8", xml_declaration=True)


def get_worker_info(apiurl: str, worker: str):
    u = makeurl(apiurl, ['worker', worker])
    f = http_GET(u)
    return decode_it(f.read())


def check_constraints(apiurl: str, prj: str, repository: str, arch: str, package: str, constraintsfile=None):
    query = {"cmd": "checkconstraints",
             "project": prj,
             "package": package,
             "repository": repository,
             "arch": arch}
    u = makeurl(apiurl, ["worker"], query)
    f = http_POST(u, data=constraintsfile)
    root = xml_fromstring(b''.join(f))
    return [node.get('name') for node in root.findall('entry')]


def get_source_rev(apiurl: str, project: str, package: str, revision=None):
    # API supports ?deleted=1&meta=1&rev=4
    # but not rev=current,rev=latest,rev=top, or anything like this.
    # CAUTION: We have to loop through all rev and find the highest one, if none given.

    if not revision_is_empty(revision):
        url = makeurl(apiurl, ['source', project, package, '_history'], {'rev': revision})
    else:
        url = makeurl(apiurl, ['source', project, package, '_history'])
    f = http_GET(url)
    xml = xml_parse(f)
    ent = None
    for new in xml.findall('revision'):
        # remember the newest one.
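        # (editor note) the comparison below keeps the <revision> entry whose
        # <time> text is largest; for the numeric timestamps returned by the
        # _history call this usually matches chronological order, though the
        # comparison itself is plain string comparison.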
if not ent: ent = new elif ent.findtext("time") < new.findtext("time"): ent = new if not ent: return {'version': None, 'error': 'empty revisionlist: no such package?'} e = {} for k in ent.keys(): e[k] = ent.get(k) for k in list(ent): e[k.tag] = k.text return e def print_jobhistory(apiurl: str, prj: str, current_package: str, repository: str, arch: str, format="text", limit=20): query = {} if current_package: query['package'] = current_package if limit is not None and int(limit) > 0: query['limit'] = int(limit) u = makeurl(apiurl, ['build', prj, repository, arch, '_jobhistory'], query) f = http_GET(u) root = xml_parse(f).getroot() if format == 'text': print("time package reason code build time worker") for node in root.findall('jobhist'): package = node.get('package') worker = node.get('workerid') reason = node.get('reason') if not reason: reason = "unknown" code = node.get('code') st = int(node.get('starttime')) et = int(node.get('endtime')) endtime = time.strftime('%Y-%m-%d %H:%M:%S', time.gmtime(et)) waittm = et - st if waittm > 24 * 60 * 60: waitbuild = "%1dd %2dh %2dm %2ds" % (waittm / (24 * 60 * 60), (waittm / (60 * 60)) % 24, (waittm / 60) % 60, waittm % 60) elif waittm > 60 * 60: waitbuild = " %2dh %2dm %2ds" % (waittm / (60 * 60), (waittm / 60) % 60, waittm % 60) else: waitbuild = " %2dm %2ds" % (waittm / 60, waittm % 60) if format == 'csv': print(f'{endtime}|{package}|{reason}|{code}|{waitbuild}|{worker}') else: print('%s %-50s %-16s %-16s %-16s %-16s' % (endtime, package[0:49], reason[0:15], code[0:15], waitbuild, worker)) def get_commitlog( apiurl: str, prj: str, package: str, revision: Optional[str], format: str = "text", meta: Optional[bool] = None, deleted: Optional[bool] = None, revision_upper: Optional[str] = None, patch: Optional[bool] = None, ): if package is None: package = "_project" from . 
import obs_api revision_list = obs_api.Package.get_revision_list(apiurl, prj, package, deleted=deleted, meta=meta) # TODO: consider moving the following block to Package.get_revision_list() # keep only entries matching the specified revision if not revision_is_empty(revision): if isinstance(revision, str) and len(revision) == 32: # revision is srcmd5 revision_list = [i for i in revision_list if i.srcmd5 == revision] else: revision = int(revision) if revision_is_empty(revision_upper): revision_list = [i for i in revision_list if i.rev == revision] else: revision_upper = int(revision_upper) revision_list = [i for i in revision_list if i.rev <= revision_upper and i.rev >= revision] if format == "csv": f = io.StringIO() writer = csv.writer(f, dialect="unix") for revision in reversed(revision_list): writer.writerow( ( revision.rev, revision.user, revision.get_time_str(), revision.srcmd5, revision.comment, revision.requestid, ) ) f.seek(0) yield from f.read().splitlines() return if format == "xml": root = ET.Element("log") for revision in reversed(revision_list): entry = ET.SubElement(root, "logentry") entry.attrib["revision"] = str(revision.rev) entry.attrib["srcmd5"] = revision.srcmd5 ET.SubElement(entry, "author").text = revision.user ET.SubElement(entry, "date").text = revision.get_time_str() ET.SubElement(entry, "requestid").text = str(revision.requestid) if revision.requestid else "" ET.SubElement(entry, "msg").text = revision.comment or "" xmlindent(root) yield from ET.tostring(root, encoding="utf-8").decode("utf-8").splitlines() return if format == "text": for revision in reversed(revision_list): entry = ( f"r{revision.rev}", revision.user, revision.get_time_str(), revision.srcmd5, revision.version, f"rq{revision.requestid}" if revision.requestid else "" ) yield 76 * "-" yield " | ".join(entry) yield "" yield revision.comment or "" yield "" if patch: rdiff = server_diff_noex( apiurl, prj, package, revision.rev - 1, prj, package, revision.rev, meta=meta, ) yield highlight_diff(rdiff).decode("utf-8", errors="replace") return raise ValueError(f"Invalid format: {format}") def runservice(apiurl: str, prj: str, package: str): u = makeurl(apiurl, ['source', prj, package], query={'cmd': 'runservice'}) try: f = http_POST(u) except HTTPError as e: e.osc_msg = f'could not trigger service run for project \'{prj}\' package \'{package}\'' raise root = xml_parse(f).getroot() return root.get('code') def waitservice(apiurl: str, prj: str, package: str): u = makeurl(apiurl, ['source', prj, package], query={'cmd': 'waitservice'}) try: f = http_POST(u) except HTTPError as e: e.osc_msg = f'The service for project \'{prj}\' package \'{package}\' failed' raise root = xml_parse(f).getroot() return root.get('code') def mergeservice(apiurl: str, prj: str, package: str): # first waiting that the service finishes and that it did not fail waitservice(apiurl, prj, package) # real merge u = makeurl(apiurl, ['source', prj, package], query={'cmd': 'mergeservice'}) try: f = http_POST(u) except HTTPError as e: e.osc_msg = f'could not merge service files in project \'{prj}\' package \'{package}\'' raise root = xml_parse(f).getroot() return root.get('code') def rebuild(apiurl: str, prj: str, package: str, repo: str, arch: str, code=None): query = {'cmd': 'rebuild'} if package: query['package'] = package if repo: query['repository'] = repo if arch: query['arch'] = arch if code: query['code'] = code u = makeurl(apiurl, ['build', prj], query=query) try: f = http_POST(u) except HTTPError as e: e.osc_msg = f'could not trigger 
rebuild for project \'{prj}\' package \'{package}\'' raise root = xml_parse(f).getroot() return root.get('code') def get_osc_version(): return __version__ def abortbuild(apiurl: str, project: str, package=None, arch=None, repo=None): return cmdbuild(apiurl, 'abortbuild', project, package, arch, repo) def restartbuild(apiurl: str, project: str, package=None, arch=None, repo=None): return cmdbuild(apiurl, 'restartbuild', project, package, arch, repo) def unpublish(apiurl: str, project: str, package: Optional[str] = None, arch=None, repo=None, code=None): return cmdbuild(apiurl, "unpublish", project, package, arch, repo, code) def wipebinaries(apiurl: str, project: str, package: Optional[str] = None, arch=None, repo=None, code=None): return cmdbuild(apiurl, "wipe", project, package, arch, repo, code) def cmdbuild( apiurl: str, cmd: str, project: str, package: Optional[str] = None, arch=None, repo=None, code=None, sysrq=None ): query = {"cmd": cmd} if package: query['package'] = package if arch: query['arch'] = arch if repo: query['repository'] = repo if code: query['code'] = code if sysrq: query['sysrq'] = sysrq u = makeurl(apiurl, ['build', project], query) try: f = http_POST(u) except HTTPError as e: e.osc_msg = f'{cmd} command failed for project {project}' if package: e.osc_msg += f' package {package}' if arch: e.osc_msg += f' arch {arch}' if repo: e.osc_msg += f' repository {repo}' if code: e.osc_msg += f' code={code}' if sysrq: e.osc_msg += f' sysrq={code}' raise root = xml_parse(f).getroot() return root.get('code') def parseRevisionOption(string, allow_md5=True): """ returns a tuple which contains the revisions """ revisions = [None, None] if string: parts = string.split(':') if len(parts) > 2: raise oscerr.OscInvalidRevision(string) for i, revision in enumerate(parts, 0): if revision.isdigit() or (allow_md5 and revision.isalnum() and len(revision) == 32): revisions[i] = revision elif revision != '' and revision != 'latest': raise oscerr.OscInvalidRevision(string) return tuple(revisions) def checkRevision(prj: str, pac: str, revision, apiurl: Optional[str] = None, meta=False): """ check if revision is valid revision, i.e. it is not larger than the upstream revision id """ if len(revision) == 32: # there isn't a way to check this kind of revision for validity return True if not apiurl: apiurl = conf.config['apiurl'] try: if int(revision) > int(show_upstream_rev(apiurl, prj, pac, meta=meta)) \ or int(revision) <= 0: return False else: return True except (ValueError, TypeError): return False def build_table(col_num, data=None, headline=None, width=1, csv=False): """ This method builds a simple table. 
Example:: build_table(2, ['foo', 'bar', 'suse', 'osc'], ['col1', 'col2'], 2) col1 col2 foo bar suse osc """ data = data or [] headline = headline or [] longest_col = [] for i in range(col_num): longest_col.append(0) if headline and not csv: data[0:0] = headline data = [str(i) for i in data] # find longest entry in each column i = 0 for itm in data: if longest_col[i] < len(itm): longest_col[i] = len(itm) if i == col_num - 1: i = 0 else: i += 1 # calculate length for each column for i, row in enumerate(longest_col): longest_col[i] = row + width # build rows row = [] table = [] i = 0 for itm in data: if i % col_num == 0: i = 0 row = [] table.append(row) # there is no need to justify the entries of the last column # or when generating csv if i == col_num - 1 or csv: row.append(itm) else: row.append(itm.ljust(longest_col[i])) i += 1 if csv: separator = '|' else: separator = '' return [separator.join(row) for row in table] def xpath_join(expr, new_expr, op='or', inner=False, nexpr_parentheses=False): """ Join two xpath expressions. If inner is False expr will be surrounded with parentheses (unless it's not already surrounded). If nexpr_parentheses is True new_expr will be surrounded with parentheses. """ if not expr: return new_expr elif not new_expr: return expr # NOTE: this is NO syntax check etc. (e.g. if a literal contains a '(' or ')' # the check might fail and expr will be surrounded with parentheses or NOT) parentheses = not inner if not inner and expr.startswith('(') and expr.endswith(')'): parentheses = False braces = [i for i in expr if i == '(' or i == ')'] closed = 0 while len(braces): if braces.pop() == ')': closed += 1 continue else: closed += -1 while len(braces): if braces.pop() == '(': closed += -1 else: closed += 1 if closed != 0: parentheses = True break if parentheses: expr = f'({expr})' if nexpr_parentheses: new_expr = f'({new_expr})' return f'{expr} {op} {new_expr}' def search(apiurl: str, queries=None, **kwargs): """ Perform a search request. The requests are constructed as follows: kwargs = {'kind1' => xpath1, 'kind2' => xpath2, ..., 'kindN' => xpathN} GET /search/kind1?match=xpath1 ... GET /search/kindN?match=xpathN queries is a dict of optional http query parameters, which are passed to the makeurl call, of the form {kindI1: dict_or_list, ..., kindIL: dict_or_list}, where kind_i1 to kind_iL are keys of kwargs. """ if queries is None: queries = {} res = {} for urlpath, xpath in kwargs.items(): path = ['search'] path += urlpath.split('_') # FIXME: take underscores as path seperators. I see no other way atm to fix OBS api calls and not breaking osc api query = queries.get(urlpath, {}) query['match'] = xpath u = makeurl(apiurl, path, query) f = http_GET(u) res[urlpath] = xml_parse(f).getroot() return res def owner( apiurl: str, search_term=None, mode="binary", attribute=None, project=None, usefilter=None, devel=None, limit=None, binary=None, ): """ Perform a binary package owner search. This is supported since OBS 2.4. 
""" # binary is just for API backward compatibility if not (search_term is None) ^ (binary is None): raise ValueError('Either specify search_term or binary') elif binary is not None: search_term = binary # find default project, if not specified # mode can be "binary" or "package" atm query = {mode: search_term} if attribute: query['attribute'] = attribute if project: query['project'] = project if devel: query['devel'] = devel if limit is not None: query['limit'] = limit if usefilter is not None: query['filter'] = ",".join(usefilter) u = makeurl(apiurl, ['search', 'owner'], query) res = None try: f = http_GET(u) res = xml_parse(f).getroot() except HTTPError as e: # old server not supporting this search pass return res def set_link_rev(apiurl: str, project: str, package: str, revision="", expand=False, msg: str=None, vrev: str=None): url = makeurl(apiurl, ["source", project, package, "_link"]) try: f = http_GET(url) root = xml_parse(f).getroot() except HTTPError as e: e.osc_msg = f'Unable to get _link file in package \'{package}\' for project \'{project}\'' raise revision = _set_link_rev(apiurl, project, package, root, revision, expand=expand, setvrev=vrev) l = ET.tostring(root, encoding=ET_ENCODING) if not msg: if revision: msg = f"Set link revision to {revision}" else: msg = "Unset link revision" url = makeurl(apiurl, ["source", project, package, "_link"], {"comment": msg}) http_PUT(url, data=l) return revision def _set_link_rev(apiurl: str, project: str, package: str, root, revision="", expand=False, setvrev: str=None): """ Updates the rev attribute of the _link xml. If revision is set to None the rev and vrev attributes are removed from the _link xml. updates the rev attribute of the _link xml. If revision is the empty string the latest rev of the link's source package is used (or the xsrcmd5 if expand is True). If revision is neither None nor the empty string the _link's rev attribute is set to this revision (or to the xsrcmd5 if expand is True). """ src_project = root.get('project', project) src_package = root.get('package', package) vrev = None if revision is None: if 'rev' in root.keys(): del root.attrib['rev'] if 'vrev' in root.keys(): del root.attrib['vrev'] elif not revision or expand: revision, vrev = show_upstream_rev_vrev(apiurl, src_project, src_package, revision=revision, expand=expand) if revision: root.set('rev', revision) # add vrev when revision is a srcmd5 if setvrev: root.set('vrev', setvrev) elif not revision_is_empty(vrev) and not revision_is_empty(revision) and len(revision) >= 32: root.set('vrev', vrev) return revision def delete_dir(dir): # small security checks if os.path.islink(dir): raise oscerr.OscIOError(None, 'cannot remove linked dir') elif os.path.abspath(dir) == '/': raise oscerr.OscIOError(None, 'cannot remove \'/\'') for dirpath, dirnames, filenames in os.walk(dir, topdown=False): for filename in filenames: os.unlink(os.path.join(dirpath, filename)) for dirname in dirnames: os.rmdir(os.path.join(dirpath, dirname)) os.rmdir(dir) def unpack_srcrpm(srpm, dir, *files): """ This method unpacks the passed srpm into the passed dir. If arguments are passed to the \'files\' tuple only this files will be unpacked. 
""" if not is_srcrpm(srpm): print(f'error - \'{srpm}\' is not a source rpm.', file=sys.stderr) sys.exit(1) curdir = os.getcwd() if os.path.isdir(dir): os.chdir(dir) ret = -1 with open(srpm) as fsrpm: with open(os.devnull, 'w') as devnull: rpm2cpio_proc = subprocess.Popen(['rpm2cpio'], stdin=fsrpm, stdout=subprocess.PIPE) cpio_proc = subprocess.Popen(['cpio', '-i'] + list(files), stdin=rpm2cpio_proc.stdout, stderr=devnull) rpm2cpio_proc.stdout.close() cpio_proc.communicate() rpm2cpio_proc.wait() ret = rpm2cpio_proc.returncode if not ret: ret = cpio_proc.returncode if ret != 0: print(f'error \'{ret}\' - cannot extract \'{srpm}\'', file=sys.stderr) sys.exit(1) os.chdir(curdir) def is_rpm(f): """check if the named file is an RPM package""" try: h = open(f, 'rb').read(4) except: return False if isinstance(h, str): isrpmstr = '\xed\xab\xee\xdb' else: isrpmstr = b'\xed\xab\xee\xdb' if h == isrpmstr: return True else: return False def is_srcrpm(f): """check if the named file is a source RPM""" if not is_rpm(f): return False try: h = open(f, 'rb').read(8) except: return False issrcrpm = bytes(bytearray([h[7]])).decode('utf-8') if issrcrpm == '\x01': return True else: return False def addMaintainer(apiurl: str, prj: str, pac: str, user: str): # for backward compatibility only addPerson(apiurl, prj, pac, user) def addPerson(apiurl: str, prj: str, pac: str, user: str, role="maintainer"): """ add a new person to a package or project """ path = (prj, ) kind = 'prj' if pac: path = path + (pac ,) kind = 'pkg' data = meta_exists(metatype=kind, path_args=path, template_args=None, create_new=False) if data and get_user_meta(apiurl, user) is not None: root = xml_fromstring(parse_meta_to_string(data)) found = False for person in root.iter('person'): if person.get('userid') == user and person.get('role') == role: found = True print("user already exists") break if not found: # the xml has a fixed structure root.insert(2, ET.Element('person', role=role, userid=user)) print(f'user \'{user}\' added to \'{pac or prj}\'') edit_meta(metatype=kind, path_args=path, data=ET.tostring(root, encoding=ET_ENCODING)) else: print("osc: an error occured") def delMaintainer(apiurl: str, prj: str, pac: str, user: str): # for backward compatibility only delPerson(apiurl, prj, pac, user) def delPerson(apiurl: str, prj: str, pac: str, user: str, role="maintainer"): """ delete a person from a package or project """ path = (prj, ) kind = 'prj' if pac: path = path + (pac, ) kind = 'pkg' data = meta_exists(metatype=kind, path_args=path, template_args=None, create_new=False) if data and get_user_meta(apiurl, user) is not None: root = xml_fromstring(parse_meta_to_string(data)) found = False for person in root.iter('person'): if person.get('userid') == user and person.get('role') == role: root.remove(person) found = True print(f"user '{user}' removed") if found: edit_meta(metatype=kind, path_args=path, data=ET.tostring(root, encoding=ET_ENCODING)) else: print(f"user '{user}' not found in '{pac or prj}'") else: print("an error occured") def setBugowner(apiurl: str, prj: str, pac: str, user=None, group=None): """ delete all bugowners (user and group entries) and set one new one in a package or project """ path = (prj, ) kind = 'prj' if pac: path = path + (pac, ) kind = 'pkg' data = meta_exists(metatype=kind, path_args=path, template_args=None, create_new=False) if user.startswith('group:'): group = user.replace('group:', '') user = None if data: root = xml_fromstring(parse_meta_to_string(data)) for group_element in root.iter('group'): if 
group_element.get('role') == "bugowner": root.remove(group_element) for person_element in root.iter('person'): if person_element.get('role') == "bugowner": root.remove(person_element) if user: root.insert(2, ET.Element('person', role='bugowner', userid=user)) elif group: root.insert(2, ET.Element('group', role='bugowner', groupid=group)) else: print("Neither user nor group is specified") edit_meta(metatype=kind, path_args=path, data=ET.tostring(root, encoding=ET_ENCODING)) def createPackageDir(pathname, prj_obj=None): """ create and initialize a new package dir in the given project. prj_obj can be a Project() instance. """ prj_dir, pac_dir = getPrjPacPaths(pathname) if is_project_dir(prj_dir): global store if not os.path.exists(os.path.join(pathname, store)): prj = prj_obj or Project(prj_dir, False) Package.init_package(prj.apiurl, prj.name, pac_dir, pathname) prj.addPackage(pac_dir) print(statfrmt('A', os.path.normpath(pathname))) else: raise oscerr.OscIOError(None, f'file or directory \'{pathname}\' already exists') else: msg = f'\'{prj_dir}\' is not a working copy' if os.path.exists(os.path.join(prj_dir, '.svn')): msg += '\ntry svn instead of osc.' raise oscerr.NoWorkingCopy(msg) def stripETxml(node): node.tail = None if node.text is not None: node.text = node.text.replace(" ", "").replace("\n", "") for child in node: stripETxml(child) def addGitSource(url): service_file = os.path.join(os.getcwd(), '_service') addfile = False if os.path.exists(service_file): services = xml_parse(os.path.join(os.getcwd(), '_service')).getroot() else: services = xml_fromstring("<services />") addfile = True stripETxml(services) si = Serviceinfo() s = si.addGitUrl(services, url) s = si.addTarUp(services) s = si.addRecompressTar(services) s = si.addSetVersion(services) si.read(s) # for pretty output xmlindent(s) f = open(service_file, 'w') f.write(ET.tostring(s, encoding=ET_ENCODING)) f.close() if addfile: addFiles(['_service']) def addDownloadUrlService(url): service_file = os.path.join(os.getcwd(), '_service') addfile = False if os.path.exists(service_file): services = xml_parse(os.path.join(os.getcwd(), '_service')).getroot() else: services = xml_fromstring("<services />") addfile = True stripETxml(services) si = Serviceinfo() s = si.addDownloadUrl(services, url) si.read(s) # for pretty output xmlindent(s) f = open(service_file, 'w') f.write(ET.tostring(s, encoding=ET_ENCODING)) f.close() if addfile: addFiles(['_service']) # download file path = os.getcwd() files = os.listdir(path) si.execute(path) newfiles = os.listdir(path) # add verify service for new files for filename in files: newfiles.remove(filename) for filename in newfiles: if filename.startswith('_service:download_url:'): s = si.addVerifyFile(services, filename) # for pretty output xmlindent(s) f = open(service_file, 'w') f.write(ET.tostring(s, encoding=ET_ENCODING)) f.close() def addFiles(filenames, prj_obj=None, force=False): for filename in filenames: if not os.path.exists(filename): raise oscerr.OscIOError(None, f'file \'{filename}\' does not exist') # TODO: this function needs improvement # it should check if we're in a project or a package working copy and behave accordingly # init a package dir if we have a normal dir in the "filenames"-list # so that it will be found by Package.from_paths_nofail() later pacs = list(filenames) for filename in filenames: prj_dir, pac_dir = getPrjPacPaths(filename) if not is_package_dir(filename) and os.path.isdir(filename) and is_project_dir(prj_dir) \ and conf.config['do_package_tracking']: store = Store(prj_dir) prj_name =
store_read_project(prj_dir) prj_apiurl = store.apiurl Package.init_package(prj_apiurl, prj_name, pac_dir, filename) elif is_package_dir(filename) and conf.config['do_package_tracking']: print(f'osc: warning: \'{filename}\' is already under version control') pacs.remove(filename) elif os.path.isdir(filename) and is_project_dir(prj_dir): raise oscerr.WrongArgs('osc: cannot add a directory to a project unless ' '\'do_package_tracking\' is enabled in the configuration file') pacs, no_pacs = Package.from_paths_nofail(pacs) for filename in no_pacs: filename = os.path.normpath(filename) directory = os.path.join(filename, os.pardir) if not is_package_dir(directory): print(f'osc: warning: \'{filename}\' cannot be associated to a package') continue resp = raw_input(f"{filename} is a directory, do you want to archive it for submission? (y/n) ") if resp not in ('y', 'Y'): continue archive = f"{filename}.obscpio" todo = [os.path.join(p, elm) for p, dirnames, fnames in os.walk(filename, followlinks=False) for elm in dirnames + fnames] enc_todo = [b'%s' % elem.encode() for elem in todo] with open(archive, 'w') as f: cpio_proc = subprocess.Popen(['cpio', '-o', '-H', 'newc', '-0'], stdin=subprocess.PIPE, stdout=f) cpio_proc.communicate(b'\0'.join(enc_todo)) pacs.extend(Package.from_paths([archive])) for pac in pacs: if conf.config['do_package_tracking'] and not pac.todo: prj = prj_obj or Project(os.path.dirname(pac.absdir), False) if pac.name in prj.pacs_unvers: prj.addPackage(pac.name) print(statfrmt('A', getTransActPath(os.path.join(pac.dir, os.pardir, pac.name)))) for filename in pac.filenamelist_unvers: if os.path.isdir(os.path.join(pac.dir, filename)): print(f'skipping directory \'{os.path.join(pac.dir, filename)}\'') else: pac.todo.append(filename) elif pac.name in prj.pacs_have: print(f'osc: warning: \'{pac.name}\' is already under version control') for filename in pac.todo: if filename in pac.skipped: continue if filename in pac.excluded and not force: print(f'osc: warning: \'{filename}\' is excluded from a working copy', file=sys.stderr) continue try: pac.addfile(filename) except oscerr.PackageFileConflict as e: fname = os.path.join(getTransActPath(pac.dir), filename) print(f'osc: warning: \'{fname}\' is already under version control') def getPrjPacPaths(path): """ returns the path for a project and a package from path. This is needed if you try to add or delete packages: Examples:: osc add pac1/: prj_dir = CWD; pac_dir = pac1 osc add /path/to/pac1: prj_dir = path/to; pac_dir = pac1 osc add /path/to/pac1/file => this would be an invalid path; the caller has to validate the returned path! """ # make sure we have a dir: osc add bar vs. osc add bar/; osc add /path/to/prj_dir/new_pack # filename = os.path.join(tail, '') prj_dir, pac_dir = os.path.split(os.path.normpath(path)) if prj_dir == '': prj_dir = os.getcwd() return (prj_dir, pac_dir) def getTransActPath(pac_dir): """ returns the path for the commit and update operations/transactions. Normally the "dir" attribute of a Package() object will be passed to this method. """ path = str(Path(pac_dir)) # accept str and Path as pac_dir return '' if path == '.' else path def get_commit_message_template(pac): """ Read the difference in .changes file(s) and put them as a template to commit message.
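    Illustrative example (the package object and the .changes content are
    assumed, not taken from a real checkout)::

        pac = Package('.')                        # a checked out package working copy
        lines = get_commit_message_template(pac)
        # e.g. ['- fix build with gcc 14', ''] if only the .changes file was modified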
""" diff = [] template = [] if pac.todo: todo = pac.todo else: todo = pac.filenamelist + pac.filenamelist_unvers files = [i for i in todo if i.endswith('.changes') and pac.status(i) in ('A', 'M')] for filename in files: if pac.status(filename) == 'M': diff += get_source_file_diff(pac.absdir, filename, pac.rev) elif pac.status(filename) == 'A': with open(os.path.join(pac.absdir, filename), 'rb') as f: diff.extend(b'+' + line for line in f) if diff: template = parse_diff_for_commit_message(''.join(decode_list(diff))) return template def parse_diff_for_commit_message(diff, template=None): template = template or [] date_re = re.compile(r'\+(Mon|Tue|Wed|Thu|Fri|Sat|Sun) ([A-Z][a-z]{2}) ( ?[0-9]|[0-3][0-9]) .*') diff = diff.split('\n') # The first four lines contains a header of diff for line in diff[3:]: # this condition is magical, but it removes all unwanted lines from commit message if not(line) or (line and line[0] != '+') or \ date_re.match(line) or \ line == '+' or line[0:3] == '+++': continue if line == '+-------------------------------------------------------------------': template.append('') else: template.append(line[1:]) return template def get_commit_msg(wc_dir, pacs): template = store_read_file(wc_dir, '_commit_msg') # open editor for commit message # but first, produce status and diff to append to the template footer = [] lines = [] for p in pacs: states = sorted(p.get_status(False, ' ', '?'), key=cmp_to_key(compare)) changed = [statfrmt(st, os.path.normpath(os.path.join(p.dir, filename))) for st, filename in states] if changed: footer += changed footer.append(f'\nDiff for working copy: {p.dir}') footer.extend([''.join(decode_list(i)) for i in p.get_diff(ignoreUnversioned=True)]) lines.extend(get_commit_message_template(p)) if template is None: if lines and lines[0] == '': del lines[0] template = '\n'.join(lines) msg = '' # if footer is empty, there is nothing to commit, and no edit needed. 
if footer: msg = edit_message(footer='\n'.join(footer), template=template) if msg: store_write_string(wc_dir, '_commit_msg', msg + '\n') else: store_unlink_file(wc_dir, '_commit_msg') return msg def print_request_list(apiurl, project, package=None, states=("new", "review"), force=False): """ prints list of pending requests for the specified project/package if "check_for_request_on_action" is enabled in the config or if "force" is set to True """ if not conf.config['check_for_request_on_action'] and not force: return requests = get_request_collection(apiurl, project=project, package=package, states=states) msg = '\nPending requests for %s: %s (%s)' if sys.stdout.isatty(): msg = f'\033[1m{msg}\033[0m' if package is None and requests: print(msg % ('project', project, len(requests))) elif requests: print(msg % ('package', f"{project}/{package}", len(requests))) for r in requests: print(r.list_view(), '\n') def request_interactive_review(apiurl, request, initial_cmd='', group=None, ignore_reviews=False, source_buildstatus=False): """review the request interactively""" tmpfile = None def safe_change_request_state(*args, **kwargs): try: change_request_state(*args, **kwargs) return True except HTTPError as e: print('Server returned an error:', e, file=sys.stderr) details = e.hdrs.get('X-Opensuse-Errorcode') if details: print(details, file=sys.stderr) root = xml_fromstring(e.read()) summary = root.find('summary') if summary is not None: print(summary.text, file=sys.stderr) print('Try -f to force the state change', file=sys.stderr) return False def get_repos(src_actions): """ Translate src_actions to [{"proj": ..., "pkg": ..., "repo": ..., "arch": ...}] """ result = [] for action in src_actions: disabled = show_package_disabled_repos(apiurl, action.src_project, action.src_package) for repo in get_repos_of_project(apiurl, action.src_project): if (disabled is None) or (repo.name not in [d["repo"] for d in disabled]): entry = { "proj": action.src_project, "pkg": action.src_package, "repo": repo.name, "arch": repo.arch } result.append(entry) return result def select_repo(src_actions): """ Prompt user to select a repo from a list. """ repos = get_repos(src_actions) for num, entry in enumerate(repos): print(f"({num}) {entry['proj']}/{entry['pkg']}/{entry['repo']}/{entry['arch']}") if not repos: print('No repos') return None while True: try: reply = raw_input(f"Number of repo to examine (0 - {len(repos)-1}): ").strip() if not reply: return None reply_num = int(reply) return repos[reply_num] except (ValueError, IndexError): print(f"Invalid index. 
Please choose between 0 and {len(repos)-1}") def safe_get_rpmlint_log(src_actions): repo = select_repo(src_actions) if not repo: return try: run_pager(get_rpmlint_log(apiurl, **repo)) except HTTPError as e: if e.code == 404: print(f"No rpmlint log for {repo['repo']}/{repo['arch']}") else: raise def get_build_log(src_actions): repo = select_repo(src_actions) if not repo: return try: buffer = io.BytesIO() print_buildlog(apiurl, repo["proj"], repo["pkg"], repo["repo"], repo["arch"], output_buffer=buffer) buffer.seek(0) run_pager(buffer.read()) except HTTPError as e: if e.code == 404: print(f"No build log for {repo['repo']}/{repo['arch']}") else: raise def print_request(request): print(request) def print_source_buildstatus(src_actions, newline=False): if newline: print() for action in src_actions: print(f'{action.src_project}/{action.src_package}:') try: print('\n'.join(get_results(apiurl, action.src_project, action.src_package))) except HTTPError as e: if e.code != 404: raise print(f'unable to retrieve the buildstatus: {e}') def get_formatted_issues(apiurl, reqid): """get issue_list and return a printable string""" issue_list = get_request_issues(apiurl, reqid) issues = "" issues_nodetails = "" # the check_list is needed to make sure that every issue is just listed # once. Sometimes the API returns the same issue twice or more. See: # https://github.com/openSUSE/open-build-service/issues/4044 # Once this is fixed this can be changed. check_list = [] for issue in issue_list: if issue['label'] in check_list: continue if 'summary' in issue: issues += ("## BUG# " + issue['label'] + ": " + issue.get('summary') + " : " + issue.get('state', 'unknown state') + '\n') else: issues_nodetails += issue['label'] + ' ' check_list.append(issue['label']) if issues_nodetails: issues += '## No details for the issue(s): ' + issues_nodetails + '\n' return issues print_request(request) print_comments(apiurl, 'request', request.reqid) try: prompt = '(a)ccept/(d)ecline/(r)evoke/c(l)one/co(m)ment/(s)kip/(c)ancel > ' editable_actions = request.get_actions('submit', 'maintenance_incident') # actions which have sources + buildresults src_actions = editable_actions + request.get_actions('maintenance_release') if editable_actions: prompt = 'd(i)ff/(a)ccept/(d)ecline/(r)evoke/(b)uildstatus/(bl)buildlog/rpm(li)ntlog/c(l)one/(e)dit/co(m)ment/(s)kip/(c)ancel > ' elif src_actions: # no edit for maintenance release requests prompt = 'd(i)ff/(a)ccept/(d)ecline/(r)evoke/(b)uildstatus/(bl)buildlog/rpm(li)ntlog/c(l)one/co(m)ment/(s)kip/(c)ancel > ' editprj = '' orequest = None if source_buildstatus and src_actions: print_source_buildstatus(src_actions, newline=True) while True: if initial_cmd: repl = initial_cmd initial_cmd = '' else: repl = raw_input(prompt).strip() # remember if we're accepting so we can decide whether to forward request to the parent project later on accept = repl == "a" or repl.startswith("a ") if repl == 'i' and src_actions: req_summary = str(request) + '\n' issues = '\n\n' + get_formatted_issues(apiurl, request.reqid) if orequest is not None and tmpfile: tmpfile.close() tmpfile = None if tmpfile is None: tmpfile = tempfile.NamedTemporaryFile(suffix='.diff', mode='rb+') tmpfile.write(req_summary.encode()) tmpfile.write(issues.encode()) try: diff = request_diff(apiurl, request.reqid) tmpfile.write(diff) except HTTPError as e: if e.code != 400: raise # backward compatible diff for old apis for action in src_actions: diff = b'old: %s/%s\nnew: %s/%s\n' % (action.src_project.encode(), action.src_package.encode(), 
action.tgt_project.encode(), action.tgt_package.encode()) diff += submit_action_diff(apiurl, action) diff += b'\n\n' tmpfile.write(diff) tmpfile.flush() run_editor(tmpfile.name) print_request(request) print_comments(apiurl, 'request', request.reqid) elif repl == 's': print(f'skipping: #{request.reqid}', file=sys.stderr) break elif repl == 'c': print('Aborting', file=sys.stderr) raise oscerr.UserAbort() elif repl == 'm': if tmpfile is not None: tmpfile.seek(0) comment = edit_message(footer=decode_it(tmpfile.read())) else: comment = edit_text() create_comment(apiurl, 'request', comment, request.reqid) elif repl == 'b' and src_actions: print_source_buildstatus(src_actions) elif repl == 'li' and src_actions: safe_get_rpmlint_log(src_actions) elif repl == 'bl' and src_actions: get_build_log(src_actions) elif repl == 'e' and editable_actions: # this is only for editable actions if not editprj: editprj = clone_request(apiurl, request.reqid, 'osc editrequest') orequest = request request = edit_submitrequest(apiurl, editprj, orequest, request) src_actions = editable_actions = request.get_actions('submit', 'maintenance_incident') print_request(request) prompt = 'd(i)ff/(a)ccept/(b)uildstatus/(e)dit/(s)kip/(c)ancel > ' else: state_map = {'a': 'accepted', 'd': 'declined', 'r': 'revoked'} mo = re.search(r'^([adrl])(?:\s+(-f)?\s*-m\s+(.*))?$', repl) if mo is None or orequest and mo.group(1) != 'a': print(f'invalid choice: \'{repl}\'', file=sys.stderr) continue state = state_map.get(mo.group(1)) force = mo.group(2) is not None msg = mo.group(3) footer = '' msg_template = '' if not (state is None or request.state is None): footer = 'changing request from state \'%s\' to \'%s\'\n\n' \ % (request.state.name, state) msg_template = change_request_state_template(request, state) if tmpfile is None: footer += str(request) if tmpfile is not None: tmpfile.seek(0) # the read bytes probably have a moderate size so the str won't be too large footer += '\n\n' + decode_it(tmpfile.read()) if msg is None: try: msg = edit_message(footer=footer, template=msg_template) except oscerr.UserAbort: # do not abort (show prompt again) continue else: msg = msg.strip('\'').strip('"') if orequest is not None: request.create(apiurl) if not safe_change_request_state(apiurl, request.reqid, 'accepted', msg, force=force): # an error occured continue repl = raw_input('Supersede original request? (y|N) ') if repl in ('y', 'Y'): safe_change_request_state(apiurl, orequest.reqid, 'superseded', f'superseded by {request.reqid}', request.reqid, force=force) elif state is None: clone_request(apiurl, request.reqid, msg) else: reviews = [r for r in request.reviews if r.state == 'new'] if not reviews or ignore_reviews: if safe_change_request_state(apiurl, request.reqid, state, msg, force=force): if accept: from . 
import _private _private.forward_request(apiurl, request, interactive=True) break else: # an error occured continue group_reviews = [r for r in reviews if (r.by_group is not None and r.by_group == group)] if len(group_reviews) == 1 and conf.config['review_inherit_group']: review = group_reviews[0] else: print('Please chose one of the following reviews:') for i in range(len(reviews)): fmt = Request.format_review(reviews[i]) print('(%i)' % i, 'by %(type)-10s %(by)s' % fmt) num = raw_input('> ') try: num = int(num) except ValueError: print(f'\'{num}\' is not a number.') continue if num < 0 or num >= len(reviews): print(f'number \'{num}\' out of range.') continue review = reviews[num] change_review_state(apiurl, request.reqid, state, by_user=review.by_user, by_group=review.by_group, by_project=review.by_project, by_package=review.by_package, message=msg) break finally: if tmpfile is not None: tmpfile.close() def edit_submitrequest(apiurl, project, orequest, new_request=None): """edit a submit action from orequest/new_request""" actions = orequest.get_actions('submit') oactions = actions if new_request is not None: actions = new_request.get_actions('submit') num = 0 if len(actions) > 1: print('Please chose one of the following submit actions:') for i in range(len(actions)): # it is safe to use orequest because currently the formatting # of a submit action does not need instance specific data fmt = orequest.format_action(actions[i]) print('(%i)' % i, f"{fmt['source']} {fmt['target']}") num = raw_input('> ') try: num = int(num) except ValueError: raise oscerr.WrongArgs(f'\'{num}\' is not a number.') if num < 0 or num >= len(orequest.actions): raise oscerr.WrongArgs(f'number \'{num}\' out of range.') # the api replaced ':' with '_' in prj and pkg names (clone request) package = '%s.%s' % (oactions[num].src_package.replace(':', '_'), oactions[num].src_project.replace(':', '_')) tmpdir = None cleanup = True try: tmpdir = tempfile.mkdtemp(prefix='osc_editsr') p = Package.init_package(apiurl, project, package, tmpdir) p.update() shell = os.getenv('SHELL', default='/bin/sh') olddir = os.getcwd() os.chdir(tmpdir) print('Checked out package \'%s\' to %s. Started a new shell (%s).\n' 'Please fix the package and close the shell afterwards.' % (package, tmpdir, shell)) run_external(shell) # the pkg might have uncommitted changes... cleanup = False os.chdir(olddir) # reread data p = Package(tmpdir) modified = p.get_status(False, ' ', '?', 'S') if modified: print('Your working copy has the following modifications:') print('\n'.join([statfrmt(st, filename) for st, filename in modified])) repl = raw_input('Do you want to commit the local changes first? 
(y|N) ') if repl in ('y', 'Y'): msg = get_commit_msg(p.absdir, [p]) p.commit(msg=msg) cleanup = True finally: if cleanup: shutil.rmtree(tmpdir) else: print(f'Please remove the dir \'{tmpdir}\' manually') r = Request() for action in orequest.get_actions(): new_action = Action.from_xml(action.to_xml()) r.actions.append(new_action) if new_action.type == 'submit': new_action.src_package = '%s.%s' % (action.src_package.replace(':', '_'), action.src_project.replace(':', '_')) new_action.src_project = project # do an implicit cleanup new_action.opt_sourceupdate = 'cleanup' return r def get_user_projpkgs(apiurl, user, role=None, exclude_projects=None, proj=True, pkg=True, maintained=False, metadata=False): """Return all project/packages where user is involved.""" exclude_projects = exclude_projects or [] xpath = f'person/@userid = \'{user}\'' excl_prj = '' excl_pkg = '' for i in exclude_projects: excl_prj = xpath_join(excl_prj, f'not(@name = \'{i}\')', op='and') excl_pkg = xpath_join(excl_pkg, f'not(@project = \'{i}\')', op='and') role_filter_xpath = xpath if role: xpath = xpath_join(xpath, f'person/@role = \'{role}\'', inner=True, op='and') xpath_pkg = xpath_join(xpath, excl_pkg, op='and') xpath_prj = xpath_join(xpath, excl_prj, op='and') if maintained: xpath_pkg = xpath_join(xpath_pkg, '(project/attribute/@name=\'%(attr)s\' or attribute/@name=\'%(attr)s\')' % {'attr': conf.config['maintained_attribute']}, op='and') what = {} if pkg: if metadata: what['package'] = xpath_pkg else: what['package_id'] = xpath_pkg if proj: if metadata: what['project'] = xpath_prj else: what['project_id'] = xpath_prj try: res = search(apiurl, **what) except HTTPError as e: if e.code != 400 or not role_filter_xpath: raise e # backward compatibility: local role filtering what = {kind: role_filter_xpath for kind in what.keys()} if 'package' in what: what['package'] = xpath_join(role_filter_xpath, excl_pkg, op='and') if 'project' in what: what['project'] = xpath_join(role_filter_xpath, excl_prj, op='and') res = search(apiurl, **what) filter_role(res, user, role) return res def run_external(filename, *args, **kwargs): """Executes the program filename via subprocess.call. *args are additional arguments which are passed to the program filename. **kwargs specify additional arguments for the subprocess.call function. if no args are specified the plain filename is passed to subprocess.call (this can be used to execute a shell command). Otherwise [filename] + list(args) is passed to the subprocess.call function. """ # unless explicitly specified use shell=False kwargs.setdefault('shell', False) if args: cmd = [filename] + list(args) else: cmd = filename try: return subprocess.call(cmd, **kwargs) except OSError as e: if e.errno != errno.ENOENT: raise raise oscerr.ExtRuntimeError(e.strerror, filename) def return_external(filename, *args, **kwargs): """Executes the program filename via subprocess.check_output. ``*args`` are additional arguments which are passed to the program filename. ``**kwargs`` specify additional arguments for the subprocess.check_output function. if no args are specified the plain filename is passed to subprocess.check_output (this can be used to execute a shell command). Otherwise [filename] + list(args) is passed to the subprocess.check_output function. Returns the output of the command. 
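    Example (illustrative only; any external command can be used)::

        out = return_external('uname', '-r')
        # ``out`` holds the command's stdout as bytes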
""" if args: cmd = [filename] + list(args) else: cmd = filename try: # backward compatibility for python 2.6 if 'check_output' not in dir(subprocess): process = subprocess.Popen(cmd, stdout=subprocess.PIPE) output, errstr = process.communicate() retcode = process.poll() if retcode: error = subprocess.CalledProcessError(retcode, cmd) error.output = output raise error return output return subprocess.check_output(cmd, **kwargs) except OSError as e: if e.errno != errno.ENOENT: raise raise oscerr.ExtRuntimeError(e.strerror, filename) # backward compatibility: local role filtering def filter_role(meta, user, role): """ remove all project/package nodes if no person node exists where @userid=user and @role=role """ for kind, root in meta.items(): delete = [] for node in root.findall(kind): found = False for p in node.findall('person'): if p.get('userid') == user and p.get('role') == role: found = True break if not found: delete.append(node) for node in delete: root.remove(node) def find_default_project(apiurl: Optional[str] = None, package: Optional[str] = None): """ look though the list of conf.config['getpac_default_project'] and find the first project where the given package exists in the build service. """ if not conf.config['getpac_default_project']: return None candidates = re.split('[, ]+', conf.config['getpac_default_project']) if package is None or len(candidates) == 1: return candidates[0] # search through the list, where package exists ... for prj in candidates: try: # any fast query will do here. show_package_meta(apiurl, prj, package) return prj except HTTPError: pass return None def utime(filename, arg, ignore_einval=True): """wrapper around os.utime which ignore errno EINVAL by default""" try: # workaround for bnc#857610): if filename resides on a nfs share # os.utime might raise EINVAL os.utime(filename, arg) except OSError as e: if e.errno == errno.EINVAL and ignore_einval: return raise def which(name: str): """Searches "name" in PATH.""" name = os.path.expanduser(name) if os.path.isabs(name): if os.path.exists(name): return name return None for directory in os.environ.get('PATH', '').split(':'): path = os.path.join(directory, name) if os.path.exists(path): return path return None def get_comments(apiurl: str, kind, *args): url = makeurl(apiurl, ["comments", kind] + list(args)) f = http_GET(url) return xml_parse(f).getroot() def print_comments(apiurl: str, kind, *args): def print_rec(comments, indent=''): for comment in comments: print(indent, end='') print('(', comment.get('id'), ')', 'On', comment.get('when'), comment.get('who'), 'wrote:') text = indent + comment.text.replace('\r\n', ' \n') print(('\n' + indent).join(text.split('\n'))) print() print_rec([c for c in root if c.get('parent') == comment.get('id')], indent + ' ') root = get_comments(apiurl, kind, *args) comments = [c for c in root if c.get('parent') is None] if comments: print('\nComments:') print_rec(comments) def create_comment(apiurl: str, kind, comment, *args, **kwargs) -> Optional[str]: query = {} query["parent_id"] = kwargs.get("parent", None) u = makeurl(apiurl, ["comments", kind] + list(args), query=query) f = http_POST(u, data=comment) ret = xml_fromstring(f.read()).find('summary') if ret is None: return None return ret.text def delete_comment(apiurl: str, cid: str) -> Optional[str]: u = makeurl(apiurl, ['comment', cid]) f = http_DELETE(u) ret = xml_fromstring(f.read()).find('summary') if ret is None: return None return ret.text def get_rpmlint_log(apiurl: str, proj: str, pkg: str, repo: str, arch: str): u = 
makeurl(apiurl, ['build', proj, repo, arch, pkg, 'rpmlint.log']) f = http_GET(u) return f.read() def checkout_deleted_package(apiurl: str, proj: str, pkg: str, dst): pl = meta_get_filelist(apiurl, proj, pkg, deleted=True) query = {} query['deleted'] = 1 if os.path.isdir(dst): print(f'Restoring in existing directory {dst}') else: print(f'Creating {dst}') os.makedirs(dst) for filename in pl: print(f'Restoring {filename} to {dst}') full_file_path = os.path.join(dst, filename) u = makeurl(apiurl, ['source', proj, pkg, filename], query=query) with open(full_file_path, 'wb') as f: for data in streamfile(u): f.write(data) print('done.') def vc_export_env(apiurl: str, quiet=False): # try to set the env variables for the user's realname and email # (the variables are used by the "vc" script or some source service) tag2envs = {'realname': ['VC_REALNAME'], 'email': ['VC_MAILADDR', 'mailaddr']} tag2val = {} missing_tags = [] for (tag, envs) in tag2envs.items(): env_present = [env for env in envs if env in os.environ] config_present = bool(conf.config['api_host_options'][apiurl].get(tag, None)) if not env_present and not config_present: missing_tags.append(tag) elif config_present: tag2val[tag] = conf.config['api_host_options'][apiurl][tag] if missing_tags: user = conf.get_apiurl_usr(apiurl) data = get_user_data(apiurl, user, *missing_tags) if data: for tag in missing_tags: val = data.pop(0) if val != '-': tag2val[tag] = val elif not quiet: msg = f'Try env {tag2envs[tag][0]}=...' print(msg, file=sys.stderr) for (tag, val) in tag2val.items(): for env in tag2envs[tag]: if val: os.environ[env] = val class MultibuildFlavorResolver: def __init__(self, apiurl: str, project: str, package: str, use_local=False): self.apiurl = apiurl self.project = project self.package = package # whether to use local _multibuild file or download it from server self.use_local = use_local def get_multibuild_data(self): """ Retrieve contents of _multibuild file from given project/package. Return None if the file doesn't exist. """ # use local _multibuild file if self.use_local: try: with open("_multibuild") as f: return f.read() except OSError as e: if e.errno != errno.EEXIST: raise return None # use _multibuild file from server query = {} query['expand'] = 1 u = makeurl(self.apiurl, ['source', self.project, self.package, '_multibuild'], query=query) try: f = http_GET(u) except HTTPError as e: if e.code == 404: return None raise return f.read() @staticmethod def parse_multibuild_data(s: str): """ Return set of flavors from a string with multibuild xml. """ result = set() # handle empty string and None if not s: return result root = xml_fromstring(s) for node in root.findall("flavor"): result.add(node.text) return result def resolve(self, patterns: List[str]): """ Return list of flavors based on given flavor `patterns`. If `patterns` contain a glob, it's resolved according to _multibuild file, values without globs are passed through. 
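        Illustrative example (``apiurl``, project, package and flavor names are
        placeholders)::

            resolver = MultibuildFlavorResolver(apiurl, 'home:user', 'foo')
            resolver.resolve(['python3*', 'minimal'])
            # globs are expanded against the flavors listed in _multibuild, plain
            # values are passed through, e.g. -> ['minimal', 'python310', 'python311']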
""" # determine if we're using globs # yes: contact server and do glob matching # no: use the specified values directly use_globs = False for pattern in patterns: if '*' in pattern: use_globs = True break if use_globs: multibuild_xml = self.get_multibuild_data() all_flavors = self.parse_multibuild_data(multibuild_xml) flavors = set() for pattern in patterns: # not a glob, use it as it is if '*' not in pattern: flavors.add(pattern) continue # match the globs with flavors from server for flavor in all_flavors: if fnmatch.fnmatch(flavor, pattern): flavors.add(flavor) else: flavors = patterns return sorted(flavors) def resolve_as_packages(self, patterns: List[str]): """ Return list of package:flavor based on given flavor `patterns`. If a value from `patterns` contains a glob, it is resolved according to the _multibuild file. Values without globs are passed through. If a value is empty string, package without flavor is returned. """ flavors = self.resolve(patterns) packages = [] for flavor in flavors: if flavor: packages.append(self.package + ':' + flavor) else: # special case: no flavor packages.append(self.package) return packages # vim: sw=4 et osc-1.12.1/osc/credentials.py000066400000000000000000000255311475337502500160340ustar00rootroot00000000000000import base64 import bz2 import getpass import importlib import sys from urllib.parse import urlsplit try: import keyring except ImportError: keyring = None except BaseException as e: # catch and report any exceptions raised in the 'keyring' module msg = "Warning: Unable to load the 'keyring' module due to an internal error:" print(msg, e, file=sys.stderr) keyring = None from . import conf from . import oscerr class AbstractCredentialsManagerDescriptor: def name(self): raise NotImplementedError() def description(self): raise NotImplementedError() def priority(self): # priority determines order in the credentials managers list # higher number means higher priority raise NotImplementedError() def create(self, cp): raise NotImplementedError() def __lt__(self, other): return (-self.priority(), self.name()) < (-other.priority(), other.name()) class AbstractCredentialsManager: config_entry = 'credentials_mgr_class' def __init__(self, cp, options): super().__init__() self._cp = cp self._process_options(options) @classmethod def create(cls, cp, options): return cls(cp, options) def _get_password(self, url, user, apiurl=None): raise NotImplementedError() def get_password(self, url, user, defer=True, apiurl=None): if defer: return conf.Password(lambda: self._get_password(url, user, apiurl=apiurl)) else: return conf.Password(self._get_password(url, user, apiurl=apiurl)) def set_password(self, url, user, password): raise NotImplementedError() def delete_password(self, url, user): raise NotImplementedError() def _qualified_name(self): return qualified_name(self) def _process_options(self, options): pass class PlaintextConfigFileCredentialsManager(AbstractCredentialsManager): def get_password(self, url, user, defer=True, apiurl=None): password = self._cp.get(url, "pass", fallback=None, raw=True) if password is None: return None return conf.Password(password) def set_password(self, url, user, password): self._cp.set(url, 'pass', password) self._cp.set(url, self.config_entry, self._qualified_name()) def delete_password(self, url, user): self._cp.remove_option(url, 'pass') def _process_options(self, options): if options is not None: raise RuntimeError('options must be None') class PlaintextConfigFileDescriptor(AbstractCredentialsManagerDescriptor): def 
name(self): return 'Config' def description(self): return 'Store the password in plain text in the osc config file [insecure, persistent]' def priority(self): return 1 def create(self, cp): return PlaintextConfigFileCredentialsManager(cp, None) class ObfuscatedConfigFileCredentialsManager(PlaintextConfigFileCredentialsManager): def get_password(self, url, user, defer=True, apiurl=None): if self._cp.has_option(url, 'passx', proper=True): passwd = self._cp.get(url, 'passx', raw=True) else: passwd = super().get_password(url, user, apiurl=apiurl) password = self.decode_password(passwd) return conf.Password(password) def set_password(self, url, user, password): compressed_pw = bz2.compress(password.encode('ascii')) password = base64.b64encode(compressed_pw).decode("ascii") super().set_password(url, user, password) def delete_password(self, url, user): self._cp.remove_option(url, 'passx') super().delete_password(url, user) @classmethod def decode_password(cls, password): if password is None: # avoid crash on encoding None when 'pass' is not specified in the config return None compressed_pw = base64.b64decode(password.encode("ascii")) return bz2.decompress(compressed_pw).decode("ascii") class ObfuscatedConfigFileDescriptor(AbstractCredentialsManagerDescriptor): def name(self): return 'Obfuscated config' def description(self): return 'Store the password in obfuscated form in the osc config file [insecure, persistent]' def priority(self): return 2 def create(self, cp): return ObfuscatedConfigFileCredentialsManager(cp, None) class TransientCredentialsManager(AbstractCredentialsManager): def __init__(self, *args, **kwargs): super().__init__(*args, **kwargs) self._password = None def _process_options(self, options): if options is not None: raise RuntimeError('options must be None') def _get_password(self, url, user, apiurl=None): if self._password is None: if apiurl: # strip scheme from apiurl because we don't want to display it to the user apiurl_no_scheme = urlsplit(apiurl)[1] msg = f'Password [{user}@{apiurl_no_scheme}]: ' else: msg = 'Password: ' self._password = getpass.getpass(msg) return self._password def set_password(self, url, user, password): self._password = password self._cp.set(url, self.config_entry, self._qualified_name()) def delete_password(self, url, user): self._password = None class TransientDescriptor(AbstractCredentialsManagerDescriptor): def name(self): return 'Transient' def description(self): return 'Do not store the password and always ask for it [secure, in-memory]' def priority(self): return 3 def create(self, cp): return TransientCredentialsManager(cp, None) class KeyringCredentialsManager(AbstractCredentialsManager): def __init__(self, *args, **kwargs): super().__init__(*args, **kwargs) self._password = None def _process_options(self, options): if options is None: raise RuntimeError('options may not be None') self._backend_cls_name = options def _load_backend(self): try: keyring_backend = keyring.core.load_keyring(self._backend_cls_name) except ModuleNotFoundError: msg = f"Invalid credentials_mgr_class: {self._backend_cls_name}" raise oscerr.ConfigError(msg, conf.config['conffile']) keyring.set_keyring(keyring_backend) @classmethod def create(cls, cp, options): if not has_keyring_support(): return None return super().create(cp, options) def _get_password(self, url, user, apiurl=None): if self._password is None: self._load_backend() self._password = keyring.get_password(urlsplit(url)[1], user) # TODO: this works fine on the command-line but a long-running process using 
osc library would start failing after changing the password in the keyring # TODO: implement retrieving the password again after basic auth fails; sufficiently inform user about what's being done return self._password def set_password(self, url, user, password): self._load_backend() keyring.set_password(urlsplit(url)[1], user, password) config_value = f"{self._qualified_name()}:{self._backend_cls_name}" self._cp.set(url, self.config_entry, config_value) self._password = password def delete_password(self, url, user): self._load_backend() service = urlsplit(url)[1] data = keyring.get_password(service, user) if data is None: return keyring.delete_password(service, user) class KeyringCredentialsDescriptor(AbstractCredentialsManagerDescriptor): def __init__(self, keyring_backend, name=None, description=None, priority=None): self._keyring_backend = keyring_backend self._name = name self._description = description self._priority = priority def name(self): if self._name: return self._name if hasattr(self._keyring_backend, 'name'): return self._keyring_backend.name return self._keyring_backend.__class__.__name__ def description(self): if self._description: return self._description return 'Backend provided by python-keyring' def priority(self): if self._priority is not None: return self._priority return 0 def create(self, cp): qualified_backend_name = qualified_name(self._keyring_backend) return KeyringCredentialsManager(cp, qualified_backend_name) # we're supporting only selected python-keyring backends in osc SUPPORTED_KEYRING_BACKENDS = { "keyutils.osc.OscKernelKeyringBackend": { "name": "Kernel keyring", "description": "Store password in user session keyring in kernel keyring [secure, in-memory, per-session]", "priority": 10, }, "keyring.backends.SecretService.Keyring": { "name": "Secret Service", "description": "Store password in Secret Service (GNOME Keyring backend) [secure, persistent]", "priority": 9, }, "keyring.backends.kwallet.DBusKeyring": { "name": "KWallet", "description": "Store password in KWallet [secure, persistent]", "priority": 8, }, } def get_credentials_manager_descriptors(): descriptors = [] if has_keyring_support(): for backend in keyring.backend.get_all_keyring(): qualified_backend_name = qualified_name(backend) data = SUPPORTED_KEYRING_BACKENDS.get(qualified_backend_name, None) if not data: continue descriptor = KeyringCredentialsDescriptor( backend, data["name"], data["description"], data["priority"] ) descriptors.append(descriptor) descriptors.append(PlaintextConfigFileDescriptor()) descriptors.append(ObfuscatedConfigFileDescriptor()) descriptors.append(TransientDescriptor()) descriptors.sort() return descriptors def get_keyring_credentials_manager(cp): keyring_backend = keyring.get_keyring() return KeyringCredentialsManager(cp, qualified_name(keyring_backend)) def create_credentials_manager(url, cp): config_entry = cp.get(url, AbstractCredentialsManager.config_entry) if ':' in config_entry: creds_mgr_cls, options = config_entry.split(':', 1) else: creds_mgr_cls = config_entry options = None mod, cls = creds_mgr_cls.rsplit('.', 1) try: creds_mgr = getattr(importlib.import_module(mod), cls).create(cp, options) except ModuleNotFoundError: msg = f"Invalid credentials_mgr_class: {creds_mgr_cls}" raise oscerr.ConfigError(msg, conf.config['conffile']) return creds_mgr def qualified_name(obj): return f"{obj.__module__}.{obj.__class__.__name__}" def has_keyring_support(): return keyring is not None 
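# Illustrative sketch (added; not part of the original module): how the
# descriptors and managers above typically fit together. ``cp`` is assumed to
# be osc's config parser with a section for ``url`` already present; the user
# name and password are made-up values.
def _example_pick_credentials_manager(cp, url):
    # descriptors are sorted by priority, highest priority first
    descriptors = get_credentials_manager_descriptors()
    mgr = descriptors[0].create(cp)
    # store the password via the selected backend ...
    mgr.set_password(url, "alice", "opensesame")
    # ... and read it back as a conf.Password object
    return mgr.get_password(url, "alice", defer=True)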
osc-1.12.1/osc/fetch.py000066400000000000000000000424521475337502500146310ustar00rootroot00000000000000# Copyright (C) 2006 Novell Inc. All rights reserved. # This program is free software; it may be used, copied, modified # and distributed under the terms of the GNU General Public Licence, # either version 2, or (at your option) any later version. import glob import os import re import shutil import subprocess import sys import tempfile from urllib.request import HTTPError from . import checker as osc_checker from . import conf from . import oscerr from .core import makeurl, dgst from .grabber import OscFileGrabber, OscMirrorGroup from .meter import create_text_meter from .util import packagequery, cpio from .util.helper import decode_it class Fetcher: def __init__(self, cachedir='/tmp', urllist=None, http_debug=False, cookiejar=None, offline=False, enable_cpio=True, modules=None, download_api_only=False): # set up progress bar callback self.progress_obj = None if sys.stdout.isatty(): self.progress_obj = create_text_meter(use_pb_fallback=False) self.cachedir = cachedir # generic download URL lists self.urllist = urllist or [] self.modules = modules or [] self.http_debug = http_debug self.offline = offline self.cpio = {} self.enable_cpio = enable_cpio self.download_api_only = download_api_only self.gr = OscFileGrabber(progress_obj=self.progress_obj) def __add_cpio(self, pac): prpap = f'{pac.project}/{pac.repository}/{pac.repoarch}/{pac.repopackage}' self.cpio.setdefault(prpap, {})[pac.repofilename] = pac def __download_cpio_archive(self, apiurl, project, repo, arch, package, **pkgs): if not pkgs: return query = {} query["binary"] = pkgs query["view"] = "cpio" query["module"] = self.modules try: url = makeurl(apiurl, ['build', project, repo, arch, package], query=query) sys.stdout.write("preparing download ...\r") sys.stdout.flush() with tempfile.NamedTemporaryFile(prefix='osc_build_cpio') as tmparchive: self.gr.urlgrab(url, filename=tmparchive.name, text=f'fetching packages for \'{project}\'') archive = cpio.CpioRead(tmparchive.name) archive.read() for hdr in archive: # XXX: we won't have an .errors file because we're using # getbinarylist instead of the public/... 
route # (which is routed to getbinaries) # getbinaries does not support kiwi builds if hdr.filename == b'.errors': archive.copyin_file(hdr.filename) raise oscerr.APIError('CPIO archive is incomplete ' '(see .errors file)') if package == '_repository': n = re.sub(br'\.pkg\.tar\.(zst|.z)$', b'.arch', hdr.filename) if n.startswith(b'container:'): n = re.sub(br'\.tar\.(zst|.z)$', b'.tar', hdr.filename) pac = pkgs[decode_it(n.rsplit(b'.', 1)[0])] pac.canonname = hdr.filename else: pac = pkgs[decode_it(n.rsplit(b'.', 1)[0])] else: # this is a kiwi product pac = pkgs[decode_it(hdr.filename)] # Extract a single file from the cpio archive fd = None tmpfile = None try: fd, tmpfile = tempfile.mkstemp(prefix='osc_build_file') archive.copyin_file(hdr.filename, decode_it(os.path.dirname(tmpfile)), decode_it(os.path.basename(tmpfile))) self.move_package(tmpfile, pac.localdir, pac) finally: if fd is not None: os.close(fd) if tmpfile is not None and os.path.exists(tmpfile): os.unlink(tmpfile) for pac in pkgs.values(): if not os.path.isfile(pac.fullfilename): raise oscerr.APIError('failed to fetch file \'%s\': ' 'missing in CPIO archive' % pac.repofilename) except HTTPError as e: if e.code != 414: raise # query str was too large keys = list(pkgs.keys()) if len(keys) == 1: raise oscerr.APIError('unable to fetch cpio archive: ' 'server always returns code 414') n = int(len(pkgs) / 2) new_pkgs = {k: pkgs[k] for k in keys[:n]} self.__download_cpio_archive(apiurl, project, repo, arch, package, **new_pkgs) new_pkgs = {k: pkgs[k] for k in keys[n:]} self.__download_cpio_archive(apiurl, project, repo, arch, package, **new_pkgs) def __fetch_cpio(self, apiurl): for prpap, pkgs in self.cpio.items(): project, repo, arch, package = prpap.split('/', 3) self.__download_cpio_archive(apiurl, project, repo, arch, package, **pkgs) def fetch(self, pac, prefix=''): # for use by the failure callback self.curpac = pac mg = OscMirrorGroup(self.gr, pac.urllist) if self.http_debug: print(f'\nURLs to try for package \'{pac}\':', file=sys.stderr) print('\n'.join(pac.urllist), file=sys.stderr) print(file=sys.stderr) try: with tempfile.NamedTemporaryFile(prefix='osc_build', delete=False) as tmpfile: mg_stat = mg.urlgrab(pac.filename, filename=tmpfile.name, text=f'{prefix}({pac.project}) {pac.filename}') if mg_stat: self.move_package(tmpfile.name, pac.localdir, pac) if not mg_stat: if self.enable_cpio: print('%s/%s: attempting download from api, since not found' % (pac.project, pac.name)) self.__add_cpio(pac) return print() print('Error: Failed to retrieve %s from the following locations ' '(in order):' % pac.filename, file=sys.stderr) print('\n'.join(pac.urllist), file=sys.stderr) sys.exit(1) finally: if os.path.exists(tmpfile.name): os.unlink(tmpfile.name) def move_package(self, tmpfile, destdir, pac_obj=None): canonname = None if pac_obj and pac_obj.name.startswith('container:'): canonname = pac_obj.canonname if canonname is None: pkgq = packagequery.PackageQuery.query(tmpfile, extra_rpmtags=(1044, 1051, 1052)) if pkgq: canonname = pkgq.canonname() else: if pac_obj is None: print('Unsupported file type: ', tmpfile, file=sys.stderr) sys.exit(1) canonname = pac_obj.binary decoded_canonname = decode_it(canonname) if b'/' in canonname or '/' in decoded_canonname: raise oscerr.OscIOError(None, 'canonname contains a slash') fullfilename = os.path.join(destdir, decoded_canonname) if pac_obj is not None: pac_obj.canonname = canonname pac_obj.fullfilename = fullfilename shutil.move(tmpfile, fullfilename) os.chmod(fullfilename, 0o644) def 
dirSetup(self, pac): dir = os.path.join(self.cachedir, pac.localdir) if not os.path.exists(dir): try: os.makedirs(dir, mode=0o755) except OSError as e: print('packagecachedir is not writable for you?', file=sys.stderr) print(e, file=sys.stderr) sys.exit(1) def _build_urllist(self, buildinfo, pac): if self.download_api_only: return [] urllist = self.urllist key = f'{pac.project}/{pac.repository}' project_repo_url = buildinfo.urls.get(key) if project_repo_url is not None: urllist = [project_repo_url] return urllist def run(self, buildinfo): apiurl = buildinfo.apiurl cached = 0 all = len(buildinfo.deps) for i in buildinfo.deps: urllist = self._build_urllist(buildinfo, i) i.makeurls(self.cachedir, urllist) # find container extension by looking in the cache if i.name.startswith('container:') and i.fullfilename.endswith('.tar.xz'): for ext in ['.tar.xz', '.tar.gz', '.tar']: if os.path.exists(i.fullfilename[:-7] + ext): i.canonname = i.canonname[:-7] + ext i.makeurls(self.cachedir, urllist) if os.path.exists(i.fullfilename): cached += 1 if not i.name.startswith('container:') and i.pacsuffix != 'rpm': continue hdrmd5_is_valid = True if i.hdrmd5: if i.name.startswith('container:'): hdrmd5 = dgst(i.fullfilename) if hdrmd5 != i.hdrmd5: hdrmd5_is_valid = False else: hdrmd5 = packagequery.PackageQuery.queryhdrmd5(i.fullfilename) if hdrmd5 != i.hdrmd5: if conf.config["api_host_options"][apiurl]["disable_hdrmd5_check"]: print(f"Warning: Ignoring a hdrmd5 mismatch for {i.fullfilename}: {hdrmd5} (actual) != {i.hdrmd5} (expected)") hdrmd5_is_valid = True else: print(f"The file will be redownloaded from the API due to a hdrmd5 mismatch for {i.fullfilename}: {hdrmd5} (actual) != {i.hdrmd5} (expected)") hdrmd5_is_valid = False if not hdrmd5_is_valid: os.unlink(i.fullfilename) cached -= 1 miss = 0 needed = all - cached if all: miss = 100.0 * needed / all print("%.1f%% cache miss. %d/%d dependencies cached.\n" % (miss, cached, all)) done = 1 for i in buildinfo.deps: if not os.path.exists(i.fullfilename): if self.offline: raise oscerr.OscIOError(None, 'Missing \'%s\' in cache: ' '--offline not possible.' % i.fullfilename) self.dirSetup(i) try: # if there isn't a progress bar, there is no output at all prefix = '' if not self.progress_obj: print('%d/%d (%s) %s' % (done, needed, i.project, i.filename)) else: prefix = '[%d/%d] ' % (done, needed) self.fetch(i, prefix=prefix) if not os.path.isfile(i.fullfilename): # if the file wasn't downloaded and cannot be found on disk, # mark it for downloading from the API self.__add_cpio(i) else: hdrmd5 = packagequery.PackageQuery.queryhdrmd5(i.fullfilename) if hdrmd5 != i.hdrmd5: if conf.config["api_host_options"][apiurl]["disable_hdrmd5_check"]: print(f"Warning: Ignoring a hdrmd5 mismatch for {i.fullfilename}: {hdrmd5} (actual) != {i.hdrmd5} (expected)") else: print(f"The file will be redownloaded from the API due to a hdrmd5 mismatch for {i.fullfilename}: {hdrmd5} (actual) != {i.hdrmd5} (expected)") os.unlink(i.fullfilename) self.__add_cpio(i) except KeyboardInterrupt: print('Cancelled by user (ctrl-c)') print('Exiting.') sys.exit(0) done += 1 self.__fetch_cpio(buildinfo.apiurl) prjs = list(buildinfo.projects.keys()) for prj in prjs: dest = os.path.join(self.cachedir, prj) pubkey_path_base = os.path.join(dest, "_pubkey") pubkey_paths = glob.glob(f"{pubkey_path_base}*") if self.offline: # we're offline, only index the keys found on disk if pubkey_paths: for pubkey_path in pubkey_paths: buildinfo.keys.append(pubkey_path) buildinfo.prjkeys.append(prj) continue from . 
import obs_api os.makedirs(dest, mode=0o755, exist_ok=True) pubkeys = [] try: keyinfo = obs_api.Keyinfo.from_api(buildinfo.apiurl, prj) for pubkey in keyinfo.pubkey_list or []: pubkeys.append(pubkey.value) except HTTPError as e: result = obs_api.Keyinfo.get_pubkey_deprecated(buildinfo.apiurl, prj, traverse=True) if result: # overwrite ``prj`` with the project that contains the key we're using prj, pubkey = result pubkeys.append(pubkey) # remove the existing files, we'll create new files with new contents for pubkey_path in pubkey_paths: os.unlink(pubkey_path) if pubkeys: for num, pubkey in enumerate(pubkeys): pubkey_path = f"{pubkey_path_base}-{num}" with open(pubkey_path, "w") as f: f.write(pubkey) buildinfo.keys.append(pubkey_path) if prj not in buildinfo.prjkeys: buildinfo.prjkeys.append(prj) def verify_pacs_old(pac_list): """Take a list of rpm filenames and run rpm -K on them. In case of failure, exit. Check all packages in one go, since this takes only 6 seconds on my Athlon 700 instead of 20 when calling 'rpm -K' for each of them. """ if not pac_list: return # don't care about the return value because we check the # output anyway, and rpm always writes to stdout. # save locale first (we rely on English rpm output here) saved_LC_ALL = os.environ.get('LC_ALL') os.environ['LC_ALL'] = 'en_EN' o = subprocess.Popen(['rpm', '-K'] + pac_list, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, close_fds=True).stdout # restore locale if saved_LC_ALL: os.environ['LC_ALL'] = saved_LC_ALL else: os.environ.pop('LC_ALL') for line in o.readlines(): if 'OK' not in line: print() print('The following package could not be verified:', file=sys.stderr) print(line, file=sys.stderr) sys.exit(1) if 'NOT OK' in line: print() print('The following package could not be verified:', file=sys.stderr) print(line, file=sys.stderr) if 'MISSING KEYS' in line: missing_key = line.split('#')[-1].split(')')[0] print(""" - If the key (%(name)s) is missing, install it first. For example, do the following: osc signkey PROJECT > file and, as root: rpm --import %(dir)s/keyfile-%(name)s Then, just start the build again. - If you do not trust the packages, you should configure osc build for XEN or KVM - You may use --no-verify to skip the verification (which is a risk for your system). """ % {'name': missing_key, 'dir': os.path.expanduser('~')}, file=sys.stderr) else: print(""" - If the signature is wrong, you may try deleting the package manually and re-run this program, so it is fetched again. """, file=sys.stderr) sys.exit(1) def verify_pacs(bi): """Take a list of rpm filenames and verify their signatures. In case of failure, exit. """ pac_list = [i.fullfilename for i in bi.deps] if conf.config['builtin_signature_check'] is not True: return verify_pacs_old(pac_list) if not pac_list: return if not bi.keys: raise oscerr.APIError("can't verify packages due to lack of GPG keys") print("using keys from", ', '.join(bi.prjkeys)) failed = False checker = osc_checker.Checker() try: checker.readkeys(bi.keys) for pkg in pac_list: try: checker.check(pkg) except Exception as e: failed = True print(pkg, ':', e) except: checker.cleanup() raise if failed: checker.cleanup() sys.exit(1) checker.cleanup() # vim: sw=4 et osc-1.12.1/osc/git_scm/000077500000000000000000000000001475337502500146045ustar00rootroot00000000000000osc-1.12.1/osc/git_scm/README.md000066400000000000000000000002521475337502500160620ustar00rootroot00000000000000# Warning This module provides EXPERIMENTAL and UNSTABLE support for git scm such as https://src.opensuse.org/. 
The code may change or disappear without a prior notice! osc-1.12.1/osc/git_scm/__init__.py000066400000000000000000000003241475337502500167140ustar00rootroot00000000000000import sys from .store import GitStore def warn_experimental(): print("WARNING: Using EXPERIMENTAL support for git scm. The functionality may change or disappear without a prior notice!", file=sys.stderr) osc-1.12.1/osc/git_scm/store.py000066400000000000000000000160251475337502500163160ustar00rootroot00000000000000import json import os import subprocess import urllib.parse from pathlib import Path from .. import conf as osc_conf from .. import oscerr class GitStore: @classmethod def is_project_dir(cls, path): try: store = cls(path) except oscerr.NoWorkingCopy: return False return store.is_project @classmethod def is_package_dir(cls, path): try: store = cls(path) except oscerr.NoWorkingCopy: return False return store.is_package def __init__(self, path, check=True): self.path = path self.abspath = os.path.abspath(self.path) try: self.toplevel = self._run_git(["rev-parse", "--show-toplevel"]) self.toplevel = os.path.abspath(self.toplevel) except subprocess.CalledProcessError: self.toplevel = None # TODO: how to determine if the current git repo contains a project or a package? self.is_project = False self.is_package = False if self.toplevel: # NOTE: we have only one store in project-git for all packages config_path = os.path.join(self.toplevel, "_config") pbuild_path = os.path.join(self.toplevel, "_pbuild") if self.toplevel == self.abspath and (os.path.isfile(config_path) or os.path.isfile(pbuild_path)): self.is_project = True self.is_package = False else: self.is_project = False self.is_package = True self._package = None self._project = None if check and not any([self.is_project, self.is_package]): msg = f"Directory '{self.path}' is not a Git SCM working copy" raise oscerr.NoWorkingCopy(msg) if check and not self.scmurl: msg = f"Directory '{self.path}' is a Git SCM repo that lacks the 'origin' remote" raise oscerr.NoWorkingCopy(msg) # TODO: decide if we need explicit 'git lfs pull' or not # self._run_git(["lfs", "pull"]) def assert_is_project(self): if not self.is_project: msg = f"Directory '{self.path}' is not a Git SCM working copy of a project" raise oscerr.NoWorkingCopy(msg) def assert_is_package(self): if not self.is_package: msg = f"Directory '{self.path}' is not a Git SCM working copy of a package" raise oscerr.NoWorkingCopy(msg) def _run_git(self, args): return subprocess.check_output(["git"] + args, encoding="utf-8", cwd=self.abspath).strip() @property def apiurl(self): from ..obs_scm.store import Store # read apiurl from the current directory with .osc metadata # NOTE: this never triggers if a store is retrieved from osc.store.get_store(), # because obs_scm store takes precedence as .osc is present store = Store(self.toplevel, check=False) if store.apiurl: return store.apiurl # read project from parent directory that contains a project with .osc metadata store = Store(os.path.join(self.toplevel, ".."), check=False) if store.is_project and store.apiurl: return store.apiurl # HACK: we're using the currently configured apiurl return osc_conf.config["apiurl"] @property def project(self): from ..obs_scm.store import Store # read project from the current directory with .osc metadata # NOTE: this never triggers if a store is retrieved from osc.store.get_store(), # because obs_scm store takes precedence as .osc is present if self._project is None: store = Store(self.toplevel, check=False) if store.project: self._project 
= store.project # read project from parent directory that contains a project with .osc metadata if self._project is None: store = Store(os.path.join(self.toplevel, ".."), check=False) if store.is_project and store.project: self._project = store.project # guess project from git branch if self._project is None: # get project from the branch name branch = self._run_git(["branch", "--show-current"]) # HACK: replace hard-coded mapping with metadata from git or the build service # NOTE: you never know which git repo is supposed to be used in which project if branch == "factory": self._project = "openSUSE:Factory" else: raise oscerr.NoWorkingCopy(f"Couldn't map git branch '{branch}' to a project") return self._project @project.setter def project(self, value): self._project = value @property def package(self): if self._package is None: origin = self._run_git(["remote", "get-url", "origin"]) self._package = Path(urllib.parse.urlsplit(origin).path).stem return self._package @package.setter def package(self, value): self._package = value def _get_option(self, name): try: result = self._run_git(["config", "--local", "--get", f"osc.{name}"]) except subprocess.CalledProcessError: result = None return result def _check_type(self, name, value, expected_type): if not isinstance(value, expected_type): raise TypeError(f"The option '{name}' should be {expected_type.__name__}, not {type(value).__name__}") def _set_option(self, name, value): self._run_git(["config", "--local", f"osc.{name}", value]) def _unset_option(self, name): try: self._run_git(["config", "--local", "--unset", f"osc.{name}"]) except subprocess.CalledProcessError: pass def _get_dict_option(self, name): result = self._get_option(name) if result is None: return None result = json.loads(result) self._check_type(name, result, dict) return result def _set_dict_option(self, name, value): if value is None: self._unset_option(name) return self._check_type(name, value, dict) value = json.dumps(value) self._set_option(name, value) @property def last_buildroot(self): self.assert_is_package() result = self._get_dict_option("last-buildroot") if result is not None: result = (result["repo"], result["arch"], result["vm_type"]) return result @last_buildroot.setter def last_buildroot(self, value): self.assert_is_package() if len(value) != 3: raise ValueError("A tuple with exactly 3 items is expected: (repo, arch, vm_type)") value = { "repo": value[0], "arch": value[1], "vm_type": value[2], } self._set_dict_option("last-buildroot", value) @property def scmurl(self): try: return self._run_git(["remote", "get-url", "origin"]) except subprocess.CalledProcessError: return None osc-1.12.1/osc/gitea_api/000077500000000000000000000000001475337502500151015ustar00rootroot00000000000000osc-1.12.1/osc/gitea_api/__init__.py000066400000000000000000000006331475337502500172140ustar00rootroot00000000000000from .connection import Connection from .exceptions import BranchDoesNotExist from .exceptions import BranchExists from .exceptions import ForkExists from .exceptions import GiteaException from .branch import Branch from .conf import Config from .conf import Login from .fork import Fork from .git import Git from .pr import PullRequest from .ssh_key import SSHKey from .repo import Repo from .user import User osc-1.12.1/osc/gitea_api/branch.py000066400000000000000000000053141475337502500167130ustar00rootroot00000000000000from typing import Optional from .connection import Connection from .connection import GiteaHTTPResponse from .exceptions import BranchDoesNotExist from 
.exceptions import BranchExists from .exceptions import GiteaException class Branch: @classmethod def get( cls, conn: Connection, owner: str, repo: str, branch: str, ) -> GiteaHTTPResponse: """ Retrieve details about a repository branch. :param conn: Gitea ``Connection`` instance. :param owner: Owner of the repo. :param repo: Name of the repo. :param branch: Name of the branch. """ url = conn.makeurl("repos", owner, repo, "branches", branch) try: return conn.request("GET", url) except GiteaException as e: if e.status == 404: raise BranchDoesNotExist(e.response, owner, repo, branch) from None raise @classmethod def list( cls, conn: Connection, owner: str, repo: str, ) -> GiteaHTTPResponse: """ Retrieve details about all repository branches. :param conn: Gitea ``Connection`` instance. :param owner: Owner of the repo. :param repo: Name of the repo. """ url = conn.makeurl("repos", owner, repo, "branches") # XXX: returns 'null' when there are no branches; an empty list would be a better API return conn.request("GET", url) @classmethod def create( cls, conn: Connection, owner: str, repo: str, *, old_ref_name: Optional[str] = None, new_branch_name: str, exist_ok: bool = False, ) -> GiteaHTTPResponse: """ Create a new branch in a repository. :param conn: Gitea ``Connection`` instance. :param owner: Owner of the repo. :param repo: Name of the repo. :param old_ref_name: Name of the old branch/tag/commit to create from. :param new_branch_name: Name of the branch to create. :param exist_ok: A ``BranchExists`` exception is raised when the target exists. Set to ``True`` to avoid throwing the exception. """ json_data = { "new_branch_name": new_branch_name, "old_ref_name": old_ref_name, } url = conn.makeurl("repos", owner, repo, "branches") try: return conn.request("POST", url, json_data=json_data) except GiteaException as e: if e.status == 409: if exist_ok: return cls.get(conn, owner, repo, new_branch_name) raise BranchExists(e.response, owner, repo, new_branch_name) from None raise osc-1.12.1/osc/gitea_api/conf.py000066400000000000000000000136741475337502500164130ustar00rootroot00000000000000import io import os from typing import Any from typing import Dict from typing import List from typing import Optional import ruamel.yaml from osc import oscerr from osc.util.models import BaseModel from osc.util.models import Field class Login(BaseModel): name: str = Field() # type: ignore[assignment] url: str = Field() # type: ignore[assignment] user: str = Field() # type: ignore[assignment] token: str = Field() # type: ignore[assignment] ssh_key: Optional[str] = Field() # type: ignore[assignment] default: Optional[bool] = Field() # type: ignore[assignment] class AlreadyExists(oscerr.OscBaseError): def __init__(self, name): super().__init__() self.name = name def __str__(self): return f"Gitea config entry with name '{self.name}' already exists" class DoesNotExist(oscerr.OscBaseError): def __init__(self, **kwargs): super().__init__() self.kwargs = kwargs def __str__(self): if self.kwargs == {"name": None}: return "Could not find a default Gitea config entry" kwargs_str = ", ".join([f"{key}={value}" for key, value in self.kwargs.items()]) return f"Could not find a matching Gitea config entry: {kwargs_str}" def __init__(self, **kwargs): # ignore extra fields for key in list(kwargs): if key not in self.__fields__: kwargs.pop(key, None) super().__init__(**kwargs) def to_human_readable_string(self, *, show_token: bool = False): from osc.output import KeyValueTable table = KeyValueTable() table.add("Name", self.name, 
color="bold") if self.default: table.add("Default", "true", color="bold") table.add("URL", self.url) table.add("User", self.user) if self.ssh_key: table.add("SSH Key", self.ssh_key) if show_token: # tokens are stored in the plain text, there's not reason to protect them too much # let's only hide them from the output by default table.add("Token", self.token) return f"{table}" class Config: """ Manage the tea config.yml file. No data is cached in the objects, all changes are in sync with the file on disk. """ def __init__(self, path: Optional[str] = None): if not path: path = os.path.expanduser("~/.config/tea/config.yml") self.path = os.path.abspath(path) self.logins: List[Login] = [] def _read(self) -> Dict[str, Any]: try: with open(self.path, "r") as f: yaml = ruamel.yaml.YAML() return yaml.load(f) except FileNotFoundError: return {} def _write(self, data): yaml = ruamel.yaml.YAML() yaml.default_flow_style = False buf = io.StringIO() yaml.dump(data, buf) buf.seek(0) text = buf.read() os.makedirs(os.path.dirname(self.path), mode=0o700, exist_ok=True) with open(self.path, "w") as f: f.write(text) def list_logins(self) -> List[Login]: data = self._read() result = [] for i in data.get("logins", []): login = Login(**i) result.append(login) return result def get_login(self, name: Optional[str] = None) -> Login: """ Return ``Login`` object for the given ``name``. If ``name`` equals to ``None``, return the default ``Login``. """ for login in self.list_logins(): if name is None and login.default: return login if login.name == name: return login raise Login.DoesNotExist(name=name) def get_login_by_url_user(self, url: str, user: str) -> Login: """ Return ``Login`` object for the given ``url`` and ``user``. """ for login in self.list_logins(): if (login.url, login.user) == (url, user): return login raise Login.DoesNotExist(url=url, user=user) def add_login(self, login: Login): data = self._read() data.setdefault("logins", []) for entry in data["logins"]: if entry.get("name", None) == login.name: raise Login.AlreadyExists(login.name) else: if login.default: entry.pop("default", None) data["logins"].append(login.dict()) self._write(data) def remove_login(self, name: str) -> Login: # throw an exception if the login name doesn't exist login = self.get_login(name) data = self._read() for num, entry in enumerate(list(data["logins"])): if entry.get("name", None) == login.name: data["logins"].pop(num) self._write(data) return login def update_login( self, name: str, new_name: Optional[str] = None, new_url: Optional[str] = None, new_user: Optional[str] = None, new_token: Optional[str] = None, new_ssh_key: Optional[str] = None, set_as_default: Optional[bool] = None, ) -> Login: login = self.get_login(name) if new_name is not None: login.name = new_name if new_url is not None: login.url = new_url if new_user is not None: login.user = new_user if new_token is not None: login.token = new_token if new_ssh_key is not None: login.ssh_key = new_ssh_key if set_as_default: login.default = True if not login.has_changed(): return login data = self._read() for entry in data["logins"]: if entry.get("name", None) == name: entry.update(login.dict()) else: if set_as_default: entry.pop("default", None) self._write(data) return login osc-1.12.1/osc/gitea_api/connection.py000066400000000000000000000120351475337502500176130ustar00rootroot00000000000000import copy import http.client import json import time import urllib.parse from typing import Optional import urllib3 import urllib3.exceptions import urllib3.response from .conf 
import Login class GiteaHTTPResponse: """ A ``urllib3.response.HTTPResponse`` wrapper that ensures compatibility with older versions of urllib3. """ def __init__(self, response: urllib3.response.HTTPResponse): self.__dict__["_response"] = response def __getattr__(self, name): return getattr(self._response, name) def json(self): if hasattr(self._response, "json"): return self._response.json() return json.loads(self._response.data) class Connection: def __init__(self, login: Login, alternative_port: Optional[int] = None): """ :param login: ``Login`` object with Gitea url and credentials. :param alternative_port: Use an alternative port for the connection. This is needed for testing when gitea runs on a random port. """ self.login = login parsed_url = urllib.parse.urlparse(self.login.url, scheme="https") if parsed_url.scheme == "http": ConnectionClass = urllib3.connection.HTTPConnection elif parsed_url.scheme == "https": ConnectionClass = urllib3.connection.HTTPSConnection else: raise ValueError(f"Unsupported scheme in Gitea url '{self.login.url}'") self.host = parsed_url.hostname assert self.host is not None self.port = alternative_port if alternative_port else parsed_url.port self.conn = ConnectionClass(host=self.host, port=self.port) # retries; variables are named according to urllib3 self.retry_count = 3 self.retry_backoff_factor = 2 self.retry_status_forcelist = ( 500, # Internal Server Error 502, # Bad Gateway 503, # Service Unavailable 504, # Gateway Timeout ) if hasattr(self.conn, "set_cert"): # needed to avoid: AttributeError: 'HTTPSConnection' object has no attribute 'assert_hostname'. Did you mean: 'server_hostname'? self.conn.set_cert() def makeurl(self, *path: str, query: Optional[dict] = None): """ Return relative url prefixed with "/api/v1/" followed with concatenated ``*path``. """ url_path = ["", "api", "v1"] + [urllib.parse.quote(i, safe="/:") for i in path] url_path_str = "/".join(url_path) if query is None: query = {} query = copy.deepcopy(query) for key in list(query): value = query[key] if value in (None, [], ()): # remove items with value equal to None or [] or () del query[key] elif isinstance(value, bool): # convert boolean values to "0" or "1" query[key] = str(int(value)) url_query_str = urllib.parse.urlencode(query, doseq=True) return urllib.parse.urlunsplit(("", "", url_path_str, url_query_str, "")) def request(self, method, url, json_data: Optional[dict] = None) -> GiteaHTTPResponse: """ Make a request and return ``GiteaHTTPResponse``. 
""" headers = { "Content-Type": "application/json", } if self.login.token: headers["Authorization"] = f"token {self.login.token}" if json_data: json_data = dict(((key, value) for key, value in json_data.items() if value is not None)) body = json.dumps(json_data) if json_data else None for retry in range(1 + self.retry_count): # 1 regular request + ``self.retry_count`` retries try: self.conn.request(method, url, body, headers) response = self.conn.getresponse() if response.status not in self.retry_status_forcelist: # we are happy with the response status -> use the response break if retry >= self.retry_count: # we have reached maximum number of retries -> use the response break except (urllib3.exceptions.HTTPError, ConnectionResetError): if retry >= self.retry_count: raise # {backoff factor} * (2 ** ({number of previous retries})) time.sleep(self.retry_backoff_factor * (2 ** retry)) self.conn.close() if isinstance(response, http.client.HTTPResponse): result = GiteaHTTPResponse(urllib3.response.HTTPResponse.from_httplib(response)) else: result = GiteaHTTPResponse(response) if not hasattr(response, "status"): from .exceptions import GiteaException # pylint: disable=import-outside-toplevel,cyclic-import raise GiteaException(result) if response.status // 100 != 2: from .exceptions import GiteaException # pylint: disable=import-outside-toplevel,cyclic-import raise GiteaException(result) return result osc-1.12.1/osc/gitea_api/exceptions.py000066400000000000000000000037721475337502500176450ustar00rootroot00000000000000import re from .. import oscerr from .connection import GiteaHTTPResponse class GiteaException(oscerr.OscBaseError): def __init__(self, response: GiteaHTTPResponse): self.response = response @property def status(self): return self.response.status @property def reason(self): return self.response.reason def __str__(self): result = f"{self.status} {self.reason}" if self.response.data: result += f": {self.response.data}" return result class BranchDoesNotExist(GiteaException): def __init__(self, response: GiteaHTTPResponse, owner: str, repo: str, branch: str): super().__init__(response) self.owner = owner self.repo = repo self.branch = branch def __str__(self): result = f"Repo '{self.owner}/{self.repo}' does not contain branch '{self.branch}'" return result class BranchExists(GiteaException): def __init__(self, response: GiteaHTTPResponse, owner: str, repo: str, branch: str): super().__init__(response) self.owner = owner self.repo = repo self.branch = branch def __str__(self): result = f"Repo '{self.owner}/{self.repo}' already contains branch '{self.branch}'" return result class ForkExists(GiteaException): def __init__(self, response: GiteaHTTPResponse, owner: str, repo: str): super().__init__(response) self.owner = owner self.repo = repo regex = re.compile(r".*fork path: (?P[^/]+)/(?P[^\]]+)\].*") match = regex.match(self.response.json()["message"]) assert match is not None self.fork_owner = match.groupdict()["owner"] self.fork_repo = match.groupdict()["repo"] def __str__(self): result = f"Repo '{self.owner}/{self.repo}' is already forked as '{self.fork_owner}/{self.fork_repo}'" return result class InvalidSshPublicKey(oscerr.OscBaseError): def __str__(self): return "Invalid public ssh key" osc-1.12.1/osc/gitea_api/fork.py000066400000000000000000000040451475337502500164170ustar00rootroot00000000000000from typing import Optional from .connection import Connection from .connection import GiteaHTTPResponse from .exceptions import ForkExists from .exceptions import GiteaException class Fork: 
@classmethod def list( cls, conn: Connection, owner: str, repo: str, ) -> GiteaHTTPResponse: """ List forks of a repository. :param conn: Gitea ``Connection`` instance. :param owner: Owner of the repo. :param repo: Name of the repo. """ url = conn.makeurl("repos", owner, repo, "forks") return conn.request("GET", url) @classmethod def create( cls, conn: Connection, owner: str, repo: str, *, new_repo_name: Optional[str] = None, target_org: Optional[str] = None, exist_ok: bool = False, ) -> GiteaHTTPResponse: """ Fork a repository. :param conn: Gitea ``Connection`` instance. :param owner: Owner of the repo. :param repo: Name of the repo. :param new_repo_name: Name of the forked repository. :param target_org: Name of the organization, if forking into organization. :param exist_ok: A ``ForkExists`` exception is raised when the target exists. Set to ``True`` to avoid throwing the exception. """ json_data = { "name": new_repo_name, "organization": target_org, } url = conn.makeurl("repos", owner, repo, "forks") try: return conn.request("POST", url, json_data=json_data) except GiteaException as e: # use ForkExists exception to parse fork_owner and fork_repo from the response if e.status == 409: fork_exists_exception = ForkExists(e.response, owner, repo) if exist_ok: from . import Repo return Repo.get(conn, fork_exists_exception.fork_owner, fork_exists_exception.fork_repo) raise fork_exists_exception from None raise osc-1.12.1/osc/gitea_api/git.py000066400000000000000000000045251475337502500162440ustar00rootroot00000000000000import os import subprocess import urllib from typing import Optional from typing import Tuple class Git: def __init__(self, workdir): self.abspath = os.path.abspath(workdir) def _run_git(self, args: list) -> str: return subprocess.check_output(["git"] + args, encoding="utf-8", cwd=self.abspath).strip() def init(self, *, quiet=True): cmd = ["init"] if quiet: cmd += ["-q"] self._run_git(cmd) # BRANCHES @property def current_branch(self) -> str: return self._run_git(["branch", "--show-current"]) def get_branch_head(self, branch: str) -> str: return self._run_git(["rev-parse", branch]) def switch(self, branch: str): self._run_git(["switch", branch]) def fetch_pull_request( self, pull_number: int, *, remote: str = "origin", force: bool = False, ): """ Fetch pull/$pull_number/head to pull/$pull_number branch """ target_branch = f"pull/{pull_number}" cmd = ["fetch", remote, f"pull/{pull_number}/head:{target_branch}"] if force: cmd += [ "--force", "--update-head-ok", ] self._run_git(cmd) return target_branch # CONFIG def set_config(self, key: str, value: str): self._run_git(["config", key, value]) # REMOTES def get_remote_url(self, name: str = "origin") -> str: return self._run_git(["remote", "get-url", name]) def add_remote(self, name: str, url: str): self._run_git(["remote", "add", name, url]) def fetch(self, name: Optional[str] = None): if name: cmd = ["fetch", name] else: cmd = ["fetch", "--all"] self._run_git(cmd) def get_owner_repo(self, remote: str = "origin") -> Tuple[str, str]: remote_url = self.get_remote_url(name=remote) if "@" in remote_url: # ssh://gitea@example.com:owner/repo.git # ssh://gitea@example.com:22/owner/repo.git remote_url = remote_url.rsplit("@", 1)[-1] parsed_remote_url = urllib.parse.urlparse(remote_url) path = parsed_remote_url.path if path.endswith(".git"): path = path[:-4] owner, repo = path.strip("/").split("/")[-2:] return owner, repo osc-1.12.1/osc/gitea_api/pr.py000066400000000000000000000203421475337502500160750ustar00rootroot00000000000000import re 
from typing import List from typing import Optional from typing import Tuple from .connection import Connection from .connection import GiteaHTTPResponse class PullRequest: @classmethod def cmp(cls, entry: dict): if "base" in entry: # a proper pull request return entry["base"]["repo"]["full_name"], entry["number"] else: # an issue without pull request details return entry["repository"]["full_name"], entry["number"] @classmethod def split_id(cls, pr_id: str) -> Tuple[str, str, str]: """ Split /# into individual components and return them in a tuple. """ match = re.match(r"^([^/]+)/([^/]+)#([0-9]+)$", pr_id) if not match: raise ValueError(f"Invalid pull request id: {pr_id}") return match.groups() @classmethod def to_human_readable_string(cls, entry: dict): from osc.output import KeyValueTable from . import User def yes_no(value): return "yes" if value else "no" if "base" in entry: # a proper pull request entry_id = f"{entry['base']['repo']['full_name']}#{entry['number']}" is_pull_request = True else: # an issue without pull request details entry_id = f"{entry['repository']['full_name']}#{entry['number']}" is_pull_request = False # HACK: search API returns issues, the URL needs to be transformed to a pull request URL entry_url = entry["url"] entry_url = re.sub(r"^(.*)/api/v1/repos/(.+/.+)/issues/([0-9]+)$", r"\1/\2/pulls/\3", entry_url) table = KeyValueTable() table.add("ID", entry_id, color="bold") table.add("URL", f"{entry_url}") table.add("Title", f"{entry['title']}") table.add("State", entry["state"]) if is_pull_request: table.add("Draft", yes_no(entry["draft"])) table.add("Merged", yes_no(entry["merged"])) table.add("Allow edit", yes_no(entry["allow_maintainer_edit"])) table.add("Author", f"{User.to_login_full_name_email_string(entry['user'])}") if is_pull_request: table.add("Source", f"{entry['head']['repo']['full_name']}, branch: {entry['head']['ref']}, commit: {entry['head']['sha']}") table.add("Description", entry["body"]) return str(table) @classmethod def list_to_human_readable_string(cls, entries: List, sort: bool = False): if sort: entries = sorted(entries, key=cls.cmp) result = [] for entry in entries: result.append(cls.to_human_readable_string(entry)) return "\n\n".join(result) @classmethod def create( cls, conn: Connection, *, target_owner: str, target_repo: str, target_branch: str, source_owner: str, source_branch: str, title: str, description: Optional[str] = None, ) -> GiteaHTTPResponse: """ Create a pull request to ``owner``/``repo`` to the ``base`` branch. The pull request comes from a fork. The fork repo name is determined from gitea database. :param conn: Gitea ``Connection`` instance. :param target_owner: Owner of the target repo. :param target_repo: Name of the target repo. :param target_branch: Name of the target branch in the target repo. :param source_owner: Owner of the source (forked) repo. :param source_branch: Name of the source branch in the source (forked) repo. :param title: Pull request title. :param description: Pull request description. """ url = conn.makeurl("repos", target_owner, target_repo, "pulls") data = { "base": target_branch, "head": f"{source_owner}:{source_branch}", "title": title, "body": description, } return conn.request("POST", url, json_data=data) @classmethod def get( cls, conn: Connection, owner: str, repo: str, number: int, ) -> GiteaHTTPResponse: """ Get a pull request. :param conn: Gitea ``Connection`` instance. :param owner: Owner of the repo. :param repo: Name of the repo. :param number: Number of the pull request in the repo. 
""" url = conn.makeurl("repos", owner, repo, "pulls", str(number)) return conn.request("GET", url) @classmethod def set( cls, conn: Connection, owner: str, repo: str, number: int, *, title: Optional[str] = None, description: Optional[str] = None, allow_maintainer_edit: Optional[bool] = None, ) -> GiteaHTTPResponse: """ Change a pull request. :param conn: Gitea ``Connection`` instance. :param owner: Owner of the repo. :param repo: Name of the repo. :param number: Number of the pull request in the repo. :param title: Change pull request title. :param description: Change pull request description. :param allow_maintainer_edit: Change whether users with write access to the base branch can also push to the pull request's head branch. """ json_data = { "title": title, "description": description, "allow_maintainer_edit": allow_maintainer_edit, } url = conn.makeurl("repos", owner, repo, "pulls", str(number)) return conn.request("PATCH", url, json_data=json_data) @classmethod def list( cls, conn: Connection, owner: str, repo: str, *, state: Optional[str] = "open", ) -> GiteaHTTPResponse: """ List pull requests in a repo. :param conn: Gitea ``Connection`` instance. :param owner: Owner of the repo. :param repo: Name of the repo. :param state: Filter by state: open, closed, all. Defaults to open. """ if state == "all": state = None q = { "state": state, } url = conn.makeurl("repos", owner, repo, "pulls", query=q) return conn.request("GET", url) @classmethod def search( cls, conn: Connection, *, state: str = "open", title: Optional[str] = None, owner: Optional[str] = None, labels: Optional[List[str]] = None, assigned: bool = False, created: bool = False, mentioned: bool = False, review_requested: bool = False, ) -> GiteaHTTPResponse: """ Search pull requests. :param conn: Gitea ``Connection`` instance. :param state: Filter by state: open, closed. Defaults to open. :param title: Filter by substring in title. :param owner: Filter by owner of the repository associated with the pull requests. :param labels: Filter by associated labels. Non existent labels are discarded. :param assigned: Filter pull requests assigned to you. :param created: Filter pull requests created by you. :param mentioned: Filter pull requests mentioning you. :param review_requested: Filter pull requests requesting your review. """ q = { "type": "pulls", "state": state, "q": title, "owner": owner, "labels": ",".join(labels) if labels else None, "assigned": assigned, "created": created, "mentioned": mentioned, "review_requested": review_requested, } url = conn.makeurl("repos", "issues", "search", query=q) return conn.request("GET", url) @classmethod def get_patch( cls, conn: Connection, owner: str, repo: str, number: str, ) -> GiteaHTTPResponse: """ Get a patch associated with a pull request. :param conn: Gitea ``Connection`` instance. :param owner: Owner of the repo. :param repo: Name of the repo. :param number: Number of the pull request in the repo. """ url = conn.makeurl("repos", owner, repo, "pulls", f"{number}.patch") return conn.request("GET", url) osc-1.12.1/osc/gitea_api/repo.py000066400000000000000000000110271475337502500164210ustar00rootroot00000000000000import os import re import subprocess from typing import Optional from typing import Tuple from .connection import Connection from .connection import GiteaHTTPResponse from .user import User class Repo: @classmethod def split_id(cls, repo_id: str) -> Tuple[str, str]: """ Split / into individual components and return them in a tuple. 
""" match = re.match(r"^([^/]+)/([^/]+)$", repo_id) if not match: raise ValueError(f"Invalid repo id: {repo_id}") return match.groups() @classmethod def get( cls, conn: Connection, owner: str, repo: str, ) -> GiteaHTTPResponse: """ Retrieve details about a repository. :param conn: Gitea ``Connection`` instance. :param owner: Owner of the repo. :param repo: Name of the repo. """ url = conn.makeurl("repos", owner, repo) return conn.request("GET", url) @classmethod def clone( cls, conn: Connection, owner: str, repo: str, *, directory: Optional[str] = None, cwd: Optional[str] = None, anonymous: bool = False, add_remotes: bool = False, ssh_private_key_path: Optional[str] = None, ssh_strict_host_key_checking: bool = True, ) -> str: """ Clone a repository using 'git clone' command, return absolute path to it. :param conn: Gitea ``Connection`` instance. :param owner: Owner of the repo. :param repo: Name of the repo. :param directory: The name of a new directory to clone into. Defaults to the repo name. :param cwd: Working directory. Defaults to the current working directory. :param anonymous: Whether to use``clone_url`` for an anonymous access or use authenticated ``ssh_url``. :param add_remotes: Determine and add 'parent' or 'fork' remotes to the cloned repo. """ import shlex cwd = os.path.abspath(cwd) if cwd else os.getcwd() directory = directory if directory else repo # it's perfectly fine to use os.path.join() here because git can take an absolute path directory_abspath = os.path.join(cwd, directory) repo_data = cls.get(conn, owner, repo).json() clone_url = repo_data["clone_url"] if anonymous else repo_data["ssh_url"] remotes = {} if add_remotes: user = User.get(conn).json() if repo_data["owner"]["login"] == user["login"]: # we're cloning our own repo, setting remote to the parent (if exists) parent = repo_data["parent"] remotes["parent"] = parent["clone_url"] if anonymous else parent["ssh_url"] else: # we're cloning someone else's repo, setting remote to our fork (if exists) from . 
import Fork forks = Fork.list(conn, owner, repo).json() forks = [i for i in forks if i["owner"]["login"] == user["login"]] if forks: assert len(forks) == 1 fork = forks[0] remotes["fork"] = fork["clone_url"] if anonymous else fork["ssh_url"] ssh_args = [] env = os.environ.copy() if ssh_private_key_path: ssh_args += [ # avoid guessing the ssh key, use the specified one "-o IdentitiesOnly=yes", f"-o IdentityFile={shlex.quote(ssh_private_key_path)}", ] if not ssh_strict_host_key_checking: ssh_args += [ "-o StrictHostKeyChecking=no", "-o UserKnownHostsFile=/dev/null", "-o LogLevel=ERROR", ] if ssh_args: env["GIT_SSH_COMMAND"] = f"ssh {' '.join(ssh_args)}" # clone cmd = ["git", "clone", clone_url, directory] subprocess.run(cmd, cwd=cwd, env=env, check=True) # setup remotes for name, url in remotes.items(): cmd = ["git", "-C", directory_abspath, "remote", "add", name, url] subprocess.run(cmd, cwd=cwd, check=True) # store used ssh args (GIT_SSH_COMMAND) in the local git config # to allow seamlessly running ``git push`` and other commands if ssh_args: cmd = ["git", "-C", directory_abspath, "config", "core.sshCommand", f"echo 'Using core.sshCommand: {env['GIT_SSH_COMMAND']}' >&2; {env['GIT_SSH_COMMAND']}"] subprocess.run(cmd, cwd=cwd, check=True) return directory_abspath osc-1.12.1/osc/gitea_api/ssh_key.py000066400000000000000000000055461475337502500171320ustar00rootroot00000000000000from typing import Optional from .connection import Connection from .connection import GiteaHTTPResponse class SSHKey: @classmethod def get(cls, conn: Connection, id: int) -> GiteaHTTPResponse: """ Get an authenticated user's public key by its ``id``. :param conn: Gitea ``Connection`` instance. :param id: key numeric id """ url = conn.makeurl("user", "keys", str(id)) return conn.request("GET", url) @classmethod def list(cls, conn: Connection) -> GiteaHTTPResponse: """ List the authenticated user's public keys. :param conn: Gitea ``Connection`` instance. """ url = conn.makeurl("user", "keys") return conn.request("GET", url) @classmethod def _split_key(cls, key): import re return re.split(" +", key, maxsplit=2) @classmethod def _validate_key_format(cls, key): """ Check that the public ssh key has the correct format: - must be a single line of text - it is possible to split it into parts - the part is base64 encoded """ import base64 import binascii from .exceptions import InvalidSshPublicKey key = key.strip() if len(key.splitlines()) != 1: raise InvalidSshPublicKey() try: key_type, key_base64, key_comment = cls._split_key(key) except ValueError: raise InvalidSshPublicKey() try: base64.b64decode(key_base64) except binascii.Error: raise InvalidSshPublicKey() @classmethod def create(cls, conn: Connection, key: str, title: Optional[str] = None) -> GiteaHTTPResponse: """ Create a public key. :param conn: Gitea ``Connection`` instance. :param key: An armored SSH key to add. :param title: Title of the key to add. Derived from the key if not specified. """ url = conn.makeurl("user", "keys") cls._validate_key_format(key) if not title: title = cls._split_key(key)[2] data = { "key": key, "title": title, } return conn.request("POST", url, json_data=data) @classmethod def delete(cls, conn: Connection, id: int): """ Delete a public key :param conn: Gitea ``Connection`` instance. :param id: Id of key to delete. 
""" url = conn.makeurl("user", "keys", str(id)) return conn.request("DELETE", url) @classmethod def to_human_readable_string(cls, data): from osc.output import KeyValueTable table = KeyValueTable() table.add("ID", f"{data['id']}", color="bold") table.add("Title", f"{data['title']}") table.add("Key", f"{data['key']}") return str(table) osc-1.12.1/osc/gitea_api/user.py000066400000000000000000000013601475337502500164310ustar00rootroot00000000000000from .connection import Connection from .connection import GiteaHTTPResponse class User: @classmethod def to_full_name_email_string(cls, data): full_name = data["full_name"] email = data["email"] if full_name: return f"{full_name} <{email}>" return email @classmethod def to_login_full_name_email_string(cls, data): return f"{data['login']} ({cls.to_full_name_email_string(data)})" @classmethod def get( cls, conn: Connection, ) -> GiteaHTTPResponse: """ Retrieve details about the current user. :param conn: Gitea ``Connection`` instance. """ url = conn.makeurl("user") return conn.request("GET", url) osc-1.12.1/osc/grabber.py000066400000000000000000000031131475337502500151330ustar00rootroot00000000000000# Copyright (C) 2018 SUSE Linux. All rights reserved. # This program is free software; it may be used, copied, modified # and distributed under the terms of the GNU General Public Licence, # either version 2, or (at your option) any later version. import os from urllib.request import HTTPError from urllib.parse import urlparse from urllib.parse import unquote from urllib.error import URLError try: from urllib3.exceptions import URLSchemeUnknown except ImportError: class URLSchemeUnknown(Exception): pass from .core import streamfile class OscFileGrabber: def __init__(self, progress_obj=None): self.progress_obj = progress_obj def urlgrab(self, url, filename=None, text=None): if filename is None: parts = urlparse(url) filename = os.path.basename(unquote(parts[2])) with open(filename, 'wb') as f: for i in streamfile(url, progress_obj=self.progress_obj, text=text): f.write(i) class OscMirrorGroup: def __init__(self, grabber, mirrors): self._grabber = grabber self._mirrors = mirrors def urlgrab(self, url, filename=None, text=None): for mirror in self._mirrors: try: self._grabber.urlgrab(mirror, filename, text) return True except (HTTPError, URLError, URLSchemeUnknown, KeyError) as e: # urllib3 1.25.10 throws a KeyError: pool_key_constructor = self.key_fn_by_scheme[scheme] # try next mirror pass return False osc-1.12.1/osc/meter.py000066400000000000000000000065241475337502500146540ustar00rootroot00000000000000# Copyright (C) 2018 SUSE Linux. All rights reserved. # This program is free software; it may be used, copied, modified # and distributed under the terms of the GNU General Public Licence, # either version 2, or (at your option) any later version. 
import signal import sys from abc import ABC from abc import abstractmethod from typing import Optional try: import progressbar as pb have_pb_module = True using_pb_progressbar2 = tuple(map(int, pb.__version__.split('.'))) >= (3, 1) except ImportError: have_pb_module = False class TextMeterBase(ABC): @abstractmethod def start(self, basename: str, size: Optional[int] = None): pass @abstractmethod def update(self, amount_read: int): pass @abstractmethod def end(self): pass class PBTextMeter(TextMeterBase): def __init__(self): self.bar: pb.ProgressBar def start(self, basename: str, size: Optional[int] = None): if size is None: widgets = [f"{basename} ", pb.AnimatedMarker(), ' ', pb.Timer()] self.bar = pb.ProgressBar(widgets=widgets, maxval=pb.UnknownLength) else: widgets = [f"{basename} ", pb.Bar(), ' ', pb.ETA()] if size: # if size is 0, using pb.Percentage will result in # a ZeroDivisionException widgets[3:3] = [pb.Percentage(), " "] self.bar = pb.ProgressBar(widgets=widgets, maxval=size) # When a signal handler is set, it resets SA_RESTART flag # - see signal.siginterrupt() python docs. # ProgressBar's constructor sets signal handler for SIGWINCH. # So let's make sure that it doesn't interrupt syscalls in osc. signal.siginterrupt(signal.SIGWINCH, False) self.bar.start() def update(self, amount_read: int): self.bar.update(amount_read) def end(self): # replace marker (ticks) and left+right borders of the bar with spaces # to hide it from output after completion for i in self.bar.widgets: if not isinstance(i, pb.Bar): continue if using_pb_progressbar2: i.marker = lambda _progress, _data, _width: " " i.left = lambda _progress, _data, _width: " " i.right = lambda _progress, _data, _width: " " else: i.marker = " " i.left = " " i.right = " " self.bar.finish() class SimpleTextMeter(TextMeterBase): def start(self, basename: str, size: Optional[int] = None): print(basename, file=sys.stderr) def update(self, amount_read: int): pass def end(self): pass class NoTextMeter(TextMeterBase): def start(self, basename: str, size: Optional[int] = None): pass def update(self, amount_read: int): pass def end(self): pass def create_text_meter(*args, **kwargs) -> TextMeterBase: from .conf import config use_pb_fallback = kwargs.pop("use_pb_fallback", False) meter_class: TextMeterBase if config.quiet: meter_class = NoTextMeter elif not have_pb_module or not config.show_download_progress or not sys.stdout.isatty() or use_pb_fallback: meter_class = SimpleTextMeter else: meter_class = PBTextMeter return meter_class(*args, **kwargs) # vim: sw=4 et osc-1.12.1/osc/obs_api/000077500000000000000000000000001475337502500145735ustar00rootroot00000000000000osc-1.12.1/osc/obs_api/__init__.py000066400000000000000000000003241475337502500167030ustar00rootroot00000000000000from .keyinfo import Keyinfo from .package import Package from .package_sources import PackageSources from .person import Person from .project import Project from .request import Request from .token import Token osc-1.12.1/osc/obs_api/enums.py000066400000000000000000000040131475337502500162720ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class BlockModes(str, Enum): ALL = "all" LOCAL = "local" NEVER = "never" class BoolString(str, Enum): TRUE = "true" FALSE = "false" class BuildArch(str, Enum): NOARCH = "noarch" AARCH64 = "aarch64" AARCH64_ILP32 = "aarch64_ilp32" ARMV4L = "armv4l" ARMV5L = "armv5l" ARMV6L = "armv6l" ARMV7L = "armv7l" ARMV5EL = "armv5el" ARMV6EL = "armv6el" ARMV7EL = "armv7el" 
ARMV7HL = "armv7hl" ARMV8EL = "armv8el" HPPA = "hppa" M68K = "m68k" I386 = "i386" I486 = "i486" I586 = "i586" I686 = "i686" ATHLON = "athlon" IA64 = "ia64" K1OM = "k1om" LOONGARCH64 = "loongarch64" MIPS = "mips" MIPSEL = "mipsel" MIPS32 = "mips32" MIPS64 = "mips64" MIPS64EL = "mips64el" PPC = "ppc" PPC64 = "ppc64" PPC64P7 = "ppc64p7" PPC64LE = "ppc64le" RISCV64 = "riscv64" S390 = "s390" S390X = "s390x" SH4 = "sh4" SPARC = "sparc" SPARC64 = "sparc64" SPARC64V = "sparc64v" SPARCV8 = "sparcv8" SPARCV9 = "sparcv9" SPARCV9V = "sparcv9v" X86_64 = "x86_64" LOCAL = "local" class LinkedbuildModes(str, Enum): OFF = "off" LOCALDEP = "localdep" ALLDIRECT = "alldirect" ALL = "all" class LocalRole(str, Enum): MAINTAINER = "maintainer" BUGOWNER = "bugowner" REVIEWER = "reviewer" DOWNLOADER = "downloader" READER = "reader" class ObsRatings(str, Enum): LOW = "low" MODERATE = "moderate" IMPORTANT = "important" CRITICAL = "critical" class RebuildModes(str, Enum): TRANSITIVE = "transitive" DIRECT = "direct" LOCAL = "local" class ReleaseTriggers(str, Enum): MANUAL = "manual" MAINTENANCE = "maintenance" OBSGENDIFF = "obsgendiff" class RequestStates(str, Enum): REVIEW = "review" NEW = "new" ACCEPTED = "accepted" DECLINED = "declined" REVOKED = "revoked" SUPERSEDED = "superseded" DELETED = "deleted" osc-1.12.1/osc/obs_api/flag.py000066400000000000000000000007721475337502500160640ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class Flag(XmlModel): XML_TAG = None def __init__(self, flag, **kwargs): super().__init__(flag=flag, **kwargs) class FlagChoices(Enum): ENABLE = "enable" DISABLE = "disable" flag: FlagChoices = Field( xml_set_tag=True, ) arch: Optional[str] = Field( xml_attribute=True, ) repository: Optional[str] = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/group_role.py000066400000000000000000000004441475337502500173240ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .enums import LocalRole class GroupRole(XmlModel): XML_TAG = "group" groupid: str = Field( xml_attribute=True, ) role: LocalRole = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/keyinfo.py000066400000000000000000000074461475337502500166240ustar00rootroot00000000000000import textwrap from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .keyinfo_pubkey import KeyinfoPubkey from .keyinfo_sslcert import KeyinfoSslcert class Keyinfo(XmlModel): XML_TAG = "keyinfo" project: Optional[str] = Field( xml_attribute=True, description=textwrap.dedent( """ The name of the project. """ ), ) pubkey_list: Optional[List[KeyinfoPubkey]] = Field( xml_name="pubkey", ) sslcert_list: Optional[List[KeyinfoSslcert]] = Field( xml_name="sslcert", ) @classmethod def from_api(cls, apiurl: str, project: str) -> "Keyinfo": url_path = ["source", project, "_keyinfo"] url_query = {} response = cls.xml_request("GET", apiurl, url_path, url_query) return cls.from_file(response, apiurl=apiurl) @classmethod def get_pubkey_deprecated(cls, apiurl: str, project: str, *, traverse: bool = True) -> Optional[Tuple[str, str]]: """ Old API for retrieving pubkey of the given ``project``. Use ``Keyinfo.from_api()`` instead if possible. :param traverse: If set to ``True`` and the key is not found, traverse project hierarchy for the first available key. 
:return: (project, pubkey) or None """ from urllib.error import HTTPError from ..connection import http_request from ..core import makeurl from ..output import print_msg while True: url_path = ["source", project, "_pubkey"] url_query = {} url = makeurl(apiurl, url_path, url_query) try: response = http_request("GET", url) pubkey = response.read().decode("utf-8") return project, pubkey except HTTPError as e: if e.code != 404: raise if not traverse: return None parts = project.rsplit(":", 1) if parts[0] != project: print_msg(f"No pubkey found in project '{project}'. Trying the parent project '{parts[0]}'...", print_to="debug") project = parts[0] continue # we're at the top level, no key found return None @classmethod def get_sslcert_deprecated(cls, apiurl: str, project: str, *, traverse: bool = True) -> Optional[Tuple[str, str]]: """ Old API for retrieving sslcert of the given ``project``. Use ``Keyinfo.from_api()`` instead if possible. :param traverse: If set to ``True`` and the cert is not found, traverse project hierarchy for the first available cert. :return: (project, sslcert) or None """ from urllib.error import HTTPError from ..connection import http_request from ..core import makeurl from ..output import print_msg while True: url_path = ["source", project, "_project", "_sslcert"] url_query = { "meta": 1, } url = makeurl(apiurl, url_path, url_query) try: response = http_request("GET", url) sslcert = response.read().decode("utf-8") return project, sslcert except HTTPError as e: if e.code != 404: raise if not traverse: return None parts = project.rsplit(":", 1) if parts[0] != project: print_msg(f"No sslcert found in project '{project}'. Trying the parent project '{parts[0]}'...", print_to="debug") project = parts[0] continue # we're at the top level, no cert found return None osc-1.12.1/osc/obs_api/keyinfo_pubkey.py000066400000000000000000000025261475337502500201750ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class KeyinfoPubkey(XmlModel): XML_TAG = "pubkey" keyid: Optional[str] = Field( xml_attribute=True, ) userid: Optional[str] = Field( xml_attribute=True, ) algo: Optional[str] = Field( xml_attribute=True, ) keysize: Optional[str] = Field( xml_attribute=True, ) expires: Optional[int] = Field( xml_attribute=True, ) fingerprint: Optional[str] = Field( xml_attribute=True, ) value: str = Field( xml_set_text=True, ) def get_expires_str(self) -> str: import datetime if self.expires is None: return "" return datetime.datetime.fromtimestamp(self.expires).strftime("%Y-%m-%d %H:%M:%S") def to_human_readable_string(self) -> str: """ Render the object as a human readable string. 
""" from ..output import KeyValueTable table = KeyValueTable() table.add("Type", "GPG public key") table.add("User ID", self.userid, color="bold") table.add("Algorithm", self.algo) table.add("Key size", self.keysize) table.add("Expires", self.get_expires_str()) table.add("Fingerprint", self.fingerprint) return f"{table}\n{self.value}" osc-1.12.1/osc/obs_api/keyinfo_sslcert.py000066400000000000000000000036521475337502500203560ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class KeyinfoSslcert(XmlModel): XML_TAG = "sslcert" keyid: Optional[str] = Field( xml_attribute=True, ) serial: Optional[str] = Field( xml_attribute=True, ) issuer: Optional[str] = Field( xml_attribute=True, ) subject: Optional[str] = Field( xml_attribute=True, ) algo: Optional[str] = Field( xml_attribute=True, ) keysize: Optional[str] = Field( xml_attribute=True, ) begins: Optional[int] = Field( xml_attribute=True, ) expires: Optional[int] = Field( xml_attribute=True, ) fingerprint: Optional[str] = Field( xml_attribute=True, ) value: str = Field( xml_set_text=True, ) def get_begins_str(self) -> str: import datetime if self.begins is None: return "" return datetime.datetime.fromtimestamp(self.begins).strftime("%Y-%m-%d %H:%M:%S") def get_expires_str(self) -> str: import datetime if self.expires is None: return "" return datetime.datetime.fromtimestamp(self.expires).strftime("%Y-%m-%d %H:%M:%S") def to_human_readable_string(self) -> str: """ Render the object as a human readable string. """ from ..output import KeyValueTable table = KeyValueTable() table.add("Type", "SSL certificate") table.add("Subject", self.subject, color="bold") table.add("Key ID", self.keyid) table.add("Serial", self.serial) table.add("Issuer", self.issuer) table.add("Algorithm", self.algo) table.add("Key size", self.keysize) table.add("Begins", self.get_begins_str()) table.add("Expires", self.get_expires_str()) table.add("Fingerprint", self.fingerprint) return f"{table}\n{self.value}" osc-1.12.1/osc/obs_api/linkinfo.py000066400000000000000000000013621475337502500167600ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class Linkinfo(XmlModel): XML_TAG = "linkinfo" project: str = Field( xml_attribute=True, ) package: str = Field( xml_attribute=True, ) lsrcmd5: Optional[str] = Field( xml_attribute=True, ) xsrcmd5: Optional[str] = Field( xml_attribute=True, ) baserev: Optional[str] = Field( xml_attribute=True, ) rev: Optional[str] = Field( xml_attribute=True, ) srcmd5: Optional[str] = Field( xml_attribute=True, ) error: Optional[str] = Field( xml_attribute=True, ) missingok: Optional[bool] = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/package.py000066400000000000000000000123221475337502500165400ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .flag import Flag from .group_role import GroupRole from .package_devel import PackageDevel from .package_revision import PackageRevision from .person_role import PersonRole from .simple_flag import SimpleFlag from .status import Status class Package(XmlModel): XML_TAG = "package" name: str = Field( xml_attribute=True, ) project: str = Field( xml_attribute=True, ) title: Optional[str] = Field() description: Optional[str] = Field() devel: Optional[PackageDevel] = Field() releasename: Optional[str] = Field() person_list: Optional[List[PersonRole]] = Field( xml_name="person", ) group_list: Optional[List[GroupRole]] 
= Field( xml_name="group", ) lock: Optional[SimpleFlag] = Field( xml_wrapped=True, ) build_list: Optional[List[Flag]] = Field( xml_name="build", xml_wrapped=True, ) publish_list: Optional[List[Flag]] = Field( xml_name="publish", xml_wrapped=True, ) useforbuild_list: Optional[List[Flag]] = Field( xml_name="useforbuild", xml_wrapped=True, ) debuginfo_list: Optional[List[Flag]] = Field( xml_name="debuginfo", xml_wrapped=True, ) binarydownload: Optional[SimpleFlag] = Field() sourceaccess: Optional[SimpleFlag] = Field() url: Optional[str] = Field() scmsync: Optional[str] = Field() bcntsynctag: Optional[str] = Field() @classmethod def from_api(cls, apiurl, project, package, *, rev=None): # ``rev`` is metadata revision, not revision of the source code url_path = ["source", project, package, "_meta"] url_query = { "rev": rev, } response = cls.xml_request("GET", apiurl, url_path, url_query) return cls.from_file(response, apiurl=apiurl) def to_api(self, apiurl, *, project=None, package=None): project = project or self.project package = package or self.name url_path = ["source", project, package, "_meta"] url_query = {} response = self.xml_request("PUT", apiurl, url_path, url_query, data=self.to_string()) return Status.from_file(response, apiurl=apiurl) @classmethod def cmd_fork( cls, apiurl: str, project: str, package: str, *, scmsync: str, ): """ POST /source/{project}/{package}?cmd=fork&scmsync={scmsync} For a package managed in Git. :param apiurl: Full apiurl or its alias. :param project: Project name. :param package: Package name. :param scmsync: Checkout Git URL. Example: https://src.example.com/owner/repo#branch """ url_path = ["source", project, package] url_query = { "cmd": "fork", "scmsync": scmsync, } response = cls.xml_request("POST", apiurl, url_path, url_query) return Status.from_file(response, apiurl=apiurl) @classmethod def cmd_release( cls, apiurl: str, project: str, package: str, *, repository: Optional[str] = None, arch: Optional[str] = None, target_project: Optional[str] = None, target_repository: Optional[str] = None, setrelease: Optional[str] = None, nodelay: Optional[bool] = None, ): """ POST /source/{project}/{package}?cmd=release Release sources and binaries of a specified package. :param apiurl: Full apiurl or its alias. :param project: Project name. :param package: Package name. :param repository: Limit the release to the given repository. :param arch: Limit the release to the given architecture. :param target_project: The name of the release target project. :param target_repository: The name of the release target repository. :param setrelease: Tag the release with the given value. :param nodelay: Do not delay the relase. If not set, the release will be delayed to be done later. 
""" url_path = ["source", project, package] url_query = { "cmd": "release", "repository": repository, "arch": arch, "target_project": target_project, "target_repository": target_repository, "setrelease": setrelease, "nodelay": nodelay, } response = cls.xml_request("POST", apiurl, url_path, url_query) return Status.from_file(response, apiurl=apiurl) @classmethod def get_revision_list(cls, apiurl: str, project: str, package: str, deleted: Optional[bool] = None, meta: Optional[bool] = None): from ..util.xml import xml_parse url_path = ["source", project, package, "_history"] url_query = { "meta": meta, "deleted": deleted, } response = cls.xml_request("GET", apiurl, url_path, url_query) root = xml_parse(response).getroot() assert root.tag == "revisionlist" result = [] for node in root: result.append(PackageRevision.from_xml(node, apiurl=apiurl)) return result osc-1.12.1/osc/obs_api/package_devel.py000066400000000000000000000004201475337502500177130ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class PackageDevel(XmlModel): XML_TAG = "devel" project: str = Field( xml_attribute=True, ) package: Optional[str] = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/package_revision.py000066400000000000000000000011371475337502500204600ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class PackageRevision(XmlModel): XML_TAG = "revision" rev: int = Field( xml_attribute=True, ) vrev: Optional[str] = Field( xml_attribute=True, ) srcmd5: str = Field( ) version: str = Field( ) time: int = Field( ) user: str = Field( ) comment: Optional[str] = Field( ) requestid: Optional[int] = Field( ) def get_time_str(self): import time return time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(self.time)) osc-1.12.1/osc/obs_api/package_sources.py000066400000000000000000000033661475337502500203130ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .linkinfo import Linkinfo from .package_sources_file import PackageSourcesFile from .serviceinfo import Serviceinfo class PackageSources(XmlModel): XML_TAG = "directory" name: str = Field( xml_attribute=True, ) rev: str = Field( xml_attribute=True, ) vrev: Optional[str] = Field( xml_attribute=True, ) srcmd5: str = Field( xml_attribute=True, ) linkinfo: Optional[Linkinfo] = Field( ) serviceinfo: Optional[Serviceinfo] = Field( ) file_list: Optional[List[PackageSourcesFile]] = Field( xml_name="entry", ) @classmethod def from_api( cls, apiurl: str, project: str, package: str, *, deleted: Optional[bool] = None, expand: Optional[bool] = None, meta: Optional[bool] = None, rev: Optional[str] = None, ): """ :param deleted: Set to ``True`` to list source files of a deleted package. Throws 400: Bad Request if such package exists. :param expand: Expand links. :param meta: Set to ``True`` to list metadata file (``_meta``) instead of the sources. :param rev: Show sources of the specified revision. 
""" from ..core import revision_is_empty if revision_is_empty(rev): rev = None url_path = ["source", project, package] url_query = { "deleted": deleted, "expand": expand, "meta": meta, "rev": rev, } response = cls.xml_request("GET", apiurl, url_path, url_query) return cls.from_file(response, apiurl=apiurl) osc-1.12.1/osc/obs_api/package_sources_file.py000066400000000000000000000010631475337502500213020ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class PackageSourcesFile(XmlModel): XML_TAG = "entry" name: str = Field( xml_attribute=True, ) md5: str = Field( xml_attribute=True, ) mtime: int = Field( xml_attribute=True, ) size: int = Field( xml_attribute=True, ) skipped: Optional[bool] = Field( xml_attribute=True, ) def _get_cmp_data(self): return (self.name, self.mtime, self.size, self.md5, self.skipped or False) osc-1.12.1/osc/obs_api/person.py000066400000000000000000000061321475337502500164550ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .enums import BoolString from .person_owner import PersonOwner from .person_watchlist import PersonWatchlist from .status import Status class Person(XmlModel): XML_TAG = "person" login: str = Field( ) email: Optional[str] = Field( ) realname: Optional[str] = Field( ) owner: Optional[PersonOwner] = Field( ) state: Optional[str] = Field( ) globalrole_list: Optional[List[str]] = Field( xml_name="globalrole", ) watchlist: Optional[PersonWatchlist] = Field( ) ignore_auth_services: Optional[BoolString] = Field( ) def to_human_readable_string(self) -> str: """ Render the object as a human readable string. """ from ..output import KeyValueTable table = KeyValueTable() table.add("Login", self.login, color="bold") table.add("Real name", self.realname) table.add("Email", self.email) table.add("State", self.state) return f"{table}" @classmethod def from_api(cls, apiurl: str, username: str): url_path = ["person", username] url_query = {} response = cls.xml_request("GET", apiurl, url_path, url_query) return cls.from_file(response, apiurl=apiurl) @classmethod def search( cls, apiurl: str, login: Optional[str] = None, email: Optional[str] = None, realname: Optional[str] = None, state: Optional[str] = None, **kwargs, ) -> List["Person"]: from ..util.xml import xml_parse from ..util.xpath import XPathQuery as Q url_path = ["search", "person"] url_query = { "match": Q( login=login, email=email, realname=realname, state=state, **kwargs, ), } response = cls.xml_request("GET", apiurl, url_path, url_query) root = xml_parse(response).getroot() assert root.tag == "collection" result = [] for node in root: result.append(cls.from_xml(node, apiurl=apiurl)) return result @classmethod def cmd_register( cls, apiurl: str, *, login: str, realname: str, email: str, password: str, note: Optional[str] = None, state: Optional[str] = "confirmed", ): person = UnregisteredPerson(login=login, realname=realname, email=email, password=password, note=note, state=state) url_path = ["person"] url_query = { "cmd": "register", } response = cls.xml_request("POST", apiurl, url_path, url_query, data=person.to_string()) return Status.from_file(response, apiurl=apiurl) class UnregisteredPerson(XmlModel): XML_TAG = "unregisteredperson" login: str = Field( ) realname: str = Field( ) email: str = Field( ) password: str = Field( ) note: Optional[str] = Field( ) state: Optional[str] = Field( ) 
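A minimal usage sketch for the Person model defined in osc/obs_api/person.py above. It only calls Person.from_api(), Person.search() and to_human_readable_string() as declared there; the apiurl value and the login/realname strings are placeholder assumptions, not taken from the source.

from osc.obs_api import Person

# full API URL of the build service instance (placeholder assumption)
apiurl = "https://api.opensuse.org"

# fetch a single account by login name ("some_user" is a placeholder)
person = Person.from_api(apiurl, "some_user")
print(person.to_human_readable_string())

# search confirmed accounts by real name; Person.search() returns a list of Person objects
for match in Person.search(apiurl, realname="Jane Doe", state="confirmed"):
    print(match.login, match.email)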
osc-1.12.1/osc/obs_api/person_owner.py000066400000000000000000000003071475337502500176650ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class PersonOwner(XmlModel): XML_TAG = "owner" userid: str = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/person_role.py000066400000000000000000000004451475337502500174770ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .enums import LocalRole class PersonRole(XmlModel): XML_TAG = "person" userid: str = Field( xml_attribute=True, ) role: LocalRole = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/person_watchlist.py000066400000000000000000000011711475337502500205350ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .person_watchlist_package import PersonWatchlistPackage from .person_watchlist_project import PersonWatchlistProject from .person_watchlist_request import PersonWatchlistRequest class PersonWatchlist(XmlModel): XML_TAG = "watchlist" project_list: Optional[List[PersonWatchlistProject]] = Field( xml_name="project", ) package_list: Optional[List[PersonWatchlistPackage]] = Field( xml_name="package", ) request_list: Optional[List[PersonWatchlistRequest]] = Field( xml_name="request", ) osc-1.12.1/osc/obs_api/person_watchlist_package.py000066400000000000000000000004171475337502500222120ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class PersonWatchlistPackage(XmlModel): XML_TAG = "package" name: str = Field( xml_attribute=True, ) project: str = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/person_watchlist_project.py000066400000000000000000000003221475337502500222600ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class PersonWatchlistProject(XmlModel): XML_TAG = "project" name: str = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/person_watchlist_request.py000066400000000000000000000003241475337502500223040ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class PersonWatchlistRequest(XmlModel): XML_TAG = "request" number: str = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/project.py000066400000000000000000000104711475337502500166160ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .flag import Flag from .group_role import GroupRole from .person_role import PersonRole from .project_devel import ProjectDevel from .project_link import ProjectLink from .project_maintenance_maintains import ProjectMaintenanceMaintains from .repository import Repository from .simple_flag import SimpleFlag from .status import Status class Project(XmlModel): XML_TAG = "project" name: str = Field( xml_attribute=True, ) class KindEnum(str, Enum): STANDARD = "standard" MAINTENANCE = "maintenance" MAINTENANCE_INCIDENT = "maintenance_incident" MAINTENANCE_RELEASE = "maintenance_release" kind: Optional[KindEnum] = Field( xml_attribute=True, ) title: str = Field( ) description: str = Field( ) url: Optional[str] = Field( ) link_list: Optional[List[ProjectLink]] = Field( xml_name="link", ) mountproject: Optional[str] = Field( ) remoteurl: Optional[str] = Field( ) scmsync: Optional[str] = Field( ) devel: Optional[ProjectDevel] = Field( ) person_list: Optional[List[PersonRole]] = Field( 
xml_name="person", ) group_list: Optional[List[GroupRole]] = Field( xml_name="group", ) lock: Optional[SimpleFlag] = Field( xml_wrapped=True, ) build_list: Optional[List[Flag]] = Field( xml_name="build", xml_wrapped=True, ) publish_list: Optional[List[Flag]] = Field( xml_name="publish", xml_wrapped=True, ) useforbuild_list: Optional[List[Flag]] = Field( xml_name="useforbuild", xml_wrapped=True, ) debuginfo_list: Optional[List[Flag]] = Field( xml_name="debuginfo", xml_wrapped=True, ) binarydownload_list: Optional[List[Flag]] = Field( xml_name="binarydownload", xml_wrapped=True, ) sourceaccess: Optional[SimpleFlag] = Field( xml_wrapped=True, ) access: Optional[SimpleFlag] = Field( xml_wrapped=True, ) maintenance_list: Optional[List[ProjectMaintenanceMaintains]] = Field( xml_name="maintenance", xml_wrapped=True, ) repository_list: Optional[List[Repository]] = Field( xml_name="repository", ) @classmethod def from_api(cls, apiurl, project): url_path = ["source", project, "_meta"] url_query = {} response = cls.xml_request("GET", apiurl, url_path, url_query) return cls.from_file(response, apiurl=apiurl) def to_api(self, apiurl, *, project=None): project = project or self.name url_path = ["source", project, "_meta"] url_query = {} response = self.xml_request("PUT", apiurl, url_path, url_query, data=self.to_string()) return Status.from_file(response, apiurl=apiurl) def resolve_repository_flags(self, package_obj=None): """ Resolve the `build`, `debuginfo`, `publish` and `useforbuild` flags and return their values for each repository and build arch. :returns: {(repo_name, repo_buildarch): {flag_name: bool} for all available repos """ result = {} flag_names = ("build", "debuginfo", "publish", "useforbuild") # populate the result matrix: {(repo, arch): {"build": None, "debuginfo": None, "publish": None, "useforbuild": None}} for repo_obj in self.repository_list or []: for arch in repo_obj.arch_list or []: result[(repo_obj.name, arch)] = dict([(flag_name, None) for flag_name in flag_names]) for flag_name in flag_names: flag_objects = getattr(self, f"{flag_name}_list") or [] if package_obj is not None: flag_objects += getattr(package_obj, f"{flag_name}_list") or [] for flag_obj in flag_objects: # look up entries matching the current flag and change their values according to the flag's tag for (entry_repo, entry_arch), entry_data in result.items(): match = flag_obj.repository in (entry_repo, None) and flag_obj.arch in (entry_arch, None) if match: entry_data[flag_name] = True if flag_obj.flag == "enable" else False return result osc-1.12.1/osc/obs_api/project_devel.py000066400000000000000000000003111475337502500177650ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class ProjectDevel(XmlModel): XML_TAG = "devel" project: str = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/project_link.py000066400000000000000000000005641475337502500176350ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class ProjectLink(XmlModel): XML_TAG = "link" project: str = Field( xml_attribute=True, ) class VrevmodeEnum(str, Enum): UNEXTEND = "unextend" EXTEND = "extend" vrevmode: Optional[VrevmodeEnum] = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/project_maintenance_maintains.py000066400000000000000000000003341475337502500232200ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class ProjectMaintenanceMaintains(XmlModel): XML_TAG 
= "maintains" project: str = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/repository.py000066400000000000000000000023721475337502500173700ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .enums import BlockModes from .enums import BuildArch from .enums import LinkedbuildModes from .enums import RebuildModes from .repository_download import RepositoryDownload from .repository_hostsystem import RepositoryHostsystem from .repository_path import RepositoryPath from .repository_releasetarget import RepositoryReleasetarget class Repository(XmlModel): XML_TAG = "repository" name: str = Field( xml_attribute=True, ) rebuild: Optional[RebuildModes] = Field( xml_attribute=True, ) block: Optional[BlockModes] = Field( xml_attribute=True, ) linkedbuild: Optional[LinkedbuildModes] = Field( xml_attribute=True, ) download_list: Optional[List[RepositoryDownload]] = Field( xml_name="download", ) releasetarget_list: Optional[List[RepositoryReleasetarget]] = Field( xml_name="releasetarget", ) hostsystem_list: Optional[List[RepositoryHostsystem]] = Field( xml_name="hostsystem", ) path_list: Optional[List[RepositoryPath]] = Field( xml_name="path", ) arch_list: Optional[List[BuildArch]] = Field( xml_name="arch", ) osc-1.12.1/osc/obs_api/repository_download.py000066400000000000000000000013361475337502500212560ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .repository_download_master import RepositoryDownloadMaster class RepositoryDownload(XmlModel): XML_TAG = "download" arch: str = Field( xml_attribute=True, ) url: str = Field( xml_attribute=True, ) class RepotypeEnum(str, Enum): RPMMD = "rpmmd" SUSETAGS = "susetags" DEB = "deb" ARCH = "arch" MDK = "mdk" REGISTRY = "registry" repotype: RepotypeEnum = Field( xml_attribute=True, ) archfilter: Optional[str] = Field( ) master: Optional[RepositoryDownloadMaster] = Field( ) pubkey: Optional[str] = Field( ) osc-1.12.1/osc/obs_api/repository_download_master.py000066400000000000000000000004401475337502500226240ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RepositoryDownloadMaster(XmlModel): XML_TAG = "master" url: str = Field( xml_attribute=True, ) sslfingerprint: Optional[str] = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/repository_hostsystem.py000066400000000000000000000004261475337502500216700ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RepositoryHostsystem(XmlModel): XML_TAG = "hostsystem" repository: str = Field( xml_attribute=True, ) project: str = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/repository_path.py000066400000000000000000000004121475337502500203750ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RepositoryPath(XmlModel): XML_TAG = "path" project: str = Field( xml_attribute=True, ) repository: str = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/repository_releasetarget.py000066400000000000000000000006231475337502500222740ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .enums import ReleaseTriggers class RepositoryReleasetarget(XmlModel): XML_TAG = "releasetarget" project: str = Field( xml_attribute=True, ) repository: str = Field( xml_attribute=True, ) trigger: Optional[ReleaseTriggers] = Field( 
xml_attribute=True, ) osc-1.12.1/osc/obs_api/request.py000066400000000000000000000104621475337502500166400ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .enums import ObsRatings from .request_action import RequestAction from .request_history import RequestHistory from .request_review import RequestReview from .request_state import RequestState class Request(XmlModel): XML_TAG = "request" id: Optional[str] = Field( xml_attribute=True, ) actions: Optional[int] = Field( xml_attribute=True, ) creator: Optional[str] = Field( xml_attribute=True, ) action_list: List[RequestAction] = Field( xml_name="action", ) state: Optional[RequestState] = Field( ) description: Optional[str] = Field( ) priority: Optional[ObsRatings] = Field( ) review_list: Optional[List[RequestReview]] = Field( xml_name="review", ) history_list: Optional[List[RequestHistory]] = Field( xml_name="history", ) title: Optional[str] = Field( ) accept_at: Optional[str] = Field( ) @classmethod def from_api( cls, apiurl: str, request_id: int, *, with_history: Optional[bool] = None, with_full_history: Optional[bool] = None ) -> "Request": """ Return the specified request. :param request_id: Id of the request. :param withhistory: Include the request history in the results. :param withfullhistory: Includes both, request and review history in the results. """ url_path = ["request", request_id] url_query = { "withhistory": with_history, "withfullhistory": with_full_history, } response = cls.xml_request("GET", apiurl, url_path, url_query) return cls.from_file(response, apiurl=apiurl) @classmethod def cmd_diff( cls, apiurl: str, request_id: int, *, with_issues: Optional[bool] = None, with_description_issues: Optional[bool] = None, diff_to_superseded: Optional[int] = None ) -> "Request": """ Return the specified request including a diff of all packages in the request. :param request_id: Id of the request. :param with_issues: Include parsed issues from referenced sources in the change files. :param with_description_issues: Include parsed issues from request description. :param diff_to_superseded: Diff relatively to the given superseded request. """ url_path = ["request", str(request_id)] url_query = { "cmd": "diff", "view": "xml", "withissues": with_issues, "withdescriptionissues": with_description_issues, "diff_to_superseded": diff_to_superseded, } response = cls.xml_request("POST", apiurl, url_path, url_query) return cls.from_file(response, apiurl=apiurl) def get_issues(self): """ Aggregate issues from action/sourcediff into a single list. The list may contain duplicates. To get any issues returned, it is crucial to load the request with the issues by calling ``cmd_diff()`` with appropriate arguments first. """ result = [] for action in self.action_list or []: if action.sourcediff is None: continue for issue in action.sourcediff.issue_list or []: result.append(issue) return result def cmd_create(self, apiurl: str, *, add_revision: Optional[bool] = None, enforce_branching: Optional[bool] = None, ignore_build_state: Optional[bool] = None, ignore_delegate: Optional[bool] = None, ): """ :param add_revision: Ask the server to add revisions of the current sources to the request. :param ignore_build_state: Skip the build state check. :param ignore_delegate: Enforce a new package instance in a project which has OBS:DelegateRequestTarget set. 
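        A rough usage sketch (the API URL and project/package names are placeholders;
        ``RequestActionSource`` and ``RequestActionTarget`` live in sibling modules and
        are assumed to accept keyword construction like the other models here)::

            req = Request(
                action_list=[
                    RequestAction(
                        type="submit",
                        source=RequestActionSource(project="home:user:branches:foo", package="foo"),
                        target=RequestActionTarget(project="SomeProject", package="foo"),
                    )
                ],
                description="Submit foo",
            )
            created = req.cmd_create("https://api.example.org", add_revision=True)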
""" url_path = ["request"] url_query = { "cmd": "create", "addrevision": add_revision, "ignore_delegate": ignore_delegate, } response = self.xml_request("POST", apiurl, url_path, url_query, data=self.to_string()) return Request.from_file(response, apiurl=apiurl) osc-1.12.1/osc/obs_api/request_action.py000066400000000000000000000216561475337502500202040ustar00rootroot00000000000000import urllib.error from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .package import Package from .package_sources import PackageSources from .request_action_acceptinfo import RequestActionAcceptinfo from .request_action_group import RequestActionGroup from .request_action_grouped import RequestActionGrouped from .request_action_options import RequestActionOptions from .request_action_person import RequestActionPerson from .request_action_source import RequestActionSource from .request_action_target import RequestActionTarget from .request_sourcediff import RequestSourcediff class RequestAction(XmlModel): XML_TAG = "action" class TypeEnum(str, Enum): SUBMIT = "submit" DELETE = "delete" CHANGE_DEVEL = "change_devel" ADD_ROLE = "add_role" SET_BUGOWNER = "set_bugowner" MAINTENANCE_INCIDENT = "maintenance_incident" MAINTENANCE_RELEASE = "maintenance_release" RELEASE = "release" GROUP = "group" type: TypeEnum = Field( xml_attribute=True, ) source: Optional[RequestActionSource] = Field( ) target: Optional[RequestActionTarget] = Field( ) person: Optional[RequestActionPerson] = Field( ) group: Optional[RequestActionGroup] = Field( ) grouped_list: Optional[List[RequestActionGrouped]] = Field( xml_name="grouped", ) options: Optional[RequestActionOptions] = Field( ) sourcediff: Optional[RequestSourcediff] = Field( ) acceptinfo: Optional[RequestActionAcceptinfo] = Field( ) def __init__(self, **kwargs): super().__init__(**kwargs) self._allow_new_attributes = True # source and target always come from ``self._apiurl`` while devel and factory projects may live elsewhere self._devel_apiurl = self._apiurl self._factory_apiurl = self._apiurl self._factory_project = "openSUSE:Factory" self._props = {} self._allow_new_attributes = False def _get_package(self, package_type): key = f"{package_type}_package" if key not in self._props: func = getattr(self, f"_get_{package_type}_apiurl_project_package") apiurl, project, package = func() if apiurl is None: self._props[key] = None else: try: self._props[key] = Package.from_api(apiurl, project, package) except urllib.error.HTTPError as e: if e.code != 404: raise self._props[key] = None return self._props[key] def _get_package_sources(self, package_type, *, rev=None): key = f"{package_type}_package_sources" if key not in self._props: func = getattr(self, f"_get_{package_type}_apiurl_project_package") apiurl, project, package = func() if apiurl is None: self._props[key] = None else: try: self._props[key] = PackageSources.from_api(apiurl, project, package, rev=rev) except urllib.error.HTTPError as e: if e.code != 404: raise self._props[key] = None return self._props[key] def _get_source_apiurl_project_package(self): return self._apiurl, self.source.project, self.source.package @property def source_package(self) -> Optional[Package]: """ Return a ``Package`` object that encapsulates metadata of the source package. 
""" return self._get_package("source") @property def source_package_sources(self) -> Optional[PackageSources]: """ Return a ``PackageSources`` object that contains information about the ``source.rev`` revision of the source package sources in OBS SCM. """ if self.source is None: return None return self._get_package_sources("source", rev=self.source.rev) def _get_target_apiurl_project_package(self): if self.target is None: return None, None, None target_project, target_package = self.get_actual_target_project_package() return self._apiurl, target_project, target_package @property def target_package(self) -> Optional[Package]: """ Return a ``Package`` object that encapsulates metadata of the target package. """ return self._get_package("target") @property def target_package_sources(self) -> Optional[PackageSources]: """ Return a ``PackageSources`` object that contains information about the current revision of the target package sources in OBS SCM. """ return self._get_package_sources("target") def _get_factory_apiurl_project_package(self): if self.target is None: # a new package was submitted, it doesn't exist on target; let's read the package name from the source target_project, target_package = None, self.source.package else: target_project, target_package = self.get_actual_target_project_package() if (self._apiurl, target_project) == (self._factory_apiurl, self._factory_project): # factory package equals the target package return None, None, None return self._factory_apiurl, self._factory_project, target_package @property def factory_package(self) -> Optional[Package]: """ Return a ``Package`` object that encapsulates metadata of the package in the factory project. The name of the package equals the target package name. """ return self._get_package("factory") @property def factory_package_sources(self) -> Optional[PackageSources]: """ Return a ``PackageSources`` object that contains information about the current revision of the factory package sources in OBS SCM. """ return self._get_package_sources("factory") def _get_devel_apiurl_project_package(self): if self.factory_package is None: return None, None, None devel = self.factory_package.devel if devel is None: return None, None, None return ( self._devel_apiurl, devel.project, devel.package or self.factory_package.name, ) @property def devel_package(self) -> Optional[Package]: """ Return a ``Package`` object that encapsulates metadata of the package in the devel project. The devel project name and package name come from ``self.factory_package.devel``. If the devel package name is not set, target package name is used. """ return self._get_package("devel") @property def devel_package_sources(self) -> Optional[PackageSources]: """ Return a ``PackageSources`` object that contains information about the current revision of the devel package sources in OBS SCM. """ return self._get_package_sources("devel") def get_actual_target_project_package(self) -> Tuple[str, str]: """ Return the target project and package names because maintenance incidents require special handling. The target project for maintenance incidents is virtual and cannot be queried. The actual target project is specified in target's ``releaseproject`` field. Also the target package for maintenance incidents is not set explicitly. It is extracted from ``releasename`` field from the source metadata. If ``releasename`` is not defined, source package name is used. 
""" if self.type == "maintenance_incident": # dmach's note on security: # The ``releaseproject`` is baked into the target information in the request and that's perfectly ok. # The ``releasename`` is part of the source package metadata and *may change* after the request is created. # After consulting this with OBS developers, I believe this doesn't represent any security issue # because the project is fixed and tampering with ``releasename`` might only lead to inconsistent naming, # the package would still end up it the same project. # target.releaseproject is always set for a maintenance_incident assert self.target assert self.target.releaseproject project = self.target.releaseproject # the target package is not specified # we need to extract it from source package's metadata or use source package name as a fallback assert self.source_package if self.source_package.releasename: package = self.source_package.releasename.split(".")[0] else: package = self.source_package.name return project, package assert self.target assert self.target.project assert self.target.package return self.target.project, self.target.package osc-1.12.1/osc/obs_api/request_action_acceptinfo.py000066400000000000000000000011551475337502500223670ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestActionAcceptinfo(XmlModel): XML_TAG = "acceptinfo" rev: str = Field( xml_attribute=True, ) srcmd5: str = Field( xml_attribute=True, ) osrcmd5: str = Field( xml_attribute=True, ) oproject: Optional[str] = Field( xml_attribute=True, ) opackage: Optional[str] = Field( xml_attribute=True, ) xsrcmd5: Optional[str] = Field( xml_attribute=True, ) oxsrcmd5: Optional[str] = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/request_action_group.py000066400000000000000000000004201475337502500214020ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestActionGroup(XmlModel): XML_TAG = "group" name: str = Field( xml_attribute=True, ) role: Optional[str] = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/request_action_grouped.py000066400000000000000000000003161475337502500217170ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestActionGrouped(XmlModel): XML_TAG = "grouped" id: str = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/request_action_options.py000066400000000000000000000011701475337502500217440ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestActionOptions(XmlModel): XML_TAG = "options" class SourceupdateEnum(str, Enum): UPDATE = "update" NOUPDATE = "noupdate" CLEANUP = "cleanup" sourceupdate: Optional[SourceupdateEnum] = Field( ) class UpdatelinkEnum(str, Enum): TRUE = "true" FALSE = "false" updatelink: Optional[UpdatelinkEnum] = Field( ) class MakeoriginolderEnum(str, Enum): TRUE = "true" FALSE = "false" makeoriginolder: Optional[MakeoriginolderEnum] = Field( ) osc-1.12.1/osc/obs_api/request_action_person.py000066400000000000000000000004221475337502500215560ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestActionPerson(XmlModel): XML_TAG = "person" name: str = Field( xml_attribute=True, ) role: Optional[str] = Field( xml_attribute=True, ) 
osc-1.12.1/osc/obs_api/request_action_source.py000066400000000000000000000006451475337502500215570ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestActionSource(XmlModel): XML_TAG = "source" project: str = Field( xml_attribute=True, ) package: Optional[str] = Field( xml_attribute=True, ) rev: Optional[str] = Field( xml_attribute=True, ) repository: Optional[str] = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/request_action_target.py000066400000000000000000000006601475337502500215420ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestActionTarget(XmlModel): XML_TAG = "target" project: str = Field( xml_attribute=True, ) package: Optional[str] = Field( xml_attribute=True, ) releaseproject: Optional[str] = Field( xml_attribute=True, ) repository: Optional[str] = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/request_history.py000066400000000000000000000005231475337502500204160ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestHistory(XmlModel): XML_TAG = "history" who: str = Field( xml_attribute=True, ) when: str = Field( xml_attribute=True, ) description: str = Field( ) comment: Optional[str] = Field( ) osc-1.12.1/osc/obs_api/request_review.py000066400000000000000000000025121475337502500202160ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .enums import RequestStates from .request_review_history import RequestReviewHistory class RequestReview(XmlModel): XML_TAG = "review" state: RequestStates = Field( xml_attribute=True, ) created: Optional[str] = Field( xml_attribute=True, ) by_user: Optional[str] = Field( xml_attribute=True, ) by_group: Optional[str] = Field( xml_attribute=True, ) by_project: Optional[str] = Field( xml_attribute=True, ) by_package: Optional[str] = Field( xml_attribute=True, ) who: Optional[str] = Field( xml_attribute=True, ) when: Optional[str] = Field( xml_attribute=True, ) comment: Optional[str] = Field( ) history_list: Optional[List[RequestReviewHistory]] = Field( xml_name="history", ) def get_user_and_type(self): if self.by_user: return (self.by_user, "user") if self.by_group: return (self.by_group, "group") if self.by_package: return (f"{self.by_project}/{self.by_package}", "package") if self.by_project: return (self.by_project, "project") raise RuntimeError("Unable to determine user and its type") osc-1.12.1/osc/obs_api/request_review_history.py000066400000000000000000000005311475337502500217760ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestReviewHistory(XmlModel): XML_TAG = "history" who: str = Field( xml_attribute=True, ) when: str = Field( xml_attribute=True, ) description: str = Field( ) comment: Optional[str] = Field( ) osc-1.12.1/osc/obs_api/request_sourcediff.py000066400000000000000000000015041475337502500210460ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .request_sourcediff_files_file import RequestSourcediffFilesFile from .request_sourcediff_issue import RequestSourcediffIssue from .request_sourcediff_new import RequestSourcediffNew from .request_sourcediff_old import RequestSourcediffOld class RequestSourcediff(XmlModel): XML_TAG = "sourcediff" key: str = Field( xml_attribute=True, ) old: 
Optional[RequestSourcediffOld] = Field( ) new: Optional[RequestSourcediffNew] = Field( ) files_list: List[RequestSourcediffFilesFile] = Field( xml_name="files", xml_wrapped=True, ) issue_list: Optional[List[RequestSourcediffIssue]] = Field( xml_name="issues", xml_wrapped=True, xml_item_name="issue", ) osc-1.12.1/osc/obs_api/request_sourcediff_file_diff.py000066400000000000000000000004141475337502500230340ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestSourcediffFileDiff(XmlModel): XML_TAG = "diff" lines: int = Field( xml_attribute=True, ) text: str = Field( xml_set_text=True, ) osc-1.12.1/osc/obs_api/request_sourcediff_file_new.py000066400000000000000000000005031475337502500227140ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestSourcediffFileNew(XmlModel): XML_TAG = "new" name: str = Field( xml_attribute=True, ) md5: str = Field( xml_attribute=True, ) size: int = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/request_sourcediff_file_old.py000066400000000000000000000005031475337502500227010ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestSourcediffFileOld(XmlModel): XML_TAG = "old" name: str = Field( xml_attribute=True, ) md5: str = Field( xml_attribute=True, ) size: int = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/request_sourcediff_files_file.py000066400000000000000000000011101475337502500232200ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .request_sourcediff_file_diff import RequestSourcediffFileDiff from .request_sourcediff_file_new import RequestSourcediffFileNew from .request_sourcediff_file_old import RequestSourcediffFileOld class RequestSourcediffFilesFile(XmlModel): XML_TAG = "file" state: str = Field( xml_attribute=True, ) old: Optional[RequestSourcediffFileOld] = Field( ) new: Optional[RequestSourcediffFileNew] = Field( ) diff: RequestSourcediffFileDiff = Field( ) osc-1.12.1/osc/obs_api/request_sourcediff_issue.py000066400000000000000000000006741475337502500222650ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestSourcediffIssue(XmlModel): XML_TAG = "issue" state: str = Field( xml_attribute=True, ) tracker: str = Field( xml_attribute=True, ) name: str = Field( xml_attribute=True, ) label: str = Field( xml_attribute=True, ) url: str = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/request_sourcediff_new.py000066400000000000000000000006011475337502500217140ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestSourcediffNew(XmlModel): XML_TAG = "new" project: str = Field( xml_attribute=True, ) package: str = Field( xml_attribute=True, ) rev: str = Field( xml_attribute=True, ) srcmd5: str = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/request_sourcediff_old.py000066400000000000000000000006011475337502500217010ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class RequestSourcediffOld(XmlModel): XML_TAG = "old" project: str = Field( xml_attribute=True, ) package: str = Field( xml_attribute=True, ) rev: str = Field( xml_attribute=True, ) srcmd5: str = Field( xml_attribute=True, ) 
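# Hedged usage sketch appended for illustration only (not part of the original
# module); the API URL and request id are placeholders.
if __name__ == "__main__":
    from osc.obs_api.request import Request

    req = Request.cmd_diff("https://api.example.org", 123456, with_issues=True)
    for action in req.action_list or []:
        if action.sourcediff is None:
            continue
        for changed_file in action.sourcediff.files_list or []:
            entry = changed_file.new or changed_file.old
            print(changed_file.state, entry.name if entry else "?")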
osc-1.12.1/osc/obs_api/request_state.py000066400000000000000000000011771475337502500200430ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .enums import RequestStates class RequestState(XmlModel): XML_TAG = "state" name: RequestStates = Field( xml_attribute=True, ) who: Optional[str] = Field( xml_attribute=True, ) when: Optional[str] = Field( xml_attribute=True, ) created: Optional[str] = Field( xml_attribute=True, ) superseded_by: Optional[int] = Field( xml_attribute=True, ) approver: Optional[str] = Field( xml_attribute=True, ) comment: Optional[str] = Field( ) osc-1.12.1/osc/obs_api/scmsync_obsinfo.py000066400000000000000000000040551475337502500203470ustar00rootroot00000000000000import typing from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class ScmsyncObsinfo(BaseModel): """ Class for handling _scmsync.obsinfo files """ # the fields are defined in obs_scm_bridge in ObsGit.write_obsinfo() # https://github.com/openSUSE/obs-scm-bridge/blob/main/obs_scm_bridge mtime: int = Field() commit: str = Field() url: Optional[str] = Field() revision: Optional[str] = Field() subdir: Optional[str] = Field() projectscmsync: Optional[str] = Field() @classmethod def from_string(cls, data: str) -> "ScmsyncObsinfo": kwargs = {} for line in data.splitlines(): line = line.strip() if not line: continue key, value = line.split(": ", 1) field = cls.__fields__.get(key, None) if field and field.type is int: value = int(value) kwargs[key] = value return cls(**kwargs) @classmethod def from_file(cls, file: Union[str, typing.IO]) -> "ScmsyncObsinfo": if isinstance(file, str): with open(file, "r", encoding="utf-8") as f: return cls.from_string(f.read()) data = file.read() if isinstance(data, bytes): data = data.decode("utf-8") return cls.from_string(data) @classmethod def from_api(cls, apiurl: str, project: str, package: str, *, rev: str) -> "ScmsyncObsinfo": import urllib.error from .. 
import oscerr from ..connection import http_request from ..core import makeurl url_path = ["source", project, package, "_scmsync.obsinfo"] url_query = {"rev": rev} url = makeurl(apiurl, url_path, url_query) try: response = http_request("GET", url) except urllib.error.HTTPError as e: if e.status == 404: raise oscerr.NotFoundAPIError(f"File '_scmsync.obsinfo' was not found in {project}/{package}, rev={rev}") raise return cls.from_file(response) osc-1.12.1/osc/obs_api/serviceinfo.py000066400000000000000000000006501475337502500174620ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class Serviceinfo(XmlModel): XML_TAG = "serviceinfo" xsrcmd5: Optional[str] = Field( xml_attribute=True, ) lsrcmd5: Optional[str] = Field( xml_attribute=True, ) error: Optional[str] = Field( xml_attribute=True, ) code: Optional[str] = Field( xml_attribute=True, ) osc-1.12.1/osc/obs_api/simple_flag.py000066400000000000000000000013351475337502500174310ustar00rootroot00000000000000from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from xml.etree import ElementTree as ET class SimpleFlag(XmlModel): XML_TAG = None def __init__(self, flag, **kwargs): super().__init__(flag=flag, **kwargs) class SimpleFlagChoices(Enum): ENABLE = "enable" DISABLE = "disable" flag: SimpleFlagChoices = Field( xml_set_tag=True, ) def __eq__(self, other): if hasattr(other, "flag"): return self.flag == other.flag # allow comparing with a string return self.flag == other @classmethod def from_xml(cls, root: ET.Element, *, apiurl: Optional[str] = None): return cls(flag=root[0].tag) osc-1.12.1/osc/obs_api/status.py000066400000000000000000000021501475337502500164660ustar00rootroot00000000000000import textwrap from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .status_data import StatusData class Status(XmlModel): XML_TAG = "status" code: str = Field( xml_attribute=True, description=textwrap.dedent( """ Status code returned by the server. """ ), ) summary: Optional[str] = Field( description=textwrap.dedent( """ Human readable summary. """ ), ) details: Optional[str] = Field( description=textwrap.dedent( """ Detailed, human readable information. """ ), ) data_list: Optional[List[StatusData]] = Field( xml_name="data", description=textwrap.dedent( """ Additional machine readable data. """ ), ) @property def data(self): result = {} for entry in self.data_list or []: key = entry.name value = entry.value result[key] = value return result osc-1.12.1/osc/obs_api/status_data.py000066400000000000000000000013011475337502500174540ustar00rootroot00000000000000import textwrap from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import class StatusData(XmlModel): XML_TAG = "data" class NameEnum(str, Enum): SOURCEPROJECT = "sourceproject" SOURCEPACKAGE = "sourcepackage" TARGETPROJECT = "targetproject" TARGETPACKAGE = "targetpackage" TOKEN = "token" ID = "id" name: NameEnum = Field( xml_attribute=True, description=textwrap.dedent( """ Key. """ ), ) value: str = Field( xml_set_text=True, description=textwrap.dedent( """ Value. """ ), ) osc-1.12.1/osc/obs_api/token.py000066400000000000000000000133231475337502500162670ustar00rootroot00000000000000import textwrap from ..util.models import * # pylint: disable=wildcard-import,unused-wildcard-import from .status import Status class Token(XmlModel): XML_TAG = "entry" id: int = Field( xml_attribute=True, description=textwrap.dedent( """ The unique id of this token. 
""" ), ) string: str = Field( xml_attribute=True, description=textwrap.dedent( """ The token secret. This string can be used instead of the password to authenticate the user or to trigger service runs via the `POST /trigger/runservice` route. """ ), ) description: Optional[str] = Field( xml_attribute=True, description=textwrap.dedent( """ This attribute can be used to identify a token from the list of tokens of a user. """ ), ) project: Optional[str] = Field( xml_attribute=True, description=textwrap.dedent( """ If this token is bound to a specific package, then the packages' project is available in this attribute. """ ), ) package: Optional[str] = Field( xml_attribute=True, description=textwrap.dedent( """ The package name to which this token is bound, if it has been created for a specific package. Otherwise this attribute and the project attribute are omitted. """ ), ) class Kind(str, Enum): RSS = "rss" REBUILD = "rebuild" RELEASE = "release" RUNSERVICE = "runservice" WIPE = "wipe" WORKFLOW = "workflow" kind: Kind = Field( xml_attribute=True, description=textwrap.dedent( """ This attribute specifies which actions can be performed via this token. - rss: used to retrieve the notification RSS feed - rebuild: trigger rebuilds of packages - release: trigger project releases - runservice: run a service via the POST /trigger/runservice route - wipe: trigger wipe of binary artifacts - workflow: trigger SCM/CI workflows, see https://openbuildservice.org/help/manuals/obs-user-guide/cha.obs.scm_ci_workflow_integration.html """ ), ) triggered_at: str = Field( xml_attribute=True, description=textwrap.dedent( """ The date and time a token got triggered the last time. """ ), ) def to_human_readable_string(self) -> str: """ Render the object as a human readable string. 
""" from ..output import KeyValueTable table = KeyValueTable() table.add("ID", str(self.id)) table.add("String", self.string, color="bold") table.add("Operation", self.kind) table.add("Description", self.description) table.add("Project", self.project) table.add("Package", self.package) table.add("Triggered at", self.triggered_at) return f"{table}" @classmethod def do_list(cls, apiurl: str, user: str): from ..util.xml import xml_parse url_path = ["person", user, "token"] url_query = {} response = cls.xml_request("GET", apiurl, url_path, url_query) root = xml_parse(response).getroot() assert root.tag == "directory" result = [] for node in root: result.append(cls.from_xml(node, apiurl=apiurl)) return result @classmethod def cmd_create( cls, apiurl: str, user: str, *, operation: Optional[str] = None, project: Optional[str] = None, package: Optional[str] = None, scm_token: Optional[str] = None, ): if operation == "workflow" and not scm_token: raise ValueError('``operation`` = "workflow" requires ``scm_token``') url_path = ["person", user, "token"] url_query = { "cmd": "create", "operation": operation, "project": project, "package": package, "scm_token": scm_token, } response = cls.xml_request("POST", apiurl, url_path, url_query) return Status.from_file(response, apiurl=apiurl) @classmethod def do_delete(cls, apiurl: str, user: str, token: str): url_path = ["person", user, "token", token] url_query = {} response = cls.xml_request("DELETE", apiurl, url_path, url_query) return Status.from_file(response, apiurl=apiurl) @classmethod def do_trigger( cls, apiurl: str, token: str, *, operation: Optional[str] = None, project: Optional[str] = None, package: Optional[str] = None, repo: Optional[str] = None, arch: Optional[str] = None, target_project: Optional[str] = None, target_repo: Optional[str] = None, set_release: Optional[str] = None, ): if operation: url_path = ["trigger", operation] else: url_path = ["trigger"] url_query = { "project": project, "package": package, "repository": repo, "architecture": arch, "targetproject": target_project, "targetrepository": target_repo, "setrelease": set_release, } headers = { "Content-Type": "application/octet-stream", "Authorization": f"Token {token}", } response = cls.xml_request("POST", apiurl, url_path, url_query, headers=headers) return Status.from_file(response, apiurl=apiurl) osc-1.12.1/osc/obs_scm/000077500000000000000000000000001475337502500146045ustar00rootroot00000000000000osc-1.12.1/osc/obs_scm/__init__.py000066400000000000000000000002561475337502500167200ustar00rootroot00000000000000from .file import File from .linkinfo import Linkinfo from .package import Package from .project import Project from .serviceinfo import Serviceinfo from .store import Store osc-1.12.1/osc/obs_scm/file.py000066400000000000000000000032561475337502500161030ustar00rootroot00000000000000from functools import total_ordering from ..util.xml import ET @total_ordering class File: """represent a file, including its metadata""" def __init__(self, name, md5, size, mtime, skipped=False): self.name = name self.md5 = md5 self.size = size self.mtime = mtime self.skipped = skipped def __repr__(self): return self.name def __str__(self): return self.name def __eq__(self, other): if isinstance(other, str): return self.name == other self_data = (self.name, self.md5, self.size, self.mtime, self.skipped) other_data = (other.name, other.md5, other.size, other.mtime, other.skipped) return self_data == other_data def __lt__(self, other): self_data = (self.name, self.md5, self.size, self.mtime, 
self.skipped) other_data = (other.name, other.md5, other.size, other.mtime, other.skipped) return self_data < other_data @classmethod def from_xml_node(cls, node): assert node.tag == "entry" kwargs = { "name": node.get("name"), "md5": node.get("md5"), "size": int(node.get("size")), "mtime": int(node.get("mtime")), "skipped": "skipped" in node.attrib, } return cls(**kwargs) def to_xml_node(self, parent_node): attributes = { "name": self.name, "md5": self.md5, "size": str(int(self.size)), "mtime": str(int(self.mtime)), } if self.skipped: attributes["skipped"] = "true" new_node = ET.SubElement(parent_node, "entry", attributes) return new_node osc-1.12.1/osc/obs_scm/linkinfo.py000066400000000000000000000045551475337502500170000ustar00rootroot00000000000000class Linkinfo: """linkinfo metadata (which is part of the xml representing a directory) """ def __init__(self): """creates an empty linkinfo instance""" self.project = None self.package = None self.xsrcmd5 = None self.lsrcmd5 = None self.srcmd5 = None self.error = None self.rev = None self.baserev = None def read(self, linkinfo_node): """read in the linkinfo metadata from the ```` element passed as elementtree node. If the passed element is ``None``, the method does nothing. """ if linkinfo_node is None: return self.project = linkinfo_node.get('project') self.package = linkinfo_node.get('package') self.xsrcmd5 = linkinfo_node.get('xsrcmd5') self.lsrcmd5 = linkinfo_node.get('lsrcmd5') self.srcmd5 = linkinfo_node.get('srcmd5') self.error = linkinfo_node.get('error') self.rev = linkinfo_node.get('rev') self.baserev = linkinfo_node.get('baserev') def islink(self): """:return: ``True`` if the linkinfo is not empty, otherwise ``False``""" if self.xsrcmd5 or self.lsrcmd5 or self.error is not None: return True return False def isexpanded(self): """:return: ``True`` if the package is an expanded link""" if self.lsrcmd5 and not self.xsrcmd5: return True return False def haserror(self): """:return: ``True`` if the link is in error state (could not be applied)""" if self.error: return True return False def __str__(self): """return an informatory string representation""" if self.islink() and not self.isexpanded(): return 'project %s, package %s, xsrcmd5 %s, rev %s' \ % (self.project, self.package, self.xsrcmd5, self.rev) elif self.islink() and self.isexpanded(): if self.haserror(): return 'broken link to project %s, package %s, srcmd5 %s, lsrcmd5 %s: %s' \ % (self.project, self.package, self.srcmd5, self.lsrcmd5, self.error) else: return 'expanded link to project %s, package %s, srcmd5 %s, lsrcmd5 %s' \ % (self.project, self.package, self.srcmd5, self.lsrcmd5) else: return 'None' osc-1.12.1/osc/obs_scm/package.py000066400000000000000000002106071475337502500165570ustar00rootroot00000000000000import difflib import fnmatch import glob import shutil import os import sys import tempfile from functools import total_ordering from typing import Optional from .. import conf from .. 
import oscerr from ..util.xml import ET from ..util.xml import xml_fromstring from ..util.xml import xml_parse from .file import File from .linkinfo import Linkinfo from .serviceinfo import Serviceinfo from .store import __store_version__ from .store import Store from .store import check_store_version from .store import read_inconflict from .store import read_filemeta from .store import read_sizelimit from .store import read_tobeadded from .store import read_tobedeleted from .store import store from .store import store_read_file from .store import store_write_project from .store import store_write_string @total_ordering class Package: """represent a package (its directory) and read/keep/write its metadata""" # should _meta be a required file? REQ_STOREFILES = ('_project', '_package', '_apiurl', '_files', '_osclib_version') OPT_STOREFILES = ('_to_be_added', '_to_be_deleted', '_in_conflict', '_in_update', '_in_commit', '_meta', '_meta_mode', '_frozenlink', '_pulled', '_linkrepair', '_size_limit', '_commit_msg', '_last_buildroot') def __init__(self, workingdir, progress_obj=None, size_limit=None, wc_check=True): from .. import store as osc_store global store self.todo = [] if os.path.isfile(workingdir) or not os.path.exists(workingdir): # workingdir is a file # workingdir doesn't exist -> it points to a non-existing file in a working dir (e.g. during mv) workingdir, todo_entry = os.path.split(workingdir) self.todo.append(todo_entry) self.dir = workingdir or "." self.absdir = os.path.abspath(self.dir) self.store = osc_store.get_store(self.dir, check=wc_check) self.store.assert_is_package() self.storedir = os.path.join(self.absdir, store) self.progress_obj = progress_obj self.size_limit = size_limit self.scm_url = self.store.scmurl if size_limit and size_limit == 0: self.size_limit = None self.prjname = self.store.project self.name = self.store.package self.apiurl = self.store.apiurl self.update_datastructs() dirty_files = [] if wc_check: dirty_files = self.wc_check() if dirty_files: msg = 'Your working copy \'%s\' is in an inconsistent state.\n' \ 'Please run \'osc repairwc %s\' (Note this might _remove_\n' \ 'files from the .osc/ dir). Please check the state\n' \ 'of the working copy afterwards (via \'osc status %s\')' % (self.dir, self.dir, self.dir) raise oscerr.WorkingCopyInconsistent(self.prjname, self.name, dirty_files, msg) def __repr__(self): return super().__repr__() + f"({self.prjname}/{self.name})" def __hash__(self): return hash((self.name, self.prjname, self.apiurl)) def __eq__(self, other): return (self.name, self.prjname, self.apiurl) == (other.name, other.prjname, other.apiurl) def __lt__(self, other): return (self.name, self.prjname, self.apiurl) < (other.name, other.prjname, other.apiurl) @classmethod def from_paths(cls, paths, progress_obj=None, *, skip_dirs=False): """ Return a list of Package objects from working copies in given paths. 
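        A minimal usage sketch (the paths are placeholders for osc package working
        copies)::

            pacs = Package.from_paths(["./foo", "./bar"])
            for pac in pacs:
                print(pac.prjname, pac.name, pac.rev)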
""" packages = [] for path in paths: if skip_dirs and os.path.isdir(path): continue package = cls(path, progress_obj) seen_package = None try: # re-use an existing package seen_package_index = packages.index(package) seen_package = packages[seen_package_index] except ValueError: pass if seen_package: # merge package into seen_package if seen_package.absdir != package.absdir: raise oscerr.PackageExists(package.prjname, package.name, "Duplicate package") seen_package.merge(package) else: # use the new package instance packages.append(package) return packages @classmethod def from_paths_nofail(cls, paths, progress_obj=None, *, skip_dirs=False): """ Return a list of Package objects from working copies in given paths and a list of strings with paths that do not contain Package working copies. """ packages = [] failed_to_load = [] for path in paths: if skip_dirs and os.path.isdir(path): continue try: package = cls(path, progress_obj) except oscerr.NoWorkingCopy: failed_to_load.append(path) continue # the following code is identical to from_paths() seen_package = None try: # re-use an existing package seen_package_index = packages.index(package) seen_package = packages[seen_package_index] except ValueError: pass if seen_package: # merge package into seen_package if seen_package.absdir != package.absdir: raise oscerr.PackageExists(package.prjname, package.name, "Duplicate package") seen_package.merge(package) else: # use the new package instance packages.append(package) return packages, failed_to_load def wc_check(self): dirty_files = [] if self.scm_url: return dirty_files for fname in self.filenamelist: if not self.store.sources_is_file(fname) and fname not in self.skipped: dirty_files.append(fname) for fname in Package.REQ_STOREFILES: if not os.path.isfile(os.path.join(self.storedir, fname)): dirty_files.append(fname) for fname in self.store.sources_list_files(): if fname in self.filenamelist and fname in self.skipped: dirty_files.append(fname) elif fname not in self.filenamelist: dirty_files.append(fname) for fname in self.to_be_deleted[:]: if fname not in self.filenamelist: dirty_files.append(fname) for fname in self.in_conflict[:]: if fname not in self.filenamelist: dirty_files.append(fname) return dirty_files def wc_repair(self, apiurl: Optional[str] = None) -> bool: from ..core import get_source_file repaired: bool = False store = Store(self.dir, check=False) store.assert_is_package() # check_store_version() does the metadata migration that was disabled due to Store(..., check=False) check_store_version(self.absdir) # there was a time when osc did not write _osclib_version file; let's assume these checkouts have version 1.0 if not store.exists("_osclib_version"): store.write_string("_osclib_version", "1.0") if not store.exists("_apiurl") or apiurl: if apiurl is None: msg = 'cannot repair wc: the \'_apiurl\' file is missing but ' \ 'no \'apiurl\' was passed to wc_repair' # hmm should we raise oscerr.WrongArgs? raise oscerr.WorkingCopyInconsistent(self.prjname, self.name, [], msg) # sanity check conf.parse_apisrv_url(None, apiurl) store.apiurl = apiurl self.apiurl = apiurl repaired = True # all files which are present in the filelist have to exist in the storedir for f in self.filelist: # XXX: should we also check the md5? if not self.store.sources_is_file(f.name) and f.name not in self.skipped: # if get_source_file fails we're screwed up... 
get_source_file(self.apiurl, self.prjname, self.name, f.name, targetfilename=self.store.sources_get_path(f.name), revision=self.rev, mtime=f.mtime) repaired = True for fname in store: if fname in Package.REQ_STOREFILES or fname in Package.OPT_STOREFILES or \ fname.startswith('_build'): continue for fname in self.store.sources_list_files(): if fname not in self.filenamelist or fname in self.skipped: # this file does not belong to the storedir so remove it os.unlink(self.store.sources_get_path(fname)) repaired = True for fname in self.to_be_deleted[:]: if fname not in self.filenamelist: self.to_be_deleted.remove(fname) self.write_deletelist() repaired = True for fname in self.in_conflict[:]: if fname not in self.filenamelist: self.in_conflict.remove(fname) self.write_conflictlist() repaired = True return repaired def info(self): from ..core import info_templ from ..core import makeurl source_url = makeurl(self.apiurl, ['source', self.prjname, self.name]) r = info_templ % (self.prjname, self.name, self.absdir, self.apiurl, source_url, self.srcmd5, self.rev, self.linkinfo) return r def addfile(self, n): from ..core import statfrmt if not os.path.exists(os.path.join(self.absdir, n)): raise oscerr.OscIOError(None, f'error: file \'{n}\' does not exist') if n in self.to_be_deleted: self.to_be_deleted.remove(n) self.write_deletelist() elif n in self.filenamelist or n in self.to_be_added: raise oscerr.PackageFileConflict(self.prjname, self.name, n, f'osc: warning: \'{n}\' is already under version control') if self.dir != '.': pathname = os.path.join(self.dir, n) else: pathname = n self.to_be_added.append(n) self.write_addlist() print(statfrmt('A', pathname)) def delete_file(self, n, force=False): """deletes a file if possible and marks the file as deleted""" state = '?' try: state = self.status(n) except OSError as ioe: if not force: raise ioe if state in ['?', 'A', 'M', 'R', 'C'] and not force: return (False, state) # special handling for skipped files: if file exists, simply delete it if state == 'S': exists = os.path.exists(os.path.join(self.dir, n)) self.delete_localfile(n) return (exists, 'S') self.delete_localfile(n) was_added = n in self.to_be_added if state in ('A', 'R') or state == '!' and was_added: self.to_be_added.remove(n) self.write_addlist() elif state == 'C': # don't remove "merge files" (*.mine, *.new...) # that's why we don't use clear_from_conflictlist self.in_conflict.remove(n) self.write_conflictlist() if state not in ('A', '?') and not (state == '!' 
and was_added): self.put_on_deletelist(n) self.write_deletelist() return (True, state) def delete_localfile(self, n): try: os.unlink(os.path.join(self.dir, n)) except: pass def put_on_deletelist(self, n): if n not in self.to_be_deleted: self.to_be_deleted.append(n) def put_on_conflictlist(self, n): if n not in self.in_conflict: self.in_conflict.append(n) def put_on_addlist(self, n): if n not in self.to_be_added: self.to_be_added.append(n) def clear_from_conflictlist(self, n): """delete an entry from the file, and remove the file if it would be empty""" if n in self.in_conflict: filename = os.path.join(self.dir, n) storefilename = self.store.sources_get_path(n) myfilename = os.path.join(self.dir, n + '.mine') upfilename = os.path.join(self.dir, n + '.new') try: os.unlink(myfilename) os.unlink(upfilename) if self.islinkrepair() or self.ispulled(): os.unlink(os.path.join(self.dir, n + '.old')) except: pass self.in_conflict.remove(n) self.write_conflictlist() # XXX: this isn't used at all def write_meta_mode(self): # XXX: the "elif" is somehow a contradiction (with current and the old implementation # it's not possible to "leave" the metamode again) (except if you modify pac.meta # which is really ugly:) ) if self.meta: store_write_string(self.absdir, '_meta_mode', '') elif self.ismetamode(): os.unlink(os.path.join(self.storedir, '_meta_mode')) def write_sizelimit(self): if self.size_limit and self.size_limit <= 0: try: os.unlink(os.path.join(self.storedir, '_size_limit')) except: pass else: store_write_string(self.absdir, '_size_limit', str(self.size_limit) + '\n') def write_addlist(self): self.__write_storelist('_to_be_added', self.to_be_added) def write_deletelist(self): self.__write_storelist('_to_be_deleted', self.to_be_deleted) def delete_source_file(self, n): """delete local a source file""" self.delete_localfile(n) self.store.sources_delete_file(n) def delete_remote_source_file(self, n): """delete a remote source file (e.g. 
from the server)""" from ..core import http_DELETE from ..core import makeurl query = {"rev": "upload"} u = makeurl(self.apiurl, ['source', self.prjname, self.name, n], query=query) http_DELETE(u) def put_source_file(self, n, tdir, copy_only=False): from ..core import http_PUT from ..core import makeurl query = {"rev": "repository"} tfilename = os.path.join(tdir, n) shutil.copyfile(os.path.join(self.dir, n), tfilename) # escaping '+' in the URL path (note: not in the URL query string) is # only a workaround for ruby on rails, which swallows it otherwise if not copy_only: u = makeurl(self.apiurl, ['source', self.prjname, self.name, n], query=query) http_PUT(u, file=tfilename) if n in self.to_be_added: self.to_be_added.remove(n) def __commit_update_store(self, tdir): """move files from transaction directory into the store""" for filename in os.listdir(tdir): os.rename(os.path.join(tdir, filename), self.store.sources_get_path(filename)) def __generate_commitlist(self, todo_send): root = ET.Element('directory') for i in sorted(todo_send.keys()): ET.SubElement(root, 'entry', name=i, md5=todo_send[i]) return root @staticmethod def commit_filelist(apiurl: str, project: str, package: str, filelist, msg="", user=None, **query): """send the commitlog and the local filelist to the server""" from ..core import ET_ENCODING from ..core import http_POST from ..core import makeurl if user is None: user = conf.get_apiurl_usr(apiurl) query.update({'cmd': 'commitfilelist', 'user': user, 'comment': msg}) u = makeurl(apiurl, ['source', project, package], query=query) f = http_POST(u, data=ET.tostring(filelist, encoding=ET_ENCODING)) root = xml_parse(f).getroot() return root @staticmethod def commit_get_missing(filelist): """returns list of missing files (filelist is the result of commit_filelist)""" from ..core import ET_ENCODING error = filelist.get('error') if error is None: return [] elif error != 'missing': raise oscerr.APIError('commit_get_missing_files: ' 'unexpected \'error\' attr: \'%s\'' % error) todo = [] for n in filelist.findall('entry'): name = n.get('name') if name is None: raise oscerr.APIError('missing \'name\' attribute:\n%s\n' % ET.tostring(filelist, encoding=ET_ENCODING)) todo.append(n.get('name')) return todo def __send_commitlog(self, msg, local_filelist, validate=False): """send the commitlog and the local filelist to the server""" query = {} if self.islink() and self.isexpanded(): query['keeplink'] = '1' if conf.config['linkcontrol'] or self.isfrozen(): query['linkrev'] = self.linkinfo.srcmd5 if self.ispulled(): query['repairlink'] = '1' query['linkrev'] = self.get_pulled_srcmd5() if self.islinkrepair(): query['repairlink'] = '1' if validate: query['withvalidate'] = '1' return self.commit_filelist(self.apiurl, self.prjname, self.name, local_filelist, msg, **query) def commit(self, msg='', verbose=False, skip_local_service_run=False, can_branch=False, force=False): from ..core import ET_ENCODING from ..core import branch_pkg from ..core import dgst from ..core import getTransActPath from ..core import http_GET from ..core import makeurl from ..core import print_request_list from ..core import sha256_dgst from ..core import statfrmt # commit only if the upstream revision is the same as the working copy's upstream_rev = self.latest_rev() if self.rev != upstream_rev: raise oscerr.WorkingCopyOutdated((self.absdir, self.rev, upstream_rev)) if not skip_local_service_run: r = self.run_source_services(mode="trylocal", verbose=verbose) if r != 0: # FIXME: it is better to raise this in 
Serviceinfo.execute with more # information (like which service/command failed) raise oscerr.ServiceRuntimeError('A service failed with error: %d' % r) # check if it is a link, if so, branch the package if self.is_link_to_different_project(): if can_branch: orgprj = self.get_local_origin_project() print(f"Branching {self.name} from {orgprj} to {self.prjname}") exists, targetprj, targetpkg, srcprj, srcpkg = branch_pkg( self.apiurl, orgprj, self.name, target_project=self.prjname) # update _meta and _files to sychronize the local package # to the new branched one in OBS self.update_local_pacmeta() self.update_local_filesmeta() else: print(f"{self.name} Not commited because is link to a different project") return 1 if not self.todo: self.todo = [i for i in self.to_be_added if i not in self.filenamelist] + self.filenamelist pathn = getTransActPath(self.dir) todo_send = {} todo_delete = [] real_send = [] sha256sums = {} for filename in self.filenamelist + [i for i in self.to_be_added if i not in self.filenamelist]: if filename.startswith('_service:') or filename.startswith('_service_'): continue st = self.status(filename) if st == 'C': print('Please resolve all conflicts before committing using "osc resolved FILE"!') return 1 elif filename in self.todo: if st in ('A', 'R', 'M'): todo_send[filename] = dgst(os.path.join(self.absdir, filename)) sha256sums[filename] = sha256_dgst(os.path.join(self.absdir, filename)) real_send.append(filename) print(statfrmt('Sending', os.path.join(pathn, filename))) elif st in (' ', '!', 'S'): if st == '!' and filename in self.to_be_added: print(f'file \'{filename}\' is marked as \'A\' but does not exist') return 1 f = self.findfilebyname(filename) if f is None: raise oscerr.PackageInternalError(self.prjname, self.name, 'error: file \'%s\' with state \'%s\' is not known by meta' % (filename, st)) todo_send[filename] = f.md5 elif st == 'D': todo_delete.append(filename) print(statfrmt('Deleting', os.path.join(pathn, filename))) elif st in ('R', 'M', 'D', ' ', '!', 'S'): # ignore missing new file (it's not part of the current commit) if st == '!' and filename in self.to_be_added: continue f = self.findfilebyname(filename) if f is None: raise oscerr.PackageInternalError(self.prjname, self.name, 'error: file \'%s\' with state \'%s\' is not known by meta' % (filename, st)) todo_send[filename] = f.md5 if ((self.ispulled() or self.islinkrepair() or self.isfrozen()) and st != 'A' and filename not in sha256sums): # Ignore files with state 'A': if we should consider it, # it would have been in pac.todo, which implies that it is # in sha256sums. 
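# The commit path above hashes every file it is about to send (md5 for the
# filelist, sha256 when the server asks for additional hashes) and posts a
# <directory> element with cmd=commitfilelist; the server then reports which
# entries it is still missing.  Below is a minimal standalone sketch of that
# filelist construction, using only the standard library; the helper name and
# the demo file contents are illustrative and not part of osc.
import hashlib
import xml.etree.ElementTree as ET


def build_commit_filelist(files):
    """files: mapping of file name -> content (bytes).  Returns a <directory>
    element with one <entry name=... md5=.../> per file, which is the shape
    __generate_commitlist() sends to the source server."""
    root = ET.Element("directory")
    for name in sorted(files):
        md5 = hashlib.md5(files[name]).hexdigest()
        ET.SubElement(root, "entry", name=name, md5=md5)
    return root


if __name__ == "__main__":
    # hypothetical package content, just to show the resulting XML
    demo = {"pkg.spec": b"Name: pkg\n", "pkg.changes": b"- initial\n"}
    print(ET.tostring(build_commit_filelist(demo), encoding="unicode"))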
# The storefile is guaranteed to exist (since we have a # pulled/linkrepair wc, the file cannot have state 'S') storefile = self.store.sources_get_path(filename) sha256sums[filename] = sha256_dgst(storefile) if not force and not real_send and not todo_delete and not self.islinkrepair() and not self.ispulled(): print(f'nothing to do for package {self.name}') return 1 print('Transmitting file data', end=' ') filelist = self.__generate_commitlist(todo_send) sfilelist = self.__send_commitlog(msg, filelist, validate=True) hash_entries = [e for e in sfilelist.findall('entry') if e.get('hash') is not None] if sfilelist.get('error') and hash_entries: name2elem = {e.get('name'): e for e in filelist.findall('entry')} for entry in hash_entries: filename = entry.get('name') fileelem = name2elem.get(filename) if filename not in sha256sums: msg = 'There is no sha256 sum for file %s.\n' \ 'This could be due to an outdated working copy.\n' \ 'Please update your working copy with osc update and\n' \ 'commit again afterwards.' print(msg % filename) return 1 fileelem.set('hash', f'sha256:{sha256sums[filename]}') sfilelist = self.__send_commitlog(msg, filelist) send = self.commit_get_missing(sfilelist) real_send = [i for i in real_send if i not in send] # abort after 3 tries tries = 3 tdir = None try: tdir = os.path.join(self.storedir, '_in_commit') if os.path.isdir(tdir): shutil.rmtree(tdir) os.mkdir(tdir) while send and tries: for filename in send[:]: sys.stdout.write('.') sys.stdout.flush() self.put_source_file(filename, tdir) send.remove(filename) tries -= 1 sfilelist = self.__send_commitlog(msg, filelist) send = self.commit_get_missing(sfilelist) if send: raise oscerr.PackageInternalError(self.prjname, self.name, 'server does not accept filelist:\n%s\nmissing:\n%s\n' % (ET.tostring(filelist, encoding=ET_ENCODING), ET.tostring(sfilelist, encoding=ET_ENCODING))) # these files already exist on the server for filename in real_send: self.put_source_file(filename, tdir, copy_only=True) # update store with the committed files self.__commit_update_store(tdir) finally: if tdir is not None and os.path.isdir(tdir): shutil.rmtree(tdir) self.rev = sfilelist.get('rev') print() print(f'Committed revision {self.rev}.') if self.ispulled(): os.unlink(os.path.join(self.storedir, '_pulled')) if self.islinkrepair(): os.unlink(os.path.join(self.storedir, '_linkrepair')) self.linkrepair = False # XXX: mark package as invalid? print('The source link has been repaired. 
This directory can now be removed.') if self.islink() and self.isexpanded(): li = Linkinfo() li.read(sfilelist.find('linkinfo')) if li.xsrcmd5 is None: raise oscerr.APIError(f'linkinfo has no xsrcmd5 attr:\n{ET.tostring(sfilelist, encoding=ET_ENCODING)}\n') sfilelist = xml_fromstring(self.get_files_meta(revision=li.xsrcmd5)) for i in sfilelist.findall('entry'): if i.get('name') in self.skipped: i.set('skipped', 'true') store_write_string(self.absdir, '_files', ET.tostring(sfilelist, encoding=ET_ENCODING) + '\n') for filename in todo_delete: self.to_be_deleted.remove(filename) self.store.sources_delete_file(filename) self.write_deletelist() self.write_addlist() self.update_datastructs() print_request_list(self.apiurl, self.prjname, self.name) # FIXME: add testcases for this codepath sinfo = sfilelist.find('serviceinfo') if sinfo is not None: print('Waiting for server side source service run') u = makeurl(self.apiurl, ['source', self.prjname, self.name]) while sinfo is not None and sinfo.get('code') == 'running': sys.stdout.write('.') sys.stdout.flush() # does it make sense to add some delay? sfilelist = xml_fromstring(http_GET(u).read()) # if sinfo is None another commit might have occured in the "meantime" sinfo = sfilelist.find('serviceinfo') print('') rev = self.latest_rev() self.update(rev=rev) elif self.get_local_meta() is None: # if this was a newly added package there is no _meta # file self.update_local_pacmeta() def __write_storelist(self, name, data): if len(data) == 0: try: os.unlink(os.path.join(self.storedir, name)) except: pass else: store_write_string(self.absdir, name, '%s\n' % '\n'.join(data)) def write_conflictlist(self): self.__write_storelist('_in_conflict', self.in_conflict) def updatefile(self, n, revision, mtime=None): from ..core import get_source_file from ..core import utime filename = os.path.join(self.dir, n) storefilename = self.store.sources_get_path(n) origfile_tmp = os.path.join(self.storedir, '_in_update', f'{n}.copy') origfile = os.path.join(self.storedir, '_in_update', n) if os.path.isfile(filename): shutil.copyfile(filename, origfile_tmp) os.rename(origfile_tmp, origfile) else: origfile = None get_source_file(self.apiurl, self.prjname, self.name, n, targetfilename=storefilename, revision=revision, progress_obj=self.progress_obj, mtime=mtime, meta=self.meta) shutil.copyfile(storefilename, filename) if mtime: utime(filename, (-1, mtime)) if origfile is not None: os.unlink(origfile) def mergefile(self, n, revision, mtime=None): from ..core import binary_file from ..core import get_source_file from ..core import run_external filename = os.path.join(self.dir, n) storefilename = self.store.sources_get_path(n) myfilename = os.path.join(self.dir, n + '.mine') upfilename = os.path.join(self.dir, n + '.new') origfile_tmp = os.path.join(self.storedir, '_in_update', f'{n}.copy') origfile = os.path.join(self.storedir, '_in_update', n) shutil.copyfile(filename, origfile_tmp) os.rename(origfile_tmp, origfile) os.rename(filename, myfilename) get_source_file(self.apiurl, self.prjname, self.name, n, revision=revision, targetfilename=upfilename, progress_obj=self.progress_obj, mtime=mtime, meta=self.meta) if binary_file(myfilename) or binary_file(upfilename): # don't try merging shutil.copyfile(upfilename, filename) shutil.copyfile(upfilename, storefilename) os.unlink(origfile) self.in_conflict.append(n) self.write_conflictlist() return 'C' else: # try merging # diff3 OPTIONS... 
MINE OLDER YOURS ret = -1 with open(filename, 'w') as f: args = ('-m', '-E', myfilename, storefilename, upfilename) ret = run_external('diff3', *args, stdout=f) # "An exit status of 0 means `diff3' was successful, 1 means some # conflicts were found, and 2 means trouble." if ret == 0: # merge was successful... clean up shutil.copyfile(upfilename, storefilename) os.unlink(upfilename) os.unlink(myfilename) os.unlink(origfile) return 'G' elif ret == 1: # unsuccessful merge shutil.copyfile(upfilename, storefilename) os.unlink(origfile) self.in_conflict.append(n) self.write_conflictlist() return 'C' else: merge_cmd = 'diff3 ' + ' '.join(args) raise oscerr.ExtRuntimeError(f'diff3 failed with exit code: {ret}', merge_cmd) def update_local_filesmeta(self, revision=None): """ Update the local _files file in the store. It is replaced with the version pulled from upstream. """ meta = self.get_files_meta(revision=revision) store_write_string(self.absdir, '_files', meta + '\n') def get_files_meta(self, revision='latest', skip_service=True): from ..core import ET_ENCODING from ..core import show_files_meta fm = show_files_meta(self.apiurl, self.prjname, self.name, revision=revision, meta=self.meta) # look for "too large" files according to size limit and mark them root = xml_fromstring(fm) for e in root.findall('entry'): size = e.get('size') if size and self.size_limit and int(size) > self.size_limit \ or skip_service and (e.get('name').startswith('_service:') or e.get('name').startswith('_service_')): e.set('skipped', 'true') continue if conf.config["exclude_files"]: exclude = False for pattern in conf.config["exclude_files"]: if fnmatch.fnmatch(e.get("name"), pattern): exclude = True break if exclude: e.set("skipped", "true") continue if conf.config["include_files"]: include = False for pattern in conf.config["include_files"]: if fnmatch.fnmatch(e.get("name"), pattern): include = True break if not include: e.set("skipped", "true") continue return ET.tostring(root, encoding=ET_ENCODING) def get_local_meta(self): """Get the local _meta file for the package.""" meta = store_read_file(self.absdir, '_meta') return meta def get_local_origin_project(self): """Get the originproject from the _meta file.""" # if the wc was checked out via some old osc version # there might be no meta file: in this case we assume # that the origin project is equal to the wc's project meta = self.get_local_meta() if meta is None: return self.prjname root = xml_fromstring(meta) return root.get('project') def is_link_to_different_project(self): """Check if the package is a link to a different project.""" if self.name == "_project": return False orgprj = self.get_local_origin_project() return self.prjname != orgprj def update_datastructs(self): """ Update the internal data structures if the local _files file has changed (e.g. update_local_filesmeta() has been called). 
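# mergefile() above shells out to GNU diff3 ("diff3 -m -E MINE OLDER YOURS")
# and maps its exit status onto working-copy states.  Below is a standalone
# sketch of just that mapping, using subprocess instead of osc's run_external()
# helper; it assumes a diff3 binary is installed and the three input paths
# exist.
import subprocess


def three_way_merge(mine, older, yours, merged_out):
    """Return 'G' for a clean merge or 'C' when conflict markers were written
    to merged_out; any other status (diff3's "trouble") raises, mirroring the
    exit-code handling in mergefile()."""
    with open(merged_out, "w") as out:
        ret = subprocess.call(["diff3", "-m", "-E", mine, older, yours], stdout=out)
    if ret == 0:
        return "G"
    if ret == 1:
        return "C"
    raise RuntimeError(f"diff3 failed with exit code {ret}")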
""" from ..core import DirectoryServiceinfo if self.scm_url: self.filenamelist = [] self.filelist = [] self.skipped = [] self.to_be_added = [] self.to_be_deleted = [] self.in_conflict = [] self.linkrepair = None self.rev = None self.srcmd5 = None self.linkinfo = Linkinfo() self.serviceinfo = DirectoryServiceinfo() self.size_limit = None self.meta = None self.excluded = [] self.filenamelist_unvers = [] return files_tree = read_filemeta(self.dir) files_tree_root = files_tree.getroot() self.rev = files_tree_root.get('rev') self.srcmd5 = files_tree_root.get('srcmd5') self.linkinfo = Linkinfo() self.linkinfo.read(files_tree_root.find('linkinfo')) self.serviceinfo = DirectoryServiceinfo() self.serviceinfo.read(files_tree_root.find('serviceinfo')) self.filenamelist = [] self.filelist = [] self.skipped = [] for node in files_tree_root.findall('entry'): try: f = File(node.get('name'), node.get('md5'), int(node.get('size')), int(node.get('mtime'))) if node.get('skipped'): self.skipped.append(f.name) f.skipped = True except: # okay, a very old version of _files, which didn't contain any metadata yet... f = File(node.get('name'), '', 0, 0) self.filelist.append(f) self.filenamelist.append(f.name) self.to_be_added = read_tobeadded(self.absdir) self.to_be_deleted = read_tobedeleted(self.absdir) self.in_conflict = read_inconflict(self.absdir) self.linkrepair = os.path.isfile(os.path.join(self.storedir, '_linkrepair')) self.size_limit = read_sizelimit(self.dir) self.meta = self.ismetamode() # gather unversioned files, but ignore some stuff self.excluded = [] for i in os.listdir(self.dir): for j in conf.config['exclude_glob']: if fnmatch.fnmatch(i, j): self.excluded.append(i) break self.filenamelist_unvers = [i for i in os.listdir(self.dir) if i not in self.excluded if i not in self.filenamelist] def islink(self): """tells us if the package is a link (has 'linkinfo'). A package with linkinfo is a package which links to another package. Returns ``True`` if the package is a link, otherwise ``False``.""" return self.linkinfo.islink() def isexpanded(self): """tells us if the package is a link which is expanded. Returns ``True`` if the package is expanded, otherwise ``False``.""" return self.linkinfo.isexpanded() def islinkrepair(self): """tells us if we are repairing a broken source link.""" return self.linkrepair def ispulled(self): """tells us if we have pulled a link.""" return os.path.isfile(os.path.join(self.storedir, '_pulled')) def isfrozen(self): """tells us if the link is frozen.""" return os.path.isfile(os.path.join(self.storedir, '_frozenlink')) def ismetamode(self): """tells us if the package is in meta mode""" return os.path.isfile(os.path.join(self.storedir, '_meta_mode')) def get_pulled_srcmd5(self): pulledrev = None for line in open(os.path.join(self.storedir, '_pulled')): pulledrev = line.strip() return pulledrev def haslinkerror(self): """ Returns ``True`` if the link is broken otherwise ``False``. If the package is not a link it returns ``False``. """ return self.linkinfo.haserror() def linkerror(self): """ Returns an error message if the link is broken otherwise ``None``. If the package is not a link it returns ``None``. """ return self.linkinfo.error def hasserviceinfo(self): """ Returns ``True``, if this package contains services. """ return self.serviceinfo.lsrcmd5 is not None or self.serviceinfo.xsrcmd5 is not None def update_local_pacmeta(self): """ Update the local _meta file in the store. It is replaced with the version pulled from upstream. 
""" from ..core import show_package_meta meta = show_package_meta(self.apiurl, self.prjname, self.name) if meta != "": # is empty for _project for example meta = b''.join(meta) store_write_string(self.absdir, '_meta', meta + b'\n') def findfilebyname(self, n): for i in self.filelist: if i.name == n: return i def get_status(self, excluded=False, *exclude_states): global store todo = self.todo if not todo: todo = self.filenamelist + self.to_be_added + \ [i for i in self.filenamelist_unvers if not os.path.isdir(os.path.join(self.absdir, i))] if excluded: todo.extend([i for i in self.excluded if i != store]) todo = set(todo) res = [] for fname in sorted(todo): st = self.status(fname) if st not in exclude_states: res.append((st, fname)) return res def status(self, n): """ status can be:: file storefile file present STATUS exists exists in _files x - - 'A' and listed in _to_be_added x x - 'R' and listed in _to_be_added x x x ' ' if digest differs: 'M' and if in conflicts file: 'C' x - - '?' - x x 'D' and listed in _to_be_deleted x x x 'D' and listed in _to_be_deleted (e.g. if deleted file was modified) x x x 'C' and listed in _in_conflict x - x 'S' and listed in self.skipped - - x 'S' and listed in self.skipped - x x '!' - - - NOT DEFINED """ from ..core import dgst known_by_meta = False exists = False exists_in_store = False localfile = os.path.join(self.absdir, n) if n in self.filenamelist: known_by_meta = True if os.path.exists(localfile): exists = True if self.store.sources_is_file(n): exists_in_store = True if n in self.to_be_deleted: state = 'D' elif n in self.in_conflict: state = 'C' elif n in self.skipped: state = 'S' elif n in self.to_be_added and exists and exists_in_store: state = 'R' elif n in self.to_be_added and exists: state = 'A' elif exists and exists_in_store and known_by_meta: filemeta = self.findfilebyname(n) state = ' ' if conf.config['status_mtime_heuristic']: if os.path.getmtime(localfile) != filemeta.mtime and dgst(localfile) != filemeta.md5: state = 'M' elif dgst(localfile) != filemeta.md5: state = 'M' elif n in self.to_be_added and not exists: state = '!' elif not exists and exists_in_store and known_by_meta and n not in self.to_be_deleted: state = '!' elif exists and not exists_in_store and not known_by_meta: state = '?' elif not exists_in_store and known_by_meta: # XXX: this codepath shouldn't be reached (we restore the storefile # in update_datastructs) raise oscerr.PackageInternalError(self.prjname, self.name, 'error: file \'%s\' is known by meta but no storefile exists.\n' 'This might be caused by an old wc format. Please backup your current\n' 'wc and checkout the package again. Afterwards copy all files (except the\n' '.osc/ dir) into the new package wc.' % n) elif os.path.islink(localfile): # dangling symlink, whose name is _not_ tracked: treat it # as unversioned state = '?' else: # this case shouldn't happen (except there was a typo in the filename etc.) 
raise oscerr.OscIOError(None, f'osc: \'{n}\' is not under version control') return state def get_diff(self, revision=None, ignoreUnversioned=False): from ..core import binary_file from ..core import get_source_file from ..core import get_source_file_diff from ..core import revision_is_empty diff_hdr = b'Index: %s\n' diff_hdr += b'===================================================================\n' kept = [] added = [] deleted = [] def diff_add_delete(fname, add, revision): diff = [] diff.append(diff_hdr % fname.encode()) origname = fname if add: diff.append(b'--- %s\t(revision 0)\n' % fname.encode()) rev = 'revision 0' if not revision_is_empty(revision) and fname not in self.to_be_added: rev = 'working copy' diff.append(b'+++ %s\t(%s)\n' % (fname.encode(), rev.encode())) fname = os.path.join(self.absdir, fname) if not os.path.isfile(fname): raise oscerr.OscIOError(None, 'file \'%s\' is marked as \'A\' but does not exist\n' '(either add the missing file or revert it)' % fname) else: if not revision_is_empty(revision): b_revision = str(revision).encode() else: b_revision = self.rev.encode() diff.append(b'--- %s\t(revision %s)\n' % (fname.encode(), b_revision)) diff.append(b'+++ %s\t(working copy)\n' % fname.encode()) fname = self.store.sources_get_path(fname) fd = None tmpfile = None try: if not revision_is_empty(revision) and not add: (fd, tmpfile) = tempfile.mkstemp(prefix='osc_diff') get_source_file(self.apiurl, self.prjname, self.name, origname, tmpfile, revision) fname = tmpfile if binary_file(fname): what = b'added' if not add: what = b'deleted' diff = diff[:1] diff.append(b'Binary file \'%s\' %s.\n' % (origname.encode(), what)) return diff tmpl = b'+%s' ltmpl = b'@@ -0,0 +1,%d @@\n' if not add: tmpl = b'-%s' ltmpl = b'@@ -1,%d +0,0 @@\n' with open(fname, 'rb') as f: lines = [tmpl % i for i in f.readlines()] if len(lines): diff.append(ltmpl % len(lines)) if not lines[-1].endswith(b'\n'): lines.append(b'\n\\ No newline at end of file\n') diff.extend(lines) finally: if fd is not None: os.close(fd) if tmpfile is not None and os.path.exists(tmpfile): os.unlink(tmpfile) return diff if revision is None: todo = self.todo or [i for i in self.filenamelist if i not in self.to_be_added] + self.to_be_added for fname in todo: if fname in self.to_be_added and self.status(fname) == 'A': added.append(fname) elif fname in self.to_be_deleted: deleted.append(fname) elif fname in self.filenamelist: kept.append(self.findfilebyname(fname)) elif fname in self.to_be_added and self.status(fname) == '!': raise oscerr.OscIOError(None, 'file \'%s\' is marked as \'A\' but does not exist\n' '(either add the missing file or revert it)' % fname) elif not ignoreUnversioned: raise oscerr.OscIOError(None, f'file \'{fname}\' is not under version control') else: fm = self.get_files_meta(revision=revision) root = xml_fromstring(fm) rfiles = self.__get_files(root) # swap added and deleted kept, deleted, added, services = self.__get_rev_changes(rfiles) added = [f.name for f in added] added.extend([f for f in self.to_be_added if f not in kept]) deleted = [f.name for f in deleted] deleted.extend(self.to_be_deleted) for f in added[:]: if f in deleted: added.remove(f) deleted.remove(f) # print kept, added, deleted for f in kept: state = self.status(f.name) if state in ('S', '?', '!'): continue elif state == ' ' and revision is None: continue elif not revision_is_empty(revision) and self.findfilebyname(f.name).md5 == f.md5 and state != 'M': continue yield [diff_hdr % f.name.encode()] if revision is None: yield 
get_source_file_diff(self.absdir, f.name, self.rev) else: fd = None tmpfile = None diff = [] try: (fd, tmpfile) = tempfile.mkstemp(prefix='osc_diff') get_source_file(self.apiurl, self.prjname, self.name, f.name, tmpfile, revision) diff = get_source_file_diff(self.absdir, f.name, revision, os.path.basename(tmpfile), os.path.dirname(tmpfile), f.name) finally: if fd is not None: os.close(fd) if tmpfile is not None and os.path.exists(tmpfile): os.unlink(tmpfile) yield diff for f in added: yield diff_add_delete(f, True, revision) for f in deleted: yield diff_add_delete(f, False, revision) def merge(self, otherpac): for todo_entry in otherpac.todo: if todo_entry not in self.todo: self.todo.append(todo_entry) def __str__(self): r = """ name: %s prjname: %s workingdir: %s localfilelist: %s linkinfo: %s rev: %s 'todo' files: %s """ % (self.name, self.prjname, self.dir, '\n '.join(self.filenamelist), self.linkinfo, self.rev, self.todo) return r def read_meta_from_spec(self, spec=None): from ..core import read_meta_from_spec if spec: specfile = spec else: # scan for spec files speclist = glob.glob(os.path.join(self.dir, '*.spec')) if len(speclist) == 1: specfile = speclist[0] elif len(speclist) > 1: print('the following specfiles were found:') for filename in speclist: print(filename) print('please specify one with --specfile') sys.exit(1) else: print('no specfile was found - please specify one ' 'with --specfile') sys.exit(1) data = read_meta_from_spec(specfile, 'Summary', 'Url', '%description') self.summary = data.get('Summary', '') self.url = data.get('Url', '') self.descr = data.get('%description', '') def update_package_meta(self, force=False): """ for the updatepacmetafromspec subcommand argument force supress the confirm question """ from .. import obs_api from ..output import get_user_input package_obj = obs_api.Package.from_api(self.apiurl, self.prjname, self.name) old = package_obj.to_string() package_obj.title = self.summary.strip() package_obj.description = "".join(self.descr).strip() package_obj.url = self.url.strip() new = package_obj.to_string() if not package_obj.has_changed(): return if force: reply = "y" else: while True: print("\n".join(difflib.unified_diff(old.splitlines(), new.splitlines(), fromfile="old", tofile="new"))) print() reply = get_user_input( "Write?", answers={"y": "yes", "n": "no", "e": "edit"}, ) if reply == "y": break if reply == "n": break if reply == "e": _, _, edited_obj = package_obj.do_edit() package_obj.do_update(edited_obj) new = package_obj.to_string() continue if reply == "y": package_obj.to_api(self.apiurl) def mark_frozen(self): store_write_string(self.absdir, '_frozenlink', '') print() print(f"The link in this package (\"{self.name}\") is currently broken. 
Checking") print("out the last working version instead; please use 'osc pull'") print("to merge the conflicts.") print() def unmark_frozen(self): if os.path.exists(os.path.join(self.storedir, '_frozenlink')): os.unlink(os.path.join(self.storedir, '_frozenlink')) def latest_rev(self, include_service_files=False, expand=False): from ..core import show_upstream_rev from ..core import show_upstream_xsrcmd5 # if expand is True the xsrcmd5 will be returned (even if the wc is unexpanded) if self.islinkrepair(): upstream_rev = show_upstream_xsrcmd5(self.apiurl, self.prjname, self.name, linkrepair=1, meta=self.meta, include_service_files=include_service_files) elif self.islink() and (self.isexpanded() or expand): if self.isfrozen() or self.ispulled(): upstream_rev = show_upstream_xsrcmd5(self.apiurl, self.prjname, self.name, linkrev=self.linkinfo.srcmd5, meta=self.meta, include_service_files=include_service_files) else: try: upstream_rev = show_upstream_xsrcmd5(self.apiurl, self.prjname, self.name, meta=self.meta, include_service_files=include_service_files) except: try: upstream_rev = show_upstream_xsrcmd5(self.apiurl, self.prjname, self.name, linkrev=self.linkinfo.srcmd5, meta=self.meta, include_service_files=include_service_files) except: upstream_rev = show_upstream_xsrcmd5(self.apiurl, self.prjname, self.name, linkrev="base", meta=self.meta, include_service_files=include_service_files) self.mark_frozen() elif not self.islink() and expand: upstream_rev = show_upstream_xsrcmd5(self.apiurl, self.prjname, self.name, meta=self.meta, include_service_files=include_service_files) else: upstream_rev = show_upstream_rev(self.apiurl, self.prjname, self.name, meta=self.meta, include_service_files=include_service_files) return upstream_rev def __get_files(self, fmeta_root): from ..core import ET_ENCODING f = [] if fmeta_root.get('rev') is None and len(fmeta_root.findall('entry')) > 0: raise oscerr.APIError(f"missing rev attribute in _files:\n{''.join(ET.tostring(fmeta_root, encoding=ET_ENCODING))}") for i in fmeta_root.findall('entry'): error = i.get('error') if error is not None: raise oscerr.APIError(f'broken files meta: {error}') skipped = i.get('skipped') is not None f.append(File(i.get('name'), i.get('md5'), int(i.get('size')), int(i.get('mtime')), skipped)) return f def __get_rev_changes(self, revfiles): kept = [] added = [] deleted = [] services = [] revfilenames = [] for f in revfiles: revfilenames.append(f.name) # treat skipped like deleted files if f.skipped: if f.name.startswith('_service:'): services.append(f) else: deleted.append(f) continue # treat skipped like added files # problem: this overwrites existing files during the update # (because skipped files aren't in self.filenamelist_unvers) if f.name in self.filenamelist and f.name not in self.skipped: kept.append(f) else: added.append(f) for f in self.filelist: if f.name not in revfilenames: deleted.append(f) return kept, added, deleted, services def update_needed(self, sinfo): # this method might return a false-positive (that is a True is returned, # even though no update is needed) (for details, see comments below) if self.islink(): if self.isexpanded(): # check if both revs point to the same expanded sources # Note: if the package contains a _service file, sinfo.srcmd5's lsrcmd5 # points to the "expanded" services (xservicemd5) => chances # for a false-positive are high, because osc usually works on the # "unexpanded" services. 
# Once the srcserver supports something like noservice=1, we can get rid of # this false-positives (patch was already sent to the ml) (but this also # requires some slight changes in osc) return sinfo.get('srcmd5') != self.srcmd5 elif self.hasserviceinfo(): # check if we have expanded or unexpanded services if self.serviceinfo.isexpanded(): return sinfo.get('lsrcmd5') != self.srcmd5 else: # again, we might have a false-positive here, because # a mismatch of the "xservicemd5"s does not neccessarily # imply a change in the "unexpanded" services. return sinfo.get('lsrcmd5') != self.serviceinfo.xsrcmd5 # simple case: unexpanded sources and no services # self.srcmd5 should also work return sinfo.get('lsrcmd5') != self.linkinfo.lsrcmd5 elif self.hasserviceinfo(): if self.serviceinfo.isexpanded(): return sinfo.get('srcmd5') != self.srcmd5 else: # cannot handle this case, because the sourceinfo does not contain # information about the lservicemd5. Once the srcserver supports # a noservice=1 query parameter, we can handle this case. return True return sinfo.get('srcmd5') != self.srcmd5 def update(self, rev=None, service_files=False, size_limit=None): from ..core import ET_ENCODING from ..core import dgst rfiles = [] # size_limit is only temporary for this update old_size_limit = self.size_limit if size_limit is not None: self.size_limit = int(size_limit) in_update_files_path = os.path.join(self.storedir, "_in_update", "_files") if os.path.isfile(in_update_files_path) and os.path.getsize(in_update_files_path) != 0: print('resuming broken update...') root = xml_parse(os.path.join(self.storedir, '_in_update', '_files')).getroot() rfiles = self.__get_files(root) kept, added, deleted, services = self.__get_rev_changes(rfiles) # check if we aborted in the middle of a file update broken_file = os.listdir(os.path.join(self.storedir, '_in_update')) broken_file.remove('_files') if len(broken_file) == 1: origfile = os.path.join(self.storedir, '_in_update', broken_file[0]) wcfile = os.path.join(self.absdir, broken_file[0]) origfile_md5 = dgst(origfile) origfile_meta = self.findfilebyname(broken_file[0]) if origfile.endswith('.copy'): # ok it seems we aborted at some point during the copy process # (copy process == copy wcfile to the _in_update dir). remove file+continue os.unlink(origfile) elif self.findfilebyname(broken_file[0]) is None: # should we remove this file from _in_update? if we don't # the user has no chance to continue without removing the file manually raise oscerr.PackageInternalError(self.prjname, self.name, '\'%s\' is not known by meta but exists in \'_in_update\' dir') elif os.path.isfile(wcfile) and dgst(wcfile) != origfile_md5: (fd, tmpfile) = tempfile.mkstemp(dir=self.absdir, prefix=broken_file[0] + '.') os.close(fd) os.rename(wcfile, tmpfile) os.rename(origfile, wcfile) print('warning: it seems you modified \'%s\' after the broken ' 'update. Restored original file and saved modified version ' 'to \'%s\'.' % (wcfile, tmpfile)) elif not os.path.isfile(wcfile): # this is strange... because it existed before the update. 
restore it os.rename(origfile, wcfile) else: # everything seems to be ok os.unlink(origfile) elif len(broken_file) > 1: raise oscerr.PackageInternalError(self.prjname, self.name, 'too many files in \'_in_update\' dir') tmp = rfiles[:] for f in tmp: if self.store.sources_is_file(f.name): if dgst(self.store.sources_get_path(f.name)) == f.md5: if f in kept: kept.remove(f) elif f in added: added.remove(f) # this can't happen elif f in deleted: deleted.remove(f) if not service_files: services = [] self.__update(kept, added, deleted, services, ET.tostring(root, encoding=ET_ENCODING), root.get('rev')) os.unlink(os.path.join(self.storedir, '_in_update', '_files')) os.rmdir(os.path.join(self.storedir, '_in_update')) # ok everything is ok (hopefully)... fm = self.get_files_meta(revision=rev) root = xml_fromstring(fm) rfiles = self.__get_files(root) store_write_string(self.absdir, '_files', fm + '\n', subdir='_in_update') kept, added, deleted, services = self.__get_rev_changes(rfiles) if not service_files: services = [] self.__update(kept, added, deleted, services, fm, root.get('rev')) os.unlink(os.path.join(self.storedir, '_in_update', '_files')) if os.path.isdir(os.path.join(self.storedir, '_in_update')): os.rmdir(os.path.join(self.storedir, '_in_update')) self.size_limit = old_size_limit def __update(self, kept, added, deleted, services, fm, rev): from ..core import get_source_file from ..core import getTransActPath from ..core import statfrmt pathn = getTransActPath(self.dir) # check for conflicts with existing files for f in added: if f.name in self.filenamelist_unvers: raise oscerr.PackageFileConflict(self.prjname, self.name, f.name, f'failed to add file \'{f.name}\' file/dir with the same name already exists') # ok, the update can't fail due to existing files for f in added: self.updatefile(f.name, rev, f.mtime) print(statfrmt('A', os.path.join(pathn, f.name))) for f in deleted: # if the storefile doesn't exist we're resuming an aborted update: # the file was already deleted but we cannot know this # OR we're processing a _service: file (simply keep the file) if self.store.sources_is_file(f.name) and self.status(f.name) not in ('M', 'C'): # if self.status(f.name) != 'M': self.delete_localfile(f.name) self.store.sources_delete_file(f.name) print(statfrmt('D', os.path.join(pathn, f.name))) if f.name in self.to_be_deleted: self.to_be_deleted.remove(f.name) self.write_deletelist() elif f.name in self.in_conflict: self.in_conflict.remove(f.name) self.write_conflictlist() for f in kept: state = self.status(f.name) # print f.name, state if state == 'M' and self.findfilebyname(f.name).md5 == f.md5: # remote file didn't change pass elif state == 'M': # try to merge changes merge_status = self.mergefile(f.name, rev, f.mtime) print(statfrmt(merge_status, os.path.join(pathn, f.name))) elif state == '!': self.updatefile(f.name, rev, f.mtime) print(f'Restored \'{os.path.join(pathn, f.name)}\'') elif state == 'C': get_source_file(self.apiurl, self.prjname, self.name, f.name, targetfilename=self.store.sources_get_path(f.name), revision=rev, progress_obj=self.progress_obj, mtime=f.mtime, meta=self.meta) print(f'skipping \'{f.name}\' (this is due to conflicts)') elif state == 'D' and self.findfilebyname(f.name).md5 != f.md5: # XXX: in the worst case we might end up with f.name being # in _to_be_deleted and in _in_conflict... 
this needs to be checked if os.path.exists(os.path.join(self.absdir, f.name)): merge_status = self.mergefile(f.name, rev, f.mtime) print(statfrmt(merge_status, os.path.join(pathn, f.name))) if merge_status == 'C': # state changes from delete to conflict self.to_be_deleted.remove(f.name) self.write_deletelist() else: # XXX: we cannot recover this case because we've no file # to backup self.updatefile(f.name, rev, f.mtime) print(statfrmt('U', os.path.join(pathn, f.name))) elif state == ' ' and self.findfilebyname(f.name).md5 != f.md5: self.updatefile(f.name, rev, f.mtime) print(statfrmt('U', os.path.join(pathn, f.name))) # checkout service files for f in services: get_source_file(self.apiurl, self.prjname, self.name, f.name, targetfilename=os.path.join(self.absdir, f.name), revision=rev, progress_obj=self.progress_obj, mtime=f.mtime, meta=self.meta) print(statfrmt('A', os.path.join(pathn, f.name))) store_write_string(self.absdir, '_files', fm + '\n') if not self.meta: self.update_local_pacmeta() self.update_datastructs() print(f'At revision {self.rev}.') def run_source_services(self, mode=None, singleservice=None, verbose=None): if self.name.startswith("_"): return 0 curdir = os.getcwd() os.chdir(self.absdir) # e.g. /usr/lib/obs/service/verify_file fails if not inside the project dir. si = Serviceinfo() if os.path.exists('_service'): try: service = xml_parse(os.path.join(self.absdir, '_service')).getroot() except ET.ParseError as v: line, column = v.position print(f'XML error in _service file on line {line}, column {column}') sys.exit(1) si.read(service) si.getProjectGlobalServices(self.apiurl, self.prjname, self.name) r = si.execute(self.absdir, mode, singleservice, verbose) os.chdir(curdir) return r def revert(self, filename): if filename not in self.filenamelist and filename not in self.to_be_added: raise oscerr.OscIOError(None, f'file \'{filename}\' is not under version control') elif filename in self.skipped: raise oscerr.OscIOError(None, f'file \'{filename}\' is marked as skipped and cannot be reverted') if filename in self.filenamelist and not self.store.sources_is_file(filename): msg = f"file '{filename}' is listed in filenamelist but no storefile exists" raise oscerr.PackageInternalError(self.prjname, self.name, msg) state = self.status(filename) if not (state == 'A' or state == '!' and filename in self.to_be_added): shutil.copyfile(self.store.sources_get_path(filename), os.path.join(self.absdir, filename)) if state == 'D': self.to_be_deleted.remove(filename) self.write_deletelist() elif state == 'C': self.clear_from_conflictlist(filename) elif state in ('A', 'R') or state == '!' 
and filename in self.to_be_added: self.to_be_added.remove(filename) self.write_addlist() @staticmethod def init_package(apiurl: str, project, package, dir, size_limit=None, meta=False, progress_obj=None, scm_url=None): global store if not os.path.exists(dir): os.mkdir(dir) elif not os.path.isdir(dir): raise oscerr.OscIOError(None, f'error: \'{dir}\' is no directory') if os.path.exists(os.path.join(dir, store)): raise oscerr.OscIOError(None, f'error: \'{dir}\' is already an initialized osc working copy') else: os.mkdir(os.path.join(dir, store)) s = Store(dir, check=False) s.write_string("_osclib_version", Store.STORE_VERSION) s.apiurl = apiurl s.project = project s.package = package if meta: s.write_string("_meta_mode", "") if size_limit: s.size_limit = int(size_limit) if scm_url: s.scmurl = scm_url else: s.write_string("_files", "") return Package(dir, progress_obj=progress_obj, size_limit=size_limit) osc-1.12.1/osc/obs_scm/project.py000066400000000000000000000674261475337502500166430ustar00rootroot00000000000000import fnmatch import os from pathlib import Path from typing import Optional from .. import conf from .. import oscerr from ..util.xml import ET from ..util.xml import xml_parse from .store import Store from .store import delete_storedir from .store import store from .store import store_read_package from .store import store_read_project from .store import store_write_initial_packages from .store import store_write_project from .store import store_write_string from .store import is_package_dir class Project: """ Represent a checked out project directory, holding packages. :Attributes: ``dir`` The directory path containing the project. ``name`` The name of the project. ``apiurl`` The endpoint URL of the API server. ``pacs_available`` List of names of packages available server-side. This is only populated if ``getPackageList`` is set to ``True`` in the constructor. ``pacs_have`` List of names of packages which exist server-side and exist in the local project working copy (if 'do_package_tracking' is disabled). If 'do_package_tracking' is enabled it represents the list names of packages which are tracked in the project working copy (that is it might contain packages which exist on the server as well as packages which do not exist on the server (for instance if the local package was added or if the package was removed on the server-side)). ``pacs_excluded`` List of names of packages in the local project directory which are excluded by the `exclude_glob` configuration variable. Only set if `do_package_tracking` is enabled. ``pacs_unvers`` List of names of packages in the local project directory which are not tracked. Only set if `do_package_tracking` is enabled. ``pacs_broken`` List of names of packages which are tracked but do not exist in the local project working copy. Only set if `do_package_tracking` is enabled. ``pacs_missing`` List of names of packages which exist server-side but are not expected to exist in the local project directory. """ REQ_STOREFILES = ('_project', '_apiurl') def __init__(self, dir, getPackageList=True, progress_obj=None, wc_check=True): """ Constructor. :Parameters: `dir` : str The directory path containing the checked out project. `getPackageList` : bool Set to `False` if you want to skip retrieval from the server of the list of packages in the project . 
`wc_check` : bool """ from ..core import meta_get_packagelist self.dir = Path(dir) self.absdir = os.path.abspath(dir) self.store = Store(dir, check=wc_check) self.progress_obj = progress_obj self.name = store_read_project(self.dir) self.scm_url = self.store.scmurl self.apiurl = self.store.apiurl dirty_files = [] if wc_check: dirty_files = self.wc_check() if dirty_files: msg = 'Your working copy \'%s\' is in an inconsistent state.\n' \ 'Please run \'osc repairwc %s\' and check the state\n' \ 'of the working copy afterwards (via \'osc status %s\')' % (self.dir, self.dir, self.dir) raise oscerr.WorkingCopyInconsistent(self.name, None, dirty_files, msg) if getPackageList: self.pacs_available = meta_get_packagelist(self.apiurl, self.name) else: self.pacs_available = [] if conf.config['do_package_tracking']: self.pac_root = self.read_packages().getroot() self.pacs_have = [pac.get('name') for pac in self.pac_root.findall('package')] self.pacs_excluded = [i for i in os.listdir(self.dir) for j in conf.config['exclude_glob'] if fnmatch.fnmatch(i, j)] self.pacs_unvers = [i for i in os.listdir(self.dir) if i not in self.pacs_have and i not in self.pacs_excluded] # store all broken packages (e.g. packages which where removed by a non-osc cmd) # in the self.pacs_broken list self.pacs_broken = [] for p in self.pacs_have: if not os.path.isdir(os.path.join(self.absdir, p)): # all states will be replaced with the '!'-state # (except it is already marked as deleted ('D'-state)) self.pacs_broken.append(p) else: self.pacs_have = [i for i in os.listdir(self.dir) if i in self.pacs_available] self.pacs_missing = [i for i in self.pacs_available if i not in self.pacs_have] def wc_check(self): global store dirty_files = [] req_storefiles = Project.REQ_STOREFILES if conf.config['do_package_tracking'] and self.scm_url is None: req_storefiles += ('_packages',) for fname in req_storefiles: if not os.path.exists(os.path.join(self.absdir, store, fname)): dirty_files.append(fname) return dirty_files def wc_repair(self, apiurl: Optional[str] = None) -> bool: repaired: bool = False store = Store(self.dir, check=False) store.assert_is_project() # there was a time when osc did not write _osclib_version file; let's assume these checkouts have version 1.0 if not store.exists("_osclib_version"): store.write_string("_osclib_version", "1.0") repaired = True if not store.exists("_apiurl") or apiurl: if apiurl is None: msg = 'cannot repair wc: the \'_apiurl\' file is missing but ' \ 'no \'apiurl\' was passed to wc_repair' # hmm should we raise oscerr.WrongArgs? raise oscerr.WorkingCopyInconsistent(self.name, None, [], msg) # sanity check conf.parse_apisrv_url(None, apiurl) store.apiurl = apiurl self.apiurl = apiurl repaired = True return repaired def checkout_missing_pacs(self, sinfos, expand_link=False, unexpand_link=False): from ..core import checkout_package from ..core import getTransActPath for pac in self.pacs_missing: if conf.config['do_package_tracking'] and pac in self.pacs_unvers: # pac is not under version control but a local file/dir exists msg = f'can\'t add package \'{pac}\': Object already exists' raise oscerr.PackageExists(self.name, pac, msg) if not (expand_link or unexpand_link): sinfo = sinfos.get(pac) if sinfo is None: # should never happen... continue linked = sinfo.find('linked') if linked is not None and linked.get('project') == self.name: # hmm what about a linkerror (sinfo.get('lsrcmd5') is None)? # Should we skip the package as well or should we it out? 
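# Project.wc_check() above only verifies that the required store files exist
# ('_project' and '_apiurl', plus '_packages' when package tracking is enabled)
# and returns the names that are missing.  Below is a standalone sketch of that
# check; it assumes the store directory is named '.osc', which is not spelled
# out in this excerpt.
import os


def missing_store_files(wc_dir, package_tracking=True):
    """Return the required store files absent from wc_dir/.osc; an empty list
    means the project working copy looks consistent."""
    required = ["_project", "_apiurl"]
    if package_tracking:
        required.append("_packages")
    store_dir = os.path.join(wc_dir, ".osc")
    return [name for name in required
            if not os.path.exists(os.path.join(store_dir, name))]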
# let's skip it for now print(f"Skipping {pac} (link to package {linked.get('package')})") continue print(f'checking out new package {pac}') checkout_package(self.apiurl, self.name, pac, pathname=getTransActPath(os.path.join(self.dir, pac)), prj_obj=self, prj_dir=self.dir, expand_link=expand_link or not unexpand_link, progress_obj=self.progress_obj) def status(self, pac: str): exists = os.path.exists(os.path.join(self.absdir, pac)) st = self.get_state(pac) if st is None and exists: return '?' elif st is None: raise oscerr.OscIOError(None, f'osc: \'{pac}\' is not under version control') elif st in ('A', ' ') and not exists: return '!' elif st == 'D' and not exists: return 'D' else: return st def get_status(self, *exclude_states): res = [] for pac in self.pacs_have: st = self.status(pac) if st not in exclude_states: res.append((st, pac)) if '?' not in exclude_states: res.extend([('?', pac) for pac in self.pacs_unvers]) return res def get_pacobj(self, pac, *pac_args, **pac_kwargs): from ..core import Package try: st = self.status(pac) if st in ('?', '!') or st == 'D' and not os.path.exists(os.path.join(self.dir, pac)): return None return Package(os.path.join(self.dir, pac), *pac_args, **pac_kwargs) except oscerr.OscIOError: return None def set_state(self, pac, state): node = self.get_package_node(pac) if node is None: self.new_package_entry(pac, state) else: node.set('state', state) def get_package_node(self, pac: str): for node in self.pac_root.findall('package'): if pac == node.get('name'): return node return None def del_package_node(self, pac): for node in self.pac_root.findall('package'): if pac == node.get('name'): self.pac_root.remove(node) def get_state(self, pac: str): node = self.get_package_node(pac) if node is not None: return node.get('state') else: return None def info(self): from ..core import project_info_templ from ..core import makeurl source_url = makeurl(self.apiurl, ['source', self.name]) r = project_info_templ % (self.name, self.absdir, self.apiurl, source_url) return r def new_package_entry(self, name, state): ET.SubElement(self.pac_root, 'package', name=name, state=state) def read_packages(self): """ Returns an ``xml.etree.ElementTree`` object representing the parsed contents of the project's ``.osc/_packages`` XML file. """ from ..core import Package from ..core import meta_get_packagelist global store packages_file = os.path.join(self.absdir, store, '_packages') if os.path.isfile(packages_file) and os.path.getsize(packages_file): try: result = xml_parse(packages_file) except: msg = f'Cannot read package file \'{packages_file}\'. ' msg += 'You can try to remove it and then run osc repairwc.' 
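# The surrounding code (new_package_entry(), get_package_node(), read_packages())
# keeps per-package tracking state as <package name=... state=.../> children of
# the store's "_packages" file.  Below is a small sketch that lists those
# (name, state) pairs; the root tag and the sample document are illustrative,
# only the <package> children shown in this excerpt are relied upon.
import xml.etree.ElementTree as ET


def tracked_packages(packages_xml):
    """Yield (name, state) pairs from a '_packages'-style document: the same
    elements new_package_entry() appends and get_package_node() looks up."""
    root = ET.fromstring(packages_xml)
    for node in root.findall("package"):
        yield node.get("name"), node.get("state")


if __name__ == "__main__":
    demo = ('<project name="home:user">'
            '<package name="osc" state=" "/>'
            '<package name="new-pkg" state="A"/>'
            '</project>')
    print(list(tracked_packages(demo)))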
raise oscerr.OscIOError(None, msg) return result else: # scan project for existing packages and migrate them cur_pacs = [] for data in os.listdir(self.dir): pac_dir = os.path.join(self.absdir, data) # we cannot use self.pacs_available because we cannot guarantee that the package list # was fetched from the server if data in meta_get_packagelist(self.apiurl, self.name) and is_package_dir(pac_dir) \ and Package(pac_dir).name == data: cur_pacs.append(ET.Element('package', name=data, state=' ')) store_write_initial_packages(self.absdir, self.name, cur_pacs) return xml_parse(os.path.join(self.absdir, store, '_packages')) def write_packages(self): from ..core import ET_ENCODING from ..core import xmlindent xmlindent(self.pac_root) store_write_string(self.absdir, '_packages', ET.tostring(self.pac_root, encoding=ET_ENCODING)) def addPackage(self, pac): for i in conf.config['exclude_glob']: if fnmatch.fnmatch(pac, i): msg = f'invalid package name: \'{pac}\' (see \'exclude_glob\' config option)' raise oscerr.OscIOError(None, msg) state = self.get_state(pac) if state is None or state == 'D': self.new_package_entry(pac, 'A') self.write_packages() # sometimes the new pac doesn't exist in the list because # it would take too much time to update all data structs regularly if pac in self.pacs_unvers: self.pacs_unvers.remove(pac) else: raise oscerr.PackageExists(self.name, pac, f'package \'{pac}\' is already under version control') def delPackage(self, pac, force=False): from ..core import delete_dir from ..core import getTransActPath from ..core import statfrmt state = self.get_state(pac.name) can_delete = True if state == ' ' or state == 'D': del_files = [] for filename in pac.filenamelist + pac.filenamelist_unvers: filestate = pac.status(filename) if filestate == 'M' or filestate == 'C' or \ filestate == 'A' or filestate == '?': can_delete = False else: del_files.append(filename) if can_delete or force: for filename in del_files: pac.delete_localfile(filename) if pac.status(filename) != '?': # this is not really necessary pac.put_on_deletelist(filename) print(statfrmt('D', getTransActPath(os.path.join(pac.dir, filename)))) print(statfrmt('D', getTransActPath(os.path.join(pac.dir, os.pardir, pac.name)))) pac.write_deletelist() self.set_state(pac.name, 'D') self.write_packages() else: print(f'package \'{pac.name}\' has local modifications (see osc st for details)') elif state == 'A': if force: delete_dir(pac.absdir) self.del_package_node(pac.name) self.write_packages() print(statfrmt('D', pac.name)) else: print(f'package \'{pac.name}\' has local modifications (see osc st for details)') elif state is None: print('package is not under version control') else: print('unsupported state') def update(self, pacs=(), expand_link=False, unexpand_link=False, service_files=False): from ..core import Package from ..core import checkout_package from ..core import get_project_sourceinfo from ..core import getTransActPath from ..core import show_upstream_xsrcmd5 if pacs: for pac in pacs: Package(os.path.join(self.dir, pac), progress_obj=self.progress_obj).update() else: # we need to make sure that the _packages file will be written (even if an exception # occurs) try: # update complete project # packages which no longer exists upstream upstream_del = [pac for pac in self.pacs_have if pac not in self.pacs_available and self.get_state(pac) != 'A'] sinfo_pacs = [pac for pac in self.pacs_have if self.get_state(pac) in (' ', 'D') and pac not in self.pacs_broken] sinfo_pacs.extend(self.pacs_missing) sinfos = 
get_project_sourceinfo(self.apiurl, self.name, True, *sinfo_pacs) for pac in upstream_del: if self.status(pac) != '!': p = Package(os.path.join(self.dir, pac)) self.delPackage(p, force=True) delete_storedir(p.storedir) try: os.rmdir(pac) except: pass self.pac_root.remove(self.get_package_node(pac)) self.pacs_have.remove(pac) for pac in self.pacs_have: state = self.get_state(pac) if pac in self.pacs_broken: if self.get_state(pac) != 'A': checkout_package(self.apiurl, self.name, pac, pathname=getTransActPath(os.path.join(self.dir, pac)), prj_obj=self, prj_dir=self.dir, expand_link=not unexpand_link, progress_obj=self.progress_obj) elif state == ' ': # do a simple update p = Package(os.path.join(self.dir, pac), progress_obj=self.progress_obj) rev = None needs_update = True if p.scm_url is not None: # git managed. print("Skipping git managed package ", pac) continue elif expand_link and p.islink() and not p.isexpanded(): if p.haslinkerror(): try: rev = show_upstream_xsrcmd5(p.apiurl, p.prjname, p.name, revision=p.rev) except: rev = show_upstream_xsrcmd5(p.apiurl, p.prjname, p.name, revision=p.rev, linkrev="base") p.mark_frozen() else: rev = p.linkinfo.xsrcmd5 print('Expanding to rev', rev) elif unexpand_link and p.islink() and p.isexpanded(): rev = p.linkinfo.lsrcmd5 print('Unexpanding to rev', rev) elif p.islink() and p.isexpanded(): needs_update = p.update_needed(sinfos[p.name]) if needs_update: rev = p.latest_rev() elif p.hasserviceinfo() and p.serviceinfo.isexpanded() and not service_files: # FIXME: currently, do_update does not propagate the --server-side-source-service-files # option to this method. Consequence: an expanded service is always unexpanded during # an update (TODO: discuss if this is a reasonable behavior (at least this the default # behavior for a while)) needs_update = True else: needs_update = p.update_needed(sinfos[p.name]) print(f'Updating {p.name}') if needs_update: p.update(rev, service_files) else: print(f'At revision {p.rev}.') if unexpand_link: p.unmark_frozen() elif state == 'D': # pac exists (the non-existent pac case was handled in the first if block) p = Package(os.path.join(self.dir, pac), progress_obj=self.progress_obj) if p.update_needed(sinfos[p.name]): p.update() elif state == 'A' and pac in self.pacs_available: # file/dir called pac already exists and is under version control msg = f'can\'t add package \'{pac}\': Object already exists' raise oscerr.PackageExists(self.name, pac, msg) elif state == 'A': # do nothing pass else: print(f'unexpected state.. 
package \'{pac}\'') self.checkout_missing_pacs(sinfos, expand_link, unexpand_link) finally: self.write_packages() def commit(self, pacs=(), msg='', files=None, verbose=False, skip_local_service_run=False, can_branch=False, force=False): from ..core import Package from ..core import os_path_samefile files = files or {} if pacs: try: for pac in pacs: todo = [] if pac in files: todo = files[pac] state = self.get_state(pac) if state == 'A': self.commitNewPackage(pac, msg, todo, verbose=verbose, skip_local_service_run=skip_local_service_run) elif state == 'D': self.commitDelPackage(pac, force=force) elif state == ' ': # display the correct dir when sending the changes if os_path_samefile(os.path.join(self.dir, pac), os.getcwd()): p = Package('.') else: p = Package(os.path.join(self.dir, pac)) p.todo = todo p.commit(msg, verbose=verbose, skip_local_service_run=skip_local_service_run, can_branch=can_branch, force=force) elif pac in self.pacs_unvers and not is_package_dir(os.path.join(self.dir, pac)): print(f'osc: \'{pac}\' is not under version control') elif pac in self.pacs_broken or not os.path.exists(os.path.join(self.dir, pac)): print(f'osc: \'{pac}\' package not found') elif state is None: self.commitExtPackage(pac, msg, todo, verbose=verbose, skip_local_service_run=skip_local_service_run) finally: self.write_packages() else: # if we have packages marked as '!' we cannot commit for pac in self.pacs_broken: if self.get_state(pac) != 'D': msg = f'commit failed: package \'{pac}\' is missing' raise oscerr.PackageMissing(self.name, pac, msg) try: for pac in self.pacs_have: state = self.get_state(pac) if state == ' ': # do a simple commit Package(os.path.join(self.dir, pac)).commit(msg, verbose=verbose, skip_local_service_run=skip_local_service_run) elif state == 'D': self.commitDelPackage(pac, force=force) elif state == 'A': self.commitNewPackage(pac, msg, verbose=verbose, skip_local_service_run=skip_local_service_run) finally: self.write_packages() def commitNewPackage(self, pac, msg='', files=None, verbose=False, skip_local_service_run=False): """creates and commits a new package if it does not exist on the server""" from ..core import Package from ..core import edit_meta from ..core import os_path_samefile from ..core import statfrmt files = files or [] if pac in self.pacs_available: print(f'package \'{pac}\' already exists') else: user = conf.get_apiurl_usr(self.apiurl) edit_meta(metatype='pkg', path_args=(self.name, pac), template_args=({ 'name': pac, 'user': user}), apiurl=self.apiurl) # display the correct dir when sending the changes olddir = os.getcwd() if os_path_samefile(os.path.join(self.dir, pac), os.curdir): os.chdir(os.pardir) p = Package(pac) else: p = Package(os.path.join(self.dir, pac)) p.todo = files print(statfrmt('Sending', os.path.normpath(p.dir))) p.commit(msg=msg, verbose=verbose, skip_local_service_run=skip_local_service_run) self.set_state(pac, ' ') os.chdir(olddir) def commitDelPackage(self, pac, force=False): """deletes a package on the server and in the working copy""" from ..core import Package from ..core import delete_package from ..core import getTransActPath from ..core import os_path_samefile from ..core import statfrmt try: # display the correct dir when sending the changes if os_path_samefile(os.path.join(self.dir, pac), os.curdir): pac_dir = pac else: pac_dir = os.path.join(self.dir, pac) p = Package(os.path.join(self.dir, pac)) # print statfrmt('Deleting', os.path.normpath(os.path.join(p.dir, os.pardir, pac))) delete_storedir(p.storedir) try: os.rmdir(p.dir) 
except: pass except OSError: pac_dir = os.path.join(self.dir, pac) except (oscerr.NoWorkingCopy, oscerr.WorkingCopyOutdated, oscerr.PackageError): pass # print statfrmt('Deleting', getTransActPath(os.path.join(self.dir, pac))) print(statfrmt('Deleting', getTransActPath(pac_dir))) delete_package(self.apiurl, self.name, pac, force=force) self.del_package_node(pac) def commitExtPackage(self, pac, msg, files=None, verbose=False, skip_local_service_run=False): """commits a package from an external project""" from ..core import Package from ..core import edit_meta from ..core import meta_exists from ..core import os_path_samefile files = files or [] if os_path_samefile(os.path.join(self.dir, pac), os.getcwd()): pac_path = '.' else: pac_path = os.path.join(self.dir, pac) store = Store(pac_path) project = store_read_project(pac_path) package = store_read_package(pac_path) apiurl = store.apiurl if not meta_exists(metatype='pkg', path_args=(project, package), template_args=None, create_new=False, apiurl=apiurl): user = conf.get_apiurl_usr(self.apiurl) edit_meta(metatype='pkg', path_args=(project, package), template_args=({'name': pac, 'user': user}), apiurl=apiurl) p = Package(pac_path) p.todo = files p.commit(msg=msg, verbose=verbose, skip_local_service_run=skip_local_service_run) def __str__(self): r = [] r.append('*****************************************************') r.append(f'Project {self.name} (dir={self.dir}, absdir={self.absdir})') r.append(f"have pacs:\n{', '.join(self.pacs_have)}") r.append(f"missing pacs:\n{', '.join(self.pacs_missing)}") r.append('*****************************************************') return '\n'.join(r) @staticmethod def init_project( apiurl: str, dir: Path, project, package_tracking=True, getPackageList=True, progress_obj=None, wc_check=True, scm_url=None, ): global store if not os.path.exists(dir): # use makedirs (checkout_no_colon config option might be enabled) os.makedirs(dir) elif not os.path.isdir(dir): raise oscerr.OscIOError(None, f'error: \'{dir}\' is no directory') if os.path.exists(os.path.join(dir, store)): raise oscerr.OscIOError(None, f'error: \'{dir}\' is already an initialized osc working copy') else: os.mkdir(os.path.join(dir, store)) s = Store(dir, check=False) s.write_string("_osclib_version", Store.STORE_VERSION) s.apiurl = apiurl s.project = project if scm_url: s.scmurl = scm_url package_tracking = None if package_tracking: store_write_initial_packages(dir, project, []) return Project(dir, getPackageList, progress_obj, wc_check) osc-1.12.1/osc/obs_scm/serviceinfo.py000066400000000000000000000237331475337502500175020ustar00rootroot00000000000000import hashlib import os import shutil import tempfile import time from typing import Optional from urllib.error import HTTPError from urllib.parse import urlparse from .. import oscerr from .. import output from ..util.xml import ET class Serviceinfo: """Source service content """ def __init__(self): """creates an empty serviceinfo instance""" self.services = [] self.apiurl: Optional[str] = None self.project: Optional[str] = None self.package: Optional[str] = None def read(self, serviceinfo_node, append=False): """read in the source services ```` element passed as elementtree node. 
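# The read() method that follows turns every <service name=...> element and its
# <param name=...>value</param> children into an argv-style list: the service
# name first, then '--<param>' / value pairs.  Below is a standalone sketch of
# that translation; the <services> root tag and the service used in the sample
# are assumptions for illustration, and none of read()'s validation is repeated.
import xml.etree.ElementTree as ET


def services_to_commands(services_xml):
    """Return one command list per <service>, in the shape read() stores under
    data['command']: [name, '--param', 'value', ...]."""
    commands = []
    for service in ET.fromstring(services_xml).findall("service"):
        cmd = [service.get("name")]
        for param in service.findall("param"):
            cmd.append("--" + param.get("name"))
            cmd.append(param.text or "")
        commands.append(cmd)
    return commands


if __name__ == "__main__":
    demo = ('<services>'
            '<service name="download_files">'
            '<param name="enforceupstream">yes</param>'
            '</service>'
            '</services>')
    print(services_to_commands(demo))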
""" def error(msg, xml): from ..core import ET_ENCODING data = f'invalid service format:\n{ET.tostring(xml, encoding=ET_ENCODING)}' raise ValueError(f"{data}\n\n{msg}") if serviceinfo_node is None: return if not append: self.services = [] services = serviceinfo_node.findall('service') for service in services: name = service.get('name') if name is None: error("invalid service definition. Attribute name missing.", service) if len(name) < 3 or '/' in name: error(f"invalid service name: {name}", service) mode = service.get('mode', '') data = {'name': name, 'mode': mode} command = [name] for param in service.findall('param'): option = param.get('name') if option is None: error(f"{name}: a parameter requires a name", service) value = '' if param.text: value = param.text command.append('--' + option) # hmm is this reasonable or do we want to allow real # options (e.g., "--force" (without an argument)) as well? command.append(value) data['command'] = command self.services.append(data) def getProjectGlobalServices(self, apiurl: str, project: str, package: str): from ..core import http_POST from ..core import makeurl from ..util.xml import xml_parse self.apiurl = apiurl # get all project wide services in one file, we don't store it yet u = makeurl(apiurl, ["source", project, package], query={"cmd": "getprojectservices"}) try: f = http_POST(u) root = xml_parse(f).getroot() self.read(root, True) self.project = project self.package = package except HTTPError as e: if e.code == 404 and package != '_project': self.getProjectGlobalServices(apiurl, project, '_project') self.package = package elif e.code != 403 and e.code != 400: raise e def addVerifyFile(self, serviceinfo_node, filename: str): f = open(filename, 'rb') digest = hashlib.sha256(f.read()).hexdigest() f.close() r = serviceinfo_node s = ET.Element("service", name="verify_file") ET.SubElement(s, "param", name="file").text = filename ET.SubElement(s, "param", name="verifier").text = "sha256" ET.SubElement(s, "param", name="checksum").text = digest r.append(s) return r def addDownloadUrl(self, serviceinfo_node, url_string: str): url = urlparse(url_string) protocol = url.scheme host = url.netloc path = url.path r = serviceinfo_node s = ET.Element("service", name="download_url") ET.SubElement(s, "param", name="protocol").text = protocol ET.SubElement(s, "param", name="host").text = host ET.SubElement(s, "param", name="path").text = path r.append(s) return r def addSetVersion(self, serviceinfo_node): r = serviceinfo_node s = ET.Element("service", name="set_version", mode="buildtime") r.append(s) return r def addGitUrl(self, serviceinfo_node, url_string: Optional[str]): r = serviceinfo_node s = ET.Element("service", name="obs_scm") ET.SubElement(s, "param", name="url").text = url_string ET.SubElement(s, "param", name="scm").text = "git" r.append(s) return r def addTarUp(self, serviceinfo_node): r = serviceinfo_node s = ET.Element("service", name="tar", mode="buildtime") r.append(s) return r def addRecompressTar(self, serviceinfo_node): r = serviceinfo_node s = ET.Element("service", name="recompress", mode="buildtime") ET.SubElement(s, "param", name="file").text = "*.tar" ET.SubElement(s, "param", name="compression").text = "xz" r.append(s) return r def execute(self, dir, callmode: Optional[str] = None, singleservice=None, verbose: Optional[bool] = None): old_dir = os.path.join(dir, '.old') # if 2 osc instances are executed at a time one, of them fails on .old file existence # sleep up to 10 seconds until we can create the directory for i in 
reversed(range(10)): try: os.mkdir(old_dir) break except FileExistsError: time.sleep(1) if i == 0: msg = f'"{old_dir}" exists, please remove it' raise oscerr.OscIOError(None, msg) try: result = self._execute(dir, old_dir, callmode, singleservice, verbose) finally: shutil.rmtree(old_dir) return result def _execute( self, dir, old_dir, callmode: Optional[str] = None, singleservice=None, verbose: Optional[bool] = None ): from ..core import get_osc_version from ..core import run_external from ..core import vc_export_env # cleanup existing generated files for filename in os.listdir(dir): if filename.startswith('_service:') or filename.startswith('_service_'): os.rename(os.path.join(dir, filename), os.path.join(old_dir, filename)) allservices = self.services or [] service_names = [s['name'] for s in allservices] if singleservice and singleservice not in service_names: # set array to the manual specified singleservice, if it is not part of _service file data = {'name': singleservice, 'command': [singleservice], 'mode': callmode} allservices = [data] elif singleservice: allservices = [s for s in allservices if s['name'] == singleservice] # set the right called mode or the service would be skipped below for s in allservices: s['mode'] = callmode if not allservices: # short-circuit to avoid a potential http request in vc_export_env # (if there are no services to execute this http request is # useless) return 0 # services can detect that they run via osc this way os.putenv("OSC_VERSION", get_osc_version()) # set environment when using OBS 2.3 or later if self.project is not None: # These need to be kept in sync with bs_service os.putenv("OBS_SERVICE_APIURL", self.apiurl) os.putenv("OBS_SERVICE_PROJECT", self.project) os.putenv("OBS_SERVICE_PACKAGE", self.package) # also export vc env vars (some services (like obs_scm) use them) vc_export_env(self.apiurl) # recreate files ret = 0 for service in allservices: if callmode != "all": if service['mode'] == "buildtime": continue if service['mode'] == "serveronly" and callmode != "local": continue if service['mode'] == "manual" and callmode != "manual": continue if service['mode'] != "manual" and callmode == "manual": continue if service['mode'] == "disabled" and callmode != "disabled": continue if service['mode'] != "disabled" and callmode == "disabled": continue if service['mode'] != "trylocal" and service['mode'] != "localonly" and callmode == "trylocal": continue temp_dir = None try: temp_dir = tempfile.mkdtemp(dir=dir, suffix=f".{service['name']}.service") cmd = service['command'] if not os.path.exists("/usr/lib/obs/service/" + cmd[0]): raise oscerr.PackageNotInstalled(f"obs-service-{cmd[0]}") cmd[0] = "/usr/lib/obs/service/" + cmd[0] cmd = cmd + ["--outdir", temp_dir] output.print_msg(f"Running source_service '{service['name']}' ...", print_to="stdout") output.print_msg("Run source service:", " ".join(cmd), print_to="verbose") r = run_external(*cmd) if r != 0: print("Aborting: service call failed: ", ' '.join(cmd)) # FIXME: addDownloadUrlService calls si.execute after # updating _services. 
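# Illustrative usage sketch (added comment, not part of the original source):
# from the outside, services are normally run through Serviceinfo.execute(),
# which wraps this method, e.g. for a package checkout directory "pkgdir"
# (hypothetical path):
#
#     ret = si.execute("pkgdir", callmode="all")
#     if ret != 0:
#         print("local service run failed")
#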
return r if service['mode'] == "manual" or service['mode'] == "disabled" or service['mode'] == "trylocal" or service['mode'] == "localonly" or callmode == "local" or callmode == "trylocal" or callmode == "all": for filename in os.listdir(temp_dir): os.rename(os.path.join(temp_dir, filename), os.path.join(dir, filename)) else: name = service['name'] for filename in os.listdir(temp_dir): os.rename(os.path.join(temp_dir, filename), os.path.join(dir, "_service:" + name + ":" + filename)) finally: if temp_dir is not None: shutil.rmtree(temp_dir) return 0 osc-1.12.1/osc/obs_scm/store.py000066400000000000000000000466421475337502500163260ustar00rootroot00000000000000""" Store class wraps access to files in the '.osc' directory. It is meant to be used as an implementation detail of Project and Package classes and shouldn't be used in any code outside osc. """ import os from .. import oscerr from .._private import api from ..util.xml import ET from typing import List # __store_version__ is to be incremented when the format of the working copy # "store" changes in an incompatible way. Please add any needed migration # functionality to check_store_version(). __store_version__ = '2.0' class Store: STORE_DIR = ".osc" STORE_VERSION = "2.0" @classmethod def is_project_dir(cls, path): try: store = cls(path) except oscerr.NoWorkingCopy: return False return store.is_project @classmethod def is_package_dir(cls, path): try: store = cls(path) except oscerr.NoWorkingCopy: return False return store.is_package def __init__(self, path, check=True): self.path = path self.abspath = os.path.abspath(self.path) if check: check_store_version(self.abspath) self.is_project = self.exists("_project") and not self.exists("_package") self.is_package = self.exists("_project") and self.exists("_package") if check and not any([self.is_project, self.is_package]): msg = f"Directory '{self.path}' is not an OBS SCM working copy" raise oscerr.NoWorkingCopy(msg) def __contains__(self, fn): return self.exists(fn) def __iter__(self): path = os.path.join(self.abspath, self.STORE_DIR) for fn in os.listdir(path): full_path = os.path.join(path, fn) if os.path.isdir(full_path): continue yield fn def assert_is_project(self): if not self.is_project: msg = f"Directory '{self.path}' is not an OBS SCM working copy of a project" raise oscerr.NoWorkingCopy(msg) def assert_is_package(self): if not self.is_package: msg = f"Directory '{self.path}' is not an OBS SCM working copy of a package" raise oscerr.NoWorkingCopy(msg) def get_path(self, fn, subdir=None): # sanitize input to ensure that joining path works as expected fn = fn.lstrip("/") if subdir: subdir = subdir.lstrip("/") return os.path.join(self.abspath, self.STORE_DIR, subdir, fn) return os.path.join(self.abspath, self.STORE_DIR, fn) def exists(self, fn, subdir=None): return os.path.exists(self.get_path(fn, subdir=subdir)) def unlink(self, fn, subdir=None): try: os.unlink(self.get_path(fn, subdir=subdir)) except FileNotFoundError: pass def read_file(self, fn, subdir=None): if not self.exists(fn, subdir=subdir): return None with open(self.get_path(fn, subdir=subdir), encoding="utf-8") as f: return f.read() def write_file(self, fn, value, subdir=None): if value is None: self.unlink(fn, subdir=subdir) return try: if subdir: os.makedirs(self.get_path(subdir)) else: os.makedirs(self.get_path("")) except FileExistsError: pass old = self.get_path(fn, subdir=subdir) new = self.get_path(f"{fn}.new", subdir=subdir) try: with open(new, "w", encoding="utf-8") as f: f.write(value) os.rename(new, old) 
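# Added note (not part of the original source): the content is first written to
# "<name>.new" and then renamed over the old path, so readers of the store never
# observe a partially written file; the except branch below only removes the
# temporary ".new" file and re-raises.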
except: if os.path.exists(new): os.unlink(new) raise def read_list(self, fn, subdir=None): if not self.exists(fn, subdir=subdir): return None with open(self.get_path(fn, subdir=subdir), encoding="utf-8") as f: return [line.rstrip("\n") for line in f] def write_list(self, fn, value, subdir=None): if value is None: self.unlink(fn, subdir=subdir) return if not isinstance(value, (list, tuple)): msg = f"The argument `value` should be list, not {type(value).__name__}" raise TypeError(msg) value = "".join((f"{line or ''}\n" for line in value)) self.write_file(fn, value, subdir=subdir) def read_string(self, fn, subdir=None): if not self.exists(fn, subdir=subdir): return None with open(self.get_path(fn, subdir=subdir), encoding="utf-8") as f: return f.readline().strip() def write_string(self, fn, value, subdir=None): if value is None: self.unlink(fn, subdir=subdir) return if isinstance(value, bytes): value = value.decode("utf-8") if not isinstance(value, str): msg = f"The argument `value` should be str, not {type(value).__name__}" raise TypeError(msg) self.write_file(fn, f"{value}\n", subdir=subdir) def read_int(self, fn): if not self.exists(fn): return None result = self.read_string(fn) if not result.isdigit(): return None return int(result) def write_int(self, fn, value, subdir=None): if value is None: self.unlink(fn, subdir=subdir) return if not isinstance(value, int): msg = f"The argument `value` should be int, not {type(value).__name__}" raise TypeError(msg) value = str(value) self.write_string(fn, value, subdir=subdir) def read_xml_node(self, fn, node_name, subdir=None): from ..util.xml import xml_parse path = self.get_path(fn, subdir=subdir) try: tree = xml_parse(path) except SyntaxError as e: msg = f"Unable to parse '{path}': {e}" raise oscerr.NoWorkingCopy(msg) root = tree.getroot() assert root.tag == node_name # TODO: return root? 
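# Illustrative usage sketch (added comment, not part of the original source):
# typical read access through the Store wrapper, assuming "wc" is a package
# working copy created by an osc checkout:
#
#     s = Store("wc")
#     s.assert_is_package()
#     print(s.apiurl, s.project, s.package)
#     to_add = s.read_list("_to_be_added") or []
#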
return tree def write_xml_node(self, fn, node_name, node, subdir=None): path = self.get_path(fn, subdir=subdir) assert node.tag == node_name api.write_xml_node_to_file(node, path) def _sanitize_apiurl(self, value): # apiurl shouldn't end with a slash, strip it so we can use apiurl without modifications # in config['api_host_options'][apiurl] and other places if isinstance(value, str): value = value.strip("/") elif isinstance(value, bytes): value = value.strip(b"/") return value @property def apiurl(self): return self._sanitize_apiurl(self.read_string("_apiurl")) @apiurl.setter def apiurl(self, value): self.write_string("_apiurl", self._sanitize_apiurl(value)) @property def project(self): return self.read_string("_project") @project.setter def project(self, value): self.write_string("_project", value) @property def package(self): return self.read_string("_package") @package.setter def package(self, value): self.write_string("_package", value) @property def scmurl(self): return self.read_string("_scm") @scmurl.setter def scmurl(self, value): return self.write_string("_scm", value) @property def size_limit(self): return self.read_int("_size_limit") @size_limit.setter def size_limit(self, value): return self.write_int("_size_limit", value) @property def to_be_added(self): self.assert_is_package() return self.read_list("_to_be_added") or [] @to_be_added.setter def to_be_added(self, value): self.assert_is_package() return self.write_list("_to_be_added", value) @property def to_be_deleted(self): self.assert_is_package() return self.read_list("_to_be_deleted") or [] @to_be_deleted.setter def to_be_deleted(self, value): self.assert_is_package() return self.write_list("_to_be_deleted", value) @property def in_conflict(self): self.assert_is_package() return self.read_list("_in_conflict") or [] @in_conflict.setter def in_conflict(self, value): self.assert_is_package() return self.write_list("_in_conflict", value) @property def osclib_version(self): return self.read_string("_osclib_version") @property def files(self): from .. 
import core as osc_core self.assert_is_package() if self.exists("_scm"): msg = "Package '{self.path}' is managed via SCM" raise oscerr.NoWorkingCopy(msg) if not self.exists("_files"): msg = "Package '{self.path}' doesn't contain _files metadata" raise oscerr.NoWorkingCopy(msg) result = [] directory_node = self.read_xml_node("_files", "directory").getroot() for entry_node in api.find_nodes(directory_node, "directory", "entry"): result.append(osc_core.File.from_xml_node(entry_node)) return result @files.setter def files(self, value): if not isinstance(value, (list, tuple)): msg = f"The argument `value` should be list, not {type(value).__name__}" raise TypeError(msg) root = ET.Element("directory") for file_obj in sorted(value): file_obj.to_xml_node(root) self.write_xml_node("_files", "directory", root) @property def last_buildroot(self): self.assert_is_package() items = self.read_list("_last_buildroot") if items is None: return items if len(items) != 3: msg = f"Package '{self.path}' contains _last_buildroot metadata that doesn't contain 3 lines: [repo, arch, vm_type]" raise oscerr.NoWorkingCopy(msg) if items[2] in ("", "None"): items[2] = None return items @last_buildroot.setter def last_buildroot(self, value): self.assert_is_package() if len(value) != 3: raise ValueError("A list with exactly 3 items is expected: [repo, arch, vm_type]") self.write_list("_last_buildroot", value) @property def _meta_node(self): if not self.exists("_meta"): return None if self.is_package: root = self.read_xml_node("_meta", "package").getroot() else: root = self.read_xml_node("_meta", "project").getroot() return root def sources_get_path(self, file_name: str) -> str: if "/" in file_name: raise ValueError(f"Plain file name expected: {file_name}") result = os.path.join(self.abspath, self.STORE_DIR, "sources", file_name) os.makedirs(os.path.dirname(result), exist_ok=True) return result def sources_list_files(self) -> List[str]: result = [] invalid = [] topdir = os.path.join(self.abspath, self.STORE_DIR, "sources") if not os.path.isdir(topdir): return [] for fn in os.listdir(topdir): if self.sources_is_file(fn): result.append(fn) else: invalid.append(fn) if invalid: msg = ".osc/sources contains entries other than regular files" raise oscerr.WorkingCopyInconsistent(self.project, self.package, invalid, msg) return result def sources_is_file(self, file_name: str) -> bool: return os.path.isfile(self.sources_get_path(file_name)) def sources_delete_file(self, file_name: str): try: os.unlink(self.sources_get_path(file_name)) except: pass store = '.osc' def check_store_version(dir): global store versionfile = os.path.join(dir, store, '_osclib_version') try: with open(versionfile) as f: v = f.read().strip() except: if is_project_dir(dir): v = '1.0' else: v = '' if v == '': msg = f'Error: "{os.path.abspath(dir)}" is not an osc working copy.' if os.path.exists(os.path.join(dir, '.svn')): msg = msg + '\nTry svn instead of osc.' 
raise oscerr.NoWorkingCopy(msg) if v != __store_version__: migrated = False if v in ['0.2', '0.3', '0.4', '0.5', '0.6', '0.7', '0.8', '0.9', '0.95', '0.96', '0.97', '0.98', '0.99']: # no migration needed, only change metadata version to 1.0 s = Store(dir, check=False) v = "1.0" s.write_string("_osclib_version", v) migrated = True if v == "1.0": store_dir = os.path.join(dir, store) sources_dir = os.path.join(dir, store, "sources") sources_dir_mv = sources_dir if os.path.isfile(sources_dir): # there is a conflict with an existing "sources" file sources_dir_mv = os.path.join(dir, store, "_sources") os.makedirs(sources_dir_mv, exist_ok=True) s = Store(dir, check=False) if s.is_package and not s.scmurl: from .package import Package from .project import Project scm_files = [i.name for i in s.files] for fn in os.listdir(store_dir): old_path = os.path.join(store_dir, fn) new_path = os.path.join(sources_dir_mv, fn) if not os.path.isfile(old_path): continue if fn in Package.REQ_STOREFILES or fn in Package.OPT_STOREFILES: continue if fn.startswith("_") and fn not in scm_files: continue if os.path.isfile(old_path): os.rename(old_path, new_path) if sources_dir != sources_dir_mv: os.rename(sources_dir_mv, sources_dir) v = "2.0" s.write_string("_osclib_version", v) migrated = True if migrated: return msg = f'The osc metadata of your working copy "{dir}"' msg += f'\nhas __store_version__ = {v}, but it should be {__store_version__}' msg += '\nPlease do a fresh checkout or update your client. Sorry about the inconvenience.' raise oscerr.WorkingCopyWrongVersion(msg) def is_project_dir(d): global store return os.path.exists(os.path.join(d, store, '_project')) and not \ os.path.exists(os.path.join(d, store, '_package')) def is_package_dir(d): global store return os.path.exists(os.path.join(d, store, '_project')) and \ os.path.exists(os.path.join(d, store, '_package')) def read_filemeta(dir): from ..util.xml import xml_parse global store msg = f'\'{dir}\' is not a valid working copy.' filesmeta = os.path.join(dir, store, '_files') if not is_package_dir(dir): raise oscerr.NoWorkingCopy(msg) if os.path.isfile(os.path.join(dir, store, '_scm')): raise oscerr.NoWorkingCopy("Is managed via scm") if not os.path.isfile(filesmeta): raise oscerr.NoWorkingCopy(f'{msg} ({filesmeta} does not exist)') try: r = xml_parse(filesmeta) except SyntaxError as e: raise oscerr.NoWorkingCopy(f'{msg}\nWhen parsing .osc/_files, the following error was encountered:\n{e}') return r def store_readlist(dir, name): global store r = [] if os.path.exists(os.path.join(dir, store, name)): with open(os.path.join(dir, store, name)) as f: r = [line.rstrip('\n') for line in f] return r def read_tobeadded(dir): return store_readlist(dir, '_to_be_added') def read_tobedeleted(dir): return store_readlist(dir, '_to_be_deleted') def read_sizelimit(dir): global store r = None fname = os.path.join(dir, store, '_size_limit') if os.path.exists(fname): with open(fname) as f: r = f.readline().strip() if r is None or not r.isdigit(): return None return int(r) def read_inconflict(dir): return store_readlist(dir, '_in_conflict') def store_read_project(dir): global store try: with open(os.path.join(dir, store, '_project')) as f: p = f.readline().strip() except OSError: msg = f'Error: \'{os.path.abspath(dir)}\' is not an osc project dir or working copy' if os.path.exists(os.path.join(dir, '.svn')): msg += '\nTry svn instead of osc.' 
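# Illustrative usage sketch (added comment, not part of the original source):
# these module-level helpers read the same metadata as the Store class, e.g.
# from inside a package checkout:
#
#     if is_package_dir("."):
#         print(store_read_project("."), store_read_package("."))
#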
raise oscerr.NoWorkingCopy(msg) return p def store_read_package(dir): global store try: with open(os.path.join(dir, store, '_package')) as f: p = f.readline().strip() except OSError: msg = f'Error: \'{os.path.abspath(dir)}\' is not an osc package working copy' if os.path.exists(os.path.join(dir, '.svn')): msg += '\nTry svn instead of osc.' raise oscerr.NoWorkingCopy(msg) return p def store_read_scmurl(dir): import warnings warnings.warn( "osc.core.store_read_scmurl() is deprecated. " "You should be using high-level classes such as Store, Project or Package instead.", DeprecationWarning ) return Store(dir).scmurl def store_read_apiurl(dir, defaulturl=True): import warnings warnings.warn( "osc.core.store_read_apiurl() is deprecated. " "You should be using high-level classes such as Store, Project or Package instead.", DeprecationWarning ) return Store(dir).apiurl def store_read_last_buildroot(dir): global store fname = os.path.join(dir, store, '_last_buildroot') if os.path.exists(fname): lines = open(fname).read().splitlines() if len(lines) == 3: return lines return def store_write_string(dir, file, string, subdir=''): from ..core import decode_it global store if subdir and not os.path.isdir(os.path.join(dir, store, subdir)): os.mkdir(os.path.join(dir, store, subdir)) fname = os.path.join(dir, store, subdir, file) try: f = open(fname + '.new', 'w') if not isinstance(string, str): string = decode_it(string) f.write(string) f.close() os.rename(fname + '.new', fname) except: if os.path.exists(fname + '.new'): os.unlink(fname + '.new') raise def store_write_project(dir, project): store_write_string(dir, '_project', project + '\n') def store_write_apiurl(dir, apiurl): import warnings warnings.warn( "osc.core.store_write_apiurl() is deprecated. " "You should be using high-level classes such as Store, Project or Package instead.", DeprecationWarning ) Store(dir).apiurl = apiurl def store_write_last_buildroot(dir, repo, arch, vm_type): store_write_string(dir, '_last_buildroot', repo + '\n' + arch + '\n' + vm_type + '\n') def store_unlink_file(dir, file): global store try: os.unlink(os.path.join(dir, store, file)) except: pass def store_read_file(dir, file): global store try: with open(os.path.join(dir, store, file)) as f: return f.read() except: return None def store_write_initial_packages(dir, project, subelements): global store fname = os.path.join(dir, store, '_packages') root = ET.Element('project', name=project) for elem in subelements: root.append(elem) ET.ElementTree(root).write(fname) def delete_storedir(store_dir): """ This method deletes a store dir. """ from ..core import delete_dir head, tail = os.path.split(store_dir) if tail == '.osc': delete_dir(store_dir) osc-1.12.1/osc/oscerr.py000066400000000000000000000134311475337502500150300ustar00rootroot00000000000000# Copyright (C) 2008 Novell Inc. All rights reserved. # This program is free software; it may be used, copied, modified # and distributed under the terms of the GNU General Public Licence, # either version 2, or (at your option) any later version. 
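# Illustrative usage sketch (added comment, not part of the original source):
# callers usually catch these exceptions by family rather than individually,
# for example:
#
#     from osc import oscerr
#     try:
#         ...  # some working-copy operation
#     except oscerr.NoWorkingCopy as e:
#         print(e)
#     except oscerr.PackageError as e:
#         print(f"{e.prj}/{e.pac}: {e.msg or ''}")
#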
class OscBaseError(Exception): def __init__(self, args=()): super().__init__() self.args = args def __str__(self): return ''.join(self.args) class UserAbort(OscBaseError): """Exception raised when the user requested abortion""" class ConfigError(OscBaseError): """Exception raised when there is an error in the config file""" def __init__(self, msg, fname): super().__init__() self.msg = msg self.file = fname def __str__(self): return f"Error in config file {self.file}\n {self.msg}" class ConfigMissingApiurl(ConfigError): """Exception raised when a apiurl does not exist in the config file""" def __init__(self, msg, fname, url): super().__init__(msg, fname) self.url = url class ConfigMissingCredentialsError(ConfigError): def __init__(self, msg, fname, url): super().__init__(msg, fname) self.url = url class APIError(OscBaseError): """Exception raised when there is an error in the output from the API""" def __init__(self, msg): super().__init__() self.msg = msg def __str__(self): return f"{self.__class__.__name__}: {self.msg}" class NotFoundAPIError(APIError): """ Requested data was not found. """ class NoConfigfile(OscBaseError): """Exception raised when osc's configfile cannot be found""" def __init__(self, fname, msg): super().__init__() self.file = fname self.msg = msg def __str__(self): return f"Config file cannot be found: {self.file}\n {self.msg}" class ExtRuntimeError(OscBaseError): """Exception raised when there is a runtime error of an external tool""" def __init__(self, msg, fname): super().__init__() self.msg = msg self.file = fname class ServiceRuntimeError(OscBaseError): """Exception raised when the execution of a source service failed""" def __init__(self, msg): super().__init__() self.msg = msg class WrongArgs(OscBaseError): """Exception raised by the cli for wrong arguments usage""" class WrongOptions(OscBaseError): """Exception raised by the cli for wrong option usage""" # def __str__(self): # s = 'Sorry, wrong options.' # if self.args: # s += '\n' + self.args # return s class NoWorkingCopy(OscBaseError): """Exception raised when directory is neither a project dir nor a package dir""" class NotMissing(OscBaseError): """Exception raised when link target should not exist, but it does""" class WorkingCopyWrongVersion(OscBaseError): """Exception raised when working copy's .osc/_osclib_version doesn't match""" class WorkingCopyOutdated(OscBaseError): """Exception raised when the working copy is outdated. It takes a tuple with three arguments: path to wc, revision that it has, revision that it should have. """ def __str__(self): return ('Working copy \'%s\' is out of date (rev %s vs rev %s).\n' 'Looks as if you need to update it first.' 
% (self.args[0], self.args[1], self.args[2])) class ProjectError(OscBaseError): """Base class for all Project related exceptions""" def __init__(self, prj, msg=None): super().__init__() self.prj = prj self.msg = msg def __str__(self): result = f"{self.__class__.__name__}: {self.prj}" if self.msg: result += f": {self.msg}" return result class PackageError(OscBaseError): """Base class for all Package related exceptions""" def __init__(self, prj, pac, msg=None): super().__init__() self.prj = prj self.pac = pac self.msg = msg def __str__(self): result = f"{self.__class__.__name__}: {self.prj}/{self.pac}" if self.msg: result += f": {self.msg}" return result class WorkingCopyInconsistent(PackageError): """Exception raised when the working copy is in an inconsistent state""" def __init__(self, prj, pac, dirty_files, msg): super().__init__(prj, pac, msg) self.dirty_files = dirty_files class LinkExpandError(PackageError): """Exception raised when source link expansion fails""" class OscIOError(OscBaseError): def __init__(self, e, msg): super().__init__() self.e = e self.msg = msg class OscValueError(OscBaseError): """ Invalid argument value (of correct type). """ pass class OscInvalidRevision(OscValueError): """ Invalid revision value. """ def __str__(self): return f"Invalid revision value: {''.join(self.args)}" class PackageNotInstalled(OscBaseError): """ Exception raised when a package is not installed on local system """ def __init__(self, pkg): super().__init__((pkg,)) def __str__(self): return f'Package {self.args} is required for this operation' class SignalInterrupt(Exception): """Exception raised on SIGTERM and SIGHUP.""" class PackageExists(PackageError): """ Exception raised when a local object already exists """ class PackageMissing(PackageError): """ Exception raised when a local object doesn't exist """ class PackageFileConflict(PackageError): """ Exception raised when there's a file conflict. Conflict doesn't mean an unsuccessfull merge in this context. """ def __init__(self, prj, pac, file, msg): super().__init__(prj, pac, msg) self.file = file class PackageInternalError(PackageError): pass # vim: sw=4 et osc-1.12.1/osc/oscssl.py000066400000000000000000000154031475337502500150420ustar00rootroot00000000000000import binascii import os import socket import ssl import subprocess import sys import tempfile import typing from cryptography import x509 from cryptography.hazmat.primitives import hashes from cryptography.hazmat.primitives import serialization from urllib3.util.ssl_ import create_urllib3_context from . import oscerr from .util import xdg # based on openssl's include/openssl/x509_vfy.h.in X509_V_ERR_DEPTH_ZERO_SELF_SIGNED_CERT = 18 X509_V_ERR_SELF_SIGNED_CERT_IN_CHAIN = 19 def create_ssl_context(): """ Create a ssl context with disabled weak crypto. Relatively safe defaults are set in urllib3 already, but we restrict crypto even more. 
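    Illustrative sketch (not from the original source), combining this context
    with the TrustedCertStore below; "api.example.com" is a placeholder host::

        ctx = create_ssl_context()
        tcs = TrustedCertStore(ctx, "api.example.com", 443)
        cert = tcs.get_server_certificate()
        tcs.prompt_trust(cert, reason="self signed certificate")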
""" ssl_context = create_urllib3_context() # we consider anything older than TLSv1_2 insecure if sys.version_info[:2] <= (3, 6): # deprecated since py3.7 ssl_context.options |= ssl.OP_NO_TLSv1 ssl_context.options |= ssl.OP_NO_TLSv1_1 else: # raise minimum version if too low if ssl_context.minimum_version < ssl.TLSVersion.TLSv1_2: ssl_context.minimum_version = ssl.TLSVersion.TLSv1_2 return ssl_context class CertVerificationError(oscerr.OscBaseError): def __str__(self): args_str = [str(i) for i in self.args] return "Certificate Verification Error: " + "\n".join(args_str) class TrustedCertStore: def __init__(self, ssl_context, host, port): self.ssl_context = ssl_context self.host = host self.port = port or 443 if not self.host: raise ValueError("Empty `host`") self.dir_path = os.path.expanduser(os.path.join(xdg.XDG_CONFIG_HOME, "osc", "trusted-certs")) if not os.path.isdir(self.dir_path): try: os.makedirs(self.dir_path, mode=0o700) except FileExistsError: pass file_name = f"{self.host}_{self.port}" self.pem_path = os.path.join(self.dir_path, f"{file_name}.pem") if os.path.isfile(self.pem_path): # load permanently trusted certificate that is stored on disk with open(self.pem_path, "rb") as f: self.cert = x509.load_pem_x509_certificate(f.read()) self.ssl_context.load_verify_locations(cafile=self.pem_path) else: self.cert = None def get_server_certificate(self): # The following code throws an exception on self-signed certs, # therefore we need to retrieve the cert differently. # pem = ssl.get_server_certificate((self.host, self.port)) ssl_context = create_ssl_context() ssl_context.check_hostname = False ssl_context.verify_mode = ssl.CERT_NONE sock = ssl_context.wrap_socket(socket.socket(), server_hostname=self.host) sock.connect((self.host, self.port)) der = sock.getpeercert(binary_form=True) pem = ssl.DER_cert_to_PEM_cert(der) cert = x509.load_pem_x509_certificate(pem.encode("utf-8")) return cert def trust_permanently(self, cert): """ Permanently trust the certificate. Store it as a pem file in ~/.config/osc/trusted-certs. """ self.cert = cert data = self.cert.public_bytes(serialization.Encoding.PEM) with open(self.pem_path, "wb") as f: f.write(data) self.ssl_context.load_verify_locations(cafile=self.pem_path) def trust_temporarily(self, cert): """ Temporarily trust the certificate. 
""" self.cert = cert tmp_dir = os.path.expanduser(os.path.join(xdg.XDG_CONFIG_HOME, "osc")) data = self.cert.public_bytes(serialization.Encoding.PEM) with tempfile.NamedTemporaryFile(mode="wb+", dir=tmp_dir, prefix="temp_trusted_cert_") as f: f.write(data) f.flush() self.ssl_context.load_verify_locations(cafile=f.name) @staticmethod def _display_cert(cert): print("Subject:", cert.subject.rfc4514_string()) print("Issuer:", cert.issuer.rfc4514_string()) try: san_ext = cert.extensions.get_extension_for_oid(x509.ExtensionOID.SUBJECT_ALTERNATIVE_NAME) san_ext_value = typing.cast(x509.SubjectAlternativeName, san_ext.value) san_ext_dnsnames = san_ext_value.get_values_for_type(x509.DNSName) except x509.extensions.ExtensionNotFound: san_ext_dnsnames = ["(not available)"] for san in san_ext_dnsnames: print("subjectAltName:", san) print("Valid:", cert.not_valid_before, "->", cert.not_valid_after) print("Fingerprint(MD5):", binascii.hexlify(cert.fingerprint(hashes.MD5())).decode("utf-8")) print("Fingerprint(SHA1):", binascii.hexlify(cert.fingerprint(hashes.SHA1())).decode("utf-8")) def prompt_trust(self, cert, reason): if self.cert: # check if the certificate matches the already trusted certificate for the host and port if cert != self.cert: raise CertVerificationError([ "Remote host identification has changed", "", "WARNING: REMOTE HOST IDENTIFICATION HAS CHANGED!", "IT IS POSSIBLE THAT SOMEONE IS DOING SOMETHING NASTY!", "", f"Offending certificate is at '{self.pem_path}'" ]) else: # since there is no trusted certificate on disk, # let's display the server cert and give user options to trust it print("The server certificate failed verification") print() self._display_cert(cert) print(f"Reason: {reason}") while True: print(""" Would you like to 0 - quit (default) 1 - continue anyways 2 - trust the server certificate permanently 9 - review the server certificate """) print("Enter choice [0129]: ", end="") r = input() if not r or r == "0": raise CertVerificationError(["Untrusted certificate"]) elif r == "1": self.trust_temporarily(cert) return elif r == "2": self.trust_permanently(cert) return elif r == "9": # TODO: avoid calling openssl to convert pem to text pem = cert.public_bytes(encoding=serialization.Encoding.PEM).decode("utf-8") cmd = ["openssl", "x509", "-text"] try: cert_text = subprocess.check_output(cmd, input=pem, encoding="utf-8") print(cert_text) except FileNotFoundError: print("ERROR: Unable to display certificate because the 'openssl' executable is not available", file=sys.stderr) osc-1.12.1/osc/output/000077500000000000000000000000001475337502500145175ustar00rootroot00000000000000osc-1.12.1/osc/output/__init__.py000066400000000000000000000006111475337502500166260ustar00rootroot00000000000000from .key_value_table import KeyValueTable from .input import get_user_input from .output import get_default_pager from .output import pipe_to_pager from .output import print_msg from .output import run_pager from .output import sanitize_text from .output import safe_print from .output import safe_write from .tty import colorize from .widechar import wc_ljust from .widechar import wc_width osc-1.12.1/osc/output/input.py000066400000000000000000000031231475337502500162270ustar00rootroot00000000000000import sys import textwrap from typing import Dict from typing import Optional from .. import oscerr from .tty import colorize def get_user_input(question: str, answers: Dict[str, str], default_answer: Optional[str] = None) -> str: """ Ask user a question and wait for reply. :param question: The question. 
The text gets automatically dedented and stripped. :param answers: A dictionary with answers. Keys are the expected replies and values are their descriptions. :param default_answer: The default answer. Must be ``None`` or match an ``answers`` entry. """ if default_answer and default_answer not in answers: raise ValueError(f"Default answer doesn't match any answer: {default_answer}") question = textwrap.dedent(question) question = question.strip() prompt = [] for key, value in answers.items(): value = f"{colorize(key, 'bold')}){value}" prompt.append(value) prompt_str = " / ".join(prompt) if default_answer: prompt_str += f" (default={colorize(default_answer, 'bold')})" prompt_str += ": " print(question, file=sys.stderr) while True: try: reply = input(prompt_str) except EOFError: # interpret ctrl-d as user abort raise oscerr.UserAbort() # pylint: disable=raise-missing-from if reply in answers: return reply if reply.strip() in answers: return reply.strip() if not reply.strip(): return default_answer print(f"Invalid reply: {colorize(reply, 'bold,red')}", file=sys.stderr) osc-1.12.1/osc/output/key_value_table.py000066400000000000000000000047271475337502500202360ustar00rootroot00000000000000from . import tty from . import widechar class KeyValueTable: class NewLine: pass def __init__(self): self.rows = [] def add(self, key, value, color=None, key_color=None, indent=0): if value is None: lines = [] elif isinstance(value, (list, tuple)): lines = value[:] else: lines = value.splitlines() if not lines: lines = [""] # add the first line with the key self.rows.append((key, lines[0], color, key_color, indent)) # then add the continuation lines without the key for line in lines[1:]: self.rows.append(("", line, color, key_color, 0)) def newline(self): self.rows.append((self.NewLine, None, None, None, 0)) def __str__(self): if not self.rows: return "" col1_width = max([widechar.wc_width(key) + indent for key, _, _, _, indent in self.rows if key != self.NewLine]) result = [] skip = False for row_num in range(len(self.rows)): if skip: skip = False continue key, value, color, key_color, indent = self.rows[row_num] if key == self.NewLine: result.append("") continue next_indent = 0 # fake value if not value and row_num < len(self.rows) - 1: # let's peek if there's a continuation line we could merge instead of the blank value next_key, next_value, next_color, next_key_color, next_indent = self.rows[row_num + 1] if not next_key: value = next_value color = next_color key_color = next_key_color row_num += 1 skip = True line = indent * " " if not value and next_indent > 0: # no value, the key represents a section followed by indented keys -> skip ljust() and " : " separator line += tty.colorize(key, key_color) else: line += tty.colorize(widechar.wc_ljust(key, col1_width - indent), key_color) if not key: # continuation line without a key -> skip " : " separator line += " " else: line += " : " line += tty.colorize(value, color) result.append(line) return "\n".join(result) osc-1.12.1/osc/output/output.py000066400000000000000000000177631475337502500164470ustar00rootroot00000000000000import os import platform import re import shlex import subprocess import sys import tempfile from typing import Dict from typing import List from typing import Optional from typing import TextIO from typing import Union from . 
import tty def print_msg(*args, print_to: Optional[str] = "debug"): """ Print ``*args`` to the ``print_to`` target: - None: print nothing - debug: print() to stderr with "DEBUG:" prefix if config["debug"] is set - verbose: print() to stdout if config["verbose"] or config["debug"] is set - error: print() to stderr with red "ERROR:" prefix - warning: print() to stderr with yellow "WARNING:" prefix - stdout: print() to stdout - stderr: print() to stderr """ from .. import conf if print_to is None: return elif print_to == "debug": # print a debug message to stderr if config["debug"] is set if conf.config["debug"]: print("DEBUG:", *args, file=sys.stderr) elif print_to == "verbose": # print a verbose message to stdout if config["verbose"] or config["debug"] is set if conf.config["verbose"] or conf.config["debug"]: print(*args) elif print_to == "error": print(tty.colorize("ERROR:", "red,bold"), *args, file=sys.stderr) elif print_to == "warning": print(tty.colorize("WARNING:", "yellow,bold"), *args, file=sys.stderr) elif print_to == "stdout": # print the message to stdout print(*args) elif print_to == "stderr": # print the message to stderr print(*args, file=sys.stderr) else: raise ValueError(f"Invalid value of the 'print_to' option: {print_to}") # cached compiled regular expressions; they are created on the first use SANITIZE_TEXT_RE: Optional[Dict] = None def sanitize_text(text: Union[bytes, str]) -> Union[bytes, str]: """ Remove forbidden characters and escape sequences from ``text``. This must be run on lines or the whole text to work correctly. Processing blocks of constant size might lead to splitting escape sequences and leaving garbage characters after sanitizing. """ global SANITIZE_TEXT_RE if not SANITIZE_TEXT_RE: SANITIZE_TEXT_RE = {} # CONTROL CHARACTERS # remove all control characters with the exception of: # 0x09 - horizontal tab (\t) # 0x0A - line feed (\n) # 0x0D - carriage return (\r) # 0x1B - escape - is selectively handled later as part of sanitizing escape sequences regex = r"[\x00-\x08\x0B\x0C\x0E-\x1A\x1C-\x1F]" SANITIZE_TEXT_RE["str_control"] = re.compile(regex) SANITIZE_TEXT_RE["bytes_control"] = re.compile(regex.encode("ascii")) # CSI ESCAPE SEQUENCES # https://en.wikipedia.org/wiki/ANSI_escape_code#CSI_codes # remove all but allowed CSI escape sequences # negative lookahead assertion that allows safe color escape sequences neg_allowed_csi_sequences = r"(?!\[([0-5]|[34][0-7]|;)+m)" # range 0x30–0x3F (OCT \040-\077) (ASCII 0–9:;<=>?); zero or more characters csi_parameter_bytes = r"[\x30-\x3F]*" # range 0x20–0x2F (OCT \040-\057) (ASCII space and !"#$%&'()*+,-./); zero or more characters csi_itermediate_bytes = r"[\x20-\x2F]*" # range 0x40–0x7E (OCT \100-\176) (ASCII @A–Z[\]^_`a–z{|}~); 1 character csi_final_byte = r"[\x40-\x7E]" regex = rf"\033{neg_allowed_csi_sequences}\[{csi_parameter_bytes}{csi_itermediate_bytes}{csi_final_byte}" SANITIZE_TEXT_RE["str_csi_sequences"] = re.compile(regex) SANITIZE_TEXT_RE["bytes_csi_sequences"] = re.compile(regex.encode("ascii")) # FE ESCAPE SEQUENCES # https://en.wikipedia.org/wiki/ANSI_escape_code#Fe_Escape_sequences # remove all Fe escape sequences # range 0x40 to 0x5F (ASCII @A–Z[\]^_); 1 character fe = r"[\x40-x5F]" regex = rf"\033{neg_allowed_csi_sequences}{fe}" SANITIZE_TEXT_RE["str_fe_sequences"] = re.compile(regex) SANITIZE_TEXT_RE["bytes_fe_sequences"] = re.compile(regex.encode("ascii")) # REMAINING ESCAPE CHARACTERS # remove all remaining escape characters that are not followed with the allowed CSI escape sequences regex = 
rf"\033{neg_allowed_csi_sequences}" SANITIZE_TEXT_RE["str_esc"] = re.compile(regex) SANITIZE_TEXT_RE["bytes_esc"] = re.compile(regex.encode("ascii")) if isinstance(text, bytes): text = SANITIZE_TEXT_RE["bytes_control"].sub(b"", text) text = SANITIZE_TEXT_RE["bytes_csi_sequences"].sub(b"", text) text = SANITIZE_TEXT_RE["bytes_fe_sequences"].sub(b"", text) text = SANITIZE_TEXT_RE["bytes_esc"].sub(b"", text) else: text = SANITIZE_TEXT_RE["str_control"].sub("", text) text = SANITIZE_TEXT_RE["str_csi_sequences"].sub("", text) text = SANITIZE_TEXT_RE["str_fe_sequences"].sub("", text) text = SANITIZE_TEXT_RE["str_esc"].sub("", text) return text def safe_print(*args, **kwargs): """ A wrapper to print() that runs sanitize_text() on all arguments. """ args = [sanitize_text(i) for i in args] print(*args, **kwargs) def safe_write(file: TextIO, text: Union[str, bytes], *, add_newline: bool = False): """ Run sanitize_text() on ``text`` and write it to ``file``. :param add_newline: Write a newline after writing the ``text``. """ text = sanitize_text(text) if isinstance(text, bytes): if hasattr(file, "buffer"): file.buffer.write(text) if add_newline: file.buffer.write(os.linesep.encode("utf-8")) else: # file has no "buffer" attribute, let's try to write the bytes directly file.write(text) if add_newline: file.write(os.linesep.encode("utf-8")) else: file.write(text) if add_newline: file.write(os.linesep) def get_default_pager(): from ..core import _get_linux_distro system = platform.system() if system == 'Linux': dist = _get_linux_distro() if dist == 'debian': return 'pager' return 'less' return 'more' def get_pager(): """ Return (pager, env) where ``pager`` is a list with parsed pager command ``env`` is copy of os.environ() with added variables specific to the pager """ env = os.environ.copy() pager = os.getenv("PAGER", default="").strip() pager = pager or get_default_pager() # LESS env is not always set and we need -R to display escape sequences properly less_opts = os.getenv("LESS", default="") if "-R" not in less_opts: less_opts += " -R" env["LESS"] = less_opts return shlex.split(pager), env def run_pager(message: Union[bytes, str], tmp_suffix: str = ""): from ..core import run_external if not message: return if not tty.IS_INTERACTIVE: safe_write(sys.stdout, message) return mode = "w+b" if isinstance(message, bytes) else "w+" with tempfile.NamedTemporaryFile(mode=mode, suffix=tmp_suffix) as tmpfile: safe_write(tmpfile, message) tmpfile.flush() pager, env = get_pager() cmd = pager + [tmpfile.name] run_external(*cmd, env=env) def pipe_to_pager(lines: Union[List[bytes], List[str]], *, add_newlines=False): """ Pipe ``lines`` to the pager. If running in a non-interactive terminal, print the data instead. Add a newline after each line if ``add_newlines`` is ``True``. 
""" if not tty.IS_INTERACTIVE: for line in lines: safe_write(sys.stdout, line, add_newline=add_newlines) return pager, env = get_pager() with subprocess.Popen(pager, stdin=subprocess.PIPE, encoding="utf-8", env=env) as proc: try: for line in lines: safe_write(proc.stdin, line, add_newline=add_newlines) proc.stdin.flush() proc.stdin.close() except BrokenPipeError: pass proc.wait() osc-1.12.1/osc/output/tty.py000066400000000000000000000021521475337502500157110ustar00rootroot00000000000000import os import sys try: IS_INTERACTIVE = os.isatty(sys.stdout.fileno()) except OSError: IS_INTERACTIVE = False ESCAPE_CODES = { "reset": "\033[0m", "bold": "\033[1m", "dim": "\033[2m", "italic": "\033[3m", "underline": "\033[4m", "blink": "\033[5m", "black": "\033[30m", "red": "\033[31m", "green": "\033[32m", "yellow": "\033[33m", "blue": "\033[34m", "magenta": "\033[35m", "cyan": "\033[36m", "white": "\033[37m", "bg_black": "\033[40m", "bg_red": "\033[41m", "bg_green": "\033[42m", "bg_yellow": "\033[43m", "bg_blue": "\033[44m", "bg_magenta": "\033[45m", "bg_cyan": "\033[46m", "bg_white": "\033[47m", } def colorize(text, color): """ Colorize `text` if the `color` is specified and we're running in an interactive terminal. """ if not IS_INTERACTIVE: return text if not color: return text if not text: return text result = "" for i in color.split(","): result += ESCAPE_CODES[i] result += text result += ESCAPE_CODES["reset"] return result osc-1.12.1/osc/output/widechar.py000066400000000000000000000007021475337502500166560ustar00rootroot00000000000000import unicodedata def wc_width(text): result = 0 for char in text: if unicodedata.east_asian_width(char) in ("F", "W"): result += 2 else: result += 1 return result def wc_ljust(text, width, fillchar=" "): text_width = wc_width(text) fill_width = wc_width(fillchar) while text_width + fill_width <= width: text += fillchar text_width += fill_width return text osc-1.12.1/osc/py.typed000066400000000000000000000000001475337502500146440ustar00rootroot00000000000000osc-1.12.1/osc/store.py000066400000000000000000000020331475337502500146630ustar00rootroot00000000000000""" Store class wraps access to files in the '.osc' directory. It is meant to be used as an implementation detail of Project and Package classes and shouldn't be used in any code outside osc. """ import os from xml.etree import ElementTree as ET from . import oscerr from . 
import git_scm from .obs_scm import Store def get_store(path, check=True, print_warnings=False): """ Return a store object that wraps SCM in given `path`: - Store for OBS SCM - GitStore for Git SCM """ # if there are '.osc' and '.git' directories next to each other, '.osc' takes preference store = None try: store = Store(path, check) except oscerr.NoWorkingCopy: pass if not store: try: store = git_scm.GitStore(path, check) if print_warnings: git_scm.warn_experimental() except oscerr.NoWorkingCopy: pass if not store: msg = f"Directory '{path}' is not a working copy" raise oscerr.NoWorkingCopy(msg) return store osc-1.12.1/osc/util/000077500000000000000000000000001475337502500141345ustar00rootroot00000000000000osc-1.12.1/osc/util/__init__.py000066400000000000000000000001171475337502500162440ustar00rootroot00000000000000__all__ = ['ar', 'cpio', 'debquery', 'packagequery', 'rpmquery', 'safewriter'] osc-1.12.1/osc/util/ar.py000066400000000000000000000167711475337502500151240ustar00rootroot00000000000000# Copyright 2009 Marcus Huewe # # This program is free software; you can redistribute it and/or # modify it under the terms of the GNU General Public License version 2 # as published by the Free Software Foundation; # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA import os import re import stat import sys from io import BytesIO from typing import Union class ArError(Exception): """Base class for all ar related errors""" def __init__(self, fn: bytes, msg: str): super().__init__() self.file = fn self.msg = msg def __str__(self): return f"{self.msg}: {self.file.decode('utf-8')}" class ArHdr: """Represents an ar header entry""" def __init__(self, fn: bytes, date: bytes, uid: bytes, gid: bytes, mode: bytes, size: bytes, fmag: bytes, off: bytes): self.file = fn.strip() self.date = date.strip() self.uid = uid.strip() self.gid = gid.strip() if not mode.strip(): # provide a dummy mode for the ext_fn hdr mode = b"0" self.mode = stat.S_IMODE(int(mode, 8)) self.size = int(size) self.fmag = fmag # data section starts at off and ends at off + size self.dataoff = int(off) def __str__(self): return '%16s %d' % (self.file, self.size) class ArFile(BytesIO): """Represents a file which resides in the archive""" def __init__(self, fn, uid, gid, mode, buf): super().__init__(buf) self.name = fn self.uid = uid self.gid = gid self.mode = mode def saveTo(self, dir=None): """ writes file to dir/filename if dir isn't specified the current working dir is used. Additionally it tries to set the owner/group and permissions. 
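        Illustrative sketch (not from the original source): extracting a single
        member from an archive; "foo.a" and "member.txt" are placeholder names::

            ar = Ar("foo.a")
            ar.read()
            member = ar.get_file("member.txt")
            if member is not None:
                member.saveTo("/tmp/extracted")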
""" if self.name.startswith(b"/"): raise ArError(self.name, "Extracting files with absolute paths is not supported for security reasons") if not dir: dir = os.getcwdb() fn = os.path.join(dir, self.name.decode("utf-8")) dir_path, _ = os.path.split(fn) if dir_path: os.makedirs(dir_path, exist_ok=True) with open(fn, 'wb') as f: f.write(self.getvalue()) os.chmod(fn, self.mode) uid = self.uid if uid != os.geteuid() or os.geteuid() != 0: uid = -1 gid = self.gid if gid not in os.getgroups() or os.getegid() != 0: gid = -1 os.chown(fn, uid, gid) return fn def __str__(self): return '%s %s %s %s' % (self.name, self.uid, self.gid, self.mode) class Ar: """ Represents an ar archive (only GNU format is supported). Readonly access. """ hdr_len = 60 hdr_pat = re.compile(b'^(.{16})(.{12})(.{6})(.{6})(.{8})(.{10})(.{2})', re.DOTALL) def __init__(self, fn=None, fh=None): if fn is None and fh is None: raise ValueError('either \'fn\' or \'fh\' must be is not None') if fh is not None: self.__file = fh self.__closefile = False self.filename = fh.name else: # file object: will be closed in __del__() self.__file = None self.__closefile = True self.filename = fn self._init_datastructs() def __del__(self): if self.__file and self.__closefile: self.__file.close() def _init_datastructs(self): self.hdrs = [] self.ext_fnhdr = None def _appendHdr(self, hdr): # GNU uses an internal '//' file to store very long filenames if hdr.file.startswith(b'//'): self.ext_fnhdr = hdr else: self.hdrs.append(hdr) def _fixupFilenames(self): """ support the GNU approach for very long filenames: every filename which exceeds 16 bytes is stored in the data section of a special file ('//') and the filename in the header of this long file specifies the offset in the special file's data section. The end of such a filename is indicated with a trailing '/'. Another special file is the '/' which contains the symbol lookup table. 
""" # read extended header with long file names and then only seek into the right offsets ext_fnhdr_data = None if self.ext_fnhdr: self.__file.seek(self.ext_fnhdr.dataoff, os.SEEK_SET) ext_fnhdr_data = self.__file.read(self.ext_fnhdr.size) for h in self.hdrs: if h.file == b'/': continue if h.file.endswith(b"/"): # regular file name h.file = h.file[:-1] continue if not h.file.startswith(b'/'): continue # long file name assert h.file[0:1] == b"/" assert ext_fnhdr_data is not None start = int(h.file[1:]) end = ext_fnhdr_data.find(b'/', start) if end == -1: raise ArError(b'//', 'invalid data section - trailing slash (off: %d)' % start) h.file = ext_fnhdr_data[start:end] def _get_file(self, hdr): self.__file.seek(hdr.dataoff, os.SEEK_SET) return ArFile(hdr.file, hdr.uid, hdr.gid, hdr.mode, self.__file.read(hdr.size)) def read(self): """reads in the archive.""" if not self.__file: self.__file = open(self.filename, 'rb') else: self.__file.seek(0, os.SEEK_SET) self._init_datastructs() data = self.__file.read(7) if data != b'!': raise ArError(self.filename, 'no ar archive') pos = 8 while len(data) != 0: self.__file.seek(pos, os.SEEK_SET) data = self.__file.read(self.hdr_len) if not data: break pos += self.hdr_len m = self.hdr_pat.search(data) if not m: raise ArError(self.filename, 'unexpected hdr entry') args = m.groups() + (pos, ) hdr = ArHdr(*args) self._appendHdr(hdr) # data blocks are 2 bytes aligned - if they end on an odd # offset ARFMAG[0] will be used for padding (according to the current binutils code) pos += hdr.size + (hdr.size & 1) self._fixupFilenames() def get_file(self, fn: Union[str, bytes]): # accept str for better user experience if isinstance(fn, str): fn = fn.encode("utf-8") for h in self.hdrs: if h.file == fn: return self._get_file(h) return None def __iter__(self): for h in self.hdrs: if h.file == b'/': continue yield self._get_file(h) if __name__ == '__main__': if len(sys.argv) != 2: print('usage: %s ' % sys.argv[0]) sys.exit(1) # a potential user might want to pass a bytes instead of a str # to make sure that the ArError's file attribute is always a # bytes ar = Ar(fn=sys.argv[1]) ar.read() for hdr in ar.hdrs: print(hdr) osc-1.12.1/osc/util/archquery.py000066400000000000000000000160711475337502500165160ustar00rootroot00000000000000import os import re import subprocess import sys from . 
import packagequery class ArchError(packagequery.PackageError): pass class ArchQuery(packagequery.PackageQuery, packagequery.PackageQueryResult): def __init__(self, fh): self.__file = fh self.__path = os.path.abspath(fh.name) self.fields = {} # self.magic = None # self.pkgsuffix = 'pkg.tar.gz' self.pkgsuffix = b'arch' def read(self, all_tags=True, self_provides=True, *extra_tags): # all_tags and *extra_tags are currently ignored f = open(self.__path, 'rb') # self.magic = f.read(5) # if self.magic == '\375\067zXZ': # self.pkgsuffix = 'pkg.tar.xz' fn = open('/dev/null', 'wb') pipe = subprocess.Popen(['tar', '-O', '-xf', self.__path, '.PKGINFO'], stdout=subprocess.PIPE, stderr=fn).stdout for line in pipe.readlines(): line = line.rstrip().split(b' = ', 2) if len(line) == 2: field, value = line[0].decode('ascii'), line[1] self.fields.setdefault(field, []).append(value) if self_provides: prv = b'%s = %s' % (self.name(), self.fields['pkgver'][0]) self.fields.setdefault('provides', []).append(prv) return self def vercmp(self, archq): res = packagequery.cmp(int(self.epoch()), int(archq.epoch())) if res != 0: return res res = ArchQuery.rpmvercmp(self.version(), archq.version()) if res != 0: return res res = ArchQuery.rpmvercmp(self.release(), archq.release()) return res def name(self): return self.fields['pkgname'][0] if 'pkgname' in self.fields else None def version(self): pkgver = self.fields['pkgver'][0] if 'pkgver' in self.fields else None if pkgver is not None: pkgver = re.sub(br'[0-9]+:', b'', pkgver, 1) pkgver = re.sub(br'-[^-]*$', b'', pkgver) return pkgver def release(self): pkgver = self.fields['pkgver'][0] if 'pkgver' in self.fields else None if pkgver is not None: m = re.search(br'-([^-])*$', pkgver) if m: return m.group(1) return None def _epoch(self): pkgver = self.fields.get('pkgver', [b''])[0] if pkgver: m = re.match(br'([0-9])+:', pkgver) if m: return m.group(1) return b'' def epoch(self): epoch = self._epoch() if epoch: return epoch return b'0' def arch(self): return self.fields['arch'][0] if 'arch' in self.fields else None def description(self): return self.fields['pkgdesc'][0] if 'pkgdesc' in self.fields else None def path(self): return self.__path def provides(self): return self.fields['provides'] if 'provides' in self.fields else [] def requires(self): return self.fields['depend'] if 'depend' in self.fields else [] def conflicts(self): return self.fields['conflict'] if 'conflict' in self.fields else [] def obsoletes(self): return self.fields['replaces'] if 'replaces' in self.fields else [] def recommends(self): # a .PKGINFO has no notion of "recommends" return [] def suggests(self): # libsolv treats an optdepend as a "suggests", hence we do the same if 'optdepend' not in self.fields: return [] return [re.sub(b':.*', b'', entry) for entry in self.fields['optdepend']] def supplements(self): # a .PKGINFO has no notion of "recommends" return [] def enhances(self): # a .PKGINFO has no notion of "enhances" return [] def canonname(self): name = self.name() if name is None: raise ArchError(self.path(), 'package has no name') version = self.version() if version is None: raise ArchError(self.path(), 'package has no version') arch = self.arch() if arch is None: raise ArchError(self.path(), 'package has no arch') return ArchQuery.filename(name, self._epoch(), version, self.release(), arch) def gettag(self, tag): # implement me, if needed return None @staticmethod def query(filename, all_tags=False, *extra_tags): f = open(filename, 'rb') archq = ArchQuery(f) archq.read(all_tags, 
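# Illustrative usage sketch (added comment, not part of the original source);
# the package path below is a placeholder:
#
#     q = ArchQuery.query("foo-1.0-1-x86_64.pkg.tar.zst")
#     print(q.name(), q.version(), q.release(), q.arch())
#     ArchQuery.rpmvercmp(b"1.0", b"1.0.1")   # -> -1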
*extra_tags) f.close() return archq @staticmethod def rpmvercmp(ver1, ver2): """ implementation of RPM's version comparison algorithm (as described in lib/rpmvercmp.c) """ if ver1 == ver2: return 0 elif ver1 is None: return -1 elif ver2 is None: return 1 res = 0 while res == 0: # remove all leading non alphanumeric chars ver1 = re.sub(b'^[^a-zA-Z0-9]*', b'', ver1) ver2 = re.sub(b'^[^a-zA-Z0-9]*', b'', ver2) if not (len(ver1) and len(ver2)): break # check if we have a digits segment mo1 = re.match(br'(\d+)', ver1) mo2 = re.match(br'(\d+)', ver2) numeric = True if mo1 is None: mo1 = re.match(b'([a-zA-Z]+)', ver1) mo2 = re.match(b'([a-zA-Z]+)', ver2) numeric = False # check for different types: alpha and numeric if mo2 is None: if numeric: return 1 return -1 seg1 = mo1.group(0) ver1 = ver1[mo1.end(0):] seg2 = mo2.group(1) ver2 = ver2[mo2.end(1):] if numeric: # remove leading zeros seg1 = re.sub(b'^0+', b'', seg1) seg2 = re.sub(b'^0+', b'', seg2) # longer digit segment wins - if both have the same length # a simple ascii compare decides res = len(seg1) - len(seg2) or packagequery.cmp(seg1, seg2) else: res = packagequery.cmp(seg1, seg2) if res > 0: return 1 elif res < 0: return -1 return packagequery.cmp(ver1, ver2) @staticmethod def filename(name, epoch, version, release, arch): if epoch: if release: return b'%s-%s:%s-%s-%s.arch' % (name, epoch, version, release, arch) else: return b'%s-%s:%s-%s.arch' % (name, epoch, version, arch) if release: return b'%s-%s-%s-%s.arch' % (name, version, release, arch) else: return b'%s-%s-%s.arch' % (name, version, arch) if __name__ == '__main__': archq = ArchQuery.query(sys.argv[1]) print(archq.name(), archq.version(), archq.release(), archq.arch()) try: print(archq.canonname()) except ArchError as e: print(e.msg) print(archq.description()) print('##########') print(b'\n'.join(archq.provides())) print('##########') print(b'\n'.join(archq.requires())) osc-1.12.1/osc/util/cpio.py000066400000000000000000000210311475337502500154350ustar00rootroot00000000000000# Copyright 2009 Marcus Huewe # # This program is free software; you can redistribute it and/or # modify it under the terms of the GNU General Public License version 2 # as published by the Free Software Foundation; # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA import os import stat import struct import sys # format implementation is based on src/copyin.c and src/util.c (see cpio sources) class CpioError(Exception): """base class for all cpio related errors""" def __init__(self, fn, msg): super().__init__() self.file = fn self.msg = msg def __str__(self): return '%s: %s' % (self.file, self.msg) class CpioHdr: """ Represents a cpio header ("New" portable format and CRC format). """ def __init__(self, mgc, ino, mode, uid, gid, nlink, mtime, filesize, dev_maj, dev_min, rdev_maj, rdev_min, namesize, checksum, off=-1, filename=b''): """ All passed parameters are hexadecimal strings (not NUL terminated) except off and filename. They will be converted into normal ints. 
""" self.ino = ino self.mode = mode self.uid = uid self.gid = gid self.nlink = nlink self.mtime = mtime # 0 indicates FIFO or dir self.filesize = filesize self.dev_maj = dev_maj self.dev_min = dev_min # only needed for special block/char files self.rdev_maj = rdev_maj self.rdev_min = rdev_min # length of filename (inluding terminating NUL) self.namesize = namesize # != 0 indicates CRC format (which we do not support atm) self.checksum = checksum for k, v in self.__dict__.items(): self.__dict__[k] = int(v, 16) self.filename = filename # data starts at dataoff and ends at dataoff+filesize self.dataoff = off def __str__(self): return "%s %s %s %s" % (self.filename, self.filesize, self.namesize, self.dataoff) class CpioRead: """ Represents a cpio archive. Supported formats: * ascii SVR4 no CRC also called "new_ascii" """ # supported formats - use name -> mgc mapping to increase readabilty sfmt = { 'newascii': b'070701', } # header format hdr_fmt = '6s8s8s8s8s8s8s8s8s8s8s8s8s8s' hdr_len = 110 def __init__(self, filename): self.filename = filename self.format = -1 self.__file = None self._init_datastructs() def __del__(self): if self.__file: self.__file.close() def __iter__(self): yield from self.hdrs def _init_datastructs(self): self.hdrs = [] def _calc_padding(self, off): """ skip some bytes after a header or a file. based on 'static void tape_skip_padding()' in copyin.c. """ if self._is_format('newascii'): return (4 - (off % 4)) % 4 def _is_format(self, type): return self.format == self.sfmt[type] def _copyin_file(self, hdr, dest, fn): """saves file to disk""" # TODO: investigate links (e.g. symbolic links are working) # check if we have a regular file if not stat.S_ISREG(stat.S_IFMT(hdr.mode)): msg = '\'%s\' is no regular file - only regular files are supported atm' % hdr.filename raise NotImplementedError(msg) self.__file.seek(hdr.dataoff, os.SEEK_SET) if fn.startswith(b"/"): raise CpioError(fn, "Extracting files with absolute paths is not supported for security reasons") fn = os.path.join(dest, fn) dir_path, _ = os.path.split(fn) if dir_path: os.makedirs(dir_path, exist_ok=True) with open(fn, 'wb') as f: f.write(self.__file.read(hdr.filesize)) os.chmod(fn, hdr.mode) uid = hdr.uid if uid != os.geteuid() or os.geteuid() != 1: uid = -1 gid = hdr.gid if gid not in os.getgroups() or os.getegid() != -1: gid = -1 os.chown(fn, uid, gid) def _get_hdr(self, fn): for h in self.hdrs: if h.filename == fn: return h return None def read(self): if not self.__file: self.__file = open(self.filename, 'rb') else: self.__file.seek(0, os.SEEK_SET) self._init_datastructs() data = self.__file.read(6) self.format = data if self.format not in self.sfmt.values(): raise CpioError(self.filename, '\'%s\' is not a supported cpio format' % self.format) pos = 0 while len(data) != 0: self.__file.seek(pos, os.SEEK_SET) data = self.__file.read(self.hdr_len) if not data: break pos += self.hdr_len data = struct.unpack(self.hdr_fmt, data) hdr = CpioHdr(*data) hdr.filename = self.__file.read(hdr.namesize - 1) if hdr.filename == b'TRAILER!!!': break pos += hdr.namesize if self._is_format('newascii'): pos += self._calc_padding(hdr.namesize + 110) hdr.dataoff = pos self.hdrs.append(hdr) pos += hdr.filesize + self._calc_padding(hdr.filesize) def copyin_file(self, filename, dest=None, new_fn=None): """ copies filename to dest. If dest is None the file will be stored in $PWD/filename. If dest points to a dir the file will be stored in dest/filename. In case new_fn is specified the file will be stored as new_fn. 
""" # accept str for better user experience if isinstance(filename, str): filename = filename.encode("utf-8") if isinstance(dest, str): dest = dest.encode("utf-8") if isinstance(new_fn, str): new_fn = new_fn.encode("utf-8") hdr = self._get_hdr(filename) if not hdr: raise CpioError(filename, '\'%s\' does not exist in archive' % filename) dest = dest or os.getcwdb() fn = new_fn or filename self._copyin_file(hdr, dest, fn) return os.path.join(dest, fn).decode("utf-8") def copyin(self, dest=None): """ extracts the cpio archive to dest. If dest is None $PWD will be used. """ dest = dest or os.getcwdb() for h in self.hdrs: self._copyin_file(h, dest, h.filename) class CpioWrite: """cpio archive small files in memory, using new style portable header format""" def __init__(self): self.cpio = bytearray() def add(self, name=None, content=None, perms=0x1a4, type=0x8000): namesize = len(name) + 1 if namesize % 2: name += b'\0' filesize = len(content) mode = perms | type c = bytearray() c.extend(b'070701') # magic c.extend(b'%08X' % 0) # inode c.extend(b'%08X' % mode) # mode c.extend(b'%08X' % 0) # uid c.extend(b'%08X' % 0) # gid c.extend(b'%08X' % 0) # nlink c.extend(b'%08X' % 0) # mtime c.extend(b'%08X' % filesize) c.extend(b'%08X' % 0) # major c.extend(b'%08X' % 0) # minor c.extend(b'%08X' % 0) # rmajor c.extend(b'%08X' % 0) # rminor c.extend(b'%08X' % namesize) c.extend(b'%08X' % 0) # checksum c.extend(name + b'\0') c.extend(b'\0' * (len(c) % 4)) c.extend(content) if len(c) % 4: c.extend(b'\0' * (4 - len(c) % 4)) self.cpio.extend(c) def add_padding(self): if len(self.cpio) % 512: self.cpio.extend(b'\0' * (512 - len(self.cpio) % 512)) def get(self): self.add(b'TRAILER!!!', b'') self.add_padding() return bytes(self.cpio) if __name__ == '__main__': if len(sys.argv) != 2: print('usage: %s /path/to/file.cpio' % sys.argv[0]) sys.exit(1) # a potential user might want to pass a bytes instead of a str # to make sure that the CpioError's file attribute is always a # bytes cpio = CpioRead(sys.argv[1]) cpio.read() for hdr in cpio: print(hdr) osc-1.12.1/osc/util/debquery.py000066400000000000000000000220521475337502500163270ustar00rootroot00000000000000import itertools import os import re import sys import tarfile from io import BytesIO from . import ar from . 
import packagequery HAVE_LZMA = True try: import lzma except ImportError: HAVE_LZMA = False HAVE_ZSTD = True try: # Note: zstd is not supporting stream compression types import zstandard except ImportError: HAVE_ZSTD = False class DebError(packagequery.PackageError): pass class DebQuery(packagequery.PackageQuery, packagequery.PackageQueryResult): default_tags = ( b'package', b'version', b'release', b'epoch', b'architecture', b'description', b'provides', b'depends', b'pre_depends', b'conflicts', b'breaks' ) def __init__(self, fh): self._file = fh self._path = os.path.abspath(fh.name) self.filename_suffix = 'deb' self.fields = {} def read(self, all_tags=False, self_provides=True, *extra_tags): arfile = ar.Ar(fh=self._file) arfile.read() debbin = arfile.get_file(b'debian-binary') if debbin is None: raise DebError(self._path, 'no debian binary') if debbin.read() != b'2.0\n': raise DebError(self._path, 'invalid debian binary format') for open_func in [self._open_tar_gz, self._open_tar_xz, self._open_tar_zst, self._open_tar]: tar = open_func(arfile) if tar is not None: break if tar is None: raise DebError(self._path, 'missing control.tar') try: name = './control' control = tar.extractfile(name) except KeyError: raise DebError(self._path, 'missing \'control\' file in control.tar') self._parse_control(control, all_tags, self_provides, *extra_tags) return self def _open_tar(self, arfile): control = arfile.get_file(b'control.tar') if not control: return None return tarfile.open(fileobj=control) def _open_tar_gz(self, arfile): control = arfile.get_file(b'control.tar.gz') if not control: return None return tarfile.open(fileobj=control) def _open_tar_xz(self, arfile): control = arfile.get_file(b'control.tar.xz') if not control: return None if not HAVE_LZMA: raise DebError(self._path, 'can\'t open control.tar.xz without python-lzma') decompressed = lzma.decompress(control.read()) return tarfile.open(fileobj=BytesIO(decompressed)) def _open_tar_zst(self, arfile): control = arfile.get_file(b'control.tar.zst') if not control: return None if not HAVE_ZSTD: raise DebError(self._path, 'can\'t open control.tar.zst without python-zstandard') with zstandard.ZstdDecompressor().stream_reader(BytesIO(control.read())) as reader: decompressed = reader.read() return tarfile.open(fileobj=BytesIO(decompressed)) def _parse_control(self, control, all_tags=False, self_provides=True, *extra_tags): data = control.readline().strip() while data: field, val = re.split(br':\s*', data.strip(), 1) data = control.readline() while data and re.match(br'\s+', data): val += b'\n' + data.strip() data = control.readline().rstrip() field = field.replace(b'-', b'_').lower() if field in self.default_tags + extra_tags or all_tags: # a hyphen is not allowed in dict keys self.fields[field] = val versrel = self.fields[b'version'].rsplit(b'-', 1) if len(versrel) == 2: self.fields[b'version'] = versrel[0] self.fields[b'release'] = versrel[1] else: self.fields[b'release'] = None verep = self.fields[b'version'].split(b':', 1) if len(verep) == 2: self.fields[b'epoch'] = verep[0] self.fields[b'version'] = verep[1] else: self.fields[b'epoch'] = b'0' self.fields[b'provides'] = self._split_field_value(b'provides') self.fields[b'depends'] = self._split_field_value(b'depends') self.fields[b'pre_depends'] = self._split_field_value(b'pre_depends') self.fields[b'conflicts'] = self._split_field_value(b'conflicts') self.fields[b'breaks'] = self._split_field_value(b'breaks') self.fields[b'recommends'] = self._split_field_value(b'recommends') 
self.fields[b'suggests'] = self._split_field_value(b'suggests') self.fields[b'enhances'] = self._split_field_value(b'enhances') if self_provides: # add self provides entry self.fields[b'provides'].append(b'%s (= %s)' % (self.name(), b'-'.join(versrel))) def _split_field_value(self, field, delimeter=br',\s*'): return [i.strip() for i in re.split(delimeter, self.fields.get(field, b'')) if i] def vercmp(self, debq): res = packagequery.cmp(int(self.epoch()), int(debq.epoch())) if res != 0: return res res = DebQuery.debvercmp(self.version(), debq.version()) if res != 0: return res res = DebQuery.debvercmp(self.release(), debq.release()) return res def name(self): return self.fields[b'package'] def version(self): return self.fields[b'version'] def release(self): return self.fields[b'release'] def epoch(self): return self.fields[b'epoch'] def arch(self): return self.fields[b'architecture'] def description(self): return self.fields[b'description'] def path(self): return self._path def provides(self): return self.fields[b'provides'] def requires(self): return self.fields[b'depends'] + self.fields[b'pre_depends'] def conflicts(self): return self.fields[b'conflicts'] + self.fields[b'breaks'] def obsoletes(self): return [] def recommends(self): return self.fields[b'recommends'] def suggests(self): return self.fields[b'suggests'] def supplements(self): # a control file has no notion of "supplements" return [] def enhances(self): return self.fields[b'enhances'] def gettag(self, num): return self.fields.get(num, None) def canonname(self): return DebQuery.filename(self.name(), self.epoch(), self.version(), self.release(), self.arch()) @staticmethod def query(filename, all_tags=False, *extra_tags): f = open(filename, 'rb') debq = DebQuery(f) debq.read(all_tags, *extra_tags) f.close() return debq @staticmethod def debvercmp(ver1, ver2): """ implementation of dpkg's version comparison algorithm """ # 32 is arbitrary - it is needed for the "longer digit string wins" handling # (found this nice approach in Build/Deb.pm (build package)) ver1 = re.sub(br'(\d+)', lambda m: (32 * b'0' + m.group(1))[-32:], ver1) ver2 = re.sub(br'(\d+)', lambda m: (32 * b'0' + m.group(1))[-32:], ver2) vers = itertools.zip_longest(ver1, ver2, fillvalue=b'') for v1, v2 in vers: if v1 == v2: continue if not v1: # this makes the corresponding condition in the following # else part superfluous - keep the superfluous condition for # now (just to ease a (hopefully) upcoming refactoring (this # method really deserves a cleanup...)) return -1 if not v2: # see above return 1 v1 = bytes(bytearray([v1])) v2 = bytes(bytearray([v2])) if (v1.isalpha() and v2.isalpha()) or (v1.isdigit() and v2.isdigit()): res = packagequery.cmp(v1, v2) if res != 0: return res else: if v1 == b'~' or not v1: return -1 elif v2 == b'~' or not v2: return 1 ord1 = ord(v1) if not (v1.isalpha() or v1.isdigit()): ord1 += 256 ord2 = ord(v2) if not (v2.isalpha() or v2.isdigit()): ord2 += 256 if ord1 > ord2: return 1 else: return -1 return 0 @staticmethod def filename(name, epoch, version, release, arch): if release: return b'%s_%s-%s_%s.deb' % (name, version, release, arch) else: return b'%s_%s_%s.deb' % (name, version, arch) if __name__ == '__main__': try: debq = DebQuery.query(sys.argv[1]) except DebError as e: print(e.msg) sys.exit(2) print(debq.name(), debq.version(), debq.release(), debq.arch()) print(debq.description()) print('##########') print(b'\n'.join(debq.provides())) print('##########') print(b'\n'.join(debq.requires())) 
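# --- Added usage sketch (not part of the original osc sources) ---
# DebQuery.debvercmp() above follows dpkg-style ordering: the 32-digit zero
# padding makes numeric segments compare by value rather than byte by byte,
# and "~" sorts before any other character.
def _debvercmp_examples():
    """Illustrative self-check of the comparison semantics; never called by osc."""
    assert DebQuery.debvercmp(b'1.10-1', b'1.2-1') > 0   # numeric segment: 10 > 2
    assert DebQuery.debvercmp(b'1.0~1', b'1.0a1') < 0    # "~" sorts before "a"
    assert DebQuery.debvercmp(b'1.0-1', b'1.0-1') == 0   # identical strings are equal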
osc-1.12.1/osc/util/git_version.py000066400000000000000000000052201475337502500170350ustar00rootroot00000000000000import os import subprocess def get_git_archive_version(): """ Return version that is set by git during `git archive`. The returned format is equal to what `git describe --tags` returns. """ # the `version` variable contents get substituted during `git archive` # it requires adding this to .gitattributes: export-subst version = "1.12.1" if version.startswith(("$", "%")): # "$": version hasn't been substituted during `git archive` # "%": "Format:" and "$" characters get removed from the version string (a GitHub bug?) return None return version def get_git_version(): """ Determine version from git repo by calling `git describe --tags`. """ cmd = ["git", "describe", "--tags"] # run the command from the place where this file is placed # to ensure that we're in a git repo cwd = os.path.dirname(__file__) try: proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=cwd) except OSError: # `git` command not found return None stdout, _ = proc.communicate() if proc.returncode != 0: return None version = stdout.strip().decode("utf-8") return version def get_version(version): """ Get the most relevant version of the software: 1. the version set during `git archive` 2. the version from the git tags by calling `git describe --tags` 3. the version explicitly specified in the source code The version conforms PEP 440. """ # use version from the archive git_version = get_git_archive_version() # use version from the git repo if not git_version: git_version = get_git_version() # unable to determine version from git if not git_version: return version if "-" not in git_version: git_tag = git_version git_commits = None git_hash = None else: git_tag, git_commits, git_hash = git_version.rsplit("-", 2) git_commits = int(git_commits) # remove the 'g' prefix from hash git_hash = git_hash[1:] # removing "~" because it is not an allowed character in git tags # and also because the normalized form is (for example) 1.0.0b0 if version and git_tag != version.replace("~", ""): # Git tag doesn't correspond with version specified in the source code. # The most common reason is that forks do not have their tags in sync with upstream. # In that case just return the version specified in the source code. return version result = git_tag if git_hash: result += f"+{git_commits}.git.{git_hash}" return result osc-1.12.1/osc/util/helper.py000066400000000000000000000035361475337502500157740ustar00rootroot00000000000000# Copyright (C) 2018 SUSE Linux. All rights reserved. # This program is free software; it may be used, copied, modified # and distributed under the terms of the GNU General Public Licence, # either version 2, or (at your option) any later version. import builtins import html from .. import oscerr def decode_list(ilist): """ Decodes the elements of a list if needed """ dlist = [] for elem in ilist: if not isinstance(elem, str): dlist.append(decode_it(elem)) else: dlist.append(elem) return dlist def decode_it(obj): """Decode the given object unless it is a str. If the given object is a str or has no decode method, the object itself is returned. Otherwise, try to decode the object using utf-8. If this fails due to a UnicodeDecodeError, try to decode the object using latin-1. 
""" if isinstance(obj, str) or not hasattr(obj, 'decode'): return obj try: return obj.decode('utf-8') except UnicodeDecodeError: return obj.decode('latin-1') def raw_input(*args): func = builtins.input try: return func(*args) except EOFError: # interpret ctrl-d as user abort raise oscerr.UserAbort() def _html_escape(data): return html.escape(data, quote=False) def format_table(rows, headers): """Format list of tuples into equal width table with headers""" maxlens = [len(h) for h in headers] for r in rows: for i, c in enumerate(r): maxlens[i] = max(maxlens[i], len(c)) tpltpl = [] for i, m in enumerate(maxlens): tpltpl.append('{%s:<%s}' % (i, m)) # {0:12} {1:7} {2:10} {3:8} templ = ' '.join(tpltpl) + '\n' out = templ.format(*headers) out += templ.format(*['-' * m for m in maxlens]) for r in rows: out += templ.format(*r) return out osc-1.12.1/osc/util/models.py000066400000000000000000000732141475337502500160000ustar00rootroot00000000000000""" This module implements a lightweight and limited alternative to pydantic's BaseModel and Field classes. It works on python 3.6+. This module IS NOT a supported API, it is meant for osc internal use only. """ import copy import functools import inspect import sys import tempfile import types import typing from typing import Callable from typing import get_type_hints from xml.etree import ElementTree as ET # supported types from enum import Enum from typing import Any from typing import Dict from typing import List from typing import NewType from typing import Optional from typing import Tuple from typing import Union if sys.version_info < (3, 8): def get_origin(typ): result = getattr(typ, "__origin__", None) bases = getattr(result, "__orig_bases__", None) if bases: result = bases[0] return result else: from typing import get_origin import urllib3.response from . 
import xml __all__ = ( "BaseModel", "XmlModel", "Field", "NotSet", "FromParent", "Enum", "Dict", "List", "NewType", "Optional", "Tuple", "Union", ) class NotSetClass: def __repr__(self): return "NotSet" def __bool__(self): return False NotSet = NotSetClass() class FromParent(NotSetClass): def __init__(self, field_name, *, fallback=NotSet): self.field_name = field_name self.fallback = fallback def __repr__(self): return f"FromParent(field_name={self.field_name})" class Field(property): def __init__( self, default: Any = NotSet, description: Optional[str] = None, exclude: bool = False, get_callback: Optional[Callable] = None, **extra, ): # the default value; it can be a factory function that is lazily evaluated on the first use # model sets it to None if it equals to NotSet (for better usability) self.default = default # a flag indicating, whether the default is a callable with lazy evalution self.default_is_lazy = callable(self.default) # the name of model's attribute associated with this field instance - set from the model self.name = None # the type of this field instance - set from the model self.type = None # the description of the field self.description = description # docstring - for sphinx and help() self.__doc__ = self.description if self.__doc__: # append information about the default value if isinstance(self.default, FromParent): self.__doc__ += f"\n\nDefault: inherited from parent config's field ``{self.default.field_name}``" elif self.default is not NotSet: self.__doc__ += f"\n\nDefault: ``{self.default}``" # whether to exclude this field from export self.exclude = exclude # optional callback to postprocess returned field value # it takes (model_instance, value) and returns modified value self.get_callback = get_callback # extra fields self.extra = extra # create an instance specific of self.get() so we can annotate it in the model self.get_copy = types.FunctionType( self.get.__code__, self.get.__globals__, self.get.__name__, self.get.__defaults__, self.get.__closure__, ) # turn function into a method by binding it to the instance self.get_copy = types.MethodType(self.get_copy, self) super().__init__(fget=self.get_copy, fset=self.set, doc=description) @property def origin_type(self): origin_type = get_origin(self.type) or self.type if self.is_optional: types = [i for i in self.type.__args__ if i != type(None)] return get_origin(types[0]) or types[0] return origin_type @property def inner_type(self): if self.is_optional: types = [i for i in self.type.__args__ if i != type(None)] type_ = types[0] else: type_ = self.type if get_origin(type_) != list: return None if not hasattr(type_, "__args__"): return None inner_type = [i for i in type_.__args__ if i != type(None)][0] return inner_type @property def is_optional(self): origin_type = get_origin(self.type) or self.type return origin_type == Union and len(self.type.__args__) == 2 and type(None) in self.type.__args__ @property def is_model(self): return inspect.isclass(self.origin_type) and issubclass(self.origin_type, BaseModel) @property def is_model_list(self): return inspect.isclass(self.inner_type) and issubclass(self.inner_type, BaseModel) def validate_type(self, value, expected_types=None): if not expected_types and self.is_optional and value is None: return True if expected_types is None: expected_types = (self.type,) elif not isinstance(expected_types, (list, tuple)): expected_types = (expected_types,) valid_type = False for expected_type in expected_types: if valid_type: break origin_type = get_origin(expected_type) or 
expected_type # unwrap Union if origin_type == Union: if value is None and type(None) in expected_type.__args__: valid_type = True continue valid_type |= self.validate_type(value, expected_types=expected_type.__args__) continue # unwrap NewType if (callable(NewType) or isinstance(origin_type, NewType)) and hasattr(origin_type, "__supertype__"): valid_type |= self.validate_type(value, expected_types=(origin_type.__supertype__,)) continue if ( inspect.isclass(expected_type) and issubclass(expected_type, BaseModel) and isinstance(value, (expected_type, dict)) ): valid_type = True continue if ( inspect.isclass(expected_type) and issubclass(expected_type, Enum) ): # test if the value is part of the enum expected_type(value) valid_type = True continue if not isinstance(value, origin_type): msg = f"Field '{self.name}' has type '{self.type}'. Cannot assign a value with type '{type(value).__name__}'." raise TypeError(msg) # the type annotation has no arguments -> no need to check those if not getattr(expected_type, "__args__", None): valid_type = True continue if origin_type in (list, tuple): valid_type_items = True for i in value: valid_type_items &= self.validate_type(i, expected_type.__args__) valid_type |= valid_type_items elif origin_type in (dict,): valid_type_items = True for k, v in value.items(): valid_type_items &= self.validate_type(k, expected_type.__args__[0]) valid_type_items &= self.validate_type(v, expected_type.__args__[1]) valid_type |= valid_type_items else: raise TypeError(f"Field '{self.name}' has unsupported type '{self.type}'.") return valid_type def get(self, obj): try: result = obj._values[self.name] # convert dictionaries into objects # we can't do it earlier because list is a standalone object that is not under our control if result is not None and self.is_model_list: for num, i in enumerate(result): if isinstance(i, dict): klass = self.inner_type result[num] = klass(**i, _parent=obj) if self.get_callback is not None: result = self.get_callback(obj, result) return result except KeyError: pass try: result = obj._defaults[self.name] if isinstance(result, (dict, list)): # make a deepcopy to avoid problems with mutable defaults result = copy.deepcopy(result) obj._values[self.name] = result if self.get_callback is not None: result = self.get_callback(obj, result) return result except KeyError: pass if isinstance(self.default, FromParent): if obj._parent is None: if self.default.fallback is not NotSet: return self.default.fallback else: raise RuntimeError(f"The field '{self.name}' has default {self.default} but the model has no parent set") return getattr(obj._parent, self.default.field_name or self.name) if self.default is NotSet: raise RuntimeError(f"The field '{self.name}' has no default") # make a deepcopy to avoid problems with mutable defaults default = copy.deepcopy(self.default) # lazy evaluation of a factory function on first use if callable(default): default = default() # if this is a model field, convert dict to a model instance if self.is_model and isinstance(default, dict): cls = self.origin_type new_value = cls() # pylint: disable=not-callable for k, v in default.items(): setattr(new_value, k, v) default = new_value obj._defaults[self.name] = default return default def set(self, obj, value): # if this is a model field, convert dict to a model instance if self.is_model and isinstance(value, dict): # initialize a model instance from a dictionary klass = self.origin_type value = klass(**value, _parent=obj) # pylint: disable=not-callable elif self.is_model_list and 
isinstance(value, list): new_value = [] for i in value: if isinstance(i, dict): klass = self.inner_type new_value.append(klass(**i, _parent=obj)) else: i._parent = obj new_value.append(i) value = new_value elif self.is_model and isinstance(value, str) and hasattr(self.origin_type, "XML_TAG_FIELD"): klass = self.origin_type key = getattr(self.origin_type, "XML_TAG_FIELD") value = klass(**{key: value}, _parent=obj) elif self.is_model and value is not None: value._parent = obj self.validate_type(value) obj._values[self.name] = value class ModelMeta(type): def __new__(mcs, name, bases, attrs): new_cls = super().__new__(mcs, name, bases, attrs) new_cls.__fields__ = {} # NOTE: dir() doesn't preserve attribute order # we need to iterate through __mro__ classes to workaround that for parent_cls in reversed(new_cls.__mro__): for field_name in parent_cls.__dict__: if field_name in new_cls.__fields__: continue field = getattr(new_cls, field_name) if not isinstance(field, Field): continue new_cls.__fields__[field_name] = field # fill model specific details back to the fields for field_name, field in new_cls.__fields__.items(): # property name associated with the field in this model field.name = field_name # field type associated with the field in this model field.type = get_type_hints(new_cls)[field_name] # set annotation for the getter so it shows up in sphinx field.get_copy.__func__.__annotations__ = {"return": field.type} # set 'None' as the default for optional fields if field.default is NotSet and field.is_optional: field.default = None return new_cls @functools.total_ordering class BaseModel(metaclass=ModelMeta): __fields__: Dict[str, Field] def __setattr__(self, name, value): if getattr(self, "_allow_new_attributes", True) or hasattr(self.__class__, name) or hasattr(self, name): # allow setting properties - test if they exist in the class # also allow setting existing attributes that were previously initialized via __dict__ return super().__setattr__(name, value) raise AttributeError(f"Setting attribute '{self.__class__.__name__}.{name}' is not allowed") def __init__(self, **kwargs): self._allow_new_attributes = True self._defaults = {} # field defaults cached in field.get() self._values = {} # field values explicitly set after initializing the model self._parent = kwargs.pop("_parent", None) uninitialized_fields = [] for name, field in self.__fields__.items(): if name not in kwargs: if field.default is NotSet: uninitialized_fields.append(field.name) continue value = kwargs.pop(name) setattr(self, name, value) if kwargs: unknown_fields_str = ", ".join([f"'{i}'" for i in kwargs]) raise TypeError(f"The following kwargs of '{self.__class__.__name__}.__init__()' do not match any field: {unknown_fields_str}") if uninitialized_fields: uninitialized_fields_str = ", ".join([f"'{i}'" for i in uninitialized_fields]) raise TypeError( f"The following fields of '{self.__class__.__name__}' object are not initialized and have no default either: {uninitialized_fields_str}" ) for name, field in self.__fields__.items(): field.validate_type(getattr(self, name)) self._snapshot = {} # copy of ``self.dict()`` so we can determine if the object has changed later on self.do_snapshot() self._allow_new_attributes = False def _get_cmp_data(self): result = [] for name, field in self.__fields__.items(): if field.exclude: continue value = getattr(self, name) if isinstance(value, dict): value = sorted(list(value.items())) result.append((name, value)) return result def __eq__(self, other): if type(self) != type(other): return 
False return self._get_cmp_data() == other._get_cmp_data() def __lt__(self, other): if type(self) != type(other): return False return self._get_cmp_data() < other._get_cmp_data() def dict(self): result = {} for name, field in self.__fields__.items(): if field.exclude: continue value = getattr(self, name) if value is not None and field.is_model: result[name] = value.dict() elif value is not None and field.is_model_list: result[name] = [i.dict() for i in value] else: result[name] = value return result def do_snapshot(self): """ Save ``self.dict()`` result as a new starting point for detecting changes in the object data. """ self._snapshot = self.dict() def has_changed(self): """ Determine if the object data has changed since its creation or the last snapshot. """ return self.dict() != self._snapshot class XmlModel(BaseModel): XML_TAG = None _apiurl: Optional[str] = Field( exclude=True, default=FromParent("_apiurl", fallback=None), ) def to_xml(self, *, with_comments: bool = False) -> ET.Element: xml_tag = None # check if there's a special field that sets the tag for field_name, field in self.__fields__.items(): xml_set_tag = field.extra.get("xml_set_tag", False) if xml_set_tag: value = getattr(self, field_name) xml_tag = value break # use the value from the class if xml_tag is None: xml_tag = self.XML_TAG assert xml_tag is not None root = ET.Element(xml_tag) if with_comments: comment = [ "", "The commented attributes and elements only provide hints on the XML structure.", "See OBS documentation such as XML schema files for more details:", "https://github.com/openSUSE/open-build-service/tree/master/docs/api/api", "", ] comment_node = ET.Comment(text="\n".join(comment)) root.append(comment_node) for field_name, field in self.__fields__.items(): if field.exclude: continue xml_attribute = field.extra.get("xml_attribute", False) xml_set_tag = field.extra.get("xml_set_tag", False) xml_set_text = field.extra.get("xml_set_text", False) xml_name = field.extra.get("xml_name", field_name) xml_wrapped = field.extra.get("xml_wrapped", False) xml_item_name = field.extra.get("xml_item_name", xml_name) if xml_set_tag: # a special case when the field determines the top-level tag name continue if with_comments: if xml_attribute: comment = f'{xml_name}=""' else: comment = f"<{xml_name}>" comment_node = ET.Comment(text=f" {comment} ") root.append(comment_node) value = getattr(self, field_name) if value is None: # skip fields that are not set continue # if value is wrapped into an external element, create it if xml_wrapped: wrapper_node = ET.SubElement(root, xml_name) else: wrapper_node = root if xml_set_text: wrapper_node.text = str(value) continue if field.origin_type == list: for entry in value: if isinstance(entry, dict): klass = field.inner_type obj = klass(**entry) node = obj.to_xml() wrapper_node.append(node) elif field.inner_type and issubclass(field.inner_type, XmlModel): wrapper_node.append(entry.to_xml()) else: node = ET.SubElement(wrapper_node, xml_item_name) if xml_attribute: node.attrib[xml_attribute] = entry else: node.text = entry elif issubclass(field.origin_type, XmlModel): wrapper_node.append(value.to_xml()) elif xml_attribute: wrapper_node.attrib[xml_name] = str(value) else: node = ET.SubElement(wrapper_node, xml_name) node.text = str(value) return root @classmethod def from_string(cls, string: str, *, apiurl: Optional[str] = None) -> "XmlModel": """ Instantiate model from string. 
""" root = xml.xml_fromstring(string) return cls.from_xml(root, apiurl=apiurl) @classmethod def from_file(cls, file: Union[str, typing.IO], *, apiurl: Optional[str] = None) -> "XmlModel": """ Instantiate model from file. """ root = xml.xml_parse(file).getroot() return cls.from_xml(root, apiurl=apiurl) def to_bytes(self, *, with_comments: bool = False) -> bytes: """ Serialize the object as XML and return it as utf-8 encoded bytes. """ root = self.to_xml(with_comments=with_comments) xml.xml_indent(root) return ET.tostring(root, encoding="utf-8") def to_string(self, *, with_comments: bool = False) -> str: """ Serialize the object as XML and return it as a string. """ return self.to_bytes(with_comments=with_comments).decode("utf-8") def to_file(self, file: Union[str, typing.IO], *, with_comments: bool = False) -> None: """ Serialize the object as XML and save it to an utf-8 encoded file. """ root = self.to_xml(with_comments=with_comments) xml.xml_indent(root) return ET.ElementTree(root).write(file, encoding="utf-8") @staticmethod def value_from_string(field, value): """ Convert field value from string to the actual type of the field. """ if field.origin_type is bool: if value.lower() in ["1", "yes", "true", "on"]: value = True return value if value.lower() in ["0", "no", "false", "off"]: value = False return value if field.origin_type is int: if not value or not value.strip(): return None value = int(value) return value return value @classmethod def _remove_processed_node(cls, parent, node): """ Remove a node that has been fully processed and is now empty. """ if len(node) != 0: raise RuntimeError(f"Node {node} contains unprocessed child elements {list(node)}") if node.attrib: raise RuntimeError(f"Node {node} contains unprocessed attributes {node.attrib}") if node.text is not None and node.text.strip(): raise RuntimeError(f"Node {node} contains unprocessed text {node.text}") if parent is not None: parent.remove(node) @classmethod def from_xml(cls, root: ET.Element, *, apiurl: Optional[str] = None): """ Instantiate model from a XML root. """ # We need to make sure we parse all data # and that's why we remove processed elements and attributes and check that nothing remains. # Otherwise we'd be sending partial XML back and that would lead to data loss. # # Let's make a copy of the xml tree because we'll destroy it during the process. 
orig_root = root root = copy.deepcopy(root) kwargs = {} for field_name, field in cls.__fields__.items(): xml_attribute = field.extra.get("xml_attribute", False) xml_set_tag = field.extra.get("xml_set_tag", False) xml_set_text = field.extra.get("xml_set_text", False) xml_name = field.extra.get("xml_name", field_name) xml_wrapped = field.extra.get("xml_wrapped", False) xml_item_name = field.extra.get("xml_item_name", xml_name) value: Any node: Optional[ET.Element] if xml_set_tag: # field contains name of the ``root`` tag if xml_wrapped: # the last node wins (overrides the previous nodes) for node in root[:]: value = node.tag cls._remove_processed_node(root, node) else: value = root.tag kwargs[field_name] = value continue if xml_set_text: # field contains the value (text) of the element if xml_wrapped: # the last node wins (overrides the previous nodes) for node in root[:]: value = node.text node.text = None cls._remove_processed_node(root, node) else: value = root.text root.text = None value = value.strip() kwargs[field_name] = value continue if xml_attribute: # field is an attribute that contains a scalar if xml_name not in root.attrib: continue value = cls.value_from_string(field, root.attrib.pop(xml_name)) kwargs[field_name] = value continue if field.origin_type is list: if xml_wrapped: wrapper_node = root.find(xml_name) # we'll consider all nodes inside the wrapper node nodes = wrapper_node[:] if wrapper_node is not None else None else: wrapper_node = None # we'll consider only nodes with matching name nodes = root.findall(xml_item_name) if not nodes: if wrapper_node is not None: cls._remove_processed_node(root, wrapper_node) continue values = [] for node in nodes: if field.is_model_list: klass = field.inner_type entry = klass.from_xml(node, apiurl=apiurl) # clear node as it was checked in from_xml() already node.text = None node.attrib = {} node[:] = [] else: entry = cls.value_from_string(field, node.text) node.text = None values.append(entry) if xml_wrapped: cls._remove_processed_node(wrapper_node, node) else: cls._remove_processed_node(root, node) if xml_wrapped: cls._remove_processed_node(root, wrapper_node) kwargs[field_name] = values continue if field.is_model: # field contains an instance of XmlModel assert xml_name is not None node = root.find(xml_name) if node is None: continue klass = field.origin_type kwargs[field_name] = klass.from_xml(node, apiurl=apiurl) # clear node as it was checked in from_xml() already node.text = None node.attrib = {} node[:] = [] cls._remove_processed_node(root, node) continue # field contains a scalar node = root.find(xml_name) if node is None: continue value = cls.value_from_string(field, node.text) node.text = None cls._remove_processed_node(root, node) if value is None: if field.is_optional: continue value = "" kwargs[field_name] = value cls._remove_processed_node(None, root) obj = cls(**kwargs, _apiurl=apiurl) obj.__dict__["_root"] = orig_root return obj @classmethod def xml_request( cls, method: str, apiurl: str, path: List[str], query: Optional[dict] = None, headers: Optional[str] = None, data: Optional[str] = None, ) -> urllib3.response.HTTPResponse: from ..connection import http_request from ..core import makeurl url = makeurl(apiurl, path, query) # TODO: catch HTTPError and return the wrapped response as XmlModel instance return http_request(method, url, headers=headers, data=data) def do_update(self, other: "XmlModel") -> None: """ Update values of the fields in the current model instance from another. 
""" self._values = copy.deepcopy(other._values) def do_edit(self) -> Tuple[str, str, "XmlModel"]: """ Serialize model as XML and open it in an editor for editing. Return a tuple with: * a string with original data * a string with edited data * an instance of the class with edited data loaded IMPORTANT: This method is always interactive. """ from ..core import run_editor from ..output import get_user_input def write_file(f, data): f.seek(0) f.write(data) f.truncate() f.flush() with tempfile.NamedTemporaryFile(mode="w+", encoding="utf-8", prefix="obs_xml_", suffix=".xml") as f: original_data = self.to_string() original_data_with_comments = self.to_string(with_comments=True) write_file(f, original_data_with_comments) while True: run_editor(f.name) try: edited_obj = self.__class__.from_file(f.name, apiurl=self._apiurl) f.seek(0) edited_data = f.read() break except Exception as e: reply = get_user_input( f""" The edited data is not valid. {e} """, answers={"a": "abort", "e": "edit", "u": "undo changes and edit"}, ) if reply == "a": from .. import oscerr raise oscerr.UserAbort() elif reply == "e": continue elif reply == "u": write_file(f, original_data_with_comments) continue # strip comments, we don't need to increase traffic to the server edited_data = edited_obj.to_string() return original_data, edited_data, edited_obj osc-1.12.1/osc/util/packagequery.py000066400000000000000000000127131475337502500171730ustar00rootroot00000000000000import sys from .helper import decode_it class PackageError(Exception): """base class for all package related errors""" def __init__(self, fname, msg): super().__init__() self.fname = fname self.msg = msg class PackageQueries(dict): """Dict of package name keys and package query values. When assigning a package query, to a name, the package is evaluated to see if it matches the wanted architecture and if it has a greater version than the current value. """ # map debian and rpm arches to common obs arches architectureMap = {'i386': ['i586', 'i686'], 'amd64': ['x86_64'], 'ppc64el': ['ppc64le'], 'armv6hl': ['armv6l'], 'armv7hl': ['armv7l']} def __init__(self, wanted_architecture): self.wanted_architecture = wanted_architecture super().__init__() def add(self, query): """Adds package query to dict if it is of the correct architecture and is newer (has a greater version) than the currently assigned package. :param query: a PackageQuery """ self.__setitem__(query.name(), query) def __setitem__(self, name, query): if decode_it(name) != decode_it(query.name()): raise ValueError("key '%s' does not match " "package query name '%s'" % (name, query.name())) architecture = decode_it(query.arch()) if (architecture in [self.wanted_architecture, 'noarch', 'all', 'any'] or self.wanted_architecture in self.architectureMap.get(architecture, [])): current_query = self.get(name) # if current query does not exist or is older than this new query if current_query is None or current_query.vercmp(query) <= 0: super().__setitem__(name, query) class PackageQuery: """abstract base class for all package types""" def read(self, all_tags=False, *extra_tags): """Returns a PackageQueryResult instance""" raise NotImplementedError # Hmmm... this should be a module function (inherting this stuff # does not make much sense) (the same is true for the queryhdrmd5 method) @staticmethod def query(filename, all_tags=False, extra_rpmtags=(), extra_debtags=(), self_provides=True): f = open(filename, 'rb') magic = f.read(7) f.seek(0) extra_tags = () pkgquery = None if magic[:4] == b'\xed\xab\xee\xdb': from . 
import rpmquery pkgquery = rpmquery.RpmQuery(f) extra_tags = extra_rpmtags elif magic == b'!': from . import debquery pkgquery = debquery.DebQuery(f) extra_tags = extra_debtags elif magic == b' b) - (a < b) if __name__ == '__main__': try: pkgq = PackageQuery.query(sys.argv[1]) except PackageError as e: print(e.msg) sys.exit(2) print(pkgq.name()) print(pkgq.version()) print(pkgq.release()) print(pkgq.description()) print('##########') print('\n'.join(pkgq.provides())) print('##########') print('\n'.join(pkgq.requires())) osc-1.12.1/osc/util/repodata.py000066400000000000000000000153031475337502500163070ustar00rootroot00000000000000"""Module for reading repodata directory (created with createrepo) for package information instead of scanning individual rpms.""" import gzip import os from xml.etree import ElementTree as ET from . import rpmquery from . import packagequery def namespace(name): return "{http://linux.duke.edu/metadata/%s}" % name OPERATOR_BY_FLAGS = { "EQ": "=", "LE": "<=", "GE": ">=", "LT": "<", "GT": ">" } def primaryPath(directory): """Returns path to the primary repository data file. :param directory: repository directory that contains the repodata subdirectory :return: path to primary repository data file :rtype: str :raise IOError: if repomd.xml contains no primary location """ from .xml import xml_parse metaDataPath = os.path.join(directory, "repodata", "repomd.xml") elementTree = xml_parse(metaDataPath) root = elementTree.getroot() for dataElement in root: if dataElement.get("type") == "primary": locationElement = dataElement.find(namespace("repo") + "location") # even though the repomd.xml file is under repodata, the location a # attribute is relative to parent directory (directory). primaryPath = os.path.join(directory, locationElement.get("href")) break else: raise OSError("'%s' contains no primary location" % metaDataPath) return primaryPath def queries(directory): """Returns a list of RepoDataQueries constructed from the repodata under the directory. :param directory: path to a repository directory (parent directory of repodata directory) :return: list of RepoDataQueryResult instances :raise IOError: if repomd.xml contains no primary location """ from .xml import xml_parse path = primaryPath(directory) gunzippedPrimary = gzip.GzipFile(path) elementTree = xml_parse(gunzippedPrimary) root = elementTree.getroot() packageQueries = [] for packageElement in root: packageQuery = RepoDataQueryResult(directory, packageElement) packageQueries.append(packageQuery) return packageQueries def _to_bytes_or_None(method): def _method(self, *args, **kwargs): res = method(self, *args, **kwargs) if res is None: return None return res.encode() return _method def _to_bytes_list(method): def _method(self, *args, **kwargs): res = method(self, *args, **kwargs) return [data.encode() for data in res] return _method class RepoDataQueryResult(packagequery.PackageQueryResult): """PackageQueryResult that reads in data from the repodata directory files.""" def __init__(self, directory, element): """Creates a RepoDataQueryResult from the a package Element under a metadata Element in a primary.xml file. :param directory: repository directory path. Used to convert relative paths to full paths. 
:param element: package Element """ self.__directory = os.path.abspath(directory) self.__element = element def __formatElement(self): return self.__element.find(namespace("common") + "format") def __parseEntry(self, element): entry = element.get("name") flags = element.get("flags") if flags is not None: version = element.get("ver") operator = OPERATOR_BY_FLAGS[flags] entry += " %s %s" % (operator, version) release = element.get("rel") if release is not None: entry += "-%s" % release return entry def __parseEntryCollection(self, collection): formatElement = self.__formatElement() collectionElement = formatElement.find(namespace("rpm") + collection) entries = [] if collectionElement is not None: for entryElement in collectionElement.findall(namespace("rpm") + "entry"): entry = self.__parseEntry(entryElement) entries.append(entry) return entries def __versionElement(self): return self.__element.find(namespace("common") + "version") @_to_bytes_or_None def arch(self): return self.__element.findtext(namespace("common") + "arch") @_to_bytes_or_None def description(self): return self.__element.findtext(namespace("common") + "description") def distribution(self): return None @_to_bytes_or_None def epoch(self): return self.__versionElement().get("epoch") @_to_bytes_or_None def name(self): return self.__element.findtext(namespace("common") + "name") def path(self): locationElement = self.__element.find(namespace("common") + "location") relativePath = locationElement.get("href") absolutePath = os.path.join(self.__directory, relativePath) return absolutePath @_to_bytes_list def provides(self): return self.__parseEntryCollection("provides") @_to_bytes_or_None def release(self): return self.__versionElement().get("rel") @_to_bytes_list def requires(self): return self.__parseEntryCollection("requires") @_to_bytes_list def conflicts(self): return self.__parseEntryCollection('conflicts') @_to_bytes_list def obsoletes(self): return self.__parseEntryCollection('obsoletes') @_to_bytes_list def recommends(self): return self.__parseEntryCollection('recommends') @_to_bytes_list def suggests(self): return self.__parseEntryCollection('suggests') @_to_bytes_list def supplements(self): return self.__parseEntryCollection('supplements') @_to_bytes_list def enhances(self): return self.__parseEntryCollection('enhances') def canonname(self): if self.release() is None: release = None else: release = self.release() return rpmquery.RpmQuery.filename(self.name(), None, self.version(), release, self.arch()) def gettag(self, tag): # implement me, if needed return None def vercmp(self, other): # if either self.epoch() or other.epoch() is None, the vercmp will do # the correct thing because one is transformed into b'None' and the # other one into b"b''" (and 'b' is greater than 'N') res = rpmquery.RpmQuery.rpmvercmp(str(self.epoch()).encode(), str(other.epoch()).encode()) if res != 0: return res res = rpmquery.RpmQuery.rpmvercmp(self.version(), other.version()) if res != 0: return res res = rpmquery.RpmQuery.rpmvercmp(self.release(), other.release()) return res @_to_bytes_or_None def version(self): return self.__versionElement().get("ver") osc-1.12.1/osc/util/rpmquery.py000066400000000000000000000315571475337502500164050ustar00rootroot00000000000000import os import re import struct import sys from . 
import packagequery from .helper import decode_it def cmp(a, b): return (a > b) - (a < b) class RpmError(packagequery.PackageError): pass class RpmHeaderError(RpmError): pass class RpmHeader: """corresponds more or less to the indexEntry_s struct""" def __init__(self, offset, length): self.offset = offset # length of the data section (without length of indexEntries) self.length = length self.entries = [] def append(self, entry): self.entries.append(entry) def gettag(self, tag): for i in self.entries: if i.tag == tag: return i return None def __iter__(self): yield from self.entries def __len__(self): return len(self.entries) class RpmHeaderEntry: """corresponds to the entryInfo_s struct (except the data attribute)""" # each element represents an int ENTRY_SIZE = 16 def __init__(self, tag, type, offset, count): self.tag = tag self.type = type self.offset = offset self.count = count self.data = None class RpmQuery(packagequery.PackageQuery, packagequery.PackageQueryResult): LEAD_SIZE = 96 LEAD_MAGIC = 0xedabeedb HEADER_MAGIC = 0x8eade801 HEADERSIG_TYPE = 5 LESS = 1 << 1 GREATER = 1 << 2 EQUAL = 1 << 3 SENSE_STRONG = 1 << 27 default_tags = ( 1000, 1001, 1002, 1003, 1004, 1022, 1005, 1020, 1047, 1112, 1113, # provides 1049, 1048, 1050, # requires 1054, 1053, 1055, # conflicts 1090, 1114, 1115, # obsoletes 1156, 1158, 1157, # oldsuggests 5046, 5047, 5048, # recommends 5049, 5051, 5050, # suggests 5052, 5053, 5054, # supplements 5055, 5056, 5057 # enhances ) def __init__(self, fh): self.__file = fh self.__path = os.path.abspath(fh.name) self.filename_suffix = 'rpm' self.header = None def read(self, all_tags=False, self_provides=True, *extra_tags, **extra_kw): # self_provides is unused because a rpm always has a self provides self.__read_lead() data = self.__file.read(RpmHeaderEntry.ENTRY_SIZE) hdrmgc, reserved, il, dl = struct.unpack('!I3i', data) if self.HEADER_MAGIC != hdrmgc: raise RpmHeaderError(self.__path, 'invalid headermagic \'%s\'' % hdrmgc) # skip signature header for now size = il * RpmHeaderEntry.ENTRY_SIZE + dl # data is 8 byte aligned pad = (size + 7) & ~7 querysig = extra_kw.get('querysig') if not querysig: self.__file.read(pad) data = self.__file.read(RpmHeaderEntry.ENTRY_SIZE) hdrmgc, reserved, il, dl = struct.unpack('!I3i', data) self.header = RpmHeader(pad, dl) if self.HEADER_MAGIC != hdrmgc: raise RpmHeaderError(self.__path, 'invalid headermagic \'%s\'' % hdrmgc) data = self.__file.read(il * RpmHeaderEntry.ENTRY_SIZE) while len(data) > 0: ei = struct.unpack('!4i', data[:RpmHeaderEntry.ENTRY_SIZE]) self.header.append(RpmHeaderEntry(*ei)) data = data[RpmHeaderEntry.ENTRY_SIZE:] data = self.__file.read(self.header.length) for i in self.header: if i.tag in self.default_tags + extra_tags or all_tags: try: # this may fail for -debug* packages self.__read_data(i, data) except: pass return self def __read_lead(self): data = self.__file.read(self.LEAD_SIZE) leadmgc, = struct.unpack('!I', data[:4]) if leadmgc != self.LEAD_MAGIC: raise RpmError(self.__path, 'not a rpm (invalid lead magic \'%s\')' % leadmgc) sigtype, = struct.unpack('!h', data[78:80]) if sigtype != self.HEADERSIG_TYPE: raise RpmError(self.__path, 'invalid header signature \'%s\'' % sigtype) def __read_data(self, entry, data): off = entry.offset if entry.type == 2: entry.data = struct.unpack('!%dc' % entry.count, data[off:off + 1 * entry.count]) if entry.type == 3: entry.data = struct.unpack('!%dh' % entry.count, data[off:off + 2 * entry.count]) elif entry.type == 4: entry.data = struct.unpack('!%di' % entry.count, 
data[off:off + 4 * entry.count]) elif entry.type == 6: entry.data = unpack_string(data[off:]) elif entry.type == 7: entry.data = data[off:off + entry.count] elif entry.type == 8 or entry.type == 9: cnt = entry.count entry.data = [] while cnt > 0: cnt -= 1 s = unpack_string(data[off:]) # also skip '\0' off += len(s) + 1 entry.data.append(s) if entry.type == 8: return lang = os.getenv('LANGUAGE') or os.getenv('LC_ALL') \ or os.getenv('LC_MESSAGES') or os.getenv('LANG') if lang is None: entry.data = entry.data[0] return # get private i18n table table = self.header.gettag(100) # just care about the country code lang = lang.split('_', 1)[0] cnt = 0 for i in table.data: if cnt > len(entry.data) - 1: break if i == lang: entry.data = entry.data[cnt] return cnt += 1 entry.data = entry.data[0] else: raise RpmHeaderError(self.__path, 'unsupported tag type \'%d\' (tag: \'%s\'' % (entry.type, entry.tag)) def __reqprov(self, tag, flags, version, strong=None): pnames = self.header.gettag(tag) if not pnames: return [] pnames = pnames.data pflags = self.header.gettag(flags).data pvers = self.header.gettag(version).data if not (pnames and pflags and pvers): raise RpmError(self.__path, 'cannot get provides/requires, tags are missing') res = [] for name, flags, ver in zip(pnames, pflags, pvers): if strong is not None: # compat code for the obsolete RPMTAG_OLDSUGGESTSNAME tag # strong == 1 => return only "recommends" # strong == 0 => return only "suggests" if strong == 1: strong = self.SENSE_STRONG if (flags & self.SENSE_STRONG) != strong: continue # RPMSENSE_SENSEMASK = 15 (see rpmlib.h) but ignore RPMSENSE_SERIAL (= 1 << 0) therefore use 14 if flags & 14: name += b' ' if flags & self.GREATER: name += b'>' elif flags & self.LESS: name += b'<' if flags & self.EQUAL: name += b'=' name += b' %s' % ver res.append(name) return res def vercmp(self, rpmq): res = RpmQuery.rpmvercmp(str(self.epoch()), str(rpmq.epoch())) if res != 0: return res res = RpmQuery.rpmvercmp(self.version(), rpmq.version()) if res != 0: return res res = RpmQuery.rpmvercmp(self.release(), rpmq.release()) return res # XXX: create dict for the tag => number mapping?! 
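# A possible shape for the mapping the XXX above asks about (hedged sketch only;
# the numbers are taken from the accessor methods below, not from rpm itself):
#
#   TAG_BY_NAME = {
#       'name': 1000, 'version': 1001, 'release': 1002, 'epoch': 1003,
#       'summary': 1004, 'description': 1005, 'url': 1020, 'arch': 1022,
#   }
#
# gettag() could then be called as self.header.gettag(TAG_BY_NAME['name']).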
def name(self): return self.header.gettag(1000).data def version(self): return self.header.gettag(1001).data def release(self): return self.header.gettag(1002).data def epoch(self): epoch = self.header.gettag(1003) if epoch is None: return 0 return epoch.data[0] def arch(self): return self.header.gettag(1022).data def summary(self): return self.header.gettag(1004).data def description(self): return self.header.gettag(1005).data def url(self): entry = self.header.gettag(1020) if entry is None: return None return entry.data def path(self): return self.__path def provides(self): return self.__reqprov(1047, 1112, 1113) def requires(self): return self.__reqprov(1049, 1048, 1050) def conflicts(self): return self.__reqprov(1054, 1053, 1055) def obsoletes(self): return self.__reqprov(1090, 1114, 1115) def recommends(self): recommends = self.__reqprov(5046, 5048, 5047) if not recommends: recommends = self.__reqprov(1156, 1158, 1157, 1) return recommends def suggests(self): suggests = self.__reqprov(5049, 5051, 5050) if not suggests: suggests = self.__reqprov(1156, 1158, 1157, 0) return suggests def supplements(self): return self.__reqprov(5052, 5054, 5053) def enhances(self): return self.__reqprov(5055, 5057, 5506) def is_src(self): # SOURCERPM = 1044 return self.gettag(1044) is None def is_nosrc(self): # NOSOURCE = 1051, NOPATCH = 1052 return self.is_src() and \ (self.gettag(1051) is not None or self.gettag(1052) is not None) def gettag(self, num): return self.header.gettag(num) def canonname(self): if self.is_nosrc(): arch = b'nosrc' elif self.is_src(): arch = b'src' else: arch = self.arch() return RpmQuery.filename(self.name(), None, self.version(), self.release(), arch) @staticmethod def query(filename): f = open(filename, 'rb') rpmq = RpmQuery(f) rpmq.read() f.close() return rpmq @staticmethod def queryhdrmd5(filename): f = open(filename, 'rb') rpmq = RpmQuery(f) rpmq.read(1004, querysig=True) f.close() entry = rpmq.gettag(1004) if entry is None: return None return ''.join(["%02x" % x for x in struct.unpack('16B', entry.data)]) @staticmethod def rpmvercmp(ver1, ver2): """ implementation of RPM's version comparison algorithm (as described in lib/rpmvercmp.c) """ if ver1 == ver2: return 0 res = 0 ver1 = decode_it(ver1) ver2 = decode_it(ver2) while res == 0: # remove all leading non alphanumeric or tilde chars ver1 = re.sub('^[^a-zA-Z0-9~]*', '', ver1) ver2 = re.sub('^[^a-zA-Z0-9~]*', '', ver2) if ver1.startswith('~') or ver2.startswith('~'): if not ver1.startswith('~'): return 1 elif not ver2.startswith('~'): return -1 ver1 = ver1[1:] ver2 = ver2[1:] continue if not (len(ver1) and len(ver2)): break # check if we have a digits segment mo1 = re.match(r'(\d+)', ver1) mo2 = re.match(r'(\d+)', ver2) numeric = True if mo1 is None: mo1 = re.match('([a-zA-Z]+)', ver1) mo2 = re.match('([a-zA-Z]+)', ver2) numeric = False # check for different types: alpha and numeric if mo2 is None: if numeric: return 1 return -1 seg1 = mo1.group(0) ver1 = ver1[mo1.end(0):] seg2 = mo2.group(1) ver2 = ver2[mo2.end(1):] if numeric: # remove leading zeros seg1 = re.sub('^0+', '', seg1) seg2 = re.sub('^0+', '', seg2) # longer digit segment wins - if both have the same length # a simple ascii compare decides res = len(seg1) - len(seg2) or cmp(seg1, seg2) else: res = cmp(seg1, seg2) if res > 0: return 1 elif res < 0: return -1 return cmp(ver1, ver2) @staticmethod def filename(name, epoch, version, release, arch): return b'%s-%s-%s.%s.rpm' % (name, version, release, arch) def unpack_string(data, encoding=None): """unpack a '\\0' 
terminated string from data""" idx = data.find(b'\0') if idx == -1: raise ValueError('illegal string: not \\0 terminated') data = data[:idx] if encoding is not None: data = data.decode(encoding) return data if __name__ == '__main__': try: rpmq = RpmQuery.query(sys.argv[1]) except RpmError as e: print(e.msg) sys.exit(2) print(rpmq.name(), rpmq.version(), rpmq.release(), rpmq.arch(), rpmq.url()) print(rpmq.summary()) print(rpmq.description()) print('##########') print('\n'.join(rpmq.provides())) print('##########') print('\n'.join(rpmq.requires())) print('##########') print(RpmQuery.queryhdrmd5(sys.argv[1])) osc-1.12.1/osc/util/safewriter.py000066400000000000000000000014511475337502500166620ustar00rootroot00000000000000# be careful when debugging this code: # don't add print statements when setting sys.stdout = SafeWriter(sys.stdout)... class SafeWriter: """ Safely write an (unicode) str. In case of an "UnicodeEncodeError" the the str is encoded with the "encoding" encoding. All getattr, setattr calls are passed through to the "writer" instance. """ def __init__(self, writer, encoding='unicode_escape'): self._writer = writer self._encoding = encoding def write(self, s): try: self._writer.write(s) except UnicodeEncodeError as e: self._writer.write(s.encode(self._encoding)) def __getattr__(self, name): return getattr(self._writer, name) def __setattr__(self, name, value): super().__setattr__(name, value) osc-1.12.1/osc/util/xdg.py000066400000000000000000000004201475337502500152640ustar00rootroot00000000000000import os XDG_DATA_HOME = os.environ.get("XDG_DATA_HOME", "~/.local/share") XDG_CONFIG_HOME = os.environ.get("XDG_CONFIG_HOME", "~/.config") XDG_STATE_HOME = os.environ.get("XDG_STATE_HOME", "~/.local/state") XDG_CACHE_HOME = os.environ.get("XDG_CACHE_HOME", "~/.cache") osc-1.12.1/osc/util/xml.py000066400000000000000000000070121475337502500153060ustar00rootroot00000000000000""" Functions that manipulate with XML. """ import io import xml.sax.saxutils from typing import Union from xml.etree import ElementTree as ET def xml_escape(string): """ Escape the string so it's safe to use in XML and xpath. """ entities = { '"': """, "'": "'", } if isinstance(string, bytes): return xml.sax.saxutils.escape(string.decode("utf-8"), entities=entities).encode("utf-8") return xml.sax.saxutils.escape(string, entities=entities) def xml_unescape(string): """ Decode XML entities in the string. """ entities = { """: '"', "'": "'", } if isinstance(string, bytes): return xml.sax.saxutils.unescape(string.decode("utf-8"), entities=entities).encode("utf-8") return xml.sax.saxutils.unescape(string, entities=entities) def xml_strip_text(node): """ Recursively strip inner text in nodes: - if text contains only whitespaces - if node contains child nodes """ if node.text and not node.text.strip(): node.text = None elif len(node) != 0: node.text = None for child in node: xml_strip_text(child) def xml_indent_compat(elem, level=0): """ XML indentation code for python < 3.9. Source: http://effbot.org/zone/element-lib.htm#prettyprint """ i = "\n" + level * " " if isinstance(elem, ET.ElementTree): elem = elem.getroot() if len(elem): if not elem.text or not elem.text.strip(): elem.text = i + " " for e in elem: xml_indent_compat(e, level + 1) if not e.tail or not e.tail.strip(): e.tail = i + " " if not e.tail or not e.tail.strip(): e.tail = i else: if level and (not elem.tail or not elem.tail.strip()): elem.tail = i def xml_indent(root): """ Indent XML so it looks pretty after printing or saving to file. 
""" if hasattr(ET, "indent"): # ElementTree supports indent() in Python 3.9 and newer xml_strip_text(root) ET.indent(root) else: xml_indent_compat(root) def _extend_parser_error_msg(e: ET.ParseError, text: Union[str, bytes]): from ..output import tty y, x = e.position text = text.splitlines()[y-1][x-1:] if isinstance(text, bytes): text = text.decode("utf-8") new_text = "" for char in text: if char >= " ": new_text += char continue byte = ord(char) char = f"0x{byte:0>2X}" char = tty.colorize(char, "bg_red") new_text += char e.msg += ": " + new_text def xml_fromstring(text: str): """ xml.etree.ElementTree.fromstring() wrapper that extends error message in ParseError exceptions with a snippet of the broken XML. """ try: return ET.fromstring(text) except ET.ParseError as e: _extend_parser_error_msg(e, text) raise def xml_parse(source): """ xml.etree.ElementTree.parse() wrapper that extends error message in ParseError exceptions with a snippet of the broken XML. """ if isinstance(source, str): # source is a file name with open(source, "rb") as f: data = f.read() else: # source is an IO object data = source.read() if isinstance(data, bytes): f = io.BytesIO(data) else: f = io.StringIO(data) try: return ET.parse(f) except ET.ParseError as e: _extend_parser_error_msg(e, data) raise osc-1.12.1/osc/util/xpath.py000066400000000000000000000102171475337502500156330ustar00rootroot00000000000000from . import xml class XPathQuery: """ A query object that translates keyword arguments into a xpath query. The query objects can combined using `&` and `|` operators. Inspired with: https://docs.djangoproject.com/en/dev/topics/db/queries/#complex-lookups-with-q-objects """ VALID_OPS = [ "eq", "gt", "gteq", "lt", "lteq", "contains", "ends_with", "starts_with", ] def __init__(self, **kwargs): self.xpath = "" self.last_op = None for key, value in kwargs.items(): if value is None: continue key, op, value, op_not = self._parse(key, value) self._apply(key, op, value, op_not) def __str__(self): return self.xpath def _parse(self, key, value): op = "eq" op_not = False parts = key.split("__") for valid_op in self.VALID_OPS: # there must always be a field name followed by 0+ operators # in this case there's only the name if len(parts) == 1: continue if parts[-2:] == ["not", valid_op]: op = parts[-1] op_not = True parts = parts[:-2] break elif parts[-1] == valid_op: op = parts[-1] parts = parts[:-1] break elif parts[-1] == "not": op_not = True parts = parts[:-1] break key = "__".join(parts) return key, op, value, op_not def _apply(self, key, op, value, op_not=False): if "__" in key: prefix, key = key.rsplit("__", 1) prefix = prefix.replace("__", "/") else: prefix = "" if isinstance(value, (list, tuple)): q = XPathQuery() for i in value: if op_not: # translate name__not=["foo", "bar"] into XPathQuery(name__not="foo") & XPathQuery(name__not="bar") q &= XPathQuery()._apply(key, op, i, op_not) else: # translate name=["foo", "bar"] into XPathQuery(name="foo") | XPathQuery(name="bar") q |= XPathQuery()._apply(key, op, i, op_not) if prefix: q.xpath = f"{prefix}[{q.xpath}]" self &= q return self if isinstance(value, bool): value = str(int(value)) prefix = xml.xml_escape(prefix) key = xml.xml_escape(key) key = f"@{key}" value = xml.xml_escape(value) value = f"'{value}'" q = XPathQuery() if op == "eq": q.xpath = f"{key}={value}" elif op == "contains": q.xpath = f"contains({key}, {value})" else: raise ValueError(f"Invalid operator: {op}") if op_not: q.xpath = f"not({q.xpath})" if prefix: q.xpath = f"{prefix}[{q.xpath}]" self &= q 
return self @staticmethod def _imerge(q1, op, q2): """ Merge `q2` into `q1`. """ if not q1.xpath and not q2.xpath: return if not q1.xpath: q1.xpath = q2.xpath q1.last_op = q2.last_op return if not q2.xpath: return assert op is not None if q1.last_op not in (None, op): q1.xpath = f"({q1.xpath})" q1.xpath += f" {op} " if q2.last_op in (None, op): q1.xpath += f"{q2.xpath}" else: q1.xpath += f"({q2.xpath})" q1.last_op = op def __and__(self, other): result = XPathQuery() self._imerge(result, None, self) self._imerge(result, "and", other) return result def __iand__(self, other): self._imerge(self, "and", other) return self def __or__(self, other): result = XPathQuery() self._imerge(result, None, self) self._imerge(result, "or", other) return result def __ior__(self, other): self._imerge(self, "or", other) return self osc-1.12.1/setup.cfg000066400000000000000000000037461475337502500142260ustar00rootroot00000000000000[metadata] name = osc version = attr: osc.__version__ description = openSUSE commander long_description = Command-line client for the Open Build Service keywords = openSUSE, SUSE, RPM, build, buildservice, command-line license = GPLv2+ url = http://en.opensuse.org/openSUSE:OSC download_url = https://github.com/openSUSE/osc author = openSUSE project author_email = opensuse-buildservice@opensuse.org classifiers = Development Status :: 5 - Production/Stable Environment :: Console Intended Audience :: Developers Intended Audience :: Information Technology Intended Audience :: System Administrators License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+) Operating System :: MacOS :: MacOS X Operating System :: POSIX :: BSD :: FreeBSD Operating System :: POSIX :: Linux Programming Language :: Python Programming Language :: Python :: 3 Programming Language :: Python :: 3.6 Programming Language :: Python :: 3.7 Programming Language :: Python :: 3.8 Programming Language :: Python :: 3.9 Programming Language :: Python :: 3.10 Programming Language :: Python :: 3.11 Topic :: Software Development :: Build Tools Topic :: System :: Archiving :: Packaging [options] packages = osc osc._private osc.commands osc.commands_git osc.git_scm osc.gitea_api osc.obs_api osc.obs_scm osc.output osc.util install_requires = cryptography # rpm is not available on pip, install a matching package manually prior installing osc rpm ruamel.yaml urllib3 [options.extras_require] lint = darker==1.5.1 mypy [options.package_data] osc = py.typed [options.entry_points] console_scripts = git-obs = osc.commandline_git:main osc = osc.babysitter:main [flake8] exclude = .git,__pycache__ max-line-length = 120 [pylint] # import-outside-toplevel: we're using lazy imports on too many places disable = import-outside-toplevel [pylint.FORMAT] max-line-length = 120 osc-1.12.1/setup.py000077500000000000000000000001371475337502500141110ustar00rootroot00000000000000#!/usr/bin/env python3 import setuptools if __name__ == "__main__": setuptools.setup() osc-1.12.1/tests/000077500000000000000000000000001475337502500135355ustar00rootroot00000000000000osc-1.12.1/tests/__init__.py000066400000000000000000000000001475337502500156340ustar00rootroot00000000000000osc-1.12.1/tests/addfile_fixtures/000077500000000000000000000000001475337502500170565ustar00rootroot00000000000000osc-1.12.1/tests/addfile_fixtures/oscrc000066400000000000000000000001361475337502500201120ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 
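The XPathQuery class above composes xpath predicate strings from keyword arguments and the `&`/`|` operators. The following is a minimal usage sketch, assuming the osc package laid out in this archive is importable; the expected strings are derived from the _parse/_apply/_imerge logic shown above.

from osc.util.xpath import XPathQuery

# a single keyword argument becomes an attribute equality predicate
q = XPathQuery(name="foo")
assert str(q) == "@name='foo'"

# the "__contains" suffix selects the contains() operator; `&` joins terms with "and"
q = XPathQuery(name__contains="foo") & XPathQuery(project="openSUSE:Factory")
assert str(q) == "contains(@name, 'foo') and @project='openSUSE:Factory'"

# a list value is expanded into OR-ed alternatives
q = XPathQuery(name=["foo", "bar"])
assert str(q) == "@name='foo' or @name='bar'"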
osc-1.12.1/tests/addfile_fixtures/osctest/000077500000000000000000000000001475337502500205425ustar00rootroot00000000000000osc-1.12.1/tests/addfile_fixtures/osctest/.osc/000077500000000000000000000000001475337502500214045ustar00rootroot00000000000000osc-1.12.1/tests/addfile_fixtures/osctest/.osc/_apiurl000066400000000000000000000000211475337502500227530ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/addfile_fixtures/osctest/.osc/_osclib_version000066400000000000000000000000041475337502500245000ustar00rootroot000000000000001.0 osc-1.12.1/tests/addfile_fixtures/osctest/.osc/_packages000066400000000000000000000000331475337502500232400ustar00rootroot00000000000000 osc-1.12.1/tests/addfile_fixtures/osctest/.osc/_project000066400000000000000000000000101475337502500231230ustar00rootroot00000000000000osctest osc-1.12.1/tests/addfile_fixtures/osctest/simple/000077500000000000000000000000001475337502500220335ustar00rootroot00000000000000osc-1.12.1/tests/addfile_fixtures/osctest/simple/.osc/000077500000000000000000000000001475337502500226755ustar00rootroot00000000000000osc-1.12.1/tests/addfile_fixtures/osctest/simple/.osc/_apiurl000066400000000000000000000000211475337502500242440ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/addfile_fixtures/osctest/simple/.osc/_files000066400000000000000000000005711475337502500240640ustar00rootroot00000000000000 osc-1.12.1/tests/addfile_fixtures/osctest/simple/.osc/_osclib_version000066400000000000000000000000041475337502500257710ustar00rootroot000000000000001.0 osc-1.12.1/tests/addfile_fixtures/osctest/simple/.osc/_package000066400000000000000000000000061475337502500243460ustar00rootroot00000000000000simpleosc-1.12.1/tests/addfile_fixtures/osctest/simple/.osc/_project000066400000000000000000000000071475337502500244220ustar00rootroot00000000000000osctestosc-1.12.1/tests/addfile_fixtures/osctest/simple/.osc/_to_be_deleted000066400000000000000000000000041475337502500255270ustar00rootroot00000000000000foo osc-1.12.1/tests/addfile_fixtures/osctest/simple/.osc/foo000066400000000000000000000000271475337502500234020ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/addfile_fixtures/osctest/simple/.osc/merge000066400000000000000000000000601475337502500237130ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/addfile_fixtures/osctest/simple/.osc/nochange000066400000000000000000000000311475337502500243740ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/addfile_fixtures/osctest/simple/merge000066400000000000000000000000601475337502500230510ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/addfile_fixtures/osctest/simple/nochange000066400000000000000000000000511475337502500235340ustar00rootroot00000000000000This file didn't change but is modified. 
osc-1.12.1/tests/addfile_fixtures/osctest/simple/toadd1000066400000000000000000000000071475337502500231270ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/addfile_fixtures/osctest/simple/toadd2000066400000000000000000000000071475337502500231300ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/commit_fixtures/000077500000000000000000000000001475337502500167565ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/oscrc000066400000000000000000000001361475337502500200120ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 osc-1.12.1/tests/commit_fixtures/osctest/000077500000000000000000000000001475337502500204425ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/.osc/000077500000000000000000000000001475337502500213045ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/.osc/_apiurl000066400000000000000000000000211475337502500226530ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/commit_fixtures/osctest/.osc/_osclib_version000066400000000000000000000000041475337502500244000ustar00rootroot000000000000001.0 osc-1.12.1/tests/commit_fixtures/osctest/.osc/_packages000066400000000000000000000000331475337502500231400ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/osctest/.osc/_project000066400000000000000000000000101475337502500230230ustar00rootroot00000000000000osctest osc-1.12.1/tests/commit_fixtures/osctest/add/000077500000000000000000000000001475337502500211725ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/add/.osc/000077500000000000000000000000001475337502500220345ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/add/.osc/_apiurl000066400000000000000000000000211475337502500234030ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/commit_fixtures/osctest/add/.osc/_files000066400000000000000000000005671475337502500232300ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/osctest/add/.osc/_meta000066400000000000000000000002021475337502500230360ustar00rootroot00000000000000 Title example Description example osc-1.12.1/tests/commit_fixtures/osctest/add/.osc/_osclib_version000066400000000000000000000000041475337502500251300ustar00rootroot000000000000001.0 osc-1.12.1/tests/commit_fixtures/osctest/add/.osc/_package000066400000000000000000000000041475337502500235030ustar00rootroot00000000000000add osc-1.12.1/tests/commit_fixtures/osctest/add/.osc/_project000066400000000000000000000000071475337502500235610ustar00rootroot00000000000000osctestosc-1.12.1/tests/commit_fixtures/osctest/add/.osc/_to_be_added000066400000000000000000000000041475337502500243210ustar00rootroot00000000000000add osc-1.12.1/tests/commit_fixtures/osctest/add/.osc/foo000066400000000000000000000000271475337502500225410ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/commit_fixtures/osctest/add/.osc/merge000066400000000000000000000000601475337502500230520ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/commit_fixtures/osctest/add/.osc/nochange000066400000000000000000000000311475337502500235330ustar00rootroot00000000000000This file didn't change. 
osc-1.12.1/tests/commit_fixtures/osctest/add/add000066400000000000000000000000131475337502500216370ustar00rootroot00000000000000added file osc-1.12.1/tests/commit_fixtures/osctest/add/exists000066400000000000000000000000001475337502500224220ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/add/foo000066400000000000000000000000271475337502500216770ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/commit_fixtures/osctest/add/merge000066400000000000000000000000601475337502500222100ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/commit_fixtures/osctest/add/nochange000066400000000000000000000000311475337502500226710ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/commit_fixtures/osctest/added_missing/000077500000000000000000000000001475337502500232345ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/added_missing/.osc/000077500000000000000000000000001475337502500240765ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/added_missing/.osc/_apiurl000066400000000000000000000000211475337502500254450ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/commit_fixtures/osctest/added_missing/.osc/_files000066400000000000000000000003231475337502500252600ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/osctest/added_missing/.osc/_meta000066400000000000000000000002141475337502500251030ustar00rootroot00000000000000 Title example Description example osc-1.12.1/tests/commit_fixtures/osctest/added_missing/.osc/_osclib_version000066400000000000000000000000041475337502500271720ustar00rootroot000000000000001.0 osc-1.12.1/tests/commit_fixtures/osctest/added_missing/.osc/_package000066400000000000000000000000161475337502500255500ustar00rootroot00000000000000added_missing osc-1.12.1/tests/commit_fixtures/osctest/added_missing/.osc/_project000066400000000000000000000000071475337502500256230ustar00rootroot00000000000000osctestosc-1.12.1/tests/commit_fixtures/osctest/added_missing/.osc/_to_be_added000066400000000000000000000000101475337502500263600ustar00rootroot00000000000000add bar osc-1.12.1/tests/commit_fixtures/osctest/added_missing/bar000066400000000000000000000000071475337502500237200ustar00rootroot00000000000000foobar osc-1.12.1/tests/commit_fixtures/osctest/allstates/000077500000000000000000000000001475337502500224365ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/allstates/.osc/000077500000000000000000000000001475337502500233005ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/allstates/.osc/_apiurl000066400000000000000000000000211475337502500246470ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/commit_fixtures/osctest/allstates/.osc/_files000066400000000000000000000012451475337502500244660ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/osctest/allstates/.osc/_meta000066400000000000000000000002101475337502500243010ustar00rootroot00000000000000 Title example Description example osc-1.12.1/tests/commit_fixtures/osctest/allstates/.osc/_osclib_version000066400000000000000000000000041475337502500263740ustar00rootroot000000000000001.0 osc-1.12.1/tests/commit_fixtures/osctest/allstates/.osc/_package000066400000000000000000000000121475337502500247460ustar00rootroot00000000000000allstates 
osc-1.12.1/tests/commit_fixtures/osctest/allstates/.osc/_project000066400000000000000000000000071475337502500250250ustar00rootroot00000000000000osctestosc-1.12.1/tests/commit_fixtures/osctest/allstates/.osc/_to_be_added000066400000000000000000000000141475337502500255660ustar00rootroot00000000000000add missing osc-1.12.1/tests/commit_fixtures/osctest/allstates/.osc/_to_be_deleted000066400000000000000000000000041475337502500261320ustar00rootroot00000000000000foo osc-1.12.1/tests/commit_fixtures/osctest/allstates/.osc/foo000066400000000000000000000000271475337502500240050ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/commit_fixtures/osctest/allstates/.osc/merge000066400000000000000000000000601475337502500243160ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/commit_fixtures/osctest/allstates/.osc/missing000066400000000000000000000000101475337502500246630ustar00rootroot00000000000000missing osc-1.12.1/tests/commit_fixtures/osctest/allstates/.osc/nochange000066400000000000000000000000311475337502500247770ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/commit_fixtures/osctest/allstates/.osc/test000066400000000000000000000000051475337502500241750ustar00rootroot00000000000000test osc-1.12.1/tests/commit_fixtures/osctest/allstates/add000066400000000000000000000000131475337502500231030ustar00rootroot00000000000000added file osc-1.12.1/tests/commit_fixtures/osctest/allstates/exists000066400000000000000000000000001475337502500236660ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/allstates/missing000066400000000000000000000000111475337502500240220ustar00rootroot00000000000000replaced osc-1.12.1/tests/commit_fixtures/osctest/allstates/nochange000066400000000000000000000000261475337502500241410ustar00rootroot00000000000000This file did change. osc-1.12.1/tests/commit_fixtures/osctest/allstates/test000066400000000000000000000000051475337502500233330ustar00rootroot00000000000000test osc-1.12.1/tests/commit_fixtures/osctest/branch/000077500000000000000000000000001475337502500216775ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/branch/.osc/000077500000000000000000000000001475337502500225415ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/branch/.osc/_apiurl000066400000000000000000000000211475337502500241100ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/commit_fixtures/osctest/branch/.osc/_files000066400000000000000000000006041475337502500237250ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/osctest/branch/.osc/_meta000066400000000000000000000002621475337502500235510ustar00rootroot00000000000000 Title example Description example osc-1.12.1/tests/commit_fixtures/osctest/branch/.osc/_osclib_version000066400000000000000000000000041475337502500256350ustar00rootroot000000000000001.0 osc-1.12.1/tests/commit_fixtures/osctest/branch/.osc/_package000066400000000000000000000000071475337502500242130ustar00rootroot00000000000000branch osc-1.12.1/tests/commit_fixtures/osctest/branch/.osc/_project000066400000000000000000000000101475337502500242600ustar00rootroot00000000000000osctest osc-1.12.1/tests/commit_fixtures/osctest/branch/.osc/simple000066400000000000000000000000251475337502500237520ustar00rootroot00000000000000imple modified file. 
osc-1.12.1/tests/commit_fixtures/osctest/branch/cfilesremote000066400000000000000000000007641475337502500243120ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/osctest/branch/files000066400000000000000000000006011475337502500227210ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/osctest/branch/filesremote000066400000000000000000000010031475337502500241320ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/osctest/branch/simple000066400000000000000000000000261475337502500231110ustar00rootroot00000000000000simple modified file. osc-1.12.1/tests/commit_fixtures/osctest/conflict/000077500000000000000000000000001475337502500222435ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/conflict/.osc/000077500000000000000000000000001475337502500231055ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/conflict/.osc/_apiurl000066400000000000000000000000211475337502500244540ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/commit_fixtures/osctest/conflict/.osc/_files000066400000000000000000000005731475337502500242760ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/osctest/conflict/.osc/_in_conflict000066400000000000000000000000061475337502500254520ustar00rootroot00000000000000merge osc-1.12.1/tests/commit_fixtures/osctest/conflict/.osc/_meta000066400000000000000000000002071475337502500241140ustar00rootroot00000000000000 Title example Description example osc-1.12.1/tests/commit_fixtures/osctest/conflict/.osc/_osclib_version000066400000000000000000000000041475337502500262010ustar00rootroot000000000000001.0 osc-1.12.1/tests/commit_fixtures/osctest/conflict/.osc/_package000066400000000000000000000000101475337502500245510ustar00rootroot00000000000000conflictosc-1.12.1/tests/commit_fixtures/osctest/conflict/.osc/_project000066400000000000000000000000071475337502500246320ustar00rootroot00000000000000osctestosc-1.12.1/tests/commit_fixtures/osctest/conflict/.osc/foo000066400000000000000000000000271475337502500236120ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/commit_fixtures/osctest/conflict/.osc/merge000066400000000000000000000000601475337502500241230ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/commit_fixtures/osctest/conflict/.osc/nochange000066400000000000000000000000311475337502500246040ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/commit_fixtures/osctest/conflict/foo000066400000000000000000000000271475337502500227500ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/commit_fixtures/osctest/conflict/merge000066400000000000000000000000601475337502500232610ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/commit_fixtures/osctest/conflict/nochange000066400000000000000000000000311475337502500237420ustar00rootroot00000000000000This file didn't change. 
osc-1.12.1/tests/commit_fixtures/osctest/delete/000077500000000000000000000000001475337502500217045ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/delete/.osc/000077500000000000000000000000001475337502500225465ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/delete/.osc/_apiurl000066400000000000000000000000211475337502500241150ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/commit_fixtures/osctest/delete/.osc/_files000066400000000000000000000005721475337502500237360ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/osctest/delete/.osc/_meta000066400000000000000000000002051475337502500235530ustar00rootroot00000000000000 Title example Description example osc-1.12.1/tests/commit_fixtures/osctest/delete/.osc/_osclib_version000066400000000000000000000000041475337502500256420ustar00rootroot000000000000001.0 osc-1.12.1/tests/commit_fixtures/osctest/delete/.osc/_package000066400000000000000000000000071475337502500242200ustar00rootroot00000000000000delete osc-1.12.1/tests/commit_fixtures/osctest/delete/.osc/_project000066400000000000000000000000071475337502500242730ustar00rootroot00000000000000osctestosc-1.12.1/tests/commit_fixtures/osctest/delete/.osc/_to_be_deleted000066400000000000000000000000111475337502500253760ustar00rootroot00000000000000nochange osc-1.12.1/tests/commit_fixtures/osctest/delete/.osc/foo000066400000000000000000000000271475337502500232530ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/commit_fixtures/osctest/delete/.osc/merge000066400000000000000000000000601475337502500235640ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/commit_fixtures/osctest/delete/.osc/nochange000066400000000000000000000000311475337502500242450ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/commit_fixtures/osctest/delete/exists000066400000000000000000000000001475337502500231340ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/delete/foo000066400000000000000000000000271475337502500224110ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/commit_fixtures/osctest/delete/merge000066400000000000000000000000601475337502500227220ustar00rootroot00000000000000Is it possible to merge this file? I hope so... 
osc-1.12.1/tests/commit_fixtures/osctest/multiple/000077500000000000000000000000001475337502500222755ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/multiple/.osc/000077500000000000000000000000001475337502500231375ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/multiple/.osc/_apiurl000066400000000000000000000000211475337502500245060ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/commit_fixtures/osctest/multiple/.osc/_files000066400000000000000000000007271475337502500243310ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/osctest/multiple/.osc/_meta000066400000000000000000000002071475337502500241460ustar00rootroot00000000000000 Title example Description example osc-1.12.1/tests/commit_fixtures/osctest/multiple/.osc/_osclib_version000066400000000000000000000000041475337502500262330ustar00rootroot000000000000001.0 osc-1.12.1/tests/commit_fixtures/osctest/multiple/.osc/_package000066400000000000000000000000111475337502500246040ustar00rootroot00000000000000multiple osc-1.12.1/tests/commit_fixtures/osctest/multiple/.osc/_project000066400000000000000000000000071475337502500246640ustar00rootroot00000000000000osctestosc-1.12.1/tests/commit_fixtures/osctest/multiple/.osc/_to_be_added000066400000000000000000000000111475337502500254220ustar00rootroot00000000000000add add2 osc-1.12.1/tests/commit_fixtures/osctest/multiple/.osc/_to_be_deleted000066400000000000000000000000121475337502500257700ustar00rootroot00000000000000foo merge osc-1.12.1/tests/commit_fixtures/osctest/multiple/.osc/foo000066400000000000000000000000271475337502500236440ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/commit_fixtures/osctest/multiple/.osc/merge000066400000000000000000000000601475337502500241550ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/commit_fixtures/osctest/multiple/.osc/nochange000066400000000000000000000000311475337502500246360ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/commit_fixtures/osctest/multiple/.osc/test000066400000000000000000000000051475337502500240340ustar00rootroot00000000000000test osc-1.12.1/tests/commit_fixtures/osctest/multiple/add000066400000000000000000000000131475337502500227420ustar00rootroot00000000000000added file osc-1.12.1/tests/commit_fixtures/osctest/multiple/add2000066400000000000000000000000051475337502500230250ustar00rootroot00000000000000add2 osc-1.12.1/tests/commit_fixtures/osctest/multiple/exists000066400000000000000000000000001475337502500235250ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/multiple/nochange000066400000000000000000000000261475337502500240000ustar00rootroot00000000000000This file did change. 
osc-1.12.1/tests/commit_fixtures/osctest/multiple/test000066400000000000000000000000051475337502500231720ustar00rootroot00000000000000test osc-1.12.1/tests/commit_fixtures/osctest/nochanges/000077500000000000000000000000001475337502500224075ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/nochanges/.osc/000077500000000000000000000000001475337502500232515ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/nochanges/.osc/_apiurl000066400000000000000000000000211475337502500246200ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/commit_fixtures/osctest/nochanges/.osc/_files000066400000000000000000000006141475337502500244360ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/osctest/nochanges/.osc/_meta000066400000000000000000000002101475337502500242520ustar00rootroot00000000000000 Title example Description example osc-1.12.1/tests/commit_fixtures/osctest/nochanges/.osc/_osclib_version000066400000000000000000000000041475337502500263450ustar00rootroot000000000000001.0 osc-1.12.1/tests/commit_fixtures/osctest/nochanges/.osc/_package000066400000000000000000000000121475337502500247170ustar00rootroot00000000000000nochanges osc-1.12.1/tests/commit_fixtures/osctest/nochanges/.osc/_project000066400000000000000000000000071475337502500247760ustar00rootroot00000000000000osctestosc-1.12.1/tests/commit_fixtures/osctest/nochanges/.osc/merge000066400000000000000000000000601475337502500242670ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/commit_fixtures/osctest/nochanges/.osc/nochange000066400000000000000000000000311475337502500247500ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/commit_fixtures/osctest/nochanges/exists000066400000000000000000000000001475337502500236370ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/nochanges/nochange000066400000000000000000000000311475337502500241060ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/commit_fixtures/osctest/simple/000077500000000000000000000000001475337502500217335ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/simple/.osc/000077500000000000000000000000001475337502500225755ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/simple/.osc/_apiurl000066400000000000000000000000211475337502500241440ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/commit_fixtures/osctest/simple/.osc/_files000066400000000000000000000005721475337502500237650ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/osctest/simple/.osc/_meta000066400000000000000000000002051475337502500236020ustar00rootroot00000000000000 Title example Description example osc-1.12.1/tests/commit_fixtures/osctest/simple/.osc/_osclib_version000066400000000000000000000000041475337502500256710ustar00rootroot000000000000001.0 osc-1.12.1/tests/commit_fixtures/osctest/simple/.osc/_package000066400000000000000000000000061475337502500242460ustar00rootroot00000000000000simpleosc-1.12.1/tests/commit_fixtures/osctest/simple/.osc/_project000066400000000000000000000000071475337502500243220ustar00rootroot00000000000000osctestosc-1.12.1/tests/commit_fixtures/osctest/simple/.osc/foo000066400000000000000000000000271475337502500233020ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/commit_fixtures/osctest/simple/.osc/merge000066400000000000000000000000601475337502500236130ustar00rootroot00000000000000Is it possible to merge this file? I hope so... 
osc-1.12.1/tests/commit_fixtures/osctest/simple/.osc/nochange000066400000000000000000000000311475337502500242740ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/commit_fixtures/osctest/simple/exists000066400000000000000000000000001475337502500231630ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/osctest/simple/foo000066400000000000000000000000271475337502500224400ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/commit_fixtures/osctest/simple/merge000066400000000000000000000000601475337502500227510ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/commit_fixtures/osctest/simple/nochange000066400000000000000000000000511475337502500234340ustar00rootroot00000000000000This file didn't change but is modified. osc-1.12.1/tests/commit_fixtures/testAddedMissing_cfilesremote000066400000000000000000000004361475337502500247000ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testAddedMissing_filesremote000066400000000000000000000003041475337502500245270ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testAddedMissing_lfilelist000066400000000000000000000002151475337502500242010ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/testAddedMissing_lfilelistwithSHA000066400000000000000000000003341475337502500254330ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/testAddedMissing_missingfilelist000066400000000000000000000001741475337502500254230ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testAddedMissing_missingfilelistwithSHA000066400000000000000000000002061475337502500266470ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testAddedMissing_missingfilelistwithSHAsum000066400000000000000000000003121475337502500273720ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testAddfile_cfilesremote000066400000000000000000000007251475337502500236760ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testAddfile_filesremote000066400000000000000000000005721475337502500235330ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testAddfile_lfilelist000066400000000000000000000004121475337502500231750ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/testAddfile_missingfilelist000066400000000000000000000001621475337502500244150ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testAllStates_cfilesremote000066400000000000000000000012261475337502500242370ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testAllStates_expfiles000066400000000000000000000012451475337502500233760ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testAllStates_filesremote000066400000000000000000000012261475337502500240740ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testAllStates_lfilelist000066400000000000000000000006111475337502500235420ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/testAllStates_missingfilelist000066400000000000000000000003751475337502500247670ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testConflictfile_filesremote000066400000000000000000000005731475337502500246050ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testDeletefile_cfilesremote000066400000000000000000000004321475337502500244030ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testDeletefile_filesremote000066400000000000000000000005721475337502500242450ustar00rootroot00000000000000 
osc-1.12.1/tests/commit_fixtures/testDeletefile_lfilelist000066400000000000000000000002171475337502500237120ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/testExpand_cfilesremote000066400000000000000000000007641475337502500235700ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testExpand_expandedfilesremote000066400000000000000000000006011475337502500251240ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testExpand_filesremote000066400000000000000000000010031475337502500234100ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testExpand_lfilelist000066400000000000000000000001251475337502500230650ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/testExpand_missingfilelist000066400000000000000000000001701475337502500243030ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testInterrupted_lfilelist000066400000000000000000000003171475337502500241560ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/testMultiple_cfilesremote000066400000000000000000000007231475337502500241370ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testMultiple_filesremote000066400000000000000000000007251475337502500237760ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testMultiple_lfilelist000066400000000000000000000004121475337502500234400ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/testMultiple_missingfilelist000066400000000000000000000003641475337502500246640ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testNoChanges_filesremote000066400000000000000000000005731475337502500240510ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testOpenRequests000066400000000000000000000000461475337502500222360ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testPartial_cfilesremote000066400000000000000000000007251475337502500237420ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testPartial_filesremote000066400000000000000000000007251475337502500235770ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testPartial_lfilelist000066400000000000000000000004131475337502500232420ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/testPartial_missingfilelist000066400000000000000000000002711475337502500244620ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testSimple_cfilesremote000066400000000000000000000005721475337502500235770ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testSimple_filesremote000066400000000000000000000005721475337502500234340ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testSimple_lfilelist000066400000000000000000000003171475337502500231020ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/testSimple_lfilelistwithSHA000066400000000000000000000004361475337502500243340ustar00rootroot00000000000000osc-1.12.1/tests/commit_fixtures/testSimple_missingfilelist000066400000000000000000000001721475337502500243170ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testSimple_missingfilelistwithSHA000066400000000000000000000002111475337502500255410ustar00rootroot00000000000000 osc-1.12.1/tests/commit_fixtures/testSimple_missingfilelistwithSHAsum000066400000000000000000000003111475337502500262670ustar00rootroot00000000000000 osc-1.12.1/tests/common.py000066400000000000000000000243111475337502500154000ustar00rootroot00000000000000import io import os import shutil import sys import tempfile import unittest from unittest.mock import patch from 
urllib.request import HTTPHandler, addinfourl, build_opener from urllib.parse import urlparse, parse_qs from xml.etree import ElementTree as ET import urllib3.response import osc.conf import osc.core from osc.util.xml import xml_fromstring def urlcompare(url, *args): """compare all components of url except query string - it is converted to dict, therefor different ordering does not makes url's different, as well as quoting of a query string""" components = urlparse(url) query_args = parse_qs(components.query) components = components._replace(query=None) if not args: return False for url in args: components2 = urlparse(url) query_args2 = parse_qs(components2.query) components2 = components2._replace(query=None) if components != components2 or \ query_args != query_args2: return False return True def xml_equal(actual, exp): try: actual_xml = xml_fromstring(actual) exp_xml = xml_fromstring(exp) except ET.ParseError: return False todo = [(actual_xml, exp_xml)] while todo: actual_xml, exp_xml = todo.pop(0) if actual_xml.tag != exp_xml.tag: return False if actual_xml.attrib != exp_xml.attrib: return False if actual_xml.text != exp_xml.text: return False if actual_xml.tail != exp_xml.tail: return False if len(actual_xml) != len(exp_xml): return False todo.extend(list(zip(actual_xml, exp_xml))) return True class RequestWrongOrder(Exception): """raised if an unexpected request is issued to urllib2""" def __init__(self, url, exp_url, method, exp_method): super().__init__() self.url = url self.exp_url = exp_url self.method = method self.exp_method = exp_method def __str__(self): return '%s, %s, %s, %s' % (self.url, self.exp_url, self.method, self.exp_method) class RequestDataMismatch(Exception): """raised if POSTed or PUTed data doesn't match with the expected data""" def __init__(self, url, got, exp): self.url = url self.got = got self.exp = exp def __str__(self): return '%s, %s, %s' % (self.url, self.got, self.exp) EXPECTED_REQUESTS = [] # HACK: Fix "ValueError: I/O operation on closed file." error in tests on openSUSE Leap 15.2. # The problem seems to appear only in the tests, possibly some interaction with MockHTTPConnectionPool. # Porting 753fbc03 to urllib3 in openSUSE Leap 15.2 would fix the problem. urllib3.response.HTTPResponse.__iter__ = lambda self: iter(self._fp) class MockHTTPConnectionPool: def __init__(self, host, port=None, **conn_kw): pass def urlopen(self, method, url, body=None, headers=None, retries=None, **response_kw): global EXPECTED_REQUESTS request = EXPECTED_REQUESTS.pop(0) url = f"http://localhost{url}" if not urlcompare(request["url"], url) or request["method"] != method: raise RequestWrongOrder(request["url"], url, request["method"], method) if method in ("POST", "PUT"): if 'exp' not in request and 'expfile' in request: with open(request['expfile'], 'rb') as f: exp = f.read() elif 'exp' in request and 'expfile' not in request: exp = request['exp'].encode('utf-8') else: raise RuntimeError('Specify either `exp` or `expfile`') body = body or b"" if hasattr(body, "read"): # if it is a file-like object, read it body = body.read() if hasattr(body, "encode"): # if it can be encoded to bytes, do it body = body.encode("utf-8") if body != exp: # We do not have a notion to explicitly mark xml content. In case # of xml, we do not care about the exact xml representation (for # now). Hence, if both, data and exp, are xml and are "equal", # everything is fine (for now); otherwise, error out # (of course, this is problematic if we want to ensure that XML # documents are bit identical...) 
if not xml_equal(body, exp): raise RequestDataMismatch(url, repr(body), repr(exp)) if 'exception' in request: raise request["exception"] if 'text' not in request and 'file' in request: with open(request['file'], 'rb') as f: data = f.read() elif 'text' in request and 'file' not in request: data = request['text'].encode('utf-8') else: raise RuntimeError('Specify either `text` or `file`') response = urllib3.response.HTTPResponse(body=data, status=request.get("code", 200)) response._fp = io.BytesIO(data) return response def urldecorator(method, url, **kwargs): def decorate(test_method): def wrapped_test_method(self): # put all args into a single dictionary kwargs["method"] = method kwargs["url"] = url # prepend fixtures dir to `file` if "file" in kwargs: kwargs["file"] = os.path.join(self._get_fixtures_dir(), kwargs["file"]) # prepend fixtures dir to `expfile` if "expfile" in kwargs: kwargs["expfile"] = os.path.join(self._get_fixtures_dir(), kwargs["expfile"]) EXPECTED_REQUESTS.append(kwargs) test_method(self) # mock connection pool, but only just once if not hasattr(test_method, "_MockHTTPConnectionPool"): wrapped_test_method = patch('urllib3.HTTPConnectionPool', MockHTTPConnectionPool)(wrapped_test_method) wrapped_test_method._MockHTTPConnectionPool = True wrapped_test_method.__name__ = test_method.__name__ return wrapped_test_method return decorate def GET(path, **kwargs): return urldecorator('GET', path, **kwargs) def PUT(path, **kwargs): return urldecorator('PUT', path, **kwargs) def POST(path, **kwargs): return urldecorator('POST', path, **kwargs) def DELETE(path, **kwargs): return urldecorator('DELETE', path, **kwargs) class OscTestCase(unittest.TestCase): def setUp(self, copytree=True): global EXPECTED_REQUESTS EXPECTED_REQUESTS = [] os.chdir(os.path.dirname(__file__)) oscrc = os.path.join(self._get_fixtures_dir(), 'oscrc') osc.conf.get_config(override_conffile=oscrc, override_no_keyring=True) os.environ['OSC_CONFIG'] = oscrc self.tmpdir = tempfile.mkdtemp(prefix='osc_test') if copytree: shutil.copytree(os.path.join(self._get_fixtures_dir(), 'osctest'), os.path.join(self.tmpdir, 'osctest')) self.stdout = sys.stdout sys.stdout = io.StringIO() def tearDown(self): sys.stdout = self.stdout try: shutil.rmtree(self.tmpdir) except: pass os.environ.pop("OSC_CONFIG", "") self.assertTrue(len(EXPECTED_REQUESTS) == 0) def _get_fixtures_dir(self): raise NotImplementedError('subclasses should implement this method') def _get_fixture(self, filename): path = os.path.join(self._get_fixtures_dir(), filename) with open(path) as f: return f.read() def _change_to_pkg(self, name): os.chdir(os.path.join(self.tmpdir, 'osctest', name)) def _check_list(self, fname, exp): fname = os.path.join('.osc', fname) self.assertFileContentEqual(fname, exp) def _check_addlist(self, exp): self._check_list('_to_be_added', exp) def _check_deletelist(self, exp): self._check_list('_to_be_deleted', exp) def _check_conflictlist(self, exp): self._check_list('_in_conflict', exp) def _check_status(self, p, fname, exp): self.assertEqual(p.status(fname), exp) def _check_digests(self, fname, *skipfiles): fname = os.path.join(self._get_fixtures_dir(), fname) with open(os.path.join('.osc', '_files')) as f: files_act = f.read() with open(fname) as f: files_exp = f.read() self.assertXMLEqual(files_act, files_exp) root = xml_fromstring(files_act) for i in root.findall('entry'): if i.get('name') in skipfiles: continue self.assertTrue(os.path.exists(os.path.join('.osc', 'sources', i.get('name')))) 
self.assertEqual(osc.core.dgst(os.path.join('.osc', 'sources', i.get('name'))), i.get('md5')) def assertFilesEqual(self, first, second): self.assertTrue(os.path.exists(first)) self.assertTrue(os.path.exists(second)) with open(first) as f1, open(second) as f2: self.assertEqual(f1.read(), f2.read()) def assertFileContentEqual(self, file_path, expected_content): self.assertTrue(os.path.exists(file_path)) with open(file_path) as f: self.assertEqual(f.read(), expected_content) def assertFileContentNotEqual(self, file_path, expected_content): self.assertTrue(os.path.exists(file_path)) with open(file_path) as f: self.assertNotEqual(f.read(), expected_content) def assertXMLEqual(self, act, exp): if xml_equal(act, exp): return # ok, xmls are different, hence, assertEqual is expected to fail # (we just use it in order to get a "nice" error message) self.assertEqual(act, exp) # not reached (unless assertEqual is overridden in an incompatible way) raise RuntimeError('assertEqual assumptions violated') def assertEqualMultiline(self, got, exp): if (got + exp).find('\n') == -1: self.assertEqual(got, exp) else: start_delim = "\n" + (" 8< ".join(["-----"] * 8)) + "\n" end_delim = "\n" + (" >8 ".join(["-----"] * 8)) + "\n\n" self.assertEqual(got, exp, "got:" + start_delim + got + end_delim + "expected:" + start_delim + exp + end_delim) osc-1.12.1/tests/conf_fixtures/000077500000000000000000000000001475337502500164135ustar00rootroot00000000000000osc-1.12.1/tests/conf_fixtures/oscrc000066400000000000000000000001361475337502500174470ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 osc-1.12.1/tests/deletefile_fixtures/000077500000000000000000000000001475337502500175705ustar00rootroot00000000000000osc-1.12.1/tests/deletefile_fixtures/oscrc000066400000000000000000000001361475337502500206240ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 osc-1.12.1/tests/deletefile_fixtures/osctest/000077500000000000000000000000001475337502500212545ustar00rootroot00000000000000osc-1.12.1/tests/deletefile_fixtures/osctest/.osc/000077500000000000000000000000001475337502500221165ustar00rootroot00000000000000osc-1.12.1/tests/deletefile_fixtures/osctest/.osc/_apiurl000066400000000000000000000000211475337502500234650ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/deletefile_fixtures/osctest/.osc/_osclib_version000066400000000000000000000000041475337502500252120ustar00rootroot000000000000001.0 osc-1.12.1/tests/deletefile_fixtures/osctest/.osc/_packages000066400000000000000000000000331475337502500237520ustar00rootroot00000000000000 osc-1.12.1/tests/deletefile_fixtures/osctest/.osc/_project000066400000000000000000000000101475337502500236350ustar00rootroot00000000000000osctest osc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/000077500000000000000000000000001475337502500243635ustar00rootroot00000000000000osc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/.osc/000077500000000000000000000000001475337502500252255ustar00rootroot00000000000000osc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/.osc/_apiurl000066400000000000000000000000211475337502500265740ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/.osc/_files000066400000000000000000000005711475337502500264140ustar00rootroot00000000000000 
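The decorators and the OscTestCase base class defined in tests/common.py above drive the fixture-based HTTP mocking used by this test suite. Below is a minimal, hypothetical sketch of how a test module would use them; the module name, fixture directory, URL and response body are illustrative assumptions, not files or requests that exist in this archive.

import os

from .common import GET, OscTestCase  # helpers defined in tests/common.py above

# hypothetical fixture directory; real tests point at one of the *_fixtures dirs
FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'example_fixtures')


class TestExample(OscTestCase):
    def _get_fixtures_dir(self):
        # OscTestCase.setUp() reads the oscrc from here and copies the osctest tree
        return FIXTURES_DIR

    @GET('http://localhost/source/osctest/simple?rev=latest', text='<directory name="simple"/>')
    def test_files_listing(self):
        # MockHTTPConnectionPool pops this expected request and returns the canned
        # body; an unexpected URL or method raises RequestWrongOrder.
        self._change_to_pkg('simple')
        ...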
osc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/.osc/_osclib_version000066400000000000000000000000041475337502500303210ustar00rootroot000000000000001.0 osc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/.osc/_package000066400000000000000000000000061475337502500266760ustar00rootroot00000000000000simpleosc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/.osc/_project000066400000000000000000000000071475337502500267520ustar00rootroot00000000000000osctestosc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/.osc/_to_be_added000066400000000000000000000000071475337502500275150ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/.osc/_to_be_deleted000066400000000000000000000000041475337502500300570ustar00rootroot00000000000000foo osc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/.osc/foo000066400000000000000000000000271475337502500257320ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/.osc/merge000066400000000000000000000000601475337502500262430ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/.osc/nochange000066400000000000000000000000311475337502500267240ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/merge000066400000000000000000000000601475337502500254010ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/nochange000066400000000000000000000000511475337502500260640ustar00rootroot00000000000000This file didn't change but is modified. osc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/toadd1000066400000000000000000000000071475337502500254570ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/deletefile_fixtures/osctest/already_deleted/toadd2000066400000000000000000000000071475337502500254600ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/000077500000000000000000000000001475337502500230555ustar00rootroot00000000000000osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/.osc/000077500000000000000000000000001475337502500237175ustar00rootroot00000000000000osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/.osc/_apiurl000066400000000000000000000000211475337502500252660ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/.osc/_files000066400000000000000000000005741475337502500251110ustar00rootroot00000000000000 osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/.osc/_in_conflict000066400000000000000000000000041475337502500262620ustar00rootroot00000000000000foo osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/.osc/_osclib_version000066400000000000000000000000041475337502500270130ustar00rootroot000000000000001.0 osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/.osc/_package000066400000000000000000000000061475337502500253700ustar00rootroot00000000000000simpleosc-1.12.1/tests/deletefile_fixtures/osctest/conflict/.osc/_project000066400000000000000000000000071475337502500254440ustar00rootroot00000000000000osctestosc-1.12.1/tests/deletefile_fixtures/osctest/conflict/.osc/_to_be_added000066400000000000000000000000071475337502500262070ustar00rootroot00000000000000toadd1 
osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/.osc/foo000066400000000000000000000000271475337502500244240ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/.osc/merge000066400000000000000000000000601475337502500247350ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/.osc/nochange000066400000000000000000000000311475337502500254160ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/foo000066400000000000000000000001201475337502500235540ustar00rootroot00000000000000<<<<<<< foo.mine This is no test. ======= This is a simple test. >>>>>>> foo.r2 osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/foo.mine000066400000000000000000000000211475337502500245030ustar00rootroot00000000000000This is no test. osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/foo.r2000066400000000000000000000000271475337502500241040ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/merge000066400000000000000000000000601475337502500240730ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/nochange000066400000000000000000000000511475337502500245560ustar00rootroot00000000000000This file didn't change but is modified. osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/toadd1000066400000000000000000000000071475337502500241510ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/deletefile_fixtures/osctest/conflict/toadd2000066400000000000000000000000071475337502500241520ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/deletefile_fixtures/osctest/delete/000077500000000000000000000000001475337502500225165ustar00rootroot00000000000000osc-1.12.1/tests/deletefile_fixtures/osctest/delete/.osc/000077500000000000000000000000001475337502500233605ustar00rootroot00000000000000osc-1.12.1/tests/deletefile_fixtures/osctest/delete/.osc/_apiurl000066400000000000000000000000211475337502500247270ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/deletefile_fixtures/osctest/delete/.osc/_files000066400000000000000000000005711475337502500245470ustar00rootroot00000000000000 osc-1.12.1/tests/deletefile_fixtures/osctest/delete/.osc/_osclib_version000066400000000000000000000000041475337502500264540ustar00rootroot000000000000001.0 osc-1.12.1/tests/deletefile_fixtures/osctest/delete/.osc/_package000066400000000000000000000000061475337502500250310ustar00rootroot00000000000000simpleosc-1.12.1/tests/deletefile_fixtures/osctest/delete/.osc/_project000066400000000000000000000000071475337502500251050ustar00rootroot00000000000000osctestosc-1.12.1/tests/deletefile_fixtures/osctest/delete/.osc/_to_be_added000066400000000000000000000000071475337502500256500ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/deletefile_fixtures/osctest/delete/.osc/_to_be_deleted000066400000000000000000000000041475337502500262120ustar00rootroot00000000000000foo osc-1.12.1/tests/deletefile_fixtures/osctest/delete/.osc/foo000066400000000000000000000000271475337502500240650ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/deletefile_fixtures/osctest/delete/.osc/merge000066400000000000000000000000601475337502500243760ustar00rootroot00000000000000Is it possible to merge this file? I hope so... 
osc-1.12.1/tests/deletefile_fixtures/osctest/delete/.osc/nochange000066400000000000000000000000311475337502500250570ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/deletefile_fixtures/osctest/delete/merge000066400000000000000000000000601475337502500235340ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/deletefile_fixtures/osctest/delete/nochange000066400000000000000000000000511475337502500242170ustar00rootroot00000000000000This file didn't change but is modified. osc-1.12.1/tests/deletefile_fixtures/osctest/delete/toadd2000066400000000000000000000000071475337502500236130ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/deletefile_fixtures/osctest/replace/000077500000000000000000000000001475337502500226675ustar00rootroot00000000000000osc-1.12.1/tests/deletefile_fixtures/osctest/replace/.osc/000077500000000000000000000000001475337502500235315ustar00rootroot00000000000000osc-1.12.1/tests/deletefile_fixtures/osctest/replace/.osc/_apiurl000066400000000000000000000000211475337502500251000ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/deletefile_fixtures/osctest/replace/.osc/_files000066400000000000000000000005711475337502500247200ustar00rootroot00000000000000 osc-1.12.1/tests/deletefile_fixtures/osctest/replace/.osc/_osclib_version000066400000000000000000000000041475337502500266250ustar00rootroot000000000000001.0 osc-1.12.1/tests/deletefile_fixtures/osctest/replace/.osc/_package000066400000000000000000000000061475337502500252020ustar00rootroot00000000000000simpleosc-1.12.1/tests/deletefile_fixtures/osctest/replace/.osc/_project000066400000000000000000000000071475337502500252560ustar00rootroot00000000000000osctestosc-1.12.1/tests/deletefile_fixtures/osctest/replace/.osc/_to_be_added000066400000000000000000000000151475337502500260200ustar00rootroot00000000000000toadd1 merge osc-1.12.1/tests/deletefile_fixtures/osctest/replace/.osc/foo000066400000000000000000000000271475337502500242360ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/deletefile_fixtures/osctest/replace/.osc/merge000066400000000000000000000000601475337502500245470ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/deletefile_fixtures/osctest/replace/.osc/nochange000066400000000000000000000000311475337502500252300ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/deletefile_fixtures/osctest/replace/foo000066400000000000000000000000271475337502500233740ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/deletefile_fixtures/osctest/replace/merge000066400000000000000000000000111475337502500237010ustar00rootroot00000000000000replaced osc-1.12.1/tests/deletefile_fixtures/osctest/replace/nochange000066400000000000000000000000511475337502500243700ustar00rootroot00000000000000This file didn't change but is modified. 
osc-1.12.1/tests/deletefile_fixtures/osctest/replace/toadd1000066400000000000000000000000071475337502500237630ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/deletefile_fixtures/osctest/replace/toadd2000066400000000000000000000000071475337502500237640ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/deletefile_fixtures/osctest/simple/000077500000000000000000000000001475337502500225455ustar00rootroot00000000000000osc-1.12.1/tests/deletefile_fixtures/osctest/simple/.osc/000077500000000000000000000000001475337502500234075ustar00rootroot00000000000000osc-1.12.1/tests/deletefile_fixtures/osctest/simple/.osc/_apiurl000066400000000000000000000000211475337502500247560ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/deletefile_fixtures/osctest/simple/.osc/_files000066400000000000000000000011341475337502500245720ustar00rootroot00000000000000 osc-1.12.1/tests/deletefile_fixtures/osctest/simple/.osc/_osclib_version000066400000000000000000000000041475337502500265030ustar00rootroot000000000000001.0 osc-1.12.1/tests/deletefile_fixtures/osctest/simple/.osc/_package000066400000000000000000000000061475337502500250600ustar00rootroot00000000000000simpleosc-1.12.1/tests/deletefile_fixtures/osctest/simple/.osc/_project000066400000000000000000000000071475337502500251340ustar00rootroot00000000000000osctestosc-1.12.1/tests/deletefile_fixtures/osctest/simple/.osc/_to_be_added000066400000000000000000000000071475337502500256770ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/deletefile_fixtures/osctest/simple/.osc/foo000066400000000000000000000000271475337502500241140ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/deletefile_fixtures/osctest/simple/.osc/merge000066400000000000000000000000601475337502500244250ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/deletefile_fixtures/osctest/simple/.osc/nochange000066400000000000000000000000311475337502500251060ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/deletefile_fixtures/osctest/simple/foo000066400000000000000000000000271475337502500232520ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/deletefile_fixtures/osctest/simple/merge000066400000000000000000000000601475337502500235630ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/deletefile_fixtures/osctest/simple/nochange000066400000000000000000000000511475337502500242460ustar00rootroot00000000000000This file didn't change but is modified. 
osc-1.12.1/tests/deletefile_fixtures/osctest/simple/skipped_exists000066400000000000000000000000071475337502500255230ustar00rootroot00000000000000foobar osc-1.12.1/tests/deletefile_fixtures/osctest/simple/toadd1000066400000000000000000000000071475337502500236410ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/deletefile_fixtures/osctest/simple/toadd2000066400000000000000000000000071475337502500236420ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/difffile_fixtures/000077500000000000000000000000001475337502500172365ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/oscrc000066400000000000000000000001361475337502500202720ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 osc-1.12.1/tests/difffile_fixtures/osctest/000077500000000000000000000000001475337502500207225ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/.osc/000077500000000000000000000000001475337502500215645ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/.osc/_apiurl000066400000000000000000000000211475337502500231330ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/difffile_fixtures/osctest/.osc/_osclib_version000066400000000000000000000000041475337502500246600ustar00rootroot000000000000001.0 osc-1.12.1/tests/difffile_fixtures/osctest/.osc/_packages000066400000000000000000000000331475337502500234200ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/osctest/.osc/_project000066400000000000000000000000101475337502500233030ustar00rootroot00000000000000osctest osc-1.12.1/tests/difffile_fixtures/osctest/binary/000077500000000000000000000000001475337502500222065ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/binary/.osc/000077500000000000000000000000001475337502500230505ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/binary/.osc/_apiurl000066400000000000000000000000211475337502500244170ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/difffile_fixtures/osctest/binary/.osc/_files000066400000000000000000000004461475337502500242400ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/osctest/binary/.osc/_osclib_version000066400000000000000000000000041475337502500261440ustar00rootroot000000000000001.0 osc-1.12.1/tests/difffile_fixtures/osctest/binary/.osc/_package000066400000000000000000000000071475337502500245220ustar00rootroot00000000000000binary osc-1.12.1/tests/difffile_fixtures/osctest/binary/.osc/_project000066400000000000000000000000101475337502500245670ustar00rootroot00000000000000osctest osc-1.12.1/tests/difffile_fixtures/osctest/binary/.osc/_to_be_added000066400000000000000000000000151475337502500253370ustar00rootroot00000000000000binary_added osc-1.12.1/tests/difffile_fixtures/osctest/binary/.osc/_to_be_deleted000066400000000000000000000000171475337502500257060ustar00rootroot00000000000000binary_deleted osc-1.12.1/tests/difffile_fixtures/osctest/binary/.osc/binary000066400000000000000000000000221475337502500242510ustar00rootroot00000000000000I'm a binaryfile osc-1.12.1/tests/difffile_fixtures/osctest/binary/.osc/binary_deleted000066400000000000000000000000321475337502500257400ustar00rootroot00000000000000I'm a deleted binaryfile osc-1.12.1/tests/difffile_fixtures/osctest/binary/binary000066400000000000000000000000331475337502500234110ustar00rootroot00000000000000I'm a binaryfile modified 
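Each fixture tree ships the same minimal oscrc (a [general] section with apiurl plus an http://localhost section carrying user, pass and allow_http=1), so the tests never contact a real build service. A hedged sketch of pointing osc at such a file; the fixture path below is illustrative, and it assumes osc.conf.get_config() with its override_conffile argument, which is how osc itself loads an alternative configuration file:

import osc.conf

# Hypothetical path to one of the fixture oscrc files shown above.
oscrc_path = 'tests/difffile_fixtures/oscrc'

# Load the fixture configuration instead of the user's regular oscrc.
osc.conf.get_config(override_conffile=oscrc_path)

# The parsed settings are afterwards available via osc.conf.config,
# e.g. the apiurl should be the http://localhost value from the fixture.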
osc-1.12.1/tests/difffile_fixtures/osctest/binary/binary_added000066400000000000000000000000071475337502500245330ustar00rootroot00000000000000fbar osc-1.12.1/tests/difffile_fixtures/osctest/remote_localdelete/000077500000000000000000000000001475337502500245525ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/remote_localdelete/.osc/000077500000000000000000000000001475337502500254145ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/remote_localdelete/.osc/_apiurl000066400000000000000000000000211475337502500267630ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/difffile_fixtures/osctest/remote_localdelete/.osc/_files000066400000000000000000000005711475337502500266030ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/osctest/remote_localdelete/.osc/_osclib_version000066400000000000000000000000041475337502500305100ustar00rootroot000000000000001.0 osc-1.12.1/tests/difffile_fixtures/osctest/remote_localdelete/.osc/_package000066400000000000000000000000231475337502500270640ustar00rootroot00000000000000remote_localdelete osc-1.12.1/tests/difffile_fixtures/osctest/remote_localdelete/.osc/_project000066400000000000000000000000071475337502500271410ustar00rootroot00000000000000osctestosc-1.12.1/tests/difffile_fixtures/osctest/remote_localdelete/.osc/_to_be_deleted000066400000000000000000000000061475337502500302500ustar00rootroot00000000000000merge osc-1.12.1/tests/difffile_fixtures/osctest/remote_localdelete/.osc/foo000066400000000000000000000000271475337502500261210ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/difffile_fixtures/osctest/remote_localdelete/.osc/merge000066400000000000000000000000601475337502500264320ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/difffile_fixtures/osctest/remote_localdelete/.osc/nochange000066400000000000000000000000311475337502500271130ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/difffile_fixtures/osctest/remote_localdelete/foo000066400000000000000000000000271475337502500252570ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/difffile_fixtures/osctest/remote_localdelete/nochange000066400000000000000000000000311475337502500262510ustar00rootroot00000000000000This file didn't change. 
osc-1.12.1/tests/difffile_fixtures/osctest/remote_localdelete/toadd2000066400000000000000000000000071475337502500256470ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/000077500000000000000000000000001475337502500250705ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/.osc/000077500000000000000000000000001475337502500257325ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/.osc/_apiurl000066400000000000000000000000211475337502500273010ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/.osc/_files000066400000000000000000000007301475337502500271160ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/.osc/_osclib_version000066400000000000000000000000041475337502500310260ustar00rootroot000000000000001.0 osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/.osc/_package000066400000000000000000000000251475337502500274040ustar00rootroot00000000000000remote_localmodified osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/.osc/_project000066400000000000000000000000071475337502500274570ustar00rootroot00000000000000osctestosc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/.osc/binary000066400000000000000000000000331475337502500271350ustar00rootroot00000000000000I'm a binaryfile modified osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/.osc/foo000066400000000000000000000000271475337502500264370ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/.osc/merge000066400000000000000000000000601475337502500267500ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/.osc/nochange000066400000000000000000000000311475337502500274310ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/binary000066400000000000000000000000371475337502500262770ustar00rootroot00000000000000I'm a binaryfile modified foo osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/foo000066400000000000000000000000271475337502500255750ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/merge000066400000000000000000000000601475337502500261060ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/nochange000066400000000000000000000000441475337502500265730ustar00rootroot00000000000000This file didn't change. 
oh it does osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/toadd1000066400000000000000000000000071475337502500261640ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/difffile_fixtures/osctest/remote_localmodified/toadd2000066400000000000000000000000071475337502500261650ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/000077500000000000000000000000001475337502500235665ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/.osc/000077500000000000000000000000001475337502500244305ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/.osc/_apiurl000066400000000000000000000000211475337502500257770ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/.osc/_files000066400000000000000000000005711475337502500256170ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/.osc/_osclib_version000066400000000000000000000000041475337502500275240ustar00rootroot000000000000001.0 osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/.osc/_package000066400000000000000000000000161475337502500261020ustar00rootroot00000000000000remote_simple osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/.osc/_project000066400000000000000000000000071475337502500261550ustar00rootroot00000000000000osctestosc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/.osc/_to_be_added000066400000000000000000000000071475337502500267200ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/.osc/foo000066400000000000000000000000271475337502500251350ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/.osc/merge000066400000000000000000000000601475337502500254460ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/.osc/nochange000066400000000000000000000000311475337502500261270ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/binary000066400000000000000000000000331475337502500247710ustar00rootroot00000000000000I'm a binaryfile modified osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/foo000066400000000000000000000000271475337502500242730ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/merge000066400000000000000000000000601475337502500246040ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/nochange000066400000000000000000000000311475337502500252650ustar00rootroot00000000000000This file didn't change. 
osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/toadd1000066400000000000000000000000071475337502500246620ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple/toadd2000066400000000000000000000000071475337502500246630ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple_noadd/000077500000000000000000000000001475337502500247335ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple_noadd/.osc/000077500000000000000000000000001475337502500255755ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple_noadd/.osc/_apiurl000066400000000000000000000000211475337502500271440ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple_noadd/.osc/_files000066400000000000000000000005711475337502500267640ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple_noadd/.osc/_osclib_version000066400000000000000000000000041475337502500306710ustar00rootroot000000000000001.0 osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple_noadd/.osc/_package000066400000000000000000000000241475337502500272460ustar00rootroot00000000000000remote_simple_noadd osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple_noadd/.osc/_project000066400000000000000000000000071475337502500273220ustar00rootroot00000000000000osctestosc-1.12.1/tests/difffile_fixtures/osctest/remote_simple_noadd/.osc/foo000066400000000000000000000000271475337502500263020ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple_noadd/.osc/merge000066400000000000000000000000601475337502500266130ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple_noadd/.osc/nochange000066400000000000000000000000311475337502500272740ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple_noadd/foo000066400000000000000000000000271475337502500254400ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple_noadd/merge000066400000000000000000000000601475337502500257510ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple_noadd/nochange000066400000000000000000000000311475337502500264320ustar00rootroot00000000000000This file didn't change. 
osc-1.12.1/tests/difffile_fixtures/osctest/remote_simple_noadd/toadd2000066400000000000000000000000071475337502500260300ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/difffile_fixtures/osctest/replaced/000077500000000000000000000000001475337502500225015ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/replaced/.osc/000077500000000000000000000000001475337502500233435ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/replaced/.osc/_apiurl000066400000000000000000000000211475337502500247120ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/difffile_fixtures/osctest/replaced/.osc/_files000066400000000000000000000003041475337502500245240ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/osctest/replaced/.osc/_osclib_version000066400000000000000000000000041475337502500264370ustar00rootroot000000000000001.0 osc-1.12.1/tests/difffile_fixtures/osctest/replaced/.osc/_package000066400000000000000000000000111475337502500250100ustar00rootroot00000000000000replaced osc-1.12.1/tests/difffile_fixtures/osctest/replaced/.osc/_project000066400000000000000000000000071475337502500250700ustar00rootroot00000000000000osctestosc-1.12.1/tests/difffile_fixtures/osctest/replaced/.osc/_to_be_added000066400000000000000000000000111475337502500256260ustar00rootroot00000000000000replaced osc-1.12.1/tests/difffile_fixtures/osctest/replaced/.osc/replaced000066400000000000000000000000211475337502500250360ustar00rootroot00000000000000yet another file osc-1.12.1/tests/difffile_fixtures/osctest/replaced/replaced000066400000000000000000000000151475337502500241770ustar00rootroot00000000000000foo replaced osc-1.12.1/tests/difffile_fixtures/osctest/simple/000077500000000000000000000000001475337502500222135ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/000077500000000000000000000000001475337502500230555ustar00rootroot00000000000000osc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/_apiurl000066400000000000000000000000211475337502500244240ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/_files000066400000000000000000000014061475337502500242420ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/_in_conflict000066400000000000000000000000041475337502500254200ustar00rootroot00000000000000foo osc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/_osclib_version000066400000000000000000000000041475337502500261510ustar00rootroot000000000000001.0 osc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/_package000066400000000000000000000000061475337502500245260ustar00rootroot00000000000000simpleosc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/_project000066400000000000000000000000071475337502500246020ustar00rootroot00000000000000osctestosc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/_to_be_added000066400000000000000000000000351475337502500253460ustar00rootroot00000000000000toadd1 replaced addedmissing osc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/_to_be_deleted000066400000000000000000000000111475337502500257050ustar00rootroot00000000000000somefile osc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/foo000066400000000000000000000000271475337502500235620ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/merge000066400000000000000000000000601475337502500240730ustar00rootroot00000000000000Is it possible to merge this file? I hope so... 
osc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/missing000066400000000000000000000000101475337502500244400ustar00rootroot00000000000000missing osc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/nochange000066400000000000000000000000311475337502500245540ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/replaced000066400000000000000000000000211475337502500245500ustar00rootroot00000000000000yet another file osc-1.12.1/tests/difffile_fixtures/osctest/simple/.osc/somefile000066400000000000000000000000151475337502500245770ustar00rootroot00000000000000some content osc-1.12.1/tests/difffile_fixtures/osctest/simple/foo000066400000000000000000000001201475337502500227120ustar00rootroot00000000000000<<<<<<< foo.mine This is no test. ======= This is a simple test. >>>>>>> foo.r2 osc-1.12.1/tests/difffile_fixtures/osctest/simple/merge000066400000000000000000000000601475337502500232310ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/difffile_fixtures/osctest/simple/nochange000066400000000000000000000000511475337502500237140ustar00rootroot00000000000000This file didn't change but is modified. osc-1.12.1/tests/difffile_fixtures/osctest/simple/replaced000066400000000000000000000000151475337502500237110ustar00rootroot00000000000000foo replaced osc-1.12.1/tests/difffile_fixtures/osctest/simple/toadd1000066400000000000000000000000071475337502500233070ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/difffile_fixtures/osctest/simple/toadd2000066400000000000000000000000071475337502500233100ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/difffile_fixtures/testDiffRemoteDeletedLocalAdded_files000066400000000000000000000005721475337502500264570ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/testDiffRemoteExistingLocalNotExisting_binary000066400000000000000000000000071475337502500303300ustar00rootroot00000000000000fbar osc-1.12.1/tests/difffile_fixtures/testDiffRemoteExistingLocalNotExisting_files000066400000000000000000000010651475337502500301530ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/testDiffRemoteExistingLocalNotExisting_foobar000066400000000000000000000000161475337502500303140ustar00rootroot00000000000000foobar barfoo osc-1.12.1/tests/difffile_fixtures/testDiffRemoteMissingLocalDeleted_files000066400000000000000000000004351475337502500270650ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/testDiffRemoteMissingLocalExisting_files000066400000000000000000000004371475337502500273130ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/testDiffRemoteModified_files000066400000000000000000000005721475337502500247340ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/testDiffRemoteModified_merge000066400000000000000000000000431475337502500247220ustar00rootroot00000000000000Is it possible to merge this file? 
osc-1.12.1/tests/difffile_fixtures/testDiffRemoteNoChange_files000066400000000000000000000005721475337502500246760ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/testDiffRemoteUnchangedLocalModified_binary000066400000000000000000000000331475337502500276760ustar00rootroot00000000000000I'm a binaryfile modified osc-1.12.1/tests/difffile_fixtures/testDiffRemoteUnchangedLocalModified_files000066400000000000000000000007301475337502500275200ustar00rootroot00000000000000 osc-1.12.1/tests/difffile_fixtures/testDiffRemoteUnchangedLocalModified_nochange000066400000000000000000000000311475337502500301720ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/fixtures/000077500000000000000000000000001475337502500154065ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/README000066400000000000000000000015741475337502500162750ustar00rootroot00000000000000Generate data for creating test archives
----------------------------------------
echo 'foobar' > /tmp/foo
# root perms required for the next command
echo 'numbers' > /123
echo 'qwerty' > very-long-long-long-long-name
echo 'asdfgh' > very-long-long-long-long-name2
echo 'newline' > 'very-long-name -with-newline'
echo 'newline' > 'a b'
mkdir 'dir'
echo 'file-in-a-dir' > dir/file

Create archive.ar
-----------------
ar qP archive.ar /tmp/foo /123 very-long-long-long-long-name very-long-long-long-long-name2 'very-long-name -with-newline' 'a b' dir/file

Create archive.cpio
-------------------
printf "/tmp/foo\0/123\0very-long-long-long-long-name\0very-long-long-long-long-name2\0very-long-name -with-newline\0a\nb\0dir/file\0" | cpio -ocv0 --owner=root:root > archive.cpio

Create archive-no-ext_fnhdr.ar
------------------------------
ar qP archive-no-ext_fnhdr.ar dir/file
osc-1.12.1/tests/fixtures/archive-no-ext_fnhdr.ar000066400000000000000000000001221475337502500217400ustar00rootroot00000000000000! dir/file/ 1724142481 1000 1000 100644 14 ` file-in-a-dir osc-1.12.1/tests/fixtures/archive.ar000066400000000000000000000012041475337502500173500ustar00rootroot00000000000000! 
// 94 ` very-long-long-long-long-name/ very-long-long-long-long-name2/ very-long-name -with-newline/ /tmp/foo/ 1716888536 1000 1000 100644 7 ` foobar /123/ 1716883776 0 0 100644 8 ` numbers /0 1716882802 1000 1000 100644 7 ` qwerty /31 1716882988 1000 1000 100644 7 ` asdfgh /63 1716884767 1000 1000 100644 8 ` newline a b/ 1716884876 1000 1000 100644 8 ` newline dir/file/ 1716992150 1000 1000 100644 14 ` file-in-a-dir osc-1.12.1/tests/fixtures/archive.cpio000066400000000000000000000030001475337502500176740ustar00rootroot000000000000000707010000A485000081A40000000000000000000000016655A3D800000007000000000000002600000000000000000000000900000000/tmp/foofoobar 070701001D80B8000081A40000000000000000000000016655914000000008000000000000002300000000000000000000000500000000/123numbers 0707010101520A000081A400000000000000000000000166558D7200000007000000000000002B00000000000000000000001E00000000very-long-long-long-long-nameqwerty 07070101015246000081A400000000000000000000000166558E2C00000007000000000000002B00000000000000000000001F00000000very-long-long-long-long-name2asdfgh 070701010155A2000081A40000000000000000000000016655951F00000008000000000000002B00000000000000000000001D00000000very-long-name -with-newlinenewline 070701010155CF000081A40000000000000000000000016655958C00000008000000000000002B00000000000000000000000400000000a bnewline 0707010101AB6D000081A4000000000000000000000001665738960000000E000000000000002B00000000000000000000000900000000dir/filefile-in-a-dir 07070100000000000000000000000000000000000000010000000000000000000000000000000000000000000000000000000B00000000TRAILER!!!osc-1.12.1/tests/fixtures/packages/000077500000000000000000000000001475337502500171645ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/oscrc000066400000000000000000000001361475337502500202200ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 osc-1.12.1/tests/fixtures/packages/osctest/000077500000000000000000000000001475337502500206505ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/openSUSE:Tools/000077500000000000000000000000001475337502500234245ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/openSUSE:Tools/osc/000077500000000000000000000000001475337502500242105ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/openSUSE:Tools/osc/.osc/000077500000000000000000000000001475337502500250525ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/openSUSE:Tools/osc/.osc/_apiurl000066400000000000000000000000211475337502500264210ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/fixtures/packages/osctest/openSUSE:Tools/osc/.osc/_files000066400000000000000000000013061475337502500262360ustar00rootroot00000000000000 osc-1.12.1/tests/fixtures/packages/osctest/openSUSE:Tools/osc/.osc/_osclib_version000066400000000000000000000000041475337502500301460ustar00rootroot000000000000001.0 osc-1.12.1/tests/fixtures/packages/osctest/openSUSE:Tools/osc/.osc/_package000066400000000000000000000000041475337502500265210ustar00rootroot00000000000000osc osc-1.12.1/tests/fixtures/packages/osctest/openSUSE:Tools/osc/.osc/_project000066400000000000000000000000171475337502500266000ustar00rootroot00000000000000openSUSE:Tools 
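The README shown earlier documents how the ar and cpio fixture archives (whose raw bytes appear above) are produced. A small sketch of listing the members of such archives to cross-check them against the README; it assumes the standard ar and cpio command line tools are installed, and the paths are illustrative:

import subprocess

# List the members of the ar fixture ('t' prints the archive's table of contents).
subprocess.run(['ar', 't', 'tests/fixtures/archive.ar'], check=True)

# List the members of the cpio fixture ('-i' reads an archive, '-t' only prints the names).
with open('tests/fixtures/archive.cpio', 'rb') as f:
    subprocess.run(['cpio', '-it'], stdin=f, check=True)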
osc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/000077500000000000000000000000001475337502500256555ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/pkgA/000077500000000000000000000000001475337502500265375ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/pkgA/.osc/000077500000000000000000000000001475337502500274015ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/pkgA/.osc/_apiurl000066400000000000000000000000231475337502500307520ustar00rootroot00000000000000http://example.com osc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/pkgA/.osc/_files000066400000000000000000000001251475337502500305630ustar00rootroot00000000000000 osc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/pkgA/.osc/_osclib_version000066400000000000000000000000041475337502500324750ustar00rootroot000000000000001.0 osc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/pkgA/.osc/_package000066400000000000000000000000041475337502500310500ustar00rootroot00000000000000pkgAosc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/pkgA/.osc/_project000066400000000000000000000000101475337502500311200ustar00rootroot00000000000000projectAosc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/pkgB/000077500000000000000000000000001475337502500265405ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/pkgB/.osc/000077500000000000000000000000001475337502500274025ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/pkgB/.osc/_apiurl000066400000000000000000000000231475337502500307530ustar00rootroot00000000000000http://example.com osc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/pkgB/.osc/_files000066400000000000000000000001251475337502500305640ustar00rootroot00000000000000 osc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/pkgB/.osc/_osclib_version000066400000000000000000000000041475337502500324760ustar00rootroot000000000000001.0 osc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/pkgB/.osc/_package000066400000000000000000000000041475337502500310510ustar00rootroot00000000000000pkgBosc-1.12.1/tests/fixtures/packages/osctest/projectA-different-apiurl/pkgB/.osc/_project000066400000000000000000000000101475337502500311210ustar00rootroot00000000000000projectAosc-1.12.1/tests/fixtures/packages/osctest/projectA/000077500000000000000000000000001475337502500224175ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgA-symlink000077700000000000000000000000001475337502500255102pkgAustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgA/000077500000000000000000000000001475337502500233015ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgA/.osc/000077500000000000000000000000001475337502500241435ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgA/.osc/_apiurl000066400000000000000000000000211475337502500255120ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgA/.osc/_files000066400000000000000000000001251475337502500253250ustar00rootroot00000000000000 
osc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgA/.osc/_osclib_version000066400000000000000000000000041475337502500272370ustar00rootroot000000000000001.0 osc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgA/.osc/_package000066400000000000000000000000041475337502500256120ustar00rootroot00000000000000pkgAosc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgA/.osc/_project000066400000000000000000000000101475337502500256620ustar00rootroot00000000000000projectAosc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgA/pkgA.changes000066400000000000000000000000001475337502500255030ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgA/pkgA.spec000066400000000000000000000000001475337502500250250ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgB/000077500000000000000000000000001475337502500233025ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgB/.osc/000077500000000000000000000000001475337502500241445ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgB/.osc/_apiurl000066400000000000000000000000211475337502500255130ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgB/.osc/_files000066400000000000000000000001251475337502500253260ustar00rootroot00000000000000 osc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgB/.osc/_osclib_version000066400000000000000000000000041475337502500272400ustar00rootroot000000000000001.0 osc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgB/.osc/_package000066400000000000000000000000041475337502500256130ustar00rootroot00000000000000pkgBosc-1.12.1/tests/fixtures/packages/osctest/projectA/pkgB/.osc/_project000066400000000000000000000000101475337502500256630ustar00rootroot00000000000000projectAosc-1.12.1/tests/fixtures/packages/osctest/projectB/000077500000000000000000000000001475337502500224205ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectB/pkgA/000077500000000000000000000000001475337502500233025ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectB/pkgA/.osc/000077500000000000000000000000001475337502500241445ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectB/pkgA/.osc/_apiurl000066400000000000000000000000211475337502500255130ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/fixtures/packages/osctest/projectB/pkgA/.osc/_files000066400000000000000000000001251475337502500253260ustar00rootroot00000000000000 osc-1.12.1/tests/fixtures/packages/osctest/projectB/pkgA/.osc/_osclib_version000066400000000000000000000000041475337502500272400ustar00rootroot000000000000001.0 osc-1.12.1/tests/fixtures/packages/osctest/projectB/pkgA/.osc/_package000066400000000000000000000000041475337502500256130ustar00rootroot00000000000000pkgAosc-1.12.1/tests/fixtures/packages/osctest/projectB/pkgA/.osc/_project000066400000000000000000000000101475337502500256630ustar00rootroot00000000000000projectBosc-1.12.1/tests/fixtures/packages/osctest/projectB/pkgB/000077500000000000000000000000001475337502500233035ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectB/pkgB/.osc/000077500000000000000000000000001475337502500241455ustar00rootroot00000000000000osc-1.12.1/tests/fixtures/packages/osctest/projectB/pkgB/.osc/_apiurl000066400000000000000000000000211475337502500255140ustar00rootroot00000000000000http://localhost 
osc-1.12.1/tests/fixtures/packages/osctest/projectB/pkgB/.osc/_files000066400000000000000000000001251475337502500253270ustar00rootroot00000000000000 osc-1.12.1/tests/fixtures/packages/osctest/projectB/pkgB/.osc/_osclib_version000066400000000000000000000000041475337502500272410ustar00rootroot000000000000001.0 osc-1.12.1/tests/fixtures/packages/osctest/projectB/pkgB/.osc/_package000066400000000000000000000000041475337502500256140ustar00rootroot00000000000000pkgBosc-1.12.1/tests/fixtures/packages/osctest/projectB/pkgB/.osc/_project000066400000000000000000000000101475337502500256640ustar00rootroot00000000000000projectBosc-1.12.1/tests/init_package_fixtures/000077500000000000000000000000001475337502500201045ustar00rootroot00000000000000osc-1.12.1/tests/init_package_fixtures/oscrc000066400000000000000000000001361475337502500211400ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 osc-1.12.1/tests/init_project_fixtures/000077500000000000000000000000001475337502500201575ustar00rootroot00000000000000osc-1.12.1/tests/init_project_fixtures/oscrc000066400000000000000000000001361475337502500212130ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 osc-1.12.1/tests/osc000077700000000000000000000000001475337502500151612../oscustar00rootroot00000000000000osc-1.12.1/tests/prdiff_fixtures/000077500000000000000000000000001475337502500167405ustar00rootroot00000000000000osc-1.12.1/tests/prdiff_fixtures/common-two-diff000066400000000000000000000004061475337502500216700ustar00rootroot00000000000000Index: common-two =================================================================== --- common-two 2013-01-18 19:18:38.225983117 +0000 +++ common-two 2013-01-18 19:19:27.882082325 +0000 @@ -1,4 +1,5 @@ line one line two line three +an extra line last line osc-1.12.1/tests/prdiff_fixtures/home:user:branches:some:project/000077500000000000000000000000001475337502500251205ustar00rootroot00000000000000osc-1.12.1/tests/prdiff_fixtures/home:user:branches:some:project/.osc/000077500000000000000000000000001475337502500257625ustar00rootroot00000000000000osc-1.12.1/tests/prdiff_fixtures/home:user:branches:some:project/.osc/_apiurl000066400000000000000000000000211475337502500273310ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/prdiff_fixtures/home:user:branches:some:project/.osc/_osclib_version000066400000000000000000000000041475337502500310560ustar00rootroot000000000000001.0 osc-1.12.1/tests/prdiff_fixtures/home:user:branches:some:project/.osc/_packages000066400000000000000000000002171475337502500276220ustar00rootroot00000000000000 osc-1.12.1/tests/prdiff_fixtures/home:user:branches:some:project/.osc/_project000066400000000000000000000000401475337502500275040ustar00rootroot00000000000000home:user:branches:some:project osc-1.12.1/tests/prdiff_fixtures/home:user:branches:some:project/common-two000066400000000000000000000000641475337502500271420ustar00rootroot00000000000000line one line two line three an extra line last lineosc-1.12.1/tests/prdiff_fixtures/home:user:branches:some:project/directory000066400000000000000000000002321475337502500270440ustar00rootroot00000000000000 osc-1.12.1/tests/prdiff_fixtures/new:prj/000077500000000000000000000000001475337502500203375ustar00rootroot00000000000000osc-1.12.1/tests/prdiff_fixtures/new:prj/common-two000066400000000000000000000000641475337502500223610ustar00rootroot00000000000000line one line two line three an extra line last 
lineosc-1.12.1/tests/prdiff_fixtures/new:prj/directory000066400000000000000000000002321475337502500222630ustar00rootroot00000000000000 osc-1.12.1/tests/prdiff_fixtures/no-requests000066400000000000000000000000471475337502500211510ustar00rootroot00000000000000 osc-1.12.1/tests/prdiff_fixtures/old:prj/000077500000000000000000000000001475337502500203245ustar00rootroot00000000000000osc-1.12.1/tests/prdiff_fixtures/old:prj/common-two000066400000000000000000000000461475337502500223460ustar00rootroot00000000000000line one line two line three last lineosc-1.12.1/tests/prdiff_fixtures/old:prj/directory000066400000000000000000000002321475337502500222500ustar00rootroot00000000000000 osc-1.12.1/tests/prdiff_fixtures/oscrc000066400000000000000000000001361475337502500177740ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 osc-1.12.1/tests/prdiff_fixtures/osctest/000077500000000000000000000000001475337502500204245ustar00rootroot00000000000000osc-1.12.1/tests/prdiff_fixtures/osctest/.osc/000077500000000000000000000000001475337502500212665ustar00rootroot00000000000000osc-1.12.1/tests/prdiff_fixtures/osctest/.osc/_apiurl000066400000000000000000000000211475337502500226350ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/prdiff_fixtures/osctest/.osc/_osclib_version000066400000000000000000000000041475337502500243620ustar00rootroot000000000000001.0 osc-1.12.1/tests/prdiff_fixtures/osctest/.osc/_packages000066400000000000000000000002171475337502500231260ustar00rootroot00000000000000 osc-1.12.1/tests/prdiff_fixtures/osctest/.osc/_project000066400000000000000000000000401475337502500230100ustar00rootroot00000000000000home:user:branches:some:project osc-1.12.1/tests/prdiff_fixtures/osctest/common-one/000077500000000000000000000000001475337502500224735ustar00rootroot00000000000000osc-1.12.1/tests/prdiff_fixtures/osctest/common-one/.osc/000077500000000000000000000000001475337502500233355ustar00rootroot00000000000000osc-1.12.1/tests/prdiff_fixtures/osctest/common-one/.osc/_apiurl000066400000000000000000000000211475337502500247040ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/prdiff_fixtures/osctest/common-one/.osc/_files000066400000000000000000000006401475337502500245210ustar00rootroot00000000000000 osc-1.12.1/tests/prdiff_fixtures/osctest/common-one/.osc/_meta000066400000000000000000000004411475337502500243440ustar00rootroot00000000000000 blah foo osc-1.12.1/tests/prdiff_fixtures/osctest/common-one/.osc/_osclib_version000066400000000000000000000000041475337502500264310ustar00rootroot000000000000001.0 osc-1.12.1/tests/prdiff_fixtures/osctest/common-one/.osc/_package000066400000000000000000000000121475337502500250030ustar00rootroot00000000000000common-oneosc-1.12.1/tests/prdiff_fixtures/osctest/common-one/.osc/_project000066400000000000000000000000401475337502500250570ustar00rootroot00000000000000home:user:branches:some:project osc-1.12.1/tests/prdiff_fixtures/osctest/common-one/.osc/common-one.spec000066400000000000000000000000301475337502500262510ustar00rootroot00000000000000contents are irrelevant osc-1.12.1/tests/prdiff_fixtures/osctest/common-one/common-one.spec000066400000000000000000000000301475337502500254070ustar00rootroot00000000000000contents are irrelevant 
osc-1.12.1/tests/prdiff_fixtures/osctest/common-two/000077500000000000000000000000001475337502500225235ustar00rootroot00000000000000osc-1.12.1/tests/prdiff_fixtures/osctest/common-two/.osc/000077500000000000000000000000001475337502500233655ustar00rootroot00000000000000osc-1.12.1/tests/prdiff_fixtures/osctest/common-two/.osc/_apiurl000066400000000000000000000000211475337502500247340ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/prdiff_fixtures/osctest/common-two/.osc/_files000066400000000000000000000006401475337502500245510ustar00rootroot00000000000000 osc-1.12.1/tests/prdiff_fixtures/osctest/common-two/.osc/_meta000066400000000000000000000004411475337502500243740ustar00rootroot00000000000000 blah foo osc-1.12.1/tests/prdiff_fixtures/osctest/common-two/.osc/_osclib_version000066400000000000000000000000041475337502500264610ustar00rootroot000000000000001.0 osc-1.12.1/tests/prdiff_fixtures/osctest/common-two/.osc/_package000066400000000000000000000000121475337502500250330ustar00rootroot00000000000000common-twoosc-1.12.1/tests/prdiff_fixtures/osctest/common-two/.osc/_project000066400000000000000000000000401475337502500251070ustar00rootroot00000000000000home:user:branches:some:project osc-1.12.1/tests/prdiff_fixtures/osctest/common-two/.osc/common-two.spec000066400000000000000000000000301475337502500263310ustar00rootroot00000000000000contents are irrelevant osc-1.12.1/tests/prdiff_fixtures/osctest/common-two/common-two.spec000066400000000000000000000000301475337502500254670ustar00rootroot00000000000000contents are irrelevant osc-1.12.1/tests/prdiff_fixtures/request000066400000000000000000000010221475337502500203460ustar00rootroot00000000000000 update - Fix it to work - Improve support for something osc-1.12.1/tests/prdiff_fixtures/some:project/000077500000000000000000000000001475337502500213645ustar00rootroot00000000000000osc-1.12.1/tests/prdiff_fixtures/some:project/.osc/000077500000000000000000000000001475337502500222265ustar00rootroot00000000000000osc-1.12.1/tests/prdiff_fixtures/some:project/.osc/_apiurl000066400000000000000000000000211475337502500235750ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/prdiff_fixtures/some:project/.osc/_osclib_version000066400000000000000000000000041475337502500253220ustar00rootroot000000000000001.0 osc-1.12.1/tests/prdiff_fixtures/some:project/.osc/_packages000066400000000000000000000001741475337502500240700ustar00rootroot00000000000000 osc-1.12.1/tests/prdiff_fixtures/some:project/.osc/_project000066400000000000000000000000151475337502500237520ustar00rootroot00000000000000some:project osc-1.12.1/tests/prdiff_fixtures/some:project/common-two000066400000000000000000000000641475337502500234060ustar00rootroot00000000000000line one line two line three an extra line last lineosc-1.12.1/tests/prdiff_fixtures/some:project/directory000066400000000000000000000002321475337502500233100ustar00rootroot00000000000000 osc-1.12.1/tests/project_package_status_fixtures/000077500000000000000000000000001475337502500222125ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/oscrc000066400000000000000000000001361475337502500232460ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 
osc-1.12.1/tests/project_package_status_fixtures/osctest/000077500000000000000000000000001475337502500236765ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/.osc/000077500000000000000000000000001475337502500245405ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/.osc/_apiurl000066400000000000000000000000211475337502500261070ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/project_package_status_fixtures/osctest/.osc/_osclib_version000066400000000000000000000000041475337502500276340ustar00rootroot000000000000001.0 osc-1.12.1/tests/project_package_status_fixtures/osctest/.osc/_packages000066400000000000000000000005011475337502500263740ustar00rootroot00000000000000 osc-1.12.1/tests/project_package_status_fixtures/osctest/.osc/_project000066400000000000000000000000101475337502500262570ustar00rootroot00000000000000osctest osc-1.12.1/tests/project_package_status_fixtures/osctest/added/000077500000000000000000000000001475337502500247375ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/added/.osc/000077500000000000000000000000001475337502500256015ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/added/.osc/_apiurl000066400000000000000000000000211475337502500271500ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/project_package_status_fixtures/osctest/added/.osc/_files000066400000000000000000000000161475337502500267620ustar00rootroot00000000000000 osc-1.12.1/tests/project_package_status_fixtures/osctest/added/.osc/_osclib_version000066400000000000000000000000041475337502500306750ustar00rootroot000000000000001.0 osc-1.12.1/tests/project_package_status_fixtures/osctest/added/.osc/_package000066400000000000000000000000061475337502500272520ustar00rootroot00000000000000added osc-1.12.1/tests/project_package_status_fixtures/osctest/added/.osc/_project000066400000000000000000000000101475337502500273200ustar00rootroot00000000000000osctest osc-1.12.1/tests/project_package_status_fixtures/osctest/added/.osc/_to_be_added000066400000000000000000000000041475337502500300660ustar00rootroot00000000000000new osc-1.12.1/tests/project_package_status_fixtures/osctest/added/exists000066400000000000000000000000001475337502500261670ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/added/new000066400000000000000000000000041475337502500254450ustar00rootroot00000000000000new osc-1.12.1/tests/project_package_status_fixtures/osctest/conflict/000077500000000000000000000000001475337502500254775ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/conflict/.osc/000077500000000000000000000000001475337502500263415ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/conflict/.osc/_apiurl000066400000000000000000000000211475337502500277100ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/project_package_status_fixtures/osctest/conflict/.osc/_files000066400000000000000000000004371475337502500275310ustar00rootroot00000000000000 osc-1.12.1/tests/project_package_status_fixtures/osctest/conflict/.osc/_in_conflict000066400000000000000000000000111475337502500307020ustar00rootroot00000000000000conflict osc-1.12.1/tests/project_package_status_fixtures/osctest/conflict/.osc/_osclib_version000066400000000000000000000000041475337502500314350ustar00rootroot000000000000001.0 
osc-1.12.1/tests/project_package_status_fixtures/osctest/conflict/.osc/_package000066400000000000000000000000111475337502500300060ustar00rootroot00000000000000conflict osc-1.12.1/tests/project_package_status_fixtures/osctest/conflict/.osc/_project000066400000000000000000000000071475337502500300660ustar00rootroot00000000000000osctestosc-1.12.1/tests/project_package_status_fixtures/osctest/conflict/.osc/conflict000066400000000000000000000000261475337502500300630ustar00rootroot00000000000000This file did change. osc-1.12.1/tests/project_package_status_fixtures/osctest/conflict/.osc/test000066400000000000000000000000051475337502500272360ustar00rootroot00000000000000test osc-1.12.1/tests/project_package_status_fixtures/osctest/conflict/conflict000066400000000000000000000000131475337502500272150ustar00rootroot00000000000000Inconflict osc-1.12.1/tests/project_package_status_fixtures/osctest/conflict/exists000066400000000000000000000000001475337502500267270ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/conflict/test000066400000000000000000000000051475337502500263740ustar00rootroot00000000000000test osc-1.12.1/tests/project_package_status_fixtures/osctest/deleted/000077500000000000000000000000001475337502500253045ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/deleted/.osc/000077500000000000000000000000001475337502500261465ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/deleted/.osc/_apiurl000066400000000000000000000000211475337502500275150ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/project_package_status_fixtures/osctest/deleted/.osc/_files000066400000000000000000000004371475337502500273360ustar00rootroot00000000000000 osc-1.12.1/tests/project_package_status_fixtures/osctest/deleted/.osc/_osclib_version000066400000000000000000000000041475337502500312420ustar00rootroot000000000000001.0 osc-1.12.1/tests/project_package_status_fixtures/osctest/deleted/.osc/_package000066400000000000000000000000101475337502500276120ustar00rootroot00000000000000deleted osc-1.12.1/tests/project_package_status_fixtures/osctest/deleted/.osc/_project000066400000000000000000000000071475337502500276730ustar00rootroot00000000000000osctestosc-1.12.1/tests/project_package_status_fixtures/osctest/deleted/.osc/_to_be_deleted000066400000000000000000000000161475337502500310030ustar00rootroot00000000000000modified test osc-1.12.1/tests/project_package_status_fixtures/osctest/deleted/.osc/modified000066400000000000000000000000261475337502500276470ustar00rootroot00000000000000This file did change. 
osc-1.12.1/tests/project_package_status_fixtures/osctest/deleted/.osc/test000066400000000000000000000000051475337502500270430ustar00rootroot00000000000000test osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/000077500000000000000000000000001475337502500254735ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/.osc/000077500000000000000000000000001475337502500263355ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/.osc/_apiurl000066400000000000000000000000211475337502500277040ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/.osc/_files000066400000000000000000000004371475337502500275250ustar00rootroot00000000000000 osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/.osc/_osclib_version000066400000000000000000000000041475337502500314310ustar00rootroot000000000000001.0 osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/.osc/_package000066400000000000000000000000111475337502500300020ustar00rootroot00000000000000excluded osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/.osc/_project000066400000000000000000000000071475337502500300620ustar00rootroot00000000000000osctestosc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/.osc/modified000066400000000000000000000000261475337502500300360ustar00rootroot00000000000000This file did change. osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/.osc/test000066400000000000000000000000051475337502500272320ustar00rootroot00000000000000test osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/_linkerror000066400000000000000000000000001475337502500275520ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/dir/000077500000000000000000000000001475337502500262515ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/dir/file000066400000000000000000000000051475337502500271060ustar00rootroot00000000000000file osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/exists000066400000000000000000000000001475337502500267230ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/foo.orig000066400000000000000000000000001475337502500271260ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/modified000066400000000000000000000000111475337502500271660ustar00rootroot00000000000000modified osc-1.12.1/tests/project_package_status_fixtures/osctest/excluded/test000066400000000000000000000000051475337502500263700ustar00rootroot00000000000000test osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/000077500000000000000000000000001475337502500251675ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/.osc/000077500000000000000000000000001475337502500260315ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/.osc/_apiurl000066400000000000000000000000211475337502500274000ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/.osc/_files000066400000000000000000000012421475337502500272140ustar00rootroot00000000000000 osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/.osc/_osclib_version000066400000000000000000000000041475337502500311250ustar00rootroot000000000000001.0 
osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/.osc/_package000066400000000000000000000000071475337502500275030ustar00rootroot00000000000000simple osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/.osc/_project000066400000000000000000000000071475337502500275560ustar00rootroot00000000000000osctestosc-1.12.1/tests/project_package_status_fixtures/osctest/simple/.osc/_to_be_added000066400000000000000000000000321475337502500303170ustar00rootroot00000000000000add missing missing_added osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/.osc/_to_be_deleted000066400000000000000000000000041475337502500306630ustar00rootroot00000000000000foo osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/.osc/foo000066400000000000000000000000271475337502500265360ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/.osc/merge000066400000000000000000000000601475337502500270470ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/.osc/missing000066400000000000000000000000101475337502500274140ustar00rootroot00000000000000missing osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/.osc/nochange000066400000000000000000000000311475337502500275300ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/.osc/test000066400000000000000000000000051475337502500267260ustar00rootroot00000000000000test osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/add000066400000000000000000000000131475337502500256340ustar00rootroot00000000000000added file osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/exists000066400000000000000000000000001475337502500264170ustar00rootroot00000000000000osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/missing000066400000000000000000000000111475337502500265530ustar00rootroot00000000000000replaced osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/nochange000066400000000000000000000000261475337502500266720ustar00rootroot00000000000000This file did change. 
osc-1.12.1/tests/project_package_status_fixtures/osctest/simple/test000066400000000000000000000000051475337502500260640ustar00rootroot00000000000000test osc-1.12.1/tests/repairwc_fixtures/000077500000000000000000000000001475337502500173025ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/oscrc000066400000000000000000000001361475337502500203360ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 osc-1.12.1/tests/repairwc_fixtures/osctest/000077500000000000000000000000001475337502500207665ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/.osc/000077500000000000000000000000001475337502500216305ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/.osc/_apiurl000066400000000000000000000000211475337502500231770ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/repairwc_fixtures/osctest/.osc/_osclib_version000066400000000000000000000000041475337502500247240ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/.osc/_packages000066400000000000000000000000331475337502500234640ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/.osc/_project000066400000000000000000000000101475337502500233470ustar00rootroot00000000000000osctest osc-1.12.1/tests/repairwc_fixtures/osctest/_packages000066400000000000000000000000331475337502500226220ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/000077500000000000000000000000001475337502500231105ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/.osc/000077500000000000000000000000001475337502500237525ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/.osc/_apiurl000066400000000000000000000000211475337502500253210ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/.osc/_buildconfig_prj_arch000066400000000000000000000000001475337502500301570ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/.osc/_buildinfo_prj_arch.xml000066400000000000000000000000001475337502500304440ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/.osc/_files000066400000000000000000000005761475337502500251460ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/.osc/_in_conflict000066400000000000000000000000111475337502500263130ustar00rootroot00000000000000nochange osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/.osc/_osclib_version000066400000000000000000000000041475337502500270460ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/.osc/_package000066400000000000000000000000131475337502500254210ustar00rootroot00000000000000buildfiles osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/.osc/_project000066400000000000000000000000071475337502500254770ustar00rootroot00000000000000osctestosc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/.osc/_to_be_added000066400000000000000000000000071475337502500262420ustar00rootroot00000000000000foobar osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/.osc/_to_be_deleted000066400000000000000000000000041475337502500266040ustar00rootroot00000000000000foo osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/.osc/foo000066400000000000000000000000271475337502500244570ustar00rootroot00000000000000This is a simple test. 
osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/.osc/merge000066400000000000000000000000601475337502500247700ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/.osc/nochange000066400000000000000000000000311475337502500254510ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/foobar000066400000000000000000000000001475337502500242710ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/merge000066400000000000000000000000601475337502500241260ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/nochange000066400000000000000000000000511475337502500246110ustar00rootroot00000000000000This file didn't change but is modified. osc-1.12.1/tests/repairwc_fixtures/osctest/buildfiles/toadd1000066400000000000000000000000071475337502500242040ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/repairwc_fixtures/osctest/invalid_apiurl/000077500000000000000000000000001475337502500237705ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/invalid_apiurl/.osc/000077500000000000000000000000001475337502500246325ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/invalid_apiurl/.osc/_apiurl000066400000000000000000000000311475337502500262020ustar00rootroot00000000000000urlwithoutprotocolandtld osc-1.12.1/tests/repairwc_fixtures/osctest/invalid_apiurl/.osc/_files000066400000000000000000000001371475337502500260170ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/invalid_apiurl/.osc/_meta000066400000000000000000000004201475337502500256360ustar00rootroot00000000000000 Title of New Package LONG DESCRIPTION GOES HERE PUT_UPSTREAM_URL_HERE osc-1.12.1/tests/repairwc_fixtures/osctest/invalid_apiurl/.osc/_osclib_version000066400000000000000000000000041475337502500277260ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/invalid_apiurl/.osc/_package000066400000000000000000000000171475337502500263050ustar00rootroot00000000000000invalid_apiurl osc-1.12.1/tests/repairwc_fixtures/osctest/invalid_apiurl/.osc/_project000066400000000000000000000000071475337502500263570ustar00rootroot00000000000000remote osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/000077500000000000000000000000001475337502500226215ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/.osc/000077500000000000000000000000001475337502500234635ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/.osc/_apiurl000066400000000000000000000000211475337502500250320ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/.osc/_files000066400000000000000000000005741475337502500246550ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/.osc/_in_conflict000066400000000000000000000000111475337502500260240ustar00rootroot00000000000000nochange osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/.osc/_osclib_version000066400000000000000000000000041475337502500265570ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/.osc/_package000066400000000000000000000000111475337502500251300ustar00rootroot00000000000000multiple 
osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/.osc/_project000066400000000000000000000000071475337502500252100ustar00rootroot00000000000000osctestosc-1.12.1/tests/repairwc_fixtures/osctest/multiple/.osc/_to_be_added000066400000000000000000000000071475337502500257530ustar00rootroot00000000000000foobar osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/.osc/_to_be_deleted000066400000000000000000000000211475337502500263140ustar00rootroot00000000000000foo nofilesentry osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/.osc/foo000066400000000000000000000000271475337502500241700ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/.osc/unknown_file000066400000000000000000000000001475337502500260720ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/foobar000066400000000000000000000000001475337502500240020ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/merge000066400000000000000000000000601475337502500236370ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/nochange000066400000000000000000000000511475337502500243220ustar00rootroot00000000000000This file didn't change but is modified. osc-1.12.1/tests/repairwc_fixtures/osctest/multiple/toadd1000066400000000000000000000000071475337502500237150ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/000077500000000000000000000000001475337502500226175ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/.osc/000077500000000000000000000000001475337502500234615ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/.osc/_files000066400000000000000000000005741475337502500246530ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/.osc/_in_conflict000066400000000000000000000000111475337502500260220ustar00rootroot00000000000000nochange osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/.osc/_osclib_version000066400000000000000000000000041475337502500265550ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/.osc/_package000066400000000000000000000000111475337502500251260ustar00rootroot00000000000000noapiurl osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/.osc/_project000066400000000000000000000000071475337502500252060ustar00rootroot00000000000000osctestosc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/.osc/_to_be_added000066400000000000000000000000071475337502500257510ustar00rootroot00000000000000foobar osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/.osc/_to_be_deleted000066400000000000000000000000041475337502500263130ustar00rootroot00000000000000foo osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/.osc/foo000066400000000000000000000000271475337502500241660ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/.osc/merge000066400000000000000000000000601475337502500244770ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/.osc/nochange000066400000000000000000000000311475337502500251600ustar00rootroot00000000000000This file didn't change. 
osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/foobar000066400000000000000000000000001475337502500240000ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/merge000066400000000000000000000000601475337502500236350ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/nochange000066400000000000000000000000511475337502500243200ustar00rootroot00000000000000This file didn't change but is modified. osc-1.12.1/tests/repairwc_fixtures/osctest/noapiurl/toadd1000066400000000000000000000000071475337502500237130ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/repairwc_fixtures/osctest/simple/000077500000000000000000000000001475337502500222575ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple/.osc/000077500000000000000000000000001475337502500231215ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple/.osc/_apiurl000066400000000000000000000000211475337502500244700ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/repairwc_fixtures/osctest/simple/.osc/_files000066400000000000000000000005711475337502500243100ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/simple/.osc/_osclib_version000066400000000000000000000000041475337502500262150ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/simple/.osc/_package000066400000000000000000000000061475337502500245720ustar00rootroot00000000000000simpleosc-1.12.1/tests/repairwc_fixtures/osctest/simple/.osc/_project000066400000000000000000000000071475337502500246460ustar00rootroot00000000000000osctestosc-1.12.1/tests/repairwc_fixtures/osctest/simple/.osc/_to_be_deleted000066400000000000000000000000041475337502500257530ustar00rootroot00000000000000foo osc-1.12.1/tests/repairwc_fixtures/osctest/simple/.osc/foo000066400000000000000000000000271475337502500236260ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/repairwc_fixtures/osctest/simple/.osc/merge000066400000000000000000000000601475337502500241370ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple/.osc/nochange000066400000000000000000000000311475337502500246200ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/repairwc_fixtures/osctest/simple/merge000066400000000000000000000000601475337502500232750ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple/nochange000066400000000000000000000000511475337502500237600ustar00rootroot00000000000000This file didn't change but is modified. 
osc-1.12.1/tests/repairwc_fixtures/osctest/simple/toadd1000066400000000000000000000000071475337502500233530ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/repairwc_fixtures/osctest/simple/toadd2000066400000000000000000000000071475337502500233540ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/repairwc_fixtures/osctest/simple1/000077500000000000000000000000001475337502500223405ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple1/.osc/000077500000000000000000000000001475337502500232025ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple1/.osc/_apiurl000066400000000000000000000000211475337502500245510ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/repairwc_fixtures/osctest/simple1/.osc/_files000066400000000000000000000005711475337502500243710ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/simple1/.osc/_osclib_version000066400000000000000000000000041475337502500262760ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/simple1/.osc/_package000066400000000000000000000000101475337502500246460ustar00rootroot00000000000000simple1 osc-1.12.1/tests/repairwc_fixtures/osctest/simple1/.osc/_project000066400000000000000000000000071475337502500247270ustar00rootroot00000000000000osctestosc-1.12.1/tests/repairwc_fixtures/osctest/simple1/.osc/_to_be_deleted000066400000000000000000000000041475337502500260340ustar00rootroot00000000000000foo osc-1.12.1/tests/repairwc_fixtures/osctest/simple1/.osc/merge000066400000000000000000000000601475337502500242200ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple1/.osc/nochange000066400000000000000000000000311475337502500247010ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/repairwc_fixtures/osctest/simple1/merge000066400000000000000000000000601475337502500233560ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple1/nochange000066400000000000000000000000511475337502500240410ustar00rootroot00000000000000This file didn't change but is modified. 
osc-1.12.1/tests/repairwc_fixtures/osctest/simple1/toadd1000066400000000000000000000000071475337502500234340ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/repairwc_fixtures/osctest/simple1/toadd2000066400000000000000000000000071475337502500234350ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/000077500000000000000000000000001475337502500223415ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/.osc/000077500000000000000000000000001475337502500232035ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/.osc/_apiurl000066400000000000000000000000211475337502500245520ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/.osc/_files000066400000000000000000000005711475337502500243720ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/.osc/_osclib_version000066400000000000000000000000041475337502500262770ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/.osc/_package000066400000000000000000000000101475337502500246470ustar00rootroot00000000000000simple2 osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/.osc/_project000066400000000000000000000000071475337502500247300ustar00rootroot00000000000000osctestosc-1.12.1/tests/repairwc_fixtures/osctest/simple2/.osc/_to_be_deleted000066400000000000000000000000041475337502500260350ustar00rootroot00000000000000foo osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/.osc/foo000066400000000000000000000000271475337502500237100ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/.osc/merge000066400000000000000000000000601475337502500242210ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/.osc/nochange000066400000000000000000000000311475337502500247020ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/.osc/somefile000066400000000000000000000000111475337502500247210ustar00rootroot00000000000000somefile osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/merge000066400000000000000000000000601475337502500233570ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/nochange000066400000000000000000000000511475337502500240420ustar00rootroot00000000000000This file didn't change but is modified. 
osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/toadd1000066400000000000000000000000071475337502500234350ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/repairwc_fixtures/osctest/simple2/toadd2000066400000000000000000000000071475337502500234360ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/000077500000000000000000000000001475337502500223425ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/.osc/000077500000000000000000000000001475337502500232045ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/.osc/_apiurl000066400000000000000000000000211475337502500245530ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/.osc/_files000066400000000000000000000005731475337502500243750ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/.osc/_osclib_version000066400000000000000000000000041475337502500263000ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/.osc/_package000066400000000000000000000000101475337502500246500ustar00rootroot00000000000000simple3 osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/.osc/_project000066400000000000000000000000071475337502500247310ustar00rootroot00000000000000osctestosc-1.12.1/tests/repairwc_fixtures/osctest/simple3/.osc/_to_be_added000066400000000000000000000000071475337502500254740ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/.osc/_to_be_deleted000066400000000000000000000000041475337502500260360ustar00rootroot00000000000000foo osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/.osc/foo000066400000000000000000000000271475337502500237110ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/.osc/merge000066400000000000000000000000601475337502500242220ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/.osc/nochange000066400000000000000000000000311475337502500247030ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/.osc/toadd1000066400000000000000000000000001475337502500242710ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/merge000066400000000000000000000000601475337502500233600ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/nochange000066400000000000000000000000511475337502500240430ustar00rootroot00000000000000This file didn't change but is modified. 
osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/toadd1000066400000000000000000000000071475337502500234360ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/repairwc_fixtures/osctest/simple3/toadd2000066400000000000000000000000071475337502500234370ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/repairwc_fixtures/osctest/simple4/000077500000000000000000000000001475337502500223435ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple4/.osc/000077500000000000000000000000001475337502500232055ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple4/.osc/_apiurl000066400000000000000000000000211475337502500245540ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/repairwc_fixtures/osctest/simple4/.osc/_files000066400000000000000000000006041475337502500243710ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/simple4/.osc/_osclib_version000066400000000000000000000000041475337502500263010ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/simple4/.osc/_package000066400000000000000000000000211475337502500246530ustar00rootroot00000000000000working_nonempty osc-1.12.1/tests/repairwc_fixtures/osctest/simple4/.osc/_project000066400000000000000000000000071475337502500247320ustar00rootroot00000000000000osctestosc-1.12.1/tests/repairwc_fixtures/osctest/simple4/.osc/_to_be_deleted000066400000000000000000000000131475337502500260370ustar00rootroot00000000000000foo remove osc-1.12.1/tests/repairwc_fixtures/osctest/simple4/.osc/foo000066400000000000000000000000271475337502500237120ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/repairwc_fixtures/osctest/simple4/.osc/merge000066400000000000000000000000601475337502500242230ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple4/.osc/nochange000066400000000000000000000000311475337502500247040ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/repairwc_fixtures/osctest/simple4/merge000066400000000000000000000000601475337502500233610ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple4/nochange000066400000000000000000000000511475337502500240440ustar00rootroot00000000000000This file didn't change but is modified. 
osc-1.12.1/tests/repairwc_fixtures/osctest/simple4/toadd1000066400000000000000000000000071475337502500234370ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/repairwc_fixtures/osctest/simple5/000077500000000000000000000000001475337502500223445ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple5/.osc/000077500000000000000000000000001475337502500232065ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple5/.osc/_apiurl000066400000000000000000000000211475337502500245550ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/repairwc_fixtures/osctest/simple5/.osc/_files000066400000000000000000000006041475337502500243720ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/simple5/.osc/_in_conflict000066400000000000000000000000111475337502500255470ustar00rootroot00000000000000conflict osc-1.12.1/tests/repairwc_fixtures/osctest/simple5/.osc/_osclib_version000066400000000000000000000000041475337502500263020ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/simple5/.osc/_package000066400000000000000000000000211475337502500246540ustar00rootroot00000000000000working_nonempty osc-1.12.1/tests/repairwc_fixtures/osctest/simple5/.osc/_project000066400000000000000000000000071475337502500247330ustar00rootroot00000000000000osctestosc-1.12.1/tests/repairwc_fixtures/osctest/simple5/.osc/_to_be_deleted000066400000000000000000000000041475337502500260400ustar00rootroot00000000000000foo osc-1.12.1/tests/repairwc_fixtures/osctest/simple5/.osc/foo000066400000000000000000000000271475337502500237130ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/repairwc_fixtures/osctest/simple5/.osc/merge000066400000000000000000000000601475337502500242240ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple5/.osc/nochange000066400000000000000000000000311475337502500247050ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/repairwc_fixtures/osctest/simple5/merge000066400000000000000000000000601475337502500233620ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple5/nochange000066400000000000000000000000511475337502500240450ustar00rootroot00000000000000This file didn't change but is modified. 
osc-1.12.1/tests/repairwc_fixtures/osctest/simple5/toadd1000066400000000000000000000000071475337502500234400ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/repairwc_fixtures/osctest/simple6/000077500000000000000000000000001475337502500223455ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple6/.osc/000077500000000000000000000000001475337502500232075ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple6/.osc/_apiurl000066400000000000000000000000211475337502500245560ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/repairwc_fixtures/osctest/simple6/.osc/_files000066400000000000000000000005731475337502500244000ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/simple6/.osc/_osclib_version000066400000000000000000000000041475337502500263030ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/simple6/.osc/_package000066400000000000000000000000101475337502500246530ustar00rootroot00000000000000simple6 osc-1.12.1/tests/repairwc_fixtures/osctest/simple6/.osc/_project000066400000000000000000000000071475337502500247340ustar00rootroot00000000000000osctestosc-1.12.1/tests/repairwc_fixtures/osctest/simple6/.osc/_to_be_added000066400000000000000000000000071475337502500254770ustar00rootroot00000000000000foobar osc-1.12.1/tests/repairwc_fixtures/osctest/simple6/.osc/_to_be_deleted000066400000000000000000000000041475337502500260410ustar00rootroot00000000000000foo osc-1.12.1/tests/repairwc_fixtures/osctest/simple6/.osc/merge000066400000000000000000000000601475337502500242250ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple6/.osc/nochange000066400000000000000000000000311475337502500247060ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/repairwc_fixtures/osctest/simple6/merge000066400000000000000000000000601475337502500233630ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple6/nochange000066400000000000000000000000511475337502500240460ustar00rootroot00000000000000This file didn't change but is modified. 
osc-1.12.1/tests/repairwc_fixtures/osctest/simple6/toadd1000066400000000000000000000000071475337502500234410ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/000077500000000000000000000000001475337502500223465ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/.osc/000077500000000000000000000000001475337502500232105ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/.osc/_apiurl000066400000000000000000000000211475337502500245570ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/.osc/_files000066400000000000000000000007511475337502500243770ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/.osc/_in_conflict000066400000000000000000000000111475337502500255510ustar00rootroot00000000000000nochange osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/.osc/_osclib_version000066400000000000000000000000041475337502500263040ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/.osc/_package000066400000000000000000000000101475337502500246540ustar00rootroot00000000000000simple7 osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/.osc/_project000066400000000000000000000000071475337502500247350ustar00rootroot00000000000000osctestosc-1.12.1/tests/repairwc_fixtures/osctest/simple7/.osc/_to_be_added000066400000000000000000000000071475337502500255000ustar00rootroot00000000000000foobar osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/.osc/_to_be_deleted000066400000000000000000000000041475337502500260420ustar00rootroot00000000000000foo osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/.osc/foo000066400000000000000000000000271475337502500237150ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/.osc/merge000066400000000000000000000000601475337502500242260ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/.osc/nochange000066400000000000000000000000311475337502500247070ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/foobar000066400000000000000000000000001475337502500235270ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/merge000066400000000000000000000000601475337502500233640ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/nochange000066400000000000000000000000511475337502500240470ustar00rootroot00000000000000This file didn't change but is modified. 
osc-1.12.1/tests/repairwc_fixtures/osctest/simple7/toadd1000066400000000000000000000000071475337502500234420ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/000077500000000000000000000000001475337502500223475ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/.osc/000077500000000000000000000000001475337502500232115ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/.osc/_apiurl000066400000000000000000000000211475337502500245600ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/.osc/_files000066400000000000000000000007511475337502500244000ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/.osc/_osclib_version000066400000000000000000000000041475337502500263050ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/.osc/_package000066400000000000000000000000101475337502500246550ustar00rootroot00000000000000simple8 osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/.osc/_project000066400000000000000000000000071475337502500247360ustar00rootroot00000000000000osctestosc-1.12.1/tests/repairwc_fixtures/osctest/simple8/.osc/_to_be_added000066400000000000000000000000071475337502500255010ustar00rootroot00000000000000foobar osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/.osc/_to_be_deleted000066400000000000000000000000041475337502500260430ustar00rootroot00000000000000foo osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/.osc/foo000066400000000000000000000000271475337502500237160ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/.osc/merge000066400000000000000000000000601475337502500242270ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/.osc/nochange000066400000000000000000000000311475337502500247100ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/.osc/skipped000066400000000000000000000000001475337502500245610ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/merge000066400000000000000000000000601475337502500233650ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/nochange000066400000000000000000000000511475337502500240500ustar00rootroot00000000000000This file didn't change but is modified. 
osc-1.12.1/tests/repairwc_fixtures/osctest/simple8/toadd1000066400000000000000000000000071475337502500234430ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/repairwc_fixtures/osctest/working_empty/000077500000000000000000000000001475337502500236645ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/working_empty/.osc/000077500000000000000000000000001475337502500245265ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/working_empty/.osc/_apiurl000066400000000000000000000000211475337502500260750ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/repairwc_fixtures/osctest/working_empty/.osc/_files000066400000000000000000000000161475337502500257070ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/working_empty/.osc/_osclib_version000066400000000000000000000000041475337502500276220ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/working_empty/.osc/_package000066400000000000000000000000161475337502500262000ustar00rootroot00000000000000working_empty osc-1.12.1/tests/repairwc_fixtures/osctest/working_empty/.osc/_project000066400000000000000000000000101475337502500262450ustar00rootroot00000000000000osctest osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/000077500000000000000000000000001475337502500243775ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/.osc/000077500000000000000000000000001475337502500252415ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/.osc/_apiurl000066400000000000000000000000211475337502500266100ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/.osc/_files000066400000000000000000000006041475337502500264250ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/.osc/_in_conflict000066400000000000000000000000111475337502500276020ustar00rootroot00000000000000nochange osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/.osc/_osclib_version000066400000000000000000000000041475337502500303350ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/.osc/_package000066400000000000000000000000211475337502500267070ustar00rootroot00000000000000working_nonempty osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/.osc/_project000066400000000000000000000000071475337502500267660ustar00rootroot00000000000000osctestosc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/.osc/_to_be_added000066400000000000000000000000071475337502500275310ustar00rootroot00000000000000foobar osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/.osc/_to_be_deleted000066400000000000000000000000041475337502500300730ustar00rootroot00000000000000foo osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/.osc/foo000066400000000000000000000000271475337502500257460ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/.osc/merge000066400000000000000000000000601475337502500262570ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/.osc/nochange000066400000000000000000000000311475337502500267400ustar00rootroot00000000000000This file didn't change. 
osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/foobar000066400000000000000000000000001475337502500255600ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/merge000066400000000000000000000000601475337502500254150ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/nochange000066400000000000000000000000511475337502500261000ustar00rootroot00000000000000This file didn't change but is modified. osc-1.12.1/tests/repairwc_fixtures/osctest/working_nonempty/toadd1000066400000000000000000000000071475337502500254730ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/repairwc_fixtures/prj_invalidapiurl/000077500000000000000000000000001475337502500230205ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/prj_invalidapiurl/.osc/000077500000000000000000000000001475337502500236625ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/prj_invalidapiurl/.osc/_apiurl000066400000000000000000000000211475337502500252310ustar00rootroot00000000000000noschemeandnotld osc-1.12.1/tests/repairwc_fixtures/prj_invalidapiurl/.osc/_osclib_version000066400000000000000000000000041475337502500267560ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/prj_invalidapiurl/.osc/_packages000066400000000000000000000000401475337502500255140ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/prj_invalidapiurl/.osc/_project000066400000000000000000000000221475337502500254040ustar00rootroot00000000000000prj_invalidapiurl osc-1.12.1/tests/repairwc_fixtures/prj_noapiurl/000077500000000000000000000000001475337502500220065ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/prj_noapiurl/.osc/000077500000000000000000000000001475337502500226505ustar00rootroot00000000000000osc-1.12.1/tests/repairwc_fixtures/prj_noapiurl/.osc/_osclib_version000066400000000000000000000000041475337502500257440ustar00rootroot000000000000001.0 osc-1.12.1/tests/repairwc_fixtures/prj_noapiurl/.osc/_packages000066400000000000000000000000401475337502500245020ustar00rootroot00000000000000 osc-1.12.1/tests/repairwc_fixtures/prj_noapiurl/.osc/_project000066400000000000000000000000151475337502500243740ustar00rootroot00000000000000prj_noapiurl osc-1.12.1/tests/request_fixtures/000077500000000000000000000000001475337502500171565ustar00rootroot00000000000000osc-1.12.1/tests/request_fixtures/oscrc000066400000000000000000000001361475337502500202120ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 osc-1.12.1/tests/request_fixtures/test_read_request1.xml000066400000000000000000000010601475337502500235000ustar00rootroot00000000000000 Create Request foobar title of the request this is a very long description osc-1.12.1/tests/request_fixtures/test_read_request2.xml000066400000000000000000000013131475337502500235020ustar00rootroot00000000000000 cleanup 1 review start Created request osc-1.12.1/tests/request_fixtures/test_request_list_view1.xml000066400000000000000000000020451475337502500245760ustar00rootroot00000000000000 osc-1.12.1/tests/request_fixtures/test_request_list_view2.xml000066400000000000000000000011721475337502500245770ustar00rootroot00000000000000 Created Request Review Approved This is a simple request with a lot of ... ... text and other stuff. This request also contains a description. This is useful to describe the request. 
blabla blabla osc-1.12.1/tests/request_fixtures/test_request_str1.xml000066400000000000000000000020111475337502500233720ustar00rootroot00000000000000 cleanup 1 currently in review review start accepted just a samll description in order to describe this request - blablabla test. osc-1.12.1/tests/results_fixtures/000077500000000000000000000000001475337502500171675ustar00rootroot00000000000000osc-1.12.1/tests/results_fixtures/oscrc000066400000000000000000000001361475337502500202230ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 osc-1.12.1/tests/results_fixtures/result-dirty.xml000066400000000000000000000016001475337502500223550ustar00rootroot00000000000000 osc-1.12.1/tests/results_fixtures/result.xml000066400000000000000000000014301475337502500212250ustar00rootroot00000000000000 osc-1.12.1/tests/revertfile_fixtures/000077500000000000000000000000001475337502500176355ustar00rootroot00000000000000osc-1.12.1/tests/revertfile_fixtures/oscrc000066400000000000000000000001361475337502500206710ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 osc-1.12.1/tests/revertfile_fixtures/osctest/000077500000000000000000000000001475337502500213215ustar00rootroot00000000000000osc-1.12.1/tests/revertfile_fixtures/osctest/.osc/000077500000000000000000000000001475337502500221635ustar00rootroot00000000000000osc-1.12.1/tests/revertfile_fixtures/osctest/.osc/_apiurl000066400000000000000000000000211475337502500235320ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/revertfile_fixtures/osctest/.osc/_osclib_version000066400000000000000000000000041475337502500252570ustar00rootroot000000000000001.0 osc-1.12.1/tests/revertfile_fixtures/osctest/.osc/_packages000066400000000000000000000000331475337502500240170ustar00rootroot00000000000000 osc-1.12.1/tests/revertfile_fixtures/osctest/.osc/_project000066400000000000000000000000101475337502500237020ustar00rootroot00000000000000osctest osc-1.12.1/tests/revertfile_fixtures/osctest/simple/000077500000000000000000000000001475337502500226125ustar00rootroot00000000000000osc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/000077500000000000000000000000001475337502500234545ustar00rootroot00000000000000osc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/_apiurl000066400000000000000000000000211475337502500250230ustar00rootroot00000000000000http://localhost osc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/_files000066400000000000000000000015461475337502500246460ustar00rootroot00000000000000 osc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/_in_conflict000066400000000000000000000000041475337502500260170ustar00rootroot00000000000000foo osc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/_osclib_version000066400000000000000000000000041475337502500265500ustar00rootroot000000000000001.0 osc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/_package000066400000000000000000000000061475337502500251250ustar00rootroot00000000000000simpleosc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/_project000066400000000000000000000000071475337502500252010ustar00rootroot00000000000000osctestosc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/_to_be_added000066400000000000000000000000351475337502500257450ustar00rootroot00000000000000toadd1 replaced addedmissing 
osc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/_to_be_deleted000066400000000000000000000000211475337502500263050ustar00rootroot00000000000000somefile deleted osc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/deleted000066400000000000000000000000001475337502500247730ustar00rootroot00000000000000osc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/foo000066400000000000000000000000271475337502500241610ustar00rootroot00000000000000This is a simple test. osc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/merge000066400000000000000000000000601475337502500244720ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/missing000066400000000000000000000000101475337502500250370ustar00rootroot00000000000000missing osc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/nochange000066400000000000000000000000311475337502500251530ustar00rootroot00000000000000This file didn't change. osc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/replaced000066400000000000000000000000211475337502500251470ustar00rootroot00000000000000yet another file osc-1.12.1/tests/revertfile_fixtures/osctest/simple/.osc/somefile000066400000000000000000000000151475337502500251760ustar00rootroot00000000000000some content osc-1.12.1/tests/revertfile_fixtures/osctest/simple/foo000066400000000000000000000001201475337502500233110ustar00rootroot00000000000000<<<<<<< foo.mine This is no test. ======= This is a simple test. >>>>>>> foo.r2 osc-1.12.1/tests/revertfile_fixtures/osctest/simple/merge000066400000000000000000000000601475337502500236300ustar00rootroot00000000000000Is it possible to merge this file? I hope so... osc-1.12.1/tests/revertfile_fixtures/osctest/simple/nochange000066400000000000000000000000511475337502500243130ustar00rootroot00000000000000This file didn't change but is modified. 
osc-1.12.1/tests/revertfile_fixtures/osctest/simple/replaced000066400000000000000000000000151475337502500243100ustar00rootroot00000000000000foo replaced osc-1.12.1/tests/revertfile_fixtures/osctest/simple/toadd1000066400000000000000000000000071475337502500237060ustar00rootroot00000000000000toadd1 osc-1.12.1/tests/revertfile_fixtures/osctest/simple/toadd2000066400000000000000000000000071475337502500237070ustar00rootroot00000000000000toadd2 osc-1.12.1/tests/setlinkrev_fixtures/000077500000000000000000000000001475337502500176545ustar00rootroot00000000000000osc-1.12.1/tests/setlinkrev_fixtures/expandedsrc_filesremote000066400000000000000000000010351475337502500244740ustar00rootroot00000000000000 osc-1.12.1/tests/setlinkrev_fixtures/link_with_rev000066400000000000000000000000631475337502500224420ustar00rootroot00000000000000 osc-1.12.1/tests/setlinkrev_fixtures/md5_rev_link000066400000000000000000000001331475337502500221520ustar00rootroot00000000000000 osc-1.12.1/tests/setlinkrev_fixtures/noproject_link000066400000000000000000000000321475337502500226120ustar00rootroot00000000000000 osc-1.12.1/tests/setlinkrev_fixtures/oscrc000066400000000000000000000001361475337502500207100ustar00rootroot00000000000000[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 osc-1.12.1/tests/setlinkrev_fixtures/rev_link000066400000000000000000000000641475337502500214100ustar00rootroot00000000000000 osc-1.12.1/tests/setlinkrev_fixtures/simple_filesremote000066400000000000000000000004331475337502500234660ustar00rootroot00000000000000 osc-1.12.1/tests/setlinkrev_fixtures/simple_link000066400000000000000000000000531475337502500221030ustar00rootroot00000000000000 osc-1.12.1/tests/test__private_api.py000066400000000000000000000014241475337502500176110ustar00rootroot00000000000000import unittest from osc._private.api import xml_escape class TestXmlEscape(unittest.TestCase): def test_lt(self): actual = xml_escape("<") expected = "<" self.assertEqual(actual, expected) def test_gt(self): actual = xml_escape(">") expected = ">" self.assertEqual(actual, expected) def test_apos(self): actual = xml_escape("'") expected = "'" self.assertEqual(actual, expected) def test_quot(self): actual = xml_escape("\"") expected = """ self.assertEqual(actual, expected) def test_amp(self): actual = xml_escape("&") expected = "&" self.assertEqual(actual, expected) if __name__ == "__main__": unittest.main() osc-1.12.1/tests/test__private_package.py000066400000000000000000000114601475337502500204340ustar00rootroot00000000000000import os import unittest from osc._private.package import ApiPackage from osc._private.package import LocalPackage from osc._private.package import PackageBase from .common import GET from .common import OscTestCase FIXTURES_DIR = os.path.join(os.path.dirname(__file__), "fixtures", "packages") class PackageBaseMock(PackageBase): def _get_directory_node(self): pass def _load_from_directory_node(self, directory_node): pass class TestPackageBase(unittest.TestCase): def setUp(self): self.p1 = PackageBaseMock("http://urlA", "projA", "pkgA") def test_str(self): self.assertEqual(str(self.p1), "projA/pkgA") def test_repr(self): self.assertTrue(repr(self.p1).endswith("(projA/pkgA)")) def test_eq(self): # the same p2 = PackageBaseMock(self.p1.apiurl, self.p1.project, self.p1.name) self.assertEqual(self.p1, p2) # package name differs p2 = PackageBaseMock(self.p1.apiurl, self.p1.project, "pkgB") self.assertNotEqual(self.p1, p2) # project name differs p2 = PackageBaseMock(self.p1.apiurl, "projB", 
self.p1.name) self.assertNotEqual(self.p1, p2) # baseurl differs p2 = PackageBaseMock("http://urlB", self.p1.project, self.p1.name) self.assertNotEqual(self.p1, p2) def test_lt(self): # the same p2 = PackageBaseMock(self.p1.apiurl, self.p1.project, self.p1.name) self.assertFalse(self.p1 < p2) # package name differs p2 = PackageBaseMock(self.p1.apiurl, self.p1.project, "pkgB") self.assertTrue(self.p1 < p2) # project name differs p2 = PackageBaseMock(self.p1.apiurl, "projB", self.p1.name) self.assertTrue(self.p1 < p2) # baseurl differs p2 = PackageBaseMock("http://urlB", self.p1.project, self.p1.name) self.assertTrue(self.p1 < p2) def test_hash(self): p2 = PackageBaseMock(self.p1.apiurl, self.p1.project, self.p1.name) self.assertEqual(hash(self.p1), hash(p2)) packages = set() packages.add(self.p1) # the second instance appears to be there because it has the same hash # it is ok, because we consider such packages equal self.assertIn(p2, packages) class TestLocalPackage(OscTestCase): def _get_fixtures_dir(self): return FIXTURES_DIR def test_load(self): path = os.path.join(self.tmpdir, "osctest", "openSUSE:Tools", "osc") p = LocalPackage(path) self.assertEqual(p.name, "osc") self.assertEqual(p.project, "openSUSE:Tools") self.assertEqual(p.apiurl, "http://localhost") self.assertEqual(p.rev, "373") self.assertEqual(p.vrev, "339") self.assertEqual(p.srcmd5, "30ccce6c3a1a4322e79c2935a52af18b") self.assertEqual(p.linkinfo.project, "openSUSE:Factory") self.assertEqual(p.linkinfo.package, "osc") self.assertEqual(p.linkinfo.srcmd5, "1ccbcd1b0b531a37ad75b34b5a1e2e3e") self.assertEqual(p.linkinfo.baserev, "2c3ae65909d69e0f63113ccfe0e5f3f8") self.assertEqual(p.linkinfo.xsrcmd5, "6a31b956f9431b0644ad6cf8e845c4e5") self.assertEqual(p.linkinfo.lsrcmd5, "30ccce6c3a1a4322e79c2935a52af18b") self.assertEqual(len(p.files), 3) f = p.files[0] self.assertEqual(f.name, "osc-0.182.0.tar.gz") self.assertEqual(f.md5, "87f040c76f3da86fd7218c972b9df1dc") self.assertEqual(f.size, 381596) self.assertEqual(f.mtime, 1662638726) class TestApiPackage(OscTestCase): def _get_fixtures_dir(self): return FIXTURES_DIR @GET("http://localhost/source/openSUSE:Tools/osc", file="osctest/openSUSE:Tools/osc/.osc/_files") def test_load(self): p = ApiPackage("http://localhost", "openSUSE:Tools", "osc") self.assertEqual(p.name, "osc") self.assertEqual(p.project, "openSUSE:Tools") self.assertEqual(p.apiurl, "http://localhost") self.assertEqual(p.rev, "373") self.assertEqual(p.vrev, "339") self.assertEqual(p.srcmd5, "30ccce6c3a1a4322e79c2935a52af18b") self.assertEqual(p.linkinfo.project, "openSUSE:Factory") self.assertEqual(p.linkinfo.package, "osc") self.assertEqual(p.linkinfo.srcmd5, "1ccbcd1b0b531a37ad75b34b5a1e2e3e") self.assertEqual(p.linkinfo.baserev, "2c3ae65909d69e0f63113ccfe0e5f3f8") self.assertEqual(p.linkinfo.xsrcmd5, "6a31b956f9431b0644ad6cf8e845c4e5") self.assertEqual(p.linkinfo.lsrcmd5, "30ccce6c3a1a4322e79c2935a52af18b") self.assertEqual(len(p.files), 3) f = p.files[0] self.assertEqual(f.name, "osc-0.182.0.tar.gz") self.assertEqual(f.md5, "87f040c76f3da86fd7218c972b9df1dc") self.assertEqual(f.size, 381596) self.assertEqual(f.mtime, 1662638726) if __name__ == "__main__": unittest.main() osc-1.12.1/tests/test_addfiles.py000066400000000000000000000062271475337502500167300ustar00rootroot00000000000000import os import sys import unittest import osc.core import osc.oscerr from .common import OscTestCase FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'addfile_fixtures') def suite(): return 
    unittest.defaultTestLoader.loadTestsFromTestCase(TestAddFiles)


class TestAddFiles(OscTestCase):
    def _get_fixtures_dir(self):
        return FIXTURES_DIR

    def testSimpleAdd(self):
        """add one file ('toadd1') to the wc"""
        self._change_to_pkg('simple')
        p = osc.core.Package('.')
        p.addfile('toadd1')
        exp = 'A toadd1\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'toadd1')))
        self._check_status(p, 'toadd1', 'A')
        self._check_addlist('toadd1\n')

    def testSimpleMultipleAdd(self):
        """add multiple files ('toadd1', 'toadd2') to the wc"""
        self._change_to_pkg('simple')
        p = osc.core.Package('.')
        p.addfile('toadd1')
        p.addfile('toadd2')
        exp = 'A toadd1\nA toadd2\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'toadd1')))
        self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'toadd2')))
        self._check_status(p, 'toadd1', 'A')
        self._check_status(p, 'toadd2', 'A')
        self._check_addlist('toadd1\ntoadd2\n')

    def testAddVersionedFile(self):
        """add a versioned file"""
        self._change_to_pkg('simple')
        p = osc.core.Package('.')
        self.assertRaises(osc.oscerr.PackageFileConflict, p.addfile, 'merge')
        self.assertFalse(os.path.exists(os.path.join('.osc', '_to_be_added')))
        self._check_status(p, 'merge', ' ')

    def testAddUnversionedFileTwice(self):
        """add the same file twice"""
        self._change_to_pkg('simple')
        p = osc.core.Package('.')
        p.addfile('toadd1')
        self.assertRaises(osc.oscerr.PackageFileConflict, p.addfile, 'toadd1')
        exp = 'A toadd1\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'toadd1')))
        self._check_status(p, 'toadd1', 'A')
        self._check_addlist('toadd1\n')

    def testReplace(self):
        """replace a deleted file ('foo')"""
        self._change_to_pkg('simple')
        p = osc.core.Package('.')
        with open('foo', 'w') as f:
            f.write('replaced file\n')
        p.addfile('foo')
        exp = 'A foo\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self.assertFileContentNotEqual(os.path.join('.osc', 'sources', 'foo'), 'replaced file\n')
        self.assertFalse(os.path.exists(os.path.join('.osc', '_to_be_deleted')))
        self._check_status(p, 'foo', 'R')
        self._check_addlist('foo\n')

    def testAddNonExistentFile(self):
        """add a non existent file"""
        self._change_to_pkg('simple')
        p = osc.core.Package('.')
        self.assertRaises(osc.oscerr.OscIOError, p.addfile, 'doesnotexist')
        self.assertFalse(os.path.exists(os.path.join('.osc', '_to_be_added')))


if __name__ == '__main__':
    unittest.main()
osc-1.12.1/tests/test_build.py000066400000000000000000000022541475337502500162500ustar00rootroot00000000000000
import unittest

import osc.conf
from osc.build import check_trusted_projects
from osc.oscerr import UserAbort


class TestTrustedProjects(unittest.TestCase):
    def setUp(self):
        osc.conf.config = osc.conf.Options()

    def test_name(self):
        apiurl = "https://example.com"
        osc.conf.config["apiurl"] = apiurl
        osc.conf.config.setdefault("api_host_options", {}).setdefault(apiurl, {}).setdefault("trusted_prj", None)

        osc.conf.config["api_host_options"][apiurl]["trusted_prj"] = []
        self.assertRaises(UserAbort, check_trusted_projects, apiurl, ["foo"], interactive=False)

        osc.conf.config["api_host_options"][apiurl]["trusted_prj"] = ["qwerty", "foo", "asdfg"]
        check_trusted_projects(apiurl, ["foo"], interactive=False)

    def test_glob(self):
        apiurl = "https://example.com"
        osc.conf.config["apiurl"] = apiurl
        osc.conf.config.setdefault("api_host_options", {}).setdefault(apiurl, {}).setdefault("trusted_prj", None)
osc.conf.config["api_host_options"][apiurl]["trusted_prj"] = ["f*"] check_trusted_projects(apiurl, ["foo"], interactive=False) if __name__ == "__main__": unittest.main() osc-1.12.1/tests/test_commandline.py000066400000000000000000000710531475337502500174420ustar00rootroot00000000000000import argparse import os import shutil import tempfile import unittest from osc.commandline import Command from osc.commandline import MainCommand from osc.commandline import OscMainCommand from osc.commandline import pop_project_package_from_args from osc.commandline import pop_project_package_repository_arch_from_args from osc.commandline import pop_project_package_targetproject_targetpackage_from_args from osc.commandline import pop_repository_arch_from_args from osc.oscerr import NoWorkingCopy, OscValueError from osc.store import Store class TestMainCommand(MainCommand): name = "osc-test" def init_arguments(self, command=None): self.add_argument( "-A", "--apiurl", ) class TestCommand(Command): name = "test-cmd" OSCRC_LOCALHOST = """ [general] apiurl = https://localhost [https://localhost] user=Admin pass=opensuse """.lstrip() class TestCommandClasses(unittest.TestCase): def setUp(self): os.environ.pop("OSC_CONFIG", None) self.tmpdir = tempfile.mkdtemp(prefix="osc_test") os.chdir(self.tmpdir) self.oscrc = None def tearDown(self): os.environ.pop("OSC_CONFIG", None) try: shutil.rmtree(self.tmpdir) except OSError: pass def write_oscrc_localhost(self): self.oscrc = os.path.join(self.tmpdir, "oscrc") with open(self.oscrc, "w") as f: f.write(OSCRC_LOCALHOST) def test_load_commands(self): main = TestMainCommand() main.load_commands() def test_load_command(self): main = TestMainCommand() cmd = main.load_command(TestCommand, "test.osc.commands") self.assertTrue(str(cmd).startswith(" req2) def test_sort(self): req1 = Request() req1.reqid = 2 req2 = Request() req2.reqid = 1 requests = [req1, req2] requests.sort() self.assertEqual(requests[0].reqid, 1) self.assertEqual(requests[1].reqid, 2) if __name__ == "__main__": unittest.main() osc-1.12.1/tests/test_credentials.py000066400000000000000000000010521475337502500174410ustar00rootroot00000000000000import unittest import osc.conf from osc.credentials import ObfuscatedConfigFileCredentialsManager class TestObfuscatedConfigFileCredentialsManager(unittest.TestCase): def test_decode_password(self): # obfuscated "opensuse" password_str = "QlpoOTFBWSZTWeTSblkAAAGBgAIBygAgADDACGNEHxaYXckU4UJDk0m5ZA==" password = osc.conf.Password(password_str) decoded = ObfuscatedConfigFileCredentialsManager.decode_password(password) self.assertEqual(decoded, "opensuse") if __name__ == "__main__": unittest.main() osc-1.12.1/tests/test_deletefiles.py000066400000000000000000000210761475337502500174410ustar00rootroot00000000000000import os import unittest import osc.core import osc.oscerr from .common import OscTestCase FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'deletefile_fixtures') def suite(): return unittest.defaultTestLoader.loadTestsFromTestCase(TestDeleteFiles) class TestDeleteFiles(OscTestCase): def _get_fixtures_dir(self): return FIXTURES_DIR def testSimpleRemove(self): """delete a file ('foo') from the wc""" self._change_to_pkg('simple') p = osc.core.Package('.') ret = p.delete_file('foo') self.__check_ret(ret, True, ' ') self.assertFalse(os.path.exists('foo')) self.assertTrue(os.path.exists(os.path.join('.osc', 'sources', 'foo'))) self._check_deletelist('foo\n') self._check_status(p, 'foo', 'D') def testDeleteModified(self): """delete modified file ('nochange') from the 
wc (without force)""" self._change_to_pkg('simple') p = osc.core.Package('.') ret = p.delete_file('nochange') self.__check_ret(ret, False, 'M') self.assertTrue(os.path.exists('nochange')) self.assertTrue(os.path.exists(os.path.join('.osc', 'sources', 'nochange'))) self.assertFalse(os.path.exists(os.path.join('.osc', '_to_be_deleted'))) self._check_status(p, 'nochange', 'M') def testDeleteUnversioned(self): """delete an unversioned file ('toadd2') from the wc""" self._change_to_pkg('simple') p = osc.core.Package('.') ret = p.delete_file('toadd2') self.__check_ret(ret, False, '?') self.assertTrue(os.path.exists('toadd2')) self.assertFalse(os.path.exists(os.path.join('.osc', '_to_be_deleted'))) self._check_status(p, 'toadd2', '?') def testDeleteAdded(self): """delete an added file ('toadd1') from the wc (without force)""" self._change_to_pkg('simple') p = osc.core.Package('.') ret = p.delete_file('toadd1') self.__check_ret(ret, False, 'A') self.assertTrue(os.path.exists('toadd1')) self.assertFalse(os.path.exists(os.path.join('.osc', '_to_be_deleted'))) self._check_status(p, 'toadd1', 'A') def testDeleteReplaced(self): """delete an added file ('merge') from the wc (without force)""" self._change_to_pkg('replace') p = osc.core.Package('.') ret = p.delete_file('merge') self.__check_ret(ret, False, 'R') self.assertTrue(os.path.exists('merge')) self.assertFalse(os.path.exists(os.path.join('.osc', '_to_be_deleted'))) self._check_addlist('toadd1\nmerge\n') self._check_status(p, 'merge', 'R') def testDeleteConflict(self): """delete a file ('foo', state='C') from the wc (without force)""" self._change_to_pkg('conflict') p = osc.core.Package('.') ret = p.delete_file('foo') self.__check_ret(ret, False, 'C') self.assertTrue(os.path.exists('foo')) self.assertTrue(os.path.exists(os.path.join('.osc', 'sources', 'foo'))) self.assertFalse(os.path.exists(os.path.join('.osc', '_to_be_deleted'))) self._check_conflictlist('foo\n') self._check_status(p, 'foo', 'C') def testDeleteModifiedForce(self): """force deletion modified file ('nochange') from wc""" self._change_to_pkg('simple') p = osc.core.Package('.') ret = p.delete_file('nochange', force=True) self.__check_ret(ret, True, 'M') self.assertFalse(os.path.exists('nochange')) self.assertTrue(os.path.exists(os.path.join('.osc', 'sources', 'nochange'))) self._check_deletelist('nochange\n') self._check_status(p, 'nochange', 'D') def testDeleteUnversionedForce(self): """delete an unversioned file ('toadd2') from the wc (with force)""" self._change_to_pkg('simple') p = osc.core.Package('.') ret = p.delete_file('toadd2', force=True) self.__check_ret(ret, True, '?') self.assertFalse(os.path.exists('toadd2')) self.assertFalse(os.path.exists(os.path.join('.osc', '_to_be_deleted'))) self.assertRaises(osc.oscerr.OscIOError, p.status, 'toadd2') def testDeleteAddedForce(self): """delete an added file ('toadd1') from the wc (with force)""" self._change_to_pkg('simple') p = osc.core.Package('.') ret = p.delete_file('toadd1', force=True) self.__check_ret(ret, True, 'A') self.assertFalse(os.path.exists('toadd1')) self.assertFalse(os.path.exists(os.path.join('.osc', '_to_be_deleted'))) self.assertFalse(os.path.exists(os.path.join('.osc', '_to_be_added'))) self.assertRaises(osc.oscerr.OscIOError, p.status, 'toadd1') def testDeleteReplacedForce(self): """delete an added file ('merge') from the wc (with force)""" self._change_to_pkg('replace') p = osc.core.Package('.') ret = p.delete_file('merge', force=True) self.__check_ret(ret, True, 'R') 
self.assertFalse(os.path.exists('merge')) self.assertTrue(os.path.exists(os.path.join('.osc', 'sources', 'merge'))) self._check_deletelist('merge\n') self._check_addlist('toadd1\n') self._check_status(p, 'merge', 'D') def testDeleteConflictForce(self): """delete a file ('foo', state='C') from the wc (with force)""" self._change_to_pkg('conflict') p = osc.core.Package('.') ret = p.delete_file('foo', force=True) self.__check_ret(ret, True, 'C') self.assertFalse(os.path.exists('foo')) self.assertTrue(os.path.exists('foo.r2')) self.assertTrue(os.path.exists('foo.mine')) self.assertTrue(os.path.exists(os.path.join('.osc', 'sources', 'foo'))) self._check_deletelist('foo\n') self.assertFalse(os.path.exists(os.path.join('.osc', '_in_conflict'))) self._check_status(p, 'foo', 'D') def testDeleteMultiple(self): """delete mutliple files from the wc""" self._change_to_pkg('simple') p = osc.core.Package('.') ret = p.delete_file('foo') self.__check_ret(ret, True, ' ') ret = p.delete_file('merge') self.__check_ret(ret, True, ' ') self.assertFalse(os.path.exists('foo')) self.assertFalse(os.path.exists('merge')) self.assertTrue(os.path.exists(os.path.join('.osc', 'sources', 'foo'))) self.assertTrue(os.path.exists(os.path.join('.osc', 'sources', 'merge'))) self._check_deletelist('foo\nmerge\n') def testDeleteAlreadyDeleted(self): """delete already deleted file from the wc""" self._change_to_pkg('already_deleted') p = osc.core.Package('.') ret = p.delete_file('foo') self.__check_ret(ret, True, 'D') self.assertFalse(os.path.exists('foo')) self.assertTrue(os.path.exists(os.path.join('.osc', 'sources', 'foo'))) self._check_deletelist('foo\n') self._check_status(p, 'foo', 'D') def testDeleteAddedMissing(self): """ delete a file which was added to the wc and is removed again (via a non osc command). It's current state is '!' 
""" self._change_to_pkg('delete') p = osc.core.Package('.') ret = p.delete_file('toadd1') self.__check_ret(ret, True, '!') self.assertFalse(os.path.exists('toadd1')) self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'toadd1'))) self._check_deletelist('foo\n') self.assertFalse(os.path.exists(os.path.join('.osc', '_to_be_added'))) def testDeleteSkippedLocalNotExistent(self): """ delete a skipped file: no local file with that name exists """ self._change_to_pkg('simple') p = osc.core.Package('.') ret = p.delete_file('skipped') self.__check_ret(ret, False, 'S') self.assertFalse(os.path.exists(os.path.join('.osc', '_to_be_deleted'))) def testDeleteSkippedLocalExistent(self): """ delete a skipped file: a local file with that name exists and will be deleted (for instance _service:* files have status 'S' but a local files might exist) """ self._change_to_pkg('simple') p = osc.core.Package('.') ret = p.delete_file('skipped_exists') self.__check_ret(ret, True, 'S') self.assertFalse(os.path.exists('skipped_exists')) self.assertFalse(os.path.exists(os.path.join('.osc', '_to_be_deleted'))) def __check_ret(self, ret, exp1, exp2): self.assertTrue(len(ret) == 2) self.assertTrue(ret[0] == exp1) self.assertTrue(ret[1] == exp2) if __name__ == '__main__': unittest.main() osc-1.12.1/tests/test_difffiles.py000066400000000000000000000257721475337502500171160ustar00rootroot00000000000000import os import re import unittest import osc.core import osc.oscerr from osc.util.helper import decode_list from .common import GET, OscTestCase FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'difffile_fixtures') def suite(): return unittest.defaultTestLoader.loadTestsFromTestCase(TestDiffFiles) class TestDiffFiles(OscTestCase): diff_hdr = 'Index: %s\n===================================================================' def _get_fixtures_dir(self): return FIXTURES_DIR def testDiffUnmodified(self): """diff an unmodified file""" self._change_to_pkg('simple') p = osc.core.Package('.') p.todo = ['merge'] self.__check_diff(p, '', None) def testDiffAdded(self): """diff an added file""" self._change_to_pkg('simple') p = osc.core.Package('.') p.todo = ['toadd1'] exp = """%s --- toadd1\t(revision 0) +++ toadd1\t(revision 0) @@ -0,0 +1,1 @@ +toadd1 """ % (TestDiffFiles.diff_hdr % 'toadd1') self.__check_diff(p, exp, None) def testDiffRemoved(self): """diff a removed file""" self._change_to_pkg('simple') p = osc.core.Package('.') p.todo = ['somefile'] exp = """%s --- somefile\t(revision 2) +++ somefile\t(working copy) @@ -1,1 +0,0 @@ -some content """ % (TestDiffFiles.diff_hdr % 'somefile') self.__check_diff(p, exp, None) def testDiffMissing(self): """diff a missing file (missing files are ignored)""" self._change_to_pkg('simple') p = osc.core.Package('.') p.todo = ['missing'] self.__check_diff(p, '', None) def testDiffReplaced(self): """diff a replaced file""" self._change_to_pkg('simple') p = osc.core.Package('.') p.todo = ['replaced'] exp = """%s --- replaced\t(revision 2) +++ replaced\t(working copy) @@ -1,1 +1,1 @@ -yet another file +foo replaced """ % (TestDiffFiles.diff_hdr % 'replaced') self.__check_diff(p, exp, None) def testDiffSkipped(self): """diff a skipped file (skipped files are ignored)""" self._change_to_pkg('simple') p = osc.core.Package('.') p.todo = ['skipped'] self.__check_diff(p, '', None) def testDiffConflict(self): """diff a file which is in the conflict state""" self._change_to_pkg('simple') p = osc.core.Package('.') p.todo = ['foo'] exp = """%s --- foo\t(revision 2) +++ foo\t(working copy) @@ 
-1,1 +1,5 @@ +<<<<<<< foo.mine +This is no test. +======= This is a simple test. +>>>>>>> foo.r2 """ % (TestDiffFiles.diff_hdr % 'foo') self.__check_diff(p, exp, None) def testDiffModified(self): """diff a modified file""" self._change_to_pkg('simple') p = osc.core.Package('.') p.todo = ['nochange'] exp = """%s --- nochange\t(revision 2) +++ nochange\t(working copy) @@ -1,1 +1,2 @@ -This file didn't change. +This file didn't change but +is modified. """ % (TestDiffFiles.diff_hdr % 'nochange') self.__check_diff(p, exp, None) def testDiffUnversioned(self): """diff an unversioned file""" self._change_to_pkg('simple') p = osc.core.Package('.') p.todo = ['toadd2'] self.assertRaises(osc.oscerr.OscIOError, self.__check_diff, p, '', None) def testDiffAddedMissing(self): """diff a file which has satus 'A' but the local file does not exist""" self._change_to_pkg('simple') p = osc.core.Package('.') p.todo = ['addedmissing'] self.assertRaises(osc.oscerr.OscIOError, self.__check_diff, p, '', None) def testDiffMultipleFiles(self): """diff multiple files""" self._change_to_pkg('simple') p = osc.core.Package('.') p.todo = ['nochange', 'somefile'] exp = """%s --- nochange\t(revision 2) +++ nochange\t(working copy) @@ -1,1 +1,2 @@ -This file didn't change. +This file didn't change but +is modified. %s --- somefile\t(revision 2) +++ somefile\t(working copy) @@ -1,1 +0,0 @@ -some content """ % (TestDiffFiles.diff_hdr % 'nochange', TestDiffFiles.diff_hdr % 'somefile') self.__check_diff(p, exp, None) def testDiffReplacedEmptyTodo(self): """diff a complete package""" self._change_to_pkg('replaced') p = osc.core.Package('.') exp = """%s --- replaced\t(revision 2) +++ replaced\t(working copy) @@ -1,1 +1,1 @@ -yet another file +foo replaced """ % (TestDiffFiles.diff_hdr % 'replaced') self.__check_diff(p, exp, None) def testDiffBinaryAdded(self): """diff an added binary file""" self._change_to_pkg('binary') p = osc.core.Package('.') p.todo = ['binary_added'] exp = """%s Binary file 'binary_added' added. """ % (TestDiffFiles.diff_hdr % 'binary_added') self.__check_diff(p, exp, None) def testDiffBinaryDeleted(self): """diff a deleted binary file""" self._change_to_pkg('binary') p = osc.core.Package('.') p.todo = ['binary_deleted'] exp = """%s Binary file 'binary_deleted' deleted. """ % (TestDiffFiles.diff_hdr % 'binary_deleted') self.__check_diff(p, exp, None) def testDiffBinaryModified(self): """diff a modified binary file""" self._change_to_pkg('binary') p = osc.core.Package('.') p.todo = ['binary'] exp = """%s Binary file 'binary' has changed. """ % (TestDiffFiles.diff_hdr % 'binary') self.__check_diff(p, exp, None) # diff with revision @GET('http://localhost/source/osctest/remote_simple_noadd?rev=3', file='testDiffRemoteNoChange_files') def testDiffRemoteNoChange(self): """diff against remote revision where no file changed""" self._change_to_pkg('remote_simple_noadd') p = osc.core.Package('.') self.__check_diff(p, '', 3) @GET('http://localhost/source/osctest/remote_simple?rev=3', file='testDiffRemoteModified_files') @GET('http://localhost/source/osctest/remote_simple/merge?rev=3', file='testDiffRemoteModified_merge') def testDiffRemoteModified(self): """diff against a remote revision with one modified file""" self._change_to_pkg('remote_simple') p = osc.core.Package('.') exp = """%s --- merge\t(revision 3) +++ merge\t(working copy) @@ -1,3 +1,4 @@ Is it possible to merge this file? +I hope so... 
%s --- toadd1\t(revision 0) +++ toadd1\t(revision 0) @@ -0,0 +1,1 @@ +toadd1 """ % (TestDiffFiles.diff_hdr % 'merge', TestDiffFiles.diff_hdr % 'toadd1') self.__check_diff(p, exp, 3) @GET('http://localhost/source/osctest/remote_simple?rev=3', file='testDiffRemoteDeletedLocalAdded_files') def testDiffRemoteNotExistingLocalAdded(self): """ a file which doesn't exist in a remote revision and has status A in the wc """ self._change_to_pkg('remote_simple') p = osc.core.Package('.') exp = """%s --- toadd1\t(revision 0) +++ toadd1\t(revision 0) @@ -0,0 +1,1 @@ +toadd1 """ % (TestDiffFiles.diff_hdr % 'toadd1') self.__check_diff(p, exp, 3) @GET('http://localhost/source/osctest/remote_simple_noadd?rev=3', file='testDiffRemoteExistingLocalNotExisting_files') @GET('http://localhost/source/osctest/remote_simple_noadd/foobar?rev=3', file='testDiffRemoteExistingLocalNotExisting_foobar') @GET('http://localhost/source/osctest/remote_simple_noadd/binary?rev=3', file='testDiffRemoteExistingLocalNotExisting_binary') def testDiffRemoteExistingLocalNotExisting(self): """ a file doesn't exist in the local wc but exists in the remote revision """ self._change_to_pkg('remote_simple_noadd') p = osc.core.Package('.') exp = """%s --- foobar\t(revision 3) +++ foobar\t(working copy) @@ -1,2 +0,0 @@ -foobar -barfoo %s Binary file 'binary' deleted. """ % (TestDiffFiles.diff_hdr % 'foobar', TestDiffFiles.diff_hdr % 'binary') self.__check_diff(p, exp, 3) @GET('http://localhost/source/osctest/remote_localmodified?rev=3', file='testDiffRemoteUnchangedLocalModified_files') @GET('http://localhost/source/osctest/remote_localmodified/nochange?rev=3', file='testDiffRemoteUnchangedLocalModified_nochange') @GET('http://localhost/source/osctest/remote_localmodified/binary?rev=3', file='testDiffRemoteUnchangedLocalModified_binary') def testDiffRemoteUnchangedLocalModified(self): """remote revision didn't change, local file is modified""" self._change_to_pkg('remote_localmodified') p = osc.core.Package('.') exp = """%s --- nochange\t(revision 3) +++ nochange\t(working copy) @@ -1,1 +1,2 @@ This file didn't change. +oh it does %s Binary file 'binary' has changed. """ % (TestDiffFiles.diff_hdr % 'nochange', TestDiffFiles.diff_hdr % 'binary') self.__check_diff(p, exp, 3) @GET('http://localhost/source/osctest/remote_simple_noadd?rev=3', file='testDiffRemoteMissingLocalExisting_files') def testDiffRemoteMissingLocalExisting(self): """ remote revision misses a file which exists in the local wc (state ' ')""" self._change_to_pkg('remote_simple_noadd') p = osc.core.Package('.') exp = """%s --- foo\t(revision 0) +++ foo\t(working copy) @@ -0,0 +1,1 @@ +This is a simple test. """ % (TestDiffFiles.diff_hdr % 'foo') self.__check_diff(p, exp, 3) @GET('http://localhost/source/osctest/remote_localdelete?rev=3', file='testDiffRemoteMissingLocalDeleted_files') def testDiffRemoteMissingLocalDeleted(self): """ remote revision misses a file which is marked for deletion in the local wc """ # empty diff is expected (svn does the same) self._change_to_pkg('remote_localdelete') p = osc.core.Package('.') self.__check_diff(p, '', 3) def __check_diff(self, p, exp, revision=None): got = '' for i in p.get_diff(revision): got += ''.join(decode_list(i)) # When a hunk header refers to a single line in the "from" # file and/or the "to" file, e.g. 
# # @@ -37,37 +41,43 @@ # @@ -37,39 +41,41 @@ # @@ -37,37 +41,41 @@ # # some systems will avoid repeating the line number: # # @@ -37 +41,43 @@ # @@ -37,39 +41 @@ # @@ -37 +41 @@ # # so we need to canonise the output to avoid false negative # test failures. # TODO: Package.get_diff should return a consistent format # (regardless of the used python version) def __canonise_diff(diff): # we cannot use re.M because python 2.6's re.sub does # not support a flags argument diff = [re.sub(r'^@@ -(\d+) ', '@@ -\\1,\\1 ', line) for line in diff.split('\n')] diff = [re.sub(r'^(@@ -\d+,\d+) \+(\d+) ', '\\1 +\\2,\\2 ', line) for line in diff] return '\n'.join(diff) got = __canonise_diff(got) exp = __canonise_diff(exp) self.assertEqualMultiline(got, exp) if __name__ == '__main__': unittest.main() osc-1.12.1/tests/test_doc_plugins.py000066400000000000000000000044201475337502500174540ustar00rootroot00000000000000""" These tests make sure that the examples in the documentation about osc plugins are not outdated. """ import os import unittest from osc.commandline import MainCommand from osc.commandline import OscMainCommand PLUGINS_DIR = os.path.join(os.path.dirname(__file__), "..", "doc", "plugins") class TestMainCommand(MainCommand): name = "osc-test" MODULES = ( ("test.osc.commands", PLUGINS_DIR), ) class TestPopProjectPackageFromArgs(unittest.TestCase): def test_load_commands(self): """ Test if all plugins from the tutorial can be properly loaded """ main = TestMainCommand() main.load_commands() def test_simple(self): """ Test the 'simple' command """ main = TestMainCommand() main.load_commands() args = main.parse_args(["simple", "arg1", "arg2"]) self.assertEqual(args.command, "simple") self.assertEqual(args.bool_option, False) self.assertEqual(args.arguments, ["arg1", "arg2"]) def test_request_list(self): """ Test the 'request list' command """ main = TestMainCommand() main.load_commands() args = main.parse_args(["request", "list"]) self.assertEqual(args.command, "list") self.assertEqual(args.message, None) def test_request_accept(self): """ Test the 'request accept' command """ main = TestMainCommand() main.load_commands() args = main.parse_args(["request", "accept", "-m", "a message", "12345"]) self.assertEqual(args.command, "accept") self.assertEqual(args.message, "a message") self.assertEqual(args.id, 12345) def test_plugin_locations(self): osc_paths = [i[1] for i in OscMainCommand.MODULES] # skip the first line with osc.commands osc_paths = osc_paths[1:] path = os.path.join(PLUGINS_DIR, "plugin_locations.rst") with open(path, "r") as f: # s doc_paths = f.readlines() # skip the first line with osc.commands doc_paths = doc_paths[1:] doc_paths = [i.lstrip(" -") for i in doc_paths] doc_paths = [i.rstrip("\n") for i in doc_paths] self.assertEqual(doc_paths, osc_paths) osc-1.12.1/tests/test_git_scm_store.py000066400000000000000000000035241475337502500200130ustar00rootroot00000000000000import os import shutil import subprocess import tempfile import unittest from osc.git_scm.store import GitStore class TestGitStore(unittest.TestCase): def setUp(self): self.tmpdir = tempfile.mkdtemp(prefix="osc_test") os.chdir(self.tmpdir) # 'git init -b ' is not supported on older distros subprocess.check_output(["git", "init", "-q"]) subprocess.check_output(["git", "checkout", "-b", "factory", "-q"]) subprocess.check_output(["git", "remote", "add", "origin", "https://example.com/packages/my-package.git"]) def tearDown(self): try: shutil.rmtree(self.tmpdir) except OSError: pass def test_package(self): store = 
GitStore(self.tmpdir) self.assertEqual(store.package, "my-package") def test_project(self): store = GitStore(self.tmpdir) self.assertEqual(store.project, "openSUSE:Factory") def test_last_buildroot(self): store = GitStore(self.tmpdir) self.assertEqual(store.last_buildroot, None) store.last_buildroot = ("repo", "arch", "vm_type") store = GitStore(self.tmpdir) self.assertEqual(store.last_buildroot, ("repo", "arch", "vm_type")) def test_last_buildroot_empty_vm_type(self): store = GitStore(self.tmpdir) self.assertEqual(store.last_buildroot, None) store.last_buildroot = ("repo", "arch", None) store = GitStore(self.tmpdir) self.assertEqual(store.last_buildroot, ("repo", "arch", None)) def test_scmurl(self): store = GitStore(self.tmpdir) self.assertEqual(store.scmurl, "https://example.com/packages/my-package.git") if not shutil.which("git"): TestGitStore = unittest.skip("The 'git' executable is not available")(TestGitStore) if __name__ == "__main__": unittest.main() osc-1.12.1/tests/test_grabber.py000066400000000000000000000015411475337502500165530ustar00rootroot00000000000000import os import tempfile import unittest import osc.conf import osc.grabber as osc_grabber FIXTURES_DIR = os.path.join(os.path.dirname(__file__), "conf_fixtures") class TestMirrorGroup(unittest.TestCase): def setUp(self): self.tmpdir = tempfile.mkdtemp(prefix='osc_test') oscrc = os.path.join(self._get_fixtures_dir(), "oscrc") osc.conf.get_config(override_conffile=oscrc, override_no_keyring=True) def tearDown(self): try: shutil.rmtree(self.tmpdir) except: pass def _get_fixtures_dir(self): return FIXTURES_DIR def test_invalid_scheme(self): gr = osc_grabber.OscFileGrabber() mg = osc_grabber.OscMirrorGroup(gr, ["container://example.com"]) mg.urlgrab(None, os.path.join(self.tmpdir, "file")) if __name__ == "__main__": unittest.main() osc-1.12.1/tests/test_helpers.py000066400000000000000000000016361475337502500166160ustar00rootroot00000000000000import unittest from osc.util.helper import decode_it, decode_list def suite(): return unittest.defaultTestLoader.loadTestsFromTestCase(TestResults) class TestResults(unittest.TestCase): def testDecodeList(self): strlist = ['Test1', 'Test2', 'Test3'] mixlist = ['Test1', b'Test2', 'Test3'] byteslist = [b'Test1', b'Test2', b'Test3'] out = decode_list(strlist) self.assertListEqual(out, strlist) out = decode_list(mixlist) self.assertListEqual(out, strlist) out = decode_list(byteslist) self.assertListEqual(out, strlist) def testDecodeIt(self): bytes_obj = b'Test the decoding' string_obj = 'Test the decoding' out = decode_it(bytes_obj) self.assertEqual(out, string_obj) out = decode_it(string_obj) self.assertEqual(out, string_obj) if __name__ == '__main__': unittest.main() osc-1.12.1/tests/test_init_package.py000066400000000000000000000103061475337502500175640ustar00rootroot00000000000000import os import unittest import osc.core import osc.oscerr from .common import OscTestCase FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'init_package_fixtures') def suite(): return unittest.defaultTestLoader.loadTestsFromTestCase(TestInitPackage) class TestInitPackage(OscTestCase): def _get_fixtures_dir(self): return FIXTURES_DIR def setUp(self): super().setUp(copytree=False) def test_simple(self): """initialize a package dir""" pac_dir = os.path.join(self.tmpdir, 'testpkg') osc.core.Package.init_package('http://localhost', 'osctest', 'testpkg', pac_dir) storedir = os.path.join(pac_dir, osc.core.store) self.assertFalse(os.path.exists(os.path.join(storedir, '_meta_mode'))) 
self.assertFalse(os.path.exists(os.path.join(storedir, '_size_limit'))) self._check_list(os.path.join(storedir, '_project'), 'osctest\n') self._check_list(os.path.join(storedir, '_package'), 'testpkg\n') self._check_list(os.path.join(storedir, '_files'), '\n') self._check_list(os.path.join(storedir, '_apiurl'), 'http://localhost\n') def test_size_limit(self): """initialize a package dir with size_limit parameter""" pac_dir = os.path.join(self.tmpdir, 'testpkg') osc.core.Package.init_package('http://localhost', 'osctest', 'testpkg', pac_dir, size_limit=42) storedir = os.path.join(pac_dir, osc.core.store) self.assertFalse(os.path.exists(os.path.join(storedir, '_meta_mode'))) self._check_list(os.path.join(storedir, '_size_limit'), '42\n') self._check_list(os.path.join(storedir, '_project'), 'osctest\n') self._check_list(os.path.join(storedir, '_package'), 'testpkg\n') self._check_list(os.path.join(storedir, '_files'), '\n') self._check_list(os.path.join(storedir, '_apiurl'), 'http://localhost\n') def test_meta_mode(self): """initialize a package dir with meta paramter""" pac_dir = os.path.join(self.tmpdir, 'testpkg') osc.core.Package.init_package('http://localhost', 'osctest', 'testpkg', pac_dir, meta=True) storedir = os.path.join(pac_dir, osc.core.store) self.assertFalse(os.path.exists(os.path.join(storedir, '_size_limit'))) self._check_list(os.path.join(storedir, '_meta_mode'), '\n') self._check_list(os.path.join(storedir, '_project'), 'osctest\n') self._check_list(os.path.join(storedir, '_package'), 'testpkg\n') self._check_list(os.path.join(storedir, '_files'), '\n') self._check_list(os.path.join(storedir, '_apiurl'), 'http://localhost\n') def test_dirExists(self): """initialize a package dir (dir already exists)""" pac_dir = os.path.join(self.tmpdir, 'testpkg') os.mkdir(pac_dir) osc.core.Package.init_package('http://localhost', 'osctest', 'testpkg', pac_dir) storedir = os.path.join(pac_dir, osc.core.store) self.assertFalse(os.path.exists(os.path.join(storedir, '_meta_mode'))) self.assertFalse(os.path.exists(os.path.join(storedir, '_size_limit'))) self._check_list(os.path.join(storedir, '_project'), 'osctest\n') self._check_list(os.path.join(storedir, '_package'), 'testpkg\n') self._check_list(os.path.join(storedir, '_files'), '\n') self._check_list(os.path.join(storedir, '_apiurl'), 'http://localhost\n') def test_storedirExists(self): """initialize a package dir (dir+storedir already exists)""" pac_dir = os.path.join(self.tmpdir, 'testpkg') os.mkdir(pac_dir) os.mkdir(os.path.join(pac_dir, osc.core.store)) self.assertRaises(osc.oscerr.OscIOError, osc.core.Package.init_package, 'http://localhost', 'osctest', 'testpkg', pac_dir) def test_dirIsFile(self): """initialize a package dir (dir is a file)""" pac_dir = os.path.join(self.tmpdir, 'testpkg') os.mkdir(pac_dir) with open(os.path.join(pac_dir, osc.core.store), 'w') as f: f.write('foo\n') self.assertRaises(osc.oscerr.OscIOError, osc.core.Package.init_package, 'http://localhost', 'osctest', 'testpkg', pac_dir) if __name__ == '__main__': unittest.main() osc-1.12.1/tests/test_init_project.py000066400000000000000000000060561475337502500176460ustar00rootroot00000000000000import os import unittest import osc.conf import osc.core import osc.oscerr from .common import GET, OscTestCase FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'init_project_fixtures') def suite(): return unittest.defaultTestLoader.loadTestsFromTestCase(TestInitProject) class TestInitProject(OscTestCase): def _get_fixtures_dir(self): return FIXTURES_DIR def setUp(self): 
super().setUp(copytree=False) def test_simple(self): """initialize a project dir""" prj_dir = os.path.join(self.tmpdir, 'testprj') prj = osc.core.Project.init_project('http://localhost', prj_dir, 'testprj', getPackageList=False) self.assertTrue(isinstance(prj, osc.core.Project)) storedir = os.path.join(prj_dir, osc.core.store) self._check_list(os.path.join(storedir, '_project'), 'testprj\n') self._check_list(os.path.join(storedir, '_apiurl'), 'http://localhost\n') self._check_list(os.path.join(storedir, '_packages'), '') def test_dirExists(self): """initialize a project dir but the dir already exists""" prj_dir = os.path.join(self.tmpdir, 'testprj') os.mkdir(prj_dir) prj = osc.core.Project.init_project('http://localhost', prj_dir, 'testprj', getPackageList=False) self.assertTrue(isinstance(prj, osc.core.Project)) storedir = os.path.join(prj_dir, osc.core.store) self._check_list(os.path.join(storedir, '_project'), 'testprj\n') self._check_list(os.path.join(storedir, '_apiurl'), 'http://localhost\n') self._check_list(os.path.join(storedir, '_packages'), '') def test_storedirExists(self): """initialize a project dir but the storedir already exists""" prj_dir = os.path.join(self.tmpdir, 'testprj') os.mkdir(prj_dir) os.mkdir(os.path.join(prj_dir, osc.core.store)) self.assertRaises(osc.oscerr.OscIOError, osc.core.Project.init_project, 'http://localhost', prj_dir, 'testprj') @GET('http://localhost/source/testprj', text='') def test_no_package_tracking(self): """initialize a project dir but disable package tracking; enable getPackageList=True; disable wc_check (because we didn't disable the package tracking before the Project class was imported therefore REQ_STOREFILES contains '_packages') """ # disable package tracking osc.conf.config['do_package_tracking'] = False prj_dir = os.path.join(self.tmpdir, 'testprj') os.mkdir(prj_dir) prj = osc.core.Project.init_project('http://localhost', prj_dir, 'testprj', False, wc_check=False) self.assertTrue(isinstance(prj, osc.core.Project)) storedir = os.path.join(prj_dir, osc.core.store) self._check_list(os.path.join(storedir, '_project'), 'testprj\n') self._check_list(os.path.join(storedir, '_apiurl'), 'http://localhost\n') self.assertFalse(os.path.exists(os.path.join(storedir, '_packages'))) if __name__ == '__main__': unittest.main() osc-1.12.1/tests/test_keyinfo.py000066400000000000000000000016371475337502500166210ustar00rootroot00000000000000import unittest from osc import obs_api class TestKeyinfo(unittest.TestCase): def test_empty_pubkey(self): ki = obs_api.Keyinfo() ki.pubkey_list = [{"value": ""}] expected = """ Type : GPG public key User ID : Algorithm : Key size : Expires : Fingerprint : """.strip() actual = ki.pubkey_list[0].to_human_readable_string() self.assertEqual(expected, actual) def test_empty_sslcert(self): ki = obs_api.Keyinfo() ki.sslcert_list = [{"value": ""}] expected = """ Type : SSL certificate Subject : Key ID : Serial : Issuer : Algorithm : Key size : Begins : Expires : Fingerprint : """.strip() actual = ki.sslcert_list[0].to_human_readable_string() self.assertEqual(expected, actual) if __name__ == "__main__": unittest.main() osc-1.12.1/tests/test_models.py000066400000000000000000000324711475337502500164400ustar00rootroot00000000000000import unittest from typing import Set from osc.util.models import * from osc.util.models import get_origin class TestTyping(unittest.TestCase): def test_get_origin_list(self): typ = get_origin(list) self.assertEqual(typ, None) def test_get_origin_list_str(self): typ = get_origin(List[str]) 
self.assertEqual(typ, list) class TestNotSet(unittest.TestCase): def test_repr(self): self.assertEqual(repr(NotSet), "NotSet") def test_bool(self): self.assertEqual(bool(NotSet), False) class Test(unittest.TestCase): def test_dict(self): class TestSubmodel(BaseModel): text: str = Field(default="default") class TestModel(BaseModel): a: str = Field(default="default") b: Optional[str] = Field(default=None) sub: Optional[List[TestSubmodel]] = Field(default=None) m = TestModel() self.assertEqual(m.dict(), {"a": "default", "b": None, "sub": None}) m.b = "B" m.sub = [{"text": "one"}, {"text": "two"}] self.assertEqual(m.dict(), {"a": "default", "b": "B", "sub": [{"text": "one"}, {"text": "two"}]}) def test_unknown_fields(self): class TestModel(BaseModel): pass self.assertRaises(TypeError, TestModel, does_not_exist=None) def test_uninitialized(self): class TestModel(BaseModel): field: str = Field() self.assertRaises(TypeError, TestModel) def test_invalid_type(self): class TestModel(BaseModel): field: Optional[str] = Field() m = TestModel() self.assertRaises(TypeError, setattr, m.field, []) def test_unsupported_type(self): class TestModel(BaseModel): field: Set[str] = Field(default=None) self.assertRaises(TypeError, TestModel) def test_lazy_default(self): class TestModel(BaseModel): field: List[str] = Field(default=lambda: ["string"]) m = TestModel() self.assertEqual(m.field, ["string"]) def test_lazy_default_invalid_type(self): class TestModel(BaseModel): field: List[str] = Field(default=lambda: None) self.assertRaises(TypeError, TestModel) def test_is_set(self): class TestModel(BaseModel): field: Optional[str] = Field() m = TestModel() self.assertNotIn("field", m._values) self.assertEqual(m.field, None) m.field = "text" self.assertIn("field", m._values) self.assertEqual(m.field, "text") def test_str(self): class TestModel(BaseModel): field: str = Field(default="default") m = TestModel() field = m.__fields__["field"] self.assertEqual(field.is_model, False) self.assertEqual(field.is_optional, False) self.assertEqual(field.origin_type, str) self.assertEqual(m.field, "default") m.field = "text" self.assertEqual(m.field, "text") def test_optional_str(self): class TestModel(BaseModel): field: Optional[str] = Field() m = TestModel() field = m.__fields__["field"] self.assertEqual(field.is_model, False) self.assertEqual(field.is_optional, True) self.assertEqual(field.origin_type, str) self.assertEqual(m.field, None) m.field = "text" self.assertEqual(m.field, "text") def test_int(self): class TestModel(BaseModel): field: int = Field(default=0) m = TestModel() field = m.__fields__["field"] self.assertEqual(field.is_model, False) self.assertEqual(field.is_optional, False) self.assertEqual(field.origin_type, int) self.assertEqual(m.field, 0) m.field = 1 self.assertEqual(m.field, 1) def test_optional_int(self): class TestModel(BaseModel): field: Optional[int] = Field() m = TestModel() field = m.__fields__["field"] self.assertEqual(field.is_model, False) self.assertEqual(field.is_optional, True) self.assertEqual(field.origin_type, int) self.assertEqual(m.field, None) m.field = 1 self.assertEqual(m.field, 1) def test_submodel(self): class TestSubmodel(BaseModel): text: str = Field(default="default") class TestModel(BaseModel): field: TestSubmodel = Field(default={}) m = TestModel() field = m.__fields__["field"] self.assertEqual(field.is_model, True) self.assertEqual(field.is_optional, False) self.assertEqual(field.origin_type, TestSubmodel) m = TestModel(field=TestSubmodel()) self.assertEqual(m.field.text, 
"default") m = TestModel(field={"text": "text"}) self.assertEqual(m.field.text, "text") def test_optional_submodel(self): class TestSubmodel(BaseModel): text: str = Field(default="default") class TestModel(BaseModel): field: Optional[TestSubmodel] = Field(default=None) m = TestModel() field = m.__fields__["field"] self.assertEqual(field.is_model, True) self.assertEqual(field.is_optional, True) self.assertEqual(field.origin_type, TestSubmodel) self.assertEqual(m.field, None) m.dict() m = TestModel(field=TestSubmodel()) self.assertIsInstance(m.field, TestSubmodel) self.assertEqual(m.field.text, "default") m.dict() m = TestModel(field={"text": "text"}) self.assertNotEqual(m.field, None) self.assertEqual(m.field.text, "text") m.dict() def test_list_submodels(self): class TestSubmodel(BaseModel): text: str = Field(default="default") class TestModel(BaseModel): field: List[TestSubmodel] = Field(default=[]) m = TestModel() field = m.__fields__["field"] self.assertEqual(field.is_model, False) self.assertEqual(field.is_model_list, True) self.assertEqual(field.is_optional, False) self.assertEqual(field.origin_type, list) m.dict() m = TestModel(field=[TestSubmodel()]) self.assertEqual(m.field[0].text, "default") m.dict() m = TestModel(field=[{"text": "text"}]) self.assertEqual(m.field[0].text, "text") m.dict() self.assertRaises(TypeError, getattr(m, "field")) def test_optional_list_submodels(self): class TestSubmodel(BaseModel): text: str = Field(default="default") class TestModel(BaseModel): field: Optional[List[TestSubmodel]] = Field(default=[]) m = TestModel() field = m.__fields__["field"] self.assertEqual(field.is_model, False) self.assertEqual(field.is_model_list, True) self.assertEqual(field.is_optional, True) self.assertEqual(field.origin_type, list) m.dict() m = TestModel(field=[TestSubmodel()]) self.assertEqual(m.field[0].text, "default") m.dict() m = TestModel(field=[{"text": "text"}]) self.assertEqual(m.field[0].text, "text") m.dict() m.field = None self.assertEqual(m.field, None) m.dict() def test_enum(self): class Numbers(Enum): one = "one" two = "two" class TestModel(BaseModel): field: Optional[Numbers] = Field(default=None) m = TestModel() field = m.__fields__["field"] self.assertEqual(field.is_model, False) self.assertEqual(field.is_optional, True) self.assertEqual(field.origin_type, Numbers) self.assertEqual(m.field, None) m.field = "one" self.assertEqual(m.field, "one") self.assertRaises(ValueError, setattr, m, "field", "does-not-exist") def test_parent(self): class ParentModel(BaseModel): field: str = Field(default="text") class ChildModel(BaseModel): field: str = Field(default=FromParent("field")) field2: str = Field(default=FromParent("field")) p = ParentModel() c = ChildModel(_parent=p) self.assertEqual(p.field, "text") self.assertEqual(c.field, "text") self.assertEqual(c.field2, "text") c.field = "new-text" self.assertEqual(p.field, "text") self.assertEqual(c.field, "new-text") self.assertEqual(c.field2, "text") def test_parent_fallback(self): class SubModel(BaseModel): field: str = Field(default=FromParent("field", fallback="submodel-fallback")) class Model(BaseModel): field: str = Field(default=FromParent("field", fallback="model-fallback")) sub: Optional[SubModel] = Field() sub_list: Optional[List[SubModel]] = Field() m = Model() s = SubModel(_parent=m) m.sub = s self.assertEqual(m.field, "model-fallback") self.assertEqual(m.sub.field, "model-fallback") m = Model(sub={}) self.assertEqual(m.field, "model-fallback") self.assertEqual(m.sub.field, "model-fallback") m = 
Model(sub=SubModel()) self.assertEqual(m.field, "model-fallback") self.assertEqual(m.sub.field, "model-fallback") m = Model() s = SubModel(_parent=m) m.sub_list = [s] self.assertEqual(m.field, "model-fallback") self.assertEqual(m.sub_list[0].field, "model-fallback") m = Model(sub_list=[{}]) self.assertEqual(m.field, "model-fallback") self.assertEqual(m.sub_list[0].field, "model-fallback") m = Model(sub_list=[SubModel()]) self.assertEqual(m.field, "model-fallback") self.assertEqual(m.sub_list[0].field, "model-fallback") m = Model() m.sub_list = [] m.sub_list.append({}) self.assertEqual(m.field, "model-fallback") self.assertEqual(m.sub_list[0].field, "model-fallback") def test_get_callback(self): class Model(BaseModel): quiet: bool = Field( default=False, ) verbose: bool = Field( default=False, # return False if ``quiet`` is True; return the actual value otherwise get_callback=lambda obj, value: False if obj.quiet else value, ) m = Model() self.assertEqual(m.quiet, False) self.assertEqual(m.verbose, False) m.quiet = True m.verbose = True self.assertEqual(m.quiet, True) self.assertEqual(m.verbose, False) m.quiet = False m.verbose = True self.assertEqual(m.quiet, False) self.assertEqual(m.verbose, True) def test_has_changed(self): class TestSubmodel(BaseModel): text: str = Field(default="default") class TestModel(BaseModel): field: Optional[List[TestSubmodel]] = Field(default=[]) m = TestModel() self.assertFalse(m.has_changed()) # a new instance of empty list m.field = [] self.assertFalse(m.has_changed()) m.field = [{"text": "one"}, {"text": "two"}] self.assertTrue(m.has_changed()) m.do_snapshot() # a new instance of list with new instances of objects with the same data m.field = [{"text": "one"}, {"text": "two"}] self.assertFalse(m.has_changed()) def test_append_dict(self): class TestSubmodel(BaseModel): text: str = Field(default="default") class TestModel(BaseModel): field: Optional[List[TestSubmodel]] = Field(default=[]) m = TestModel() m.field.append({"text": "value"}) # dict is converted to object next time the field is retrieved self.assertIsInstance(m.field[0], BaseModel) self.assertEqual(m.field[0].text, "value") def test_ordering(self): class TestSubmodel(BaseModel): txt: Optional[str] = Field() class TestModel(BaseModel): num: Optional[int] = Field() txt: Optional[str] = Field() sub: Optional[TestSubmodel] = Field() dct: Optional[Dict[str, TestSubmodel]] = Field() m1 = TestModel() m2 = TestModel() self.assertEqual(m1, m2) m1 = TestModel(num=1) m2 = TestModel(num=2) self.assertNotEqual(m1, m2) self.assertLess(m1, m2) self.assertGreater(m2, m1) m1 = TestModel(txt="a") m2 = TestModel(txt="b") self.assertNotEqual(m1, m2) self.assertLess(m1, m2) self.assertGreater(m2, m1) m1 = TestModel(sub={}) m2 = TestModel(sub={}) self.assertEqual(m1, m2) m1 = TestModel(sub={"txt": "a"}) m2 = TestModel(sub={"txt": "b"}) self.assertNotEqual(m1, m2) self.assertLess(m1, m2) self.assertGreater(m2, m1) m1 = TestModel(dct={}) m2 = TestModel(dct={}) self.assertEqual(m1, m2) m1 = TestModel(dct={"a": TestSubmodel()}) m2 = TestModel(dct={"b": TestSubmodel()}) self.assertNotEqual(m1, m2) self.assertLess(m1, m2) self.assertGreater(m2, m1) # dict ordering doesn't matter m1 = TestModel(dct={"a": TestSubmodel(), "b": TestSubmodel()}) m2 = TestModel(dct={"b": TestSubmodel(), "a": TestSubmodel()}) self.assertEqual(m1, m2) if __name__ == "__main__": unittest.main() osc-1.12.1/tests/test_models_xmlmodel.py000066400000000000000000000162001475337502500203310ustar00rootroot00000000000000import io import textwrap import 
unittest from osc.util.models import * from osc.obs_api.simple_flag import SimpleFlag class TestXmlModel(unittest.TestCase): def test_attribute(self): class TestModel(XmlModel): XML_TAG = "tag" value: str = Field(xml_attribute=True) m = TestModel(value="FOO") self.assertEqual(m.dict(), {"value": "FOO"}) expected = """""" self.assertEqual(m.to_string(), expected) # verify that we can also load the serialized data m = TestModel.from_string(expected) self.assertEqual(m.to_string(), expected) def test_element(self): class TestModel(XmlModel): XML_TAG = "tag" value: str = Field() m = TestModel(value="FOO") self.assertEqual(m.dict(), {"value": "FOO"}) expected = textwrap.dedent( """ FOO """ ).strip() self.assertEqual(m.to_string(), expected) # verify that we can also load the serialized data m = TestModel.from_string(expected) self.assertEqual(m.to_string(), expected) def test_element_list(self): class TestModel(XmlModel): XML_TAG = "tag" value_list: List[str] = Field(xml_name="value") m = TestModel(value_list=["FOO", "BAR"]) self.assertEqual(m.dict(), {"value_list": ["FOO", "BAR"]}) expected = textwrap.dedent( """ FOO BAR """ ).strip() self.assertEqual(m.to_string(), expected) # verify that we can also load the serialized data m = TestModel.from_string(expected) self.assertEqual(m.to_string(), expected) def test_child_model(self): class ChildModel(XmlModel): XML_TAG = "child" value: str = Field() class ParentModel(XmlModel): XML_TAG = "parent" text: str = Field() child: ChildModel = Field() m = ParentModel(text="TEXT", child={"value": "FOO"}) expected = textwrap.dedent( """ TEXT FOO """ ).strip() self.assertEqual(m.to_string(), expected) # verify that we can also load the serialized data m = ParentModel.from_string(expected) self.assertEqual(m.to_string(), expected) def test_child_model_list(self): class ChildModel(XmlModel): XML_TAG = "child" value: str = Field() class ParentModel(XmlModel): XML_TAG = "parent" text: str = Field() child: List[ChildModel] = Field() m = ParentModel(text="TEXT", child=[{"value": "FOO"}, {"value": "BAR"}]) expected = textwrap.dedent( """ TEXT FOO BAR """ ).strip() self.assertEqual(m.to_string(), expected) # verify that we can also load the serialized data m = ParentModel.from_string(expected) self.assertEqual(m.to_string(), expected) def test_child_model_list_wrapped(self): class ChildModel(XmlModel): XML_TAG = "child" value: str = Field() class ParentModel(XmlModel): XML_TAG = "parent" text: str = Field() child: List[ChildModel] = Field(xml_wrapped=True, xml_name="children") m = ParentModel(text="TEXT", child=[{"value": "FOO"}, {"value": "BAR"}]) expected = textwrap.dedent( """ TEXT FOO BAR """ ).strip() self.assertEqual(m.to_string(), expected) # verify that we can also load the serialized data m = ParentModel.from_string(expected) self.assertEqual(m.to_string(), expected) def test_apiurl(self): class ChildModel(XmlModel): XML_TAG = "child" value: str = Field() class ParentModel(XmlModel): XML_TAG = "parent" text: str = Field() child: List[ChildModel] = Field(xml_wrapped=True, xml_name="children") # serialize the model and load it with apiurl set m = ParentModel(text="TEXT", child=[{"value": "FOO"}, {"value": "BAR"}]) xml = m.to_string() apiurl = "https://api.example.com" m = ParentModel.from_string(xml, apiurl=apiurl) m.child.append({"value": "BAZ"}) self.assertEqual(m._apiurl, apiurl) self.assertEqual(m.child[0]._apiurl, apiurl) self.assertEqual(m.child[1]._apiurl, apiurl) self.assertEqual(m.child[2]._apiurl, apiurl) # test the same as above but with a file f = 
io.StringIO(xml) m = ParentModel.from_file(f, apiurl=apiurl) m.child.append({"value": "BAZ"}) self.assertEqual(m._apiurl, apiurl) self.assertEqual(m.child[0]._apiurl, apiurl) self.assertEqual(m.child[1]._apiurl, apiurl) self.assertEqual(m.child[2]._apiurl, apiurl) def test_empty_int_optional(self): class TestModel(XmlModel): XML_TAG = "model" num_attr: Optional[int] = Field(xml_attribute=True) num_elem: Optional[int] = Field() data = textwrap.dedent( """ """ ).strip() m = TestModel.from_string(data) self.assertEqual(m.num_attr, None) self.assertEqual(m.num_elem, None) def test_empty_int(self): class TestModel(XmlModel): XML_TAG = "model" num_attr: int = Field(xml_attribute=True) num_elem: int = Field() data = textwrap.dedent( """ """ ).strip() self.assertRaises(TypeError, TestModel.from_string, data) def test_simple_flag(self): class TestModel(XmlModel): XML_TAG = "model" simple_flag: Optional[SimpleFlag] = Field( xml_wrapped=True, ) data = textwrap.dedent( """ """ ).strip() m = TestModel.from_string(data) self.assertEqual(m.simple_flag, "enable") self.assertEqual(data, m.to_string()) if __name__ == "__main__": unittest.main() osc-1.12.1/tests/test_output.py000066400000000000000000000213501475337502500165070ustar00rootroot00000000000000import contextlib import io import tempfile import unittest import osc.conf from osc.output import KeyValueTable from osc.output import print_msg from osc.output import safe_write from osc.output import sanitize_text from osc.output import tty class TestKeyValueTable(unittest.TestCase): def test_empty(self): t = KeyValueTable() self.assertEqual(str(t), "") def test_simple(self): t = KeyValueTable() t.add("Key", "Value") t.add("FooBar", "Text") expected = """ Key : Value FooBar : Text """.strip() self.assertEqual(str(t), expected) def test_newline(self): t = KeyValueTable() t.add("Key", "Value") t.newline() t.add("FooBar", "Text") expected = """ Key : Value FooBar : Text """.strip() self.assertEqual(str(t), expected) def test_continuation(self): t = KeyValueTable() t.add("Key", ["Value1", "Value2"]) expected = """ Key : Value1 Value2 """.strip() self.assertEqual(str(t), expected) def test_section(self): t = KeyValueTable() t.add("Section", None) t.add("Key", "Value", indent=4) t.add("FooBar", "Text", indent=4) expected = """ Section Key : Value FooBar : Text """.strip() self.assertEqual(str(t), expected) def test_wide_chars(self): t = KeyValueTable() t.add("Key", "Value") t.add("🚀🚀🚀", "Value") expected = """ Key : Value 🚀🚀🚀 : Value """.strip() self.assertEqual(str(t), expected) def test_empty_value_no_color(self): t = KeyValueTable() t.add("Key", "", color="bold") expected = "Key : " self.assertEqual(str(t), expected) class TestPrintMsg(unittest.TestCase): def setUp(self): osc.conf.config = osc.conf.Options() def test_debug(self): osc.conf.config["debug"] = False stdout = io.StringIO() stderr = io.StringIO() with contextlib.redirect_stdout(stdout), contextlib.redirect_stderr(stderr): print_msg("foo", "bar", print_to="debug") self.assertEqual("", stdout.getvalue()) self.assertEqual("", stderr.getvalue()) osc.conf.config["debug"] = True stdout = io.StringIO() stderr = io.StringIO() with contextlib.redirect_stdout(stdout), contextlib.redirect_stderr(stderr): print_msg("foo", "bar", print_to="debug") self.assertEqual("", stdout.getvalue()) self.assertEqual("DEBUG: foo bar\n", stderr.getvalue()) def test_verbose(self): osc.conf.config["verbose"] = False stdout = io.StringIO() stderr = io.StringIO() with contextlib.redirect_stdout(stdout), 
contextlib.redirect_stderr(stderr): print_msg("foo", "bar", print_to="verbose") self.assertEqual("", stdout.getvalue()) self.assertEqual("", stderr.getvalue()) osc.conf.config["verbose"] = True stdout = io.StringIO() stderr = io.StringIO() with contextlib.redirect_stdout(stdout), contextlib.redirect_stderr(stderr): print_msg("foo", "bar", print_to="verbose") self.assertEqual("foo bar\n", stdout.getvalue()) self.assertEqual("", stderr.getvalue()) osc.conf.config["verbose"] = False osc.conf.config["debug"] = True stdout = io.StringIO() stderr = io.StringIO() with contextlib.redirect_stdout(stdout), contextlib.redirect_stderr(stderr): print_msg("foo", "bar", print_to="verbose") self.assertEqual("foo bar\n", stdout.getvalue()) self.assertEqual("", stderr.getvalue()) def test_error(self): stdout = io.StringIO() stderr = io.StringIO() with contextlib.redirect_stdout(stdout), contextlib.redirect_stderr(stderr): print_msg("foo", "bar", print_to="error") self.assertEqual("", stdout.getvalue()) self.assertEqual(f"{tty.colorize('ERROR:', 'red,bold')} foo bar\n", stderr.getvalue()) def test_warning(self): stdout = io.StringIO() stderr = io.StringIO() with contextlib.redirect_stdout(stdout), contextlib.redirect_stderr(stderr): print_msg("foo", "bar", print_to="warning") self.assertEqual("", stdout.getvalue()) self.assertEqual(f"{tty.colorize('WARNING:', 'yellow,bold')} foo bar\n", stderr.getvalue()) def test_none(self): stdout = io.StringIO() stderr = io.StringIO() with contextlib.redirect_stdout(stdout), contextlib.redirect_stderr(stderr): print_msg("foo", "bar", print_to=None) self.assertEqual("", stdout.getvalue()) self.assertEqual("", stderr.getvalue()) def test_stdout(self): stdout = io.StringIO() stderr = io.StringIO() with contextlib.redirect_stdout(stdout), contextlib.redirect_stderr(stderr): print_msg("foo", "bar", print_to="stdout") self.assertEqual("foo bar\n", stdout.getvalue()) self.assertEqual("", stderr.getvalue()) def test_stderr(self): stdout = io.StringIO() stderr = io.StringIO() with contextlib.redirect_stdout(stdout), contextlib.redirect_stderr(stderr): print_msg("foo", "bar", print_to="stderr") self.assertEqual("", stdout.getvalue()) self.assertEqual("foo bar\n", stderr.getvalue()) class TestSanitization(unittest.TestCase): def test_control_chars_bytes(self): original = b"".join([i.to_bytes(1, byteorder="big") for i in range(32)]) sanitized = sanitize_text(original) self.assertEqual(sanitized, b"\t\n\r") def test_control_chars_str(self): original = "".join([chr(i) for i in range(32)]) sanitized = sanitize_text(original) self.assertEqual(sanitized, "\t\n\r") def test_csi_escape_sequences_str(self): # allowed CSI escape sequences originals = [">\033[0m<", ">\033[1;31;47m]<"] for original in originals: sanitized = sanitize_text(original) self.assertEqual(sanitized, original) # not allowed CSI escape sequences originals = [">\033[8m<"] for original in originals: sanitized = sanitize_text(original) self.assertEqual(sanitized, "><") def test_csi_escape_sequences_bytes(self): # allowed CSI escape sequences originals = [b">\033[0m<", b">\033[1;31;47m]<"] for original in originals: sanitized = sanitize_text(original) self.assertEqual(sanitized, original) # not allowed CSI escape sequences originals = [b">\033[8m<"] for original in originals: sanitized = sanitize_text(original) self.assertEqual(sanitized, b"><") def test_standalone_escape_str(self): original = ">\033<" sanitized = sanitize_text(original) self.assertEqual(sanitized, "><") def test_standalone_escape_bytes(self): # standalone 
escape original = b">\033<" sanitized = sanitize_text(original) self.assertEqual(sanitized, b"><") def test_fe_escape_sequences_str(self): for i in range(0x40, 0x5F + 1): char = chr(i) original = f">\033{char}<" sanitized = sanitize_text(original) self.assertEqual(sanitized, "><") def test_fe_escape_sequences_bytes(self): for i in range(0x40, 0x5F + 1): byte = i.to_bytes(1, byteorder="big") original = b">\033" + byte + b"<" sanitized = sanitize_text(original) self.assertEqual(sanitized, b"><") def test_osc_escape_sequences_str(self): # OSC (Operating System Command) sequences original = "\033]0;this is the window title\007" sanitized = sanitize_text(original) # \033] is removed with the Fe sequences self.assertEqual(sanitized, "0;this is the window title") def test_osc_escape_sequences_bytes(self): # OSC (Operating System Command) sequences original = b"\033]0;this is the window title\007" sanitized = sanitize_text(original) # \033] is removed with the Fe sequences self.assertEqual(sanitized, b"0;this is the window title") class TestSafeWrite(unittest.TestCase): def test_string_to_file(self): with tempfile.NamedTemporaryFile(mode="w+") as f: safe_write(f, "string") def test_bytes_to_file(self): with tempfile.NamedTemporaryFile(mode="wb+") as f: safe_write(f, b"bytes") def test_string_to_stringio(self): with io.StringIO() as f: safe_write(f, "string") def test_bytes_to_bytesio(self): with io.BytesIO() as f: safe_write(f, b"bytes") if __name__ == "__main__": unittest.main() osc-1.12.1/tests/test_package_status.py000066400000000000000000000060261475337502500201500ustar00rootroot00000000000000import os import unittest import osc.core import osc.oscerr from .common import OscTestCase FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'project_package_status_fixtures') def suite(): return unittest.defaultTestLoader.loadTestsFromTestCase(TestPackageStatus) class TestPackageStatus(OscTestCase): def _get_fixtures_dir(self): return FIXTURES_DIR def test_allfiles(self): """get the status of all files in the wc""" self._change_to_pkg('simple') p = osc.core.Package('.') exp_st = [('A', 'add'), ('?', 'exists'), ('D', 'foo'), ('!', 'merge'), ('R', 'missing'), ('!', 'missing_added'), ('M', 'nochange'), ('S', 'skipped'), (' ', 'test')] st = p.get_status() self.assertEqual(exp_st, st) def test_todo(self): """ get the status of some files in the wc. """ self._change_to_pkg('simple') p = osc.core.Package('.') p.todo = ['test', 'missing_added', 'foo'] exp_st = [('D', 'foo'), ('!', 'missing_added')] st = p.get_status(False, ' ') self.assertEqual(exp_st, st) def test_todo_noexcl(self): """ get the status of some files in the wc. 
""" self._change_to_pkg('simple') p = osc.core.Package('.') p.todo = ['test', 'missing_added', 'foo'] exp_st = [('D', 'foo'), ('!', 'missing_added'), (' ', 'test')] st = p.get_status() self.assertEqual(exp_st, st) def test_exclude_state(self): """get the status of all files in the wc but exclude some states""" self._change_to_pkg('simple') p = osc.core.Package('.') exp_st = [('A', 'add'), ('?', 'exists'), ('D', 'foo')] st = p.get_status(False, '!', 'S', ' ', 'M', 'R') self.assertEqual(exp_st, st) def test_nonexistent(self): """get the status of a non existent file""" self._change_to_pkg('simple') p = osc.core.Package('.') p.todo = ['doesnotexist'] self.assertRaises(osc.oscerr.OscIOError, p.get_status) def test_conflict(self): """get status of the wc (one file in conflict state)""" self._change_to_pkg('conflict') p = osc.core.Package('.') exp_st = [('C', 'conflict'), ('?', 'exists'), (' ', 'test')] st = p.get_status() self.assertEqual(exp_st, st) def test_excluded(self): """get status of the wc (ignore excluded files); package has state ' '""" self._change_to_pkg('excluded') p = osc.core.Package('.') exp_st = [('?', 'exists'), ('M', 'modified')] st = p.get_status(False, ' ') self.assertEqual(exp_st, st) def test_noexcluded(self): """get status of the wc (include excluded files)""" self._change_to_pkg('excluded') p = osc.core.Package('.') exp_st = [('?', '_linkerror'), ('?', 'exists'), ('?', 'foo.orig'), ('M', 'modified'), (' ', 'test')] st = p.get_status(True) self.assertEqual(exp_st, st) if __name__ == '__main__': unittest.main() osc-1.12.1/tests/test_prdiff.py000066400000000000000000000220261475337502500164220ustar00rootroot00000000000000import os import re import shutil import sys import tempfile import unittest import osc.commandline import osc.core import osc.oscerr from .common import GET, POST, OscTestCase, EXPECTED_REQUESTS FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'prdiff_fixtures') UPSTREAM = 'some:project' BRANCH = 'home:user:branches:' + UPSTREAM def rdiff_url(pkg, oldprj, newprj): return 'http://localhost/source/%s/%s?unified=1&opackage=%s&oproject=%s&cmd=diff&expand=1&tarlimit=0&filelimit=0' % \ (newprj, pkg, pkg, oldprj.replace(':', '%3A')) def request_url(prj): return "http://localhost/request" + f"?view=collection&project={prj}&states=new,review".replace(":", "%3A").replace(",", "%2C") def GET_PROJECT_PACKAGES(*projects): def decorator(test_method): # decorators get applied in the reversed order (bottom-up) for project in reversed(projects): test_method = GET(f'http://localhost/source/{project}', file=f'{project}/directory')(test_method) return test_method return decorator def POST_RDIFF(oldprj, newprj): def decorator(test_method): # decorators get applied in the reversed order (bottom-up) test_method = POST(rdiff_url('common-three', oldprj, newprj), exp='', text='')(test_method) test_method = POST(rdiff_url('common-two', oldprj, newprj), exp='', file='common-two-diff')(test_method) test_method = POST(rdiff_url('common-one', oldprj, newprj), exp='', text='')(test_method) return test_method return decorator def suite(): return unittest.defaultTestLoader.loadTestsFromTestCase(TestProjectDiff) class TestProjectDiff(OscTestCase): diff_hdr = 'Index: %s\n===================================================================' def setUp(self, copytree=True): super().setUp(copytree=copytree) self.tmpdir_fixtures = tempfile.mkdtemp(prefix='osc_test') shutil.copytree(self._get_fixtures_dir(), os.path.join(self.tmpdir_fixtures, "fixtures")) def tearDown(self): try: 
shutil.rmtree(self.tmpdir_fixtures) except: pass super().tearDown() def _get_fixtures_dir(self): return FIXTURES_DIR def _change_to_tmpdir(self, *args): os.chdir(os.path.join(self.tmpdir, *args)) def _run_prdiff(self, *args): """Runs osc prdiff, returning captured STDOUT as a string.""" cli = osc.commandline.Osc() argv = ['osc', '--no-keyring', 'prdiff'] argv.extend(args) cli.main(argv=argv) return sys.stdout.getvalue() def testPrdiffTooManyArgs(self): def runner(): self._run_prdiff('one', 'two', 'superfluous-arg') self.assertRaises(osc.oscerr.WrongArgs, runner) @GET_PROJECT_PACKAGES(UPSTREAM, BRANCH) @POST_RDIFF(UPSTREAM, BRANCH) @POST(rdiff_url('only-in-new', UPSTREAM, BRANCH), exp='', text='') def testPrdiffZeroArgs(self): exp = """identical: common-one differs: common-two identical: common-three identical: only-in-new """ def runner(): self._run_prdiff() os.chdir('/tmp') self.assertRaises(osc.oscerr.WrongArgs, runner) self._change_to_tmpdir(self.tmpdir_fixtures, "fixtures", UPSTREAM) self.assertRaises(osc.oscerr.WrongArgs, runner) self._change_to_tmpdir(self.tmpdir_fixtures, "fixtures", BRANCH) out = self._run_prdiff() self.assertEqualMultiline(out, exp) @GET_PROJECT_PACKAGES(UPSTREAM, BRANCH) @POST_RDIFF(UPSTREAM, BRANCH) @POST(rdiff_url('only-in-new', UPSTREAM, BRANCH), exp='', text='') def testPrdiffOneArg(self): self._change_to_tmpdir() exp = """identical: common-one differs: common-two identical: common-three identical: only-in-new """ out = self._run_prdiff('home:user:branches:some:project') self.assertEqualMultiline(out, exp) @GET_PROJECT_PACKAGES('old:prj', 'new:prj') @POST_RDIFF('old:prj', 'new:prj') def testPrdiffTwoArgs(self): self._change_to_tmpdir() exp = """identical: common-one differs: common-two identical: common-three """ out = self._run_prdiff('old:prj', 'new:prj') self.assertEqualMultiline(out, exp) @GET_PROJECT_PACKAGES('old:prj', 'new:prj') @POST_RDIFF('old:prj', 'new:prj') def testPrdiffOldOnly(self): self._change_to_tmpdir() exp = """identical: common-one differs: common-two identical: common-three old only: only-in-old """ out = self._run_prdiff('--show-not-in-new', 'old:prj', 'new:prj') self.assertEqualMultiline(out, exp) @GET_PROJECT_PACKAGES('old:prj', 'new:prj') @POST_RDIFF('old:prj', 'new:prj') def testPrdiffNewOnly(self): self._change_to_tmpdir() exp = """identical: common-one differs: common-two identical: common-three new only: only-in-new """ out = self._run_prdiff('--show-not-in-old', 'old:prj', 'new:prj') self.assertEqualMultiline(out, exp) @GET_PROJECT_PACKAGES('old:prj', 'new:prj') @POST_RDIFF('old:prj', 'new:prj') def testPrdiffDiffstat(self): self._change_to_tmpdir() exp = """identical: common-one differs: common-two common-two | 1 + 1 file changed, 1 insertion(+) identical: common-three """ out = self._run_prdiff('--diffstat', 'old:prj', 'new:prj') self.assertEqualMultiline(out, exp) @GET_PROJECT_PACKAGES('old:prj', 'new:prj') @POST_RDIFF('old:prj', 'new:prj') def testPrdiffUnified(self): self._change_to_tmpdir() exp = """identical: common-one differs: common-two Index: common-two =================================================================== --- common-two\t2013-01-18 19:18:38.225983117 +0000 +++ common-two\t2013-01-18 19:19:27.882082325 +0000 @@ -1,4 +1,5 @@ line one line two line three +an extra line last line identical: common-three """ out = self._run_prdiff('--unified', 'old:prj', 'new:prj') self.assertEqualMultiline(out, exp) @GET_PROJECT_PACKAGES('old:prj', 'new:prj') @POST(rdiff_url('common-two', 'old:prj', 'new:prj'), exp='', 
file='common-two-diff') @POST(rdiff_url('common-three', 'old:prj', 'new:prj'), exp='', text='') def testPrdiffInclude(self): self._change_to_tmpdir() exp = """differs: common-two identical: common-three """ out = self._run_prdiff('--include', 'common-t', 'old:prj', 'new:prj') self.assertEqualMultiline(out, exp) @GET_PROJECT_PACKAGES('old:prj', 'new:prj') @POST(rdiff_url('common-two', 'old:prj', 'new:prj'), exp='', file='common-two-diff') @POST(rdiff_url('common-three', 'old:prj', 'new:prj'), exp='', text='') def testPrdiffExclude(self): self._change_to_tmpdir() exp = """differs: common-two identical: common-three """ out = self._run_prdiff('--exclude', 'one', 'old:prj', 'new:prj') self.assertEqualMultiline(out, exp) @GET_PROJECT_PACKAGES('old:prj', 'new:prj') @POST(rdiff_url('common-two', 'old:prj', 'new:prj'), exp='', file='common-two-diff') def testPrdiffIncludeExclude(self): self._change_to_tmpdir() exp = """differs: common-two """ out = self._run_prdiff('--include', 'common-t', '--exclude', 'three', 'old:prj', 'new:prj') self.assertEqualMultiline(out, exp) @GET_PROJECT_PACKAGES(UPSTREAM, BRANCH) @GET(request_url(UPSTREAM), exp='', file='request') @POST(rdiff_url('common-one', UPSTREAM, BRANCH), exp='', text='') @POST(rdiff_url('common-two', UPSTREAM, BRANCH), exp='', file='common-two-diff') @POST(rdiff_url('common-three', UPSTREAM, BRANCH), exp='', file='common-two-diff') @POST(rdiff_url('only-in-new', UPSTREAM, BRANCH), exp='', text='') def testPrdiffRequestsMatching(self): self._change_to_tmpdir() exp = """identical: common-one differs: common-two 148023 State:new By:user When:2013-01-11T11:04:14 Created by: creator submit: home:user:branches:some:project/common-two@7 -> some:project Descr: - Fix it to work - Improve support for something differs: common-three identical: only-in-new """ out = self._run_prdiff('--requests', UPSTREAM, BRANCH) self.assertEqualMultiline(out, exp) # Reverse the direction of the diff. 
@GET_PROJECT_PACKAGES(BRANCH, UPSTREAM) @GET(request_url(BRANCH), exp='', file='no-requests') @POST(rdiff_url('common-one', BRANCH, UPSTREAM), exp='', text='') @POST(rdiff_url('common-two', BRANCH, UPSTREAM), exp='', file='common-two-diff') @POST(rdiff_url('common-three', BRANCH, UPSTREAM), exp='', file='common-two-diff') @POST(rdiff_url('only-in-new', BRANCH, UPSTREAM), exp='', text='') def testPrdiffRequestsSwitched(self): self._change_to_tmpdir() exp = """identical: common-one differs: common-two differs: common-three identical: only-in-new """ out = self._run_prdiff('--requests', BRANCH, UPSTREAM) self.assertEqualMultiline(out, exp) if __name__ == '__main__': unittest.main() osc-1.12.1/tests/test_project_status.py000066400000000000000000000133051475337502500202210ustar00rootroot00000000000000import os import unittest import osc.core import osc.oscerr from .common import OscTestCase FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'project_package_status_fixtures') def suite(): return unittest.defaultTestLoader.loadTestsFromTestCase(TestProjectStatus) class TestProjectStatus(OscTestCase): def _get_fixtures_dir(self): return FIXTURES_DIR def test_simple(self): """get the status of a package with state ' '""" self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) exp_st = ' ' st = prj.status('simple') self.assertEqual(exp_st, st) def test_added(self): """get the status of an added package""" self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) exp_st = 'A' st = prj.status('added') self.assertEqual(exp_st, st) def test_deleted(self): """get the status of a deleted package""" self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) exp_st = 'D' st = prj.status('deleted') self.assertEqual(exp_st, st) def test_added_deleted(self): """ get the status of a package which was added and deleted afterwards (with a non osc command) """ self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) exp_st = '!' st = prj.status('added_deleted') self.assertEqual(exp_st, st) def test_missing(self): """ get the status of a package with state " " which was removed by a non osc command """ self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) exp_st = '!' st = prj.status('missing') self.assertEqual(exp_st, st) def test_deleted_deleted(self): """ get the status of a package which was deleted (with an osc command) and afterwards the package directory was deleted with a non osc command """ self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) exp_st = 'D' st = prj.status('deleted_deleted') self.assertEqual(exp_st, st) def test_unversioned_exists(self): """get the status of an unversioned package""" self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) exp_st = '?' 
st = prj.status('excluded') self.assertEqual(exp_st, st) def test_unversioned_nonexistent(self): """get the status of an unversioned, nonexistent package""" self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) self.assertRaises(osc.oscerr.OscIOError, prj.status, 'doesnotexist') def test_get_status(self): """get the status of the complete project""" self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) exp_st = [(' ', 'conflict'), (' ', 'simple'), ('A', 'added'), ('D', 'deleted'), ('!', 'missing'), ('!', 'added_deleted'), ('D', 'deleted_deleted'), ('?', 'excluded')] st = prj.get_status() self.assertEqual(exp_st, st) def test_get_status_excl(self): """get the status of the complete project (exclude some states)""" self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) exp_st = [('A', 'added'), ('!', 'missing'), ('!', 'added_deleted')] st = prj.get_status('D', ' ', '?') self.assertEqual(exp_st, st) def test_get_pacobj_simple(self): """package exists""" self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) p = prj.get_pacobj('simple') self.assertTrue(isinstance(p, osc.core.Package)) self.assertEqual(p.name, 'simple') def test_get_pacobj_added(self): """package has state 'A', also test pac_kwargs""" self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) p = prj.get_pacobj('added', progress_obj={}) self.assertTrue(isinstance(p, osc.core.Package)) self.assertEqual(p.name, 'added') self.assertEqual(p.progress_obj, {}) def test_get_pacobj_deleted(self): """package has state 'D' and exists, also test pac_args""" self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) p = prj.get_pacobj('deleted', {}) self.assertTrue(isinstance(p, osc.core.Package)) self.assertEqual(p.name, 'deleted') self.assertEqual(p.progress_obj, {}) def test_get_pacobj_missing(self): """package is missing""" self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) p = prj.get_pacobj('missing') self.assertTrue(isinstance(p, type(None))) def test_get_pacobj_deleted_deleted(self): """package has state 'D' and does not exist""" self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) p = prj.get_pacobj('deleted_deleted') self.assertTrue(isinstance(p, type(None))) def test_get_pacobj_unversioned(self): """package/dir has state '?'""" self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) p = prj.get_pacobj('excluded') self.assertTrue(isinstance(p, type(None))) def test_get_pacobj_nonexistent(self): """package/dir does not exist""" self._change_to_pkg('.') prj = osc.core.Project('.', getPackageList=False) p = prj.get_pacobj('doesnotexist') self.assertTrue(isinstance(p, type(None))) if __name__ == '__main__': unittest.main() osc-1.12.1/tests/test_repairwc.py000066400000000000000000000243741475337502500167740ustar00rootroot00000000000000import os import shutil import sys import unittest from xml.etree import ElementTree as ET import osc.core import osc.oscerr from .common import GET, PUT, POST, DELETE, OscTestCase FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'repairwc_fixtures') def suite(): return unittest.defaultTestLoader.loadTestsFromTestCase(TestRepairWC) class TestRepairWC(OscTestCase): def _get_fixtures_dir(self): return FIXTURES_DIR def __assertNotRaises(self, exception, meth, *args, **kwargs): try: meth(*args, **kwargs) except exception: self.fail('%s raised' % exception.__name__) def test_working_empty(self): """consistent, empty working 
copy""" self._change_to_pkg('working_empty') self.__assertNotRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') def test_working_nonempty(self): """ consistent, non-empty working copy. One file is in conflict, one file is marked for deletion and one file has state 'A' """ self._change_to_pkg('working_nonempty') self.__assertNotRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') def test_buildfiles(self): """ wc has a _buildconfig_prj_arch and a _buildinfo_prj_arch.xml in the storedir """ self._change_to_pkg('buildfiles') self.__assertNotRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') @GET('http://localhost/source/osctest/simple1/foo?rev=1', text='This is a simple test.\n') def test_simple1(self): """a file is marked for deletion but storefile doesn't exist""" self._change_to_pkg('simple1') self.assertRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') p = osc.core.Package('.', wc_check=False) p.wc_repair() self.assertTrue(os.path.exists(os.path.join('.osc', 'sources', 'foo'))) self._check_deletelist('foo\n') self._check_status(p, 'foo', 'D') self._check_status(p, 'nochange', 'M') self._check_status(p, 'merge', ' ') self._check_status(p, 'toadd1', '?') # additional cleanup check self.__assertNotRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') def test_simple2(self): """a file "somefile" exists in the storedir which isn't tracked""" self._change_to_pkg('simple2') self.assertRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') p = osc.core.Package('.', wc_check=False) p.wc_repair() self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'somefile'))) self._check_deletelist('foo\n') self._check_status(p, 'foo', 'D') self._check_status(p, 'nochange', 'M') self._check_status(p, 'merge', ' ') self._check_status(p, 'toadd1', '?') # additional cleanup check self.__assertNotRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') def test_simple3(self): """toadd1 has state 'A' and a file .osc/sources/toadd1 exists""" self._change_to_pkg('simple3') self.assertRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') p = osc.core.Package('.', wc_check=False) p.wc_repair() self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'toadd1'))) self._check_deletelist('foo\n') self._check_status(p, 'foo', 'D') self._check_status(p, 'nochange', 'M') self._check_status(p, 'merge', ' ') self._check_addlist('toadd1\n') self._check_status(p, 'toadd1', 'A') # additional cleanup check self.__assertNotRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') def test_simple4(self): """a file is listed in _to_be_deleted but isn't present in _files""" self._change_to_pkg('simple4') self.assertRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') p = osc.core.Package('.', wc_check=False) p.wc_repair() self._check_deletelist('foo\n') self._check_status(p, 'foo', 'D') self._check_status(p, 'nochange', 'M') self._check_status(p, 'merge', ' ') self._check_status(p, 'toadd1', '?') # additional cleanup check self.__assertNotRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') def test_simple5(self): """a file is listed in _in_conflict but isn't present in _files""" self._change_to_pkg('simple5') self.assertRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') p = osc.core.Package('.', wc_check=False) p.wc_repair() self.assertFalse(os.path.exists(os.path.join('.osc', '_in_conflict'))) self._check_deletelist('foo\n') self._check_status(p, 'foo', 'D') 
self._check_status(p, 'nochange', 'M') self._check_status(p, 'merge', ' ') self._check_status(p, 'toadd1', '?') # additional cleanup check self.__assertNotRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') @GET('http://localhost/source/osctest/simple6/foo?rev=1', text='This is a simple test.\n') def test_simple6(self): """ a file is listed in _to_be_deleted and is present in _files but the storefile is missing """ self._change_to_pkg('simple6') self.assertRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') p = osc.core.Package('.', wc_check=False) p.wc_repair() self.assertTrue(os.path.exists(os.path.join('.osc', 'sources', 'foo'))) self._check_deletelist('foo\n') self._check_status(p, 'foo', 'D') self._check_status(p, 'nochange', 'M') self._check_status(p, 'merge', ' ') self._check_status(p, 'toadd1', '?') # additional cleanup check self.__assertNotRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') def test_simple7(self): """files marked as skipped don't exist in the storedir""" self._change_to_pkg('simple7') self.__assertNotRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') def test_simple8(self): """ a file is marked as skipped but the skipped file exists in the storedir """ self._change_to_pkg('simple8') self.assertRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') p = osc.core.Package('.', wc_check=False) p.wc_repair() self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'skipped'))) self._check_deletelist('foo\n') self._check_status(p, 'foo', 'D') self._check_status(p, 'nochange', 'M') self._check_status(p, 'merge', ' ') self._check_status(p, 'toadd1', '?') self._check_status(p, 'skipped', 'S') # additional cleanup check self.__assertNotRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') @GET('http://localhost/source/osctest/multiple/merge?rev=1', text='Is it\npossible to\nmerge this file?I hope so...\n') @GET('http://localhost/source/osctest/multiple/nochange?rev=1', text='This file didn\'t change.\n') def test_multiple(self): """ a storefile is missing, a file is listed in _to_be_deleted but is not present in _files, a file is listed in _in_conflict but the storefile is missing and a file exists in the storedir but is not present in _files """ self._change_to_pkg('multiple') self.assertRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') p = osc.core.Package('.', wc_check=False) p.wc_repair() self.assertTrue(os.path.exists(os.path.join('.osc', 'sources', 'foo'))) self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'unknown_file'))) self._check_deletelist('foo\n') self._check_status(p, 'foo', 'D') self._check_status(p, 'nochange', 'C') self._check_status(p, 'merge', ' ') self._check_status(p, 'foobar', 'A') self._check_status(p, 'toadd1', '?') # additional cleanup check self.__assertNotRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') def test_noapiurl(self): """the package wc has no _apiurl file""" self._change_to_pkg('noapiurl') p = osc.core.Package('.', wc_check=False) p.wc_repair('http://localhost') self.assertTrue(os.path.exists(os.path.join('.osc', '_apiurl'))) self.assertFileContentEqual(os.path.join('.osc', '_apiurl'), 'http://localhost\n') self.assertEqual(p.apiurl, 'http://localhost') def test_invalidapiurl(self): """the package wc has an invalid apiurl file (invalid url format)""" self._change_to_pkg('invalid_apiurl') p = osc.core.Package('.', wc_check=False) p.wc_repair('http://localhost') 
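# A minimal sketch of the repair flow these tests exercise, using only calls
# that already appear in this class (the apiurl value is the test's localhost
# placeholder and is only needed when .osc/_apiurl itself is missing or invalid):
#
#   try:
#       pkg = osc.core.Package('.')                 # raises on an inconsistent wc
#   except osc.oscerr.WorkingCopyInconsistent:
#       pkg = osc.core.Package('.', wc_check=False)
#       pkg.wc_repair('http://localhost')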
self.assertTrue(os.path.exists(os.path.join('.osc', '_apiurl'))) self.assertFileContentEqual(os.path.join('.osc', '_apiurl'), 'http://localhost\n') self.assertEqual(p.apiurl, 'http://localhost') def test_noapiurlNotExistingApiurl(self): """the package wc has no _apiurl file and no apiurl is passed to repairwc""" self._change_to_pkg('noapiurl') self.assertRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Package, '.') p = osc.core.Package('.', wc_check=False) self.assertRaises(osc.oscerr.WorkingCopyInconsistent, p.wc_repair) self.assertFalse(os.path.exists(os.path.join('.osc', '_apiurl'))) def test_project_noapiurl(self): """the project wc has no _apiurl file""" prj_dir = os.path.join(self.tmpdir, 'prj_noapiurl') shutil.copytree(os.path.join(self._get_fixtures_dir(), 'prj_noapiurl'), prj_dir) storedir = os.path.join(prj_dir, osc.core.store) self.assertRaises(osc.oscerr.WorkingCopyInconsistent, osc.core.Project, prj_dir, getPackageList=False) prj = osc.core.Project(prj_dir, wc_check=False, getPackageList=False) prj.wc_repair('http://localhost') self.assertTrue(os.path.exists(os.path.join(storedir, '_apiurl'))) self.assertTrue(os.path.exists(os.path.join(storedir, '_apiurl'))) self.assertFileContentEqual(os.path.join(storedir, '_apiurl'), 'http://localhost\n') if __name__ == '__main__': unittest.main() osc-1.12.1/tests/test_request.py000066400000000000000000000577651475337502500166620ustar00rootroot00000000000000import os import unittest from xml.etree import ElementTree as ET import osc.core import osc.oscerr from osc.util.xml import xml_fromstring from .common import OscTestCase FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'request_fixtures') def suite(): return unittest.defaultTestLoader.loadTestsFromTestCase(TestRequest) class TestRequest(OscTestCase): def _get_fixtures_dir(self): return FIXTURES_DIR def setUp(self): super().setUp(copytree=False) def test_createsr(self): """create a simple submitrequest""" r = osc.core.Request() r.add_action('submit', src_project='foo', src_package='bar', src_rev='42', tgt_project='foobar', tgt_package='bar') self.assertEqual(r.actions[0].type, 'submit') self.assertEqual(r.actions[0].src_project, 'foo') self.assertEqual(r.actions[0].src_package, 'bar') self.assertEqual(r.actions[0].src_rev, '42') self.assertEqual(r.actions[0].tgt_project, 'foobar') self.assertEqual(r.actions[0].tgt_package, 'bar') self.assertTrue(r.actions[0].opt_sourceupdate is None) self.assertTrue(r.actions[0].opt_updatelink is None) self.assertTrue(len(r.statehistory) == 0) self.assertTrue(len(r.reviews) == 0) self.assertRaises(AttributeError, getattr, r.actions[0], 'doesnotexist') exp = """ """ self.assertXMLEqual(exp, r.to_str()) def test_createsr_with_option(self): """create a submitrequest with option""" """create a simple submitrequest""" r = osc.core.Request() r.add_action('submit', src_project='foo', src_package='bar', tgt_project='foobar', tgt_package='bar', opt_sourceupdate='cleanup', opt_updatelink='1') self.assertEqual(r.actions[0].type, 'submit') self.assertEqual(r.actions[0].src_project, 'foo') self.assertEqual(r.actions[0].src_package, 'bar') self.assertEqual(r.actions[0].tgt_project, 'foobar') self.assertEqual(r.actions[0].tgt_package, 'bar') self.assertEqual(r.actions[0].opt_sourceupdate, 'cleanup') self.assertEqual(r.actions[0].opt_updatelink, '1') self.assertTrue(r.actions[0].src_rev is None) self.assertTrue(len(r.statehistory) == 0) self.assertTrue(len(r.reviews) == 0) self.assertRaises(AttributeError, getattr, r.actions[0], 'doesnotexist') exp = """ 
cleanup 1 """ self.assertXMLEqual(exp, r.to_str()) def test_createsr_missing_tgt_package(self): """create a submitrequest with missing target package""" r = osc.core.Request() r.add_action('submit', src_project='foo', src_package='bar', tgt_project='foobar') self.assertEqual(r.actions[0].type, 'submit') self.assertEqual(r.actions[0].src_project, 'foo') self.assertEqual(r.actions[0].src_package, 'bar') self.assertEqual(r.actions[0].tgt_project, 'foobar') self.assertTrue(len(r.statehistory) == 0) self.assertTrue(len(r.reviews) == 0) self.assertTrue(r.actions[0].tgt_package is None) self.assertRaises(AttributeError, getattr, r.actions[0], 'doesnotexist') exp = """ """ self.assertXMLEqual(exp, r.to_str()) def test_createsr_invalid_argument(self): """create a submitrequest with invalid action argument""" r = osc.core.Request() self.assertRaises(osc.oscerr.WrongArgs, r.add_action, 'submit', src_project='foo', src_invalid='bar') def test_create_request_invalid_type(self): """create a request with an invalid action type""" r = osc.core.Request() self.assertRaises(osc.oscerr.WrongArgs, r.add_action, 'invalid', foo='bar') def test_create_add_role_person(self): """create an add_role request (person element)""" r = osc.core.Request() r.add_action('add_role', tgt_project='foo', tgt_package='bar', person_name='user', person_role='reader') self.assertEqual(r.actions[0].type, 'add_role') self.assertEqual(r.actions[0].tgt_project, 'foo') self.assertEqual(r.actions[0].tgt_package, 'bar') self.assertEqual(r.actions[0].person_name, 'user') self.assertEqual(r.actions[0].person_role, 'reader') self.assertTrue(r.actions[0].group_name is None) self.assertTrue(r.actions[0].group_role is None) exp = """ """ self.assertXMLEqual(exp, r.to_str()) def test_create_add_role_group(self): """create an add_role request (group element)""" r = osc.core.Request() r.add_action('add_role', tgt_project='foo', tgt_package='bar', group_name='group', group_role='reviewer') self.assertEqual(r.actions[0].type, 'add_role') self.assertEqual(r.actions[0].tgt_project, 'foo') self.assertEqual(r.actions[0].tgt_package, 'bar') self.assertEqual(r.actions[0].group_name, 'group') self.assertEqual(r.actions[0].group_role, 'reviewer') self.assertTrue(r.actions[0].person_name is None) self.assertTrue(r.actions[0].person_role is None) self.assertTrue(len(r.statehistory) == 0) self.assertTrue(len(r.reviews) == 0) exp = """ """ self.assertXMLEqual(exp, r.to_str()) def test_create_add_role_person_group(self): """create an add_role request (person+group element)""" r = osc.core.Request() r.add_action('add_role', tgt_project='foo', tgt_package='bar', person_name='user', person_role='reader', group_name='group', group_role='reviewer') self.assertEqual(r.actions[0].type, 'add_role') self.assertEqual(r.actions[0].tgt_project, 'foo') self.assertEqual(r.actions[0].tgt_package, 'bar') self.assertEqual(r.actions[0].person_name, 'user') self.assertEqual(r.actions[0].person_role, 'reader') self.assertEqual(r.actions[0].group_name, 'group') self.assertEqual(r.actions[0].group_role, 'reviewer') self.assertTrue(len(r.statehistory) == 0) self.assertTrue(len(r.reviews) == 0) exp = """ """ self.assertXMLEqual(exp, r.to_str()) def test_create_set_bugowner_project(self): """create a set_bugowner request for a project""" r = osc.core.Request() r.add_action('set_bugowner', tgt_project='foobar', person_name='buguser') self.assertEqual(r.actions[0].type, 'set_bugowner') self.assertEqual(r.actions[0].tgt_project, 'foobar') self.assertEqual(r.actions[0].person_name, 
'buguser') self.assertTrue(r.actions[0].tgt_package is None) self.assertTrue(len(r.statehistory) == 0) self.assertTrue(len(r.reviews) == 0) exp = """ """ self.assertXMLEqual(exp, r.to_str()) def test_create_set_bugowner_package(self): """create a set_bugowner request for a package""" r = osc.core.Request() r.add_action('set_bugowner', tgt_project='foobar', tgt_package='baz', person_name='buguser') self.assertEqual(r.actions[0].type, 'set_bugowner') self.assertEqual(r.actions[0].tgt_project, 'foobar') self.assertEqual(r.actions[0].tgt_package, 'baz') self.assertEqual(r.actions[0].person_name, 'buguser') self.assertTrue(len(r.statehistory) == 0) self.assertTrue(len(r.reviews) == 0) exp = """ """ self.assertXMLEqual(exp, r.to_str()) def test_create_delete_project(self): """create a delete request for a project""" r = osc.core.Request() r.add_action('delete', tgt_project='foo') self.assertEqual(r.actions[0].type, 'delete') self.assertEqual(r.actions[0].tgt_project, 'foo') self.assertTrue(r.actions[0].tgt_package is None) self.assertTrue(len(r.statehistory) == 0) self.assertTrue(len(r.reviews) == 0) exp = """ """ self.assertXMLEqual(exp, r.to_str()) def test_create_delete_package(self): """create a delete request for a package""" r = osc.core.Request() r.add_action('delete', tgt_project='foo', tgt_package='deleteme') self.assertEqual(r.actions[0].type, 'delete') self.assertEqual(r.actions[0].tgt_project, 'foo') self.assertEqual(r.actions[0].tgt_package, 'deleteme') self.assertTrue(len(r.statehistory) == 0) self.assertTrue(len(r.reviews) == 0) exp = """ """ self.assertXMLEqual(exp, r.to_str()) def test_create_change_devel(self): """create a change devel request""" r = osc.core.Request() r.add_action('change_devel', src_project='foo', src_package='bar', tgt_project='devprj', tgt_package='devpkg') self.assertEqual(r.actions[0].type, 'change_devel') self.assertEqual(r.actions[0].src_project, 'foo') self.assertEqual(r.actions[0].src_package, 'bar') self.assertEqual(r.actions[0].tgt_project, 'devprj') self.assertEqual(r.actions[0].tgt_package, 'devpkg') self.assertTrue(len(r.statehistory) == 0) self.assertTrue(len(r.reviews) == 0) exp = """ """ self.assertXMLEqual(exp, r.to_str()) def test_action_from_xml1(self): """create action from xml""" xml = """ """ action = osc.core.Action.from_xml(xml_fromstring(xml)) self.assertEqual(action.type, 'add_role') self.assertEqual(action.tgt_project, 'foo') self.assertEqual(action.tgt_package, 'bar') self.assertEqual(action.person_name, 'user') self.assertEqual(action.person_role, 'reader') self.assertEqual(action.group_name, 'group') self.assertEqual(action.group_role, 'reviewer') self.assertXMLEqual(xml, action.to_str()) def test_action_from_xml2(self): """create action from xml""" xml = """ cleanup 1 """ action = osc.core.Action.from_xml(xml_fromstring(xml)) self.assertEqual(action.type, 'submit') self.assertEqual(action.src_project, 'foo') self.assertEqual(action.src_package, 'bar') self.assertEqual(action.tgt_project, 'foobar') self.assertEqual(action.tgt_package, 'bar') self.assertEqual(action.opt_sourceupdate, 'cleanup') self.assertEqual(action.opt_updatelink, '1') self.assertTrue(action.src_rev is None) self.assertXMLEqual(xml, action.to_str()) def test_action_from_xml3(self): """create action from xml (with acceptinfo element)""" xml = """ """ action = osc.core.Action.from_xml(xml_fromstring(xml)) self.assertEqual(action.type, 'submit') self.assertEqual(action.src_project, 'testprj') self.assertEqual(action.src_package, 'bar') 
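# Sketch of the request-construction API this file tests, assembled from the
# calls asserted above (project/package names are the fixture values used here):
#
#   r = osc.core.Request()
#   r.add_action('submit', src_project='foo', src_package='bar',
#                tgt_project='foobar', tgt_package='bar')
#   xml = r.to_str()                      # serialize to <request> XML
#   r2 = osc.core.Request()
#   r2.read(xml_fromstring(xml))          # parse it back via osc.util.xml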
self.assertEqual(action.tgt_project, 'foobar') self.assertEqual(action.tgt_package, 'baz') self.assertTrue(action.opt_sourceupdate is None) self.assertTrue(action.opt_updatelink is None) self.assertTrue(action.src_rev is None) self.assertEqual(action.acceptinfo_rev, '5') self.assertEqual(action.acceptinfo_srcmd5, 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa') self.assertEqual(action.acceptinfo_xsrcmd5, 'ffffffffffffffffffffffffffffffff') self.assertTrue(action.acceptinfo_osrcmd5 is None) self.assertTrue(action.acceptinfo_oxsrcmd5 is None) self.assertXMLEqual(xml, action.to_str()) def test_action_from_xml_unknown_type(self): """try to create action from xml with unknown type""" xml = '' self.assertRaises(osc.oscerr.WrongArgs, osc.core.Action.from_xml, xml_fromstring(xml)) def test_read_request1(self): """read in a request""" xml = self._get_fixture('test_read_request1.xml') r = osc.core.Request() r.read(xml_fromstring(xml)) self.assertEqual(r.reqid, '42') self.assertEqual(r.actions[0].type, 'submit') self.assertEqual(r.actions[0].src_project, 'foo') self.assertEqual(r.actions[0].src_package, 'bar') self.assertEqual(r.actions[0].src_rev, '1') self.assertEqual(r.actions[0].tgt_project, 'foobar') self.assertEqual(r.actions[0].tgt_package, 'bar') self.assertTrue(r.actions[0].opt_sourceupdate is None) self.assertTrue(r.actions[0].opt_updatelink is None) self.assertEqual(r.actions[1].type, 'delete') self.assertEqual(r.actions[1].tgt_project, 'deleteme') self.assertTrue(r.actions[1].tgt_package is None) self.assertEqual(r.state.name, 'accepted') self.assertEqual(r.state.when, '2010-12-27T01:36:29') self.assertEqual(r.state.who, 'user1') self.assertEqual(r.state.approver, None) self.assertEqual(r.state.comment, '') self.assertEqual(r.statehistory[0].when, '2010-12-13T13:02:03') self.assertEqual(r.statehistory[0].who, 'creator') self.assertEqual(r.statehistory[0].comment, 'foobar') self.assertEqual(r.title, 'title of the request') self.assertEqual(r.description, 'this is a\nvery long\ndescription') self.assertTrue(len(r.statehistory) == 1) self.assertTrue(len(r.reviews) == 0) self.assertXMLEqual(xml, r.to_str()) def test_read_request2(self): """read in a request (with reviews)""" xml = self._get_fixture('test_read_request2.xml') r = osc.core.Request() r.read(xml_fromstring(xml)) self.assertEqual(r.reqid, '123') self.assertEqual(r.actions[0].type, 'submit') self.assertEqual(r.actions[0].src_project, 'xyz') self.assertEqual(r.actions[0].src_package, 'abc') self.assertTrue(r.actions[0].src_rev is None) self.assertEqual(r.actions[0].opt_sourceupdate, 'cleanup') self.assertEqual(r.actions[0].opt_updatelink, '1') self.assertEqual(r.actions[1].type, 'add_role') self.assertEqual(r.actions[1].tgt_project, 'home:foo') self.assertEqual(r.actions[1].person_name, 'bar') self.assertEqual(r.actions[1].person_role, 'maintainer') self.assertEqual(r.actions[1].group_name, 'groupxyz') self.assertEqual(r.actions[1].group_role, 'reader') self.assertTrue(r.actions[1].tgt_package is None) self.assertEqual(r.state.name, 'review') self.assertEqual(r.state.when, '2010-12-27T01:36:29') self.assertEqual(r.state.approver, 'someone') self.assertEqual(r.state.who, 'abc') self.assertEqual(r.state.comment, '') self.assertEqual(r.reviews[0].state, 'new') self.assertEqual(r.reviews[0].by_group, 'group1') self.assertEqual(r.reviews[0].when, '2010-12-28T00:11:22') self.assertEqual(r.reviews[0].who, 'abc') self.assertEqual(r.reviews[0].comment, 'review start') self.assertTrue(r.reviews[0].by_user is None) self.assertEqual(r.statehistory[0].when, 
'2010-12-11T00:00:00') self.assertEqual(r.statehistory[0].who, 'creator') self.assertEqual(r.statehistory[0].comment, '') self.assertEqual(r.creator, 'creator') self.assertTrue(len(r.statehistory) == 1) self.assertTrue(len(r.reviews) == 1) self.assertXMLEqual(xml, r.to_str()) def test_read_request3(self): """read in a request (with an "empty" comment+description)""" xml = """ """ r = osc.core.Request() r.read(xml_fromstring(xml)) self.assertEqual(r.reqid, '2') self.assertEqual(r.actions[0].type, 'set_bugowner') self.assertEqual(r.actions[0].tgt_project, 'foo') self.assertEqual(r.actions[0].person_name, 'buguser') self.assertEqual(r.state.name, 'new') self.assertEqual(r.state.when, '2010-12-28T12:36:29') self.assertEqual(r.state.who, 'xyz') self.assertEqual(r.state.comment, '') self.assertEqual(r.description, '') self.assertTrue(len(r.statehistory) == 0) self.assertTrue(len(r.reviews) == 0) self.assertEqual(r.creator, 'xyz') exp = """ """ self.assertXMLEqual(exp, r.to_str()) def test_request_list_view1(self): """test the list_view method""" xml = self._get_fixture('test_request_list_view1.xml') exp = """\ 62 State:new By:Admin When:2010-12-29T14:57:25 Created by: Admin set_bugowner: buguser foo add_role: person: xyz as maintainer, group: group1 as reader foobar add_role: person: abc as reviewer foo/bar change_devel: foo/bar developed in devprj/devpkg submit: srcprj/srcpackage -> tgtprj/tgtpackage submit: foo/bar -> baz delete: deleteme delete: foo/bar\n""" r = osc.core.Request() r.read(xml_fromstring(xml)) self.assertEqual(exp, r.list_view()) def test_request_list_view2(self): """test the list_view method (with history elements and description)""" xml = self._get_fixture('test_request_list_view2.xml') r = osc.core.Request() r.read(xml_fromstring(xml)) exp = """\ 21 State:accepted By:foobar When:2010-12-29T16:37:45 Created by: foobar set_bugowner: buguser foo From: Created Request: user -> Review Approved: foobar Descr: This is a simple request with a lot of ... ... text and other stuff. This request also contains a description. This is useful to describe the request. blabla blabla\n""" self.assertEqual(exp, r.list_view()) def test_request_str1(self): """test the __str__ method""" xml = self._get_fixture('test_request_str1.xml') r = osc.core.Request() r = osc.core.Request() r.read(xml_fromstring(xml)) self.assertEqual(r.creator, 'creator') exp = """\ Request: 123 Created by: creator Actions: submit: xyz/abc(cleanup) -> foo ***update link*** add_role: person: bar as maintainer, group: groupxyz as reader home:foo Message: just a samll description in order to describe this request - blablabla test. 
State: review 2010-12-27T01:36:29 abc | currently in review Review: accepted Group: group1 2010-12-29T00:11:22 abc | accepted new Group: group1 2010-12-28T00:11:22 abc | review start History: 2010-12-12T00:00:00 creator revoked 2010-12-11T00:00:00 creator new""" self.assertEqual(exp, str(r)) def test_request_str2(self): """test the __str__ method""" xml = """\ """ r = osc.core.Request() r.read(xml_fromstring(xml)) self.assertEqual(r.creator, 'creator') exp = """\ Request: 98765 Created by: creator Actions: change_devel: foo/bar developed in devprj/devpkg delete: deleteme Message: State: new 2010-12-29T00:11:22 creator""" self.assertEqual(exp, str(r)) def test_legacy_request(self): """load old-style submitrequest""" xml = """\ """ r = osc.core.Request() r.read(xml_fromstring(xml)) self.assertEqual(r.reqid, '1234') self.assertEqual(r.actions[0].type, 'submit') self.assertEqual(r.actions[0].src_project, 'foobar') self.assertEqual(r.actions[0].src_package, 'baz') self.assertEqual(r.actions[0].tgt_project, 'foo') self.assertEqual(r.actions[0].tgt_package, 'baz') self.assertTrue(r.actions[0].opt_sourceupdate is None) self.assertTrue(r.actions[0].opt_updatelink is None) self.assertEqual(r.state.name, 'new') self.assertEqual(r.state.when, '2010-12-30T02:11:22') self.assertEqual(r.state.who, 'olduser') self.assertEqual(r.state.comment, '') self.assertEqual(r.creator, 'olduser') exp = """\ """ self.assertXMLEqual(exp, r.to_str()) def test_get_actions(self): """test get_actions method""" xml = self._get_fixture('test_request_list_view1.xml') r = osc.core.Request() r.read(xml_fromstring(xml)) sr_actions = r.get_actions('submit') self.assertTrue(len(sr_actions) == 2) for i in sr_actions: self.assertEqual(i.type, 'submit') self.assertTrue(len(r.get_actions('submit', 'delete', 'change_devel')) == 5) self.assertTrue(len(r.get_actions()) == 8) if __name__ == '__main__': unittest.main() osc-1.12.1/tests/test_results.py000066400000000000000000000044031475337502500166500ustar00rootroot00000000000000import os import sys import unittest import osc.commandline from .common import GET, OscTestCase def suite(): return unittest.defaultTestLoader.loadTestsFromTestCase(TestResults) class TestResults(OscTestCase): def setUp(self): super().setUp(copytree=False) def _get_fixtures_name(self): return 'results_fixtures' def _get_fixtures_dir(self): return os.path.join(os.path.dirname(__file__), self._get_fixtures_name()) def _run_osc(self, *args): """Runs osc, returning captured STDOUT as a string.""" cli = osc.commandline.Osc() argv = ['osc', '--no-keyring'] argv.extend(args) cli.main(argv=argv) return sys.stdout.getvalue() @GET('http://localhost/build/testproject/_result', file='result.xml') def testPrjresults(self): out = self._run_osc('prjresults', '--xml', 'testproject') self.assertEqualMultiline(out, self._get_fixture('result.xml') + '\n') @GET('http://localhost/build/testproject/_result', file='result-dirty.xml') @GET('http://localhost/build/testproject/_result?oldstate=c57e2ee592dbbf26ebf19cc4f1bc1e83', file='result.xml') def testPrjresultsWatch(self): out = self._run_osc('prjresults', '--watch', '--xml', 'testproject') self.assertEqualMultiline(out, self._get_fixture('result-dirty.xml') + '\n' + self._get_fixture('result.xml') + '\n') @GET('http://localhost/build/testproject/_result?package=python-MarkupSafe&multibuild=1&locallink=1', file='result.xml') def testResults(self): out = self._run_osc('results', '--xml', 'testproject', 'python-MarkupSafe') self.assertEqualMultiline(out, self._get_fixture('result.xml')) 
@GET('http://localhost/build/testproject/_result?package=python-MarkupSafe&multibuild=1&locallink=1', file='result-dirty.xml') @GET('http://localhost/build/testproject/_result?package=python-MarkupSafe&oldstate=c57e2ee592dbbf26ebf19cc4f1bc1e83&multibuild=1&locallink=1', file='result.xml') def testResultsWatch(self): out = self._run_osc('results', '--watch', '--xml', 'testproject', 'python-MarkupSafe') self.assertEqualMultiline(out, self._get_fixture('result-dirty.xml') + self._get_fixture('result.xml')) if __name__ == '__main__': unittest.main() osc-1.12.1/tests/test_revertfiles.py000066400000000000000000000063301475337502500175020ustar00rootroot00000000000000import os import unittest import osc.core import osc.oscerr from .common import OscTestCase FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'revertfile_fixtures') def suite(): return unittest.defaultTestLoader.loadTestsFromTestCase(TestRevertFiles) class TestRevertFiles(OscTestCase): def _get_fixtures_dir(self): return FIXTURES_DIR def testRevertUnchanged(self): """revert an unchanged file (state == ' ')""" self._change_to_pkg('simple') p = osc.core.Package('.') self.assertRaises(osc.oscerr.OscIOError, p.revert, 'toadd2') self._check_status(p, 'toadd2', '?') def testRevertModified(self): """revert a modified file""" self._change_to_pkg('simple') p = osc.core.Package('.') p.revert('nochange') self.__check_file('nochange') self._check_status(p, 'nochange', ' ') def testRevertAdded(self): """revert an added file""" self._change_to_pkg('simple') p = osc.core.Package('.') p.revert('toadd1') self.assertTrue(os.path.exists('toadd1')) self._check_addlist('replaced\naddedmissing\n') self._check_status(p, 'toadd1', '?') def testRevertDeleted(self): """revert a deleted file""" self._change_to_pkg('simple') p = osc.core.Package('.') p.revert('somefile') self.__check_file('somefile') self._check_deletelist('deleted\n') self._check_status(p, 'somefile', ' ') def testRevertMissing(self): """revert a missing (state == '!') file""" self._change_to_pkg('simple') p = osc.core.Package('.') p.revert('missing') self.__check_file('missing') self._check_status(p, 'missing', ' ') def testRevertMissingAdded(self): """revert a missing file which was added to the wc""" self._change_to_pkg('simple') p = osc.core.Package('.') p.revert('addedmissing') self._check_addlist('toadd1\nreplaced\n') self.assertRaises(osc.oscerr.OscIOError, p.status, 'addedmissing') def testRevertReplaced(self): """revert a replaced (state == 'R') file""" self._change_to_pkg('simple') p = osc.core.Package('.') p.revert('replaced') self.__check_file('replaced') self._check_addlist('toadd1\naddedmissing\n') self._check_status(p, 'replaced', ' ') def testRevertConflict(self): """revert a file which is in the conflict state""" self._change_to_pkg('simple') p = osc.core.Package('.') p.revert('foo') self.__check_file('foo') self.assertFalse(os.path.exists(os.path.join('.osc', '_in_conflict'))) self._check_status(p, 'foo', ' ') def testRevertSkipped(self): """revert a skipped file""" self._change_to_pkg('simple') p = osc.core.Package('.') self.assertRaises(osc.oscerr.OscIOError, p.revert, 'skipped') def __check_file(self, fname): storefile = os.path.join('.osc', 'sources', fname) self.assertTrue(os.path.exists(fname)) self.assertTrue(os.path.exists(storefile)) self.assertFilesEqual(fname, storefile) if __name__ == '__main__': unittest.main() osc-1.12.1/tests/test_scmsync_obsinfo.py000066400000000000000000000021141475337502500203420ustar00rootroot00000000000000import unittest from 
osc.obs_api.scmsync_obsinfo import ScmsyncObsinfo class TestScmsyncObsinfo(unittest.TestCase): def test_empty(self): self.assertRaises(TypeError, ScmsyncObsinfo.from_string, "") def test_mandatory(self): data = """ mtime: 123 commit: abcdef """ info = ScmsyncObsinfo.from_string(data) self.assertEqual(info.mtime, 123) self.assertEqual(info.commit, "abcdef") self.assertEqual(info.url, None) def test_all(self): data = """ mtime: 123 commit: abcdef url: https://example.com revision: 1 subdir: dirname projectscmsync: project """ info = ScmsyncObsinfo.from_string(data) self.assertEqual(info.mtime, 123) self.assertEqual(info.commit, "abcdef") self.assertEqual(info.url, "https://example.com") self.assertEqual(info.revision, "1") self.assertEqual(info.subdir, "dirname") self.assertEqual(info.projectscmsync, "project") if __name__ == "__main__": unittest.main() osc-1.12.1/tests/test_setlinkrev.py000066400000000000000000000114331475337502500173360ustar00rootroot00000000000000import os import unittest import osc.core import osc.oscerr from .common import GET, PUT, OscTestCase FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'setlinkrev_fixtures') def suite(): return unittest.defaultTestLoader.loadTestsFromTestCase(TestSetLinkRev) class TestSetLinkRev(OscTestCase): def setUp(self): super().setUp(copytree=False) def _get_fixtures_dir(self): return FIXTURES_DIR @GET('http://localhost/source/osctest/simple/_link', file='simple_link') @GET('http://localhost/source/srcprj/srcpkg?rev=latest', file='simple_filesremote') @PUT('http://localhost/source/osctest/simple/_link?comment=Set+link+revision+to+42', exp='', text='dummytext') def test_simple1(self): """a simple set_link_rev call without revision""" osc.core.set_link_rev('http://localhost', 'osctest', 'simple') @GET('http://localhost/source/osctest/simple/_link', file='simple_link') @PUT('http://localhost/source/osctest/simple/_link?comment=Set+link+revision+to+42', exp='', text='dummytext') def test_simple2(self): """a simple set_link_rev call with revision""" osc.core.set_link_rev('http://localhost', 'osctest', 'simple', '42') @GET('http://localhost/source/osctest/simple/_link', file='noproject_link') @GET('http://localhost/source/osctest/srcpkg?rev=latest&expand=1', file='expandedsrc_filesremote') @PUT('http://localhost/source/osctest/simple/_link?comment=Set+link+revision+to+eeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee', exp='', text='dummytext') def test_expandedsrc(self): """expand src package""" osc.core.set_link_rev('http://localhost', 'osctest', 'simple', expand=True) @GET('http://localhost/source/osctest/simple/_link', file='link_with_rev') @GET('http://localhost/source/srcprj/srcpkg?rev=latest', file='simple_filesremote') @PUT('http://localhost/source/osctest/simple/_link?comment=Set+link+revision+to+42', exp='', text='dummytext') def test_existingrev(self): """link already has a rev attribute, update it to current version""" # we could also avoid the superfluous PUT osc.core.set_link_rev('http://localhost', 'osctest', 'simple') @GET('http://localhost/source/osctest/simple/_link', file='link_with_rev') @GET('http://localhost/source/srcprj/srcpkg?rev=latest&expand=1', file='expandedsrc_filesremote') @PUT('http://localhost/source/osctest/simple/_link?comment=Set+link+revision+to+eeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee', exp='', text='dummytext') def test_expandexistingrev(self): """link already has a rev attribute, update it to current version""" osc.core.set_link_rev('http://localhost', 'osctest', 'simple', expand=True) 
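# The set_link_rev() variants exercised by this class, collected as a sketch
# (apiurl, project and package are the fixture values used in the tests):
#
#   osc.core.set_link_rev('http://localhost', 'osctest', 'simple')                 # pin to the current source rev
#   osc.core.set_link_rev('http://localhost', 'osctest', 'simple', '42')           # pin to an explicit rev
#   osc.core.set_link_rev('http://localhost', 'osctest', 'simple', expand=True)    # pin to the expanded srcmd5
#   osc.core.set_link_rev('http://localhost', 'osctest', 'simple', revision=None)  # drop the rev attribute again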
@GET('http://localhost/source/osctest/simple/_link', file='simple_link') @GET('http://localhost/source/srcprj/srcpkg?rev=latest&expand=1', text='conflict in file merge', code=400) def test_linkerror(self): """link is broken""" from urllib.error import HTTPError # the backend returns status 400 if we try to expand a broken _link self.assertRaises(HTTPError, osc.core.set_link_rev, 'http://localhost', 'osctest', 'simple', expand=True) @GET('http://localhost/source/osctest/simple/_link', file='rev_link') @PUT('http://localhost/source/osctest/simple/_link?comment=Unset+link+revision', exp='', text='dummytext') def test_deleterev(self): """delete rev attribute from link xml""" osc.core.set_link_rev('http://localhost', 'osctest', 'simple', revision=None) @GET('http://localhost/source/osctest/simple/_link', file='md5_rev_link') @PUT('http://localhost/source/osctest/simple/_link?comment=Unset+link+revision', exp='', text='dummytext') def test_deleterev_md5(self): """delete rev and vrev attribute from link xml""" osc.core.set_link_rev('http://localhost', 'osctest', 'simple', revision=None) @GET('http://localhost/source/osctest/simple/_link', file='simple_link') @PUT('http://localhost/source/osctest/simple/_link?comment=Unset+link+revision', exp='', text='dummytext') def test_deleterevnonexistent(self): """delete non existent rev attribute from link xml""" osc.core.set_link_rev('http://localhost', 'osctest', 'simple', revision=None) if __name__ == '__main__': unittest.main() osc-1.12.1/tests/test_store.py000066400000000000000000000222371475337502500163100ustar00rootroot00000000000000import os import sys import tempfile import unittest import osc.core as osc_core from osc.store import Store class TestStore(unittest.TestCase): def setUp(self): self.tmpdir = tempfile.mkdtemp(prefix='osc_test') self.store = Store(self.tmpdir, check=False) self.store.write_string("_osclib_version", Store.STORE_VERSION) self.store.apiurl = "http://localhost" self.store.is_package = True self.store.project = "project name" self.store.package = "package name" def tearDown(self): try: shutil.rmtree(self.tmpdir) except: pass def fileEquals(self, fn, expected_value): path = os.path.join(self.tmpdir, ".osc", fn) with open(path) as f: actual_value = f.read() self.assertEqual(actual_value, expected_value, f"File: {fn}") def test_read_write_file(self): self.store.write_file("_file", "\n\nline1\nline2") self.fileEquals("_file", "\n\nline1\nline2") self.assertEqual(self.store.read_file("_file"), "\n\nline1\nline2") # writing None removes the file self.store.write_file("_file", None) self.assertFalse(self.store.exists("_file")) self.assertRaises(TypeError, self.store.write_string, "_file", 123) self.assertRaises(TypeError, self.store.write_string, "_file", ["123"]) def test_read_write_int(self): self.store.write_int("_int", 123) self.fileEquals("_int", "123\n") self.assertEqual(self.store.read_int("_int"), 123) # writing None removes the file self.store.write_int("_int", None) self.assertFalse(self.store.exists("_int")) self.assertRaises(TypeError, self.store.write_int, "_int", "123") self.assertRaises(TypeError, self.store.write_int, "_int", b"123") self.assertRaises(TypeError, self.store.write_int, "_int", ["123"]) def test_read_write_list(self): self.store.write_list("_list", ["one", "two", "three"]) self.fileEquals("_list", "one\ntwo\nthree\n") self.assertEqual(self.store.read_list("_list"), ["one", "two", "three"]) # writing None removes the file self.store.write_list("_list", None) self.assertFalse(self.store.exists("_list")) 
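# Store read/write helpers as used in this class, sketched with the temporary
# directory created in setUp() (the "_list" file name matches the test above):
#
#   store = Store(tmpdir, check=False)
#   store.write_list("_list", ["one", "two", "three"])   # one entry per line
#   store.read_list("_list")                             # -> ["one", "two", "three"]
#   store.write_list("_list", None)                      # passing None removes the file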
self.assertRaises(TypeError, self.store.write_list, "_list", "123") self.assertRaises(TypeError, self.store.write_list, "_list", b"123") self.assertRaises(TypeError, self.store.write_list, "_list", 123) def test_read_write_string(self): self.store.write_string("_string", "string") self.fileEquals("_string", "string\n") self.assertEqual(self.store.read_string("_string"), "string") self.store.write_string("_bytes", b"bytes") self.fileEquals("_bytes", "bytes\n") self.assertEqual(self.store.read_string("_bytes"), "bytes") # writing None removes the file self.store.write_string("_string", None) self.assertFalse(self.store.exists("_string")) self.assertRaises(TypeError, self.store.write_string, "_string", 123) self.assertRaises(TypeError, self.store.write_string, "_string", ["123"]) def test_contains(self): self.assertTrue("_project" in self.store) self.assertTrue("_package" in self.store) self.assertFalse("_foo" in self.store) def test_iter(self): self.assertEqual(len(list(self.store)), 4) for fn in self.store: self.assertIn(fn, ["_osclib_version", "_apiurl", "_project", "_package"]) def test_apiurl(self): self.store.apiurl = "https://example.com" self.fileEquals("_apiurl", "https://example.com\n") store2 = Store(self.tmpdir) self.assertEqual(store2.apiurl, "https://example.com") def test_apiurl_no_trailing_slash(self): self.store.apiurl = "https://example.com/" self.fileEquals("_apiurl", "https://example.com\n") self.store.write_string("_apiurl", "https://example.com/") self.fileEquals("_apiurl", "https://example.com/\n") self.assertEqual(self.store.apiurl, "https://example.com") def test_package(self): self.fileEquals("_package", "package name\n") store2 = Store(self.tmpdir) self.assertEqual(store2.package, "package name") def test_project(self): self.fileEquals("_project", "project name\n") store2 = Store(self.tmpdir) self.assertEqual(store2.project, "project name") def test_scmurl(self): self.store.scmurl = "https://example.com/project.git" self.fileEquals("_scm", "https://example.com/project.git\n") store2 = Store(self.tmpdir) self.assertEqual(store2.scmurl, "https://example.com/project.git") def test_size_limit(self): self.store.size_limit = 123 self.fileEquals("_size_limit", "123\n") store2 = Store(self.tmpdir) self.assertEqual(store2.size_limit, 123) def test_to_be_added(self): self.store.to_be_added = ["foo", "bar", "baz"] self.fileEquals("_to_be_added", "foo\nbar\nbaz\n") store2 = Store(self.tmpdir) self.assertEqual(store2.to_be_added, ["foo", "bar", "baz"]) def test_to_be_deleted(self): self.store.to_be_deleted = ["foo", "bar", "baz"] self.fileEquals("_to_be_deleted", "foo\nbar\nbaz\n") store2 = Store(self.tmpdir) self.assertEqual(store2.to_be_deleted, ["foo", "bar", "baz"]) def test_in_conflict(self): self.store.in_conflict = ["foo", "bar", "baz"] self.fileEquals("_in_conflict", "foo\nbar\nbaz\n") store2 = Store(self.tmpdir) self.assertEqual(store2.in_conflict, ["foo", "bar", "baz"]) def test_osclib_version(self): # no setter, users are not supposed to set the version self.assertRaises(AttributeError, setattr, self.store, "osclib_version", "123") self.store.write_string("_osclib_version", "123") self.fileEquals("_osclib_version", "123\n") store2 = Store(self.tmpdir, check=False) self.assertEqual(store2.osclib_version, "123") def test_files(self): files = [ osc_core.File(name="foo", md5="aabbcc", size=1, mtime=2), osc_core.File(name="bar", md5="ddeeff", size=3, mtime=4, skipped=True), ] self.store.files = files expected = """ """.lstrip() if sys.version_info[:2] <= (3, 7): # 
            # ElementTree doesn't preserve attribute order on py <= 3.7; https://bugs.python.org/issue34160
            expected = """ """.lstrip()
        self.fileEquals("_files", expected)

        store2 = Store(self.tmpdir)
        files2 = store2.files
        # files got ordered
        self.assertTrue(files2[0] == files[1])
        self.assertTrue(files2[1] == files[0])

    def test_last_buildroot(self):
        self.assertEqual(self.store.last_buildroot, None)
        self.store.last_buildroot = "repo", "arch", "vm_type"
        self.fileEquals("_last_buildroot", "repo\narch\nvm_type\n")

        self.assertRaises(ValueError, setattr, self.store, "last_buildroot", ["one"])
        self.assertRaises(ValueError, setattr, self.store, "last_buildroot", ["one", "two"])
        self.assertRaises(ValueError, setattr, self.store, "last_buildroot", ["one", "two", "three", "four"])

        store2 = Store(self.tmpdir)
        self.assertEqual(store2.last_buildroot, ["repo", "arch", "vm_type"])

        self.store.last_buildroot = "repo", "arch", None
        self.fileEquals("_last_buildroot", "repo\narch\n\n")

        store2 = Store(self.tmpdir)
        self.assertEqual(store2.last_buildroot, ["repo", "arch", None])

    def test_meta_node(self):
        self.store.write_string(
            "_meta",
            """ title desc name """,
        )
        node = self.store._meta_node
        self.assertNotEqual(node, None)

        # try to read the _meta via a package class
        from osc._private import LocalPackage

        self.store.files = []
        pkg = LocalPackage(self.tmpdir)
        self.assertEqual(pkg.get_meta_value("releasename"), "name")

    def test_migrate_10_20_sources_file(self):
        self.store = Store(self.tmpdir, check=False)
        self.store.write_string("_osclib_version", "1.0")
        self.store.apiurl = "http://localhost"
        self.store.is_package = True
        self.store.project = "project name"
        self.store.package = "package name"

        self.store.write_string("sources", "")
        self.store.files = [
            osc_core.File(name="sources", md5="aabbcc", size=0, mtime=0),
        ]

        Store(self.tmpdir, check=True)
        self.assertTrue(os.path.exists(os.path.join(self.tmpdir, ".osc", "sources", "sources")))


if __name__ == "__main__":
    unittest.main()

osc-1.12.1/tests/test_update.py

import os
import sys
import unittest

import osc.core
import osc.oscerr

from .common import GET, OscTestCase

FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'update_fixtures')


def suite():
    return unittest.defaultTestLoader.loadTestsFromTestCase(TestUpdate)


class TestUpdate(OscTestCase):
    def _get_fixtures_dir(self):
        return FIXTURES_DIR

    @GET('http://localhost/source/osctest/simple?rev=latest', file='testUpdateNoChanges_files')
    @GET('http://localhost/source/osctest/simple/_meta', file='meta.xml')
    def testUpdateNoChanges(self):
        """update without any changes (the wc is the most recent version)"""
        self._change_to_pkg('simple')
        osc.core.Package('.').update()
        self.assertEqual(sys.stdout.getvalue(), 'At revision 1.\n')

    @GET('http://localhost/source/osctest/simple?rev=2', file='testUpdateNewFile_files')
    @GET('http://localhost/source/osctest/simple/upstream_added?rev=2', file='testUpdateNewFile_upstream_added')
    @GET('http://localhost/source/osctest/simple/_meta', file='meta.xml')
    def testUpdateNewFile(self):
        """a new file was added to the remote package"""
        self._change_to_pkg('simple')
        osc.core.Package('.').update(rev=2)
        exp = 'A upstream_added\nAt revision 2.\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self._check_digests('testUpdateNewFile_files')

    @GET('http://localhost/source/osctest/simple?rev=2', file='testUpdateNewFileLocalExists_files')
    def testUpdateNewFileLocalExists(self):
        """
        a new file was added to the remote package but the same
        (unversioned) file exists locally
        """
        self._change_to_pkg('simple')
        self.assertRaises(osc.oscerr.PackageFileConflict, osc.core.Package('.').update, rev=2)

    @GET('http://localhost/source/osctest/simple?rev=2', file='testUpdateDeletedFile_files')
    @GET('http://localhost/source/osctest/simple/_meta', file='meta.xml')
    def testUpdateDeletedFile(self):
        """a file was deleted from the remote package"""
        self._change_to_pkg('simple')
        osc.core.Package('.').update(rev=2)
        exp = 'D foo\nAt revision 2.\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self._check_digests('testUpdateDeletedFile_files')
        self.assertFalse(os.path.exists('foo'))
        self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'foo')))

    @GET('http://localhost/source/osctest/simple?rev=2', file='testUpdateUpstreamModifiedFile_files')
    @GET('http://localhost/source/osctest/simple/foo?rev=2', file='testUpdateUpstreamModifiedFile_foo')
    @GET('http://localhost/source/osctest/simple/_meta', file='meta.xml')
    def testUpdateUpstreamModifiedFile(self):
        """a file was modified in the remote package (local file isn't modified)"""
        self._change_to_pkg('simple')
        osc.core.Package('.').update(rev=2)
        exp = 'U foo\nAt revision 2.\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self._check_digests('testUpdateUpstreamModifiedFile_files')

    @GET('http://localhost/source/osctest/conflict?rev=2', file='testUpdateConflict_files')
    @GET('http://localhost/source/osctest/conflict/merge?rev=2', file='testUpdateConflict_merge')
    @GET('http://localhost/source/osctest/conflict/_meta', file='meta.xml')
    def testUpdateConflict(self):
        """
        a file was modified in the remote package
        (local file is also modified and a merge isn't possible)
        """
        self._change_to_pkg('conflict')
        osc.core.Package('.').update(rev=2)
        exp = 'C merge\nAt revision 2.\n'
        self._check_digests('testUpdateConflict_files')
        self.assertEqual(sys.stdout.getvalue(), exp)
        self._check_conflictlist('merge\n')

    @GET('http://localhost/source/osctest/already_in_conflict?rev=2', file='testUpdateAlreadyInConflict_files')
    @GET('http://localhost/source/osctest/already_in_conflict/merge?rev=2', file='testUpdateAlreadyInConflict_merge')
    @GET('http://localhost/source/osctest/already_in_conflict/_meta', file='meta.xml')
    def testUpdateAlreadyInConflict(self):
        """
        a file was modified in the remote package
        (the local file is already in conflict)
        """
        self._change_to_pkg('already_in_conflict')
        osc.core.Package('.').update(rev=2)
        exp = 'skipping \'merge\' (this is due to conflicts)\nAt revision 2.\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self._check_conflictlist('merge\n')
        self._check_digests('testUpdateAlreadyInConflict_files')

    @GET('http://localhost/source/osctest/deleted?rev=2', file='testUpdateLocalDeletions_files')
    @GET('http://localhost/source/osctest/deleted/foo?rev=2', file='testUpdateLocalDeletions_foo')
    @GET('http://localhost/source/osctest/deleted/merge?rev=2', file='testUpdateLocalDeletions_merge')
    @GET('http://localhost/source/osctest/deleted/_meta', file='meta.xml')
    def testUpdateLocalDeletions(self):
        """
        the files 'foo' and 'merge' were modified in the remote package
        and marked for deletion in the local wc.
        Additionally the file 'merge' was modified in the wc before deletion
        so the local file still exists (and a merge with the remote file is
        not possible)
        """
        self._change_to_pkg('deleted')
        osc.core.Package('.').update(rev=2)
        exp = 'U foo\nC merge\nAt revision 2.\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self._check_deletelist('foo\n')
        self._check_conflictlist('merge\n')
        self.assertFilesEqual('foo', os.path.join('.osc', 'sources', 'foo'))
        self._check_digests('testUpdateLocalDeletions_files')

    @GET('http://localhost/source/osctest/restore?rev=latest', file='testUpdateRestore_files')
    @GET('http://localhost/source/osctest/restore/foo?rev=1', file='testUpdateRestore_foo')
    @GET('http://localhost/source/osctest/restore/_meta', file='meta.xml')
    def testUpdateRestore(self):
        """local file 'foo' was deleted with a non osc command and will be restored"""
        self._change_to_pkg('restore')
        osc.core.Package('.').update()
        exp = 'Restored \'foo\'\nAt revision 1.\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self._check_digests('testUpdateRestore_files')

    @GET('http://localhost/source/osctest/limitsize?rev=latest', file='testUpdateLimitSizeNoChange_filesremote')
    @GET('http://localhost/source/osctest/limitsize/_meta', file='meta.xml')
    def testUpdateLimitSizeNoChange(self):
        """
        a new file was added to the remote package but isn't checked out
        because of the size constraint
        """
        self._change_to_pkg('limitsize')
        osc.core.Package('.').update(size_limit=50)
        exp = 'D bigfile\nAt revision 2.\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'bigfile')))
        self.assertFalse(os.path.exists('bigfile'))
        self._check_digests('testUpdateLimitSizeNoChange_files', 'bigfile')

    @GET('http://localhost/source/osctest/limitsize_local?rev=latest', file='testUpdateLocalLimitSizeNoChange_filesremote')
    @GET('http://localhost/source/osctest/limitsize_local/_meta', file='meta.xml')
    def testUpdateLocalLimitSizeNoChange(self):
        """
        a new file was added to the remote package but isn't checked out
        because of the local size constraint
        """
        self._change_to_pkg('limitsize_local')
        p = osc.core.Package('.')
        p.update()
        exp = 'D bigfile\nD merge\nAt revision 2.\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'bigfile')))
        self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'merge')))
        self.assertFalse(os.path.exists('bigfile'))
        self._check_digests('testUpdateLocalLimitSizeNoChange_files', 'bigfile', 'merge')
        self._check_status(p, 'bigfile', 'S')
        self._check_status(p, 'merge', 'S')

    @GET('http://localhost/source/osctest/limitsize?rev=latest', file='testUpdateLimitSizeAddDelete_filesremote')
    @GET('http://localhost/source/osctest/limitsize/exists?rev=2', file='testUpdateLimitSizeAddDelete_exists')
    @GET('http://localhost/source/osctest/limitsize/_meta', file='meta.xml')
    def testUpdateLimitSizeAddDelete(self):
        """
        a new file (exists) was added to the remote package with
        size < size_limit and one file (nochange) was deleted from the
        remote package (local file 'nochange' is modified). Additionally
        files which didn't change are removed from the local wc due to the
        size constraint.
""" self._change_to_pkg('limitsize') osc.core.Package('.').update(size_limit=10) exp = 'A exists\nD bigfile\nD foo\nD merge\nD nochange\nAt revision 2.\n' self.assertEqual(sys.stdout.getvalue(), exp) self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'bigfile'))) self.assertFalse(os.path.exists('bigfile')) self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'foo'))) self.assertFalse(os.path.exists('foo')) self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'merge'))) self.assertFalse(os.path.exists('merge')) # exists because local version is modified self.assertTrue(os.path.exists('nochange')) self._check_digests('testUpdateLimitSizeAddDelete_files', 'bigfile', 'foo', 'merge', 'nochange') @GET('http://localhost/source/osctest/services?rev=latest', file='testUpdateServiceFilesAddDelete_filesremote') @GET('http://localhost/source/osctest/services/bigfile?rev=2', file='testUpdateServiceFilesAddDelete_bigfile') @GET('http://localhost/source/osctest/services/_service:bar?rev=2', file='testUpdateServiceFilesAddDelete__service:bar') @GET('http://localhost/source/osctest/services/_service:foo?rev=2', file='testUpdateServiceFilesAddDelete__service:foo') @GET('http://localhost/source/osctest/services/_meta', file='meta.xml') def testUpdateAddDeleteServiceFiles(self): """update package with _service:* files""" self._change_to_pkg('services') osc.core.Package('.').update(service_files=True) exp = 'A bigfile\nD _service:exists\nA _service:bar\nA _service:foo\nAt revision 2.\n' self.assertEqual(sys.stdout.getvalue(), exp) self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', '_service:bar'))) self.assertFileContentEqual('_service:bar', 'another service\n') self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', '_service:foo'))) self.assertFileContentEqual('_service:foo', 'small\n') self.assertTrue(os.path.exists('_service:exists')) self._check_digests('testUpdateServiceFilesAddDelete_files', '_service:foo', '_service:bar') @GET('http://localhost/source/osctest/services?rev=latest', file='testUpdateServiceFilesAddDelete_filesremote') @GET('http://localhost/source/osctest/services/bigfile?rev=2', file='testUpdateServiceFilesAddDelete_bigfile') @GET('http://localhost/source/osctest/services/_meta', file='meta.xml') def testUpdateDisableAddDeleteServiceFiles(self): """update package with _service:* files (with service_files=False)""" self._change_to_pkg('services') osc.core.Package('.').update() exp = 'A bigfile\nD _service:exists\nAt revision 2.\n' self.assertEqual(sys.stdout.getvalue(), exp) self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', '_service:bar'))) self.assertFalse(os.path.exists('_service:bar')) self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', '_service:foo'))) self.assertFalse(os.path.exists('_service:foo')) self.assertTrue(os.path.exists('_service:exists')) self._check_digests('testUpdateServiceFilesAddDelete_files', '_service:foo', '_service:bar') @GET('http://localhost/source/osctest/metamode?meta=1&rev=latest', file='testUpdateMetaMode_filesremote') @GET('http://localhost/source/osctest/metamode/_meta?meta=1&rev=1', file='testUpdateMetaMode__meta') def testUpdateMetaMode(self): """update package with metamode enabled""" self._change_to_pkg('metamode') p = osc.core.Package('.') p.update() exp = 'A _meta\nD foo\nD merge\nD nochange\nAt revision 1.\n' self.assertEqual(sys.stdout.getvalue(), exp) self.assertFalse(os.path.exists('foo')) self.assertFalse(os.path.exists('merge')) 
        self.assertFalse(os.path.exists('nochange'))
        self._check_digests('testUpdateMetaMode_filesremote')
        self._check_status(p, '_meta', ' ')

    @GET('http://localhost/source/osctest/new?rev=latest', file='testUpdateNew_filesremote')
    @GET('http://localhost/source/osctest/new/_meta', file='meta.xml')
    def testUpdateNew(self):
        """update a new (empty) package. The package has no revision."""
        self._change_to_pkg('new')
        p = osc.core.Package('.')
        p.update()
        exp = 'At revision None.\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self._check_digests('testUpdateNew_filesremote')

    # tests to recover from an aborted/broken update
    @GET('http://localhost/source/osctest/simple/foo?rev=2', file='testUpdateResume_foo')
    @GET('http://localhost/source/osctest/simple/merge?rev=2', file='testUpdateResume_merge')
    @GET('http://localhost/source/osctest/simple/_meta', file='meta.xml')
    @GET('http://localhost/source/osctest/simple?rev=2', file='testUpdateResume_files')
    @GET('http://localhost/source/osctest/simple/_meta', file='meta.xml')
    def testUpdateResume(self):
        """resume an aborted update"""
        self._change_to_pkg('resume')
        osc.core.Package('.').update(rev=2)
        exp = 'resuming broken update...\nU foo\nU merge\nAt revision 2.\nAt revision 2.\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self.assertFalse(os.path.exists(os.path.join('.osc', '_in_update')))
        self._check_digests('testUpdateResume_files')

    @GET('http://localhost/source/osctest/simple/foo?rev=1', file='testUpdateResumeDeletedFile_foo')
    @GET('http://localhost/source/osctest/simple/merge?rev=1', file='testUpdateResumeDeletedFile_merge')
    @GET('http://localhost/source/osctest/simple/_meta', file='meta.xml')
    @GET('http://localhost/source/osctest/simple?rev=1', file='testUpdateResumeDeletedFile_files')
    @GET('http://localhost/source/osctest/simple/_meta', file='meta.xml')
    def testUpdateResumeDeletedFile(self):
        """
        resume an aborted update (the file 'added' was already deleted in the
        first update run).
        It's marked as deleted again (this is due to an expected issue with
        the update code)
        """
        self._change_to_pkg('resume_deleted')
        osc.core.Package('.').update(rev=1)
        exp = 'resuming broken update...\nD added\nU foo\nU merge\nAt revision 1.\nAt revision 1.\n'
        self.assertEqual(sys.stdout.getvalue(), exp)
        self.assertFalse(os.path.exists(os.path.join('.osc', '_in_update')))
        self.assertFalse(os.path.exists('added'))
        self.assertFalse(os.path.exists(os.path.join('.osc', 'sources', 'added')))
        self._check_digests('testUpdateResumeDeletedFile_files')


if __name__ == '__main__':
    unittest.main()

osc-1.12.1/tests/test_util_ar.py

import os
import shutil
import tempfile
import unittest

from osc.util.ar import Ar
from osc.util.ar import ArError

FIXTURES_DIR = os.path.join(os.path.dirname(__file__), "fixtures")


class TestAr(unittest.TestCase):
    def setUp(self):
        self.tmpdir = tempfile.mkdtemp(prefix="osc_test_")
        try:
            self.old_cwd = os.getcwd()
        except FileNotFoundError:
            self.old_cwd = os.path.expanduser("~")
        os.chdir(self.tmpdir)
        self.archive = os.path.join(FIXTURES_DIR, "archive.ar")
        self.ar = Ar(self.archive)
        self.ar.read()

    def tearDown(self):
        os.chdir(self.old_cwd)
        shutil.rmtree(self.tmpdir)

    def test_file_list(self):
        actual = [i.name for i in self.ar]
        expected = [
            # absolute path
            b"/tmp/foo",
            # this is a filename, not a long filename reference
            b"/123",
            b"very-long-long-long-long-name",
            b"very-long-long-long-long-name2",
            # long file name with a newline
            b"very-long-name\n-with-newline",
            # short file name with a newline
            b"a\nb",
            b"dir/file",
        ]
        self.assertEqual(actual, expected)

    def test_get_file(self):
        f = self.ar.get_file(b"/tmp/foo")
        self.assertIsNotNone(f)

        f = self.ar.get_file("/tmp/foo")
        self.assertIsNotNone(f)

        f = self.ar.get_file("does-not-exist")
        self.assertIsNone(f)

    def test_saveTo(self):
        f = self.ar.get_file("a\nb")
        path = f.saveTo(self.tmpdir)
        # check that we've got the expected path
        self.assertEqual(path, os.path.join(self.tmpdir, "a\nb"))
        # ... and that the contents also match
        with open(path, "r", encoding="utf-8") as f:
            self.assertEqual(f.read(), "newline\n")

    def test_saveTo_subdir(self):
        f = self.ar.get_file("dir/file")
        path = f.saveTo(self.tmpdir)
        # check that we've got the expected path
        self.assertEqual(path, os.path.join(self.tmpdir, "dir/file"))
        # ... and that the contents also match
        with open(path, "r", encoding="utf-8") as f:
            self.assertEqual(f.read(), "file-in-a-dir\n")

    def test_saveTo_abspath(self):
        f = self.ar.get_file("/tmp/foo")
        assert f is not None
        # this is supposed to throw an error, extracting files with absolute paths might overwrite system files
        self.assertRaises(ArError, f.saveTo, self.tmpdir)

    def test_no_exthdr(self):
        self.archive = os.path.join(FIXTURES_DIR, "archive-no-ext_fnhdr.ar")
        self.ar = Ar(self.archive)
        self.ar.read()
        self.test_saveTo_subdir()


if __name__ == "__main__":
    unittest.main()

osc-1.12.1/tests/test_util_cpio.py

import os
import shutil
import tempfile
import unittest

from osc.util.cpio import CpioRead
from osc.util.cpio import CpioError

FIXTURES_DIR = os.path.join(os.path.dirname(__file__), "fixtures")


class TestCpio(unittest.TestCase):
    def setUp(self):
        self.tmpdir = tempfile.mkdtemp(prefix="osc_test_")
        try:
            self.old_cwd = os.getcwd()
        except FileNotFoundError:
            self.old_cwd = os.path.expanduser("~")
        os.chdir(self.tmpdir)
        self.archive = os.path.join(FIXTURES_DIR, "archive.cpio")
        self.cpio = CpioRead(self.archive)
        self.cpio.read()

    def tearDown(self):
        os.chdir(self.old_cwd)
        shutil.rmtree(self.tmpdir)

    def test_file_list(self):
        actual = [i.filename for i in self.cpio]
        expected = [
            # absolute path
            b"/tmp/foo",
            # this is a filename, not a long filename reference
            b"/123",
            b"very-long-long-long-long-name",
            b"very-long-long-long-long-name2",
            # long file name with a newline
            b"very-long-name\n-with-newline",
            # short file name with a newline
            b"a\nb",
            b"dir/file",
        ]
        self.assertEqual(actual, expected)

    def test_copyin_file(self):
        path = self.cpio.copyin_file("a\nb", dest=self.tmpdir)
        # check that we've got the expected path
        self.assertEqual(path, os.path.join(self.tmpdir, "a\nb"))
        # ... and that the contents also match
        with open(path, "r", encoding="utf-8") as f:
            self.assertEqual(f.read(), "newline\n")

    def test_copyin_file_abspath(self):
        self.assertRaises(CpioError, self.cpio.copyin_file, "/tmp/foo")

    def test_copyin_file_subdir(self):
        path = self.cpio.copyin_file("dir/file", dest=self.tmpdir)
        # check that we've got the expected path
        self.assertEqual(path, os.path.join(self.tmpdir, "dir/file"))
        # ... and that the contents also match
        with open(path, "r", encoding="utf-8") as f:
            self.assertEqual(f.read(), "file-in-a-dir\n")


if __name__ == "__main__":
    unittest.main()

osc-1.12.1/tests/test_vc.py

import os
import unittest

import osc.conf
from osc.core import vc_export_env

from .common import GET
from .common import patch


class TestVC(unittest.TestCase):
    def setUp(self):
        osc.conf.config = osc.conf.Options()
        config = osc.conf.config
        host_options = osc.conf.HostOptions(
            config, apiurl="http://localhost", username="Admin"
        )
        config.api_host_options[host_options["apiurl"]] = host_options
        config["apiurl"] = host_options["apiurl"]
        self.host_options = host_options

    @patch.dict(os.environ, {}, clear=True)
    def test_vc_export_env_conf(self):
        self.host_options.realname = ""
        self.host_options.email = ""
        vc_export_env("http://localhost")
        expected = {
            "VC_REALNAME": "",
            "VC_MAILADDR": "",
            "mailaddr": "",
        }
        self.assertEqual(os.environ, expected)

    @patch.dict(os.environ, {}, clear=True)
    @GET(
        "http://localhost/person/Admin",
        text="Adminroot@localhostOBS Instance Superuser",
    )
    def test_vc_export_env_conf_realname(self):
        self.host_options.realname = ""
        vc_export_env("http://localhost")
        expected = {
            "VC_REALNAME": "",
            "VC_MAILADDR": "root@localhost",
            "mailaddr": "root@localhost",
        }
        self.assertEqual(os.environ, expected)

    @patch.dict(os.environ, {}, clear=True)
    @GET(
        "http://localhost/person/Admin",
        text="Adminroot@localhostOBS Instance Superuser",
    )
    def test_vc_export_env_conf_email(self):
        self.host_options.email = ""
        vc_export_env("http://localhost")
        expected = {
            "VC_REALNAME": "OBS Instance Superuser",
            "VC_MAILADDR": "",
            "mailaddr": "",
        }
        self.assertEqual(os.environ, expected)

    @patch.dict(os.environ, {}, clear=True)
    @GET(
        "http://localhost/person/Admin",
        text="Adminroot@localhostOBS Instance Superuser",
    )
    def test_vc_export_env_api_call(self):
        vc_export_env("http://localhost")
        expected = {
            "VC_REALNAME": "OBS Instance Superuser",
            "VC_MAILADDR": "root@localhost",
            "mailaddr": "root@localhost",
        }
        self.assertEqual(os.environ, expected)


if __name__ == "__main__":
    unittest.main()

osc-1.12.1/tests/test_xpath.py

import unittest

from osc.util.xpath import XPathQuery as Q


class TestQuery(unittest.TestCase):
    def test_noop(self):
        q = Q(name="foo")
        self.assertEqual(str(q), "@name='foo'")

    def test_not(self):
        q = Q(name__not="foo")
        self.assertEqual(str(q), "not(@name='foo')")

    def test_eq(self):
        q = Q(name__eq="foo")
        self.assertEqual(str(q), "@name='foo'")

    def test_not_eq(self):
        q = Q(name__not__eq="foo")
        self.assertEqual(str(q), "not(@name='foo')")

    def test_contains(self):
        q = Q(name__contains="foo")
        self.assertEqual(str(q), "contains(@name, 'foo')")

    def test_and(self):
        q1 = Q(name="foo")
        q2 = Q(name="bar")
        q = q1 & q2
        self.assertEqual(str(q), "@name='foo' and @name='bar'")

        q3 = Q(name="baz")
        q = q & q3
        self.assertEqual(str(q), "@name='foo' and @name='bar' and @name='baz'")

    def test_or(self):
        q1 = Q(name="foo")
        q2 = Q(name="bar")
        q = q1 | q2
        self.assertEqual(str(q), "@name='foo' or @name='bar'")

        q3 = Q(name="baz")
        q = q | q3
        self.assertEqual(str(q), "@name='foo' or @name='bar' or @name='baz'")

    def test_and_or(self):
        q1 = Q(name="foo")
        q2 = Q(name="bar")
        q = q1 & q2
        self.assertEqual(str(q), "@name='foo' and @name='bar'")

        q3 = Q(name="baz")
        q = q | q3
        self.assertEqual(str(q), "(@name='foo' and @name='bar') or @name='baz'")

        q4 = Q(name="xyz")
        q = q | q4
self.assertEqual(str(q), "(@name='foo' and @name='bar') or @name='baz' or @name='xyz'") def test_or_and(self): q1 = Q(name="foo") q2 = Q(name="bar") q = q1 | q2 self.assertEqual(str(q), "@name='foo' or @name='bar'") q3 = Q(name="baz") q = q & q3 self.assertEqual(str(q), "(@name='foo' or @name='bar') and @name='baz'") q4 = Q(name="xyz") q = q & q4 self.assertEqual(str(q), "(@name='foo' or @name='bar') and @name='baz' and @name='xyz'") def test_and_or_and(self): q1 = Q(name="foo") q2 = Q(name="bar") q3 = Q(name="baz") q4 = Q(name="xyz") q = (q1 & q2) | (q3 & q4) self.assertEqual(str(q), "(@name='foo' and @name='bar') or (@name='baz' and @name='xyz')") def test_or_and_or(self): q1 = Q(name="foo") q2 = Q(name="bar") q3 = Q(name="baz") q4 = Q(name="xyz") q = (q1 | q2) & (q3 | q4) self.assertEqual(str(q), "(@name='foo' or @name='bar') and (@name='baz' or @name='xyz')") def test_multiple_kwargs(self): q = Q(name1="foo", name2="bar") self.assertEqual(str(q), "@name1='foo' and @name2='bar'") def test_eq_list(self): q = Q(name=["foo", "bar", "baz"]) self.assertEqual(str(q), "@name='foo' or @name='bar' or @name='baz'") def test_not_eq_list(self): q = Q(name__not=["foo", "bar", "baz"]) self.assertEqual(str(q), "not(@name='foo') and not(@name='bar') and not(@name='baz')") def test_review_state(self): q = Q(state__name=["new"]) self.assertEqual(str(q), "state[@name='new']") if __name__ == "__main__": unittest.main() osc-1.12.1/tests/update_fixtures/000077500000000000000000000000001475337502500167505ustar00rootroot00000000000000osc-1.12.1/tests/update_fixtures/meta.xml000066400000000000000000000002711475337502500204200ustar00rootroot00000000000000 <description> </description> <person userid="Admin" role="maintainer"/> <person userid="Admin" role="bugowner"/> </package>���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/oscrc��������������������������������������������������������������0000664�0000000�0000000�00000000136�14753375025�0020004�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������[general] apiurl = http://localhost [http://localhost] user=Admin pass=opensuse allow_http=1 
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/�����������������������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0020434�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/.osc/������������������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0021276�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/.osc/_apiurl�����������������������������������������������0000664�0000000�0000000�00000000021�14753375025�0022645�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������http://localhost ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/.osc/_osclib_version���������������������������������������0000664�0000000�0000000�00000000004�14753375025�0024372�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������1.0 
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/.osc/_packages���������������������������������������������0000664�0000000�0000000�00000000033�14753375025�0023132�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������<project name="osctest" /> �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/.osc/_project����������������������������������������������0000664�0000000�0000000�00000000010�14753375025�0023015�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osctest 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/���������������������������������������0000775�0000000�0000000�00000000000�14753375025�0024424�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/.osc/����������������������������������0000775�0000000�0000000�00000000000�14753375025�0025266�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/.osc/_apiurl���������������������������0000664�0000000�0000000�00000000021�14753375025�0026635�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������http://localhost ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/.osc/_files����������������������������0000664�0000000�0000000�00000000606�14753375025�0026454�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������<directory name="already_in_conflict" rev="1" srcmd5="2df1eacfe03a3bec2112529e7f4dc39a" vrev="1"> <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282133912" name="foo" size="23" /> <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282133912" name="merge" size="48" /> <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282133912" name="nochange" size="25" /> 
</directory>��������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/.osc/_in_conflict����������������������0000664�0000000�0000000�00000000006�14753375025�0027633�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������merge ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/.osc/_meta�����������������������������0000664�0000000�0000000�00000000306�14753375025�0026275�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������<package project="osctest" name="already_in_conflict"> <title/> <description> </description> <person userid="Admin" role="maintainer"/> <person userid="Admin" role="bugowner"/> </package>��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/.osc/_osclib_version�������������������0000664�0000000�0000000�00000000004�14753375025�0030362�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������1.0 
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/.osc/_package��������������������������0000664�0000000�0000000�00000000023�14753375025�0026736�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������already_in_conflict�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/.osc/_project��������������������������0000664�0000000�0000000�00000000007�14753375025�0027013�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osctest�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/.osc/foo�������������������������������0000664�0000000�0000000�00000000027�14753375025�0025773�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This is a simple test. 
���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/.osc/merge�����������������������������0000664�0000000�0000000�00000000060�14753375025�0026304�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������Is it possible to merge this file? I hope so... ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/.osc/nochange��������������������������0000664�0000000�0000000�00000000031�14753375025�0026765�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This file didn't change. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/foo������������������������������������0000664�0000000�0000000�00000000027�14753375025�0025131�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This is a simple test. 
���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/merge����������������������������������0000664�0000000�0000000�00000000023�14753375025�0025441�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������Is it I hope so... �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/already_in_conflict/nochange�������������������������������0000664�0000000�0000000�00000000031�14753375025�0026123�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This file didn't change. 
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/conflict/��������������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0022235�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/conflict/.osc/���������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0023077�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/conflict/.osc/_apiurl��������������������������������������0000664�0000000�0000000�00000000021�14753375025�0024446�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������http://localhost ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/conflict/.osc/_files���������������������������������������0000664�0000000�0000000�00000000573�14753375025�0024270�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������<directory name="conflict" rev="1" srcmd5="2df1eacfe03a3bec2112529e7f4dc39a" vrev="1"> <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282130148" name="foo" size="23" /> <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282130148" name="merge" size="48" /> <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282130148" name="nochange" size="25" /> 
</directory>�������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/conflict/.osc/_osclib_version������������������������������0000664�0000000�0000000�00000000004�14753375025�0026173�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������1.0 ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/conflict/.osc/_package�������������������������������������0000664�0000000�0000000�00000000010�14753375025�0024543�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������conflict������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/conflict/.osc/_project�������������������������������������0000664�0000000�0000000�00000000007�14753375025�0024624�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osctest�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/conflict/.osc/foo������������������������������������������0000664�0000000�0000000�00000000027�14753375025�0023604�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0
000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This is a simple test. ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/conflict/.osc/merge����������������������������������������0000664�0000000�0000000�00000000060�14753375025�0024115�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������Is it possible to merge this file? I hope so... ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/conflict/.osc/nochange�������������������������������������0000664�0000000�0000000�00000000031�14753375025�0024576�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This file didn't change. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/conflict/foo�����������������������������������������������0000664�0000000�0000000�00000000027�14753375025�0022742�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This is a simple test. 
���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/conflict/merge���������������������������������������������0000664�0000000�0000000�00000000060�14753375025�0023253�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������Is it possible to merge this file? I hope so... ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/conflict/nochange������������������������������������������0000664�0000000�0000000�00000000031�14753375025�0023734�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This file didn't change. 
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/deleted/���������������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0022042�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/deleted/.osc/����������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0022704�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/deleted/.osc/_apiurl���������������������������������������0000664�0000000�0000000�00000000021�14753375025�0024253�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������http://localhost ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/deleted/.osc/_files����������������������������������������0000664�0000000�0000000�00000000572�14753375025�0024074�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������<directory name="deleted" rev="1" srcmd5="2df1eacfe03a3bec2112529e7f4dc39a" vrev="1"> <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282134731" name="foo" size="23" /> <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282134731" name="merge" size="48" /> <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282134731" name="nochange" size="25" /> 
</directory>��������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/deleted/.osc/_osclib_version�������������������������������0000664�0000000�0000000�00000000004�14753375025�0026000�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������1.0 ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/deleted/.osc/_package��������������������������������������0000664�0000000�0000000�00000000007�14753375025�0024356�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������deleted�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/deleted/.osc/_project��������������������������������������0000664�0000000�0000000�00000000007�14753375025�0024431�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osctest�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/deleted/.osc/_to_be_deleted��������������������������������0000664�0000000�0000000�00000000012�14753375025�0025535�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�
0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������merge foo ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/deleted/.osc/foo�������������������������������������������0000664�0000000�0000000�00000000027�14753375025�0023411�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This is a simple test. ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/deleted/.osc/merge�����������������������������������������0000664�0000000�0000000�00000000060�14753375025�0023722�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������Is it possible to merge this file? I hope so... ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/deleted/.osc/nochange��������������������������������������0000664�0000000�0000000�00000000031�14753375025�0024403�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This file didn't change. 
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/deleted/merge����������������������������������������������0000664�0000000�0000000�00000000061�14753375025�0023061�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������Is it possible to, merge this file? I hope so... �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/deleted/nochange�������������������������������������������0000664�0000000�0000000�00000000031�14753375025�0023541�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This file didn't change. 
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize/�������������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0022445�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize/.osc/��������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0023307�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize/.osc/_apiurl�������������������������������������0000664�0000000�0000000�00000000021�14753375025�0024656�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������http://localhost ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize/.osc/_files��������������������������������������0000664�0000000�0000000�00000000575�14753375025�0024502�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������<directory name="limitsize" rev="1" srcmd5="2df1eacfe03a3bec2112529e7f4dc39a" vrev="1"> <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282047302" name="foo" size="23" /> <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282047303" name="merge" size="48" /> <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282047303" name="nochange" size="25" /> </directory> 
�����������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize/.osc/_osclib_version�����������������������������0000664�0000000�0000000�00000000004�14753375025�0026403�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������1.0 ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize/.osc/_package������������������������������������0000664�0000000�0000000�00000000012�14753375025�0024755�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������limitsize ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize/.osc/_project������������������������������������0000664�0000000�0000000�00000000007�14753375025�0025034�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osctest�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize/.osc/foo�����������������������������������������0000664�0000000�0000000�00000000027�14753375025�0024014�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000��������
����������������������������������������������������������������������������������������������������������������������������������������������������������������This is a simple test. ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize/.osc/merge���������������������������������������0000664�0000000�0000000�00000000060�14753375025�0024325�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������Is it possible to merge this file? I hope so... ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize/.osc/nochange������������������������������������0000664�0000000�0000000�00000000031�14753375025�0025006�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This file didn't change. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize/foo����������������������������������������������0000664�0000000�0000000�00000000027�14753375025�0023152�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This is a simple test. 
���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize/merge��������������������������������������������0000664�0000000�0000000�00000000060�14753375025�0023463�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������Is it possible to merge this file? I hope so... ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize/nochange�����������������������������������������0000664�0000000�0000000�00000000051�14753375025�0024146�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This file didn't change but is modified. 
���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize_local/�������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0023617�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize_local/.osc/��������������������������������������0000775�0000000�0000000�00000000000�14753375025�0024461�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize_local/.osc/_apiurl�������������������������������0000664�0000000�0000000�00000000021�14753375025�0026030�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������http://localhost ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize_local/.osc/_files��������������������������������0000664�0000000�0000000�00000000575�14753375025�0025654�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������<directory name="limitsize" rev="1" srcmd5="2df1eacfe03a3bec2112529e7f4dc39a" vrev="1"> <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282047302" name="foo" size="23" /> <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282047303" name="merge" size="48" /> <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282047303" name="nochange" size="25" /> </directory> 
�����������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize_local/.osc/_osclib_version�����������������������0000664�0000000�0000000�00000000004�14753375025�0027555�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������1.0 ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize_local/.osc/_package������������������������������0000664�0000000�0000000�00000000020�14753375025�0026126�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������limitsize_local ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize_local/.osc/_project������������������������������0000664�0000000�0000000�00000000007�14753375025�0026206�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osctest�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize_local/.osc/_size_limit���������������������������0000664�0000000�0000000�00000000003�14753375025�0026704�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000��������
����������������������������������������������������������������������������������������������������������������������������������������������������������������30 �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize_local/.osc/foo�����������������������������������0000664�0000000�0000000�00000000027�14753375025�0025166�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This is a simple test. ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize_local/.osc/merge���������������������������������0000664�0000000�0000000�00000000060�14753375025�0025477�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������Is it possible to merge this file? I hope so... ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize_local/.osc/nochange������������������������������0000664�0000000�0000000�00000000031�14753375025�0026160�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This file didn't change. 
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize_local/foo����������������������������������������0000664�0000000�0000000�00000000027�14753375025�0024324�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This is a simple test. ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize_local/merge��������������������������������������0000664�0000000�0000000�00000000060�14753375025�0024635�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������Is it possible to merge this file? I hope so... ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/limitsize_local/nochange�����������������������������������0000664�0000000�0000000�00000000051�14753375025�0025320�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This file didn't change but is modified. 
���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/metamode/��������������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0022227�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/metamode/.osc/���������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0023071�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/metamode/.osc/_apiurl��������������������������������������0000664�0000000�0000000�00000000021�14753375025�0024440�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������http://localhost ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/metamode/.osc/_files���������������������������������������0000664�0000000�0000000�00000000571�14753375025�0024260�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������<directory name="simple" rev="1" srcmd5="2df1eacfe03a3bec2112529e7f4dc39a" vrev="1"> <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282047302" name="foo" size="23" /> <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282047303" name="merge" size="48" /> <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282047303" name="nochange" size="25" /> 
</directory>���������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/metamode/.osc/_meta_mode�����������������������������������0000664�0000000�0000000�00000000000�14753375025�0025073�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/metamode/.osc/_osclib_version������������������������������0000664�0000000�0000000�00000000004�14753375025�0026165�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������1.0 ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/metamode/.osc/_package�������������������������������������0000664�0000000�0000000�00000000011�14753375025�0024536�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������metamode 
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/metamode/.osc/_project�������������������������������������0000664�0000000�0000000�00000000007�14753375025�0024616�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osctest�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/metamode/.osc/foo������������������������������������������0000664�0000000�0000000�00000000027�14753375025�0023576�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This is a simple test. ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/metamode/.osc/merge����������������������������������������0000664�0000000�0000000�00000000060�14753375025�0024107�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������Is it possible to merge this file? I hope so... 
��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/metamode/.osc/nochange�������������������������������������0000664�0000000�0000000�00000000031�14753375025�0024570�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This file didn't change. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/metamode/foo�����������������������������������������������0000664�0000000�0000000�00000000027�14753375025�0022734�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This is a simple test. ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/metamode/merge���������������������������������������������0000664�0000000�0000000�00000000060�14753375025�0023245�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������Is it possible to merge this file? I hope so... 
��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/metamode/nochange������������������������������������������0000664�0000000�0000000�00000000031�14753375025�0023726�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This file didn't change. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/new/�������������������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0021225�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/new/.osc/��������������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0022067�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/new/.osc/_apiurl�������������������������������������������0000664�0000000�0000000�00000000021�14753375025�0023436�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������http://localhost 
osc-1.12.1/tests/update_fixtures/osctest/ — per-package working-copy fixtures used by the update tests. Each path is listed with its file content; empty files are marked "(empty file)".

osc-1.12.1/tests/update_fixtures/osctest/new/.osc/_files:
  <directory name="new" />

osc-1.12.1/tests/update_fixtures/osctest/new/.osc/_osclib_version:
  1.0

osc-1.12.1/tests/update_fixtures/osctest/new/.osc/_package:
  new

osc-1.12.1/tests/update_fixtures/osctest/new/.osc/_project:
  osctest

osc-1.12.1/tests/update_fixtures/osctest/restore/.osc/_apiurl:
  http://localhost

osc-1.12.1/tests/update_fixtures/osctest/restore/.osc/_files:
  <directory name="restore" rev="1" srcmd5="2df1eacfe03a3bec2112529e7f4dc39a" vrev="1">
    <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282047302" name="foo" size="23" />
    <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282047303" name="merge" size="48" />
    <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282047303" name="nochange" size="25" />
  </directory>

osc-1.12.1/tests/update_fixtures/osctest/restore/.osc/_osclib_version:
  1.0

osc-1.12.1/tests/update_fixtures/osctest/restore/.osc/_package:
  restore

osc-1.12.1/tests/update_fixtures/osctest/restore/.osc/_project:
  osctest

osc-1.12.1/tests/update_fixtures/osctest/restore/.osc/foo:
  This is a simple test.

osc-1.12.1/tests/update_fixtures/osctest/restore/.osc/merge:
  Is it possible to merge this file? I hope so...

osc-1.12.1/tests/update_fixtures/osctest/restore/.osc/nochange:
  This file didn't change.

osc-1.12.1/tests/update_fixtures/osctest/restore/exists:
  (empty file)

osc-1.12.1/tests/update_fixtures/osctest/restore/merge:
  Is it possible to merge this file? I hope so...

osc-1.12.1/tests/update_fixtures/osctest/restore/nochange:
  This file didn't change.

osc-1.12.1/tests/update_fixtures/osctest/resume/.osc/_apiurl:
  http://localhost

osc-1.12.1/tests/update_fixtures/osctest/resume/.osc/_files:
  <directory name="simple" rev="1" srcmd5="2df1eacfe03a3bec2112529e7f4dc39a" vrev="1">
    <entry md5="ff22941336956098ae9a564289d1bf1b" mtime="1282137256" name="added" size="15" />
    <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282047302" name="foo" size="23" />
    <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282047303" name="merge" size="48" />
    <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282047303" name="nochange" size="25" />
  </directory>

osc-1.12.1/tests/update_fixtures/osctest/resume/.osc/_in_update/_files:
  <directory name="simple" rev="2" srcmd5="3ac41c59a5ed169d5ffef4d824700f7d" vrev="2">
    <entry md5="ff22941336956098ae9a564289d1bf1b" mtime="1282137256" name="added" size="15" />
    <entry md5="14758f1afd44c09b7992073ccf00b43d" mtime="1282137220" name="foo" size="7" />
    <entry md5="256d8f76ba7a0a231fb46a84866f25d8" mtime="1282137238" name="merge" size="20" />
    <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282047303" name="nochange" size="25" />
  </directory>

osc-1.12.1/tests/update_fixtures/osctest/resume/.osc/_in_update/foo:
  This is a simple test.

osc-1.12.1/tests/update_fixtures/osctest/resume/.osc/_meta:
  <package project="osctest" name="simple">
    <title/>
    <description> </description>
    <person userid="Admin" role="maintainer"/>
    <person userid="Admin" role="bugowner"/>
  </package>

osc-1.12.1/tests/update_fixtures/osctest/resume/.osc/_osclib_version:
  1.0

osc-1.12.1/tests/update_fixtures/osctest/resume/.osc/_package:
  simple

osc-1.12.1/tests/update_fixtures/osctest/resume/.osc/_project:
  osctest

osc-1.12.1/tests/update_fixtures/osctest/resume/.osc/added:
  This is a test

osc-1.12.1/tests/update_fixtures/osctest/resume/.osc/foo:
  This is a simple test.

osc-1.12.1/tests/update_fixtures/osctest/resume/.osc/merge:
  Is it possible to merge this file? I hope so...

osc-1.12.1/tests/update_fixtures/osctest/resume/.osc/nochange:
  This file didn't change.

osc-1.12.1/tests/update_fixtures/osctest/resume/added:
  This is a test

osc-1.12.1/tests/update_fixtures/osctest/resume/exists:
  (empty file)

osc-1.12.1/tests/update_fixtures/osctest/resume/foo:
  This is a simple test.

osc-1.12.1/tests/update_fixtures/osctest/resume/merge:
  Is it possible to merge this file? I hope so...

osc-1.12.1/tests/update_fixtures/osctest/resume/nochange:
  This file didn't change.
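The .osc/_files stores listed above are plain XML directory listings, one <entry/> per tracked file with its md5, mtime, name, and size. The following is a minimal sketch of reading such a listing with Python's standard library; it assumes only the XML shape shown in these fixtures and is not osc's own code. The path "simple/.osc/_files" in the usage block is a hypothetical checkout path.

# Minimal sketch: parse a ".osc/_files" listing like the fixtures above.
import xml.etree.ElementTree as ET

def read_files_listing(path):
    """Return a list of entry dicts from a _files XML listing."""
    root = ET.parse(path).getroot()          # the <directory ...> element
    entries = []
    for entry in root.findall("entry"):      # one <entry/> per tracked file
        entries.append({
            "name": entry.get("name"),
            "md5": entry.get("md5"),
            "size": int(entry.get("size", "0")),
            "skipped": entry.get("skipped") == "true",
        })
    return entries

if __name__ == "__main__":
    # hypothetical checkout path, matching the "simple" fixture layout
    for e in read_files_listing("simple/.osc/_files"):
        print(f'{e["name"]}: md5={e["md5"]} size={e["size"]}')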
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/��������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0023422�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/.osc/���������������������������������������0000775�0000000�0000000�00000000000�14753375025�0024264�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/.osc/_apiurl��������������������������������0000664�0000000�0000000�00000000021�14753375025�0025633�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������http://localhost ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/.osc/_files���������������������������������0000664�0000000�0000000�00000000726�14753375025�0025455�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������<directory name="simple" rev="1" srcmd5="3ac41c59a5ed169d5ffef4d824700f7d" vrev="1"> <entry md5="d41d8cd98f00b204e9800998ecf8427e" mtime="1282137256" name="added" size="15" /> <entry md5="14758f1afd44c09b7992073ccf00b43d" mtime="1282137220" name="foo" size="7" /> <entry md5="256d8f76ba7a0a231fb46a84866f25d8" mtime="1282137238" name="merge" size="20" /> <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282047303" name="nochange" size="25" /> </directory> 
������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/.osc/_in_update/����������������������������0000775�0000000�0000000�00000000000�14753375025�0026373�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/.osc/_in_update/_files����������������������0000664�0000000�0000000�00000000571�14753375025�0027562�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������<directory name="simple" rev="1" srcmd5="2df1eacfe03a3bec2112529e7f4dc39a" vrev="1"> <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282047302" name="foo" size="23" /> <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282047303" name="merge" size="48" /> <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282047303" name="nochange" size="25" /> </directory>���������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/.osc/_in_update/foo�������������������������0000664�0000000�0000000�00000000007�14753375025�0027076�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������foobar �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/.osc/_meta����������������������������������0000664�0000000�0000000�00000000271�14753375025�0025274�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������<package project="osctest" name="simple"> <title/> <description> </description> <person userid="Admin" role="maintainer"/> <person userid="Admin" role="bugowner"/> 
</package>���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/.osc/_osclib_version������������������������0000664�0000000�0000000�00000000004�14753375025�0027360�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������1.0 ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/.osc/_package�������������������������������0000664�0000000�0000000�00000000006�14753375025�0025735�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������simple��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/.osc/_project�������������������������������0000664�0000000�0000000�00000000007�14753375025�0026011�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osctest�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/.osc/added����������������������������������0000664�0000000�0000000�00000000000�1475337502
5�0025236�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/.osc/foo������������������������������������0000664�0000000�0000000�00000000007�14753375025�0024767�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������foobar �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/.osc/merge����������������������������������0000664�0000000�0000000�00000000024�14753375025�0025302�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������xxx xxx yyy zzz zzz ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/.osc/nochange�������������������������������0000664�0000000�0000000�00000000031�14753375025�0025763�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This file didn't change. 
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/added���������������������������������������0000664�0000000�0000000�00000000000�14753375025�0024374�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/exists��������������������������������������0000664�0000000�0000000�00000000000�14753375025�0024652�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/f�������������������������������������������0000664�0000000�0000000�00000000017�14753375025�0023570�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This is a test �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/foo�����������������������������������������0000664�0000000�0000000�00000000007�14753375025�0024125�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������foobar 
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/merge���������������������������������������0000664�0000000�0000000�00000000024�14753375025�0024440�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������xxx xxx yyy zzz zzz ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/resume_deleted/nochange������������������������������������0000664�0000000�0000000�00000000031�14753375025�0025121�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This file didn't change. 
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/services/��������������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0022257�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/services/.osc/���������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0023121�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/services/.osc/_apiurl��������������������������������������0000664�0000000�0000000�00000000021�14753375025�0024470�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������http://localhost ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/services/.osc/_files���������������������������������������0000664�0000000�0000000�00000000615�14753375025�0024307�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������<directory name="foo" rev="1" srcmd5="b9f060f4b3640e58a1d44abc25ffb9bd" vrev="1"> <entry md5="7b1458c733a187d4f3807665ddd02cca" mtime="1282565027" name="_service:exists" size="20" skipped="true" /> <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282320303" name="foo" size="23" /> <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282320303" name="merge" size="48" /> </directory> 
�������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/services/.osc/_osclib_version������������������������������0000664�0000000�0000000�00000000004�14753375025�0026215�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������1.0 ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/services/.osc/_package�������������������������������������0000664�0000000�0000000�00000000011�14753375025�0024566�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������services �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/services/.osc/_project�������������������������������������0000664�0000000�0000000�00000000007�14753375025�0024646�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osctest�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/services/.osc/foo������������������������������������������0000664�0000000�0000000�00000000027�14753375025�0023626�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������
������������������������������������������������������������������������������������������������������������������������������������������������This is a simple test. ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/services/.osc/merge����������������������������������������0000664�0000000�0000000�00000000060�14753375025�0024137�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������Is it possible to merge this file? I hope so... ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/services/_service:exists�����������������������������������0000664�0000000�0000000�00000000024�14753375025�0025327�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������another service foo ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/services/foo�����������������������������������������������0000664�0000000�0000000�00000000027�14753375025�0022764�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������This is a simple test. 
���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/services/merge���������������������������������������������0000664�0000000�0000000�00000000060�14753375025�0023275�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������Is it possible to merge this file? I hope so... ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/simple/����������������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0021725�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/simple/.osc/�����������������������������������������������0000775�0000000�0000000�00000000000�14753375025�0022567�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������osc-1.12.1/tests/update_fixtures/osctest/simple/.osc/_apiurl����������������������������������������0000664�0000000�0000000�00000000021�14753375025�0024136�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������http://localhost 
osc-1.12.1/tests/update_fixtures/osctest/simple/.osc/_files:
<directory name="simple" rev="1" srcmd5="2df1eacfe03a3bec2112529e7f4dc39a" vrev="1">
  <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282047302" name="foo" size="23" />
  <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282047303" name="merge" size="48" />
  <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282047303" name="nochange" size="25" />
</directory>

osc-1.12.1/tests/update_fixtures/osctest/simple/.osc/_osclib_version:
1.0
osc-1.12.1/tests/update_fixtures/osctest/simple/.osc/_package:
simple

osc-1.12.1/tests/update_fixtures/osctest/simple/.osc/_project:
osctest

osc-1.12.1/tests/update_fixtures/osctest/simple/.osc/foo:
This is a simple test.
osc-1.12.1/tests/update_fixtures/osctest/simple/.osc/merge:
Is it possible to merge this file? I hope so...

osc-1.12.1/tests/update_fixtures/osctest/simple/.osc/nochange:
This file didn't change.
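
The .osc entries above form the metadata store of a checked-out package: _apiurl, _project and _package record where the working copy came from, _osclib_version the store format, and _files caches the <directory> listing of the checked-out revision, while the plain copies of foo, merge and nochange are the pristine file versions. A minimal sketch of how such a store could be read follows; the function name and the returned layout are illustrative assumptions, not osc's actual API.

# Illustrative sketch only: read the ".osc" store of a working copy laid out
# like the "simple" fixture above. Names and return shape are assumptions.
import os
import xml.etree.ElementTree as ET


def read_store(wc_dir):
    store = os.path.join(wc_dir, ".osc")

    def slurp(name):
        with open(os.path.join(store, name)) as f:
            return f.read().strip()

    apiurl = slurp("_apiurl")        # e.g. "http://localhost"
    project = slurp("_project")      # e.g. "osctest"
    package = slurp("_package")      # e.g. "simple"

    # "_files" holds the <directory> listing of the checked-out revision.
    files = {
        entry.get("name"): {
            "md5": entry.get("md5"),
            "size": int(entry.get("size")),
            "mtime": int(entry.get("mtime")),
        }
        for entry in ET.fromstring(slurp("_files")).findall("entry")
    }
    return apiurl, project, package, files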
osc-1.12.1/tests/update_fixtures/osctest/simple/exists (empty file)

osc-1.12.1/tests/update_fixtures/osctest/simple/foo:
This is a simple test.

osc-1.12.1/tests/update_fixtures/osctest/simple/merge:
Is it possible to merge this file? I hope so...
osc-1.12.1/tests/update_fixtures/osctest/simple/nochange:
This file didn't change but is modified.

osc-1.12.1/tests/update_fixtures/testUpdateAlreadyInConflict_files:
<directory name="already_in_conflict" rev="2" srcmd5="686b725018c89978678e15daa666ff85" vrev="2">
  <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282133912" name="foo" size="23" />
  <entry md5="14758f1afd44c09b7992073ccf00b43d" mtime="1282134056" name="merge" size="7" />
  <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282133912" name="nochange" size="25" />
</directory>

osc-1.12.1/tests/update_fixtures/testUpdateAlreadyInConflict_merge:
foobar
osc-1.12.1/tests/update_fixtures/testUpdateConflict_files:
<directory name="conflict" rev="2" srcmd5="6463d0bd161765e9a2b7186606c72ca1" vrev="2">
  <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282130148" name="foo" size="23" />
  <entry md5="89fcd308c6e6919c472e56ec82ace945" mtime="1282130545" name="merge" size="46" />
  <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282130148" name="nochange" size="25" />
</directory>

osc-1.12.1/tests/update_fixtures/testUpdateConflict_merge:
Is it possible to merge this file? We'll see.
osc-1.12.1/tests/update_fixtures/testUpdateDeletedFile_files:
<directory name="simple" rev="2" srcmd5="2df1eacfe03a3bec2112529e7f4dc39a" vrev="2">
  <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282047303" name="merge" size="48" />
  <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282047303" name="nochange" size="25" />
</directory>

osc-1.12.1/tests/update_fixtures/testUpdateLimitSizeAddDelete_exists:
small

osc-1.12.1/tests/update_fixtures/testUpdateLimitSizeAddDelete_files:
<directory name="foo" rev="2" srcmd5="018a80019e08143e7ae324c778873d62" vrev="2">
  <entry md5="ed955c917012307d982b7cdd5799ff1a" mtime="1282320398" name="bigfile" size="69" skipped="true" />
  <entry md5="d15dbfcb847653913855e21370d83af1" mtime="1282553634" name="exists" size="6" />
  <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282320303" name="foo" size="23" skipped="true" />
  <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282320303" name="merge" size="48" skipped="true" />
</directory>
osc-1.12.1/tests/update_fixtures/testUpdateLimitSizeAddDelete_filesremote:
<directory name="foo" rev="2" vrev="2" srcmd5="018a80019e08143e7ae324c778873d62">
  <entry name="bigfile" md5="ed955c917012307d982b7cdd5799ff1a" size="69" mtime="1282320398" />
  <entry name="exists" md5="d15dbfcb847653913855e21370d83af1" size="6" mtime="1282553634" />
  <entry name="foo" md5="0d62ceea6020d75154078a20d8c9f9ba" size="23" mtime="1282320303" />
  <entry name="merge" md5="17b9e9e1a032ed44e7a584dc6303ffa8" size="48" mtime="1282320303" />
</directory>

osc-1.12.1/tests/update_fixtures/testUpdateLimitSizeNoChange_files:
<directory name="limitsize" rev="2" srcmd5="e51a3133d3d3eb2a48e06efb79e2d503" vrev="2">
  <entry md5="ed955c917012307d982b7cdd5799ff1a" mtime="1282320398" name="bigfile" size="69" skipped="true" />
  <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282320303" name="foo" size="23" />
  <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282320303" name="merge" size="48" />
  <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282047303" name="nochange" size="25" />
</directory>

osc-1.12.1/tests/update_fixtures/testUpdateLimitSizeNoChange_filesremote:
<directory name="limitsize" rev="2" vrev="2" srcmd5="e51a3133d3d3eb2a48e06efb79e2d503">
  <entry name="bigfile" md5="ed955c917012307d982b7cdd5799ff1a" size="69" mtime="1282320398" />
  <entry name="foo" md5="0d62ceea6020d75154078a20d8c9f9ba" size="23" mtime="1282320303" />
  <entry name="merge" md5="17b9e9e1a032ed44e7a584dc6303ffa8" size="48" mtime="1282320303" />
  <entry name="nochange" md5="7efa70f68983fad1cf487f69dedf93e9" size="25" mtime="1282047303" />
</directory>
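
The skipped="true" attributes in the listings above mark entries that a size-limited checkout records in "_files" but does not download. A small sketch of that decision follows, under the assumption that the rule is simply "skip entries larger than the configured limit"; the helper and the limit value are illustrative, not the code under test.

# Illustrative sketch: decide which entries of a remote <directory> listing a
# size-limited checkout would skip. Helper name and limit are assumptions.
import xml.etree.ElementTree as ET


def split_by_size_limit(directory_xml, size_limit):
    kept, skipped = [], []
    for entry in ET.fromstring(directory_xml).findall("entry"):
        name = entry.get("name")
        size = int(entry.get("size"))
        if size_limit is not None and size > size_limit:
            skipped.append(name)   # recorded in "_files" with skipped="true"
        else:
            kept.append(name)      # downloaded into the working copy
    return kept, skipped

With the limitsize listing above and a limit of, say, 50 bytes, only bigfile (69 bytes) ends up skipped, matching the skipped="true" flag in testUpdateLimitSizeNoChange_files; a much smaller limit would also skip foo and merge, as in testUpdateLimitSizeAddDelete_files.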
osc-1.12.1/tests/update_fixtures/testUpdateLocalDeletions_files:
<directory name="deleted" rev="2" srcmd5="0e717058d371ab9029336418c8c883bd" vrev="2">
  <entry md5="2bb5f888a0063a0931c12f35851953e4" mtime="1282135005" name="foo" size="37" />
  <entry md5="426e11f11438365322f102c02b0a33f0" mtime="1282134896" name="merge" size="50" />
  <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282134731" name="nochange" size="25" />
</directory>

osc-1.12.1/tests/update_fixtures/testUpdateLocalDeletions_foo:
This is a simple test. And an update

osc-1.12.1/tests/update_fixtures/testUpdateLocalDeletions_merge:
Is it possible to merge this file? We'll see.
Foo

osc-1.12.1/tests/update_fixtures/testUpdateLocalLimitSizeNoChange_files:
<directory name="limitsize_local" rev="2" srcmd5="e51a3133d3d3eb2a48e06efb79e2d503" vrev="2">
  <entry md5="ed955c917012307d982b7cdd5799ff1a" mtime="1282320398" name="bigfile" size="69" skipped="true" />
  <entry md5="0d62ceea6020d75154078a20d8c9f9ba" mtime="1282320303" name="foo" size="23" />
  <entry md5="17b9e9e1a032ed44e7a584dc6303ffa8" mtime="1282320303" name="merge" size="48" skipped="true" />
  <entry md5="7efa70f68983fad1cf487f69dedf93e9" mtime="1282047303" name="nochange" size="25" />
</directory>

osc-1.12.1/tests/update_fixtures/testUpdateLocalLimitSizeNoChange_filesremote:
<directory name="limitsize_local" rev="2" vrev="2" srcmd5="e51a3133d3d3eb2a48e06efb79e2d503">
  <entry name="bigfile" md5="ed955c917012307d982b7cdd5799ff1a" size="69" mtime="1282320398" />
  <entry name="foo" md5="0d62ceea6020d75154078a20d8c9f9ba" size="23" mtime="1282320303" />
  <entry name="merge" md5="17b9e9e1a032ed44e7a584dc6303ffa8" size="48" mtime="1282320303" />
  <entry name="nochange" md5="7efa70f68983fad1cf487f69dedf93e9" size="25" mtime="1282047303" />
</directory>

osc-1.12.1/tests/update_fixtures/testUpdateMetaMode__meta:
<package project="osctest" name="metamode">
  <title>foo</title>
  <description/>
</package>

osc-1.12.1/tests/update_fixtures/testUpdateMetaMode_filesremote (content not shown)

osc-1.12.1/tests/update_fixtures/testUpdateNewFileLocalExists_exists:
exists

osc-1.12.1/tests/update_fixtures/testUpdateNewFileLocalExists_files (content not shown)

osc-1.12.1/tests/update_fixtures/testUpdateNewFile_files (content not shown)

osc-1.12.1/tests/update_fixtures/testUpdateNewFile_upstream_added:
This is a simple test.

osc-1.12.1/tests/update_fixtures/testUpdateNew_filesremote (content not shown)

osc-1.12.1/tests/update_fixtures/testUpdateNoChanges_files (content not shown)

osc-1.12.1/tests/update_fixtures/testUpdateRestore_files (content not shown)

osc-1.12.1/tests/update_fixtures/testUpdateRestore_foo:
This is a simple test.

osc-1.12.1/tests/update_fixtures/testUpdateResumeDeletedFile_files (content not shown)

osc-1.12.1/tests/update_fixtures/testUpdateResumeDeletedFile_foo:
This is a simple test.

osc-1.12.1/tests/update_fixtures/testUpdateResumeDeletedFile_merge:
Is it possible to merge this file? I hope so...

osc-1.12.1/tests/update_fixtures/testUpdateResume_files (content not shown)

osc-1.12.1/tests/update_fixtures/testUpdateResume_foo:
foobar

osc-1.12.1/tests/update_fixtures/testUpdateResume_merge:
xxx xxx yyy zzz zzz

osc-1.12.1/tests/update_fixtures/testUpdateServiceFilesAddDelete__service:bar:
another service

osc-1.12.1/tests/update_fixtures/testUpdateServiceFilesAddDelete__service:foo:
small

osc-1.12.1/tests/update_fixtures/testUpdateServiceFilesAddDelete_bigfile:
This is a file with a lot of text. Foo foo bar bar bar. foobarfoobar

osc-1.12.1/tests/update_fixtures/testUpdateServiceFilesAddDelete_files (content not shown)

osc-1.12.1/tests/update_fixtures/testUpdateServiceFilesAddDelete_filesremote (content not shown)

osc-1.12.1/tests/update_fixtures/testUpdateUpstreamModifiedFile_files (content not shown)

osc-1.12.1/tests/update_fixtures/testUpdateUpstreamModifiedFile_foo:
This is a simple test.
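
Each update scenario above pairs a local "_files" listing with a remote "filesremote" listing plus the file bodies involved; an update has to derive additions, deletions and content changes from the two. A minimal sketch of that comparison follows, assuming md5 equality is the only criterion; the function and names are illustrative, not the test suite's own helpers.

# Illustrative sketch: classify entries by comparing a local "_files" listing
# with the corresponding remote <directory> listing. Names are assumptions.
import xml.etree.ElementTree as ET


def classify_update(local_xml, remote_xml):
    def md5_by_name(xml_text):
        return {
            entry.get("name"): entry.get("md5")
            for entry in ET.fromstring(xml_text).findall("entry")
        }

    local = md5_by_name(local_xml)
    remote = md5_by_name(remote_xml)
    added = sorted(set(remote) - set(local))
    deleted = sorted(set(local) - set(remote))
    changed = sorted(
        name for name in set(local) & set(remote) if local[name] != remote[name]
    )
    return added, deleted, changed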