File _service:obs_scm:sle-prjmgr-tools-1678907444.9216408.obscpio of Package sle-prjmgr-tools
File: `.github/ISSUE_TEMPLATE/bug_report.md`

````markdown
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: bug
assignees: ''
---

## Describe the bug

A clear and concise description of what the bug is.

## To Reproduce

Steps to reproduce the behavior:

1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error

## Expected behavior

A clear and concise description of what you expected to happen.

## Logs

If applicable, add logs to help explain your problem.

## Version

```
<output of sle-prjmgr-tools -v>
```

## Additional context

Add any other context about the problem here.
````

File: `.github/ISSUE_TEMPLATE/feature_request.md`

```markdown
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: enhancement
assignees: ''
---

## Is your feature request related to a problem? Please describe

A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

## Describe the solution you'd like

A clear and concise description of what you want to happen.

## Describe alternatives you've considered

A clear and concise description of any alternative solutions or features you've considered.

## Additional context

Add any other context or screenshots about the feature request here.
```
File: `.github/workflows/build.yml`

```yaml
name: Build

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  build_rpm:
    name: Build RPM in Docker container
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      - name: Build Docker image
        uses: docker/build-push-action@v3
        with:
          context: .
          file: docker/build.dockerfile
          push: false
          load: true
          tags: sle-prjmgr-tools-builder
      - name: Build RPM inside Docker image
        run: docker run --rm -v $PWD:/code sle-prjmgr-tools-builder
      - name: Store built RPMs
        uses: actions/upload-artifact@v3
        with:
          name: rpms
          path: rpm-build/*.rpm
  install_rpm:
    name: Install previously built RPM
    needs: build_rpm
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Download a single artifact
        uses: actions/download-artifact@v3
        with:
          name: rpms
          path: rpm-build
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      - name: Build Docker image
        uses: docker/build-push-action@v3
        with:
          context: .
          file: docker/install-check.dockerfile
          push: false
          load: true
          tags: sle-prjmgr-tools-install-check
      - name: Try installing the RPM in an openSUSE Leap image
        run: docker run --rm -v $PWD/rpm-build:/rpms sle-prjmgr-tools-install-check
```

File: `.github/workflows/codeql.yml`

```yaml
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: "CodeQL"

on:
  push:
    branches: [ "main" ]
  pull_request:
    # The branches below must be a subset of the branches above
    branches: [ "main" ]
  schedule:
    - cron: '42 4 * * 4'

jobs:
  analyze:
    name: Analyze
    runs-on: ubuntu-latest
    permissions:
      actions: read
      contents: read
      security-events: write
    strategy:
      fail-fast: false
      matrix:
        language: [ 'python' ]
    steps:
      - name: Checkout repository
        uses: actions/checkout@v3

      # Initializes the CodeQL tools for scanning.
      - name: Initialize CodeQL
        uses: github/codeql-action/init@v2
        with:
          languages: ${{ matrix.language }}
          # If you wish to specify custom queries, you can do so here or in a config file.
          # By default, queries listed here will override any specified in a config file.
          # Prefix the list here with "+" to use these queries and those in the config file.
          # For details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
          # queries: security-extended,security-and-quality

      # Autobuild attempts to build any compiled languages (C/C++, C#, Go, or Java).
      # If this step fails, then you should remove it and run the build manually (see below)
      - name: Autobuild
        uses: github/codeql-action/autobuild@v2

      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@v2
        with:
          category: "/language:${{matrix.language}}"
```

File: `.github/workflows/lint.yml`

```yaml
name: Linting

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  python_pylint:
    name: pylint
    runs-on: ubuntu-20.04
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: '3.6'
          cache: 'pip' # caching pip dependencies
          cache-dependency-path: '**/setup.cfg'
      - run: pip install .[lint]
      - run: pylint --rcfile=.pylintrc sle_prjmgr_tools
  python_mypy:
    name: mypy
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: '3.8' # We need Python 3.8, so we can install types-lxml
          cache: 'pip' # caching pip dependencies
          cache-dependency-path: '**/setup.cfg'
      - run: pip install .[lint]
      - run: pip install types-lxml
      - run: python -m mypy --check-untyped-defs sle_prjmgr_tools
  python_black:
    name: black formatter
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: psf/black@stable
        with:
          options: "--check --safe --verbose"
          version: "22.8.0"
  lint_docs:
    runs-on: ubuntu-20.04
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: 3.9
      - name: Install dependencies
        run: pip install -U rstcheck doc8 sphinx
      - name: Run rstcheck
        run: rstcheck -r docs
      - name: Run doc8
        run: doc8 --ignore D001 docs
```

File: `.github/workflows/release_pypi.yml`

```yaml
name: Publish Python distributions to PyPI

on:
  push:
    branches:
      - main

jobs:
  build-n-publish:
    name: Build and publish Python distributions to PyPI
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v4
        with:
          python-version: '3.11'
          cache: 'pip'
          cache-dependency-path: '**/setup.cfg'
      - name: Install build dependencies
        run: pip install setuptools wheel
      - name: Fetch date for version bump
        run: echo "new_version=$(date +'%Y%m%d%H%M')" >> $GITHUB_ENV
      - name: Replace version in __init__.py
        run: sed -i '/__version__ = "[0-9].[0-9].[0-9]/s/.$/.'${{ env.new_version }}'"/g' sle_prjmgr_tools/__init__.py
      - name: Install Deps with pip
        run: pip install .
      - name: Install pypa/build
        run: python -m pip install build --user
      - name: Build a binary wheel and a source tarball
        run: python setup.py sdist bdist_wheel
      - name: Publish distribution to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          password: ${{ secrets.PYPI_API_TOKEN }}
```

File: `.github/workflows/test.yml`

```yaml
name: Testing

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  python_pytest:
    name: pytest
    runs-on: ubuntu-20.04
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: '3.6'
          cache: 'pip' # caching pip dependencies
          cache-dependency-path: '**/setup.cfg'
      - run: pip install .[test]
      - run: pytest -v --junitxml=report.xml
```

File: `.gitignore`

```gitignore
### Custom folders
rpm-build

### IDEs
# We want everything from IDEA based IDEs ignored
.idea
# We want everything from Jetbrains Fleet ignored
.fleet
# We want everything from MS VSCode ignored
.vscode

### Python template
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
.pybuilder/
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# pytype static type analyzer
.pytype/

# Cython debug symbols
cython_debug/
```

File: `.obs/workflows.yml`

```yaml
---
pr:
  steps:
    - link_package:
        source_project: openSUSE:Tools
        source_package: sle-prjmgr-tools
        target_project: openSUSE:Tools:OSRT:TestGithub
    - configure_repositories:
        project: openSUSE:Tools:OSRT:TestGithub
        repositories:
          - name: openSUSE_Tumbleweed
            paths:
              - target_project: openSUSE:Factory
                target_repository: snapshot
            architectures: [ x86_64 ]
          - name: '15.4'
            paths:
              - target_project: openSUSE:Tools
                target_repository: '15.4'
            architectures: [ x86_64 ]
  filters:
    event: pull_request
```

File: `.pylintrc`

```ini
[MASTER]
extension-pkg-whitelist=lxml

[FORMAT]
max-line-length=120
```

File: `.readthedocs.yaml`

```yaml
# .readthedocs.yaml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Set the version of Python and other tools you might need
build:
  os: ubuntu-20.04
  apt_packages:
    - swig
    - clang
  tools:
    python: "3.6"

# Build documentation in the docs/ directory with Sphinx
sphinx:
  configuration: docs/conf.py

# If using Sphinx, optionally build your docs in additional formats such as PDF
formats:
  - epub
  - pdf

# Optionally declare the Python requirements required to build your docs
python:
  install:
    - path: .
      method: pip
      extra_requirements:
        - docs
```

File: `.rstcheck.cfg`

```ini
[rstcheck]
ignore_messages=(Unknown directive type "automodule"\.$)
report_level=error
```

File: `CODE_OF_CONDUCT.md`

```markdown
# Code of Conduct

The openSUSE Code of Conduct applies to this repository. It can be found in two places:

- in the openSUSE Wiki: <https://en.opensuse.org/Code_of_Conduct>
- or in its canonical version on code.opensuse.org: <https://code.opensuse.org/project/coc/blob/main/f/Code-of-Conduct.md>
```

File: `LICENSE`

EUROPEAN UNION PUBLIC LICENCE v. 1.2

EUPL © the European Union 2007, 2016

This European Union Public Licence (the ‘EUPL’) applies to the Work (as defined below) which is provided under the terms of this Licence. Any use of the Work, other than as authorised under this Licence is prohibited (to the extent such use is covered by a right of the copyright holder of the Work).

The Work is provided under the terms of this Licence when the Licensor (as defined below) has placed the following notice immediately following the copyright notice for the Work: Licensed under the EUPL, or has expressed by any other means his willingness to license under the EUPL.

1. Definitions

In this Licence, the following terms have the following meaning:

— ‘The Licence’: this Licence.
— ‘The Original Work’: the work or software distributed or communicated by the Licensor under this Licence, available as Source Code and also as Executable Code as the case may be.

— ‘Derivative Works’: the works or software that could be created by the Licensee, based upon the Original Work or modifications thereof. This Licence does not define the extent of modification or dependence on the Original Work required in order to classify a work as a Derivative Work; this extent is determined by copyright law applicable in the country mentioned in Article 15.

— ‘The Work’: the Original Work or its Derivative Works.

— ‘The Source Code’: the human-readable form of the Work which is the most convenient for people to study and modify.

— ‘The Executable Code’: any code which has generally been compiled and which is meant to be interpreted by a computer as a program.

— ‘The Licensor’: the natural or legal person that distributes or communicates the Work under the Licence.

— ‘Contributor(s)’: any natural or legal person who modifies the Work under the Licence, or otherwise contributes to the creation of a Derivative Work.

— ‘The Licensee’ or ‘You’: any natural or legal person who makes any usage of the Work under the terms of the Licence.

— ‘Distribution’ or ‘Communication’: any act of selling, giving, lending, renting, distributing, communicating, transmitting, or otherwise making available, online or offline, copies of the Work or providing access to its essential functionalities at the disposal of any other natural or legal person.

2. Scope of the rights granted by the Licence

The Licensor hereby grants You a worldwide, royalty-free, non-exclusive, sublicensable licence to do the following, for the duration of copyright vested in the Original Work:

— use the Work in any circumstance and for all usage,
— reproduce the Work,
— modify the Work, and make Derivative Works based upon the Work,
— communicate to the public, including the right to make available or display the Work or copies thereof to the public and perform publicly, as the case may be, the Work,
— distribute the Work or copies thereof,
— lend and rent the Work or copies thereof,
— sublicense rights in the Work or copies thereof.

Those rights can be exercised on any media, supports and formats, whether now known or later invented, as far as the applicable law permits so. In the countries where moral rights apply, the Licensor waives his right to exercise his moral right to the extent allowed by law in order to make effective the licence of the economic rights here above listed.

The Licensor grants to the Licensee royalty-free, non-exclusive usage rights to any patents held by the Licensor, to the extent necessary to make use of the rights granted on the Work under this Licence.

3. Communication of the Source Code

The Licensor may provide the Work either in its Source Code form, or as Executable Code. If the Work is provided as Executable Code, the Licensor provides in addition a machine-readable copy of the Source Code of the Work along with each copy of the Work that the Licensor distributes or indicates, in a notice following the copyright notice attached to the Work, a repository where the Source Code is easily and freely accessible for as long as the Licensor continues to distribute or communicate the Work.

4. Limitations on copyright

Nothing in this Licence is intended to deprive the Licensee of the benefits from any exception or limitation to the exclusive rights of the rights owners in the Work, of the exhaustion of those rights or of other applicable limitations thereto.

5. Obligations of the Licensee

The grant of the rights mentioned above is subject to some restrictions and obligations imposed on the Licensee. Those obligations are the following:

Attribution right: The Licensee shall keep intact all copyright, patent or trademarks notices and all notices that refer to the Licence and to the disclaimer of warranties. The Licensee must include a copy of such notices and a copy of the Licence with every copy of the Work he/she distributes or communicates. The Licensee must cause any Derivative Work to carry prominent notices stating that the Work has been modified and the date of modification.

Copyleft clause: If the Licensee distributes or communicates copies of the Original Works or Derivative Works, this Distribution or Communication will be done under the terms of this Licence or of a later version of this Licence unless the Original Work is expressly distributed only under this version of the Licence — for example by communicating ‘EUPL v. 1.2 only’. The Licensee (becoming Licensor) cannot offer or impose any additional terms or conditions on the Work or Derivative Work that alter or restrict the terms of the Licence.

Compatibility clause: If the Licensee Distributes or Communicates Derivative Works or copies thereof based upon both the Work and another work licensed under a Compatible Licence, this Distribution or Communication can be done under the terms of this Compatible Licence. For the sake of this clause, ‘Compatible Licence’ refers to the licences listed in the appendix attached to this Licence. Should the Licensee's obligations under the Compatible Licence conflict with his/her obligations under this Licence, the obligations of the Compatible Licence shall prevail.

Provision of Source Code: When distributing or communicating copies of the Work, the Licensee will provide a machine-readable copy of the Source Code or indicate a repository where this Source will be easily and freely available for as long as the Licensee continues to distribute or communicate the Work.

Legal Protection: This Licence does not grant permission to use the trade names, trademarks, service marks, or names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the copyright notice.

6. Chain of Authorship

The original Licensor warrants that the copyright in the Original Work granted hereunder is owned by him/her or licensed to him/her and that he/she has the power and authority to grant the Licence.

Each Contributor warrants that the copyright in the modifications he/she brings to the Work are owned by him/her or licensed to him/her and that he/she has the power and authority to grant the Licence.

Each time You accept the Licence, the original Licensor and subsequent Contributors grant You a licence to their contributions to the Work, under the terms of this Licence.

7. Disclaimer of Warranty

The Work is a work in progress, which is continuously improved by numerous Contributors. It is not a finished work and may therefore contain defects or ‘bugs’ inherent to this type of development.

For the above reason, the Work is provided under the Licence on an ‘as is’ basis and without warranties of any kind concerning the Work, including without limitation merchantability, fitness for a particular purpose, absence of defects or errors, accuracy, non-infringement of intellectual property rights other than copyright as stated in Article 6 of this Licence.

This disclaimer of warranty is an essential part of the Licence and a condition for the grant of any rights to the Work.

8. Disclaimer of Liability

Except in the cases of wilful misconduct or damages directly caused to natural persons, the Licensor will in no event be liable for any direct or indirect, material or moral, damages of any kind, arising out of the Licence or of the use of the Work, including without limitation, damages for loss of goodwill, work stoppage, computer failure or malfunction, loss of data or any commercial damage, even if the Licensor has been advised of the possibility of such damage. However, the Licensor will be liable under statutory product liability laws as far such laws apply to the Work.

9. Additional agreements

While distributing the Work, You may choose to conclude an additional agreement, defining obligations or services consistent with this Licence. However, if accepting obligations, You may act only on your own behalf and on your sole responsibility, not on behalf of the original Licensor or any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against such Contributor by the fact You have accepted any warranty or additional liability.

10. Acceptance of the Licence

The provisions of this Licence can be accepted by clicking on an icon ‘I agree’ placed under the bottom of a window displaying the text of this Licence or by affirming consent in any other similar way, in accordance with the rules of applicable law. Clicking on that icon indicates your clear and irrevocable acceptance of this Licence and all of its terms and conditions.

Similarly, you irrevocably accept this Licence and all of its terms and conditions by exercising any rights granted to You by Article 2 of this Licence, such as the use of the Work, the creation by You of a Derivative Work or the Distribution or Communication by You of the Work or copies thereof.

11. Information to the public

In case of any Distribution or Communication of the Work by means of electronic communication by You (for example, by offering to download the Work from a remote location) the distribution channel or media (for example, a website) must at least provide to the public the information requested by the applicable law regarding the Licensor, the Licence and the way it may be accessible, concluded, stored and reproduced by the Licensee.

12. Termination of the Licence

The Licence and the rights granted hereunder will terminate automatically upon any breach by the Licensee of the terms of the Licence. Such a termination will not terminate the licences of any person who has received the Work from the Licensee under the Licence, provided such persons remain in full compliance with the Licence.

13. Miscellaneous

Without prejudice of Article 9 above, the Licence represents the complete agreement between the Parties as to the Work.

If any provision of the Licence is invalid or unenforceable under applicable law, this will not affect the validity or enforceability of the Licence as a whole. Such provision will be construed or reformed so as necessary to make it valid and enforceable.

The European Commission may publish other linguistic versions or new versions of this Licence or updated versions of the Appendix, so far this is required and reasonable, without reducing the scope of the rights granted by the Licence. New versions of the Licence will be published with a unique version number.

All linguistic versions of this Licence, approved by the European Commission, have identical value. Parties can take advantage of the linguistic version of their choice.

14. Jurisdiction

Without prejudice to specific agreement between parties,

— any litigation resulting from the interpretation of this License, arising between the European Union institutions, bodies, offices or agencies, as a Licensor, and any Licensee, will be subject to the jurisdiction of the Court of Justice of the European Union, as laid down in article 272 of the Treaty on the Functioning of the European Union,
— any litigation arising between other parties and resulting from the interpretation of this License, will be subject to the exclusive jurisdiction of the competent court where the Licensor resides or conducts its primary business.

15. Applicable Law

Without prejudice to specific agreement between parties,

— this Licence shall be governed by the law of the European Union Member State where the Licensor has his seat, resides or has his registered office,
— this licence shall be governed by Belgian law if the Licensor has no seat, residence or registered office inside a European Union Member State.

Appendix

‘Compatible Licences’ according to Article 5 EUPL are:

— GNU General Public License (GPL) v. 2, v. 3
— GNU Affero General Public License (AGPL) v. 3
— Open Software License (OSL) v. 2.1, v. 3.0
— Eclipse Public License (EPL) v. 1.0
— CeCILL v. 2.0, v. 2.1
— Mozilla Public Licence (MPL) v. 2
— GNU Lesser General Public Licence (LGPL) v. 2.1, v. 3
— Creative Commons Attribution-ShareAlike v. 3.0 Unported (CC BY-SA 3.0) for works other than software
— European Union Public Licence (EUPL) v. 1.1, v. 1.2
— Québec Free and Open-Source Licence — Reciprocity (LiLiQ-R) or Strong Reciprocity (LiLiQ-R+).

The European Commission may update this Appendix to later versions of the above licences without producing a new version of the EUPL, as long as they provide the rights granted in Article 2 of this Licence and protect the covered Source Code from exclusive appropriation. All other changes or additions to this Appendix require the production of a new EUPL version.
File: `Makefile`

```makefile
build:
	python3 setup.py build

install:
	python3 setup.py install

sdist: build
	python3 setup.py sdist

rpm: sdist
	@mkdir -p rpm-build
	@cp dist/*.tar.gz rpm-build/
	rpmbuild \
		--define='_topdir %(pwd)/rpm-build' \
		--define='_builddir %{_topdir}' \
		--define='_rpmdir %{_topdir}' \
		--define='_srcrpmdir %{_topdir}' \
		--define='_specdir %{_topdir}' \
		--define='_rpmfilename %%{NAME}-%%{VERSION}-%%{RELEASE}.%%{ARCH}.rpm' \
		--define='_sourcedir %{_topdir}' \
		-ba sle-prjmgr-tools.spec

clean:
	@rm -r rpm-build build dist *.egg-info

podman-rpm-builder:
	@podman build -f docker/build.dockerfile -t localhost/sle-prjmgr-tools-builder:latest .
	@podman run -it --rm -v $(CURDIR):/code localhost/sle-prjmgr-tools-builder:latest

podman-rpm-install-check:
	@podman build -f docker/install-check.dockerfile -t localhost/sle-prjmgr-tools-install-check:latest .
	@podman run -it --rm -v $(CURDIR)/rpm-build:/rpms localhost/sle-prjmgr-tools-install-check:latest
```

File: `README.md`

```markdown
[![Documentation Status](https://readthedocs.org/projects/sle-prjmgr-tools/badge/?version=latest)](https://sle-prjmgr-tools.readthedocs.io/en/latest/?badge=latest)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)

# sle-prjmgr-tools

This is a repository containing shared release management tools for SUSE. Tools are expected to be version agnostic; release- or version-specific tools should be clearly documented.

## How to install & use

The preferred way to install is via the RPMs. You may install the tool via either:

- `zypper in sle-prjmgr-tools`
- `pip install sle-prjmgr-tools`
- `pip install git+https://github.com/openSUSE/sle-prjmgr-tools.git`
- or locally built RPMs

## Tool overview

For an overview see `sle-prjmgr-tools -h` or `sle-prjmgr-tools <tool_name> -h`.

## Archive of the old code

This repository has its roots (before open-sourcing) in the internal SUSE GitLab. If you have access, you may visit the history [here](https://gitlab.suse.de/sle-prjmgr/release-management-tools).
```

File: `SECURITY.md`

```markdown
# Security Policy

## Supported Versions

This project is a rolling release; thus only the latest release is supported. If you find a security vulnerability in an older version, please check whether it still applies to the current version.

## Reporting a Vulnerability

In case you find a valid vulnerability, please contact the SUSE Security team at: [security@suse.com](mailto:security@suse.com)
```

File: `docker/build.dockerfile`

```dockerfile
FROM registry.opensuse.org/opensuse/leap:15.4

RUN ["mkdir", "/code"]
RUN [ \
    "zypper", \
    "in", \
    "-y", \
    "rpm-build", \
    "python-rpm-macros", \
    "python3-devel", \
    "python3-setuptools", \
    "python3-argcomplete" \
]

WORKDIR "/code"
VOLUME ["/code"]

CMD [ "make", "rpm" ]
```

File: `docker/install-check.dockerfile`

```dockerfile
FROM registry.opensuse.org/opensuse/leap:15.4

RUN [ \
    "zypper", \
    "in", \
    "-y", \
    "python3-requests", \
    "python3-PyYAML", \
    "python3-jira", \
    "python3-lxml", \
    "python3-importlib_resources", \
    "python3-keyring", \
    "python3-rpmfile", \
    "python3-argcomplete", \
    "osc" \
]

WORKDIR "/rpms"
VOLUME ["/rpms"]

# We don't know the exact name of the RPM. Thus we need a command line to expand the wildcard.
CMD ["/bin/sh", "-c", "zypper in -y --allow-unsigned-rpm *.rpm && echo '---' && sle-prjmgr-tools -h"]
```

File: `docs/Makefile`

```makefile
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS    ?=
SPHINXBUILD   ?= sphinx-build
SOURCEDIR     = .
BUILDDIR      = _build

# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
```

File: `docs/apidoc/modules.rst`

```rst
sle_prjmgr_tools
================

.. toctree::
   :maxdepth: 2

   sle_prjmgr_tools
```

File: `docs/apidoc/sle_prjmgr_tools.config.rst`

```rst
sle\_prjmgr\_tools.config package
=================================

Module contents
---------------

.. automodule:: sle_prjmgr_tools.config
   :members:
   :undoc-members:
   :show-inheritance:
```

File: `docs/apidoc/sle_prjmgr_tools.rst`

```rst
sle\_prjmgr\_tools package
==========================

Subpackages
-----------

.. toctree::
   :maxdepth: 4

   sle_prjmgr_tools.config
   sle_prjmgr_tools.utils

Submodules
----------

sle\_prjmgr\_tools.cli module
-----------------------------

.. automodule:: sle_prjmgr_tools.cli
   :members:
   :undoc-members:
   :show-inheritance:

sle\_prjmgr\_tools.diff\_modules module
---------------------------------------

.. automodule:: sle_prjmgr_tools.diff_modules
   :members:
   :undoc-members:
   :show-inheritance:

sle\_prjmgr\_tools.incident\_repos module
-----------------------------------------

.. automodule:: sle_prjmgr_tools.incident_repos
   :members:
   :undoc-members:
   :show-inheritance:

sle\_prjmgr\_tools.jira\_epics module
-------------------------------------

.. automodule:: sle_prjmgr_tools.jira_epics
   :members:
   :undoc-members:
   :show-inheritance:

sle\_prjmgr\_tools.list\_accepted\_pkgs module
----------------------------------------------

.. automodule:: sle_prjmgr_tools.list_accepted_pkgs
   :members:
   :undoc-members:
   :show-inheritance:

sle\_prjmgr\_tools.package\_updates\_from\_xcdchk module
--------------------------------------------------------

.. automodule:: sle_prjmgr_tools.package_updates_from_xcdchk
   :members:
   :undoc-members:
   :show-inheritance:

sle\_prjmgr\_tools.packagelist\_report module
---------------------------------------------

.. automodule:: sle_prjmgr_tools.packagelist_report
   :members:
   :undoc-members:
   :show-inheritance:

sle\_prjmgr\_tools.release\_to module
-------------------------------------

.. automodule:: sle_prjmgr_tools.release_to
   :members:
   :undoc-members:
   :show-inheritance:

sle\_prjmgr\_tools.search\_binary module
----------------------------------------

.. automodule:: sle_prjmgr_tools.search_binary
   :members:
   :undoc-members:
   :show-inheritance:

sle\_prjmgr\_tools.sle\_build module
------------------------------------

.. automodule:: sle_prjmgr_tools.sle_build
   :members:
   :undoc-members:
   :show-inheritance:

sle\_prjmgr\_tools.update\_build\_status\_page module
-----------------------------------------------------

.. automodule:: sle_prjmgr_tools.update_build_status_page
   :members:
   :undoc-members:
   :show-inheritance:

Module contents
---------------

.. automodule:: sle_prjmgr_tools
   :members:
   :undoc-members:
   :show-inheritance:
```

File: `docs/apidoc/sle_prjmgr_tools.utils.rst`

```rst
sle\_prjmgr\_tools.utils package
================================

Submodules
----------

sle\_prjmgr\_tools.utils.confluence module
------------------------------------------

.. automodule:: sle_prjmgr_tools.utils.confluence
   :members:
   :undoc-members:
   :show-inheritance:

sle\_prjmgr\_tools.utils.jira module
------------------------------------

.. automodule:: sle_prjmgr_tools.utils.jira
   :members:
   :undoc-members:
   :show-inheritance:

sle\_prjmgr\_tools.utils.osc module
-----------------------------------

.. automodule:: sle_prjmgr_tools.utils.osc
   :members:
   :undoc-members:
   :show-inheritance:

Module contents
---------------

.. automodule:: sle_prjmgr_tools.utils
   :members:
   :undoc-members:
   :show-inheritance:
```

File: `docs/conf.py` (the archive listing ends partway through this file)

```python
import os
import sys

sys.path.insert(0, os.path.abspath(".."))

# Configuration file for the Sphinx documentation builder.
```
# # For the full list of built-in configuration values, see the documentation: # https://www.sphinx-doc.org/en/master/usage/configuration.html # -- Project information ----------------------------------------------------- # https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information project = "sle-prjmgr-tools" copyright = "2022, SLE Project Managers" author = "SLE Project Managers" release = "0.0.1" # -- General configuration --------------------------------------------------- # https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration extensions = [ "sphinx.ext.autodoc", "sphinx.ext.coverage", ] templates_path = ["_templates"] exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"] # -- Options for HTML output ------------------------------------------------- # https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output html_theme = "sphinx_rtd_theme" html_static_path = ["_static"] 07070100000020000081A40000000000000000000000016412183400000240000000000000000000000000000000000000003300000000sle-prjmgr-tools-1678907444.9216408/docs/index.rstWelcome to sle-prjmgr-tools's documentation! ============================================ Welcome to the documentation of the SLE Project Manager Tools. This application is tightly integrated with the `openSUSE Commander <https://github.com/openSUSE/osc>`_ and the internal instance of the `Open Build Service <https://openbuildservice.org/>`_. .. toctree:: :maxdepth: 1 :numbered: :caption: Contents: User Guide <user-guide> Code Documentation <apidoc/modules> Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` 07070100000021000081A40000000000000000000000016412183400000320000000000000000000000000000000000000003200000000sle-prjmgr-tools-1678907444.9216408/docs/make.bat@ECHO OFF pushd %~dp0 REM Command file for Sphinx documentation if "%SPHINXBUILD%" == "" ( set SPHINXBUILD=sphinx-build ) set SOURCEDIR=. 
set BUILDDIR=_build

%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
	echo.
	echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
	echo.installed, then set the SPHINXBUILD environment variable to point
	echo.to the full path of the 'sphinx-build' executable. Alternatively you
	echo.may add the Sphinx directory to PATH.
	echo.
	echo.If you don't have Sphinx installed, grab it from
	echo.https://www.sphinx-doc.org/
	exit /b 1
)

if "%1" == "" goto help

%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end

:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%

:end
popd

sle-prjmgr-tools-1678907444.9216408/docs/user-guide.rst

**********
User Guide
**********

This page gives a longer explanation of what the different scriptlets do. A short summary can be found in the help output of the tool itself.

.. code-block:: shell

    sle-prjmgr-tools -h         # Help for the full tool
    sle-prjmgr-tools <name> -h  # Help for each scriptlet

In case you encounter any issues please open a bug on GitHub:
`GitHub Issues - openSUSE/sle-prjmgr-tools <https://github.com/openSUSE/sle-prjmgr-tools/issues/new/choose>`_

Shell Completion
################

Bash
====

Add the following line to your ``~/.bashrc`` manually:

.. code-block:: shell

    eval "$(register-python-argcomplete sle-prjmgr-tools)"

If the file does not exist, create it with:

.. code-block:: shell

    touch ~/.bashrc

ZSH
===

Execute the following code snippet:

.. code-block:: shell

    autoload -U bashcompinit
    bashcompinit

Afterwards follow the instructions for Bash.

Fish
====

Execute the following command in a fish terminal:

.. code-block:: shell

    register-python-argcomplete --shell fish sle-prjmgr-tools > ~/.config/fish/completions/sle-prjmgr-tools.fish

Diff Modules
############

.. note:: This documentation is a work in progress.
          Please give it some love and open a PR to fill this section with content.

Incident Repos
##############

.. note:: This documentation is a work in progress.
          Please give it some love and open a PR to fill this section with content.

Jira Epics
##########

This script searches through all changelogs of a project and lists every JIRA issue that is mentioned. In addition to the changelogs, the revision history is included in the scan. The syntax of such a mention is ``jsc#KEY-9999``.

List accepted Packages
######################

.. note:: This documentation is a work in progress.
          Please give it some love and open a PR to fill this section with content.

Package updates from XCDCHK
###########################

.. note:: This documentation is a work in progress.
          Please give it some love and open a PR to fill this section with content.

Packagelist Report
##################

.. note:: This documentation is a work in progress.
          Please give it some love and open a PR to fill this section with content.

Release To
##########

.. note:: This documentation is a work in progress.
          Please give it some love and open a PR to fill this section with content.

Search Binary
#############

.. note:: This documentation is a work in progress.
          Please give it some love and open a PR to fill this section with content.

SLE Build
#########

.. note:: This documentation is a work in progress.
          Please give it some love and open a PR to fill this section with content.

Update Build Status Page
########################

.. note:: This documentation is a work in progress.
          Please give it some love and open a PR to fill this section with content.
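The ``jsc#KEY-9999`` mention syntax described for the Jira Epics scriptlet can be matched with a simple regular expression. The following is a minimal sketch of such a scan; the pattern and the helper name are illustrative and not the tool's actual implementation:

```python
import re

# Matches mentions such as "jsc#SLE-1234"; the issue key is captured.
JSC_PATTERN = re.compile(r"jsc#([A-Z]+-\d+)")


def extract_jira_issues(changelog_text: str) -> list:
    """Return the unique JIRA issue keys mentioned via jsc# markers,
    in order of first appearance."""
    seen = []
    for key in JSC_PATTERN.findall(changelog_text):
        if key not in seen:
            seen.append(key)
    return seen
```

Running this over the concatenated changelog and revision-history texts of a project would yield the issue list that the scriptlet reports.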
sle-prjmgr-tools-1678907444.9216408/setup.cfg

[metadata]
name = sle-prjmgr-tools
version = attr: sle_prjmgr_tools.__version__
description = SLE Project management tools to release SLE based products
long_description = file: README.md
long_description_content_type = text/markdown
keywords = SUSE
license = EUPL-1.2
classifiers =
    Programming Language :: Python :: 3
    Programming Language :: Python :: 3.6
    Programming Language :: Python :: 3.7
    Programming Language :: Python :: 3.8
    Programming Language :: Python :: 3.9
    Programming Language :: Python :: 3.10
    Programming Language :: Python :: 3.11
license_files = LICENSE
author = SUSE SLE Project Managers
project_urls =
    Source = https://github.com/openSUSE/sle-prjmgr-tools
    Tracker = https://github.com/openSUSE/sle-prjmgr-tools/issues

[options]
zip_safe = False
packages = find:
install_requires =
    argcomplete
    requests
    PyYAML
    osc
    jira
    lxml
    rpmfile
    keyring
    importlib_resources
python_requires = >=3.6

[options.package_data]
sle_prjmgr_tools.config = *.json

[options.entry_points]
console_scripts =
    sle-prjmgr-tools = sle_prjmgr_tools.cli:main

[options.extras_require]
lint =
    black
    pylint
    mypy
    types-PyYAML
    types-requests
    # types-lxml
docs =
    sphinx>=4.3.0
    sphinx-rtd-theme>=0.5.1
    doc8
    rstcheck
test =
    pytest

sle-prjmgr-tools-1678907444.9216408/setup.py

from setuptools import setup

if __name__ == "__main__":
    setup()

sle-prjmgr-tools-1678907444.9216408/sle-prjmgr-tools.changes

-------------------------------------------------------------------
Tue Nov 29 12:55:04 UTC 2022 - Enno Gotthold <egotthold@suse.com>

- Initial version packaged
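The `console_scripts` entry point in setup.cfg is what maps the installed `sle-prjmgr-tools` command to the `sle_prjmgr_tools.cli:main` callable. A minimal sketch of how such an entry-point line can be read back out of the file; the snippet and helper are illustrative, not part of the package:

```python
import configparser

# A fragment of the setup.cfg shown above (assumed verbatim).
SETUP_CFG_SNIPPET = """
[options.entry_points]
console_scripts =
    sle-prjmgr-tools = sle_prjmgr_tools.cli:main
"""


def parse_console_scripts(cfg_text: str) -> dict:
    """Return {command name: "module:callable"} for each console script."""
    parser = configparser.ConfigParser()
    parser.read_string(cfg_text)
    raw = parser.get("options.entry_points", "console_scripts")
    scripts = {}
    for line in raw.strip().splitlines():
        # Each line has the form "name = package.module:callable".
        name, _, target = line.partition("=")
        scripts[name.strip()] = target.strip()
    return scripts
```

At install time setuptools generates a small wrapper script named `sle-prjmgr-tools` that imports `sle_prjmgr_tools.cli` and calls its `main` function.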
07070100000026000081A400000000000000000000000164121834000006AF000000000000000000000000000000000000003A00000000sle-prjmgr-tools-1678907444.9216408/sle-prjmgr-tools.spec# # spec file for package sle-prjmgr-tools # # Copyright (c) 2022 SUSE LLC # # All modifications and additions to the file contributed by third parties # remain the property of their copyright owners, unless otherwise agreed # upon. The license for this file, and modifications and additions to the # file, is the same license as for the pristine package itself (unless the # license for the pristine package is not an Open Source License, in which # case the license is the MIT License). An "Open Source License" is a # license that conforms to the Open Source Definition (Version 1.9) # published by the Open Source Initiative. # Please submit bugfixes or comments via https://bugs.opensuse.org/ # %define sdist_name sle_prjmgr_tools Name: sle-prjmgr-tools Version: 0.0.5 Release: 0 Summary: SUSE SLE release management tools License: EUPL-1.2 URL: https://github.com/openSUSE/sle-prjmgr-tools Source: %{name}-%{version}.tar.gz BuildRequires: python3-devel BuildRequires: python3-setuptools BuildRequires: python3-argcomplete BuildRequires: python-rpm-macros Requires: python3-requests Requires: python3-importlib-resources Requires: python3-PyYAML Requires: python3-jira Requires: python3-lxml Requires: python3-argcomplete Requires: osc Recommends: python3-keyring %description SUSE SLE release management tools that help publish Service Packs and Snapshots. 
%prep
%autosetup -n %{name}-%{version}

%build
%py3_build

%install
%py3_install

%post

%postun

%files
%license LICENSE
%doc README.md
%{_bindir}/%{name}
%{python3_sitelib}/%{sdist_name}/
%{python3_sitelib}/%{sdist_name}-*

%changelog

sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/__init__.py

"""
Main module of the project. Currently it only defines the version.
"""

__version__ = "0.0.5"

sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/cli.py

# PYTHON_ARGCOMPLETE_OK
"""
This is the wrapper script that specifies the CLI entry point, which can load the scripts from a config file. The
config file has the following format:

{
  "modules": [
    "module1",
    "module2"
  ]
}

Valid locations for the configuration file are:

- "/etc/sle-prjmgr-tools.json"
- "$XDG_CONFIG_HOME/sle-prjmgr-tools.json"
- "$RELEASE_MANAGEMENT_TOOLS_FILE"

A module must have a "build_parser" method that takes a single argument. The method is responsible for assigning the
"func" kwarg via "set_defaults". The CLI entry point is called "main_cli" and takes a single argument, an argparse
namespace. It should be just a wrapper around the "main" function that has the actual arguments defined. If no config
is supplied, the built-in configuration is used.

Example usage of the CLI for development (from the git project root):

> . venv/bin/activate
> export RELEASE_MANAGEMENT_TOOLS_FILE="config/sle-prjmgr-tools.json"
> python3 -m sle_prjmgr_tools.cli -h
"""
import argparse
import importlib
import logging
import sys
import urllib.error

import argcomplete  # type: ignore

from sle_prjmgr_tools import config

PARSER = argparse.ArgumentParser(
    prog="sle_prjmgr_tools", formatter_class=argparse.ArgumentDefaultsHelpFormatter
)
PARSER.add_argument(
    "--osc-config",
    dest="osc_config",
    help="The location of the oscrc if a specific one should be used.",
)
PARSER.add_argument(
    "--osc-instance",
    dest="osc_instance",
    help="The URL of the API of the Open Build Service instance that should be used.",
    default="https://api.suse.de",
)
PARSER.add_argument(
    "--jira-instance",
    dest="jira_instance",
    help="The URL for the JIRA instance.",
    default="https://jira.suse.com",
)
PARSER.add_argument(
    "--confluence-instance",
    dest="confluence_instance",
    help="The URL for the Confluence instance.",
    default="https://confluence.suse.com",
)
SUBPARSERS = PARSER.add_subparsers(
    help="Help for the subprograms that this tool offers."
)

logger = logging.getLogger()


def import_plugin(name: str):
    """
    Imports a plugin.

    :param name: The name of the module inside the "sle_prjmgr_tools" package.
    """
    plugin = importlib.import_module(f".{name}", package="sle_prjmgr_tools")
    plugin.build_parser(SUBPARSERS)


def main():
    """
    The main entry point for the library.
    """
    import_plugin("version")
    module_list = config.load_modules()
    for module in module_list:
        import_plugin(module)
    argcomplete.autocomplete(PARSER)
    args = PARSER.parse_args()
    if "func" in vars(args):
        # Run a subprogram only if the parser detected it correctly.
        try:
            args.func(args)
        except urllib.error.URLError as url_error:
            if "name or service not known" in str(url_error).lower():
                print(
                    "No connection to one of the tools. Please make sure the connection to the tools is available"
                    " before executing the program!"
                )
                sys.exit(1)
        return
    PARSER.print_help()
    sys.exit(1)


if __name__ == "__main__":
    main()

sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/config/__init__.py

"""
Configuration related package.
"""
import json
import logging
import os
import traceback
from typing import List

from importlib_resources import files

CONFIG_LOCATIONS = [
    "/etc/sle-prjmgr-tools.json",
    "$XDG_CONFIG_HOME/sle-prjmgr-tools.json",
    "$RELEASE_MANAGEMENT_TOOLS_FILE",
]

logger = logging.getLogger()


def load_modules() -> List[str]:
    """
    Loads all modules from the config and returns them to the application.
    """
    module_list = []
    config_found = False
    for location in CONFIG_LOCATIONS:
        if "$" in location:
            real_location = os.path.expandvars(location)
        else:
            real_location = location
        if os.path.isfile(real_location):
            module_list = load_config(real_location)
            config_found = True
        else:
            logger.debug('"%s" was skipped.', real_location)
    if not config_found:
        backup_config_path = str(
            files("sle_prjmgr_tools.config").joinpath("sle-prjmgr-tools.json")
        )
        module_list = load_config(backup_config_path)
        logger.debug("Built-in configuration was used!")
    return module_list


def load_config(path: str) -> List[str]:
    """
    Loads the JSON config file.

    :param path: This path is expected to exist. It should be the absolute path to the config file.
    :return: The list of modules that should be loaded.
    """
    with open(path, "rt", encoding="UTF-8") as json_fp:
        try:
            json_dict = json.load(json_fp)
            logger.debug('JSON config loaded from "%s".', path)
            return json_dict.get("modules")
        except json.JSONDecodeError:
            logger.debug("JSON syntax error!
Fix syntax to load modules.")
            traceback.print_exc()
    return []

sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/config/sle-prjmgr-tools.json

{
  "modules": [
    "update_build_status_page",
    "incident_repos",
    "packagelist_report",
    "search_binary",
    "list_accepted_pkgs",
    "jira_epics",
    "sle_build",
    "release_to",
    "diff_modules",
    "package_updates_from_xcdchk",
    "ibs_to_jira"
  ]
}

sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/diff_modules.py

"""
Script to diff two well-known modules with each other.
"""
import argparse
import pathlib
import subprocess
from collections import namedtuple
from typing import Optional

from sle_prjmgr_tools.utils.osc import OscUtils

DiffModuleOptions = namedtuple("DiffModuleOptions", "project revision")
OscOptions = namedtuple("OscOptions", "config instance")


def build_parser(parent_parser):
    """
    Builds the parser for this script. This is executed by the main CLI dynamically.

    :param parent_parser: The subparsers object from argparse.
    """
    subparser = parent_parser.add_parser("diff_modules", help="diff_modules help")
    subparser.add_argument(
        "-f",
        "--from-project",
        dest="from_project",
        default="15-SP4",
        help='The source project. Should be in the form "15-SP4".',
    )
    subparser.add_argument(
        "-o",
        "--from-revision-number",
        dest="from_revision_number",
        help='The revision number of the "groups.yml" file in the source project.',
    )
    subparser.add_argument(
        "-t",
        "--to-project",
        dest="to_project",
        default="15-SP5",
        help='The target project.
Should be in the form "15-SP4".', ) subparser.add_argument( "-d", "--to-revision-number", dest="to_revision_number", help='The revision number of the "groups.yml" file in the target project.', ) subparser.add_argument( "--output-file", dest="output_file", type=argparse.FileType("w+", encoding="UTF-8"), help="If the argument is present the result is written to the file specified in the argument, otherwise result" "is printed to stdout.", ) subparser.set_defaults(func=main_cli) def checkout_source_files( from_project: DiffModuleOptions, to_project: DiffModuleOptions, from_groups_filename: pathlib.Path, to_groups_filename: pathlib.Path, osc_config: Optional[OscOptions] = None, ): """ Download the two "group.yml" files from "000package-groups" that will be compared. :param from_project: The project the source package is in. :param to_project: The project the target package is in. :param from_groups_filename: The desired filename for the source file. :param to_groups_filename: The desired filename for the target file. :param osc_config: The tuple with the osc options. """ if osc_config is None: my_utils = OscUtils() else: my_utils = OscUtils( osc_server=osc_config.instance, override_config=osc_config.config ) my_utils.get_file_from_package( from_project.project, "000package-groups", from_project.revision, "groups.yml", target_filename=str(from_groups_filename), ) my_utils.get_file_from_package( to_project.project, "000package-groups", to_project.revision, "groups.yml", target_filename=str(to_groups_filename), ) def run_sed(expression: str, file: pathlib.Path): """ Runs the given sed Expression on a specified file. :param expression: The sed expression. :param file: The file that should be used. """ subprocess.run(["sed", "-i", "-e", expression, str(file)], check=False) def cleanup_source_files( from_groups_filename: pathlib.Path, to_groups_filename: pathlib.Path ): """ Removes the temporary files. 
:param from_groups_filename: The file that acts as a source for the comparison. :param to_groups_filename: The file that acts as a target for the comparison. """ from_groups_filename.unlink() to_groups_filename.unlink() def remove_comments(from_groups: pathlib.Path, to_groups: pathlib.Path): """ Remove all comments from the two files. :param from_groups: The source file that should be modified. :param to_groups: The target file that should be modified. """ expression = r"s/\s*#.*$//" run_sed(expression, from_groups) run_sed(expression, to_groups) def output_mark_packages_in_to_file(to_groups: pathlib.Path): """ Mark all package in the target file. :param to_groups: The file that should be modified. """ run_sed(r"s/\(^[a-zA-Z_]*:$\)/ \1/g", to_groups) def diff_file( from_groups: pathlib.Path, to_groups: pathlib.Path, changes_file: pathlib.Path ): """ Diff the two files and write it to the third file. :param from_groups: The source file that should be modified. :param to_groups: The target file that should be modified. :param changes_file: The file with the resulting changes. """ with changes_file.open("w") as changes_file_fp: subprocess.run( [ "diff", "--suppress-common-lines", "--ignore-blank-lines", "--ignore-trailing-space", str(from_groups), str(to_groups), ], stdout=changes_file_fp, check=False, ) def output_remove_line_numbers(changes_file: pathlib.Path): """ Remove all line numbers from the file. :param changes_file: The file that should be modified. """ run_sed(r"/^[0-9].*$/d", changes_file) def output_remove_dashes(changes_file: pathlib.Path): """ Remove all dashes from the diff. :param changes_file: The file that should be modified. """ run_sed(r"/^---$/d", changes_file) def output_remove_repeated_module_names(changes_file: pathlib.Path): """ Remove all repeated module names from the diff. :param changes_file: The file that should be modified. 
""" run_sed(r"/< [a-zA-Z_]*:$/d", changes_file) def output_remove_empty_lines(changes_file: pathlib.Path): """ Remove all empty lines from the diff. :param changes_file: The file that should be modified. """ run_sed(r"/^[<>]\s*$/d", changes_file) def output_mark_removed_packages(changes_file: pathlib.Path): """ Mark all packages that are removed. :param changes_file: The file that should be modified. """ run_sed(r"s/^<\s*-/- /", changes_file) def output_mark_added_packages(changes_file: pathlib.Path): """ Mark all packages that are added. :param changes_file: The file that should be modified. """ run_sed(r"s/^>\s*-/+ /", changes_file) def output_removed_from_module_names(changes_file: pathlib.Path): """ Mark packages that are removed from a module. :param changes_file: The file that should be modified. """ run_sed(r"s/^[<>]\s\s//", changes_file) def output_remove_unrelated_to_modules(changes_file: pathlib.Path): """ Remove all packages that are unrelated to modules. :param changes_file: The file that should be modified. """ run_sed(r"/^UNWANTED:/,$!d", changes_file) def main( from_project: DiffModuleOptions, to_project: DiffModuleOptions, osc_config: Optional[OscOptions] = None, ) -> str: """ Main routine executes the non-CLI related logic. :param from_project: The source project that should be compared. :param to_project: The target project that should be compared. :param osc_config: The osc configuration that should be used. 
""" from_groups_file = pathlib.Path("groups_FROM.yml") to_groups_file = pathlib.Path("groups_TO.yml") changes_groups_file = pathlib.Path( f"Changes_from_{from_project}_to_{to_project}.diff" ) from_project = DiffModuleOptions( f"SUSE:SLE-{from_project.project}:GA", from_project.revision ) to_project = DiffModuleOptions( f"SUSE:SLE-{to_project.project}:GA", to_project.revision ) checkout_source_files( from_project, to_project, from_groups_file, to_groups_file, osc_config=osc_config, ) remove_comments(from_groups_file, to_groups_file) output_mark_packages_in_to_file(to_groups_file) diff_file(from_groups_file, to_groups_file, changes_groups_file) cleanup_source_files(from_groups_file, to_groups_file) output_remove_line_numbers(changes_groups_file) output_remove_dashes(changes_groups_file) output_remove_repeated_module_names(changes_groups_file) output_remove_empty_lines(changes_groups_file) output_mark_removed_packages(changes_groups_file) output_mark_added_packages(changes_groups_file) output_removed_from_module_names(changes_groups_file) output_remove_unrelated_to_modules(changes_groups_file) result = changes_groups_file.read_text(encoding="UTF-8") changes_groups_file.unlink() return result def main_cli(args): """ Main routine that executes the script :param args: Argparse Namespace that has all the arguments """ result = main( DiffModuleOptions(args.from_project, args.from_revision_number), DiffModuleOptions(args.to_project, args.to_revision_number), osc_config=OscOptions(args.osc_config, args.osc_instance), ) if args.output_file: with args.output_file as output_file_fp: output_file_fp.write(result) else: print(result) 0707010000002E000081A400000000000000000000000164121834000048AA000000000000000000000000000000000000004400000000sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/ibs_to_jira.py""" Script to post comments to Jira once an SR is included in a build and - if in state "IBS integration" transitions the ticket to the next state. 
""" import enum import logging from typing import Dict, List, Optional from sle_prjmgr_tools import list_accepted_pkgs, sle_build from sle_prjmgr_tools.utils.jira import JiraUtils, SSLOptions from sle_prjmgr_tools.utils.osc import OscUtils, OscReleaseHelper class IssueState(enum.Enum): """ Enum that contains all issue states that should be grouped for comments. """ CORRECT = 1 INCORRECT = 2 DEVELOPMENT = 3 OTHER = 999 logger = logging.getLogger() class JiraWork(JiraUtils): """ This is a subclass that performs all the work that is required to be done in JIRA. """ def __init__( self, jira_url: str, pat_token: str, ssl_options: SSLOptions, milestone_name: str = "", ): self.milestone_name = milestone_name super().__init__(jira_url, pat_token, ssl_options) def _build_comment_template( self, comment_template: str, srs: List[int], osc_api_url: str = "https://build.opensuse.org", build_number: str = "", ) -> str: """ This is rendering the comment template for a JIRA comment. :param comment_template: The template with three placeholders. The first one will be the build_number, the second one the milestone text (" (<name>)") and the third one will be the list of SRs in an unordered JIRA list. :param srs: The list of OBS Submit Requests :param build_number: The build number :return: The rendered template """ milestone_text = "" if self.milestone_name: milestone_text = f" ({self.milestone_name})" return comment_template % ( build_number, milestone_text, format_list_of_srs(srs, osc_api_url), ) def jira_search_correct(self, jscs: List[str]) -> List[str]: """ This method searches for issue that are in the state that the workflow states as correct. :param jscs: The list of JIRA tickets that should be filtered. :return: The list of JIRA tickets that are matched by the filter. 
""" issue_status_allowed = [ "IBS Integration", ] issue_status_allowed_str = '","'.join(issue_status_allowed) correct_state = ( f'issue in ({",".join(jscs)})' f' AND status = "{issue_status_allowed_str}"' f" AND type = Implementation" ) return self.jira_do_search(correct_state, len(jscs)) def jira_post_comment_correct(self, issue: str, srs: List[int]) -> None: """ This posts a comment for issues that are deemed correct in the workflow. :param issue: The issue that a comment should be posted to. :param srs: The list of Submit Requests from the OBS that should be put into the comment. """ comment_template = """ A submit request referencing this feature has been merged into %s%s. Submit Requests: %s """ comment = self._build_comment_template( comment_template, srs, "https://api.suse.de", "", ) self.jira_obj.add_comment(issue, comment) issue_obj = self.jira_obj.issue(issue) if "status:code_merged" not in issue_obj.fields.labels: issue_obj.fields.labels.append("status:code_merged") if "status:wait_for_status" not in issue_obj.fields.labels: issue_obj.fields.labels.append("status:wait_for_status") issue_obj.update(fields={"labels": issue_obj.fields.labels}) def jira_search_incorrect(self, jscs: List[str]) -> List[str]: """ This method searches for issue that are in the state that the workflow states as incorrect. :param jscs: The list of JIRA tickets that should be filtered. :return: The list of issues that are found to be incorrect. 
""" issue_status_disallowed = [ "QE Open", "QE In Progress", "QE Blocked", "Engineering Done", "In Maintenance", "Dev In Progress", "IBS Integration", ] issue_status_disallowed_str = '","'.join(issue_status_disallowed) incorrect_state = ( f'issue in ({",".join(jscs)}) ' f'AND status NOT IN ("{issue_status_disallowed_str}") ' "AND type = Implementation" ) return self.jira_do_search(incorrect_state, len(jscs)) def jira_post_comment_incorrect(self, issue: str, srs: List[int]) -> None: """ This posts a comment for issues that are deemed incorrect in the workflow. :param issue: The issue that a comment should be posted to. :param srs: The list of SRs from the OBS that are related to this issue. """ comment_template = """ A submit request referencing this feature has been merged into %s%s. Please update the state of this ticket, as it doesn't reflect to correct state of development. Submit Requests: %s """ comment = self._build_comment_template( comment_template, srs, "https://api.suse.de", "" ) self.jira_obj.add_comment(issue, comment) issue_obj = self.jira_obj.issue(issue) if "status:wait_for_status" not in issue_obj.fields.labels: issue_obj.fields.labels.append("status:wait_for_status") if "status:code_merged" not in issue_obj.fields.labels: issue_obj.fields.labels.append("status:code_merged") issue_obj.update(fields={"labels": issue_obj.fields.labels}) def jira_search_development(self, jscs: List[str]) -> List[str]: """ This method searches for issue that are in the state that the workflow states as in development. :param jscs: The list of JIRA tickets that should be filtered. :return: The list of issues that are found to be in development. """ development_state = f'issue in ({",".join(jscs)}) AND status = "Dev In Progress" AND type = Implementation' return self.jira_do_search(development_state, len(jscs)) def jira_post_comment_development(self, issue: str, srs: List[int]) -> None: """ This posts a comment for issues that are deemed in development in the workflow. 
:param issue: The issue that a comment should be posted to. :param srs: The list of SRs from the OBS that are related to this issue. """ comment_template = """ A submit request referencing this feature has been merged into %s%s. Submit Requests: %s """ comment = self._build_comment_template( comment_template, srs, "https://api.suse.de", "", ) self.jira_obj.add_comment(issue, comment) issue_obj = self.jira_obj.issue(issue) if "status:code_merged" not in issue_obj.fields.labels: issue_obj.fields.labels.append("status:code_merged") if "status:wait_for_status" in issue_obj.fields.labels: issue_obj.fields.labels.remove("status:wait_for_status") issue_obj.update(fields={"labels": issue_obj.fields.labels}) @staticmethod def jira_search_other( issues_by_category: Dict[IssueState, List[str]], list_with_jscs: List[str] ) -> List[str]: """ This method searches for issues that are in a state that the workflow does not define. :param issues_by_category: This is the dict with the pre-filtered issues by category. :param list_with_jscs: The list of JIRA tickets that should be filtered. :return: The list of issues that are remaining. """ result: List[str] = list_with_jscs.copy() for jsc in list_with_jscs: for jsc_list in issues_by_category.values(): if jsc in jsc_list: result.remove(jsc) return result def jira_post_comment_other(self, issue: str, srs: List[int]) -> None: """ This posts a comment to an issue that cannot be classified by one of the filters that are defined. :param issue: The issue that a comment should be posted to. :param srs: The list of SRs from the OBS that are related to this issue. """ comment_template = """ A submit request referencing this feature has been merged into %s%s. Submit Requests: %s """ comment = self._build_comment_template( comment_template, srs, "https://api.suse.de", "" ) self.jira_obj.add_comment(issue, comment) def build_parser(parent_parser) -> None: """ Builds the parser for this script.
This is executed by the main CLI dynamically. :param parent_parser: The subparsers object from argparse. """ subparser = parent_parser.add_parser("ibs_to_jira", help="ibs_to_jira help") subparser.add_argument( "--jira-pat", "-j", required=True, dest="jira_pat", help='JIRA PAT token that can be created under "Profile" > "Personal Access Tokens" > "Create token".', ) subparser.add_argument( "--ssl-cert-bundle", "-s", dest="ssl_cert_bundle", help="Path to the CA bundle that will be used by the script to verify the SSL certificates of Jira.", default="/usr/share/pki/trust/anchors/SUSE_Trust_Root.crt.pem", ) subparser.add_argument( "--ssl-cert-check-disable", "-S", dest="ssl_cert_check_disable", help="If this flag is set, then all SSL verification of Jira is disabled.", action="store_true", ) subparser.add_argument( "--project", "-p", dest="project", help="Project to work with.", default="SUSE:SLE-15-SP5:GA", ) subparser.add_argument( "--milestone", "-m", dest="milestone", help="If this flag is given the name of the milestone is put into the comment in Jira.", action="store_true", ) subparser.set_defaults(func=main_cli) def osc_collect_srs_between_builds( obs_url: str, project: str, duration: int ) -> List[int]: """ Collects all SRs between the two builds that we currently have. :param obs_url: URL where the API from the build-service can be reached. :param project: The project to check for. :param duration: The duration that should be checked for. Is relative from execution time of script. :return: The list of SRs that were accepted in the timeframe. """ result = [] pkg_requests = list_accepted_pkgs.main(obs_url, project, duration) for request_type in pkg_requests: result.extend(request_type) return result def transform_srs_per_jsc_to_jsc_per_sr( srs_with_jscs: Dict[int, List[str]] ) -> Dict[str, List[int]]: """ Transform the list of submit requests with the corresponding JSC mentions to a list of JSC tickets with their corresponding SRs.
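The inversion performed by transform_srs_per_jsc_to_jsc_per_sr (SR -> [jsc] becomes jsc -> [SR]) can be sketched standalone; the SR numbers and ticket IDs below are made up for illustration.

```python
from typing import Dict, List

# Standalone sketch of the grouping logic: every submit request is appended
# to the bucket of each jsc ticket it mentions.
def invert(srs_with_jscs: Dict[int, List[str]]) -> Dict[str, List[int]]:
    result: Dict[str, List[int]] = {}
    for submit_request, jscs in srs_with_jscs.items():
        for jsc in jscs:
            if jsc not in result:
                result[jsc] = []
            result[jsc].append(submit_request)
    return result
```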
:param srs_with_jscs: The initial data. :return: The dict with the submit requests grouped by jsc. """ result: Dict[str, List[int]] = {} for submit_request, jscs in srs_with_jscs.items(): for jsc in jscs: if jsc not in result: result[jsc] = [] result[jsc].append(submit_request) return result def format_list_of_srs(srs: List[int], web_ui_url: str) -> str: """ Formats the list of SRs that are to be posted into the comment. :param srs: The list of Submit Requests from the OBS that should be put into the comment. :param web_ui_url: The URL that should be used to link a Submit Request to the Web UI. :return: The list in JIRA comment markup that will result in an unordered list. """ srs_formatted_str = "" for submit_request in srs: srs_formatted_str += ( f"- [{submit_request}|{web_ui_url}/request/show/{submit_request}]\n" ) return srs_formatted_str def main_osc_work(obs_url: str, osc_config: Optional[str], project: str) -> dict: """ This encapsulates all the work that is done on the OBS side. :param obs_url: URL where the API from the build-service can be reached. :param osc_config: The path to the osc configuration. If not present this will be searched for by osc. :param project: The project that should be released. 
:return: The dictionary with the SRs as keys and the list of jscs as values """ osc_work = OscReleaseHelper(osc_server=obs_url, override_config=osc_config) # Check TEST (get build number and mtime) old_build = sle_build.sle_15_media_build(obs_url, project) # Do release osc_work.release_repo_to_test(project) # Check TEST (get build number and mtime) new_build = sle_build.sle_15_media_build(obs_url, project) duration = int(next(iter(old_build.values())).mtime) - int( next(iter(new_build.values())).mtime ) if duration == 0: print("WARNING: BuildIDs are identical!") elif duration < 0: print("WARNING: GA has an older build than GA:TEST!") # Collect all SRs between the build numbers list_with_srs = osc_collect_srs_between_builds(obs_url, project, duration) # Collect all JSCs from the SRs dict_with_srs = {} for submit_request in list_with_srs: dict_with_srs[submit_request] = osc_work.osc_get_jsc_from_sr(submit_request) return dict_with_srs # pylint: disable-next=too-many-arguments def main_jira_work( jira_pat: str, jira_url: str, dict_with_jscs: Dict[str, List[int]], ssl_cert_bundle: str = "/usr/share/pki/trust/anchors/SUSE_Trust_Root.crt.pem", ssl_cert_check_disable: bool = False, milestone_name: str = "", ) -> None: """ This subroutine groups together all work that is related to JIRA. :param jira_pat: The PAT for the JIRA instance. :param jira_url: The URL to the JIRA instance. :param dict_with_jscs: Dictionary that contains all jscs with their corresponding SRs from the OBS. :param ssl_cert_bundle: The path to the CA certificate bundle. If this is None, the fallback to certifi is used. :param ssl_cert_check_disable: If this is set to True one can skip certificate validation. :param milestone_name: The name of the milestone. Leave this empty to skip adding the milestone name to the comment.
""" jira_work = JiraWork( jira_url, jira_pat, SSLOptions(ssl_cert_check_disable, ssl_cert_bundle), milestone_name, ) # Setup dict & search issues_by_category = { IssueState.CORRECT: jira_work.jira_search_correct(list(dict_with_jscs.keys())), IssueState.INCORRECT: jira_work.jira_search_incorrect( list(dict_with_jscs.keys()) ), IssueState.DEVELOPMENT: jira_work.jira_search_development( list(dict_with_jscs.keys()) ), } issues_by_category[IssueState.OTHER] = jira_work.jira_search_other( issues_by_category, list(dict_with_jscs.keys()) ) # Duplicate detection issues_by_category_sum = 0 for value in issues_by_category.values(): issues_by_category_sum += len(value) if issues_by_category_sum != len(dict_with_jscs.keys()): logger.warning( "There were issues found that were not existing in JIRA or the current user has no access to!" ) print(issues_by_category) # Post comments to Jira for category, issues in issues_by_category.items(): for issue in issues: if category == IssueState.CORRECT: jira_work.jira_post_comment_correct(issue, dict_with_jscs[issue]) jira_work.jira_transition_tickets(issue) elif category == IssueState.INCORRECT: jira_work.jira_post_comment_incorrect(issue, dict_with_jscs[issue]) elif category == IssueState.DEVELOPMENT: jira_work.jira_post_comment_development(issue, dict_with_jscs[issue]) elif category == IssueState.OTHER: jira_work.jira_post_comment_other(issue, dict_with_jscs[issue]) def main( jira_pat_token: str, obs_url: str = "https://api.suse.de", jira_url: str = "https://jira.suse.de", obs_project: str = "SUSE:SLE-15-SP5:GA", osc_config: Optional[str] = None, ssl_cert_bundle: str = "/usr/share/pki/trust/anchors/SUSE_Trust_Root.crt.pem", ssl_cert_check_disable: bool = False, is_milestone: bool = False, ) -> None: """ Main routine that executes the script :param jira_pat_token: The token to authenticate against JIRA. :param obs_url: URL where the API from the build-service can be reached. :param jira_url: The URL to the JIRA instance. 
:param osc_config: The path to the osc configuration. If not present this will be searched for by osc. :param obs_project: The project that should be released. :param ssl_cert_bundle: The path to the CA certificate bundle. If this is None, the fallback to certifi is used. :param ssl_cert_check_disable: If this is set to True one can skip certificate validation. :param is_milestone: Whether this is a milestone or not. If it is, the name of it will be included. """ # pylint: disable=R0913 # OSC work dict_with_srs = main_osc_work(obs_url, osc_config, obs_project) print(dict_with_srs) # Calculate all SRs for every JSC dict_with_jscs = transform_srs_per_jsc_to_jsc_per_sr(dict_with_srs) milestone_name = "" if is_milestone: osc_work = OscUtils(obs_url) milestone_name = osc_work.osc_retrieve_betaversion(obs_project) # Jira work main_jira_work( jira_pat_token, jira_url, dict_with_jscs, ssl_cert_bundle, ssl_cert_check_disable, milestone_name, ) def main_cli(args) -> None: """ Main routine that executes the script :param args: Argparse Namespace that has all the arguments """ main( args.jira_pat, args.osc_instance, args.jira_instance, args.project, args.osc_config, args.ssl_cert_bundle, args.ssl_cert_check_disable, args.milestone, ) print("Script done") 0707010000002F000081A40000000000000000000000016412183400000961000000000000000000000000000000000000004700000000sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/incident_repos.py""" This script retrieves all repositories for each incident it is given. The datasource is the SUSE internal SMELT tool. """ from typing import List import requests QUERY = """ query {{ incidents(incidentId:{}) {{ edges {{ node {{ repositories {{ edges {{ node {{ name }} }} }} }} }} }} }} """ def build_parser(parent_parser): """ Builds the parser for this script. This is executed by the main CLI dynamically. :param parent_parser: The subparsers object from argparse. 
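The SMELT GraphQL response for the QUERY above nests every list in "edges"/"node" wrappers; flattening it into a set of repository names can be sketched standalone. The mock response below is made up for illustration, not a real SMELT payload.

```python
# Standalone sketch of flattening SMELT's nested GraphQL "edges"/"node"
# wrappers into a flat set of repository names.
def extract_repos(results: dict) -> set:
    incs_repos = [
        i["node"]["repositories"]["edges"]
        for i in results["data"]["incidents"]["edges"]
    ]
    repos = set()
    for inc_repos in incs_repos:
        repos.update({s["node"]["name"] for s in inc_repos})
    return repos
```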
""" subparser = parent_parser.add_parser("incident_repos", help="incident_repos help") subparser.add_argument( "--smelt-api", dest="smelt_api", help="URL to the SMELT API.", default="https://smelt.suse.de/graphql/", ) subparser.add_argument( "incidents", metavar="incident", help="The incident numbers", nargs="+", type=int, ) subparser.set_defaults(func=main_cli) def get_incident_repos(smelt_api_url: str, iid: int, timeout=180): """ Retrieve all repositories that are affected by a single incident. :param smelt_api_url: The URL where the SMELT GraphQL API is present. :param iid: The incident ID. :param timeout: The timeout that is used for the SMELT API. :return: The set of repositories that is affected. """ query = {"query": QUERY.format(iid)} results = requests.post(smelt_api_url, query, timeout=timeout).json() incs_repos = [ i["node"]["repositories"]["edges"] for i in results["data"]["incidents"]["edges"] ] repos = set() for inc_repos in incs_repos: repos.update({s["node"]["name"] for s in inc_repos}) return repos def main(smelt_api_url: str, incidents: List[int]): """ Main routine executes the non-CLI related logic. :param smelt_api_url: The URL where the SMELT GraphQL API is present. :param incidents: The list of incidents that should be looked up. """ for iid in incidents: print(f"{iid}:") for repo in get_incident_repos(smelt_api_url, iid): print(f" * {repo}") def main_cli(args): """ Main routine that executes the script :param args: Argparse Namespace that has all the arguments """ main(args.smelt_api, args.incidents) 07070100000030000081A400000000000000000000000164121834000010A3000000000000000000000000000000000000004300000000sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/jira_epics.py""" Get all JIRA Issues that are mentioned in the changelog of a project. """ import re from typing import Dict, List from lxml import etree from osc import core, conf # type: ignore def build_parser(parent_parser): """ Builds the parser for this script. 
This is executed by the main CLI dynamically. :param parent_parser: The subparsers object from argparse. """ subparser = parent_parser.add_parser("jira_epics", help="jira_epics help") subparser.add_argument( "--target", "-t", dest="target", default="SUSE:SLE-15-SP5:GA", help="Project to execute against.", ) subparser.set_defaults(func=main_cli) def osc_prepare(): """ This has to be executed so that the ini-style config is converted into its corresponding types. """ conf.get_config() def osc_packages_exclude_000(apiurl: str, target: str) -> List[str]: """ Retrieve a list of packages for a given project where all packages starting with ``000`` are being excluded. :param apiurl: URL where the API from the build-service can be reached. :param target: The target project to search. :return: The list of packages in the project. """ result = [] for package in core.meta_get_packagelist(apiurl, target): if not package.startswith("000"): result.append(package) return result def get_issue_list(apiurl: str, target: str, package: str) -> List[str]: """ Scans the changelog for a specific package for jsc mentions. :param apiurl: URL where the API from the build-service can be reached. :param target: The target project that this is checked against. :param package: The package that should be scanned for issues. :return: The list of issues that could be found. """ baseurl = ["source", target, package] query = {"cmd": "diff", "view": "xml", "onlyissues": "1", "orev": "1"} url = core.makeurl(apiurl, baseurl, query) fp_post_result = core.http_POST(url) xml_issues = etree.parse(fp_post_result).getroot() issue_list_xml = xml_issues.xpath( '//sourcediff/issues/issue[@state="added"][@tracker="jsc"]' ) issue_list = [] for issue in issue_list_xml: issue_list.append(issue.get("label")) return issue_list def scan_commitlog_for_issues(apiurl: str, target: str, package: str) -> List[str]: """ Scans the commitlog for a specific package for jsc mentions.
:param apiurl: URL where the API from the build-service can be reached. :param target: The target project that this is checked against. :param package: The package that should be scanned for issues. :return: The list of issues that could be found. """ issue_list = [] jsc_regex = re.compile(r"jsc#[a-zA-Z]*-\d*") xml_str_commitlog = "\n".join( core.get_commitlog( apiurl, target, package, None, format="xml", revision_upper=None ) ) tree = etree.fromstring(xml_str_commitlog) msgs = tree.xpath("//msg") for msg in msgs: matches = jsc_regex.findall(msg.text) if len(matches) > 0: issue_list.extend(matches) return issue_list def main( apiurl="https://api.suse.de", target="SUSE:SLE-15-SP3:GA" ) -> Dict[str, List[str]]: """ Main routine executes the non-CLI related logic. :param apiurl: URL where the API from the build-service can be reached. :param target: The target project that this is checked against. """ osc_prepare() result: Dict[str, List[str]] = {} for package in osc_packages_exclude_000(apiurl, target): issue_list = get_issue_list(apiurl, target, package) issue_list.extend(scan_commitlog_for_issues(apiurl, target, package)) result[package] = issue_list return result def main_cli(args): """ Main routine that executes the script :param args: Argparse Namespace that has all the arguments """ data = main(apiurl=args.osc_instance, target=args.target) for package, issue_list in data.items(): if len(issue_list) > 0: print(package) for issue in issue_list: print(issue) else: print(f"{package} - No jscs mentioned") 07070100000031000081A40000000000000000000000016412183400001200000000000000000000000000000000000000004B00000000sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/list_accepted_pkgs.py""" Module to list all accepted SRs that are accepted that changed, added or removed a package. """ import time from typing import Optional, Tuple from osc import conf, core # type: ignore def build_parser(parent_parser): """ Builds the parser for this script. 
This is executed by the main CLI dynamically. :param parent_parser: The subparsers object from argparse. """ subparser = parent_parser.add_parser( "list_accepted_pkgs", help="list_accepted_pkgs help" ) subparser.add_argument("--project", "-p", dest="project", type=str, required=True) subparser.add_argument("--days", "-d", dest="time_range", type=int) subparser.set_defaults(func=main_cli) def osc_prepare(osc_config: Optional[str] = None, osc_server: Optional[str] = None): """ This has to be executed so that the ini-style config is converted into its corresponding types. :param osc_config: Path to the configuration file for osc. The default delegates the task to the osc library. :param osc_server: Server URL that points to the OBS server API. """ conf.get_config(override_conffile=osc_config, override_apiurl=osc_server) def osc_get_submit_requests(project: str): """ Looks up in the IBS the list of requests that are accepted and submitted. :param project: The project that should be checked. :return: The list of requests objects that are found with the state accepted and that are submitted. """ return core.get_review_list( "https://api.suse.de", project=project, states=("accepted",), req_type="submit" ) def osc_get_delete_requests(project: str): """ Looks up in the IBS the list of requests that are accepted and that delete a package. :param project: The project that should be checked. :return: The list of requests objects that are found with the state accepted and that are deleting a package. """ return core.get_review_list( "https://api.suse.de", project=project, states=("accepted",), req_type="delete" ) def filter_requests(days: int, requests) -> list: """ Filter a given list of requests by date. The time of day is respected and not rounded: a request submitted at 2pm on the earliest included day is ignored if this method runs at 3pm. :param days: The number of days to look back. :param requests: The list with the requests to filter.
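The cutoff computation can be sketched standalone. OBS "when" timestamps use the ISO-8601 shape "YYYY-MM-DDThh:mm:ss", which sorts lexicographically, so a plain string comparison against the cutoff is sufficient; the helper below is hypothetical, mirroring the logic of filter_requests.

```python
import time

# Standalone sketch: render "now minus N days" in the same string format
# that OBS request timestamps use, so string comparison orders correctly.
def earliest_date(days: int, now: float) -> str:
    return time.strftime(
        "%Y-%m-%dT%H:%M:%S", time.localtime(now - days * 24 * 3600)
    )
```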
:return: Filtered list where every request is younger than the specified number of days. """ result = [] earliest_date = time.strftime( "%Y-%m-%dT%H:%M:%S", time.localtime(time.time() - days * 24 * 3600) ) for request in requests: if request.state.when > earliest_date: result.append(request) return result def main_cli(args): """ Main routine that executes the script :param args: Argparse Namespace that has all the arguments """ # Parse arguments submit_requests, delete_requests = main( args.osc_instance, args.project, args.time_range ) # Print both retrieved lists with format: "pkg-name (YYYY-MM-DDThh:mm:ss) hyperlink-to-request" print("==============================") print("SUBMIT REQUESTS") print("==============================") for request in submit_requests: print( f"{request.actions[0].src_package} ({request.state.when}) https://api.suse.de/{request.reqid}" ) print("==============================") print("DELETE REQUESTS") print("==============================") for request in delete_requests: print( f"{request.actions[0].src_package} ({request.state.when}) https://api.suse.de/{request.reqid}" ) def main(apiurl: str, project: str, time_range: int) -> Tuple[list, list]: """ Main routine executes the non-CLI related logic. :param apiurl: URL where the API from the build-service can be reached. :param project: The project that should be queried.
:param time_range: The number of days that serves as the cut-off for the filter :return: A Tuple with the list of requests """ # Prepare osc -> Config is ini style and needs to be converted first osc_prepare(osc_server=apiurl) # Retrieve SRs submit_requests = osc_get_submit_requests(project) # Retrieve SRs with delete requests delete_requests = osc_get_delete_requests(project) # Filter requests by days submit_requests = filter_requests(time_range, submit_requests) delete_requests = filter_requests(time_range, delete_requests) return submit_requests, delete_requests 07070100000032000081A40000000000000000000000016412183400005199000000000000000000000000000000000000005400000000sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/package_updates_from_xcdchk.py""" This script retrieves all updates to packages between two builds. """ import re from collections import namedtuple from typing import Dict, List, Set import requests def build_parser(parent_parser): """ Builds the parser for this script. This is executed by the main CLI dynamically. :param parent_parser: The subparsers object from argparse.
""" subparser = parent_parser.add_parser( "package_updates_from_xcdchk", help="package_updates_from_xcdchk help" ) subparser.add_argument( "-A", "--xcdchk-timeout", dest="xcdchk_timeout", type=int, default=60, help="The xcdchk timeout after which the request will be aborted.", ) subparser.add_argument( "-a", "--xcdchk-url", dest="xcdchk_url", default="http://xcdchk.suse.de", help="The xcdchk Domain.", ) subparser.add_argument( "-b", "--build", dest="builds", type=float, nargs=2, required=True, help="The build numbers that should be compared.", ) subparser.add_argument( "--service-pack", dest="service_pack", default="SLE-15-SP5", help='The service pack in the form of "SLE-15-SPx".', ) subparser.add_argument( "--origin-service-pack", dest="origin_service_pack", default="SLE-15-SP5", help='The service pack in the form of "SLE-15-SPx".', ) subparser.set_defaults(func=main_cli) class XcdChkUpdatedPackageVersion: """ This class is responsible for holding the package versions. """ # Disabled because dataclasses are not a thing in Python 3.6 # pylint: disable=R0903 def __init__(self, old_version="", new_version=""): self.old_version = old_version self.new_version = new_version class XcdChkData: """ This class is responsible for holding the data of the script. """ # Disabled because dataclasses are not a thing in Python 3.6 # pylint: disable=R0903 def __init__(self): self.updated_pkgs: Dict[str, XcdChkUpdatedPackageVersion] = {} self.added_pkgs: Dict[str, str] = {} self.removed_pkgs: List[str] = [] self.downgraded_pkgs = List[str] class XcdCheckFetchOptions: """ This class is responsible to hold the options for fetching data from xcdchk. """ # Disabled because dataclasses are not a thing in Python 3.6 # pylint: disable=R0903 def __init__(self, url="", timeout=60): self.url = url self.timeout = timeout class XcdCheckRawData: """ This class is responsible for holding the raw data of the script. 
""" # Disabled because dataclasses are not a thing in Python 3.6 # pylint: disable=R0903 def __init__(self): self.changelog = "" self.updated = "" self.new = "" self.missing = "" self.downgraded = "" def xcdchk_fetch_data( version: str, origin_version: str, build1: str, build2: str, xcdchk_options: XcdCheckFetchOptions, ) -> XcdCheckRawData: """ Fetches the data of xcdchk. :param version: The SLES version that is paired to the newer of the two builds. :param origin_version: The SLES version that is paired to the older version of the two builds. :param build1: The older of the two builds. :param build2: The newer of the two builds. :param xcdchk_options: The options that are used to fetch the data from xcdchk :return: The object that contains all the data that is needed for the script to work. """ xcdchk_url = xcdchk_options.url xcdchk_timeout = xcdchk_options.timeout common_part = f"{origin_version}-Full-Test-Build{build1}-Build{build2}" result = XcdCheckRawData() result.changelog = requests.get( f"{xcdchk_url}/raw/{version}-Full-Test/{build2}/all/ChangeLog-{common_part}", timeout=xcdchk_timeout, ).text result.updated = requests.get( f"{xcdchk_url}/raw/{version}-Full-Test/{build2}/all/{common_part}-updated-RPMs", timeout=xcdchk_timeout, ).text result.new = requests.get( f"{xcdchk_url}/raw/{version}-Full-Test/{build2}/all/{common_part}-new-RPMs", timeout=xcdchk_timeout, ).text result.missing = requests.get( f"{xcdchk_url}/raw/{version}-Full-Test/{build2}/all/{common_part}-missing-RPMs", timeout=xcdchk_timeout, ).text result.downgraded = requests.get( f"{xcdchk_url}/raw/{version}-Full-Test/{build2}/all/{common_part}-downgraded-RPMs", timeout=xcdchk_timeout, ).text return result def __get_package_names(changelog: str) -> Dict[str, List[str]]: add_update_regex = re.compile( r"^(?P<change>o Updated|o Added)\W(?P<package_name>.*)(\W\(.*\))", flags=re.MULTILINE, ) result: Dict[str, List[str]] = {"updated": [], "added": []} for match in 
add_update_regex.finditer(changelog): if match.group("change") == "o Updated": result["updated"].append(match.group("package_name")) elif match.group("change") == "o Added": result["added"].append(match.group("package_name")) return result def xcdchk_updated_pkgs( changelog: str, updated: str ) -> Dict[str, XcdChkUpdatedPackageVersion]: """ This function parses the updated packages from the changelog and then searches for the from and to versions in the updated packages file. :param changelog: The changelog to search through. :param updated: The updated packages to search through. :return: The keys are package names and the value of each key is an "XcdChkUpdatedPackageVersion" instance. """ result = {} package_names = __get_package_names(changelog) upgrade_regex = re.compile( r"(?P<name>.*)\.(?P<arch>.*):\s(?P<old_version>.*)\s=>\s(?P<new_version>.*)" ) for package in package_names.get("updated", []): for line in updated.split("\n"): if line.startswith(package): match = upgrade_regex.match(line) if match is not None: result[package] = XcdChkUpdatedPackageVersion( match.group("old_version"), match.group("new_version") ) for package in package_names.get("added", []): for line in updated.split("\n"): if line.startswith(package): match = upgrade_regex.match(line) if match is not None: result[package] = XcdChkUpdatedPackageVersion( match.group("old_version"), match.group("new_version") ) return result def xcdchk_added_pkgs(changelog: str, added: str) -> Dict[str, str]: """ This function parses the added packages from the changelog and then searches for the version of each package in the added packages file. :param changelog: The changelog to search through. :param added: The added packages to search through. :return: A dict where package names are keys and versions are the values.
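The "updated RPMs" line format parsed by xcdchk_updated_pkgs above can be demonstrated standalone; the package name and versions below are made up for illustration.

```python
import re

# Standalone sketch of the "name.arch: old => new" line format, e.g.
# "bash.x86_64: 4.4-1.1 => 4.4-2.3". Same regex as in the source above.
upgrade_regex = re.compile(
    r"(?P<name>.*)\.(?P<arch>.*):\s(?P<old_version>.*)\s=>\s(?P<new_version>.*)"
)
match = upgrade_regex.match("bash.x86_64: 4.4-1.1 => 4.4-2.3")
```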
""" package_names = __get_package_names(changelog) result = {} package_version_regex = re.compile( r"(?P<name>.*)-(?P<version>[^-]*)-(?P<build>[^-]*)\..*" ) for package in package_names.get("added", []): for line in added.split("\n"): if line.startswith(package): match = package_version_regex.match(line) if match is not None: result[package] = match.group("version") for package in package_names.get("updated", []): for line in added.split("\n"): if line.startswith(package): result[package] = "" return result def xcdchk_removed_pkgs(missing_rpms: str) -> List[str]: """ This function parses the removed packages from the removed packages file. :param missing_rpms: The file to search through where missing RPMs are listed. :return: The list of packages has been removed. """ package_name_regex = re.compile(r"(-[^-]*\..*)") packages: List[str] = [] for line in missing_rpms.split("\n"): if "x86_64" in line or "noarch" in line: package_name_match = next(package_name_regex.finditer(line)) if package_name_match is not None: packages.append(line[: package_name_match.start()]) # Filter kernel and debug packages filter_regex = re.compile(r"^kernel|.*debugsource.*|.*debuginfo.*") result: List[str] = [] for package in packages: filter_match = filter_regex.match(package) if filter_match is None: # Append if regex DOESN'T apply result.append(package) return result def xcdchk_downgraded_pkgs( downgraded_rpms: str, ) -> Dict[str, XcdChkUpdatedPackageVersion]: """ Parse the list of downgraded packages and return a well formatted result for further work. :param downgraded_rpms: The str with the list of downgraded packages. :returns: A dictionary where the keys are package names and the value for each package is a version object. 
""" downgrade_regex = re.compile( r"(?P<name>.*)\.(?P<arch>.*):\s(?P<old_version>.*)\s=>\s(?P<new_version>.*)" ) result: Dict[str, XcdChkUpdatedPackageVersion] = {} for match in downgrade_regex.finditer(downgraded_rpms): result[match.group("name")] = XcdChkUpdatedPackageVersion( match.group("old_version"), match.group("new_version") ) return result def xcdchk_mentioned_bugs(changelog: str) -> List[str]: """ Find all openSUSE, SUSE or Novell Bugzilla references in the changelog. :param changelog: The str with the full changelog. :returns: The sorted list with the Bugzilla bug numbers. """ regex_bugzilla = re.compile(r"(bsc#|bnc#|boo#)(\d{7})", flags=re.M) bugs_set: Set[str] = set() for match in regex_bugzilla.finditer(changelog): bugs_set.add(match.group(2)) return sorted(bugs_set) def build_bsc_query_p1_p2(changelog: str) -> str: """ Builds the query for all P1 & P2 bugs that can be found in Bugzilla. :params changelog: The str with the full changelog. :returns: The str with the full URL that includes the query. """ bugs_list = xcdchk_mentioned_bugs(changelog) bugs = "%2C".join(bugs_list) # Seperated by %2C which is "," url = "https://bugzilla.suse.com/buglist.cgi?" 
return ( f"{url}bug_id={bugs}" f"&bug_id_type=anyexact" f"&bug_status=RESOLVED" f"&bug_status=VERIFIED" f"&columnlist=short_desc" f"&priority=P1%20-%20Urgent" f"&priority=P2%20-%20High" f"&product=PUBLIC%20SUSE%20Linux%20Enterprise%20Desktop%2015%20SP5" f"&product=PUBLIC%20SUSE%20Linux%20Enterprise%20High%20Availability%20Extension%2015%20SP5" f"&product=PUBLIC%20SUSE%20Linux%20Enterprise%20HPC%2015%20SP5" f"&product=PUBLIC%20SUSE%20Linux%20Enterprise%20Server%2015%20SP5" f"&product=SUSE%20Linux%20Enterprise%20Desktop%2015%20SP5" f"&product=SUSE%20Linux%20Enterprise%20High%20Availability%20Extension%2015%20SP5" f"&product=SUSE%20Linux%20Enterprise%20HPC%2015%20SP5" f"&product=SUSE%20Linux%20Enterprise%20HPC%2015%20SP5%20in%20Public%20Clouds" f"&product=SUSE%20Linux%20Enterprise%20Server%2015%20SP5" f"&product=SUSE%20Linux%20Enterprise%20Server%2015%20SP5%20in%20Public%20Clouds" f"&product=SUSE%20Linux%20Enterprise%20Server%20for%20SAP%2015%20SP5%20in%20Public%20Clouds" f"&product=SUSE%20Linux%20Enterprise%20Server%20for%20SAP%20Applications%2015%20SP5" f"&query_based_on=SLE_15SP5_Resolved_issues" f"&query_format=advanced" f"&resolution=FIXED" ) def mentioned_jira_references(changelog: str) -> List[str]: """ Scans a given changelog for SLE and PED issues. :param changelog: The str with the full changelog between two images. :returns: The list of SLE and PED issues that can be found. """ jira_ped_regex = re.compile(r"jsc#SLE-[0-9]{5}|jsc#PED-[0-9]{1,5}") result: Set[str] = set() for match in jira_ped_regex.finditer(changelog): result.add(match.group()) return sorted(result) def build_ped_list(changelog: str) -> List[str]: """ Scans a given changelog for PED issues. :param changelog: The str with the full changelog between two images. :returns: The list of PED issues that can be found. 
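The jsc reference scan used by mentioned_jira_references above can be demonstrated standalone; the changelog snippet and ticket numbers below are made up for illustration.

```python
import re

# Standalone sketch: match jsc#SLE-xxxxx and jsc#PED-x..xxxxx references,
# de-duplicate via a set, and return them sorted. Same regex as the source.
jira_ped_regex = re.compile(r"jsc#SLE-[0-9]{5}|jsc#PED-[0-9]{1,5}")
text = "Fixes jsc#PED-123 and jsc#SLE-12345; see also jsc#PED-123."
refs = sorted({m.group() for m in jira_ped_regex.finditer(text)})
```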
""" result = set() ped_regex = re.compile(r"PED-[0-9]{1,5}") for match in ped_regex.finditer(changelog): result.add(match.group()) return sorted(result) JiraResponse = namedtuple("JiraResponse", "query comment labels") def build_jira_query_incorrect(ped: str, build2: str) -> JiraResponse: """ Builds a Jira Query that shows all issues that are in an incorrect state and that must be manually handled. :param ped: The list of ped issues seperated by commas. :param build2: The newer of the two build numbers available to the script. :returns: The built JiraResponse tuple. """ allowed_stati = [ "QE Open", "QE In Progress", "QE Blocked", "Engineering Done", "Dev In Progress", "IBS Integration", ] allowed_stati_str = '","'.join(allowed_stati) return build_jira_query( ( f"issue in ({ped})" f' AND status NOT IN ("{allowed_stati_str}")' " AND type = Implementation" ), ( f"A submit request referencing this feature has been merged into build{build2}.\n" "Please update the state of this ticket, as it doesn't reflect the correct state of development." ), "Add status:wait_for_status\nAdd status:code_merged", ) def build_jira_query_development(ped: str, build2: str) -> JiraResponse: """ Builds a Jira Query that shows all issues that are in development state. :param ped: The list of ped issues seperated by commas. :param build2: The newer of the two build numbers available to the script. :returns: The built JiraResponse tuple. """ return build_jira_query( f'issue in ({ped}) AND status = "Dev In Progress" AND type = Implementation', f"A submit request referencing this feature has been merged into build{build2}.", "Remove status:wait_for_status\nAdd status:code_merged", ) def build_jira_query_completed(ped: str, build2: str) -> JiraResponse: """ Builds a Jira Query that shows all issues that are in a completed state. :param ped: The list of ped issues seperated by commas. :param build2: The newer of the two build numbers available to the script. :returns: The built JiraResponse tuple. 
""" return build_jira_query( ( f"issue in ({ped})" ' AND status IN ("QE Open","QE In Progress","QE Blocked","Engineering Done")' " AND type = Implementation" ), f"A submit request referencing this feature has been merged into build{build2}.", 'no handling required, only remove stale "status:" labels', ) def build_jira_query_ready(ped: str, build2: str) -> JiraResponse: """ Builds a Jira Query that shows all issues that are ready to be moved to their next status automatically. :param ped: The list of ped issues seperated by commas. :param build2: The newer of the two build numbers available to the script. :returns: The built JiraResponse tuple. """ return build_jira_query( f'issue in ({ped}) AND status = "IBS Integration" AND type = Implementation', f"A submit request referencing this feature has been merged into build{build2}.", "Remove status:code_merged\nRemove status:wait_for_status", ) def build_jira_query(query: str, comment: str, labels: str) -> JiraResponse: """ Builds a JIRA query tuple. :param query: The query that should be attached to the tuple. :param comment: The comment that should be attached to the tuple. :param labels: The labels that should be attached to the tuple. :returns: The built JiraResponse tuple. """ return JiraResponse(query, comment, labels) PackageUpdatedFromXcdChkResult = namedtuple( "PackageUpdatedFromXcdChkResult", "xcdchk_data bsc_query jira_references jira_queries", ) def main( version: str, origin_version: str, build1: str, build2: str, xcdchk_options: XcdCheckFetchOptions, ) -> PackageUpdatedFromXcdChkResult: """ Main routine executes the non-CLI related logic. 
""" xcdchk_sources = xcdchk_fetch_data( version, origin_version, build1, build2, xcdchk_options, ) xcdchk_data = { "updated": xcdchk_updated_pkgs( xcdchk_sources.changelog, xcdchk_sources.updated, ), "added": xcdchk_added_pkgs( xcdchk_sources.changelog, xcdchk_sources.new, ), "removed": xcdchk_removed_pkgs(xcdchk_sources.missing), "downgraded": xcdchk_downgraded_pkgs(xcdchk_sources.downgraded), "mentioned_bugs": xcdchk_mentioned_bugs(xcdchk_sources.changelog), } bsc_query = build_bsc_query_p1_p2(xcdchk_sources.changelog) jira_references = mentioned_jira_references(xcdchk_sources.changelog) ped_list = build_ped_list(xcdchk_sources.changelog) jira_queries = { "incorrect": build_jira_query_incorrect(",".join(ped_list), build2), "development": build_jira_query_development(",".join(ped_list), build2), "completed": build_jira_query_completed(",".join(ped_list), build2), "ready": build_jira_query_ready(",".join(ped_list), build2), } return PackageUpdatedFromXcdChkResult( xcdchk_data, bsc_query, jira_references, jira_queries ) def print_jira_query(header: str, jira_query: str, comment: str, labels: str): """ Formats a JIRA query for printing to stdout. """ print_template = f""" {header} ========================================= Query: ------ {jira_query} Comment: -------- {comment} Labels: ------- {labels}""" print(print_template) def print_from_to_version_package(name: str, from_version: str, to_version: str): """ Formats a package that has a version for printing on stdout. 
""" print(f"* {name}: {from_version} => {to_version}") def main_cli(args): """ Main routine that executes the script :param args: Argparse Namespace that has all the arguments """ result = main( args.service_pack, args.origin_service_pack, args.builds[0], args.builds[1], XcdCheckFetchOptions(args.xcdchk_url, args.xcdchk_timeout), ) print("Updated packages:") for package, version in result.xcdchk_data.get("updated").items(): print_from_to_version_package(package, version.old_version, version.new_version) print("") print("Added packages:") for package, version in result.xcdchk_data.get("added").items(): print(f"* {package} {version}") print("") print("Removed packages:") for package in result.xcdchk_data.get("removed"): print(f"* {package}") print("") print("Downgraded packages:") for package, version in result.xcdchk_data.get("downgraded").items(): print_from_to_version_package(package, version.old_version, version.new_version) print("") print("Mentioned bug references:") print(",".join(result.xcdchk_data.get("mentioned_bugs"))) print("") print(f"Filter for resolved P1/P2 bugs with this build in {args.service_pack}") print(result.bsc_query) print("") print("Mentioned JIRA references:") print(",".join(result.jira_references)) print("") print_jira_query( "JIRA query for incorrect state", result.jira_queries.get("incorrect").query, result.jira_queries.get("incorrect").comment, result.jira_queries.get("incorrect").labels, ) print_jira_query( "JIRA query for still under development", result.jira_queries.get("development").query, result.jira_queries.get("incorrect").comment, result.jira_queries.get("incorrect").labels, ) print_jira_query( "JIRA query for already completed features", result.jira_queries.get("completed").query, result.jira_queries.get("incorrect").comment, result.jira_queries.get("incorrect").labels, ) print_jira_query( "JIRA query for features ready to transition", result.jira_queries.get("ready").query, result.jira_queries.get("incorrect").comment, 
result.jira_queries.get("incorrect").labels, ) 07070100000033000081A400000000000000000000000164121834000030CE000000000000000000000000000000000000004B00000000sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/packagelist_report.py""" This script calculates a diff between two project revisions and writes its output to a file where the added, moved and removed packages are named. """ import logging import shlex import subprocess import sys import textwrap from typing import Dict, List, Optional import yaml logging.basicConfig(level=logging.DEBUG) logger = logging.getLogger(__name__) def build_parser(parent_parser): """ Builds the parser for this script. This is executed by the main CLI dynamically. :param parent_parser: The subparsers object from argparse. """ subparser = parent_parser.add_parser( "packagelist_report", help="Report of package movements between 2 projects" ) subparser.add_argument( "-f", "--from-project", type=str, help="Origin project", required=True ) subparser.add_argument( "-t", "--to-project", type=str, help="Target project", required=True ) subparser.add_argument( "--from-revision-number", type=str, help="Origin revision number" ) subparser.add_argument( "--to-revision-number", type=str, help="Target revision number" ) subparser.set_defaults(func=main_cli) def convert_txt_to_dict(file_content: str) -> Dict[str, List[str]]: """ Converts the text file downloaded and parses :param file_content: The string that should be split :return: The dictionary with the package names as key and the list of groups they are in as a list. """ ret: Dict[str, List[str]] = {} for line in file_content.splitlines(): pkg, group = line.strip().split(":") ret.setdefault(pkg, []) ret[pkg].append(group) return ret def convert_yml_to_dict(file_content: str, unsorted=False) -> Dict[str, List[str]]: """ Loads the file content as a YAML file and then parses it according to the expected file structure. :param file_content: The content of the YAML file. 
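As a standalone sketch of the ``pkg:group`` parsing performed by ``convert_txt_to_dict()`` above (the sample content is invented):

```python
from typing import Dict, List


def parse_summary(file_content: str) -> Dict[str, List[str]]:
    # Each line has the form "package:group"; a package may appear
    # several times, once per group it belongs to.
    ret: Dict[str, List[str]] = {}
    for line in file_content.splitlines():
        pkg, group = line.strip().split(":")
        ret.setdefault(pkg, [])
        ret[pkg].append(group)
    return ret


sample = "bash:base\nbash:minimal_base\nzsh:x11"
print(parse_summary(sample))
# {'bash': ['base', 'minimal_base'], 'zsh': ['x11']}
```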
def convert_yml_to_dict(file_content: str, unsorted=False) -> Dict[str, List[str]]:
    """
    Loads the file content as a YAML file and then parses it according to the expected file structure.

    :param file_content: The content of the YAML file.
    :param unsorted: If the package should be added to unsorted or not explicitly.
    :return: The dictionary with the package names as key and the list of groups they are in as a list.
    """
    ret: Dict[str, List[str]] = {}
    try:
        parsed_yaml = yaml.safe_load(file_content)
        for module, packages in parsed_yaml.items():
            for package in packages:
                if unsorted:
                    ret[package] = ["unsorted"]
                else:
                    ret[package] = [module]
    except yaml.YAMLError as yaml_error:
        logger.error(yaml_error, exc_info=True)
    return ret


def download_file(cmd: str):
    """
    Takes an osc command, runs it and returns the completed process.

    :param cmd: The osc command to run.
    :return: The ``subprocess.CompletedProcess`` with the captured output.
    """
    logger.debug("Executing: %s", cmd)
    cmd_args = shlex.split(cmd)
    output = subprocess.run(cmd_args, capture_output=True, text=True, check=False)
    if output.returncode != 0:
        logger.warning("Failed to execute: %s", cmd)
        logger.warning(output.stderr)
    return output


def get_yml_files(command: str, revision) -> Optional[Dict[str, List[str]]]:
    """
    Retrieve the different reference YAML files and parse them.

    :param command: The pre-built command that should be used.
    :param revision: The project revision to get the file from.
    :return: The dictionary with the package names as key and the list of groups they are in as a list.
    """
    ret: Dict[str, List[str]] = {}
    yaml_files = ["reference-summary.yml", "reference-unsorted.yml", "unneeded.yml"]
    for file in yaml_files:
        cmd = command + f" {file}"
        if revision:
            cmd += f" -r {revision}"
        output = download_file(cmd)
        if output.returncode != 0:
            return None
        unsorted_flag = file != yaml_files[0]
        content = convert_yml_to_dict(output.stdout, unsorted_flag)
        ret = {**ret, **content}
    return ret


def get_txt_file(cmd: str, revision) -> Optional[Dict[str, List[str]]]:
    """
    Retrieve and parse the content of ``summary-staging.txt``.

    :param cmd: The pre-built command that should be used.
    :param revision: The project revision to get the file from.
    :return: The file that has been downloaded by osc and then parsed by ``convert_txt_to_dict()``.
    """
    file = "summary-staging.txt"
    cmd += f" {file}"
    if revision:
        cmd += f" -r {revision}"
    output = download_file(cmd)
    if output.returncode != 0:
        return None
    return convert_txt_to_dict(output.stdout)


def get_file_content(project: str, revision: str):
    """
    Retrieves the file content of a text file from 000package-groups.

    :param project: The project that should be checked.
    :param revision: The project revision to get the file from.
    :return: The dictionary with the package names as key and the list of groups they are in as a list.
    """
    apiurl = "https://api.suse.de/"
    package = "000package-groups"
    cmd = f"osc -A {apiurl} cat {project} {package}"
    ret = get_txt_file(cmd, revision)
    if not ret:
        ret = get_yml_files(cmd, revision)
    if not ret:
        sys.exit(1)
    write_summary_dict(project, ret)
    return ret


def read_yaml_file(file, unsorted=False):
    """
    Loads the file content as a YAML file and then parses it according to the expected file structure.

    :param file: The path to the YAML file.
    :param unsorted: If the package should be added to unsorted or not explicitly.
    :return: The dictionary with the package names as key and the list of groups they are in as a list.
    """
    ret = {}
    with open(file, "r", encoding="UTF-8") as stream:
        try:
            parsed_yaml = yaml.safe_load(stream)
            for module, packages in parsed_yaml.items():
                for package in packages:
                    if unsorted:
                        ret[package] = ["unsorted"]
                    else:
                        ret[package] = [module]
        except yaml.YAMLError as yaml_error:
            logger.error(yaml_error, exc_info=True)
    return ret


def read_summary_file(file: str) -> Dict[str, List[str]]:
    """
    Reads a file and interprets it in the format of ``summary-staging.txt``.

    :param file: The path to the file.
    :return: The dict with the package names as keys and the list of groups as a value.
    """
    ret: Dict[str, List[str]] = {}
    with open(file, "r", encoding="UTF-8") as fp_summary_file:
        for line in fp_summary_file:
            pkg, group = line.strip().split(":")
            ret.setdefault(pkg, [])
            ret[pkg].append(group)
    return ret


def write_summary_file(file: str, content: str) -> None:
    """
    Write a file to disk with the given content.

    :param file: The file path of the desired target.
    :param content: The content to write.
    """
    logger.info("Summary report saved in %s", file)
    with open(file, "w", encoding="UTF-8") as fp_summary_file:
        fp_summary_file.write(content)


def write_summary_dict(file: str, content: Dict[str, List[str]]) -> None:
    """
    Write a file to disk with the given content.

    :param file: The file path of the desired target.
    :param content: The dict with the packages as keys and their categories as value.
    """
    logger.info("List of %s packages saved in %s", file, file)
    output = []
    for pkg in sorted(content):
        for group in sorted(content[pkg]):
            output.append(f"{pkg}:{group}")
    with open(file, "w", encoding="UTF-8") as fp_summary_file:
        for line in sorted(output):
            fp_summary_file.write(line + "\n")
""" report = "" for removed_package in sorted(removed.keys()): report += f"**Remove from {removed_package}**\n\n```\n" paragraph = ", ".join(removed[removed_package]) report += "\n".join( textwrap.wrap( paragraph, width=90, break_long_words=False, break_on_hyphens=False ) ) report += "\n```\n\n" for move in sorted(moved.keys()): report += f"**Move from {move}**\n\n```\n" paragraph = ", ".join(moved[move]) report += "\n".join( textwrap.wrap( paragraph, width=90, break_long_words=False, break_on_hyphens=False ) ) report += "\n```\n\n" for group in sorted(added): report += f"**Add to {group}**\n\n```\n" paragraph = ", ".join(added[group]) report += "\n".join( textwrap.wrap( paragraph, width=90, break_long_words=False, break_on_hyphens=False ) ) report += "\n```\n\n" return report.strip() def calculcate_package_diff(old_file: dict, new_file: dict) -> Optional[str]: """ Calculate the package diff between the two dictionaries that were passed. :param old_file: Dictionary with the content from the source project. :param new_file: Dictionary with the content from the target project. :return: The str with the generated report. As generated by ``generate_package_diff_report()``. 
""" # remove common part keys = list(old_file.keys()) for key in keys: if new_file.get(key, []) == old_file[key]: del new_file[key] del old_file[key] if not old_file and not new_file: return None added: Dict[str, List[str]] = {} for pkg in new_file: if pkg in old_file: continue addkey = ",".join(new_file[pkg]) added.setdefault(addkey, []) added[addkey].append(pkg) removed: Dict[str, List[str]] = {} for pkg in old_file: old_groups = old_file[pkg] if new_file.get(pkg): continue removekey = ",".join(old_groups) removed.setdefault(removekey, []) removed[removekey].append(pkg) moved: Dict[str, List[str]] = {} for pkg in old_file: old_groups = old_file[pkg] new_groups = new_file.get(pkg) if not new_groups: continue movekey = ",".join(old_groups) + " to " + ",".join(new_groups) moved.setdefault(movekey, []) moved[movekey].append(pkg) return generate_package_diff_report(added, moved, removed) def main( from_project: str, from_revision_number: str, to_project: str, to_revision_number: str, ): """ Main routine executes the non-CLI related logic. :param from_project: Source project that should be used as a base. :param from_revision_number: Project revision number. :param to_project: Source project that should be used as a comparison target. :param to_revision_number: Project revision number. 
""" from_summary = get_file_content(from_project, from_revision_number) to_summary = get_file_content(to_project, to_revision_number) report = calculcate_package_diff(from_summary, to_summary) # logger.debug(f"\n{report}") logger.debug("\n%s", from_revision_number) logger.debug("\n%s", to_revision_number) if report: summary_file = "summary-report.md" write_summary_file(summary_file, report) else: logger.info("No package movement reported") # if os.path.isfile(reference_summary): # from_shipped = read_yaml_file(f'FROM-shipped.yml') # from_unsorted = read_yaml_file(f'FROM-unsorted.yml',True) # from_summary = {**from_shipped, **from_unsorted} # from_summary_file = f'from_summary-external' # write_summary_dict(from_summary_file, from_summary) # to_summary = read_summary_file(f'TO-summary-staging.txt') # to_summary_file = f'to_summary-external' # write_summary_dict(to_summary_file, to_summary) # report = calculcate_package_diff(from_summary, to_summary) # logger.debug(f"\n{report}") # summary_file = f'diff-report-external.md' # write_summary_file(summary_file, report) def main_cli(args): """ Main routine that executes the script :param args: Argparse Namespace that has all the arguments """ main( args.from_project, args.from_revision_number, args.to_project, args.to_revision_number, ) 07070100000034000081A400000000000000000000000164121834000008C7000000000000000000000000000000000000004300000000sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/release_to.py""" Module that will release a ``:GA`` project to ``:GA:TEST`` or ``GA:PUBLISH``. """ import enum from typing import Optional from sle_prjmgr_tools.utils.osc import OscReleaseHelper class ReleaseTargets(enum.Enum): """ This Enum contains all possible release targets for this script. """ TEST = 0 PUBLISH = 1 def build_parser(parent_parser): """ Builds the parser for this script. This is executed by the main CLI dynamically. :param parent_parser: The subparsers object from argparse. 
""" # pylint: disable=R0801 subparser = parent_parser.add_parser("release_to", help="release_to help") group = subparser.add_mutually_exclusive_group(required=True) group.add_argument( "--test", dest="release_target", action="store_const", const=ReleaseTargets.TEST, help="If this flag is given, a release to :GA:TEST is triggered.", ) group.add_argument( "--publish", dest="release_target", action="store_const", const=ReleaseTargets.PUBLISH, help="If this flag is given, a release to :GA:PUBLISH is triggered.", ) subparser.add_argument( "project", metavar="project", help="Project that should be released including the GA suffix.", ) def main( obs_url: str, project: str, target: ReleaseTargets, osc_config: Optional[str] = None ): """ Main routine executes the non-CLI related logic. :param obs_url: URL to the OBS instance. :param project: Project to release. Must include the ``:GA`` suffix but not more. :param target: Decides what the release target is. :param osc_config: Path to the ``.oscrc``. It may be ``None`` if osc should handle the lookup. """ releaser = OscReleaseHelper(osc_server=obs_url, override_config=osc_config) if target == ReleaseTargets.TEST: releaser.release_repo_to_test(project) elif target == ReleaseTargets.PUBLISH: releaser.release_repo_to_publish(project) def main_cli(args): """ Main routine that executes the script :param args: Argparse Namespace that has all the arguments """ main(args.osc_instance, args.project, args.release_target, args.osc_config) 07070100000035000081A40000000000000000000000016412183400000906000000000000000000000000000000000000004600000000sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/search_binary.py""" Retrieve the Channel, Project and Source Package for a list of given binaries. 
""" from typing import List import requests QUERY = """ query {{ binaries(name_Iexact:"{}") {{ edges {{ node {{ channelsources {{ edges {{ node {{ channel {{ name }} package {{ name }} project {{ name }} }} }} }} }} }} }} }} """ SMELT_API = "https://smelt.suse.de/graphql/" def build_parser(parent_parser): """ Builds the parser for this script. This is executed by the main CLI dynamically. :param parent_parser: The subparsers object from argparse. """ subparser = parent_parser.add_parser("search_binary", help="search_binary help") subparser.add_argument( "binaries", metavar="binary", help="The binary names", nargs="+" ) subparser.set_defaults(func=main_cli) def get_binary_info(binary: str, timeout=180): """ Retrieve all channel, project and source package names for a given binary. :param binary: The name of the binaries that should be searched for. :param timeout: Timout when the connection should be aborted. :return: A Tuple with the name of the channel, project and source package is yielded. """ query = {"query": QUERY.format(binary)} results = requests.post(SMELT_API, query, timeout=timeout).json() for bin_result in results["data"]["binaries"]["edges"]: for result in bin_result["node"]["channelsources"]["edges"]: yield ( result["node"]["channel"]["name"], result["node"]["project"]["name"], result["node"]["package"]["name"], ) def main(binaries: List[str]): """ Main routine executes the non-CLI related logic. :param binaries: The list of binaries that should be checked. 
""" for binary in binaries: print(f"{binary}:") for chan, proj, pack in get_binary_info(binary): print(f" {chan}: {proj}/{pack}") def main_cli(args): """ Main routine that executes the script :param args: Argparse Namespace that has all the arguments """ main(args.binaries) 07070100000036000081A40000000000000000000000016412183400004931000000000000000000000000000000000000004200000000sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/sle_build.py""" This module is responsible to collect the current build numbers of the various images that are built. """ import argparse import concurrent.futures import re from typing import Dict, List, Optional from urllib.error import HTTPError from lxml import etree from osc import conf, core # type: ignore class Build: """ Helper class that holds data about a single build. """ # Disabled because dataclasses are not a thing in Python 3.6 # pylint: disable=R0903 def __init__(self, name="", mtime=0): self.name = name self.mtime = mtime self.kind = "" self.number = "" class SleBuildData: """ Helper class that holds all the builds and image names in one object. """ # Disabled because dataclasses are not a thing in Python 3.6 # pylint: disable=R0903 def __init__(self): self.codestream = "" self.builds_ga: Dict[str, Build] = {} self.builds_test: Dict[str, Build] = {} self.builds_publish: Dict[str, Build] = {} self.images_test = [] self.images_publish = [] self.images_wsl = [] def build_parser(parent_parser): """ Builds the parser for this script. This is executed by the main CLI dynamically. :param parent_parser: The subparsers object from argparse. """ subparser = parent_parser.add_parser( "sle_build", help="sle_build help", formatter_class=argparse.ArgumentDefaultsHelpFormatter, ) subparser.add_argument( "version", metavar="V", help='Project to get the builds for (e.g. "15-SP5").' 
    )
    subparser.set_defaults(func=main_cli)


def osc_prepare(osc_config: Optional[str] = None, osc_server: Optional[str] = None):
    """
    This has to be executed so that the ini-style config is converted into the corresponding types.

    :param osc_config: Path to the configuration file for osc. The default delegates the task to the osc library.
    :param osc_server: Server URL that points to the OBS server API.
    """
    conf.get_config(override_conffile=osc_config, override_apiurl=osc_server)


def osc_get_builds(apiurl: str, project: str) -> List[Build]:
    """
    Get builds from the build-service.

    :param apiurl: URL where the API from the build-service can be reached.
    :param project: The project to look at.
    :return: The list with the image names.
    """
    result = []
    query = {
        "package": "000product",
        "multibuild": "1",
        "repository": "images",
        "arch": "local",
        "view": "binarylist",
    }
    url = core.makeurl(apiurl, ["build", project, "_result"], query=query)
    file_object_builds = core.http_GET(url)
    tree = etree.parse(file_object_builds)
    binary_list = tree.xpath("//resultlist/result/binarylist/binary[@filename]")
    regex_iso_filename = re.compile(
        r".*(DVD|cd-cd|Packages|Full|Online)-x86_64.*Media1.iso$"
    )
    for binary in binary_list:
        filename_attribute = binary.get("filename")
        mtime_attribute = binary.get("mtime")
        if regex_iso_filename.match(filename_attribute):
            result.append(Build(filename_attribute, mtime_attribute))
    return result


def sle_15_media_build(apiurl: str, project: str) -> Dict[str, Build]:
    """
    Searches in the specified project for the current build flavors with their corresponding id.

    :param apiurl: URL where the API from the build-service can be reached.
    :param project: The project to look at.
    :return: A dict where the keys are the build flavor and the values the build number.
    """
    result = {}
    regex_output = re.compile(
        r"(.*DVD|.*cd-cd|.*Packages|.*Full|.*Online)-x86_64-(Build.*)-Media1.iso"
    )
    for build in osc_get_builds(apiurl, project):
        build_number_match = regex_output.match(build.name)
        if build_number_match is None:
            raise ValueError("No regex match for build number!")
        build.kind = build_number_match.group(1)
        build.number = build_number_match.group(2)
        result[build.kind] = build
    return result


def osc_get_sle_non_release_packages(apiurl: str, project: str):
    """
    Retrieve all packages that are not related to a release.

    :param apiurl: URL where the API from the build-service can be reached.
    :param project: The project to look at.
    :return: The list of packages.
    """
    package_list = core.meta_get_packagelist(apiurl, project)
    result = []
    for package in package_list:
        if package.startswith("SLE") and "release" not in package:
            result.append(package)
    return result


def osc_get_build_flavors(
    apiurl: str, project: str, package: str, filename: str
) -> list:
    """
    Reads the sources of a package to retrieve the allowed build flavors of an image.

    :param apiurl: URL where the API from the build-service can be reached.
    :param project: The project to look at.
    :param package: The package where the multibuild file is located.
    :param filename: The name of the multibuild file.
    :return: The list of flavors.
    """
    url = core.makeurl(apiurl, ["source", project, package, filename])
    try:
        file_object_osc_cat = core.http_GET(url)
    except HTTPError as error:
        if error.code == 404:
            # Package has no multibuild file
            return []
        raise
    root = etree.parse(file_object_osc_cat)
    elements = root.xpath("//multibuild/flavor")
    result = []
    for element in elements:
        result.append(element.text)
    return result
def osc_get_non_product_packages(apiurl: str, project: str) -> List[str]:
    """
    Get a list of packages that are unrelated to the product building progress.

    :param apiurl: URL where the API from the build-service can be reached.
    :param project: The project to look at.
    :return: The list of packages.
    """
    package_list = core.meta_get_packagelist(apiurl, project)
    result = []
    package_list.remove("000product")
    for package in package_list:
        if "_product" in package or "kiwi" in package:
            continue
        result.append(package)
    return result


def get_kiwi_template(apiurl: str, project: str) -> str:
    """
    Retrieve the name of the kiwi template package.

    :param apiurl: URL where the API from the build-service can be reached.
    :param project: The project to look at.
    :return: The full name of the kiwi-template package.
    """
    project_packages = core.meta_get_packagelist(apiurl, project)
    kiwi_template = ""
    for package in project_packages:
        if package.startswith("kiwi-templates"):
            kiwi_template = package
            break
    return kiwi_template


def get_sle_image_jeos_single(
    apiurl: str, project: str, repo, kiwi_template: str, i
) -> List[str]:
    """
    Search for JeOS images in a given project, repo and with the specified kiwi template.

    :param apiurl: URL where the API from the build-service can be reached.
    :param project: The project to look at.
    :param repo: The repository to search for JeOS images.
    :param kiwi_template: The full name of the kiwi package.
    :param i: The flavor of the package that is being searched for.
    :return: The images that have been found.
    """
    result = []
    binaries = core.get_binarylist(
        apiurl, project, repo.name, repo.arch, f"{kiwi_template}:{i}"
    )
    for binary in binaries:
        if binary.endswith(".packages"):
            result.append(binary[:-9])
    return result


def get_sle_images_jeos(apiurl: str, project: str, kiwi_template: str) -> List[str]:
    """
    Collects the multibuild JeOS images.

    :param apiurl: URL where the API from the build-service can be reached.
    :param project: The project to look at.
    :param kiwi_template: The full name of the kiwi package.
    :return: The list of names that the images from JeOS/Minimal will have.
    """
    result = []
    # core.get_repos_of_project returns an iterator
    repos = list(core.get_repos_of_project(apiurl, project))
    with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
        futures = []
        for i in osc_get_build_flavors(apiurl, project, kiwi_template, "_multibuild"):
            for repo in repos:
                futures.append(
                    executor.submit(
                        get_sle_image_jeos_single,
                        apiurl,
                        project,
                        repo,
                        kiwi_template,
                        i,
                    )
                )
        for future in concurrent.futures.as_completed(futures):
            result.extend(future.result())
    return result


def get_sle_images_multibuild_single(
    apiurl: str, project: str, repo, package: str, flavor: str
):
    """
    Retrieve a list of images.

    :param apiurl: URL where the API from the build-service can be reached.
    :param project: The project to look at.
    :param repo: The repository to get the images from.
    :param package: The package where the images are in.
    :param flavor: The flavor of the package that should be checked.
    :return: The list of images.
    """
    result = []
    binaries = core.get_binarylist(
        apiurl, project, repo.name, repo.arch, f"{package}:{flavor}"
    )
    for binary in binaries:
        if binary.endswith(".packages"):
            result.append(binary[:-9])
    return result


def get_sle_images_multibuild(apiurl: str, project: str):
    """
    Multibuild images such as cloud etc., typical name is "SLE12-SP5-EC2". This is for 15.2+ and 12.5+.

    :param apiurl: URL where the API from the build-service can be reached.
    :param project: The project to look at.
    """
    result = []
    # core.get_repos_of_project returns an iterator
    repos = list(core.get_repos_of_project(apiurl, project))
    with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
        futures = []
        for package in osc_get_sle_non_release_packages(apiurl, project):
            for flavor in osc_get_build_flavors(
                apiurl, project, package, "_multibuild"
            ):
                for repo in repos:
                    futures.append(
                        executor.submit(
                            get_sle_images_multibuild_single,
                            apiurl,
                            project,
                            repo,
                            package,
                            flavor,
                        )
                    )
        for future in concurrent.futures.as_completed(futures):
            result.extend(future.result())
    return result


def get_sle_images_old_style_single(apiurl: str, project, repo, i):
    """
    Retrieve the old style image names.

    :param apiurl: URL where the API from the build-service can be reached.
    :param project: The project to look at.
    :param repo: The repository to search for images.
    :param i: The package to search for old-style images.
    :return: The names of the images as a list.
    """
    result = []
    binaries = core.get_binarylist(apiurl, project, repo.name, repo.arch, i)
    for binary in binaries:
        if binary.endswith(".packages"):
            result.append(binary[:-9])
    return result


def get_sle_images_old_style(apiurl: str, project: str):
    """
    Collects the old style images.

    :param apiurl: URL where the API from the build-service can be reached.
    :param project: The project to look at.
    """
    result = []
    # core.get_repos_of_project returns an iterator
    repos = []
    for repo in list(core.get_repos_of_project(apiurl, project)):
        if repo.name == "images":
            repos.append(repo)
    with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
        futures = []
        for i in osc_get_non_product_packages(apiurl, project):
            for repo in repos:
                futures.append(
                    executor.submit(
                        get_sle_images_old_style_single, apiurl, project, repo, i
                    )
                )
        for future in concurrent.futures.as_completed(futures):
            result.extend(future.result())
    return result
:param apiurl: URL where the API from the build-service can be reached. :param project: The project to look at. :return: The list of images that can be found. """ temp_storage = [] kiwi_template = get_kiwi_template(apiurl, project) if kiwi_template != "": jeos_images = get_sle_images_jeos(apiurl, project, kiwi_template) temp_storage.extend(jeos_images) multibuild_images = get_sle_images_multibuild(apiurl, project) temp_storage.extend(multibuild_images) old_style_images = get_sle_images_old_style(apiurl, project) temp_storage.extend(old_style_images) temp_storage.sort() return temp_storage def osc_get_dvd_images(apiurl: str, version: str) -> List[str]: """ Get all SLE 12 style DVD images. :param apiurl: URL where the API from the build-service can be reached. :param version: The version of SLE to look for. :return: The list of images that can be found. """ packages = core.meta_get_packagelist(apiurl, f"SUSE:SLE-{version}:GA") result = [] product_regex = re.compile(r"_product.*(DVD-x86|cd-cd.*x86_64)") for package in packages: if product_regex.match(package): result.append(package) return result def get_wsl_binaries(apiurl: str, project: str) -> List[str]: """ This method avoids displaying images-test/$arch. :param apiurl: URL where the API from the build-service can be reached. :param project: The project to look for WSL images in. :return: A list of names for build WSL images. """ result = [] try: binary_list = core.get_binarylist( apiurl, project, "standard", "x86_64", "wsl-appx", ) except HTTPError as http_error: if http_error.code != 404: # All non 404 should be raised loudly raise # Something do not exist, thus no binary available binary_list = [] for binary in binary_list: if binary.endswith(".appx"): result.append(binary) return result def osc_get_sle_12_images(apiurl: str, version: str) -> Dict[str, Build]: """ Retrieves the list of SLE 12 images. :param apiurl: URL where the API from the build-service can be reached. 
    :param version: The version of SLE to check for.
    :return: A dict where the keys are build flavors and the values are build ids.
    """
    result = {}
    for image in osc_get_dvd_images(apiurl, version):
        my_media = core.get_binarylist(
            apiurl,
            f"SUSE:SLE-{version}:GA",
            "images",
            "local",
            package=image,
        )
        builds = []
        for media in my_media:
            if (
                media.endswith("Media1.iso") or media.endswith("Media.iso")
            ) and "x86_64" in media:
                builds.append(media)
        regex_build = re.compile(r"(.*)-DVD.*x86_64-(Build[0-9]+)-Media(1)?\.iso")
        if len(builds) == 1:
            my_match = regex_build.match(builds[0])
            if my_match is None:
                raise ValueError("No match for the regex of the build number!")
            build = Build(name=builds[0])
            build.kind = my_match.group(1)
            build.number = my_match.group(2)
            result[build.kind] = build
    return result


def main(apiurl: str, version: str, osc_config: Optional[str] = None) -> SleBuildData:
    """
    Main function to get the builds for the specified version.

    :param apiurl: URL where the API from the build-service can be reached.
    :param version: The version of the product to check. Should be in the format "<codestream>-SP<number>".
    :param osc_config: The config location for osc to use. If None, the default is retrieved by osc.
    :return: Object with the summarized data.
""" # Preparations result = SleBuildData() result.codestream = version.split("-", 1)[0] # Prepare osc osc_prepare(osc_config=osc_config, osc_server=apiurl) # Special cases if result.codestream == "15": result.builds_ga = sle_15_media_build(apiurl, f"SUSE:SLE-{version}:GA") result.builds_test = sle_15_media_build(apiurl, f"SUSE:SLE-{version}:GA:TEST") result.builds_publish = sle_15_media_build( apiurl, f"SUSE:SLE-{version}:GA:PUBLISH" ) else: result.builds_ga = osc_get_sle_12_images(apiurl, version) result.images_test = sle_images(apiurl, f"SUSE:SLE-{version}:GA:TEST") result.images_publish = sle_images(apiurl, f"SUSE:SLE-{version}:GA:PUBLISH") result.images_wsl = get_wsl_binaries( apiurl, f"SUSE:SLE-{version}:Update:WSL:Update:CR" ) return result def main_cli(args): """ Main routine that executes the script :param args: Argparse Namespace that has all the arguments """ data = main(args.osc_instance, args.version, osc_config=args.osc_config) if data.codestream == "15": print(f"builds from SUSE:SLE-{args.version}:GA:") for build, version in data.builds_ga.items(): print(f"{build}:\t\t{version.number}") print("") print(f"builds from SUSE:SLE-{args.version}:GA:TEST:") for build, version in data.builds_test.items(): print(f"{build}:\t\t{version.number}") print("") print(f"builds from SUSE:SLE-{args.version}:GA:PUBLISH") for build, version in data.builds_publish.items(): print(f"{build}:\t\t{version.number}") else: print(f"builds from SUSE:SLE-{args.version}:GA:") for build, version in data.builds_ga.items(): print(f"{build}:\t\t{version.number}") # Normal process print("") print(f"images from SUSE:SLE-{args.version}:GA:TEST:") for image in data.images_test: print(image) print("") print(f"images from SUSE:SLE-{args.version}:GA:PUBLISH:") for image in data.images_publish: print(image) # avoid displaying images-test/$arch print("") print(f"WSL image (from SUSE:SLE-{args.version}:Update:WSL:Update:CR):") if len(data.images_wsl) > 0: for binary in data.images_wsl: 
            print(binary)
    else:
        print("No binary available or package not found!")


File sle_prjmgr_tools/update_build_status_page.py:

"""
Updates the build status page on Confluence for the service pack that is given with information from IBS.
"""
import getpass
import json
import os
import re
import subprocess
import sys
from datetime import date, datetime, timedelta
from typing import Any, Dict, Set, Union

import requests

JIRA_PASSWORD_FILE = os.path.expanduser("~/.jira_susedotcom_password")
NEXT_BUILD_DAYS = 7
PROJECT_TO_PAGE_MAP = {
    "test": {
        "page_id": 243269754,
        "url": "https://confluence.suse.com/display/SUSELinuxEnterpriseServer12SP5/Test",
    },
    "SUSE:SLE-12-SP5:GA": {
        "page_id": 227442811,
        "url": "https://confluence.suse.com/display/SUSELinuxEnterpriseServer12SP5/Build+status+SLE+12+SP5",
    },
    "SUSE:SLE-15-SP1:GA": {
        "page_id": 203129002,
        "url": "https://confluence.suse.com/display/SUSELinuxEnterpriseServer15SP1/Build+status",
    },
    "SUSE:SLE-15-SP2:GA": {
        "page_id": 255262891,
        "url": "https://confluence.suse.com/display/SUSELinuxEnterpriseServer15SP2/Build+Status",
    },
    "SUSE:SLE-15-SP3:GA": {
        "page_id": 415957115,
        "url": "https://confluence.suse.com/display/SUSELinuxEnterpriseServer15SP3/Build+Status",
    },
    "SUSE:SLE-15-SP4:GA": {
        "page_id": 798884391,
        "url": "https://confluence.suse.com/display/SUSELinuxEnterpriseServer15SP4/Build+Status",
    },
    "SUSE:SLE-15-SP5:GA": {
        "page_id": 1058603270,
        "url": "https://confluence.suse.com/display/SUSELinuxEnterpriseServer15SP5/Build+Status",
    },
}
NEXT_BUILD_DATE = None
# Cache to limit the amount of REST calls: {url: expanded_body.json}
page_cache: Dict[str, Any] = {}


def build_parser(parent_parser):
    """
    Builds the parser for this script. This is executed by the main CLI dynamically.

    :param parent_parser: The subparsers object from argparse.
""" global NEXT_BUILD_DATE # pylint: disable=W0603 NEXT_BUILD_DATE = datetime.now() + timedelta(days=NEXT_BUILD_DAYS) subparser = parent_parser.add_parser( "update_build_status_page", help="update_build_status_page help" ) subparser.add_argument( "project", metavar="PROJECT", help="SUSE:PROJECT:GA", nargs=1, choices=PROJECT_TO_PAGE_MAP.keys(), ) product_group = subparser.add_argument_group("Confluence Server related options") product_group.add_argument( "--next-build-date", help=f"YYYY-mm-dd Adjusts a date of next build. Default is in {date.strftime(NEXT_BUILD_DATE, '%Y-%m-%d')}", default=date.strftime(NEXT_BUILD_DATE, "%Y-%m-%d"), ) product_group.add_argument( "--build-id", help="Build id is autodetected from output of sle_common/sle-build. But you can override it.", ) product_group.add_argument( "--build-label", help="Will append label behind build id. E.g. Alpha-1.0-candidate", ) server_group = subparser.add_argument_group("Confluence Server related options") server_group.add_argument( "--server", help="JIRA Server (devel by default), could be also url", default="https://confluence.suse.com/rest/api", ) server_group.add_argument( "--user", help=f"JIRA user [{os.getenv('USER')}]", default=os.getenv("USER") ) server_group.add_argument("--auth", help="JIRA authentication", default="basic") server_group.add_argument( "--password", help=f"JIRA/Confluence password. Can be also stored in {JIRA_PASSWORD_FILE}", ) subparser.set_defaults(func=main_cli) def get_project(project: str): """ Retrieves the desired project. :param project: The project to work with. :return: The project or SLES 15 SP1 for testing. """ if project == "test": return "SUSE:SLE-15-SP1:GA" return project def free_page_cache(url: str): """ Checks if a page is cached and removes it if it is. :param url: The URL to check. 
""" for addr in list(page_cache.keys()): if addr == "url" or addr.startswith(f"{url}?"): del page_cache[addr] def confluence_generate_build_summary( server: str, user: str, password: str, project: str, build_id: int, build_label: str, changelog: str, ): """ Merge manually pre-filled section for next build with changelog from the build :param server: Confluence server URL :param user: The user for Confluence to log in with. :param password: The password for Confluence to log in with. :param project: The project in the IBS to work with. :param build_id: The override ID of the build that is being worked with. :param build_label: The override label of the build that is being worked with. :param changelog: The cached output of the ``sle_build`` script. :return: The generated build summary. """ # pylint: disable=R0913 build_str = str(build_id) if build_label: build_str = f"{build_id} - {build_label}" pre_filled = confluence_get_next_build_section( server, user, password, page_id_by_project(project) ) summary = pre_filled.replace( '<ac:parameter ac:name="title">Next build</ac:parameter>', f'<ac:parameter ac:name="title">Build {build_str} ({datetime.now().strftime("%Y%m%d")})</ac:parameter>', ) # replace only the last </p> or </ul> with </p> or </ul> followed by <p>ourlist</p> summary = re.sub( r"(</[a-z]*>)\s*</ac:rich-text-body>", rf"\1<p><br /><br />{to_html_list(changelog)}</p></ac:rich-text-body>", summary, ) match = re.search(r'<time datetime="(?P<date>\d+-\d+-\d+)" />', summary) if match: summary = re.sub( r'<time datetime="\d+-\d+-\d+" />', f'<time datetime="{datetime.now().strftime("%Y-%m-%d")}" />', summary, ) return summary def confluence_get_prev_builds(server: str, user: str, password: str, project: str): """ Retrieve the old page content of the specified project. :param server: Confluence server URL :param user: The user for Confluence to log in with. :param password: The password for Confluence to log in with. :param project: The IBS project to work with. 
    :return: The page content of the Confluence page for the specified IBS project.
    """
    content = confluence_get_page_content(
        server, user, password, page_id_by_project(project), expand=True
    )
    content = content["body"]["storage"]["value"]
    next_build_section = confluence_get_next_build_section(
        server, user, password, page_id_by_project(project)
    )
    next_bs_start = content.find(next_build_section)
    return content[next_bs_start + len(next_build_section) :]


def to_html_list(data: str):
    """
    Converts a newline separated str into an HTML tag separated str.

    :param data: The str to convert.
    :return: The str with newlines replaced by the HTML ``<br />`` tag.
    """
    result = ""
    for line in data.split("\n"):
        if line.strip():
            result += f"{line}<br />"
    return result


def get_last_build_number(project, changelog=None):
    """
    Retrieves the last build ID from the ``sle_build`` script.

    :param project: The project name in IBS.
    :param changelog: The changelog from the build.
    :return: The build number.
    """
    # The build id is the highest number found in the changelog.
    # SLE 12 and SLE 15 have a different version syntax.
    versions: Set[Union[int, float]] = set()
    if not changelog:
        # Reuse the changelog if possible, retrieving it takes a lot of time.
        changelog = get_last_build_changelog(project)
    project = get_project(project)
    if project.startswith("SUSE:SLE-12"):
        # SLE-12-SP5-HPC: Build0151
        for line in changelog.split("\n"):
            version = re.search("([0-9]+)$", line)
            if version:
                versions.add(int(version.groups()[0]))
    elif project.startswith("SUSE:SLE-15"):
        # SLE-15-SP1-Installer: Build224.10
        for line in changelog.split("\n"):
            version = re.search(r"([0-9]+\.[0-9]+)$", line)
            if version:
                versions.add(float(version.groups()[0]))
    return str(sorted(versions)[-1])


def get_last_build_changelog(project: str):
    """
    Retrieves the output of the ``sle_build`` script.

    :param project: The project name in IBS.
    :return: The output of the ``sle_build`` script.
""" # Build id is ignored project = get_project(project) prj_match = re.search(r"SLE-(\d+-\w+):\w+", project) if prj_match is None: raise ValueError("Could not get project name from given project!") prj = prj_match.groups(0)[0] cmd = f"{os.path.join(os.path.dirname(os.path.realpath(__file__)), 'sle-build')} {prj}" out = subprocess.check_output(cmd, shell=True) # Python3 returns bytes-like obj if not isinstance(out, str): return out.decode("utf-8") return out def page_id_by_project(project: str): """ Retrieve the page ID in Confluence by the IBS project name. :param project: The project name in IBS. :return: The page ID in Confluence. """ return PROJECT_TO_PAGE_MAP[project]["page_id"] def page_url_by_project(project: str): """ Retrieve the page URL in Confluence by the IBS project name. :param project: The project name in IBS. :return: The URL to Confluence. """ return PROJECT_TO_PAGE_MAP[project]["url"] def confluence_get_next_build_section( server: str, user: str, password: str, page_id: int ): """ Retrieve a section of a specified Confluence page. :param server: Confluence server URL :param user: The user for Confluence to log in with. :param password: The password for Confluence to log in with. :param page_id: The ID of the Confluence page. :return: The content of the desired page. """ content = confluence_get_page_content(server, user, password, page_id, expand=True) # print content content = content["body"]["storage"]["value"] next_build_section = content[ content.find("<ac:structured-macro") : content.find("</ac:structured-macro>") + len("</ac:structured-macro>") ] return next_build_section def generate_new_build_section(next_build_date: date): """ Generates a section for the next build that will be released. :param next_build_date: The date that will be used for the next build. :return: The Confluence formatted section for the next build. 
""" next_build = f""" <ac:structured-macro ac:name="expand" ac:schema-version="1" ac:macro-id="6541c1a0-8ae5-4076-975c-7eb3c22fc21e"> <ac:parameter ac:name="title">Next build</ac:parameter> <ac:rich-text-body> <p><time datetime="{date.strftime(next_build_date, "%Y-%m-%d")}" /> </p><p> -----------------------------------------------------------------</p> </ac:rich-text-body> </ac:structured-macro>""" return next_build.replace("\n", "") def confluence_get_page_content( server: str, user: str, password: str, page_id: int, expand=False ): """ Retrieves the page content of a Confluence page. :param server: Confluence server URL :param user: The user for Confluence to log in with. :param password: The password for Confluence to log in with. :param page_id: The ID of the Confluence page. :param expand: Whether the page should be expanded or not. :return: The page content. """ url = f"{server}/content/{page_id:d}" if expand: url += "?expand=body.storage" if url not in page_cache: auth = (user, password) confluence_request = requests.get(url, auth=auth, timeout=180) # print (url) if confluence_request.status_code == 401: raise ValueError("Authentication fail") page_cache[url] = confluence_request.json() return page_cache[url] def confluence_get_page_heading(server: str, user: str, password: str, project: str): """ Retrieves the heading of a Confluence page. :param server: Confluence server URL :param user: The user for Confluence to log in with. :param password: The password for Confluence to log in with. :param project: The project in the IBS to work with. :return: The heading of the confluence page for the IBS project. 
""" content = confluence_get_page_content( server, user, password, page_id_by_project(project), expand=True ) # print(content) content = content["body"]["storage"]["value"] next_build_section = confluence_get_next_build_section( server, user, password, page_id_by_project(project) ) next_bs_start = content.find(next_build_section) return content[:next_bs_start] def confluence_set_page_content( server: str, user: str, password: str, page_id: int, json_content ): """ Update a Confluence page with the given content. :param server: Confluence server URL :param user: The user for Confluence to log in with. :param password: The password for Confluence to log in with. :param page_id: The page ID to update. :param json_content: The JSON content that the Confluence API should expect. """ current_data = confluence_get_page_content(server, user, password, page_id) new_data = { "id": page_id, "type": current_data["type"], "space": {"key": current_data["space"]["key"]}, "title": current_data["title"], "version": {"number": int(current_data["version"]["number"]) + 1}, } new_data.setdefault("body", {}).setdefault( "storage", {"representation": "storage"} )["value"] = json_content # print(json.dumps(new_data, indent=4, sort_keys=True)) url = f"{server}/content/{page_id:d}" free_page_cache(url) confluence_request = requests.put( url, data=json.dumps(new_data), auth=(user, password), headers={"Content-Type": "application/json", "Accept": "application/json"}, timeout=180, ) try: confluence_request.raise_for_status() except requests.HTTPError as http_error: print(http_error.response.text) raise http_error def confluence_update_build_summary( server: str, user: str, password: str, project: str, build_id: int, changelog, next_build_date: date, build_label, ): """ Updates the build summary of a Confluence page for a certain IBS project. :param server: Confluence server URL :param user: The user for Confluence to log in with. :param password: The password for Confluence to log in with. 
    :param project: The project in the IBS to work with.
    :param build_id: The override ID of the build that is being worked with.
    :param changelog: The cached output of the ``sle_build`` script.
    :param next_build_date: Date of the next build.
    :param build_label: The override label of the build that is being worked with.
    """
    # pylint: disable=R0913
    new_page_body = confluence_get_page_heading(server, user, password, project)
    new_page_body += generate_new_build_section(next_build_date)
    new_page_body += confluence_generate_build_summary(
        server, user, password, project, build_id, build_label, changelog
    )
    new_page_body += confluence_get_prev_builds(server, user, password, project)
    confluence_set_page_content(
        server, user, password, page_id_by_project(project), new_page_body
    )


def main(
    server: str,
    user: str,
    password: str,
    build_id,
    build_label,
    next_build_date: str,
    project: str,
):
    """
    Main routine that executes the non-CLI related logic.

    :param server: Confluence server URL
    :param user: The user for Confluence to log in with.
    :param password: The password for Confluence to log in with.
    :param build_id: The override ID of the build that is being worked with.
    :param build_label: The override label of the build that is being worked with.
    :param next_build_date: The date of the next build.
    :param project: The project in the IBS to work with.
    """
    # pylint: disable=R0913
    if not password:
        if os.path.exists(JIRA_PASSWORD_FILE):
            try:
                with open(JIRA_PASSWORD_FILE, encoding="UTF-8") as fp_password_file:
                    user, password = fp_password_file.read().strip().split("\n")
            except ValueError:
                print(
                    f"Error: {JIRA_PASSWORD_FILE} is supposed to contain exactly two lines: user and password."
                )
                sys.exit(1)
        else:
            print(
                f"You can save the Confluence/Jira user and password into {JIRA_PASSWORD_FILE} "
                "(two separate lines) to avoid re-entering them."
            )
            password = getpass.getpass(
                f"Please enter {server} password for user {user}: "
            )
    changelog = get_last_build_changelog(project)
    if not build_id:
        build_id = get_last_build_number(project, changelog=changelog)
    confluence_update_build_summary(
        server,
        user,
        password,
        project,
        build_id,
        changelog,
        datetime.strptime(next_build_date, "%Y-%m-%d"),
        build_label,
    )
    print(f"Updated {page_url_by_project(project)}")


def main_cli(args):
    """
    Main routine that executes the script.

    :param args: Argparse Namespace that has all the arguments.
    """
    main(
        args.server,
        args.user,
        args.password,
        args.build_id,
        args.build_label,
        args.next_build_date,
        args.project[0],
    )


File sle_prjmgr_tools/utils/__init__.py:

"""
This utils module should contain code that is shared between more than one release management CLI module.
"""


File sle_prjmgr_tools/utils/confluence.py:

"""
This module should contain helper functionality that assists for Confluence.
"""


File sle_prjmgr_tools/utils/jira.py:

"""
This module should contain helper functionality that assists for Jira.
"""
import logging
from collections import namedtuple
from typing import Dict, List, Union

import jira

SSLOptions = namedtuple("SSLOptions", "check_cert truststore")


class JiraUtils:
    """
    This class contains the shared functions that will enable scripts to interact with JIRA.
""" def __init__(self, jira_url: str, pat_token: str, ssl_options: SSLOptions): """ Default constructor that initializes the object. Authentication is only possible doing PAT. For more information follow up on the Atlassian Documentation: https://confluence.atlassian.com/enterprise/using-personal-access-tokens-1026032365.html :param jira_url: URL to access the JIRA instance. :param pat_token: The token to access the JIRA instance. Will define if a script can access the required resources. :param ssl_options: The NamedTuple that contains the options to configure the SSL setup. """ self.logger = logging.getLogger() self.jira_url = jira_url options = self.__prepare_ssl_options(ssl_options) self.jira_obj = jira.JIRA( self.jira_url, options=options, token_auth=pat_token, ) @staticmethod def __prepare_ssl_options(ssl_options: SSLOptions) -> dict: """ Prepares the SSL options dict for the JIRA Client. :param ssl_options: The NamedTuple that contains the options to configure the SSL setup. :return: The dictionary that will be passed to the JIRA library and in the end to requests. """ result: Dict[str, Union[str, bool]] = {} if ssl_options.check_cert: result["verify"] = ssl_options.truststore else: result["verify"] = False return result def jira_get_field_values(self, field_id: str, issue: str) -> Dict[str, str]: """ Retrieves a list of all available field values in a select or multi-select. :param field_id: The ID of the field that the values should be retrieved for. :param issue: The issue that decides the field values that are available to search for. :return: The dict of possible field values or an empty dict. Keys represent the names and values are the IDs. 
""" result = {} issue_obj = self.jira_obj.issue(issue) meta = self.jira_obj.editmeta(issue_obj.key) for option in meta["fields"][field_id]["allowedValues"]: result[option.get("value")] = option.get("id") return result def jira_get_field_name(self, name: str) -> str: """ Retrieve the field ID by the name of the field that an end user sees. :param name: The name of the field. :return: The field ID or an emtpy string. """ result = "" jira_fields = self.jira_obj.fields() for field in jira_fields: if field.get("name") == name: field_id = field.get("id") if isinstance(field_id, str): result = field_id break # Should never happen since the ID is always # a str but mypy requires this logic. continue return result def jira_get_version_obj(self, issue: str, name: str): """ Get the version object that represents a version in JIRA: :param issue: The issue that decides the versions that are available to search for. :param name: The name of the version that should be retrieved :return: The full version object as returned by the JIRA library. """ issue_obj = self.jira_obj.issue(issue) project = issue_obj.get_field("project") for version in self.jira_obj.project_versions(project): if version.name == name: return version return None def jira_get_transition_id(self, jsc: str, transition_name: str) -> str: """ Retrieve the transition ID of a ticket by the transition name. :param jsc: The Jira ticket number. :param transition_name: Name of the transition. :return: The target transition ID or an empty str. """ transitions = self.jira_obj.transitions(jsc) target_transition_id = "" for transition in transitions: if transition.get("name") == transition_name: target_transition_id = transition.get("id") return target_transition_id def jira_transition_tickets(self, jsc: str) -> None: """ Transition an issue in the workflow if it is in the correct state. If not log a message. :param jsc: The Jira ticket number. 
""" target_transition_id = self.jira_get_transition_id(jsc, "Integrated") if target_transition_id == "": self.logger.error( 'Issue "%s" could not be transitioned to the state "QE Open" because the transition could not be' " identified!", jsc, ) return self.jira_obj.transition_issue(jsc, target_transition_id) def jira_do_search(self, jql: str, max_results: int = 50) -> List[str]: """ Perform a JIRA search. JQL documentation: https://confluence.atlassian.com/jiracoreserver073/advanced-searching-861257209.html :param jql: The JQL that should be used for searching. :param max_results: The number of results that should be :return: The list of issue keys that match the filter. The number of results is limited by ``max_results``. """ result: List[str] = [] for issue in self.jira_obj.search_issues(jql, maxResults=max_results): if isinstance(jira.Issue, str): result.append(issue.key) return result 0707010000003C000081A400000000000000000000000164121834000036E1000000000000000000000000000000000000004200000000sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/utils/osc.py""" This module should contain helper functionality that assists for the Open Build Service. """ import pathlib import re import sys import tempfile import time from collections import namedtuple from typing import List, Optional import rpmfile # type: ignore from lxml import etree from osc import conf, core # type: ignore BinaryParsed = namedtuple("BinaryParsed", ("package", "filename", "name", "arch")) class OscUtils: """ This class contains the shared functions that will enable scripts to interact with an Open Build Service instance. """ def __init__( self, osc_server: str = "https://build.opensuse.org", override_config: Optional[str] = None, ): """ Default constructor that initializes the object. :param osc_server: Server URL that points to the OBS server API. :param override_config: Path to the configuration file for osc. The default delegates the task to the osc library. 
""" self.osc_server = osc_server self.osc_web_ui_url = "" self.override_config = override_config self.osc_prepare() def osc_prepare(self) -> None: """ This has to be executed to the ini-style config is converted into their corresponding types. """ conf.get_config( override_conffile=self.override_config, override_apiurl=self.osc_server ) @staticmethod def convert_project_to_product(project: str) -> str: """ Assumes the following schema: RootProject:SomeSubProject:MoreProjects:SLE-<digits>-SP<digits>:SomeProject :param project: Project to convert :return: A str in the form "SLES<major>-SP<SP version>". """ project_parts = project.split(":") product_version = project_parts[-2].split("-") result = f"SLES{product_version[1]}-{product_version[2]}" return result def get_file_from_package( self, project: str, package: str, revision, filename: str, target_filename: Optional[str] = None, ): # pylint: disable=R0913 """ Retrieve a given file from a package that is text based. :param project: The project the package is in. :param package: The package the file is in. :param revision: The file revision that should be downloaded. :param filename: The filename that should be downloaded. :param target_filename: If this is given, then the file will be downloaded with the specified name. """ core.get_source_file( self.osc_server, project, package, filename, targetfilename=target_filename, revision=revision, ) def osc_get_web_ui_url(self) -> str: """ Search the API for the Web UI URL. :return: The URL of the WebUI for OBS. 
""" if self.osc_web_ui_url != "": return self.osc_web_ui_url obs_config_xml = core.show_configuration(self.osc_server) root = etree.fromstring(obs_config_xml) node = root.find("obs_url") if node is None or not node.text: raise ValueError("obs_url configuration element expected") self.osc_web_ui_url = node.text return self.osc_web_ui_url def osc_get_binary_names( self, project: str, repository: str, arch: str, package: str ) -> List[BinaryParsed]: """ Retrieves the names of all binaries that are built from a given source package. :param project: The project the source package is in. :param repository: The repository the binaries are in. :param arch: The architecture that the packages are built for. :param package: The source package name. :return: The list of binary packages parsed and split up in a tuple with four elements. """ # Copied from openSUSE-release-tools binary_regex = r"(?:.*::)?(?P<filename>(?P<name>.*)-(?P<version>[^-]+)-(?P<release>[^-]+)\.(?P<arch>[^-\.]+))" rpm_regex = binary_regex + r"\.rpm" parsed = [] for binary in core.get_binarylist( self.osc_server, project, repository, arch, package ): result = re.match(rpm_regex, binary) if not result: continue name = result.group("name") if name.endswith("-debuginfo") or name.endswith("-debuginfo-32bit"): continue if name.endswith("-debugsource"): continue if result.group("arch") == "src": continue parsed.append( BinaryParsed( package, result.group("filename"), name, result.group("arch") ) ) return parsed # pylint: disable-next=too-many-arguments def osc_get_textfile_from_rpm( self, project: str, repo: str, arch: str, binary_name: str, filename: str ) -> str: """ Retrieves a textfile from an RPM package. :param project: The project the binary is in. :param repo: The repository the binary is in. :param arch: The architecture of the binary that should be retrieved. :param binary_name: The name of the binary. :param filename: The filename inside the binary that should be read. 
        :return: The content of the text file.
        """
        with tempfile.TemporaryDirectory() as tmpdirname:
            target_filename = pathlib.Path(tmpdirname) / "requested.rpm"
            # Download the binary.
            core.get_binary_file(
                self.osc_server,
                project,
                repo,
                arch,
                binary_name,
                target_filename=str(target_filename),
            )
            # Unpack the RPM and read the requested file from it.
            with rpmfile.open(target_filename) as rpm:
                with rpm.extractfile(f"./{filename}") as requested_file_fd:
                    return requested_file_fd.read()

    def osc_retrieve_betaversion(self, project: str) -> str:
        """
        Retrieve the current beta version from the "SLES.prod" file in the "sles-release" binary.

        :param project: The project the sles binary is found in.
        :return: The current beta version. This may be different than the one that is set in the project
                 configuration.
        """
        # 000release-packages
        binaries = self.osc_get_binary_names(
            project, "standard", "x86_64", "000release-packages:SLES-release"
        )
        binary_name = ""
        for binary in binaries:
            if binary.name == "sles-release":
                binary_name = binary.name
        if not binary_name:
            raise ValueError("sles-release binary not found!")
        # Built RPM: /etc/products.d/SLES.prod
        file = self.osc_get_textfile_from_rpm(
            project, "standard", "x86_64", binary_name, "etc/products.d/SLES.prod"
        )
        root = etree.fromstring(file)
        # XML --> product.buildconfig.betaversion.text
        result = root.xpath("/product/buildconfig/betaversion")
        return result[0].text

    def osc_is_repo_published(self, project: str, repository: str) -> bool:
        """
        Checks if the repository in the specified project is already published.

        This does not reflect if the current build is published, just that the build available via the API is
        published.

        :param project: The project that should be checked.
        :param repository: The repository that should be checked.
        :return: True if the repository is published.
""" url = core.makeurl( self.osc_server, ["build", project, "_result"], query={"view": "summary", "repository": repository}, ) with core.http_GET(url) as result: my_str = result.read().decode() root = etree.fromstring(my_str) my_nodes = root.xpath(f'/resultlist/result[@project="{project}"]') return all( ( node.get("code") == "published" and node.get("state") == "published" for node in my_nodes ) ) def osc_get_containers(self, project: str) -> List[str]: """ Searches in a given project for the packages that correspond to containers. :param project: The project that should be searched in. :return: The list of str with package names that match the container regex. """ package_list = core.meta_get_packagelist(self.osc_server, project) container_regex = re.compile(r"^(cdi|virt)-.*-container") result: List[str] = [] for package_name in package_list: if container_regex.match(package_name): result.append(package_name) return result def osc_get_products(self, project: str) -> List[str]: """ Get all packages that belong to products being built in this project. :param project: The project to check for. :return: The list of packages that match the criteria. Might be empty. """ package_list = core.meta_get_packagelist(self.osc_server, project) products_regex = re.compile(rf"^{self.convert_project_to_product(project)}") result: List[str] = [] for package_name in package_list: if products_regex.match(package_name) and "release" not in package_name: result.append(package_name) return result def osc_get_jsc_from_sr(self, sr_number: int) -> List[str]: """ Get all jsc's from a single Submit Request in the Open Build Service. :param sr_number: The submit request number that should be checked. :return: The list of jsc's that were mentioned. 
""" issues = core.get_request_issues(self.osc_server, str(sr_number)) result = [] for issue in issues: if issue.get("tracker") == "jsc": result.append(issue.get("name")) return result def osc_do_release( self, project: str, package: str = "", repo: str = "", target_project: str = "", target_repository: str = "", no_delay: bool = False, ) -> None: """ Perform a release for a given project. :param project: The project to release. :param package: Release only a specific package. :param repo: The repository that should be published. :param target_project: The target project where to release to. :param target_repository: The target repository where to release to. :param no_delay: If the action should be regularly scheduled or if it should be performed immediately """ # pylint: disable=R0913 baseurl = ["source", project] query = {"cmd": "release"} if package: baseurl.append(package) if repo: query["repository"] = repo if target_project: query["target_project"] = target_project if target_repository: query["target_repository"] = target_repository if no_delay: query["nodelay"] = "1" url = core.makeurl(self.osc_server, baseurl, query=query) fp_post_result = core.http_POST(url) while True: buf = fp_post_result.read(16384) if not buf: break sys.stdout.write(core.decode_it(buf)) class OscReleaseHelper(OscUtils): """ Helper class to deduplicate between the different release scripts. """ def release_to_common(self, project: str) -> None: """ This consolidates common steps that are required to release a project. Specifics should be implemented in ``release_repo_to_<target>``. :param project: The project that will be released. 
""" time.sleep(60) while not self.osc_is_repo_published(project, "containers"): print("containers: PENDING") time.sleep(60) if self.osc_is_repo_published(project, "containers"): print("containers: PUBLISHED") while not self.osc_is_repo_published(project, "images"): print("images: PENDING") time.sleep(60) if self.osc_is_repo_published(project, "images"): print("images: PUBLISHED") def release_repo_to_test(self, project: str) -> None: """ Releases a ``:GA`` to ``:GA:TEST``. This is a synchronous call that will block until it is done. :param project: The project including the ``:GA`` suffix. """ for container in self.osc_get_containers(project): self.osc_do_release( project, package=container, repo="containerfile", target_project=f"{project}:TEST", target_repository="containers", ) self.osc_do_release( project, "sles15-image", repo="images", target_project=f"{project}:TEST", target_repository="containers", ) products = self.osc_get_products(project) if products == "": print("[WARNING] There is no cloud image to be released") products.insert(0, "000product") products.insert(0, "kiwi-templates-Minimal") for product in products: self.osc_do_release(project, package=product) self.release_to_common(f"{project}:TEST") def release_repo_to_publish(self, project: str) -> None: """ Releases a ``:GA:TEST`` to ``:GA:PUBLISH``. This is a synchronous call that will block until it is done. :param project: The project including the ``:GA`` suffix. """ self.osc_do_release(f"{project}:TEST") self.release_to_common(f"{project}:GA:PUBLISH") 0707010000003D000081A40000000000000000000000016412183400000287000000000000000000000000000000000000004000000000sle-prjmgr-tools-1678907444.9216408/sle_prjmgr_tools/version.py""" Module to serve the version output of the tool. """ from argparse import Namespace import sle_prjmgr_tools def build_parser(parent_parser): """ Builds the parser for this script. This is executed by the main CLI dynamically. 
""" subparser = parent_parser.add_parser( "version", help="Shows the version of the tool that is used." ) subparser.set_defaults(func=main_cli) def main_cli(args: Namespace) -> None: # pylint: disable=unused-argument """ Main routine that executes the script :param args: Argparse Namespace that has all the arguments """ print(sle_prjmgr_tools.__version__) 0707010000003E000041ED0000000000000000000000056412183400000000000000000000000000000000000000000000002A00000000sle-prjmgr-tools-1678907444.9216408/tests0707010000003F000081A40000000000000000000000016412183400000329000000000000000000000000000000000000003600000000sle-prjmgr-tools-1678907444.9216408/tests/conftest.py""" Helper module that is shared by all tests and read by pytest. More information: https://docs.pytest.org/en/6.2.x/fixture.html#scope-sharing-fixtures-across-classes-modules-packages-or-session """ import pytest from sle_prjmgr_tools.utils.jira import JiraUtils, SSLOptions def pytest_addoption(parser): parser.addoption( "--pat-token", required=True, action="store", help="This is the pat token that will be used during testing.", ) @pytest.fixture(scope="module") def jira_obj(request): pat_token = request.config.getoption("--pat-token") ssl_options = SSLOptions( True, "/usr/share/pki/trust/anchors/SUSE_Trust_Root.crt.pem" ) return JiraUtils( "https://jira-devel.suse.de", pat_token, ssl_options, ) 07070100000040000041ED0000000000000000000000026412183400000000000000000000000000000000000000000000002F00000000sle-prjmgr-tools-1678907444.9216408/tests/data07070100000041000081A40000000000000000000000016412183400000005000000000000000000000000000000000000003A00000000sle-prjmgr-tools-1678907444.9216408/tests/data/.gitignoreoscrc07070100000042000081A4000000000000000000000001641218340000009E000000000000000000000000000000000000003E00000000sle-prjmgr-tools-1678907444.9216408/tests/data/oscrc.template[general] apiurl = https://api.suse.de [https://api.suse.de] user=<username> sshkey=id_rsa 
credentials_mgr_class=osc.credentials.TransientCredentialsManager

File: sle-prjmgr-tools-1678907444.9216408/tests/ibs_to_jira_test.py

import argparse

import pytest

from sle_prjmgr_tools import ibs_to_jira
from sle_prjmgr_tools.utils.jira import SSLOptions


@pytest.fixture(scope="module")
def jira_work_obj(request) -> ibs_to_jira.JiraWork:
    pat_token = request.config.getoption("--pat-token")
    ssl_options = SSLOptions(
        True, "/usr/share/pki/trust/anchors/SUSE_Trust_Root.crt.pem"
    )
    return ibs_to_jira.JiraWork(
        "https://jira-devel.suse.de",
        pat_token,
        ssl_options,
    )


def test_build_parser():
    # Arrange
    parser = argparse.ArgumentParser(prog="main parser")
    subparsers = parser.add_subparsers(help="subparser")
    program_input = ["ibs_to_jira", "-j", "my_pat"]

    # Act
    ibs_to_jira.build_parser(subparsers)
    result = parser.parse_args(program_input)

    # Assert
    assert result.jira_pat == "my_pat"


def test_osc_collect_srs_between_builds():
    # Arrange
    apiurl = "https://api.suse.de"
    project = "SUSE:SLE-15-SP5:GA"
    duration = 100

    # Act
    result = ibs_to_jira.osc_collect_srs_between_builds(apiurl, project, duration)

    # Assert
    assert False


def test_transform_srs_per_jsc_to_jsc_per_sr():
    # Arrange
    input_data = {1234: ["abc", "def"], 5678: ["ghi", "jkl"]}
    expected_result = {
        "abc": [1234],
        "def": [1234],
        "ghi": [5678],
        "jkl": [5678],
    }

    # Act
    result = ibs_to_jira.transform_srs_per_jsc_to_jsc_per_sr(input_data)

    # Assert
    assert result == expected_result


def test_jira_search_correct(jira_work_obj):
    # Arrange
    jsc = ["PED-141"]

    # Act
    result = jira_work_obj.jira_search_correct(jsc)

    # Assert
    assert result == jsc


def test_jira_post_comment_correct(jira_work_obj):
    # Arrange
    # Act & Assert
    # No Exception is enough that this test passes
    jira_work_obj.jira_post_comment_correct("PED-141", [1234, 5678])


def test_jira_search_incorrect(jira_work_obj):
    # Arrange
    jsc = ["PED-158"]

    # Act
    result = jira_work_obj.jira_search_incorrect(jsc)

    # Assert
    assert result == jsc


def test_jira_post_comment_incorrect(jira_work_obj):
    # Arrange
    issue = "PED-123"
    srs = [123, 124]

    # Act
    jira_work_obj.jira_post_comment_incorrect(issue, srs)

    # Assert
    assert False


def test_jira_search_development(jira_work_obj):
    # Arrange
    jscs = ["PED-157"]

    # Act
    results = jira_work_obj.jira_search_development(jscs)

    # Assert
    assert results == jscs


def test_jira_post_comment_development(jira_work_obj):
    # Arrange
    issue = "PED-123"
    srs = [123, 124]

    # Act
    jira_work_obj.jira_post_comment_development(issue, srs)

    # Assert
    assert False


def test_jira_search_other(jira_work_obj):
    # Arrange
    # Act
    jira_work_obj.jira_search_other()

    # Assert
    assert False


def test_jira_post_comment_other(jira_work_obj):
    # Arrange
    issues = "PED-123"
    srs = [123, 124, 125]

    # Act
    jira_work_obj.jira_post_comment_other(issues, srs)

    # Assert
    assert False


@pytest.mark.skip("Test not fully set up for testing!")
def test_main(request):
    # Arrange
    pat_token = request.config.getoption("--pat-token")

    # Act
    ibs_to_jira.main(pat_token)

    # Assert
    assert False

File: sle-prjmgr-tools-1678907444.9216408/tests/release_to_test.py (empty)

File: sle-prjmgr-tools-1678907444.9216408/tests/scripts/project_testsetup.py

"""
This script sets up an environment in an Open Build Service and JIRA instance.

Botmaster:

- gocd instance that is configured with this script
- Required to run against test project:
  - SLE15.SP5.Stagings.RelPkgs
  - SLE15.SP5.Staging.A

OBS requirements:

- The OSC Client is installed and able to authenticate non-interactively.
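`test_transform_srs_per_jsc_to_jsc_per_sr` pins down the expected behavior: a mapping of SR numbers to their jsc issues is inverted into a mapping of jsc issues to SR numbers. A self-contained sketch of what such a transform plausibly looks like (this is a hypothetical reimplementation, not the module's actual code):

```python
from typing import Dict, List


def invert_sr_to_jsc_mapping(srs_per_jsc: Dict[int, List[str]]) -> Dict[str, List[int]]:
    """Invert {sr_number: [jsc, ...]} into {jsc: [sr_number, ...]}."""
    result: Dict[str, List[int]] = {}
    for sr_number, jscs in srs_per_jsc.items():
        for jsc in jscs:
            result.setdefault(jsc, []).append(sr_number)
    return result


mapping = invert_sr_to_jsc_mapping({1234: ["abc", "def"], 5678: ["ghi", "jkl"]})
print(mapping)  # {'abc': [1234], 'def': [1234], 'ghi': [5678], 'jkl': [5678]}
```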
- Project structure must be: ``...:SLE-<digits>-SP<digit>:GA:<|TEST|PUBLISHED>``
- The projects must have two repositories "containers" and "images".

JIRA requirements:

- PAT (= Personal Access Token) authentication
- User must be able to write comments on tickets
- User must be able to move a ticket via the transition "Integrated" towards the next state.
- The issue type "Implementation" must be known to the JIRA instance
- The following statuses must be known to the JIRA instance:
  - QE Open
  - QE In Progress
  - QE Blocked
  - Engineering Done
  - In Maintenance
  - Dev In Progress
  - IBS Integration

After these requirements are met, this script can be executed.
"""
from osc import core

project = {
    "source": "SUSE:SLE-15-SP5:GA",
    "target": "home:SchoolGuy:SLE-15-SP5:GA",
}

# Meta taken from real projects and release targets adjusted
# OBS: Create project structure
# OBS: Create META for GA:PUBLISH
# OBS: Create META for GA:TEST
# OBS: Create META for GA
# Do Staging Setup

File: sle-prjmgr-tools-1678907444.9216408/tests/sle_build_test.py

import pytest

from sle_prjmgr_tools import sle_build


@pytest.fixture(scope="function", autouse=True)
def setup_osc():
    sle_build.osc_prepare("tests/data/oscrc", "https://api.suse.de")


def test_osc_get_builds():
    # Arrange
    apiurl = "https://api.suse.de"
    project = "SUSE:SLE-15-SP5:GA:TEST"

    # Act
    result = sle_build.osc_get_builds(apiurl, project)

    # Assert
    print(result)
    assert False


def test_sle_15_media_build():
    # Arrange
    apiurl = "https://api.suse.de"
    project = "SUSE:SLE-15-SP5:GA"
    # TODO: Mock osc_get_builds to achieve consistent results

    # Act
    result = sle_build.sle_15_media_build(apiurl, project)

    # Assert
    print(result)
    assert False


def test_osc_sle_non_release_packages():
    # Arrange
    apiurl = "https://api.suse.de"
    project = "SUSE:SLE-15-SP5:GA"

    # Act
    result = sle_build.osc_get_sle_non_release_packages(apiurl, project)

    # Assert
    print(result)
    assert False


def test_osc_get_build_flavors():
    # Arrange
    apiurl = "https://api.suse.de"
    project = "SUSE:SLE-15-SP5:GA"

    # Act
    result = sle_build.osc_get_build_flavors(
        apiurl, project, "kiwi-templates-Minimal", "_multibuild"
    )

    # Assert
    assert result == [
        "kvm-and-xen",
        "kvm",
        "VMware",
        "MS-HyperV",
        "OpenStack-Cloud",
        "RaspberryPi",
    ]


def test_osc_get_non_product_packages():
    # Arrange
    apiurl = "https://api.suse.de"
    project = "SUSE:SLE-15-SP5:GA"

    # Act
    result = sle_build.osc_get_non_product_packages(apiurl, project)

    # Assert
    print(result)
    assert False


def test_get_kiwi_template():
    # Arrange
    apiurl = "https://api.suse.de"
    project = "SUSE:SLE-15-SP5:GA"

    # Act
    result = sle_build.get_kiwi_template(apiurl, project)

    # Assert
    assert result == "kiwi-templates-Minimal"


def test_get_sle_images_jeos():
    # Arrange
    apiurl = "https://api.suse.de"
    project = "SUSE:SLE-15-SP5:GA"
    kiwi_template = "kiwi-templates-Minimal"

    # Act
    result = sle_build.get_sle_images_jeos(apiurl, project, kiwi_template)

    # Assert
    print(result)
    assert False


def test_get_sle_images_multibuild():
    # Arrange
    apiurl = "https://api.suse.de"
    project = "SUSE:SLE-15-SP5:GA"

    # Act
    result = sle_build.get_sle_images_multibuild(apiurl, project)

    # Assert
    print(result)
    assert False


def test_get_images_old_style():
    # Arrange
    apiurl = "https://api.suse.de"
    project = "SUSE:SLE-15-SP5:GA"

    # Act
    result = sle_build.get_sle_images_old_style(apiurl, project)

    # Assert
    print(result)
    assert False


def test_sle_images():
    # Arrange
    apiurl = "https://api.suse.de"
    project = "SUSE:SLE-15-SP5:GA"

    # Act
    result = sle_build.sle_images(apiurl, project)

    # Assert
    print(result)
    assert False


def test_osc_get_dvd_images():
    # Arrange
    apiurl = "https://api.suse.de"
    project = "15-SP5"

    # Act
    result = sle_build.osc_get_dvd_images(apiurl, project)

    # Assert
    print(result)
    assert False


def test_get_wsl_binaries():
    # Arrange
    apiurl = "https://api.suse.de"
    project = "SUSE:SLE-15-SP5:Update:WSL:Update:CR"

    # Act
    result = sle_build.get_wsl_binaries(apiurl, project)

    # Assert
    print(result)
    assert False


def test_osc_get_sle_12_images():
    # Arrange
    apiurl = "https://api.suse.de"
    project = "SUSE:SLE-12-SP5:GA"

    # Act
    result = sle_build.osc_get_sle_12_images(apiurl, project)

    # Assert
    print(result)
    assert False


def test_main():
    # Arrange
    version = ""

    # Act
    result = sle_build.main(version)

    # Assert
    print(result)
    assert False

File: sle-prjmgr-tools-1678907444.9216408/tests/utils/__init__.py (empty)

File: sle-prjmgr-tools-1678907444.9216408/tests/utils/jira_test.py

def test_jira_transition_tickets(jira_obj):
    # Arrange
    jira_instance_obj = jira_obj
    test_issue = jira_instance_obj.jira_obj.create_issue(
        project="PED",
        summary="Test issue for ibs_to_jira script!",
        description="Test issue for ibs_to_jira script!",
        issuetype={"name": "Implementation"},
    )
    # Transition to Effort Estimation
    evaluate_transition_id = jira_instance_obj.jira_get_transition_id(
        test_issue.key, "Evaluate"
    )
    dev_lead_id = jira_instance_obj.jira_get_field_name("Dev Lead")
    fix_version_id = jira_instance_obj.jira_get_field_name("Fix Version/s")
    fix_version_obj = jira_instance_obj.jira_get_version_obj(
        test_issue.key, "15 SP5 GM"
    )
    qe_project_manager_id = jira_instance_obj.jira_get_field_name("QE Project Manager")
    versions = [{"id": version.id} for version in test_issue.fields.fixVersions]
    versions.append({"id": fix_version_obj.id})
    user_obj = jira_instance_obj.jira_obj.search_users(user="eg_admin")[0]
    test_issue.update(
        fields={
            dev_lead_id: user_obj.raw,
            fix_version_id: versions,
            qe_project_manager_id: user_obj.raw,
        }
    )
    jira_instance_obj.jira_obj.transition_issue(
        test_issue.key,
        evaluate_transition_id,
    )
    # Transition to Dev under Estimation
    evaluate_dev_transition_id = jira_instance_obj.jira_get_transition_id(
        test_issue.key, "Evaluate Dev"
    )
    tl_initial_id = jira_instance_obj.jira_get_field_name(
        "TL initial effort estimation"
    )
    tl_ongoing_id = jira_instance_obj.jira_get_field_name(
        "TL ongoing effort estimation"
    )
    test_issue.update(
        fields={
            tl_initial_id: {"value": "No Effort"},
            tl_ongoing_id: {"value": "No Effort"},
        },
    )
    jira_instance_obj.jira_obj.transition_issue(
        test_issue.key,
        evaluate_dev_transition_id,
    )
    # Transition to Ready for Dev
    approve_transition_id = jira_instance_obj.jira_get_transition_id(
        test_issue.key, "Approve"
    )
    jira_instance_obj.jira_obj.transition_issue(test_issue.key, approve_transition_id)
    # Transition to Dev in Progress
    start_transition_id = jira_instance_obj.jira_get_transition_id(
        test_issue.key, "Start"
    )
    jira_instance_obj.jira_obj.transition_issue(test_issue.key, start_transition_id)
    # Transition to Dev Done
    dev_done_transition_id = jira_instance_obj.jira_get_transition_id(
        test_issue.key, "Dev Done"
    )
    jira_instance_obj.jira_obj.transition_issue(test_issue.key, dev_done_transition_id)

    # Act
    jira_instance_obj.jira_transition_tickets(test_issue.key)

    # Assert
    test_issue = jira_instance_obj.jira_obj.issue(test_issue.key)
    assert test_issue.fields.status.name == "QE Open"


def test_jira_do_search(jira_obj):
    # Arrange
    jira_instance = jira_obj
    jql = "issue = YES-168"

    # Act
    result = jira_instance.jira_do_search(jql)

    # Assert
    assert len(result) == 1
    assert result[0] == "YES-168"

File: sle-prjmgr-tools-1678907444.9216408/tests/utils/osc_test.py

import pytest

from sle_prjmgr_tools.utils import osc

OSC_SERVER = "https://api.suse.de"
OVERRIDE_CONFIG = "tests/data/oscrc"


@pytest.fixture
def osc_utils():
    def _osc_utils_factory(osc_server: str):
        return osc.OscUtils(osc_server=osc_server, override_config=OVERRIDE_CONFIG)

    return _osc_utils_factory


@pytest.fixture
def osc_release_helper():
    def _osc_release_helper_factory(osc_server: str):
        return osc.OscReleaseHelper(
            osc_server=osc_server, override_config="tests/data/oscrc"
        )

    return _osc_release_helper_factory


def test_object_creation():
    # Arrange
    # Act
    result = osc.OscUtils(
        osc_server="https://api.suse.de", override_config=OVERRIDE_CONFIG
    )

    # Assert
    assert isinstance(result, osc.OscUtils)


def test_osc_get_binary_names(osc_utils):
    # Arrange
    osc_utils_obj = osc_utils(OSC_SERVER)
    project = "SUSE:SLE-15-SP5:GA"

    # Act
    result = osc_utils_obj.osc_get_binary_names(
        project, "standard", "x86_64", "000release-packages:SLES-release"
    )

    # Assert
    print(result)
    assert False


def test_osc_get_textfile_from_rpm(osc_utils):
    # Arrange
    osc_utils_obj = osc_utils(OSC_SERVER)
    project = "SUSE:SLE-15-SP5:GA"

    # Act
    result = osc_utils_obj.osc_get_textfile_from_rpm(
        project, "standard", "x86_64", "sles-release", "etc/products.d/SLES.prod"
    )

    # Assert
    print(result)
    assert False


def test_osc_retrieve_betaversion(osc_utils):
    # Arrange
    osc_utils_obj = osc_utils(OSC_SERVER)
    project = "SUSE:SLE-15-SP5:GA"

    # Act
    result = osc_utils_obj.osc_retrieve_betaversion(project)

    # Assert
    assert result == "Snapshot-202211-1"


def test_osc_is_repo_published(osc_utils):
    # Arrange
    osc_utils_obj = osc_utils(OSC_SERVER)
    project = "SUSE:SLE-15-SP5:GA"
    repository = "images"

    # Act
    result = osc_utils_obj.osc_is_repo_published(project, repository)

    # Assert
    assert result


def test_osc_get_containers(osc_utils):
    # Arrange
    osc_utils_obj = osc_utils(OSC_SERVER)
    project = "SUSE:SLE-15-SP5:GA"
    expected_result = [
        "cdi-apiserver-container",
        "cdi-cloner-container",
        "cdi-controller-container",
        "cdi-importer-container",
        "cdi-operator-container",
        "cdi-uploadproxy-container",
        "cdi-uploadserver-container",
        "virt-api-container",
        "virt-controller-container",
        "virt-exportproxy-container",
        "virt-exportserver-container",
        "virt-handler-container",
        "virt-launcher-container",
        "virt-libguestfs-tools-container",
        "virt-operator-container",
    ]

    # Act
    result = osc_utils_obj.osc_get_containers(project)

    # Assert
    assert result == expected_result


def test_osc_get_products(osc_utils):
    # Arrange
    osc_utils_obj = osc_utils(OSC_SERVER)
    project = "SUSE:SLE-15-SP5:GA"
    expected_result = [
        "SLES15-SP5",
        "SLES15-SP5-BYOS",
        "SLES15-SP5-CHOST-BYOS",
        "SLES15-SP5-EC2-ECS-HVM",
        "SLES15-SP5-HPC",
        "SLES15-SP5-HPC-BYOS",
        "SLES15-SP5-Hardened-BYOS",
        "SLES15-SP5-SAP",
        "SLES15-SP5-SAP-Azure-LI-BYOS",
        "SLES15-SP5-SAP-Azure-VLI-BYOS",
        "SLES15-SP5-SAP-BYOS",
        "SLES15-SP5-SAPCAL",
    ]

    # Act
    result = osc_utils_obj.osc_get_products(project)

    # Assert
    assert result == expected_result


@pytest.mark.skip
def test_osc_do_release(osc_utils):
    # Arrange
    osc_utils_obj = osc_utils(OSC_SERVER)
    project = "SUSE:SLE15-SP5:GA"

    # Act
    osc_utils_obj.osc_do_release(project)

    # Assert
    assert False


def test_osc_get_jsc_from_sr(osc_utils):
    # Arrange
    osc_utils_obj = osc_utils(OSC_SERVER)
    sr_number = 284017
    expected_result = [
        "PED-1183",
        "PED-1504",
        "PED-1509",
        "PED-1517",
        "PED-1559",
        "PED-1817",
        "PED-1917",
        "PED-1981",
        "PED-2064",
        "PED-606",
        "PED-818",
        "PED-850",
    ]

    # Act
    result = osc_utils_obj.osc_get_jsc_from_sr(sr_number)

    # Assert
    assert result == expected_result


# ---------------- Release Helper ----------------


@pytest.mark.skip
def test_release_to_common(osc_release_helper):
    # Arrange
    osc_release_helper_obj = osc_release_helper(OSC_SERVER)
    project = "SUSE:SLE15-SP5:GA"

    # Act
    osc_release_helper_obj.release_to_common(project)

    # Assert
    assert False


@pytest.mark.skip
def test_release_repo_to_test(osc_release_helper):
    # Arrange
    osc_release_helper_obj = osc_release_helper(OSC_SERVER)
    project = "SUSE:SLE15-SP5:GA"

    # Act
    osc_release_helper_obj.release_repo_to_test(project)

    # Assert
    assert False


@pytest.mark.skip
def test_release_repo_to_publish(osc_release_helper):
    # Arrange
    osc_release_helper_obj = osc_release_helper(OSC_SERVER)
    project = "SUSE:SLE15-SP5:GA"

    # Act
    osc_release_helper_obj.release_repo_to_publish(project)

    # Assert
    assert False